W0307 22:16:44.613000 50046 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
W0307 22:16:44.613000 50046 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
W0307 22:16:44.613000 50046 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
[2025-03-07 22:16:57,164] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:16:59,761] [INFO] [comm.py:652:init_distributed] cdb=None
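The torchrun banner above suggests tuning OMP_NUM_THREADS per process instead of leaving the default of 1. A minimal sketch of one common heuristic, splitting a node's cores across its local ranks (the 8-GPUs-per-node figure is an assumption; tune for your workload):

```python
import os

# Set before importing torch or other OpenMP-backed libraries so it takes effect.
# gpus_per_node is an assumption about the node layout, not read from this run.
gpus_per_node = 8
cores = os.cpu_count() or 1
os.environ.setdefault("OMP_NUM_THREADS", str(max(1, cores // gpus_per_node)))
print(os.environ["OMP_NUM_THREADS"])
```

`setdefault` keeps any value already exported by the launch script, so an explicit per-cluster setting still wins.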
Using the `WANDB_DISABLED` environment variable is deprecated and will be removed in v5. Use the --report_to flag to control the integrations used for logging result (for instance --report_to none).
[2025-03-07 22:17:00,522] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
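The `WANDB_DISABLED` deprecation above asks for the `--report_to` flag instead. A minimal sketch of the replacement launch arguments (the script name and surrounding flags are placeholders, not taken from this run):

```python
# Drop the WANDB_DISABLED env var and pass --report_to explicitly, as the
# deprecation message recommends.
train_args = [
    "train.py",                     # placeholder script name
    "--report_to", "none",          # disable all logging integrations
    # "--report_to", "tensorboard", # or name the integrations you want
]
i = train_args.index("--report_to")
print(train_args[i + 1])  # none
```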
`Qwen2VLRotaryEmbedding` can now be fully parameterized by passing the model config through the `config` argument. All other arguments will be removed in v4.46
[2025-03-07 22:17:03,203] [INFO] [partition_parameters.py:348:__exit__] finished initializing model - num_params = 730, num_elems = 8.29B
Loading checkpoint shards:   0%|          | 0/5 [00:00<?, ?it/s]
Training args: TrainingArguments(
  _n_gpu=1,
  accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False},
  adafactor=False,
  adam_beta1=0.9,
  adam_beta2=0.999,
  adam_epsilon=1e-08,
  attn_implementation=flash_attention_2,
  auto_find_batch_size=False,
  average_tokens_across_devices=False,
  batch_eval_metrics=False,
  bf16=True,
  bf16_full_eval=False,
  cache_dir=None,
  data_seed=None,
  dataloader_drop_last=False,
  dataloader_num_workers=8,
  dataloader_persistent_workers=False,
  dataloader_pin_memory=True,
  dataloader_prefetch_factor=None,
  ddp_backend=None,
  ddp_broadcast_buffers=None,
  ddp_bucket_cap_mb=None,
  ddp_find_unused_parameters=None,
  ddp_timeout=1800,
  debug=[],
  deepspeed=./scripts/zero3.json,
  disable_tqdm=False,
  dispatch_batches=None,
  do_eval=False,
  do_predict=False,
  do_train=False,
  eval_accumulation_steps=None,
  eval_delay=0,
  eval_do_concat_batches=True,
  eval_on_start=False,
  eval_steps=None,
  eval_strategy=no,
  eval_use_gather_object=False,
  evaluation_strategy=None,
  fp16=False,
  fp16_backend=auto,
  fp16_full_eval=False,
  fp16_opt_level=O1,
  freeze_visual_encoder=True,
  fsdp=[],
  fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False},
  fsdp_min_num_params=0,
  fsdp_transformer_layer_cls_to_wrap=None,
  full_determinism=False,
  gradient_accumulation_steps=1,
  gradient_checkpointing=True,
  gradient_checkpointing_kwargs=None,
  greater_is_better=None,
  group_by_length=False,
  group_by_modality_length=True,
  half_precision_backend=auto,
  hub_always_push=False,
  hub_model_id=None,
  hub_private_repo=None,
  hub_strategy=every_save,
  hub_token=,
  ignore_data_skip=False,
  include_for_metrics=[],
  include_inputs_for_metrics=False,
  include_num_input_tokens_seen=False,
  include_tokens_per_second=False,
  jit_mode_eval=False,
  label_names=None,
  label_smoothing_factor=0.0,
  learning_rate=1e-05,
  length_column_name=length,
  load_best_model_at_end=False,
  local_rank=0,
  log_level=passive,
  log_level_replica=warning,
  log_on_each_node=True,
  logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-16-59_HOST-10-140-60-39,
  logging_first_step=False,
  logging_nan_inf_filter=True,
  logging_steps=1.0,
  logging_strategy=steps,
  lr_scheduler_kwargs={},
  lr_scheduler_type=cosine,
  max_grad_norm=1.0,
  max_steps=-1,
  metric_for_best_model=None,
  model_max_length=8192,
  mp_parameters=,
  neftune_noise_alpha=None,
  no_cuda=False,
  num_train_epochs=1.0,
  optim=adamw_torch,
  optim_args=None,
  optim_target_modules=None,
  output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
  overwrite_output_dir=False,
  past_index=-1,
  per_device_eval_batch_size=4,
  per_device_train_batch_size=2,
  prediction_loss_only=False,
  push_to_hub=False,
  push_to_hub_model_id=None,
  push_to_hub_organization=None,
  push_to_hub_token=,
  ray_scope=last,
  remove_unused_columns=True,
  report_to=['tensorboard'],
  restore_callback_states_from_checkpoint=False,
  resume_from_checkpoint=None,
  run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
  save_on_each_node=False,
  save_only_model=False,
  save_safetensors=True,
  save_steps=2000,
  save_strategy=steps,
  save_total_limit=3,
  seed=42,
  skip_memory_metrics=True,
  split_batches=None,
  tf32=True,
  torch_compile=False,
  torch_compile_backend=None,
  torch_compile_mode=None,
  torch_empty_cache_steps=None,
  torchdynamo=None,
  tpu_metrics_debug=False,
  tpu_num_cores=None,
  use_cpu=False,
  use_ipex=False,
  use_legacy_prediction_loop=False,
  use_liger_kernel=False,
  use_mps_device=False,
  verbose_logging=False,
  warmup_ratio=0.03,
  warmup_steps=0,
  weight_decay=0.0,
)
mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=['tensorboard'], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0,
)
[TCSLoader] config_path: ~/petreloss.conf
--> before Client(conf_path)
--> after Client(conf_path)
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/guienv.json with all sampling strategy
Rank 0: Loaded 327972 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/guienv.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/omniact_fix.json with all sampling strategy
Rank 0: Loaded 6720 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/omniact_fix.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricoig16k.json with all sampling strategy
Rank 0: Loaded 16133 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricoig16k.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricosca.json with all sampling strategy
Rank 0: Loaded 173212 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricosca.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/seeclick.json with all sampling strategy
Rank 0: Loaded 271121 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/seeclick.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ui_refexp.json with all sampling strategy
Rank 0: Loaded 15624 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ui_refexp.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/webui350k.json with all sampling strategy
Rank 0: Loaded 57389 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/webui350k.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/widget_captioning.json with all sampling strategy
Rank 0: Loaded 101426 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/widget_captioning.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_150k_description_filtered.json with all sampling strategy
Rank 0: Loaded 148416 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_150k_description_filtered.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_258k_function_filtered.json with all sampling strategy
Rank 0: Loaded 248264 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_258k_function_filtered.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_hybrid_773k_max_25qa_filtered_new.json with all sampling strategy
Rank 0: Loaded 1243857 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_hybrid_773k_max_25qa_filtered_new.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_windows_splited.json with all sampling strategy
Rank 0: Loaded 1075344 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_windows_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_linux_splited.json with all sampling strategy
Rank 0: Loaded 43156 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_linux_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_macos_splited.json with all sampling strategy
Rank 0: Loaded 18399 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_macos_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/uibert_train_ground_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 4646 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/uibert_train_ground_d240430_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_taperception_grounding_d240815_v2.jsonl with all sampling strategy
Rank 0: Loaded 2500 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_taperception_grounding_d240815_v2.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_widget_grounding_d240815_v2.jsonl with all sampling strategy
Rank 0: Loaded 14878 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_widget_grounding_d240815_v2.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_mug_grounding_d240812.jsonl with all sampling strategy
Rank 0: Loaded 26090 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_mug_grounding_d240812.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_phone_2403_ground_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 24798 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_phone_2403_ground_d240430_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2405_ground_d240521_v1.jsonl with all sampling strategy
Rank 0: Loaded 5008 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2405_ground_d240521_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2406_ground_d240612_v1.jsonl with all sampling strategy
Rank 0: Loaded 7903 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2406_ground_d240612_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/amex_grounding_d240813_v1.jsonl with all sampling strategy
Rank 0: Loaded 102007 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/amex_grounding_d240813_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_1_d240815_v3.jsonl with all sampling strategy
Rank 0: Loaded 63581 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_1_d240815_v3.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_2_d240815_v3.jsonl with all sampling strategy
Rank 0: Loaded 6852 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_2_d240815_v3.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_schedual_extract_20240520_v2_r464_reprompt_d240607.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 928 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_schedual_extract_20240520_v2_r464_reprompt_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_app_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 2488 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_app_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_os_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 1242 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_os_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_web_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 2360 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_web_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_element_recognition_d240605_v1_correct_d240607.jsonl with all sampling strategy
Rank 0: Loaded 3791 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_element_recognition_d240605_v1_correct_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_marker_recognition_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5179 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_marker_recognition_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_ocr_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5090 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_ocr_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_operation_oral_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5070 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_operation_oral_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_visual_prompt_with_bbox_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5248 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_visual_prompt_with_bbox_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2408_region_caption_d240903_v1.jsonl with all sampling strategy
Rank 0: Loaded 5854 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2408_region_caption_d240903_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_detection_d20240418_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_detection_d20240418_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_element_recognition_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_element_recognition_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_ground_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_ground_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_detection_d20240418_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_detection_d20240418_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_element_recognition_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_element_recognition_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_ground_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_ground_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_element_recognition_d240605_v1_correct_d240607.jsonl with all sampling strategy
Rank 0: Loaded 24620 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_element_recognition_d240605_v1_correct_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d20240604_v2.jsonl with all sampling strategy
Rank 0: Loaded 17196 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d20240604_v2.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 5998 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ocr_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 31276 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ocr_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_short_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 27880 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_short_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_with_bbox_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 62401 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_with_bbox_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screenai_layout_20240604_v1.jsonl with all sampling strategy
Rank 0: Loaded 22076 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screenai_layout_20240604_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_grounding_d240924_v1.jsonl with all sampling strategy
Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_grounding_d240924_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_oral_operation_d240924_v1.jsonl with all sampling strategy
Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_oral_operation_d240924_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_region_caption_d240924_v1.jsonl with all sampling strategy
Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_region_caption_d240924_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ui_operation_oral_wbox_d241023_v2.jsonl with all sampling strategy
Rank 0: Loaded 20293 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ui_operation_oral_wbox_d241023_v2.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_comment_20240606_json_d20241023_v2.jsonl with all sampling strategy
Rank 0: Loaded 1055 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_comment_20240606_json_d20241023_v2.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_json_d241126.jsonl with repeat:3 sampling strategy
Rank 0: Loaded 6837 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_json_d241126.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_xml_d241126.jsonl with repeat:3 sampling strategy
Rank 0: Loaded 6873 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_xml_d241126.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/OS_Altas_androidworld_grounding_d241120_v1.jsonl with all sampling strategy
Rank 0: Loaded 89860 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/OS_Altas_androidworld_grounding_d241120_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_long_caption_20240604_v1.jsonl with repeat:4 sampling strategy
Rank 0: Loaded 3156 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_long_caption_20240604_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/grounding_new.jsonl with all sampling strategy
Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/grounding_new.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/regioncaption_new.jsonl with all sampling strategy
Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/regioncaption_new.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/oral_operation_new.jsonl with all sampling strategy
Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/oral_operation_new.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240812_grounding_dataset_20240812_v1_r6600.jsonl with all sampling strategy
Rank 0: Loaded 6600 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240812_grounding_dataset_20240812_v1_r6600.jsonl
Rank 0: Loaded 4387471 samples from data/stage1_20250307_1.yaml
Rank 0: Formatting inputs...Skip in lazy mode
Detected kernel version 3.10.0, which is below the recommended minimum of 5.5.0; this can cause the process to hang. It is recommended to upgrade the kernel to the minimum version or higher.
Parameter Offload: Total persistent parameters: 877056 in 401 params
0%|          | 0/34278 [00:00<?, ?it/s]
Failed to fetch sample 3646711. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd68717b50>
Failed to fetch sample 4191737. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fc99e0>
Failed to fetch sample 2634406. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc27789850>
Failed to fetch sample 3600824. Exception: cannot identify image file <_io.BytesIO object at 0x7f64189a1cb0>
Failed to fetch sample 2659396. Exception: cannot identify image file <_io.BytesIO object at 0x7f64189a1c10>
Failed to fetch sample 1359622. Exception: cannot identify image file <_io.BytesIO object at 0x7f65d6a77e20>
Failed to fetch sample 1611311. Exception: cannot identify image file <_io.BytesIO object at 0x7f655cdba570>
Failed to fetch sample 2547681. Exception: cannot identify image file <_io.BytesIO object at 0x7f3bcf3ba070>
Failed to fetch sample 2090676. Exception: cannot identify image file <_io.BytesIO object at 0x7f3bcf3ba020>
Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7f64189d5670>
Failed to fetch sample 2102696. Exception: cannot identify image file <_io.BytesIO object at 0x7f6419391490>
Failed to fetch sample 1381036. Exception: cannot identify image file <_io.BytesIO object at 0x7f6417e0db70>
Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7f3bcea1f4c0>
Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7f6417e0f5b0>
Failed to fetch sample 3510868.
Exception: cannot identify image file <_io.BytesIO object at 0x7f6556028e50>
Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7f655600fe20>
Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c74ea0>
Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7f6496028950>
Failed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7f619188d800>
Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c74bd0>
Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO object at 0x7f65d6a29210>
Failed to fetch sample 4230612. Exception: cannot identify image file <_io.BytesIO object at 0x7f6191ca4ea0>
Failed to fetch sample 2989692. Exception: cannot identify image file <_io.BytesIO object at 0x7f3d1589ad40>
Failed to fetch sample 2428943. Exception: cannot identify image file <_io.BytesIO object at 0x7f39728f07c0>
Failed to fetch sample 1866280. Exception: cannot identify image file <_io.BytesIO object at 0x7f6191ca6340>
Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7f39460758f0>
Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7f3a62dd7920>
Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556028f40>
Failed to fetch sample 4106410. Exception: cannot identify image file <_io.BytesIO object at 0x7f3d1598d990>
Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7f655602fc40>
Failed to fetch sample 3705515. Exception: cannot identify image file <_io.BytesIO object at 0x7f3946076700>
Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7f655602b470>
Failed to fetch sample 1875546. Exception: cannot identify image file <_io.BytesIO object at 0x7f3d146df240>
Failed to fetch sample 1787371. Exception: cannot identify image file <_io.BytesIO object at 0x7f3c5605fa60>
Failed to fetch sample 2725411. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec950e00>
Failed to fetch sample 2701377. Exception: cannot identify image file <_io.BytesIO object at 0x7f3d1510d670>
Failed to fetch sample 1570916. Exception: cannot identify image file <_io.BytesIO object at 0x7f649602a610>
Failed to fetch sample 3329980. Exception: cannot identify image file <_io.BytesIO object at 0x7f655cdba570>
Failed to fetch sample 1985115. Exception: cannot identify image file <_io.BytesIO object at 0x7f6496056c00>
Failed to fetch sample 3920507. Exception: cannot identify image file <_io.BytesIO object at 0x7f3a62dd7920>
Failed to fetch sample 3150542. Exception: cannot identify image file <_io.BytesIO object at 0x7f3c575cede0>
Failed to fetch sample 1291371. Exception: cannot identify image file <_io.BytesIO object at 0x7f3d15847ce0>
Failed to fetch sample 2170106. Exception: cannot identify image file <_io.BytesIO object at 0x7fa6506f0e50>
Failed to fetch sample 1919680. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a6861440>
Failed to fetch sample 1614232. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd6a4b1b20>
Failed to fetch sample 3106520. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc26e3dc60>
Failed to fetch sample 2338284. Exception: cannot identify image file <_io.BytesIO object at 0x7f801f5efa10>
Failed to fetch sample 2835131. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568a6250>
Failed to fetch sample 3873212. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc255d7650>
Failed to fetch sample 1066999. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d1594f740>
Failed to fetch sample 4378941. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e58449b70>
Failed to fetch sample 3176073. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e58388cc0>
Failed to fetch sample 3232595. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c76f20>
Failed to fetch sample 1746943. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c76c00>
Failed to fetch sample 2082780. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f1af150>
Failed to fetch sample 2840465. Exception: cannot identify image file <_io.BytesIO object at 0x7f3bcf3b9df0>
Failed to fetch sample 1484363. Exception: cannot identify image file <_io.BytesIO object at 0x7f3c50a30180>
Failed to fetch sample 3281167. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a68611c0>
Failed to fetch sample 3253181. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c764d0>
Failed to fetch sample 3915660. Exception: cannot identify image file <_io.BytesIO object at 0x7f64189a1c10>
Failed to fetch sample 4110233. Exception: cannot identify image file <_io.BytesIO object at 0x7f655613fbf0>
Failed to fetch sample 1901586. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d50588900>
Failed to fetch sample 1149949. Exception: cannot identify image file <_io.BytesIO object at 0x7f802848f740>
Failed to fetch sample 2526307. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82bafc0>
Failed to fetch sample 3210209.
Exception: cannot identify image file <_io.BytesIO object at 0x7f81b9ca6d40> Failed to fetch sample 2438090. Exception: cannot identify image file <_io.BytesIO object at 0x7f3bcf3b9df0> Failed to fetch sample 2910986. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a68613f0> Failed to fetch sample 1305316. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec950e00> Failed to fetch sample 1746943. Exception: cannot identify image file <_io.BytesIO object at 0x7f7ced30f290> Failed to fetch sample 1819035. Exception: cannot identify image file <_io.BytesIO object at 0x7f3bcf3b9df0> Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7f3d14680d60> Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c7bc40> Failed to fetch sample 2117429. Exception:Failed to fetch sample 1419084. Exception: cannot identify image file <_io.BytesIO object at 0x7f6ba8b6fd80> Failed to fetch sample 4151771. Exception:s just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4151771. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f238ea0> cannot identify image file <_io.BytesIO object at 0x7f83e84ef0b0> Failed to fetch sample 3599167. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f26a250> Failed to fetch sample 1746943. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f26a7f0> Failed to fetch sample 2607760. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa2db20> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7b9c0> Failed to fetch sample 1831515. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568a6ca0> Failed to fetch sample 2555087. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf81ae30> Failed to fetch sample 1243147. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf81bdd0> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf81b100> Failed to fetch sample 2035921. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e584358a0> Failed to fetch sample 3594396. Exception: cannot identify image file <_io.BytesIO object at 0x7f936bd16390> Failed to fetch sample 1705064. Exception: cannot identify image file <_io.BytesIO object at 0x7f936bd163e0> Failed to fetch sample 1857192. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc27789850> Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc255d76a0> Failed to fetch sample 1787371. Exception: cannot identify image file <_io.BytesIO object at 0x7f936c731440> Failed to fetch sample 2753315. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e5838d8f0> Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc269871a0> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f236c00> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc26987010> Failed to fetch sample 2428943. Exception: cannot identify image file <_io.BytesIO object at 0x7fd99eea58f0> Failed to fetch sample 1110266. Exception:Failed to fetch sample 1696609. Exception: cannot identify image file <_io.BytesIO object at 0x7fd99eea6610> Failed to fetch sample 2987926. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d944359e0> Failed to fetch sample 3705515. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdc255d7fb0> Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d145bbb50> Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO objcannot identify image file <_io.BytesIO object at 0x7f9d50f65620> cannot identify image file <_io.BytesIO object at 0x7f6a7f281710> cannot identify image file <_io.BytesIO object at 0x7fb8ee739350> Failed to fetch sample 3672112. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc5544c6d0> Failed to fetch sample 1201923. Exception: cannot identify image file <_io.BytesIO object at 0x7fcd9b038040> Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc57d6a390> Failed to fetch sample 2102696. Exception: cannot identify image file <_io.BytesIO object at 0x7fb8edd673d0> Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7fcea3aff290> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7fb97778df30> Failed to fetch sample 2503713. Exception: Failed to fetch sample 1158515. Exception: cannot identify image file <_io.BytesIO object at 0x7fb8ee739350> Failed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9fae04900> Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO object at 0x7fb667590ea0> Failed to fetch sample 4230612. Exception: cannot identify image file <_io.BytesIO object at 0x7fcaae70c400> Failed to fetch sample 2989692. Exception: cannot identify image file <_io.BytesIO object at 0x7fb667592610> Failed to fetch sample 1941188. Exception: cannot identify image file <_io.BytesIO object at 0x7fba3048af70> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7fb6a29fd580> Failed to fetch sample 1866280. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc9ce339a30> Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7fb669afc9f0> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf753b50> Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf753880> Failed to fetch sample 1007230. Exception:s just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable cannot identify image file <_io.BytesIO object at 0x7fb8ee739120> ust got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... =(true | false) To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7f936bd163e0> Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7f936b357b00> Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777933d0> cannot identify image file <_io.BytesIO object at 0x7f936c731440> Failed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7f910da03c90> Failed to fetch sample 2955526. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a99f7330> Failed to fetch sample 4125910. Exception: cannot identify image file <_io.BytesIO object at 0x7f95bc522890> Failed to fetch sample 4230612. Exception: cannot identify image file <_io.BytesIO object at 0x7f90e4c5eca0> Failed to fetch sample 1392619. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1bfcb9b1f0> cannot identify image file <_io.BytesIO object at 0x7f90e524d7b0> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f936c739fd0> Failed to fetch sample 2072865. Exception: huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ion: cannot identify image file <_io.BytesIO object at 0x7f83e84ef060> Failed to fetch sample 3251376. Exception: cannot identify image file <_io.BytesIO object at 0x7f3d1586b8d0> Failed to fetch sample 1746943. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84eed40> Failed to fetch sample 3735721. Exception: cannot identify image file <_io.BytesIO object at 0x7f64189a1c60> Failed to fetch sample 1243147. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c96ac0> Failed to fetch sample 1875546. Exception: cannot identify image file <_io.BytesIO object at 0x7f6417e0db70> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c73600> Failed to fetch sample 3445908. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a6861350> Failed to fetch sample 1787371. Exception: cannot identify image file <_io.BytesIO object at 0x7f64189bb9c0> Failed to fetch sample 1986286. Exception: Failed to fetch sample 1689985. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d50f65620>cannot identify image file <_io.BytesIO object at 0x7f82a5e97fb0> Failed to fetch sample 3329980. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84eba10> cannot identify image file <_io.BytesIO object at 0x7f9ac15f13f0> Failed to fetch sample 1985115. 
Exception: ect at 0x7f1b7c555e90> Failed to fetch sample 1074051. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec950e00> Failed to fetch sample 3101535. Exception: ect at 0x7f9abcdc3fb0> cannot identify image file <_io.BytesIO object at 0x7f7d6cc75f80> cannot identify image file <_io.BytesIO object at 0x7f846e393650> Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568a3a60> cannot identify image file <_io.BytesIO object at 0x7f80201b3150> Failed to fetch sample 3101535. Exception:Failed to fetch sample 1866280. Exception: ject at 0x7f7a91feff10> cannot identify image file <_io.BytesIO object at 0x7f801f5efa10> cannot identify image file <_io.BytesIO object at 0x7f7b800ead40> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f801f5efe20> Failed to fetch sample 1866280. Exception:Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84ebc90>Failed to fetch sample 1633231. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50b3c98f0> cannot identify image file <_io.BytesIO object at 0x7f1b7c555e40> Failed to fetch sample 2406460. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e302f13f0> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a663cfb50> Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7cf4ad90> Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568a3f60> ject at 0x7fa50aa2ed40> Failed to fetch sample 2531816. Exception: Failed to fetch sample 1348427. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7b290> cannot identify image file <_io.BytesIO object at 0x7f936bd162f0> Failed to fetch sample 1182119. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9d4fbbf150> Failed to fetch sample 3101535. Exception: ect at 0x7fb9777b7f10> cannot identify image file <_io.BytesIO object at 0x7f9abcd95850> ile <_io.BytesIO object at 0x7fb9777b4310> cannot identify image file <_io.BytesIO object at 0x7fa39edcf920> Failed to fetch sample 3930844. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50b3c9b20> Failed to fetch sample 1985115. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777b62a0> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcd971a0> Failed to fetch sample 4230612. Exception: Failed to fetch sample 1748763. Exception: ect at 0x7fa58c1b73d0> Failed to fetch sample 2428943. Exception: ect at 0x7f1bfcb93e70> cannot identify image file <_io.BytesIO object at 0x7f9ac15f07c0> ile <_io.BytesIO object at 0x7fa58d5b69d0> ailed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7f9dd1566570> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c36f0> cannot identify image file <_io.BytesIO object at 0x7fa2846ca660> Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7fa2846c8fe0> Failed to fetch sample 2408351. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6494db0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7f802848f740> Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82e2f70> cannot identify image file <_io.BytesIO object at 0x7f7cec950ea0> Failed to fetch sample 2056357. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc255d76a0> Failed to fetch sample 3705515. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6612ed0> Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cebf7f7e0> Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7f7ced30f3d0> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec9542c0> Failed to fetch sample 2428943. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a91feff10> ile <_io.BytesIO object at 0x7fb97778e8e0> Failed to fetch sample 1110266. Exception:Token indices sequence length is longer than the specified maximum Failed to fetch sample 1746943. Exception: ). Running this sequence through the model will result in indexing errors cannot identify image file <_io.BytesIO object at 0x7fb784822fc0> ile <_io.BytesIO object at 0x7f7a663cfb50> Failed to fetch sample 2009172. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a6861800> Failed to fetch sample 1791704. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8550ea0> Failed to fetch sample 2713848. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa50a9d7970> Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d50588270> Failed to fetch sample 1875546. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d4fbbe570> cannot identify image file <_io.BytesIO object at 0x7fa50aa2da80> Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84efe70> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7e020> Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO object at 0x7fa28409e750> Failed to fetch sample 2989692. Exception: huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 1879625. Exception: Failed to fetch sample 3705747. Exception: ect at 0x7fa2846ca110> Failed to fetch sample 3571476. Exception: ect at 0x7fccdf753010> cannot identify image file <_io.BytesIO object at 0x7fa39edcf920> ile <_io.BytesIO object at 0x7fccdf776430> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a5e97920> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable cannot identify image file <_io.BytesIO object at 0x7fccdf775710> parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors Failed to fetch sample 2309012. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d4fbbe5c0> ile <_io.BytesIO object at 0x7f1b7c555d00> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c36f0> cannot identify image file <_io.BytesIO object at 0x7f9abcd97970> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64e11c0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO object at 0x7f19d56dd490> _PARALLELISM=(true | false) Failed to fetch sample 1964363. Exception: cannot identify image file <_io.BytesIO object at 0x7f1a0ff1bb00> Failed to fetch sample 2417132. Exception: cannot identify image file <_io.BytesIO object at 0x7f1d9a747650> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7f18f3249cb0> Failed to fetch sample 3329980. Exception: Failed to fetch sample 3571476. Exception:ject at 0x7f80a80dc1d0> Failed to fetch sample 1985115. Exception: ject at 0x7f18e64e3a10> cannot identify image file <_io.BytesIO object at 0x7f80a82bb510> parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 2456731. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcd92e30> Failed to fetch sample 3210209. Exception: cannot identify image file <_io.BytesIO object at 0x7fa58dfd0a40> Failed to fetch sample 1899122. Exception: ect at 0x7f9d4fbbe570> cannot identify image file <_io.BytesIO objcannot identify image file <_io.BytesIO object at 0x7fa6517294e0> Failed to fetch sample 3318243. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e306e36a0> Failed to fetch sample 1129781. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84ee700> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7d8f0> cannot identify image file <_io.BytesIO object at 0x7fb8ee739210> Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7fb8edd67ab0> cannot identify image file <_io.BytesIO object at 0x7fa28409e750> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1024490. Exception: huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable le this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) ion: cannot identify image file <_io.BytesIO object at 0x7fa39edcf920> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. 
Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2428943. Exception:Failed to fetch sample 2666716. Exception:bject at 0x7fb667590ea0> cannot identify image file <_io.BytesIO object at 0x7f910da03c90> le <_io.BytesIO object at 0x7fb8edd67f60> Failed to fetch sample 3705515. Exception: cannot identify image file <_io.BytesIO object at 0x7fa2846c9620>Failed to fetch sample 3705515. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac236340> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac20ef70> Failed to fetch sample 1639236. Exception: cannot identify image file <_io.BytesIO object at 0x7efc52081260> Failed to fetch sample 3122128. Exception: cannot identify image file <_io.BytesIO object at 0x7efc52081170> Failed to fetch sample 3699865. Exception: cannot identify image file <_io.BytesIO object at 0x7efd95b66f70> e TOKENIZERS_PARALLELISM=(true | false) ion: cannot identify image file <_io.BytesIO object at 0x7f1b7cf1f1f0> Failed to fetch sample 2970666. Exception:bject at 0x7efc516afbf0> Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7efc516aed40> cannot identify image file <_io.BytesIO object at 0x7f1d9a747650> Failed to fetch sample 3409704. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c555cb0> le <_io.BytesIO object at 0x7efcdbb11210> Failed to fetch sample 2438256. Exception: Failed to fetch sample 1875546. Exception:cannot identify image file <_io.BytesIO object at 0x7ef9f8833fb0> Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9f8833fb0> Failed to fetch sample 2428943. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64977e0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2118761. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cebf7f7e0> cannot identify image file <_io.BytesIO object at 0x7fa58df9f240> le <_io.BytesIO object at 0x7ef9cc625080> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9cc626750> Failed to fetch sample 3705515. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb137e0> Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9cc626200> le <_io.BytesIO object at 0x7fcd9b1e0630> Failed to fetch sample 2102696. Exception: cannot identify image file <_io.BytesIO object at 0x7fa58ca509f0> cannot identify image file <_io.BytesIO object at 0x7fcc54a82d40> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7fcd9b1a6bb0> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4142195. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568a2b60> Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO ob cannot identify image fFailed to fetch sample 2989692. Exception: huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
[warning above repeated verbatim by each forked worker process; repeats omitted]

Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors

Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>
[error above repeated across workers, many samples retried multiple times; distinct sample ids: 983645, 990989, 1024490, 1045233, 1055686, 1069279, 1079347, 1091968, 1110266, 1182119, 1225312, 1232078, 1243147, 1296285, 1308895, 1356016, 1381036, 1410555, 1413496, 1414490, 1426970, 1430100, 1443927, 1466972, 1469916, 1533611, 1565916, 1579494, 1651065, 1684072, 1695344, 1716757, 1722498, 1746943, 1766859, 1787371, 1799278, 1866280, 1875546, 1879625, 1899078, 1959886, 1985115, 1986286, 1987318, 1991158, 2036348, 2050005, 2061737, 2072865, 2102696, 2148940, 2153518, 2156147, 2219757, 2220750, 2229538, 2232499, 2243630, 2270043, 2291536, 2310516, 2373901, 2428943, 2438256, 2563858, 2578561, 2639168, 2692014, 2727197, 2805262, 2841370, 2881495, 2906843, 2946246, 2956864, 2965706, 2970666, 2981253, 2989692, 3013388, 3041036, 3060109, 3062265, 3065228, 3101535, 3129076, 3151413, 3158912, 3210209, 3251742, 3281265, 3329980, 3430697, 3510868, 3571476, 3627094, 3666868, 3667570, 3705515, 3721013, 3747017, 3922910, 4016661, 4080764, 4100905, 4101886, 4102616, 4151771, 4195233, 4204775, 4217931, 4230612, 4236147, 4245950]
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable cannot identify image file <_io.BytesIO object at 0x7f114cd62a70> Failed to fetch sample 3747017. Exception: cannot identify image file <_io.BytesIO object at 0x7f6692312160> Failed to fetch sample 1697773. Exception: cannot identify image file <_io.BytesIO object at 0x7f0d82904b80> cannot identify image file <_io.BytesIO object at 0x7fdd445df010> Failed to fetch sample 1699831. Exception: Failed to fetch sample 1356016. Exception: ect at 0x7ef9f8833fb0> Failed to fetch sample 1985115. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bbb600> cannot identify image file <_io.BytesIO object at 0x7f0d82905d00> ile <_io.BytesIO object at 0x7efae6483330> Failed to fetch sample 3313261. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093733240> Failed to fetch sample 1588738. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9cc625030> cannot identify image file <_io.BytesIO object at 0x7f7b147071f0> ile <_io.BytesIO object at 0x7fdab2e555d0> Failed to fetch sample 1885521. Exception: huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2710201. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fc9a80> cannot identify image file <_io.BytesIO objcannot identify image file <_io.BytesIO object at 0x7fde75f15cb0> 2). Running this sequence through the model will result in indexing errors Failed to fetch sample 990670. Exception:ss just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) 0> Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73b3ce0> Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7fddc4c95cb0> Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a239ad490> cannot identify image file <_io.BytesIO object at 0x7fc3a66d98a0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 3013388. Exception: Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a1ff10> cannot identify image file <_io.BytesIO obFailed to fetch sample 1875546. Exception: ple 2788338. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a92ca3e70> cannot identify image file <_io.BytesIO object at 0x7f114c45b1a0> Failed to fetch sample 2087201. Exception: Failed to fetch sample 1182119. Exception: cannot identify image f cannot identify image file <_io.BytesIO obhuggingface/tokenizers: Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73b3ec0> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2138810. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fe1e40> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you Failed to fetch sample 3133979. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a962f6f0> e TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1985115. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d7d89bc0> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7f27820f6de0> Failed to fetch sample 1642822. Exception: cannot identify image file <_io.BytesIO object at 0x7efd96eb6fc0> Failed to fetch sample 1567734. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d7d89990> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7f5b1c6b9800> Failed to fetch sample 2428943. Exception: cannot identify image file <_io.BytesIO object at 0x7f289ed8fce0> Failed to fetch sample 1875546. Exception: ect at 0x7f12b39f3f60> Failed to fetch sample 4151771. Exception: cannot identify image file <_io.BytesIO object at 0x7f69989fe5c0> cannot identify image file <_io.BytesIO object at 0x7f802848f830> cannot identify image file <_io.BytesIO objcannot identify image file <_io.BytesIO object at 0x7fbf006ed760> cannot identify image file <_io.BytesIO object at 0x7f59d73b3ce0> Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7f27847e64d0> Failed to fetch sample 1787371. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb139c0> Failed to fetch sample 3510868. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f59d7da2ca0> Failed to fetch sample 2686021. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80de610> Failed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7f5783972070> Failed to fetch sample 3905234. Exception: cannot identify image file <_io.BytesIO obj cannot identify image file <_io.BytesIO object at 0x7feefa6fa1b0> Failed to fetch sample 4230612. Exception: cannot identify image file <_io.BytesIO object at 0x7f5b1d3dcd10> Failed to fetch sample 2792298. Exception: cannot identify image file <_io.BytesIO object at 0x7efd966dfe70> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2127419. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078da660> Failed to fetch sample 1381036. Exception: cannot identify image file <_io.BytesIO object at 0x7f114c2031a0> Failed to fetch sample 1746943. Exception: cannot identify image file <_io.BytesIO object at 0x7f7af76137e0> Failed to fetch sample 2208637. Exception:bject at 0x7fedb5ec3ab0> Failed to fetch sample 1866280. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d7da22f0> Failed to fetch sample 3472931. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a0b405d50> cannot identify image file <_io.BytesIO object at 0x7f80a80de6b0> ile <_io.BytesIO object at 0x7efc516af560> Failed to fetch sample 4230612. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f91f9c0> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d7da3fb0> cannot identify image file <_io.BytesIO object at 0x7f5cacdc4c70> sequence length for this model (8940 > 8192). 
Running this sequence through the model will result in indexing errors Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7ff6991d2c00> Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7efc52c3b3d0> Failed to fetch sample 2694335. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a1717a430> Failed to fetch sample 3595302. Exception: cannot identify image file <_io.BytesIO object at 0x7fddc4e827f0> Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f2a20> Failed to fetch sample 1866280. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3fa8b3d0> Failed to fetch sample 1593925. Exception: cannot identify image fihuggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can eFailed to fetch sample 1875546. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73b3ce0> ENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2216241. Exception: cannot identify image file <_io.BytesIO object at 0x7efae6483330> Failed to fetch sample 1024490. 
Exception:huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you c cannot identify image file <_io.BytesIO object at 0x7f1b7c572c50> - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2318966. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d894afc0> Failed to fetch sample 1900563. Exception: cannot identify image file <_io.BytesIO object at 0x7feefa670b80> Failed to fetch sample 1635599. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9cc625030> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7fdab2e549f0> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO objcannot identify image fFailed to fetch sample 1875546. Exception:ject at 0x7f114e6bcc20> Failed to fetch sample 4230612. Exception: ject at 0x7fedb5ec3ab0> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb13d80> Failed to fetch sample 2428943. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd431efe20> cannot identify image file <_io.BytesIO object at 0x7f1008a1fe20> Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO ohuggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible Failed to fetch sample 1866280. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093736700>TOKENIZERS_PARALLELISM=(true | false) ion: cannot identify image file <_io.BytesIO objFailed to fetch sample 1356016. Exception: le 2102696. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5d59e97dd0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 2072865. Exception:s just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 3923640. Exception:not identify image file <_io.BytesIO object at 0x7f1b7b9c0b80> ion: cannot identify image file <_io.BytesIO object at 0x7fb4a5b619e0> cannot identify image file <_io.BytesIO object at 0x7ff4db5489a0> le <_io.BytesIO object at 0x7f5affefbf10> Failed to fetch sample 3878860. Exception: cannot identify image file <_io.BytesIO object at 0x7f2b50f10040> Failed to fetch sample 3051178. Exception: cannot identify image file <_io.BytesIO object at 0x7efc520812b0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. DisabliFailed to fetch sample 3246712. Exception: able this warning, you can either: esIO object at 0x7f5e9cd4b420> - Avoid using `tokenizers` before the forFailed to fetch sample 4146922. Exception: ment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2446023. Exception: ect at 0x7f33876239c0> Failed to fetch sample 3325462. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3c165c0> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de388bec0> Failed to fetch sample 3112248. Exception: cannot identify image file <_io.BytesIO object at 0x7feef996f240> Failed to fetch sample 2989692. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb21f2ced40> Failed to fetch sample 3149198. Exception: cannot identify image file <_io.BytesIO object at 0x7fb4a64fd440> cannot identify image file <_io.BytesIO object at 0x7f9a9d7f4950> Failed to fetch sample 3888061. Exception: cannot identify image file <_io.BytesIO object at 0x7f10afb83650> cannot identify image file <_io.BytesIO object at 0x7ff27c2ffbf0> parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable e <_io.BytesIO object at 0x7f566827b3d0> Failed to fetch sample 2102696. Exception: cannot identify image file <_io.BytesIO object at 0x7fe44409a430> TOKENIZERS_PARALLELISM=(true | false) Tokecannot identify image file <_io.BytesIO object at 0x7f0e9cd5b330> Failed to fetch sample 1430141. Exception: cannot identify image file <_io.BytesIO object at 0x7f12295ec7c0> FaileFailed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cfbf0> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c424a610> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f17fb0> Failed to fetch sample 3210209. Exception: cannot identify image file <_io.BytesIO object at 0x7f11cfd93bf0> Failed to fetch sample 1787371. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b39f3920> the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7f122963b8d0> Failed to fetch sample 2102696. Exception: cannot identify image file <_io.BytesIO object at 0x7fc193db63e0> Failed to fetch sample 2943308. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249668f90> Failed to fetch sample 3013388. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe1bd8487c0> Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7f11f156ede0> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7f530d54f380> Failed to fetch sample 1961364. Exception:bject at 0x7feb2f639c10> Failed to fetch sample 3552752. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d7d89d50> Failed to fetch sample 3511582. Exception: cannot identify image f cannot identify image file <_io.BytesIO obFailed to fetch sample 2428943. Exception: cannot identify image file <_io.BytesIO object at 0x7feb6a3cac50> Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7f5b1c6b84f0> Failed to fetch sample 2989692. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf057a5b70> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd431effb0> Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7feb2f63b1f0> Failed to fetch sample 2587421. Exception: cannot identify image file <_io.BytesIO object at 0x7efd96eb6fc0> Failed to fetch sample 3666868. ExcFailed to fetch sample 1980458. Exception: cannot identify image file <_io.BytesIO object at 0x7ff7190a79c0> eption: cannot identify image file <_io.BytesIO object at 0x7f5de3949b20> Failed to fetch sample 1879625. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fe1be4a3f60> le <_io.BytesIO object at 0x7f225a5eff60> Failed to fetch sample 3705515. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2496691c0> Failed to fetch sample 4230612. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f12b39f3650> Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0f9a2390> Failed to fetch sample 3210209. Exception: cannot identify image file <_io.BytesIO object at 0x7efd966dee30> Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d59e97f60> cannot identify image file <_io.BytesIO object at 0x7f22e39baac0> Failed to fetch sample 2454376. Exception: Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7ff6987cb330> Failed to fetch sample 2534173. Exception: cannot identify image file <_io.BytesIO object at 0x7f4e84965350> Failed to fetch sample 1866280. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fa6a4bc90> lism has already been used. Disabling parallelism to avoid deadlocks... - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3067785. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24966a2a0> Failed to fetch sample 1496036. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d7ddb86d0> Failed to fetch sample 3449687. Exception: cannot identify image file <_io.BytesIO object at 0x7f10094017b0> Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7f4e83f96d90> Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7f69186e5d50> cannot identify image file <_io.BytesIO object at 0x7efcd7b16610> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7f2448426840> Failed to fetch sample 2673027. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fcd9a26ef70> Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8ed1c0> has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7fc4c909e7a0> Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebae480> Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190bf650> Failed to fetch sample 2072865. Exception: just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: esIO object at 0x7efcd7b39c60> Failed to fetch sample 2679148. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e9e6f1fd0> Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08d7ce0> _PARALLELISM=(true | false) Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7f225a5ee020> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8ed0d0> Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7f5bb391ccc0> Failed to fetch sample 1985115. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7b17880> Failed to fetch sample 3919129. Exception: ect at 0x7f10937356c0> Failed to fetch sample 4230612. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39b6fc0>Failed to fetch sample 3869412. Exception: cannot identify image file <_io.BytesIO object at 0x7fb5ea6f0f40> Failed to fetch sample 4157325. Exception:Failed to fetch sample 3705515. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37122f0> Failed to fetch sample 4230612. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7f3939c76ac0> ile <_io.BytesIO object at 0x7fc000b14900> cannot identify image file <_io.BytesIO object at 0x7f69190be980> Failed to fetch sample 3433571. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca1080> cannot identify image file <_io.BytesIO object at 0x7fc263520720> Failed to fetch sample 4230612. Exception: cannot identify image file <_io.BytesIO object at 0x7f239cfe6ac0> Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7f69186e4310> Failed to fetch sample 3705515. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a1ff60> parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1440224. Exception: cannot identify image file <_io.BytesIO object at 0x7fc3a4d2bd30> Failed to fetch sample 1866280. Exception: cannot identify image file <_io.BytesIO object at 0x7f20ef4abf60> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08d74c0> cannot identify image file <_io.BytesIO object at 0x7fc000b14900> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f1ffc685cb0> Failed to fetch sample 2072865. Exception:ject at 0x7f6556057b50> Failed to fetch sample 2639168. 
Exception:bject at 0x7fc0f08d70b0> Failed to fetch sample 1243147. Exception:bject at 0x7fbfd9809c60> Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39b7380> Token indices sequence length is longer th cannot identify image file <_io.BytesIO object at 0x7fc38471de40> Failed to fetch sample 1859332. Exception: cannot identify image file <_io.BytesIO object at 0x7f655602f420> cannot identify image file <_io.BytesIO object at 0x7fbfdc66f7e0> ile <_io.BytesIO object at 0x7fdc25fe31f0> cannot identify image file <_io.BytesIO object at 0x7f1093736980> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2243630. Exception:Failed to fetch sample 2526328. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71c6d0> Failed to fetch sample 3329980. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093758450> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1747603. Exception: cannot identify image fiFailed to fetch sample 1314678. Exception:F cannot identify image file <_io.BytesIO object at 0x7f6556057c90> e <_io.BytesIO object at 0x7fc19474e7f0> Failed to fetch sample 4236147. 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors
Failed to fetch sample 3329980. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf99bc90>
(The same "cannot identify image file" exception, with varying _io.BytesIO addresses, was also raised for samples: 1006977, 1024490, 1079347, 1102280, 1110266, 1182119, 1243147, 1356016, 1381036, 1382315, 1406387, 1429383, 1431890, 1440224, 1456814, 1513345, 1532776, 1556637, 1614626, 1623580, 1635049, 1697947, 1747603, 1769963, 1787371, 1866280, 1875546, 1879625, 1986286, 2072865, 2094843, 2102696, 2129288, 2158469, 2163001, 2243630, 2435017, 2438256, 2460138, 2491527, 2526328, 2601168, 2816660, 2970491, 2989692, 3010844, 3013388, 3032101, 3101535, 3140437, 3162205, 3210209, 3224871, 3237467, 3251742, 3371113, 3390528, 3510868, 3521247, 3525574, 3571476, 3608571, 3666292, 3666868, 3677196, 3705515, 3709156, 3916323, 4028317, 4080764, 4102616, 4142612, 4151771, 4192667, 4199490, 4230612, 4236147)
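The tokenizers fork warning above states its own remedy: set `TOKENIZERS_PARALLELISM` before the process forks. A minimal sketch of applying that fix in the training entry point, before any fast tokenizer is used or any DataLoader workers are spawned (whether this silences the warning in a given setup depends on import/fork order):

```python
import os

# Must run before the first fast-tokenizer call and before DataLoader
# worker processes fork, so the tokenizer's Rust thread pool is never
# inherited across a fork. "false" simply disables intra-tokenizer
# parallelism; "true" keeps it but suppresses the warning.
os.environ.setdefault("TOKENIZERS_PARALLELISM", "false")
```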
Failed to fetch sample 2549560. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d7ddb86d0>
(The same "cannot identify image file" exception, with varying _io.BytesIO addresses, was also raised for samples: 988581, 1001518, 1047785, 1131540, 1133272, 1196025, 1393921, 1451627, 1485143, 1506692, 1637841, 1695752, 1746943, 1784589, 1801044, 1885133, 1917880, 1961415, 1988418, 1989738, 2036348, 2123374, 2148940, 2166326, 2189796, 2210293, 2217549, 2250980, 2268425, 2410651, 2423353, 2428943, 2450569, 2555299, 2639168, 2733843, 2850892, 2943308, 2956663, 2957876, 2964818, 3003551, 3067785, 3172214, 3214448, 3492865, 3665508, 3747017, 3907787, 4028686, 4201103)
Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7f97b0167f10> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7f1010650b80> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7f0e2840d2b0> Failed to fetch sample 3705515. Exception: Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7f0e2840d0d0> Failed to fetch sample 3733254. Exception: cannot identify image fiFailed to fetch sample 1471384. Exception:cannot identify image file <_io.BytesIO object at 0x7f7cec950cc0> Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7f09953d6070> Failed to fetch sample 4230612. Exception: cannot identify image file <_io.BytesIO object at 0x7f71342d3a60> Failed to fetch sample 4242511. Exception: cannot identify image file <_io.BytesIO object at 0x7efce518b740> cannot identify image file <_io.BytesIO object at 0x7f7af4a2fc40> parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable cannot identify image file <_io.BytesIO object at 0x7f9692a0a1b0> Failed to fetch sample 4089113. Exception: cannot identify image file <_io.BytesIO object at 0x7f469ef9c900> ocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1183242. Exception: cannot identify image file <_io.BytesIO object at 0x7f71342d35b0> Failed to fetch sample 2428943. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a8d19120> Failed to fetch sample 2211172. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb5ff9d8860> cannot identify image file <_io.BytesIO object at 0x7f0c1b535e40> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7efe76ab7c40> Failed to fetch sample 3549889. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b223830> Failed to fetch sample 1245623. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200d1fe70> Failed to fetch sample 1986286. Exception:Failed to fetch sample 1322661. Exception: cannot identify image file <_io.BytesIO object at 0x7fe444a64720> Failed to fetch sample 1381036. Exception: cannot identify image file <_io.BytesIO object at 0x7fb5fefe2f20> cannot identify image file <_io.BytesIO object at 0x7f7da2cd3ce0> Failed to fetch sample 3571476. Exception:bject at 0x7f5d59e97f10> Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7fe447374310> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` befhuggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disa cannot identify image file <_io.BytesIO object at 0x7efa5f45f600> Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cbfe6b60> FailFailed to fetch sample 3705515. Exception: ot identify image file <_io.BytesIO object at 0x7f0e6bde2b60> Failed to fetch sample 1739651. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c76cf0>: cannot identify image file <_io.BytesIO object at 0x7f47e0f6ebb0> cannot identify image file <_io.BytesIO object at 0x7f80284b5cb0> Failed to fetch sample 3608877. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0fd5769d0> cannot identify image file <_io.BytesIO object at 0x7fe4c424bbf0> le <_io.BytesIO object at 0x7f4418a3bec0> Failed to fetch sample 2748392. 
Exception:bject at 0x7f0fa3bbde40> cannot identify image file <_io.BytesIO object at 0x7fe445765c10> Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1b535da0> cannot identify image file <_io.BytesIO object at 0x7fcbdef65cb0> Failed to fetch sample 3547658. Exception: ect at 0x7f472789bf60> Failed to fetch sample 2378618. Exception:ject at 0x7f277545bba0> Failed to fetch sample 1356016. Exception:bject at 0x7fb5ff9d8860> Failed to fetch sample 2541015. Exception: cannot identify image file <_io.BytesIO object at 0x7fcef8cefd30> Failed to fetch sample 3597435. Exception: cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7ff55caa3740> Failed to fetch sample 2102696. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f27820f6de0> Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c18bb00> Failed to fetch sample 1493215. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c18b6f0> Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab77e20> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: cannot identify image file <_io.BytesIO object at 0x7f472789bb00> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7f991b216070> ion: cannot identify image file <_io.BytesIO object at 0x7f5838fc6520> Failed to fetch sample 3013388. Exception: annot identify image filFailed to fetch sample 2243583. Exception:bject at 0x7f27847e5580> 3388. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f97b0167f10> cannot identify image file <_io.BytesIO object at 0x7f5c9d8fbe20> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... IO object at 0x7fca1c102f70> Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2496690d0> Failed to fetch sample 2989692. Exception:To disable this warning, cannot identify image file <_io.BytesIO object at 0x7f8150b1bf60> Failed to fetch sample 2989692. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a5e6d5a80> 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f2862453880> Failed to fetch sample 4171919. Exception: cannot identify image file <_io.BytesIO object at 0x7fc4043b79c0> Failed to fetch sample 2072865. Exception:bject at 0x7fcae18ead40> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249668f90> cannot identify image file <_io.BytesIO objcannot identify image file <_io.BytesIO object at 0x7fdcaf81ec00> Failed to fetch sample 3101535. Exception: cannot identify image fFailed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7fc715940720> Failed to fetch sample 2428943. Exception: cannot identify image file <_io.BytesIO object at 0x7ff27c2ffbf0> Failed to fetch sample 3661798. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c56f4c0> Failed to fetch sample 1787371. Exception: cannot identify image file <_io.BytesIO object at 0x7f995aaf11c0> Failed to fetch sample 1787371. Exception:Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3af6160> Failed to fetch sample 4230612. Exception: cannot identify image file <_io.BytesIO object at 0x7f098657b650> Failed to fetch sample 3542359. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf81fc90> cannot identify image file <_io.BytesIO object at 0x7f995a101c10> le <_io.BytesIO object at 0x7f7eca3f23e0> Failed to fetch sample 4080764. Exception: cannot identify image file <_io.BytesIO object at 0x7f1bfc28b560> cannot identify image file <_io.BytesIO object at 0x7ff2575efc40> Failed to fetch sample 2316998. Exception: cannot identify image file <_io.BytesIO object at 0x7ff62059dd50> Failed to fetch sample 2993949. Exception: cannot identify image file <_io.BytesIO object at 0x7f51caf89ad0> Failed to fetch sample 3677186. Exception: cannot identify image file <_io.BytesIO object at 0x7f51caff5760> Failed to fetch sample 2602959. Exception: cannot identify image file <_io.BytesIO object at 0x7f995956db70> ailed to fetch sample 3210209. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2495e7060> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7f83045caf20> Failed to fetch sample 1578193. Exception: cannot identify image file <_io.BytesIO object at 0x7fde75ec5670> Failed to fetch sample 1866280. Exception: cannot identify image file <_io.BytesIO object at 0x7f09953d6070> Failed to fetch sample 2887528. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf855080> Failed to fetch sample 2516250. Exception: cannot identify image file <_io.BytesIO object at 0x7fd311a71850> Failed to fetch sample 4102616. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c1c10> Failed to fetch sample 1165247. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568d53a0> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bb7dd0> Failed to fetch sample 1875546. Exception: cannot identify image file <_io.BytesIO object at 0x7f51caff4770> Failed to fetch sample 1079347. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5253b5aed0> Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7f7ef6be8e00> Failed to fetch sample 1875546. Exception: cannot identify image file <_io.BytesIO object at 0x7fc107a4eac0> Failed to fetch sample 2102696. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bb7970> Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253b5bc40> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disablicannot identify image file <_io.BytesIO object at 0x7f7a568d7970> Failed to fetch sample 1885521. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf854db0> FaTOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3452227. Exception: cannot identify image fileFaFailed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7f098641f880> Failed to fetch sample 2036348. ExceptFailed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7322fc90> ion: cannot identify image file <_io.BytesIhuggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) del will result in indexing errors Failed to fetch sample 3895186. Exception: ject at 0x7fb8ee739ad0> ailed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7f4f442009f0> Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7f4f44201c10> cannot identify image file <_io.BytesFailed to fetch sample 1986286. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5838fc7470> y set the environment variable TOKENIZERS_PARALLELISM=(true | false) n: cannot identify image file <_io.BytesIO object at 0x7fa088c43060> Failed to fetch sample 2988347. Exception: cannot identify image file <_io.BytesIO object at 0x7f21afeb7740> Failed to fetch sample 1530445. Exception: cannot identify image file <_io.BytesIO object at 0x7fb742672570> Failed to fetch sample 1971835. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a1735a7f0> Failed to fetch sample 3443661. Exception: cannot identify image file <_io.BytesIO object at 0x7f991bdd89a0> Failed to fetch sample 2102696. Exception:Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c187c90> Failed to fetch sample 3157540. Exception: cannot identify image fiFailed to fetch sample 2148940. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7fb742360360> 875546. Exception: cannot identify image file <_io.BytesIO object at 0x7f5cac3cfc40> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMFailed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7fc715329800> Failed to fetch sample 1102280. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c12b0> cannot identify image file <_io.BytesIO object at 0x7fcbde565f30> le <_io.BytesIO object at 0x7f9d82547600> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401ce570> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` befFailed to fetch sample 2528140. Exception: cannot identify image filTOKENIZERS_PARALLELISM=(true | false) ion: cannot identify image file <_io.BytesIO object at 0x7f18e64e13a0> Failed to fetch sample 2526328. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c09a0> Failed to fetch sample 2250980. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f237830> Failed to fetch sample 3013388. Exception:cannot identify image file <_io.BytesIO object at 0x7fe0ecdcbe70> Failed to fetch sample 3133979. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca3ce0> ing parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1098352. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c380dd00> Failed to fetch sample 1747603. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7bb5f970> Failed to fetch sample 2903773. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e5920f9c0> sable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4102616. Exception: ject at 0x7fc188b2da30> Failed to fetch sample 2989692. Exception:ject at 0x7f3294d9b1f0> Failed to fetch sample 1356016. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c380f790> can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f236e30> Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c0b30> Failed to fetch sample 3907787. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f264a40> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable huggingface/tokenizers: The current process just got forked, after Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7fd17e71acf0> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7fb37942fec0> TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7fc24c6be8e0> Failed to fetch sample 2072865. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1875b0> huggingface/tokenizers: The current process just got forked, after Failed to fetch sample 4073421. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f25ab10> Failed to fetch sample 2454376. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f2646d0> Failed to fetch sample 3855219. ExceptFailed to fetch sample 1024490. Exception: O object at 0x7fd3103d4a40> Failed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7fbea9cdef20> Failed to fetch sample 1099448. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f2962f0> cannot identify image file <_io.BytesIO object at 0x7fd2003a09f0> Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7b27765c0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: _PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. DisabliFailed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99cc16d90> Failed to fetch sample 3285119. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84ed9e0> Failed to fetch sample 2686021. Exception: cannot identify image file <_io.BytesIO object at 0x7f39486c0ea0> Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7fcef824d530> Failed to fetch sample 2170101. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b388b420> an either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4080764. Exception: cannot identify image file <_io.BytesIO object at 0x7fb667590ea0> ailed to fetch sample 1866280. Exception: cannot identify image file <_io.BytesIO object at 0x7fbea6364a90> cannot identify image file <_io.BytesIO object at 0x7f6a7f297ce0> Failed to fetch sample 3537815. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cbf97920> Failed to fetch sample 4102616. Exception:Failed to fetch sample 2606964. Exception: cannot identify image file <_io.BytesIO object at 0x7f3c56082660> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7fb667592610> ailed to fetch sample 3535594. ExceptiFailed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca3880> llelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7fd2003a1440>Failed to fetch sample 3644627. Exception: ng parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 1465124. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7fccdf7534c0> cannot identify image file <_io.BytesIO object at 0x7fc24d053880> le <_io.BytesIO object at 0x7f64189bb330> cannot identify image file <_io.BytesIO object at 0x7f3c56083600> le <_io.BytesIO object at 0x7fccdf7538d0> Failed to fetch sample 3371113. Exception:bject at 0x7f357b924ef0> Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b926a20> cannot identify image file <_io.BytesIO object at 0x7fd30f9dff10> Failed to fetch sample 2805860. Exception:ject at 0x7fccdf7af9c0> ile <_io.BytesIO object at 0x7f9abcdc09a0> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e2e06fe70> Failed to fetch sample 1174992. Exception:Failed to fetch sample 2198101. Exception: ect at 0x7fb784820c20> Failed to fetch sample 1099448. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf776bb0> cannot identify image file <_io.BytesIO object at 0x7f357b926160> Failed to fetch sample 3571476. Exception:huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMcannot identify image file <_io.BytesIO object at 0x7fb8edd66c50>=(true | false) cannot identify image file <_io.BytesIO object at 0x7f101067e9d0> Failed to fetch sample 4175538. Exception: cannot identify image file <_io.BytesIO object at 0x7f8151509d50> an either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) locks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d6bba0> Failed to fetch sample 3535594. Exception: cannot identify image file <_io.BytesIO object at 0x7fc4c85658f0> Failed to fetch sample 3910842. Exception: cannot identify image file <_io.BytesIO object at 0x7f51ca5c7330> Failed to fetch sample 1986286. Exception: cannot identify image fiFailed to fetch sample 2526328. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777dbb00> le <_io.BytesIO object at 0x7f530e73efc0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ion: cannot identify image file <_io.BytesIO object at 0x7f469ef9cbd0> Failed to fetch sample 1355374. Exception: cannot identify image file <_io.BytesIO object at 0x7f51ca573f10> Failed to fetch sample 3210209. Exception: cannot identify image file <_io.BytesIO object at 0x7f51ca573880> Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7f7eca3f1620> cannot identify image file <_io.BytesIO object at 0x7fdfaae41b70> 9. Exception: cannot identify image file <_io.BytesIO object at 0x7fc188cb3060> Failed to fetch sample 3636421. Exception: cannot identify image file <_io.BytesIO object at 0x7f51caf89df0> Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7f47e274da80> Failed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7f7eca3f23e0> Failed to fetch sample 3210209. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbe774ad3a0> Failed to fetch sample 2208567. Exception:ject at 0x7fb9777da5c0> Failed to fetch sample 2012611. Exception: cannot identify image file <_io.BytesIO object at 0x7f4f44200cc0> cannot identify image file <_io.BytesIO object at 0x7f65561b9a80> an either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3916323. Exception:Failed to fetch sample 1024490. Exception: cannot identify image fF cannot identify image file <_io.BytesIO object at 0x7fb8edd67470> e <_io.BytesIO object at 0x7fdfab859440> ile <_io.BytesIO object at 0x7f83045caf20> Failed to fetch sample 4027148. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bcbba0> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7f472789cef0> cannot identify image file <_io.BytesIO object at 0x7f7d7d3fdd00> le <_io.BytesIO object at 0x7f7ecaa08cc0> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` befFailed to fetch sample 3414742. Exception: the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1250585. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84a2d90> Failed to fetch sample 2428943. Exception: cannot identify image fihuggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable - Explicitly set the environment variable Failed to fetch sample 2237717. Exception: cannot identify image file <_io.BytesIO object at 0x7f655602cd60> cannot identify image file <_io.BytesIO object at 0x7fdcaf83d1c0> Failed to fetch sample 1156321. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f537e0> Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7f47e0e8e840> Failed to fetch sample 1555436. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db53f920> TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1875546. Exception: cannot identify image file <_io.BytesIO object at 0x7f8150b71a30>Failed to fetch sample 3705515. Exception:: cannot identify image file <_io.BytesIO object at 0x7f5cacdc4ef0> Failed to fetch sample 4080764. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a6360e1b0> Failed to fetch sample 1007919. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf83e0c0> fore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7fe0ed73e7f0> Failed to fetch sample 3738116. Exception:Failed to fetch sample 2148940. Exception: cannot identify image ficannot identify image file <_io.BytesIO obFailed to fetch sample 3414742. Exception: ject at 0x7f60ffea7dd0> Failed to fetch sample 4102616. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d6d1bb330> Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf83f7e0> Failed to fetch sample 2693825. Exception:ject at 0x7f65560753f0> cannot identify image file <_io.BytesIO object at 0x7f610123a340> cannot identify image file <_io.BytesIO object at 0x7f001fb55a30> Failed to fetch sample 4102616. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e7426f240> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a663cfc40> Failed to fetch sample 1110721. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fdb1a0> ailed to fetch sample 2421925. 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors
Failed to fetch sample 1478347. Exception: cannot identify image file <_io.BytesIO object at 0x7f001fa7d3f0>
Failed to fetch sample 1007919. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d505882c0>
Failed to fetch sample 4151771. Exception: cannot identify image file <_io.BytesIO object at 0x7f936c731350>
Failed to fetch sample 2518896. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82bb330>
Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7eff95ea4310>
Failed to fetch sample 2639168. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16ecad90>
Failed to fetch sample 1102280. Exception: cannot identify image file <_io.BytesIO object at 0x7f5df06be930>
Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7eff97202930>
[... the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error repeats for dozens more sample ids (e.g. 983451, 1079347, 1747603, 2072865, 2816660, 3013388, 3101535, 3510868, 3666868, 4028686, 4102616, 4236147), interleaved across worker processes with further copies of the tokenizers fork warning and the token-length warning ...]
Exception: cannot identify image file <_io.BytesIO object at 0x7fd2c46f3010> cannot identify image file <_io.BytesIO object at 0x7f07a84a3b50> Failed to fetch sample 3146052. Exception: cannot identify image fiFailed to fetch sample 2403442. Exception:s just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 3747017. Exception:bject at 0x7f19d56dd490> Failed to fetch sample 1754611. Exception: n the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors cannot identify image file <_io.BytesIO obj cannot identify image cannot identify image file <_io.BytesIO object at 0x7f469ef9c0e0> Failed to fetch sample 3572434. Exception: cannot identify image file <_io.BytesIO objeFailed to fetch sample 2037964. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d4fbbe9d0> ng parallelism to avoid deadlocks... To disable this warning, you can either: Failed to fetch sample 2102696. Exception: cannot identify image file <_io.BytesIO object at 0x7f46a0d811c0> _PARALLELISM=(true | false) Failed to fetch sample 3432561. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa3cef0> Failed to fetch sample 2957876. Exception: cannot identify image file <_io.BytesIO object at 0x7f668810d9e0> cannot identify image file <_io.BytesIO object at 0x7f351e0003b0> Failed to fetch sample 2344108. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac20fa10> cannot identify image file <_io.BytesIO object at 0x7f991b216020> Failed to fetch sample 1313739. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d68180> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7f472789e430> Failed to fetch sample 3276033. 
Exception:Failed to fetch sample 2590198. Exception: ject at 0x7fc107a4f6f0> Failed to fetch sample 3013388. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7f37719ccb30> Failed to fetch sample 3225671. Exception: cannot identify image file <_io.BytesIO object at 0x7f90e4c5eca0> 875546. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb201da0> cannot identify image file <_io.BytesIO object at 0x7fcef8ceff60> Failed to fetch sample 3854766. Exception: cannot identify image file <_io.BytesIO object at 0x7fe444a64900> Failed to fetch sample 2143670. Exception: cannot identify image file <_io.BytesIO object at 0x7f7af4a2ffb0> Failed to fetch sample 2438256. Exception: ect at 0x7fc18853b650> Failed to fetch sample 2989692. Exception: cannot identify image file <_io.BytesIO object at 0x7f47e0e87510> Failed to fetch sample 1787371. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecdba67a0> cannot identify image file <_io.BytesIO object at 0x7f18e64ceac0> Failed to fetch sample 1440224. Exception:ject at 0x7f33af810bd0> Failed to fetch sample 1024636. Exception: cannot identify image file <_io.BytesIO object at 0x7f366045d8a0> Failed to fetch sample 1627382. Exception:bject at 0x7fc1897b0f40> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7f47edf87380> bject at 0x7f6a8ecdac50> 68550. Exception: cannot identify image file <_io.BytesIO object at 0x7fbc5c248810> cannot identify image file <_io.BytesIO object at 0x7fb9777dcc20> Failed to fetch sample 1110800. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b97978770> Failed to fetch sample 4128727. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a8bcc5e90> Failed to fetch sample 1695344. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64ce520> bject at 0x7fd2003677e0> Failed to fetch sample 2533353. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd17e750900> cannot identify image file <_io.BytesIO obFailed to fetch sample 1024490. Exception: ple 3329980. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7ee30> Failed to fetch sample 3210209. Exception:bject at 0x7fb9777b6020> le <_io.BytesIO object at 0x7f351e001d00> Failed to fetch sample 1099448. Exception: cannot identify image file <_io.BytesIO object at 0x7f90e524e610> Failed to fetch sample 1182119. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe7ed66390> Failed to fetch sample 1437727. Exception:ject at 0x7fb6003c2610> Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7f5525ebe110> Failed to fetch sample 1875546. Exception: cannot identify image file <_io.BytesIO object at 0x7fd17e71b6a0> Failed to fetch sample 1985115. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401ccfe0> cannot identify image file <_io.BytesIO object at 0x7f18e64ce430> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid d cannot identify image file <_io.BytesIO object at 0x7fa50a9d7830> Failed to fetch sample 3246712. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7ab10> le <_io.BytesIO object at 0x7fd17f0fa750> enizers` before the fork if possible - Explicitly set the environment variable To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1022720. Exception: cannot identify image file <_io.BytesIO object at 0x7fb784822fc0> FFailed to fetch sample 2639168. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078c9800> Failed to fetch sample 2189796. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071f420> Failed to fetch sample 3535594. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac236340> Failed to fetch sample 2446023. Exception: just got forked, after parallelism has already been used. Disabli cannot identify image file <_io.BytesIO object at 0x7f7a568a6cf0> Failed to fetch sample 4117657. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf87d3f0> F cannot identify image file <_io.BytesIO object at 0x7fb4933deac0> Failed to fetch sample 3582614. ExceptionTFailed to fetch sample 2957876. Exception: ject at 0x7fedb5ec3a60> cannot identify image file <_io.BytesIO object at 0x7f7e078cb6a0> le <_io.BytesIO object at 0x7fbc5b857f60> Failed to fetch sample 2970162. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2370b0> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1858383. Exception:ject at 0x7fb97778f420> Failed to fetch sample 3492865. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777c2980> ile <_io.BytesIO object at 0x7f7e078cab60> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3652924. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7af70> Failed to fetch sample 3510868. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0efc40> Failed to fetch sample 3086408. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d3315Failed to fetch sample 4236147. 
Exception: Failed to fetch sample 1451627. Exception: cannot identify image fcannot identify image file <_io.BytesIO object at 0x7f995aaf12b0> Failed to fetch sample 3712588. Exception: cannot identify image file <_io.BytesIO object at 0x7f10af1a9e40>Failed to fetch sample 2545584. Exception: caTOKENIZERS_PARALLELISM=(true | false) ion: cannot identify imageFailed to fetch sample 1524920. Exception:0> Failed to fetch sample 1079347. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f2de90> Failed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7f7c547649f0> cannot identify image file <_io.BytesIO object at 0x7fc10844ce00> Failed to fetch sample 2776666. Exception: cannot identify image fiFailed to fetch sample 3247337. Exception:Failed to fetch sample 3652924. Exception: cannot identify image fiFailed to fetch sample 2102696. Exception:ject at 0x7f3939c7a6b0> deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 2180416. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c555d00> Failed to fetch sample 2234033. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cc981a30> ither: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7fc107d09ad0> le <_io.BytesIO object at 0x7fca9bacd5d0> Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO object at 0x7f3297a88900> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. 
Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4080764. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e795f7d80> Failed to fetch sample 1110266. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560650d0> Failed to fetch sample 3106325. Exception: cannot identify image file <_io.BytesIO object at 0x7fa69c52db70> Failed to fetch sample 3082272. Exception: cannot identify image file <_io.BytesIO object at 0x7f3c56081170> Failed to fetch sample 2957876. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8bab10> Failed to fetch sample 2989692. Exception: cannot identify image file <_io.BytesIO object at 0x7f3661547330> s just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2102696. Exception: cannot identify image file <_io.BytesIO object at 0x7f5525ebe110> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7f11f14966b0> cannot identify image file <_io.BytesIO object at 0x7f3939c75260> le <_io.BytesIO object at 0x7f566a699490> Failed to fetch sample 2189796. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8bb920> Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO object at 0x7fbea9cdef20> Failed to fetch sample 3649257. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c7bce0> Failed to fetch sample 1440224. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e7a25f6a0> Failed to fetch sample 2189635. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc10844cf40> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1912464. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9ba8b650> 2). Running this sequence through the model will result in indexing errors lelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4102616. Exception: cannot identify image file <_io.BytesIO object at 0x7f66be6e92b0> cannot identify image file <_io.BytesIO objFailed to fetch sample 3571476. Exception:ple 4129471. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a56afd5d0> Failed to fetch sample 2555299. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f26a750> Failed to fetch sample 3013587. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7b2e0> Failed to fetch sample 1701865. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556066250> Failed to fetch sample 3642733. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fd93a0> cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7f32986bb880> cannot identify image file <_io.BytesIO object at 0x7fbe723befc0> Failed to fetch sample 1597570. Exception: cannot identify image file <_io.BytesIO object at 0x7f4e85354680> Failed to fetch sample 2943308. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568a7c90> Failed to fetch sample 2199896. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f3c56082ed0> cannot identify image file <_io.BytesIO object at 0x7f52978ad940> Failed to fetch sample 2250980. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7bd30> cannot identify image file <_io.BytesIO object at 0x7f4bee3a58a0> errors huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM cannot identify image file <_io.BytesIO object at 0x7fc188bf03b0> Failed to fetch sample 3053546. Exception: cannot identify im=(true | false) sample 2970666. Exception: cannoFailed to fetch sample 2680070. Exception: cannot identify image file <_io.BytesIO object at 0x7f4e83f974c0> got forked, after parallelism has already been used. DisabliFailed to fetch sample 1870141. Exception: cannot identify image file <_io.BytesIO object at 0x7fcae8ce2cf0> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7f53a8e9d6c0> Failed to fetch sample 2072865. Exception: Failed to fetch sample 2285458. Exception: cannot identify image fiFailed to fetch sample 2036348. Exception: cannot identify image file <_io.BytesIO object at 0x7f699846b880> Failed to fetch sample 2063116. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecbdc1df0> le <_io.BytesIO object at 0x7f5e6a8e18a0> Failed to fetch sample 1079347. Exception:cannot identify image file <_io.BytesIO object at 0x7f7cebf7f650> Failed to fetch sample 2250980. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a663cfb50> Failed to fetch sample 1746943. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39e9f80> cannot identify image file <_io.BytesIO object at 0x7f351eb94900> Failed to fetch sample 1885051. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc2562ab60> Failed to fetch sample 1986286. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb2018f0> ore the fork if possible - Explicitly set the environment variable cannot identify image file <_io.BytesIO object at 0x7fa275cb3c40> 8. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682db2e30> Failed to fetch sample 1207900. Exception: cannot identify image file <_io.BytesIO object at 0x7f4f0463f5b0> Failed to fetch sample 1992772. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf006ff420> cannot identify image file <_io.BytesIO ocannot identify image file <_io.BytesIO object at 0x7efcdbb3d2b0> : cannot identify image file <_io.BytesIO object at 0x7f351e001d00> 31352. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f5e6a8d3bf0> Failed to fetch sample 3013587. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b5b70> Failed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb3fa10> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 4236147. Exception:ed to fetch sample 2625656. Exception: cannot identify image file <_io.BytesIO object at 0x7fe588667f10> ion: cannot identify image file <_io.BytesIO object at 0x7f6682db26b0> Failed to fetch sample 1440224. Exception: cannot identify image file <_io.BytesIO object at 0x7f53b78a53a0> cannot identify image file <_io.BytesIO object at 0x7f7a568cee30> Failed to fetch sample 3602747. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6d145b8810> Failed to fetch sample 1864192. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf88aa70> Failed to fetch sample 2491527. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf854180> cannot identify image file <_io.BytesIO object at 0x7f5e6a8d1c60> Failed to fetch sample 1578193. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b5620> Failed to fetch sample 1541588. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf789d00> Failed to fetch sample 3916323. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8d1120> Failed to fetch sample 2957876. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc57398270> cannot identify image file <_io.BytesIO object at 0x7fe44409a3e0> Failed to fetch sample 2102696. Exception: cannot identify image file <_io.BytesIO object at 0x7fbc5b856430> Failed to fetch sample 2036348. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8e9e700> cannot identify image file <_io.BytesIO object at 0x7f7a568a7650> Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b4260810> Failed to fetch sample 1024490. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4440c5170> Failed to fetch sample 1024490. Exception:Failed to fetch sample 1976252. Exception: cannot identify image fFailed to fetch sample 2639168. Exception: Failed to fetch sample 3666868. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b0889760> Failed to fetch sample 2438256. Exception: cannot identify image file <_io.BytesIO object at 0x7fe1bd8487c0> cannot identify image file <_io.BytesIO object at 0x7fbcdc0f62a0> Failed to fetch sample 2438256. Exception:bject at 0x7fdaadce92b0> le <_io.BytesIO object at 0x7f70b4260400> Failed to fetch sample 1102280. 
Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 1738536. Exception: ject at 0x7f223151ea70> Failed to fetch sample 2428943. Exception: cannot identify image fiFailed to fetch sample 1875546. Exception:Failed to fetch sample 2189796. Exception:bject at 0x7fbc5b857010> Failed to fetch sample 3013388. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9cb2b4900> cannot identify image file <_io.BytesIO object at 0x7f22e39bb740> le <_io.BytesIO object at 0x7f6e2b28cf90> Failed to fetch sample 2526328. Exception: cannot identify image file <_io.BytesIO object at 0x7f563bb95350> Failed to fetch sample 1746943. Exception: cannot identify image file <_io.BytesIO object at 0x7fb97778f560> cannot identify image file <_io.BytesIO object at 0x7fdd445df1f0> le <_io.BytesIO object at 0x7fbc5c248a90> cannot identify image file <_io.BytesIO object at 0x7fc99b8413a0> ile <_io.BytesIO object at 0x7fbcdc299e40> Failed to fetch sample 3435080. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134d858f0> Failed to fetch sample 2989692. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdd667470> Failed to fetch sample 3202744. Exception: cannot identify image file <_io.BytesIO object at 0x7f351e2a0090> Failed to fetch sample 1747603. Exception: cannot identify image file <_io.BytesIO object at 0x7f587f226890> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7fbc5b857010> cannot identify image file <_io.BytesIO object at 0x7fcaae70c400> Failed to fetch sample 1265302. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e5df0> TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3705515. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f2a3fbf60> Failed to fetch sample 3414742. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f255350> cannot identify image file <_io.BytesIO object at 0x7f69190c07c0> le <_io.BytesIO object at 0x7fccdf75b880> Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7f563bb95a80> Failed to fetch sample 2382575. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6496c50> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fdbd30> Failed to fetch sample 3445146. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f256f70> Failed to fetch sample 3916323. Exception: ject at 0x7f69186e50d0> Failed to fetch sample 1451627. Exception:Failed to fetch sample 1719573. Exception:bject at 0x7f18e6495d00> ailed to fetch sample 3358845. Exception: cannot identify image file <_io.BytesIO object at 0x7f563c52de90> Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9d5cbfdd0> Failed to fetch sample 3404915. Exception: Failed to fetch sample 1007919. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d15995620> cannot identify image file <_iocannot identify image file <_io.BytesIO object at 0x7f69186e5670> Exception: cannot identify image file <_io.BytesIO object at 0x7fcd9a26c270> Failed to fetch sample 1961364. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc54a82f20>Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7fba3256a660> Failed to fetch sample 1243147. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b937560> Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors Failed to fetch sample 1986286. Exception: Failed to fetch sample 2250980. 
Exception:ject at 0x7f351e1e9fd0> cannot identify image file <_io.BytesIO object at 0x7fb9777b2200> Failed to fetch sample 4022580. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39c7100> Failed to fetch sample 1007919. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc54a82f70> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b995ad0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. DisabliFailed to fetch sample 1047785. Exception:sable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7fb9777e6390> Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf79e340> Failed to fetch sample 2563789. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39c7830> Failed to fetch sample 3345102. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39e0d60> cannot identify image file <_io.BytesIO object at 0x7f69186e4c70> Failed to fetch sample 2850892. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82bb420> Failed to fetch sample 3101535. Exception: cannot identify image file <_io.BytesIO object at 0x7f3297a88900> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ion: cannot identify image file <_io.BytesIO object at 0x7f6556067650> cannot identify image file <_io.BytesIO object at 0x7fdd459ca7a0> Failed to fetch sample 2319946. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f92e610> Failed to fetch sample 2237717. Exception: cannot identify image fiFailed to fetch sample 2579003. 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors
Failed to fetch sample 1738368. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb50d0>
Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a5e97c40>
Failed to fetch sample 3034064. Exception: cannot identify image file <_io.BytesIO object at 0x7f39486c0ea0>
Failed to fetch sample 2989850. Exception: cannot identify image file <_io.BytesIO object at 0x7f3c56082930>
Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80de0c0>
Failed to fetch sample 2925428. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82bb790>
Failed to fetch sample 1868903. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a5e972e0>
Failed to fetch sample 2222432. Exception: cannot identify image file <_io.BytesIO object at 0x7f6191ca4d60>
Failed to fetch sample 1864192. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3812110>
Failed to fetch sample 3287476. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568c8090>
Failed to fetch sample 2501360. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d17c90>
Failed to fetch sample 3176736. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd690ebc90>
Failed to fetch sample 1514677. Exception: cannot identify image file <_io.BytesIO object at 0x7fd08940d6c0>
Failed to fetch sample 2148940. Exception: cannot identify image file <_io.BytesIO object at 0x7f8150b71a30>
Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f777e0> Failed to fetch sample 4087536. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdfb766b0> Failed to fetch sample 3490122. Exception: cannot identify image file <_io.BytesIO object at 0x7fcea3b01490> Failed to fetch sample 3558630. Exception: cannot identify image file <_io.BytesIO object at 0x7ff6973ebbf0> Failed to fetch sample 3268929. Exception: cannot identify image file <_io.BytesIO object at 0x7fb8edd67650> Failed to fetch sample 3445146. Exception: cannot identify image file <_io.BytesIO object at 0x7fcd98de6110> Failed to fetch sample 1047785. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b937fb0> Failed to fetch sample 2970666. Exception:Failed to fetch sample 3682863. Exception: ject at 0x7f3bcea1c9f0> Failed to fetch sample 4080764. Exception: cannot identify image file <_io.BytesIO object at 0x7ff40701e750> Failed to fetch sample 3337057. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc54a82480> Failed to fetch sample 1547387. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc54a82340> Failed to fetch sample 2514825. Exception: cannot identify image fiFailed to fetch sample 1007919. Exception:cFailed to fetch sample 4102616. Exception: cannot identify image file <_io.BytesIO object at 0x7ff411288720> annot identify image file <_io.BytesIO object at 0x7fc32576e200> Failed to fetch sample 1746943. Exception: cannot identify image fiFailed to fetch sample 1886580. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c966b0> cannot identify image file <_io.BytesIO object at 0x7f6a7f0a74c0> Failed to fetch sample 1579559. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d97ae85e0> TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1620386. Exception: cannot identify image file <_io.BytesIO object at 0x7f633ef410d0> Failed to fetch sample 4014570. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078b3420> cannot identify image file <_io.BytesIO object at 0x7efcdbb3ea20> FFailed to fetch sample 1759192. Exception:cannot identify image file <_io.BytesIO object at 0x7efcd7b176a0> Failed to fetch sample 2992070. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd431ef510> Failed to fetch sample 3026436. Exception: cannot identify image file <_io.BytesIO object at 0x7f3bcea39fd0>Failed to fetch sample 1680945. Exception:sFailed to fetch sample 2528140. Exception:bject at 0x7f6d145b8590> g parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 1659475. Exception:bject at 0x7fb8edd98e50> Failed to fetch sample 2036348. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f75e40> Failed to fetch sample 1588738. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b6f20> cannot identify image file <_io.BytesIO object at 0x7fee3f95a980> le <_io.BytesIO object at 0x7f6682dbb060> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190bfa60> cannot identify image file <_io.BytesIO object at 0x7f56682f7e70> le <_io.BytesIO object at 0x7f10937607c0> Failed to fetch sample 1694360. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d729f0ea0> Failed to fetch sample 1480960. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682f7e20> ore the fork if possible - Explicitly set the environment variable Failed to fetch sample 3431220. Exception: cannot identify image file <_io.BytesIO object at 0x7feefb10ea20> Failed to fetch sample 3747017. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f959d50> Failed to fetch sample 1449840. Exception:ject at 0x7f5e6a8b6340> Failed to fetch sample 3423432. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb8edd66e80> Failed to fetch sample 1042732. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f76c00> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2897586. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f57830> Failed to fetch sample 1406222. Exception: cannot identify image fiFailed to fetch sample 1701865. Exception: annot identify image file <_io.BytesIO object at 0x7fbf0f9a2390> Failed to fetch sample 2452845. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f771a0> Failed to fetch sample 1544007. Exception: cannot identify image file <_io.cannot identify image file <_io.By cannot identify image file <_io.BytesIO object at 0x7efcdbb6b6f0> Failed to fetch sample 3838116. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd431efdd0> Failed to fetch sample 2686021. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e2750> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2687581. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6229d0> Failed to fetch sample 1099448. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9cc626750> Failed to fetch sample 1591233. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 2210827. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c555d00>oid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 2850892. Exception:Failed to fetch sample 1269875. Exception: cannot identify image file <_io.BytesIO object at cannot identify image file <_io.BytesIO object at 0x7f9abcd93ab0> Failed to fetch sample 4104608. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beeba7b50> identify image file <_io.BytesIO object at 0x7efcdbb6b880> Failed to fetch sample 4236147. Exception: cannot identify image fiFailed to fetch sample 1747603. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071aa70> an either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIFailed to fetch sample 17195 cannot identify image file <_io.BytesIO object at 0x7f9abcdc1e90> 7f5e7a25f420> Failed to fetch sample 3916323. Exception:Failed to fetch sample 2970162. Exception: cannot identify image fi cannot identify image file <_io.BytesIO Failed to fetch sample 264 cannot identify image file <_io.BytesIO Failed to fetch sample 391 cannot identifyFailed to fetch sample 1731981. Exception: d937e0> le <_io.BytesIO object at 0x7f5253a0b830> Failed to fetch sample 2215099. Exception: cannot identify image file <_io.BytesIO object at 0x7f936b3549a0> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ion: cannot identify image file <_io.BytesIO object at 0x7f6682db0b80> Failed to fetch sample 3353351. Exception:bject at 0x7f5e6a8da340> le <_io.BytesIO object at 0x7f82a5e97c40> Failed to fetch sample 3414742. 
Exception:bject at 0x7f5253bba430> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3371113. Exception: ject at 0x7f94ac26d670> le <_io.BytesIO object at 0x7f7b13358fe0> cannot identify image file <_io.BytesIO object at 0x7f6682db2ac0> Failed to fetch sample 3225671. Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 3445146. Exception:ject at 0x7f5e6a90f1f0> Failed to fetch sample 2647526. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d33a60> Failed to fetch sample 3445146. Exception: cannot identify image file <_io.BytesIO object at 0x7f5742102610> Failed to fetch sample 1961415. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f67dc60> cannot identify image file <_io.BytesIO object at 0x7f34f38d5ad0> Failed to fetch sample 4073421. Exception: cannot identify image fiFailed to fetch sample 1769778. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b93b3d0> Failed to fetch sample 2049762. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e831bd80> Failed to fetch sample 3035065. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d14fb32e0> cannot identify image file <_io.BytesIO object at 0x7fb52f67ce00> cannot identify image file <_io.BytesIO object at 0x7fbfd980a160> Failed to fetch sample 2964818. Exception: Failed to fetch sample 2957876. Exception:bject at 0x7f6682792610>Failed to fetch sample 2033563. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420ed8f0> Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a7226e80> Failed to fetch sample 2148690. 
Exception: cannot identify image fiFailed to fetch sample 3322388. Exception:Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8316930> cannot identify image file <_io.BytesIO object at 0x7f0e9cd5b330> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190a6430> Failed to fetch sample 2749330. Exception:bject at 0x7f94ac26d710> ailed to fetch sample 3322688. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401f9c10> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7fee3f97bbf0> e <_io.BytesIO object at 0x7fb3766061b0> cannot identify image file <_io.BytesIO object at 0x7f6496057600> le <_io.BytesIO object at 0x7fb5fee10540> cannot identify image file <_io.BytesIO object at 0x7f5a16daa480> Failed to fetch sample 3547912. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de370f830> Failed to fetch sample 1265302. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f97a340> Failed to fetch sample 3601411. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459ca700> Failed to fetch sample 1933331. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa2d9e0> Failed to fetch sample 1248723. Exception: annot identify image file <_io.BytesIO object at 0x7f53bc93b970> cannot identify image file <_io.BytesIO object at 0x7f64960566b0> le <_io.BytesIO object at 0x7fb5fee104f0> Failed to fetch sample 1719573. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f97a7f0> cannot identify image file <_io.BytesIO object at 0x7f5d2ceaa9d0> ailed to fetch sample 3747017. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9074c0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 2734186. Exception: cannot identify image file <_io.BytesIO object at 0x7f4e83f97330> Failed to fetch sample 3162454. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf057a5c10> Failed to fetch sample 4169112. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a90523e0>: cannot identify image file <_io.BytesIO object at 0x7fb977797fb0> Failed to fetch sample 2058977. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37363e0> Failed to fetch sample 1472700. Exception: cannot identify image file <_io.BytesIO object at 0x7f788d0ec6d0> Failed to fetch sample 1885051. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777af380> Failed to fetch sample 1865048. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777df7e0> cannot identify image file <_io.BytesIO object at 0x7f18e649b100> le <_io.BytesIO object at 0x7f94ac23bb50> cannot identify image file <_io.BytesIO object at 0x7f6496028f40> cannot identify image file <_io.BytesIO object at 0x7f5caf576f70> le <_io.BytesIO object at 0x7f53a9078860> Failed to fetch sample 2543862. Exception:ject at 0x7fee3f97aac0> parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISM=(true | false) =(true | false) cannot identify image file <_io.BytesIO object at 0x7f6496083a10> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To di cannot identify image file <_io.BytesIO object at 0x7f583a3a6cf0> le <_io.BytesIO object at 0x7fdd431efdd0> the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2964818. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a23a47d30> cannot identify image file <_io.BytesIO object at 0x7f53a90535b0> ile <_io.BytesIO object at 0x7feb2f639cb0> cannot identify image file <_io.BytesIO object at 0x7f801f5efe20> Failed to fetch sample 2149877. Exception: cannot identify image file <_io.BytesIO object at 0x7f6496029030> Failed to fetch sample 3492670. Exception: cannot identify image file <_io.BytesIO object at 0x7f6667c54c70> Failed to fetch sample 3237343. Exception: cannot identify image file <_io.BytesIO object at 0x7f64189bbec0> Failed to fetch sample 2250103. Exception: cannot identify image file <_io.BytesIO object at 0x7ff411288720> Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a907a070> Failed to fetch sample 1589461. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e3970> cannot identify image file <_io.BytesIO object at 0x7fbf006ee2f0> Failed to fetch sample 1945810. Exception:Failed to fetch sample 3492865. Exception:bject at 0x7f80201b39c0> le <_io.BytesIO object at 0x7fe1be4a3f60>Failed to fetch sample 1988418. 
Exception:b cannot identify image file <_io.BytesIO object at 0x7f7d7ddbc900> le <_io.BytesIO object at 0x7f6556067ab0> Failed to fetch sample 1cannot identify image file <_io.BytesIO object at 0x7fb4a6eed3f0> at 0x7f6417e0cb80> cannot identify image file <_io.BytesIO object at 0x7ff401f8a2f0> Failed to fetch sample 1313487. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39ba7a0> le <_io.BytesIO object at 0x7f80a82b7d80> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1477290. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e2750> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` befFailed to fetch sample 3863020. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf759e4Fai cannot identify image file <_io.BytesIO object at 0x7fb52f767150> le <_io.BytesIO object at 0x7f9d50597290> led to fetch sample 2965079. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dbc720> Failed to fetch sample 3106325. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82b7510> Failed to fetch sample 1265302. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a1fe70> Failed to fetch sample 1963191. Exception:Failed to fetch sample 1894264. Exception:bject at 0x7f9d4fbbf240> Failed to fetch sample 3176736. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf759a30> Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f8be20> Failed to fetch sample 3051002. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b75b0> Failed to fetch sample 2647526. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f6a7f236d40> 38408. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71f8d0> Failed to fetch sample 1695344. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2621a8450> Failed to fetch sample 2943308. Exception:ject at 0x7f6556064400> Failed to fetch sample 987378. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca5da0> Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7f64189a22f0> Failed to fetch sample 2991716. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7ade0> cannot identify image file <_io.BytesIO oFailed to fetch sample 3747017. Exception:an the specified maximum sequence length for this model (8747 > 8192). Running this sequence cannot identify image file <_io.BytesIO object at cannot identify image file <_io.BytesIO object at 0x7f6a7f25a2a0> lelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) IO object at 0x7fc2621a87c0> Failed to fetch sample 1961364. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d5d193f60> cannot identify image file <_io.BytesIO object at 0x7f1093732f20> ile <_io.BytesIO object at 0x7f7a568c72e0> Failed to fetch sample 1701865. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd43035440> Failed to fetch sample 2036348. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f60d120> Failed to fetch sample 4026334. Exception: Failed to fetch sample 3316837. 
Exception:cannot identify image file <_io.BytesIO object at 0x7fdaade85030> cannot identify image file <_io.BytesIO object at 0x7f59d894ae30> cannot identify image file <_io.BytesIO object at 0x7f1009401c60> Failed to fetch sample 2198944. Exception: cannot identify image file <_io.BytesIO object at 0x7f11cfd93bf0> le <_io.BytesIO object at 0x7f7a568c7790> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3421141. Exception: cannot identify image file <_io.BytesIO object at 0x7fb8edd67330> Failed to fetch sample 4270540. Exception: cannot identify image fiFailed to fetch sample 3725471. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568c78d0> Failed to fetch sample 1102280. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82b7c40> Failed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcc45e0> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f32b60> Failed to fetch sample 2584972. Exception: cannot identify image fiFailed to fetch sample 3225671. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cdeca0> le <_io.BytesIO object at 0x7fdaadcc5170> Failed to fetch sample 2526328. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f60ebb0> cannot identify image file <_io.BytesIO object at 0x7f7af4a2ffb0> Failed to fetch sample 1047785. Exception: cannot identify image file <_io.BytesIO object at 0x7f0db9bc59e0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: esIO object at 0x7f7e078aff60> - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) _io.BytesIO object at 0x7fccdf75b010> Failed to fetch sample 1747603. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f60f1a0> Failed to fetch sample 1737319. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf788e00> Failed to fetch sample 4080764. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568d7ab0> Failed to fetch sample 4023897. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaade87290> Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f60ed90> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1695363. Exception:cannot identify image file <_io.BytesIO object at 0x7f1009413060> FFailed to fetch sample 3150542. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf78a1b0> Failed to fetch sample 2988390. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39e2cf0> Failed to fetch sample 1304332. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf76d3a0> Failed to fetch sample 2647526. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf78a5c0> Failed to fetch sample 3414742. Exception: cannot identify image file <_io.BytesIO object at 0x7f563bb95120> cannot identify image file <_io.BytesIO object at 0x7fbf0067e980> Failed to fetch sample 2127419. Exception:bject at 0x7f655602d710> Failed to fetch sample 3106325. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d7ddb86d0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Failed to fetch sample 2755940. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071fb00> Failed to fetch sample 1440224. Exception: cannot identify image file <_io.BytesIO object at 0x7f114cdc2ac0> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ion:bject at 0x7f53a8e9f790> Failed to fetch sample 2943308. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078af880> Failed to fetch sample 1564951. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682db9d50> Failed to fetch sample 1961415. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071fc9 caFailed to fetch sample 4271244. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f2579c0> Failed to fetch sample 2138088. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64bba60> nnot identify image file <_io.BytesIO object at 0x7f3939c9cf90> Failed to fetch sample 1961364. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078afba0> Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071d8a0> Failed to fetch sample 2606964. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dbb5b0> TOKENIZERS_PARALLELISM=(true | false) ion:s just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3855812. 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
- Avoid using `tokenizers` before the fork if possible
- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors
Failed to fetch sample 3176035. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf79e9d0>
Failed to fetch sample 1056413. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093737420>
Failed to fetch sample 3725471. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078dd260>
[the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error repeats, interleaved across worker processes, for many more sample IDs, including 1049083, 1122238, 3062074, 2353389, 2647526, 2970162, 1963191, 3855812, 2198101, 3433571, 3176736, 1099589, 1885051, 1588738, 2555299, 2446023, 3445146, 3652924, 3644627, 2606964, 4073421, 3907787, 3225671, 2528140, 2887528, 3414742, ...]
Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7b5080> able this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2966217. Exception:Failed to fetch sample 4049116. Exception:bject at 0x7f1137f127f0> le <_io.BytesIO object at 0x7f5e6a8d38d0> Failed to fetch sample 3106325. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b93a980> Failed to fetch sample 1864192. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278af1f0> Failed to fetch sample 1747603. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fd98f0> cannot identify image file <_io.BytesIO object at 0x7f5253bda2a0> Failed to fetch sample 2684710. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9236f0> Failed to fetch sample 2491527. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f95b420> cannot identify image file <_io.BytesIO object at 0x7f4e83fc4950> le <_io.BytesIO object at 0x7f7a5695a4d0> Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c556390> Failed to fetch sample 2528140. Exception: cannot identify image file <_io.BytesIO object at 0x7f1dcabd63e0> Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebf7d80> 2). Running this sequence through the model will result in indexing errors Failed to fetch sample 3146699. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25629580> Failed to fetch sample 2647526. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f7510> Failed to fetch sample 3133979. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c36f0> Failed to fetch sample 1996155. Exception: ect at 0x7f7b9b93ac00> ile <_io.BytesIO object at 0x7f7e31298680> cannot identify image file <_io.BytesIO object at 0x7f21b7fba2a0> Failed to fetch sample 2610154. Exception: Failed to fetch sample 3527020. Exception: cannot identify image fFailed to fetch sample 2887528. Exception: Failed to fetch sample 1672022. Exception: cannot identify image fFailed to fetch sample 2606964. Exception: Failed to fetch sample 3210209. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2581d71a0> Failed to fetch sample 1588738. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f0f8d0> cannot identify image file <_io.BytesIO object at 0x7f7a56959710> Failed to fetch sample 2392280. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64e1d50> Failed to fetch sample 4151771. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2d9059f80> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64bbab0> Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec950e00> Failed to fetch sample 2000787. Exception: cannot identify image fiFailed to fetch sample 1701865. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093763ec0> parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4145373. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64b9d50> Failed to fetch sample 3907787. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b96a390> cannot identify image file <_io.BytesIO object at 0x7f5a2675bc90> Failed to fetch sample 2599185. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db55f380> cannot identify image file <_io.BytesIO object at 0x7ff4daba1fd0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly setFailed to fetch sample 2528140. Exception:LELISM=(true | false) Failed to fetch sample 1158769. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca78d0> cannot identify image file <_io.BytesIO object at 0x7f277545f600> Failed to fetch sample 2591241. Exception: cannot identify image file <_io.BytesIO object at 0x7f277546f9c0> Failed to fetch sample 2563789. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f94a520> cannot identify image file <_io.BytesIO object at 0x7f7b9b9690d0> Failed to fetch sample 3133979. Exception: cannot identify image file <_io.BytesIO object at 0x7fcd98d97060> Failed to fetch sample 3067785. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b968090> Failed to fetch sample 3273016. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80fa5c0> cannot identify image file <_io.BytesIO object at 0x7f0d822e17b0> Failed to fetch sample 3524880. Exception: cannot identify image file <_io.BytesIO object at 0x7f69186e4cc0> Failed to fetch sample 2129244. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078aa750> cannot identify image file <_io.BytesIO object at 0x7f3939ca2a20> le <_io.BytesIO object at 0x7f18e64e32e0> Failed to fetch sample 4193225. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f285ee0> Failed to fetch sample 3425226. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdd43036a70> Failed to fetch sample 2598260. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb68758f0> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f266570> cannot identify image file <_io.BytesIO object at 0x7fb8edd98680> Failed to fetch sample 1045305. Exception: Failed to fetch sample 1158188. Exception:ject at 0x7f7b1335a6b0> Failed to fetch sample 3697959. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b6cf0> Failed to fetch sample 2686021. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf75a1b0> Failed to fetch sample 3091195. Exception: cannot identify image fiFailed to fetch sample 2884186. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f287010> le <_io.BytesIO object at 0x7fbf0071a980> Failed to fetch sample 1885051. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93af20> s just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7fc0f0931990> Failed to fetch sample 1988418. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb6875940> ile <_io.BytesIO object at 0x7f5e6a96d300> Failed to fetch sample 1854583. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7b28680> Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190c35b0> Failed to fetch sample 3414742. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071aca0> Failed to fetch sample 3918973. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682deecf0> Failed to fetch sample 3431352. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93a570> Failed to fetch sample 2998072. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a96ede0> Failed to fetch sample 3700472. Exception:ject at 0x7fe5886f1760> ile <_io.BytesIO object at 0x7fb9777e22a0> Failed to fetch sample 2127419. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf75b240> Failed to fetch sample 3202812. Exception: cannot identify image file <_io.BytesIO object at 0x7ff40701d0d0> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b6070> le <_io.BytesIO object at 0x7f69190c1620> cannot identify image file <_io.BytesIO object at 0x7fbf007192b0> Failed to fetch sample 4243929. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682ded8f0> Failed to fetch sample 1864192. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93a2a0> Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078cb240> Failed to fetch sample 2056788. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0770afc0> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190c20c0> Failed to fetch sample 1007919. Exception: cannot identify imageFailed to fetch sample 2154266. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f88860> Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a96e4d0> Failed to fetch sample 2769327. Exception: cannot identify image fiFailed to fetch sample 3679185. Exception:Failed to fetch sample 1102280. Exception:ample 2686021. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcd96570> Failed to fetch sample 2341971. Exception: cannot identify image file <_io.BytesIO object at 0x7ff6991d2c00> Failed to fetch sample 3477329. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7231a0> Failed to fetch sample 1304332. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a96eca0> Failed to fetch sample 1591233. Exception: ject at 0x7fb5ff9d8fe0> Failed to fetch sample 2957876. Exception: cannot identify image file <_io.BytesIO object at 0x7f52a1abe0c0> cannot identify image file <_io.BytesIO object at 0x7f82a68618a0> Failed to fetch sample 3150542. Exception:ject at 0x7ff401f56b60> Failed to fetch sample 3202635. Exception: Failed to fetch sample 1070036. Exception:ject at 0x7f22e39d0310> Failed to fetch sample 3150542. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb721350> Failed to fetch sample 2127419. Exception:bject at 0x7f81db55fec0> Failed to fetch sample 2164107. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecdba58f0> Failed to fetch sample 4238312. Exception:bject at 0x7f69190c2840> Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0069bce0> Failed to fetch sample 2647526. Exception: cannot identify image fiFailed to fetch sample 1138284. Exception:Failed to fetch sample 1597434. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6179c0> cannot identify image file <_io.BytesIO object at 0x7f34f2d12d90> Failed to fetch sample 2555299. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb2013f0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db55f470> Failed to fetch sample 970385. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9708b0> Failed to fetch sample 2570139. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3711760> Failed to fetch sample 1383842. Exception: cannot identify image fiFailed to fetch sample 1701865. Exception:Failed to fetch sample 2269064. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f8ac50> Failed to fetch sample 2957876. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f617420> Failed to fetch sample 2555299. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd430349a0> Failed to fetch sample 4155387. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f948270> Failed to fetch sample 3013587. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3746cf0> Failed to fetch sample 2980823. Exception: cannot identify image fiFailed to fetch sample 1135356. Exception:Failed to fetch sample 3916323. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db55cfe0> bject at 0x7f4bfd74ddf0> Failed to fetch sample 3505555. Exception: cannot identify image file <_io.BytesIO object at 0x7f56cd3f24d0> Failed to fetch sample 1528270. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937655d0> Failed to fetch sample 1832420. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682db3420> Failed to fetch sample 2884186. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b926c50> Failed to fetch sample 4188604. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd43034fe0> Failed to fetch sample 1578193. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3746250> Failed to fetch sample 1099589. Exception: cannot identify image file <_io.BytesIO object at 0x7f4e853271a0> Failed to fetch sample 2887528. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093767a60> Failed to fetch sample 3176035. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6682db32e0> Failed to fetch sample 2024032. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754867f0> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777b67a0> Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7f225b9d31fFai cannot identify image file <_io.BytesIO object at 0x7f0ecc7dd350> Failed to fetch sample 2889550. Exception: cannot identify image file <_io.BytesIO object at 0x7f8294b86ed0> le <_io.BytesIO object at 0x7efcdbb1fc40> Failed to fetch sample 3397401. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0ffa8e020> Failed to fetch sample 3089553. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35cd7330> Failed to fetch sample 3210209. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5677e0> Failed to fetch sample 3652924. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9cc626750> Failed to fetch sample 1583288. Exception: ject at 0x7f6556024720> Failed to fetch sample 2550194. Exception: cannot identify image file <_io.BytesIO object at 0x7f6191ca4d60> Failed to fetch sample 1885521. Exception:cannot identify image file <_io.BytesIO object at 0x7f22e39c7f10> Failed to fetch sample 2920120. Exception: cannot identify image file <_io.BytesIO object at 0x7fbfd980a160> Failed to fetch sample 3133474. Exception: cannot identify image file <_io.BytesIO object at 0x7f67acce3ba0> Failed to fetch sample 3106325. Exception: cannot identify image file <_io.BytesIO object at 0x7efc516dc7c0> Failed to fetch sample 1207900. Exception: cannot identify image fi cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7f22e39c7290> Failed to fetch sample 3246712. Exception: cannot identify image fiFailed to fetch sample 3855812. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efc516dd620> Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f1e570> Failed to fetch sample 2224807. Exception: cannot identify image file <_io.BytesIO object at 0x7f801f5efe20> Failed to fetch sample 2884186. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556076890> Failed to fetch sample 3665508. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f60d120> Failed to fetch sample 1961364. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7f0f4ccd7f60> Failed to fetch sample 1974249. Exception: cannot identify image file <_io.BytesIOFailed to fetch sample 2446023. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8ef470> sIO object at 0x7f1b7c572d40> Failed to fetch sample 2816660. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f1e660> Failed to fetch sample 2526328. Exception: cannot identify image fiFailed to fetch sample 1099589. Exception:Failed to fetch sample 3492865. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a918e94e0> Failed to fetch sample 3414742. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f60f1a0> Failed to fetch sample 2250980. Exception: cannot identify image fiFailed to fetch sample 2654468. Exception:Failed to fetch sample 2241322. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071db70> Failed to fetch sample 2591241. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7f18e5d14270> Failed to fetch sample 3025472. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568cd760> ile <_io.BytesIO object at 0x7f5741f1e890> Failed to fetch sample 1451627. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a91723ce0> Failed to fetch sample 3725471. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efc516dccc0> cannot identify image file <_io.BytesIO object at 0x7f9a9c2728e0> le <_io.BytesIO object at 0x7f99a3922700> Failed to fetch sample 2993528. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0074b1f0> cannot identify image file <_io.BytesIO object at 0x7fa275c7ed40> TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 267148Failed to fetch sample 1102280. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d89940> le <_io.BytesIO object at 0x7f59d73f1df0> Failed to fetch sample 3907787. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb5e660> Failed to fetch sample 1007919. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f60d710> Failed to fetch sample 1350772. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a56afd5d0> Failed to fetch sample 3916323. Exception: cannot identify image file <_io.BytesIO object at 0x7fc384737060> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1465471. Exception: cannot identify image file <_io.BytesIO object at 0x7f4f442009f0> Failed to fetch sample 2526328. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d89990> Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f60d440> Failed to fetch sample 3550444. Exception: cannot identify image file <_io.BytesIO object at 0x7f0e2840d0d0> cannot identify image file <_io.BytesIO object at 0x7fa275c94090> Failed to fetch sample 2036348. Exception:Failed to fetch sample 3523584. Exception:bject at 0x7fd200374c70> cannot identify image file <_io.BytesIO object at 0x7fdcaf83fab0> Failed to fetch sample 2606964. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568c7f10> Failed to fetch sample 2943308. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8eee80> Failed to fetch sample 4141634. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f34f2d1b790> cannot identify image file <_io.BytesIO object at 0x7fa275ca8540> through the model will result in indexing errors Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200377f60> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e2c50> Failed to fetch sample 3929233. Exception: cannot identify image file <_io.BytesIO object at 0x7fb8edd66ed0> Failed to fetch sample 1961364. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8eefc0> Failed to fetch sample 3176736. Exception: cannot identify image file <_io.BytesIO object at 0x7f34f2d1b7e0> cannot identify image file <_io.BytesIO object at 0x7fdaadcebd30> Failed to fetch sample 2780470. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadeb2c50> Failed to fetch sample 2686021. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d2788fe20> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8bab10> Failed to fetch sample 3091195. Exception:bject at 0x7f7a5696e3e0> Failed to fetch sample 3401019. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fc9e90> Failed to fetch sample 2170770. Exception:Failed to fetch sample 1099589. Exception: cannot identify image fihuggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly setFailed to fetch sample 3225671. Exception: cannot identify image fiFailed to fetch sample 3150542. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadeb2e30> Failed to fetch sample 3725471. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8eec00> Failed to fetch sample 2127419. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278ae200> Failed to fetch sample 2042834. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937572e0> Failed to fetch sample 1988418. Exc cannot identify image file <_io.BytesIO object at 0x7fd200374770> dfbf0> Failed to fetch sample 3246712. Exception: cannot identify image file <_io.BytesIO object at 0x7f4fc92c1b20> Failed to fetch sample 3225671. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682df47c0> Failed to fetch sample 3492865. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381be20> Failed to fetch sample 4073421. Exception: cannot identify image file <_io.BytesIO object at 0x7f109378f9c0> Failed to fetch sample 3907787. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71df80> Failed to fetch sample 1440224. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0912660> Failed to fetch sample 3371113. Exception:bject at 0x7fdc256298f0> Failed to fetch sample 2846277. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf83ef70> le <_io.BytesIO object at 0x7fdaadeb1da0> le <_io.BytesIO object at 0x7f6682df7f10> Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd43034810> Failed to fetch sample 4155387. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381dfd0> cannot identify image file <_io.BytesIO object at 0x7fa275ca8360> Failed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7f6417e0b510> Failed to fetch sample 3233238. Exception: cannot identify image file <_io.BytesIO object at 0x7f83ea6fa3e0> Failed to fetch sample 2036348. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f09129d0> Failed to fetch sample 3705506. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f7ef240> Failed to fetch sample 1099448. Exception:Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39ce700> Failed to fetch sample 3649257. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd43034400> cannot identify image file <_io.BytesIO object at 0x7fdcaf8f3f60> Failed to fetch sample 1346862. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf857ab0> le <_io.BytesIO object at 0x7f6682df4e00> Failed to fetch sample 3652924. Exception: cannot identify image file <_io.BytesIO object at 0x7f4cddb147c0> Failed to fetch sample 1430100. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381be70> Failed to fetch sample 1062998. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39f1f80> cannot identify image file <_io.BytesIO object at 0x7f1093733830> Failed to fetch sample 1258063. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb5ec1bc0> Failed to fetch sample 2555299. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f623e20> Failed to fetch sample 3176736. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1cf6a3790> Failed to fetch sample 2198101. Exception: cannot identify image fiFailed to fetch sample 3717356. Exception:Failed to fetch sample 2641615. Exception: cannot identify image fiFailed to fetch sample 2708935. Exception:Failed to fetch sample 1701865. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6201d0> Failed to fetch sample 3593262. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb36c50> cannot identify image file <_io.BytesIO object at 0x7f4beebaf740> Failed to fetch sample 2765571. 
Failed to fetch sample 3106325. Exception: cannot identify image file <_io.BytesIO object at 0x7f4bfd74cc70>
Failed to fetch sample 2199896. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39f3060>
Failed to fetch sample 2943308. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebd9ee0>
Failed to fetch sample 2189796. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b957150>
[... the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error repeats, interleaved across dataloader workers, for many more sample IDs ...]
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors
Token indices sequence length is longer than the specified maximum sequence length for this model (11653 > 8192). Running this sequence through the model will result in indexing errors
Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f749a0> Failed to fetch sample 2767646. Exception: Failed to fetch sample 1477290. Exception: cannot identify image fFailed to fetch sample 3417785. Exception: Failed to fetch sample 1440224. Exception: cannot identify image file <_io.BytesIO object at 0x7fd08a213c90> cannot identify image file <_io.BytesIO object at 0x7f5a16f80c20> Failed to fetch sample 2250980. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e2e80> cannot identify image file <_io.BytesIO object at 0x7f6682def010> Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078d9da0> le <_io.BytesIO object at 0x7f53a9080950> Failed to fetch sample 3725471. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e23e0> Failed to fetch sample 1747603. Exception: cannot identify image file <_io.BytesIO object at 0x7f995aaf0fe0> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7fc24c1356c0> Failed to fetch sample 4117657. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e6b10> Failed to fetch sample 1431292. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777ae660> cannot identify image file <_io.BytesIO object at 0x7f5e6a8f7b50> le <_io.BytesIO object at 0x7fdd43034950> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3736480> Failed to fetch sample 1476796. Exception: cannot identify image file <_io.BytesIO object at 0x7fb21f2ce160> Failed to fetch sample 1465471. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a8112c00> cannot identify image file <_io.BytesIO object at 0x7fc108455e40> Failed to fetch sample 2289863. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3912c00> Failed to fetch sample 2454376. Exception: Failed to fetch sample 2828655. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e62a0> Failed to fetch sample 3588282. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39bbf60> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7fb977797fb0> cannot identify image file <_io.BytesIO object at 0x7fdcaf8f18f0> Failed to fetch sample 1635122. Exception:Failed to fetch sample 1978365. Exception: cannot identify image file <_io.BytesIO object at 0x7f001faf2840> Failed to fetch sample 1988418. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3af3150> Failed to fetch sample 3067785. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f832e0> Failed to fetch sample 3747017. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200359c60> Failed to fetch sample 1900442. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401f5cb0> Failed to fetch sample 1524920. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e5850> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777ac630> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39bacf0> cannot identify image file <_io.BytesIO object at 0x7fdd459d79c0> Failed to fetch sample 3345102. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcf44f0> Failed to fetch sample 2606964. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f94a3e0> Failed to fetch sample 2237717. Exception: Failed to fetch sample 3030135. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6497650> Failed to fetch sample 2423129. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f282750> cannot identify image file <_io.BytesIO object at 0x7fa2846c8fe0> le <_io.BytesIO object at 0x7f0f4c61d7b0> Failed to fetch sample 3358227. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3923c40> Failed to fetch sample 2130061. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcf4720> Failed to fetch sample 3048644. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb6877290> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f93a2f0> Failed to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd43034a90> Failed to fetch sample 999878. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52b7ceac0> Failed to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c353f6f70> Failed to fetch sample 3176736. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3922980> cannot identify image file <_io.BytesIO object at 0x7f6556024bd0> Failed to fetch sample 1588738. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa2cf90> cannot identify image file <_io.BytesIO object at 0x7f53a9083830> le <_io.BytesIO object at 0x7fedb5ec13f0> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f939c10> Failed to fetch sample 2998072. Exception:Failed to fetch sample 1695344. Exception: cannot identify image fiFailed to fetch sample 2887528. Exception:cannot identify image file <_io.BytesIO object at 0x7f4f442009f0> FFailed to fetch sample 1061148. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00723650> ailed to fetch sample 1221734. Exception: cannot identify image file <_io.BytesIO object at 0x7f991bdd8900> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d5d193f60> Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb5ec1210> Failed to fetch sample 2780579. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f923650> Failed to fetch sample 2526328. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a559ad0> Failed to fetch sample 2525985. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e306e4ea0> Failed to fetch sample 2789590. Exception: cannot identify image file <_io.BytesIO object at 0x7f53bc93b970> Failed to fetch sample 1739207. Exception: cannot identify image fiFailed to fetch sample 2633741. Exception:Failed to fetch sample 1394536. Exception: cannot identify image fiFailed to fetch sample 1747603. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a55b420> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568ce7f0> Failed to fetch sample 2641615. Exception:Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459d7ba0> Failed to fetch sample 1122273. Exception: cannot identify image file <_io.BytesIO object at 0x7f566827b3d0> Failed to fetch sample 1885521. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fbf007238d0> Failed to fetch sample 2908337. Exception: cannot identify image file <_io.BytesIO object at 0x7f668810eb10> Failed to fetch sample 2799151. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71fe20> Failed to fetch sample 4133825. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71f3d0> Failed to fetch sample 1027355. Exception:Failed to fetch sample 1624831. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb37060> Failed to fetch sample 1304332. Exception: cannot identify image fiFailed to fetch sample 3652924. Exception:Failed to fetch sample 3098320. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3809b70> Failed to fetch sample 1871559. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e07706e80> Failed to fetch sample 2Failed to fetch sample 2884516. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb35fd0>cFailed to fetch sample 2377601. Exception:ect at 0x7fd20035a660> cannot identify image file <_io.BytesIO object at 0x7fe4c427fa10> Failed to fetch sample 2957876. Exception: cannot identify image file <_io.BytesIO object at 0x7fd17f5293f0> ile <_io.BytesIO object at 0x7fd07a5598f0> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568ce570> Failed to fetch sample 3351202. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f0a4770> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7f36366e1cb0> Failed to fetch sample 3176035. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d7ddb8860> Failed to fetch sample 2789476. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7cf493a0> Failed to fetch sample 3700472. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568cf970> Failed to fetch sample 4121555. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc193380> Failed to fetch sample 2988390. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdc2070> Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7f34f2d18e50> Failed to fetch sample 1759192. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e07706160> cannot identify image file <_io.BytesIO object at 0x7f39486c1350> Failed to fetch sample 1578193. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c184a90> le <_io.BytesIO object at 0x7fe4c427ff60> Failed to fetch sample 2647526. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb37ec0> Failed to fetch sample 1701865. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb5ec1170> Failed to fetch sample 3747017. Exception: cannot identify image file <_io.BytesIO object at 0x7f4f044b81d0> Failed to fetch sample 3855812. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078aa750> Failed to fetch sample 1304332. Exception: cannot identify image file <_io.BytesIO object at 0x7f34f2d109a0> cannot identify image file <_io.BytesIO object at 0x7f5c9d8fbf10> Failed to fetch sample 3458049. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777dba10> Failed to fetch sample 3224871. Exception:ject at 0x7fcbdc1cb880> cannot identify image file <_io.BytesIO object at 0x7fcef8cefd30> Failed to fetch sample 2392280. Exception: cannot identify image file <_io.BytesIO object at 0x7fe1bd7dfe20> le <_io.BytesIO object at 0x7fb9777dbe20> Failed to fetch sample 3478098. Exception: cannot identify image file <_io.BytesIO object at 0x7ff69741c7c0> Failed to fetch sample 4244036. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9793dbc0> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d2789cea0> Failed to fetch sample 1506523. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf75b8d0> Failed to fetch sample 2850892. Exception: cannot identify image fiFailed to fetch sample 2250980. Exception:Failed to fetch sample 2998072. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f5253bb7bf0> Failed to fetch sample 2419702. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5676a0> cannot identify image file <_io.BytesIO object at 0x7f6a7f265850> Failed to fetch sample 1304332. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c04f0> le <_io.BytesIO object at 0x7fccdf7b3650> Failed to fetch sample 2330574. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e35b0> Failed to fetch sample 2964818. Exception: cannot identify image file <_io.BytesIO object at 0x7ff69741ea20> cannot identify image file <_io.BytesIO object at 0x7fd17fad3d30> Failed to fetch sample 1591233. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c427f420> Failed to fetch sample 2284248. Exception: cannot identify image file <_io.BytesIO object at 0x7f7af4a2fd80> Failed to fetch sample 2951293. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568a69d0> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f2669d0> cannot identify image file <_io.BytesIO object at 0x7fa275c7b560> le <_io.BytesIO object at 0x7fd17e71b100> Failed to fetch sample 2328950. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f73560> Failed to fetch sample 3907787. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20037bf10> Failed to fetch sample 2127419. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c424b920> Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b979f6980> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf75afc0> Failed to fetch sample 3431220. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f735b0> Failed to fetch sample 2235023. Exception: cannot identify image fFailed to fetch sample 2073465. Exception: Failed to fetch sample 3150542. Exception: cannot identify image fFailed to fetch sample 3344635. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d4f9f1990> ile <_io.BytesIO object at 0x7fb687f6a110> Failed to fetch sample 1811455. 
Exception: cannot identify image fFailed to fetch sample 3423432. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459d5ee0> ile <_io.BytesIO object at 0x7fd2c51dee30> Failed to fetch sample 2454376. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f72a20> Failed to fetch sample 2647526. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe7240b6f0> Failed to fetch sample 3355893. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c718a5d50> Failed to fetch sample 2454376. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd5a80> Failed to fetch sample 2686021. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093763740> Failed to fetch sample 1304332. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7b18f0> Failed to fetch sample 2532980. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b969b70> cannot identify image file <_io.BytesIO object at 0x7f9d50596660> Failed to fetch sample 4102616. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7b16cf0> cannot identify image file <_io.BytesIO object at 0x7f5253bd7ce0> Failed to fetch sample 3210209. Exception: Failed to fetch sample 3397401. Exception: ect at 0x7f0c429de6b0> Failed to fetch sample 3747017. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b25b740> Failed to fetch sample 4151771. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cd670> Failed to fetch sample 3454551. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b08863e0> Failed to fetch sample 2382497. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190c1ad0> Failed to fetch sample 3371113. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7fb52f6af290> Failed to fetch sample 3345102. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7b29da0> cannot identify image file <_io.BytesIO object at 0x7f7fe4e2b880> Failed to fetch sample 1127961. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82b6ed0> ile <_io.BytesIO object at 0x7f7e078ab920> Failed to fetch sample 2127419. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093763ce0> Failed to fetch sample 3401019. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d505969d0> Failed to fetch sample 1465471. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190bede0> Failed to fetch sample 3091195. Exception: cannot identify image file <_io.BytesIO object at 0x7fc214d4e2f0> Failed to fetch sample 1432321. Exception: cannot identify image file <_io.BytesIO object at 0x7f4c161e4a90> Failed to fetch sample 4102616. Exception: cannot identify image file <_io.BytesIO object at 0x7f136ce2bc90> cannot identify image file <_io.BytesIO object at 0x7efcd7b28720> Failed to fetch sample 3217832. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420ed850> Failed to fetch sample 2090097. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e9c10> Failed to fetch sample 1043114. Exception:bject at 0x7f4e83fc5670> Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071e700> Failed to fetch sample 3176035. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420ee660> Failed to fetch sample 3431352. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7b2b510> cannot identify image file <_io.BytesIO object at 0x7f7ecaa09f30> ile <_io.BytesIO object at 0x7f2775487b00> Failed to fetch sample 1672947. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d4fbbebb0> cannot identify image file <_io.BytesIO object at 0x7f6a7f26a840> Failed to fetch sample 1068865. Exception:Failed to fetch sample 2647526. 
Exception: cannot identify image fiFailed to fetch sample 1057917. Exception: cannot identify image file <_io.BytesIO object at 0x7f4e8496f9c0> Failed to fetch sample 3595827. Exception:Failed to fetch sample 3652924. Exception:bject at 0x7f9a9c2a1bc0> Failed to fetch sample 3431220. Exception:ject at 0x7f6191ca4d60> ile <_io.BytesIO object at 0x7f109377a250> Failed to fetch sample 2998072. Exception: cannot identify image fiFailed to fetch sample 4219622. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093756200> Failed to fetch sample 3662436. Exception: cannot identify image file <_io.BytesIO object at 0x7f4e83f97ba0> cannot identify image file <_io.BytesIO object at 0x7f5ea213e2a0> Failed to fetch sample 2353389. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420eea20> cannot identify image file <_io.BytesIO object at 0x7fdc25fe2e80> Failed to fetch sample 2036348. Exception: cannot identify image file <_io.BytesIO object at 0x7fbc5bc0b470> cannot identify image file <_io.BytesIO object at 0x7f5a16f8ac00> Failed to fetch sample 3106325. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5704f0> le <_io.BytesIO object at 0x7fbf0071cb80> Failed to fetch sample 2446023. Exception: cannot identify image fiFailed to fetch sample 1868762. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f3ab60> Failed to fetch sample 1963191. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420eee30> le <_io.BytesIO object at 0x7f4e8496f2e0> cannot identify image file <_io.BytesIO object at 0x7f6a7f268720> Failed to fetch sample 2587174. Exception:Failed to fetch sample 2943308. Exception:bject at 0x7fdc25629620> ailed to fetch sample 2461520. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8eb880> Failed to fetch sample 2452845. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777b7fb0> Failed to fetch sample 1679106. 
Exception: ject at 0x7f12b38128e0> Failed to fetch sample 2341971. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071ebb0> Failed to fetch sample 3747017. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b426e1b0> Failed to fetch sample 1961364. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8ea980> Failed to fetch sample 3431220. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0070b4c0> cannot identify image file <_io.BytesIO object at 0x7f81db542fc0> Failed to fetch sample 2354595. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f5a8e0> Failed to fetch sample 3384472. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bef790> ile <_io.BytesIO object at 0x7f12b39f74c0> Failed to fetch sample 1102280. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3813560> Failed to fetch sample 3671101. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebaba10> Failed to fetch sample 3315269. Exception:Failed to fetch sample 2269387. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200376e80> cannot identify image file <_io.BytesIO object at 0x7f10937786d0> Failed to fetch sample 2250980. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0070b9c0> Failed to fetch sample 2491527. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b907150> Failed to fetch sample 2217039. Exception: cannot identify image fFailed to fetch sample 1357884. Exception: cannot identify image file <_io.BytesIO object at 0x7fa008b4a6b0> ile <_io.BytesIO object at 0x7f12b382e890> Failed to fetch sample 2465777. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3810cc0> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebab4c0> Failed to fetch sample 1099589. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f357b956750> cannot identify image file <_io.BytesIO object at 0x7fdc25fd9f30> Failed to fetch sample 1785148. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f7b00> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fe2fc0> cannot identify image file <_io.BytesIO object at 0x7f5e9d77df30> Failed to fetch sample 1524647. Exception:Failed to fetch sample 3725471. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db570900> Failed to fetch sample 2943308. Exception: cannot identify image file <_io.BytesIO object at 0x7f12573a4770> Failed to fetch sample 2237717. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b906cf0> Failed to fetch sample 3431220. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682db9170> Failed to fetch sample 1747603. Exception: cannot identify image fiFailed to fetch sample 1976843. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3811670> ailed to fetch sample 3150542. Exception: cannot identify image filFailed to fetch sample 2687916. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b957f10> Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937ff790> Failed to fetch sample 1961364. ExceptionFailed to fetch sample 2123315. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b7fbbe20> : cannot identify image file <_io.BytesIO object at 0x7f6682db2f20> Failed to fetch sample 2452845. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dba020> Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3812840> Failed to fetch sample 1465471. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b957240> Failed to fetch sample 1588738. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa008b49580> Failed to fetch sample 2587174. Exception: Failed to fetch sample 2199896. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937fd8a0> Failed to fetch sample 2374167. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682db0ae0> Failed to fetch sample 2250980. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a65300> ject at 0x7f5de3738590> cannot identify image file <_io.BytesIO object at 0x7f5253bba7a0> Failed to fetch sample 2454376. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db543290> ile <_io.BytesIO object at 0x7fdcaf889e40> Failed to fetch sample 3401019. Exception: cannot identify image file <_io.BytesIO object at 0x7f277548bec0> Failed to fetch sample 3916323. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3813d80> Failed to fetch sample 3067785. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682ddad90> Failed to fetch sample 3725471. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a64450>F cannot identify image file <_io.BytesIO object at 0x7f18e649b290> Failed to fetch sample 1240753. Exception: cannot identify image file <_io.BytesIO object at 0x7efe283f8cc0> Failed to fetch sample 4102616. Exceptioncannot identify image file <_io.BytesIO object at 0x7f81db543fb0> > Failed to fetch sample 1351024. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20035ad40> Failed to fetch sample 1473036. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39bb6a0> Failed to fetch sample 4166824. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7add00> Failed to fetch sample 3167909. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20038af70> Failed to fetch sample 1885521. Exception: cannot identify image file <_io.BytesIO object at 0x7fa1d0abb740> Failed to fetch sample 1061802. 
Failed to fetch sample 3209388. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39d2ed0>
Failed to fetch sample 2476148. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f8f650>
[... the same "cannot identify image file <_io.BytesIO object at 0x...>" error repeats, interleaved across dataloader workers, for several hundred sample IDs; some IDs recur multiple times (e.g. 1985115, 2820344, 3329980, 3133474, 2454376), indicating retries or the same corrupt records being hit by different workers ...]
Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381a390> Failed to fetch sample 1741849. Exception: cannot identify image file <_io.BytesIO object at 0x7f1228c22a70> Failed to fetch sample 1304332. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e4e2f0> Failed to fetch sample 1070036. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39dd440> Failed to fetch sample 3414742. Exception: cannot identify image file <_io.BytesIO object at 0x7f4c161e4a90> Failed to fetch sample 1243147. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c43cf240> Failed to fetch sample 2113887. Exception: cannot identify image file <_io.BytesIO object at 0x7f995aaf13a0> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7fe1f85e5440> Failed to fetch sample 1243147. Exception: cannot identify image file <_io.BytesIO object at 0x7fb37942fd80> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7fa008198ae0> Failed to fetch sample 2107834. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b382a250> cannot identify image file <_io.BytesIO object at 0x7fdcaf888bd0> cannot identify image file <_io.BytesIO object at 0x7f83e84edf80> Failed to fetch sample 3150542. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa3ac00> Failed to fetch sample 3385768. Exception: cannot identify image file <_io.BytesIO object at 0x7f3c56079710> Failed to fetch sample 1198363. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b95d580> Failed to fetch sample 1718944. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb5fe70>FFailed to fetch sample 2887528. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf79f560> Failed to fetch sample 3143487. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de03007c0Failed to fetch sample 1635122. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbbdad40> Failed to fetch sample 2454376. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38268e0> Failed to fetch sample 2817471. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b39f5e40> Failed to fetch sample 1007919. Exception: cannot identify image fiFailed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f880e0> Failed to fetch sample 1232641. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b996660> le <_io.BytesIO object at 0x7f5838df6250> Failed to fetch sample 1988187. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093b797b0> Failed to fetch sample 1961415. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b907740> Failed to fetch sample 1448134. Exception: cannot identify image file <_io.BytesIO object at 0x7f12295f14e0> Failed to fetch sample 3508405. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a15160f40> Failed to fetch sample 3418606. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf777f10> Failed to fetch sample 2993528. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278aca90> Failed to fetch sample 2452845. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754b26b0> Failed to fetch sample 2750694. Exception: cannot identify image file <_io.BytesIO object at 0x7fb5e8c8b8d0> Failed to fetch sample 3397401. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec9514e0> Failed to fetch sample 3002531. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3808b80> Failed to fetch sample 1759192. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d27887ec0> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a15162250> Failed to fetch sample 3133474. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb6a390> cannot identify image file <_io.BytesIO object at 0x7f35a7f72980> Failed to fetch sample 2446023. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcb7e0> Failed to fetch sample 2846277. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f257100> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f2936a0> Failed to fetch sample 2061981. Exception: cannot identify image file <_io.BytesIO object at 0x7f0dd010a0c0> Failed to fetch sample 3437033. Exception: ject at 0x7f3939ca8b30> Failed to fetch sample 2891850. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfab859490> le <_io.BytesIO object at 0x7fc0f08feb10> Failed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb5e6b0> Failed to fetch sample 1097359. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb69710> Failed to fetch sample 1468867. Exception: cannot identify image file <_io.BytesIO object at 0x7f00d8f31f80> Failed to fetch sample 3747017. Exception: cannot identify image file <_io.BytesIO object at 0x7fb4a5b609a0> cannot identify image file <_io.BytesIO object at 0x7f94ac2422f0> Failed to fetch sample 1885521. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b426da80> Failed to fetch sample 3631501. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82d34c0> Failed to fetch sample 3652924. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcd97dd0> cannot identify image file <_io.BytesIO object at 0x7f7b9b95d940> Failed to fetch sample 2341971. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39e95d0> Failed to fetch sample 3362301. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8d2bb0> Failed to fetch sample 2309209. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6d14f9fd30> Failed to fetch sample 3414597. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec955a80> Failed to fetch sample 2521553. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f291da0> Failed to fetch sample 2591241. Exception: cannot identify image file <_io.BytesIO object at 0x7f09953d4db0> Failed to fetch sample 2208960. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac243c40> cannot identify image file <_io.BytesIO object at 0x7f22e39eb8d0> Failed to fetch sample 1534741. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf8d6f70> Failed to fetch sample 4193225. Exception: cannot identify image fFailed to fetch sample 2385274. Exception: Failed to fetch sample 4221561. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd4d60> Failed to fetch sample 1243147. Exception: cannot identify image file <_io.BytesIO object at 0x7efd0d5fd940> cannot identify image file <_io.BytesIO object at 0x7f95bc524590> cannot identify image file <_io.BytesIO object at 0x7f357b921b70> Failed to fetch sample 2884186. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8f7a60> cannot identify image file <_io.BytesIO object at 0x7efcdbb4b290> Failed to fetch sample 3719596. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f95a840> Failed to fetch sample 4077268. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37167f0> Failed to fetch sample 2647526. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb70270> Failed to fetch sample 2762807. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37453a0> Failed to fetch sample 1157357. Exception: ject at 0x7f816d067470> Failed to fetch sample 2943308. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b923f60> Failed to fetch sample 2606964. 
Exception cannot identify image file <_io.BytesIO object at 0x7f65560273d0> le <_io.BytesIO object at 0x7fee3f95b560> Failed to fetch sample 1885521. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37159e0> Failed to fetch sample 1173229. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f1800> Failed to fetch sample 2051285. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c272fc0> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777ae660> Failed to fetch sample 2528140. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d5a865030> cannot identify image file <_io.BytesIO object at 0x7f8028e89490> Failed to fetch sample 2012611. Exception:ject at 0x7f12b381b7e0> Failed to fetch sample 3176035. Exception: cannot identify image file <_io.BytesIO object at 0x7f1009412430> Failed to fetch sample 1508404. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f83e20> Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd6ca0> Failed to fetch sample 2606964. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bdb380> Failed to fetch sample 1848040. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db6fbba0> Failed to fetch sample 3013587. Exception: cannot identify image file <_io.BytesIO object at 0x7fc108457150> Failed to fetch sample 1480960. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb4a2a0> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f266610> Failed to fetch sample 1448420. Exception: cannot identify image file <_io.BytesIO object at 0x7fe445d6d170> Failed to fetch sample 3035327. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80f8a90> Failed to fetch sample 3133979. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3712a20> Failed to fetch sample 3641068. Exception: cannot identify image file <_io.BytesIO object at 0x7f472789f420> Failed to fetch sample 1961364. Exception: cannot identify image file <_io.BytesIO object at 0x7f846e3ef2e0> Failed to fetch sample 1578193. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1084560c0> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd4090> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd9f80> Failed to fetch sample 1007919. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d896c0> cannot identify image file <_io.BytesIO object at 0x7f6d94e9a700> Failed to fetch sample 3353116. Exception:Failed to fetch sample 2602679. Exception: cannot identify image fiFailed to fetch sample 2392280. Exception:Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f266cf0> Failed to fetch sample 2957876. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a025c0> cannot identify image fi cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7f80201b3ba0> le <_io.BytesIO object at 0x7ff401fbb150> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777ae4d0> bject at 0x7f9a9c3bb880> Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f45c10> Failed to fetch sample 2686021. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3745490> Failed to fetch sample 1232641. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcea700> Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754b2f70> Failed to fetch sample 1588738. 
Exception: cannot identify image fiFailed to fetch sample 2852433. Exception:Failed to fetch sample 1591233. Exception: cannot identify image fiFailed to fetch sample 3725471. Exception:Failed to fetch sample 1424357. Exception: cannot identify image file <_io.BytesIO object at 0x7fb8edd98630> cannot identify image file <_io.BytesIO object at 0x7f991c7c76f0> Failed to fetch sample 2335870. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7ae070> Failed to fetch sample 2574635. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3af76f0> Failed to fetch sample 2062925. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20038f470> cannot identify image file <_io.BytesIO object at 0x7f9abcdc9670> ile <_io.BytesIO object at 0x7f357b93ad40> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190c2520> cannot identify image file <_io.BytesIO object at 0x7f1137f45e40> le <_io.BytesIO object at 0x7f4beebab4c0> Failed to fetch sample 3907787. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9630b0> Failed to fetch sample 3133474. Exception: cannot identify image fFailed to fetch sample 1158188. Exception: Failed to fetch sample 3133474. Exception: cannot identify image fFailed to fetch sample 2993528. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93b5b0> cannot identify image file <_io.BytesIO object at 0x7f83e834b240> 432321. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a90f740> Failed to fetch sample 2400608. Exception: ailed to fetch sample 2717891. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fdaadeb10d0> le <_io.BytesIO object at 0x7ff62056e980> Failed to fetch sample 3091195. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebab790> Failed to fetch sample 1759192. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93bab0> Failed to fetch sample 1221827. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf85b380> cannot identify image file <_io.BytesIO object at 0x7f9d7322f6a0> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f47060> Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebda660> cannot identify image file <_io.BytesIO object at 0x7f5e6a90f650> Failed to fetch sample 1945810. Exception: ject at 0x7f82a5ec4040> cannot identify image file <_io.BytesIO ob cannot identify image file <_io.BytesIO object at 0x7fccdf78ae80> Failed to fetch sample 1781171. Exception: cannot identify image file <_io.BytesIO object at 0x7fc000b14900> cannot identify image file <_io.BytesIO object at 0x7f9d7322f290> Failed to fetch sample 1681909. Exception: cannot identify image fiFailed to fetch sample 1232641. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a90d990> Failed to fetch sample 1749320. Exception:Failed to fetch sample 4238312. Exception:bject at 0x7ff2575efc40> Failed to fetch sample 3333205. Exception: cannot identify image file <_io.BytesIO object at 0x7f32986bb880> Failed to fetch sample 4168370. Exception:Failed to fetch sample 3137117. Exception: cannot identify image fiFailed to fetch sample 2018692. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c776a0> Failed to fetch sample 2452845. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401f65c0> Failed to fetch sample 3010687. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093737830> Failed to fetch sample 2341971. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebdbab0> Failed to fetch sample 2597101. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777b39c0> Failed to fetch sample 2091123. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b7c40> Failed to fetch sample 1525240. Exception: cannot identify image file <_io.BytesIO object at 0x7f60ffea7a60> Failed to fetch sample 3662436. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8d9670> Failed to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777b1e40> cannot identify image file <_io.BytesIO object at 0x7f35a7f2b6f0> Failed to fetch sample 2103611. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08ef420> Failed to fetch sample 3496220. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdef620c0> Failed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7f6180c29e40> Failed to fetch sample 1659475. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf854f90> Failed to fetch sample 1092914. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadeb21b0> Failed to fetch sample 3492865. Exception: ailed to fetch sample 2654468. Exception: cannot identify image fiFailed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe724067f0> Failed to fetch sample 2446023. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b96db70> cannot identify image file <_io.BytesIO object at 0x7f6417e0aca0> Failed to fetch sample 1243147. Exception:cannot identify image file <_io.BytesIO object at 0x7f27754b26b0> Failed to fetch sample 1718944. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3876bb0> FFailed to fetch sample 1451627. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73261620> ailed to fetch sample 2084105. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb49b20> Failed to fetch sample 4169491. Exception: cannot identify image file <_io.BytesIO object at 0x7efc516afbf0> Failed to fetch sample 2199896. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadeb2930> Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39eb5b0> Failed to fetch sample 3652924. Exception: cannot identify image file <_io.BytesIO object at 0x7f36637eda30> cannot identify image file <_io.BytesIO object at 0x7f655606b150> le <_io.BytesIO object at 0x7efcdbb36890> Failed to fetch sample 1543124. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39f9bc0> Failed to fetch sample 3652924. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9b86d0> Failed to fetch sample 1565916. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b1440> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7f225afe4ea0> Failed to fetch sample 1722020. Exception: ject at 0x7fd310dc33d0> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a561710> le <_io.BytesIO object at 0x7f7e078e3560> Failed to fetch sample 1534741. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39f8680> cannot identify image file <_io.BytesIO object at 0x7efcdbb2fdd0> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb37e70> Failed to fetch sample 1480960. Exception:ject at 0x7f9abcdf67a0> Failed to fetch sample 3106325. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f2bce0> Failed to fetch sample 1243147. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdcd79d00> Failed to fetch sample 2411770. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbde505800> Failed to fetch sample 4151771. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b3889670> cannot identify image file <_io.BytesIO object at 0x7fbf0071b510> Failed to fetch sample 3747017. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078e1940> cannot identify image file <_io.BytesIO object at 0x7efcdbb12d90> Failed to fetch sample 991659. Exception:: cannot identify image file <_io.BytesIO object at 0x7fc2eb721df0> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb376a0> Failed to fetch sample 973236. Exception:: cannot identify image file <_io.BytesIO object at 0x7f5a16f774c0> Failed to fetch sample 1821542. Exception: cannot identify image file <_io.BytesIO object at 0x7f83045caf20> Failed to fetch sample 3142724. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d2ceb26b0> cannot identify image file <_io.BytesIO object at 0x7f58399b9120> Failed to fetch sample 1494716. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebd0a90> Failed to fetch sample 3010687. Exception: cannot identify image file <_io.BytesIO object at 0x7fd08940d6c0> Failed to fetch sample 3247123. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278af470> Failed to fetch sample 3707415. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8eac00> Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb72a480> Failed to fetch sample 2452845. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a935da0> Failed to fetch sample 3593912. Exception: cannot identify image file <_io.BytesIO object at 0x7ff697de53a0> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7f6417e0b740> Failed to fetch sample 3700472. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca3a10> Failed to fetch sample 3150542. Exception: cannot identify image file <_io.BytesIO object at 0x7fd08940c7c0> Failed to fetch sample 3602844. Exception: cannot identify image file <_io.BytesIO object at 0x7f08282cbce0> Failed to fetch sample 1144891. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fbdd0> Failed to fetch sample 1243147. Exception: cannot identify image file <_io.BytesIO object at 0x7f05223ec2c0> cannot identify image file <_io.BytesIO object at 0x7fc1cf6a1210> Failed to fetch sample 2024032. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e07704b30> le <_io.BytesIO object at 0x7f001f929bc0> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f6ec50> Failed to fetch sample 2245924. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe81f73fb0> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b979ad5d0> Failed to fetch sample 4193225. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39d6110> Failed to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1cf6730b0> Failed to fetch sample 1471109. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a907b1f0> ELISM=(true | false) Failed to fetch sample 3445146. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c28f9c0> Failed to fetch sample 3397401. Exception:Failed to fetch sample 4243929. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078d5080> cannot identify image file <_io.BytesIO object at 0x7fc24d0ce520> Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278db4c0> Failed to fetch sample 3133474. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f55080> Failed to fetch sample 1316862. Exception: cannot identify image fiFailed to fetch sample 2641615. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1cf672ed0> cannot identify image file <_io.BytesIO object at 0x7fee3f972f20> Failed to fetch sample 2982611. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f1170> Failed to fetch sample 3397401. Exception: cannot identify image file <_io.BytesIO object at 0x7f6100878540> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278d8fe0> Failed to fetch sample 1224848. Exception: cannot identify image file <_io.BytesIO object at 0x7f1bfcb56160> Failed to fetch sample 1991139. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71d8f0> Failed to fetch sample 3002531. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f82f20> cannot identify image file <_io.BytesIO object at 0x7f08280fa9d0> le <_io.BytesIO object at 0x7f22e39d7f60> Failed to fetch sample 3133474. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b55d0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable cannot identify image file <_io.BytesIO object at 0x7f35a7f55080> 36. Exception: cannot identify image file <_io.BytesIO object at 0x7ff6973eb420> Failed to fetch sample 1534741. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8ecfe0> Failed to fetch sample 2671486. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71e070> Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64c3fb0> Failed to fetch sample 1304332. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1cf6a39c0> cannot identify image file <_io.BytesIO object at 0x7f27754aaa70> Failed to fetch sample 1534741. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f969d00>Failed to fetch sample 2374167. 
[Interleaved output from multiple dataloader workers, deduplicated. Each failing record produced one line of the form:]
Failed to fetch sample 1068121. Exception: cannot identify image file <_io.BytesIO object at 0x7fc383d84810>
[The same message repeats for several hundred sample ids across ranks (e.g. 2789393, 981857, 3887462, 1701865, 4080908, 3150542, 2708935, 3133474, 2647526, 1116444, 2325409, 4193225, ...), with lines from different workers garbling each other mid-word.]
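The failure pattern above ("Failed to fetch sample <id>. Exception: ...") is the signature of a loader that catches decode errors on corrupt records, logs them, and resamples instead of crashing the run. A minimal sketch of that skip-and-resample pattern follows; `decode_image` is a hypothetical stand-in for the real image decoder (PIL raises "cannot identify image file" when a byte stream matches no known format), and the retry logic is an assumption, not the actual training code:

```python
import io
import random

def decode_image(buf):
    """Hypothetical stand-in for PIL.Image.open: reject byte streams whose
    header matches no known image format (the condition behind the
    'cannot identify image file' errors in the log above)."""
    header = buf.getvalue()[:8]
    if header.startswith(b"\x89PNG\r\n\x1a\n") or header.startswith(b"\xff\xd8\xff"):
        return header  # pretend this is the decoded image
    raise ValueError(f"cannot identify image file {buf!r}")

def fetch_sample(dataset, idx, max_retries=3):
    """Skip-and-resample: on a corrupt record, log the failure in the same
    format as the run above, then fall back to a random other index."""
    for _ in range(max_retries):
        try:
            return decode_image(dataset[idx])
        except ValueError as exc:
            print(f"Failed to fetch sample {idx}. Exception: {exc}")
            idx = random.randrange(len(dataset))
    raise RuntimeError("too many corrupt samples in a row")
```

This keeps a long run alive through isolated bad records, at the cost of silently changing the effective sampling distribution when (as here) the corrupt-record rate is high.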
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
 - Avoid using `tokenizers` before the fork if possible
 - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
[The "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" messages continue throughout, interleaved with further repeats of the tokenizers warning above.]
Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0f9a24d0> Failed to fetch sample 2767646. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b96ddf0> Failed to fetch sample 1659475. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093767600> Failed to fetch sample 2353389. Exception: cannot identify image file <_io.BytesIO object at 0x7f1228c222a0> Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dc7b50> cannot identify image file <_io.BytesIO object at 0x7f53a9082f20> Failed to fetch sample 2122051. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b96fab0> Failed to fetch sample 1480960. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937e3ec0> cannot identify image file <_io.BytesIO object at 0x7f5e7a25f7e0> Failed to fetch sample 1256456. Exception:Failed to fetch sample 2219434. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39dec50> Failed to fetch sample 1963191. Exception: cannot identify image file <_io.BytesIO object at 0x7f12295ecf40> cannot identify image file <_io.BytesIO object at 0x7f5a23a47d30> Failed to fetch sample 3418606. Exception: Failed to fetch sample 2659119. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b426f7e0> Failed to fetch sample 1659475. Exception: cannot identify image fiFailed to fetch sample 1741559. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682f92ac0> Failed to fetch sample 3196682. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078fa2a0> Failed to fetch sample 3557219. Exception:Failed to fetch sample 3355968. Exception:ject at 0x7f5527249490> ile <_io.BytesIO object at 0x7fdaadeb2c50> cannot identify image file <_io.BytesIO object at 0x7f58c39dc0e0> Failed to fetch sample 1963191. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fa3bbcd10> Failed to fetch sample 3661638. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078fa8e0> Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a9066b60> cannot identify image file <_io.BytesIO object at 0x7f5a16f727f0> ile <_io.BytesIO object at 0x7f357b9d8180> Failed to fetch sample 1102280. Exception: cannot identify image file <_io.BytesIO object at 0x7f566811dbc0> Failed to fetch sample 1277019. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39df150> Failed to fetch sample 3431220. Exception: cannot identify image file <_io.BytesIO object at 0x7fc383d84770> Failed to fetch sample 2452845. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcd0ae0> cannot identify image file <_io.BytesIO object at 0x7fccdf79e930> le <_io.BytesIO object at 0x7f7e078fb920> cannot identify image file <_io.BytesIO object at 0x7ff249486b60> ile <_io.BytesIO object at 0x7fa008b320c0> Failed to fetch sample 2720468. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071e930> cannot identify image file <_io.BytesIO oFailed to fetch sample 2840571. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82bb420> cannot identify image file <_io.BytesIO object at 0x7f08280f9a30> Failed to fetch sample 3397401. Exception: ject at 0x7f58c39deac0> Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078faca0> Failed to fetch sample 1422377. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c13a0> Failed to fetch sample 2237717. Exception:ject at 0x7f9d7325f3d0> Failed to fetch sample 2526328. Exception: cannot identify image file <_io.BytesIO object at 0x7f5525ebc310> Failed to fetch sample 2671486. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2992b0> Failed to fetch sample 2647584. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7326cfe0> Failed to fetch sample 2606964. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f08280faf20> Failed to fetch sample 1476796. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39deca0> Failed to fetch sample 2538638. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071e200> annot identify image file <_io.BytesIO object at 0x7f80a80ddda0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4045412. Exception: cannot identify image file <_io.BytesIO object at 0x7f5643e2e480> Failed to fetch sample 4043255. Exception: ject at 0x7fbe7224a6b0> Failed to fetch sample 2673531. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bb6e30> cannot identify image file <_io.BytesIO object at 0x7f12b38c2610> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f55b329b790> le <_io.BytesIO object at 0x7f0ecb203650> cannot identify image file <_io.BytesIO object at 0x7f1b7b9c06d0> le <_io.BytesIO object at 0x7f80a8109210> cannot identify image file <_io.BytesIO object at 0x7f22e384cdb0> Failed to fetch sample 2122051. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bb7ab0> Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f57e20> cannot identify image file <_io.BytesIO object at 0x7f08280fb830> Failed to fetch sample 2970666. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecdb9ebb0> Failed to fetch sample 1097359. Exception: cannot identify image file <_io.BytesIO object at 0x7fe16e2e5a80> Failed to fetch sample 1479777. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd200386930> Failed to fetch sample 3916323. Exception: cannot identify image fiFailed to fetch sample 1256456. Exception:Failed to fetch sample 2633741. Exception:ject at 0x7fd079d5dc60> Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64e2d40> Failed to fetch sample 4072492. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7ad170> Failed to fetch sample 3557219. Exception:bject at 0x7fc19513d260> le <_io.BytesIO object at 0x7f9d7326de90> Failed to fetch sample 1277996. Exception: cannot identify image file <_io.BytesIO object at 0x7fcf245c0770> Failed to fetch sample 3414597. Exception: cannot identify image file <_io.BytesIO object at 0x7fc193db62a0> Failed to fetch sample 1747603. Exception: cannot identify image fi Failed to fetch sample 4171747. ExceptionFailed to fetch sample 2767646. Exception:ject at 0x7f47278d34c0> Failed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bb71f0> Failed to fetch sample 2376106. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f56ca0> > Failed to fetch sample 2821232. Exception: cannot identify image file <_io.BytesIO object at 0x7fc262b526b0> Failed to fetch sample 2136946. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b95f330> cannot identify image file <_io.BytesIO object at 0x7fd200385580> le <_io.BytesIO object at 0x7f4727926020> cannot identify image file <_io.BytesIO object at 0x7f69190bede0> Failed to fetch sample 3646975. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071d530> Failed to fetch sample 3298418. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093758770> ile <_io.BytesIO object at 0x7f9abcdb2890> Failed to fetch sample 3270019. Exception: cannot identify image file <_io.BytesIO object at 0x7f5643e2d260> Failed to fetch sample 2122051. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd200367920> Failed to fetch sample 2859127. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3a979c0> Failed to fetch sample 2671486. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f5300> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd5800> Failed to fetch sample 1277996. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39d72e0> Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcefc0> Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3816200> cannot identify image file <_io.BytesIO object at 0x7f5a16db6390> Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f82700> FFailed to fetch sample 1858383. Exception: ject at 0x7ff249487740> e <_io.BytesIO object at 0x7f5253bd5080> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7f3634f87a60> Failed to fetch sample 2767646. Exception: cannot identify image file <_io.BytesIO object at 0x7f69186e5d50> Failed to fetch sample 2641615. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093758680> cannot identify image file <_io.BytesIO object at 0x7fb687f6bf60> Failed to fetch sample 4023897. Exception:bject at 0x7fd30f9df560> Failed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f80680> Failed to fetch sample 1635122. Exception: cannot identify image file <_io.BytesIO object at 0x7f991b3e7f60> Failed to fetch sample 1572611. Exception: cannot identify image file <_io.BytesIO object at 0x7f8151525800> Failed to fetch sample 2201604. Exception: annot identify image file <_io.BytesIO object at 0x7fd310dc1e90> Failed to fetch sample 1070036. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd4c20> Failed to fetch sample 2683467. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39a83b0> Failed to fetch sample 2752631. Exception: cannot identify image file <_io.BytesIO object at 0x7f84f6fb9170> Failed to fetch sample 2854404. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf783100> Failed to fetch sample 2988753. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb729800> Failed to fetch sample 2521553. Exception: cannot identify image fi Failed to fetch sample 3916323. ExceptionFailed to fetch sample 1478462. Exception: cannot identify image filFailed to fetch sample 4117657. Exception: iled to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39ab600> Failed to fetch sample 4023897. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16db6430> Failed to fetch sample 1092679. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb2015d0> Failed to fetch sample 2341971. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7fe436a03380> Failed to fetch sample 2828361. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 2595378. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c21f830> Failed to fetch sample 3Failed to fetch sample 1104903. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200376700> Failed to fetch sample 3691803. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 1Failed to fetch sample 3922746. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bbaf20> Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 1858383. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200387ba0> Failed to fetch sample 1963191. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f69940> Failed to fetch sample 2392280. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20035abb0> ject at 0x7f53a8e8ab60> Failed to fetch sample 3351202. Exception:Failed to fetch sample 1382222. Exception: cannot identify image file <_io.BytesIO object at 0x7f5643e2fa60> Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcd800> Failed to fetch sample 2354595. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8e8a570> Failed to fetch sample 3392648. Exception: cannot identify image file <_io.BytesIO object at 0x7f71342d2de0> Failed to fetch sample 1416262. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f81490> Failed to fetch sample 3150542. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecdba7bf0> Failed to fetch sample 3855812. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200377740> Failed to fetch sample 2491527. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b25ad90> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253ffd6c0> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39a8cc0> Failed to fetch sample 1007919. Exception: cannot identify image file <_io.BytesIO object at 0x7f969583fc40> Failed to fetch sample 2816660. Exception: cannot identify image file <_io.BytesIO object at 0x7f8150b70b80> Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3826f20> Failed to fetch sample 2555299. Exception: cannot identify image fFailed to fetch sample 1961415. Exception: cannot identify image file <_io.BytesIO object at 0x7f846e3ec180> ile <_io.BytesIO object at 0x7f6682df6930> Failed to fetch sample 1061148. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd200376cf0> Failed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556069a30> cannot identify image file <_io.BytesIO object at 0x7fdcaf88f4c0> Failed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec954860> Failed to fetch sample 2866361. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d2d440> Failed to fetch sample 2883299. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d2bbf0> cannot identify image file <_io.BytesIO object at 0x7f80a810a0c0> Failed to fetch sample 3395958. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadeb1fd0> le <_io.BytesIO object at 0x7f82a5e971f0> Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7fd3121fccc0> cannot identify image file <_io.BytesIO object at 0x7f1f1b258f90> Failed to fetch sample 1386752. Exception:bject at 0x7fb976fb6c50> Failed to fetch sample 3073327. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7b2f70> Failed to fetch sample 2374167. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b137d81d0> Failed to fetch sample 4232471. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb727e70> Failed to fetch sample 2379671. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d2c270> Failed to fetch sample 3408340. Exception: cannot identify image file <_io.BytesIO object at 0x7f277545bf60> Failed to fetch sample 1309618. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f090c040> Failed to fetch sample 2491527. Exception: cannot identify image file <_io.BytesIO object at 0x7fb4a651f4c0> Failed to fetch sample 2991472. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459d5a30> Failed to fetch sample 1241940. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f2a876609f0> Failed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f090c360> Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7f2775477fb0> Failed to fetch sample 1776730. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278aeac0> Failed to fetch sample 1804828. Exception: cannot identify image fiFailed to fetch sample 1221095. Exception:Failed to fetch sample 1583532. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9ebab0> Failed to fetch sample 3306675. Exception: cannot identify image file <_io.BytesIO object at 0x7fc383d845e0> cannot identify image file <_io.BytesIO object at 0x7f94ac25f2e0> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cde2f0> Failed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca3650> Failed to fetch sample 3695885. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560683b0> le <_io.BytesIO object at 0x7fccdf7b9ee0> Failed to fetch sample 3401019. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d1dee0> le <_io.BytesIO object at 0x7f8150b704a0> cannot identify image file <_io.BytesIO object at 0x7f99a3af7470> Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253be7d30> Failed to fetch sample 3221585. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16daa6b0> Failed to fetch sample 2948290. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3af3f10> Failed to fetch sample 2199896. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253be6a20> Failed to fetch sample 2671486. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3911710> Failed to fetch sample 2091123. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e6610> Failed to fetch sample 3886652. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8d1350> Failed to fetch sample 2127419. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20035bbf0> cannot identify image file <_io.BytesIO object at 0x7fbf0071cf40> Failed to fetch sample 2184717. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebf7b00> Failed to fetch sample 2606964. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b92fc40> Failed to fetch sample 4232471. Exception: cannot identify image file <_io.BytesIO object at 0x7f8028e89490> Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb727c40> Failed to fetch sample 1026599. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7bb970> Failed to fetch sample 1672947. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13358ae0> cannot identify image file <_io.BytesIO object at 0x7f5a16daa430> Failed to fetch sample 3574902. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f92b650> ile <_io.BytesIO object at 0x7f566810cf90> Failed to fetch sample 1672947. Exception: ject at 0x7fee3f9eafc0> Failed to fetch sample 973449. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f979170> Failed to fetch sample 4023897. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f090db70> Failed to fetch sample 1232641. Exception:ject at 0x7fbf0074b600> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b92f380> Failed to fetch sample 2884186. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebf6a20> 2841865. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7ba610> cannot identify image file <_io.BytesIO object at 0x7f94acc318a0> Failed to fetch sample 3595827. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e4ae0> Failed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668106890> Failed to fetch sample 1534741. Exception: cannot identify image file <_io.BytesIO object at 0x7f52978ad8f0> Failed to fetch sample 1547061. Exception: cannot identify image file <_io.BytesIO object at 0x7fb97778ff10> Failed to fetch sample 4243929. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278e2f20> Failed to fetch sample 3416430. Exception: cannot identify image fiFailed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7f80284b73d0> Failed to fetch sample 1342670. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc5eadae80> cannot identify image file <_io.BytesIO object at 0x7f4e83fc4950> Failed to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38c09a0> Failed to fetch sample 4174126. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093733e70> Failed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7fb97778e890> Failed to fetch sample 2846277. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac3b84f0> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e67f0> cannot identify image file <_io.BytesIO object at 0x7f001f926bb0> Failed to fetch sample 3176736. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278c8860> Failed to fetch sample 4232471. Exception: cannot identify image file <_io.BytesIO object at 0x7f00d8f36480> Failed to fetch sample 2555299. Exception: cannot identify image file <_io.BytesIO object at 0x7fc107a4efc0> Failed to fetch sample 3149661. Exception:Failed to fetch sample 3662436. Exception:bject at 0x7ff52b9cf1f0> Failed to fetch sample 4073421. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff6973eb330> Failed to fetch sample 2521553. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b1dafc0> Failed to fetch sample 4232471. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777c3f10> cannot identify image file <_io.BytesIO object at 0x7f56682cdc10> Failed to fetch sample 2202214. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f99be20> cannot identify image file <_io.BytesIO object at 0x7f001f927560> Failed to fetch sample 1701865. Exception: cannot identify image file <_io.BytesIO object at 0x7fc107a844a0> Failed to fetch sample 1097359. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f927510> Failed to fetch sample 1099448. Exception: cannot identify image file <_io.BytesIO object at 0x7ff697deaf20> Failed to fetch sample 2521553. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac25ed90> Failed to fetch sample 2891983. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e2b28cf90> Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7f001faf12b0> cannot identify image file <_io.BytesIO object at 0x7f3939ca22a0> cannot identify image file <_io.BytesIO object at 0x7efd6f794a40> Failed to fetch sample 4375328. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568ba390> le <_io.BytesIO object at 0x7f6682def2e0> Failed to fetch sample 1070036. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668106ca0> Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7f00da317f10> Failed to fetch sample 1976843. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568d7fb0> Failed to fetch sample 3176736. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1973d0> cannot identify image file <_io.BytesIO object at 0x7fbf00721710> le <_io.BytesIO object at 0x7f5e7a25f7e0> Failed to fetch sample 4027148. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f62f20> Failed to fetch sample 2633741. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca0ea0> le <_io.BytesIO object at 0x7f6682dc34c0> Failed to fetch sample 2Failed to fetch sample 1945810. Exception: le <_io.BytesIO object at 0x7f6682db9f30> Failed to fetch sample 1723824. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00748720> cannot identify image file <_io.BytesIO object at 0x7fccdf7763e0> cannot identify image file <_io.BytesIO object at 0x7f21b7fb9cb0> ile <_io.BytesIO object at 0x7f7a56971850> cannot identify image file <_io.BytesIO object at 0x7f0ecdba5350> le <_io.BytesIO object at 0x7f4ebf9fb1a0> Failed to fetch sample 2165662. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebe55d0> Failed to fetch sample 3728803. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dc22f0> Failed to fetch sample 3900769. Exception:bject at 0x7ff697deb470> Failed to fetch sample 1414336. Exception:cannot identify image file <_io.BytesIO object at 0x7fee3f97a890> FFailed to fetch sample 1070036. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e7ba0> cannot identify image file <_io.BytesIO object at 0x7f7a568b8680> le <_io.BytesIO object at 0x7f7134295a30> Failed to fetch sample 2387855. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682db09f0> Failed to fetch sample 2012611. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebe4810> Failed to fetch sample 2491527. Exception: cannot identify image file <_io.BytesIO object at 0x7f4ebf9fb290> Failed to fetch sample 1382222. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f998db0> Failed to fetch sample 1945810. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420eb290> Failed to fetch sample 2012611. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37eb010> Failed to fetch sample 1099589. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fa3bbe2a0> Failed to fetch sample 3355612. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39ef380> Failed to fetch sample 3418606. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf005ced40> Failed to fetch sample 3280125. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0770a7f0> Failed to fetch sample 2991472. Exception:Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f71342db100>: cannot identify image file <_io.BytesIO object at 0x7f5de3711a80> Failed to fetch sample 3385054. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078d47c0> Failed to fetch sample 2237717. Exception: cannot identify image file <_io.BytesIO object at 0x7f4ebf9fb1f0> Failed to fetch sample 1901998. Exception: cannot identify image file <_io.BytesIO object at 0x7f69186e5d00> Failed to fetch sample 2687916. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b39f3920> Failed to fetch sample 1277996. Exception: cannot identify image file <_io.BytesIO object at 0x7f7af76137e0> cannot identify image file <_io.BytesIO object at 0x7fa69c52db70> le <_io.BytesIO object at 0x7f6682dda070> Failed to fetch sample 3717689. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb11da0> Failed to fetch sample 2904733. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a0a86c630> Failed to fetch sample 3291702. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64954e0> Failed to fetch sample 2942822. 
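The "cannot identify image file" exceptions above are PIL's signal that the fetched bytes are truncated or not an image at all. One common mitigation (a sketch, not the code this run actually uses; `load_image_safe` is a hypothetical helper) is to catch the decode failure inside the fetch step and return a sentinel so the dataset can retry another sample instead of spamming the log:

```python
import io

from PIL import Image, UnidentifiedImageError  # pillow raises the error seen above


def load_image_safe(raw: bytes):
    """Decode image bytes, returning None instead of raising.

    PIL raises UnidentifiedImageError ("cannot identify image file
    <_io.BytesIO ...>") when the payload is corrupt, truncated, or not an
    image. Returning None lets the caller resample rather than crash.
    """
    try:
        img = Image.open(io.BytesIO(raw))
        img.load()  # force a full decode so truncated payloads fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None
```

A dataset's `__getitem__` could then fall back to a different (e.g. random) index whenever this returns `None`, which is what the "Failed to fetch sample ..." retry messages suggest the training code already does at a coarser level.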
Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors
[further duplicate "Failed to fetch sample ..." image-decode errors and tokenizers fork warnings elided]
Exception: cannot identify image file <_io.BytesFailed to fetch sample 2964818. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c0220> Failed to fetch sample 3121991. Exception: cannot identify image file < cannot identify image file <_io.Bytes cannot identify image file <_io.BytesIO object at 0x7f83e83157b0> FailFailed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568ce930> not identify image file <_io.BytesIO object at 0x7fe4440c7470> Failed to fetch sample 2859442. Exception: cannot identify image file <_io.BytesIO object at 0x7ff55c513740> cannot identify image file <_io.BytesIO object at 0x7fdd459d4b80> ile <_io.BytesIO object at 0x7f1093768fe0> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1cf6a3c40> Failed to fetch sample 2012611. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93d3a0> Failed to fetch sample 3133979. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c380e3e0> Failed to fetch sample 2000123. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4440c6250> Failed to fetch sample 3688015. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b241c60> Failed to fetch sample 4023897. Exception: cannot identify image file <_io.BytesIO object at 0x7f1bfcc08cc0> Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7f1cc06c0130> Failed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f239bc0> Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf057a6d40> Failed to fetch sample 3492865. Exception: cannot identify image file <_io.BytesIO object at 0x7f109376a430> Failed to fetch sample 4027148. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9634c0> Failed to fetch sample 2392280. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f58c380e200> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3402884. Exception: just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you chuggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... ToFailed to fetch sample 2087658. Exception:Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3414742. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b241c10> ion: cannot identify image file <_io.BytesIO object at 0x7f472789d710> Failed to fetch sample 3593359. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381d4e0> Failed to fetch sample 3175611. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d9cc7380> Failed to fetch sample 1163583. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3736480> Failed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a918e9760> cannot identify image file <_io.BytesIO object at 0x7fe436a151c0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` befFailed to fetch sample 1949460. Exception: the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3921017. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459d5350> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3091195. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d9ce3dd0> cannot identify image file <_io.BytesIO object at 0x7f1f1b241a80> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1457148. Exception: cannot identify image file <_io.BytesIO object at 0x7fd17f0fb330> cannot identify image file <_io.BytesIO object at 0x7f0aaf7eb7e0> Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f26e390> Failed to fetch sample 1769963. Exception:Failed to fetch sample 4232471. Exception: cannot identify image fiFailed to fetch sample 1651202. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420ee660> Failed to fetch sample 1168368. ExceptiFailed to fetch sample 2100695. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c97830> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7f5e6a8e1300> 03b0> Failed to fetch sample 3248804. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8ebbf0> Failed to fetch sample 2450396. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b174d10> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` befFailed to fetch sample 1007919. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cbf97fbFailed to fetch sample 1858166. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd445df380> Failed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f1490> 0> Failed to fetch sample 3162454. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c962a0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly setFailed to fetch sample 3246712. Exception: cannot identify image file <_io.BytesIO object at 0x7f0994dbd850> Failed to fetch sample 2587174. Exception: cannot identify imageFailed to fetch sample 2393914. Exception:0> Failed to fetch sample 1835222. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b2431f0> Failed to fetch sample 4018722. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcf0e50> Failed to fetch sample 2884186. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb38310> Failed to fetch sample 2714314. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6ad7b0> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420a7a10> Failed to fetch sample 2012611. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f1a700>: cannot identify image file <_io.BytesIO object at 0x7f7e078de250> Failed to fetch sample 4023897. Exception:bject at 0x7fca1c18aa70> le <_io.BytesIO object at 0x7fc2eb7b4770> cannot identify image file <_io.BytesIO object at 0x7efd6f7b2f20> Failed to fetch sample 3179326. Exception: cannot identify image file <_io.Byte cannot identify image file <_io.BytesIO object at 0x7f47278d28e0> Failed to fetch sample 1825353. Exception: cannot identify image file <_To disable this warning, you can eith cannot identify image file <_io.BytesIO object at 0x7f6a7f26f1f0> icitlFailed to fetch sample 2488635. Exception: ARALLELISM=(true | false) Failed to fetch sample 1003423. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b223010> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` Failed to fetch sample 3176736. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d1773ef70> Failed to fetch sample 1070036. Exception:bject at 0x7f53a8e9f1f0> le <_io.BytesIO object at 0x7fc19513dfd0> Failed to fetch sample 2892410. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c56e8e0> Failed to fetch sample 2237717. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f245e0> Failed to fetch sample 1988187. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcf3fb0> Failed to fetch sample 1477290. Exception: cannot identify image file <_io.BytesIO object at 0x7f2775475260> Failed to fetch sample 3651591. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d7da3ec0>cFailed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0069ff10> ailed to fetch sample 2606964. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6d1773f010> Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c56f970> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25629120> Failed to fetch sample 1480960. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64cd4e0> Failed to fetch sample 2821911. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f290310> Failed to fetch sample 3383743. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093779e40> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3827420> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you cFailed to fetch sample 3644627. Exception:ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) cannot identify image file <_io.BytesIO object at 0x7f7b9b93a930> huggingface/tokenizers: The current process=(true | false) Failed to fetch sample 3690578. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bda660> image file <_io.BytesIO object at 0x7f18f3249cb0> Failed to fetch sample 1586576. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1efd30> ed to fetch sample 2850892. Exception: cannot identify image file <_io.BytesIO object at 0x7f633ef42700> Failed to fetch sample 3091195. Exception:Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2179513f0> cannot identify image file <_io.BytesIO object at 0x7f1008a64db0> le <_io.BytesIO object at 0x7f0e6bde09f0> Failed to fetch sample 3088885. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf83dd50> Failed to fetch sample 3661638. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f284450> cannot identify image file <_io.BytesIO obcannot identify image file <_io.BytesIO object at 0x7f3bcea3a390> Failed to fetch sample 3397401. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8329210> Failed to fetch sample 2884516. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7f53a8eaecf0> ile <_io.BytesIO object at 0x7f18e64f47c0> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7f6f60808720> Failed to fetch sample 4023897. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25693790> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568d61b0> Failed to fetch sample 3464855. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d145b8ef0> Failed to fetch sample 4099617. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f286cf0>iled to fetch sample 2887528. Exception: cannot identify image file <_io.BytesIO object at 0x7fc107a84450> Failed to fetch sample 2492755. Exception: cannot identify image file <_io.BytesIO object at 0x7f277548a610> Failed to fetch sample 3670977. Exception: cannot identify image fi Failed to fetch sample 3106325. Exception cannot identify image file <_io.BytesIO object at 0x7f1f1b257b50> Failed to fetch sample 1108664. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4440ca070> Failed to fetch sample 1627346. Exception: cannot identify image file <_io.BytesIO object at 0x7f5742102c00> Failed to fetch sample 1439392. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9592b0> Failed to fetch sample 1367777. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f09f0> Failed to fetch sample 1419363. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c427f8d0> Failed to fetch sample 1588738. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f5490> Failed to fetch sample 3157561. Exception: cannot identify image file <_io.BytesIO object at 0x7f3c560785e0> Failed to fetch sample 4243929. Exception: cannot identify image file <_io.BytesIO object at 0x7f64960297b0> Failed to fetch sample 3150542. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfaa47d710> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7f277546f380> cannot identify image file <_io.BytesIO object at 0x7fc107a4f290> le <_io.BytesIO object at 0x7f7a568d5350> cannot identify image file <_io.BytesIO object at 0x7f5e6a8b6e80> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b96b880> ile <_io.BytesIO object at 0x7f5741f3bb50> Failed to fetch sample 3557219. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f32de0> Failed to fetch sample 3072323. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcd0d10> Failed to fetch sample 3856361. Exception: cannot identify image file <_io.BytesIO object at 0x7f277548a980> Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f55a30> Failed to fetch sample 4162110. Exception: cannot identify image file <_io.BytesIO object at 0x7f277540b1a0> Failed to fetch sample 1316862. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f1120> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f959170> Failed to fetch sample 1635122. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741fd25c0> Failed to fetch sample 3423432. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8eaf560> Failed to fetch sample 4102616. Exception: cannot identify image file <_io.BytesIO object at 0x7f36637df380> Failed to fetch sample 2993528. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f53a90659e0> Failed to fetch sample 3212271. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac23c180> Failed to fetch sample 2197846. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2393a0> not identify image file <_io.BytesIO object at 0x7f1008a64680> Failed to fetch sample 2012611. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e070f93a0> Failed to fetch sample 2641615. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f0680> cannot identify image file <_io.BytesIO object at 0x7ff2494a2390> Failed to fetch sample 1515853. Exception: cannot identify image file <_io.BytesIO object at 0x7fcae18f9d00> ile <_io.BytesIO object at 0x7f94ac229170> Failed to fetch sample 3350750. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac23e0c0> Failed to fetch sample 1256456. Exception: cannot identify image file <_io.BytesIO object at 0x7f6692311030> Failed to fetch sample 1070036. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093778c70> Failed to fetch sample 4193225. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3813880> Failed to fetch sample 3283504. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b907790> Failed to fetch sample 4027148. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d7ddbee80> FFailed to fetch sample 2574882. Exception:cannot identify image file <_io.BytesIO object at 0x7f7e078fbc40> Failed to fetch sample 3385960. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39dfd30> Failed to fetch sample 3700472. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e7a25f7e0> Failed to fetch sample 3371113. Exception: cannot identify image file <_io.BytesIO object at 0x7f60ffec7ce0> Failed to fetch sample 3688291. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f22e384df80> Failed to fetch sample 3857268. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39bbb00> the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2232218. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bb62a0> Failed to fetch sample 3855812. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5869d0> Failed to fetch sample 2250980. Exception: ect at 0x7fa50aa2c4f0> Failed to fetch sample 3246712. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c18b8d0> Failed to fetch sample 1578193. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dbf09710> ile <_io.BytesIO object at 0x7f58c39def20> Failed to fetch sample 2883927. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078cad90> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e3920> ng parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` befFailed to fetch sample 2149877. Exception: the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3423432. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38127a0> Failed to fetch sample 3661638. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e384c450> Failed to fetch sample 1988187. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39b9f30> Failed to fetch sample 2610323. Exception:bject at 0x7fa75e3252b0> Failed to fetch sample 2686021. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682d3600> Failed to fetch sample 2721985. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf777290> Failed to fetch sample 2253168. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e2980> Failed to fetch sample 3725471. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc26217b600> Failed to fetch sample 4099617. Exception: cannot identify image fiFailed to fetch sample 3289295. Exception:Failed to fetch sample 2639168. Exception: cannot identify image file <_io.BytesIO object at 0x7f37719cee80> Failed to fetch sample 2998072. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac213a60> bject at 0x7f56682d33d0> Failed to fetch sample 1963191. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db55ce50> Failed to fetch sample 2533320. Exception: cannot identify image file <_io.BytesIO object at 0x7fa008b32110> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disablicannot identify image file <_io.BytesIO object at 0x7fca1c18bc40> ile <_io.BytesIO object at 0x7f1093758680> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c273d30> Failed to fetch sample 2957876. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb11a30> Failed to fetch sample 1316862. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7b2bc90> Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39d4950> Failed to fetch sample 4196420. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf005cfb50> Failed to fetch sample 4114602. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f65d50> Failed to fetch sample 2199896. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c299f80> Failed to fetch sample 2947083. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7f83e8339210> Failed to fetch sample 3652924. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c18bfb0> ailed to fetch sample 1254524. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadd5af20> e - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2633741. Exception: cannot identify image file <_io.BytesIO object at 0x7f12ef959ee0> huggingface/tokenizers: The current process just got forked, afterFailed to fetch sample 4187648. Exception:Failed to fetch sample 3662436. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f67ba0> Failed to fetch sample 2189796. Exception: cannot identify image file <_io.BytesIO object at 0x7f36638cc0e0> cannot identify image file <_io.BytesIO object at 0x7ff4daba3240> <_io.BytesIO object at 0x7f81db55e1b0> Failed to fetch sample 1244206. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a6203f10> cannot identify image file <_io.BytesIO object at 0x7f109378aa70> Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093763920> huggingface/tokenizers: The current proces cannot identify image file <_io.BytesIO object at 0x7fdd459d76a0> Failed to fetch sample 1215065. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9713a0> ore the fork if possible - Explicitly set the environment variablFailed to fetch sample 2558573. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3911940> cannot identify image file <_io.BytesIO object at 0x7f098641ecf0> 5941. Exception: cannot identify image file <_io.BytesIO object at 0x7fb74383bc90> Failed to fetch sample 2486416. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb68749a0> Failed to fetch sample 2555299. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab76f20> Failed to fetch sample 2591241. Exception:: cannot identify image file <_io.BytesIO object at 0x7f9abcd96bb0> Failed to fetch sample 2641615. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdc65c0> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f94d0d0> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb68774c0> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec23b060> cannot identify image file <_io.BytesIO object at 0x7f5a16f834c0> Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8319710> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca7600> Failed to fetch sample 2943308. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c18be20> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4082533. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f916250> cannot identify image file <_io.BytesIO object at 0x7f36366e1cb0> Failed to fetch sample 2633220. Exception: Failed to fetch sample 3067785. Exception: cannot identify image fFailed to fetch sample 2766404. Exception: Failed to fetch sample 2403018. Exception: cannot identify image file <_io.BytesIO object at 0x7feb6a3cac50> cannot identify image file <_io.BytesIO object at 0x7f35a7f44720> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcf740> Failed to fetch sample 2770280. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdc2bb0> Failed to fetch sample 4156231. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb5ec0a40> Failed to fetch sample 2233662. Exception: cannot identify image ficannot identify image file <_io.BytesIO obFailed to fetch sample 1659475. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f92e660> cannot identify image file <_io.BytesIO object at 0x7f81d6d88a40> Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39a8720> Failed to fetch sample 2647526. Exception: cannot identify image file <_io.BytesIO object at 0x7fb5fefe2660> cannot identify image file <_io.BytesIO object at 0x7f22e39f3e20> cannot identify image file <_io.BytesIO object at 0x7fca1c18b880> Failed to fetch sample 3671896. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24949b1a0> Failed to fetch sample 998153. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdf5fd0> Failed to fetch sample 4236147. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f44900> Failed to fetch sample 4121555. Exception:bject at 0x7f001f917ec0> Failed to fetch sample 2767646. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16da9490> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39a83b0> Failed to fetch sample 3205709. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd56c0> cannot identify image file <_io.BytesIO object at 0x7f22e39baac0> Failed to fetch sample 1735272. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e5da0> cannot identify image file <_io.BytesIO object at 0x7fc0f0931e90> cannot identify image file <_io.BytesIO object at 0x7fca1c18af70> Failed to fetch sample 3091195. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249499490> Failed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcd953a0>FFailed to fetch sample 1818686. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253a1e2a0> ailed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71f600> Failed to fetch sample 4027148. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3817240> cannot identify image file <_io.BytesIO object at 0x7fb52f67fce0> le <_io.BytesIO object at 0x7f22e39bacf0> Failed to fetch sample 3079909. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7823e0> Failed to fetch sample 3649832. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3810b30> Failed to fetch sample 3725471. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568e77e0> Failed to fetch sample 2671486. Exception: cannot identify image file <_io.BytesIO object at 0x7fc383d84e50> Failed to fetch sample 1727762. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9793e3e0> Failed to fetch sample 2919698. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278affb0> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71ef20> Failed to fetch sample 1084633. Exception: cannot identify image fi Failed to fetch sample 2452845. ExceptionFailed to fetch sample 4232471. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcd95df0> cannot identify image file <_io.BytesIO object at 0x7ff24949bc40> Failed to fetch sample 2767404. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a559bc0> Failed to fetch sample 3573442. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e38127a0> Failed to fetch sample 3149661. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9793dbc0> Failed to fetch sample 3418606. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278e1350> Failed to fetch sample 3636421. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8ea250> cannot identify image file <_io.BytesIO object at 0x7f4beebaaa70> Failed to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9b91c0> Failed to fetch sample 3907787. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd30f9de9d0> Failed to fetch sample 2492755. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568c0b30> Failed to fetch sample 2647526. Exception: cannot identify image f Failed to fetch sample 1323919. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c29be20> Failed to fetch sample 2715947. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b25b790> cannot identify image file <_io.BytesIO object at 0x7f5f941362a0> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8efce0> Failed to fetch sample 3642783. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bee6de0> Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 819Failed to fetch sample 2454376. Exception: cannot identify image file <_io. cannot identify image file <_io.BytesIO object at 0x7f5caf575800> le <_io.BytesIO object at 0x7f94ac239b20> Failed to fetch sample 3015989. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d2db20> BytesIO object at 0x7fe4473808b0> Failed to fetch sample 3063581. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b7fba160> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c29a570> Failed to fetch sample 2149877. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b25b0b0> cannot identify image file <_io.BytesIO object at 0x7fca1c1ba2a0> Failed to fetch sample 1457654. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab77e70> Failed to fetch sample 4238312. Exception:Failed to fetch sample 1534741. Exception: cannot identify image fiFailed to fetch sample 3570582. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a87663d30> Failed to fetch sample 1037347. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f979170> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c29bf10> Failed to fetch sample 3067785. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200384450> Failed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a906acf0> cannot identify image file <_io.BytesIO object at 0x7f7b13d2e9d0> Failed to fetch sample 3193171. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401dc6de0> Failed to fetch sample 1480960. Exception: cannot identify image file <_io.BytesIO object at 0x7ff697def420> Failed to fetch sample 2882845. Exception:ject at 0x7f5de3b548b0> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaade9ccc0> Failed to fetch sample 4238312. Exception:Failed to fetch sample 1262384. Exception: cannot identify image fiFailed to fetch sample 2656477. Exception:Failed to fetch sample 3312379. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b939800> Failed to fetch sample 2374167. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f81db5663e0> Failed to fetch sample 1033050. ExceptionFailed to fetch sample 2538638. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a90690d0> Failed to fetch sample 4104440. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a90536a0> Failed to fetch sample 1304332. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20035bb00> Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db565c60> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f4220> cannot identify image file <_io.BytesIO object at 0x7fee3f999b20> Failed to fetch sample 2844492. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0912cf0> Failed to fetch sample 1070036. Exception: cannot identify image file <_io.BytesIO object at 0x7f1133f16390> Failed to fetch sample 3133474. Exception: cannot identify image file <_io.BytesIO object at 0x7f563bb94d10> Failed to fetch sample 2149877. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b3888590> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc192d90> Failed to fetch sample 4375328. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e077043b0> Failed to fetch sample 2341971. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a87643830> Failed to fetch sample 2012611. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0910680> Failed to fetch sample 2012611. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f96a1b0> led to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f5260> Failed to fetch sample 2199896. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5c5170> Failed to fetch sample 1672947. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec01120> Failed to fetch sample 1138218. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93bd30> Failed to fetch sample 2934230. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c29ad40> Failed to fetch sample 2199896. Exception: cannot identify image file <_io.BytesIO object at 0x7f10af1cf970> cannot identify image file <_io.BytesIO object at 0x7f7b13d2d440> Failed to fetch sample 4099617. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b1dbb00> Failed to fetch sample 1440224. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bee6e80> Failed to fetch sample 2633741. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f53a90698a0> cannot identify image file <_io.BytesIO object at 0x7f3939ca2d40> Failed to fetch sample 3564093. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72405080> le <_io.BytesIO object at 0x7f9a9c2bbdd0> Failed to fetch sample 3431220. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c29a0c0> Failed to fetch sample 3901197. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a17f37fb0> Failed to fetch sample 1142386. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d2789e480> Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b1da1b0> Failed to fetch sample 1266181. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf78a430> Failed to fetch sample 4027148. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb5ec1350> Failed to fetch sample 4027148. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f090c9f0> Failed to fetch sample 2961542. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f97b1f0> Failed to fetch sample 4121555. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3745da0> Failed to fetch sample 3529820. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb36020> Failed to fetch sample 2335870. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f999cb0> Failed to fetch sample 1717309. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f2597b0> Failed to fetch sample 3297609. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d2dc10> Failed to fetch sample 3931715. Exception:Failed to fetch sample 2130826. Exception:bject at 0x7f6682dc2a20> Failed to fetch sample 2036348. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bee7e70> Failed to fetch sample 3002887. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568fe3e0> Failed to fetch sample 1256456. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf78b7e0> Failed to fetch sample 1534741. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986421620> Failed to fetch sample 2671486. Exception: cannot identify image fiFailed to fetch sample 1008482. Exception:Failed to fetch sample 3002531. Exception:bject at 0x7f357b907740> Failed to fetch sample 3021215. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682defce0> cannot identify image file <_io.BytesIO object at 0x7f1137f153f0> Failed to fetch sample 2995245. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420f9e90> Failed to fetch sample 1067169. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f07dd0> le <_io.BytesIO object at 0x7f357b9383b0> Failed to fetch sample 1310685. Exception:Failed to fetch sample 2527932. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64e13f0>:Failed to fetch sample 1277019. Exception:bject at 0x7f357b939350> Failed to fetch sample 1672947. Exception: Failed to fetch sample 2639168. Exception: cannot identify image fFailed to fetch sample 2037228. Exception: cannot identify image file <_io.BytesIO object at 0x7fb8edd9d5d0> cannot identify image file <_io.BytesIO object at 0x7f9a9c28d260> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560695d0> cannot identify image file <_io.BytesIO object at 0x7f35a7f73290> cannot identify image file <_io.BytesIO object at 0x7fa08883e980> Failed to fetch sample 1451627. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cd0d0> ile <_io.BytesIO object at 0x7f7e078e4cc0> Failed to fetch sample 1070036. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f357b938ef0> cannot identify image file <_io.BytesIO object at 0x7f5253bdec00> Failed to fetch sample 1659475. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bed1c0> Failed to fetch sample 4099617. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2c3b50> Failed to fetch sample 2690614. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3d8d350> cannot identify image file <_io.BytesIO object at 0x7fc0f08f9490> Failed to fetch sample 1718944. Exception:Failed to fetch sample 2517307. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a391a2f0> Failed to fetch sample 1480960. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253befb00> Failed to fetch sample 4121555. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e87c0> Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777aad90> Failed to fetch sample 1484787. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b995ee0> Failed to fetch sample 1588738. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35ba98f0> Failed to fetch sample 4110336. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7f5643e2f470> Failed to fetch sample 1451627. Exception: Failed to fetch sample 2353389. Exception: cannot identify image fFailed to fetch sample 3575993. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b95d8f0> Failed to fetch sample 2041632. Exception: Failed to fetch sample 2374147. Exception: cannot identify image fFailed to fetch sample 2282533. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebd15d0> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078de930> Failed to fetch sample 2708935. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd17f0fa9d0> Failed to fetch sample 3578223. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777a8220> cannot identify image file <_io.BytesIO object at 0x7f5de37cb2e0> le <_io.BytesIO object at 0x7f07aacbe2a0> Failed to fetch sample 1263970. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2494976a0> Failed to fetch sample 2531778. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ee840> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777de6b0> Failed to fetch sample 1382222. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b98f650> Failed to fetch sample 1801686. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13358810> cannot identify image file <_io.BytesIO object at 0x7f80a80e7600> ailed to fetch sample 2591241. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c28f010> Failed to fetch sample 3645496. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4440ba570> cannot identify image file <_io.BytesIO object at 0x7fc0f090b470> Failed to fetch sample 3643589. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078da610> Failed to fetch sample 1026599. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9d9670> cannot identify image file <_io.BytesIO object at 0x7f0c1b534450> le <_io.BytesIO object at 0x7f5643e2fb50> Failed to fetch sample 2671486. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e48b0>: cannot identify image file <_io.BytesIO object at 0x7f99a39128e0> cannot identify image file <_io.BytesIO object at 0x7f7e078df600> Failed to fetch sample 1035361. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078b3f60> Failed to fetch sample 3149661. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08da6b0> Failed to fetch sample 2606964. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24966f560> Failed to fetch sample 4088210. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c212f70> Failed to fetch sample 2261990. Exception: cannot identify image file <_io.BytesIO object at 0x7f3c56081bc0> Failed to fetch sample 2865537. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8397ba0> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f6417e0a2a0> Failed to fetch sample 2681772. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b923420> Failed to fetch sample 3351526. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f544f0> cannot identify image file <_io.BytesIO object at 0x7fdd43034bd0> Failed to fetch sample 2671486. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f1580> cannot identify image file <_io.BytesIO object at 0x7f09953d5580> ile <_io.BytesIO object at 0x7f7e078eb6a0> Failed to fetch sample 3635113. Exception: just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 4187558. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cd371300> cannot identify image file <_io.BytesIO object at 0x7f94ac244860> Failed to fetch sample 2961542. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f563c549d50> cannot identify image file <_io.BytesIO object at 0x7fb7401e06d0> ile <_io.BytesIO object at 0x7f846e357c40> cannot identify image file <_io.BytesIO object at 0x7f65560553a0> cannot identify image file <_io.BytesIO object at 0x7f12b381fe20> Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9071f0> Failed to fetch sample 3369029. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682fc3380> Failed to fetch sample 1580010. Exception: Failed to fetch sample 1448134. Exception: ect at 0x7f7e078ebc40> Failed to fetch sample 2138915. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1cf686020> cannot identify image file <_io.BytesIO object at 0x7f35a7f2aca0> 199896. Exception:mple 2050144. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf855c10> cannot identify image file <_io.BytesIO object at 0x7fb7401e13a0> le <_io.BytesIO object at 0x7fe436a02480> Failed to fetch sample 2607936. Exception: ject at 0x7f99a3910c70> Failed to fetch sample 3133979. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe7223a570> Failed to fetch sample 3010687. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381de40> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7f69186e4f90> Failed to fetch sample 2669914. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0070be70> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac247c90> Failed to fetch sample 4123579. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc12a340> cannot identify image file <_io.BytesIO object at 0x7fca1c1873d0> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24966ff60> Failed to fetch sample 4046281. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ecf90> Failed to fetch sample 2392280. Exception:Failed to fetch sample 2723001. Exception: cannot identify image fi Failed to fetch sample 3150542. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a033d0> cannot identify image file <_io.BytesIO oFailed to fetch sample 30Failed to fetch sample 3133715. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c29a160> Failed to fetch sample 2633741. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 40Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682ce0c0> bject at 0x7f65560ef970> Failed to fetch sample 1477290. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de372b290> Failed to fetch sample 3372211. Exception: cannot identify image file <_io.BytesIO object at 0x7f27847e62f0> Failed to fetch sample 3040271. Exception: ect at 0x7f8028e89490> ile <_io.BytesIO object at 0x7f12b381fab0> Failed to fetch sample 1431144. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a934e50> Failed to fetch sample 1033050. Exception: cannot identify image ficannot identify image file <_io.BytesIO obFailed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7fc107a4ec50> ailed to fetch sample 1534741. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937f0f40> bject at 0x7fe436a01800> Failed to fetch sample 3905599. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5696e2a0> cannot identify image file <_io.BytesIO object at 0x7fc71532a520> Failed to fetch sample 2374167. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd0540> Failed to fetch sample 3575993. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6556a05b70> cannot identify image file <_io.BytesIO object at 0x7f5e6a947650> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf783d80> Failed to fetch sample 4226308. Exception: Failed to fetch sample 1961415. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7fb7401e3a10> Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7f566812c860> le <_io.BytesIO object at 0x7f5253bd6b60> Failed to fetch sample 3040842. Exception: Failed to fetch sample 1210334. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d2789e660> cannot identify image file <_io.BytesIO object at 0x7fbe7224b880> Failed to fetch sample 1945810. Exception: ect at 0x7f5741f0d350> Failed to fetch sample 2873000. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec219e40> Failed to fetch sample 3401019. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd1350> Failed to fetch sample 1382222. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560579c0> Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors BytesIO object at 0x7fb7401e3b50> Failed to fetch sample 3266629. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f17fb0> cannot identify image file <_io.BytesIO object at 0x7f472789cfe0>cannot identify image file <_io.BytesIO objeFailed to fetch sample 2531778. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d0bd80> Failed to fetch sample 2915896. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568b94e0> Failed to fetch sample 2820344. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369d7c40> cannot identify image Failed to fetch sample 3176035. Exception: cFailed to fetch sample 2103624. Exception: cannot identify image Failed to fetch sample 2514711. Exception:> Failed to fetch sample 1070036. Exception: cannot identify image file <_io.BytesIO object at 0x7fc107a845e0> cannot identify image file <_io.BytesIO object at 0x7fb9777b2570> ile <_io.BytesIO object at 0x7f8294c44d60> cannot identify image file <_io.BytesIO object at 0x7f5741f0f380> le <_io.BytesIO object at 0x7f9d7324b740> Failed to fetch sample 3373358. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f66520> cannot identify image file <_io.BytesIO object at 0x7f18e5d16750> Failed to fetch sample 3855812. Exception: cannot identify image fiFailed to fetch sample 3445146. Exception:Failed to fetch sample 1247219. Exception:ample 2036348. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c213e70> Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cdf650> cannot identify image file <_io.BytesIO object at 0x7f9d7324a9d0> Failed to fetch sample 2122051. Exception: Failed to fetch sample 3123506. Exception:ject at 0x7fccdf76e750> Failed to fetch sample 1325247. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401ea890> Failed to fetch sample 3138387. Exception: cannot identify image file <_io.BytesIO object at 0x7f5527249490> Failed to fetch sample 2187628. Exception: cannot identify image file <_io.BytesIO object at 0x7f991b214400> Failed to fetch sample 1451627. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5791c0> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253befb50> Failed to fetch sample 2884186. Exception: cannot identify image file <_io.BytesIO object at 0x7f991b216070> Failed to fetch sample 2881725. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecdba64d0> cannot identify image file <_io.BytesIO object at 0x7fc383d852b0> Failed to fetch sample 1124584. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d9cc72e0> le <_io.BytesIO object at 0x7f3939cdf5b0> cannot identify image file <_io.BytesIO object at 0x7fccdf7bd2b0> Failed to fetch sample 3513561. Exception: cannot identify image file <_io.BytesIO object at 0x7f61008784f0> Failed to fetch sample 4193225. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401eafc0> Failed to fetch sample 3414742. Exception: cannot identify image file <_io.BytesIO object at 0x7f566811f240> Failed to fetch sample 1737070. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2b8fe0> Failed to fetch sample 3322388. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7324a890> Failed to fetch sample 2884186. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f6bd30> Failed to fetch sample 3002531. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b77e0> cannot identify image file <_io.BytesIO object at 0x7f65560e5530> Failed to fetch sample 1102280. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdef66f20> le <_io.BytesIO object at 0x7f1137f66e80> Failed to fetch sample 3423432. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f69ad0> Failed to fetch sample 1222876. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00722200> Failed to fetch sample 993268. Exception: cannot identify image file <_io.BytesIO object at 0x7fb97778fa10> Failed to fetch sample 1264629. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e07c0> Failed to fetch sample 2993528. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401dc6f70> Failed to fetch sample 2767646. 
Exception:: cannot identify image file <_io.BytesIO object at 0x7fd17f0f22f0> Failed to fetch sample 1007919. Exception: cannot identify image file <_io.BytesIO object at 0x7f566811ed90> Failed to fetch sample 4117657. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f66a70> Failed to fetch sample 2526328. Exception: ailed to fetch sample 3203287. Exception:bject at 0x7f1f1b2553f0> cannot identify image file <_io.BytesIO object at 0x7efd6f796750> Failed to fetch sample 4046281. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c572840> Failed to fetch sample 3866329. Exception: cannot identify image file <_io.BytesIO object at 0x7efa5f45f600> cannot identify image file <_io.BytesIO object at 0x7fb97778ed40> Failed to fetch sample 1426712. Exception: cannot identify image file <_io.BytesIO object at 0x7f936b355620> Failed to fetch sample 2844592. Exception: cannot identify image file <_io.BytesIO object at 0x7f71342d73d0> Failed to fetch sample 3431220. Exception: cannot identify image fiFailed to fetch sample 1759192. Exception:Failed to fetch sample 2184810. Exception: cannot identify image file <_io.BytesIO object at 0x7fa6517206d0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variableFailed to fetch sample 1030473. ExceptiFailed to fetch sample 3880232. Exception: cannot identify image file <_io.BytesIO object at 0x7f7c54857e20> Failed to fetch sample 4379145. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777dfe70> disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2122051. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f80a8112250> cannot identify image file <_io.BytesIO object at 0x7f7ecaa09f30> Failed to fetch sample 2967592. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f90eed0> ile <_io.BytesIO object at 0x7fd17e71ae80> cannot identify image file <_io.BytesIO object at 0x7f18e5d17f60> Failed to fetch sample 3907787. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7778d0> Failed to fetch sample 3695885. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b6d40> cannot identify image file <_io.BytesIO object at 0x7fc99c633a10> Failed to fetch sample 2913953. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275ca56c0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2708935. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa2c4a0> cannot identify image file <_io.BytesIO object at 0x7f5741f1e750> Failed to fetch sample 1371612. Exception: Failed to fetch sample 4099617. Exception: ect at 0x7ff4db548ea0> cannot identify image file <_io.BytesIO object at 0x7f12b38c3470> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3846de0> ile <_io.BytesIO object at 0x7fc99c631a30> cannot identify image file <_io.BytesIO object at 0x7fd20037b1a0> Failed to fetch sample 1256456. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c424b010> Failed to fetch sample 2909783. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e6070> Failed to fetch sample 4018722. Exception: cannot identify image file <_io. 
Failed to fetch sample 2705254. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5432e0>
Failed to fetch sample 1090482. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c95b70>
Failed to fetch sample 2528140. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab77150>
Failed to fetch sample 1382222. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d08220>
Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20035b740>
Failed to fetch sample 2199896. Exception: cannot identify image file <_io.BytesIO object at 0x7f469e5aab60>
Failed to fetch sample 1840837. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f56ac0>
Failed to fetch sample 3133979. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4daba2ca0>
Failed to fetch sample 4178774. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c272160>
Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340ef560>
Failed to fetch sample 3700895. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f57d30>
Failed to fetch sample 1477290. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe7240ba60>
Failed to fetch sample 2353389. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f63010>
Failed to fetch sample 3700472. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20035bba0>
[... the same "Failed to fetch sample N. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error repeats for many more sample IDs; stderr from concurrent dataloader workers is interleaved in the raw log, truncating and duplicating lines ...]
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either:
- Avoid using `tokenizers` before the fork if possible
- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
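The tokenizers warning above states its own remedy: either avoid touching `tokenizers` before worker processes fork, or set `TOKENIZERS_PARALLELISM` explicitly. A minimal sketch of the second option, placed at the top of the training script before any DataLoader workers are forked (the exact placement in this codebase is an assumption, since the script itself is not shown in the log):

```python
import os

# Disable tokenizer-internal parallelism so the forked dataloader workers
# do not trigger the huggingface/tokenizers deadlock warning. This must run
# before the first fork, so it belongs at the very top of the entry script.
os.environ["TOKENIZERS_PARALLELISM"] = "false"
```

Setting it to `"false"` trades the tokenizer's internal thread pool for fork safety; with multiple dataloader workers the parallelism is usually recovered at the worker level anyway.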
Failed to fetch sample 991716. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b7fabce0>
Failed to fetch sample 2374167. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280e8a90>
Failed to fetch sample 3149661. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e4f600>
Failed to fetch sample 2841865. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ffa60>
Failed to fetch sample 2998072. Exception: cannot identify image file <_io.BytesIO object at 0x7f566810dee0>
Failed to fetch sample 2085038. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f63740>
Failed to fetch sample 3484950. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777dbab0>
Failed to fetch sample 3247622. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f90cc0>
Failed to fetch sample 1038290. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a96fe20>
Failed to fetch sample 1209099. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb726390>
Failed to fetch sample 3652924. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecdb9e9d0>
Failed to fetch sample 2563789. Exception: cannot identify image file <_io.BytesIO object at 0x7f81515258a0>
[... the same error repeats for several hundred more sample IDs; some IDs (e.g. 4027148, 2199896, 2991472, 1672947) fail repeatedly, and the huggingface/tokenizers fork warning recurs once per worker ...]
Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 819 [line truncated in raw log]
[... further "Failed to fetch sample N. Exception: cannot identify image file ..." lines, interleaved and truncated as above ...]
ExceptionTOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac23ecf0> Failed to fetch sample 2297543. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82b7880> sIO object at 0x7fb37942fbf0> Failed to fetch sample 1673329. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99b841300> Failed to fetch sample 2887528. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644d7b0> ailed to fetch sample 1862112. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39d60c0> Failed to fetch sample 3723797. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3827100> Failed to fetch sample 1248828. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5f8d0> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f974450> Failed to fetch sample 3133474. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db541580> Failed to fetch sample 2819794. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f8a6b0> Failed to fetch sample 1382990. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b916110> Failed to fetch sample 1484787. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078d2610> Failed to fetch sample 2531778. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e834ecf0> Failed to fetch sample 1382990. Exception: cannot identify image file <_io.BytesIO object at 0x7f9566cc96c0> Failed to fetch sample 1415702. Exception: cannot identify image file <_io.BytesIO object at 0x7f563c54b6f0> cannot identify image file <_io.BytesIO object at 0x7fdaade86b60> ile <_io.BytesIO object at 0x7f12b3812840> Failed to fetch sample 3661638. Exception: cannot identify image file <_io.BytesIO object at 0x7efc516afbf0> Failed to fetch sample 3661638. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1009413380> Failed to fetch sample 2531778. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0770a570> Failed to fetch sample 2784203. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b917470> Failed to fetch sample 3176736. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5f740> Failed to fetch sample 2706058. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a907ac00> cannot identify image file <_io.BytesIO object at 0x7fccdf782890> Failed to fetch sample 2492755. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39d7420> Failed to fetch sample 1106777. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e84f0> le <_io.BytesIO object at 0x7f08280f62f0> Failed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a5ec4d10> Failed to fetch sample 3634544. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24949b600> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca3e70> Failed to fetch sample 4099617. Exception: cannot identify image file <_io.BytesIO object at 0x7f109378e390> Failed to fetch sample 1361843. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39ddd50> Failed to fetch sample 3479025. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38c3f60> cannot identify image file <_io.BytesIO object at 0x7f098644d8a0> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) 0> Failed to fetch sample 2446558. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6556067fb0> Failed to fetch sample 3355968. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa2e160> Failed to fetch sample 2202214. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8317ba0> Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca1850> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f801f5efa10> Failed to fetch sample 2850892. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f4590> Failed to fetch sample 3133979. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24949b3d0> Failed to fetch sample 1659475. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84ef100> Failed to fetch sample 1738850. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078c9a80> Failed to fetch sample 1136372. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078da890> Failed to fetch sample 4046281. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078d1620> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937665c0> Failed to fetch sample 4046281. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83154e0> Failed to fetch sample 2354595. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa3ec50> cannot identify image file <_io.BytesIO object at 0x7f0986453f60> Failed to fetch sample 1052004. Exception:Failed to fetch sample 1480960. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8327920> cannot identify image file <_io.BytesIO object at 0x7fd07a5733d0> Failed to fetch sample 1961415. Exception: cannot identify image file <_io.BytesIO object at 0x7f09953d4db0> Failed to fetch sample 1885521. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f098644eb10> Failed to fetch sample 3297609. Exception:bject at 0x7f12b38a5d00> Failed to fetch sample 2531778. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093755120> Failed to fetch sample 3270019. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0770b1a0> Failed to fetch sample 2773958. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078e4bd0> Failed to fetch sample 1026599. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a9068db0> Failed to fetch sample 1046372. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc10d6c0> Failed to fetch sample 1061148. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8e898f0> Failed to fetch sample 3036687. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777eb830> cannot identify image file <_io.BytesIO object at 0x7f81db5c7b00> an either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 1610592. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b3888860> Failed to fetch sample 2028280. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdc3d30> Failed to fetch sample 975614. Exception: cannot identify image fiFailed to fetch sample 3270019. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83d5cb0> Failed to fetch sample 1264206. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cef20> le <_io.BytesIO object at 0x7f9abcd96660> Failed to fetch sample 1270459. Exception: cannot identify image file <_io.BytesIO object at 0x7f802848f920> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a904f0b0> Failed to fetch sample 3176035. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc10dc10> Failed to fetch sample 4121555. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e2b28dd00> Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac244630> Failed to fetch sample 3596499. Exception: cannot identify image file <_io.BytesIO object at 0x7f71342d71f0> Failed to fetch sample 1036164. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2e47c0> Failed to fetch sample 3149235. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc55e11d00> Failed to fetch sample 2915907. Exception: cannot identify image file <_io.BytesIO object at 0x7fb977797790> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a904e3e0> Failed to fetch sample 3575993. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d89080> Failed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7fb97778fec0> cannot identify image file <_io.BytesIO object at 0x7fe436a09bc0> Failed to fetch sample 2820344. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644e480> Failed to fetch sample 2354595. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37466b0> Failed to fetch sample 1412731. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b92fce0> cannot identify image file <_io.BytesIO object at 0x7f83e834dd00> Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f6a20> Failed to fetch sample 2841865. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556064180> Failed to fetch sample 3041150. Exception: Failed to fetch sample 1713409. Exception:ject at 0x7f65560eee30> Failed to fetch sample 2341971. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e834a890> Failed to fetch sample 1739207. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcf970> Failed to fetch sample 2819794. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2e4a90> cannot identify image file <_io.BytesIO object at 0x7f7b9b96fe20> FFailed to fetch sample 2194953. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e834e750> Failed to fetch sample 3578223. Exception:Failed to fetch sample 3162635. Exception: cannot identify image ficannot identify image file <_io.BytesIO obFailed to fetch sample 1262384. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ee9d0> Failed to fetch sample 3340422. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c9d9e0> Failed to fetch sample 3Failed to fetch sample 2809536. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71cd10> Failed to fetch sample 3732379. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 2Failed to fetch sample 2816660. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 1388040. Exception:mple 3176709. Exception: cannot identify image file <_io.BytesIO o Failed to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9dc9f0>FFailed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71c220> Failed to fetch sample 1864192. Exception: cannot identify image file <_io.BytesIO object at 0x7f610123a340> ailed to fetch sample 3661638. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd31a0> Failed to fetch sample 1070036. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb01d0> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b96cd10> Failed to fetch sample 1701865. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f031f0> cannot identify image file <_io.BytesIO object at 0x7fee3f956ed0> Failed to fetch sample 1524216. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369d76f0> Failed to fetch sample 1946200. Exception:Failed to fetch sample 1469166. Exception: cannot identify image file <_io.BytesIO object at 0x7fb4a5b60180> Failed to fetch sample 1672947. Exception: cannot identify image fiFailed to fetch sample 1382222. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937fdda0> Failed to fetch sample 3883596. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fa6a4b9c0> Failed to fetch sample 2984651. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ee250> Failed to fetch sample 4099617. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd3dd0> Failed to fetch sample 1769963. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f5800> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7aca0> Failed to fetch sample 2938358. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b1d9f30> Failed to fetch sample 2884186. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f61da0> Failed to fetch sample 1491157. Exception: cannot identify image file <_io.BytesIO object at 0x7fa088c39b20> Failed to fetch sample 3221682. Exception: cannot identify image file <_io.BytesIO object at 0x7fb6877d8360> Failed to fetch sample 2846277. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f627bf0> Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9ddb20> Failed to fetch sample 2374167. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db55d620> Failed to fetch sample 3661638. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f72570> Failed to fetch sample 1857208. Exception: cannot identify image file <_io.BytesIO object at 0x7f472789e890> Failed to fetch sample 3020959. Exception: cannot identify image file <_io.BytesIO object at 0x7f8151526610> t the environment variable huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... TOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISM=(true | false) To disable this warning, you can either: =(true | false) - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2841865. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253be6660> Failed to fetch sample 2720468. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bb6e30> Failed to fetch sample 3576092. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd50761300> Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd2a70> Failed to fetch sample 1277996. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cdf010> Failed to fetch sample 1718944. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb5ee0> Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0909530> cannot identify image file <_io.BytesIO object at 0x7f10937f3970> Failed to fetch sample 4224928. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381cbd0> 4099617. Exception:cannot identify image file <_io.BytesIO object at 0x7f472789e6b0> cannot identify image file <_io.BytesIO object at 0x7f5a16f739c0> cannot identify image file <_io.BytesIO object at 0x7f1137f62660> Failed to fetch sample 3371113. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f7060> Failed to fetch sample 2961542. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ef3d0> Failed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7f4f44200770> Failed to fetch sample 1672947. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db55d8f0> cannot identify image file <_io.BytesIO object at 0x7f098641fce0> Failed to fetch sample 2933221. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5776f0> le <_io.BytesIO object at 0x7f12b381d6c0> Failed to fetch sample 2354595. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278d1580> Failed to fetch sample 3700472. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb24d0> Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb5e40> Failed to fetch sample 3246712. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfab859440> Failed to fetch sample 4065763. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84a3100> Failed to fetch sample 2452845. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f94e0> Failed to fetch sample 3133474. Exception: cannot identify image file <_io.BytesIO object at 0x7fc71532a8e0> Failed to fetch sample 3254896. Exception:Failed to fetch sample 1516215. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b9a6a70> Failed to fetch sample 3209388. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf78a0c0> cannot identify image file <_io.BytesIO object at 0x7f469e5d8720> Failed to fetch sample 2189796. Exception: cannot identify image file <_io.BytesIO object at 0x7f098641f740> Failed to fetch sample 3695885. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd5b70> cannot identify image file <_io.BytesIO object at 0x7f6496057ec0> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cda340> Failed to fetch sample 2200309. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8eeca0> Failed to fetch sample 1988187. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f090ac00> Failed to fetch sample 2363228. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b9a53f0> Failed to fetch sample 3578223. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf78bd30> Failed to fetch sample 2032575. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24eb60> Failed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84a3010> Failed to fetch sample 2883927. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0073d4e0> Failed to fetch sample 1961415. Exception:bject at 0x7fe436a03880> le <_io.BytesIO object at 0x7f4beebabd30> cannot identify image file <_io.BytesIO object at 0x7f1f1b2590d0> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682d2070> Failed to fetch sample 3354065. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b2634b30> Failed to fetch sample 3633837. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d730a3920> Failed to fetch sample 3629769. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2f57b0> Failed to fetch sample 1568918. Exception: cannot identify image file <_io.BytesIO object at 0x7fa008b322a0> Failed to fetch sample 1945810. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682eb330> Failed to fetch sample 981592. Exception: cannot identify image file <_io.BytesIO object at 0x7fd17f0fb010> FFailed to fetch sample 2068755. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c2131a0> Failed to fetch sample 3652924. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24c180> cannot identify image file <_io.BytesIO object at 0x7f7ecaa0a020> Failed to fetch sample 2842932. Exception: cannot identify image file <_io.BytesIO object at 0x7f1009dc3150> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec7c860> Failed to fetch sample 1997328. Exception: ject at 0x7fb52f6293a0> le <_io.BytesIO object at 0x7fa275cbb510> Failed to fetch sample 1135347. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381fab0> Failed to fetch sample 3717123. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f97b060> Failed to fetch sample 1544576. Exception: cannot identify image fcannot identify image file <_io.BytesIO object at 0x7f7b9b94ede0> 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08efa60> Failed to fetch sample 4023897. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f7ef510> Failed to fetch sample 1878535. Exception: cannot identify image file <_io.BytesIO object at 0x7efd966df7e0> Failed to fetch sample 1179621. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb5ec0680>Failed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb6b18450> Failed to fetch sample 3351202. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381da30> ore the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 3138939. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf007411c0> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec7fba0> cannot identify image file <_io.BytesIO object at 0x7fdcaf87d4e0> Failed to fetch sample 2553914. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568d76a0> le <_io.BytesIO object at 0x7f35a7f265c0> Failed to fetch sample 1666446. Exception:Failed to fetch sample 2819794. Exception: ject at 0x7f35a7f2b790> Failed to fetch sample 3141187. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d145b9990> cannot identify image file <_io.BytesIO object at 0x7f801f5efe20> huggingface/tokenizers: The current process just got forked, after Failed to fetch sample 1358565. Exception:bject at 0x7f1f1b256f70> le <_io.BytesIO object at 0x7f35a7f73510> Failed to fetch sample 4068392. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1b3a60> TOKENIZERS_PARALLELISM=(true | false) Failed to fetch sample 2943308. Exception: cannot identify image fiFailed to fetch sample 2358239. Exception:Failed to fetch sample 3747017. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5b6f0> cannot identify image file <_io.BytesIO object at 0x7efa5f45f600> Failed to fetch sample 3578223. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8394b80> Failed to fetch sample 4080972. Exception:Failed to fetch sample 2633741. Exception:bject at 0x7fdcaf844d10> Failed to fetch sample 1731703. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f239bc0> Failed to fetch sample 4379145. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d097b0> Failed to fetch sample 3431220. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d1773f4c0> cannot identify image file <_io.BytesIO object at 0x7f001faf3f60> le <_io.BytesIO object at 0x7f357b9631a0> cannot identify image file <_io.BytesIO object at 0x7f9d7325c9f0> Failed to fetch sample 3700472. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b256430> Failed to fetch sample 1746943. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b2231f0> Failed to fetch sample 4117657. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73258810> Failed to fetch sample 2819794. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7325d620> Failed to fetch sample 2984651. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d09d00> Failed to fetch sample 3176736. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f79d5d0> Failed to fetch sample 3843062. Exception: cannot identify image file <_io.BytesIO object at 0x7efa5cad3d30> Failed to fetch sample 2452845. Exception: cannot identify image fFailed to fetch sample 1858383. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d730a1b70> ile <_io.BytesIO object at 0x7fdcaf8edd00> Failed to fetch sample 3297609. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9aa430> Failed to fetch sample 1278085. Exception: cannot identify image file <_io.BytesIO object at 0x7f277548a610> Failed to fetch sample 4022407. Exception: cannot identify image file <_io.BytesIO object at 0x7f2775474a90> le <_io.BytesIO object at 0x7f5253bd67a0> Failed to fetch sample 1524920. Exception: cannot identify image fiFailed to fetch sample 2448314. Exception:Failed to fetch sample 4121555. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b94cc20> cannot identify image file <_io.BytesIO object at 0x7fcbdc1c7740> Failed to fetch sample 3423432. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9b0b2390> Failed to fetch sample 2226848. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e3bf0> cannot identify image file <_io.BytesIO object at 0x7f5253bd6ca0> ile <_io.BytesIO object at 0x7fe0ec21b7e0> Failed to fetch sample 4085031. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84ef920> Failed to fetch sample 3209388. Exception: cannot identify image file <_io.BytesIO object at 0x7f27847e64d0>Failed to fetch sample 1929613. 
Failed to fetch sample 3247622. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f090f6a0>
Failed to fetch sample 4379145. Exception: cannot identify image file <_io.BytesIO object at 0x7fc80a138a90>
Failed to fetch sample 3270019. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f24e890>
Failed to fetch sample 2424522. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d145b80e0>
Failed to fetch sample 1728469. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f796340>
Failed to fetch sample 2335870. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d09e40>
Failed to fetch sample 1316862. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8318c20>
Failed to fetch sample 2780802. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0936110>
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
[... the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" message repeats for many more sample ids (some, e.g. 2538638, 3695885, 2744255, recurring several times), interleaved across the worker processes so that lines are torn mid-word ...]
Exception: cannot identify image fiFailed to fetch sample 2012611. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b08874c0> Failed to fetch sample 3695885. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec7eca0> Failed to fetch sample 4379145. Exception:Failed to fetch sample 1420758. Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 4059752. Exception: mple 2164117. Exception:Failed to fetch sample 3246712. Exception: cannot identify image fiFailed to fetch sample 1534021. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412f6a0> Failed to fetch sample 2482217. Exception:Failed to fetch sample 2325409. Exception: cannot identify image file <_io.BytesIO object at 0x7f55b329bc40> cannot identify image file <_io.BytesIO object at 0x7f12b39f7f10> Failed to fetch sample 2165000. Exception: cannot identify image file <_io.BytesIO object at 0x7f10ae7e8810> Failed to fetch sample 1097359. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebd9670> le <_io.BytesIO object at 0x7f0c353f6b10> Failed to fetch sample 2723475. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3af3470> Failed to fetch sample 4113199. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a391e340> Failed to fetch sample 2380508. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0770ea70> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c38b9350> Failed to fetch sample 1278085. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190be020> Failed to fetch sample 3414597. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebdad40> Failed to fetch sample 3700472. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3844e00> Failed to fetch sample 1306992. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb5e83aa610> Failed to fetch sample 3876873. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9722f0> Failed to fetch sample 1602779. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd43035710> Failed to fetch sample 1769963. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340f1ad0> Failed to fetch sample 2961542. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e88310> Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7be4900> Failed to fetch sample 1697947. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f64d990> Failed to fetch sample 4073277. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3745da0> Failed to fetch sample 4375328. Exception:Failed to fetch sample 3328772. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9b7fb0> cannot identify image file <_io.BytesIO object at 0x7fee3f968860> Failed to fetch sample 3648356. Exception: cannot identify image file <_io.BytesIO object at 0x7efd966de6b0> cannot identify image file <_io.BytesIO object at 0x7f1137f1b560> Failed to fetch sample 3106325. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401f81d0> Failed to fetch sample 1806959. Exception: cannot identify image file <_io.BytesIO object at 0x7f001fd383b0> Failed to fetch sample 1988187. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c3580e3e0> cannot identify image file <_io.BytesIO object at 0x7ff2494871f0> Failed to fetch sample 1979344. Exception:Failed to fetch sample 1095508. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb13650> Failed to fetch sample 1265302. Exception: cannot identify image fiFailed to fetch sample 2335870. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e88d60> cannot identify image file <_io.BytesIO object at 0x7fcbdc1932e0> le <_io.BytesIO object at 0x7fc2eb786020> le <_io.BytesIO object at 0x7fee3f9eb9c0> Failed to fetch sample 1719573. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f64d120> Failed to fetch sample 3661452. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7b3a340> cannot identify image file <_io.BytesIO object at 0x7f10ae7e86d0>ile <_io.BytesIO object at 0x7f12b38c0040> Failed to fetch sample 2801888. Exception: ject at 0x7f3939c8f9c0> Failed to fetch sample 3397401. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd23f6e7f0> le <_io.BytesIO object at 0x7fedb5ec1530> Failed to fetch sample 1729681. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb3e890> Failed to fetch sample 1999190. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08d9bc0> Failed to fetch sample 1271155. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f0b80> Failed to fetch sample 3677463. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b0889760> Failed to fetch sample 1697947. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cfc40> Failed to fetch sample 3229639. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b282070> cannot identify image file <_io.BytesIO object at 0x7f99a391c3b0> Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7f0f4c61d7b0> Failed to fetch sample 1943702. Exception: cannot identify image file <_io.BytesIO object at 0x7f001b929b70> Failed to fetch sample 2455856. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f92b790> Failed to fetch sample 2474373. Exception: cannot identify image file <_io.BytesIO object at 0x7fc24c6755d0> Failed to fetch sample 4379426. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f5a890> Failed to fetch sample 2721419. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3912f20> Failed to fetch sample 2031733. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a0b600> ile <_io.BytesIO object at 0x7fee3f9b4400>cannot identify image file <_io.BytesIO object at 0x7f2a918ea7a0> Failed to fetch sample 1976843. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7b29d00> Failed to fetch sample 2199896. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb368e0> Failed to fetch sample 1943116. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb97150> Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7f00d89cac00> Failed to fetch sample 3934242. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401f8310> Failed to fetch sample 3665508. Exception: cannot identify image file <_io.BytesIO object at 0x7fb4a5b61e40> Failed to fetch sample 1377471. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7be6840> Failed to fetch sample 1135347. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9b7c40> Failed to fetch sample 2720468. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b2634b30> cannot identify image file <_io.BytesIO object at 0x7f109378aca0> Failed to fetch sample 2332319. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b907380> cannot identify image file <_io.BytesIO object at 0x7fcbdc197a10> le <_io.BytesIO object at 0x7f524fbda070> Failed to fetch sample 1516180. Exception: cannot identify image file <_io.BytesIO object at 0x7f20ef4abf60> Failed to fetch sample 4095299. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f09322a0> cannot identify image file <_io.BytesIO object at 0x7f6a7f292520> Failed to fetch sample 3091195. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a09e40> Failed to fetch sample 2270043. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a55a1b0> le <_io.BytesIO object at 0x7f99a39ade40> Failed to fetch sample 3351202. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9b62f0> Failed to fetch sample 2215978. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f090e9d0> Failed to fetch sample 3725471. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cfec0> Failed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b255620> cannot identify image file <_io.BytesIO object at 0x7fb7401cef70> le <_io.BytesIO object at 0x7f8296ef3e70> cannot identify image file <_io.BytesIO object at 0x7f22e39b7a60> cannot identify image file <_io.BytesIO object at 0x7f3939c8dc60> Failed to fetch sample 3149661. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e831bd80> le <_io.BytesIO object at 0x7fbe72407ab0> Failed to fetch sample 1186857. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39aef70> Failed to fetch sample 1747586. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08d9210> cannot identify image file <_io.BytesIO object at 0x7f4beec02660> Failed to fetch sample 4191269. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3853790> Failed to fetch sample 3175031. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9098f0> Failed to fetch sample 3728803. Exception: cannot identify image file <_io.BytesIO object at 0x7f109378af20> Failed to fetch sample 1452131. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e070f94e0> cannot identify image file <_io.BytesIO object at 0x7f7b979f5d50> Failed to fetch sample 3648356. Exception: cannot identify image file <_io.BytesIO object at 0x7fc000b14900> Failed to fetch sample 2819794. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f566812df80> Failed to fetch sample 1476796. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f26d40> Failed to fetch sample 3494092. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dc3970> cannot identify image file <_io.BytesIO object at 0x7f0c35d6af20> Failed to fetch sample 2870729. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078e8090> Failed to fetch sample 2085038. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93e980> Failed to fetch sample 1198752. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b382ab10> Failed to fetch sample 1719573. Exception:Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3af2c50> : cannot identify image file <_io.BytesIO object at 0x7fb7401f8220>Failed to fetch sample 3174406. Exception: cannot identify image file <_io.BytesIO object at 0x7f7da2cd3d30> Failed to fetch sample 2656218. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f2eac0> FFailed to fetch sample 1207900. Exception: cannot identify image file <_io.BytesIO object at 0x7fbfdc66f7e0> Failed to fetch sample 1719573. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3816570>FFailed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbbda7a0> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0930770> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f0d10> Failed to fetch sample 2641615. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec004f0> ailed to fetch sample 3665508. Exception: cannot identify image file <_io.BytesIO object at 0x7fb74384bab0> Failed to fetch sample 2884186. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe4440bb4c0> Failed to fetch sample 3133979. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401e0d60> Failed to fetch sample 1588738. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f26c50> Failed to fetch sample 2085038. Exception: cannot identify image file <_io.BytesIO object at 0x7f80284c6520> Failed to fetch sample 1047785. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7405e2660> Failed to fetch sample 1207900. Exception: cannot identify image file <_io.BytesIO object at 0x7f225a5eff10> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f57d30> Failed to fetch sample 3002531. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72415b20> Failed to fetch sample 2243630. Exception: cannot identify image file <_io.BytesIO object at 0x7fc02a7c3dd0> cannot identify image file <_io.BytesIO object at 0x7fe436a0ac50> le <_io.BytesIO object at 0x7f7d7d3fc360>cannot identify image file <_io.BytesIO object at 0x7f3939cd5fd0> Failed to fetch sample 2392280. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f013a0> cannot identify image file <_io.BytesIO object at 0x7fccdf76ecf0> Failed to fetch sample 1752926. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39d7420> Failed to fetch sample 2656425. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777ab790> Failed to fetch sample 2491527. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644e2a0> Failed to fetch sample 3665508. Exception: cannot identify image file <_io.BytesIO object at 0x7f1fca234860> cannot identify image file <_io.BytesIO object at 0x7f51caf89e40> Failed to fetch sample 1879625. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a2675bd80> Failed to fetch sample 2103217. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7bcb80> Failed to fetch sample 3434266. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f91c60> Failed to fetch sample 2555299. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5f240> Failed to fetch sample 1591233. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1c53f0> Failed to fetch sample 2953182. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d4f9f0590> Failed to fetch sample 1885521. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f26340> Failed to fetch sample 3247622. Exception: cannot identify image file <_io.BytesIO object at 0x7f80284b7f10> Failed to fetch sample 1122273. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8e8a570> cannot identify image file <_io.BytesIO object at 0x7fc99c213830> Failed to fetch sample 1317999. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83268e0> Failed to fetch sample 1377471. Exception:Failed to fetch sample 2984651. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42a9b20> Failed to fetch sample 1203134. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777a9c60> Failed to fetch sample 2437099. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278a3830> Failed to fetch sample 3571476. Exception: cannot identify image file <_io.BytesIO object at 0x7f5cac3ece50> Failed to fetch sample 3665508. Exception: cannot identify image file <_io.BytesIO object at 0x7f524fbb7970> cannot identify image file <_io.BytesIO object at 0x7f5741fd29d0> Failed to fetch sample 2538638. ExceptionFailed to fetch sample 2237717. Exception:object at 0x7f2a876633d0> le <_io.BytesIO object at 0x7f22e38162a0> Failed to fetch sample 1747586. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf76f010> Failed to fetch sample 2919698. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7e070fb3d0> Failed to fetch sample 1065759. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08feac0> cannot identify image file <_io.BytesIO object at 0x7f098644d1c0> Failed to fetch sample 1860118. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568e7ce0> Failed to fetch sample 1070036. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a0ab60> Failed to fetch sample 2555299. Exception:cannot identify image file <_io.BytesIO object at 0x7f3939c9fdd0> Failed to fetch sample 1061148. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84ee250> FFailed to fetch sample 2103217. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754aa0c0> Failed to fetch sample 1697947. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08ff600> Failed to fetch sample 3661452. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73f1170> Failed to fetch sample 1532496. Exception: cannot identify image fiFailed to fetch sample 2103217. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39544f0> Failed to fetch sample 4022407. Exception:Failed to fetch sample 2086216. Exception: cannot identify image fiFailed to fetch sample 1520464. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c8da30> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8325030> Token indices sequence length is longer than the specified maximum sequence length for this model (8747 > 8192). Running this sequence through the model will result in indexing errors Failed to fetch sample 3322388. ExFailed to fetch sample 1747586. Exception: cannot identify image file <_io.cannot identify image file <_io.BytesIO object at 0x7fb7401e0fe0> Failed to fetch sample 3151733. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4283380> Failed to fetch sample 2127419. Exception: cannot identify image file <_io.ByFailed to fetch sample 2089328. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84dbe20> Failed to fetch sample 4377360. EFailed to fetch sample 1190606. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278fac50> Failed to fetch sample 1097359. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f83010> 372b1a0> Failed to fetch sample 3192504. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84a3fb0> cannot identify image file <_io.BytesIO object at 0x7f69186e5990> Failed to fetch sample 2068453. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3808540> le <_io.BytesIO object at 0x7f5253be6b60> Failed to fetch sample 1377471. Exception:Failed to fetch sample 2998072. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190c22f0> Failed to fetch sample 2844781. Exception: cannot identify image file <_io.BytesIO object at 0x7f55a8a2dda0> cannot identify image file <_io.BytesIO object at 0x7f27754aab10> Failed to fetch sample 2335870. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f30d10> Failed to fetch sample 3414597. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a87643920> Failed to fetch sample 1278085. Exception: cannot identify image fiFailed to fetch sample 3209388. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c380bdd0> Failed to fetch sample 4210780. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0074af20> Failed to fetch sample 1613440. Exception: cannot identify image fiFailed to fetch sample 2583625. Exception:Failed to fetch sample 3191179. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369d7420> cannot identify image file <_io.BytesIO object at 0x7f5de372acf0> Failed to fetch sample 2961542. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdb9ce5da80> Failed to fetch sample 4068517. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb2dfd0> Failed to fetch sample 2919698. Exception: cannot identify image fiFailed to fetch sample 1256456. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c7de40> Failed to fetch sample 1818686. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cc3ab0> Failed to fetch sample 2452845. Exception: cannot identify image file <_io.BytesIO object at 0x7fd30f9de840> cannot identify image file <_io.BytesIO object at 0x7f9abce53920> Failed to fetch sample 1498794. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754c98f0> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84dbce0> Failed to fetch sample 2019230. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb8900> Failed to fetch sample 3013832. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39df470> Failed to fetch sample 1445442. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf005cecf0> Failed to fetch sample 1988418. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec21ae80> Failed to fetch sample 3418606. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f399e0> Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf788e50> Failed to fetch sample 2446023. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fcbdc1b2b10> Failed to fetch sample 2671486. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f56d40> Failed to fetch sample 2599580. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 14Failed to fetch sample 2379671. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb9a80> 59206. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f677212a0c0> Failed to fetch sample 4378032. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f669d0> Failed to fetch sample 1694858. Exception: cannot identify image file <_io.BytesIO object at 0x7f936b355120> Failed to fetch sample 3571975. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec60f40> Failed to fetch sample 4018722. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754c8d60> Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcf5df0> Failed to fetch sample 3850045. Exception: cannot identify image file <_io.BytesIO object at 0x7f713410b420> Failed to fetch sample 2846277. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1d9d50> Failed to fetch sample 3567452. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80de020> Failed to fetch sample 1304332. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2a4810> Failed to fetch sample 2376303. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a8e9d530> Failed to fetch sample 1956777. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560753f0> Failed to fetch sample 1627132. Exception: cannot identify image file <_io.BytesIO object at 0x7f6496080630> Failed to fetch sample 2012611. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe71a784a0> Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd43035120> Failed to fetch sample 3569394. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f1a430> cannot identify image file <_io.BytesIO object at 0x7ff4015d8a90> Failed to fetch sample 1251515. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8f7650> le <_io.BytesIO object at 0x7f1f2a3fbce0> Failed to fetch sample 4232471. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c2acf0> Failed to fetch sample 2085038. Exception: cannot identify image file <_io.BytesIO object at 0x7f472789d4e0> Failed to fetch sample 1495777. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b90a0c0> Failed to fetch sample 1945810. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280eab10> Failed to fetch sample 1747586. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd7920> Failed to fetch sample 3134464. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560779c0> Failed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e07793600> Failed to fetch sample 2719591. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39dfd30> Failed to fetch sample 3870997. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9a6f1030> Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35ba9210> Failed to fetch sample 2493558. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f77150> Failed to fetch sample 2274389. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9db6a0> Failed to fetch sample 2607936. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec632e0> Failed to fetch sample 1695344. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a572610> Failed to fetch sample 1355996. Exception: cannot identify image file <_io.BytesIO object at 0x7f34beaf8950> Failed to fetch sample 1035361. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00709df0> Failed to fetch sample 1945810. Exception: Failed to fetch sample 4023897. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c2b970> ject at 0x7f9a9c2bb6a0> le <_io.BytesIO object at 0x7fa275cbb6a0> Failed to fetch sample 2728042. 
Failed to fetch sample 4241828. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f118a0>
Failed to fetch sample 2835768. Exception: cannot identify image file <_io.BytesIO object at 0x7fb5fee06980>
Failed to fetch sample 2330574. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f1a340>
Failed to fetch sample 1976843. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459c92b0>
[... hundreds more repetitions of the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error, interleaved and partially torn across concurrent data-loader workers; several sample ids (e.g. 2330574, 1476796, 1719573, 4089308) recur because failed samples are retried ...]
Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39b6a70> Failed to fetch sample 1806603. Exception: cannot identify image fiFailed to fetch sample 1418131. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e4ec00> Failed to fetch sample 4138472. Exception: cannot identify image file <_io.BytesIO object at 0x7f566811e430> Failed to fetch sample 2883927. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2d16c0> Failed to fetch sample 1862968. Exception: cannot identify image file <_io.BytesIO object at 0x7fe44542b790> Failed to fetch sample 2534771. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c210e50> Failed to fetch sample 4072136. Exception: cannot identify image file <_io.BytesIO object at 0x7fc107a4f600> Failed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39e7a60> Failed to fetch sample 3535594. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e38177e0> Failed to fetch sample 3652924. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a41bb7c90> Failed to fetch sample 2374167. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42ab5b0> Failed to fetch sample 2444358. Exception: cannot identify image file <_io.BytesIO object at 0x7fe44409aa70> cannot identify image file <_io.BytesIO object at 0x7f655606a8e0> Failed to fetch sample 3414732. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdc2e80> Failed to fetch sample 4104440. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f605580> Failed to fetch sample 4113199. Exception: cannot identify image file <_io.BytesIO object at 0x7f566811e110> Failed to fetch sample 3924877. Exception:Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ceb1f0> cannot identify image file <_io.BytesIO object at 0x7fe4c4277c40> Failed to fetch sample 2049310. 
Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7efcdbb44810> Failed to fetch sample 3259618. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadd48590> Failed to fetch sample 3535594. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 29Failed to fetch sample 3695557. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e6f70> Failed to fetch sample 2970162. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 1448134. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6acfe0> Failed to fetch sample 3159206. Exception: cannot identify image file <_io.BytesIO object at 0x7fb5e98d0540> bject at 0x7fe4c4268270> Failed to fetch sample 1988977. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80fe250> Failed to fetch sample 1963191. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b38898a0> Failed to fetch sample 1026599. Exception: cannot identify image file <_io.BytesIO object at 0x7fd17f0fbbf0> Failed to fetch sample 3209388. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdb2ac0> Failed to fetch sample 2495872. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a9065e40> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72249490> Failed to fetch sample 2485920. Exception: cannot identify image fFailed to fetch sample 1479796. Exception: Failed to fetch sample 4027148. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a87622f0> Failed to fetch sample 1988187. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80debb0> cannot identify image file <_io.BytesIO object at 0x7fa00819aa70> Failed to fetch sample 1719573. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42683b0> Failed to fetch sample 3406001. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbe724098a0> Failed to fetch sample 1523696. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003862a0> Failed to fetch sample 2841865. Exception: cannot identify image file <_io.BytesIO object at 0x7fd17f0fb830> Failed to fetch sample 3002531. Exception:Failed to fetch sample 3148733. Exception:bject at 0x7f713412fd80> Failed to fetch sample 3901250. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73284770> Failed to fetch sample 4125296. Exception: cannot identify image file <_io.BytesIO object at 0x7fe1aecc3ce0> Failed to fetch sample 1858166. Exception: cannot identify image fiFailed to fetch sample 3578223. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdb14e0> Failed to fetch sample 1480960. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e98a0> bject at 0x7fbe7224a7f0> Failed to fetch s cannot identify image file <_io.BytesIO object at 0x7fccdf78ad90> Failed to fetch sample 2943308. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d27883fb0> Failed to fetch sample 18Failed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200385760> Failed to fetch sample 1207900. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c426af70> Failed to fetch sample 2739815. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7328d4e0> 61364. Exception: cannot identify image file <_io.BytesIO object at 0x7f5c9ecef1a0> Failed to fetch sample 1256456. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a96e6b0> Failed to fetch sample 2805322. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f93a60> Failed to fetch sample 1433646. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fd5350> Failed to fetch sample 2282052. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac20f330> Failed to fetch sample 1395042. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f89ad0> cannot identify image file <_io.BytesIO object at 0x7fbe72416610> cannot identify image file <_io.BytesIO object at 0x7f7e0779f9c0> le <_io.BytesIO object at 0x7fe4c424b3d0> Failed to fetch sample 3745591. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401e0fe0> Failed to fetch sample 3665508. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c426b740> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73284090> Failed to fetch sample 2903629. Exception:ject at 0x7f6a7f290ef0> Failed to fetch sample 2528140. Exception: cannot identify image file <_io.BytesIO object at 0x7efce5b85670> Failed to fetch sample 3342520. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a53b470> cannot identify image file <_io.BytesIO object at 0x7fb9777ac7c0> Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f91c60> Failed to fetch sample 2815601. Exception: ject at 0x7ff249484f40> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0779cae0> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249485ee0> Failed to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401e2a20> Failed to fetch sample 1047785. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c426a520> cannot identify image file <_io.BytesIO object at 0x7f94ac244810> Failed to fetch sample 4018722. Exception: Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7fe44409a390> Failed to fetch sample 1988187. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c427e250> cannot identify image file <_io.BytesIO object at 0x7fd08940c950> Failed to fetch sample 3907787. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278b3ec0> cannot identify image file <_io.BytesIO object at 0x7f7e0779fb00> le <_io.BytesIO object at 0x7f5a16da80e0> Failed to fetch sample 2607936. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f2931a0> Failed to fetch sample 2274389. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fd7d30> Failed to fetch sample 3297609. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4440b85e0> cannot identify image file <_io.BytesIO object at 0x7f1fd4c5b9c0> Failed to fetch sample 2448314. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f62cf0> Failed to fetch sample 2454376. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278b0770> Failed to fetch sample 3013832. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777adf80> Failed to fetch sample 2392280. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c427d1c0> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401e2930> Failed to fetch sample 2202214. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fd4360> cannot identify image file <_io.BytesIO oFailed to fetch sample 1992100. Exception: ject at 0x7fe1aecc1030> Failed to fetch sample 1485580. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7f5d278b1030> Failed to fetch sample 1448881. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 1101307. Exception:bject at 0x7f098644f240> Failed to fetch sample 1294515. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de372a660> Failed to fetch sample 2317348. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec7f790> Failed to fetch sample 3463683. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e9bc0> Failed to fetch sample 3458049. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7efd6f776700> ile <_io.BytesIO object at 0x7fbe774ad120> Failed to fetch sample 1498809. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db570cc0> le <_io.BytesIO object at 0x7f1137fd50d0> Failed to fetch sample 3447439. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cc4f0> Failed to fetch sample 2492755. Exception: cannot identify image file <_io.BytesIO object at 0x7fb5fefe3f60> Failed to fetch sample 1171510. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777eb1a0> Failed to fetch sample 4173937. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf768950> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39ef2e0> Failed to fetch sample 3575993. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f67fb0> Failed to fetch sample 2765360. Exception: cannot identify image file <_io.BytesIO object at 0x7efd102ab830> Failed to fetch sample 1508117. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cf1f0> Failed to fetch sample 2647526. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a390eed0> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db598c20> Failed to fetch sample 1207900. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72802430> Failed to fetch sample 2330574. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f6a2f0> Failed to fetch sample 1382222. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f17560> Failed to fetch sample 2723001. 
Exception: cannot identify image fiFailed to fetch sample 2700162. Exception:Failed to fetch sample 4027148. Exception: cannot identify image fiFailed to fetch sample 1963191. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a583790> Failed to fetch sample 1136372. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf78be20>FFailed to fetch sample 2379671. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9635b0> Failed to fetch sample 2226416. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e63e20> Failed to fetch sample 3571975. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9dd850> Failed to fetch sample 4161397. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9263e0> ailed to fetch sample 1265302. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f94df30> Failed to fetch sample 3151582. Exception:Failed to fetch sample 2231126. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb4ac50> Failed to fetch sample 2282533. Exception: cannot identify image fiFailed to fetch sample 2111456. Exception:bject at 0x7fdd23f6e7f0> Failed to fetch sample 2720468. Exception:Failed to fetch sample 1571126. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64c23e0> Failed to fetch sample 1186857. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59b5b0> Failed to fetch sample 1047785. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe817149a0> Failed to fetch sample 4104440. Exception: cannot identify image file <_io.BytesIO object at 0x7f34f2d1a570> Failed to fetch sample 3546995. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e60a40> Failed to fetch sample 2103217. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9631a0> Failed to fetch sample 1164861. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cd372b10> Failed to fetch sample 2591241. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a580f40> Failed to fetch sample 2251249. Exception: cannot identify image fFailed to fetch sample 1016144. Exception:Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38454e0>cFailed to fetch sample 4018722. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb0810> cannot identify image file <_io.BytesIO object at 0x7f34f2d109f0> Failed to fetch sample 1716874. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c38b98a0> Failed to fetch sample 1601394. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d7d3fdd50> Failed to fetch sample 1448134. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9f4180> Failed to fetch sample 3002531. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280e9e90> Failed to fetch sample 3458049. Exception: cannot identify image file <_io.BytesIO object at 0x7f109378ff10> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3847ab0> Failed to fetch sample 1262384. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9d9800> Failed to fetch sample 1626957. Exception: cannot identify image fiFailed to fetch sample 3317161. Exception:Failed to fetch sample 2103217. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f7790> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db573ab0> Failed to fetch sample 3665508. Exception: cannot identify image file <_io.BytesIO object at 0x7f001faeee30> Failed to fetch sample 2846277. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9bae30> Failed to fetch sample 2012611. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d6b1f0> Failed to fetch sample 2492755. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278c1620> Failed to fetch sample 1430100. Exception: cannot identify image file <_io.BytesIO object at 0x7f001faf38d0> Failed to fetch sample 2671486. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a0bf60> Failed to fetch sample 3281867. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f297ab0> Failed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64c1760> le <_io.BytesIO object at 0x7f12b38271f0> Failed to fetch sample 1804828. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568d5990> Failed to fetch sample 3053417. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf88fe20> Failed to fetch sample 2330574. Exception: cannot identify image file <_io.BytesIO object at 0x7f469f98a980> Failed to fetch sample 2521553. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9baa70> Failed to fetch sample 4027148. Exception: cannot identify image fFailed to fetch sample 2712222. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dce340> Failed to fetch sample 1186857. Exception: Failed to fetch sample 1672947. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73270630> Failed to fetch sample 3463301. Exception: cannot identify image fFailed to fetch sample 3633869. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644bd80> Failed to fetch sample 2538638. Exception: Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93cb80> Failed to fetch sample 2883927. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c381d440> Failed to fetch sample 1329424. Exception: cannot identify image fFailed to fetch sample 2085038. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9d732633d0> ile <_io.BytesIO object at 0x7f357b96ebb0> Failed to fetch sample 2695719. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b921a80> Failed to fetch sample 1476796. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38469d0> Failed to fetch sample 1662953. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682db09f0> Failed to fetch sample 2379671. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e64fe0> cannot identify image file <_io.BytesIO object at 0x7f109377ba60> le <_io.BytesIO object at 0x7f18e64e1e90> Failed to fetch sample 2849284. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72416570> Failed to fetch sample 2733843. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f918270> Failed to fetch sample 2282533. Exception: cannot identify image file <_io.BytesIO object at 0x7f82942756c0> cannot identify image file <_io.BytesIO object at 0x7fa275c9e070> Failed to fetch sample 1586939. Exception: cannot identify image file <_io.BytesIO object at 0x7f6191ca4d60> cannot identify image file <_io.BytesIO object at 0x7fa275cb52b0> 47785. Exception: cannot identify image file <_io.BytesIO object at 0x7efd102abab0> Failed to fetch sample 4379145. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f24c6d0> Failed to fetch sample 3247622. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7325f100> Failed to fetch sample 2380041. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2bb7e0> Failed to fetch sample 2846277. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7324b4c0> Failed to fetch sample 2230828. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42909f0> Failed to fetch sample 1170679. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9e5765c0> Failed to fetch sample 2087658. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137ff7ab0> Failed to fetch sample 3015989. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7bc180> Failed to fetch sample 2801888. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beeba9b20> Failed to fetch sample 1747586. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dccbd0> Failed to fetch sample 3856361. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c9d440> Failed to fetch sample 3646041. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cbb510> Failed to fetch sample 4058966. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7322e200> Failed to fetch sample 1111257. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fe5cb0> cannot identify image file <_io.BytesIO object at 0x7efc516afbf0> Failed to fetch sample 2840084. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37a9c60> le <_io.BytesIO object at 0x7fc2eb749260> Failed to fetch sample 3535594. Exception: cannot identify image fiFailed to fetch sample 3225671. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c272ed0> Failed to fetch sample 4027526. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401f7830> Failed to fetch sample 4232471. Exception:Failed to fetch sample 13 cannot identify image file <_io.BytesIO oFailed to fetch sample 3015989. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd7740> Failed to fetch sample 2647853. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca26b0> bject at 0x7fdc25fe2e80> Failed to fetch sample 1278085. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f295710> cannot identify image file <_io.BytesIO object at 0x7f80284c4720> Failed to fetch sample 4375363. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f547c0> Failed to fetch sample 1314700. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5bab0> Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1c74c0> le <_io.BytesIO object at 0x7f9a9c2c2750> Failed to fetch sample 2358239. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f53ce0> Failed to fetch sample 4073421. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c29aa70> Failed to fetch sample 2791306. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401f7290> Failed to fetch sample 2964818. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73251cb0> Failed to fetch sample 3448968. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682e8ae0> Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7f0e2840d2b0> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f65440> Failed to fetch sample 4023897. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137ff5f30> Failed to fetch sample 1026599. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682fb4c0> Failed to fetch sample 2954477. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682f92d40> Failed to fetch sample 4141250. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8dacf0> Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73250590> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f1f790> Failed to fetch sample 1123067. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0f3e20> Failed to fetch sample 1818686. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f5670> Failed to fetch sample 3561717. Exception: cannot identify image file <_io.BytesIO object at 0x7f524fa13150> Failed to fetch sample 2165838. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f91e930> Failed to fetch sample 2816660. Exception:Failed to fetch sample 2387941. Exception: cannot identify image fiFailed to fetch sample 4115015. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8eace00> Failed to fetch sample 2237717. Exception:Failed to fetch sample 1820671. Exception: cannot identify image fiFailed to fetch sample 4376367. Exception:Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7f7da2cd3d30> bject at 0x7f6a7f296980> le <_io.BytesIO object at 0x7f9a9c273c90> cannot identify image file <_io.BytesIO object at 0x7f9d73253c90> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401fba60> Failed to fetch sample 4018722. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf0715d0> FFailed to fetch sample 2158301. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaade852b0> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3716110> Failed to fetch sample 3132523. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8dad40> cannot identify image file <_io.BytesIO object at 0x7f001f91e2f0> Failed to fetch sample 1834136. Exception:Failed to fetch sample 4028686. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7fc1cf6a3ab0> ile <_io.BytesIO object at 0x7fd07a573560> Failed to fetch sample 1988187. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf0721b0> Failed to fetch sample 2335870. 
[Data-loader error burst: many concurrent workers repeatedly log the same error,
"Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>"
i.e. PIL cannot decode the image bytes fetched for the sample. Affected sample IDs include 3929909, 3162635, 1659412, 3002531, 3856361, 1369893, 1231297, 3134464, 1976843, 1222876, 1769963, 2762250, 2850892, 1954361, 1739207, 1859663, and several hundred more; some IDs (e.g. 1772414, 2330574, 2085038, 3929909) recur across workers. Interleaved and truncated duplicate lines from concurrent processes omitted.]
Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac244220> Failed to fetch sample 2363461. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7765c0> Failed to fetch sample 1414336. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a90dda0> Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7fc262b531f0> cannot identify image file <_io.BytesIO object at 0x7f7e078c9e90> Failed to fetch sample 1254189. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37ccd60> cannot identify image file <_io.BytesIO object at 0x7fd2003aad40> le <_io.BytesIO object at 0x7f5253a1f6f0> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb0bd0> Failed to fetch sample 1865385. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82d1da0> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e4e50> Failed to fetch sample 3326485. Exception: cannot identify image fiFailed to fetch sample 4077786. Exception:cannot identify image file <_io.BytesIO object at 0x7f81db586610> Failed to fetch sample 2676889. Exception: cannot identify image file <_io.BytesIO object at 0x7f001baf27a0> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb2006d0> Failed to fetch sample 2049038. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682deff60> Failed to fetch sample 1829945. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fde169350> cannot identify image file <_io.BytesIO object at 0x7fd2003a2f70> Failed to fetch sample 2112501. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003b5490> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078fb470> Failed to fetch sample 1277019. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb0b30> Failed to fetch sample 2341971. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec21ae80> cannot identify image file <_io.BytesIO object at 0x7f70b3837970> Failed to fetch sample 2998072. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fe32e0> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82d0ea0> Failed to fetch sample 1277019. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82b6f20> Failed to fetch sample 2953832. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80ebf10> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb201710> Failed to fetch sample 3693940. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f94fb00> Failed to fetch sample 1382222. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275ca7420> Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db573380> Failed to fetch sample 3355968. Exception: cannot identify image file <_io.BytesIO object at 0x7f001b929710> Failed to fetch sample 3421855. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078f9e90> Failed to fetch sample 1769963. Exception: cannot identify image file <_io.BytesIO object at 0x7fc262b53ab0> Failed to fetch sample 991716. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d2789f510> Failed to fetch sample 2687916. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e38168e0> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7b4590> Failed to fetch sample 3571975. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce77150> Failed to fetch sample 1523937. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f80a819dcb0> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fe1670> Failed to fetch sample 4193225. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b38889a0> Failed to fetch sample 1186857. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82b6c50> Failed to fetch sample 2354595. Exception: Failed to fetch sample 3297609. Exception: cannot identify image file <_io.BytesIO object at 0x7fd17e71b060> Failed to fetch sample 1026599. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280faa20> cannot identify image file <_io.BytesIO object at 0x7f001f94dcb0> Failed to fetch sample 3546995. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bc8680> Failed to fetch sample 2988390. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a391cb30> Failed to fetch sample 2085038. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937f3a60> Failed to fetch sample 2618951. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e07792bb0> Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560e6cf0> Failed to fetch sample 4018722. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a56959b70> Failed to fetch sample 4104440. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce77b00> Failed to fetch sample 2203221. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340edd50> Failed to fetch sample 2448314. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a3330> Failed to fetch sample 4098758. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9162f0> Failed to fetch sample 1327539. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1cf6e62a0> Failed to fetch sample 2924747. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f47278bc5e0> Failed to fetch sample 3224871. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fe0310> Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7afe20> Failed to fetch sample 2884186. Exception:Failed to fetch sample 1476796. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412ed90> cannot identify image file <_io.BytesIO object at 0x7f80a80e76f0> Failed to fetch sample 1496831. Exception: cannot identify image fiFailed to fetch sample 2555299. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b0880e50> Failed to fetch sample 2102012. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f513a0> le <_io.BytesIO object at 0x7fbcdc0f3510> cannot identify image file <_io.BytesIO object at 0x7fc2eb7bcf90> Failed to fetch sample 1806693. Exception:Failed to fetch sample 4073421. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b2230b0> Failed to fetch sample 2164469. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73262de0> cannot identify image file <_io.BytesIO object at 0x7fc195145d50> FFailed to fetch sample 3351202. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8349cb0>Failed to fetch sample 1435331. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8b68e0> FFailed to fetch sample 3091195. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f797c40> Failed to fetch sample 2647526. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986449ee0>Failed to fetch sample 1591233. Exception: cannot identify image filFailed to fetch sample 3096384. ExceptionFailed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a56972660> Failed to fetch sample 2778807. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a53b1a0> Failed to fetch sample 1099448. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b24a610> Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7322cf90> cannot identify image file <_io.BytesIO object at 0x7f5d278b1120> Failed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340edcb0> Failed to fetch sample 1481009. Exception: cannot identify image file <_io.BytesIO object at 0x7f53b5389760> Failed to fetch sample 3134464. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d50596930> Failed to fetch sample 3499072. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8f40e0> Failed to fetch sample 1183674. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777b44a0> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e0680> Failed to fetch sample 2091123. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7bbc40> Failed to fetch sample 1906747. Exception: Failed to fetch sample 1511646. Exception:cannot identify image file <_io.BytesIO object at 0x7f1f1b271d50> cannot identify image file <_io.BytesIO object at 0x7f1f1b25aed0> Failed to fetch sample 2105582. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b1748b0> Failed to fetch sample 1179835. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7843b0> Failed to fetch sample 2647526. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbde5635b0> Failed to fetch sample 3224871. Exception:bject at 0x7fbf00741260> Failed to fetch sample 4079327. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3971080> Failed to fetch sample 2725161. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf006ef150> Failed to fetch sample 2903629. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00743c90> cannot identify image file <_io.BytesIO object at 0x7fa50aa32e30> ile <_io.BytesIO object at 0x7f53a8eff740> Failed to fetch sample 2538638. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340ee9d0> Failed to fetch sample 2108837. Exception: cannot identify image fiFailed to fetch sample 2089328. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8efe70> Failed to fetch sample 3013832. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a95d5d0> Failed to fetch sample 1157978. Exception:Failed to fetch sample 3229145. Exception: cannot identify image fiFailed to fetch sample 2103217. Exception:Failed to fetch sample 2827935. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f8b240> Failed to fetch sample 2610886. Exception: cannot identify image fiFailed to fetch sample 1759192. Exception: cannot identify image file <_io.BytesIO object at 0x7f81d6d8a070> Failed to fetch sample 2971250. Exception:Failed to fetch sample 2970162. Exception: cannot identify image fiFailed to fetch sample 3036687. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb787c40> Failed to fetch sample 1179835. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf782660> Failed to fetch sample 1448881. ExceptionFailed to fetch sample 2379671. Exception: cannot identify image filFailed to fetch sample 3103706. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3745440> Failed to fetch sample 3297420. ExceptionFailed to fetch sample 1930721. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b979adbc0> : cannot identify image file <_io.BytesIO object at 0x7fc0f090c3b0> l will result in indexing errors Failed to fetch sample 1201788. Exception:ject at 0x7f80a82b3290> Failed to fetch sample 3695885. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f71340ef150> Failed to fetch sample 1804828. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8efc9f0> ile <_io.BytesIO object at 0x7fc1881ffa10> Failed to fetch sample 2448314. Exception:bject at 0x7f65560e5800> Failed to fetch sample 1311840. Exception: cannot identify image file <_io.BytesIO object at 0x7ff248cf04f0> ile <_io.BytesIO object at 0x7fb52fa34720> Failed to fetch sample 2282533. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f7ebce0> ailed to fetch sample 2089328. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf780ef0> Failed to fetch sample 1304332. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5f53a0> Failed to fetch sample 1158188. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fdb20> Failed to fetch sample 3317678. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca1850> Failed to fetch sample 2991472. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0930e00> Failed to fetch sample 2511372. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0926c00> Failed to fetch sample 2903629. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f96ac00> Failed to fetch sample 1747586. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f645710> Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7f21afeb7f60> le <_io.BytesIO object at 0x7f5e6a8e3420> cannot identify image file <_io.BytesIO object at 0x7f5e6a95efc0> Failed to fetch sample 3036687. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf781800> Failed to fetch sample 2341971. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b9d50> Failed to fetch sample 2394669. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64cdb70> Failed to fetch sample 2679326. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9793f6f0> cannot identify image file <_io.BytesIO object at 0x7f6556064680> Failed to fetch sample 1256456. Exception: cannot identify image file <_io.BytesIO object at 0x7ff248cf1170> Failed to fetch sample 1179835. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca0450> le <_io.BytesIO object at 0x7fc2eb73fb00> Failed to fetch sample 2231126. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e3e20> Failed to fetch sample 2915907. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf788540> Failed to fetch sample 3732379. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71ec00> Failed to fetch sample 2164469. Exception:Failed to fetch sample 1665522. Exception: cannot identify image file <_io.BytesIO object at 0x7f5525ebc590> Failed to fetch sample 2400535. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401fb650> Failed to fetch sample 3297609. Exception:bject at 0x7f6682dba3e0> ailed to fetch sample 1414336. Exception: Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16db4540> cannot identify image file <_io.BytesIO object at 0x7fdcaf071c10> Failed to fetch sample 1070036. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f796930> Failed to fetch sample 4376367. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e63060> cannot identify image file <_io.BytesIO object at 0x7f71340ecb30> le <_io.BytesIO object at 0x7f9a9c2a7150> cannot identify image file <_io.BytesIO object at 0x7fedb5ec1350> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0927ab0> cannot identify image file <_io.BytesIO object at 0x7fb52f62b240> Failed to fetch sample 1099448. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3745d00> 84545. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72406610> Failed to fetch sample 1818686. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16d9de90> Failed to fetch sample 2841865. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9e5765c0> Failed to fetch sample 2448314. Exception: cannot identify image file <_io.BytesIO object at 0x7f566810efc0> Failed to fetch sample 3922170. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c296070> Failed to fetch sample 3091195. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f10680> cannot identify image file <_io.BytesIO object at 0x7f001f916250> Failed to fetch sample 2379671. Exception: cannot identify image file <_io.BytesIO object at 0x7fc108454680> Failed to fetch sample 1186857. Exception: cannot identify image file <_io.BytesIO object at 0x7fc107a4ee30> Failed to fetch sample 2970162. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e833b7e0>Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6453a0> FFailed to fetch sample 3458049. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72407010> Failed to fetch sample 3237123. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a396a4d0> Failed to fetch sample 2332230. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb2eed0> Failed to fetch sample 1536214. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de375fd80> Failed to fetch sample 3573442. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaf9887fb0> Failed to fetch sample 1659412. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e61670> Failed to fetch sample 2492755. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbe7222f9c0> Failed to fetch sample 2637223. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd7fb0> Failed to fetch sample 1357187. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f26660> Failed to fetch sample 2562235. Exception:ject at 0x7f18e5d1eb60> Failed to fetch sample 1890810. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f777ba0> Failed to fetch sample 3636979. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d9440> Failed to fetch sample 1168038. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986479c10> ile <_io.BytesIO object at 0x7efcdbb4f2e0> Failed to fetch sample 1602779. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2c3290> Failed to fetch sample 1818686. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9e578ea0> Failed to fetch sample 2045138. Exception: Failed to fetch sample 3595827. Exception:ject at 0x7fdaad56e7f0> Failed to fetch sample 1818686. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f24950> Failed to fetch sample 3421855. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb4f740> Failed to fetch sample 2970162. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de375f970> Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08d9080> Token indices sequence length is longer than the specified maximum sequence length for this model (10090 > 8192). Running this sequence through the model will result in indexing errors Failed to fetch sample 1384354. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f090b060> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f971a30> Failed to fetch sample 2217803. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f57420ff510> cannot identify image file <_io.BytesIO object at 0x7f3939ca2ac0> Failed to fetch sample 2164469. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1084562f0> Failed to fetch sample 1421662. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c383ffb0> Failed to fetch sample 1688560. Exception: cannot identify image fiFailed to fetch sample 2403018. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b38889f0> Failed to fetch sample 1021516. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bdee30> Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe71a785e0> le <_io.BytesIO object at 0x7fc0f0912200> Failed to fetch sample 1203134. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420e9580> Failed to fetch sample 2979603. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f94d9e0> Failed to fetch sample 3159206. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca2750> Failed to fetch sample 3134464. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278c1620> Failed to fetch sample 2354595. Exception:cannot identify image file <_io.BytesIO object at 0x7fe4c4247060> Failed to fetch sample 3653860. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42686d0> FFailed to fetch sample 2142074. Exception: cannot identify image file <_io.BytesIO object at 0x7fd310dc22f0> Failed to fetch sample 2998072. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d82c0>Failed to fetch sample 2181632. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb49b20> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cdf240> Failed to fetch sample 1588738. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdfaa47dbc0> Failed to fetch sample 1602779. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e831ba10> Failed to fetch sample 1718944. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1be250> cannot identify image file <_io.BytesIO object at 0x7f5e6a11ac00> Failed to fetch sample 1675354. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d2789dc10> le <_io.BytesIO object at 0x7fdaad56fa60> cannot identify image file <_io.BytesIO object at 0x7f97af7de980> Failed to fetch sample 3313922. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f45e0> Failed to fetch sample 1575597. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b29d0> Failed to fetch sample 3036687. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568ccef0> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d8310> Failed to fetch sample 1885521. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec266520>FFailed to fetch sample 3162635. Exception:cannot identify image file <_io.BytesIO object at 0x7efcdbb70db0> ailed to fetch sample 1945810. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb2c680> Failed to fetch sample 1420150. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d73b3f10> cannot identify image file <_io.BytesIO object at 0x7f99a392e7f0> Failed to fetch sample 2693003. Exception:Failed to fetch sample 2388130. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7fd200377470> 893040. Exception: cannot identify image file <_io.BytesIO object at 0x7f64960295d0> cannot identify image file <_io.BytesIO object at 0x7f357b9b8db0> le <_io.BytesIO object at 0x7efd6f7d7c40> Failed to fetch sample 2434377. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8eafd30> e <_io.BytesIO object at 0x7fdcaf87e700> Failed to fetch sample 1945810. Exception:bject at 0x7fccdf7771a0> ailed to fetch sample 2394516. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459d6d40> Failed to fetch sample 2284336. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadcea520> Failed to fetch sample 3856361. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1c74c0> cannot identify image file <_io.BytesIO object at 0x7f82a5ec4db0> Failed to fetch sample 4375328. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275d086d0> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c7330> Failed to fetch sample 3732379. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca3e70> Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e077098a0> cannot identify image file <_io.BytesIO object at 0x7f7a568f6020> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7fd30f9df1f0> Failed to fetch sample 2103217. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d6c50> Failed to fetch sample 1575255. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b961c60> Failed to fetch sample 1122273. Exception: cannot identify image file <_io.BytesIO object at 0x7f4ebf8d1b70> cannot identify image file <_io.BytesIO object at 0x7fc18853b650> ailed to fetch sample 1277019. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83af010> Failed to fetch sample 3176035. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec238b80> Failed to fetch sample 1252088. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568f41d0> Failed to fetch sample 1033050. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37558f0> Failed to fetch sample 3442093. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f094fce0> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754ab4c0> Failed to fetch sample 1885051. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe723ea610> Failed to fetch sample 1706685. Exception: cannot identify image file <_io.BytesIO object at 0x7f469efa3420> Failed to fetch sample 3015989. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8f0040> Failed to fetch sample 3700472. Exception: cannot identify image file <_io.BytesIO object at 0x7fd30f9df740> cannot identify image file <_io.BytesIO object at 0x7f4beebd30b0> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e07792b10> ile <_io.BytesIO object at 0x7fc0f094c540> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3738db0> Failed to fetch sample 3015989. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b962cf0> cannot identify image file <_io.BytesIO object at 0x7f7b9b968e50> le <_io.BytesIO object at 0x7fc107a84c20> Failed to fetch sample 3431352. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe7240c4a0> cannot identify image file <_io.BytesIO object at 0x7f1b7c56eca0> Failed to fetch sample 2049208. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd68736250> Failed to fetch sample 2164469. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f77b380> Failed to fetch sample 1135347. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d90d0> Failed to fetch sample 2887391. Exception: cannot identify image file <_io.BytesIO object at 0x7fdb9ce5db20> Failed to fetch sample 3318084. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f357b923b50> Failed to fetch sample 2049208. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b90b3d0> cannot identify image file <_io.BytesIO object at 0x7fa275d0be20> le <_io.BytesIO object at 0x7fb52f625d00> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3739bc0> Failed to fetch sample 3920671. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a0b4063e0> cannot identify image file <_io.BytesIO object at 0x7f7b9b96b380> le <_io.BytesIO object at 0x7f357b93eca0> Failed to fetch sample 2108837. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd45a1d9e0> Failed to fetch sample 2353389. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec239940> cannot identify image file <_io.BytesIO object at 0x7fa008b4a660> Failed to fetch sample 2116813. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754d4b30> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0770bb50> Failed to fetch sample 2761524. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0074c3b0> Failed to fetch sample 4196470. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278df150> Failed to fetch sample 3021222. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cba9d0> cannot identify image file <_io.BytesIO object at 0x7fdcaf85ac50> Token indices sequence length is longer than the specified maximum sequence length for this model (10090 > 8192). Running this sequence through the model will result in indexing errors Failed to fetch sample 2883927. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c56f9c0> Failed to fetch sample 2762003. Exception: cannot identify image fiFailed to fetch sample 2482217. Exception:Failed to fetch sample 1806075. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003b4180> Failed to fetch sample 2374167. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754b14e0> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadd771a0> Failed to fetch sample 2502000. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08f9b70> Failed to fetch sample 2290500. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f2be70> Failed to fetch sample 3453800. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a8109850> cannot identify image file <_io.BytesIO object at 0x7f4beec2e840> cannot identify image file <_io.BytesIO object at 0x7ff4015d89f0> Failed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7ff44a28da30> F cannot identify image file <_io.BytesIO object at 0x7fa58c1b4180> e <_io.BytesIO object at 0x7f5e6a90ec00> Failed to fetch sample 3303771. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39d1ee0> Failed to fetch sample 1482362. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401fb9490> Failed to fetch sample 3270019. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4015da340> Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadeb12b0> Failed to fetch sample 4222253. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6af560> Failed to fetch sample 3900769. Exception: cannot identify image file <_io.BytesIO object at 0x7f2775485ee0> Failed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd445df0b0> cannot identify image file <_io.BytesIO object at 0x7fc214b013f0> Failed to fetch sample 2324602. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebd8c70> Failed to fetch sample 1480960. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078f4450> Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7fc19474eed0> Failed to fetch sample 1097359. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9c7f60> Failed to fetch sample 2448314. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f84bd0> Failed to fetch sample 1449702. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbbd58a0> Failed to fetch sample 3660581. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8eeac0> Failed to fetch sample 1262384. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a8109b20> Failed to fetch sample 2294578. Exception: cannot identify image file <_io.BytesIO object at 0x7f8028e89490> Failed to fetch sample 1672947. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a276a0> cannot identify image file <_io.BytesIO object at 0x7fbf0074e700> Failed to fetch sample 3414597. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9c4bd0> Failed to fetch sample 1791282. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebda070> Failed to fetch sample 1048906. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d7ddbd350> Failed to fetch sample 1697947. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0ef2e0> Failed to fetch sample 2869092. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b4267600> Failed to fetch sample 1480960. Exception: cannot identify image file <_io.BytesIO object at 0x7fc24d0cb240> Failed to fetch sample 1277019. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340efb00> Failed to fetch sample 2984651. Exception: ailed to fetch sample 4379145. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f802848ed90>cannot identify image file <_io.BytesIO object at 0x7f80a810a3e0>n: cannot identify image file <_io.BytesIO object at 0x7fd2003862a0> Failed to fetch sample 2062527. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5690db70> Failed to fetch sample 1796668. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3941760> Failed to fetch sample 1306613. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a392d490> cannot identify image file <_io.BytesIO object at 0x7f07a84dbe20> Failed to fetch sample 3013832. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cc04f0> Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b3837970> FFailed to fetch sample 3358722. Exception: ect at 0x7fdaf61c9da0> ile <_io.BytesIO object at 0x7f001b92b330> Failed to fetch sample 2889114. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db53f470> Failed to fetch sample 2363228. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3956d40> Failed to fetch sample 1192061. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b426ee80> Failed to fetch sample 2841736. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1278d0> Failed to fetch sample 1122273. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39c3420> cannot identify image file <_io.BytesIO object at 0x7f3939cc0d10>cannot identify image file <_io.BytesIO object at 0x7f098641f380> file <_io.BytesIO object at 0x7f0513439bc0> le <_io.BytesIO object at 0x7f35a7f26110> Failed to fetch sample 4123070. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5690dad0> cannot identify image file <_io.BytesIO object at 0x7fe4c502f2e0> cannot identify image file <_io.BytesIO object at 0x7f6682dbb740> Failed to fetch sample 3134464. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec03380> Failed to fetch sample 3369056. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3845a80>FFailed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9cb2b4900Fa Failed to fetch sample 3383385. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f75f80> . Running this sequence through the model will result in indexing errors .BytesIO object at 0x7f7d9348fbf0> Failed to fetch sample 1278085. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82b3f60> Failed to fetch sample 1243147. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bf3dcb0> Failed to fetch sample 2103217. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a56d29580> Failed to fetch sample 2521553. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a24d10> cannot identify image file <_io.BytesIO object at 0x7fca2e1af6f0> Failed to fetch sample 1451627. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24966b2e0> Failed to fetch sample 2907066. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec253e70> cannot identify image file <_io.BytesIO obFailed to fetch sample 1171510. Exception:mple 1672947. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412ecf0> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340ed3f0> Failed to fetch sample 4125296. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d9348d9e0> cannot identify image file <_io.BytesIO object at 0x7f2775479cb0> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8f67a0> F Failed to fetch sample 4099940. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a559530Failed to fetch sample 2961542. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa275ca74c0> Failed to fetch sample 2587174. Exception: cannot identify image file <_io.BytesIO object at 0x7fde3edb95d0> FaFailed to fetch sample 1122273. Exception: cannot identify image file <_io.BytesIO object at 0x7f277547bfb0Failed to fetch sample 1524093. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38536f0> > Failed to fetch sample 1203134. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f57fb0> cannot identify image file <_io.BytesIO object at 0x7fbe723ff970> Token indices sequence length is longer than the specified maximum sequence length for this model (10090 > 8192). Running this sequence through the model will result in indexing errors Failed to fetch sample 3862350. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16da9a30> Failed to fetch sample 3343993. Exception: Failed to fetch sample 2403018. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8f4b30> Failed to fetch sample 4135597. Exception: cannot identify image file <_io.BytesIO object at 0x7fb6a3f1d350> ject at 0x7fd200387e20> Failed to fetch sample 1099589. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c426bb50> Failed to fetch sample 1179835. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73284360> Failed to fetch sample 2335870. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cddc10> ailed to fetch sample 1718944. Exception: cannot identify image fiFailed to fetch sample 1203134. Exception:Failed to fetch sample 3662436. Exception:bject at 0x7f1137f73f10> Failed to fetch sample 2521553. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b4267ab0> Failed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dba3e0> Failed to fetch sample 3728803. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278e3b50> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568f5b70> cannot identify image file <_io.BytesIO object at 0x7f5a16f72750> Failed to fetch sample 2562281. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937d3f10> Failed to fetch sample 3836084. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4291710> Failed to fetch sample 2155552. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebde1b0> Failed to fetch sample 1355996. Exception: cannot identify image file <_io.BytesIO object at 0x7f802848efc0> cannot identify image file <_io.BytesIO object at 0x7fdb9a641a80> Failed to fetch sample 1099589. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f64e700> le <_io.BytesIO object at 0x7f83e831ba10> cannot identify image file <_io.BytesIO object at 0x7f098647b880> Failed to fetch sample 2040208. Exception: cannot identify image file <_io.BytesIO object at 0x7fcef8cefd30> Failed to fetch sample 3401019. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a55bce0> Failed to fetch sample 3013832. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568cc770> cannot identify image file <_io.BytesIO object at 0x7f0c35d88f90> Failed to fetch sample 1429596. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f70680> Failed to fetch sample 3885172. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9624d0> Failed to fetch sample 1322959. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37e9a30> Failed to fetch sample 4228702. Exception: cannot identify image file <_io.BytesIO object at 0x7f2c57a12700> cannot identify image file <_io.BytesIO object at 0x7fe4c42691c0> Failed to fetch sample 4022407. Exception:Failed to fetch sample 2670557. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f0986479850> Failed to fetch sample 1448881. Exception: cannot identify image fiFailed to fetch sample 2687916. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f64fd80> Failed to fetch sample 2394516. Exception:Failed to fetch sample 1804828. Exception: cannot identify image fiFailed to fetch sample 3659262. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a570630> le <_io.BytesIO object at 0x7f7d9348d850> Failed to fetch sample 1488282. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a906a5c0> Failed to fetch sample 2439548. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73284720> Failed to fetch sample 2964969. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f63dd0> Failed to fetch sample 3493751. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e831b6a0> cannot identify image file <_io.BytesIO object at 0x7f5a16dab380> Failed to fetch sample 3358649. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f6e480> le <_io.BytesIO object at 0x7f7a568ce700> Failed to fetch sample 2846277. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5588b0> Failed to fetch sample 3159206. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7328c6d0> cannot identify image file <_io.BytesIO object at 0x7f4beec63740> Failed to fetch sample 1285751. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc10e3e0> Failed to fetch sample 1203134. Exception: cannot identify image file <_io.BytesIO object at 0x7f90e524d5d0> Failed to fetch sample 3297609. Exception: cannot identify image file <_io.BytesIO object at 0x7f80284b58a0> Failed to fetch sample 1047785. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0efd80> Failed to fetch sample 1499078. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e8aed0> Failed to fetch sample 2564922. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b923d30> Failed to fetch sample 2916193. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b961ee0> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7fb6921b69d0> Failed to fetch sample 3316035. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fd1a5bfb0> Failed to fetch sample 3147374. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568d76f0> Failed to fetch sample 2448314. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d9348e750> Failed to fetch sample 2758452. Exception: cannot identify image fiFailed to fetch sample 2719591. Exception:Failed to fetch sample 2887391. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e1fd0> Failed to fetch sample 3909071. Exception: cannot identify image fi Failed to fetch sample 3661638. ExceptionFailed to fetch sample 1116798. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fd9b20> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a0a890> Failed to fetch sample 3458049. Exception: cannot identify image file <_io.BytesIO object at 0x7f1dcabd5d50> cannot identify image file <_io.BytesIO object at 0x7f4ebf8d3920> Failed to fetch sample 3077341. Exception: cannot identify image file <_io.BytesIO object at 0x7f4bee421cb0> Failed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e8a4d0> Failed to fetch sample 3684442. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9b5ad0> Failed to fetch sample 1250413. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078f54e0> Failed to fetch sample 2108837. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f739c0> Failed to fetch sample 2535152. Exception: cannot identify image file <_io.BytesIO object at 0x7fd9dc1c3fb0>: cannot identify image file <_io.BytesIO object at 0x7f7b9b996e30> Failed to fetch sample 1122273. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3862ca0> cannot identify image file <_io.BytesIO object at 0x7fbcdc10f740> Failed to fetch sample 3578223. Exception: Failed to fetch sample 2492755. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f6750>Failed to fetch sample 1329424. Exception: ect at 0x7f6682df6750> Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec039c0> Failed to fetch sample 2742244. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078c8180> Failed to fetch sample 2224363. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1cf673880> Failed to fetch sample 2521553. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a55b560> cannot identify image file <_io.BytesIO object at 0x7f1f1b25b470> cannot identify image file <_io.BytesIO object at 0x7fa275c97e20> Failed to fetch sample 1588738. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84d4860> cannot identify image file <_io.BytesIO object at 0x7f53a905bce0> Failed to fetch sample 3026653. Exception: cannot identify image file <_io.BytesIO object at 0x7f351e1c6b10> le <_io.BytesIO object at 0x7f2a0a86d210> Failed to fetch sample 1480960. Exception:Failed to fetch sample 1258856. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8eb330> Failed to fetch sample 2330574. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f7600> cannot identify image file <_io.BytesIO object at 0x7f5caf575fd0> Failed to fetch sample 2687916. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b282390> Failed to fetch sample 1130395. Exception: cannot identify image fiFailed to fetch sample 2915907. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf780450> Failed to fetch sample 3687864. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bc9f30> Failed to fetch sample 2210758. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c356076a0> Failed to fetch sample 1885521. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84d4b30> Failed to fetch sample 3484950. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8338e50> cannot identify image file <_io.BytesIO object at 0x7f53a8ee09f0> Failed to fetch sample 2244285. Exception: cannot identify image file <_io.BytesIO object at 0x7f5643e2dda0>Failed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d730a25c0> Failed to fetch sample 2363228. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb9120> Failed to fetch sample 2944849. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278add00> Failed to fetch sample 1171510. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf783880> Failed to fetch sample 3322486. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b979f4270> Failed to fetch sample 2987811. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093755670> Failed to fetch sample 3359645. Exception: cannot identify image file <_io.BytesIO object at 0x7f1009401c60> Failed to fetch sample 1932093. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568c7010> Failed to fetch sample 1252147. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b9958f0> Failed to fetch sample 1126428. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7c3bf0> Failed to fetch sample 4379145. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16daa6b0> Failed to fetch sample 2790481. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071e5c0> Failed to fetch sample 4222253. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e833ade0> Failed to fetch sample 3596482. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c213970> Failed to fetch sample 2719591. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf781580> Failed to fetch sample 2040208. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b937560> Failed to fetch sample 4027148. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a5ec4db0> cannot identify image file <_io.BytesIO object at 0x7f9a9c2ba070> ile <_io.BytesIO object at 0x7f4beec00450> Failed to fetch sample 4079327. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071de90> Failed to fetch sample 1252147. Exception: cannot identify image fiFailed to fetch sample 1035361. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71f7e0> Failed to fetch sample 3595827. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278c65c0> Failed to fetch sample 3280452. Exception:Failed to fetch sample 1589090. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec67b50> Failed to fetch sample 2363228. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dcd850> Failed to fetch sample 3215324. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37cade0> Failed to fetch sample 1252147. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb35760> Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2c3330> Failed to fetch sample 2998072. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b281490> Failed to fetch sample 2827935. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0067e980> Failed to fetch sample 2040208. Exception: cannot identify image file <_io.BytesIO object at 0x7f69186e5620> Failed to fetch sample 2033363. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0779f4c0> Failed to fetch sample 2103217. Exception: cannot identify image file <_io.BytesIO object at 0x7f109376b920> Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078db420> Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bc9210> Failed to fetch sample 3458049. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7324b100> cannot identify image file <_io.BytesIO object at 0x7f53a905ac50> le <_io.BytesIO object at 0x7f56681d8d60> cannot identify image file <_io.BytesIO object at 0x7f5d278aef20> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278b1030> Failed to fetch sample 1410764. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99b840400> Failed to fetch sample 3326970. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fbe70> Failed to fetch sample 2778807. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b25f4c0> Failed to fetch sample 2379671. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce6b880> Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83afc40> ailed to fetch sample 1561037. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f8f4c0> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39ae890> Failed to fetch sample 1305588. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f91ca90> Failed to fetch sample 1341777. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39491c0> Failed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7f10105a7ba0> Failed to fetch sample 3167719. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d8630> cannot identify image file <_io.BytesIO object at 0x7f5d278fd7b0> Failed to fetch sample 1328181. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb74bce0> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b93bb50> Failed to fetch sample 2123650. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d7da31a0> cannot identify image file <_io.BytesIO object at 0x7f10937f35b0> Failed to fetch sample 2086216. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38c05e0> Failed to fetch sample 3927599. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf88f2e0> Failed to fetch sample 3297609. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d6250> Failed to fetch sample 2335870. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16da9d00> Failed to fetch sample 1272184. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2a7150> Failed to fetch sample 4179198. Exception: cannot identify image file <_io.BytesIO object at 0x7f1133fd1c10> cannot identify image file <_io.BytesIO object at 0x7fc99c2130b0> Failed to fetch sample 2103217. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b9fddf0> Failed to fetch sample 4171117. Exception: cannot identify image file <_io.BytesIO object at 0x7f5c9ed1d5d0> Failed to fetch sample 3732128. Exception: cannot identify image file <_io.BytesIO object at 0x7f20b4288c70> Failed to fetch sample 2358648. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b979f7560> Failed to fetch sample 1198363. 
Failed to fetch sample 3290404. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b9fc450>
Failed to fetch sample 1179835. Exception: cannot identify image file <_io.BytesIO object at 0x7f469e5ab2e0>
Failed to fetch sample 2333677. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b17b0>
Failed to fetch sample 1878125. Exception: cannot identify image file <_io.BytesIO object at 0x7f23ebead120>
Failed to fetch sample 3592220. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777df560>
Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741fd3fb0>
Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a391e020>
Failed to fetch sample 3582945. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4daba2480>
Failed to fetch sample 1171510. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39ac9f0>
Failed to fetch sample 1858166. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2cd9e0>
[... hundreds more "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" messages omitted; the originals were interleaved and truncated by concurrent worker processes writing to the same stream, but every recoverable message reports the same exception, with some sample ids (e.g. 1179835, 1207900, 2403018, 2801888, 3136267) recurring across retries ...]
Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078daf20> cannot identify image file <_io.BytesIO object at 0x7f7a568f5cb0> Failed to fetch sample 1747586. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a572c50> Failed to fetch sample 1482362. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f2662a0> Failed to fetch sample 3015989. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5690f150> Failed to fetch sample 1207900. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cc982020> Failed to fetch sample 2844450. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08db830> Failed to fetch sample 3665508. Exception: cannot identify image file <_io.BytesIO object at 0x7f351eb94d10> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcfa10> cannot identify image file <_io.BytesIO object at 0x7f7b9b9ff920> Failed to fetch sample 4149866. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f62cf0> Failed to fetch sample 1183428. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d2789fe20> Failed to fetch sample 1635122. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2ebb12570> Failed to fetch sample 2164469. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a11ac00> Failed to fetch sample 2654468. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f5bc0> Failed to fetch sample 1231297. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278dd1c0> cannot identify image file <_io.BytesIO object at 0x7fd07a576980> Failed to fetch sample 3507907. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64cf9c0> Failed to fetch sample 2049208. Exception:Failed to fetch sample 2103217. 
Exception:ject at 0x7f56681d9bc0> ile <_io.BytesIO object at 0x7f80284b5d50> cannot identify image file <_io.BytesIO object at 0x7fdc25628e00> ailed to fetch sample 2959239. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754a8130> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7c2a70> Failed to fetch sample 3687864. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16e0e160> Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7f9acc06bce0> Failed to fetch sample 2482217. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a118310> Failed to fetch sample 3051338. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278dfa60> Failed to fetch sample 2646566. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7c6d40> Failed to fetch sample 2988390. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08eca90> Failed to fetch sample 3096384. Exception: cannot identify image file <_io.BytesIO object at 0x7efde30fcae0> cannot identify image file <_io.BytesIO object at 0x7f1f1b240f90> le <_io.BytesIO object at 0x7f0c35c2a890> Failed to fetch sample 3149661. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5a65c0> cannot identify image file <_io.BytesIO object at 0x7f7a568d62a0> Failed to fetch sample 3665508. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a36d90> Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a539da0> Failed to fetch sample 2379671. Exception: Failed to fetch sample 1523937. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71f7e0> Failed to fetch sample 3421855. Exception: cannot identify image fFFailed to fetch sample 2045138. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078f4f90> Failed to fetch sample 3380271. 
Exception:Failed to fetch sample 1902057. Exception: cannot identify image file <_io.BytesIO object at 0x7fda7d136c00> bject at 0x7f83e87281d0> e <_io.BytesIO object at 0x7fb4a651ca40> Failed to fetch sample 3244596. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f751c0> Failed to fetch sample 2230973. Exception: cannot identify image file <_io.BytesIO object at 0x7fc24c675120> Failed to fetch sample 3644627. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fe610> Failed to fetch sample 2647591. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72407010> Failed to fetch sample 3648356. Exception: cannot identify image file <_io.BytesIO object at 0x7f60ffea6d40> Failed to fetch sample 1047785. Exception: cannot identify image fFailed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a905bce0> Failed to fetch sample 4125296. Exception: Failed to fetch sample 4084511. Exception: cannot identify image fFailed to fetch sample 4104440. Exception: Failed to fetch sample 1806603. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078cb100> Failed to fetch sample Failed to fetch sample 2641623. Exception: cannot identify image file <_io.BytesIO objFailed to fetch sample 2198101. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16d9d800> Failed to fetch sample 4125296. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35ba9620> ect at 0x7f83e831aca0> Failed to fetch sample 4125296. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84edb70> Failed to fetch sample 1448134. Exception: cannot identify image file <_io.BytesIO object at 0x7f1133fd17b0> Failed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0772cb30> Failed to fetch sample 3134464. Exception:Failed to fetch sample 1759192. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72406340> cannot identify image file <_io.BytesIO object at 0x7fbe723fe4d0> Failed to fetch sample 3897834. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16d9e2f0> Failed to fetch sample 1277019. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9b01d0> cannot identify image file <_io.BytesIO object at 0x7f1137f1dc10> le <_io.BytesIO object at 0x7f83e8371fd0> Failed to fetch sample 2903629. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35ba91c0> Failed to fetch sample 1355996. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078f74c0> Failed to fetch sample 2363228. Exception:cannot identify image file <_io.BytesIO object at 0x7f7e078dbab0> Failed to fetch sample 1186857. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93ac50> Failed to fetch sample 2164469. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64c9210> Failed to fetch sample 2903629. Exception:bject at 0x7fc2eb7bc630> Failed to fetch sample 2587903. ExceptionFailed to fetch sample 2045138. Exception:object at 0x7fccdf789cb0> Failed to fetch sample 1113188. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb71d080> Failed to fetch sample 1377471. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a119990> Failed to fetch sample 1429596. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078f6980> Failed to fetch sample 3345102. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39daed0> cannot identify image file <_io.BytesIO object at 0x7f83e8371990> Failed to fetch sample 3571975. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20038a5c0> Failed to fetch sample 2903629. Exception: cannot identify image file <_io.BytesIO object at 0x7f64960573d0> Failed to fetch sample 3595827. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fff60> Failed to fetch sample 3295434. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab99490> Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c7a60> Failed to fetch sample 2919698. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a119300> Failed to fetch sample 1603657. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72417d30> Failed to fetch sample 3303818. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722463e0> Failed to fetch sample 3042212. Exception: cannot identify image file <_io.BytesIO object at 0x7f81d6d8a110> Failed to fetch sample 2537301. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8eea20> ailed to fetch sample 2448314. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c2b4c0>Failed to fetch sample 1203134. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253be6ed0> Failed to fetch sample 2732557. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5f8ea0> Failed to fetch sample 3381491. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59e7a0> Failed to fetch sample 2988390. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bc3060> Failed to fetch sample 3134464. Exception: cannot identify image fFailed to fetch sample 3117311. Exception: Failed to fetch sample 1495777. Exception: cannot identify image file <_io.BytesIO object at 0x7fb458a94f40> cannot identify image file <_io.BytesIO object at 0x7f7d7e026c00> Failed to fetch sample 1160161. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00726160> Failed to fetch sample 3431352. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39da1b0> Failed to fetch sample 1580010. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9c72e0> Failed to fetch sample 4238312. Exception: cannot identify image file <_io.BytesIO object at 0x7f81d6d8a250> Failed to fetch sample 3742596. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f91c680> Failed to fetch sample 3015989. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7fcbdc191b70> Failed to fetch sample 1327539. Exception: cannot identify image file <_io.BytesIO ob cannot identify image file <_io.BytesIO object at 0x7f08280fd5d0> Failed to fetch sample 3297609. Exception: cannot identify image fFailed to fetch sample 2024032. Exception: cannot identify image file <_io.BytesIO object at 0x7f8151525620> Failed to fetch sample 2720468. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f91f2e0> Failed to fetch sample 2049208. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db57b790> 795289. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20037cea0> Failed to fetch sample 1103403. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2cb1f0> Failed to fetch sample 1429596. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401e98a0> Failed to fetch sample 2813892. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00726390> Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093755670>Failed to fetch sample 2617752. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937eaa20> Failed to fetch sample 2351977. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b92dd50> Failed to fetch sample 2887391. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078facf0> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071e2f0> Failed to fetch sample 1479796. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f8294c44d60> cannot identify image file <_io.BytesIO object at 0x7fc0f0937010> Failed to fetch sample 2538638. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f270b0> Failed to fetch sample 1961161. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f33b50> Failed to fetch sample 2821882. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f7ab0> Failed to fetch sample 4235606. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560e41d0> Failed to fetch sample 2448314. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e832a8e0> Failed to fetch sample 2282533. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b996bb0> Failed to fetch sample 2108837. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078f4040> Failed to fetch sample 3876980. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf007258f0> Failed to fetch sample 3501092. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3824860> Failed to fetch sample 1813238. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec60ae0> Failed to fetch sample 2290684. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f976bb0> Failed to fetch sample 2772206. Exception: cannot identify image f cannot identify image file <_io.BytesIO obFailed to fetch sample 1171510. Exception: cannot identify image fFailed to fetch sample 1070036. Exception: cannot identify image fiFailed to fetch sample 2829189. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 1256456. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986479c10> Failed to fetch sample 3687864. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0fd5769d0> ject at 0x7f1009412c50> Failed to fetch sample 1179835. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f12b39f7c90> Failed to fetch sample 3694880. Exception: cannot identify image file <_io.BytesIO object at 0x7f1228c23650> Failed to fetch sample 4108370. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec63ba0> cannot identify image file <_io.BytesIO object at 0x7f81db541cb0> Failed to fetch sample 3209388. Exception: annot identify image file <_io.BytesIO object at 0x7f001f9e3330> Failed to fetch sample 3695885. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f91dfd0> Failed to fetch sample 3162454. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f918220> ile <_io.BytesIO object at 0x7fa088cbed40> Failed to fetch sample 988547. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f53920> Failed to fetch sample 2654468. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2993a0> Failed to fetch sample 1759192. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20037d9e0> Failed to fetch sample 2865838. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fd4360> Failed to fetch sample 3484950. Exception: cannot identify image file <_io.BytesIO object at 0x7fb3766061b0> Failed to fetch sample 2235818. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b0887dd0> Failed to fetch sample 2379671. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137ff4c70> Failed to fetch sample 4222253. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f682c0> Failed to fetch sample 1322959. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a64720> Failed to fetch sample 2425997. Exception: cannot identify image file <_io.BytesIO object at 0x7f051343acf0> Failed to fetch sample 969712. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b1d6200> FFailed to fetch sample 2606802. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f4c25fba4d0> Failed to fetch sample 1476796. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093778130>FFailed to fetch sample 2239448. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39b6e80> Failed to fetch sample 3687864. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f831a0> Failed to fetch sample 1198658. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777a89f0>FFailed to fetch sample 4056827. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278c5cb0> Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5f8c70> ailed to fetch sample 4232629. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682deff10> Failed to fetch sample 1627889. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf005cc8b0> Failed to fetch sample 2915907. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b39f69d0> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a1f740> Failed to fetch sample 2045138. Exception: cannot identify image file <_io.BytesIO object at 0x7f109378e610> Failed to fetch sample 2828746. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937887c0> Failed to fetch sample 3745397. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7693f0> Failed to fetch sample 2887391. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401eaca0> ailed to fetch sample 3048025. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4eeec6570> Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7f8151efad40> Failed to fetch sample 3910327. Exception: cannot identify image file <_io.BytesIO object at 0x7f5cb38b0fe0> Failed to fetch sample 2776302. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1c71a0> Failed to fetch sample 2470840. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaef661f30> Failed to fetch sample 2448334. Exception: cannot identify image file <_io.BytesIO object at 0x7f2c57a123e0> cannot identify image file <_io.BytesIO object at 0x7fd07a580f90> Failed to fetch sample 2033363. Exception:Failed to fetch sample 2178120. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275ca9cb0> Failed to fetch sample 1479796. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce68860> ailed to fetch sample 1262384. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754bbf60> Failed to fetch sample 4108370. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4f10612b0> Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3811760> Failed to fetch sample 3350750. Exception: cannot identify image file <_io.BytesIO object at 0x7ff69741ce50> Failed to fetch sample 1218899. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b1d6f20> Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137ff73d0> Failed to fetch sample 3134464. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278b55d0> Failed to fetch sample 1550413. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bb71a0> Failed to fetch sample 1495777. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a644f0> Failed to fetch sample 3167299. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093759d50> Failed to fetch sample 2858679. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec2ef20> Failed to fetch sample 1345056. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dc3ab0> Failed to fetch sample 2954477. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff4eeec7560> Failed to fetch sample 1345056. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39e0e00> cannot identify image file <_io.BytesIO object at 0x7efcd7b29440> Failed to fetch sample 4379145. Exception:Failed to fetch sample 3732379. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3846a70> cannot identify image file <_io.BytesIO object at 0x7fc383d85120> le <_io.BytesIO object at 0x7f22e389a8e0> Failed to fetch sample 2277075. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cd2e80> Failed to fetch sample 2893118. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986444180> Failed to fetch sample 2819794. Exception: cannot identify image file <_io.BytesIO object at 0x7f351e1c6b10> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac242bb0> xception: cannot identify image file <_io.BytesIO object at 0x7f7b9b1d42c0> Failed to fetch sample 1796453. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e95d0> Failed to fetch sample 1262384. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4015d81d0> Failed to fetch sample 2265597. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3813b50> Failed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac239300> Failed to fetch sample 2086216. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937581d0> Failed to fetch sample 1218899. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec2e480> Failed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7fc383d844a0> Failed to fetch sample 1429596. Exception: cannot identify image file <_io.BytesIO object at 0x7f109378fbf0> Failed to fetch sample 2903629. Exception: Failed to fetch sample 4108370. 
Exception:cannot identify image fiFailed to fetch sample 2237717. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35b96750> Failed to fetch sample 1432321. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459cb920> Failed to fetch sample 1278085. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754c29d0> Failed to fetch sample 2881442. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db570c70> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39efc40> Failed to fetch sample 4379145. Exception: cannot identify image file <_io.BytesIO object at 0x7ff697deb470> Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f57a10> Failed to fetch sample 2661050. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7bf830> Failed to fetch sample 2670557. Exception: cannot identify image file <_io.BytesIO object at 0x7f109378a9d0> Failed to fetch sample 2394516. Exception: cannot identify image fiFailed to fetch sample 2954477. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac246160> Failed to fetch sample 2325409. Exception:Failed to fetch sample 3400038. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682deddf0> Failed to fetch sample 3662436. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c429dd8f0> Failed to fetch sample 3038970. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280e8a90> Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdf67f0> cannot identify image file <_io.BytesIO object at 0x7f098641f6f0> ailed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f57ab0> Failed to fetch sample 2984651. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4015d9df0> Failed to fetch sample 3215324. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7b6de0> Failed to fetch sample 1614272. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a970810> cannot identify image file <_io.BytesIO object at 0x7f9d7322e7a0> Failed to fetch sample 1829959. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7322f650> Failed to fetch sample 1198363. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682f9f30> Failed to fetch sample 1198363. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f2b240> ailed to fetch sample 1023408. Exception: cannot identify image file <_io.BytesIO object at 0x7f2775476980> Failed to fetch sample 1448881. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39ed670> Failed to fetch sample 1796453. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7b5a80> Failed to fetch sample 2801888. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7325e9d0> Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459c8a40> Failed to fetch sample 4222253. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a64400> Failed to fetch sample 1278085. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d14f9f600> ile <_io.BytesIO object at 0x7f5668133880>cannot identify image file <_io.BytesIO object at 0x7fb687f2b7e0> Failed to fetch sample 1943116. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73253b00> Failed to fetch sample 3488586. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b7fb9e40> Failed to fetch sample 2758975. Exception: cannot identify image file <_io.BytesIO object at 0x7fde75ec5120> Failed to fetch sample 3015321. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8f2a20> Failed to fetch sample 1265302. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f098641f4c0>
Failed to fetch sample 2916193. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937ee6b0>
Failed to fetch sample 2403018. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c5e9d0>
Failed to fetch sample 3013832. Exception: cannot identify image file <_io.BytesIO object at 0x7efde30fcae0>
Failed to fetch sample 3374424. Exception: cannot identify image file <_io.BytesIO object at 0x7f668261ade0>
[... the same record, "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>", repeats for several hundred sample IDs, with output from concurrent worker processes interleaved and truncated mid-line ...]
Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b9fcc70> cannot identify image file <_io.BytesIO object at 0x7f7b10cef790> Failed to fetch sample 2047816. Exception: cannot identify image file <_io.BytesIO object at 0x7f0d8316a4d0> Failed to fetch sample 2394516. Exception:ject at 0x7fcbdc1be1b0> Failed to fetch sample 2783090. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2fa0c0> ile <_io.BytesIO object at 0x7fc0f08f9b70> Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f95a0c0> Failed to fetch sample 1523937. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2d0360> cannot identify image file <_io.BytesIO object at 0x7f5253be55d0> le <_io.BytesIO object at 0x7fb687f29990> Failed to fetch sample 3559581. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668106ca0> Failed to fetch sample 1277996. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668133a10> Failed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4015d9b20> cannot identify image file <_io.BytesIO object at 0x7f7e07793330> Failed to fetch sample 3724020. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf005f8b80> Failed to fetch sample 2858679. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0937a60> Failed to fetch sample 2231126. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0926d90> Failed to fetch sample 2940708. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9d0f40> Failed to fetch sample 1021516. Exception: cannot identify image file <_io.BytesIO object at 0x7fb977616020> Failed to fetch sample 3909071. 
Exception:cannot identify image file <_io.BytesIO object at 0x7f94ac2f9350> ile <_io.BytesIO object at 0x7fa275c9e660> le <_io.BytesIO object at 0x7fc3850e71a0> cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fbf005f86d0> Failed to fetch sample 2359987. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0770bec0>Failed to fetch sample 1635122. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38a5bc0> Failed to fetch sample 2360655. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec7f6f0> Failed to fetch sample 2244335. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca23e0> Failed to fetch sample 3484950. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280ebf10> Failed to fetch sample 2895867. Exception: cannot identify image file <_io.BytesIO object at 0x7fc26217a7a0> Failed to fetch sample 1806603. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0934ef0> Failed to fetch sample 2790481. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb933d0> cannot identify image file <_io.BytesIO object at 0x7f9a9c273510> Failed to fetch sample 2122051. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668279760> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7adcb0> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7fcc556be9d0> cannot identify image file <_io.BytesIO object at 0x7f4beec7e930> Failed to fetch sample 4079327. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f2a160> Failed to fetch sample 1448881. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668104c70> Failed to fetch sample 3298418. Exception: cannot identify image fiFailed to fetch sample 2699547. Exception:Failed to fetch sample 3007623. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f71340ed3f0> cannot identify image file <_io.BytesIO object at 0x7f7e07705d50> Failed to fetch sample 2426748. Exception: Failed to fetch sample 2784203. Exception:cannot identify image file <_io.BytesIO object at 0x7fb25c5ebfb0> cannot identify image file <_io.BytesIO object at 0x7fee3f9d3240> Failed to fetch sample 1061148. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab76ca0> Failed to fetch sample 3534881. Exception: cannot identify image fiFailed to fetch sample 994847. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0934b80> Failed to fetch sample 1192776. Exception: cannot identify image file <_io.BytesIO object at 0x7fef478ca020> ile <_io.BytesIO object at 0x7f9a9c294040> Failed to fetch sample 1635122. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682fb240> Failed to fetch sample 2956251. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73250ae0> Failed to fetch sample 1524920. Exception: cannot identify image fiFailed to fetch sample 2855055. Exception:Failed to fetch sample 2286558. Exception: cannot identify image file <_io.BytesIO object at 0x7f109377a5c0> Failed to fetch sample 2659939. Exception: cannot identify image fiFailed to fetch sample 2641615. Exception:Failed to fetch sample 2827935. Exception:bject at 0x7f9a9c272110> Failed to fetch sample 1171510. Exception:cannot identify image file <_io.BytesIO object at 0x7f7a569607c0> Failed to fetch sample 2563789. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cbf97920> FFailed to fetch sample 3162110. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2d9490> Failed to fetch sample 1858166. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d732526b0> Failed to fetch sample 1878125. Exception: cannot identify image file <_io.BytesIO object at 0x7f0db4ede110> Failed to fetch sample 2719591. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f56681b3100> Failed to fetch sample 3715556. Exception: cannot identify image file <_io.BytesIO object at 0x7f2009934b80> Failed to fetch sample 2035018. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42939c0> Failed to fetch sample 1858383. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682fa7f0> Failed to fetch sample 4018722. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7328b100> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c26dc10> Failed to fetch sample 2744255. Exception:bject at 0x7fdc2562aa70> Failed to fetch sample 3350750. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a56960720> le <_io.BytesIO object at 0x7f1008a66980> Failed to fetch sample 2017911. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0f9a13f0> Failed to fetch sample 3345102. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a0bab0> Failed to fetch sample 2887391. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280eb830> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab9abb0> cannot identify image file <_io.BytesIO object at 0x7f93f3997240> le <_io.BytesIO object at 0x7fedb5ec0720> Failed to fetch sample 2110654. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8da110> cannot identify image file <_io.BytesIO object at 0x7f9d7328bba0> Failed to fetch sample 3013832. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2da570> Failed to fetch sample 2280658. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fe7330> Token indices sequence length is longer than the specified maximum sequence length for this model (10090 > 8192). Running this sequence through the model will result in indexing errors Failed to fetch sample 2790481. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b2590d0> Failed to fetch sample 3728815. Exception: cannot identify image file <_io.BytesIO object at 0x7f56681137e0> Failed to fetch sample 3867894. Exception: cannot identify image file <_io.BytesIO object at 0x7f566813f790> Failed to fetch sample 1448881. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c29bf10> Failed to fetch sample 1682203. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf889ee0> Failed to fetch sample 1024277. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08efe20> cannot identify image file <_io.BytesIO object at 0x7f08282cb420> Failed to fetch sample 2108837. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828117ba0> le <_io.BytesIO object at 0x7fe436a0b330> Failed to fetch sample 3467927. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668183600> Failed to fetch sample 1769826. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1373d0> Failed to fetch sample 2482217. Exception: cannot identify image file <_io.BytesIO object at 0x7f566813cb30> Failed to fetch sample 3463683. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c26e7a0> Failed to fetch sample 3863545. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f65760> Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7322f650> Failed to fetch sample 2827935. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b258c70> Failed to fetch sample 2479743. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668112250> Failed to fetch sample 4125126. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39d89a0> Failed to fetch sample 3431352. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf76be70> Failed to fetch sample 2890579. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5c9d9458f0> Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7f566813e840> Failed to fetch sample 1943116. Exception: cannot identify image fFailed to fetch sample 2139522. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f73790> ile <_io.BytesIO object at 0x7f12b38a4e50> cannot identify image file <_io.BytesIO object at 0x7fe436a0b2e0> Failed to fetch sample 1009658. Exception:Failed to fetch sample 3645977. Exception: cannot identify image fiFailed to fetch sample 1759192. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39d8310> Failed to fetch sample 2654468. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00755e40> Failed to fetch sample 1743116. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0070b100> le <_io.BytesIO object at 0x7f9a9c299b70> Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7f566813d710> Failed to fetch sample 3648356. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc118770> Failed to fetch sample 2555404. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134197d30> Failed to fetch sample 1097359. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcfe20> Failed to fetch sample 1316862. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35b98040> Failed to fetch sample 2158552. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3941760> ailed to fetch sample 2265597. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d8ab60> Failed to fetch sample 2641615. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecbdc2570> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a986e30> Failed to fetch sample 2858679. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f73b00> Failed to fetch sample 3589159. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401dc5580> Failed to fetch sample 1865385. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682ddbc40> cannot identify image file <_io.BytesIO oFailed to fetch sample 1256456. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341959e0> Failed to fetch sample 28Failed to fetch sample 1755291. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 4080024. Exception:ample 2312531. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec21af70> Failed to fetch sample 1536214. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f1850> Failed to fetch sample 2850892. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec245e40> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99e9cfbf0> Failed to fetch sample 2720468. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24ff60> cannot identify image file <_io.BytesIO object at 0x7f566813fab0> ailed to fetch sample 2223750. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381d4e0> Failed to fetch sample 1120039. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39df1f0> Failed to fetch sample 3180383. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9db060> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7f66c4cb3380> Failed to fetch sample 4073546. Exception: cannot identify image file <_io.BytesIO object at 0x7fd9c41298f0> Failed to fetch sample 1850444. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39d69d0> Failed to fetch sample 1806603. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f72570> Failed to fetch sample 2855055. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a95df80> Failed to fetch sample 2049208. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5759e0> annot identify image file <_io.BytesIO object at 0x7f53a9068f40> Failed to fetch sample 2964818. Exception: Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b9120> Failed to fetch sample 3535594. Exception: cannot identify image fFailed to fetch sample 1431448. Exception: Failed to fetch sample 2526317. Exception: cannot identify image file <_io.BytesIO object at 0x7f566810c540> cannot identify image file <_io.BytesIO object at 0x7f66b8aae7f0> Failed to fetch sample 2964818. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec223420> Failed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24d210> Failed to fetch sample 4162588. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb6520> Failed to fetch sample 3900769. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4eeec6570> Failed to fetch sample 1482362. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a95fce0> Failed to fetch sample 1847996. Exception: cannot identify image file <_io.BytesIO object at 0x7fb2fd566de0> Failed to fetch sample 2363461. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f32390> Failed to fetch sample 1108282. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fd5d0> Failed to fetch sample 4030253. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401ded620> Failed to fetch sample 1479796. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f646750> ailed to fetch sample 2503403. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078d9800> Failed to fetch sample 994847. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff4f1061760> Failed to fetch sample 2438975. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39dd5d0> Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc118310> Failed to fetch sample 3091512. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39ddd50> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9d8e00> Failed to fetch sample 2388130. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682db2660> Failed to fetch sample 1742099. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9f4f71940> Failed to fetch sample 3633396. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e4040> Failed to fetch sample 2078887. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401fbbbf0> Failed to fetch sample 1277996. Exception: cannot identify image fFailed to fetch sample 4104440. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459ca5c0> Failed to fetch sample 3180383. Exception: Failed to fetch sample 2970162. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828129ad0> Failed to fetch sample 4028686. Exception: cannot identify image fFailed to fetch sample 1031287. Exception: cannot identify image file <_io.BytesIO object at 0x7fb4a651d620> Failed to fetch sample 3689918. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f616430> Failed to fetch sample 1231297. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0931e40> cannot identify image file <_io.BytesIO object at 0x7f566810fdd0> Failed to fetch sample 1284272. Exception:Failed to fetch sample 4125296. Exception: cannot identify image file <_io.BytesIO object at 0x7f3291771350> Failed to fetch sample 978639. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f58c381aa20> Failed to fetch sample 2383732. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b938590> Failed to fetch sample 2633220. Exception: cannot identify image filFailed to fetch sample 1026599. Exception: cannot identify image file <_io.BytesIO object at 0x7f10106df060> le <_io.BytesIO object at 0x7ff401f57fb0> Failed to fetch sample 3276759. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7bc090> Failed to fetch sample 3348800. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39ee4d0> Failed to fetch sample 3034526. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3812e80> cannot identify image file <_io.BytesIO object at 0x7fc706ed43b0> Failed to fetch sample 3700472. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644bb00> Failed to fetch sample 3276759. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a982b10> Failed to fetch sample 1445442. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278fd5d0> Failed to fetch sample 1491402. Exception:cannot identify image file <_io.BytesIO object at 0x7f3291771940> Failed to fetch sample 2452272. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682df4090> FFailed to fetch sample 3695885. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2f6f70> Failed to fetch sample 3493751. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278c0ae0> Failed to fetch sample 1811599. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b25ad90> Failed to fetch sample 4376367. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3729120> Failed to fetch sample 2158552. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7bfa60> cannot identify image file <_io.BytesIO object at 0x7fe0ec24d3a0> Failed to fetch sample 2903629. 
Exception:Failed to fetch sample 2763938. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db541d50> Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b938f40> Failed to fetch sample 2047816. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7ae020> Failed to fetch sample 2317494. Exception: cannot identify image file <_io.BytesIO object at 0x7fb976fcc900> Failed to fetch sample 2158552. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a982430> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777e5a80> Failed to fetch sample 3557219. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39b7b00> Failed to fetch sample 1804828. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39ef150> FFailed to fetch sample 3687864. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37cf1f0> Failed to fetch sample 2503403. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f604ea0> Failed to fetch sample 1461463. Exception: cannot identify image file <_io.BytesIO object at 0x7feb5488d940> Failed to fetch sample 3523488. Exception: cannot identify image file <_io.BytesIO object at 0x7f277548b6a0>FFailed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777dc220> cannot identify image file <_io.BytesIO object at 0x7fdfaa47c180> Failed to fetch sample 1517927. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556075120> Failed to fetch sample 1097359. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24f1a0> Failed to fetch sample 4026847. Exception: Failed to fetch sample 2103217. Exception: cannot identify image fFailed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c49a0> Failed to fetch sample 1806693. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777dc0e0> Failed to fetch sample 3123930. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39cf1f0> Failed to fetch sample 2045138. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278c1530> Failed to fetch sample 4117657. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3815760> Failed to fetch sample 2045138. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f75f80> Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO object at 0x7fb97778fa10> Failed to fetch sample 3244596. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37cc5e0> Failed to fetch sample 1014966. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83ade40> Failed to fetch sample 4028462. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c96200> cannot identify image file <_io.BytesIO object at 0x7f18e5d0a070> Failed to fetch sample 1779474. Exception: cannot identify image file <_io.BytesIO object at 0x7fd31219fba0> Failed to fetch sample 2759315. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf064ef0> le <_io.BytesIO object at 0x7f5253a69670> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39229d0> Failed to fetch sample 2531778. Exception: cannot identify image file <_io.BytesIO object at 0x7f530f3db880> Failed to fetch sample 2089328. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72246ac0> Failed to fetch sample 3414597. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24f880> Failed to fetch sample 1322959. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560fc720> Failed to fetch sample 3371113. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706ed6f20> Failed to fetch sample 1495777. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f47278eb010> Failed to fetch sample 2898278. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2f5030> Failed to fetch sample 3180383. Exception:ject at 0x7fb66a377e70> Failed to fetch sample 1858383. Exception: Failed to fetch sample 1329424. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d0a250> Failed to fetch sample 1832660. Exception: cannot identify image fFailed to fetch sample 3398883. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078deca0> Failed to fetch sample 1424068. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec7c720> ailed to fetch sample 2274389. Exception: cannot identify image file <_io.BytesIO object at 0x7f524fbda1b0> Failed to fetch sample 2155552. Exception: cannot identify image file <_io.BytesIO object at 0x7f0d22b324d0> Failed to fetch sample 2719591. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253a689f0> Failed to fetch sample 3036687. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722496c0> Failed to fetch sample 4205242. Exception: cannot identify image file <_io.BytesIO object at 0x7f47edfc16c0> Failed to fetch sample 3230192. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebd2b60> Failed to fetch sample 3258016. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078f6250> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e8a7a0> cannot identify image file <_io.BytesIO object at 0x7fd2c46f2d40> le <_io.BytesIO object at 0x7f4beec67790> Failed to fetch sample 1431086. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078ea7f0> Failed to fetch sample 1906992. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b39f6f70> Failed to fetch sample 1523937. 
Failed to fetch sample 3919850. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb1ef70>
Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37371a0>
Failed to fetch sample 2295708. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a23865cb0>
Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d0ac00>
Failed to fetch sample 2979603. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf83cdb0>
Failed to fetch sample 972910. Exception: cannot identify image file <_io.BytesIO object at 0x7fb4a651dfd0>
Failed to fetch sample 2359987. Exception: cannot identify image file <_io.BytesIO object at 0x7fef478edfd0>
Failed to fetch sample 3584846. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003b3830>
[... the same "Failed to fetch sample N. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error repeated for several hundred further sample IDs (some IDs recurring, e.g. 1602779, 3557219, 4079327, 1485531), interleaved and partially overwritten across concurrent data-loader workers; output truncated ...]
Exception:ject at 0x7f6682f92250> Failed to fetch sample 1602779. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec7b880> FFailed to fetch sample 1099448. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c187d30> le <_io.BytesIO object at 0x7f0c35bc3240> Failed to fetch sample 1770342. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bbb420> Failed to fetch sample 3866811. Exception: cannot identify image file <_io.BytesIO object at 0x7f4440882c00> Failed to fetch sample 2892575. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3912fc0> Failed to fetch sample 4376367. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b994900> cannot identify image file <_io.BytesIO object at 0x7f1f1b223470> Failed to fetch sample 1262384. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7404096c0> Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dcf560> Failed to fetch sample 2049208. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134571030> Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a810b290> cannot identify image file <_io.BytesIO object at 0x7f22e39eeca0> Failed to fetch sample 2253806. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bff920> Failed to fetch sample 2892575. Exception: cannot identify image file <_io.BytesIO object at 0x7f443e855620> le <_io.BytesIO object at 0x7efcdbb2e5c0> cannot identify image file <_io.BytesIO object at 0x7fca2e1c74c0> Failed to fetch sample 3013587. Exception: cannot identify image file <_io.BytesIO object at 0x7f07aac718f0> Failed to fetch sample 3332023. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8370b30> Failed to fetch sample 1972307. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e2b28cf90> Failed to fetch sample 4125296. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f936bd2fe20> Failed to fetch sample 1213089. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bbaca0> Failed to fetch sample 2824278. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c56e390> cannot identify image file <_io.BytesIO object at 0x7f2a9173dfd0> Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaade9cc70> Failed to fetch sample 1720874. Exception: cannot identify image file <_io.BytesIO object at 0x7feef8fb3100> cannot identify image file <_io.BytesIO object at 0x7fca1c186c50> ailed to fetch sample 2984651. Exception: Failed to fetch sample 4121547. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8890d0> cannot identify image file <_io.BytesIO object at 0x7f5253bfd120> Failed to fetch sample 2930403. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7f7b9b9968e0> 339123. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0f3790> Failed to fetch sample 1136020. Exception: cannot identify image file <_io.BytesIO object at 0x7fbc5c1b4e00> Failed to fetch sample 2940708. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadd4c590> Failed to fetch sample 3276759. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f969d50> Failed to fetch sample 3661638. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b4267a60> Failed to fetch sample 3281163. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37a81d0> Failed to fetch sample 2562281. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35486ac0> Failed to fetch sample 3403770. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5774c0> cannot identify image file <_io.BytesIO object at 0x7f4fc86f72e0> Failed to fetch sample 1688560. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9968e0> Failed to fetch sample 2679576. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b25fab0> ile <_io.BytesIO object at 0x7f5de37cecf0> cannot identify image file <_io.BytesIO object at 0x7fdcaf888090> le <_io.BytesIO object at 0x7fee3f9ead90> Failed to fetch sample 4171913. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682e7b00> Failed to fetch sample 1021516. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb201120> Failed to fetch sample 2729090. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec26a8e0> Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7f936bd2dee0> Failed to fetch sample 3002531. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc174590> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a8108900> Failed to fetch sample 4099617. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b4266c50> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5855d0> Failed to fetch sample 3233880. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1dabb0> cannot identify image file <_io.BytesIO object at 0x7f5d2787f150> le <_io.BytesIO object at 0x7f9a9c2c1f80> Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7327eac0> Failed to fetch sample 4104440. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db576660> Failed to fetch sample 3421855. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db585c10> cannot identify image file <_io.BytesIO object at 0x7fe4369fd850> Failed to fetch sample 2075577. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a537bf0> Failed to fetch sample 3523093. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec8e7a0> Failed to fetch sample 2300940. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38a59e0> Failed to fetch sample 1818686. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9b1cb0> Failed to fetch sample 4122522. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093757290> Failed to fetch sample 2045138. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37a8f90> Failed to fetch sample 2616210. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f28aca0> Failed to fetch sample 1858166. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278cbce0> Failed to fetch sample 3687864. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72417830> Failed to fetch sample 1484787. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72246340>cannot identify image file <_io.BytesIO object at 0x7f83e83c8220> Failed to fetch sample 3379064. Exception: cFailed to fetch sample 2746828. Exception: cannot identify image Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b4267ce0> Failed to fetch sample 2448314. Exception: cFailed to fetch sample 1099589. Exception: cannot identify image Failed to fetch sample 1659412. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe724148b0> Failed to fetch sample 2531778. Exception: cFailed to fetch sample 1584467. Exception: cannot identify image Failed to fetch sample 3648356. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7327d670> file <_io.BytesIO object at 0x7f94ac23c6d0> Failed to fetch sample 2850892. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9ce8db70> Failed to fetch sample 2538638. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc121ad0> Failed to fetch sample 3021215. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f80a810bf10> Failed to fetch sample 3196207. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20037a480> cannot identify image file <_io.BytesIO object at 0x7fdc25fe18a0> Failed to fetch sample 2658306. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f264b30> Failed to fetch sample 3225671. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a372e0> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7c3920> Failed to fetch sample 2687916. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd431efe20> Failed to fetch sample 1946454. Exception: cannot identify image file <_io.BytesIO object at 0x7f472781f1f0> Failed to fetch sample 1806603. Exception:bject at 0x7f6a5c529760> Failed to fetch sample 2831255. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3898db0> Failed to fetch sample 1037897. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f905b20> Failed to fetch sample 1878125. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278c3b00> cannot identify image file <_io.BytesIO object at 0x7f5e6a972570> Failed to fetch sample 2572927. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf79e660> Fcannot identify image file <_io.BytesIO object at 0x7fdcaf8577e0> Failed to fetch sample 4073421. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a59be70> Failed to fetch sample 4235606. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf066a70> Failed to fetch sample 3421855. Exception:Failed to fetch sample 4075749. Exception: cannot identify image fiFailed to fetch sample 3661638. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5cd10> Failed to fetch sample 3502572. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003abc40> Failed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38c1710> Failed to fetch sample 3879860. Exception: cannot identify image file <_io.BytesIO object at 0x7f0daaeef880> cannot identify image file <_io.BytesIO object at 0x7fe0ee697ce0> Failed to fetch sample 1465471. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec282e80> le <_io.BytesIO object at 0x7f357b93e5c0> Failed to fetch sample 1097359. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0f2890> Failed to fetch sample 1100749. Exception: cannot identify image file <_io.BytesIO object at 0x7fda7d136c00> Failed to fetch sample 1099448. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5371a0> Failed to fetch sample 2007723. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7c2a70> Failed to fetch sample 1975187. Exception: cannot identify image file <_io.BytesIO object at 0x7f6ac76ba5c0> Failed to fetch sample 2970162. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c187970> Failed to fetch sample 3872727. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4262ed0> Failed to fetch sample 978639. Exception: cannot identify image file <_io.BytesIO object at 0x7fcf245c0770> Failed to fetch sample 2024032. Exception:Failed to fetch sample 27Failed to fetch sample 2253806. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8570b0> Failed to fetch sample 1099448. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fe9d0> Failed to fetch sample 2155552. Exception:Failed to fetch sample 4222253. Exception: cannot identify image ficannot identify image file <_io.BytesIO obFailed to fetch sample 3259618. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf066020> Failed to fetch sample 1536214. 
Exception: cannot identify image fiFailed to fetch sample 1629521. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99b841350> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7f393931aed0> Failed to fetch sample 2916193. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37a92b0> Failed to fetch sample 3418606. Exception: cannot identify image file <_io.BytesIO object at 0x7fa14c6d5e40> Failed to fetch sample 1213089. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf856430> cannot identify image file <_io.BytesIO object at 0x7fbe723fcae0> Failed to fetch sample 1382222. Exception:Failed to fetch sample 2122051. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fc2eb749d50> 55321. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00724ef0> cannot identify image file <_io.BytesIO object at 0x7f5de37aac00> Failed to fetch sample 4079327. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24ed40> Failed to fetch sample 2737577. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83aea70> cannot identify image file <_io.BytesIO object at 0x7f52de877fb0> le <_io.BytesIO object at 0x7f3d146e2930> Failed to fetch sample 3431220. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3824d60> Failed to fetch sample 2658306. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071c540> i cannot identify image file <_io.BytesIO oFailed to fetch sample 36Failed to fetch sample 1186857. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24f650> Failed to fetch sample 1179835. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83af5b0> Failed to fetch sample 2149756. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcb650> Failed to fetch sample 1769963. 
Exception: cannot identify image fiFailed to fetch sample 2403018. Exception:Failed to fetch sample 2492755. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc10fba0> Failed to fetch sample 1742608. Exception: cannot identify image file <_io.BytesIO object at 0x7f109375be70> Failed to fetch sample 2479743. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0074d5d0> cannot identify image file <_io.BytesIO object at 0x7f001f926020> Failed to fetch sample 3661452. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe723ff4c0> ile <_io.BytesIO object at 0x7f58c38267f0> Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00753c90> Failed to fetch sample 1876146. Exception:Failed to fetch sample 1476796. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24f6a0> Failed to fetch sample 2089328. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e834a2a0> Failed to fetch sample 3880305. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf88d490> Failed to fetch sample 3535594. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a537060> Failed to fetch sample 3167299. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f282480> Failed to fetch sample 2801888. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5690f0b0> Failed to fetch sample 2661050. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd69b02520> Failed to fetch sample 1988585. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf88da80> Failed to fetch sample 3535594. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fe3e0> Failed to fetch sample 3531320. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f28f420> Failed to fetch sample 2970162. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a537b00> Failed to fetch sample 2086216. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f0a65c0> Failed to fetch sample 2455856. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec954c20> Failed to fetch sample 2648043. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a8a7249a0> Failed to fetch sample 2331127. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf88e480> Failed to fetch sample 3215324. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fe1c60> Failed to fetch sample 3015989. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f28c0e0> Failed to fetch sample 3892787. Exception: cannot identify image file <_io.BytesIO object at 0x7f6ac76ba200> Failed to fetch sample 2979603. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568c0ae0> Failed to fetch sample 2875666. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e834b5b0> Failed to fetch sample 1602779. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939319120> 92). Running this sequence through the model will result in indexing errors Failed to fetch sample 1377471. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe723fc0e0> Failed to fetch sample 1774192. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003db100> Failed to fetch sample 3247622. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec257560> Failed to fetch sample 3036687. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83aefc0> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24f1a0> Failed to fetch sample 3290404. Exception: cannot identify image file <_io.BytesIO object at 0x7f6abac062a0> Failed to fetch sample 2975002. 
Exception:cannot identify image file <_io.BytesIO object at 0x7fbe723eb790> cannot identify image file <_io.BytesIO object at 0x7fb687f94090> Failed to fetch sample 1322959. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc10df30> Failed to fetch sample 2625770. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c381bbf0> ailed to fetch sample 2483212. Exception:Failed to fetch sample 2915907. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b979afe20> Failed to fetch sample 3380043. Exception: cannot identify image file <_io.BytesIO object at 0x7f20e1bb46d0> Failed to fetch sample 2970162. Exception: cannot identify image file <_io.BytesIO object at 0x7f9ee1578a40> Failed to fetch sample 3840067. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb6875260> Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7feef8f52a20> Failed to fetch sample 2004558. Exception:bject at 0x7f83e83481d0> cannot identify image file <_io.BytesIO object at 0x7fe4369fcbd0> Failed to fetch sample 2801811. Exception: cannot identify image file <_io.BytesIO ocannot identify image file <_io.BytesIO object at 0x7fb52f6ac130> Failed to fetch sample 3234945. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f18590> ile <_io.BytesIO object at 0x7f357b923e20> bject at 0x7fd99f4c20c0> Failed to fetch sample 4138210. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7bb470> Failed to fetch sample 2439548. Exception: cannot identify image file <_io.BytesIO object at 0x7f53f1f122f0> Failed to fetch sample 3443509. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f10720> Failed to fetch sample 3135173. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2a11c0> Failed to fetch sample 2782431. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d732617b0> Failed to fetch sample 1267331. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b223100> Failed to fetch sample 4046281. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4268400> Failed to fetch sample 2045138. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f8cae0> Failed to fetch sample 2101533. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777dbe20> Failed to fetch sample 1015338. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9fae04db0> cannot identify image file <_io.BytesIO object at 0x7f82a5e97650> ailed to fetch sample 1878125. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420ef2e0> ailed to fetch sample 2367879. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a394bab0> Failed to fetch sample 2954477. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e0779f1f0> Failed to fetch sample 2102984. Exception: cannot identify image file <_io.BytesIO object at 0x7fd079db49f0> Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f797c90> cannot identify image file <_io.BytesIO object at 0x7f5253bee110> FFailed to fetch sample 2744255. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93b150>Failed to fetch sample 4161397. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f19b20> Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401fbb50> Failed to fetch sample 2253806. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2b5350>Failed to fetch sample 2700162. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7bb6a0> cannot identify image file <_io.BytesIO object at 0x7f22e3850db0> cannot identify image file <_io.BytesIO object at 0x7f0ecdb9fc90> Failed to fetch sample 2956200. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f8900> le <_io.BytesIO object at 0x7fd20038d350> Failed to fetch sample 1495777. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200375f80> Failed to fetch sample 2538638. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f12660> Failed to fetch sample 2764777. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a18f0> Failed to fetch sample 1715436. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd4590> Failed to fetch sample 2131338. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe71afe3e0> Failed to fetch sample 4064850. Exception: cannot identify image file <_io.BytesIO object at 0x7f5cb38b36a0> Failed to fetch sample 1355996. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200376f20> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9f3001440> Failed to fetch sample 1846292. Exception: cannot identify image file <_io.BytesIO object at 0x7fc383d85350> Failed to fetch sample 1170726. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9d8d10> Failed to fetch sample 4123070. Exception: cannot identify image file <_io.BytesIO object at 0x7f469efa3ba0> Failed to fetch sample 2363228. Exception: cannot identify image file <_io.BytesIO object at 0x7fc24c6c1ad0> Failed to fetch sample 1017215. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c381f880> Failed to fetch sample 1542067. Exception: cannot identify image file <_io.BytesIO object at 0x7f10829090d0> Failed to fetch sample 1173897. Exception: cannot identify image file <_io.BytesIO object at 0x7fcf24787fb0> Failed to fetch sample 1429596. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a6250> Failed to fetch sample 3695885. Exception: cannot identify image fFailed to fetch sample 2858679. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f8ae0> ile <_io.BytesIO object at 0x7fc0f0930860> Failed to fetch sample 1522623. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f41c10> Failed to fetch sample 1988585. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2b4db0> Failed to fetch sample 3448616. Exception: cannot identify image file <_io.BytesIO object at 0x7fa008199760> Failed to fetch sample 2892575. Exception: cannot identify image file <_io.BytesIO object at 0x7fc138cf1e40> cannot identify image file <_io.BytesIO object at 0x7fbcdc0f1ad0> ile <_io.BytesIO object at 0x7f81db5d3c90> Failed to fetch sample 1029176. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200384680> Failed to fetch sample 2394516. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a7d30> Failed to fetch sample 2811558. Exception: cannot identify image fFailed to fetch sample 2437168. Exception:FFailed to fetch sample 1424068. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249677d30> cannot identify image file <_io.BytesIO object at 0x7fb687f16ac0> Failed to fetch sample 2331127. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2b7510> Failed to fetch sample 1160161. Exception: cannot identify image file <_io.BytesIO object at 0x7fa008199120> Failed to fetch sample 2110654. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401f8360> Failed to fetch sample 3097943. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c294450> Failed to fetch sample 2424391. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4440ba340> Failed to fetch sample 2049208. Exception: cannot identify image fiFailed to fetch sample 1770342. Exception:bject at 0x7f81db567790> Failed to fetch sample 1806603. 
Failed to fetch sample 1033050. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4473708b0>
Failed to fetch sample 3507586. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f95b240>
[... the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" message repeats here, interleaved across the dataloader workers, for hundreds of further samples (e.g. 2890579, 1445442, 4230303, 2024032, 4222253, 1878125, 2455856, 2919698, 2991472); some IDs such as 1322959, 1382990, and 4161397 recur multiple times. The torn fragments produced by the concurrent writes are omitted. ...]
Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bee7240> e <_io.BytesIO object at 0x7f4beec7d4e0> FFailed to fetch sample 4161397. Exception:cannot identify image file <_io.BytesIO object at 0x7f12b3812e80> Failed to fetch sample 2931138. Exception: annot identify image file <_io.BytesIO object at 0x7f6682e5a890> cannot identify image file <_io.BytesIO object at 0x7f5a16d9e3e0> Failed to fetch sample 4063575. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5bc860> cannot identify image file <_io.BytesIO object at 0x7f5d2789dc10> ile <_io.BytesIO object at 0x7f94ac2cc590> cannot identify image file <_io.BytesIO object at 0x7ff4daba3f10> Failed to fetch sample 2374167. Exception: cannot identify image file <_io.BytesIO object at 0x7fe1fb4fbec0> le <_io.BytesIO object at 0x7f53a906a980> Failed to fetch sample 3575993. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cfce0> Failed to fetch sample 1688683. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2e4540> Failed to fetch sample 2883927. Exception: cannot identify image file <_io.BytesIO object at 0x7f655602f1f0> Failed to fetch sample 2607936. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83c8900> Failed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b39f3ba0> Failed to fetch sample 3276759. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278dc0e0> cannot identify image file <_io.BytesIO object at 0x7f81db5be390> cannot identify image file <_io.BytesIO object at 0x7fe4c42696c0> Failed to fetch sample 2522299. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f6a430> Failed to fetch sample 1382222. Exception:Failed to fetch sample 1175660. Exception:Failed to fetch sample 14Failed to fetch sample 1659412. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a904f9c0> Failed to fetch sample 4023897. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc12b150> Failed to fetch sample 1480006. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937560c0> Failed to fetch sample 1482362. Exception: cannot identify image file <_io.BytesIO object at 0x7f4c228309f0> Failed to fetch sample 994847. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38a7790> Failed to fetch sample 2969300. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b39f5cb0> cannot identify image file <_io.BytesIO object at 0x7f0c35bc2cf0>cannot identify image file <_io.BytesIO obFailed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7fb97778fba0> cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 3876873. Exception:ample 1930721. Exception: cannot identify image file <_io.BytesIO object at 0x7f6191ca6340> le <_io.BytesIO object at 0x7f0fd1a5bfb0> Failed to fetch sample 2729090. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937d4fe0> Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac213290> Failed to fetch sample 3079085. Exception: cannot identify image file <_io.BytesIO object at 0x7f051fed14e0> Failed to fetch sample 3244596. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a907af20> Failed to fetch sample 2149877. Exception: cannot identify image fiFailed to fetch sample 4026469. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560e5fd0> cannot identify image file <_io.BytesIO object at 0x7f5a239e86d0> Failed to fetch sample 3316437. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 2522299. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c5e3e0> Failed to fetch sample 43Failed to fetch sample 3494092. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2f9df0> e <_io.BytesIO object at 0x7f60ffea7790> Failed to fetch sample 3091512. Exception: cannot identify image file <_io.BytesIO object at 0x7f788d0eca90> Failed to fetch sample 2058679. Exception: cannot identify image file <_io.BytesIO object at 0x7fb97778f1f0> Failed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e3dd140e0> Failed to fetch sample 2290911. Exception: cannot identify image file <_io.BytesIO object at 0x7f001b9c65c0> Failed to fetch sample 3879860. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9ba840> Failed to fetch sample 1729796. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db542d40> Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8f0d850> Failed to fetch sample 2072870. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b4266c50> Failed to fetch sample 1470654. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdce110> Failed to fetch sample 3649257. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc118180> cannot identify image file <_io.BytesIO object at 0x7f78b3865d50> Failed to fetch sample 2679326. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bc3150> Failed to fetch sample 1021055. Exception: cannot identify image file <_io.BytesIO object at 0x7f4ebf9fac50> Failed to fetch sample 1395411. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38c2c00> Failed to fetch sample 1322959. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f73240> cannot identify image file <_io.BytesIO object at 0x7ef9de63e6b0> le <_io.BytesIO object at 0x7f357b9d23e0> Failed to fetch sample 3136267. Exception: ject at 0x7f65560e6e80> Failed to fetch sample 3110748. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f82a5ec6de0> Failed to fetch sample 1470654. Exception: cannot identify image file <_io.BytesIO object at 0x7f655602c2c0> Failed to fetch sample 2846277. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec238c20> Failed to fetch sample 2201040. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec264ea0> Failed to fetch sample 4079327. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59aed0> Failed to fetch sample 3493751. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f62e80> cannot identify image file <_io.BytesIO object at 0x7f9abce50450> Failed to fetch sample 2991472. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b42676a0> Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412ff60> cannot identify image file <_io.BytesIO object at 0x7fb9777ddb20> Failed to fetch sample 3531506. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e90d0> Failed to fetch sample 1074487. Exception: cannot identify image file <_io.BytesIO object at 0x7fb976fcd350> Failed to fetch sample 1818686. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bc0040> Failed to fetch sample 2940708. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59a3e0> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a5ec4e00> Failed to fetch sample 2951868. Exception: cannot identify image file <_io.BytesIO object at 0x7f07aac708b0> Failed to fetch sample 2826957. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3956bb0> Failed to fetch sample 984093. Exception: cannot identify image file <_io.BytesIO object at 0x7feb671d60c0> cannot identify image file <_io.BytesIO object at 0x7fb9777dc400> Failed to fetch sample 4235606. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf769260> Failed to fetch sample 2591530. Exception:Failed to fetch sample 3276750. Exception: cannot identify image fiFailed to fetch sample 1495777. Exception:Failed to fetch sample 2853475. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401e34ae0> cannot identify image file <_io.BytesIO object at 0x7f71340f3510> Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcf4c0> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7f7ddeabb330> Failed to fetch sample 3074437. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac23d800> Failed to fetch sample 2632146. Exception: cannot identify image fFailed to fetch sample 2761972. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbbd9990> Failed to fetch sample 1386094. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf77c720> Failed to fetch sample 2155552. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf759ad0> cannot identify image file <_io.BytesIO object at 0x7f0e64c33fb0> le <_io.BytesIO object at 0x7f7d7ddbdd50> Failed to fetch sample 1769963. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd68e59940> Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39d8d60> Failed to fetch sample 1556954. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa3e9d0> Failed to fetch sample 1035361. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340f36f0> Failed to fetch sample 1382990. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce50ea0> cannot identify image file <_io.BytesIO object at 0x7f21b7f8f3d0> Failed to fetch sample 3050699. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7322cef0> Failed to fetch sample 2753096. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f53920> Failed to fetch sample 1448881. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83afc90> Failed to fetch sample 1451627. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84a3a60> Failed to fetch sample 4121555. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c207100> Failed to fetch sample 4158261. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a5e6f8a90> Failed to fetch sample 4102600. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59fb50> Failed to fetch sample 1160161. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe7224a340> Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278e4a90> Failed to fetch sample 1122273. Exception: cannot identify image file <_io.BytesIO object at 0x7fa50aa3ef70> Failed to fetch sample 2045138. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7c19e0> Failed to fetch sample 2784203. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce53790> Failed to fetch sample 2940708. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278e4e00> Failed to fetch sample 1382990. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560dc040> Failed to fetch sample 1168708. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84dbd80> Failed to fetch sample 1063748. Exception: cannot identify image file <_io.BytesIO object at 0x7fc26351d490> Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003bf9c0> Failed to fetch sample 2794558. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278b6570> Failed to fetch sample 3074077. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9faaeffb0> Failed to fetch sample 2122051. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5ac400> Failed to fetch sample 1431086. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72248900> Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278e5d50> cannot identify image file <_io.BytesIO object at 0x7f58c39d9850> Failed to fetch sample 1905284. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b271850> cannot identify image file <_io.BytesIO object at 0x7efd6f7c0860> ile <_io.BytesIO object at 0x7f10ae7e8680> Failed to fetch sample 3004849. Exception: cannot identify image fiFailed to fetch sample 3557596. Exception:Failed to fetch sample 2028729. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2b4d10> Failed to fetch sample 1382990. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003bce50> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a52feffb0> cannot identify image file <_io.BytesIO object at 0x7f4beec8eb60> Failed to fetch sample 3661452. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39e9a30> Failed to fetch sample 1186857. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1a6de0> le <_io.BytesIO object at 0x7efcdbb69620> Failed to fetch sample 2624181. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f98d7b0> Failed to fetch sample 1355996. Exception: cannot identify image file <_io.BytesIO object at 0x7f9acc06bce0> Failed to fetch sample 3484950. Exception: cannot identify image file <_io.BytesIO object at 0x7f0f05f1a6b0> Failed to fetch sample 2784203. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003b76f0> Failed to fetch sample 2331127. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278b7470> Failed to fetch sample 3034526. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7bd440> Failed to fetch sample 2899629. Exception: ject at 0x7efd6f796f70> le <_io.BytesIO object at 0x7f35a7f46700> Failed to fetch sample 1476796. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc123650> Failed to fetch sample 2955270. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f947ab0> Failed to fetch sample 980211. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f090a610> FFailed to fetch sample 1277019. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcd96fc0> ailed to fetch sample 3610395. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f15f80> > annot identify image file <_io.BytesIO object at 0x7fca071cc720> Failed to fetch sample 4117657. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5aed40> Failed to fetch sample 1215483. Exception: Failed to fetch sample 1610595. Exception: ect at 0x7f58c39ebd80> Failed to fetch sample 3898287. Exception: cannot identify image file <_io.BytesIO object at 0x7feef8f52a20> Failed to fetch sample 2089328. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459c99e0> Failed to fetch sample 1871262. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb5d350> cannot identify image file <_io.BytesIO object at 0x7f1f1b25ade0> Failed to fetch sample 2368615. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200377060> Failed to fetch sample 2916193. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f73100> Failed to fetch sample 4046110. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb201120> Failed to fetch sample 1858383. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db53e7a0> Failed to fetch sample 1485531. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbeabc4bd80> cannot identify image file <_io.BytesIO object at 0x7efd6f79fe20> Failed to fetch sample 2136103. Exception: cannot identify image file <_io.BytesIO object at 0x7fb249b24c70> Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f99ae30> Failed to fetch sample 3036687. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459ca610> Failed to fetch sample 1431086. Exception: cannot identify image file <_io.BytesIO object at 0x7efa006c5fd0> cannot identify image file <_io.BytesIO object at 0x7f35a7f47a10> ailed to fetch sample 1647254. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39b5c60> Failed to fetch sample 3909071. Exception: ject at 0x7f1f1b27ede0> Failed to fetch sample 1878125. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200374fe0> Failed to fetch sample 2887391. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f71a30> Failed to fetch sample 1281096. Exception: cannot identify image file <_io.BytesIO object at 0x7f393931ab60> Failed to fetch sample 1099589. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281298a0> cannot identify image file <_io.BytesIO object at 0x7f58c39c7470> Failed to fetch sample 3886415. Exception: cannot identify image file <_io.BytesIO object at 0x7f4e85327150> Failed to fetch sample 3021215. Exception:Failed to fetch sample 2813892. Exception: cannot identify image file <_io.BytesIO object at 0x7efc53fb60c0> cannot identify image file <_io.BytesIO object at 0x7fc262b4d8a0> Failed to fetch sample 3007130. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80dc680> Failed to fetch sample 1476796. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcb290> Failed to fetch sample 3105348. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9d72983b50> Failed to fetch sample 2041223. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20037a480> Failed to fetch sample 2915676. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d5059bf60> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9fa909710> Failed to fetch sample 3194835. Exception: cannot identify image fiFailed to fetch sample 1792884. Exception:Failed to fetch sample 4137423. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f927600> Failed to fetch sample 1504520. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a810acf0> cannot identify image file <_io.BytesIO object at 0x7f3bcea1c8b0> Failed to fetch sample 3439352. Exception: cannot identify image file <_io.BytesIO object at 0x7f051343bc40> le <_io.BytesIO object at 0x7fd20035b7e0> Failed to fetch sample 1520107. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20037fd30> cannot identify image file <_io.BytesIO object at 0x7feb2fc46890> Failed to fetch sample 3876980. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb73290> le <_io.BytesIO object at 0x7fc2eb7bf970> Failed to fetch sample 1405878. Exception: cannot identify image file <_io.BytesIO object at 0x7fc26217af20> Failed to fetch sample 1382990. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3860680> Failed to fetch sample 1484787. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc65ba60> Failed to fetch sample 2040873. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9ab790> Failed to fetch sample 3542758. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253be5fd0> Failed to fetch sample 4186967. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35485a80> Failed to fetch sample 1743116. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f72660> Failed to fetch sample 2954477. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0163f8c70> cannot identify image file <_io.BytesIO object at 0x7f18e64c1e40> Failed to fetch sample 3409052. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf858e00> Failed to fetch sample 2964818. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7ca020> Failed to fetch sample 2354595. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cbfe44f0> Failed to fetch sample 1448881. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64c36a0> Failed to fetch sample 2689711. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd698873d0> Failed to fetch sample 3276759. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8f1030> Failed to fetch sample 2155552. Exception: cannot identify image fiFailed to fetch sample 3167299. Exception: cannot identify image file <_io.BytesIO object at 0x7fc262b531a0> Failed to fetch sample 2784203. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e39eda80> Failed to fetch sample 4028686. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f79d0d0> Failed to fetch sample 3463683. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b028a0ae0> cannot identify image file <_io.BytesIO object at 0x7f80a810b3d0> Failed to fetch sample 2531778. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d5059a930> Failed to fetch sample 1472913. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3724860> Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7fd9c41298f0> cannot identify image file <_io.BytesIO object at 0x7fe4c42827a0> Failed to fetch sample 2892575. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa008147fb0> Failed to fetch sample 2110654. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003ceb60> Failed to fetch sample 3648356. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a9d50> Failed to fetch sample 1561037. Exception: cannot identify image file <_io.BytesIO object at 0x7f1090972ca0> Failed to fetch sample 1414336. Exception: cannot identify image file <_io.BytesIO object at 0x7f524fa13a10> Failed to fetch sample 2080350. Exception: cannot identify image file <_io.BytesIO object at 0x7f098647aac0> Failed to fetch sample 3484950. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7c1b70> Failed to fetch sample 2158552. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf85aac0> Failed to fetch sample 2641615. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb74c680> Failed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278e67a0> cannot identify image file <_io.BytesIO object at 0x7fdbd77e8bd0> Failed to fetch sample 2855348. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a87660d10> Failed to fetch sample 3876873. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3724cc0> Failed to fetch sample 3711497. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec257830> Failed to fetch sample 4380758. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b259850> Failed to fetch sample 2123650. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecc08c310> Failed to fetch sample 3649257. Exception: cannot identify image file <_io.BytesIO object at 0x7f9e997ca1b0> Failed to fetch sample 2319498. Exception:Failed to fetch sample 3542346. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efa006c5fd0> cannot identify image file <_io.BytesIO object at 0x7f5741f0e660> Failed to fetch sample 3159206. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd459ca4d0> Failed to fetch sample 1484787. Exception:Failed to fetch sample 2719591. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7fbf006f3fb0> Failed to fetch sample 2202214. Exception: cannot identify image file <_io.BytesIO object at 0x7f9acc06bce0> 96833. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b223470> Failed to fetch sample 2358648. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecdba58f0> cannot identify image file <_io.BytesIO object at 0x7f192fed28e0> Failed to fetch sample 4222253. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7c3060> Failed to fetch sample 4077261. Exception:Failed to fetch sample 3096384. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a87678720> Failed to fetch sample 2633741. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3725580> Failed to fetch sample 2503403. Exception: cannot identify image file <_io.BytesIO object at 0x7efc53fb60c0> cannot identify image file <_io.BytesIO object at 0x7f9d73240540> Failed to fetch sample 1377471. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a9580> Failed to fetch sample 2985781. Exception:Failed to fetch sample 2729090. Exception: ject at 0x7f9a9c2993a0> cannot identify image file <_io.BytesIO object at 0x7f5d278efe20> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7c3c40> Failed to fetch sample 1213089. Exception: Failed to fetch sample 2307798. Exception: cannot identify image file <_io.BytesIO object at 0x7f001faf30b0> Failed to fetch sample 4082605. Exception: cannot identify image fFailed to fetch sample 2184978. 
[Dataloader errors from multiple ranks, interleaved in the original output; deduplicated below. The same exception recurs for hundreds of sample fetches, with IDs repeating across workers and retries.]

Failed to fetch sample 1021516. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d145bbb50>

(Identical "cannot identify image file <_io.BytesIO object at 0x...>" failures were logged for samples including: 978639, 980211, 987857, 989768, 994847, 995322, 1017215, 1019459, 1021516, 1026599, 1035361, 1038501, 1080938, 1110352, 1120039, 1133332, 1140456, 1168708, 1179835, 1186857, 1198666, 1205164, 1213089, 1250858, 1262384, 1277019, 1301230, 1308960, 1322959, 1350498, 1365649, 1382222, 1429596, 1448881, 1470654, 1491402, 1495777, 1504520, 1523937, 1555096, 1580010, 1592881, 1613094, 1628586, 1653057, 1656236, 1659412, 1680364, 1715052, 1729437, 1742099, 1743045, 1747586, 1770342, 1776844, 1796453, 1806693, 1808534, 1811271, 1814389, 1822546, 1832660, 1856298, 1865385, 1870993, 1879013, 1888864, 1988187, 1988585, 2024032, 2032480, 2045138, 2047816, 2081083, 2086216, 2089328, 2097343, 2104276, 2108438, 2116073, 2123650, 2130826, 2140828, 2149756, 2155552, 2164469, 2168694, 2183334, 2198494, 2202214, 2223460, 2231126, 2244676, 2253806, 2274389, 2331127, 2347459, 2358648, 2363197, 2363228, 2377988, 2379260, 2379671, 2388130, 2394516, 2402011, 2403018, 2439548, 2482217, 2490255, 2502743, 2503403, 2517428, 2522299, 2552488, 2573012, 2583452, 2587903, 2589457, 2591530, 2630285, 2633220, 2636021, 2645595, 2658306, 2661050, 2671128, 2679326, 2719591, 2720468, 2729090, 2744255, 2746130, 2746828, 2782398, 2783090, 2794891, 2801811, 2806356, 2827935, 2841865, 2845473, 2850749, 2887391, 2888688, 2890579, 2892575, 2898278, 2915907, 2916193, 2918234, 2931138, 2933327, 2940708, 2948132, 2972172, 2979603, 2984651, 3010687, 3015321, 3029005, 3036687, 3091512, 3096384, 3105503, 3116930, 3136267, 3159206, 3167299, 3182588, 3188707, 3194835, 3195841, 3209775, 3234463, 3244596, 3245561, 3254207, 3259174, 3259618, 3266988, 3270019, 3272055, 3276750, 3280452, 3290404, 3315883, 3350750, 3351202, 3376972, 3377211, 3414597, 3418139, 3418606, 3421855, 3484950, 3493751, 3494092, 3501745, 3523488, 3533762, 3575993, 3584846, 3596066, 3596833, 3603263, 3639895, 3645741, 3663671, 3664457, 3665980, 3666566, 3676629, 3687864, 3697432, 3700472, 3743077, 3840067, 3846458, 3849972, 3850225, 3880305, 3900769, 3928175, 4018068, 4018722, 4046281, 4070338, 4082955, 4089308, 4123070, 4125881, 4161397, 4179589, 4200419, 4200491, 4222253, 4232471, 4235606, 4243929, 4379145.)

[One tokenizer warning is also buried in the interleaved output; its beginning is truncated by the overlapping writes and is left as-is:]
...r this model (10090 > 8192). Running this sequence through the model will result in indexing errors
Exception: cannot identify image file <_io.BytesIO object at 0x7fdf306026b0> le <_io.BytesIO object at 0x7f0fd1a5bfb0> cannot identify image file <_io.BytesIO obFailed to fetch sample 1573545. Exception: cannot identify image file <_io.BytesIO object at 0x7fdaadceafc0> Failed to fetch sample 1262384. Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 2177938. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d2fe20> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8f0ea0> Failed to fetch sample 1258859. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db598ea0> Failed to fetch sample 1524618. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf76ea20> Failed to fetch sample 3658652. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8d25c0> cannot identify image file <_io.BytesIO object at 0x7efcdbb95170> Failed to fetch sample 4379145. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37e0900> FFailed to fetch sample 1688910. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13d2ea20> Failed to fetch sample 2359987. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a90e430> le <_io.BytesIO object at 0x7f7aa02049a0> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644f150> Failed to fetch sample 3105090. Exception: Failed to fetch sample 4079327. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64e0e00> Failed to fetch sample 2984651. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8f3d30> Failed to fetch sample 3523488. Exception: cannot identify image file <_io.BytesIO object at 0x7f7f01ccda80> cannot identify image file <_io.BytesIO object at 0x7fcf32b6bb00> Failed to fetch sample 1160161. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f478d0> Failed to fetch sample 4108370. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbbc8680> le <_io.BytesIO object at 0x7fc2eb7b7970> Failed to fetch sample 3228934. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf76db70> Failed to fetch sample 2827935. Exception: ject at 0x7f53a8f32ed0> le <_io.BytesIO object at 0x7f99a39531f0> Failed to fetch sample 2979603. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253a4bdd0> Failed to fetch sample 1431086. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59b150> cannot identify image file <_io.BytesIO object at 0x7f1d0d5ba610> ile <_io.BytesIO object at 0x7f6682e67830> Failed to fetch sample 1039017. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722a47c0> Failed to fetch sample 3737602. Exception: cannot identify image file <_io.BytesIO object at 0x7f55e503d6c0> Failed to fetch sample 3257443. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682f8f150> cannot identify image file <_io.BytesIO object at 0x7f5e6a1420c0> Failed to fetch sample 3349915. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf77fb00> Failed to fetch sample 3540812. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bf70b0> Failed to fetch sample 2813892. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db570c70> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7f5ead6207c0> cannot identify image file <_io.BytesIO object at 0x7f563bb949f0> le <_io.BytesIO object at 0x7f4f44202020> Failed to fetch sample 1770342. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7c59e0> cannot identify image file <_io.BytesIO object at 0x7f5de37d84f0> Failed to fetch sample 2117785. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f57420eeb60> Failed to fetch sample 3602146. Exception:F cannot identify image file <_io.BytesIO object at 0x7f0c35bcb790> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db571580> iled to fetch sample 3350750. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37d9df0>Failed to fetch sample 2335870. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de375efc0> Failed to fetch sample 1832660. Exception: ject at 0x7f1137f451c0> ailed to fetch sample 1537528. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568d6700> Failed to fetch sample 4048877. Exception: cannot identify image file <_io.BytesIO object at 0x7f566819ce50> Failed to fetch sample 2032258. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f73100> Failed to fetch sample 2801888. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986455da0> Failed to fetch sample 3559241. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a571b20> Failed to fetch sample 2587903. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8da520> Failed to fetch sample 4108370. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5695a160> Failed to fetch sample 1901368. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d732889a0> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fe7c90> Failed to fetch sample 1929498. Exception: cannot identify image file <_io.BytesIO object at 0x7f566819e8e0> cannot identify image file <_io.BytesIO object at 0x7f53a8f0e5c0> Failed to fetch sample 1086890. Exception:Failed to fetch sample 1878125. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f70950> Failed to fetch sample 1485531. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f56681ab7e0> Failed to fetch sample 3876980. Exception: cannot identify image file <_io.BytesIO object at 0x7f11fdfa6250> cannot identify image file <_io.BytesIO object at 0x7f802848ed90> Failed to fetch sample 2198494. Exception: cannot identify image file <_io.BytesIO object at 0x7f4f68e755d0> Failed to fetch sample 3369655. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f907ab0> cannot identify image file <_io.BytesIO object at 0x7f7e078abfb0> Failed to fetch sample 1302140. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb729c60> Failed to fetch sample 4073421. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1b9940> Failed to fetch sample 4123070. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8f0e700> cannot identify image file <_io.BytesIO object at 0x7efcdbb4a8e0> Failed to fetch sample 2591530. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a917a3470> Failed to fetch sample 1014165. Exception:cannot identify image file <_io.BytesIO object at 0x7f1093735b20> Failed to fetch sample 3276750. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b3852e30> Failed to fetch sample 3898287. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078d25c0> Failed to fetch sample 2556622. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f2b4c0> Failed to fetch sample 2108438. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bd5b20> Failed to fetch sample 1198363. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134137560> ile <_io.BytesIO object at 0x7ff4e3e4a6b0>cannot identify image file <_io.BytesIO object at 0x7f07a84a3100> Failed to fetch sample 1432321. Exception:Failed to fetch sample 3276759. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7d9348e0c0> cannot identify image file <_io.BytesIO object at 0x7f65560fbdd0> Failed to fetch sample 3629990. Exception: cannot identify image file <_io.BytesIO object at 0x7f3981a7ddf0> Failed to fetch sample 3392928. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfaa47cae0> Failed to fetch sample 1097504. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b2110> Failed to fetch sample 3578223. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd7290> Failed to fetch sample 2970162. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab9ad90> Failed to fetch sample 3880305. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84ee750> Failed to fetch sample 4121555. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84cd8a0> Failed to fetch sample 3584846. Exception: cannot identify image file <_io.BytesIO object at 0x7f65a98efec0> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c8d440> Failed to fetch sample 3662436. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b32e0> Failed to fetch sample 1988585. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8326f70> Failed to fetch sample 1057407. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab776f0> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556028d10> Failed to fetch sample 2783090. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cd63e0> Failed to fetch sample 2051646. Exception: cannot identify image file <_io.BytesIO object at 0x7f641a71ec00> Failed to fetch sample 2931566. Exception: cannot identify image file <_io.BytesIO object at 0x7fc7159420c0> Failed to fetch sample 2331127. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8327dd0> Failed to fetch sample 2654468. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c187330> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ee4d0> Failed to fetch sample 2282533. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec279bc0> Failed to fetch sample 3898412. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249487dd0> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7f3981a7dd50> Failed to fetch sample 1019459. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556031940> Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7f61be079080> Failed to fetch sample 2888688. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c9dd00> Failed to fetch sample 1053705. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556065120> Failed to fetch sample 3924877. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249522f70> Failed to fetch sample 4063550. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249490220> Failed to fetch sample 1446601. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1c6750> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ecc20> Failed to fetch sample 1415232. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83c8270> Failed to fetch sample 3379902. Exception: cannot identify image file <_io.BytesIO object at 0x7ff248d6e7a0> Failed to fetch sample 1905407. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c206d90> Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560f90d0> Failed to fetch sample 1203134. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f65560649f0> cannot identify image file <_io.BytesIO object at 0x7f2775486340> Failed to fetch sample 2858679. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8f0ecf0> cannot identify image file <_io.BytesIO object at 0x7f5253bdd8a0> le <_io.BytesIO object at 0x7ff249680450> Failed to fetch sample 3162454. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1d94e0> cannot identify image file <_io.BytesIO object at 0x7f1f1b28b4c0> Failed to fetch sample 1682209. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668180950> Failed to fetch sample 2388130. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8eae020> Failed to fetch sample 1232641. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340ff4c0> Failed to fetch sample 1133124. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c380a070> ailed to fetch sample 2250055. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb72b3d0> Failed to fetch sample 1504520. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278af650> Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754aba10> Failed to fetch sample 2988003. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e5df0> cannot identify image file <_io.BytesIO object at 0x7f5668113290> Failed to fetch sample 3350750. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d72980680> Failed to fetch sample 2482217. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b288ef0> Failed to fetch sample 1010081. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a59a7f0> cannot identify image file <_io.BytesIO object at 0x7fbe723ff2e0> le <_io.BytesIO object at 0x7f35a7fff600> Failed to fetch sample 1573545. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38bc1d0> Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x7f7ec268ad90> Failed to fetch sample 1614750. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9951c0> Failed to fetch sample 1198666. Exception: Failed to fetch sample 1536214. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1f1d00> Failed to fetch sample 3330045. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08fd9e0> Failed to fetch sample 2940708. Exception: cannot identify image file <_io.BytesIO object at 0x7f2775478400> Failed to fetch sample cannot identify image file <_io.BytesIO object at 0x7f5668182340> Failed to fetch sample 1711540. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7322cef0> ailed to fetch sampFailed to fetch sample 2988390. Exception: cannot identify image file <_io.BytesIO objectFailed to fetch sampFailed to fetch sample 1232641. Exception:bject at 0x7f5253bdddf0> le <_io.BytesIO object at 0x7f1f1b289d00> le 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078abec0> Failed to fetch sample 1885051. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f2a020> Failed to fetch sample 2988390. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1c7c40> Failed to fetch sample 1179835. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb7650> Failed to fetch sample 2790481. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83afbf0> cannot identify image file <_io.BytesIO object at 0x7f27754d7560> le <_io.BytesIO object at 0x7f9d7325d4e0> Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b289670> Failed to fetch sample 3523488. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37c9530> Failed to fetch sample 1168708. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f75bb0677e0> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7f13dc64ca90> Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e07709760> Failed to fetch sample 2459673. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1c7ba0> Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7ff248d6ddf0> Failed to fetch sample 4123070. Exception: cannot identify image file <_io.BytesIO object at 0x7f08282d22f0> Failed to fetch sample 4380313. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668113330> cannot identify image file <_io.BytesIO object at 0x7f7d7dc28310> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b2f7d4c20> Failed to fetch sample 3275360. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0071bec0> Failed to fetch sample 3662436. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebd6890> Failed to fetch sample 2158552. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7325e660> Failed to fetch sample 2583452. Exception: cannot identify image file <_io.BytesIO object at 0x7f4f769dad90> Failed to fetch sample 1745884. Exception: cannot identify image file <_io.BytesIO object at 0x7f96cd9a50d0> Failed to fetch sample 2503403. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c389ba60> Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078d2c00> Failed to fetch sample 1160161. Exception: cannot identify image file <_io.BytesIO object at 0x7ff3322a56c0> Failed to fetch sample 1412473. Exception: cannot identify image fiFailed to fetch sample 1806603. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a9069300> Failed to fetch sample 3379902. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc252ac0> le <_io.BytesIO object at 0x7f27754b3f10> Failed to fetch sample 4046281. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08d7fb0> Failed to fetch sample 1021516. Exception: cannot identify image file <_io.BytesIO object at 0x7f55e503d6c0> ailed to fetch sample 2503403. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9a9490> cannot identify image file <_io.BytesIO object at 0x7fd07a59b790> le <_io.BytesIO object at 0x7f08280e84a0> Failed to fetch sample 3560856. Exception: cannot identify image file <_io.BytesIO object at 0x7f655602cd60> Failed to fetch sample 3036687. Exception: cannot identify image file <_io.BytesIO object at 0x7fa28409e750> Failed to fetch sample 2827935. Exception: cannot identify image file <_io.BytesIO object at 0x7f84f700c860> Failed to fetch sample 1638542. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8399f80> cannot identify image file <_io.BytesIO object at 0x7f9abcdcfba0> Failed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8f0f650> le <_io.BytesIO object at 0x7f58c39d9710> Failed to fetch sample 2562281. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec02cf0> Failed to fetch sample 3356617. Exception: cannot identify image file <_io.BytesIO object at 0x7ff420f57650> Failed to fetch sample 2903629. Exception:ject at 0x7f21b088b5b0> Failed to fetch sample 2436722. Exception: cannot identify image file <_io.BytesIO object at 0x7f566810d210> Failed to fetch sample 3581003. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac239210> Failed to fetch sample 1490386. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093794540> Failed to fetch sample 3876873. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39dab10> Failed to fetch sample 1858166. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f4c3013d850> Failed to fetch sample 1865385. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0073d530> Failed to fetch sample 3167299. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7be7240> Failed to fetch sample 1742099. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f97ab60> cannot identify image file <_io.BytesIO object at 0x7f22304b5080> Failed to fetch sample 1019459. Exception: cannot identify image file <_io.BytesIO object at 0x7f566812f4c0> Failed to fetch sample 3013561. Exception: cannot identify image file <_io.BytesIO object at 0x7f10af1aa430> Failed to fetch sample 1377471. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b289b20> ile <_io.BytesIO object at 0x7f5e6a977100> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4015da520> Failed to fetch sample 3557596. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37c05e0> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7f27c1653fb0> cannot identify image file <_io.BytesIO object at 0x7fbe7223a9d0> Failed to fetch sample 2856507. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c4502e070> le <_io.BytesIO object at 0x7fdfaa47cdb0> Failed to fetch sample 1140456. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401dc48b0> Failed to fetch sample 3523488. Exception:cannot identify image file <_io.BytesIO object at 0x7f53a9068f40> Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fffc40> Failed to fetch sample 2439548. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb4270> Failed to fetch sample 1240959. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b9b38d0> Failed to fetch sample 2089328. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f10937972e0> 92). Running this sequence through the model will result in indexing errors Failed to fetch sample 2783090. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754d5210> Failed to fetch sample 2919698. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b088a250> Failed to fetch sample 3176991. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f93790> Failed to fetch sample 2671622. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cf290> cannot identify image file <_io.BytesIO object at 0x7f4450b68f90> Failed to fetch sample 1021516. Exception: Failed to fetch sample 2813892. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249674ae0> Failed to fetch sample 1806603. Exception: cannot identify image fFailed to fetch sample 1988187. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b281df0> Failed to fetch sample 2089328. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3757920> Failed to fetch sample 3892787. Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7be5fd0> Failed to fetch sample 3649257. Exception:Failed to fetch sample 3418606. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b28a1b0> Failed to fetch sample 2806356. Exception: cannot identify image file <_io.BytesIO object at 0x7fb83fc0cae0> Failed to fetch sample 3876873. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c286bb0> cannot identify image file <_io.BytesIO object at 0x7f5e6a8e3ab0> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4021de2a0> Failed to fetch sample 3159206. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39eb970> Failed to fetch sample 1561037. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f6d210> Failed to fetch sample 3297609. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5c9ed1c130> Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7ff40166a930> Failed to fetch sample 1796453. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a9767a0> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278b0860> Failed to fetch sample 2783090. Exception: cannot identify image file <_io.BytesIO object at 0x7ff697def240> Failed to fetch sample 2783090. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278fcb30> Failed to fetch sample 3878544. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2cb060> Failed to fetch sample 4375328. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c295c10> Failed to fetch sample 3494092. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687fb4540> Failed to fetch sample 1135347. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39d85e0> Failed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280ea6b0> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401669760> cannot identify image file <_io.BytesIO object at 0x7f47278ea1b0> Failed to fetch sample 2158552. Exception: cannot identify image file <_io.BytesIO object at 0x7fc107a84540> Failed to fetch sample 1934945. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2fab60> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcfb00> Failed to fetch sample 1262384. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f98c20> Failed to fetch sample 4142196. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4260680> Failed to fetch sample 2646277. 
Failed to fetch sample 2633741. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fca6b0>
Failed to fetch sample 3276759. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2876a0>
Failed to fetch sample 2358648. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2ba840>
[... the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error repeats for several hundred sample ids; output from concurrent data-loader workers is interleaved and partially truncated, and some ids (e.g. 978639, 2931138, 1523937, 4375328) recur across workers ...]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[... the checkpoint.py:87 UserWarning above is emitted repeatedly, once per rank/worker ...]
Token indices sequence length is longer than the specified maximum sequence length for this model (10090 > 8192). Running this sequence through the model will result in indexing errors
Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2573d0> Failed to fetch sample 1448134. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b0956c0> cannot identify image file <_io.BytesIO object at 0x7f5a239acf40> Failed to fetch sample 2861238. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec8e930> Failed to fetch sample 980211. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078afb50> cannot identify image file <_io.BytesIO object at 0x7f35a7f575b0> Failed to fetch sample 1482362. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a906b420> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac25d8f0> Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cde9d0> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190be520> Failed to fetch sample 3380043. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078f5620> cannot identify image file <_io.BytesIO object at 0x7fc0f0932cf0> Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebde7a0> Failed to fetch sample 3892787. Exception: cannot identify image file <_io.BytesIO object at 0x7f4c25fba4d0> cannot identify image file <_io.BytesIO object at 0x7f47278f6610> Failed to fetch sample 4125296. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec21a7a0> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7f655602a570> Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7fc715940fe0> FailedFailed to fetch sample 4123070. Exception:bject at 0x7efcdbbc6020> Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a239e8680> Failed to fetch sample 1740611. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f093c720> aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 1382990. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f1e3e0> Failed to fetch sample 3112271. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754abce0> heckpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 2607936. Exception: cannot identify image file <_io.BytesIO object at 0x7f22304b5080> Failed to fetch sample 4143625. Exception: cannot identify image file <_io.BytesIO object at 0x7f52cc0affb0> cannot identify image file <_io.BytesIO object at 0x7f001f9e3ec0> Failed to fetch sample 3900769. Exception:Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a239e9620> Failed to fetch sample 3149661. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fe2de0> Failed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275ce46d0> FailedFailed to fetch sample 2705441. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e4ffb0> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682e8Failed to fetch sample 1429596. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a25490> Failed to fetch sample 2045138. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f7830> Failed to fetch sample 1218899. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebd69d0> Failed to fetch sample 2481812. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cdd6c6d0> Failed to fetch sample 2024032. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f52b0> Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec255b20> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7fb39f275a30> /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grFailed to fetch sample 3629990. Exception: cannotFailed to fetch sample 2394516. Exception: ject at 0x7f35a7f/mnt/petrelfs/liuzhaoyang/workspace/programs/miniFailed to fetch sample 3314720. Exception: cannot identify image file <_io.BytesIO object at 0x7f4ebf8d35b0> Failed to fetch sample 2861238. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dc7ba0> cannot identify image file <_io.BytesIO object at 0x7fe436a166b0> Failed to fetch sample 1495777. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f5f80> Failed to fetch sample 3162454. Exception: Failed to fetch sample 2168694. Exception: ect at 0x7fe436a09d00> Failed to fetch sample 2188545. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec245c60> Failed to fetch sample 3297609. Exception: cannot identify image fiFailed to fetch sample 3710579. Exception:Failed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f977600> cannot identify image file <_io.BytesIO object at 0x7f6682e4fdd0> Failed to fetch sample 1965614. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754d2070> Failed to fetch sample 2691333. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fca40> Failed to fetch sample 3467927. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2b5990> Failed to fetch sample 1637223. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f77b1a0> Failed to fetch sample 3484950. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a15350> Failed to fetch sample 1355996. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f5c60> Failed to fetch sample 2988390. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cd371260> Failed to fetch sample 2448314. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec21bc90> Failed to fetch sample 2563789. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2743b0> Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7fa36444a110> Failed to fetch sample 4243929. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f7880> cannot identify image file <_io.BytesIO object at 0x7f32986bbf60> Failed to fetch sample 1960976. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9ce8d6c0> le <_io.BytesIO object at 0x7f4beec00d60> Failed to fetch sample 2720468. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fd080> Failed to fetch sample 4222253. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a258f0> Failed to fetch sample 2880836. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a91feffb0> Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2dc9a0> Failed to fetch sample 2503403. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2a50d0> Failed to fetch sample 1017215. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b1800> Failed to fetch sample 3345102. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec274220> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275ce5120> cannot identify image file <_io.BytesIO object at 0x7fcbdcdb5710> Failed to fetch sample 2563789. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fca9ce8e570> ile <_io.BytesIO object at 0x7fc9b7f0e6b0> Failed to fetch sample 1213347. Exception: cannot identify image fFailed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb4680> Failed to fetch sample 1168708. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560643b0> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c9b830> Failed to fetch sample 2439548. Exception:Failed to fetch sample 2446264. Exception: ject at 0x7f0c35bc1620> Failed to fetch sample 1179835. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f91c680> Failed to fetch sample 1301230. Exception: cannot identify image fiFailed to fetch sample 1597034. Exception:ject at 0x7f5668177790> Failed to fetch sample 2482217. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f2dc10> Failed to fetch sample 1416747. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c272d90> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 1Failed to fetch sam cannot identify image file <_io.BytesIO object at 0x7fcbdc1ca8e0> Failed to fetch sample 3345102. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9bacc180> ect at 0x7f7a56ca68e0> Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f2f920> Failed to fetch sample 2845302. Exception: cannot identify image fiFailed to fetch sample 1349339. Exception: cannot identify image file <_io.BytesIO object at 0x7f23ebead120> Failed to fetch sample 2898272. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e383d4e0> Failed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278c57b0> Failed to fetch sample 2040208. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cdd850> Failed to fetch sample 1035361. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9eff99ad0> Failed to fetch sample 3435401. Exception: ject at 0x7f1137f2f240> Failed to fetch sample 1770342. Exception: cannot identify image file <_io.BytesIO object at 0x7fa008b48ef0> Failed to fetch sample 2110654. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f2be70> Failed to fetch sample 2538638. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fc680> Failed to fetch sample 1448881. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb5170> Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c98a90> le <_io.BytesIO object at 0x7f7a569493f0> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f24f740> Failed to fetch sample 2108837. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a17790> cannot identify image file <_io.BytesIO object at 0x7f5d27882430> Failed to fetch sample 1885051. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9ce8d490> cannot identify image file <_io.BytesIO object at 0x7f7ef4f5fb50> Failed to fetch sample 4108370. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5436f0> Failed to fetch sample 2024032. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a60a700> Failed to fetch sample 2916193. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e651d490> cannot identify image file <_io.BytesIO object at 0x7f4beec7d210> Failed to fetch sample 2787073. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec277b50> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b8770> Failed to fetch sample 2858679. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7a569486d0> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c9a1b0> Failed to fetch sample 3211987. Exception:bject at 0x7fcbdc1baa70> le <_io.BytesIO object at 0x7f81540162f0> Failed to fetch sample 3523488. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5844f0> Failed to fetch sample 3159206. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f91d4e0> Failed to fetch sample 1262384. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278b3b00> cannot identify image file <_io.BytesIO object at 0x7f4beebd8d10> Failed to fetch sample 3880305. Exception: cannot identify image fiFailed to fetch sample 2723001. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1ba2f0> le <_io.BytesIO object at 0x7fcbdc1ee3e0> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7f5cae261300> cannot identify image file <_io.BytesIO object at 0x7f1137f2ecf0> Failed to fetch sample 3880305. Exception: cannot identify image file <_io.BytesIO object at 0x7fa0081994e0> Failed to fetch sample 3015321. Exception: cannot identify image file <_io.BytesIO object at 0x7f7faa79dbc0> cannot identify image file <_io.BytesIO object at 0x7f4beec2e7f0> Failed to fetch sample 1250858. Exception: cannot identify image file <_io.BytesIO object at 0x7f4c161e4b30> Failed to fetch sample 3223985. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134137c90> Failed to fetch sample 3732379. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f91df80> Failed to fetch sample 1988585. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937e1cb0> Failed to fetch sample 3004849. Exception: Failed to fetch sample 1865385. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5ad260> cannot identify image file <_io.BytesIO object at 0x7f4bfe4dbfb0> Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d72980770> Failed to fetch sample 3840067. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20039aa20> Failed to fetch sample 2984651. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f5fd0> Failed to fetch sample 3228934. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a91720130> Failed to fetch sample 2105220. Exception: cannot identify image file <_io.BytesIO object at 0x7f96d6a089a0> Failed to fetch sample 1879967. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec265990> Failed to fetch sample 3463683. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb6430> Failed to fetch sample 3414597. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fd580> Failed to fetch sample 1327539. Exception: cannot identify image fFailed to fetch sample 2190327. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7c03b0> Failed to fetch sample 2331127. Exception: cannot identify image file <_io.BytesIO object at 0x7fa008b4a5c0> ile <_io.BytesIO object at 0x7f3939c982c0> Failed to fetch sample 1171510. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb4590> cannot identify image file <_io.BytesIO object at 0x7f81db5ae070> Failed to fetch sample 2013175. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a3fc93fb0> Failed to fetch sample 2417768. Exception: ailed to fetch sample 1431086. Exception:bject at 0x7f5253bfa7a0> Failed to fetch sample 2363228. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a16d40> Failed to fetch sample 3249441. Exception: cannot identify image fiFailed to fetch sample 1694230. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb73fb00> le <_io.BytesIO object at 0x7f18e64f79c0> Failed to fetch sample 4243929. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a573a60> Failed to fetch sample 1218899. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d6d650a40> Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7f6ac0cde9d0> Failed to fetch sample 2108837. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c56f600> Failed to fetch sample 1558498. Exception:Failed to fetch sample 3863728. Exception: cannot identify image file <_io.BytesIO object at 0x7f192fed28e0>cannot identify image file <_io.BytesIO object at 0x7efd6f7fecf0> Failed to fetch sample 2329563. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc256293f0> Failed to fetch sample 1806603. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568c71a0> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f24dfd0> Failed to fetch sample 2363461. Exception: cannot identify image file <_io.BytesIO object at 0x7f191dbaf240> Failed to fetch sample 1113189. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64c0770> Failed to fetch sample 2550257. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc256288b0> Failed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5696c360> Failed to fetch sample 3687864. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7db740> Failed to fetch sample 1697346. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25628270> Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f24d580> Failed to fetch sample 3180383. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c573740> Failed to fetch sample 2979749. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf854310> cannot identify image file <_io.BytesIO object at 0x7f94ac2f9300> Failed to fetch sample 2799267. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82d2890> Failed to fetch sample 3355968. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99b840680> e <_io.BytesIO object at 0x7f7cebf7f600> Failed to fetch sample 2142902. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8f0d10> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f236ac0> Failed to fetch sample 1019459. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278cd4e0> Failed to fetch sample 1017215. Exception: Failed to fetch sample 2767646. Exception: cannot identify image fcannot identify image file <_io.BytesIO object at 0x7f7b9beda2f0> ile <_io.BytesIO object at 0x7efd6f7da610> Failed to fetch sample 2813892. Exception: cannot identify image file <_io.BytesIO object at 0x7f27bdaaa200> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7f995aaf13f0> Failed to fetch sample 3380043. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37a84f0> Failed to fetch sample 3633396. Exception: cannot identify image file <_io.BytesIO object at 0x7f191df17d30> Failed to fetch sample 1858166. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16dad990> Failed to fetch sample 3850311. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986449b20> Failed to fetch sample 4376367. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556064720> Failed to fetch sample 3605195. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937d19e0> Failed to fetch sample 4057666. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fd8f90> Failed to fetch sample 1686316. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0f3510> Failed to fetch sample 1339465. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093789670> Failed to fetch sample 4045090. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a396b7e0> cannot identify image file <_io.BytesIO object at 0x7fc384737dd0> Failed to fetch sample 3876980. Exception: cannot identify image file <_io.BytesIO object at 0x7f2a8767b5b0> Failed to fetch sample 3021215. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281144f0>Failed to fetch sample 1551481. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986434cc0> Failed to fetch sample 1884451. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369f4f40> ile <_io.BytesIO object at 0x7f58c39f2ac0> Failed to fetch sample 1319642. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f734c0> Failed to fetch sample 3349884. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a392db70> Failed to fetch sample 2170029. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb68779c0> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c5731f0> Failed to fetch sample 1410411. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25fd87c0> Failed to fetch sample 3002806. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc146c50> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0f3e20> Failed to fetch sample 4133167. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bdf8d0> Failed to fetch sample 1653057. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16daf010> Failed to fetch sample 3687864. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f71341340e0> Failed to fetch sample 2078887. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f980680> cannot identify image file <_io.BytesIO object at 0x7f583c6d93a0> ile <_io.BytesIO object at 0x7fd07a537b00> Failed to fetch sample 2681494. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369d7c90> Failed to fetch sample 1035361. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369f63e0> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a7e0f44a0> cannot identify image file <_io.BytesIO object at 0x7f001f94ec50> Failed to fetch sample 3714933. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16e29030> le <_io.BytesIO object at 0x7f9a9c2cb5b0> ailed to fetch sample 3693861. Exception: ject at 0x7f5e6a95eac0> Failed to fetch sample 4117437. Exception: cannot identify image file <_io.BytesIO object at 0x7ff697ded4e0> Failed to fetch sample 3686234. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401fbbc40> Failed to fetch sample 1011675. Exception:cannot identify image file <_io.BytesIO object at 0x7fd200385350> Failed to fetch sample 2970780. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a7880> Failed to fetch sample 1597034. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42b3a10> FFailed to fetch sample 2389460. Exception:bject at 0x7f001faeef70> le <_io.BytesIO object at 0x7fbcdc122cf0>Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986437ab0> FFailed to fetch sample 1770342. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39fa6b0> Failed to fetch sample 1350131. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b384f9c0> Failed to fetch sample 2738042. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9df060>Failed to fetch sample 1018079. Exception: cannot identify image file <_io.BytesIO object at 0x7f95bc525620> cannot identify image file <_io.BytesIO object at 0x7f08280eb1f0>Failed to fetch sample 2373755. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3746520> Failed to fetch sample 2595972. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754ca660> FFailed to fetch sample 4079452. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcfba0>Failed to fetch sample 3278957. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568ffe70> Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986436ed0> Failed to fetch sample 4186967. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce56660> Failed to fetch sample 2363228. Exception: cannot identify image file <_io.BytesIO object at 0x7f71f8288db0> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b38a6f70> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f1bfc3a4270> Failed to fetch sample 3557219. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a561490> cannot identify image file <_io.BytesIO object at 0x7f001f905530> ile <_io.BytesIO object at 0x7efe28db7330> Failed to fetch sample 4375328. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d1e110> Failed to fetch sample 3879422. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bee63e0> Failed to fetch sample 2101564. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568fd5d0> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c77d80> Failed to fetch sample 1864192. 
Failed to fetch sample 2801888. Exception: cannot identify image file <_io.BytesIO object at 0x7f936bd2cc70>
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Failed to fetch sample 3328821. Exception: cannot identify image file <_io.BytesIO object at 0x7f69190c1c60>
Failed to fetch sample 3571975. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1c4680>
Failed to fetch sample 3704146. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a239acf40>
Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37db790>
Failed to fetch sample 3245561. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f970860>
Failed to fetch sample 3728803. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401dc75b0>
Failed to fetch sample 2987417. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f571f0>
Failed to fetch sample 2149756. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9dd0d0>
[... the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error repeats for several hundred further sample IDs, interleaved across ranks (some IDs, e.g. 3228934, 2984651, 1988585, recur); the torch.utils.checkpoint UserWarning above also repeats on each rank ...]
Token indices sequence length is longer than the specified maximum sequence length for this model (10994 > 8192). Running this sequence through the model will result in indexing errors
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a27100> Failed to fetch sample 2654468. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fd530> ile <_io.BytesIO object at 0x7fb9777bed40> Failed to fetch sample 4044348. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278fc0e0> Failed to fetch sample 3228934. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7bc040> cannot identify image file <_io.BytesIO object at 0x7fb52f65ec00> ile <_io.BytesIO object at 0x7f5d278fbdd0> cannot identify image file <_io.BytesIO object at 0x7f53a876bba0> Failed to fetch sample 2851854. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1d0900> Failed to fetch sample 3670803. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39d3880> Failed to fetch sample 1292874. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9be430> le <_io.BytesIO object at 0x7f5e7a25fb00> cannot identify image file <_io.BytesIO object at 0x7f18e5d0a020> Failed to fetch sample 3856361. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a558860> e <_io.BytesIO object at 0x7f83e8395a80> Failed to fetch sample 2482217. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828109080> Failed to fetch sample 3349915. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7ea6b0> Failed to fetch sample 3592274. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a60acf0> Failed to fetch sample 3349915. Exception: cannot identify image file <_io.BytesIO object at 0x7fb693f1bfb0> Failed to fetch sample 2272798. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a239af2e0> Failed to fetch sample 2783531. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08ef6f0> Failed to fetch sample 3696851. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37e25c0> Failed to fetch sample 3351202. Exception: cannot identify image file <_io.BytesIO object at 0x7fb6f2b3d300> Failed to fetch sample 2525780. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4eeec7560> Failed to fetch sample 2679326. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d27783c90> Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb74fdd0> cannot identify image file <_io.BytesIO object at 0x7f18e5d144a0> ailed to fetch sample 2330574. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c2122a0> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8349fd0> Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f5fd0> cannot identify image file <_io.BytesIO object at 0x7ff401fb4fe0> Failed to fetch sample 3850470. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3862e80> Failed to fetch sample 3467927. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7f566811e5c0> le <_io.BytesIO object at 0x7f7c11f91760> ile <_io.BytesIO object at 0x7f7b9b1da520> Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7fc262b53e20> Failed to fetch sample 3115338. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e070fb600> Failed to fetch sample 2389804. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682de9f80> Failed to fetch sample 3588244. Exception:Failed to fetch sample 3892787. Exception:bject at 0x7f7b9b1e5210> Failed to fetch sample 3648356. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb749760> cannot identify image file <_io.BytesIO object at 0x7f65560643b0> Failed to fetch sample 3876980. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f80201b3150> Failed to fetch sample 3856361. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c213d80> Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7f082810b880> Failed to fetch sample 2479743. Exception: cannot identify image file <_io.BytesIO object at 0x7ff697defba0> Failed to fetch sample 3290404. Exception: cannot identify image file <_io.BytesIO object at 0x7f566813e570> cannot identify image file <_io.BytesIO obFailed to fetch sample 2979749. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b9f9d00> ject at 0x7f6682dc6de0> Failed to fetch sample 3624884. Exception:F cannot identify image file <_io.BytesIO object at 0x7f5de37db060> Failed to fetch sample 1012319. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754d01d0> Failed to fetch sample 4099964. ExceptionFailed to fetch sample 1659475. Exception:ject at 0x7f81d6d889a0> > Failed to fetch sample 4186967. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556065170> cannot identify image file <_io.BytesIO oFailed to fetch sample 36Failed to fetch sample 3299243. Exception:le <_io.BytesIO object at 0x7f0828109530> Failed to fetch sample 3191444. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fd350> cannot identify image file <_io.BytesIO object at 0x7fc2eb749490> Failed to fetch sample 1687187. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7fc108456d40> le <_io.BytesIO object at 0x7fe0ec2511c0> Failed to fetch sample 3333588. Exception: Failed to fetch sample 2919698. Exception:ject at 0x7f83e8316a20> Failed to fetch sample 1561037. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556066e30> le <_io.BytesIO object at 0x7f0ecdb9c4f0> cannot identify image file <_io.BytesIO object at 0x7f53a90809f0> Failed to fetch sample 1480960. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f91620> cannot identify image file <_io.BytesIO object at 0x7ff84cc107c0> Failed to fetch sample 3418606. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb74d490> Failed to fetch sample 2130826. Exception: cannot identify image file <_io.BytesIO object at 0x7f08282cec50> Failed to fetch sample 2363461. Exception: cannot identify image file <_io.BytesIO object at 0x7f82a729ede0> Failed to fetch sample 4077261. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2d6520> Failed to fetch sample 2954477. Exception: cannot identify image file <_io.BytesIO object at 0x7fc108457560> Failed to fetch sample 2072417. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f6d90> Failed to fetch sample 1100749. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754d2e80> Failed to fetch sample 1424068. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bc1350> Failed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc635850> Failed to fetch sample 4099461. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd4e7bc7c0> Failed to fetch sample 1819980. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f970efa1bc0> Failed to fetch sample 1021516. Exception: cannot identify image file <_io.BytesIO object at 0x7f1133fd2430> Failed to fetch sample 1207977. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 1926880. Exception:ample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7f56681d0a40> Failed to fetch sample 2953992. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412eac0> Failed to fetch sample 1277019. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fd030> Failed to fetch sample 2358648. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f65560e6e30>Failed to fetch sample 4161397. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac26f740> Failed to fetch sample 2633220. Exception: cannot identify image fiFailed to fetch sample 1749153. Exception:Failed to fetch sample 3116930. Exception: cannot identify image ficannot identify image file <_io.BytesIO obFailed to fetch sample 2258523. Exception:bject at 0x7f9abc65a0c0> Failed to fetch sample 3134464. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc10f970> Failed to fetch sample 3207078. Exception: cannot identify image fFailed to fetch sample 2898734. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec21b650>Failed to fetch sample 1377471. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828108db0> Failed to fetch sample 3380043. Exception: cannot identify image file <_io.BytesIO object at 0x7f85160e5530> Failed to fetch sample 3633396. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e834e6b0> Failed to fetch sample 1858166. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2d4630> Failed to fetch sample 1329424. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc636250> Failed to fetch sample 1186857. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280ff010> Failed to fetch sample 2583452. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560e5b70> Failed to fetch sample 3523488. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd32e0> Failed to fetch sample 2535152. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b9384f0> Failed to fetch sample 1213089. Exception: cannot identify image file <_io.BytesIO object at 0x7fb6877e1940> Failed to fetch sample 2633220. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42a9760> Failed to fetch sample 1774890. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f21b1243240> Failed to fetch sample 3431352. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a60b880> Failed to fetch sample 1759595. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f6cd60> Failed to fetch sample 3480772. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf856c50> Failed to fetch sample 3868582. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bc3240> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c5e610> Failed to fetch sample 3634589. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39404f0> ailed to fetch sample 2801888. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec255b20> cannot identify image file <_io.BytesIO object at 0x7fdd459d6430> le <_io.BytesIO object at 0x7fb6877dbc90> Failed to fetch sample 3674200. Exception: cannot identify image file <_io.BytesIO object at 0x7f5525ebcd10> Failed to fetch sample 4173587. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f02520> Failed to fetch sample 2790481. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bc1bc0> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bb7970> Failed to fetch sample 1580010. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fc4f0> Failed to fetch sample 1614757. Exception: cannot identify image fFailed to fetch sample 1988585. Exception: cannot identify image file <_io.BytesIO object at 0x7fb6877e2250> Failed to fetch sample 2503403. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401e1120> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7f5525ebc720> ile <_io.BytesIO object at 0x7fd07a53a2a0> Failed to fetch sample 2723001. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7bbab0> Failed to fetch sample 1976843. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a24c70> Failed to fetch sample 1029176. Exception: cannot identify image fFFailed to fetch sample 2827935. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb2007c0> cannot identify image file <_io.BytesIO object at 0x7fc0f08fd940> e <_io.BytesIO object at 0x7f83e84ebf10> Failed to fetch sample 1865385. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd2b10> Failed to fetch sample 2455856. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec257830> cannot identify image file <_io.BytesIO object at 0x7fb5fefe3f60> Failed to fetch sample 2979603. Exception: cannot identify image file <_io.BytesIO object at 0x7f56681dab10> le <_io.BytesIO object at 0x7fa50aa2cea0> Failed to fetch sample 1282083. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8eaa20> Failed to fetch sample 3578223. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644b060> Failed to fetch sample 1659475. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5718a0> Failed to fetch sample 1414336. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f90fe0> Failed to fetch sample 3188707. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f24ffb0> Failed to fetch sample 3576443. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc6344f0> cannot identify image file <_io.BytesIO object at 0x7fe0ec2537e0> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7fa36444a020> cannot identify image file <_io.BytesIO object at 0x7fd9d68c1d00> Failed to fetch sample 3345102. Exception: cannot identify image file <_io.BytesIO object at 0x7efce5b85940> Failed to fetch sample 4084399. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3963970> Failed to fetch sample 2584050. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278b6cf0> Failed to fetch sample 1377471. Exception: cannot identify image fFailed to fetch sample 3418606. Exception:Failed to fetch sample 3546995. Exception: cannot identify image filFailed to fetch sample 2565811. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d27882c50> Failed to fetch sample 4116468. Exception:Failed to fetch sample 1322959. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f051fed2b10> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e834dcb0> Failed to fetch sample 1523937. Exception: cannot identify image file <_io.BytesIO object at 0x7fa69c531c60> Failed to fetch sample 1123204. Exception: cannot identify image file <_io.BytesIO Failed to fetch sample 2465952. Exception: cannot identify image fiFailed to fetch sample 2040208. Exception: Failed to fetch sample 4073917. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9f9580> Failed to fetch sample 2761972. Exception: cannot identify image fFailed to fetch sample 3467927. Exception: Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO object at 0x7f44f5ebdc10> Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59b8d0> cannot identify image file <_io.BytesIO object at 0x7f7b9b1e4860> le <_io.BytesIO object at 0x7f5de37ebab0> Failed to fetch sample 1301230. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682d27a0> Failed to fetch sample 3582098. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c2b010> Failed to fetch sample 3376972. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401ce980> Failed to fetch sample 2633220. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f56685093f0> Failed to fetch sample 2591530. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d732626b0> Failed to fetch sample 1198388. Exception: cannot identify image file <_io.BytesIO object at 0x7fe5d5aaa520> Failed to fetch sample 1484787. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b2827f0> Failed to fetch sample 2388130. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cca520> Failed to fetch sample 2258741. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9eb060> Failed to fetch sample 2919698. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a570270> Failed to fetch sample 2359987. Exception: cannot identify image fiFailed to fetch sample 1205550. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82d2980> le <_io.BytesIO object at 0x7fe436a0a980> Failed to fetch sample 3150064. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0949440> Failed to fetch sample 1455820. Exception: cannot identify image file <_io.BytesIO object at 0x7f051fed14e0> Failed to fetch sample 2784203. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db598bd0> Failed to fetch sample 2322241. Exception: ject at 0x7fe44443f290> Failed to fetch sample 2531778. Exception:Failed to fetch sample 1019459. Exception: cannot identify image fiFailed to fetch sample 3374089. Exception:cannot identify image file <_io.BytesIO object at 0x7f7b9b1e4db0> iFailed to fetch sample 1573545. Exception:Failed to fetch sample 3116930. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722453f0> cannot identify image file <_io.BytesIO object at 0x7f7e078f7060> Failed to fetch sample 1448134. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac242610> cannot identify image file <_io.BytesIO object at 0x7f0828115f80> Failed to fetch sample 1218899. 
Exception:Failed to fetch sample 3647389. Exception: cannot identify image fi Failed to fetch sample 1495777. ExceptionFailed to fetch sample 1414336. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebaa7f0> Failed to fetch sample 2784203. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e077583b0> cannot identify image file <_io.BytesIO object at 0x7fcf32969260> Failed to fetch sample 2487289. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebe5120> Failed to fetch sample 3207382. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003abc40> cannot identify image file <_io.BytesIO object at 0x7efcdbb90ea0>Failed to fetch sample 2202214. Exception:bject at 0x7f35a7fe0d10> ile <_io.BytesIO object at 0x7fc0f08db790> Failed to fetch sample 3523488. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828114fe0> Failed to fetch sample 1806603. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560cfa60> e <_io.BytesIO object at 0x7f94ac2403b0> cannot identify image file <_io.BytesIO object at 0x7f4beebaa5c0>FFailed to fetch sample 2254505. ExceptionFa cannot identify image file <_io.BytesIO object at 0x7f3939c8d2b0> <_io.BytesIO object at 0x7f58c39cf240> Failed to fetch sample 3504244. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec80310> Failed to fetch sample 1206222. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3881fd0> cannot identify image file <_io.BytesIO object at 0x7f1f1b223dd0> Failed to fetch sample 1808696. Exception: cannot identify image file <_io.BytesIO object at 0x7f566812eb10> Failed to fetch sample 3015321. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281165c0> Failed to fetch sample 4172817. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b40bd850> Failed to fetch sample 3314214. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce6b560> cannot identify image file <_io.BytesIO object at 0x7f4beec83510> Failed to fetch sample 1095181. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cd5b70>Failed to fetch sample 3876694. Exception: cannot identify image file <_io.BytesIO object at 0x7f53b78a62f0> Failed to fetch sample 4193310. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0f2ac0> Failed to fetch sample 2221802. Exception: cannot identify image file <_io.BytesIO object at 0x7fc108455ad0> Failed to fetch sample 2633220. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39cf830> Failed to fetch sample 1633328. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec80e50> Failed to fetch sample 1770342. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3880fe0> Failed to fetch sample 2394516. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac25ff60> Failed to fetch sample 1865385. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828117060> cannot identify image file <_io.BytesIO object at 0x7f99a39c2200> Failed to fetch sample 1229036. Exception: cannot identify image file <_io.BytesIO object at 0x7fc108457970> FFailed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7f566810e6b0> ailed to fetch sample 2790481. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f91d120> Failed to fetch sample 3349915. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f934cc0> cannot identify image file <_io.BytesIO object at 0x7f70b426d0d0> Failed to fetch sample 2365052. Exception: cannot identify image file <_io.BytesIO object at 0x7f5643ec30b0> Failed to fetch sample 1017215. Exception:Failed to fetch sample 1382222. Exception: ject at 0x7f9abcdf5fd0> Failed to fetch sample 2059104. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fca9ce8d710> Failed to fetch sample 3162454. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc145b20> Failed to fetch sample 3484950. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac25f880> cannot identify image file <_io.BytesIO object at 0x7f6556026390> ailed to fetch sample 4160319. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0926cf0> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7f566810c630> Failed to fetch sample 1988187. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fe1e40> Failed to fetch sample 1876146. Exception: cannot identify image file <_io.BytesIO object at 0x7efd37e9d530> Failed to fetch sample 2940708. Exception:F cannot identify image file <_io.BytesIO object at 0x7fc0f093fa60> ailed to fetch sample 2827935. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9273d0> Failed to fetch sample 1213089. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39c1cb0> Failed to fetch sample 1380418. Exception: cannot identify image fiFailed to fetch sample 1485929. Exception:ject at 0x7f5643e2fec0> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 3634589. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cd0bd0> Failed to fetch sample 3596833. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078ea570> Failed to fetch sample 4142811. Exception: cannot identify image file <_io.BytesIO object at 0x7f4c384e1da0> cannot identify image file <_io.BytesIO object at 0x7fccdf76dc10> cannot identify image file <_io.BytesIO object at 0x7f6a7f282070> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f24c8b0> Failed to fetch sample 3919378. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a17330> Failed to fetch sample 1934945. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f282480> Failed to fetch sample 3160025. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6e47420> Failed to fetch sample 1073493. Exception: cannot identify image file <_io.BytesIO object at 0x7f1d9a747650> Failed to fetch sample 4146684. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7a1bc0> Failed to fetch sample 2918234. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64c3920> Failed to fetch sample 1749658. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a8e1c9350> Failed to fetch sample 2021278. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5697c400> Failed to fetch sample 2892575. Exception: cannot identify image file <_io.BytesIO object at 0x7f1d0d58b380> Failed to fetch sample 2437168. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568e4d60> Failed to fetch sample 3355968. Exception: cannot identify image file <_io.BytesIO object at 0x7efed45b53f0> Failed to fetch sample 2987590. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d14f9e5c0> Failed to fetch sample 2354595. Exception: cannot identify image file <_io.BytesIO object at 0x7efe2a164860> Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08f9cb0> Failed to fetch sample 1523937. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280e9760> Failed to fetch sample 2832505. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a624e2de0> Failed to fetch sample 2916193. Exception: cannot identify image file <_io.BytesIO object at 0x7f90e524e610> Failed to fetch sample 2523392. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39c2840> Failed to fetch sample 1865385. 
Failed to fetch sample 2784203. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9344a0>
[... the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error repeats for several hundred samples (e.g. 978639, 1135347, 1213089, 1414336, 1597034, 2123650, 2198494, 2358648, 2429018, 2746130, 2979749, 3374089, 3484950, 3596808, 4186967), with the output of concurrent dataloader workers interleaved mid-line ...]
Exception: Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7f9e9c1da430> Failed to fetch sample 2335870. Exception: cannot identify image fFailed to fetch sample 2746130. Exception: Failed to fetch sample 3876980. Exception: cannot identify image f cannot identify image file <_io.BytesIO obFailed to fetch sample 1417605. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a569484a0> Failed to fetch sample 1Failed to fetch sample 2636021. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 1250858. Exception: cannot identify image file <_io.BytesIO object at 0x7f2794418cc0> cannot identify image file <_io.BytesIO object at 0x7f9d7323bf60> Failed to fetch sample 1476178. Exception: cannot identify image file <_io.BytesIO object at 0x7f52ce1fa110> Failed to fetch sample 3728803. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f46750> Failed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7f655606bb50> Failed to fetch sample 3007987. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8324f40> Failed to fetch sample 1292020. Exception: cannot identify image file <_io.BytesIO object at 0x7f53f1f122f0> Failed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e5c10>cFailed to fetch sample 3596833. Exception: cannot identify image Failed to fetch sample 2110654. Exception:> Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7f52cdd955d0> Failed to fetch sample 3633396. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2da890> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d77c87f60> cannot identify image file <_io.BytesIO object at 0x7f7a5696fec0> Failed to fetch sample 3443298. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8eaed0> Failed to fetch sample 1982695. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a15a30> le <_io.BytesIO object at 0x7f9a9c2da3e0> Failed to fetch sample 1324309. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b93e7f0> Failed to fetch sample 3204136. Exception: cannot identify image file <_io.BytesIO object at 0x7fe44409b1f0> cannot identify image file <_io.BytesIO object at 0x7fbcdc11a200> Failed to fetch sample 2358648. Exception:Failed to fetch sample 2304801. Exception:bject at 0x7f0828147c90> Failed to fetch sample 2717980. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cf8900> Failed to fetch sample 2522299. Exception: cannot identify image fi cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7fdc25fdb5b0> Failed to fetch sample 2522299. Exception: cannot identify image fiFailed to fetch sample 2993528. Exception:cannot identify image file <_io.BytesIO object at 0x7fb52f6c8d10> Failed to fetch sample 1770342. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560dc130> Failed to fetch sample 1772414. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fea70> cannot identify image file <_io.BytesIO object at 0x7f96fb06d440> Failed to fetch sample 3382824. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003b4860> Failed to fetch sample 1329424. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668131c60> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7f1d0d582200> cannot identify image file <_io.BytesIO object at 0x7f5d278fdbc0> cannot identify image file <_io.BytesIO object at 0x7f9774e9ce00> Failed to fetch sample 3494092. Exception: Failed to fetch sample 2557933. Exception: cannot identify image fFailed to fetch sample 2723001. 
Exception: Failed to fetch sample 2078887. Exception: cannot identify image file <_io.BytesIO object at 0x7fb660192ac0> cannot identify image file <_io.BytesIO object at 0x7fc9faaef8d0> Failed to fetch sample 980211. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278fec50> Failed to fetch sample 2658306. Exception: cannot identify image file <_io.BytesIO object at 0x7f192201df30> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7f991c7c74c0> Failed to fetch sample 4159731. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093755670> Failed to fetch sample 1947526. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6c9f80> Failed to fetch sample 1350091. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fdf7f8cc0> cannot identify image file <_io.BytesIO object at 0x7fca9ce8d710> Failed to fetch sample 1628257. Exception:Failed to fetch sample 1131587. Exception: cannot identify image file <_io.BytesIO object at 0x7f1093733880> Failed to fetch sample 2799267. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b994cc0> cannot identify image file <_io.BytesIO object at 0x7f9a9c29b560> Failed to fetch sample 1198666. Exception: cannot identify image file <_io.BytesIO object at 0x7fa14c684770> cannot identify image file <_io.BytesIO object at 0x7fccdf76fd30> Failed to fetch sample 2079067. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4eeec6570> cannot identify image file <_io.BytesIO object at 0x7fdc269ace50> cannot identify image file <_io.BytesIO object at 0x7fd2003b6840> Failed to fetch sample 1252147. Exception: Failed to fetch sample 1528709. Exception: cannot identify image file <_io.BytesIO object at 0x7f524fa828e0> cannot identify image file <_io.BytesIO object at 0x7f9d730e3060> Failed to fetch sample 1879013. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c273e20> Failed to fetch sample 3105090. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8b44a0> Failed to fetch sample 2579559. Exception:ject at 0x7fca9bacdad0> Failed to fetch sample 1232894. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f4e1b0> Failed to fetch sample 2040208. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668142930> Failed to fetch sample 3596833. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5989f0> cannot identify image file <_io.BytesIO object at 0x7f0828144590> Failed to fetch sample 2253806. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560df790> Failed to fetch sample 1213089. Exception: cannot identify image file <_io.BytesIO object at 0x7f6556064720> Failed to fetch sample 2683423. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99b840680> Failed to fetch sample 1017215. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986457c40> cannot identify image file <_io.BytesIO object at 0x7f7e078ddd50> ile <_io.BytesIO object at 0x7f4f68e755d0> Failed to fetch sample 2830282. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f6bb0> Failed to fetch sample 2729098. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a392e890> Failed to fetch sample 4072385. Exception: ject at 0x7fca9bacda30> Failed to fetch sample 3528850. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdf74c0> Failed to fetch sample 3209388. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fdb010> cannot identify image file <_io.BytesIO object at 0x7f9a5d957d30> cannot identify image file <_io.BytesIO object at 0x7efd6f7b9490> Failed to fetch sample 1436833. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568ff970> ile <_io.BytesIO object at 0x7f9ad457d120> Failed to fetch sample 1276143. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec836f0> Failed to fetch sample 3895655. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3869c60> Failed to fetch sample 3840067. Exception:cannot identify image file <_io.BytesIO object at 0x7f1f1b095080> FFailed to fetch sample 3276759. Exception: cannot identify image file <_io.BytesIO object at 0x7f65a98f49f0>cannot identify image file <_io.BytesIO object at 0x7f27754ba8e0> Failed to fetch sample 1333992. Exception: cFailed to fetch sample 2503403. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fdacf0>cannot identify image file <_io.BytesIO object at 0x7f5c9d944a40> ile <_io.BytesIO object at 0x7f81db578720> Failed to fetch sample 2158552. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f7a568c5f30> 68379. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35cd7330> Failed to fetch sample 4205131. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278b5120> cannot identify image file <_io.BytesIO object at 0x7f3939cd2c50> /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the Failed to fetch sample 2884150. Exception: cannot identify image file <_io.BytesIO object at 0x7f951ecdd300> Failed to fetch sample 3494820. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82d3240> Failed to fetch sample 3180383. Exception: cannot identify image file <_io.BytesIO object at 0x7f4c36bc4ea0> Failed to fetch sample 3411791. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db579b20> Failed to fetch sample 3189296. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a51760> Failed to fetch sample 2158552. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdd2b60> ile <_io.BytesIO object at 0x7f44f5d1f290> Failed to fetch sample 2 cannot identify image file <_io.BytesIO object at 0x7f4d18206980> Failed to fetch saFailed to fetch sample 3349915. Exception: cannot identify image file <_io.BytesIO object aFailed to fetch sample 3633396. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac229e90> t 0x7f81db5a4ea0> Failed to fetch sample 1597034. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8329710> Failed to fetch sample 1213089. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca7420> Failed to fetch sample 2363228. Exception:Failed to fetch sample 1179835. Exception: cannot identify image fiFailed to fetch sample 2799267. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac228310> le <_io.BytesIO object at 0x7f1b7b9c0770> Failed to fetch sample 1167444. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec276e30> Failed to fetch sample 1322009. Exception: ect at 0x7f566813f4c0> Failed to fetch sample 3706161. Exception: cannot identify image file <_io.BytesIO object at 0x7fcf32b6bb00> Failed to fetch sample 1686041. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a7330> cannot identify image file <_io.BytesIO object at 0x7f5a16db6390> Failed to fetch sample 2674356. Exception: cannot identify image file <_io.BytesIO object at 0x7f469f3f6d90> cannot identify image file <_io.BytesIO object at 0x7f713412e6b0> ile <_io.BytesIO object at 0x7fc108456840> Failed to fetch sample 1019459. Exception: ject at 0x7f83e83a8900> le <_io.BytesIO object at 0x7f6d145b8c70> Failed to fetch sample 2089328. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a371a0> cannot identify image file <_io.BytesIO object at 0x7f566813db70> Failed to fetch sample 2416587. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b241a30> Failed to fetch sample 2225625. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b1243240> Failed to fetch sample 3116930. Exception:cannot identify image file <_io.BytesIO object at 0x7f5a16f86160> FFailed to fetch sample 2885556. Exception:cannot identify image file <_io.BytesIO object at 0x7f47278c35b0> Failed to fetch sample 2463420. Exception:Failed to fetch sample 2724377. Exception:bject at 0x7f0c35bca7a0> Failed to fetch sample 2671128. Exception: ject at 0x7f6556024540> cannot identify image file <_io.BytesIO object at 0x7fb687f02ed0> Failed to fetch sample 3467927. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b222250> Failed to fetch sample 1878125. Exception: cannot identify image file <_io.BytesIO object at 0x7f1fa1caef70> le <_io.BytesIO object at 0x7fe0ec2d6750> Failed to fetch sample 3228934. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83a93a0> Failed to fetch sample 3523488. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003d8590> FFailed to fetch sample 3249441. ExceptionFailed to fetch sample 3272055. Exception:object at 0x7f7aa0204e50> Failed to fetch sample 2954477. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7426d1e40> cannot identify image file <_io.BytesIO object at 0x7f47278c21b0> le <_io.BytesIO object at 0x7f1f1b27e1b0> Failed to fetch sample 4161397. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b2787c0> Failed to fetch sample 2368726. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4281b70> Failed to fetch sample 3015321. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a6840> cannot identify image file <_io.BytesIO object at 0x7f10937f1c10> ile <_io.BytesIO object at 0x7f83e8346480> Failed to fetch sample 2988390. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec246610> Failed to fetch sample 1160161. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb2020> Failed to fetch sample 1237017. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278fe570>:Failed to fetch sample 3029168. Exception:bject at 0x7fdcaf8eae80> Failed to fetch sample 3350122. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9ab510> Failed to fetch sample 1019238. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe723fc450> ailed to fetch sample 1926691. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5690ede0> Failed to fetch sample 1528709. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c98270> Failed to fetch sample 3608698. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f93790> cannot identify image file <_io.BytesIO object at 0x7f0da7579760> Failed to fetch sample 3468703. Exception: Failed to fetch sample 1832660. Exception: cannot identify image fFailed to fetch sample 3275923. Exception: ject at 0x7f47278f6a70> ile <_io.BytesIO object at 0x7fcfd8194fe0> le <_io.BytesIO object at 0x7fa275cb76f0> Failed to fetch sample 1021516. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5696c770> Failed to fetch sample 1012319. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8e9670> Failed to fetch sample 3349915. Exception: cannot identify image file <_io.BytesIO object at 0x7f61ceb93ba0> Failed to fetch sample 2591924. Exception: cannot identify image fFailed to fetch sample 3249441. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a37e20> ile <_io.BytesIO object at 0x7f61b6aed670> Failed to fetch sample 2813892. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb67a0> Failed to fetch sample 1523937. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a6160> Failed to fetch sample 1947526. Exception:bject at 0x7f94ac26eca0> le <_io.BytesIO object at 0x7fe4c42bf1f0> Failed to fetch sample 1705560. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d732709a0> Failed to fetch sample 3081751. Exception:bject at 0x7fc0f0943560> le <_io.BytesIO object at 0x7f060231d620> cannot identify image file <_io.BytesIO object at 0x7f469efeb8d0> Failed to fetch sample 1777650. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39193f0> Failed to fetch sample 2087668. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c18b650> cannot identify image file <_io.BytesIO object at 0x7fdcaf8ebe70> Failed to fetch sample 3159206. Exception: Failed to fetch sample 1491577. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c9acf0> Failed to fetch sample 1448134. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2533d0> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb0c20> Failed to fetch sample 3232663. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca2a70> Failed to fetch sample 3481049. Exception:Failed to fetch sample 4161397. Exception:ample 2594318. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a539f80> cannot identify image file <_io.BytesIO object at 0x7fc108456520> Failed to fetch sample 1753990. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73259940> Failed to fetch sample 3533762. Exception: cannot identify image file <_io.BytesIO object at 0x7f52dad16250> Failed to fetch sample 2898491. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7a1940> Failed to fetch sample 4375328. Exception: cannot identify image file <_io.BytesIO object at 0x7fd391542b10> Failed to fetch sample 2607936. Exception: cannot identify image fFailed to fetch sample 1061148. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5732e0> ile <_io.BytesIO object at 0x7fa2a930d850> Failed to fetch sample 2658306. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca1b70> Failed to fetch sample 4096776. Exception:Failed to fetch sample 32 cannot identify image file <_io.BytesIO object at 0x7fc2eb787100> 0x7fdcaf8fd300> Failed to fetch sample 3431371. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cdf790> Failed to fetch sample 3116930. Exception:Failed to fetch sample 3922185. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c9a430>cannot identify image file <_io.BytesIO object at 0x7f3bcea3b150> Failed to fetch sample 2978860. Exception: cannot identify image fiFailed to fetch sample 2633741. Exception: ailed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb2660> Failed to fetch sample 3206261. Exception: cannot identify image ficannot identify image file <_io.BytesIO o Failed to fetch sample 3276750. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d730d2250> ject at 0x7fdc282bb5b0> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a53ade0> Failed to fetch sample 3628880. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754c3c40> Failed to fetch sample 3240425. Exception: ject at 0x7f3939c9a9d0> Failed to fetch sample 3523488. Exception: cannot identify image file <_io.BytesIO object at 0x7f3bcea1c7c0> Failed to fetch sample 1482362. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cb4220> cannot identify image file <_io.BytesIO object at 0x7f1137ff5760> cannot identify image file <_io.BytesIO object at 0x7f6682e66160> le <_io.BytesIO object at 0x7f9d730d1530> Failed to fetch sample 3573442. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2854e0> Failed to fetch sample 1992919. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668508540> Failed to fetch sample 3407344. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73261670> Failed to fetch sample 4044328. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200385350> Failed to fetch sample 3015321. Exception: cannot identify image file <_io.BytesIO object at 0x7f0f05f1a6b0> Failed to fetch sample 2359987. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d732592b0> Failed to fetch sample 3210355. Exception: cannot identify image file <_io.BytesIO object at 0x7fb458a949f0> l will result in indexing errors Failed to fetch sample 1837133. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f121710> Failed to fetch sample 2797936. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986479b20> cannot identify image file <_io.BytesIO object at 0x7f81db5be8e0> Failed to fetch sample 3172854. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0937b00> cannot identify image file <_io.BytesIO object at 0x7f393931b9c0> Failed to fetch sample 3292749. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00754c20> le <_io.BytesIO object at 0x7f52cbbf4900> Failed to fetch sample 4123070. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d732902c0> Failed to fetch sample 1135347. Exception: ject at 0x7f27754818a0> Failed to fetch sample 1095181. Exception:Failed to fetch sample 2155552. Exception: cannot identify image fiFailed to fetch sample 1061148. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f59940> Failed to fetch sample 1573545. Exception:Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7fc138cf1e40> cannot identify image file <_io.BytesIO object at 0x7f52ce1fa110> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7b2759ad0> Failed to fetch sample 3031918. Exception: cannot identify image file <_io.BytesIO object at 0x7f21afee4860> Failed to fetch sample 3876414. Exception: cannot identify image fFailed to fetch sample 1865385. Exception: Failed to fetch sample 3253839. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80ea2a0> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5bdd0> Failed to fetch sample 4142602. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f2dbc0> Failed to fetch sample 3259618. Exception: cannot identify image file <_io.BytesIO object at 0x7f472789eac0> Failed to fetch sample 2479743. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bffe20> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f937e0> Failed to fetch sample 4104440. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f84270> Failed to fetch sample 3047125. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a904f3d0> Failed to fetch sample 1316998. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac247fb0> Failed to fetch sample 3546995. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340f04f0> Failed to fetch sample 1536964. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340fd850> Failed to fetch sample 1382990. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8efeca0> Failed to fetch sample 2863704. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f951ecab470> Failed to fetch sample 1448134. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f84e50> Failed to fetch sample 2784203. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8e89030> Failed to fetch sample 1424068. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac242110> cannot identify image file <_io.BytesIO object at 0x7f3bcea3bf10> ile <_io.BytesIO object at 0x7f9d730d3fb0> Failed to fetch sample 3840067. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668106520> Failed to fetch sample 2607936. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f85210> Failed to fetch sample 1790331. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4440b9120> Failed to fetch sample 1171179. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2444a0> Failed to fetch sample 3507174. Exception: cannot identify image file <_io.BytesIO object at 0x7f995956d4e0> Failed to fetch sample 1218899. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d82547600> cannot identify image file <_io.BytesIO object at 0x7f27754c2570> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f092c720> le <_io.BytesIO object at 0x7f9a9c2d1fd0> Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253a52cf0> cannot identify image file <_io.BytesIO object at 0x7fbf47083510> le <_io.BytesIO object at 0x7f47278f09a0> cannot identify image file <_io.BytesIO object at 0x7f3c5023d3a0> le <_io.BytesIO object at 0x7fcbdc1b3a60> Failed to fetch sample 2554836. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2c3010> Failed to fetch sample 2591530. Exception: cannot identify image file <_io.BytesIO object at 0x7f995956d580> Failed to fetch sample 3227008. 
Failed to fetch sample 1329424. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2458a0>
Failed to fetch sample 1199445. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2c36a0>
Failed to fetch sample 3837981. Exception: cannot identify image file <_io.BytesIO object at 0x7fc32576e4d0>
[... hundreds of further messages of the same form, emitted concurrently by multiple dataloader workers and interleaved (many lines truncated mid-message). Each reports that PIL could not decode the image bytes for a sample. Affected sample ids include, among many others: 2294453, 2539019, 3276750, 1189922, 3245561, 2388130, 2049038, 1965350, 4161397, 2089328, 2038853, 1539033, 2966793, 1762324, 2040208, 1280377, 3276759, 1573545, 3324359, 1934945, 2915907, 2636455, 2790481, 4090977, 2439548, 3540812, 2885632, 2866258, 4079327, 2170277, 1858341, 3571975, 1179835, 2841865, 1129095, 2827935, 3229467, 3330710, 3081392, 2729090, 1759192, 1262384, 3254207, 1019459, 1099589, 1260303, 3619695, 4379145, 3732379, 2647104, 4047467, 3465859, 1445442, 2535152, 2984651, 1524708, 1448134, 2503604, 3180383, 1976210, 3116930, 3240425, 3633396, 1923061, 3649911, 2697841, 1017215, 2040873, 3584846, 2851854, 1359089, 3015321, 3596833, 1865385, 1876146, 3327098, 3346372, 2204862, 2344536, 4164576, 1806693, 2671128, 2687916, 3272055, 1465471, 2763647, 2465798, 3552533, 3602220, 1129536, 2379671, 1504520, 2979749, 1774784, 2628213, 3081751, 2679326, 4080972, 2103217, 1879013, 3634589, 2914368, 2931138, 1558724, 2482217, 3280452, 1747586, 1955902, 1743045, 3892787, 2033363, 1108302, 3167299, 1792884, 2086216, 3021215, 3637216, 2253806, 1758557, 2746828, 2587903, 1432321, 3096384, 1213089, 1440576, 2813892, 2767387, 2801811, 1198363, 3880305, 1485531, 1526769, 2437168, 1988585, 2610109, 3876980, 2942336, 3629990, 2758569, 2594903, 2753846, 1898055, 2283471, 3557465, 1252147, 3744461, 3717362, 1232641, 2654468, 3577885, 2044303, 2522299, 2590535, 1955939, 1707082, 2783090, 2519180, 3662436, 1377471, 1544144, 2331127, 2450796, 2772031, 2855055, 1160161, 2786440, 1482362, 1431086, 2718207, 2765269, 2918234, 1070703, 3053708, 4376367, 2940430, 2919698, 4104440, 2702490, 2158552, 2993528, 4133927, 1128237, 1163463, 2426798, 3654627, 3511868, 2908790, 1095181, 1069353, 3571003, 2284772, 2046517, 2730847, 3244596, 3346438, 2853902, 4105132, 3647389, 4082525, 978639, 2903623, 3136267, 2633220, 2806356, 3368130, 3006684, 3123387, 3398766, 3411798, 1122491, 3880166, 2402011, 3494092, 2383622, 2111750, 1947526, 2702352, 2047816, 1501211, 1029176, 2072944, 1659412, 1683394, 4112540, 2778920, 1302915, 3557596, 3731033, 2098207, 3582303, 1549770, 2110654, 3431400, 2975120, 2658306, 2359987, 2400986, 4029722, 2641623, 3523488, 2417238, 2736920, 1153730, 3094926, 1258856, 3549131, 2337284, 2979603, 2347720, 3585602, 2335342, 1131587, 1554330, 3266988, 1523937, 2861238, 2475210, 3687864, 1586493, 4146076, 2746130, 3903711, 969775, 1494278, 2004558, 1120039, 3859102, 2171729, 1618997, 1250858, 1597034, 2717980, 2898278, 2503947, 2856005, 3367173, 3552579, 3452755, 3064561. Some ids recur across workers (e.g. 2915907, 2647104, 2040208, 1934945), and some id fragments were truncated by the interleaving and are not recoverable. ...]
Exception: cannot identify image file <_io.BytesIO object at 0x7f44f5d1d4e0> Failed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7efd280e0400> Failed to fetch sample 1445442. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9a5a30> Failed to fetch sample 2030995. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db579990> Failed to fetch sample 1482362. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f926250> Failed to fetch sample 1828766. Exception: cannot identify image file <_io.BytesIO object at 0x7f10804eb2e0> Failed to fetch sample 3257443. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bcb150> Failed to fetch sample 1137437. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe7240c4a0> Failed to fetch sample 1865082. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35b97fb0> Failed to fetch sample 2856005. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe723fc0e0> Failed to fetch sample 2043332. Exception: cannot identify image file <_io.BytesIO object at 0x7fbea6365080> Failed to fetch sample 1029176. Exception: cannot identify image file <_io.BytesIO object at 0x7f10804eb2e0> Failed to fetch sample 974565. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3949e90> Failed to fetch sample 2101533. Exception: cannot identify image file <_io.BytesIO object at 0x7fbeb87dd2b0> Failed to fetch sample 2562281. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe723fec00> Failed to fetch sample 2035159. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278b0590> Failed to fetch sample 3188707. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bcf330> Failed to fetch sample 3021215. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722a8c20> Failed to fetch sample 3276750. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bddcb0> Failed to fetch sample 2761972. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bdc3b0> Failed to fetch sample 1014875. Exception: cannot identify image file <_io.BytesIO object at 0x7f81d6dcec50> Failed to fetch sample 2359987. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bddc10> Failed to fetch sample 1446794. Exception: cannot identify image file <_io.BytesIO object at 0x7f472789f290> Failed to fetch sample 3462226. Exception: cannot identify image file <_io.BytesIO object at 0x7f10804cade0> Failed to fetch sample 1573545. Exception: cannot identify image file <_io.BytesIO object at 0x7f0f4c211e40> Failed to fetch sample 2503403. Exception: cannot identify image file <_io.BytesIO object at 0x7f44f5d1d4e0> Failed to fetch sample 1021516. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f0ae0> Failed to fetch sample 3647389. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bdec00> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bcf290> Failed to fetch sample 3679971. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3953a60> Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39522a0> Failed to fetch sample 3888958. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72246250> Failed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7f96cd9a50d0> Failed to fetch sample 1840193. Exception: cannot identify image file <_io.BytesIO object at 0x7f472789dfd0> Failed to fetch sample 2632930. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722be6b0> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7f9b37c3aed0> Failed to fetch sample 3330832. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f72c00> Failed to fetch sample 1329424. Exception: cannot identify image file <_io.BytesIO object at 0x7f96cd9a4c70> Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7f9772b5b470> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3960360> Failed to fetch sample 1252147. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3952b10> Failed to fetch sample 2040208. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3953fb0> Failed to fetch sample 3667044. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bcb6a0> Failed to fetch sample 2786816. Exception: cannot identify image file <_io.BytesIO object at 0x7f991b214590> Failed to fetch sample 3240425. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3941760> Failed to fetch sample 1219288. Exception: cannot identify image file <_io.BytesIO object at 0x7f10804eb2e0> Failed to fetch sample 2040873. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3940db0> Failed to fetch sample 2851854. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39431a0> Failed to fetch sample 1966918. Exception: cannot identify image file <_io.BytesIO object at 0x7f44f5d1f290> Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7f44f5d1d4e0> Failed to fetch sample 4157790. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c27ba0> Failed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278bd710> Failed to fetch sample 2780092. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecb2005e0> Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c3f420> Failed to fetch sample 2531778. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7134136d40> Failed to fetch sample 1739207. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec264950> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278b04f0> Failed to fetch sample 2918234. Exception: cannot identify image file <_io.BytesIO object at 0x7f0ecdb9fc90> Failed to fetch sample 2522299. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1b3a60> cannot identify image file <_io.BytesIO object at 0x7ef9fa909710> le <_io.BytesIO object at 0x7f61ceb93ba0> Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec27b380> Failed to fetch sample 3633396. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c97e70> Failed to fetch sample 2915907. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706e5b060> Failed to fetch sample 3167299. Exception: cannot identify image file <_io.BytesIO object at 0x7f649602c6d0> Failed to fetch sample 3162635. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec265670> Failed to fetch sample 3584846. Exception: cannot identify image file <_io.BytesIO object at 0x7f39612b96c0> Failed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec27a7a0> Failed to fetch sample 2086216. Exception: cannot identify image file <_io.BytesIO object at 0x7f62a46cd170> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c73420> Failed to fetch sample 2439548. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b83b0> Failed to fetch sample 2595788. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280ff880> Failed to fetch sample 3376748. Exception: cannot identify image file <_io.BytesIO object at 0x7f6209d480e0> Failed to fetch sample 3892787. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f63e3aa0720> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c97ce0> Failed to fetch sample 1693845. Exception: cannot identify image file <_io.BytesIO object at 0x7f8039853fb0> cannot identify image file <_io.BytesIO object at 0x7f5381a3a110> Failed to fetch sample 3159206. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b9a30> Failed to fetch sample 1818686. Exception: cannot identify image file <_io.BytesIO object at 0x7f082818bb50> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7f61a591ef20> Failed to fetch sample 3290404. Exception: cannot identify image file <_io.BytesIO object at 0x7f6496029530> Failed to fetch sample 2966793. Exception: cannot identify image file <_io.BytesIO object at 0x7f395c19a160> Failed to fetch sample 2750090. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249485800> Failed to fetch sample 3116930. Exception: cannot identify image fiFailed to fetch sample 1221414. ExceptioncFailed to fetch sample 1189922. Exception: cannot identify image cannot identify image file <_io.BytesIO object at 0x7f80292d3100> Failed to fetch sample 2202214. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134137650> 3015321. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f7f60> Failed to fetch sample 3456569. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7bbab0> Failed to fetch sample 1865385. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f79c0> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c56db70> Failed to fetch sample 3404164. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644d670> cannot identify image file <_io.BytesIO object at 0x7f4450b68f90> Failed to fetch sample 1912139. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efdefbebfb0> le <_io.BytesIO object at 0x7f6496083e20> Failed to fetch sample 3732379. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1ba2f0> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c98540> Failed to fetch sample 1929498. Exception: cannot identify image fFailed to fetch sample 1523937. Exception: cannot identify image file <_io.BytesIO object at 0x7efe2a92ffb0> ile <_io.BytesIO object at 0x7f93f3996fc0> Failed to fetch sample 2339915. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cef2e0> Failed to fetch sample 1877279. Exception: cannot identify image fcannot identify image file <_io.BytesIO objFailed to fetch sample cannot identify image file <_io.BytesIO object at 0x7f3939c95c10> Failed to fetch sample 1140456. Exception: cannot identify image file <_io.BytesIO object at 0x7f64960819e0> Failed to fetch sample 2918234. Exception: cannot identify image file <_io.BytesIO object at 0x7f6667ad62f0> Failed to fetch sample 2040208. Exception: cannot identify image fFailed to fetch sample 2388130. Exception:bject at 0x7f5d278b00e0> 380043. Exception: cannot identify image file <_io.BytesIO object at 0x7fa2bda8dd00> bject at 0x7f93f39953f0> Failed to fetch sample 4046281. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e2e06fdd0> cannot identify image file <_io.BytesIO object at 0x7f47278be340> Failed to fetch sample 2558575. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f7c40> Failed to fetch sample 2594375. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59b9c0> Failed to fetch sample 3348577. Exception:Failed to fetch sample 1509622. Exception: ject at 0x7f3d1574ce00> cannot identify image file <_io.BytesIO object at 0x7fb249b24c70> le <_io.BytesIO object at 0x7f61be079080> Failed to fetch sample 3228934. 
Exception:bject at 0x7f53a8f0d760> Failed to fetch sample 2752003. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b426e110> le <_io.BytesIO object at 0x7f3939cc3830> Failed to fetch sample 3722648. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc223830> Failed to fetch sample 3270019. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7f1137f81670> Failed to fetch sample 2826106. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b271850> Failed to fetch sample 1062546. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2db2e0> cannot identify image file <_io.BytesIO object at 0x7f10937798f0> Failed to fetch sample 1250858. Exception: cannot identify image file <_io.BytesIO object at 0x7f1008a64c70> cannot identify image file <_io.BytesIO object at 0x7f81db599c60> Failed to fetch sample 1556680. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2824d0> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7f6496082020> Failed to fetch sample 3091512. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cedad0> Failed to fetch sample 2722825. Exception: cannot identify image file <_io.BytesIO object at 0x7f0da8eba3e0>Failed to fetch sample 3840067. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59b420> Failed to fetch sample 2785729. Exception: cannot identify image file <_io.BytesIO object at 0x7f8151923fb0> cannot identify image file <_io.BytesIO object at 0x7f7134100810> Failed to fetch sample 978639. Exception: cannot identify image filFailed to fetch sample 1277996. Exception Failed to fetch sample 1397352. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7fe0ee1786d0> Failed to fetch sample 978639. Exception:Failed to fetch sample 2783090. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6496080ef0>cannot identify image file <_io.BytesIO object at 0x7f3939ca02c0> 3634589. Exception:bject at 0x7fd20035bc90> Failed to fetch sample 2819059. Exception: cannot identify image file <_io.BytesIO object at 0x7fc26217acf0> cannot identify image file <_io.BytesIO object at 0x7fc138cf1e40> Failed to fetch sample 4066022. Exception: cannot identify image file <_io.BytesIO object at 0x7fa2a930d850> cannot identify image file <_io.BytesIO object at 0x7f5a16dacdb0> Failed to fetch sample 3324406. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59a840> Failed to fetch sample 3257882. Exception:ject at 0x7fb4a6516b10> ile <_io.BytesIO object at 0x7f22e39df6f0> cannot identify image file <_io.BytesIO object at 0x7f97ee4c94e0> cannot identify image file <_io.BytesIO object at 0x7fdcaf8f3740> Failed to fetch sample 1624286. Exception: cannot identify image file <_io.BytesIO object at 0x7fd169770ea0> le <_io.BytesIO object at 0x7f1f1b24a7f0> Failed to fetch sample 2363461. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e38985e0>Failed to fetch sample 1122273. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b38884a0> Failed to fetch sample 3604356. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2d85e0> cannot identify image file <_io.BytesIO object at 0x7fd20037bb50> Failed to fetch sample 3148873. Exception: cannot identify image file <_io.BytesIO object at 0x7f96fca223e0> Failed to fetch sample 2437168. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f71f80> Failed to fetch sample 2482096. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c424b290> Failed to fetch sample 1414336. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b253100> Failed to fetch sample 3909071. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd079dc6340> Failed to fetch sample 3617437. Exception: cannot identify image file <_io.BytesIO object at 0x7fd20037a660> Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x7f96ef1d14e0> Failed to fetch sample 1010081. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2d8310> cannot identify image file <_io.BytesIO object at 0x7f649602cf40> Failed to fetch sample 3095520. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd7c90> Failed to fetch sample 3904125. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828128270> FFailed to fetch sample 4149591. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568f7e20> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7f1ffc686200> Failed to fetch sample 2565544. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f1c60> Failed to fetch sample 3249441. Exception:Failed to fetch sample 3548361. Exception: ject at 0x7f1f1b0e4c20> Failed to fetch sample 4108370. Exception:ject at 0x7fdcaf8f0d10> Failed to fetch sample 3013832. Exception: cannot identify image file <_io.BytesIO object at 0x7fd079dc5f80> Failed to fetch sample 3276759. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fe7f0> Failed to fetch sample 2655989. Exception:bject at 0x7f47278f2570> cannot identify image file <_io.BytesIO object at 0x7f0828128130> Failed to fetch sample 3557219. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec281d50> Failed to fetch sample 2954477. Exception:bject at 0x7f94ac23dfd0> Failed to fetch sample 1659412. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9eff98d60> Failed to fetch sample 3511999. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249491b20> Failed to fetch sample 3692543. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7bfa10> Failed to fetch sample 2147513. Exception: cannot identify image file <_io.BytesIO object at 0x7f80284b7f10> Failed to fetch sample 2158552. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369feb10> Failed to fetch sample 4380313. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac23e8e0> Failed to fetch sample 1459345. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7f6f70> Failed to fetch sample 2979749. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a56961490> Failed to fetch sample 2463630. Exception: Failed to fetch sample 3249441. Exception:ject at 0x7fbcdc1a6a70> ile <_io.BytesIO object at 0x7fd07a576de0> Failed to fetch sample 3463683. Exception: cannot identify image f cannot identify imaFailed to fetch sample cannot identify image file <_io.BytesIO object at 0x7fca1c1a7010> Failed to fetch sample 3298418. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249680db0> annot identify image file <_io.BytesIO object at 0x7f1f1b0e71a0> Failed to fetch sample 2696540. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f64900> Failed to fetch sample 4046007. Exception: cannot identify image file <_io.BytesIO object at 0x7f9e997ca1b0> Failed to fetch sample 2522299. Exception: cannot identify image file <_io.BytesIO object at 0x7f505eb4b6a0> Failed to fetch sample 4106695. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c2a9d0> Failed to fetch sample 1580319. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24966a9d0> Failed to fetch sample 1806693. Exception:ject at 0x7f52dad16250> Failed to fetch sample 1523937. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b243fb0> Failed to fetch sample 3467927. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fc15d0> Failed to fetch sample 1171510. 
Exception: ject at 0x7fcbdc1ba9d0> cannot identify image file <_io.BytesIO object at 0x7f524fbdbba0> le <_io.BytesIO object at 0x7f80a82b60c0> cannot identify image file <_io.BytesIO object at 0x7fd08940c950> ile <_io.BytesIO object at 0x7fbcdc1216c0> Failed to fetch sample 2888688. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7322eb60> cannot identify image file <_io.BytesIO object at 0x7f5e6a960b80> Failed to fetch sample 3672667. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac26ea70> ailed to fetch sample 2479743. Exception: cannot identify image file <_io.BytesIO object at 0x7f10af1aa430> Failed to fetch sample 2051837. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2c3c90> Failed to fetch sample 2388130. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b279800> Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137fc28e0> Failed to fetch sample 1355047. Exception:ject at 0x7f524fa829d0>Failed to fetch sample 3233535. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe7240c4a0> Failed to fetch sample 1318636. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6612ed0> Failed to fetch sample 2089328. Exception:bject at 0x7fc0f09367f0> 719591. Exception:cannot identify image file <_io.BytesIO object at 0x7efd6f79f830> cannot identify image file <_io.BytesIO object at 0x7fd079dc7ce0> Failed to fetch sample 4154800. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200376f20> Failed to fetch sample 1095181. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2b8900> cannot identify image file <_io.BytesIO object at 0x7fbe723fc450> Failed to fetch sample 2928243. Exception: cannot identify image fiFailed to fetch sample 1207231. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a82b5300> Failed to fetch sample 3116930. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac26f1a0> Failed to fetch sample 3167299. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722a8e00> Failed to fetch sample 1019459. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6504e00> Failed to fetch sample 3879860. Exception: ect at 0x7f12b3847d30> Failed to fetch sample 2280960. Exception: cannot identify image fiFailed to fetch sample 2940708. Exception: ect at 0x7fcf7040f510> Failed to fetch sample 3228934. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 2086216. Exception:mple 1100749. Exception: cannot identify image file <_io.BytesIO object at 0x7fcf2296ff10> Failed to fetch sample 3349915. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2e6de0> cannot identify image file <_io.BytesIO object at 0x7f18e64cef70> Failed to fetch sample 1448134. ExceptionFailed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0936ac0> annot identify image file <_io.BytesIO object at 0x7f94ac26f060> Failed to fetch sample 3036687. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1a6ac0> Failed to fetch sample 3571975. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a573f60> Failed to fetch sample 1947526. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369f65c0> Failed to fetch sample 3467927. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64cdd50> Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72408e00> cannot identify image file <_io.BytesIO object at 0x7f12b3824860> Failed to fetch sample 4207065. Exception: ject at 0x7f80a80ebbf0> cannot identify image file <_io.BytesIO object at 0x7fbe723ea5c0> Failed to fetch sample 3180383. Exception:cannot identify image file <_io.BytesIO object at 0x7fe4c424b290> Failed to fetch sample 2147268. 
Exception: Failed to fetch sample 1348890. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f924450> Failed to fetch sample 4104440. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5732e0> Failed to fetch sample 1075274. Exception: cannot identify image fiFailed to fetch sample 2801811. Exception: cannot identify image fFailed to fetch sample 1230454. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdc1760> Failed to fetch sample 1382990. Exception: Failed to fetch sample 4117657. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec282cf0> Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1baed0> Failed to fetch sample 3614998. Exception: cannot identify image file <_io.BytesIO object at 0x7f96c13e3ba0> cannot identify image file <_io.BytesIO object at 0x7f18e650d440> Failed to fetch sample 2912699. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1af7e0> Failed to fetch sample 2439548. Exception: cannot identify image file <_io.BytesIO object at 0x7fbea6365080> Failed to fetch sample 1250858. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2835ffd30> Failed to fetch sample 1858383. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec280d10> Failed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1a5d50> Failed to fetch sample 3876873. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1ae7a0> Failed to fetch sample 2641615. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249682340> Failed to fetch sample 3091512. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249677380> Failed to fetch sample 1524920. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec280b30> Failed to fetch sample 4375328. 
Failed to fetch sample 978639. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249675300>
Failed to fetch sample 2633741. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1c49f0>
Failed to fetch sample 3900769. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1deac0>
[... the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" message repeats for several hundred sample ids, interleaved across worker processes ...]
Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2b8590> cannot identify image file <_io.BytesIO object at 0x7f7ddb155350> Failed to fetch sample 1898440. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1b2660> Failed to fetch sample 2961542. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfaa6eba60> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7fbab2fc1530> Failed to fetch sample 1432321. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbde562d90> Failed to fetch sample 1631688. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80dc310> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706e59a80> Failed to fetch sample 2915907. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f569d0> cannot identify image file <_io.BytesIO object at 0x7fe4c42926b0> le <_io.BytesIO object at 0x7fca1c18b600> Failed to fetch sample 3203914. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2496834c0> Failed to fetch sample 1218899. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a8178860> Failed to fetch sample 2335870. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134137560> Failed to fetch sample 1198363. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9a6f1c60> Failed to fetch sample 1480960. Exception: Failed to fetch sample 3664242. Exception: ect at 0x7fb9cb2b5e90> Failed to fetch sample 2180795. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a817e340> cannot identify image file <_io.BytesIO object at 0x7fe0ec2cdf80>cannot identify image file <_io.BytesIO object at 0x7fe436a70a40> Failed to fetch sample 3421855. Exception: Failed to fetch sample 1232641. 
Exception: cannot identify image fcannot identify image file <_io.BytesIO object at 0x7fc2eb7b9800> Failed to fetch sample 1327539. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1be9d0> Failed to fetch sample 3010687. Exception: cannot identify image ficannot identify image file <_io.BytesIO object at 0x7fc92ec62e80> Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 1445442. Exception:mple 1448881. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1a7fb0> Failed to fetch sample 3716501. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137ffbf60> ailed to fetch sample 2587814. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac25ed40> Failed to fetch sample 1345056. Exception: ject at 0x7f44448f4d10>F cannot identify image file <_io.BytesIO object at 0x7fbf0073d350> cannot identify image file <_io.BytesIO object at 0x7fc99c213b00> Failed to fetch sample 3280452. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706ee05e0> Failed to fetch sample 2767646. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e4b790> Failed to fetch sample 2040873. Exception:F cannot identify image file <_io.BytesIO object at 0x7fbcdc1a55d0> Failed to fetch sample 3159206. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7ff9170> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac26ff60> e <_io.BytesIO object at 0x7fca2e1d94e0> Failed to fetch sample 2761972. Exception:jFailed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfaa47d0d0> ect at 0x7fe89010e520> Failed to fetch sample 3868582. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a51f30> Failed to fetch sample 2654468. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1ba980> Failed to fetch sample 2033363. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc706ee3100> Failed to fetch sample 1602779. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706e5a750> Failed to fetch sample 2979603. Exception:bject at 0x7fbe722c7100> le <_io.BytesIO object at 0x7fbf00757e70> Failed to fetch sample 2122051. Exception: Failed to fetch sample 1331617. Exception: ect at 0x7ff4e3e4ba60> Failed to fetch sample 994847. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfaa47dad0> Failed to fetch sample 4243929. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249683740> : cannot identify image file <_io.BytesIO object at 0x7f80a82d9bc0> Failed to fetch sample 1171510. Exception: cannot identify image file <_io.BytesIO object at 0x7fba0c580ef0> Failed to fetch sample 2145986. Exception: cannot identify image file <_io.BytesIO object at 0x7f7dcc983ab0> Failed to fetch sample 1792884. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce46750> cannot identify image file <_io.BytesIO object at 0x7fbe72407510> Failed to fetch sample 3647389. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf0073dc10> Failed to fetch sample 2790481. Exception: cannot identify image fiFailed to fetch sample 1958605. Exception:FFailed to fetch sample 2903623. Exception: annot identify image file <_io.BytesIO object at 0x7fb38897bba0> cannot identify image file <_io.BytesIO object at 0x7f80a80fa5c0> Failed to fetch sample 3105090. Exception: cannot identify image file <_io.BytesIO object at 0x7f7dc8be19e0> le <_io.BytesIO object at 0x7fb687f90770> Failed to fetch sample 3879860. Exception: cannot identify image fiFailed to fetch sample 2719591. Exception:Failed to fetch sample 3634589. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f821b0> cannot identify image file <_io.BytesIO object at 0x7f9abcdc6f20> Failed to fetch sample 2979603. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a25800> cannot identify image file <_io.BytesIO object at 0x7fbe722bc0e0> Failed to fetch sample 2402011. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d671f0> Failed to fetch sample 2129340. Exception: cannot identify image file <_io.BytesIO object at 0x7f96c13e3fb0> Failed to fetch sample 1100749. Exception: cannot identify image file <_io.BytesIO object at 0x7fc10844ce00> Failed to fetch sample 1635122. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1070a1670> Failed to fetch sample 2047816. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c7e399da0> Failed to fetch sample 3494092. Exception:bject at 0x7f5de37e7d30> cannot identify image file <_io.BytesIO object at 0x7f99a391b8d0> cannot identify image file <_io.BytesIO ob cannot identify image fcannot identify image file <_io.BytesIO objFailed to fetch sample Failed to fetch sample 2960042. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7323bf60> Failed to fetch sample 4117657. Exception:ject at 0x7fcbdc1ae520> Failed to fetch sample 2889444. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3960180> le <_io.BytesIO object at 0x7f9e9c1da930> Failed to fetch sample 1753067. Exception:bject at 0x7f6682e66980> Failed to fetch sample 1203227. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f99a3962a70> Failed to fetch sample 2571103. Exception: cannot identify image file <_io.BytesIO object at 0x7f8151ef9670> cannot identify image file <_io.BytesIO object at 0x7fe4440b9350> Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d730d3c40> Failed to fetch sample 1858383. Exception:Failed to fetch sample 2786440. Exception:bject at 0x7f0c35c3dee0> Failed to fetch sample 1491402. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39513a0> Failed to fetch sample 2979749. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59acf0> cannot identify image file <_io.BytesIO object at 0x7efd6f7764d0> Failed to fetch sample 3895655. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f292ca0> Failed to fetch sample 3929909. Exception: cannot identify image file <_io.BytesIO object at 0x7fa14c6d2e30> cannot identify image file <_io.BytesIO object at 0x7f0c35c3c360> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7f9ad4587600> cannot identify image file <_io.BytesIO object at 0x7fa275cb2020> ile <_io.BytesIO object at 0x7f99a394a3e0> Failed to fetch sample 2437168. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3953f10> Failed to fetch sample 1524920. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e4a890> cannot identify image file <_io.BytesIO obFailed to fetch sample 3Failed to fetch sample 2918234. Exception:ject at 0x7f99a3920590> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f97b0167fb0> Failed to fetch sample 1610821. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278bdbc0> Failed to fetch sample 1222876. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fa275cb7bf0> 05596. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278c48b0> Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3962b60> Failed to fetch sample 3267533. Exception:ject at 0x7f35a7f5b330> i cannot identify image file <_io.BytesIO oFailed to fetch sample 29Failed to fetch s Failed to fetch sample 4079327. Exception: ect at 0x7f83e8318360> ject at 0x7f47278b7ab0> Failed to fetch sample 3892787. Exception: cannot identify image fFailed to fetch sample 1768427. 
Exception: Failed to fetch sample 2562281. Exception: cannot identify image file <_io.BytesIO object at 0x7f4450b68f90> Failed to fetch sample 1761661. Exception: cannot identify image file <_io.BytesIO object at 0x7fd310dc2c50> Failed to fetch sample 1650924. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c22a0> Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f0937f10> Failed to fetch sample 3653818. Exception:ject at 0x7f9d73280ae0> Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7323acf0> Failed to fetch sample 2617567. Exception:bject at 0x7f99a39489a0> cannot identify image file <_io.BytesIO object at 0x7fc5d036ea20> cannot identify image file <_io.BytesIO object at 0x7fca1c1b9940> le <_io.BytesIO object at 0x7fc132355ad0> cannot identify image file <_io.BytesIO object at 0x7f18e5d1e570> Failed to fetch sample 2282533. Exception: cannot identify image file <_io.BytesIO object at 0x7fd079db7330> cannot identify image file <_io.BytesIO object at 0x7fc108456520> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3961d50> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca0ae0> Failed to fetch sample 3629990. Exception:ject at 0x7fc12815f970> Failed to fetch sample 1445442. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d1a2a0> Failed to fetch sample 1395563. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e84ee0c0> Failed to fetch sample 4123070. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9eff9b2e0> Failed to fetch sample 2108438. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca0860> Failed to fetch sample 2027281. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560e5940> Failed to fetch sample 1612112. 
Exception: cannot identify image fiFailed to fetch sample 3029045. Exception: cannot identify image file <_io.BytesIO object at 0x7f191dbaef20> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7f9ad4585080> Failed to fetch sample 3467927. Exception:Fcannot identify image file <_io.BytesIO object at 0x7f0c35c29940> Fcannot identify image file <_io.BytesIO object at 0x7f83e838f790> Failed to fetch sample 3113386. Exception:cannot identify image file <_io.BytesIO object at 0x7f713412c7c0> FFailed to fetch sample 1740611. Exception:Failed to fetch sample 2858679. Exception: cannot identify image fFailed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59e8e0> ile <_io.BytesIO object at 0x7f6556066ed0> Failed to fetch sample 4186967. Exception: cannot identify image file <_io.BytesIO object at 0x7f396d309760> Failed to fetch sample 2379671. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24966b2e0> Failed to fetch sample 2479743. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e838e200> ailed to fetch sample 2960755. Exception:Failed to fetch sample 1198666. Exception:bject at 0x7f5a5e783fb0> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5ad440> Failed to fetch sample 3240002. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f7240> cannot identify image file <_io.BytesIO oFailed to fetch sample 3240425. Exception: mage file <_io.BytesIO oFailed to fetch sample 2101533. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8ead800> bject at 0x7ff249485490> Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e838ec50> Failed to fetch sample 2267632. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f47600> cannot identify image file <_io.BytesIO object at 0x7f81db5acb30> ile <_io.BytesIO object at 0x7fca1c1bb470> Failed to fetch sample 2103217. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2494a3e20> Failed to fetch sample 2040873. Exception: Failed to fetch sample 2774853. Exception: cannot identify image fFailed to fetch sample 2294453. Exception: cannot identify image fFailed to fetch sample 1316862. Exception: Failed to fetch sample 2799267. Exception: cannot identify image fFailed to fetch sample 2851854. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278f7970> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5af150> cannot identify image file <_io.BytesIO object at 0x7f3939ce92b0> Failed to fetch sample 3840067. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6501b70> Failed to fetch sample 1806603. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9eff9b0b0> Failed to fetch sample 3661928. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8339710> ile <_io.BytesIO object at 0x7f58c3827880> ject at 0x7fbcdc1c5260> Failed to fetch sample 2641615. Exception: cannot identify image fiFailed to fetch sample 2738082. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec277380> le <_io.BytesIO object at 0x7f35a7f2f600> Failed to fetch sample 3848013. Exception:Failed to fetch sample 1651629. Exception: cannot identify image fiFailed to fetch sample 3021215. Exception:bject at 0x7f81db59f7e0> 04849. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2494f8f90> Failed to fetch sample 1689368. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c18aac0> Failed to fetch sample 3670509. Exception:bject at 0x7f1f1b2867a0> le <_io.BytesIO object at 0x7fcbdef66250> Failed to fetch sample 2164469. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff2495236f0> Failed to fetch sample 1012319. Exception: cannot identify image file <_io.BytesIO object at 0x7f84378b27f0> Failed to fetch sample 2890579. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9eff9a110> Failed to fetch sample 3455872. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd3970> Failed to fetch sample 1879013. Exception: ject at 0x7f7134136200> Failed to fetch sample 4179589. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5f380> Failed to fetch sample 1327539. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fe3060> Failed to fetch sample 2641615. Exception: cannot identify image fiFailed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8ee2980> Failed to fetch sample 3010687. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f3cf90> le <_io.BytesIO object at 0x7f83e83487c0> Failed to fetch sample 1122273. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c217ec0> Failed to fetch sample 3909071. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e6a003f60> Failed to fetch sample 994847. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1bbe70> Failed to fetch sample 4089308. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2495214e0> Failed to fetch sample 980211. Exception: cannot identify image filFailed to fetch sample 3421855. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f44d10> Failed to fetch sample 3280452. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5de40> Failed to fetch sample 2940430. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a9065b70> Failed to fetch sample 3083149. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b426d580> Failed to fetch sample 2024032. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f55ad0> Failed to fetch sample 3722411. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd7290> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275ccafc0> Failed to fetch sample 1504520. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9a6de0> Failed to fetch sample 3013832. Exception:bject at 0x7fd07a609d00> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08ed120> Failed to fetch sample 3879860. Exception: ject at 0x7f7134134270> Failed to fetch sample 2403018. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f47330> Failed to fetch sample 1445442. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8ee20c0> Failed to fetch sample 2033363. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fe2d40> cannot identify image file <_io.BytesIO object at 0x7fdcaf82ccc0> le <_io.BytesIO object at 0x7f70b426e160> cannot identify image file <_io.BytesIO object at 0x7f83e834bec0> Failed to fetch sample 2827935. Exception: Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278b0680> Failed to fetch sample 2086216. Exception: cannot identify image fFailed to fetch sample 1403393. Exception: Failed to fetch sample 1806693. Exception: cannot identify image fFailed to fetch sample 1448881. Exception: Failed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278db510> Failed to fetch sample 2531778. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a60b8d0> Failed to fetch sample 1474494. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2509a0> Failed to fetch sample 2858679. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5e3e0> Failed to fetch sample 3096384. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fe0450>cannot identify image file <_io.BytesIO object at 0x7f1137f484a0> ile <_io.BytesIO object at 0x7efd3a02b600> Failed to fetch sample 1930721. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a90838d0> Failed to fetch sample 2682973. Exception: cannot identify image file <_io.BytesIO object at 0x7f56681d2ed0> cannot identify image file <_io.BytesIO object at 0x7f0986450950>cannot identify image file <_io.BytesIO obFailed to fetch sample 1327539. Exception: mple 2274389. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a562930> 35544. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe72405620> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO obcannot identify image file <_io.BytesIO object at 0x7f7134295df0> cannot identify image file <_io.BytesIO object at 0x7fa275cdc360> Failed to fetch sample 1597034. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cdbdd0> Failed to fetch sample 3207931. Exception:bject at 0x7f001f9a7510> Failed to fetch sample 2806356. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe723fc450> Failed to fetch sample 1171510. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83adf80> Failed to fetch sample 1194243. Exception: cannot identify image file <_io.BytesIO object at 0x7f3981a7dd50> cannot identify image file <_io.BytesIO object at 0x7efd6f7bbfb0> Failed to fetch sample 3513077. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d6250> Failed to fetch sample 2202214. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a609440> bject at 0x7fe0ec2525c0> Failed to fetch sample 1806603. Exception:Failed to fetch sample 3314842. Exception: ject at 0x7fa275cd5bc0> Failed to fetch sample 1641060. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cfba60> Failed to fetch sample 1689089. Exception: cannot identify image file <_io.BytesIO object at 0x7f051343fbf0> cannot identify image file <_io.BytesIO object at 0x7f7134136750>Failed to fetch sample 2710385. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7ff8d0> FFailed to fetch sample 1740611. Exception:cannot identify image file <_io.BytesIO object at 0x7efa9b9e3b50> Failed to fetch sample 1019459. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b1800> cannot identify image file <_io.BytesIO object at 0x7f3939c77a10> Failed to fetch sample 4046281. Exception: cannot identify image file <_io.BytesIO object at 0x7fd3103d5080> Failed to fetch sample 2033363. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2d3470> Failed to fetch sample 2783531. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cfbbf0> Failed to fetch sample 1327539. Exception: cannot identify image file <_io.BytesIO object at 0x7f051343cef0> Failed to fetch sample 1198666. Exception: ect at 0x7f9abce83e20> Failed to fetch sample 2719591. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340edcb0> Failed to fetch sample 4201650. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc111c10> Failed to fetch sample 3270019. Exception: ject at 0x7f3939c76bb0> cannot identify image file <_io.BytesIO object at 0x7fd07a562520> Failed to fetch sample 3030569. Exception: cannot identify image file <_io.BytesIO object at 0x7f6b6c1570b0> le <_io.BytesIO object at 0x7fa275cdee80> FFailed to fetch sample 2658107. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c426b470>FFailed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cedc10> Failed to fetch sample 2424060. Exception: cannot identify image fFailed to fetch sample 3238020. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f61ceb93ba0> Failed to fetch sample 3223727. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275368fe0> Failed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7f21b1243240> cannot identify image file <_io.BytesIO oFailed to fetch sample 1484787. Exception:bFailed to fetch sample 2Failed to fetch sa cannot identify image file <_io.BytesIO object at 0x7f098641fba0> Failed to fetch sample 1 cannot identify image file <_io.BytesIO object at 0x7f051343ee30> Failed to fetch saFailed to fetch sample 2323798. Exception:bject at 0x7f5de37e7bf0> ject at 0x7fa50aa30d60> cannot identify image file <_io.BytesIO object at 0x7f5838fc6fc0> Failed to fetch sample 2761972. Exception: cannot identify image fiFailed to fetch sample 1329424. Exception:Fcannot identify image file <_io.BytesIO o cannot identify image fiFailed to fetch sample 3452401. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a4aeabfb0> le <_io.BytesIO object at 0x7f051343d5d0> Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x7fa75e325080> Failed to fetch sample 3506261. Exception:Failed to fetch sample 23 cannot identify image file <_io.BytesIO object at 0x7fe4c42630b0> 76750. Exception:Failed to fetch sample 1931180. Exception: cannot identify image fiFailed to fetch sample 3569930. Exception: cannot identify image file <_io.BytesIO object at 0x7f5caeba17b0> Failed to fetch sample 1644264. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668132ca0> Failed to fetch sample 1422467. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4473705e0> Failed to fetch sample 2274389. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986485940> Failed to fetch sample 3034526. Exception: cannot identify image file <_io.BytesIO object at 0x7f5cb38fbf10> Failed to fetch sample 2359987. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1f2a3fbe70> Failed to fetch sample 1742099. Exception: cannot identify image file <_io.BytesIO object at 0x7f52ce1fa110> Failed to fetch sample 1120039. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4282660> Failed to fetch sample 3647389. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3961620> Failed to fetch sample 1149817. Exception: cannot identify image fiFailed to fetch sample 1327539. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc144130> Failed to fetch sample 2786440. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1ef4c0> cannot identify image file <_io.BytesIO object at 0x7fe436a57d30> le <_io.BytesIO object at 0x7f52a1abd1c0> Failed to fetch sample 1250858. Exception: Failed to fetch sample 2503403. Exception: cannot identify image fFailed to fetch sample 1252147. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b283510> ile <_io.BytesIO object at 0x7f5cb38fb010> Failed to fetch sample 3227648. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722bd940> Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59a2f0> Failed to fetch sample 2202214. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986486f70> Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668131a80> Failed to fetch sample 1907202. Exception: cannot identify image file <_io.BytesIO object at 0x7f0e4eb86ac0> Failed to fetch sample 3091512. Exception: cannot identify image file <_io.BytesIO object at 0x7fe696745760> Failed to fetch sample 2040208. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b24a070> Failed to fetch sample 1407211. Exception: cannot identify image file <_io.BytesIO object at 0x7f5cb38b0c70> Failed to fetch sample 3898287. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb715cb0> le <_io.BytesIO object at 0x7f969583fec0> Failed to fetch sample 4125296. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d6bba0> Failed to fetch sample 3148873. Exception: cannot identify image ficannot identify image file <_io.BytesIO ob cannot identify image file <_io.BytesIO object at 0x7f32986bb740> Failed to fetch sample 2887391. Exception: cannot identify image file <_io.BytesIO object at 0x7f3377e58770> cannot identify image file <_io.BytesIO object at 0x7fdfad624450> Failed to fetch sample 2801888. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1476f0> Failed to fetch sample 1042779. Exception: cannot identify image file <_io.BytesIO object at 0x7fc73df1bab0> Failed to fetch sample 3518221. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249486a20> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd7c90> FFailed to fetch sample 3538308. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7ba980> ailed to fetch sample 2903629. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249523830> Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275caba60> Failed to fetch sample 1280998. Exception:bject at 0x7fbcdc12bec0> Failed to fetch sample 2108837. Exception: cannot identify image fiFailed to fetch sample 2806356. Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 27Failed to fetch sample 3687864. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 24Failed to fetch sample 3117311. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249523e20> le <_io.BytesIO object at 0x7f99a394a9d0> Failed to fetch sample 1586835. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59ab10> Failed to fetch sample 1870993. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fe4c4260950> Failed to fetch sample 1329379. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b2828e0> Failed to fetch sample 1504520. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42b19e0> Failed to fetch sample 1659412. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a394a160> Failed to fetch sample 2966793. Exception:bject at 0x7f35a7f66430> cannot identify image file <_io.BytesIO object at 0x7fe1f484ce00> Failed to fetch sample 3557596. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b281120> cannot identify image file <_io.BytesIO object at 0x7f991b214590> Failed to fetch sample 4081563. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722bd8a0> Failed to fetch sample 3105090. Exception:Failed to fetch sample 980301. Exception: cannot identify image fil cannot identify image file <_io.BytesIO object at 0x7f35a7fd9760> Failed to fetch sample 2369979. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8e8a5c0> Failed to fetch sample 3892787. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 39 cannot identify iFailed to fetch sample 2981136. Exception:4d850> Failed to fetch sample 3926066. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7fe696745760> Failed to fetch sample 2865695. Exception: cannot identify image file <_io.BytesIO object at 0x7fb456751850> Failed to fetch sample 1303658. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f906d0> Failed to fetch sample 2801811. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4262ac0> Failed to fetch sample 3596833. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb7b2776f20> Failed to fetch sample 1133290. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 2cannot identify image file <_io.BytesIO object at 0x7f0986471850> ile <_io.BytesIO o Failed to fetch sample 1140456. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4263560> Failed to fetch sample 2603053. Exception: cannot identify image fFailed to fetch sample 2330684. Exception: Failed to fetch sample 1168708. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3737ba0> cannot identify image file <_io.BytesIO object at 0x7fb7401e1800> Failed to fetch sample 3629990. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42b0180> Failed to fetch sample 3837981. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644d760> Failed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f65580> Failed to fetch sample 2794558. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8e8abb0> Failed to fetch sample 2658306. Exception:ject at 0x7fcf24787fb0> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42b1620> Failed to fetch sample 4186967. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f03b50> cannot identify image file <_io.BytesIO oFailed to fetch sample 35 cannot identify image file <_io.BytesIO oFailed to fetch sample 3929909. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c217970> bject at 0x7fe4c42b16c0> Failed to fetch sample 1561037. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f91850> Failed to fetch sample 3731866. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c272390> Failed to fetch sample 2966793. Exception: cannot identify image file <_io.BytesIO object at 0x7fd17e7fcb30> Failed to fetch sample 1664782. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4263e20> Failed to fetch sample 2123650. Exception: cannot identify image file <_io.BytesIO object at 0x7fb39d910ae0> Failed to fetch sample 1504520. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2c5440> Failed to fetch sample 2331127. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8efcc70> Failed to fetch sample 1830487. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc27c0e0> le <_io.BytesIO object at 0x7fb39f275a30> cannot identify image file <_io.BytesIO object at 0x7f6a7f28e2f0> Failed to fetch sample 1171510. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644ebb0> Failed to fetch sample 2583452. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f918a0> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f26e9d0> Failed to fetch sample 1061148. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cc9a2c50> Failed to fetch sample 3148873. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a3eb60> Failed to fetch sample 2719591. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1c8c1490> Failed to fetch sample 2202789. Exception:cannot identify image file <_io.BytesIO object at 0x7f9a9c285da0> cannot identify image file <_io.BytesIO object at 0x7fe40fc5b330> cannot identify image file <_io.BytesIO object at 0x7f472789bbf0> le <_io.BytesIO obFailed to fetch sample 1739207. Exception: ect at 0x7f1f1b281710> Failed to fetch sample 3290404. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2877e0> Failed to fetch sample 1792884. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c426ab60> cannot identify image file <_io.BytesIO object at 0x7fe6cbfe5a80> Failed to fetch sample 4084399. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe4804fa7a0> ile <_io.BytesIO object at 0x7fe29baf58a0> cannot identify image file <_io.BytesIO object at 0x7f36afb0b880> le <_io.BytesIO object at 0x7f9a9c285a30> FFailed to fetch sample 2104150. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc193380>FFailed to fetch sample 2101533. Exception:cannot identify image file <_io.BytesIO object at 0x7fe6cc9a2ed0> Failed to fetch sample 2554836. Exception: cannot identify image fiFailed to fetch sample 2964062. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134136d90> Failed to fetch sample 2327994. Exception: cannot identify image file <_io.BytesIO object at 0x7f655601f790> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2c55d0> Failed to fetch sample 3596834. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fdc60> Failed to fetch sample 1721141. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64cde90> Failed to fetch sample 1414336. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a41530> Failed to fetch sample 3240425. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134137650> Failed to fetch sample 1434228. Exception: cannot identify image file <_io.BytesIO object at 0x7f724518e840> Failed to fetch sample 3596808. Exception:bject at 0x7f5c9f6e9670> le <_io.BytesIO object at 0x7f3377e58cc0> cannot identify image file <_io.BytesIO object at 0x7f18e64c8ea0> le <_io.BytesIO object at 0x7f08280fee80> Failed to fetch sample 1659412. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369f54e0> Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64c9df0> Failed to fetch sample 2522299. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa275d095d0> cannot identify image file <_io.BytesIO object at 0x7f6e55f93790> Failed to fetch sample 1448881. Exception: cannot identify image file <_io.BytesIO object at 0x7f3375d31c60> le <_io.BytesIO object at 0x7f082818b740> Failed to fetch sample 2066707. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24966ed90> Failed to fetch sample 2746130. Exception:Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x7f472775f010> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7f47279e7650> Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7f47edf85710> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278be070> Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO object at 0x7f472789bba0> Failed to fetch sample 1047873. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722c41d0> Failed to fetch sample 1573349. Exception: cannot identify image fFailed to fetch sample 3463683. Exception: Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a42f20> Failed to fetch sample 3177508. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc5c9030> Failed to fetch sample 2051793. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b4263790> cannot identify image file <_io.BytesIO object at 0x7fd07a617970> Failed to fetch sample 1171510. Exception: cannot identify image file <_io.BytesIO object at 0x7f32d28152b0> Failed to fetch sample 2500688. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce680e0> Failed to fetch sample 4179589. Exception:Failed to fetch sample 3711673. 
Exception:bject at 0x7fd07a616f70> cannot identify image file <_io.BytesIO object at 0x7fcbdc261580> Failed to fetch sample 3418410. Exception: Failed to fetch sample 3421855. Exception: ect at 0x7fa14c6d6160> Failed to fetch sample 3487093. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb74a2f0>Failed to fetch sample 2994604. Exception:bject at 0x7f9d73254ef0> Failed to fetch sample 2142673. Exception: cannot identify image file <_io.BytesIO object at 0x7f56681d2b10> Failed to fetch sample 2980232. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f54027ba0> Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb8ea8e0> cannot identify image file <_io.BytesIO object at 0x7f35a7f815d0> Failed to fetch sample 2531690. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc23c180> Failed to fetch sample 1131587. Exception: ject at 0x7f9d7327d260> Failed to fetch sample 3531448. Exception: cannot identify image file <_io.BytesIO object at 0x7f56681d0cc0> Failed to fetch sample 2806356. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f4e73cf90> cannot identify image file <_io.BytesIO object at 0x7fd07a617d80> cannot identify image file <_io.BytesIO object at 0x7f9e5fbb37e0> Failed to fetch sample 3494092. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b288270> Failed to fetch sample 1876232. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401f1580> Failed to fetch sample 1602779. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73256a20> cannot identify image file <_io.BytesIO object at 0x7fcbdc1927f0> Failed to fetch sample 1277996. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f83470> Failed to fetch sample 1792884. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f9a0c0> Failed to fetch sample 1816957. 
Exception: cannot identify image fFailed to fetch sample 3489379. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded143790> ile <_io.BytesIO object at 0x7f65560de390> Failed to fetch sample 1034340. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f0f600> Failed to fetch sample 2719591. Exception: cannot identify image file <_io.BytesIO object at 0x7f060231d670> Failed to fetch sample 4123695. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c952b0> Failed to fetch sample 1743045. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249681170> Failed to fetch sample 1484787. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b0e00> Failed to fetch sample 1642184. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c95c10> Failed to fetch sample 2746828. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249681850> Failed to fetch sample 1095315. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275d0ae80> Failed to fetch sample 2531778. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b20c0> Failed to fetch sample 3648356. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2496805e0> Failed to fetch sample 1798420. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84d4950> Failed to fetch sample 1992912. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec265710> Failed to fetch sample 1785543. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275c96d90> Failed to fetch sample 2274389. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b1ee0> Failed to fetch sample 3661452. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2494904f0> Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f2b290> Failed to fetch sample 2642509. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b283510> Failed to fetch sample 2767646. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f81300>Failed to fetch sample 2363461. Exception:: cannot identify image file <_io.BytesIO object at 0x7f94ac2449a0> Failed to fetch sample 2790481. Exception: cannot identify image file <_io.BytesIO object at 0x7fba0df14d60> cannot identify image file <_io.BytesIO object at 0x7f1f1b24a2a0> le <_io.BytesIO object at 0x7f71341a1cb0> Failed to fetch sample 1923659. Exception: Failed to fetch sample 3180383. Exception:ject at 0x7f80a81545e0> cannot identify image file <_io.BytesIO object at 0x7efd6f79d8a0> cannot identify image file <_io.BytesIO object at 0x7f1f1b249710> Failed to fetch sample 4046281. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f04e00> Failed to fetch sample 1000131. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec269f80> le <_io.BytesIO object at 0x7fb9cb2b5e90> cannot identify image file <_io.BytesIO object at 0x7f35a7f88c20> cannot identify image file <_io.BytesIO object at 0x7f1f1b252840> le <_io.BytesIO object at 0x7f80a8238950> Failed to fetch sample 4235606. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341a06d0> Failed to fetch sample 1377471. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249682070> Failed to fetch sample 3584846. Exception:bject at 0x7efd6f7c8860> Failed to fetch sample 2572486. Exception: ject at 0x7f2009934b80> cannot identify image file <_io.BytesIO object at 0x7fcbdc223920> Failed to fetch sample 2827935. Exception: Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b281710> Failed to fetch sample 1504520. Exception: cannot identify image file <_io.BytesIO object at 0x7f10af018db0> cannot identify image file <_io.BytesIO object at 0x7f7134195850> Failed to fetch sample 3055804. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0ef2e0> Failed to fetch sample 3557219. Exception:bject at 0x7fc9b7f071f0> Failed to fetch sample 1916848. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec26a2f0> Failed to fetch sample 2867178. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83afe20> Failed to fetch sample 2919698. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b280220>cannot identify image file <_io.BytesIO object at 0x7fd07a60bd80> ile <_io.BytesIO object at 0x7f94ac247d80> Failed to fetch sample 2438893. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828131b70> Failed to fetch sample 2656628. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f795800> Failed to fetch sample 3350750. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644c7c0> Failed to fetch sample 3259618. Exception: cannot identify image file <_io.BytesIO object at 0x7f71342d5e40> Failed to fetch sample 3575993. Exception:Failed to fetch sample 4376367. Exception: cannot identify image fiFailed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b250f90> cannot identify image file <_io.BytesIO object at 0x7fd08a213c90> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f81580> Failed to fetch sample 1507789. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc147f10> Failed to fetch sample 2861238. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc2526b0> Failed to fetch sample 2149756. Exception: cannot identify image file <_io.BytesIO object at 0x7f713410a7a0> cannot identify image file <_io.BytesIO object at 0x7f5b05e905e0> ile <_io.BytesIO object at 0x7f1137f49da0> cannot identify image file <_io.BytesIO object at 0x7ff2496831f0> Failed to fetch sample 3929909. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f07a9883330> Failed to fetch sample 1382222. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f074c0> cannot identify image file <_io.BytesIO object at 0x7fd07a60aca0> le <_io.BytesIO object at 0x7fbcdc1c7880> Failed to fetch sample 1597034. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1ab380> Failed to fetch sample 4117657. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f82250> Failed to fetch sample 4146076. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc251350> cannot identify image file <_io.BytesIO object at 0x7f1137ffa160> Failed to fetch sample 2752989. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687fb5620> Failed to fetch sample 2801811. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134109490> Failed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2ec090> Failed to fetch sample 2342401. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bdd760> Failed to fetch sample 1876146. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c6b7b2890> Failed to fetch sample 1301230. Exception: cannot identify image fFailed to fetch sample 1659412. Exception: Failed to fetch sample 4144029. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab9aed0> cannot identify image file <_io.BytesIO object at 0x7f713412dd50> Failed to fetch sample 2349988. Exception:Failed to fetch sample 3629990. Exception:bject at 0x7f001f94db20> Failed to fetch sample 1934945. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bded90> Failed to fetch sample 2633220. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16dad170> Failed to fetch sample 3508677. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f4f68e755d0> cannot identify image file <_io.BytesIO object at 0x7f9abcdcb8d0> Failed to fetch sample 1858383. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f49620> Failed to fetch sample 3215324. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341a3380> Failed to fetch sample 2875578. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2ef560> Failed to fetch sample 4045834. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded143790> Failed to fetch sample 3244596. Exception: Failed to fetch sample 3728803. Exception: ject at 0x7f1137ff8d60> ailed to fetch sample 3661452. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412dda0> Failed to fetch sample 2570312. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f55d50> Failed to fetch sample 1524920. Exception: cannot identify image f cannot identify image file <_io.BytesIO object at 0x7fca1c1b8e00> ailed to fetch sample 4101829. Exception: cannot identify image file <_io.BytesIO objFailed to fetch sample cannot identify image file <_io.BytesIO object at 0x7f11fdfa6250> le <_io.BytesIO object at 0x7fbcdc137d30> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7f11fdf840e0> Failed to fetch sample 1876146. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de3737fb0> Failed to fetch sample 2600057. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd6520> Failed to fetch sample 1688560. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1123e0> cannot identify image file <_io.BytesIO object at 0x7f530e73eed0> le <_io.BytesIO object at 0x7f80a817d850> Failed to fetch sample 4375328. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f3ebb0> Failed to fetch sample 2717980. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc84d52e0c0> Failed to fetch sample 2047816. Exception: cannot identify image file <_io.BytesIO object at 0x7f524fbb7a60> cannot identify image file <_io.BytesIO object at 0x7f5de3727e20> le <_io.BytesIO object at 0x7fbcdc10ee80> cannot identify image file <_io.BytesIO object at 0x7fca1c1b98a0> le <_io.BytesIO object at 0x7fcbdc193ab0> Failed to fetch sample 1377471. Exception: cannot identify image file <_io.BytesIO object at 0x7f71350af970> Failed to fetch sample 3136267. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0ef6f0> Failed to fetch sample 2417008. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42b0900> ile <_io.BytesIO object at 0x7fc715329940> Failed to fetch sample 2605425. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f2700> Failed to fetch sample 2578282. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1d94e0> Failed to fetch sample 1095181. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd7b50> cannot identify image file <_io.BytesIO object at 0x7f35a7f3f650> Failed to fetch sample 3573478. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc10f380> Failed to fetch sample 1457828. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f24f40> cannot identify image file <_io.BytesIO object at 0x7fc0f08fd6c0> le <_io.BytesIO object at 0x7f713412c540> Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8769c60> Failed to fetch sample 3900769. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f3da30> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc10ea20> cannot identify image file <_io.BytesIO object at 0x7fa275ce6390> Failed to fetch sample 1085332. 
[data loader] Repeated worker errors follow (stdout from many processes was interleaved mid-line; collapsed here). Every occurrence has the form:

    Failed to fetch sample <sample_id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>

The error repeats for several hundred distinct sample ids (e.g. 2231126, 3538308, 2746130, 1135347, 1796453, 3215324, 2335870, 4379145, 1025825), many of them hit more than once across workers: PIL could not decode the image bytes stored for those samples.
Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf81ea70> Failed to fetch sample 1034396. Exception: cannot identify image file <_io.BytesIO object at 0x7fe15dcfc9a0> cannot identify image file <_io.BytesIO object at 0x7fbcdc125f80> Failed to fetch sample 3010141. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8f3ab0> Failed to fetch sample 2979749. Exception: ject at 0x7f81db5993a0> Failed to fetch sample 3719356. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db598400> Failed to fetch sample 4099207. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf82e890> Failed to fetch sample 3419178. Exception:Failed to fetch sample 3546995. Exception:bject at 0x7f81db59b6f0> cannot identify image file <_io.BytesIO oFailed to fetch sample 35 cannot identify icannot identify image file <_io.BytesIO object atcannot identify image file <_io.BytesIO object at 0x7fe4c42c4e00> Failed to fetch sample 1742099. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4274040> Failed to fetch sample 2297084. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 1431086. Exception: ject at 0x7fe4c4277c90> cannot identify image file <_io.BytesIO object at 0x7f655606a110> cannot identify image file <_io.BytesIO object at 0x7f0986473060> Failed to fetch sample 1743116. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a40630>FFailed to fetch sample 2047816. Exception:: cannot identify image file <_io.BytesIO object at 0x7f6556068680> Failed to fetch sample 2813892. Exception:bject at 0x7fe4c42c40e0> Failed to fetch sample 2535152. Exception:bject at 0x7f098644c6d0> cannot identify image file <_io.BytesIO object at 0x7f83e83ad760> Failed to fetch sample 1198666. Exception: cannot identify image file <_io.BytesIO object at 0x7f6191ca5170> Failed to fetch sample 2694826. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e56c0> cannot identify image file <_io.BytesIO object at 0x7f1b7b9c22a0> Failed to fetch sample 1258856. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83afe20> Failed to fetch sample 3840067. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ddcb0> cannot identify image file <_io.BytesIO object at 0x7f1a0d29a610> Failed to fetch sample 2898278. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5e570> Failed to fetch sample 1672535. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a400e0> Failed to fetch sample 1740611. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e7d30> Failed to fetch sample 2562281. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d1dd00> Failed to fetch sample 3876980. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644fe70> Failed to fetch sample 3870596. Exception:Failed to fetch sample 3021215. Exception: ject at 0x7f3377e58cc0> Failed to fetch sample 1258856. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a906ae80> cannot identify image file <_io.BytesIO object at 0x7f18e5d1fe20> Failed to fetch sample 2855055. Exception:ject at 0x7fca1c1bad40> cannot identify image file <_io.BytesIO object at 0x7f09bbb4e890> le <_io.BytesIO object at 0x7ff2494a38d0> Failed to fetch sample 2078887. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a9069850> Failed to fetch sample 1198666. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e5800> Failed to fetch sample 1947526. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a906bf60> Failed to fetch sample 1482362. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c5940> cannot identify image file <_io.BytesIO object at 0x7f09c7b66250> Failed to fetch sample 3374089. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f098647ede0> Failed to fetch sample 3840067. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80e4ae0> Failed to fetch sample 3634589. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e7ffdd530> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7f098647e7f0> Failed to fetch sample 2678277. Exception:Failed to fetch sample 1559425. Exception: ject at 0x7efd6f7bec50> cannot identify image file <_io.BytesIO object at 0x7f70b426d350> cannot identify image file <_io.BytesIO object at 0x7f055b4499e0> Failed to fetch sample 3302333. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac247240> Failed to fetch sample 1498357. Exception: cannot identify image file <_io.BytesIO object at 0x7f911e5525c0> Failed to fetch sample 2059469. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412f7e0> Failed to fetch sample 2757801. Exception: cannot identify image file <_io.BytesIO object at 0x7f7ddb155350> Failed to fetch sample 2918234. Exception: cannot identify image file <_io.BytesIO object at 0x7f91ffd77fb0> Failed to fetch sample 1372529. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5728e0> ailed to fetch sample 2934443. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134136d90> Failed to fetch sample 2861238. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80ff290> Failed to fetch sample 2892575. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac243010> Failed to fetch sample 3909821. Exception:Failed to fetch sample 4146076. Exception:bject at 0x7f0986449ee0> cannot identify image file <_io.BytesIO object at 0x7f81b9c9f150> Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7f71342d3bf0> Failed to fetch sample 2078876. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f70b3889670> Failed to fetch sample 2562281. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a55ac00> Failed to fetch sample 1276582. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8eace50> Failed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134137650> Failed to fetch sample 1019459. Exception: cannot identify image file <_io.BytesIO object at 0x7f713413c7c0> Failed to fetch sample 2717980. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80de160> Failed to fetch sample 3021215. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5718f0> Failed to fetch sample 2378371. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828114ef0> ailed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e6690d8f0> Failed to fetch sample 2365312. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9ce8d5d0> Failed to fetch sample 1973264. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939caad40> Failed to fetch sample 1692675. Exception: cannot identify image file <_io.BytesIO object at 0x7f3377e58cc0> Failed to fetch sample 3661452. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9ce8e6b0> Failed to fetch sample 2302077. Exception: cannot identify image file <_io.BytesIO object at 0x7f09864522a0> Failed to fetch sample 1878125. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec23b4c0> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7f05596256c0> Failed to fetch sample 1976843. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9bacd800> Failed to fetch sample 3870855. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5f650> Failed to fetch sample 1329424. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f71340fc0e0> Failed to fetch sample 4161397. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec270310> Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f70900> Failed to fetch sample 2506139. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b0950> Failed to fetch sample 1377471. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1ef380> ailed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7f71340fd850> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO obcannot identify image file <_io.BytesIO object at 0x7efd6f7ea2f0> cannot identify image file <_io.BytesIO object at 0x7fcbdc1ef6f0> Failed to fetch sample 4026469. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b1ee0> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134137fb0> Failed to fetch sample 1648989. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd5d7860c0> Failed to fetch sample 2679326. Exception:bject at 0x7f098646d9e0>Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7eab10> cannot identify image file <_io.BytesIO object at 0x7f35a7f5e8e0> Failed to fetch sample 3418606. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9bacd8f0> Failed to fetch sample 1252147. Exception: cannot identify image fiFailed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d49f0> Failed to fetch sample 2144064. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a55af20> Failed to fetch sample 2040208. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f05df0> cannot identify image file <_io.BytesIO object at 0x7f71341036a0> Failed to fetch sample 1431086. Exception:Failed to fetch sample 1878125. Exception:bject at 0x7fe0ec2f2e80> cannot identify image file <_io.BytesIO object at 0x7fd0b2072e30> Failed to fetch sample 1402391. Exception: cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7fc9b7f0f830> Failed to fetch sample 2813892. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f3e20> Failed to fetch sample 4184215. Exception: cannot identify image file <_io.BytesIO object at 0x7f563c5489f0> Failed to fetch sample 4161397. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdf64d0> Failed to fetch sample 3336797. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a8e9d530> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec24fce0> Failed to fetch sample 1234262. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a364d0> Failed to fetch sample 1099935. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d9432f920> cannot identify image file <_io.BytesIO object at 0x7fc9b7f051c0> Failed to fetch sample 1110918. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2937febb0> Failed to fetch sample 3876980. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f1fd0> Failed to fetch sample 2906377. Exception: cannot identify image file <_io.BytesIO object at 0x7ff248cf2b10> Failed to fetch sample 2961542. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f053a0> Failed to fetch sample 2756136. Exception: cannot identify image file <_io.BytesIO object at 0x7ff3344abf10> Failed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f2930> Failed to fetch sample 2661050. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff248cf1760> Failed to fetch sample 1898440. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2494fa250> Failed to fetch sample 2429632. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42826b0> Failed to fetch sample 2641623. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4295850> Failed to fetch sample 3215324. Exception: cannot identify image file <_io.BytesIO object at 0x7ff248cf20c0> Failed to fetch sample 3018803. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2494fab10> Failed to fetch sample 2078887. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c4295c60> Failed to fetch sample 1947526. Exception: cannot identify image file <_io.BytesIO object at 0x7fe1f6910c20> Failed to fetch sample 1796453. Exception: cannot identify image file <_io.BytesIO object at 0x7ff3322a56c0> Failed to fetch sample 2561156. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc2017b0> Failed to fetch sample 3357984. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a819f3d0> cannot identify image file <_io.BytesIO object at 0x7ff4dab77650> Failed to fetch sample 1752377. Exception: cannot identify image fFailed to fetch sample 1630532. Exception: Failed to fetch sample 3634589. Exception:bject at 0x7fc73df1bab0> Failed to fetch sample 3606565. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1bc360> Failed to fetch sample 1597034. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4dab76d90> cannot identify image file <_io.BytesIO object at 0x7f587f274a40> Failed to fetch sample 4379145. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc201f80> Failed to fetch sample 3308126. Exception: cannot identify image file <_io.BytesIO object at 0x7f32986bbf60> Failed to fetch sample 2790481. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7ff8590> cannot identify image file <_io.BytesIO object at 0x7fc706e59080> cannot identify image file <_io.BytesIO object at 0x7fdc269ab240> ile <_io.BytesIO object at 0x7fc706ee0130> Failed to fetch sample 3335944. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfaa47d8f0> Failed to fetch sample 2984651. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec246610> Failed to fetch sample 3523488. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc203970> Failed to fetch sample 2729090. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1a6c00> cannot identify image file <_io.BytesIO object at 0x7fdcaf8f10d0> Failed to fetch sample 2633220. Exception: cannot identify image file <_io.BytesIO object at 0x7fc74cd69d50> Failed to fetch sample 4150751. Exception:bject at 0x7f35a7ff8630> Failed to fetch sample 1278085. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc193e20> Failed to fetch sample 1336212. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc220040> cannot identify image file <_io.BytesIO object at 0x7f1f4e73cf90> Failed to fetch sample 2827935. Exception: cannot identify image file <_io.BytesIO obcannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 23 cannot identify image file <_io.BytesIO object at 0x7fd9dc1c3f60> Failed to fetch sample 1100375. Exception: cannot identify image file <_io.BytesIO object at 0x7efb3ccddb20> 0x7fcbdc1f35b0> cannot identify image file <_io.BytesIO object at 0x7f1f54027ba0> le <_io.BytesIO object at 0x7fbcdc10e2f0> Failed to fetch sample 3386755. Exception: cannot identify image file <_io.BytesIO object at 0x7fba0184ffb0> Failed to fetch sample 2710987. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1c4360> cannot identify image file <_io.BytesIO object at 0x7f082812bec0> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706e59990> le <_io.BytesIO object at 0x7f53a8ee1670> Failed to fetch sample 2145986. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b2876a0> le <_io.BytesIO object at 0x7fa275ce4090> cannot identify image file <_io.BytesIO object at 0x7fcbdc1bb420> Failed to fetch sample 1160161. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc123c90> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded18ba60> Failed to fetch sample 2819059. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a906b0b0> Failed to fetch sample 2815741. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc223a60> Failed to fetch sample 3573442. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f4f8d0> Failed to fetch sample 1431086. Exception:ject at 0x7fa275ce7d30> Failed to fetch sample 1414336. Exception: cannot identify image file <_io.BytesIO object at 0x7f05596256c0> cannot identify image file <_io.BytesIO object at 0x7fbcdc1a6ca0> Failed to fetch sample 2979603. Exception: cannot identify image file <_io.BytesIO object at 0x7fba0184ffb0> Failed to fetch sample 3923884. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8ee1c10> Failed to fetch sample 2892575. Exception:bject at 0x7fc74cd69d50> cannot identify image file <_io.BytesIO object at 0x7efd6f7e9fd0> Failed to fetch sample 1120039. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a0a2f0> cannot identify image file <_io.BytesIO object at 0x7fcbdc1ecb80> Failed to fetch sample 3728803. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f4ed40> Failed to fetch sample 3617437. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f53a906bf60> Failed to fetch sample 2813892. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded18ba60> Failed to fetch sample 2388130. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7cbb50> cannot identify image file <_io.BytesIO object at 0x7fc99c2128e0> Failed to fetch sample 1541061. Exception: cannot identify image file <_io.BytesIO object at 0x7fc8f219acf0> Failed to fetch sample 3333023. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9cf55080> Failed to fetch sample 1485531. Exception: cannot identify image file <_io.BytesIO object at 0x7fbab2decc20> Failed to fetch sample 2561921. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc125030> Failed to fetch sample 1931539. Exception:Failed to fetch sample 2698844. Exception:bject at 0x7ff36fc75990> cannot identify image file <_io.BytesIO object at 0x7fcbdc248090> Failed to fetch sample 3876980. Exception: cannot identify image file <_io.BytesIO object at 0x7fba0df15260> Failed to fetch sample 3091512. Exception:Failed to fetch sample 3467927. Exception:bject at 0x7fe436a089a0> cannot identify image file <_io.BytesIO object at 0x7ff24966b2e0> cannot identify image file <_io.BytesIO object at 0x7fbcdc147920> Failed to fetch sample 2855055. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1a53f0> Failed to fetch sample 978639. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a35350> cannot identify image file <_io.BytesIO object at 0x7ff249533ce0> Failed to fetch sample 2479743. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc146020> Failed to fetch sample 1482362. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded1402c0> Failed to fetch sample 1640911. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2d9260> Failed to fetch sample 1947526. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff249675300> Failed to fetch sample 3851313. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f14e0> Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc127740> Failed to fetch sample 3357132. Exception: Failed to fetch sample 3143457. Exception: cannot identify image file <_io.BytesIO object at 0x7f911e5525c0> Failed to fetch sample 3002831. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1aefc0> Failed to fetch sample 2124798. Exception: cannot identify image file <_io.BytesIO object at 0x7fbda068e160> cannot identify image file <_io.BytesIO object at 0x7f83e8346480> Failed to fetch sample 2040873. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e83aba60> Failed to fetch sample 2829458. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f776d40> cannot identify image file <_io.BytesIO object at 0x7fbac8b11cb0> le <_io.BytesIO object at 0x7f83e8345d00> Failed to fetch sample 2721244. Exception: cannot identify image file <_io.BytesIO object at 0x7f055b4494e0> Failed to fetch sample 1367937. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4473709a0> cannot identify image file <_io.BytesIO object at 0x7fd07a53a2f0> Failed to fetch sample 1397352. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c420f970> Failed to fetch sample 4079327. Exception: cannot identify image file <_io.BytesIO object at 0x7f054eb956c0> Failed to fetch sample 4104440. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bba60> cannot identify image file <_io.BytesIO object at 0x7fe4c428fd30> Failed to fetch sample 1448134. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1ac900> cannot identify image file <_io.BytesIO object at 0x7fe4c428dad0> Failed to fetch sample 2827935. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c428e5c0> Failed to fetch sample 2607936. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1aca40> Failed to fetch sample 1369494. Exception: cannot identify image file <_io.BytesIO object at 0x7f9be6bbffb0> Failed to fetch sample 2658306. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcad90> Failed to fetch sample 3153431. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f28e7f0> Failed to fetch sample 4131919. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706ed5490> Failed to fetch sample 2601346. Exception: cannot identify image file <_io.BytesIO object at 0x7f9ac15f13f0> Failed to fetch sample 1203381. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f5e110> Failed to fetch sample 3240425. Exception: cannot identify image file <_io.BytesIO object at 0x7f9af7dc8d10> Failed to fetch sample 2040873. Exception: Failed to fetch sample 2331409. Exception:ject at 0x7f9abcdf71a0> Failed to fetch sample 3116930. Exception: ject at 0x7fe0ec273b00> Failed to fetch sample 2136958. Exception:ject at 0x7f35a7f5d940> Failed to fetch sample 2851854. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcd95620> Failed to fetch sample 1157055. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac20eca0> cannot identify image file <_io.BytesIO object at 0x7f08280e91c0> Failed to fetch sample 2522299. Exception: cannot identify image file <_io.BytesIO object at 0x7f91ffd75cb0>Failed to fetch sample 3491477. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5a9d50>> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7f0dd0108540> Failed to fetch sample 2429018. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280ea6b0> le <_io.BytesIO object at 0x7f35a7fff330> Failed to fetch sample 3596808. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5abce0> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bee7fb0> Failed to fetch sample 2262375. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b38889f0> Failed to fetch sample 2799267. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280ebf10> Failed to fetch sample 2875120. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644f5b0> Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5714e0> Failed to fetch sample 3349915. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f72f70> Failed to fetch sample 1865385. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f463e0> Failed to fetch sample 1095181. Exception: cannot identify image file <_io.BytesIO object at 0x7efe76ab7c40> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5aaac0> Failed to fetch sample 2918234. Exception: cannot identify image file <_io.BytesIO object at 0x7f098644ecf0> cannot identify image file <_io.BytesIO object at 0x7f71340ed4e0> Failed to fetch sample 3004849. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134102160> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7f32bcb5c3b0> Failed to fetch sample 3933650. Exception: cannot identify image file <_io.BytesIO object at 0x7f36afade8e0> cannot identify image file <_io.BytesIO object at 0x7fe6cc9a3010> Failed to fetch sample 3228934. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b3fb0> Failed to fetch sample 3640205. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a37c90> Failed to fetch sample 2892575. 
Failed to fetch sample 2501561. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cc9a2c50>
Failed to fetch sample 2167153. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a42070>
[... the same PIL error ("cannot identify image file <_io.BytesIO object at 0x...>") repeated for several hundred sample IDs; some samples (e.g. 1414336, 3728803, 3573442, 1762324, 4084399) failed repeatedly across workers. Stdout from the concurrent dataloader workers was interleaved and partially truncated, so individual error lines beyond the first occurrences are not recoverable. ...]
Exception: ect at 0x7fe436a0be70> cannot identify image file <_io.BytesIO object at 0x7ff4dbf0aca0> le <_io.BytesIO object at 0x7f0c9c68c540> Failed to fetch sample 2918234. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f55a80> Failed to fetch sample 1282399. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5593f0>Failed to fetch sample 2598129. Exception:: cannot identify image file <_io.BytesIO object at 0x7fd0b2072e30> cannot identify image file <_io.BytesIO object at 0x7fe0ec2531a0> Failed to fetch sample 3420231. Exception:Failed to fetch sample 2892575. Exception:bject at 0x7efd6f7cbc90> Failed to fetch sample 1120039. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986472020> cannot identify image file <_io.BytesIO object at 0x7f32b033cb30> Failed to fetch sample 2717980. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f87d80> Failed to fetch sample 2641623. Exception: cannot identify image file <_io.BytesIO object at 0x7fd099752e30> cannot identify image file <_io.BytesIO object at 0x7fe0ec2d3bf0> Failed to fetch sample 1414336. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a8e9d530> cannot identify image file <_io.BytesIO object at 0x7efd6f7cb830> Failed to fetch sample 2633220. Exception:Failed to fetch sample 2078887. Exception: cannot identify image file <_io.BytesIO object at 0x7fd099752e30>cannot identify image file <_io.BytesIO object at 0x7fd079dc7dd0> cannot identify image file <_io.BytesIO object at 0x7fe0ec2d6de0> le <_io.BytesIO object at 0x7f098644d760> Failed to fetch sample 1947526. Exception:Failed to fetch sample 1865385. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a617d80>cannot identify image file <_io.BytesIO object at 0x7fd079dc7470> Failed to fetch sample 2909009. Exception: cannot identify image file <_io.BytesIO object at 0x7f098647b4c0> Failed to fetch sample 3091512. 
Exception: cannot identify image fcannot identify image file <_io.BytesIO object at 0x7fe0ec2f14e0> Failed to fetch sample 2931138. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a58cd60>Failed to fetch sample 1030035. Exception: cannot identify image file <_io.BytesIO object at 0x7fde041adda0> le <_io.BytesIO object at 0x7f0986438b80> Failed to fetch sample 978639. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986472520> Failed to fetch sample 1301230. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1c93c7c0> Failed to fetch sample 2078887. Exception: cannot identify image file <_io.BytesIO object at 0x7fd0b5bb7f60> cannot identify image file <_io.BytesIO object at 0x7f71340ef290> Failed to fetch sample 1385753. Exception: cannot identify image file <_io.BytesIO object at 0x7f098648fb50> Failed to fetch sample 2208350. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a156c0> Failed to fetch sample 1947526. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134136980> Failed to fetch sample 1081124. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a24680> Failed to fetch sample 2388130. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a58c9a0> Failed to fetch sample 3440709. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9ce8e3e0> Failed to fetch sample 2416526. Exception: cannot identify image file <_io.BytesIO object at 0x7fc824063dd0> Failed to fetch sample 1980290. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a91feffb0> Failed to fetch sample 1787770. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369ee480> Failed to fetch sample 2940430. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a246d0> Failed to fetch sample 3000975. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8f2ed0> Failed to fetch sample 2258741. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5690dc60> cannot identify image file <_io.BytesIO object at 0x7ff249520270> Failed to fetch sample 2578023. Exception: cannot identify image fiFailed to fetch sample 969775. Exception: Failed to fetch sample 3518834. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412c450> cannot identify image file <_io.BytesIO object at 0x7fdcaf8f2930> Failed to fetch sample 1445442. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249485030> Failed to fetch sample 1381127. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24949a570> Failed to fetch sample 2940430. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412c360> Failed to fetch sample 2179559. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded143790> Failed to fetch sample 3390339. Exception: Failed to fetch sample 1445442. Exception:ject at 0x7ff249533c90> cannot identify image file <_io.BytesIO object at 0x7f713412f650> Failed to fetch sample 2478061. Exception: Failed to fetch sample 3900769. Exception:ject at 0x7f0554a45e40> cannot identify image file <_io.BytesIO object at 0x7fcbdc1ed8a0> Failed to fetch sample 1199377. Exception: cannot identify image file <_io.BytesIO object at 0x7fc84d52dc60> Failed to fetch sample 2758452. Exception: Failed to fetch sample 1390192. Exception: cannot identify image file <_io.BytesIO object at 0x7efa981c56c0> cannot identify image file <_io.BytesIO object at 0x7fcbdc1ef420> Failed to fetch sample 3448050. Exception: cannot identify image file <_io.BytesIO object at 0x7fc92ec60db0> Failed to fetch sample 1012319. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7c1bc0> Failed to fetch sample 1070790. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bee6b10> Failed to fetch sample 1775362. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1d22f0> Failed to fetch sample 1501211. Exception:Failed to fetch sample 1454159. Exception:bject at 0x7efd6f794f40> Failed to fetch sample 2101533. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fcbdc1d1120> Failed to fetch sample 4112540. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 42 cannot identify image file <_io.BytesIO object at 0x7f082812b830> cannot identify image file <_io.BytesIO object at 0x7fcbe9fa0400> Failed to fetch sample 2440113. Exception: cannot identify image file <_io.BytesIO ob cannot identify image fcannot identify image file <_io.BytesIO object at 0x7f082812a0c0> 3879860. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f795710> Failed to fetch sample 3311193. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a70360> Failed to fetch sample 1189922. Exception:Failed to fetch sample 1100749. Exception:bject at 0x7f054a63e980> Failed to fetch sample 2974608. Exception: cannot identify image file <_io.BytesIO object at 0x7f713413f330> Failed to fetch sample 3350750. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a71490> Failed to fetch sample 1195701. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a178d0> cannot identify image file <_io.BytesIO object at 0x7f08280f2c00> Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b42609f0> Failed to fetch sample 4133927. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a25760> Failed to fetch sample 1890208. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a573b50> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7f71f9066930> Failed to fetch sample 4108370. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f91ad0> cannot identify image file <_io.BytesIO object at 0x7fd07a61b240> Failed to fetch sample 2196276. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2ef740> Failed to fetch sample 1746683. Exception: cannot identify image file <_io.BytesIO object at 0x7fc8f219b880> Failed to fetch sample 1740611. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac245b70> Failed to fetch sample 2604869. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f79e9d0> Failed to fetch sample 1198666. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac20ff60> Failed to fetch sample 1016995. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9a6f0ea0> Failed to fetch sample 2984380. Exception: cannot identify image file <_io.BytesIO object at 0x7fde041adda0> Failed to fetch sample 1704050. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99cc62f70> cannot identify image file <_io.BytesIO object at 0x7f7134194130> Failed to fetch sample 4108370. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d8310> Failed to fetch sample 1397352. Exception: cannot identify image file <_io.BytesIO object at 0x7fc92ec60db0> cannot identify image file <_io.BytesIO object at 0x7ff2494f8c70> Failed to fetch sample 1980290. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706ed6660> Failed to fetch sample 4117024. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2494fab10> Failed to fetch sample 2819059. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 22 cannot identify image file <_io.BytesIO object at 0x7f09cc762070> Failed to fetch sample 2954477. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7f4680> Failed to fetch sample 1029176. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a3dd50> Failed to fetch sample 1063939. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2494faa20> Failed to fetch sample 2197722. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2836f0> Failed to fetch sample 3129898. Exception:Failed to fetch sample 4080972. Exception: cannot identify image fi cannot identify image file <_io.BytesIO Failed to fetch sample 121 686. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249550a40> Failed to fetch sample 3188707. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2815d0> Failed to fetch sample 2801888. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a6160> Failed to fetch sample 1329424. Exception:ject at 0x7f9abcdcb6a0> Failed to fetch sample 3617437. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbe9fa0400> cannot identify image file <_io.BytesIO object at 0x7f09c7b66700> Failed to fetch sample 3276750. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a35f80> Failed to fetch sample 1740611. Exception: cannot identify image file <_io.BytesIO object at 0x7f9ae54cffb0> Failed to fetch sample 1198666. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcddf0> Failed to fetch sample 2761972. Exception: cannot identify image file <_io.BytesIO ob cannot identify image file <_io.BytesIO object at 0x7f9acc06b420> Failed to fetch sample 2717343. Exception: cannot identify image file <_io.BytesIO object at 0x7f32c06f6d90> Failed to fetch sample 1110673. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134136d90> Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7f098647dc10> Failed to fetch sample 2359987. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a3f1f0> le <_io.BytesIO object at 0x7fe0ec216610> Failed to fetch sample 3203917. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2495233d0> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7ffd080> Failed to fetch sample 2741645. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc177d30> cannot identify image file <_io.BytesIO object at 0x7f098647e570> Failed to fetch sample 3907303. Exception: cannot identify image file <_io.BytesIO object at 0x7efb3effd3a0> Failed to fetch sample 1573545. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fd800> Failed to fetch sample 3634589. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a53a2f0> cannot identify image file <_io.BytesIO object at 0x7f351e1c68e0> Failed to fetch sample 1401216. Exception: cannot identify image file <_io.BytesIO object at 0x7fc99c2176a0> Failed to fetch sample 2158552. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249521670> Failed to fetch sample 3647389. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369d7c90> Failed to fetch sample 3350750. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a53b650> Failed to fetch sample 3730314. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f4f9c0> Failed to fetch sample 2040208. Exception:Failed to fetch sample 2587903. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fd9760> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7fe91b15df30> Failed to fetch sample 2976623. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341353f0> Failed to fetch sample 2913564. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1abc90> Failed to fetch sample 1397352. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6e5242d170> Failed to fetch sample 3895655. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9d5cbfdd0> Failed to fetch sample 2819059. Exception:Failed to fetch sample 3865031. Exception: cannot identify image fiFailed to fetch sample 2689470. Exception: cannot identify image file <_io.BytesIO object at 0x7fd0b2072e30> Failed to fetch sample 3292795. Exception:Failed to fetch sample 2331127. Exception: ject at 0x7fe890154270> cannot identify image file <_io.BytesIO object at 0x7fe0ec265f80> Failed to fetch sample 2583399. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1b20c0> Failed to fetch sample 3370884. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84d4860> Failed to fetch sample 3923884. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e5242d120> Failed to fetch sample 3634589. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a17b50> cannot identify image file <_io.BytesIO object at 0x7fe0ec277330> Failed to fetch sample 2819059. Exception: cannot identify image file <_io.BytesIO object at 0x7fd079dc73d0> Failed to fetch sample 2940430. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7dfdd0> Failed to fetch sample 3063079. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7ca390> Failed to fetch sample 3163765. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f8a8e0> Failed to fetch sample 3617437. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e59f29990> Failed to fetch sample 2940430. Exception:/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/mnt/petrelfs/liuzhaoyang/workspace/progra/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkp/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 1445442. Exception: cannot identify image file <_io.BytesIO object at 0x7fbab5304bd0> Failed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f88450> Failed to fetch sample 4377580. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc193a10> utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( cannot identify image file <_io.BytesIO object at 0x7fe0ec237150> Failed to fetch sample 4084399. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e48310> d=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils//mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 2554836. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc27ef20>/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 1218899. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc23abb0> Failed to fetch sample 3846458. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc27c9f0> Failed to fetch sample 1806603. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbde562ca0> Failed to fetch sample 1815480. Exception: cannot identify image file <_io.BytesIO object at 0x7fc92ec60db0> Failed to fetch sample 3245561. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbde515620> Failed to fetch sample 2890579. 
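The checkpoint.py UserWarning above fires when the tensors entering a checkpointed segment have requires_grad=False; in reentrant mode the recomputed subgraph is then detached and parameter gradients come back as None. A minimal sketch of the symptom and the usual fix, assuming a recent PyTorch that accepts the `use_reentrant` flag (the `lin`/`x` names are illustrative, not from this training script):

```python
import torch
import torch.utils.checkpoint as cp

lin = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # requires_grad=False here would trigger the warning above

# Fix: ensure at least one input to the checkpointed call requires grad.
# (HF Trainer users often get the same effect via model.enable_input_require_grads().)
x.requires_grad_(True)

y = cp.checkpoint(lin, x, use_reentrant=True)
y.sum().backward()
print(lin.weight.grad is not None)  # gradients now flow to the parameters
```

With frozen embeddings or vision towers this situation is common: the first trainable block sees only no-grad inputs, so the warning is worth acting on rather than suppressing.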
(Further interleaved repetitions of the same two messages, "Failed to fetch sample … cannot identify image file" and the checkpoint.py requires_grad UserWarning, omitted.)
Exception: cannot identify image file <_io.BytesIO object at 0x7f0828108a40> Failed to fetch sample 3257443. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2ce8e0> Failed to fetch sample 3921018. Exception: cannot identify image file <_io.BytesIO object at 0x7f082810a7a0> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdef62fc0> cannot identify image file <_io.BytesIO object at 0x7fcbdc260860> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7f098647ba60> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7f054a63e980> Failed to fetch sample 1329424. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f07920> Failed to fetch sample 2450399. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2495585e0> cannot identify image file <_io.BytesIO object at 0x7fcbdc23e200> Failed to fetch sample 2888688. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986479ad0> Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f06a70> Failed to fetch sample 1615181. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc23dda0> Failed to fetch sample 1189922. Exception:bject at 0x7efd6f7eaf20> Failed to fetch sample 2661531. Exception:bject at 0x7fc9b7f0fdd0> cannot identify image file <_io.BytesIO object at 0x7fcbdc1d2660> Failed to fetch sample 1072318. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7e93a0> Failed to fetch sample 1252147. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f04310> Failed to fetch sample 3576643. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1ef4c0> Failed to fetch sample 2040208. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc74cd69d50> cannot identify image file <_io.BytesIO object at 0x7efd6f7eb380> Failed to fetch sample 1911678. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1d02c0> Failed to fetch sample 3473797. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7eb2e0> Failed to fetch sample 3741331. Exception: cannot identify image file <_io.BytesIO o cannot identify image fiFailed to fetch sample 2100472. Exception: cannot identify image file <_io.BytesIO object at 0x7efb3d062200> Failed to fetch sample 3349349. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341947c0> Failed to fetch sample 4014461. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f2fc0> Failed to fetch sample 2363228. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a67f0> Failed to fetch sample 1301230. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f79ec50> Failed to fetch sample 1681531. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fc680> Failed to fetch sample 3730479. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc24a570> cannot identify image file <_io.BytesIO object at 0x7fe6cc9a2ed0> Failed to fetch sample 2633220. Exception: cannot identify image file <_io.BytesIO object at 0x7fd0b2072e30> Failed to fetch sample 2633220. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d8310> cannot identify image file <_io.BytesIO object at 0x7fbcdc1c4ae0> Failed to fetch sample 2529135. Exception: cannot identify image file <_io.BytesIO object at 0x7fc84d52e0c0> cannot identify image file <_io.BytesIO object at 0x7fe6cc9a3290> Failed to fetch sample 2616704. Exception: cannot identify image file <_io.BytesIO object at 0x7f0994dbefc0> cannot identify image file <_io.BytesIO object at 0x7f35a7f44900> Failed to fetch sample 3459261. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbded143790> Failed to fetch sample 3540812. Exception: cannot identify image file <_io.BytesIO object at 0x7fbc5c1b4e00> Failed to fetch sample 1518122. Exception:bject at 0x7fe0ec237420> Failed to fetch sample 2758452. Exception:bject at 0x7f0986450770> Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cc99ade0> cannot identify image file <_io.BytesIO object at 0x7fcbdc202610> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fcc6d0> Failed to fetch sample 2758452. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bea70> Failed to fetch sample 3513282. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b3888d10> Failed to fetch sample 3081392. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1c71f0> Failed to fetch sample 4078850. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded1402c0> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7f33b242c860> Failed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc203970> Failed to fetch sample 1784730. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bec50> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412fc90> Failed to fetch sample 3254207. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1c69d0> Failed to fetch sample 2641623. Exception: cannot identify image file <_io.BytesIO object at 0x7fbda068d300> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1f3f60> Failed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc14ff60> Failed to fetch sample 3272055. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f71341b3790> Failed to fetch sample 2729090. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc177f10> Failed to fetch sample 2078887. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9d5cbd990> Failed to fetch sample 1329424. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc200130> Failed to fetch sample 1329424. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1128e0> Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828132ca0> cannot identify image file <_io.BytesIO object at 0x7fc840a71990> Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bfb00> Failed to fetch sample 1947526. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc123010> Failed to fetch sample 2612412. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f84180> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9cf55080> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bf2e0> Failed to fetch sample 1252147. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1f3ab0> Failed to fetch sample 1829945. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f862a0> Failed to fetch sample 1252147. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bc680> Failed to fetch sample 3414603. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc146250> Failed to fetch sample 2040208. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded18ba60> Failed to fetch sample 2040208. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1b9760> Failed to fetch sample 2815741. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f865c0> Failed to fetch sample 1929498. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc125620> Failed to fetch sample 2801173. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7fdb380> Failed to fetch sample 3880305. Exception: cannot identify image file <_io.BytesIO object at 0x7f07a84ce200> Failed to fetch sample 1409821. Exception: Failed to fetch sample 2204102. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a576f0> cannot identify image file <_io.BytesIO object at 0x7fbcdc1a9a80> cannot identify image file <_io.BytesIO object at 0x7ff2495000e0> Failed to fetch sample 1988585. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a4810> cannot identify image file <_io.BytesIO object at 0x7fbcdc146610> Failed to fetch sample 1680813. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded143790> Failed to fetch sample 3533762. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f87920> Failed to fetch sample 2408187. Exception: cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7f07a84d4d60>Failed to fetch sample 2562281. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded143790> Failed to fetch sample 1682209. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1256c0> Failed to fetch sample 2979603. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249500d10> Failed to fetch sample 1916330. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249683420> cannot identify image file <_io.BytesIO object at 0x7fbcdc107b50> Failed to fetch sample 2578023. Exception: cannot identify image file <_io.BytesIO object at 0x7f6ea5d9ff60> Failed to fetch sample 3604356. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1abfb0> cannot identify image file <_io.BytesIO object at 0x7ff55de48cc0> Failed to fetch sample 2527761. Exception: cannot identify image file <_io.BytesIO objcannot identify image file <_io.BytesIO object at 0x7fbab315de40> Failed to fetch sample 1381127. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134136070> cannot identify image file <_io.BytesIO object at 0x7fc74cd69d50> Failed to fetch sample 1019459. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2495234c0> Failed to fetch sample 4187548. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e48c20> Failed to fetch sample 3927414. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249683ba0> cannot identify image file <_io.BytesIO object at 0x7fbcdc1a9df0> ile <_io.BytesIO object at 0x7fca1c187c40> cannot identify image file <_io.BytesIO object at 0x7efd6f7e8900> Failed to fetch sample 3390339. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134135a80> Failed to fetch sample 2770727. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a573e20> Failed to fetch sample 3108237. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2dc2ffc90> cannot identify image file <_io.BytesIO object at 0x7fbc5c1b4e00> Failed to fetch sample 2916396. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec21acf0> Failed to fetch sample 1980290. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d6f70> Failed to fetch sample 1470654. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5feca0> Failed to fetch sample 2578023. Exception: cannot identify image file <_io.BytesIO object at 0x7fd0a1cc04a0> Failed to fetch sample 3596833. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f2930> Failed to fetch sample 3520194. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc73df1bab0> Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5ff470> Failed to fetch sample 2258741. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b2a70> Failed to fetch sample 1381127. Exception: cannot identify image file <_io.BytesIO object at 0x7fd079dc5c60> Failed to fetch sample 2500627. Exception: cannot identify image file <_io.BytesIO object at 0x7f09864794e0> Failed to fetch sample 1225950. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bee6b10> Failed to fetch sample 2940708. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a53ade0> Failed to fetch sample 3390339. Exception: cannot identify image file <_io.BytesIO object at 0x7fd3103dbc90> Failed to fetch sample 3107206. Exception: cannot identify image file <_io.BytesIO object at 0x7f09864a6340> Failed to fetch sample 1930721. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c9c4f8900> Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a571800> Failed to fetch sample 1501211. Exception: cannot identify image file <_io.BytesIO object at 0x7f09ae101c60> Failed to fetch sample 4026469. Exception: cannot identify image file <_io.BytesIO object at 0x7f098649c630> Failed to fetch sample 1413922. Exception: cannot identify image file <_io.BytesIO object at 0x7fd169770ea0> Failed to fetch sample 1382990. Exception:bject at 0x7f70b56b1120> cannot identify image file <_io.BytesIO object at 0x7fd07a5fc090> Failed to fetch sample 4112540. Exception: cannot identify image file <_io.BytesIO object at 0x7f09864a55d0> Failed to fetch sample 3052918. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7afbf0> Failed to fetch sample 2679326. Exception: cannot identify image file <_io.BytesIO object at 0x7f098649ccc0> Failed to fetch sample 2784203. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5fd4e0> Failed to fetch sample 3879860. Exception: cannot identify image file <_io.BytesIO object at 0x7f09864a7ba0> 52196. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2cdd50>cannot identify image file <_io.BytesIO object at 0x7f0828129f80> Failed to fetch sample 4061283. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b38886d0> Failed to fetch sample 3435401. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec279260> cannot identify image file <_io.BytesIO object at 0x7f6e2d804cc0> le <_io.BytesIO object at 0x7f08280f2c00> Failed to fetch sample 2001390. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc129d50> Failed to fetch sample 3922185. Exception:Failed to fetch sample 3596808. Exception:bject at 0x7f0939cdae30> cannot identify image file <_io.BytesIO object at 0x7fbcdc1affb0> Failed to fetch sample 2359987. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341004a0> Failed to fetch sample 4132812. Exception: cannot identify image file <_io.BytesIO object at 0x7f082812a980> Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc12a610> Failed to fetch sample 1573545. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341037e0> cannot identify image file <_io.BytesIO object at 0x7fc9b7f07920> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc147b00> Failed to fetch sample 3647389. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134101fd0> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134101cb0> Failed to fetch sample 3882515. Exception: Failed to fetch sample 1273196. Exception: cannot identify image file <_io.BytesIO object at 0x7fe472475cb0> ailed to fetch sample 1547193. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f3c540> Failed to fetch sample 2115425. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bd080> ailed to fetch sample 2641623. Exception:Failed to fetch sample 2856005. Exception:bject at 0x7fe436a35710> le <_io.BytesIO object at 0x7f055b4494e0> cannot identify image file <_io.BytesIO object at 0x7f35a7f4dad0> Failed to fetch sample 2531872. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1b8db0> Failed to fetch sample 3134210. Exception: cannot identify image file <_io.BytesIO object at 0x7f32bcb5cb30> Failed to fetch sample 2078887. Exception: cannot identify image file <_io.BytesIO object at 0x7fbaf0507f60> cannot identify image file <_io.BytesIO object at 0x7fe436a0bfb0> Failed to fetch sample 2537042. Exception: Failed to fetch sample 1120039. Exception:ject at 0x7f098648e1b0> cannot identify image file <_io.BytesIO object at 0x7fe0ec272660> cannot identify image file <_io.BytesIO object at 0x7f35a7f55300> Failed to fetch sample 2562281. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f4f600> Failed to fetch sample 1947526. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a0a6b0> Failed to fetch sample 1104215. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2733d0> cannot identify image file <_io.BytesIO object at 0x7f098648cf40> Failed to fetch sample 3021215. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f3c4a0> Failed to fetch sample 1834449. Exception: cannot identify image file <_io.BytesIO object at 0x7efb3d062200> Failed to fetch sample 3272055. Exception:Failed to fetch sample 1634880. Exception:bject at 0x7fe0ec2db560> cannot identify image file <_io.BytesIO object at 0x7f32a7dcbab0> cannot identify image file <_io.BytesIO object at 0x7f098648fbf0> Failed to fetch sample 2764266. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f098648ec00> Failed to fetch sample 3346341. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986457f60> Failed to fetch sample 3634589. Exception: cannot identify image file <_io.BytesIO object at 0x7f0dd0108540> Failed to fetch sample 2056463. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d7650> Failed to fetch sample 3305093. Exception: cannot identify image file <_io.BytesIO object at 0x7f05ffde87c0> Failed to fetch sample 3370199. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2d6570> Failed to fetch sample 2926584. Exception: cannot identify image file <_io.BytesIO object at 0x7f09864782c0> Failed to fetch sample 1559570. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a570900> cannot identify image file <_io.BytesIO object at 0x7fe0ec2d0d10> Failed to fetch sample 2221653. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1def20> Failed to fetch sample 3557596. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2d1b70> Failed to fetch sample 1114723. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc111c10> Failed to fetch sample 2237134. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280fc680> Failed to fetch sample 3167299. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1c7970> Failed to fetch sample 1120039. Exception: cannot identify image file <_io.BytesIO object at 0x7f0939cad850> Failed to fetch sample 1123977. Exception:Failed to fetch sample 4144768. Exception:bject at 0x7fba4d8d6cf0> Failed to fetch sample 1645844. Exception:bject at 0x7f082812ac00> le <_io.BytesIO object at 0x7fe436a24cc0> cannot identify image file <_io.BytesIO object at 0x7fca9a6f1210> le <_io.BytesIO object at 0x7fca2e1d94e0> Failed to fetch sample 1250858. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f05ffde92b0> Failed to fetch sample 2663582. Exception: cannot identify image file <_io.BytesIO object at 0x7fe890154270> Failed to fetch sample 1397352. Exception:Failed to fetch sample 3480772. Exception:bject at 0x7fbcdc122fc0> cannot identify image file <_io.BytesIO object at 0x7fe436a273d0> Failed to fetch sample 3091512. Exception: cannot identify image file <_io.BytesIO object at 0x7f055b4499e0> Failed to fetch sample 3892787. Exception: cannot identify image file <_io.BytesIO object at 0x7fc74cd69d50> Failed to fetch sample 1400395. Exception:bject at 0x7fcbdc255b20> cannot identify image file <_io.BytesIO oFailed to fetch sample 3180383. Exception:ample 1571615. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a25ad0> cannot identify image file <_io.BytesIO object at 0x7fba11b9ade0> cannot identify image file <_io.BytesIO object at 0x7f0522e4e3e0> Failed to fetch sample 978639. Exception: cannot identify image file <_io.BytesIO object at 0x7f082818b740> Failed to fetch sample 1656236. Exception: cannot identify image file <_io.BytesIO o cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 36Failed to fetch sample 3923884. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0f2cf0> Failed to fetch sample 2434863. Exception:Failed to fetch sample 3584846. Exception:bject at 0x7fcbdc1f2840> Failed to fetch sample 1562826. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7f0548bcd3a0> Failed to fetch sample 2801811. Exception: cannot identify image file <_io.BytesIO object at 0x7fc73df1bab0> Failed to fetch sample 3617437. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1218a0> Failed to fetch sample 1491402. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a57bf0> cannot identify image file <_io.BytesIO object at 0x7f0939cad850> Failed to fetch sample 1140456. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1afa10> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7f07aa6c5030> Failed to fetch sample 3629990. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1dc0e0> Failed to fetch sample 1694633. Exception: cannot identify image file <_io.BytesIO object at 0x7f082812a8e0> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1de700> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f0522e4e4d0> Failed to fetch sample 2783090. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1df420> Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO object at 0x7f0639773c90> Failed to fetch sample 4130557. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2a67a0> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1dfba0> Failed to fetch sample 1464443. Exception: cannot identify image file <_io.BytesIO object at 0x7ff3322a56c0> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7f063cd776f0> Failed to fetch sample 2888688. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdef66430> Failed to fetch sample 1982896. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706e5bb00> Failed to fetch sample 1046935. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249681d00> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1cbd5760> Failed to fetch sample 1825679. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f0f470> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7fc715941210> Failed to fetch sample 3302630. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c0cc0> FFailed to fetch sample 4031365. Exception: cannot identify image file <_io.BytesIO object at 0x7efe76ac88b0> Failed to fetch sample 1541061. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c2cf0> Failed to fetch sample 4059671. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b1620> Failed to fetch sample 2125639. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828131d00> Failed to fetch sample 2698844. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706ed5350> Failed to fetch sample 4027173. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e6c450> cannot identify image file <_io.BytesIO object at 0x7fd079db9da0> Failed to fetch sample 1066194. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b13f0> Failed to fetch sample 1597034. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828132ac0>FFailed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e6c4a0> cannot identify image file <_io.BytesIO object at 0x7fd3103dbc90> Failed to fetch sample 1612395. Exception: cannot identify image file <_io.BytesIO object at 0x7f09ae101990> Failed to fetch sample 4142811. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b1760> Failed to fetch sample 4131003. Exception: cannot identify image file <_io.BytesIO object at 0x7fd310557ba0> Failed to fetch sample 1400353. Exception: cannot identify image file <_io.BytesIO object at 0x7f098641f920> Failed to fetch sample 2193882. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134136700> Failed to fetch sample 4061653. 
Failed to fetch sample 1656236. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9cf55080>
[... the same "Failed to fetch sample <id>. Exception: cannot identify image file <_io.BytesIO object at 0x...>" error repeats, interleaved across data-loader workers, for several hundred sample ids (e.g. 3143457, 2661389, 1095181, 3583792, 4131003, 2255647, 2134532, 1414336, 1504520, 2040208, 1792884, 2819059, 3211355, 1964920, 3167299, ...); partially overwritten duplicate copies omitted ...]
Token indices sequence length is longer than the specified maximum sequence length for this model (10090 > 8192). Running this sequence through the model will result in indexing errors
Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f2d40>Failed to fetch sample 4115987. Exception:: cannot identify image file <_io.BytesIO object at 0x7fe0ec276f20> cannot identify image file <_io.BytesIO object at 0x7efd6f7cb420> Failed to fetch sample 2659901. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f1670> Failed to fetch sample 3629990. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec236930> Failed to fetch sample 2806356. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f79ddf0> Failed to fetch sample 4059671. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd491dd620> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec236bb0> Failed to fetch sample 3494092. Exception: cannot identify image file <_io.BytesIO object at 0x7efb3d062200> Failed to fetch sample 1066194. Exception: cannot identify image file <_io.BytesIO object at 0x7fe1fb4f92b0> Failed to fetch sample 2783090. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec276390> Failed to fetch sample 3647389. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc14eca0> Failed to fetch sample 2734583. Exception:Failed to fetch sample 2406115. Exception: ject at 0x7ff4e3e48220> Failed to fetch sample 4142811. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f3fb0> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7fe15dcfc9a0> cannot identify image file <_io.BytesIO object at 0x7f351e1c7dd0> Failed to fetch sample 1505750. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f82930> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc14f790> Failed to fetch sample 2888688. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2369d0> Failed to fetch sample 1762324. 
Exception:bject at 0x7efd6f7cb8d0> cannot identify image file <_io.BytesIO object at 0x7f35a7ffe430> Failed to fetch sample 2455121. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a577dd0> Failed to fetch sample 1762324. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a573ce0> Failed to fetch sample 4162165. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded143790> Failed to fetch sample 2041309. Exception: cannot identify image file <_io.BytesIO object at 0x7f3297a1ff60> Failed to fetch sample 2371876. Exception: cannot identify image file <_io.BytesIO object at 0x7f098641f970> cannot identify image file <_io.BytesIO object at 0x7fbcdc1477e0> Failed to fetch sample 3298307. Exception: cannot identify image file <_io.BytesIO object at 0x7f32d2815850> Failed to fetch sample 3898287. Exception:Failed to fetch sample 2772050. Exception:bject at 0x7f0986444450> cannot identify image file <_io.BytesIO object at 0x7fbcdc1a98a0> Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7f098641f740> Failed to fetch sample 3921018. Exception:Failed to fetch sample 3148873. Exception:bject at 0x7fbcdc1a91c0> cannot identify image file <_io.BytesIO object at 0x7f0986447560> Failed to fetch sample 4027789. Exception: Failed to fetch sample 3194835. Exception:ject at 0x7fbcdc0f2d90> Failed to fetch sample 970714. Exception: cannot identify image file <_io.BytesIO object at 0x7f7245153ec0> cannot identify image file <_io.BytesIO object at 0x7f098647ef20> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1ab6a0> cannot identify image file <_io.BytesIO object at 0x7f098641fec0> Failed to fetch sample 1876146. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341a2700> Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7f0a7514e160> Failed to fetch sample 2542729. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f0a7514d5d0> cannot identify image file <_io.BytesIO object at 0x7fbcdc1abba0> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412d8f0> Failed to fetch sample 2547473. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134103c90> cannot identify image file <_io.BytesIO object at 0x7f098641f6a0> Failed to fetch sample 2180795. Exception: cannot identify image file <_io.BytesIO object at 0x7f09ce6e9ee0> Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO object at 0x7f098647e4d0> Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO object at 0x7f09cc762070> cannot identify image file <_io.BytesIO object at 0x7f7134102430> Failed to fetch sample 1345056. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2494a3c90> cannot identify image file <_io.BytesIO object at 0x7f09c1c46070> Failed to fetch sample 1133332. Exception:Failed to fetch sample 2265597. Exception:bject at 0x7f7134135ad0> cannot identify image file <_io.BytesIO object at 0x7f09c15a9620> Failed to fetch sample 2402011. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341376a0> Failed to fetch sample 2311543. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1d2980> Failed to fetch sample 2047816. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341378d0> Failed to fetch sample 2305425. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e4a9d0> Failed to fetch sample 1490880. Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1ddf30> Failed to fetch sample 3675855. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b89a0> Failed to fetch sample 3837981. 
Exception: cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7efd6f7972e0> cannot identify image file <_io.BytesIO object at 0x7fe0ec2f36f0> Failed to fetch sample 2363461. Exception:Failed to fetch sample 1944664. Exception: cannot identify image file <_io.BytesIO object at 0x7ff27c2ffec0>cannot identify image file <_io.BytesIO object at 0x7f08281a66b0> Failed to fetch sample 4131003. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7e3420> Failed to fetch sample 3022490. Exception: cannot identify image file <_io.BytesIO object at 0x7fde3edb96c0> Failed to fetch sample 2591530. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a5a30> Failed to fetch sample 3180383. Exception: cannot identify image file <_io.BytesIO object at 0x7ff28f520c20> Failed to fetch sample 3107229. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1af240> ailed to fetch sample 2966793. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7e1e40>cannot identify image file <_io.BytesIO object at 0x7fca2e1de1b0> Failed to fetch sample 2071882. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd3802aa20> Failed to fetch sample 4053776. Exception: cannot identify image file <_io.BytesIO object at 0x7fc7f3355580> Failed to fetch sample 2760526. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e486d0> Failed to fetch sample 976313. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc146700> Failed to fetch sample 1301230. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bafc0> cannot identify image file <_io.BytesIO object at 0x7fe0ec256b10> Failed to fetch sample 2458879. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c1940> Failed to fetch sample 3584846. Exception: cannot identify image file <_io.BytesIO object at 0x7ff62056ecf0> Failed to fetch sample 2236223. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f082812ac00> Failed to fetch sample 2539019. Exception: cannot identify image file <_io.BytesIO ob cannot identify image file <_io.BytesIO object at 0x7fe0ec218e00> Failed to fetch sample 3010141. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c14e0> Failed to fetch sample 1632937. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfab918ef0> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e4a520> Failed to fetch sample 3898287. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281298a0> Failed to fetch sample 1025825. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd5b961df0> Failed to fetch sample 4099207. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c2890> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e6eb60> Failed to fetch sample 3105090. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 31 cannot identify image file <_io.BytesIO object at 0x7f0828120860> Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e6cea0> Failed to fetch sample 3148873. Exception: cannot identify image file <_io.BytesIO object at 0x7f082812ade0> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e6cae0> Failed to fetch sample 3879548. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd24f8a700> Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO object at 0x7f32c4233ba0>cannot identify image file <_io.BytesIO object at 0x7ff4e3ebf150> Failed to fetch sample 3010141. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2ebab0> Failed to fetch sample 3194835. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f08281213f0> Failed to fetch sample 1482449. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f2aac0> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e4a660> Failed to fetch sample 4099207. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec21ad40> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7f082812ad90> Failed to fetch sample 1387630. Exception: ect at 0x7f35a7f4ddf0> cannot identify image file <_io.BytesIO object at 0x7efd6f7c9c60> ile <_io.BytesIO object at 0x7fcbdc1baa20> Failed to fetch sample 3376972. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f3fe70> Failed to fetch sample 3180383. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1f1c10> Failed to fetch sample 1876146. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7dcbd0> Failed to fetch sample 3633396. Exception:bject at 0x7fc7f36db740> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7f082812ae80> Failed to fetch sample 1934945. Exception:bject at 0x7fcbdc1f2a20> Failed to fetch sample 1792884. Exception:bject at 0x7efd6f79d1c0> cannot identify image file <_io.BytesIO object at 0x7fc715329940> cannot identify image file <_io.BytesIO object at 0x7fcbdc1f27a0> Failed to fetch sample 1758557. Exception:Failed to fetch sample 1870993. Exception:Failed to fetch sample 1491402. Exception:object at 0x7fca1c1b17b0>cannot identify image file <_io.BytesIO object at 0x7f0828121710> cannot identify image file <_io.BytesIO object at 0x7fcbdc1f33d0> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc203600> Failed to fetch sample 2077793. Exception:Failed to fetch sample 3346372. 
Exception:bject at 0x7efd6f7cbab0> cannot identify image file <_io.BytesIO object at 0x7fcbdc1f22f0> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7fc92ec60db0> Failed to fetch sample 2132343. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a53a480> Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc2009a0> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1f3420> Failed to fetch sample 3709425. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a563dd0> Failed to fetch sample 2602538. Exception: cannot identify image file <_io.BytesIO object at 0x7fba0df14d60> Failed to fetch sample 1792884. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc110db0> Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0eefc0> Failed to fetch sample 1996903. Exception: cannot identify image file <_io.BytesIO object at 0x7f7134136c00> Failed to fetch sample 1764573. Exception: cannot identify image file <_io.BytesIO object at 0x7ff27c2ffec0> Failed to fetch sample 1029176. Exception: Failed to fetch sample 3149864. Exception: cannot identify image file <_io.BytesIO object at 0x7f713412e9d0> Failed to fetch sample 3188707. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e6f290> Failed to fetch sample 3837981. Exception: Failed to fetch sample 4161294. Exception:ject at 0x7f71341a2fc0> Failed to fetch sample 3276750. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3e6e610> Failed to fetch sample 2049038. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341005e0> cannot identify image file <_io.BytesIO object at 0x7ff4dab99fd0> Failed to fetch sample 1491402. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f724518e840> cannot identify image file <_io.BytesIO object at 0x7ff4e3ebf510> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7cab10> Failed to fetch sample 1573545. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4db4fe0c0> Failed to fetch sample 3346372. Exception:Failed to fetch sample 3647389. Exception: ject at 0x7efe76ac88b0> Failed to fetch sample 1623907. Exception: cannot identify image file <_io.BytesIO object at 0x7f0e6bde2660> cannot identify image file <_io.BytesIO object at 0x7ff6205d9350> Failed to fetch sample 3144261. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7dd2b0> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3ebc4f0> Failed to fetch sample 4380619. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a562e30> Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7ff249681580> Failed to fetch sample 3094812. Exception: cannot identify image file <_io.BytesIO object at 0x7f063cd776f0> Failed to fetch sample 3180383. Exception: cannot identify image file <_io.BytesIO object at 0x7fd0c5f531f0> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7efa8b768ea0> Failed to fetch sample 1133332. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986486520> Failed to fetch sample 3633396. Exception: cannot identify image file <_io.BytesIO object at 0x7fd0b5eb0ea0> Failed to fetch sample 2086216. Exception: cannot identify image file <_io.BytesIO object at 0x7f098647f740> Failed to fetch sample 3584846. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a537fb0> cannot identify image file <_io.BytesIO object at 0x7ff249683c40> Failed to fetch sample 2047816. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f098647fc40> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a561e40> Failed to fetch sample 3290404. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a59b420> cannot identify image file <_io.BytesIO object at 0x7ff3344abd30> Failed to fetch sample 3525396. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249546700> Failed to fetch sample 2801811. Exception: cannot identify image file <_io.BytesIO object at 0x7ff24955b740> cannot identify image file <_io.BytesIO object at 0x7fd07a563e70> Failed to fetch sample 1140456. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0f1a80> F cannot identify image file <_io.BytesIO object at 0x7ff249558680> e <_io.BytesIO object at 0x7fc92ec630b0> Failed to fetch sample 3629990. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a537d30> cannot identify image file <_io.BytesIO object at 0x7ff24955ba60> Failed to fetch sample 1029176. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbe9fbc950> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc104130> cannot identify image file <_io.BytesIO object at 0x7ff249681e40> Failed to fetch sample 2111070. Exception: cannot identify image file <_io.BytesIO object at 0x7fbded1402c0> Failed to fetch sample 3188707. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1f33d0> Failed to fetch sample 1860308. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f3420> Failed to fetch sample 3276750. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a6f20> Failed to fetch sample 2783090. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2835ffd30> Failed to fetch sample 3448050. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fdd24f8a700> Failed to fetch sample 2460063. Exception: cannot identify image file <_io.BytesIO object at 0x7fc73df1bab0> Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7f08ec575b70> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249681850> Failed to fetch sample 1775362. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f1df0> Failed to fetch sample 3898287. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f046d0> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a5080> Failed to fetch sample 2888688. Exception: cannot identify image file <_io.BytesIO object at 0x7ff27f80e840> Failed to fetch sample 1454159. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2ea390> Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc257510> Failed to fetch sample 1573545. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbe9fa0400> Failed to fetch sample 3548361. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc254360> Failed to fetch sample 3148873. Exception: cannot identify image file <_io.BytesIO object at 0x7fc9b7f0e070> Failed to fetch sample 4150751. Exception:Failed to fetch sample 3647389. Exception:bject at 0x7fe0ec2558a0> Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1a4fe0> cannot identify image file <_io.BytesIO object at 0x7fcbdc203a60> Failed to fetch sample 1182897. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1e9b70> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1f22f0> cannot identify image file <_io.BytesIO object at 0x7efd6f7efa10> Failed to fetch sample 1442340. 
Exception: cannot identify image file <_io.BytesIO ob cannot identify image file <_io.BytesIO object at 0x7efd6f7bc8b0> Failed to fetch sample 1381737. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1c3010> Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1a5d50> Failed to fetch sample 1398442. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc123a10> Failed to fetch sample 3443298. Exception: cannot identify image file <_io.BytesIO object at 0x7fbda195dc10> Failed to fetch sample 1560142. Exception: cannot identify image file <_io.BytesIO object at 0x7fba4d8d6cf0> Failed to fetch sample 2987417. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1a4860> Failed to fetch sample 3898287. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9d5cbfdd0> Failed to fetch sample 1154977. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1da8e0> Failed to fetch sample 1511396. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1a74c0> Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9cb2b4900> Failed to fetch sample 1628257. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1a4400> Failed to fetch sample 3148873. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bd260> Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9fa18c810> Failed to fetch sample 3258134. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f4cb80> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1bccc0> Failed to fetch sample 2539019. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f4d4e0> Failed to fetch sample 1280998. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc14f420> Failed to fetch sample 3047513. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f3ccc0> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc111bc0> Failed to fetch sample 4094238. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2a6cf0> Failed to fetch sample 3193123. Exception: cannot identify image file <_io.BytesIO object at 0x7f32c4233ba0> Failed to fetch sample 3584846. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9fdb0ede0> cannot identify image file <_io.BytesIO object at 0x7fe0ec2a76f0> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2a5080> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd4eaa7fb0> Failed to fetch sample 4070669. Exception:Failed to fetch sample 3346372. Exception: ject at 0x7fd0b5eb0ea0> Failed to fetch sample 1792884. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a563e70> cannot identify image file <_io.BytesIO object at 0x7fe0ec2a4c20> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f098641f970> Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a609710> Failed to fetch sample 4172937. Exception: cannot identify image file <_io.BytesIO object at 0x7f32bcb5cb30> Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO object at 0x7fde041adda0> Failed to fetch sample 2363461. Exception:Failed to fetch sample 1879013. Exception:bject at 0x7f35a7f88e00> cannot identify image file <_io.BytesIO object at 0x7fe0ec283790> Failed to fetch sample 3360142. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fca2e1c6750> cannot identify image file <_io.BytesIO object at 0x7f35a7f4ab10> Failed to fetch sample 3633396. Exception: cannot identify image file <_io.BytesIO object at 0x7fc72566bfb0> cannot identify image file <_io.BytesIO object at 0x7f35a7f4b970> Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO object at 0x7fc715940c70> ailed to fetch sample 3584846. Exception: cannot identify image file <_io.BytesIO object at 0x7f32986b9940> cannot identify image file <_io.BytesIO object at 0x7fe436a37600> Failed to fetch sample 1345056. Exception:Failed to fetch sample 2730890. Exception:bject at 0x7fcbdc1ec900> Failed to fetch sample 1491402. Exception: cannot identify image file <_io.BytesIO object at 0x7f32986bbfb0> cannot identify image file <_io.BytesIO object at 0x7fc99e02e930> cannot identify image file <_io.BytesIO object at 0x7fe4466a6b10> Failed to fetch sample 2245953. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7ecea0> Failed to fetch sample 2248321. Exception: cannot identify image file <_io.BytesIO object at 0x7fca9af21c10> Failed to fetch sample 3249441. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f88400> cannot identify image file <_io.BytesIO object at 0x7fc7f36db740> Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7f33b242c860> cannot identify image file <_io.BytesIO object at 0x7fca2e1c6a70> Failed to fetch sample 3573478. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fd670> cannot identify image file <_io.BytesIO object at 0x7f35a7f88cc0> Failed to fetch sample 2579749. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fe430> cannot identify image file <_io.BytesIO object at 0x7f32bcb5c3b0> Failed to fetch sample 3538308. Exception:Failed to fetch sample 1879013. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f3297a1ff60>
Failed to fetch sample 2365052. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2924881. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1095181. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3909731. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3228934. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3475553. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2180795. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1345056. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2265597. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2145986. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1175528. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2766877. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1597276. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1029176. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3188707. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3276750. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2761972. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2359987. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1573349. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1573545. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3647389. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2237837. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2693281. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1388614. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2111070. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3245493. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2454588. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3056093. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3569930. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3105090. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3034526. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1006235. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2267808. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3915163. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1742099. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2940708. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1339465. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2040873. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1133332. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1407211. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1382990. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2402011. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2047816. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2851854. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2681934. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3086664. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1131587. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2759888. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1788358. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1027167. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1062970. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1580794. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3615028. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2343306. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3050183. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1890486. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1218572. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1470095. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1172853. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1820452. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1762324. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3347743. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2806356. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3260553. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3494092. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1792884. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1576941. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3562916. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2730890. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2248321. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3573478. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2579749. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3840550. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2966793. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2923798. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1661733. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3644537. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3240425. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3605195. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2904427. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1326468. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1352404. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3898287. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3148873. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3851946. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3652311. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3634589. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3887256. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3929430. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2783090. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2888688. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 979505. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3902912. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2049038. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3868375. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3520194. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2434723. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3835135. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3733333. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1739681. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1723547. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2771142. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2888601. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2914062. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2604984. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1471494. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1107088.
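The "Failed to fetch sample … Exception: cannot identify image file" lines above are PIL's decode error surfacing through the dataset's fetch path; the run continues because the loader logs the failure and substitutes a different sample. A minimal sketch of that fallback pattern — all names here (`SafeFetchDataset`, `fetch_fn`) are illustrative, not the actual training code:

```python
import random

class SafeFetchDataset:
    """Wraps a fetch function; on a corrupt sample, logs the failure and
    retries a randomly chosen other index instead of crashing the run."""

    def __init__(self, fetch_fn, size, max_retries=10):
        self.fetch_fn = fetch_fn      # index -> sample; may raise on bad data
        self.size = size
        self.max_retries = max_retries

    def __getitem__(self, index):
        for _ in range(self.max_retries):
            try:
                return self.fetch_fn(index)
            except Exception as exc:
                # Mirrors the message format seen in the log above.
                print(f"Failed to fetch sample {index}. Exception: {exc}")
                index = random.randrange(self.size)  # fall back to another sample
        raise RuntimeError("too many consecutive unreadable samples")
```

With PIL, `fetch_fn` would typically call `Image.open(io.BytesIO(raw))`, which raises `UnidentifiedImageError` ("cannot identify image file") on truncated or corrupt bytes — exactly the exception flooding this log.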
Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a4860>
0%| | 1/34278 [01:51<1062:37:15, 111.60s/it] {'loss': 2.7051, 'grad_norm': 40.05717688896068, 'learning_rate': 9.718172983479106e-09, 'epoch': 0.0}
Failed to fetch sample 1342876. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2761713. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2466674. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4189667. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1865634. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2427199. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2249643. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1929498. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1597034. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1770342. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2979749. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3467927. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3419178. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2479743.
Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39f96c0>
Failed to fetch sample 2253806. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3711673. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1213089. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2531690. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1988585. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2787204. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3604356. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2331127. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1010081. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2655989. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4380313.
Exception: cannot identify image file <_io.BytesIO object at 0x7f23ebead120>
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Failed to fetch sample 1702419. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2327727. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3177508. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1157171.
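The repeated `torch/utils/checkpoint.py:87` UserWarning fires when reentrant activation checkpointing is given no tensor with `requires_grad=True` (common when embeddings are frozen), in which case the checkpointed segment produces no gradients. A hedged sketch of the two usual remedies, assuming a PyTorch version where `checkpoint` accepts `use_reentrant`:

```python
import torch
from torch.utils.checkpoint import checkpoint

def block(x):
    # Stand-in for a transformer layer wrapped in activation checkpointing.
    return torch.relu(x).sum(dim=-1)

x = torch.randn(4, 8)  # requires_grad is False, like frozen inputs

# Remedy 1: the non-reentrant implementation tolerates grad-free inputs
# (and is the recommended mode in recent PyTorch releases).
y = checkpoint(block, x, use_reentrant=False)

# Remedy 2: ensure at least one input requires grad, so the reentrant
# path has something to differentiate and the warning goes away.
x2 = x.clone().requires_grad_(True)
y2 = checkpoint(block, x2, use_reentrant=True)
y2.sum().backward()
```

Whether either remedy is appropriate here depends on the training setup; if the frozen-input stage is intentional, the warning is benign and can be ignored.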
Exception: cannot identify image file <_io.BytesIO object at 0x7ff4daba36a0>
Failed to fetch sample 2531690. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1018021. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2748477. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1470654. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1308960. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2940708. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1923667. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3718054. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1365649. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2468744. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3895655. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4014831. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1179750. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2965534. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 986315. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1751634. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3596834. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1956953. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3177508. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1967543. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2979749. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2772050. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3921018. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4027789. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3733155. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1929498. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2801877. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1367088. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1680813. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3368153. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1391226. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2397886. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3103854. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1770342. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2655989. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1213089. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4085188. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1682209. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1988585. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2422766. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4134262. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4380313.
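Given how many samples fail to decode, it can be cheaper to pre-filter the dataset offline than to pay for a failed decode on every epoch. A minimal, hypothetical sketch of a magic-byte check that flags obviously corrupt image blobs without attempting a full decode (`sniff_image` and the format table are illustrative; a real pipeline might instead use `PIL.Image.open(...).verify()`):

```python
# Byte-prefix signatures for the common image containers.
MAGIC = {
    b"\x89PNG\r\n\x1a\n": "png",
    b"\xff\xd8\xff": "jpeg",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
    b"RIFF": "riff",  # RIFF container; the WEBP tag sits at byte offset 8
}

def sniff_image(data: bytes):
    """Return a format name if the prefix looks like a known image, else None."""
    for sig, fmt in MAGIC.items():
        if data.startswith(sig):
            if fmt == "riff":
                return "webp" if data[8:12] == b"WEBP" else None
            return fmt
    return None
```

Blobs for which `sniff_image` returns `None` are exactly the ones PIL would reject with "cannot identify image file"; a header match does not guarantee the rest of the file is intact, so this is a fast first pass, not a full validation.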
Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2f3ec0>
Failed to fetch sample 2296629. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1597034. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4235964. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1351069. Exception: cannot identify image file <_io.BytesIO object at 0x...>
0%| | 2/34278 [01:57<470:33:40, 49.42s/it] {'loss': 2.8738, 'grad_norm': 42.36435700832868, 'learning_rate': 1.943634596695821e-08, 'epoch': 0.0}
Failed to fetch sample 1424790. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1863405. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1911009. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1762846. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3933696. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3440636. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2612567. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2091264. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1880922. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2383706. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2443361. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2263078.
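The two progress lines also encode the run's ETA: tqdm multiplies the remaining steps by the average seconds per iteration, which is why the estimate falls from ~1062 h after the 111.60 s first step (which includes warmup) to ~470 h once the average drops to 49.42 s/it. The arithmetic, as a quick check (`eta_hours` is an illustrative helper, not tqdm's actual code):

```python
def eta_hours(done, total, sec_per_it):
    """tqdm-style remaining time: remaining steps x average s/it, in hours."""
    return (total - done) * sec_per_it / 3600

# Numbers taken from the two logged progress lines.
print(round(eta_hours(1, 34278, 111.60), 1))  # ≈ 1062.6, matching 1062:37:15
print(round(eta_hours(2, 34278, 49.42), 1))   # ≈ 470.5, matching 470:33:40
```

At a sustained 49.42 s/it the full 34278-step epoch would take roughly 20 days, which is usually the cue to look at the dataloader (the sample-fetch failures above each burn time in retries) before anything else.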
Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c427f5b0>
Failed to fetch sample 3838011. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2647104. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1930721. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2861238. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2903791. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1301230. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 3557596. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1021516. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2108438. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1280377. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2815741. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4146076. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2633220. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2639470. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 4186967. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1561037. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 1409821. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2004558. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2123650. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2717980. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2358648. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 2583452. Exception: cannot identify image file <_io.BytesIO object at 0x...>
Failed to fetch sample 980211.
Exception: cannot identify image file <_io.BytesIO object at 0x7f7a569596c0>
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 1272472. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 2 cannot identify image file <_io.BytesIO oFailed to fetch sample 19Failed to fetch sample 1168708. Exception: cannot identify image file <_io.BytesIO object at 0x7f27754a8fe0> Failed to fetch sample 1930721. Exception: cannot identify image file <_io.BytesIO object at 0x7fc108455df0> Failed to fetch sample 4026469. Exception:Failed to fetch sample 4026469. Exception:bject at 0x7f7e078af6f0> Failed to fetch sample 2679326. Exception: cannot identify image file <_io.BytesIO object at 0x7f1133fd1670> Failed to fetch sample 2139475. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401e04f0> Failed to fetch sample 2861238. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078ae4d0> Failed to fetch sample 4146076. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e078af1f0> Failed to fetch sample 3585602. Exception: cannot identify image file <_io.BytesIO object at 0x7fb37942ffb0> ailed to fetch sample 2861238. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401e3150> Failed to fetch sample 4146076. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401e0680> Failed to fetch sample 3455666. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f90ae0> cannot identify image file <_io.BytesIO object at 0x7f713410b240> Failed to fetch sample 1930721. Exception: cannot identify image file <_io.BytesIO object at 0x7f71342d68e0> cannot identify image file <_io.BytesIO object at 0x7fb7401e2930> Failed to fetch sample 4026469. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f71342d7010> Failed to fetch sample 2679326. Exception: cannot identify image file <_io.BytesIO object at 0x7f724518e840> Failed to fetch sample 3695052. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abcdcade0> Failed to fetch sample 2783531. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abce77740> Failed to fetch sample 1173608. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd95d0> Failed to fetch sample 3643618. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560643b0> Failed to fetch sample 1231243. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cfbbf0> Failed to fetch sample 1350575. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd8680> Failed to fetch sample 3509971. Exception:Failed to fetch sample 2702504. Exception:bject at 0x7fa58cc98360> cannot identify image file <_io.BytesIO object at 0x7f4e84aaf380> Failed to fetch sample 1231243. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beebaa700> Failed to fetch sample 1350575. Exception: cannot identify image file <_io.BytesIO object at 0x7f4c161e4b30> Failed to fetch sample 3509971. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f4e73cf90> cannot identify image file <_io.BytesIO object at 0x7f4fc86f5ee0> Failed to fetch sample 2615148. Exception:Failed to fetch sample 2429230. Exception: ject at 0x7f7b979f5710> cannot identify image file <_io.BytesIO object at 0x7f5742102e30> Failed to fetch sample 3533762. Exception: cannot identify image file <_io.BytesIO object at 0x7f57420e5300> cannot identify image file <_io.BytesIO object at 0x7f09864794e0> Failed to fetch sample 1227562. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369edd50> Failed to fetch sample 1348937. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5695a070> Failed to fetch sample 1315170. Exception:bject at 0x7f7b9797bfb0> cannot identify image file <_io.BytesIO object at 0x7f5692a78fe0> le <_io.BytesIO object at 0x7fdc25628720> ailed to fetch sample 1291020. Exception: ect at 0x7f5a41bb7fb0> Failed to fetch sample 1409821. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b979ad0d0> Failed to fetch sample 1536594. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac213290> le <_io.BytesIO object at 0x7f7b979f6430> Failed to fetch sample 2647104. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278ffbf0> Failed to fetch sample 2229731. Exception:ject at 0x7f357b9ddb20> Failed to fetch sample 3368502. Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 1788426. Exception: cannot identify image file <_io.BytesIO object at 0x7fef478ca020> Failed to fetch sample 2358648. Exception: cannot identify image file <_io.BytesIO object at 0x7f5f76a89760> Failed to fetch sample 2647104. Exception: cannot identify image file <_io.BytesIO object at 0x7f566810dee0> Failed to fetch sample 1685202. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37c2520> cannot identify image file <_io.BytesIO object at 0x7fc72566bfb0> Failed to fetch sample 980211. Exception: cannot identify image file <_io.BytesIO object at 0x7f5de37c0d60> Failed to fetch sample 3557596. Exception: cannot identify image file <_io.BytesIO object at 0x7fc706ee1940> Failed to fetch sample 1331834. Exception: cannot identify image file <_io.BytesIO object at 0x7ff248d6f2e0> Failed to fetch sample 4139728. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5a5710> 93222. Exception: ject at 0x7ff4daba1c10> Failed to fetch sample 2954477. Exception:bject at 0x7fb52f605350> Failed to fetch sample 2198494. 
Exception: cannot identify image f cannot identify image file <_io.BytesIO obFailed to fetch sample 2 cannot identify image file <_io.BytesIO object at 0x7fa26146f510> t 0x7f81db541c60> Failed to fetch sample 4186967. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5744f0> Failed to fetch sample 2108438. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db576a20> cannot identify image file <_io.BytesIO object at 0x7fb238772bb0> le <_io.BytesIO object at 0x7f99a39a98a0> Failed to fetch sample 2123650. Exception: cannot identify image fiFailed to fetch sample 3287116. Exception:Failed to fetch sample 4186967. Exception:bject at 0x7ff401f88a40> cannot identify image file <_io.BytesIO object at 0x7fe4c427e610> le <_io.BytesIO object at 0x7f96a4d87650> Failed to fetch sample 3533762. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5bf740> Failed to fetch sample 3464119. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39abe20> Failed to fetch sample 2004558. Exception: cannot identify image file <_io.BytesIO object at 0x7fbfebb83330> Failed to fetch sample 2123650. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64e3010> Failed to fetch sample 2358648. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f607600> cannot identify image file <_io.BytesIO object at 0x7fc2eb7bd4e0> le <_io.BytesIO object at 0x7f9a5c9af1f0> Failed to fetch sample 1409821. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f574c0> Failed to fetch sample 2583452. Exception: cannot identify image file <_io.BytesIO object at 0x7f96aec33f10> Failed to fetch sample 3333023. Exception: cannot identify image file <_io.BytesIO object at 0x7ff401f6f010> cannot identify image file <_io.BytesIO object at 0x7efcdbd744f0> cannot identify image file <_io.BytesIO o cannot identify image fiFailed to fetch sample 980211. 
Exception: cannot identify image filFailed to fetch sample 2783531. Exception: cannot identify image file <_io.BytesIO object at 0x7f80a80dec50> Failed to fetch sample 2861238. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb3f060> Failed to fetch sample 4146076. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9de63e6b0> Failed to fetch sample 3264059. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb95170> Failed to fetch sample 2717980. Exception:ject at 0x7f12b303c310> cannot identify image file <_io.BytesIO object at 0x7ef9f6a6aa70> Failed to fetch sample 1489531. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fb5adfa10> Failed to fetch sample 2000453. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8e16c0> cannot identify image file <_io.BytesIO object at 0x7fc3849dc310> Failed to fetch sample 1231243. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a974cc0> cannot identify image file <_io.BytesIO object at 0x7fc0f0951210> Failed to fetch sample 1350575. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f09526b0> cannot identify image file <_io.BytesIO object at 0x7fc1cf673830> ile <_io.BytesIO object at 0x7fc0f090a480> Failed to fetch sample 2783531. Exception: cannot identify image file <_io.BytesIO object at 0x7fc1cf6853a0> Failed to fetch sample 4144376. Exception: cannot identify image file <_io.BytesIO object at 0x7f6d159deb10> Failed to fetch sample 2639470. Exception: cannot identify image file <_io.BytesIO object at 0x7f6ea5d86ac0> 0%| | 3/34278 [02:05<289:43:48, 30.43s/it] {'loss': 2.4747, 'grad_norm': 39.0451605386664, 'learning_rate': 2.915451895043732e-08, 'epoch': 0.0} 0%| | 3/34278 [02:05<289:43:48, 30.43s/it]Failed to fetch sample 1601693. Exception: ailed to fetch sample 3383402. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf889ee0> Failed to fetch sample 3236005. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f6d145b8540> Failed to fetch sample 2112357. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d1dbc0> Failed to fetch sample 3503261. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a571850> Failed to fetch sample 3487466. Exception: cannot identify image file <_io.BytesIO object at 0x7f09ae101c60> Failed to fetch sample 4218927. Exception: cannot identify image fFailed to fetch sample 1396568. Exception: cannot identify image file <_io.BytesIO object at 0x7ff2915750d0> Failed to fetch sample 3874635. Exception: Failed to fetch sample 3569930. Exception: cannot identify image fFailed to fetch sample 2539019. Exception: cannot identify image file <_io.BytesIO object at 0x7fb7401cf290> Failed to fetch sample 2806356. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42b34c0> Failed to fetch sample 3569930. Exception: Failed to fetch sample 3028761. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a0a570> Failed to fetch sample 2305108. Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7f57d80> Failed to fetch sample 4027846. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e3d6cff10> Failed to fetch sample 1721807. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1dbba0> Failed to fetch sample 3531448. Exception: cannot identify image file <_io.BytesIO object at 0x7fccdf7bd620> Failed to fetch sample 3034526. Exception: cannot identify image file <_io.BytesIO object at 0x7f7ccc5ffe70> Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO object at 0x7fb9777dd530> Failed to fetch sample 3494092. Exception: cannot identify image file <_io.BytesIO object at 0x7f4beec025c0> Failed to fetch sample 3376972. Exception: cannot identify image file <_io.BytesIO object at 0x7f5253bddee0> Failed to fetch sample 3034526. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f99a394a200> Failed to fetch sample 3083872. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe723fc270> Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f9057b0> Failed to fetch sample 2049038. Exception: cannot identify image file <_io.BytesIO object at 0x7f60ffec6b10> Failed to fetch sample 3605195. Exception: cannot identify image file <_io.BytesIO object at 0x7f7ccc605710> Failed to fetch sample 3193123. Exception: cannot identify image file <_io.BytesIO object at 0x7f80201b31f0> Failed to fetch sample 1345056. Exception: cannot identify image file <_io.BytesIO object at 0x7f6496057fb0> Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbde566660> Failed to fetch sample 1339465. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08ef2e0> Failed to fetch sample 3083872. Exception: cannot identify image file <_io.BytesIO object at 0x7fbaf0507f60> Failed to fetch sample 1339465. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac239080> Failed to fetch sample 3376972. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a267f0> Failed to fetch sample 3892787. Exception: cannot identify image file <_io.BytesIO object at 0x7fd99f4c0cc0> Failed to fetch sample 3892787. Exception: cannot identify image file <_io.BytesIO object at 0x7f1a0d29a610> Failed to fetch sample 1407211. Exception: cannot identify image file <_io.BytesIO object at 0x7fc10f8affb0> Failed to fetch sample 2591593. Exception: cannot identify image file <_io.BytesIO object at 0x7f70b607dc60> Failed to fetch sample 3290404. Exception: Failed to fetch sample 3290404. Exception:cannot identify image fiFailed to fetch sample 1407211. Exception: cannot identify image file <_io.BytesIO object at 0x7f94ac2fa0c0> Failed to fetch sample 2145986. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f64960835b0> cannot identify image file <_io.BytesIO object at 0x7fd2003daf20> Failed to fetch sample 2265597. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1ed670> Failed to fetch sample 2801811. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d18fe0> Failed to fetch sample 2145986. Exception: cannot identify image file <_io.BytesIO object at 0x7fc83c582a20> Failed to fetch sample 1140456. Exception: cannot identify image file <_io.BytesIO object at 0x7fde75ec5670> Failed to fetch sample 3629990. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d1dd00> Failed to fetch sample 3629990. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8e2890> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d1fa10> Failed to fetch sample 2168694. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25628540> /mnt/petrelfs/liuzhaoyang/workspace/progra/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( ms/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( iconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc25629120> Failed to fetch sample 1664782. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e5d1fe20> warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 2888688. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8e3560> Failed to fetch sample 2888688. Exception: cannot identify image file <_io.BytesIO object at 0x7f191a59d760> Failed to fetch sample 2704010. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937d1490> Failed to fetch sample 3433732. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682dc15d0> Failed to fetch sample 3531448. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682ded620> Failed to fetch sample 2720068. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9f3001300> Failed to fetch sample 2806356. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9de63e6b0> Failed to fetch sample 3494092. 
Exception: cannot identify image file <_io.BytesIO object at 0x7efcd7bf4860> Failed to fetch sample 2671803. Exception: Failed to fetch sample 1599800. Exception:ject at 0x7fbf5f1c5bc0> Failed to fetch sample 2238423. Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 2276484. Exception: cannot identify image file <_io.BytesIO object at 0x7fc26217a930> Failed to fetch sample 1876146. Exception: cannot identify image file <_io.BytesIO object at 0x7f4d18204bd0> Failed to fetch sample 1934945. Exception:: cannot identify image file <_io.BytesIO object at 0x7fb37942ffb0> Failed to fetch sample 2958575. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42b2980> Failed to fetch sample 2049038. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722a9850> cannot identify image file <_io.BytesIO o cannot identify image file <_io.BytesIO object at 0x7fc2eb74cfe0> Failed to fetch sample 2111070. Exception: cannot identify image file <_io.BytesIO object at 0x7fb39d910ae0> Failed to fetch sample 4195435. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f54027ba0> cannot identify image file <_io.BytesIO obFailed to fetch sample 1cannot identify image file <_io.BytesIO object at 0x7fc0f0903fb0> at 0x7fc0084b8ef0> Failed to fetch sample 2625647. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7cb7e0> Failed to fetch sample 1934945. Exception: cannot identify image file <_io.BytesIO object at 0x7f7e306e62a0> Failed to fetch sample 3550318. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a60bba0> Failed to fetch sample 1876146. Exception:cannot identify image file <_io.BytesIO object at 0x7efd05a7ffb0> F cannot identify image file <_io.BytesIO object at 0x7fb2fd3c9a80> Failed to fetch sample 2806356. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741fd66b0> Failed to fetch sample 2806356. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6a6250> Failed to fetch sample 3494092. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741f1d7b0> Failed to fetch sample 1605459. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7cbb00> Failed to fetch sample 1934945. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a60a570> Failed to fetch sample 2966793. Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 3193123. Exception:ample 4083003. Exception: cannot identify image file <_io.BytesIO object at 0x7f6496082a20> Failed to fetch sample 3568773. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7c814130>: cannot identify image file <_io.BytesIO object at 0x7f47278b5120> ailed to fetch sample 2208615. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a41bb7fb0>FFailed to fetch sample 2098555. Exception cannot identify image file <_io.BytesIO object at 0x7f490a689580> le <_io.BytesIO object at 0x7f5a176d8a90> Failed to fetch sample 2421840. Exception:ject at 0x7f082810b650> Failed to fetch sample 3034526. Exception: cannot identify image file <_io.BytesIO object at 0x7f61928728e0> Failed to fetch sample 2528376. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec276fc0> Failed to fetch sample 3605195. Exception:bject at 0x7fbf0f9a1490> Failed to fetch sample 2238423. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf346cd5d0> cannot identify image file <_io.BytesIO object at 0x7fcbdc1dba10> cannot identify image file <_io.BytesIO object at 0x7f64960820c0> Failed to fetch sample 3837981. Exception: cannot identify image file <_io.BytesIO object at 0x7f082810b330> Failed to fetch sample 3837981. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec236d40> Failed to fetch sample 2049038. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f0615a46660> Failed to fetch sample 1339465. Exception: cannot identify image file <_io.BytesIO object at 0x7f649602dfd0> Failed to fetch sample 1542067. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec277100> Failed to fetch sample 2539019. Exception: cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fc3ffe713a0> 66793. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828188fe0> Failed to fetch sample 1407211. Exception: cannot identify image file <_io.BytesIO object at 0x7f6496a056c0> Failed to fetch sample 2966793. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec274e50> Failed to fetch sample 2539019. Exception: cannot identify image fiFailed to fetch sample 2806356. Exception:Failed to fetch sample 3047513. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a5ecf0> Failed to fetch sample 4227559. Exception: cannot identify image file <_io.BytesIO object at 0x7f59ec798360> Failed to fetch sample 3588244. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a24270> Failed to fetch sample 3489137. Exception: cannot identify image fFailed to fetch sample 3193123. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35d6b3d0> cannot identify image file <_io.BytesIO object at 0x7f58c39c59e0> le <_io.BytesIO object at 0x7f6d145b9850> cannot identify image file <_io.BytesIO object at 0x7fb6a411fba0> Failed to fetch sample 2165648. Exception: cannot identify image file <_io.BytesIO object at 0x7f5690564bd0> Failed to fetch sample 1345056. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39d94e0> Failed to fetch sample 1987808. Exception:ject at 0x7f5ea6aa7fb0> ile <_io.BytesIO object at 0x7f5aed8ebe20> Failed to fetch sample 2165648. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a143240> cannot identify image file <_io.BytesIO object at 0x7f55cb0cc9f0> Failed to fetch sample 2145986. Exception:Failed to fetch sample 1345056. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e899d38d0> bject at 0x7f5de3734770> le <_io.BytesIO object at 0x7fbab5304bd0> Failed to fetch sample 2265597. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a118310> Failed to fetch sample 2145986. Exception: cannot identify image file <_io.BytesIO object at 0x7f5ead6207c0> Failed to fetch sample 2049038. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc145170> Failed to fetch sample 2966793. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3812c50> cannot identify image file <_io.BytesIO object at 0x7fbcdc107a10> Failed to fetch sample 2310853. Exception: cannot identify image file <_io.BytesIO object at 0x7f22e3893880> 0%| | 4/34278 [02:09<190:06:36, 19.97s/it] {'loss': 2.8732, 'grad_norm': 42.16677754699, 'learning_rate': 3.887269193391642e-08, 'epoch': 0.0} 0%| | 4/34278 [02:09<190:06:36, 19.97s/it]Failed to fetch sample 2524499. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf8e95d0> Failed to fetch sample 2546338. Exception: cannot identify image file <_io.BytesIO object at 0x7f6a7f285350> Failed to fetch sample 1532862. Exception: cannot identify image file <_io.BytesIO object at 0x7f1b7b9c05e0> Failed to fetch sample 2338191. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5697eb10> Failed to fetch sample 1591886. Exception: cannot identify image file <_io.BytesIO object at 0x7f09ba001df0> Failed to fetch sample 2530233. Exception: cannot identify image file <_io.BytesIO object at 0x7efe76affd80> Failed to fetch sample 2217185. Exception: cannot identify image file <_io.BytesIO object at 0x7fd0b3de75b0> Failed to fetch sample 4116521. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fe46c325ad0> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f67a0> Failed to fetch sample 2671128. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c388e5c0> Failed to fetch sample 3116930. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73249d50> Failed to fetch sample 3380043. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42ae610> Failed to fetch sample 1832660. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b2430b0> Failed to fetch sample 3480772. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2c01d0> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c380f060> Failed to fetch sample 3846458. Exception: cannot identify image file <_io.BytesIO object at 0x7f122a57ac00> Failed to fetch sample 3122390. Exception: cannot identify image file <_io.BytesIO object at 0x7f47278c0e50> Failed to fetch sample 3846458. Exception: cannot identify image file <_io.BytesIO object at 0x7fbe722c47c0> Failed to fetch sample 2979603. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e651d990> Failed to fetch sample 1501211. Exception: cannot identify image file <_io.BytesIO object at 0x7f7eb00d6340> Failed to fetch sample 3272055. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a58f9c0> Failed to fetch sample 3846458. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f797d80> Failed to fetch sample 3861924. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436a243b0> Failed to fetch sample 4112540. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a5697c310> Failed to fetch sample 3015321. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73270360> Failed to fetch sample 1387479. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f83e833b1f0> Failed to fetch sample 3879860. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a3fe53650> Failed to fetch sample 1815480. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f797740> Failed to fetch sample 1387479. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4466a6b10> Failed to fetch sample 3879860. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568e5fd0> Failed to fetch sample 3245561. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7c3290> Failed to fetch sample 1573395. Exception: cannot identify image file <_io.BytesIO object at 0x7fe8eb237240> Failed to fetch sample 3879860. Exception: Failed to fetch sample 1100749. Exception: cannot identify image file <_io.BytesIO object at 0x7f7a568e6390> Failed to fetch sample 3276754. Exception: cannot identify image file <_io.BytesIO object at 0x7efe76ab74c0> ject at 0x7f357b9d18f0> Failed to fetch sample 3245561. Exception: cannot identify image file <_io.BytesIO object at 0x7fcf20a0f150> Failed to fetch sample 1523937. Exception: cannot identify image fFailed to fetch sample 3845128. Exception: cannot identify image file <_io.BytesIO object at 0x7fe436e06250> ile <_io.BytesIO object at 0x7f9a9c2d1440> Failed to fetch sample 3276754. Exception: cannot identify image file <_io.BytesIO object at 0x7fd200376750> Failed to fetch sample 2388130. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d91b83830> Failed to fetch sample 3671548. Exception: cannot identify image file <_io.BytesIO object at 0x7fe45eaf1850> Failed to fetch sample 1467431. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2d1a80> /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 2868963. Exception: cannot identify image file <_io.BytesIO object at 0x7f991b3e74c0> Failed to fetch sample 2101533. Exception: cannot identify image file <_io.BytesIO object at 0x7f991bde3600> Failed to fetch sample 1162911. Exception: cannot identify image file <_io.BytesIO object at 0x7f78b3865d50> Failed to fetch sample 2520004. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b13358270> Failed to fetch sample 3380043. Exception:Failed to fetch sample 2101533. Exception:bject at 0x7f4beec79c10> cannot identify image file <_io.BytesIO object at 0x7f94ac2ef8d0> cannot identify image file <_io.BytesIO object at 0x7f5de37ab420> Failed to fetch sample 1357780. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8ead800> Failed to fetch sample 2554836. Exception:Failed to fetch sample 4172647. Exception:bject at 0x7f53a8e9d9e0> cannot identify image file <_io.BytesIO oFailed to fetch sample 33Failed to fetch sample 2605050. 
Exception:bject at 0x7f8151923fb0> cannot identify image file <_io.BytesIO object at 0x7fdaad56f5b0> Failed to fetch sample 1815480. Exception: cannot identify image fiFailed to fetch sample 3435906. Exception:Failed to fetch sample 2120765. Exception:ject at 0x7ff401fb90d0> Failed to fetch sample 3245561. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8f32ed0> Failed to fetch sample 3276754. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8eae7f0> Failed to fetch sample 2729944. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f73fb0> cannot identify image file <_io.BytesIO object at 0x7fdcaf8e9350> Failed to fetch sample 1815480. Exception: cannot identify image file <_io.BytesIO object at 0x7f5cacd624d0> Failed to fetch sample 1894808. Exception:ject at 0x7fd9c99fbe20> Failed to fetch sample 2181592. Exception: cannot identify image file <_io.BytesIO object at 0x7fd9d9c3a520> Failed to fetch sample 2461417. Exception: cannot identify image fFailed to fetch sample 3176205. Exception: Failed to fetch sample 1815480. Exception: cannot identify image fFailed to fetch sample 3276754. Exception: Failed to fetch sample 3245561. Exception: cannot identify image fFailed to fetch sample 2554836. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c1bee6b10> cannot identify image file <_io.BytesIO object at 0x7f10ae7e9a80> Failed to fetch sample 3422624. Exception:Failed to fetch sample 3846458. Exception: cannot identify image file <_io.BytesIO object at 0x7f09c1c45580>: cannot identify image file <_io.BytesIO object at 0x7f1137ff2200> cannot identify image file <_io.BytesIO object at 0x7f27bed7af70> cannot identify image file <_io.BytesIO object at 0x7f5a41bb7fb0> Failed to fetch sample 1863122. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b926480> Failed to fetch sample 2554836. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278ca8e0> cannot identify image file <_io.BytesIO object at 0x7f10ae7e9210> Failed to fetch sample 1100749. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137f72f20> Failed to fetch sample 3846458. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278cb3d0> Failed to fetch sample 1815480. Exception: cannot identify image fiFailed to fetch sample 4133927. Exception:Failed to fetch sample 1815480. Exception:bject at 0x7fbea6365080> cannot identify image file <_io.BytesIO object at 0x7f5d278caa20> Failed to fetch sample 3245561. Exception: cannot identify image fiFailed to fetch sample 2372858. Exception: cannot identify image file <_io.BytesIO object at 0x7f56681d3880> Failed to fetch sample 3276754. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278c88b0> Failed to fetch sample 4133927. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7c32e0> Failed to fetch sample 1180601. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42a8e50> Failed to fetch sample 2366300. Exception: cannot identify image file <_io.BytesIO ob cannot identify image fFailed to fetch sample 1995538. Exception: Failed to fetch sample 4212880. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4c42ade40> ailed to fetch sample 3088654. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1b3d30> Failed to fetch sample 1819849. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc637d80> cannot identify image file <_io.BytesIO object at 0x7fb52f6cb920> Failed to fetch sample 2554836. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e795f7b00> Failed to fetch sample 2906731. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9de63e6b0> Failed to fetch sample 2437168. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f625580> cannot identify image file <_io.BytesIO object at 0x7fcbdc27c860> Failed to fetch sample 2437168. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbb3a570> Failed to fetch sample 1501211. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd36a0> Failed to fetch sample 2834769. Exception:bject at 0x7f5e6a8d2750> Failed to fetch sample 1815480. Exception: cannot identify image fiFailed to fetch sample 4156153. Exception: cannot identify image file <_io.BytesIO object at 0x7f802ac51b70> Failed to fetch sample 3551874. Exception: cannot identify image ficannot identify image file <_io.BytesIO obFailed to fetch sample 3245561. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e834b9c0> Failed to fetch sample 3879860. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd2930> Failed to fetch sample 1697987. Exception: cannot identify image file <_io.BytesIO object at 0x7ff249520220> cannot identify image file <_io.BytesIO object at 0x7fcbdc27f920> le <_io.BytesIO object at 0x7fe436a154e0> Failed to fetch sample 1100749. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cd0220> cannot identify image file <_io.BytesIO object at 0x7fee3f94fc40> le <_io.BytesIO object at 0x7fcbdc27cdb0> Failed to fetch sample 2671128. Exception:F cannot identify image file <_io.BytesIO object at 0x7fe436a57dd0> e <_io.BytesIO object at 0x7fee3f9731f0> cannot identify image file <_io.BytesIO object at 0x7fcbdc27ecf0> le <_io.BytesIO object at 0x7fe436a16570> Failed to fetch sample 1476826. Exception: cannot identify image file <_io.BytesIO object at 0x7f6682ddb2e0> Failed to fetch sample 1890614. Exception: cannot identify image file <_io.BytesIO object at 0x7f39486c0bd0> Failed to fetch sample 1284491. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f69300> Failed to fetch sample 1012319. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f3939318bd0> ailed to fetch sample 1357942. Exception: cannot identify image file <_io.BytesIO object at 0x7f724518ec00> Failed to fetch sample 1815480. Exception: cannot identify image file <_io.BytesIO object at 0x7f3bcea1c8b0> Failed to fetch sample 3326958. Exception:Failed to fetch sample 3245561. Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 4112540. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687fa0220> Failed to fetch sample 3276754. Exception: cannot identify image file <_io.BytesIO o cannot identify image fi cannot identify image file <_io.BytesIO object at 0x7fb7416e7e70> Failed to fetch sample 1100749. Exception: cannot identify image file <_io.BytesIO object at 0x7fb4933deac0> Failed to fetch sample 970478. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2db0b0> Failed to fetch sample 2453966. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280eabb0> Failed to fetch sample 3846458. Exception: cannot identify image file <_io.BytesIO object at 0x7f082819b240> Failed to fetch sample 1815480. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828199ee0> Failed to fetch sample 3245561. Exception: cannot identify image file <_io.BytesIO object at 0x7f063cd776f0> Failed to fetch sample 3276754. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281983b0> Failed to fetch sample 4375313. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b243f10> Failed to fetch sample 1765023. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f7ab0> Failed to fetch sample 3380043. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f18e64f4ae0> 0%| | 5/34278 [02:15<142:37:51, 14.98s/it] {'loss': 2.041, 'grad_norm': 28.392678217691305, 'learning_rate': 4.8590864917395535e-08, 'epoch': 0.0} 0%| | 5/34278 [02:15<142:37:51, 14.98s/it]Failed to fetch sample 3596598. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8316980> Failed to fetch sample 1307475. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cde9d0> Failed to fetch sample 2778890. Exception: Failed to fetch sample 4174338. Exception: cannot identify image fcannot identify image file <_io.BytesIO obFailed to fetch sample 3694109. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d7323b330> Failed to fetch sample 3667410. Exception: cannot identify image file <_io.BytesIO object at 0x7f5668141350> Failed to fetch sample 3683565. Exception: cannot identify image file <_io.BytesIO object at 0x7f2044008c70> Failed to fetch sample 2884160. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003b29d0> Failed to fetch sample 3566789. Exception: cannot identify image file <_io.BytesIO object at 0x7fb742671710> Failed to fetch sample 35 Failed to fetch sample 2241468. Exception: cannot identify image file <_io.BytesIO object at 0x7fe4369fe020> Failed to fetch sample 4144912. Exception: cannot identify image file <_io.BytesIO object at 0x7fd1a4b73f60>Failed to fetch sample 1010384. Exception: cannot identify image file <_io.BytesIO object at 0x7f09ba001b20> Failed to fetch sample 1099222. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7c94e0> Failed to fetch sample 2427631. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cec950e00> Failed to fetch sample 2730890. Exception: cannot identify image file <_io.BytesIO object at 0x7fdc269ace50> Failed to fetch sample 1792884. ExceptioFailed to fetch sample 3661522. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f092eac0> Failed to fetch sample 1742099. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e9e6ef060> Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7fb4a651dad0> Failed to fetch sample 1133332. Exception: cannot identify image fileFailed to fetch sample 1133332. Exception: cannot identify image file <_io.BytesIO object at 0x7f9d73283600> Failed to fetch sample 3188707. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b24bf60> Failed to fetch sample 3925231. Exception: cannot identify image file <_io.BytesIO object at 0x7f56682fa890> Failed to fetch sample 2730890. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c299670> Failed to fetch sample 1821617. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003bafc0> Failed to fetch sample 2402011. ExceptioFailed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf00756700> Failed to fetch sample 3148873. Exception: cannot identify image fileFailed to fetch sample 3925231. Exception: cannot identify image file <_io.BytesIO object at 0x7fc0f08fe070> Failed to fetch sample 1133332. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d59ec0450> Failed to fetch sample 2402011. Exception: cannot identify image file <_io.BytesIO object at 0x7fdd45a1d9e0> Failed to fetch sample 2248321. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341376f0> Failed to fetch sample 2248321. ExceptioFailed to fetch sample 2248321. Exception: cannot identify image fileFailed to fetch sample 4080972. Exception: cannot identify image file <_io.BytesIO object at 0x7f5caf574d60> Failed to fetch sample 3633396. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59f5b0> Failed to fetch sample 2248321. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bdfe20> Failed to fetch sample 2579749. ExceptioFailed to fetch sample 3573478. Exception: cannot identify image file <_io.BytesIO object at 0x7f7135736cf0> Failed to fetch sample 2359987. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b252d40> Failed to fetch sample 2579749. Exception: cannot identify image file <_io.BytesIO object at 0x7f9a9c2d9d00> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cabb00> Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4e3ebf7e0> Failed to fetch sample 2359987. Exception:Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ccdd50> cannot identify image file <_io.BytesIO object at 0x7fa275ca8e50> Failed to fetch sample 2579749. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ec0e0> Failed to fetch sample 2294453. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16e4e5c0> Failed to fetch sample 2365052. Exception: cannot identify image file <_io.BytesIO object at 0x7f96c1534950> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381d3f0> Failed to fetch sample 2579749. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35b961b0> Failed to fetch sample 1189922. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e3dd140e0> Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO object at 0x7f12b381de40> /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/Failed to fetch sample 2365052. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1132e0> Failed to fetch sample 2365052. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341c9760> /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 1252147. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16da91c0> /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 3909731. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3953fb0> Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db59e1b0> Failed to fetch sample 2040208. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a52feffb0> Failed to fetch sample 2365052. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bdf2e0> utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7ca110> Failed to fetch sample 2924881. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf87e610> Failed to fetch sample 2924881. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc0efd80> Failed to fetch sample 2924881. Exception: cannot identify image file <_io.BytesIO object at 0x7f71341369d0> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7ff27c2fffb0> Failed to fetch sample 3637098. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a3952660> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7f83045868e0> Failed to fetch sample 3637098. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560ee570> Failed to fetch sample 2924881. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bdf6a0> Failed to fetch sample 3081751. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5af060> Failed to fetch sample 3909731. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c69c75cb0> Failed to fetch sample 1879013. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5ac2c0> Failed to fetch sample 3637098. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35bdddf0> Failed to fetch sample 1870993. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff2581d7470> Failed to fetch sample 1225417. Exception: cannot identify image file <_io.BytesIO object at 0x7fde3edb96c0> Failed to fetch sample 1792884. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2a4c70> Failed to fetch sample 3116386. Exception: cannot identify image file <_io.BytesIO object at 0x7f0828123740> Failed to fetch sample 1758557. Exception: cannot identify image file <_io.BytesIO object at 0x7fe0ec2a67a0> Failed to fetch sample 3105090. Exception: cannot identify image file <_io.BytesIO object at 0x7f08280f29d0> Failed to fetch sample 2391522. Exception: cannot identify image file <_io.BytesIO object at 0x7ff7dc6968e0> Failed to fetch sample 3898287. Exception: cannot identify image file <_io.BytesIO object at 0x7ff40177d8f0> Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4015da2f0> Failed to fetch sample 3148873. Exception: cannot identify image file <_io.BytesIO object at 0x7ff40166b510> Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4015d9490> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7ff40166b560> Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7ff6973eb060> Failed to fetch sample 2646757. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc635620> Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 29 cannot identify image file <_io.BytesIO object at 0x7f9af7dc89f0> 31154. Exception: cannot identify image file <_io.BytesIO object at 0x7f7b9b996250> cannot identify image file <_io.BytesIO object at 0x7f9d7323b0b0> Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc637f60> Failed to fetch sample 3231174. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6a8da2f0> Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc659300> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7f9be63fe660> Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc65a2a0> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc65bc90> Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO object at 0x7f9abc634bd0> Failed to fetch sample 3506735. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a176d8a90> Failed to fetch sample 2731220. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f64ea70>cannot identify image file <_io.BytesIO object at 0x7f5a52feffb0> file <_io.BytesIO object at 0x7f5d59ec1350> Failed to fetch sample 3898287. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf48cfe2f0> Failed to fetch sample 2232130. Exception:bject at 0x7f5a3fe53650> le <_io.BytesIO object at 0x7fbf005d93f0> Failed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf005da840> Failed to fetch sample 1661266. Exception: cannot identify image file <_io.BytesIO object at 0x7f27acd83fb0> Failed to fetch sample 3148873. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d2787fe70> Failed to fetch sample 1029176. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb715620> Failed to fetch sample 1118652. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c38c0590> cannot identify image file <_io.BytesIO object at 0x7f81db5af060> Failed to fetch sample 3194835. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f64f0b0> Failed to fetch sample 3653818. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fc325746110> cannot identify image file <_io.BytesIO object at 0x7fc2eb7bab60> Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278e57b0> Failed to fetch sample 3925231. Exception: cannot identify image file <_io.BytesIO object at 0x7fc2eb7b9120> Failed to fetch sample 1280998. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf005db9c0> Failed to fetch sample 3530834. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c38c2250> cannot identify image file <_io.BytesIO object at 0x7f99a3af3ab0> Failed to fetch sample 1767382. Exception: cannot identify image file <_io.BytesIO object at 0x7efd280e0400> Failed to fetch sample 2761972. Exception: cannot identify image file <_io.BytesIO object at 0x7f7ecaa092b0> Failed to fetch sample 3439450. Exception: cannot identify image file <_io.BytesIO object at 0x7f5d278df790> Failed to fetch sample 1870993. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf005d9210> Failed to fetch sample 3584846. Exception: cannot identify image file <_io.BytesIO object at 0x7f001f94f010> Failed to fetch sample 2359987. Exception: cannot identify image fi cannot identify image file <_io.BytesIO oFailed to fetch sample 34Failed to fetch sample 1131587. Exception: cannot identify image file <_io.BytesIO oFailed to fetch sample 3314842. Exception: cannot identify image file <_io.BytesIO object at 0x7f32c06f69d0> bject at 0x7f001f917ce0> Failed to fetch sample 2947518. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5adb20> Failed to fetch sample 2759888. Exception: cannot identify image file <_io.BytesIO object at 0x7f9772b59300> Failed to fetch sample 3332187. Exception: cannot identify image file <_io.BytesIO object at 0x7f32c4233f10> cannot identify image file <_io.BytesIO object at 0x7f62a46cd170> 47389. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5af2e0> Failed to fetch sample 3346372. Exception: cannot identify image file <_io.BytesIO object at 0x7efd29827fb0> Failed to fetch sample 2903623. Exception: cannot identify image file <_io.BytesIO object at 0x7f81db5af560> Failed to fetch sample 3148873. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9de63e2f0> Failed to fetch sample 3562916. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9b4770> Failed to fetch sample 3562916. Exception: cannot identify image file <_io.BytesIO object at 0x7f5c27623100> Failed to fetch sample 1806693. Exception: cannot identify image file <_io.BytesIO object at 0x7efe2a006b60> Failed to fetch sample 2707047. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc14fe70> Failed to fetch sample 3653818. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbbd8fe0> Failed to fetch sample 2730890. Exception: cannot identify image file <_io.BytesIO object at 0x7f59d894ae30> Failed to fetch sample 2281458. Exception:Failed to fetch sample 1675267. Exception: ject at 0x7f35a7f88cc0> Failed to fetch sample 3086664. Exception: cannot identify image file <_io.BytesIO object at 0x7fbcdc1115d0> Failed to fetch sample 3741100. Exception: ailed to fetch sample 2248321. Exception: cannot identify image file <_io.BytesIO object at 0x7f5741fd0950> Failed to fetch sample 3573478. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f945e40> Failed to fetch sample 3731616. Exception: cannot identify image file <_io.BytesIO object at 0x7efcdbbd89f0> cannot identify image file <_io.BytesIO object at 0x7fd07a537ec0> ailed to fetch sample 2759888. Exception: cannot identify image file <_io.BytesIO object at 0x7f61be078db0> ile <_io.BytesIO object at 0x7f35a7f81b70> Failed to fetch sample 1131587. 
Failed to fetch samples 1870993, 3086664, 2759888, 1131587, 3538308, 2579749, 3105090, 2365052, 3714016, 2924881, 3731616, 3909731, 1821617, 3637098, 3698728, 1349984, 2284356, 2243451, 1206640, 1742099, 1771810, 2808025, 2730890, 2402011, 2248321, 1344015, 1792884, 3661522, 2047816, 1758557, 3573478, 3925231, 3056740, 1133332. Exception: cannot identify image file <_io.BytesIO object>
  0%|          | 6/34278 [02:19<106:42:47, 11.21s/it] {'loss': 3.5868, 'grad_norm': 72.60317211375468, 'learning_rate': 5.830903790087464e-08, 'epoch': 0.0}
Failed to fetch samples 1962930, 4020302, 1497057, 1782681, 2047149, 2205499. Exception: cannot identify image file <_io.BytesIO object>
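The "cannot identify image file" failures above are PIL's decode error: the fetched bytes are not a readable image (corrupt, truncated, or not an image at all). A minimal sketch of the kind of guard a data loader can use for this (hypothetical helper, not the actual training code behind this log):

```python
import io

from PIL import Image, UnidentifiedImageError


def load_image_or_none(raw_bytes: bytes):
    """Decode image bytes with PIL; return None instead of raising
    when the bytes are not an identifiable image."""
    try:
        img = Image.open(io.BytesIO(raw_bytes))
        img.load()  # force a full decode so truncated files fail here, not later
        return img
    except (UnidentifiedImageError, OSError):
        return None
```

A dataset `__getitem__` can call this and resample a different index when it gets `None`, which matches the "Failed to fetch sample N" retry pattern visible in the log.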
Failed to fetch samples 2707951, 3257443, 2551720, 2753054, 2799267, 2429018, 3435401, 2110654, 3080422, 4043317, 3004849, 3244596, 1219686, 2838960. Exception: cannot identify image file <_io.BytesIO object>
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Failed to fetch samples 3617610, 2756136, 1898440, 2250055. Exception: cannot identify image file <_io.BytesIO object>
Failed to fetch samples 1958527, 3279333, 3121131, 2616357, 2661797, 3144561, 3298369, 4026964, 2734727, 3435612, 2795376, 983047, 4205859, 4117024, 2257247, 2548660, 3503493, 2553414, 2587903, 2898278, 4241509, 3429280, 3866604, 3254207, 2814661, 1211698, 2979542, 3729408, 2055915, 2453882, 3423897, 2858679, 1218899, 1536489, 994847, 3661903, 4133438. Exception: cannot identify image file <_io.BytesIO object>
Failed to fetch sample 2535455. Exception: cannot identify image file <_io.BytesIO object>
  0%|          | 7/34278 [02:23<84:59:19, 8.93s/it] {'loss': 2.8729, 'grad_norm': 60.76151115615286, 'learning_rate': 6.802721088435375e-08, 'epoch': 0.0}
Failed to fetch samples 4060282, 3894579, 4051217, 1734512, 1975212, 1708133, 1947526, 2578023, 3217152, 1602322, 1381127, 4230799, 2658306, 1120039, 1740611, 2892575, 2819059, 2078887, 1250858, 1198666, 3923884, 2316288, 3390339, 3840067, 3617437, 978639, 1656236, 4057666, 1258856, 1562826, 1410411, 2478148, 3744461. Exception: cannot identify image file <_io.BytesIO object>
Failed to fetch samples 1850746, 2389294, 1935218, 1505183, 1397352, 1876488. Exception: cannot identify image file <_io.BytesIO object>
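The repeated `torch/utils/checkpoint.py:87` UserWarning in this log means a checkpointed forward segment received no tensors with `requires_grad=True`, which is typical when gradient checkpointing is enabled but the segment's inputs come from frozen layers. A minimal reproduction under the reentrant checkpoint implementation (assumes a recent PyTorch; this is not the run's actual model code):

```python
import warnings

import torch
import torch.utils.checkpoint as cp


def segment(x):
    # stands in for any checkpointed sub-forward
    return x * 2


x = torch.ones(3)  # requires_grad defaults to False
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    cp.checkpoint(segment, x, use_reentrant=True)

# same message as the warnings in the log above
assert any("None of the inputs have requires_grad=True" in str(w.message)
           for w in caught)
```

With Hugging Face-style models this is commonly addressed by making the embedding outputs require grad (e.g. `model.enable_input_require_grads()`) or by passing `use_reentrant=False` to checkpointing.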
Failed to fetch samples 2112620, 1188472, 2739250, 2574291, 3596833, 1991777, 1109477, 1645844, 4131003, 4182598, 1653519, 2641623, 2367911, 2270412, 2339167, 1072318, 2459986, 3473797, 1700847, 2019013, 1398442, 3004368, 1842393, 2008748, 2873850, 1511396, 2535152, 2516824, 1992592, 1523416, 2278342, 4206733, 4129675, 3868993, 2579485, 1968995, 4163489, 1203605, 2142902, 3175413, 2041919. Exception: cannot identify image file <_io.BytesIO object>
  0%|          | 8/34278 [02:29<77:04:23, 8.10s/it] {'loss': 2.924, 'grad_norm': 43.30099558144198, 'learning_rate': 7.774538386783285e-08, 'epoch': 0.0}
Failed to fetch sample 1793519. Exception: cannot identify image file <_io.BytesIO object>
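The learning rates logged at steps 6, 7, and 8 grow by a constant increment of about 9.72e-09 per step, and step 6's value is exactly six such increments, so the scheduler is in a linear warmup ramp from zero at this point. A quick check of that arithmetic on the logged values:

```python
# learning_rate values copied from the three logged steps above
lrs = {
    6: 5.830903790087464e-08,
    7: 6.802721088435375e-08,
    8: 7.774538386783285e-08,
}

per_step = lrs[7] - lrs[6]                   # constant warmup increment
assert abs((lrs[8] - lrs[7]) - per_step) < 1e-15
assert abs(lrs[6] - 6 * per_step) < 1e-15    # ramp extrapolates to 0 at step 0
```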
Exception: cannot identify image file <_io.BytesIO object at 0x7f3d146e1a80>cannot identify image file <_io.BytesIO object at 0x7f0fcde3aca0> Failed to fetch sample 4184087. Exception: cannot identify image file <_io.BytesIO object at 0x7f357b974680> Failed to fetch sample 1840224. Exception: cannot identify image file <_io.BytesIO object at 0x7f7d7ddbcf40> Failed to fetch sample 4021883. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937f1cb0> Failed to fetch sample 2464919. Exception: cannot identify image file <_io.BytesIO object at 0x7f4bfd74e2f0> Failed to fetch sample 1830453. Exception: cannot identify image file <_io.BytesIO object at 0x7f597e45c540> Failed to fetch sample 1030035. Exception: cannot identify image file <_io.BytesIO object at 0x7f10937ee340> Failed to fetch sample 3147441. Exception: cannot identify image file <_io.BytesIO object at 0x7fdae97fd5d0> Failed to fetch sample 3240425. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3881990> Failed to fetch sample 2591530. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939cdc450> Failed to fetch sample 1030035. Exception: cannot identify image file <_io.BytesIO object at 0x7f8056d45530> Failed to fetch sample 1762324. Exception: cannot identify image file <_io.BytesIO object at 0x7f65560c77e0> Failed to fetch sample 3730479. Exception: cannot identify image file <_io.BytesIO object at 0x7fdfa9b56200> Failed to fetch sample 3448050. Exception: cannot identify image file <_io.BytesIO object at 0x7fcb2d149760> le <_io.BytesIO object at 0x7f2a918eb7e0> cannot identify image file <_io.BytesIO object at 0x7efcdbb466b0> Failed to fetch sample 3520194. Exception: cannot identify image file <_io.BytesIO object at 0x7fee3f9eaf70> cannot identify image file <_io.BytesIO object at 0x7f53b78a5df0> Failed to fetch sample 2851854. Exception:Failed to fetch sample 2851854. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f58c39f8680> Failed to fetch sample 2851854. Exception: cannot identify image file <_io.BytesIO object at 0x7f53a8769760> Failed to fetch sample 1025825. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d5ad0> Failed to fetch sample 1454159. Exception: cannot identify image file <_io.BytesIO object at 0x7fc7f36db740> Failed to fetch sample 3010141. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d5800> Failed to fetch sample 4099207. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7b3e70> Failed to fetch sample 4150751. Exception: cannot identify image file <_io.BytesIO object at 0x7fca1c1c19e0> /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Failed to fetch sample 2986728. Exception: cannot identify image file <_io.BytesIO object at 0x7fcbdc1efc40> Failed to fetch sample 1567091. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6495b70> Failed to fetch sample 3634589. Exception: cannot identify image file <_io.BytesIO object at 0x7f18e6506200> Failed to fetch sample 1859771. Exception: cannot identify image file <_io.BytesIO object at 0x7f5e6acec770>Failed to fetch sample 2816631. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f629f30> le <_io.BytesIO object at 0x7f7b9b9a7fb0> Failed to fetch sample 1541061. Exception: cannot identify image file <_io.BytesIO object at 0x7f5eb43fad90> Failed to fetch sample 3240425. Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f6c98f0> Failed to fetch sample 2040873. Exception: cannot identify image file <_io.BytesIO object at 0x7fb258c19df0> Failed to fetch sample 2851854. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fb52f62b060> cannot identify image file <_io.BytesIO object at 0x7fc19513dfd0> Failed to fetch sample 2966809. Exception: cannot identify image file <_io.BytesIO object at 0x7f1f1b273470> Failed to fetch sample 3332418. Exception: cannot identify image file <_io.BytesIO object at 0x7fe1aecc2570> Failed to fetch sample 3438374. Exception: cannot identify image file <_io.BytesIO object at 0x7fb687f97c40> Failed to fetch sample 3634589. Exception:Failed to fetch sample 4178599. Exception:bject at 0x7fb687fb1170> cannot identify image file <_io.BytesIO object at 0x7f4beec83ab0> Failed to fetch sample 1283090. Exception: cannot identify image file <_io.BytesIO object at 0x7fd2003a98f0> Failed to fetch sample 1788000. Exception: cannot identify image file <_io.BytesIO object at 0x7f5a16f8b7e0> Failed to fetch sample 2289442. Exception: cannot identify image file <_io.BytesIO object at 0x7f9ad4587600> cannot identify image file <_io.BytesIO object at 0x7f5de2581260> Failed to fetch sample 1762324. Exception: cannot identify image file <_io.BytesIO object at 0x7f99a39421b0> Failed to fetch sample 1299560. Exception: cannot identify image file <_io.BytesIO object at 0x7f56681a9800> Failed to fetch sample 1679572. Exception: cannot identify image file <_io.BytesIO object at 0x7f1137ff5210> cannot identify image file <_io.BytesIO object at 0x7f5253a67ab0> Failed to fetch sample 1893080. Exception: cannot identify image file <_io.BytesIO object at 0x7f0c35c29580> cannot identify image file <_io.BytesIO object at 0x7f5de3746020> Failed to fetch sample 1356530. Exception: cannot identify image file <_io.BytesIO object at 0x7f53ddf4a4d0> Failed to fetch sample 1968975. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f35a7ffe8e0> cannot identify image file <_io.BytesIO object at 0x7fdcaf81ea70> le <_io.BytesIO object at 0x7f56bdeed800> cannot identify image file <_io.BytesIO object at 0x7f655601fce0> cannot identify image file <_io.BytesIO object at 0x7ef9f3001440> Failed to fetch sample 1541061. Exception: cannot identify image file <_io.BytesIO obFailed to fetch sample 2Failed to fetch samFailed to fetch sample 2698844. Exception:de0> cannot identify image file <_io.BytesIO object at 0x7f6a7f272c00> ailed to fetch sample 1613924. Exception: cannot identify image file <_io.BytesIO object at 0x7f58c3881490> Failed to fetch sample 1930755. Exception: ject at 0x7efbf19439c0> cannot identify image file <_io.BytesIO object at 0x7f83e832a160> ile <_io.BytesIO object at 0x7f5a239ad260> Failed to fetch sample 3113541. Exception:ject at 0x7f58c38832e0> cannot identify image file <_io.BytesIO object at 0x7f83e832aac0> Failed to fetch sample 1018789. Exception: cannot identify image file <_io.BytesIO object at 0x7fa275cba7a0> Failed to fetch sample 2889444. Exception: cannot identify image file <_io.BytesIO object at 0x7f83e8346200> Failed to fetch sample 4125529. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a5a8bd0> Failed to fetch sample 1030035. Exception: cannot identify image file <_io.BytesIO object at 0x7fd07a570ea0> Failed to fetch sample 3665953. Exception: cannot identify image file <_io.BytesIO object at 0x7efe2a569760> cannot identify image file <_io.BytesIO object at 0x7fee3f9fb290> Failed to fetch sample 2786440. Exception: cannot identify image file <_io.BytesIO object at 0x7fedb5ec0680> cannot identify image file <_io.BytesIO object at 0x7ff27c2ffec0> Failed to fetch sample 2698844. Exception:Failed to fetch sample 1364672. Exception: ject at 0x7fee3f9f95d0> Failed to fetch sample 3897453. Exception:bject at 0x7efb3d062200> Failed to fetch sample 1662300. 
Exception: cannot identify image f cannot identify image file <_io.BytesIO obFailed to fetch sample 1Failed to fetch sample 1762324. Exception:ile <_io.BytesIO object at 0x7efa85731440> Failed to fetch sample 1541061. Exception: cannot identify image file <_io.BytesIO object at 0x7f6e55f93790> Failed to fetch sample 4214508. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d79c0> Failed to fetch sample 2698844. Exception: cannot identify image file <_io.BytesIO object at 0x7f7cebf7ff60> Failed to fetch sample 4150751. Exception: cannot identify image file <_io.BytesIO object at 0x7efd6f7d5490> cannot identify image file <_io.BytesIO object at 0x7f3939cab240> Failed to fetch sample 3504244. Exception: cannot identify image file <_io.BytesIO object at 0x7f0986454d60> cannot identify image file <_io.BytesIO object at 0x7f0828187830> Failed to fetch sample 3596808. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939c80310> Failed to fetch sample 2828197. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a7060> Failed to fetch sample 3374089. Exception: cannot identify image file <_io.BytesIO object at 0x7f3939ca8fe0> Failed to fetch sample 1633328. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a53f0> Failed to fetch sample 2746130. Exception: cannot identify image file <_io.BytesIO object at 0x7f3d146e1670> Failed to fetch sample 2850860. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281a4400> Failed to fetch sample 4059671. Exception:ject at 0x7fc0f090f920> cannot identify image file <_io.BytesIO object at 0x7f08281a6cf0> Failed to fetch sample 1066194. Exception: cannot identify image file <_io.BytesIO object at 0x7f08281871a0> Failed to fetch sample 4142811. 
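The repeated "cannot identify image file" exceptions above are what Pillow raises when `Image.open` is handed bytes it cannot decode (truncated downloads, non-image payloads). A minimal sketch of a pre-flight check a dataloader could use to skip such samples — the function names and the random-resample fallback are illustrative, not taken from the training code in this log:

```python
import io
import random
from PIL import Image, UnidentifiedImageError


def is_decodable_image(data: bytes) -> bool:
    """Return True if Pillow can identify and verify the image bytes."""
    try:
        with Image.open(io.BytesIO(data)) as img:
            img.verify()  # cheap integrity check, does not fully decode
        return True
    except (UnidentifiedImageError, OSError):
        # UnidentifiedImageError is the "cannot identify image file" case;
        # OSError covers truncated files caught by verify().
        return False


def fetch_with_fallback(dataset, idx, max_retries=5):
    """Illustrative retry: on a corrupt sample, fall back to a random index."""
    for _ in range(max_retries):
        raw = dataset[idx]  # assumed to yield raw image bytes
        if is_decodable_image(raw):
            return raw
        print(f"Failed to fetch sample {idx}; resampling")
        idx = random.randrange(len(dataset))
    raise RuntimeError("too many corrupt samples in a row")
```

Validating the bytes up front (or at dataset-build time) turns these per-step failures into a one-time filtering pass.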
  0%|          | 9/34278 [02:36<72:26:51,  7.61s/it] {'loss': 3.2271, 'grad_norm': 56.1718385637286, 'learning_rate': 8.746355685131196e-08, 'epoch': 0.0}
Failed to fetch sample 2178667. Exception: cannot identify image file <_io.BytesIO object at 0x7fdcaf894900>
[the same exception repeats for many more samples, interleaved across workers]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[the UserWarning is again emitted once per rank; "cannot identify image file" failures continue]
Failed to fetch sample 2979749.
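The `checkpoint.py:87` UserWarning repeated in this log fires when every tensor entering a checkpointed segment has `requires_grad=False` (common when the module feeding it, e.g. a frozen embedding or vision tower, is not being trained): under reentrant checkpointing the segment then contributes no gradients. A minimal reproduction with a toy linear layer, plus the usual workaround of forcing the segment's input to require grad — illustrative only, not the actual modules of this training run:

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

layer = nn.Linear(8, 8)


def run(x):
    # Reentrant checkpointing, the code path that produces the warning above.
    return checkpoint(layer, x, use_reentrant=True)


# Input that does not require grad: triggers "None of the inputs have
# requires_grad=True", and the checkpointed output itself carries no grad.
frozen = torch.randn(2, 8)
out_frozen = run(frozen)  # emits the UserWarning; out_frozen.requires_grad is False

# Workaround: make the segment's input require grad (this is the effect of
# HF's `model.enable_input_require_grads()` on the embedding output).
live = torch.randn(2, 8).requires_grad_()
out_live = run(live)
out_live.sum().backward()  # layer.weight.grad is now populated
```

If the warning appears while parameters you expect to train stay at `grad=None`, this input-side fix (or switching to `use_reentrant=False`) is the usual remedy.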
Exception: cannot identify image file <_io.BytesIO object at 0x7fe6cc998860>
W0307 22:25:30.075000 3654 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793]
W0307 22:25:30.075000 3654 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
W0307 22:25:30.075000 3654 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
W0307 22:25:30.075000 3654 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
[the same OMP_NUM_THREADS banner is emitted again by process 36665 at 22:25:30.084]
[2025-03-07 22:25:41,600] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:25:44,250] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:25:44,943] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[the "Setting ds_accelerator to cuda (auto detect)" line then repeats once per rank, timestamps 22:25:53,470 through 22:25:54,316]
[2025-03-07 22:25:54,316] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto
detect) [2025-03-07 22:25:54,316] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:25:54,316] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:25:55,387] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,387] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,387] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,387] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,388] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,388] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,388] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,808] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,808] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,808] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,808] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,808] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,808] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,809] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,815] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,815] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,816] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,817] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,817] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,817] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,817] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,818] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,831] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:25:55,831] [INFO] [comm.py:652:init_distributed] cdb=None 
[2025-03-07 22:25:55,952] [INFO] [comm.py:683:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
[2025-03-07 22:25:56,067] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
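The `world_size = 64` that DeepSpeed reports comes from the distributed environment variables that torchrun exports to every worker process. As a minimal sketch (the helper name is mine, not part of the training script), those variables can be read like this:

```python
import os

def distributed_env():
    # torchrun sets WORLD_SIZE, RANK, and LOCAL_RANK for each spawned worker;
    # the defaults make this degrade gracefully to a single-process run.
    world_size = int(os.environ.get("WORLD_SIZE", "1"))
    rank = int(os.environ.get("RANK", "0"))
    local_rank = int(os.environ.get("LOCAL_RANK", "0"))
    return world_size, rank, local_rank
```

In this run, with 64 GPUs participating across the cluster, `WORLD_SIZE` is 64 on every rank, while `RANK` and `LOCAL_RANK` differ per process.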
`Qwen2VLRotaryEmbedding` can now be fully parameterized by passing the model config through the `config` argument.
All other arguments will be removed in v4.46
[2025-03-07 22:26:00,438] [INFO] [partition_parameters.py:348:__exit__] finished initializing model - num_params = 730, num_elems = 8.29B
Loading checkpoint shards:   0%|          | 0/5 [00:00<?, ?it/s]
Training args: TrainingArguments(
_n_gpu=1,
accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False},
adafactor=False,
adam_beta1=0.9,
adam_beta2=0.999,
adam_epsilon=1e-08,
attn_implementation=flash_attention_2,
auto_find_batch_size=False,
average_tokens_across_devices=False,
batch_eval_metrics=False,
bf16=True,
bf16_full_eval=False,
cache_dir=None,
data_seed=None,
dataloader_drop_last=False,
dataloader_num_workers=8,
dataloader_persistent_workers=False,
dataloader_pin_memory=True,
dataloader_prefetch_factor=None,
ddp_backend=None,
ddp_broadcast_buffers=None,
ddp_bucket_cap_mb=None,
ddp_find_unused_parameters=None,
ddp_timeout=1800,
debug=[],
deepspeed=./scripts/zero3.json,
disable_tqdm=False,
dispatch_batches=None,
do_eval=False,
do_predict=False,
do_train=False,
eval_accumulation_steps=None,
eval_delay=0,
eval_do_concat_batches=True,
eval_on_start=False,
eval_steps=None,
eval_strategy=no,
eval_use_gather_object=False,
evaluation_strategy=None,
fp16=False,
fp16_backend=auto,
fp16_full_eval=False,
fp16_opt_level=O1,
freeze_visual_encoder=True,
fsdp=[],
fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False},
fsdp_min_num_params=0,
fsdp_transformer_layer_cls_to_wrap=None,
full_determinism=False,
gradient_accumulation_steps=1,
gradient_checkpointing=True,
gradient_checkpointing_kwargs=None,
greater_is_better=None,
group_by_length=False,
group_by_modality_length=True,
half_precision_backend=auto,
hub_always_push=False,
hub_model_id=None,
hub_private_repo=None,
hub_strategy=every_save,
hub_token=,
ignore_data_skip=False,
include_for_metrics=[],
include_inputs_for_metrics=False,
include_num_input_tokens_seen=False,
include_tokens_per_second=False,
jit_mode_eval=False,
label_names=None,
label_smoothing_factor=0.0,
learning_rate=1e-05,
length_column_name=length,
load_best_model_at_end=False,
local_rank=0,
log_level=passive,
log_level_replica=warning,
log_on_each_node=True,
logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-56_HOST-10-140-60-101,
logging_first_step=False,
logging_nan_inf_filter=True,
logging_steps=1.0,
logging_strategy=steps,
lr_scheduler_kwargs={},
lr_scheduler_type=cosine,
max_grad_norm=1.0,
max_steps=-1,
metric_for_best_model=None,
model_max_length=8192,
mp_parameters=,
neftune_noise_alpha=None,
no_cuda=False,
num_train_epochs=1.0,
optim=adamw_torch,
optim_args=None,
optim_target_modules=None,
output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
overwrite_output_dir=False,
past_index=-1,
per_device_eval_batch_size=4,
per_device_train_batch_size=2,
prediction_loss_only=False,
push_to_hub=False,
push_to_hub_model_id=None,
push_to_hub_organization=None,
push_to_hub_token=,
ray_scope=last,
remove_unused_columns=True,
report_to=[],
restore_callback_states_from_checkpoint=False,
resume_from_checkpoint=None,
run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
save_on_each_node=False,
save_only_model=False,
save_safetensors=True,
save_steps=2000,
save_strategy=steps,
save_total_limit=3,
seed=42,
skip_memory_metrics=True,
split_batches=None,
tf32=True,
torch_compile=False,
torch_compile_backend=None,
torch_compile_mode=None,
torch_empty_cache_steps=None,
torchdynamo=None,
tpu_metrics_debug=False,
tpu_num_cores=None,
use_cpu=False,
use_ipex=False,
use_legacy_prediction_loop=False,
use_liger_kernel=False,
use_mps_device=False,
verbose_logging=False,
warmup_ratio=0.03,
warmup_steps=0,
weight_decay=0.0,
)
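For quick reference, the key hyperparameters from the TrainingArguments dump can be collected into a plain dict. This is a readability sketch, not code from the repo; the field names and values are taken verbatim from the dump, while `NUM_GPUS = 64` is an assumption inferred from the `aguvis_torchrun_64gpus_9` run name.

```python
# Key hyperparameters as printed in the TrainingArguments dump above.
train_config = {
    "learning_rate": 1e-5,
    "lr_scheduler_type": "cosine",
    "warmup_ratio": 0.03,
    "weight_decay": 0.0,
    "num_train_epochs": 1.0,
    "per_device_train_batch_size": 2,
    "gradient_accumulation_steps": 1,
    "bf16": True,
    "tf32": True,
    "gradient_checkpointing": True,
    "model_max_length": 8192,
    "attn_implementation": "flash_attention_2",
    "deepspeed": "./scripts/zero3.json",
    "save_steps": 2000,
    "save_total_limit": 3,
    "seed": 42,
}

# Effective global batch size, assuming the 64 GPUs implied by the run name.
NUM_GPUS = 64
global_batch = (
    train_config["per_device_train_batch_size"]
    * train_config["gradient_accumulation_steps"]
    * NUM_GPUS
)
print(global_batch)  # 2 * 1 * 64 = 128
```

Note that `freeze_visual_encoder`, `group_by_modality_length`, and `verbose_logging` in the dump are custom fields added by this training script, not part of stock `transformers.TrainingArguments`.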
adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=6, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-55_HOST-10-140-60-96, logging_first_step=False, logging_nan_inf_filter=True, 
logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, )Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, 
dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_traTraining args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, 
ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=7, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-55_HOST-10-140-60-95, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, )Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 
'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=4, log_level=passive, log_level_replica=warning, log_on_each_node=True, 
logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-44_HOST-10-140-60-95, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, 
cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_tn=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=3, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-55_HOST-10-140-60-56, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, 
tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], 
include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=2, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-55_HOST-10-140-60-56, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, )Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': 
True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=4, log_level=passive, log_level_replica=warning, log_on_each_node=True, 
logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-55_HOST-10-140-60-56, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_lorain=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=2, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-55_HOST-10-140-60-95, logging_first_step=False, logging_nan_inf_filter=True, 
logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, 
dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=3, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-55_HOST-10-140-60-95, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, 
overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, 
eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_see_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=2, log_level=passive, log_level_replica=warning, log_on_each_node=True, 
logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-25-56_HOST-10-140-60-104, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, )
[TCSLoader] config_path: ~/petreloss.conf
--> before Client(conf_path)
--> after Client(conf_path)
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/guienv.json with all sampling strategy
Rank 0: Loaded 327972 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/guienv.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/omniact_fix.json with all sampling strategy
Rank 0: Loaded 6720 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/omniact_fix.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricoig16k.json with all sampling strategy
Rank 0: Loaded 16133 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricoig16k.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricosca.json with all sampling strategy
Rank 0: Loaded 173212 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricosca.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/seeclick.json with all sampling strategy
Rank 0: Loaded 271121 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/seeclick.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ui_refexp.json with all sampling strategy
Rank 0: Loaded 15624 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ui_refexp.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/webui350k.json with all sampling strategy
Rank 0: Loaded 57389 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/webui350k.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/widget_captioning.json with all sampling strategy
Rank 0: Loaded 101426 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/widget_captioning.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_150k_description_filtered.json with all sampling strategy
Rank 0: Loaded 148416 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_150k_description_filtered.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_258k_function_filtered.json with all sampling strategy
Rank 0: Loaded 248264 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_258k_function_filtered.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_hybrid_773k_max_25qa_filtered_new.json with all sampling strategy
Rank 0: Loaded 1243857 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_hybrid_773k_max_25qa_filtered_new.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_windows_splited.json with all sampling strategy
Rank 0: Loaded 1075344 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_windows_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_linux_splited.json with all sampling strategy
Rank 0: Loaded 43156 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_linux_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_macos_splited.json with all sampling strategy
Rank 0: Loaded 18399 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_macos_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/uibert_train_ground_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 4646 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/uibert_train_ground_d240430_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_taperception_grounding_d240815_v2.jsonl with all sampling strategy
Rank 0: Loaded 2500 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_taperception_grounding_d240815_v2.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_widget_grounding_d240815_v2.jsonl with all sampling strategy
Rank 0: Loaded 14878 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_widget_grounding_d240815_v2.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_mug_grounding_d240812.jsonl with all sampling strategy
Rank 0: Loaded 26090 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_mug_grounding_d240812.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_phone_2403_ground_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 24798 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_phone_2403_ground_d240430_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2405_ground_d240521_v1.jsonl with all sampling strategy
Rank 0: Loaded 5008 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2405_ground_d240521_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2406_ground_d240612_v1.jsonl with all sampling strategy
Rank 0: Loaded 7903 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2406_ground_d240612_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/amex_grounding_d240813_v1.jsonl with all sampling strategy
Rank 0: Loaded 102007 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/amex_grounding_d240813_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_1_d240815_v3.jsonl with all sampling strategy
Rank 0: Loaded 63581 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_1_d240815_v3.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_2_d240815_v3.jsonl with all sampling strategy
Rank 0: Loaded 6852 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_2_d240815_v3.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_schedual_extract_20240520_v2_r464_reprompt_d240607.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 928 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_schedual_extract_20240520_v2_r464_reprompt_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_app_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 2488 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_app_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_os_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 1242 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_os_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_web_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 2360 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_web_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_element_recognition_d240605_v1_correct_d240607.jsonl with all sampling strategy
Rank 0: Loaded 3791 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_element_recognition_d240605_v1_correct_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_marker_recognition_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5179 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_marker_recognition_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_ocr_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5090 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_ocr_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_operation_oral_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5070 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_operation_oral_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_visual_prompt_with_bbox_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5248 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_visual_prompt_with_bbox_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2408_region_caption_d240903_v1.jsonl with all sampling strategy
Rank 0: Loaded 5854 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2408_region_caption_d240903_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_detection_d20240418_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_detection_d20240418_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_element_recognition_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_element_recognition_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_ground_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_ground_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_detection_d20240418_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from
langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_detection_d20240418_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_element_recognition_d20240416_v1.jsonl with all sampling strategy Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_element_recognition_d20240416_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_ground_d20240416_v1.jsonl with all sampling strategy Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_ground_d20240416_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_element_recognition_d240605_v1_correct_d240607.jsonl with all sampling strategy Rank 0: Loaded 24620 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_element_recognition_d240605_v1_correct_d240607.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d20240604_v2.jsonl with all sampling strategy Rank 0: Loaded 17196 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d20240604_v2.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d240430_v1.jsonl with all sampling strategy Rank 0: Loaded 5998 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d240430_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ocr_d240430_v1.jsonl with all sampling strategy Rank 0: Loaded 31276 samples from 
langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ocr_d240430_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_short_d240430_v1.jsonl with all sampling strategy Rank 0: Loaded 27880 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_short_d240430_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_with_bbox_d240430_v1.jsonl with all sampling strategy Rank 0: Loaded 62401 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_with_bbox_d240430_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screenai_layout_20240604_v1.jsonl with all sampling strategy Rank 0: Loaded 22076 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screenai_layout_20240604_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_grounding_d240924_v1.jsonl with all sampling strategy Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_grounding_d240924_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_oral_operation_d240924_v1.jsonl with all sampling strategy Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_oral_operation_d240924_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_region_caption_d240924_v1.jsonl with all sampling strategy Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_region_caption_d240924_v1.jsonl Rank 0: Loading 
langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ui_operation_oral_wbox_d241023_v2.jsonl with all sampling strategy Rank 0: Loaded 20293 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ui_operation_oral_wbox_d241023_v2.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_comment_20240606_json_d20241023_v2.jsonl with all sampling strategy Rank 0: Loaded 1055 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_comment_20240606_json_d20241023_v2.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_json_d241126.jsonl with repeat:3 sampling strategy Rank 0: Loaded 6837 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_json_d241126.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_xml_d241126.jsonl with repeat:3 sampling strategy Rank 0: Loaded 6873 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_xml_d241126.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/OS_Altas_androidworld_grounding_d241120_v1.jsonl with all sampling strategy Rank 0: Loaded 89860 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/OS_Altas_androidworld_grounding_d241120_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_long_caption_20240604_v1.jsonl with repeat:4 sampling strategy Rank 0: Loaded 3156 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_long_caption_20240604_v1.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/grounding_new.jsonl with all sampling strategy Rank 0: Loaded 863 samples from 
langchao2:s3://gui/new_annotations/st_data/20250222/annotations/grounding_new.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/regioncaption_new.jsonl with all sampling strategy Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/regioncaption_new.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/oral_operation_new.jsonl with all sampling strategy Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/oral_operation_new.jsonl Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240812_grounding_dataset_20240812_v1_r6600.jsonl with all sampling strategy Rank 0: Loaded 6600 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240812_grounding_dataset_20240812_v1_r6600.jsonl Rank 0: Loaded 4387471 samples from data/stage1_20250307_1.yaml Rank 0: Formatting inputs...Skip in lazy mode Detected kernel version 3.10.0, which is below the recommended minimum of 5.5.0; this can cause the process to hang. It is recommended to upgrade the kernel to the minimum version or higher. 
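Each annotation file in the log above is loaded "with all sampling strategy" or "with repeat:N sampling strategy". As a rough sketch of what such a spec could mean (assumed semantics, not the actual training code; `apply_sampling_strategy` is a hypothetical helper): `all` keeps the sample list as-is, while `repeat:N` duplicates it N times to upweight small datasets in the mixture.

```python
def apply_sampling_strategy(samples: list, strategy: str) -> list:
    """Expand a dataset's sample list according to a sampling-strategy spec.

    Assumed semantics (hypothetical, inferred from the log format):
      - "all"       -> use every sample once
      - "repeat:N"  -> use every sample N times
    """
    if strategy == "all":
        return list(samples)
    if strategy.startswith("repeat:"):
        n = int(strategy.split(":", 1)[1])
        return list(samples) * n  # upweight a small dataset in the mixture
    raise ValueError(f"unknown sampling strategy: {strategy}")
```

Under this reading, a file logged as "Loaded 928 samples ... with repeat:2 sampling strategy" would contribute 1856 entries to the concatenated mixture, which is one plausible way the 4387471-sample total could exceed the sum of raw file sizes.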
Parameter Offload: Total persistent parameters: 877056 in 401 params
  0%|          | 0/34278 [00:00
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ac9c62a0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f90715c64d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb79b7e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() 
-> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb6109a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f682f919da0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff4b4de82c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff4b7078f40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb6213a0> Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f666967be20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff4b4de82c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff0eb9bba60> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff0eb9bad40> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff4b661bdd0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff0e167f650> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb6235b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb6233d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb6219e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff0eb9bab60> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ae6a5580> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f66f3943ce0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f90715f4950> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff5609b3420> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff371891b20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb613e70> Traceback (most recent call last): ll last): [rank41]: File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank41]:     train()
[rank41]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank41]:     trainer.train()
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank41]:     return inner_training_loop(
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank41]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank41]:     batch_samples += [next(epoch_iterator)]
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank41]:     current_batch = next(dataloader_iter)
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank41]:     data = self._next_data()
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank41]:     return self._process_data(data)
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank41]:     data.reraise()
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank41]:     raise exception
[rank41]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank41]:
[rank41]: Original Traceback (most recent call last):
[rank41]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank41]:     sample = self._get_item(i)
[rank41]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank41]:     data_dict = self.preprocess_qwen2vl_v3(
[rank41]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank41]:     return self.preprocess_qwen2vl(
[rank41]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank41]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank41]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank41]:     image = tcs_loader(image)
[rank41]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank41]:     img = pil_loader(img_value_str)
[rank41]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank41]:     img = Image.open(buff)
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank41]:     raise UnidentifiedImageError(msg)
[rank41]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff5609b3420>
[rank41]:
[rank41]: During handling of the above exception, another exception occurred:
[rank41]:
[rank41]: Traceback (most recent call last):
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank41]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank41]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank41]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank41]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank41]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank41]:     return self.dispatch_line(frame)
[rank41]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank41]:     if self.quitting: raise BdbQuit
[rank41]: bdb.BdbQuit
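The root failure above is `PIL.UnidentifiedImageError` raised by `Image.open` inside `pil_loader` (dataset.py line 40): the bytes fetched from Ceph are not a decodable image (empty object, truncated transfer, or an error payload returned by the storage gateway). A minimal defensive loader along these lines can fail fast with a more useful message; this is a sketch, not the project's actual `pil_loader`, and the empty-payload check and byte-prefix hint are assumptions about what would help diagnose this case:

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader(img_bytes: bytes) -> Image.Image:
    # Fail fast on empty payloads (e.g. a failed Ceph fetch returning b"").
    if not img_bytes:
        raise ValueError("empty image payload")
    buff = io.BytesIO(img_bytes)
    try:
        img = Image.open(buff)
        img.load()  # force a full decode so truncated files fail here, not later
    except UnidentifiedImageError as e:
        # Show what the payload actually starts with; object-store gateways
        # sometimes return an XML/JSON error body instead of image bytes.
        raise UnidentifiedImageError(
            f"undecodable image payload, first bytes: {img_bytes[:16]!r}"
        ) from e
    return img.convert("RGB")
```

Logging the first bytes of the payload usually distinguishes a truncated image from a storage-side error response at a glance.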
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfTraceback (most recent call last): qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faabc5615d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb6233d0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f930687b600> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8d8cb2af20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc084aaf20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff371893ab0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb648f90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb6486d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff371893920> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb64b420> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff4b70e20c0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb649530> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f63dc6e75b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f680d2d5300> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ad370770> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb64b240> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f66f394fba0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb7b2070> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)
(Pdb)
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff3fb648b30>
[rank40]: Traceback (most recent call last):
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank40]:     train()
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank40]:     trainer.train()
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank40]:     return inner_training_loop(
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank40]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank40]:     batch_samples += [next(epoch_iterator)]
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank40]:     current_batch = next(dataloader_iter)
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank40]:     data = self._next_data()
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank40]:     return self._process_data(data)
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank40]:     data.reraise()
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank40]:     raise exception
[rank40]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank40]: Original Traceback (most recent call last):
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank40]:     sample = self._get_item(i)
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank40]:     data_dict = self.preprocess_qwen2vl_v3(
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank40]:     return self.preprocess_qwen2vl(
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank40]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank40]:     image = tcs_loader(image)
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank40]:     img = pil_loader(img_value_str)
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank40]:     img = Image.open(buff)
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank40]:     raise UnidentifiedImageError(msg)
[rank40]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ad370770>
[rank40]: During handling of the above exception, another exception occurred:
[rank40]: Traceback (most recent call last):
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank40]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank40]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank40]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank40]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank40]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank40]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank40]:     return self.dispatch_line(frame)
[rank40]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank40]:     if self.quitting: raise BdbQuit
[rank40]: bdb.BdbQuit
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ae6f76a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f666967b920> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f66f394fce0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff0e167fb00> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9307622390> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f63dc6e7b00> (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image 
file <_io.BytesIO object at 0x7f66f3964130> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9306879c10> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = 
self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff0eb9baa70> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f63dc6e7c90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f66f3964900> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9080005da0> Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ae6a49a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f66f3964ae0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff0eb9bba60>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9080006b60>
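The traceback above shows `Image.open` failing on bytes fetched from Ceph, i.e. the stored object is corrupt, truncated, or not an image at all. A minimal sketch of a defensive decoder, under the assumption that the loader receives raw bytes; `safe_pil_loader` is illustrative, not the repo's actual `pil_loader`:

```python
import io

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes):
    """Decode raw bytes into a PIL image; return None instead of raising on bad data."""
    try:
        buff = io.BytesIO(img_bytes)
        img = Image.open(buff)
        img.load()  # force a full decode so truncated files fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        # covers non-image bytes as well as truncated/corrupt image files
        return None
```

A caller can then skip or resample the index when `safe_pil_loader` returns `None`, rather than letting the exception propagate out of a DataLoader worker.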
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639cfabb0> [rank42]: Traceback (most recent call last): [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank42]: train() [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank42]: trainer.train() [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank42]: return inner_training_loop( [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank42]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank42]: batch_samples += [next(epoch_iterator)] [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank42]: current_batch = next(dataloader_iter) [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank42]: data = self._next_data() [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank42]: return self._process_data(data) 
[rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank42]: data.reraise() [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank42]: raise exception [rank42]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank42]: Original Traceback (most recent call last): [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank42]: sample = self._get_item(i) [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank42]: data_dict = self.preprocess_qwen2vl_v3( [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank42]: return self.preprocess_qwen2vl( [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank42]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank42]: image = tcs_loader(image) [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank42]: img = pil_loader(img_value_str) [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank42]: img = Image.open(buff) [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank42]: raise UnidentifiedImageError(msg) [rank42]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9306acc8b0> [rank42]: During handling of the 
above exception, another exception occurred: [rank42]: Traceback (most recent call last): [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank42]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank42]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank42]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank42]: print(f"Failed to fetch sample {i}. Exception:", e) [rank42]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank42]: print(f"Failed to fetch sample {i}. Exception:", e) [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank42]: return self.dispatch_line(frame) [rank42]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank42]: if self.quitting: raise BdbQuit [rank42]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f29fa36d940> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639e88270> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) iuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9306879670> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639e86110> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2d0b673600> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f69526757b0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639d07380> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa63ed33b00> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f680fa8ee30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f680f0862a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f680d2d71a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa8cfcd6f70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f919ad0fd80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f69503ddbc0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f90715c3a60> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9306879940> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f90715c3790> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) [rank43]: Traceback (most recent call last): [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank43]: train() [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank43]: trainer.train() [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank43]: return inner_training_loop( [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank43]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank43]: batch_samples += [next(epoch_iterator)] [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank43]: current_batch = next(dataloader_iter) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank43]: data = self._next_data() [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank43]: return self._process_data(data) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank43]: data.reraise() [rank43]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank43]: raise exception [rank43]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank43]: Original Traceback (most recent call last): [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank43]: sample = self._get_item(i) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank43]: data_dict = self.preprocess_qwen2vl_v3( [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank43]: return self.preprocess_qwen2vl( [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank43]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank43]: image = tcs_loader(image) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank43]: img = pil_loader(img_value_str) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank43]: img = Image.open(buff) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank43]: raise UnidentifiedImageError(msg) [rank43]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f69503ddbc0> [rank43]: During handling of the above exception, another exception occurred: [rank43]: Traceback (most recent call last): [rank43]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank43]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank43]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank43]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank43]: print(f"Failed to fetch sample {i}. Exception:", e) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank43]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank43]: return self.dispatch_line(frame) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank43]: if self.quitting: raise BdbQuit [rank43]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9306878b80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f90715c3f60> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f69503cbe70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f93068798a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9306acc8b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f930687ae30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f930687b5b0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f680d2d72e0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader 
img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc083993f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8b791bc360> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdac5fc34c0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 
64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f680d2d7650> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) 
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd84246fe70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8e504f6a70> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in 
preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd84246f560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f680d2d7330> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f680dc9ed40> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8d8c1cfba0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f680f0862a0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faa14278860> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) haoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc08313ab0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8d0e561490> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3ae10c7a60> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f2c807335b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2dc66d8fe0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa763c3eca0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faa1466ecf0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639d18720> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faa1466ecf0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa63ed32bb0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639d07ab0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639d1a980> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639d1a660> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639d18b30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdd167d2f70> Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa639d1bc90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open 
raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdd167cfce0> .BytesIO object at 0x7fa639d1ad90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa8cf30b9c0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
[The identical UnidentifiedImageError traceback and pdb prompt repeat for every DataLoader worker across the distributed ranks; repeated copies omitted.]
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc0839f970> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc0839f830> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3b6b8d3290> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) Traceback (most recent call last): PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3b6b8d3bf0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2dc6678950> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc084a2520> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3b6b8d37e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2d0b669260> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3b6b8d2d40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd842a0e930> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd842a0e1b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch 
sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa950dd7ce0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2dc42ca160> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3ae10c7a10> PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2d0b703dd0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) [rank36]: Traceback (most recent ca File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8a77bab4c0> y", line 701, in __next__ [rank36]: data = self._next_data() [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank36]: return self._process_data(data) [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank36]: data.reraise() [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank36]: raise 
[rank36]:     raise exception
[rank36]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.

[rank36]: Original Traceback (most recent call last):
[rank36]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank36]:     sample = self._get_item(i)
[rank36]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank36]:     data_dict = self.preprocess_qwen2vl_v3(
[rank36]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank36]:     return self.preprocess_qwen2vl(
[rank36]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank36]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank36]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank36]:     image = tcs_loader(image)
[rank36]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank36]:     img = pil_loader(img_value_str)
[rank36]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank36]:     img = Image.open(buff)
[rank36]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank36]:     raise UnidentifiedImageError(msg)
[rank36]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2dc42ca160>

[rank36]: During handling of the above exception, another exception occurred:

[rank36]: Traceback (most recent call last):
[rank36]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank36]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank36]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank36]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank36]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank36]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank36]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank36]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank36]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank36]:     return self.dispatch_line(frame)
[rank36]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank36]:     if self.quitting: raise BdbQuit
[rank36]: bdb.BdbQuit

[rank32]: Traceback (most recent call last):
[rank32]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank32]:     train()
[rank32]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank32]:     trainer.train()
[rank32]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank32]:     return inner_training_loop(
[rank32]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank32]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank32]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank32]:     batch_samples += [next(epoch_iterator)]
[rank32]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank32]:     current_batch = next(dataloader_iter)
[rank32]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank32]:     data = self._next_data()
[rank32]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank32]:     return self._process_data(data)
[rank32]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank32]:     data.reraise()
[rank32]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank32]:     raise exception
[rank32]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.

> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)

[The same pair of tracebacks (PIL.UnidentifiedImageError raised at dataset.py line 40, then bdb.BdbQuit raised inside the except handler at line 364) repeats verbatim for rank32, rank8, rank10, and the remaining DataLoader workers; only the _io.BytesIO object addresses differ.]
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/wor File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdac60331a0> /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3c729d18f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, 
conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f29f9da6e30> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f2c80733010> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc08395300> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2d0b65e840> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2c80732f70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3ae10c5b70> [rank34]: Traceback (most recent call last): [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank34]: train() [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank34]: trainer.train() [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank34]: return inner_training_loop( [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank34]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank34]: batch_samples += [next(epoch_iterator)] [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank34]: current_batch = next(dataloader_iter) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank34]: data = self._next_data() [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank34]: return self._process_data(data) [rank34]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank34]: data.reraise() [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank34]: raise exception [rank34]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank34]: Original Traceback (most recent call last): [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank34]: sample = self._get_item(i) [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank34]: data_dict = self.preprocess_qwen2vl_v3( [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank34]: return self.preprocess_qwen2vl( [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank34]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank34]: image = tcs_loader(image) [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank34]: img = pil_loader(img_value_str) [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank34]: img = Image.open(buff) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank34]: raise UnidentifiedImageError(msg) [rank34]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3c729d18f0> [rank34]: During handling of the above exception, 
another exception occurred: [rank34]: Traceback (most recent call last): [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank34]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank34]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank34]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank34]: print(f"Failed to fetch sample {i}. Exception:", e) [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank34]: print(f"Failed to fetch sample {i}. Exception:", e) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank34]: return self.dispatch_line(frame) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank34]: if self.quitting: raise BdbQuit [rank34]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2c80732e30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f29f9da7ab0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2d0b677ec0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2d0b68af70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2d0b6889a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3b6af01350> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f78126c00e0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2565e48fe0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2289dd9940> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f892dd38d10> (Pdb) ck (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f78126c00e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f775760e520> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77576c1580> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f88266f85e0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f88dac73ec0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f9abc9ad760> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)[rank38]: Traceback (most recent call last): [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank38]: train() [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank38]: trainer.train() [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank38]: return inner_training_loop( [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank38]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank38]: batch_samples += [next(epoch_iterator)] [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank38]: current_batch = next(dataloader_iter) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank38]: data = self._next_data() [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank38]: return self._process_data(data) [rank38]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank38]: data.reraise() [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank38]: raise exception [rank38]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank38]: Original Traceback (most recent call last): [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank38]: sample = self._get_item(i) [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank38]: data_dict = self.preprocess_qwen2vl_v3( [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank38]: return self.preprocess_qwen2vl( [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank38]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank38]: image = tcs_loader(image) [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank38]: img = pil_loader(img_value_str) [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank38]: img = Image.open(buff) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank38]: raise UnidentifiedImageError(msg) [rank38]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f93e4c346d0> [rank38]: During handling of the above exception, 
another exception occurred: [rank38]: Traceback (most recent call last): [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank38]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank38]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank38]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank38]: print(f"Failed to fetch sample {i}. Exception:", e) [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank38]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank38]: return self.dispatch_line(frame) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank38]: if self.quitting: raise BdbQuit [rank38]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98707bb650> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "Traceback (most recent call last): UIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95da7d1300> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) l_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f986fa23150> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98730bac50> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95e9513a60> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4e672f44f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff15bc733d0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 
3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98f01beca0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95e9513ba0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dd612250> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98707bb650> (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) 2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f513d7e5bc0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f513c24e480> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dda63420> Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f88262212b0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f87233f0ae0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f87233f0a40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4d71fea2f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f876b7f1300> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict 
= self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f876b7e92b0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f876b7f02c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dc1214e0> y", line 701, in __next__ [rank62]: data = self._next_data() [rank62]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank62]: return self._process_data(data) [rank62]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank62]: data.reraise() [rank62]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank62]: raise exception [rank62]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank62]: Original Traceback (most recent call last): [rank62]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank62]: sample = self._get_item(i) [rank62]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank62]: data_dict = self.preprocess_qwen2vl_v3( [rank62]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank62]: return self.preprocess_qwen2vl( [rank62]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank62]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank62]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank62]: image = tcs_loader(image) [rank62]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in 
__call__ [rank62]: img = pil_loader(img_value_str) [rank62]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank62]: img = Image.open(buff) [rank62]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank62]: raise UnidentifiedImageError(msg) [rank62]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f892dd38d10> [rank62]: During handling of the above exception, another exception occurred: [rank62]: Traceback (most recent call last): [rank62]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank62]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank62]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank62]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank62]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank62]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank62]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank62]: print(f"Failed to fetch sample {i}. Exception:", e) [rank62]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank62]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank62]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank62]: return self.dispatch_line(frame) [rank62]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank62]: if self.quitting: raise BdbQuit [rank62]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8723dbbe20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item 
0x7f1a03f67ab0> > /mnt/h> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9978566a70> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f985a0c7f60> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f78101272e0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f74485dba10> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/wor> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f914d77ac00> ib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f18259ff920> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f849addb010> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f876b7f3d80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwf File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f76ce5af8d0> ib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98779bc770> iuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2537e1eb10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) s_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f182ef48860> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f88262212b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f876b7f2b60> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch 
sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) s_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f76ce5afbf0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f76cef77e70> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify 
image file <_io.BytesIO object at 0x7f1aba7622f0> mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f876b7f2fc0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98bf910e00> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocesTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f18256b4cc0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f876b7f0e50> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f746d3f2bb0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f78126c00e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7445fa3ba0> ntifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO 
object at 0x7f173bc83b50> (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspaceTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f19c56eb2e0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_ File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/l> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) ) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f93e3e541d0> Traceback (most recent call last): last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77576c56c0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ Traceback (most recent call last): /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f986fa210d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f76cef771f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9aea6aecf0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item 
data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95e9513ba0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77576c7b50> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8723dbbe20> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f45b51b8a40> .BytesIO object at 
0x7f173c226200> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) Traceback (most recent call last): rkspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, 
in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9bc0294540> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77577cb880> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9bbb77aac0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/G File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f93e2a52520> 2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9872bc3740> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) s_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f19cb5ab010> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47486fe700> o fetch sample {i}. Exception:", e) (Pdb) Tracebac> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) et_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f986fa21490> Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9aeb0762f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95da7d3420> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
Exception:", e)
(Pdb) 
[rank25]: Traceback (most recent call last):
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank25]:     train()
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank25]:     trainer.train()
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank25]:     return inner_training_loop(
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank25]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank25]:     batch_samples += [next(epoch_iterator)]
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank25]:     current_batch = next(dataloader_iter)
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank25]:     data = self._next_data()
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank25]:     return self._process_data(data)
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank25]:     data.reraise()
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank25]:     raise exception
[rank25]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank25]: Original Traceback (most recent call last):
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank25]:     sample = self._get_item(i)
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank25]:     data_dict = self.preprocess_qwen2vl_v3(
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank25]:     return self.preprocess_qwen2vl(
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank25]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank25]:     image = tcs_loader(image)
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank25]:     img = pil_loader(img_value_str)
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank25]:     img = Image.open(buff)
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank25]:     raise UnidentifiedImageError(msg)
[rank25]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f78126c00e0>
[rank25]: During handling of the above exception, another exception occurred:
[rank25]: Traceback (most recent call last):
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank25]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank25]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank25]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank25]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank25]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank25]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank25]:     return self.dispatch_line(frame)
[rank25]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank25]:     if self.quitting: raise BdbQuit
[rank25]: bdb.BdbQuit
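Not part of the log: the worker died with `BdbQuit` because the handler at dataset.py line 364 drops into pdb inside a DataLoader worker process, which has no usable stdin. A hedged sketch of the usual alternative — log the failure and fall back to another index — where the class name `RetryOnBadSample`, the `None`-as-corrupt-sample convention, and the deterministic fallback are all illustrative assumptions, not the project's actual code:

```python
class RetryOnBadSample:
    """Dataset wrapper sketch: on a decode failure, log and retry another
    index instead of entering pdb (which kills multiprocess workers)."""

    def __init__(self, samples, max_retries=8):
        self.samples = samples          # None entries stand in for corrupt images
        self.max_retries = max_retries

    def _get_item(self, i):
        value = self.samples[i]
        if value is None:
            # Stand-in for PIL.UnidentifiedImageError raised by pil_loader.
            raise ValueError(f"cannot decode sample {i}")
        return value

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = (i + 1) % len(self.samples)  # deterministic fallback index
        raise RuntimeError(f"gave up after {self.max_retries} bad samples")

    def __len__(self):
        return len(self.samples)
```

Raising after the retry budget is exhausted (rather than looping forever) keeps a systemic failure — e.g. the whole Ceph bucket unreadable — visible instead of silently starving the trainer.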
Exception:", e)sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5cef68ade0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1b3c81b650> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77576c27a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f110667e570> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77576c1da0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1aba760720> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77576c1e40> PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77576c6c50> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1b3c4d5f30> Traceback (most 
recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4137d54770> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/lTraceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7939e4dfd0> iuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = 
Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f24cf85c3b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1aba7628e0> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in 
open raise UnidentifiedImageError(msg) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95dab8c540> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f996302a1b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in 
pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77576c5d50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5e0d9f9b70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f18340e3ab0> (Pdb) wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f404d320cc0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
[The identical traceback and Pdb prompt repeat for every DataLoader worker on every rank; only the <_io.BytesIO object at 0x...> addresses differ between copies.]
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f40adec6890> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77576c6ac0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f45b51b84a0> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95da7d1490> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa480d7a980> Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98f027a9d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47486fde40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) .BytesIO object at 0x7f95da800680> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f44b830bc40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
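The traceback above shows `pil_loader` in `aguvis/dataset.py` handing `Image.open` a `BytesIO` whose contents PIL cannot identify — typically a truncated, empty, or non-image object fetched from Ceph. A minimal sketch of a defensive loader (hypothetical rewrite, not the repository's code; the name `pil_loader_safe` and the return-`None` convention are assumptions) that surfaces bad bytes without raising:

```python
import io
from PIL import Image, UnidentifiedImageError

def pil_loader_safe(img_bytes: bytes):
    """Decode image bytes; return None on failure instead of raising.

    Sketch of a fault-tolerant variant of the pil_loader seen in the
    traceback. img.load() forces a full decode so that truncated files
    fail here rather than later inside the training collator.
    """
    try:
        buff = io.BytesIO(img_bytes)
        img = Image.open(buff)
        img.load()  # eagerly decode; Image.open alone is lazy
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None
```

The caller can then skip or resample entries whose loader returns `None`, instead of letting `UnidentifiedImageError` propagate out of `__getitem__`.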
[rank53]: Traceback (most recent call last):
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank53]:     train()
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank53]:     trainer.train()
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank53]:     return inner_training_loop(
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank53]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank53]:     batch_samples += [next(epoch_iterator)]
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank53]:     current_batch = next(dataloader_iter)
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank53]:     data = self._next_data()
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank53]:     return self._process_data(data)
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank53]:     data.reraise()
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank53]:     raise exception
[rank53]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank53]: Original Traceback (most recent call last):
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank53]:     sample = self._get_item(i)
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank53]:     data_dict = self.preprocess_qwen2vl_v3(
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank53]:     return self.preprocess_qwen2vl(
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank53]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank53]:     image = tcs_loader(image)
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank53]:     img = pil_loader(img_value_str)
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank53]:     img = Image.open(buff)
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank53]:     raise UnidentifiedImageError(msg)
[rank53]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa480d7bec0>
[rank53]: During handling of the above exception, another exception occurred:
[rank53]: Traceback (most recent call last):
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank53]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank53]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank53]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank53]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank53]:     return self.dispatch_line(frame)
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank53]:     if self.quitting: raise BdbQuit
[rank53]: bdb.BdbQuit
recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f44c24c1da0> 002c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f52e1c83bf0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9bbb7a91c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dc11b150> Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f110667e390> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd26990a480> TracebacTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4ffa817c40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dc11be20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbb8b5c8590> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc9086ae3e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) [rank21]: Traceback (most recent call last): [rank21]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank21]: train() [rank21]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank21]: trainer.train() [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank21]: return inner_training_loop( [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank21]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank21]: batch_samples += [next(epoch_iterator)] [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank21]: current_batch = next(dataloader_iter) [rank21]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank21]: data = self._next_data() [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank21]: return self._process_data(data) [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank21]: data.reraise() [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank21]: raise exception [rank21]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank21]: Original Traceback (most recent call last): [rank21]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank21]: sample = self._get_item(i) [rank21]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank21]: data_dict = self.preprocess_qwen2vl_v3( [rank21]: File "/mnt/hwfile/liuzhaoyang/works (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1b3f8ef920> _utils/worker.py", line 351, in _worker_loop [rank21]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank21]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank21]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank21]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank21]: print(f"Failed to fetch sample {i}. Exception:", e) [rank21]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank21]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank21]: return self.dispatch_line(frame) [rank21]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank21]: if self.quitting: raise BdbQuit [rank21]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f513c13c860> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dc149940> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbd1ff407c0> [rank57]: Traceback (most recent call last): [rank57]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank57]: train() [rank57]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank57]: trainer.train() [rank57]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank57]: return inner_training_loop( [rank57]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in 
_inner_training_loop [rank57]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank57]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank57]: batch_samples += [next(epoch_iterator)] [rank57]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank57]: current_batch = next(dataloader_iter) [rank57]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank57]: data = self._next_data() [rank57]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank57]: return self._process_data(data) [rank57]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank57]: data.reraise() [rank57]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank57]: raise exception [rank57]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank57]: Original Traceback (most recent call last):
[rank57]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank57]:     sample = self._get_item(i)
[rank57]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank57]:     data_dict = self.preprocess_qwen2vl_v3(
[rank57]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank57]:     return self.preprocess_qwen2vl(
[rank57]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank57]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank57]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank57]:     image = tcs_loader(image)
[rank57]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank57]:     img = pil_loader(img_value_str)
[rank57]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank57]:     img = Image.open(buff)
[rank57]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank57]:     raise UnidentifiedImageError(msg)
[rank57]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbb8b5c8590>
[rank57]: During handling of the above exception, another exception occurred:
[rank57]: Traceback (most recent call last):
[rank57]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank57]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank57]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank57]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank57]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank57]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank57]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank57]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank57]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank57]:     return self.dispatch_line(frame)
[rank57]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank57]:     if self.quitting: raise BdbQuit
[rank57]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[rank44]: Traceback (most recent call last):
[rank44]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank44]:     train()
[rank44]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank44]:     trainer.train()
[rank44]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank44]:     return inner_training_loop(
[rank44]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank44]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank44]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank44]:     batch_samples += [next(epoch_iterator)]
[rank44]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank44]:     current_batch = next(dataloader_iter)
[rank44]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank44]:     data = self._next_data()
[rank44]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank44]:     return self._process_data(data)
[rank44]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank44]:     data.reraise()
[rank44]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank44]:     raise exception
[rank44]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
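The crash above has two layers: the underlying `PIL.UnidentifiedImageError` from undecodable image bytes fetched via `tcs_loader`, and the `bdb.BdbQuit` raised because a pdb breakpoint fires inside `__getitem__`'s exception handler while it runs in a DataLoader worker process (workers have no attached terminal, so the debugger aborts and kills the worker). A minimal sketch of the skip-and-resample pattern that the `Failed to fetch sample` handler in dataset.py appears to be attempting is below; `RobustImageDataset` and `loader` are hypothetical names for illustration, not the repo's actual classes:

```python
import random


class RobustImageDataset:
    """Map-style dataset sketch: on a decode failure, log and resample a
    different index instead of letting the exception (or a debugger)
    kill the DataLoader worker. Hypothetical; not the repo's dataset.py."""

    def __init__(self, samples, loader, max_retries=10):
        self.samples = samples        # raw records (e.g. Ceph object keys)
        self.loader = loader          # callable: record -> training dict
        self.max_retries = max_retries

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.loader(self.samples[i])
            except OSError as e:  # PIL.UnidentifiedImageError subclasses OSError
                # Log and fall back to a random other sample. Crucially, do
                # NOT call pdb.set_trace() here: inside a worker process it
                # raises bdb.BdbQuit, which is exactly the crash in this log.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randint(0, len(self.samples) - 1)
        raise RuntimeError(f"too many consecutive undecodable samples near index {i}")
```

torch's `DataLoader` accepts any object implementing `__len__` and `__getitem__`, so a wrapper like this drops in without other changes; alternatively, setting `num_workers=0` temporarily makes pdb usable for debugging a single bad sample.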
[rank44]: Original Traceback (most recent call last): [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank44]: sample = self._get_item(i) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank44]: data_dict = self.preprocess_qwen2vl_v3( [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank44]: return self.preprocess_qwen2vl( [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank44]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank44]: image = tcs_loader(image) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank44]: img = pil_loader(img_value_str) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank44]: img = Image.open(buff) [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank44]: raise UnidentifiedImageError(msg) [rank44]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7489c9b330> [rank44]: During handling of the above exception, another exception occurred: [rank44]: Traceback (most recent call last): [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank44]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank44]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank44]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank44]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank44]: print(f"Failed to fetch sample {i}. Exception:", e) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank44]: print(f"Failed to fetch sample {i}. Exception:", e) [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank44]: return self.dispatch_line(frame) [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank44]: if self.quitting: raise BdbQuit [rank44]: bdb.BdbQuit File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1323f7d300> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4ffa815d50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/ImaTraceback (most recent call last): e UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbac0efcdb0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7513836b60> ib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc80f723650> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbd1ece6cf0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f233caf4c20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc57dacbb00> File "> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile <_io.BytesIO object at 0x7fba8e712b10> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f14ec4a1490> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, 
conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f225107da80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f722da047c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbd1f6b7010> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f13dd525c60> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f2289dd9940> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f72054567a0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f225faf20c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f75138af8d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
[rank9]: During handling of the above exception, another exception occurred:
[rank9]: Traceback (most recent call last):
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank9]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank9]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank9]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank9]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank9]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank9]:     return self.dispatch_line(frame)
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank9]:     if self.quitting: raise BdbQuit
[rank9]: bdb.BdbQuit
Exception:", e) [rank1]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank1]: return self.dispatch_line(frame) [rank1]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank1]: if self.quitting: raise BdbQuit [rank1]: bdb.BdbQuit File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f14ba905f80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41f33c0ea0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) ample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd12f64be20> (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95da7ef510> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd1e9688310> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise 
UnidentifiedImageError(msg) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f40adec44a0> (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feff221d5d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f413798fc40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0cde87e520> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dc1219e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fce21a2f380> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95da7ee5c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 
3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f24e3c5e430> .BytesIO object at 0x7f413798ba60> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41f064c7c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fce21a2efc0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd0a4b739c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3e27742610> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2bb96de070> ge.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95da7ee390> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd12f6d11c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f225107e890> ib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feff221d5d0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
[rank26]: Traceback (most recent call last):
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank26]:     train()
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank26]:     trainer.train()
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank26]:     return inner_training_loop(
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank26]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank26]:     batch_samples += [next(epoch_iterator)]
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank26]:     current_batch = next(dataloader_iter)
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank26]:     data = self._next_data()
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank26]:     return self._process_data(data)
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank26]:     data.reraise()
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank26]:     raise exception
[rank26]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank26]: Original Traceback (most recent call last):
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank26]:     sample = self._get_item(i)
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank26]:     data_dict = self.preprocess_qwen2vl_v3(
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank26]:     return self.preprocess_qwen2vl(
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank26]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank26]:     image = tcs_loader(image)
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank26]:     img = pil_loader(img_value_str)
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank26]:     img = Image.open(buff)
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank26]:     raise UnidentifiedImageError(msg)
[rank26]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1c003376a0>
[rank26]: During handling of the above exception, another exception occurred:
[rank26]: Traceback (most recent call last):
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank26]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank26]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank26]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank26]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank26]:     return self.dispatch_line(frame)
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank26]:     if self.quitting: raise BdbQuit
[rank26]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
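The [rank26] traceback shows the worker dying from `bdb.BdbQuit`, i.e. a pdb breakpoint running inside the `except` handler of `__getitem__` (dataset.py:364), not from the `UnidentifiedImageError` itself. A minimal sketch of the alternative that the existing handler's print suggests: on a corrupt sample, log and retry with a different index so the DataLoader worker survives. `fetch_fn` here is a hypothetical stand-in for `self._get_item`, and the retry bound is illustrative:

```python
import random


class RobustDataset:
    """Sketch of a retry-on-corrupt-sample __getitem__, assuming the real
    dataset's _get_item can raise (e.g. PIL.UnidentifiedImageError on a
    truncated image fetched from Ceph)."""

    MAX_RETRIES = 10  # illustrative bound, not from the original code

    def __init__(self, samples, fetch_fn):
        self.samples = samples
        self.fetch_fn = fetch_fn  # stand-in for self._get_item

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, i):
        for _ in range(self.MAX_RETRIES):
            try:
                return self.fetch_fn(self.samples[i])
            except Exception as e:
                # Mirrors the print at dataset.py:364, then falls back to a
                # random index instead of dropping into pdb in the worker.
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = random.randrange(len(self.samples))
        raise RuntimeError(f"giving up after {self.MAX_RETRIES} corrupt samples")
```

Dropping to `num_workers=0` while debugging would also let a real pdb session work, since the breakpoint then runs in the main process rather than a forked worker.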
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2afff3dd00> ll last): [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank27]: train() [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank27]: trainer.train() [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank27]: return inner_training_loop( [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in 
_inner_training_loop [rank27]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank27]: batch_samples += [next(epoch_iterator)] [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank27]: current_batch = next(dataloader_iter) [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank27]: data = self._next_data() [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank27]: return self._process_data(data) [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank27]: data.reraise() [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank27]: raise exception [rank27]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank27]: Original Traceback (most recent call last): [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank27]: sample = self._get_item(i) [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank27]: data_dict = self.preprocess_qwen2vl_v3( [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank27]: return self.preprocess_qwen2vl( [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank27]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank27]: image = tcs_loader(image) [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank27]: img = pil_loader(img_value_str) [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank27]: img = Image.open(buff) [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank27]: raise UnidentifiedImageError(msg) [rank27]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f61c2dd0c70> [rank27]: During handling of the above exception, another exception occurred: [rank27]: Traceback (most recent call last): [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank27]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank27]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank27]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank27]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank27]: print(f"Failed to fetch sample {i}. Exception:", e) [rank27]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank27]: print(f"Failed to fetch sample {i}. Exception:", e) [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank27]: return self.dispatch_line(frame) [rank27]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank27]: if self.quitting: raise BdbQuit [rank27]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3f41b947c0> (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = 
self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fcf39108590> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0f38b6fc40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc69a4d5120> kages/torch/utils/data/dataloader.py", line 701, in __next__ [rank29]: data = self._next_data() [rank29]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank29]: return self._process_data(data) [rank29]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank29]: data.reraise() [rank29]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank29]: raise exception [rank29]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank29]: Original Traceback (most recent call last): [rank29]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank29]: sample = self._get_item(i) [rank29]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank29]: data_dict = self.preprocess_qwen2vl_v3( [rank29]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank29]: return self.preprocess_qwen2vl( [rank29]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank29]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank29]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank29]: image = tcs_loader(image) [rank29]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank29]: img = pil_loader(img_value_str) [rank29]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank29]: img = Image.open(buff) [rank29]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 
3536, in open [rank29]: raise UnidentifiedImageError(msg) [rank29]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f986fa22ca0> [rank29]: During handling of the above exception, another exception occurred: [rank29]: Traceback (most recent call last): [rank29]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank29]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank29]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank29]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank29]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank29]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank29]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank29]: print(f"Failed to fetch sample {i}. Exception:", e) [rank29]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank29]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank29]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank29]: return self.dispatch_line(frame) [rank29]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank29]: if self.quitting: raise BdbQuit [rank29]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98703ef6f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item 
data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd12f7db470> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc80f743330> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f986fa23240> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47c81dbab0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dc1231f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95e9a4af70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc80f7431f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47c9e03ec0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f44b3ae6e80> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd251e77150> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc80f742890> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47c8b6bc40> [rank11]: Traceback (most recent call last): [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank11]: train() [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank11]: trainer.train() [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank11]: return inner_training_loop( [rank11]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank11]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank11]: batch_samples += [next(epoch_iterator)] [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank11]: current_batch = next(dataloader_iter) [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank11]: data = self._next_data() [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank11]: return self._process_data(data) [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank11]: data.reraise() [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank11]: raise exception [rank11]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank11]: Original Traceback (most recent call last):
[rank11]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank11]:     sample = self._get_item(i)
[rank11]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank11]:     data_dict = self.preprocess_qwen2vl_v3(
[rank11]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank11]:     return self.preprocess_qwen2vl(
[rank11]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank11]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank11]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank11]:     image = tcs_loader(image)
[rank11]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank11]:     img = pil_loader(img_value_str)
[rank11]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank11]:     img = Image.open(buff)
[rank11]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank11]:     raise UnidentifiedImageError(msg)
[rank11]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47c9e03ec0>
[rank11]: During handling of the above exception, another exception occurred:
[rank11]: Traceback (most recent call last):
[rank11]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank11]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank11]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank11]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank11]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank11]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank11]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank11]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank11]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank11]:     return self.dispatch_line(frame)
[rank11]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank11]:     if self.quitting: raise BdbQuit
[rank11]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[rank30]: Traceback (most recent call last):
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank30]:     train()
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank30]:     trainer.train()
[rank30]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank30]:     return inner_training_loop(
[rank30]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank30]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank30]: File
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank30]: batch_samples += [next(epoch_iterator)] [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank30]: current_batch = next(dataloader_iter) [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank30]: data = self._next_data() [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank30]: return self._process_data(data) [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank30]: data.reraise() [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank30]: raise exception [rank30]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank30]: Original Traceback (most recent call last):
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank30]:     sample = self._get_item(i)
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank30]:     data_dict = self.preprocess_qwen2vl_v3(
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank30]:     return self.preprocess_qwen2vl(
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank30]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank30]:     image = tcs_loader(image)
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank30]:     img = pil_loader(img_value_str)
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank30]:     img = Image.open(buff)
[rank30]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank30]:     raise UnidentifiedImageError(msg)
[rank30]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41f267d0d0>
[rank30]: During handling of the above exception, another exception occurred:
[rank30]: Traceback (most recent call last):
[rank30]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank30]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank30]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank30]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank30]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank30]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank30]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank30]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank30]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank30]:     return self.dispatch_line(frame)
[rank30]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank30]:     if self.quitting: raise BdbQuit
[rank30]: bdb.BdbQuit
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3( File
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95da803ab0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd12f64fe20> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4bddd14cc0> ib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd400959ee0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5cef68ade0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3e277433d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io (Pdb) wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.U> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f19c56eb290> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5f791e05e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd24fe44cc0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2a77ba3290> > /mnt/hwfile/liuzhaoyang/workspaceTraceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f19c2373240> /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd12f6f7d30> ge.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2550331850> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb0b46e6a20> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd2d7a86840> > /mnt/hwfile/liuzhaoyang/workspaceTraceback (most recent call last): taset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb0b433ea20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faf7045e340> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) k (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fcfc94255d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) nk3]: train() [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank3]: trainer.train() [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank3]: return inner_training_loop( [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank3]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank3]: batch_samples += [next(epoch_iterator)] [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank3]: current_batch = next(dataloader_iter) [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank3]: data = self._next_data() [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank3]: return self._process_data(data) [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank3]: data.reraise() [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank3]: raise exception [rank3]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank3]: Original Traceback (most recent call last): [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank3]: sample = self._get_item(i) [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank3]: data_dict = self.preprocess_qwen2vl_v3( [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank3]: return self.preprocess_qwen2vl( [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank3]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank3]: image = tcs_loader(image) [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank3]: img = pil_loader(img_value_str) [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank3]: img = Image.open(buff) [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank3]: raise UnidentifiedImageError(msg) [rank3]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0ad75967a0> [rank3]: During handling of the above exception, another exception occurred: [rank3]: Traceback (most recent call last): [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank3]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank3]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank3]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank3]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank3]: print(f"Failed to fetch sample {i}. Exception:", e) [rank3]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank3]: print(f"Failed to fetch sample {i}. Exception:", e) [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank3]: return self.dispatch_line(frame) [rank3]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank3]: if self.quitting: raise BdbQuit [rank3]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either:
    - Avoid using `tokenizers` before the fork if possible
    - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
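The tokenizers warning above is benign but noisy in multi-worker runs. Following the warning's own suggestion, one line placed before any DataLoader workers fork (e.g. near the top of `train.py`; the exact placement is an assumption, not something the log shows) silences it:

```python
import os

# Must run before the tokenizer is used and before DataLoader workers fork.
# "false" trades Rust-side tokenizer parallelism for fork safety, which
# silences the huggingface/tokenizers warning and avoids the deadlock it
# describes; setdefault() keeps any value the launcher already exported.
os.environ.setdefault("TOKENIZERS_PARALLELISM", "false")
```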
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f15e3a68ea0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb235fe3ec0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object 
at 0x7f1323f80ae0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faf7045e2f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faf7045fdd0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f14b9e273d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) is warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable Traceback (most recent call last): e) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f413798b7e0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f27ee3d3ba0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb0b46be070> [rank2]: Traceback (most recent call last): [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank2]: train() [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank2]: trainer.train() [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank2]: return inner_training_loop( [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank2]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank2]: batch_samples += [next(epoch_iterator)] [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank2]: current_batch = next(dataloader_iter) [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank2]: data = self._next_data() [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank2]: return self._process_data(data) [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 
1491, in _process_data [rank2]: data.reraise() [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank2]: raise exception [rank2]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank2]: Original Traceback (most recent call last): [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank2]: sample = self._get_item(i) [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank2]: data_dict = self.preprocess_qwen2vl_v3( [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank2]: return self.preprocess_qwen2vl( [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank2]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank2]: image = tcs_loader(image) [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank2]: img = pil_loader(img_value_str) [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank2]: img = Image.open(buff) [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank2]: raise UnidentifiedImageError(msg) [rank2]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb0b4f472e0> [rank2]: During handling of the above exception, another exception occurred: [rank2]: Traceback (most recent call last): [rank2]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank2]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank2]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank2]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank2]: print(f"Failed to fetch sample {i}. Exception:", e) [rank2]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank2]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank2]: return self.dispatch_line(frame) [rank2]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank2]: if self.quitting: raise BdbQuit [rank2]: bdb.BdbQuit Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f14b9e26e80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dat File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc9086ae520> y(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) 
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f413798b970> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f27ee3d2f20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in 
preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f14b9e271a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) .py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41379a4ae0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch samTraceback (most recent call lastTOK File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/Traceback (most recent call last): line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4bddd14cc0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f13df7ddfd0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise 
UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faff0460ea0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff15bc71760> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb6b8923600> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc80f7050d0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0726d8d5d0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc80f717470> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faff029be20> > 
/mnt/hwfile/liuzhaoyang/workspace> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4bdd257b00> ge.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1323f804f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc8064ff9c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb1339950d0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample =Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1323f(Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f059271d3a0> nt/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) >Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc69a4d5120> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/wor File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6fcaf871f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f721c74eac0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify 
image file <_io.BytesIO object at 0x7f072f8c4950> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc80f715760> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liu(Pdb) ck (most recent call last): _gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47490f5490> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)= self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f072f9ea5c0> line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb1339950d0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4bddd150d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) (Pdb) wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
[identical traceback emitted by every DataLoader worker, with differing BytesIO addresses; one copy shown]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc80f717a10>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[rank51]: Traceback (most recent call last):
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank51]:     train()
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank51]:     trainer.train()
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank51]:     return inner_training_loop(
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank51]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank51]:     batch_samples += [next(epoch_iterator)]
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank51]:     current_batch = next(dataloader_iter)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank51]:     data = self._next_data()
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank51]:     return self._process_data(data)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank51]:     data.reraise()
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank51]:     raise exception
[rank51]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank51]: Original Traceback (most recent call last):
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank51]:     sample = self._get_item(i)
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank51]:     data_dict = self.preprocess_qwen2vl_v3(
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank51]:     return self.preprocess_qwen2vl(
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank51]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank51]:     image = tcs_loader(image)
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank51]:     img = pil_loader(img_value_str)
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank51]:     img = Image.open(buff)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank51]:     raise UnidentifiedImageError(msg)
[rank51]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17504f6cf0>
[rank51]: During handling of the above exception, another exception occurred:
[rank51]: Traceback (most recent call last):
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank51]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank51]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank51]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank51]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank51]:     return self.dispatch_line(frame)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank51]:     if self.quitting: raise BdbQuit
[rank51]: bdb.BdbQuit
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dc11b650> Traceback (most recent call last): last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ s> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) eprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb5483c9030> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98bf9305e0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0725a69170> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item 
data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f541ac70360> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) > /mnt/ File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwf> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff1dc12fb00> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4bde0f1f30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
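The root failure is `Image.open` raising `PIL.UnidentifiedImageError`: the bytes fetched from Ceph for some samples are not a decodable image. A minimal, hypothetical hardening of a loader shaped like `pil_loader` (the name and byte-string input are assumed from the traceback, not taken from the repository's actual code) is to force a full decode and return `None` on corrupt bytes, so the caller can skip the sample instead of crashing the worker:

```python
import io

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes):
    """Decode raw bytes as an image; return None on corrupt data.

    Hypothetical sketch: Image.open() is lazy, so img.load() is called to
    force the full decode and surface truncation errors here rather than
    later in the preprocessing pipeline.
    """
    buff = io.BytesIO(img_bytes)
    try:
        img = Image.open(buff)
        img.load()
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None
```

A caller can then treat `None` as "sample unavailable" and fall back to another index, rather than letting the exception propagate out of the DataLoader worker.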
[rank49]: Traceback (most recent call last):
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank49]:     train()
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank49]:     trainer.train()
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank49]:     return inner_training_loop(
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank49]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank49]:     batch_samples += [next(epoch_iterator)]
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 563, in __iter__
[rank49]:     next_batch = next(dataloader_iter)
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank49]:     data = self._next_data()
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank49]:     return self._process_data(data)
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank49]:     data.reraise()
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank49]:     raise exception
[rank49]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 1.
[rank49]: Original Traceback (most recent call last):
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank49]:     sample = self._get_item(i)
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank49]:     data_dict = self.preprocess_qwen2vl_v3(
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank49]:     return self.preprocess_qwen2vl(
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank49]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank49]:     image = tcs_loader(image)
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank49]:     img = pil_loader(img_value_str)
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank49]:     img = Image.open(buff)
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank49]:     raise UnidentifiedImageError(msg)
[rank49]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb34459eca0>
[rank49]: During handling of the above exception, another exception occurred:
[rank49]: Traceback (most recent call last):
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank49]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank49]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank49]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank49]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank49]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank49]:     return self.dispatch_line(frame)
[rank49]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank49]:     if self.quitting: raise BdbQuit
[rank49]: bdb.BdbQuit
[rank23]: Original Traceback (most recent call last): [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank23]: sample = self._get_item(i) [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank23]: data_dict = self.preprocess_qwen2vl_v3( [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank23]: return self.preprocess_qwen2vl( [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank23]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank23]: image = tcs_loader(image) [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank23]: img = pil_loader(img_value_str) [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank23]: img = Image.open(buff) [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank23]: raise UnidentifiedImageError(msg) [rank23]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f541ac70360> [rank23]: During handling of the above exception, another exception occurred: [rank23]: Traceback (most recent call last): [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank23]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank23]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank23]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank23]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank23]: print(f"Failed to fetch sample {i}. Exception:", e) [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank23]: print(f"Failed to fetch sample {i}. Exception:", e) [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank23]: return self.dispatch_line(frame) [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank23]: if self.quitting: raise BdbQuit [rank23]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) s_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbe4b50a110> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, 
in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95bc99fba0> File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f072f8d9940> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f15bc7f4810> e 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f072f9f1da0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) iuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4c679a4720> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb2360f1f80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f07265245e0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4d90a5e5c0> .BytesIO object at 0x7f049f99a700> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image 
file <_io.BytesIO object at 0x7f049566bba0> [rank60]: Traceback (most recent call last): [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank60]: train() [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank60]: trainer.train() [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank60]: return inner_training_loop( [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank60]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank60]: batch_samples += [next(epoch_iterator)] [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank60]: current_batch = next(dataloader_iter) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank60]: data = self._next_data() [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank60]: return self._process_data(data) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank60]: data.reraise() [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise 
[rank60]: raise exception [rank60]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank60]: Original Traceback (most recent call last): [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank60]: sample = self._get_item(i) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank60]: data_dict = self.preprocess_qwen2vl_v3( [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank60]: return self.preprocess_qwen2vl( [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank60]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank60]: image = tcs_loader(image) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank60]: img = pil_loader(img_value_str) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank60]: img = Image.open(buff) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank60]: raise UnidentifiedImageError(msg) [rank60]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc8064ffc40> [rank60]: During handling of the above exception, another exception occurred: [rank60]: Traceback (most recent call last): [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank60]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank60]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank60]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank60]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank60]: print(f"Failed to fetch sample {i}. Exception:", e) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank60]: print(f"Failed to fetch sample {i}. Exception:", e) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank60]: return self.dispatch_line(frame) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank60]: if self.quitting: raise BdbQuit [rank60]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetchTraceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0676587c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1a00cbf9c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb23b12b470> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9878403290> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
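The tokenizers fork warning is unrelated to the image failures but is easy to silence the way the message itself suggests: set the environment variable before the tokenizer is first used and before DataLoader workers fork. A minimal sketch, assuming it runs at the top of the training entry point:

```python
import os

# Must run before any tokenizers usage and before DataLoader workers
# fork; placing it at the very top of the entry script is simplest.
os.environ["TOKENIZERS_PARALLELISM"] = "false"
```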
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank61]:     return self._process_data(data)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank61]:     data.reraise()
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank61]:     raise exception
[rank61]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank61]: Original Traceback (most recent call last):
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank61]:     sample = self._get_item(i)
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank61]:     data_dict = self.preprocess_qwen2vl_v3(
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank61]:     return self.preprocess_qwen2vl(
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank61]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank61]:     image = tcs_loader(image)
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank61]:     img = pil_loader(img_value_str)
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank61]:     img = Image.open(buff)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank61]:     raise UnidentifiedImageError(msg)
[rank61]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1a00b76f20>
[rank61]: During handling of the above exception, another exception occurred:
[rank61]: Traceback (most recent call last):
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank61]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank61]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank61]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank61]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank61]:     return self.dispatch_line(frame)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank61]:     if self.quitting: raise BdbQuit
[rank61]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
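The repeated `PIL.UnidentifiedImageError` above means the bytes fetched from Ceph are not a decodable image (an empty object, a truncated upload, or an error payload, for instance). A common mitigation is to catch the error inside `__getitem__` and serve a substitute sample instead of crashing (or debugging) the DataLoader worker. The sketch below is an assumption, not the repository's implementation: `SkipCorruptSamples` and the next-index retry policy are ours. It stays framework-free because `PIL.UnidentifiedImageError` subclasses `OSError`, so catching `OSError` covers it without importing PIL.

```python
class SkipCorruptSamples:
    """Dataset wrapper: when fetching a sample raises a decoding error
    (PIL.UnidentifiedImageError subclasses OSError), log it and fall
    back to the next index instead of killing the DataLoader worker."""

    def __init__(self, base, max_retries=10):
        self.base = base              # any indexable, sized dataset
        self.max_retries = max_retries

    def __len__(self):
        return len(self.base)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.base[i]
            except OSError as e:      # covers PIL.UnidentifiedImageError
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = (i + 1) % len(self.base)   # try the neighbouring sample
        raise RuntimeError(f"gave up after {self.max_retries} corrupt samples")
```

Wrapping the training dataset with this before handing it to the DataLoader keeps a handful of bad objects from stalling all ranks; a hard failure still surfaces if many consecutive samples are corrupt.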
[rank56]: Traceback (most recent call last):
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank56]:     train()
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank56]:     trainer.train()
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank56]:     return inner_training_loop(
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank56]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank56]:     batch_samples += [next(epoch_iterator)]
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank56]:     current_batch = next(dataloader_iter)
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank56]:     data = self._next_data()
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank56]:     return self._process_data(data)
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank56]:     data.reraise()
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank56]:     raise exception
[rank56]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank56]: Original Traceback (most recent call last):
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank56]:     sample = self._get_item(i)
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank56]:     data_dict = self.preprocess_qwen2vl_v3(
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank56]:     return self.preprocess_qwen2vl(
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank56]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank56]:     image = tcs_loader(image)
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank56]:     img = pil_loader(img_value_str)
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank56]:     img = Image.open(buff)
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank56]:     raise UnidentifiedImageError(msg)
[rank56]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6244009440>
[rank56]: During handling of the above exception, another exception occurred:
[rank56]: Traceback (most recent call last):
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank56]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank56]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank56]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank56]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank56]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank56]:     return self.dispatch_line(frame)
[rank56]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank56]:     if self.quitting: raise BdbQuit
[rank56]: bdb.BdbQuit
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6fcaf87010> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2bb8147240> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f168a6ecb30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fd2d79a0090> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17504f6cf0> ile <_io.BytesIO object at 0x7f5fea275d50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6db85058a0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f620e9bc2c0> Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image 
file <_io.BytesIO object at 0x7f721c74e160> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17cc0e0cc0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2bb9551620> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify 
image file <_io.BytesIO object at 0x7f6fcb9571f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) Traceback (most recent call last): (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fa1c375c2c0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb6806e6cf0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) .BytesIO object at 0x7f6d36408310> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff9003eb9c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa1c375e110> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f290ae24e50> ge.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98855e83b0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95f5ce6f20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5fbe057100> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() 
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f036f39af20>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[... identical UnidentifiedImageError tracebacks and Pdb frames repeated across the other data-loader workers ...]
[rank12]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank12]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank12]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank12]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank12]:     return self.dispatch_line(frame)
[rank12]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank12]:     if self.quitting: raise BdbQuit
[rank12]: bdb.BdbQuit
Exception:", e) mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f943081b920> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f628b7b3dd0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Ima> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1ac4ff5440> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) Traceback (most recent call last): (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f643ea0e2a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa1c375e3e0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb231cf1620> ib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f984f5a38d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f25500ff7e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f233caf4c20> Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f19c2371620> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f624400b790> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5fbda8a480> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f6244fae390> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) iuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff15bc73830> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa1c375e520> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2afff47420> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff87f53f7e0> kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f25515de890> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff9006e6ed0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[identical UnidentifiedImageError tracebacks and (Pdb) prompts from the other DataLoader workers omitted]
[rank0]: Traceback (most recent call last):
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank0]:     train()
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank0]:     trainer.train()
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank0]:     return inner_training_loop(
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank0]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank0]:     batch_samples += [next(epoch_iterator)]
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank0]:     current_batch = next(dataloader_iter)
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank0]:     data = self._next_data()
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank0]:     return self._process_data(data)
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank0]:     data.reraise()
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank0]:     raise exception
[rank0]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank0]: Original Traceback (most recent call last):
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank0]:     sample = self._get_item(i)
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank0]:     data_dict = self.preprocess_qwen2vl_v3(
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank0]:     return self.preprocess_qwen2vl(
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank0]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank0]:     image = tcs_loader(image)
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank0]:     img = pil_loader(img_value_str)
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank0]:     img = Image.open(buff)
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank0]:     raise UnidentifiedImageError(msg)
[rank0]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f233caf4c20>
[rank0]: During handling of the above exception, another exception occurred:
[rank0]: Traceback (most recent call last):
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank0]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank0]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank0]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank0]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank0]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank0]:     return self.dispatch_line(frame)
[rank0]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank0]:     if self.quitting: raise BdbQuit
[rank0]: bdb.BdbQuit
[rank16 and rank55 reported identical bdb.BdbQuit tracebacks, each caught in DataLoader worker process 0; duplicates omitted]
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff9001f4770> > /mnt/hwfile/liuzhaoyang/workspace (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47759d8db0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fb231cf1bc0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) tifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa1c4125530> [rank17]: Traceback (most recent call last): [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank17]: train() [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank17]: trainer.train() [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank17]: return inner_training_loop( [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank17]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank17]: batch_samples += [next(epoch_iterator)] [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank17]: current_batch = next(dataloader_iter) [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank17]: data = self._next_data() [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank17]: return self._process_data(data) [rank17]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank17]: data.reraise() [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank17]: raise exception [rank17]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank17]: Original Traceback (most recent call last): [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank17]: sample = self._get_item(i) [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank17]: data_dict = self.preprocess_qwen2vl_v3( [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank17]: return self.preprocess_qwen2vl( [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank17]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank17]: image = tcs_loader(image) [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank17]: img = pil_loader(img_value_str) [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank17]: img = Image.open(buff) [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank17]: raise UnidentifiedImageError(msg) [rank17]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2a77ba3290> [rank17]: During handling of the above exception, 
another exception occurred: [rank17]: Traceback (most recent call last): [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank17]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank17]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank17]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank17]: print(f"Failed to fetch sample {i}. Exception:", e) [rank17]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank17]: print(f"Failed to fetch sample {i}. Exception:", e) [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank17]: return self.dispatch_line(frame) [rank17]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank17]: if self.quitting: raise BdbQuit [rank17]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspa> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ror: cannot identify image file <_io.BytesIO object at 0x7f2bb814e250> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f704d4228e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff9002ff8d0> Traceback (most recent call last): Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in 
preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f22499355d0> self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f942df44040> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb231cf2890> self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2bb9f157b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2247257560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f25500ff880> ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff87f53f1a0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff9016eade0> Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9876fb3470> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f96b2615da0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) ck (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6fcaf86fc0> /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) k (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f970b5bbd80> line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fa2442fbdd0> Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6deb6444f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ffa49be0590> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample =Traceback (most recent call last): ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2247257560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6fcb9571f0> /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f942d57c2c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) .BytesIO object at 0x7f9f367d27a0> ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9878214360> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f255016ac00> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb54821afc0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6deb647c90> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb4c73052b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdf80775800> Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", li File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f984f532070> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = ImageTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = 
pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f25500ff9c0> iuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd65f15ec00> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6f3067d3a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = 
Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff9001f5a30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspaceTraceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sTraceback (most recent call last): mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95e69d2e30> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d4279ede0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd6617464d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff87f53d490> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) eprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6deb647c40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
[... the same PIL.UnidentifiedImageError traceback and (Pdb) prompt repeat for the remaining DataLoader workers; only the BytesIO addresses differ ...]
(Pdb)
[rank63]:     train()
[rank63]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank63]:     trainer.train()
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank63]:     return inner_training_loop(
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank63]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank63]:     batch_samples += [next(epoch_iterator)]
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank63]:     current_batch = next(dataloader_iter)
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank63]:     data = self._next_data()
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank63]:     return self._process_data(data)
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank63]:     data.reraise()
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank63]:     raise exception
[rank63]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank63]: Original Traceback (most recent call last):
[rank63]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank63]:     sample = self._get_item(i)
[rank63]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank63]:     data_dict = self.preprocess_qwen2vl_v3(
[rank63]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank63]:     return self.preprocess_qwen2vl(
[rank63]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank63]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank63]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank63]:     image = tcs_loader(image)
[rank63]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank63]:     img = pil_loader(img_value_str)
[rank63]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank63]:     img = Image.open(buff)
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank63]:     raise UnidentifiedImageError(msg)
[rank63]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9876fb3470>
[rank63]: During handling of the above exception, another exception occurred:
[rank63]: Traceback (most recent call last):
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank63]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank63]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank63]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank63]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank63]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank63]:     return self.dispatch_line(frame)
[rank63]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank63]:     if self.quitting: raise BdbQuit
[rank63]: bdb.BdbQuit
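The rank63 trace shows what actually aborts training: the exception handler at dataset.py:364 drops into a pdb session inside a DataLoader worker, and since the worker has no usable stdin, pdb raises `BdbQuit`, which the main process re-raises. A minimal sketch of the alternative pattern, where `__getitem__` logs the failure and resamples instead of invoking the debugger; `RetryingDataset` and `max_retries` are illustrative names, not code from this repository:

```python
import random


class RetryingDataset:
    """Sketch of a __getitem__ that resamples on corrupt data.

    Replaces the pdb breakpoint at dataset.py:364 with a bounded
    retry: log the failure, pick another index, and only give up
    after max_retries consecutive bad samples.
    """

    def __init__(self, samples, max_retries=10):
        self.samples = samples          # raw records; some may be corrupt
        self.max_retries = max_retries

    def _get_item(self, i):
        value = self.samples[i]
        if value is None:               # stand-in for UnidentifiedImageError
            raise ValueError(f"corrupt sample {i}")
        return value

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = random.randrange(len(self.samples))  # resample another index
        raise RuntimeError("too many corrupt samples in a row")


ds = RetryingDataset(["a", None, "c"], max_retries=50)
assert ds[0] == "a"
assert ds[1] in ("a", "c")  # the corrupt index resolves to a good sample
```

The key design point is that a worker process must never block on interactive input: any `pdb.set_trace()` reachable from `__getitem__` turns one bad image into a whole-job `BdbQuit`, as seen above.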
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f22499355d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) k (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd6525eeac0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17cc0df8d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: IAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17cc0dd530> - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) xception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
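The fork warning above can be silenced by setting `TOKENIZERS_PARALLELISM` before any tokenizer is used in the parent process, e.g. at the top of the training entry point (where exactly it lives is up to the project; the key point is that it runs before DataLoader workers fork):

```python
import os

# Must be set before the first tokenizer call in a process that will later
# fork (e.g. before DataLoader workers are spawned); setdefault keeps any
# value already exported in the environment.
os.environ.setdefault("TOKENIZERS_PARALLELISM", "false")
```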
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa2443acc20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f14042aa1b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fae4c1b4e50> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 
return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdb263c1b20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa2443ae390> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) k (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbd57797470> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) k (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc1226edda0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = 
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank50]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank50]:     image = tcs_loader(image)
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank50]:     img = pil_loader(img_value_str)
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank50]:     img = Image.open(buff)
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank50]:     raise UnidentifiedImageError(msg)
[rank50]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fae4c1b4e50>
[rank50]: During handling of the above exception, another exception occurred:
[rank50]: Traceback (most recent call last):
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank50]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank50]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank50]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank50]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank50]:     return self.dispatch_line(frame)
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank50]:     if self.quitting: raise BdbQuit
[rank50]: bdb.BdbQuit
Exception:", e) (Pdb) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f984f5abab0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/l File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[rank45]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc12207c540>

[rank45]: During handling of the above exception, another exception occurred:

[rank45]: Traceback (most recent call last):
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank45]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank45]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank45]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank45]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank45]:     return self.dispatch_line(frame)
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank45]:     if self.quitting: raise BdbQuit
[rank45]: bdb.BdbQuit
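The `bdb.py` frames in the `BdbQuit` traceback show that a pdb session is active inside the dataset's exception handler. DataLoader worker processes have no usable terminal attached, so quitting (or failing to interact with) the debugger surfaces as `BdbQuit` and kills the worker, and with it the rank. A hedged sketch of an exception handler that logs and resamples instead of breaking into pdb — the names `_get_item` and `MAX_RETRIES` and the resampling policy are assumptions, not the repo's actual code:

```python
import random


class RobustDatasetMixin:
    """Sketch of a __getitem__ that retries a different sample on failure.

    Assumes the concrete dataset provides self._get_item(i) and __len__,
    as the frames in the log suggest. Logging and resampling keeps a few
    corrupt images from crashing an entire distributed training run.
    """

    MAX_RETRIES = 10

    def __getitem__(self, i):
        for _ in range(self.MAX_RETRIES):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = random.randrange(len(self))  # resample a random index
        raise RuntimeError(f"Gave up after {self.MAX_RETRIES} failed samples")
```

If interactive debugging is genuinely needed, running the DataLoader with `num_workers=0` keeps `__getitem__` in the main process, where pdb has a terminal.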
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9844fa2f20>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[rank59]: Traceback (most recent call last):
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank59]:     train()
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank59]:     trainer.train()
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank59]:     return inner_training_loop(
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank59]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank59]:     batch_samples += [next(epoch_iterator)]
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank59]:     current_batch = next(dataloader_iter)
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank59]:     data = self._next_data()
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank59]:     return self._process_data(data)
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank59]:     data.reraise()
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank59]:     raise exception
[rank59]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank59]: Original Traceback (most recent call last):
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank59]:     sample = self._get_item(i)
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank59]:     data_dict = self.preprocess_qwen2vl_v3(
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank59]:     return self.preprocess_qwen2vl(
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank59]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank59]:     image = tcs_loader(image)
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank59]:     img = pil_loader(img_value_str)
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank59]:     img = Image.open(buff)
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank59]:     raise UnidentifiedImageError(msg)
[rank59]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9844fa3100>

[rank59]: During handling of the above exception, another exception occurred:

[rank59]: Traceback (most recent call last):
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank59]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank59]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank59]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank59]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank59]:     return self.dispatch_line(frame)
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank59]:     if self.quitting: raise BdbQuit
[rank59]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d363fd080> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd24f455cb0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9847127240> File "/mnt/hwfile/liuzhaoyang/wor File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d363fd7b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1cd4ec02c0> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc067676d90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1c1b641170> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/wor File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fcfc94258a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f984b5a0cc0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File " (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1b92607560> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open 
raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd392bbf790> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f95bc99f650> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in 
preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f190c54e700> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd2d3170bd0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b681ae930> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb6b592ef20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb4c7307330> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd65f15f1f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0248e21cb0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[The identical traceback is raised by every other DataLoader worker (only the BytesIO addresses differ); repeated copies elided.]
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ef9bb3dd5d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd8e798f560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f704d4228e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) iuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6f31115760> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa2443aecf0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d363e2930> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efbfea3b4c0> kages/torch/utils/data/dataloader.py", line 701, in __next__ [rank48]: data = self._next_data() [rank48]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank48]: return self._process_data(data) [rank48]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank48]: data.reraise() [rank48]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank48]: raise exception [rank48]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank48]: Original Traceback (most recent call last): [rank48]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank48]: sample = self._get_item(i) [rank48]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank48]: data_dict = self.preprocess_qwen2vl_v3( [rank48]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank48]: return self.preprocess_qwen2vl( [rank48]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank48]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank48]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank48]: image = tcs_loader(image) [rank48]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank48]: img = pil_loader(img_value_str) [rank48]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank48]: img = Image.open(buff) [rank48]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank48]: raise UnidentifiedImageError(msg) [rank48]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6f31115760> [rank48]: During handling of the above exception, another exception occurred: [rank48]: Traceback (most recent call last): [rank48]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank48]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank48]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank48]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank48]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank48]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank48]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank48]: print(f"Failed to fetch sample {i}. Exception:", e) [rank48]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank48]: print(f"Failed to fetch sample {i}. Exception:", e) [rank48]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank48]: return self.dispatch_line(frame) [rank48]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank48]: if self.quitting: raise BdbQuit [rank48]: bdb.BdbQuit File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa1c375d4e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify 
image file <_io.BytesIO object at 0x7eff70781e40> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa2443ae0c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17d9eba9d0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/wor File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph 
image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ef7267ff7e0> /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Token indices sequence length is longer than the specified maximum sequence length for this model (8940 > 8192). Running this sequence through the model will result in indexing errors nt/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in 
open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0103195440> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/ImaTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b59ae20c0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b59aefd80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ef72699a9d0> kages/torch/utils/data/dataloader.py", line 701, in __next__ [rank18]: data = self._next_data() [rank18]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank18]: return self._process_data(data) [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank18]: data.reraise() [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank18]: raise exception [rank18]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank18]: Original Traceback (most recent call last): [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank18]: sample = self._get_item(i) [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank18]: data_dict = self.preprocess_qwen2vl_v3( [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank18]: return self.preprocess_qwen2vl( [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank18]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank18]: image = tcs_loader(image) [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank18]: img = pil_loader(img_value_str) [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank18]: img = Image.open(buff) [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 
3536, in open [rank18]: raise UnidentifiedImageError(msg) [rank18]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17d9eba9d0> [rank18]: During handling of the above exception, another exception occurred: [rank18]: Traceback (most recent call last): [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank18]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank18]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank18]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank18]: print(f"Failed to fetch sample {i}. Exception:", e) [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank18]: print(f"Failed to fetch sample {i}. Exception:", e) [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank18]: return self.dispatch_line(frame) [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank18]: if self.quitting: raise BdbQuit [rank18]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6c80067d80> 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b59ae2480> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6f3067a570> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f033ff58e50> .BytesIO object at 0x7f6deb647c40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b59aeb880> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b59ae1f80> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f51bd4dde90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efa3ca8a4d0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9697879c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb54b8a0a40> [rank33]: Traceback (most recent call last): [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank33]: train() [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank33]: trainer.train() [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank33]: return inner_training_loop( [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank33]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank33]: batch_samples += [next(epoch_iterator)] [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank33]: current_batch = next(dataloader_iter) [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank33]: data = self._next_data() [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank33]: return self._process_data(data) [rank33]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank33]: data.reraise() [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank33]: raise exception [rank33]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank33]: Original Traceback (most recent call last): [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank33]: sample = self._get_item(i) [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank33]: data_dict = self.preprocess_qwen2vl_v3( [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank33]: return self.preprocess_qwen2vl( [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank33]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank33]: image = tcs_loader(image) [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank33]: img = pil_loader(img_value_str) [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank33]: img = Image.open(buff) [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank33]: raise UnidentifiedImageError(msg) [rank33]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f033ff58e50> [rank33]: During handling of the above exception, 
another exception occurred: [rank33]: Traceback (most recent call last): [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank33]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank33]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank33]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank33]: print(f"Failed to fetch sample {i}. Exception:", e) [rank33]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank33]: print(f"Failed to fetch sample {i}. Exception:", e) [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank33]: return self.dispatch_line(frame) [rank33]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank33]: if self.quitting: raise BdbQuit [rank33]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb231cf1170> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b681af2e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): e/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) uzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f168a6ee9d0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d363fdf80> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa30867f4c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6fcaf85170> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f51a38dc900> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) k (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b687b9bc0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb4ca4a3330> Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb54964cc20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f168b92b240> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) [rank7]: Traceback (most recent call last): [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank7]: train() [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank7]: trainer.train() [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank7]: return inner_training_loop( [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank7]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank7]: batch_samples += [next(epoch_iterator)] [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank7]: current_batch = next(dataloader_iter) [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank7]: data = self._next_data() [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank7]: return self._process_data(data) [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank7]: data.reraise() [rank7]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank7]: raise exception [rank7]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank7]: Original Traceback (most recent call last): [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank7]: sample = self._get_item(i) [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank7]: data_dict = self.preprocess_qwen2vl_v3( [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank7]: return self.preprocess_qwen2vl( [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank7]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank7]: image = tcs_loader(image) [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank7]: img = pil_loader(img_value_str) [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank7]: img = Image.open(buff) [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank7]: raise UnidentifiedImageError(msg) [rank7]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efa3ca8a4d0> [rank7]: During handling of the above exception, another exception occurred: [rank7]: Traceback (most recent call last): [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", 
line 351, in _worker_loop [rank7]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank7]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank7]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank7]: print(f"Failed to fetch sample {i}. Exception:", e) [rank7]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank7]: print(f"Failed to fetch sample {i}. Exception:", e) [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank7]: return self.dispatch_line(frame) [rank7]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank7]: if self.quitting: raise BdbQuit [rank7]: bdb.BdbQuit if self.quitting: raise BdbQuit [rank35]: bdb.BdbQuit Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
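The bdb.BdbQuit crash above follows directly from leaving a pdb breakpoint inside the dataset's __getitem__: the breakpoint fires inside a forked DataLoader worker, which has no usable interactive stdin, and quitting the debugger raises BdbQuit, which torch then re-raises in the trainer process on every rank. A minimal stdlib-only sketch of the usual alternative (an illustrative pattern, not the code in dataset.py; the class and parameter names here are hypothetical): log the failure and resample a random index instead of dropping into pdb.

```python
import random


class SkipCorruptSamples:
    """Wrap a fetch function so corrupt samples are logged and replaced
    by a random resample, instead of crashing the DataLoader worker.

    `fetch` stands in for the dataset's real per-index loading logic
    (the equivalent of `self._get_item(i)` in the log above).
    """

    def __init__(self, fetch, length, max_retries=10):
        self.fetch = fetch          # callable: index -> sample
        self.length = length
        self.max_retries = max_retries

    def __len__(self):
        return self.length

    def __getitem__(self, i):
        last_exc = None
        for _ in range(self.max_retries):
            try:
                return self.fetch(i)
            except Exception as e:  # e.g. PIL.UnidentifiedImageError
                last_exc = e
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = random.randrange(self.length)  # try another sample
        raise RuntimeError("too many consecutive corrupt samples") from last_exc
```

With `num_workers > 0`, anything raised here propagates through `data.reraise()` in the parent, so failing loudly only after several retries keeps a handful of bad Ceph objects from killing a multi-rank job.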
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd8e798fb50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b59aeb3d0> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17cd703ba0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f18549db0b0> [rank19]: Traceback (most recent call last): [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank19]: train() [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank19]: trainer.train() [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank19]: return inner_training_loop( [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank19]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank19]: batch_samples += [next(epoch_iterator)] [rank19]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank19]: current_batch = next(dataloader_iter) [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank19]: data = self._next_data() [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank19]: return self._process_data(data) [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank19]: data.reraise() [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank19]: raise exception [rank19]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank19]: Original Traceback (most recent call last): [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank19]: sample = self._get_item(i) [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank19]: data_dict = self.preprocess_qwen2vl_v3( [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank19]: return self.preprocess_qwen2vl( [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank19]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank19]: image = tcs_loader(image) [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank19]: img = pil_loader(img_value_str) [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank19]: img = Image.open(buff) [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank19]: raise UnidentifiedImageError(msg) [rank19]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6fcaf85170> [rank19]: During handling of the above exception, another exception occurred: [rank19]: Traceback (most recent call last): [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank19]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank19]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank19]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank19]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank19]: print(f"Failed to fetch sample {i}. Exception:", e) [rank19]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank19]: print(f"Failed to fetch sample {i}. Exception:", e) [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank19]: return self.dispatch_line(frame) [rank19]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank19]: if self.quitting: raise BdbQuit [rank19]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f168a6ed0d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f71106e80e0> (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b5910a020> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/l> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file 
<_io.BytesIO object at 0x7fd652487e20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17cc100e00> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9d3873e2a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd65f15fab0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17cc1e7150> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in 
open raise UnidentifiedImageError(msg) Traceback (most recent call last): PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b5910b1f0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6c583bc2c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) k (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6fcaf870b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) k (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9e3bf3470> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f17cd57e610> .BytesIO object at 0x7f036f39af20> > /mnt/hwfile/liuzhaoyang/workspaceTraceback (most recent call last): taset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d363f7e70> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9f8b5c22f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f51bb437560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f14042aa480> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6e5ef30a90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0103197150> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9aa4f89080> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd794a5b240> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f033ff58db0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f51bd4dde90> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwf File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Ima> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efe6dd47f10> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd539631030>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[rank15]: Traceback (most recent call last):
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank15]:     train()
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank15]:     trainer.train()
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank15]:     return inner_training_loop(
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank15]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank15]:     batch_samples += [next(epoch_iterator)]
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank15]:     current_batch = next(dataloader_iter)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank15]:     data = self._next_data()
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank15]:     return self._process_data(data)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank15]:     data.reraise()
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank15]:     raise exception
[rank15]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank15]: Original Traceback (most recent call last):
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank15]:     sample = self._get_item(i)
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank15]:     data_dict = self.preprocess_qwen2vl_v3(
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank15]:     return self.preprocess_qwen2vl(
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank15]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank15]:     image = tcs_loader(image)
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank15]:     img = pil_loader(img_value_str)
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank15]:     img = Image.open(buff)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank15]:     raise UnidentifiedImageError(msg)
[rank15]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4d9d4dd3f0>
[rank15]: During handling of the above exception, another exception occurred:
[rank15]: Traceback (most recent call last):
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank15]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank15]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank15]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank15]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank15]:     return self.dispatch_line(frame)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank15]:     if self.quitting: raise BdbQuit
[rank15]: bdb.BdbQuit
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f48c3503b50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd50d474c20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd94b52e020> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4c8cd4ca90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd7937e7ec0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4d9d4d60c0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4c8c912b60> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fb4c73071a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd7937e7510> .BytesIO object at 0x7f1b92607970> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa1c375f8d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb231cf18a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd79362d120> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4bd3e5e250> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4b49c93e70> (Pdb) wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, 
in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa2443aa2f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3086922a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa245881a30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa244392e80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa38db70db0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
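The warning's second suggestion can be applied with one line before any DataLoader workers fork. A minimal sketch (where exactly to place it in this training script is an assumption; it must run before `tokenizers` is first used):

```python
import os

# Set before DataLoader workers fork: "false" disables the Rust-level thread
# pool inside huggingface/tokenizers, avoiding the post-fork deadlock the
# warning describes. setdefault() keeps any value already set by the launcher.
os.environ.setdefault("TOKENIZERS_PARALLELISM", "false")
```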
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9aa4f850d0> (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe9a9669170> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa1c375f920> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f9ab3453560> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_ioTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4da4e53600> ils/data/dataloader.py", line 701, in __next__ [rank5]: data = self._next_data() [rank5]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank5]: return self._process_data(data) [rank5]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank5]: data.reraise() [rank5]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank5]: raise exception [rank5]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank5]: Original Traceback (most recent call last): [rank5]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank5]: sample = self._get_item(i) [rank5]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank5]: data_dict = self.preprocess_qwen2vl_v3( [rank5]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank5]: return self.preprocess_qwen2vl( [rank5]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank5]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank5]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank5]: image = tcs_loader(image) [rank5]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank5]: img = pil_loader(img_value_str) [rank5]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank5]: img = Image.open(buff) [rank5]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank5]: raise UnidentifiedImageError(msg) [rank5]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9ba6ac4400> [rank5]: During handling of the above exception, another exception occurred: [rank5]: Traceback (most recent call last): [rank5]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank5]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank5]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank5]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank5]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank5]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank5]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank5]: print(f"Failed to fetch sample {i}. Exception:", e) [rank5]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank5]: print(f"Failed to fetch sample {i}. Exception:", e) [rank5]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank5]: return self.dispatch_line(frame) [rank5]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank5]: if self.quitting: raise BdbQuit [rank5]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9d37ca1d50> 701, in __next__ [rank22]: data = self._next_data() [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank22]: return self._process_data(data) [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank22]: data.reraise() [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank22]: raise exception 
[rank22]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank22]: Original Traceback (most recent call last): [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank22]: sample = self._get_item(i) [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank22]: data_dict = self.preprocess_qwen2vl_v3( [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank22]: return self.preprocess_qwen2vl( [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank22]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank22]: image = tcs_loader(image) [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank22]: img = pil_loader(img_value_str) [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank22]: img = Image.open(buff) [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank22]: raise UnidentifiedImageError(msg) [rank22]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa1c375f920> [rank22]: During handling of the above exception, another exception occurred: [rank22]: Traceback (most recent call last): [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank22]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank22]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank22]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank22]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank22]: print(f"Failed to fetch sample {i}. Exception:", e) [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank22]: print(f"Failed to fetch sample {i}. Exception:", e) [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank22]: return self.dispatch_line(frame) [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank22]: if self.quitting: raise BdbQuit [rank22]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9aa4f847c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) .10/site-packages/torch/utils/data/dataloader.py", line 1445, in _next_data [rank24]: return self._process_data(data) [rank24]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank24]: data.reraise() [rank24]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank24]: raise exception [rank24]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 1. 
[rank24]: Original Traceback (most recent call last): [rank24]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank24]: sample = self._get_item(i) [rank24]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank24]: data_dict = self.preprocess_qwen2vl_v3( [rank24]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank24]: return self.preprocess_qwen2vl( [rank24]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank24]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank24]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank24]: image = tcs_loader(image) [rank24]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank24]: img = pil_loader(img_value_str) [rank24]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank24]: img = Image.open(buff) [rank24]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank24]: raise UnidentifiedImageError(msg) [rank24]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd251e77150> [rank24]: During handling of the above exception, another exception occurred: [rank24]: Traceback (most recent call last): [rank24]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank24]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank24]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank24]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank24]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank24]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank24]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank24]: print(f"Failed to fetch sample {i}. Exception:", e) [rank24]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank24]: print(f"Failed to fetch sample {i}. Exception:", e) [rank24]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank24]: return self.dispatch_line(frame) [rank24]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank24]: if self.quitting: raise BdbQuit [rank24]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe785d75da0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)sable this warning, you can either: (Pdb) /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_itTraceback (most recent call last): ng/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd2d3186110> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe80ffb0c20> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9d37ca3880> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) TracebaTraceback (most recent call File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9aa4f80770> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) : train() [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank52]: trainer.train() [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank52]: return inner_training_loop( [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank52]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank52]: batch_samples += [next(epoch_iterator)] [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank52]: current_batch = next(dataloader_iter) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank52]: data = self._next_data() [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank52]: return self._process_data(data) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank52]: data.reraise() [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank52]: raise exception [rank52]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank52]: Original Traceback (most recent call last): [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank52]: sample = self._get_item(i) [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank52]: data_dict = self.preprocess_qwen2vl_v3( [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank52]: return self.preprocess_qwen2vl( [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank52]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank52]: image = tcs_loader(image) [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank52]: img = pil_loader(img_value_str) [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank52]: img = Image.open(buff) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank52]: raise UnidentifiedImageError(msg) [rank52]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb4c7307380> [rank52]: During handling of the above exception, another exception occurred: [rank52]: Traceback (most recent call last): [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank52]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank52]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank52]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank52]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank52]: print(f"Failed to fetch sample {i}. Exception:", e) [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank52]: print(f"Failed to fetch sample {i}. Exception:", e) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank52]: return self.dispatch_line(frame) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank52]: if self.quitting: raise BdbQuit [rank52]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb549793f10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fb68d249e90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9d390b0680> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe80ffb1800> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe4ff567a60> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: IAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1cd66b89f0> - Avoid using `tokenizers` before File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
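The tokenizers fork warning is harmless here but noisy; following the message's own suggestion, it can be silenced by setting the environment variable before DataLoader workers fork, e.g. near the top of train.py:

```python
import os

# Must run before the first tokenizer use / before DataLoader workers fork.
os.environ["TOKENIZERS_PARALLELISM"] = "false"
```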
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe4ff567560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) Traceback (most recent call last): [rank37]: Traceback (most recent call last): [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank37]: train() [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank37]: trainer.train() [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank37]: return inner_training_loop( [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank37]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank37]: batch_samples += [next(epoch_iterator)] [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank37]: current_batch = next(dataloader_iter) [rank37]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank37]: data = self._next_data() [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank37]: return self._process_data(data) [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank37]: data.reraise() [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank37]: raise exception [rank37]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank37]: Original Traceback (most recent call last): [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank37]: sample = self._get_item(i) [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank37]: data_dict = self.preprocess_qwen2vl_v3( [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 [rank37]: return self.preprocess_qwen2vl( [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl [rank37]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank37]: image = tcs_loader(image) [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank37]: img = pil_loader(img_value_str) [rank37]: File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank37]: img = Image.open(buff) [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank37]: raise UnidentifiedImageError(msg) [rank37]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1cd66b89f0> [rank37]: During handling of the above exception, another exception occurred: [rank37]: Traceback (most recent call last): [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank37]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank37]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank37]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank37]: print(f"Failed to fetch sample {i}. Exception:", e) [rank37]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank37]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank37]: return self.dispatch_line(frame) [rank37]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank37]: if self.quitting: raise BdbQuit [rank37]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe501c36070> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = 
self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9d390b0680> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) k (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe4ff566430> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4f260dd3f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1cd4c0db70>
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
[rank6]: Traceback (most recent call last):
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank6]:     train()
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank6]:     trainer.train()
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank6]:     return inner_training_loop(
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank6]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank6]:     batch_samples += [next(epoch_iterator)]
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank6]:     current_batch = next(dataloader_iter)
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank6]:     data = self._next_data()
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank6]:     return self._process_data(data)
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank6]:     data.reraise()
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank6]:     raise exception
[rank6]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank6]: Original Traceback (most recent call last):
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank6]:     sample = self._get_item(i)
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank6]:     data_dict = self.preprocess_qwen2vl_v3(
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank6]:     return self.preprocess_qwen2vl(
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank6]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank6]:     image = tcs_loader(image)
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank6]:     img = pil_loader(img_value_str)
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank6]:     img = Image.open(buff)
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank6]:     raise UnidentifiedImageError(msg)
[rank6]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f872e50e980>
[rank6]: During handling of the above exception, another exception occurred:
[rank6]: Traceback (most recent call last):
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank6]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank6]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank6]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank6]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank6]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank6]:     return self.dispatch_line(frame)
[rank6]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank6]:     if self.quitting: raise BdbQuit
[rank6]: bdb.BdbQuit
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1c1b63e340> ib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9f8b5c59e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9d37ca1e90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0277e50630> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b834e7a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1dcee87b00> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe785d74450> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe7871562a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)
(Pdb) Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe80ffb1580>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[... the same traceback repeats from other DataLoader workers, differing only in the BytesIO address ...]
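The innermost frame is `pil_loader` handing `Image.open` a `BytesIO` whose bytes PIL cannot decode — typically an empty or truncated object fetched from Ceph. A minimal defensive loader is sketched below; `safe_pil_loader` is a hypothetical name, not the function in `dataset.py`, and it assumes the caller can treat `None` as "skip this sample". Forcing `img.load()` matters: `Image.open` is lazy, so a truncated file can otherwise fail much later, inside preprocessing.

```python
import io
from PIL import Image, UnidentifiedImageError

def safe_pil_loader(img_bytes: bytes):
    """Decode raw bytes into an RGB PIL image, or return None if the
    buffer is empty, truncated, or not a recognizable image format.
    (Hypothetical helper; the production pil_loader raises instead.)"""
    if not img_bytes:
        return None
    buff = io.BytesIO(img_bytes)
    try:
        img = Image.open(buff)
        img.load()  # force a full decode now instead of lazily
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        # UnidentifiedImageError: unknown format / garbage bytes;
        # OSError: recognized header but truncated payload.
        return None
```

With this shape the dataset can detect a bad object at fetch time and fall back, rather than letting the exception escape the worker.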
[rank39]: Traceback (most recent call last):
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank39]:     train()
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank39]:     trainer.train()
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank39]:     return inner_training_loop(
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank39]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank39]:     batch_samples += [next(epoch_iterator)]
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank39]:     current_batch = next(dataloader_iter)
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank39]:     data = self._next_data()
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank39]:     return self._process_data(data)
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank39]:     data.reraise()
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank39]:     raise exception
[rank39]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank39]: Original Traceback (most recent call last):
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank39]:     sample = self._get_item(i)
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank39]:     data_dict = self.preprocess_qwen2vl_v3(
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank39]:     return self.preprocess_qwen2vl(
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank39]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank39]:     image = tcs_loader(image)
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank39]:     img = pil_loader(img_value_str)
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank39]:     img = Image.open(buff)
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank39]:     raise UnidentifiedImageError(msg)
[rank39]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f51bb4343b0>
[rank39]: During handling of the above exception, another exception occurred:
[rank39]: Traceback (most recent call last):
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank39]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank39]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank39]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank39]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank39]:     return self.dispatch_line(frame)
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank39]:     if self.quitting: raise BdbQuit
[rank39]: bdb.BdbQuit
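The secondary `bdb.BdbQuit` shows why the job dies rather than skipping the bad sample: the `(Pdb)` prompts indicate a pdb session apparently started inside the `except` handler around `dataset.py:364`, but pdb cannot interact with a DataLoader worker subprocess (no attached terminal), so the debugger gives up with `BdbQuit` and kills the worker. A common alternative, sketched here with hypothetical names and a toy `_get_item`, is to log the failure and deterministically fall back to a neighboring index:

```python
class ResilientDataset:
    """Sketch of a __getitem__ that survives corrupt samples instead of
    breaking into pdb (which raises BdbQuit inside DataLoader workers).
    Names and the None-as-corrupt convention are illustrative only."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples
        self.max_retries = max_retries

    def _get_item(self, i):
        sample = self.samples[i]
        if sample is None:  # stand-in for an undecodable image
            raise ValueError(f"corrupt sample at index {i}")
        return sample

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                # Log and move on; never start an interactive debugger
                # here -- worker processes have no stdin to drive it.
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = (i + 1) % len(self.samples)
        raise RuntimeError("too many consecutive corrupt samples")

    def __len__(self):
        return len(self.samples)
```

The deterministic `(i + 1) % len` fallback keeps the behavior reproducible across workers; a random re-draw works too but makes failures harder to replay.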
[... the same UnidentifiedImageError traceback and (Pdb) prompt repeat from the remaining workers, differing only in the BytesIO address; [rank13] dies with the same bdb.BdbQuit chain as [rank39] ...]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe80ffb4540> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4f26103fb0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4f61e31850> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe4ff567560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f51bb434c70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f51bb435ad0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe7871562a0> Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4f260ded40> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f027822e1b0> Traceback (most recent call 
last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f027886f7e0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4f260dde40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) k (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe786b262f0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfTraceback (most recent call last): qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4f260de070> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f02774404a0> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe80ffbf330> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f043b1cf3d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7effeedf2660> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b825cea0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b84619e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f027886f7e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)
(Pdb)
[rank4]: Traceback (most recent call last):
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank4]:     train()
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank4]:     trainer.train()
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank4]:     return inner_training_loop(
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank4]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank4]:     batch_samples += [next(epoch_iterator)]
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank4]:     current_batch = next(dataloader_iter)
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank4]:     data = self._next_data()
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank4]:     return self._process_data(data)
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank4]:     data.reraise()
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank4]:     raise exception
[rank4]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank4]: Original Traceback (most recent call last):
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank4]:     sample = self._get_item(i)
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank4]:     data_dict = self.preprocess_qwen2vl_v3(
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3
[rank4]:     return self.preprocess_qwen2vl(
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl
[rank4]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank4]:     image = tcs_loader(image)
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank4]:     img = pil_loader(img_value_str)
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank4]:     img = Image.open(buff)
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank4]:     raise UnidentifiedImageError(msg)
[rank4]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f027886f7e0>
[rank4]: During handling of the above exception, another exception occurred:
[rank4]: Traceback (most recent call last):
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank4]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank4]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank4]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank4]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank4]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank4]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank4]:     return self.dispatch_line(frame)
[rank4]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank4]:     if self.quitting: raise BdbQuit
[rank4]: bdb.BdbQuit
0x7f03b8355d50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b83c3740> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f02775eb920> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7effeedf3740> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efff1401760> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f02775eb9c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f02775eb920> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efff1401ad0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b83c30b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b8242de0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f027822e200> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b8243ce0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f02775eb7e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b82715d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b8273880> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 704, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 513, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03b8273d80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) 0%| | 0/34278 [01:31 sys.exit(load_entry_point('torch==2.5.1', 'console_scripts', 'torchrun')()) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper return f(*args, **kwargs) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py", line 919, in main run(args) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py", line 910, in run elastic_launch( File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/launcher/api.py", line 138, in __call__ return launch_agent(self._config, self._entrypoint, list(args)) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/launcher/api.py", line 269, in launch_agent raise ChildFailedError( torch.distributed.elastic.multiprocessing.errors.ChildFailedError: ============================================================ train.py FAILED ------------------------------------------------------------ Failures: ------------------------------------------------------------ Root Cause (first observed failure): [0]: time : 2025-03-07_22:32:15 host : HOST-10-140-60-104 rank : 62 (local_rank: 6) exitcode : 1 (pid: 36748) error_file: traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html ============================================================ W0307 22:32:18.940000 79426 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:897] Sending process 79525 closing signal SIGTERM W0307 22:32:18.947000 79426 
/mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:897] Sending process 79526 closing signal SIGTERM
W0307 22:32:18.947000 79426 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:897] Sending process 79527 closing signal SIGTERM
W0307 22:32:18.948000 79426 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:897] Sending process 79528 closing signal SIGTERM
W0307 22:32:18.948000 79426 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:897] Sending process 79529 closing signal SIGTERM
W0307 22:32:18.949000 79426 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:897] Sending process 79530 closing signal SIGTERM
W0307 22:32:18.950000 79426 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:897] Sending process 79531 closing signal SIGTERM
E0307 22:32:19.081000 109975 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 4 (pid: 110114) of binary: /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/bin/python
Traceback (most recent call last):
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/bin/torchrun", line 33, in <module>
    sys.exit(load_entry_point('torch==2.5.1', 'console_scripts', 'torchrun')())
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/errors/__init__.py", line 355, in wrapper
    return f(*args, **kwargs)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py", line 919, in main
    run(args)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py", line 910, in run
    elastic_launch(
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/launcher/api.py", line 138, in __call__
    return launch_agent(self._config, self._entrypoint, list(args))
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/launcher/api.py", line 269, in launch_agent
    raise ChildFailedError(
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
train.py FAILED
------------------------------------------------------------
Failures:
  <NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2025-03-07_22:32:15
  host      : HOST-10-140-60-56
  rank      : 20 (local_rank: 4)
  exitcode  : 1 (pid: 110114)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0307 22:32:20.098000 36057 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 1 (pid: 36131) of binary: /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/bin/python
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
train.py FAILED
------------------------------------------------------------
Failures:
[1]:
  time      : 2025-03-07_22:32:17
  host      : HOST-10-140-60-95
  rank      : 31 (local_rank: 7)
  exitcode  : 1 (pid: 36137)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2025-03-07_22:32:17
  host      : HOST-10-140-60-95
  rank      : 25 (local_rank: 1)
  exitcode  : 1 (pid: 36131)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0307 22:32:20.620000 99473 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 0 (pid: 99589) of binary: /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/bin/python
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
train.py FAILED
------------------------------------------------------------
Failures:
  <NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2025-03-07_22:32:17
  host      : HOST-10-140-60-96
  rank      : 32 (local_rank: 0)
  exitcode  : 1 (pid: 99589)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0307 22:32:20.768000 30413 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 5 (pid: 30491) of binary: /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/bin/python
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
train.py FAILED
------------------------------------------------------------
Failures:
  <NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2025-03-07_22:32:18
  host      : HOST-10-140-60-102
  rank      : 53 (local_rank: 5)
  exitcode  : 1 (pid: 30491)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0307 22:32:21.641000 93035 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 1 (pid: 93119) of binary: /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/bin/python
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
train.py FAILED
------------------------------------------------------------
Failures:
  <NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2025-03-07_22:32:16
  host      : HOST-10-140-60-101
  rank      : 41 (local_rank: 1)
  exitcode  : 1 (pid: 93119)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0307 22:32:21.944000 3654 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 2 (pid: 3760) of binary: /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/bin/python
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
train.py FAILED
------------------------------------------------------------
Failures:
  <NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2025-03-07_22:32:16
  host      : HOST-10-140-60-41
  rank      : 10 (local_rank: 2)
  exitcode  : 1 (pid: 3760)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
E0307 22:32:24.098000 79426 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/elastic/multiprocessing/api.py:869] failed (exitcode: 1) local_rank: 0 (pid: 79524) of binary: /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/bin/python
torch.distributed.elastic.multiprocessing.errors.ChildFailedError:
============================================================
train.py FAILED
------------------------------------------------------------
Failures:
  <NO_OTHER_FAILURES>
------------------------------------------------------------
Root Cause (first observed failure):
[0]:
  time      : 2025-03-07_22:32:18
  host      : HOST-10-140-60-39
  rank      : 0 (local_rank: 0)
  exitcode  : 1 (pid: 79524)
  error_file: <N/A>
  traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html
============================================================
W0307 22:37:08.708000 117665 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793]
W0307 22:37:08.708000 117665 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
W0307 22:37:08.708000 117665 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
W0307 22:37:08.708000 117665 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
W0307 22:48:58.213000 10975 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793]
W0307 22:48:58.213000 10975 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
W0307 22:48:58.213000 10975 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
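Each failure summary above reads "traceback : To enable traceback see: https://pytorch.org/docs/stable/elastic/errors.html". That page's fix is to wrap the child entrypoint with the `@record` decorator from `torch.distributed.elastic`, so the failing rank's traceback is written to the `error_file` shown in the summary instead of being lost. A minimal sketch, assuming train.py's real entrypoint is a `main()` function (the function name and the import guard are illustrative, not taken from this run):

```python
# Sketch: capture per-rank tracebacks for torchrun's error summary.
# @record re-raises the child's exception after recording it, so normal
# return values pass through unchanged.
try:
    from torch.distributed.elastic.multiprocessing.errors import record
except ImportError:  # keep the sketch runnable where torch is absent
    def record(fn):
        return fn

@record
def main() -> int:
    # ... build the model and run the training loop here ...
    return 0

if __name__ == "__main__":
    raise SystemExit(main())
```

With this in place, a crashing rank's summary line shows a populated `error_file` containing the full traceback, rather than `<N/A>`.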
W0307 22:48:58.213000 10975 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
W0307 22:52:30.532000 21752 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793]
W0307 22:52:30.532000 21752 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
W0307 22:52:30.532000 21752 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
W0307 22:52:30.532000 21752 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
W0307 22:52:31.445000 73095 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793]
W0307 22:52:31.445000 73095 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
W0307 22:52:31.445000 73095 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed.
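The banner above means torchrun silently forced OMP_NUM_THREADS=1 in every worker because no value was set. Following the hint, one would export an explicit value before launching; the thread count and the commented launch line below are illustrative, not taken from this run:

```shell
# Give each worker an explicit OpenMP thread budget instead of torchrun's
# default of 1; a common starting point is physical cores / GPUs per node.
export OMP_NUM_THREADS=8
# torchrun --nproc_per_node=8 train.py   # illustrative launch, flags not from this log
echo "$OMP_NUM_THREADS"
```

Setting the variable in the launching shell (or the sbatch script) suppresses the warning and propagates to the spawned ranks.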
W0307 22:52:31.445000 73095 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
***
[2025-03-07 22:52:46,312] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:46,323] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:46,323] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:46,323] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:46,325] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:46,323] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:49,132] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:49,147] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:49,147] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:49,147] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:49,148] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:49,828] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:52:49,865] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:52:50,096] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
[2025-03-07 22:52:50,096] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
[2025-03-07 22:52:50,099] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:52:50,114] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:52:57,272] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:57,272] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:57,277] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:57,774] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:57,774] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:52:59,182] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:59,262] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:59,262] [INFO] [comm.py:683:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
[2025-03-07 22:52:59,265] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:59,825] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:52:59,861] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:52:59,913] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:52:59,995] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:59,995] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:59,997] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:52:59,998] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:00,633] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
[2025-03-07 22:53:00,635] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:53:00,640] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:53:09,329] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:53:09,329] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:53:09,845] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:53:09,846] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:53:09,867] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:53:09,867] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:53:09,868] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:53:09,868] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 22:53:11,269] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:11,365] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:11,850] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:53:11,859] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:53:12,036] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:12,082] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:12,082] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:12,082] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:12,082] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:12,168] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:12,169] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 22:53:12,714] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
[2025-03-07 22:53:12,716] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:53:12,778] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:53:12,820] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 22:53:12,868] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
[2025-03-07 22:53:12,871] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`. [2025-03-07 22:53:12,875] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64 You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`. [2025-03-07 22:53:21,367] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,412] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,412] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,413] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,422] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,446] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,457] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,459] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,470] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,470] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,470] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,471] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,493] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto 
detect) [2025-03-07 22:53:21,494] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,490] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,490] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,490] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,491] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,491] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,514] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,520] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,525] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,526] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,528] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,529] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,529] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,538] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,539] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,539] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,540] [INFO] 
[real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,545] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,547] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,582] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,584] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,586] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,586] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,585] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,587] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,601] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:21,601] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect) [2025-03-07 22:53:23,275] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,275] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,293] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,296] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,297] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,320] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,381] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,381] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,382] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,382] [INFO] 
[comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,389] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,394] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,414] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,418] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,418] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,418] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,419] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,419] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,419] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,446] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,467] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,479] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,479] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,491] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,499] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,499] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,499] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,489] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,557] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,557] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,557] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,569] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,570] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,591] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,591] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,604] [INFO] [comm.py:652:init_distributed] cdb=None [2025-03-07 22:53:23,606] [INFO] 
[2025-03-07 22:53:24,016] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
`Qwen2VLRotaryEmbedding` can now be fully parameterized by passing the model config through the `config` argument. All other arguments will be removed in v4.46
[2025-03-07 22:53:26,540] [INFO] [partition_parameters.py:348:__exit__] finished initializing model - num_params = 730, num_elems = 8.29B
Loading checkpoint shards:   0%|          | 0/5 [00:00
Training args: TrainingArguments(
_n_gpu=1,
accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False},
adafactor=False,
adam_beta1=0.9,
adam_beta2=0.999,
adam_epsilon=1e-08,
attn_implementation=flash_attention_2,
auto_find_batch_size=False,
average_tokens_across_devices=False,
batch_eval_metrics=False,
bf16=True,
bf16_full_eval=False,
cache_dir=None,
data_seed=None,
dataloader_drop_last=False,
dataloader_num_workers=8,
dataloader_persistent_workers=False,
dataloader_pin_memory=True,
dataloader_prefetch_factor=None,
ddp_backend=None,
ddp_broadcast_buffers=None,
ddp_bucket_cap_mb=None,
ddp_find_unused_parameters=None,
ddp_timeout=1800,
debug=[],
deepspeed=./scripts/zero3.json,
disable_tqdm=False,
dispatch_batches=None,
do_eval=False,
do_predict=False,
do_train=False,
eval_accumulation_steps=None,
eval_delay=0,
eval_do_concat_batches=True,
eval_on_start=False,
eval_steps=None,
eval_strategy=no,
eval_use_gather_object=False,
evaluation_strategy=None,
fp16=False,
fp16_backend=auto,
fp16_full_eval=False,
fp16_opt_level=O1,
freeze_visual_encoder=True,
fsdp=[],
fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False},
fsdp_min_num_params=0,
fsdp_transformer_layer_cls_to_wrap=None,
full_determinism=False,
gradient_accumulation_steps=1,
gradient_checkpointing=True,
gradient_checkpointing_kwargs=None,
greater_is_better=None,
group_by_length=False,
group_by_modality_length=True,
half_precision_backend=auto,
hub_always_push=False,
hub_model_id=None,
hub_private_repo=None,
hub_strategy=every_save,
hub_token=,
ignore_data_skip=False,
include_for_metrics=[],
include_inputs_for_metrics=False,
include_num_input_tokens_seen=False,
include_tokens_per_second=False,
jit_mode_eval=False,
label_names=None,
label_smoothing_factor=0.0,
learning_rate=1e-05,
length_column_name=length,
load_best_model_at_end=False,
local_rank=7,
log_level=passive,
log_level_replica=warning,
log_on_each_node=True,
logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-53-23_HOST-10-140-60-41,
logging_first_step=False,
logging_nan_inf_filter=True,
logging_steps=1.0,
logging_strategy=steps,
lr_scheduler_kwargs={},
lr_scheduler_type=cosine,
max_grad_norm=1.0,
max_steps=-1,
metric_for_best_model=None,
model_max_length=8192,
mp_parameters=,
neftune_noise_alpha=None,
no_cuda=False,
num_train_epochs=1.0,
optim=adamw_torch,
optim_args=None,
optim_target_modules=None,
output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
overwrite_output_dir=False,
past_index=-1,
per_device_eval_batch_size=4,
per_device_train_batch_size=2,
prediction_loss_only=False,
push_to_hub=False,
push_to_hub_model_id=None,
push_to_hub_organization=None,
push_to_hub_token=,
ray_scope=last,
remove_unused_columns=True,
report_to=[],
restore_callback_states_from_checkpoint=False,
resume_from_checkpoint=None,
run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
save_on_each_node=False,
save_only_model=False,
save_safetensors=True,
save_steps=2000,
save_strategy=steps,
save_total_limit=3,
seed=42,
skip_memory_metrics=True,
split_batches=None,
tf32=True,
torch_compile=False,
torch_compile_backend=None,
torch_compile_mode=None,
torch_empty_cache_steps=None,
torchdynamo=None,
tpu_metrics_debug=False,
tpu_num_cores=None,
use_cpu=False,
use_ipex=False,
use_legacy_prediction_loop=False,
use_liger_kernel=False,
use_mps_device=False,
verbose_logging=False,
warmup_ratio=0.03,
warmup_steps=0,
weight_decay=0.0,
)
Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False,
'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_en=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=4, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-53-23_HOST-10-140-60-56, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, 
save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, 
gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=1, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-52-59_HOST-10-140-60-56, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, 
use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, 
include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=2, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-53-23_HOST-10-140-60-56, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction__train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, 
load_best_model_at_end=False, local_rank=3, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-53-23_HOST-10-140-60-104, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, 
auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=5, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-53-12_HOST-10-140-60-104, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, 
max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, 
ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokenoss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) for_metrics=False, include_num_input_tokens_seen=False, 
include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=1, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-53-23_HOST-10-140-60-101, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 
'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=0, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-53-23_HOST-10-140-60-101, logging_first_step=False, 
logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, 
dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=5, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-53-23_HOST-10-140-60-101, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, 
Training args: TrainingArguments(
_n_gpu=1,
accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False},
adafactor=False,
adam_beta1=0.9,
adam_beta2=0.999,
adam_epsilon=1e-08,
attn_implementation=flash_attention_2,
auto_find_batch_size=False,
average_tokens_across_devices=False,
batch_eval_metrics=False,
bf16=True,
bf16_full_eval=False,
cache_dir=None,
data_seed=None,
dataloader_drop_last=False,
dataloader_num_workers=8,
dataloader_persistent_workers=False,
dataloader_pin_memory=True,
dataloader_prefetch_factor=None,
ddp_backend=None,
ddp_broadcast_buffers=None,
ddp_bucket_cap_mb=None,
ddp_find_unused_parameters=None,
ddp_timeout=1800,
debug=[],
deepspeed=./scripts/zero3.json,
disable_tqdm=False,
dispatch_batches=None,
do_eval=False,
do_predict=False,
do_train=False,
eval_accumulation_steps=None,
eval_delay=0,
eval_do_concat_batches=True,
eval_on_start=False,
eval_steps=None,
eval_strategy=no,
eval_use_gather_object=False,
evaluation_strategy=None,
fp16=False,
fp16_backend=auto,
fp16_full_eval=False,
fp16_opt_level=O1,
freeze_visual_encoder=True,
fsdp=[],
fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False},
fsdp_min_num_params=0,
fsdp_transformer_layer_cls_to_wrap=None,
full_determinism=False,
gradient_accumulation_steps=1,
gradient_checkpointing=True,
gradient_checkpointing_kwargs=None,
greater_is_better=None,
group_by_length=False,
group_by_modality_length=True,
half_precision_backend=auto,
hub_always_push=False,
hub_model_id=None,
hub_private_repo=None,
hub_strategy=every_save,
hub_token=,
ignore_data_skip=False,
include_for_metrics=[],
include_inputs_for_metrics=False,
include_num_input_tokens_seen=False,
include_tokens_per_second=False,
jit_mode_eval=False,
label_names=None,
label_smoothing_factor=0.0,
learning_rate=1e-05,
length_column_name=length,
load_best_model_at_end=False,
local_rank=6,
log_level=passive,
log_level_replica=warning,
log_on_each_node=True,
logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_22-53-23_HOST-10-140-60-101,
logging_first_step=False,
logging_nan_inf_filter=True,
logging_steps=1.0,
logging_strategy=steps,
lr_scheduler_kwargs={},
lr_scheduler_type=cosine,
max_grad_norm=1.0,
max_steps=-1,
metric_for_best_model=None,
model_max_length=8192,
mp_parameters=,
neftune_noise_alpha=None,
no_cuda=False,
num_train_epochs=1.0,
optim=adamw_torch,
optim_args=None,
optim_target_modules=None,
output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
overwrite_output_dir=False,
past_index=-1,
per_device_eval_batch_size=4,
per_device_train_batch_size=2,
prediction_loss_only=False,
push_to_hub=False,
push_to_hub_model_id=None,
push_to_hub_organization=None,
push_to_hub_token=,
ray_scope=last,
remove_unused_columns=True,
report_to=[],
restore_callback_states_from_checkpoint=False,
resume_from_checkpoint=None,
run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
save_on_each_node=False,
save_only_model=False,
save_safetensors=True,
save_steps=2000,
save_strategy=steps,
save_total_limit=3,
seed=42,
skip_memory_metrics=True,
split_batches=None,
tf32=True,
torch_compile=False,
torch_compile_backend=None,
torch_compile_mode=None,
torch_empty_cache_steps=None,
torchdynamo=None,
tpu_metrics_debug=False,
tpu_num_cores=None,
use_cpu=False,
use_ipex=False,
use_legacy_prediction_loop=False,
use_liger_kernel=False,
use_mps_device=False,
verbose_logging=False,
warmup_ratio=0.03,
warmup_steps=0,
weight_decay=0.0,
)
[TCSLoader] config_path: ~/petreloss.conf
--> before Client(conf_path)
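The flattened `TrainingArguments(key=value, ...)` dump above is hard to diff between runs. A minimal sketch for splitting such a dump into a dict for comparison (the helper name `parse_args_repr` is an assumption for illustration, not part of the training code):

```python
def parse_args_repr(text: str) -> dict:
    """Split a flattened ``TrainingArguments(key=value, ...)`` dump into a dict.

    Commas inside brace/bracket values (e.g. accelerator_config={...}) are kept
    intact by tracking nesting depth, so each field is split only at top level.
    """
    inner = text.strip()
    if inner.startswith("TrainingArguments(") and inner.endswith(")"):
        inner = inner[len("TrainingArguments("):-1]
    fields, depth, start = {}, 0, 0
    # Append a sentinel comma so the final field is flushed by the same logic.
    for i, ch in enumerate(inner + ","):
        if ch in "{[":
            depth += 1
        elif ch in "}]":
            depth -= 1
        elif ch == "," and depth == 0:
            part = inner[start:i].strip()
            if "=" in part:
                key, _, val = part.partition("=")
                fields[key.strip()] = val.strip()
            start = i + 1
    return fields
```

Two parsed dumps can then be compared with a plain dict diff to spot, e.g., a `local_rank` or `learning_rate` mismatch between ranks or runs.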
--> after Client(conf_path)
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/guienv.json with all sampling strategy
Rank 0: Loaded 327972 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/guienv.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/omniact_fix.json with all sampling strategy
Rank 0: Loaded 6720 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/omniact_fix.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricoig16k.json with all sampling strategy
Rank 0: Loaded 16133 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricoig16k.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricosca.json with all sampling strategy
Rank 0: Loaded 173212 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricosca.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/seeclick.json with all sampling strategy
Rank 0: Loaded 271121 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/seeclick.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ui_refexp.json with all sampling strategy
Rank 0: Loaded 15624 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ui_refexp.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/webui350k.json with all sampling strategy
Rank 0: Loaded 57389 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/webui350k.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/widget_captioning.json with all sampling strategy
Rank 0: Loaded 101426 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/widget_captioning.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_150k_description_filtered.json with all sampling strategy
Rank 0: Loaded 148416 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_150k_description_filtered.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_258k_function_filtered.json with all sampling strategy
Rank 0: Loaded 248264 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_258k_function_filtered.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_hybrid_773k_max_25qa_filtered_new.json with all sampling strategy
Rank 0: Loaded 1243857 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_hybrid_773k_max_25qa_filtered_new.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_windows_splited.json with all sampling strategy
Rank 0: Loaded 1075344 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_windows_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_linux_splited.json with all sampling strategy
Rank 0: Loaded 43156 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_linux_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_macos_splited.json with all sampling strategy
Rank 0: Loaded 18399 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_macos_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/uibert_train_ground_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 4646 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/uibert_train_ground_d240430_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_taperception_grounding_d240815_v2.jsonl with all sampling strategy
Rank 0: Loaded 2500 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_taperception_grounding_d240815_v2.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_widget_grounding_d240815_v2.jsonl with all sampling strategy
Rank 0: Loaded 14878 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_widget_grounding_d240815_v2.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_mug_grounding_d240812.jsonl with all sampling strategy
Rank 0: Loaded 26090 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_mug_grounding_d240812.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_phone_2403_ground_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 24798 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_phone_2403_ground_d240430_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2405_ground_d240521_v1.jsonl with all sampling strategy
Rank 0: Loaded 5008 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2405_ground_d240521_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2406_ground_d240612_v1.jsonl with all sampling strategy
Rank 0: Loaded 7903 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2406_ground_d240612_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/amex_grounding_d240813_v1.jsonl with all sampling strategy
Rank 0: Loaded 102007 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/amex_grounding_d240813_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_1_d240815_v3.jsonl with all sampling strategy
Rank 0: Loaded 63581 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_1_d240815_v3.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_2_d240815_v3.jsonl with all sampling strategy
Rank 0: Loaded 6852 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_2_d240815_v3.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_schedual_extract_20240520_v2_r464_reprompt_d240607.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 928 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_schedual_extract_20240520_v2_r464_reprompt_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_app_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 2488 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_app_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_os_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 1242 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_os_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_web_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 2360 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_web_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_element_recognition_d240605_v1_correct_d240607.jsonl with all sampling strategy
Rank 0: Loaded 3791 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_element_recognition_d240605_v1_correct_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_marker_recognition_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5179 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_marker_recognition_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_ocr_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5090 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_ocr_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_operation_oral_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5070 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_operation_oral_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_visual_prompt_with_bbox_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5248 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_visual_prompt_with_bbox_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2408_region_caption_d240903_v1.jsonl with all sampling strategy
Rank 0: Loaded 5854 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2408_region_caption_d240903_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_detection_d20240418_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_detection_d20240418_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_element_recognition_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_element_recognition_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_ground_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_ground_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_detection_d20240418_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_detection_d20240418_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_element_recognition_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_element_recognition_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_ground_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_ground_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_element_recognition_d240605_v1_correct_d240607.jsonl with all sampling strategy
Rank 0: Loaded 24620 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_element_recognition_d240605_v1_correct_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d20240604_v2.jsonl with all sampling strategy
Rank 0: Loaded 17196 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d20240604_v2.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 5998 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ocr_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 31276 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ocr_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_short_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 27880 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_short_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_with_bbox_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 62401 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_with_bbox_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screenai_layout_20240604_v1.jsonl with all sampling strategy
Rank 0: Loaded 22076 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screenai_layout_20240604_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_grounding_d240924_v1.jsonl with all sampling strategy
Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_grounding_d240924_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_oral_operation_d240924_v1.jsonl with all sampling strategy
Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_oral_operation_d240924_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_region_caption_d240924_v1.jsonl with all sampling strategy
Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_region_caption_d240924_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ui_operation_oral_wbox_d241023_v2.jsonl with all sampling strategy
Rank 0: Loaded 20293 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ui_operation_oral_wbox_d241023_v2.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_comment_20240606_json_d20241023_v2.jsonl with all sampling strategy
Rank 0: Loaded 1055 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_comment_20240606_json_d20241023_v2.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_json_d241126.jsonl with repeat:3 sampling strategy
Rank 0: Loaded 6837 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_json_d241126.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_xml_d241126.jsonl with repeat:3 sampling strategy
Rank 0: Loaded 6873 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_xml_d241126.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/OS_Altas_androidworld_grounding_d241120_v1.jsonl with all sampling strategy
Rank 0: Loaded 89860 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/OS_Altas_androidworld_grounding_d241120_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_long_caption_20240604_v1.jsonl with repeat:4 sampling strategy
Rank 0: Loaded 3156 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_long_caption_20240604_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/grounding_new.jsonl with all sampling strategy
Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/grounding_new.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/regioncaption_new.jsonl with all sampling strategy
Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/regioncaption_new.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/oral_operation_new.jsonl with all sampling strategy
Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/oral_operation_new.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240812_grounding_dataset_20240812_v1_r6600.jsonl with all sampling strategy
Rank 0: Loaded 6600 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240812_grounding_dataset_20240812_v1_r6600.jsonl
Rank 0: Loaded 4387471 samples from data/stage1_20250307_1.yaml
Rank 0: Formatting inputs...Skip in lazy mode
Detected kernel version 3.10.0, which is below the
recommended minimum of 5.5.0; this can cause the process to hang. It is recommended to upgrade the kernel to the minimum version or higher.
Parameter Offload: Total persistent parameters: 877056 in 401 params
0%| | 0/34278 [00:00<?, ?it/s]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_782679.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_782679.png'}, {'type': 'text', 'text': "\nClick on 'Newsroom link'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_11769.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_642995.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_17/C4web50k-1_91930202-split-2.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_414829.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_37634.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_162058_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_162058_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Shape'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_642995.png'}, {'type': 'text', 'text': "\nClick on 'Breadcrumb list'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_414829.png'}, {'type': 'text', 'text': '\nClick on \'the image of "wefly"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_37634.png'}, {'type': 'text', 'text': '\nClick on \'The bottom item in the list of buttons, which contains the text "Guest registration (Day 1)"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_17/C4web50k-1_91930202-split-2.png'}, {'type': 'text', 'text': '\nHow would you determine the bounding box for the Subscribe and NEVER Miss a Post! shown in the image? Please click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_11769.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the full article about server-side hardware characteristics'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_225812.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_225812.png'}, {'type': 'text', 'text': "\nClick on 'Download data in CSV format'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image47198.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_241720.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_195703_before_screenshot_sub3.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_344406.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_195703_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Dislike'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_344406.png'}, {'type': 'text', 'text': "\nClick on 'Link opens search'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_241720.png'}, {'type': 'text', 'text': "\nClick on 'redirect to Weaver Xtreme Theme website'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image47198.jpg'}, {'type': 'text', 'text': '\nWhat is the name of the application? Explain your answer based on UI element. Please click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_709024.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_709024.png'}, {'type': 'text', 'text': "\nClick on 'Terms & Conditions'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_582168.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_582168.png'}, {'type': 'text', 'text': '\nClick on \'button titled "Protein-energy malnutrition mortality rates in children", above "No thanks", under "Probability of dying in infancy, by sex"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022602_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022602_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft Rewards 54'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_202811_before_screenshot.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_42027.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_356551.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0601-0708_099_batch_1_num_2000/20240606210108362/main_3292.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/d6e150d7-6dd3-42d6-a6e5-23594dac4098.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_356551.png'}, {'type': 'text', 'text': "\nClick on 'Previous Item:\nAquaman & Mera, labeled Aquaman & Mera'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_42027.png'}, {'type': 'text', 'text': '\nClick on \'the image of "30 Day Money Back Guarantee *"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/d6e150d7-6dd3-42d6-a6e5-23594dac4098.png'}, {'type': 'text', 'text': '\nPlease identify the area of the element with bounding box that can provide an answer to this question: No repetition. Please click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_202811_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Page up'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0601-0708_099_batch_1_num_2000/20240606210108362/main_3292.png'}, {'type': 'text', 'text': '\nProvide a well-structured and accurate JSON layout description to meet developers\' needs for the user interface design of a software product.\nConstraints: the JSON layout description must follow software design conventions and be easy to implement, while ensuring the accuracy and completeness of the information.\nOutput format: JSON, including attributes such as node_type, location, content.\nWorkflow:\n\nAnalyze the user interface design requirements of the software product.\nBased on the analysis, determine the node_type, location, content and other attributes of the JSON layout description.\nWrite a JSON layout description that conforms to the conventions. Click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_756442.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_756442.png'}, {'type': 'text', 'text': '\nClick on \'Things to do button, in "Explore
Festivals" section\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_122372.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_353695.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_184544_before_screenshot.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_182941_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_182941_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Accept Change'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_184544_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Open'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_353695.png'}, {'type': 'text', 'text': "\nClick on 'the text box'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_122372.png'}, {'type': 'text', 'text': '\nClick on \'The text link with the label "František Nušl observatory in Jindřichův Hradec" located in the Table of Contents on the right side of the webpage.\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_483350.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_483350.png'}, {'type': 'text', 'text': "\nClick on 'Posts by managed.abnworks'"}]}] Load image from 
ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_83911.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_83911.png'}, {'type': 'text', 'text': "\nClick on 'redirect to April Newsletter page'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043917_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043917_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'New Tab'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_175330_before_screenshot_sub2.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_543001.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_14049.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_41/C4web50k-2_184516137-split-0.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_668397.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_668397.png'}, {'type': 'text', 'text': "\nClick on 'Online Application Opens in new window'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_175330_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: Icons_Badge2_M'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_543001.png'}, {'type': 'text', 'text': '\nClick on \'Blue Bungalow Information, Reviews & coupons, in the "Best Cyber Monday Early Deals 2023" section\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_41/C4web50k-2_184516137-split-0.png'}, {'type': 'text', 'text': '\nDetermine the spatial location of the UI element Home with bounding box in the image. Please click on it.'}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_14049.png'}, {'type': 'text', 'text': '\nClick on \'A dark blue rectangular button centered on the page, containing the text "MBA University" in white with a white arrow to the right.\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_40503.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_40503.png'}, {'type': 'text', 'text': "\nClick on 'access Crystal Healing therapy page'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_106614.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240410_reorg_filtered/02617_label.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_522314.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_152726_original_screenshot.pngLoad image from ceph: 
langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_412260.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_49801.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_55511.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_367723.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_152726_original_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'javascript.format.enable'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_522314.png'}, {'type': 'text', 'text': "\nClick on 'March Madness (11 items)'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_412260.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Site Logo"\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_106614.png'}, {'type': 'text', 'text': "\nClick on 'filter by BURGERS'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_49801.png'}, {'type': 'text', 'text': "\nClick on 'toggle full screen mode on'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_367723.png'}, {'type': 'text', 'text': "\nClick on 'CLICK Centered, bold, dark grey hyperlink: 'King's 
Digital Lab''"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_55511.png'}, {'type': 'text', 'text': '\nClick on \'The element is a navigation menu item labeled "BRANDS" positioned between "SHOP" and "BLOG" on the website\'s main navigation bar.\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240410_reorg_filtered/02617_label.png'}, {'type': 'text', 'text': '\n简单介绍下marker为6的元素。 Please double-click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_808206.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_808206.png'}, {'type': 'text', 'text': "\nClick on 'WordPress site by Miramedia, titled Miramedia, a WordPress agency in Kent'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_398011.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_040952_before_screenshot.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_139768.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagemobile-327.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_533388.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_211141_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_211141_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Format Cell Number'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_398011.png'}, {'type': 'text', 'text': "\nClick on 'Follow %site-company-name% on Facebook'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_040952_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'CNN'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_533388.png'}, {'type': 'text', 'text': "\nClick on 'Red-orange dual-circle SVG icon with black border, indicating payment method or status'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_139768.png'}, {'type': 'text', 'text': "\nClick on 'CLICK Women, at the top of the page'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagemobile-327.png'}, {'type': 'text', 'text': '\nLocate the +49 (0)531 1805092222 in the picture and give its boundary details. 
Please click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_277502.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_277502.png'}, {'type': 'text', 'text': '\nClick on \'link titled "REAL ESTATE"\''}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_123351_before_screenshot.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagePlayer_FM_2024_3_4_23_31-566.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_394039.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_123351_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Google Cloud Platform'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_394039.png'}, {'type': 'text', 'text': "\nClick on 'View Acid Brown'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagePlayer_FM_2024_3_4_23_31-566.png'}, {'type': 'text', 'text': '\nMark the area occupied by the UI element https://dash.applovin.com/p/how-applovin-shows-you-ads?id=976dba45bb43a2035ee49d19eb79121fe63532bb with bounding box in the provided image. 
Please right-click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_68843.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_68843.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Lowry Logo"\''}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_15_5_f78e02999cc74e2dba31c15df67d00fd-4.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_15_5_f78e02999cc74e2dba31c15df67d00fd-4.png'}, {'type': 'text', 'text': '\nDescribe the positioning of the UI element New York with bounding box in the image. Please click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_354661.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_354661.png'}, {'type': 'text', 'text': "\nClick on 'Discussion Forum'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_173359.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_104455.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_573622.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_062853_before_screenshot_sub3.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_690068.pngLoad image from ceph: 
langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_130319_before_screenshot.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_210216_before_screenshot_sub3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_062853_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'next'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_573622.png'}, {'type': 'text', 'text': "\nClick on 'the textarea'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_210216_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Page right'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_130319_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'The Weather Channel'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_173359.png'}, {'type': 'text', 'text': "\nClick on 'check to add keyword 'Dufur''"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_104455.png'}, {'type': 'text', 'text': '\nClick on \'The blue button labeled "See Our Milestones" located below the "Our Purpose" section.\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_690068.png'}, {'type': 'text', 'text': '\nClick on \'Gradient green link with partially hidden sans-serif text, 
under "Help Id Saccamoeba?" in "Help Id Saccamoeba?"\''}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_021600_before_screenshot.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_114221_original_screenshot_sub1.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_9_23_46_ec53717b8d864416ae94b58002b3336e-1.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_183590.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_181155.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_51656.png Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_020737_before_screenshot_sub2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_021600_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Fork 95'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_181155.png'}, {'type': 'text', 'text': '\nClick on \'link for "Technology"\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_114221_original_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Properties'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_183590.png'}, {'type': 'text', 'text': "\nClick on 'Our commitment to accessibility'"}]}]conv: [{'role': 'user', 'content': 
[{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_9_23_46_ec53717b8d864416ae94b58002b3336e-1.png'}, {'type': 'text', 'text': '\nProvide the bounding box for the UI element Tue, Apr 9 seen in the image. Click on it.'}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_51656.png'}, {'type': 'text', 'text': "\nClick on 'text input area'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_020737_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Enable Split Screen'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_022035_before_screenshot_sub2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_022035_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Payment Terms'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_055353_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_055353_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'The Weather Channel - MSN'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_159344.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_159344.png'}, {'type': 'text', 'text': "\nClick on 'navigate to episode 34 page'"}]}] Load image from ceph: 
langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_163642_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_163642_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Mark as Decorative'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_124529_before_screenshot.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_041732_before_screenshot_sub3.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_480759.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/WPS/screen_00000213.jpgLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_711093.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_162692.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_300340.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_162692.png'}, {'type': 'text', 'text': "\nClick on 'Navigation'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_300340.png'}, {'type': 'text', 'text': "\nClick on 'the text box'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_480759.png'}, {'type': 'text', 'text': "\nClick on 'Graham Cracker Crust button'"}]}]conv: [{'role': 'user', 
'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_041732_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: sb_feedback'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_124529_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'View Side by Side'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_711093.png'}, {'type': 'text', 'text': '\nClick on \'Light blue to dark blue gradient button with white "X" icon and rounded rectangular shape\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/WPS/screen_00000213.jpg'}, {'type': 'text', 'text': '\n输出图片中文字和位置。 Right-click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_532549.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_044550_before_screenshot_sub1.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_528649.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_485123.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160305_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_044550_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft Start Gaming'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_528649.png'}, {'type': 'text', 'text': "\nClick on 'Favorites'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_532549.png'}, {'type': 'text', 'text': "\nClick on 'World Wildlife Day History, Top Tweets, Facts & Events'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160305_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Minimize'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_485123.png'}, {'type': 'text', 'text': '\nClick on \'DISCLOSURE link, at the right side of "gritty", to the top left of "gritty"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_97025.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_97025.png'}, {'type': 'text', 'text': "\nClick on 'Basket'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_033051_before_screenshot_sub0.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_228370.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/WPS/screen_00000471.jpg Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_127611.pngconv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_228370.png'}, {'type': 'text', 'text': "\nClick on 'GridForceOne'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_033051_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'INICIO'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_127611.png'}, {'type': 'text', 'text': "\nClick on 'visit Yoga Courses page'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/WPS/screen_00000471.jpg'}, {'type': 'text', 'text': '\n描述下图里有什么 Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageCastbox_2024_3_8_22_22-71.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageCastbox_2024_3_8_22_22-71.png'}, {'type': 'text', 'text': '\nLocate the specific Open the home page and its bounding box within this image. Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240904_154344_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240904_154344_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Information'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_105883.png'}, {'type': 'text', 'text': '\nClick on \'The word "HOME" located at the top navigation bar.\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_134049_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_134049_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'MSNBC'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_122756_before_screenshot_sub3.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/ui_data/ui_home_screen_phone/ui_homescreen_phone_20240416_v7homescreen-1465.jpg
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/c0b31b2e-a2fe-43c1-aeb4-b17d8259923d.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_356029.png
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240325/20240325_filtered/hupu/screen_00000202.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_311014.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image':
'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/c0b31b2e-a2fe-43c1-aeb4-b17d8259923d.png'}, {'type': 'text', 'text': '\nFor the operation: Today 4:00 AM, output the bounding box of the region in the image that can complete the task. Please click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_356029.png'}, {'type': 'text', 'text': "\nClick on 'Fun Kids – the UK's children's radio station'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_311014.png'}, {'type': 'text', 'text': "\nClick on 'Add to cart: “Four Layers Storage Rectangle Shelf SQ-1967”, located on the left side of the webpage'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240325/20240325_filtered/hupu/screen_00000202.jpg'}, {'type': 'text', 'text': '\n标注出界面中编辑资料的位置和边界框坐标。 Click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/ui_data/ui_home_screen_phone/ui_homescreen_phone_20240416_v7homescreen-1465.jpg'}, {'type': 'text', 'text': '\n识别屏幕截图中中国国航元素的具体位置,输出坐标值。 Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_4/C4web50k-0_1681945-split-0.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/manmanmai/screen_00000072.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_14596.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_4/C4web50k-0_1681945-split-0.png'}, {'type': 'text', 'text': '\nWhere is the Quimbob located in the picture, and what are its bounding box coordinates? Please double-click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_14596.png'}, {'type': 'text', 'text': '\nClick on \'The "Sign in" option, located at the upper right corner of the webpage next to the "Cart" icon.\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/manmanmai/screen_00000072.jpg'}, {'type': 'text', 'text': '\n告诉我图片中的内容,详细一点。 Right-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image9179.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image9179.jpg'}, {'type': 'text', 'text': '\nWhat is the selected number in the "NOTES"? Explain your answer based on UI element.
Please click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/03394.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_040629_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_040629_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'View 939 comments'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_481273.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/03394.png'}, {'type': 'text', 'text': '\n请总结一下图片[[872, 51, 1000, 108]]框内的重要元素特征。 Click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_481273.png'}, {'type': 'text', 'text': '\nClick on \'button for "Enclosure Cards, Tags, Picks & Holders"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_236260.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_236260.png'}, {'type': 'text', 'text': '\nClick on \'link for "create an account", to the top right of "Page Toolbox", located at the top of the page\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_572875.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_691319.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_572875.png'}, {'type': 'text', 'text': "\nClick on 'View all posts by admin, at the center of the screenshot'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_566374.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_691319.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Light gray rectangle with dark gray rounded corner box of a trailer hitch extender", at the bottom of the image, above "TorkLift Wiring Harness Extension for 24" SuperTruss Hitch Extender - 7-Way RV", to the top right of "TorkLift Wiring Harness Extension for 24" SuperTruss Hitch Extender - 7-Way RV"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_566374.png'}, {'type': 'text', 'text': "\nClick on 'Contact Us & Locations'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160948_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160948_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Styles...'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_57/C4web50k-3_276242013-split-3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_57/C4web50k-3_276242013-split-3.png'}, {'type': 'text', 'text': '\nProvide the bounding box coordinates for Archives seen in this screen.
Right-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageIKEA_HK_2024_3_9_1_42-430.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_76051.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_20702.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_564898.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageIKEA_HK_2024_3_9_1_42-430.png'}, {'type': 'text', 'text': '\nPinpoint the location of the UI element Bedroom on the image with bounding box. Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_130319_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_20702.png'}, {'type': 'text', 'text': "\nClick on 'view reasons for the 'Cannot get normal audio data!' warning'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_76051.png'}, {'type': 'text', 'text': "\nClick on 'Astrid Lindgren (35) (Little People, BIG DREAMS): Sanchez Vegara, Maria'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_564898.png'}, {'type': 'text', 'text': '\nClick on \'link for "Privacy Policy"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_130319_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'View comments 987 Comment'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_204649.png
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/36717.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_204649.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Cokin M Size (P Series) product page'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_10_49_4b5f544e164a42e5bba1ff2c2028efaf-20.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_25147.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/36717.jpg'}, {'type': 'text', 'text': '\nShow me the bounding box coordinates that can provide an answer to the operation: select the third option click on second one.
Please click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_25147.png'}, {'type': 'text', 'text': "\nClick on 'Paypal'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_10_49_4b5f544e164a42e5bba1ff2c2028efaf-20.png'}, {'type': 'text', 'text': '\nIdentify the bounding box coordinates for the ONE-WAY in the screen. Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/Keep/screen_00000100.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/Keep/screen_00000100.jpg'}, {'type': 'text', 'text': '\n如果要找一些关于瑜伽的课程,你能帮我推荐一些吗?(with grounding) Right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_624873.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_114722_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_624873.png'}, {'type': 'text', 'text': "\nClick on 'Scroll back to Top, on the bottom left of the screenshot'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_114722_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Add Text'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_201313_before_screenshot_sub2.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_803825.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_201313_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Sheet1'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_803825.png'}, {'type': 'text', 'text': '\nClick on \'Halmstad University button, at the top of the screenshot, under "Partner"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_151163.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_151163.png'}, {'type': 'text', 'text': '\nClick on \'Copyright, on the left side of "My Account", under "Medical Care in Brockville, ON"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_567034.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_567034.png'}, {'type': 'text', 'text': "\nClick on 'Light gray rectangular button with rounded corners, featuring a darker gray horizontal line'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240320/20240320/qidiandushu/screen_00000105.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240320/20240320/qidiandushu/screen_00000105.jpg'}, {'type': 'text', 'text': '\n除了《问丹朱》,希行还写了哪些其他作品?(with grounding) Right-click on it.'}]}]
Load image from ceph:
langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image34666.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_312438.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_164620.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_325761.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_325761.png'}, {'type': 'text', 'text': "\nClick on 'text box'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_164620.png'}, {'type': 'text', 'text': "\nClick on 'Manage Preferences button, on the right side of the image'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_312438.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Blue logo with stylized alligator head, orange text \'ORANGE & BLUE\' and \'SPORTS NETWORK\'"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image34666.jpg'}, {'type': 'text', 'text': '\nHow many points will be deducted if translation is used? Explain your answer based on UI element. Click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_155854_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_155854_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Location'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_share/visual_with_bbox_images/ui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/00840.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_share/visual_with_bbox_images/ui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/00840.jpg'}, {'type': 'text', 'text': '\n描述一下图中被框选区域的内容,并描述这个元素属于什么种类。 Right-click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_051144_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_051144_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Sign in - Google Accounts'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140504_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140504_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'File Explorer - 1 running window'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_507285.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_398068.png
Load image from ceph:
langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_107300.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image34638.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_86262.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043412_before_screenshot.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_133328_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043412_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Extensions'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_107300.png'}, {'type': 'text', 'text': "\nClick on 'edit search keyword'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_133328_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Heat - Severe Heat severe warning'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_398068.png'}, {'type': 'text', 'text': "\nClick on 'Thumbnail-template--16646149734637__main-7'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_86262.png'}, {'type': 'text', 'text': "\nClick on 'Your e-mail address here'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_507285.png'}, {'type': 'text', 'text': '\nClick on \'button titled "October 2023"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image34638.jpg'}, {'type': 'text', 'text': '\nWhat permissions are required before we start singing? Answer the question using a single word or phrase. Please click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_160337_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_160337_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Control Panel - 1 running window'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_475503.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_115632_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_644803.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_97403.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_644803.png'}, {'type': 'text', 'text': "\nClick on 'Add to Wishlist'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_475503.png'}, {'type': 'text', 'text': '\nClick on \'the image of "google plus"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image':
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_115632_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Search icon'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_97403.png'}, {'type': 'text', 'text': '\nClick on \'Special Offers section that includes the text "Clearance items" and a price tag icon.\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_161942_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_161942_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Expand to see more New options'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_123601_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_123601_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: tab-20'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_379556.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_190246_before_screenshot.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_241610.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_379556.png'}, {'type': 'text', 'text': "\nClick on 'PERSONALIZED SKULL INVITATIONS'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_241610.png'}, {'type': 'text', 'text': "\nClick on 'navigate to carbon footprint reduction information'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageTIDAL_2024_3_5_12_30-158.png'}, {'type': 'text', 'text': '\nDetermine the position of Back in the image and its bounding box parameters. Click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_784075.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_784075.png'}, {'type': 'text', 'text': '\nClick on \'button for "EVENING OR DAY ACCESSORIES ON A CRUISE", on the left side of "BUZZ SOPHIE DIGARD"\''}]}]
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/6e2198f2-9d98-4465-a882-db290d4974ba.png'}, {'type': 'text', 'text': '\nShow the bounding box coordinates to click for completing the objective: North is up.
Please click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_130525_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Sign in to Chrome'"}]}]
TOKENIZERS_PARALLELISM=(true | false)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_157161.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_111113_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_567071.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_31402.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_453727.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_567071.png'}, {'type': 'text', 'text': "\nClick on 'Reefer Madness'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_111113_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'More actions for Sign in shortcut'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_453727.png'}, {'type': 'text', 'text': '\nClick on \'Light blue button with white upward-pointing arrow, interactive and accessible with tooltip "Scroll back to top"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_157161.png'}, {'type': 'text', 'text': '\nClick on \'the select dropdown menu, next to "Current filters:"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_31402.png'}, {'type': 'text', 'text': '\nClick on \'button labeled "JOIN"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_613838.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/xiechenglvxing/screen_00000008.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_613838.png'}, {'type': 'text', 'text': '\nClick on \'Golf (130 items), above "Dublin"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/xiechenglvxing/screen_00000008.jpg'}, {'type': 'text', 'text': '\n描述下我给的截图。 Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageWish_2024_3_8_3_9-421.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203710_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_111124_before_screenshot.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_31937.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_461528.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_78471.png
Load image from ceph:
langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_217281.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_111124_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Sign InAccount'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_461528.png'}, {'type': 'text', 'text': '\nClick on \'More navigation options, next to "Research output"\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_217281.png'}, {'type': 'text', 'text': "\nClick on 'Blue, gray, and black tabs with rounded corners, centered magnifying glass icon for search'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_31937.png'}, {'type': 'text', 'text': "\nClick on 'subscribe to job notifications from this recruiter'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_78471.png'}, {'type': 'text', 'text': '\nClick on \'here to give likes to this post., Thumbs-up approval button: click to give likes, above "2020-06-22 12:50 PM"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageWish_2024_3_8_3_9-421.png'}, {'type': 'text', 'text': '\nCould you locate the Navigate up on this image and specify its bounding box dimensions? 
Please double-click on it.'}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203710_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on '35 Like'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_024222_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_024222_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Skip to main search results'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_130906_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_130906_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'amazon - Search Images'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_045240_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_045240_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Recipes - MSN'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image15514.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image15514.jpg'}, {'type': 'text', 'text': '\nWhich is the selected subject? Answer the question using a single word or phrase. 
Please click on it.'}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image15014.jpgLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_134174.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_219263.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_702285.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_141320_before_screenshot.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_355352.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_141320_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Insert Table'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_134174.png'}, {'type': 'text', 'text': '\nClick on \'It is a link titled "ListenBack, Shows," under the "Podcasts" section of the webpage.\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_702285.png'}, {'type': 'text', 'text': "\nClick on 'the text input field'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_355352.png'}, {'type': 'text', 'text': '\nClick on \'the image of "goodreads.com"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_219263.png'}, 
{'type': 'text', 'text': "\nClick on 'the text area'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image15014.jpg'}, {'type': 'text', 'text': '\nWhat is the quantity of water in the daily goal? Explain your answer based on UI element. Please double-click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031223_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031223_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Home'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_073428_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_073428_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Search for Images '"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_070342_before_screenshot_sub1.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/54e0c9ce-120e-4796-b8d3-bd7d93c97e09.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_282224.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_002211_before_screenshot.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_215421_before_screenshot.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802162453569/main_59.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_070342_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Sign in to your account'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_002211_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Customer Service'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/54e0c9ce-120e-4796-b8d3-bd7d93c97e09.png'}, {'type': 'text', 'text': '\nIndicate the precise location with bounding box that can solve the problem: Storage. Please click on it.'}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_282224.png'}, {'type': 'text', 'text': "\nClick on 'Orange envelope icon with white outline and shadow on a solid background'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_215421_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'View'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802162453569/main_59.png'}, {'type': 'text', 'text': '\n提供一个结构合理、准确的json布局描述,以满足开发者对软件产品用户界面设计的需求。\n限制:json布局描述必须遵守软件设计规范,易于实现,同时确保信息的准确性和完整性。\n输出格式:json格式,包含node_type, location, content等属性。\n工作流程:\n\n分析软件产品的用户界面设计需求。\n根据分析结果,确定json布局描述中的node_type, location, content等属性。\n编写符合规范的json布局描述。 Right-click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_367412.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_367412.png'}, {'type': 'text', 'text': '\nClick on \'the image of "OHSAA Logo"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_224757.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_794776.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_773553.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/zhongguoliantong/screen_00000060.jpg huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_773553.png'}, {'type': 'text', 'text': '\nClick on \'Company image, under "You might also like"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_224757.png'}, {'type': 'text', 'text': '\nClick on \'**Blue bold text "Superior Patient - From Moderate" with light blue background, hyperlink to patient care/treatment resource.**\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/zhongguoliantong/screen_00000060.jpg'}, {'type': 'text', 'text': '\n这个UI界面是什么 Click on it.'}]}] Load image from ceph: 
langchao2:s3://st2pj/20250222/imagesmnt/afs/mashengnan/GUI/make_data/1212_aw_ui/data/1218_make_dataset/SimpleCalendar/calendar_wangting_0059/0008.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/imagesmnt/afs/mashengnan/GUI/make_data/1212_aw_ui/data/1218_make_dataset/SimpleCalendar/calendar_wangting_0059/0008.jpg'}, {'type': 'text', 'text': '\nHighlight the click area with bounding box to complete the task: Bottom navigation button that switches view to regular event calendar interface. Please double-click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_083908_before_screenshot_sub3.pngimage from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_639719.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_140160.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_023326_before_screenshot_sub2.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_178706.pngTOKENIZERS_PARALLELISM=(true | false) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_023326_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'CGI'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043151_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Chrome Web Store'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_140160.png'}, {'type': 'text', 'text': '\nClick on \'the image of "myhalo", to the left of "Home"\''}]}]conv: [{'role': 
'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_639719.png'}, {'type': 'text', 'text': "\nClick on 'text area'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_178706.png'}, {'type': 'text', 'text': '\nClick on \'the image of "thumbs up", at the center of the webpage\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_178273.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_178273.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Sneakers product page'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_83989.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_83989.png'}, {'type': 'text', 'text': "\nClick on 'view red PowerPoint templates'"}]}] huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) {'type': 'text', 'text': "\nClick on 'display information on treating termites'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_774592.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_774592.png'}, {'type': 'text', 'text': '\nClick on \'Inactive white rectangle with thin black border, in "Deliver to the UK"\''}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_164536_before_screenshot.png Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/1246a2f4-379a-40ad-89dc-7639d69f445b.png efore_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'CloudCompare'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/1246a2f4-379a-40ad-89dc-7639d69f445b.png'}, {'type': 'text', 'text': '\nPlease output the bounding box of the region correctly responding to this task: Recipes. 
Please click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_220015.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_220015.png'}, {'type': 'text', 'text': "\nClick on 'the text input box'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_124223_before_screenshot.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_73693.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_228677.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_290560.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image26711.jpgLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_583973.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_42535.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_124223_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'google_privacy_policy_en.pdf'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_583973.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Leaders Corporate"\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_290560.png'}, {'type': 'text', 'text': '\nClick 
on \'the text input box, under "Your First Order"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image26711.jpg'}, {'type': 'text', 'text': '\nCreate a tree-style visualization of the page layout using the provided image. Please double-click on it.'}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_73693.png'}, {'type': 'text', 'text': '\nClick on \'The element is a button labeled "CONTACT" and is located in the top navigation bar of the webpage. It is positioned between the "SCHEDULE APPOINTMENT" and the "GIVE A DONATION" buttons.\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_42535.png'}, {'type': 'text', 'text': "\nClick on 'nearby AC repair companies link'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_228677.png'}, {'type': 'text', 'text': '\nClick on \'Any canon references to US unit locations within Korea, labeled I was wondering if anyone knows if there was ever any more location data beyond what is in the US Combat Vehicle guide. 
Even the unit histories offer very little insight so I was hoping there might be a Challenge article or something, above "List of Canon unit locations and strengths?", below "Semi Canon Europe Strike map"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_133867.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_133867.png'}, {'type': 'text', 'text': "\nClick on 'Enter Your Email Address'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageBBC_Sport_2024_3_6_21_40-411.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_023712_before_screenshot.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_021937_before_screenshot_sub2.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_78013.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_132113_before_screenshot_sub0.pngages20240822_023712_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Blame'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_78013.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Catechism review page'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_021937_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'cockade of France'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageBBC_Sport_2024_3_6_21_40-411.png'}, {'type': 'text', 'text': '\nLocate the Navigate up in this image and provide its bounding box. Please click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_43714.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_43714.png'}, {'type': 'text', 'text': '\nClick on \'Statistics by Country, above "Most Popular Authors"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_132113_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Uncategorized'"}]}] eb_hybrid_773k_max_25qa_filtered_710195.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_710195.png'}, {'type': 'text', 'text': '\nClick on \'button labeled "Uncategorised", next to "December 3rd, 2020"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_9/C4web50k-0_2720798-split-1.png'}, {'type': 'text', 'text': '\nIllustrate the location of the UI element No technical knowledge is required to test or use our service. Try it now! within the picture with bounding box. 
Click on it.'}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageWish_2024_3_8_3_9-91.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_15/C4web50k-0_697916-split-4.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_110642.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/dewu/screen_00000267.jpgLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_15663.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageWish_2024_3_8_3_9-91.png'}, {'type': 'text', 'text': '\nProvide the dimensions and location of the UI element Search on Wish with bounding box in the image. 
Right-click on it.'}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_110642.png'}, {'type': 'text', 'text': "\nClick on 'CLICK Bold, dark gray text 'Our Story' within a rectangular clickable box'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_15663.png'}, {'type': 'text', 'text': '\nClick on \'The element is a bright orange button with the text "Get In Touch" located on the upper right side of the webpage, in line with the navigation menu items "Our Process," "Results," and "Services."\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_15/C4web50k-0_697916-split-4.png'}, {'type': 'text', 'text': "\nCan you find the Paul & Jamie take a weekend vacation. Jamie obsesses over leaving and what to pack while Paul tries to keep it simple. Once there, Jamie initially finds it hard to relax. Soon Jamie embraces the area and Paul wants to leave.'s position and bounding box in this image? 
Please click on it."}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/dewu/screen_00000267.jpg'}, {'type': 'text', 'text': '\n这张图片是什么 Click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_253733.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_141009_before_screenshot_sub0.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_141612_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_141612_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: tab-13'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_141009_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Standings'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_253733.png'}, {'type': 'text', 'text': "\nClick on 'Paul Danziger profile'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_177134.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_177134.png'}, {'type': 'text', 'text': "\nClick on 'Health Topics'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageHome_shopping_2024_3_9_3_28-338.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageHome_shopping_2024_3_9_3_28-338.png'}, {'type': 'text', 'text': '\nProvide the location coordinates and bounding box of Cancel in this image. Please click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_171740_before_screenshot.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image22864.jpgLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_114066.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_171740_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Copilot (Ctrl+Shift+.)'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_114066.png'}, {'type': 'text', 'text': "\nClick on 'expand Adoption, Surrogacy, Fertility menu'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image22864.jpg'}, {'type': 'text', 'text': '\nHow many topics are in the "General Community Chat"? Answer the question using a single word or phrase. 
Click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_594959.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_594959.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Lauren"\''}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240904_140819_screenshot.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160822_before_screenshot.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_775131.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_241996.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_164888.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_97891.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_241996.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the information page about payment process'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_164888.png'}, {'type': 'text', 'text': '\nClick on \'the input box, above "Mailing List"\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_775131.png'}, {'type': 'text', 'text': "\nClick on 'Purchase a copy of Amor and Exile!'"}]}]conv: [{'role': 'user', 'content': [{'type': 
'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160822_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on '566ba9ff-a5b0-4b6f-bbdf-c3ab41993fc2'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_751169.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_751169.png'}, {'type': 'text', 'text': "\nClick on 'Cougar_Crest_Winery_20080509_255.jpg'"}]}] s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_12503.png'}, {'type': 'text', 'text': "\nClick on 'Apricus Fashion – Premiere Women's Fashion at Affordable Prices - Affordable ladies fashion dress online store for every occasion. 
Shop now for the latest styles of Dresses with 30% discount and more!'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_142836_before_screenshot_sub1.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/ui_data/ui_home_screen_phone/ui_homescreen_phone_20240416_v7homescreen-780.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_142836_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Day 1: Arriving in Yemen (surreal to be here) - YouTube'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/ui_data/ui_home_screen_phone/ui_homescreen_phone_20240416_v7homescreen-780.jpg'}, {'type': 'text', 'text': '\n分析图像,找出所有的APP并输出边界框坐标。 Please double-click on it.'}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/38439.jpg
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/tengxunTV/screen_00000011.jpg
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/3083.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_558982.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_141234_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_024116_before_screenshot.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_021440_before_screenshot_sub1.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_401964.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/38439.jpg'}, {'type': 'text', 'text': '\nIndicate the location with bounding box to solve the task: select 11 . 5 miles select 2 option. Please right-click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/3083.jpg'}, {'type': 'text', 'text': '\nTask: select payment, output the bounding box of the area in the image that can complete the task. Please double-click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_558982.png'}, {'type': 'text', 'text': "\nClick on 'blogcomments'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_141234_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Headings'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_024116_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft Learn Developer & IT'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_021440_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Sign up'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_401964.png'}, {'type': 'text', 'text': "\nClick on 'admin link, on the bottom left of the webpage'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/tengxunTV/screen_00000011.jpg'}, {'type': 'text', 'text': '\n请对这张图片进行OCR处理,并标注出每个识别出的文字的位置坐标。 Please click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_171424_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_171424_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Screen Recording'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_125033_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_125033_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'HTC'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_133437_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_133437_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Streaming Coverage | T3 - Sleeping'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_042416_before_screenshot.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_37576.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023519_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_128756.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_37576.png'}, {'type': 'text', 'text': "\nClick on 'OKiBook'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image':
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_042416_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'initial assessment'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023519_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Cloud Computing Services | Microsoft Azure'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_128756.png'}, {'type': 'text', 'text': '\nClick on \'A magnifying glass icon located next to the search bar at the top right corner of the webpage, adjacent to the "CART" button.\''}]}]
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_426742.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_426742.png'}, {'type': 'text', 'text': '\nClick on \'Gray uppercase "WISCONSIN" text on white background, centered with small margins, at the center of the screenshot\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_22243.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_22243.png'}, {'type': 'text', 'text': '\nClick on \'link titled "Beauty Courses", at the center of the image, in the "Error 404 - page not found"\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image35802.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image35802.jpg'}, {'type': 'text', 'text': '\nWhat application is asking for permission? Explain your answer based on UI element. Please click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_645024.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_645024.png'}, {'type': 'text', 'text': "\nClick on 'the Quantity text box'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_178336.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Pontifex Physiotherapy"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_462457.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802185928664/main_239.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_798904.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_88870.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_462457.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Zetron"\''}]}]
cessing/screenshotsweb_hybrid_773k_max_25qa_filtered_798904.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Krysta Maravilla"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_140950_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Skip to footer'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image':
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_88870.png'}, {'type': 'text', 'text': '\nClick on \'Next Page », labeled NOLMBW: Chapter 37, to the top right of "Facebook"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_123618_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Login'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802185928664/main_239.png'}, {'type': 'text', 'text': '\n图片中[[48, 469, 960, 759]]这个区域的内容。 Please click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_95774.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_95774.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Tools page'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_043407_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Weather'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_433781.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_433781.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Thomas Kinkade\'s Festive Fire Station(2019)"\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_28/C4web50k-1_94171232-split-0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_28/C4web50k-1_94171232-split-0.png'}, {'type': 'text', 'text': '\nCan you determine the location coordinates of Home displayed in the screen? Right-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageVivid_Seats_2024_3_7_18_35-434.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_382813.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_033306_before_screenshot_sub1.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_190113_before_screenshot.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240422_reorg_filtered/00546.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageVivid_Seats_2024_3_7_18_35-434.png'}, {'type': 'text', 'text': '\nIdentify and give the bounding box coordinates of Back seen in the image. Please double-click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_382813.png'}, {'type': 'text', 'text': "\nClick on 'Print The Post'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_033306_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'This site scope'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_190113_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Reading View'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240422_reorg_filtered/00546.jpg'}, {'type': 'text', 'text': '\n我想查看当前播放列表,请问应该点击哪个按钮?(with grounding) Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_335646.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_335646.png'}, {'type': 'text', 'text': "\nClick on 'Gradient heart icon with thin outline, centered in a simple and modern design'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_015638_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_015638_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Free AI Writing Assistance for Students | Grammarly'"}]}]
ata/OS-Atlas/desktop_domain/windows_images20240828_194744_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_170782.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_509026.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_030920_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194744_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Artificial Intelligence'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_170782.png'}, {'type': 'text', 'text': "\nClick on 'Bikebarn Racing Home Page'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_524093.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Email Icon", under the "Rocky Mountain The Nub Decocking Point Durable Machine Aluminum Design"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_509026.png'}, {'type': 'text', 'text': "\nClick on 'textarea'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_030920_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Era: Evolution Era: Evolution'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/f8fae48e-2e46-4548-8474-9348dc404a47.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image':
'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/f8fae48e-2e46-4548-8474-9348dc404a47.png'}, {'type': 'text', 'text': '\nHighlight the click area with bounding box to complete the task: DEF. Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image65571.jpg
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_135005_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_135005_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Favorites bar'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image65571.jpg'}, {'type': 'text', 'text': '\nOutput the tree-structured page layout corresponding to the screenshot. Click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_053435_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_053435_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Sign in - Google Accounts'"}]}]
ingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_734237.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_734237.png'}, {'type': 'text', 'text': "\nClick on 'CLICK Tech Spec'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageQuora_2024_3_9_18_46-205.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageQuora_2024_3_9_18_46-205.png'}, {'type': 'text', 'text': '\nAscertain the location of the UI element Me in the screen with bounding box. Click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_034752_before_screenshot.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_161459_before_screenshot_sub2.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_787318.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_74388.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_034752_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Something else is wrong Something else is wrong'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_161459_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Positive negative'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_787318.png'}, {'type': 'text', 'text': "\nClick on 'Dark blue, bold, underlined 'ISO' hyperlink'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_74388.png'}, {'type': 'text', 'text': '\nClick on \'the Colour select dropdown menu, above "Quantity:", under the "Tarpaulin Sheeting Flame Retardant 8m x 17m"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_025232_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_025232_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Industrial equipmentExpand: Industrial equipment'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_142116_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_142116_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Start'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_429402.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_38935.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_38935.png'}, {'type': 'text', 'text': "\nClick on 'Open Legal Notice'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_429402.png'}, {'type': 'text', 'text': "\nClick on 'select dropdown list'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_113420_before_screenshot_sub3.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_17478.png
es20240827_113420_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Google Cloud documentation'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_17478.png'}, {'type': 'text', 'text': '\nClick on \'link for "Accessibility", to the right of "Community Links"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_001134_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_001134_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Python'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_407746.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_407746.png'}, {'type': 'text', 'text': '\nClick on \'Recently Viewed, at the right side of "Sign In", under "(888) 618-1802"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_034816_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_034816_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Microphone Ask (default)'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_48/C4web50k-3_274610391-split-4.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_48/C4web50k-3_274610391-split-4.png'}, {'type': 'text', 'text': '\nPlease locate Range of 50’-130’ (depending on the home’s construction) provides a wide coverage area in the image and outline its bounding box. Please click on it.'}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_car/20240606/00589.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140903_before_screenshot_sub1.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_201236_before_screenshot.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_050924_before_screenshot.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_692815.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140903_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Symbol'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_050924_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Sign in - Google Accounts'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_201236_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'How to Start Strength Training If You've Never Done It'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_692815.png'}, {'type': 'text', 'text': '\nClick on \'Light gray bordered white box with centered bold "COW" text\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_car/20240606/00589.png'}, {'type': 'text', 'text': '\n请标出界面中电池电量显示标签图标的位置,给出边界框坐标。 Click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_035005_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_035005_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'View comments 4 Comment'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_792825.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_15296.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image15969.jpg'}, {'type': 'text', 'text': '\nWhat is the name of the application? Explain your answer based on UI element. Please right-click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_15296.png'}, {'type': 'text', 'text': '\nClick on \'The element is a navigation menu item labeled "TESTIMONIALS" located on the top blue navigation bar between the "PORTFOLIO" and "CONTACT" menu items.\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_546371.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_546371.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Manchester SEO Worker Bee", at the top left corner of the screenshot\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_792825.png'}, {'type': 'text', 'text': "\nClick on 'to login, on the top right of the page'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_55779.png'}, {'type': 'text', 'text': "\nClick on 'view detailed information about the Minolta Pro Shot 20 Mega Pixel HD Digital Camera'"}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/26576.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/26576.jpg'}, {'type': 'text', 'text': '\nIdentify the bounding box for the element in the screen to complete the task: click on expand all. Click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_195727.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_195727.png'}, {'type': 'text', 'text': "\nClick on 'view detailed ban details'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/jiakaobaodian/screen_00000459.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/jiakaobaodian/screen_00000459.jpg'}, {'type': 'text', 'text': '\n请从这张图片中提取出文字,并且给出这些文字在图片中的位置。 Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_560698.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_560698.png'}, {'type': 'text', 'text': "\nClick on 'to share on Twitter'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194334_before_screenshot_sub1.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_813710.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_063724_before_screenshot_sub1.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_69/C4web50k-3_275273181-split-3.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_719422.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_191813_before_screen
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_719422.png'}, {'type': 'text', 'text': "\nClick on 'Projects, at the top of the screenshot'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_191813_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Share'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_063724_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Privacy Help Center - Policies Help'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_813710.png'}, {'type': 'text', 'text': "\nClick on 'Sleek gradient link with blurred date
and title'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_69/C4web50k-3_275273181-split-3.png'}, {'type': 'text', 'text': '\nCan you detect the location and bounding box of The Dark Side of ‘Clean Perfumes’: another Fragrant Deception here? Please click on it.'}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240327/20240327_filtered/sougoushurufa/screen_00000162.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240327/20240327_filtered/sougoushurufa/screen_00000162.jpg'}, {'type': 'text', 'text': '\n描述下截图的内容。 Please right-click on it.'}]}] huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
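The tokenizers warning repeated throughout this log can be silenced as it suggests, by setting `TOKENIZERS_PARALLELISM` before any tokenizer work happens in the parent process (i.e. before DataLoader workers fork). A minimal sketch; the surrounding training setup is assumed and not shown in this log:

```python
import os

# Must run before `tokenizers` uses its thread pool in the parent process,
# otherwise the fork-after-parallelism warning still fires once per worker.
os.environ["TOKENIZERS_PARALLELISM"] = "false"
```

Setting it in the launch environment (e.g. exporting it before `torchrun`) works equally well and avoids ordering concerns inside the script.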
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_153460.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_358531.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_34/C4web50k-2_183322085-split-3.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_035405_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_035405_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Local'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_153460.png'}, {'type': 'text', 'text': "\nClick on 'View the GVSU Quick Links'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_358531.png'}, {'type': 'text', 'text': "\nClick on 'Community Notices link'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_34/C4web50k-2_183322085-split-3.png'}, {'type': 'text', 'text': '\nMap out the exact position of the UI element More On with bounding box shown in the image. Please click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_168500.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_168500.png'}, {'type': 'text', 'text': '\nClick on \'button titled "Ries, Lydia"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_30425.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_30425.png'}, {'type': 'text', 'text': '\nClick on \'A teal-colored button with white text that reads "Click Here to Fill Out BIO Link."\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/ui2json_app_d20240822_v1/collect/Soul_screen_00000018.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_108041.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_110940_before_screenshot.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_112033.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_108041.png'}, {'type': 'text', 'text': '\nClick on \'the image of "en_US"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_110940_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Graphite Purple'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_666371.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_112033.png'}, {'type': 'text', 'text': "\nClick on 'toggle list view'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/ui2json_app_d20240822_v1/collect/Soul_screen_00000018.jpg'}, {'type': 'text', 'text': '\n将UI界面的结构化信息导出为json格式。 Click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_666371.png'}, {'type': 'text', 'text': '\nClick on \'the image of "D’oh!"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_003537_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_003537_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft Edge'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_177676.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203257_before_screenshot.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_32/C4web50k-1_95165518-split-1.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_649910.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_119802.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_741725.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_123008_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_93232.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_220512_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203257_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Facebook'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_123008_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Compute, containers, and serverless'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_649910.png'}, {'type': 'text', 'text': "\nClick on 'text area'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_32/C4web50k-1_95165518-split-1.png'}, {'type': 'text', 'text': '\nCan you determine the location coordinates of March 2014 displayed in the screen? Please right-click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_741725.png'}, {'type': 'text', 'text': '\nClick on \'BLOG button, to the top right of "ACTIVATED WHITE CHARCOAL POWDER"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_220512_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Refresh All'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_93232.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the '11 Must-Play Political Strategy Video Games for Any Gamer' article page'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_119802.png'}, {'type': 'text', 'text': "\nClick on 'Cartoon face with homemade mask thumbnail, on the right side of the webpage'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_153404_before_screenshot.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_052059_before_screenshot_sub3.png
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240322/20240322_filtered/baidutieba/screen_00000268.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_95819.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_135310_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_153404_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Show desktop'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_052059_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'MarketWatch'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_95819.png'}, {'type': 'text', 'text': "\nClick on 'x, labeled Share This'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_135310_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'WeChat - 1 running window'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240322/20240322_filtered/baidutieba/screen_00000268.jpg'}, {'type': 'text', 'text': '\n找出界面中通知的位置,输出其边界框坐标值。 Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_333351.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_13528.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_333351.png'}, {'type': 'text', 'text': "\nClick on 'click to open'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_13528.png'}, {'type': 'text', 'text': "\nClick on 'navigate to TONIC 2018 event page'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_131345_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_131345_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Privacy Checkup'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_022254_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_022254_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Create new...'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_81421.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_81421.png'}, {'type': 'text', 'text': "\nClick on 'View all posts by Jeremy Hess'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_045858_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_045858_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Google Account'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_632225.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/tantan/screen_00000136.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_48199.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_632225.png'}, {'type': 'text', 'text': '\nClick on \'the image of "joinnewsletter"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_48199.png'}, {'type': 'text', 'text': '\nClick on \'The element is a menu item labeled "EVENTS" located in the navigation bar near the top of the webpage.\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/tantan/screen_00000136.jpg'}, {'type': 'text', 'text': '\n能否帮忙从这张图片中提取出文字,并且给出这些文字在图片中的位置? Right-click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_135901_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_50933.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_124437_before_screenshot.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_65/C4web50k-100_79730-split-0.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_339519.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_135901_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'System'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_124437_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Image result for amazon'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_50933.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Studio Osmosis"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_339519.png'}, {'type': 'text', 'text': '\nClick on \'button labeled "Blog"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_65/C4web50k-100_79730-split-0.png'}, {'type': 'text', 'text': '\nHow can you locate the Fellows of Professional Orgs and its bounding box in this image? Right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_55040.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_55040.png'}, {'type': 'text', 'text': '\nClick on \'The element is a tab labeled "Том3" located above the "Gallery" section heading.\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_033307_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_033307_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'News'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_032818_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_032818_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Yard, Garden & Outdoor Living'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_202018_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_202018_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Visual Studio Code - 1 running window'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_153645_original_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_153645_original_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'SeeClickFix/dev.seeclickfix.com'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_112345_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_112345_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Action Center, 2 new notifications'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_143815_original_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_143815_original_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Delete Page Break'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_020108_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_020108_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'https://openrgb.org'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_218233.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_218233.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the article about hurricane havoc'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_221452_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_221452_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Filter'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023414_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023414_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Flintlock: The Siege of Dawn Jul 18, 2024'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image11560.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image11560.jpg'}, {'type': 'text', 'text': '\nThrough what application can we log in? Explain your answer based on UI element. Right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_217804.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_217804.png'}, {'type': 'text', 'text': '\nClick on \'the image of "t-party", under "T Party", in the "Find T-shirts" section\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_236087.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_297283.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image56759.jpg
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_042046_before_screenshot.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_29781.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_297283.png'}, {'type': 'text', 'text': '\nClick on \'News & Media, under the "Mount Rushmore Tea Gift"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_042046_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'SpaceX'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image56759.jpg'}, {'type': 'text', 'text': '\nWho is the singer of the song? Explain your answer based on UI element. Click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_236087.png'}, {'type': 'text', 'text': "\nClick on 'Help Center and other resources'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_29781.png'}, {'type': 'text', 'text': '\nClick on \'The text link that reads "Explore latest Editions."\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_644561.png'}, {'type': 'text', 'text': '\nClick on \'category Real Estate, in "Past Offers" section\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_362608.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_362608.png'}, {'type': 'text', 'text': "\nClick on 'Green heart-shaped button: 'Add to Favorites', on the bottom right of the webpage'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_120754_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_120754_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Google Cloud pricing'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_96685.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_96685.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Winnie The Pooh: Whimsical Women Oversized T-Shirts Online"\''}]}]
s/multi_modal/agent_data/rico/dataset/image48542.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_531323.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_034353_before_screenshot_sub1.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240904_140006_screenshot.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_483541.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_531323.png'}, {'type': 'text', 'text': "\nClick on 'the input box'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_203356.png'}, {'type': 'text', 'text': "\nClick on 'Visit event details page'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image45366.jpg'}, {'type': 'text', 'text': '\nWhat is the number of dislikes of the video? Explain your answer based on UI element. Click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_483541.png'}, {'type': 'text', 'text': "\nClick on 'copy link'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_034353_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Refresh daily'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240904_140006_screenshot.png'}, {'type': 'text', 'text': "\nClick on '\uec00'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image48542.jpg'}, {'type': 'text', 'text': '\nBased on the screenshot, output a hierarchical representation of the page layout. Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_175815_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_175815_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'This command is currently disabled.'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_134416_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_134416_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Grid Table 1 Light - Accent 6'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194512_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_043408_before_screenshot_sub1.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_034252_before_screenshot_sub0.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image10757.jpg
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/zhongguoyidong/screen_00000301.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_409916.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131104_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131104_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'HUAWEI'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_043408_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Stocks - MSN'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_409916.png'}, {'type': 'text', 'text': "\nClick on 'I AGREE WITH ALL COOKIES, at the center of the page'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_034252_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Following'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image10757.jpg'}, {'type': 'text', 'text': '\nWhat is the user name? Explain your answer based on UI element. Click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194512_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'User Privacy Notice | eBay'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/zhongguoyidong/screen_00000301.jpg'}, {'type': 'text', 'text': '\n能否将这张图片中的文字提取出来,并且给出它们在图片中的位置坐标? Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_021828_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_021828_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on '15 Ways Modern Life Contradicts the Teachings of Jesus'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_6300.png
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240327/20240327_filtered/wangyiyunyingyue/screen_00000267.jpg
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_725796.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203610_before_screenshot_sub2.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_132573.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_6300.png'}, {'type': 'text', 'text': '\nClick on \'Audi oil change interval button, below "2007 audi b7 s4"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203610_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Privacy Policy'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_132573.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Ohio State decision dates 2021'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240327/20240327_filtered/wangyiyunyingyue/screen_00000267.jpg'}, {'type': 'text', 'text': '\n确定截图中次元漫游的位置,标记并提供边界框坐标。 Click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_725796.png'}, {'type': 'text', 'text': '\nClick on \'Light blue checkmark icon with darker blue checkmark inside a circle, next to "Payroll – up to 10 payslips and filing obligations - €100/month"\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_40/C4web50k-2_184235270-split-2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_40/C4web50k-2_184235270-split-2.png'}, {'type': 'text', 'text': '\nDetect the SwAbstractDialogFactory_Impl::GetTabPageCreatorFunc() and its bounding box coordinates in this image.
Please double-click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_191616_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_191616_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Center Shadow Rectangle'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageWeawow_2024_3_5_17_33-113.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_642722.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_671496.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_93382.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_65230.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_93382.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Vintage beige page with ornate Islamic calligraphy in black ink, worn and creased"\''}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_65230.png'}, {'type': 'text', 'text': "\nClick on 'navigate to detailed Semi-automatic Block Machine page'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_671496.png'}, {'type': 'text', 'text': "\nClick on 'Download CDI-01-1560.tif'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_642722.png'}, {'type': 'text', 'text': "\nClick on 'Version 2020 (20 items)'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageWeawow_2024_3_5_17_33-113.png'}, {'type': 'text', 'text': '\nLocate the Keyword search in this image and provide its bounding box. Please click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_083111_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_083111_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'WEB'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_548519.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_548519.png'}, {'type': 'text', 'text': "\nClick on 'Site Map'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_004840_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_004840_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Page down'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_072607_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_072607_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Cookie statement'"}]}] Load image from ceph: 
langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_499432.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_499432.png'}, {'type': 'text', 'text': "\nClick on 'Our work'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_121212_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_121212_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'About Us'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_082950_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_082950_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Bing Menu Bar'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_170526_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_170526_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Q2790: 100%'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_150995.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_150995.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Real Fur Pom Poms category'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_678409.pngLoad image from ceph: 
langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043933_before_screenshot.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_449467.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_030925_before_screenshot_sub0.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_288467.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_180205.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043933_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Extensions'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_678409.png'}, {'type': 'text', 'text': "\nClick on ':crazy:'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_288467.png'}, {'type': 'text', 'text': "\nClick on 'Subscribe'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_449467.png'}, {'type': 'text', 'text': '\nClick on \'the text input box, above "Vehicle Inspections"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_030925_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'New Tab'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_180205.png'}, {'type': 
'text', 'text': '\nClick on \'the select menu, to the top right of "Categories", located above "Recent Posts", in "Star Bharat Gears To Bring The Viewers Their Upcoming New Show Lakshmi Ghar Aayi" section\''}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/20240726140648323/main_179.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/20240726140648323/main_179.png'}, {'type': 'text', 'text': '\n提供一个结构合理、准确的XML布局描述,以满足开发者对软件产品用户界面设计的需求。\n限制:XML布局描述必须遵守软件设计规范,易于实现,同时确保信息的准确性和完整性。\n输出格式:XML格式,包含node_type, location, content等属性。\n工作流程:\n\n分析软件产品的用户界面设计需求。\n根据分析结果,确定XML布局描述中的node_type, location, content等属性。\n编写符合规范的XML布局描述。 Right-click on it.'}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_723759.png'}, {'type': 'text', 'text': "\nClick on 'irish bouebe f'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240730210300924/main_93.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240730210300924/main_93.png'}, {'type': 'text', 'text': '\n角色:移动应用UI/UX设计顾问\n背景:客户希望优化其移动应用的用户界面和用户体验,需要输出相应的XML代码片段,包括元素类型(node_type)、位置(location)、内容(content)等属性。\n目标:提供一个结构清晰、准确的XML代码片段,以满足客户对移动应用用户界面和用户体验优化的需求。\n限制:XML代码片段必须遵循移动应用设计标准,易于理解,同时确保信息的精确性和完整性。\n输出格式:XML格式,包含element_type, position, text等属性。\n工作流程:\n分析移动应用的用户界面和用户体验设计需求。\n根据分析结果,确定XML代码片段中的element_type, position, text等属性。\n编写符合规范的XML代码片段: Please right-click on it.'}]}] Load image from ceph: 
langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_26_19_50_79ddbe2a414643328f25c525292d1797-15.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_26_19_50_79ddbe2a414643328f25c525292d1797-15.png'}, {'type': 'text', 'text': '\nProvide the dimensions and location of the UI element close with bounding box in the image. Please right-click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_379296.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_379296.png'}, {'type': 'text', 'text': "\nClick on 'Careers, on the top right of the webpage'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240327/20240327_filtered/jiebaoqinglidashi/screen_00000030.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240327/20240327_filtered/jiebaoqinglidashi/screen_00000030.jpg'}, {'type': 'text', 'text': '\n我刚刚清理了手机,现在想回到上一级菜单,应该怎么做?(with grounding) Click on it.'}]}] huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_604599.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_604599.png'}, {'type': 'text', 'text': '\nClick on \'Trinity College London - more information button, to the top left of "RESET"\''}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_133925_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_133925_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Title'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_122923_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_122923_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Cloud Storage'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_022714_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_022714_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Illness news & latest pictures from Newsweek.com'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_144967.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
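The tokenizers fork warning above repeats once per forked dataloader worker. A minimal sketch of the remedy the warning itself suggests, assuming the variable is set in the launcher process before any tokenizer is constructed:

```python
import os

# Set before the first tokenizer call so forked dataloader workers inherit it.
# "false" disables the tokenizers-internal thread pool, which is what the
# post-fork deadlock warning is about.
os.environ["TOKENIZERS_PARALLELISM"] = "false"

print(os.environ["TOKENIZERS_PARALLELISM"])
```

Setting it via `export TOKENIZERS_PARALLELISM=false` in the launch script is equivalent.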
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_144967.png'}, {'type': 'text', 'text': "\nClick on 'Atlanta_solar_power_payback'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_080737_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_080737_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'This site scope'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_343133.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_343133.png'}, {'type': 'text', 'text': "\nClick on 'Orange SVG icon of a person with raised arms, modern and minimalist, on the left side of the page'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_446643.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_446643.png'}, {'type': 'text', 'text': '\nClick on \'Purple-blue gradient button with modern design, on the bottom left of the webpage, under "Online Sabong Agent Application"\''}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_21/C4web50k-1_92843853-split-9.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_21/C4web50k-1_92843853-split-9.png'}, {'type': 'text', 'text': "\nIdentify the position of UI element annie may's in the visual representation with bounding box. 
Please double-click on it."}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_172328_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_172328_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Underline'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_294037.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_294037.png'}, {'type': 'text', 'text': "\nClick on 'funding (3 items)'"}]}] Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_pad/ui_agi_appdata_pad_20240506_reorg_filtered/00359.pngLoad image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240731110448887/main_46.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_124858_before_screenshot.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_191813_before_screenshot.pngLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_192129_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_124858_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Settings - System'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_191813_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Arcade'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_192129_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Path'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240731110448887/main_46.png'}, {'type': 'text', 'text': '\n深入探讨一下截图[[271, 106, 500, 160]]区域所蕴含的元素。 Please click on it.'}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_pad/ui_agi_appdata_pad_20240506_reorg_filtered/00359.png'}, {'type': 'text', 'text': '\n识别图中返回按钮的精确位置,并框选出其边界。 Please double-click on it.'}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagebrightwheel_2024_2_2_13_44-14.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_41987.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_41987.png'}, {'type': 'text', 'text': "\nClick on 'navigate to CNC Lathe page'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagebrightwheel_2024_2_2_13_44-14.png'}, {'type': 'text', 'text': "\nCan you find the Navigate up's position and bounding box in this image? 
Click on it."}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_12_46_a0b55ccd74184877b6f4d826a9740832-4.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_105847.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_60492.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_211111.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_55014.pngLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_81867.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_211111.png'}, {'type': 'text', 'text': "\nClick on 'Ring Size'"}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_105847.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Sign In Compliance Formerly ThreatSwitch"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_12_46_a0b55ccd74184877b6f4d826a9740832-4.png'}, {'type': 'text', 'text': '\nLocate the specific ONE-WAY and its bounding box within this image. 
Please double-click on it.'}]}]conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_60492.png'}, {'type': 'text', 'text': '\nClick on \'Anesthesiology, Perioperative and Pain Medicine Clinical Department, above "Dermatology"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_81867.png'}, {'type': 'text', 'text': "\nClick on 'select sort by date'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_55014.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Myspace Social Icon"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_575779.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_575779.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Logo-sans-paw.png"\''}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/ui2json_web_d20240822_v1/course/C4web50k-2_183673712-split-0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/ui2json_web_d20240822_v1/course/C4web50k-2_183673712-split-0.png'}, {'type': 'text', 'text': '\n把当前UI界面的结构化信息转化为json格式并输出。 Please click on it.'}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_55/C4web50k-3_275877086-split-6.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_55/C4web50k-3_275877086-split-6.png'}, {'type': 'text', 'text': '\nAscertain the location of the UI element Support in the screen with bounding box. 
Right-click on it.'}]}]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67661e6520>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_15260.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_15260.png'}, {'type': 'text', 'text': "\nClick on 'view profile of the board member displayed'"}]}]
Traceback (most recent call last):
  [same call chain as the traceback above]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67661e7880>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_202044_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_202044_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Add Sheet'"}]}]
Traceback (most recent call last):
  [same call chain as the traceback above]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ef680db0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_222931.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_222931.png'}, {'type': 'text', 'text': "\nClick on 'Bill Rexford/Results/Canfield Speedway - NASCAR Late Model Sportsman Division/1950'"}]}]
Traceback (most recent call last):
  [same call chain as the traceback above]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f64dfa6b470>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_69545.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_69545.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Gray silhouette of a person\'s head and shoulders on a white background"\''}]}]
Traceback (most recent call last):
  [same call chain as the traceback above]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f64dfa6a160>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203747_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203747_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft Edge is optimised for Windows'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f64e00496c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_192742_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_192742_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Page 4'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67661e7880> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_177985.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_177985.png'}, {'type': 'text', 'text': '\nClick on \'Type into the text input box, under "Drop Location*", on the left side of "Drop Location*"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f64dfa6b470> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_788202.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_788202.png'}, {'type': 'text', 'text': "\nClick on 'Search...'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ef670810> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_135578.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_135578.png'}, {'type': 'text', 'text': '\nClick on \'The word "campus" styled in italic, lowercase blue letters.\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ef7964d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_064532_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_064532_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Privacy Checkup'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ef6808b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_278730.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_278730.png'}, {'type': 'text', 'text': '\nClick on \'Welcome to the official website of the Association for Contextual Behavioral Science, a worldwide online learning and research community, and a living resource for anyone interested in ACT, RFT, and Contextual Behavioral Science, for "Overcome Obesity"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ef6818f0> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_295325.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_295325.png'}, {'type': 'text', 'text': "\nClick on 'Search input'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ef683420> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240325/20240325_filtered/ziru/screen_00000425.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240325/20240325_filtered/ziru/screen_00000425.jpg'}, {'type': 'text', 'text': '\n我要了解其他用户对这项服务的看法。with grounding Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ef7897b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67661e4310> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fedbba08040> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_79349.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_79349.png'}, {'type': 'text', 'text': '\nClick on \'Light background search bar with dark border and centered placeholder text "Keyword search..."\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fedbba08040> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fedbb01b6f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_525610.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_525610.png'}, {'type': 'text', 'text': '\nClick on \'the select menu for "$2.99", to the bottom left of "$2.99"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25e2db20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
The data-loader workers repeatedly fail in the same way: the bytes fetched from Ceph for a sample's screenshot cannot be decoded by PIL, and the worker then lands in the pdb breakpoint at dataset.py(364). The per-worker output was interleaved and heavily duplicated; every failure produced this canonical traceback (only the BytesIO address differs between occurrences):

Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fef006c6d90>

After each traceback the worker stops at the breakpoint:

> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)

Images requested from Ceph during this window (path, followed by the instruction text from the conv):

  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_061842_before_screenshot.png
    "Click on 'DJI -0.43%'"
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_95827.png
    'Click on \'button titled "Full Steam Ahead for Sommerfel ..."\''
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_584892.png
    'Click on \'link labeled "PAY SUBSCRIPTION"\''
  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_042442_before_screenshot.png
    "Click on 'Accessible login button'"
  langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image29110.jpg
    "Output a hierarchical diagram of the page layout based on the provided screenshot. Please right-click on it."
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_122189.png
    "Click on 'check notifications'"
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_91618.png
    "Click on '2:29 pm, on the left side of the page'"
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_342275.png
    "Click on 'Rory McIlroy, PGA Tour, The Players Championship'"
  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_040259_before_screenshot.png
    "Click on 'Creators'"
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_322277.png
    'Click on \'Sign Up link, on the right side of "Our Services"\''
  langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageWalmart_2024_2_3_18_47-62.png
    "Report the position of the UI element Close within the image, specifying its coordinates. Right-click on it."
  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_134147_before_screenshot_sub1.png
    "Click on 'Equation'"
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_18369.png
    'Click on \'Rope Access button, at the bottom of the image, to the right of "Personal Development", to the bottom left of "VIEW COURSE CALENDAR"\''
  langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/ae63bd4d-acd9-4e5c-97c6-b428387e844d.png
    "Can you determine the bounding box coordinates to complete my task: 10 mins. Please double-click on it."
  langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image27118.jpg
    "What are the different types of categories for men to shop? Explain your answer based on UI element. Please double-click on it."
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_125740.png
    "Click on 'BUY NOW button'"
  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_123214_before_screenshot_sub3.png
    "Click on 'See on map'"
  langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/9b8d4943-f011-4e04-b97f-2c8fef737082.png
    "Where is the bounding box coordinates to click for solving the task: Català. Please click on it."
  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140958_before_screenshot.png
    "Click on 'Update IME Dictionary...'"
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_132444.png
    "Click on 'The element is a rectangular input field labeled 'To' with an airplane icon on the left side.'"
  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_160105_before_screenshot_sub0.png
    "Click on 'Activation'"

(Pdb)
sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/ziru/screen_00000288.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/ziru/screen_00000288.jpg'}, {'type': 'text', 'text': '\n帮我标出所有文字的内容和具体位置? Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25e198f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_553467.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_553467.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Kiwa High Performance Web"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67661e4090> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25deafc0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_152719_original_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_152719_original_screenshot.png'}, {'type': 'text', 'text': "\nClick on ' Ask For Kernel Restart. Warn the user before restarting a kernel. '"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ef683bf0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fedbba08040> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_114154_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_114154_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Enhanced Support | Google Cloud'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_692926.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_692926.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Relaxing poolside seating area with blue water and lush greenery"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f64e0049990> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f68aa67f4c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_618107.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_618107.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Wests Tigers logo"\''}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_200708_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_200708_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Beige'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_240777.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_240777.png'}, {'type': 'text', 'text': "\nClick on 'the select dropdown list'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67eb680590> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f68aa6f9f80> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fedbb019ee0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160142_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160142_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Customize Quick Access Toolbar'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) [rank41]: Traceback (most recent call last): [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank41]: train() [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank41]: trainer.train() [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank41]: return inner_training_loop( [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank41]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank41]: batch_samples += [next(epoch_iterator)] [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank41]: current_batch = next(dataloader_iter) [rank41]: 
File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank41]: data = self._next_data() [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank41]: return self._process_data(data) [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank41]: data.reraise() [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank41]: raise exception [rank41]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank41]: Original Traceback (most recent call last): [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank41]: sample = self._get_item(i) [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank41]: data_dict = self.preprocess_qwen2vl_v3( [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank41]: return self.preprocess_qwen2vl( [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank41]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank41]: image = tcs_loader(image) [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank41]: img = pil_loader(img_value_str) [rank41]: File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank41]: img = Image.open(buff) [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank41]: raise UnidentifiedImageError(msg) [rank41]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67661e4090> [rank41]: During handling of the above exception, another exception occurred: [rank41]: Traceback (most recent call last): [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank41]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank41]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank41]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank41]: print(f"Failed to fetch sample {i}. Exception:", e) [rank41]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank41]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank41]: return self.dispatch_line(frame) [rank41]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank41]: if self.quitting: raise BdbQuit [rank41]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image55623.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image55623.jpg'}, {'type': 'text', 'text': '\nWhich item is selected? Answer the question using a single word or phrase. Please click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_49/C4web50k-3_274781915-split-4.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_49/C4web50k-3_274781915-split-4.png'}, {'type': 'text', 'text': '\nMap out the exact position of the UI element Any birth injury caused by medical negligence is eligible for financial compensation through a medical malpractice lawsuit. with bounding box shown in the image. 
Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f64efc834c0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_103815.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_103815.png'}, {'type': 'text', 'text': "\nClick on 'submit a new anonymous tip'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in 
_get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67eb733b50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_153808_original_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_153808_original_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Sell'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25df8ef0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
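None of the code below appears in the log above; it is a minimal stdlib sketch, with hypothetical names (`looks_like_image`, `robust_getitem`), of the two mitigations the log suggests: pre-checking fetched bytes against image magic numbers before handing them to `Image.open` (objects pulled from Ceph/S3 can be truncated or replaced by an error page, which is exactly what `UnidentifiedImageError: cannot identify image file` indicates), and resampling a different index instead of dropping into pdb inside a DataLoader worker.

```python
import random

# Magic numbers for the two formats in the failing samples (.png, .jpg).
_MAGIC_PREFIXES = (b"\x89PNG\r\n\x1a\n", b"\xff\xd8\xff")

def looks_like_image(data: bytes) -> bool:
    """Cheap pre-check before PIL: a truncated object or an HTML/XML error
    page returned in place of the image starts with bytes that match no
    image magic number, so it can be rejected without raising inside PIL."""
    return data.startswith(_MAGIC_PREFIXES)

def robust_getitem(dataset, i, max_retries=5):
    """Skip unreadable samples by resampling a random index.

    Breaking into pdb here instead is what killed the run above: a pdb
    prompt inside a DataLoader worker has no interactive stdin, so the
    debugger raises bdb.BdbQuit, which torch re-raises in the main process.
    """
    for _ in range(max_retries):
        try:
            return dataset[i]
        except (OSError, ValueError) as e:  # PIL.UnidentifiedImageError subclasses OSError
            print(f"Failed to fetch sample {i}. Exception: {e}; resampling")
            i = random.randrange(len(dataset))
    raise RuntimeError(f"giving up after {max_retries} unreadable samples")
```

A `__getitem__` wired this way logs and skips the handful of corrupt OS-Atlas/uground objects instead of aborting all ranks on the first bad fetch.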
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_192100_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_192100_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Line up'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67ef683f60> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_182444_before_screenshot_sub0.png 
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_182444_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'WordArt'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67661e5b70> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_421152.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_421152.png'}, {'type': 'text', 'text': "\nClick on 'Share on Tumblr'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb59a421b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67661e7920> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_769690.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_769690.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Dip Memento Photography"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_762099.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_762099.png'}, {'type': 'text', 'text': '\nClick on \'Download Metadata, to the top right of "Page: 1", in "[Postcard] 1909-05-17 [from] Charley [to] (Mrs) [mother] Beecroft (PC-EP-00750)" section\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/18368.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/18368.jpg'}, {'type': 'text', 'text': '\nInstruction: to change the details use star setting, output the bounding box of the region in the screen that can complete the task. Double-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fee3cf5fec0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f68a8c1b380> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image46427.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image46427.jpg'}, {'type': 'text', 'text': "\nWhat's the code? Answer the question using a single word or phrase. Please double-click on it."}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_109671.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_109671.png'}, {'type': 'text', 'text': "\nClick on 'Copyright link'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise 
UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb6162e4d0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67eb6828e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_29219.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_29219.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the article '2018 - Safety Directors: Where Did They Go?''"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_072152_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_072152_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Blogger Policies and Guidelines - Transparency Center'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25dfb510> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67eb682660> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194154_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194154_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: tab-24'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_122228_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_122228_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Google Play logo'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb34e09760> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f67eb6803b0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_37538.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_37538.png'}, {'type': 'text', 'text': "\nClick on 'Basketball: Tarpey, the favorite'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f68a8c93740> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25df9c60> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f348554e340> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_15367.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_15367.png'}, {'type': 'text', 'text': '\nClick on \'The word "SERVICES" in the top navigation menu, located between "ABOUT" and "RECENT WORK".\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6768aec090> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131606_before_screenshot_sub2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131606_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Owl Nest'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6766bac400> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)
(Pdb)
[identical UnidentifiedImageError traceback repeated across workers -- dataset.py __getitem__ (line 359) -> _get_item (408) -> preprocess_qwen2vl_v3 (706) -> preprocess_qwen2vl (515) -> load_image_from_ceph (157) -> tcs_loader.__call__ (64) -> pil_loader (40) -> PIL.Image.open; each failure drops into the Pdb prompt at dataset.py(364). Subsequent loads attempted:]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image30563.jpg  ("Which tab is selected in Adda247? Explain your answer based on UI element. Double-click on it.")
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_115949_before_screenshot.png  ("Click on 'limited or no ads'")
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_032935_before_screenshot_sub2.png  ("Click on 'English'")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_413254.png  ("Click on 'MODULES (4 FILES) link'")
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_200221_before_screenshot_sub3.png  ("Click on 'Light-Gray'")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_112869.png  ("Click on 'Type into the text area'")
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_191628_before_screenshot_sub3.png  ("Click on 'Zoom'")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_465773.png  ("Click on '2 mi / 5 mi / 10 mi / 30 mi / 50 mi, labeled Distance'")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_371073.png  ("Click on 'the image of \"Google Git\"'")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_230323.png  ("Click on 'View all posts in National News'")
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/zhongguoliantong/screen_00000415.jpg  ("Can you provide a detailed description of the element in the region boxed by [[22, 46, 100, 82]] in the image? Double-click on it.")
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/11768965-d912-4975-bc47-13db7bb1cb5a.png  ("Please output the bounding box of the region correctly responding to this task: Contact List - Key Clients. Click on it.")
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_120058_before_screenshot_sub0.png  ("Click on 'Create your Google Account'")
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_car/20240606/03317.png  ("Locate the battery-level indicator label in the image and give its bounding-box coordinates. Please double-click on it.")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_407492.png  ("Click on 'Most Popular'")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_778541.png  ("Click on 'the text input box'")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_124480.png  ("Click on 'view details of 'spring 2003c' artwork'")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_69547.png  ("Click on 'Brunch in a Box - service for 2 priced at $64.00.'")
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_114224_before_screenshot.png  ("Click on 'Offset'")
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2312f361b0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_167208.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_167208.png'}, {'type': 'text', 'text': "\nClick on 'text input area'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item 
data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25dfb830> (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f217f128270> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) Traceback (most recent call last): PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f35c86889f0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f231246c270> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_248139.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_248139.png'}, {'type': 'text', 'text': '\nClick on \'button titled "Sign up"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_7214.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_7214.png'}, {'type': 'text', 'text': "\nClick on 'Light grey rectangular text input with white border and 'Email address' placeholder'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f23132522f0> [rank40]: Traceback (most recent call last): [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank40]: train() [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank40]: trainer.train() [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank40]: return inner_training_loop( [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in 
_inner_training_loop [rank40]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank40]: batch_samples += [next(epoch_iterator)] [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank40]: current_batch = next(dataloader_iter) [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank40]: data = self._next_data() [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank40]: return self._process_data(data) [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank40]: data.reraise() [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank40]: raise exception [rank40]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank40]: Original Traceback (most recent call last): [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank40]: sample = self._get_item(i) [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank40]: data_dict = self.preprocess_qwen2vl_v3( [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank40]: return self.preprocess_qwen2vl( [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank40]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank40]: image = tcs_loader(image) [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank40]: img = pil_loader(img_value_str) [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank40]: img = Image.open(buff) [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank40]: raise UnidentifiedImageError(msg) [rank40]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f217f128270> [rank40]: During handling of the above exception, another exception occurred: [rank40]: Traceback (most recent call last): [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank40]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank40]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank40]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank40]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank40]: print(f"Failed to fetch sample {i}. Exception:", e) [rank40]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank40]: print(f"Failed to fetch sample {i}. Exception:", e) [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank40]: return self.dispatch_line(frame) [rank40]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank40]: if self.quitting: raise BdbQuit [rank40]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f239bafcea0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_97774.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_97774.png'}, {'type': 'text', 'text': '\nClick on \'the text input area for "10.17", to the left of "ADD TO CART"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_658681.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_658681.png'}, {'type': 'text', 'text': "\nClick on 'view our cookie policy, Cookie Policy'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f231246f880> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/wangyixinwen/screen_00000598.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/wangyixinwen/screen_00000598.jpg'}, {'type': 'text', 'text': '\n能否分析这张图片,并且输出每个文字的位置坐标? 
Please click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_179443.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_179443.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Tire Technology Expo 2024", on the left side of the page\''}]}] Traceback (most recent call last): [rank43]: Traceback (most recent call last): [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank43]: train() [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank43]: trainer.train() [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank43]: return inner_training_loop( [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank43]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank43]: batch_samples += [next(epoch_iterator)] [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank43]: current_batch = next(dataloader_iter) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank43]: data = self._next_data() [rank43]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank43]: return self._process_data(data) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank43]: data.reraise() [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank43]: raise exception [rank43]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank43]: Original Traceback (most recent call last): [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank43]: sample = self._get_item(i) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank43]: data_dict = self.preprocess_qwen2vl_v3( [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank43]: return self.preprocess_qwen2vl( [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank43]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank43]: image = tcs_loader(image) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank43]: img = pil_loader(img_value_str) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank43]: img = Image.open(buff) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 
3536, in open [rank43]: raise UnidentifiedImageError(msg) [rank43]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f35c86889f0> [rank43]: During handling of the above exception, another exception occurred: [rank43]: Traceback (most recent call last): [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank43]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank43]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank43]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank43]: print(f"Failed to fetch sample {i}. Exception:", e) [rank43]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank43]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank43]: return self.dispatch_line(frame) [rank43]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank43]: if self.quitting: raise BdbQuit [rank43]: bdb.BdbQuit File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f208be5e020> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = 
    self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f20a5335e90>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3484f1e2f0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[The same UnidentifiedImageError traceback and Pdb breakpoint at dataset.py(364) recur for each of the samples below, differing only in the BytesIO address; the duplicate copies interleaved by the other workers are elided.]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_39345.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_39345.png'}, {'type': 'text', 'text': "\nClick on 'select iPhone 11 Pro Max option'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155148_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155148_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Customize Quick Access Toolbar'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image10556.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image10556.jpg'}, {'type': 'text', 'text': '\nWhich tab has been selected? Answer the question using a single word or phrase. Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_687032.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_687032.png'}, {'type': 'text', 'text': "\nClick on 'text input box, at the top of the image'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_182532_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_182532_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Photo Album...'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_175219_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_175219_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Banded Variant 4'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_112229_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_112229_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Tray Input Indicator - Chinese (Simplified, China)'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageBlueMail_2024_3_18_21_46-69.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageBlueMail_2024_3_18_21_46-69.png'}, {'type': 'text', 'text': '\nIdentify the position of UI element Navigate up in the visual representation with bounding box. Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_242385.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_242385.png'}, {'type': 'text', 'text': "\nClick on 'Join Hot Legs USA on social media to share leg wear fashions and new product news releases'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_001728_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_001728_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Class: octicon arrow-symbol-mktg'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_46918.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_46918.png'}, {'type': 'text', 'text': '\nClick on \'View the content page [alt-shift-c], above "Eclipse Wiki"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_277863.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_277863.png'}, {'type': 'text', 'text': '\nClick on \'Gray "Search" button with a minimalist design, used to submit a form, to the top right of "FOOTBALL GIFTS"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_210123_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_210123_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Terms of use'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_025643_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_025643_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Subscribe'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_143901_original_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_143901_original_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Name Box'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_803095.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_803095.png'}, {'type': 'text', 'text': '\nClick on \'Show products matching tag Ayodhya special saree, under "Saree Collection"\''}]}]
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
    - Avoid using `tokenizers` before the fork if possible
    - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/54592.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/54592.jpg'}, {'type': 'text', 'text': '\nFor the instruction: insert new, output the bounding box for the region in the screen that is most relevant to answering the question. Please click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_67943.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_67943.png'}, {'type': 'text', 'text': "\nClick on 'X / Twitter'"}]}]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359,
in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fedbb019ee0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_629310.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_629310.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Hyperville 40"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image43500.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image43500.jpg'}, {'type': 'text', 'text': '\nWhat is the selected percent? Explain your answer based on UI element. Please double-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f31eed27e70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fef703f1030> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160218_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160218_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Home'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160758_before_screenshot.png conv: [{'role': 
'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160758_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'View'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_002613_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_002613_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Captions & Subtitles'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot 
identify image file <_io.BytesIO object at 0x7f34841734c0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f31eebb1210> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fee3ce42e30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_203755_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_203755_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Thesaurus...'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in 
load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb59a421b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_054330_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_054330_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'BAE Systems Brasil | BAE Systems'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_11718.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_11718.png'}, {'type': 'text', 'text': '\nClick on \'link titled "Vehicle Research"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25dfb8d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_145014_before_screenshot_sub3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_145014_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Class: icon-img'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb34e09760> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25dfb150> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_26_17_14_7237194ab7d6485eaed1c489ce6fb88f-10.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_26_17_14_7237194ab7d6485eaed1c489ce6fb88f-10.png'}, {'type': 'text', 'text': '\nCan you specify the location and bounding box of Navigate up shown here? Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25dfbf10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_415622.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_415622.png'}, {'type': 'text', 'text': "\nClick on 'Enter email address'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise 
UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb34e09030> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_164862.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_164862.png'}, {'type': 'text', 'text': '\nClick on \'CLICK falling, above "Remember the potato test.", on the right side of "tools"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image 
file <_io.BytesIO object at 0x7feb25e28950> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/xizoajiguanjia/screen_00000316.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/xizoajiguanjia/screen_00000316.jpg'}, {'type': 'text', 'text': '\n我想了解图片[[0, 91, 334, 142]]框内展示的是哪些具体元素。 Please click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25f80b80> > 
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_085118_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_085118_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Copilot (Ctrl+Shift+.)'"}]}]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb59a421b0>

[the identical UnidentifiedImageError traceback repeats for each of the following samples; only the image key and prompt differ:]

- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_305437.png -- Click on 'the image of "55078057AA - Exterior Ornamentation: Nameplate for Mopar"'
- langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageTED_2024_3_7_18_23-404.png -- Detect and mark the position of Continue with Google with bounding box in this image. Please right-click on it.
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_820033.png -- Click on 'textarea'
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_17344.png -- Click on 'choose color combination for the shirt'
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_124452_before_screenshot_sub0.png -- Click on 'Clear Formatting'
- langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/ui_data/ui_home_screen_phone/ui_homescreen_phone_20240416_v7homescreen-1778.jpg -- The content of the region [[89, 89, 189, 135]] in the image. Please right-click on it.
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_023811_before_screenshot_sub0.png -- Click on 'Privacy Statement'
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_742188.png -- Click on 'City, State'
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_16556.png -- Click on 'Add to cart'
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_32891.png -- Click on 'Event Calendar, at the bottom of the image'

Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_817015.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_817015.png'}, {'type': 'text', 'text': "\nClick on 'International freight forwarder'"}]}]
[rank42]: Traceback (most recent call last):
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank42]:     train()
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank42]:     trainer.train()
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank42]:     return inner_training_loop(
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank42]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank42]:     batch_samples += [next(epoch_iterator)]
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank42]:     current_batch = next(dataloader_iter)
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank42]:     data = self._next_data()
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank42]:     return self._process_data(data)
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank42]:     data.reraise()
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank42]:     raise exception
[rank42]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank42]: Original Traceback (most recent call last):
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank42]:     sample = self._get_item(i)
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank42]:     data_dict = self.preprocess_qwen2vl_v3(
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank42]:     return self.preprocess_qwen2vl(
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank42]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank42]:     image = tcs_loader(image)
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank42]:     img = pil_loader(img_value_str)
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank42]:     img = Image.open(buff)
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank42]:     raise UnidentifiedImageError(msg)
[rank42]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fef0037ff60>
[rank42]:
[rank42]: During handling of the above exception, another exception occurred:
[rank42]:
[rank42]: Traceback (most recent call last):
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank42]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank42]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank42]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank42]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank42]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank42]:     return self.dispatch_line(frame)
[rank42]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank42]:     if self.quitting: raise BdbQuit
[rank42]: bdb.BdbQuit

[further occurrences of the same UnidentifiedImageError traceback follow for:]

- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_223698.png -- Click on 'navigate to The Writer's Almanac website'
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160250_before_screenshot.png -- Click on 'Numbering'
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_642842.png -- Click on 'the image of "Red Pinterest button with white outline and stylized white 'P' logo, directing to Pinterest profile or feature", on the bottom left of the image, under "Connect with P3C Technologies LLC", in the "Hard drive crashed, did you backup?"'
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_32.png [entry truncated in the log]
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_175330_before_screenshot_sub0.png -- Click on 'AutomationID: Icons_Acorn_M'

> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Exception:", e) (Pdb) et_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fee3ce42e30> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f470a10c680> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_228128.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_228128.png'}, {'type': 'text', 'text': "\nClick on 'Posts by Linda K. 
Laffey, MFT'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4793d2c7c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/54526.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/54526.jpg'}, {'type': 'text', 'text': '\nIdentify the position of UI element advertisement in the visual representation with bounding box. 
Please click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_015624_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_015624_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on '25 Basic Linux Commands For Beginners - GeeksforGeeks'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7feb25df6bb0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_033307_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_033307_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Learn more about the guidance we produce'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47097246d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_131322_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_131322_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Clear all'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f495cffc130> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_582633.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_582633.png'}, {'type': 'text', 'text': '\nClick on \'Category:Proposals for reforming policy and governance of Wikimedia projects, under "Want to work on this proposal?" section\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4964635210> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_42707.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_42707.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Santee Lakes"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47097256c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_116870.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_116870.png'}, {'type': 'text', 'text': '\nClick on \'The "Altan Pharmaceuticals" link located in the upper right corner of the webpage under a dropdown menu.\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4964635210> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_152719_original_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_152719_original_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'GitKraken, group'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4793d4a5c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageAutoZone_2024_2_3_15_23-176.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageAutoZone_2024_2_3_15_23-176.png'}, {'type': 'text', 'text': '\nProvide the bounding box for the UI element Go back seen in the image. Please double-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4709724db0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_534369.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_534369.png'}, {'type': 'text', 'text': "\nClick on 'text area, on the right side of the webpage'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f45763b4130> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47097242c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_6352.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_6352.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the Dubsmash Dubbing App download page'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4793d4ac50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_166796.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_166796.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Back Home"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47097259e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb)
[The identical traceback (dataset.py __getitem__ line 359 -> _get_item 408 -> preprocess_qwen2vl_v3 706 -> preprocess_qwen2vl 515 -> load_image_from_ceph 157 -> tcs_loader __call__ 64 -> pil_loader 40 -> PIL Image.py 3536, ending in PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x...>, followed by the Pdb stop at dataset.py(364) __getitem__) repeats for each of the following samples; interleaved partial output from concurrent workers is collapsed, and Chinese instructions are translated:]
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_20906.png (Click on 'view OUI DRAM Whisky Tastings legal information')
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_005535_before_screenshot_sub1.png (Click on 'Intense Emphasis')
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_223731.png (Click on 'textarea, on the bottom right of the screenshot')
- VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_car/20240425/main_56.png (translated: Where in the UI screenshot is the passenger-side air-conditioning temperature-up button icon? Please provide its bounding box coordinates. Please double-click on it.)
- VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/baidu/screen_00000515.jpg (translated: Analyze the image, determine the position of the menu element, and output its bounding box coordinates. Please double-click on it.)
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_140757_before_screenshot.png (Click on 'Powered by Cookiebot')
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_191118_before_screenshot_sub1.png (Click on 'Sound')
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_235930.png (Click on 'Shop Kids')
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_515832.png (instruction not captured in the log)
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_28557.png (Click on 'The element is a small icon with outward-pointing arrows located at the top right corner of the webpage.')
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_071602_before_screenshot.png (Click on 'Google Account Help')
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_044135_before_screenshot.png (Click on 'Moneywise')
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_363819.png (Click on 'LOCATION')
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_315611.png (Click on 'Download .csv')
- //wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_113627_before_screenshot_sub0.png (prefix truncated in the log; instruction not captured)
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_542244.png (Click on 'the image of "MENU"')
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_072458_before_screenshot.png (Click on 'http://policies.google.com/privacy/up...')
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_232770.png (Click on 'redirect to Absence Form page')
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_vLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/62269790-c6a8-4625-8f01-b95e4a99f313.pngFile "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe22f769300> conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/62269790-c6a8-4625-8f01-b95e4a99f313.png'}, {'type': 'text', 'text': '\nOutput the bounding box of the element in the image that directly relates to solving this question: Favorite: Music, 1 subfolder. Please click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_429883.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_429883.png'}, {'type': 'text', 'text': '\nClick on \'mendelssohn, below "Education", at the bottom of the image\''}]}] [rank8]: Traceback (most recent call last): [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank8]: train() [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank8]: trainer.train() [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank8]: return inner_training_loop( [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank8]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank8]: batch_samples += [next(epoch_iterator)] [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank8]: current_batch = next(dataloader_iter) [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank8]: data = self._next_data() [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank8]: return 
self._process_data(data) [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank8]: data.reraise() [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank8]: raise exception [rank8]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank8]: Original Traceback (most recent call last): [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank8]: sample = self._get_item(i) [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank8]: data_dict = self.preprocess_qwen2vl_v3( [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank8]: return self.preprocess_qwen2vl( [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank8]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank8]: image = tcs_loader(image) [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank8]: img = pil_loader(img_value_str) [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank8]: img = Image.open(buff) [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank8]: raise UnidentifiedImageError(msg) [rank8]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f98f37ebce0> [rank8]: During handling of the 
above exception, another exception occurred: [rank8]: Traceback (most recent call last): [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank8]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank8]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank8]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank8]: print(f"Failed to fetch sample {i}. Exception:", e) [rank8]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank8]: print(f"Failed to fetch sample {i}. Exception:", e) [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank8]: return self.dispatch_line(frame) [rank8]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank8]: if self.quitting: raise BdbQuit [rank8]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
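The [rank8] traceback explains why the run died: the `except` branch at dataset.py:364 drops into an interactive pdb session inside a DataLoader worker process, and quitting the debugger raises `bdb.BdbQuit`, which torch re-raises on the main process and aborts training. A minimal sketch of the usual alternative, logging the failure and resampling a different index instead of blocking on a debugger (all names here are illustrative stand-ins, not the project's actual code):

```python
import random


class RobustDataset:
    """Map-style dataset sketch: skip corrupted samples instead of
    dropping into pdb (a Pdb session inside a DataLoader worker raises
    bdb.BdbQuit and kills the whole run, as in the [rank8] traceback)."""

    def __init__(self, samples, max_retries=20):
        self.samples = samples          # hypothetical raw records
        self.max_retries = max_retries

    def _get_item(self, i):
        # Stand-in for the real pipeline (preprocess_qwen2vl_v3 ->
        # load_image_from_ceph -> pil_loader); None simulates a bad image.
        if self.samples[i] is None:
            raise ValueError(f"cannot identify image for sample {i}")
        return self.samples[i]

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception:", e)
                # Resample a random index rather than entering a debugger.
                i = random.randrange(len(self.samples))
        raise RuntimeError(f"gave up after {self.max_retries} retries")

    def __len__(self):
        return len(self.samples)
```

With this pattern a corrupted object costs one log line and a resample; the worker keeps producing batches and the trainer never sees the exception.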
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_138001.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_138001.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the details page for the 3 bed apartment at Oyster Court Lahore'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_628790.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_628790.png'}, {'type': 'text', 'text': '\nClick on \'link for "Advanced Search", under the "Welcome to PPE Online Store" section\''}]}] wen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe49d394d60> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageTodayTix_2024_3_7_22_52-445.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageTodayTix_2024_3_7_22_52-445.png'}, {'type': 'text', 'text': '\nDetect and mark the position of 7 $29 with bounding box in this image. Please double-click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_234856.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_234856.png'}, {'type': 'text', 'text': "\nClick on 'filter results by 'curator''"}]}] t/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f53e30b0d10> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0389964e00> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) ', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_17609.png'}, {'type': 'text', 'text': "\nClick on 'More info, titled Click here to read our privacy and cookie policy'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_226394.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_226394.png'}, {'type': 'text', 'text': "\nClick on 'view larger version of the artwork'"}]}] 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4793d1fe20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_149794.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_327179.png'}, {'type': 'text', 'text': "\nClick on 'Buy women's stripe cardigan at wholesale prices, on the left side of the image'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe6306e3380> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f55ffd30ef0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0f8ab61300> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe5b74b63e0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131253_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131253_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Prime Shopping Online'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140845_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140845_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Quick Access Toolbar'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4793d47e70> 78950> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f00f955fb50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_115956_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_115956_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Bookmark:'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_11/C4web50k-0_3307392-split-1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_11/C4web50k-0_3307392-split-1.png'}, {'type': 'text', 'text': '\nLocate , skipping the sending RFQs/receiving responses portion of the and describe its bounding box in this picture. 
Click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_372977.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_372977.png'}, {'type': 'text', 'text': "\nClick on 'Country Rose Florist Homepage'"}]}] i/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f021d954f90> Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240729201113678/main_158.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240729201113678/main_158.png'}, {'type': 'text', 'text': '\n提供一个结构合理、准确的XML布局描述,以满足开发者对软件产品用户界面设计的需求。\n限制:XML布局描述必须遵守软件设计规范,易于实现,同时确保信息的准确性和完整性。\n输出格式:XML格式,包含node_type, location, content等属性。\n工作流程:\n\n分析软件产品的用户界面设计需求。\n根据分析结果,确定XML布局描述中的node_type, location, content等属性。\n编写符合规范的XML布局描述。 Double-click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_196764.png conv: [{'role': 'usTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = 
self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f57998f3150> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe6306e3560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_460915.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_460915.png'}, {'type': 'text', 'text': '\nClick on \'link labeled "Terms of Use"\''}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_164006_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_164006_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Maps - Not Selected'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe2e8213ce0> (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fe39bc75350> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe39bc79940> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f57998f3150> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_246271.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_246271.png'}, {'type': 'text', 'text': "\nClick on 'enter First Name in the text box'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131253_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131253_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'More'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in 
_get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe3a02dfab0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe3a02dfab0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_140313_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_140313_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Grayscale'"}]}] > /mnt/hwfile/liuzhaoyang/workspace File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) 
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe1a5e9b790> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_1265.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_1265.png'}, {'type': 'text', 'text': "\nClick on '_form_23_submit'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe3a02df330> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_487394.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_487394.png'}, {'type': 'text', 'text': '\nClick on \'button titled "CATALOGUES", at the top of the image\''}]}] Traceback (most recent call last): //wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_184557_before_screenshot_sub0.png File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f484c388ea0> Traceback 
(most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample =Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_139929.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_139929.png'}, {'type': 'text', 'text': "\nClick on 'open map of 1A GLENVIEW ROAD, GLEN EDEN'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) file/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f53e3c12250> conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_428839.png'}, {'type': 'text', 'text': "\nClick on 'Posts published on March 14, 2024'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe6b0ba6c00> File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f44861b2ed0> conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_659396.png'}, {'type': 'text', 'text': "\nClick on 'mattress nerd logo'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ntent': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_142108_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Phonetic Guide...'"}]}] ce/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe2ea6c9e90> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe49d395580> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) r', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_802650.png'}, {'type': 'text', 'text': "\nClick on 'inputfield-933052-description'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4793d44270> [rank11]: Traceback (most recent call last): [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank11]: train() [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank11]: trainer.train() [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank11]: return inner_training_loop( [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank11]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank11]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank11]: batch_samples += [next(epoch_iterator)] [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank11]: current_batch = next(dataloader_iter) [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank11]: data = self._next_data() [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank11]: return self._process_data(data) [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank11]: data.reraise() [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank11]: raise exception [rank11]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank11]: Original Traceback (most recent call last): [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank11]: sample = self._get_item(i) [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank11]: data_dict = self.preprocess_qwen2vl_v3( [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank11]: return self.preprocess_qwen2vl( [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank11]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank11]: image = tcs_loader(image) [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank11]: img = pil_loader(img_value_str) [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank11]: img = Image.open(buff) [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank11]: raise UnidentifiedImageError(msg) [rank11]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f484c388ea0> [rank11]: During handling of the above exception, another exception occurred: [rank11]: Traceback (most recent call last): [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank11]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank11]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank11]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank11]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank11]: print(f"Failed to fetch sample {i}. Exception:", e) [rank11]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank11]: print(f"Failed to fetch sample {i}. Exception:", e) [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank11]: return self.dispatch_line(frame) [rank11]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank11]: if self.quitting: raise BdbQuit [rank11]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/wor File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb43066b1f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_538727.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_538727.png'}, {'type': 'text', 'text': "\nClick on 'Heating button, on the top left of the page'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe39bc7a250>
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_142365.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_142365.png'}, {'type': 'text', 'text': "\nClick on 'Steering, located under Shop Departments in the left sidebar navigation menu.'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe39bc7b100>
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/AndroidUI/ui_caption_inhousepinduoduo_screen_00000064.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/AndroidUI/ui_caption_inhousepinduoduo_screen_00000064.jpg'}, {'type': 'text', 'text': '\n详细描述下截图的内容 Please right-click on it.'}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_130659_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_130659_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: menu-item-77762'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe7fc92b420>
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_135826_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_135826_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Formality, 0 issues. Press space or enter to review items.'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe39bc7ad90>
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_398630.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_398630.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Avatar"\''}]}]
(Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageTemu_2024_3_8_5_55-344.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageTemu_2024_3_8_5_55-344.png'}, {'type': 'text', 'text': '\nDetermine the bounding box and location of Enhance Home With A Positive Energy Lapis Lazuli Pyramid - Natural Elixir Epoxy Ogan Pyramid Decor in this image. Please double-click on it.'}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb17e5909a0>
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_505158.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_505158.png'}, {'type': 'text', 'text': "\nClick on 'Allow button, at the bottom right corner of the page'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f84e7f28180>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/qichezhijia/screen_00000034.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/qichezhijia/screen_00000034.jpg'}, {'type': 'text', 'text': '\n对图片进行OCR处理,并标出每个识别出的文字的位置。 Right-click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_031717_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_031717_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Buy iPad - Apple'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd70935d760>
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031944_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031944_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Cookies | About | NICE'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe39bc7ab60>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe6b0ba6700> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_585080.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_585080.png'}, {'type': 'text', 'text': '\nClick on \'the image of "slide"\''}]}] l_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb055543b00> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_205158_before_screenshot_sub1.png File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_205158_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft rewards'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) 
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0413715620> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0413ac2390> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_005857_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_005857_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Running applications'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f20277b03b0>
[rank61]: Traceback (most recent call last):
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank61]:     train()
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank61]:     trainer.train()
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank61]:     return inner_training_loop(
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank61]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank61]:     batch_samples += [next(epoch_iterator)]
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank61]:     current_batch = next(dataloader_iter)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank61]:     data = self._next_data()
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank61]:     return self._process_data(data)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank61]:     data.reraise()
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank61]:     raise exception
[rank61]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank61]: Original Traceback (most recent call last):
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank61]:     sample = self._get_item(i)
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank61]:     data_dict = self.preprocess_qwen2vl_v3(
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank61]:     return self.preprocess_qwen2vl(
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank61]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank61]:     image = tcs_loader(image)
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank61]:     img = pil_loader(img_value_str)
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank61]:     img = Image.open(buff)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank61]:     raise UnidentifiedImageError(msg)
[rank61]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe1a5e9b790>
[rank61]: During handling of the above exception, another exception occurred:
[rank61]: Traceback (most recent call last):
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank61]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank61]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank61]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank61]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank61]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank61]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank61]:     return self.dispatch_line(frame)
[rank61]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank61]:     if self.quitting: raise BdbQuit
[rank61]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
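Every failure above is the same two-step pattern: `Image.open` raises `UnidentifiedImageError` because the bytes fetched from Ceph are not a decodable image, and a pdb breakpoint in the `except` branch of `__getitem__` then raises `BdbQuit` inside the DataLoader worker, which kills the rank. A stdlib-only sketch of a more defensive loading pattern follows; the helper and class names (`sniff_image_format`, `ResilientDataset`) are hypothetical illustrations, not functions from the project's `dataset.py`:

```python
# Minimal, stdlib-only sketch (hypothetical names, not from dataset.py).

_MAGIC = {
    b"\x89PNG\r\n\x1a\n": "png",   # PNG file signature
    b"\xff\xd8\xff": "jpeg",       # JPEG SOI marker
}

def sniff_image_format(data: bytes):
    """Return the detected format name, or None for bytes that are not a
    known image (e.g. an S3/Ceph error page returned in place of the object)."""
    for magic, fmt in _MAGIC.items():
        if data.startswith(magic):
            return fmt
    return None

class ResilientDataset:
    """A __getitem__ that deterministically falls back to the next index on a
    corrupt sample instead of breaking into pdb (pdb raises BdbQuit inside
    DataLoader worker processes, taking down the whole training run)."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples        # raw byte payloads, for this sketch
        self.max_retries = max_retries

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            data = self.samples[i]
            if sniff_image_format(data) is not None:
                return data           # real code would decode with PIL here
            # Corrupt payload: log it and try a neighbouring sample.
            print(f"Failed to fetch sample {i}; retrying another index")
            i = (i + 1) % len(self.samples)
        raise RuntimeError("too many consecutive corrupt samples")
```

Checking magic bytes is a cheap way to separate truncated downloads and HTML error payloads from real images before handing them to PIL; logging the skipped indices also lets the bad Ceph keys be collected and purged offline instead of resurfacing every epoch.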
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_666866.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_666866.png'}, {'type': 'text', 'text': "\nClick on 'Connexion'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0413717510> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_153507.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_153507.png'}, {'type': 'text', 'text': "\nClick on 'Covering almost every product, process or service imaginable, ISO makes standards used everywhere'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/agu File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f44b1fb1670> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) 770.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_815770.png'}, {'type': 'text', 'text': '\nClick on \'the image of "ZipRecruiter", in the "Careers in sustainability, climate tech, renewable energy, net zero, the circular economy, ESG and more"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_072702_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_072702_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Delete photos & videos - Computer - Google Photos Help'"}]}] taset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f55765f6bb0> mage', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_68066.png'}, {'type': 'text', 'text': "\nClick on 'text input field'"}]}] Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0413716c50> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_63521.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_63521.png'}, {'type': 'text', 'text': "\nClick on 'The topmost of the two small square icons located in the upper right corner of the webpage header, next to the email address, that represents a social media platform.'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Ima> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd331de4fe0> pen(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f495cfbf150> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f84e7f3bc40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_131246.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://ding File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56b9463e20> kspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd430f96570> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd34019bba0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in p> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) is/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f44861b2250> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): ao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240911_213607_original_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240911_213607_original_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Pivot Table'"}]}] 述下这个屏幕的内容。 Double-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f572c106ac0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_200834_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_200834_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Linked Cell'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f53e3c121b0> _gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5a82e726b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/dongchedi/screen_00000235.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/dongchedi/screen_00000235.jpg'}, {'type': 'text', 'text': '\n请识别并列出这幅图像中包含的所有文本。(with grounding) Please click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f52e61ff100> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd5c371f2e0> Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4793d17a60> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_324006.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_324006.png'}, {'type': 'text', 'text': "\nClick on 'text input area'"}]}] Load image from ceph: VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/weixindushu/screen_00000376.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/weixindushu/screen_00000376.jpg'}, {'type': 'text', 'text': '\n在给定的界面截图中,识别出哲学宗教图标,并框出其边界。 Click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) 'text', 'text': "\nClick on 'CLICK Reviews, in the middle of the image'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f55ffd42a70> Load image from ceph: 
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_40268.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image30037.jpg
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_055311_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_125040_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Use Chrome with multiple profiles'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_055311_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Not Your Boss Babe'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_073510_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_073510_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Kategorie'"}]}]
(Pdb)
[rank3]: Traceback (most recent call last):
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank3]:     train()
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank3]:     trainer.train()
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank3]:     return inner_training_loop(
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank3]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank3]:     batch_samples += [next(epoch_iterator)]
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank3]:     current_batch = next(dataloader_iter)
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank3]:     data = self._next_data()
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank3]:     return self._process_data(data)
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank3]:     data.reraise()
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank3]:     raise exception
[rank3]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank3]: Original Traceback (most recent call last):
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank3]:     sample = self._get_item(i)
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank3]:     data_dict = self.preprocess_qwen2vl_v3(
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank3]:     return self.preprocess_qwen2vl(
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank3]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank3]:     image = tcs_loader(image)
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank3]:     img = pil_loader(img_value_str)
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank3]:     img = Image.open(buff)
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank3]:     raise UnidentifiedImageError(msg)
[rank3]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f53e3c121b0>
[rank3]: During handling of the above exception, another exception occurred:
[rank3]: Traceback (most recent call last):
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank3]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank3]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank3]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank3]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank3]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank3]:     return self.dispatch_line(frame)
[rank3]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank3]:     if self.quitting: raise BdbQuit
[rank3]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
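The `BdbQuit` crash is a separate problem from the bad images: the `(Pdb)` prompt at dataset.py:364 is being triggered inside a DataLoader worker process, which has no usable terminal, so the debugger session dies and `BdbQuit` propagates and kills the worker (and with it rank 3). A common alternative is to log the failure and deterministically fall back to another index instead of dropping into pdb. A sketch, assuming the dataset's fetch raises on a bad sample — `get_item` and the class name are illustrative, not the repo's code:

```python
class SkipBadSamples:
    """Fetch wrapper: on failure, log and try the next index instead of pdb.set_trace().

    `get_item(i)` stands in for the dataset's real _get_item; names are hypothetical.
    """

    def __init__(self, get_item, length: int, max_retries: int = 10):
        self.get_item = get_item
        self.length = length
        self.max_retries = max_retries

    def __getitem__(self, i: int):
        for _ in range(self.max_retries):
            try:
                return self.get_item(i)
            except Exception as e:
                # Same message the original log prints, but without entering pdb.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % self.length  # deterministic fallback index
        raise RuntimeError(f"gave up after {self.max_retries} bad samples")
```

When interactive debugging really is needed, running the loader with `num_workers=0` keeps `__getitem__` in the main process, where a pdb prompt can actually attach.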
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_265966.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_265966.png'}, {'type': 'text', 'text': "\nClick on 'CLICK Privacy Policy, on the left side of the screenshot'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_030529_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_030529_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'previous'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_729531.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_729531.png'}, {'type': 'text', 'text': '\nClick on \'Select options for “Crochet Dreads – Brown Blonde «Boho»”, located on the left side of the screenshot, located in "Crochet Dreads – Blonde Light Blue"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_18985.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_18985.png'}, {'type': 'text', 'text': '\nClick on \'The button labeled "Sign Up" with a red background, located within a pop-up form titled "Join The Team," containing fields for First Name, Last Name, Email Address, and Zip.\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_433397.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_433397.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Hotline"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageReddit_2024_3_9_14_30-412.png'}, {'type': 'text', 'text': '\nAscertain the location of the UI element Back in the screen with bounding box. Please click on it.'}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155917_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155917_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: DynamicSearchBoxGleamImage'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_221422.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_221422.png'}, {'type': 'text', 'text': "\nClick on 'open sign-in dialog'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_433042.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_433042.png'}, {'type': 'text', 'text': "\nClick on 'A, labeled Reset font size'"}]}]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe39bc79300>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240328/20240328_filtered/douyu/screen_00000086.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240328/20240328_filtered/douyu/screen_00000086.jpg'}, {'type': 'text', 'text': '\nDescribe the contents of the screenshot. Please click on it.'}]}]
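The repeated `PIL.UnidentifiedImageError` above means the bytes fetched from Ceph are not a decodable image: an empty, truncated, or non-image payload reaches `Image.open` in `pil_loader`. A minimal defensive decoder along the following lines would let the dataset skip the bad sample instead of raising out of the loader; the name `pil_loader_safe` and the return-None-on-failure policy are illustrative sketches, not the repo's actual `pil_loader`:

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader_safe(img_bytes: bytes):
    """Decode raw bytes into an RGB PIL image, returning None instead of
    raising when the payload is not a valid image (empty, truncated, or
    an error body returned by the object store)."""
    if not img_bytes:
        return None
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # force a full decode so truncated files fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None
```

The caller (e.g. `_get_item`) can then treat `None` as "resample a different index", which matches the retry-style handling the `__getitem__` exception handler already hints at. `img.load()` matters because PIL opens images lazily, so a truncated file can otherwise pass `Image.open` and blow up later inside the image processor.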
[rank9]: Traceback (most recent call last):
[rank9]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank9]:     trainer.train()
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank9]:     return inner_training_loop(
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank9]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank9]:     batch_samples += [next(epoch_iterator)]
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank9]:     current_batch = next(dataloader_iter)
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank9]:     data = self._next_data()
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank9]:     return self._process_data(data)
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank9]:     data.reraise()
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank9]:     raise exception
[rank9]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank9]: Original Traceback (most recent call last):
[rank9]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank9]:     sample = self._get_item(i)
[rank9]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank9]:     data_dict = self.preprocess_qwen2vl_v3(
[rank9]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank9]:     return self.preprocess_qwen2vl(
[rank9]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank9]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank9]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank9]:     image = tcs_loader(image)
[rank9]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank9]:     img = pil_loader(img_value_str)
[rank9]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank9]:     img = Image.open(buff)
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank9]:     raise UnidentifiedImageError(msg)
[rank9]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f845ea26cf0>
[rank9]: During handling of the above exception, another exception occurred:
[rank9]: Traceback (most recent call last):
[rank9]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank9]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank9]:   File
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank9]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank9]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank9]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank9]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank9]: print(f"Failed to fetch sample {i}. Exception:", e) [rank9]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank9]: print(f"Failed to fetch sample {i}. Exception:", e) [rank9]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank9]: return self.dispatch_line(frame) [rank9]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank9]: if self.quitting: raise BdbQuit [rank9]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031741_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031741_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Class: in-page-nav__list'"}]}] is/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe1a683be70> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdf1f5efd30> (Pdb) hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_372780.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_372780.png'}, {'type': 'text', 'text': "\nClick on 'privacy policy, labeled Privacy policies'"}]}] /src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f81d63d37e0> Traceback (most recent call last): [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_075410_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Review your security settings'"}]}] Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe2e8d26520> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0df7a4d170> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_245387.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_245387.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the documentation of psycopg.Copy method'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f100c0fb150> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_135715_before_screenshot_sub3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_135715_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Focus '"}]}] Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0d042e3bf0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb4b3dadf80> (Pdb) hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_171107_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_043941_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'tab-3'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_2_15_37_195cde51abb8471196e0408a1ba080e2-5.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_2_15_37_195cde51abb8471196e0408a1ba080e2-5.png'}, {'type': 'text', 'text': '\nLocate the Back in the picture and give its boundary details. Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f100c0f84f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) nshot.pngelf._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f100c26b970> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_661442.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_661442.png'}, {'type': 'text', 'text': "\nClick on 'ActivistChic on Pinterest'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) y", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f845ea27dd0> conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_061607_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'View comments 84 Comment'"}]}] (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_274746.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_274746.png'}, {'type': 'text', 'text': "\nClick on 'Skiing in Belgium'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_23_19_46_eb32c51543d749539b68e6c61ff72fb8-17.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_23_19_46_eb32c51543d749539b68e6c61ff72fb8-17.png'}, {'type': 'text', 'text': '\nIdentify and give the bounding box coordinates of Search for… seen in the image. Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_fro> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image14366.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image14366.jpg'}, {'type': 'text', 'text': '\nWhat is the status of "Weather Notification"? Answer the question using a single word or phrase. Please double-click on it.'}]}] 515, in> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) taset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f84e7f38d60> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_767935.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_767935.png'}, {'type': 'text', 'text': "\nClick on 'checkbox'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5ff48e8590> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f100c0f8950> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_219451.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_219451.png'}, {'type': 'text', 'text': '\nClick on \'Share on reddit, in "Democracy vs Parliamentary Democracy Definition" section\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f04137158a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)
(Pdb)
[The identical PIL.UnidentifiedImageError traceback and (Pdb) prompt repeat for each of the samples below; duplicates elided.]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_042044_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_042044_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Watchlist Ideas'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_115522_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_115522_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Content safety'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240905_150033_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240905_150033_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Wi‑Fi'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/ef423b4e-dda3-49ef-aae0-77e3fe50dc07.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/ef423b4e-dda3-49ef-aae0-77e3fe50dc07.png'}, {'type': 'text', 'text': '\nPlease identify the area of the element with bounding box that can provide an answer to this question: Stop refreshing. Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_42765.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_42765.png'}, {'type': 'text', 'text': "\nClick on 'select pick-up time'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_259832.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_259832.png'}, {'type': 'text', 'text': "\nClick on 'Quick actions'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_030907_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_030907_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'iPhone'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/youku/screen_00000076.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/youku/screen_00000076.jpg'}, {'type': 'text', 'text': '\nCould you help extract the text from this image and give the positions of the text within the image? Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_644148.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_644148.png'}, {'type': 'text', 'text': "\nClick on 'Go to the homepage'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_765648.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_765648.png'}, {'type': 'text', 'text': '\nClick on \'Momentous WordPress Theme, below "CATEGORIES"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_620968.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_438721.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_438721.png'}, {'type': 'text', 'text': "\nClick on 'text area'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_156934.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_156934.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Notice at collection page'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_127382.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_127382.png'}, {'type': 'text', 'text': '\nClick on \'The website navigation bar contains a series of links horizontally positioned at the top of the page. Look for the link labeled "Shop" positioned between the "Careers" and "About Us" links.\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageGOAT_2024_3_5_19_35-92.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageGOAT_2024_3_5_19_35-92.png'}, {'type': 'text', 'text': '\nCould you locate the soccer shoes on this image and specify its bounding box dimensions? Right-click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_022050_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_022050_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Free AI Writing Assistance for Students | Grammarly'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_708332.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_708332.png'}, {'type': 'text', 'text': "\nClick on 'Site Home Page'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_752809.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_752809.png'}, {'type': 'text', 'text': '\nClick on \'DIRECTORY link, at the top of the page, above "Author: Candice Cowan"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_495534.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_495534.png'}, {'type': 'text', 'text': '\nClick on \'the image of "The Fitzwilliam Museum Logo"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_125929_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_125929_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Google Cloud Platform'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_045056_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_045056_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Mostly cloudy'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_2069.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_2069.png'}, {'type': 'text', 'text': "\nClick on 'navigate to detailed article about Planning A Major Building Project in London'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_014955_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_014955_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'View comments 11 Comment'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_276644.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_276644.png'}, {'type': 'text', 'text': "\nClick on 'Rehab Management'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_632694.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/2024072313581655/main_106.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/2024072313581655/main_106.png'}, {'type': 'text', 'text': '\nProvide a well-structured and accurate XML layout description that meets developer requirements for the software product user interface design.\nConstraints: the XML layout description must follow software design conventions and be easy to implement, while ensuring the information is accurate and complete.\nOutput format: XML, including attributes such as node_type, location, content.\nWorkflow:\n\nAnalyze the user interface design requirements of the software product.\nBased on the analysis, determine the node_type, location, content attributes of the XML layout description.\nWrite an XML layout description that conforms to the conventions. Please double-click on it.'}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb2ec65c0e0> self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd737b47880> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_031517_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_031517_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Kids' Home Store'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image912.jpg'}, {'type': 'text', 'text': '\nWhat is the given email address? Explain your answer based on UI element. 
Please double-click on it.'}]}] 59, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5ff3d20450> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_768485.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_768485.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Virtual Workforce Solutions"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) eprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f57fb20a9d0> .BytesIO object at 0x7fd4af187b00> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) s_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb05555d490> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image31935.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image31935.jpg'}, {'type': 'text', 'text': '\nWhat is the status of "Enable color emoji"? Explain your answer based on UI element. Please right-click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_021506_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_021506_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'models'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_134049_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_134049_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Customize'"}]}] "}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file 
<_io.BytesIO object at 0x7f5ff1e2dbc0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_170736_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_170736_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Get Data'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb1579ec220> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ntent': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_819653.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Patio Blinds Icon"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5bc438e160> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in 
preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd8fb999c60> .BytesIO object at 0x7f00f955f510> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_642820.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_642820.png'}, {'type': 'text', 'text': "\nClick on 'Dark blue background with white dashboard layout, featuring numerical data, icons, and buttons, with overlaid white text promoting Appsumo's Fraud Blocker Review: Lifetime Deal, at the center of the image'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_571782.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_571782.png'}, {'type': 'text', 'text': "\nClick on 'Small, gradient-button with magnifying glass icon and 'Go to last post' text, clickable link, on the bottom right of the page'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f84e7f38f90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) [rank30]: Traceback (most recent call last): [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_41/C4web50k-2_184493014-split-2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_41/C4web50k-2_184493014-split-2.png'}, {'type': 'text', 'text': '\nPlease locate Kanpani Girls in the image and outline its bounding box. 
Double-click on it.'}]}] ransformers/trainer.py", line 2473, in _inner_training_loop [rank30]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank30]: batch_samples += [next(epoch_iterator)] [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank30]: current_batch = next(dataloader_iter) [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank30]: data = self._next_data() [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank30]: return self._process_data(data) [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank30]: data.reraise() [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank30]: raise exception [rank30]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank30]: Original Traceback (most recent call last): [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank30]: sample = self._get_item(i) [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank30]: data_dict = self.preprocess_qwen2vl_v3( [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank30]: return self.preprocess_qwen2vl( [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank30]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank30]: image = tcs_loader(image) [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank30]: img = pil_loader(img_value_str) [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank30]: img = Image.open(buff) [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank30]: raise UnidentifiedImageError(msg) [rank30]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb36c6ff740> [rank30]: During handling of the above exception, another exception occurred: [rank30]: Traceback (most recent call last): [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank30]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank30]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank30]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank30]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank30]: print(f"Failed to fetch sample {i}. Exception:", e) [rank30]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank30]: print(f"Failed to fetch sample {i}. Exception:", e) [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank30]: return self.dispatch_line(frame) [rank30]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank30]: if self.quitting: raise BdbQuit [rank30]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb2ea905fd0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5ff3d22840> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ntent': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_142904_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Pictures'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_114220.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_114220.png'}, {'type': 'text', 'text': "\nClick on 'open the news article titled 'Foreign brands see sales surge in Chinese market via live-streaming promotions, e-commerce platforms''"}]}] Load image from ceph: langchao2:s3: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd87d033c40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_367757.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_367757.png'}, {'type': 'text', 'text': "\nClick on 'Add to cart: “Dark Chocolate Flavoured Macadamia Spread 250g”'"}]}] line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6135131a30> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_195309.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_195309.png'}, {'type': 'text', 'text': "\nClick on 'expand dropdown'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_005423_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_005423_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'No contributions on January 23rd.'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/ziru/screen_00000288.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/ziru/screen_00000288.jpg'}, {'type': 'text', 'text': '\n详细描述下截图的内容 Right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_663796.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_663796.png'}, {'type': 'text', 'text': "\nClick on 'Office & Stationery'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_340200.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_340200.png'}, {'type': 'text', 'text': "\nClick on 'MF_logo_web_RGB_72'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_22_1_80827c2b324a487ab4a73cd990b45ec8-9.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_22_1_80827c2b324a487ab4a73cd990b45ec8-9.png'}, {'type': 'text', 'text': '\nHow can you locate the 3.2k followers and its bounding box in this image? Double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_551841.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_120150_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_120150_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Software in Google services'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_447356.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_447356.png'}, {'type': 'text', 'text': '\nClick on \'the input box for "Join us."\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_297274.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_297274.png'}, {'type': 'text', 'text': "\nClick on 'Search events by name or location'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131814_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131814_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Amazon Echo Dot 3'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_498969.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_594675.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_594675.png'}, {'type': 'text', 'text': "\nClick on 'Select all issues'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagepixiv_2024_3_9_17_29-442.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagepixiv_2024_3_9_17_29-442.png'}, {'type': 'text', 'text': '\nProvide the location coordinates and bounding box of Open navigation sidebar in this image. Please right-click on it.'}]}]
Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_328534.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_process (conv truncated, interleaved with the [rank10] traceback below)

[rank10]: Traceback (most recent call last):
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank10]:     train()
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank10]:     trainer.train()
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank10]:     return inner_training_loop(
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank10]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank10]:     batch_samples += [next(epoch_iterator)]
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank10]:     current_batch = next(dataloader_iter)
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank10]:     data = self._next_data()
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank10]:     return self._process_data(data)
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank10]:     data.reraise()
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank10]:     raise exception
[rank10]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank10]: Original Traceback (most recent call last):
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank10]:     sample = self._get_item(i)
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank10]:     data_dict = self.preprocess_qwen2vl_v3(
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank10]:     return self.preprocess_qwen2vl(
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank10]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank10]:     image = tcs_loader(image)
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank10]:     img = pil_loader(img_value_str)
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank10]:     img = Image.open(buff)
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank10]:     raise UnidentifiedImageError(msg)
[rank10]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5a81a923e0>
[rank10]: During handling of the above exception, another exception occurred:
[rank10]: Traceback (most recent call last):
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank10]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank10]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank10]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank10]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank10]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank10]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank10]:     return self.dispatch_line(frame)
[rank10]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank10]:     if self.quitting: raise BdbQuit
[rank10]: bdb.BdbQuit
Traceback (most recent call last):
> /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
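The root failure repeated throughout this log is `Image.open` raising `PIL.UnidentifiedImageError` on bytes fetched from Ceph/S3: an empty or corrupt object reaches `pil_loader` (dataset.py:40) and the decode fails. A defensive decoder can catch this at the loader level. The real dataset.py is not part of this log, so `safe_pil_loader` and its bytes-in/image-out contract are assumptions, a minimal sketch rather than the project's implementation:

```python
import io
from PIL import Image, UnidentifiedImageError

def safe_pil_loader(img_bytes):
    """Decode image bytes; return None instead of raising on corrupt data.

    Hypothetical guard for the `pil_loader` seen in the tracebacks above:
    the caller can log the failing key and resample another index.
    """
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # force a full decode so truncated files fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None
```

`img.load()` matters: `Image.open` is lazy, so without it a truncated file can pass the open and crash later inside the training collate path.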
Exception:", e)
(Pdb)
The same sequence (Load image from ceph, UnidentifiedImageError traceback through dataset.py:359 / 408 / 706 / 515 / 157 / 64 / 40, pdb stop at dataset.py:364) then repeats verbatim for further samples; only the image key, prompt, and BytesIO address change:
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_774333.png (prompt: Click on 'Account link, to the bottom left of "CONTACT US", to the top right of "See if you qualify"')
- langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_195925.png (prompt: Click on 'navigate to homepage')
- langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/402e9e19-1bde-4a09-927f-84e5ef16ea8d.png (prompt: Highlight the click area with bounding box to complete the task: Replace (1). Please click on it.)
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_124858_before_screenshot.png (prompt: Click on 'Atour Hotel - Google hotels')
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_132444_before_screenshot.png (prompt: Click on 'File name:')
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_030303_before_screenshot_sub3.png (prompt: Click on 'Class: cmplz-close')
- langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_53/C4web50k-3_275414886-split-0.png (prompt: Detect the Meet the Team and its bounding box coordinates in this image. Right-click on it.)
Each attempt ends with:
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x...>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e)
(Pdb)
Further failing samples, same traceback:
- langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_47/C4web50k-2_186651074-split-22.png (prompt: Provide the dimensions and location of the UI element ", the nitrate recovery rate was about 40 percent and 55 percent, respectively. The sulfate recovery rate fluctuated among the tested specimens, but on average, about 90 percent sulfate was extracted from the surface." with bounding box in the image. Please double-click on it.)
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_135327_before_screenshot_sub0.png (prompt: Click on 'AutomationID: tab-25')
[rank1] then aborts with the same two-part traceback as [rank10]: the worker's original PIL.UnidentifiedImageError (<_io.BytesIO object at 0x7fe775827740>) raised in pil_loader (dataset.py:40), then bdb.BdbQuit from the pdb session started in __getitem__ (dataset.py:364), re-raised through torch/_utils.py reraise as "bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0."
Two more identical worker tracebacks follow (<_io.BytesIO object at 0x7fc43d3aa340> and 0x7fb2ea9049a0).
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e)
(Pdb)
Further failing samples (the two Chinese prompts are translated here):
- VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/WPS/screen_00000268.jpg (prompt: Locate "立即开通" in the image and give its bounding-box coordinates. Please double-click on it.)
- VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_car/20240606/00911.png (prompt: Where is the battery-level indicator label in the UI screenshot? Please provide its bounding-box coordinates. Please click on it.)
- langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160808_before_screenshot.png
- //dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_28985.png (path prefix truncated in the log)
[rank2] aborts with the same two-part traceback: the worker's PIL.UnidentifiedImageError (<_io.BytesIO object at 0x7fd7390096c0>), then bdb.BdbQuit ("Caught BdbQuit in DataLoader worker process 0") re-raised through torch/_utils.py. Two more identical worker tracebacks follow, ending with:
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f04137455d0>
>
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_70677.png
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank51]:     train()
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank51]:     trainer.train()
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank51]:     return inner_training_loop(
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank51]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank51]:     batch_samples += [next(epoch_iterator)]
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank51]:     current_batch = next(dataloader_iter)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank51]:     data = self._next_data()
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank51]:     return self._process_data(data)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank51]:     data.reraise()
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank51]:     raise exception
[rank51]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank51]: Original Traceback (most recent call last):
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank51]:     sample = self._get_item(i)
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank51]:     data_dict = self.preprocess_qwen2vl_v3(
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank51]:     return self.preprocess_qwen2vl(
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank51]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank51]:     image = tcs_loader(image)
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank51]:     img = pil_loader(img_value_str)
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank51]:     img = Image.open(buff)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank51]:     raise UnidentifiedImageError(msg)
[rank51]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe5a1924db0>
[rank51]: During handling of the above exception, another exception occurred:
[rank51]: Traceback (most recent call last):
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank51]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank51]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank51]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank51]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank51]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank51]:     return self.dispatch_line(frame)
[rank51]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank51]:     if self.quitting: raise BdbQuit
[rank51]: bdb.BdbQuit
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_172605_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_172605_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Manage'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_050422_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Fri 30'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023110_before_screenshot.png
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0164b7b50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgen File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_309232.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_309232.png'}, {'type': 'text', 'text': '\nClick on \'Published and unpublished papers of Kevin Crowston, above "Home"\''}]}] .py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO 
object at 0x7fb05564a8e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_ioTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0251f7560> y", line 701, in __next__ [rank46]:> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)utils/data/dataloader.py", line 1465, in _next_data [rank46]: return self._process_data(data) [rank46]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank46]: data.reraise() [rank46]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank46]: raise exception [rank46]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank46]: Original Traceback (most recent call last): [rank46]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank46]: sample = self._get_item(i) [rank46]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank46]: data_dict = self.preprocess_qwen2vl_v3( [rank46]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank46]: return self.preprocess_qwen2vl( [rank46]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank46]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank46]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank46]: image = tcs_loader(image) [rank46]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank46]: img = pil_loader(img_value_str) [rank46]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank46]: img = Image.open(buff) [rank46]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank46]: raise UnidentifiedImageError(msg) [rank46]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9ea04e20c0> [rank46]: During handling of the above exception, another exception occurred: [rank46]: Traceback (most recent call last): [rank46]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank46]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank46]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank46]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank46]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank46]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank46]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank46]: print(f"Failed to fetch sample {i}. Exception:", e) [rank46]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank46]: print(f"Failed to fetch sample {i}. Exception:", e) [rank46]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank46]: return self.dispatch_line(frame) [rank46]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank46]: if self.quitting: raise BdbQuit [rank46]: bdb.BdbQuit (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_172348_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_172348_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Unshare Workbook'"}]}] gent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9d5ba2b5b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.oTraceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAge File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/pytho File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9d5b1bae80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ne 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb36cec9f30> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image 
= tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd430f96660> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/d7b6198d-f682-4fb4-ba6f-655e16690bb3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/d7b6198d-f682-4fb4-ba6f-655e16690bb3.png'}, {'type': 'text', 'text': '\nIndicate the location with bounding box that can finalize the instruction: Crag Climbing. Please right-click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5bc429f6f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_284264.png in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0257c6c00> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_284264.png'}, {'type': 'text', 'text': "\nClick on 'text input box'"}]}] gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f83d91d6e30> (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f04cc4f67a0>
Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802110611738/main_62.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802110611738/main_62.png'}, {'type': 'text', 'text': '\n- Role: APP interface design analysis expert\n- Background: The user needs the layout and icons of an APP interface analyzed and the corresponding JSON structure output, including node_type, location, content and other attributes.\n- Goals: Provide clear, accurate JSON structure information that meets the user\'s needs for APP interface layout and icon design.\n- OutputFormat: JSON, containing node_type, location, content and other attributes.\n- Workflow:\n 1. Analyze the layout and icon design requirements of the APP interface.\n 2. Based on the analysis, determine the node_type, location, content and other attributes in the JSON structure.\n 3. Write JSON structure information that conforms to the specification.\nPlease give the data directly in JSON format:\n Click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image28698.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image28698.jpg'}, {'type': 'text', 'text': '\nWhat is the status of "Local Headlines"? Answer the question using a single word or phrase. Please click on it.'}]}]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb2ea906d40>
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/04830_label.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/04830_label.png'}, {'type': 'text', 'text': '\nOutput the information of the element whose number is 15 in the image. Right-click on it.'}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[rank53]: Traceback (most recent call last):
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank53]:     train()
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank53]:     trainer.train()
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank53]:     return inner_training_loop(
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank53]:     raise exception
[rank53]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank53]: Original Traceback (most recent call last):
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank53]:     sample = self._get_item(i)
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank53]:     data_dict = self.preprocess_qwen2vl_v3(
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank53]:     return self.preprocess_qwen2vl(
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank53]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank53]:     image = tcs_loader(image)
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank53]:     img = pil_loader(img_value_str)
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank53]:     img = Image.open(buff)
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank53]:     raise UnidentifiedImageError(msg)
[rank53]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f23b55b47c0>
[rank53]: During handling of the above exception, another exception occurred:
[rank53]: Traceback (most recent call last):
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank53]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank53]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank53]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank53]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank53]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank53]:     return self.dispatch_line(frame)
[rank53]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank53]:     if self.quitting: raise BdbQuit
[rank53]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
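The repeated UnidentifiedImageError above means Image.open() is being handed bytes that are not a decodable image (truncated or corrupted objects fetched from Ceph). A minimal defensive decoder is sketched below; it is an illustration, not the repository's actual code — the name safe_pil_loader and the return-None-on-failure convention are assumptions:

```python
import io

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes):
    """Decode raw bytes into an RGB PIL image.

    Returns None instead of raising when the payload is not a valid
    image, so the caller can skip or resample the item rather than
    crash the DataLoader worker.
    """
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # force a full decode so truncated files fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None
```

The explicit img.load() matters: Image.open() is lazy, so a file that is truncated past its header would otherwise raise only when the tensor pipeline finally reads the pixels.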
(Pdb)
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/tengxunTV/screen_00000430.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/tengxunTV/screen_00000430.jpg'}, {'type': 'text', 'text': '\nWhich tab should be selected to filter videos by type? with grounding Please right-click on it.'}]}]
ages/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_22/C4web50k-1_92983204-split-1.png'}, {'type': 'text', 'text': '\nProvide the dimensions and location of the UI element Additional Information: with bounding box in the image. Click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_192984.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_192984.png'}, {'type': 'text', 'text': '\nClick on \'NEW ARRIVALS link, below "Search"\''}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_246938.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagePodbean_2024_3_5_0_49-573.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagePodbean_2024_3_5_0_49-573.png'}, {'type': 'text', 'text': '\nWhat are the bounding box details of the The Minimalists Podcast\n test appium in this image? Click on it.'}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_041509_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_041509_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Extensions'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160634_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160634_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Paragraph...'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_111747_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_111747_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Choose File: No file chosen'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_246938.png'}, {'type': 'text', 'text': "\nClick on 'Neolayr-Shea-Body-Butter-With-Almond-Oil-50-GM-2, Black jar with yellow label featuring white and yellow text and a shea tree image, on a white background'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_402443.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_402443.png'}, {'type': 'text', 'text': '\nClick on \'the image of "VA logo and Seal, U.S. Department of Veterans Affairs"\''}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_173219_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_173219_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Q2790: 100%'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_033204_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on '7. Storage duration and erasure'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/BOSSzhiping/screen_00000221.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/BOSSzhiping/screen_00000221.jpg'}, {'type': 'text', 'text': '\nHow should I save the settings or information on the current page? with grounding Please click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_448731.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_448731.png'}, {'type': 'text', 'text': "\nClick on 'Orange/Red Play/Pause Button'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_150452.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_150452.png'}, {'type': 'text', 'text': "\nClick on 'Chicago Indian Discussions, at the top of the page'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc2ac460680>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
(Pdb)
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dTraceback (most recent call last): last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f100d773ab0> 630_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_133630_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Apparel'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_14274.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_14274.png'}, {'type': 'text', 'text': '\nClick on \'The text "One Dharma Forum International" is located beneath a black-and-white logo representing Chinese calligraphy at the top of the webpage.\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_30817.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_30817.png'}, {'type': 'text', 'text': "\nClick on 'Product quantity, text input area'"}]}] 发送“在么”消息?(with grounding) Right-click on it.'}]}] return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", 
line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f04ce66aa20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f00f955fb50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_716086.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_716086.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Nexstar Logo", in the "Stay Connected" section\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f58bab5a2a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1da02db3d0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_763161.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_763161.png'}, {'type': 'text', 'text': "\nClick on 'Saltwater Vertebrates'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_035801_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_035801_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'rise percent'"}]}Traceback (most recent call last): : Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0413707510> ile <_io.BytesIO object at 0x7f56255a90d0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_331020.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_331020.png'}, {'type': 'text', 'text': "\nClick on 'toggle search'"}]}] Load image from ceph: 
langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_22/C4web50k-1_92941446-split-1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_22/C4web50k-1_92941446-split-1.png'}, {'type': 'text', 'text': '\nIdentify where the (271) is located in the image and its bounding box. Please click on it.'}]}] /qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd4b176a570> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5a81a93ab0> ib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fad4687ffb0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f01f6f5e610> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_225416.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_225416.png'}, {'type': 'text', 'text': '\nClick on \'Quantity textarea, placeholder text "1", to the left of "ADD TO CART"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1da02da390> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 
359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f04137374c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_302196.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_302196.png'}, {'type': 'text', 'text': '\nClick on \'button titled "View Inventory"\''}]}] ] Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd7b82ab5b0> (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image fTraceback (most recent call last): acb650> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_153802_original_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_153802_original_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Loaded Translucent Grape'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_68342.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_68342.png'}, {'type': 'text', 'text': '\nClick on \'the image of "mail"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/wor File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = 
Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fca1166d670> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc40573740> rocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03899676a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0251f6480> /qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_148905.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_148905.png'}, {'type': 'text', 'text': "\nClick on 'Category Mother's Day image'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_382380.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_382380.png'}, {'type': 'text', 'text': "\nClick on 'Share on WhatsApp'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_418869.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_418869.png'}, {'type': 'text', 'text': '\nClick on \'the image of "bibi khanum"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_4226.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_4226.png'}, {'type': 'text', 'text': "\nClick on 'navigate to pre-employment employee physicals details page'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/4b7e3ba5-0682-44f7-9e17-749798265b74.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/4b7e3ba5-0682-44f7-9e17-749798265b74.png'}, {'type': 'text', 'text': '\nFor question: Search for..., locate the bounding box of the element in the image that holds the key to this question. Click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_760351.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_760351.png'}, {'type': 'text', 'text': '\nClick on \'Revision 12257a98, under the "Latest revisions" section\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageKhan_Academy_2024_2_2_14_48-22.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageKhan_Academy_2024_2_2_14_48-22.png'}, {'type': 'text', 'text': '\nLocate the Settings and provide its bounding box coordinates in this visual representation. Click on it.'}]}]
(Pdb)
[rank62]: Traceback (most recent call last):
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank62]:     train()
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank62]:     trainer.train()
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank62]:     return inner_training_loop(
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank62]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank62]:     batch_samples += [next(epoch_iterator)]
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank62]:     current_batch = next(dataloader_iter)
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank62]:     data = self._next_data()
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank62]:     return self._process_data(data)
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank62]:     data.reraise()
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank62]:     raise exception
[rank62]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank62]: Original Traceback (most recent call last):
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank62]:     sample = self._get_item(i)
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank62]:     data_dict = self.preprocess_qwen2vl_v3(
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank62]:     return self.preprocess_qwen2vl(
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank62]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank62]:     image = tcs_loader(image)
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank62]:     img = pil_loader(img_value_str)
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank62]:     img = Image.open(buff)
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank62]:     raise UnidentifiedImageError(msg)
[rank62]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8463c9b240>
[rank62]: During handling of the above exception, another exception occurred:
[rank62]: Traceback (most recent call last):
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank62]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank62]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank62]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank62]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank62]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank62]:     return self.dispatch_line(frame)
[rank62]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank62]:     if self.quitting: raise BdbQuit
[rank62]: bdb.BdbQuit
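The `[rank62]` traceback shows why the run actually died: the handler at dataset.py:364 apparently drops into pdb after printing, but a DataLoader worker is a forked subprocess with no attached terminal, so Bdb raises `BdbQuit` and kills the epoch. A common alternative is to log the failure and resample another index. The sketch below is a toy stand-in, not the repo's actual dataset class; the resampling policy and retry cap are assumptions:

```python
import random


class ResamplingDataset:
    """Toy map-style dataset whose __getitem__ may fail on a bad sample.

    Instead of invoking a debugger (which cannot work inside a DataLoader
    worker), a failed fetch is logged and a random other index is tried.
    """

    def __init__(self, samples, max_retries=10):
        self.samples = samples          # e.g. list of decoded records; None = corrupt
        self.max_retries = max_retries

    def __len__(self):
        return len(self.samples)

    def _get_item(self, i):
        value = self.samples[i]
        if value is None:               # simulate an undecodable image
            raise ValueError(f"cannot decode sample {i}")
        return value

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:      # log and move on, never block the worker
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randrange(len(self.samples))
        raise RuntimeError("too many consecutive bad samples")
```

With `num_workers > 0` this keeps the iterator alive; the trade-off is that resampled indices slightly skew the epoch, so the failure count should still be monitored rather than silently absorbed.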
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_93456.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_93456.png'}, {'type': 'text', 'text': "\nClick on 'Negatyw option represented by an eye icon and yellow text.'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_8430.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_8430.png'}, {'type': 'text', 'text': '\nClick on \'The "Patient Reviews" menu item, located in the middle portion of the drop-down menu on the left side of the screen.\''}]}]
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_055309_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_055309_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Definitions'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240410_reorg_filtered/01182_label.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240410_reorg_filtered/01182_label.png'}, {'type': 'text', 'text': '\nBriefly describe the element labeled 4. Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_641831.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_641831.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Tips For Multi-lingual Website Testing"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_488256.png'}, {'type': 'text', 'text': "\nClick on 'Go to Home Page'"}]}]
(Pdb)
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc94d561030> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)er', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_213643_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Manage Data Model'"}]}] dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd73adf36f0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_146194.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_146194.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Exercise to Music course page'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/worTraceback (most recent call last): vis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwenLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_98158.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_98158.png'}, {'type': 'text', 'text': '\nClick on \'The element is the "R & D" option located in the website\'s main navigation menu at the top of the page.\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwTraceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_234757_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc90fe2d4e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) //st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image5668.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image5668.jpg'}, {'type': 'text', 'text': '\nWhat is the chosen chord? Explain your answer based on UI element. Right-click on it.'}]}] Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f826d0ecf40> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageHome_shopping_2024_3_9_3_28-549.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageHome_shopping_2024_3_9_3_28-549.png'}, {'type': 
'text', 'text': '\nLocate the Navigate up in the picture and give its boundary details. Please double-click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd331441580> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0413745940> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_51772.png conv: [{'role': 'use File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in 
pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0251f7560> rocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9abbf5670> 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_175395.png'}, {'type': 'text', 'text': "\nClick on 'Recently Listed'"}]}] Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160957_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160957_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Review'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc8cb2e71f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8152f864d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd5c371d710> Traceback (most recent call last): Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_804650.png File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_064618_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_064618_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Describe your issue'"}]}] ltered_804650.png'}, {'type': 'text', 'text': '\nClick on \'the image of "freestar"\''}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd331443a10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc8cb2e70b0> e 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd430f96660> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image11648.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image11648.jpg'}, {'type': 'text', 'text': '\nWhat Gmail address is used? 
Explain your answer based on UI element. Please click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_123646_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_123646_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Match case'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8463739760> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd33213f650> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd5c371fa60> Load image from ceph: langchao2:s3:Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_419177.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_419177.png'}, {'type': 'text', 'text': '\nClick on \'button for "Find Support", at the bottom of the screenshot\''}]}] l_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0f0d515350> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa6e40a1440> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_201253_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Actives'"}]}] /liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9ba4a2980> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc90fe2d3f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f846373bba0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_87946.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_87946.png'}, {'type': 'text', 'text': '\nClick on \'the text input field, above "OUR PRODUCTS", at the top of the image\''}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802185509115/main_91_after.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802185509115/main_91_after.png'}, {'type': 'text', 'text': '\n角色:移动应用UI/UX设计顾问\n背景:客户希望优化其移动应用的用户界面和用户体验,需要输出相应的XML代码片段,包括元素类型(node_type)、位置(location)、内容(content)等属性。\n目标:提供一个结构清晰、准确的XML代码片段,以满足客户对移动应用用户界面和用户体验优化的需求。\n限制:XML代码片段必须遵循移动应用设计标准,易于理解,同时确保信息的精确性和完整性。\n输出格式:XML格式,包含element_type, position, text等属性。\n工作流程:\n分析移动应用的用户界面和用户体验设计需求。\n根据分析结果,确定XML代码片段中的element_type, position, text等属性。\n编写符合规范的XML代码片段: Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", l File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(s> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)e = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9ba4a1990> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/dongchedi/screen_00000299.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/dongchedi/screen_00000299.jpg'}, {'type': 'text', 'text': '\n我正在寻找露营装备,特别是天幕。你能告诉我应该看哪个部分吗?with grounding Double-click on it.'}]}] line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f84638bb8d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)r', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_81846.png'}, {'type': 'text', 'text': "\nClick on 'visit Fixed-Rate Mortgages page'"}]}] qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa5abddfa10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample =Traceback (most recent call last): ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdad4dcbd80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_030940_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_030940_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'URL'"}]}] set.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0f0d48be70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfileLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_162220_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_162220_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Undo Typing'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conTraceback (most recent call last): workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image 
file <_io.BytesIO object at 0x7fa5ab894ef0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f85c596e0c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f84637627f0> /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_251325.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_251325.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Live Chat Operator"\''}]}] Traceback (most recent call last): //wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_175115_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_175115_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: Icons_Airplane_M'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_21_5_fade3f5c37ef4477974833eeece4693f-7.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_21_5_fade3f5c37ef4477974833eeece4693f-7.png'}, {'type': 'text', 'text': '\nHow can you locate the HONG KONG and its bounding box in this image? 
Please double-click on it.'}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3f660c220> .BytesIO object at 0x7fa29d41fba0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_ioLoad image from ceph: langchao2:s3:Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/WPS/screen_00000528.jpgimage', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_49103.png'}, {'type': 'text', 'text': '\nClick on \'View all posts by News Staff, under "Virtual Information Meeting on Adair County Solar Farm Project" section\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/WPS/screen_00000528.jpg'}, {'type': 'text', 'text': 
'\n打卡功能在哪里?(with grounding) Click on it.'}]}] 099_batch_3_num_1500/20240723133759421/main_4.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/20240723133759421/main_4.png'}, {'type': 'text', 'text': '\n这个输入框图标是用来做什么的?[[945, 32, 984, 46]] Please click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_141204.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_141204.png'}, {'type': 'text', 'text': '\nClick on \'The dark-themed poster has large white and red text featuring "DIABLO IV" at the top, "CAMPFIRE CHAT" in the middle, and "BLIZZCON 2023" at the bottom. The background shows a campfire scene with dim lighting and some indistinct details. The text "DIABLO IV | CAMPFIRE CHAT" appears beneath the image.\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_phone/ui_agi_appdata_phone_20240410_reorg_filtered/01823.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_phone/ui_agi_appdata_phone_20240410_reorg_filtered/01823.png'}, {'type': 'text', 'text': '\n在屏幕截图上找出返回按钮图标,给出边界框的详细坐标。 Please double-click on it.'}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_17_17_da675f08c5b24500b180236dfa176604-3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_30_17_17_da675f08c5b24500b180236dfa176604-3.png'}, {'type': 'text', 'text': '\nEnumerate the coordinates defining the location of UI element Search apps. Please double-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, 
in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f851e69e2a0> acb650> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa29d41f600> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_157777.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_157777.png'}, {'type': 'text', 'text': "\nClick on 'edit input in subject textbox'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): //dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_27852.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_27852.png'}, {'type': 'text', 'text': '\nClick on \'The rectangular arrow icon located on the left side of the screen, sitting in the shadowed area, is just above a "BOOK NOW" button and nestled alongside an image showing a fireworks event.\''}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdcc00e5800> .BytesIO object at 0x7f83d91d7010> [rank31]: Traceback (most recent call last): [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank31]: train() [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank31]: trainer.train() [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank31]: return inner_training_loop( [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank31]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank31]: batch_samples += [next(epoch_iterator)] [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank31]: current_batch = next(dataloader_iter) [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank31]: data = self._next_data() [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank31]: return self._process_data(data) [rank31]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank31]: data.reraise() [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank31]: raise exception [rank31]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank31]: Original Traceback (most recent call last): [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank31]: sample = self._get_item(i) [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank31]: data_dict = self.preprocess_qwen2vl_v3( [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank31]: return self.preprocess_qwen2vl( [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank31]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank31]: image = tcs_loader(image) [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank31]: img = pil_loader(img_value_str) [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank31]: img = Image.open(buff) [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank31]: raise UnidentifiedImageError(msg) [rank31]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3f660c220> [rank31]: During handling of the above exception, 
another exception occurred: [rank31]: Traceback (most recent call last): [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank31]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank31]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank31]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank31]: print(f"Failed to fetch sample {i}. Exception:", e) [rank31]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank31]: print(f"Failed to fetch sample {i}. Exception:", e) [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank31]: return self.dispatch_line(frame) [rank31]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank31]: if self.quitting: raise BdbQuit [rank31]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa664e59a80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa521c85760> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_025823_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_025823_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Partly cloudy'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/meiyou/screen_00000070.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/meiyou/screen_00000070.jpg'}, {'type': 'text', 'text': '\n尽可能详细的描述这个手机截图的内容。 Right-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise 
UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f85c458d300> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ck (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa29d41f600> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_637502.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_637502.png'}, {'type': 'text', 'text': "\nClick on 'BUSINESSES'"}]}] of shishkabab'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_071053_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_071053_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Create your Google Account'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = 
Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8152f86480> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa29d41fba0> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/a211267f-2440-425e-9e89-79689ed45210.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/a211267f-2440-425e-9e89-79689ed45210.png'}, {'type': 'text', 'text': '\nCan you determine 
the bounding box coordinates to complete my task: File copy_summer_vacation_plans.md . Please click on it.'}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc405c9a80> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140313_before_screenshot_sub0.pngne 222, in [rank60]: train() [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank60]: trainer.train() [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank60]: return inner_training_loop( [rank60]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank60]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank60]: batch_samples += [next(epoch_iterator)] [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank60]: current_batch = next(dataloader_iter) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank60]: data = self._next_data() [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank60]: return self._process_data(data) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank60]: data.reraise() [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank60]: raise exception [rank60]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank60]: Original Traceback (most recent call last): [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank60]: sample = self._get_item(i) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank60]: data_dict = self.preprocess_qwen2vl_v3( [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank60]: return self.preprocess_qwen2vl( [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank60]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank60]: image = tcs_loader(image) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank60]: img = pil_loader(img_value_str) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank60]: img = Image.open(buff) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank60]: raise UnidentifiedImageError(msg) [rank60]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa521c85760> [rank60]: During handling of the above exception, another exception occurred: [rank60]: Traceback (most recent call last): [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank60]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank60]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank60]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank60]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank60]: print(f"Failed to fetch sample {i}. Exception:", e) [rank60]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank60]: print(f"Failed to fetch sample {i}. Exception:", e) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank60]: return self.dispatch_line(frame) [rank60]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank60]: if self.quitting: raise BdbQuit [rank60]: bdb.BdbQuit Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_24154.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_24154.png'}, {'type': 'text', 'text': '\nClick on \'Canada Warehouse button, on the left side of the webpage, above "Search", at the left side of "Showing the single result"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", linTraceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidenTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0257c5ad0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_164507.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_164507.png'}, {'type': 'text', 'text': "\nClick on 'textbox'"}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_023515_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'appropriately notifying'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fac6c3fd9e0> Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) r', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_618689.png'}, {'type': 'text', 'text': "\nClick on 'Light gray Instagram camera icon with square frame and central circle, interactive button or link'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131217_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_131217_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Prime App'"}]}] ge) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) (Pdb) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa6e40a1440> (Pdb) ge from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_239730.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_239730.png'}, {'type': 'text', 'text': '\nClick on \'link labeled "China"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_206483.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_206483.png'}, {'type': 'text', 'text': '\nClick on \'the image of "BruceDropEmOff as seen in a selfie taken before his journey to Japan in July 2023"\''}]}] (Pdb) ck (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = 
pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f846373a7a0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f58bb8de930> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3f4efa660> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0f0d4b3ec0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_161148_original_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_161148_original_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Weather'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_200235_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_200235_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Caramel'"}]}]
Load image from ceph: …2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageAqua_Mail_2024_3_7_23_6-257.png
conv: […, {'type': 'text', 'text': '\nHow can you locate the Navigate up and its bounding box in this image? Please double-click on it.'}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/01216_label.png
conv: […, {'type': 'text', 'text': '\n图中标记为1的元素是什么? Click on it.'}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_743309.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_743309.png'}, {'type': 'text', 'text': "\nClick on 'Tommy Robinson Fishing Videos, located at the top of the screenshot'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/jinritoutiao/screen_00000003.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/jinritoutiao/screen_00000003.jpg'}, {'type': 'text', 'text': '\n如何退出当前页面或弹出窗口?(with grounding) Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_105606.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_105606.png'}, {'type': 'text', 'text': "\nClick on 'Language Select'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_042302_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_042302_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Watches, Parts & Accessories'"}]}]
Load image from ceph: …top_domain/windows_images20240823_204256_before_screenshot_sub2.png
conv: […, {'type': 'text', 'text': "\nClick on 'Recover Unsaved Workbooks'"}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_347687.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_125591.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_125591.png'}, {'type': 'text', 'text': "\nClick on 'navigate to agr accounts homepage'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155719_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155719_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Zoom'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_694125.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_694125.png'}, {'type': 'text', 'text': '\nClick on \'What\'s New, under "Surnames", above "S122"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_30282.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_30282.png'}, {'type': 'text', 'text': "\nClick on 'Members tab, located in the navigation bar.'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_778924.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_163613_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_163613_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Search'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_445149.png
conv: […, {'type': 'text', 'text': "\nClick on 'Reply to Celeste'"}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_062150_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_062150_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Item logo image for Slate'"}]}]
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f58bb8de930> value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f00f955fb50> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fac53211940> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9abc0bbf0> : [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_33285.png'}, {'type': 'text', 'text': '\nClick on \'the image of "front thumbnail"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/be5bd673-9821-4052-9955-379f5f6e0f12.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/be5bd673-9821-4052-9955-379f5f6e0f12.png'}, {'type': 'text', 'text': '\nFor the instruction: Attend Team Meeting, output the bounding box for the region in the screen that is most relevant to answering the question. Please right-click on it.'}]}] kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f607bb9c5e0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = 
tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faf6374cf90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_467986.png > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_467986.png'}, {'type': 'text', 'text': "\nClick on 'Etsy Digital Shop Branding Bundle Kit Black & White DigiPax'"}]}] (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in 
load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b32673420> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ine 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd647bae8e0> Load image from ceph: 
langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_112645_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_112645_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft Edge - 1 running window'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_576759.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_576759.png'}, {'type': 'text', 'text': "\nClick on 'Open submenu of About Us'"}]}] py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5625542b10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f607bb9fc90> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_65905.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_65905.png'}, {'type': 'text', 'text': '\nClick on \'select menu, under the "Categories" section\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faf6374f600> (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_553912.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_553912.png'}, {'type': 'text', 'text': "\nClick on 'Plus sign icon in top-right corner of a plain square'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_730022.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_730022.png'}, {'type': 'text', 'text': '\nClick on \'link labeled "October 2006"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample 
{i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f01038f27a0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_050457_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_050457_before_scre File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd904417ce0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/d> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faf6374fa60> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd34019bc40> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_413672.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_413672.png'}, {'type': 'text', 'text': '\nClick on \'link titled "CAREERS", under "REVIEWS"\''}]}] Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_565060.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_565060.png'}, {'type': 'text', 'text': "\nClick on 'Permalink to: Exploring the Worth of a Career in Law: Passion, Purpose, and Practical Considerations'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_609070.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_609070.png'}, {'type': 'text', 'text': "\nClick on 'NewsquestEventsScotland logo LocaliQ 2022'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_214759_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_214759_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'From Web'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_376502.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_376502.png'}, {'type': 'text', 'text': '\nClick on \'peace of mind (20 items), located under "Newsletter Signup" section\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f04cf196890> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb01e2e3f10> (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) ample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f607bb9f510> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
[rank28]: Traceback (most recent call last):
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank28]:     train()
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank28]:     trainer.train()
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank28]:     return inner_training_loop(
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank28]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank28]:     batch_samples += [next(epoch_iterator)]
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank28]:     current_batch = next(dataloader_iter)
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank28]:     data = self._next_data()
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank28]:     return self._process_data(data)
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank28]:     data.reraise()
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank28]:     raise exception
[rank28]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank28]: Original Traceback (most recent call last):
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank28]:     sample = self._get_item(i)
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank28]:     data_dict = self.preprocess_qwen2vl_v3(
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank28]:     return self.preprocess_qwen2vl(
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank28]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank28]:     image = tcs_loader(image)
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank28]:     img = pil_loader(img_value_str)
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank28]:     img = Image.open(buff)
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank28]:     raise UnidentifiedImageError(msg)
[rank28]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd647bae8e0>
[rank28]: During handling of the above exception, another exception occurred:
[rank28]: Traceback (most recent call last):
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank28]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank28]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank28]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank28]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank28]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank28]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank28]:     return self.dispatch_line(frame)
[rank28]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank28]:     if self.quitting: raise BdbQuit
[rank28]: bdb.BdbQuit
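The chained `bdb.BdbQuit` above shows what actually killed the run: the `except` branch of `__getitem__` hit a `pdb` breakpoint inside a DataLoader worker process, where no terminal is attached, so quitting the debugger propagated `BdbQuit` through `data.reraise()` and took down the whole rank. A common alternative (a sketch with hypothetical names, not the repository's code) is to log the failure and resample a different index instead of dropping into the debugger:

```python
import random


class RetryingDataset:
    """Wraps a fetch function; on failure, retries with a random other index
    instead of raising (or breaking into pdb) inside a DataLoader worker."""

    def __init__(self, fetch_fn, length, max_retries=10):
        self.fetch_fn = fetch_fn          # fetch_fn(i) -> sample; may raise
        self.length = length
        self.max_retries = max_retries

    def __len__(self):
        return self.length

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.fetch_fn(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = random.randrange(self.length)  # resample another index
        raise RuntimeError("too many consecutive undecodable samples")
```

Wrapped this way, a handful of undecodable images degrade into log lines rather than a multi-rank crash, while a systemic problem still surfaces through the final `RuntimeError`.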
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_115051_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_115051_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Sign in'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_190614_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_190614_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: NavigationControl'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_120417.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_120417.png'}, {'type': 'text', 'text': "\nClick on 'An image showcasing a bustling urban street with a vibrant bike-sharing station, featuring sleek and modern electric hybrid bikes lined up neatly'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/20240726144907478/main_111.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/20240726144907478/main_111.png'}, {'type': 'text', 'text': '\nWhat element does the box [[80, 100, 98, 111]] in the image select? Click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/12a7ca64-204b-44b3-9dc9-2b17ba4a634d.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/12a7ca64-204b-44b3-9dc9-2b17ba4a634d.png'}, {'type': 'text', 'text': '\nCan you pinpoint the location with bounding box for completing the task: File python_learning_goals_ezaq.md . Please double-click on it.'}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_045756_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_045756_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on '30 Fri d1000'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_358385.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_358385.png'}, {'type': 'text', 'text': '\nClick on \'the image of "logo footer"\''}]}]
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_114440_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_114440_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'LinkedIn Privacy Policy'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_146848.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_146848.png'}, {'type': 'text', 'text': "\nClick on 'A small red circle, positioned at the bottom center of the page, which stands out amongst a row of small circular buttons.'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_134656_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_134656_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Hide or show region'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
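Since the failing objects cluster under a few S3 prefixes, it can be cheaper to pre-screen the stored objects than to discover bad ones mid-training. A payload can be sanity-checked from its first bytes alone, since PNG and JPEG files begin with fixed signatures, before it is ever handed to PIL (a sketch; the magic numbers come from the respective format specs, the function name is made up):

```python
# File-format magic numbers: PNG's 8-byte signature and JPEG's SOI marker.
PNG_MAGIC = b"\x89PNG\r\n\x1a\n"
JPEG_MAGIC = b"\xff\xd8\xff"


def looks_like_image(head: bytes) -> bool:
    """Cheap pre-flight check on the first bytes of an object fetched from
    Ceph/S3: filters out empty bodies and HTML/XML error payloads before
    they reach Image.open()."""
    return head.startswith(PNG_MAGIC) or head.startswith(JPEG_MAGIC)
```

Run over a bucket listing (fetching only the first 8 bytes of each key), this flags undecodable objects like the ones crashing the workers; image formats other than PNG/JPEG would need their own signatures added.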
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_024332_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_024332_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Google Account Help'"}]}]
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_211434_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_211434_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Search highlights icon opens search
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_023109_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_023109_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Places'"}]}]
[rank21]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_686090.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_686090.png'}, {'type': 'text', 'text': '\nClick on \'the image of "38-oz Microwave Rectangular Container with Lid", at the top of the screenshot\''}]}] s/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe928d006d0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 
return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0df85ce250> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d01273bf0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_2203.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_2203.png'}, {'type': 'text', 'text': "\nClick on 'open the sizing chart modal'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_3/C4web50k-0_1500654-split-3.png'}, {'type': 'text', 'text': '\nLocate the Make the frosting: and provide its bounding box coordinates in this visual representation. 
Click on it.'}]}] t/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc13fed7dd0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9b7fc3470> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_482412.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_482412.png'}, {'type': 'text', 'text': '\nClick on \'link for "Inspiring design trends this fall 2019", under "Recent Posts" in "Tag: Articles"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdcc15db5b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160655_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160655_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Clear Formatting'"}]}] '\nPlease output the bounding box of the element correctly responding to this question: Change language of this app. Restart app for changes to take effect\n\nEnglish (English, United States). Please double-click on it.'}]}] Traceback (most recent call last): //dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_460082.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_460082.png'}, {'type': 'text', 'text': '\nClick on \'the image of "parenting consulting holding a parenting book", on the right side of the image, in the "Latest Articles"\''}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0f8ab60fe0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f501ee27ce0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_295327.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_295327.png'}, {'type': 'text', 'text': "\nClick on '01332 215 570, titled Call Vibrant Doors'"}]}] ation and bounding box of 1:40 'About' LifeEdited Video shown here? Click on it."}]}] aoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0c3415d080> Traceback (most recent call last): //dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_192782.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_192782.png'}, {'type': 'text', 'text': "\nClick on 'navigate to GriefShare Can't Save My Dad, But I Can't Either page'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ (Pdb) self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f52b39405e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc90fe2d3a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0f0cb0bec0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_555864.png i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_746728.pngsub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_133731_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Change Case'"}]}] Traceback (most recent call last): [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_746728.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Fashion model sunbathing in striped swimsuit"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/worLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_122741.pngzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, 
in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0df85ce340> conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_122741.png'}, {'type': 'text', 'text': "\nClick on 'redirect to 'The Influence of Market Rumours' article'"}]}] Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui[rank0]: Traceback (most recent call last): [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank0]: train() [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank0]: trainer.train() [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank0]: return inner_training_loop( [rank0]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank0]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank0]: batch_samples += [next(epoch_iterator)] [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank0]: current_batch = next(dataloader_iter) [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank0]: data = self._next_data() [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank0]: return self._process_data(data) [rank0]: File "/mnt/petrelfs/liuzhaoyang/w> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank0]: raise exception [rank0]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank0]: Original Traceback (most recent call last): [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank0]: sample = self._get_item(i) [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank0]: data_dict = self.preprocess_qwen2vl_v3( [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank0]: return self.preprocess_qwen2vl( [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank0]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank0]: image = tcs_loader(image) [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank0]: img = pil_loader(img_value_str) [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank0]: img = Image.open(buff) [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank0]: raise UnidentifiedImageError(msg) [rank0]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0f8ab60fe0> [rank0]: During handling of the above exception, another exception occurred: [rank0]: Traceback (most recent call last): [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank0]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank0]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank0]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank0]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank0]: print(f"Failed to fetch sample {i}. Exception:", e) [rank0]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank0]: print(f"Failed to fetch sample {i}. Exception:", e) [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank0]: return self.dispatch_line(frame) [rank0]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank0]: if self.quitting: raise BdbQuit [rank0]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workTraceback (most recent call last): is/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0dc787f9c0> Traceback (most recent call last): //dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_785719.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_785719.png'}, {'type': 'text', 'text': '\nClick on \'Gorewear Gore-Tex Infinium Thermo Split Gloves\nEasy-to-live-with gloves for dry, sub-zero conditions link, on the right side of the page, above "Gorewear C5 Gore-Tex Thermo Gloves\nComfortable gloves that excel in genuinely challenging weather conditions, whether cold, wet or both"\''}]}] Traceback 
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc90fe2e3e0>
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_190214_before_screenshot.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_070611_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_070611_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Google Account Help'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b3267fba0>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f58bab5bc40>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_233205.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_233205.png'}, {'type': 'text', 'text': "\nClick on 'open phone application and dial Vancouver: 360-944-7600'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_284081.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_284081.png'}, {'type': 'text', 'text': "\nClick on 'textbox, in the middle of the page'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageDuolingo_2024_2_2_14_16-73.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageDuolingo_2024_2_2_14_16-73.png'}, {'type': 'text', 'text': "\nCan you find the Learn Tab's position and bounding box in this image? Please double-click on it."}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f10d18489a0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_25_21_52_ed3b1c97d42e482298bdf2df5552a00b-4.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_25_21_52_ed3b1c97d42e482298bdf2df5552a00b-4.png'}, {'type': 'text', 'text': '\nFind the Podcast name / Keywords in the picture and detail its boundary coordinates. Please click on it.'}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b372a3ab0>
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_033101_before_screenshot_sub1.png
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc90fe2e390>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9abbf58f0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_29179.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_29179.png'}, {'type': 'text', 'text': '\nClick on \'The "Business Solutions" navigation tab located in the top navigation bar, positioned between the "Laundry Pro" and "Contact Us" tabs.\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_789389.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_789389.png'}, {'type': 'text', 'text': "\nClick on 'text input box, at the center of the image'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b372a2f70>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc90fe2e520>
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_184350.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_184350.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Slip-Ons & Loafers section'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image13637.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image13637.jpg'}, {'type': 'text', 'text': '\nWhat is the name of given user? Explain your answer based on UI element. Right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_184903.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_184903.png'}, {'type': 'text', 'text': "\nClick on 'redirect to Transfer Roller page'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b372a3650>
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_69760.png
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b415cb600>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdd85696160>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_194821_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_194821_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Wrap Text'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_196892.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_196892.png'}, {'type': 'text', 'text': "\nClick on 'CLICK Motivate link'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdcc146bba0>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc90fe2f970>
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_071753_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_071753_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Page information'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_111512_before_screenshot_sub3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_111512_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Action Center, 2 new notifications'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc405c9da0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc42d715d0>
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/02543.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/02543.png'}, {'type': 'text', 'text': '\nExactly what content is framed by [[500, 828, 1000, 1000]] in the image? Please right-click on it.'}]}]
[rank26]:     data_dict = self.preprocess_qwen2vl_v3(
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank26]:     return self.preprocess_qwen2vl(
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank26]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank26]:     image = tcs_loader(image)
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank26]:     img = pil_loader(img_value_str)
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank26]:     img = Image.open(buff)
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank26]:     raise UnidentifiedImageError(msg)
[rank26]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb01e6e5d50>
[rank26]: During handling of the above exception, another exception occurred:
[rank26]: Traceback (most recent call last):
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank26]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank26]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank26]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank26]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank26]:     return self.dispatch_line(frame)
[rank26]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank26]:     if self.quitting: raise BdbQuit
[rank26]: bdb.BdbQuit
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_114818.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_114818.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Blurred out Christmas Lights", under "Andrew Walker" section\''}]}]
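The rank26 secondary traceback shows why the job dies: the `except` branch around `dataset.py:364` is running under a debugger trace, and in a spawned DataLoader worker there is no attached terminal, so bdb aborts with `bdb.BdbQuit` instead of pausing. A common alternative is to log the bad sample and resample another index. This is a minimal, self-contained sketch of that pattern — `RetryingDataset`, `_get_item`, and the `None`-as-corrupt stand-in are hypothetical, not the repo's code:

```python
import random


class RetryingDataset:
    """Sketch of a resample-on-failure __getitem__: on a bad sample, log it
    and draw another index instead of dropping into pdb (pdb in a DataLoader
    worker has no terminal, so the debugger aborts with bdb.BdbQuit)."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples
        self.max_retries = max_retries

    def _get_item(self, i):
        sample = self.samples[i]
        if sample is None:  # stand-in for a PIL.UnidentifiedImageError
            raise ValueError(f"cannot decode sample {i}")
        return sample

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = random.randrange(len(self.samples))
        raise RuntimeError("too many consecutive undecodable samples")

    def __len__(self):
        return len(self.samples)
```

Raising after `max_retries` keeps a systematically broken shard from looping forever while still surviving isolated corrupt objects.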
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b372a2f70> Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f100d61f150> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe9ec31b740> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_043935_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_043935_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Foo BAR | Trusted Community Engagement and Contributions'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> 
print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/l> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) .BytesIO object at 0x7fd9abc1e430> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc90fe2fa10> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_623134.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_623134.png'}, {'type': 'text', 'text': "\nClick on 'Cencelx'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspaTraceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_170945_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_170945_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Strikethrough'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): ise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faed92504f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_072203_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_072203_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'The Weather Channel - MSN - Sleeping'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faedb2273d0> /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_014600_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_014600_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Heading 1'"}]}] /aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in 
load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9b7fc3470> File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdfac4ba480> Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/32818.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/32818.jpg'}, {'type': 'text', 'text': '\nWhere is the abar nav > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) /dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_163548_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_163548_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Insert Captions'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gTraceback (most recent call last): , in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b327776a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9ba4a2980> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, 
conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9abc1ea20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fdc405c9a80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/60552.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_33/C4web50k-1_95435155-split-15.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_33/C4web50k-1_95435155-split-15.png'}, {'type': 'text', 'text': '\nDetermine the spatial location of the UI element 3 years ago with bounding box in the image. Please click on it.'}]}] 4/gui_data/ui_data/GUICourse/guienv/chunk_39/C4web50k-2_184067045-split-1.png'}, {'type': 'text', 'text': '\nFind and give the bounding box coordinates of Our experienced team has helped thousands of Mainers navigate difficult times. From work-related longshore injuries, asbestos claims and Social Security Disability insurance. Once you are a member of our legal practice, we don’t make you go it alone after your case is resolved. in the picture. Click on it.'}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageZALORA_2024_3_5_17_3-26.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageZALORA_2024_3_5_17_3-26.png'}, {'type': 'text', 'text': '\nFind the location and bounding box of Navigate up in the image displayed. 
Please double-click on it.'}]}] Traceback (most recent call last): cess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03899670b0> y", line 701, in __next__ [rank44]: data = self._next_data() [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank44]: return self._process_data(data) [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank44]: data.reraise() [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank44]: raise exception [rank44]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank44]: Original Traceback (most recent call last): [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank44]: sample = self._get_item(i) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank44]: data_dict = self.preprocess_qwen2vl_v3( [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank44]: return self.preprocess_qwen2vl( [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank44]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank44]: image = tcs_loader(image) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank44]: img = pil_loader(img_value_str) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank44]: img = Image.open(buff) [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank44]: raise UnidentifiedImageError(msg) [rank44]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0dc787f1a0> [rank44]: During handling of the above exception, another exception occurred: [rank44]: Traceback (most recent call last): [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank44]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank44]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank44]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank44]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank44]: print(f"Failed to fetch sample {i}. Exception:", e) [rank44]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank44]: print(f"Failed to fetch sample {i}. Exception:", e) [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank44]: return self.dispatch_line(frame) [rank44]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank44]: if self.quitting: raise BdbQuit [rank44]: bdb.BdbQuit Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_105820.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_105820.png'}, {'type': 'text', 'text': '\nClick on \'The element is an Instagram icon located in the bottom section of the webpage. It is among a group of other social media icons (Facebook, Twitter, YouTube) under the text "Follow Us." 
It is visually represented by a rounded square with a camera glyph inside.\''}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_25/C4web50k-1_93511376-split-1.png"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9abc1cea0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc738099210> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_135204_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_135204_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Lines (Stylish)'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfLoad image from ceph: 
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155144_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155144_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'AutoSave'"}]}]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd9abcfa890>
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_148095.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_148095.png'}, {'type': 'text', 'text': "\nClick on 'add Donner Meat to my order'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_685661.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_685661.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Different vents in your home"\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/mojitianqi/screen_00000066.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240326/20240326_filtered/mojitianqi/screen_00000066.jpg'}, {'type': 'text', 'text': '\nTake a closer look at the elements contained in the [[855, 760, 961, 821]] region of the screenshot. Please right-click on it.'}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
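Every one of these failures has the same root cause: the bytes fetched from Ceph are not a decodable image, so `Image.open` raises `PIL.UnidentifiedImageError` inside `pil_loader` (dataset.py:40). A minimal sketch of a more defensive loader follows; the function name comes from the traceback, but the validation logic and the `None` return contract are assumptions, not the project's actual code:

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader(img_bytes: bytes):
    """Decode raw bytes into an RGB PIL image; return None on corrupt data.

    Returning None (instead of raising) lets the caller skip the sample
    rather than crashing the DataLoader worker.
    """
    if not img_bytes:  # empty object fetched from storage
        return None
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # force a full decode so truncated files fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None
```

The caller (`__call__` at dataset.py:64) would then check for `None` and resample another index, instead of letting the exception escape the worker.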
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_139770.png
erate/data_loader.py", line 552, in __iter__
[rank29]:     current_batch = next(dataloader_iter)
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank29]:     data = self._next_data()
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank29]:     return self._process_data(data)
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank29]:     data.reraise()
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank29]:     raise exception
[rank29]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank29]: Original Traceback (most recent call last):
[rank29]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank29]:     sample = self._get_item(i)
[rank29]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank29]:     data_dict = self.preprocess_qwen2vl_v3(
[rank29]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank29]:     return self.preprocess_qwen2vl(
[rank29]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank29]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank29]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank29]:     image = tcs_loader(image)
[rank29]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank29]:     img = pil_loader(img_value_str)
[rank29]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank29]:     img = Image.open(buff)
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank29]:     raise UnidentifiedImageError(msg)
[rank29]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f03899670b0>
[rank29]: During handling of the above exception, another exception occurred:
[rank29]: Traceback (most recent call last):
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank29]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank29]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank29]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank29]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank29]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank29]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank29]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank29]:     return self.dispatch_line(frame)
[rank29]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank29]:     if self.quitting: raise BdbQuit
[rank29]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
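The `bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0` above is a secondary failure: a `pdb` breakpoint fires inside `__getitem__`, but the worker is a forked subprocess with no usable stdin, so the debugger sees EOF, sets its quit flag, and `BdbQuit` escapes on the next traced line, killing the worker. A small stdlib-only sketch of the mechanism (the `traced` function is hypothetical; the practical fix is to debug with `num_workers=0` so samples load in the main process):

```python
import io
import pdb


def traced():
    # pdb stops on this line, tries to read a command, and gets EOF
    x = 1
    return x


# Simulate a DataLoader worker: a debugger is attached, but its input
# stream is empty, just like a forked worker with no terminal.
debugger = pdb.Pdb(stdin=io.StringIO(""), stdout=io.StringIO())

# On EOF, pdb sets its quitting flag and the next trace event raises
# BdbQuit.  Bdb.runcall swallows it, so the call is aborted and returns
# None instead of 1.  In a DataLoader worker nothing swallows it, so the
# exception propagates and the worker process dies, as in the log above.
result = debugger.runcall(traced)
```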
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_809931.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_809931.png'}, {'type': 'text', 'text': "\nClick on 'Football Shin Pads Guard Sporting Lisbon FC (BNWT)-FirstScoreSport'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_504627.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_504627.png'}, {'type': 'text', 'text': "\nClick on 'the textarea'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_310598.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_310598.png'}, {'type': 'text', 'text': "\nClick on 'BARNES & NOBLE, on the bottom right of the page'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagePodcasts_2024_3_9_13_58-79.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagePodcasts_2024_3_9_13_58-79.png'}, {'type': 'text', 'text': '\nMark the area occupied by the UI element Sort with bounding box in the provided image. Please double-click o
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_116080.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_116080.png'}, {'type': 'text', 'text': '\nClick on \'The green "BOOK ONLINE" button, located beneath the "Booking and Contact" section on the left side of the page.\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_131117_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_131117_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Google Cloud pricing'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_594474.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_594474.png'}, {'type': 'text', 'text': '\nClick on \'Product Enquiry link, under the "Paneer kulcha"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_794928.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_794928.png'}, {'type': 'text', 'text': "\nClick on 'Specifications link'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_130510.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_130510.png'}, {'type': 'text', 'text': '\nClick on \'The Instagram icon in the "CONNECT WITH US ONLINE" section next to the Facebook icon.\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image24809.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image24809.jpg'}, {'type': 'text', 'text': '\nWhat is the location? Explain your answer based on UI element. Double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_422236.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_422236.png'}, {'type': 'text', 'text': "\nClick on 'Cookie Preferences, opens a dedicated popup modal window'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_163245.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_163245.png'}, {'type': 'text', 'text': "\nClick on 'type company name'"}]}]
data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc163917150> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdd1bed74c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample => /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) _ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc2b36c3060> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image66929.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_datLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_034011_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_034011_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'New Tab'"}]}] (Pdb) hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6b6b8880e0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022114_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022114_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on '1/'"}]}] Traceback (most recent call last): //dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_118190.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_118190.png'}, {'type': 'text', 'text': "\nClick on 'Light grey Facebook button with white border, 'f' icon, and 'Share to Facebook' text'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdc48ad1990> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f58bb7b0950> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5ff33479c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_103702.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_103702.png'}, {'type': 'text', 'text': "\nClick on 'HillCenter small'"}]}] dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc163a133d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0251f7560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_48313.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_48313.png'}, {'type': 'text', 'text': '\nClick on \'A navigation link located in the top right corner of the webpage header, containing the text "CONTACT US" in blue, aligned with other navigation links.\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_114262.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_114262.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Altoids Survival Kit – Concealed Mini EDC page'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_91907.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_91907.png'}, {'type': 'text', 'text': "\nClick on 'navigate to 'Disinformers' region page'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_409291.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_409291.png'}, {'type': 'text', 'text': '\nClick on 
\'button for "MEN", at the top of the webpage\''}]}] eph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb01c3d7290> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240729155132467/main_227.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240729155132467/main_227.png'}, {'type': 'text', 'text': '\n角色:移动应用UI/UX设计顾问\n背景:客户希望优化其移动应用的用户界面和用户体验,需要输出相应的XML代码片段,包括元素类型(node_type)、位置(location)、内容(content)等属性。\n目标:提供一个结构清晰、准确的XML代码片段,以满足客户对移动应用用户界面和用户体验优化的需求。\n限制:XML代码片段必须遵循移动应用设计标准,易于理解,同时确保信息的精确性和完整性。\n输出格式:XML格式,包含element_type, position, text等属性。\n工作流程:\n分析移动应用的用户界面和用户体验设计需求。\n根据分析结果,确定XML代码片段中的element_type, position, text等属性。\n编写符合规范的XML代码片段: Please right-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_134517_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_134517_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Transform'"}]}] h(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f083aef4a40> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_469843.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_469843.png'}, {'type': 'text', 'text': "\nClick on 'icon, titled googleplus'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_582415.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_582415.png'}, {'type': 'text', 'text': '\nClick on \'link for "TruCap+"\''}]}] GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56255b71a0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde636c6e80> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify 
image file <_io.BytesIO object at 0x7fc21cf345e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/s> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) s_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/(Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/q> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_47322.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_47322.png'}, {'type': 'text', 'text': "\nClick on '2022 (New tab)'"}]}] sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_143747_before_screenshot_sub3.png File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_143747_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Show desktop'"}]}] PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe928e00a40> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_498691.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_498691.png'}, {'type': 'text', 'text': '\nClick on \'Blue "f" icon in a white square border\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde636c5c60> ge.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file 
<_io.BytesIO object at 0x7faf63729b20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f574f0d3ce0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_122150_before_screenshot_sub3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_122150_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'T3'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbe4948fa10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_070059_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_070059_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'MSNBC - MSN - Sleeping'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image48756.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image48756.jpg'}, {'type': 'text', 'text': '\nCreate a tree-style visualization of the page layout using the provided image. Click on it.'}]}] /aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0c5b91d490> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/96db3922-0fe0-4823-9f37-b089829476b8.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/96db3922-0fe0-4823-9f37-b089829476b8.png'}, {'type': 'text', 'text': '\n File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde637df420> pace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbe53831f80> conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_182854_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Record'"}]}] process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc2b3543a10> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0164cf100> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbf6d92ca90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) L File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd57ef59580> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = 
self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fcb1ca766b0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_051650_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_051650_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'https://scholar.google.com/'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample =Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/baidutieba/screen_00000017.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/baidutieba/screen_00000017.jpg'}, {'type': 'text', 'text': '\n请将图片上的所有文字输出给我,并给出文字的边界框。 Click on it.'}]}] s = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0251f7560> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_564579.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_564579.png'}, {'type': 'text', 'text': '\nClick on \'How Can You Tell If Your Replica Rolex Watches For button, in "Triple Grade A Replica Red Rolex Presidential Bracelet"\''}]}] Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc0164b7240> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_746451.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_746451.png'}, {'type': 'text', 'text': "\nClick on 'Share via Facebook'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._gconv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_131948_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'YouTube'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_579391.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_579391.png'}, {'type': 'text', 'text': "\nClick on 'the text area'"}]}] 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd6eabaf650> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) 542160> > /mnt/hwfile/liuzhaoyang/workspaceLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image19592.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image19592.jpg'}, {'type': 'text', 'text': '\nWhat is the type of account? Explain your answer based on UI element. 
Right-click on it.'}]}] e 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f607bdb8040> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = 
Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fad6d498770> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_156673.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_156673.png'}, {'type': 'text', 'text': "\nClick on 'text area'"}]}] /workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b372a2f70> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0dc787f290> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd93d2be7a0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_015418_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_015418_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Tips & Tricks'"}]}] ate Events", at the bottom of the image\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_560273.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_560273.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Nancy Pelosi in 
Singapore"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc2b3542fc0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_021045_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_021045_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'ABC News'"}]}] [rank23]: Traceback (most recent call last): [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank23]: train() [rank23]: File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank23]: trainer.train() [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank23]: return inner_training_loop( [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank23]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank23]: batch_samples += [next(epoch_iterator)] [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank23]: current_batch = next(dataloader_iter) [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank23]: data = self._next_data() [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank23]: return self._process_data(data) [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank23]: data.reraise() [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank23]: raise exception [rank23]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank23]: Original Traceback (most recent call last): [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank23]: sample = self._get_item(i) [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank23]: data_dict = self.preprocess_qwen2vl_v3( [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank23]: return self.preprocess_qwen2vl( [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank23]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank23]: image = tcs_loader(image) [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank23]: img = pil_loader(img_value_str) [rank23]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank23]: img = Image.open(buff) [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank23]: raise UnidentifiedImageError(msg) [rank23]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fcb1ca766b0> [rank23]: During handling of the above exception, another exception occurred: [rank23]: Traceback (most recent call last): [rank23]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank23]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank23]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank23]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank23]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank23]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank23]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank23]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank23]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank23]:     return self.dispatch_line(frame)
[rank23]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank23]:     if self.quitting: raise BdbQuit
[rank23]: bdb.BdbQuit
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc8cb2e4fe0>
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_002111_before_screenshot.png
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_181955_before_screenshot_sub2.png
conv: [{'role': 'use
(Pdb)
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_144794.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_144794.png'}, {'type': 'text', 'text': "\nClick on 'initiate search for user contributions with the provided fulltext'"}]}]
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs =
load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd76c9ee3e0>
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
PIL.UnidentifiedImageError: cannot identify image file
<_io.BytesIO object at 0x7fc43d3aa340>
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageLifesum_2024_3_6_19_4-147.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageLifesum_2024_3_6_19_4-147.png'}, {'type': 'text', 'text': "\nIdentify and give the bounding box coordinates of I don't have any specific preferences seen in the image. Please right-click on it."}]}]
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0b32676570>
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_054644_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_054644_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Earthquakes'"}]}]
  File
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd459732520>
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_173638_before_screenshot.png
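Every failure in this run is the same `PIL.UnidentifiedImageError` raised from a `BytesIO`, which almost always means the bytes fetched from ceph are not image data at all (a truncated object, or the object store's error payload) rather than a Pillow decoder bug. A magic-number sniff before `Image.open` makes that distinction cheap to log; this is an illustrative stdlib-only sketch, and `sniff_image_format` is a hypothetical helper, not a function from dataset.py:

```python
# Magic-number prefixes for the formats seen in these datasets (.png/.jpg/.gif).
_SIGNATURES = {
    b"\x89PNG\r\n\x1a\n": "png",
    b"\xff\xd8\xff": "jpeg",
    b"GIF87a": "gif",
    b"GIF89a": "gif",
}


def sniff_image_format(data: bytes):
    """Return the image format implied by the leading bytes, else None."""
    for magic, fmt in _SIGNATURES.items():
        if data.startswith(magic):
            return fmt
    return None


# A valid PNG header is recognized; an S3-style XML error body is not.
assert sniff_image_format(b"\x89PNG\r\n\x1a\n" + b"rest-of-file") == "png"
assert sniff_image_format(b"<?xml version='1.0'?><Error>NoSuchKey</Error>") is None
```

Logging the first 16 bytes next to the s3 path whenever the sniff fails would show immediately whether the store returned an error document instead of the screenshot.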
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_173638_before_screenshot.png'}, {'type': 'text', 'text': '\nClick on \'Swap "from" and "to" languages.\''}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
//dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_762197.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_762197.png'}, {'type': 'text', 'text': "\nClick on 'A link to supplier shipping instructions page'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_176917.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_176917.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the '8 Best Food Apps To Have While Traveling' page'"}]}]
    raise
UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdf1cc534c0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
  File
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd76d9a3ec0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_192119_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'New Slide'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_798335.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_798335.png'}, {'type': 'text', 'text': '\nClick on \'Everyone, everywhere, all at once: Integrity screening for police, above "Workforce 2023 – getting used to a new normal"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_052551_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_052551_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Real Estate'"}]}]
roidUI/20240326/20240326_filtered/GIFzhizuo/screen_00000529.jpg'}, {'type': 'text', 'text': '\nIdentify the exact position of the "佩服" element in the screenshot and output its coordinates. Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_708849.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_708849.png'}, {'type': 'text', 'text': "\nClick on 'What we do'"}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_car/20240606/01142.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_car/20240606/01142.png'}, {'type': 'text', 'text': '\nAnalyze the given image, locate the battery-level indicator icon, and give its position coordinates. Double-click on it.'}]}]
  File
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde637e18f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fc16390fa10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) r', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_145724_original_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Tools'"}]}] Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f58bb8de930> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item 
data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdb532bf830> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_743682.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_743682.png'}, {'type': 'text', 'text': "\nClick on '100 Thieves Shop - Official 100 Thieves Merchandise Store'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_111659.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_111659.png'}, {'type': 'text', 'text': '\nClick on \'The button at the far right of the navigation bar that is labeled "EVENTS" in uppercase letters with a light green background.\''}]}] > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_30395.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_30395.png'}, {'type': 'text', 'text': "\nClick on 'navigate to main() function documentation in the Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_151896.png in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file 
<_io.BytesIO object at 0x7f0b32673c90> x7f56255942c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_165752_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Redo'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdb532bfe20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in 
_get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f58bb7b09a0> [rank59]: Traceback (most recent call last): [rank59]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank59]: train() [rank59]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank59]: trainer.train() [rank59]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank59]: return inner_training_loop( [rank59]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank59]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank59]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank59]: 
batch_samples += [next(epoch_iterator)] [rank59]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank59]: current_batch = next(dataloader_iter) [rank59]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank59]: data = self._next_data() [rank59]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank59]: return self._process_data(data) [rank59]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank59]: data.reraise() [rank59]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank59]: raise exception [rank59]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
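Every dataset failure above bottoms out in the same frame: `pil_loader` (dataset.py:40) hands `Image.open` a `BytesIO` of the bytes fetched from Ceph/S3, and PIL raises `UnidentifiedImageError` when the payload is truncated, empty, or not an image at all. A minimal sketch of a tolerant decoder is below; the name `pil_loader_safe` and the skip-on-failure contract are assumptions for illustration, not the actual helpers in `dataset.py` (which are not shown in this log):

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader_safe(img_bytes: bytes):
    """Decode raw bytes into an RGB PIL image.

    Returns None instead of raising when PIL cannot identify the
    payload (e.g. a truncated or corrupt object fetched from S3),
    so the caller can skip or resample rather than crash.
    """
    buff = io.BytesIO(img_bytes)
    try:
        img = Image.open(buff)
        # convert() also forces a full decode, surfacing truncation early.
        return img.convert("RGB")
    except UnidentifiedImageError:
        return None
```

A caller would check for `None` and treat the sample as missing, which turns each corrupt object into one skipped sample instead of a worker-killing exception.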
[rank59]: Original Traceback (most recent call last):
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank59]:     sample = self._get_item(i)
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank59]:     data_dict = self.preprocess_qwen2vl_v3(
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank59]:     return self.preprocess_qwen2vl(
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank59]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank59]:     image = tcs_loader(image)
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank59]:     img = pil_loader(img_value_str)
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank59]:     img = Image.open(buff)
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank59]:     raise UnidentifiedImageError(msg)
[rank59]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f58bab581d0>
[rank59]: During handling of the above exception, another exception occurred:
[rank59]: Traceback (most recent call last):
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank59]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank59]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank59]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank59]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank59]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank59]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank59]:     return self.di
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_151896.png'}, {'type': 'text', 'text': "\nClick on 'navigate to WSET Diploma Graduation 2015 page'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56256b2ac0>
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_820629.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_820629.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Pew Research Center"\''}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdb532bfc40>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8ec244f100>
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240322/20240322_filtered/youku/screen_00000174.jpg conv: [{'role'
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_183501_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_183501_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Clear Recording'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f501ee9ab10>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde636c8450>
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_141890.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_141890.png'}, {'type': 'text', 'text': "\nClick on 'input field'"}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56255ab9c0>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc21daee570>
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_770408.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_770408.png'}, {'type': 'text', 'text': '\nClick on \'Orange-yellow gradient button with blurred placeholder text, under "ACCUEIL"\''}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd93d2be7a0>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6cfeacd710>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc90fe2ed90>
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image7792.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image7792.jpg'}, {'type': 'text', 'text': '\nWhat options are given for selecting? Explain your answer based on UI element. Please double-click on it.'}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56349a1990>
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image14362.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image14362.jpg'}, {'type': 'text', 'text': '\nWhat is the name of the application? Explain your answer based on UI element. Right-click on it.'}]}]
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde63bf23e0>
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fddd9c83510>
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_257323.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_257323.png'}, {'type': 'text', 'text': "\nClick on 'More actions'"}]}]
(Pdb)
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbf6d9> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_120756_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_120756_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Sign in - Google Accounts'"}]}] Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)ct = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f20277b03b0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56349a2ca0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6cb6386980> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) on3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank55]: return self._process_data(data) [rank55]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank55]: data.reraise() [rank55]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank55]: raise exception [rank55]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank55]: Original Traceback (most recent call last): [rank55]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank55]: sample = self._get_item(i) [rank55]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank55]: data_dict = self.preprocess_qwen2vl_v3( [rank55]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank55]: return self.preprocess_qwen2vl( [rank55]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank55]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank55]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank55]: image = tcs_loader(image) [rank55]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank55]: img = pil_loader(img_value_str) [rank55]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank55]: img = Image.open(buff) [rank55]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank55]: raise UnidentifiedImageError(msg) [rank55]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6cfeacd710> [rank55]: During handling of the above exception, another exception occurred: [rank55]: Traceback (most recent call last): [rank55]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank55]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank55]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank55]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank55]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank55]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank55]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank55]: print(f"Failed to fetch sample {i}. Exception:", e) [rank55]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank55]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank55]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank55]: return self.dispatch_line(frame) [rank55]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank55]: if self.quitting: raise BdbQuit [rank55]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde636c5170> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_241628.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_241628.png'}, {'type': 'text', 'text': "\nClick on '‘Profound’ submarine blow to AUKUS on eve of anniversary'"}]}] ng/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)reprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6e40c2cef0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043650_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043650_before_screenshot_sub1.png'}, 
{'type': 'text', 'text': "\nClick on 'Privacy Help Center - Policies Help'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_030348_before_screenshot_sub3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_030348_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'fmkorea.com'"}]}] (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/wor File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f502b33a9d0> .BytesIO object at 0x7f61360d3ce0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) 
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f20275f62a0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_376698.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_376698.png'}, {'type': 'text', 'text': "\nClick on 'Follow on X'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde636c76a0> Load image from ceph: 
langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_183955_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_183955_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Reset to Cameo'"}]}] Traceback (most recent call last): Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_141501_before_screenshot_sub1.png File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/s> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f52b45cc900> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f52b3943c40> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd6ee07ef70> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_025619_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_025619_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Shopping in Microsoft Edge'"}]}] nt/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.pLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_330489.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_330489.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_330489.png'}, {'type': 'text', 'text': "\nClick on 'Film Festival, on the top left of the image'"}]}]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56255aa520>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
[The same traceback, differing only in the BytesIO object address, recurs across worker ranks; each failure is caught at dataset.py:364 and drops into the Pdb breakpoint. Duplicate tracebacks and Pdb prompts omitted. Each conv payload repeats the image path plus the quoted text prompt. Samples being loaded while these failures occurred:]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_48/C4web50k-3_274629199-split-0.png  text: '\nPinpoint the location of the UI element Front Line Defenders on the image with bounding box. Click on it.'
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_402730.png  text: '\nClick on \'Balm, to the top left of "Vivid or moody colors, contrasting prints"\''
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_025944_before_screenshot_sub0.png  text: "\nClick on 'Explore'"
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_200332_before_screenshot_sub0.png  text: "\nClick on 'English English'"
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_820211.png  text: "\nClick on 'Circular gradient button with 'f' text, 3D effect, and anchor tag'"
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/iqiyi/screen_00000360.jpg  text: '\n推荐页面中包含哪些类型的信息呢?(with grounding) Click on it.' (i.e., "What kinds of information does the recommendations page contain? (with grounding) Click on it.")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_57307.png  text: "\nClick on 'A grey circular button with an upward-pointing arrow inside.'"
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_74211.png  text: "\nClick on 'Black outline SVG icon or graphic with a simple, minimalistic design'"
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image43671.jpg  text: '\nWhich tab is selected? Answer the question using a single word or phrase. Please double-click on it.'
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_588468.png  text: '\nClick on \'CLICK Simple, modern search button with \'SEARCH\' text and subtle shadow, to the top right of "RETURN AND REFUND POLICY"\''
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_085157_before_screenshot_sub0.png  text: '\nClick on \'""\'s avatar\''
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_712264.png  text: "\nClick on 'TotalMedia'"
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image65961.jpg  text: '\nWhat is the total price of an economy car? Explain your answer based on UI element. Please right-click on it.'
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_180548_before_screenshot.png  text: "\nClick on 'Zoom Out'"
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/agent_data/aig_share/ui_agi_appdata_phone/ui_agi_appdata_phone_20240606_reorg_filtered/04760.png  text: '\n对屏幕截图中的返回按钮进行边界框定位。 Click on it.' (i.e., "Locate the back button in the screenshot with a bounding box. Click on it.")
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_21599.png  text: '\nClick on \'The navigation menu item labeled "WHY YELLOW BRICK", located in the header section of the webpage, between the "YELLOW BRICK SYSTEMS" logo and other menu items like "WHAT WE DO".\''
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_064939_before_screenshot.png  text: "\nClick on 'Google Account Help'"
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_117833.png  text: "\nClick on 'View Tickets link, at the top right corner of the screenshot'"
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image40149.jpg  text: '\nConstruct a tree-based representation of the page layout extracted from the screenshot. Please double-click on it.'
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_12385.png  text: "\nClick on 'filter content by 'goddess arts''"
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_114112_before_screenshot_sub0.png  text: "\nClick on 'Redo Apply Quick Style'"
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56255aa7f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0012681c60> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict 
= self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8a0c7cb600> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022500_before_screenshot_sub2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022500_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'LinkedIn'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6f02acdda0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item 
data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6a30403b00> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde636c5c10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56343da390> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch 
sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image41597.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image41597.jpg'}, {'type': 'text', 'text': '\nWhat is the given button used to save the category? Answer the question using a single word or phrase. Please right-click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f505a321ee0> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_60036.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_60036.png'}, {'type': 'text', 'text': "\nClick on 'Dark bold text on light blue background, centered and underlined, linking to Jerusalem clay fragment information, on the bottom left of the image'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_20885.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_20885.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Bez kategorii category'"}]}] n2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde636c59e0> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) able this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) t): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fcb1caa1530> Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc8cb2e7330> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_733619.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_733619.png'}, {'type': 'text', 'text': '\nClick on \'3 of 2, to the top left of "Top Selling Products"\''}]}] /liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, 
conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f56255ab510> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_195325_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_195325_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'See more hotels'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3ef4669940> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = 
self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f502b33bec0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_040603_before_screenshot_sub3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_040603_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on ''m'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_115803_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_115803_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Eating Your Way to a Heart Attack? 5 Foods to Cut Out Now'"}]}] t_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc63691da80> n2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise 
UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f09b46d1da0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8a0c7cb100> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2v> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe9ec67f1f0>
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_712364.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_712364.png'}, {'type': 'text', 'text': "\nClick on 'Oliver Jeffers'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/89d8b927-accf-43d1-a106-58fd99ed138f.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/89d8b927-accf-43d1-a106-58fd99ed138f.png'}, {'type': 'text', 'text': '\nIdentify the bounding box for the element in the screen to complete the task: Shared laughs and made memories with friends.. Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_67311.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_67311.png'}, {'type': 'text', 'text': "\nClick on 'Gamer Juice'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_004908_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_004908_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Intense Reference'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_52171.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_52171.png'}, {'type': 'text', 'text': "\nClick on 'Awesome Image'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_53140.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_53140.png'}, {'type': 'text', 'text': '\nClick on \'link for "Prepare for Certified Secure Software Lifecycle Professional (CSSLP) exam with uCertify"\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image69224.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image69224.jpg'}, {'type': 'text', 'text': '\nCreate a tree-like structure of the page layout based on the screenshot. Click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/52f6c6c5-8088-40dd-8cb2-acafd341fe12.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/52f6c6c5-8088-40dd-8cb2-acafd341fe12.png'}, {'type': 'text', 'text': '\nWhat are the precise coordinates of the bounding box where I can click to finalize the instruction: 3 min. ago. Right-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240328/20240328_filtered/xuexiqiangguo/screen_00000095.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240328/20240328_filtered/xuexiqiangguo/screen_00000095.jpg'}, {'type': 'text', 'text': '\n图像的[[305, 46, 616, 83]]部分,具体是哪些元素? Right-click on it.'}]}]
Right-click on it.'}]}] kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd93d2be7a0> [rank20]: Traceback (most recent call last): [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank20]: train() [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank20]: trainer.train() [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank20]: return inner_training_loop( [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank20]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank20]: batch_samples += [next(epoch_iterator)] [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank20]: current_batch = next(dataloader_iter) [rank20]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank20]: data = self._next_data() [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank20]: return self._process_data(data) [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank20]: data.reraise() [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank20]: raise exception [rank20]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank20]: Original Traceback (most recent call last): [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank20]: sample = self._get_item(i) [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank20]: data_dict = self.preprocess_qwen2vl_v3( [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank20]: return self.preprocess_qwen2vl( [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank20]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank20]: image = tcs_loader(image) [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank20]: img = pil_loader(img_value_str) [rank20]: File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank20]: img = Image.open(buff) [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank20]: raise UnidentifiedImageError(msg) [rank20]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd8dd1062a0> [rank20]: During handling of the above exception, another exception occurred: [rank20]: Traceback (most recent call last): [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank20]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank20]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank20]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank20]: print(f"Failed to fetch sample {i}. Exception:", e) [rank20]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank20]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank20]: return self.dispatch_line(frame) [rank20]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank20]: if self.quitting: raise BdbQuit [rank20]: bdb.BdbQuit File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f05da59a200> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = 
self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efb424c3ab0> Traceback (most recent call last): process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1da0350db0> Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_177017.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_177017.png'}, {'type': 'text', 'text': '\nClick on \'the image of "(Asc)", to the bottom left of "Abstracting & Indexing"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_125416_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_125416_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'New Tab'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_816264.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_816264.png'}, {'type': 'text', 'text': "\nClick on '1 comment: button, at the top of the image'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140313_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140313_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Repeat Doc Close'"}]}]
//st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_5_14_14_53_53e98c02668847ef930e8dcd0a2a2ec0-4.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_266393.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_266393.png'}, {'type': 'text', 'text': "\nClick on 'dui lawyer consultation'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_203755_before_screenshot_sub0.png
[{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_381651.png'}, {'type': 'text', 'text': "\nClick on 'Contact customer service'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_743167.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_743167.png'}, {'type': 'text', 'text': "\nClick on 'Add to cart: “Scotch Brite Sponge Cloth (8/11)”'"}]}]
{'type': 'text', 'text': '\nWhat is the status of "Use Short Cuts"? Explain your answer based on UI element. Right-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image61200.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image61200.jpg'}, {'type': 'text', 'text': '\nHow many steps are there? Answer the question using a single word or phrase. Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/AndroidUI/ui_caption_inhouseqqyinyue_screen_00000132.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/AndroidUI/ui_caption_inhouseqqyinyue_screen_00000132.jpg'}, {'type': 'text', 'text': '\nCan you describe the contents of this image in detail? Click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_408477.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_408477.png'}, {'type': 'text', 'text': "\nClick on 'Type into the text input box, at the top right corner of the image'"}]}]
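Every failure in this log is the same `UnidentifiedImageError` out of `pil_loader`: the bytes fetched from Ceph are not a decodable image (an empty body, a truncated object, or an error payload returned as the response). A minimal defensive sketch of such a loader — `pil_loader_safe` is a hypothetical name, not the function in `dataset.py` — that fails with a diagnosable message instead of an opaque `BytesIO` repr:

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader_safe(img_bytes: bytes) -> Image.Image:
    """Decode raw object-store bytes into a PIL image, failing loudly.

    Image.open raises UnidentifiedImageError when the buffer is empty or not
    a known image format, so surface the first bytes of the payload to make
    the bad object diagnosable from the log alone.
    """
    if not img_bytes:
        raise ValueError("empty image payload fetched from object store")
    buff = io.BytesIO(img_bytes)
    try:
        img = Image.open(buff)
        img.load()  # force a full decode so truncated files fail here, not later
    except UnidentifiedImageError as e:
        raise ValueError(
            f"undecodable image payload, first bytes: {img_bytes[:16]!r}"
        ) from e
    return img.convert("RGB")
```

Printing the leading bytes usually distinguishes a corrupt image from, say, an XML error document returned by the S3 gateway.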
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_19120.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_19120.png'}, {'type': 'text', 'text': "\nClick on 'Index'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_063317_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_063317_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Contents'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022126_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022126_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Is this helpful?'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_511324.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_511324.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Disney Cruis Line Offer"\''}]}]
{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/47211.jpg'}, {'type': 'text', 'text': '\nCan you provide the click location with bounding box for completing the task: click on loop check box. Click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_40312.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_221644_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_221644_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'We need data to give you answers. Try an Example'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_98253.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_98253.png'}, {'type': 'text', 'text': '\nClick on \'A navigation menu item labeled "GIVE BACK +" located at the top of the webpage, to the right of the "LIFESTYLE +" menu item.\''}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_060200_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_060200_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'View comments 10 Comment'"}]}] t/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f90532b8fe0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) [rank63]: Traceback (most recent call last): [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank63]: train() [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank63]: trainer.train() [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank63]: return inner_training_loop( [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank63]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank63]: batch_samples += [next(epoch_iterator)] [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank63]: current_batch = next(dataloader_iter) [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank63]: data = self._next_data() [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank63]: return self._process_data(data) [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank63]: data.reraise() [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank63]: raise 
exception [rank63]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank63]: Original Traceback (most recent call last): [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank63]: sample = self._get_item(i) [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank63]: data_dict = self.preprocess_qwen2vl_v3( [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank63]: return self.preprocess_qwen2vl( [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank63]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank63]: image = tcs_loader(image) [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank63]: img = pil_loader(img_value_str) [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank63]: img = Image.open(buff) [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank63]: raise UnidentifiedImageError(msg) [rank63]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f21dc57cd10> [rank63]: During handling of the above exception, another exception occurred: [rank63]: Traceback (most recent call last): [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank63]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank63]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank63]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank63]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank63]: print(f"Failed to fetch sample {i}. Exception:", e) [rank63]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank63]: print(f"Failed to fetch sample {i}. Exception:", e) [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank63]: return self.dispatch_line(frame) [rank63]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank63]: if self.quitting: raise BdbQuit [rank63]: bdb.BdbQuit huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_509183.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_509183.png'}, {'type': 'text', 'text': "\nClick on 'Help, opens dialogs'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_042814_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_042814_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Permission to us
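The tokenizers warning above names its own fix: set the environment variable before any tokenizer is constructed, i.e. before DataLoader workers fork. A one-line sketch that could sit at the top of the training entry point (placement is an assumption; the variable itself is the one the warning cites):

```python
import os

# Explicitly disable the tokenizers Rust-side thread pool before the
# process forks (e.g. before DataLoader workers are spawned), so the
# "process just got forked" warning and its deadlock risk never arise.
os.environ["TOKENIZERS_PARALLELISM"] = "false"
```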
Exception:", e) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_27718.png'}, {'type': 'text', 'text': "\nClick on 'check 'unpublish' command'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe8a85f36f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 
706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f085e6a0ef0> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_28_2_21_8f932a61650a44af9989eed51f55a064-22.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_28_2_21_8f932a61650a44af9989eed51f55a064-22.png'}, {'type': 'text', 'text': '\nIdentify the position and bounding box of Back in the visual content. 
Right-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1da0354c20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3b28dd2480> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fd6eabad8f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) in possibly_batched_index] [rank56]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank56]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank56]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank56]: print(f"Failed to fetch sample {i}. Exception:", e) [rank56]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank56]: print(f"Failed to fetch sample {i}. Exception:", e) [rank56]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank56]: return self.dispatch_line(frame) [rank56]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank56]: if self.quitting: raise BdbQuit [rank56]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8ec17c1ee0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ser', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/AndroidUI/ui_caption_inhouseBzhan_screen_00000313.jpg'}, {'type': 'text', 'text': '\n描述下图里有什么 Please right-click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/2024080113334766/main_323_after.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/2024080113334766/main_323_after.png'}, {'type': 'text', 'text': '\n提供一个结构合理、准确的json布局描述,以满足开发者对软件产品用户界面设计的需求。\n限制:json布局描述必须遵守软件设计规范,易于实现,同时确保信息的准确性和完整性。\n输出格式:json格式,包含node_type, location, content等属性。\n工作流程:\n\n分析软件产品的用户界面设计需求。\n根据分析结果,确定json布局描述中的node_type, location, content等属性。\n编写符合规范的json布局描述。 Please click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file 
<_io.BytesIO object at 0x7fc163917510> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802185106940/main_114_after.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0802_099_batch_4_num_2800/20240802185106940/main_114_after.png'}, {'type': 'text', 'text': '\n角色:移动应用UI/UX设计顾问\n背景:客户希望优化其移动应用的用户界面和用户体验,需要输出相应的json代码片段,包括元素类型(node_type)、位置(location)、内容(content)等属性。\n目标:提供一个结构清晰、准确的json代码片段,以满足客户对移动应用用户界面和用户体验优化的需求。\n限制:json代码片段必须遵循移动应用设计标准,易于理解,同时确保信息的精确性和完整性。\n输出格式:json格式,包含element_type, position, text等属性。\n工作流程:\n分析移动应用的用户界面和用户体验设计需求。\n根据分析结果,确定json代码片段中的element_type, position, text等属性。\n编写符合规范的json代码片段: Please double-click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_214527_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_214527_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Ungroup...'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_032603_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_032603_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'list of asthma inhalers uk - Search'"}]}] l_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8f4badf2e0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194703_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194703_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'All Developer Events'"}]}] Portal"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_034351_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_034351_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'youtube.com'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd6eabad940> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc1639632e0> [rank54]: Traceback (most recent call last): [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank54]: train() [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank54]: trainer.train() [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank54]: return inner_training_loop( [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank54]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank54]: batch_samples += [next(epoch_iterator)] [rank54]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank54]: current_batch = next(dataloader_iter) [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank54]: data = self._next_data() [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank54]: return self._process_data(data) [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank54]: data.reraise() [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank54]: raise exception [rank54]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank54]: Original Traceback (most recent call last): [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank54]: sample = self._get_item(i) [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank54]: data_dict = self.preprocess_qwen2vl_v3( [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank54]: return self.preprocess_qwen2vl( [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank54]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank54]: image = tcs_loader(image) [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank54]: img = pil_loader(img_value_str) [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank54]: img = Image.open(buff) [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank54]: raise UnidentifiedImageError(msg) [rank54]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f085e6a0ef0> [rank54]: During handling of the above exception, another exception occurred: [rank54]: Traceback (most recent call last): [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank54]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank54]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank54]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank54]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank54]: print(f"Failed to fetch sample {i}. Exception:", e) [rank54]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank54]: print(f"Failed to fetch sample {i}. Exception:", e) [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank54]: return self.dispatch_line(frame) [rank54]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank54]: if self.quitting: raise BdbQuit [rank54]: bdb.BdbQuit Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_034327_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_034327_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: tab-24'"}]}] Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd6eabad800> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdfac4b84f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f3ef54ab9c0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3b1ebdacf0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_414815.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_414815.png'}, {'type': 'text', 'text': "\nClick on 'Go to Blog, on the top left of the webpage'"}]}] reenshot'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_52667.png Traceback (most recent call last): conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_52667.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Social Group List page'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd6eabafc40> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_500573.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_500573.png'}, {'type': 'text', 'text': "\nClick on 'the textbox'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd459757600> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_454565.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_454565.png'}, {'type': 'text', 'text': "\nClick on 'White-to-gray gradient background, sans-serif font: 'Backpacks' hyperlink'"}]}] /dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1da02da390> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8c38fa76f0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_605624.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_605624.png'}, {'type': 'text', 'text': "\nClick on 'Link To Facebook Profile'"}]}] (Pdb) wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd6eabad8f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3ef42c0d10> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_143805_original_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_143805_original_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Merge and Center Cells'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_360821.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_360821.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Moe Hannah LLP", above "FAMILY LAW BLOG"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): //dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_209055.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_209055.png'}, {'type': 'text', 'text': "\nClick on 'Unanswered'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1d914d4ef0> mages/multi_modal/agent_data/rico/dataset/image13397.jpg'}, {'type': 'text', 'text': '\nWhat is the application name? Explain your answer based on UI element. 
Please double-click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203443_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_203443_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Online Learning'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd45968ebb0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_022337_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_022337_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft Copilot'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1da19f9bc0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_354931.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_354931.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Ted Lare Design and Build"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> 
print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) aset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f001267e2f0> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/dcabd341-ed7d-4206-932c-4064f4107cc5.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/dcabd341-ed7d-4206-932c-4064f4107cc5.png'}, {'type': 'text', 'text': '\nHighlight the area with bounding box that I can click to execute the task: Show only web link previews. 
Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8ec17c1d50> ge.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3b28dd1080> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_53554.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_53554.png'}, {'type': 'text', 'text': '\nClick on \'A dark grey button with three light grey dots arranged in a triangular formation located to the right of the "Today" button and some calendar 
navigation controls.\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fde341668e0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, 
conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd8dc559ee0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa46582b740> 
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_143637.png
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_210607_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_210607_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'How to Use a Monitor With Your Closed Laptop'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_603694.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_603694.png'}, {'type': 'text', 'text': "\nClick on 'Print Friendly and PDF'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_315561.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_315561.png'}, {'type': 'text', 'text': '\nClick on \'Light blue input field with centered white placeholder text "Enter your search words here..."\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023735_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023735_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Privacy'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_785250.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_785250.png'}, {'type': 'text', 'text': "\nClick on 'Posts by CNN Newsource, in the middle of the page'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_042856_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_042856_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Payments Terms of Use | eBay.com - Sleeping'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_387614.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_387614.png'}, {'type': 'text', 'text': "\nClick on 'the textbox, on the left side of the page'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_153654_original_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_153654_original_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'main branch'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_177816.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_177816.png'}, {'type': 'text', 'text': '\nClick on \'23 SES 02 D, Policy Reforms and Teacher Professionalism (Part 1), titled Policy Reforms and Teacher Professionalism (Part 1), in the "Privacy notice" section\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_252649.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_252649.png'}, {'type': 'text', 'text': '\nClick on \'A long, hyperlinked text element with repeating "tippar" text\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_140817_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_140817_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Accept'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb)
[rank48]: Traceback (most recent call last):
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank48]:     train()
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank48]:     trainer.train()
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank48]:     return inner_training_loop(
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank48]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank48]:     batch_samples += [next(epoch_iterator)]
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank48]:     current_batch = next(dataloader_iter)
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank48]:     data = self._next_data()
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank48]:     return self._process_data(data)
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank48]:     data.reraise()
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank48]:     raise exception
[rank48]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank48]: Original Traceback (most recent call last):
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank48]:     sample = self._get_item(i)
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank48]:     data_dict = self.preprocess_qwen2vl_v3(
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank48]:     return self.preprocess_qwen2vl(
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank48]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank48]:     image = tcs_loader(image)
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank48]:     img = pil_loader(img_value_str)
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank48]:     img = Image.open(buff)
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank48]:     raise UnidentifiedImageError(msg)
[rank48]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc327fc7d30>
[rank48]: During handling of the above exception, another exception occurred:
[rank48]: Traceback (most recent call last):
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank48]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank48]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank48]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank48]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank48]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank48]:     return self.dispatch_line(frame)
[rank48]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank48]:     if self.quitting: raise BdbQuit
[rank48]: bdb.BdbQuit
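The [rank48] failure above is not the image error itself: a pdb breakpoint left in `__getitem__` fires inside a DataLoader worker process, which has no usable stdin, so quitting the debugger raises `BdbQuit` and torch re-raises it as a fatal `Caught BdbQuit in DataLoader worker process 0`. The usual remedy is to drop the breakpoint and have `__getitem__` retry a different index when a sample fails. The sketch below shows that retry pattern under stated assumptions: `RetryingDataset` and `fetch` are hypothetical stand-ins for the real dataset class and its `_get_item`, not the project's code.

```python
import random


class RetryingDataset:
    """Wrap a sample-fetch function; on failure, retry with a random index.

    `fetch(i)` is a hypothetical stand-in for the real `_get_item`. Any
    exception it raises (e.g. PIL.UnidentifiedImageError on a corrupt
    object) is logged and a different index is drawn, so a single bad
    sample cannot kill a DataLoader worker.
    """

    def __init__(self, fetch, length, max_retries=10, seed=0):
        self.fetch = fetch
        self.length = length
        self.max_retries = max_retries
        self.rng = random.Random(seed)  # per-worker RNG for resampling

    def __len__(self):
        return self.length

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.fetch(i)
            except Exception as e:
                # Log and resample instead of dropping into pdb.
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = self.rng.randrange(self.length)
        raise RuntimeError("too many consecutive bad samples")
```

Capping the retries keeps a systemic outage (e.g. the object store being down) from looping forever; the final `RuntimeError` surfaces it as a real failure.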
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_578851.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_578851.png'}, {'type': 'text', 'text': "\nClick on 'necklace (21 products)'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_192277.png'}, {'type': 'text', 'text': "\nClick on 'Champions League link'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/f3f9d7bc-6609-4dcd-b851-b53b3b380cef.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/f3f9d7bc-6609-4dcd-b851-b53b3b380cef.png'}, {'type': 'text', 'text': '\nWhere is the bounding box coordinates to click for solving the task: 3:12:00 (12.42 mi). Please click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_67446.png
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/5bb5ac65-3db3-4fe2-994d-89751a6294fe.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/5bb5ac65-3db3-4fe2-994d-89751a6294fe.png'}, {'type': 'text', 'text': '\nPlease identify the location with bounding box that can complete the instruction: Zrušit. Please right-click on it.'}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at
0x7f0a0137dd50> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f086fa62840> (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_453256.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_453256.png'}, {'type': 'text', 'text': '\nClick on \'link titled "BlaZeR^"\''}]}] ine 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8f4b963510> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in 
preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/ag (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_163256_original_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240912_163256_original_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'IsaHelpMain.desktop'"}]}] g/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd74ffd02c0> (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_184.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_184.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the detailed page of 卡速售官方演示站点,测试一下 发布时间:2022-04-30 16:03'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4002050cc0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8f4b963a60> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): e/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_143388.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_143388.png'}, {'type': 'text', 'text': '\nClick on \'An image labeled "Haas Haus," located between the images labeled "Danube Journeys" and "Schönbrunn Palace."\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_612769.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_612769.png'}, {'type': 'text', 'text': "\nClick on 'search bar input'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = 
self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f086fa621b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch Traceback (most recent call last): Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/5363.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/5363.jpg'}, {'type': 'text', 'text': '\nCould you locate the setting on this image and specify its bounding box dimensions? 
Please click on it.'}]}] last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GU File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/31185.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/31185.jpg'}, {'type': 'text', 'text': "\nDetect and mark the position of a place to deselect the burp 01 file 's gold star with bounding box in this image. 
Please double-click on it."}]}] /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8ec21cc950> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8f4b961260> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> 
print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_120200.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_120200.png'}, {'type': 'text', 'text': "\nClick on 'Author image'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_083152_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_083152_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Robert H. Shmerling, MD - Harvard Health - Sleeping'"}]}] (Pdb) ge from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_506816.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_506816.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Automobile Online Business - Informed Looking Tips 5"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs 
= load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f05da593a60> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f08f0ed18f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_117605.png File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) qa_filtered_117605.png'}, {'type': 'text', 'text': '\nClick on \'Blue circular button with white right-pointing arrow, at the bottom right corner of the image, under "You may also like" section\''}]}] > /mnt/Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_96163.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_96163.png'}, {'type': 'text', 'text': '\nClick on \'"Courses" me File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem_Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_97808.pngin _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3e303d2d90> [rank16]: Traceback (most recent call last): [rank16]: File 
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank16]:     train()
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank16]:     trainer.train()
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank16]:     return inner_training_loop(
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank16]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank16]:     batch_samples += [next(epoch_iterator)]
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank16]:     current_batch = next(dataloader_iter)
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank16]:     data = self._next_data()
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank16]:     return self._process_data(data)
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank16]:     data.reraise()
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank16]:     raise exception
[rank16]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank16]: Original Traceback (most recent call last):
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank16]:     sample = self._get_item(i)
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank16]:     data_dict = self.preprocess_qwen2vl_v3(
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank16]:     return self.preprocess_qwen2vl(
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank16]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank16]:     image = tcs_loader(image)
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank16]:     img = pil_loader(img_value_str)
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank16]:     img = Image.open(buff)
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank16]:     raise UnidentifiedImageError(msg)
[rank16]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4002050cc0>
[rank16]: During handling of the above exception, another exception occurred:
[rank16]: Traceback (most recent call last):
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank16]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank16]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank16]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank16]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank16]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank16]:     return self.dispatch_line(frame)
[rank16]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank16]:     if self.quitting: raise BdbQuit
[rank16]: bdb.BdbQuit
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_143326_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_143326_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: Icons_ArrowCircle_M'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
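The bdb.BdbQuit above is a secondary failure: the pdb breakpoint in `__getitem__` fires inside a DataLoader worker subprocess, which has no interactive stdin, so pdb quits and torch re-raises BdbQuit in the main process, masking the original UnidentifiedImageError. The usual workarounds are running with `num_workers=0` while debugging, or only breaking in the main process. A minimal sketch of the latter (the helper names are hypothetical, not part of this codebase):

```python
import multiprocessing as mp


def in_main_process() -> bool:
    """DataLoader workers are child processes; pdb only has a usable
    terminal in the main process (named "MainProcess")."""
    return mp.current_process().name == "MainProcess"


def debug_safe_breakpoint() -> None:
    """Drop into pdb only where it can read from stdin; in a worker,
    log and continue instead of raising bdb.BdbQuit."""
    if in_main_process():
        import pdb

        pdb.set_trace()
    else:
        print("breakpoint requested in a worker process; skipping")
```

Calling `debug_safe_breakpoint()` from the dataset's exception handler keeps multi-worker runs alive while still allowing interactive inspection when the dataset is exercised single-process.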
The same PIL.UnidentifiedImageError then repeats across data-loader workers and ranks, each time dropping into the same pdb prompt at dataset.py(364)__getitem__(). The distinct objects that failed to decode in this log segment (from the interleaved "Load image from ceph:" and conv entries):

  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_143326_before_screenshot.png
  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_200025_before_screenshot.png
  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_043603_before_screenshot.png
  langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031753_before_screenshot.png
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_732980.png
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_218801.png
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_330397.png
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_757100.png
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_79459.png
  langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_201630.png
  langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/20240726151354491/main_202_after.png
  langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/ui2json_web_d20240822_v1/course/C4web50k-2_186872463-split-0.png
  langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_68/C4web50k-2_185570351-split-0.png
  langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_5_9_15_38_ae8d5fd1990541f0b85737d8435a95f1-17.png
  langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/xiechenglvxing/screen_00000264.jpg
  langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/dazhongdianping/screen_00000547.jpg
  langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image16863.jpg
  langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image19670.jpg
  VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/662.jpg
  ...27_030049_before_screenshot.png (prefix truncated in the captured log)

The run finally aborts on the trainer side at rank58 (traceback cut off where the captured log ends):

[rank58]: Traceback (most recent call last):
[rank58]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank58]:     train()
[rank58]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank58]:     trainer.train()
[rank58]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank58]:     return inner_training_loop(
[rank58]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank58]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank58]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank58]:     batch_samples += [next(epoch_iterator)]
[rank58]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[log truncated]
current_batch = next(dataloader_iter) [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank58]: data = self._next_data() [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank58]: return self._process_data(data) [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank58]: data.reraise() [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank58]: raise exception [rank58]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank58]: Original Traceback (most recent call last): [rank58]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank58]: sample = self._get_item(i) [rank58]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank58]: data_dict = self.preprocess_qwen2vl_v3( [rank58]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank58]: return self.preprocess_qwen2vl( [rank58]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank58]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank58]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank58]: image = tcs_loader(image) [rank58]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank58]: img = pil_loader(img_value_str) [rank58]: File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank58]: img = Image.open(buff) [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank58]: raise UnidentifiedImageError(msg) [rank58]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6cb63873d0> [rank58]: During handling of the above exception, another exception occurred: [rank58]: Traceback (most recent call last): [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank58]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank58]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank58]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank58]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank58]: print(f"Failed to fetch sample {i}. Exception:", e) [rank58]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank58]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank58]: return self.dispatch_line(frame) [rank58]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank58]: if self.quitting: raise BdbQuit [rank58]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6cb6387150> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_707485.png conv: [{'role': 'user', 'co File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = 
self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3abcdef20> File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f501eeae390> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample conv: [{'role': 'user', 'conLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_120039_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_120039_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Ad'"}]}] Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3e30252cf0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_141952_before_screenshot_sub2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_141952_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft Store'"}]}] > /mnt/hwfile/liuzhaoyang/workspace> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed t (Pdb) /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d3f817420> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): //wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023503_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023503_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Ad info'"}]}] /liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f501eeae520> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_233595.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_233595.png'}, {'type': 'text', 'text': "\nClick on 'navigate to the article 'Where have all the Whirlpools gone?''"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f788dc41170> Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d3f9adcb0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa464246b10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_172832_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_172832_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'sample_500k.py'"}]}] noxville'"}]}] Load image from ceph: langchao2:s3: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0a0137dd50> Agent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) 
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3e302509f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f505a321ee0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3e3024dc10> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_801061.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_801061.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Sage Knowledge", at the right side of "Menu", to the top right of "Entries A-Z"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_18/C4web50k-1_92386791-split-0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_18/C4web50k-1_92386791-split-0.png'}, {'type': 'text', 'text': '\nDetermine the bounding box and location of Home in this image. Double-click on it.'}]}] _loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6cb63877e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3ab77eb10> (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f311a98e700> wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_112687.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_112687.png'}, {'type': 'text', 'text': "\nClick on 'Add to basket: “AmRep Cavalry Tank Company”'"}]}] Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa0936bfba0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) Traceback (most recent call last): (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbbe148b650> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7636a4a070> ader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f501eeaf150> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_491056.png conv: [{'role': ' conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/e77fd2f4-af08-4d49-86ee-66e9af3c5d3a.png'}, {'type': 'text', 'text': 
'\nPlease output the bounding box of the region correctly responding to this task: No permissions granted. Click on it.'}]}] set.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_57932.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_57932.png'}, {'type': 'text', 'text': "\nClick on 'ACCOUNT link'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageFever_2024_3_7_21_26-91.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageFever_2024_3_7_21_26-91.png'}, {'type': 'text', 'text': '\nIdentify and give the bounding box coordinates of restaurants Discover seen in the image. 
Please double-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f311a98e700> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f52b4d6fe70> huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_115717_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_115717_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'The Associated Press'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_177387.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_177387.png'}, {'type': 'text', 'text': "\nClick on 'open article on PJM carbon-neutral production'"}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/quanminkge/screen_00000478.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/quanminkge/screen_00000478.jpg'}, {'type': 'text', 'text': '\nFind the position of the back button in the interface and output its bounding box coordinates. Click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_682778.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_682778.png'}, {'type': 'text', 'text': '\nClick on \'Permalink to Festivals, sessions, releases, in the "Tag Archives: 6music" section\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_125549_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_125549_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'New Tab'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_41137.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_41137.png'}, {'type': 'text', 'text': "\nClick on 'click to view offers'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_211557_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_211557_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Font Color'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageWikipedia_2024_3_6_17_46-242.png'}, {'type': 'text', 'text': '\nDetect and mark the position of Navigate up with bounding box in this image. Double-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageBBC_News_2024_3_2_14_11-257.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageBBC_News_2024_3_2_14_11-257.png'}, {'type': 'text', 'text': '\nIdentify the position of the https://www.yaz.hk/ in this image and provide its bounding box. Please click on it.'}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240322/20240322_filtered/haluo/screen_00000000.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240322/20240322_filtered/haluo/screen_00000000.jpg'}, {'type': 'text', 'text': '\nPrecisely locate the ride-hailing element in the image and mark its bounding box. Please click on it.'}]}]
[rank19]: Traceback (most recent call last):
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank19]:     train()
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank19]:     trainer.train()
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank19]:     return inner_training_loop(
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank19]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank19]:     batch_samples += [next(epoch_iterator)]
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank19]:     current_batch = next(dataloader_iter)
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank19]:     data = self._next_data()
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank19]:     return self._process_data(data)
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank19]:     data.reraise()
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank19]:     raise exception
[rank19]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank19]: Original Traceback (most recent call last):
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank19]:     sample = self._get_item(i)
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank19]:     data_dict = self.preprocess_qwen2vl_v3(
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank19]:     return self.preprocess_qwen2vl(
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank19]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank19]:     image = tcs_loader(image)
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank19]:     img = pil_loader(img_value_str)
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank19]:     img = Image.open(buff)
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank19]:     raise UnidentifiedImageError(msg)
[rank19]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f52b4d6fe70>
[rank19]: During handling of the above exception, another exception occurred:
[rank19]: Traceback (most recent call last):
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank19]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank19]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank19]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank19]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank19]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank19]:     return self.dispatch_line(frame)
[rank19]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank19]:     if self.quitting: raise BdbQuit
[rank19]: bdb.BdbQuit
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank50]:     raise exception
[rank50]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank50]: Original Traceback (most recent call last):
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank50]:     sample = self._get_item(i)
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank50]:     data_dict = self.preprocess_qwen2vl_v3(
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank50]:     return self.preprocess_qwen2vl(
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank50]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank50]:     image = tcs_loader(image)
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank50]:     img = pil_loader(img_value_str)
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank50]:     img = Image.open(buff)
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank50]:     raise UnidentifiedImageError(msg)
[rank50]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe8a9845210>
[rank50]: During handling of the above exception, another exception occurred:
[rank50]: Traceback (most recent call last):
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank50]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank50]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank50]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank50]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank50]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank50]:     return self.dispatch_line(frame)
[rank50]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank50]:     if self.quitting: raise BdbQuit
[rank50]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe02c169030> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_420108.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_420108.png'}, {'type': 'text', 'text': "\nClick on 'Cette page en français'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_377280.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_377280.png'}, {'type': 'text', 'text': "\nClick on 'author profile, at the bottom of the page'"}]}] Traceback (most recent call last): //dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_433627.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_433627.png'}, {'type': 'text', 'text': "\nClick on '0, labeled Rate review as helpful'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f38520eb5b0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vlTraceback (most recent call last): 15, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d3b89fba0> Traceback (most recent call 
last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhao File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0872307b00> yang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdd26295c10> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7f3e31005df0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe02c16a750> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_36646.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_36646.png'}, {'type': 'text', 'text': "\nClick on 'access Inner Circle Membership page'"}]}] workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f32adb9fba0> > /mnt/hwfile/liuzhaoyang/workspace> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3aae1c0e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) or: cannot identify image file <_io.BytesIO object at 0x7f133c0df2e0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_33670.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_33670.png'}, {'type': 'text', 'text': "\nClick on '0, Basket Menu'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_238947.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_238947.png'}, {'type': 'text', 'text': "\nClick on 'navigate to collaboration details'"}]}] ase right-click on it."}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_312608.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_312608.png'}, {'type': 'text', 'text': "\nClick on 'text input box'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240312/20240312_filtered/gaodemap/action_219833.184327finished.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240312/20240312_filtered/gaodemap/action_219833.184327finished.jpg'}, {'type': 'text', 'text': '\n能否帮忙分析这张图片,并且输出每个文字的位置坐标? 
Please right-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe02c162610> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/sr> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) /aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41b551ba10> Traceback (most recent call last): ll last): [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank52]: train() [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank52]: trainer.train() [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank52]: return inner_training_loop( [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank52]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank52]: batch_samples += [next(epoch_iterator)] [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank52]: current_batch = next(dataloader_iter) 
[rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank52]: data = self._next_data() [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank52]: return self._process_data(data) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank52]: data.reraise() [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank52]: raise exception [rank52]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank52]: Original Traceback (most recent call last): [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank52]: sample = self._get_item(i) [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank52]: data_dict = self.preprocess_qwen2vl_v3( [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank52]: return self.preprocess_qwen2vl( [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank52]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank52]: image = tcs_loader(image) [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank52]: img = pil_loader(img_value_str) [rank52]: File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank52]: img = Image.open(buff) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank52]: raise UnidentifiedImageError(msg) [rank52]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0872307b00> [rank52]: During handling of the above exception, another exception occurred: [rank52]: Traceback (most recent call last): [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank52]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank52]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank52]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank52]: print(f"Failed to fetch sample {i}. Exception:", e) [rank52]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank52]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank52]: return self.dispatch_line(frame) [rank52]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank52]: if self.quitting: raise BdbQuit [rank52]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f32aefa8770> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_104917_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_104917_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_104917_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Choose a language for shopping.'"}]}]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d3b8b59e0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240905_150054_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240905_150054_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Notifications'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_044718_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_044718_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: tab-15'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_021757_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_021757_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Search your Collections'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_191956_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_191956_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Notes Master'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_161817_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_161817_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Activate'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_030645_before_screenshot.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_244438.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_244438.png'}, {'type': 'text', 'text': "\nClick on 'Subscribe button'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_245278.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_245278.png'}, {'type': 'text', 'text': "\nClick on 'Ebizz Logo'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_5_6_21_46_734e78be315f4512a8c0ccae39240db4-7.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_5_6_21_46_734e78be315f4512a8c0ccae39240db4-7.png'}, {'type': 'text', 'text': '\nCan you determine the location coordinates of Back displayed in the screen? Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_432287.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_432287.png'}, {'type': 'text', 'text': "\nClick on 'Child menu of Toys'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_776217.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_776217.png'}, {'type': 'text', 'text': '\nClick on \'link for "privacy policy", at the top of the image, above "banquets"\''}]}]
[rank27]: Traceback (most recent call last):
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank27]:     train()
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank27]:     trainer.train()
[rank27]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank27]:     return inner_training_loop(
[rank27]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank27]:     batch_samples, n
[rank27]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank27]: Original Traceback (most recent call last):
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank27]:     sample = self._get_item(i)
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank27]:     data_dict = self.preprocess_qwen2vl_v3(
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank27]:     return self.preprocess_qwen2vl(
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank27]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank27]:     image = tcs_loader(image)
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank27]:     img = pil_loader(img_value_str)
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank27]:     img = Image.open(buff)
[rank27]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank27]:     raise UnidentifiedImageError(msg)
[rank27]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fba677c9e40>
[rank27]: During handling of the above exception, another exception occurred:
[rank27]: Traceback (most recent call last):
[rank27]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank27]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank27]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank27]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank27]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank27]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank27]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank27]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank27]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank27]:     return self.dispatch_line(frame)
[rank27]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank27]:     if self.quitting: raise BdbQuit
[rank27]: bdb.BdbQuit
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_123146.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_123146.png'}, {'type': 'text', 'text': "\nClick on 'view Houston job listings'"}]}]
72.png'}, {'type': 'text', 'text': '\nDescribe the eye-catching elements inside the box [[880, 957, 958, 992]] in the image. Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_3477.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_3477.png'}, {'type': 'text', 'text': '\nClick on \'Mens Elimination day 1, under "Related Galleries"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_032731_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_032731_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Car rentals'"}]}]
/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb9576eef70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Ima> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbbe00a9a80> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_62608.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_62608.png'}, {'type': 'text', 'text': '\nClick on \'the image of "76”x 96" Modern Wrought Iron Double Entry Door with High-impact Double Operable Insulation Glass, HAD2349", on the left side of the webpage\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in 
__getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb94a7f9170> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_277262.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_277262.png'}, {'type': 'text', 'text': '\nClick on \'link for "Medium Viking Blowing Horn / Bugle"\''}]}] 20250222/images/multi_modal/agent_data/rico/dataset/image50618.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image50618.jpg'}, {'type': 'text', 'text': '\nWhat is the name of the application? Explain your answer based on UI element. 
Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe02c1639c0> wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7fbc603b12b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._gLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_816287.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_816287.png'}, {'type': 'text', 'text': '\nClick on \'September 10, 2017, to the bottom left of "Unable to act on their feelings and forced into exile in the Forest of Arden, lovers Rosalind and Orlando become entangled in a beguiling game of love, lust and mistaken identity. One of Shakespeare’…"\''}]}] from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8786579cb0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_246304.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_246304.png'}, {'type': 'text', 'text': "\nClick on 'redirect to 
Terms of Use page'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/lLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageAgoda_2024_3_5_10_56-196.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageAgoda_2024_3_5_10_56-196.png'}, {'type': 'text', 'text': '\nHow to locate the 19 May 2024 19 and its bounding box in the given image? Please right-click on it.'}]}] image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f09b076eb10> .BytesIO object at 0x7fb97e942250> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f877c1eba60> ge from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_201220_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_201220_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Baby Keepsakes & Announcements for sale | eBay'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5ff1e2f0b0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_204843_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{Traceback (most recent call last): //wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022240_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_022240_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'History'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample 
{i}. Exception:", e) (Pdb) ype': 'text', 'text': '\nClick on \'The word "about" located in the top navigation menu.\''}]}] huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb9576eef70> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 
408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0c385bee30> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) .BytesIO object at 0x7fa3aae1ca90> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb94a82cf40> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_084259_before_screenshot_sub3.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_084259_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Competition Policy'"}]}] m ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_134316_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_134316_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Black and White'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/58tongcheng/screen_00000257.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/58tongcheng/screen_00000257.jpg'}, {'type': 'text', 'text': '\n如何切换到外观模式?with grounding Please click on it.'}]}] s/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe02c162ca0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7eff0835b740> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_714888.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_714888.png'}, {'type': 'text', 'text': "\nClick on 'submit'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ef9a8764400> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3ab7f2020> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfTraceback (most recent call 
last): qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f09b076e3e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ntent': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_383782.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Fidos Pantry"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8786579cb0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_489557.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_489557.png'}, {'type': 'text', 'text': "\nClick on 'Gradient background link with white text'"}]}] /src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3ab7f3d30> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_134902_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_134902_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Quick Styles'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/1e565a7d-5394-442b-9b1b-77f00fed1db7.png ) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to 
fetch sample {i}. Exception:", e) (Pdb) ext': "\nClick on 'Row up'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) -5394-442b-9b1b-77f00fed1db7.png'}, {'type': 'text', 'text': '\nFor question: Review performance metrics, locate the bounding box of the element in the image that holds the key to this question. Please double-click on it.'}]}] 供关于图片[[11, 122, 297, 167]]区域元素的深入解读? Click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240905_112400_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240905_112400_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Add Category'"}]}] workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41b5f2b510> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_6038.png'}, {'type': 'text', 'text': "\nClick on 'Temple Law's Ohlbaum makes sense of high-profile trials'"}]}] ample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3ab7f1580> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample => /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot 
identify image file <_io.BytesIO object at 0x7f3995583330> ntent': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_307532.png'}, {'type': 'text', 'text': "\nClick on 'klaviyo_ariaid_3'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_172111_before_screenshot_sub1.png (Pdb) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_172111_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Dividend Variant 4'"}]}] (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_348773.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_348773.png'}, {'type': 'text', 'text': "\nClick on 'Mixcloud'"}]}] pace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 
3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f086fa60900> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_174406.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_174406.png'}, {'type': 'text', 'text': "\nClick on 'open Staged (progressive) Access page'"}]}] wen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efe4f6d9490> ck (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3aae1fba0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_052530_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_052530_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Minimize'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = 
tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f87770f3740> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) /images/multi_modal/agent_data/rico/dataset/image43410.jpg'}, {'type': 'text', 'text': '\nWhat is the country code of Angola? Explain your answer based on UI element. Please click on it.'}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_134837_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_134837_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Report'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ef8aa1425c0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_044151_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_044151_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'static map image of vector map'"}]}] rols Yesterday in the picture? Please double-click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240904_140523_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240904_140523_screenshot.png'}, {'type': 'text', 'text': "\nClick on '0 PROBLEMS 19'"}]}] "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa464684d10> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) Traceback (most recent call last): File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb94a82cbd0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify 
image file <_io.BytesIO object at 0x7fa3aae1c9a0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample conv: [{'role': 'user', 'content':(Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efb424c3ab0> nt/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f086fa62480> > /mnt/hwfile/liuzhaoyang/workspaceLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_042526_before_screenshot_sLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_120810_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windowLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_142478.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_142478.png'}, {'type': 'text', 'text': "\nClick on 'navigate to payment issue submission form'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_it> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)rocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f38520e98f0> (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) Traceback (most recent call last): hao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_30421.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_30421.png'}, {'type': 'text', 'text': "\nClick on 'CLICK Dark blue, centered, circular button with a textured appearance'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa55ea649a0> conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_30/C4web50k-1_94674438-split-1.png'}, {'type': 'text', 'text': 
'\nWhat are the bounding box details of the 26/09/2016 in this image? Please double-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efb42a86610> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_381054.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_381054.png'}, {'type': 'text', 'text': '\nClick on \'vipassana (21 items), under "Connect with Meditation Practices" section\''}]}] hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank25]: trainer.train() [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank25]: return inner_training_loop( [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank25]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank25]: batch_samples += [next(epoch_iterator)] [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank25]: current_batch = next(dataloader_iter) [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank25]: data = self._next_data() [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank25]: return self._process_data(data) [rank25]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank25]: data.reraise() [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank25]: raise exception [rank25]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank25]: Original Traceback (most recent call last): [rank25]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank25]: sample = self._get_item(i) [rank25]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank25]: data_dict = self.preprocess_qwen2vl_v3( [rank25]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank25]: return self.preprocess_qwen2vl( [rank25]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank25]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank25]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank25]: image = tcs_loader(image) [rank25]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank25]: img = pil_loader(img_value_str) [rank25]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank25]: img = Image.open(buff) [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank25]: raise UnidentifiedImageError(msg) [rank25]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa464684d10> [rank25]: During handling of the above exception, 
another exception occurred: [rank25]: Traceback (most recent call last): [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank25]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank25]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank25]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank25]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank25]: print(f"Failed to fetch sample {i}. Exception:", e) [rank25]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank25]: print(f"Failed to fetch sample {i}. Exception:", e) [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank25]: return self.dispatch_line(frame) [rank25]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank25]: if self.quitting: raise BdbQuit [rank25]: bdb.BdbQuit > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa321ccf330> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_458832.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_458832.png'}, {'type': 'text', 'text': "\nClick on 'TULIP 80cm/100cm Round Din., on the right side of the webpage'"}]}] v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f797c984090> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7584120e50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): //wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_120747_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_120747_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Share Linkedin'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efb42a868e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_781121.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_781121.png'}, {'type': 'text', 'text': "\nClick on 'Timing Belt Tensioner Video'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_488883.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_488883.png'}, {'type': 'text', 'text': "\nClick on 'OPTICS'"}]}] ng/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7585629fd0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): e/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efe4f6da750> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_351595.pngget_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f501eed7ce0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) qa_filtered_351595.png'}, {'type': 'text', 'text': "\nClick on 'Cart'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_211367.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_211367.png'}, {'type': 'text', 'text': "\nClick on 'navigate to graduation information page'"}]}] (Pdb) Load image from ceph: langchTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdfac4ba480> l_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3852b46ca0> [rank13]: Traceback (most recent call last): [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank13]: train() [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank13]: trainer.train() [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank13]: return inner_training_loop( [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank13]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank13]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank13]: batch_samples += [next(epoch_iterator)] [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank13]: current_batch = next(dataloader_iter) [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank13]: data = self._next_data() [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank13]: return self._process_data(data) [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank13]: data.reraise() [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank13]: raise exception [rank13]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank13]: Original Traceback (most recent call last): [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank13]: sample = self._get_item(i) [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank13]: data_dict = self.preprocess_qwen2vl_v3( [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank13]: return self.preprocess_qwen2vl( [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank13]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank13]: image = tcs_loader(image) [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank13]: img = pil_loader(img_value_str) [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank13]: img = Image.open(buff) [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank13]: raise UnidentifiedImageError(msg) [rank13]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f797c984090> [rank13]: During handling of the above exception, another exception occurred: [rank13]: Traceback (most recent call last): [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank13]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank13]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank13]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank13]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank13]: print(f"Failed to fetch sample {i}. Exception:", e) [rank13]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank13]: print(f"Failed to fetch sample {i}. Exception:", e) [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank13]: return self.dispatch_line(frame) [rank13]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank13]: if self.quitting: raise BdbQuit [rank13]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f797b584860> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open rais File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample =Traceback (most recent call 
last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) edImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efe4f6da890> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ng conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_053046_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'App bar'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) wfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)er', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_109132.png'}, {'type': 'text', 'text': "\nClick on 'view prices'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): //dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_16032.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_16032.png'}, {'type': 'text', 'text': '\nClick on \'the image of "logo Bikeride"\''}]}] GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f79fc158950> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_104847_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_104847_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Kindle E-readersKindleThe lightest and most compact Kindle'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efe4f6da110> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_193529_before_screenshot_sub2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_193529_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Submit'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_51/C4web50k-3_275030230-split-0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_51/C4web50k-3_275030230-split-0.png'}, {'type': 'text', 'text': '\nIdentify the Gluten Free in the image and its bounding box dimensions. Please right-click on it.'}]}] [rank15]: Traceback (most recent call last): [rank15]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank15]: train() [rank15]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank15]: trainer.train() [rank15]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank15]: return inner_training_loop( [rank15]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank15]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank15]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank15]: batch_samples += [next(epoch_iterator)] [rank15]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank15]: current_batch = next(dataloader_iter) [rank15]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ 
[rank15]:     data = self._next_data()
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank15]:     return self._process_data(data)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank15]:     data.reraise()
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank15]:     raise exception
[rank15]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank15]: Original Traceback (most recent call last):
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank15]:     sample = self._get_item(i)
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank15]:     data_dict = self.preprocess_qwen2vl_v3(
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank15]:     return self.preprocess_qwen2vl(
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank15]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank15]:     image = tcs_loader(image)
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank15]:     img = pil_loader(img_value_str)
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank15]:     img = Image.open(buff)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank15]:     raise UnidentifiedImageError(msg)
[rank15]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3852b46ca0>
[rank15]: During handling of the above exception, another exception occurred:
[rank15]: Traceback (most recent call last):
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank15]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank15]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank15]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank15]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank15]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank15]:     return self.dispatch_line(frame)
[rank15]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank15]:     if self.quitting: raise BdbQuit
[rank15]: bdb.BdbQuit
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_124019_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_124019_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Start free'"}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240312/20240312_filtered/yingyongshangdian/action_143533.685039finished.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240312/20240312_filtered/yingyongshangdian/action_143533.685039finished.jpg'}, {'type': 'text', 'text': '\n界面截图中的轻游戏在哪里?请提供其边界框坐标。 Double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_282329.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_282329.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Petit Vour Vegan September 2014 Beauty Box"\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/05094_label.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/05094_label.png'}, {'type': 'text', 'text': '\n简单介绍下标记为1的元素。 Double-click on it.'}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_199186.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_199186.png'}, {'type': 'text', 'text': "\nClick on 'User account'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/fanqiexiaoshuo/screen_00000003.jpg
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
nction_filtered_223765.png'}, {'type': 'text', 'text': "\nClick on 'change school year to 2021-2022'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/AndroidUI/ui_caption_inhousedazhongdianping_screen_00000086.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/AndroidUI/ui_caption_inhousedazhongdianping_screen_00000086.jpg'}, {'type': 'text', 'text': '\n这张图片是什么 Please click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77542e9df0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_032049_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_032049_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'For you'"}]}] Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f02fbb2a3e0> o.BytesIO object at 0x7f35cb86bb00> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_200806_before_screenshot.pngmple {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_347682.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_347682.png'}, {'type': 'text', 'text': '\nClick on \'link titled "Gospel of John Chapter 11"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7503312890> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_200806_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Home'"}]}] /data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_159744.png'}, {'type': 'text', 'text': '\nClick on \'Top Rated Merchant\n4.8/5.0 2,000+ Ratings, above "9.7/10.0", to the right of "Free Shipping\nOn Orders $99+"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_36302.png File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise 
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_36302.png'}, {'type': 'text', 'text': "\nClick on 'Blockchain development services hero image featuring developer and Solidity-themed background'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
iltered_47665.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Avery Nursing Dress Multi product page'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f35cb86b510> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_206087.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_206087.png'}, {'type': 'text', 'text': "\nClick on 'redirect to Home socks kids section'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f758412cc70> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_153097.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_153097.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Sussex County Seal"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f797b9a31a0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7648e9c090> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspaceLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_002123_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_002123_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on '9'"}]}] i_modal/agent_data/AndroidUI/20240312/20240312_filtered/meituxiuxiu/action_148343.772923finished.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI/20240312/20240312_filtered/meituxiuxiu/action_148343.772923finished.jpg'}, {'type': 'text', 'text': '\n我如何更改我的个人资料信息?with grounding Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise 
UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f727d2018a0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_134029_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_134029_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Themes'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) rc/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8776fefec0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, 
in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1c5c80d4e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)er', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_547664.png'}, {'type': 'text', 'text': '\nClick on \'button for "Go", at the center of the webpage, above "RESEARCH ASSISTANCE"\''}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240327/20240327_filtered/ximalaya/screen_00000141.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240327/20240327_filtered/ximalaya/screen_00000141.jpg'}, {'type': 'text', 'text': '\n详细描述给定图片 Click on it.'}]}] Traceback (most recent call last): (Pdb) /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, 
in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7ac19bc860> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f05d01405e0> > /mnt/hwfile/liuzhaoyang/workspace> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_659452.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_659452.png'}, {'type': 'text', 'text': '\nClick on \'the text input area, under the "Submit a Comment" section\''}]}] l image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77e88b06d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f77e88b06d0> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/ui_data/ui_home_screen_tablet/ui_homescreen_tablet_20240416_v7homescreen-1154.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/ui_data/ui_home_screen_tablet/ui_homescreen_tablet_20240416_v7homescreen-1154.jpg'}, {'type': 'text', 'text': '\n图中[[70, 109, 190, 184]]这个区域的元素的功能是什么。 Please right-click on it.'}]}] blue input field with white border and "Your email" placeholder, flanked by a "SEND" button, to the top right of "Returns"\''}]}] Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7648268540> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_694265.png{i}. 
Exception:", e) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_138579.png (Pdb) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_138579.png'}, {'type': 'text', 'text': "\nClick on 'A small white arrow pointing to the right, located on a dark gray vertical bar on the right side of the screen.'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_9_13_1_9a01554ac61347c79d3611a1e081600f-12.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_9_13_1_9a01554ac61347c79d3611a1e081600f-12.png'}, {'type': 'text', 'text': '\nHow would you determine the bounding box for the Back shown in the image? Please click on it.'}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_694265.png'}, {'type': 'text', 'text': "\nClick on 'Type into text input area, on the right side of the webpage'"}]}] _item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f05d0150bd0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdfac4b9580> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f79fc163c90> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1c5c80f970> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/02857.png (Pdb) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_shareui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/02857.png'}, {'type': 'text', 'text': '\n我想返回上一页,请问应该点击哪个按钮?with grounding Click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_37848.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_37848.png'}, {'type': 'text', 'text': "\nClick on 'open the photo gallery for scuba diving in Abu Dabbab bay'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f311a98e700> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f05d0266200> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdfac4b9f30>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_109938.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_109938.png'}, {'type': 'text', 'text': "\nClick on 'open the list of digital courses'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_192743_before_screenshot_sub2.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240821_192743_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Promotional Communications Manager'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_share/visual_with_bbox_images/ui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/01498.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/aig_share/visual_with_bbox_images/ui_agi_appdata_phone/ui_agi_appdata_phone_20240506_reorg_filtered/01498.jpg'}, {'type': 'text', 'text': '\n所给图中被框住的元素是什么? Please double-click on it.'}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/54080.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/54080.jpg'}, {'type': 'text', 'text': '\nIdentify the bounding box coordinates that is relevant to the objective: click to go back. Right-click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_131519_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_131519_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Crop'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_633131.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_633131.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Spotify not working in Tesla"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_353053.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_353053.png'}, {'type': 'text', 'text': "\nClick on 'the text input box'"}]}]
[rank14]: Traceback (most recent call last):
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank14]:     train()
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank14]:     trainer.train()
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank14]:     return inner_training_loop(
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank14]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank14]:     batch_samples += [next(epoch_iterator)]
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank14]:     current_batch = next(dataloader_iter)
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank14]:     data = self._next_data()
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank14]:     return self._process_data(data)
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank14]:     data.reraise()
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank14]:     raise exception
[rank14]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank14]: Original Traceback (most recent call last):
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank14]:     sample = self._get_item(i)
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank14]:     data_dict = self.preprocess_qwen2vl_v3(
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank14]:     return self.preprocess_qwen2vl(
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank14]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank14]:     image = tcs_loader(image)
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank14]:     img = pil_loader(img_value_str)
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank14]:     img = Image.open(buff)
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank14]:     raise UnidentifiedImageError(msg)
[rank14]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7648268540>
[rank14]: During handling of the above exception, another exception occurred:
[rank14]: Traceback (most recent call last):
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank14]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank14]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank14]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank14]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank14]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank14]:     return self.dispatch_line(frame)
[rank14]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank14]:     if self.quitting: raise BdbQuit
[rank14]: bdb.BdbQuit
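The bdb.BdbQuit above is caused by a pdb breakpoint firing inside a DataLoader worker process: workers have no attached terminal, so the debugger quits immediately and kills the worker, taking the whole training run down. A common workaround is to log and resample on a corrupt image instead of breaking into pdb. The sketch below is illustrative only (the class name, byte-list storage, and next-sample fallback policy are assumptions, not the project's actual `dataset.py`):

```python
from io import BytesIO

from PIL import Image, UnidentifiedImageError


class SkipCorruptImages:
    """Illustrative dataset-style wrapper: on an unreadable image, log the
    failure and deterministically fall back to the next sample, instead of
    dropping into pdb (which raises bdb.BdbQuit in DataLoader workers)."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples          # raw image bytes per sample (assumption)
        self.max_retries = max_retries

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return Image.open(BytesIO(self.samples[i])).convert("RGB")
            except UnidentifiedImageError as e:
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = (i + 1) % len(self.samples)  # try the neighbouring sample
        raise RuntimeError("too many consecutive corrupt samples")
```

In the real trainer this wrapper would subclass `torch.utils.data.Dataset`; note that silently skipping hides data problems, so the failures should also be counted and reported.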
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/541ac32a-16aa-4449-87cc-1f28cd9bf7ce.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/541ac32a-16aa-4449-87cc-1f28cd9bf7ce.png'}, {'type': 'text', 'text': '\nIdentify the bounding box for the element in the screen to complete the task: Dream Journal Entry. Right-click on it.'}]}]
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image365Scores_2024_3_16_19_20-17.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image365Scores_2024_3_16_19_20-17.png'}, {'type': 'text', 'text': '\nPinpoint the exact area of Search Leagues in the given image with bounding box. Please click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_135708.png'}, {'type': 'text', 'text': '\nClick on \'The word "ABOUT" in the navigat
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194935_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_194935_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'eBay'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_748896.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_748896.png'}, {'type': 'text', 'text': "\nClick on 'Cookies'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_629917.png
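The huggingface/tokenizers warning above appears because a fast tokenizer was used before the DataLoader forked its worker processes. Setting the environment variable early in the entry point (before any fork, e.g. at the top of `train.py`) silences the warning and avoids the potential deadlock it describes; a one-line sketch:

```python
import os

# Must run before any DataLoader worker is forked.
# "false" disables the tokenizers-internal thread pool in forked children,
# which is what the huggingface/tokenizers warning asks for.
os.environ["TOKENIZERS_PARALLELISM"] = "false"
```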
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1af041d3a0> (Pdb) 'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_629917.png'}, {'type': 'text', 'text': '\nClick on \'button titled "POPULAR"\''}]}] uvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f048e51b6f0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/lTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f311a98e570> text', 'text': '\n如何保存我当前编辑的视频模板设置?(with grounding) Please right-click on it.'}]}] le <_io.BytesIO object at 0x7f86310241d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7648268540> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_582115.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_582115.png'}, {'type': 'text', 'text': '\nClick on \'Scene button, above "FLEX Series Flood Light from CSC LED", to the top left of "Subscribe to Our Brands", to the top right of "Design"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240911_170544_original_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/linux_imagesoutput_20240911_170544_original_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Thunderbird Mail'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_216470.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_216470.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Secretary of State Antony Blinken hosts the 17th annual International Women of Courage Award Ceremony on International Women’s Day at the White House in Washington, March 8, 2023. (Reuters)"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_174970.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_174970.png'}, {'type': 'text', 'text': "\nClick on 'Kapok Yoga Bolster Ikat'"}]}] t.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = 
Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1da0466340> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_787763.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_787763.png'}, {'type': 'text', 'text': "\nClick on 'Email address, text input area'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_123022_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_123022_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Sign in - Google Accounts' File File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4372e432e0> ack (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f76011084f0> Load image from ceph: 
langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_21/C4web50k-1_92767605-split-2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_21/C4web50k-1_92767605-split-2.png'}, {'type': 'text', 'text': '\nIdentify the position of the Reviews in this image and provide its bounding box. Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f852710f5b0> /liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3f2f21f1a0> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/lLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160305_before_screenshot_sub0.png Traceback (most recent call last): conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160305_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Superscript'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvi> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_298767.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_298767.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Park City Neighborhoods: 12 Best Places to Live in Utah Photo"\''}]}] _modal/agent_data/rico/dataset/image58518.jpg'}, {'type': 'text', 'text': '\nWhat is the user name? Explain your answer based on UI element. 
Click on it.'}]}] py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f10a7347b50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ck (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_578600.png conv: [{'role': 'user', 'content': [{'typeTraceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images2024Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7600f81cb0> conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_002133_before_scre> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_130637_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Skip Ad'"}]}] (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/ui_data/ui_home_screen_phone/ui_homescreen_phone_20240416_v7homescreen-1581.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/ui_data/ui_home_screen_phone/ui_homescreen_phone_20240416_v7homescreen-1581.jpg'}, {'type': 'text', 'text': '\n对屏幕截图中的所有APP进行边界框定位。 Please right-click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_543274.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_543274.png'}, {'type': 'text', 
'text': "\nClick on 'GiftRegistry'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_65640.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filte> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)ioned on the right side, next to the search icon, and labeled "ECHIPA."\''}]}] 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7503311850> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ntent': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_496254.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Chilli Rox Box - Single Purchase - The Enviro Co"\''}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_114208_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_114208_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'LinkedIn Privacy Policy'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_122000_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_122000_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'NBC News'"}]}] liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f33f0ed2480> Traceback 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f10a6e274c0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023808_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023808_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'MUO'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_536823.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_536823.png'}, {'type': 'text', 'text': "\nClick on 'Read more about The Comprehensive Guide to a Memorable Vow Renewal Ceremony'"}]}]
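The call chain above ends with pil_loader handing raw Ceph bytes straight to Image.open, so any corrupt or non-image object kills the worker with UnidentifiedImageError. A minimal defensive variant is sketched below; the name safe_pil_loader and the return-None-on-failure policy are assumptions, not the actual pil_loader in src/aguvis/dataset.py:

```python
import io

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_value_str: bytes):
    """Decode raw bytes fetched from Ceph into an RGB PIL image.

    Hypothetical guard: returns None instead of raising when the
    bytes are not a decodable image, so the caller can skip or
    resample rather than crash the dataloader worker.
    """
    buff = io.BytesIO(img_value_str)
    try:
        img = Image.open(buff)
        img.load()  # force a full decode so truncated files fail here too
    except (UnidentifiedImageError, OSError):
        return None
    return img.convert("RGB")
```

A caller such as load_image_from_ceph could then treat a None result as "bad sample" and log the offending URI once instead of raising.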
(The identical UnidentifiedImageError traceback and Pdb breakpoint frame repeat for each failing sample across worker processes; the repeats are omitted. Distinct samples that failed to load:)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_44492.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_44492.png'}, {'type': 'text', 'text': "\nClick on 'input box'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/youku/screen_00000067.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240322/20240322_filtered/youku/screen_00000067.jpg'}, {'type': 'text', 'text': '\n详细描述我给的图片。 Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/cb072ac2-c061-45d6-b3a2-b8e8b5d17c3e.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/cb072ac2-c061-45d6-b3a2-b8e8b5d17c3e.png'}, {'type': 'text', 'text': '\nIndicate the location with bounding box to solve the task: 01:07. Right-click on it.'}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/26468.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/26468.jpg'}, {'type': 'text', 'text': '\nFor the operation: go to church click on text, output the bounding box of the region in the image that can complete the task. Please click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image7353.jpg
conv (image URI lost in log interleaving): {'type': 'text', 'text': '\nWhat is the current status of crime? Explain your answer based on UI element. Please double-click on it.'}
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_523183.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_523183.png'}, {'type': 'text', 'text': '\nClick on \'the image of "RMI_stickers 13_edited.png"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_014244_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_014244_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft search'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_191757_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_191757_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Zoom Out'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_92505.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_92505.png'}, {'type': 'text', 'text': '\nClick on \'The element is labeled "SWAG" and is part of the navigation menu on the main webpage, positioned between "SUSPENSION" and "WHEELS AND ACCESSORIES".\''}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/677ade56-386f-4221-8eac-ed8c5b2d1972.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/677ade56-386f-4221-8eac-ed8c5b2d1972.png'}, {'type': 'text', 'text': '\nHighlight the area with bounding box that I can click to execute the task: Camera. Please right-click on it.'}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_25_20_33_64876d6ed2324dc888a932017f020504-6.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/image2024_4_25_20_33_64876d6ed2324dc888a932017f020504-6.png'}, {'type': 'text', 'text': '\nWhat are the coordinates of the bounding box surrounding Navigate up in the picture? Click on it.'}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/50199.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/50199.jpg'}, {'type': 'text', 'text': '\nFor the operation: navigate to next page, output the bounding box of the area in the image that can complete the task. Please double-click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_420207.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_420207.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Two green olives with glossy texture and red dot, on white background"\''}]}]
Load image from ceph:
langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_187941.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_187941.png'}, {'type': 'text', 'text': "\nClick on 'Redirect to DJKrixmas.com Home Page'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_365828.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_365828.png'}, {'type': 'text', 'text': "\nClick on 'COMIC BOOK MOVIES button'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
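Every failure above has the same shape: `pil_loader` (dataset.py:40) hands raw bytes fetched from Ceph straight to `Image.open`, and PIL raises `UnidentifiedImageError` whenever the payload is not a decodable image (a truncated object or an S3 error body, for instance). A minimal defensive variant is sketched below; `pil_loader_safe` is an illustrative name and only `pil_loader`, `buff`, and `Image.open` come from the log.

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader_safe(img_bytes: bytes):
    """Decode raw image bytes; return None for undecodable payloads
    instead of letting UnidentifiedImageError kill the DataLoader worker."""
    buff = io.BytesIO(img_bytes)
    try:
        img = Image.open(buff)
        img.load()  # force the full decode now; Image.open alone is lazy
    except (UnidentifiedImageError, OSError):
        # non-image bytes, truncated download, or corrupt file
        return None
    return img.convert("RGB")
```

The caller (dataset.py:64 in the log's `__call__`) would then have to treat a `None` return as "sample unusable" and resample, rather than crashing.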
Exception:", e) (Pdb) last): conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_130505_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Sign in - Google Accounts'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8777018630> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in 
preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3337716ed0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_344477.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_344477.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Recite Me Accessibility Logo", at the right side of "YES", in "WE USE COOKIES"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, 
conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3337700ef0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_222336.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_222336.png'}, {'type': 'text', 'text': '\nClick on \'the image of "I Know Places", located at the top left corner of the image\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f10b5963420> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_021250_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wu conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_014934_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Search for anything'"}]}] n2vl_gui/train.py", line 210, in train [rank18]: trainer.train() [rank18]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank18]: return inner_training_loop( [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank18]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank18]: batch_samples += [next(epoch_iterator)] [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank18]: current_batch = next(dataloader_iter) [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank18]: data = self._next_data() [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguLoad image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/fanqiexiaoshuo/screen_00000232.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/fanqiexiaoshuo/screen_00000232.jpg'}, {'type': 'text', 'text': '\n详细描述我给的图片。 Please click on it.'}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagePodcasts_2024_3_9_13_58-130.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imagePodcasts_2024_3_9_13_58-130.png'}, {'type': 'text', 'text': '\nHow to locate the Navigate up and its bounding box in the given image? 
Please right-click on it.'}]}] l_gui/src/aguvis/dataset.py", line 408, in _get_item [rank18]: data_dict = self.preprocess_qwen2vl_v3( [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank18]: return self.preprocess_qwen2vl( [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank18]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank18]: image = tcs_loader(image) [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank18]: img = pil_loader(img_value_str) [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank18]: img = Image.open(buff) [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank18]: raise UnidentifiedImageError(msg) [rank18]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1da2238f90> [rank18]: During handling of the above exception, another exception occurred: [rank18]: Traceback (most recent call last): [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank18]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank18]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank18]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank18]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank18]: print(f"Failed to fetch sample {i}. Exception:", e) [rank18]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank18]: print(f"Failed to fetch sample {i}. Exception:", e) [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank18]: return self.dispatch_line(frame) [rank18]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank18]: if self.quitting: raise BdbQuit [rank18]: bdb.BdbQuit [rank4]: Traceback (most recent call last): [rank4]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank4]: train() [rank4]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank4]: trainer.train() [rank4]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank4]: return inner_training_loop( [rank4]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank4]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank4]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank4]: batch_samples += [next(epoch_iterator)] [rank4]: File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank4]: current_batch = next(dataloader_iter) [rank4]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank4]: data = self._next_data() [rank4]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank4]: return self._process_data(data) [rank4]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank4]: data.reraise() [rank4]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank4]: raise exception [rank4]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. 
[rank4]: Original Traceback (most recent call last): [rank4]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank4]: sample = self._get_item(i) [rank4]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank4]: data_dict = self.preprocess_qwen2vl_v3( [rank4]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_504929.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_504929.png'}, {'type': 'text', 'text': "\nClick on 'Small centered avatar image with low resolution'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
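The repeated `Failed to fetch sample {i}. Exception:` prints come from the `except` branch of `__getitem__` at dataset.py:364, which then stops at a pdb breakpoint. A common pattern for surviving a handful of corrupt samples is to log and fall back to another index instead of pausing in a debugger. The sketch below is illustrative only: `RetryingDataset` and the deterministic next-index fallback are assumptions, not the repo's actual code.

```python
class RetryingDataset:
    """Skip-and-resample pattern for datasets with occasional bad samples.

    On failure, log the exception (as the training log's __getitem__ does)
    and try a neighboring index instead of entering a debugger.
    """

    def __init__(self, samples, max_retries: int = 10):
        self.samples = samples
        self.max_retries = max_retries

    def __len__(self):
        return len(self.samples)

    def _get_item(self, i):
        # Stand-in for the real pipeline: None plays the role of an
        # image that PIL cannot decode.
        if self.samples[i] is None:
            raise ValueError(f"corrupt sample {i}")
        return self.samples[i]

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = (i + 1) % len(self.samples)  # deterministic fallback
        raise RuntimeError("too many corrupt samples in a row")
```

A random fallback index is equally common; the deterministic one is used here only so behavior is reproducible.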
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3337713a60> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) [rank36]: Traceback (most recent call last): [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank36]: train() [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank36]: trainer.train() [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank36]: return inner_training_loop( [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank36]: batch_samples, nTraceback (most recent call last): samples(epoch_iterator, num_batches) [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank36]: batch_samples += [next(epoch_iterator)] [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank36]: current_batch = next(dataloader_iter) [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank36]: data = self._next_data() [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank36]: return 
self._process_data(data) [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank36]: data.reraise() [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank36]: raise exception [rank36]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank36]: Original Traceback (most recent call last): [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank36]: sample = self._get_item(i) [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank36]: data_dict = self.preprocess_qwen2vl_v3( [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank36]: return self.preprocess_qwen2vl( [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank36]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank36]: image = tcs_loader(image) [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank36]: img = pil_loader(img_value_str) [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank36]: img = Image.open(buff) [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank36]: raise UnidentifiedImageError(msg) [rank36]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f788dc41170> [rank36]: 
During handling of the above exception, another exception occurred: [rank36]: Traceback (most recent call last): [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank36]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank36]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank36]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank36]: print(f"Failed to fetch sample {i}. Exception:", e) [rank36]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank36]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank36]: return self.dispatch_line(frame) [rank36]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank36]: if self.quitting: raise BdbQuit [rank36]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f607bb75580> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_92280.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 
'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_92280.png'}, {'type': 'text', 'text': "\nClick on 'redirect to insulated vinyl siding information page'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f10a6e1ec00> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_728815.png > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
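Note that it is the `bdb.BdbQuit` chain, not the image error itself, that kills the run: the pdb session opened at dataset.py:364 is running inside a DataLoader worker process, which has no usable stdin, so the debugger exits via `BdbQuit` and the trainer re-raises it ("Caught BdbQuit in DataLoader worker process 0"). For interactive debugging, either set `num_workers=0` or guard breakpoints so they fire only in the main process. A sketch of such a guard follows; `breakpoint_allowed` and `safe_breakpoint` are illustrative names, not code from the repo.

```python
import multiprocessing


def breakpoint_allowed() -> bool:
    """True only in the main process.

    DataLoader workers are child processes without an interactive
    stdin, so entering pdb there ends in bdb.BdbQuit instead of a
    usable prompt (as seen in the training log).
    """
    return multiprocessing.current_process().name == "MainProcess"


def safe_breakpoint() -> None:
    # Drop into pdb in the main process; just log and continue in workers.
    if breakpoint_allowed():
        import pdb

        pdb.set_trace()
    else:
        print("skipping breakpoint in",
              multiprocessing.current_process().name)
```

Inside an actual DataLoader worker, `torch.utils.data.get_worker_info()` returning non-None is the more direct test; the process-name check above is only a dependency-free approximation.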
Exception:", e) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_728815.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Roger Vidal Chicken Mushroom"\''}]}] (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f333771f2e0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_390369.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_390369.png'}, {'type': 'text', 'text': "\nClick on 'searchBoxAccessibleText'"}]}] > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) gent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0c367e35b0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in 
pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8630a7cbd0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0aa3dc6700> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_043549_before_screenshot_sub2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_043549_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on '3 They Drink Lemon Tea'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) l_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe1550acfe0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_043252_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_043252_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Maps'"}]}] menu, above "New York, NY, US", under "Writer/Copy Editor and Proofreader\nAequor \xa0•\xa0 Full-time \xa0•\xa0 New York, NY, US"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f76370d8f90> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515,Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f61eb3039c0> /hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = 
pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f311a98e700> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_123926_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_123926_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Decide security'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e)> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_525499.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_525499.png'}, {'type': 'text', 'text': '\nClick on \'WhatsApp Share, to the right of "Tweet"\''}]}] l_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe2e68162a0> t/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f862fc227f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) uzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efe4ed062f0> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_9231.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_9231.png'}, {'type': 'text', 'text': '\nClick on \'The text that reads "Exway" within the dropdown menu under the "BRANDS" tab in the navigation bar at the top of the page.\''}]}] led to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_77454.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_77454.png'}, {'type': 'text', 'text': '\nClick on \'A navigation menu item labeled "CAMPOBELLO ISLAND" with a small downward-pointing arrow to the right.\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_276523.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_276523.png'}, {'type': 'text', 'text': "\nClick on 'Do Olive Trees Grow in Vineyards'"}]}] haoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f333773abb0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f76bf9a7fb0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f877c1eba60> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023735_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_023735_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Legal'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
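Every traceback above bottoms out in `pil_loader`, where `Image.open` is handed a `BytesIO` holding bytes that PIL cannot identify (typically a corrupt or zero-length object fetched from ceph). One way to keep a bad object from crashing a dataloader worker is to decode defensively and return a sentinel the caller can act on. This is a minimal sketch, not the repo's actual code; the name `safe_pil_loader` is hypothetical:

```python
import io

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes):
    """Decode raw bytes into an RGB PIL image; return None if the data is corrupt."""
    buff = io.BytesIO(img_bytes)
    try:
        img = Image.open(buff)
        img.load()  # force a full decode so truncated files fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        # Corrupt or empty object from ceph: let the caller skip/resample
        # this sample instead of raising inside the dataloader worker.
        return None
```

The caller (e.g. `__getitem__`) can then treat `None` as "resample another index" rather than dropping into pdb.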
[same UnidentifiedImageError traceback and pdb break, repeated; remaining distinct failures in this stretch (some paths and instructions truncated in the original log):]

Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_86844.png
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_483227.png -- "Click on 'Gray lock ic" [truncated]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/54862.jpg -- 'Please output the bounding box of the region correctly responding to this task: take me to the website. Please right-click on it.'
Load image from ceph: [path truncated] ...31701_before_screenshot.png -- "Click on 'Amazon Alexa Echo'"
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_523344.png -- "Click on 'exercise 7.2 (ncert class 11th maths) (5 questions)'"
in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyaLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_133850_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_133850_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Google Cloud Platform'"}]}] e 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f44b4d74fe0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample 
{i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_627754.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_627754.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Canvas Home"\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe36f7a3100> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140504_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_140504_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Restore Down'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguTraceback (most recent call last): print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f42f829d2b0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3f2f21ec50> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) orrectly responding to this question: select the icon beside the show image after saving. 
Please right-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f86b01c16c0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", liTraceback (most recent call last): //wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_114623_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_114623_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Shadow'"}]}] n2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f09a63dfbf0> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image34038.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image34038.jpg'}, {'type': 'text', 'text': '\nTHe currency What currency is used for the price? Explain your answer based on UI element. Right-click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_127723.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_127723.png'}, {'type': 'text', 'text': '\nClick on \'Type into text input field labeled "Made between", on the bottom right of the webpage\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _g> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_005409_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_005409_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Editing'"}]}] ph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3337738450> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f42f829d3a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3f2ec87560> self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0d7a6792b0> (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f158ca5cc20> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_024827_before_screenshot_sub0.png conv: 
[{'role': 'useLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_780050.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screensTraceback (most recent call last): Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/sc File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f777999eca0> et.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/ImTraceback (most 
recent call last): se UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe0f0f08680> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f333773a980> ge.py", line 3536, in open raise UnidentifiedImageError(msg) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e)PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7bf8b44540> (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/32bd89d7-2ae1-4cf4-bd7d-f116777eabaa.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/32bd89d7-2ae1-4cf4-bd7d-f116777eabaa.png'}, {'type': 'text', 'text': '\nPlease identify the area of the element with bounding box that can provide an answer to this question: Tasks. Please double-click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_272643.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_272643.png'}, {'type': 'text', 'text': "\nClick on 'text box'"}]}] ne 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41b551bce0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", 
line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41b5f2b510> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample =Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_69/C4web50k-3_275202310-split-17.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_69/C4web50k-3_275202310-split-17.png'}, {'type': 'text', 'text': '\nHow would you determine the bounding box for the one or more of the 14 supplements is missing shown in the image? 
Please double-click on it.'}]}] vis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4371a626b0> > /mnt/hwfile/liuzhaoyang/workspace> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_034252_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atla(Pdb) op_domain/windows_images20240824_034252_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Viridian'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8631400a90> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f862fc227a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_154107_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'System and Security'"}]}] W, Book'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open 
raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3337738a40> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f87fe875800> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_200445.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_200445.png'}, {'type': 'text', 'text': "\nClick on 'text input box, at the bottom of the page'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_562620.png conv: [{'role': 
'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_562620.png'}, {'type': 'text', 'text': "\nClick on 'Flower shop network opens in new window'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f40eb0cf560> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) .BytesIO object at 0x7f76374a1df0> Load image from ceph: langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/20240726150908233/main_292_after.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/gui-share/agent_data/agent_data/ui-detect/label_data/0726_099_batch_3_num_1500/20240726150908233/main_292_after.png'}, {'type': 'text', 'text': '\n- Role: APP界面设计分析专家\n- Background: 用户需要对APP界面进行布局和图标设计,要求输出相应的json结构信息,包括node_type, location, content等属性。\n- Goals: 提供一个清晰、准确的json结构信息,以满足用户对APP界面布局和图标设计的需求。\n- OutputFormat: json格式,包含node_type, location, content等属性。\n- Workflow:\n 1. 分析APP界面的布局和图标设计需求。\n 2. 根据分析结果,确定json结构中的node_type, location, content等属性。\n 3. 
编写符合规范的json结构信息。\n请直接给出json格式的数据:\n Please click on it.'}]}] n preprocess_qwen2vl_v3 [rank22]: return self.preprocess_qwen2vl( [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank22]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank22]: image = tcs_loader(image) [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank22]: img = pil_loader(img_value_str) [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank22]: img = Image.open(buff) [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank22]: raise UnidentifiedImageError(msg) [rank22]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe0f0f08680> [rank22]: During handling of the above exception, another exception occurred: [rank22]: Traceback (most recent call last): [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank22]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank22]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank22]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank22]: File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank22]: print(f"Failed to fetch sample {i}. Exception:", e) [rank22]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank22]: print(f"Failed to fetch sample {i}. Exception:", e) [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank22]: return self.dispatch_line(frame) [rank22]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank22]: if self.quitting: raise BdbQuit [rank22]: bdb.BdbQuit Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot 
Exception:", e) (Pdb) [rank38]: Traceback (most recent call last): [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank38]: train() [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank38]: trainer.train() [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank38]: return inner_training_loop( [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank38]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank38]: batch_samples += [next(epoch_iterator)] [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank38]: current_batch = next(dataloader_iter) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank38]: data = self._next_data() [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank38]: return self._process_data(data) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank38]: data.reraise() [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank38]: raise 
exception [rank38]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank38]: Original Traceback (most recent call last): [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank38]: sample = self._get_item(i) [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank38]: data_dict = self.preprocess_qwen2vl_v3( [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank38]: return self.preprocess_qwen2vl( [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank38]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank38]: image = tcs_loader(image) [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank38]: img = pil_loader(img_value_str) [rank38]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank38]: img = Image.open(buff) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/ImTraceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in 
preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41b5ebbe70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) /bdb.py", line 90, in trace_dispatch [rank38]: return self.dispatch_line(frame) [rank38]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank38]: if self.quitting: raise BdbQuit [rank38]: bdb.BdbQuit l_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ 
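The `bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0` failures are a second-order problem: a `pdb` breakpoint (the `(Pdb)` prompts in the log) is active inside `__getitem__`, and DataLoader worker subprocesses have no usable stdin, so leaving the debugger propagates `BdbQuit` through the worker and kills the run. A common alternative is to log the bad sample and fall back to another index; a sketch with a hypothetical dataset class, not the aguvis code:

```python
class RetryingDataset:
    """Sketch of skip-and-fallback error handling for __getitem__ (a real
    version would subclass torch.utils.data.Dataset). A sample that fails
    to decode is logged and replaced by the next index, rather than
    dropping into pdb, which raises BdbQuit inside a DataLoader worker."""

    def __init__(self, samples):
        self.samples = samples

    def __len__(self):
        return len(self.samples)

    def _get_item(self, i):
        value = self.samples[i]
        if value is None:  # stand-in for PIL.UnidentifiedImageError
            raise ValueError(f"cannot decode sample {i}")
        return value

    def __getitem__(self, i):
        for _ in range(len(self.samples)):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception:", e)
                i = (i + 1) % len(self.samples)  # deterministic fallback
        raise RuntimeError("no decodable sample found")
```

Random resampling works just as well as the sequential fallback shown here; the essential point is that the worker recovers on its own instead of waiting on an interactive debugger.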
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043532_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_043532_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Extension'"}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/19221.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/19221.jpg'}, {'type': 'text', 'text': '\nWhere is the bounding box coordinates to click for completing the objective: select learn. Click on it.'}]}]
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7b785bbba0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_813865.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_813865.png'}, {'type': 'text', 'text': '\nClick on \'the image of "By Gamers For Gamers Logo"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_34571.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_34571.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Krone category page'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open 
raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0d79992110> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_824568.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_824568.png'}, {'type': 'text', 'text': "\nClick on 'About Optimum Healthcare IT button'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) ile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3f2ec87560> Traceback (most recent 
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_139068.png
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_349502.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_349502.png'}, {'type': 'text', 'text': "\nClick on 'select input field, at the top of the webpage'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1af041d3a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch 
sample {i}. Exception:", e) (Pdb) e 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efb3e03cc70> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_55721.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_55721.png'}, {'type': 'text', 'text': "\nClick on 'Google Sitemap Generator Plugin, Google (XML) Sitemaps Generator Plugin for WordPress'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f86b01c7e20> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liu File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: 
cannot identify image file <_io.BytesIO object at 0x7f41b551bce0> ': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_48319.png'}, {'type': 'text', 'text': "\nClick on 'A circular icon featuring a white leaf design on a dark purple background, located in the top left corner of the webpage.'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f73b14471f0> Load image from ceph: 
langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/448d74bf-ffc0-43fa-aaf1-397ec6c57cc2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/448d74bf-ffc0-43fa-aaf1-397ec6c57cc2.png'}, {'type': 'text', 'text': '\nIdentify the bounding box for the element in the screen to complete the task: Sun, Oct 1. Click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f76bbb750d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efc813f27f0> Load image from ceph: langchao2:s3: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) 
File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f87f8962e30> le "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8a8ce76f20> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwf File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Ima> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_49639.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_49639.png'}, {'type': 'text', 'text': > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_687239.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_687239.png'}, {'type': 'text', 'text': "\nClick on 'Listen Live'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_013852_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_013852_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Bullets'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fdfac4b98a0> Traceback (most recent call last): Traceback (most recent call last): taset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_090656_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_090656_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'View comments 6 Comment'"}]}] t.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4371a638d0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7778f66570> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample =Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_669972.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_669972.png'}, {'type': 'text', 'text': '\nClick on \'Grey rectangular email input field with light grey background and grey border, with "Email Address" placeholder text\''}]}] d_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ef8aa13c5e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/09653d35-82f8-48f3-ad7b-e8a4ba047afd.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/09653d35-82f8-48f3-ad7b-e8a4ba047afd.png'}, {'type': 'text', 'text': '\nCan you determine the bounding box coordinates to complete my task: Running Adventure. 
Click on it.'}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe02c16b560> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_480784.png conv: [{'role': 'user', 'coLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160634_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160634_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Styles'"}]}] src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open rais File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preproces> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) ess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f83a929af70> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
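Every failure above has the same signature: `PIL.UnidentifiedImageError` raised by `Image.open` inside `pil_loader`, meaning the bytes fetched from Ceph/S3 are not a decodable image (for `.png` keys, typically a truncated upload or an error payload stored under the key). One cheap way to pre-screen downloaded objects before a training run is a magic-byte check. This is only a sketch: `looks_like_png` and `split_valid` are hypothetical helper names, not part of the aguvis code, and a header check misses tail truncation, so a thorough sweep would still attempt a full PIL decode per object.

```python
PNG_MAGIC = b"\x89PNG\r\n\x1a\n"  # the fixed 8-byte signature of every valid PNG file


def looks_like_png(data: bytes) -> bool:
    """Cheap pre-check: an object whose first bytes are not the PNG
    signature (e.g. an S3 error page stored under a .png key) will
    always fail PIL's Image.open with UnidentifiedImageError."""
    return data[:8] == PNG_MAGIC


def split_valid(objects):
    """objects: iterable of (key, raw_bytes) pairs.
    Returns (valid_keys, corrupt_keys) so corrupt keys can be
    dropped from the training index before workers ever see them."""
    valid, corrupt = [], []
    for key, data in objects:
        (valid if looks_like_png(data) else corrupt).append(key)
    return valid, corrupt
```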
[rank35]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank35]:     trainer.train()
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank35]:     return inner_training_loop(
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank35]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank35]:     batch_samples += [next(epoch_iterator)]
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank35]:     current_batch = next(dataloader_iter)
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank35]:     data = self._next_data()
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank35]:     return self._process_data(data)
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank35]:     data.reraise()
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank35]:     raise exception
[rank35]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank35]: Original Traceback (most recent call last):
[rank35]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank35]:     sample = self._get_item(i)
[rank35]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank35]:     data_dict = self.preprocess_qwen2vl_v3(
[rank35]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank35]:     return self.preprocess_qwen2vl(
[rank35]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank35]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank35]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank35]:     image = tcs_loader(image)
[rank35]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank35]:     img = pil_loader(img_value_str)
[rank35]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank35]:     img = Image.open(buff)
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank35]:     raise UnidentifiedImageError(msg)
[rank35]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8a8ce76f20>
[rank35]: During handling of the above exception, another exception occurred:
[rank35]: Traceback (most recent call last):
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank35]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank35]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank35]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank35]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank35]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank35]:     return self.dispatch_line(frame)
[rank35]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank35]:     if self.quitting: raise BdbQuit
[rank35]: bdb.BdbQuit
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_35596.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_35596.png'}, {'type': 'text', 'text': "\nClick on 'open Pontiac Transportation Museum YouTube channel in a new tab'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_515161.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_515161.png'}, {'type': 'text', 'text': "\nClick on 'Elena Greene’s Author website'"}]}]
image file <_io.BytesIO object at 0x7f877700ba60> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe02c16ade0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160743_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_160743_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Font Size'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0c367e3150> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_142882.png conv: [{'role': (Pdb) k (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = 
self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f83ab8a99e0> Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/AMEX/imageOutlook_2024_2_2_18_33-96.png > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/datLoad image from ceph: VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/dazhongdianping/screen_00000490.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/dazhongdianping/screen_00000490.jpg'}, {'type': 'text', 'text': '\n请标出界面中返回图标的位置,给Load image from ceph: langchao2:s3://wuzhenyu/d File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe2e6b223e0> > 
/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) n it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f86b0379b20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0d785f2570> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f86b0f1b0b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_081444_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'New split screen'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_106982.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_106982.png'}, {'type': 'text', 'text': "\nClick on 'Display Template (1 item)'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfiLoad image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_354144.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_354144.png'}, {'type': 'text', 'text': "\nClick on 'Los Angeles Hotels, on the left side of the screenshot'"}]}] en2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f852710f100> Traceback (most recent call last): Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file 
<_ioTraceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f333782a110> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = 
self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f11cdadbc40> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_90312.png conv: [{'role': 'user', 'conLoad image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_134143_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_134143_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Cascade'"}]}] led to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efdc5687240> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/d089e25b-58a4-4c8d-b3b3-bf078795e0a8.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/d089e25b-58a4-4c8d-b3b3-bf078795e0a8.png'}, {'type': 'text', 'text': '\nFor task: Send feedback to Google, where should I click to complete the task, and provide the bounding box of the region. 
Double-click on it.'}]}] c/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f839c76e6b0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", 
line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f839c760ae0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): /GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_822339.png Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/47813.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_822339.png'}, {'type': 'text', 'text': '\nClick on \'the image of "5 Easy Photography Ideas in 90 Seconds"\''}]}] conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/47813.jpg'}, {'type': 'text', 'text': '\nCan you provide the click location with bounding box for completing the task: select the url. 
Please click on it.'}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f42f8296430> _ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efbbfaa5260> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f15abd31cb0>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
- Avoid using `tokenizers` before the fork if possible
- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_680866.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_680866.png'}, {'type': 'text', 'text': '\nClick on \'CLICK White "f" logo on black background\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_682360.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_682360.png'}, {'type': 'text', 'text': "\nClick on 'Support, at the top of the image'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_033506_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Follow on Facebook'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image4910.jpg'}, {'type': 'text', 'text': '\nHow much time will it take to make the dish? Explain your answer based on UI element. Click on it.'}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/ui_data/ui_home_screen_phone/ui_homescreen_phone_20240812_v11/homescreen-2102.jpg'}, {'type': 'text', 'text': '\nFind the 智联招聘 (Zhilian Zhaopin) icon in the screenshot and give the detailed coordinates of its bounding box. Click on it.'}]}]
Load image from ceph: VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/34293.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'VC:s3://gui/visual_inputs/multi_modal_2024/gui_data/ui_data/OpenApp/image/34293.jpg'}, {'type': 'text', 'text': '\nLocate the UI element button for watch tv episodes in the image and describe its position with bounding box. Please click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_141505.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_141505.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Logo Collection page'"}]}]
_182453_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Mail'"}]}]
_00000314.jpg'}, {'type': 'text', 'text': '\nPlease describe the content framed by [[264, 51, 353, 106]] in the image. Click on it.'}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_151445.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_151445.png'}, {'type': 'text', 'text': "\nClick on 'Video Tour'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_111820_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_111820_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Search settings'"}]}]
Find and box 我的设备 (My devices) in the image, and give the precise coordinates of the bounding box. Please click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_034644_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_034644_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Exit Immersive Reader (F9)'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_141934_before_screenshot_sub3.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_141934_before_screenshot_sub3.png'}, {'type': 'text', 'text': "\nClick on 'Notification Chevron'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240328/20240328_filtered/xiaoheihe/screen_00000107.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240328/20240328_filtered/xiaoheihe/screen_00000107.jpg'}, {'type': 'text', 'text': '\nDetect the text in the image and show the exact coordinate boxes of the text. Please right-click on it.'}]}]
{'type': 'text', 'text': "\nClick on 'Maintenance'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_818715.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_818715.png'}, {'type': 'text', 'text': "\nClick on 'Podcasts button'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_83299.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Google"\''}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_737702.png'}, {'type': 'text', 'text': '\nClick on \'link labeled "Text Edition"\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240905_145100_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240905_145100_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'VPN'"}]}]
'text', 'text': "\nClick on 'uCertify Blog Logo'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_104956.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_104956.png'}, {'type': 'text', 'text': '\nClick on \'The fourth link in the "Recent Posts" list located on the right sidebar of the webpage.\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_140740_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_140740_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Shanghai, China Weather trends | Microsoft Weather'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_787502.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_787502.png'}, {'type': 'text', 'text': "\nClick on 'Jesus (65 items)'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_462173.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_462173.png'}, {'type': 'text', 'text': "\nClick on 'Newsroom button'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_561964.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_561964.png'}, {'type': 'text', 'text': "\nClick on 'TIES/SHOES/ACCESSORIES button'"}]}]
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f18a79882c0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8b504f3150> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031818_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031818_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Marketing and advertising cookies'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image33292.jpg conv: [{'role': 'user conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_598099.png'}, {'type': 'text', 'text': "\nClick on 'The Women's Safe House'"}]}] ase. Click on it.'}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_274326.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_274326.png'}, {'type': 'text', 'text': "\nClick on '👉Garbaholic👈\nHave you ever noticed, what kids do at garba ground? Check full caption and this series of photographs....😍\n\nThanks to @ferfudardi Kids Garba / Gammat Night\n\nPost 6)\n**Practice Sessions**: Kids can participate in practice sessions before the main event to build their confidence and refine their garba skills.\n\n**Dance Leaders**: Many Garba groups have designated Garba leaders who guide newcomers, including children, in learning the creative stepa.\n\n**Peer Learning**: Children can learn by dancing with their peers, creating a supportive and enjoyable learning environment.\n\n#CandidKids #garbaplayers\n\n#happiness #happinessoverload #garba #kidsgarba #GarbaMagic #DanceUnity #garba #garbasteps #9924227745 #dipmementophotography\n#dip_memento_photography #happyinme #happyme #garbalove #passiongarba #DandiyaDance #Navratri2023#9924227745 #dipmementophotography #dip_memento_photography'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image31130.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image31130.jpg'}, {'type': 'text', 'text': '\nWhich year is 
shown? Explain your answer based on UI element. Please click on it.'}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_013923_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_013923_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'AutomationID: QuickStylesGallery'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f87770030b0> Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f11cfd64fe0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f607b287740> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 
0x7efbbc915ad0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_064108_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_064108_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'The Weather Channel - MSN - Sleeping'"}]}] /qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) (Pdb) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8a0d197830> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_180209_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_180209_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Icons'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_791320.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_791320.png'}, {'type': 'text', 'text': '\nClick on \'the image of "XXL Mag", in the middle of the screenshot\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_701197.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_701197.png'}, {'type': 'text', 'text': "\nClick on 'Pixelated white car outline on a black background, on the right side of the webpage'"}]}] ocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = 
Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6136cdf420> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_035801_before_screenshot_sub0.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_035801_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Going to?'"}]}] /workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f10b5f025c0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8777001300> Traceback (most recent call last): > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) (Pdb) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8776fefe20> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File " conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_178555.png'}, {'type': 'text', 'text': '\nClick on \'the image of "NAKPRO NUTRITION"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item 
data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5ff1e2f0b0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image44087.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image44087.jpg'}, {'type': 'text', 'text': '\nCreate a tree-like structure of the page layout based on the screenshot. 
Please double-click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f607bb75440> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_024355_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_024355_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Go back to filtering menu'"}]}] pace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe36f794d60> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f5ff1de4d10>
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)

[this traceback and (Pdb) prompt recur verbatim from many DataLoader workers, interleaved on stderr; only the <_io.BytesIO object at 0x...> address differs between occurrences]

Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_001515_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_001515_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Shadow'"}]}]

Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_808013.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_808013.png'}, {'type': 'text', 'text': '\nClick on \'CLICK button titled "Powered by Shopify"\''}]}]

Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_053820_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_053820_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Fashion'"}]}]

Load image from ceph: VC:s3://gui/visual_inputs/multi_modal/agent_data/AndroidUI/20240321/20240321_filtered/dazhongdianping/screen_00000407.jpg

conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240321/20240321_filtered/weixindushu/screen_00000174.jpg'}, {'type': 'text', 'text': '\nPlease explain the content boxed by [[46, 169, 127, 202]] in the image. Click on it.'}]}]

Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_784264.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_784264.png'}, {'type': 'text', 'text': '\nClick on \'link titled "Professional Sports"\''}]}]

Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_196972.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_196972.png'}, {'type': 'text', 'text': "\nClick on 'Contact Admin, titled Contact Webmaster'"}]}]

Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_242542.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_242542.png'}, {'type': 'text', 'text': "\nClick on 'submit form to receive more information'"}]}]

Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_112119.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_112119.png'}, {'type': 'text', 'text': '\nClick on \'SCHEDULE AND PROGRAM GUIDE, on the left side of "PROGRAM ARCHIVES"\''}]}]

Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/666397f2-8eb0-45ac-b9ab-7ab8a9c6d993.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/666397f2-8eb0-45ac-b9ab-7ab8a9c6d993.png'}, {'type': 'text', 'text': '\nFor the task: 17:35 - 18:35, output the bounding box for the region in the image that is most relevant to answering the question. Click on it.'}]}]

Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/a6bb2660-82ca-480f-89d8-45c4a717048a.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/agent_data/OS-Atlas/androidworld/a6bb2660-82ca-480f-89d8-45c4a717048a.png'}, {'type': 'text', 'text': '\nCan you determine the bounding box coordinates to complete my task: to-do: Expense Tracking Spreadsheet. Double-click on it.'}]}]

Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_372058.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_372058.png'}, {'type': 'text', 'text': "\nClick on 'share to facebook'"}]}]

[rank45]: Traceback (most recent call last):
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank45]:     train()
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank45]:     trainer.train()
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank45]:     return inner_training_loop(
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank45]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank45]:     batch_samples += [next(epoch_iterator)]
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 563, in __iter__
[rank45]:     next_batch = next(dataloader_iter)
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank45]:     data = self._next_data()
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank45]:     return self._process_data(data)
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank45]:     data.reraise()
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank45]:     raise exception
[rank45]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 1.
[rank45]: Original Traceback (most recent call last):
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
[rank45]:     sample = self._get_item(i)
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item
[rank45]:     data_dict = self.preprocess_qwen2vl_v3(
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3
[rank45]:     return self.preprocess_qwen2vl(
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl
[rank45]:     image_inputs = load_image_from_ceph(self.tcs_loader, conv)
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
[rank45]:     image = tcs_loader(image)
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
[rank45]:     img = pil_loader(img_value_str)
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
[rank45]:     img = Image.open(buff)
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
[rank45]:     raise UnidentifiedImageError(msg)
[rank45]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f44b70a8720>
[rank45]: During handling of the above exception, another exception occurred:
[rank45]: Traceback (most recent call last):
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop
[rank45]:     data = fetcher.fetch(index)  # type: ignore[possibly-undefined]
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch
[rank45]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in <listcomp>
[rank45]:     data = [self.dataset[idx] for idx in possibly_batched_index]
[rank45]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__
[rank45]:     print(f"Failed to fetch sample {i}. Exception:", e)
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch
[rank45]:     return self.dispatch_line(frame)
[rank45]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line
[rank45]:     if self.quitting: raise BdbQuit
[rank45]: bdb.BdbQuit

Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_154701_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_154701_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Curve'"}]}]

[rank47]: Traceback (most recent call last):
[rank47]: [same call chain as the rank45 traceback above, except accelerate/data_loader.py line 552, in __iter__: current_batch = next(dataloader_iter)]
[rank47]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank47]: Original exception: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe2e71885e0>
[rank47]: During handling of the above exception, bdb.BdbQuit was raised from the debugger hook at dataset.py line 364, exactly as on rank45.
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe42a6d7e20> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe428db6840> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe1550acfe0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_82408.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_82408.png'}, {'type': 'text', 'text': '\nClick on \'The "Home" link in the navigation breadcrumb located towards the top-left of the page, situated just above the large image of yellow flowers.\''}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155716_before_screenshot_sub1.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155716_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Start a conversation'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_404900.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_404900.png'}, {'type': 'text', 'text': '\nClick on \'Blue background with white \'Contact\' text, centered and hyperlinked, on the left side of "Settings"\''}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_456755.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_258k_function_filtered_46849.png'}, {'type': 'text', 'text': "\nClick on 'navigate to Goodness and Mercy article'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_171649_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240826_171649_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Group'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_605819.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_605819.png'}, {'type': 'text', 'text': "\nClick on 'Google Terms of Service link'"}]}]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_456755.png'}, {'type': 'text', 'text': "\nClick on 'WRITE A REVIEW, at the bottom of the webpage'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb)
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image38719.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image38719.jpg'}, {'type': 'text', 'text': '\nWhat is the selected theme? Explain your answer based on UI element. Click on it.'}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_124125_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_124125_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Google Cloud Estimate Summary'"}]}]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_015949_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_015949_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Name'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb)
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image41468.jpg
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_204522_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_204522_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on '6faf3555'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_633676.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_633676.png'}, {'type': 'text', 'text': "\nClick on 'Image of 2011 CHRYSLER 200'"}]}]
Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image59699.jpg
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image59699.jpg'}, {'type': 'text', 'text': '\nWhat is the user name? Answer the question using a single word or phrase. 
Please click on it.'}]}]
[rank37]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efb3b73ba10> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) uzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f44b46edcb0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
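Every failure above bottoms out in the same frame: `pil_loader` (dataset.py:40) hands `Image.open` a `BytesIO` whose contents are not a decodable image, so PIL raises `UnidentifiedImageError`. That usually points at a truncated or failed object fetch rather than a PIL bug. A minimal defensive variant of the loader might look like the sketch below (`pil_loader_safe` is a hypothetical name, not the function actually in dataset.py, and callers would have to accept `None` for bad samples):

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader_safe(img_bytes: bytes):
    """Decode raw image bytes, returning None instead of raising when the
    payload is not a valid image (e.g. a truncated or failed object fetch).

    Hypothetical sketch; the real pil_loader in dataset.py raises instead.
    """
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # force a full decode so truncated files fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        # OSError covers PIL's truncated-file and related decode errors.
        return None
```

Callers can then skip or resample when `None` comes back, instead of letting the exception escape the DataLoader worker.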
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_686879.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_686879.png'}, {'type': 'text', 'text': "\nClick on 'input field'"}]}]
[... identical UnidentifiedImageError traceback (<_io.BytesIO object at 0x7f86305f01d0>) and (Pdb) prompt omitted ...]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_132227_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240827_132227_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Guidelines'"}]}]
[... identical traceback (<_io.BytesIO object at 0x7f1598ebd490>) and (Pdb) prompt omitted ...]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_756885.png'}, {'type': 'text', 'text': '\nClick on \'input box, on the left side of the screenshot, under "Kindly search your topic below or browse the recent posts."\''}]}]
[... identical traceback (<_io.BytesIO object at 0x7f85af9777e0>) omitted ...]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_403577.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_403577.png'}, {'type': 'text', 'text': "\nClick on 'concrete floor Concrete texture big. Premium Textures and Backgrounds · Download concrete cement Texture download. Download the high resolution for commercial use.'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_46483.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_46483.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Personal"\''}]}]
[... identical traceback (<_io.BytesIO object at 0x7f7600f8bab0>) and (Pdb) prompt omitted ...]
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/AndroidUI20240325/20240325_filtered/shihuo/screen_00000138.jpg'}, {'type': 'text', 'text': '\n详细描述一下图片。 Double-click on it.'}]}]
[... identical traceback (<_io.BytesIO object at 0x7f311a98e570>) omitted ...]
[... identical UnidentifiedImageError traceback (<_io.BytesIO object at 0x7f18205f4f90>) omitted ...]
[rank39]: Traceback (most recent call last):
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in <module>
[rank39]:     train()
[rank39]:   File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train
[rank39]:     trainer.train()
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train
[rank39]:     return inner_training_loop(
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop
[rank39]:     batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches)
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples
[rank39]:     batch_samples += [next(epoch_iterator)]
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__
[rank39]:     current_batch = next(dataloader_iter)
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__
[rank39]:     data = self._next_data()
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data
[rank39]:     return self._process_data(data)
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data
[rank39]:     data.reraise()
[rank39]:   File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise
[rank39]:     raise exception
[rank39]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0.
[rank39]: Original Traceback (most recent call last):
[rank39]:   ... (same __getitem__ -> load_image_from_ceph -> pil_loader chain ending in PIL.UnidentifiedImageError (<_io.BytesIO object at 0x7f181fc58770>), then bdb.BdbQuit raised from the pdb trace hooks at dataset.py:364, as in the [rank37] traceback above) ...
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_439666.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_439666.png'}, {'type': 'text', 'text': '\nClick on \'Documents, below "Galleries", in "Honey Bees"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_165405.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_165405.png'}, {'type': 'text', 'text': '\nClick on \'the Product: select menu, to the top right of "Black"\''}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_329550.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_329550.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Instagram Social Icon"\''}]}]
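The `bdb.BdbQuit` that actually kills training is not the image error itself: the handler at dataset.py:364 apparently drops into pdb, and with `num_workers > 0` the DataLoader worker has no usable stdin, so pdb's trace hooks abort with `BdbQuit`, which torch reraises in the main process. A common alternative is to log the bad sample and retry a different index inside `__getitem__`, never invoking the debugger in a worker. A sketch under stated assumptions (`RobustDatasetMixin`, `MAX_RETRIES`, and this use of `_get_item` are illustrative, not the repo's actual structure):

```python
import random


class RobustDatasetMixin:
    """Retry-on-failure __getitem__: log the bad sample and resample a
    random index instead of breaking into pdb inside a DataLoader worker.

    Hypothetical sketch; assumes the concrete dataset defines __len__ and
    _get_item(i).
    """

    MAX_RETRIES = 10  # give up after this many consecutive bad samples

    def __getitem__(self, i):
        for _ in range(self.MAX_RETRIES):
            try:
                return self._get_item(i)
            except Exception as e:
                # Workers have no TTY, so never call pdb.set_trace() here.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randint(0, len(self) - 1)
        raise RuntimeError(f"too many consecutive bad samples near index {i}")
```

With a pattern like this, a corrupt object in Ceph costs one log line and a resample rather than a `BdbQuit` that takes down every rank.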
[... identical UnidentifiedImageError tracebacks (<_io.BytesIO object at 0x7f82b0d02020>, 0x7f3337711850>) and (Pdb) prompts omitted ...]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_060701_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240824_060701_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Privacy Help Center - Policies Help'"}]}]
[... identical tracebacks (<_io.BytesIO object at 0x7efb3e9e1940>, 0x7f181f19f2e0>, 0x7f7a4972a5c0>) omitted ...]
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_195829_before_screenshot_sub0.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240828_195829_before_screenshot_sub0.png'}, {'type': 'text', 'text': "\nClick on 'Follow on Facebook'"}]}]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}. Exception:", e)
(Pdb)
Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031335_before_screenshot.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_031335_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'LinkedIn'"}]}]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_545319.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_545319.png'}, {'type': 'text', 'text': "\nClick on 'SHOP HOME button'"}]}]
[... truncated conv fragment and start of identical traceback omitted ...]
[... identical UnidentifiedImageError tracebacks (<_io.BytesIO object at 0x7f829e8c33d0>, 0x7f87746a6340>, 0x7f15969c9b70>) and (Pdb) prompts omitted ...]
Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_51033.png
conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_51033.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Download Smart Hub"\''}]}]
[... identical traceback (<_io.BytesIO object at 0x7f79153fb150>) and (Pdb) prompts omitted ...]
> /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__()
-> print(f"Failed to fetch sample {i}.
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155144_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_155144_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Pictures'"}]}] Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_110003_before_screenshot_sub1.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_110003_before_screenshot_sub1.png'}, {'type': 'text', 'text': "\nClick on 'Reorder My Items'"}]}] Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) 
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f302714a980> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f84c6a3fd80> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_364964.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_364964.png'}, {'type': 'text', 'text': "\nClick on 'Open Cookie Preferences'"}]}] Agent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f15969ca930> Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_372840.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_372840.png'}, {'type': 'text', 'text': "\nClick on 'About the Project, labeled About Durham Miner'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f1a8b1909f0> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240904_140648_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/macos_images20240904_140648_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Microsoft'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = 
self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3337740130> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f83a929af70> .BytesIO object at 0x7f18a7823e20> File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efc813593a0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_124033_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_124033_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Networks for migrating enterprise workloads'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) tent': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_28636.png'}, {'type': 'text', 'text': "\nClick on 'Professional Terms & Conditions'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Traceback (most recent call last): //dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_139570.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_139570.png'}, {'type': 'text', 'text': "\nClick on 'A left-pointing arrow icon within a semi-transparent circular background, indicating navigation functionality.'"}]}] Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image70416.jpg conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal/agent_data/rico/dataset/image70416.jpg'}, {'type': 'text', 'text': '\nGenerate a tree-style layout visualization of the page using the screenshot. 
Please double-click on it.'}]}] png'}, {'type': 'text', 'text': "\nClick on 'Flash'"}]}] File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3337712a70> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = 
load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f18a78232e0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_173512_before_screenshot_sub2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_173512_before_screenshot_sub2.png'}, {'type': 'text', 'text': "\nClick on 'Feedback'"}]}] /workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f83ab8a99e0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe2e6758b30> 571.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_623571.png'}, {'type': 'text', 'text': '\nClick on \'the image of "church of the incarnation logo"\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_710807.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_710807.png'}, {'type': 'text', 'text': "\nClick on 'renovations'"}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f33378984f0> Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f18a7823380> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) [rank34]: Traceback (most recent call last): [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 222, in [rank34]: train() [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/train.py", line 210, in train [rank34]: trainer.train() [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2164, in train [rank34]: return inner_training_loop( [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 2473, in _inner_training_loop [rank34]: batch_samples, num_items_in_batch = self.get_batch_samples(epoch_iterator, num_batches) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/transformers/trainer.py", line 5130, in get_batch_samples [rank34]: batch_samples += [next(epoch_iterator)] [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/accelerate/data_loader.py", line 552, in __iter__ [rank34]: 
current_batch = next(dataloader_iter) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 701, in __next__ [rank34]: data = self._next_data() [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1465, in _next_data [rank34]: return self._process_data(data) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/dataloader.py", line 1491, in _process_data [rank34]: data.reraise() [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/_utils.py", line 715, in reraise [rank34]: raise exception [rank34]: bdb.BdbQuit: Caught BdbQuit in DataLoader worker process 0. [rank34]: Original Traceback (most recent call last): [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ [rank34]: sample = self._get_item(i) [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item [rank34]: data_dict = self.preprocess_qwen2vl_v3( [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 [rank34]: return self.preprocess_qwen2vl( [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl [rank34]: image_inputs = load_image_from_ceph(self.tcs_loader, conv) [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph [rank34]: image = tcs_loader(image) [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ [rank34]: img = pil_loader(img_value_str) [rank34]: File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader [rank34]: img = Image.open(buff) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open [rank34]: raise UnidentifiedImageError(msg) [rank34]: PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7efc813593a0> [rank34]: During handling of the above exception, another exception occurred: [rank34]: Traceback (most recent call last): [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/worker.py", line 351, in _worker_loop [rank34]: data = fetcher.fetch(index) # type: ignore[possibly-undefined] [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in fetch [rank34]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/data/_utils/fetch.py", line 52, in [rank34]: data = [self.dataset[idx] for idx in possibly_batched_index] [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank34]: print(f"Failed to fetch sample {i}. Exception:", e) [rank34]: File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 364, in __getitem__ [rank34]: print(f"Failed to fetch sample {i}. 
Exception:", e) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 90, in trace_dispatch [rank34]: return self.dispatch_line(frame) [rank34]: File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/bdb.py", line 115, in dispatch_line [rank34]: if self.quitting: raise BdbQuit [rank34]: bdb.BdbQuit Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_665574.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_665574.png'}, {'type': 'text', 'text': '\nClick on \'the image of "256 Pc BB Mopar 383-440 Stainless Allen Bolt Kit"\''}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 
3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ef9a8764d10> Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_042557_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240823_042557_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Castle Defender Saga'"}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fe42b0d1440> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f302714a7f0> Load image from ceph: langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_14/C4web50k-0_3927143-split-2.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://st2pj/20250222/images/multi_modal_2024/gui_data/ui_data/GUICourse/guienv/chunk_14/C4web50k-0_3927143-split-2.png'}, {'type': 'text', 'text': '\nFind the 1063 Fort St, Victoria, BC within this image and provide its bounding rectangle. 
Please click on it.'}]}] Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f18a78231f0> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_296982.png'}, {'type': 'text', 'text': '\nClick on \'the image of "Intellifluence Herd Worth Value: $55", on the right side of the image\''}]}] > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) (Pdb) (Pdb) Load image from ceph: langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_143425_before_screenshot.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://wuzhenyu/data/OS-Atlas/desktop_domain/windows_images20240822_143425_before_screenshot.png'}, {'type': 'text', 'text': "\nClick on 'Cancel'"}]}] Traceback (most recent call last): Traceback (most recent call last): kspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f607bc83b00> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. 
Exception:", e) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 408, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 706, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 515, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f18a7823470> > /mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py(364)__getitem__() -> print(f"Failed to fetch sample {i}. Exception:", e) (Pdb) Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_123318.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_123318.png'}, {'type': 'text', 'text': '\nClick on \'The clickable text element located in the top right section of the webpage, positioned between two other links labeled "S.S.S." 
and "İletişim".\''}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_807993.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_hybrid_773k_max_25qa_filtered_807993.png'}, {'type': 'text', 'text': "\nClick on 'twitter'"}]}] Load image from ceph: langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_123422.png conv: [{'role': 'user', 'content': [{'type': 'image', 'image': 'langchao2:s3://dingzichen/data/uground_web_processing/screenshotsweb_direct_150k_description_filtered_123422.png'}, {'type': 'text', 'text': '\nClick on \'The navigation menu item labeled "BLOG" between "OPD SERVICES" and "CONTACT US" on the horizontal navigation bar.\''}]}] W0307 23:07:21.088000 18460 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] W0307 23:07:21.088000 18460 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] ***************************************** W0307 23:07:21.088000 18460 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] Setting OMP_NUM_THREADS environment variable for each process to be 1 in default, to avoid your system being overloaded, please further tune the variable for optimal performance in your application as needed. 
W0307 23:07:21.088000 18460 /mnt/hwfile/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/distributed/run.py:793] *****************************************
***
[2025-03-07 23:07:45,452] [INFO] [real_accelerator.py:219:get_accelerator] Setting ds_accelerator to cuda (auto detect)
[2025-03-07 23:07:47,939] [INFO] [comm.py:652:init_distributed] cdb=None
[2025-03-07 23:07:48,541] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64
You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`.
[2025-03-07 23:08:23,736] [INFO] [comm.py:683:init_distributed] Initializing TorchBackend in DeepSpeed with backend nccl
[2025-03-07 23:08:25,065] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64 You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`. You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`. [2025-03-07 23:08:25,072] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64 You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`. [2025-03-07 23:08:25,125] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64 [2025-03-07 23:08:25,126] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64 [2025-03-07 23:08:25,127] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64 You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`. You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`. You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`. [2025-03-07 23:08:25,137] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64 [2025-03-07 23:08:25,137] [INFO] [config.py:733:__init__] Config mesh_device None world_size = 64 You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. Make sure to move the model to GPU after initializing it on CPU with `model.to('cuda')`. You are attempting to use Flash Attention 2.0 with a model not initialized on GPU. 
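The warning above is emitted once per rank when a model is built on CPU with Flash Attention 2 enabled. A minimal sketch of the usual workaround, assuming a Transformers-style `from_pretrained` call; the helper name `load_kwargs` is hypothetical and not part of the training script in this log:

```python
def load_kwargs(cuda_available: bool) -> dict:
    """Build `from_pretrained` keyword arguments: only request the
    flash_attention_2 kernels when a CUDA device is actually available,
    since they cannot run on CPU. Hypothetical helper for illustration."""
    kwargs = {"torch_dtype": "bfloat16"}
    if cuda_available:
        kwargs["attn_implementation"] = "flash_attention_2"
    return kwargs

# Sketch of use (model name is an assumption, not from this log):
# model = AutoModelForCausalLM.from_pretrained("some/model", **load_kwargs(True))
# model = model.to("cuda")  # move to GPU after CPU init, as the warning suggests
```

Under DeepSpeed ZeRO-3 (as in this run) the engine places parameters itself, so the warning is commonly benign; the explicit `.to("cuda")` applies to plain single-process loading.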
`Qwen2VLRotaryEmbedding` can now be fully parameterized by passing the model config through the `config` argument. All other arguments will be removed in v4.46
[2025-03-07 23:08:27,108] [INFO] [partition_parameters.py:348:__exit__] finished initializing model - num_params = 730, num_elems = 8.29B
Loading checkpoint shards:   0%|          | 0/5 [00:00
Training args: TrainingArguments(
_n_gpu=1,
accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False},
adafactor=False,
adam_beta1=0.9,
adam_beta2=0.999,
adam_epsilon=1e-08,
attn_implementation=flash_attention_2,
auto_find_batch_size=False,
average_tokens_across_devices=False,
batch_eval_metrics=False,
bf16=True,
bf16_full_eval=False,
cache_dir=None,
data_seed=None,
dataloader_drop_last=False,
dataloader_num_workers=8,
dataloader_persistent_workers=False,
dataloader_pin_memory=True,
dataloader_prefetch_factor=None,
ddp_backend=None,
ddp_broadcast_buffers=None,
ddp_bucket_cap_mb=None,
ddp_find_unused_parameters=None,
ddp_timeout=1800,
debug=[],
deepspeed=./scripts/zero3.json,
disable_tqdm=False,
dispatch_batches=None,
do_eval=False,
do_predict=False,
do_train=False,
eval_accumulation_steps=None,
eval_delay=0,
eval_do_concat_batches=True,
eval_on_start=False,
eval_steps=None,
eval_strategy=no,
eval_use_gather_object=False,
evaluation_strategy=None,
fp16=False,
fp16_backend=auto,
fp16_full_eval=False,
fp16_opt_level=O1,
freeze_visual_encoder=True,
fsdp=[],
fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False},
fsdp_min_num_params=0,
fsdp_transformer_layer_cls_to_wrap=None,
full_determinism=False,
gradient_accumulation_steps=1,
gradient_checkpointing=True,
gradient_checkpointing_kwargs=None,
greater_is_better=None,
group_by_length=False,
group_by_modality_length=True,
half_precision_backend=auto,
hub_always_push=False,
hub_model_id=None,
hub_private_repo=None,
hub_strategy=every_save,
hub_token=,
ignore_data_skip=False,
include_for_metrics=[],
include_inputs_for_metrics=False,
include_num_input_tokens_seen=False,
include_tokens_per_second=False,
jit_mode_eval=False,
label_names=None,
label_smoothing_factor=0.0,
learning_rate=1e-05,
length_column_name=length,
load_best_model_at_end=False,
local_rank=7,
log_level=passive,
log_level_replica=warning,
log_on_each_node=True,
logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_23-08-23_HOST-10-140-60-56,
logging_first_step=False,
logging_nan_inf_filter=True,
logging_steps=1.0,
logging_strategy=steps,
lr_scheduler_kwargs={},
lr_scheduler_type=cosine,
max_grad_norm=1.0,
max_steps=-1,
metric_for_best_model=None,
model_max_length=8192,
mp_parameters=,
neftune_noise_alpha=None,
no_cuda=False,
num_train_epochs=1.0,
optim=adamw_torch,
optim_args=None,
optim_target_modules=None,
output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
overwrite_output_dir=False,
past_index=-1,
per_device_eval_batch_size=4,
per_device_train_batch_size=2,
prediction_loss_only=False,
push_to_hub=False,
push_to_hub_model_id=None,
push_to_hub_organization=None,
push_to_hub_token=,
ray_scope=last,
remove_unused_columns=True,
report_to=[],
restore_callback_states_from_checkpoint=False,
resume_from_checkpoint=None,
run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/,
save_on_each_node=False,
save_only_model=False,
save_safetensors=True,
save_steps=2000,
save_strategy=steps,
save_total_limit=3,
seed=42,
skip_memory_metrics=True,
split_batches=None,
tf32=True,
torch_compile=False,
torch_compile_backend=None,
torch_compile_mode=None,
torch_empty_cache_steps=None,
torchdynamo=None,
tpu_metrics_debug=False,
tpu_num_cores=None,
use_cpu=False,
use_ipex=False,
use_legacy_prediction_loop=False,
use_liger_kernel=False,
use_mps_device=False,
verbose_logging=False,
warmup_ratio=0.03,
warmup_steps=0,
weight_decay=0.0,
)
attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, den=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=3, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_23-08-23_HOST-10-140-60-95, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, 
tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, 
hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=2, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_23-08-23_HOST-10-140-60-95, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training 
args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, 
load_best_model_at_end=False, local_rank=1, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_23-07-47_HOST-10-140-60-95, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_rain=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=7, log_level=passive, log_level_replica=warning, log_on_each_node=True, 
logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_23-08-22_HOST-10-140-60-102, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, 
bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=1, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_23-08-22_HOST-10-140-60-102, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, 
neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, 
disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokenss_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=2, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_23-07-47_HOST-10-140-60-101, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_si_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, 
length_column_name=length, load_best_model_at_end=False, local_rank=2, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_23-08-11_HOST-10-140-60-102, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, )Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, 
attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=False, jit_mode_eval=False, label_names=None, label_smoothing_factor=0.0, learning_rate=1e-05, length_column_name=length, load_best_model_at_end=False, local_rank=4, log_level=passive, log_level_replica=warning, log_on_each_node=True, logging_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/runs/Mar07_23-08-22_HOST-10-140-60-102, logging_first_step=False, logging_nan_inf_filter=True, logging_steps=1.0, logging_strategy=steps, lr_scheduler_kwargs={}, 
lr_scheduler_type=cosine, max_grad_norm=1.0, max_steps=-1, metric_for_best_model=None, model_max_length=8192, mp_parameters=, neftune_noise_alpha=None, no_cuda=False, num_train_epochs=1.0, optim=adamw_torch, optim_args=None, optim_target_modules=None, output_dir=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, overwrite_output_dir=False, past_index=-1, per_device_eval_batch_size=4, per_device_train_batch_size=2, prediction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) Training args: TrainingArguments( _n_gpu=1, accelerator_config={'split_batches': False, 'dispatch_batches': None, 'even_batches': True, 'use_seedable_sampler': True, 'non_blocking': False, 'gradient_accumulation_kwargs': None, 'use_configured_state': False}, adafactor=False, adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-08, attn_implementation=flash_attention_2, auto_find_batch_size=False, average_tokens_across_devices=False, batch_eval_metrics=False, bf16=True, bf16_full_eval=False, cache_dir=None, data_seed=None, dataloader_drop_last=False, dataloader_num_workers=8, dataloader_persistent_workers=False, dataloader_pin_memory=True, dataloader_prefetch_factor=None, ddp_backend=None, 
ddp_broadcast_buffers=None, ddp_bucket_cap_mb=None, ddp_find_unused_parameters=None, ddp_timeout=1800, debug=[], deepspeed=./scripts/zero3.json, disable_tqdm=False, dispatch_batches=None, do_eval=False, do_predict=False, do_train=False, eval_accumulation_steps=None, eval_delay=0, eval_do_concat_batches=True, eval_on_start=False, eval_steps=None, eval_strategy=no, eval_use_gather_object=False, evaluation_strategy=None, fp16=False, fp16_backend=auto, fp16_full_eval=False, fp16_opt_level=O1, freeze_visual_encoder=True, fsdp=[], fsdp_config={'min_num_params': 0, 'xla': False, 'xla_fsdp_v2': False, 'xla_fsdp_grad_ckpt': False}, fsdp_min_num_params=0, fsdp_transformer_layer_cls_to_wrap=None, full_determinism=False, gradient_accumulation_steps=1, gradient_checkpointing=True, gradient_checkpointing_kwargs=None, greater_is_better=None, group_by_length=False, group_by_modality_length=True, half_precision_backend=auto, hub_always_push=False, hub_model_id=None, hub_private_repo=None, hub_strategy=every_save, hub_token=, ignore_data_skip=False, include_for_metrics=[], include_inputs_for_metrics=False, include_num_input_tokens_seen=False, include_tokens_per_second=Faction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, 
warmup_steps=0, weight_decay=0.0, ) ction_loss_only=False, push_to_hub=False, push_to_hub_model_id=None, push_to_hub_organization=None, push_to_hub_token=, ray_scope=last, remove_unused_columns=True, report_to=[], restore_callback_states_from_checkpoint=False, resume_from_checkpoint=None, run_name=work_dirs/aguvis_torchrun_64gpus_9//checkpoints/, save_on_each_node=False, save_only_model=False, save_safetensors=True, save_steps=2000, save_strategy=steps, save_total_limit=3, seed=42, skip_memory_metrics=True, split_batches=None, tf32=True, torch_compile=False, torch_compile_backend=None, torch_compile_mode=None, torch_empty_cache_steps=None, torchdynamo=None, tpu_metrics_debug=False, tpu_num_cores=None, use_cpu=False, use_ipex=False, use_legacy_prediction_loop=False, use_liger_kernel=False, use_mps_device=False, verbose_logging=False, warmup_ratio=0.03, warmup_steps=0, weight_decay=0.0, ) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) [TCSLoader] config_path: ~/petreloss.conf[TCSLoader] config_path: ~/pet --> before Client(conf_path) --> before[TCSLoader] config_ --> before Client(con[TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) f_path) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) --> before Client(conf_path) --> before Client(conf_path) reloss.conf --> before Client(conf_path) --> before Client(conf_path) --> before Client(conf_path) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) [TCSLoader] config_path: ~/petreloss.conf --> before Client(conf_path) 
--> after Client(conf_path)
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/guienv.json with all sampling strategy
Rank 0: Loaded 327972 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/guienv.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/omniact_fix.json with all sampling strategy
Rank 0: Loaded 6720 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/omniact_fix.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricoig16k.json with all sampling strategy
Rank 0: Loaded 16133 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricoig16k.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricosca.json with all sampling strategy
Rank 0: Loaded 173212 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ricosca.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/seeclick.json with all sampling strategy
Rank 0: Loaded 271121 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/seeclick.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ui_refexp.json with all sampling strategy
Rank 0: Loaded 15624 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/ui_refexp.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/webui350k.json with all sampling strategy
Rank 0: Loaded 57389 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/webui350k.json
Rank 0: Loading /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/widget_captioning.json with all sampling strategy
Rank 0: Loaded 101426 samples from /mnt/petrelfs/share_data/liuzhaoyang/gui/gui_annos/aguvis/stage1/widget_captioning.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_150k_description_filtered.json with all sampling strategy
Rank 0: Loaded 148416 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_150k_description_filtered.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_258k_function_filtered.json with all sampling strategy
Rank 0: Loaded 248264 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_direct_258k_function_filtered.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_hybrid_773k_max_25qa_filtered_new.json with all sampling strategy
Rank 0: Loaded 1243857 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/Uground/web_hybrid_773k_max_25qa_filtered_new.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_windows_splited.json with all sampling strategy
Rank 0: Loaded 1075344 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_windows_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_linux_splited.json with all sampling strategy
Rank 0: Loaded 43156 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_linux_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_macos_splited.json with all sampling strategy
Rank 0: Loaded 18399 samples from /mnt/petrelfs/share_data/wuzhenyu/grounding_data/OS-Atlas/windows_desktop/processed_macos_splited.json
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/uibert_train_ground_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 4646 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/uibert_train_ground_d240430_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_taperception_grounding_d240815_v2.jsonl with all sampling strategy
Rank 0: Loaded 2500 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_taperception_grounding_d240815_v2.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_widget_grounding_d240815_v2.jsonl with all sampling strategy
Rank 0: Loaded 14878 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_widget_grounding_d240815_v2.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_mug_grounding_d240812.jsonl with all sampling strategy
Rank 0: Loaded 26090 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/openapp_mug_grounding_d240812.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_phone_2403_ground_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 24798 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_phone_2403_ground_d240430_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2405_ground_d240521_v1.jsonl with all sampling strategy
Rank 0: Loaded 5008 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2405_ground_d240521_v1.jsonl
Rank 0: Loading /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2406_ground_d240612_v1.jsonl with all sampling strategy
Rank 0: Loaded 7903 samples from /mnt/petrelfs/share_data/suweijie/data/gui_data_grounding/private_ui_aig_share_2406_ground_d240612_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/amex_grounding_d240813_v1.jsonl with all sampling strategy
Rank 0: Loaded 102007 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/amex_grounding_d240813_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_1_d240815_v3.jsonl with all sampling strategy
Rank 0: Loaded 63581 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_1_d240815_v3.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_2_d240815_v3.jsonl with all sampling strategy
Rank 0: Loaded 6852 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/guicourse_guienv_text_grounding_2_d240815_v3.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_schedual_extract_20240520_v2_r464_reprompt_d240607.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 928 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_schedual_extract_20240520_v2_r464_reprompt_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_app_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 2488 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_app_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_os_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 1242 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_os_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_web_d20240822_v1.jsonl with repeat:2 sampling strategy
Rank 0: Loaded 2360 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui2json_web_d20240822_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_element_recognition_d240605_v1_correct_d240607.jsonl with all sampling strategy
Rank 0: Loaded 3791 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_element_recognition_d240605_v1_correct_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_marker_recognition_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5179 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_marker_recognition_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_ocr_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5090 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_ocr_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_operation_oral_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5070 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_operation_oral_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_visual_prompt_with_bbox_d240605_v1.jsonl with all sampling strategy
Rank 0: Loaded 5248 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2405_visual_prompt_with_bbox_d240605_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2408_region_caption_d240903_v1.jsonl with all sampling strategy
Rank 0: Loaded 5854 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_2408_region_caption_d240903_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_detection_d20240418_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_detection_d20240418_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_element_recognition_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_element_recognition_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_ground_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240416_v7_ground_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_detection_d20240418_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_detection_d20240418_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_element_recognition_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_element_recognition_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_ground_d20240416_v1.jsonl with all sampling strategy
Rank 0: Loaded 2000 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_tablet_20240416_v7_ground_d20240416_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_element_recognition_d240605_v1_correct_d240607.jsonl with all sampling strategy
Rank 0: Loaded 24620 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_element_recognition_d240605_v1_correct_d240607.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d20240604_v2.jsonl with all sampling strategy
Rank 0: Loaded 17196 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d20240604_v2.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 5998 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_long_caption_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ocr_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 31276 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ocr_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_short_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 27880 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_short_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_with_bbox_d240430_v1.jsonl with all sampling strategy
Rank 0: Loaded 62401 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screen_qa_with_bbox_d240430_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screenai_layout_20240604_v1.jsonl with all sampling strategy
Rank 0: Loaded 22076 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/screenai_layout_20240604_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_grounding_d240924_v1.jsonl with all sampling strategy
Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_grounding_d240924_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_oral_operation_d240924_v1.jsonl with all sampling strategy
Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_oral_operation_d240924_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_region_caption_d240924_v1.jsonl with all sampling strategy
Rank 0: Loaded 1405 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_aig_share_0815_logo_region_caption_d240924_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ui_operation_oral_wbox_d241023_v2.jsonl with all sampling strategy
Rank 0: Loaded 20293 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_2403_ui_operation_oral_wbox_d241023_v2.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_comment_20240606_json_d20241023_v2.jsonl with all sampling strategy
Rank 0: Loaded 1055 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_phone_comment_20240606_json_d20241023_v2.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_json_d241126.jsonl with repeat:3 sampling strategy
Rank 0: Loaded 6837 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_json_d241126.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_xml_d241126.jsonl with repeat:3 sampling strategy
Rank 0: Loaded 6873 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_internal_aig_xml_d241126.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/OS_Altas_androidworld_grounding_d241120_v1.jsonl with all sampling strategy
Rank 0: Loaded 89860 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/OS_Altas_androidworld_grounding_d241120_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_long_caption_20240604_v1.jsonl with repeat:4 sampling strategy
Rank 0: Loaded 3156 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_aig_share_long_caption_20240604_v1.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/grounding_new.jsonl with all sampling strategy
Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/grounding_new.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/regioncaption_new.jsonl with all sampling strategy
Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/regioncaption_new.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/oral_operation_new.jsonl with all sampling strategy
Rank 0: Loaded 863 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/oral_operation_new.jsonl
Rank 0: Loading langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240812_grounding_dataset_20240812_v1_r6600.jsonl with all sampling strategy
Rank 0: Loaded 6600 samples from langchao2:s3://gui/new_annotations/st_data/20250222/annotations/private_ui_homescreen_phone_20240812_grounding_dataset_20240812_v1_r6600.jsonl
Rank 0: Loaded 4387471 samples from data/stage1_20250307_1.yaml
Rank 0: Formatting inputs...Skip in lazy mode
Detected kernel version 3.10.0, which is below the recommended minimum of 5.5.0; this can cause the process to hang. It is recommended to upgrade the kernel to the minimum version or higher.
Parameter Offload: Total persistent parameters: 877056 in 401 params
  0%|          | 0/34278 [00:00<?, ?it/s]
Token indices sequence length is longer than the specified maximum sequence length for this model (> 8192). Running this sequence through the model will result in indexing errors
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ocess just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISM=(true | false) =(true | false) - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. 
Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISM=(true | false) =(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocTOKENIZERS_PARALLELISM=(true | false) ent process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: To disable this warning, you can either: To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISM=(true | false) =(true | false) =(true | false) rallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISM=(true | false) =(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ocess just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. 
Token indices sequence length is longer than the specified maximum sequence length for this model (8940 > 8192). Running this sequence through the model will result in indexing errors
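The overflow warning above (8940 tokens against an 8192-token maximum) can be avoided by clamping sequences to the model's context length before they reach the model. A minimal, library-free sketch of that clamping step (the function name and the 8192 limit are taken from the log, not from any specific codebase):

```python
def truncate_ids(token_ids, max_length=8192):
    """Clamp a token-id sequence to the model's maximum context length."""
    return token_ids[:max_length]

# Stand-in for the over-long sequence reported in the log (8940 tokens).
ids = list(range(8940))
truncated = truncate_ids(ids)
```

In practice, Hugging Face tokenizers can do this directly via `truncation=True` and `max_length=8192` when encoding, which avoids the indexing errors the warning describes.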
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable plicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ocess just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. 
Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISM=(true | false) =(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) ocess just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) =(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. 
Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable - Avoid using `tokenizers` before thehuggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISM=(true | false) =(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... To disable this warning, you can either: To disable this warning, you can either: - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable - Avoid using `tokenizers` before the fork if possible - Explicitly set the environment variable TOKENIZERS_PARALLELISMTOKENIZERS_PARALLELISM=(true | false) =(true | false) huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks... 
huggingface/tokenizers: The current process just got forked, after parallelism has already been used. Disabling parallelism to avoid deadlocks...
To disable this warning, you can either:
	- Avoid using `tokenizers` before the fork if possible
	- Explicitly set the environment variable TOKENIZERS_PARALLELISM=(true | false)
Token indices sequence length is longer than the specified maximum sequence length for this model (11653 > 8192).
Running this sequence through the model will result in indexing errors
Token indices sequence length is longer than the specified maximum sequence length for this model (9186 > 8192). Running this sequence through the model will result in indexing errors
Token indices sequence length is longer than the specified maximum sequence length for this model (12518 > 8192). Running this sequence through the model will result in indexing errors
Token indices sequence length is longer than the specified maximum sequence length for this model (8630 > 8192). Running this sequence through the model will result in indexing errors
Token indices sequence length is longer than the specified maximum sequence length for this model (10090 > 8192). Running this sequence through the model will result in indexing errors
Token indices sequence length is longer than the specified maximum sequence length for this model (14289 > 8192). Running this sequence through the model will result in indexing errors
Token indices sequence length is longer than the specified maximum sequence length for this model (10994 > 8192). Running this sequence through the model will result in indexing errors
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
  0%|          | 1/34278 [01:45<1002:19:30, 105.27s/it] {'loss': 3.307, 'grad_norm': 54.045951829468514, 'learning_rate': 9.718172983479106e-09, 'epoch': 0.0}
  0%|          | 2/34278 [01:50<441:26:49, 46.37s/it] {'loss': 3.4011, 'grad_norm': 59.45590508211042, 'learning_rate': 1.943634596695821e-08, 'epoch': 0.0}
  0%|          | 3/34278 [01:56<267:48:43, 28.13s/it] {'loss': 3.3368, 'grad_norm': 56.232314342880564, 'learning_rate': 2.915451895043732e-08, 'epoch': 0.0}
  0%|          | 4/34278 [02:00<176:20:56, 18.52s/it] {'loss': 3.349, 'grad_norm': 57.02148351734131, 'learning_rate': 3.887269193391642e-08, 'epoch': 0.0}
  0%|          | 5/34278 [02:04<126:47:23, 13.32s/it] {'loss': 3.2952, 'grad_norm': 55.620710576506895, 'learning_rate': 4.8590864917395535e-08, 'epoch': 0.0}
  0%|          | 6/34278 [02:08<96:21:16, 10.12s/it] {'loss': 3.4352, 'grad_norm': 60.152501431945254, 'learning_rate': 5.830903790087464e-08, 'epoch': 0.0}
  0%|          | 7/34278 [02:12<77:39:10, 8.16s/it] {'loss': 3.493, 'grad_norm': 63.889700682468856, 'learning_rate': 6.802721088435375e-08, 'epoch': 0.0}
  0%|          | 8/34278 [02:19<72:24:25, 7.61s/it] {'loss': 3.2058, 'grad_norm': 50.90441079107094, 'learning_rate': 7.774538386783285e-08, 'epoch': 0.0}
  0%|          | 9/34278 [02:25<69:23:38, 7.29s/it] {'loss': 3.293, 'grad_norm': 56.09070079855593, 'learning_rate': 8.746355685131196e-08, 'epoch': 0.0}
  0%|          | 10/34278 [02:29<58:55:08, 6.19s/it] {'loss': 3.46, 'grad_norm': 63.61971462398412, 'learning_rate': 9.718172983479107e-08, 'epoch': 0.0}
  0%|          | 11/34278 [02:33<53:38:39, 5.64s/it] {'loss': 3.4424, 'grad_norm': 60.29565511961232, 'learning_rate': 1.0689990281827017e-07, 'epoch': 0.0}
  0%|          | 12/34278 [02:37<49:10:55, 5.17s/it] {'loss': 3.22, 'grad_norm': 52.44151203167023, 'learning_rate': 1.1661807580174928e-07, 'epoch': 0.0}
  0%|          | 13/34278 [02:41<44:06:36, 4.63s/it] {'loss': 3.4233, 'grad_norm': 58.19796349152196, 'learning_rate': 1.263362487852284e-07, 'epoch': 0.0}
  0%|          | 14/34278 [02:45<43:18:30, 4.55s/it] {'loss': 3.4018, 'grad_norm': 61.45277974737471, 'learning_rate': 1.360544217687075e-07, 'epoch': 0.0}
  0%|          | 15/34278 [02:50<45:13:18, 4.75s/it] {'loss': 3.206, 'grad_norm': 51.96697093650594, 'learning_rate': 1.457725947521866e-07, 'epoch': 0.0}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 16/34278 [02:56<47:22:25, 4.98s/it] {'loss': 3.2888, 'grad_norm': 53.54038814993851, 'learning_rate': 1.554907677356657e-07, 'epoch': 0.0} 0%| | 16/34278 [02:56<47:22:25, 4.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 0%| | 17/34278 [03:00<45:56:48, 4.83s/it] {'loss': 3.3793, 'grad_norm': 57.73223559145958, 'learning_rate': 1.6520894071914482e-07, 'epoch': 0.0} 0%| | 17/34278 [03:00<45:56:48, 4.83s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 18/34278 [03:07<51:24:01, 5.40s/it] {'loss': 3.3604, 'grad_norm': 60.66841021414202, 'learning_rate': 1.7492711370262392e-07, 'epoch': 0.0} 0%| | 18/34278 [03:07<51:24:01, 5.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 19/34278 [03:11<47:33:31, 5.00s/it] {'loss': 3.149, 'grad_norm': 48.39521870204028, 'learning_rate': 1.8464528668610302e-07, 'epoch': 0.0} 0%| | 19/34278 [03:11<47:33:31, 5.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 20/34278 [03:16<46:30:05, 4.89s/it] {'loss': 3.3099, 'grad_norm': 55.805932545996086, 'learning_rate': 1.9436345966958214e-07, 'epoch': 0.0} 0%| | 20/34278 [03:16<46:30:05, 4.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 21/34278 [03:24<55:48:56, 5.87s/it] {'loss': 3.3476, 'grad_norm': 55.822401672096156, 'learning_rate': 2.0408163265306121e-07, 'epoch': 0.0} 0%| | 21/34278 [03:24<55:48:56, 5.87s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 22/34278 [03:31<59:20:48, 6.24s/it] {'loss': 3.1326, 'grad_norm': 48.00227667036085, 'learning_rate': 2.1379980563654034e-07, 'epoch': 0.0} 0%| | 22/34278 [03:31<59:20:48, 6.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 0%| | 23/34278 [03:35<52:19:12, 5.50s/it] {'loss': 3.0579, 'grad_norm': 46.23671337498092, 'learning_rate': 2.2351797862001946e-07, 'epoch': 0.0} 0%| | 23/34278 [03:35<52:19:12, 5.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 24/34278 [03:39<48:26:07, 5.09s/it] {'loss': 3.1714, 'grad_norm': 49.215337803463065, 'learning_rate': 2.3323615160349856e-07, 'epoch': 0.0} 0%| | 24/34278 [03:39<48:26:07, 5.09s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8385 > 8192). Running this sequence through the model will result in indexing errors 0%| | 25/34278 [03:43<44:29:34, 4.68s/it] {'loss': 3.2463, 'grad_norm': 50.0964962852148, 'learning_rate': 2.429543245869777e-07, 'epoch': 0.0} 0%| | 25/34278 [03:43<44:29:34, 4.68s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 26/34278 [03:49<48:45:28, 5.12s/it] {'loss': 2.8898, 'grad_norm': 40.48331009532296, 'learning_rate': 2.526724975704568e-07, 'epoch': 0.0} 0%| | 26/34278 [03:49<48:45:28, 5.12s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 27/34278 [03:52<43:53:10, 4.61s/it] {'loss': 3.0225, 'grad_norm': 43.58796335693116, 'learning_rate': 2.623906705539359e-07, 'epoch': 0.0} 0%| | 27/34278 [03:52<43:53:10, 4.61s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (9554 > 8192). Running this sequence through the model will result in indexing errors 0%| | 28/34278 [03:56<41:51:35, 4.40s/it] {'loss': 3.2008, 'grad_norm': 50.27078559700237, 'learning_rate': 2.72108843537415e-07, 'epoch': 0.0} 0%| | 28/34278 [03:56<41:51:35, 4.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 29/34278 [04:03<47:43:01, 5.02s/it] {'loss': 2.9526, 'grad_norm': 42.62875679870639, 'learning_rate': 2.818270165208941e-07, 'epoch': 0.0} 0%| | 29/34278 [04:03<47:43:01, 5.02s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 0%| | 30/34278 [04:08<48:59:33, 5.15s/it] {'loss': 2.7081, 'grad_norm': 36.538631056416364, 'learning_rate': 2.915451895043732e-07, 'epoch': 0.0} 0%| | 30/34278 [04:08<48:59:33, 5.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 31/34278 [04:11<43:25:50, 4.57s/it] {'loss': 2.7338, 'grad_norm': 37.95410524150656, 'learning_rate': 3.0126336248785234e-07, 'epoch': 0.0} 0%| | 31/34278 [04:11<43:25:50, 4.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 32/34278 [04:15<40:47:12, 4.29s/it] {'loss': 2.7104, 'grad_norm': 38.13933215569488, 'learning_rate': 3.109815354713314e-07, 'epoch': 0.0} 0%| | 32/34278 [04:15<40:47:12, 4.29s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 33/34278 [04:19<39:15:21, 4.13s/it] {'loss': 2.6817, 'grad_norm': 35.171458371816755, 'learning_rate': 3.2069970845481054e-07, 'epoch': 0.0} 0%| | 33/34278 [04:19<39:15:21, 4.13s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 34/34278 [04:23<39:14:04, 4.12s/it] {'loss': 2.652, 'grad_norm': 34.74362617495035, 'learning_rate': 3.3041788143828963e-07, 'epoch': 0.0} 0%| | 34/34278 [04:23<39:14:04, 4.12s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 0%| | 35/34278 [04:28<41:55:29, 4.41s/it] {'loss': 2.6788, 'grad_norm': 35.15349620528716, 'learning_rate': 3.401360544217688e-07, 'epoch': 0.0} 0%| | 35/34278 [04:28<41:55:29, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (10503 > 8192). Running this sequence through the model will result in indexing errors 0%| | 36/34278 [04:32<40:18:32, 4.24s/it] {'loss': 2.5883, 'grad_norm': 34.67235666928212, 'learning_rate': 3.4985422740524783e-07, 'epoch': 0.0} 0%| | 36/34278 [04:32<40:18:32, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 0%| | 37/34278 [04:36<39:26:10, 4.15s/it] {'loss': 2.5411, 'grad_norm': 32.38172424222036, 'learning_rate': 3.5957240038872693e-07, 'epoch': 0.0} 0%| | 37/34278 [04:36<39:26:10, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 38/34278 [04:39<37:34:10, 3.95s/it] {'loss': 2.573, 'grad_norm': 33.1882021752588, 'learning_rate': 3.6929057337220603e-07, 'epoch': 0.0} 0%| | 38/34278 [04:39<37:34:10, 3.95s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 39/34278 [04:43<36:26:54, 3.83s/it] {'loss': 2.5205, 'grad_norm': 32.09588599180164, 'learning_rate': 3.790087463556852e-07, 'epoch': 0.0} 0%| | 39/34278 [04:43<36:26:54, 3.83s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 40/34278 [04:47<37:58:00, 3.99s/it] {'loss': 2.3672, 'grad_norm': 30.64571408525671, 'learning_rate': 3.887269193391643e-07, 'epoch': 0.0} 0%| | 40/34278 [04:47<37:58:00, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (10763 > 8192). 
Running this sequence through the model will result in indexing errors 0%| | 41/34278 [04:51<36:47:33, 3.87s/it] {'loss': 2.2191, 'grad_norm': 27.741409245103288, 'learning_rate': 3.984450923226434e-07, 'epoch': 0.0} 0%| | 41/34278 [04:51<36:47:33, 3.87s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 42/34278 [04:55<38:23:24, 4.04s/it] {'loss': 2.1296, 'grad_norm': 28.002704456507587, 'learning_rate': 4.0816326530612243e-07, 'epoch': 0.0} 0%| | 42/34278 [04:55<38:23:24, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 43/34278 [05:02<45:13:31, 4.76s/it] {'loss': 2.0846, 'grad_norm': 27.01102683162873, 'learning_rate': 4.1788143828960163e-07, 'epoch': 0.0} 0%| | 43/34278 [05:02<45:13:31, 4.76s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 44/34278 [05:06<43:42:44, 4.60s/it] {'loss': 2.0225, 'grad_norm': 27.30622428117336, 'learning_rate': 4.275996112730807e-07, 'epoch': 0.0} 0%| | 44/34278 [05:06<43:42:44, 4.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8940 > 8192). Running this sequence through the model will result in indexing errors 0%| | 45/34278 [05:10<41:41:49, 4.38s/it] {'loss': 1.9974, 'grad_norm': 30.44286494588882, 'learning_rate': 4.373177842565598e-07, 'epoch': 0.0} 0%| | 45/34278 [05:10<41:41:49, 4.38s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
  0%| | 46/34278 [05:13<37:52:03, 3.98s/it] {'loss': 1.9598, 'grad_norm': 41.43761884824672, 'learning_rate': 4.4703595724003893e-07, 'epoch': 0.0}
  0%| | 47/34278 [05:17<38:13:55, 4.02s/it] {'loss': 1.8443, 'grad_norm': 83.41599647174692, 'learning_rate': 4.5675413022351803e-07, 'epoch': 0.0}
  0%| | 48/34278 [05:20<37:04:13, 3.90s/it] {'loss': 1.8672, 'grad_norm': 109.62322489197544, 'learning_rate': 4.6647230320699713e-07, 'epoch': 0.0}
  0%| | 49/34278 [05:27<44:34:02, 4.69s/it] {'loss': 1.7683, 'grad_norm': 112.88366758582197, 'learning_rate': 4.7619047619047623e-07, 'epoch': 0.0}
  0%| | 50/34278 [05:31<42:38:19, 4.48s/it] {'loss': 1.7578, 'grad_norm': 96.52880293362055, 'learning_rate': 4.859086491739554e-07, 'epoch': 0.0}
  0%| | 51/34278 [05:36<43:28:48, 4.57s/it] {'loss': 1.6946, 'grad_norm': 51.58173404640064, 'learning_rate': 4.956268221574345e-07, 'epoch': 0.0}
  0%| | 52/34278 [05:40<41:28:49, 4.36s/it] {'loss': 1.624, 'grad_norm': 98.63939811986286, 'learning_rate': 5.053449951409136e-07, 'epoch': 0.0}
  0%| | 53/34278 [05:43<39:13:45, 4.13s/it] {'loss': 1.6404, 'grad_norm': 392.52631989640827, 'learning_rate': 5.150631681243927e-07, 'epoch': 0.0}
  0%| | 54/34278 [05:50<45:38:37, 4.80s/it] {'loss': 1.621, 'grad_norm': 36.83694097250143, 'learning_rate': 5.247813411078718e-07, 'epoch': 0.0}
  0%| | 55/34278 [05:56<51:00:38, 5.37s/it] {'loss': 1.5761, 'grad_norm': 37.944539281225644, 'learning_rate': 5.344995140913509e-07, 'epoch': 0.0}
  0%| | 56/34278 [05:59<44:53:59, 4.72s/it] {'loss': 1.6411, 'grad_norm': 41.05549932657623, 'learning_rate': 5.4421768707483e-07, 'epoch': 0.0}
  0%| | 57/34278 [06:03<42:41:51, 4.49s/it] {'loss': 1.5739, 'grad_norm': 42.30992951788921, 'learning_rate': 5.539358600583091e-07, 'epoch': 0.0}
  0%| | 58/34278 [06:09<45:33:08, 4.79s/it] {'loss': 1.4234, 'grad_norm': 38.06215707218943, 'learning_rate': 5.636540330417882e-07, 'epoch': 0.0}
  0%| | 59/34278 [06:12<42:01:51, 4.42s/it] {'loss': 1.4642, 'grad_norm': 29.2367743854925, 'learning_rate': 5.733722060252673e-07, 'epoch': 0.0}
  0%| | 60/34278 [06:16<39:51:47, 4.19s/it] {'loss': 1.3755, 'grad_norm': 25.18278161139425, 'learning_rate': 5.830903790087464e-07, 'epoch': 0.0}
  0%| | 61/34278 [06:20<39:37:50, 4.17s/it] {'loss': 1.2975, 'grad_norm': 18.831005340291743, 'learning_rate': 5.928085519922256e-07, 'epoch': 0.0}
Token indices sequence length is longer than the specified maximum sequence length for this model (12442 > 8192). Running this sequence through the model will result in indexing errors
  0%| | 62/34278 [06:25<40:24:16, 4.25s/it] {'loss': 1.2641, 'grad_norm': 19.21243668324547, 'learning_rate': 6.025267249757047e-07, 'epoch': 0.0}
  0%| | 63/34278 [06:28<37:59:04, 4.00s/it] {'loss': 1.2559, 'grad_norm': 18.402347207240606, 'learning_rate': 6.122448979591837e-07, 'epoch': 0.0}
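The "Token indices sequence length is longer than the specified maximum sequence length" lines (8940, 12442, 8401, 8237 tokens against the 8192 limit reported here) mean some samples exceed the model's context. A hedged sketch of guarding against this at dataset time; `clip_to_context` and the skip-vs-truncate choice are illustrative, not the actual `dataset.py` code:

```python
MAX_MODEL_LEN = 8192  # the limit reported in the log


def clip_to_context(token_ids, max_len=MAX_MODEL_LEN):
    """Right-truncate a token sequence that exceeds the model context.

    Running an over-length sequence through the model (as the warning
    says) can index past the position embeddings. Truncating is the
    simplest guard; skipping the sample entirely may be preferable when
    the training label sits at the end of the sequence.
    """
    if len(token_ids) <= max_len:
        return token_ids
    return token_ids[:max_len]
```

For example, `clip_to_context(list(range(12442)))` returns the first 8192 ids, while shorter sequences pass through unchanged.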
  0%| | 64/34278 [06:31<35:08:37, 3.70s/it] {'loss': 1.2621, 'grad_norm': 10.38479986233958, 'learning_rate': 6.219630709426628e-07, 'epoch': 0.0}
  0%| | 65/34278 [06:35<34:21:07, 3.61s/it] {'loss': 1.2273, 'grad_norm': 11.240970680762699, 'learning_rate': 6.316812439261419e-07, 'epoch': 0.0}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa49d0708b0>
Failed to fetch sample 3641700. Exception: cannot identify image file <_io.BytesIO object at 0x7fa49d0708b0>
  0%| | 66/34278 [06:39<38:01:57, 4.00s/it] {'loss': 1.1881, 'grad_norm': 9.30095359552287, 'learning_rate': 6.413994169096211e-07, 'epoch': 0.0}
  0%| | 67/34278 [06:43<36:10:15, 3.81s/it] {'loss': 1.201, 'grad_norm': 8.727234288494008, 'learning_rate': 6.511175898931002e-07, 'epoch': 0.0}
  0%| | 68/34278 [06:46<35:12:04, 3.70s/it] {'loss': 1.199, 'grad_norm': 8.448178343565178, 'learning_rate': 6.608357628765793e-07, 'epoch': 0.0}
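The traceback ends with "Failed to fetch sample 3641700." and training continues, i.e. `__getitem__` catches decode failures and substitutes another sample. A minimal self-contained sketch of that retry pattern; the `RetryingDataset` class, its `load_sample` stand-in, and the retry count are assumptions for illustration, not the actual `dataset.py`:

```python
import random


class RetryingDataset:
    """Swap in a random other sample when one fails to decode."""

    def __init__(self, num_samples, corrupted=()):
        self.num_samples = num_samples
        self.corrupted = set(corrupted)  # stand-in for undecodable images

    def load_sample(self, i):
        # Stand-in for the real decode path (PIL.Image.open on bytes read
        # from storage), which raised UnidentifiedImageError in the log.
        if i in self.corrupted:
            raise ValueError(f"cannot identify image file for sample {i}")
        return {"index": i}

    def __len__(self):
        return self.num_samples

    def __getitem__(self, i, max_retries=10):
        for _ in range(max_retries):
            try:
                return self.load_sample(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randrange(self.num_samples)  # try a different sample
        raise RuntimeError("too many consecutive decode failures")
```

Capping the retries keeps a systematically broken shard from spinning forever; the trade-off is that substituted samples slightly perturb the effective sampling distribution.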
  0%| | 69/34278 [06:51<37:03:40, 3.90s/it] {'loss': 1.1633, 'grad_norm': 7.605640693479639, 'learning_rate': 6.705539358600584e-07, 'epoch': 0.0}
  0%| | 70/34278 [06:57<43:51:11, 4.62s/it] {'loss': 1.157, 'grad_norm': 7.763773063041981, 'learning_rate': 6.802721088435376e-07, 'epoch': 0.0}
  0%| | 71/34278 [07:00<40:09:33, 4.23s/it] {'loss': 1.127, 'grad_norm': 7.83234540673427, 'learning_rate': 6.899902818270166e-07, 'epoch': 0.0}
  0%| | 72/34278 [07:04<39:12:32, 4.13s/it] {'loss': 1.1411, 'grad_norm': 7.751860059488426, 'learning_rate': 6.997084548104957e-07, 'epoch': 0.0}
  0%| | 73/34278 [07:08<38:37:39, 4.07s/it] {'loss': 1.1508, 'grad_norm': 7.221011120013558, 'learning_rate': 7.094266277939748e-07, 'epoch': 0.0}
  0%| | 74/34278 [07:12<38:12:18, 4.02s/it] {'loss': 1.1322, 'grad_norm': 7.249742675328467, 'learning_rate': 7.191448007774539e-07, 'epoch': 0.0}
  0%| | 75/34278 [07:16<37:32:39, 3.95s/it] {'loss': 1.1169, 'grad_norm': 7.41932368070449, 'learning_rate': 7.288629737609331e-07, 'epoch': 0.0}
  0%| | 76/34278 [07:20<37:07:55, 3.91s/it] {'loss': 1.1396, 'grad_norm': 6.84055899908655, 'learning_rate': 7.385811467444121e-07, 'epoch': 0.0}
  0%| | 77/34278 [07:23<35:54:55, 3.78s/it] {'loss': 1.1648, 'grad_norm': 7.371084361682207, 'learning_rate': 7.482993197278913e-07, 'epoch': 0.0}
  0%| | 78/34278 [07:27<37:10:03, 3.91s/it] {'loss': 1.0947, 'grad_norm': 7.28952170579296, 'learning_rate': 7.580174927113704e-07, 'epoch': 0.0}
  0%| | 79/34278 [07:32<39:28:39, 4.16s/it] {'loss': 1.0815, 'grad_norm': 6.60461471141226, 'learning_rate': 7.677356656948494e-07, 'epoch': 0.0}
  0%| | 80/34278 [07:35<36:54:37, 3.89s/it] {'loss': 1.0779, 'grad_norm': 6.855555588321384, 'learning_rate': 7.774538386783286e-07, 'epoch': 0.0}
  0%| | 81/34278 [07:39<35:41:23, 3.76s/it] {'loss': 1.0906, 'grad_norm': 7.464498057697863, 'learning_rate': 7.871720116618077e-07, 'epoch': 0.0}
  0%| | 82/34278 [07:42<33:59:59, 3.58s/it] {'loss': 1.0776, 'grad_norm': 6.971944178717365, 'learning_rate': 7.968901846452868e-07, 'epoch': 0.0}
  0%| | 83/34278 [07:48<42:40:34, 4.49s/it] {'loss': 1.0514, 'grad_norm': 6.441529733700574, 'learning_rate': 8.066083576287659e-07, 'epoch': 0.0}
  0%| | 84/34278 [07:52<41:18:52, 4.35s/it] {'loss': 1.0736, 'grad_norm': 6.06148020983748, 'learning_rate': 8.163265306122449e-07, 'epoch': 0.0}
Token indices sequence length is longer than the specified maximum sequence length for this model (8401 > 8192). Running this sequence through the model will result in indexing errors
  0%| | 85/34278 [07:56<39:46:01, 4.19s/it] {'loss': 1.0877, 'grad_norm': 6.444048939489742, 'learning_rate': 8.260447035957241e-07, 'epoch': 0.0}
  0%| | 86/34278 [08:00<39:49:38, 4.19s/it] {'loss': 1.0835, 'grad_norm': 6.181458160460038, 'learning_rate': 8.357628765792033e-07, 'epoch': 0.0}
  0%| | 87/34278 [08:05<40:43:34, 4.29s/it] {'loss': 1.0658, 'grad_norm': 5.922607770428193, 'learning_rate': 8.454810495626823e-07, 'epoch': 0.0}
  0%| | 88/34278 [08:09<41:16:47, 4.35s/it] {'loss': 1.0226, 'grad_norm': 5.542619628627248, 'learning_rate': 8.551992225461614e-07, 'epoch': 0.0}
  0%| | 89/34278 [08:13<38:34:28, 4.06s/it] {'loss': 1.0332, 'grad_norm': 5.747595143803563, 'learning_rate': 8.649173955296406e-07, 'epoch': 0.0}
  0%| | 90/34278 [08:18<40:36:48, 4.28s/it] {'loss': 1.0308, 'grad_norm': 5.402496013260269, 'learning_rate': 8.746355685131196e-07, 'epoch': 0.0}
  0%| | 91/34278 [08:21<39:18:38, 4.14s/it] {'loss': 1.059, 'grad_norm': 5.562058138188112, 'learning_rate': 8.843537414965988e-07, 'epoch': 0.0}
  0%| | 92/34278 [08:26<41:24:50, 4.36s/it] {'loss': 1.0442, 'grad_norm': 5.361866485657315, 'learning_rate': 8.940719144800779e-07, 'epoch': 0.0}
  0%| | 93/34278 [08:30<40:00:14, 4.21s/it] {'loss': 1.0374, 'grad_norm': 5.351336135979188, 'learning_rate': 9.037900874635569e-07, 'epoch': 0.0}
  0%| | 94/34278 [08:35<40:49:40, 4.30s/it] {'loss': 1.0648, 'grad_norm': 4.839716796476288, 'learning_rate': 9.135082604470361e-07, 'epoch': 0.0}
  0%| | 95/34278 [08:38<39:10:42, 4.13s/it] {'loss': 1.059, 'grad_norm': 4.894278673171512, 'learning_rate': 9.23226433430515e-07, 'epoch': 0.0}
  0%| | 96/34278 [08:43<39:50:37, 4.20s/it] {'loss': 1.036, 'grad_norm': 4.840493645110819, 'learning_rate': 9.329446064139943e-07, 'epoch': 0.0}
  0%| | 97/34278 [08:47<40:25:57, 4.26s/it] {'loss': 1.0267, 'grad_norm': 4.641389959250609, 'learning_rate': 9.426627793974734e-07, 'epoch': 0.0}
  0%| | 98/34278 [08:51<37:48:08, 3.98s/it] {'loss': 1.0354, 'grad_norm': 4.781491164120611, 'learning_rate': 9.523809523809525e-07, 'epoch': 0.0}
  0%| | 99/34278 [08:54<37:21:51, 3.94s/it] {'loss': 1.0251, 'grad_norm': 4.583366378736798, 'learning_rate': 9.620991253644317e-07, 'epoch': 0.0}
  0%| | 100/34278 [08:58<35:52:24, 3.78s/it] {'loss': 1.0144, 'grad_norm': 4.825726016243985, 'learning_rate': 9.718172983479108e-07, 'epoch': 0.0}
  0%| | 101/34278 [09:03<41:12:06, 4.34s/it] {'loss': 0.9955, 'grad_norm': 4.556217342908838, 'learning_rate': 9.815354713313896e-07, 'epoch': 0.0}
  0%| | 102/34278 [09:10<47:04:16, 4.96s/it] {'loss': 0.9907, 'grad_norm': 4.625039174353454, 'learning_rate': 9.91253644314869e-07, 'epoch': 0.0}
  0%| | 103/34278 [09:14<44:59:59, 4.74s/it] {'loss': 1.0229, 'grad_norm': 4.68840635703037, 'learning_rate': 1.000971817298348e-06, 'epoch': 0.0}
  0%| | 104/34278 [09:18<41:34:35, 4.38s/it] {'loss': 1.0059, 'grad_norm': 4.743602647176867, 'learning_rate': 1.0106899902818272e-06, 'epoch': 0.0}
  0%| | 105/34278 [09:21<39:55:36, 4.21s/it] {'loss': 1.0157, 'grad_norm': 4.731460723266639, 'learning_rate': 1.0204081632653063e-06, 'epoch': 0.0}
Token indices sequence length is longer than the specified maximum sequence length for this model (8237 > 8192). Running this sequence through the model will result in indexing errors
  0%| | 106/34278 [09:26<39:39:42, 4.18s/it] {'loss': 1.0059, 'grad_norm': 4.747413103995433, 'learning_rate': 1.0301263362487854e-06, 'epoch': 0.0}
  0%| | 107/34278 [09:30<40:25:19, 4.26s/it] {'loss': 1.0222, 'grad_norm': 4.8678737085012616, 'learning_rate': 1.0398445092322645e-06, 'epoch': 0.0}
  0%| | 108/34278 [09:33<38:02:00, 4.01s/it] {'loss': 1.0162, 'grad_norm': 4.836400548026891, 'learning_rate': 1.0495626822157436e-06, 'epoch': 0.0}
  0%| | 109/34278 [09:37<37:21:08, 3.94s/it] {'loss': 1.0279, 'grad_norm': 4.847008660412388, 'learning_rate': 1.0592808551992226e-06, 'epoch': 0.0}
  0%| | 110/34278 [09:41<36:02:41, 3.80s/it] {'loss': 1.0224, 'grad_norm': 4.987535364124845, 'learning_rate': 1.0689990281827017e-06, 'epoch': 0.0}
  0%| | 111/34278 [09:45<36:24:43, 3.84s/it] {'loss': 1.0064, 'grad_norm': 4.779277015313041, 'learning_rate': 1.0787172011661808e-06, 'epoch': 0.0}
  0%| | 112/34278 [09:48<34:52:17, 3.67s/it] {'loss': 1.0306, 'grad_norm': 4.903732341827149, 'learning_rate': 1.08843537414966e-06, 'epoch': 0.0}
  0%| | 113/34278 [09:51<33:35:02, 3.54s/it] {'loss': 0.9965, 'grad_norm': 5.067311367747353, 'learning_rate': 1.098153547133139e-06, 'epoch': 0.0}
  0%| | 114/34278 [09:55<33:20:16, 3.51s/it] {'loss': 0.9921, 'grad_norm': 5.147921224622042, 'learning_rate': 1.1078717201166181e-06, 'epoch': 0.0}
  0%| | 115/34278 [09:58<33:52:38, 3.57s/it] {'loss': 0.9971, 'grad_norm': 5.143403518771765, 'learning_rate': 1.1175898931000972e-06, 'epoch': 0.0}
  0%| | 116/34278 [10:02<34:49:37, 3.67s/it] {'loss': 1.006, 'grad_norm': 5.219385919357969, 'learning_rate': 1.1273080660835763e-06, 'epoch': 0.0}
  0%| | 117/34278 [10:09<42:33:24, 4.48s/it] {'loss': 0.9755, 'grad_norm': 5.132465962986135, 'learning_rate': 1.1370262390670554e-06, 'epoch': 0.0}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 0%| | 118/34278 [10:12<40:15:04, 4.24s/it] {'loss': 0.9852, 'grad_norm': 5.195001396279438, 'learning_rate': 1.1467444120505345e-06, 'epoch': 0.0} 0%| | 118/34278 [10:12<40:15:04, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 119/34278 [10:18<45:37:01, 4.81s/it] {'loss': 0.9595, 'grad_norm': 5.258841826130356, 'learning_rate': 1.1564625850340136e-06, 'epoch': 0.0} 0%| | 119/34278 [10:18<45:37:01, 4.81s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 120/34278 [10:22<42:49:18, 4.51s/it] {'loss': 0.978, 'grad_norm': 5.511324577831448, 'learning_rate': 1.1661807580174927e-06, 'epoch': 0.0} 0%| | 120/34278 [10:22<42:49:18, 4.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 121/34278 [10:26<42:15:03, 4.45s/it] {'loss': 1.0011, 'grad_norm': 5.615991123547827, 'learning_rate': 1.1758989310009718e-06, 'epoch': 0.0} 0%| | 121/34278 [10:26<42:15:03, 4.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 0%| | 122/34278 [10:31<42:05:44, 4.44s/it] {'loss': 0.9758, 'grad_norm': 5.366948772170658, 'learning_rate': 1.1856171039844512e-06, 'epoch': 0.0} 0%| | 122/34278 [10:31<42:05:44, 4.44s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 123/34278 [10:35<39:55:56, 4.21s/it] {'loss': 0.9831, 'grad_norm': 5.615408935080543, 'learning_rate': 1.19533527696793e-06, 'epoch': 0.0} 0%| | 123/34278 [10:35<39:55:56, 4.21s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 124/34278 [10:38<37:48:37, 3.99s/it] {'loss': 1.0104, 'grad_norm': 5.754702397510861, 'learning_rate': 1.2050534499514093e-06, 'epoch': 0.0} 0%| | 124/34278 [10:38<37:48:37, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 125/34278 [10:42<37:21:02, 3.94s/it] {'loss': 0.994, 'grad_norm': 5.842477132807181, 'learning_rate': 1.2147716229348884e-06, 'epoch': 0.0} 0%| | 125/34278 [10:42<37:21:02, 3.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 0%| | 126/34278 [10:45<34:45:29, 3.66s/it] {'loss': 0.9818, 'grad_norm': 6.011047654181344, 'learning_rate': 1.2244897959183673e-06, 'epoch': 0.0} 0%| | 126/34278 [10:45<34:45:29, 3.66s/it] 0%| | 127/34278 [10:49<35:51:46, 3.78s/it] {'loss': 0.9825, 'grad_norm': 6.192699227110393, 'learning_rate': 1.2342079689018466e-06, 'epoch': 0.0} 0%| | 127/34278 [10:49<35:51:46, 3.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 128/34278 [10:52<34:28:39, 3.63s/it] {'loss': 0.9925, 'grad_norm': 6.216835153057091, 'learning_rate': 1.2439261418853255e-06, 'epoch': 0.0} 0%| | 128/34278 [10:52<34:28:39, 3.63s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 129/34278 [10:56<34:22:37, 3.62s/it] {'loss': 0.9706, 'grad_norm': 6.194530008226965, 'learning_rate': 1.2536443148688048e-06, 'epoch': 0.0} 0%| | 129/34278 [10:56<34:22:37, 3.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 0%| | 130/34278 [11:01<39:41:23, 4.18s/it] {'loss': 0.9501, 'grad_norm': 6.438872617277369, 'learning_rate': 1.2633624878522837e-06, 'epoch': 0.0} 0%| | 130/34278 [11:01<39:41:23, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 0%| | 131/34278 [11:06<40:05:04, 4.23s/it] {'loss': 0.9656, 'grad_norm': 6.408646071205254, 'learning_rate': 1.2730806608357628e-06, 'epoch': 0.0} 0%| | 131/34278 [11:06<40:05:04, 4.23s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (11636 > 8192). Running this sequence through the model will result in indexing errors 0%| | 132/34278 [11:09<36:16:14, 3.82s/it] {'loss': 0.9717, 'grad_norm': 6.8926656915994515, 'learning_rate': 1.2827988338192421e-06, 'epoch': 0.0} 0%| | 132/34278 [11:09<36:16:14, 3.82s/it] 0%| | 133/34278 [11:13<36:57:22, 3.90s/it] {'loss': 0.9648, 'grad_norm': 6.786347115572355, 'learning_rate': 1.2925170068027212e-06, 'epoch': 0.0} 0%| | 133/34278 [11:13<36:57:22, 3.90s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
  0%| | 134/34278 [11:17<37:44:48, 3.98s/it] {'loss': 0.955, 'grad_norm': 7.347768619737329, 'learning_rate': 1.3022351797862003e-06, 'epoch': 0.0}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb75fc0cfe0>
Failed to fetch sample 3679498.
Exception: cannot identify image file <_io.BytesIO object at 0x7fb75fc0cfe0>
  0%| | 135/34278 [11:20<34:46:12, 3.67s/it] {'loss': 0.9522, 'grad_norm': 7.829610945379658, 'learning_rate': 1.3119533527696792e-06, 'epoch': 0.0}
  0%| | 136/34278 [11:24<37:21:29, 3.94s/it] {'loss': 0.9546, 'grad_norm': 7.863112355202574, 'learning_rate': 1.3216715257531585e-06, 'epoch': 0.0}
  0%| | 137/34278 [11:30<43:18:16, 4.57s/it] {'loss': 0.9559, 'grad_norm': 8.05567637619532, 'learning_rate': 1.3313896987366376e-06, 'epoch': 0.0}
  0%| | 138/34278 [11:34<39:53:37, 4.21s/it] {'loss': 0.9535, 'grad_norm': 8.437260079464115, 'learning_rate': 1.3411078717201167e-06, 'epoch': 0.0}
Token indices sequence length is longer than the specified maximum sequence length for this model (10047 > 8192).
Running this sequence through the model will result in indexing errors
  0%| | 139/34278 [11:39<43:10:00, 4.55s/it] {'loss': 0.9651, 'grad_norm': 8.870074634709532, 'learning_rate': 1.3508260447035958e-06, 'epoch': 0.0}
  0%| | 140/34278 [11:44<43:41:02, 4.61s/it] {'loss': 0.9671, 'grad_norm': 9.374992282612821, 'learning_rate': 1.3605442176870751e-06, 'epoch': 0.0}
  0%| | 141/34278 [11:48<41:34:35, 4.38s/it] {'loss': 0.9546, 'grad_norm': 9.475276853310358, 'learning_rate': 1.370262390670554e-06, 'epoch': 0.0}
  0%| | 142/34278 [11:52<40:42:47, 4.29s/it] {'loss': 0.9264, 'grad_norm': 9.937411320393801, 'learning_rate': 1.3799805636540331e-06, 'epoch': 0.0}
  0%| | 143/34278 [11:58<45:47:33, 4.83s/it] {'loss': 0.9081, 'grad_norm': 9.452404801305743, 'learning_rate': 1.3896987366375122e-06, 'epoch': 0.0}
  0%| | 144/34278 [12:01<41:55:15, 4.42s/it] {'loss': 0.9031, 'grad_norm': 10.388070855265655, 'learning_rate': 1.3994169096209913e-06, 'epoch': 0.0}
  0%| | 145/34278 [12:05<40:27:15, 4.27s/it] {'loss': 0.954, 'grad_norm': 10.395794080232319, 'learning_rate': 1.4091350826044706e-06, 'epoch': 0.0}
  0%| | 146/34278 [12:09<38:18:48, 4.04s/it] {'loss': 0.8961, 'grad_norm': 10.814640721322968, 'learning_rate': 1.4188532555879495e-06, 'epoch': 0.0}
  0%| | 147/34278 [12:15<44:11:14, 4.66s/it] {'loss': 0.9116, 'grad_norm': 10.711966415412752, 'learning_rate': 1.4285714285714286e-06, 'epoch': 0.0}
  0%| | 148/34278 [12:18<40:53:13, 4.31s/it] {'loss': 0.9104, 'grad_norm': 10.875004401781785, 'learning_rate': 1.4382896015549077e-06, 'epoch': 0.0}
  0%| | 149/34278 [12:22<38:41:41, 4.08s/it] {'loss': 0.9196, 'grad_norm': 11.176620547920438, 'learning_rate': 1.4480077745383868e-06, 'epoch': 0.0}
  0%| | 150/34278 [12:25<36:02:33, 3.80s/it] {'loss': 0.9329, 'grad_norm': 11.122481269522007, 'learning_rate': 1.4577259475218661e-06, 'epoch': 0.0}
  0%| | 151/34278 [12:31<42:14:41, 4.46s/it] {'loss': 0.914, 'grad_norm': 10.9934949208899, 'learning_rate': 1.4674441205053452e-06, 'epoch': 0.0}
  0%| | 152/34278 [12:34<37:43:55, 3.98s/it] {'loss': 0.9149, 'grad_norm': 11.621354462329585, 'learning_rate': 1.4771622934888241e-06, 'epoch': 0.0}
  0%| | 153/34278 [12:37<36:25:43, 3.84s/it] {'loss': 0.8847, 'grad_norm': 11.18837643736424, 'learning_rate': 1.4868804664723032e-06, 'epoch': 0.0}
Token indices sequence length is longer than the specified maximum sequence length for this model (11707 > 8192). Running this sequence through the model will result in indexing errors
  0%| | 154/34278 [12:44<43:52:50, 4.63s/it] {'loss': 0.9389, 'grad_norm': 11.098190668768323, 'learning_rate': 1.4965986394557825e-06, 'epoch': 0.0}
  0%| | 155/34278 [12:49<44:42:17, 4.72s/it] {'loss': 0.8751, 'grad_norm': 11.596961654297534, 'learning_rate': 1.5063168124392616e-06, 'epoch': 0.0}
  0%| | 156/34278 [12:52<41:33:35, 4.38s/it] {'loss': 0.8757, 'grad_norm': 11.940546503160725, 'learning_rate': 1.5160349854227407e-06, 'epoch': 0.0}
  0%| | 157/34278 [12:58<46:16:39, 4.88s/it] {'loss': 0.8986, 'grad_norm': 11.701906764397489, 'learning_rate': 1.5257531584062196e-06, 'epoch': 0.0}
  0%| | 158/34278 [13:02<43:54:42, 4.63s/it] {'loss': 0.8803, 'grad_norm': 11.7691566034306, 'learning_rate': 1.5354713313896987e-06, 'epoch': 0.0}
  0%| | 159/34278 [13:05<39:10:16, 4.13s/it] {'loss': 0.8606, 'grad_norm': 12.287601839750279, 'learning_rate': 1.545189504373178e-06, 'epoch': 0.0}
  0%| | 160/34278 [13:09<37:41:33, 3.98s/it] {'loss': 0.9111, 'grad_norm': 11.789567332957226, 'learning_rate': 1.5549076773566571e-06, 'epoch': 0.0}
  0%| | 161/34278 [13:13<37:44:42, 3.98s/it] {'loss': 0.9047, 'grad_norm': 11.945030037944933, 'learning_rate': 1.5646258503401362e-06, 'epoch': 0.0}
Token indices sequence length is longer than the specified maximum sequence length for this model (9049 > 8192). Running this sequence through the model will result in indexing errors
  0%| | 162/34278 [13:18<41:28:24, 4.38s/it] {'loss': 0.8572, 'grad_norm': 12.339148912685063, 'learning_rate': 1.5743440233236153e-06, 'epoch': 0.0}
  0%| | 163/34278 [13:22<39:34:38, 4.18s/it] {'loss': 0.8671, 'grad_norm': 12.183905694288685, 'learning_rate': 1.5840621963070942e-06, 'epoch': 0.0}
  0%| | 164/34278 [13:27<42:07:12, 4.44s/it] {'loss': 0.8553, 'grad_norm': 12.041744890895613, 'learning_rate': 1.5937803692905735e-06, 'epoch': 0.0}
  0%| | 165/34278 [13:33<47:25:40, 5.01s/it] {'loss': 0.8616, 'grad_norm': 11.930518409419967, 'learning_rate': 1.6034985422740526e-06, 'epoch': 0.0}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None
  warnings.warn(
  0%| | 166/34278 [13:38<45:13:41, 4.77s/it] {'loss': 0.8784, 'grad_norm': 12.3466248439367, 'learning_rate': 1.6132167152575317e-06, 'epoch': 0.0}
  0%| | 167/34278 [13:42<43:48:40, 4.62s/it] {'loss': 0.8461, 'grad_norm': 12.35820643078063, 'learning_rate': 1.6229348882410108e-06, 'epoch': 0.0}
  0%| | 168/34278 [13:47<44:24:15, 4.69s/it] {'loss': 0.8581, 'grad_norm': 12.234820859625453, 'learning_rate': 1.6326530612244897e-06, 'epoch': 0.0}
  0%| | 169/34278 [13:52<45:44:45, 4.83s/it] {'loss': 0.8453, 'grad_norm': 11.976455501465534, 'learning_rate': 1.642371234207969e-06, 'epoch': 0.0}
  0%| | 170/34278 [13:58<50:12:53, 5.30s/it] {'loss': 0.8628, 'grad_norm': 11.42338346944746, 'learning_rate': 1.6520894071914481e-06, 'epoch': 0.0}
  0%| | 171/34278 [14:03<47:17:39, 4.99s/it] {'loss': 0.855, 'grad_norm': 12.615894588712042, 'learning_rate': 1.6618075801749272e-06, 'epoch': 0.0}
  1%| | 172/34278 [14:06<43:49:21, 4.63s/it] {'loss': 0.8667, 'grad_norm': 12.10212199035121, 'learning_rate': 1.6715257531584065e-06, 'epoch': 0.01}
  1%| | 173/34278 [14:09<39:39:33, 4.19s/it] {'loss': 0.8406, 'grad_norm': 12.553028710083943, 'learning_rate': 1.6812439261418856e-06, 'epoch': 0.01}
  1%| | 174/34278 [14:16<45:08:38, 4.77s/it] {'loss': 0.8336, 'grad_norm': 12.344688693467962, 'learning_rate': 1.6909620991253645e-06, 'epoch': 0.01}
  1%| | 175/34278 [14:19<40:34:22, 4.28s/it] {'loss': 0.8372, 'grad_norm': 12.228384478751527, 'learning_rate': 1.7006802721088436e-06, 'epoch': 0.01}
  1%| | 176/34278 [14:24<43:07:55, 4.55s/it] {'loss': 0.8211, 'grad_norm': 12.03649962216464, 'learning_rate': 1.7103984450923227e-06, 'epoch': 0.01}
  1%| | 177/34278 [14:28<40:59:37, 4.33s/it] {'loss': 0.8637, 'grad_norm': 12.123341825107705, 'learning_rate': 1.720116618075802e-06, 'epoch': 0.01}
  1%| | 178/34278 [14:32<41:07:48, 4.34s/it] {'loss': 0.8329, 'grad_norm': 12.068774423302306, 'learning_rate': 1.7298347910592811e-06, 'epoch': 0.01}
  1%| | 179/34278 [14:36<39:48:37, 4.20s/it] {'loss': 0.8477, 'grad_norm': 11.806571153776764, 'learning_rate': 1.73955296404276e-06, 'epoch': 0.01}
  1%| | 180/34278 [14:40<38:07:44, 4.03s/it] {'loss': 0.84, 'grad_norm': 12.046593563644189, 'learning_rate': 1.7492711370262391e-06, 'epoch': 0.01}
  1%| | 181/34278 [14:43<36:28:18, 3.85s/it] {'loss': 0.8441, 'grad_norm': 11.77316188026381, 'learning_rate': 1.7589893100097182e-06, 'epoch': 0.01}
  1%| | 182/34278 [14:46<34:40:15, 3.66s/it] {'loss': 0.8339, 'grad_norm': 11.85419644496021, 'learning_rate': 1.7687074829931975e-06, 'epoch': 0.01}
  1%| | 183/34278 [14:50<35:35:03, 3.76s/it] {'loss': 0.8214, 'grad_norm': 11.944891604233412, 'learning_rate': 1.7784256559766766e-06, 'epoch': 0.01}
  1%| | 184/34278 [14:55<36:58:03, 3.90s/it] {'loss': 0.7971, 'grad_norm': 12.159880560789963, 'learning_rate': 1.7881438289601557e-06, 'epoch': 0.01}
  1%| | 185/34278 [14:58<35:35:57, 3.76s/it] {'loss': 0.7957, 'grad_norm': 11.91412578384619, 'learning_rate': 1.7978620019436346e-06, 'epoch': 0.01}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 1%| | 186/34278 [15:03<39:32:10, 4.17s/it] {'loss': 0.808, 'grad_norm': 11.911521353949063, 'learning_rate': 1.8075801749271137e-06, 'epoch': 0.01} 1%| | 186/34278 [15:03<39:32:10, 4.17s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 187/34278 [15:06<36:35:49, 3.86s/it] {'loss': 0.7909, 'grad_norm': 12.062509293974587, 'learning_rate': 1.817298347910593e-06, 'epoch': 0.01} 1%| | 187/34278 [15:06<36:35:49, 3.86s/it] 1%| | 188/34278 [15:10<35:04:44, 3.70s/it] {'loss': 0.8154, 'grad_norm': 11.600416772936738, 'learning_rate': 1.8270165208940721e-06, 'epoch': 0.01} 1%| | 188/34278 [15:10<35:04:44, 3.70s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 189/34278 [15:13<33:12:06, 3.51s/it] {'loss': 0.7985, 'grad_norm': 12.03027410387505, 'learning_rate': 1.8367346938775512e-06, 'epoch': 0.01} 1%| | 189/34278 [15:13<33:12:06, 3.51s/it] 1%| | 190/34278 [15:18<38:35:28, 4.08s/it] {'loss': 0.7597, 'grad_norm': 11.931153892179784, 'learning_rate': 1.84645286686103e-06, 'epoch': 0.01} 1%| | 190/34278 [15:18<38:35:28, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 191/34278 [15:24<44:32:53, 4.70s/it] {'loss': 0.787, 'grad_norm': 11.454083915717893, 'learning_rate': 1.8561710398445092e-06, 'epoch': 0.01} 1%| | 191/34278 [15:24<44:32:53, 4.70s/it] 1%| | 192/34278 [15:28<42:16:05, 4.46s/it] {'loss': 0.7678, 'grad_norm': 11.909385634464144, 'learning_rate': 1.8658892128279885e-06, 'epoch': 0.01} 1%| | 192/34278 [15:28<42:16:05, 4.46s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 193/34278 [15:31<38:09:23, 4.03s/it] {'loss': 0.7739, 'grad_norm': 12.009181246049002, 'learning_rate': 1.8756073858114676e-06, 'epoch': 0.01} 1%| | 193/34278 [15:31<38:09:23, 4.03s/it] 1%| | 194/34278 [15:37<43:56:42, 4.64s/it] {'loss': 0.7679, 'grad_norm': 11.646051235739243, 'learning_rate': 1.8853255587949467e-06, 'epoch': 0.01} 1%| | 194/34278 [15:37<43:56:42, 4.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 195/34278 [15:41<42:34:49, 4.50s/it] {'loss': 0.7798, 'grad_norm': 11.64121860283326, 'learning_rate': 1.895043731778426e-06, 'epoch': 0.01} 1%| | 195/34278 [15:41<42:34:49, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (12011 > 8192). Running this sequence through the model will result in indexing errors 1%| | 196/34278 [15:47<45:08:00, 4.77s/it] {'loss': 0.7368, 'grad_norm': 11.567810719508575, 'learning_rate': 1.904761904761905e-06, 'epoch': 0.01} 1%| | 196/34278 [15:47<45:08:00, 4.77s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 197/34278 [15:50<41:59:58, 4.44s/it] {'loss': 0.7623, 'grad_norm': 12.010866229070107, 'learning_rate': 1.914480077745384e-06, 'epoch': 0.01} 1%| | 197/34278 [15:50<41:59:58, 4.44s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 198/34278 [15:57<48:05:38, 5.08s/it] {'loss': 0.7268, 'grad_norm': 11.883917388929667, 'learning_rate': 1.9241982507288633e-06, 'epoch': 0.01} 1%| | 198/34278 [15:57<48:05:38, 5.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 199/34278 [16:00<43:03:11, 4.55s/it] {'loss': 0.7368, 'grad_norm': 11.72346628079956, 'learning_rate': 1.933916423712342e-06, 'epoch': 0.01} 1%| | 199/34278 [16:00<43:03:11, 4.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 200/34278 [16:03<38:20:16, 4.05s/it] {'loss': 0.7469, 'grad_norm': 12.110677939017009, 'learning_rate': 1.9436345966958215e-06, 'epoch': 0.01} 1%| | 200/34278 [16:03<38:20:16, 4.05s/it] 1%| | 201/34278 [16:07<38:03:20, 4.02s/it] {'loss': 0.7279, 'grad_norm': 11.783926774204087, 'learning_rate': 1.9533527696793004e-06, 'epoch': 0.01} 1%| | 201/34278 [16:07<38:03:20, 4.02s/it] 1%| | 202/34278 [16:14<44:59:15, 4.75s/it] {'loss': 0.7358, 'grad_norm': 11.60066904064106, 'learning_rate': 1.9630709426627793e-06, 'epoch': 0.01} 1%| | 202/34278 [16:14<44:59:15, 4.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 203/34278 [16:18<43:08:08, 4.56s/it] {'loss': 0.7407, 'grad_norm': 11.201533202218338, 'learning_rate': 1.9727891156462586e-06, 'epoch': 0.01} 1%| | 203/34278 [16:18<43:08:08, 4.56s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 204/34278 [16:21<40:49:41, 4.31s/it] {'loss': 0.7033, 'grad_norm': 11.767916494241318, 'learning_rate': 1.982507288629738e-06, 'epoch': 0.01} 1%| | 204/34278 [16:21<40:49:41, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 205/34278 [16:24<37:10:47, 3.93s/it] {'loss': 0.7102, 'grad_norm': 11.57038811045555, 'learning_rate': 1.992225461613217e-06, 'epoch': 0.01} 1%| | 205/34278 [16:24<37:10:47, 3.93s/it] 1%| | 206/34278 [16:28<35:26:01, 3.74s/it] {'loss': 0.6806, 'grad_norm': 12.001815514280002, 'learning_rate': 2.001943634596696e-06, 'epoch': 0.01} 1%| | 206/34278 [16:28<35:26:01, 3.74s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 207/34278 [16:31<34:17:14, 3.62s/it] {'loss': 0.7051, 'grad_norm': 11.79524440382437, 'learning_rate': 2.011661807580175e-06, 'epoch': 0.01} 1%| | 207/34278 [16:31<34:17:14, 3.62s/it] 1%| | 208/34278 [16:35<34:06:20, 3.60s/it] {'loss': 0.6937, 'grad_norm': 12.383127140344047, 'learning_rate': 2.0213799805636543e-06, 'epoch': 0.01} 1%| | 208/34278 [16:35<34:06:20, 3.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 209/34278 [16:38<34:36:52, 3.66s/it] {'loss': 0.6978, 'grad_norm': 11.84018887052034, 'learning_rate': 2.031098153547133e-06, 'epoch': 0.01} 1%| | 209/34278 [16:38<34:36:52, 3.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 210/34278 [16:42<33:43:32, 3.56s/it] {'loss': 0.6793, 'grad_norm': 11.447041483789565, 'learning_rate': 2.0408163265306125e-06, 'epoch': 0.01} 1%| | 210/34278 [16:42<33:43:32, 3.56s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 211/34278 [16:45<32:42:46, 3.46s/it] {'loss': 0.6889, 'grad_norm': 11.658341946520713, 'learning_rate': 2.050534499514092e-06, 'epoch': 0.01} 1%| | 211/34278 [16:45<32:42:46, 3.46s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (9421 > 8192). 
Running this sequence through the model will result in indexing errors 1%| | 212/34278 [16:48<32:03:11, 3.39s/it] {'loss': 0.684, 'grad_norm': 12.123436638014988, 'learning_rate': 2.0602526724975707e-06, 'epoch': 0.01} 1%| | 212/34278 [16:48<32:03:11, 3.39s/it] 1%| | 213/34278 [16:51<31:41:16, 3.35s/it] {'loss': 0.7034, 'grad_norm': 11.504916523431982, 'learning_rate': 2.0699708454810496e-06, 'epoch': 0.01} 1%| | 213/34278 [16:51<31:41:16, 3.35s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 214/34278 [16:58<39:49:37, 4.21s/it] {'loss': 0.6744, 'grad_norm': 11.538239186493822, 'learning_rate': 2.079689018464529e-06, 'epoch': 0.01} 1%| | 214/34278 [16:58<39:49:37, 4.21s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 215/34278 [17:03<42:45:15, 4.52s/it] {'loss': 0.6989, 'grad_norm': 11.980938305971051, 'learning_rate': 2.089407191448008e-06, 'epoch': 0.01} 1%| | 215/34278 [17:03<42:45:15, 4.52s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 216/34278 [17:06<39:29:10, 4.17s/it] {'loss': 0.6497, 'grad_norm': 11.945898901146133, 'learning_rate': 2.099125364431487e-06, 'epoch': 0.01} 1%| | 216/34278 [17:06<39:29:10, 4.17s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 217/34278 [17:12<42:34:35, 4.50s/it] {'loss': 0.6457, 'grad_norm': 11.739472956996947, 'learning_rate': 2.1088435374149664e-06, 'epoch': 0.01} 1%| | 217/34278 [17:12<42:34:35, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 218/34278 [17:14<38:08:13, 4.03s/it] {'loss': 0.6612, 'grad_norm': 11.384022274679449, 'learning_rate': 2.1185617103984453e-06, 'epoch': 0.01} 1%| | 218/34278 [17:15<38:08:13, 4.03s/it] 1%| | 219/34278 [17:21<44:22:42, 4.69s/it] {'loss': 0.678, 'grad_norm': 11.40822270680767, 'learning_rate': 2.128279883381924e-06, 'epoch': 0.01} 1%| | 219/34278 [17:21<44:22:42, 4.69s/it] 1%| | 220/34278 [17:26<46:08:52, 4.88s/it] {'loss': 0.6624, 'grad_norm': 11.553583332185259, 'learning_rate': 2.1379980563654035e-06, 'epoch': 0.01} 1%| | 220/34278 [17:26<46:08:52, 4.88s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 221/34278 [17:29<41:40:00, 4.40s/it] {'loss': 0.6379, 'grad_norm': 11.939577591021372, 'learning_rate': 2.147716229348883e-06, 'epoch': 0.01} 1%| | 221/34278 [17:29<41:40:00, 4.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 222/34278 [17:33<39:09:23, 4.14s/it] {'loss': 0.619, 'grad_norm': 11.7091863365697, 'learning_rate': 2.1574344023323617e-06, 'epoch': 0.01} 1%| | 222/34278 [17:33<39:09:23, 4.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 223/34278 [17:36<36:42:05, 3.88s/it] {'loss': 0.6678, 'grad_norm': 11.346639699567936, 'learning_rate': 2.1671525753158406e-06, 'epoch': 0.01} 1%| | 223/34278 [17:36<36:42:05, 3.88s/it] 1%| | 224/34278 [17:40<37:41:30, 3.98s/it] {'loss': 0.6105, 'grad_norm': 11.758854403205179, 'learning_rate': 2.17687074829932e-06, 'epoch': 0.01} 1%| | 224/34278 [17:40<37:41:30, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 225/34278 [17:44<37:30:31, 3.97s/it] {'loss': 0.6417, 'grad_norm': 11.473014914788202, 'learning_rate': 2.1865889212827988e-06, 'epoch': 0.01} 1%| | 225/34278 [17:44<37:30:31, 3.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 226/34278 [17:49<38:16:11, 4.05s/it] {'loss': 0.6289, 'grad_norm': 12.027836450653655, 'learning_rate': 2.196307094266278e-06, 'epoch': 0.01} 1%| | 226/34278 [17:49<38:16:11, 4.05s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8475 > 8192). Running this sequence through the model will result in indexing errors 1%| | 227/34278 [17:51<34:53:46, 3.69s/it] {'loss': 0.5889, 'grad_norm': 11.884689757159778, 'learning_rate': 2.2060252672497574e-06, 'epoch': 0.01} 1%| | 227/34278 [17:51<34:53:46, 3.69s/it] 1%| | 228/34278 [17:55<36:04:33, 3.81s/it] {'loss': 0.6089, 'grad_norm': 11.468961426808615, 'learning_rate': 2.2157434402332363e-06, 'epoch': 0.01} 1%| | 228/34278 [17:55<36:04:33, 3.81s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 229/34278 [18:01<41:55:21, 4.43s/it] {'loss': 0.5994, 'grad_norm': 11.446624636013864, 'learning_rate': 2.225461613216715e-06, 'epoch': 0.01} 1%| | 229/34278 [18:01<41:55:21, 4.43s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 230/34278 [18:05<39:07:57, 4.14s/it] {'loss': 0.6043, 'grad_norm': 11.66395812991183, 'learning_rate': 2.2351797862001945e-06, 'epoch': 0.01} 1%| | 230/34278 [18:05<39:07:57, 4.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 231/34278 [18:08<37:13:09, 3.94s/it] {'loss': 0.5903, 'grad_norm': 11.332469127007451, 'learning_rate': 2.244897959183674e-06, 'epoch': 0.01} 1%| | 231/34278 [18:08<37:13:09, 3.94s/it] 1%| | 232/34278 [18:12<36:13:21, 3.83s/it] {'loss': 0.6114, 'grad_norm': 11.471237317246185, 'learning_rate': 2.2546161321671527e-06, 'epoch': 0.01} 1%| | 232/34278 [18:12<36:13:21, 3.83s/it] 1%| | 233/34278 [18:16<37:07:55, 3.93s/it] {'loss': 0.5855, 'grad_norm': 11.850247614856915, 'learning_rate': 2.264334305150632e-06, 'epoch': 0.01} 1%| | 233/34278 [18:16<37:07:55, 3.93s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 234/34278 [18:19<34:09:14, 3.61s/it] {'loss': 0.5949, 'grad_norm': 11.82730517467475, 'learning_rate': 2.274052478134111e-06, 'epoch': 0.01} 1%| | 234/34278 [18:19<34:09:14, 3.61s/it] 1%| | 235/34278 [18:25<42:15:11, 4.47s/it] {'loss': 0.5551, 'grad_norm': 11.485264437660307, 'learning_rate': 2.28377065111759e-06, 'epoch': 0.01} 1%| | 235/34278 [18:25<42:15:11, 4.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 236/34278 [18:29<39:15:14, 4.15s/it] {'loss': 0.5722, 'grad_norm': 11.405145187429145, 'learning_rate': 2.293488824101069e-06, 'epoch': 0.01} 1%| | 236/34278 [18:29<39:15:14, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 237/34278 [18:32<37:35:32, 3.98s/it] {'loss': 0.5748, 'grad_norm': 11.660399592734256, 'learning_rate': 2.3032069970845484e-06, 'epoch': 0.01} 1%| | 237/34278 [18:32<37:35:32, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 238/34278 [18:36<37:09:29, 3.93s/it] {'loss': 0.5499, 'grad_norm': 11.439655601904152, 'learning_rate': 2.3129251700680273e-06, 'epoch': 0.01} 1%| | 238/34278 [18:36<37:09:29, 3.93s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10314 > 8192). 
Running this sequence through the model will result in indexing errors 1%| | 239/34278 [18:40<35:45:52, 3.78s/it] {'loss': 0.559, 'grad_norm': 11.708246461632152, 'learning_rate': 2.3226433430515066e-06, 'epoch': 0.01} 1%| | 239/34278 [18:40<35:45:52, 3.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 240/34278 [18:45<41:25:05, 4.38s/it] {'loss': 0.5765, 'grad_norm': 11.53076305043417, 'learning_rate': 2.3323615160349855e-06, 'epoch': 0.01} 1%| | 240/34278 [18:45<41:25:05, 4.38s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 241/34278 [18:49<39:08:32, 4.14s/it] {'loss': 0.549, 'grad_norm': 11.210280964251877, 'learning_rate': 2.342079689018465e-06, 'epoch': 0.01} 1%| | 241/34278 [18:49<39:08:32, 4.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 242/34278 [18:52<36:41:13, 3.88s/it] {'loss': 0.5216, 'grad_norm': 11.496267401715498, 'learning_rate': 2.3517978620019437e-06, 'epoch': 0.01} 1%| | 242/34278 [18:52<36:41:13, 3.88s/it] 1%| | 243/34278 [18:59<43:49:36, 4.64s/it] {'loss': 0.5352, 'grad_norm': 11.055364756441806, 'learning_rate': 2.361516034985423e-06, 'epoch': 0.01} 1%| | 243/34278 [18:59<43:49:36, 4.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 244/34278 [19:02<39:45:31, 4.21s/it] {'loss': 0.5391, 'grad_norm': 11.642585506110569, 'learning_rate': 2.3712342079689023e-06, 'epoch': 0.01} 1%| | 244/34278 [19:02<39:45:31, 4.21s/it] 1%| | 245/34278 [19:06<39:53:07, 4.22s/it] {'loss': 0.5056, 'grad_norm': 11.407627642817618, 'learning_rate': 2.380952380952381e-06, 'epoch': 0.01} 1%| | 245/34278 [19:06<39:53:07, 4.22s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 246/34278 [19:09<36:57:06, 3.91s/it] {'loss': 0.5396, 'grad_norm': 11.082367037594938, 'learning_rate': 2.39067055393586e-06, 'epoch': 0.01} 1%| | 246/34278 [19:09<36:57:06, 3.91s/it] 1%| | 247/34278 [19:14<39:48:06, 4.21s/it] {'loss': 0.5244, 'grad_norm': 11.441394058724788, 'learning_rate': 2.4003887269193394e-06, 'epoch': 0.01} 1%| | 247/34278 [19:14<39:48:06, 4.21s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 248/34278 [19:19<40:42:07, 4.31s/it] {'loss': 0.5261, 'grad_norm': 11.296712351311688, 'learning_rate': 2.4101068999028187e-06, 'epoch': 0.01} 1%| | 248/34278 [19:19<40:42:07, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (9591 > 8192). 
Running this sequence through the model will result in indexing errors 1%| | 249/34278 [19:23<40:41:16, 4.30s/it] {'loss': 0.5049, 'grad_norm': 10.950446712681304, 'learning_rate': 2.4198250728862976e-06, 'epoch': 0.01} 1%| | 249/34278 [19:23<40:41:16, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 250/34278 [19:27<40:39:20, 4.30s/it] {'loss': 0.5067, 'grad_norm': 11.19605427329169, 'learning_rate': 2.429543245869777e-06, 'epoch': 0.01} 1%| | 250/34278 [19:27<40:39:20, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 251/34278 [19:31<38:38:58, 4.09s/it] {'loss': 0.4921, 'grad_norm': 11.412942617701502, 'learning_rate': 2.4392614188532558e-06, 'epoch': 0.01} 1%| | 251/34278 [19:31<38:38:58, 4.09s/it] 1%| | 252/34278 [19:37<43:54:37, 4.65s/it] {'loss': 0.48, 'grad_norm': 11.100669247591972, 'learning_rate': 2.4489795918367347e-06, 'epoch': 0.01} 1%| | 252/34278 [19:37<43:54:37, 4.65s/it] 1%| | 253/34278 [19:40<41:01:53, 4.34s/it] {'loss': 0.4881, 'grad_norm': 11.16500214937961, 'learning_rate': 2.458697764820214e-06, 'epoch': 0.01} 1%| | 253/34278 [19:40<41:01:53, 4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
  1%| | 254/34278 [19:45<40:43:04, 4.31s/it] {'loss': 0.496, 'grad_norm': 10.921514466016799, 'learning_rate': 2.4684159378036933e-06, 'epoch': 0.01}
  1%| | 255/34278 [19:48<38:04:27, 4.03s/it] {'loss': 0.4621, 'grad_norm': 11.13552901328187, 'learning_rate': 2.478134110787172e-06, 'epoch': 0.01}
  1%| | 256/34278 [19:52<38:11:34, 4.04s/it] {'loss': 0.4794, 'grad_norm': 10.988723529484387, 'learning_rate': 2.487852283770651e-06, 'epoch': 0.01}
  1%| | 257/34278 [19:56<36:17:57, 3.84s/it] {'loss': 0.4642, 'grad_norm': 10.592447527785755, 'learning_rate': 2.4975704567541304e-06, 'epoch': 0.01}
  1%| | 258/34278 [19:59<36:38:33, 3.88s/it] {'loss': 0.4454, 'grad_norm': 11.09648035063937, 'learning_rate': 2.5072886297376097e-06, 'epoch': 0.01}
  1%| | 259/34278 [20:03<36:59:33, 3.91s/it] {'loss': 0.484, 'grad_norm': 10.789756784936234, 'learning_rate': 2.5170068027210886e-06, 'epoch': 0.01}
  1%| | 260/34278 [20:07<36:52:12, 3.90s/it] {'loss': 0.4742, 'grad_norm': 10.505112737452256, 'learning_rate': 2.5267249757045675e-06, 'epoch': 0.01}
  1%| | 261/34278 [20:14<43:43:28, 4.63s/it] {'loss': 0.463, 'grad_norm': 10.198260105944357, 'learning_rate': 2.5364431486880468e-06, 'epoch': 0.01}
  1%| | 262/34278 [20:17<40:38:44, 4.30s/it] {'loss': 0.4485, 'grad_norm': 10.58117157693541, 'learning_rate': 2.5461613216715257e-06, 'epoch': 0.01}
  1%| | 263/34278 [20:20<37:23:49, 3.96s/it] {'loss': 0.4607, 'grad_norm': 10.057462462246159, 'learning_rate': 2.5558794946550054e-06, 'epoch': 0.01}
  1%| | 264/34278 [20:23<34:45:44, 3.68s/it] {'loss': 0.4314, 'grad_norm': 10.244038062005783, 'learning_rate': 2.5655976676384843e-06, 'epoch': 0.01}
  1%| | 265/34278 [20:29<41:13:44, 4.36s/it] {'loss': 0.4033, 'grad_norm': 10.243916239162006, 'learning_rate': 2.575315840621963e-06, 'epoch': 0.01}
  1%| | 266/34278 [20:32<37:35:40, 3.98s/it] {'loss': 0.4201, 'grad_norm': 10.133218763856435, 'learning_rate': 2.5850340136054425e-06, 'epoch': 0.01}
  1%| | 267/34278 [20:37<38:55:29, 4.12s/it] {'loss': 0.4381, 'grad_norm': 9.992581101051586, 'learning_rate': 2.5947521865889214e-06, 'epoch': 0.01}
  1%| | 268/34278 [20:43<44:27:26, 4.71s/it] {'loss': 0.4254, 'grad_norm': 9.799748051020952, 'learning_rate': 2.6044703595724007e-06, 'epoch': 0.01}
  1%| | 269/34278 [20:47<42:07:17, 4.46s/it] {'loss': 0.3934, 'grad_norm': 10.097500004633288, 'learning_rate': 2.6141885325558796e-06, 'epoch': 0.01}
  1%| | 270/34278 [20:50<37:26:46, 3.96s/it] {'loss': 0.4059, 'grad_norm': 9.710073580467833, 'learning_rate': 2.6239067055393585e-06, 'epoch': 0.01}
  1%| | 271/34278 [20:53<35:29:28, 3.76s/it] {'loss': 0.3774, 'grad_norm': 9.505183836821201, 'learning_rate': 2.633624878522838e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (9544 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 272/34278 [20:57<36:45:56, 3.89s/it] {'loss': 0.3996, 'grad_norm': 9.1929482166285, 'learning_rate': 2.643343051506317e-06, 'epoch': 0.01}
  1%| | 273/34278 [21:01<35:46:06, 3.79s/it] {'loss': 0.3946, 'grad_norm': 9.364972993584722, 'learning_rate': 2.6530612244897964e-06, 'epoch': 0.01}
  1%| | 274/34278 [21:05<36:45:34, 3.89s/it] {'loss': 0.3545, 'grad_norm': 8.870779249745748, 'learning_rate': 2.6627793974732753e-06, 'epoch': 0.01}
  1%| | 275/34278 [21:08<36:09:36, 3.83s/it] {'loss': 0.3629, 'grad_norm': 8.735956325839938, 'learning_rate': 2.6724975704567546e-06, 'epoch': 0.01}
  1%| | 276/34278 [21:12<34:15:33, 3.63s/it] {'loss': 0.3931, 'grad_norm': 8.576383992093225, 'learning_rate': 2.6822157434402335e-06, 'epoch': 0.01}
  1%| | 277/34278 [21:15<33:59:14, 3.60s/it] {'loss': 0.3627, 'grad_norm': 8.555201112815267, 'learning_rate': 2.6919339164237124e-06, 'epoch': 0.01}
  1%| | 278/34278 [21:19<35:52:35, 3.80s/it] {'loss': 0.3729, 'grad_norm': 7.984072846166019, 'learning_rate': 2.7016520894071917e-06, 'epoch': 0.01}
  1%| | 279/34278 [21:24<38:12:06, 4.05s/it] {'loss': 0.3879, 'grad_norm': 8.08509797796882, 'learning_rate': 2.7113702623906706e-06, 'epoch': 0.01}
  1%| | 280/34278 [21:28<38:30:28, 4.08s/it] {'loss': 0.3499, 'grad_norm': 7.80702870942227, 'learning_rate': 2.7210884353741503e-06, 'epoch': 0.01}
  1%| | 281/34278 [21:32<36:45:58, 3.89s/it] {'loss': 0.3568, 'grad_norm': 7.574259010158371, 'learning_rate': 2.730806608357629e-06, 'epoch': 0.01}
  1%| | 282/34278 [21:37<40:15:47, 4.26s/it] {'loss': 0.3348, 'grad_norm': 7.222573175239761, 'learning_rate': 2.740524781341108e-06, 'epoch': 0.01}
  1%| | 283/34278 [21:43<46:58:45, 4.98s/it] {'loss': 0.3409, 'grad_norm': 6.879930759003031, 'learning_rate': 2.7502429543245874e-06, 'epoch': 0.01}
  1%| | 284/34278 [21:47<42:32:28, 4.51s/it] {'loss': 0.3651, 'grad_norm': 6.978015023639886, 'learning_rate': 2.7599611273080663e-06, 'epoch': 0.01}
  1%| | 285/34278 [21:50<38:35:05, 4.09s/it] {'loss': 0.3603, 'grad_norm': 6.358843281212613, 'learning_rate': 2.7696793002915456e-06, 'epoch': 0.01}
  1%| | 286/34278 [21:53<36:08:13, 3.83s/it] {'loss': 0.3388, 'grad_norm': 6.320642016108672, 'learning_rate': 2.7793974732750245e-06, 'epoch': 0.01}
  1%| | 287/34278 [21:57<34:46:14, 3.68s/it] {'loss': 0.3689, 'grad_norm': 6.32099186551206, 'learning_rate': 2.7891156462585034e-06, 'epoch': 0.01}
  1%| | 288/34278 [22:03<42:59:10, 4.55s/it] {'loss': 0.3409, 'grad_norm': 5.807201388060067, 'learning_rate': 2.7988338192419827e-06, 'epoch': 0.01}
  1%| | 289/34278 [22:07<41:29:41, 4.39s/it] {'loss': 0.3326, 'grad_norm': 5.647941332066215, 'learning_rate': 2.8085519922254615e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (8890 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 290/34278 [22:10<38:20:12, 4.06s/it] {'loss': 0.343, 'grad_norm': 5.453137115098798, 'learning_rate': 2.8182701652089413e-06, 'epoch': 0.01}
  1%| | 291/34278 [22:17<44:02:59, 4.67s/it] {'loss': 0.3198, 'grad_norm': 5.12504239664104, 'learning_rate': 2.82798833819242e-06, 'epoch': 0.01}
  1%| | 292/34278 [22:20<40:46:13, 4.32s/it] {'loss': 0.2995, 'grad_norm': 5.063327780982386, 'learning_rate': 2.837706511175899e-06, 'epoch': 0.01}
  1%| | 293/34278 [22:24<38:54:36, 4.12s/it] {'loss': 0.3412, 'grad_norm': 4.675748098209446, 'learning_rate': 2.8474246841593784e-06, 'epoch': 0.01}
  1%| | 294/34278 [22:28<38:18:29, 4.06s/it] {'loss': 0.2962, 'grad_norm': 4.66246409593304, 'learning_rate': 2.8571428571428573e-06, 'epoch': 0.01}
  1%| | 295/34278 [22:34<44:38:24, 4.73s/it] {'loss': 0.3284, 'grad_norm': 4.919686861392447, 'learning_rate': 2.8668610301263366e-06, 'epoch': 0.01}
  1%| | 296/34278 [22:37<41:18:08, 4.38s/it] {'loss': 0.3167, 'grad_norm': 4.431848908970389, 'learning_rate': 2.8765792031098155e-06, 'epoch': 0.01}
  1%| | 297/34278 [22:41<37:38:15, 3.99s/it] {'loss': 0.3108, 'grad_norm': 4.4158568891954335, 'learning_rate': 2.8862973760932948e-06, 'epoch': 0.01}
  1%| | 298/34278 [22:44<35:36:26, 3.77s/it] {'loss': 0.314, 'grad_norm': 3.998923406763035, 'learning_rate': 2.8960155490767737e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (12468 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 299/34278 [22:50<42:09:07, 4.47s/it] {'loss': 0.3043, 'grad_norm': 3.944104628252764, 'learning_rate': 2.9057337220602525e-06, 'epoch': 0.01}
  1%| | 300/34278 [22:54<39:55:09, 4.23s/it] {'loss': 0.3225, 'grad_norm': 3.7899421781554365, 'learning_rate': 2.9154518950437323e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (10425 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 301/34278 [23:00<45:40:29, 4.84s/it] {'loss': 0.2831, 'grad_norm': 3.552107196564264, 'learning_rate': 2.925170068027211e-06, 'epoch': 0.01}
  1%| | 302/34278 [23:04<43:31:26, 4.61s/it] {'loss': 0.3414, 'grad_norm': 3.5812651862591265, 'learning_rate': 2.9348882410106905e-06, 'epoch': 0.01}
  1%| | 303/34278 [23:08<42:20:09, 4.49s/it] {'loss': 0.3229, 'grad_norm': 3.5160124869214493, 'learning_rate': 2.9446064139941694e-06, 'epoch': 0.01}
  1%| | 304/34278 [23:12<39:36:50, 4.20s/it] {'loss': 0.3126, 'grad_norm': 3.6089633308193254, 'learning_rate': 2.9543245869776482e-06, 'epoch': 0.01}
  1%| | 305/34278 [23:16<39:01:47, 4.14s/it] {'loss': 0.3235, 'grad_norm': 3.5453234107237632, 'learning_rate': 2.9640427599611276e-06, 'epoch': 0.01}
  1%| | 306/34278 [23:20<39:29:23, 4.18s/it] {'loss': 0.3105, 'grad_norm': 3.5035962731904386, 'learning_rate': 2.9737609329446064e-06, 'epoch': 0.01}
  1%| | 307/34278 [23:23<37:35:48, 3.98s/it] {'loss': 0.3153, 'grad_norm': 3.7501205724538558, 'learning_rate': 2.983479105928086e-06, 'epoch': 0.01}
  1%| | 308/34278 [23:28<38:53:29, 4.12s/it] {'loss': 0.3196, 'grad_norm': 3.394941486669191, 'learning_rate': 2.993197278911565e-06, 'epoch': 0.01}
  1%| | 309/34278 [23:31<35:08:12, 3.72s/it] {'loss': 0.2779, 'grad_norm': 3.241172414342122, 'learning_rate': 3.002915451895044e-06, 'epoch': 0.01}
  1%| | 310/34278 [23:34<33:11:53, 3.52s/it] {'loss': 0.3087, 'grad_norm': 3.1934920458602507, 'learning_rate': 3.0126336248785233e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (8277 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 311/34278 [23:40<41:13:27, 4.37s/it] {'loss': 0.2996, 'grad_norm': 3.2268698451322657, 'learning_rate': 3.022351797862002e-06, 'epoch': 0.01}
  1%| | 312/34278 [23:43<38:09:20, 4.04s/it] {'loss': 0.3178, 'grad_norm': 3.1635587205391626, 'learning_rate': 3.0320699708454815e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (10023 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 313/34278 [23:47<36:34:13, 3.88s/it] {'loss': 0.2968, 'grad_norm': 3.0093458116750313, 'learning_rate': 3.0417881438289604e-06, 'epoch': 0.01}
  1%| | 314/34278 [23:52<41:29:27, 4.40s/it] {'loss': 0.2976, 'grad_norm': 2.741170367470087, 'learning_rate': 3.0515063168124392e-06, 'epoch': 0.01}
  1%| | 315/34278 [23:56<39:22:51, 4.17s/it] {'loss': 0.2977, 'grad_norm': 2.9455145470142714, 'learning_rate': 3.0612244897959185e-06, 'epoch': 0.01}
  1%| | 316/34278 [23:59<37:03:30, 3.93s/it] {'loss': 0.3148, 'grad_norm': 2.7230192545396723, 'learning_rate': 3.0709426627793974e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (9059 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 317/34278 [24:04<39:37:40, 4.20s/it] {'loss': 0.3047, 'grad_norm': 2.6993941016970457, 'learning_rate': 3.080660835762877e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (10184 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 318/34278 [24:07<36:29:21, 3.87s/it] {'loss': 0.304, 'grad_norm': 2.5951026957119185, 'learning_rate': 3.090379008746356e-06, 'epoch': 0.01}
  1%| | 319/34278 [24:11<35:24:22, 3.75s/it] {'loss': 0.2758, 'grad_norm': 2.754713376574571, 'learning_rate': 3.1000971817298354e-06, 'epoch': 0.01}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f889c820810>
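The traceback above ends in `PIL.UnidentifiedImageError`, yet the run continues and merely logs a "Failed to fetch sample" message, which suggests the dataset's `__getitem__` catches the error and falls back to another sample. Below is a minimal sketch of that pattern; the wrapper name and the deterministic next-index fallback are assumptions for illustration, not the actual `dataset.py` code:

```python
try:
    from PIL import UnidentifiedImageError  # raised by Image.open on unreadable bytes
except ImportError:
    class UnidentifiedImageError(OSError):  # stand-in so the sketch runs without Pillow
        pass

def get_item_with_fallback(dataset, index, max_retries=10):
    """Fetch a sample; on an unreadable image, log and try another index.

    Mirrors the "Failed to fetch sample N. Exception: ..." lines in the log:
    a corrupt sample is skipped instead of crashing the whole run.
    """
    for _ in range(max_retries):
        try:
            return dataset[index]
        except (UnidentifiedImageError, OSError) as exc:
            print(f"Failed to fetch sample {index}. Exception: {exc}")
            # Deterministic fallback for the sketch; a real loader might
            # resample a random index instead.
            index = (index + 1) % len(dataset)
    raise RuntimeError("too many consecutive unreadable samples")
```

The key design point is that a single corrupted object in remote storage should cost one sample, not a 34k-step run.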
Failed to fetch sample 2701722. Exception: cannot identify image file <_io.BytesIO object at 0x7f889c820810>
  1%| | 320/34278 [24:17<42:30:33, 4.51s/it] {'loss': 0.3091, 'grad_norm': 2.608129232034051, 'learning_rate': 3.1098153547133143e-06, 'epoch': 0.01}
  1%| | 321/34278 [24:20<39:01:53, 4.14s/it] {'loss': 0.2734, 'grad_norm': 2.3588671050032035, 'learning_rate': 3.119533527696793e-06, 'epoch': 0.01}
  1%| | 322/34278 [24:24<37:42:39, 4.00s/it] {'loss': 0.3143, 'grad_norm': 2.576246544800748, 'learning_rate': 3.1292517006802725e-06, 'epoch': 0.01}
  1%| | 323/34278 [24:28<36:48:42, 3.90s/it] {'loss': 0.3121, 'grad_norm': 2.442115567378784, 'learning_rate': 3.1389698736637513e-06, 'epoch': 0.01}
  1%| | 324/34278 [24:32<36:31:29, 3.87s/it] {'loss': 0.3202, 'grad_norm': 2.5068164870128395, 'learning_rate': 3.1486880466472307e-06, 'epoch': 0.01}
  1%| | 325/34278 [24:38<42:33:18, 4.51s/it] {'loss': 0.2937, 'grad_norm': 2.606783819651253, 'learning_rate': 3.1584062196307095e-06, 'epoch': 0.01}
  1%| | 326/34278 [24:41<40:55:53, 4.34s/it] {'loss': 0.271, 'grad_norm': 2.1178884108003984, 'learning_rate': 3.1681243926141884e-06, 'epoch': 0.01}
  1%| | 327/34278 [24:45<38:08:47, 4.04s/it] {'loss': 0.2843, 'grad_norm': 2.0652110483668404, 'learning_rate': 3.177842565597668e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (10632 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 328/34278 [24:49<37:52:18, 4.02s/it] {'loss': 0.2592, 'grad_norm': 2.204974423982823, 'learning_rate': 3.187560738581147e-06, 'epoch': 0.01}
  1%| | 329/34278 [24:53<38:14:15, 4.05s/it] {'loss': 0.2789, 'grad_norm': 2.0655925911611805, 'learning_rate': 3.1972789115646264e-06, 'epoch': 0.01}
  1%| | 330/34278 [24:56<36:03:13, 3.82s/it] {'loss': 0.2545, 'grad_norm': 1.9531825316794622, 'learning_rate': 3.2069970845481052e-06, 'epoch': 0.01}
  1%| | 331/34278 [25:00<34:55:52, 3.70s/it] {'loss': 0.2606, 'grad_norm': 2.1590601507754106, 'learning_rate': 3.216715257531584e-06, 'epoch': 0.01}
  1%| | 332/34278 [25:03<33:38:49, 3.57s/it] {'loss': 0.2892, 'grad_norm': 2.0642596187813056, 'learning_rate': 3.2264334305150634e-06, 'epoch': 0.01}
  1%| | 333/34278 [25:06<33:00:39, 3.50s/it] {'loss': 0.2509, 'grad_norm': 1.945148069761541, 'learning_rate': 3.2361516034985423e-06, 'epoch': 0.01}
  1%| | 334/34278 [25:13<41:09:43, 4.37s/it] {'loss': 0.2663, 'grad_norm': 2.0957514084341544, 'learning_rate': 3.2458697764820216e-06, 'epoch': 0.01}
  1%| | 335/34278 [25:16<38:04:54, 4.04s/it] {'loss': 0.2792, 'grad_norm': 2.094914155968183, 'learning_rate': 3.2555879494655005e-06, 'epoch': 0.01}
  1%| | 336/34278 [25:20<38:36:04, 4.09s/it] {'loss': 0.2866, 'grad_norm': 2.153157228626012, 'learning_rate': 3.2653061224489794e-06, 'epoch': 0.01}
  1%| | 337/34278 [25:24<37:09:48, 3.94s/it] {'loss': 0.2772, 'grad_norm': 1.8843072897963975, 'learning_rate': 3.275024295432459e-06, 'epoch': 0.01}
  1%| | 338/34278 [25:27<34:00:42, 3.61s/it] {'loss': 0.2631, 'grad_norm': 1.9095121160251654, 'learning_rate': 3.284742468415938e-06, 'epoch': 0.01}
  1%| | 339/34278 [25:30<32:56:20, 3.49s/it] {'loss': 0.2663, 'grad_norm': 2.1063348860832543, 'learning_rate': 3.2944606413994174e-06, 'epoch': 0.01}
  1%| | 340/34278 [25:34<34:13:00, 3.63s/it] {'loss': 0.3219, 'grad_norm': 1.640192998600247, 'learning_rate': 3.3041788143828962e-06, 'epoch': 0.01}
  1%| | 341/34278 [25:41<43:27:03, 4.61s/it] {'loss': 0.2576, 'grad_norm': 1.8783143584476323, 'learning_rate': 3.3138969873663755e-06, 'epoch': 0.01}
  1%| | 342/34278 [25:45<41:30:18, 4.40s/it] {'loss': 0.3015, 'grad_norm': 1.8743606251045954, 'learning_rate': 3.3236151603498544e-06, 'epoch': 0.01}
  1%| | 343/34278 [25:48<37:26:47, 3.97s/it] {'loss': 0.3041, 'grad_norm': 2.3756269137859913, 'learning_rate': 3.3333333333333333e-06, 'epoch': 0.01}
  1%| | 344/34278 [25:51<35:56:13, 3.81s/it] {'loss': 0.2436, 'grad_norm': 1.820683881601083, 'learning_rate': 3.343051506316813e-06, 'epoch': 0.01}
  1%| | 345/34278 [25:54<34:48:58, 3.69s/it] {'loss': 0.3129, 'grad_norm': 1.6527189980635084, 'learning_rate': 3.352769679300292e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (9968 > 8192). Running this sequence through the model will result in indexing errors
  1%| | 346/34278 [25:57<33:06:42, 3.51s/it] {'loss': 0.2767, 'grad_norm': 2.0328740369563074, 'learning_rate': 3.3624878522837713e-06, 'epoch': 0.01}
  1%| | 347/34278 [26:01<32:20:24, 3.43s/it] {'loss': 0.2849, 'grad_norm': 1.7729144762059441, 'learning_rate': 3.37220602526725e-06, 'epoch': 0.01}
  1%| | 348/34278 [26:05<35:36:05, 3.78s/it] {'loss': 0.3011, 'grad_norm': 1.6319492499213792, 'learning_rate': 3.381924198250729e-06, 'epoch': 0.01}
  1%| | 349/34278 [26:09<34:41:35, 3.68s/it] {'loss': 0.2881, 'grad_norm': 1.5627342952620744, 'learning_rate': 3.3916423712342083e-06, 'epoch': 0.01}
  1%| | 350/34278 [26:12<33:44:50, 3.58s/it] {'loss': 0.3088, 'grad_norm': 2.1436140364775365, 'learning_rate': 3.4013605442176872e-06, 'epoch': 0.01}
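The recurring tokenizer warning ("Token indices sequence length is longer than the specified maximum sequence length for this model") means individual samples exceed the 8192-token context window, which risks position-embedding indexing errors at the forward pass. One mitigation, sketched below under the assumption that samples are plain lists of token ids with aligned labels (the function name is hypothetical, not part of the training code):

```python
def truncate_to_max_len(input_ids, labels, max_len=8192):
    """Clip a tokenized sample to the model's context window.

    Corresponds to the log's "N > 8192" warnings: a sample longer than
    max_len is cut down so every position index stays in range. Dropping
    or chunking over-long samples are alternative policies.
    """
    if len(input_ids) <= max_len:
        return input_ids, labels
    # Truncate ids and labels together so they stay aligned token-for-token.
    return input_ids[:max_len], labels[:max_len]
```

For a multimodal sample, truncation would additionally have to avoid cutting through an image-placeholder span, which is why some pipelines drop such samples instead.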
Gradients will be None warnings.warn( 1%| | 351/34278 [26:17<38:42:25, 4.11s/it] {'loss': 0.2619, 'grad_norm': 1.5432322995400232, 'learning_rate': 3.4110787172011665e-06, 'epoch': 0.01} 1%| | 351/34278 [26:17<38:42:25, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 352/34278 [26:21<36:51:23, 3.91s/it] {'loss': 0.288, 'grad_norm': 1.4919599215439503, 'learning_rate': 3.4207968901846454e-06, 'epoch': 0.01} 1%| | 352/34278 [26:21<36:51:23, 3.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 353/34278 [26:24<34:05:29, 3.62s/it] {'loss': 0.2656, 'grad_norm': 1.588909354418498, 'learning_rate': 3.4305150631681243e-06, 'epoch': 0.01} 1%| | 353/34278 [26:24<34:05:29, 3.62s/it] 1%| | 354/34278 [26:29<37:53:51, 4.02s/it] {'loss': 0.2971, 'grad_norm': 1.7238025208697898, 'learning_rate': 3.440233236151604e-06, 'epoch': 0.01} 1%| | 354/34278 [26:29<37:53:51, 4.02s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 355/34278 [26:32<35:26:50, 3.76s/it] {'loss': 0.2524, 'grad_norm': 1.3741848014034796, 'learning_rate': 3.449951409135083e-06, 'epoch': 0.01} 1%| | 355/34278 [26:32<35:26:50, 3.76s/it] 1%| | 356/34278 [26:38<42:08:14, 4.47s/it] {'loss': 0.264, 'grad_norm': 1.6950266671469552, 'learning_rate': 3.4596695821185622e-06, 'epoch': 0.01} 1%| | 356/34278 [26:38<42:08:14, 4.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 357/34278 [26:42<40:23:49, 4.29s/it] {'loss': 0.2825, 'grad_norm': 1.620413829771043, 'learning_rate': 3.469387755102041e-06, 'epoch': 0.01} 1%| | 357/34278 [26:42<40:23:49, 4.29s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (13575 > 8192). Running this sequence through the model will result in indexing errors 1%| | 358/34278 [26:46<38:43:15, 4.11s/it] {'loss': 0.2524, 'grad_norm': 1.4843445148445067, 'learning_rate': 3.47910592808552e-06, 'epoch': 0.01} 1%| | 358/34278 [26:46<38:43:15, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 359/34278 [26:51<43:04:54, 4.57s/it] {'loss': 0.2446, 'grad_norm': 1.4282873977233463, 'learning_rate': 3.4888241010689993e-06, 'epoch': 0.01} 1%| | 359/34278 [26:51<43:04:54, 4.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 360/34278 [26:55<39:36:54, 4.20s/it] {'loss': 0.2602, 'grad_norm': 1.5458831165492712, 'learning_rate': 3.4985422740524782e-06, 'epoch': 0.01} 1%| | 360/34278 [26:55<39:36:54, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 361/34278 [26:59<40:10:18, 4.26s/it] {'loss': 0.3021, 'grad_norm': 1.9803172614901905, 'learning_rate': 3.5082604470359575e-06, 'epoch': 0.01} 1%| | 361/34278 [26:59<40:10:18, 4.26s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 362/34278 [27:02<37:45:33, 4.01s/it] {'loss': 0.29, 'grad_norm': 1.7412676481476632, 'learning_rate': 3.5179786200194364e-06, 'epoch': 0.01} 1%| | 362/34278 [27:02<37:45:33, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 363/34278 [27:07<39:29:25, 4.19s/it] {'loss': 0.2645, 'grad_norm': 1.4824319966265418, 'learning_rate': 3.527696793002916e-06, 'epoch': 0.01} 1%| | 363/34278 [27:07<39:29:25, 4.19s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 364/34278 [27:10<35:18:57, 3.75s/it] {'loss': 0.3022, 'grad_norm': 1.6505631113298946, 'learning_rate': 3.537414965986395e-06, 'epoch': 0.01} 1%| | 364/34278 [27:10<35:18:57, 3.75s/it] 1%| | 365/34278 [27:16<41:47:11, 4.44s/it] {'loss': 0.2778, 'grad_norm': 1.5184315725990505, 'learning_rate': 3.547133138969874e-06, 'epoch': 0.01} 1%| | 365/34278 [27:16<41:47:11, 4.44s/it] 1%| | 366/34278 [27:20<40:48:19, 4.33s/it] {'loss': 0.2707, 'grad_norm': 1.8930868281870492, 'learning_rate': 3.5568513119533532e-06, 'epoch': 0.01} 1%| | 366/34278 [27:20<40:48:19, 4.33s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 367/34278 [27:23<38:01:58, 4.04s/it] {'loss': 0.2454, 'grad_norm': 1.462237554708469, 'learning_rate': 3.566569484936832e-06, 'epoch': 0.01} 1%| | 367/34278 [27:23<38:01:58, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 368/34278 [27:30<44:22:24, 4.71s/it] {'loss': 0.2669, 'grad_norm': 1.6865796691913675, 'learning_rate': 3.5762876579203114e-06, 'epoch': 0.01} 1%| | 368/34278 [27:30<44:22:24, 4.71s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 369/34278 [27:34<42:47:21, 4.54s/it] {'loss': 0.2817, 'grad_norm': 1.5496968461346172, 'learning_rate': 3.5860058309037903e-06, 'epoch': 0.01} 1%| | 369/34278 [27:34<42:47:21, 4.54s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (9929 > 8192). Running this sequence through the model will result in indexing errors 1%| | 370/34278 [27:37<39:52:07, 4.23s/it] {'loss': 0.2597, 'grad_norm': 2.0249897064534523, 'learning_rate': 3.595724003887269e-06, 'epoch': 0.01} 1%| | 370/34278 [27:37<39:52:07, 4.23s/it] 1%| | 371/34278 [27:40<37:12:01, 3.95s/it] {'loss': 0.2521, 'grad_norm': 1.5958207293933246, 'learning_rate': 3.6054421768707485e-06, 'epoch': 0.01} 1%| | 371/34278 [27:40<37:12:01, 3.95s/it] 1%| | 372/34278 [27:44<36:57:12, 3.92s/it] {'loss': 0.2493, 'grad_norm': 1.4351172330015676, 'learning_rate': 3.6151603498542274e-06, 'epoch': 0.01} 1%| | 372/34278 [27:44<36:57:12, 3.92s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (10989 > 8192). Running this sequence through the model will result in indexing errors 1%| | 373/34278 [27:48<34:57:35, 3.71s/it] {'loss': 0.2746, 'grad_norm': 1.3779667132396949, 'learning_rate': 3.624878522837707e-06, 'epoch': 0.01} 1%| | 373/34278 [27:48<34:57:35, 3.71s/it] 1%| | 374/34278 [27:50<32:27:25, 3.45s/it] {'loss': 0.2697, 'grad_norm': 1.686772218077411, 'learning_rate': 3.634596695821186e-06, 'epoch': 0.01} 1%| | 374/34278 [27:50<32:27:25, 3.45s/it] 1%| | 375/34278 [27:54<31:45:54, 3.37s/it] {'loss': 0.3058, 'grad_norm': 1.63760834587086, 'learning_rate': 3.644314868804665e-06, 'epoch': 0.01} 1%| | 375/34278 [27:54<31:45:54, 3.37s/it] 1%| | 376/34278 [27:57<31:20:04, 3.33s/it] {'loss': 0.2521, 'grad_norm': 1.1955134036184547, 'learning_rate': 3.6540330417881442e-06, 'epoch': 0.01} 1%| | 376/34278 [27:57<31:20:04, 3.33s/it] 1%| | 377/34278 [28:01<35:06:48, 3.73s/it] {'loss': 0.2619, 'grad_norm': 1.2622028647576145, 'learning_rate': 3.663751214771623e-06, 'epoch': 0.01} 1%| | 377/34278 [28:02<35:06:48, 3.73s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 378/34278 [28:08<43:07:43, 4.58s/it] {'loss': 0.2717, 'grad_norm': 1.5249266049659616, 'learning_rate': 3.6734693877551024e-06, 'epoch': 0.01} 1%| | 378/34278 [28:08<43:07:43, 4.58s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 379/34278 [28:11<39:47:18, 4.23s/it] {'loss': 0.2573, 'grad_norm': 1.6900785662024531, 'learning_rate': 3.6831875607385813e-06, 'epoch': 0.01} 1%| | 379/34278 [28:11<39:47:18, 4.23s/it] 1%| | 380/34278 [28:17<43:19:59, 4.60s/it] {'loss': 0.2607, 'grad_norm': 1.211142533860906, 'learning_rate': 3.69290573372206e-06, 'epoch': 0.01} 1%| | 380/34278 [28:17<43:19:59, 4.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 381/34278 [28:20<40:08:55, 4.26s/it] {'loss': 0.2549, 'grad_norm': 1.2637964860665716, 'learning_rate': 3.70262390670554e-06, 'epoch': 0.01} 1%| | 381/34278 [28:20<40:08:55, 4.26s/it] 1%| | 382/34278 [28:24<38:21:00, 4.07s/it] {'loss': 0.2704, 'grad_norm': 1.5521314677876958, 'learning_rate': 3.7123420796890184e-06, 'epoch': 0.01} 1%| | 382/34278 [28:24<38:21:00, 4.07s/it] 1%| | 383/34278 [28:28<38:55:38, 4.13s/it] {'loss': 0.2822, 'grad_norm': 1.4219165908310176, 'learning_rate': 3.722060252672498e-06, 'epoch': 0.01} 1%| | 383/34278 [28:28<38:55:38, 4.13s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 384/34278 [28:32<38:21:22, 4.07s/it] {'loss': 0.2554, 'grad_norm': 1.3945148177559197, 'learning_rate': 3.731778425655977e-06, 'epoch': 0.01} 1%| | 384/34278 [28:32<38:21:22, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 385/34278 [28:36<37:35:21, 3.99s/it] {'loss': 0.2653, 'grad_norm': 1.4800200894833206, 'learning_rate': 3.7414965986394563e-06, 'epoch': 0.01} 1%| | 385/34278 [28:36<37:35:21, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8808 > 8192). Running this sequence through the model will result in indexing errors 1%| | 386/34278 [28:40<37:17:29, 3.96s/it] {'loss': 0.2694, 'grad_norm': 1.7467062544533822, 'learning_rate': 3.7512147716229352e-06, 'epoch': 0.01} 1%| | 386/34278 [28:40<37:17:29, 3.96s/it] 1%| | 387/34278 [28:44<38:57:58, 4.14s/it] {'loss': 0.2681, 'grad_norm': 1.4124869764924537, 'learning_rate': 3.760932944606414e-06, 'epoch': 0.01} 1%| | 387/34278 [28:44<38:57:58, 4.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 388/34278 [28:49<40:54:42, 4.35s/it] {'loss': 0.288, 'grad_norm': 1.4230904605810348, 'learning_rate': 3.7706511175898934e-06, 'epoch': 0.01} 1%| | 388/34278 [28:49<40:54:42, 4.35s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 389/34278 [28:52<37:33:40, 3.99s/it] {'loss': 0.2743, 'grad_norm': 1.357060034334733, 'learning_rate': 3.7803692905733723e-06, 'epoch': 0.01} 1%| | 389/34278 [28:52<37:33:40, 3.99s/it] 1%| | 390/34278 [28:56<35:32:57, 3.78s/it] {'loss': 0.2562, 'grad_norm': 1.372100485282738, 'learning_rate': 3.790087463556852e-06, 'epoch': 0.01} 1%| | 390/34278 [28:56<35:32:57, 3.78s/it] 1%| | 391/34278 [29:01<39:52:39, 4.24s/it] {'loss': 0.2613, 'grad_norm': 1.3167946418471566, 'learning_rate': 3.799805636540331e-06, 'epoch': 0.01} 1%| | 391/34278 [29:01<39:52:39, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 392/34278 [29:04<37:25:02, 3.98s/it] {'loss': 0.3135, 'grad_norm': 1.2965892630475417, 'learning_rate': 3.80952380952381e-06, 'epoch': 0.01} 1%| | 392/34278 [29:04<37:25:02, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 393/34278 [29:09<38:25:00, 4.08s/it] {'loss': 0.2629, 'grad_norm': 1.3897749697059423, 'learning_rate': 3.819241982507289e-06, 'epoch': 0.01} 1%| | 393/34278 [29:09<38:25:00, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 394/34278 [29:15<44:47:18, 4.76s/it] {'loss': 0.2503, 'grad_norm': 1.3108126440033776, 'learning_rate': 3.828960155490768e-06, 'epoch': 0.01} 1%| | 394/34278 [29:15<44:47:18, 4.76s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 395/34278 [29:19<41:53:28, 4.45s/it] {'loss': 0.2643, 'grad_norm': 1.2565582473203867, 'learning_rate': 3.838678328474247e-06, 'epoch': 0.01} 1%| | 395/34278 [29:19<41:53:28, 4.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 396/34278 [29:24<43:12:45, 4.59s/it] {'loss': 0.257, 'grad_norm': 1.3845977026331957, 'learning_rate': 3.848396501457727e-06, 'epoch': 0.01} 1%| | 396/34278 [29:24<43:12:45, 4.59s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (9509 > 8192). 
Running this sequence through the model will result in indexing errors 1%| | 397/34278 [29:28<41:05:21, 4.37s/it] {'loss': 0.2543, 'grad_norm': 1.0733200186856116, 'learning_rate': 3.858114674441205e-06, 'epoch': 0.01} 1%| | 397/34278 [29:28<41:05:21, 4.37s/it] 1%| | 398/34278 [29:31<38:22:52, 4.08s/it] {'loss': 0.2564, 'grad_norm': 1.3431895406111862, 'learning_rate': 3.867832847424684e-06, 'epoch': 0.01} 1%| | 398/34278 [29:31<38:22:52, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 399/34278 [29:34<35:47:38, 3.80s/it] {'loss': 0.2493, 'grad_norm': 1.5303229422311955, 'learning_rate': 3.877551020408164e-06, 'epoch': 0.01} 1%| | 399/34278 [29:34<35:47:38, 3.80s/it] 1%| | 400/34278 [29:38<35:55:33, 3.82s/it] {'loss': 0.2632, 'grad_norm': 1.5256524022733309, 'learning_rate': 3.887269193391643e-06, 'epoch': 0.01} 1%| | 400/34278 [29:38<35:55:33, 3.82s/it] 1%| | 401/34278 [29:44<41:47:39, 4.44s/it] {'loss': 0.2529, 'grad_norm': 1.3465739038902813, 'learning_rate': 3.8969873663751215e-06, 'epoch': 0.01} 1%| | 401/34278 [29:44<41:47:39, 4.44s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9336 > 8192). Running this sequence through the model will result in indexing errors 1%| | 402/34278 [29:47<39:14:20, 4.17s/it] {'loss': 0.2718, 'grad_norm': 1.6741775016426725, 'learning_rate': 3.906705539358601e-06, 'epoch': 0.01} 1%| | 402/34278 [29:47<39:14:20, 4.17s/it] 1%| | 403/34278 [29:51<38:41:11, 4.11s/it] {'loss': 0.2522, 'grad_norm': 1.5926160920030783, 'learning_rate': 3.91642371234208e-06, 'epoch': 0.01} 1%| | 403/34278 [29:51<38:41:11, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 404/34278 [29:56<40:31:58, 4.31s/it] {'loss': 0.2862, 'grad_norm': 1.2376052927951111, 'learning_rate': 3.926141885325559e-06, 'epoch': 0.01} 1%| | 404/34278 [29:56<40:31:58, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 405/34278 [30:00<38:04:57, 4.05s/it] {'loss': 0.253, 'grad_norm': 1.3319212228022448, 'learning_rate': 3.935860058309039e-06, 'epoch': 0.01} 1%| | 405/34278 [30:00<38:04:57, 4.05s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff08c136de0>
Failed to fetch sample 3657298. Exception: cannot identify image file <_io.BytesIO object at 0x7ff08c136de0>
1%| | 406/34278 [30:02<34:50:37, 3.70s/it] {'loss': 0.2689, 'grad_norm': 1.3629418138465732, 'learning_rate': 3.945578231292517e-06, 'epoch': 0.01} 1%| | 406/34278 [30:03<34:50:37, 3.70s/it] 1%| | 407/34278 [30:06<35:08:05, 3.73s/it] {'loss': 0.2525, 'grad_norm': 1.2912180310714865, 'learning_rate': 3.9552964042759965e-06, 'epoch': 0.01} 1%| | 407/34278 [30:06<35:08:05, 3.73s/it] 1%| | 408/34278 [30:10<35:45:32, 3.80s/it] {'loss': 0.2796, 'grad_norm': 1.5134083021404499, 'learning_rate': 3.965014577259476e-06, 'epoch': 0.01} 1%| | 408/34278 [30:10<35:45:32, 3.80s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 409/34278 [30:14<35:03:04, 3.73s/it] {'loss': 0.2531, 'grad_norm': 1.531948495145115, 'learning_rate': 3.974732750242954e-06, 'epoch': 0.01} 1%| | 409/34278 [30:14<35:03:04, 3.73s/it] 1%| | 410/34278 [30:19<40:04:05, 4.26s/it] {'loss': 0.3054, 'grad_norm': 1.4702809793293405, 'learning_rate': 3.984450923226434e-06, 'epoch': 0.01} 1%| | 410/34278 [30:19<40:04:05, 4.26s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 411/34278 [30:23<38:21:36, 4.08s/it] {'loss': 0.2884, 'grad_norm': 1.3305350002771992, 'learning_rate': 3.994169096209913e-06, 'epoch': 0.01} 1%| | 411/34278 [30:23<38:21:36, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 1%| | 412/34278 [30:29<44:39:42, 4.75s/it] {'loss': 0.26, 'grad_norm': 1.4697924000376428, 'learning_rate': 4.003887269193392e-06, 'epoch': 0.01} 1%| | 412/34278 [30:29<44:39:42, 4.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 413/34278 [30:32<39:56:13, 4.25s/it] {'loss': 0.2664, 'grad_norm': 1.3735374659502786, 'learning_rate': 4.013605442176871e-06, 'epoch': 0.01} 1%| | 413/34278 [30:32<39:56:13, 4.25s/it] 1%| | 414/34278 [30:36<36:53:14, 3.92s/it] {'loss': 0.2782, 'grad_norm': 1.2954656043771686, 'learning_rate': 4.02332361516035e-06, 'epoch': 0.01} 1%| | 414/34278 [30:36<36:53:14, 3.92s/it] 1%| | 415/34278 [30:42<43:35:18, 4.63s/it] {'loss': 0.2296, 'grad_norm': 1.1448721497856589, 'learning_rate': 4.033041788143829e-06, 'epoch': 0.01} 1%| | 415/34278 [30:42<43:35:18, 4.63s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 1%| | 416/34278 [30:45<39:25:43, 4.19s/it] {'loss': 0.2468, 'grad_norm': 1.3316732945761685, 'learning_rate': 4.042759961127309e-06, 'epoch': 0.01} 1%| | 416/34278 [30:45<39:25:43, 4.19s/it] 1%| | 417/34278 [30:48<35:10:34, 3.74s/it] {'loss': 0.2444, 'grad_norm': 1.5767206129208087, 'learning_rate': 4.052478134110788e-06, 'epoch': 0.01} 1%| | 417/34278 [30:48<35:10:34, 3.74s/it] 1%| | 418/34278 [30:51<34:07:44, 3.63s/it] {'loss': 0.2723, 'grad_norm': 1.4368737411520363, 'learning_rate': 4.062196307094266e-06, 'epoch': 0.01} 1%| | 418/34278 [30:51<34:07:44, 3.63s/it] 1%| | 419/34278 [30:57<41:08:32, 4.37s/it] {'loss': 0.2678, 'grad_norm': 1.1620650429155768, 'learning_rate': 4.071914480077746e-06, 'epoch': 0.01} 1%| | 419/34278 [30:57<41:08:32, 4.37s/it] 1%| | 420/34278 [31:00<36:59:53, 3.93s/it] {'loss': 0.2393, 'grad_norm': 1.4174843135919835, 'learning_rate': 4.081632653061225e-06, 'epoch': 0.01} 1%| | 420/34278 [31:00<36:59:53, 3.93s/it] 1%| | 421/34278 [31:03<34:43:23, 3.69s/it] {'loss': 0.2574, 'grad_norm': 1.2567250097238387, 'learning_rate': 4.0913508260447035e-06, 'epoch': 0.01} 1%| | 421/34278 [31:03<34:43:23, 3.69s/it] 1%| | 422/34278 [31:07<35:18:24, 3.75s/it] {'loss': 0.2528, 'grad_norm': 1.3670647007335115, 'learning_rate': 4.101068999028184e-06, 'epoch': 0.01} 1%| | 422/34278 [31:07<35:18:24, 3.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 423/34278 [31:13<40:36:04, 4.32s/it] {'loss': 0.2748, 'grad_norm': 1.5587689474510382, 'learning_rate': 4.110787172011662e-06, 'epoch': 0.01} 1%| | 423/34278 [31:13<40:36:04, 4.32s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 424/34278 [31:16<38:41:56, 4.12s/it] {'loss': 0.2565, 'grad_norm': 1.3272715078796335, 'learning_rate': 4.120505344995141e-06, 'epoch': 0.01} 1%| | 424/34278 [31:16<38:41:56, 4.12s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 425/34278 [31:20<37:08:29, 3.95s/it] {'loss': 0.2631, 'grad_norm': 1.2378761782370675, 'learning_rate': 4.130223517978621e-06, 'epoch': 0.01} 1%| | 425/34278 [31:20<37:08:29, 3.95s/it] 1%| | 426/34278 [31:24<37:52:42, 4.03s/it] {'loss': 0.263, 'grad_norm': 1.1665020326387956, 'learning_rate': 4.139941690962099e-06, 'epoch': 0.01} 1%| | 426/34278 [31:24<37:52:42, 4.03s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%| | 427/34278 [31:28<37:35:30, 4.00s/it] {'loss': 0.259, 'grad_norm': 1.3182377098074065, 'learning_rate': 4.1496598639455785e-06, 'epoch': 0.01} 1%| | 427/34278 [31:28<37:35:30, 4.00s/it] 1%| | 428/34278 [31:34<43:04:08, 4.58s/it] {'loss': 0.2442, 'grad_norm': 1.1516319554886278, 'learning_rate': 4.159378036929058e-06, 'epoch': 0.01} 1%| | 428/34278 [31:34<43:04:08, 4.58s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10846 > 8192). 
Running this sequence through the model will result in indexing errors 1%|▏ | 429/34278 [31:38<40:15:28, 4.28s/it] {'loss': 0.2456, 'grad_norm': 1.0877267499954546, 'learning_rate': 4.169096209912537e-06, 'epoch': 0.01} 1%|▏ | 429/34278 [31:38<40:15:28, 4.28s/it] 1%|▏ | 430/34278 [31:44<46:00:33, 4.89s/it] {'loss': 0.2448, 'grad_norm': 1.4293359511150157, 'learning_rate': 4.178814382896016e-06, 'epoch': 0.01} 1%|▏ | 430/34278 [31:44<46:00:33, 4.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%|▏ | 431/34278 [31:48<42:32:41, 4.53s/it] {'loss': 0.2564, 'grad_norm': 1.383345105895979, 'learning_rate': 4.188532555879495e-06, 'epoch': 0.01} 1%|▏ | 431/34278 [31:48<42:32:41, 4.53s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 1%|▏ | 432/34278 [31:51<39:16:37, 4.18s/it] {'loss': 0.2483, 'grad_norm': 1.3677681652027938, 'learning_rate': 4.198250728862974e-06, 'epoch': 0.01} 1%|▏ | 432/34278 [31:51<39:16:37, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 1%|▏ | 433/34278 [31:55<38:35:00, 4.10s/it] {'loss': 0.2602, 'grad_norm': 1.3290166256189182, 'learning_rate': 4.2079689018464535e-06, 'epoch': 0.01}
 1%|▏ | 434/34278 [31:58<35:19:46, 3.76s/it] {'loss': 0.2335, 'grad_norm': 1.332224555434403, 'learning_rate': 4.217687074829933e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (9875 > 8192). Running this sequence through the model will result in indexing errors
 1%|▏ | 435/34278 [32:01<33:35:19, 3.57s/it] {'loss': 0.2392, 'grad_norm': 1.3313100417461197, 'learning_rate': 4.227405247813411e-06, 'epoch': 0.01}
 1%|▏ | 436/34278 [32:05<33:41:16, 3.58s/it] {'loss': 0.2504, 'grad_norm': 1.2006719353729596, 'learning_rate': 4.237123420796891e-06, 'epoch': 0.01}
 1%|▏ | 437/34278 [32:08<32:35:26, 3.47s/it] {'loss': 0.249, 'grad_norm': 1.0581534964792212, 'learning_rate': 4.24684159378037e-06, 'epoch': 0.01}
 1%|▏ | 438/34278 [32:12<33:27:37, 3.56s/it] {'loss': 0.2482, 'grad_norm': 1.1171627090319456, 'learning_rate': 4.256559766763848e-06, 'epoch': 0.01}
 1%|▏ | 439/34278 [32:15<33:30:11, 3.56s/it] {'loss': 0.2691, 'grad_norm': 1.2435943357855226, 'learning_rate': 4.266277939747328e-06, 'epoch': 0.01}
 1%|▏ | 440/34278 [32:21<40:54:56, 4.35s/it] {'loss': 0.2961, 'grad_norm': 1.2748446085082195, 'learning_rate': 4.275996112730807e-06, 'epoch': 0.01}
 1%|▏ | 441/34278 [32:26<41:53:35, 4.46s/it] {'loss': 0.2467, 'grad_norm': 1.4193221476812536, 'learning_rate': 4.2857142857142855e-06, 'epoch': 0.01}
 1%|▏ | 442/34278 [32:32<47:03:08, 5.01s/it] {'loss': 0.2271, 'grad_norm': 1.1594012561569815, 'learning_rate': 4.295432458697766e-06, 'epoch': 0.01}
 1%|▏ | 443/34278 [32:35<41:36:02, 4.43s/it] {'loss': 0.2665, 'grad_norm': 1.4913336551053733, 'learning_rate': 4.305150631681244e-06, 'epoch': 0.01}
 1%|▏ | 444/34278 [32:38<37:43:24, 4.01s/it] {'loss': 0.2363, 'grad_norm': 1.2669816916780523, 'learning_rate': 4.314868804664723e-06, 'epoch': 0.01}
 1%|▏ | 445/34278 [32:42<35:46:27, 3.81s/it] {'loss': 0.2656, 'grad_norm': 1.3234504651952432, 'learning_rate': 4.324586977648203e-06, 'epoch': 0.01}
 1%|▏ | 446/34278 [32:48<42:21:17, 4.51s/it] {'loss': 0.2455, 'grad_norm': 1.6579390405095555, 'learning_rate': 4.334305150631681e-06, 'epoch': 0.01}
 1%|▏ | 447/34278 [32:54<46:55:29, 4.99s/it] {'loss': 0.248, 'grad_norm': 1.349419654484547, 'learning_rate': 4.3440233236151605e-06, 'epoch': 0.01}
 1%|▏ | 448/34278 [32:58<42:58:43, 4.57s/it] {'loss': 0.2478, 'grad_norm': 1.2588927126523761, 'learning_rate': 4.35374149659864e-06, 'epoch': 0.01}
 1%|▏ | 449/34278 [33:01<39:51:17, 4.24s/it] {'loss': 0.2391, 'grad_norm': 1.357180967939837, 'learning_rate': 4.363459669582119e-06, 'epoch': 0.01}
 1%|▏ | 450/34278 [33:05<37:47:00, 4.02s/it] {'loss': 0.2532, 'grad_norm': 1.3862470942315106, 'learning_rate': 4.3731778425655976e-06, 'epoch': 0.01}
 1%|▏ | 451/34278 [33:11<44:47:02, 4.77s/it] {'loss': 0.249, 'grad_norm': 1.1208491806284107, 'learning_rate': 4.382896015549078e-06, 'epoch': 0.01}
 1%|▏ | 452/34278 [33:14<40:37:54, 4.32s/it] {'loss': 0.2598, 'grad_norm': 1.4265465775294321, 'learning_rate': 4.392614188532556e-06, 'epoch': 0.01}
 1%|▏ | 453/34278 [33:17<37:10:41, 3.96s/it] {'loss': 0.2615, 'grad_norm': 1.3011849065304812, 'learning_rate': 4.4023323615160355e-06, 'epoch': 0.01}
 1%|▏ | 454/34278 [33:21<36:17:42, 3.86s/it] {'loss': 0.2468, 'grad_norm': 1.1182457723949686, 'learning_rate': 4.412050534499515e-06, 'epoch': 0.01}
 1%|▏ | 455/34278 [33:25<37:30:45, 3.99s/it] {'loss': 0.2381, 'grad_norm': 1.08008703791803, 'learning_rate': 4.421768707482993e-06, 'epoch': 0.01}
 1%|▏ | 456/34278 [33:28<34:54:55, 3.72s/it] {'loss': 0.2302, 'grad_norm': 1.2908888359703325, 'learning_rate': 4.431486880466473e-06, 'epoch': 0.01}
 1%|▏ | 457/34278 [33:34<41:09:35, 4.38s/it] {'loss': 0.2389, 'grad_norm': 1.3553893078632153, 'learning_rate': 4.441205053449952e-06, 'epoch': 0.01}
 1%|▏ | 458/34278 [33:39<40:49:21, 4.35s/it] {'loss': 0.2318, 'grad_norm': 1.2980071707761118, 'learning_rate': 4.45092322643343e-06, 'epoch': 0.01}
 1%|▏ | 459/34278 [33:43<40:10:28, 4.28s/it] {'loss': 0.2551, 'grad_norm': 1.3181871945419834, 'learning_rate': 4.4606413994169105e-06, 'epoch': 0.01}
 1%|▏ | 460/34278 [33:46<37:02:08, 3.94s/it] {'loss': 0.2586, 'grad_norm': 1.238630790097823, 'learning_rate': 4.470359572400389e-06, 'epoch': 0.01}
 1%|▏ | 461/34278 [33:50<37:29:41, 3.99s/it] {'loss': 0.2602, 'grad_norm': 1.3362103010683621, 'learning_rate': 4.480077745383868e-06, 'epoch': 0.01}
 1%|▏ | 462/34278 [33:53<35:52:58, 3.82s/it] {'loss': 0.243, 'grad_norm': 1.0962828772234607, 'learning_rate': 4.489795918367348e-06, 'epoch': 0.01}
 1%|▏ | 463/34278 [33:57<34:04:47, 3.63s/it] {'loss': 0.2683, 'grad_norm': 1.2585926563412242, 'learning_rate': 4.499514091350826e-06, 'epoch': 0.01}
 1%|▏ | 464/34278 [34:00<32:35:16, 3.47s/it] {'loss': 0.314, 'grad_norm': 1.4080036055157774, 'learning_rate': 4.509232264334305e-06, 'epoch': 0.01}
 1%|▏ | 465/34278 [34:03<31:38:06, 3.37s/it] {'loss': 0.2601, 'grad_norm': 1.2666446602777268, 'learning_rate': 4.518950437317785e-06, 'epoch': 0.01}
 1%|▏ | 466/34278 [34:06<32:15:16, 3.43s/it] {'loss': 0.2677, 'grad_norm': 1.183584288132124, 'learning_rate': 4.528668610301264e-06, 'epoch': 0.01}
 1%|▏ | 467/34278 [34:10<32:46:46, 3.49s/it] {'loss': 0.2623, 'grad_norm': 1.2559272580191174, 'learning_rate': 4.5383867832847425e-06, 'epoch': 0.01}
 1%|▏ | 468/34278 [34:17<41:05:14, 4.37s/it] {'loss': 0.3135, 'grad_norm': 1.2137655364874729, 'learning_rate': 4.548104956268222e-06, 'epoch': 0.01}
 1%|▏ | 469/34278 [34:20<39:34:24, 4.21s/it] {'loss': 0.2281, 'grad_norm': 1.0815132249305057, 'learning_rate': 4.557823129251701e-06, 'epoch': 0.01}
 1%|▏ | 470/34278 [34:24<36:49:05, 3.92s/it] {'loss': 0.2367, 'grad_norm': 1.1782532527157554, 'learning_rate': 4.56754130223518e-06, 'epoch': 0.01}
 1%|▏ | 471/34278 [34:28<38:55:58, 4.15s/it] {'loss': 0.2484, 'grad_norm': 1.4533580084977513, 'learning_rate': 4.57725947521866e-06, 'epoch': 0.01}
 1%|▏ | 472/34278 [34:34<42:26:17, 4.52s/it] {'loss': 0.2664, 'grad_norm': 1.3583985599668706, 'learning_rate': 4.586977648202138e-06, 'epoch': 0.01}
 1%|▏ | 473/34278 [34:37<40:14:32, 4.29s/it] {'loss': 0.2355, 'grad_norm': 1.2708757427546593, 'learning_rate': 4.5966958211856175e-06, 'epoch': 0.01}
 1%|▏ | 474/34278 [34:41<39:07:45, 4.17s/it] {'loss': 0.2299, 'grad_norm': 1.198494775196898, 'learning_rate': 4.606413994169097e-06, 'epoch': 0.01}
 1%|▏ | 475/34278 [34:47<44:12:10, 4.71s/it] {'loss': 0.2457, 'grad_norm': 1.5538009585829942, 'learning_rate': 4.616132167152575e-06, 'epoch': 0.01}
 1%|▏ | 476/34278 [34:51<42:29:00, 4.52s/it] {'loss': 0.2527, 'grad_norm': 1.2367763455447653, 'learning_rate': 4.6258503401360546e-06, 'epoch': 0.01}
 1%|▏ | 477/34278 [34:55<39:57:07, 4.26s/it] {'loss': 0.2428, 'grad_norm': 1.143066225777548, 'learning_rate': 4.635568513119534e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (11734 > 8192). Running this sequence through the model will result in indexing errors
 1%|▏ | 478/34278 [34:59<38:35:57, 4.11s/it] {'loss': 0.2441, 'grad_norm': 1.1629123562353916, 'learning_rate': 4.645286686103013e-06, 'epoch': 0.01}
 1%|▏ | 479/34278 [35:03<38:46:11, 4.13s/it] {'loss': 0.262, 'grad_norm': 1.2925898615315257, 'learning_rate': 4.6550048590864925e-06, 'epoch': 0.01}
 1%|▏ | 480/34278 [35:06<35:30:51, 3.78s/it] {'loss': 0.24, 'grad_norm': 1.300482449456053, 'learning_rate': 4.664723032069971e-06, 'epoch': 0.01}
 1%|▏ | 481/34278 [35:09<32:58:45, 3.51s/it] {'loss': 0.2551, 'grad_norm': 1.4025190963822738, 'learning_rate': 4.67444120505345e-06, 'epoch': 0.01}
 1%|▏ | 482/34278 [35:12<31:55:56, 3.40s/it] {'loss': 0.235, 'grad_norm': 1.335127065360521, 'learning_rate': 4.68415937803693e-06, 'epoch': 0.01}
 1%|▏ | 483/34278 [35:15<31:11:23, 3.32s/it] {'loss': 0.2362, 'grad_norm': 1.0738846808453941, 'learning_rate': 4.693877551020409e-06, 'epoch': 0.01}
 1%|▏ | 484/34278 [35:18<31:13:46, 3.33s/it] {'loss': 0.2267, 'grad_norm': 1.2366364177512028, 'learning_rate': 4.703595724003887e-06, 'epoch': 0.01}
 1%|▏ | 485/34278 [35:22<30:52:02, 3.29s/it] {'loss': 0.2549, 'grad_norm': 1.568015498853348, 'learning_rate': 4.713313896987367e-06, 'epoch': 0.01}
 1%|▏ | 486/34278 [35:27<35:38:15, 3.80s/it] {'loss': 0.2484, 'grad_norm': 0.99074606036659, 'learning_rate': 4.723032069970846e-06, 'epoch': 0.01}
 1%|▏ | 487/34278 [35:33<42:12:08, 4.50s/it] {'loss': 0.2631, 'grad_norm': 1.2545910290085318, 'learning_rate': 4.7327502429543244e-06, 'epoch': 0.01}
 1%|▏ | 488/34278 [35:36<38:30:41, 4.10s/it] {'loss': 0.2561, 'grad_norm': 1.3838940278828762, 'learning_rate': 4.742468415937805e-06, 'epoch': 0.01}
 1%|▏ | 489/34278 [35:39<35:50:31, 3.82s/it] {'loss': 0.2545, 'grad_norm': 1.3331219246766512, 'learning_rate': 4.752186588921283e-06, 'epoch': 0.01}
 1%|▏ | 490/34278 [35:42<33:47:54, 3.60s/it] {'loss': 0.2667, 'grad_norm': 1.5306031190582838, 'learning_rate': 4.761904761904762e-06, 'epoch': 0.01}
 1%|▏ | 491/34278 [35:48<40:06:19, 4.27s/it] {'loss': 0.245, 'grad_norm': 1.1863511505688096, 'learning_rate': 4.771622934888242e-06, 'epoch': 0.01}
 1%|▏ | 492/34278 [35:52<40:17:00, 4.29s/it] {'loss': 0.2342, 'grad_norm': 1.2939705851848529, 'learning_rate': 4.78134110787172e-06, 'epoch': 0.01}
 1%|▏ | 493/34278 [35:58<43:13:46, 4.61s/it] {'loss': 0.2233, 'grad_norm': 1.2050040257403767, 'learning_rate': 4.7910592808551995e-06, 'epoch': 0.01}
 1%|▏ | 494/34278 [36:01<40:21:42, 4.30s/it] {'loss': 0.2548, 'grad_norm': 1.2670366063806675, 'learning_rate': 4.800777453838679e-06, 'epoch': 0.01}
 1%|▏ | 495/34278 [36:04<37:03:52, 3.95s/it] {'loss': 0.2634, 'grad_norm': 1.4186837559720642, 'learning_rate': 4.810495626822158e-06, 'epoch': 0.01}
 1%|▏ | 496/34278 [36:08<35:01:06, 3.73s/it] {'loss': 0.245, 'grad_norm': 1.2092502080757692, 'learning_rate': 4.820213799805637e-06, 'epoch': 0.01}
Token indices sequence length is longer than the specified maximum sequence length for this model (10290 > 8192). Running this sequence through the model will result in indexing errors
 1%|▏ | 497/34278 [36:13<40:23:19, 4.30s/it] {'loss': 0.2804, 'grad_norm': 1.2523718195794546, 'learning_rate': 4.829931972789116e-06, 'epoch': 0.01}
 1%|▏ | 498/34278 [36:16<36:59:04, 3.94s/it] {'loss': 0.231, 'grad_norm': 1.2254633096437888, 'learning_rate': 4.839650145772595e-06, 'epoch': 0.01}
 1%|▏ | 499/34278 [36:22<43:01:44, 4.59s/it] {'loss': 0.2798, 'grad_norm': 1.2755412540876685, 'learning_rate': 4.8493683187560745e-06, 'epoch': 0.01}
 1%|▏ | 500/34278 [36:26<39:28:56, 4.21s/it] {'loss': 0.2423, 'grad_norm': 1.6972872186718777, 'learning_rate': 4.859086491739554e-06, 'epoch': 0.01}
 1%|▏ | 501/34278 [36:30<39:15:37, 4.18s/it] {'loss': 0.2727, 'grad_norm': 1.49574324230808, 'learning_rate': 4.868804664723032e-06, 'epoch': 0.01}
 1%|▏ | 502/34278 [36:36<44:43:49, 4.77s/it] {'loss': 0.2799, 'grad_norm': 1.4688837322711812, 'learning_rate': 4.8785228377065116e-06, 'epoch': 0.01}
 1%|▏ | 503/34278 [36:39<39:07:15, 4.17s/it] {'loss': 0.2468, 'grad_norm': 1.3040968475741357, 'learning_rate': 4.888241010689991e-06, 'epoch': 0.01}
 1%|▏ | 504/34278 [36:43<38:47:33, 4.13s/it] {'loss': 0.2456, 'grad_norm': 1.399788268983685, 'learning_rate': 4.897959183673469e-06, 'epoch': 0.01}
 1%|▏ | 505/34278 [36:46<36:06:54, 3.85s/it] {'loss': 0.2221, 'grad_norm': 1.198851293120145, 'learning_rate': 4.9076773566569495e-06, 'epoch': 0.01}
 1%|▏ | 506/34278 [36:52<43:07:32, 4.60s/it] {'loss': 0.2491, 'grad_norm': 1.2491295874169632, 'learning_rate': 4.917395529640428e-06, 'epoch': 0.01}
 1%|▏ | 507/34278 [36:56<39:02:03, 4.16s/it] {'loss': 0.2495, 'grad_norm': 1.4679610151184677, 'learning_rate': 4.927113702623907e-06, 'epoch': 0.01}
 1%|▏ | 508/34278 [37:00<40:53:27, 4.36s/it] {'loss': 0.2608, 'grad_norm': 1.5826969791489809, 'learning_rate': 4.936831875607387e-06, 'epoch': 0.01}
 1%|▏ | 509/34278 [37:04<38:44:27, 4.13s/it] {'loss': 0.2283, 'grad_norm': 1.3592760935131265, 'learning_rate': 4.946550048590865e-06, 'epoch': 0.01}
 1%|▏ | 510/34278 [37:09<41:11:14, 4.39s/it] {'loss': 0.2588, 'grad_norm': 1.2970047256431163, 'learning_rate': 4.956268221574344e-06, 'epoch': 0.01}
 1%|▏ | 511/34278 [37:15<45:52:18, 4.89s/it] {'loss': 0.2641, 'grad_norm': 1.2880449054790777, 'learning_rate': 4.965986394557824e-06, 'epoch': 0.01}
 1%|▏ | 512/34278 [37:19<43:09:42, 4.60s/it] {'loss': 0.2479, 'grad_norm': 1.0672005591191107, 'learning_rate': 4.975704567541302e-06, 'epoch': 0.01}
 1%|▏ | 513/34278 [37:22<38:40:42, 4.12s/it] {'loss': 0.237, 'grad_norm': 1.2408070495317607, 'learning_rate': 4.9854227405247814e-06, 'epoch': 0.01}
 1%|▏ | 514/34278 [37:25<35:49:45, 3.82s/it] {'loss': 0.2462, 'grad_norm': 1.4600327611566248, 'learning_rate': 4.995140913508261e-06, 'epoch': 0.01}
 2%|▏ | 515/34278 [37:31<42:07:35, 4.49s/it] {'loss': 0.2282, 'grad_norm': 1.0322934537143222, 'learning_rate': 5.00485908649174e-06, 'epoch': 0.02}
 2%|▏ | 516/34278 [37:37<46:53:10, 5.00s/it] {'loss': 0.2335, 'grad_norm': 1.17271083022763, 'learning_rate': 5.014577259475219e-06, 'epoch': 0.02}
 2%|▏ | 517/34278 [37:42<46:04:21, 4.91s/it] {'loss': 0.2605, 'grad_norm': 1.2339027239877591, 'learning_rate': 5.024295432458698e-06, 'epoch': 0.02}
 2%|▏ | 518/34278 [37:46<43:47:53, 4.67s/it] {'loss': 0.2563, 'grad_norm': 1.343196723311313, 'learning_rate': 5.034013605442177e-06, 'epoch': 0.02}
 2%|▏ | 519/34278 [37:52<45:58:21, 4.90s/it] {'loss': 0.2351, 'grad_norm': 1.2042926311020554, 'learning_rate': 5.0437317784256565e-06, 'epoch': 0.02}
 2%|▏ | 520/34278 [37:56<43:56:31, 4.69s/it] {'loss': 0.2681, 'grad_norm': 1.2392118146441158, 'learning_rate': 5.053449951409135e-06, 'epoch': 0.02}
 2%|▏ | 521/34278 [37:59<40:31:07, 4.32s/it] {'loss': 0.2342, 'grad_norm': 1.1642156874413723, 'learning_rate': 5.063168124392614e-06, 'epoch': 0.02}
 2%|▏ | 522/34278 [38:04<42:55:51, 4.58s/it] {'loss': 0.2354, 'grad_norm': 1.0936206726276263, 'learning_rate': 5.0728862973760935e-06, 'epoch': 0.02}
 2%|▏ | 523/34278 [38:08<40:23:41, 4.31s/it] {'loss': 0.2315, 'grad_norm': 1.1784572696574878, 'learning_rate': 5.082604470359572e-06, 'epoch': 0.02}
 2%|▏ | 524/34278 [38:11<36:43:21, 3.92s/it] {'loss': 0.2559, 'grad_norm': 1.4018108993214766, 'learning_rate': 5.092322643343051e-06, 'epoch': 0.02}
 2%|▏ | 525/34278 [38:15<35:58:56, 3.84s/it] {'loss': 0.2738, 'grad_norm': 1.4868977642568852, 'learning_rate': 5.1020408163265315e-06, 'epoch': 0.02}
 2%|▏ | 526/34278 [38:21<42:44:15, 4.56s/it] {'loss': 0.2424, 'grad_norm': 1.0975374744514939, 'learning_rate': 5.111758989310011e-06, 'epoch': 0.02}
 2%|▏ | 527/34278 [38:25<40:51:50, 4.36s/it] {'loss': 0.2563, 'grad_norm': 1.1451405719954026, 'learning_rate': 5.121477162293489e-06, 'epoch': 0.02}
 2%|▏ | 528/34278 [38:28<36:40:07, 3.91s/it] {'loss': 0.2718, 'grad_norm': 1.2648820762237045, 'learning_rate': 5.1311953352769686e-06, 'epoch': 0.02}
 2%|▏ | 529/34278 [38:31<35:44:19, 3.81s/it] {'loss': 0.2541, 'grad_norm': 1.0871343834254525, 'learning_rate': 5.140913508260448e-06, 'epoch': 0.02}
Token indices sequence length is longer than the specified maximum sequence length for this model (12570 > 8192). Running this sequence through the model will result in indexing errors
 2%|▏ | 530/34278 [38:37<40:24:16, 4.31s/it] {'loss': 0.2263, 'grad_norm': 1.3389746879539115, 'learning_rate': 5.150631681243926e-06, 'epoch': 0.02}
 2%|▏ | 531/34278 [38:43<46:18:52, 4.94s/it] {'loss': 0.2366, 'grad_norm': 1.3609250813513585, 'learning_rate': 5.160349854227406e-06, 'epoch': 0.02}
 2%|▏ | 532/34278 [38:46<40:34:43, 4.33s/it] {'loss': 0.2535, 'grad_norm': 1.3349704582365165, 'learning_rate': 5.170068027210885e-06, 'epoch': 0.02}
 2%|▏ | 533/34278 [38:50<40:07:32, 4.28s/it] {'loss': 0.2533, 'grad_norm': 1.3591505569965345, 'learning_rate': 5.179786200194364e-06, 'epoch': 0.02}
 2%|▏ | 534/34278 [38:54<38:44:18, 4.13s/it] {'loss': 0.2312, 'grad_norm': 1.0546896668336805, 'learning_rate': 5.189504373177843e-06, 'epoch': 0.02}
 2%|▏ | 535/34278 [38:57<36:26:10, 3.89s/it] {'loss': 0.2533, 'grad_norm': 1.1642247748102443, 'learning_rate': 5.199222546161322e-06, 'epoch': 0.02}
 2%|▏ | 536/34278 [39:01<35:15:08, 3.76s/it] {'loss': 0.2498, 'grad_norm': 1.2734641669395699, 'learning_rate': 5.208940719144801e-06, 'epoch': 0.02}
 2%|▏ | 537/34278 [39:04<32:57:14, 3.52s/it] {'loss': 0.2141, 'grad_norm': 1.0357302408386122, 'learning_rate': 5.21865889212828e-06, 'epoch': 0.02}
 2%|▏ | 538/34278 [39:07<32:36:52, 3.48s/it] {'loss': 0.2341, 'grad_norm': 1.0304599709978406, 'learning_rate': 5.228377065111759e-06, 'epoch': 0.02}
 2%|▏ | 539/34278 [39:10<31:22:23, 3.35s/it] {'loss': 0.2325, 'grad_norm': 1.4047972859558309, 'learning_rate': 5.2380952380952384e-06, 'epoch': 0.02}
 2%|▏ | 540/34278 [39:16<37:12:49, 3.97s/it] {'loss': 0.2495, 'grad_norm': 1.3253297357545974, 'learning_rate': 5.247813411078717e-06, 'epoch': 0.02}
 2%|▏ | 541/34278 [39:20<37:38:42, 4.02s/it] {'loss': 0.2275, 'grad_norm': 1.0493019209181853, 'learning_rate': 5.257531584062196e-06, 'epoch': 0.02}
 2%|▏ | 542/34278 [39:26<44:01:23, 4.70s/it] {'loss': 0.2218, 'grad_norm': 1.3123986483884977, 'learning_rate': 5.267249757045676e-06, 'epoch': 0.02}
 2%|▏ | 543/34278 [39:30<41:15:59, 4.40s/it] {'loss': 0.2488, 'grad_norm': 1.515493491087858, 'learning_rate': 5.276967930029156e-06, 'epoch': 0.02}
 2%|▏ | 544/34278 [39:33<38:19:48, 4.09s/it] {'loss': 0.2349, 'grad_norm': 1.1272225662027693, 'learning_rate': 5.286686103012634e-06, 'epoch': 0.02}
 2%|▏ | 545/34278 [39:36<36:19:19, 3.88s/it] {'loss': 0.261, 'grad_norm': 1.1164426331892674, 'learning_rate': 5.2964042759961135e-06, 'epoch': 0.02}
 2%|▏ | 546/34278 [39:40<34:33:04, 3.69s/it] {'loss': 0.2466, 'grad_norm': 1.2763851002221152, 'learning_rate': 5.306122448979593e-06, 'epoch': 0.02}
 2%|▏ | 547/34278 [39:45<40:10:55, 4.29s/it] {'loss': 0.2335, 'grad_norm': 1.2014379229596053, 'learning_rate': 5.315840621963071e-06, 'epoch': 0.02}
 2%|▏ | 548/34278 [39:49<38:50:02, 4.14s/it] {'loss': 0.2506, 'grad_norm': 1.2100206293147397, 'learning_rate': 5.3255587949465505e-06, 'epoch': 0.02}
 2%|▏ | 549/34278 [39:53<36:33:28, 3.90s/it] {'loss': 0.2634, 'grad_norm': 1.3275336886079625, 'learning_rate': 5.33527696793003e-06, 'epoch': 0.02}
 2%|▏ | 550/34278 [39:56<34:04:29, 3.64s/it] {'loss': 0.2299, 'grad_norm': 1.3174878416843403, 'learning_rate': 5.344995140913509e-06, 'epoch': 0.02}
 2%|▏ | 551/34278 [39:59<34:29:47, 3.68s/it] {'loss': 0.2384, 'grad_norm': 0.9008258507259723, 'learning_rate': 5.354713313896988e-06, 'epoch': 0.02}
 2%|▏ | 552/34278 [40:02<32:49:59, 3.50s/it] {'loss': 0.2366, 'grad_norm': 1.3084802157410997, 'learning_rate': 5.364431486880467e-06, 'epoch': 0.02}
 2%|▏ | 553/34278 [40:05<30:47:12, 3.29s/it] {'loss': 0.2624, 'grad_norm': 1.2136094468462453, 'learning_rate': 5.374149659863946e-06, 'epoch': 0.02}
 2%|▏ | 554/34278 [40:11<38:06:25, 4.07s/it] {'loss': 0.2801, 'grad_norm': 1.4392730033686674, 'learning_rate': 5.383867832847425e-06, 'epoch': 0.02}
Token indices sequence length is longer than the specified maximum sequence length for this model (8283 > 8192). Running this sequence through the model will result in indexing errors
 2%|▏ | 555/34278 [40:16<40:13:58, 4.29s/it] {'loss': 0.2458, 'grad_norm': 1.2542445458376885, 'learning_rate': 5.393586005830904e-06, 'epoch': 0.02}
Gradients will be None warnings.warn( 2%|▏ | 556/34278 [40:19<37:02:29, 3.95s/it] {'loss': 0.2392, 'grad_norm': 1.34420603075901, 'learning_rate': 5.403304178814383e-06, 'epoch': 0.02} 2%|▏ | 556/34278 [40:19<37:02:29, 3.95s/it] 2%|▏ | 557/34278 [40:23<35:51:21, 3.83s/it] {'loss': 0.2572, 'grad_norm': 1.3102475035554522, 'learning_rate': 5.413022351797862e-06, 'epoch': 0.02} 2%|▏ | 557/34278 [40:23<35:51:21, 3.83s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 558/34278 [40:29<41:45:22, 4.46s/it] {'loss': 0.2454, 'grad_norm': 1.2479299081465867, 'learning_rate': 5.422740524781341e-06, 'epoch': 0.02} 2%|▏ | 558/34278 [40:29<41:45:22, 4.46s/it] 2%|▏ | 559/34278 [40:35<46:26:15, 4.96s/it] {'loss': 0.2442, 'grad_norm': 1.4107698889438265, 'learning_rate': 5.4324586977648204e-06, 'epoch': 0.02} 2%|▏ | 559/34278 [40:35<46:26:15, 4.96s/it] 2%|▏ | 560/34278 [40:38<40:27:17, 4.32s/it] {'loss': 0.2634, 'grad_norm': 1.4361667300580239, 'learning_rate': 5.442176870748301e-06, 'epoch': 0.02} 2%|▏ | 560/34278 [40:38<40:27:17, 4.32s/it] 2%|▏ | 561/34278 [40:41<38:17:18, 4.09s/it] {'loss': 0.2227, 'grad_norm': 1.072946103830777, 'learning_rate': 5.451895043731778e-06, 'epoch': 0.02} 2%|▏ | 561/34278 [40:41<38:17:18, 4.09s/it] 2%|▏ | 562/34278 [40:45<37:06:15, 3.96s/it] {'loss': 0.2203, 'grad_norm': 1.1858106557560961, 'learning_rate': 5.461613216715258e-06, 'epoch': 0.02} 2%|▏ | 562/34278 [40:45<37:06:15, 3.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 563/34278 [40:48<35:28:22, 3.79s/it] {'loss': 0.2395, 'grad_norm': 1.1162929363472966, 'learning_rate': 5.471331389698738e-06, 'epoch': 0.02} 2%|▏ | 563/34278 [40:48<35:28:22, 3.79s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 564/34278 [40:54<41:56:04, 4.48s/it] {'loss': 0.2282, 'grad_norm': 1.1884848177255527, 'learning_rate': 5.481049562682216e-06, 'epoch': 0.02} 2%|▏ | 564/34278 [40:54<41:56:04, 4.48s/it] 2%|▏ | 565/34278 [40:57<38:20:16, 4.09s/it] {'loss': 0.2603, 'grad_norm': 1.1283973242586844, 'learning_rate': 5.4907677356656954e-06, 'epoch': 0.02} 2%|▏ | 565/34278 [40:57<38:20:16, 4.09s/it] 2%|▏ | 566/34278 [41:00<34:59:19, 3.74s/it] {'loss': 0.2459, 'grad_norm': 1.354333277408854, 'learning_rate': 5.500485908649175e-06, 'epoch': 0.02} 2%|▏ | 566/34278 [41:00<34:59:19, 3.74s/it] 2%|▏ | 567/34278 [41:03<32:57:21, 3.52s/it] {'loss': 0.2178, 'grad_norm': 1.325314031879747, 'learning_rate': 5.510204081632653e-06, 'epoch': 0.02} 2%|▏ | 567/34278 [41:03<32:57:21, 3.52s/it] 2%|▏ | 568/34278 [41:07<32:40:35, 3.49s/it] {'loss': 0.2338, 'grad_norm': 1.11324996018437, 'learning_rate': 5.5199222546161325e-06, 'epoch': 0.02} 2%|▏ | 568/34278 [41:07<32:40:35, 3.49s/it] 2%|▏ | 569/34278 [41:10<31:36:58, 3.38s/it] {'loss': 0.2141, 'grad_norm': 1.3652803222148933, 'learning_rate': 5.529640427599612e-06, 'epoch': 0.02} 2%|▏ | 569/34278 [41:10<31:36:58, 3.38s/it] 2%|▏ | 570/34278 [41:16<38:00:25, 4.06s/it] {'loss': 0.2889, 'grad_norm': 1.5019331891505299, 'learning_rate': 5.539358600583091e-06, 'epoch': 0.02} 2%|▏ | 570/34278 [41:16<38:00:25, 4.06s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 571/34278 [41:18<34:54:17, 3.73s/it] {'loss': 0.2358, 'grad_norm': 1.2501896889679502, 'learning_rate': 5.54907677356657e-06, 'epoch': 0.02} 2%|▏ | 571/34278 [41:18<34:54:17, 3.73s/it] 2%|▏ | 572/34278 [41:22<34:55:09, 3.73s/it] {'loss': 0.25, 'grad_norm': 1.295296482077547, 'learning_rate': 5.558794946550049e-06, 'epoch': 0.02} 2%|▏ | 572/34278 [41:22<34:55:09, 3.73s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 573/34278 [41:26<34:08:47, 3.65s/it] {'loss': 0.2442, 'grad_norm': 1.1903028277520935, 'learning_rate': 5.568513119533528e-06, 'epoch': 0.02} 2%|▏ | 573/34278 [41:26<34:08:47, 3.65s/it] 2%|▏ | 574/34278 [41:29<33:25:08, 3.57s/it] {'loss': 0.2548, 'grad_norm': 1.308995566767906, 'learning_rate': 5.578231292517007e-06, 'epoch': 0.02} 2%|▏ | 574/34278 [41:29<33:25:08, 3.57s/it] 2%|▏ | 575/34278 [41:35<39:06:06, 4.18s/it] {'loss': 0.2714, 'grad_norm': 1.3248921047172681, 'learning_rate': 5.587949465500486e-06, 'epoch': 0.02} 2%|▏ | 575/34278 [41:35<39:06:06, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 576/34278 [41:38<36:58:52, 3.95s/it] {'loss': 0.2275, 'grad_norm': 1.2266592847317053, 'learning_rate': 5.597667638483965e-06, 'epoch': 0.02} 2%|▏ | 576/34278 [41:38<36:58:52, 3.95s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 577/34278 [41:42<35:44:14, 3.82s/it] {'loss': 0.2321, 'grad_norm': 1.399214573599055, 'learning_rate': 5.6073858114674455e-06, 'epoch': 0.02} 2%|▏ | 577/34278 [41:42<35:44:14, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 578/34278 [41:45<33:34:26, 3.59s/it] {'loss': 0.2557, 'grad_norm': 1.22807791517516, 'learning_rate': 5.617103984450923e-06, 'epoch': 0.02} 2%|▏ | 578/34278 [41:45<33:34:26, 3.59s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9080 > 8192). Running this sequence through the model will result in indexing errors 2%|▏ | 579/34278 [41:50<38:27:16, 4.11s/it] {'loss': 0.2544, 'grad_norm': 1.020201564208349, 'learning_rate': 5.626822157434403e-06, 'epoch': 0.02} 2%|▏ | 579/34278 [41:50<38:27:16, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 580/34278 [41:54<38:19:53, 4.10s/it] {'loss': 0.2258, 'grad_norm': 1.15048463646873, 'learning_rate': 5.6365403304178826e-06, 'epoch': 0.02} 2%|▏ | 580/34278 [41:54<38:19:53, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 581/34278 [41:58<37:18:06, 3.99s/it] {'loss': 0.2637, 'grad_norm': 1.3848293352426153, 'learning_rate': 5.646258503401361e-06, 'epoch': 0.02} 2%|▏ | 581/34278 [41:58<37:18:06, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 582/34278 [42:01<34:34:02, 3.69s/it] {'loss': 0.2179, 'grad_norm': 1.2734601804541426, 'learning_rate': 5.65597667638484e-06, 'epoch': 0.02} 2%|▏ | 582/34278 [42:01<34:34:02, 3.69s/it] 2%|▏ | 583/34278 [42:05<36:14:44, 3.87s/it] {'loss': 0.2198, 'grad_norm': 1.05233161257467, 'learning_rate': 5.66569484936832e-06, 'epoch': 0.02} 2%|▏ | 583/34278 [42:05<36:14:44, 3.87s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 584/34278 [42:09<36:07:17, 3.86s/it] {'loss': 0.2452, 'grad_norm': 1.2250757368811662, 'learning_rate': 5.675413022351798e-06, 'epoch': 0.02} 2%|▏ | 584/34278 [42:09<36:07:17, 3.86s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 585/34278 [42:15<43:01:32, 4.60s/it] {'loss': 0.2414, 'grad_norm': 1.2896924036488904, 'learning_rate': 5.6851311953352774e-06, 'epoch': 0.02} 2%|▏ | 585/34278 [42:15<43:01:32, 4.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 586/34278 [42:19<42:06:15, 4.50s/it] {'loss': 0.2263, 'grad_norm': 1.0976720248422727, 'learning_rate': 5.694849368318757e-06, 'epoch': 0.02} 2%|▏ | 586/34278 [42:19<42:06:15, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 587/34278 [42:23<39:01:03, 4.17s/it] {'loss': 0.2573, 'grad_norm': 1.0181006871906253, 'learning_rate': 5.704567541302236e-06, 'epoch': 0.02} 2%|▏ | 587/34278 [42:23<39:01:03, 4.17s/it] 2%|▏ | 588/34278 [42:27<38:08:57, 4.08s/it] {'loss': 0.2535, 'grad_norm': 1.2622292761764857, 'learning_rate': 5.7142857142857145e-06, 'epoch': 0.02} 2%|▏ | 588/34278 [42:27<38:08:57, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 589/34278 [42:30<36:22:13, 3.89s/it] {'loss': 0.2532, 'grad_norm': 1.3264411006970873, 'learning_rate': 5.724003887269194e-06, 'epoch': 0.02} 2%|▏ | 589/34278 [42:30<36:22:13, 3.89s/it] 2%|▏ | 590/34278 [42:33<34:27:17, 3.68s/it] {'loss': 0.2511, 'grad_norm': 1.3876971135647425, 'learning_rate': 5.733722060252673e-06, 'epoch': 0.02} 2%|▏ | 590/34278 [42:33<34:27:17, 3.68s/it] 2%|▏ | 591/34278 [42:36<32:27:36, 3.47s/it] {'loss': 0.2181, 'grad_norm': 0.9140296006283072, 'learning_rate': 5.743440233236152e-06, 'epoch': 0.02} 2%|▏ | 591/34278 [42:36<32:27:36, 3.47s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9929 > 8192). Running this sequence through the model will result in indexing errors 2%|▏ | 592/34278 [42:39<30:52:12, 3.30s/it] {'loss': 0.2426, 'grad_norm': 1.1627511150725285, 'learning_rate': 5.753158406219631e-06, 'epoch': 0.02} 2%|▏ | 592/34278 [42:39<30:52:12, 3.30s/it] 2%|▏ | 593/34278 [42:43<31:40:30, 3.39s/it] {'loss': 0.2276, 'grad_norm': 0.9866981750071782, 'learning_rate': 5.76287657920311e-06, 'epoch': 0.02} 2%|▏ | 593/34278 [42:43<31:40:30, 3.39s/it] 2%|▏ | 594/34278 [42:46<30:16:05, 3.23s/it] {'loss': 0.2073, 'grad_norm': 1.007490986657743, 'learning_rate': 5.7725947521865895e-06, 'epoch': 0.02} 2%|▏ | 594/34278 [42:46<30:16:05, 3.23s/it] 2%|▏ | 595/34278 [42:49<30:55:43, 3.31s/it] {'loss': 0.2378, 'grad_norm': 1.2474666067068907, 'learning_rate': 5.782312925170068e-06, 'epoch': 0.02} 2%|▏ | 595/34278 [42:49<30:55:43, 3.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 596/34278 [42:55<39:02:32, 4.17s/it] {'loss': 0.2607, 'grad_norm': 1.279196133454035, 'learning_rate': 5.792031098153547e-06, 'epoch': 0.02} 2%|▏ | 596/34278 [42:55<39:02:32, 4.17s/it] 2%|▏ | 597/34278 [42:59<38:23:36, 4.10s/it] {'loss': 0.2442, 'grad_norm': 1.1945455195443986, 'learning_rate': 5.8017492711370275e-06, 'epoch': 0.02} 2%|▏ | 597/34278 [42:59<38:23:36, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 598/34278 [43:03<36:02:36, 3.85s/it] {'loss': 0.2123, 'grad_norm': 1.420540164405751, 'learning_rate': 5.811467444120505e-06, 'epoch': 0.02} 2%|▏ | 598/34278 [43:03<36:02:36, 3.85s/it] 2%|▏ | 599/34278 [43:08<39:04:13, 4.18s/it] {'loss': 0.2866, 'grad_norm': 1.3801792520800733, 'learning_rate': 5.821185617103985e-06, 'epoch': 0.02} 2%|▏ | 599/34278 [43:08<39:04:13, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8478 > 8192). 
Running this sequence through the model will result in indexing errors 2%|▏ | 600/34278 [43:11<35:52:41, 3.84s/it] {'loss': 0.2689, 'grad_norm': 1.4450475245321002, 'learning_rate': 5.8309037900874645e-06, 'epoch': 0.02} 2%|▏ | 600/34278 [43:11<35:52:41, 3.84s/it] 2%|▏ | 601/34278 [43:14<34:07:00, 3.65s/it] {'loss': 0.2461, 'grad_norm': 1.4434020910980732, 'learning_rate': 5.840621963070943e-06, 'epoch': 0.02} 2%|▏ | 601/34278 [43:14<34:07:00, 3.65s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 602/34278 [43:19<37:48:42, 4.04s/it] {'loss': 0.2304, 'grad_norm': 1.170915524215105, 'learning_rate': 5.850340136054422e-06, 'epoch': 0.02} 2%|▏ | 602/34278 [43:19<37:48:42, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (11845 > 8192). Running this sequence through the model will result in indexing errors 2%|▏ | 603/34278 [43:24<42:19:49, 4.53s/it] {'loss': 0.2216, 'grad_norm': 1.184787623641522, 'learning_rate': 5.860058309037902e-06, 'epoch': 0.02} 2%|▏ | 603/34278 [43:24<42:19:49, 4.53s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 604/34278 [43:28<41:00:31, 4.38s/it] {'loss': 0.266, 'grad_norm': 1.4226813433172771, 'learning_rate': 5.869776482021381e-06, 'epoch': 0.02} 2%|▏ | 604/34278 [43:28<41:00:31, 4.38s/it] 2%|▏ | 605/34278 [43:31<37:09:57, 3.97s/it] {'loss': 0.2365, 'grad_norm': 1.1701452340689182, 'learning_rate': 5.879494655004859e-06, 'epoch': 0.02} 2%|▏ | 605/34278 [43:31<37:09:57, 3.97s/it] 2%|▏ | 606/34278 [43:35<35:08:27, 3.76s/it] {'loss': 0.2279, 'grad_norm': 1.1314185813102013, 'learning_rate': 5.889212827988339e-06, 'epoch': 0.02} 2%|▏ | 606/34278 [43:35<35:08:27, 3.76s/it] 2%|▏ | 607/34278 [43:41<41:55:28, 4.48s/it] {'loss': 0.2297, 'grad_norm': 1.3417308850278244, 'learning_rate': 5.898931000971818e-06, 'epoch': 0.02} 2%|▏ | 607/34278 [43:41<41:55:28, 4.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 608/34278 [43:44<38:58:41, 4.17s/it] {'loss': 0.2268, 'grad_norm': 1.1774011258844665, 'learning_rate': 5.9086491739552965e-06, 'epoch': 0.02} 2%|▏ | 608/34278 [43:44<38:58:41, 4.17s/it] 2%|▏ | 609/34278 [43:47<35:56:28, 3.84s/it] {'loss': 0.261, 'grad_norm': 1.1483123651885505, 'learning_rate': 5.918367346938776e-06, 'epoch': 0.02} 2%|▏ | 609/34278 [43:47<35:56:28, 3.84s/it] 2%|▏ | 610/34278 [43:52<38:51:16, 4.15s/it] {'loss': 0.2557, 'grad_norm': 1.0659224829575464, 'learning_rate': 5.928085519922255e-06, 'epoch': 0.02} 2%|▏ | 610/34278 [43:52<38:51:16, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 611/34278 [43:56<37:13:11, 3.98s/it] {'loss': 0.2597, 'grad_norm': 1.3453124406569783, 'learning_rate': 5.937803692905734e-06, 'epoch': 0.02} 2%|▏ | 611/34278 [43:56<37:13:11, 3.98s/it] 2%|▏ | 612/34278 [43:59<35:07:30, 3.76s/it] {'loss': 0.2445, 'grad_norm': 0.9446040810558779, 'learning_rate': 5.947521865889213e-06, 'epoch': 0.02} 2%|▏ | 612/34278 [43:59<35:07:30, 3.76s/it] 2%|▏ | 613/34278 [44:02<32:39:13, 3.49s/it] {'loss': 0.2346, 'grad_norm': 1.075287696623533, 'learning_rate': 5.957240038872692e-06, 'epoch': 0.02} 2%|▏ | 613/34278 [44:02<32:39:13, 3.49s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8784 > 8192). Running this sequence through the model will result in indexing errors 2%|▏ | 614/34278 [44:06<32:49:32, 3.51s/it] {'loss': 0.2217, 'grad_norm': 1.1094662802964779, 'learning_rate': 5.966958211856172e-06, 'epoch': 0.02} 2%|▏ | 614/34278 [44:06<32:49:32, 3.51s/it] 2%|▏ | 615/34278 [44:11<39:41:00, 4.24s/it] {'loss': 0.2595, 'grad_norm': 1.4239082627795743, 'learning_rate': 5.97667638483965e-06, 'epoch': 0.02} 2%|▏ | 615/34278 [44:11<39:41:00, 4.24s/it] 2%|▏ | 616/34278 [44:15<36:24:17, 3.89s/it] {'loss': 0.2404, 'grad_norm': 1.4489530560590527, 'learning_rate': 5.98639455782313e-06, 'epoch': 0.02} 2%|▏ | 616/34278 [44:15<36:24:17, 3.89s/it] 2%|▏ | 617/34278 [44:18<36:22:14, 3.89s/it] {'loss': 0.2334, 'grad_norm': 1.1926829007742819, 'learning_rate': 5.9961127308066094e-06, 'epoch': 0.02} 2%|▏ | 617/34278 [44:18<36:22:14, 3.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 618/34278 [44:24<39:55:08, 4.27s/it] {'loss': 0.2362, 'grad_norm': 1.2726694122969338, 'learning_rate': 6.005830903790088e-06, 'epoch': 0.02} 2%|▏ | 618/34278 [44:24<39:55:08, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 619/34278 [44:27<37:40:50, 4.03s/it] {'loss': 0.236, 'grad_norm': 1.2354341099058872, 'learning_rate': 6.015549076773567e-06, 'epoch': 0.02} 2%|▏ | 619/34278 [44:27<37:40:50, 4.03s/it] 2%|▏ | 620/34278 [44:30<36:01:06, 3.85s/it] {'loss': 0.2352, 'grad_norm': 1.1685072760117496, 'learning_rate': 6.0252672497570465e-06, 'epoch': 0.02} 2%|▏ | 620/34278 [44:30<36:01:06, 3.85s/it] 2%|▏ | 621/34278 [44:33<33:39:18, 3.60s/it] {'loss': 0.2369, 'grad_norm': 1.334388794918696, 'learning_rate': 6.034985422740526e-06, 'epoch': 0.02} 2%|▏ | 621/34278 [44:34<33:39:18, 3.60s/it] 2%|▏ | 622/34278 [44:37<32:32:19, 3.48s/it] {'loss': 0.2287, 'grad_norm': 1.2956117401686313, 'learning_rate': 6.044703595724004e-06, 'epoch': 0.02} 2%|▏ | 622/34278 [44:37<32:32:19, 3.48s/it] 2%|▏ | 623/34278 [44:40<32:23:39, 3.47s/it] {'loss': 0.2498, 'grad_norm': 1.0273512689823068, 'learning_rate': 6.054421768707484e-06, 'epoch': 0.02} 2%|▏ | 623/34278 [44:40<32:23:39, 3.47s/it] 2%|▏ | 624/34278 [44:44<32:52:32, 3.52s/it] {'loss': 0.2438, 'grad_norm': 1.2532451149411814, 'learning_rate': 6.064139941690963e-06, 'epoch': 0.02} 2%|▏ | 624/34278 [44:44<32:52:32, 3.52s/it] 2%|▏ | 625/34278 [44:49<38:09:20, 4.08s/it] {'loss': 0.2363, 'grad_norm': 1.2453640640748649, 'learning_rate': 6.073858114674441e-06, 'epoch': 0.02} 2%|▏ | 625/34278 [44:49<38:09:20, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 626/34278 [44:55<44:07:15, 4.72s/it] {'loss': 0.2254, 'grad_norm': 1.0258445071379283, 'learning_rate': 6.083576287657921e-06, 'epoch': 0.02} 2%|▏ | 626/34278 [44:55<44:07:15, 4.72s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 627/34278 [45:00<44:14:19, 4.73s/it] {'loss': 0.2287, 'grad_norm': 1.1176272783245296, 'learning_rate': 6.0932944606414e-06, 'epoch': 0.02} 2%|▏ | 627/34278 [45:00<44:14:19, 4.73s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 628/34278 [45:04<41:59:53, 4.49s/it] {'loss': 0.232, 'grad_norm': 0.9938821905647645, 'learning_rate': 6.1030126336248785e-06, 'epoch': 0.02} 2%|▏ | 628/34278 [45:04<41:59:53, 4.49s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 629/34278 [45:08<39:32:06, 4.23s/it] {'loss': 0.2254, 'grad_norm': 1.2124939329163622, 'learning_rate': 6.112730806608358e-06, 'epoch': 0.02} 2%|▏ | 629/34278 [45:08<39:32:06, 4.23s/it] 2%|▏ | 630/34278 [45:11<35:57:48, 3.85s/it] {'loss': 0.2331, 'grad_norm': 1.2653067642151536, 'learning_rate': 6.122448979591837e-06, 'epoch': 0.02} 2%|▏ | 630/34278 [45:11<35:57:48, 3.85s/it] 2%|▏ | 631/34278 [45:14<35:23:56, 3.79s/it] {'loss': 0.2646, 'grad_norm': 1.2801920844112518, 'learning_rate': 6.132167152575316e-06, 'epoch': 0.02} 2%|▏ | 631/34278 [45:14<35:23:56, 3.79s/it] 2%|▏ | 632/34278 [45:17<33:35:35, 3.59s/it] {'loss': 0.2271, 'grad_norm': 1.3997481743012654, 'learning_rate': 6.141885325558795e-06, 'epoch': 0.02} 2%|▏ | 632/34278 [45:17<33:35:35, 3.59s/it] 2%|▏ | 633/34278 [45:21<33:26:05, 3.58s/it] {'loss': 0.231, 'grad_norm': 1.0788089008051882, 'learning_rate': 6.151603498542274e-06, 'epoch': 0.02} 2%|▏ | 633/34278 [45:21<33:26:05, 3.58s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 634/34278 [45:27<40:02:39, 4.28s/it] {'loss': 0.2745, 'grad_norm': 1.1463139231059625, 'learning_rate': 6.161321671525754e-06, 'epoch': 0.02} 2%|▏ | 634/34278 [45:27<40:02:39, 4.28s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 635/34278 [45:32<41:44:50, 4.47s/it] {'loss': 0.2569, 'grad_norm': 1.5979156001730184, 'learning_rate': 6.171039844509232e-06, 'epoch': 0.02} 2%|▏ | 635/34278 [45:32<41:44:50, 4.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 636/34278 [45:36<39:39:01, 4.24s/it] {'loss': 0.2307, 'grad_norm': 1.179192085643085, 'learning_rate': 6.180758017492712e-06, 'epoch': 0.02} 2%|▏ | 636/34278 [45:36<39:39:01, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 637/34278 [45:39<36:54:44, 3.95s/it] {'loss': 0.2272, 'grad_norm': 0.9235492089736143, 'learning_rate': 6.1904761904761914e-06, 'epoch': 0.02} 2%|▏ | 637/34278 [45:39<36:54:44, 3.95s/it] 2%|▏ | 638/34278 [45:43<36:46:44, 3.94s/it] {'loss': 0.2366, 'grad_norm': 1.3841914000140947, 'learning_rate': 6.200194363459671e-06, 'epoch': 0.02} 2%|▏ | 638/34278 [45:43<36:46:44, 3.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 639/34278 [45:46<33:49:52, 3.62s/it] {'loss': 0.2582, 'grad_norm': 1.6345443503321042, 'learning_rate': 6.209912536443149e-06, 'epoch': 0.02} 2%|▏ | 639/34278 [45:46<33:49:52, 3.62s/it] 2%|▏ | 640/34278 [45:49<33:42:26, 3.61s/it] {'loss': 0.2453, 'grad_norm': 0.991943928840273, 'learning_rate': 6.2196307094266285e-06, 'epoch': 0.02} 2%|▏ | 640/34278 [45:49<33:42:26, 3.61s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 641/34278 [45:54<37:09:26, 3.98s/it] {'loss': 0.2411, 'grad_norm': 1.0783813604538777, 'learning_rate': 6.229348882410108e-06, 'epoch': 0.02} 2%|▏ | 641/34278 [45:54<37:09:26, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 642/34278 [45:57<35:23:16, 3.79s/it] {'loss': 0.2298, 'grad_norm': 1.1270768208689745, 'learning_rate': 6.239067055393586e-06, 'epoch': 0.02} 2%|▏ | 642/34278 [45:57<35:23:16, 3.79s/it] 2%|▏ | 643/34278 [46:02<38:51:28, 4.16s/it] {'loss': 0.2603, 'grad_norm': 1.5482563859839245, 'learning_rate': 6.248785228377066e-06, 'epoch': 0.02} 2%|▏ | 643/34278 [46:02<38:51:28, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 644/34278 [46:06<36:07:54, 3.87s/it] {'loss': 0.2231, 'grad_norm': 1.1322433743515374, 'learning_rate': 6.258503401360545e-06, 'epoch': 0.02} 2%|▏ | 644/34278 [46:06<36:07:54, 3.87s/it] 2%|▏ | 645/34278 [46:08<33:26:25, 3.58s/it] {'loss': 0.2245, 'grad_norm': 1.1655810249844738, 'learning_rate': 6.268221574344023e-06, 'epoch': 0.02} 2%|▏ | 645/34278 [46:08<33:26:25, 3.58s/it] 2%|▏ | 646/34278 [46:13<35:41:13, 3.82s/it] {'loss': 0.2509, 'grad_norm': 1.302144670429167, 'learning_rate': 6.277939747327503e-06, 'epoch': 0.02} 2%|▏ | 646/34278 [46:13<35:41:13, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 647/34278 [46:16<34:27:05, 3.69s/it] {'loss': 0.2391, 'grad_norm': 1.3650227046508312, 'learning_rate': 6.287657920310982e-06, 'epoch': 0.02} 2%|▏ | 647/34278 [46:16<34:27:05, 3.69s/it] 2%|▏ | 648/34278 [46:19<32:56:12, 3.53s/it] {'loss': 0.2299, 'grad_norm': 1.2382916532992942, 'learning_rate': 6.297376093294461e-06, 'epoch': 0.02} 2%|▏ | 648/34278 [46:19<32:56:12, 3.53s/it] 2%|▏ | 649/34278 [46:24<36:28:56, 3.91s/it] {'loss': 0.2179, 'grad_norm': 1.2406123637674724, 'learning_rate': 6.30709426627794e-06, 'epoch': 0.02} 2%|▏ | 649/34278 [46:24<36:28:56, 3.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 650/34278 [46:29<39:04:37, 4.18s/it] {'loss': 0.228, 'grad_norm': 1.1771581053389553, 'learning_rate': 6.316812439261419e-06, 'epoch': 0.02} 2%|▏ | 650/34278 [46:29<39:04:37, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 651/34278 [46:35<43:11:56, 4.62s/it] {'loss': 0.2345, 'grad_norm': 1.1978049273503264, 'learning_rate': 6.326530612244899e-06, 'epoch': 0.02} 2%|▏ | 651/34278 [46:35<43:11:56, 4.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 652/34278 [46:38<39:11:18, 4.20s/it] {'loss': 0.2301, 'grad_norm': 1.0438236362338311, 'learning_rate': 6.336248785228377e-06, 'epoch': 0.02} 2%|▏ | 652/34278 [46:38<39:11:18, 4.20s/it] 2%|▏ | 653/34278 [46:42<38:00:35, 4.07s/it] {'loss': 0.2241, 'grad_norm': 1.2392735048781178, 'learning_rate': 6.345966958211857e-06, 'epoch': 0.02} 2%|▏ | 653/34278 [46:42<38:00:35, 4.07s/it] 2%|▏ | 654/34278 [46:46<39:13:19, 4.20s/it] {'loss': 0.2511, 'grad_norm': 1.2713076329828823, 'learning_rate': 6.355685131195336e-06, 'epoch': 0.02} 2%|▏ | 654/34278 [46:46<39:13:19, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
 2%|▏ | 655/34278 [46:53<45:44:23, 4.90s/it] {'loss': 0.2352, 'grad_norm': 1.0998516143216948, 'learning_rate': 6.365403304178814e-06, 'epoch': 0.02}
 2%|▏ | 655/34278 [46:53<45:44:23, 4.90s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fca0c9ab060>
Failed to fetch sample 3641703.
Exception: cannot identify image file <_io.BytesIO object at 0x7fca0c9ab060> 2%|▏ | 656/34278 [46:57<43:31:59, 4.66s/it] {'loss': 0.2953, 'grad_norm': 1.4780548053617104, 'learning_rate': 6.375121477162294e-06, 'epoch': 0.02} 2%|▏ | 656/34278 [46:57<43:31:59, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 657/34278 [47:01<41:09:30, 4.41s/it] {'loss': 0.2081, 'grad_norm': 1.0594420018595556, 'learning_rate': 6.384839650145773e-06, 'epoch': 0.02} 2%|▏ | 657/34278 [47:01<41:09:30, 4.41s/it] 2%|▏ | 658/34278 [47:07<46:09:30, 4.94s/it] {'loss': 0.2488, 'grad_norm': 1.0429732466991684, 'learning_rate': 6.394557823129253e-06, 'epoch': 0.02} 2%|▏ | 658/34278 [47:07<46:09:30, 4.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f30759931f0>
Failed to fetch sample 2701721.
Exception: cannot identify image file <_io.BytesIO object at 0x7f30759931f0> 2%|▏ | 659/34278 [47:13<48:59:38, 5.25s/it] {'loss': 0.2444, 'grad_norm': 1.23883413417218, 'learning_rate': 6.404275996112731e-06, 'epoch': 0.02} 2%|▏ | 659/34278 [47:13<48:59:38, 5.25s/it] 2%|▏ | 660/34278 [47:16<42:17:49, 4.53s/it] {'loss': 0.2385, 'grad_norm': 1.2195983025524706, 'learning_rate': 6.4139941690962105e-06, 'epoch': 0.02} 2%|▏ | 660/34278 [47:16<42:17:49, 4.53s/it] 2%|▏ | 661/34278 [47:19<38:37:01, 4.14s/it] {'loss': 0.2599, 'grad_norm': 1.4125414694072445, 'learning_rate': 6.42371234207969e-06, 'epoch': 0.02} 2%|▏ | 661/34278 [47:19<38:37:01, 4.14s/it] 2%|▏ | 662/34278 [47:23<38:05:28, 4.08s/it] {'loss': 0.2275, 'grad_norm': 1.3614194178922787, 'learning_rate': 6.433430515063168e-06, 'epoch': 0.02} 2%|▏ | 662/34278 [47:23<38:05:28, 4.08s/it] 2%|▏ | 663/34278 [47:26<35:49:10, 3.84s/it] {'loss': 0.2279, 'grad_norm': 1.2375400770445757, 'learning_rate': 6.443148688046648e-06, 'epoch': 0.02} 2%|▏ | 663/34278 [47:26<35:49:10, 3.84s/it] 2%|▏ | 664/34278 [47:30<36:35:35, 3.92s/it] {'loss': 0.2657, 'grad_norm': 1.541586033440628, 'learning_rate': 6.452866861030127e-06, 'epoch': 0.02} 2%|▏ | 664/34278 [47:30<36:35:35, 3.92s/it] 2%|▏ | 665/34278 [47:34<35:51:10, 3.84s/it] {'loss': 0.2331, 'grad_norm': 1.2173335817869875, 'learning_rate': 6.462585034013606e-06, 'epoch': 0.02} 2%|▏ | 665/34278 [47:34<35:51:10, 3.84s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 666/34278 [47:37<33:42:19, 3.61s/it] {'loss': 0.2238, 'grad_norm': 0.9510361983667885, 'learning_rate': 6.472303206997085e-06, 'epoch': 0.02} 2%|▏ | 666/34278 [47:37<33:42:19, 3.61s/it] 2%|▏ | 667/34278 [47:40<32:16:30, 3.46s/it] {'loss': 0.2486, 'grad_norm': 1.2748878906489571, 'learning_rate': 6.482021379980564e-06, 'epoch': 0.02} 2%|▏ | 667/34278 [47:40<32:16:30, 3.46s/it] 2%|▏ | 668/34278 [47:46<39:29:14, 4.23s/it] {'loss': 0.2657, 'grad_norm': 1.4634602527052192, 'learning_rate': 6.491739552964043e-06, 'epoch': 0.02} 2%|▏ | 668/34278 [47:46<39:29:14, 4.23s/it] 2%|▏ | 669/34278 [47:49<36:57:41, 3.96s/it] {'loss': 0.2149, 'grad_norm': 0.899820820063537, 'learning_rate': 6.501457725947522e-06, 'epoch': 0.02} 2%|▏ | 669/34278 [47:49<36:57:41, 3.96s/it] 2%|▏ | 670/34278 [47:52<33:41:09, 3.61s/it] {'loss': 0.2526, 'grad_norm': 1.1159452835771118, 'learning_rate': 6.511175898931001e-06, 'epoch': 0.02} 2%|▏ | 670/34278 [47:52<33:41:09, 3.61s/it] 2%|▏ | 671/34278 [47:56<33:37:51, 3.60s/it] {'loss': 0.2368, 'grad_norm': 1.8587001243768357, 'learning_rate': 6.520894071914481e-06, 'epoch': 0.02} 2%|▏ | 671/34278 [47:56<33:37:51, 3.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 672/34278 [48:01<38:27:37, 4.12s/it] {'loss': 0.2454, 'grad_norm': 1.1995907976290137, 'learning_rate': 6.530612244897959e-06, 'epoch': 0.02} 2%|▏ | 672/34278 [48:01<38:27:37, 4.12s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 673/34278 [48:04<35:38:08, 3.82s/it] {'loss': 0.2436, 'grad_norm': 1.1396371491198773, 'learning_rate': 6.540330417881439e-06, 'epoch': 0.02} 2%|▏ | 673/34278 [48:04<35:38:08, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 674/34278 [48:07<33:59:48, 3.64s/it] {'loss': 0.2352, 'grad_norm': 1.0339295633577104, 'learning_rate': 6.550048590864918e-06, 'epoch': 0.02} 2%|▏ | 674/34278 [48:07<33:59:48, 3.64s/it] 2%|▏ | 675/34278 [48:11<34:50:20, 3.73s/it] {'loss': 0.2559, 'grad_norm': 1.3780473198007388, 'learning_rate': 6.559766763848398e-06, 'epoch': 0.02} 2%|▏ | 675/34278 [48:11<34:50:20, 3.73s/it] 2%|▏ | 676/34278 [48:14<33:01:17, 3.54s/it] {'loss': 0.2264, 'grad_norm': 1.0840698315294541, 'learning_rate': 6.569484936831876e-06, 'epoch': 0.02} 2%|▏ | 676/34278 [48:14<33:01:17, 3.54s/it] 2%|▏ | 677/34278 [48:17<31:43:14, 3.40s/it] {'loss': 0.2306, 'grad_norm': 1.1117692534296524, 'learning_rate': 6.579203109815355e-06, 'epoch': 0.02} 2%|▏ | 677/34278 [48:17<31:43:14, 3.40s/it] 2%|▏ | 678/34278 [48:21<31:08:56, 3.34s/it] {'loss': 0.2173, 'grad_norm': 1.2177199856031593, 'learning_rate': 6.588921282798835e-06, 'epoch': 0.02} 2%|▏ | 678/34278 [48:21<31:08:56, 3.34s/it] 2%|▏ | 679/34278 [48:25<33:17:51, 3.57s/it] {'loss': 0.2632, 'grad_norm': 1.4263377351884565, 'learning_rate': 6.598639455782313e-06, 'epoch': 0.02} 2%|▏ | 679/34278 [48:25<33:17:51, 3.57s/it] 2%|▏ | 680/34278 [48:28<32:22:21, 3.47s/it] {'loss': 0.2449, 'grad_norm': 1.1117249106111613, 'learning_rate': 6.6083576287657925e-06, 'epoch': 0.02} 2%|▏ | 680/34278 [48:28<32:22:21, 3.47s/it] 2%|▏ | 681/34278 [48:32<33:07:09, 3.55s/it] {'loss': 0.2645, 'grad_norm': 1.3031984868239914, 'learning_rate': 6.618075801749272e-06, 'epoch': 0.02} 2%|▏ | 681/34278 [48:32<33:07:09, 
3.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 682/34278 [48:36<34:13:06, 3.67s/it] {'loss': 0.2322, 'grad_norm': 1.3056784859426693, 'learning_rate': 6.627793974732751e-06, 'epoch': 0.02} 2%|▏ | 682/34278 [48:36<34:13:06, 3.67s/it] 2%|▏ | 683/34278 [48:40<35:43:50, 3.83s/it] {'loss': 0.2255, 'grad_norm': 0.9106923138955538, 'learning_rate': 6.6375121477162296e-06, 'epoch': 0.02} 2%|▏ | 683/34278 [48:40<35:43:50, 3.83s/it] 2%|▏ | 684/34278 [48:46<42:17:29, 4.53s/it] {'loss': 0.2543, 'grad_norm': 1.154730924057303, 'learning_rate': 6.647230320699709e-06, 'epoch': 0.02} 2%|▏ | 684/34278 [48:46<42:17:29, 4.53s/it] 2%|▏ | 685/34278 [48:50<39:21:53, 4.22s/it] {'loss': 0.2235, 'grad_norm': 1.040696964643687, 'learning_rate': 6.656948493683188e-06, 'epoch': 0.02} 2%|▏ | 685/34278 [48:50<39:21:53, 4.22s/it] 2%|▏ | 686/34278 [48:53<35:51:20, 3.84s/it] {'loss': 0.2393, 'grad_norm': 1.1673430548551929, 'learning_rate': 6.666666666666667e-06, 'epoch': 0.02} 2%|▏ | 686/34278 [48:53<35:51:20, 3.84s/it] 2%|▏ | 687/34278 [48:58<39:43:35, 4.26s/it] {'loss': 0.2178, 'grad_norm': 1.0125759625993465, 'learning_rate': 6.676384839650146e-06, 'epoch': 0.02} 2%|▏ | 687/34278 [48:58<39:43:35, 4.26s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 688/34278 [49:02<38:22:13, 4.11s/it] {'loss': 0.252, 'grad_norm': 1.3676924651707596, 'learning_rate': 6.686103012633626e-06, 'epoch': 0.02} 2%|▏ | 688/34278 [49:02<38:22:13, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 689/34278 [49:06<40:48:59, 4.37s/it] {'loss': 0.2342, 'grad_norm': 1.1636943279474745, 'learning_rate': 6.695821185617104e-06, 'epoch': 0.02} 2%|▏ | 689/34278 [49:06<40:48:59, 4.37s/it] 2%|▏ | 690/34278 [49:10<38:34:08, 4.13s/it] {'loss': 0.2505, 'grad_norm': 1.2277204714048011, 'learning_rate': 6.705539358600584e-06, 'epoch': 0.02} 2%|▏ | 690/34278 [49:10<38:34:08, 4.13s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 691/34278 [49:13<35:13:51, 3.78s/it] {'loss': 0.2369, 'grad_norm': 1.2978532674924002, 'learning_rate': 6.715257531584063e-06, 'epoch': 0.02} 2%|▏ | 691/34278 [49:13<35:13:51, 3.78s/it] 2%|▏ | 692/34278 [49:16<32:55:30, 3.53s/it] {'loss': 0.2515, 'grad_norm': 1.1769595666714598, 'learning_rate': 6.7249757045675425e-06, 'epoch': 0.02} 2%|▏ | 692/34278 [49:16<32:55:30, 3.53s/it] 2%|▏ | 693/34278 [49:19<32:04:29, 3.44s/it] {'loss': 0.2428, 'grad_norm': 1.1000942989695253, 'learning_rate': 6.734693877551021e-06, 'epoch': 0.02} 2%|▏ | 693/34278 [49:19<32:04:29, 3.44s/it] 2%|▏ | 694/34278 [49:24<36:13:09, 3.88s/it] {'loss': 0.1993, 'grad_norm': 0.8371113735131498, 'learning_rate': 6.7444120505345e-06, 'epoch': 0.02} 2%|▏ | 694/34278 [49:24<36:13:09, 3.88s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 695/34278 [49:28<36:19:04, 3.89s/it] {'loss': 0.2342, 'grad_norm': 1.120861467251313, 'learning_rate': 6.75413022351798e-06, 'epoch': 0.02} 2%|▏ | 695/34278 [49:28<36:19:04, 3.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 696/34278 [49:32<37:20:37, 4.00s/it] {'loss': 0.2589, 'grad_norm': 1.0814084397294184, 'learning_rate': 6.763848396501458e-06, 'epoch': 0.02} 2%|▏ | 696/34278 [49:32<37:20:37, 4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 697/34278 [49:35<34:54:20, 3.74s/it] {'loss': 0.2288, 'grad_norm': 1.1083675959863164, 'learning_rate': 6.773566569484937e-06, 'epoch': 0.02} 2%|▏ | 697/34278 [49:35<34:54:20, 3.74s/it] 2%|▏ | 698/34278 [49:39<33:15:08, 3.56s/it] {'loss': 0.2873, 'grad_norm': 1.328044711803676, 'learning_rate': 6.783284742468417e-06, 'epoch': 0.02} 2%|▏ | 698/34278 [49:39<33:15:08, 3.56s/it] 2%|▏ | 699/34278 [49:42<31:35:10, 3.39s/it] {'loss': 0.2384, 'grad_norm': 1.2457702284333594, 'learning_rate': 6.793002915451895e-06, 'epoch': 0.02} 2%|▏ | 699/34278 [49:42<31:35:10, 3.39s/it] 2%|▏ | 700/34278 [49:46<34:33:46, 3.71s/it] {'loss': 0.2123, 'grad_norm': 0.9138010870235221, 'learning_rate': 6.8027210884353745e-06, 'epoch': 0.02} 2%|▏ | 700/34278 [49:46<34:33:46, 3.71s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 701/34278 [49:49<33:21:11, 3.58s/it] {'loss': 0.2563, 'grad_norm': 1.0760978257264822, 'learning_rate': 6.812439261418854e-06, 'epoch': 0.02} 2%|▏ | 701/34278 [49:49<33:21:11, 3.58s/it] 2%|▏ | 702/34278 [49:52<31:56:27, 3.42s/it] {'loss': 0.2572, 'grad_norm': 1.3198861326415898, 'learning_rate': 6.822157434402333e-06, 'epoch': 0.02} 2%|▏ | 702/34278 [49:52<31:56:27, 3.42s/it] 2%|▏ | 703/34278 [49:56<31:21:58, 3.36s/it] {'loss': 0.2379, 'grad_norm': 1.4575455604532646, 'learning_rate': 6.8318756073858115e-06, 'epoch': 0.02} 2%|▏ | 703/34278 [49:56<31:21:58, 3.36s/it] 2%|▏ | 704/34278 [49:59<30:35:00, 3.28s/it] {'loss': 0.2112, 'grad_norm': 1.109333016897937, 'learning_rate': 6.841593780369291e-06, 'epoch': 0.02} 2%|▏ | 704/34278 [49:59<30:35:00, 3.28s/it] 2%|▏ | 705/34278 [50:02<31:43:52, 3.40s/it] {'loss': 0.2145, 'grad_norm': 1.0302927524542895, 'learning_rate': 6.85131195335277e-06, 'epoch': 0.02} 2%|▏ | 705/34278 [50:02<31:43:52, 3.40s/it] 2%|▏ | 706/34278 [50:06<31:48:13, 3.41s/it] {'loss': 0.2866, 'grad_norm': 1.2909537063625136, 'learning_rate': 6.861030126336249e-06, 'epoch': 0.02} 2%|▏ | 706/34278 [50:06<31:48:13, 3.41s/it] 2%|▏ | 707/34278 [50:10<33:17:01, 3.57s/it] {'loss': 0.2486, 'grad_norm': 1.6093127500609739, 'learning_rate': 6.870748299319728e-06, 'epoch': 0.02} 2%|▏ | 707/34278 [50:10<33:17:01, 3.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 708/34278 [50:16<40:13:16, 4.31s/it] {'loss': 0.2341, 'grad_norm': 1.2878326694626485, 'learning_rate': 6.880466472303208e-06, 'epoch': 0.02} 2%|▏ | 708/34278 [50:16<40:13:16, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 709/34278 [50:19<37:23:21, 4.01s/it] {'loss': 0.2422, 'grad_norm': 1.1876576872307236, 'learning_rate': 6.890184645286687e-06, 'epoch': 0.02} 2%|▏ | 709/34278 [50:19<37:23:21, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 710/34278 [50:23<36:06:35, 3.87s/it] {'loss': 0.226, 'grad_norm': 1.2053349767690207, 'learning_rate': 6.899902818270166e-06, 'epoch': 0.02} 2%|▏ | 710/34278 [50:23<36:06:35, 3.87s/it] 2%|▏ | 711/34278 [50:26<34:45:42, 3.73s/it] {'loss': 0.2573, 'grad_norm': 1.1621295917851582, 'learning_rate': 6.909620991253645e-06, 'epoch': 0.02} 2%|▏ | 711/34278 [50:26<34:45:42, 3.73s/it] 2%|▏ | 712/34278 [50:29<33:45:34, 3.62s/it] {'loss': 0.2219, 'grad_norm': 1.0779227597831955, 'learning_rate': 6.9193391642371245e-06, 'epoch': 0.02} 2%|▏ | 712/34278 [50:29<33:45:34, 3.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 713/34278 [50:34<35:53:51, 3.85s/it] {'loss': 0.2435, 'grad_norm': 1.0600744780537699, 'learning_rate': 6.929057337220603e-06, 'epoch': 0.02} 2%|▏ | 713/34278 [50:34<35:53:51, 3.85s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 714/34278 [50:37<34:50:58, 3.74s/it] {'loss': 0.2448, 'grad_norm': 1.1870402162790261, 'learning_rate': 6.938775510204082e-06, 'epoch': 0.02} 2%|▏ | 714/34278 [50:37<34:50:58, 3.74s/it] 2%|▏ | 715/34278 [50:41<34:53:37, 3.74s/it] {'loss': 0.241, 'grad_norm': 1.15577933444016, 'learning_rate': 6.948493683187562e-06, 'epoch': 0.02} 2%|▏ | 715/34278 [50:41<34:53:37, 3.74s/it] 2%|▏ | 716/34278 [50:45<34:46:22, 3.73s/it] {'loss': 0.2184, 'grad_norm': 1.0385025420205705, 'learning_rate': 6.95821185617104e-06, 'epoch': 0.02} 2%|▏ | 716/34278 [50:45<34:46:22, 3.73s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 717/34278 [50:48<33:13:20, 3.56s/it] {'loss': 0.2529, 'grad_norm': 1.1981330447897032, 'learning_rate': 6.967930029154519e-06, 'epoch': 0.02} 2%|▏ | 717/34278 [50:48<33:13:20, 3.56s/it] 2%|▏ | 718/34278 [50:51<31:51:44, 3.42s/it] {'loss': 0.2406, 'grad_norm': 1.3893074797081237, 'learning_rate': 6.977648202137999e-06, 'epoch': 0.02} 2%|▏ | 718/34278 [50:51<31:51:44, 3.42s/it] 2%|▏ | 719/34278 [50:54<31:42:02, 3.40s/it] {'loss': 0.2499, 'grad_norm': 1.3478510367178977, 'learning_rate': 6.987366375121478e-06, 'epoch': 0.02} 2%|▏ | 719/34278 [50:54<31:42:02, 3.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 720/34278 [50:58<33:12:15, 3.56s/it] {'loss': 0.2249, 'grad_norm': 1.2290443600628422, 'learning_rate': 6.9970845481049564e-06, 'epoch': 0.02} 2%|▏ | 720/34278 [50:58<33:12:15, 3.56s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 721/34278 [51:04<39:51:48, 4.28s/it] {'loss': 0.234, 'grad_norm': 1.223682726594295, 'learning_rate': 7.006802721088436e-06, 'epoch': 0.02} 2%|▏ | 721/34278 [51:04<39:51:48, 4.28s/it] 2%|▏ | 722/34278 [51:07<36:45:15, 3.94s/it] {'loss': 0.2158, 'grad_norm': 1.2519725592077438, 'learning_rate': 7.016520894071915e-06, 'epoch': 0.02} 2%|▏ | 722/34278 [51:07<36:45:15, 3.94s/it] 2%|▏ | 723/34278 [51:11<35:21:05, 3.79s/it] {'loss': 0.2625, 'grad_norm': 1.0305745510312196, 'learning_rate': 7.0262390670553935e-06, 'epoch': 0.02} 2%|▏ | 723/34278 [51:11<35:21:05, 3.79s/it] 2%|▏ | 724/34278 [51:14<32:47:37, 3.52s/it] {'loss': 0.2494, 'grad_norm': 1.4914540459963992, 'learning_rate': 7.035957240038873e-06, 'epoch': 0.02} 2%|▏ | 724/34278 [51:14<32:47:37, 3.52s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8296 > 8192). Running this sequence through the model will result in indexing errors 2%|▏ | 725/34278 [51:18<34:13:48, 3.67s/it] {'loss': 0.2352, 'grad_norm': 1.4949867206265093, 'learning_rate': 7.045675413022353e-06, 'epoch': 0.02} 2%|▏ | 725/34278 [51:18<34:13:48, 3.67s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 726/34278 [51:22<35:35:30, 3.82s/it] {'loss': 0.2236, 'grad_norm': 1.3330272613270544, 'learning_rate': 7.055393586005832e-06, 'epoch': 0.02} 2%|▏ | 726/34278 [51:22<35:35:30, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 727/34278 [51:26<35:11:23, 3.78s/it] {'loss': 0.2502, 'grad_norm': 1.1526352996187208, 'learning_rate': 7.06511175898931e-06, 'epoch': 0.02} 2%|▏ | 727/34278 [51:26<35:11:23, 3.78s/it] 2%|▏ | 728/34278 [51:28<32:49:48, 3.52s/it] {'loss': 0.2195, 'grad_norm': 1.166306458061035, 'learning_rate': 7.07482993197279e-06, 'epoch': 0.02} 2%|▏ | 728/34278 [51:28<32:49:48, 3.52s/it] 2%|▏ | 729/34278 [51:34<39:40:36, 4.26s/it] {'loss': 0.2595, 'grad_norm': 1.2374836997108423, 'learning_rate': 7.084548104956269e-06, 'epoch': 0.02} 2%|▏ | 729/34278 [51:34<39:40:36, 4.26s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 730/34278 [51:38<36:37:44, 3.93s/it] {'loss': 0.2235, 'grad_norm': 1.0946137599976848, 'learning_rate': 7.094266277939748e-06, 'epoch': 0.02} 2%|▏ | 730/34278 [51:38<36:37:44, 3.93s/it] 2%|▏ | 731/34278 [51:43<41:24:11, 4.44s/it] {'loss': 0.2517, 'grad_norm': 1.3542403105266712, 'learning_rate': 7.103984450923227e-06, 'epoch': 0.02} 2%|▏ | 731/34278 [51:43<41:24:11, 4.44s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8221 > 8192). 
Running this sequence through the model will result in indexing errors 2%|▏ | 732/34278 [51:50<46:53:19, 5.03s/it] {'loss': 0.259, 'grad_norm': 1.3603040073749126, 'learning_rate': 7.1137026239067065e-06, 'epoch': 0.02} 2%|▏ | 732/34278 [51:50<46:53:19, 5.03s/it] 2%|▏ | 733/34278 [51:54<45:07:08, 4.84s/it] {'loss': 0.2511, 'grad_norm': 1.148411391960263, 'learning_rate': 7.123420796890185e-06, 'epoch': 0.02} 2%|▏ | 733/34278 [51:54<45:07:08, 4.84s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 734/34278 [51:57<40:39:48, 4.36s/it] {'loss': 0.2233, 'grad_norm': 1.3990863614174234, 'learning_rate': 7.133138969873664e-06, 'epoch': 0.02} 2%|▏ | 734/34278 [51:57<40:39:48, 4.36s/it] 2%|▏ | 735/34278 [52:00<36:24:03, 3.91s/it] {'loss': 0.2483, 'grad_norm': 1.1289883923040358, 'learning_rate': 7.1428571428571436e-06, 'epoch': 0.02} 2%|▏ | 735/34278 [52:00<36:24:03, 3.91s/it] 2%|▏ | 736/34278 [52:04<35:02:14, 3.76s/it] {'loss': 0.2566, 'grad_norm': 1.592308270330597, 'learning_rate': 7.152575315840623e-06, 'epoch': 0.02} 2%|▏ | 736/34278 [52:04<35:02:14, 3.76s/it] 2%|▏ | 737/34278 [52:07<34:51:41, 3.74s/it] {'loss': 0.2244, 'grad_norm': 1.1676915593536734, 'learning_rate': 7.162293488824101e-06, 'epoch': 0.02} 2%|▏ | 737/34278 [52:07<34:51:41, 3.74s/it] 2%|▏ | 738/34278 [52:11<34:16:39, 3.68s/it] {'loss': 0.2694, 'grad_norm': 1.434217925251431, 'learning_rate': 7.172011661807581e-06, 'epoch': 0.02} 2%|▏ | 738/34278 [52:11<34:16:39, 3.68s/it] 2%|▏ | 739/34278 [52:14<33:46:36, 3.63s/it] {'loss': 0.2445, 'grad_norm': 1.3398064548223516, 'learning_rate': 7.18172983479106e-06, 
'epoch': 0.02} 2%|▏ | 739/34278 [52:14<33:46:36, 3.63s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 740/34278 [52:18<35:07:24, 3.77s/it] {'loss': 0.2387, 'grad_norm': 1.0631062210348727, 'learning_rate': 7.191448007774538e-06, 'epoch': 0.02} 2%|▏ | 740/34278 [52:18<35:07:24, 3.77s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 741/34278 [52:21<32:33:17, 3.49s/it] {'loss': 0.2128, 'grad_norm': 1.2419787284853856, 'learning_rate': 7.201166180758018e-06, 'epoch': 0.02} 2%|▏ | 741/34278 [52:21<32:33:17, 3.49s/it] 2%|▏ | 742/34278 [52:24<31:49:05, 3.42s/it] {'loss': 0.2189, 'grad_norm': 1.347170823544848, 'learning_rate': 7.210884353741497e-06, 'epoch': 0.02} 2%|▏ | 742/34278 [52:24<31:49:05, 3.42s/it] 2%|▏ | 743/34278 [52:31<39:43:56, 4.27s/it] {'loss': 0.2274, 'grad_norm': 1.1052609255450854, 'learning_rate': 7.2206025267249755e-06, 'epoch': 0.02} 2%|▏ | 743/34278 [52:31<39:43:56, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 744/34278 [52:34<36:59:43, 3.97s/it] {'loss': 0.2621, 'grad_norm': 0.9141740311729317, 'learning_rate': 7.230320699708455e-06, 'epoch': 0.02} 2%|▏ | 744/34278 [52:34<36:59:43, 3.97s/it] 2%|▏ | 745/34278 [52:37<34:41:27, 3.72s/it] {'loss': 0.2294, 'grad_norm': 1.2036279072941747, 'learning_rate': 7.240038872691935e-06, 'epoch': 0.02} 2%|▏ | 745/34278 [52:37<34:41:27, 3.72s/it] 2%|▏ | 746/34278 [52:40<33:21:09, 3.58s/it] {'loss': 0.248, 'grad_norm': 1.1373876492741393, 'learning_rate': 7.249757045675414e-06, 'epoch': 0.02} 2%|▏ | 746/34278 [52:40<33:21:09, 3.58s/it] 2%|▏ | 747/34278 [52:46<40:06:36, 4.31s/it] {'loss': 0.205, 'grad_norm': 1.03097313866654, 'learning_rate': 7.259475218658893e-06, 'epoch': 0.02} 2%|▏ | 747/34278 [52:46<40:06:36, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 748/34278 [52:50<39:25:06, 4.23s/it] {'loss': 0.2117, 'grad_norm': 1.225664083192016, 'learning_rate': 7.269193391642372e-06, 'epoch': 0.02} 2%|▏ | 748/34278 [52:50<39:25:06, 4.23s/it] 2%|▏ | 749/34278 [52:54<36:39:39, 3.94s/it] {'loss': 0.2331, 'grad_norm': 1.116765043842244, 'learning_rate': 7.278911564625851e-06, 'epoch': 0.02} 2%|▏ | 749/34278 [52:54<36:39:39, 3.94s/it] 2%|▏ | 750/34278 [52:57<34:01:51, 3.65s/it] {'loss': 0.2103, 'grad_norm': 0.9894490765307579, 'learning_rate': 7.28862973760933e-06, 'epoch': 0.02} 2%|▏ | 750/34278 [52:57<34:01:51, 3.65s/it] 2%|▏ | 751/34278 [53:01<34:43:48, 3.73s/it] {'loss': 0.2209, 'grad_norm': 0.9377323370832152, 'learning_rate': 7.298347910592809e-06, 'epoch': 0.02} 2%|▏ | 751/34278 [53:01<34:43:48, 3.73s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 752/34278 [53:06<40:02:27, 4.30s/it] {'loss': 0.2073, 'grad_norm': 1.1477138262356188, 'learning_rate': 7.3080660835762885e-06, 'epoch': 0.02} 2%|▏ | 752/34278 [53:06<40:02:27, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 753/34278 [53:10<38:11:46, 4.10s/it] {'loss': 0.2314, 'grad_norm': 1.1800756399150825, 'learning_rate': 7.317784256559768e-06, 'epoch': 0.02} 2%|▏ | 753/34278 [53:10<38:11:46, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 754/34278 [53:13<34:53:39, 3.75s/it] {'loss': 0.2559, 'grad_norm': 1.1324552297353292, 'learning_rate': 7.327502429543246e-06, 'epoch': 0.02} 2%|▏ | 754/34278 [53:13<34:53:39, 3.75s/it] 2%|▏ | 755/34278 [53:17<36:13:47, 3.89s/it] {'loss': 0.2116, 'grad_norm': 1.187821230593947, 'learning_rate': 7.3372206025267255e-06, 'epoch': 0.02} 2%|▏ | 755/34278 [53:17<36:13:47, 3.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 756/34278 [53:21<35:11:45, 3.78s/it] {'loss': 0.2385, 'grad_norm': 1.2814527100250865, 'learning_rate': 7.346938775510205e-06, 'epoch': 0.02} 2%|▏ | 756/34278 [53:21<35:11:45, 3.78s/it] 2%|▏ | 757/34278 [53:24<35:15:37, 3.79s/it] {'loss': 0.2269, 'grad_norm': 1.3137757179370568, 'learning_rate': 7.356656948493683e-06, 'epoch': 0.02} 2%|▏ | 757/34278 [53:24<35:15:37, 3.79s/it] 2%|▏ | 758/34278 [53:28<35:29:17, 3.81s/it] {'loss': 0.2237, 'grad_norm': 1.03965993372393, 'learning_rate': 7.366375121477163e-06, 'epoch': 0.02} 2%|▏ | 758/34278 [53:28<35:29:17, 3.81s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 759/34278 [53:31<33:46:19, 3.63s/it] {'loss': 0.2602, 'grad_norm': 1.2122225918284888, 'learning_rate': 7.376093294460642e-06, 'epoch': 0.02} 2%|▏ | 759/34278 [53:31<33:46:19, 3.63s/it] 2%|▏ | 760/34278 [53:34<32:04:41, 3.45s/it] {'loss': 0.2457, 'grad_norm': 1.1906781306110434, 'learning_rate': 7.38581146744412e-06, 'epoch': 0.02} 2%|▏ | 760/34278 [53:34<32:04:41, 3.45s/it] 2%|▏ | 761/34278 [53:39<36:23:16, 3.91s/it] {'loss': 0.245, 'grad_norm': 1.0016455739396728, 'learning_rate': 7.3955296404276e-06, 'epoch': 0.02} 2%|▏ | 761/34278 [53:39<36:23:16, 3.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 762/34278 [53:43<35:18:31, 3.79s/it] {'loss': 0.2, 'grad_norm': 1.0791268810825152, 'learning_rate': 7.40524781341108e-06, 'epoch': 0.02} 2%|▏ | 762/34278 [53:43<35:18:31, 3.79s/it] 2%|▏ | 763/34278 [53:46<32:33:07, 3.50s/it] {'loss': 0.2267, 'grad_norm': 1.2559405314741636, 'learning_rate': 7.414965986394559e-06, 'epoch': 0.02} 2%|▏ | 763/34278 [53:46<32:33:07, 3.50s/it] 2%|▏ | 764/34278 [53:49<32:20:06, 3.47s/it] {'loss': 0.2468, 'grad_norm': 1.2986118162267253, 'learning_rate': 7.424684159378037e-06, 'epoch': 0.02} 2%|▏ | 764/34278 [53:49<32:20:06, 3.47s/it] 2%|▏ | 765/34278 [53:53<33:29:15, 3.60s/it] {'loss': 0.2304, 'grad_norm': 0.9914341493989796, 'learning_rate': 7.434402332361517e-06, 'epoch': 0.02} 2%|▏ | 765/34278 [53:53<33:29:15, 3.60s/it] 2%|▏ | 766/34278 [53:56<32:21:51, 3.48s/it] {'loss': 0.2341, 'grad_norm': 1.2410376941673065, 'learning_rate': 7.444120505344996e-06, 'epoch': 0.02} 2%|▏ | 766/34278 [53:56<32:21:51, 3.48s/it] 2%|▏ | 767/34278 [53:59<31:34:16, 3.39s/it] {'loss': 0.2419, 'grad_norm': 1.144974483693795, 'learning_rate': 7.453838678328475e-06, 'epoch': 0.02} 2%|▏ | 767/34278 [53:59<31:34:16, 3.39s/it] 2%|▏ | 768/34278 [54:04<35:12:35, 3.78s/it] {'loss': 0.2379, 'grad_norm': 1.0885957617854234, 'learning_rate': 7.463556851311954e-06, 'epoch': 0.02} 2%|▏ | 768/34278 [54:04<35:12:35, 3.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 769/34278 [54:07<33:58:09, 3.65s/it] {'loss': 0.2227, 'grad_norm': 1.1391880660459601, 'learning_rate': 7.473275024295433e-06, 'epoch': 0.02} 2%|▏ | 769/34278 [54:08<33:58:09, 3.65s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 770/34278 [54:11<32:53:22, 3.53s/it] {'loss': 0.2249, 'grad_norm': 1.159826237177272, 'learning_rate': 7.482993197278913e-06, 'epoch': 0.02} 2%|▏ | 770/34278 [54:11<32:53:22, 3.53s/it] 2%|▏ | 771/34278 [54:15<34:42:36, 3.73s/it] {'loss': 0.2464, 'grad_norm': 1.1279372315475507, 'learning_rate': 7.492711370262391e-06, 'epoch': 0.02} 2%|▏ | 771/34278 [54:15<34:42:36, 3.73s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 772/34278 [54:18<33:50:57, 3.64s/it] {'loss': 0.2701, 'grad_norm': 1.0538247174746573, 'learning_rate': 7.5024295432458704e-06, 'epoch': 0.02} 2%|▏ | 772/34278 [54:18<33:50:57, 3.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 773/34278 [54:23<36:37:57, 3.94s/it] {'loss': 0.2694, 'grad_norm': 1.0464884319638112, 'learning_rate': 7.51214771622935e-06, 'epoch': 0.02} 2%|▏ | 773/34278 [54:23<36:37:57, 3.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 774/34278 [54:26<33:56:20, 3.65s/it] {'loss': 0.2483, 'grad_norm': 1.0341781965712196, 'learning_rate': 7.521865889212828e-06, 'epoch': 0.02} 2%|▏ | 774/34278 [54:26<33:56:20, 3.65s/it] 2%|▏ | 775/34278 [54:29<32:05:12, 3.45s/it] {'loss': 0.2214, 'grad_norm': 1.1521353006383643, 'learning_rate': 7.5315840621963075e-06, 'epoch': 0.02} 2%|▏ | 775/34278 [54:29<32:05:12, 3.45s/it] 2%|▏ | 776/34278 [54:32<32:20:17, 3.47s/it] {'loss': 0.2137, 'grad_norm': 1.0305586585976119, 'learning_rate': 7.541302235179787e-06, 'epoch': 0.02} 2%|▏ | 776/34278 [54:32<32:20:17, 3.47s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8318 > 8192). Running this sequence through the model will result in indexing errors 2%|▏ | 777/34278 [54:36<32:25:16, 3.48s/it] {'loss': 0.2167, 'grad_norm': 1.2017679461793598, 'learning_rate': 7.551020408163265e-06, 'epoch': 0.02} 2%|▏ | 777/34278 [54:36<32:25:16, 3.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (9061 > 8192). 
Running this sequence through the model will result in indexing errors 2%|▏ | 778/34278 [54:39<31:50:07, 3.42s/it] {'loss': 0.2342, 'grad_norm': 1.147090709009779, 'learning_rate': 7.560738581146745e-06, 'epoch': 0.02} 2%|▏ | 778/34278 [54:39<31:50:07, 3.42s/it] 2%|▏ | 779/34278 [54:42<31:14:38, 3.36s/it] {'loss': 0.2206, 'grad_norm': 1.0402610823434146, 'learning_rate': 7.570456754130224e-06, 'epoch': 0.02} 2%|▏ | 779/34278 [54:42<31:14:38, 3.36s/it] 2%|▏ | 780/34278 [54:46<30:42:11, 3.30s/it] {'loss': 0.2367, 'grad_norm': 1.31274832001162, 'learning_rate': 7.580174927113704e-06, 'epoch': 0.02} 2%|▏ | 780/34278 [54:46<30:42:11, 3.30s/it] 2%|▏ | 781/34278 [54:49<30:35:10, 3.29s/it] {'loss': 0.2567, 'grad_norm': 1.1955641509643553, 'learning_rate': 7.589893100097182e-06, 'epoch': 0.02} 2%|▏ | 781/34278 [54:49<30:35:10, 3.29s/it] 2%|▏ | 782/34278 [54:55<38:09:33, 4.10s/it] {'loss': 0.2256, 'grad_norm': 1.3969703270390994, 'learning_rate': 7.599611273080662e-06, 'epoch': 0.02} 2%|▏ | 782/34278 [54:55<38:09:33, 4.10s/it] 2%|▏ | 783/34278 [54:59<37:25:06, 4.02s/it] {'loss': 0.2043, 'grad_norm': 1.0412933297362201, 'learning_rate': 7.609329446064141e-06, 'epoch': 0.02} 2%|▏ | 783/34278 [54:59<37:25:06, 4.02s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 784/34278 [55:04<41:07:39, 4.42s/it] {'loss': 0.2195, 'grad_norm': 1.0444517135384463, 'learning_rate': 7.61904761904762e-06, 'epoch': 0.02} 2%|▏ | 784/34278 [55:04<41:07:39, 4.42s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 785/34278 [55:08<38:44:55, 4.16s/it] {'loss': 0.2525, 'grad_norm': 1.3301626783376574, 'learning_rate': 7.628765792031099e-06, 'epoch': 0.02} 2%|▏ | 785/34278 [55:08<38:44:55, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 786/34278 [55:13<42:10:43, 4.53s/it] {'loss': 0.2383, 'grad_norm': 1.2009624863679822, 'learning_rate': 7.638483965014577e-06, 'epoch': 0.02} 2%|▏ | 786/34278 [55:13<42:10:43, 4.53s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 787/34278 [55:17<39:14:41, 4.22s/it] {'loss': 0.2256, 'grad_norm': 1.0323992570020093, 'learning_rate': 7.648202137998057e-06, 'epoch': 0.02} 2%|▏ | 787/34278 [55:17<39:14:41, 4.22s/it] 2%|▏ | 788/34278 [55:23<44:47:44, 4.82s/it] {'loss': 0.2162, 'grad_norm': 1.1705082357772691, 'learning_rate': 7.657920310981536e-06, 'epoch': 0.02} 2%|▏ | 788/34278 [55:23<44:47:44, 4.82s/it] 2%|▏ | 789/34278 [55:27<42:16:35, 4.54s/it] {'loss': 0.2397, 'grad_norm': 1.016935911820381, 'learning_rate': 7.667638483965015e-06, 'epoch': 0.02} 2%|▏ | 789/34278 [55:27<42:16:35, 4.54s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 790/34278 [55:30<38:02:35, 4.09s/it] {'loss': 0.2488, 'grad_norm': 1.2098682858694556, 'learning_rate': 7.677356656948495e-06, 'epoch': 0.02} 2%|▏ | 790/34278 [55:30<38:02:35, 4.09s/it] 2%|▏ | 791/34278 [55:33<35:29:26, 3.82s/it] {'loss': 0.2336, 'grad_norm': 1.1292603128213778, 'learning_rate': 7.687074829931972e-06, 'epoch': 0.02} 2%|▏ | 791/34278 [55:33<35:29:26, 3.82s/it] 2%|▏ | 792/34278 [55:36<32:51:01, 3.53s/it] {'loss': 0.2286, 'grad_norm': 1.063301311419607, 'learning_rate': 7.696793002915453e-06, 'epoch': 0.02} 2%|▏ | 792/34278 [55:36<32:51:01, 3.53s/it] 2%|▏ | 793/34278 [55:42<39:52:17, 4.29s/it] {'loss': 0.2195, 'grad_norm': 1.49425163247432, 'learning_rate': 7.706511175898933e-06, 'epoch': 0.02} 2%|▏ | 793/34278 [55:42<39:52:17, 4.29s/it] 2%|▏ | 794/34278 [55:45<36:18:04, 3.90s/it] {'loss': 0.204, 'grad_norm': 1.2549036667959854, 'learning_rate': 7.71622934888241e-06, 'epoch': 0.02} 2%|▏ | 794/34278 [55:45<36:18:04, 3.90s/it] 2%|▏ | 795/34278 [55:49<37:14:14, 4.00s/it] {'loss': 0.2417, 'grad_norm': 1.4739247197768788, 'learning_rate': 7.72594752186589e-06, 'epoch': 0.02} 2%|▏ | 795/34278 [55:49<37:14:14, 4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 796/34278 [55:53<36:04:20, 3.88s/it] {'loss': 0.2782, 'grad_norm': 1.3112302205058461, 'learning_rate': 7.735665694849369e-06, 'epoch': 0.02} 2%|▏ | 796/34278 [55:53<36:04:20, 3.88s/it] 2%|▏ | 797/34278 [55:57<37:58:32, 4.08s/it] {'loss': 0.3017, 'grad_norm': 1.619905754938196, 'learning_rate': 7.745383867832848e-06, 'epoch': 0.02} 2%|▏ | 797/34278 [55:57<37:58:32, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 798/34278 [56:01<36:58:25, 3.98s/it] {'loss': 0.2599, 'grad_norm': 1.3563231856399103, 'learning_rate': 7.755102040816327e-06, 'epoch': 0.02} 2%|▏ | 798/34278 [56:01<36:58:25, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 799/34278 [56:04<35:30:43, 3.82s/it] {'loss': 0.248, 'grad_norm': 1.327772442289883, 'learning_rate': 7.764820213799807e-06, 'epoch': 0.02} 2%|▏ | 799/34278 [56:04<35:30:43, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 800/34278 [56:09<37:54:48, 4.08s/it] {'loss': 0.231, 'grad_norm': 1.2852257136338878, 'learning_rate': 7.774538386783286e-06, 'epoch': 0.02} 2%|▏ | 800/34278 [56:09<37:54:48, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 801/34278 [56:13<36:57:57, 3.98s/it] {'loss': 0.2229, 'grad_norm': 1.2482077449290176, 'learning_rate': 7.784256559766764e-06, 'epoch': 0.02} 2%|▏ | 801/34278 [56:13<36:57:57, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 802/34278 [56:16<34:34:52, 3.72s/it] {'loss': 0.272, 'grad_norm': 1.1450007187952924, 'learning_rate': 7.793974732750243e-06, 'epoch': 0.02} 2%|▏ | 802/34278 [56:16<34:34:52, 3.72s/it] 2%|▏ | 803/34278 [56:19<32:14:34, 3.47s/it] {'loss': 0.224, 'grad_norm': 1.0787544506279128, 'learning_rate': 7.803692905733722e-06, 'epoch': 0.02} 2%|▏ | 803/34278 [56:19<32:14:34, 3.47s/it] 2%|▏ | 804/34278 [56:24<37:44:11, 4.06s/it] {'loss': 0.2463, 'grad_norm': 1.6160197667204934, 'learning_rate': 7.813411078717202e-06, 'epoch': 0.02} 2%|▏ | 804/34278 [56:24<37:44:11, 4.06s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 805/34278 [56:27<34:24:27, 3.70s/it] {'loss': 0.2042, 'grad_norm': 0.8985221469400513, 'learning_rate': 7.823129251700681e-06, 'epoch': 0.02} 2%|▏ | 805/34278 [56:27<34:24:27, 3.70s/it] 2%|▏ | 806/34278 [56:30<32:42:44, 3.52s/it] {'loss': 0.2648, 'grad_norm': 1.1277547620984618, 'learning_rate': 7.83284742468416e-06, 'epoch': 0.02} 2%|▏ | 806/34278 [56:30<32:42:44, 3.52s/it] 2%|▏ | 807/34278 [56:34<34:38:30, 3.73s/it] {'loss': 0.233, 'grad_norm': 1.0706806725213707, 'learning_rate': 7.84256559766764e-06, 'epoch': 0.02} 2%|▏ | 807/34278 [56:34<34:38:30, 3.73s/it] 2%|▏ | 808/34278 [56:40<39:22:56, 4.24s/it] {'loss': 0.2505, 'grad_norm': 1.2250251136180632, 'learning_rate': 7.852283770651117e-06, 'epoch': 0.02} 2%|▏ | 808/34278 [56:40<39:22:56, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 809/34278 [56:43<36:25:07, 3.92s/it] {'loss': 0.2332, 'grad_norm': 1.0975135269964724, 'learning_rate': 7.862001943634598e-06, 'epoch': 0.02} 2%|▏ | 809/34278 [56:43<36:25:07, 3.92s/it] 2%|▏ | 810/34278 [56:46<33:39:32, 3.62s/it] {'loss': 0.2133, 'grad_norm': 1.2494564394211676, 'learning_rate': 7.871720116618077e-06, 'epoch': 0.02} 2%|▏ | 810/34278 [56:46<33:39:32, 3.62s/it] 2%|▏ | 811/34278 [56:50<35:52:13, 3.86s/it] {'loss': 0.2253, 'grad_norm': 1.3489757437988497, 'learning_rate': 7.881438289601555e-06, 'epoch': 0.02} 2%|▏ | 811/34278 [56:50<35:52:13, 3.86s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 812/34278 [56:54<35:21:54, 3.80s/it] {'loss': 0.2213, 'grad_norm': 1.0251269096777629, 'learning_rate': 7.891156462585034e-06, 'epoch': 0.02} 2%|▏ | 812/34278 [56:54<35:21:54, 3.80s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 813/34278 [56:57<32:53:49, 3.54s/it] {'loss': 0.2236, 'grad_norm': 0.9716459114793602, 'learning_rate': 7.900874635568514e-06, 'epoch': 0.02} 2%|▏ | 813/34278 [56:57<32:53:49, 3.54s/it] 2%|▏ | 814/34278 [57:00<31:03:21, 3.34s/it] {'loss': 0.2336, 'grad_norm': 1.0508870648670574, 'learning_rate': 7.910592808551993e-06, 'epoch': 0.02} 2%|▏ | 814/34278 [57:00<31:03:21, 3.34s/it] 2%|▏ | 815/34278 [57:04<32:15:02, 3.47s/it] {'loss': 0.2213, 'grad_norm': 1.0360282373369818, 'learning_rate': 7.920310981535472e-06, 'epoch': 0.02} 2%|▏ | 815/34278 [57:04<32:15:02, 3.47s/it] 2%|▏ | 816/34278 [57:07<31:23:48, 3.38s/it] {'loss': 0.2357, 'grad_norm': 1.098218013035958, 'learning_rate': 7.930029154518952e-06, 'epoch': 0.02} 2%|▏ | 816/34278 [57:07<31:23:48, 3.38s/it] 2%|▏ | 817/34278 [57:10<30:37:59, 3.30s/it] {'loss': 0.2421, 'grad_norm': 1.1575540066522236, 'learning_rate': 7.939747327502431e-06, 'epoch': 0.02} 2%|▏ | 817/34278 [57:10<30:37:59, 3.30s/it] 2%|▏ | 818/34278 [57:13<31:25:33, 3.38s/it] {'loss': 0.2123, 'grad_norm': 1.4602086588436836, 'learning_rate': 7.949465500485909e-06, 'epoch': 0.02} 2%|▏ | 818/34278 [57:13<31:25:33, 3.38s/it] 2%|▏ | 819/34278 [57:17<30:57:40, 3.33s/it] {'loss': 0.2331, 'grad_norm': 1.054613072449019, 'learning_rate': 7.959183673469388e-06, 'epoch': 0.02} 2%|▏ | 819/34278 [57:17<30:57:40, 3.33s/it] 2%|▏ | 820/34278 [57:19<29:11:33, 3.14s/it] {'loss': 0.2186, 'grad_norm': 1.380533444087093, 'learning_rate': 7.968901846452867e-06, 'epoch': 0.02} 2%|▏ | 820/34278 [57:19<29:11:33, 3.14s/it] 2%|▏ | 821/34278 [57:23<31:41:22, 3.41s/it] {'loss': 0.2151, 'grad_norm': 1.0968081127881506, 'learning_rate': 7.978620019436347e-06, 'epoch': 0.02} 2%|▏ | 821/34278 [57:23<31:41:22, 3.41s/it] 2%|▏ | 822/34278 [57:30<39:29:06, 4.25s/it] {'loss': 0.2447, 'grad_norm': 1.2678370235797236, 'learning_rate': 7.988338192419826e-06, 'epoch': 0.02} 2%|▏ | 822/34278 [57:30<39:29:06, 
4.25s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 823/34278 [57:34<38:53:24, 4.18s/it] {'loss': 0.2225, 'grad_norm': 1.1840706113496122, 'learning_rate': 7.998056365403305e-06, 'epoch': 0.02} 2%|▏ | 823/34278 [57:34<38:53:24, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 824/34278 [57:39<41:32:00, 4.47s/it] {'loss': 0.2178, 'grad_norm': 1.2965871034391179, 'learning_rate': 8.007774538386784e-06, 'epoch': 0.02} 2%|▏ | 824/34278 [57:39<41:32:00, 4.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 825/34278 [57:43<40:11:53, 4.33s/it] {'loss': 0.2683, 'grad_norm': 1.264222685552627, 'learning_rate': 8.017492711370262e-06, 'epoch': 0.02} 2%|▏ | 825/34278 [57:43<40:11:53, 4.33s/it] 2%|▏ | 826/34278 [57:46<37:07:03, 3.99s/it] {'loss': 0.216, 'grad_norm': 1.3527595201863405, 'learning_rate': 8.027210884353741e-06, 'epoch': 0.02} 2%|▏ | 826/34278 [57:46<37:07:03, 3.99s/it] 2%|▏ | 827/34278 [57:49<33:58:19, 3.66s/it] {'loss': 0.2665, 'grad_norm': 1.4038635883884818, 'learning_rate': 8.036929057337222e-06, 'epoch': 0.02} 2%|▏ | 827/34278 [57:49<33:58:19, 3.66s/it] 2%|▏ | 828/34278 [57:52<32:28:01, 3.49s/it] {'loss': 0.2331, 'grad_norm': 0.9586492703955889, 'learning_rate': 8.0466472303207e-06, 'epoch': 0.02} 2%|▏ | 828/34278 [57:52<32:28:01, 3.49s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 829/34278 [57:56<34:45:07, 3.74s/it] {'loss': 0.2352, 'grad_norm': 1.052317215766637, 'learning_rate': 8.05636540330418e-06, 'epoch': 0.02} 2%|▏ | 829/34278 [57:56<34:45:07, 3.74s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 830/34278 [57:59<32:13:17, 3.47s/it] {'loss': 0.2424, 'grad_norm': 1.0505145744789148, 'learning_rate': 8.066083576287659e-06, 'epoch': 0.02} 2%|▏ | 830/34278 [57:59<32:13:17, 3.47s/it] 2%|▏ | 831/34278 [58:02<30:12:44, 3.25s/it] {'loss': 0.264, 'grad_norm': 1.259839897348867, 'learning_rate': 8.075801749271138e-06, 'epoch': 0.02} 2%|▏ | 831/34278 [58:02<30:12:44, 3.25s/it] 2%|▏ | 832/34278 [58:08<37:51:54, 4.08s/it] {'loss': 0.2346, 'grad_norm': 1.165680747204167, 'learning_rate': 8.085519922254617e-06, 'epoch': 0.02} 2%|▏ | 832/34278 [58:08<37:51:54, 4.08s/it] 2%|▏ | 833/34278 [58:11<36:20:59, 3.91s/it] {'loss': 0.2324, 'grad_norm': 1.269560461730615, 'learning_rate': 8.095238095238097e-06, 'epoch': 0.02} 2%|▏ | 833/34278 [58:11<36:20:59, 3.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 834/34278 [58:15<36:19:20, 3.91s/it] {'loss': 0.2305, 'grad_norm': 1.0600569819159396, 'learning_rate': 8.104956268221576e-06, 'epoch': 0.02} 2%|▏ | 834/34278 [58:15<36:19:20, 3.91s/it] 2%|▏ | 835/34278 [58:18<34:16:16, 3.69s/it] {'loss': 0.2636, 'grad_norm': 1.1810103725868992, 'learning_rate': 8.114674441205053e-06, 'epoch': 0.02} 2%|▏ | 835/34278 [58:18<34:16:16, 3.69s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 836/34278 [58:25<41:39:04, 4.48s/it] {'loss': 0.2255, 'grad_norm': 0.9233645643119518, 'learning_rate': 8.124392614188533e-06, 'epoch': 0.02} 2%|▏ | 836/34278 [58:25<41:39:04, 4.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 837/34278 [58:28<37:12:34, 4.01s/it] {'loss': 0.1963, 'grad_norm': 0.839916756444877, 'learning_rate': 8.134110787172012e-06, 'epoch': 0.02} 2%|▏ | 837/34278 [58:28<37:12:34, 4.01s/it] 2%|▏ | 838/34278 [58:32<37:03:33, 3.99s/it] {'loss': 0.2182, 'grad_norm': 1.1297884673558336, 'learning_rate': 8.143828960155491e-06, 'epoch': 0.02} 2%|▏ | 838/34278 [58:32<37:03:33, 3.99s/it] 2%|▏ | 839/34278 [58:35<34:40:40, 3.73s/it] {'loss': 0.243, 'grad_norm': 1.012255127000924, 'learning_rate': 8.15354713313897e-06, 'epoch': 0.02} 2%|▏ | 839/34278 [58:35<34:40:40, 3.73s/it] 2%|▏ | 840/34278 [58:38<34:32:47, 3.72s/it] {'loss': 0.2034, 'grad_norm': 1.1376812095346318, 'learning_rate': 8.16326530612245e-06, 'epoch': 0.02} 2%|▏ | 840/34278 [58:38<34:32:47, 3.72s/it] 2%|▏ | 841/34278 [58:41<32:41:47, 3.52s/it] {'loss': 0.237, 'grad_norm': 1.15627653019019, 'learning_rate': 8.17298347910593e-06, 'epoch': 0.02} 2%|▏ | 841/34278 [58:41<32:41:47, 3.52s/it] 2%|▏ | 842/34278 [58:45<31:52:24, 3.43s/it] {'loss': 0.2091, 'grad_norm': 1.3314035681696466, 'learning_rate': 8.182701652089407e-06, 'epoch': 0.02} 2%|▏ | 842/34278 [58:45<31:52:24, 3.43s/it] 2%|▏ | 843/34278 [58:48<31:05:58, 3.35s/it] {'loss': 0.2321, 'grad_norm': 1.0392278314289032, 'learning_rate': 8.192419825072886e-06, 'epoch': 0.02} 2%|▏ | 843/34278 [58:48<31:05:58, 3.35s/it] 2%|▏ | 844/34278 [58:53<36:55:07, 3.98s/it] {'loss': 0.242, 'grad_norm': 1.1571268182840173, 'learning_rate': 8.202137998056367e-06, 'epoch': 0.02} 2%|▏ | 844/34278 [58:53<36:55:07, 
3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 845/34278 [58:56<34:42:16, 3.74s/it] {'loss': 0.2329, 'grad_norm': 1.0872892444485658, 'learning_rate': 8.211856171039845e-06, 'epoch': 0.02} 2%|▏ | 845/34278 [58:56<34:42:16, 3.74s/it] 2%|▏ | 846/34278 [59:00<35:01:08, 3.77s/it] {'loss': 0.2332, 'grad_norm': 1.1112375767980343, 'learning_rate': 8.221574344023324e-06, 'epoch': 0.02} 2%|▏ | 846/34278 [59:00<35:01:08, 3.77s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 847/34278 [59:04<35:55:36, 3.87s/it] {'loss': 0.2148, 'grad_norm': 1.028165264665385, 'learning_rate': 8.231292517006804e-06, 'epoch': 0.02} 2%|▏ | 847/34278 [59:04<35:55:36, 3.87s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 848/34278 [59:08<34:06:10, 3.67s/it] {'loss': 0.24, 'grad_norm': 1.2878773067873275, 'learning_rate': 8.241010689990283e-06, 'epoch': 0.02} 2%|▏ | 848/34278 [59:08<34:06:10, 3.67s/it] 2%|▏ | 849/34278 [59:13<39:23:10, 4.24s/it] {'loss': 0.2291, 'grad_norm': 1.2089653481220402, 'learning_rate': 8.250728862973762e-06, 'epoch': 0.02} 2%|▏ | 849/34278 [59:13<39:23:10, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 850/34278 [59:17<39:21:24, 4.24s/it] {'loss': 0.2293, 'grad_norm': 1.2961702554882062, 'learning_rate': 8.260447035957241e-06, 'epoch': 0.02} 2%|▏ | 850/34278 [59:17<39:21:24, 4.24s/it] 2%|▏ | 851/34278 [59:22<41:03:13, 4.42s/it] {'loss': 0.2203, 'grad_norm': 1.1287301452949055, 'learning_rate': 8.27016520894072e-06, 'epoch': 0.02} 2%|▏ | 851/34278 [59:22<41:03:13, 4.42s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 2%|▏ | 852/34278 [59:27<41:10:42, 4.43s/it] {'loss': 0.2097, 'grad_norm': 1.0994200408217525, 'learning_rate': 8.279883381924198e-06, 'epoch': 0.02} 2%|▏ | 852/34278 [59:27<41:10:42, 4.43s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 2%|▏ | 853/34278 [59:30<37:27:16, 4.03s/it] {'loss': 0.2324, 'grad_norm': 1.1749016881150973, 'learning_rate': 8.289601554907678e-06, 'epoch': 0.02} 2%|▏ | 853/34278 [59:30<37:27:16, 4.03s/it] 2%|▏ | 854/34278 [59:36<43:26:01, 4.68s/it] {'loss': 0.2252, 'grad_norm': 1.459717519565719, 'learning_rate': 8.299319727891157e-06, 'epoch': 0.02} 2%|▏ | 854/34278 [59:36<43:26:01, 4.68s/it] 2%|▏ | 855/34278 [59:40<40:37:35, 4.38s/it] {'loss': 0.2333, 'grad_norm': 1.455160742778526, 'learning_rate': 8.309037900874636e-06, 'epoch': 0.02} 2%|▏ | 855/34278 [59:40<40:37:35, 4.38s/it] 2%|▏ | 856/34278 [59:43<37:04:55, 3.99s/it] {'loss': 0.2271, 'grad_norm': 1.023204593740571, 'learning_rate': 8.318756073858116e-06, 'epoch': 0.02} 2%|▏ | 856/34278 [59:43<37:04:55, 3.99s/it] 3%|▎ | 857/34278 [59:49<42:45:59, 4.61s/it] {'loss': 0.2578, 'grad_norm': 1.1995934918409512, 'learning_rate': 8.328474246841595e-06, 'epoch': 0.03} 3%|▎ | 857/34278 [59:49<42:45:59, 4.61s/it] 3%|▎ | 858/34278 [59:52<39:00:00, 4.20s/it] {'loss': 0.2304, 'grad_norm': 1.1620257301761994, 'learning_rate': 8.338192419825074e-06, 'epoch': 0.03} 3%|▎ | 858/34278 [59:52<39:00:00, 4.20s/it] 3%|▎ | 859/34278 [59:58<44:45:58, 4.82s/it] {'loss': 0.2199, 'grad_norm': 1.1110701350031862, 'learning_rate': 8.347910592808552e-06, 'epoch': 0.03} 3%|▎ | 859/34278 [59:58<44:45:58, 4.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 860/34278 [1:00:01<39:37:25, 4.27s/it] {'loss': 0.226, 'grad_norm': 1.4282060642842584, 'learning_rate': 8.357628765792031e-06, 'epoch': 0.03} 3%|▎ | 860/34278 [1:00:01<39:37:25, 4.27s/it] 3%|▎ | 861/34278 [1:00:07<44:37:56, 4.81s/it] {'loss': 0.2325, 'grad_norm': 1.1058873150035762, 'learning_rate': 8.36734693877551e-06, 'epoch': 0.03} 3%|▎ | 861/34278 [1:00:07<44:37:56, 4.81s/it] 3%|▎ | 862/34278 [1:00:10<39:43:49, 4.28s/it] {'loss': 0.231, 'grad_norm': 1.0212208946954713, 'learning_rate': 8.37706511175899e-06, 'epoch': 0.03} 3%|▎ | 862/34278 [1:00:10<39:43:49, 4.28s/it] 3%|▎ | 863/34278 [1:00:14<36:45:22, 3.96s/it] {'loss': 0.2372, 'grad_norm': 1.0584289725256428, 'learning_rate': 8.386783284742469e-06, 'epoch': 0.03} 3%|▎ | 863/34278 [1:00:14<36:45:22, 3.96s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10556 > 8192). Running this sequence through the model will result in indexing errors 3%|▎ | 864/34278 [1:00:18<36:43:32, 3.96s/it] {'loss': 0.2155, 'grad_norm': 1.3915339635059458, 'learning_rate': 8.396501457725948e-06, 'epoch': 0.03} 3%|▎ | 864/34278 [1:00:18<36:43:32, 3.96s/it] 3%|▎ | 865/34278 [1:00:22<37:33:26, 4.05s/it] {'loss': 0.2274, 'grad_norm': 0.9602181741267753, 'learning_rate': 8.406219630709426e-06, 'epoch': 0.03} 3%|▎ | 865/34278 [1:00:22<37:33:26, 4.05s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 866/34278 [1:00:26<37:37:49, 4.05s/it] {'loss': 0.2359, 'grad_norm': 1.0949774891934438, 'learning_rate': 8.415937803692907e-06, 'epoch': 0.03} 3%|▎ | 866/34278 [1:00:26<37:37:49, 4.05s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 867/34278 [1:00:29<35:04:40, 3.78s/it] {'loss': 0.247, 'grad_norm': 1.1976347719025633, 'learning_rate': 8.425655976676386e-06, 'epoch': 0.03} 3%|▎ | 867/34278 [1:00:29<35:04:40, 3.78s/it] 3%|▎ | 868/34278 [1:00:32<32:22:23, 3.49s/it] {'loss': 0.2243, 'grad_norm': 1.0966583335751576, 'learning_rate': 8.435374149659866e-06, 'epoch': 0.03} 3%|▎ | 868/34278 [1:00:32<32:22:23, 3.49s/it] 3%|▎ | 869/34278 [1:00:37<37:06:07, 4.00s/it] {'loss': 0.1996, 'grad_norm': 0.999186771932867, 'learning_rate': 8.445092322643343e-06, 'epoch': 0.03} 3%|▎ | 869/34278 [1:00:37<37:06:07, 4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8695 > 8192). Running this sequence through the model will result in indexing errors 3%|▎ | 870/34278 [1:00:43<42:36:09, 4.59s/it] {'loss': 0.2291, 'grad_norm': 1.1549892838997626, 'learning_rate': 8.454810495626823e-06, 'epoch': 0.03} 3%|▎ | 870/34278 [1:00:43<42:36:09, 4.59s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 871/34278 [1:00:46<37:40:45, 4.06s/it] {'loss': 0.23, 'grad_norm': 1.2565325042452247, 'learning_rate': 8.464528668610302e-06, 'epoch': 0.03} 3%|▎ | 871/34278 [1:00:46<37:40:45, 4.06s/it] 3%|▎ | 872/34278 [1:00:49<36:25:29, 3.93s/it] {'loss': 0.2371, 'grad_norm': 1.264468498651524, 'learning_rate': 8.474246841593781e-06, 'epoch': 0.03} 3%|▎ | 872/34278 [1:00:49<36:25:29, 3.93s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 873/34278 [1:00:53<34:13:17, 3.69s/it] {'loss': 0.2308, 'grad_norm': 1.1107458572943656, 'learning_rate': 8.48396501457726e-06, 'epoch': 0.03} 3%|▎ | 873/34278 [1:00:53<34:13:17, 3.69s/it] 3%|▎ | 874/34278 [1:00:56<34:19:02, 3.70s/it] {'loss': 0.2476, 'grad_norm': 1.245594829405502, 'learning_rate': 8.49368318756074e-06, 'epoch': 0.03} 3%|▎ | 874/34278 [1:00:56<34:19:02, 3.70s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 875/34278 [1:01:02<38:45:06, 4.18s/it] {'loss': 0.2309, 'grad_norm': 1.188427098742183, 'learning_rate': 8.503401360544217e-06, 'epoch': 0.03} 3%|▎ | 875/34278 [1:01:02<38:45:06, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
  3%|▎ | 876/34278 [1:01:06<38:43:06, 4.17s/it] {'loss': 0.2275, 'grad_norm': 1.2916364374413594, 'learning_rate': 8.513119533527697e-06, 'epoch': 0.03}
  3%|▎ | 877/34278 [1:01:11<41:29:24, 4.47s/it] {'loss': 0.2216, 'grad_norm': 1.1123804597760718, 'learning_rate': 8.522837706511176e-06, 'epoch': 0.03}
  3%|▎ | 878/34278 [1:01:14<38:44:54, 4.18s/it] {'loss': 0.2402, 'grad_norm': 1.0506988222299276, 'learning_rate': 8.532555879494655e-06, 'epoch': 0.03}
  3%|▎ | 879/34278 [1:01:21<44:46:59, 4.83s/it] {'loss': 0.247, 'grad_norm': 1.07888229018488, 'learning_rate': 8.542274052478135e-06, 'epoch': 0.03}
  3%|▎ | 880/34278 [1:01:24<40:17:14, 4.34s/it] {'loss': 0.2199, 'grad_norm': 1.114420567762565, 'learning_rate': 8.551992225461614e-06, 'epoch': 0.03}
  3%|▎ | 881/34278 [1:01:28<40:27:53, 4.36s/it] {'loss': 0.2364, 'grad_norm': 1.0733804086551966, 'learning_rate': 8.561710398445093e-06, 'epoch': 0.03}
  3%|▎ | 882/34278 [1:01:31<36:41:38, 3.96s/it] {'loss': 0.2129, 'grad_norm': 1.1515411067974415, 'learning_rate': 8.571428571428571e-06, 'epoch': 0.03}
  3%|▎ | 883/34278 [1:01:35<35:15:52, 3.80s/it] {'loss': 0.2355, 'grad_norm': 1.043433286716918, 'learning_rate': 8.581146744412052e-06, 'epoch': 0.03}
  3%|▎ | 884/34278 [1:01:40<38:38:32, 4.17s/it] {'loss': 0.2135, 'grad_norm': 1.0336903893330118, 'learning_rate': 8.590864917395531e-06, 'epoch': 0.03}
  3%|▎ | 885/34278 [1:01:44<39:13:22, 4.23s/it] {'loss': 0.2093, 'grad_norm': 1.256521542254548, 'learning_rate': 8.60058309037901e-06, 'epoch': 0.03}
  3%|▎ | 886/34278 [1:01:50<42:39:19, 4.60s/it] {'loss': 0.261, 'grad_norm': 1.3417628029323483, 'learning_rate': 8.610301263362488e-06, 'epoch': 0.03}
  3%|▎ | 887/34278 [1:01:53<39:28:01, 4.26s/it] {'loss': 0.2119, 'grad_norm': 1.0107801209511784, 'learning_rate': 8.620019436345967e-06, 'epoch': 0.03}
  3%|▎ | 888/34278 [1:01:58<40:27:22, 4.36s/it] {'loss': 0.2322, 'grad_norm': 1.1406597825944313, 'learning_rate': 8.629737609329447e-06, 'epoch': 0.03}
  3%|▎ | 889/34278 [1:02:01<38:25:56, 4.14s/it] {'loss': 0.215, 'grad_norm': 1.3920662585308612, 'learning_rate': 8.639455782312926e-06, 'epoch': 0.03}
  3%|▎ | 890/34278 [1:02:07<41:43:45, 4.50s/it] {'loss': 0.2472, 'grad_norm': 1.1511277459389424, 'learning_rate': 8.649173955296405e-06, 'epoch': 0.03}
  3%|▎ | 891/34278 [1:02:10<38:24:03, 4.14s/it] {'loss': 0.2161, 'grad_norm': 0.9606071792676083, 'learning_rate': 8.658892128279885e-06, 'epoch': 0.03}
  3%|▎ | 892/34278 [1:02:13<35:08:00, 3.79s/it] {'loss': 0.2287, 'grad_norm': 1.2710760257723615, 'learning_rate': 8.668610301263362e-06, 'epoch': 0.03}
  3%|▎ | 893/34278 [1:02:16<33:51:44, 3.65s/it] {'loss': 0.2431, 'grad_norm': 1.168320003779378, 'learning_rate': 8.678328474246842e-06, 'epoch': 0.03}
  3%|▎ | 894/34278 [1:02:19<32:26:37, 3.50s/it] {'loss': 0.1971, 'grad_norm': 0.8512252460881229, 'learning_rate': 8.688046647230321e-06, 'epoch': 0.03}
  3%|▎ | 895/34278 [1:02:22<30:53:55, 3.33s/it] {'loss': 0.2528, 'grad_norm': 1.0740925586658399, 'learning_rate': 8.6977648202138e-06, 'epoch': 0.03}
  3%|▎ | 896/34278 [1:02:26<30:39:41, 3.31s/it] {'loss': 0.2134, 'grad_norm': 1.0778145588806725, 'learning_rate': 8.70748299319728e-06, 'epoch': 0.03}
  3%|▎ | 897/34278 [1:02:29<30:27:02, 3.28s/it] {'loss': 0.2418, 'grad_norm': 1.0246197372812758, 'learning_rate': 8.717201166180759e-06, 'epoch': 0.03}
  3%|▎ | 898/34278 [1:02:32<29:02:47, 3.13s/it] {'loss': 0.2249, 'grad_norm': 0.9971908131729528, 'learning_rate': 8.726919339164238e-06, 'epoch': 0.03}
  3%|▎ | 899/34278 [1:02:35<29:35:10, 3.19s/it] {'loss': 0.2177, 'grad_norm': 0.9847569765468884, 'learning_rate': 8.736637512147716e-06, 'epoch': 0.03}
  3%|▎ | 900/34278 [1:02:39<32:13:28, 3.48s/it] {'loss': 0.2563, 'grad_norm': 1.223519064180211, 'learning_rate': 8.746355685131195e-06, 'epoch': 0.03}
  3%|▎ | 901/34278 [1:02:43<31:51:17, 3.44s/it] {'loss': 0.2264, 'grad_norm': 1.1518220206967986, 'learning_rate': 8.756073858114676e-06, 'epoch': 0.03}
  3%|▎ | 902/34278 [1:02:47<34:31:32, 3.72s/it] {'loss': 0.2184, 'grad_norm': 1.189621075076587, 'learning_rate': 8.765792031098155e-06, 'epoch': 0.03}
  3%|▎ | 903/34278 [1:02:53<40:40:44, 4.39s/it] {'loss': 0.2407, 'grad_norm': 1.2186169070260915, 'learning_rate': 8.775510204081633e-06, 'epoch': 0.03}
  3%|▎ | 904/34278 [1:02:57<39:54:57, 4.31s/it] {'loss': 0.2262, 'grad_norm': 1.2635856808123254, 'learning_rate': 8.785228377065112e-06, 'epoch': 0.03}
  3%|▎ | 905/34278 [1:03:03<45:15:20, 4.88s/it] {'loss': 0.2247, 'grad_norm': 0.9812279936248159, 'learning_rate': 8.794946550048592e-06, 'epoch': 0.03}
  3%|▎ | 906/34278 [1:03:09<48:34:07, 5.24s/it] {'loss': 0.2362, 'grad_norm': 1.1066437925647374, 'learning_rate': 8.804664723032071e-06, 'epoch': 0.03}
  3%|▎ | 907/34278 [1:03:15<51:19:51, 5.54s/it] {'loss': 0.2073, 'grad_norm': 1.111992497097666, 'learning_rate': 8.81438289601555e-06, 'epoch': 0.03}
  3%|▎ | 908/34278 [1:03:19<44:38:34, 4.82s/it] {'loss': 0.2296, 'grad_norm': 0.9645432811176635, 'learning_rate': 8.82410106899903e-06, 'epoch': 0.03}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4160227b50>
Failed to fetch sample 2696823. Exception: cannot identify image file <_io.BytesIO object at 0x7f4160227b50>
  3%|▎ | 909/34278 [1:03:22<40:06:09, 4.33s/it] {'loss': 0.2228, 'grad_norm': 0.8492398097468394, 'learning_rate': 8.833819241982507e-06, 'epoch': 0.03}
  3%|▎ | 910/34278 [1:03:27<42:39:47, 4.60s/it] {'loss': 0.2317, 'grad_norm': 1.0748246979612344, 'learning_rate': 8.843537414965987e-06, 'epoch': 0.03}
  3%|▎ | 911/34278 [1:03:32<43:03:57, 4.65s/it] {'loss': 0.2302, 'grad_norm': 1.0053226522490737, 'learning_rate': 8.853255587949466e-06, 'epoch': 0.03}
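The PIL traceback above is caught upstream (training proceeds at step 909), which matches the "Failed to fetch sample 2696823" message: the dataset evidently logs the unreadable sample and substitutes another index. A minimal sketch of that skip-and-retry pattern, with a toy decoder standing in for the real `pil_loader`/TCS loader chain (all names here are hypothetical, not the repo's actual implementation):

```python
import random

# Hypothetical sketch of the skip-and-retry behavior implied by the log line
# "Failed to fetch sample 2696823. Exception: ...": a decode failure is
# logged and a different index is tried, so one corrupt image does not
# kill the whole training run.

def get_item_with_retry(index, raw_data, decode_fn, max_retries=10):
    for _ in range(max_retries):
        try:
            return decode_fn(raw_data[index])
        except Exception as exc:
            print(f"Failed to fetch sample {index}. Exception: {exc}")
            index = random.randrange(len(raw_data))  # resample another index
    raise RuntimeError("too many consecutive unreadable samples")

def decode(blob):
    # Toy decoder: empty bytes mimic PIL.UnidentifiedImageError.
    if not blob:
        raise ValueError("cannot identify image file")
    return blob.decode()

data = [b"img-0", b"", b"img-2"]
print(get_item_with_retry(1, data, decode))  # falls back to a readable sample
```

Resampling silently changes the effective data distribution, so counting how often it fires (as this log's message allows) is worth keeping.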
  3%|▎ | 912/34278 [1:03:36<41:43:22, 4.50s/it] {'loss': 0.2193, 'grad_norm': 0.9971656931450253, 'learning_rate': 8.862973760932945e-06, 'epoch': 0.03}
  3%|▎ | 913/34278 [1:03:40<39:13:30, 4.23s/it] {'loss': 0.2199, 'grad_norm': 0.9847409980163035, 'learning_rate': 8.872691933916424e-06, 'epoch': 0.03}
  3%|▎ | 914/34278 [1:03:43<35:48:48, 3.86s/it] {'loss': 0.2146, 'grad_norm': 1.25498568142084, 'learning_rate': 8.882410106899904e-06, 'epoch': 0.03}
Token indices sequence length is longer than the specified maximum sequence length for this model (9414 > 8192). Running this sequence through the model will result in indexing errors
  3%|▎ | 915/34278 [1:03:45<33:09:01, 3.58s/it] {'loss': 0.2379, 'grad_norm': 0.9161040678664293, 'learning_rate': 8.892128279883383e-06, 'epoch': 0.03}
  3%|▎ | 916/34278 [1:03:49<33:13:43, 3.59s/it] {'loss': 0.25, 'grad_norm': 1.2235565726421131, 'learning_rate': 8.90184645286686e-06, 'epoch': 0.03}
  3%|▎ | 917/34278 [1:03:55<39:04:48, 4.22s/it] {'loss': 0.2239, 'grad_norm': 1.406004105551934, 'learning_rate': 8.91156462585034e-06, 'epoch': 0.03}
  3%|▎ | 918/34278 [1:03:58<36:05:25, 3.89s/it] {'loss': 0.2106, 'grad_norm': 1.1342506113992574, 'learning_rate': 8.921282798833821e-06, 'epoch': 0.03}
  3%|▎ | 919/34278 [1:04:02<37:50:09, 4.08s/it] {'loss': 0.2329, 'grad_norm': 1.1239375255085295, 'learning_rate': 8.931000971817299e-06, 'epoch': 0.03}
  3%|▎ | 920/34278 [1:04:06<35:10:00, 3.80s/it] {'loss': 0.2118, 'grad_norm': 0.9457265195547269, 'learning_rate': 8.940719144800778e-06, 'epoch': 0.03}
  3%|▎ | 921/34278 [1:04:09<34:54:58, 3.77s/it] {'loss': 0.2731, 'grad_norm': 1.0706261685912044, 'learning_rate': 8.950437317784257e-06, 'epoch': 0.03}
  3%|▎ | 922/34278 [1:04:15<40:42:54, 4.39s/it] {'loss': 0.2014, 'grad_norm': 1.110692695974071, 'learning_rate': 8.960155490767737e-06, 'epoch': 0.03}
  3%|▎ | 923/34278 [1:04:19<38:34:39, 4.16s/it] {'loss': 0.2016, 'grad_norm': 0.8500774921458645, 'learning_rate': 8.969873663751216e-06, 'epoch': 0.03}
  3%|▎ | 924/34278 [1:04:22<37:01:37, 4.00s/it] {'loss': 0.2098, 'grad_norm': 1.1852674803784724, 'learning_rate': 8.979591836734695e-06, 'epoch': 0.03}
  3%|▎ | 925/34278 [1:04:25<34:15:17, 3.70s/it] {'loss': 0.2131, 'grad_norm': 1.1139269298523111, 'learning_rate': 8.989310009718175e-06, 'epoch': 0.03}
  3%|▎ | 926/34278 [1:04:29<32:51:16, 3.55s/it] {'loss': 0.2088, 'grad_norm': 0.8962746684288225, 'learning_rate': 8.999028182701652e-06, 'epoch': 0.03}
  3%|▎ | 927/34278 [1:04:32<32:08:25, 3.47s/it] {'loss': 0.2539, 'grad_norm': 1.2521461375304326, 'learning_rate': 9.008746355685131e-06, 'epoch': 0.03}
  3%|▎ | 928/34278 [1:04:36<32:54:57, 3.55s/it] {'loss': 0.2362, 'grad_norm': 1.1281918008149143, 'learning_rate': 9.01846452866861e-06, 'epoch': 0.03}
  3%|▎ | 929/34278 [1:04:42<40:06:26, 4.33s/it] {'loss': 0.2083, 'grad_norm': 0.9363948903611427, 'learning_rate': 9.02818270165209e-06, 'epoch': 0.03}
  3%|▎ | 930/34278 [1:04:45<37:27:41, 4.04s/it] {'loss': 0.1974, 'grad_norm': 1.145122754071775, 'learning_rate': 9.03790087463557e-06, 'epoch': 0.03}
  3%|▎ | 931/34278 [1:04:48<34:31:48, 3.73s/it] {'loss': 0.2341, 'grad_norm': 1.0865477187764203, 'learning_rate': 9.047619047619049e-06, 'epoch': 0.03}
  3%|▎ | 932/34278 [1:04:51<32:25:54, 3.50s/it] {'loss': 0.2234, 'grad_norm': 1.3141465162510526, 'learning_rate': 9.057337220602528e-06, 'epoch': 0.03}
  3%|▎ | 933/34278 [1:04:54<31:46:32, 3.43s/it] {'loss': 0.225, 'grad_norm': 1.1538744272422732, 'learning_rate': 9.067055393586006e-06, 'epoch': 0.03}
  3%|▎ | 934/34278 [1:04:58<31:03:46, 3.35s/it] {'loss': 0.2199, 'grad_norm': 1.0662250990413884, 'learning_rate': 9.076773566569485e-06, 'epoch': 0.03}
  3%|▎ | 935/34278 [1:05:03<37:25:22, 4.04s/it] {'loss': 0.2134, 'grad_norm': 1.0405211203137403, 'learning_rate': 9.086491739552964e-06, 'epoch': 0.03}
  3%|▎ | 936/34278 [1:05:09<43:17:03, 4.67s/it] {'loss': 0.2542, 'grad_norm': 1.167944520650825, 'learning_rate': 9.096209912536444e-06, 'epoch': 0.03}
  3%|▎ | 937/34278 [1:05:12<39:02:55, 4.22s/it] {'loss': 0.2302, 'grad_norm': 1.2235520404347386, 'learning_rate': 9.105928085519923e-06, 'epoch': 0.03}
  3%|▎ | 938/34278 [1:05:16<36:24:16, 3.93s/it] {'loss': 0.2084, 'grad_norm': 0.8376879036238295, 'learning_rate': 9.115646258503402e-06, 'epoch': 0.03}
  3%|▎ | 939/34278 [1:05:20<36:29:14, 3.94s/it] {'loss': 0.2129, 'grad_norm': 1.1672300639818123, 'learning_rate': 9.125364431486881e-06, 'epoch': 0.03}
  3%|▎ | 940/34278 [1:05:23<34:07:44, 3.69s/it] {'loss': 0.2273, 'grad_norm': 1.238442350287253, 'learning_rate': 9.13508260447036e-06, 'epoch': 0.03}
  3%|▎ | 941/34278 [1:05:26<33:48:48, 3.65s/it] {'loss': 0.2313, 'grad_norm': 1.1226085765871399, 'learning_rate': 9.14480077745384e-06, 'epoch': 0.03}
  3%|▎ | 942/34278 [1:05:30<32:49:44, 3.55s/it] {'loss': 0.2087, 'grad_norm': 1.1102788007428515, 'learning_rate': 9.15451895043732e-06, 'epoch': 0.03}
  3%|▎ | 943/34278 [1:05:33<32:04:52, 3.46s/it] {'loss': 0.2206, 'grad_norm': 1.04976885817183, 'learning_rate': 9.164237123420797e-06, 'epoch': 0.03}
  3%|▎ | 944/34278 [1:05:36<30:44:59, 3.32s/it] {'loss': 0.2234, 'grad_norm': 1.1268358378697063, 'learning_rate': 9.173955296404276e-06, 'epoch': 0.03}
  3%|▎ | 945/34278 [1:05:39<29:46:53, 3.22s/it] {'loss': 0.2237, 'grad_norm': 1.0288268415220287, 'learning_rate': 9.183673469387756e-06, 'epoch': 0.03}
  3%|▎ | 946/34278 [1:05:42<29:27:29, 3.18s/it] {'loss': 0.2179, 'grad_norm': 1.1745349868779273, 'learning_rate': 9.193391642371235e-06, 'epoch': 0.03}
  3%|▎ | 947/34278 [1:05:45<28:33:11, 3.08s/it] {'loss': 0.2146, 'grad_norm': 1.1508004689734275, 'learning_rate': 9.203109815354714e-06, 'epoch': 0.03}
  3%|▎ | 948/34278 [1:05:48<30:04:16, 3.25s/it] {'loss': 0.2247, 'grad_norm': 0.9210802873402436, 'learning_rate': 9.212827988338194e-06, 'epoch': 0.03}
Token indices sequence length is longer than the specified maximum sequence length for this model (8410 > 8192). Running this sequence through the model will result in indexing errors
  3%|▎ | 949/34278 [1:05:51<29:27:59, 3.18s/it] {'loss': 0.2363, 'grad_norm': 1.1622540536116515, 'learning_rate': 9.222546161321673e-06, 'epoch': 0.03}
  3%|▎ | 950/34278 [1:05:55<31:39:33, 3.42s/it] {'loss': 0.2479, 'grad_norm': 1.1425629262142145, 'learning_rate': 9.23226433430515e-06, 'epoch': 0.03}
  3%|▎ | 951/34278 [1:05:59<32:49:24, 3.55s/it] {'loss': 0.2409, 'grad_norm': 0.9106780037226964, 'learning_rate': 9.24198250728863e-06, 'epoch': 0.03}
  3%|▎ | 952/34278 [1:06:02<31:44:44, 3.43s/it] {'loss': 0.2306, 'grad_norm': 1.0965125281970576, 'learning_rate': 9.251700680272109e-06, 'epoch': 0.03}
  3%|▎ | 953/34278 [1:06:06<31:22:22, 3.39s/it] {'loss': 0.2198, 'grad_norm': 1.1201151553658704, 'learning_rate': 9.261418853255588e-06, 'epoch': 0.03}
  3%|▎ | 954/34278 [1:06:11<35:30:25, 3.84s/it] {'loss': 0.2362, 'grad_norm': 1.0211064270830699, 'learning_rate': 9.271137026239068e-06, 'epoch': 0.03}
  3%|▎ | 955/34278 [1:06:15<36:14:59, 3.92s/it] {'loss': 0.2131, 'grad_norm': 1.1512785422164178, 'learning_rate': 9.280855199222547e-06, 'epoch': 0.03}
  3%|▎ | 956/34278 [1:06:18<34:43:31, 3.75s/it] {'loss': 0.2318, 'grad_norm': 1.066555236087168, 'learning_rate': 9.290573372206026e-06, 'epoch': 0.03}
  3%|▎ | 957/34278 [1:06:21<33:26:42, 3.61s/it] {'loss': 0.227, 'grad_norm': 1.1943198550527352, 'learning_rate': 9.300291545189504e-06, 'epoch': 0.03}
  3%|▎ | 958/34278 [1:06:27<38:31:32, 4.16s/it] {'loss': 0.2331, 'grad_norm': 1.2512433672693648, 'learning_rate': 9.310009718172985e-06, 'epoch': 0.03}
  3%|▎ | 959/34278 [1:06:30<36:51:48, 3.98s/it] {'loss': 0.2235, 'grad_norm': 1.1449455664610984, 'learning_rate': 9.319727891156464e-06, 'epoch': 0.03}
  3%|▎ | 960/34278 [1:06:34<34:28:11, 3.72s/it] {'loss': 0.2308, 'grad_norm': 1.3393978089874983, 'learning_rate': 9.329446064139942e-06, 'epoch': 0.03}
  3%|▎ | 961/34278 [1:06:38<37:33:50, 4.06s/it] {'loss': 0.2708, 'grad_norm': 1.056733030847754, 'learning_rate': 9.339164237123421e-06, 'epoch': 0.03}
  3%|▎ | 962/34278 [1:06:42<36:56:23, 3.99s/it] {'loss': 0.2195, 'grad_norm': 0.9567251442342112, 'learning_rate': 9.3488824101069e-06, 'epoch': 0.03}
  3%|▎ | 963/34278 [1:06:45<33:41:54, 3.64s/it] {'loss': 0.2112, 'grad_norm': 1.2786594166323981, 'learning_rate': 9.35860058309038e-06, 'epoch': 0.03}
  3%|▎ | 964/34278 [1:06:48<33:06:05, 3.58s/it] {'loss': 0.208, 'grad_norm': 1.077364504649222, 'learning_rate': 9.36831875607386e-06, 'epoch': 0.03}
  3%|▎ | 965/34278 [1:06:54<38:00:42, 4.11s/it] {'loss': 0.2155, 'grad_norm': 1.2229919345979692, 'learning_rate': 9.378036929057338e-06, 'epoch': 0.03}
  3%|▎ | 966/34278 [1:06:57<35:28:19, 3.83s/it] {'loss': 0.2116, 'grad_norm': 1.066008540075167, 'learning_rate': 9.387755102040818e-06, 'epoch': 0.03}
  3%|▎ | 967/34278 [1:07:00<32:12:44, 3.48s/it] {'loss': 0.2192, 'grad_norm': 0.9000305593331918, 'learning_rate': 9.397473275024295e-06, 'epoch': 0.03}
  3%|▎ | 968/34278 [1:07:03<32:22:40, 3.50s/it] {'loss': 0.2446, 'grad_norm': 1.2483103558638065, 'learning_rate': 9.407191448007775e-06, 'epoch': 0.03}
  3%|▎ | 969/34278 [1:07:09<39:19:48, 4.25s/it] {'loss': 0.2237, 'grad_norm': 1.1654942991154087, 'learning_rate': 9.416909620991254e-06, 'epoch': 0.03}
  3%|▎ | 970/34278 [1:07:13<36:56:09, 3.99s/it] {'loss': 0.2449, 'grad_norm': 1.191747132746991, 'learning_rate': 9.426627793974733e-06, 'epoch': 0.03}
  3%|▎ | 971/34278 [1:07:16<34:31:07, 3.73s/it] {'loss': 0.2364, 'grad_norm': 1.1163267821303762, 'learning_rate': 9.436345966958213e-06, 'epoch': 0.03}
  3%|▎ | 972/34278 [1:07:19<32:41:08, 3.53s/it] {'loss': 0.2179, 'grad_norm': 1.3026286877189537, 'learning_rate': 9.446064139941692e-06, 'epoch': 0.03}
  3%|▎ | 973/34278 [1:07:22<31:29:16, 3.40s/it] {'loss': 0.2369, 'grad_norm': 1.1532836150899277, 'learning_rate': 9.455782312925171e-06, 'epoch': 0.03}
  3%|▎ | 974/34278 [1:07:25<30:03:44, 3.25s/it] {'loss': 0.2381, 'grad_norm': 1.2249757364981582, 'learning_rate': 9.465500485908649e-06, 'epoch': 0.03}
  3%|▎ | 975/34278 [1:07:31<37:30:43, 4.06s/it] {'loss': 0.2228, 'grad_norm': 0.895540398285069, 'learning_rate': 9.47521865889213e-06, 'epoch': 0.03}
  3%|▎ | 976/34278 [1:07:34<36:14:14, 3.92s/it] {'loss': 0.2292, 'grad_norm': 1.0567271109109968, 'learning_rate': 9.48493683187561e-06, 'epoch': 0.03}
  3%|▎ | 977/34278 [1:07:38<35:30:43, 3.84s/it] {'loss': 0.2548, 'grad_norm': 1.0687373189904286, 'learning_rate': 9.494655004859087e-06, 'epoch': 0.03}
  3%|▎ | 978/34278 [1:07:42<35:51:21, 3.88s/it] {'loss': 0.273, 'grad_norm': 1.305079059525117, 'learning_rate': 9.504373177842566e-06, 'epoch': 0.03}
  3%|▎ | 979/34278 [1:07:46<36:00:55, 3.89s/it] {'loss': 0.2006, 'grad_norm': 1.1352561925162497, 'learning_rate': 9.514091350826045e-06, 'epoch': 0.03}
Token indices sequence length is longer than the specified maximum sequence length for this model (11445 > 8192). Running this sequence through the model will result in indexing errors
  3%|▎ | 980/34278 [1:07:49<33:16:31, 3.60s/it] {'loss': 0.2206, 'grad_norm': 0.9489404568910308, 'learning_rate': 9.523809523809525e-06, 'epoch': 0.03}
  3%|▎ | 981/34278 [1:07:53<34:59:25, 3.78s/it] {'loss': 0.2104, 'grad_norm': 1.5175181641343878, 'learning_rate': 9.533527696793004e-06, 'epoch': 0.03}
  3%|▎ | 982/34278 [1:07:56<34:05:43, 3.69s/it] {'loss': 0.228, 'grad_norm': 1.401465604706397, 'learning_rate': 9.543245869776483e-06, 'epoch': 0.03}
  3%|▎ | 983/34278 [1:08:00<32:24:55, 3.50s/it] {'loss': 0.1962, 'grad_norm': 0.8548407597315596, 'learning_rate': 9.552964042759963e-06, 'epoch': 0.03}
  3%|▎ | 984/34278 [1:08:03<32:09:13, 3.48s/it] {'loss': 0.2216, 'grad_norm': 1.2278122948693826, 'learning_rate': 9.56268221574344e-06, 'epoch': 0.03}
  3%|▎ | 985/34278 [1:08:06<31:13:50, 3.38s/it] {'loss': 0.2103, 'grad_norm': 1.0253338593853756, 'learning_rate': 9.57240038872692e-06, 'epoch': 0.03}
Token indices sequence length is longer than the specified maximum sequence length for this model (13750 > 8192). Running this sequence through the model will result in indexing errors
  3%|▎ | 986/34278 [1:08:09<31:02:55, 3.36s/it] {'loss': 0.2151, 'grad_norm': 1.0819021309696246, 'learning_rate': 9.582118561710399e-06, 'epoch': 0.03}
  3%|▎ | 987/34278 [1:08:13<31:26:39, 3.40s/it] {'loss': 0.2278, 'grad_norm': 1.1621048015455133, 'learning_rate': 9.591836734693878e-06, 'epoch': 0.03}
  3%|▎ | 988/34278 [1:08:17<32:57:32, 3.56s/it] {'loss': 0.2065, 'grad_norm': 1.148004969755955, 'learning_rate': 9.601554907677358e-06, 'epoch': 0.03}
  3%|▎ | 989/34278 [1:08:20<32:22:28, 3.50s/it] {'loss': 0.2662, 'grad_norm': 1.4542424991666156, 'learning_rate': 9.611273080660837e-06, 'epoch': 0.03}
  3%|▎ | 990/34278 [1:08:25<37:22:32, 4.04s/it] {'loss': 0.2584, 'grad_norm': 1.4312639880264453, 'learning_rate': 9.620991253644316e-06, 'epoch': 0.03}
  3%|▎ | 991/34278 [1:08:29<34:58:09, 3.78s/it] {'loss': 0.2002, 'grad_norm': 1.248581176882113, 'learning_rate': 9.630709426627794e-06, 'epoch': 0.03}
  3%|▎ | 992/34278 [1:08:32<33:27:25, 3.62s/it] {'loss': 0.2338, 'grad_norm': 1.075820748048547, 'learning_rate': 9.640427599611275e-06, 'epoch': 0.03}
Token indices sequence length is longer than the specified maximum sequence length for this model (13251 > 8192). Running this sequence through the model will result in indexing errors
  3%|▎ | 993/34278 [1:08:38<40:39:50, 4.40s/it] {'loss': 0.2322, 'grad_norm': 1.4514438131506477, 'learning_rate': 9.650145772594754e-06, 'epoch': 0.03}
  3%|▎ | 994/34278 [1:08:41<37:25:05, 4.05s/it] {'loss': 0.2208, 'grad_norm': 1.139222248383189, 'learning_rate': 9.659863945578232e-06, 'epoch': 0.03}
  3%|▎ | 995/34278 [1:08:44<34:11:57, 3.70s/it] {'loss': 0.2531, 'grad_norm': 1.2710835376802465, 'learning_rate': 9.669582118561711e-06, 'epoch': 0.03}
  3%|▎ | 996/34278 [1:08:47<32:29:50, 3.52s/it] {'loss': 0.217, 'grad_norm': 1.3670607561351638, 'learning_rate': 9.67930029154519e-06, 'epoch': 0.03}
  3%|▎ | 997/34278 [1:08:51<33:52:20, 3.66s/it] {'loss': 0.2368, 'grad_norm': 1.0421126991423406, 'learning_rate': 9.68901846452867e-06, 'epoch': 0.03}
[1:08:51<33:52:20, 3.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 998/34278 [1:08:54<32:10:06, 3.48s/it] {'loss': 0.2484, 'grad_norm': 1.164074668991034, 'learning_rate': 9.698736637512149e-06, 'epoch': 0.03} 3%|▎ | 998/34278 [1:08:54<32:10:06, 3.48s/it] 3%|▎ | 999/34278 [1:09:01<40:38:52, 4.40s/it] {'loss': 0.2278, 'grad_norm': 1.1465573713760517, 'learning_rate': 9.708454810495628e-06, 'epoch': 0.03} 3%|▎ | 999/34278 [1:09:01<40:38:52, 4.40s/it] 3%|▎ | 1000/34278 [1:09:07<45:19:24, 4.90s/it] {'loss': 0.2462, 'grad_norm': 1.0784643564271585, 'learning_rate': 9.718172983479108e-06, 'epoch': 0.03} 3%|▎ | 1000/34278 [1:09:07<45:19:24, 4.90s/it] 3%|▎ | 1001/34278 [1:09:10<41:18:45, 4.47s/it] {'loss': 0.2311, 'grad_norm': 1.1130447279632785, 'learning_rate': 9.727891156462585e-06, 'epoch': 0.03} 3%|▎ | 1001/34278 [1:09:10<41:18:45, 4.47s/it] 3%|▎ | 1002/34278 [1:09:14<39:29:58, 4.27s/it] {'loss': 0.232, 'grad_norm': 1.2503403259198171, 'learning_rate': 9.737609329446065e-06, 'epoch': 0.03} 3%|▎ | 1002/34278 [1:09:14<39:29:58, 4.27s/it] 3%|▎ | 1003/34278 [1:09:17<36:15:39, 3.92s/it] {'loss': 0.2339, 'grad_norm': 1.1556947570749498, 'learning_rate': 9.747327502429544e-06, 'epoch': 0.03} 3%|▎ | 1003/34278 [1:09:17<36:15:39, 3.92s/it] 3%|▎ | 1004/34278 [1:09:24<42:42:31, 4.62s/it] {'loss': 0.2439, 'grad_norm': 0.9942713910664488, 'learning_rate': 9.757045675413023e-06, 'epoch': 0.03} 3%|▎ | 1004/34278 [1:09:24<42:42:31, 4.62s/it] 3%|▎ | 1005/34278 [1:09:28<42:06:49, 4.56s/it] {'loss': 0.2094, 'grad_norm': 0.979660492071142, 'learning_rate': 9.766763848396502e-06, 'epoch': 0.03} 3%|▎ | 1005/34278 [1:09:28<42:06:49, 4.56s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1006/34278 [1:09:32<40:47:10, 4.41s/it] {'loss': 0.2114, 'grad_norm': 0.9381926924935178, 'learning_rate': 9.776482021379982e-06, 'epoch': 0.03} 3%|▎ | 1006/34278 [1:09:32<40:47:10, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1007/34278 [1:09:35<37:15:34, 4.03s/it] {'loss': 0.2134, 'grad_norm': 0.9335735269940261, 'learning_rate': 9.78620019436346e-06, 'epoch': 0.03} 3%|▎ | 1007/34278 [1:09:35<37:15:34, 4.03s/it] 3%|▎ | 1008/34278 [1:09:38<34:52:15, 3.77s/it] {'loss': 0.2344, 'grad_norm': 1.0880985866604813, 'learning_rate': 9.795918367346939e-06, 'epoch': 0.03} 3%|▎ | 1008/34278 [1:09:38<34:52:15, 3.77s/it] 3%|▎ | 1009/34278 [1:09:42<33:13:55, 3.60s/it] {'loss': 0.2284, 'grad_norm': 1.2107379716170956, 'learning_rate': 9.805636540330418e-06, 'epoch': 0.03} 3%|▎ | 1009/34278 [1:09:42<33:13:55, 3.60s/it] 3%|▎ | 1010/34278 [1:09:46<34:31:06, 3.74s/it] {'loss': 0.2275, 'grad_norm': 1.1463554003095695, 'learning_rate': 9.815354713313899e-06, 'epoch': 0.03} 3%|▎ | 1010/34278 [1:09:46<34:31:06, 3.74s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1011/34278 [1:09:49<32:23:25, 3.51s/it] {'loss': 0.2212, 'grad_norm': 0.9356145745746572, 'learning_rate': 9.825072886297377e-06, 'epoch': 0.03} 3%|▎ | 1011/34278 [1:09:49<32:23:25, 3.51s/it] 3%|▎ | 1012/34278 [1:09:52<31:12:52, 3.38s/it] {'loss': 0.2009, 'grad_norm': 1.1456220731005373, 'learning_rate': 9.834791059280856e-06, 'epoch': 0.03} 3%|▎ | 1012/34278 [1:09:52<31:12:52, 3.38s/it] 3%|▎ | 1013/34278 [1:09:55<31:10:58, 3.37s/it] {'loss': 0.2433, 'grad_norm': 0.9493920852972088, 'learning_rate': 9.844509232264335e-06, 'epoch': 0.03} 3%|▎ | 1013/34278 [1:09:55<31:10:58, 3.37s/it] 3%|▎ | 1014/34278 [1:10:00<34:37:02, 3.75s/it] {'loss': 0.2425, 'grad_norm': 1.120088355834498, 'learning_rate': 9.854227405247815e-06, 'epoch': 0.03} 3%|▎ | 1014/34278 [1:10:00<34:37:02, 3.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1015/34278 [1:10:03<34:34:54, 3.74s/it] {'loss': 0.2248, 'grad_norm': 1.0822183880901732, 'learning_rate': 9.863945578231294e-06, 'epoch': 0.03} 3%|▎ | 1015/34278 [1:10:03<34:34:54, 3.74s/it] 3%|▎ | 1016/34278 [1:10:08<36:18:59, 3.93s/it] {'loss': 0.2257, 'grad_norm': 0.9589973441307148, 'learning_rate': 9.873663751214773e-06, 'epoch': 0.03} 3%|▎ | 1016/34278 [1:10:08<36:18:59, 3.93s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1017/34278 [1:10:11<33:36:45, 3.64s/it] {'loss': 0.2179, 'grad_norm': 1.1954242272928033, 'learning_rate': 9.883381924198252e-06, 'epoch': 0.03} 3%|▎ | 1017/34278 [1:10:11<33:36:45, 3.64s/it] 3%|▎ | 1018/34278 [1:10:15<35:19:24, 3.82s/it] {'loss': 0.2343, 'grad_norm': 1.0242808476806817, 'learning_rate': 9.89310009718173e-06, 'epoch': 0.03} 3%|▎ | 1018/34278 [1:10:15<35:19:24, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1019/34278 [1:10:18<33:49:13, 3.66s/it] {'loss': 0.2291, 'grad_norm': 0.9177405765805806, 'learning_rate': 9.90281827016521e-06, 'epoch': 0.03} 3%|▎ | 1019/34278 [1:10:18<33:49:13, 3.66s/it] 3%|▎ | 1020/34278 [1:10:21<32:07:51, 3.48s/it] {'loss': 0.2086, 'grad_norm': 1.2918547765404111, 'learning_rate': 9.912536443148689e-06, 'epoch': 0.03} 3%|▎ | 1020/34278 [1:10:21<32:07:51, 3.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1021/34278 [1:10:24<30:30:09, 3.30s/it] {'loss': 0.1967, 'grad_norm': 0.9779153452842309, 'learning_rate': 9.922254616132168e-06, 'epoch': 0.03} 3%|▎ | 1021/34278 [1:10:24<30:30:09, 3.30s/it] 3%|▎ | 1022/34278 [1:10:29<34:47:45, 3.77s/it] {'loss': 0.2171, 'grad_norm': 1.4523478461643309, 'learning_rate': 9.931972789115647e-06, 'epoch': 0.03} 3%|▎ | 1022/34278 [1:10:29<34:47:45, 3.77s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1023/34278 [1:10:32<32:47:38, 3.55s/it] {'loss': 0.2229, 'grad_norm': 1.2042093839394294, 'learning_rate': 9.941690962099127e-06, 'epoch': 0.03} 3%|▎ | 1023/34278 [1:10:32<32:47:38, 3.55s/it] 3%|▎ | 1024/34278 [1:10:38<39:48:35, 4.31s/it] {'loss': 0.2259, 'grad_norm': 1.373374530674181, 'learning_rate': 9.951409135082604e-06, 'epoch': 0.03} 3%|▎ | 1024/34278 [1:10:38<39:48:35, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1025/34278 [1:10:42<37:45:57, 4.09s/it] {'loss': 0.2517, 'grad_norm': 1.4002754178324355, 'learning_rate': 9.961127308066084e-06, 'epoch': 0.03} 3%|▎ | 1025/34278 [1:10:42<37:45:57, 4.09s/it] 3%|▎ | 1026/34278 [1:10:46<39:18:42, 4.26s/it] {'loss': 0.2115, 'grad_norm': 1.1459327387497917, 'learning_rate': 9.970845481049563e-06, 'epoch': 0.03} 3%|▎ | 1026/34278 [1:10:46<39:18:42, 4.26s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1027/34278 [1:10:50<37:01:14, 4.01s/it] {'loss': 0.2271, 'grad_norm': 1.2113509802193894, 'learning_rate': 9.980563654033044e-06, 'epoch': 0.03} 3%|▎ | 1027/34278 [1:10:50<37:01:14, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1028/34278 [1:10:53<34:01:06, 3.68s/it] {'loss': 0.2373, 'grad_norm': 1.2171569257268688, 'learning_rate': 9.990281827016522e-06, 'epoch': 0.03} 3%|▎ | 1028/34278 [1:10:53<34:01:06, 3.68s/it] 3%|▎ | 1029/34278 [1:10:57<35:15:29, 3.82s/it] {'loss': 0.2208, 'grad_norm': 1.2814230734715215, 'learning_rate': 1e-05, 'epoch': 0.03} 3%|▎ | 1029/34278 [1:10:57<35:15:29, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1030/34278 [1:11:00<33:21:30, 3.61s/it] {'loss': 0.2418, 'grad_norm': 1.1038896598078904, 'learning_rate': 9.999999977680598e-06, 'epoch': 0.03} 3%|▎ | 1030/34278 [1:11:00<33:21:30, 3.61s/it] 3%|▎ | 1031/34278 [1:11:03<32:02:39, 3.47s/it] {'loss': 0.271, 'grad_norm': 1.1780794329233468, 'learning_rate': 9.99999991072239e-06, 'epoch': 0.03} 3%|▎ | 1031/34278 [1:11:03<32:02:39, 3.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1032/34278 [1:11:07<34:00:17, 3.68s/it] {'loss': 0.2393, 'grad_norm': 1.2161672986842347, 'learning_rate': 9.999999799125373e-06, 'epoch': 0.03} 3%|▎ | 1032/34278 [1:11:07<34:00:17, 3.68s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1033/34278 [1:11:14<41:05:15, 4.45s/it] {'loss': 0.2331, 'grad_norm': 1.0013279539555224, 'learning_rate': 9.999999642889553e-06, 'epoch': 0.03} 3%|▎ | 1033/34278 [1:11:14<41:05:15, 4.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1034/34278 [1:11:19<43:23:26, 4.70s/it] {'loss': 0.2565, 'grad_norm': 1.1191553516627004, 'learning_rate': 9.999999442014931e-06, 'epoch': 0.03} 3%|▎ | 1034/34278 [1:11:19<43:23:26, 4.70s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1035/34278 [1:11:22<38:24:48, 4.16s/it] {'loss': 0.2656, 'grad_norm': 1.0903349175568633, 'learning_rate': 9.999999196501506e-06, 'epoch': 0.03} 3%|▎ | 1035/34278 [1:11:22<38:24:48, 4.16s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10014 > 8192). 
Running this sequence through the model will result in indexing errors 3%|▎ | 1036/34278 [1:11:25<36:47:07, 3.98s/it] {'loss': 0.2493, 'grad_norm': 1.6161898354090392, 'learning_rate': 9.999998906349283e-06, 'epoch': 0.03} 3%|▎ | 1036/34278 [1:11:25<36:47:07, 3.98s/it] 3%|▎ | 1037/34278 [1:11:29<36:20:16, 3.94s/it] {'loss': 0.2247, 'grad_norm': 1.0186301722121622, 'learning_rate': 9.999998571558263e-06, 'epoch': 0.03} 3%|▎ | 1037/34278 [1:11:29<36:20:16, 3.94s/it] 3%|▎ | 1038/34278 [1:11:32<33:42:08, 3.65s/it] {'loss': 0.2076, 'grad_norm': 0.9351870649492161, 'learning_rate': 9.999998192128449e-06, 'epoch': 0.03} 3%|▎ | 1038/34278 [1:11:32<33:42:08, 3.65s/it] 3%|▎ | 1039/34278 [1:11:37<36:41:00, 3.97s/it] {'loss': 0.248, 'grad_norm': 1.352484917920168, 'learning_rate': 9.999997768059845e-06, 'epoch': 0.03} 3%|▎ | 1039/34278 [1:11:37<36:41:00, 3.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1040/34278 [1:11:42<40:10:26, 4.35s/it] {'loss': 0.1975, 'grad_norm': 0.9669604788901461, 'learning_rate': 9.999997299352456e-06, 'epoch': 0.03} 3%|▎ | 1040/34278 [1:11:42<40:10:26, 4.35s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1041/34278 [1:11:47<42:09:09, 4.57s/it] {'loss': 0.2477, 'grad_norm': 1.2802932840500656, 'learning_rate': 9.999996786006282e-06, 'epoch': 0.03} 3%|▎ | 1041/34278 [1:11:47<42:09:09, 4.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1042/34278 [1:11:50<38:08:51, 4.13s/it] {'loss': 0.1886, 'grad_norm': 1.0765392634378834, 'learning_rate': 9.999996228021332e-06, 'epoch': 0.03} 3%|▎ | 1042/34278 [1:11:50<38:08:51, 4.13s/it] 3%|▎ | 1043/34278 [1:11:55<38:42:43, 4.19s/it] {'loss': 0.2101, 'grad_norm': 1.2080753231670838, 'learning_rate': 9.999995625397607e-06, 'epoch': 0.03} 3%|▎ | 1043/34278 [1:11:55<38:42:43, 4.19s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1044/34278 [1:12:01<44:30:59, 4.82s/it] {'loss': 0.2501, 'grad_norm': 1.0108288252893798, 'learning_rate': 9.999994978135117e-06, 'epoch': 0.03} 3%|▎ | 1044/34278 [1:12:01<44:30:59, 4.82s/it] 3%|▎ | 1045/34278 [1:12:04<39:13:49, 4.25s/it] {'loss': 0.219, 'grad_norm': 1.0778645089294687, 'learning_rate': 9.999994286233866e-06, 'epoch': 0.03} 3%|▎ | 1045/34278 [1:12:04<39:13:49, 4.25s/it] 3%|▎ | 1046/34278 [1:12:08<38:07:22, 4.13s/it] {'loss': 0.2725, 'grad_norm': 1.46972921740258, 'learning_rate': 9.999993549693859e-06, 'epoch': 0.03} 3%|▎ | 1046/34278 [1:12:08<38:07:22, 4.13s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1047/34278 [1:12:13<40:14:42, 4.36s/it] {'loss': 0.2262, 'grad_norm': 1.0426141075524984, 'learning_rate': 9.999992768515101e-06, 'epoch': 0.03} 3%|▎ | 1047/34278 [1:12:13<40:14:42, 4.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1048/34278 [1:12:16<37:38:40, 4.08s/it] {'loss': 0.2029, 'grad_norm': 0.8282285003795017, 'learning_rate': 9.999991942697602e-06, 'epoch': 0.03} 3%|▎ | 1048/34278 [1:12:16<37:38:40, 4.08s/it] 3%|▎ | 1049/34278 [1:12:19<34:29:20, 3.74s/it] {'loss': 0.2347, 'grad_norm': 1.287499827347934, 'learning_rate': 9.999991072241371e-06, 'epoch': 0.03} 3%|▎ | 1049/34278 [1:12:19<34:29:20, 3.74s/it] 3%|▎ | 1050/34278 [1:12:22<32:26:43, 3.52s/it] {'loss': 0.2292, 'grad_norm': 1.0875295125811832, 'learning_rate': 9.999990157146411e-06, 'epoch': 0.03} 3%|▎ | 1050/34278 [1:12:22<32:26:43, 3.52s/it] 3%|▎ | 1051/34278 [1:12:25<31:00:31, 3.36s/it] {'loss': 0.2094, 'grad_norm': 1.0651058116013083, 'learning_rate': 9.999989197412733e-06, 'epoch': 0.03} 3%|▎ | 1051/34278 [1:12:25<31:00:31, 3.36s/it] 3%|▎ | 1052/34278 [1:12:29<33:41:24, 3.65s/it] {'loss': 0.2448, 'grad_norm': 1.3083532189291245, 'learning_rate': 9.999988193040345e-06, 'epoch': 0.03} 3%|▎ | 1052/34278 [1:12:29<33:41:24, 3.65s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1053/34278 [1:12:32<31:22:50, 3.40s/it] {'loss': 0.2164, 'grad_norm': 1.174307964114287, 'learning_rate': 9.999987144029256e-06, 'epoch': 0.03} 3%|▎ | 1053/34278 [1:12:32<31:22:50, 3.40s/it] 3%|▎ | 1054/34278 [1:12:37<35:28:37, 3.84s/it] {'loss': 0.2175, 'grad_norm': 1.1792434998984536, 'learning_rate': 9.999986050379476e-06, 'epoch': 0.03} 3%|▎ | 1054/34278 [1:12:37<35:28:37, 3.84s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1055/34278 [1:12:44<42:54:39, 4.65s/it] {'loss': 0.2458, 'grad_norm': 1.105215045612077, 'learning_rate': 9.999984912091012e-06, 'epoch': 0.03} 3%|▎ | 1055/34278 [1:12:44<42:54:39, 4.65s/it] 3%|▎ | 1056/34278 [1:12:47<38:54:08, 4.22s/it] {'loss': 0.2386, 'grad_norm': 1.2327368379074821, 'learning_rate': 9.999983729163879e-06, 'epoch': 0.03} 3%|▎ | 1056/34278 [1:12:47<38:54:08, 4.22s/it] 3%|▎ | 1057/34278 [1:12:50<37:38:48, 4.08s/it] {'loss': 0.2403, 'grad_norm': 1.0071343378859123, 'learning_rate': 9.999982501598085e-06, 'epoch': 0.03} 3%|▎ | 1057/34278 [1:12:51<37:38:48, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1058/34278 [1:12:53<34:29:43, 3.74s/it] {'loss': 0.2333, 'grad_norm': 0.9581753368319474, 'learning_rate': 9.999981229393638e-06, 'epoch': 0.03} 3%|▎ | 1058/34278 [1:12:53<34:29:43, 3.74s/it] 3%|▎ | 1059/34278 [1:12:58<35:33:06, 3.85s/it] {'loss': 0.2296, 'grad_norm': 0.9039464725641874, 'learning_rate': 9.999979912550554e-06, 'epoch': 0.03} 3%|▎ | 1059/34278 [1:12:58<35:33:06, 3.85s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1060/34278 [1:13:01<34:52:59, 3.78s/it] {'loss': 0.2247, 'grad_norm': 0.9799420306191866, 'learning_rate': 9.999978551068843e-06, 'epoch': 0.03} 3%|▎ | 1060/34278 [1:13:01<34:52:59, 3.78s/it] 3%|▎ | 1061/34278 [1:13:05<34:42:35, 3.76s/it] {'loss': 0.2481, 'grad_norm': 1.0631962544979339, 'learning_rate': 9.999977144948516e-06, 'epoch': 0.03} 3%|▎ | 1061/34278 [1:13:05<34:42:35, 3.76s/it] 3%|▎ | 1062/34278 [1:13:08<33:52:38, 3.67s/it] {'loss': 0.2337, 'grad_norm': 0.8651539004894622, 'learning_rate': 9.999975694189588e-06, 'epoch': 0.03} 3%|▎ | 1062/34278 [1:13:08<33:52:38, 3.67s/it] 3%|▎ | 1063/34278 [1:13:11<32:24:27, 3.51s/it] {'loss': 0.2519, 'grad_norm': 1.1665030369366545, 'learning_rate': 9.999974198792071e-06, 'epoch': 0.03} 3%|▎ | 1063/34278 [1:13:11<32:24:27, 3.51s/it] 3%|▎ | 1064/34278 [1:13:14<30:24:18, 3.30s/it] {'loss': 0.2231, 'grad_norm': 1.114903267694991, 'learning_rate': 9.999972658755976e-06, 'epoch': 0.03} 3%|▎ | 1064/34278 [1:13:14<30:24:18, 3.30s/it] 3%|▎ | 1065/34278 [1:13:20<38:23:14, 4.16s/it] {'loss': 0.2148, 'grad_norm': 0.9330234882071323, 'learning_rate': 9.99997107408132e-06, 'epoch': 0.03} 3%|▎ | 1065/34278 [1:13:20<38:23:14, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1066/34278 [1:13:24<36:34:43, 3.96s/it] {'loss': 0.2154, 'grad_norm': 0.9682184930048112, 'learning_rate': 9.999969444768116e-06, 'epoch': 0.03} 3%|▎ | 1066/34278 [1:13:24<36:34:43, 3.96s/it] 3%|▎ | 1067/34278 [1:13:28<36:46:59, 3.99s/it] {'loss': 0.2454, 'grad_norm': 1.10379167524357, 'learning_rate': 9.999967770816376e-06, 'epoch': 0.03} 3%|▎ | 1067/34278 [1:13:28<36:46:59, 3.99s/it] 3%|▎ | 1068/34278 [1:13:32<35:37:33, 3.86s/it] {'loss': 0.225, 'grad_norm': 0.9999280949417427, 'learning_rate': 9.99996605222612e-06, 'epoch': 0.03} 3%|▎ | 1068/34278 [1:13:32<35:37:33, 3.86s/it] 3%|▎ | 1069/34278 [1:13:35<34:18:30, 3.72s/it] {'loss': 0.2335, 'grad_norm': 1.0971592229248597, 'learning_rate': 9.999964288997361e-06, 'epoch': 0.03} 3%|▎ | 1069/34278 [1:13:35<34:18:30, 3.72s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1070/34278 [1:13:41<41:29:23, 4.50s/it] {'loss': 0.2646, 'grad_norm': 0.9988697953983956, 'learning_rate': 9.999962481130112e-06, 'epoch': 0.03} 3%|▎ | 1070/34278 [1:13:41<41:29:23, 4.50s/it] 3%|▎ | 1071/34278 [1:13:45<38:13:09, 4.14s/it] {'loss': 0.2036, 'grad_norm': 1.1785567945511561, 'learning_rate': 9.999960628624394e-06, 'epoch': 0.03} 3%|▎ | 1071/34278 [1:13:45<38:13:09, 4.14s/it] 3%|▎ | 1072/34278 [1:13:48<35:12:16, 3.82s/it] {'loss': 0.2206, 'grad_norm': 0.8444076295505306, 'learning_rate': 9.999958731480219e-06, 'epoch': 0.03} 3%|▎ | 1072/34278 [1:13:48<35:12:16, 3.82s/it] 3%|▎ | 1073/34278 [1:13:51<34:14:50, 3.71s/it] {'loss': 0.2196, 'grad_norm': 1.1765564114291198, 'learning_rate': 9.999956789697608e-06, 'epoch': 0.03} 3%|▎ | 1073/34278 [1:13:51<34:14:50, 3.71s/it] 3%|▎ | 1074/34278 [1:13:54<32:34:04, 3.53s/it] {'loss': 0.2435, 'grad_norm': 1.0027789207640034, 'learning_rate': 9.999954803276575e-06, 'epoch': 0.03} 3%|▎ | 1074/34278 [1:13:54<32:34:04, 3.53s/it] 3%|▎ | 1075/34278 [1:13:58<32:44:27, 3.55s/it] {'loss': 0.2146, 'grad_norm': 1.1942686350375853, 'learning_rate': 9.99995277221714e-06, 'epoch': 0.03} 3%|▎ | 1075/34278 [1:13:58<32:44:27, 3.55s/it] 3%|▎ | 1076/34278 [1:14:03<36:15:50, 3.93s/it] {'loss': 0.2211, 'grad_norm': 1.0105641916721053, 'learning_rate': 9.99995069651932e-06, 'epoch': 0.03} 3%|▎ | 1076/34278 [1:14:03<36:15:50, 3.93s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 3%|▎ | 1077/34278 [1:14:07<37:39:58, 4.08s/it] {'loss': 0.2601, 'grad_norm': 1.035972104046321, 'learning_rate': 9.999948576183133e-06, 'epoch': 0.03} 3%|▎ | 1077/34278 [1:14:07<37:39:58, 4.08s/it] 3%|▎ | 1078/34278 [1:14:13<43:01:16, 4.66s/it] {'loss': 0.231, 'grad_norm': 1.0463994881138075, 'learning_rate': 9.999946411208598e-06, 'epoch': 0.03} 3%|▎ | 1078/34278 [1:14:13<43:01:16, 4.66s/it] 3%|▎ | 1079/34278 [1:14:17<41:52:30, 4.54s/it] {'loss': 0.2162, 'grad_norm': 1.0958469859592759, 'learning_rate': 9.999944201595736e-06, 'epoch': 0.03} 3%|▎ | 1079/34278 [1:14:17<41:52:30, 4.54s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 3%|▎ | 1080/34278 [1:14:20<37:54:47, 4.11s/it] {'loss': 0.2536, 'grad_norm': 0.9170157458484897, 'learning_rate': 9.999941947344567e-06, 'epoch': 0.03} 3%|▎ | 1080/34278 [1:14:20<37:54:47, 4.11s/it] 3%|▎ | 1081/34278 [1:14:23<34:41:09, 3.76s/it] {'loss': 0.2283, 'grad_norm': 1.0339277976868972, 'learning_rate': 9.999939648455108e-06, 'epoch': 0.03} 3%|▎ | 1081/34278 [1:14:23<34:41:09, 3.76s/it] 3%|▎ | 1082/34278 [1:14:26<32:29:59, 3.52s/it] {'loss': 0.2272, 'grad_norm': 1.010049306504477, 'learning_rate': 9.999937304927384e-06, 'epoch': 0.03} 3%|▎ | 1082/34278 [1:14:26<32:29:59, 3.52s/it] 3%|▎ | 1083/34278 [1:14:30<32:24:32, 3.51s/it] {'loss': 0.2367, 'grad_norm': 0.8365637377041215, 'learning_rate': 9.999934916761411e-06, 'epoch': 0.03} 3%|▎ | 1083/34278 [1:14:30<32:24:32, 3.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (10038 > 8192). 
Running this sequence through the model will result in indexing errors 3%|▎ | 1084/34278 [1:14:33<31:24:37, 3.41s/it] {'loss': 0.2162, 'grad_norm': 1.0402870618881737, 'learning_rate': 9.999932483957212e-06, 'epoch': 0.03} 3%|▎ | 1084/34278 [1:14:33<31:24:37, 3.41s/it] 3%|▎ | 1085/34278 [1:14:36<30:09:04, 3.27s/it] {'loss': 0.2538, 'grad_norm': 1.0734228405774093, 'learning_rate': 9.999930006514811e-06, 'epoch': 0.03} 3%|▎ | 1085/34278 [1:14:36<30:09:04, 3.27s/it] 3%|▎ | 1086/34278 [1:14:39<30:26:37, 3.30s/it] {'loss': 0.2526, 'grad_norm': 1.064405888182433, 'learning_rate': 9.999927484434229e-06, 'epoch': 0.03} 3%|▎ | 1086/34278 [1:14:39<30:26:37, 3.30s/it] 3%|▎ | 1087/34278 [1:14:42<29:44:17, 3.23s/it] {'loss': 0.1984, 'grad_norm': 1.072147764043234, 'learning_rate': 9.999924917715486e-06, 'epoch': 0.03} 3%|▎ | 1087/34278 [1:14:42<29:44:17, 3.23s/it] 3%|▎ | 1088/34278 [1:14:45<29:06:42, 3.16s/it] {'loss': 0.2155, 'grad_norm': 0.9001795646488481, 'learning_rate': 9.999922306358607e-06, 'epoch': 0.03} 3%|▎ | 1088/34278 [1:14:45<29:06:42, 3.16s/it] 3%|▎ | 1089/34278 [1:14:49<29:12:25, 3.17s/it] {'loss': 0.2185, 'grad_norm': 1.080438778596477, 'learning_rate': 9.999919650363617e-06, 'epoch': 0.03} 3%|▎ | 1089/34278 [1:14:49<29:12:25, 3.17s/it] 3%|▎ | 1090/34278 [1:14:54<36:00:42, 3.91s/it] {'loss': 0.2397, 'grad_norm': 0.9943418293671965, 'learning_rate': 9.999916949730536e-06, 'epoch': 0.03} 3%|▎ | 1090/34278 [1:14:54<36:00:42, 3.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
[steps 1091-1165/34278, epoch 0.03 — tqdm redraws and duplicate warnings collapsed]
loss per step ranged ~0.19-0.28 (step 1091: {'loss': 0.2258, 'grad_norm': 1.1837, 'learning_rate': 9.999914e-06, 'epoch': 0.03}; step 1165: {'loss': 0.2257, 'grad_norm': 0.9564, 'learning_rate': 9.999587e-06, 'epoch': 0.03}); throughput ~3.3-4.9 s/it, ETA ~30-45 h
[repeated many times across these steps] /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None  warnings.warn(
[after step 1096] Token indices sequence length is longer than the specified maximum sequence length for this model (9813 > 8192). Running this sequence through the model will result in indexing errors
[after step 1160] Token indices sequence length is longer than the specified maximum sequence length for this model (13560 > 8192). Running this sequence through the model will result in indexing errors
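The recurring `torch/utils/checkpoint.py:87` warning above usually means gradient checkpointing is wrapping a block whose inputs carry no gradient (a common situation when a frozen embedding or vision tower feeds checkpointed layers). Below is a minimal pure-torch sketch of the standard mitigation: mark the checkpointed block's input as requiring grad so autograd still records the recomputation graph. The module names (`emb`, `block`) are illustrative, not this repo's code; in Hugging Face Transformers the equivalent switch is typically `model.enable_input_require_grads()`.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# Illustrative setup: a frozen embedding feeding a gradient-checkpointed block,
# which reproduces the "None of the inputs have requires_grad=True" situation.
emb = nn.Embedding(10, 4)
emb.weight.requires_grad_(False)   # frozen input tower
block = nn.Linear(4, 4)            # trainable, gradient-checkpointed

hidden = emb(torch.tensor([1, 2, 3]))
hidden.requires_grad_(True)        # the fix: give the checkpointed input a grad flag

out = checkpoint(block, hidden, use_reentrant=False)
out.sum().backward()
assert block.weight.grad is not None  # gradients now flow into the trainable block
```

With the input flag set, the warning disappears and the checkpointed block's weights receive gradients as expected.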
step 1166: {'loss': 0.2126, 'grad_norm': 0.9628, 'learning_rate': 9.999581e-06, 'epoch': 0.03}
step 1167: {'loss': 0.2338, 'grad_norm': 1.0693, 'learning_rate': 9.999575e-06, 'epoch': 0.03}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f962e386070>
Failed to fetch sample 2762609. Exception: cannot identify image file <_io.BytesIO object at 0x7f962e386070>
steps 1168-1171: loss 0.2175 / 0.2229 / 0.2223 / 0.2315 (lr ~9.99955e-06, epoch 0.03), each followed by the checkpoint.py:87 "None of the inputs have requires_grad=True" UserWarning (collapsed)
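The `PIL.UnidentifiedImageError` above comes from `Image.open` receiving bytes that are not a decodable image (a truncated or corrupted object for sample 2762609). A common guard is to force the decode eagerly inside a try/except and hand back a placeholder plus a flag so the dataset can resample. The sketch below is a hypothetical `safe_pil_loader`, not the repo's actual `pil_loader`:

```python
import io
from PIL import Image, UnidentifiedImageError

def safe_pil_loader(raw_bytes, fallback_size=(32, 32)):
    """Guarded variant of Image.open: returns (image, ok). On corrupt bytes,
    returns a blank placeholder and ok=False so the caller can skip/resample."""
    try:
        img = Image.open(io.BytesIO(raw_bytes))
        img.load()  # force full decode now, not lazily at tensor-conversion time
        return img.convert("RGB"), True
    except (UnidentifiedImageError, OSError):
        return Image.new("RGB", fallback_size), False

# Corrupt bytes fall back to the placeholder instead of raising.
img, ok = safe_pil_loader(b"\x00\x01 not an image")
assert ok is False and img.size == (32, 32)
```

Returning a flag (rather than swallowing the error silently) lets `__getitem__` retry with a different index, which matches the "Failed to fetch sample" recovery path visible in the log.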
[steps 1172-1230/34278, epoch 0.03 -> 0.04 (0.04 from step 1200) — tqdm redraws and duplicate warnings collapsed]
loss per step ranged ~0.19-0.26 (step 1172: {'loss': 0.2253, 'grad_norm': 1.2867, 'learning_rate': 9.999544e-06}; step 1230: {'loss': 0.2179, 'grad_norm': 0.9229, 'learning_rate': 9.999098e-06}); throughput ~3.3-5.0 s/it
[repeated many times across these steps] /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None  warnings.warn(
[after step 1181] Token indices sequence length is longer than the specified maximum sequence length for this model (9707 > 8192). Running this sequence through the model will result in indexing errors
[after step 1208] Token indices sequence length is longer than the specified maximum sequence length for this model (8942 > 8192). Running this sequence through the model will result in indexing errors
[after step 1225] Token indices sequence length is longer than the specified maximum sequence length for this model (10076 > 8192). Running this sequence through the model will result in indexing errors
| 1230/34278 [1:24:15<38:41:57, 4.22s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1231/34278 [1:24:18<35:45:16, 3.89s/it] {'loss': 0.2322, 'grad_norm': 1.2543935634661596, 'learning_rate': 9.999089306717827e-06, 'epoch': 0.04} 4%|▎ | 1231/34278 [1:24:18<35:45:16, 3.89s/it] 4%|▎ | 1232/34278 [1:24:25<42:24:31, 4.62s/it] {'loss': 0.2477, 'grad_norm': 1.0599017323782356, 'learning_rate': 9.999080267911059e-06, 'epoch': 0.04} 4%|▎ | 1232/34278 [1:24:25<42:24:31, 4.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1233/34278 [1:24:28<39:34:42, 4.31s/it] {'loss': 0.1955, 'grad_norm': 1.08198813220789, 'learning_rate': 9.999071184473694e-06, 'epoch': 0.04} 4%|▎ | 1233/34278 [1:24:28<39:34:42, 4.31s/it] 4%|▎ | 1234/34278 [1:24:33<41:16:37, 4.50s/it] {'loss': 0.2284, 'grad_norm': 0.9296632008847323, 'learning_rate': 9.999062056405818e-06, 'epoch': 0.04} 4%|▎ | 1234/34278 [1:24:33<41:16:37, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 4%|▎ | 1235/34278 [1:24:36<37:26:54, 4.08s/it] {'loss': 0.2238, 'grad_norm': 1.124573375991063, 'learning_rate': 9.999052883707508e-06, 'epoch': 0.04} 4%|▎ | 1235/34278 [1:24:36<37:26:54, 4.08s/it] 4%|▎ | 1236/34278 [1:24:40<35:10:18, 3.83s/it] {'loss': 0.2219, 'grad_norm': 1.1526743479336314, 'learning_rate': 9.999043666378847e-06, 'epoch': 0.04} 4%|▎ | 1236/34278 [1:24:40<35:10:18, 3.83s/it] 4%|▎ | 1237/34278 [1:24:44<36:34:06, 3.98s/it] {'loss': 0.2347, 'grad_norm': 1.3432794585841652, 'learning_rate': 9.999034404419918e-06, 'epoch': 0.04} 4%|▎ | 1237/34278 [1:24:44<36:34:06, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1238/34278 [1:24:47<33:37:10, 3.66s/it] {'loss': 0.2454, 'grad_norm': 1.2156880471894373, 'learning_rate': 9.999025097830803e-06, 'epoch': 0.04} 4%|▎ | 1238/34278 [1:24:47<33:37:10, 3.66s/it] 4%|▎ | 1239/34278 [1:24:50<31:29:28, 3.43s/it] {'loss': 0.2259, 'grad_norm': 1.1655941652232784, 'learning_rate': 9.999015746611587e-06, 'epoch': 0.04} 4%|▎ | 1239/34278 [1:24:50<31:29:28, 3.43s/it] 4%|▎ | 1240/34278 [1:24:53<31:45:39, 3.46s/it] {'loss': 0.22, 'grad_norm': 0.9271587585084555, 'learning_rate': 9.999006350762349e-06, 'epoch': 0.04} 4%|▎ | 1240/34278 [1:24:53<31:45:39, 3.46s/it] 4%|▎ | 1241/34278 [1:24:56<31:16:32, 3.41s/it] {'loss': 0.2417, 'grad_norm': 1.335574920498898, 'learning_rate': 9.998996910283177e-06, 'epoch': 0.04} 4%|▎ | 1241/34278 [1:24:56<31:16:32, 3.41s/it] 4%|▎ | 1242/34278 [1:25:02<36:36:49, 3.99s/it] {'loss': 0.2118, 'grad_norm': 1.2195165397277774, 'learning_rate': 9.998987425174154e-06, 'epoch': 0.04} 4%|▎ | 1242/34278 [1:25:02<36:36:49, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the 
inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1243/34278 [1:25:05<33:30:04, 3.65s/it] {'loss': 0.2367, 'grad_norm': 0.9770996619705205, 'learning_rate': 9.998977895435365e-06, 'epoch': 0.04} 4%|▎ | 1243/34278 [1:25:05<33:30:04, 3.65s/it] 4%|▎ | 1244/34278 [1:25:08<33:09:02, 3.61s/it] {'loss': 0.2259, 'grad_norm': 1.1461435430787712, 'learning_rate': 9.998968321066893e-06, 'epoch': 0.04} 4%|▎ | 1244/34278 [1:25:08<33:09:02, 3.61s/it] 4%|▎ | 1245/34278 [1:25:11<32:14:59, 3.51s/it] {'loss': 0.2116, 'grad_norm': 0.9637216324090665, 'learning_rate': 9.998958702068825e-06, 'epoch': 0.04} 4%|▎ | 1245/34278 [1:25:12<32:14:59, 3.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1246/34278 [1:25:15<33:12:45, 3.62s/it] {'loss': 0.2229, 'grad_norm': 1.3104402836663753, 'learning_rate': 9.99894903844125e-06, 'epoch': 0.04} 4%|▎ | 1246/34278 [1:25:15<33:12:45, 3.62s/it] 4%|▎ | 1247/34278 [1:25:21<39:56:23, 4.35s/it] {'loss': 0.2533, 'grad_norm': 1.0646578161147395, 'learning_rate': 9.99893933018425e-06, 'epoch': 0.04} 4%|▎ | 1247/34278 [1:25:21<39:56:23, 4.35s/it] 4%|▎ | 1248/34278 [1:25:26<40:51:44, 4.45s/it] {'loss': 0.2534, 'grad_norm': 0.9792975473419264, 'learning_rate': 9.998929577297912e-06, 'epoch': 0.04} 4%|▎ | 1248/34278 [1:25:26<40:51:44, 4.45s/it] 4%|▎ | 1249/34278 [1:25:30<39:14:56, 4.28s/it] {'loss': 0.2465, 'grad_norm': 1.3632713777987333, 'learning_rate': 9.998919779782326e-06, 'epoch': 0.04} 4%|▎ | 1249/34278 [1:25:30<39:14:56, 4.28s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (11685 > 8192). Running this sequence through the model will result in indexing errors 4%|▎ | 1250/34278 [1:25:34<38:31:39, 4.20s/it] {'loss': 0.2462, 'grad_norm': 1.1058489882816205, 'learning_rate': 9.998909937637576e-06, 'epoch': 0.04} 4%|▎ | 1250/34278 [1:25:34<38:31:39, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1251/34278 [1:25:38<39:12:40, 4.27s/it] {'loss': 0.2228, 'grad_norm': 0.9250320891239557, 'learning_rate': 9.998900050863751e-06, 'epoch': 0.04} 4%|▎ | 1251/34278 [1:25:38<39:12:40, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1252/34278 [1:25:42<36:21:22, 3.96s/it] {'loss': 0.2159, 'grad_norm': 1.0517853930639236, 'learning_rate': 9.99889011946094e-06, 'epoch': 0.04} 4%|▎ | 1252/34278 [1:25:42<36:21:22, 3.96s/it] 4%|▎ | 1253/34278 [1:25:48<42:20:36, 4.62s/it] {'loss': 0.2312, 'grad_norm': 1.0641799816582995, 'learning_rate': 9.998880143429233e-06, 'epoch': 0.04} 4%|▎ | 1253/34278 [1:25:48<42:20:36, 4.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▎ | 1254/34278 [1:25:51<39:14:09, 4.28s/it] {'loss': 0.2234, 'grad_norm': 1.0400544389053685, 'learning_rate': 9.998870122768716e-06, 'epoch': 0.04} 4%|▎ | 1254/34278 [1:25:51<39:14:09, 4.28s/it] 4%|▎ | 1255/34278 [1:25:56<39:11:02, 4.27s/it] {'loss': 0.2206, 'grad_norm': 0.7784251761330888, 'learning_rate': 9.99886005747948e-06, 'epoch': 0.04} 4%|▎ | 1255/34278 [1:25:56<39:11:02, 4.27s/it] 4%|▎ | 1256/34278 [1:25:59<36:06:45, 3.94s/it] {'loss': 0.2338, 'grad_norm': 0.9387970895902331, 'learning_rate': 9.998849947561615e-06, 'epoch': 0.04} 4%|▎ | 1256/34278 [1:25:59<36:06:45, 3.94s/it] 4%|▎ | 1257/34278 [1:26:02<34:53:12, 3.80s/it] {'loss': 0.2256, 'grad_norm': 0.9265651405115524, 'learning_rate': 9.99883979301521e-06, 'epoch': 0.04} 4%|▎ | 1257/34278 [1:26:02<34:53:12, 3.80s/it] 4%|▎ | 1258/34278 [1:26:06<34:37:01, 3.77s/it] {'loss': 0.2585, 'grad_norm': 1.0746778660457141, 'learning_rate': 9.998829593840358e-06, 'epoch': 0.04} 4%|▎ | 1258/34278 [1:26:06<34:37:01, 3.77s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1259/34278 [1:26:11<38:58:05, 4.25s/it] {'loss': 0.2649, 'grad_norm': 0.980393031070998, 'learning_rate': 9.998819350037148e-06, 'epoch': 0.04} 4%|▎ | 1259/34278 [1:26:11<38:58:05, 4.25s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▎ | 1260/34278 [1:26:14<36:00:37, 3.93s/it] {'loss': 0.2196, 'grad_norm': 1.091527366559575, 'learning_rate': 9.998809061605671e-06, 'epoch': 0.04} 4%|▎ | 1260/34278 [1:26:14<36:00:37, 3.93s/it] 4%|▎ | 1261/34278 [1:26:18<33:44:09, 3.68s/it] {'loss': 0.2436, 'grad_norm': 0.9969539814682798, 'learning_rate': 9.998798728546022e-06, 'epoch': 0.04} 4%|▎ | 1261/34278 [1:26:18<33:44:09, 3.68s/it] 4%|▎ | 1262/34278 [1:26:21<32:51:55, 3.58s/it] {'loss': 0.2032, 'grad_norm': 1.0774496370979247, 'learning_rate': 9.998788350858291e-06, 'epoch': 0.04} 4%|▎ | 1262/34278 [1:26:21<32:51:55, 3.58s/it] 4%|▎ | 1263/34278 [1:26:24<31:49:56, 3.47s/it] {'loss': 0.2389, 'grad_norm': 1.1692389257363829, 'learning_rate': 9.99877792854257e-06, 'epoch': 0.04} 4%|▎ | 1263/34278 [1:26:24<31:49:56, 3.47s/it] 4%|▎ | 1264/34278 [1:26:27<31:04:24, 3.39s/it] {'loss': 0.2021, 'grad_norm': 1.0704672720932178, 'learning_rate': 9.998767461598954e-06, 'epoch': 0.04} 4%|▎ | 1264/34278 [1:26:27<31:04:24, 3.39s/it] 4%|▎ | 1265/34278 [1:26:30<30:06:09, 3.28s/it] {'loss': 0.2237, 'grad_norm': 0.8743761740096346, 'learning_rate': 9.998756950027535e-06, 'epoch': 0.04} 4%|▎ | 1265/34278 [1:26:30<30:06:09, 3.28s/it] 4%|▎ | 1266/34278 [1:26:33<29:31:50, 3.22s/it] {'loss': 0.2225, 'grad_norm': 0.8363473801995409, 'learning_rate': 9.998746393828406e-06, 'epoch': 0.04} 4%|▎ | 1266/34278 [1:26:33<29:31:50, 3.22s/it] 4%|▎ | 1267/34278 [1:26:37<29:17:11, 3.19s/it] {'loss': 0.2146, 'grad_norm': 1.1059349135328747, 'learning_rate': 9.998735793001663e-06, 'epoch': 0.04} 4%|▎ | 1267/34278 [1:26:37<29:17:11, 3.19s/it] 4%|▎ | 1268/34278 [1:26:40<28:40:15, 3.13s/it] {'loss': 0.2393, 'grad_norm': 1.4869298313402535, 'learning_rate': 9.998725147547401e-06, 'epoch': 0.04} 4%|▎ | 1268/34278 [1:26:40<28:40:15, 3.13s/it] 4%|▎ | 1269/34278 [1:26:43<28:18:59, 3.09s/it] {'loss': 0.193, 'grad_norm': 0.7916156050456058, 'learning_rate': 9.998714457465715e-06, 'epoch': 0.04} 4%|▎ | 1269/34278 
[1:26:43<28:18:59, 3.09s/it] 4%|▎ | 1270/34278 [1:26:46<28:08:32, 3.07s/it] {'loss': 0.2131, 'grad_norm': 1.0332321823689818, 'learning_rate': 9.998703722756698e-06, 'epoch': 0.04} 4%|▎ | 1270/34278 [1:26:46<28:08:32, 3.07s/it] 4%|▎ | 1271/34278 [1:26:50<31:46:10, 3.47s/it] {'loss': 0.204, 'grad_norm': 1.0944133136162206, 'learning_rate': 9.998692943420448e-06, 'epoch': 0.04} 4%|▎ | 1271/34278 [1:26:50<31:46:10, 3.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1272/34278 [1:26:55<35:29:49, 3.87s/it] {'loss': 0.259, 'grad_norm': 1.0342856859310066, 'learning_rate': 9.99868211945706e-06, 'epoch': 0.04} 4%|▎ | 1272/34278 [1:26:55<35:29:49, 3.87s/it] 4%|▎ | 1273/34278 [1:26:59<36:37:23, 3.99s/it] {'loss': 0.2106, 'grad_norm': 0.8740729653604431, 'learning_rate': 9.998671250866631e-06, 'epoch': 0.04} 4%|▎ | 1273/34278 [1:26:59<36:37:23, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1274/34278 [1:27:03<35:14:07, 3.84s/it] {'loss': 0.2356, 'grad_norm': 1.1146756165041172, 'learning_rate': 9.998660337649261e-06, 'epoch': 0.04} 4%|▎ | 1274/34278 [1:27:03<35:14:07, 3.84s/it] 4%|▎ | 1275/34278 [1:27:09<41:56:29, 4.58s/it] {'loss': 0.2227, 'grad_norm': 1.0625759772365957, 'learning_rate': 9.998649379805044e-06, 'epoch': 0.04} 4%|▎ | 1275/34278 [1:27:09<41:56:29, 4.58s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▎ | 1276/34278 [1:27:12<37:37:50, 4.10s/it] {'loss': 0.21, 'grad_norm': 0.8965303127918206, 'learning_rate': 9.998638377334076e-06, 'epoch': 0.04} 4%|▎ | 1276/34278 [1:27:12<37:37:50, 4.10s/it] 4%|▎ | 1277/34278 [1:27:15<35:31:41, 3.88s/it] {'loss': 0.2004, 'grad_norm': 0.9839152452283733, 'learning_rate': 9.99862733023646e-06, 'epoch': 0.04} 4%|▎ | 1277/34278 [1:27:15<35:31:41, 3.88s/it] 4%|▎ | 1278/34278 [1:27:18<33:37:25, 3.67s/it] {'loss': 0.2256, 'grad_norm': 0.9071206977228208, 'learning_rate': 9.998616238512292e-06, 'epoch': 0.04} 4%|▎ | 1278/34278 [1:27:18<33:37:25, 3.67s/it] 4%|▎ | 1279/34278 [1:27:22<33:57:50, 3.71s/it] {'loss': 0.198, 'grad_norm': 1.1487114913959353, 'learning_rate': 9.998605102161672e-06, 'epoch': 0.04} 4%|▎ | 1279/34278 [1:27:22<33:57:50, 3.71s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▎ | 1280/34278 [1:27:25<31:51:50, 3.48s/it] {'loss': 0.2025, 'grad_norm': 1.0496812943687999, 'learning_rate': 9.998593921184699e-06, 'epoch': 0.04} 4%|▎ | 1280/34278 [1:27:25<31:51:50, 3.48s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10954 > 8192). 
Running this sequence through the model will result in indexing errors 4%|▎ | 1281/34278 [1:27:28<30:26:20, 3.32s/it] {'loss': 0.2136, 'grad_norm': 1.0971957882567478, 'learning_rate': 9.998582695581471e-06, 'epoch': 0.04} 4%|▎ | 1281/34278 [1:27:28<30:26:20, 3.32s/it] 4%|▎ | 1282/34278 [1:27:31<29:55:10, 3.26s/it] {'loss': 0.2007, 'grad_norm': 1.1588128358759724, 'learning_rate': 9.99857142535209e-06, 'epoch': 0.04} 4%|▎ | 1282/34278 [1:27:31<29:55:10, 3.26s/it] 4%|▎ | 1283/34278 [1:27:37<37:29:08, 4.09s/it] {'loss': 0.2297, 'grad_norm': 0.9426336682930098, 'learning_rate': 9.998560110496658e-06, 'epoch': 0.04} 4%|▎ | 1283/34278 [1:27:37<37:29:08, 4.09s/it] 4%|▎ | 1284/34278 [1:27:40<33:53:00, 3.70s/it] {'loss': 0.2436, 'grad_norm': 1.10538456148938, 'learning_rate': 9.998548751015275e-06, 'epoch': 0.04} 4%|▎ | 1284/34278 [1:27:40<33:53:00, 3.70s/it] 4%|▎ | 1285/34278 [1:27:43<33:16:28, 3.63s/it] {'loss': 0.2116, 'grad_norm': 0.9765259385595629, 'learning_rate': 9.998537346908041e-06, 'epoch': 0.04} 4%|▎ | 1285/34278 [1:27:43<33:16:28, 3.63s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1286/34278 [1:27:46<31:36:49, 3.45s/it] {'loss': 0.2166, 'grad_norm': 1.0595574926908682, 'learning_rate': 9.99852589817506e-06, 'epoch': 0.04} 4%|▍ | 1286/34278 [1:27:46<31:36:49, 3.45s/it] 4%|▍ | 1287/34278 [1:27:50<32:10:54, 3.51s/it] {'loss': 0.2492, 'grad_norm': 1.169670012152128, 'learning_rate': 9.99851440481643e-06, 'epoch': 0.04} 4%|▍ | 1287/34278 [1:27:50<32:10:54, 3.51s/it] 4%|▍ | 1288/34278 [1:27:53<30:49:44, 3.36s/it] {'loss': 0.2302, 'grad_norm': 0.952212186128128, 'learning_rate': 9.99850286683226e-06, 'epoch': 0.04} 4%|▍ | 1288/34278 [1:27:53<30:49:44, 3.36s/it] 4%|▍ | 1289/34278 [1:27:56<28:35:35, 3.12s/it] {'loss': 0.2276, 'grad_norm': 1.1727912344580984, 'learning_rate': 9.998491284222647e-06, 'epoch': 0.04} 4%|▍ | 1289/34278 [1:27:56<28:35:35, 3.12s/it] 4%|▍ | 1290/34278 [1:28:02<36:39:17, 4.00s/it] {'loss': 0.2535, 'grad_norm': 1.0989007437200973, 'learning_rate': 9.998479656987699e-06, 'epoch': 0.04} 4%|▍ | 1290/34278 [1:28:02<36:39:17, 4.00s/it] 4%|▍ | 1291/34278 [1:28:08<42:40:38, 4.66s/it] {'loss': 0.2388, 'grad_norm': 1.0529017059038746, 'learning_rate': 9.998467985127518e-06, 'epoch': 0.04} 4%|▍ | 1291/34278 [1:28:08<42:40:38, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1292/34278 [1:28:12<40:13:08, 4.39s/it] {'loss': 0.2541, 'grad_norm': 1.097426098611742, 'learning_rate': 9.998456268642207e-06, 'epoch': 0.04} 4%|▍ | 1292/34278 [1:28:12<40:13:08, 4.39s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12522 > 8192). 
Running this sequence through the model will result in indexing errors 4%|▍ | 1293/34278 [1:28:18<44:37:15, 4.87s/it] {'loss': 0.225, 'grad_norm': 0.9185947687485129, 'learning_rate': 9.998444507531872e-06, 'epoch': 0.04} 4%|▍ | 1293/34278 [1:28:18<44:37:15, 4.87s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1294/34278 [1:28:21<39:49:48, 4.35s/it] {'loss': 0.2301, 'grad_norm': 0.9874842534819169, 'learning_rate': 9.998432701796617e-06, 'epoch': 0.04} 4%|▍ | 1294/34278 [1:28:21<39:49:48, 4.35s/it] 4%|▍ | 1295/34278 [1:28:25<39:21:46, 4.30s/it] {'loss': 0.2116, 'grad_norm': 0.987295071667692, 'learning_rate': 9.99842085143655e-06, 'epoch': 0.04} 4%|▍ | 1295/34278 [1:28:25<39:21:46, 4.30s/it] 4%|▍ | 1296/34278 [1:28:29<39:30:11, 4.31s/it] {'loss': 0.22, 'grad_norm': 0.9891448350948405, 'learning_rate': 9.998408956451773e-06, 'epoch': 0.04} 4%|▍ | 1296/34278 [1:28:29<39:30:11, 4.31s/it] 4%|▍ | 1297/34278 [1:28:33<38:36:21, 4.21s/it] {'loss': 0.1882, 'grad_norm': 0.8364712310069523, 'learning_rate': 9.998397016842394e-06, 'epoch': 0.04} 4%|▍ | 1297/34278 [1:28:33<38:36:21, 4.21s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn(
4%|▍ | 1298/34278 [1:28:40<44:07:19, 4.82s/it] {'loss': 0.2097, 'grad_norm': 1.0465819091625597, 'learning_rate': 9.99838503260852e-06, 'epoch': 0.04} 4%|▍ | 1298/34278 [1:28:40<44:07:19, 4.82s/it] 4%|▍ | 1299/34278 [1:28:43<39:52:42, 4.35s/it] {'loss': 0.2135, 'grad_norm': 1.0831275110704366, 'learning_rate': 9.998373003750259e-06, 'epoch': 0.04} 4%|▍ | 1299/34278 [1:28:43<39:52:42, 4.35s/it] 4%|▍ | 1300/34278 [1:28:46<37:31:57, 4.10s/it] {'loss': 0.2468, 'grad_norm': 1.0699745458763916, 'learning_rate': 9.998360930267715e-06, 'epoch': 0.04} 4%|▍ | 1300/34278 [1:28:46<37:31:57, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1301/34278 [1:28:51<38:41:06, 4.22s/it] {'loss': 0.2259, 'grad_norm': 0.9599499742993474, 'learning_rate': 9.998348812160999e-06, 'epoch': 0.04} 4%|▍ | 1301/34278 [1:28:51<38:41:06, 4.22s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn(
4%|▍ | 1302/34278 [1:28:54<36:56:28, 4.03s/it] {'loss': 0.219, 'grad_norm': 1.0159198034611567, 'learning_rate': 9.998336649430217e-06, 'epoch': 0.04} 4%|▍ | 1302/34278 [1:28:54<36:56:28, 4.03s/it] 4%|▍ | 1303/34278 [1:28:58<34:32:25, 3.77s/it] {'loss': 0.1939, 'grad_norm': 0.9105327449863514, 'learning_rate': 9.99832444207548e-06, 'epoch': 0.04} 4%|▍ | 1303/34278 [1:28:58<34:32:25, 3.77s/it] 4%|▍ | 1304/34278 [1:29:01<33:18:35, 3.64s/it] {'loss': 0.212, 'grad_norm': 1.0172706973557566, 'learning_rate': 9.998312190096896e-06, 'epoch': 0.04} 4%|▍ | 1304/34278 [1:29:01<33:18:35, 3.64s/it] 4%|▍ | 1305/34278 [1:29:04<32:22:21, 3.53s/it] {'loss': 0.1974, 'grad_norm': 1.1388665063274868, 'learning_rate': 9.998299893494572e-06, 'epoch': 0.04} 4%|▍ | 1305/34278 [1:29:04<32:22:21, 3.53s/it] 4%|▍ | 1306/34278 [1:29:10<38:11:58, 4.17s/it] {'loss': 0.2216, 'grad_norm': 0.9868309224658413, 'learning_rate': 9.99828755226862e-06, 'epoch': 0.04} 4%|▍ | 1306/34278 [1:29:10<38:11:58, 4.17s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 4%|▍ | 1307/34278 [1:29:13<34:33:56, 3.77s/it] {'loss': 0.2226, 'grad_norm': 1.2572752732793357, 'learning_rate': 9.998275166419152e-06, 'epoch': 0.04} 4%|▍ | 1307/34278 [1:29:13<34:33:56, 3.77s/it] 4%|▍ | 1308/34278 [1:29:16<33:12:28, 3.63s/it] {'loss': 0.2108, 'grad_norm': 1.1406110116712587, 'learning_rate': 9.998262735946274e-06, 'epoch': 0.04} 4%|▍ | 1308/34278 [1:29:16<33:12:28, 3.63s/it] 4%|▍ | 1309/34278 [1:29:19<32:16:10, 3.52s/it] {'loss': 0.2302, 'grad_norm': 0.8878135344973613, 'learning_rate': 9.9982502608501e-06, 'epoch': 0.04} 4%|▍ | 1309/34278 [1:29:19<32:16:10, 3.52s/it] 4%|▍ | 1310/34278 [1:29:25<38:37:43, 4.22s/it] {'loss': 0.2345, 'grad_norm': 1.0201696690286826, 'learning_rate': 9.998237741130742e-06, 'epoch': 0.04} 4%|▍ | 1310/34278 [1:29:25<38:37:43, 4.22s/it] 4%|▍ | 1311/34278 [1:29:31<43:40:50, 4.77s/it] {'loss': 0.2268, 'grad_norm': 0.9213405910768891, 'learning_rate': 9.998225176788309e-06, 'epoch': 0.04} 4%|▍ | 1311/34278 [1:29:31<43:40:50, 4.77s/it] 4%|▍ | 1312/34278 [1:29:37<45:26:12, 4.96s/it] {'loss': 0.1968, 'grad_norm': 0.9604706367061303, 'learning_rate': 9.998212567822917e-06, 'epoch': 0.04} 4%|▍ | 1312/34278 [1:29:37<45:26:12, 4.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1313/34278 [1:29:40<40:46:12, 4.45s/it] {'loss': 0.2221, 'grad_norm': 0.8607403891983637, 'learning_rate': 9.998199914234674e-06, 'epoch': 0.04} 4%|▍ | 1313/34278 [1:29:40<40:46:12, 4.45s/it] 4%|▍ | 1314/34278 [1:29:43<37:34:38, 4.10s/it] {'loss': 0.228, 'grad_norm': 0.9066436101604387, 'learning_rate': 9.998187216023696e-06, 'epoch': 0.04} 4%|▍ | 1314/34278 [1:29:43<37:34:38, 4.10s/it] 4%|▍ | 1315/34278 [1:29:47<35:52:38, 3.92s/it] {'loss': 0.2111, 'grad_norm': 0.9785700401350489, 'learning_rate': 9.998174473190098e-06, 'epoch': 0.04} 4%|▍ | 1315/34278 [1:29:47<35:52:38, 3.92s/it] 4%|▍ | 1316/34278 [1:29:50<34:00:51, 3.71s/it] {'loss': 0.2275, 'grad_norm': 0.9121512263543413, 'learning_rate': 9.99816168573399e-06, 'epoch': 0.04} 4%|▍ | 1316/34278 [1:29:50<34:00:51, 3.71s/it] 4%|▍ | 1317/34278 [1:29:56<40:47:03, 4.45s/it] {'loss': 0.2348, 'grad_norm': 1.0388189498393994, 'learning_rate': 9.998148853655486e-06, 'epoch': 0.04} 4%|▍ | 1317/34278 [1:29:56<40:47:03, 4.45s/it] 4%|▍ | 1318/34278 [1:29:59<36:54:50, 4.03s/it] {'loss': 0.2389, 'grad_norm': 1.266719867745567, 'learning_rate': 9.998135976954704e-06, 'epoch': 0.04} 4%|▍ | 1318/34278 [1:29:59<36:54:50, 4.03s/it] 4%|▍ | 1319/34278 [1:30:02<35:02:09, 3.83s/it] {'loss': 0.2198, 'grad_norm': 1.1566629911987631, 'learning_rate': 9.998123055631756e-06, 'epoch': 0.04} 4%|▍ | 1319/34278 [1:30:02<35:02:09, 3.83s/it] 4%|▍ | 1320/34278 [1:30:06<33:51:05, 3.70s/it] {'loss': 0.2266, 'grad_norm': 1.2143406841243476, 'learning_rate': 9.99811008968676e-06, 'epoch': 0.04} 4%|▍ | 1320/34278 [1:30:06<33:51:05, 3.70s/it] 4%|▍ | 1321/34278 [1:30:09<32:44:23, 3.58s/it] {'loss': 0.2109, 'grad_norm': 0.9275245921485578, 'learning_rate': 9.998097079119828e-06, 'epoch': 0.04} 4%|▍ | 1321/34278 [1:30:09<32:44:23, 3.58s/it] 4%|▍ | 1322/34278 [1:30:14<35:40:19, 3.90s/it] {'loss': 0.2031, 'grad_norm': 1.2690957652371693, 'learning_rate': 9.998084023931081e-06, 'epoch': 0.04} 4%|▍ | 1322/34278 
[1:30:14<35:40:19, 3.90s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1323/34278 [1:30:17<34:16:09, 3.74s/it] {'loss': 0.2205, 'grad_norm': 1.2652695614831517, 'learning_rate': 9.998070924120631e-06, 'epoch': 0.04} 4%|▍ | 1323/34278 [1:30:17<34:16:09, 3.74s/it] 4%|▍ | 1324/34278 [1:30:20<31:49:59, 3.48s/it] {'loss': 0.2145, 'grad_norm': 0.9942675570568783, 'learning_rate': 9.998057779688597e-06, 'epoch': 0.04} 4%|▍ | 1324/34278 [1:30:20<31:49:59, 3.48s/it] 4%|▍ | 1325/34278 [1:30:23<30:16:01, 3.31s/it] {'loss': 0.2325, 'grad_norm': 1.1329949453374601, 'learning_rate': 9.998044590635099e-06, 'epoch': 0.04} 4%|▍ | 1325/34278 [1:30:23<30:16:01, 3.31s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8356 > 8192).
Running this sequence through the model will result in indexing errors 4%|▍ | 1326/34278 [1:30:26<29:43:51, 3.25s/it] {'loss': 0.2047, 'grad_norm': 0.8833575532196354, 'learning_rate': 9.99803135696025e-06, 'epoch': 0.04} 4%|▍ | 1326/34278 [1:30:26<29:43:51, 3.25s/it] 4%|▍ | 1327/34278 [1:30:30<30:35:20, 3.34s/it] {'loss': 0.2213, 'grad_norm': 0.9744575379511667, 'learning_rate': 9.998018078664169e-06, 'epoch': 0.04} 4%|▍ | 1327/34278 [1:30:30<30:35:20, 3.34s/it] 4%|▍ | 1328/34278 [1:30:33<30:30:24, 3.33s/it] {'loss': 0.235, 'grad_norm': 0.9418066454427708, 'learning_rate': 9.998004755746977e-06, 'epoch': 0.04} 4%|▍ | 1328/34278 [1:30:33<30:30:24, 3.33s/it] 4%|▍ | 1329/34278 [1:30:39<38:04:31, 4.16s/it] {'loss': 0.233, 'grad_norm': 1.0147343297476814, 'learning_rate': 9.997991388208791e-06, 'epoch': 0.04} 4%|▍ | 1329/34278 [1:30:39<38:04:31, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1330/34278 [1:30:43<37:58:25, 4.15s/it] {'loss': 0.2271, 'grad_norm': 1.1558484528664879, 'learning_rate': 9.997977976049731e-06, 'epoch': 0.04} 4%|▍ | 1330/34278 [1:30:43<37:58:25, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1331/34278 [1:30:46<35:37:06, 3.89s/it] {'loss': 0.2345, 'grad_norm': 1.201303925343188, 'learning_rate': 9.997964519269918e-06, 'epoch': 0.04} 4%|▍ | 1331/34278 [1:30:46<35:37:06, 3.89s/it] 4%|▍ | 1332/34278 [1:30:49<33:11:17, 3.63s/it] {'loss': 0.2185, 'grad_norm': 0.9019540794269065, 'learning_rate': 9.99795101786947e-06, 'epoch': 0.04} 4%|▍ | 1332/34278 [1:30:49<33:11:17, 3.63s/it] 4%|▍ | 1333/34278 [1:30:54<37:03:13, 4.05s/it] {'loss': 0.1996, 'grad_norm': 1.283035114543337, 'learning_rate': 9.997937471848508e-06, 'epoch': 0.04} 4%|▍ | 1333/34278 [1:30:54<37:03:13, 4.05s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1334/34278 [1:30:58<36:00:50, 3.94s/it] {'loss': 0.2305, 'grad_norm': 0.9430874796476467, 'learning_rate': 9.997923881207155e-06, 'epoch': 0.04} 4%|▍ | 1334/34278 [1:30:58<36:00:50, 3.94s/it] 4%|▍ | 1335/34278 [1:31:01<33:07:16, 3.62s/it] {'loss': 0.2537, 'grad_norm': 0.9882398860434454, 'learning_rate': 9.99791024594553e-06, 'epoch': 0.04} 4%|▍ | 1335/34278 [1:31:01<33:07:16, 3.62s/it] 4%|▍ | 1336/34278 [1:31:04<31:13:52, 3.41s/it] {'loss': 0.2189, 'grad_norm': 1.1549123435876236, 'learning_rate': 9.997896566063754e-06, 'epoch': 0.04} 4%|▍ | 1336/34278 [1:31:04<31:13:52, 3.41s/it] 4%|▍ | 1337/34278 [1:31:10<38:58:51, 4.26s/it] {'loss': 0.2072, 'grad_norm': 1.0921869118312562, 'learning_rate': 9.997882841561952e-06, 'epoch': 0.04} 4%|▍ | 1337/34278 [1:31:10<38:58:51, 4.26s/it] 4%|▍ | 1338/34278 [1:31:16<43:49:21, 4.79s/it] {'loss': 0.2191, 'grad_norm': 1.0278245070875547, 'learning_rate': 9.997869072440245e-06, 'epoch': 0.04} 4%|▍ | 1338/34278 [1:31:16<43:49:21, 4.79s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the 
inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1339/34278 [1:31:20<41:09:51, 4.50s/it] {'loss': 0.2164, 'grad_norm': 1.1846947002143793, 'learning_rate': 9.997855258698756e-06, 'epoch': 0.04} 4%|▍ | 1339/34278 [1:31:20<41:09:51, 4.50s/it] 4%|▍ | 1340/34278 [1:31:23<37:50:30, 4.14s/it] {'loss': 0.2211, 'grad_norm': 0.9517210256139034, 'learning_rate': 9.997841400337608e-06, 'epoch': 0.04} 4%|▍ | 1340/34278 [1:31:23<37:50:30, 4.14s/it] 4%|▍ | 1341/34278 [1:31:27<37:26:34, 4.09s/it] {'loss': 0.1905, 'grad_norm': 1.1482527040068051, 'learning_rate': 9.997827497356925e-06, 'epoch': 0.04} 4%|▍ | 1341/34278 [1:31:27<37:26:34, 4.09s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1342/34278 [1:31:31<35:09:23, 3.84s/it] {'loss': 0.2213, 'grad_norm': 1.0407439080463416, 'learning_rate': 9.997813549756831e-06, 'epoch': 0.04} 4%|▍ | 1342/34278 [1:31:31<35:09:23, 3.84s/it] 4%|▍ | 1343/34278 [1:31:35<37:08:20, 4.06s/it] {'loss': 0.2798, 'grad_norm': 1.1499738624169245, 'learning_rate': 9.99779955753745e-06, 'epoch': 0.04} 4%|▍ | 1343/34278 [1:31:35<37:08:20, 4.06s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1344/34278 [1:31:39<36:48:29, 4.02s/it] {'loss': 0.2103, 'grad_norm': 1.2375675402663109, 'learning_rate': 9.99778552069891e-06, 'epoch': 0.04} 4%|▍ | 1344/34278 [1:31:39<36:48:29, 4.02s/it] 4%|▍ | 1345/34278 [1:31:43<37:28:06, 4.10s/it] {'loss': 0.2392, 'grad_norm': 0.8669936389379261, 'learning_rate': 9.997771439241332e-06, 'epoch': 0.04} 4%|▍ | 1345/34278 [1:31:43<37:28:06, 4.10s/it] 4%|▍ | 1346/34278 [1:31:46<34:11:36, 3.74s/it] {'loss': 0.2091, 'grad_norm': 1.052290039102351, 'learning_rate': 9.997757313164843e-06, 'epoch': 0.04} 4%|▍ | 1346/34278 [1:31:46<34:11:36, 3.74s/it] 4%|▍ | 1347/34278 [1:31:49<32:21:05, 3.54s/it] {'loss': 0.2134, 'grad_norm': 0.9268609287273716, 'learning_rate': 9.997743142469571e-06, 'epoch': 0.04} 4%|▍ | 1347/34278 [1:31:49<32:21:05, 3.54s/it] 4%|▍ | 1348/34278 [1:31:52<30:47:53, 3.37s/it] {'loss': 0.2041, 'grad_norm': 0.9549665618673209, 'learning_rate': 9.997728927155643e-06, 'epoch': 0.04} 4%|▍ | 1348/34278 [1:31:52<30:47:53, 3.37s/it] 4%|▍ | 1349/34278 [1:31:58<37:35:03, 4.11s/it] {'loss': 0.2016, 'grad_norm': 0.9331634674606833, 'learning_rate': 9.997714667223181e-06, 'epoch': 0.04} 4%|▍ | 1349/34278 [1:31:58<37:35:03, 4.11s/it] 4%|▍ | 1350/34278 [1:32:01<34:10:45, 3.74s/it] {'loss': 0.2192, 'grad_norm': 0.9818661097744485, 'learning_rate': 9.997700362672317e-06, 'epoch': 0.04} 4%|▍ | 1350/34278 [1:32:01<34:10:45, 3.74s/it] 4%|▍ | 1351/34278 [1:32:07<41:03:20, 4.49s/it] {'loss': 0.2017, 'grad_norm': 0.8404467533545338, 'learning_rate': 9.997686013503178e-06, 'epoch': 0.04} 4%|▍ | 1351/34278 [1:32:07<41:03:20, 4.49s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1352/34278 [1:32:10<37:22:40, 4.09s/it] {'loss': 0.2244, 'grad_norm': 0.9627892193536619, 'learning_rate': 9.997671619715889e-06, 'epoch': 0.04} 4%|▍ | 1352/34278 [1:32:10<37:22:40, 4.09s/it] 4%|▍ | 1353/34278 [1:32:13<34:40:48, 3.79s/it] {'loss': 0.2002, 'grad_norm': 0.9226865047277064, 'learning_rate': 9.997657181310584e-06, 'epoch': 0.04} 4%|▍ | 1353/34278 [1:32:14<34:40:48, 3.79s/it] 4%|▍ | 1354/34278 [1:32:16<32:17:12, 3.53s/it] {'loss': 0.2183, 'grad_norm': 1.1284562198697294, 'learning_rate': 9.997642698287386e-06, 'epoch': 0.04} 4%|▍ | 1354/34278 [1:32:16<32:17:12, 3.53s/it] 4%|▍ | 1355/34278 [1:32:22<39:03:05, 4.27s/it] {'loss': 0.2203, 'grad_norm': 1.0154493713003871, 'learning_rate': 9.997628170646428e-06, 'epoch': 0.04} 4%|▍ | 1355/34278 [1:32:22<39:03:05, 4.27s/it] 4%|▍ | 1356/34278 [1:32:26<37:38:00, 4.12s/it] {'loss': 0.2226, 'grad_norm': 1.028409596537489, 'learning_rate': 9.997613598387838e-06, 'epoch': 0.04} 4%|▍ | 1356/34278 [1:32:26<37:38:00, 4.12s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1357/34278 [1:32:29<34:26:04, 3.77s/it] {'loss': 0.2271, 'grad_norm': 1.0342809832679385, 'learning_rate': 9.997598981511749e-06, 'epoch': 0.04} 4%|▍ | 1357/34278 [1:32:29<34:26:04, 3.77s/it] 4%|▍ | 1358/34278 [1:32:32<33:23:55, 3.65s/it] {'loss': 0.1995, 'grad_norm': 0.8296214691141625, 'learning_rate': 9.997584320018287e-06, 'epoch': 0.04} 4%|▍ | 1358/34278 [1:32:33<33:23:55, 3.65s/it] 4%|▍ | 1359/34278 [1:32:36<34:13:55, 3.74s/it] {'loss': 0.2081, 'grad_norm': 0.9319299517444279, 'learning_rate': 9.997569613907587e-06, 'epoch': 0.04} 4%|▍ | 1359/34278 [1:32:36<34:13:55, 3.74s/it] 4%|▍ | 1360/34278 [1:32:40<33:19:29, 3.64s/it] {'loss': 0.2124, 'grad_norm': 0.8792592970600249, 'learning_rate': 9.997554863179777e-06, 'epoch': 0.04} 4%|▍ | 1360/34278 [1:32:40<33:19:29, 3.64s/it] 4%|▍ | 1361/34278 [1:32:43<32:12:20, 3.52s/it] {'loss': 0.224, 'grad_norm': 0.8704137211657997, 'learning_rate': 9.997540067834991e-06, 'epoch': 0.04} 4%|▍ | 1361/34278 [1:32:43<32:12:20, 3.52s/it] 4%|▍ | 1362/34278 [1:32:47<34:25:45, 3.77s/it] {'loss': 0.2022, 'grad_norm': 0.9876591870234631, 'learning_rate': 9.997525227873361e-06, 'epoch': 0.04} 4%|▍ | 1362/34278 [1:32:47<34:25:45, 3.77s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1363/34278 [1:32:51<33:05:07, 3.62s/it] {'loss': 0.2039, 'grad_norm': 1.1291734542403797, 'learning_rate': 9.997510343295018e-06, 'epoch': 0.04} 4%|▍ | 1363/34278 [1:32:51<33:05:07, 3.62s/it] 4%|▍ | 1364/34278 [1:32:54<31:34:55, 3.45s/it] {'loss': 0.23, 'grad_norm': 1.1268418016229587, 'learning_rate': 9.997495414100095e-06, 'epoch': 0.04} 4%|▍ | 1364/34278 [1:32:54<31:34:55, 3.45s/it] 4%|▍ | 1365/34278 [1:32:59<37:04:22, 4.06s/it] {'loss': 0.2298, 'grad_norm': 1.148988880184851, 'learning_rate': 9.997480440288726e-06, 'epoch': 0.04} 4%|▍ | 1365/34278 [1:32:59<37:04:22, 4.06s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1366/34278 [1:33:05<42:52:51, 4.69s/it] {'loss': 0.2555, 'grad_norm': 1.1247163410569034, 'learning_rate': 9.997465421861046e-06, 'epoch': 0.04} 4%|▍ | 1366/34278 [1:33:05<42:52:51, 4.69s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1367/34278 [1:33:09<39:03:51, 4.27s/it] {'loss': 0.1948, 'grad_norm': 0.9965754859389144, 'learning_rate': 9.997450358817185e-06, 'epoch': 0.04} 4%|▍ | 1367/34278 [1:33:09<39:03:51, 4.27s/it] 4%|▍ | 1368/34278 [1:33:12<37:25:43, 4.09s/it] {'loss': 0.2693, 'grad_norm': 1.0375175847296314, 'learning_rate': 9.997435251157284e-06, 'epoch': 0.04} 4%|▍ | 1368/34278 [1:33:12<37:25:43, 4.09s/it] 4%|▍ | 1369/34278 [1:33:16<36:10:32, 3.96s/it] {'loss': 0.2025, 'grad_norm': 1.023653601087876, 'learning_rate': 9.99742009888147e-06, 'epoch': 0.04} 4%|▍ | 1369/34278 [1:33:16<36:10:32, 3.96s/it] 4%|▍ | 1370/34278 [1:33:19<33:03:25, 3.62s/it] {'loss': 0.1867, 'grad_norm': 0.9448813207728343, 'learning_rate': 9.997404901989884e-06, 'epoch': 0.04} 4%|▍ | 1370/34278 [1:33:19<33:03:25, 3.62s/it] 4%|▍ | 1371/34278 [1:33:22<31:32:02, 3.45s/it] {'loss': 0.2416, 'grad_norm': 0.938275403909158, 'learning_rate': 9.997389660482662e-06, 'epoch': 0.04} 4%|▍ | 1371/34278 [1:33:22<31:32:02, 3.45s/it] 4%|▍ | 1372/34278 [1:33:27<36:38:49, 4.01s/it] {'loss': 0.2569, 'grad_norm': 0.9685428975847087, 'learning_rate': 9.997374374359935e-06, 'epoch': 0.04} 4%|▍ | 1372/34278 [1:33:27<36:38:49, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1373/34278 [1:33:31<35:17:13, 3.86s/it] {'loss': 0.2268, 'grad_norm': 1.0739192210213488, 'learning_rate': 9.997359043621844e-06, 'epoch': 0.04} 4%|▍ | 1373/34278 [1:33:31<35:17:13, 3.86s/it] 4%|▍ | 1374/34278 [1:33:37<41:39:05, 4.56s/it] {'loss': 0.218, 'grad_norm': 0.9738521939663268, 'learning_rate': 9.997343668268525e-06, 'epoch': 0.04} 4%|▍ | 1374/34278 [1:33:37<41:39:05, 4.56s/it] 4%|▍ | 1375/34278 [1:33:40<37:56:30, 4.15s/it] {'loss': 0.2071, 'grad_norm': 0.9592428474829809, 'learning_rate': 9.997328248300114e-06, 'epoch': 0.04} 4%|▍ | 1375/34278 [1:33:40<37:56:30, 4.15s/it] 4%|▍ | 1376/34278 [1:33:45<41:09:00, 4.50s/it] {'loss': 0.2207, 'grad_norm': 0.8635790952625635, 'learning_rate': 9.997312783716751e-06, 'epoch': 0.04} 4%|▍ | 1376/34278 [1:33:45<41:09:00, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1377/34278 [1:33:48<37:12:12, 4.07s/it] {'loss': 0.2169, 'grad_norm': 0.9335635120510127, 'learning_rate': 9.997297274518569e-06, 'epoch': 0.04} 4%|▍ | 1377/34278 [1:33:49<37:12:12, 4.07s/it] 4%|▍ | 1378/34278 [1:33:51<33:51:39, 3.71s/it] {'loss': 0.228, 'grad_norm': 1.1756898645285696, 'learning_rate': 9.997281720705713e-06, 'epoch': 0.04} 4%|▍ | 1378/34278 [1:33:51<33:51:39, 3.71s/it] 4%|▍ | 1379/34278 [1:33:57<40:26:25, 4.43s/it] {'loss': 0.2033, 'grad_norm': 1.0855032514936684, 'learning_rate': 9.997266122278317e-06, 'epoch': 0.04} 4%|▍ | 1379/34278 [1:33:57<40:26:25, 4.43s/it] 4%|▍ | 1380/34278 [1:34:00<36:14:41, 3.97s/it] {'loss': 0.2193, 'grad_norm': 0.8673801952091827, 'learning_rate': 9.997250479236522e-06, 'epoch': 0.04} 4%|▍ | 1380/34278 [1:34:00<36:14:41, 3.97s/it] 4%|▍ | 1381/34278 [1:34:04<34:33:46, 3.78s/it] {'loss': 0.232, 'grad_norm': 1.044271829891321, 'learning_rate': 9.99723479158047e-06, 'epoch': 0.04} 4%|▍ | 1381/34278 [1:34:04<34:33:46, 3.78s/it] 4%|▍ | 1382/34278 [1:34:07<32:36:11, 3.57s/it] {'loss': 0.2152, 'grad_norm': 1.1137257096053048, 'learning_rate': 9.997219059310296e-06, 'epoch': 0.04} 4%|▍ | 1382/34278 [1:34:07<32:36:11, 3.57s/it] 4%|▍ | 1383/34278 [1:34:12<36:20:00, 3.98s/it] {'loss': 0.2103, 'grad_norm': 0.8227482080417956, 'learning_rate': 9.997203282426144e-06, 'epoch': 0.04} 4%|▍ | 1383/34278 [1:34:12<36:20:00, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1384/34278 [1:34:15<34:32:58, 3.78s/it] {'loss': 0.2167, 'grad_norm': 0.9498515007487641, 'learning_rate': 9.997187460928155e-06, 'epoch': 0.04} 4%|▍ | 1384/34278 [1:34:15<34:32:58, 3.78s/it] 4%|▍ | 1385/34278 [1:34:20<37:10:06, 4.07s/it] {'loss': 0.2538, 'grad_norm': 1.1769274227042676, 'learning_rate': 9.997171594816467e-06, 'epoch': 0.04} 4%|▍ | 1385/34278 [1:34:20<37:10:06, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1386/34278 [1:34:24<36:33:31, 4.00s/it] {'loss': 0.2333, 'grad_norm': 0.9315945779231783, 'learning_rate': 9.997155684091225e-06, 'epoch': 0.04} 4%|▍ | 1386/34278 [1:34:24<36:33:31, 4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1387/34278 [1:34:28<36:16:11, 3.97s/it] {'loss': 0.1961, 'grad_norm': 0.9803684457551111, 'learning_rate': 9.997139728752571e-06, 'epoch': 0.04} 4%|▍ | 1387/34278 [1:34:28<36:16:11, 3.97s/it] 4%|▍ | 1388/34278 [1:34:31<34:47:38, 3.81s/it] {'loss': 0.2194, 'grad_norm': 0.9885088854204722, 'learning_rate': 9.997123728800647e-06, 'epoch': 0.04} 4%|▍ | 1388/34278 [1:34:31<34:47:38, 3.81s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1389/34278 [1:34:35<36:28:03, 3.99s/it] {'loss': 0.2226, 'grad_norm': 1.0034427516436237, 'learning_rate': 9.997107684235592e-06, 'epoch': 0.04} 4%|▍ | 1389/34278 [1:34:35<36:28:03, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1390/34278 [1:34:38<33:26:22, 3.66s/it] {'loss': 0.2282, 'grad_norm': 0.9975148044713197, 'learning_rate': 9.997091595057556e-06, 'epoch': 0.04} 4%|▍ | 1390/34278 [1:34:38<33:26:22, 3.66s/it] 4%|▍ | 1391/34278 [1:34:41<32:13:57, 3.53s/it] {'loss': 0.1854, 'grad_norm': 1.2398854129625423, 'learning_rate': 9.997075461266677e-06, 'epoch': 0.04} 4%|▍ | 1391/34278 [1:34:41<32:13:57, 3.53s/it] 4%|▍ | 1392/34278 [1:34:45<32:21:16, 3.54s/it] {'loss': 0.2222, 'grad_norm': 0.9221307377309209, 'learning_rate': 9.997059282863103e-06, 'epoch': 0.04} 4%|▍ | 1392/34278 [1:34:45<32:21:16, 3.54s/it] 4%|▍ | 1393/34278 [1:34:50<36:48:29, 4.03s/it] {'loss': 0.2392, 'grad_norm': 1.4777014478559738, 'learning_rate': 9.997043059846974e-06, 'epoch': 0.04} 4%|▍ | 1393/34278 [1:34:50<36:48:29, 4.03s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1394/34278 [1:34:54<35:51:20, 3.93s/it] {'loss': 0.2217, 'grad_norm': 0.9999336775425626, 'learning_rate': 9.997026792218439e-06, 'epoch': 0.04} 4%|▍ | 1394/34278 [1:34:54<35:51:20, 3.93s/it] 4%|▍ | 1395/34278 [1:34:58<36:39:04, 4.01s/it] {'loss': 0.2011, 'grad_norm': 1.0057735011149043, 'learning_rate': 9.99701047997764e-06, 'epoch': 0.04} 4%|▍ | 1395/34278 [1:34:58<36:39:04, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1396/34278 [1:35:02<35:02:41, 3.84s/it] {'loss': 0.2025, 'grad_norm': 1.0771243992386983, 'learning_rate': 9.996994123124727e-06, 'epoch': 0.04} 4%|▍ | 1396/34278 [1:35:02<35:02:41, 3.84s/it] 4%|▍ | 1397/34278 [1:35:05<32:43:20, 3.58s/it] {'loss': 0.2158, 'grad_norm': 1.1764991368919115, 'learning_rate': 9.996977721659841e-06, 'epoch': 0.04} 4%|▍ | 1397/34278 [1:35:05<32:43:20, 3.58s/it] 4%|▍ | 1398/34278 [1:35:08<33:34:27, 3.68s/it] {'loss': 0.2273, 'grad_norm': 1.2576445137719265, 'learning_rate': 9.996961275583133e-06, 'epoch': 0.04} 4%|▍ | 1398/34278 [1:35:08<33:34:27, 3.68s/it] 4%|▍ | 1399/34278 [1:35:13<34:51:58, 3.82s/it] {'loss': 0.2327, 'grad_norm': 0.9886776760583021, 'learning_rate': 9.996944784894747e-06, 'epoch': 0.04} 4%|▍ | 1399/34278 [1:35:13<34:51:58, 3.82s/it] 4%|▍ | 1400/34278 [1:35:15<32:16:30, 3.53s/it] {'loss': 0.2108, 'grad_norm': 0.9695993546130057, 'learning_rate': 9.99692824959483e-06, 'epoch': 0.04} 4%|▍ | 1400/34278 [1:35:15<32:16:30, 3.53s/it] 4%|▍ | 1401/34278 [1:35:18<30:28:21, 3.34s/it] {'loss': 0.1976, 'grad_norm': 0.9952565365059857, 'learning_rate': 9.99691166968353e-06, 'epoch': 0.04} 4%|▍ | 1401/34278 [1:35:18<30:28:21, 3.34s/it] 4%|▍ | 1402/34278 [1:35:23<32:50:53, 3.60s/it] {'loss': 0.2132, 'grad_norm': 0.9257956453901235, 'learning_rate': 9.996895045160997e-06, 'epoch': 0.04} 
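[editor note] The repeated torch.utils.checkpoint UserWarning in this log ("None of the inputs have requires_grad=True. Gradients will be None") typically appears when activation checkpointing wraps a segment whose inputs are all produced by frozen modules, so no checkpoint input carries a gradient. A minimal sketch of the cause and the usual workaround, using a toy frozen embedding rather than this repo's actual model (in HF Transformers the equivalent one-liner is `model.enable_input_require_grads()`):

```python
import torch
from torch import nn
from torch.utils.checkpoint import checkpoint

# Frozen embedding standing in for a frozen tower: its output has
# requires_grad=False, which is what triggers the warning when that
# output is the only input to a checkpointed segment.
frozen_embed = nn.Embedding(100, 8)
frozen_embed.weight.requires_grad_(False)

block = nn.Linear(8, 8)  # the checkpointed, trainable segment

ids = torch.tensor([[1, 2, 3]])
hidden = frozen_embed(ids)   # leaf tensor, requires_grad=False
hidden.requires_grad_(True)  # workaround: force grads at the boundary
out = checkpoint(block, hidden, use_reentrant=True)
out.sum().backward()         # block.weight.grad is now populated
```

Without the `requires_grad_(True)` line, reentrant checkpointing emits exactly this warning and the checkpointed segment's parameters receive no gradient.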
4%|▍ | 1402/34278 [1:35:23<32:50:53, 3.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1403/34278 [1:35:25<30:44:12, 3.37s/it] {'loss': 0.2241, 'grad_norm': 0.9610088002531586, 'learning_rate': 9.996878376027377e-06, 'epoch': 0.04} 4%|▍ | 1403/34278 [1:35:25<30:44:12, 3.37s/it] 4%|▍ | 1404/34278 [1:35:28<29:47:03, 3.26s/it] {'loss': 0.2383, 'grad_norm': 1.019045509669626, 'learning_rate': 9.99686166228282e-06, 'epoch': 0.04} 4%|▍ | 1404/34278 [1:35:28<29:47:03, 3.26s/it] 4%|▍ | 1405/34278 [1:35:32<31:14:46, 3.42s/it] {'loss': 0.198, 'grad_norm': 0.9650270584136241, 'learning_rate': 9.996844903927475e-06, 'epoch': 0.04} 4%|▍ | 1405/34278 [1:35:32<31:14:46, 3.42s/it] 4%|▍ | 1406/34278 [1:35:36<31:33:15, 3.46s/it] {'loss': 0.2085, 'grad_norm': 1.0796070829693196, 'learning_rate': 9.996828100961491e-06, 'epoch': 0.04} 4%|▍ | 1406/34278 [1:35:36<31:33:15, 3.46s/it] 4%|▍ | 1407/34278 [1:35:39<32:28:43, 3.56s/it] {'loss': 0.2449, 'grad_norm': 0.9430226186570874, 'learning_rate': 9.99681125338502e-06, 'epoch': 0.04} 4%|▍ | 1407/34278 [1:35:39<32:28:43, 3.56s/it] 4%|▍ | 1408/34278 [1:35:43<31:05:45, 3.41s/it] {'loss': 0.2313, 'grad_norm': 1.1829258383954429, 'learning_rate': 9.99679436119821e-06, 'epoch': 0.04} 4%|▍ | 1408/34278 [1:35:43<31:05:45, 3.41s/it] 4%|▍ | 1409/34278 [1:35:46<32:31:20, 3.56s/it] {'loss': 0.2551, 'grad_norm': 0.9457366101358423, 'learning_rate': 9.996777424401212e-06, 'epoch': 0.04} 4%|▍ | 1409/34278 [1:35:46<32:31:20, 3.56s/it] 4%|▍ | 1410/34278 [1:35:53<39:44:05, 4.35s/it] {'loss': 0.2309, 'grad_norm': 1.1009213065778383, 'learning_rate': 9.996760442994177e-06, 'epoch': 0.04} 4%|▍ | 1410/34278 [1:35:53<39:44:05, 4.35s/it] 4%|▍ | 1411/34278 [1:35:57<39:39:55, 4.34s/it] {'loss': 0.2367, 'grad_norm': 0.9704531907078432, 'learning_rate': 9.996743416977262e-06, 
'epoch': 0.04} 4%|▍ | 1411/34278 [1:35:57<39:39:55, 4.34s/it] 4%|▍ | 1412/34278 [1:36:00<36:32:45, 4.00s/it] {'loss': 0.2295, 'grad_norm': 1.0429666547715157, 'learning_rate': 9.99672634635061e-06, 'epoch': 0.04} 4%|▍ | 1412/34278 [1:36:00<36:32:45, 4.00s/it] 4%|▍ | 1413/34278 [1:36:03<33:43:42, 3.69s/it] {'loss': 0.2381, 'grad_norm': 1.0747472073388955, 'learning_rate': 9.996709231114381e-06, 'epoch': 0.04} 4%|▍ | 1413/34278 [1:36:03<33:43:42, 3.69s/it] 4%|▍ | 1414/34278 [1:36:06<31:35:15, 3.46s/it] {'loss': 0.2277, 'grad_norm': 1.1662343939472601, 'learning_rate': 9.996692071268724e-06, 'epoch': 0.04} 4%|▍ | 1414/34278 [1:36:06<31:35:15, 3.46s/it] 4%|▍ | 1415/34278 [1:36:10<34:03:51, 3.73s/it] {'loss': 0.2253, 'grad_norm': 0.9602644067615418, 'learning_rate': 9.996674866813792e-06, 'epoch': 0.04} 4%|▍ | 1415/34278 [1:36:10<34:03:51, 3.73s/it] 4%|▍ | 1416/34278 [1:36:14<32:38:19, 3.58s/it] {'loss': 0.2473, 'grad_norm': 1.3243000135828542, 'learning_rate': 9.99665761774974e-06, 'epoch': 0.04} 4%|▍ | 1416/34278 [1:36:14<32:38:19, 3.58s/it] 4%|▍ | 1417/34278 [1:36:17<30:42:45, 3.36s/it] {'loss': 0.2018, 'grad_norm': 0.9815509736248158, 'learning_rate': 9.996640324076721e-06, 'epoch': 0.04} 4%|▍ | 1417/34278 [1:36:17<30:42:45, 3.36s/it] 4%|▍ | 1418/34278 [1:36:19<29:35:04, 3.24s/it] {'loss': 0.1946, 'grad_norm': 1.0714111788074172, 'learning_rate': 9.996622985794891e-06, 'epoch': 0.04} 4%|▍ | 1418/34278 [1:36:19<29:35:04, 3.24s/it] 4%|▍ | 1419/34278 [1:36:23<30:12:43, 3.31s/it] {'loss': 0.2087, 'grad_norm': 0.9998604307350838, 'learning_rate': 9.996605602904403e-06, 'epoch': 0.04} 4%|▍ | 1419/34278 [1:36:23<30:12:43, 3.31s/it] 4%|▍ | 1420/34278 [1:36:26<30:19:38, 3.32s/it] {'loss': 0.2381, 'grad_norm': 1.072961976478066, 'learning_rate': 9.996588175405415e-06, 'epoch': 0.04} 4%|▍ | 1420/34278 [1:36:26<30:19:38, 3.32s/it] 4%|▍ | 1421/34278 [1:36:30<30:35:40, 3.35s/it] {'loss': 0.1966, 'grad_norm': 0.8373350153525165, 'learning_rate': 9.996570703298078e-06, 'epoch': 
0.04} 4%|▍ | 1421/34278 [1:36:30<30:35:40, 3.35s/it] 4%|▍ | 1422/34278 [1:36:36<38:56:10, 4.27s/it] {'loss': 0.2478, 'grad_norm': 1.089742446453986, 'learning_rate': 9.996553186582552e-06, 'epoch': 0.04} 4%|▍ | 1422/34278 [1:36:36<38:56:10, 4.27s/it] 4%|▍ | 1423/34278 [1:36:40<37:12:35, 4.08s/it] {'loss': 0.2123, 'grad_norm': 0.9748766263828204, 'learning_rate': 9.996535625258992e-06, 'epoch': 0.04} 4%|▍ | 1423/34278 [1:36:40<37:12:35, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1424/34278 [1:36:43<34:51:57, 3.82s/it] {'loss': 0.2195, 'grad_norm': 1.2175897141540069, 'learning_rate': 9.996518019327555e-06, 'epoch': 0.04} 4%|▍ | 1424/34278 [1:36:43<34:51:57, 3.82s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10038 > 8192). Running this sequence through the model will result in indexing errors 4%|▍ | 1425/34278 [1:36:46<32:29:45, 3.56s/it] {'loss': 0.2273, 'grad_norm': 1.0743819665833776, 'learning_rate': 9.996500368788396e-06, 'epoch': 0.04} 4%|▍ | 1425/34278 [1:36:46<32:29:45, 3.56s/it] 4%|▍ | 1426/34278 [1:36:52<39:10:41, 4.29s/it] {'loss': 0.2063, 'grad_norm': 1.03651420740801, 'learning_rate': 9.996482673641675e-06, 'epoch': 0.04} 4%|▍ | 1426/34278 [1:36:52<39:10:41, 4.29s/it] 4%|▍ | 1427/34278 [1:36:57<40:39:05, 4.45s/it] {'loss': 0.2363, 'grad_norm': 1.3119232498834812, 'learning_rate': 9.996464933887551e-06, 'epoch': 0.04} 4%|▍ | 1427/34278 [1:36:57<40:39:05, 4.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
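[editor note] The "Token indices sequence length is longer than the specified maximum sequence length for this model (10038 > 8192)" messages are the Hugging Face tokenizer flagging samples whose un-truncated encodings exceed the model's context window; the message is advisory at tokenization time, but running such a sequence through the model would index past the positional limit. A minimal stdlib-only sketch of the usual guard (the helper name `clip_ids` is illustrative, not this repo's API):

```python
MAX_LEN = 8192  # matches the limit reported in the log

def clip_ids(input_ids, max_len=MAX_LEN):
    """Drop trailing tokens beyond the model's maximum context length."""
    return input_ids[:max_len]

# Stand-in for a 10038-token encoding like the one flagged in the log.
sample = list(range(10038))
clipped = clip_ids(sample)
```

In practice the same effect comes from passing `truncation=True, max_length=8192` to the tokenizer, or from filtering overlong samples out of the dataset up front.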
Gradients will be None warnings.warn( 4%|▍ | 1428/34278 [1:37:00<36:42:07, 4.02s/it] {'loss': 0.2264, 'grad_norm': 0.9325800581765833, 'learning_rate': 9.996447149526179e-06, 'epoch': 0.04} 4%|▍ | 1428/34278 [1:37:00<36:42:07, 4.02s/it] 4%|▍ | 1429/34278 [1:37:04<37:53:40, 4.15s/it] {'loss': 0.2323, 'grad_norm': 0.9127887289285253, 'learning_rate': 9.99642932055772e-06, 'epoch': 0.04} 4%|▍ | 1429/34278 [1:37:04<37:53:40, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1430/34278 [1:37:07<34:42:03, 3.80s/it] {'loss': 0.217, 'grad_norm': 1.1171056920038365, 'learning_rate': 9.996411446982335e-06, 'epoch': 0.04} 4%|▍ | 1430/34278 [1:37:07<34:42:03, 3.80s/it] 4%|▍ | 1431/34278 [1:37:10<32:08:22, 3.52s/it] {'loss': 0.2261, 'grad_norm': 0.9171130115726579, 'learning_rate': 9.99639352880018e-06, 'epoch': 0.04} 4%|▍ | 1431/34278 [1:37:10<32:08:22, 3.52s/it] 4%|▍ | 1432/34278 [1:37:13<30:48:33, 3.38s/it] {'loss': 0.2296, 'grad_norm': 0.8525848364637603, 'learning_rate': 9.996375566011415e-06, 'epoch': 0.04} 4%|▍ | 1432/34278 [1:37:13<30:48:33, 3.38s/it] 4%|▍ | 1433/34278 [1:37:17<33:20:53, 3.66s/it] {'loss': 0.2172, 'grad_norm': 0.8924284655904076, 'learning_rate': 9.996357558616201e-06, 'epoch': 0.04} 4%|▍ | 1433/34278 [1:37:17<33:20:53, 3.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8493 > 8192). Running this sequence through the model will result in indexing errors 4%|▍ | 1434/34278 [1:37:21<32:31:18, 3.56s/it] {'loss': 0.2171, 'grad_norm': 0.7701805957891904, 'learning_rate': 9.996339506614703e-06, 'epoch': 0.04} 4%|▍ | 1434/34278 [1:37:21<32:31:18, 3.56s/it] 4%|▍ | 1435/34278 [1:37:24<32:12:12, 3.53s/it] {'loss': 0.2163, 'grad_norm': 0.9346432211708176, 'learning_rate': 9.996321410007076e-06, 'epoch': 0.04} 4%|▍ | 1435/34278 [1:37:24<32:12:12, 3.53s/it] 4%|▍ | 1436/34278 [1:37:27<31:27:42, 3.45s/it] {'loss': 0.2144, 'grad_norm': 1.2609256382114076, 'learning_rate': 9.996303268793484e-06, 'epoch': 0.04} 4%|▍ | 1436/34278 [1:37:28<31:27:42, 3.45s/it] 4%|▍ | 1437/34278 [1:37:31<32:39:05, 3.58s/it] {'loss': 0.2139, 'grad_norm': 0.9450348609382418, 'learning_rate': 9.99628508297409e-06, 'epoch': 0.04} 4%|▍ | 1437/34278 [1:37:31<32:39:05, 3.58s/it] 4%|▍ | 1438/34278 [1:37:34<31:10:40, 3.42s/it] {'loss': 0.2052, 'grad_norm': 1.0011505709567554, 'learning_rate': 9.996266852549056e-06, 'epoch': 0.04} 4%|▍ | 1438/34278 [1:37:34<31:10:40, 3.42s/it] 4%|▍ | 1439/34278 [1:37:38<30:21:13, 3.33s/it] {'loss': 0.2141, 'grad_norm': 0.9925306010267743, 'learning_rate': 9.996248577518543e-06, 'epoch': 0.04} 4%|▍ | 1439/34278 [1:37:38<30:21:13, 3.33s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11861 > 8192). 
Running this sequence through the model will result in indexing errors 4%|▍ | 1440/34278 [1:37:41<29:26:45, 3.23s/it] {'loss': 0.2218, 'grad_norm': 1.0884507101869654, 'learning_rate': 9.996230257882716e-06, 'epoch': 0.04} 4%|▍ | 1440/34278 [1:37:41<29:26:45, 3.23s/it] 4%|▍ | 1441/34278 [1:37:44<28:50:23, 3.16s/it] {'loss': 0.2378, 'grad_norm': 1.141353351760358, 'learning_rate': 9.99621189364174e-06, 'epoch': 0.04} 4%|▍ | 1441/34278 [1:37:44<28:50:23, 3.16s/it] 4%|▍ | 1442/34278 [1:37:48<31:19:27, 3.43s/it] {'loss': 0.2053, 'grad_norm': 1.0686080529219866, 'learning_rate': 9.996193484795774e-06, 'epoch': 0.04} 4%|▍ | 1442/34278 [1:37:48<31:19:27, 3.43s/it] 4%|▍ | 1443/34278 [1:37:52<32:34:49, 3.57s/it] {'loss': 0.2205, 'grad_norm': 0.9193753146267514, 'learning_rate': 9.996175031344985e-06, 'epoch': 0.04} 4%|▍ | 1443/34278 [1:37:52<32:34:49, 3.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1444/34278 [1:37:55<32:03:13, 3.51s/it] {'loss': 0.1881, 'grad_norm': 1.2292913522378575, 'learning_rate': 9.99615653328954e-06, 'epoch': 0.04} 4%|▍ | 1444/34278 [1:37:55<32:03:13, 3.51s/it] 4%|▍ | 1445/34278 [1:37:59<33:09:33, 3.64s/it] {'loss': 0.2326, 'grad_norm': 0.9432107283405086, 'learning_rate': 9.996137990629601e-06, 'epoch': 0.04} 4%|▍ | 1445/34278 [1:37:59<33:09:33, 3.64s/it] 4%|▍ | 1446/34278 [1:38:02<31:10:30, 3.42s/it] {'loss': 0.232, 'grad_norm': 1.0168940745265806, 'learning_rate': 9.996119403365336e-06, 'epoch': 0.04} 4%|▍ | 1446/34278 [1:38:02<31:10:30, 3.42s/it] 4%|▍ | 1447/34278 [1:38:07<36:49:24, 4.04s/it] {'loss': 0.2081, 'grad_norm': 0.766663241875437, 'learning_rate': 9.996100771496908e-06, 'epoch': 0.04} 4%|▍ | 1447/34278 [1:38:07<36:49:24, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1448/34278 [1:38:11<37:21:10, 4.10s/it] {'loss': 0.2257, 'grad_norm': 0.9448176605055629, 'learning_rate': 9.996082095024486e-06, 'epoch': 0.04} 4%|▍ | 1448/34278 [1:38:11<37:21:10, 4.10s/it] 4%|▍ | 1449/34278 [1:38:15<34:37:09, 3.80s/it] {'loss': 0.2056, 'grad_norm': 0.9189474902407835, 'learning_rate': 9.996063373948236e-06, 'epoch': 0.04} 4%|▍ | 1449/34278 [1:38:15<34:37:09, 3.80s/it] 4%|▍ | 1450/34278 [1:38:20<39:03:28, 4.28s/it] {'loss': 0.2285, 'grad_norm': 0.9430389926796413, 'learning_rate': 9.996044608268323e-06, 'epoch': 0.04} 4%|▍ | 1450/34278 [1:38:20<39:03:28, 4.28s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1451/34278 [1:38:24<37:13:32, 4.08s/it] {'loss': 0.2005, 'grad_norm': 0.8736377488994879, 'learning_rate': 9.996025797984917e-06, 'epoch': 0.04} 4%|▍ | 1451/34278 [1:38:24<37:13:32, 4.08s/it] 4%|▍ | 1452/34278 [1:38:28<37:43:40, 4.14s/it] {'loss': 0.2003, 'grad_norm': 0.9798793350912578, 'learning_rate': 9.996006943098186e-06, 'epoch': 0.04} 4%|▍ | 1452/34278 [1:38:28<37:43:40, 4.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1453/34278 [1:38:33<41:05:33, 4.51s/it] {'loss': 0.2413, 'grad_norm': 0.9939519675283078, 'learning_rate': 9.995988043608298e-06, 'epoch': 0.04} 4%|▍ | 1453/34278 [1:38:33<41:05:33, 4.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1454/34278 [1:38:38<42:04:29, 4.61s/it] {'loss': 0.2392, 'grad_norm': 1.1457999732943973, 'learning_rate': 9.995969099515422e-06, 'epoch': 0.04} 4%|▍ | 1454/34278 [1:38:38<42:04:29, 4.61s/it] 4%|▍ | 1455/34278 [1:38:44<45:44:46, 5.02s/it] {'loss': 0.2293, 'grad_norm': 0.9996397635655915, 'learning_rate': 9.995950110819725e-06, 'epoch': 0.04} 4%|▍ | 1455/34278 [1:38:44<45:44:46, 5.02s/it] 4%|▍ | 1456/34278 [1:38:48<43:32:46, 4.78s/it] {'loss': 0.2315, 'grad_norm': 1.0414568624002896, 'learning_rate': 9.995931077521377e-06, 'epoch': 0.04} 4%|▍ | 1456/34278 [1:38:48<43:32:46, 4.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1457/34278 [1:38:52<39:35:00, 4.34s/it] {'loss': 0.2548, 'grad_norm': 1.02445309232173, 'learning_rate': 9.995911999620551e-06, 'epoch': 0.04} 4%|▍ | 1457/34278 [1:38:52<39:35:00, 4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
 4%|▍ | 1458/34278 [1:38:55<36:25:28, 4.00s/it] {'loss': 0.2174, 'grad_norm': 0.9462250634681262, 'learning_rate': 9.995892877117415e-06, 'epoch': 0.04}
 4%|▍ | 1459/34278 [1:38:58<33:48:10, 3.71s/it] {'loss': 0.2447, 'grad_norm': 1.3235852975520785, 'learning_rate': 9.995873710012139e-06, 'epoch': 0.04}
 4%|▍ | 1460/34278 [1:39:01<32:49:03, 3.60s/it] {'loss': 0.2065, 'grad_norm': 1.1324259520052409, 'learning_rate': 9.995854498304896e-06, 'epoch': 0.04}
 4%|▍ | 1461/34278 [1:39:04<31:34:05, 3.46s/it] {'loss': 0.2081, 'grad_norm': 0.9157819995964086, 'learning_rate': 9.995835241995857e-06, 'epoch': 0.04}
 4%|▍ | 1462/34278 [1:39:08<31:17:56, 3.43s/it] {'loss': 0.2262, 'grad_norm': 0.8460837172718613, 'learning_rate': 9.995815941085193e-06, 'epoch': 0.04}
 4%|▍ | 1463/34278 [1:39:11<30:01:53, 3.29s/it] {'loss': 0.1842, 'grad_norm': 0.8605506763271699, 'learning_rate': 9.995796595573078e-06, 'epoch': 0.04}
 4%|▍ | 1464/34278 [1:39:14<30:29:08, 3.34s/it] {'loss': 0.2444, 'grad_norm': 0.9943616633690506, 'learning_rate': 9.995777205459682e-06, 'epoch': 0.04}
 4%|▍ | 1465/34278 [1:39:17<30:38:36, 3.36s/it] {'loss': 0.1969, 'grad_norm': 0.8662157243640668, 'learning_rate': 9.99575777074518e-06, 'epoch': 0.04}
 4%|▍ | 1466/34278 [1:39:22<32:43:09, 3.59s/it] {'loss': 0.1947, 'grad_norm': 0.9849215744267956, 'learning_rate': 9.995738291429745e-06, 'epoch': 0.04}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2a48695120>
Failed to fetch sample 3641704.
Exception: cannot identify image file <_io.BytesIO object at 0x7f2a48695120>
 4%|▍ | 1467/34278 [1:39:25<31:27:30, 3.45s/it] {'loss': 0.1978, 'grad_norm': 0.9378503402257837, 'learning_rate': 9.995718767513551e-06, 'epoch': 0.04}
 4%|▍ | 1468/34278 [1:39:28<32:13:35, 3.54s/it] {'loss': 0.2013, 'grad_norm': 0.926220287544285, 'learning_rate': 9.995699198996773e-06, 'epoch': 0.04}
 4%|▍ | 1469/34278 [1:39:32<31:47:19, 3.49s/it] {'loss': 0.1947, 'grad_norm': 0.9176908859449656, 'learning_rate': 9.995679585879585e-06, 'epoch': 0.04}
 4%|▍ | 1470/34278 [1:39:35<31:21:41, 3.44s/it] {'loss': 0.2238, 'grad_norm': 1.0008999967219585, 'learning_rate': 9.995659928162164e-06, 'epoch': 0.04}
 4%|▍ | 1471/34278 [1:39:39<31:18:06, 3.43s/it] {'loss': 0.2221, 'grad_norm': 1.0602822912657908, 'learning_rate': 9.995640225844682e-06, 'epoch': 0.04}
 4%|▍ | 1472/34278 [1:39:42<31:35:09, 3.47s/it] {'loss': 0.2049, 'grad_norm': 0.9648436997158979, 'learning_rate': 9.995620478927315e-06, 'epoch': 0.04}
 4%|▍ | 1473/34278 [1:39:48<39:09:28, 4.30s/it] {'loss': 0.2048, 'grad_norm': 1.0664357527011148, 'learning_rate': 9.995600687410244e-06, 'epoch': 0.04}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
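The `PIL.UnidentifiedImageError` traceback above shows `Image.open` failing on bytes fetched from Ceph, after which the dataset logs "Failed to fetch sample 3641704." and training continues, i.e. the loader apparently falls back to another sample. A minimal sketch of that skip-and-resample pattern (`fetch_bytes` and the retry policy are hypothetical, not the repo's actual code):

```python
import io
import random

from PIL import Image, UnidentifiedImageError


def load_image_or_resample(fetch_bytes, index, dataset_len, max_retries=50):
    """Decode the image for `index`; on undecodable bytes, fall back to a
    randomly chosen other sample instead of crashing the training loop.

    `fetch_bytes(i)` is a hypothetical callable returning raw image bytes
    (e.g. from a Ceph/petrel client). Returns (image, index_actually_used).
    """
    for _ in range(max_retries):
        try:
            img = Image.open(io.BytesIO(fetch_bytes(index)))
            img.load()  # force a full decode so corruption surfaces here
            return img.convert("RGB"), index
        except (UnidentifiedImageError, OSError) as e:
            print(f"Failed to fetch sample {index}. Exception: {e}")
            index = random.randrange(dataset_len)  # resample another index
    raise RuntimeError("too many consecutive undecodable samples")
```

Forcing `img.load()` inside the `try` matters: `Image.open` is lazy, so without it a truncated file can still blow up later inside the collate/preprocess step where it is much harder to skip.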
Gradients will be None warnings.warn( 4%|▍ | 1474/34278 [1:39:53<39:43:11, 4.36s/it] {'loss': 0.2222, 'grad_norm': 0.9574922828372119, 'learning_rate': 9.99558085129364e-06, 'epoch': 0.04} 4%|▍ | 1474/34278 [1:39:53<39:43:11, 4.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1475/34278 [1:39:59<44:10:08, 4.85s/it] {'loss': 0.2342, 'grad_norm': 1.1493992224879264, 'learning_rate': 9.995560970577685e-06, 'epoch': 0.04} 4%|▍ | 1475/34278 [1:39:59<44:10:08, 4.85s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1476/34278 [1:40:02<38:50:10, 4.26s/it] {'loss': 0.2257, 'grad_norm': 1.2327635581648775, 'learning_rate': 9.995541045262554e-06, 'epoch': 0.04} 4%|▍ | 1476/34278 [1:40:02<38:50:10, 4.26s/it] 4%|▍ | 1477/34278 [1:40:05<36:16:49, 3.98s/it] {'loss': 0.1834, 'grad_norm': 0.8601884200626423, 'learning_rate': 9.995521075348423e-06, 'epoch': 0.04} 4%|▍ | 1477/34278 [1:40:05<36:16:49, 3.98s/it] 4%|▍ | 1478/34278 [1:40:10<37:48:48, 4.15s/it] {'loss': 0.221, 'grad_norm': 1.0366460390906858, 'learning_rate': 9.995501060835474e-06, 'epoch': 0.04} 4%|▍ | 1478/34278 [1:40:10<37:48:48, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1479/34278 [1:40:13<35:32:43, 3.90s/it] {'loss': 0.2226, 'grad_norm': 1.0994996940409911, 'learning_rate': 9.995481001723884e-06, 'epoch': 0.04} 4%|▍ | 1479/34278 [1:40:13<35:32:43, 3.90s/it] 4%|▍ | 1480/34278 [1:40:16<33:56:06, 3.72s/it] {'loss': 0.2102, 'grad_norm': 0.8794707319860001, 'learning_rate': 9.995460898013831e-06, 'epoch': 0.04} 4%|▍ | 1480/34278 [1:40:16<33:56:06, 3.72s/it] 4%|▍ | 1481/34278 [1:40:20<33:48:37, 3.71s/it] {'loss': 0.2182, 'grad_norm': 1.088000625757929, 'learning_rate': 9.995440749705496e-06, 'epoch': 0.04} 4%|▍ | 1481/34278 [1:40:20<33:48:37, 3.71s/it] 4%|▍ | 1482/34278 [1:40:23<32:08:53, 3.53s/it] {'loss': 0.2199, 'grad_norm': 0.9909439668431169, 'learning_rate': 9.99542055679906e-06, 'epoch': 0.04} 4%|▍ | 1482/34278 [1:40:23<32:08:53, 3.53s/it] 4%|▍ | 1483/34278 [1:40:27<34:39:34, 3.80s/it] {'loss': 0.2044, 'grad_norm': 0.9167130629482909, 'learning_rate': 9.9954003192947e-06, 'epoch': 0.04} 4%|▍ | 1483/34278 [1:40:27<34:39:34, 3.80s/it] 4%|▍ | 1484/34278 [1:40:34<41:02:40, 4.51s/it] {'loss': 0.2026, 'grad_norm': 0.8730508094905478, 'learning_rate': 9.9953800371926e-06, 'epoch': 0.04} 4%|▍ | 1484/34278 [1:40:34<41:02:40, 4.51s/it] 4%|▍ | 1485/34278 [1:40:36<36:03:35, 3.96s/it] {'loss': 0.2085, 'grad_norm': 0.973396668159621, 'learning_rate': 9.995359710492937e-06, 'epoch': 0.04} 4%|▍ | 1485/34278 [1:40:36<36:03:35, 3.96s/it] 4%|▍ | 1486/34278 [1:40:40<35:12:37, 3.87s/it] {'loss': 0.2087, 'grad_norm': 0.9718573235784893, 'learning_rate': 9.995339339195898e-06, 'epoch': 0.04} 4%|▍ | 1486/34278 [1:40:40<35:12:37, 3.87s/it] 4%|▍ | 1487/34278 [1:40:43<33:20:11, 3.66s/it] {'loss': 0.1913, 'grad_norm': 0.9224864497323803, 'learning_rate': 9.995318923301659e-06, 'epoch': 0.04} 4%|▍ | 1487/34278 [1:40:43<33:20:11, 3.66s/it] 4%|▍ | 1488/34278 [1:40:48<35:25:06, 3.89s/it] {'loss': 0.2164, 'grad_norm': 1.0016274912019036, 'learning_rate': 9.995298462810407e-06, 'epoch': 0.04} 4%|▍ | 1488/34278 
[1:40:48<35:25:06, 3.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1489/34278 [1:40:50<32:48:48, 3.60s/it] {'loss': 0.2464, 'grad_norm': 1.1775308618689844, 'learning_rate': 9.995277957722323e-06, 'epoch': 0.04} 4%|▍ | 1489/34278 [1:40:51<32:48:48, 3.60s/it] 4%|▍ | 1490/34278 [1:40:54<32:31:26, 3.57s/it] {'loss': 0.1977, 'grad_norm': 1.010011497366543, 'learning_rate': 9.995257408037588e-06, 'epoch': 0.04} 4%|▍ | 1490/34278 [1:40:54<32:31:26, 3.57s/it] 4%|▍ | 1491/34278 [1:40:57<30:42:30, 3.37s/it] {'loss': 0.1913, 'grad_norm': 0.8892193731855427, 'learning_rate': 9.995236813756388e-06, 'epoch': 0.04} 4%|▍ | 1491/34278 [1:40:57<30:42:30, 3.37s/it] 4%|▍ | 1492/34278 [1:41:00<30:55:41, 3.40s/it] {'loss': 0.2269, 'grad_norm': 1.0432195456684967, 'learning_rate': 9.995216174878908e-06, 'epoch': 0.04} 4%|▍ | 1492/34278 [1:41:00<30:55:41, 3.40s/it] 4%|▍ | 1493/34278 [1:41:04<30:56:04, 3.40s/it] {'loss': 0.2365, 'grad_norm': 1.1672149910620269, 'learning_rate': 9.99519549140533e-06, 'epoch': 0.04} 4%|▍ | 1493/34278 [1:41:04<30:56:04, 3.40s/it] 4%|▍ | 1494/34278 [1:41:07<31:45:09, 3.49s/it] {'loss': 0.2178, 'grad_norm': 0.9795863159696633, 'learning_rate': 9.995174763335837e-06, 'epoch': 0.04} 4%|▍ | 1494/34278 [1:41:07<31:45:09, 3.49s/it] 4%|▍ | 1495/34278 [1:41:11<31:36:13, 3.47s/it] {'loss': 0.2278, 'grad_norm': 1.2355885439701726, 'learning_rate': 9.995153990670618e-06, 'epoch': 0.04} 4%|▍ | 1495/34278 [1:41:11<31:36:13, 3.47s/it] 4%|▍ | 1496/34278 [1:41:15<32:19:59, 3.55s/it] {'loss': 0.2164, 'grad_norm': 1.0361904879259465, 'learning_rate': 9.995133173409856e-06, 'epoch': 0.04} 4%|▍ | 1496/34278 [1:41:15<32:19:59, 3.55s/it] 4%|▍ | 1497/34278 [1:41:20<38:08:11, 4.19s/it] {'loss': 0.2017, 'grad_norm': 0.9175487337753576, 'learning_rate': 9.995112311553736e-06, 'epoch': 0.04} 4%|▍ | 
1497/34278 [1:41:20<38:08:11, 4.19s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1498/34278 [1:41:23<34:48:27, 3.82s/it] {'loss': 0.2148, 'grad_norm': 0.9087516508905001, 'learning_rate': 9.995091405102449e-06, 'epoch': 0.04} 4%|▍ | 1498/34278 [1:41:23<34:48:27, 3.82s/it] 4%|▍ | 1499/34278 [1:41:27<34:58:08, 3.84s/it] {'loss': 0.2172, 'grad_norm': 1.0065575322472309, 'learning_rate': 9.995070454056175e-06, 'epoch': 0.04} 4%|▍ | 1499/34278 [1:41:27<34:58:08, 3.84s/it] 4%|▍ | 1500/34278 [1:41:31<34:27:33, 3.78s/it] {'loss': 0.2393, 'grad_norm': 1.4500252208703486, 'learning_rate': 9.995049458415108e-06, 'epoch': 0.04} 4%|▍ | 1500/34278 [1:41:31<34:27:33, 3.78s/it] 4%|▍ | 1501/34278 [1:41:35<34:14:34, 3.76s/it] {'loss': 0.2184, 'grad_norm': 1.3234772355923852, 'learning_rate': 9.995028418179429e-06, 'epoch': 0.04} 4%|▍ | 1501/34278 [1:41:35<34:14:34, 3.76s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9067 > 8192). 
Running this sequence through the model will result in indexing errors 4%|▍ | 1502/34278 [1:41:38<33:24:50, 3.67s/it] {'loss': 0.187, 'grad_norm': 0.9495426174390178, 'learning_rate': 9.99500733334933e-06, 'epoch': 0.04} 4%|▍ | 1502/34278 [1:41:38<33:24:50, 3.67s/it] 4%|▍ | 1503/34278 [1:41:41<32:03:59, 3.52s/it] {'loss': 0.2038, 'grad_norm': 1.2749208543526, 'learning_rate': 9.994986203924996e-06, 'epoch': 0.04} 4%|▍ | 1503/34278 [1:41:41<32:03:59, 3.52s/it] 4%|▍ | 1504/34278 [1:41:44<31:02:21, 3.41s/it] {'loss': 0.2012, 'grad_norm': 1.2059401236051581, 'learning_rate': 9.99496502990662e-06, 'epoch': 0.04} 4%|▍ | 1504/34278 [1:41:44<31:02:21, 3.41s/it] 4%|▍ | 1505/34278 [1:41:48<31:00:41, 3.41s/it] {'loss': 0.2214, 'grad_norm': 1.3535856574377956, 'learning_rate': 9.994943811294387e-06, 'epoch': 0.04} 4%|▍ | 1505/34278 [1:41:48<31:00:41, 3.41s/it] 4%|▍ | 1506/34278 [1:41:53<35:37:55, 3.91s/it] {'loss': 0.2145, 'grad_norm': 1.01008938424475, 'learning_rate': 9.994922548088488e-06, 'epoch': 0.04} 4%|▍ | 1506/34278 [1:41:53<35:37:55, 3.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
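The repeated "Token indices sequence length is longer than the specified maximum sequence length for this model (9067 > 8192)" messages are only warnings: the tokenizer does not clip anything itself, so the ids must be truncated explicitly before they reach the model or the promised indexing errors occur. A minimal sketch; keeping the tail rather than the head is an assumption about where the supervised target tokens sit in these conversations:

```python
def truncate_to_max_len(input_ids, max_len=8192, keep_tail=True):
    """Clip a token-id sequence that exceeds the model context window.

    The tokenizer only warns ("Token indices sequence length is longer
    than ..."); without explicit clipping, positions >= max_len lead to
    the indexing errors the log predicts.
    """
    if len(input_ids) <= max_len:
        return input_ids
    # keep_tail=True preserves the end of the conversation (assumed to be
    # where the assistant's answer, i.e. the training target, lives)
    return input_ids[-max_len:] if keep_tail else input_ids[:max_len]
```

For multimodal samples one would additionally need to keep the image placeholder tokens intact, which this length-only sketch does not attempt.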
Gradients will be None warnings.warn( 4%|▍ | 1507/34278 [1:41:56<33:29:23, 3.68s/it] {'loss': 0.2117, 'grad_norm': 1.106945856950492, 'learning_rate': 9.994901240289114e-06, 'epoch': 0.04} 4%|▍ | 1507/34278 [1:41:56<33:29:23, 3.68s/it] 4%|▍ | 1508/34278 [1:41:59<32:18:25, 3.55s/it] {'loss': 0.2093, 'grad_norm': 1.0718073273861746, 'learning_rate': 9.994879887896453e-06, 'epoch': 0.04} 4%|▍ | 1508/34278 [1:41:59<32:18:25, 3.55s/it] 4%|▍ | 1509/34278 [1:42:03<33:09:50, 3.64s/it] {'loss': 0.1983, 'grad_norm': 0.8134033627259928, 'learning_rate': 9.994858490910699e-06, 'epoch': 0.04} 4%|▍ | 1509/34278 [1:42:03<33:09:50, 3.64s/it] 4%|▍ | 1510/34278 [1:42:06<32:25:24, 3.56s/it] {'loss': 0.2175, 'grad_norm': 1.1558254681157425, 'learning_rate': 9.994837049332038e-06, 'epoch': 0.04} 4%|▍ | 1510/34278 [1:42:06<32:25:24, 3.56s/it] 4%|▍ | 1511/34278 [1:42:10<33:38:06, 3.70s/it] {'loss': 0.2267, 'grad_norm': 1.0413670712359624, 'learning_rate': 9.994815563160665e-06, 'epoch': 0.04} 4%|▍ | 1511/34278 [1:42:10<33:38:06, 3.70s/it] 4%|▍ | 1512/34278 [1:42:14<33:27:07, 3.68s/it] {'loss': 0.2053, 'grad_norm': 0.8713546158791305, 'learning_rate': 9.994794032396772e-06, 'epoch': 0.04} 4%|▍ | 1512/34278 [1:42:14<33:27:07, 3.68s/it] 4%|▍ | 1513/34278 [1:42:18<33:28:50, 3.68s/it] {'loss': 0.2149, 'grad_norm': 0.8762434296611706, 'learning_rate': 9.99477245704055e-06, 'epoch': 0.04} 4%|▍ | 1513/34278 [1:42:18<33:28:50, 3.68s/it] 4%|▍ | 1514/34278 [1:42:21<32:06:08, 3.53s/it] {'loss': 0.2151, 'grad_norm': 1.140460284007135, 'learning_rate': 9.99475083709219e-06, 'epoch': 0.04} 4%|▍ | 1514/34278 [1:42:21<32:06:08, 3.53s/it] 4%|▍ | 1515/34278 [1:42:24<31:27:31, 3.46s/it] {'loss': 0.2065, 'grad_norm': 0.9512975940821824, 'learning_rate': 9.994729172551889e-06, 'epoch': 0.04} 4%|▍ | 1515/34278 [1:42:24<31:27:31, 3.46s/it] 4%|▍ | 1516/34278 [1:42:27<30:18:23, 3.33s/it] {'loss': 0.2101, 'grad_norm': 1.045748755414856, 'learning_rate': 9.994707463419839e-06, 'epoch': 0.04} 4%|▍ | 1516/34278 
[1:42:27<30:18:23, 3.33s/it] 4%|▍ | 1517/34278 [1:42:33<37:38:31, 4.14s/it] {'loss': 0.2313, 'grad_norm': 1.155441077931251, 'learning_rate': 9.99468570969623e-06, 'epoch': 0.04} 4%|▍ | 1517/34278 [1:42:33<37:38:31, 4.14s/it] 4%|▍ | 1518/34278 [1:42:37<35:19:18, 3.88s/it] {'loss': 0.1956, 'grad_norm': 0.8182222302235104, 'learning_rate': 9.99466391138126e-06, 'epoch': 0.04} 4%|▍ | 1518/34278 [1:42:37<35:19:18, 3.88s/it] 4%|▍ | 1519/34278 [1:42:40<34:50:10, 3.83s/it] {'loss': 0.1939, 'grad_norm': 0.985013760566948, 'learning_rate': 9.994642068475127e-06, 'epoch': 0.04} 4%|▍ | 1519/34278 [1:42:40<34:50:10, 3.83s/it] 4%|▍ | 1520/34278 [1:42:44<33:51:14, 3.72s/it] {'loss': 0.2333, 'grad_norm': 1.1592479237406494, 'learning_rate': 9.994620180978019e-06, 'epoch': 0.04} 4%|▍ | 1520/34278 [1:42:44<33:51:14, 3.72s/it] 4%|▍ | 1521/34278 [1:42:47<31:21:07, 3.45s/it] {'loss': 0.2004, 'grad_norm': 0.9415477013451679, 'learning_rate': 9.994598248890132e-06, 'epoch': 0.04} 4%|▍ | 1521/34278 [1:42:47<31:21:07, 3.45s/it] 4%|▍ | 1522/34278 [1:42:50<31:43:35, 3.49s/it] {'loss': 0.2114, 'grad_norm': 0.887643002790456, 'learning_rate': 9.994576272211666e-06, 'epoch': 0.04} 4%|▍ | 1522/34278 [1:42:50<31:43:35, 3.49s/it] 4%|▍ | 1523/34278 [1:42:53<30:58:11, 3.40s/it] {'loss': 0.2181, 'grad_norm': 1.0270416990258484, 'learning_rate': 9.994554250942818e-06, 'epoch': 0.04} 4%|▍ | 1523/34278 [1:42:53<30:58:11, 3.40s/it] 4%|▍ | 1524/34278 [1:42:57<32:09:26, 3.53s/it] {'loss': 0.2171, 'grad_norm': 0.9805689129660959, 'learning_rate': 9.994532185083778e-06, 'epoch': 0.04} 4%|▍ | 1524/34278 [1:42:57<32:09:26, 3.53s/it] 4%|▍ | 1525/34278 [1:43:01<31:44:30, 3.49s/it] {'loss': 0.2019, 'grad_norm': 1.0786672431143527, 'learning_rate': 9.99451007463475e-06, 'epoch': 0.04} 4%|▍ | 1525/34278 [1:43:01<31:44:30, 3.49s/it] 4%|▍ | 1526/34278 [1:43:04<31:58:34, 3.51s/it] {'loss': 0.2117, 'grad_norm': 0.9372296678013002, 'learning_rate': 9.994487919595925e-06, 'epoch': 0.04} 4%|▍ | 1526/34278 
[1:43:04<31:58:34, 3.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1527/34278 [1:43:08<32:07:26, 3.53s/it] {'loss': 0.2313, 'grad_norm': 1.0254043986779158, 'learning_rate': 9.994465719967507e-06, 'epoch': 0.04} 4%|▍ | 1527/34278 [1:43:08<32:07:26, 3.53s/it] 4%|▍ | 1528/34278 [1:43:11<31:06:15, 3.42s/it] {'loss': 0.1993, 'grad_norm': 1.1989115949156917, 'learning_rate': 9.994443475749692e-06, 'epoch': 0.04} 4%|▍ | 1528/34278 [1:43:11<31:06:15, 3.42s/it] 4%|▍ | 1529/34278 [1:43:14<31:01:14, 3.41s/it] {'loss': 0.2251, 'grad_norm': 0.9809271376858866, 'learning_rate': 9.994421186942675e-06, 'epoch': 0.04} 4%|▍ | 1529/34278 [1:43:14<31:01:14, 3.41s/it] 4%|▍ | 1530/34278 [1:43:18<30:42:22, 3.38s/it] {'loss': 0.1776, 'grad_norm': 0.8140246694342183, 'learning_rate': 9.99439885354666e-06, 'epoch': 0.04} 4%|▍ | 1530/34278 [1:43:18<30:42:22, 3.38s/it] 4%|▍ | 1531/34278 [1:43:21<30:25:16, 3.34s/it] {'loss': 0.2158, 'grad_norm': 0.9703925915399549, 'learning_rate': 9.994376475561842e-06, 'epoch': 0.04} 4%|▍ | 1531/34278 [1:43:21<30:25:16, 3.34s/it] 4%|▍ | 1532/34278 [1:43:26<35:48:02, 3.94s/it] {'loss': 0.2174, 'grad_norm': 1.104517324078042, 'learning_rate': 9.994354052988424e-06, 'epoch': 0.04} 4%|▍ | 1532/34278 [1:43:26<35:48:02, 3.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1533/34278 [1:43:32<41:20:55, 4.55s/it] {'loss': 0.2145, 'grad_norm': 0.8316422371741246, 'learning_rate': 9.994331585826606e-06, 'epoch': 0.04} 4%|▍ | 1533/34278 [1:43:32<41:20:55, 4.55s/it] 4%|▍ | 1534/34278 [1:43:35<37:52:29, 4.16s/it] {'loss': 0.2276, 'grad_norm': 0.9830344315374484, 'learning_rate': 9.994309074076589e-06, 'epoch': 0.04} 4%|▍ | 1534/34278 [1:43:35<37:52:29, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 4%|▍ | 1535/34278 [1:43:40<39:46:15, 4.37s/it] {'loss': 0.2396, 'grad_norm': 1.040520479452583, 'learning_rate': 9.994286517738572e-06, 'epoch': 0.04} 4%|▍ | 1535/34278 [1:43:40<39:46:15, 4.37s/it] 4%|▍ | 1536/34278 [1:43:44<38:10:26, 4.20s/it] {'loss': 0.2034, 'grad_norm': 0.8665381336119805, 'learning_rate': 9.994263916812757e-06, 'epoch': 0.04} 4%|▍ | 1536/34278 [1:43:44<38:10:26, 4.20s/it] 4%|▍ | 1537/34278 [1:43:47<35:36:39, 3.92s/it] {'loss': 0.2083, 'grad_norm': 0.8774297599044266, 'learning_rate': 9.994241271299344e-06, 'epoch': 0.04} 4%|▍ | 1537/34278 [1:43:47<35:36:39, 3.92s/it] 4%|▍ | 1538/34278 [1:43:50<33:09:02, 3.65s/it] {'loss': 0.2052, 'grad_norm': 0.8705735421286882, 'learning_rate': 9.994218581198539e-06, 'epoch': 0.04} 4%|▍ | 1538/34278 [1:43:50<33:09:02, 3.65s/it] 4%|▍ | 1539/34278 [1:43:55<37:22:06, 4.11s/it] {'loss': 0.2401, 'grad_norm': 1.0544589605513472, 'learning_rate': 9.994195846510543e-06, 'epoch': 0.04} 4%|▍ | 1539/34278 [1:43:55<37:22:06, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 4%|▍ | 1540/34278 [1:43:59<34:56:31, 3.84s/it] {'loss': 0.2041, 'grad_norm': 1.0616515252387178, 'learning_rate': 9.994173067235557e-06, 'epoch': 0.04} 4%|▍ | 1540/34278 [1:43:59<34:56:31, 3.84s/it] 4%|▍ | 1541/34278 [1:44:03<35:56:00, 3.95s/it] {'loss': 0.2138, 'grad_norm': 0.7933865528135348, 'learning_rate': 9.994150243373789e-06, 'epoch': 0.04} 4%|▍ | 1541/34278 [1:44:03<35:56:00, 3.95s/it] 4%|▍ | 1542/34278 [1:44:06<34:49:26, 3.83s/it] {'loss': 0.2298, 'grad_norm': 1.0397077363518006, 'learning_rate': 9.994127374925438e-06, 'epoch': 0.04} 4%|▍ | 1542/34278 [1:44:06<34:49:26, 3.83s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (10143 > 8192). Running this sequence through the model will result in indexing errors 5%|▍ | 1543/34278 [1:44:12<38:19:28, 4.21s/it] {'loss': 0.2078, 'grad_norm': 0.887150825337438, 'learning_rate': 9.99410446189071e-06, 'epoch': 0.05} 5%|▍ | 1543/34278 [1:44:12<38:19:28, 4.21s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▍ | 1544/34278 [1:44:16<40:14:39, 4.43s/it] {'loss': 0.1886, 'grad_norm': 0.7227271216130112, 'learning_rate': 9.99408150426981e-06, 'epoch': 0.05} 5%|▍ | 1544/34278 [1:44:16<40:14:39, 4.43s/it] 5%|▍ | 1545/34278 [1:44:19<35:46:23, 3.93s/it] {'loss': 0.2202, 'grad_norm': 0.9403018061880659, 'learning_rate': 9.994058502062942e-06, 'epoch': 0.05} 5%|▍ | 1545/34278 [1:44:19<35:46:23, 3.93s/it] 5%|▍ | 1546/34278 [1:44:23<34:16:07, 3.77s/it] {'loss': 0.2232, 'grad_norm': 1.16465291239193, 'learning_rate': 9.994035455270313e-06, 'epoch': 0.05} 5%|▍ | 1546/34278 [1:44:23<34:16:07, 3.77s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8495 > 8192). Running this sequence through the model will result in indexing errors Token indices sequence length is longer than the specified maximum sequence length for this model (9407 > 8192). Running this sequence through the model will result in indexing errors 5%|▍ | 1547/34278 [1:44:26<32:08:38, 3.54s/it] {'loss': 0.2132, 'grad_norm': 1.0377796312985823, 'learning_rate': 9.994012363892124e-06, 'epoch': 0.05} 5%|▍ | 1547/34278 [1:44:26<32:08:38, 3.54s/it] 5%|▍ | 1548/34278 [1:44:29<31:17:14, 3.44s/it] {'loss': 0.1932, 'grad_norm': 0.9365964797790705, 'learning_rate': 9.993989227928588e-06, 'epoch': 0.05} 5%|▍ | 1548/34278 [1:44:29<31:17:14, 3.44s/it] 5%|▍ | 1549/34278 [1:44:32<30:23:34, 3.34s/it] {'loss': 0.1995, 'grad_norm': 0.9807639657732039, 'learning_rate': 9.993966047379908e-06, 'epoch': 0.05} 5%|▍ | 1549/34278 [1:44:32<30:23:34, 3.34s/it] 5%|▍ | 1550/34278 [1:44:35<29:23:08, 3.23s/it] {'loss': 0.2334, 'grad_norm': 1.1793814247486445, 'learning_rate': 9.993942822246292e-06, 'epoch': 0.05} 5%|▍ | 1550/34278 [1:44:35<29:23:08, 3.23s/it] 5%|▍ | 1551/34278 [1:44:41<37:01:38, 4.07s/it] {'loss': 0.2024, 'grad_norm': 0.9107580647389542, 'learning_rate': 9.993919552527945e-06, 'epoch': 0.05} 5%|▍ | 1551/34278 [1:44:41<37:01:38, 
4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▍ | 1552/34278 [1:44:47<42:07:29, 4.63s/it] {'loss': 0.1968, 'grad_norm': 0.9549283293210068, 'learning_rate': 9.993896238225079e-06, 'epoch': 0.05} 5%|▍ | 1552/34278 [1:44:47<42:07:29, 4.63s/it] 5%|▍ | 1553/34278 [1:44:50<38:03:40, 4.19s/it] {'loss': 0.2188, 'grad_norm': 1.1968822968012605, 'learning_rate': 9.993872879337896e-06, 'epoch': 0.05} 5%|▍ | 1553/34278 [1:44:50<38:03:40, 4.19s/it] 5%|▍ | 1554/34278 [1:44:56<43:31:04, 4.79s/it] {'loss': 0.2346, 'grad_norm': 1.0057792745103897, 'learning_rate': 9.993849475866611e-06, 'epoch': 0.05} 5%|▍ | 1554/34278 [1:44:56<43:31:04, 4.79s/it] 5%|▍ | 1555/34278 [1:45:00<42:01:00, 4.62s/it] {'loss': 0.2131, 'grad_norm': 1.1397853199149683, 'learning_rate': 9.993826027811427e-06, 'epoch': 0.05} 5%|▍ | 1555/34278 [1:45:00<42:01:00, 4.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (9644 > 8192). 
Running this sequence through the model will result in indexing errors 5%|▍ | 1556/34278 [1:45:07<46:26:56, 5.11s/it] {'loss': 0.2385, 'grad_norm': 1.3361540815581716, 'learning_rate': 9.993802535172558e-06, 'epoch': 0.05} 5%|▍ | 1556/34278 [1:45:07<46:26:56, 5.11s/it] 5%|▍ | 1557/34278 [1:45:10<40:51:58, 4.50s/it] {'loss': 0.2326, 'grad_norm': 1.2649994201282369, 'learning_rate': 9.993778997950212e-06, 'epoch': 0.05} 5%|▍ | 1557/34278 [1:45:10<40:51:58, 4.50s/it] 5%|▍ | 1558/34278 [1:45:16<44:45:57, 4.93s/it] {'loss': 0.2347, 'grad_norm': 1.1605134543612885, 'learning_rate': 9.9937554161446e-06, 'epoch': 0.05} 5%|▍ | 1558/34278 [1:45:16<44:45:57, 4.93s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▍ | 1559/34278 [1:45:19<41:22:32, 4.55s/it] {'loss': 0.2091, 'grad_norm': 1.041034454393136, 'learning_rate': 9.993731789755931e-06, 'epoch': 0.05} 5%|▍ | 1559/34278 [1:45:19<41:22:32, 4.55s/it] 5%|▍ | 1560/34278 [1:45:22<37:19:38, 4.11s/it] {'loss': 0.218, 'grad_norm': 1.0357172359899414, 'learning_rate': 9.993708118784417e-06, 'epoch': 0.05} 5%|▍ | 1560/34278 [1:45:22<37:19:38, 4.11s/it] 5%|▍ | 1561/34278 [1:45:26<35:06:28, 3.86s/it] {'loss': 0.208, 'grad_norm': 1.065139304455861, 'learning_rate': 9.993684403230268e-06, 'epoch': 0.05} 5%|▍ | 1561/34278 [1:45:26<35:06:28, 3.86s/it] 5%|▍ | 1562/34278 [1:45:29<32:23:21, 3.56s/it] {'loss': 0.2312, 'grad_norm': 1.2222053867043257, 'learning_rate': 9.993660643093698e-06, 'epoch': 0.05} 5%|▍ | 1562/34278 [1:45:29<32:23:21, 3.56s/it] 5%|▍ | 1563/34278 [1:45:35<39:40:02, 4.37s/it] {'loss': 0.1861, 'grad_norm': 0.7402429115781198, 'learning_rate': 9.993636838374917e-06, 'epoch': 0.05} 5%|▍ | 1563/34278 [1:45:35<39:40:02, 4.37s/it] 5%|▍ | 1564/34278 [1:45:38<36:24:46, 4.01s/it] {'loss': 0.2405, 'grad_norm': 0.9226390491358516, 'learning_rate': 
9.99361298907414e-06, 'epoch': 0.05} 5%|▍ | 1564/34278 [1:45:38<36:24:46, 4.01s/it] 5%|▍ | 1565/34278 [1:45:42<35:09:10, 3.87s/it] {'loss': 0.2036, 'grad_norm': 1.1469985183875429, 'learning_rate': 9.993589095191575e-06, 'epoch': 0.05} 5%|▍ | 1565/34278 [1:45:42<35:09:10, 3.87s/it] 5%|▍ | 1566/34278 [1:45:46<36:20:32, 4.00s/it] {'loss': 0.1917, 'grad_norm': 0.9693897306501376, 'learning_rate': 9.993565156727443e-06, 'epoch': 0.05} 5%|▍ | 1566/34278 [1:45:46<36:20:32, 4.00s/it] 5%|▍ | 1567/34278 [1:45:49<35:08:38, 3.87s/it] {'loss': 0.2022, 'grad_norm': 1.1274672267838741, 'learning_rate': 9.99354117368195e-06, 'epoch': 0.05} 5%|▍ | 1567/34278 [1:45:49<35:08:38, 3.87s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▍ | 1568/34278 [1:45:55<39:50:45, 4.39s/it] {'loss': 0.2095, 'grad_norm': 1.240372697669746, 'learning_rate': 9.993517146055314e-06, 'epoch': 0.05} 5%|▍ | 1568/34278 [1:45:55<39:50:45, 4.39s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▍ | 1569/34278 [1:46:01<44:24:18, 4.89s/it] {'loss': 0.1988, 'grad_norm': 0.9120602913148715, 'learning_rate': 9.99349307384775e-06, 'epoch': 0.05} 5%|▍ | 1569/34278 [1:46:01<44:24:18, 4.89s/it] 5%|▍ | 1570/34278 [1:46:05<41:50:52, 4.61s/it] {'loss': 0.2203, 'grad_norm': 1.1822521899955165, 'learning_rate': 9.993468957059472e-06, 'epoch': 0.05} 5%|▍ | 1570/34278 [1:46:05<41:50:52, 4.61s/it] 5%|▍ | 1571/34278 [1:46:09<40:11:24, 4.42s/it] {'loss': 0.1952, 'grad_norm': 0.9593066799616233, 'learning_rate': 9.993444795690694e-06, 'epoch': 0.05} 5%|▍ | 1571/34278 [1:46:09<40:11:24, 4.42s/it] 5%|▍ | 1572/34278 [1:46:13<38:06:35, 4.19s/it] {'loss': 0.2136, 'grad_norm': 1.0210061544811042, 'learning_rate': 9.993420589741634e-06, 'epoch': 0.05} 5%|▍ | 1572/34278 [1:46:13<38:06:35, 4.19s/it] 5%|▍ | 1573/34278 [1:46:16<35:15:17, 3.88s/it] {'loss': 0.2269, 'grad_norm': 1.1139539809465127, 'learning_rate': 9.993396339212505e-06, 'epoch': 0.05} 5%|▍ | 1573/34278 [1:46:16<35:15:17, 3.88s/it] 5%|▍ | 1574/34278 [1:46:21<38:27:53, 4.23s/it] {'loss': 0.2181, 'grad_norm': 1.104926442800261, 'learning_rate': 9.993372044103528e-06, 'epoch': 0.05} 5%|▍ | 1574/34278 [1:46:21<38:27:53, 4.23s/it] 5%|▍ | 1575/34278 [1:46:27<43:49:29, 4.82s/it] {'loss': 0.2036, 'grad_norm': 1.0134102391549935, 'learning_rate': 9.993347704414915e-06, 'epoch': 0.05} 5%|▍ | 1575/34278 [1:46:27<43:49:29, 4.82s/it] 5%|▍ | 1576/34278 [1:46:30<38:42:49, 4.26s/it] {'loss': 0.1964, 'grad_norm': 0.9529470474929717, 'learning_rate': 9.993323320146888e-06, 'epoch': 0.05} 5%|▍ | 1576/34278 [1:46:30<38:42:49, 4.26s/it] 5%|▍ | 1577/34278 [1:46:36<43:36:26, 4.80s/it] {'loss': 0.213, 'grad_norm': 0.9056768183692107, 'learning_rate': 9.99329889129966e-06, 'epoch': 0.05} 5%|▍ | 1577/34278 [1:46:36<43:36:26, 4.80s/it] 5%|▍ | 1578/34278 [1:46:39<38:57:53, 4.29s/it] {'loss': 0.2074, 'grad_norm': 1.0069441295048178, 'learning_rate': 9.993274417873454e-06, 'epoch': 0.05} 5%|▍ | 1578/34278 
[Training log excerpt, steps 1579–1742 of 34278 (epoch 0.05, ~5% complete, elapsed 1:46:39 → 1:57:17). Each step was printed twice by the flattened tqdm progress bar; duplicates removed. Loss fluctuated between roughly 0.18 and 0.28 (e.g. step 1579: loss 0.2324, grad_norm 1.146; step 1742: loss 0.1961, grad_norm 0.733); the learning rate decayed slowly from 9.99325e-06 to 9.98866e-06; step time ranged ~3.1–5.2 s/it. Two warnings recurred throughout:

/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None

Token indices sequence length is longer than the specified maximum sequence length for this model (e.g. 10937 > 8192, also 9067, 13750, 9720, 12637). Running this sequence through the model will result in indexing errors]
Running this sequence through the model will result in indexing errors 5%|▌ | 1743/34278 [1:57:21<32:41:32, 3.62s/it] {'loss': 0.1902, 'grad_norm': 0.8580404109212706, 'learning_rate': 9.988625972418096e-06, 'epoch': 0.05} 5%|▌ | 1743/34278 [1:57:21<32:41:32, 3.62s/it] 5%|▌ | 1744/34278 [1:57:24<31:58:45, 3.54s/it] {'loss': 0.1938, 'grad_norm': 1.1967846988616961, 'learning_rate': 9.988594102212999e-06, 'epoch': 0.05} 5%|▌ | 1744/34278 [1:57:24<31:58:45, 3.54s/it] 5%|▌ | 1745/34278 [1:57:27<30:27:12, 3.37s/it] {'loss': 0.2121, 'grad_norm': 1.0104081967587848, 'learning_rate': 9.988562187470925e-06, 'epoch': 0.05} 5%|▌ | 1745/34278 [1:57:27<30:27:12, 3.37s/it] 5%|▌ | 1746/34278 [1:57:30<29:48:05, 3.30s/it] {'loss': 0.2084, 'grad_norm': 1.1529529643483754, 'learning_rate': 9.988530228192158e-06, 'epoch': 0.05} 5%|▌ | 1746/34278 [1:57:30<29:48:05, 3.30s/it] 5%|▌ | 1747/34278 [1:57:35<32:13:59, 3.57s/it] {'loss': 0.2006, 'grad_norm': 0.9197788799969081, 'learning_rate': 9.988498224376985e-06, 'epoch': 0.05} 5%|▌ | 1747/34278 [1:57:35<32:13:59, 3.57s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10994 > 8192). 
Running this sequence through the model will result in indexing errors 5%|▌ | 1748/34278 [1:57:38<31:36:47, 3.50s/it] {'loss': 0.1982, 'grad_norm': 1.0174692305896282, 'learning_rate': 9.988466176025689e-06, 'epoch': 0.05} 5%|▌ | 1748/34278 [1:57:38<31:36:47, 3.50s/it] 5%|▌ | 1749/34278 [1:57:44<38:27:43, 4.26s/it] {'loss': 0.2276, 'grad_norm': 0.9349769771620468, 'learning_rate': 9.988434083138561e-06, 'epoch': 0.05} 5%|▌ | 1749/34278 [1:57:44<38:27:43, 4.26s/it] 5%|▌ | 1750/34278 [1:57:47<36:00:14, 3.98s/it] {'loss': 0.2126, 'grad_norm': 1.0934151233929899, 'learning_rate': 9.988401945715882e-06, 'epoch': 0.05} 5%|▌ | 1750/34278 [1:57:47<36:00:14, 3.98s/it] 5%|▌ | 1751/34278 [1:57:51<35:46:36, 3.96s/it] {'loss': 0.2178, 'grad_norm': 1.36762340528711, 'learning_rate': 9.98836976375794e-06, 'epoch': 0.05} 5%|▌ | 1751/34278 [1:57:51<35:46:36, 3.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1752/34278 [1:57:54<32:30:01, 3.60s/it] {'loss': 0.2071, 'grad_norm': 1.0101928167570686, 'learning_rate': 9.988337537265026e-06, 'epoch': 0.05} 5%|▌ | 1752/34278 [1:57:54<32:30:01, 3.60s/it] 5%|▌ | 1753/34278 [1:57:57<32:05:20, 3.55s/it] {'loss': 0.2166, 'grad_norm': 1.1283120636047486, 'learning_rate': 9.988305266237425e-06, 'epoch': 0.05} 5%|▌ | 1753/34278 [1:57:57<32:05:20, 3.55s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f97bddf1f80>
Failed to fetch sample 3646667.
Exception: cannot identify image file <_io.BytesIO object at 0x7f97bddf1f80> 5%|▌ | 1754/34278 [1:58:01<32:38:26, 3.61s/it] {'loss': 0.2103, 'grad_norm': 0.9499565542893285, 'learning_rate': 9.988272950675423e-06, 'epoch': 0.05} 5%|▌ | 1754/34278 [1:58:01<32:38:26, 3.61s/it] 5%|▌ | 1755/34278 [1:58:04<31:43:20, 3.51s/it] {'loss': 0.2094, 'grad_norm': 1.0255609302975421, 'learning_rate': 9.988240590579314e-06, 'epoch': 0.05} 5%|▌ | 1755/34278 [1:58:04<31:43:20, 3.51s/it] 5%|▌ | 1756/34278 [1:58:07<30:16:41, 3.35s/it] {'loss': 0.2069, 'grad_norm': 1.0197668974193146, 'learning_rate': 9.988208185949382e-06, 'epoch': 0.05} 5%|▌ | 1756/34278 [1:58:07<30:16:41, 3.35s/it] 5%|▌ | 1757/34278 [1:58:11<30:43:42, 3.40s/it] {'loss': 0.2054, 'grad_norm': 1.018468946312996, 'learning_rate': 9.988175736785919e-06, 'epoch': 0.05} 5%|▌ | 1757/34278 [1:58:11<30:43:42, 3.40s/it] 5%|▌ | 1758/34278 [1:58:15<31:38:27, 3.50s/it] {'loss': 0.1945, 'grad_norm': 0.7720587618144342, 'learning_rate': 9.988143243089214e-06, 'epoch': 0.05} 5%|▌ | 1758/34278 [1:58:15<31:38:27, 3.50s/it] 5%|▌ | 1759/34278 [1:58:18<31:22:19, 3.47s/it] {'loss': 0.1977, 'grad_norm': 0.8976148147428786, 'learning_rate': 9.988110704859557e-06, 'epoch': 0.05} 5%|▌ | 1759/34278 [1:58:18<31:22:19, 3.47s/it] 5%|▌ | 1760/34278 [1:58:21<30:43:53, 3.40s/it] {'loss': 0.2066, 'grad_norm': 0.8569834394311729, 'learning_rate': 9.988078122097238e-06, 'epoch': 0.05} 5%|▌ | 1760/34278 [1:58:21<30:43:53, 3.40s/it] 5%|▌ | 1761/34278 [1:58:25<31:53:53, 3.53s/it] {'loss': 0.2169, 'grad_norm': 0.8870857932805177, 'learning_rate': 9.988045494802548e-06, 'epoch': 0.05} 5%|▌ | 1761/34278 [1:58:25<31:53:53, 3.53s/it] 5%|▌ | 1762/34278 [1:58:28<31:14:26, 3.46s/it] {'loss': 0.2199, 'grad_norm': 0.9764283677320921, 'learning_rate': 9.988012822975778e-06, 'epoch': 0.05} 5%|▌ | 1762/34278 [1:58:28<31:14:26, 3.46s/it] 5%|▌ | 1763/34278 [1:58:32<30:45:34, 3.41s/it] {'loss': 0.2174, 'grad_norm': 0.8400425016905786, 'learning_rate': 
9.987980106617221e-06, 'epoch': 0.05} 5%|▌ | 1763/34278 [1:58:32<30:45:34, 3.41s/it] 5%|▌ | 1764/34278 [1:58:35<29:38:58, 3.28s/it] {'loss': 0.1997, 'grad_norm': 0.8448586166941309, 'learning_rate': 9.987947345727167e-06, 'epoch': 0.05} 5%|▌ | 1764/34278 [1:58:35<29:38:58, 3.28s/it] 5%|▌ | 1765/34278 [1:58:41<38:49:32, 4.30s/it] {'loss': 0.2065, 'grad_norm': 0.9292200392314232, 'learning_rate': 9.987914540305911e-06, 'epoch': 0.05} 5%|▌ | 1765/34278 [1:58:41<38:49:32, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1766/34278 [1:58:45<35:57:44, 3.98s/it] {'loss': 0.2139, 'grad_norm': 0.9746200610832763, 'learning_rate': 9.987881690353744e-06, 'epoch': 0.05} 5%|▌ | 1766/34278 [1:58:45<35:57:44, 3.98s/it] 5%|▌ | 1767/34278 [1:58:49<36:11:31, 4.01s/it] {'loss': 0.1997, 'grad_norm': 1.0032499501603669, 'learning_rate': 9.987848795870962e-06, 'epoch': 0.05} 5%|▌ | 1767/34278 [1:58:49<36:11:31, 4.01s/it] 5%|▌ | 1768/34278 [1:58:55<42:05:06, 4.66s/it] {'loss': 0.1922, 'grad_norm': 0.961661839373899, 'learning_rate': 9.987815856857856e-06, 'epoch': 0.05} 5%|▌ | 1768/34278 [1:58:55<42:05:06, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1769/34278 [1:58:59<39:21:03, 4.36s/it] {'loss': 0.1909, 'grad_norm': 0.9737648956227785, 'learning_rate': 9.98778287331472e-06, 'epoch': 0.05} 5%|▌ | 1769/34278 [1:58:59<39:21:03, 4.36s/it] 5%|▌ | 1770/34278 [1:59:02<36:28:02, 4.04s/it] {'loss': 0.2159, 'grad_norm': 1.1386668813997758, 'learning_rate': 9.987749845241849e-06, 'epoch': 0.05} 5%|▌ | 1770/34278 [1:59:02<36:28:02, 4.04s/it] 5%|▌ | 1771/34278 [1:59:05<33:40:43, 3.73s/it] {'loss': 0.2037, 'grad_norm': 1.1632667233396496, 'learning_rate': 9.987716772639537e-06, 'epoch': 0.05} 5%|▌ | 1771/34278 [1:59:05<33:40:43, 3.73s/it] 5%|▌ | 1772/34278 [1:59:10<38:03:09, 4.21s/it] {'loss': 0.2186, 'grad_norm': 1.1498587634493553, 'learning_rate': 9.987683655508082e-06, 'epoch': 0.05} 5%|▌ | 1772/34278 [1:59:10<38:03:09, 4.21s/it] 5%|▌ | 1773/34278 [1:59:15<40:22:45, 4.47s/it] {'loss': 0.2564, 'grad_norm': 1.0121456586626543, 'learning_rate': 9.987650493847778e-06, 'epoch': 0.05} 5%|▌ | 1773/34278 [1:59:15<40:22:45, 4.47s/it] 5%|▌ | 1774/34278 [1:59:18<36:28:14, 4.04s/it] {'loss': 0.2089, 'grad_norm': 1.0798339386031093, 'learning_rate': 9.98761728765892e-06, 'epoch': 0.05} 5%|▌ | 1774/34278 [1:59:18<36:28:14, 4.04s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12047 > 8192). 
Running this sequence through the model will result in indexing errors 5%|▌ | 1775/34278 [1:59:24<41:46:24, 4.63s/it] {'loss': 0.2035, 'grad_norm': 1.06653919711271, 'learning_rate': 9.987584036941806e-06, 'epoch': 0.05} 5%|▌ | 1775/34278 [1:59:24<41:46:24, 4.63s/it] 5%|▌ | 1776/34278 [1:59:28<39:03:09, 4.33s/it] {'loss': 0.2037, 'grad_norm': 1.3176017160123297, 'learning_rate': 9.987550741696734e-06, 'epoch': 0.05} 5%|▌ | 1776/34278 [1:59:28<39:03:09, 4.33s/it] 5%|▌ | 1777/34278 [1:59:33<41:40:42, 4.62s/it] {'loss': 0.2313, 'grad_norm': 1.2445537573465832, 'learning_rate': 9.987517401923996e-06, 'epoch': 0.05} 5%|▌ | 1777/34278 [1:59:33<41:40:42, 4.62s/it] 5%|▌ | 1778/34278 [1:59:36<36:56:17, 4.09s/it] {'loss': 0.2273, 'grad_norm': 0.9860102086983821, 'learning_rate': 9.987484017623896e-06, 'epoch': 0.05} 5%|▌ | 1778/34278 [1:59:36<36:56:17, 4.09s/it] 5%|▌ | 1779/34278 [1:59:39<34:02:36, 3.77s/it] {'loss': 0.2009, 'grad_norm': 0.9877336386360043, 'learning_rate': 9.987450588796729e-06, 'epoch': 0.05} 5%|▌ | 1779/34278 [1:59:39<34:02:36, 3.77s/it] 5%|▌ | 1780/34278 [1:59:43<35:07:25, 3.89s/it] {'loss': 0.2181, 'grad_norm': 0.8539206765930014, 'learning_rate': 9.987417115442793e-06, 'epoch': 0.05} 5%|▌ | 1780/34278 [1:59:43<35:07:25, 3.89s/it] 5%|▌ | 1781/34278 [1:59:47<34:09:42, 3.78s/it] {'loss': 0.2471, 'grad_norm': 0.9826417333117898, 'learning_rate': 9.987383597562388e-06, 'epoch': 0.05} 5%|▌ | 1781/34278 [1:59:47<34:09:42, 3.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1782/34278 [1:59:51<34:26:37, 3.82s/it] {'loss': 0.225, 'grad_norm': 1.0993517355804936, 'learning_rate': 9.987350035155813e-06, 'epoch': 0.05} 5%|▌ | 1782/34278 [1:59:51<34:26:37, 3.82s/it] 5%|▌ | 1783/34278 [1:59:54<33:21:13, 3.70s/it] {'loss': 0.182, 'grad_norm': 0.7973722927989587, 'learning_rate': 9.987316428223367e-06, 'epoch': 0.05} 5%|▌ | 1783/34278 [1:59:54<33:21:13, 3.70s/it] 5%|▌ | 1784/34278 [1:59:58<32:58:33, 3.65s/it] {'loss': 0.1943, 'grad_norm': 0.9459140369300196, 'learning_rate': 9.98728277676535e-06, 'epoch': 0.05} 5%|▌ | 1784/34278 [1:59:58<32:58:33, 3.65s/it] 5%|▌ | 1785/34278 [2:00:02<35:01:18, 3.88s/it] {'loss': 0.2117, 'grad_norm': 0.9004056542612794, 'learning_rate': 9.987249080782065e-06, 'epoch': 0.05} 5%|▌ | 1785/34278 [2:00:02<35:01:18, 3.88s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1786/34278 [2:00:05<32:42:50, 3.62s/it] {'loss': 0.2147, 'grad_norm': 0.8568729774458818, 'learning_rate': 9.987215340273809e-06, 'epoch': 0.05} 5%|▌ | 1786/34278 [2:00:05<32:42:50, 3.62s/it] 5%|▌ | 1787/34278 [2:00:09<32:28:10, 3.60s/it] {'loss': 0.2059, 'grad_norm': 0.9424697505205792, 'learning_rate': 9.987181555240886e-06, 'epoch': 0.05} 5%|▌ | 1787/34278 [2:00:09<32:28:10, 3.60s/it] 5%|▌ | 1788/34278 [2:00:12<30:49:31, 3.42s/it] {'loss': 0.2197, 'grad_norm': 0.9504645368660839, 'learning_rate': 9.987147725683595e-06, 'epoch': 0.05} 5%|▌ | 1788/34278 [2:00:12<30:49:31, 3.42s/it] 5%|▌ | 1789/34278 [2:00:15<30:05:34, 3.33s/it] {'loss': 0.2079, 'grad_norm': 0.8920022479488012, 'learning_rate': 9.987113851602241e-06, 'epoch': 0.05} 5%|▌ | 1789/34278 [2:00:15<30:05:34, 3.33s/it] 5%|▌ | 1790/34278 [2:00:20<36:16:57, 4.02s/it] {'loss': 0.2085, 'grad_norm': 0.8335692029373526, 'learning_rate': 9.987079932997124e-06, 'epoch': 0.05} 5%|▌ | 1790/34278 [2:00:20<36:16:57, 4.02s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1791/34278 [2:00:25<37:26:33, 4.15s/it] {'loss': 0.2016, 'grad_norm': 0.8922263766690413, 'learning_rate': 9.98704596986855e-06, 'epoch': 0.05} 5%|▌ | 1791/34278 [2:00:25<37:26:33, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1792/34278 [2:00:29<36:42:26, 4.07s/it] {'loss': 0.2218, 'grad_norm': 1.1374022594460134, 'learning_rate': 9.987011962216817e-06, 'epoch': 0.05} 5%|▌ | 1792/34278 [2:00:29<36:42:26, 4.07s/it] 5%|▌ | 1793/34278 [2:00:33<37:05:10, 4.11s/it] {'loss': 0.2133, 'grad_norm': 1.0103153351072, 'learning_rate': 9.986977910042236e-06, 'epoch': 0.05} 5%|▌ | 1793/34278 [2:00:33<37:05:10, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1794/34278 [2:00:39<42:17:28, 4.69s/it] {'loss': 0.204, 'grad_norm': 0.8372195469361688, 'learning_rate': 9.986943813345102e-06, 'epoch': 0.05} 5%|▌ | 1794/34278 [2:00:39<42:17:28, 4.69s/it] 5%|▌ | 1795/34278 [2:00:43<39:17:17, 4.35s/it] {'loss': 0.2102, 'grad_norm': 0.9685977819820961, 'learning_rate': 9.986909672125726e-06, 'epoch': 0.05} 5%|▌ | 1795/34278 [2:00:43<39:17:17, 4.35s/it] 5%|▌ | 1796/34278 [2:00:46<35:35:37, 3.94s/it] {'loss': 0.1972, 'grad_norm': 0.8709555196077586, 'learning_rate': 9.98687548638441e-06, 'epoch': 0.05} 5%|▌ | 1796/34278 [2:00:46<35:35:37, 3.94s/it] 5%|▌ | 1797/34278 [2:00:52<41:10:08, 4.56s/it] {'loss': 0.2141, 'grad_norm': 0.994100812796675, 'learning_rate': 9.986841256121462e-06, 'epoch': 0.05} 5%|▌ | 1797/34278 [2:00:52<41:10:08, 4.56s/it] 5%|▌ | 1798/34278 [2:00:55<38:55:01, 4.31s/it] {'loss': 0.2094, 'grad_norm': 0.8335077470829617, 'learning_rate': 9.986806981337186e-06, 'epoch': 0.05} 5%|▌ | 1798/34278 [2:00:55<38:55:01, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1799/34278 [2:00:58<35:51:24, 3.97s/it] {'loss': 0.2103, 'grad_norm': 1.0429819177319595, 'learning_rate': 9.986772662031886e-06, 'epoch': 0.05} 5%|▌ | 1799/34278 [2:00:58<35:51:24, 3.97s/it] 5%|▌ | 1800/34278 [2:01:02<34:26:05, 3.82s/it] {'loss': 0.2137, 'grad_norm': 1.1556976067958373, 'learning_rate': 9.986738298205872e-06, 'epoch': 0.05} 5%|▌ | 1800/34278 [2:01:02<34:26:05, 3.82s/it] 5%|▌ | 1801/34278 [2:01:08<41:31:45, 4.60s/it] {'loss': 0.1968, 'grad_norm': 0.863431425020644, 'learning_rate': 9.986703889859447e-06, 'epoch': 0.05} 5%|▌ | 1801/34278 [2:01:08<41:31:45, 4.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1802/34278 [2:01:12<39:04:54, 4.33s/it] {'loss': 0.2019, 'grad_norm': 1.0146475839847757, 'learning_rate': 9.98666943699292e-06, 'epoch': 0.05} 5%|▌ | 1802/34278 [2:01:12<39:04:54, 4.33s/it] 5%|▌ | 1803/34278 [2:01:15<36:01:24, 3.99s/it] {'loss': 0.2103, 'grad_norm': 1.018489384083942, 'learning_rate': 9.9866349396066e-06, 'epoch': 0.05} 5%|▌ | 1803/34278 [2:01:15<36:01:24, 3.99s/it] 5%|▌ | 1804/34278 [2:01:18<33:30:50, 3.72s/it] {'loss': 0.2222, 'grad_norm': 0.9442883325392579, 'learning_rate': 9.986600397700792e-06, 'epoch': 0.05} 5%|▌ | 1804/34278 [2:01:18<33:30:50, 3.72s/it] 5%|▌ | 1805/34278 [2:01:21<31:27:50, 3.49s/it] {'loss': 0.1946, 'grad_norm': 0.8475062797662067, 'learning_rate': 9.986565811275808e-06, 'epoch': 0.05} 5%|▌ | 1805/34278 [2:01:21<31:27:50, 3.49s/it] 5%|▌ | 1806/34278 [2:01:25<30:59:37, 3.44s/it] {'loss': 0.2294, 'grad_norm': 0.9145959988323972, 'learning_rate': 9.986531180331954e-06, 'epoch': 0.05} 5%|▌ | 1806/34278 [2:01:25<30:59:37, 3.44s/it] 5%|▌ | 1807/34278 [2:01:28<29:43:30, 3.30s/it] {'loss': 0.1987, 'grad_norm': 0.9104170814839893, 'learning_rate': 9.986496504869539e-06, 'epoch': 0.05} 
5%|▌ | 1807/34278 [2:01:28<29:43:30, 3.30s/it] 5%|▌ | 1808/34278 [2:01:31<30:07:13, 3.34s/it] {'loss': 0.2209, 'grad_norm': 0.9828131742766328, 'learning_rate': 9.986461784888874e-06, 'epoch': 0.05} 5%|▌ | 1808/34278 [2:01:31<30:07:13, 3.34s/it] 5%|▌ | 1809/34278 [2:01:35<32:51:41, 3.64s/it] {'loss': 0.2289, 'grad_norm': 1.0438944454902002, 'learning_rate': 9.98642702039027e-06, 'epoch': 0.05} 5%|▌ | 1809/34278 [2:01:35<32:51:41, 3.64s/it] 5%|▌ | 1810/34278 [2:01:39<32:09:15, 3.57s/it] {'loss': 0.2171, 'grad_norm': 0.9798356376138588, 'learning_rate': 9.986392211374036e-06, 'epoch': 0.05} 5%|▌ | 1810/34278 [2:01:39<32:09:15, 3.57s/it] 5%|▌ | 1811/34278 [2:01:43<34:50:04, 3.86s/it] {'loss': 0.2172, 'grad_norm': 0.9239154507567835, 'learning_rate': 9.986357357840482e-06, 'epoch': 0.05} 5%|▌ | 1811/34278 [2:01:43<34:50:04, 3.86s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1812/34278 [2:01:46<32:50:11, 3.64s/it] {'loss': 0.1894, 'grad_norm': 0.9252103679074746, 'learning_rate': 9.986322459789919e-06, 'epoch': 0.05} 5%|▌ | 1812/34278 [2:01:46<32:50:11, 3.64s/it] 5%|▌ | 1813/34278 [2:01:51<36:45:03, 4.08s/it] {'loss': 0.1848, 'grad_norm': 1.0655456133056938, 'learning_rate': 9.986287517222659e-06, 'epoch': 0.05} 5%|▌ | 1813/34278 [2:01:52<36:45:03, 4.08s/it] 5%|▌ | 1814/34278 [2:01:55<34:36:10, 3.84s/it] {'loss': 0.1938, 'grad_norm': 1.0343671450226655, 'learning_rate': 9.986252530139016e-06, 'epoch': 0.05} 5%|▌ | 1814/34278 [2:01:55<34:36:10, 3.84s/it] 5%|▌ | 1815/34278 [2:01:58<32:39:49, 3.62s/it] {'loss': 0.2002, 'grad_norm': 1.1083047523984533, 'learning_rate': 9.9862174985393e-06, 'epoch': 0.05} 5%|▌ | 1815/34278 [2:01:58<32:39:49, 3.62s/it] 5%|▌ | 1816/34278 [2:02:01<30:27:26, 3.38s/it] {'loss': 0.1892, 'grad_norm': 1.0550743876531532, 'learning_rate': 9.986182422423825e-06, 'epoch': 0.05} 5%|▌ | 1816/34278 [2:02:01<30:27:26, 3.38s/it] 5%|▌ | 1817/34278 [2:02:04<29:16:41, 3.25s/it] {'loss': 0.2391, 'grad_norm': 1.0801337910542261, 'learning_rate': 9.986147301792904e-06, 'epoch': 0.05} 5%|▌ | 1817/34278 [2:02:04<29:16:41, 3.25s/it] 5%|▌ | 1818/34278 [2:02:07<29:34:45, 3.28s/it] {'loss': 0.2415, 'grad_norm': 1.023483843297668, 'learning_rate': 9.986112136646849e-06, 'epoch': 0.05} 5%|▌ | 1818/34278 [2:02:07<29:34:45, 3.28s/it] 5%|▌ | 1819/34278 [2:02:10<30:03:28, 3.33s/it] {'loss': 0.2269, 'grad_norm': 1.0004524211652035, 'learning_rate': 9.986076926985975e-06, 'epoch': 0.05} 5%|▌ | 1819/34278 [2:02:10<30:03:28, 3.33s/it] 5%|▌ | 1820/34278 [2:02:14<30:35:34, 3.39s/it] {'loss': 0.2063, 'grad_norm': 1.0348783975430784, 'learning_rate': 9.986041672810595e-06, 'epoch': 0.05} 5%|▌ | 1820/34278 [2:02:14<30:35:34, 3.39s/it] 5%|▌ | 1821/34278 [2:02:17<30:24:57, 3.37s/it] {'loss': 0.2134, 'grad_norm': 1.106985517179938, 'learning_rate': 9.98600637412103e-06, 'epoch': 0.05} 5%|▌ | 1821/34278 
[2:02:17<30:24:57, 3.37s/it] 5%|▌ | 1822/34278 [2:02:21<30:14:16, 3.35s/it] {'loss': 0.2133, 'grad_norm': 0.976414008602237, 'learning_rate': 9.985971030917586e-06, 'epoch': 0.05} 5%|▌ | 1822/34278 [2:02:21<30:14:16, 3.35s/it] 5%|▌ | 1823/34278 [2:02:25<32:20:28, 3.59s/it] {'loss': 0.2085, 'grad_norm': 1.011429960505763, 'learning_rate': 9.985935643200584e-06, 'epoch': 0.05} 5%|▌ | 1823/34278 [2:02:25<32:20:28, 3.59s/it] 5%|▌ | 1824/34278 [2:02:29<33:29:02, 3.71s/it] {'loss': 0.1981, 'grad_norm': 0.8787937997521821, 'learning_rate': 9.985900210970339e-06, 'epoch': 0.05} 5%|▌ | 1824/34278 [2:02:29<33:29:02, 3.71s/it] 5%|▌ | 1825/34278 [2:02:32<32:03:31, 3.56s/it] {'loss': 0.2079, 'grad_norm': 0.9403096401930826, 'learning_rate': 9.985864734227168e-06, 'epoch': 0.05} 5%|▌ | 1825/34278 [2:02:32<32:03:31, 3.56s/it] 5%|▌ | 1826/34278 [2:02:38<38:51:16, 4.31s/it] {'loss': 0.2124, 'grad_norm': 0.8821673620918404, 'learning_rate': 9.985829212971386e-06, 'epoch': 0.05} 5%|▌ | 1826/34278 [2:02:38<38:51:16, 4.31s/it] 5%|▌ | 1827/34278 [2:02:44<44:35:40, 4.95s/it] {'loss': 0.1871, 'grad_norm': 0.8829264311916839, 'learning_rate': 9.98579364720331e-06, 'epoch': 0.05} 5%|▌ | 1827/34278 [2:02:44<44:35:40, 4.95s/it] 5%|▌ | 1828/34278 [2:02:49<42:25:08, 4.71s/it] {'loss': 0.187, 'grad_norm': 0.9244431599258233, 'learning_rate': 9.98575803692326e-06, 'epoch': 0.05} 5%|▌ | 1828/34278 [2:02:49<42:25:08, 4.71s/it] 5%|▌ | 1829/34278 [2:02:55<47:36:20, 5.28s/it] {'loss': 0.2301, 'grad_norm': 1.131042655378558, 'learning_rate': 9.985722382131554e-06, 'epoch': 0.05} 5%|▌ | 1829/34278 [2:02:55<47:36:20, 5.28s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1830/34278 [2:02:59<42:55:39, 4.76s/it] {'loss': 0.2075, 'grad_norm': 1.147087812053146, 'learning_rate': 9.985686682828506e-06, 'epoch': 0.05} 5%|▌ | 1830/34278 [2:02:59<42:55:39, 4.76s/it] 5%|▌ | 1831/34278 [2:03:02<38:07:16, 4.23s/it] {'loss': 0.2015, 'grad_norm': 1.2341563445994703, 'learning_rate': 9.985650939014438e-06, 'epoch': 0.05} 5%|▌ | 1831/34278 [2:03:02<38:07:16, 4.23s/it] 5%|▌ | 1832/34278 [2:03:05<34:49:44, 3.86s/it] {'loss': 0.2337, 'grad_norm': 0.9085919110664943, 'learning_rate': 9.98561515068967e-06, 'epoch': 0.05} 5%|▌ | 1832/34278 [2:03:05<34:49:44, 3.86s/it] 5%|▌ | 1833/34278 [2:03:10<39:17:14, 4.36s/it] {'loss': 0.2218, 'grad_norm': 1.0343138810120596, 'learning_rate': 9.98557931785452e-06, 'epoch': 0.05} 5%|▌ | 1833/34278 [2:03:10<39:17:14, 4.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1834/34278 [2:03:14<37:12:05, 4.13s/it] {'loss': 0.1899, 'grad_norm': 0.9640557208643279, 'learning_rate': 9.985543440509305e-06, 'epoch': 0.05} 5%|▌ | 1834/34278 [2:03:14<37:12:05, 4.13s/it] 5%|▌ | 1835/34278 [2:03:17<35:39:04, 3.96s/it] {'loss': 0.1991, 'grad_norm': 0.9833535393271521, 'learning_rate': 9.985507518654352e-06, 'epoch': 0.05} 5%|▌ | 1835/34278 [2:03:17<35:39:04, 3.96s/it] 5%|▌ | 1836/34278 [2:03:23<39:54:53, 4.43s/it] {'loss': 0.2165, 'grad_norm': 0.9299675607443383, 'learning_rate': 9.985471552289976e-06, 'epoch': 0.05} 5%|▌ | 1836/34278 [2:03:23<39:54:53, 4.43s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1837/34278 [2:03:27<38:26:21, 4.27s/it] {'loss': 0.1963, 'grad_norm': 0.9548816897494452, 'learning_rate': 9.985435541416499e-06, 'epoch': 0.05} 5%|▌ | 1837/34278 [2:03:27<38:26:21, 4.27s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11695 > 8192). Running this sequence through the model will result in indexing errors 5%|▌ | 1838/34278 [2:03:31<37:24:38, 4.15s/it] {'loss': 0.2194, 'grad_norm': 1.0268622050061784, 'learning_rate': 9.985399486034246e-06, 'epoch': 0.05} 5%|▌ | 1838/34278 [2:03:31<37:24:38, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1839/34278 [2:03:34<34:47:15, 3.86s/it] {'loss': 0.2047, 'grad_norm': 1.2219135992154067, 'learning_rate': 9.985363386143537e-06, 'epoch': 0.05} 5%|▌ | 1839/34278 [2:03:34<34:47:15, 3.86s/it] 5%|▌ | 1840/34278 [2:03:37<33:29:41, 3.72s/it] {'loss': 0.1953, 'grad_norm': 0.8638113387730253, 'learning_rate': 9.985327241744692e-06, 'epoch': 0.05} 5%|▌ | 1840/34278 [2:03:37<33:29:41, 3.72s/it] 5%|▌ | 1841/34278 [2:03:41<33:12:15, 3.69s/it] {'loss': 0.2041, 'grad_norm': 0.8408932603755563, 'learning_rate': 9.985291052838035e-06, 'epoch': 0.05} 5%|▌ | 1841/34278 [2:03:41<33:12:15, 3.69s/it] 5%|▌ | 1842/34278 [2:03:44<30:51:09, 3.42s/it] {'loss': 0.208, 'grad_norm': 1.0482465033233304, 'learning_rate': 9.985254819423891e-06, 'epoch': 0.05} 5%|▌ | 1842/34278 [2:03:44<30:51:09, 3.42s/it] 5%|▌ | 1843/34278 [2:03:49<35:26:37, 3.93s/it] {'loss': 0.2129, 'grad_norm': 0.9725931765566865, 'learning_rate': 9.985218541502581e-06, 'epoch': 0.05} 5%|▌ | 1843/34278 [2:03:49<35:26:37, 3.93s/it] 5%|▌ | 1844/34278 [2:03:52<33:41:12, 3.74s/it] {'loss': 0.1917, 'grad_norm': 0.8500853960485782, 'learning_rate': 9.98518221907443e-06, 'epoch': 0.05} 5%|▌ | 
1844/34278 [2:03:52<33:41:12, 3.74s/it] 5%|▌ | 1845/34278 [2:03:55<32:12:38, 3.58s/it] {'loss': 0.1876, 'grad_norm': 0.9499032113249442, 'learning_rate': 9.985145852139763e-06, 'epoch': 0.05} 5%|▌ | 1845/34278 [2:03:55<32:12:38, 3.58s/it] 5%|▌ | 1846/34278 [2:03:59<32:09:53, 3.57s/it] {'loss': 0.2062, 'grad_norm': 1.198222177096465, 'learning_rate': 9.985109440698903e-06, 'epoch': 0.05} 5%|▌ | 1846/34278 [2:03:59<32:09:53, 3.57s/it] 5%|▌ | 1847/34278 [2:04:05<38:48:25, 4.31s/it] {'loss': 0.2536, 'grad_norm': 1.1844301846482599, 'learning_rate': 9.985072984752177e-06, 'epoch': 0.05} 5%|▌ | 1847/34278 [2:04:05<38:48:25, 4.31s/it] 5%|▌ | 1848/34278 [2:04:08<35:33:01, 3.95s/it] {'loss': 0.21, 'grad_norm': 0.9866838497921299, 'learning_rate': 9.985036484299909e-06, 'epoch': 0.05} 5%|▌ | 1848/34278 [2:04:08<35:33:01, 3.95s/it] 5%|▌ | 1849/34278 [2:04:12<35:23:09, 3.93s/it] {'loss': 0.1833, 'grad_norm': 0.8089665687431796, 'learning_rate': 9.984999939342426e-06, 'epoch': 0.05} 5%|▌ | 1849/34278 [2:04:12<35:23:09, 3.93s/it] 5%|▌ | 1850/34278 [2:04:18<40:39:19, 4.51s/it] {'loss': 0.2169, 'grad_norm': 1.0039516026030095, 'learning_rate': 9.984963349880053e-06, 'epoch': 0.05} 5%|▌ | 1850/34278 [2:04:18<40:39:19, 4.51s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10905 > 8192). Running this sequence through the model will result in indexing errors 5%|▌ | 1851/34278 [2:04:24<44:16:33, 4.92s/it] {'loss': 0.2059, 'grad_norm': 0.7919274387544472, 'learning_rate': 9.984926715913115e-06, 'epoch': 0.05} 5%|▌ | 1851/34278 [2:04:24<44:16:33, 4.92s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1852/34278 [2:04:27<39:34:56, 4.39s/it] {'loss': 0.1813, 'grad_norm': 0.9087935115881111, 'learning_rate': 9.984890037441944e-06, 'epoch': 0.05} 5%|▌ | 1852/34278 [2:04:27<39:34:56, 4.39s/it] 5%|▌ | 1853/34278 [2:04:32<43:04:28, 4.78s/it] {'loss': 0.2081, 'grad_norm': 0.947217691867987, 'learning_rate': 9.984853314466865e-06, 'epoch': 0.05} 5%|▌ | 1853/34278 [2:04:33<43:04:28, 4.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1854/34278 [2:04:36<40:42:55, 4.52s/it] {'loss': 0.2045, 'grad_norm': 0.9038864990631458, 'learning_rate': 9.984816546988202e-06, 'epoch': 0.05} 5%|▌ | 1854/34278 [2:04:36<40:42:55, 4.52s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1855/34278 [2:04:40<38:16:25, 4.25s/it] {'loss': 0.2084, 'grad_norm': 1.0652095810031903, 'learning_rate': 9.984779735006291e-06, 'epoch': 0.05} 5%|▌ | 1855/34278 [2:04:40<38:16:25, 4.25s/it] 5%|▌ | 1856/34278 [2:04:43<35:13:53, 3.91s/it] {'loss': 0.2194, 'grad_norm': 0.8575839585957018, 'learning_rate': 9.984742878521456e-06, 'epoch': 0.05} 5%|▌ | 1856/34278 [2:04:43<35:13:53, 3.91s/it] 5%|▌ | 1857/34278 [2:04:47<35:22:56, 3.93s/it] {'loss': 0.233, 'grad_norm': 1.177202848254806, 'learning_rate': 9.984705977534024e-06, 'epoch': 0.05} 5%|▌ | 1857/34278 [2:04:47<35:22:56, 3.93s/it] 5%|▌ | 1858/34278 [2:04:51<35:38:28, 3.96s/it] {'loss': 0.2068, 'grad_norm': 0.9385058659291741, 'learning_rate': 9.98466903204433e-06, 'epoch': 0.05} 5%|▌ | 1858/34278 [2:04:51<35:38:28, 3.96s/it] 5%|▌ | 1859/34278 [2:04:54<33:40:15, 3.74s/it] {'loss': 0.1858, 'grad_norm': 0.9191219594294061, 'learning_rate': 9.984632042052697e-06, 'epoch': 0.05} 5%|▌ | 1859/34278 [2:04:54<33:40:15, 3.74s/it] 5%|▌ | 1860/34278 [2:04:57<31:35:11, 3.51s/it] {'loss': 0.2072, 'grad_norm': 1.0403206931395317, 'learning_rate': 9.984595007559463e-06, 'epoch': 0.05} 5%|▌ | 1860/34278 [2:04:57<31:35:11, 3.51s/it] 5%|▌ | 1861/34278 [2:05:02<34:33:21, 3.84s/it] {'loss': 0.1951, 'grad_norm': 1.03842369920911, 'learning_rate': 9.984557928564952e-06, 'epoch': 0.05} 5%|▌ | 1861/34278 [2:05:02<34:33:21, 3.84s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1862/34278 [2:05:06<35:00:22, 3.89s/it] {'loss': 0.2315, 'grad_norm': 1.058675781860113, 'learning_rate': 9.984520805069499e-06, 'epoch': 0.05} 5%|▌ | 1862/34278 [2:05:06<35:00:22, 3.89s/it] 5%|▌ | 1863/34278 [2:05:09<33:17:32, 3.70s/it] {'loss': 0.2074, 'grad_norm': 0.8732767947654804, 'learning_rate': 9.984483637073435e-06, 'epoch': 0.05} 5%|▌ | 1863/34278 [2:05:09<33:17:32, 3.70s/it] 5%|▌ | 1864/34278 [2:05:12<30:43:13, 3.41s/it] {'loss': 0.2123, 'grad_norm': 1.0706967745492497, 'learning_rate': 9.984446424577089e-06, 'epoch': 0.05} 5%|▌ | 1864/34278 [2:05:12<30:43:13, 3.41s/it] 5%|▌ | 1865/34278 [2:05:16<31:37:33, 3.51s/it] {'loss': 0.2286, 'grad_norm': 1.030682987315947, 'learning_rate': 9.984409167580795e-06, 'epoch': 0.05} 5%|▌ | 1865/34278 [2:05:16<31:37:33, 3.51s/it] 5%|▌ | 1866/34278 [2:05:19<30:30:44, 3.39s/it] {'loss': 0.2053, 'grad_norm': 1.0252946488237773, 'learning_rate': 9.984371866084888e-06, 'epoch': 0.05} 5%|▌ | 1866/34278 [2:05:19<30:30:44, 3.39s/it] 5%|▌ | 1867/34278 [2:05:23<33:09:40, 3.68s/it] {'loss': 0.1978, 'grad_norm': 0.9362847529121195, 'learning_rate': 9.984334520089698e-06, 'epoch': 0.05} 5%|▌ | 1867/34278 [2:05:23<33:09:40, 3.68s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1868/34278 [2:05:29<39:30:03, 4.39s/it] {'loss': 0.2195, 'grad_norm': 1.3522443404635422, 'learning_rate': 9.984297129595559e-06, 'epoch': 0.05} 5%|▌ | 1868/34278 [2:05:29<39:30:03, 4.39s/it] 5%|▌ | 1869/34278 [2:05:33<37:50:54, 4.20s/it] {'loss': 0.2041, 'grad_norm': 1.0720617236396515, 'learning_rate': 9.984259694602805e-06, 'epoch': 0.05} 5%|▌ | 1869/34278 [2:05:33<37:50:54, 4.20s/it] 5%|▌ | 1870/34278 [2:05:36<34:35:59, 3.84s/it] {'loss': 0.1878, 'grad_norm': 1.0099605376182157, 'learning_rate': 9.98422221511177e-06, 'epoch': 0.05} 5%|▌ | 1870/34278 [2:05:36<34:35:59, 3.84s/it] 5%|▌ | 1871/34278 [2:05:39<32:47:17, 3.64s/it] {'loss': 0.208, 'grad_norm': 0.9798177316702981, 'learning_rate': 9.984184691122789e-06, 'epoch': 0.05} 5%|▌ | 1871/34278 [2:05:39<32:47:17, 3.64s/it] 5%|▌ | 1872/34278 [2:05:45<39:34:02, 4.40s/it] {'loss': 0.2119, 'grad_norm': 1.0716844630049964, 'learning_rate': 9.984147122636197e-06, 'epoch': 0.05} 5%|▌ | 1872/34278 [2:05:45<39:34:02, 4.40s/it] 5%|▌ | 1873/34278 [2:05:48<35:26:39, 3.94s/it] {'loss': 0.1929, 'grad_norm': 0.8567982002026963, 'learning_rate': 9.98410950965233e-06, 'epoch': 0.05} 5%|▌ | 1873/34278 [2:05:48<35:26:39, 3.94s/it] 5%|▌ | 1874/34278 [2:05:53<38:39:38, 4.30s/it] {'loss': 0.1963, 'grad_norm': 0.9096358723226816, 'learning_rate': 9.984071852171522e-06, 'epoch': 0.05} 5%|▌ | 1874/34278 [2:05:53<38:39:38, 4.30s/it] 5%|▌ | 1875/34278 [2:05:57<35:57:28, 3.99s/it] {'loss': 0.2366, 'grad_norm': 1.0511720503575723, 'learning_rate': 9.984034150194111e-06, 'epoch': 0.05} 5%|▌ | 1875/34278 [2:05:57<35:57:28, 3.99s/it] 5%|▌ | 1876/34278 [2:06:00<33:44:51, 3.75s/it] {'loss': 0.1963, 'grad_norm': 0.9256344022278301, 'learning_rate': 9.983996403720433e-06, 'epoch': 0.05} 5%|▌ | 1876/34278 [2:06:00<33:44:51, 3.75s/it] 5%|▌ | 1877/34278 [2:06:06<40:19:45, 4.48s/it] {'loss': 0.1908, 'grad_norm': 0.9009449825233125, 'learning_rate': 9.983958612750823e-06, 'epoch': 0.05} 5%|▌ | 1877/34278 
[2:06:06<40:19:45, 4.48s/it] 5%|▌ | 1878/34278 [2:06:11<41:05:45, 4.57s/it] {'loss': 0.192, 'grad_norm': 0.8809903046256545, 'learning_rate': 9.983920777285623e-06, 'epoch': 0.05} 5%|▌ | 1878/34278 [2:06:11<41:05:45, 4.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1879/34278 [2:06:17<44:46:10, 4.97s/it] {'loss': 0.1809, 'grad_norm': 0.9111034618443862, 'learning_rate': 9.983882897325168e-06, 'epoch': 0.05} 5%|▌ | 1879/34278 [2:06:17<44:46:10, 4.97s/it] 5%|▌ | 1880/34278 [2:06:20<40:41:42, 4.52s/it] {'loss': 0.2326, 'grad_norm': 0.9654746262667905, 'learning_rate': 9.983844972869795e-06, 'epoch': 0.05} 5%|▌ | 1880/34278 [2:06:20<40:41:42, 4.52s/it] 5%|▌ | 1881/34278 [2:06:24<38:24:02, 4.27s/it] {'loss': 0.2487, 'grad_norm': 1.2852692308548157, 'learning_rate': 9.983807003919843e-06, 'epoch': 0.05} 5%|▌ | 1881/34278 [2:06:24<38:24:02, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 5%|▌ | 1882/34278 [2:06:27<35:40:45, 3.96s/it] {'loss': 0.2199, 'grad_norm': 1.0916802151149374, 'learning_rate': 9.983768990475653e-06, 'epoch': 0.05} 5%|▌ | 1882/34278 [2:06:27<35:40:45, 3.96s/it] 5%|▌ | 1883/34278 [2:06:31<34:47:11, 3.87s/it] {'loss': 0.2137, 'grad_norm': 1.436610939999952, 'learning_rate': 9.983730932537563e-06, 'epoch': 0.05} 5%|▌ | 1883/34278 [2:06:31<34:47:11, 3.87s/it] 5%|▌ | 1884/34278 [2:06:36<39:52:13, 4.43s/it] {'loss': 0.2131, 'grad_norm': 1.0219366994511527, 'learning_rate': 9.983692830105914e-06, 'epoch': 0.05} 5%|▌ | 1884/34278 [2:06:36<39:52:13, 4.43s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 5%|▌ | 1885/34278 [2:06:40<36:20:53, 4.04s/it] {'loss': 0.1929, 'grad_norm': 1.1247098446347263, 'learning_rate': 9.983654683181044e-06, 'epoch': 0.05} 5%|▌ | 1885/34278 [2:06:40<36:20:53, 4.04s/it] 6%|▌ | 1886/34278 [2:06:44<37:11:14, 4.13s/it] {'loss': 0.2042, 'grad_norm': 1.1384705581143768, 'learning_rate': 9.983616491763295e-06, 'epoch': 0.06} 6%|▌ | 1886/34278 [2:06:44<37:11:14, 4.13s/it] 6%|▌ | 1887/34278 [2:06:47<34:58:52, 3.89s/it] {'loss': 0.1956, 'grad_norm': 1.0535108093057595, 'learning_rate': 9.983578255853005e-06, 'epoch': 0.06} 6%|▌ | 1887/34278 [2:06:47<34:58:52, 3.89s/it] 6%|▌ | 1888/34278 [2:06:53<40:46:18, 4.53s/it] {'loss': 0.2327, 'grad_norm': 1.0663686940223367, 'learning_rate': 9.983539975450522e-06, 'epoch': 0.06} 6%|▌ | 1888/34278 [2:06:53<40:46:18, 4.53s/it] 6%|▌ | 1889/34278 [2:06:56<36:43:26, 4.08s/it] {'loss': 0.2326, 'grad_norm': 1.0476600776117433, 'learning_rate': 9.983501650556182e-06, 'epoch': 0.06} 6%|▌ | 1889/34278 [2:06:56<36:43:26, 4.08s/it] 6%|▌ | 1890/34278 [2:06:59<34:06:06, 3.79s/it] {'loss': 0.1957, 'grad_norm': 0.990293055374146, 'learning_rate': 9.98346328117033e-06, 'epoch': 0.06} 
6%|▌ | 1890/34278 [2:06:59<34:06:06, 3.79s/it] 6%|▌ | 1891/34278 [2:07:04<36:20:36, 4.04s/it] {'loss': 0.2152, 'grad_norm': 0.9188685309373221, 'learning_rate': 9.983424867293305e-06, 'epoch': 0.06} 6%|▌ | 1891/34278 [2:07:04<36:20:36, 4.04s/it] 6%|▌ | 1892/34278 [2:07:08<35:19:15, 3.93s/it] {'loss': 0.1892, 'grad_norm': 1.0898344218818747, 'learning_rate': 9.983386408925454e-06, 'epoch': 0.06} 6%|▌ | 1892/34278 [2:07:08<35:19:15, 3.93s/it] 6%|▌ | 1893/34278 [2:07:12<35:02:14, 3.89s/it] {'loss': 0.1996, 'grad_norm': 1.0735068947886786, 'learning_rate': 9.983347906067119e-06, 'epoch': 0.06} 6%|▌ | 1893/34278 [2:07:12<35:02:14, 3.89s/it] 6%|▌ | 1894/34278 [2:07:15<32:51:13, 3.65s/it] {'loss': 0.2004, 'grad_norm': 1.012606888062534, 'learning_rate': 9.983309358718642e-06, 'epoch': 0.06} 6%|▌ | 1894/34278 [2:07:15<32:51:13, 3.65s/it] 6%|▌ | 1895/34278 [2:07:18<31:00:55, 3.45s/it] {'loss': 0.2186, 'grad_norm': 1.1319489368529263, 'learning_rate': 9.98327076688037e-06, 'epoch': 0.06} 6%|▌ | 1895/34278 [2:07:18<31:00:55, 3.45s/it] 6%|▌ | 1896/34278 [2:07:21<31:28:55, 3.50s/it] {'loss': 0.1927, 'grad_norm': 1.2018814733508303, 'learning_rate': 9.983232130552646e-06, 'epoch': 0.06} 6%|▌ | 1896/34278 [2:07:21<31:28:55, 3.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1897/34278 [2:07:24<29:47:45, 3.31s/it] {'loss': 0.2108, 'grad_norm': 1.0115475359025725, 'learning_rate': 9.983193449735817e-06, 'epoch': 0.06} 6%|▌ | 1897/34278 [2:07:24<29:47:45, 3.31s/it] 6%|▌ | 1898/34278 [2:07:27<29:43:09, 3.30s/it] {'loss': 0.2062, 'grad_norm': 1.0588049836102444, 'learning_rate': 9.983154724430224e-06, 'epoch': 0.06} 6%|▌ | 1898/34278 [2:07:27<29:43:09, 3.30s/it] 6%|▌ | 1899/34278 [2:07:31<29:26:37, 3.27s/it] {'loss': 0.1824, 'grad_norm': 0.8387924635726959, 'learning_rate': 9.983115954636215e-06, 'epoch': 0.06} 6%|▌ | 1899/34278 [2:07:31<29:26:37, 3.27s/it] 6%|▌ | 1900/34278 [2:07:34<29:19:21, 3.26s/it] {'loss': 0.2061, 'grad_norm': 1.1120552835005346, 'learning_rate': 9.983077140354138e-06, 'epoch': 0.06} 6%|▌ | 1900/34278 [2:07:34<29:19:21, 3.26s/it] 6%|▌ | 1901/34278 [2:07:38<31:17:02, 3.48s/it] {'loss': 0.1873, 'grad_norm': 0.8382455374929305, 'learning_rate': 9.983038281584338e-06, 'epoch': 0.06} 6%|▌ | 1901/34278 [2:07:38<31:17:02, 3.48s/it] 6%|▌ | 1902/34278 [2:07:41<29:28:26, 3.28s/it] {'loss': 0.195, 'grad_norm': 1.1235872104607034, 'learning_rate': 9.98299937832716e-06, 'epoch': 0.06} 6%|▌ | 1902/34278 [2:07:41<29:28:26, 3.28s/it] 6%|▌ | 1903/34278 [2:07:44<29:02:50, 3.23s/it] {'loss': 0.1975, 'grad_norm': 0.8194562048653667, 'learning_rate': 9.982960430582954e-06, 'epoch': 0.06} 6%|▌ | 1903/34278 [2:07:44<29:02:50, 3.23s/it] 6%|▌ | 1904/34278 [2:07:50<37:25:46, 4.16s/it] {'loss': 0.2097, 'grad_norm': 0.9193206952649976, 'learning_rate': 9.982921438352067e-06, 'epoch': 0.06} 6%|▌ | 1904/34278 [2:07:50<37:25:46, 4.16s/it] 6%|▌ | 1905/34278 [2:07:53<35:05:48, 3.90s/it] {'loss': 0.1953, 'grad_norm': 0.8498663432478195, 'learning_rate': 9.982882401634846e-06, 'epoch': 0.06} 6%|▌ | 1905/34278 [2:07:53<35:05:48, 3.90s/it] 6%|▌ | 1906/34278 [2:07:58<36:21:21, 4.04s/it] {'loss': 0.1863, 'grad_norm': 0.861597356877587, 'learning_rate': 9.98284332043164e-06, 'epoch': 0.06} 6%|▌ | 1906/34278 
[2:07:58<36:21:21, 4.04s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8427 > 8192). Running this sequence through the model will result in indexing errors 6%|▌ | 1907/34278 [2:08:04<41:56:06, 4.66s/it] {'loss': 0.1998, 'grad_norm': 0.9800120746903566, 'learning_rate': 9.982804194742801e-06, 'epoch': 0.06} 6%|▌ | 1907/34278 [2:08:04<41:56:06, 4.66s/it] 6%|▌ | 1908/34278 [2:08:07<38:13:22, 4.25s/it] {'loss': 0.1987, 'grad_norm': 0.7502622720531859, 'learning_rate': 9.982765024568675e-06, 'epoch': 0.06} 6%|▌ | 1908/34278 [2:08:07<38:13:22, 4.25s/it] 6%|▌ | 1909/34278 [2:08:13<43:22:48, 4.82s/it] {'loss': 0.2327, 'grad_norm': 1.000226061863958, 'learning_rate': 9.982725809909611e-06, 'epoch': 0.06} 6%|▌ | 1909/34278 [2:08:13<43:22:48, 4.82s/it] 6%|▌ | 1910/34278 [2:08:17<39:29:47, 4.39s/it] {'loss': 0.2027, 'grad_norm': 1.0521598794386595, 'learning_rate': 9.98268655076596e-06, 'epoch': 0.06} 6%|▌ | 1910/34278 [2:08:17<39:29:47, 4.39s/it] 6%|▌ | 1911/34278 [2:08:20<36:25:17, 4.05s/it] {'loss': 0.2032, 'grad_norm': 0.8616711043293097, 'learning_rate': 9.982647247138075e-06, 'epoch': 0.06} 6%|▌ | 1911/34278 [2:08:20<36:25:17, 4.05s/it] 6%|▌ | 1912/34278 [2:08:24<35:31:56, 3.95s/it] {'loss': 0.1951, 'grad_norm': 0.9758956868989882, 'learning_rate': 9.982607899026302e-06, 'epoch': 0.06} 6%|▌ | 1912/34278 [2:08:24<35:31:56, 3.95s/it] 6%|▌ | 1913/34278 [2:08:27<35:04:15, 3.90s/it] {'loss': 0.1932, 'grad_norm': 0.7968087778332504, 'learning_rate': 9.982568506430998e-06, 'epoch': 0.06} 6%|▌ | 1913/34278 [2:08:27<35:04:15, 3.90s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11033 > 8192). 
Running this sequence through the model will result in indexing errors 6%|▌ | 1914/34278 [2:08:30<32:39:56, 3.63s/it] {'loss': 0.2081, 'grad_norm': 1.010309091247346, 'learning_rate': 9.982529069352509e-06, 'epoch': 0.06} 6%|▌ | 1914/34278 [2:08:30<32:39:56, 3.63s/it] 6%|▌ | 1915/34278 [2:08:37<40:08:36, 4.47s/it] {'loss': 0.1811, 'grad_norm': 1.0235427591374133, 'learning_rate': 9.982489587791192e-06, 'epoch': 0.06} 6%|▌ | 1915/34278 [2:08:37<40:08:36, 4.47s/it] 6%|▌ | 1916/34278 [2:08:40<37:51:54, 4.21s/it] {'loss': 0.1919, 'grad_norm': 0.8834228764818954, 'learning_rate': 9.982450061747397e-06, 'epoch': 0.06} 6%|▌ | 1916/34278 [2:08:40<37:51:54, 4.21s/it] 6%|▌ | 1917/34278 [2:08:44<35:33:44, 3.96s/it] {'loss': 0.1993, 'grad_norm': 1.1209481285353378, 'learning_rate': 9.982410491221477e-06, 'epoch': 0.06} 6%|▌ | 1917/34278 [2:08:44<35:33:44, 3.96s/it] 6%|▌ | 1918/34278 [2:08:48<37:12:37, 4.14s/it] {'loss': 0.2142, 'grad_norm': 1.0146252281617338, 'learning_rate': 9.982370876213785e-06, 'epoch': 0.06} 6%|▌ | 1918/34278 [2:08:48<37:12:37, 4.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1919/34278 [2:08:52<34:43:16, 3.86s/it] {'loss': 0.1805, 'grad_norm': 1.0691181834273153, 'learning_rate': 9.982331216724676e-06, 'epoch': 0.06} 6%|▌ | 1919/34278 [2:08:52<34:43:16, 3.86s/it] 6%|▌ | 1920/34278 [2:08:55<33:11:08, 3.69s/it] {'loss': 0.2071, 'grad_norm': 1.2740121476083013, 'learning_rate': 9.982291512754503e-06, 'epoch': 0.06} 6%|▌ | 1920/34278 [2:08:55<33:11:08, 3.69s/it] 6%|▌ | 1921/34278 [2:08:58<31:52:22, 3.55s/it] {'loss': 0.209, 'grad_norm': 0.9348187850907693, 'learning_rate': 9.98225176430362e-06, 'epoch': 0.06} 6%|▌ | 1921/34278 [2:08:58<31:52:22, 3.55s/it] 6%|▌ | 1922/34278 [2:09:01<30:42:43, 3.42s/it] {'loss': 0.1875, 'grad_norm': 1.1066368739997507, 'learning_rate': 9.982211971372384e-06, 'epoch': 0.06} 6%|▌ | 1922/34278 [2:09:01<30:42:43, 3.42s/it] 6%|▌ | 1923/34278 [2:09:04<30:19:39, 3.37s/it] {'loss': 0.1922, 'grad_norm': 0.9388244848923358, 'learning_rate': 9.982172133961148e-06, 'epoch': 0.06} 6%|▌ | 1923/34278 [2:09:04<30:19:39, 3.37s/it] 6%|▌ | 1924/34278 [2:09:08<31:16:03, 3.48s/it] {'loss': 0.2184, 'grad_norm': 1.34356212438608, 'learning_rate': 9.982132252070271e-06, 'epoch': 0.06} 6%|▌ | 1924/34278 [2:09:08<31:16:03, 3.48s/it] 6%|▌ | 1925/34278 [2:09:12<32:08:14, 3.58s/it] {'loss': 0.2274, 'grad_norm': 1.015099583720631, 'learning_rate': 9.982092325700103e-06, 'epoch': 0.06} 6%|▌ | 1925/34278 [2:09:12<32:08:14, 3.58s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1926/34278 [2:09:18<39:35:44, 4.41s/it] {'loss': 0.1928, 'grad_norm': 0.9407140103948982, 'learning_rate': 9.982052354851007e-06, 'epoch': 0.06} 6%|▌ | 1926/34278 [2:09:18<39:35:44, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1927/34278 [2:09:22<36:20:36, 4.04s/it] {'loss': 0.2387, 'grad_norm': 1.2565158444534883, 'learning_rate': 9.982012339523335e-06, 'epoch': 0.06} 6%|▌ | 1927/34278 [2:09:22<36:20:36, 4.04s/it] 6%|▌ | 1928/34278 [2:09:25<33:29:48, 3.73s/it] {'loss': 0.2045, 'grad_norm': 0.9817298152237669, 'learning_rate': 9.981972279717446e-06, 'epoch': 0.06} 6%|▌ | 1928/34278 [2:09:25<33:29:48, 3.73s/it] 6%|▌ | 1929/34278 [2:09:29<35:12:54, 3.92s/it] {'loss': 0.1913, 'grad_norm': 1.0505208017484677, 'learning_rate': 9.981932175433697e-06, 'epoch': 0.06} 6%|▌ | 1929/34278 [2:09:29<35:12:54, 3.92s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1930/34278 [2:09:32<34:19:29, 3.82s/it] {'loss': 0.2249, 'grad_norm': 1.011894686119928, 'learning_rate': 9.981892026672449e-06, 'epoch': 0.06} 6%|▌ | 1930/34278 [2:09:32<34:19:29, 3.82s/it] 6%|▌ | 1931/34278 [2:09:35<31:27:40, 3.50s/it] {'loss': 0.1974, 'grad_norm': 0.9690974288239764, 'learning_rate': 9.981851833434058e-06, 'epoch': 0.06} 6%|▌ | 1931/34278 [2:09:35<31:27:40, 3.50s/it] 6%|▌ | 1932/34278 [2:09:39<31:29:36, 3.51s/it] {'loss': 0.2146, 'grad_norm': 1.0278293284259004, 'learning_rate': 9.981811595718882e-06, 'epoch': 0.06} 6%|▌ | 1932/34278 [2:09:39<31:29:36, 3.51s/it] 6%|▌ | 1933/34278 [2:09:42<31:20:57, 3.49s/it] {'loss': 0.1974, 'grad_norm': 0.8300068746950454, 'learning_rate': 9.981771313527283e-06, 'epoch': 0.06} 6%|▌ | 1933/34278 [2:09:42<31:20:57, 3.49s/it] 6%|▌ | 1934/34278 [2:09:45<30:44:10, 3.42s/it] {'loss': 0.208, 'grad_norm': 1.04244037260496, 'learning_rate': 9.981730986859617e-06, 'epoch': 0.06} 6%|▌ | 1934/34278 [2:09:45<30:44:10, 3.42s/it] 6%|▌ | 1935/34278 [2:09:49<30:56:21, 3.44s/it] {'loss': 0.2176, 'grad_norm': 0.8649873612347396, 'learning_rate': 9.981690615716246e-06, 'epoch': 0.06} 6%|▌ | 1935/34278 [2:09:49<30:56:21, 3.44s/it] 6%|▌ | 1936/34278 [2:09:53<31:22:15, 3.49s/it] {'loss': 0.1971, 'grad_norm': 1.0223505095805647, 'learning_rate': 9.98165020009753e-06, 'epoch': 0.06} 6%|▌ | 1936/34278 [2:09:53<31:22:15, 3.49s/it] 6%|▌ | 1937/34278 [2:09:58<36:56:35, 4.11s/it] {'loss': 0.1864, 'grad_norm': 0.9045741861785711, 'learning_rate': 9.981609740003833e-06, 'epoch': 0.06} 6%|▌ | 1937/34278 [2:09:58<36:56:35, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1938/34278 [2:10:01<34:33:15, 3.85s/it] {'loss': 0.227, 'grad_norm': 1.0648583047941091, 'learning_rate': 9.981569235435511e-06, 'epoch': 0.06} 6%|▌ | 1938/34278 [2:10:01<34:33:15, 3.85s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1939/34278 [2:10:06<37:17:38, 4.15s/it] {'loss': 0.2101, 'grad_norm': 0.9632099651165861, 'learning_rate': 9.98152868639293e-06, 'epoch': 0.06} 6%|▌ | 1939/34278 [2:10:06<37:17:38, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1940/34278 [2:10:11<40:10:33, 4.47s/it] {'loss': 0.2127, 'grad_norm': 1.0210597468742788, 'learning_rate': 9.981488092876448e-06, 'epoch': 0.06} 6%|▌ | 1940/34278 [2:10:11<40:10:33, 4.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1941/34278 [2:10:15<38:20:51, 4.27s/it] {'loss': 0.2293, 'grad_norm': 1.0617786672123894, 'learning_rate': 9.981447454886431e-06, 'epoch': 0.06} 6%|▌ | 1941/34278 [2:10:15<38:20:51, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1942/34278 [2:10:19<35:51:55, 3.99s/it] {'loss': 0.2131, 'grad_norm': 1.1119559740593121, 'learning_rate': 9.981406772423238e-06, 'epoch': 0.06} 6%|▌ | 1942/34278 [2:10:19<35:51:55, 3.99s/it] 6%|▌ | 1943/34278 [2:10:22<34:08:05, 3.80s/it] {'loss': 0.2134, 'grad_norm': 0.9274663281988038, 'learning_rate': 9.981366045487237e-06, 'epoch': 0.06} 6%|▌ | 1943/34278 [2:10:22<34:08:05, 3.80s/it] 6%|▌ | 1944/34278 [2:10:25<32:35:09, 3.63s/it] {'loss': 0.1877, 'grad_norm': 0.8160314094716041, 'learning_rate': 9.981325274078788e-06, 'epoch': 0.06} 6%|▌ | 1944/34278 [2:10:25<32:35:09, 3.63s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1945/34278 [2:10:29<32:01:59, 3.57s/it] {'loss': 0.2127, 'grad_norm': 0.9385816920754576, 'learning_rate': 9.981284458198256e-06, 'epoch': 0.06} 6%|▌ | 1945/34278 [2:10:29<32:01:59, 3.57s/it] 6%|▌ | 1946/34278 [2:10:32<30:45:06, 3.42s/it] {'loss': 0.2109, 'grad_norm': 1.0282197362729375, 'learning_rate': 9.981243597846006e-06, 'epoch': 0.06} 6%|▌ | 1946/34278 [2:10:32<30:45:06, 3.42s/it] 6%|▌ | 1947/34278 [2:10:35<31:28:43, 3.51s/it] {'loss': 0.1819, 'grad_norm': 0.8829812410880271, 'learning_rate': 9.981202693022402e-06, 'epoch': 0.06} 6%|▌ | 1947/34278 [2:10:35<31:28:43, 3.51s/it] 6%|▌ | 1948/34278 [2:10:38<30:09:48, 3.36s/it] {'loss': 0.1824, 'grad_norm': 0.8247331199287998, 'learning_rate': 9.98116174372781e-06, 'epoch': 0.06} 6%|▌ | 1948/34278 [2:10:38<30:09:48, 3.36s/it] 6%|▌ | 1949/34278 [2:10:42<29:31:40, 3.29s/it] {'loss': 0.2071, 'grad_norm': 1.0800299153360786, 'learning_rate': 9.981120749962595e-06, 'epoch': 0.06} 6%|▌ | 1949/34278 [2:10:42<29:31:40, 3.29s/it] 6%|▌ | 1950/34278 [2:10:45<29:18:36, 3.26s/it] {'loss': 0.2372, 'grad_norm': 1.1559209610798193, 'learning_rate': 9.981079711727123e-06, 'epoch': 
0.06} 6%|▌ | 1950/34278 [2:10:45<29:18:36, 3.26s/it] 6%|▌ | 1951/34278 [2:10:49<32:24:35, 3.61s/it] {'loss': 0.218, 'grad_norm': 1.0102993867748309, 'learning_rate': 9.98103862902176e-06, 'epoch': 0.06} 6%|▌ | 1951/34278 [2:10:49<32:24:35, 3.61s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1952/34278 [2:10:52<31:05:42, 3.46s/it] {'loss': 0.2478, 'grad_norm': 1.0342717337758949, 'learning_rate': 9.980997501846874e-06, 'epoch': 0.06} 6%|▌ | 1952/34278 [2:10:52<31:05:42, 3.46s/it] 6%|▌ | 1953/34278 [2:10:55<29:43:05, 3.31s/it] {'loss': 0.2119, 'grad_norm': 1.2275850937294275, 'learning_rate': 9.98095633020283e-06, 'epoch': 0.06} 6%|▌ | 1953/34278 [2:10:55<29:43:05, 3.31s/it] 6%|▌ | 1954/34278 [2:10:59<30:31:13, 3.40s/it] {'loss': 0.2492, 'grad_norm': 1.0425157057129126, 'learning_rate': 9.980915114089997e-06, 'epoch': 0.06} 6%|▌ | 1954/34278 [2:10:59<30:31:13, 3.40s/it] 6%|▌ | 1955/34278 [2:11:02<29:36:04, 3.30s/it] {'loss': 0.1798, 'grad_norm': 0.9505753262803261, 'learning_rate': 9.980873853508744e-06, 'epoch': 0.06} 6%|▌ | 1955/34278 [2:11:02<29:36:04, 3.30s/it] 6%|▌ | 1956/34278 [2:11:05<29:37:29, 3.30s/it] {'loss': 0.1912, 'grad_norm': 0.8890408882605577, 'learning_rate': 9.980832548459438e-06, 'epoch': 0.06} 6%|▌ | 1956/34278 [2:11:05<29:37:29, 3.30s/it] 6%|▌ | 1957/34278 [2:11:08<27:57:23, 3.11s/it] {'loss': 0.2051, 'grad_norm': 1.1234316354262521, 'learning_rate': 9.980791198942449e-06, 'epoch': 0.06} 6%|▌ | 1957/34278 [2:11:08<27:57:23, 3.11s/it] 6%|▌ | 1958/34278 [2:11:12<29:55:26, 3.33s/it] {'loss': 0.196, 'grad_norm': 0.8539045011047467, 'learning_rate': 9.980749804958142e-06, 'epoch': 0.06} 6%|▌ | 1958/34278 [2:11:12<29:55:26, 3.33s/it] 6%|▌ | 1959/34278 [2:11:15<30:46:24, 3.43s/it] {'loss': 0.2136, 'grad_norm': 1.0449744737509585, 'learning_rate': 9.980708366506892e-06, 
'epoch': 0.06} 6%|▌ | 1959/34278 [2:11:15<30:46:24, 3.43s/it] 6%|▌ | 1960/34278 [2:11:18<29:14:54, 3.26s/it] {'loss': 0.2092, 'grad_norm': 0.9131467350350049, 'learning_rate': 9.980666883589066e-06, 'epoch': 0.06} 6%|▌ | 1960/34278 [2:11:18<29:14:54, 3.26s/it] 6%|▌ | 1961/34278 [2:11:24<36:26:35, 4.06s/it] {'loss': 0.2002, 'grad_norm': 1.0036185750520146, 'learning_rate': 9.980625356205036e-06, 'epoch': 0.06} 6%|▌ | 1961/34278 [2:11:24<36:26:35, 4.06s/it] 6%|▌ | 1962/34278 [2:11:29<37:20:20, 4.16s/it] {'loss': 0.1757, 'grad_norm': 0.7412115243381714, 'learning_rate': 9.980583784355171e-06, 'epoch': 0.06} 6%|▌ | 1962/34278 [2:11:29<37:20:20, 4.16s/it] 6%|▌ | 1963/34278 [2:11:32<35:00:06, 3.90s/it] {'loss': 0.2324, 'grad_norm': 1.0872784702101763, 'learning_rate': 9.980542168039843e-06, 'epoch': 0.06} 6%|▌ | 1963/34278 [2:11:32<35:00:06, 3.90s/it] 6%|▌ | 1964/34278 [2:11:34<31:38:56, 3.53s/it] {'loss': 0.2205, 'grad_norm': 1.002135450712531, 'learning_rate': 9.980500507259423e-06, 'epoch': 0.06} 6%|▌ | 1964/34278 [2:11:34<31:38:56, 3.53s/it] 6%|▌ | 1965/34278 [2:11:38<30:55:22, 3.45s/it] {'loss': 0.1949, 'grad_norm': 0.8774046598244668, 'learning_rate': 9.980458802014285e-06, 'epoch': 0.06} 6%|▌ | 1965/34278 [2:11:38<30:55:22, 3.45s/it] 6%|▌ | 1966/34278 [2:11:43<35:23:44, 3.94s/it] {'loss': 0.1959, 'grad_norm': 0.9070400059442462, 'learning_rate': 9.980417052304798e-06, 'epoch': 0.06} 6%|▌ | 1966/34278 [2:11:43<35:23:44, 3.94s/it] 6%|▌ | 1967/34278 [2:11:49<41:30:06, 4.62s/it] {'loss': 0.2181, 'grad_norm': 0.9288842557182622, 'learning_rate': 9.98037525813134e-06, 'epoch': 0.06} 6%|▌ | 1967/34278 [2:11:49<41:30:06, 4.62s/it] 6%|▌ | 1968/34278 [2:11:54<41:50:13, 4.66s/it] {'loss': 0.1934, 'grad_norm': 0.9941150963868122, 'learning_rate': 9.980333419494275e-06, 'epoch': 0.06} 6%|▌ | 1968/34278 [2:11:54<41:50:13, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of 
the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1969/34278 [2:11:58<40:35:36, 4.52s/it] {'loss': 0.1963, 'grad_norm': 0.9205959032965471, 'learning_rate': 9.980291536393985e-06, 'epoch': 0.06} 6%|▌ | 1969/34278 [2:11:58<40:35:36, 4.52s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1970/34278 [2:12:01<37:42:53, 4.20s/it] {'loss': 0.1938, 'grad_norm': 0.9087053739773382, 'learning_rate': 9.980249608830842e-06, 'epoch': 0.06} 6%|▌ | 1970/34278 [2:12:01<37:42:53, 4.20s/it] 6%|▌ | 1971/34278 [2:12:05<35:14:23, 3.93s/it] {'loss': 0.2065, 'grad_norm': 1.0001762011083941, 'learning_rate': 9.980207636805218e-06, 'epoch': 0.06} 6%|▌ | 1971/34278 [2:12:05<35:14:23, 3.93s/it] 6%|▌ | 1972/34278 [2:12:09<35:04:34, 3.91s/it] {'loss': 0.1981, 'grad_norm': 0.9188547817314544, 'learning_rate': 9.98016562031749e-06, 'epoch': 0.06} 6%|▌ | 1972/34278 [2:12:09<35:04:34, 3.91s/it] 6%|▌ | 1973/34278 [2:12:14<40:21:40, 4.50s/it] {'loss': 0.2117, 'grad_norm': 0.9671072125018545, 'learning_rate': 9.980123559368032e-06, 'epoch': 0.06} 6%|▌ | 1973/34278 [2:12:14<40:21:40, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1974/34278 [2:12:18<36:33:40, 4.07s/it] {'loss': 0.2168, 'grad_norm': 0.9290917183858046, 'learning_rate': 9.980081453957219e-06, 'epoch': 0.06} 6%|▌ | 1974/34278 [2:12:18<36:33:40, 4.07s/it] 6%|▌ | 1975/34278 [2:12:20<32:58:16, 3.67s/it] {'loss': 0.2031, 'grad_norm': 1.046498723225571, 'learning_rate': 9.980039304085429e-06, 'epoch': 0.06} 6%|▌ | 1975/34278 [2:12:20<32:58:16, 3.67s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9416 > 8192). Running this sequence through the model will result in indexing errors 6%|▌ | 1976/34278 [2:12:24<33:48:10, 3.77s/it] {'loss': 0.1991, 'grad_norm': 0.8340365718130629, 'learning_rate': 9.979997109753035e-06, 'epoch': 0.06} 6%|▌ | 1976/34278 [2:12:24<33:48:10, 3.77s/it] 6%|▌ | 1977/34278 [2:12:28<33:16:07, 3.71s/it] {'loss': 0.2031, 'grad_norm': 0.942880536303, 'learning_rate': 9.979954870960417e-06, 'epoch': 0.06} 6%|▌ | 1977/34278 [2:12:28<33:16:07, 3.71s/it] 6%|▌ | 1978/34278 [2:12:31<31:13:05, 3.48s/it] {'loss': 0.1885, 'grad_norm': 0.9021832572875198, 'learning_rate': 9.97991258770795e-06, 'epoch': 0.06} 6%|▌ | 1978/34278 [2:12:31<31:13:05, 3.48s/it] 6%|▌ | 1979/34278 [2:12:34<30:01:33, 3.35s/it] {'loss': 0.2325, 'grad_norm': 1.0183555583589448, 'learning_rate': 9.979870259996013e-06, 'epoch': 0.06} 6%|▌ | 1979/34278 [2:12:34<30:01:33, 3.35s/it] 6%|▌ | 1980/34278 [2:12:40<38:31:38, 4.29s/it] {'loss': 0.2026, 'grad_norm': 0.9307096496027774, 'learning_rate': 9.979827887824983e-06, 'epoch': 0.06} 6%|▌ | 1980/34278 [2:12:40<38:31:38, 4.29s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1981/34278 [2:12:44<35:52:16, 4.00s/it] {'loss': 0.1961, 'grad_norm': 0.891606564374665, 'learning_rate': 9.979785471195238e-06, 'epoch': 0.06} 6%|▌ | 1981/34278 [2:12:44<35:52:16, 4.00s/it] 6%|▌ | 1982/34278 [2:12:50<41:46:06, 4.66s/it] {'loss': 0.2038, 'grad_norm': 0.8284938160218519, 'learning_rate': 9.979743010107158e-06, 'epoch': 0.06} 6%|▌ | 1982/34278 [2:12:50<41:46:06, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1983/34278 [2:12:56<46:38:48, 5.20s/it] {'loss': 0.2196, 'grad_norm': 0.8671698559128383, 'learning_rate': 9.979700504561118e-06, 'epoch': 0.06} 6%|▌ | 1983/34278 [2:12:56<46:38:48, 5.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1984/34278 [2:13:02<47:37:00, 5.31s/it] {'loss': 0.2025, 'grad_norm': 0.795225896992179, 'learning_rate': 9.979657954557504e-06, 'epoch': 0.06} 6%|▌ | 1984/34278 [2:13:02<47:37:00, 5.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1985/34278 [2:13:07<47:23:10, 5.28s/it] {'loss': 0.2174, 'grad_norm': 0.8283274741978116, 'learning_rate': 9.97961536009669e-06, 'epoch': 0.06} 6%|▌ | 1985/34278 [2:13:07<47:23:10, 5.28s/it] 6%|▌ | 1986/34278 [2:13:10<42:02:18, 4.69s/it] {'loss': 0.2038, 'grad_norm': 0.9335261797634911, 'learning_rate': 9.97957272117906e-06, 'epoch': 0.06} 6%|▌ | 1986/34278 [2:13:10<42:02:18, 4.69s/it] 6%|▌ | 1987/34278 [2:13:14<38:48:20, 4.33s/it] {'loss': 0.1917, 'grad_norm': 0.7981795817685567, 'learning_rate': 9.979530037804995e-06, 'epoch': 0.06} 6%|▌ | 1987/34278 [2:13:14<38:48:20, 4.33s/it] 6%|▌ | 1988/34278 [2:13:18<39:20:51, 4.39s/it] {'loss': 0.2258, 'grad_norm': 0.8962893771671759, 'learning_rate': 9.979487309974874e-06, 'epoch': 0.06} 6%|▌ | 1988/34278 [2:13:18<39:20:51, 4.39s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1989/34278 [2:13:23<40:24:20, 4.50s/it] {'loss': 0.2107, 'grad_norm': 0.8931335398606022, 'learning_rate': 9.979444537689078e-06, 'epoch': 0.06} 6%|▌ | 1989/34278 [2:13:23<40:24:20, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1990/34278 [2:13:27<38:27:02, 4.29s/it] {'loss': 0.2215, 'grad_norm': 0.9052813707407674, 'learning_rate': 9.979401720947989e-06, 'epoch': 0.06} 6%|▌ | 1990/34278 [2:13:27<38:27:02, 4.29s/it] 6%|▌ | 1991/34278 [2:13:30<35:33:50, 3.97s/it] {'loss': 0.2007, 'grad_norm': 0.91560999339626, 'learning_rate': 9.979358859751994e-06, 'epoch': 0.06} 6%|▌ | 1991/34278 [2:13:30<35:33:50, 3.97s/it] 6%|▌ | 1992/34278 [2:13:36<39:26:51, 4.40s/it] {'loss': 0.2168, 'grad_norm': 0.9598330084334362, 'learning_rate': 9.979315954101466e-06, 'epoch': 0.06} 6%|▌ | 1992/34278 [2:13:36<39:26:51, 4.40s/it] 6%|▌ | 1993/34278 [2:13:42<46:03:02, 5.13s/it] {'loss': 0.2151, 'grad_norm': 1.0927238732461215, 'learning_rate': 9.979273003996798e-06, 'epoch': 0.06} 6%|▌ | 1993/34278 [2:13:42<46:03:02, 5.13s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 1994/34278 [2:13:46<40:50:02, 4.55s/it] {'loss': 0.1992, 'grad_norm': 1.4720458308251883, 'learning_rate': 9.979230009438368e-06, 'epoch': 0.06} 6%|▌ | 1994/34278 [2:13:46<40:50:02, 4.55s/it] 6%|▌ | 1995/34278 [2:13:52<45:13:04, 5.04s/it] {'loss': 0.2135, 'grad_norm': 1.2242446883341342, 'learning_rate': 9.979186970426562e-06, 'epoch': 0.06} 6%|▌ | 1995/34278 [2:13:52<45:13:04, 5.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 1996/34278 [2:13:55<40:21:59, 4.50s/it] {'loss': 0.2088, 'grad_norm': 1.0233646807064443, 'learning_rate': 9.979143886961762e-06, 'epoch': 0.06} 6%|▌ | 1996/34278 [2:13:55<40:21:59, 4.50s/it] 6%|▌ | 1997/34278 [2:13:59<38:43:28, 4.32s/it] {'loss': 0.2071, 'grad_norm': 0.8835682418967595, 'learning_rate': 9.979100759044355e-06, 'epoch': 0.06} 6%|▌ | 1997/34278 [2:13:59<38:43:28, 4.32s/it] 6%|▌ | 1998/34278 [2:14:02<35:18:53, 3.94s/it] {'loss': 0.2396, 'grad_norm': 1.2947625093001198, 'learning_rate': 9.979057586674724e-06, 'epoch': 0.06} 6%|▌ | 1998/34278 [2:14:02<35:18:53, 3.94s/it] 6%|▌ | 1999/34278 [2:14:05<33:07:24, 3.69s/it] {'loss': 0.2177, 'grad_norm': 1.1049690907290772, 'learning_rate': 9.979014369853257e-06, 'epoch': 0.06} 6%|▌ | 1999/34278 [2:14:05<33:07:24, 3.69s/it] 6%|▌ | 2000/34278 [2:14:08<31:17:14, 3.49s/it] {'loss': 0.2488, 'grad_norm': 0.9824809590555131, 'learning_rate': 9.978971108580336e-06, 'epoch': 0.06} 6%|▌ | 2000/34278 [2:14:08<31:17:14, 3.49s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 6%|▌ | 2001/34278 [2:14:42<113:25:24, 12.65s/it] {'loss': 0.1933, 'grad_norm': 0.870881960356909, 'learning_rate': 9.978927802856351e-06, 'epoch': 0.06} 6%|▌ | 2001/34278 [2:14:42<113:25:24, 12.65s/it] 6%|▌ | 2002/34278 [2:14:46<89:10:40, 9.95s/it] {'loss': 0.216, 'grad_norm': 1.0664186432210303, 'learning_rate': 9.978884452681688e-06, 'epoch': 0.06} 6%|▌ | 2002/34278 [2:14:46<89:10:40, 9.95s/it] 6%|▌ | 2003/34278 [2:14:49<70:29:03, 7.86s/it] {'loss': 0.2212, 'grad_norm': 0.9522652056931812, 'learning_rate': 9.978841058056731e-06, 'epoch': 0.06} 6%|▌ | 2003/34278 [2:14:49<70:29:03, 7.86s/it] 6%|▌ | 2004/34278 [2:14:53<61:42:35, 6.88s/it] {'loss': 0.2167, 'grad_norm': 0.9962790246780536, 'learning_rate': 9.978797618981871e-06, 'epoch': 0.06} 6%|▌ | 2004/34278 [2:14:53<61:42:35, 6.88s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8872 > 8192). Running this sequence through the model will result in indexing errors 6%|▌ | 2005/34278 [2:14:57<51:39:09, 5.76s/it] {'loss': 0.2063, 'grad_norm': 0.8910113731013437, 'learning_rate': 9.978754135457495e-06, 'epoch': 0.06} 6%|▌ | 2005/34278 [2:14:57<51:39:09, 5.76s/it] 6%|▌ | 2006/34278 [2:15:00<45:50:09, 5.11s/it] {'loss': 0.2163, 'grad_norm': 1.054217756380746, 'learning_rate': 9.97871060748399e-06, 'epoch': 0.06} 6%|▌ | 2006/34278 [2:15:00<45:50:09, 5.11s/it] 6%|▌ | 2007/34278 [2:15:03<40:12:43, 4.49s/it] {'loss': 0.1969, 'grad_norm': 0.9946047777949233, 'learning_rate': 9.978667035061744e-06, 'epoch': 0.06} 6%|▌ | 2007/34278 [2:15:03<40:12:43, 4.49s/it] 6%|▌ | 2008/34278 [2:15:08<41:34:22, 4.64s/it] {'loss': 0.2329, 'grad_norm': 1.0504870261393238, 'learning_rate': 9.97862341819115e-06, 'epoch': 0.06} 6%|▌ | 2008/34278 [2:15:08<41:34:22, 4.64s/it] 6%|▌ | 2009/34278 [2:15:12<39:02:04, 4.35s/it] {'loss': 0.2247, 'grad_norm': 1.0364764713063785, 'learning_rate': 9.978579756872592e-06, 'epoch': 0.06} 6%|▌ | 2009/34278 
[2:15:12<39:02:04, 4.35s/it] 6%|▌ | 2010/34278 [2:15:15<35:27:50, 3.96s/it] {'loss': 0.2159, 'grad_norm': 1.1422669472235525, 'learning_rate': 9.978536051106463e-06, 'epoch': 0.06} 6%|▌ | 2010/34278 [2:15:15<35:27:50, 3.96s/it] 6%|▌ | 2011/34278 [2:15:18<33:18:59, 3.72s/it] {'loss': 0.2107, 'grad_norm': 1.0130676558291927, 'learning_rate': 9.978492300893153e-06, 'epoch': 0.06} 6%|▌ | 2011/34278 [2:15:18<33:18:59, 3.72s/it] 6%|▌ | 2012/34278 [2:15:22<32:56:00, 3.67s/it] {'loss': 0.1724, 'grad_norm': 1.0082381433490109, 'learning_rate': 9.978448506233051e-06, 'epoch': 0.06} 6%|▌ | 2012/34278 [2:15:22<32:56:00, 3.67s/it] 6%|▌ | 2013/34278 [2:15:28<39:51:02, 4.45s/it] {'loss': 0.2165, 'grad_norm': 1.2784693655345292, 'learning_rate': 9.978404667126551e-06, 'epoch': 0.06} 6%|▌ | 2013/34278 [2:15:28<39:51:02, 4.45s/it] 6%|▌ | 2014/34278 [2:15:31<35:28:28, 3.96s/it] {'loss': 0.2168, 'grad_norm': 1.0081354924071733, 'learning_rate': 9.978360783574042e-06, 'epoch': 0.06} 6%|▌ | 2014/34278 [2:15:31<35:28:28, 3.96s/it] 6%|▌ | 2015/34278 [2:15:34<34:22:34, 3.84s/it] {'loss': 0.2045, 'grad_norm': 1.1363437190587646, 'learning_rate': 9.978316855575916e-06, 'epoch': 0.06} 6%|▌ | 2015/34278 [2:15:34<34:22:34, 3.84s/it] 6%|▌ | 2016/34278 [2:15:37<31:55:52, 3.56s/it] {'loss': 0.1864, 'grad_norm': 1.0779278282124554, 'learning_rate': 9.978272883132566e-06, 'epoch': 0.06} 6%|▌ | 2016/34278 [2:15:37<31:55:52, 3.56s/it] 6%|▌ | 2017/34278 [2:15:41<31:18:43, 3.49s/it] {'loss': 0.2105, 'grad_norm': 1.0006301243041265, 'learning_rate': 9.978228866244383e-06, 'epoch': 0.06} 6%|▌ | 2017/34278 [2:15:41<31:18:43, 3.49s/it] 6%|▌ | 2018/34278 [2:15:45<33:39:32, 3.76s/it] {'loss': 0.1791, 'grad_norm': 0.8534965953262954, 'learning_rate': 9.97818480491176e-06, 'epoch': 0.06} 6%|▌ | 2018/34278 [2:15:45<33:39:32, 3.76s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 2019/34278 [2:15:48<32:14:06, 3.60s/it] {'loss': 0.1959, 'grad_norm': 1.1380684561062862, 'learning_rate': 9.978140699135096e-06, 'epoch': 0.06} 6%|▌ | 2019/34278 [2:15:48<32:14:06, 3.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 2020/34278 [2:15:51<30:49:42, 3.44s/it] {'loss': 0.1901, 'grad_norm': 0.9092644734509622, 'learning_rate': 9.978096548914778e-06, 'epoch': 0.06} 6%|▌ | 2020/34278 [2:15:51<30:49:42, 3.44s/it] 6%|▌ | 2021/34278 [2:15:57<38:06:57, 4.25s/it] {'loss': 0.232, 'grad_norm': 1.029610767674436, 'learning_rate': 9.9780523542512e-06, 'epoch': 0.06} 6%|▌ | 2021/34278 [2:15:57<38:06:57, 4.25s/it] 6%|▌ | 2022/34278 [2:16:01<35:28:42, 3.96s/it] {'loss': 0.2083, 'grad_norm': 0.9509463445738663, 'learning_rate': 9.978008115144761e-06, 'epoch': 0.06} 6%|▌ | 2022/34278 [2:16:01<35:28:42, 3.96s/it] 6%|▌ | 2023/34278 [2:16:04<32:49:09, 3.66s/it] {'loss': 0.2071, 'grad_norm': 1.005289032997692, 'learning_rate': 9.977963831595854e-06, 'epoch': 0.06} 6%|▌ | 2023/34278 [2:16:04<32:49:09, 3.66s/it] 6%|▌ | 2024/34278 [2:16:07<31:38:18, 3.53s/it] {'loss': 0.207, 'grad_norm': 0.8294734518263973, 'learning_rate': 9.977919503604874e-06, 'epoch': 0.06} 6%|▌ | 2024/34278 [2:16:07<31:38:18, 3.53s/it] 6%|▌ | 2025/34278 [2:16:10<31:32:30, 3.52s/it] {'loss': 0.2089, 'grad_norm': 0.9621392304939402, 'learning_rate': 9.977875131172217e-06, 'epoch': 0.06} 6%|▌ | 2025/34278 [2:16:10<31:32:30, 3.52s/it] 6%|▌ | 2026/34278 [2:16:13<30:34:20, 3.41s/it] {'loss': 0.1857, 'grad_norm': 0.8859118314173687, 'learning_rate': 9.97783071429828e-06, 'epoch': 0.06} 6%|▌ | 2026/34278 [2:16:13<30:34:20, 3.41s/it] 6%|▌ | 2027/34278 [2:16:20<38:33:27, 4.30s/it] {'loss': 0.2273, 'grad_norm': 0.8735600781913359, 'learning_rate': 
9.977786252983457e-06, 'epoch': 0.06} 6%|▌ | 2027/34278 [2:16:20<38:33:27, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 2028/34278 [2:16:24<37:23:36, 4.17s/it] {'loss': 0.2202, 'grad_norm': 0.9573591880266328, 'learning_rate': 9.977741747228148e-06, 'epoch': 0.06} 6%|▌ | 2028/34278 [2:16:24<37:23:36, 4.17s/it] 6%|▌ | 2029/34278 [2:16:29<40:11:33, 4.49s/it] {'loss': 0.2016, 'grad_norm': 0.7455728674735858, 'learning_rate': 9.977697197032748e-06, 'epoch': 0.06} 6%|▌ | 2029/34278 [2:16:29<40:11:33, 4.49s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 2030/34278 [2:16:32<36:43:57, 4.10s/it] {'loss': 0.1896, 'grad_norm': 0.8173184985776126, 'learning_rate': 9.977652602397657e-06, 'epoch': 0.06} 6%|▌ | 2030/34278 [2:16:32<36:43:57, 4.10s/it] 6%|▌ | 2031/34278 [2:16:36<35:07:20, 3.92s/it] {'loss': 0.1841, 'grad_norm': 1.0818039921066034, 'learning_rate': 9.977607963323271e-06, 'epoch': 0.06} 6%|▌ | 2031/34278 [2:16:36<35:07:20, 3.92s/it] 6%|▌ | 2032/34278 [2:16:39<34:10:28, 3.82s/it] {'loss': 0.199, 'grad_norm': 1.0360598980075821, 'learning_rate': 9.977563279809988e-06, 'epoch': 0.06} 6%|▌ | 2032/34278 [2:16:39<34:10:28, 3.82s/it] 6%|▌ | 2033/34278 [2:16:44<36:33:34, 4.08s/it] {'loss': 0.2106, 'grad_norm': 0.8977184522135336, 'learning_rate': 9.97751855185821e-06, 'epoch': 0.06} 6%|▌ | 2033/34278 [2:16:44<36:33:34, 4.08s/it] 6%|▌ | 2034/34278 [2:16:47<34:36:01, 3.86s/it] {'loss': 0.2273, 'grad_norm': 1.03659852654072, 'learning_rate': 9.977473779468334e-06, 'epoch': 0.06} 6%|▌ | 2034/34278 [2:16:47<34:36:01, 3.86s/it] 6%|▌ | 2035/34278 [2:16:51<33:32:20, 3.74s/it] {'loss': 0.1978, 'grad_norm': 
1.080838841640869, 'learning_rate': 9.977428962640761e-06, 'epoch': 0.06} 6%|▌ | 2035/34278 [2:16:51<33:32:20, 3.74s/it] 6%|▌ | 2036/34278 [2:16:57<40:32:01, 4.53s/it] {'loss': 0.2239, 'grad_norm': 0.9819375788267134, 'learning_rate': 9.977384101375888e-06, 'epoch': 0.06} 6%|▌ | 2036/34278 [2:16:57<40:32:01, 4.53s/it] 6%|▌ | 2037/34278 [2:17:00<37:03:27, 4.14s/it] {'loss': 0.1818, 'grad_norm': 0.7784810299083599, 'learning_rate': 9.97733919567412e-06, 'epoch': 0.06} 6%|▌ | 2037/34278 [2:17:00<37:03:27, 4.14s/it] 6%|▌ | 2038/34278 [2:17:03<34:27:14, 3.85s/it] {'loss': 0.2055, 'grad_norm': 1.072008696160696, 'learning_rate': 9.977294245535856e-06, 'epoch': 0.06} 6%|▌ | 2038/34278 [2:17:03<34:27:14, 3.85s/it] 6%|▌ | 2039/34278 [2:17:07<34:22:17, 3.84s/it] {'loss': 0.1994, 'grad_norm': 1.0240489608288772, 'learning_rate': 9.977249250961499e-06, 'epoch': 0.06} 6%|▌ | 2039/34278 [2:17:07<34:22:17, 3.84s/it] 6%|▌ | 2040/34278 [2:17:11<32:54:59, 3.68s/it] {'loss': 0.2183, 'grad_norm': 0.8617751983860222, 'learning_rate': 9.977204211951446e-06, 'epoch': 0.06} 6%|▌ | 2040/34278 [2:17:11<32:54:59, 3.68s/it] 6%|▌ | 2041/34278 [2:17:14<31:20:26, 3.50s/it] {'loss': 0.2032, 'grad_norm': 0.8679195463876015, 'learning_rate': 9.977159128506102e-06, 'epoch': 0.06} 6%|▌ | 2041/34278 [2:17:14<31:20:26, 3.50s/it] 6%|▌ | 2042/34278 [2:17:17<30:44:19, 3.43s/it] {'loss': 0.1828, 'grad_norm': 1.0049132936024472, 'learning_rate': 9.97711400062587e-06, 'epoch': 0.06} 6%|▌ | 2042/34278 [2:17:17<30:44:19, 3.43s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8442 > 8192). 
Running this sequence through the model will result in indexing errors 6%|▌ | 2043/34278 [2:17:23<37:42:47, 4.21s/it] {'loss': 0.1925, 'grad_norm': 1.000480612387385, 'learning_rate': 9.977068828311153e-06, 'epoch': 0.06} 6%|▌ | 2043/34278 [2:17:23<37:42:47, 4.21s/it] 6%|▌ | 2044/34278 [2:17:26<35:49:42, 4.00s/it] {'loss': 0.2182, 'grad_norm': 1.046292525768253, 'learning_rate': 9.977023611562353e-06, 'epoch': 0.06} 6%|▌ | 2044/34278 [2:17:26<35:49:42, 4.00s/it] 6%|▌ | 2045/34278 [2:17:29<33:03:31, 3.69s/it] {'loss': 0.2134, 'grad_norm': 1.143336121358048, 'learning_rate': 9.976978350379874e-06, 'epoch': 0.06} 6%|▌ | 2045/34278 [2:17:29<33:03:31, 3.69s/it] 6%|▌ | 2046/34278 [2:17:32<31:15:27, 3.49s/it] {'loss': 0.1778, 'grad_norm': 1.1360663591828706, 'learning_rate': 9.97693304476412e-06, 'epoch': 0.06} 6%|▌ | 2046/34278 [2:17:33<31:15:27, 3.49s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 2047/34278 [2:17:36<30:54:30, 3.45s/it] {'loss': 0.1715, 'grad_norm': 0.936331408856632, 'learning_rate': 9.976887694715499e-06, 'epoch': 0.06} 6%|▌ | 2047/34278 [2:17:36<30:54:30, 3.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 2048/34278 [2:17:42<38:45:41, 4.33s/it] {'loss': 0.2057, 'grad_norm': 0.8936052799897413, 'learning_rate': 9.976842300234408e-06, 'epoch': 0.06} 6%|▌ | 2048/34278 [2:17:42<38:45:41, 4.33s/it] 6%|▌ | 2049/34278 [2:17:46<36:42:48, 4.10s/it] {'loss': 0.2327, 'grad_norm': 1.1175970002868079, 'learning_rate': 9.976796861321261e-06, 'epoch': 0.06} 6%|▌ | 2049/34278 [2:17:46<36:42:48, 4.10s/it] 6%|▌ | 2050/34278 [2:17:49<33:45:34, 3.77s/it] {'loss': 0.2057, 'grad_norm': 1.231150848434283, 'learning_rate': 9.976751377976457e-06, 'epoch': 0.06} 6%|▌ | 2050/34278 [2:17:49<33:45:34, 3.77s/it] 6%|▌ | 2051/34278 [2:17:53<34:12:12, 3.82s/it] {'loss': 0.2203, 'grad_norm': 1.2699398517881084, 'learning_rate': 9.976705850200406e-06, 'epoch': 0.06} 6%|▌ | 2051/34278 [2:17:53<34:12:12, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8389 > 8192). 
Running this sequence through the model will result in indexing errors 6%|▌ | 2052/34278 [2:17:56<31:40:14, 3.54s/it] {'loss': 0.1984, 'grad_norm': 1.0646876206759452, 'learning_rate': 9.976660277993512e-06, 'epoch': 0.06} 6%|▌ | 2052/34278 [2:17:56<31:40:14, 3.54s/it] 6%|▌ | 2053/34278 [2:17:59<31:09:57, 3.48s/it] {'loss': 0.2195, 'grad_norm': 0.8956860862984926, 'learning_rate': 9.976614661356185e-06, 'epoch': 0.06} 6%|▌ | 2053/34278 [2:17:59<31:09:57, 3.48s/it] 6%|▌ | 2054/34278 [2:18:02<30:28:07, 3.40s/it] {'loss': 0.1961, 'grad_norm': 0.8997181148171646, 'learning_rate': 9.976569000288829e-06, 'epoch': 0.06} 6%|▌ | 2054/34278 [2:18:02<30:28:07, 3.40s/it] 6%|▌ | 2055/34278 [2:18:06<32:24:14, 3.62s/it] {'loss': 0.2122, 'grad_norm': 1.2379573760388698, 'learning_rate': 9.976523294791853e-06, 'epoch': 0.06} 6%|▌ | 2055/34278 [2:18:06<32:24:14, 3.62s/it] 6%|▌ | 2056/34278 [2:18:09<30:10:28, 3.37s/it] {'loss': 0.2087, 'grad_norm': 0.9920211167696346, 'learning_rate': 9.976477544865665e-06, 'epoch': 0.06} 6%|▌ | 2056/34278 [2:18:09<30:10:28, 3.37s/it] 6%|▌ | 2057/34278 [2:18:12<30:05:19, 3.36s/it] {'loss': 0.1898, 'grad_norm': 1.0675356660624904, 'learning_rate': 9.976431750510676e-06, 'epoch': 0.06} 6%|▌ | 2057/34278 [2:18:12<30:05:19, 3.36s/it] 6%|▌ | 2058/34278 [2:18:17<33:03:16, 3.69s/it] {'loss': 0.1789, 'grad_norm': 1.0178051082291153, 'learning_rate': 9.976385911727288e-06, 'epoch': 0.06} 6%|▌ | 2058/34278 [2:18:17<33:03:16, 3.69s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 2059/34278 [2:18:22<37:15:42, 4.16s/it] {'loss': 0.2228, 'grad_norm': 1.125079253882043, 'learning_rate': 9.976340028515919e-06, 'epoch': 0.06} 6%|▌ | 2059/34278 [2:18:22<37:15:42, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 6%|▌ | 2060/34278 [2:18:26<37:34:06, 4.20s/it] {'loss': 0.2186, 'grad_norm': 1.082058563289299, 'learning_rate': 9.97629410087697e-06, 'epoch': 0.06} 6%|▌ | 2060/34278 [2:18:26<37:34:06, 4.20s/it] 6%|▌ | 2061/34278 [2:18:32<40:13:24, 4.49s/it] {'loss': 0.2472, 'grad_norm': 1.0186351006469425, 'learning_rate': 9.976248128810857e-06, 'epoch': 0.06} 6%|▌ | 2061/34278 [2:18:32<40:13:24, 4.49s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 2062/34278 [2:18:35<36:55:56, 4.13s/it] {'loss': 0.194, 'grad_norm': 0.831530272410781, 'learning_rate': 9.97620211231799e-06, 'epoch': 0.06} 6%|▌ | 2062/34278 [2:18:35<36:55:56, 4.13s/it] 6%|▌ | 2063/34278 [2:18:39<37:04:03, 4.14s/it] {'loss': 0.184, 'grad_norm': 1.1083999837317409, 'learning_rate': 9.976156051398777e-06, 'epoch': 0.06} 6%|▌ | 2063/34278 [2:18:39<37:04:03, 4.14s/it] 6%|▌ | 2064/34278 [2:18:43<35:16:12, 3.94s/it] {'loss': 0.2222, 'grad_norm': 1.3425873591207818, 'learning_rate': 9.97610994605363e-06, 'epoch': 0.06} 6%|▌ | 2064/34278 [2:18:43<35:16:12, 3.94s/it] 6%|▌ | 2065/34278 [2:18:46<34:38:21, 3.87s/it] {'loss': 0.1892, 'grad_norm': 1.0101926257248675, 'learning_rate': 9.976063796282963e-06, 'epoch': 0.06} 6%|▌ | 2065/34278 [2:18:46<34:38:21, 3.87s/it] 6%|▌ | 2066/34278 [2:18:50<33:02:48, 3.69s/it] {'loss': 0.1887, 'grad_norm': 0.9631860566449668, 'learning_rate': 9.976017602087184e-06, 'epoch': 0.06} 6%|▌ | 2066/34278 [2:18:50<33:02:48, 3.69s/it] 6%|▌ | 2067/34278 [2:18:55<37:39:03, 4.21s/it] {'loss': 0.2246, 'grad_norm': 1.1055800441615407, 'learning_rate': 9.97597136346671e-06, 'epoch': 0.06} 6%|▌ | 2067/34278 [2:18:55<37:39:03, 4.21s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 2068/34278 [2:18:59<37:50:15, 4.23s/it] {'loss': 0.2035, 'grad_norm': 0.8726134047829306, 'learning_rate': 9.97592508042195e-06, 'epoch': 0.06} 6%|▌ | 2068/34278 [2:18:59<37:50:15, 4.23s/it] 6%|▌ | 2069/34278 [2:19:05<42:44:42, 4.78s/it] {'loss': 0.2137, 'grad_norm': 0.934936738990123, 'learning_rate': 9.97587875295332e-06, 'epoch': 0.06} 6%|▌ | 2069/34278 [2:19:05<42:44:42, 4.78s/it] 6%|▌ | 2070/34278 [2:19:08<37:55:17, 4.24s/it] {'loss': 0.2112, 'grad_norm': 0.9841407267817264, 'learning_rate': 9.975832381061232e-06, 'epoch': 0.06} 6%|▌ | 2070/34278 [2:19:08<37:55:17, 4.24s/it] 6%|▌ | 2071/34278 [2:19:13<39:50:01, 4.45s/it] {'loss': 0.2042, 'grad_norm': 0.9152843445444, 'learning_rate': 9.9757859647461e-06, 'epoch': 0.06} 6%|▌ | 2071/34278 [2:19:13<39:50:01, 4.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 2072/34278 [2:19:18<40:03:57, 4.48s/it] {'loss': 0.2077, 'grad_norm': 1.0854987493128536, 'learning_rate': 9.975739504008338e-06, 'epoch': 0.06} 6%|▌ | 2072/34278 [2:19:18<40:03:57, 4.48s/it] 6%|▌ | 2073/34278 [2:19:22<39:16:05, 4.39s/it] {'loss': 0.1957, 'grad_norm': 0.8586813028540773, 'learning_rate': 9.975692998848363e-06, 'epoch': 0.06} 6%|▌ | 2073/34278 [2:19:22<39:16:05, 4.39s/it] 6%|▌ | 2074/34278 [2:19:25<34:48:58, 3.89s/it] {'loss': 0.216, 'grad_norm': 0.9470440215588506, 'learning_rate': 9.975646449266588e-06, 'epoch': 0.06} 6%|▌ | 2074/34278 [2:19:25<34:48:58, 3.89s/it] 6%|▌ | 2075/34278 [2:19:29<35:31:53, 3.97s/it] {'loss': 0.2233, 'grad_norm': 0.9768831241799613, 'learning_rate': 9.97559985526343e-06, 'epoch': 0.06} 6%|▌ | 2075/34278 [2:19:29<35:31:53, 3.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 2076/34278 [2:19:32<33:06:01, 3.70s/it] {'loss': 0.1833, 'grad_norm': 0.763257404956762, 'learning_rate': 9.975553216839302e-06, 'epoch': 0.06} 6%|▌ | 2076/34278 [2:19:32<33:06:01, 3.70s/it] 6%|▌ | 2077/34278 [2:19:35<31:50:05, 3.56s/it] {'loss': 0.2049, 'grad_norm': 0.8869851474454065, 'learning_rate': 9.975506533994625e-06, 'epoch': 0.06} 6%|▌ | 2077/34278 [2:19:35<31:50:05, 3.56s/it] 6%|▌ | 2078/34278 [2:19:38<30:40:55, 3.43s/it] {'loss': 0.2135, 'grad_norm': 1.0869497748315717, 'learning_rate': 9.975459806729813e-06, 'epoch': 0.06} 6%|▌ | 2078/34278 [2:19:38<30:40:55, 3.43s/it] 6%|▌ | 2079/34278 [2:19:42<31:15:06, 3.49s/it] {'loss': 0.1918, 'grad_norm': 1.1255167613899464, 'learning_rate': 9.975413035045283e-06, 'epoch': 0.06} 6%|▌ | 2079/34278 [2:19:42<31:15:06, 3.49s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 2080/34278 [2:19:45<30:27:27, 3.41s/it] {'loss': 0.2113, 'grad_norm': 0.9134421328922334, 'learning_rate': 9.975366218941452e-06, 'epoch': 0.06} 6%|▌ | 2080/34278 [2:19:45<30:27:27, 3.41s/it] 6%|▌ | 2081/34278 [2:19:48<30:15:02, 3.38s/it] {'loss': 0.2129, 'grad_norm': 0.8104014621563806, 'learning_rate': 9.975319358418742e-06, 'epoch': 0.06} 6%|▌ | 2081/34278 [2:19:48<30:15:02, 3.38s/it] 6%|▌ | 2082/34278 [2:19:52<30:00:47, 3.36s/it] {'loss': 0.2514, 'grad_norm': 1.000356659589392, 'learning_rate': 9.975272453477566e-06, 'epoch': 0.06} 6%|▌ | 2082/34278 [2:19:52<30:00:47, 3.36s/it] 6%|▌ | 2083/34278 [2:19:58<36:43:39, 4.11s/it] {'loss': 0.2025, 'grad_norm': 0.9025457848077182, 'learning_rate': 9.975225504118346e-06, 'epoch': 0.06} 6%|▌ | 2083/34278 [2:19:58<36:43:39, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 6%|▌ | 2084/34278 [2:20:00<33:27:42, 3.74s/it] {'loss': 0.2402, 'grad_norm': 0.9106535213473448, 'learning_rate': 9.975178510341502e-06, 'epoch': 0.06} 6%|▌ | 2084/34278 [2:20:00<33:27:42, 3.74s/it] 6%|▌ | 2085/34278 [2:20:04<33:11:27, 3.71s/it] {'loss': 0.222, 'grad_norm': 0.9690881799970482, 'learning_rate': 9.97513147214745e-06, 'epoch': 0.06} 6%|▌ | 2085/34278 [2:20:04<33:11:27, 3.71s/it] 6%|▌ | 2086/34278 [2:20:07<32:21:58, 3.62s/it] {'loss': 0.1889, 'grad_norm': 1.1146932688117845, 'learning_rate': 9.975084389536612e-06, 'epoch': 0.06} 6%|▌ | 2086/34278 [2:20:08<32:21:58, 3.62s/it] 6%|▌ | 2087/34278 [2:20:11<31:45:36, 3.55s/it] {'loss': 0.1824, 'grad_norm': 1.213574459301282, 'learning_rate': 9.975037262509408e-06, 'epoch': 0.06} 6%|▌ | 2087/34278 [2:20:11<31:45:36, 3.55s/it] 6%|▌ | 2088/34278 [2:20:15<32:10:44, 3.60s/it] {'loss': 0.2122, 'grad_norm': 1.068821129367693, 'learning_rate': 9.974990091066258e-06, 'epoch': 0.06} 6%|▌ | 2088/34278 [2:20:15<32:10:44, 3.60s/it] 6%|▌ | 2089/34278 [2:20:18<30:29:47, 3.41s/it] {'loss': 0.2019, 'grad_norm': 0.9554917426061998, 'learning_rate': 9.974942875207587e-06, 'epoch': 0.06} 6%|▌ | 2089/34278 [2:20:18<30:29:47, 3.41s/it] 6%|▌ | 2090/34278 [2:20:21<31:23:20, 3.51s/it] {'loss': 0.2087, 'grad_norm': 0.8726338411025051, 'learning_rate': 9.974895614933814e-06, 'epoch': 0.06} 6%|▌ | 2090/34278 [2:20:21<31:23:20, 3.51s/it] 6%|▌ | 2091/34278 [2:20:25<30:53:00, 3.45s/it] {'loss': 0.2152, 'grad_norm': 1.0445440885430617, 'learning_rate': 9.974848310245357e-06, 'epoch': 0.06} 6%|▌ | 2091/34278 [2:20:25<30:53:00, 3.45s/it] 6%|▌ | 2092/34278 [2:20:28<29:46:16, 3.33s/it] {'loss': 0.2054, 'grad_norm': 0.9555721092593108, 'learning_rate': 9.974800961142644e-06, 'epoch': 0.06} 6%|▌ | 2092/34278 [2:20:28<29:46:16, 3.33s/it] 6%|▌ | 2093/34278 [2:20:32<32:45:53, 3.66s/it] {'loss': 0.2132, 'grad_norm': 1.143783034132988, 'learning_rate': 9.974753567626095e-06, 'epoch': 0.06} 6%|▌ | 2093/34278 
6%|▌ | 2094/34278 [2:20:36<32:49:39, 3.67s/it] {'loss': 0.2086, 'grad_norm': 0.9620146347135539, 'learning_rate': 9.974706129696134e-06, 'epoch': 0.06}
6%|▌ | 2095/34278 [2:20:42<38:49:41, 4.34s/it] {'loss': 0.2216, 'grad_norm': 1.1374169372691725, 'learning_rate': 9.974658647353183e-06, 'epoch': 0.06}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
6%|▌ | 2096/34278 [2:20:47<41:44:46, 4.67s/it] {'loss': 0.2303, 'grad_norm': 1.1718917859515348, 'learning_rate': 9.974611120597669e-06, 'epoch': 0.06}
6%|▌ | 2097/34278 [2:20:52<41:16:05, 4.62s/it] {'loss': 0.2146, 'grad_norm': 0.8625288177060566, 'learning_rate': 9.974563549430015e-06, 'epoch': 0.06}
6%|▌ | 2098/34278 [2:20:54<36:23:44, 4.07s/it] {'loss': 0.1687, 'grad_norm': 0.8379967462769726, 'learning_rate': 9.974515933850643e-06, 'epoch': 0.06}
6%|▌ | 2099/34278 [2:20:58<35:19:43, 3.95s/it] {'loss': 0.1764, 'grad_norm': 0.776264281700595, 'learning_rate': 9.97446827385998e-06, 'epoch': 0.06}
6%|▌ | 2100/34278 [2:21:02<34:15:32, 3.83s/it] {'loss': 0.2087, 'grad_norm': 0.8726575437798988, 'learning_rate': 9.974420569458453e-06, 'epoch': 0.06}
6%|▌ | 2101/34278 [2:21:08<40:27:25, 4.53s/it] {'loss': 0.208, 'grad_norm': 0.8770632912289654, 'learning_rate': 9.974372820646488e-06, 'epoch': 0.06}
6%|▌ | 2102/34278 [2:21:13<43:29:25, 4.87s/it] {'loss': 0.2045, 'grad_norm': 0.8742580299006035, 'learning_rate': 9.974325027424508e-06, 'epoch': 0.06}
6%|▌ | 2103/34278 [2:21:17<40:42:26, 4.55s/it] {'loss': 0.1734, 'grad_norm': 0.8363969625727758, 'learning_rate': 9.974277189792942e-06, 'epoch': 0.06}
6%|▌ | 2104/34278 [2:21:21<37:21:30, 4.18s/it] {'loss': 0.2064, 'grad_norm': 0.9791199443738712, 'learning_rate': 9.974229307752216e-06, 'epoch': 0.06}
6%|▌ | 2105/34278 [2:21:27<43:35:22, 4.88s/it] {'loss': 0.2158, 'grad_norm': 0.8474516665512822, 'learning_rate': 9.97418138130276e-06, 'epoch': 0.06}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4277243c90>
Failed to fetch sample 3423932. Exception: cannot identify image file <_io.BytesIO object at 0x7f4277243c90>
6%|▌ | 2106/34278 [2:21:30<39:00:57, 4.37s/it] {'loss': 0.209, 'grad_norm': 1.1531210677607857, 'learning_rate': 9.974133410444999e-06, 'epoch': 0.06}
6%|▌ | 2107/34278 [2:21:33<35:17:44, 3.95s/it] {'loss': 0.1918, 'grad_norm': 1.1580693325474198, 'learning_rate': 9.974085395179363e-06, 'epoch': 0.06}
6%|▌ | 2108/34278 [2:21:37<34:08:41, 3.82s/it] {'loss': 0.2347, 'grad_norm': 1.1289985282730919, 'learning_rate': 9.974037335506279e-06, 'epoch': 0.06}
6%|▌ | 2109/34278 [2:21:43<40:11:29, 4.50s/it] {'loss': 0.1917, 'grad_norm': 1.0283689599134265, 'learning_rate': 9.973989231426177e-06, 'epoch': 0.06}
6%|▌ | 2110/34278 [2:21:47<40:11:22, 4.50s/it] {'loss': 0.2044, 'grad_norm': 1.289355969406982, 'learning_rate': 9.973941082939488e-06, 'epoch': 0.06}
6%|▌ | 2111/34278 [2:21:53<42:33:32, 4.76s/it] {'loss': 0.2256, 'grad_norm': 1.0637924130768446, 'learning_rate': 9.97389289004664e-06, 'epoch': 0.06}
6%|▌ | 2112/34278 [2:21:56<38:06:54, 4.27s/it] {'loss': 0.205, 'grad_norm': 0.8672140515563873, 'learning_rate': 9.973844652748063e-06, 'epoch': 0.06}
6%|▌ | 2113/34278 [2:21:59<35:56:59, 4.02s/it] {'loss': 0.1795, 'grad_norm': 1.0234528960065306, 'learning_rate': 9.973796371044187e-06, 'epoch': 0.06}
6%|▌ | 2114/34278 [2:22:05<41:29:40, 4.64s/it] {'loss': 0.2143, 'grad_norm': 1.2992995590118168, 'learning_rate': 9.973748044935446e-06, 'epoch': 0.06}
6%|▌ | 2115/34278 [2:22:11<42:48:26, 4.79s/it] {'loss': 0.2112, 'grad_norm': 1.0693041903325287, 'learning_rate': 9.97369967442227e-06, 'epoch': 0.06}
6%|▌ | 2116/34278 [2:22:14<38:19:00, 4.29s/it] {'loss': 0.2215, 'grad_norm': 0.9951063543454214, 'learning_rate': 9.973651259505091e-06, 'epoch': 0.06}
6%|▌ | 2117/34278 [2:22:17<34:38:07, 3.88s/it] {'loss': 0.1981, 'grad_norm': 1.1663975162463882, 'learning_rate': 9.973602800184339e-06, 'epoch': 0.06}
6%|▌ | 2118/34278 [2:22:21<35:05:53, 3.93s/it] {'loss': 0.1943, 'grad_norm': 0.975352262670444, 'learning_rate': 9.973554296460449e-06, 'epoch': 0.06}
6%|▌ | 2119/34278 [2:22:24<32:18:51, 3.62s/it] {'loss': 0.1886, 'grad_norm': 1.0415545024998738, 'learning_rate': 9.973505748333853e-06, 'epoch': 0.06}
6%|▌ | 2120/34278 [2:22:27<32:05:03, 3.59s/it] {'loss': 0.1918, 'grad_norm': 0.801931958725028, 'learning_rate': 9.973457155804988e-06, 'epoch': 0.06}
6%|▌ | 2121/34278 [2:22:31<33:06:17, 3.71s/it] {'loss': 0.2037, 'grad_norm': 1.215346746364988, 'learning_rate': 9.973408518874281e-06, 'epoch': 0.06}
6%|▌ | 2122/34278 [2:22:35<35:00:24, 3.92s/it] {'loss': 0.2045, 'grad_norm': 1.0733016608293688, 'learning_rate': 9.973359837542173e-06, 'epoch': 0.06}
6%|▌ | 2123/34278 [2:22:41<40:21:24, 4.52s/it] {'loss': 0.2541, 'grad_norm': 0.9902456673038482, 'learning_rate': 9.973311111809094e-06, 'epoch': 0.06}
6%|▌ | 2124/34278 [2:22:45<37:00:31, 4.14s/it] {'loss': 0.2081, 'grad_norm': 1.0124167059750098, 'learning_rate': 9.97326234167548e-06, 'epoch': 0.06}
6%|▌ | 2125/34278 [2:22:48<34:54:27, 3.91s/it] {'loss': 0.2013, 'grad_norm': 1.2004141251112712, 'learning_rate': 9.97321352714177e-06, 'epoch': 0.06}
6%|▌ | 2126/34278 [2:22:52<36:05:56, 4.04s/it] {'loss': 0.1998, 'grad_norm': 0.9116694966161774, 'learning_rate': 9.973164668208394e-06, 'epoch': 0.06}
6%|▌ | 2127/34278 [2:22:56<35:03:26, 3.93s/it] {'loss': 0.2186, 'grad_norm': 1.074177487403381, 'learning_rate': 9.973115764875792e-06, 'epoch': 0.06}
6%|▌ | 2128/34278 [2:23:01<37:03:19, 4.15s/it] {'loss': 0.2039, 'grad_norm': 0.9602229025107512, 'learning_rate': 9.973066817144398e-06, 'epoch': 0.06}
6%|▌ | 2129/34278 [2:23:03<33:16:45, 3.73s/it] {'loss': 0.2004, 'grad_norm': 0.8902293676577513, 'learning_rate': 9.973017825014652e-06, 'epoch': 0.06}
6%|▌ | 2130/34278 [2:23:09<38:29:54, 4.31s/it] {'loss': 0.1958, 'grad_norm': 0.8580018720372313, 'learning_rate': 9.972968788486992e-06, 'epoch': 0.06}
6%|▌ | 2131/34278 [2:23:12<35:31:26, 3.98s/it] {'loss': 0.2243, 'grad_norm': 1.041394416810979, 'learning_rate': 9.972919707561852e-06, 'epoch': 0.06}
6%|▌ | 2132/34278 [2:23:16<33:46:27, 3.78s/it] {'loss': 0.2111, 'grad_norm': 1.021644935083276, 'learning_rate': 9.97287058223967e-06, 'epoch': 0.06}
6%|▌ | 2133/34278 [2:23:19<33:54:34, 3.80s/it] {'loss': 0.2568, 'grad_norm': 1.0695922251783396, 'learning_rate': 9.97282141252089e-06, 'epoch': 0.06}
6%|▌ | 2134/34278 [2:23:26<40:34:45, 4.54s/it] {'loss': 0.1948, 'grad_norm': 0.8600497510893946, 'learning_rate': 9.972772198405945e-06, 'epoch': 0.06}
6%|▌ | 2135/34278 [2:23:29<36:24:24, 4.08s/it] {'loss': 0.1983, 'grad_norm': 0.9383435682456496, 'learning_rate': 9.972722939895279e-06, 'epoch': 0.06}
6%|▌ | 2136/34278 [2:23:34<40:45:44, 4.57s/it] {'loss': 0.1963, 'grad_norm': 0.9670154811684295, 'learning_rate': 9.972673636989327e-06, 'epoch': 0.06}
Token indices sequence length is longer than the specified maximum sequence length for this model (9061 > 8192). Running this sequence through the model will result in indexing errors
6%|▌ | 2137/34278 [2:23:37<36:36:21, 4.10s/it] {'loss': 0.1784, 'grad_norm': 1.0911027522815946, 'learning_rate': 9.972624289688533e-06, 'epoch': 0.06}
Token indices sequence length is longer than the specified maximum sequence length for this model (9829 > 8192). Running this sequence through the model will result in indexing errors
6%|▌ | 2138/34278 [2:23:42<37:44:46, 4.23s/it] {'loss': 0.201, 'grad_norm': 1.1053076465996292, 'learning_rate': 9.972574897993338e-06, 'epoch': 0.06}
6%|▌ | 2139/34278 [2:23:48<43:18:45, 4.85s/it] {'loss': 0.2232, 'grad_norm': 1.022009117982567, 'learning_rate': 9.97252546190418e-06, 'epoch': 0.06}
6%|▌ | 2140/34278 [2:23:54<44:53:06, 5.03s/it] {'loss': 0.202, 'grad_norm': 0.8704373189860868, 'learning_rate': 9.972475981421502e-06, 'epoch': 0.06}
6%|▌ | 2141/34278 [2:23:57<39:50:31, 4.46s/it] {'loss': 0.2024, 'grad_norm': 0.8073386618088111, 'learning_rate': 9.972426456545745e-06, 'epoch': 0.06}
6%|▌ | 2142/34278 [2:24:00<36:57:40, 4.14s/it] {'loss': 0.1864, 'grad_norm': 0.9390175056440538, 'learning_rate': 9.972376887277353e-06, 'epoch': 0.06}
6%|▋ | 2143/34278 [2:24:03<34:11:25, 3.83s/it] {'loss': 0.1939, 'grad_norm': 0.8741455883610038, 'learning_rate': 9.972327273616765e-06, 'epoch': 0.06}
6%|▋ | 2144/34278 [2:24:07<32:32:32, 3.65s/it] {'loss': 0.1739, 'grad_norm': 0.8554803354568955, 'learning_rate': 9.972277615564428e-06, 'epoch': 0.06}
6%|▋ | 2145/34278 [2:24:10<32:41:33, 3.66s/it] {'loss': 0.2174, 'grad_norm': 0.928084158501416, 'learning_rate': 9.972227913120782e-06, 'epoch': 0.06}
6%|▋ | 2146/34278 [2:24:13<30:42:57, 3.44s/it] {'loss': 0.1803, 'grad_norm': 0.9267381160754137, 'learning_rate': 9.972178166286273e-06, 'epoch': 0.06}
6%|▋ | 2147/34278 [2:24:19<36:11:04, 4.05s/it] {'loss': 0.209, 'grad_norm': 0.8733856471252143, 'learning_rate': 9.972128375061345e-06, 'epoch': 0.06}
6%|▋ | 2148/34278 [2:24:22<33:32:39, 3.76s/it] {'loss': 0.1997, 'grad_norm': 1.0027675655267019, 'learning_rate': 9.97207853944644e-06, 'epoch': 0.06}
6%|▋ | 2149/34278 [2:24:25<31:34:57, 3.54s/it] {'loss': 0.2078, 'grad_norm': 0.8467316584835951, 'learning_rate': 9.972028659442006e-06, 'epoch': 0.06}
6%|▋ | 2150/34278 [2:24:30<37:29:12, 4.20s/it] {'loss': 0.1902, 'grad_norm': 0.9590674906366835, 'learning_rate': 9.971978735048487e-06, 'epoch': 0.06}
6%|▋ | 2151/34278 [2:24:35<39:07:05, 4.38s/it] {'loss': 0.2028, 'grad_norm': 0.9162306588055258, 'learning_rate': 9.971928766266328e-06, 'epoch': 0.06}
6%|▋ | 2152/34278 [2:24:39<36:45:01, 4.12s/it] {'loss': 0.2138, 'grad_norm': 1.0010169971801015, 'learning_rate': 9.971878753095975e-06, 'epoch': 0.06}
6%|▋ | 2153/34278 [2:24:42<34:43:54, 3.89s/it] {'loss': 0.2317, 'grad_norm': 0.9096766102413545, 'learning_rate': 9.971828695537877e-06, 'epoch': 0.06}
6%|▋ | 2154/34278 [2:24:47<38:23:25, 4.30s/it] {'loss': 0.209, 'grad_norm': 0.8916366375062106, 'learning_rate': 9.97177859359248e-06, 'epoch': 0.06}
6%|▋ | 2155/34278 [2:24:51<35:38:17, 3.99s/it] {'loss': 0.1851, 'grad_norm': 0.8601853419671421, 'learning_rate': 9.97172844726023e-06, 'epoch': 0.06}
6%|▋ | 2156/34278 [2:24:54<32:59:47, 3.70s/it] {'loss': 0.199, 'grad_norm': 0.9660578264513769, 'learning_rate': 9.971678256541573e-06, 'epoch': 0.06}
6%|▋ | 2157/34278 [2:24:58<34:53:19, 3.91s/it] {'loss': 0.2355, 'grad_norm': 1.021043723589428, 'learning_rate': 9.971628021436962e-06, 'epoch': 0.06}
6%|▋ | 2158/34278 [2:25:03<38:42:04, 4.34s/it] {'loss': 0.2054, 'grad_norm': 1.044111847940825, 'learning_rate': 9.971577741946841e-06, 'epoch': 0.06}
6%|▋ | 2159/34278 [2:25:06<35:04:27, 3.93s/it] {'loss': 0.1827, 'grad_norm': 0.947055773303262, 'learning_rate': 9.971527418071663e-06, 'epoch': 0.06}
6%|▋ | 2160/34278 [2:25:10<33:35:28, 3.77s/it] {'loss': 0.24, 'grad_norm': 0.9863468194876462, 'learning_rate': 9.971477049811873e-06, 'epoch': 0.06}
6%|▋ | 2161/34278 [2:25:14<34:18:12, 3.85s/it] {'loss': 0.1986, 'grad_norm': 0.8877793611940751, 'learning_rate': 9.971426637167924e-06, 'epoch': 0.06}
6%|▋ | 2162/34278 [2:25:17<32:09:31, 3.60s/it] {'loss': 0.1767, 'grad_norm': 0.9084343410271073, 'learning_rate': 9.971376180140264e-06, 'epoch': 0.06}
6%|▋ | 2163/34278 [2:25:20<30:44:41, 3.45s/it] {'loss': 0.2152, 'grad_norm': 0.9435400721303311, 'learning_rate': 9.971325678729344e-06, 'epoch': 0.06}
6%|▋ | 2164/34278 [2:25:24<31:10:30, 3.49s/it] {'loss': 0.2022, 'grad_norm': 0.8291741826520561, 'learning_rate': 9.971275132935616e-06, 'epoch': 0.06}
6%|▋ | 2165/34278 [2:25:29<37:14:09, 4.17s/it] {'loss': 0.1944, 'grad_norm': 1.102884801299515, 'learning_rate': 9.97122454275953e-06, 'epoch': 0.06}
6%|▋ | 2166/34278 [2:25:33<35:25:51, 3.97s/it] {'loss': 0.1794, 'grad_norm': 0.7718465396264464, 'learning_rate': 9.971173908201536e-06, 'epoch': 0.06}
6%|▋ | 2167/34278 [2:25:37<34:50:37, 3.91s/it] {'loss': 0.2162, 'grad_norm': 0.9722053784672414, 'learning_rate': 9.971123229262091e-06, 'epoch': 0.06}
6%|▋ | 2168/34278 [2:25:42<38:37:29, 4.33s/it] {'loss': 0.2148, 'grad_norm': 0.7703595361079513, 'learning_rate': 9.971072505941643e-06, 'epoch': 0.06}
6%|▋ | 2169/34278 [2:25:45<36:38:32, 4.11s/it] {'loss': 0.2194, 'grad_norm': 0.8627728017973386, 'learning_rate': 9.971021738240648e-06, 'epoch': 0.06}
6%|▋ | 2170/34278 [2:25:49<33:49:08, 3.79s/it] {'loss': 0.1784, 'grad_norm': 0.8168786062791672, 'learning_rate': 9.970970926159556e-06, 'epoch': 0.06}
6%|▋ | 2171/34278 [2:25:51<31:19:48, 3.51s/it] {'loss': 0.1989, 'grad_norm': 0.9458401907464605, 'learning_rate': 9.970920069698822e-06, 'epoch': 0.06}
6%|▋ | 2172/34278 [2:25:58<38:17:44, 4.29s/it] {'loss': 0.2097, 'grad_norm': 0.8681931135470768, 'learning_rate': 9.970869168858901e-06, 'epoch': 0.06}
6%|▋ | 2173/34278 [2:26:01<35:21:06, 3.96s/it] {'loss': 0.2039, 'grad_norm': 1.0313342547076338, 'learning_rate': 9.970818223640246e-06, 'epoch': 0.06}
6%|▋ | 2174/34278 [2:26:04<33:28:19, 3.75s/it] {'loss': 0.1973, 'grad_norm': 0.9147472245352678, 'learning_rate': 9.970767234043315e-06, 'epoch': 0.06}
6%|▋ | 2175/34278 [2:26:08<34:05:59, 3.82s/it] {'loss': 0.2036, 'grad_norm': 1.0247069348055946, 'learning_rate': 9.970716200068557e-06, 'epoch': 0.06}
6%|▋ | 2176/34278 [2:26:12<33:23:20, 3.74s/it] {'loss': 0.2063, 'grad_norm': 0.9377552652308071, 'learning_rate': 9.970665121716434e-06, 'epoch': 0.06}
6%|▋ | 2177/34278 [2:26:15<31:56:59, 3.58s/it] {'loss': 0.2351, 'grad_norm': 0.8406212963866995, 'learning_rate': 9.9706139989874e-06, 'epoch': 0.06}
6%|▋ | 2178/34278 [2:26:19<33:22:14, 3.74s/it] {'loss': 0.221, 'grad_norm': 0.910002269981033, 'learning_rate': 9.970562831881908e-06, 'epoch': 0.06}
6%|▋ | 2179/34278 [2:26:25<39:42:48, 4.45s/it] {'loss': 0.1821, 'grad_norm': 1.0106532591392394, 'learning_rate': 9.97051162040042e-06, 'epoch': 0.06}
6%|▋ | 2180/34278 [2:26:29<37:49:51, 4.24s/it] {'loss': 0.1846, 'grad_norm': 0.8683149191940535, 'learning_rate': 9.970460364543388e-06, 'epoch': 0.06}
6%|▋ | 2181/34278 [2:26:35<42:35:12, 4.78s/it] {'loss': 0.1887, 'grad_norm': 0.8223202382821131, 'learning_rate': 9.970409064311275e-06, 'epoch': 0.06}
6%|▋ | 2182/34278 [2:26:38<38:25:52, 4.31s/it] {'loss': 0.1998, 'grad_norm': 1.1540314998941879, 'learning_rate': 9.970357719704535e-06, 'epoch': 0.06}
6%|▋ | 2183/34278 [2:26:42<36:44:28, 4.12s/it] {'loss': 0.1957, 'grad_norm': 0.8113968764563977, 'learning_rate': 9.97030633072363e-06, 'epoch': 0.06}
6%|▋ | 2184/34278 [2:26:46<37:10:14, 4.17s/it] {'loss': 0.1942, 'grad_norm': 1.0028786750066798, 'learning_rate': 9.970254897369014e-06, 'epoch': 0.06}
6%|▋ | 2185/34278 [2:26:49<33:59:24, 3.81s/it] {'loss': 0.2071, 'grad_norm': 0.8008843570967988, 'learning_rate': 9.970203419641152e-06, 'epoch': 0.06}
6%|▋ | 2186/34278 [2:26:53<34:15:47, 3.84s/it] {'loss': 0.1916, 'grad_norm': 0.9577374456325305, 'learning_rate': 9.970151897540496e-06, 'epoch': 0.06}
6%|▋ | 2187/34278 [2:26:56<32:11:11, 3.61s/it] {'loss': 0.1907, 'grad_norm': 0.7721337864368466, 'learning_rate': 9.970100331067515e-06, 'epoch': 0.06}
6%|▋ | 2188/34278 [2:26:59<31:56:25, 3.58s/it] {'loss': 0.189, 'grad_norm': 0.8214557848609138, 'learning_rate': 9.97004872022266e-06, 'epoch': 0.06}
Token indices sequence length is longer than the specified maximum sequence length for this model (8584 > 8192). Running this sequence through the model will result in indexing errors
6%|▋ | 2189/34278 [2:27:03<31:30:18, 3.53s/it] {'loss': 0.1985, 'grad_norm': 0.8534221733409112, 'learning_rate': 9.969997065006399e-06, 'epoch': 0.06}
6%|▋ | 2190/34278 [2:27:06<31:19:40, 3.51s/it] {'loss': 0.2277, 'grad_norm': 0.9731900665243962, 'learning_rate': 9.96994536541919e-06, 'epoch': 0.06}
6%|▋ | 2191/34278 [2:27:13<39:06:57, 4.39s/it] {'loss': 0.2049, 'grad_norm': 0.9098656679861912, 'learning_rate': 9.969893621461495e-06, 'epoch': 0.06}
6%|▋ | 2192/34278 [2:27:16<35:51:25, 4.02s/it] {'loss': 0.1972, 'grad_norm': 1.156345296203105, 'learning_rate': 9.969841833133778e-06, 'epoch': 0.06}
6%|▋ | 2193/34278 [2:27:19<34:17:41, 3.85s/it] {'loss': 0.2061, 'grad_norm': 1.029714304537173, 'learning_rate': 9.969790000436498e-06, 'epoch': 0.06}
6%|▋ | 2194/34278 [2:27:22<32:04:09, 3.60s/it] {'loss': 0.2277, 'grad_norm': 0.839849005411028, 'learning_rate': 9.969738123370118e-06, 'epoch': 0.06}
6%|▋ | 2195/34278 [2:27:26<31:55:25, 3.58s/it] {'loss': 0.2031, 'grad_norm': 1.1541475573991666, 'learning_rate': 9.969686201935105e-06, 'epoch': 0.06}
6%|▋ | 2196/34278 [2:27:29<30:47:36, 3.46s/it] {'loss': 0.1992, 'grad_norm': 1.039905017633847, 'learning_rate': 9.969634236131918e-06, 'epoch': 0.06}
6%|▋ | 2197/34278 [2:27:33<32:36:09, 3.66s/it] {'loss': 0.185, 'grad_norm': 1.0827334882608222, 'learning_rate': 9.969582225961025e-06, 'epoch': 0.06}
6%|▋ | 2198/34278 [2:27:39<38:59:32, 4.38s/it] {'loss': 0.2021, 'grad_norm': 0.8294662301533692, 'learning_rate': 9.969530171422886e-06, 'epoch': 0.06}
6%|▋ | 2199/34278 [2:27:43<36:04:36, 4.05s/it] {'loss': 0.1953, 'grad_norm': 1.16260507700567, 'learning_rate': 9.969478072517968e-06, 'epoch': 0.06}
6%|▋ | 2200/34278 [2:27:46<34:13:25, 3.84s/it] {'loss': 0.204, 'grad_norm': 0.910075438686915, 'learning_rate': 9.969425929246739e-06, 'epoch': 0.06}
6%|▋ | 2201/34278 [2:27:49<32:44:18, 3.67s/it] {'loss': 0.1982, 'grad_norm': 0.7471934485024296, 'learning_rate': 9.969373741609659e-06, 'epoch': 0.06}
6%|▋ | 2202/34278 [2:27:53<32:14:53, 3.62s/it] {'loss': 0.1888, 'grad_norm': 0.8166737270541704, 'learning_rate': 9.969321509607197e-06, 'epoch': 0.06}
6%|▋ | 2203/34278 [2:27:56<32:01:53, 3.60s/it] {'loss': 0.1879, 'grad_norm': 1.0086772959630657, 'learning_rate': 9.969269233239819e-06, 'epoch': 0.06}
6%|▋ | 2204/34278 [2:28:02<38:15:08, 4.29s/it] {'loss': 0.1971, 'grad_norm': 0.9159443963949536, 'learning_rate': 9.96921691250799e-06, 'epoch': 0.06}
6%|▋ | 2205/34278 [2:28:06<36:46:46, 4.13s/it] {'loss': 0.1744, 'grad_norm': 0.884458649367082, 'learning_rate': 9.969164547412182e-06, 'epoch': 0.06}
6%|▋ | 2206/34278 [2:28:09<33:34:17, 3.77s/it] {'loss': 0.1923, 'grad_norm': 0.8610354642199655, 'learning_rate': 9.969112137952856e-06, 'epoch': 0.06}
Token indices sequence length is longer than the specified maximum sequence length for this model (10639 > 8192). Running this sequence through the model will result in indexing errors
6%|▋ | 2207/34278 [2:28:12<32:21:39, 3.63s/it] {'loss': 0.1913, 'grad_norm': 1.1471539639614219, 'learning_rate': 9.969059684130484e-06, 'epoch': 0.06}
6%|▋ | 2208/34278 [2:28:16<31:54:14, 3.58s/it] {'loss': 0.2078, 'grad_norm': 1.0115666785152742, 'learning_rate': 9.969007185945534e-06, 'epoch': 0.06}
6%|▋ | 2209/34278 [2:28:19<31:17:09, 3.51s/it] {'loss': 0.2172, 'grad_norm': 0.9262368849840441, 'learning_rate': 9.968954643398474e-06, 'epoch': 0.06}
6%|▋ | 2210/34278 [2:28:22<31:06:39, 3.49s/it] {'loss': 0.2038, 'grad_norm': 1.2146042940550938, 'learning_rate': 9.968902056489773e-06, 'epoch': 0.06}
6%|▋ | 2211/34278 [2:28:26<32:42:59, 3.67s/it] {'loss': 0.2207, 'grad_norm': 1.1360427085963671, 'learning_rate': 9.9688494252199e-06, 'epoch': 0.06}
6%|▋ | 2212/34278 [2:28:30<33:39:01, 3.78s/it] {'loss': 0.2076, 'grad_norm': 0.9416998773571001, 'learning_rate': 9.968796749589328e-06, 'epoch': 0.06}
6%|▋ | 2213/34278 [2:28:34<32:22:35, 3.63s/it] {'loss': 0.1948, 'grad_norm': 0.9456392384313109, 'learning_rate': 9.96874402959852e-06, 'epoch': 0.06}
6%|▋ | 2214/34278 [2:28:37<31:17:19, 3.51s/it] {'loss': 0.1727, 'grad_norm': 1.03189788717096, 'learning_rate': 9.968691265247954e-06, 'epoch': 0.06}
6%|▋ | 2215/34278 [2:28:43<39:06:08, 4.39s/it] {'loss': 0.2015, 'grad_norm': 0.8727096691708024, 'learning_rate': 9.968638456538101e-06, 'epoch': 0.06}
6%|▋ | 2216/34278 [2:28:47<36:08:35, 4.06s/it] {'loss': 0.2137, 'grad_norm': 1.02655922936091, 'learning_rate': 9.968585603469427e-06, 'epoch': 0.06}
6%|▋ | 2217/34278 [2:28:51<35:46:53, 4.02s/it] {'loss': 0.2149, 'grad_norm': 1.028569638988256, 'learning_rate': 9.968532706042406e-06, 'epoch': 0.06}
6%|▋ | 2218/34278 [2:28:54<33:43:39, 3.79s/it] {'loss': 0.2076, 'grad_norm': 1.0048528039586941, 'learning_rate': 9.968479764257513e-06, 'epoch': 0.06}
6%|▋ | 2219/34278 [2:28:57<32:57:11, 3.70s/it] {'loss': 0.1826, 'grad_norm': 0.8780669246841813, 'learning_rate': 9.968426778115218e-06, 'epoch': 0.06}
6%|▋ | 2220/34278 [2:29:01<31:47:46, 3.57s/it] {'loss': 0.178, 'grad_norm': 0.8531201656836703, 'learning_rate': 9.968373747615996e-06, 'epoch': 0.06}
6%|▋ | 2221/34278 [2:29:04<30:49:10, 3.46s/it] {'loss': 0.1789, 'grad_norm': 0.8613243761746758, 'learning_rate': 9.968320672760318e-06, 'epoch': 0.06}
6%|▋ | 2222/34278 [2:29:07<29:57:59, 3.37s/it] {'loss': 0.19, 'grad_norm': 0.9149738482474605, 'learning_rate': 9.968267553548659e-06, 'epoch': 0.06}
6%|▋ | 2223/34278 [2:29:10<29:38:26, 3.33s/it] {'loss': 0.211, 'grad_norm': 1.0257280791333028, 'learning_rate': 9.968214389981494e-06, 'epoch': 0.06}
6%|▋ | 2224/34278 [2:29:16<36:50:13, 4.14s/it] {'loss': 0.1982, 'grad_norm': 0.9591929787116885, 'learning_rate': 9.968161182059297e-06, 'epoch': 0.06}
6%|▋ | 2225/34278 [2:29:20<36:13:55, 4.07s/it] {'loss': 0.1922, 'grad_norm': 0.8951345868663532, 'learning_rate': 9.968107929782543e-06, 'epoch': 0.06}
6%|▋ | 2226/34278 [2:29:24<34:39:25, 3.89s/it] {'loss': 0.1967, 'grad_norm': 0.7471293912834188, 'learning_rate': 9.968054633151707e-06, 'epoch': 0.06}
6%|▋ | 2227/34278 [2:29:27<32:59:13, 3.71s/it] {'loss': 0.2329, 'grad_norm': 0.8574026233590367, 'learning_rate': 9.968001292167264e-06, 'epoch': 0.06}
6%|▋ | 2228/34278 [2:29:32<35:44:40, 4.01s/it] {'loss': 0.227, 'grad_norm': 0.873056545143532, 'learning_rate': 9.967947906829694e-06, 'epoch': 0.06}
7%|▋ | 2229/34278 [2:29:36<35:18:20, 3.97s/it] {'loss': 0.2438, 'grad_norm': 0.9092896561420567, 'learning_rate': 9.967894477139468e-06, 'epoch': 0.07}
7%|▋ | 2230/34278 [2:29:39<32:57:04, 3.70s/it] {'loss': 0.1972, 'grad_norm': 0.8392856461878418, 'learning_rate': 9.967841003097068e-06, 'epoch': 0.07}
7%|▋ | 2231/34278 [2:29:41<30:26:20, 3.42s/it] {'loss': 0.1983, 'grad_norm': 0.9152020310585482, 'learning_rate': 9.967787484702968e-06, 'epoch': 0.07}
7%|▋ | 2232/34278 [2:29:46<34:36:46, 3.89s/it] {'loss': 0.1929, 'grad_norm': 0.7857343524542046, 'learning_rate': 9.96773392195765e-06, 'epoch': 0.07}
7%|▋ | 2233/34278 [2:29:50<33:22:53, 3.75s/it] {'loss': 0.211, 'grad_norm': 0.8205865412173147, 'learning_rate': 9.967680314861587e-06, 'epoch': 0.07}
7%|▋ | 2234/34278 [2:29:53<31:57:50, 3.59s/it] {'loss': 0.2158, 'grad_norm': 0.8703512415040333, 'learning_rate': 9.967626663415261e-06, 'epoch': 0.07}
7%|▋ | 2235/34278 [2:29:57<31:47:18, 3.57s/it] {'loss': 0.2311, 'grad_norm': 0.9683867770588545, 'learning_rate': 9.96757296761915e-06, 'epoch': 0.07}
7%|▋ | 2236/34278 [2:30:00<31:46:03, 3.57s/it] {'loss': 0.1998, 'grad_norm': 0.8237442188773494, 'learning_rate': 9.967519227473733e-06, 'epoch': 0.07}
7%|▋ | 2237/34278 [2:30:03<30:18:11, 3.40s/it] {'loss': 0.1877, 'grad_norm': 0.8338489173292051, 'learning_rate': 9.96746544297949e-06, 'epoch': 0.07}
7%|▋ | 2238/34278 [2:30:06<29:22:08, 3.30s/it] {'loss': 0.1998, 'grad_norm': 0.9623474023766088, 'learning_rate': 9.967411614136902e-06, 'epoch': 0.07}
7%|▋ | 2239/34278 [2:30:10<30:32:03, 3.43s/it] {'loss': 0.1996, 'grad_norm': 1.1544915738343882, 'learning_rate': 9.967357740946448e-06, 'epoch': 0.07}
7%|▋ | 2240/34278 [2:30:14<31:07:42, 3.50s/it] {'loss': 0.1962, 'grad_norm': 0.8763839105754301, 'learning_rate': 9.967303823408612e-06, 'epoch': 0.07}
7%|▋ | 2241/34278 [2:30:18<34:22:14, 3.86s/it] {'loss': 0.2032, 'grad_norm': 0.8973607034259703, 'learning_rate': 9.96724986152387e-06, 'epoch': 0.07}
7%|▋ | 2242/34278 [2:30:21<31:59:16, 3.59s/it] {'loss': 0.2016, 'grad_norm': 1.0217125120852515, 'learning_rate': 9.96719585529271e-06, 'epoch': 0.07}
7%|▋ | 2243/34278 [2:30:26<35:02:55, 3.94s/it] {'loss': 0.2156, 'grad_norm': 0.914861141037201, 'learning_rate': 9.96714180471561e-06, 'epoch': 0.07}
7%|▋ | 2244/34278 [2:30:31<37:03:53, 4.17s/it] {'loss': 0.1942, 'grad_norm': 0.852790688174234, 'learning_rate': 9.967087709793053e-06, 'epoch': 0.07}
7%|▋ | 2245/34278 [2:30:34<35:57:06, 4.04s/it] {'loss': 0.188, 'grad_norm': 0.8901363975744998, 'learning_rate': 9.967033570525525e-06, 'epoch': 0.07}
7%|▋ | 2246/34278 [2:30:39<36:30:20, 4.10s/it] {'loss': 0.2115, 'grad_norm': 0.8225870555184273, 'learning_rate': 9.966979386913504e-06, 'epoch': 0.07}
7%|▋ | 2247/34278 [2:30:42<34:38:49, 3.89s/it] {'loss': 0.1863, 'grad_norm': 0.863887118505021, 'learning_rate': 9.966925158957479e-06, 'epoch': 0.07}
7%|▋ | 2248/34278 [2:30:45<32:27:23, 3.65s/it] {'loss': 0.214, 'grad_norm': 0.9544258461725252, 'learning_rate': 9.966870886657932e-06, 'epoch': 0.07}
7%|▋ | 2249/34278 [2:30:48<31:28:15, 3.54s/it] {'loss': 0.2219, 'grad_norm': 1.417183988957697, 'learning_rate': 9.966816570015345e-06, 'epoch': 0.07}
7%|▋ | 2250/34278 [2:30:51<30:09:00, 3.39s/it] {'loss': 0.2249, 'grad_norm': 1.1918637172628517, 'learning_rate': 9.966762209030208e-06, 'epoch': 0.07}
Gradients will be None warnings.warn( 7%|▋ | 2251/34278 [2:30:55<30:23:56, 3.42s/it] {'loss': 0.2128, 'grad_norm': 0.8183769491972958, 'learning_rate': 9.966707803703002e-06, 'epoch': 0.07} 7%|▋ | 2251/34278 [2:30:55<30:23:56, 3.42s/it] 7%|▋ | 2252/34278 [2:30:59<31:02:11, 3.49s/it] {'loss': 0.2209, 'grad_norm': 1.0000162894697773, 'learning_rate': 9.966653354034214e-06, 'epoch': 0.07} 7%|▋ | 2252/34278 [2:30:59<31:02:11, 3.49s/it] 7%|▋ | 2253/34278 [2:31:05<39:13:09, 4.41s/it] {'loss': 0.1842, 'grad_norm': 1.0300333643567683, 'learning_rate': 9.966598860024332e-06, 'epoch': 0.07} 7%|▋ | 2253/34278 [2:31:05<39:13:09, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2254/34278 [2:31:11<43:04:49, 4.84s/it] {'loss': 0.2076, 'grad_norm': 0.9772418886702062, 'learning_rate': 9.966544321673839e-06, 'epoch': 0.07} 7%|▋ | 2254/34278 [2:31:11<43:04:49, 4.84s/it] 7%|▋ | 2255/34278 [2:31:14<38:09:21, 4.29s/it] {'loss': 0.1676, 'grad_norm': 1.2678545558541043, 'learning_rate': 9.966489738983226e-06, 'epoch': 0.07} 7%|▋ | 2255/34278 [2:31:14<38:09:21, 4.29s/it] 7%|▋ | 2256/34278 [2:31:17<35:47:01, 4.02s/it] {'loss': 0.2144, 'grad_norm': 1.107490942511155, 'learning_rate': 9.966435111952977e-06, 'epoch': 0.07} 7%|▋ | 2256/34278 [2:31:17<35:47:01, 4.02s/it] 7%|▋ | 2257/34278 [2:31:20<32:58:39, 3.71s/it] {'loss': 0.1835, 'grad_norm': 0.921844265717796, 'learning_rate': 9.966380440583581e-06, 'epoch': 0.07} 7%|▋ | 2257/34278 [2:31:20<32:58:39, 3.71s/it] 7%|▋ | 2258/34278 [2:31:24<31:37:29, 3.56s/it] {'loss': 0.2103, 'grad_norm': 1.3402928936873384, 'learning_rate': 9.966325724875527e-06, 'epoch': 0.07} 7%|▋ | 2258/34278 [2:31:24<31:37:29, 3.56s/it] 7%|▋ | 2259/34278 [2:31:27<32:08:09, 3.61s/it] {'loss': 0.2007, 'grad_norm': 0.855362458923824, 'learning_rate': 9.9662709648293e-06, 'epoch': 0.07} 
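The repeated `torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None` entries above come from gradient checkpointing being applied to a sub-module whose inputs carry no grad requirement (a common situation when embeddings or a vision tower are frozen). A minimal sketch reproducing the warning and the usual remedy; `block` is an illustrative stand-in, not the actual training code (Hugging Face models expose `model.enable_input_require_grads()` for the same purpose):

```python
import warnings
import torch
from torch.utils.checkpoint import checkpoint

def block(x):
    # Stand-in for a layer wrapped in gradient checkpointing.
    return x * 2.0

# Case 1: checkpointed input does not require grad -> the UserWarning above
# fires and no gradient will flow back through the checkpointed segment.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    x = torch.randn(4)
    y = checkpoint(block, x, use_reentrant=True)

# Case 2: mark the input as requiring grad before checkpointing, so the
# recomputation graph is kept and gradients reach earlier parameters.
x2 = torch.randn(4, requires_grad=True)
y2 = checkpoint(block, x2, use_reentrant=True)
y2.sum().backward()
```

When the warning fires on every step for the same module, as in this log, it usually means that module is effectively detached from the optimizer update, which may or may not be intended.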
7%|▋ | 2259/34278 [2:31:27<32:08:09, 3.61s/it] 7%|▋ | 2260/34278 [2:31:30<30:33:12, 3.44s/it] {'loss': 0.2166, 'grad_norm': 1.0063562289024135, 'learning_rate': 9.966216160445394e-06, 'epoch': 0.07} 7%|▋ | 2260/34278 [2:31:30<30:33:12, 3.44s/it] 7%|▋ | 2261/34278 [2:31:34<31:06:56, 3.50s/it] {'loss': 0.2093, 'grad_norm': 1.0831744996916741, 'learning_rate': 9.966161311724296e-06, 'epoch': 0.07} 7%|▋ | 2261/34278 [2:31:34<31:06:56, 3.50s/it] 7%|▋ | 2262/34278 [2:31:37<30:11:58, 3.40s/it] {'loss': 0.1893, 'grad_norm': 0.6840487396997437, 'learning_rate': 9.966106418666494e-06, 'epoch': 0.07} 7%|▋ | 2262/34278 [2:31:37<30:11:58, 3.40s/it] 7%|▋ | 2263/34278 [2:31:41<31:37:45, 3.56s/it] {'loss': 0.2043, 'grad_norm': 0.9684448740508934, 'learning_rate': 9.96605148127248e-06, 'epoch': 0.07} 7%|▋ | 2263/34278 [2:31:41<31:37:45, 3.56s/it] 7%|▋ | 2264/34278 [2:31:44<31:09:39, 3.50s/it] {'loss': 0.1828, 'grad_norm': 1.0038273974017846, 'learning_rate': 9.965996499542742e-06, 'epoch': 0.07} 7%|▋ | 2264/34278 [2:31:45<31:09:39, 3.50s/it] 7%|▋ | 2265/34278 [2:31:49<33:41:50, 3.79s/it] {'loss': 0.2128, 'grad_norm': 0.8998160043586104, 'learning_rate': 9.965941473477775e-06, 'epoch': 0.07} 7%|▋ | 2265/34278 [2:31:49<33:41:50, 3.79s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2266/34278 [2:31:55<39:55:44, 4.49s/it] {'loss': 0.209, 'grad_norm': 0.8605748093744029, 'learning_rate': 9.965886403078067e-06, 'epoch': 0.07} 7%|▋ | 2266/34278 [2:31:55<39:55:44, 4.49s/it] 7%|▋ | 2267/34278 [2:32:01<44:31:23, 5.01s/it] {'loss': 0.2088, 'grad_norm': 0.7303764695161269, 'learning_rate': 9.965831288344112e-06, 'epoch': 0.07} 7%|▋ | 2267/34278 [2:32:01<44:31:23, 5.01s/it] 7%|▋ | 2268/34278 [2:32:06<44:42:20, 5.03s/it] {'loss': 0.2036, 'grad_norm': 0.88000142850267, 'learning_rate': 9.9657761292764e-06, 'epoch': 0.07} 7%|▋ | 2268/34278 [2:32:06<44:42:20, 5.03s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2269/34278 [2:32:10<41:42:20, 4.69s/it] {'loss': 0.1771, 'grad_norm': 0.76853823738418, 'learning_rate': 9.965720925875421e-06, 'epoch': 0.07} 7%|▋ | 2269/34278 [2:32:10<41:42:20, 4.69s/it] 7%|▋ | 2270/34278 [2:32:14<40:24:40, 4.55s/it] {'loss': 0.2045, 'grad_norm': 0.916048883347126, 'learning_rate': 9.965665678141673e-06, 'epoch': 0.07} 7%|▋ | 2270/34278 [2:32:14<40:24:40, 4.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2271/34278 [2:32:18<37:54:20, 4.26s/it] {'loss': 0.1836, 'grad_norm': 1.0360752776009832, 'learning_rate': 9.96561038607565e-06, 'epoch': 0.07} 7%|▋ | 2271/34278 [2:32:18<37:54:20, 4.26s/it] 7%|▋ | 2272/34278 [2:32:22<36:54:45, 4.15s/it] {'loss': 0.2047, 'grad_norm': 0.935275447511629, 'learning_rate': 9.96555504967784e-06, 'epoch': 0.07} 7%|▋ | 2272/34278 [2:32:22<36:54:45, 4.15s/it] 7%|▋ | 2273/34278 [2:32:25<34:36:22, 3.89s/it] {'loss': 0.1808, 'grad_norm': 0.8368259005842033, 'learning_rate': 9.965499668948741e-06, 'epoch': 0.07} 7%|▋ | 2273/34278 [2:32:25<34:36:22, 3.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2274/34278 [2:32:28<32:28:58, 3.65s/it] {'loss': 0.2463, 'grad_norm': 1.2344291155833595, 'learning_rate': 9.965444243888846e-06, 'epoch': 0.07} 7%|▋ | 2274/34278 [2:32:28<32:28:58, 3.65s/it] 7%|▋ | 2275/34278 [2:32:32<33:03:37, 3.72s/it] {'loss': 0.2175, 'grad_norm': 1.2281465329487147, 'learning_rate': 9.96538877449865e-06, 'epoch': 0.07} 7%|▋ | 2275/34278 [2:32:32<33:03:37, 3.72s/it] 7%|▋ | 2276/34278 [2:32:36<32:31:44, 3.66s/it] {'loss': 0.1772, 'grad_norm': 1.0374219247249994, 'learning_rate': 9.965333260778649e-06, 'epoch': 0.07} 7%|▋ | 2276/34278 [2:32:36<32:31:44, 3.66s/it] 7%|▋ | 2277/34278 [2:32:39<30:34:18, 3.44s/it] {'loss': 0.2241, 'grad_norm': 0.8921888732790692, 'learning_rate': 9.965277702729338e-06, 'epoch': 0.07} 7%|▋ | 2277/34278 [2:32:39<30:34:18, 3.44s/it] 7%|▋ | 2278/34278 [2:32:41<28:41:52, 3.23s/it] {'loss': 0.1979, 'grad_norm': 1.0007032657722952, 'learning_rate': 9.965222100351211e-06, 'epoch': 0.07} 7%|▋ | 2278/34278 [2:32:41<28:41:52, 3.23s/it] 7%|▋ | 2279/34278 [2:32:45<28:32:12, 3.21s/it] {'loss': 0.1953, 'grad_norm': 0.8774868372628707, 'learning_rate': 9.965166453644767e-06, 'epoch': 0.07} 
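Further down, the log also warns `Token indices sequence length is longer than the specified maximum sequence length for this model (8262 > 8192). Running this sequence through the model will result in indexing errors`. A hedged sketch of the usual guard; the 8192 cutoff matches the warning, but the helper name is illustrative and a real pipeline would typically truncate at a turn boundary rather than mid-sequence:

```python
def clip_to_context_window(token_ids, model_max_length=8192):
    """Drop tokens beyond the model's context window to avoid indexing errors.

    Hard truncation is a simplification; conversation data is usually cut at
    a message boundary so labels stay aligned with inputs.
    """
    if len(token_ids) <= model_max_length:
        return token_ids
    return token_ids[:model_max_length]
```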
7%|▋ | 2279/34278 [2:32:45<28:32:12, 3.21s/it] 7%|▋ | 2280/34278 [2:32:48<28:08:14, 3.17s/it] {'loss': 0.1711, 'grad_norm': 0.7244194252569438, 'learning_rate': 9.965110762610504e-06, 'epoch': 0.07} 7%|▋ | 2280/34278 [2:32:48<28:08:14, 3.17s/it] 7%|▋ | 2281/34278 [2:32:51<29:39:55, 3.34s/it] {'loss': 0.2493, 'grad_norm': 1.0813616908314334, 'learning_rate': 9.965055027248915e-06, 'epoch': 0.07} 7%|▋ | 2281/34278 [2:32:51<29:39:55, 3.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2282/34278 [2:32:54<28:51:19, 3.25s/it] {'loss': 0.2176, 'grad_norm': 0.9372654404430211, 'learning_rate': 9.964999247560501e-06, 'epoch': 0.07} 7%|▋ | 2282/34278 [2:32:54<28:51:19, 3.25s/it] 7%|▋ | 2283/34278 [2:32:58<28:51:12, 3.25s/it] {'loss': 0.193, 'grad_norm': 0.9265101736038648, 'learning_rate': 9.96494342354576e-06, 'epoch': 0.07} 7%|▋ | 2283/34278 [2:32:58<28:51:12, 3.25s/it] 7%|▋ | 2284/34278 [2:33:04<36:40:45, 4.13s/it] {'loss': 0.2132, 'grad_norm': 1.1503195481058759, 'learning_rate': 9.964887555205189e-06, 'epoch': 0.07} 7%|▋ | 2284/34278 [2:33:04<36:40:45, 4.13s/it] 7%|▋ | 2285/34278 [2:33:10<41:18:19, 4.65s/it] {'loss': 0.202, 'grad_norm': 0.9757278540844526, 'learning_rate': 9.964831642539285e-06, 'epoch': 0.07} 7%|▋ | 2285/34278 [2:33:10<41:18:19, 4.65s/it] 7%|▋ | 2286/34278 [2:33:13<37:12:10, 4.19s/it] {'loss': 0.2026, 'grad_norm': 1.2181286560018258, 'learning_rate': 9.964775685548552e-06, 'epoch': 0.07} 7%|▋ | 2286/34278 [2:33:13<37:12:10, 4.19s/it] 7%|▋ | 2287/34278 [2:33:16<34:25:57, 3.87s/it] {'loss': 0.2049, 'grad_norm': 0.8442524527932412, 'learning_rate': 9.964719684233486e-06, 'epoch': 0.07} 7%|▋ | 2287/34278 [2:33:16<34:25:57, 3.87s/it] 7%|▋ | 2288/34278 [2:33:19<33:11:59, 3.74s/it] {'loss': 0.1791, 'grad_norm': 0.9553116229166884, 'learning_rate': 9.964663638594587e-06, 
'epoch': 0.07} 7%|▋ | 2288/34278 [2:33:19<33:11:59, 3.74s/it]
7%|▋ | 2289/34278 [2:33:25<37:42:46, 4.24s/it] {'loss': 0.1953, 'grad_norm': 0.8899230068080627, 'learning_rate': 9.964607548632356e-06, 'epoch': 0.07}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f161a981170>
Failed to fetch sample 2701718. Exception: cannot identify image file <_io.BytesIO object at 0x7f161a981170>
7%|▋ | 2290/34278 [2:33:28<36:06:07, 4.06s/it] {'loss': 0.2116, 'grad_norm': 1.0511859131465593, 'learning_rate': 9.964551414347297e-06, 'epoch': 0.07}
7%|▋ | 2291/34278 [2:33:32<35:28:02, 3.99s/it] {'loss': 0.1906, 'grad_norm': 0.9932528612213981, 'learning_rate': 9.964495235739907e-06, 'epoch': 0.07}
7%|▋ | 2292/34278 [2:33:36<34:02:03, 3.83s/it] {'loss': 0.1783, 'grad_norm': 1.0239559240853664, 'learning_rate': 9.964439012810686e-06, 'epoch': 0.07}
7%|▋ | 2293/34278 [2:33:40<35:30:24, 4.00s/it] {'loss': 0.2021, 'grad_norm': 1.2101691483406114, 'learning_rate': 9.96438274556014e-06, 'epoch': 0.07}
7%|▋ | 2294/34278 [2:33:46<40:48:27, 4.59s/it] {'loss': 0.2169, 'grad_norm': 0.8340825736135985, 'learning_rate': 9.96432643398877e-06, 'epoch': 0.07}
7%|▋ | 2295/34278 [2:33:52<44:57:58, 5.06s/it] {'loss': 0.2003, 'grad_norm': 0.9372531126845941, 'learning_rate': 9.96427007809708e-06, 'epoch': 0.07}
7%|▋ | 2296/34278 [2:33:56<42:08:09, 4.74s/it] {'loss': 0.1893, 'grad_norm': 0.9408517668032419, 'learning_rate': 9.964213677885571e-06, 'epoch': 0.07}
7%|▋ | 2297/34278 [2:34:00<38:21:59, 4.32s/it] {'loss': 0.1915, 'grad_norm': 0.9129308162969344, 'learning_rate': 9.964157233354745e-06, 'epoch': 0.07}
7%|▋ | 2298/34278 [2:34:03<35:22:21, 3.98s/it] {'loss': 0.2003, 'grad_norm': 0.8845099707748527, 'learning_rate': 9.964100744505111e-06, 'epoch': 0.07}
7%|▋ | 2299/34278 [2:34:08<37:50:42, 4.26s/it] {'loss': 0.2201, 'grad_norm': 1.063371864190776, 'learning_rate': 9.96404421133717e-06,
'epoch': 0.07} 7%|▋ | 2299/34278 [2:34:08<37:50:42, 4.26s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2300/34278 [2:34:11<35:14:37, 3.97s/it] {'loss': 0.1992, 'grad_norm': 1.2343004147392833, 'learning_rate': 9.963987633851427e-06, 'epoch': 0.07} 7%|▋ | 2300/34278 [2:34:11<35:14:37, 3.97s/it] 7%|▋ | 2301/34278 [2:34:15<35:22:29, 3.98s/it] {'loss': 0.1975, 'grad_norm': 0.8096666210595943, 'learning_rate': 9.963931012048387e-06, 'epoch': 0.07} 7%|▋ | 2301/34278 [2:34:15<35:22:29, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2302/34278 [2:34:19<34:36:55, 3.90s/it] {'loss': 0.2318, 'grad_norm': 0.9578026541774669, 'learning_rate': 9.963874345928557e-06, 'epoch': 0.07} 7%|▋ | 2302/34278 [2:34:19<34:36:55, 3.90s/it] 7%|▋ | 2303/34278 [2:34:24<37:15:37, 4.20s/it] {'loss': 0.2109, 'grad_norm': 0.9635990484625206, 'learning_rate': 9.963817635492441e-06, 'epoch': 0.07} 7%|▋ | 2303/34278 [2:34:24<37:15:37, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2304/34278 [2:34:27<35:17:02, 3.97s/it] {'loss': 0.1993, 'grad_norm': 0.9000620424505866, 'learning_rate': 9.963760880740545e-06, 'epoch': 0.07} 7%|▋ | 2304/34278 [2:34:27<35:17:02, 3.97s/it] 7%|▋ | 2305/34278 [2:34:32<38:34:14, 4.34s/it] {'loss': 0.2174, 'grad_norm': 0.8117717998021512, 'learning_rate': 9.96370408167338e-06, 'epoch': 0.07} 7%|▋ | 2305/34278 [2:34:32<38:34:14, 4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2306/34278 [2:34:36<36:51:28, 4.15s/it] {'loss': 0.1928, 'grad_norm': 0.8858406507149971, 'learning_rate': 9.963647238291446e-06, 'epoch': 0.07} 7%|▋ | 2306/34278 [2:34:36<36:51:28, 4.15s/it] 7%|▋ | 2307/34278 [2:34:39<34:26:21, 3.88s/it] {'loss': 0.1973, 'grad_norm': 0.9669617480035934, 'learning_rate': 9.963590350595258e-06, 'epoch': 0.07} 7%|▋ | 2307/34278 [2:34:39<34:26:21, 3.88s/it] 7%|▋ | 2308/34278 [2:34:44<37:41:50, 4.24s/it] {'loss': 0.1798, 'grad_norm': 0.7658286769158015, 'learning_rate': 9.963533418585318e-06, 'epoch': 0.07} 7%|▋ | 2308/34278 [2:34:44<37:41:50, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2309/34278 [2:34:48<36:26:40, 4.10s/it] {'loss': 0.2138, 'grad_norm': 0.9558386659030622, 'learning_rate': 9.963476442262136e-06, 'epoch': 0.07} 7%|▋ | 2309/34278 [2:34:48<36:26:40, 4.10s/it] 7%|▋ | 2310/34278 [2:34:51<33:59:28, 3.83s/it] {'loss': 0.1998, 'grad_norm': 0.9344254127484777, 'learning_rate': 9.963419421626224e-06, 'epoch': 0.07} 7%|▋ | 2310/34278 [2:34:51<33:59:28, 3.83s/it] 7%|▋ | 2311/34278 [2:34:56<35:30:40, 4.00s/it] {'loss': 0.2017, 'grad_norm': 0.8800064663338417, 'learning_rate': 9.963362356678086e-06, 'epoch': 0.07} 7%|▋ | 2311/34278 [2:34:56<35:30:40, 4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2312/34278 [2:34:59<34:10:08, 3.85s/it] {'loss': 0.1742, 'grad_norm': 0.7737275162632699, 'learning_rate': 9.963305247418234e-06, 'epoch': 0.07} 7%|▋ | 2312/34278 [2:34:59<34:10:08, 3.85s/it] 7%|▋ | 2313/34278 [2:35:02<32:28:44, 3.66s/it] {'loss': 0.1959, 'grad_norm': 1.1441889633269098, 'learning_rate': 9.963248093847179e-06, 'epoch': 0.07} 7%|▋ | 2313/34278 [2:35:02<32:28:44, 3.66s/it] 7%|▋ | 2314/34278 [2:35:05<30:05:50, 3.39s/it] {'loss': 0.1829, 'grad_norm': 1.052257520574918, 'learning_rate': 9.963190895965428e-06, 'epoch': 0.07} 7%|▋ | 2314/34278 [2:35:05<30:05:50, 3.39s/it] 7%|▋ | 2315/34278 [2:35:09<32:09:34, 3.62s/it] {'loss': 0.1834, 'grad_norm': 1.0701099476644802, 'learning_rate': 9.963133653773495e-06, 'epoch': 0.07} 7%|▋ | 2315/34278 [2:35:09<32:09:34, 3.62s/it] 7%|▋ | 2316/34278 [2:35:12<31:01:56, 3.50s/it] {'loss': 0.2044, 'grad_norm': 0.8497336606664448, 'learning_rate': 9.963076367271889e-06, 'epoch': 0.07} 7%|▋ | 2316/34278 [2:35:12<31:01:56, 3.50s/it] 7%|▋ | 2317/34278 [2:35:18<37:34:28, 4.23s/it] {'loss': 0.2138, 'grad_norm': 0.9013699327004027, 'learning_rate': 9.96301903646112e-06, 'epoch': 
0.07} 7%|▋ | 2317/34278 [2:35:18<37:34:28, 4.23s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2318/34278 [2:35:25<42:55:54, 4.84s/it] {'loss': 0.2128, 'grad_norm': 0.8988222302331794, 'learning_rate': 9.962961661341707e-06, 'epoch': 0.07} 7%|▋ | 2318/34278 [2:35:25<42:55:54, 4.84s/it] 7%|▋ | 2319/34278 [2:35:28<39:13:32, 4.42s/it] {'loss': 0.2091, 'grad_norm': 0.9897416661460551, 'learning_rate': 9.962904241914151e-06, 'epoch': 0.07} 7%|▋ | 2319/34278 [2:35:28<39:13:32, 4.42s/it] 7%|▋ | 2320/34278 [2:35:31<34:51:31, 3.93s/it] {'loss': 0.2044, 'grad_norm': 1.0865752192965261, 'learning_rate': 9.962846778178974e-06, 'epoch': 0.07} 7%|▋ | 2320/34278 [2:35:31<34:51:31, 3.93s/it] 7%|▋ | 2321/34278 [2:35:35<34:33:59, 3.89s/it] {'loss': 0.1885, 'grad_norm': 0.918354437620551, 'learning_rate': 9.962789270136687e-06, 'epoch': 0.07} 7%|▋ | 2321/34278 [2:35:35<34:33:59, 3.89s/it] 7%|▋ | 2322/34278 [2:35:38<32:04:23, 3.61s/it] {'loss': 0.204, 'grad_norm': 1.2876227223562233, 'learning_rate': 9.962731717787798e-06, 'epoch': 0.07} 7%|▋ | 2322/34278 [2:35:38<32:04:23, 3.61s/it] 7%|▋ | 2323/34278 [2:35:41<32:17:19, 3.64s/it] {'loss': 0.1953, 'grad_norm': 0.8520317395261258, 'learning_rate': 9.962674121132827e-06, 'epoch': 0.07} 7%|▋ | 2323/34278 [2:35:41<32:17:19, 3.64s/it] 7%|▋ | 2324/34278 [2:35:45<32:14:06, 3.63s/it] {'loss': 0.2028, 'grad_norm': 1.011606949292786, 'learning_rate': 9.962616480172287e-06, 'epoch': 0.07} 7%|▋ | 2324/34278 [2:35:45<32:14:06, 3.63s/it] 7%|▋ | 2325/34278 [2:35:48<31:03:42, 3.50s/it] {'loss': 0.1701, 'grad_norm': 0.9435219564123694, 'learning_rate': 9.96255879490669e-06, 'epoch': 0.07} 7%|▋ | 2325/34278 [2:35:48<31:03:42, 3.50s/it] 7%|▋ | 2326/34278 [2:35:51<30:02:37, 3.39s/it] {'loss': 0.2142, 'grad_norm': 0.9355932918071468, 'learning_rate': 9.962501065336553e-06, 
'epoch': 0.07} 7%|▋ | 2326/34278 [2:35:51<30:02:37, 3.39s/it] 7%|▋ | 2327/34278 [2:35:54<29:00:47, 3.27s/it] {'loss': 0.1902, 'grad_norm': 1.0504961242919153, 'learning_rate': 9.962443291462393e-06, 'epoch': 0.07} 7%|▋ | 2327/34278 [2:35:54<29:00:47, 3.27s/it] 7%|▋ | 2328/34278 [2:35:57<27:33:55, 3.11s/it] {'loss': 0.188, 'grad_norm': 1.0262234020607701, 'learning_rate': 9.962385473284723e-06, 'epoch': 0.07} 7%|▋ | 2328/34278 [2:35:57<27:33:55, 3.11s/it] 7%|▋ | 2329/34278 [2:36:00<28:13:11, 3.18s/it] {'loss': 0.1796, 'grad_norm': 0.8601566986948037, 'learning_rate': 9.962327610804059e-06, 'epoch': 0.07} 7%|▋ | 2329/34278 [2:36:00<28:13:11, 3.18s/it] 7%|▋ | 2330/34278 [2:36:04<30:05:59, 3.39s/it] {'loss': 0.196, 'grad_norm': 0.8643775448630018, 'learning_rate': 9.962269704020919e-06, 'epoch': 0.07} 7%|▋ | 2330/34278 [2:36:04<30:05:59, 3.39s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2331/34278 [2:36:07<29:07:16, 3.28s/it] {'loss': 0.1832, 'grad_norm': 1.0711486392766931, 'learning_rate': 9.962211752935821e-06, 'epoch': 0.07} 7%|▋ | 2331/34278 [2:36:07<29:07:16, 3.28s/it] 7%|▋ | 2332/34278 [2:36:11<29:27:54, 3.32s/it] {'loss': 0.2064, 'grad_norm': 0.8854287171512436, 'learning_rate': 9.96215375754928e-06, 'epoch': 0.07} 7%|▋ | 2332/34278 [2:36:11<29:27:54, 3.32s/it] 7%|▋ | 2333/34278 [2:36:14<28:23:27, 3.20s/it] {'loss': 0.1963, 'grad_norm': 1.0101785669871297, 'learning_rate': 9.962095717861816e-06, 'epoch': 0.07} 7%|▋ | 2333/34278 [2:36:14<28:23:27, 3.20s/it] 7%|▋ | 2334/34278 [2:36:18<31:05:51, 3.50s/it] {'loss': 0.2374, 'grad_norm': 1.0012062687223648, 'learning_rate': 9.962037633873945e-06, 'epoch': 0.07} 7%|▋ | 2334/34278 [2:36:18<31:05:51, 3.50s/it] 7%|▋ | 2335/34278 [2:36:21<31:26:52, 3.54s/it] {'loss': 0.187, 'grad_norm': 0.8667464615236197, 'learning_rate': 9.961979505586185e-06, 'epoch': 0.07} 7%|▋ | 2335/34278 [2:36:21<31:26:52, 3.54s/it] 7%|▋ | 2336/34278 [2:36:28<38:23:47, 4.33s/it] {'loss': 0.2079, 'grad_norm': 0.9361547793247379, 'learning_rate': 9.961921332999058e-06, 'epoch': 0.07} 7%|▋ | 2336/34278 [2:36:28<38:23:47, 4.33s/it] 7%|▋ | 2337/34278 [2:36:30<34:31:29, 3.89s/it] {'loss': 0.2029, 'grad_norm': 0.9839156956812861, 'learning_rate': 9.961863116113083e-06, 'epoch': 0.07} 7%|▋ | 2337/34278 [2:36:30<34:31:29, 3.89s/it] 7%|▋ | 2338/34278 [2:36:34<33:31:31, 3.78s/it] {'loss': 0.2236, 'grad_norm': 1.0421774610372838, 'learning_rate': 9.961804854928778e-06, 'epoch': 0.07} 7%|▋ | 2338/34278 [2:36:34<33:31:31, 3.78s/it] 7%|▋ | 2339/34278 [2:36:38<32:58:58, 3.72s/it] {'loss': 0.2267, 'grad_norm': 1.0811409838186092, 'learning_rate': 9.961746549446662e-06, 'epoch': 0.07} 7%|▋ | 2339/34278 [2:36:38<32:58:58, 3.72s/it] 7%|▋ | 2340/34278 [2:36:43<36:24:51, 4.10s/it] {'loss': 0.1794, 'grad_norm': 1.0460234373857449, 'learning_rate': 9.961688199667259e-06, 'epoch': 0.07} 7%|▋ | 2340/34278 
[2:36:43<36:24:51, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2341/34278 [2:36:46<33:48:14, 3.81s/it] {'loss': 0.2255, 'grad_norm': 0.9051163139657413, 'learning_rate': 9.961629805591088e-06, 'epoch': 0.07} 7%|▋ | 2341/34278 [2:36:46<33:48:14, 3.81s/it] 7%|▋ | 2342/34278 [2:36:49<32:13:32, 3.63s/it] {'loss': 0.1808, 'grad_norm': 1.1277562297268768, 'learning_rate': 9.96157136721867e-06, 'epoch': 0.07} 7%|▋ | 2342/34278 [2:36:49<32:13:32, 3.63s/it] 7%|▋ | 2343/34278 [2:36:53<33:32:17, 3.78s/it] {'loss': 0.2151, 'grad_norm': 0.9348417394825488, 'learning_rate': 9.961512884550529e-06, 'epoch': 0.07} 7%|▋ | 2343/34278 [2:36:53<33:32:17, 3.78s/it] 7%|▋ | 2344/34278 [2:36:57<32:44:43, 3.69s/it] {'loss': 0.1855, 'grad_norm': 1.0292995123316866, 'learning_rate': 9.961454357587183e-06, 'epoch': 0.07} 7%|▋ | 2344/34278 [2:36:57<32:44:43, 3.69s/it] 7%|▋ | 2345/34278 [2:37:00<31:28:57, 3.55s/it] {'loss': 0.2051, 'grad_norm': 0.9322885776989647, 'learning_rate': 9.961395786329158e-06, 'epoch': 0.07} 7%|▋ | 2345/34278 [2:37:00<31:28:57, 3.55s/it] 7%|▋ | 2346/34278 [2:37:04<34:35:49, 3.90s/it] {'loss': 0.1959, 'grad_norm': 1.0105816732637924, 'learning_rate': 9.961337170776974e-06, 'epoch': 0.07} 7%|▋ | 2346/34278 [2:37:04<34:35:49, 3.90s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2347/34278 [2:37:08<32:25:59, 3.66s/it] {'loss': 0.2075, 'grad_norm': 1.0499178862163263, 'learning_rate': 9.961278510931159e-06, 'epoch': 0.07} 7%|▋ | 2347/34278 [2:37:08<32:25:59, 3.66s/it] 7%|▋ | 2348/34278 [2:37:11<31:53:13, 3.60s/it] {'loss': 0.1925, 'grad_norm': 0.8912862401866359, 'learning_rate': 9.961219806792232e-06, 'epoch': 0.07} 7%|▋ | 2348/34278 [2:37:11<31:53:13, 3.60s/it] 7%|▋ | 2349/34278 [2:37:14<30:52:32, 3.48s/it] {'loss': 0.185, 'grad_norm': 0.9819560623045472, 'learning_rate': 9.96116105836072e-06, 'epoch': 0.07} 7%|▋ | 2349/34278 [2:37:14<30:52:32, 3.48s/it] 7%|▋ | 2350/34278 [2:37:17<29:45:36, 3.36s/it] {'loss': 0.2011, 'grad_norm': 0.9962237144968342, 'learning_rate': 9.961102265637144e-06, 'epoch': 0.07} 7%|▋ | 2350/34278 [2:37:17<29:45:36, 3.36s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8262 > 8192). Running this sequence through the model will result in indexing errors 7%|▋ | 2351/34278 [2:37:23<36:51:28, 4.16s/it] {'loss': 0.2263, 'grad_norm': 1.0182809788713403, 'learning_rate': 9.961043428622035e-06, 'epoch': 0.07} 7%|▋ | 2351/34278 [2:37:23<36:51:28, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2352/34278 [2:37:26<33:06:01, 3.73s/it] {'loss': 0.2016, 'grad_norm': 0.9143174112065799, 'learning_rate': 9.960984547315912e-06, 'epoch': 0.07} 7%|▋ | 2352/34278 [2:37:26<33:06:01, 3.73s/it] 7%|▋ | 2353/34278 [2:37:29<30:45:54, 3.47s/it] {'loss': 0.1839, 'grad_norm': 1.0485378149425884, 'learning_rate': 9.960925621719303e-06, 'epoch': 0.07} 7%|▋ | 2353/34278 [2:37:29<30:45:54, 3.47s/it] 7%|▋ | 2354/34278 [2:37:32<29:37:09, 3.34s/it] {'loss': 0.1631, 'grad_norm': 1.0526971123647655, 'learning_rate': 9.960866651832736e-06, 'epoch': 0.07} 7%|▋ | 2354/34278 [2:37:32<29:37:09, 3.34s/it] 7%|▋ | 2355/34278 [2:37:36<30:23:18, 3.43s/it] {'loss': 0.1991, 'grad_norm': 0.8948910382362572, 'learning_rate': 9.960807637656735e-06, 'epoch': 0.07} 7%|▋ | 2355/34278 [2:37:36<30:23:18, 3.43s/it] 7%|▋ | 2356/34278 [2:37:39<31:39:16, 3.57s/it] {'loss': 0.2164, 'grad_norm': 1.1887194133978207, 'learning_rate': 9.960748579191828e-06, 'epoch': 0.07} 7%|▋ | 2356/34278 [2:37:40<31:39:16, 3.57s/it] 7%|▋ | 2357/34278 [2:37:43<30:47:56, 3.47s/it] {'loss': 0.1861, 'grad_norm': 1.1574222769922864, 'learning_rate': 9.960689476438541e-06, 'epoch': 0.07} 7%|▋ | 2357/34278 [2:37:43<30:47:56, 3.47s/it] 7%|▋ | 2358/34278 [2:37:46<29:49:37, 3.36s/it] {'loss': 0.1801, 'grad_norm': 0.7514501214475601, 'learning_rate': 9.960630329397403e-06, 'epoch': 0.07} 7%|▋ | 2358/34278 [2:37:46<29:49:37, 3.36s/it] 7%|▋ | 2359/34278 [2:37:49<29:48:31, 3.36s/it] {'loss': 0.222, 'grad_norm': 0.9347542331968323, 'learning_rate': 9.960571138068942e-06, 'epoch': 0.07} 7%|▋ | 2359/34278 [2:37:49<29:48:31, 3.36s/it] 7%|▋ | 2360/34278 [2:37:53<30:06:06, 3.40s/it] {'loss': 0.2159, 'grad_norm': 0.8621373614996176, 'learning_rate': 9.960511902453685e-06, 'epoch': 0.07} 7%|▋ | 2360/34278 [2:37:53<30:06:06, 3.40s/it] 7%|▋ | 2361/34278 [2:37:56<29:42:50, 3.35s/it] {'loss': 0.1902, 'grad_norm': 0.9284847136860013, 'learning_rate': 9.960452622552163e-06, 'epoch': 0.07} 7%|▋ | 2361/34278 
7%|▋ | 2362/34278 [2:38:00<32:43:37, 3.69s/it] {'loss': 0.1985, 'grad_norm': 0.9108127425244467, 'learning_rate': 9.960393298364904e-06, 'epoch': 0.07}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
7%|▋ | 2363/34278 [2:38:03<30:57:08, 3.49s/it] {'loss': 0.1968, 'grad_norm': 0.8798707880760961, 'learning_rate': 9.960333929892438e-06, 'epoch': 0.07}
7%|▋ | 2364/34278 [2:38:09<35:17:56, 3.98s/it] {'loss': 0.2275, 'grad_norm': 0.9092462067129579, 'learning_rate': 9.960274517135294e-06, 'epoch': 0.07}
7%|▋ | 2365/34278 [2:38:12<33:44:22, 3.81s/it] {'loss': 0.2058, 'grad_norm': 1.0368817334127765, 'learning_rate': 9.960215060094004e-06, 'epoch': 0.07}
7%|▋ | 2366/34278 [2:38:19<41:22:24, 4.67s/it] {'loss': 0.1853, 'grad_norm': 0.861514927853887, 'learning_rate': 9.960155558769097e-06, 'epoch': 0.07}
7%|▋ | 2367/34278 [2:38:21<36:15:59, 4.09s/it] {'loss': 0.223, 'grad_norm': 0.8636982992003643, 'learning_rate': 9.960096013161105e-06, 'epoch': 0.07}
7%|▋ | 2368/34278 [2:38:25<35:00:09, 3.95s/it] {'loss': 0.2064, 'grad_norm': 0.860306211958087, 'learning_rate': 9.960036423270561e-06, 'epoch': 0.07}
7%|▋ | 2369/34278 [2:38:28<33:08:29, 3.74s/it] {'loss': 0.1825, 'grad_norm': 0.847459846554446, 'learning_rate': 9.959976789097997e-06, 'epoch': 0.07}
7%|▋ | 2370/34278 [2:38:32<32:48:59, 3.70s/it] {'loss': 0.1952, 'grad_norm': 0.9282284890303654, 'learning_rate': 9.959917110643942e-06, 'epoch': 0.07}
7%|▋ | 2371/34278 [2:38:36<33:13:56, 3.75s/it] {'loss': 0.1776, 'grad_norm': 0.939952568470686, 'learning_rate': 9.959857387908931e-06, 'epoch': 0.07}
7%|▋ | 2372/34278 [2:38:40<34:48:26, 3.93s/it] {'loss': 0.2041, 'grad_norm': 1.0191405770354907, 'learning_rate': 9.959797620893498e-06, 'epoch': 0.07}
7%|▋ | 2373/34278 [2:38:43<33:07:52, 3.74s/it] {'loss': 0.184, 'grad_norm': 0.7120147314681421, 'learning_rate': 9.959737809598177e-06, 'epoch': 0.07}
7%|▋ | 2374/34278 [2:38:47<33:56:37, 3.83s/it] {'loss': 0.2127, 'grad_norm': 0.8907912228070642, 'learning_rate': 9.959677954023501e-06, 'epoch': 0.07}
7%|▋ | 2375/34278 [2:38:51<32:09:13, 3.63s/it] {'loss': 0.2173, 'grad_norm': 0.9414504551311939, 'learning_rate': 9.959618054170003e-06, 'epoch': 0.07}
7%|▋ | 2376/34278 [2:38:54<31:27:13, 3.55s/it] {'loss': 0.1791, 'grad_norm': 0.7052813775278587, 'learning_rate': 9.959558110038218e-06, 'epoch': 0.07}
7%|▋ | 2377/34278 [2:38:57<30:24:18, 3.43s/it] {'loss': 0.2138, 'grad_norm': 0.9652752271670552, 'learning_rate': 9.959498121628683e-06, 'epoch': 0.07}
7%|▋ | 2378/34278 [2:39:00<29:11:33, 3.29s/it] {'loss': 0.2052, 'grad_norm': 1.0808645757388777, 'learning_rate': 9.959438088941935e-06, 'epoch': 0.07}
7%|▋ | 2379/34278 [2:39:03<28:53:05, 3.26s/it] {'loss': 0.183, 'grad_norm': 0.9018716780531703, 'learning_rate': 9.959378011978504e-06, 'epoch': 0.07}
7%|▋ | 2380/34278 [2:39:07<29:26:32, 3.32s/it] {'loss': 0.203, 'grad_norm': 1.0419770280542815, 'learning_rate': 9.959317890738932e-06, 'epoch': 0.07}
7%|▋ | 2381/34278 [2:39:10<28:57:34, 3.27s/it] {'loss': 0.2298, 'grad_norm': 0.969712062246678, 'learning_rate': 9.959257725223753e-06, 'epoch': 0.07}
7%|▋ | 2382/34278 [2:39:13<27:26:37, 3.10s/it] {'loss': 0.1666, 'grad_norm': 0.8279362437059269, 'learning_rate': 9.959197515433505e-06, 'epoch': 0.07}
7%|▋ | 2383/34278 [2:39:16<28:33:07, 3.22s/it] {'loss': 0.2003, 'grad_norm': 0.974389601516134, 'learning_rate': 9.959137261368725e-06, 'epoch': 0.07}
7%|▋ | 2384/34278 [2:39:20<29:24:22, 3.32s/it] {'loss': 0.209, 'grad_norm': 0.9440985070740611, 'learning_rate': 9.959076963029954e-06, 'epoch': 0.07}
7%|▋ | 2385/34278 [2:39:24<32:57:18, 3.72s/it] {'loss': 0.1983, 'grad_norm': 1.0767766171698354, 'learning_rate': 9.959016620417725e-06, 'epoch': 0.07}
7%|▋ | 2386/34278 [2:39:27<31:08:58, 3.52s/it] {'loss': 0.2096, 'grad_norm': 1.0014248062894908, 'learning_rate': 9.95895623353258e-06, 'epoch': 0.07}
7%|▋ | 2387/34278 [2:39:31<30:51:20, 3.48s/it] {'loss': 0.2061, 'grad_norm': 0.7556710962392882, 'learning_rate': 9.958895802375056e-06, 'epoch': 0.07}
7%|▋ | 2388/34278 [2:39:34<29:27:20, 3.33s/it] {'loss': 0.1853, 'grad_norm': 0.9292753584936179, 'learning_rate': 9.958835326945698e-06, 'epoch': 0.07}
7%|▋ | 2389/34278 [2:39:38<30:55:36, 3.49s/it] {'loss': 0.201, 'grad_norm': 1.1008176939215812, 'learning_rate': 9.958774807245039e-06, 'epoch': 0.07}
7%|▋ | 2390/34278 [2:39:41<30:15:59, 3.42s/it] {'loss': 0.1892, 'grad_norm': 1.0318162019529176, 'learning_rate': 9.958714243273624e-06, 'epoch': 0.07}
7%|▋ | 2391/34278 [2:39:44<29:50:05, 3.37s/it] {'loss': 0.2192, 'grad_norm': 0.9803870456434441, 'learning_rate': 9.95865363503199e-06, 'epoch': 0.07}
7%|▋ | 2392/34278 [2:39:47<29:10:56, 3.29s/it] {'loss': 0.2317, 'grad_norm': 1.1702309635721455, 'learning_rate': 9.958592982520681e-06, 'epoch': 0.07}
7%|▋ | 2393/34278 [2:39:53<36:14:15, 4.09s/it] {'loss': 0.1876, 'grad_norm': 0.9783216659503141, 'learning_rate': 9.958532285740238e-06, 'epoch': 0.07}
7%|▋ | 2394/34278 [2:39:56<34:11:32, 3.86s/it] {'loss': 0.1973, 'grad_norm': 0.7905536273298849, 'learning_rate': 9.958471544691201e-06, 'epoch': 0.07}
7%|▋ | 2395/34278 [2:40:00<33:01:38, 3.73s/it] {'loss': 0.2185, 'grad_norm': 1.077268354827615, 'learning_rate': 9.958410759374116e-06, 'epoch': 0.07}
7%|▋ | 2396/34278 [2:40:06<38:44:41, 4.37s/it] {'loss': 0.1984, 'grad_norm': 1.126968531852992, 'learning_rate': 9.958349929789521e-06, 'epoch': 0.07}
7%|▋ | 2397/34278 [2:40:10<37:56:54, 4.29s/it] {'loss': 0.1918, 'grad_norm': 0.7881223910381542, 'learning_rate': 9.958289055937963e-06, 'epoch': 0.07}
7%|▋ | 2398/34278 [2:40:13<33:58:54, 3.84s/it] {'loss': 0.197, 'grad_norm': 0.9537166975001522, 'learning_rate': 9.958228137819984e-06, 'epoch': 0.07}
7%|▋ | 2399/34278 [2:40:15<31:22:29, 3.54s/it] {'loss': 0.214, 'grad_norm': 0.9744682320428244, 'learning_rate': 9.958167175436128e-06, 'epoch': 0.07}
7%|▋ | 2400/34278 [2:40:19<30:18:48, 3.42s/it] {'loss': 0.1957, 'grad_norm': 1.1523567877347596, 'learning_rate': 9.95810616878694e-06, 'epoch': 0.07}
7%|▋ | 2401/34278 [2:40:24<35:40:11, 4.03s/it] {'loss': 0.187, 'grad_norm': 1.1944507329865188, 'learning_rate': 9.958045117872961e-06, 'epoch': 0.07}
7%|▋ | 2402/34278 [2:40:30<40:57:42, 4.63s/it] {'loss': 0.1911, 'grad_norm': 0.9685427581118649, 'learning_rate': 9.95798402269474e-06, 'epoch': 0.07}
7%|▋ | 2403/34278 [2:40:33<37:00:56, 4.18s/it] {'loss': 0.1912, 'grad_norm': 1.0326345432613742, 'learning_rate': 9.95792288325282e-06, 'epoch': 0.07}
7%|▋ | 2404/34278 [2:40:38<37:22:35, 4.22s/it] {'loss': 0.1985, 'grad_norm': 0.9634055727972706, 'learning_rate': 9.95786169954775e-06, 'epoch': 0.07}
7%|▋ | 2405/34278 [2:40:44<42:35:09, 4.81s/it] {'loss': 0.2138, 'grad_norm': 0.9193623921254882, 'learning_rate': 9.957800471580074e-06, 'epoch': 0.07}
7%|▋ | 2406/34278 [2:40:47<38:59:07, 4.40s/it] {'loss': 0.2002, 'grad_norm': 0.9821151777603562, 'learning_rate': 9.957739199350339e-06, 'epoch': 0.07}
7%|▋ | 2407/34278 [2:40:50<35:43:57, 4.04s/it] {'loss': 0.1921, 'grad_norm': 0.9099336925257334, 'learning_rate': 9.95767788285909e-06, 'epoch': 0.07}
7%|▋ | 2408/34278 [2:40:55<36:06:34, 4.08s/it] {'loss': 0.1831, 'grad_norm': 0.7324101817432253, 'learning_rate': 9.957616522106878e-06, 'epoch': 0.07}
7%|▋ | 2409/34278 [2:40:59<37:43:30, 4.26s/it] {'loss': 0.1838, 'grad_norm': 0.8722205327852851, 'learning_rate': 9.95755511709425e-06, 'epoch': 0.07}
7%|▋ | 2410/34278 [2:41:03<35:39:50, 4.03s/it] {'loss': 0.1973, 'grad_norm': 1.0160175849577036, 'learning_rate': 9.957493667821752e-06, 'epoch': 0.07}
7%|▋ | 2411/34278 [2:41:08<38:38:21, 4.37s/it] {'loss': 0.228, 'grad_norm': 1.0796794263996108, 'learning_rate': 9.957432174289934e-06, 'epoch': 0.07}
7%|▋ | 2412/34278 [2:41:11<35:04:29, 3.96s/it] {'loss': 0.2113, 'grad_norm': 1.1028917050992295, 'learning_rate': 9.957370636499346e-06, 'epoch': 0.07}
7%|▋ | 2413/34278 [2:41:14<33:43:59, 3.81s/it] {'loss': 0.2053, 'grad_norm': 1.1035408149593489, 'learning_rate': 9.957309054450534e-06, 'epoch': 0.07}
7%|▋ | 2414/34278 [2:41:19<34:56:42, 3.95s/it] {'loss': 0.1859, 'grad_norm': 1.0248996546132965, 'learning_rate': 9.957247428144052e-06, 'epoch': 0.07}
7%|▋ | 2415/34278 [2:41:22<33:45:23, 3.81s/it] {'loss': 0.1889, 'grad_norm': 1.0670146838617907, 'learning_rate': 9.957185757580448e-06, 'epoch': 0.07}
7%|▋ | 2416/34278 [2:41:25<31:57:26, 3.61s/it] {'loss': 0.1887, 'grad_norm': 0.9198217089388051, 'learning_rate': 9.957124042760274e-06, 'epoch': 0.07}
7%|▋ | 2417/34278 [2:41:29<31:07:12, 3.52s/it] {'loss': 0.1753, 'grad_norm': 1.1104964904412704, 'learning_rate': 9.957062283684078e-06, 'epoch': 0.07}
7%|▋ | 2418/34278 [2:41:32<29:59:10, 3.39s/it] {'loss': 0.2157, 'grad_norm': 1.0434878111586023, 'learning_rate': 9.957000480352415e-06, 'epoch': 0.07}
7%|▋ | 2419/34278 [2:41:35<29:08:00, 3.29s/it] {'loss': 0.1835, 'grad_norm': 0.8205827054933801, 'learning_rate': 9.956938632765833e-06, 'epoch': 0.07}
7%|▋ | 2420/34278 [2:41:38<29:25:21, 3.32s/it] {'loss': 0.1918, 'grad_norm': 0.9370711144255537, 'learning_rate': 9.956876740924888e-06, 'epoch': 0.07}
7%|▋ | 2421/34278 [2:41:41<29:15:22, 3.31s/it] {'loss': 0.1937, 'grad_norm': 0.9241820231784976, 'learning_rate': 9.956814804830131e-06, 'epoch': 0.07}
7%|▋ | 2422/34278 [2:41:45<29:16:20, 3.31s/it] {'loss': 0.1989, 'grad_norm': 0.8320079140917186, 'learning_rate': 9.956752824482114e-06, 'epoch': 0.07}
7%|▋ | 2423/34278 [2:41:48<30:29:15, 3.45s/it] {'loss': 0.2022, 'grad_norm': 0.9235242375593017, 'learning_rate': 9.956690799881391e-06, 'epoch': 0.07}
7%|▋ | 2424/34278 [2:41:52<30:42:23, 3.47s/it] {'loss': 0.2068, 'grad_norm': 0.862395217085308, 'learning_rate': 9.956628731028516e-06, 'epoch': 0.07}
7%|▋ | 2425/34278 [2:41:56<32:49:57, 3.71s/it] {'loss': 0.1978, 'grad_norm': 0.8779579436971858, 'learning_rate': 9.956566617924043e-06, 'epoch': 0.07}
7%|▋ | 2426/34278 [2:42:01<35:39:48, 4.03s/it] {'loss': 0.1976, 'grad_norm': 0.8552995115610158, 'learning_rate': 9.956504460568525e-06, 'epoch': 0.07}
7%|▋ | 2427/34278 [2:42:07<40:47:59, 4.61s/it] {'loss': 0.2032, 'grad_norm': 0.8616896467486487, 'learning_rate': 9.95644225896252e-06, 'epoch': 0.07}
7%|▋ | 2428/34278 [2:42:10<36:50:44, 4.16s/it] {'loss': 0.2089, 'grad_norm': 1.1605532484673293, 'learning_rate': 9.956380013106582e-06, 'epoch': 0.07}
Token indices sequence length is longer than the specified maximum sequence length for this model (12315 > 8192). Running this sequence through the model will result in indexing errors
7%|▋ | 2429/34278 [2:42:14<37:11:36, 4.20s/it] {'loss': 0.2094, 'grad_norm': 1.000970860784796, 'learning_rate': 9.956317723001265e-06, 'epoch': 0.07}
7%|▋ | 2430/34278 [2:42:17<34:15:48, 3.87s/it] {'loss': 0.2075, 'grad_norm': 0.9507506852609013, 'learning_rate': 9.956255388647127e-06, 'epoch': 0.07}
7%|▋ | 2431/34278 [2:42:21<33:58:05, 3.84s/it] {'loss': 0.2001, 'grad_norm': 1.0275419870209836, 'learning_rate': 9.956193010044725e-06, 'epoch': 0.07}
7%|▋ | 2432/34278 [2:42:24<32:07:47, 3.63s/it] {'loss': 0.18, 'grad_norm': 1.1099346786745798, 'learning_rate': 9.956130587194615e-06, 'epoch': 0.07}
7%|▋ | 2433/34278 [2:42:28<32:21:23, 3.66s/it] {'loss': 0.2279, 'grad_norm': 0.9167391311402326, 'learning_rate': 9.956068120097353e-06, 'epoch': 0.07}
7%|▋ | 2434/34278 [2:42:33<36:49:43, 4.16s/it] {'loss': 0.1833, 'grad_norm': 0.997229453392759, 'learning_rate': 9.956005608753499e-06, 'epoch': 0.07}
7%|▋ | 2435/34278 [2:42:38<37:03:32, 4.19s/it] {'loss': 0.2074, 'grad_norm': 0.9326126299386502, 'learning_rate': 9.95594305316361e-06, 'epoch': 0.07}
7%|▋ | 2436/34278 [2:42:44<43:00:29, 4.86s/it] {'loss': 0.1873, 'grad_norm': 0.8114360126713511, 'learning_rate': 9.955880453328243e-06, 'epoch': 0.07}
7%|▋ | 2437/34278 [2:42:48<39:29:19, 4.46s/it] {'loss': 0.192, 'grad_norm': 1.1166856325162593, 'learning_rate': 9.95581780924796e-06, 'epoch': 0.07}
7%|▋ | 2438/34278 [2:42:51<36:21:32, 4.11s/it] {'loss': 0.1774, 'grad_norm': 0.9857179562554458, 'learning_rate': 9.955755120923319e-06, 'epoch': 0.07}
7%|▋ | 2439/34278 [2:42:57<41:33:56, 4.70s/it] {'loss': 0.2159, 'grad_norm': 1.0638188423148094, 'learning_rate': 9.955692388354876e-06, 'epoch': 0.07}
7%|▋ | 2440/34278 [2:43:00<38:08:31, 4.31s/it] {'loss': 0.218, 'grad_norm': 1.0349445235061012, 'learning_rate': 9.955629611543198e-06, 'epoch': 0.07}
7%|▋ | 2441/34278 [2:43:04<35:44:50, 4.04s/it] {'loss': 0.1897, 'grad_norm': 0.8570386046990435, 'learning_rate': 9.95556679048884e-06, 'epoch': 0.07}
7%|▋ | 2442/34278 [2:43:07<33:16:10, 3.76s/it] {'loss': 0.222, 'grad_norm': 0.9321227916859394, 'learning_rate': 9.955503925192365e-06, 'epoch': 0.07}
7%|▋ | 2443/34278 [2:43:13<40:02:08, 4.53s/it] {'loss': 0.1724, 'grad_norm': 1.1486333156118809, 'learning_rate': 9.955441015654334e-06, 'epoch': 0.07}
7%|▋ | 2444/34278 [2:43:17<37:27:15, 4.24s/it] {'loss': 0.1985, 'grad_norm': 0.8610017320801263, 'learning_rate': 9.955378061875309e-06, 'epoch': 0.07}
7%|▋ | 2445/34278 [2:43:21<36:55:12, 4.18s/it] {'loss': 0.1992, 'grad_norm': 0.9887204063111942, 'learning_rate': 9.955315063855851e-06, 'epoch': 0.07}
7%|▋ | 2446/34278 [2:43:24<35:10:23, 3.98s/it] {'loss': 0.187, 'grad_norm': 1.0296967608410998, 'learning_rate': 9.955252021596524e-06, 'epoch': 0.07}
7%|▋ | 2447/34278 [2:43:28<34:46:59, 3.93s/it] {'loss': 0.2054, 'grad_norm': 1.042432065215246, 'learning_rate': 9.955188935097888e-06, 'epoch': 0.07}
7%|▋ | 2448/34278 [2:43:33<37:00:25, 4.19s/it] {'loss': 0.1951, 'grad_norm': 1.0335384342747953, 'learning_rate': 9.95512580436051e-06, 'epoch': 0.07}
7%|▋ | 2449/34278 [2:43:36<34:53:24, 3.95s/it] {'loss': 0.2164, 'grad_norm': 1.068551528570425, 'learning_rate': 9.955062629384952e-06, 'epoch': 0.07}
7%|▋ | 2450/34278 [2:43:40<33:37:39, 3.80s/it] {'loss': 0.1946, 'grad_norm': 0.8822694536965376, 'learning_rate': 9.954999410171775e-06, 'epoch': 0.07}
7%|▋ | 2451/34278 [2:43:43<32:22:48, 3.66s/it] {'loss': 0.194, 'grad_norm': 0.952687521069965, 'learning_rate': 9.954936146721548e-06, 'epoch': 0.07}
7%|▋ | 2452/34278 [2:43:46<31:02:52, 3.51s/it] {'loss': 0.2031, 'grad_norm': 1.0031368490503632, 'learning_rate': 9.954872839034836e-06, 'epoch': 0.07}
7%|▋ | 2453/34278 [2:43:50<32:16:51, 3.65s/it] {'loss': 0.1763, 'grad_norm': 0.9135427669353791, 'learning_rate': 9.954809487112198e-06, 'epoch': 0.07}
7%|▋ | 2454/34278 [2:43:54<31:20:17, 3.55s/it] {'loss': 0.1695, 'grad_norm': 0.9643004903890515, 'learning_rate': 9.954746090954205e-06, 'epoch': 0.07}
7%|▋ | 2455/34278 [2:43:57<30:49:40, 3.49s/it] {'loss': 0.214, 'grad_norm': 0.9528310407040534, 'learning_rate': 9.954682650561423e-06, 'epoch': 0.07}
7%|▋ | 2456/34278 [2:44:00<29:31:43, 3.34s/it] {'loss': 0.227, 'grad_norm': 0.8547133992363304, 'learning_rate': 9.954619165934417e-06, 'epoch': 0.07}
7%|▋ | 2457/34278 [2:44:05<33:23:18, 3.78s/it] {'loss': 0.1706, 'grad_norm': 0.9051851367771333, 'learning_rate': 9.954555637073752e-06, 'epoch': 0.07}
7%|▋ | 2458/34278 [2:44:10<36:04:05, 4.08s/it] {'loss': 0.2051, 'grad_norm': 0.8861919955521864, 'learning_rate': 9.95449206398e-06, 'epoch': 0.07}
7%|▋ | 2459/34278 [2:44:14<35:55:18, 4.06s/it] {'loss': 0.201, 'grad_norm': 0.9730791963232464, 'learning_rate': 9.954428446653723e-06, 'epoch': 0.07}
7%|▋ | 2460/34278 [2:44:18<37:19:44, 4.22s/it] {'loss': 0.1991, 'grad_norm': 0.9933523719193613, 'learning_rate': 9.954364785095493e-06, 'epoch': 0.07}
Token indices sequence length is longer than the specified maximum sequence length for this model (9251 > 8192). Running this sequence through the model will result in indexing errors
7%|▋ | 2461/34278 [2:44:24<40:47:18, 4.62s/it] {'loss': 0.1834, 'grad_norm': 0.8492267930360498, 'learning_rate': 9.954301079305875e-06, 'epoch': 0.07}
7%|▋ | 2462/34278 [2:44:28<40:19:59, 4.56s/it] {'loss': 0.1983, 'grad_norm': 1.0898044476044764, 'learning_rate': 9.95423732928544e-06, 'epoch': 0.07}
7%|▋ | 2463/34278 [2:44:31<35:57:12, 4.07s/it] {'loss': 0.2028, 'grad_norm': 0.9642681007485292, 'learning_rate': 9.95417353503476e-06, 'epoch': 0.07}
7%|▋ | 2464/34278 [2:44:34<33:44:14, 3.82s/it] {'loss': 0.1994, 'grad_norm': 0.8881218865673711, 'learning_rate': 9.9541096965544e-06, 'epoch': 0.07}
7%|▋ | 2465/34278 [2:44:37<31:57:15, 3.62s/it] {'loss': 0.1858, 'grad_norm': 0.9375368969560157, 'learning_rate': 9.954045813844929e-06, 'epoch': 0.07}
7%|▋ | 2466/34278 [2:44:41<30:57:52, 3.50s/it] {'loss': 0.1921, 'grad_norm': 0.9285210062506181, 'learning_rate': 9.953981886906921e-06, 'epoch': 0.07}
7%|▋ | 2467/34278 [2:44:47<38:00:11, 4.30s/it] {'loss': 0.2061, 'grad_norm': 1.004301277208077, 'learning_rate': 9.953917915740944e-06, 'epoch': 0.07}
7%|▋ | 2468/34278 [2:44:51<37:24:34, 4.23s/it] {'loss': 0.2316, 'grad_norm': 0.9226092100920326, 'learning_rate': 9.953853900347572e-06, 'epoch': 0.07}
7%|▋ | 2469/34278 [2:44:54<35:06:51, 3.97s/it] {'loss': 0.2167, 'grad_norm': 0.8940228359318462, 'learning_rate': 9.953789840727374e-06, 'epoch': 0.07}
7%|▋ | 2470/34278 [2:44:58<33:24:55, 3.78s/it] {'loss': 0.2064, 'grad_norm': 0.7997127468207761, 'learning_rate': 9.953725736880925e-06, 'epoch': 0.07}
7%|▋ | 2471/34278 [2:45:00<31:01:50, 3.51s/it] {'loss': 0.1954, 'grad_norm': 0.9171188662970678, 'learning_rate': 9.953661588808795e-06, 'epoch': 0.07}
7%|▋ | 2472/34278 [2:45:04<30:35:14, 3.46s/it] {'loss': 0.2022, 'grad_norm': 0.8660430614460405, 'learning_rate': 9.953597396511555e-06, 'epoch': 0.07}
7%|▋ | 2473/34278 [2:45:07<29:44:28, 3.37s/it] {'loss': 0.1714, 'grad_norm': 0.9180021940704193, 'learning_rate': 9.95353315998978e-06, 'epoch': 0.07}
7%|▋ | 2474/34278 [2:45:10<28:55:14, 3.27s/it] {'loss': 0.1886, 'grad_norm': 0.8869049918164137, 'learning_rate': 9.953468879244045e-06, 'epoch': 0.07}
7%|▋ | 2475/34278 [2:45:16<35:48:45, 4.05s/it] {'loss': 0.1792, 'grad_norm': 0.8894304734100424, 'learning_rate': 9.95340455427492e-06, 'epoch': 0.07}
7%|▋ | 2476/34278 [2:45:20<36:20:22, 4.11s/it] {'loss': 0.2004, 'grad_norm': 0.9035187211235377, 'learning_rate': 9.953340185082982e-06, 'epoch': 0.07}
7%|▋ | 2477/34278 [2:45:23<33:28:31, 3.79s/it] {'loss': 0.1821, 'grad_norm': 0.8558932996514017, 'learning_rate': 9.953275771668807e-06, 'epoch': 0.07}
7%|▋ | 2478/34278 [2:45:26<31:31:48, 3.57s/it] {'loss': 0.1976, 'grad_norm': 1.0864380103500642, 'learning_rate': 9.953211314032967e-06, 'epoch': 0.07}
7%|▋ | 2479/34278 [2:45:30<32:14:39, 3.65s/it] {'loss': 0.1813, 'grad_norm': 0.7988910475774128, 'learning_rate': 9.95314681217604e-06, 'epoch': 0.07}
7%|▋ | 2480/34278 [2:45:36<38:41:53, 4.38s/it] {'loss': 0.1949, 'grad_norm': 0.9068000257420371, 'learning_rate': 9.9530822660986e-06, 'epoch': 0.07}
7%|▋ | 2481/34278 [2:45:39<35:16:29, 3.99s/it] {'loss': 0.1993, 'grad_norm': 0.9470315605128883, 'learning_rate': 9.953017675801225e-06, 'epoch': 0.07}
7%|▋ | 2482/34278 [2:45:42<32:47:46, 3.71s/it] {'loss': 0.1859, 'grad_norm': 0.9435008351969455, 'learning_rate': 9.952953041284488e-06, 'epoch': 0.07}
7%|▋ | 2483/34278 [2:45:46<31:22:04, 3.55s/it] {'loss': 0.2186, 'grad_norm': 0.9990786184141739, 'learning_rate': 9.952888362548971e-06, 'epoch': 0.07}
7%|▋ | 2484/34278 [2:45:49<30:13:52, 3.42s/it] {'loss': 0.208, 'grad_norm': 0.9145289797257048, 'learning_rate': 9.952823639595248e-06, 'epoch': 0.07}
7%|▋ | 2485/34278 [2:45:52<29:17:35, 3.32s/it] {'loss': 0.2204, 'grad_norm': 1.0575528341843174, 'learning_rate': 9.952758872423897e-06, 'epoch': 0.07}
7%|▋ | 2486/34278 [2:45:56<32:09:20, 3.64s/it] {'loss': 0.1962, 'grad_norm': 0.8410179058919071, 'learning_rate': 9.952694061035499e-06, 'epoch': 0.07}
7%|▋ | 2487/34278 [2:46:00<31:34:15, 3.58s/it] {'loss': 0.189, 'grad_norm': 0.9140375553260064, 'learning_rate': 9.952629205430631e-06, 'epoch': 0.07}
7%|▋ | 2488/34278 [2:46:03<30:54:30, 3.50s/it] {'loss': 0.2248, 'grad_norm': 1.2227293723778352, 'learning_rate': 9.95256430560987e-06, 'epoch': 0.07}
7%|▋ | 2489/34278 [2:46:06<30:38:16, 3.47s/it] {'loss': 0.1888, 'grad_norm': 0.997622801955556, 'learning_rate': 9.952499361573797e-06, 'epoch': 0.07}
7%|▋ | 2490/34278 [2:46:09<29:16:54, 3.32s/it] {'loss': 0.1957, 'grad_norm': 0.8271726846289789, 'learning_rate': 9.952434373322993e-06, 'epoch': 0.07}
7%|▋ | 2491/34278 [2:46:13<29:59:14, 3.40s/it] {'loss': 0.1893, 'grad_norm': 0.8662622401647231, 'learning_rate': 9.952369340858037e-06, 'epoch': 0.07}
7%|▋ | 2492/34278 [2:46:16<29:26:07, 3.33s/it] {'loss': 0.2122, 'grad_norm': 1.0754646388722005, 'learning_rate': 9.95230426417951e-06, 'epoch': 0.07}
7%|▋ | 2493/34278 [2:46:22<36:23:24, 4.12s/it] {'loss': 0.2295, 'grad_norm': 1.0034699914219778, 'learning_rate': 9.952239143287992e-06, 'epoch': 0.07}
7%|▋ | 2494/34278 [2:46:25<33:37:37, 3.81s/it] {'loss': 0.2452, 'grad_norm': 1.037099578814681, 'learning_rate': 9.952173978184065e-06, 'epoch': 0.07}
7%|▋ | 2495/34278 [2:46:30<37:03:32, 4.20s/it] {'loss': 0.1975, 'grad_norm': 0.882954788289804, 'learning_rate': 9.952108768868311e-06, 'epoch': 0.07}
7%|▋ | 2496/34278 [2:46:33<34:48:42, 3.94s/it] {'loss': 0.2023, 'grad_norm': 0.8946645911590452, 'learning_rate': 9.952043515341315e-06, 'epoch': 0.07}
7%|▋ | 2497/34278 [2:46:40<40:27:37, 4.58s/it] {'loss': 0.1913, 'grad_norm': 0.8918220605361443, 'learning_rate': 9.951978217603652e-06, 'epoch': 0.07}
7%|▋ | 2498/34278 [2:46:43<38:46:22, 4.39s/it] {'loss': 0.175, 'grad_norm': 0.8738879545689714, 'learning_rate': 9.951912875655913e-06, 'epoch': 0.07}
7%|▋ | 2499/34278 [2:46:47<35:52:39, 4.06s/it] {'loss': 0.2217, 'grad_norm': 0.9694918935764463, 'learning_rate': 9.951847489498675e-06, 'epoch': 0.07}
7%|▋ | 2500/34278 [2:46:50<33:31:56, 3.80s/it] {'loss': 0.1693, 'grad_norm': 0.9743991490788483, 'learning_rate': 9.951782059132528e-06, 'epoch': 0.07}
7%|▋ | 2501/34278 [2:46:54<34:37:49, 3.92s/it] {'loss': 0.1954, 'grad_norm': 0.9085291604921203, 'learning_rate': 9.95171658455805e-06, 'epoch': 0.07}
7%|▋ | 2502/34278 [2:46:58<33:28:50, 3.79s/it] {'loss': 0.2006, 'grad_norm': 0.9292300982103625, 'learning_rate': 9.951651065775831e-06, 'epoch': 0.07}
7%|▋ | 2503/34278 [2:47:01<31:27:27, 3.56s/it] {'loss': 0.195, 'grad_norm': 0.9665078351867818, 'learning_rate': 9.951585502786452e-06, 'epoch': 0.07}
7%|▋ | 2504/34278 [2:47:06<35:48:00, 4.06s/it] {'loss': 0.1977, 'grad_norm': 0.8681058912661461, 'learning_rate': 9.9515198955905e-06, 'epoch': 0.07}
7%|▋ | 2505/34278 [2:47:09<34:30:40, 3.91s/it] {'loss': 0.1948, 'grad_norm': 0.8466550917204185, 'learning_rate': 9.95145424418856e-06, 'epoch': 0.07}
7%|▋ | 2506/34278 [2:47:13<33:28:43, 3.79s/it] {'loss': 0.1869, 'grad_norm': 1.2150285220508747, 'learning_rate': 9.951388548581218e-06, 'epoch': 0.07}
7%|▋ | 2507/34278 [2:47:16<31:06:57, 3.53s/it] {'loss': 0.189, 'grad_norm': 0.9931115293779862, 'learning_rate': 9.951322808769062e-06, 'epoch': 0.07}
7%|▋ | 2508/34278 [2:47:20<31:34:29, 3.58s/it] {'loss': 0.2124, 'grad_norm': 0.8626432422597347, 'learning_rate': 9.951257024752678e-06, 'epoch': 0.07}
7%|▋ | 2509/34278 [2:47:23<30:28:08, 3.45s/it] {'loss': 0.2139, 'grad_norm': 0.9069895047602374, 'learning_rate': 9.951191196532653e-06, 'epoch': 0.07}
7%|▋ | 2510/34278 [2:47:26<30:04:15, 3.41s/it] {'loss': 0.2091, 'grad_norm': 0.991581796264132, 'learning_rate': 9.951125324109573e-06, 'epoch': 0.07}
7%|▋ | 2511/34278 [2:47:32<36:56:09, 4.19s/it] {'loss': 0.2086, 'grad_norm': 0.9564016164853112, 'learning_rate': 9.951059407484032e-06, 'epoch': 0.07}
7%|▋ | 2512/34278 [2:47:38<42:37:13, 4.83s/it] {'loss': 0.2035, 'grad_norm': 0.824312750052677, 'learning_rate': 9.950993446656612e-06, 'epoch': 0.07}
7%|▋ | 2513/34278 [2:47:41<37:43:44, 4.28s/it] {'loss': 0.1888, 'grad_norm': 1.0688203964707534, 'learning_rate': 9.950927441627905e-06, 'epoch': 0.07}
7%|▋ | 2514/34278 [2:47:47<42:15:28, 4.79s/it] {'loss': 0.2116, 'grad_norm': 0.9778269925076594, 'learning_rate': 9.950861392398499e-06, 'epoch': 0.07}
7%|▋ | 2515/34278 [2:47:52<40:35:45, 4.60s/it] {'loss': 0.2042, 'grad_norm': 1.1299564714170196, 'learning_rate': 9.950795298968986e-06, 'epoch': 0.07}
7%|▋ | 2516/34278 [2:47:55<38:38:28, 4.38s/it] {'loss': 0.196, 'grad_norm': 0.8718630474472272, 'learning_rate': 9.950729161339951e-06, 'epoch': 0.07}
7%|▋ | 2517/34278 [2:48:00<39:04:54, 4.43s/it] {'loss': 0.185, 'grad_norm': 0.8128594163537469, 'learning_rate': 9.95066297951199e-06, 'epoch': 0.07}
Token indices sequence length is longer than the specified maximum sequence length for this model (11343 > 8192). Running this sequence through the model will result in indexing errors
7%|▋ | 2518/34278 [2:48:04<37:20:51, 4.23s/it] {'loss': 0.1924, 'grad_norm': 0.7886401178405563, 'learning_rate': 9.950596753485693e-06, 'epoch': 0.07}
7%|▋ | 2519/34278 [2:48:07<35:10:18, 3.99s/it] {'loss': 0.1983, 'grad_norm': 0.7225279092141481, 'learning_rate': 9.950530483261649e-06, 'epoch': 0.07}
7%|▋ | 2520/34278 [2:48:12<36:15:54, 4.11s/it] {'loss': 0.1919, 'grad_norm': 0.8504606643688458, 'learning_rate': 9.95046416884045e-06, 'epoch': 0.07}
7%|▋ | 2521/34278 [2:48:16<37:06:49, 4.21s/it] {'loss': 0.2238, 'grad_norm': 1.0296881981133896, 'learning_rate': 9.95039781022269e-06, 'epoch': 0.07}
7%|▋ | 2522/34278 [2:48:19<34:41:43, 3.93s/it] {'loss': 0.1834, 'grad_norm': 0.8873545789881933, 'learning_rate': 9.950331407408958e-06, 'epoch': 0.07}
7%|▋ | 2523/34278 [2:48:25<40:43:52, 4.62s/it] {'loss': 0.1756, 'grad_norm': 0.8140051656943608, 'learning_rate': 9.95026496039985e-06, 'epoch': 0.07}
7%|▋ | 2524/34278 [2:48:29<37:08:21, 4.21s/it] {'loss': 0.2111, 'grad_norm': 0.9609860973193264, 'learning_rate': 9.950198469195959e-06, 'epoch': 0.07}
7%|▋ | 2525/34278 [2:48:34<38:40:45, 4.39s/it] {'loss': 0.198, 'grad_norm': 0.9451889863966915, 'learning_rate': 9.950131933797876e-06, 'epoch': 0.07}
7%|▋ | 2526/34278 [2:48:38<38:15:41, 4.34s/it] {'loss': 0.1906, 'grad_norm': 0.9025713220549293, 'learning_rate': 9.950065354206198e-06, 'epoch': 0.07}
7%|▋ | 2527/34278 [2:48:41<35:31:38, 4.03s/it] {'loss': 0.1878, 'grad_norm': 1.1171116402543344, 'learning_rate': 9.949998730421519e-06, 'epoch': 0.07}
7%|▋ | 2528/34278 [2:48:45<34:04:37, 3.86s/it] {'loss': 0.2053, 'grad_norm': 0.8216241005939505, 'learning_rate': 9.949932062444431e-06, 'epoch': 0.07}
7%|▋ | 2529/34278 [2:48:50<39:07:54, 4.44s/it] {'loss': 0.1963, 'grad_norm': 1.2913869392142565, 'learning_rate': 9.949865350275532e-06, 'epoch': 0.07}
7%|▋ | 2530/34278 [2:48:54<37:28:34, 4.25s/it] {'loss': 0.1922, 'grad_norm': 0.8574517560317876, 'learning_rate': 9.949798593915418e-06, 'epoch': 0.07}
7%|▋ | 2531/34278 [2:48:57<34:24:03, 3.90s/it] {'loss': 0.1907, 'grad_norm': 1.1689085454686512, 'learning_rate': 9.949731793364683e-06, 'epoch': 0.07}
7%|▋ | 2532/34278 [2:49:01<32:56:22, 3.74s/it] {'loss': 0.1945, 'grad_norm': 1.2915789591712317, 'learning_rate': 9.949664948623923e-06, 'epoch': 0.07}
7%|▋ | 2533/34278 [2:49:07<39:52:23, 4.52s/it] {'loss': 0.1708, 'grad_norm': 0.7767667508933432, 'learning_rate': 9.949598059693737e-06, 'epoch': 0.07}
7%|▋ | 2534/34278 [2:49:13<43:49:45, 4.97s/it] {'loss': 0.2032, 'grad_norm': 1.2243554126197655, 'learning_rate': 9.94953112657472e-06, 'epoch': 0.07}
[2:49:13<43:49:45, 4.97s/it] 7%|▋ | 2535/34278 [2:49:16<38:25:10, 4.36s/it] {'loss': 0.19, 'grad_norm': 1.1787710785802916, 'learning_rate': 9.949464149267473e-06, 'epoch': 0.07} 7%|▋ | 2535/34278 [2:49:16<38:25:10, 4.36s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9295 > 8192). Running this sequence through the model will result in indexing errors 7%|▋ | 2536/34278 [2:49:19<35:50:22, 4.06s/it] {'loss': 0.1989, 'grad_norm': 0.906760896516415, 'learning_rate': 9.94939712777259e-06, 'epoch': 0.07} 7%|▋ | 2536/34278 [2:49:19<35:50:22, 4.06s/it] 7%|▋ | 2537/34278 [2:49:23<33:43:45, 3.83s/it] {'loss': 0.1758, 'grad_norm': 1.031484955743404, 'learning_rate': 9.949330062090671e-06, 'epoch': 0.07} 7%|▋ | 2537/34278 [2:49:23<33:43:45, 3.83s/it] 7%|▋ | 2538/34278 [2:49:26<32:15:56, 3.66s/it] {'loss': 0.1878, 'grad_norm': 0.9411214101633906, 'learning_rate': 9.949262952222316e-06, 'epoch': 0.07} 7%|▋ | 2538/34278 [2:49:26<32:15:56, 3.66s/it] 7%|▋ | 2539/34278 [2:49:30<32:40:56, 3.71s/it] {'loss': 0.1961, 'grad_norm': 0.9884594827492988, 'learning_rate': 9.94919579816812e-06, 'epoch': 0.07} 7%|▋ | 2539/34278 [2:49:30<32:40:56, 3.71s/it] 7%|▋ | 2540/34278 [2:49:34<33:59:21, 3.86s/it] {'loss': 0.199, 'grad_norm': 1.0737036981799195, 'learning_rate': 9.949128599928687e-06, 'epoch': 0.07} 7%|▋ | 2540/34278 [2:49:34<33:59:21, 3.86s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2541/34278 [2:49:38<34:07:27, 3.87s/it] {'loss': 0.1732, 'grad_norm': 0.9696232110495232, 'learning_rate': 9.949061357504617e-06, 'epoch': 0.07} 7%|▋ | 2541/34278 [2:49:38<34:07:27, 3.87s/it] 7%|▋ | 2542/34278 [2:49:44<39:19:35, 4.46s/it] {'loss': 0.1945, 'grad_norm': 1.1846628887886361, 'learning_rate': 9.948994070896508e-06, 'epoch': 0.07} 7%|▋ | 2542/34278 [2:49:44<39:19:35, 4.46s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2543/34278 [2:49:47<36:22:16, 4.13s/it] {'loss': 0.204, 'grad_norm': 1.4202641562292888, 'learning_rate': 9.948926740104958e-06, 'epoch': 0.07} 7%|▋ | 2543/34278 [2:49:47<36:22:16, 4.13s/it] 7%|▋ | 2544/34278 [2:49:51<35:33:18, 4.03s/it] {'loss': 0.1673, 'grad_norm': 0.933920277116389, 'learning_rate': 9.948859365130574e-06, 'epoch': 0.07} 7%|▋ | 2544/34278 [2:49:51<35:33:18, 4.03s/it] 7%|▋ | 2545/34278 [2:49:54<33:18:20, 3.78s/it] {'loss': 0.2584, 'grad_norm': 0.99870713723646, 'learning_rate': 9.948791945973955e-06, 'epoch': 0.07} 7%|▋ | 2545/34278 [2:49:54<33:18:20, 3.78s/it] 7%|▋ | 2546/34278 [2:50:01<40:52:05, 4.64s/it] {'loss': 0.1935, 'grad_norm': 0.944141519756461, 'learning_rate': 9.948724482635703e-06, 'epoch': 0.07} 7%|▋ | 2546/34278 [2:50:01<40:52:05, 4.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2547/34278 [2:50:06<44:09:15, 5.01s/it] {'loss': 0.2236, 'grad_norm': 0.9033999505149384, 'learning_rate': 9.94865697511642e-06, 'epoch': 0.07} 7%|▋ | 2547/34278 [2:50:06<44:09:15, 5.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2548/34278 [2:50:12<46:43:46, 5.30s/it] {'loss': 0.1818, 'grad_norm': 0.8211666350624257, 'learning_rate': 9.94858942341671e-06, 'epoch': 0.07} 7%|▋ | 2548/34278 [2:50:12<46:43:46, 5.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 7%|▋ | 2549/34278 [2:50:16<41:41:58, 4.73s/it] {'loss': 0.1812, 'grad_norm': 0.9274903197590763, 'learning_rate': 9.948521827537172e-06, 'epoch': 0.07} 7%|▋ | 2549/34278 [2:50:16<41:41:58, 4.73s/it] 7%|▋ | 2550/34278 [2:50:20<39:38:31, 4.50s/it] {'loss': 0.1965, 'grad_norm': 0.9605098584721782, 'learning_rate': 9.948454187478414e-06, 'epoch': 0.07} 7%|▋ | 2550/34278 [2:50:20<39:38:31, 4.50s/it] 7%|▋ | 2551/34278 [2:50:23<36:21:37, 4.13s/it] {'loss': 0.1996, 'grad_norm': 0.9480236670737756, 'learning_rate': 9.948386503241039e-06, 'epoch': 0.07} 7%|▋ | 2551/34278 [2:50:23<36:21:37, 4.13s/it] 7%|▋ | 2552/34278 [2:50:26<33:43:10, 3.83s/it] {'loss': 0.1988, 'grad_norm': 0.8485166444346322, 'learning_rate': 9.94831877482565e-06, 'epoch': 0.07} 7%|▋ | 2552/34278 [2:50:26<33:43:10, 3.83s/it] 7%|▋ | 2553/34278 [2:50:29<31:07:38, 3.53s/it] {'loss': 0.1853, 'grad_norm': 0.825194544691037, 'learning_rate': 9.948251002232852e-06, 'epoch': 0.07} 7%|▋ | 2553/34278 [2:50:29<31:07:38, 3.53s/it] 7%|▋ | 2554/34278 [2:50:33<32:02:06, 3.64s/it] {'loss': 0.213, 'grad_norm': 0.9736642606664729, 'learning_rate': 
9.948183185463252e-06, 'epoch': 0.07} 7%|▋ | 2554/34278 [2:50:33<32:02:06, 3.64s/it] 7%|▋ | 2555/34278 [2:50:36<31:49:23, 3.61s/it] {'loss': 0.1845, 'grad_norm': 0.8261115363986782, 'learning_rate': 9.948115324517451e-06, 'epoch': 0.07} 7%|▋ | 2555/34278 [2:50:36<31:49:23, 3.61s/it] 7%|▋ | 2556/34278 [2:50:39<30:27:12, 3.46s/it] {'loss': 0.1782, 'grad_norm': 1.0089780084904925, 'learning_rate': 9.948047419396059e-06, 'epoch': 0.07} 7%|▋ | 2556/34278 [2:50:40<30:27:12, 3.46s/it] 7%|▋ | 2557/34278 [2:50:43<29:29:44, 3.35s/it] {'loss': 0.1911, 'grad_norm': 0.7853899303405794, 'learning_rate': 9.947979470099682e-06, 'epoch': 0.07} 7%|▋ | 2557/34278 [2:50:43<29:29:44, 3.35s/it] 7%|▋ | 2558/34278 [2:50:49<37:16:07, 4.23s/it] {'loss': 0.2073, 'grad_norm': 1.0173764561506085, 'learning_rate': 9.947911476628923e-06, 'epoch': 0.07} 7%|▋ | 2558/34278 [2:50:49<37:16:07, 4.23s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2559/34278 [2:50:53<35:51:53, 4.07s/it] {'loss': 0.2028, 'grad_norm': 0.8505969321849713, 'learning_rate': 9.947843438984392e-06, 'epoch': 0.07} 7%|▋ | 2559/34278 [2:50:53<35:51:53, 4.07s/it] 7%|▋ | 2560/34278 [2:50:56<33:51:03, 3.84s/it] {'loss': 0.1719, 'grad_norm': 0.794195364019822, 'learning_rate': 9.947775357166699e-06, 'epoch': 0.07} 7%|▋ | 2560/34278 [2:50:56<33:51:03, 3.84s/it] 7%|▋ | 2561/34278 [2:50:59<32:46:16, 3.72s/it] {'loss': 0.1784, 'grad_norm': 0.835272658713182, 'learning_rate': 9.947707231176444e-06, 'epoch': 0.07} 7%|▋ | 2561/34278 [2:50:59<32:46:16, 3.72s/it] 7%|▋ | 2562/34278 [2:51:02<30:30:58, 3.46s/it] {'loss': 0.1715, 'grad_norm': 0.9894806609791134, 'learning_rate': 9.947639061014242e-06, 'epoch': 0.07} 7%|▋ | 2562/34278 [2:51:02<30:30:58, 3.46s/it] 7%|▋ | 2563/34278 [2:51:06<31:32:20, 3.58s/it] {'loss': 0.1863, 'grad_norm': 0.9869328334441443, 'learning_rate': 9.9475708466807e-06, 'epoch': 0.07} 7%|▋ | 2563/34278 [2:51:06<31:32:20, 3.58s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 7%|▋ | 2564/34278 [2:51:09<29:56:30, 3.40s/it] {'loss': 0.2023, 'grad_norm': 1.159662679891997, 'learning_rate': 9.947502588176427e-06, 'epoch': 0.07} 7%|▋ | 2564/34278 [2:51:09<29:56:30, 3.40s/it] 7%|▋ | 2565/34278 [2:51:12<29:22:23, 3.33s/it] {'loss': 0.1819, 'grad_norm': 0.9242264876067878, 'learning_rate': 9.947434285502032e-06, 'epoch': 0.07} 7%|▋ | 2565/34278 [2:51:12<29:22:23, 3.33s/it] 7%|▋ | 2566/34278 [2:51:16<29:28:35, 3.35s/it] {'loss': 0.1965, 'grad_norm': 1.026543496068096, 'learning_rate': 9.947365938658124e-06, 'epoch': 0.07} 7%|▋ | 2566/34278 [2:51:16<29:28:35, 3.35s/it] 7%|▋ | 2567/34278 [2:51:20<31:32:13, 3.58s/it] {'loss': 0.1957, 'grad_norm': 0.9841963172592002, 'learning_rate': 9.947297547645314e-06, 'epoch': 0.07} 7%|▋ | 2567/34278 [2:51:20<31:32:13, 3.58s/it] 7%|▋ | 2568/34278 [2:51:23<30:57:41, 3.52s/it] {'loss': 0.1963, 'grad_norm': 1.2404952203932327, 'learning_rate': 9.947229112464213e-06, 'epoch': 0.07} 7%|▋ | 2568/34278 [2:51:23<30:57:41, 3.52s/it] 7%|▋ | 2569/34278 [2:51:27<30:46:05, 3.49s/it] {'loss': 0.1969, 'grad_norm': 0.9740693352878483, 'learning_rate': 9.947160633115431e-06, 'epoch': 0.07} 7%|▋ | 2569/34278 [2:51:27<30:46:05, 3.49s/it] 7%|▋ | 2570/34278 [2:51:31<33:43:51, 3.83s/it] {'loss': 0.1857, 'grad_norm': 0.870458868803175, 'learning_rate': 9.94709210959958e-06, 'epoch': 0.07} 7%|▋ | 2570/34278 [2:51:31<33:43:51, 3.83s/it] 8%|▊ | 2571/34278 [2:51:35<32:45:47, 3.72s/it] {'loss': 0.1886, 'grad_norm': 0.9470708798435789, 'learning_rate': 9.947023541917271e-06, 'epoch': 0.08} 8%|▊ | 2571/34278 [2:51:35<32:45:47, 3.72s/it] 8%|▊ | 2572/34278 [2:51:38<31:44:28, 3.60s/it] {'loss': 0.2086, 'grad_norm': 1.1836894201983172, 'learning_rate': 9.946954930069117e-06, 'epoch': 0.08} 8%|▊ | 2572/34278 [2:51:38<31:44:28, 3.60s/it] 8%|▊ | 2573/34278 [2:51:41<30:57:59, 3.52s/it] {'loss': 0.202, 'grad_norm': 0.9832880021505518, 'learning_rate': 9.946886274055731e-06, 'epoch': 0.08} 8%|▊ | 2573/34278 
[2:51:41<30:57:59, 3.52s/it] 8%|▊ | 2574/34278 [2:51:45<31:56:11, 3.63s/it] {'loss': 0.208, 'grad_norm': 0.9515418412970081, 'learning_rate': 9.946817573877725e-06, 'epoch': 0.08} 8%|▊ | 2574/34278 [2:51:45<31:56:11, 3.63s/it] 8%|▊ | 2575/34278 [2:51:51<38:28:32, 4.37s/it] {'loss': 0.1928, 'grad_norm': 0.9548031685703873, 'learning_rate': 9.946748829535714e-06, 'epoch': 0.08} 8%|▊ | 2575/34278 [2:51:51<38:28:32, 4.37s/it] 8%|▊ | 2576/34278 [2:51:54<35:34:03, 4.04s/it] {'loss': 0.2178, 'grad_norm': 0.8728612685822629, 'learning_rate': 9.946680041030308e-06, 'epoch': 0.08} 8%|▊ | 2576/34278 [2:51:54<35:34:03, 4.04s/it] 8%|▊ | 2577/34278 [2:51:58<33:45:03, 3.83s/it] {'loss': 0.2207, 'grad_norm': 0.9042425278014952, 'learning_rate': 9.946611208362123e-06, 'epoch': 0.08} 8%|▊ | 2577/34278 [2:51:58<33:45:03, 3.83s/it] 8%|▊ | 2578/34278 [2:52:01<31:32:56, 3.58s/it] {'loss': 0.1995, 'grad_norm': 0.8710367260702849, 'learning_rate': 9.946542331531777e-06, 'epoch': 0.08} 8%|▊ | 2578/34278 [2:52:01<31:32:56, 3.58s/it] 8%|▊ | 2579/34278 [2:52:07<38:13:08, 4.34s/it] {'loss': 0.2174, 'grad_norm': 0.8886974016626152, 'learning_rate': 9.946473410539878e-06, 'epoch': 0.08} 8%|▊ | 2579/34278 [2:52:07<38:13:08, 4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2580/34278 [2:52:10<35:20:03, 4.01s/it] {'loss': 0.2076, 'grad_norm': 0.8335754120537804, 'learning_rate': 9.946404445387048e-06, 'epoch': 0.08} 8%|▊ | 2580/34278 [2:52:10<35:20:03, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2581/34278 [2:52:16<39:04:55, 4.44s/it] {'loss': 0.1905, 'grad_norm': 0.8780117152861452, 'learning_rate': 9.946335436073899e-06, 'epoch': 0.08} 8%|▊ | 2581/34278 [2:52:16<39:04:55, 4.44s/it] 8%|▊ | 2582/34278 [2:52:21<41:50:03, 4.75s/it] {'loss': 0.1763, 'grad_norm': 0.9089534703220984, 'learning_rate': 9.946266382601049e-06, 'epoch': 0.08} 8%|▊ | 2582/34278 [2:52:21<41:50:03, 4.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2583/34278 [2:52:25<40:27:46, 4.60s/it] {'loss': 0.2208, 'grad_norm': 0.7917110926617236, 'learning_rate': 9.946197284969112e-06, 'epoch': 0.08} 8%|▊ | 2583/34278 [2:52:25<40:27:46, 4.60s/it] 8%|▊ | 2584/34278 [2:52:29<37:45:17, 4.29s/it] {'loss': 0.2117, 'grad_norm': 1.1303322563806806, 'learning_rate': 9.946128143178708e-06, 'epoch': 0.08} 8%|▊ | 2584/34278 [2:52:29<37:45:17, 4.29s/it] 8%|▊ | 2585/34278 [2:52:32<33:55:42, 3.85s/it] {'loss': 0.1866, 'grad_norm': 1.0074994278245815, 'learning_rate': 9.946058957230451e-06, 'epoch': 0.08} 8%|▊ | 2585/34278 [2:52:32<33:55:42, 3.85s/it] 8%|▊ | 2586/34278 [2:52:35<31:31:45, 3.58s/it] {'loss': 0.204, 'grad_norm': 0.918129918274096, 'learning_rate': 9.945989727124963e-06, 'epoch': 0.08} 8%|▊ | 2586/34278 [2:52:35<31:31:45, 3.58s/it] 8%|▊ | 2587/34278 [2:52:38<30:06:04, 3.42s/it] {'loss': 0.1771, 'grad_norm': 0.7543863200792966, 'learning_rate': 9.945920452862856e-06, 'epoch': 0.08} 8%|▊ | 2587/34278 [2:52:38<30:06:04, 3.42s/it] 8%|▊ | 2588/34278 [2:52:41<29:06:55, 3.31s/it] {'loss': 0.2061, 'grad_norm': 1.3205281719710635, 'learning_rate': 9.945851134444754e-06, 'epoch': 0.08} 8%|▊ | 2588/34278 [2:52:41<29:06:55, 3.31s/it] 8%|▊ | 2589/34278 [2:52:44<28:18:07, 3.22s/it] {'loss': 0.1994, 'grad_norm': 1.0235724508465391, 'learning_rate': 9.945781771871274e-06, 'epoch': 
0.08} 8%|▊ | 2589/34278 [2:52:44<28:18:07, 3.22s/it] 8%|▊ | 2590/34278 [2:52:50<36:09:00, 4.11s/it] {'loss': 0.1996, 'grad_norm': 0.920875422648793, 'learning_rate': 9.945712365143034e-06, 'epoch': 0.08} 8%|▊ | 2590/34278 [2:52:50<36:09:00, 4.11s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12541 > 8192). Running this sequence through the model will result in indexing errors 8%|▊ | 2591/34278 [2:52:53<33:25:06, 3.80s/it] {'loss': 0.2, 'grad_norm': 0.9255287465514315, 'learning_rate': 9.945642914260655e-06, 'epoch': 0.08} 8%|▊ | 2591/34278 [2:52:53<33:25:06, 3.80s/it] 8%|▊ | 2592/34278 [2:52:57<33:39:56, 3.82s/it] {'loss': 0.2171, 'grad_norm': 1.0882154267587745, 'learning_rate': 9.945573419224757e-06, 'epoch': 0.08} 8%|▊ | 2592/34278 [2:52:57<33:39:56, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2593/34278 [2:53:02<35:40:02, 4.05s/it] {'loss': 0.1969, 'grad_norm': 1.111913818943444, 'learning_rate': 9.945503880035958e-06, 'epoch': 0.08} 8%|▊ | 2593/34278 [2:53:02<35:40:02, 4.05s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa23e10ba10> Failed to fetch sample 2735739. Exception: cannot identify image file <_io.BytesIO object at 0x7fa23e10ba10> /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2594/34278 [2:53:06<37:57:22, 4.31s/it] {'loss': 0.2094, 'grad_norm': 0.8757571257745991, 'learning_rate': 9.945434296694883e-06, 'epoch': 0.08} 8%|▊ | 2594/34278 [2:53:06<37:57:22, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2595/34278 [2:53:09<33:39:43, 3.82s/it] {'loss': 0.1984, 'grad_norm': 0.8715141905293745, 'learning_rate': 9.94536466920215e-06, 'epoch': 0.08} 8%|▊ | 2595/34278 [2:53:09<33:39:43, 3.82s/it] 8%|▊ | 2596/34278 [2:53:12<32:28:23, 3.69s/it] {'loss': 0.2181, 'grad_norm': 1.2895697791871283, 'learning_rate': 9.945294997558384e-06, 'epoch': 0.08} 8%|▊ | 2596/34278 [2:53:13<32:28:23, 3.69s/it] 8%|▊ | 2597/34278 [2:53:16<30:58:28, 3.52s/it] {'loss': 0.199, 'grad_norm': 1.0281992159004434, 'learning_rate': 9.945225281764203e-06, 'epoch': 0.08} 8%|▊ | 2597/34278 [2:53:16<30:58:28, 3.52s/it] 8%|▊ | 2598/34278 [2:53:20<32:41:24, 3.71s/it] {'loss': 0.1731, 'grad_norm': 1.0161962229050718, 'learning_rate': 9.945155521820232e-06, 'epoch': 0.08} 8%|▊ | 2598/34278 [2:53:20<32:41:24, 3.71s/it] 8%|▊ | 2599/34278 [2:53:24<35:07:34, 3.99s/it] {'loss': 0.1997, 'grad_norm': 0.9895792606137934, 'learning_rate': 9.945085717727093e-06, 'epoch': 0.08} 8%|▊ | 2599/34278 [2:53:25<35:07:34, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8964 > 8192). 
Running this sequence through the model will result in indexing errors 8%|▊ | 2600/34278 [2:53:28<35:15:15, 4.01s/it] {'loss': 0.1735, 'grad_norm': 0.8103461045109309, 'learning_rate': 9.945015869485409e-06, 'epoch': 0.08} 8%|▊ | 2600/34278 [2:53:28<35:15:15, 4.01s/it] 8%|▊ | 2601/34278 [2:53:33<35:47:17, 4.07s/it] {'loss': 0.1902, 'grad_norm': 0.7954580081561768, 'learning_rate': 9.944945977095803e-06, 'epoch': 0.08} 8%|▊ | 2601/34278 [2:53:33<35:47:17, 4.07s/it] 8%|▊ | 2602/34278 [2:53:37<35:55:39, 4.08s/it] {'loss': 0.1729, 'grad_norm': 0.8832405590401877, 'learning_rate': 9.9448760405589e-06, 'epoch': 0.08} 8%|▊ | 2602/34278 [2:53:37<35:55:39, 4.08s/it] 8%|▊ | 2603/34278 [2:53:41<36:28:01, 4.14s/it] {'loss': 0.2013, 'grad_norm': 0.9456933396832424, 'learning_rate': 9.944806059875326e-06, 'epoch': 0.08} 8%|▊ | 2603/34278 [2:53:41<36:28:01, 4.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2604/34278 [2:53:47<40:22:07, 4.59s/it] {'loss': 0.2016, 'grad_norm': 1.0812812205345452, 'learning_rate': 9.944736035045702e-06, 'epoch': 0.08} 8%|▊ | 2604/34278 [2:53:47<40:22:07, 4.59s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2605/34278 [2:53:51<39:00:14, 4.43s/it] {'loss': 0.168, 'grad_norm': 0.8352054121513081, 'learning_rate': 9.944665966070654e-06, 'epoch': 0.08} 8%|▊ | 2605/34278 [2:53:51<39:00:14, 4.43s/it] 8%|▊ | 2606/34278 [2:53:57<43:40:12, 4.96s/it] {'loss': 0.1932, 'grad_norm': 0.9275361524424524, 'learning_rate': 9.944595852950812e-06, 'epoch': 0.08} 8%|▊ | 2606/34278 [2:53:57<43:40:12, 4.96s/it] 8%|▊ | 2607/34278 [2:54:01<41:46:35, 4.75s/it] {'loss': 0.1877, 'grad_norm': 0.9762002102562343, 'learning_rate': 9.944525695686795e-06, 'epoch': 0.08} 8%|▊ | 2607/34278 [2:54:01<41:46:35, 4.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2608/34278 [2:54:04<37:43:56, 4.29s/it] {'loss': 0.1964, 'grad_norm': 0.8302742683084885, 'learning_rate': 9.944455494279235e-06, 'epoch': 0.08} 8%|▊ | 2608/34278 [2:54:04<37:43:56, 4.29s/it] 8%|▊ | 2609/34278 [2:54:07<33:56:33, 3.86s/it] {'loss': 0.1935, 'grad_norm': 0.9651412018240907, 'learning_rate': 9.944385248728757e-06, 'epoch': 0.08} 8%|▊ | 2609/34278 [2:54:07<33:56:33, 3.86s/it] 8%|▊ | 2610/34278 [2:54:13<39:48:38, 4.53s/it] {'loss': 0.1954, 'grad_norm': 1.0580469462137965, 'learning_rate': 9.944314959035987e-06, 'epoch': 0.08} 8%|▊ | 2610/34278 [2:54:13<39:48:38, 4.53s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2611/34278 [2:54:18<39:37:48, 4.51s/it] {'loss': 0.1786, 'grad_norm': 0.8436060684882135, 'learning_rate': 9.944244625201553e-06, 'epoch': 0.08} 8%|▊ | 2611/34278 [2:54:18<39:37:48, 4.51s/it] 8%|▊ | 2612/34278 [2:54:24<44:07:02, 5.02s/it] {'loss': 0.201, 'grad_norm': 1.1668726972890329, 'learning_rate': 9.944174247226084e-06, 'epoch': 0.08} 8%|▊ | 2612/34278 [2:54:24<44:07:02, 5.02s/it] 8%|▊ | 2613/34278 [2:54:27<39:33:20, 4.50s/it] {'loss': 0.2082, 'grad_norm': 0.9050530434600664, 'learning_rate': 9.944103825110207e-06, 'epoch': 0.08} 8%|▊ | 2613/34278 [2:54:27<39:33:20, 4.50s/it] 8%|▊ | 2614/34278 [2:54:33<43:45:19, 4.97s/it] {'loss': 0.1855, 'grad_norm': 0.9657438371151073, 'learning_rate': 9.944033358854553e-06, 'epoch': 0.08} 8%|▊ | 2614/34278 [2:54:33<43:45:19, 4.97s/it] 8%|▊ | 2615/34278 [2:54:37<40:07:30, 4.56s/it] {'loss': 0.1957, 'grad_norm': 0.8982666967367076, 'learning_rate': 9.943962848459747e-06, 'epoch': 0.08} 8%|▊ | 2615/34278 [2:54:37<40:07:30, 4.56s/it] 8%|▊ | 2616/34278 [2:54:40<36:48:06, 4.18s/it] {'loss': 0.2097, 'grad_norm': 0.8917214421128401, 'learning_rate': 9.943892293926422e-06, 'epoch': 0.08} 8%|▊ | 2616/34278 [2:54:40<36:48:06, 4.18s/it] 8%|▊ | 2617/34278 [2:54:43<34:04:28, 3.87s/it] {'loss': 0.1911, 'grad_norm': 1.0212730113490176, 'learning_rate': 9.943821695255208e-06, 'epoch': 0.08} 8%|▊ | 2617/34278 [2:54:43<34:04:28, 3.87s/it] 8%|▊ | 2618/34278 [2:54:47<33:25:04, 3.80s/it] {'loss': 0.1713, 'grad_norm': 1.2312845391322114, 'learning_rate': 9.943751052446732e-06, 'epoch': 0.08} 8%|▊ | 2618/34278 [2:54:47<33:25:04, 3.80s/it] 8%|▊ | 2619/34278 [2:54:51<33:57:23, 3.86s/it] {'loss': 0.1962, 'grad_norm': 0.7339266003982186, 'learning_rate': 9.943680365501628e-06, 'epoch': 0.08} 8%|▊ | 2619/34278 [2:54:51<33:57:23, 3.86s/it] 8%|▊ | 2620/34278 [2:54:54<31:16:06, 3.56s/it] {'loss': 0.1659, 'grad_norm': 0.769726512109223, 'learning_rate': 9.943609634420526e-06, 'epoch': 0.08} 8%|▊ | 2620/34278 
[2:54:54<31:16:06, 3.56s/it] 8%|▊ | 2621/34278 [2:54:57<30:40:17, 3.49s/it] {'loss': 0.2021, 'grad_norm': 1.1079342427199865, 'learning_rate': 9.943538859204056e-06, 'epoch': 0.08} 8%|▊ | 2621/34278 [2:54:57<30:40:17, 3.49s/it] 8%|▊ | 2622/34278 [2:55:00<29:52:35, 3.40s/it] {'loss': 0.1891, 'grad_norm': 0.9860008360785368, 'learning_rate': 9.943468039852852e-06, 'epoch': 0.08} 8%|▊ | 2622/34278 [2:55:00<29:52:35, 3.40s/it] 8%|▊ | 2623/34278 [2:55:04<30:04:52, 3.42s/it] {'loss': 0.1832, 'grad_norm': 0.7521624716160008, 'learning_rate': 9.943397176367546e-06, 'epoch': 0.08} 8%|▊ | 2623/34278 [2:55:04<30:04:52, 3.42s/it] 8%|▊ | 2624/34278 [2:55:07<28:50:02, 3.28s/it] {'loss': 0.1984, 'grad_norm': 1.0201516196014724, 'learning_rate': 9.94332626874877e-06, 'epoch': 0.08} 8%|▊ | 2624/34278 [2:55:07<28:50:02, 3.28s/it] 8%|▊ | 2625/34278 [2:55:10<28:31:43, 3.24s/it] {'loss': 0.21, 'grad_norm': 1.221830997682814, 'learning_rate': 9.943255316997156e-06, 'epoch': 0.08} 8%|▊ | 2625/34278 [2:55:10<28:31:43, 3.24s/it] 8%|▊ | 2626/34278 [2:55:13<28:49:57, 3.28s/it] {'loss': 0.2104, 'grad_norm': 0.8614480246302127, 'learning_rate': 9.943184321113339e-06, 'epoch': 0.08} 8%|▊ | 2626/34278 [2:55:13<28:49:57, 3.28s/it] 8%|▊ | 2627/34278 [2:55:17<29:16:03, 3.33s/it] {'loss': 0.1722, 'grad_norm': 0.8262000373867923, 'learning_rate': 9.943113281097953e-06, 'epoch': 0.08} 8%|▊ | 2627/34278 [2:55:17<29:16:03, 3.33s/it] 8%|▊ | 2628/34278 [2:55:23<35:59:31, 4.09s/it] {'loss': 0.1972, 'grad_norm': 0.9084643356559261, 'learning_rate': 9.943042196951631e-06, 'epoch': 0.08} 8%|▊ | 2628/34278 [2:55:23<35:59:31, 4.09s/it] 8%|▊ | 2629/34278 [2:55:26<34:35:24, 3.93s/it] {'loss': 0.1875, 'grad_norm': 0.8752944963258539, 'learning_rate': 9.942971068675009e-06, 'epoch': 0.08} 8%|▊ | 2629/34278 [2:55:26<34:35:24, 3.93s/it] 8%|▊ | 2630/34278 [2:55:30<34:22:44, 3.91s/it] {'loss': 0.1535, 'grad_norm': 0.795338642633206, 'learning_rate': 9.942899896268721e-06, 'epoch': 0.08} 8%|▊ | 2630/34278 
[Training log excerpt, condensed: steps 2631–2808 of 34278 (~8%, epoch 0.08). Per-step loss fluctuated between ~0.17 and ~0.24; grad_norm stayed near 0.8–1.3 apart from a single spike to ~6.99 at step 2775; the learning rate decayed smoothly from 9.942829e-06 to 9.929529e-06; step time ranged ~3.1–5.2 s/it (ETA ~28–46 h). Two warnings recurred throughout this window:

  /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None

  Token indices sequence length is longer than the specified maximum sequence length for this model (e.g. 10503 > 8192, also 9815, 10632, 10915, 11695, 12522 > 8192). Running this sequence through the model will result in indexing errors]
'epoch': 0.08} 8%|▊ | 2808/34278 [3:07:05<38:11:06, 4.37s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2809/34278 [3:07:08<34:20:37, 3.93s/it] {'loss': 0.1862, 'grad_norm': 0.9813151341080305, 'learning_rate': 9.929449741259675e-06, 'epoch': 0.08} 8%|▊ | 2809/34278 [3:07:08<34:20:37, 3.93s/it] 8%|▊ | 2810/34278 [3:07:11<32:06:56, 3.67s/it] {'loss': 0.1895, 'grad_norm': 1.0301680924957024, 'learning_rate': 9.92937063624707e-06, 'epoch': 0.08} 8%|▊ | 2810/34278 [3:07:11<32:06:56, 3.67s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2811/34278 [3:07:15<31:29:05, 3.60s/it] {'loss': 0.1759, 'grad_norm': 1.1282745828440568, 'learning_rate': 9.929291487226221e-06, 'epoch': 0.08} 8%|▊ | 2811/34278 [3:07:15<31:29:05, 3.60s/it] 8%|▊ | 2812/34278 [3:07:18<29:55:34, 3.42s/it] {'loss': 0.1858, 'grad_norm': 1.0243361260094277, 'learning_rate': 9.929212294197834e-06, 'epoch': 0.08} 8%|▊ | 2812/34278 [3:07:18<29:55:34, 3.42s/it] 8%|▊ | 2813/34278 [3:07:22<31:32:24, 3.61s/it] {'loss': 0.2236, 'grad_norm': 0.9394166173833635, 'learning_rate': 9.929133057162616e-06, 'epoch': 0.08} 8%|▊ | 2813/34278 [3:07:22<31:32:24, 3.61s/it] 8%|▊ | 2814/34278 [3:07:24<29:04:06, 3.33s/it] {'loss': 0.1997, 'grad_norm': 1.0602639828605804, 'learning_rate': 9.929053776121276e-06, 'epoch': 0.08} 8%|▊ | 2814/34278 [3:07:24<29:04:06, 3.33s/it] 8%|▊ | 2815/34278 [3:07:30<34:44:57, 3.98s/it] {'loss': 0.1889, 'grad_norm': 0.9907947906298216, 'learning_rate': 9.92897445107452e-06, 'epoch': 0.08} 8%|▊ | 2815/34278 [3:07:30<34:44:57, 
3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8237 > 8192). Running this sequence through the model will result in indexing errors 8%|▊ | 2816/34278 [3:07:33<32:24:52, 3.71s/it] {'loss': 0.1892, 'grad_norm': 1.0825967856642165, 'learning_rate': 9.928895082023056e-06, 'epoch': 0.08} 8%|▊ | 2816/34278 [3:07:33<32:24:52, 3.71s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10530 > 8192). Running this sequence through the model will result in indexing errors 8%|▊ | 2817/34278 [3:07:36<30:16:19, 3.46s/it] {'loss': 0.1786, 'grad_norm': 1.129993170967017, 'learning_rate': 9.928815668967592e-06, 'epoch': 0.08} 8%|▊ | 2817/34278 [3:07:36<30:16:19, 3.46s/it] 8%|▊ | 2818/34278 [3:07:41<36:07:00, 4.13s/it] {'loss': 0.1943, 'grad_norm': 0.9319226639332902, 'learning_rate': 9.928736211908841e-06, 'epoch': 0.08} 8%|▊ | 2818/34278 [3:07:41<36:07:00, 4.13s/it] 8%|▊ | 2819/34278 [3:07:46<35:50:31, 4.10s/it] {'loss': 0.1892, 'grad_norm': 0.8958238329213462, 'learning_rate': 9.92865671084751e-06, 'epoch': 0.08} 8%|▊ | 2819/34278 [3:07:46<35:50:31, 4.10s/it] 8%|▊ | 2820/34278 [3:07:49<33:50:21, 3.87s/it] {'loss': 0.1752, 'grad_norm': 1.1202960962969155, 'learning_rate': 9.928577165784306e-06, 'epoch': 0.08} 8%|▊ | 2820/34278 [3:07:49<33:50:21, 3.87s/it] 8%|▊ | 2821/34278 [3:07:52<32:41:16, 3.74s/it] {'loss': 0.218, 'grad_norm': 0.9182878870187241, 'learning_rate': 9.928497576719943e-06, 'epoch': 0.08} 8%|▊ | 2821/34278 [3:07:52<32:41:16, 3.74s/it] 8%|▊ | 2822/34278 [3:07:56<32:02:33, 3.67s/it] {'loss': 0.1809, 'grad_norm': 1.152626927527983, 'learning_rate': 9.92841794365513e-06, 'epoch': 0.08} 8%|▊ | 2822/34278 [3:07:56<32:02:33, 3.67s/it] 8%|▊ | 
2823/34278 [3:07:59<30:47:04, 3.52s/it] {'loss': 0.1851, 'grad_norm': 0.8891267211475501, 'learning_rate': 9.928338266590578e-06, 'epoch': 0.08} 8%|▊ | 2823/34278 [3:07:59<30:47:04, 3.52s/it] 8%|▊ | 2824/34278 [3:08:03<30:57:58, 3.54s/it] {'loss': 0.2464, 'grad_norm': 0.9388552579142749, 'learning_rate': 9.928258545526999e-06, 'epoch': 0.08} 8%|▊ | 2824/34278 [3:08:03<30:57:58, 3.54s/it] 8%|▊ | 2825/34278 [3:08:06<29:52:57, 3.42s/it] {'loss': 0.1702, 'grad_norm': 1.0039462884984038, 'learning_rate': 9.928178780465103e-06, 'epoch': 0.08} 8%|▊ | 2825/34278 [3:08:06<29:52:57, 3.42s/it] 8%|▊ | 2826/34278 [3:08:11<35:06:44, 4.02s/it] {'loss': 0.1692, 'grad_norm': 0.8654622861712875, 'learning_rate': 9.928098971405604e-06, 'epoch': 0.08} 8%|▊ | 2826/34278 [3:08:11<35:06:44, 4.02s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2827/34278 [3:08:15<34:46:26, 3.98s/it] {'loss': 0.208, 'grad_norm': 0.9468776200338908, 'learning_rate': 9.928019118349214e-06, 'epoch': 0.08} 8%|▊ | 2827/34278 [3:08:15<34:46:26, 3.98s/it] 8%|▊ | 2828/34278 [3:08:19<34:00:53, 3.89s/it] {'loss': 0.1893, 'grad_norm': 0.8675639463620226, 'learning_rate': 9.927939221296645e-06, 'epoch': 0.08} 8%|▊ | 2828/34278 [3:08:19<34:00:53, 3.89s/it] 8%|▊ | 2829/34278 [3:08:22<31:57:10, 3.66s/it] {'loss': 0.1912, 'grad_norm': 0.9140124809001825, 'learning_rate': 9.927859280248613e-06, 'epoch': 0.08} 8%|▊ | 2829/34278 [3:08:22<31:57:10, 3.66s/it] 8%|▊ | 2830/34278 [3:08:26<34:39:11, 3.97s/it] {'loss': 0.1882, 'grad_norm': 0.8808135522217446, 'learning_rate': 9.927779295205828e-06, 'epoch': 0.08} 8%|▊ | 2830/34278 [3:08:26<34:39:11, 3.97s/it] 8%|▊ | 2831/34278 [3:08:29<31:46:23, 3.64s/it] {'loss': 0.2129, 'grad_norm': 1.1263294413792957, 'learning_rate': 9.927699266169006e-06, 'epoch': 0.08} 8%|▊ | 2831/34278 [3:08:29<31:46:23, 
3.64s/it] 8%|▊ | 2832/34278 [3:08:34<34:45:54, 3.98s/it] {'loss': 0.2159, 'grad_norm': 1.1464325707882075, 'learning_rate': 9.927619193138862e-06, 'epoch': 0.08} 8%|▊ | 2832/34278 [3:08:34<34:45:54, 3.98s/it] 8%|▊ | 2833/34278 [3:08:40<40:22:05, 4.62s/it] {'loss': 0.1843, 'grad_norm': 0.9193497134407058, 'learning_rate': 9.927539076116108e-06, 'epoch': 0.08} 8%|▊ | 2833/34278 [3:08:40<40:22:05, 4.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2834/34278 [3:08:44<37:07:21, 4.25s/it] {'loss': 0.1875, 'grad_norm': 1.0620327484528727, 'learning_rate': 9.927458915101463e-06, 'epoch': 0.08} 8%|▊ | 2834/34278 [3:08:44<37:07:21, 4.25s/it] 8%|▊ | 2835/34278 [3:08:47<34:03:30, 3.90s/it] {'loss': 0.1988, 'grad_norm': 1.297023435604498, 'learning_rate': 9.92737871009564e-06, 'epoch': 0.08} 8%|▊ | 2835/34278 [3:08:47<34:03:30, 3.90s/it] 8%|▊ | 2836/34278 [3:08:53<40:10:00, 4.60s/it] {'loss': 0.1945, 'grad_norm': 1.1594631116160867, 'learning_rate': 9.927298461099358e-06, 'epoch': 0.08} 8%|▊ | 2836/34278 [3:08:53<40:10:00, 4.60s/it] 8%|▊ | 2837/34278 [3:08:57<37:57:22, 4.35s/it] {'loss': 0.2019, 'grad_norm': 2.71844039216031, 'learning_rate': 9.92721816811333e-06, 'epoch': 0.08} 8%|▊ | 2837/34278 [3:08:57<37:57:22, 4.35s/it] 8%|▊ | 2838/34278 [3:09:03<42:38:50, 4.88s/it] {'loss': 0.2028, 'grad_norm': 1.1300278850126155, 'learning_rate': 9.927137831138275e-06, 'epoch': 0.08} 8%|▊ | 2838/34278 [3:09:03<42:38:50, 4.88s/it] 8%|▊ | 2839/34278 [3:09:06<37:59:23, 4.35s/it] {'loss': 0.2007, 'grad_norm': 0.9754568136533278, 'learning_rate': 9.92705745017491e-06, 'epoch': 0.08} 8%|▊ | 2839/34278 [3:09:06<37:59:23, 4.35s/it] 8%|▊ | 2840/34278 [3:09:09<34:41:23, 3.97s/it] {'loss': 0.2024, 'grad_norm': 0.8622160475156416, 'learning_rate': 9.926977025223954e-06, 'epoch': 0.08} 8%|▊ | 2840/34278 
[3:09:09<34:41:23, 3.97s/it] 8%|▊ | 2841/34278 [3:09:15<40:06:56, 4.59s/it] {'loss': 0.1839, 'grad_norm': 0.9267529229951084, 'learning_rate': 9.92689655628612e-06, 'epoch': 0.08} 8%|▊ | 2841/34278 [3:09:15<40:06:56, 4.59s/it] 8%|▊ | 2842/34278 [3:09:21<43:53:54, 5.03s/it] {'loss': 0.2059, 'grad_norm': 0.9914073092895164, 'learning_rate': 9.926816043362132e-06, 'epoch': 0.08} 8%|▊ | 2842/34278 [3:09:21<43:53:54, 5.03s/it] 8%|▊ | 2843/34278 [3:09:26<43:34:49, 4.99s/it] {'loss': 0.2116, 'grad_norm': 0.8820006093688145, 'learning_rate': 9.926735486452706e-06, 'epoch': 0.08} 8%|▊ | 2843/34278 [3:09:26<43:34:49, 4.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2844/34278 [3:09:32<45:49:27, 5.25s/it] {'loss': 0.1931, 'grad_norm': 1.128717413870026, 'learning_rate': 9.92665488555856e-06, 'epoch': 0.08} 8%|▊ | 2844/34278 [3:09:32<45:49:27, 5.25s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2845/34278 [3:09:35<40:23:17, 4.63s/it] {'loss': 0.2084, 'grad_norm': 1.033345513230471, 'learning_rate': 9.926574240680417e-06, 'epoch': 0.08} 8%|▊ | 2845/34278 [3:09:35<40:23:17, 4.63s/it] 8%|▊ | 2846/34278 [3:09:39<38:55:30, 4.46s/it] {'loss': 0.1836, 'grad_norm': 0.7905999006884825, 'learning_rate': 9.926493551818995e-06, 'epoch': 0.08} 8%|▊ | 2846/34278 [3:09:39<38:55:30, 4.46s/it] 8%|▊ | 2847/34278 [3:09:42<35:57:01, 4.12s/it] {'loss': 0.1792, 'grad_norm': 0.9106042662231119, 'learning_rate': 9.926412818975015e-06, 'epoch': 0.08} 8%|▊ | 2847/34278 [3:09:42<35:57:01, 4.12s/it] 8%|▊ | 2848/34278 [3:09:46<34:10:42, 3.91s/it] {'loss': 0.1835, 'grad_norm': 0.7649859787618821, 'learning_rate': 9.926332042149196e-06, 'epoch': 0.08} 8%|▊ | 2848/34278 [3:09:46<34:10:42, 3.91s/it] 8%|▊ | 2849/34278 [3:09:49<33:05:57, 3.79s/it] {'loss': 0.2235, 'grad_norm': 0.9070524418435134, 'learning_rate': 9.926251221342262e-06, 'epoch': 0.08} 8%|▊ | 2849/34278 [3:09:49<33:05:57, 3.79s/it] 8%|▊ | 2850/34278 [3:09:55<38:49:50, 4.45s/it] {'loss': 0.1774, 'grad_norm': 0.8590134215437185, 'learning_rate': 9.926170356554932e-06, 'epoch': 0.08} 8%|▊ | 2850/34278 [3:09:55<38:49:50, 4.45s/it] 8%|▊ | 2851/34278 [3:10:01<42:52:05, 4.91s/it] {'loss': 0.214, 'grad_norm': 1.0204603380841695, 'learning_rate': 9.92608944778793e-06, 'epoch': 0.08} 8%|▊ | 2851/34278 [3:10:01<42:52:05, 4.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2852/34278 [3:10:05<38:35:31, 4.42s/it] {'loss': 0.2257, 'grad_norm': 0.9968437364023346, 'learning_rate': 9.926008495041975e-06, 'epoch': 0.08} 8%|▊ | 2852/34278 [3:10:05<38:35:31, 4.42s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9014 > 8192). 
Running this sequence through the model will result in indexing errors 8%|▊ | 2853/34278 [3:10:08<34:45:38, 3.98s/it] {'loss': 0.1844, 'grad_norm': 0.9535351588974474, 'learning_rate': 9.925927498317794e-06, 'epoch': 0.08} 8%|▊ | 2853/34278 [3:10:08<34:45:38, 3.98s/it] 8%|▊ | 2854/34278 [3:10:13<37:44:40, 4.32s/it] {'loss': 0.2026, 'grad_norm': 0.8854242104496456, 'learning_rate': 9.925846457616109e-06, 'epoch': 0.08} 8%|▊ | 2854/34278 [3:10:13<37:44:40, 4.32s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2855/34278 [3:10:16<34:58:44, 4.01s/it] {'loss': 0.192, 'grad_norm': 1.0089888511502039, 'learning_rate': 9.925765372937641e-06, 'epoch': 0.08} 8%|▊ | 2855/34278 [3:10:16<34:58:44, 4.01s/it] 8%|▊ | 2856/34278 [3:10:22<41:03:14, 4.70s/it] {'loss': 0.2179, 'grad_norm': 0.917845127766735, 'learning_rate': 9.925684244283116e-06, 'epoch': 0.08} 8%|▊ | 2856/34278 [3:10:22<41:03:14, 4.70s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2857/34278 [3:10:26<38:36:31, 4.42s/it] {'loss': 0.1836, 'grad_norm': 0.9949885017616943, 'learning_rate': 9.925603071653258e-06, 'epoch': 0.08} 8%|▊ | 2857/34278 [3:10:26<38:36:31, 4.42s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2858/34278 [3:10:30<36:19:32, 4.16s/it] {'loss': 0.2328, 'grad_norm': 1.0751050383811247, 'learning_rate': 9.925521855048794e-06, 'epoch': 0.08} 8%|▊ | 2858/34278 [3:10:30<36:19:32, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2859/34278 [3:10:33<34:57:58, 4.01s/it] {'loss': 0.1967, 'grad_norm': 0.906733513481403, 'learning_rate': 9.925440594470444e-06, 'epoch': 0.08} 8%|▊ | 2859/34278 [3:10:33<34:57:58, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2860/34278 [3:10:36<32:20:56, 3.71s/it] {'loss': 0.2128, 'grad_norm': 1.142701110240169, 'learning_rate': 9.925359289918937e-06, 'epoch': 0.08} 8%|▊ | 2860/34278 [3:10:36<32:20:56, 3.71s/it] 8%|▊ | 2861/34278 [3:10:39<30:45:01, 3.52s/it] {'loss': 0.213, 'grad_norm': 0.9905148856143702, 'learning_rate': 9.925277941394998e-06, 'epoch': 0.08} 8%|▊ | 2861/34278 [3:10:39<30:45:01, 3.52s/it] 8%|▊ | 2862/34278 [3:10:43<30:59:41, 3.55s/it] {'loss': 0.1755, 'grad_norm': 1.0512130964236224, 'learning_rate': 9.925196548899353e-06, 'epoch': 0.08} 8%|▊ | 2862/34278 [3:10:43<30:59:41, 3.55s/it] 8%|▊ | 2863/34278 [3:10:47<31:29:43, 3.61s/it] {'loss': 0.1709, 'grad_norm': 1.0806773648246382, 'learning_rate': 9.925115112432728e-06, 'epoch': 0.08} 8%|▊ | 2863/34278 [3:10:47<31:29:43, 3.61s/it] 8%|▊ | 2864/34278 [3:10:50<30:31:58, 3.50s/it] {'loss': 0.192, 'grad_norm': 1.1075388767413175, 'learning_rate': 9.925033631995854e-06, 'epoch': 0.08} 8%|▊ | 2864/34278 [3:10:50<30:31:58, 3.50s/it] 8%|▊ | 2865/34278 [3:10:54<31:03:14, 3.56s/it] {'loss': 0.2066, 'grad_norm': 0.9388043533551024, 'learning_rate': 
9.924952107589452e-06, 'epoch': 0.08} 8%|▊ | 2865/34278 [3:10:54<31:03:14, 3.56s/it] 8%|▊ | 2866/34278 [3:10:57<29:59:06, 3.44s/it] {'loss': 0.2075, 'grad_norm': 0.9950697119120068, 'learning_rate': 9.924870539214256e-06, 'epoch': 0.08} 8%|▊ | 2866/34278 [3:10:57<29:59:06, 3.44s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12570 > 8192). Running this sequence through the model will result in indexing errors 8%|▊ | 2867/34278 [3:11:02<35:39:07, 4.09s/it] {'loss': 0.2287, 'grad_norm': 1.1421071021035174, 'learning_rate': 9.924788926870989e-06, 'epoch': 0.08} 8%|▊ | 2867/34278 [3:11:02<35:39:07, 4.09s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2868/34278 [3:11:08<39:29:52, 4.53s/it] {'loss': 0.1956, 'grad_norm': 1.0471766176853352, 'learning_rate': 9.924707270560383e-06, 'epoch': 0.08} 8%|▊ | 2868/34278 [3:11:08<39:29:52, 4.53s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2869/34278 [3:11:13<40:41:15, 4.66s/it] {'loss': 0.2252, 'grad_norm': 0.91963288783241, 'learning_rate': 9.924625570283167e-06, 'epoch': 0.08} 8%|▊ | 2869/34278 [3:11:13<40:41:15, 4.66s/it] 8%|▊ | 2870/34278 [3:11:17<37:47:40, 4.33s/it] {'loss': 0.1757, 'grad_norm': 1.0305444563482429, 'learning_rate': 9.92454382604007e-06, 'epoch': 0.08} 8%|▊ | 2870/34278 [3:11:17<37:47:40, 4.33s/it] 8%|▊ | 2871/34278 [3:11:20<34:21:10, 3.94s/it] {'loss': 0.1947, 'grad_norm': 0.9452937508991428, 'learning_rate': 9.92446203783182e-06, 'epoch': 0.08} 8%|▊ | 2871/34278 [3:11:20<34:21:10, 3.94s/it] 8%|▊ | 2872/34278 [3:11:23<34:07:47, 3.91s/it] {'loss': 0.196, 'grad_norm': 0.9541093318719566, 'learning_rate': 9.924380205659147e-06, 'epoch': 0.08} 8%|▊ | 2872/34278 [3:11:23<34:07:47, 3.91s/it] 8%|▊ | 2873/34278 [3:11:26<31:47:34, 3.64s/it] {'loss': 0.1981, 'grad_norm': 1.112014158471839, 'learning_rate': 9.924298329522786e-06, 'epoch': 0.08} 8%|▊ | 2873/34278 [3:11:26<31:47:34, 3.64s/it] 8%|▊ | 2874/34278 [3:11:30<30:43:51, 3.52s/it] {'loss': 0.182, 'grad_norm': 0.901707974059267, 'learning_rate': 9.924216409423464e-06, 'epoch': 0.08} 8%|▊ | 2874/34278 [3:11:30<30:43:51, 3.52s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2875/34278 [3:11:33<29:21:18, 3.37s/it] {'loss': 0.2023, 'grad_norm': 1.1860376632262863, 'learning_rate': 9.924134445361913e-06, 'epoch': 0.08} 8%|▊ | 2875/34278 [3:11:33<29:21:18, 3.37s/it] 8%|▊ | 2876/34278 [3:11:39<36:38:59, 4.20s/it] {'loss': 0.1765, 'grad_norm': 1.0321403578435027, 'learning_rate': 9.924052437338865e-06, 'epoch': 0.08} 8%|▊ | 2876/34278 [3:11:39<36:38:59, 4.20s/it] 8%|▊ | 2877/34278 [3:11:44<39:14:41, 4.50s/it] {'loss': 0.1855, 'grad_norm': 0.8207461696475347, 'learning_rate': 9.923970385355052e-06, 'epoch': 0.08} 8%|▊ | 2877/34278 [3:11:44<39:14:41, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2878/34278 [3:11:50<43:26:42, 4.98s/it] {'loss': 0.2035, 'grad_norm': 0.9138012734625626, 'learning_rate': 9.92388828941121e-06, 'epoch': 0.08} 8%|▊ | 2878/34278 [3:11:50<43:26:42, 4.98s/it] 8%|▊ | 2879/34278 [3:11:54<40:00:46, 4.59s/it] {'loss': 0.1844, 'grad_norm': 0.9461767070447371, 'learning_rate': 9.923806149508066e-06, 'epoch': 0.08} 8%|▊ | 2879/34278 [3:11:54<40:00:46, 4.59s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10856 > 8192). 
Running this sequence through the model will result in indexing errors 8%|▊ | 2880/34278 [3:11:57<35:59:23, 4.13s/it] {'loss': 0.1901, 'grad_norm': 0.9880288764358837, 'learning_rate': 9.923723965646356e-06, 'epoch': 0.08} 8%|▊ | 2880/34278 [3:11:57<35:59:23, 4.13s/it] 8%|▊ | 2881/34278 [3:12:03<41:09:57, 4.72s/it] {'loss': 0.2148, 'grad_norm': 1.1643190997949617, 'learning_rate': 9.923641737826815e-06, 'epoch': 0.08} 8%|▊ | 2881/34278 [3:12:03<41:09:57, 4.72s/it] 8%|▊ | 2882/34278 [3:12:06<37:25:04, 4.29s/it] {'loss': 0.1941, 'grad_norm': 1.1033208593677126, 'learning_rate': 9.923559466050174e-06, 'epoch': 0.08} 8%|▊ | 2882/34278 [3:12:06<37:25:04, 4.29s/it] 8%|▊ | 2883/34278 [3:12:09<34:03:49, 3.91s/it] {'loss': 0.1997, 'grad_norm': 0.9737642643596599, 'learning_rate': 9.923477150317172e-06, 'epoch': 0.08} 8%|▊ | 2883/34278 [3:12:09<34:03:49, 3.91s/it] 8%|▊ | 2884/34278 [3:12:13<32:33:58, 3.73s/it] {'loss': 0.2009, 'grad_norm': 0.9261757169279775, 'learning_rate': 9.92339479062854e-06, 'epoch': 0.08} 8%|▊ | 2884/34278 [3:12:13<32:33:58, 3.73s/it] 8%|▊ | 2885/34278 [3:12:16<31:18:49, 3.59s/it] {'loss': 0.1933, 'grad_norm': 1.258990034617718, 'learning_rate': 9.923312386985013e-06, 'epoch': 0.08} 8%|▊ | 2885/34278 [3:12:16<31:18:49, 3.59s/it] 8%|▊ | 2886/34278 [3:12:22<38:02:04, 4.36s/it] {'loss': 0.1971, 'grad_norm': 1.001298937861448, 'learning_rate': 9.92322993938733e-06, 'epoch': 0.08} 8%|▊ | 2886/34278 [3:12:22<38:02:04, 4.36s/it] 8%|▊ | 2887/34278 [3:12:26<36:02:18, 4.13s/it] {'loss': 0.1794, 'grad_norm': 0.9213789051844401, 'learning_rate': 9.923147447836226e-06, 'epoch': 0.08} 8%|▊ | 2887/34278 [3:12:26<36:02:18, 4.13s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12565 > 8192). 
Running this sequence through the model will result in indexing errors 8%|▊ | 2888/34278 [3:12:29<35:11:52, 4.04s/it] {'loss': 0.1844, 'grad_norm': 0.7759502889329065, 'learning_rate': 9.923064912332436e-06, 'epoch': 0.08} 8%|▊ | 2888/34278 [3:12:29<35:11:52, 4.04s/it] 8%|▊ | 2889/34278 [3:12:32<32:06:12, 3.68s/it] {'loss': 0.1866, 'grad_norm': 0.9828130081912211, 'learning_rate': 9.922982332876698e-06, 'epoch': 0.08} 8%|▊ | 2889/34278 [3:12:32<32:06:12, 3.68s/it] 8%|▊ | 2890/34278 [3:12:38<38:19:44, 4.40s/it] {'loss': 0.1957, 'grad_norm': 0.9141924436531134, 'learning_rate': 9.922899709469748e-06, 'epoch': 0.08} 8%|▊ | 2890/34278 [3:12:38<38:19:44, 4.40s/it] 8%|▊ | 2891/34278 [3:12:42<36:19:43, 4.17s/it] {'loss': 0.201, 'grad_norm': 0.9530853995878734, 'learning_rate': 9.922817042112326e-06, 'epoch': 0.08} 8%|▊ | 2891/34278 [3:12:42<36:19:43, 4.17s/it] 8%|▊ | 2892/34278 [3:12:45<33:38:05, 3.86s/it] {'loss': 0.1921, 'grad_norm': 0.809485663870376, 'learning_rate': 9.922734330805169e-06, 'epoch': 0.08} 8%|▊ | 2892/34278 [3:12:45<33:38:05, 3.86s/it] 8%|▊ | 2893/34278 [3:12:48<31:59:23, 3.67s/it] {'loss': 0.2064, 'grad_norm': 0.8076078712216355, 'learning_rate': 9.922651575549013e-06, 'epoch': 0.08} 8%|▊ | 2893/34278 [3:12:48<31:59:23, 3.67s/it] 8%|▊ | 2894/34278 [3:12:51<30:42:11, 3.52s/it] {'loss': 0.2151, 'grad_norm': 0.9071757954275966, 'learning_rate': 9.9225687763446e-06, 'epoch': 0.08} 8%|▊ | 2894/34278 [3:12:52<30:42:11, 3.52s/it] 8%|▊ | 2895/34278 [3:12:55<29:32:08, 3.39s/it] {'loss': 0.177, 'grad_norm': 0.9745045000878537, 'learning_rate': 9.922485933192667e-06, 'epoch': 0.08} 8%|▊ | 2895/34278 [3:12:55<29:32:08, 3.39s/it] 8%|▊ | 2896/34278 [3:12:58<29:44:49, 3.41s/it] {'loss': 0.1813, 'grad_norm': 0.9922703437221619, 'learning_rate': 9.922403046093956e-06, 'epoch': 0.08} 8%|▊ | 2896/34278 [3:12:58<29:44:49, 3.41s/it] 8%|▊ | 2897/34278 [3:13:01<29:34:31, 3.39s/it] {'loss': 0.1787, 'grad_norm': 0.9315399021583822, 'learning_rate': 9.922320115049205e-06, 
'epoch': 0.08} 8%|▊ | 2897/34278 [3:13:01<29:34:31, 3.39s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2898/34278 [3:13:05<29:27:46, 3.38s/it] {'loss': 0.1704, 'grad_norm': 0.8776975219008598, 'learning_rate': 9.922237140059157e-06, 'epoch': 0.08} 8%|▊ | 2898/34278 [3:13:05<29:27:46, 3.38s/it] 8%|▊ | 2899/34278 [3:13:09<30:38:33, 3.52s/it] {'loss': 0.28, 'grad_norm': 1.1553772509854796, 'learning_rate': 9.922154121124548e-06, 'epoch': 0.08} 8%|▊ | 2899/34278 [3:13:09<30:38:33, 3.52s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2900/34278 [3:13:13<33:41:50, 3.87s/it] {'loss': 0.1691, 'grad_norm': 0.8628025604026724, 'learning_rate': 9.922071058246122e-06, 'epoch': 0.08} 8%|▊ | 2900/34278 [3:13:13<33:41:50, 3.87s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 8%|▊ | 2901/34278 [3:13:17<32:51:25, 3.77s/it] {'loss': 0.1756, 'grad_norm': 0.9123746640465588, 'learning_rate': 9.921987951424624e-06, 'epoch': 0.08} 8%|▊ | 2901/34278 [3:13:17<32:51:25, 3.77s/it] 8%|▊ | 2902/34278 [3:13:21<33:09:47, 3.81s/it] {'loss': 0.2124, 'grad_norm': 0.9395113792693988, 'learning_rate': 9.92190480066079e-06, 'epoch': 0.08} 8%|▊ | 2902/34278 [3:13:21<33:09:47, 3.81s/it] 8%|▊ | 2903/34278 [3:13:24<31:37:58, 3.63s/it] {'loss': 0.2006, 'grad_norm': 0.8051932965639589, 'learning_rate': 9.921821605955366e-06, 'epoch': 0.08} 8%|▊ | 2903/34278 [3:13:24<31:37:58, 3.63s/it] 8%|▊ | 2904/34278 [3:13:28<32:39:19, 3.75s/it] {'loss': 0.188, 'grad_norm': 0.8792275575625256, 'learning_rate': 9.921738367309091e-06, 'epoch': 0.08} 8%|▊ | 2904/34278 [3:13:28<32:39:19, 3.75s/it] 8%|▊ | 2905/34278 [3:13:32<32:59:33, 3.79s/it] {'loss': 0.1948, 'grad_norm': 0.9120241887495304, 'learning_rate': 9.921655084722713e-06, 'epoch': 0.08} 8%|▊ | 2905/34278 [3:13:32<32:59:33, 3.79s/it] 8%|▊ | 2906/34278 [3:13:35<32:43:29, 3.76s/it] {'loss': 0.2279, 'grad_norm': 0.9919025555721173, 'learning_rate': 9.921571758196973e-06, 'epoch': 0.08} 8%|▊ | 2906/34278 [3:13:35<32:43:29, 3.76s/it] 8%|▊ | 2907/34278 [3:13:41<38:00:58, 4.36s/it] {'loss': 0.1788, 'grad_norm': 0.8772008946481051, 'learning_rate': 9.921488387732617e-06, 'epoch': 0.08} 8%|▊ | 2907/34278 [3:13:41<38:00:58, 4.36s/it] 8%|▊ | 2908/34278 [3:13:45<35:56:17, 4.12s/it] {'loss': 0.1836, 'grad_norm': 0.8265557172512299, 'learning_rate': 9.921404973330385e-06, 'epoch': 0.08} 8%|▊ | 2908/34278 [3:13:45<35:56:17, 4.12s/it] 8%|▊ | 2909/34278 [3:13:49<36:49:19, 4.23s/it] {'loss': 0.1791, 'grad_norm': 0.9060913602449435, 'learning_rate': 9.921321514991024e-06, 'epoch': 0.08} 8%|▊ | 2909/34278 [3:13:49<36:49:19, 4.23s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2910/34278 [3:13:54<38:09:27, 4.38s/it] {'loss': 0.1884, 'grad_norm': 0.8626848225961726, 'learning_rate': 9.92123801271528e-06, 'epoch': 0.08} 8%|▊ | 2910/34278 [3:13:54<38:09:27, 4.38s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 8%|▊ | 2911/34278 [3:13:57<34:55:49, 4.01s/it] {'loss': 0.1761, 'grad_norm': 0.8905954737845782, 'learning_rate': 9.921154466503899e-06, 'epoch': 0.08} 8%|▊ | 2911/34278 [3:13:57<34:55:49, 4.01s/it] 8%|▊ | 2912/34278 [3:14:00<32:21:42, 3.71s/it] {'loss': 0.1756, 'grad_norm': 0.9143770626772574, 'learning_rate': 9.921070876357625e-06, 'epoch': 0.08} 8%|▊ | 2912/34278 [3:14:00<32:21:42, 3.71s/it] 8%|▊ | 2913/34278 [3:14:03<30:39:11, 3.52s/it] {'loss': 0.202, 'grad_norm': 0.929480628058618, 'learning_rate': 9.920987242277205e-06, 'epoch': 0.08} 8%|▊ | 2913/34278 [3:14:03<30:39:11, 3.52s/it] 9%|▊ | 2914/34278 [3:14:06<28:55:07, 3.32s/it] {'loss': 0.1967, 'grad_norm': 0.8884063264213005, 'learning_rate': 9.920903564263385e-06, 'epoch': 0.09} 9%|▊ | 2914/34278 [3:14:06<28:55:07, 3.32s/it] 9%|▊ | 2915/34278 [3:14:11<33:30:51, 3.85s/it] {'loss': 0.2079, 'grad_norm': 0.9497256865252193, 'learning_rate': 9.920819842316914e-06, 'epoch': 0.09} 9%|▊ | 2915/34278 [3:14:11<33:30:51, 3.85s/it] 9%|▊ | 2916/34278 [3:14:17<39:07:36, 4.49s/it] {'loss': 0.1875, 'grad_norm': 0.9076402904311048, 'learning_rate': 9.920736076438535e-06, 'epoch': 0.09} 9%|▊ | 2916/34278 [3:14:17<39:07:36, 4.49s/it] 9%|▊ | 2917/34278 [3:14:20<35:12:48, 4.04s/it] {'loss': 0.2004, 'grad_norm': 0.8785093924513132, 'learning_rate': 9.920652266629002e-06, 'epoch': 0.09} 9%|▊ | 2917/34278 [3:14:20<35:12:48, 4.04s/it] 9%|▊ | 2918/34278 [3:14:27<41:22:04, 4.75s/it] {'loss': 0.1886, 'grad_norm': 0.9594248321183944, 'learning_rate': 
9.92056841288906e-06, 'epoch': 0.09} 9%|▊ | 2918/34278 [3:14:27<41:22:04, 4.75s/it] 9%|▊ | 2919/34278 [3:14:30<38:44:51, 4.45s/it] {'loss': 0.2005, 'grad_norm': 0.7301481664940029, 'learning_rate': 9.920484515219458e-06, 'epoch': 0.09} 9%|▊ | 2919/34278 [3:14:30<38:44:51, 4.45s/it] 9%|▊ | 2920/34278 [3:14:33<34:38:40, 3.98s/it] {'loss': 0.1737, 'grad_norm': 0.8705722867537141, 'learning_rate': 9.920400573620943e-06, 'epoch': 0.09} 9%|▊ | 2920/34278 [3:14:33<34:38:40, 3.98s/it] 9%|▊ | 2921/34278 [3:14:36<32:44:52, 3.76s/it] {'loss': 0.2029, 'grad_norm': 0.9298330380751542, 'learning_rate': 9.920316588094268e-06, 'epoch': 0.09} 9%|▊ | 2921/34278 [3:14:36<32:44:52, 3.76s/it] 9%|▊ | 2922/34278 [3:14:41<33:56:28, 3.90s/it] {'loss': 0.1906, 'grad_norm': 0.8401546420998232, 'learning_rate': 9.92023255864018e-06, 'epoch': 0.09} 9%|▊ | 2922/34278 [3:14:41<33:56:28, 3.90s/it] 9%|▊ | 2923/34278 [3:14:44<32:52:52, 3.78s/it] {'loss': 0.1863, 'grad_norm': 0.9615083026920218, 'learning_rate': 9.92014848525943e-06, 'epoch': 0.09} 9%|▊ | 2923/34278 [3:14:44<32:52:52, 3.78s/it] 9%|▊ | 2924/34278 [3:14:50<38:51:16, 4.46s/it] {'loss': 0.188, 'grad_norm': 1.0750265050292815, 'learning_rate': 9.92006436795277e-06, 'epoch': 0.09} 9%|▊ | 2924/34278 [3:14:50<38:51:16, 4.46s/it] 9%|▊ | 2925/34278 [3:14:53<35:20:14, 4.06s/it] {'loss': 0.179, 'grad_norm': 0.9646879674547842, 'learning_rate': 9.919980206720949e-06, 'epoch': 0.09} 9%|▊ | 2925/34278 [3:14:53<35:20:14, 4.06s/it] 9%|▊ | 2926/34278 [3:14:57<34:36:44, 3.97s/it] {'loss': 0.1659, 'grad_norm': 0.7327944202303368, 'learning_rate': 9.919896001564721e-06, 'epoch': 0.09} 9%|▊ | 2926/34278 [3:14:57<34:36:44, 3.97s/it] 9%|▊ | 2927/34278 [3:15:01<33:17:46, 3.82s/it] {'loss': 0.2574, 'grad_norm': 0.9456678912751119, 'learning_rate': 9.919811752484834e-06, 'epoch': 0.09} 9%|▊ | 2927/34278 [3:15:01<33:17:46, 3.82s/it] 9%|▊ | 2928/34278 [3:15:07<39:21:12, 4.52s/it] {'loss': 0.2018, 'grad_norm': 0.983425907080063, 'learning_rate': 
9.919727459482043e-06, 'epoch': 0.09} 9%|▊ | 2928/34278 [3:15:07<39:21:12, 4.52s/it] 9%|▊ | 2929/34278 [3:15:13<43:14:08, 4.97s/it] {'loss': 0.177, 'grad_norm': 0.803034174790023, 'learning_rate': 9.919643122557099e-06, 'epoch': 0.09} 9%|▊ | 2929/34278 [3:15:13<43:14:08, 4.97s/it] 9%|▊ | 2930/34278 [3:15:16<38:06:26, 4.38s/it] {'loss': 0.2047, 'grad_norm': 0.8841768667334998, 'learning_rate': 9.919558741710757e-06, 'epoch': 0.09} 9%|▊ | 2930/34278 [3:15:16<38:06:26, 4.38s/it] 9%|▊ | 2931/34278 [3:15:19<34:23:15, 3.95s/it] {'loss': 0.1876, 'grad_norm': 0.8384717905943977, 'learning_rate': 9.919474316943767e-06, 'epoch': 0.09} 9%|▊ | 2931/34278 [3:15:19<34:23:15, 3.95s/it] 9%|▊ | 2932/34278 [3:15:22<32:25:44, 3.72s/it] {'loss': 0.1822, 'grad_norm': 0.9552471273658159, 'learning_rate': 9.919389848256886e-06, 'epoch': 0.09} 9%|▊ | 2932/34278 [3:15:22<32:25:44, 3.72s/it] 9%|▊ | 2933/34278 [3:15:25<32:06:25, 3.69s/it] {'loss': 0.1677, 'grad_norm': 0.861439992265136, 'learning_rate': 9.919305335650866e-06, 'epoch': 0.09} 9%|▊ | 2933/34278 [3:15:25<32:06:25, 3.69s/it] 9%|▊ | 2934/34278 [3:15:29<31:21:39, 3.60s/it] {'loss': 0.1805, 'grad_norm': 0.9117444486814688, 'learning_rate': 9.919220779126464e-06, 'epoch': 0.09} 9%|▊ | 2934/34278 [3:15:29<31:21:39, 3.60s/it] 9%|▊ | 2935/34278 [3:15:32<30:25:59, 3.50s/it] {'loss': 0.1942, 'grad_norm': 0.9329925244000081, 'learning_rate': 9.919136178684432e-06, 'epoch': 0.09} 9%|▊ | 2935/34278 [3:15:32<30:25:59, 3.50s/it] 9%|▊ | 2936/34278 [3:15:35<29:29:19, 3.39s/it] {'loss': 0.1848, 'grad_norm': 1.0201358946837327, 'learning_rate': 9.919051534325526e-06, 'epoch': 0.09} 9%|▊ | 2936/34278 [3:15:35<29:29:19, 3.39s/it] 9%|▊ | 2937/34278 [3:15:38<28:36:44, 3.29s/it] {'loss': 0.1728, 'grad_norm': 1.0060928362882617, 'learning_rate': 9.918966846050502e-06, 'epoch': 0.09} 9%|▊ | 2937/34278 [3:15:38<28:36:44, 3.29s/it] 9%|▊ | 2938/34278 [3:15:42<28:55:16, 3.32s/it] {'loss': 0.2069, 'grad_norm': 0.93465579707486, 'learning_rate': 
9.918882113860117e-06, 'epoch': 0.09} 9%|▊ | 2938/34278 [3:15:42<28:55:16, 3.32s/it] 9%|▊ | 2939/34278 [3:15:46<31:13:13, 3.59s/it] {'loss': 0.2154, 'grad_norm': 0.9294178165199828, 'learning_rate': 9.918797337755125e-06, 'epoch': 0.09} 9%|▊ | 2939/34278 [3:15:46<31:13:13, 3.59s/it] 9%|▊ | 2940/34278 [3:15:50<31:52:23, 3.66s/it] {'loss': 0.163, 'grad_norm': 1.0235219812827783, 'learning_rate': 9.918712517736288e-06, 'epoch': 0.09} 9%|▊ | 2940/34278 [3:15:50<31:52:23, 3.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▊ | 2941/34278 [3:15:53<30:27:06, 3.50s/it] {'loss': 0.1693, 'grad_norm': 0.9363081392345324, 'learning_rate': 9.918627653804358e-06, 'epoch': 0.09} 9%|▊ | 2941/34278 [3:15:53<30:27:06, 3.50s/it] 9%|▊ | 2942/34278 [3:15:57<31:36:20, 3.63s/it] {'loss': 0.1792, 'grad_norm': 0.8096250351067642, 'learning_rate': 9.918542745960094e-06, 'epoch': 0.09} 9%|▊ | 2942/34278 [3:15:57<31:36:20, 3.63s/it] 9%|▊ | 2943/34278 [3:16:00<30:55:06, 3.55s/it] {'loss': 0.2217, 'grad_norm': 1.030032549066214, 'learning_rate': 9.918457794204255e-06, 'epoch': 0.09} 9%|▊ | 2943/34278 [3:16:00<30:55:06, 3.55s/it] 9%|▊ | 2944/34278 [3:16:04<32:23:56, 3.72s/it] {'loss': 0.1828, 'grad_norm': 0.8574877595755447, 'learning_rate': 9.918372798537599e-06, 'epoch': 0.09} 9%|▊ | 2944/34278 [3:16:04<32:23:56, 3.72s/it] 9%|▊ | 2945/34278 [3:16:10<38:37:03, 4.44s/it] {'loss': 0.2054, 'grad_norm': 1.491100058222644, 'learning_rate': 9.918287758960885e-06, 'epoch': 0.09} 9%|▊ | 2945/34278 [3:16:10<38:37:03, 4.44s/it] 9%|▊ | 2946/34278 [3:16:17<43:29:38, 5.00s/it] {'loss': 0.1918, 'grad_norm': 0.9083850752775197, 'learning_rate': 9.918202675474872e-06, 'epoch': 0.09} 9%|▊ | 2946/34278 [3:16:17<43:29:38, 5.00s/it] 9%|▊ | 2947/34278 [3:16:20<37:42:28, 4.33s/it] {'loss': 0.1913, 'grad_norm': 0.9011727890489442, 
'learning_rate': 9.91811754808032e-06, 'epoch': 0.09} 9%|▊ | 2947/34278 [3:16:20<37:42:28, 4.33s/it] 9%|▊ | 2948/34278 [3:16:23<35:01:02, 4.02s/it] {'loss': 0.1825, 'grad_norm': 0.7906158588954184, 'learning_rate': 9.918032376777987e-06, 'epoch': 0.09} 9%|▊ | 2948/34278 [3:16:23<35:01:02, 4.02s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff395973790> Failed to fetch sample 2667587. 
Exception: cannot identify image file <_io.BytesIO object at 0x7ff395973790> 9%|▊ | 2949/34278 [3:16:26<33:21:41, 3.83s/it] {'loss': 0.1932, 'grad_norm': 0.8329374188141475, 'learning_rate': 9.917947161568635e-06, 'epoch': 0.09} 9%|▊ | 2949/34278 [3:16:26<33:21:41, 3.83s/it] 9%|▊ | 2950/34278 [3:16:29<31:46:38, 3.65s/it] {'loss': 0.1805, 'grad_norm': 0.9847432416449173, 'learning_rate': 9.917861902453026e-06, 'epoch': 0.09} 9%|▊ | 2950/34278 [3:16:29<31:46:38, 3.65s/it] 9%|▊ | 2951/34278 [3:16:34<32:58:35, 3.79s/it] {'loss': 0.198, 'grad_norm': 0.77088051200454, 'learning_rate': 9.91777659943192e-06, 'epoch': 0.09} 9%|▊ | 2951/34278 [3:16:34<32:58:35, 3.79s/it] 9%|▊ | 2952/34278 [3:16:37<32:33:21, 3.74s/it] {'loss': 0.2013, 'grad_norm': 0.9664209900370838, 'learning_rate': 9.917691252506077e-06, 'epoch': 0.09} 9%|▊ | 2952/34278 [3:16:37<32:33:21, 3.74s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▊ | 2953/34278 [3:16:41<31:50:15, 3.66s/it] {'loss': 0.1877, 'grad_norm': 0.8164306184048502, 'learning_rate': 9.917605861676263e-06, 'epoch': 0.09} 9%|▊ | 2953/34278 [3:16:41<31:50:15, 3.66s/it] 9%|▊ | 2954/34278 [3:16:44<32:21:44, 3.72s/it] {'loss': 0.1819, 'grad_norm': 0.9170505125275688, 'learning_rate': 9.917520426943234e-06, 'epoch': 0.09} 9%|▊ | 2954/34278 [3:16:44<32:21:44, 3.72s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
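The traceback above shows `__getitem__` failing with `PIL.UnidentifiedImageError` on a corrupt image ("Failed to fetch sample 2667587"), after which training simply continues at the next step, which suggests the dataset retries with a different index instead of crashing the run. A minimal sketch of that skip-and-resample pattern (the `RobustDataset` class and `fetch` function here are hypothetical stand-ins, not the aguvis `dataset.py` code):

```python
import random

class RobustDataset:
    """Wraps a fetch function; on failure, falls back to a random other index."""

    def __init__(self, samples, fetch, max_retries=10):
        self.samples = samples
        self.fetch = fetch            # may raise on corrupt data
        self.max_retries = max_retries

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.fetch(self.samples[i])
            except Exception as e:
                # Mirrors the log's "Failed to fetch sample ... Exception: ..."
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randrange(len(self.samples))  # resample another index
        raise RuntimeError("too many consecutive corrupt samples")

# Demo with one "corrupt" entry that always raises.
def fetch(x):
    if x is None:
        raise ValueError("cannot identify image file")
    return x * 2

ds = RobustDataset([1, None, 3], fetch)
print(ds[0])  # 2
```

Resampling a random index (rather than `i + 1`) avoids a run of adjacent corrupt samples stalling a single dataloader worker.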
Gradients will be None warnings.warn( 9%|▊ | 2955/34278 [3:16:51<38:34:08, 4.43s/it] {'loss': 0.1946, 'grad_norm': 0.8540539865375996, 'learning_rate': 9.91743494830776e-06, 'epoch': 0.09} 9%|▊ | 2955/34278 [3:16:51<38:34:08, 4.43s/it] 9%|▊ | 2956/34278 [3:16:56<40:31:41, 4.66s/it] {'loss': 0.1847, 'grad_norm': 0.8564212394033919, 'learning_rate': 9.9173494257706e-06, 'epoch': 0.09} 9%|▊ | 2956/34278 [3:16:56<40:31:41, 4.66s/it] 9%|▊ | 2957/34278 [3:16:59<37:54:43, 4.36s/it] {'loss': 0.2041, 'grad_norm': 0.9910363759857238, 'learning_rate': 9.917263859332517e-06, 'epoch': 0.09} 9%|▊ | 2957/34278 [3:16:59<37:54:43, 4.36s/it] 9%|▊ | 2958/34278 [3:17:04<39:37:48, 4.56s/it] {'loss': 0.1802, 'grad_norm': 0.8074326237234796, 'learning_rate': 9.917178248994276e-06, 'epoch': 0.09} 9%|▊ | 2958/34278 [3:17:04<39:37:48, 4.56s/it] 9%|▊ | 2959/34278 [3:17:08<37:20:16, 4.29s/it] {'loss': 0.1797, 'grad_norm': 0.8084312759161405, 'learning_rate': 9.917092594756644e-06, 'epoch': 0.09} 9%|▊ | 2959/34278 [3:17:08<37:20:16, 4.29s/it] 9%|▊ | 2960/34278 [3:17:12<35:00:13, 4.02s/it] {'loss': 0.2053, 'grad_norm': 0.902437331471456, 'learning_rate': 9.91700689662038e-06, 'epoch': 0.09} 9%|▊ | 2960/34278 [3:17:12<35:00:13, 4.02s/it] 9%|▊ | 2961/34278 [3:17:15<33:09:15, 3.81s/it] {'loss': 0.2152, 'grad_norm': 1.1399668089315442, 'learning_rate': 9.916921154586255e-06, 'epoch': 0.09} 9%|▊ | 2961/34278 [3:17:15<33:09:15, 3.81s/it] 9%|▊ | 2962/34278 [3:17:18<30:58:53, 3.56s/it] {'loss': 0.1975, 'grad_norm': 1.0049544203280714, 'learning_rate': 9.91683536865503e-06, 'epoch': 0.09} 9%|▊ | 2962/34278 [3:17:18<30:58:53, 3.56s/it] 9%|▊ | 2963/34278 [3:17:21<29:43:38, 3.42s/it] {'loss': 0.196, 'grad_norm': 0.912389918373014, 'learning_rate': 9.916749538827472e-06, 'epoch': 0.09} 9%|▊ | 2963/34278 [3:17:21<29:43:38, 3.42s/it] 9%|▊ | 2964/34278 [3:17:27<36:22:54, 4.18s/it] {'loss': 0.1969, 'grad_norm': 0.8774603790351778, 'learning_rate': 9.916663665104348e-06, 'epoch': 0.09} 9%|▊ | 2964/34278 
[3:17:27<36:22:54, 4.18s/it] 9%|▊ | 2965/34278 [3:17:30<33:22:10, 3.84s/it] {'loss': 0.1683, 'grad_norm': 0.8048369930767437, 'learning_rate': 9.916577747486425e-06, 'epoch': 0.09} 9%|▊ | 2965/34278 [3:17:30<33:22:10, 3.84s/it] 9%|▊ | 2966/34278 [3:17:36<38:10:50, 4.39s/it] {'loss': 0.2098, 'grad_norm': 0.9037903315671819, 'learning_rate': 9.91649178597447e-06, 'epoch': 0.09} 9%|▊ | 2966/34278 [3:17:36<38:10:50, 4.39s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▊ | 2967/34278 [3:17:40<37:19:00, 4.29s/it] {'loss': 0.1768, 'grad_norm': 0.9839554970042445, 'learning_rate': 9.91640578056925e-06, 'epoch': 0.09} 9%|▊ | 2967/34278 [3:17:40<37:19:00, 4.29s/it] 9%|▊ | 2968/34278 [3:17:44<37:15:00, 4.28s/it] {'loss': 0.1728, 'grad_norm': 0.8167726210545194, 'learning_rate': 9.916319731271532e-06, 'epoch': 0.09} 9%|▊ | 2968/34278 [3:17:44<37:15:00, 4.28s/it] 9%|▊ | 2969/34278 [3:17:48<37:30:29, 4.31s/it] {'loss': 0.1754, 'grad_norm': 0.8284331586776322, 'learning_rate': 9.916233638082086e-06, 'epoch': 0.09} 9%|▊ | 2969/34278 [3:17:48<37:30:29, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▊ | 2970/34278 [3:17:51<34:13:56, 3.94s/it] {'loss': 0.1781, 'grad_norm': 0.8480392275632824, 'learning_rate': 9.916147501001679e-06, 'epoch': 0.09} 9%|▊ | 2970/34278 [3:17:51<34:13:56, 3.94s/it] 9%|▊ | 2971/34278 [3:17:54<31:53:01, 3.67s/it] {'loss': 0.1839, 'grad_norm': 0.8980497097528721, 'learning_rate': 9.91606132003108e-06, 'epoch': 0.09} 9%|▊ | 2971/34278 [3:17:54<31:53:01, 3.67s/it] 9%|▊ | 2972/34278 [3:17:57<30:01:22, 3.45s/it] {'loss': 0.1868, 'grad_norm': 0.9619761370527177, 'learning_rate': 9.91597509517106e-06, 'epoch': 0.09} 9%|▊ | 2972/34278 [3:17:57<30:01:22, 3.45s/it] 9%|▊ | 2973/34278 [3:18:01<30:13:06, 3.48s/it] {'loss': 0.2301, 'grad_norm': 0.7884627570569046, 'learning_rate': 9.91588882642239e-06, 'epoch': 0.09} 9%|▊ | 2973/34278 [3:18:01<30:13:06, 3.48s/it] 9%|▊ | 2974/34278 [3:18:04<29:31:13, 3.39s/it] {'loss': 0.1739, 'grad_norm': 0.7750463676329331, 'learning_rate': 9.915802513785835e-06, 'epoch': 0.09} 9%|▊ | 2974/34278 [3:18:04<29:31:13, 3.39s/it] 9%|▊ | 2975/34278 [3:18:08<30:41:49, 3.53s/it] {'loss': 0.176, 'grad_norm': 0.825933172220525, 'learning_rate': 9.91571615726217e-06, 'epoch': 0.09} 9%|▊ | 2975/34278 [3:18:08<30:41:49, 3.53s/it] 9%|▊ | 2976/34278 [3:18:14<37:15:25, 4.28s/it] {'loss': 0.209, 'grad_norm': 0.8739152510151846, 'learning_rate': 9.915629756852163e-06, 'epoch': 0.09} 9%|▊ | 2976/34278 [3:18:14<37:15:25, 4.28s/it] 9%|▊ | 2977/34278 [3:18:17<34:29:55, 3.97s/it] {'loss': 0.2316, 'grad_norm': 0.9315939547274, 'learning_rate': 9.915543312556588e-06, 'epoch': 0.09} 9%|▊ | 2977/34278 [3:18:17<34:29:55, 3.97s/it] 9%|▊ | 2978/34278 [3:18:20<31:34:30, 3.63s/it] {'loss': 0.2085, 'grad_norm': 0.7991220099424584, 'learning_rate': 9.915456824376217e-06, 'epoch': 0.09} 9%|▊ | 2978/34278 [3:18:20<31:34:30, 3.63s/it] 9%|▊ | 2979/34278 [3:18:24<31:30:11, 3.62s/it] {'loss': 0.2058, 'grad_norm': 0.9678247234032558, 'learning_rate': 9.915370292311818e-06, 'epoch': 0.09} 9%|▊ | 2979/34278 
[3:18:24<31:30:11, 3.62s/it] 9%|▊ | 2980/34278 [3:18:28<32:15:49, 3.71s/it] {'loss': 0.1984, 'grad_norm': 0.9505488319158721, 'learning_rate': 9.91528371636417e-06, 'epoch': 0.09} 9%|▊ | 2980/34278 [3:18:28<32:15:49, 3.71s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9036 > 8192). Running this sequence through the model will result in indexing errors 9%|▊ | 2981/34278 [3:18:31<30:27:28, 3.50s/it] {'loss': 0.1738, 'grad_norm': 0.8523923035068051, 'learning_rate': 9.915197096534039e-06, 'epoch': 0.09} 9%|▊ | 2981/34278 [3:18:31<30:27:28, 3.50s/it] 9%|▊ | 2982/34278 [3:18:37<37:27:27, 4.31s/it] {'loss': 0.2035, 'grad_norm': 0.9849848988500524, 'learning_rate': 9.915110432822203e-06, 'epoch': 0.09} 9%|▊ | 2982/34278 [3:18:37<37:27:27, 4.31s/it] 9%|▊ | 2983/34278 [3:18:42<38:37:01, 4.44s/it] {'loss': 0.1719, 'grad_norm': 0.7955156687986064, 'learning_rate': 9.915023725229435e-06, 'epoch': 0.09} 9%|▊ | 2983/34278 [3:18:42<38:37:01, 4.44s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
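The tokenizer warning above ("Token indices sequence length is longer than the specified maximum sequence length for this model (9036 > 8192)") means some samples exceed the model's context window; unless they are truncated before the forward pass, position-embedding lookups can go out of range, which is the "indexing errors" the warning refers to. A minimal sketch of right-truncating token ids together with their aligned labels (the helper name and the 8192 limit are taken from the warning, everything else is an assumption):

```python
MAX_LEN = 8192  # model context length, per the warning (9036 > 8192)

def truncate_pair(input_ids, labels, max_len=MAX_LEN):
    """Right-truncate token ids and their aligned per-token labels to max_len."""
    if len(input_ids) != len(labels):
        raise ValueError("input_ids and labels must be aligned")
    return input_ids[:max_len], labels[:max_len]

ids = list(range(9036))        # stand-in for an over-long tokenized sample
labs = [-100] * 9036           # -100 is the usual ignored-label sentinel
ids, labs = truncate_pair(ids, labs)
print(len(ids))  # 8192
```

For multimodal samples like these, truncation must not cut through an image-placeholder token span, so real pipelines often drop or re-chunk over-long samples instead of blindly slicing.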
Gradients will be None warnings.warn( 9%|▊ | 2984/34278 [3:18:45<35:39:13, 4.10s/it] {'loss': 0.1807, 'grad_norm': 0.8459914807033535, 'learning_rate': 9.914936973756509e-06, 'epoch': 0.09} 9%|▊ | 2984/34278 [3:18:45<35:39:13, 4.10s/it] 9%|▊ | 2985/34278 [3:18:48<32:56:31, 3.79s/it] {'loss': 0.1929, 'grad_norm': 1.01317682427066, 'learning_rate': 9.914850178404199e-06, 'epoch': 0.09} 9%|▊ | 2985/34278 [3:18:48<32:56:31, 3.79s/it] 9%|▊ | 2986/34278 [3:18:51<31:45:33, 3.65s/it] {'loss': 0.1938, 'grad_norm': 0.8749675976675539, 'learning_rate': 9.914763339173279e-06, 'epoch': 0.09} 9%|▊ | 2986/34278 [3:18:51<31:45:33, 3.65s/it] 9%|▊ | 2987/34278 [3:18:55<30:57:56, 3.56s/it] {'loss': 0.1933, 'grad_norm': 1.0050341942248369, 'learning_rate': 9.914676456064526e-06, 'epoch': 0.09} 9%|▊ | 2987/34278 [3:18:55<30:57:56, 3.56s/it] 9%|▊ | 2988/34278 [3:18:58<30:32:03, 3.51s/it] {'loss': 0.1998, 'grad_norm': 0.8836090974873869, 'learning_rate': 9.914589529078713e-06, 'epoch': 0.09} 9%|▊ | 2988/34278 [3:18:58<30:32:03, 3.51s/it] 9%|▊ | 2989/34278 [3:19:01<29:09:00, 3.35s/it] {'loss': 0.1878, 'grad_norm': 1.1617059750490621, 'learning_rate': 9.914502558216618e-06, 'epoch': 0.09} 9%|▊ | 2989/34278 [3:19:01<29:09:00, 3.35s/it] 9%|▊ | 2990/34278 [3:19:05<31:31:34, 3.63s/it] {'loss': 0.1999, 'grad_norm': 0.9042435047273026, 'learning_rate': 9.91441554347902e-06, 'epoch': 0.09} 9%|▊ | 2990/34278 [3:19:05<31:31:34, 3.63s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▊ | 2991/34278 [3:19:09<31:27:58, 3.62s/it] {'loss': 0.1955, 'grad_norm': 0.78858604502361, 'learning_rate': 9.91432848486669e-06, 'epoch': 0.09} 9%|▊ | 2991/34278 [3:19:09<31:27:58, 3.62s/it] 9%|▊ | 2992/34278 [3:19:12<29:46:42, 3.43s/it] {'loss': 0.2084, 'grad_norm': 0.913998913718363, 'learning_rate': 9.91424138238041e-06, 'epoch': 0.09} 9%|▊ | 2992/34278 [3:19:12<29:46:42, 3.43s/it] 9%|▊ | 2993/34278 [3:19:15<29:40:03, 3.41s/it] {'loss': 0.1769, 'grad_norm': 0.8933043256883373, 'learning_rate': 9.914154236020957e-06, 'epoch': 0.09} 9%|▊ | 2993/34278 [3:19:15<29:40:03, 3.41s/it] 9%|▊ | 2994/34278 [3:19:18<29:03:47, 3.34s/it] {'loss': 0.1945, 'grad_norm': 0.8455693392256065, 'learning_rate': 9.914067045789107e-06, 'epoch': 0.09} 9%|▊ | 2994/34278 [3:19:18<29:03:47, 3.34s/it] 9%|▊ | 2995/34278 [3:19:24<36:02:56, 4.15s/it] {'loss': 0.1998, 'grad_norm': 1.0031771539640966, 'learning_rate': 9.913979811685638e-06, 'epoch': 0.09} 9%|▊ | 2995/34278 [3:19:24<36:02:56, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▊ | 2996/34278 [3:19:31<41:19:32, 4.76s/it] {'loss': 0.1791, 'grad_norm': 0.8820102377931416, 'learning_rate': 9.913892533711331e-06, 'epoch': 0.09} 9%|▊ | 2996/34278 [3:19:31<41:19:32, 4.76s/it] 9%|▊ | 2997/34278 [3:19:35<40:08:01, 4.62s/it] {'loss': 0.1785, 'grad_norm': 0.9376324370591926, 'learning_rate': 9.913805211866967e-06, 'epoch': 0.09} 9%|▊ | 2997/34278 [3:19:35<40:08:01, 4.62s/it] 9%|▊ | 2998/34278 [3:19:39<38:09:59, 4.39s/it] {'loss': 0.1782, 'grad_norm': 1.0575494307685773, 'learning_rate': 9.913717846153322e-06, 'epoch': 0.09} 9%|▊ | 2998/34278 [3:19:39<38:09:59, 4.39s/it] 9%|▊ | 2999/34278 [3:19:42<35:04:37, 4.04s/it] {'loss': 0.195, 'grad_norm': 1.4428421928061463, 'learning_rate': 9.913630436571176e-06, 'epoch': 0.09} 9%|▊ | 2999/34278 [3:19:42<35:04:37, 4.04s/it] 9%|▉ | 3000/34278 [3:19:48<39:55:58, 4.60s/it] {'loss': 0.225, 'grad_norm': 1.2558891724714025, 'learning_rate': 9.91354298312131e-06, 'epoch': 0.09} 9%|▉ | 3000/34278 [3:19:48<39:55:58, 4.60s/it] 9%|▉ | 3001/34278 [3:19:51<37:03:52, 4.27s/it] {'loss': 0.1908, 'grad_norm': 0.9906785696055594, 'learning_rate': 9.913455485804506e-06, 'epoch': 0.09} 9%|▉ | 3001/34278 [3:19:51<37:03:52, 4.27s/it] 9%|▉ | 3002/34278 [3:19:55<35:34:06, 4.09s/it] {'loss': 0.182, 'grad_norm': 0.9814461413562301, 'learning_rate': 9.913367944621545e-06, 'epoch': 0.09} 9%|▉ | 3002/34278 [3:19:55<35:34:06, 4.09s/it] 9%|▉ | 3003/34278 [3:19:59<34:17:19, 3.95s/it] {'loss': 0.1837, 'grad_norm': 1.3326132631853649, 'learning_rate': 9.913280359573207e-06, 'epoch': 0.09} 9%|▉ | 3003/34278 [3:19:59<34:17:19, 3.95s/it] 9%|▉ | 3004/34278 [3:20:03<34:41:10, 3.99s/it] {'loss': 0.1795, 'grad_norm': 0.8927915911853632, 'learning_rate': 9.913192730660275e-06, 'epoch': 0.09} 9%|▉ | 3004/34278 [3:20:03<34:41:10, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3005/34278 [3:20:06<31:55:40, 3.68s/it] {'loss': 0.1945, 'grad_norm': 1.0050729571802486, 'learning_rate': 9.913105057883532e-06, 'epoch': 0.09} 9%|▉ | 3005/34278 [3:20:06<31:55:40, 3.68s/it] 9%|▉ | 3006/34278 [3:20:09<31:16:22, 3.60s/it] {'loss': 0.1964, 'grad_norm': 0.9128684143400813, 'learning_rate': 9.91301734124376e-06, 'epoch': 0.09} 9%|▉ | 3006/34278 [3:20:09<31:16:22, 3.60s/it] 9%|▉ | 3007/34278 [3:20:12<30:21:30, 3.49s/it] {'loss': 0.1732, 'grad_norm': 0.8438485794169703, 'learning_rate': 9.91292958074174e-06, 'epoch': 0.09} 9%|▉ | 3007/34278 [3:20:12<30:21:30, 3.49s/it] 9%|▉ | 3008/34278 [3:20:18<36:55:25, 4.25s/it] {'loss': 0.2083, 'grad_norm': 0.8477518508629068, 'learning_rate': 9.91284177637826e-06, 'epoch': 0.09} 9%|▉ | 3008/34278 [3:20:18<36:55:25, 4.25s/it] 9%|▉ | 3009/34278 [3:20:24<41:41:00, 4.80s/it] {'loss': 0.1763, 'grad_norm': 0.7869919198682958, 'learning_rate': 9.9127539281541e-06, 'epoch': 0.09} 9%|▉ | 3009/34278 [3:20:24<41:41:00, 4.80s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9014 > 8192). 
Running this sequence through the model will result in indexing errors 9%|▉ | 3010/34278 [3:20:27<36:48:16, 4.24s/it] {'loss': 0.1778, 'grad_norm': 1.0437124670764184, 'learning_rate': 9.912666036070045e-06, 'epoch': 0.09} 9%|▉ | 3010/34278 [3:20:27<36:48:16, 4.24s/it] 9%|▉ | 3011/34278 [3:20:31<35:09:13, 4.05s/it] {'loss': 0.1874, 'grad_norm': 0.874233790713194, 'learning_rate': 9.912578100126883e-06, 'epoch': 0.09} 9%|▉ | 3011/34278 [3:20:31<35:09:13, 4.05s/it] 9%|▉ | 3012/34278 [3:20:34<32:58:51, 3.80s/it] {'loss': 0.2016, 'grad_norm': 0.8595788089308047, 'learning_rate': 9.912490120325394e-06, 'epoch': 0.09} 9%|▉ | 3012/34278 [3:20:34<32:58:51, 3.80s/it] 9%|▉ | 3013/34278 [3:20:37<31:34:07, 3.63s/it] {'loss': 0.1904, 'grad_norm': 0.9981321545014091, 'learning_rate': 9.912402096666367e-06, 'epoch': 0.09} 9%|▉ | 3013/34278 [3:20:37<31:34:07, 3.63s/it] 9%|▉ | 3014/34278 [3:20:44<38:26:29, 4.43s/it] {'loss': 0.1721, 'grad_norm': 0.9073995176985775, 'learning_rate': 9.912314029150586e-06, 'epoch': 0.09} 9%|▉ | 3014/34278 [3:20:44<38:26:29, 4.43s/it] 9%|▉ | 3015/34278 [3:20:47<34:39:28, 3.99s/it] {'loss': 0.1936, 'grad_norm': 0.9292271893163728, 'learning_rate': 9.912225917778838e-06, 'epoch': 0.09} 9%|▉ | 3015/34278 [3:20:47<34:39:28, 3.99s/it] 9%|▉ | 3016/34278 [3:20:50<31:40:50, 3.65s/it] {'loss': 0.195, 'grad_norm': 0.9206644134436318, 'learning_rate': 9.91213776255191e-06, 'epoch': 0.09} 9%|▉ | 3016/34278 [3:20:50<31:40:50, 3.65s/it] 9%|▉ | 3017/34278 [3:20:56<39:10:53, 4.51s/it] {'loss': 0.2245, 'grad_norm': 0.8990250912326241, 'learning_rate': 9.912049563470589e-06, 'epoch': 0.09} 9%|▉ | 3017/34278 [3:20:56<39:10:53, 4.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3018/34278 [3:21:02<43:06:21, 4.96s/it] {'loss': 0.1705, 'grad_norm': 0.9241519313027056, 'learning_rate': 9.91196132053566e-06, 'epoch': 0.09} 9%|▉ | 3018/34278 [3:21:02<43:06:21, 4.96s/it] 9%|▉ | 3019/34278 [3:21:05<38:56:44, 4.49s/it] {'loss': 0.2189, 'grad_norm': 0.9298534231910399, 'learning_rate': 9.911873033747916e-06, 'epoch': 0.09} 9%|▉ | 3019/34278 [3:21:05<38:56:44, 4.49s/it] 9%|▉ | 3020/34278 [3:21:09<36:29:23, 4.20s/it] {'loss': 0.1865, 'grad_norm': 0.7907726777360939, 'learning_rate': 9.911784703108141e-06, 'epoch': 0.09} 9%|▉ | 3020/34278 [3:21:09<36:29:23, 4.20s/it] 9%|▉ | 3021/34278 [3:21:12<33:44:22, 3.89s/it] {'loss': 0.206, 'grad_norm': 1.0804125823083952, 'learning_rate': 9.911696328617126e-06, 'epoch': 0.09} 9%|▉ | 3021/34278 [3:21:12<33:44:22, 3.89s/it] 9%|▉ | 3022/34278 [3:21:16<34:10:19, 3.94s/it] {'loss': 0.1666, 'grad_norm': 0.7719793440825188, 'learning_rate': 9.911607910275655e-06, 'epoch': 0.09} 9%|▉ | 3022/34278 [3:21:16<34:10:19, 3.94s/it] 9%|▉ | 3023/34278 [3:21:19<31:42:53, 3.65s/it] {'loss': 0.2015, 'grad_norm': 0.9937527633913799, 'learning_rate': 9.911519448084526e-06, 'epoch': 0.09} 9%|▉ | 3023/34278 [3:21:19<31:42:53, 3.65s/it] 9%|▉ | 3024/34278 [3:21:22<29:55:20, 3.45s/it] {'loss': 0.236, 'grad_norm': 0.9353945419009075, 'learning_rate': 9.91143094204452e-06, 'epoch': 0.09} 9%|▉ | 3024/34278 [3:21:22<29:55:20, 3.45s/it] 9%|▉ | 3025/34278 [3:21:28<36:42:14, 4.23s/it] {'loss': 0.1743, 'grad_norm': 1.0133951671553068, 'learning_rate': 9.911342392156432e-06, 'epoch': 0.09} 9%|▉ | 3025/34278 [3:21:28<36:42:14, 4.23s/it] 9%|▉ | 3026/34278 [3:21:35<42:23:06, 4.88s/it] {'loss': 0.2082, 'grad_norm': 0.989671455261747, 'learning_rate': 9.911253798421051e-06, 'epoch': 0.09} 9%|▉ | 3026/34278 [3:21:35<42:23:06, 4.88s/it] 9%|▉ | 3027/34278 [3:21:38<39:29:25, 4.55s/it] {'loss': 0.1768, 'grad_norm': 1.2663908906167805, 'learning_rate': 9.91116516083917e-06, 'epoch': 0.09} 9%|▉ | 3027/34278 
[3:21:38<39:29:25, 4.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3028/34278 [3:21:41<35:14:49, 4.06s/it] {'loss': 0.1738, 'grad_norm': 0.8667957135089173, 'learning_rate': 9.911076479411578e-06, 'epoch': 0.09} 9%|▉ | 3028/34278 [3:21:41<35:14:49, 4.06s/it] 9%|▉ | 3029/34278 [3:21:45<33:38:47, 3.88s/it] {'loss': 0.2223, 'grad_norm': 1.0614484471945946, 'learning_rate': 9.910987754139067e-06, 'epoch': 0.09} 9%|▉ | 3029/34278 [3:21:45<33:38:47, 3.88s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3030/34278 [3:21:51<38:39:50, 4.45s/it] {'loss': 0.1865, 'grad_norm': 1.0329687572312753, 'learning_rate': 9.91089898502243e-06, 'epoch': 0.09} 9%|▉ | 3030/34278 [3:21:51<38:39:50, 4.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3031/34278 [3:21:54<35:34:00, 4.10s/it] {'loss': 0.1996, 'grad_norm': 1.067140489676507, 'learning_rate': 9.910810172062462e-06, 'epoch': 0.09} 9%|▉ | 3031/34278 [3:21:54<35:34:00, 4.10s/it] 9%|▉ | 3032/34278 [3:21:57<32:30:50, 3.75s/it] {'loss': 0.1864, 'grad_norm': 0.96701487872926, 'learning_rate': 9.91072131525995e-06, 'epoch': 0.09} 9%|▉ | 3032/34278 [3:21:57<32:30:50, 3.75s/it] 9%|▉ | 3033/34278 [3:22:00<31:53:17, 3.67s/it] {'loss': 0.2033, 'grad_norm': 0.9739058647898851, 'learning_rate': 9.910632414615691e-06, 'epoch': 0.09} 9%|▉ | 3033/34278 [3:22:00<31:53:17, 3.67s/it] 9%|▉ | 3034/34278 [3:22:05<35:15:15, 4.06s/it] {'loss': 0.2031, 'grad_norm': 0.9240948823187121, 'learning_rate': 9.910543470130478e-06, 'epoch': 0.09} 9%|▉ | 3034/34278 [3:22:05<35:15:15, 4.06s/it] 9%|▉ | 3035/34278 [3:22:08<32:37:25, 3.76s/it] {'loss': 0.1894, 'grad_norm': 1.0302705675412827, 'learning_rate': 9.910454481805105e-06, 'epoch': 0.09} 9%|▉ | 3035/34278 [3:22:08<32:37:25, 3.76s/it] 9%|▉ | 3036/34278 [3:22:12<31:33:20, 3.64s/it] {'loss': 0.2008, 'grad_norm': 1.0976861082356608, 'learning_rate': 9.910365449640367e-06, 'epoch': 0.09} 9%|▉ | 3036/34278 [3:22:12<31:33:20, 3.64s/it] 9%|▉ | 3037/34278 [3:22:14<29:31:05, 3.40s/it] {'loss': 0.192, 'grad_norm': 0.9165697419348509, 'learning_rate': 9.910276373637058e-06, 'epoch': 0.09} 9%|▉ | 3037/34278 [3:22:14<29:31:05, 3.40s/it] 9%|▉ | 3038/34278 [3:22:19<31:13:02, 3.60s/it] {'loss': 0.2065, 'grad_norm': 1.4118806773504327, 'learning_rate': 9.910187253795974e-06, 'epoch': 0.09} 9%|▉ | 3038/34278 [3:22:19<31:13:02, 3.60s/it] 9%|▉ | 3039/34278 [3:22:22<31:52:42, 3.67s/it] {'loss': 0.1819, 'grad_norm': 0.8733323283579463, 'learning_rate': 9.91009809011791e-06, 'epoch': 0.09} 9%|▉ | 3039/34278 [3:22:22<31:52:42, 3.67s/it] 9%|▉ | 3040/34278 [3:22:26<31:33:24, 3.64s/it] {'loss': 0.1768, 'grad_norm': 0.732815032517925, 'learning_rate': 9.910008882603664e-06, 'epoch': 0.09} 9%|▉ | 3040/34278 
[3:22:26<31:33:24, 3.64s/it] 9%|▉ | 3041/34278 [3:22:29<30:16:03, 3.49s/it] {'loss': 0.1973, 'grad_norm': 1.1269245047076029, 'learning_rate': 9.909919631254028e-06, 'epoch': 0.09} 9%|▉ | 3041/34278 [3:22:29<30:16:03, 3.49s/it] 9%|▉ | 3042/34278 [3:22:32<30:03:00, 3.46s/it] {'loss': 0.2164, 'grad_norm': 0.9555805411047182, 'learning_rate': 9.909830336069803e-06, 'epoch': 0.09} 9%|▉ | 3042/34278 [3:22:32<30:03:00, 3.46s/it] 9%|▉ | 3043/34278 [3:22:36<30:48:08, 3.55s/it] {'loss': 0.1821, 'grad_norm': 0.9834162396195962, 'learning_rate': 9.909740997051786e-06, 'epoch': 0.09} 9%|▉ | 3043/34278 [3:22:36<30:48:08, 3.55s/it] 9%|▉ | 3044/34278 [3:22:42<37:20:15, 4.30s/it] {'loss': 0.1889, 'grad_norm': 0.8057651821738971, 'learning_rate': 9.909651614200773e-06, 'epoch': 0.09} 9%|▉ | 3044/34278 [3:22:42<37:20:15, 4.30s/it] 9%|▉ | 3045/34278 [3:22:46<34:46:17, 4.01s/it] {'loss': 0.1914, 'grad_norm': 0.9273619873149924, 'learning_rate': 9.90956218751756e-06, 'epoch': 0.09} 9%|▉ | 3045/34278 [3:22:46<34:46:17, 4.01s/it] 9%|▉ | 3046/34278 [3:22:49<32:03:29, 3.70s/it] {'loss': 0.1796, 'grad_norm': 0.7597660008902855, 'learning_rate': 9.90947271700295e-06, 'epoch': 0.09} 9%|▉ | 3046/34278 [3:22:49<32:03:29, 3.70s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10826 > 8192). 
Running this sequence through the model will result in indexing errors 9%|▉ | 3047/34278 [3:22:53<34:37:53, 3.99s/it] {'loss': 0.1871, 'grad_norm': 0.7956731608419013, 'learning_rate': 9.909383202657739e-06, 'epoch': 0.09} 9%|▉ | 3047/34278 [3:22:53<34:37:53, 3.99s/it] 9%|▉ | 3048/34278 [3:22:56<31:41:45, 3.65s/it] {'loss': 0.1549, 'grad_norm': 0.7726968702012784, 'learning_rate': 9.909293644482727e-06, 'epoch': 0.09} 9%|▉ | 3048/34278 [3:22:56<31:41:45, 3.65s/it] 9%|▉ | 3049/34278 [3:22:59<29:52:39, 3.44s/it] {'loss': 0.1771, 'grad_norm': 0.9075519329697846, 'learning_rate': 9.909204042478713e-06, 'epoch': 0.09} 9%|▉ | 3049/34278 [3:22:59<29:52:39, 3.44s/it] 9%|▉ | 3050/34278 [3:23:05<37:01:24, 4.27s/it] {'loss': 0.1916, 'grad_norm': 0.9671623273112833, 'learning_rate': 9.9091143966465e-06, 'epoch': 0.09} 9%|▉ | 3050/34278 [3:23:05<37:01:24, 4.27s/it] 9%|▉ | 3051/34278 [3:23:09<34:49:29, 4.01s/it] {'loss': 0.2011, 'grad_norm': 0.9015163634226375, 'learning_rate': 9.909024706986881e-06, 'epoch': 0.09} 9%|▉ | 3051/34278 [3:23:09<34:49:29, 4.01s/it] 9%|▉ | 3052/34278 [3:23:12<34:13:33, 3.95s/it] {'loss': 0.1803, 'grad_norm': 0.9184962450164161, 'learning_rate': 9.908934973500664e-06, 'epoch': 0.09} 9%|▉ | 3052/34278 [3:23:12<34:13:33, 3.95s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3053/34278 [3:23:15<31:45:21, 3.66s/it] {'loss': 0.1614, 'grad_norm': 0.8450153788163894, 'learning_rate': 9.908845196188647e-06, 'epoch': 0.09} 9%|▉ | 3053/34278 [3:23:15<31:45:21, 3.66s/it] 9%|▉ | 3054/34278 [3:23:19<32:34:14, 3.76s/it] {'loss': 0.194, 'grad_norm': 0.9553291965529953, 'learning_rate': 9.908755375051631e-06, 'epoch': 0.09} 9%|▉ | 3054/34278 [3:23:19<32:34:14, 3.76s/it] 9%|▉ | 3055/34278 [3:23:25<36:46:06, 4.24s/it] {'loss': 0.1671, 'grad_norm': 0.891058644169043, 'learning_rate': 9.90866551009042e-06, 'epoch': 0.09} 9%|▉ | 3055/34278 [3:23:25<36:46:06, 4.24s/it] 9%|▉ | 3056/34278 [3:23:28<35:05:30, 4.05s/it] {'loss': 0.1813, 'grad_norm': 1.2104446913879496, 'learning_rate': 9.908575601305815e-06, 'epoch': 0.09} 9%|▉ | 3056/34278 [3:23:28<35:05:30, 4.05s/it] 9%|▉ | 3057/34278 [3:23:32<33:13:22, 3.83s/it] {'loss': 0.2193, 'grad_norm': 1.0127375435632076, 'learning_rate': 9.908485648698618e-06, 'epoch': 0.09} 9%|▉ | 3057/34278 [3:23:32<33:13:22, 3.83s/it] 9%|▉ | 3058/34278 [3:23:36<33:35:27, 3.87s/it] {'loss': 0.1938, 'grad_norm': 1.0258627847235318, 'learning_rate': 9.908395652269633e-06, 'epoch': 0.09} 9%|▉ | 3058/34278 [3:23:36<33:35:27, 3.87s/it] 9%|▉ | 3059/34278 [3:23:40<33:41:02, 3.88s/it] {'loss': 0.2049, 'grad_norm': 1.0079976664411516, 'learning_rate': 9.908305612019665e-06, 'epoch': 0.09} 9%|▉ | 3059/34278 [3:23:40<33:41:02, 3.88s/it] 9%|▉ | 3060/34278 [3:23:43<31:44:49, 3.66s/it] {'loss': 0.1988, 'grad_norm': 0.9704927433343631, 'learning_rate': 9.908215527949514e-06, 'epoch': 0.09} 9%|▉ | 3060/34278 [3:23:43<31:44:49, 3.66s/it] 9%|▉ | 3061/34278 [3:23:47<34:04:41, 3.93s/it] {'loss': 0.1726, 'grad_norm': 0.8499876970223335, 'learning_rate': 9.908125400059988e-06, 'epoch': 0.09} 9%|▉ | 3061/34278 [3:23:47<34:04:41, 3.93s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3062/34278 [3:23:51<33:01:41, 3.81s/it] {'loss': 0.1965, 'grad_norm': 0.8942257336748414, 'learning_rate': 9.908035228351888e-06, 'epoch': 0.09} 9%|▉ | 3062/34278 [3:23:51<33:01:41, 3.81s/it] 9%|▉ | 3063/34278 [3:23:54<30:33:00, 3.52s/it] {'loss': 0.1926, 'grad_norm': 0.9569510087355682, 'learning_rate': 9.907945012826022e-06, 'epoch': 0.09} 9%|▉ | 3063/34278 [3:23:54<30:33:00, 3.52s/it] 9%|▉ | 3064/34278 [3:23:57<30:23:36, 3.51s/it] {'loss': 0.2116, 'grad_norm': 1.0157032076843844, 'learning_rate': 9.907854753483194e-06, 'epoch': 0.09} 9%|▉ | 3064/34278 [3:23:57<30:23:36, 3.51s/it] 9%|▉ | 3065/34278 [3:24:02<32:58:16, 3.80s/it] {'loss': 0.192, 'grad_norm': 1.0428423346131277, 'learning_rate': 9.907764450324213e-06, 'epoch': 0.09} 9%|▉ | 3065/34278 [3:24:02<32:58:16, 3.80s/it] 9%|▉ | 3066/34278 [3:24:05<31:48:37, 3.67s/it] {'loss': 0.1769, 'grad_norm': 0.9531557202518242, 'learning_rate': 9.90767410334988e-06, 'epoch': 0.09} 9%|▉ | 3066/34278 [3:24:05<31:48:37, 3.67s/it] 9%|▉ | 3067/34278 [3:24:11<38:08:52, 4.40s/it] {'loss': 0.1977, 'grad_norm': 0.9251978560430241, 'learning_rate': 9.907583712561007e-06, 'epoch': 0.09} 9%|▉ | 3067/34278 [3:24:11<38:08:52, 4.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3068/34278 [3:24:14<35:25:40, 4.09s/it] {'loss': 0.1731, 'grad_norm': 0.8586274452048444, 'learning_rate': 9.907493277958395e-06, 'epoch': 0.09} 9%|▉ | 3068/34278 [3:24:14<35:25:40, 4.09s/it] 9%|▉ | 3069/34278 [3:24:18<33:25:06, 3.85s/it] {'loss': 0.2121, 'grad_norm': 0.9827735005065518, 'learning_rate': 9.907402799542856e-06, 'epoch': 0.09} 9%|▉ | 3069/34278 [3:24:18<33:25:06, 3.85s/it] 9%|▉ | 3070/34278 [3:24:21<31:47:25, 3.67s/it] {'loss': 0.18, 'grad_norm': 0.9753572676253564, 'learning_rate': 9.907312277315196e-06, 'epoch': 0.09} 9%|▉ | 3070/34278 [3:24:21<31:47:25, 3.67s/it] 9%|▉ | 3071/34278 [3:24:24<30:28:35, 3.52s/it] {'loss': 0.1682, 'grad_norm': 0.8210386693880609, 'learning_rate': 9.907221711276224e-06, 'epoch': 0.09} 9%|▉ | 3071/34278 [3:24:24<30:28:35, 3.52s/it] 9%|▉ | 3072/34278 [3:24:28<30:27:09, 3.51s/it] {'loss': 0.1851, 'grad_norm': 0.8229563047627599, 'learning_rate': 9.907131101426748e-06, 'epoch': 0.09} 9%|▉ | 3072/34278 [3:24:28<30:27:09, 3.51s/it] 9%|▉ | 3073/34278 [3:24:31<29:00:36, 3.35s/it] {'loss': 0.193, 'grad_norm': 1.1028219063692366, 'learning_rate': 9.907040447767575e-06, 'epoch': 0.09} 9%|▉ | 3073/34278 [3:24:31<29:00:36, 3.35s/it] 9%|▉ | 3074/34278 [3:24:34<27:53:05, 3.22s/it] {'loss': 0.1843, 'grad_norm': 0.9994190165962931, 'learning_rate': 9.906949750299519e-06, 'epoch': 0.09} 9%|▉ | 3074/34278 [3:24:34<27:53:05, 3.22s/it] 9%|▉ | 3075/34278 [3:24:37<27:51:05, 3.21s/it] {'loss': 0.1813, 'grad_norm': 0.8739944914336276, 'learning_rate': 9.906859009023386e-06, 'epoch': 0.09} 9%|▉ | 3075/34278 [3:24:37<27:51:05, 3.21s/it] 9%|▉ | 3076/34278 [3:24:41<29:57:58, 3.46s/it] {'loss': 0.1986, 'grad_norm': 0.9650162467973781, 'learning_rate': 9.906768223939986e-06, 'epoch': 0.09} 9%|▉ | 3076/34278 [3:24:41<29:57:58, 3.46s/it] 9%|▉ | 3077/34278 [3:24:44<30:13:25, 3.49s/it] {'loss': 0.2227, 'grad_norm': 1.203808331887953, 'learning_rate': 9.906677395050132e-06, 'epoch': 0.09} 9%|▉ | 3077/34278 
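[editor's note] The `torch.utils.checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None` lines repeated throughout this log typically appear when gradient checkpointing wraps a module whose inputs are all detached, e.g. a frozen vision tower or frozen embeddings. In `transformers`, the usual fix is `model.enable_input_require_grads()` after enabling gradient checkpointing. The snippet below is a minimal torch-only illustration of the mechanism (names and shapes are hypothetical, not from this training code): forcing the frozen module's output to require grad lets the checkpointed backward pass produce real gradients.

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

emb = nn.Embedding(10, 4)
emb.weight.requires_grad_(False)   # frozen, like a frozen tower/embedding
layer = nn.Linear(4, 4)            # trainable checkpointed module

x = emb(torch.tensor([1, 2, 3]))   # requires_grad=False -> would trigger the warning
x.requires_grad_(True)             # what enable_input_require_grads() arranges via a hook

y = checkpoint(layer, x, use_reentrant=True).sum()
y.backward()
```

If the warning is expected (everything upstream of the checkpointed block really is frozen), it is harmless noise; it only signals lost gradients when the checkpointed block itself holds trainable parameters fed purely by detached inputs under the reentrant implementation.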
[3:24:44<30:13:25, 3.49s/it] 9%|▉ | 3078/34278 [3:24:47<29:16:02, 3.38s/it] {'loss': 0.1764, 'grad_norm': 0.9256040756127788, 'learning_rate': 9.906586522354633e-06, 'epoch': 0.09} 9%|▉ | 3078/34278 [3:24:47<29:16:02, 3.38s/it] 9%|▉ | 3079/34278 [3:24:51<29:13:31, 3.37s/it] {'loss': 0.1918, 'grad_norm': 0.9056929166129613, 'learning_rate': 9.9064956058543e-06, 'epoch': 0.09} 9%|▉ | 3079/34278 [3:24:51<29:13:31, 3.37s/it] 9%|▉ | 3080/34278 [3:24:54<29:03:04, 3.35s/it] {'loss': 0.1699, 'grad_norm': 0.925389887214733, 'learning_rate': 9.906404645549947e-06, 'epoch': 0.09} 9%|▉ | 3080/34278 [3:24:54<29:03:04, 3.35s/it] 9%|▉ | 3081/34278 [3:24:59<32:32:22, 3.75s/it] {'loss': 0.1714, 'grad_norm': 1.0533964757737913, 'learning_rate': 9.906313641442385e-06, 'epoch': 0.09} 9%|▉ | 3081/34278 [3:24:59<32:32:22, 3.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3082/34278 [3:25:02<31:10:10, 3.60s/it] {'loss': 0.2107, 'grad_norm': 1.0862831572218263, 'learning_rate': 9.906222593532424e-06, 'epoch': 0.09} 9%|▉ | 3082/34278 [3:25:02<31:10:10, 3.60s/it] 9%|▉ | 3083/34278 [3:25:06<30:49:40, 3.56s/it] {'loss': 0.1854, 'grad_norm': 0.8008642476942929, 'learning_rate': 9.906131501820881e-06, 'epoch': 0.09} 9%|▉ | 3083/34278 [3:25:06<30:49:40, 3.56s/it] 9%|▉ | 3084/34278 [3:25:09<31:50:31, 3.67s/it] {'loss': 0.1836, 'grad_norm': 1.3107928344527915, 'learning_rate': 9.906040366308565e-06, 'epoch': 0.09} 9%|▉ | 3084/34278 [3:25:09<31:50:31, 3.67s/it] 9%|▉ | 3085/34278 [3:25:12<29:01:27, 3.35s/it] {'loss': 0.1997, 'grad_norm': 1.1093016703702059, 'learning_rate': 9.905949186996293e-06, 'epoch': 0.09} 9%|▉ | 3085/34278 [3:25:12<29:01:27, 3.35s/it] 9%|▉ | 3086/34278 [3:25:16<29:35:10, 3.41s/it] {'loss': 0.1899, 'grad_norm': 1.219609482626167, 'learning_rate': 9.905857963884878e-06, 'epoch': 0.09} 9%|▉ | 3086/34278 [3:25:16<29:35:10, 3.41s/it] 9%|▉ | 3087/34278 [3:25:19<29:53:58, 3.45s/it] {'loss': 0.1925, 'grad_norm': 0.8906354180218126, 'learning_rate': 9.905766696975134e-06, 'epoch': 0.09} 9%|▉ | 3087/34278 [3:25:19<29:53:58, 3.45s/it] 9%|▉ | 3088/34278 [3:25:25<36:43:06, 4.24s/it] {'loss': 0.2051, 'grad_norm': 0.8998464826192879, 'learning_rate': 9.905675386267877e-06, 'epoch': 0.09} 9%|▉ | 3088/34278 [3:25:25<36:43:06, 4.24s/it] 9%|▉ | 3089/34278 [3:25:29<35:06:11, 4.05s/it] {'loss': 0.1956, 'grad_norm': 1.0311402053101608, 'learning_rate': 9.90558403176392e-06, 'epoch': 0.09} 9%|▉ | 3089/34278 [3:25:29<35:06:11, 4.05s/it] 9%|▉ | 3090/34278 [3:25:34<37:57:52, 4.38s/it] {'loss': 0.2266, 'grad_norm': 1.046837754144681, 'learning_rate': 9.90549263346408e-06, 'epoch': 0.09} 9%|▉ | 3090/34278 [3:25:34<37:57:52, 4.38s/it] 9%|▉ | 3091/34278 [3:25:38<36:49:06, 4.25s/it] {'loss': 0.2041, 'grad_norm': 0.9232934703463918, 'learning_rate': 9.905401191369172e-06, 'epoch': 0.09} 9%|▉ | 3091/34278 
[3:25:38<36:49:06, 4.25s/it] 9%|▉ | 3092/34278 [3:25:44<40:27:45, 4.67s/it] {'loss': 0.2204, 'grad_norm': 0.929324485804581, 'learning_rate': 9.905309705480014e-06, 'epoch': 0.09} 9%|▉ | 3092/34278 [3:25:44<40:27:45, 4.67s/it] 9%|▉ | 3093/34278 [3:25:47<37:01:05, 4.27s/it] {'loss': 0.1999, 'grad_norm': 0.7903364123471041, 'learning_rate': 9.905218175797421e-06, 'epoch': 0.09} 9%|▉ | 3093/34278 [3:25:47<37:01:05, 4.27s/it] 9%|▉ | 3094/34278 [3:25:51<35:40:10, 4.12s/it] {'loss': 0.1671, 'grad_norm': 1.137748834396601, 'learning_rate': 9.905126602322212e-06, 'epoch': 0.09} 9%|▉ | 3094/34278 [3:25:51<35:40:10, 4.12s/it] 9%|▉ | 3095/34278 [3:25:54<33:55:32, 3.92s/it] {'loss': 0.1846, 'grad_norm': 1.044897327008593, 'learning_rate': 9.905034985055205e-06, 'epoch': 0.09} 9%|▉ | 3095/34278 [3:25:54<33:55:32, 3.92s/it] 9%|▉ | 3096/34278 [3:25:57<31:00:41, 3.58s/it] {'loss': 0.1588, 'grad_norm': 0.7780612022090226, 'learning_rate': 9.904943323997216e-06, 'epoch': 0.09} 9%|▉ | 3096/34278 [3:25:57<31:00:41, 3.58s/it] 9%|▉ | 3097/34278 [3:26:00<29:24:26, 3.40s/it] {'loss': 0.1766, 'grad_norm': 0.8476472221427317, 'learning_rate': 9.904851619149063e-06, 'epoch': 0.09} 9%|▉ | 3097/34278 [3:26:00<29:24:26, 3.40s/it] 9%|▉ | 3098/34278 [3:26:03<27:35:01, 3.18s/it] {'loss': 0.165, 'grad_norm': 0.922397270879313, 'learning_rate': 9.904759870511564e-06, 'epoch': 0.09} 9%|▉ | 3098/34278 [3:26:03<27:35:01, 3.18s/it] 9%|▉ | 3099/34278 [3:26:06<26:53:10, 3.10s/it] {'loss': 0.2115, 'grad_norm': 0.9510848219792485, 'learning_rate': 9.904668078085543e-06, 'epoch': 0.09} 9%|▉ | 3099/34278 [3:26:06<26:53:10, 3.10s/it] 9%|▉ | 3100/34278 [3:26:09<27:03:35, 3.12s/it] {'loss': 0.1911, 'grad_norm': 0.8949759049240087, 'learning_rate': 9.904576241871814e-06, 'epoch': 0.09} 9%|▉ | 3100/34278 [3:26:09<27:03:35, 3.12s/it] 9%|▉ | 3101/34278 [3:26:12<28:35:18, 3.30s/it] {'loss': 0.1785, 'grad_norm': 1.4688418395431837, 'learning_rate': 9.9044843618712e-06, 'epoch': 0.09} 9%|▉ | 3101/34278 
[3:26:12<28:35:18, 3.30s/it] 9%|▉ | 3102/34278 [3:26:16<29:35:57, 3.42s/it] {'loss': 0.2037, 'grad_norm': 0.7970945369810468, 'learning_rate': 9.90439243808452e-06, 'epoch': 0.09} 9%|▉ | 3102/34278 [3:26:16<29:35:57, 3.42s/it] 9%|▉ | 3103/34278 [3:26:19<28:34:29, 3.30s/it] {'loss': 0.1704, 'grad_norm': 0.8455029656333727, 'learning_rate': 9.904300470512595e-06, 'epoch': 0.09} 9%|▉ | 3103/34278 [3:26:19<28:34:29, 3.30s/it] 9%|▉ | 3104/34278 [3:26:22<27:41:25, 3.20s/it] {'loss': 0.1919, 'grad_norm': 0.9547802543491765, 'learning_rate': 9.904208459156247e-06, 'epoch': 0.09} 9%|▉ | 3104/34278 [3:26:22<27:41:25, 3.20s/it] 9%|▉ | 3105/34278 [3:26:26<28:31:55, 3.30s/it] {'loss': 0.2218, 'grad_norm': 1.006677853859439, 'learning_rate': 9.904116404016296e-06, 'epoch': 0.09} 9%|▉ | 3105/34278 [3:26:26<28:31:55, 3.30s/it] 9%|▉ | 3106/34278 [3:26:29<28:17:09, 3.27s/it] {'loss': 0.1964, 'grad_norm': 0.8237642844111117, 'learning_rate': 9.904024305093564e-06, 'epoch': 0.09} 9%|▉ | 3106/34278 [3:26:29<28:17:09, 3.27s/it] 9%|▉ | 3107/34278 [3:26:32<27:16:52, 3.15s/it] {'loss': 0.2008, 'grad_norm': 0.8582909828870113, 'learning_rate': 9.903932162388875e-06, 'epoch': 0.09} 9%|▉ | 3107/34278 [3:26:32<27:16:52, 3.15s/it] 9%|▉ | 3108/34278 [3:26:35<28:25:28, 3.28s/it] {'loss': 0.1964, 'grad_norm': 0.7976472158239346, 'learning_rate': 9.903839975903049e-06, 'epoch': 0.09} 9%|▉ | 3108/34278 [3:26:35<28:25:28, 3.28s/it] 9%|▉ | 3109/34278 [3:26:39<30:26:42, 3.52s/it] {'loss': 0.2018, 'grad_norm': 0.8758348784855697, 'learning_rate': 9.903747745636912e-06, 'epoch': 0.09} 9%|▉ | 3109/34278 [3:26:39<30:26:42, 3.52s/it] 9%|▉ | 3110/34278 [3:26:45<37:12:44, 4.30s/it] {'loss': 0.1821, 'grad_norm': 0.9311384467244032, 'learning_rate': 9.903655471591285e-06, 'epoch': 0.09} 9%|▉ | 3110/34278 [3:26:45<37:12:44, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3111/34278 [3:26:49<34:47:49, 4.02s/it] {'loss': 0.1604, 'grad_norm': 0.8383641189741342, 'learning_rate': 9.903563153766992e-06, 'epoch': 0.09} 9%|▉ | 3111/34278 [3:26:49<34:47:49, 4.02s/it] 9%|▉ | 3112/34278 [3:26:52<33:01:43, 3.82s/it] {'loss': 0.1886, 'grad_norm': 0.7450909120524571, 'learning_rate': 9.903470792164857e-06, 'epoch': 0.09} 9%|▉ | 3112/34278 [3:26:52<33:01:43, 3.82s/it] 9%|▉ | 3113/34278 [3:26:58<39:07:58, 4.52s/it] {'loss': 0.2151, 'grad_norm': 0.9430597561494886, 'learning_rate': 9.903378386785707e-06, 'epoch': 0.09} 9%|▉ | 3113/34278 [3:26:58<39:07:58, 4.52s/it] 9%|▉ | 3114/34278 [3:27:02<36:15:26, 4.19s/it] {'loss': 0.1744, 'grad_norm': 1.0206645907134217, 'learning_rate': 9.903285937630364e-06, 'epoch': 0.09} 9%|▉ | 3114/34278 [3:27:02<36:15:26, 4.19s/it] 9%|▉ | 3115/34278 [3:27:06<36:07:23, 4.17s/it] {'loss': 0.1908, 'grad_norm': 0.8434807430942434, 'learning_rate': 9.903193444699656e-06, 'epoch': 0.09} 9%|▉ | 3115/34278 [3:27:06<36:07:23, 4.17s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3116/34278 [3:27:11<37:43:26, 4.36s/it] {'loss': 0.2084, 'grad_norm': 0.9500140411201377, 'learning_rate': 9.903100907994407e-06, 'epoch': 0.09} 9%|▉ | 3116/34278 [3:27:11<37:43:26, 4.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3117/34278 [3:27:14<34:08:04, 3.94s/it] {'loss': 0.1884, 'grad_norm': 0.837192644453322, 'learning_rate': 9.903008327515442e-06, 'epoch': 0.09} 9%|▉ | 3117/34278 [3:27:14<34:08:04, 3.94s/it] 9%|▉ | 3118/34278 [3:27:17<32:06:10, 3.71s/it] {'loss': 0.1848, 'grad_norm': 0.8378672650614899, 'learning_rate': 9.902915703263591e-06, 'epoch': 0.09} 9%|▉ | 3118/34278 [3:27:17<32:06:10, 3.71s/it] 9%|▉ | 3119/34278 [3:27:23<38:20:18, 4.43s/it] {'loss': 0.1759, 'grad_norm': 0.9663374647208482, 'learning_rate': 9.902823035239678e-06, 'epoch': 0.09} 9%|▉ | 3119/34278 [3:27:23<38:20:18, 4.43s/it] 9%|▉ | 3120/34278 [3:27:26<35:31:17, 4.10s/it] {'loss': 0.2047, 'grad_norm': 0.841080490878279, 'learning_rate': 9.902730323444531e-06, 'epoch': 0.09} 9%|▉ | 3120/34278 [3:27:26<35:31:17, 4.10s/it] 9%|▉ | 3121/34278 [3:27:29<33:02:14, 3.82s/it] {'loss': 0.1889, 'grad_norm': 0.9822202779269245, 'learning_rate': 9.902637567878979e-06, 'epoch': 0.09} 9%|▉ | 3121/34278 [3:27:29<33:02:14, 3.82s/it] 9%|▉ | 3122/34278 [3:27:34<35:06:16, 4.06s/it] {'loss': 0.1784, 'grad_norm': 0.8996420337880603, 'learning_rate': 9.90254476854385e-06, 'epoch': 0.09} 9%|▉ | 3122/34278 [3:27:34<35:06:16, 4.06s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3123/34278 [3:27:37<32:04:36, 3.71s/it] {'loss': 0.1851, 'grad_norm': 1.3390013111690215, 'learning_rate': 9.90245192543997e-06, 'epoch': 0.09} 9%|▉ | 3123/34278 [3:27:37<32:04:36, 3.71s/it] 9%|▉ | 3124/34278 [3:27:40<29:57:14, 3.46s/it] {'loss': 0.192, 'grad_norm': 0.8384136121624998, 'learning_rate': 9.90235903856817e-06, 'epoch': 0.09} 9%|▉ | 3124/34278 [3:27:40<29:57:14, 3.46s/it] 9%|▉ | 3125/34278 [3:27:44<31:11:35, 3.60s/it] {'loss': 0.1807, 'grad_norm': 0.8420761828985089, 'learning_rate': 9.902266107929279e-06, 'epoch': 0.09} 9%|▉ | 3125/34278 [3:27:44<31:11:35, 3.60s/it] 9%|▉ | 3126/34278 [3:27:47<30:57:07, 3.58s/it] {'loss': 0.1902, 'grad_norm': 0.8803474977758997, 'learning_rate': 9.902173133524125e-06, 'epoch': 0.09} 9%|▉ | 3126/34278 [3:27:47<30:57:07, 3.58s/it] 9%|▉ | 3127/34278 [3:27:51<30:10:39, 3.49s/it] {'loss': 0.2006, 'grad_norm': 0.9323609834892589, 'learning_rate': 9.902080115353541e-06, 'epoch': 0.09} 9%|▉ | 3127/34278 [3:27:51<30:10:39, 3.49s/it] 9%|▉ | 3128/34278 [3:27:56<36:32:26, 4.22s/it] {'loss': 0.1729, 'grad_norm': 0.9239092626296833, 'learning_rate': 9.901987053418355e-06, 'epoch': 0.09} 9%|▉ | 3128/34278 [3:27:56<36:32:26, 4.22s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3129/34278 [3:28:00<33:40:03, 3.89s/it] {'loss': 0.1734, 'grad_norm': 0.9192162854775283, 'learning_rate': 9.901893947719401e-06, 'epoch': 0.09} 9%|▉ | 3129/34278 [3:28:00<33:40:03, 3.89s/it] 9%|▉ | 3130/34278 [3:28:05<36:53:44, 4.26s/it] {'loss': 0.2072, 'grad_norm': 1.1515306150746933, 'learning_rate': 9.901800798257506e-06, 'epoch': 0.09} 9%|▉ | 3130/34278 [3:28:05<36:53:44, 4.26s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3131/34278 [3:28:10<38:53:06, 4.49s/it] {'loss': 0.1883, 'grad_norm': 0.9095002771703233, 'learning_rate': 9.901707605033504e-06, 'epoch': 0.09} 9%|▉ | 3131/34278 [3:28:10<38:53:06, 4.49s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3132/34278 [3:28:13<34:39:07, 4.01s/it] {'loss': 0.1689, 'grad_norm': 0.9494728333330158, 'learning_rate': 9.901614368048226e-06, 'epoch': 0.09} 9%|▉ | 3132/34278 [3:28:13<34:39:07, 4.01s/it] 9%|▉ | 3133/34278 [3:28:16<32:05:07, 3.71s/it] {'loss': 0.1848, 'grad_norm': 1.0741251960370346, 'learning_rate': 9.901521087302508e-06, 'epoch': 0.09} 9%|▉ | 3133/34278 [3:28:16<32:05:07, 3.71s/it] 9%|▉ | 3134/34278 [3:28:19<31:15:56, 3.61s/it] {'loss': 0.201, 'grad_norm': 0.9276160324018725, 'learning_rate': 9.901427762797176e-06, 'epoch': 0.09} 9%|▉ | 3134/34278 [3:28:19<31:15:56, 3.61s/it] 9%|▉ | 3135/34278 [3:28:22<29:19:31, 3.39s/it] {'loss': 0.1815, 'grad_norm': 0.9020587477903261, 'learning_rate': 9.901334394533069e-06, 'epoch': 0.09} 9%|▉ | 3135/34278 [3:28:22<29:19:31, 3.39s/it] 9%|▉ | 3136/34278 [3:28:28<35:53:34, 4.15s/it] {'loss': 0.1853, 'grad_norm': 0.9543040013265817, 'learning_rate': 9.901240982511017e-06, 'epoch': 0.09} 9%|▉ | 3136/34278 [3:28:28<35:53:34, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3137/34278 [3:28:31<33:42:41, 3.90s/it] {'loss': 0.2072, 'grad_norm': 0.8074048966683778, 'learning_rate': 9.901147526731857e-06, 'epoch': 0.09} 9%|▉ | 3137/34278 [3:28:31<33:42:41, 3.90s/it] 9%|▉ | 3138/34278 [3:28:34<32:06:33, 3.71s/it] {'loss': 0.1705, 'grad_norm': 0.8168688399715941, 'learning_rate': 9.901054027196422e-06, 'epoch': 0.09} 9%|▉ | 3138/34278 [3:28:34<32:06:33, 3.71s/it] 9%|▉ | 3139/34278 [3:28:41<38:41:03, 4.47s/it] {'loss': 0.1854, 'grad_norm': 0.9696701967770042, 'learning_rate': 9.900960483905546e-06, 'epoch': 0.09} 9%|▉ | 3139/34278 [3:28:41<38:41:03, 4.47s/it] 9%|▉ | 3140/34278 [3:28:44<35:38:37, 4.12s/it] {'loss': 0.1764, 'grad_norm': 0.8608991500956572, 'learning_rate': 9.900866896860066e-06, 'epoch': 0.09} 9%|▉ | 3140/34278 [3:28:44<35:38:37, 4.12s/it] 9%|▉ | 3141/34278 [3:28:47<32:45:51, 3.79s/it] {'loss': 0.2001, 'grad_norm': 0.9563370412441334, 'learning_rate': 9.900773266060814e-06, 'epoch': 0.09} 9%|▉ | 3141/34278 [3:28:47<32:45:51, 3.79s/it] 9%|▉ | 3142/34278 [3:28:50<30:46:37, 3.56s/it] {'loss': 0.2042, 'grad_norm': 1.0622492285855472, 'learning_rate': 9.90067959150863e-06, 'epoch': 0.09} 9%|▉ | 3142/34278 [3:28:50<30:46:37, 3.56s/it] 9%|▉ | 3143/34278 [3:28:53<30:31:22, 3.53s/it] {'loss': 0.207, 'grad_norm': 0.980733770469344, 'learning_rate': 9.90058587320435e-06, 'epoch': 0.09} 9%|▉ | 3143/34278 [3:28:53<30:31:22, 3.53s/it] 9%|▉ | 3144/34278 [3:28:58<32:03:26, 3.71s/it] {'loss': 0.2022, 'grad_norm': 0.9548998232870056, 'learning_rate': 9.900492111148804e-06, 'epoch': 0.09} 9%|▉ | 3144/34278 [3:28:58<32:03:26, 3.71s/it] 9%|▉ | 3145/34278 [3:29:01<30:43:34, 3.55s/it] {'loss': 0.1978, 'grad_norm': 1.1523984929924487, 'learning_rate': 9.900398305342838e-06, 'epoch': 0.09} 9%|▉ | 3145/34278 [3:29:01<30:43:34, 3.55s/it] 9%|▉ | 3146/34278 [3:29:04<28:52:16, 3.34s/it] {'loss': 0.1852, 'grad_norm': 0.9496215587216399, 'learning_rate': 9.900304455787285e-06, 'epoch': 0.09} 9%|▉ | 3146/34278 
[3:29:04<28:52:16, 3.34s/it] 9%|▉ | 3147/34278 [3:29:07<29:15:18, 3.38s/it] {'loss': 0.1993, 'grad_norm': 1.0291303492810868, 'learning_rate': 9.900210562482985e-06, 'epoch': 0.09} 9%|▉ | 3147/34278 [3:29:07<29:15:18, 3.38s/it] 9%|▉ | 3148/34278 [3:29:13<36:02:07, 4.17s/it] {'loss': 0.193, 'grad_norm': 1.1989672928651198, 'learning_rate': 9.900116625430774e-06, 'epoch': 0.09} 9%|▉ | 3148/34278 [3:29:13<36:02:07, 4.17s/it] 9%|▉ | 3149/34278 [3:29:16<33:12:11, 3.84s/it] {'loss': 0.2284, 'grad_norm': 0.9900630480878376, 'learning_rate': 9.90002264463149e-06, 'epoch': 0.09} 9%|▉ | 3149/34278 [3:29:16<33:12:11, 3.84s/it] 9%|▉ | 3150/34278 [3:29:19<31:33:38, 3.65s/it] {'loss': 0.196, 'grad_norm': 1.0363159213300737, 'learning_rate': 9.899928620085975e-06, 'epoch': 0.09} 9%|▉ | 3150/34278 [3:29:19<31:33:38, 3.65s/it] 9%|▉ | 3151/34278 [3:29:23<32:02:01, 3.70s/it] {'loss': 0.1909, 'grad_norm': 1.089681182274323, 'learning_rate': 9.899834551795066e-06, 'epoch': 0.09} 9%|▉ | 3151/34278 [3:29:23<32:02:01, 3.70s/it] 9%|▉ | 3152/34278 [3:29:26<30:01:45, 3.47s/it] {'loss': 0.1828, 'grad_norm': 1.1716790411151727, 'learning_rate': 9.899740439759605e-06, 'epoch': 0.09} 9%|▉ | 3152/34278 [3:29:26<30:01:45, 3.47s/it] 9%|▉ | 3153/34278 [3:29:29<29:07:36, 3.37s/it] {'loss': 0.1781, 'grad_norm': 0.8815311348464451, 'learning_rate': 9.899646283980432e-06, 'epoch': 0.09} 9%|▉ | 3153/34278 [3:29:29<29:07:36, 3.37s/it] 9%|▉ | 3154/34278 [3:29:32<28:33:09, 3.30s/it] {'loss': 0.2, 'grad_norm': 0.8823874358845932, 'learning_rate': 9.899552084458383e-06, 'epoch': 0.09} 9%|▉ | 3154/34278 [3:29:32<28:33:09, 3.30s/it] 9%|▉ | 3155/34278 [3:29:36<29:33:57, 3.42s/it] {'loss': 0.1835, 'grad_norm': 1.0980207624784954, 'learning_rate': 9.899457841194307e-06, 'epoch': 0.09} 9%|▉ | 3155/34278 [3:29:36<29:33:57, 3.42s/it] 9%|▉ | 3156/34278 [3:29:39<28:58:15, 3.35s/it] {'loss': 0.1886, 'grad_norm': 1.09346749548337, 'learning_rate': 9.899363554189038e-06, 'epoch': 0.09} 9%|▉ | 3156/34278 [3:29:39<28:58:15, 
3.35s/it] 9%|▉ | 3157/34278 [3:29:44<31:51:21, 3.69s/it] {'loss': 0.2037, 'grad_norm': 0.832123474446183, 'learning_rate': 9.899269223443421e-06, 'epoch': 0.09} 9%|▉ | 3157/34278 [3:29:44<31:51:21, 3.69s/it] 9%|▉ | 3158/34278 [3:29:47<30:01:46, 3.47s/it] {'loss': 0.191, 'grad_norm': 0.9348866978298538, 'learning_rate': 9.899174848958298e-06, 'epoch': 0.09} 9%|▉ | 3158/34278 [3:29:47<30:01:46, 3.47s/it] 9%|▉ | 3159/34278 [3:29:50<28:51:11, 3.34s/it] {'loss': 0.168, 'grad_norm': 0.8448691113518083, 'learning_rate': 9.899080430734512e-06, 'epoch': 0.09} 9%|▉ | 3159/34278 [3:29:50<28:51:11, 3.34s/it] 9%|▉ | 3160/34278 [3:29:53<28:49:47, 3.34s/it] {'loss': 0.1911, 'grad_norm': 0.9668811661491377, 'learning_rate': 9.898985968772905e-06, 'epoch': 0.09} 9%|▉ | 3160/34278 [3:29:53<28:49:47, 3.34s/it] 9%|▉ | 3161/34278 [3:29:57<29:29:13, 3.41s/it] {'loss': 0.2, 'grad_norm': 0.8822283021391198, 'learning_rate': 9.898891463074321e-06, 'epoch': 0.09} 9%|▉ | 3161/34278 [3:29:57<29:29:13, 3.41s/it] 9%|▉ | 3162/34278 [3:30:00<29:38:25, 3.43s/it] {'loss': 0.1518, 'grad_norm': 0.8299612372235152, 'learning_rate': 9.898796913639605e-06, 'epoch': 0.09} 9%|▉ | 3162/34278 [3:30:00<29:38:25, 3.43s/it] 9%|▉ | 3163/34278 [3:30:03<28:37:21, 3.31s/it] {'loss': 0.1837, 'grad_norm': 1.1499537794875645, 'learning_rate': 9.898702320469597e-06, 'epoch': 0.09} 9%|▉ | 3163/34278 [3:30:03<28:37:21, 3.31s/it] 9%|▉ | 3164/34278 [3:30:07<29:47:49, 3.45s/it] {'loss': 0.1635, 'grad_norm': 0.9581816435886256, 'learning_rate': 9.898607683565146e-06, 'epoch': 0.09} 9%|▉ | 3164/34278 [3:30:07<29:47:49, 3.45s/it] 9%|▉ | 3165/34278 [3:30:11<31:03:02, 3.59s/it] {'loss': 0.1697, 'grad_norm': 0.776698703755397, 'learning_rate': 9.898513002927094e-06, 'epoch': 0.09} 9%|▉ | 3165/34278 [3:30:11<31:03:02, 3.59s/it] 9%|▉ | 3166/34278 [3:30:14<29:30:52, 3.42s/it] {'loss': 0.1696, 'grad_norm': 1.1063066015427327, 'learning_rate': 9.898418278556288e-06, 'epoch': 0.09} 9%|▉ | 3166/34278 [3:30:14<29:30:52, 3.42s/it] 9%|▉ | 
3167/34278 [3:30:20<36:15:13, 4.20s/it] {'loss': 0.1845, 'grad_norm': 1.0075887363532354, 'learning_rate': 9.898323510453571e-06, 'epoch': 0.09} 9%|▉ | 3167/34278 [3:30:20<36:15:13, 4.20s/it] 9%|▉ | 3168/34278 [3:30:24<34:55:19, 4.04s/it] {'loss': 0.1791, 'grad_norm': 0.9451382122340356, 'learning_rate': 9.898228698619794e-06, 'epoch': 0.09} 9%|▉ | 3168/34278 [3:30:24<34:55:19, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (8296 > 8192). Running this sequence through the model will result in indexing errors 9%|▉ | 3169/34278 [3:30:27<33:11:20, 3.84s/it] {'loss': 0.1729, 'grad_norm': 1.037499933351149, 'learning_rate': 9.898133843055798e-06, 'epoch': 0.09} 9%|▉ | 3169/34278 [3:30:27<33:11:20, 3.84s/it] 9%|▉ | 3170/34278 [3:30:30<30:54:20, 3.58s/it] {'loss': 0.1966, 'grad_norm': 1.3896563982420864, 'learning_rate': 9.898038943762434e-06, 'epoch': 0.09} 9%|▉ | 3170/34278 [3:30:30<30:54:20, 3.58s/it] 9%|▉ | 3171/34278 [3:30:33<29:08:35, 3.37s/it] {'loss': 0.2243, 'grad_norm': 1.009329571422478, 'learning_rate': 9.897944000740547e-06, 'epoch': 0.09} 9%|▉ | 3171/34278 [3:30:33<29:08:35, 3.37s/it] 9%|▉ | 3172/34278 [3:30:39<35:13:50, 4.08s/it] {'loss': 0.1789, 'grad_norm': 0.8822312471226745, 'learning_rate': 9.897849013990985e-06, 'epoch': 0.09} 9%|▉ | 3172/34278 [3:30:39<35:13:50, 4.08s/it] 9%|▉ | 3173/34278 [3:30:42<33:40:29, 3.90s/it] {'loss': 0.222, 'grad_norm': 1.1137719163993303, 'learning_rate': 9.897753983514595e-06, 'epoch': 0.09} 9%|▉ | 3173/34278 [3:30:42<33:40:29, 3.90s/it] 9%|▉ | 3174/34278 [3:30:45<32:19:41, 3.74s/it] {'loss': 0.1824, 'grad_norm': 1.0172438228433538, 'learning_rate': 9.897658909312229e-06, 'epoch': 0.09} 9%|▉ | 3174/34278 [3:30:45<32:19:41, 3.74s/it] 9%|▉ | 
3175/34278 [3:30:51<36:39:49, 4.24s/it] {'loss': 0.1768, 'grad_norm': 0.9884739364426408, 'learning_rate': 9.897563791384733e-06, 'epoch': 0.09} 9%|▉ | 3175/34278 [3:30:51<36:39:49, 4.24s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff27f7d39c0>
Failed to fetch sample 3722598.
Exception: cannot identify image file <_io.BytesIO object at 0x7ff27f7d39c0> 9%|▉ | 3176/34278 [3:30:57<41:41:28, 4.83s/it] {'loss': 0.2083, 'grad_norm': 1.0481532690179143, 'learning_rate': 9.897468629732956e-06, 'epoch': 0.09} 9%|▉ | 3176/34278 [3:30:57<41:41:28, 4.83s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3177/34278 [3:31:01<38:19:55, 4.44s/it] {'loss': 0.2084, 'grad_norm': 0.9691419807415792, 'learning_rate': 9.897373424357747e-06, 'epoch': 0.09} 9%|▉ | 3177/34278 [3:31:01<38:19:55, 4.44s/it] 9%|▉ | 3178/34278 [3:31:04<34:48:46, 4.03s/it] {'loss': 0.1862, 'grad_norm': 0.9862788336906526, 'learning_rate': 9.897278175259959e-06, 'epoch': 0.09} 9%|▉ | 3178/34278 [3:31:04<34:48:46, 4.03s/it] 9%|▉ | 3179/34278 [3:31:08<34:59:19, 4.05s/it] {'loss': 0.1723, 'grad_norm': 0.9780930416857407, 'learning_rate': 9.897182882440439e-06, 'epoch': 0.09} 9%|▉ | 3179/34278 [3:31:08<34:59:19, 4.05s/it] 9%|▉ | 3180/34278 [3:31:11<33:34:26, 3.89s/it] {'loss': 0.1902, 'grad_norm': 0.7931073017265083, 'learning_rate': 9.897087545900039e-06, 'epoch': 0.09} 9%|▉ | 3180/34278 [3:31:11<33:34:26, 3.89s/it] 9%|▉ | 3181/34278 [3:31:14<30:32:55, 3.54s/it] {'loss': 0.1863, 'grad_norm': 0.8816903279426856, 'learning_rate': 9.896992165639612e-06, 'epoch': 0.09} 9%|▉ | 3181/34278 [3:31:14<30:32:55, 3.54s/it] 9%|▉ | 3182/34278 [3:31:17<29:34:40, 3.42s/it] {'loss': 0.2192, 'grad_norm': 0.949194604791015, 'learning_rate': 9.896896741660008e-06, 'epoch': 0.09} 9%|▉ | 3182/34278 [3:31:17<29:34:40, 3.42s/it] 9%|▉ | 3183/34278 [3:31:21<30:03:54, 3.48s/it] {'loss': 0.1992, 'grad_norm': 0.918235761083928, 'learning_rate': 9.896801273962078e-06, 'epoch': 0.09} 9%|▉ | 3183/34278 [3:31:21<30:03:54, 
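[editor's note] The `Failed to fetch sample 3722598` line shows the dataset swallowing the `PIL.UnidentifiedImageError` and continuing, which is why training proceeds to step 3176 without crashing. The sketch below is a hypothetical, dependency-free version of that skip-bad-sample pattern (it is not the repo's actual `dataset.py`; `records` and `decode` are stand-ins for the Ceph loader and `PIL.Image.open`):

```python
class RobustDataset:
    """Minimal sketch: when decoding a sample raises (e.g. a corrupt
    image triggering UnidentifiedImageError), log the failure and fall
    back to another sample instead of killing the training run."""

    def __init__(self, records, decode, max_retries=10):
        self.records = records        # index -> raw bytes
        self.decode = decode          # may raise on corrupt data
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.decode(self.records[i])
            except Exception as exc:
                print(f"Failed to fetch sample {i}. Exception: {exc}")
                i = (i + 1) % len(self.records)   # deterministically try the next sample
        raise RuntimeError("too many consecutive corrupt samples")


def decode(raw):
    # Stand-in for PIL.Image.open on downloaded bytes.
    if not raw.startswith(b"IMG"):
        raise ValueError("cannot identify image file")
    return raw[3:]


ds = RobustDataset([b"IMGcat", b"garbage", b"IMGdog"], decode)
```

With this toy data, `ds[1]` logs the failure for the corrupt record and returns the decoded neighbour (`b"dog"`). Falling back to a *random* index instead of `i + 1` avoids subtly over-sampling the neighbours of corrupt records in large corpora.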
3.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3184/34278 [3:31:24<29:40:46, 3.44s/it] {'loss': 0.1997, 'grad_norm': 0.807178404781097, 'learning_rate': 9.896705762546676e-06, 'epoch': 0.09} 9%|▉ | 3184/34278 [3:31:24<29:40:46, 3.44s/it] 9%|▉ | 3185/34278 [3:31:27<27:55:49, 3.23s/it] {'loss': 0.2117, 'grad_norm': 0.9321238823064868, 'learning_rate': 9.896610207414654e-06, 'epoch': 0.09} 9%|▉ | 3185/34278 [3:31:27<27:55:49, 3.23s/it] 9%|▉ | 3186/34278 [3:31:30<28:27:09, 3.29s/it] {'loss': 0.2062, 'grad_norm': 0.9375778044105287, 'learning_rate': 9.896514608566863e-06, 'epoch': 0.09} 9%|▉ | 3186/34278 [3:31:30<28:27:09, 3.29s/it] 9%|▉ | 3187/34278 [3:31:33<27:22:20, 3.17s/it] {'loss': 0.1912, 'grad_norm': 0.7205607205601595, 'learning_rate': 9.896418966004159e-06, 'epoch': 0.09} 9%|▉ | 3187/34278 [3:31:33<27:22:20, 3.17s/it] 9%|▉ | 3188/34278 [3:31:37<28:57:14, 3.35s/it] {'loss': 0.186, 'grad_norm': 0.8587312590346948, 'learning_rate': 9.896323279727398e-06, 'epoch': 0.09} 9%|▉ | 3188/34278 [3:31:37<28:57:14, 3.35s/it] 9%|▉ | 3189/34278 [3:31:43<35:13:14, 4.08s/it] {'loss': 0.2327, 'grad_norm': 1.0585477782466406, 'learning_rate': 9.89622754973743e-06, 'epoch': 0.09} 9%|▉ | 3189/34278 [3:31:43<35:13:14, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3190/34278 [3:31:46<32:45:45, 3.79s/it] {'loss': 0.1881, 'grad_norm': 0.8824097003662679, 'learning_rate': 9.896131776035111e-06, 'epoch': 0.09} 9%|▉ | 3190/34278 [3:31:46<32:45:45, 3.79s/it] 9%|▉ | 3191/34278 [3:31:49<30:14:27, 3.50s/it] {'loss': 0.1937, 'grad_norm': 1.0035868168095385, 'learning_rate': 9.896035958621295e-06, 'epoch': 0.09} 9%|▉ | 3191/34278 [3:31:49<30:14:27, 3.50s/it] 9%|▉ | 3192/34278 [3:31:52<29:42:56, 3.44s/it] {'loss': 0.2211, 'grad_norm': 1.0261592643292214, 'learning_rate': 9.89594009749684e-06, 'epoch': 0.09} 9%|▉ | 3192/34278 [3:31:52<29:42:56, 3.44s/it] 9%|▉ | 3193/34278 [3:31:58<37:04:43, 4.29s/it] {'loss': 0.1801, 'grad_norm': 0.8813262335652425, 'learning_rate': 9.895844192662602e-06, 'epoch': 0.09} 9%|▉ | 3193/34278 [3:31:58<37:04:43, 4.29s/it] 9%|▉ | 3194/34278 [3:32:01<34:22:50, 3.98s/it] {'loss': 0.1958, 'grad_norm': 1.178278218707878, 'learning_rate': 9.895748244119434e-06, 'epoch': 0.09} 9%|▉ | 3194/34278 [3:32:01<34:22:50, 3.98s/it] 9%|▉ | 3195/34278 [3:32:07<39:17:05, 4.55s/it] {'loss': 0.2324, 'grad_norm': 1.0313091909980319, 'learning_rate': 9.895652251868196e-06, 'epoch': 0.09} 9%|▉ | 3195/34278 [3:32:07<39:17:05, 4.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3196/34278 [3:32:13<43:21:02, 5.02s/it] {'loss': 0.1788, 'grad_norm': 0.9311020701092604, 'learning_rate': 9.89555621590974e-06, 'epoch': 0.09} 9%|▉ | 3196/34278 [3:32:13<43:21:02, 5.02s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3197/34278 [3:32:19<45:17:59, 5.25s/it] {'loss': 0.1872, 'grad_norm': 0.764970517720893, 'learning_rate': 9.89546013624493e-06, 'epoch': 0.09} 9%|▉ | 3197/34278 [3:32:19<45:17:59, 5.25s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3198/34278 [3:32:22<39:30:16, 4.58s/it] {'loss': 0.1919, 'grad_norm': 0.9470134867851716, 'learning_rate': 9.89536401287462e-06, 'epoch': 0.09} 9%|▉ | 3198/34278 [3:32:22<39:30:16, 4.58s/it] 9%|▉ | 3199/34278 [3:32:26<36:13:48, 4.20s/it] {'loss': 0.2011, 'grad_norm': 0.8313124470837624, 'learning_rate': 9.895267845799667e-06, 'epoch': 0.09} 9%|▉ | 3199/34278 [3:32:26<36:13:48, 4.20s/it] 9%|▉ | 3200/34278 [3:32:30<36:08:44, 4.19s/it] {'loss': 0.1792, 'grad_norm': 0.7609124467794467, 'learning_rate': 9.895171635020933e-06, 'epoch': 0.09} 9%|▉ | 3200/34278 [3:32:30<36:08:44, 4.19s/it] 9%|▉ | 3201/34278 [3:32:36<41:20:17, 4.79s/it] {'loss': 0.213, 'grad_norm': 0.8741796775430868, 'learning_rate': 9.895075380539275e-06, 'epoch': 0.09} 9%|▉ | 3201/34278 [3:32:36<41:20:17, 4.79s/it] 9%|▉ | 3202/34278 [3:32:39<36:42:02, 4.25s/it] {'loss': 0.2125, 'grad_norm': 0.9733134720207894, 'learning_rate': 9.894979082355552e-06, 'epoch': 0.09} 9%|▉ | 3202/34278 [3:32:39<36:42:02, 4.25s/it] 9%|▉ | 3203/34278 [3:32:43<35:08:46, 4.07s/it] {'loss': 0.2068, 'grad_norm': 0.9272558824072925, 'learning_rate': 9.894882740470625e-06, 'epoch': 0.09} 9%|▉ | 3203/34278 [3:32:43<35:08:46, 4.07s/it] 9%|▉ | 3204/34278 [3:32:46<34:06:18, 3.95s/it] {'loss': 0.2177, 'grad_norm': 0.8939817708889214, 'learning_rate': 9.894786354885354e-06, 'epoch': 0.09} 9%|▉ | 3204/34278 [3:32:46<34:06:18, 3.95s/it] 9%|▉ | 3205/34278 [3:32:49<31:46:59, 3.68s/it] {'loss': 0.1964, 'grad_norm': 0.7836086036755353, 'learning_rate': 9.894689925600596e-06, 'epoch': 0.09} 
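[editor's note] The earlier tokenizer message (`Token indices sequence length is longer than the specified maximum sequence length for this model (8296 > 8192)`) is emitted by the tokenizer, not the model; whether it actually causes indexing errors depends on how the sequence is packed afterwards. A simple guard, sketched below under the assumption that over-long samples should be truncated or dropped at dataset-build time (`clamp_sample` is a hypothetical helper, and naive right-truncation can cut through image tokens in multimodal sequences, so dropping may be safer there):

```python
MAX_LEN = 8192  # matches the model limit reported in the log

def clamp_sample(input_ids, max_len=MAX_LEN, drop=False):
    """Return a sample that fits the context window, or None to skip it."""
    if len(input_ids) <= max_len:
        return input_ids
    if drop:
        return None                 # skip over-long samples entirely
    return input_ids[:max_len]      # naive right-truncation
```

For example, an 8296-token sample like the one in the log would come back truncated to 8192 tokens, or as `None` with `drop=True`.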
9%|▉ | 3205/34278 [3:32:49<31:46:59, 3.68s/it] 9%|▉ | 3206/34278 [3:32:52<30:15:53, 3.51s/it] {'loss': 0.1979, 'grad_norm': 0.8645969989270436, 'learning_rate': 9.894593452617216e-06, 'epoch': 0.09} 9%|▉ | 3206/34278 [3:32:52<30:15:53, 3.51s/it] 9%|▉ | 3207/34278 [3:32:56<31:54:18, 3.70s/it] {'loss': 0.179, 'grad_norm': 0.8637258641034835, 'learning_rate': 9.894496935936076e-06, 'epoch': 0.09} 9%|▉ | 3207/34278 [3:32:57<31:54:18, 3.70s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3208/34278 [3:33:01<35:08:43, 4.07s/it] {'loss': 0.1797, 'grad_norm': 0.686666093078826, 'learning_rate': 9.894400375558035e-06, 'epoch': 0.09} 9%|▉ | 3208/34278 [3:33:01<35:08:43, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3209/34278 [3:33:05<33:38:37, 3.90s/it] {'loss': 0.1913, 'grad_norm': 0.9946572338255079, 'learning_rate': 9.894303771483955e-06, 'epoch': 0.09} 9%|▉ | 3209/34278 [3:33:05<33:38:37, 3.90s/it] 9%|▉ | 3210/34278 [3:33:08<32:02:10, 3.71s/it] {'loss': 0.1663, 'grad_norm': 0.7749649434714495, 'learning_rate': 9.8942071237147e-06, 'epoch': 0.09} 9%|▉ | 3210/34278 [3:33:08<32:02:10, 3.71s/it] 9%|▉ | 3211/34278 [3:33:13<33:41:11, 3.90s/it] {'loss': 0.1699, 'grad_norm': 0.7833073905906534, 'learning_rate': 9.894110432251131e-06, 'epoch': 0.09} 9%|▉ | 3211/34278 [3:33:13<33:41:11, 3.90s/it] 9%|▉ | 3212/34278 [3:33:16<31:39:19, 3.67s/it] {'loss': 0.178, 'grad_norm': 0.9176131665834725, 'learning_rate': 9.894013697094113e-06, 'epoch': 0.09} 9%|▉ | 3212/34278 [3:33:16<31:39:19, 3.67s/it] 9%|▉ | 3213/34278 [3:33:21<35:02:36, 4.06s/it] {'loss': 0.1909, 'grad_norm': 0.82519863993861, 'learning_rate': 9.89391691824451e-06, 'epoch': 0.09} 9%|▉ | 3213/34278 [3:33:21<35:02:36, 4.06s/it] 9%|▉ | 3214/34278 [3:33:25<34:44:06, 4.03s/it] {'loss': 0.1708, 'grad_norm': 1.0785544956802007, 'learning_rate': 9.893820095703185e-06, 'epoch': 0.09} 9%|▉ | 3214/34278 [3:33:25<34:44:06, 4.03s/it] 9%|▉ | 3215/34278 [3:33:29<35:23:25, 4.10s/it] {'loss': 0.1815, 'grad_norm': 0.8957478455051892, 'learning_rate': 9.893723229471001e-06, 'epoch': 0.09} 9%|▉ | 3215/34278 [3:33:29<35:23:25, 4.10s/it] 9%|▉ | 3216/34278 [3:33:32<33:06:56, 3.84s/it] {'loss': 0.2181, 'grad_norm': 1.0309759483868939, 'learning_rate': 9.893626319548823e-06, 'epoch': 0.09} 9%|▉ | 3216/34278 [3:33:32<33:06:56, 3.84s/it] 9%|▉ | 3217/34278 [3:33:39<40:04:10, 4.64s/it] {'loss': 0.1885, 'grad_norm': 1.112924509433906, 'learning_rate': 9.89352936593752e-06, 'epoch': 0.09} 9%|▉ | 3217/34278 [3:33:39<40:04:10, 4.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3218/34278 [3:33:42<36:00:30, 4.17s/it] {'loss': 0.1724, 'grad_norm': 1.0197801277631693, 'learning_rate': 9.893432368637954e-06, 'epoch': 0.09} 9%|▉ | 3218/34278 [3:33:42<36:00:30, 4.17s/it] 9%|▉ | 3219/34278 [3:33:45<34:43:01, 4.02s/it] {'loss': 0.1759, 'grad_norm': 0.9052079172498365, 'learning_rate': 9.893335327650992e-06, 'epoch': 0.09} 9%|▉ | 3219/34278 [3:33:45<34:43:01, 4.02s/it] 9%|▉ | 3220/34278 [3:33:48<32:05:36, 3.72s/it] {'loss': 0.1901, 'grad_norm': 1.0412474859763168, 'learning_rate': 9.893238242977502e-06, 'epoch': 0.09} 9%|▉ | 3220/34278 [3:33:48<32:05:36, 3.72s/it] 9%|▉ | 3221/34278 [3:33:54<37:47:09, 4.38s/it] {'loss': 0.1968, 'grad_norm': 0.8339256080577501, 'learning_rate': 9.893141114618348e-06, 'epoch': 0.09} 9%|▉ | 3221/34278 [3:33:54<37:47:09, 4.38s/it] 9%|▉ | 3222/34278 [3:33:58<35:02:15, 4.06s/it] {'loss': 0.1831, 'grad_norm': 0.8540496003410044, 'learning_rate': 9.893043942574397e-06, 'epoch': 0.09} 9%|▉ | 3222/34278 [3:33:58<35:02:15, 4.06s/it] 9%|▉ | 3223/34278 [3:34:02<37:07:05, 4.30s/it] {'loss': 0.2003, 'grad_norm': 0.992501538161245, 'learning_rate': 9.89294672684652e-06, 'epoch': 0.09} 9%|▉ | 3223/34278 [3:34:03<37:07:05, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3224/34278 [3:34:05<33:23:57, 3.87s/it] {'loss': 0.197, 'grad_norm': 0.9560750673388776, 'learning_rate': 9.89284946743558e-06, 'epoch': 0.09} 9%|▉ | 3224/34278 [3:34:05<33:23:57, 3.87s/it] 9%|▉ | 3225/34278 [3:34:08<30:50:24, 3.58s/it] {'loss': 0.1927, 'grad_norm': 1.117693455821563, 'learning_rate': 9.892752164342449e-06, 'epoch': 0.09} 9%|▉ | 3225/34278 [3:34:08<30:50:24, 3.58s/it] 9%|▉ | 3226/34278 [3:34:13<33:27:01, 3.88s/it] {'loss': 0.1923, 'grad_norm': 0.918814702440944, 'learning_rate': 9.892654817567995e-06, 'epoch': 0.09} 9%|▉ | 3226/34278 [3:34:13<33:27:01, 3.88s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3227/34278 [3:34:16<31:18:19, 3.63s/it] {'loss': 0.1659, 'grad_norm': 1.0096062228169131, 'learning_rate': 9.892557427113087e-06, 'epoch': 0.09} 9%|▉ | 3227/34278 [3:34:16<31:18:19, 3.63s/it] 9%|▉ | 3228/34278 [3:34:20<32:58:11, 3.82s/it] {'loss': 0.1872, 'grad_norm': 0.9441484563863961, 'learning_rate': 9.892459992978594e-06, 'epoch': 0.09} 9%|▉ | 3228/34278 [3:34:20<32:58:11, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 9%|▉ | 3229/34278 [3:34:23<31:32:33, 3.66s/it] {'loss': 0.2264, 'grad_norm': 0.9177509239500861, 'learning_rate': 9.892362515165386e-06, 'epoch': 0.09} 9%|▉ | 3229/34278 [3:34:23<31:32:33, 3.66s/it] 9%|▉ | 3230/34278 [3:34:27<31:22:56, 3.64s/it] {'loss': 0.189, 'grad_norm': 0.898743746501821, 'learning_rate': 9.892264993674334e-06, 'epoch': 0.09} 9%|▉ | 3230/34278 [3:34:27<31:22:56, 3.64s/it] 9%|▉ | 3231/34278 [3:34:31<31:31:55, 3.66s/it] {'loss': 0.1947, 'grad_norm': 0.9162917180759605, 'learning_rate': 9.892167428506307e-06, 'epoch': 0.09} 9%|▉ | 3231/34278 [3:34:31<31:31:55, 3.66s/it] 9%|▉ | 3232/34278 [3:34:37<37:57:58, 4.40s/it] {'loss': 0.1942, 'grad_norm': 1.1773099051391154, 'learning_rate': 9.892069819662179e-06, 'epoch': 0.09} 9%|▉ | 3232/34278 [3:34:37<37:57:58, 4.40s/it] 9%|▉ | 3233/34278 [3:34:43<41:55:27, 4.86s/it] {'loss': 0.1939, 'grad_norm': 1.013402861911588, 'learning_rate': 9.891972167142816e-06, 'epoch': 0.09} 9%|▉ | 3233/34278 [3:34:43<41:55:27, 4.86s/it] 9%|▉ | 3234/34278 [3:34:47<39:09:29, 4.54s/it] {'loss': 0.1895, 'grad_norm': 1.2777285577531594, 'learning_rate': 9.891874470949095e-06, 'epoch': 0.09} 9%|▉ | 3234/34278 [3:34:47<39:09:29, 4.54s/it] 9%|▉ | 3235/34278 [3:34:50<36:05:33, 4.19s/it] {'loss': 0.209, 'grad_norm': 0.9119440469852996, 'learning_rate': 9.891776731081887e-06, 'epoch': 0.09} 9%|▉ | 3235/34278 [3:34:50<36:05:33, 4.19s/it] 9%|▉ | 3236/34278 [3:34:53<34:22:36, 3.99s/it] {'loss': 0.1967, 'grad_norm': 0.8659368042696534, 'learning_rate': 9.891678947542063e-06, 'epoch': 0.09} 9%|▉ | 3236/34278 [3:34:53<34:22:36, 3.99s/it] 9%|▉ | 3237/34278 [3:34:57<32:42:32, 3.79s/it] {'loss': 0.1841, 'grad_norm': 0.9496012057232084, 'learning_rate': 9.891581120330498e-06, 'epoch': 0.09} 9%|▉ | 3237/34278 [3:34:57<32:42:32, 3.79s/it] 9%|▉ | 3238/34278 [3:35:00<30:59:48, 3.59s/it] {'loss': 0.1724, 'grad_norm': 0.8945707600040342, 'learning_rate': 9.891483249448066e-06, 'epoch': 0.09} 9%|▉ | 3238/34278 
[3:35:00<30:59:48, 3.59s/it] 9%|▉ | 3239/34278 [3:35:03<29:16:02, 3.39s/it] {'loss': 0.1955, 'grad_norm': 0.8099025961551439, 'learning_rate': 9.891385334895637e-06, 'epoch': 0.09} 9%|▉ | 3239/34278 [3:35:03<29:16:02, 3.39s/it] 9%|▉ | 3240/34278 [3:35:06<29:37:11, 3.44s/it] {'loss': 0.1944, 'grad_norm': 0.8034245101934876, 'learning_rate': 9.891287376674089e-06, 'epoch': 0.09} 9%|▉ | 3240/34278 [3:35:06<29:37:11, 3.44s/it] 9%|▉ | 3241/34278 [3:35:10<28:46:37, 3.34s/it] {'loss': 0.1938, 'grad_norm': 0.9094264512261659, 'learning_rate': 9.891189374784294e-06, 'epoch': 0.09} 9%|▉ | 3241/34278 [3:35:10<28:46:37, 3.34s/it] 9%|▉ | 3242/34278 [3:35:13<28:36:36, 3.32s/it] {'loss': 0.1916, 'grad_norm': 1.0101400403658125, 'learning_rate': 9.891091329227127e-06, 'epoch': 0.09} 9%|▉ | 3242/34278 [3:35:13<28:36:36, 3.32s/it] 9%|▉ | 3243/34278 [3:35:16<28:34:52, 3.32s/it] {'loss': 0.1856, 'grad_norm': 0.9529816607580996, 'learning_rate': 9.890993240003465e-06, 'epoch': 0.09} 9%|▉ | 3243/34278 [3:35:16<28:34:52, 3.32s/it] 9%|▉ | 3244/34278 [3:35:20<29:13:24, 3.39s/it] {'loss': 0.2018, 'grad_norm': 0.7679335540210122, 'learning_rate': 9.890895107114182e-06, 'epoch': 0.09} 9%|▉ | 3244/34278 [3:35:20<29:13:24, 3.39s/it] 9%|▉ | 3245/34278 [3:35:23<29:26:46, 3.42s/it] {'loss': 0.1774, 'grad_norm': 1.1404036593134883, 'learning_rate': 9.890796930560156e-06, 'epoch': 0.09} 9%|▉ | 3245/34278 [3:35:23<29:26:46, 3.42s/it] 9%|▉ | 3246/34278 [3:35:26<29:19:12, 3.40s/it] {'loss': 0.1967, 'grad_norm': 0.8780786896423192, 'learning_rate': 9.890698710342263e-06, 'epoch': 0.09} 9%|▉ | 3246/34278 [3:35:27<29:19:12, 3.40s/it] 9%|▉ | 3247/34278 [3:35:31<30:54:10, 3.59s/it] {'loss': 0.1814, 'grad_norm': 0.9576698836614522, 'learning_rate': 9.89060044646138e-06, 'epoch': 0.09} 9%|▉ | 3247/34278 [3:35:31<30:54:10, 3.59s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 9%|▉ | 3248/34278 [3:35:34<30:54:20, 3.59s/it] {'loss': 0.1873, 'grad_norm': 0.8730570112400692, 'learning_rate': 9.890502138918382e-06, 'epoch': 0.09} 9%|▉ | 3248/34278 [3:35:34<30:54:20, 3.59s/it] 9%|▉ | 3249/34278 [3:35:37<29:11:04, 3.39s/it] {'loss': 0.1919, 'grad_norm': 1.2602141128621513, 'learning_rate': 9.890403787714148e-06, 'epoch': 0.09} 9%|▉ | 3249/34278 [3:35:37<29:11:04, 3.39s/it] 9%|▉ | 3250/34278 [3:35:40<28:09:53, 3.27s/it] {'loss': 0.1758, 'grad_norm': 1.0245830975735313, 'learning_rate': 9.890305392849559e-06, 'epoch': 0.09} 9%|▉ | 3250/34278 [3:35:40<28:09:53, 3.27s/it] 9%|▉ | 3251/34278 [3:35:43<28:19:15, 3.29s/it] {'loss': 0.2038, 'grad_norm': 0.9221155138585105, 'learning_rate': 9.89020695432549e-06, 'epoch': 0.09} 9%|▉ | 3251/34278 [3:35:43<28:19:15, 3.29s/it] 9%|▉ | 3252/34278 [3:35:49<35:33:10, 4.13s/it] {'loss': 0.2032, 'grad_norm': 1.4151546844713536, 'learning_rate': 9.890108472142818e-06, 'epoch': 0.09} 9%|▉ | 3252/34278 [3:35:49<35:33:10, 4.13s/it] 9%|▉ | 3253/34278 [3:35:53<34:35:07, 4.01s/it] {'loss': 0.2026, 'grad_norm': 1.0559456413965769, 'learning_rate': 9.890009946302429e-06, 'epoch': 0.09} 9%|▉ | 3253/34278 [3:35:53<34:35:07, 4.01s/it] 9%|▉ | 3254/34278 [3:35:56<32:31:22, 3.77s/it] {'loss': 0.1714, 'grad_norm': 1.0718071293408422, 'learning_rate': 9.889911376805195e-06, 'epoch': 0.09} 9%|▉ | 3254/34278 [3:35:56<32:31:22, 3.77s/it] 9%|▉ | 3255/34278 [3:36:00<31:44:38, 3.68s/it] {'loss': 0.2007, 'grad_norm': 0.9089126402809143, 'learning_rate': 9.889812763652002e-06, 'epoch': 0.09} 9%|▉ | 3255/34278 [3:36:00<31:44:38, 3.68s/it] 9%|▉ | 3256/34278 [3:36:03<29:07:50, 3.38s/it] {'loss': 0.1913, 'grad_norm': 1.1091010533807344, 'learning_rate': 9.889714106843726e-06, 'epoch': 0.09} 9%|▉ | 3256/34278 [3:36:03<29:07:50, 3.38s/it] 10%|▉ | 3257/34278 [3:36:07<30:43:53, 3.57s/it] {'loss': 0.1781, 'grad_norm': 1.0009387069839508, 'learning_rate': 9.889615406381252e-06, 'epoch': 0.1} 
10%|▉ | 3257/34278 [3:36:07<30:43:53, 3.57s/it] 10%|▉ | 3258/34278 [3:36:11<34:12:39, 3.97s/it] {'loss': 0.1754, 'grad_norm': 0.9272485928702179, 'learning_rate': 9.889516662265457e-06, 'epoch': 0.1} 10%|▉ | 3258/34278 [3:36:11<34:12:39, 3.97s/it] 10%|▉ | 3259/34278 [3:36:15<32:39:40, 3.79s/it] {'loss': 0.1636, 'grad_norm': 0.7614197847993052, 'learning_rate': 9.889417874497225e-06, 'epoch': 0.1} 10%|▉ | 3259/34278 [3:36:15<32:39:40, 3.79s/it] 10%|▉ | 3260/34278 [3:36:18<30:39:42, 3.56s/it] {'loss': 0.1972, 'grad_norm': 0.9743955166343347, 'learning_rate': 9.889319043077438e-06, 'epoch': 0.1} 10%|▉ | 3260/34278 [3:36:18<30:39:42, 3.56s/it] 10%|▉ | 3261/34278 [3:36:21<30:51:09, 3.58s/it] {'loss': 0.2058, 'grad_norm': 0.9910664775395345, 'learning_rate': 9.889220168006977e-06, 'epoch': 0.1} 10%|▉ | 3261/34278 [3:36:22<30:51:09, 3.58s/it] 10%|▉ | 3262/34278 [3:36:25<30:12:44, 3.51s/it] {'loss': 0.1661, 'grad_norm': 0.7768260311895827, 'learning_rate': 9.889121249286727e-06, 'epoch': 0.1} 10%|▉ | 3262/34278 [3:36:25<30:12:44, 3.51s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8682 > 8192). 
Running this sequence through the model will result in indexing errors 10%|▉ | 3263/34278 [3:36:28<29:14:21, 3.39s/it] {'loss': 0.1687, 'grad_norm': 0.919702114574406, 'learning_rate': 9.889022286917567e-06, 'epoch': 0.1} 10%|▉ | 3263/34278 [3:36:28<29:14:21, 3.39s/it] 10%|▉ | 3264/34278 [3:36:31<29:10:37, 3.39s/it] {'loss': 0.1684, 'grad_norm': 1.1063076369430893, 'learning_rate': 9.888923280900385e-06, 'epoch': 0.1} 10%|▉ | 3264/34278 [3:36:31<29:10:37, 3.39s/it] 10%|▉ | 3265/34278 [3:36:36<31:24:11, 3.65s/it] {'loss': 0.1717, 'grad_norm': 0.9419642147128596, 'learning_rate': 9.888824231236063e-06, 'epoch': 0.1} 10%|▉ | 3265/34278 [3:36:36<31:24:11, 3.65s/it] 10%|▉ | 3266/34278 [3:36:39<30:32:26, 3.55s/it] {'loss': 0.1885, 'grad_norm': 0.8241379303385615, 'learning_rate': 9.888725137925484e-06, 'epoch': 0.1} 10%|▉ | 3266/34278 [3:36:39<30:32:26, 3.55s/it] 10%|▉ | 3267/34278 [3:36:44<34:34:27, 4.01s/it] {'loss': 0.2109, 'grad_norm': 1.0135462106994542, 'learning_rate': 9.888626000969534e-06, 'epoch': 0.1} 10%|▉ | 3267/34278 [3:36:44<34:34:27, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 10%|▉ | 3268/34278 [3:36:48<35:48:18, 4.16s/it] {'loss': 0.1987, 'grad_norm': 0.9170158099052814, 'learning_rate': 9.8885268203691e-06, 'epoch': 0.1} 10%|▉ | 3268/34278 [3:36:48<35:48:18, 4.16s/it] 10%|▉ | 3269/34278 [3:36:51<32:41:05, 3.79s/it] {'loss': 0.2015, 'grad_norm': 1.0248116195809147, 'learning_rate': 9.888427596125063e-06, 'epoch': 0.1} 10%|▉ | 3269/34278 [3:36:51<32:41:05, 3.79s/it] 10%|▉ | 3270/34278 [3:36:57<37:56:47, 4.41s/it] {'loss': 0.2114, 'grad_norm': 0.9796239011512825, 'learning_rate': 9.888328328238313e-06, 'epoch': 0.1} 10%|▉ | 3270/34278 [3:36:57<37:56:47, 4.41s/it] 10%|▉ | 3271/34278 [3:37:01<35:07:53, 4.08s/it] {'loss': 0.1797, 'grad_norm': 0.9266420074692738, 'learning_rate': 9.888229016709735e-06, 'epoch': 0.1} 10%|▉ | 3271/34278 [3:37:01<35:07:53, 4.08s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11491 > 8192). Running this sequence through the model will result in indexing errors 10%|▉ | 3272/34278 [3:37:04<34:44:45, 4.03s/it] {'loss': 0.1819, 'grad_norm': 1.0461543235954773, 'learning_rate': 9.888129661540215e-06, 'epoch': 0.1} 10%|▉ | 3272/34278 [3:37:04<34:44:45, 4.03s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 10%|▉ | 3273/34278 [3:37:08<33:25:00, 3.88s/it] {'loss': 0.171, 'grad_norm': 0.8013264228977466, 'learning_rate': 9.88803026273064e-06, 'epoch': 0.1} 10%|▉ | 3273/34278 [3:37:08<33:25:00, 3.88s/it] 10%|▉ | 3274/34278 [3:37:14<40:03:56, 4.65s/it] {'loss': 0.1764, 'grad_norm': 0.9709236879591128, 'learning_rate': 9.887930820281896e-06, 'epoch': 0.1} 10%|▉ | 3274/34278 [3:37:15<40:03:56, 4.65s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 10%|▉ | 3275/34278 [3:37:17<35:04:50, 4.07s/it] {'loss': 0.1848, 'grad_norm': 0.9074604077154521, 'learning_rate': 9.887831334194874e-06, 'epoch': 0.1} 10%|▉ | 3275/34278 [3:37:17<35:04:50, 4.07s/it] 10%|▉ | 3276/34278 [3:37:21<33:18:39, 3.87s/it] {'loss': 0.1935, 'grad_norm': 0.8539229277032876, 'learning_rate': 9.887731804470462e-06, 'epoch': 0.1} 10%|▉ | 3276/34278 [3:37:21<33:18:39, 3.87s/it] 10%|▉ | 3277/34278 [3:37:26<37:14:54, 4.33s/it] {'loss': 0.2011, 'grad_norm': 0.9419424877500043, 'learning_rate': 9.887632231109546e-06, 'epoch': 0.1} 10%|▉ | 3277/34278 [3:37:26<37:14:54, 4.33s/it] 10%|▉ | 3278/34278 [3:37:30<36:08:20, 4.20s/it] {'loss': 0.1814, 'grad_norm': 0.849111456865038, 'learning_rate': 9.887532614113018e-06, 'epoch': 0.1} 10%|▉ | 3278/34278 [3:37:30<36:08:20, 4.20s/it] 10%|▉ | 3279/34278 [3:37:33<34:29:41, 4.01s/it] {'loss': 0.1734, 'grad_norm': 0.8042604791344427, 'learning_rate': 9.887432953481762e-06, 'epoch': 0.1} 10%|▉ | 3279/34278 [3:37:33<34:29:41, 4.01s/it] 10%|▉ | 3280/34278 [3:37:37<32:58:16, 3.83s/it] {'loss': 0.1808, 'grad_norm': 0.8820753714732914, 'learning_rate': 9.887333249216673e-06, 'epoch': 0.1} 10%|▉ | 3280/34278 [3:37:37<32:58:16, 3.83s/it] 10%|▉ | 3281/34278 [3:37:40<30:41:11, 3.56s/it] {'loss': 0.2121, 'grad_norm': 0.8229004316236141, 'learning_rate': 9.88723350131864e-06, 
'epoch': 0.1} 10%|▉ | 3281/34278 [3:37:40<30:41:11, 3.56s/it] 10%|▉ | 3282/34278 [3:37:46<37:10:10, 4.32s/it] {'loss': 0.2052, 'grad_norm': 0.8417919912196324, 'learning_rate': 9.887133709788552e-06, 'epoch': 0.1} 10%|▉ | 3282/34278 [3:37:46<37:10:10, 4.32s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 10%|▉ | 3283/34278 [3:37:49<35:13:58, 4.09s/it] {'loss': 0.2154, 'grad_norm': 0.8615882879927675, 'learning_rate': 9.887033874627303e-06, 'epoch': 0.1} 10%|▉ | 3283/34278 [3:37:49<35:13:58, 4.09s/it] 10%|▉ | 3284/34278 [3:37:53<34:17:40, 3.98s/it] {'loss': 0.1894, 'grad_norm': 1.0144257818867242, 'learning_rate': 9.88693399583578e-06, 'epoch': 0.1} 10%|▉ | 3284/34278 [3:37:53<34:17:40, 3.98s/it] 10%|▉ | 3285/34278 [3:37:56<31:30:06, 3.66s/it] {'loss': 0.1935, 'grad_norm': 0.9307069346703885, 'learning_rate': 9.886834073414878e-06, 'epoch': 0.1} 10%|▉ | 3285/34278 [3:37:56<31:30:06, 3.66s/it] 10%|▉ | 3286/34278 [3:38:00<32:42:44, 3.80s/it] {'loss': 0.1805, 'grad_norm': 0.9137495912927327, 'learning_rate': 9.886734107365486e-06, 'epoch': 0.1} 10%|▉ | 3286/34278 [3:38:00<32:42:44, 3.80s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 10%|▉ | 3287/34278 [3:38:04<31:39:48, 3.68s/it] {'loss': 0.2039, 'grad_norm': 1.0047984497652105, 'learning_rate': 9.8866340976885e-06, 'epoch': 0.1} 10%|▉ | 3287/34278 [3:38:04<31:39:48, 3.68s/it] 10%|▉ | 3288/34278 [3:38:07<30:19:00, 3.52s/it] {'loss': 0.2172, 'grad_norm': 0.9344310243993718, 'learning_rate': 9.886534044384812e-06, 'epoch': 0.1} 10%|▉ | 3288/34278 [3:38:07<30:19:00, 3.52s/it] 10%|▉ | 3289/34278 [3:38:11<31:31:53, 3.66s/it] {'loss': 0.1894, 'grad_norm': 1.009365258653322, 'learning_rate': 9.886433947455314e-06, 'epoch': 0.1} 10%|▉ | 3289/34278 [3:38:11<31:31:53, 3.66s/it] 10%|▉ | 3290/34278 [3:38:14<30:47:30, 3.58s/it] {'loss': 0.1806, 'grad_norm': 1.0553967057413909, 'learning_rate': 9.886333806900901e-06, 'epoch': 0.1} 10%|▉ | 3290/34278 [3:38:14<30:47:30, 3.58s/it] 10%|▉ | 3291/34278 [3:38:18<30:19:44, 3.52s/it] {'loss': 0.182, 'grad_norm': 1.0933924973594682, 'learning_rate': 9.886233622722464e-06, 'epoch': 0.1} 10%|▉ | 3291/34278 [3:38:18<30:19:44, 3.52s/it] 10%|▉ | 3292/34278 [3:38:20<28:00:59, 3.25s/it] {'loss': 0.1702, 'grad_norm': 0.8160425800459061, 'learning_rate': 9.886133394920901e-06, 'epoch': 0.1} 10%|▉ | 3292/34278 [3:38:20<28:00:59, 3.25s/it] 10%|▉ | 3293/34278 [3:38:24<28:56:44, 3.36s/it] {'loss': 0.1959, 'grad_norm': 0.9554707412430091, 'learning_rate': 9.886033123497106e-06, 'epoch': 0.1} 10%|▉ | 3293/34278 [3:38:24<28:56:44, 3.36s/it] 10%|▉ | 3294/34278 [3:38:29<32:53:01, 3.82s/it] {'loss': 0.1772, 'grad_norm': 0.7926178740748009, 'learning_rate': 9.885932808451973e-06, 'epoch': 0.1} 10%|▉ | 3294/34278 [3:38:29<32:53:01, 3.82s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8624 > 8192). 
Running this sequence through the model will result in indexing errors 10%|▉ | 3295/34278 [3:38:33<34:22:38, 3.99s/it] {'loss': 0.1948, 'grad_norm': 0.9356315460002099, 'learning_rate': 9.885832449786398e-06, 'epoch': 0.1} 10%|▉ | 3295/34278 [3:38:33<34:22:38, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 10%|▉ | 3296/34278 [3:38:36<32:25:02, 3.77s/it] {'loss': 0.1947, 'grad_norm': 0.8967607401124458, 'learning_rate': 9.885732047501277e-06, 'epoch': 0.1} 10%|▉ | 3296/34278 [3:38:36<32:25:02, 3.77s/it] 10%|▉ | 3297/34278 [3:38:42<37:41:00, 4.38s/it] {'loss': 0.173, 'grad_norm': 1.0259354476889229, 'learning_rate': 9.885631601597508e-06, 'epoch': 0.1} 10%|▉ | 3297/34278 [3:38:42<37:41:00, 4.38s/it] 10%|▉ | 3298/34278 [3:38:48<41:55:19, 4.87s/it] {'loss': 0.1803, 'grad_norm': 0.8528681011356201, 'learning_rate': 9.885531112075986e-06, 'epoch': 0.1} 10%|▉ | 3298/34278 [3:38:48<41:55:19, 4.87s/it] 10%|▉ | 3299/34278 [3:38:52<39:26:37, 4.58s/it] {'loss': 0.1766, 'grad_norm': 0.7580319978031034, 'learning_rate': 9.885430578937608e-06, 'epoch': 0.1} 10%|▉ | 3299/34278 [3:38:52<39:26:37, 4.58s/it] 10%|▉ | 3300/34278 [3:38:56<38:34:34, 4.48s/it] {'loss': 0.2093, 'grad_norm': 0.9766955848878668, 'learning_rate': 9.88533000218327e-06, 'epoch': 0.1} 10%|▉ | 3300/34278 [3:38:56<38:34:34, 4.48s/it] 10%|▉ | 3301/34278 [3:38:59<34:43:55, 4.04s/it] {'loss': 0.1709, 'grad_norm': 0.6405171835639228, 'learning_rate': 9.885229381813875e-06, 'epoch': 0.1} 10%|▉ | 3301/34278 [3:38:59<34:43:55, 4.04s/it] 10%|▉ | 3302/34278 [3:39:02<31:53:11, 3.71s/it] {'loss': 0.2178, 'grad_norm': 0.9008804603593166, 'learning_rate': 9.885128717830317e-06, 'epoch': 0.1} 10%|▉ | 3302/34278 [3:39:02<31:53:11, 3.71s/it] 10%|▉ | 3303/34278 [3:39:08<37:52:26, 4.40s/it] {'loss': 0.1689, 'grad_norm': 0.8826419199815049, 
'learning_rate': 9.885028010233497e-06, 'epoch': 0.1} 10%|▉ | 3303/34278 [3:39:08<37:52:26, 4.40s/it] 10%|▉ | 3304/34278 [3:39:11<34:36:14, 4.02s/it] {'loss': 0.1964, 'grad_norm': 0.9761910585351266, 'learning_rate': 9.884927259024311e-06, 'epoch': 0.1} 10%|▉ | 3304/34278 [3:39:11<34:36:14, 4.02s/it] 10%|▉ | 3305/34278 [3:39:15<32:44:39, 3.81s/it] {'loss': 0.1975, 'grad_norm': 0.8815602490417409, 'learning_rate': 9.884826464203662e-06, 'epoch': 0.1} 10%|▉ | 3305/34278 [3:39:15<32:44:39, 3.81s/it] 10%|▉ | 3306/34278 [3:39:18<30:48:30, 3.58s/it] {'loss': 0.1909, 'grad_norm': 0.966632303420737, 'learning_rate': 9.88472562577245e-06, 'epoch': 0.1} 10%|▉ | 3306/34278 [3:39:18<30:48:30, 3.58s/it] 10%|▉ | 3307/34278 [3:39:24<36:53:57, 4.29s/it] {'loss': 0.1675, 'grad_norm': 0.9525752124567378, 'learning_rate': 9.88462474373157e-06, 'epoch': 0.1} 10%|▉ | 3307/34278 [3:39:24<36:53:57, 4.29s/it] 10%|▉ | 3308/34278 [3:39:27<34:04:02, 3.96s/it] {'loss': 0.1781, 'grad_norm': 0.9006113194578775, 'learning_rate': 9.88452381808193e-06, 'epoch': 0.1} 10%|▉ | 3308/34278 [3:39:27<34:04:02, 3.96s/it] 10%|▉ | 3309/34278 [3:39:30<31:45:52, 3.69s/it] {'loss': 0.1796, 'grad_norm': 0.8680790723405596, 'learning_rate': 9.884422848824424e-06, 'epoch': 0.1} 10%|▉ | 3309/34278 [3:39:30<31:45:52, 3.69s/it] 10%|▉ | 3310/34278 [3:39:35<34:54:30, 4.06s/it] {'loss': 0.1566, 'grad_norm': 1.6488054174957947, 'learning_rate': 9.88432183595996e-06, 'epoch': 0.1} 10%|▉ | 3310/34278 [3:39:35<34:54:30, 4.06s/it] 10%|▉ | 3311/34278 [3:39:39<36:19:32, 4.22s/it] {'loss': 0.1827, 'grad_norm': 0.8059685590284299, 'learning_rate': 9.884220779489435e-06, 'epoch': 0.1} 10%|▉ | 3311/34278 [3:39:39<36:19:32, 4.22s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11036 > 8192). 
Running this sequence through the model will result in indexing errors 10%|▉ | 3312/34278 [3:39:44<36:50:03, 4.28s/it] {'loss': 0.1832, 'grad_norm': 0.9038782946036477, 'learning_rate': 9.884119679413753e-06, 'epoch': 0.1} 10%|▉ | 3312/34278 [3:39:44<36:50:03, 4.28s/it] 10%|▉ | 3313/34278 [3:39:47<34:31:50, 4.01s/it] {'loss': 0.2048, 'grad_norm': 0.8444990222150676, 'learning_rate': 9.884018535733816e-06, 'epoch': 0.1} 10%|▉ | 3313/34278 [3:39:47<34:31:50, 4.01s/it] 10%|▉ | 3314/34278 [3:39:52<36:00:11, 4.19s/it] {'loss': 0.1994, 'grad_norm': 0.9332736693317996, 'learning_rate': 9.883917348450529e-06, 'epoch': 0.1} 10%|▉ | 3314/34278 [3:39:52<36:00:11, 4.19s/it] 10%|▉ | 3315/34278 [3:39:56<36:20:12, 4.22s/it] {'loss': 0.184, 'grad_norm': 0.9741830425703921, 'learning_rate': 9.883816117564792e-06, 'epoch': 0.1} 10%|▉ | 3315/34278 [3:39:56<36:20:12, 4.22s/it] 10%|▉ | 3316/34278 [3:40:00<35:17:43, 4.10s/it] {'loss': 0.1802, 'grad_norm': 0.8552561903773729, 'learning_rate': 9.883714843077512e-06, 'epoch': 0.1} 10%|▉ | 3316/34278 [3:40:00<35:17:43, 4.10s/it] 10%|▉ | 3317/34278 [3:40:03<32:22:24, 3.76s/it] {'loss': 0.2043, 'grad_norm': 1.017909983191464, 'learning_rate': 9.883613524989591e-06, 'epoch': 0.1} 10%|▉ | 3317/34278 [3:40:03<32:22:24, 3.76s/it] 10%|▉ | 3318/34278 [3:40:07<33:05:31, 3.85s/it] {'loss': 0.1761, 'grad_norm': 0.8278959717775877, 'learning_rate': 9.883512163301934e-06, 'epoch': 0.1} 10%|▉ | 3318/34278 [3:40:07<33:05:31, 3.85s/it] 10%|▉ | 3319/34278 [3:40:10<31:40:39, 3.68s/it] {'loss': 0.1853, 'grad_norm': 1.028255430484232, 'learning_rate': 9.883410758015446e-06, 'epoch': 0.1} 10%|▉ | 3319/34278 [3:40:10<31:40:39, 3.68s/it] 10%|▉ | 3320/34278 [3:40:16<38:06:40, 4.43s/it] {'loss': 0.168, 'grad_norm': 0.8815026666009607, 'learning_rate': 9.883309309131032e-06, 'epoch': 0.1} 10%|▉ | 3320/34278 [3:40:16<38:06:40, 4.43s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: 
UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 10%|▉ | 3321/34278 [3:40:20<34:44:33, 4.04s/it] {'loss': 0.1587, 'grad_norm': 0.9173579267075747, 'learning_rate': 9.883207816649599e-06, 'epoch': 0.1} 10%|▉ | 3321/34278 [3:40:20<34:44:33, 4.04s/it] 10%|▉ | 3322/34278 [3:40:23<33:24:00, 3.88s/it] {'loss': 0.1983, 'grad_norm': 0.8683491755852815, 'learning_rate': 9.883106280572052e-06, 'epoch': 0.1} 10%|▉ | 3322/34278 [3:40:23<33:24:00, 3.88s/it] 10%|▉ | 3323/34278 [3:40:27<32:26:17, 3.77s/it] {'loss': 0.176, 'grad_norm': 1.012331012789619, 'learning_rate': 9.883004700899299e-06, 'epoch': 0.1} 10%|▉ | 3323/34278 [3:40:27<32:26:17, 3.77s/it] 10%|▉ | 3324/34278 [3:40:33<38:26:33, 4.47s/it] {'loss': 0.1965, 'grad_norm': 0.8572340863566649, 'learning_rate': 9.882903077632245e-06, 'epoch': 0.1} 10%|▉ | 3324/34278 [3:40:33<38:26:33, 4.47s/it] 10%|▉ | 3325/34278 [3:40:37<37:43:59, 4.39s/it] {'loss': 0.166, 'grad_norm': 0.8623076186261965, 'learning_rate': 9.882801410771798e-06, 'epoch': 0.1} 10%|▉ | 3325/34278 [3:40:37<37:43:59, 4.39s/it] 10%|▉ | 3326/34278 [3:40:43<41:50:04, 4.87s/it] {'loss': 0.173, 'grad_norm': 1.071937739099093, 'learning_rate': 9.882699700318865e-06, 'epoch': 0.1} 10%|▉ | 3326/34278 [3:40:43<41:50:04, 4.87s/it] 10%|▉ | 3327/34278 [3:40:47<39:35:07, 4.60s/it] {'loss': 0.1951, 'grad_norm': 1.0500033029737166, 'learning_rate': 9.882597946274356e-06, 'epoch': 0.1} 10%|▉ | 3327/34278 [3:40:47<39:35:07, 4.60s/it] 10%|▉ | 3328/34278 [3:40:50<35:51:49, 4.17s/it] {'loss': 0.1773, 'grad_norm': 1.1095799706133407, 'learning_rate': 9.882496148639178e-06, 'epoch': 0.1} 10%|▉ | 3328/34278 [3:40:50<35:51:49, 4.17s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
10%|▉ | 3329/34278 [3:40:53<33:39:10, 3.91s/it] {'loss': 0.2105, 'grad_norm': 0.8412458752082929, 'learning_rate': 9.882394307414237e-06, 'epoch': 0.1}
10%|▉ | 3330/34278 [3:40:57<32:00:24, 3.72s/it] {'loss': 0.1838, 'grad_norm': 0.9602601915159229, 'learning_rate': 9.88229242260045e-06, 'epoch': 0.1}
10%|▉ | 3331/34278 [3:41:01<32:22:16, 3.77s/it] {'loss': 0.1984, 'grad_norm': 1.056922883017463, 'learning_rate': 9.882190494198718e-06, 'epoch': 0.1}
10%|▉ | 3332/34278 [3:41:03<29:53:29, 3.48s/it] {'loss': 0.1755, 'grad_norm': 0.9087468509235599, 'learning_rate': 9.882088522209956e-06, 'epoch': 0.1}
10%|▉ | 3333/34278 [3:41:09<35:00:23, 4.07s/it] {'loss': 0.1816, 'grad_norm': 0.8542197272026406, 'learning_rate': 9.881986506635073e-06, 'epoch': 0.1}
10%|▉ | 3334/34278 [3:41:13<36:28:31, 4.24s/it] {'loss': 0.1975, 'grad_norm': 0.9848855524266356, 'learning_rate': 9.88188444747498e-06, 'epoch': 0.1}
10%|▉ | 3335/34278 [3:41:16<33:16:34, 3.87s/it] {'loss': 0.1866, 'grad_norm': 1.0557472764375053, 'learning_rate': 9.881782344730588e-06, 'epoch': 0.1}
10%|▉ | 3336/34278 [3:41:22<36:33:01, 4.25s/it] {'loss': 0.1833, 'grad_norm': 0.9167491324108101, 'learning_rate': 9.881680198402808e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|▉ | 3337/34278 [3:41:25<33:23:20, 3.88s/it] {'loss': 0.2438, 'grad_norm': 0.9669479835621975, 'learning_rate': 9.881578008492554e-06, 'epoch': 0.1}
10%|▉ | 3338/34278 [3:41:28<31:46:36, 3.70s/it] {'loss': 0.2082, 'grad_norm': 1.132293288909696, 'learning_rate': 9.881475775000735e-06, 'epoch': 0.1}
10%|▉ | 3339/34278 [3:41:31<30:04:56, 3.50s/it] {'loss': 0.1738, 'grad_norm': 1.0823698455225095, 'learning_rate': 9.881373497928267e-06, 'epoch': 0.1}
10%|▉ | 3340/34278 [3:41:37<38:04:49, 4.43s/it] {'loss': 0.2096, 'grad_norm': 1.2544833097035346, 'learning_rate': 9.881271177276061e-06, 'epoch': 0.1}
10%|▉ | 3341/34278 [3:41:44<42:35:39, 4.96s/it] {'loss': 0.1918, 'grad_norm': 1.025550403217825, 'learning_rate': 9.881168813045032e-06, 'epoch': 0.1}
10%|▉ | 3342/34278 [3:41:47<37:48:40, 4.40s/it] {'loss': 0.1862, 'grad_norm': 0.7801058042312882, 'learning_rate': 9.881066405236093e-06, 'epoch': 0.1}
10%|▉ | 3343/34278 [3:41:50<34:42:19, 4.04s/it] {'loss': 0.1944, 'grad_norm': 1.123647415104818, 'learning_rate': 9.880963953850158e-06, 'epoch': 0.1}
10%|▉ | 3344/34278 [3:41:53<31:30:54, 3.67s/it] {'loss': 0.2039, 'grad_norm': 0.9789486274989931, 'learning_rate': 9.880861458888141e-06, 'epoch': 0.1}
10%|▉ | 3345/34278 [3:41:56<30:36:52, 3.56s/it] {'loss': 0.1849, 'grad_norm': 0.8972913970938006, 'learning_rate': 9.88075892035096e-06, 'epoch': 0.1}
10%|▉ | 3346/34278 [3:42:02<36:42:25, 4.27s/it] {'loss': 0.1955, 'grad_norm': 1.4062895258580415, 'learning_rate': 9.880656338239527e-06, 'epoch': 0.1}
10%|▉ | 3347/34278 [3:42:05<34:10:27, 3.98s/it] {'loss': 0.1881, 'grad_norm': 0.9490073349859146, 'learning_rate': 9.880553712554759e-06, 'epoch': 0.1}
10%|▉ | 3348/34278 [3:42:10<35:57:46, 4.19s/it] {'loss': 0.1759, 'grad_norm': 1.015122416593752, 'learning_rate': 9.880451043297574e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|▉ | 3349/34278 [3:42:15<37:35:33, 4.38s/it] {'loss': 0.1793, 'grad_norm': 0.8943725267473936, 'learning_rate': 9.880348330468885e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|▉ | 3350/34278 [3:42:19<36:32:34, 4.25s/it] {'loss': 0.1867, 'grad_norm': 0.969702487842303, 'learning_rate': 9.880245574069613e-06, 'epoch': 0.1}
10%|▉ | 3351/34278 [3:42:22<34:23:08, 4.00s/it] {'loss': 0.1787, 'grad_norm': 1.2799073018645288, 'learning_rate': 9.880142774100673e-06, 'epoch': 0.1}
10%|▉ | 3352/34278 [3:42:25<31:02:46, 3.61s/it] {'loss': 0.1813, 'grad_norm': 1.0941196809913207, 'learning_rate': 9.880039930562983e-06, 'epoch': 0.1}
10%|▉ | 3353/34278 [3:42:28<30:56:40, 3.60s/it] {'loss': 0.1697, 'grad_norm': 1.0847507720429033, 'learning_rate': 9.879937043457462e-06, 'epoch': 0.1}
10%|▉ | 3354/34278 [3:42:32<30:06:16, 3.50s/it] {'loss': 0.2061, 'grad_norm': 0.8460299768224915, 'learning_rate': 9.879834112785028e-06, 'epoch': 0.1}
10%|▉ | 3355/34278 [3:42:36<31:15:35, 3.64s/it] {'loss': 0.1928, 'grad_norm': 1.1308387993487357, 'learning_rate': 9.8797311385466e-06, 'epoch': 0.1}
10%|▉ | 3356/34278 [3:42:38<28:45:07, 3.35s/it] {'loss': 0.2231, 'grad_norm': 1.051652606247428, 'learning_rate': 9.879628120743096e-06, 'epoch': 0.1}
10%|▉ | 3357/34278 [3:42:43<33:20:51, 3.88s/it] {'loss': 0.198, 'grad_norm': 0.9733068669877226, 'learning_rate': 9.879525059375438e-06, 'epoch': 0.1}
10%|▉ | 3358/34278 [3:42:47<31:46:02, 3.70s/it] {'loss': 0.1969, 'grad_norm': 1.0361697472510567, 'learning_rate': 9.879421954444546e-06, 'epoch': 0.1}
10%|▉ | 3359/34278 [3:42:50<29:55:14, 3.48s/it] {'loss': 0.2174, 'grad_norm': 0.9013273137220902, 'learning_rate': 9.879318805951339e-06, 'epoch': 0.1}
10%|▉ | 3360/34278 [3:42:54<31:18:31, 3.65s/it] {'loss': 0.1638, 'grad_norm': 0.9373053268113689, 'learning_rate': 9.879215613896737e-06, 'epoch': 0.1}
10%|▉ | 3361/34278 [3:42:57<29:33:16, 3.44s/it] {'loss': 0.1904, 'grad_norm': 1.0238897306994317, 'learning_rate': 9.879112378281666e-06, 'epoch': 0.1}
10%|▉ | 3362/34278 [3:43:00<29:46:29, 3.47s/it] {'loss': 0.1961, 'grad_norm': 0.9165639559651162, 'learning_rate': 9.879009099107042e-06, 'epoch': 0.1}
10%|▉ | 3363/34278 [3:43:05<33:59:51, 3.96s/it] {'loss': 0.2078, 'grad_norm': 1.13160027184167, 'learning_rate': 9.87890577637379e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|▉ | 3364/34278 [3:43:11<39:02:51, 4.55s/it] {'loss': 0.1955, 'grad_norm': 1.0210775302186303, 'learning_rate': 9.878802410082832e-06, 'epoch': 0.1}
10%|▉ | 3365/34278 [3:43:15<36:02:53, 4.20s/it] {'loss': 0.1682, 'grad_norm': 0.8217896526873605, 'learning_rate': 9.87869900023509e-06, 'epoch': 0.1}
10%|▉ | 3366/34278 [3:43:18<33:59:59, 3.96s/it] {'loss': 0.1885, 'grad_norm': 1.1233133348841937, 'learning_rate': 9.87859554683149e-06, 'epoch': 0.1}
10%|▉ | 3367/34278 [3:43:25<40:54:48, 4.76s/it] {'loss': 0.1978, 'grad_norm': 1.0795927439787238, 'learning_rate': 9.878492049872951e-06, 'epoch': 0.1}
10%|▉ | 3368/34278 [3:43:28<38:02:01, 4.43s/it] {'loss': 0.2049, 'grad_norm': 1.2355938655652003, 'learning_rate': 9.8783885093604e-06, 'epoch': 0.1}
10%|▉ | 3369/34278 [3:43:34<42:11:40, 4.91s/it] {'loss': 0.1823, 'grad_norm': 1.0884883466673847, 'learning_rate': 9.878284925294763e-06, 'epoch': 0.1}
10%|▉ | 3370/34278 [3:43:38<37:45:32, 4.40s/it] {'loss': 0.2198, 'grad_norm': 1.2905904713936949, 'learning_rate': 9.87818129767696e-06, 'epoch': 0.1}
10%|▉ | 3371/34278 [3:43:41<35:39:32, 4.15s/it] {'loss': 0.1829, 'grad_norm': 0.8099165654820559, 'learning_rate': 9.878077626507921e-06, 'epoch': 0.1}
Token indices sequence length is longer than the specified maximum sequence length for this model (8831 > 8192). Running this sequence through the model will result in indexing errors
10%|▉ | 3372/34278 [3:43:44<32:56:30, 3.84s/it] {'loss': 0.2136, 'grad_norm': 1.206895294436484, 'learning_rate': 9.877973911788569e-06, 'epoch': 0.1}
10%|▉ | 3373/34278 [3:43:48<31:36:38, 3.68s/it] {'loss': 0.1921, 'grad_norm': 1.284751688336377, 'learning_rate': 9.87787015351983e-06, 'epoch': 0.1}
10%|▉ | 3374/34278 [3:43:51<29:37:32, 3.45s/it] {'loss': 0.1664, 'grad_norm': 0.7459196499352689, 'learning_rate': 9.877766351702631e-06, 'epoch': 0.1}
10%|▉ | 3375/34278 [3:43:56<36:05:23, 4.20s/it] {'loss': 0.1964, 'grad_norm': 1.2156262081437241, 'learning_rate': 9.877662506337898e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|▉ | 3376/34278 [3:44:03<40:47:00, 4.75s/it] {'loss': 0.2037, 'grad_norm': 0.9004909046762573, 'learning_rate': 9.877558617426558e-06, 'epoch': 0.1}
10%|▉ | 3377/34278 [3:44:06<37:15:33, 4.34s/it] {'loss': 0.1864, 'grad_norm': 0.8786610661004547, 'learning_rate': 9.877454684969541e-06, 'epoch': 0.1}
10%|▉ | 3378/34278 [3:44:09<34:19:00, 4.00s/it] {'loss': 0.1912, 'grad_norm': 0.8747028960856301, 'learning_rate': 9.87735070896777e-06, 'epoch': 0.1}
10%|▉ | 3379/34278 [3:44:13<33:08:56, 3.86s/it] {'loss': 0.1775, 'grad_norm': 0.9535249120129909, 'learning_rate': 9.87724668942218e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|▉ | 3380/34278 [3:44:16<30:59:17, 3.61s/it] {'loss': 0.1693, 'grad_norm': 0.7698045496984315, 'learning_rate': 9.877142626333692e-06, 'epoch': 0.1}
10%|▉ | 3381/34278 [3:44:22<37:21:02, 4.35s/it] {'loss': 0.1714, 'grad_norm': 0.8464332139561874, 'learning_rate': 9.87703851970324e-06, 'epoch': 0.1}
10%|▉ | 3382/34278 [3:44:26<36:20:28, 4.23s/it] {'loss': 0.1952, 'grad_norm': 0.9022198071684441, 'learning_rate': 9.876934369531754e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|▉ | 3383/34278 [3:44:30<37:30:55, 4.37s/it] {'loss': 0.1888, 'grad_norm': 0.9859271685129993, 'learning_rate': 9.87683017582016e-06, 'epoch': 0.1}
10%|▉ | 3384/34278 [3:44:34<34:22:44, 4.01s/it] {'loss': 0.1818, 'grad_norm': 1.026625590481748, 'learning_rate': 9.876725938569392e-06, 'epoch': 0.1}
10%|▉ | 3385/34278 [3:44:37<33:48:48, 3.94s/it] {'loss': 0.1709, 'grad_norm': 0.9036674784035109, 'learning_rate': 9.876621657780378e-06, 'epoch': 0.1}
10%|▉ | 3386/34278 [3:44:44<39:48:45, 4.64s/it] {'loss': 0.1864, 'grad_norm': 1.0830903552836817, 'learning_rate': 9.876517333454051e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|▉ | 3387/34278 [3:44:48<38:07:49, 4.44s/it] {'loss': 0.1951, 'grad_norm': 1.0448141036868295, 'learning_rate': 9.876412965591343e-06, 'epoch': 0.1}
10%|▉ | 3388/34278 [3:44:51<34:26:47, 4.01s/it] {'loss': 0.1695, 'grad_norm': 0.9688155387079148, 'learning_rate': 9.876308554193182e-06, 'epoch': 0.1}
10%|▉ | 3389/34278 [3:44:54<31:37:02, 3.68s/it] {'loss': 0.1769, 'grad_norm': 0.8334733215835574, 'learning_rate': 9.876204099260501e-06, 'epoch': 0.1}
10%|▉ | 3390/34278 [3:44:58<33:09:33, 3.86s/it] {'loss': 0.1955, 'grad_norm': 0.9606360477202657, 'learning_rate': 9.876099600794236e-06, 'epoch': 0.1}
10%|▉ | 3391/34278 [3:45:03<35:56:34, 4.19s/it] {'loss': 0.2048, 'grad_norm': 0.8205109808036659, 'learning_rate': 9.875995058795316e-06, 'epoch': 0.1}
10%|▉ | 3392/34278 [3:45:09<40:45:41, 4.75s/it] {'loss': 0.2095, 'grad_norm': 0.9982764097769949, 'learning_rate': 9.875890473264678e-06, 'epoch': 0.1}
10%|▉ | 3393/34278 [3:45:12<37:08:22, 4.33s/it] {'loss': 0.2021, 'grad_norm': 1.0626266558414459, 'learning_rate': 9.875785844203251e-06, 'epoch': 0.1}
10%|▉ | 3394/34278 [3:45:18<42:10:33, 4.92s/it] {'loss': 0.1683, 'grad_norm': 0.9973487185751021, 'learning_rate': 9.875681171611974e-06, 'epoch': 0.1}
Token indices sequence length is longer than the specified maximum sequence length for this model (9416 > 8192). Running this sequence through the model will result in indexing errors
10%|▉ | 3395/34278 [3:45:22<38:48:11, 4.52s/it] {'loss': 0.1782, 'grad_norm': 1.0140220788974939, 'learning_rate': 9.87557645549178e-06, 'epoch': 0.1}
10%|▉ | 3396/34278 [3:45:25<34:57:07, 4.07s/it] {'loss': 0.169, 'grad_norm': 0.9630179068400443, 'learning_rate': 9.875471695843603e-06, 'epoch': 0.1}
10%|▉ | 3397/34278 [3:45:28<32:57:42, 3.84s/it] {'loss': 0.1653, 'grad_norm': 0.7429839039648668, 'learning_rate': 9.875366892668376e-06, 'epoch': 0.1}
10%|▉ | 3398/34278 [3:45:32<32:05:04, 3.74s/it] {'loss': 0.1951, 'grad_norm': 1.075112707108095, 'learning_rate': 9.87526204596704e-06, 'epoch': 0.1}
10%|▉ | 3399/34278 [3:45:35<30:04:22, 3.51s/it] {'loss': 0.1878, 'grad_norm': 0.9306430813422164, 'learning_rate': 9.875157155740528e-06, 'epoch': 0.1}
10%|▉ | 3400/34278 [3:45:38<28:58:48, 3.38s/it] {'loss': 0.182, 'grad_norm': 0.81734598225374, 'learning_rate': 9.875052221989777e-06, 'epoch': 0.1}
10%|▉ | 3401/34278 [3:45:44<35:38:39, 4.16s/it] {'loss': 0.1704, 'grad_norm': 0.885771193859634, 'learning_rate': 9.874947244715722e-06, 'epoch': 0.1}
10%|▉ | 3402/34278 [3:45:47<33:21:40, 3.89s/it] {'loss': 0.1752, 'grad_norm': 0.8940409062009961, 'learning_rate': 9.874842223919303e-06, 'epoch': 0.1}
10%|▉ | 3403/34278 [3:45:50<31:42:49, 3.70s/it] {'loss': 0.1635, 'grad_norm': 0.8175311045028216, 'learning_rate': 9.874737159601455e-06, 'epoch': 0.1}
10%|▉ | 3404/34278 [3:45:55<33:59:28, 3.96s/it] {'loss': 0.2026, 'grad_norm': 0.9443309919729741, 'learning_rate': 9.87463205176312e-06, 'epoch': 0.1}
10%|▉ | 3405/34278 [3:45:58<31:05:42, 3.63s/it] {'loss': 0.1726, 'grad_norm': 0.919398290601415, 'learning_rate': 9.87452690040523e-06, 'epoch': 0.1}
10%|▉ | 3406/34278 [3:46:01<29:54:32, 3.49s/it] {'loss': 0.1836, 'grad_norm': 0.8357805180322369, 'learning_rate': 9.87442170552873e-06, 'epoch': 0.1}
10%|▉ | 3407/34278 [3:46:04<29:32:41, 3.45s/it] {'loss': 0.1737, 'grad_norm': 0.885788982003115, 'learning_rate': 9.874316467134557e-06, 'epoch': 0.1}
10%|▉ | 3408/34278 [3:46:08<30:36:08, 3.57s/it] {'loss': 0.2005, 'grad_norm': 0.9252449527546076, 'learning_rate': 9.874211185223649e-06, 'epoch': 0.1}
10%|▉ | 3409/34278 [3:46:12<30:19:52, 3.54s/it] {'loss': 0.2026, 'grad_norm': 1.4076580398560004, 'learning_rate': 9.874105859796947e-06, 'epoch': 0.1}
10%|▉ | 3410/34278 [3:46:15<28:52:34, 3.37s/it] {'loss': 0.185, 'grad_norm': 0.9840688774481136, 'learning_rate': 9.87400049085539e-06, 'epoch': 0.1}
10%|▉ | 3411/34278 [3:46:18<29:16:54, 3.42s/it] {'loss': 0.1624, 'grad_norm': 0.8966402285038693, 'learning_rate': 9.873895078399925e-06, 'epoch': 0.1}
10%|▉ | 3412/34278 [3:46:22<29:36:03, 3.45s/it] {'loss': 0.1746, 'grad_norm': 0.8245966853829566, 'learning_rate': 9.873789622431484e-06, 'epoch': 0.1}
10%|▉ | 3413/34278 [3:46:25<29:35:21, 3.45s/it] {'loss': 0.1717, 'grad_norm': 0.941991276321919, 'learning_rate': 9.873684122951013e-06, 'epoch': 0.1}
10%|▉ | 3414/34278 [3:46:28<28:42:10, 3.35s/it] {'loss': 0.1923, 'grad_norm': 0.9563024015671291, 'learning_rate': 9.873578579959456e-06, 'epoch': 0.1}
10%|▉ | 3415/34278 [3:46:32<29:07:16, 3.40s/it] {'loss': 0.2, 'grad_norm': 1.0495462587066298, 'learning_rate': 9.87347299345775e-06, 'epoch': 0.1}
10%|▉ | 3416/34278 [3:46:35<28:58:03, 3.38s/it] {'loss': 0.1999, 'grad_norm': 1.082272418818284, 'learning_rate': 9.873367363446843e-06, 'epoch': 0.1}
10%|▉ | 3417/34278 [3:46:38<27:40:31, 3.23s/it] {'loss': 0.1873, 'grad_norm': 0.9170445275393765, 'learning_rate': 9.873261689927674e-06, 'epoch': 0.1}
10%|▉ | 3418/34278 [3:46:41<27:55:18, 3.26s/it] {'loss': 0.1884, 'grad_norm': 0.9607984396105024, 'learning_rate': 9.873155972901187e-06, 'epoch': 0.1}
10%|▉ | 3419/34278 [3:46:47<33:48:32, 3.94s/it] {'loss': 0.1969, 'grad_norm': 0.878612477667233, 'learning_rate': 9.87305021236833e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|▉ | 3420/34278 [3:46:50<32:14:39, 3.76s/it] {'loss': 0.1816, 'grad_norm': 0.8412296816249966, 'learning_rate': 9.87294440833004e-06, 'epoch': 0.1}
10%|▉ | 3421/34278 [3:46:53<30:11:01, 3.52s/it] {'loss': 0.1765, 'grad_norm': 1.023308551384643, 'learning_rate': 9.872838560787269e-06, 'epoch': 0.1}
10%|▉ | 3422/34278 [3:46:57<29:49:28, 3.48s/it] {'loss': 0.1857, 'grad_norm': 0.7906909622655207, 'learning_rate': 9.872732669740956e-06, 'epoch': 0.1}
10%|▉ | 3423/34278 [3:47:00<30:11:35, 3.52s/it] {'loss': 0.2048, 'grad_norm': 0.8820929409110352, 'learning_rate': 9.87262673519205e-06, 'epoch': 0.1}
10%|▉ | 3424/34278 [3:47:06<36:46:26, 4.29s/it] {'loss': 0.1854, 'grad_norm': 0.968160720640558, 'learning_rate': 9.872520757141497e-06, 'epoch': 0.1}
10%|▉ | 3425/34278 [3:47:10<35:04:51, 4.09s/it] {'loss': 0.183, 'grad_norm': 0.70461981116057, 'learning_rate': 9.87241473559024e-06, 'epoch': 0.1}
10%|▉ | 3426/34278 [3:47:13<32:23:19, 3.78s/it] {'loss': 0.1572, 'grad_norm': 0.6835017711956095, 'learning_rate': 9.872308670539229e-06, 'epoch': 0.1}
10%|▉ | 3427/34278 [3:47:16<30:55:27, 3.61s/it] {'loss': 0.188, 'grad_norm': 0.8106605104603117, 'learning_rate': 9.872202561989409e-06, 'epoch': 0.1}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0caba83fb0>
Failed to fetch sample 3187199. Exception: cannot identify image file <_io.BytesIO object at 0x7f0caba83fb0>
10%|█ | 3428/34278 [3:47:19<29:07:21, 3.40s/it] {'loss': 0.1971, 'grad_norm': 1.0264724096549127, 'learning_rate': 9.872096409941726e-06, 'epoch': 0.1}
10%|█ | 3429/34278 [3:47:22<28:47:09, 3.36s/it] {'loss': 0.183, 'grad_norm': 0.7914214116749334, 'learning_rate': 9.871990214397131e-06, 'epoch': 0.1}
10%|█ | 3430/34278 [3:47:26<29:57:46, 3.50s/it] {'loss': 0.1761, 'grad_norm': 0.9188258887963189, 'learning_rate': 9.871883975356568e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3431/34278 [3:47:30<30:17:42, 3.54s/it] {'loss': 0.1804, 'grad_norm': 0.8394724893678543, 'learning_rate': 9.87177769282099e-06, 'epoch': 0.1}
10%|█ | 3432/34278 [3:47:34<32:19:42, 3.77s/it] {'loss': 0.1958, 'grad_norm': 1.2163310379645849, 'learning_rate': 9.871671366791344e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3433/34278 [3:47:37<30:46:40, 3.59s/it] {'loss': 0.1693, 'grad_norm': 0.8273408849172802, 'learning_rate': 9.87156499726858e-06, 'epoch': 0.1}
10%|█ | 3434/34278 [3:47:41<30:22:26, 3.55s/it] {'loss': 0.1908, 'grad_norm': 0.9024040961441147, 'learning_rate': 9.871458584253644e-06, 'epoch': 0.1}
10%|█ | 3435/34278 [3:47:46<34:25:20, 4.02s/it] {'loss': 0.1711, 'grad_norm': 0.8330899285828883, 'learning_rate': 9.871352127747489e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3436/34278 [3:47:50<35:54:43, 4.19s/it] {'loss': 0.1846, 'grad_norm': 0.9509143420825992, 'learning_rate': 9.871245627751067e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3437/34278 [3:47:55<36:01:35, 4.21s/it] {'loss': 0.2013, 'grad_norm': 1.0883350035256232, 'learning_rate': 9.871139084265324e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3438/34278 [3:47:58<33:39:34, 3.93s/it] {'loss': 0.2145, 'grad_norm': 1.1309254010693355, 'learning_rate': 9.871032497291217e-06, 'epoch': 0.1}
10%|█ | 3439/34278 [3:48:02<34:37:49, 4.04s/it] {'loss': 0.1901, 'grad_norm': 0.9576681093904603, 'learning_rate': 9.870925866829692e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3440/34278 [3:48:06<34:05:29, 3.98s/it] {'loss': 0.1655, 'grad_norm': 1.1262857177310994, 'learning_rate': 9.870819192881707e-06, 'epoch': 0.1}
10%|█ | 3441/34278 [3:48:12<39:28:11, 4.61s/it] {'loss': 0.1817, 'grad_norm': 0.9866849846226996, 'learning_rate': 9.870712475448207e-06, 'epoch': 0.1}
10%|█ | 3442/34278 [3:48:15<36:09:58, 4.22s/it] {'loss': 0.1915, 'grad_norm': 1.0529171418373293, 'learning_rate': 9.870605714530152e-06, 'epoch': 0.1}
10%|█ | 3443/34278 [3:48:19<34:11:55, 3.99s/it] {'loss': 0.189, 'grad_norm': 1.0145462296288652, 'learning_rate': 9.870498910128492e-06, 'epoch': 0.1}
Token indices sequence length is longer than the specified maximum sequence length for this model (8540 > 8192). Running this sequence through the model will result in indexing errors
10%|█ | 3444/34278 [3:48:22<31:37:41, 3.69s/it] {'loss': 0.1554, 'grad_norm': 0.8909609759811756, 'learning_rate': 9.870392062244178e-06, 'epoch': 0.1}
10%|█ | 3445/34278 [3:48:26<32:03:19, 3.74s/it] {'loss': 0.177, 'grad_norm': 1.1927000558620366, 'learning_rate': 9.870285170878167e-06, 'epoch': 0.1}
10%|█ | 3446/34278 [3:48:29<31:58:43, 3.73s/it] {'loss': 0.2003, 'grad_norm': 1.178151159469842, 'learning_rate': 9.870178236031413e-06, 'epoch': 0.1}
10%|█ | 3447/34278 [3:48:33<31:03:59, 3.63s/it] {'loss': 0.1837, 'grad_norm': 0.975324090964805, 'learning_rate': 9.870071257704871e-06, 'epoch': 0.1}
10%|█ | 3448/34278 [3:48:36<30:31:25, 3.56s/it] {'loss': 0.174, 'grad_norm': 1.1448229783115247, 'learning_rate': 9.869964235899494e-06, 'epoch': 0.1}
10%|█ | 3449/34278 [3:48:40<29:47:12, 3.48s/it] {'loss': 0.1821, 'grad_norm': 0.9514149119948482, 'learning_rate': 9.86985717061624e-06, 'epoch': 0.1}
10%|█ | 3450/34278 [3:48:44<31:03:17, 3.63s/it] {'loss': 0.1681, 'grad_norm': 0.847909054724833, 'learning_rate': 9.869750061856063e-06, 'epoch': 0.1}
10%|█ | 3451/34278 [3:48:50<37:33:00, 4.39s/it] {'loss': 0.1883, 'grad_norm': 0.9218506513046638, 'learning_rate': 9.869642909619921e-06, 'epoch': 0.1}
10%|█ | 3452/34278 [3:48:53<34:37:11, 4.04s/it] {'loss': 0.1958, 'grad_norm': 0.9041931059317265, 'learning_rate': 9.869535713908768e-06, 'epoch': 0.1}
10%|█ | 3453/34278 [3:48:57<35:49:38, 4.18s/it] {'loss': 0.1794, 'grad_norm': 0.8896465975703093, 'learning_rate': 9.869428474723563e-06, 'epoch': 0.1}
10%|█ | 3454/34278 [3:49:04<40:39:50, 4.75s/it] {'loss': 0.1734, 'grad_norm': 0.9377163592942124, 'learning_rate': 9.869321192065264e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3455/34278 [3:49:07<37:16:19, 4.35s/it] {'loss': 0.1938, 'grad_norm': 1.0246636715401394, 'learning_rate': 9.869213865934827e-06, 'epoch': 0.1}
Token indices sequence length is longer than the specified maximum sequence length for this model (10905 > 8192). Running this sequence through the model will result in indexing errors
10%|█ | 3456/34278 [3:49:10<34:39:46, 4.05s/it] {'loss': 0.2209, 'grad_norm': 0.8475687952188002, 'learning_rate': 9.869106496333213e-06, 'epoch': 0.1}
10%|█ | 3457/34278 [3:49:14<32:55:05, 3.84s/it] {'loss': 0.1892, 'grad_norm': 0.9379196203907476, 'learning_rate': 9.868999083261377e-06, 'epoch': 0.1}
10%|█ | 3458/34278 [3:49:18<33:01:07, 3.86s/it] {'loss': 0.2129, 'grad_norm': 0.9441790698051824, 'learning_rate': 9.868891626720279e-06, 'epoch': 0.1}
10%|█ | 3459/34278 [3:49:24<39:44:31, 4.64s/it] {'loss': 0.1785, 'grad_norm': 0.7326471551649248, 'learning_rate': 9.868784126710878e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3460/34278 [3:49:29<40:03:50, 4.68s/it] {'loss': 0.1778, 'grad_norm': 0.818500409204287, 'learning_rate': 9.868676583234136e-06, 'epoch': 0.1}
10%|█ | 3461/34278 [3:49:33<38:06:32, 4.45s/it] {'loss': 0.1668, 'grad_norm': 0.8706269163312929, 'learning_rate': 9.868568996291013e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3462/34278 [3:49:36<35:06:55, 4.10s/it] {'loss': 0.1815, 'grad_norm': 1.3126513024180493, 'learning_rate': 9.868461365882465e-06, 'epoch': 0.1}
10%|█ | 3463/34278 [3:49:40<34:38:26, 4.05s/it] {'loss': 0.177, 'grad_norm': 0.8610585831478037, 'learning_rate': 9.868353692009458e-06, 'epoch': 0.1}
10%|█ | 3464/34278 [3:49:43<31:58:24, 3.74s/it] {'loss': 0.1965, 'grad_norm': 1.011781175607378, 'learning_rate': 9.868245974672952e-06, 'epoch': 0.1}
10%|█ | 3465/34278 [3:49:47<32:29:29, 3.80s/it] {'loss': 0.172, 'grad_norm': 0.833371986012559, 'learning_rate': 9.868138213873908e-06, 'epoch': 0.1}
10%|█ | 3466/34278 [3:49:50<30:51:33, 3.61s/it] {'loss': 0.1783, 'grad_norm': 1.022504922983816, 'learning_rate': 9.868030409613286e-06, 'epoch': 0.1}
10%|█ | 3467/34278 [3:49:53<29:28:18, 3.44s/it] {'loss': 0.2108, 'grad_norm': 1.0568751412016992, 'learning_rate': 9.867922561892053e-06, 'epoch': 0.1}
Token indices sequence length is longer than the specified maximum sequence length for this model (10047 > 8192). Running this sequence through the model will result in indexing errors
10%|█ | 3468/34278 [3:49:59<35:44:36, 4.18s/it] {'loss': 0.1764, 'grad_norm': 0.9318373715013006, 'learning_rate': 9.86781467071117e-06, 'epoch': 0.1}
10%|█ | 3469/34278 [3:50:02<32:49:28, 3.84s/it] {'loss': 0.1759, 'grad_norm': 0.9056729730442917, 'learning_rate': 9.867706736071596e-06, 'epoch': 0.1}
10%|█ | 3470/34278 [3:50:08<38:48:17, 4.53s/it] {'loss': 0.1844, 'grad_norm': 0.8599796170135051, 'learning_rate': 9.867598757974302e-06, 'epoch': 0.1}
10%|█ | 3471/34278 [3:50:12<37:02:55, 4.33s/it] {'loss': 0.2018, 'grad_norm': 0.9620661176977179, 'learning_rate': 9.867490736420245e-06, 'epoch': 0.1}
10%|█ | 3472/34278 [3:50:16<36:46:19, 4.30s/it] {'loss': 0.1757, 'grad_norm': 0.9284833683199802, 'learning_rate': 9.867382671410395e-06, 'epoch': 0.1}
10%|█ | 3473/34278 [3:50:22<40:49:13, 4.77s/it] {'loss': 0.2095, 'grad_norm': 0.8619858597108815, 'learning_rate': 9.867274562945713e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3474/34278 [3:50:25<35:47:47, 4.18s/it] {'loss': 0.1842, 'grad_norm': 1.040972216388095, 'learning_rate': 9.867166411027167e-06, 'epoch': 0.1}
10%|█ | 3475/34278 [3:50:29<35:17:15, 4.12s/it] {'loss': 0.1659, 'grad_norm': 1.008087937414507, 'learning_rate': 9.867058215655721e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3476/34278 [3:50:32<32:52:07, 3.84s/it] {'loss': 0.1902, 'grad_norm': 0.8469254202320418, 'learning_rate': 9.86694997683234e-06, 'epoch': 0.1}
Token indices sequence length is longer than the specified maximum sequence length for this model (8442 > 8192). Running this sequence through the model will result in indexing errors
10%|█ | 3477/34278 [3:50:36<33:16:23, 3.89s/it] {'loss': 0.2063, 'grad_norm': 0.7941411979341042, 'learning_rate': 9.866841694557993e-06, 'epoch': 0.1}
10%|█ | 3478/34278 [3:50:40<32:22:28, 3.78s/it] {'loss': 0.186, 'grad_norm': 0.8950825419085655, 'learning_rate': 9.866733368833643e-06, 'epoch': 0.1}
10%|█ | 3479/34278 [3:50:43<30:51:40, 3.61s/it] {'loss': 0.2022, 'grad_norm': 1.14569798578479, 'learning_rate': 9.866624999660262e-06, 'epoch': 0.1}
10%|█ | 3480/34278 [3:50:46<29:27:22, 3.44s/it] {'loss': 0.1754, 'grad_norm': 0.7917867480873769, 'learning_rate': 9.866516587038813e-06, 'epoch': 0.1}
10%|█ | 3481/34278 [3:50:51<34:58:08, 4.09s/it] {'loss': 0.1829, 'grad_norm': 0.8642967705541632, 'learning_rate': 9.866408130970267e-06, 'epoch': 0.1}
10%|█ | 3482/34278 [3:50:55<33:07:53, 3.87s/it] {'loss': 0.2343, 'grad_norm': 1.5467627532716057, 'learning_rate': 9.86629963145559e-06, 'epoch': 0.1}
10%|█ | 3483/34278 [3:51:00<37:03:39, 4.33s/it] {'loss': 0.1737, 'grad_norm': 0.8266053130281075, 'learning_rate': 9.86619108849575e-06, 'epoch': 0.1}
10%|█ | 3484/34278 [3:51:03<33:35:53, 3.93s/it] {'loss': 0.1775, 'grad_norm': 0.9367688798654689, 'learning_rate': 9.86608250209172e-06, 'epoch': 0.1}
10%|█ | 3485/34278 [3:51:07<32:03:02, 3.75s/it] {'loss': 0.1667, 'grad_norm': 0.9155674535316602, 'learning_rate': 9.865973872244466e-06, 'epoch': 0.1}
10%|█ | 3486/34278 [3:51:10<31:24:40, 3.67s/it] {'loss': 0.2094, 'grad_norm': 0.9847455876401546, 'learning_rate': 9.865865198954959e-06, 'epoch': 0.1}
10%|█ | 3487/34278 [3:51:14<31:32:37, 3.69s/it] {'loss': 0.1995, 'grad_norm': 0.8308552008359891, 'learning_rate': 9.865756482224169e-06, 'epoch': 0.1}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
10%|█ | 3488/34278 [3:51:17<29:52:45, 3.49s/it] {'loss': 0.1962, 'grad_norm': 0.8410268218827373, 'learning_rate': 9.865647722053066e-06, 'epoch': 0.1}
10%|█ | 3489/34278 [3:51:20<30:10:00, 3.53s/it] {'loss': 0.1779, 'grad_norm': 1.0087615518486086, 'learning_rate': 9.865538918442624e-06, 'epoch': 0.1}
10%|█ | 3490/34278 [3:51:24<30:46:55, 3.60s/it] {'loss': 0.1771, 'grad_norm': 1.041963448367663, 'learning_rate': 9.86543007139381e-06, 'epoch': 0.1}
10%|█ | 3491/34278 [3:51:28<30:33:10, 3.57s/it] {'loss': 0.167, 'grad_norm': 1.0238256726632478, 'learning_rate': 9.865321180907597e-06, 'epoch': 0.1}
10%|█ | 3492/34278 [3:51:31<30:10:21, 3.53s/it] {'loss': 0.205, 'grad_norm': 1.161048222587803, 'learning_rate': 9.86521224698496e-06, 'epoch': 0.1}
10%|█ | 3493/34278 [3:51:35<30:25:32, 3.56s/it] {'loss': 0.1717, 'grad_norm': 0.9043069579942137, 'learning_rate': 9.865103269626868e-06, 'epoch': 0.1}
10%|█ | 3494/34278 [3:51:38<28:42:09, 3.36s/it] {'loss': 0.1919, 'grad_norm': 0.8923616361442843, 'learning_rate': 9.864994248834297e-06, 'epoch': 0.1}
10%|█ | 3495/34278 [3:51:41<28:09:41, 3.29s/it] {'loss': 0.2093, 'grad_norm': 0.7993846930318241, 'learning_rate': 9.864885184608217e-06, 'epoch': 0.1}
10%|█ | 3496/34278 [3:51:44<27:58:38, 3.27s/it] {'loss': 0.2036, 'grad_norm': 0.9328576530882595, 'learning_rate': 9.864776076949604e-06, 'epoch': 0.1}
10%|█ | 3497/34278 [3:51:47<27:23:12, 3.20s/it] {'loss': 0.1852, 'grad_norm': 0.9497224570949078, 'learning_rate': 9.864666925859432e-06, 'epoch': 0.1} 10%|█ | 
3497/34278 [3:51:47<27:23:12, 3.20s/it] 10%|█ | 3498/34278 [3:51:51<28:57:51, 3.39s/it] {'loss': 0.1747, 'grad_norm': 0.8879533396575509, 'learning_rate': 9.864557731338675e-06, 'epoch': 0.1} 10%|█ | 3498/34278 [3:51:51<28:57:51, 3.39s/it] 10%|█ | 3499/34278 [3:51:54<27:45:11, 3.25s/it] {'loss': 0.1946, 'grad_norm': 0.8937655185804683, 'learning_rate': 9.864448493388307e-06, 'epoch': 0.1} 10%|█ | 3499/34278 [3:51:54<27:45:11, 3.25s/it] 10%|█ | 3500/34278 [3:51:57<26:58:25, 3.16s/it] {'loss': 0.197, 'grad_norm': 0.9357675307242985, 'learning_rate': 9.864339212009304e-06, 'epoch': 0.1} 10%|█ | 3500/34278 [3:51:57<26:58:25, 3.16s/it] 10%|█ | 3501/34278 [3:52:03<34:18:30, 4.01s/it] {'loss': 0.1909, 'grad_norm': 0.8261058627027291, 'learning_rate': 9.864229887202643e-06, 'epoch': 0.1} 10%|█ | 3501/34278 [3:52:03<34:18:30, 4.01s/it] 10%|█ | 3502/34278 [3:52:06<32:02:41, 3.75s/it] {'loss': 0.1902, 'grad_norm': 1.0006517067419611, 'learning_rate': 9.864120518969298e-06, 'epoch': 0.1} 10%|█ | 3502/34278 [3:52:06<32:02:41, 3.75s/it] 10%|█ | 3503/34278 [3:52:12<37:57:52, 4.44s/it] {'loss': 0.1867, 'grad_norm': 1.0604154204135914, 'learning_rate': 9.864011107310246e-06, 'epoch': 0.1} 10%|█ | 3503/34278 [3:52:12<37:57:52, 4.44s/it] 10%|█ | 3504/34278 [3:52:18<43:02:08, 5.03s/it] {'loss': 0.2068, 'grad_norm': 0.8545044363306173, 'learning_rate': 9.863901652226464e-06, 'epoch': 0.1} 10%|█ | 3504/34278 [3:52:18<43:02:08, 5.03s/it] 10%|█ | 3505/34278 [3:52:21<37:45:47, 4.42s/it] {'loss': 0.1637, 'grad_norm': 1.0859515433427762, 'learning_rate': 9.86379215371893e-06, 'epoch': 0.1} 10%|█ | 3505/34278 [3:52:21<37:45:47, 4.42s/it] 10%|█ | 3506/34278 [3:52:26<37:13:32, 4.36s/it] {'loss': 0.2019, 'grad_norm': 1.1918579655623438, 'learning_rate': 9.86368261178862e-06, 'epoch': 0.1} 10%|█ | 3506/34278 [3:52:26<37:13:32, 4.36s/it] 10%|█ | 3507/34278 [3:52:32<41:33:50, 4.86s/it] {'loss': 0.1965, 'grad_norm': 1.0430591125132915, 'learning_rate': 9.863573026436513e-06, 'epoch': 0.1} 10%|█ | 
3507/34278 [3:52:32<41:33:50, 4.86s/it] 10%|█ | 3508/34278 [3:52:35<37:23:00, 4.37s/it] {'loss': 0.1915, 'grad_norm': 1.221202207806947, 'learning_rate': 9.863463397663587e-06, 'epoch': 0.1} 10%|█ | 3508/34278 [3:52:35<37:23:00, 4.37s/it] 10%|█ | 3509/34278 [3:52:38<34:44:21, 4.06s/it] {'loss': 0.2163, 'grad_norm': 1.142944732564729, 'learning_rate': 9.863353725470822e-06, 'epoch': 0.1} 10%|█ | 3509/34278 [3:52:38<34:44:21, 4.06s/it] 10%|█ | 3510/34278 [3:52:44<39:37:24, 4.64s/it] {'loss': 0.1711, 'grad_norm': 0.9364772931896006, 'learning_rate': 9.863244009859194e-06, 'epoch': 0.1} 10%|█ | 3510/34278 [3:52:44<39:37:24, 4.64s/it] 10%|█ | 3511/34278 [3:52:48<38:38:19, 4.52s/it] {'loss': 0.1801, 'grad_norm': 1.052710404061729, 'learning_rate': 9.863134250829685e-06, 'epoch': 0.1} 10%|█ | 3511/34278 [3:52:48<38:38:19, 4.52s/it] 10%|█ | 3512/34278 [3:52:52<37:11:27, 4.35s/it] {'loss': 0.1846, 'grad_norm': 1.0758791904026912, 'learning_rate': 9.863024448383273e-06, 'epoch': 0.1} 10%|█ | 3512/34278 [3:52:52<37:11:27, 4.35s/it] 10%|█ | 3513/34278 [3:52:56<36:08:38, 4.23s/it] {'loss': 0.1926, 'grad_norm': 1.038212081530209, 'learning_rate': 9.86291460252094e-06, 'epoch': 0.1} 10%|█ | 3513/34278 [3:52:56<36:08:38, 4.23s/it] 10%|█ | 3514/34278 [3:53:00<35:20:22, 4.14s/it] {'loss': 0.1836, 'grad_norm': 1.134331392771612, 'learning_rate': 9.862804713243667e-06, 'epoch': 0.1} 10%|█ | 3514/34278 [3:53:00<35:20:22, 4.14s/it] 10%|█ | 3515/34278 [3:53:04<33:31:29, 3.92s/it] {'loss': 0.1736, 'grad_norm': 1.0703474061020763, 'learning_rate': 9.862694780552435e-06, 'epoch': 0.1} 10%|█ | 3515/34278 [3:53:04<33:31:29, 3.92s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8784 > 8192). 
Running this sequence through the model will result in indexing errors 10%|█ | 3516/34278 [3:53:07<32:18:07, 3.78s/it] {'loss': 0.2009, 'grad_norm': 1.0638348901860075, 'learning_rate': 9.862584804448226e-06, 'epoch': 0.1} 10%|█ | 3516/34278 [3:53:07<32:18:07, 3.78s/it] 10%|█ | 3517/34278 [3:53:10<30:09:18, 3.53s/it] {'loss': 0.1828, 'grad_norm': 0.8599113957013631, 'learning_rate': 9.862474784932018e-06, 'epoch': 0.1} 10%|█ | 3517/34278 [3:53:10<30:09:18, 3.53s/it] 10%|█ | 3518/34278 [3:53:16<36:32:38, 4.28s/it] {'loss': 0.1715, 'grad_norm': 0.9874231174873996, 'learning_rate': 9.862364722004798e-06, 'epoch': 0.1} 10%|█ | 3518/34278 [3:53:16<36:32:38, 4.28s/it] 10%|█ | 3519/34278 [3:53:19<33:23:57, 3.91s/it] {'loss': 0.1908, 'grad_norm': 0.8885320043956563, 'learning_rate': 9.862254615667546e-06, 'epoch': 0.1} 10%|█ | 3519/34278 [3:53:19<33:23:57, 3.91s/it] 10%|█ | 3520/34278 [3:53:25<39:04:40, 4.57s/it] {'loss': 0.2105, 'grad_norm': 1.073164181263853, 'learning_rate': 9.862144465921244e-06, 'epoch': 0.1} 10%|█ | 3520/34278 [3:53:25<39:04:40, 4.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 10%|█ | 3521/34278 [3:53:31<41:20:23, 4.84s/it] {'loss': 0.1767, 'grad_norm': 0.8240112977118601, 'learning_rate': 9.862034272766879e-06, 'epoch': 0.1} 10%|█ | 3521/34278 [3:53:31<41:20:23, 4.84s/it] 10%|█ | 3522/34278 [3:53:37<43:55:55, 5.14s/it] {'loss': 0.1746, 'grad_norm': 0.8359128423414961, 'learning_rate': 9.86192403620543e-06, 'epoch': 0.1} 10%|█ | 3522/34278 [3:53:37<43:55:55, 5.14s/it] 10%|█ | 3523/34278 [3:53:40<39:43:49, 4.65s/it] {'loss': 0.2089, 'grad_norm': 0.9401539512117901, 'learning_rate': 9.861813756237886e-06, 'epoch': 0.1} 10%|█ | 3523/34278 [3:53:40<39:43:49, 4.65s/it] 10%|█ | 3524/34278 [3:53:46<42:53:05, 5.02s/it] {'loss': 0.1778, 'grad_norm': 0.7972123494742941, 'learning_rate': 9.861703432865228e-06, 'epoch': 0.1} 10%|█ | 3524/34278 [3:53:46<42:53:05, 5.02s/it] 10%|█ | 3525/34278 [3:53:50<39:41:04, 4.65s/it] {'loss': 0.1787, 'grad_norm': 0.8325761377041836, 'learning_rate': 9.861593066088444e-06, 'epoch': 0.1} 10%|█ | 3525/34278 [3:53:50<39:41:04, 4.65s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 10%|█ | 3526/34278 [3:53:56<43:21:55, 5.08s/it] {'loss': 0.2027, 'grad_norm': 0.820495223309961, 'learning_rate': 9.861482655908517e-06, 'epoch': 0.1} 10%|█ | 3526/34278 [3:53:56<43:21:55, 5.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 10%|█ | 3527/34278 [3:53:59<38:47:01, 4.54s/it] {'loss': 0.1723, 'grad_norm': 0.7924336240807823, 'learning_rate': 9.861372202326432e-06, 'epoch': 0.1} 10%|█ | 3527/34278 [3:53:59<38:47:01, 4.54s/it] 10%|█ | 3528/34278 [3:54:05<42:20:36, 4.96s/it] {'loss': 0.1683, 'grad_norm': 0.7817727912241805, 'learning_rate': 9.861261705343178e-06, 'epoch': 0.1} 10%|█ | 3528/34278 [3:54:05<42:20:36, 4.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 10%|█ | 3529/34278 [3:54:08<37:52:59, 4.44s/it] {'loss': 0.1749, 'grad_norm': 0.86152589055737, 'learning_rate': 9.861151164959738e-06, 'epoch': 0.1} 10%|█ | 3529/34278 [3:54:08<37:52:59, 4.44s/it] 10%|█ | 3530/34278 [3:54:12<35:07:53, 4.11s/it] {'loss': 0.1547, 'grad_norm': 0.6543986493202684, 'learning_rate': 9.861040581177103e-06, 'epoch': 0.1} 10%|█ | 3530/34278 [3:54:12<35:07:53, 4.11s/it] 10%|█ | 3531/34278 [3:54:15<32:51:52, 3.85s/it] {'loss': 0.1648, 'grad_norm': 1.0808079835968676, 'learning_rate': 9.86092995399626e-06, 'epoch': 0.1} 10%|█ | 3531/34278 [3:54:15<32:51:52, 3.85s/it] 10%|█ | 3532/34278 [3:54:18<31:00:35, 3.63s/it] {'loss': 0.1878, 'grad_norm': 0.8424842609109987, 'learning_rate': 9.860819283418192e-06, 'epoch': 0.1} 10%|█ | 3532/34278 [3:54:18<31:00:35, 3.63s/it] 10%|█ | 3533/34278 [3:54:22<32:12:05, 3.77s/it] {'loss': 0.1892, 'grad_norm': 0.8238721639152723, 'learning_rate': 9.860708569443888e-06, 'epoch': 0.1} 10%|█ | 3533/34278 [3:54:22<32:12:05, 3.77s/it] 10%|█ | 3534/34278 [3:54:25<30:58:37, 3.63s/it] {'loss': 0.1705, 'grad_norm': 0.8802106856130075, 'learning_rate': 9.860597812074343e-06, 'epoch': 0.1} 10%|█ | 3534/34278 [3:54:25<30:58:37, 3.63s/it] 10%|█ | 3535/34278 [3:54:29<30:32:30, 3.58s/it] {'loss': 0.1472, 'grad_norm': 0.7550766242911688, 'learning_rate': 9.860487011310537e-06, 
'epoch': 0.1} 10%|█ | 3535/34278 [3:54:29<30:32:30, 3.58s/it] 10%|█ | 3536/34278 [3:54:35<37:00:15, 4.33s/it] {'loss': 0.1783, 'grad_norm': 0.9375274970501286, 'learning_rate': 9.860376167153466e-06, 'epoch': 0.1} 10%|█ | 3536/34278 [3:54:35<37:00:15, 4.33s/it] 10%|█ | 3537/34278 [3:54:38<34:08:43, 4.00s/it] {'loss': 0.2091, 'grad_norm': 0.9816305146749097, 'learning_rate': 9.860265279604114e-06, 'epoch': 0.1} 10%|█ | 3537/34278 [3:54:38<34:08:43, 4.00s/it] 10%|█ | 3538/34278 [3:54:42<33:29:48, 3.92s/it] {'loss': 0.1865, 'grad_norm': 0.9361128396163539, 'learning_rate': 9.860154348663476e-06, 'epoch': 0.1} 10%|█ | 3538/34278 [3:54:42<33:29:48, 3.92s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10658 > 8192). Running this sequence through the model will result in indexing errors 10%|█ | 3539/34278 [3:54:47<36:07:12, 4.23s/it] {'loss': 0.1921, 'grad_norm': 0.9431525304148766, 'learning_rate': 9.86004337433254e-06, 'epoch': 0.1} 10%|█ | 3539/34278 [3:54:47<36:07:12, 4.23s/it] 10%|█ | 3540/34278 [3:54:52<39:09:09, 4.59s/it] {'loss': 0.1886, 'grad_norm': 1.1496564048652087, 'learning_rate': 9.859932356612297e-06, 'epoch': 0.1} 10%|█ | 3540/34278 [3:54:52<39:09:09, 4.59s/it] 10%|█ | 3541/34278 [3:54:56<37:20:34, 4.37s/it] {'loss': 0.2, 'grad_norm': 0.9096912734203451, 'learning_rate': 9.859821295503736e-06, 'epoch': 0.1} 10%|█ | 3541/34278 [3:54:56<37:20:34, 4.37s/it] 10%|█ | 3542/34278 [3:54:59<34:10:42, 4.00s/it] {'loss': 0.2068, 'grad_norm': 0.7937785251353193, 'learning_rate': 9.859710191007851e-06, 'epoch': 0.1} 10%|█ | 3542/34278 [3:54:59<34:10:42, 4.00s/it] 10%|█ | 3543/34278 [3:55:02<31:53:00, 3.73s/it] {'loss': 0.1765, 'grad_norm': 1.0477861995956441, 'learning_rate': 9.859599043125636e-06, 'epoch': 0.1} 10%|█ | 3543/34278 [3:55:02<31:53:00, 3.73s/it] 10%|█ | 3544/34278 [3:55:05<30:20:29, 3.55s/it] {'loss': 0.1709, 'grad_norm': 0.8480879574997857, 'learning_rate': 9.85948785185808e-06, 'epoch': 0.1} 10%|█ 
10%|█ | 3545/34278 [3:55:09<29:31:48, 3.46s/it] {'loss': 0.1727, 'grad_norm': 0.7737043039154154, 'learning_rate': 9.859376617206175e-06, 'epoch': 0.1}
10%|█ | 3546/34278 [3:55:14<33:13:45, 3.89s/it] {'loss': 0.1763, 'grad_norm': 0.8627681389248744, 'learning_rate': 9.859265339170918e-06, 'epoch': 0.1}
10%|█ | 3547/34278 [3:55:19<36:18:24, 4.25s/it] {'loss': 0.1755, 'grad_norm': 0.7419522433195506, 'learning_rate': 9.859154017753299e-06, 'epoch': 0.1}
10%|█ | 3548/34278 [3:55:22<33:36:40, 3.94s/it] {'loss': 0.1811, 'grad_norm': 0.9000903192887774, 'learning_rate': 9.859042652954312e-06, 'epoch': 0.1}
10%|█ | 3549/34278 [3:55:26<34:49:12, 4.08s/it] {'loss': 0.1972, 'grad_norm': 0.8256068494585156, 'learning_rate': 9.858931244774952e-06, 'epoch': 0.1}
10%|█ | 3550/34278 [3:55:29<32:02:08, 3.75s/it] {'loss': 0.1571, 'grad_norm': 0.8784834109295333, 'learning_rate': 9.858819793216214e-06, 'epoch': 0.1}
10%|█ | 3551/34278 [3:55:33<32:27:21, 3.80s/it] {'loss': 0.1682, 'grad_norm': 0.7339011855047681, 'learning_rate': 9.858708298279094e-06, 'epoch': 0.1}
10%|█ | 3552/34278 [3:55:36<31:06:04, 3.64s/it] {'loss': 0.1947, 'grad_norm': 0.776599733420472, 'learning_rate': 9.858596759964586e-06, 'epoch': 0.1}
10%|█ | 3553/34278 [3:55:41<33:17:17, 3.90s/it] {'loss': 0.2001, 'grad_norm': 1.0010043951440706, 'learning_rate': 9.858485178273684e-06, 'epoch': 0.1}
10%|█ | 3554/34278 [3:55:47<38:50:37, 4.55s/it] {'loss': 0.1782, 'grad_norm': 0.9808689083488781, 'learning_rate': 9.858373553207387e-06, 'epoch': 0.1}
10%|█ | 3555/34278 [3:55:50<34:32:33, 4.05s/it] {'loss': 0.1694, 'grad_norm': 0.7748226788065447, 'learning_rate': 9.858261884766693e-06, 'epoch': 0.1}
10%|█ | 3556/34278 [3:55:53<32:00:18, 3.75s/it] {'loss': 0.1901, 'grad_norm': 0.9663212703640649, 'learning_rate': 9.858150172952594e-06, 'epoch': 0.1}
10%|█ | 3557/34278 [3:55:56<30:50:02, 3.61s/it] {'loss': 0.1836, 'grad_norm': 0.9912394499988492, 'learning_rate': 9.85803841776609e-06, 'epoch': 0.1}
10%|█ | 3558/34278 [3:55:59<29:39:10, 3.47s/it] {'loss': 0.1824, 'grad_norm': 1.1206233748183603, 'learning_rate': 9.857926619208181e-06, 'epoch': 0.1}
10%|█ | 3559/34278 [3:56:02<28:35:15, 3.35s/it] {'loss': 0.1862, 'grad_norm': 0.8513351639284373, 'learning_rate': 9.857814777279861e-06, 'epoch': 0.1}
10%|█ | 3560/34278 [3:56:07<32:19:09, 3.79s/it] {'loss': 0.173, 'grad_norm': 1.1036362299937632, 'learning_rate': 9.85770289198213e-06, 'epoch': 0.1}
10%|█ | 3561/34278 [3:56:11<31:19:44, 3.67s/it] {'loss': 0.1692, 'grad_norm': 0.917666980493957, 'learning_rate': 9.85759096331599e-06, 'epoch': 0.1}
10%|█ | 3562/34278 [3:56:14<29:39:00, 3.48s/it] {'loss': 0.1897, 'grad_norm': 0.9200122465756295, 'learning_rate': 9.857478991282434e-06, 'epoch': 0.1}
10%|█ | 3563/34278 [3:56:17<28:37:42, 3.36s/it] {'loss': 0.21, 'grad_norm': 0.9203765928255002, 'learning_rate': 9.857366975882468e-06, 'epoch': 0.1}
10%|█ | 3564/34278 [3:56:22<33:19:34, 3.91s/it] {'loss': 0.1877, 'grad_norm': 0.8525297570143247, 'learning_rate': 9.857254917117087e-06, 'epoch': 0.1}
10%|█ | 3565/34278 [3:56:28<38:58:28, 4.57s/it] {'loss': 0.1892, 'grad_norm': 0.7806261111855988, 'learning_rate': 9.857142814987295e-06, 'epoch': 0.1}
10%|█ | 3566/34278 [3:56:34<42:59:52, 5.04s/it] {'loss': 0.1845, 'grad_norm': 0.7587133703653338, 'learning_rate': 9.85703066949409e-06, 'epoch': 0.1}
10%|█ | 3567/34278 [3:56:37<37:59:20, 4.45s/it] {'loss': 0.191, 'grad_norm': 1.026226746266854, 'learning_rate': 9.856918480638476e-06, 'epoch': 0.1}
10%|█ | 3568/34278 [3:56:41<37:00:45, 4.34s/it] {'loss': 0.1728, 'grad_norm': 0.7792006638213195, 'learning_rate': 9.856806248421453e-06, 'epoch': 0.1}
10%|█ | 3569/34278 [3:56:45<33:58:08, 3.98s/it] {'loss': 0.1595, 'grad_norm': 0.8558516870616105, 'learning_rate': 9.856693972844022e-06, 'epoch': 0.1}
10%|█ | 3570/34278 [3:56:48<31:37:48, 3.71s/it] {'loss': 0.1845, 'grad_norm': 0.8568316620101666, 'learning_rate': 9.856581653907188e-06, 'epoch': 0.1}
10%|█ | 3571/34278 [3:56:51<30:42:08, 3.60s/it] {'loss': 0.1891, 'grad_norm': 0.9358208121428494, 'learning_rate': 9.856469291611953e-06, 'epoch': 0.1}
10%|█ | 3572/34278 [3:56:54<29:46:55, 3.49s/it] {'loss': 0.1942, 'grad_norm': 0.9295522384551842, 'learning_rate': 9.856356885959318e-06, 'epoch': 0.1}
10%|█ | 3573/34278 [3:57:00<36:18:52, 4.26s/it] {'loss': 0.1781, 'grad_norm': 0.8160735448647487, 'learning_rate': 9.856244436950287e-06, 'epoch': 0.1}
10%|█ | 3574/34278 [3:57:06<41:24:27, 4.85s/it] {'loss': 0.201, 'grad_norm': 1.0885212326819285, 'learning_rate': 9.856131944585867e-06, 'epoch': 0.1}
10%|█ | 3575/34278 [3:57:10<38:59:58, 4.57s/it] {'loss': 0.1903, 'grad_norm': 0.9730661270887382, 'learning_rate': 9.85601940886706e-06, 'epoch': 0.1}
10%|█ | 3576/34278 [3:57:16<42:18:35, 4.96s/it] {'loss': 0.1936, 'grad_norm': 0.7703858602175028, 'learning_rate': 9.85590682979487e-06, 'epoch': 0.1}
10%|█ | 3577/34278 [3:57:20<39:41:55, 4.66s/it] {'loss': 0.1974, 'grad_norm': 0.804187780846293, 'learning_rate': 9.855794207370305e-06, 'epoch': 0.1}
10%|█ | 3578/34278 [3:57:26<43:31:20, 5.10s/it] {'loss': 0.226, 'grad_norm': 0.8588128990261085, 'learning_rate': 9.855681541594367e-06, 'epoch': 0.1}
10%|█ | 3579/34278 [3:57:29<38:21:56, 4.50s/it] {'loss': 0.185, 'grad_norm': 0.9682983627913089, 'learning_rate': 9.855568832468063e-06, 'epoch': 0.1}
10%|█ | 3580/34278 [3:57:34<37:24:27, 4.39s/it] {'loss': 0.1866, 'grad_norm': 0.8972332881015471, 'learning_rate': 9.8554560799924e-06, 'epoch': 0.1}
10%|█ | 3581/34278 [3:57:36<33:16:20, 3.90s/it] {'loss': 0.1768, 'grad_norm': 0.817740618854365, 'learning_rate': 9.855343284168384e-06, 'epoch': 0.1}
10%|█ | 3582/34278 [3:57:42<38:23:11, 4.50s/it] {'loss': 0.1845, 'grad_norm': 0.8671290101837122, 'learning_rate': 9.855230444997021e-06, 'epoch': 0.1}
Token indices sequence length is longer than the specified maximum sequence length for this model (11363 > 8192). Running this sequence through the model will result in indexing errors
10%|█ | 3583/34278 [3:57:46<36:15:08, 4.25s/it] {'loss': 0.1987, 'grad_norm': 1.0124203321984675, 'learning_rate': 9.855117562479321e-06, 'epoch': 0.1}
10%|█ | 3584/34278 [3:57:50<34:59:13, 4.10s/it] {'loss': 0.1724, 'grad_norm': 1.0825051900887046, 'learning_rate': 9.855004636616293e-06, 'epoch': 0.1}
10%|█ | 3585/34278 [3:57:56<40:44:21, 4.78s/it] {'loss': 0.2068, 'grad_norm': 0.9814958449489769, 'learning_rate': 9.85489166740894e-06, 'epoch': 0.1}
10%|█ | 3586/34278 [3:57:59<36:56:46, 4.33s/it] {'loss': 0.1795, 'grad_norm': 0.9602983957181478, 'learning_rate': 9.854778654858272e-06, 'epoch': 0.1}
10%|█ | 3587/34278 [3:58:02<33:45:39, 3.96s/it] {'loss': 0.176, 'grad_norm': 0.9920054261851653, 'learning_rate': 9.854665598965301e-06, 'epoch': 0.1}
10%|█ | 3588/34278 [3:58:07<35:44:44, 4.19s/it] {'loss': 0.1929, 'grad_norm': 0.9678449489202484, 'learning_rate': 9.854552499731032e-06, 'epoch': 0.1}
10%|█ | 3589/34278 [3:58:11<34:57:19, 4.10s/it] {'loss': 0.1992, 'grad_norm': 0.9525309788802895, 'learning_rate': 9.85443935715648e-06, 'epoch': 0.1}
10%|█ | 3590/34278 [3:58:15<35:27:16, 4.16s/it] {'loss': 0.1708, 'grad_norm': 0.8838725559384846, 'learning_rate': 9.854326171242651e-06, 'epoch': 0.1}
10%|█ | 3591/34278 [3:58:19<33:44:28, 3.96s/it] {'loss': 0.1706, 'grad_norm': 0.9580701114468362, 'learning_rate': 9.854212941990557e-06, 'epoch': 0.1}
10%|█ | 3592/34278 [3:58:22<32:10:53, 3.78s/it] {'loss': 0.1909, 'grad_norm': 1.1849188690459378, 'learning_rate': 9.854099669401209e-06, 'epoch': 0.1}
10%|█ | 3593/34278 [3:58:26<33:20:09, 3.91s/it] {'loss': 0.1949, 'grad_norm': 0.8369489004436974, 'learning_rate': 9.853986353475618e-06, 'epoch': 0.1}
10%|█ | 3594/34278 [3:58:30<32:23:07, 3.80s/it] {'loss': 0.1601, 'grad_norm': 0.8283792487854817, 'learning_rate': 9.853872994214794e-06, 'epoch': 0.1}
10%|█ | 3595/34278 [3:58:33<30:02:14, 3.52s/it] {'loss': 0.1886, 'grad_norm': 0.9514420257774472, 'learning_rate': 9.853759591619752e-06, 'epoch': 0.1}
10%|█ | 3596/34278 [3:58:37<32:11:19, 3.78s/it] {'loss': 0.1724, 'grad_norm': 0.7027861315228137, 'learning_rate': 9.853646145691502e-06, 'epoch': 0.1}
10%|█ | 3597/34278 [3:58:44<39:57:25, 4.69s/it] {'loss': 0.2092, 'grad_norm': 0.821053437805854, 'learning_rate': 9.85353265643106e-06, 'epoch': 0.1}
10%|█ | 3598/34278 [3:58:48<37:28:20, 4.40s/it] {'loss': 0.1811, 'grad_norm': 0.961109184519753, 'learning_rate': 9.853419123839434e-06, 'epoch': 0.1}
10%|█ | 3599/34278 [3:58:51<33:52:55, 3.98s/it] {'loss': 0.2077, 'grad_norm': 0.8388968563304371, 'learning_rate': 9.853305547917643e-06, 'epoch': 0.1}
11%|█ | 3600/34278 [3:58:55<33:26:15, 3.92s/it] {'loss': 0.2083, 'grad_norm': 0.8995002274805877, 'learning_rate': 9.853191928666699e-06, 'epoch': 0.11}
11%|█ | 3601/34278 [3:58:58<31:12:39, 3.66s/it] {'loss': 0.1739, 'grad_norm': 0.8278156054714256, 'learning_rate': 9.853078266087615e-06, 'epoch': 0.11}
11%|█ | 3602/34278 [3:59:01<31:37:50, 3.71s/it] {'loss': 0.173, 'grad_norm': 0.8207064823983518, 'learning_rate': 9.852964560181406e-06, 'epoch': 0.11}
11%|█ | 3603/34278 [3:59:05<30:19:21, 3.56s/it] {'loss': 0.1978, 'grad_norm': 0.778077752028872, 'learning_rate': 9.852850810949088e-06, 'epoch': 0.11}
11%|█ | 3604/34278 [3:59:11<36:21:11, 4.27s/it] {'loss': 0.1907, 'grad_norm': 0.7439386492724069, 'learning_rate': 9.852737018391678e-06, 'epoch': 0.11}
11%|█ | 3605/34278 [3:59:14<33:35:57, 3.94s/it] {'loss': 0.1684, 'grad_norm': 0.8941220101624787, 'learning_rate': 9.85262318251019e-06, 'epoch': 0.11}
11%|█ | 3606/34278 [3:59:20<38:51:24, 4.56s/it] {'loss': 0.2096, 'grad_norm': 0.920803984158187, 'learning_rate': 9.85250930330564e-06, 'epoch': 0.11}
11%|█ | 3607/34278 [3:59:23<35:58:29, 4.22s/it] {'loss': 0.1747, 'grad_norm': 0.8530089595896981, 'learning_rate': 9.852395380779045e-06, 'epoch': 0.11}
11%|█ | 3608/34278 [3:59:26<33:42:11, 3.96s/it] {'loss': 0.2058, 'grad_norm': 0.7817955853277232, 'learning_rate': 9.852281414931422e-06, 'epoch': 0.11}
11%|█ | 3609/34278 [3:59:30<32:22:06, 3.80s/it] {'loss': 0.182, 'grad_norm': 0.8349454842830898, 'learning_rate': 9.852167405763791e-06, 'epoch': 0.11}
11%|█ | 3610/34278 [3:59:33<30:00:42, 3.52s/it] {'loss': 0.1861, 'grad_norm': 0.9498919429639381, 'learning_rate': 9.852053353277166e-06, 'epoch': 0.11}
11%|█ | 3611/34278 [3:59:36<29:48:58, 3.50s/it] {'loss': 0.1714, 'grad_norm': 1.010670781742482, 'learning_rate': 9.851939257472567e-06, 'epoch': 0.11}
11%|█ | 3612/34278 [3:59:39<28:18:09, 3.32s/it] {'loss': 0.1748, 'grad_norm': 0.8576945237017012, 'learning_rate': 9.851825118351012e-06, 'epoch': 0.11}
11%|█ | 3613/34278 [3:59:43<29:15:15, 3.43s/it] {'loss': 0.1497, 'grad_norm': 0.8817090284343435, 'learning_rate': 9.851710935913522e-06, 'epoch': 0.11}
11%|█ | 3614/34278 [3:59:48<34:42:07, 4.07s/it] {'loss': 0.1873, 'grad_norm': 0.9768218949533995, 'learning_rate': 9.851596710161115e-06, 'epoch': 0.11}
11%|█ | 3615/34278 [3:59:54<38:03:57, 4.47s/it] {'loss': 0.1937, 'grad_norm': 0.969484927400703, 'learning_rate': 9.851482441094809e-06, 'epoch': 0.11}
11%|█ | 3616/34278 [4:00:00<42:22:39, 4.98s/it] {'loss': 0.1608, 'grad_norm': 0.8436609685314721, 'learning_rate': 9.851368128715627e-06, 'epoch': 0.11}
11%|█ | 3617/34278 [4:00:03<38:08:12, 4.48s/it] {'loss': 0.2042, 'grad_norm': 0.8973571802412434, 'learning_rate': 9.85125377302459e-06, 'epoch': 0.11}
11%|█ | 3618/34278 [4:00:06<33:34:47, 3.94s/it] {'loss': 0.1761, 'grad_norm': 0.9805136113686406, 'learning_rate': 9.851139374022715e-06, 'epoch': 0.11}
11%|█ | 3619/34278 [4:00:09<31:28:36, 3.70s/it] {'loss': 0.1988, 'grad_norm': 0.8950355394942241, 'learning_rate': 9.851024931711026e-06, 'epoch': 0.11}
11%|█ | 3620/34278 [4:00:12<30:43:05, 3.61s/it] {'loss': 0.1736, 'grad_norm': 0.9967364634792623, 'learning_rate': 9.850910446090545e-06, 'epoch': 0.11}
11%|█ | 3621/34278 [4:00:16<29:46:45, 3.50s/it] {'loss': 0.1721, 'grad_norm': 0.8152349956595033, 'learning_rate': 9.850795917162295e-06, 'epoch': 0.11}
11%|█ | 3622/34278 [4:00:19<28:25:52, 3.34s/it] {'loss': 0.2011, 'grad_norm': 0.9117315910839804, 'learning_rate': 9.850681344927295e-06, 'epoch': 0.11}
11%|█ | 3623/34278 [4:00:22<28:14:15, 3.32s/it] {'loss': 0.1913, 'grad_norm': 0.9183673562073597, 'learning_rate': 9.85056672938657e-06, 'epoch': 0.11}
11%|█ | 3624/34278 [4:00:26<29:09:34, 3.42s/it] {'loss': 0.1997, 'grad_norm': 0.8907167001631543, 'learning_rate': 9.850452070541145e-06, 'epoch': 0.11}
11%|█ | 3625/34278 [4:00:29<28:25:24, 3.34s/it] {'loss': 0.1567, 'grad_norm': 0.7452388870196537, 'learning_rate': 9.85033736839204e-06, 'epoch': 0.11}
11%|█ | 3626/34278 [4:00:32<28:10:24, 3.31s/it] {'loss': 0.1786, 'grad_norm': 1.0293733016375348, 'learning_rate': 9.850222622940282e-06, 'epoch': 0.11}
11%|█ | 3627/34278 [4:00:38<35:00:20, 4.11s/it] {'loss': 0.1807, 'grad_norm': 0.8947462098266787, 'learning_rate': 9.850107834186893e-06, 'epoch': 0.11}
11%|█ | 3628/34278 [4:00:42<35:15:12, 4.14s/it] {'loss': 0.1698, 'grad_norm': 0.8708521042151944, 'learning_rate': 9.8499930021329e-06, 'epoch': 0.11}
11%|█ | 3629/34278 [4:00:46<34:03:16, 4.00s/it] {'loss': 0.219, 'grad_norm': 0.9404817004403054, 'learning_rate': 9.849878126779326e-06, 'epoch': 0.11}
11%|█ | 3630/34278 [4:00:49<32:14:18, 3.79s/it] {'loss': 0.2157, 'grad_norm': 0.8760941174090979, 'learning_rate': 9.8497632081272e-06, 'epoch': 0.11}
11%|█ | 3631/34278 [4:00:52<30:38:09, 3.60s/it] {'loss': 0.1963, 'grad_norm': 0.9371641097270601, 'learning_rate': 9.849648246177544e-06, 'epoch': 0.11}
11%|█ | 3632/34278 [4:00:58<35:49:35, 4.21s/it] {'loss': 0.1569, 'grad_norm': 0.8086527270226705, 'learning_rate': 9.849533240931388e-06, 'epoch': 0.11}
11%|█ | 3633/34278 [4:01:01<33:26:57, 3.93s/it] {'loss': 0.1793, 'grad_norm': 0.9406634070880344, 'learning_rate': 9.849418192389755e-06, 'epoch': 0.11}
11%|█ | 3634/34278 [4:01:04<31:29:53, 3.70s/it] {'loss': 0.2215, 'grad_norm': 1.010354893817405, 'learning_rate': 9.849303100553675e-06, 'epoch': 0.11}
11%|█ | 3635/34278 [4:01:08<30:29:12, 3.58s/it] {'loss': 0.1973, 'grad_norm': 1.1892070123249876, 'learning_rate': 9.849187965424174e-06, 'epoch': 0.11}
11%|█ | 3636/34278 [4:01:12<31:20:16, 3.68s/it] {'loss': 0.1823, 'grad_norm': 1.0311238063426773, 'learning_rate': 9.849072787002281e-06, 'epoch': 0.11}
11%|█ | 3637/34278 [4:01:15<31:18:52, 3.68s/it] {'loss': 0.173, 'grad_norm': 0.8037583422842794, 'learning_rate': 9.848957565289024e-06, 'epoch': 0.11}
11%|█ | 3638/34278 [4:01:21<37:26:56, 4.40s/it] {'loss': 0.1733, 'grad_norm': 0.9914643598253442, 'learning_rate': 9.84884230028543e-06, 'epoch': 0.11}
11%|█ | 3639/34278 [4:01:25<35:00:08, 4.11s/it] {'loss': 0.1967, 'grad_norm': 0.7477706637644134, 'learning_rate': 9.84872699199253e-06, 'epoch': 0.11}
11%|█ | 3640/34278 [4:01:28<32:35:37, 3.83s/it] {'loss': 0.1821, 'grad_norm': 0.8397578799055514, 'learning_rate': 9.848611640411355e-06, 'epoch': 0.11}
11%|█ | 3641/34278 [4:01:31<30:38:09, 3.60s/it] {'loss': 0.1913, 'grad_norm': 0.8983672869267227, 'learning_rate': 9.848496245542928e-06, 'epoch': 0.11}
11%|█ | 3642/34278 [4:01:37<36:48:47, 4.33s/it] {'loss': 0.1673, 'grad_norm': 0.8500491520526038, 'learning_rate': 9.848380807388287e-06, 'epoch': 0.11}
11%|█ | 3643/34278 [4:01:43<41:43:20, 4.90s/it] {'loss': 0.1905, 'grad_norm': 0.8896732084116462, 'learning_rate': 9.84826532594846e-06, 'epoch': 0.11}
11%|█ | 3644/34278 [4:01:47<37:59:38, 4.46s/it] {'loss': 0.1849, 'grad_norm': 0.7485932722651284, 'learning_rate': 9.848149801224478e-06, 'epoch': 0.11}
11%|█ | 3645/34278 [4:01:53<41:56:03, 4.93s/it] {'loss': 0.1847, 'grad_norm': 1.967401684554678, 'learning_rate': 9.84803423321737e-06, 'epoch': 0.11}
11%|█ | 3646/34278 [4:01:57<38:56:18, 4.58s/it] {'loss': 0.1721, 'grad_norm': 0.8288106003515783, 'learning_rate': 9.84791862192817e-06, 'epoch': 0.11}
Gradients will be None warnings.warn( 11%|█ | 3647/34278 [4:02:00<35:49:26, 4.21s/it] {'loss': 0.1993, 'grad_norm': 1.146949706729043, 'learning_rate': 9.84780296735791e-06, 'epoch': 0.11} 11%|█ | 3647/34278 [4:02:00<35:49:26, 4.21s/it] 11%|█ | 3648/34278 [4:02:03<33:18:44, 3.92s/it] {'loss': 0.1798, 'grad_norm': 0.8806391390682924, 'learning_rate': 9.847687269507624e-06, 'epoch': 0.11} 11%|█ | 3648/34278 [4:02:03<33:18:44, 3.92s/it] 11%|█ | 3649/34278 [4:02:06<31:04:32, 3.65s/it] {'loss': 0.1659, 'grad_norm': 1.0032036286894994, 'learning_rate': 9.847571528378342e-06, 'epoch': 0.11} 11%|█ | 3649/34278 [4:02:06<31:04:32, 3.65s/it] 11%|█ | 3650/34278 [4:02:09<28:49:30, 3.39s/it] {'loss': 0.1796, 'grad_norm': 0.8126100534228001, 'learning_rate': 9.8474557439711e-06, 'epoch': 0.11} 11%|█ | 3650/34278 [4:02:09<28:49:30, 3.39s/it] 11%|█ | 3651/34278 [4:02:15<35:41:26, 4.20s/it] {'loss': 0.1589, 'grad_norm': 0.8152182091978051, 'learning_rate': 9.847339916286928e-06, 'epoch': 0.11} 11%|█ | 3651/34278 [4:02:15<35:41:26, 4.20s/it] 11%|█ | 3652/34278 [4:02:20<36:44:39, 4.32s/it] {'loss': 0.1737, 'grad_norm': 0.7248696144602788, 'learning_rate': 9.847224045326864e-06, 'epoch': 0.11} 11%|█ | 3652/34278 [4:02:20<36:44:39, 4.32s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3653/34278 [4:02:22<32:31:43, 3.82s/it] {'loss': 0.1755, 'grad_norm': 0.9821676998680899, 'learning_rate': 9.84710813109194e-06, 'epoch': 0.11} 11%|█ | 3653/34278 [4:02:22<32:31:43, 3.82s/it] 11%|█ | 3654/34278 [4:02:26<32:31:11, 3.82s/it] {'loss': 0.1744, 'grad_norm': 0.9874156412488563, 'learning_rate': 9.846992173583193e-06, 'epoch': 0.11} 11%|█ | 3654/34278 [4:02:26<32:31:11, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3655/34278 [4:02:29<31:00:28, 3.65s/it] {'loss': 0.1875, 'grad_norm': 0.9935917984752777, 'learning_rate': 9.846876172801653e-06, 'epoch': 0.11} 11%|█ | 3655/34278 [4:02:29<31:00:28, 3.65s/it] 11%|█ | 3656/34278 [4:02:35<36:51:43, 4.33s/it] {'loss': 0.2255, 'grad_norm': 0.8547818384954391, 'learning_rate': 9.846760128748363e-06, 'epoch': 0.11} 11%|█ | 3656/34278 [4:02:35<36:51:43, 4.33s/it] 11%|█ | 3657/34278 [4:02:41<41:21:34, 4.86s/it] {'loss': 0.1973, 'grad_norm': 1.1420768220006505, 'learning_rate': 9.846644041424357e-06, 'epoch': 0.11} 11%|█ | 3657/34278 [4:02:41<41:21:34, 4.86s/it] 11%|█ | 3658/34278 [4:02:45<37:06:38, 4.36s/it] {'loss': 0.189, 'grad_norm': 0.9489048749425871, 'learning_rate': 9.846527910830666e-06, 'epoch': 0.11} 11%|█ | 3658/34278 [4:02:45<37:06:38, 4.36s/it] 11%|█ | 3659/34278 [4:02:49<36:45:33, 4.32s/it] {'loss': 0.1802, 'grad_norm': 0.7622933975836242, 'learning_rate': 9.846411736968334e-06, 'epoch': 0.11} 11%|█ | 3659/34278 [4:02:49<36:45:33, 4.32s/it] 11%|█ | 3660/34278 [4:02:53<35:26:52, 4.17s/it] {'loss': 0.2019, 'grad_norm': 0.9875678375622912, 'learning_rate': 9.846295519838393e-06, 'epoch': 0.11} 11%|█ | 3660/34278 [4:02:53<35:26:52, 4.17s/it] 11%|█ | 3661/34278 [4:02:57<35:21:10, 4.16s/it] {'loss': 0.1736, 'grad_norm': 0.9658618910429523, 'learning_rate': 
9.846179259441884e-06, 'epoch': 0.11} 11%|█ | 3661/34278 [4:02:57<35:21:10, 4.16s/it] 11%|█ | 3662/34278 [4:03:00<34:10:24, 4.02s/it] {'loss': 0.1678, 'grad_norm': 0.7838056532062962, 'learning_rate': 9.846062955779843e-06, 'epoch': 0.11} 11%|█ | 3662/34278 [4:03:00<34:10:24, 4.02s/it] 11%|█ | 3663/34278 [4:03:04<32:11:50, 3.79s/it] {'loss': 0.1751, 'grad_norm': 0.9063951910550987, 'learning_rate': 9.845946608853307e-06, 'epoch': 0.11} 11%|█ | 3663/34278 [4:03:04<32:11:50, 3.79s/it] 11%|█ | 3664/34278 [4:03:08<32:32:00, 3.83s/it] {'loss': 0.1974, 'grad_norm': 0.954163078224032, 'learning_rate': 9.845830218663319e-06, 'epoch': 0.11} 11%|█ | 3664/34278 [4:03:08<32:32:00, 3.83s/it] 11%|█ | 3665/34278 [4:03:11<31:27:36, 3.70s/it] {'loss': 0.1807, 'grad_norm': 0.9078999281041293, 'learning_rate': 9.845713785210915e-06, 'epoch': 0.11} 11%|█ | 3665/34278 [4:03:11<31:27:36, 3.70s/it] 11%|█ | 3666/34278 [4:03:15<31:08:38, 3.66s/it] {'loss': 0.1603, 'grad_norm': 0.9482398997349659, 'learning_rate': 9.845597308497134e-06, 'epoch': 0.11} 11%|█ | 3666/34278 [4:03:15<31:08:38, 3.66s/it] 11%|█ | 3667/34278 [4:03:18<30:03:39, 3.54s/it] {'loss': 0.1867, 'grad_norm': 0.9527446241527082, 'learning_rate': 9.845480788523018e-06, 'epoch': 0.11} 11%|█ | 3667/34278 [4:03:18<30:03:39, 3.54s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47ea4fbe20>
Failed to fetch sample 3423930. Exception: cannot identify image file <_io.BytesIO object at 0x7f47ea4fbe20>
11%|█ | 3668/34278 [4:03:24<37:58:39, 4.47s/it] {'loss': 0.163, 'grad_norm': 0.8204347964831162, 'learning_rate': 9.845364225289606e-06, 'epoch': 0.11} 11%|█ | 3668/34278 [4:03:24<37:58:39, 4.47s/it] 11%|█ | 3669/34278 [4:03:29<37:40:20, 4.43s/it] {'loss': 0.1621, 'grad_norm': 0.7344762053758938, 'learning_rate': 9.845247618797938e-06, 'epoch': 0.11} 11%|█ | 3669/34278 [4:03:29<37:40:20, 4.43s/it] 11%|█ | 3670/34278 [4:03:34<40:36:51, 4.78s/it] {'loss': 0.2151, 'grad_norm': 1.0051301924257035, 'learning_rate': 9.845130969049057e-06, 'epoch': 0.11} 11%|█ | 3670/34278 [4:03:34<40:36:51, 4.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 11%|█ | 3671/34278 [4:03:38<38:22:08, 4.51s/it] {'loss': 0.1864, 'grad_norm': 0.9033577606481228, 'learning_rate': 9.845014276044002e-06, 'epoch': 0.11} 11%|█ | 3671/34278 [4:03:38<38:22:08, 4.51s/it] 11%|█ | 3672/34278 [4:03:41<34:58:37, 4.11s/it] {'loss': 0.1952, 'grad_norm': 1.0478524524363073, 'learning_rate': 9.844897539783817e-06, 'epoch': 0.11} 11%|█ | 3672/34278 [4:03:41<34:58:37, 4.11s/it] 11%|█ | 3673/34278 [4:03:45<32:17:03, 3.80s/it] {'loss': 0.1921, 'grad_norm': 0.9434465268313577, 'learning_rate': 9.844780760269543e-06, 'epoch': 0.11} 11%|█ | 3673/34278 [4:03:45<32:17:03, 3.80s/it] 11%|█ | 3674/34278 [4:03:48<30:56:38, 3.64s/it] {'loss': 0.1987, 'grad_norm': 1.185492328938468, 'learning_rate': 9.844663937502225e-06, 'epoch': 0.11} 11%|█ | 3674/34278 [4:03:48<30:56:38, 3.64s/it] 11%|█ | 3675/34278 [4:03:51<29:57:52, 3.52s/it] {'loss': 0.1827, 'grad_norm': 0.9408758168148659, 'learning_rate': 9.844547071482902e-06, 'epoch': 0.11} 11%|█ | 3675/34278 [4:03:51<29:57:52, 3.52s/it] 11%|█ | 3676/34278 [4:03:57<36:45:43, 4.32s/it] {'loss': 0.1894, 'grad_norm': 0.9133147091973878, 'learning_rate': 9.844430162212619e-06, 'epoch': 0.11} 11%|█ | 3676/34278 [4:03:57<36:45:43, 4.32s/it] 11%|█ | 3677/34278 [4:04:02<37:26:53, 4.41s/it] {'loss': 0.2018, 'grad_norm': 0.8748009274424542, 'learning_rate': 9.84431320969242e-06, 'epoch': 0.11} 11%|█ | 3677/34278 [4:04:02<37:26:53, 4.41s/it] 11%|█ | 3678/34278 [4:04:05<35:16:37, 4.15s/it] {'loss': 0.1778, 'grad_norm': 1.0174652467453873, 'learning_rate': 9.84419621392335e-06, 'epoch': 0.11} 11%|█ | 3678/34278 [4:04:05<35:16:37, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3679/34278 [4:04:09<34:15:06, 4.03s/it] {'loss': 0.1831, 'grad_norm': 0.7728782072356283, 'learning_rate': 9.844079174906453e-06, 'epoch': 0.11} 11%|█ | 3679/34278 [4:04:09<34:15:06, 4.03s/it] 11%|█ | 3680/34278 [4:04:12<31:57:19, 3.76s/it] {'loss': 0.1937, 'grad_norm': 1.108675218183585, 'learning_rate': 9.843962092642772e-06, 'epoch': 0.11} 11%|█ | 3680/34278 [4:04:12<31:57:19, 3.76s/it] 11%|█ | 3681/34278 [4:04:15<30:38:20, 3.60s/it] {'loss': 0.1666, 'grad_norm': 0.8191061791296362, 'learning_rate': 9.843844967133353e-06, 'epoch': 0.11} 11%|█ | 3681/34278 [4:04:15<30:38:20, 3.60s/it] 11%|█ | 3682/34278 [4:04:19<29:57:58, 3.53s/it] {'loss': 0.1834, 'grad_norm': 1.093219952602174, 'learning_rate': 9.843727798379245e-06, 'epoch': 0.11} 11%|█ | 3682/34278 [4:04:19<29:57:58, 3.53s/it] 11%|█ | 3683/34278 [4:04:25<35:54:47, 4.23s/it] {'loss': 0.1951, 'grad_norm': 0.8429401948794337, 'learning_rate': 9.843610586381491e-06, 'epoch': 0.11} 11%|█ | 3683/34278 [4:04:25<35:54:47, 4.23s/it] 11%|█ | 3684/34278 [4:04:29<36:47:41, 4.33s/it] {'loss': 0.2011, 'grad_norm': 1.0067529057383613, 'learning_rate': 9.843493331141136e-06, 'epoch': 0.11} 11%|█ | 3684/34278 [4:04:29<36:47:41, 4.33s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3685/34278 [4:04:35<39:16:12, 4.62s/it] {'loss': 0.1728, 'grad_norm': 0.7733939536939644, 'learning_rate': 9.843376032659231e-06, 'epoch': 0.11} 11%|█ | 3685/34278 [4:04:35<39:16:12, 4.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3686/34278 [4:04:38<35:33:32, 4.18s/it] {'loss': 0.1712, 'grad_norm': 0.9664776166046107, 'learning_rate': 9.84325869093682e-06, 'epoch': 0.11} 11%|█ | 3686/34278 [4:04:38<35:33:32, 4.18s/it] 11%|█ | 3687/34278 [4:04:41<33:10:50, 3.90s/it] {'loss': 0.2086, 'grad_norm': 0.8594469761438377, 'learning_rate': 9.843141305974951e-06, 'epoch': 0.11} 11%|█ | 3687/34278 [4:04:41<33:10:50, 3.90s/it] 11%|█ | 3688/34278 [4:04:44<31:39:56, 3.73s/it] {'loss': 0.1671, 'grad_norm': 1.0214638134518348, 'learning_rate': 9.843023877774673e-06, 'epoch': 0.11} 11%|█ | 3688/34278 [4:04:44<31:39:56, 3.73s/it] 11%|█ | 3689/34278 [4:04:47<29:44:44, 3.50s/it] {'loss': 0.1923, 'grad_norm': 0.9576001938898379, 'learning_rate': 9.842906406337034e-06, 'epoch': 0.11} 11%|█ | 3689/34278 [4:04:47<29:44:44, 3.50s/it] 11%|█ | 3690/34278 [4:04:51<30:18:25, 3.57s/it] {'loss': 0.1962, 'grad_norm': 0.867883200304093, 'learning_rate': 9.842788891663085e-06, 'epoch': 0.11} 11%|█ | 3690/34278 [4:04:51<30:18:25, 3.57s/it] 11%|█ | 3691/34278 [4:04:57<36:02:25, 4.24s/it] {'loss': 0.1782, 'grad_norm': 0.891562337545845, 'learning_rate': 9.84267133375387e-06, 'epoch': 0.11} 11%|█ | 3691/34278 [4:04:57<36:02:25, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3692/34278 [4:05:00<33:01:16, 3.89s/it] {'loss': 0.1972, 'grad_norm': 0.9513846342371838, 'learning_rate': 9.842553732610442e-06, 'epoch': 0.11} 11%|█ | 3692/34278 [4:05:00<33:01:16, 3.89s/it] 11%|█ | 3693/34278 [4:05:06<38:46:01, 4.56s/it] {'loss': 0.2128, 'grad_norm': 0.8740119196856326, 'learning_rate': 9.842436088233851e-06, 'epoch': 0.11} 11%|█ | 3693/34278 [4:05:06<38:46:01, 4.56s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3694/34278 [4:05:10<36:30:50, 4.30s/it] {'loss': 0.2001, 'grad_norm': 0.9505336722522594, 'learning_rate': 9.842318400625145e-06, 'epoch': 0.11} 11%|█ | 3694/34278 [4:05:10<36:30:50, 4.30s/it] 11%|█ | 3695/34278 [4:05:16<40:46:52, 4.80s/it] {'loss': 0.1871, 'grad_norm': 0.9032592869121997, 'learning_rate': 9.842200669785378e-06, 'epoch': 0.11} 11%|█ | 3695/34278 [4:05:16<40:46:52, 4.80s/it] 11%|█ | 3696/34278 [4:05:20<40:12:41, 4.73s/it] {'loss': 0.1887, 'grad_norm': 1.0805069099010487, 'learning_rate': 9.842082895715598e-06, 'epoch': 0.11} 11%|█ | 3696/34278 [4:05:20<40:12:41, 4.73s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3697/34278 [4:05:24<36:32:35, 4.30s/it] {'loss': 0.2106, 'grad_norm': 0.9110385262451008, 'learning_rate': 9.84196507841686e-06, 'epoch': 0.11} 11%|█ | 3697/34278 [4:05:24<36:32:35, 4.30s/it] 11%|█ | 3698/34278 [4:05:27<35:34:05, 4.19s/it] {'loss': 0.1758, 'grad_norm': 1.0107292562277639, 'learning_rate': 9.84184721789021e-06, 'epoch': 0.11} 11%|█ | 3698/34278 [4:05:27<35:34:05, 4.19s/it] 11%|█ | 3699/34278 [4:05:33<40:11:54, 4.73s/it] {'loss': 0.1807, 'grad_norm': 1.1430490413573193, 'learning_rate': 9.841729314136707e-06, 'epoch': 0.11} 11%|█ | 3699/34278 [4:05:33<40:11:54, 4.73s/it] 11%|█ | 3700/34278 [4:05:37<36:19:36, 4.28s/it] {'loss': 0.1829, 'grad_norm': 1.2589913484771265, 'learning_rate': 9.8416113671574e-06, 'epoch': 0.11} 11%|█ | 3700/34278 [4:05:37<36:19:36, 4.28s/it] 11%|█ | 3701/34278 [4:05:40<33:32:46, 3.95s/it] {'loss': 0.1862, 'grad_norm': 0.9674064446555021, 'learning_rate': 9.841493376953341e-06, 'epoch': 0.11} 11%|█ | 3701/34278 [4:05:40<33:32:46, 3.95s/it] 11%|█ | 3702/34278 [4:05:43<31:31:54, 3.71s/it] {'loss': 0.2146, 'grad_norm': 1.00196352427155, 'learning_rate': 9.841375343525586e-06, 'epoch': 0.11} 11%|█ | 3702/34278 [4:05:43<31:31:54, 3.71s/it] 11%|█ | 3703/34278 [4:05:46<29:47:03, 3.51s/it] {'loss': 0.1696, 'grad_norm': 0.8110697783459644, 'learning_rate': 9.841257266875187e-06, 'epoch': 0.11} 11%|█ | 3703/34278 [4:05:46<29:47:03, 3.51s/it] 11%|█ | 3704/34278 [4:05:49<29:31:08, 3.48s/it] {'loss': 0.1775, 'grad_norm': 1.0770057636871617, 'learning_rate': 9.8411391470032e-06, 'epoch': 0.11} 11%|█ | 3704/34278 [4:05:49<29:31:08, 3.48s/it] 11%|█ | 3705/34278 [4:05:53<28:42:53, 3.38s/it] {'loss': 0.1761, 'grad_norm': 1.0268880636789652, 'learning_rate': 9.841020983910675e-06, 'epoch': 0.11} 11%|█ | 3705/34278 [4:05:53<28:42:53, 3.38s/it] 11%|█ | 3706/34278 [4:05:56<28:12:01, 3.32s/it] {'loss': 0.1679, 'grad_norm': 0.8155987790917397, 'learning_rate': 9.840902777598675e-06, 'epoch': 0.11} 11%|█ | 
3706/34278 [4:05:56<28:12:01, 3.32s/it] 11%|█ | 3707/34278 [4:05:59<28:52:44, 3.40s/it] {'loss': 0.1854, 'grad_norm': 0.8870319716724403, 'learning_rate': 9.840784528068248e-06, 'epoch': 0.11} 11%|█ | 3707/34278 [4:05:59<28:52:44, 3.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3708/34278 [4:06:05<34:39:22, 4.08s/it] {'loss': 0.1896, 'grad_norm': 0.8624835876854527, 'learning_rate': 9.840666235320453e-06, 'epoch': 0.11} 11%|█ | 3708/34278 [4:06:05<34:39:22, 4.08s/it] 11%|█ | 3709/34278 [4:06:10<37:23:00, 4.40s/it] {'loss': 0.1817, 'grad_norm': 0.9718767513383666, 'learning_rate': 9.840547899356344e-06, 'epoch': 0.11} 11%|█ | 3709/34278 [4:06:10<37:23:00, 4.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3710/34278 [4:06:14<35:50:53, 4.22s/it] {'loss': 0.2356, 'grad_norm': 0.9234147792364222, 'learning_rate': 9.840429520176981e-06, 'epoch': 0.11} 11%|█ | 3710/34278 [4:06:14<35:50:53, 4.22s/it] 11%|█ | 3711/34278 [4:06:18<34:31:49, 4.07s/it] {'loss': 0.1804, 'grad_norm': 0.8759859936433729, 'learning_rate': 9.84031109778342e-06, 'epoch': 0.11} 11%|█ | 3711/34278 [4:06:18<34:31:49, 4.07s/it] 11%|█ | 3712/34278 [4:06:24<39:25:38, 4.64s/it] {'loss': 0.1864, 'grad_norm': 0.9744025451650687, 'learning_rate': 9.840192632176714e-06, 'epoch': 0.11} 11%|█ | 3712/34278 [4:06:24<39:25:38, 4.64s/it] 11%|█ | 3713/34278 [4:06:27<35:02:13, 4.13s/it] {'loss': 0.185, 'grad_norm': 1.1145841227908508, 'learning_rate': 9.840074123357924e-06, 'epoch': 0.11} 11%|█ | 3713/34278 [4:06:27<35:02:13, 4.13s/it] 11%|█ | 3714/34278 [4:06:33<39:56:08, 4.70s/it] {'loss': 0.1762, 'grad_norm': 0.9164931550793644, 'learning_rate': 9.839955571328108e-06, 'epoch': 0.11} 11%|█ | 3714/34278 [4:06:33<39:56:08, 4.70s/it] 11%|█ | 3715/34278 [4:06:36<36:04:15, 4.25s/it] {'loss': 0.2067, 'grad_norm': 1.3306792105766259, 'learning_rate': 9.839836976088326e-06, 'epoch': 0.11} 11%|█ | 3715/34278 [4:06:36<36:04:15, 4.25s/it] 11%|█ | 3716/34278 [4:06:39<32:50:04, 3.87s/it] {'loss': 0.1904, 'grad_norm': 1.0025956067578259, 'learning_rate': 9.839718337639633e-06, 'epoch': 0.11} 11%|█ | 3716/34278 [4:06:39<32:50:04, 3.87s/it] 11%|█ | 3717/34278 [4:06:43<34:46:34, 4.10s/it] {'loss': 0.1753, 'grad_norm': 0.9590885285899383, 'learning_rate': 9.83959965598309e-06, 'epoch': 0.11} 11%|█ | 3717/34278 [4:06:43<34:46:34, 4.10s/it] 11%|█ | 3718/34278 [4:06:47<33:47:47, 3.98s/it] {'loss': 0.1655, 'grad_norm': 0.8519876240715798, 'learning_rate': 9.839480931119756e-06, 'epoch': 0.11} 11%|█ | 3718/34278 [4:06:47<33:47:47, 3.98s/it] 11%|█ | 3719/34278 [4:06:53<39:35:10, 4.66s/it] {'loss': 0.2068, 'grad_norm': 1.0216085372450623, 'learning_rate': 9.839362163050692e-06, 'epoch': 0.11} 
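The recurring `torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None` lines in this log come from reentrant activation checkpointing: the warning fires at forward time when a checkpointed segment receives only tensors with `requires_grad=False` (e.g. activations from a frozen module). A minimal sketch reproducing the warning and the usual fix of making the checkpointed input require grad; the `Linear` layer here is purely illustrative, not the model in this log:

```python
import warnings
import torch
from torch.utils.checkpoint import checkpoint

layer = torch.nn.Linear(4, 4)

# Case 1: the checkpointed input does not require grad (e.g. it comes from a
# frozen backbone) -- reentrant checkpointing emits the warning seen above.
x = torch.randn(2, 4)  # requires_grad=False
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    y = checkpoint(layer, x, use_reentrant=True)
assert any("requires_grad=True" in str(w.message) for w in caught)

# Case 2: the usual fix -- make the input a leaf that requires grad so the
# backward pass can flow through the checkpointed segment.
x2 = torch.randn(2, 4).requires_grad_()
y2 = checkpoint(layer, x2, use_reentrant=True)
y2.sum().backward()
assert x2.grad is not None and layer.weight.grad is not None
```

Frameworks that freeze part of the network often apply the same fix via a hook that forces the embedding output to require grad, which silences this warning without unfreezing any weights.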
11%|█ | 3719/34278 [4:06:53<39:35:10, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3720/34278 [4:06:57<37:32:49, 4.42s/it] {'loss': 0.2108, 'grad_norm': 1.015218478793388, 'learning_rate': 9.839243351776959e-06, 'epoch': 0.11} 11%|█ | 3720/34278 [4:06:57<37:32:49, 4.42s/it] 11%|█ | 3721/34278 [4:07:01<34:32:58, 4.07s/it] {'loss': 0.1814, 'grad_norm': 0.8833693071316394, 'learning_rate': 9.839124497299614e-06, 'epoch': 0.11} 11%|█ | 3721/34278 [4:07:01<34:32:58, 4.07s/it] 11%|█ | 3722/34278 [4:07:07<39:30:50, 4.66s/it] {'loss': 0.1952, 'grad_norm': 0.9985068447394757, 'learning_rate': 9.839005599619723e-06, 'epoch': 0.11} 11%|█ | 3722/34278 [4:07:07<39:30:50, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3723/34278 [4:07:10<36:04:47, 4.25s/it] {'loss': 0.1816, 'grad_norm': 0.9275513908128796, 'learning_rate': 9.838886658738345e-06, 'epoch': 0.11} 11%|█ | 3723/34278 [4:07:10<36:04:47, 4.25s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10499 > 8192). Running this sequence through the model will result in indexing errors 11%|█ | 3724/34278 [4:07:15<39:11:11, 4.62s/it] {'loss': 0.2028, 'grad_norm': 1.0624517703707523, 'learning_rate': 9.838767674656541e-06, 'epoch': 0.11} 11%|█ | 3724/34278 [4:07:15<39:11:11, 4.62s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10149 > 8192). 
Running this sequence through the model will result in indexing errors 11%|█ | 3725/34278 [4:07:19<36:59:29, 4.36s/it] {'loss': 0.1519, 'grad_norm': 1.0070792855763442, 'learning_rate': 9.838648647375375e-06, 'epoch': 0.11} 11%|█ | 3725/34278 [4:07:19<36:59:29, 4.36s/it] 11%|█ | 3726/34278 [4:07:22<33:45:47, 3.98s/it] {'loss': 0.1704, 'grad_norm': 0.8410867942920129, 'learning_rate': 9.83852957689591e-06, 'epoch': 0.11} 11%|█ | 3726/34278 [4:07:22<33:45:47, 3.98s/it] 11%|█ | 3727/34278 [4:07:27<35:01:03, 4.13s/it] {'loss': 0.2194, 'grad_norm': 1.1809385178616498, 'learning_rate': 9.838410463219206e-06, 'epoch': 0.11} 11%|█ | 3727/34278 [4:07:27<35:01:03, 4.13s/it] 11%|█ | 3728/34278 [4:07:33<39:25:51, 4.65s/it] {'loss': 0.1862, 'grad_norm': 1.01928972420757, 'learning_rate': 9.838291306346329e-06, 'epoch': 0.11} 11%|█ | 3728/34278 [4:07:33<39:25:51, 4.65s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3729/34278 [4:07:35<35:09:58, 4.14s/it] {'loss': 0.171, 'grad_norm': 0.8529205792076802, 'learning_rate': 9.838172106278344e-06, 'epoch': 0.11} 11%|█ | 3729/34278 [4:07:36<35:09:58, 4.14s/it] 11%|█ | 3730/34278 [4:07:39<32:31:44, 3.83s/it] {'loss': 0.1764, 'grad_norm': 0.9092029669728684, 'learning_rate': 9.83805286301631e-06, 'epoch': 0.11} 11%|█ | 3730/34278 [4:07:39<32:31:44, 3.83s/it] 11%|█ | 3731/34278 [4:07:42<31:17:42, 3.69s/it] {'loss': 0.1843, 'grad_norm': 0.9078365624779545, 'learning_rate': 9.837933576561297e-06, 'epoch': 0.11} 11%|█ | 3731/34278 [4:07:42<31:17:42, 3.69s/it] 11%|█ | 3732/34278 [4:07:46<31:15:43, 3.68s/it] {'loss': 0.2223, 'grad_norm': 0.8989173614593705, 'learning_rate': 9.837814246914367e-06, 'epoch': 0.11} 11%|█ | 3732/34278 [4:07:46<31:15:43, 3.68s/it] 11%|█ | 3733/34278 [4:07:50<32:42:47, 3.86s/it] {'loss': 0.1806, 'grad_norm': 0.8498264796992692, 'learning_rate': 9.83769487407659e-06, 'epoch': 0.11} 11%|█ | 3733/34278 [4:07:50<32:42:47, 3.86s/it] 11%|█ | 3734/34278 [4:07:53<31:42:55, 3.74s/it] {'loss': 0.1977, 'grad_norm': 1.1470141639434395, 'learning_rate': 9.837575458049023e-06, 'epoch': 0.11} 11%|█ | 3734/34278 [4:07:53<31:42:55, 3.74s/it] 11%|█ | 3735/34278 [4:07:57<30:43:32, 3.62s/it] {'loss': 0.2063, 'grad_norm': 1.0260437543873395, 'learning_rate': 9.83745599883274e-06, 'epoch': 0.11} 11%|█ | 3735/34278 [4:07:57<30:43:32, 3.62s/it] 11%|█ | 3736/34278 [4:08:01<31:18:16, 3.69s/it] {'loss': 0.1625, 'grad_norm': 0.9272079305040548, 'learning_rate': 9.837336496428804e-06, 'epoch': 0.11} 11%|█ | 3736/34278 [4:08:01<31:18:16, 3.69s/it] 11%|█ | 3737/34278 [4:08:04<30:51:36, 3.64s/it] {'loss': 0.1914, 'grad_norm': 0.7823327792556027, 'learning_rate': 9.837216950838282e-06, 'epoch': 0.11} 11%|█ | 3737/34278 [4:08:04<30:51:36, 3.64s/it] 11%|█ | 3738/34278 [4:08:07<29:41:25, 3.50s/it] {'loss': 0.1744, 'grad_norm': 0.9446827984937581, 'learning_rate': 9.83709736206224e-06, 'epoch': 0.11} 
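The `PIL.UnidentifiedImageError` tracebacks in this log ("Failed to fetch sample … Exception: cannot identify image file") show `__getitem__` hitting bytes that PIL cannot decode, after which the loader falls back to another sample and training continues. A hedged sketch of that recovery pattern, assuming samples are raw byte strings; `SafeImageDataset` and its bounded linear-probe retry are illustrative, not the actual `aguvis/dataset.py` implementation:

```python
import io
from PIL import Image, UnidentifiedImageError

def pil_loader(img_bytes: bytes) -> Image.Image:
    # Same shape as the pil_loader in the traceback: decode raw bytes via BytesIO.
    buff = io.BytesIO(img_bytes)
    return Image.open(buff).convert("RGB")

class SafeImageDataset:
    """Skips undecodable samples instead of crashing the training loop."""

    def __init__(self, samples):
        self.samples = samples  # raw image bytes per sample

    def __getitem__(self, i):
        # Bounded linear probe: try the requested index, then its neighbours.
        for attempt in range(min(10, len(self.samples))):
            j = (i + attempt) % len(self.samples)
            try:
                return pil_loader(self.samples[j])
            except UnidentifiedImageError as e:
                print(f"Failed to fetch sample {j}. Exception: {e}")
        raise RuntimeError("no decodable sample within retry budget")

# Tiny demo: one corrupt blob, one valid 1x1 PNG.
buf = io.BytesIO()
Image.new("RGB", (1, 1)).save(buf, format="PNG")
ds = SafeImageDataset([b"corrupt bytes", buf.getvalue()])
img = ds[0]  # logs the failure for index 0, then returns the valid sample
```

The "Token indices sequence length is longer than the specified maximum" warnings nearby are a separate, non-fatal tokenizer message; only the image-decoding failures trigger the fallback shown here.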
11%|█ | 3738/34278 [4:08:07<29:41:25, 3.50s/it] 11%|█ | 3739/34278 [4:08:11<30:41:07, 3.62s/it] {'loss': 0.2053, 'grad_norm': 1.0576705550162075, 'learning_rate': 9.836977730101751e-06, 'epoch': 0.11} 11%|█ | 3739/34278 [4:08:11<30:41:07, 3.62s/it] 11%|█ | 3740/34278 [4:08:14<29:27:45, 3.47s/it] {'loss': 0.1837, 'grad_norm': 0.7897544097238344, 'learning_rate': 9.836858054957879e-06, 'epoch': 0.11} 11%|█ | 3740/34278 [4:08:14<29:27:45, 3.47s/it] 11%|█ | 3741/34278 [4:08:17<28:13:40, 3.33s/it] {'loss': 0.1868, 'grad_norm': 0.7602633907190607, 'learning_rate': 9.83673833663169e-06, 'epoch': 0.11} 11%|█ | 3741/34278 [4:08:17<28:13:40, 3.33s/it] 11%|█ | 3742/34278 [4:08:20<27:52:06, 3.29s/it] {'loss': 0.1652, 'grad_norm': 0.8498575573800358, 'learning_rate': 9.836618575124259e-06, 'epoch': 0.11} 11%|█ | 3742/34278 [4:08:20<27:52:06, 3.29s/it] 11%|█ | 3743/34278 [4:08:26<34:36:09, 4.08s/it] {'loss': 0.1701, 'grad_norm': 0.8334433786361117, 'learning_rate': 9.836498770436652e-06, 'epoch': 0.11} 11%|█ | 3743/34278 [4:08:26<34:36:09, 4.08s/it] 11%|█ | 3744/34278 [4:08:29<31:55:33, 3.76s/it] {'loss': 0.1678, 'grad_norm': 0.8748325644408049, 'learning_rate': 9.836378922569935e-06, 'epoch': 0.11} 11%|█ | 3744/34278 [4:08:29<31:55:33, 3.76s/it] 11%|█ | 3745/34278 [4:08:32<29:59:55, 3.54s/it] {'loss': 0.196, 'grad_norm': 1.1032192674675612, 'learning_rate': 9.836259031525184e-06, 'epoch': 0.11} 11%|█ | 3745/34278 [4:08:32<29:59:55, 3.54s/it] 11%|█ | 3746/34278 [4:08:36<31:06:19, 3.67s/it] {'loss': 0.176, 'grad_norm': 0.8929200675933284, 'learning_rate': 9.836139097303468e-06, 'epoch': 0.11} 11%|█ | 3746/34278 [4:08:36<31:06:19, 3.67s/it] 11%|█ | 3747/34278 [4:08:51<59:54:42, 7.06s/it] {'loss': 0.1866, 'grad_norm': 0.9147053940989985, 'learning_rate': 9.836019119905856e-06, 'epoch': 0.11} 11%|█ | 3747/34278 [4:08:51<59:54:42, 7.06s/it] 11%|█ | 3748/34278 [4:08:55<50:29:02, 5.95s/it] {'loss': 0.1712, 'grad_norm': 0.8795091301665815, 'learning_rate': 9.835899099333418e-06, 
'epoch': 0.11} 11%|█ | 3748/34278 [4:08:55<50:29:02, 5.95s/it] 11%|█ | 3749/34278 [4:08:58<43:16:20, 5.10s/it] {'loss': 0.1879, 'grad_norm': 0.8008882443825363, 'learning_rate': 9.835779035587228e-06, 'epoch': 0.11} 11%|█ | 3749/34278 [4:08:58<43:16:20, 5.10s/it] 11%|█ | 3750/34278 [4:09:12<65:10:16, 7.69s/it] {'loss': 0.2044, 'grad_norm': 0.9958444752671687, 'learning_rate': 9.835658928668356e-06, 'epoch': 0.11} 11%|█ | 3750/34278 [4:09:12<65:10:16, 7.69s/it] 11%|█ | 3751/34278 [4:09:30<93:25:57, 11.02s/it] {'loss': 0.1832, 'grad_norm': 0.778564098885269, 'learning_rate': 9.835538778577877e-06, 'epoch': 0.11} 11%|█ | 3751/34278 [4:09:30<93:25:57, 11.02s/it] 11%|█ | 3752/34278 [4:09:49<112:55:16, 13.32s/it] {'loss': 0.1748, 'grad_norm': 0.9398409858222706, 'learning_rate': 9.835418585316863e-06, 'epoch': 0.11} 11%|█ | 3752/34278 [4:09:49<112:55:16, 13.32s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8d36567740>
Failed to fetch sample 2762607. Exception: cannot identify image file <_io.BytesIO object at 0x7f8d36567740>
11%|█ | 3753/34278 [4:09:52<86:44:57, 10.23s/it] {'loss': 0.16, 'grad_norm': 0.8758560899904604, 'learning_rate': 9.835298348886386e-06, 'epoch': 0.11} 11%|█ | 3753/34278 [4:09:52<86:44:57, 10.23s/it] 11%|█ | 3754/34278 [4:10:06<96:53:29, 11.43s/it] {'loss': 0.1743, 'grad_norm': 0.8911949385467063, 'learning_rate': 9.835178069287519e-06, 'epoch': 0.11} 11%|█ | 3754/34278 [4:10:06<96:53:29, 11.43s/it] 11%|█ | 3755/34278 [4:10:10<76:10:05, 8.98s/it] {'loss': 0.1839, 'grad_norm': 0.8900291916817913, 'learning_rate': 9.835057746521335e-06, 'epoch': 0.11} 11%|█ | 3755/34278 [4:10:10<76:10:05, 8.98s/it] 11%|█ | 3756/34278 [4:10:36<120:42:50, 14.24s/it] {'loss': 0.1781, 'grad_norm': 0.9417581588384263, 'learning_rate': 9.83493738058891e-06, 'epoch': 0.11} 11%|█ | 3756/34278 [4:10:36<120:42:50, 14.24s/it] 11%|█ | 3757/34278 [4:11:07<162:07:15, 19.12s/it] {'loss': 0.1767, 'grad_norm': 0.9872104931867929, 'learning_rate': 9.834816971491322e-06, 'epoch': 0.11} 11%|█ | 3757/34278 [4:11:07<162:07:15, 19.12s/it] 11%|█ | 3758/34278 [4:11:20<147:30:16, 17.40s/it] {'loss': 0.1591, 'grad_norm': 0.8309452719909529, 'learning_rate': 9.834696519229638e-06, 'epoch': 0.11} 11%|█ | 3758/34278 [4:11:20<147:30:16, 17.40s/it] 11%|█ | 3759/34278 [4:11:48<175:03:50, 20.65s/it] {'loss': 0.188, 'grad_norm': 0.8356111273091072, 'learning_rate': 9.83457602380494e-06, 'epoch': 0.11} 11%|█ | 3759/34278 [4:11:48<175:03:50, 20.65s/it] 11%|█ | 3760/34278 [4:12:08<172:54:32, 20.40s/it] {'loss': 0.1767, 'grad_norm': 1.2821605838808534, 'learning_rate': 9.8344554852183e-06, 'epoch': 0.11} 11%|█ | 3760/34278 [4:12:08<172:54:32, 20.40s/it] 11%|█ 
| 3761/34278 [4:12:43<210:22:50, 24.82s/it] {'loss': 0.1861, 'grad_norm': 0.7899581477687933, 'learning_rate': 9.834334903470796e-06, 'epoch': 0.11} 11%|█ | 3761/34278 [4:12:43<210:22:50, 24.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3762/34278 [4:13:02<195:06:51, 23.02s/it] {'loss': 0.1719, 'grad_norm': 0.8060093967417578, 'learning_rate': 9.834214278563503e-06, 'epoch': 0.11} 11%|█ | 3762/34278 [4:13:02<195:06:51, 23.02s/it] 11%|█ | 3763/34278 [4:13:32<213:16:08, 25.16s/it] {'loss': 0.1846, 'grad_norm': 0.9629195746359068, 'learning_rate': 9.834093610497501e-06, 'epoch': 0.11} 11%|█ | 3763/34278 [4:13:32<213:16:08, 25.16s/it] 11%|█ | 3764/34278 [4:14:00<221:21:23, 26.12s/it] {'loss': 0.1865, 'grad_norm': 0.8498193523692971, 'learning_rate': 9.833972899273863e-06, 'epoch': 0.11} 11%|█ | 3764/34278 [4:14:00<221:21:23, 26.12s/it] 11%|█ | 3765/34278 [4:14:18<200:02:41, 23.60s/it] {'loss': 0.2186, 'grad_norm': 0.9510094692083803, 'learning_rate': 9.83385214489367e-06, 'epoch': 0.11} 11%|█ | 3765/34278 [4:14:18<200:02:41, 23.60s/it] 11%|█ | 3766/34278 [4:14:32<174:40:47, 20.61s/it] {'loss': 0.1844, 'grad_norm': 0.9199366240798985, 'learning_rate': 9.833731347358e-06, 'epoch': 0.11} 11%|█ | 3766/34278 [4:14:32<174:40:47, 20.61s/it] 11%|█ | 3767/34278 [4:15:10<219:53:06, 25.94s/it] {'loss': 0.2012, 'grad_norm': 0.9649402924819324, 'learning_rate': 9.83361050666793e-06, 'epoch': 0.11} 11%|█ | 3767/34278 [4:15:10<219:53:06, 25.94s/it] 11%|█ | 3768/34278 [4:15:35<215:47:47, 25.46s/it] {'loss': 0.1543, 'grad_norm': 0.8969514360865897, 'learning_rate': 9.833489622824537e-06, 'epoch': 0.11} 11%|█ | 3768/34278 [4:15:35<215:47:47, 25.46s/it] 11%|█ | 3769/34278 [4:16:06<229:59:35, 27.14s/it] {'loss': 0.1777, 'grad_norm': 0.9774128949200379, 'learning_rate': 9.833368695828905e-06, 'epoch': 
0.11} 11%|█ | 3769/34278 [4:16:06<229:59:35, 27.14s/it] 11%|█ | 3770/34278 [4:16:30<222:15:57, 26.23s/it] {'loss': 0.1741, 'grad_norm': 0.8418971392788187, 'learning_rate': 9.833247725682111e-06, 'epoch': 0.11} 11%|█ | 3770/34278 [4:16:30<222:15:57, 26.23s/it] 11%|█ | 3771/34278 [4:16:47<200:46:07, 23.69s/it] {'loss': 0.1655, 'grad_norm': 0.8379352874568623, 'learning_rate': 9.833126712385234e-06, 'epoch': 0.11} 11%|█ | 3771/34278 [4:16:47<200:46:07, 23.69s/it] 11%|█ | 3772/34278 [4:17:01<175:09:41, 20.67s/it] {'loss': 0.1836, 'grad_norm': 0.8231065469113594, 'learning_rate': 9.833005655939356e-06, 'epoch': 0.11} 11%|█ | 3772/34278 [4:17:01<175:09:41, 20.67s/it] 11%|█ | 3773/34278 [4:17:04<130:52:51, 15.45s/it] {'loss': 0.1689, 'grad_norm': 0.8191344525780554, 'learning_rate': 9.832884556345556e-06, 'epoch': 0.11} 11%|█ | 3773/34278 [4:17:04<130:52:51, 15.45s/it] 11%|█ | 3774/34278 [4:17:33<165:38:30, 19.55s/it] {'loss': 0.1933, 'grad_norm': 0.9611187222121934, 'learning_rate': 9.832763413604918e-06, 'epoch': 0.11} 11%|█ | 3774/34278 [4:17:33<165:38:30, 19.55s/it] 11%|█ | 3775/34278 [4:17:38<126:36:42, 14.94s/it] {'loss': 0.1874, 'grad_norm': 0.832741343395637, 'learning_rate': 9.832642227718522e-06, 'epoch': 0.11} 11%|█ | 3775/34278 [4:17:38<126:36:42, 14.94s/it] 11%|█ | 3776/34278 [4:18:06<161:01:41, 19.01s/it] {'loss': 0.1721, 'grad_norm': 0.897739888953288, 'learning_rate': 9.83252099868745e-06, 'epoch': 0.11} 11%|█ | 3776/34278 [4:18:06<161:01:41, 19.01s/it] 11%|█ | 3777/34278 [4:18:26<162:53:03, 19.23s/it] {'loss': 0.1987, 'grad_norm': 0.9764722624378016, 'learning_rate': 9.832399726512783e-06, 'epoch': 0.11} 11%|█ | 3777/34278 [4:18:26<162:53:03, 19.23s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3778/34278 [4:18:40<150:00:09, 17.71s/it] {'loss': 0.1985, 'grad_norm': 0.8605112724821691, 'learning_rate': 9.832278411195606e-06, 'epoch': 0.11} 11%|█ | 3778/34278 [4:18:40<150:00:09, 17.71s/it] 11%|█ | 3779/34278 [4:18:44<114:31:43, 13.52s/it] {'loss': 0.1808, 'grad_norm': 0.9433795788950949, 'learning_rate': 9.832157052737e-06, 'epoch': 0.11} 11%|█ | 3779/34278 [4:18:44<114:31:43, 13.52s/it] 11%|█ | 3780/34278 [4:18:48<89:39:04, 10.58s/it] {'loss': 0.1788, 'grad_norm': 1.1215045418125138, 'learning_rate': 9.83203565113805e-06, 'epoch': 0.11} 11%|█ | 3780/34278 [4:18:48<89:39:04, 10.58s/it] 11%|█ | 3781/34278 [4:18:50<69:32:20, 8.21s/it] {'loss': 0.1605, 'grad_norm': 0.9726536245052037, 'learning_rate': 9.831914206399837e-06, 'epoch': 0.11} 11%|█ | 3781/34278 [4:18:50<69:32:20, 8.21s/it] 11%|█ | 3782/34278 [4:18:56<63:46:09, 7.53s/it] {'loss': 0.1787, 'grad_norm': 0.9226634528504349, 'learning_rate': 9.831792718523449e-06, 'epoch': 0.11} 11%|█ | 3782/34278 [4:18:56<63:46:09, 7.53s/it] 11%|█ | 3783/34278 [4:19:01<56:22:08, 6.65s/it] {'loss': 0.2006, 'grad_norm': 0.81507376458163, 'learning_rate': 9.83167118750997e-06, 'epoch': 0.11} 11%|█ | 3783/34278 [4:19:01<56:22:08, 6.65s/it] 11%|█ | 3784/34278 [4:19:04<46:49:31, 5.53s/it] {'loss': 0.198, 'grad_norm': 0.9089075165787985, 'learning_rate': 9.831549613360482e-06, 'epoch': 0.11} 11%|█ | 3784/34278 [4:19:04<46:49:31, 5.53s/it] 11%|█ | 3785/34278 [4:19:06<39:52:58, 4.71s/it] {'loss': 0.1777, 'grad_norm': 0.9859611143876775, 'learning_rate': 9.831427996076074e-06, 'epoch': 0.11} 11%|█ | 3785/34278 [4:19:06<39:52:58, 4.71s/it] 11%|█ | 3786/34278 [4:19:10<35:49:12, 4.23s/it] {'loss': 0.1565, 'grad_norm': 0.7837485554712099, 'learning_rate': 9.83130633565783e-06, 'epoch': 0.11} 11%|█ | 3786/34278 [4:19:10<35:49:12, 4.23s/it] 11%|█ | 3787/34278 [4:19:12<32:31:11, 3.84s/it] {'loss': 0.2108, 'grad_norm': 0.9260998754003568, 'learning_rate': 9.831184632106837e-06, 'epoch': 
0.11} 11%|█ | 3787/34278 [4:19:13<32:31:11, 3.84s/it] 11%|█ | 3788/34278 [4:19:15<29:38:11, 3.50s/it] {'loss': 0.1886, 'grad_norm': 0.8565242221498323, 'learning_rate': 9.831062885424181e-06, 'epoch': 0.11} 11%|█ | 3788/34278 [4:19:15<29:38:11, 3.50s/it] 11%|█ | 3789/34278 [4:19:19<29:22:53, 3.47s/it] {'loss': 0.1839, 'grad_norm': 0.8681149920558899, 'learning_rate': 9.830941095610948e-06, 'epoch': 0.11} 11%|█ | 3789/34278 [4:19:19<29:22:53, 3.47s/it] 11%|█ | 3790/34278 [4:19:22<29:16:54, 3.46s/it] {'loss': 0.1857, 'grad_norm': 0.857683535434253, 'learning_rate': 9.830819262668225e-06, 'epoch': 0.11} 11%|█ | 3790/34278 [4:19:22<29:16:54, 3.46s/it] 11%|█ | 3791/34278 [4:19:25<29:13:43, 3.45s/it] {'loss': 0.2115, 'grad_norm': 1.810542375779002, 'learning_rate': 9.830697386597102e-06, 'epoch': 0.11} 11%|█ | 3791/34278 [4:19:25<29:13:43, 3.45s/it] 11%|█ | 3792/34278 [4:19:29<29:30:32, 3.48s/it] {'loss': 0.1878, 'grad_norm': 0.9987769624881818, 'learning_rate': 9.830575467398666e-06, 'epoch': 0.11} 11%|█ | 3792/34278 [4:19:29<29:30:32, 3.48s/it] 11%|█ | 3793/34278 [4:19:32<28:30:54, 3.37s/it] {'loss': 0.1875, 'grad_norm': 0.8863324438702518, 'learning_rate': 9.830453505074005e-06, 'epoch': 0.11} 11%|█ | 3793/34278 [4:19:32<28:30:54, 3.37s/it] 11%|█ | 3794/34278 [4:19:37<33:16:39, 3.93s/it] {'loss': 0.1657, 'grad_norm': 0.9043276789551575, 'learning_rate': 9.830331499624208e-06, 'epoch': 0.11} 11%|█ | 3794/34278 [4:19:37<33:16:39, 3.93s/it] 11%|█ | 3795/34278 [4:19:40<30:41:08, 3.62s/it] {'loss': 0.1816, 'grad_norm': 1.0448868343342503, 'learning_rate': 9.830209451050365e-06, 'epoch': 0.11} 11%|█ | 3795/34278 [4:19:40<30:41:08, 3.62s/it] 11%|█ | 3796/34278 [4:19:44<32:11:57, 3.80s/it] {'loss': 0.1772, 'grad_norm': 1.1265263659396727, 'learning_rate': 9.830087359353566e-06, 'epoch': 0.11} 11%|█ | 3796/34278 [4:19:44<32:11:57, 3.80s/it] 11%|█ | 3797/34278 [4:19:48<32:01:01, 3.78s/it] {'loss': 0.194, 'grad_norm': 0.9817894511712082, 'learning_rate': 9.829965224534899e-06, 
'epoch': 0.11} 11%|█ | 3797/34278 [4:19:48<32:01:01, 3.78s/it] 11%|█ | 3798/34278 [4:19:52<31:02:30, 3.67s/it] {'loss': 0.1879, 'grad_norm': 0.8576839815110606, 'learning_rate': 9.829843046595455e-06, 'epoch': 0.11} 11%|█ | 3798/34278 [4:19:52<31:02:30, 3.67s/it] 11%|█ | 3799/34278 [4:19:54<28:56:49, 3.42s/it] {'loss': 0.1745, 'grad_norm': 0.9716186224016058, 'learning_rate': 9.829720825536327e-06, 'epoch': 0.11} 11%|█ | 3799/34278 [4:19:54<28:56:49, 3.42s/it] 11%|█ | 3800/34278 [4:19:57<27:35:28, 3.26s/it] {'loss': 0.1658, 'grad_norm': 1.000444412402128, 'learning_rate': 9.829598561358602e-06, 'epoch': 0.11} 11%|█ | 3800/34278 [4:19:57<27:35:28, 3.26s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12916 > 8192). Running this sequence through the model will result in indexing errors 11%|█ | 3801/34278 [4:20:00<26:23:45, 3.12s/it] {'loss': 0.1732, 'grad_norm': 0.9179903393874113, 'learning_rate': 9.829476254063376e-06, 'epoch': 0.11} 11%|█ | 3801/34278 [4:20:00<26:23:45, 3.12s/it] 11%|█ | 3802/34278 [4:20:03<26:12:56, 3.10s/it] {'loss': 0.1831, 'grad_norm': 1.0004830831060219, 'learning_rate': 9.829353903651739e-06, 'epoch': 0.11} 11%|█ | 3802/34278 [4:20:03<26:12:56, 3.10s/it] 11%|█ | 3803/34278 [4:20:06<26:12:51, 3.10s/it] {'loss': 0.1746, 'grad_norm': 1.0181127662768221, 'learning_rate': 9.829231510124782e-06, 'epoch': 0.11} 11%|█ | 3803/34278 [4:20:06<26:12:51, 3.10s/it] 11%|█ | 3804/34278 [4:20:10<27:05:20, 3.20s/it] {'loss': 0.2311, 'grad_norm': 1.0400139495645497, 'learning_rate': 9.829109073483598e-06, 'epoch': 0.11} 11%|█ | 3804/34278 [4:20:10<27:05:20, 3.20s/it] 11%|█ | 3805/34278 [4:20:13<26:18:39, 3.11s/it] {'loss': 0.2045, 'grad_norm': 0.8718607152673257, 'learning_rate': 9.828986593729283e-06, 'epoch': 0.11} 11%|█ | 3805/34278 [4:20:13<26:18:39, 3.11s/it] 11%|█ | 3806/34278 [4:20:16<27:41:56, 3.27s/it] {'loss': 0.2088, 'grad_norm': 0.92743147372795, 'learning_rate': 9.828864070862927e-06, 'epoch': 
0.11} 11%|█ | 3806/34278 [4:20:16<27:41:56, 3.27s/it] 11%|█ | 3807/34278 [4:20:19<27:08:02, 3.21s/it] {'loss': 0.1533, 'grad_norm': 0.9482139382413433, 'learning_rate': 9.828741504885627e-06, 'epoch': 0.11} 11%|█ | 3807/34278 [4:20:19<27:08:02, 3.21s/it] 11%|█ | 3808/34278 [4:20:23<28:05:09, 3.32s/it] {'loss': 0.1647, 'grad_norm': 0.8801976295218055, 'learning_rate': 9.828618895798474e-06, 'epoch': 0.11} 11%|█ | 3808/34278 [4:20:23<28:05:09, 3.32s/it] 11%|█ | 3809/34278 [4:20:26<26:17:33, 3.11s/it] {'loss': 0.1695, 'grad_norm': 0.8325714634226374, 'learning_rate': 9.828496243602566e-06, 'epoch': 0.11} 11%|█ | 3809/34278 [4:20:26<26:17:33, 3.11s/it] 11%|█ | 3810/34278 [4:20:29<26:30:38, 3.13s/it] {'loss': 0.1638, 'grad_norm': 0.9163576438397082, 'learning_rate': 9.828373548298994e-06, 'epoch': 0.11} 11%|█ | 3810/34278 [4:20:29<26:30:38, 3.13s/it] 11%|█ | 3811/34278 [4:20:34<31:04:32, 3.67s/it] {'loss': 0.1789, 'grad_norm': 1.1032582438441247, 'learning_rate': 9.828250809888857e-06, 'epoch': 0.11} 11%|█ | 3811/34278 [4:20:34<31:04:32, 3.67s/it] 11%|█ | 3812/34278 [4:20:37<30:03:36, 3.55s/it] {'loss': 0.1896, 'grad_norm': 0.9469312949880322, 'learning_rate': 9.828128028373249e-06, 'epoch': 0.11} 11%|█ | 3812/34278 [4:20:37<30:03:36, 3.55s/it] 11%|█ | 3813/34278 [4:20:42<33:30:27, 3.96s/it] {'loss': 0.1759, 'grad_norm': 0.9198554637930745, 'learning_rate': 9.828005203753266e-06, 'epoch': 0.11} 11%|█ | 3813/34278 [4:20:42<33:30:27, 3.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3814/34278 [4:20:45<31:11:08, 3.69s/it] {'loss': 0.1659, 'grad_norm': 0.9820440764944599, 'learning_rate': 9.827882336030005e-06, 'epoch': 0.11} 11%|█ | 3814/34278 [4:20:45<31:11:08, 3.69s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3815/34278 [4:20:48<29:59:20, 3.54s/it] {'loss': 0.1752, 'grad_norm': 0.8816234477053786, 'learning_rate': 9.827759425204563e-06, 'epoch': 0.11} 11%|█ | 3815/34278 [4:20:48<29:59:20, 3.54s/it] 11%|█ | 3816/34278 [4:20:51<28:19:49, 3.35s/it] {'loss': 0.1835, 'grad_norm': 0.8578308886705731, 'learning_rate': 9.82763647127804e-06, 'epoch': 0.11} 11%|█ | 3816/34278 [4:20:51<28:19:49, 3.35s/it] 11%|█ | 3817/34278 [4:20:54<27:42:36, 3.27s/it] {'loss': 0.1889, 'grad_norm': 1.1081784354192596, 'learning_rate': 9.827513474251527e-06, 'epoch': 0.11} 11%|█ | 3817/34278 [4:20:54<27:42:36, 3.27s/it] 11%|█ | 3818/34278 [4:21:00<35:08:43, 4.15s/it] {'loss': 0.1766, 'grad_norm': 1.2342896315640153, 'learning_rate': 9.827390434126128e-06, 'epoch': 0.11} 11%|█ | 3818/34278 [4:21:00<35:08:43, 4.15s/it] 11%|█ | 3819/34278 [4:21:03<32:12:54, 3.81s/it] {'loss': 0.1937, 'grad_norm': 0.9570697493862371, 'learning_rate': 9.82726735090294e-06, 'epoch': 0.11} 11%|█ | 3819/34278 [4:21:03<32:12:54, 3.81s/it] 11%|█ | 3820/34278 [4:21:06<30:37:38, 3.62s/it] {'loss': 0.181, 'grad_norm': 1.0175854183995052, 'learning_rate': 9.827144224583061e-06, 'epoch': 0.11} 11%|█ | 3820/34278 [4:21:06<30:37:38, 3.62s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8410 > 8192). 
Running this sequence through the model will result in indexing errors 11%|█ | 3821/34278 [4:21:10<30:15:17, 3.58s/it] {'loss': 0.1795, 'grad_norm': 1.0069047586440978, 'learning_rate': 9.827021055167591e-06, 'epoch': 0.11} 11%|█ | 3821/34278 [4:21:10<30:15:17, 3.58s/it] 11%|█ | 3822/34278 [4:21:16<36:08:13, 4.27s/it] {'loss': 0.1958, 'grad_norm': 0.8170535531556331, 'learning_rate': 9.82689784265763e-06, 'epoch': 0.11} 11%|█ | 3822/34278 [4:21:16<36:08:13, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3823/34278 [4:21:19<33:35:54, 3.97s/it] {'loss': 0.1887, 'grad_norm': 0.9762549144270604, 'learning_rate': 9.826774587054274e-06, 'epoch': 0.11} 11%|█ | 3823/34278 [4:21:19<33:35:54, 3.97s/it] 11%|█ | 3824/34278 [4:21:22<31:38:09, 3.74s/it] {'loss': 0.1836, 'grad_norm': 0.9765942416669872, 'learning_rate': 9.826651288358631e-06, 'epoch': 0.11} 11%|█ | 3824/34278 [4:21:22<31:38:09, 3.74s/it] 11%|█ | 3825/34278 [4:21:27<34:58:41, 4.13s/it] {'loss': 0.1638, 'grad_norm': 0.8275904962695941, 'learning_rate': 9.826527946571796e-06, 'epoch': 0.11} 11%|█ | 3825/34278 [4:21:27<34:58:41, 4.13s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3826/34278 [4:21:30<32:07:46, 3.80s/it] {'loss': 0.1949, 'grad_norm': 0.7480591770664048, 'learning_rate': 9.826404561694872e-06, 'epoch': 0.11} 11%|█ | 3826/34278 [4:21:30<32:07:46, 3.80s/it] 11%|█ | 3827/34278 [4:21:34<30:29:18, 3.60s/it] {'loss': 0.1724, 'grad_norm': 0.8127582086929548, 'learning_rate': 9.82628113372896e-06, 'epoch': 0.11} 11%|█ | 3827/34278 [4:21:34<30:29:18, 3.60s/it] 11%|█ | 3828/34278 [4:21:36<28:21:02, 3.35s/it] {'loss': 0.1797, 'grad_norm': 0.9590706890411163, 'learning_rate': 9.82615766267516e-06, 'epoch': 0.11} 11%|█ | 3828/34278 [4:21:36<28:21:02, 3.35s/it] 11%|█ | 3829/34278 [4:21:40<28:38:02, 3.39s/it] {'loss': 0.1907, 'grad_norm': 0.9691927534950275, 'learning_rate': 9.826034148534578e-06, 'epoch': 0.11} 11%|█ | 3829/34278 [4:21:40<28:38:02, 3.39s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11584 > 8192). Running this sequence through the model will result in indexing errors 11%|█ | 3830/34278 [4:21:43<28:36:03, 3.38s/it] {'loss': 0.1862, 'grad_norm': 0.9479541205797285, 'learning_rate': 9.825910591308316e-06, 'epoch': 0.11} 11%|█ | 3830/34278 [4:21:43<28:36:03, 3.38s/it] 11%|█ | 3831/34278 [4:21:49<34:57:48, 4.13s/it] {'loss': 0.1813, 'grad_norm': 0.8363688236797676, 'learning_rate': 9.825786990997474e-06, 'epoch': 0.11} 11%|█ | 3831/34278 [4:21:49<34:57:48, 4.13s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (13165 > 8192). 
Running this sequence through the model will result in indexing errors 11%|█ | 3832/34278 [4:21:53<33:45:14, 3.99s/it] {'loss': 0.1632, 'grad_norm': 1.0214010560599478, 'learning_rate': 9.82566334760316e-06, 'epoch': 0.11} 11%|█ | 3832/34278 [4:21:53<33:45:14, 3.99s/it] 11%|█ | 3833/34278 [4:21:56<32:02:26, 3.79s/it] {'loss': 0.1951, 'grad_norm': 0.9408923245091138, 'learning_rate': 9.825539661126476e-06, 'epoch': 0.11} 11%|█ | 3833/34278 [4:21:56<32:02:26, 3.79s/it] 11%|█ | 3834/34278 [4:21:59<29:56:42, 3.54s/it] {'loss': 0.1886, 'grad_norm': 0.8505465960882488, 'learning_rate': 9.825415931568525e-06, 'epoch': 0.11} 11%|█ | 3834/34278 [4:21:59<29:56:42, 3.54s/it] 11%|█ | 3835/34278 [4:22:02<28:32:30, 3.38s/it] {'loss': 0.2082, 'grad_norm': 0.9581726090741551, 'learning_rate': 9.825292158930414e-06, 'epoch': 0.11} 11%|█ | 3835/34278 [4:22:02<28:32:30, 3.38s/it] 11%|█ | 3836/34278 [4:22:06<30:59:14, 3.66s/it] {'loss': 0.1919, 'grad_norm': 1.0289200179538978, 'learning_rate': 9.825168343213244e-06, 'epoch': 0.11} 11%|█ | 3836/34278 [4:22:06<30:59:14, 3.66s/it] 11%|█ | 3837/34278 [4:22:10<31:12:54, 3.69s/it] {'loss': 0.1879, 'grad_norm': 0.8372072909384086, 'learning_rate': 9.825044484418123e-06, 'epoch': 0.11} 11%|█ | 3837/34278 [4:22:10<31:12:54, 3.69s/it] 11%|█ | 3838/34278 [4:22:15<34:22:55, 4.07s/it] {'loss': 0.1985, 'grad_norm': 0.9287659409067983, 'learning_rate': 9.824920582546157e-06, 'epoch': 0.11} 11%|█ | 3838/34278 [4:22:15<34:22:55, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3839/34278 [4:22:19<35:05:45, 4.15s/it] {'loss': 0.1935, 'grad_norm': 1.1754128764959069, 'learning_rate': 9.824796637598452e-06, 'epoch': 0.11} 11%|█ | 3839/34278 [4:22:19<35:05:45, 4.15s/it] 11%|█ | 3840/34278 [4:22:23<34:21:47, 4.06s/it] {'loss': 0.1542, 'grad_norm': 0.7114469733950892, 'learning_rate': 9.824672649576114e-06, 'epoch': 0.11} 11%|█ | 3840/34278 [4:22:23<34:21:47, 4.06s/it] 11%|█ | 3841/34278 [4:22:27<33:19:16, 3.94s/it] {'loss': 0.1697, 'grad_norm': 0.9652819565295031, 'learning_rate': 9.824548618480251e-06, 'epoch': 0.11} 11%|█ | 3841/34278 [4:22:27<33:19:16, 3.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█ | 3842/34278 [4:22:31<33:59:35, 4.02s/it] {'loss': 0.1877, 'grad_norm': 0.8632809183926381, 'learning_rate': 9.82442454431197e-06, 'epoch': 0.11} 11%|█ | 3842/34278 [4:22:31<33:59:35, 4.02s/it] 11%|█ | 3843/34278 [4:22:35<33:02:47, 3.91s/it] {'loss': 0.1937, 'grad_norm': 0.8925510327530022, 'learning_rate': 9.824300427072379e-06, 'epoch': 0.11} 11%|█ | 3843/34278 [4:22:35<33:02:47, 3.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█ | 3844/34278 [4:22:41<38:09:46, 4.51s/it] {'loss': 0.1879, 'grad_norm': 1.0306300289579706, 'learning_rate': 9.824176266762584e-06, 'epoch': 0.11} 11%|█ | 3844/34278 [4:22:41<38:09:46, 4.51s/it] 11%|█ | 3845/34278 [4:22:45<37:03:45, 4.38s/it] {'loss': 0.174, 'grad_norm': 0.7272399866168803, 'learning_rate': 9.824052063383696e-06, 'epoch': 0.11} 11%|█ | 3845/34278 [4:22:45<37:03:45, 4.38s/it] 11%|█ | 3846/34278 [4:22:48<34:21:27, 4.06s/it] {'loss': 0.1594, 'grad_norm': 0.7953291749523506, 'learning_rate': 9.823927816936823e-06, 'epoch': 0.11} 11%|█ | 3846/34278 [4:22:48<34:21:27, 4.06s/it] 11%|█ | 3847/34278 [4:22:54<39:09:27, 4.63s/it] {'loss': 0.1844, 'grad_norm': 0.9242746273139414, 'learning_rate': 9.823803527423073e-06, 'epoch': 0.11} 11%|█ | 3847/34278 [4:22:54<39:09:27, 4.63s/it] 11%|█ | 3848/34278 [4:22:57<34:41:33, 4.10s/it] {'loss': 0.1838, 'grad_norm': 0.7378429952935994, 'learning_rate': 9.823679194843556e-06, 'epoch': 0.11} 11%|█ | 3848/34278 [4:22:57<34:41:33, 4.10s/it] 11%|█ | 3849/34278 [4:23:00<31:36:53, 3.74s/it] {'loss': 0.1739, 'grad_norm': 0.8491326506415591, 'learning_rate': 9.823554819199383e-06, 'epoch': 0.11} 11%|█ | 3849/34278 [4:23:00<31:36:53, 3.74s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9141 > 8192). 
Running this sequence through the model will result in indexing errors 11%|█ | 3850/34278 [4:23:03<29:42:33, 3.51s/it] {'loss': 0.1633, 'grad_norm': 0.8026960017082256, 'learning_rate': 9.823430400491665e-06, 'epoch': 0.11} 11%|█ | 3850/34278 [4:23:03<29:42:33, 3.51s/it] 11%|█ | 3851/34278 [4:23:07<30:51:28, 3.65s/it] {'loss': 0.1662, 'grad_norm': 0.6599642453313245, 'learning_rate': 9.823305938721511e-06, 'epoch': 0.11} 11%|█ | 3851/34278 [4:23:07<30:51:28, 3.65s/it] 11%|█ | 3852/34278 [4:23:10<29:36:17, 3.50s/it] {'loss': 0.1814, 'grad_norm': 0.8515874900467115, 'learning_rate': 9.823181433890033e-06, 'epoch': 0.11} 11%|█ | 3852/34278 [4:23:10<29:36:17, 3.50s/it] 11%|█ | 3853/34278 [4:23:16<35:55:35, 4.25s/it] {'loss': 0.1618, 'grad_norm': 0.8802819705576921, 'learning_rate': 9.823056885998344e-06, 'epoch': 0.11} 11%|█ | 3853/34278 [4:23:16<35:55:35, 4.25s/it] 11%|█ | 3854/34278 [4:23:22<40:15:10, 4.76s/it] {'loss': 0.1708, 'grad_norm': 0.9681653255674283, 'learning_rate': 9.822932295047552e-06, 'epoch': 0.11} 11%|█ | 3854/34278 [4:23:22<40:15:10, 4.76s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8311 > 8192). 
Running this sequence through the model will result in indexing errors 11%|█ | 3855/34278 [4:23:26<38:47:18, 4.59s/it] {'loss': 0.2135, 'grad_norm': 0.8629397185868605, 'learning_rate': 9.822807661038774e-06, 'epoch': 0.11} 11%|█ | 3855/34278 [4:23:26<38:47:18, 4.59s/it] 11%|█ | 3856/34278 [4:23:30<36:55:37, 4.37s/it] {'loss': 0.1723, 'grad_norm': 0.9201597321229085, 'learning_rate': 9.82268298397312e-06, 'epoch': 0.11} 11%|█ | 3856/34278 [4:23:30<36:55:37, 4.37s/it] 11%|█▏ | 3857/34278 [4:23:33<33:18:36, 3.94s/it] {'loss': 0.1741, 'grad_norm': 1.0763162141258271, 'learning_rate': 9.822558263851703e-06, 'epoch': 0.11} 11%|█▏ | 3857/34278 [4:23:33<33:18:36, 3.94s/it] 11%|█▏ | 3858/34278 [4:23:36<32:34:56, 3.86s/it] {'loss': 0.1767, 'grad_norm': 0.8981623376821546, 'learning_rate': 9.822433500675637e-06, 'epoch': 0.11} 11%|█▏ | 3858/34278 [4:23:36<32:34:56, 3.86s/it] 11%|█▏ | 3859/34278 [4:23:40<31:49:14, 3.77s/it] {'loss': 0.1978, 'grad_norm': 0.9258733393847015, 'learning_rate': 9.822308694446036e-06, 'epoch': 0.11} 11%|█▏ | 3859/34278 [4:23:40<31:49:14, 3.77s/it] 11%|█▏ | 3860/34278 [4:23:43<30:29:48, 3.61s/it] {'loss': 0.1732, 'grad_norm': 1.4218574899201408, 'learning_rate': 9.822183845164016e-06, 'epoch': 0.11} 11%|█▏ | 3860/34278 [4:23:43<30:29:48, 3.61s/it] 11%|█▏ | 3861/34278 [4:23:46<28:48:49, 3.41s/it] {'loss': 0.1897, 'grad_norm': 1.0251844248860928, 'learning_rate': 9.822058952830687e-06, 'epoch': 0.11} 11%|█▏ | 3861/34278 [4:23:46<28:48:49, 3.41s/it] 11%|█▏ | 3862/34278 [4:23:49<28:23:51, 3.36s/it] {'loss': 0.1806, 'grad_norm': 0.9119394006279787, 'learning_rate': 9.821934017447167e-06, 'epoch': 0.11} 11%|█▏ | 3862/34278 [4:23:49<28:23:51, 3.36s/it] 11%|█▏ | 3863/34278 [4:23:53<27:49:56, 3.29s/it] {'loss': 0.1742, 'grad_norm': 0.9380165588659504, 'learning_rate': 9.82180903901457e-06, 'epoch': 0.11} 11%|█▏ | 3863/34278 [4:23:53<27:49:56, 3.29s/it] 11%|█▏ | 3864/34278 [4:23:55<26:50:14, 3.18s/it] {'loss': 0.193, 'grad_norm': 1.03819331935346, 
'learning_rate': 9.821684017534016e-06, 'epoch': 0.11} 11%|█▏ | 3864/34278 [4:23:56<26:50:14, 3.18s/it] 11%|█▏ | 3865/34278 [4:24:00<29:59:36, 3.55s/it] {'loss': 0.1743, 'grad_norm': 1.5310012087194778, 'learning_rate': 9.821558953006618e-06, 'epoch': 0.11} 11%|█▏ | 3865/34278 [4:24:00<29:59:36, 3.55s/it] 11%|█▏ | 3866/34278 [4:24:05<33:23:42, 3.95s/it] {'loss': 0.1805, 'grad_norm': 0.9863703381047424, 'learning_rate': 9.821433845433492e-06, 'epoch': 0.11} 11%|█▏ | 3866/34278 [4:24:05<33:23:42, 3.95s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█▏ | 3867/34278 [4:24:09<32:46:24, 3.88s/it] {'loss': 0.2043, 'grad_norm': 0.9499991563289474, 'learning_rate': 9.821308694815757e-06, 'epoch': 0.11} 11%|█▏ | 3867/34278 [4:24:09<32:46:24, 3.88s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█▏ | 3868/34278 [4:24:15<38:08:44, 4.52s/it] {'loss': 0.1884, 'grad_norm': 0.9947653672530437, 'learning_rate': 9.821183501154526e-06, 'epoch': 0.11} 11%|█▏ | 3868/34278 [4:24:15<38:08:44, 4.52s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█▏ | 3869/34278 [4:24:18<34:29:04, 4.08s/it] {'loss': 0.1881, 'grad_norm': 1.0925399041803332, 'learning_rate': 9.821058264450921e-06, 'epoch': 0.11} 11%|█▏ | 3869/34278 [4:24:18<34:29:04, 4.08s/it] 11%|█▏ | 3870/34278 [4:24:21<32:44:36, 3.88s/it] {'loss': 0.1623, 'grad_norm': 0.7692636279572528, 'learning_rate': 9.82093298470606e-06, 'epoch': 0.11} 11%|█▏ | 3870/34278 [4:24:21<32:44:36, 3.88s/it] 11%|█▏ | 3871/34278 [4:24:24<30:38:36, 3.63s/it] {'loss': 0.1785, 'grad_norm': 0.9293704198720617, 'learning_rate': 9.820807661921057e-06, 'epoch': 0.11} 11%|█▏ | 3871/34278 [4:24:24<30:38:36, 3.63s/it] 11%|█▏ | 3872/34278 [4:24:28<32:25:01, 3.84s/it] {'loss': 0.1883, 'grad_norm': 1.1286801579823407, 'learning_rate': 9.820682296097038e-06, 'epoch': 0.11} 11%|█▏ | 3872/34278 [4:24:28<32:25:01, 3.84s/it] 11%|█▏ | 3873/34278 [4:24:32<31:56:34, 3.78s/it] {'loss': 0.1793, 'grad_norm': 0.99191725635705, 'learning_rate': 9.820556887235115e-06, 'epoch': 0.11} 11%|█▏ | 3873/34278 [4:24:32<31:56:34, 3.78s/it] 11%|█▏ | 3874/34278 [4:24:35<30:05:40, 3.56s/it] {'loss': 0.1794, 'grad_norm': 1.1514342529400776, 'learning_rate': 9.820431435336412e-06, 'epoch': 0.11} 11%|█▏ | 3874/34278 [4:24:35<30:05:40, 3.56s/it] 11%|█▏ | 3875/34278 [4:24:38<28:45:05, 3.40s/it] {'loss': 0.1748, 'grad_norm': 0.8792916962165257, 'learning_rate': 9.820305940402046e-06, 'epoch': 0.11} 11%|█▏ | 3875/34278 [4:24:38<28:45:05, 3.40s/it] 11%|█▏ | 3876/34278 [4:24:42<29:34:10, 3.50s/it] {'loss': 0.2084, 'grad_norm': 0.9190598137431197, 'learning_rate': 9.82018040243314e-06, 'epoch': 0.11} 11%|█▏ | 3876/34278 [4:24:42<29:34:10, 3.50s/it] 11%|█▏ | 3877/34278 [4:24:45<27:54:22, 3.30s/it] {'loss': 0.1797, 'grad_norm': 1.036820131371785, 'learning_rate': 9.820054821430818e-06, 'epoch': 0.11} 11%|█▏ | 3877/34278 [4:24:45<27:54:22, 3.30s/it] 11%|█▏ | 3878/34278 [4:24:48<26:59:39, 3.20s/it] {'loss': 0.1821, 'grad_norm': 1.0797855853546763, 'learning_rate': 9.819929197396193e-06, 
'epoch': 0.11} 11%|█▏ | 3878/34278 [4:24:48<26:59:39, 3.20s/it] 11%|█▏ | 3879/34278 [4:24:51<27:09:24, 3.22s/it] {'loss': 0.2093, 'grad_norm': 0.9713902491663778, 'learning_rate': 9.819803530330393e-06, 'epoch': 0.11} 11%|█▏ | 3879/34278 [4:24:51<27:09:24, 3.22s/it] 11%|█▏ | 3880/34278 [4:24:54<26:33:25, 3.15s/it] {'loss': 0.1933, 'grad_norm': 0.9679103759968202, 'learning_rate': 9.819677820234536e-06, 'epoch': 0.11} 11%|█▏ | 3880/34278 [4:24:54<26:33:25, 3.15s/it] 11%|█▏ | 3881/34278 [4:24:57<25:55:33, 3.07s/it] {'loss': 0.1826, 'grad_norm': 0.9074006853552428, 'learning_rate': 9.819552067109748e-06, 'epoch': 0.11} 11%|█▏ | 3881/34278 [4:24:57<25:55:33, 3.07s/it] 11%|█▏ | 3882/34278 [4:25:00<25:26:43, 3.01s/it] {'loss': 0.1755, 'grad_norm': 0.7600454088624697, 'learning_rate': 9.819426270957148e-06, 'epoch': 0.11} 11%|█▏ | 3882/34278 [4:25:00<25:26:43, 3.01s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11343 > 8192). Running this sequence through the model will result in indexing errors 11%|█▏ | 3883/34278 [4:25:03<26:04:53, 3.09s/it] {'loss': 0.1847, 'grad_norm': 0.8660801771507908, 'learning_rate': 9.819300431777861e-06, 'epoch': 0.11} 11%|█▏ | 3883/34278 [4:25:03<26:04:53, 3.09s/it] 11%|█▏ | 3884/34278 [4:25:07<28:05:35, 3.33s/it] {'loss': 0.1881, 'grad_norm': 0.8873507212821025, 'learning_rate': 9.81917454957301e-06, 'epoch': 0.11} 11%|█▏ | 3884/34278 [4:25:07<28:05:35, 3.33s/it] 11%|█▏ | 3885/34278 [4:25:10<27:24:07, 3.25s/it] {'loss': 0.172, 'grad_norm': 0.9727903192838452, 'learning_rate': 9.819048624343718e-06, 'epoch': 0.11} 11%|█▏ | 3885/34278 [4:25:10<27:24:07, 3.25s/it] 11%|█▏ | 3886/34278 [4:25:16<34:18:02, 4.06s/it] {'loss': 0.1991, 'grad_norm': 0.9129466865242515, 'learning_rate': 9.818922656091113e-06, 'epoch': 0.11} 11%|█▏ | 3886/34278 [4:25:16<34:18:02, 4.06s/it] 11%|█▏ | 3887/34278 [4:25:19<32:28:17, 3.85s/it] {'loss': 0.1902, 'grad_norm': 0.7584331411062796, 'learning_rate': 
9.818796644816315e-06, 'epoch': 0.11} 11%|█▏ | 3887/34278 [4:25:19<32:28:17, 3.85s/it] 11%|█▏ | 3888/34278 [4:25:22<30:13:30, 3.58s/it] {'loss': 0.1941, 'grad_norm': 0.9276645585519963, 'learning_rate': 9.818670590520452e-06, 'epoch': 0.11} 11%|█▏ | 3888/34278 [4:25:22<30:13:30, 3.58s/it] 11%|█▏ | 3889/34278 [4:25:25<28:41:22, 3.40s/it] {'loss': 0.1913, 'grad_norm': 0.7732179536792737, 'learning_rate': 9.818544493204647e-06, 'epoch': 0.11} 11%|█▏ | 3889/34278 [4:25:25<28:41:22, 3.40s/it] 11%|█▏ | 3890/34278 [4:25:29<30:30:59, 3.62s/it] {'loss': 0.1911, 'grad_norm': 0.7812702008459518, 'learning_rate': 9.818418352870028e-06, 'epoch': 0.11} 11%|█▏ | 3890/34278 [4:25:29<30:30:59, 3.62s/it] 11%|█▏ | 3891/34278 [4:25:35<36:33:23, 4.33s/it] {'loss': 0.179, 'grad_norm': 0.8628150701202014, 'learning_rate': 9.81829216951772e-06, 'epoch': 0.11} 11%|█▏ | 3891/34278 [4:25:35<36:33:23, 4.33s/it] 11%|█▏ | 3892/34278 [4:25:38<33:06:10, 3.92s/it] {'loss': 0.1722, 'grad_norm': 0.6979059627359, 'learning_rate': 9.81816594314885e-06, 'epoch': 0.11} 11%|█▏ | 3892/34278 [4:25:38<33:06:10, 3.92s/it] 11%|█▏ | 3893/34278 [4:25:42<31:42:32, 3.76s/it] {'loss': 0.1881, 'grad_norm': 0.9831420334341316, 'learning_rate': 9.818039673764543e-06, 'epoch': 0.11} 11%|█▏ | 3893/34278 [4:25:42<31:42:32, 3.76s/it] 11%|█▏ | 3894/34278 [4:25:45<30:41:06, 3.64s/it] {'loss': 0.1912, 'grad_norm': 0.7137533847401493, 'learning_rate': 9.817913361365931e-06, 'epoch': 0.11} 11%|█▏ | 3894/34278 [4:25:45<30:41:06, 3.64s/it] 11%|█▏ | 3895/34278 [4:25:48<29:34:36, 3.50s/it] {'loss': 0.1655, 'grad_norm': 0.7777777033964453, 'learning_rate': 9.817787005954136e-06, 'epoch': 0.11} 11%|█▏ | 3895/34278 [4:25:48<29:34:36, 3.50s/it] 11%|█▏ | 3896/34278 [4:25:52<29:24:23, 3.48s/it] {'loss': 0.1768, 'grad_norm': 0.8134838142745925, 'learning_rate': 9.81766060753029e-06, 'epoch': 0.11} 11%|█▏ | 3896/34278 [4:25:52<29:24:23, 3.48s/it] 11%|█▏ | 3897/34278 [4:25:55<28:33:48, 3.38s/it] {'loss': 0.1702, 'grad_norm': 
0.8617382337355313, 'learning_rate': 9.817534166095519e-06, 'epoch': 0.11} 11%|█▏ | 3897/34278 [4:25:55<28:33:48, 3.38s/it] 11%|█▏ | 3898/34278 [4:25:58<28:16:13, 3.35s/it] {'loss': 0.178, 'grad_norm': 0.7574232328325251, 'learning_rate': 9.817407681650955e-06, 'epoch': 0.11} 11%|█▏ | 3898/34278 [4:25:58<28:16:13, 3.35s/it] 11%|█▏ | 3899/34278 [4:26:02<30:31:13, 3.62s/it] {'loss': 0.1813, 'grad_norm': 0.967889985141796, 'learning_rate': 9.817281154197725e-06, 'epoch': 0.11} 11%|█▏ | 3899/34278 [4:26:02<30:31:13, 3.62s/it] 11%|█▏ | 3900/34278 [4:26:06<30:14:33, 3.58s/it] {'loss': 0.2091, 'grad_norm': 0.8882074160499374, 'learning_rate': 9.817154583736956e-06, 'epoch': 0.11} 11%|█▏ | 3900/34278 [4:26:06<30:14:33, 3.58s/it] 11%|█▏ | 3901/34278 [4:26:09<29:34:51, 3.51s/it] {'loss': 0.1617, 'grad_norm': 0.8488250995262521, 'learning_rate': 9.817027970269783e-06, 'epoch': 0.11} 11%|█▏ | 3901/34278 [4:26:09<29:34:51, 3.51s/it] 11%|█▏ | 3902/34278 [4:26:12<28:43:50, 3.41s/it] {'loss': 0.1658, 'grad_norm': 0.9773972648043582, 'learning_rate': 9.816901313797333e-06, 'epoch': 0.11} 11%|█▏ | 3902/34278 [4:26:12<28:43:50, 3.41s/it] 11%|█▏ | 3903/34278 [4:26:15<28:24:11, 3.37s/it] {'loss': 0.1725, 'grad_norm': 0.8364426964259626, 'learning_rate': 9.81677461432074e-06, 'epoch': 0.11} 11%|█▏ | 3903/34278 [4:26:15<28:24:11, 3.37s/it] 11%|█▏ | 3904/34278 [4:26:19<28:10:01, 3.34s/it] {'loss': 0.1689, 'grad_norm': 0.9219903731469782, 'learning_rate': 9.816647871841132e-06, 'epoch': 0.11} 11%|█▏ | 3904/34278 [4:26:19<28:10:01, 3.34s/it] 11%|█▏ | 3905/34278 [4:26:25<35:23:38, 4.20s/it] {'loss': 0.1664, 'grad_norm': 0.8414022118558264, 'learning_rate': 9.816521086359641e-06, 'epoch': 0.11} 11%|█▏ | 3905/34278 [4:26:25<35:23:38, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 11%|█▏ | 3906/34278 [4:26:31<39:49:06, 4.72s/it] {'loss': 0.2033, 'grad_norm': 1.1454968057716195, 'learning_rate': 9.8163942578774e-06, 'epoch': 0.11} 11%|█▏ | 3906/34278 [4:26:31<39:49:06, 4.72s/it] 11%|█▏ | 3907/34278 [4:26:34<35:10:53, 4.17s/it] {'loss': 0.182, 'grad_norm': 1.0569777052292275, 'learning_rate': 9.816267386395542e-06, 'epoch': 0.11} 11%|█▏ | 3907/34278 [4:26:34<35:10:53, 4.17s/it] 11%|█▏ | 3908/34278 [4:26:37<32:25:55, 3.84s/it] {'loss': 0.1924, 'grad_norm': 0.8936548661951315, 'learning_rate': 9.816140471915196e-06, 'epoch': 0.11} 11%|█▏ | 3908/34278 [4:26:37<32:25:55, 3.84s/it] 11%|█▏ | 3909/34278 [4:26:40<29:47:19, 3.53s/it] {'loss': 0.2064, 'grad_norm': 1.0896106836331243, 'learning_rate': 9.8160135144375e-06, 'epoch': 0.11} 11%|█▏ | 3909/34278 [4:26:40<29:47:19, 3.53s/it] 11%|█▏ | 3910/34278 [4:26:43<29:12:50, 3.46s/it] {'loss': 0.1767, 'grad_norm': 0.8439333244411857, 'learning_rate': 9.815886513963584e-06, 'epoch': 0.11} 11%|█▏ | 3910/34278 [4:26:43<29:12:50, 3.46s/it] 11%|█▏ | 3911/34278 [4:26:47<30:41:18, 3.64s/it] {'loss': 0.1896, 'grad_norm': 1.012792895535858, 'learning_rate': 9.815759470494582e-06, 'epoch': 0.11} 11%|█▏ | 3911/34278 [4:26:47<30:41:18, 3.64s/it] 11%|█▏ | 3912/34278 [4:26:50<29:29:52, 3.50s/it] {'loss': 0.2073, 'grad_norm': 1.1158822214582353, 'learning_rate': 9.81563238403163e-06, 'epoch': 0.11} 11%|█▏ | 3912/34278 [4:26:50<29:29:52, 3.50s/it] 11%|█▏ | 3913/34278 [4:26:54<29:34:41, 3.51s/it] {'loss': 0.1929, 'grad_norm': 0.8168770507793907, 'learning_rate': 9.815505254575862e-06, 'epoch': 0.11} 11%|█▏ | 3913/34278 [4:26:54<29:34:41, 3.51s/it] 11%|█▏ | 3914/34278 [4:26:57<28:32:28, 3.38s/it] {'loss': 0.1842, 'grad_norm': 1.0465396613520146, 'learning_rate': 9.815378082128414e-06, 'epoch': 0.11} 11%|█▏ | 3914/34278 [4:26:57<28:32:28, 3.38s/it] 11%|█▏ | 3915/34278 [4:27:03<35:38:43, 4.23s/it] {'loss': 0.1921, 'grad_norm': 1.03863955639095, 'learning_rate': 9.815250866690418e-06, 
'epoch': 0.11} 11%|█▏ | 3915/34278 [4:27:03<35:38:43, 4.23s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 11%|█▏ | 3916/34278 [4:27:07<33:55:20, 4.02s/it] {'loss': 0.1971, 'grad_norm': 0.8759339958631058, 'learning_rate': 9.815123608263011e-06, 'epoch': 0.11} 11%|█▏ | 3916/34278 [4:27:07<33:55:20, 4.02s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0c247acb30>
Failed to fetch sample 2735736.
Exception: cannot identify image file <_io.BytesIO object at 0x7f0c247acb30> 11%|█▏ | 3917/34278 [4:27:09<30:40:58, 3.64s/it] {'loss': 0.1914, 'grad_norm': 0.9576286845939498, 'learning_rate': 9.81499630684733e-06, 'epoch': 0.11} 11%|█▏ | 3917/34278 [4:27:09<30:40:58, 3.64s/it] 11%|█▏ | 3918/34278 [4:27:13<30:21:41, 3.60s/it] {'loss': 0.1733, 'grad_norm': 0.8924321060739058, 'learning_rate': 9.814868962444512e-06, 'epoch': 0.11} 11%|█▏ | 3918/34278 [4:27:13<30:21:41, 3.60s/it] 11%|█▏ | 3919/34278 [4:27:16<29:15:15, 3.47s/it] {'loss': 0.201, 'grad_norm': 0.9533798172660076, 'learning_rate': 9.814741575055694e-06, 'epoch': 0.11} 11%|█▏ | 3919/34278 [4:27:16<29:15:15, 3.47s/it] 11%|█▏ | 3920/34278 [4:27:19<28:09:57, 3.34s/it] {'loss': 0.1709, 'grad_norm': 0.8569490934644632, 'learning_rate': 9.814614144682014e-06, 'epoch': 0.11} 11%|█▏ | 3920/34278 [4:27:19<28:09:57, 3.34s/it] 11%|█▏ | 3921/34278 [4:27:22<26:43:48, 3.17s/it] {'loss': 0.2006, 'grad_norm': 0.9917094580774355, 'learning_rate': 9.814486671324604e-06, 'epoch': 0.11} 11%|█▏ | 3921/34278 [4:27:22<26:43:48, 3.17s/it] 11%|█▏ | 3922/34278 [4:27:25<26:10:42, 3.10s/it] {'loss': 0.1792, 'grad_norm': 0.9987004050751437, 'learning_rate': 9.81435915498461e-06, 'epoch': 0.11} 11%|█▏ | 3922/34278 [4:27:25<26:10:42, 3.10s/it] 11%|█▏ | 3923/34278 [4:27:28<26:22:28, 3.13s/it] {'loss': 0.1784, 'grad_norm': 0.9412265153598919, 'learning_rate': 9.814231595663165e-06, 'epoch': 0.11} 11%|█▏ | 3923/34278 [4:27:28<26:22:28, 3.13s/it] 11%|█▏ | 3924/34278 [4:27:31<26:15:21, 3.11s/it] {'loss': 0.1866, 'grad_norm': 0.9343015684978182, 'learning_rate': 9.81410399336141e-06, 'epoch': 0.11} 11%|█▏ | 3924/34278 [4:27:31<26:15:21, 3.11s/it] 11%|█▏ | 3925/34278 [4:27:34<26:49:01, 3.18s/it] {'loss': 0.2045, 'grad_norm': 1.0180321162969372, 'learning_rate': 9.813976348080484e-06, 'epoch': 0.11} 11%|█▏ | 3925/34278 [4:27:34<26:49:01, 3.18s/it] 11%|█▏ | 3926/34278 [4:27:40<34:01:12, 4.04s/it] {'loss': 0.1665, 'grad_norm': 0.7786413912112332, 
'learning_rate': 9.813848659821526e-06, 'epoch': 0.11} 11%|█▏ | 3926/34278 [4:27:40<34:01:12, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█▏ | 3927/34278 [4:27:44<33:37:41, 3.99s/it] {'loss': 0.1968, 'grad_norm': 0.9569943966338017, 'learning_rate': 9.813720928585678e-06, 'epoch': 0.11} 11%|█▏ | 3927/34278 [4:27:44<33:37:41, 3.99s/it] 11%|█▏ | 3928/34278 [4:27:47<31:01:32, 3.68s/it] {'loss': 0.1662, 'grad_norm': 0.8815021786382026, 'learning_rate': 9.813593154374075e-06, 'epoch': 0.11} 11%|█▏ | 3928/34278 [4:27:47<31:01:32, 3.68s/it] 11%|█▏ | 3929/34278 [4:27:50<30:00:12, 3.56s/it] {'loss': 0.1853, 'grad_norm': 0.8900614397711096, 'learning_rate': 9.813465337187864e-06, 'epoch': 0.11} 11%|█▏ | 3929/34278 [4:27:50<30:00:12, 3.56s/it] 11%|█▏ | 3930/34278 [4:27:56<35:40:25, 4.23s/it] {'loss': 0.1685, 'grad_norm': 0.8984565850328089, 'learning_rate': 9.813337477028184e-06, 'epoch': 0.11} 11%|█▏ | 3930/34278 [4:27:56<35:40:25, 4.23s/it] 11%|█▏ | 3931/34278 [4:27:59<33:00:35, 3.92s/it] {'loss': 0.2095, 'grad_norm': 1.4587227900082587, 'learning_rate': 9.813209573896175e-06, 'epoch': 0.11} 11%|█▏ | 3931/34278 [4:27:59<33:00:35, 3.92s/it] 11%|█▏ | 3932/34278 [4:28:03<32:38:14, 3.87s/it] {'loss': 0.1766, 'grad_norm': 1.0989382820231854, 'learning_rate': 9.81308162779298e-06, 'epoch': 0.11} 11%|█▏ | 3932/34278 [4:28:03<32:38:14, 3.87s/it] 11%|█▏ | 3933/34278 [4:28:08<36:12:23, 4.30s/it] {'loss': 0.1852, 'grad_norm': 0.7076470715977727, 'learning_rate': 9.812953638719741e-06, 'epoch': 0.11} 11%|█▏ | 3933/34278 [4:28:09<36:12:23, 4.30s/it] 11%|█▏ | 3934/34278 [4:28:12<33:45:56, 4.01s/it] {'loss': 0.2041, 'grad_norm': 0.8461530718765761, 'learning_rate': 9.812825606677601e-06, 'epoch': 0.11} 11%|█▏ | 3934/34278 [4:28:12<33:45:56, 4.01s/it]Token indices sequence length is longer than the 
specified maximum sequence length for this model (9141 > 8192). Running this sequence through the model will result in indexing errors 11%|█▏ | 3935/34278 [4:28:16<33:22:19, 3.96s/it] {'loss': 0.1848, 'grad_norm': 0.807544292357872, 'learning_rate': 9.812697531667704e-06, 'epoch': 0.11} 11%|█▏ | 3935/34278 [4:28:16<33:22:19, 3.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 11%|█▏ | 3936/34278 [4:28:19<31:24:25, 3.73s/it] {'loss': 0.1939, 'grad_norm': 0.934358375278067, 'learning_rate': 9.812569413691191e-06, 'epoch': 0.11} 11%|█▏ | 3936/34278 [4:28:19<31:24:25, 3.73s/it] 11%|█▏ | 3937/34278 [4:28:23<33:24:19, 3.96s/it] {'loss': 0.176, 'grad_norm': 0.6569794659233321, 'learning_rate': 9.812441252749207e-06, 'epoch': 0.11} 11%|█▏ | 3937/34278 [4:28:23<33:24:19, 3.96s/it] 11%|█▏ | 3938/34278 [4:28:27<32:29:33, 3.86s/it] {'loss': 0.19, 'grad_norm': 0.865850056802861, 'learning_rate': 9.812313048842896e-06, 'epoch': 0.11} 11%|█▏ | 3938/34278 [4:28:27<32:29:33, 3.86s/it] 11%|█▏ | 3939/34278 [4:28:31<32:18:27, 3.83s/it] {'loss': 0.1671, 'grad_norm': 1.4734153159013528, 'learning_rate': 9.812184801973405e-06, 'epoch': 0.11} 11%|█▏ | 3939/34278 [4:28:31<32:18:27, 3.83s/it] 11%|█▏ | 3940/34278 [4:28:34<31:41:17, 3.76s/it] {'loss': 0.1888, 'grad_norm': 0.7067043431495279, 'learning_rate': 9.812056512141875e-06, 'epoch': 0.11} 11%|█▏ | 3940/34278 [4:28:34<31:41:17, 3.76s/it] 11%|█▏ | 3941/34278 [4:28:38<32:25:55, 3.85s/it] {'loss': 0.1906, 'grad_norm': 0.7949118566718023, 'learning_rate': 9.811928179349455e-06, 'epoch': 0.11} 11%|█▏ | 3941/34278 [4:28:38<32:25:55, 3.85s/it] 12%|█▏ | 3942/34278 [4:28:41<29:59:10, 3.56s/it] {'loss': 0.1859, 'grad_norm': 0.8324770315797557, 'learning_rate': 9.811799803597286e-06, 'epoch': 0.12} 12%|█▏ | 3942/34278 [4:28:41<29:59:10, 3.56s/it] 12%|█▏ | 3943/34278 
[4:28:46<34:08:18, 4.05s/it] {'loss': 0.1884, 'grad_norm': 0.7840093644842944, 'learning_rate': 9.811671384886518e-06, 'epoch': 0.12} 12%|█▏ | 3943/34278 [4:28:47<34:08:18, 4.05s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 12%|█▏ | 3944/34278 [4:28:50<31:47:26, 3.77s/it] {'loss': 0.1879, 'grad_norm': 1.0144230590449386, 'learning_rate': 9.811542923218298e-06, 'epoch': 0.12} 12%|█▏ | 3944/34278 [4:28:50<31:47:26, 3.77s/it] 12%|█▏ | 3945/34278 [4:28:54<34:16:46, 4.07s/it] {'loss': 0.2091, 'grad_norm': 0.9499532149031065, 'learning_rate': 9.811414418593771e-06, 'epoch': 0.12} 12%|█▏ | 3945/34278 [4:28:54<34:16:46, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 12%|█▏ | 3946/34278 [4:28:57<31:09:02, 3.70s/it] {'loss': 0.1768, 'grad_norm': 1.845968864637907, 'learning_rate': 9.811285871014084e-06, 'epoch': 0.12} 12%|█▏ | 3946/34278 [4:28:57<31:09:02, 3.70s/it] 12%|█▏ | 3947/34278 [4:29:00<29:37:53, 3.52s/it] {'loss': 0.1724, 'grad_norm': 0.915737179055813, 'learning_rate': 9.811157280480386e-06, 'epoch': 0.12} 12%|█▏ | 3947/34278 [4:29:00<29:37:53, 3.52s/it] 12%|█▏ | 3948/34278 [4:29:03<28:36:04, 3.39s/it] {'loss': 0.1861, 'grad_norm': 0.8715096334382696, 'learning_rate': 9.811028646993823e-06, 'epoch': 0.12} 12%|█▏ | 3948/34278 [4:29:03<28:36:04, 3.39s/it] 12%|█▏ | 3949/34278 [4:29:07<27:59:14, 3.32s/it] {'loss': 0.2188, 'grad_norm': 1.1633558319535617, 'learning_rate': 9.810899970555547e-06, 'epoch': 0.12} 12%|█▏ | 3949/34278 [4:29:07<27:59:14, 3.32s/it] 12%|█▏ | 3950/34278 [4:29:13<34:43:42, 4.12s/it] {'loss': 0.1959, 'grad_norm': 1.0185924503742674, 'learning_rate': 9.810771251166702e-06, 'epoch': 0.12} 
12%|█▏ | 3950/34278 [4:29:13<34:43:42, 4.12s/it] 12%|█▏ | 3951/34278 [4:29:19<39:44:07, 4.72s/it] {'loss': 0.1864, 'grad_norm': 0.9597127919256301, 'learning_rate': 9.810642488828442e-06, 'epoch': 0.12} 12%|█▏ | 3951/34278 [4:29:19<39:44:07, 4.72s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 12%|█▏ | 3952/34278 [4:29:22<35:19:55, 4.19s/it] {'loss': 0.1681, 'grad_norm': 0.934634222614499, 'learning_rate': 9.810513683541913e-06, 'epoch': 0.12} 12%|█▏ | 3952/34278 [4:29:22<35:19:55, 4.19s/it] 12%|█▏ | 3953/34278 [4:29:25<33:46:26, 4.01s/it] {'loss': 0.1939, 'grad_norm': 0.9237840016353603, 'learning_rate': 9.810384835308266e-06, 'epoch': 0.12} 12%|█▏ | 3953/34278 [4:29:25<33:46:26, 4.01s/it] 12%|█▏ | 3954/34278 [4:29:31<37:54:16, 4.50s/it] {'loss': 0.1653, 'grad_norm': 0.849446720945563, 'learning_rate': 9.810255944128651e-06, 'epoch': 0.12} 12%|█▏ | 3954/34278 [4:29:31<37:54:16, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 12%|█▏ | 3955/34278 [4:29:34<34:24:56, 4.09s/it] {'loss': 0.1699, 'grad_norm': 1.084779195091929, 'learning_rate': 9.81012701000422e-06, 'epoch': 0.12} 12%|█▏ | 3955/34278 [4:29:34<34:24:56, 4.09s/it] 12%|█▏ | 3956/34278 [4:29:38<35:10:05, 4.18s/it] {'loss': 0.1733, 'grad_norm': 0.944132426603004, 'learning_rate': 9.809998032936123e-06, 'epoch': 0.12} 12%|█▏ | 3956/34278 [4:29:38<35:10:05, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 12%|█▏ | 3957/34278 [4:29:41<32:16:55, 3.83s/it] {'loss': 0.1782, 'grad_norm': 0.961987504303707, 'learning_rate': 9.809869012925512e-06, 'epoch': 0.12} 12%|█▏ | 3957/34278 [4:29:41<32:16:55, 3.83s/it] 12%|█▏ | 3958/34278 [4:29:46<33:57:41, 4.03s/it] {'loss': 0.1813, 'grad_norm': 1.0617644175619754, 'learning_rate': 9.80973994997354e-06, 'epoch': 0.12} 12%|█▏ | 3958/34278 [4:29:46<33:57:41, 4.03s/it] 12%|█▏ | 3959/34278 [4:29:52<38:43:12, 4.60s/it] {'loss': 0.167, 'grad_norm': 0.9413351711979291, 'learning_rate': 9.809610844081357e-06, 'epoch': 0.12} 12%|█▏ | 3959/34278 [4:29:52<38:43:12, 4.60s/it] 12%|█▏ | 3960/34278 [4:29:55<34:16:12, 4.07s/it] {'loss': 0.1549, 'grad_norm': 0.8095112512951378, 'learning_rate': 9.809481695250116e-06, 'epoch': 0.12} 12%|█▏ | 3960/34278 [4:29:55<34:16:12, 4.07s/it] 12%|█▏ | 3961/34278 [4:30:01<39:15:49, 4.66s/it] {'loss': 0.1953, 'grad_norm': 1.0105866694575891, 'learning_rate': 9.80935250348097e-06, 'epoch': 0.12} 12%|█▏ | 3961/34278 [4:30:01<39:15:49, 4.66s/it] 12%|█▏ | 3962/34278 [4:30:04<35:12:48, 4.18s/it] {'loss': 0.1852, 'grad_norm': 1.177664778950724, 'learning_rate': 9.809223268775074e-06, 'epoch': 0.12} 12%|█▏ | 3962/34278 [4:30:04<35:12:48, 4.18s/it] 12%|█▏ | 3963/34278 [4:30:07<33:46:24, 4.01s/it] {'loss': 0.1861, 'grad_norm': 0.7748362871271275, 'learning_rate': 9.80909399113358e-06, 'epoch': 0.12} 12%|█▏ | 3963/34278 [4:30:07<33:46:24, 4.01s/it] 12%|█▏ | 3964/34278 [4:30:10<30:41:13, 3.64s/it] {'loss': 0.1603, 'grad_norm': 0.8218146313453756, 'learning_rate': 9.808964670557643e-06, 'epoch': 0.12} 12%|█▏ | 3964/34278 [4:30:10<30:41:13, 3.64s/it] 12%|█▏ | 3965/34278 [4:30:13<29:51:27, 3.55s/it] {'loss': 0.1827, 'grad_norm': 1.1794113164782682, 'learning_rate': 9.80883530704842e-06, 'epoch': 0.12} 12%|█▏ | 3965/34278 [4:30:13<29:51:27, 3.55s/it] 12%|█▏ | 3966/34278 [4:30:17<28:37:10, 3.40s/it] {'loss': 0.1541, 'grad_norm': 0.7130535387940719, 'learning_rate': 9.808705900607058e-06, 
'epoch': 0.12} 12%|█▏ | 3966/34278 [4:30:17<28:37:10, 3.40s/it] 12%|█▏ | 3967/34278 [4:30:21<30:40:45, 3.64s/it] {'loss': 0.179, 'grad_norm': 0.9235360139481987, 'learning_rate': 9.808576451234721e-06, 'epoch': 0.12} 12%|█▏ | 3967/34278 [4:30:21<30:40:45, 3.64s/it] 12%|█▏ | 3968/34278 [4:30:25<31:35:20, 3.75s/it] {'loss': 0.1914, 'grad_norm': 1.1301699140384471, 'learning_rate': 9.80844695893256e-06, 'epoch': 0.12} 12%|█▏ | 3968/34278 [4:30:25<31:35:20, 3.75s/it] 12%|█▏ | 3969/34278 [4:30:31<36:51:49, 4.38s/it] {'loss': 0.1531, 'grad_norm': 0.8260365779849368, 'learning_rate': 9.808317423701735e-06, 'epoch': 0.12} 12%|█▏ | 3969/34278 [4:30:31<36:51:49, 4.38s/it] 12%|█▏ | 3970/34278 [4:30:33<33:01:17, 3.92s/it] {'loss': 0.1969, 'grad_norm': 0.8304113067468187, 'learning_rate': 9.808187845543397e-06, 'epoch': 0.12} 12%|█▏ | 3970/34278 [4:30:33<33:01:17, 3.92s/it] 12%|█▏ | 3971/34278 [4:30:37<32:08:38, 3.82s/it] {'loss': 0.1897, 'grad_norm': 0.9679207163452974, 'learning_rate': 9.808058224458708e-06, 'epoch': 0.12} 12%|█▏ | 3971/34278 [4:30:37<32:08:38, 3.82s/it] 12%|█▏ | 3972/34278 [4:30:40<30:36:02, 3.63s/it] {'loss': 0.1729, 'grad_norm': 0.8470787888221096, 'learning_rate': 9.807928560448822e-06, 'epoch': 0.12} 12%|█▏ | 3972/34278 [4:30:40<30:36:02, 3.63s/it] 12%|█▏ | 3973/34278 [4:30:43<28:46:38, 3.42s/it] {'loss': 0.1717, 'grad_norm': 0.9483774764824003, 'learning_rate': 9.807798853514898e-06, 'epoch': 0.12} 12%|█▏ | 3973/34278 [4:30:43<28:46:38, 3.42s/it] 12%|█▏ | 3974/34278 [4:30:46<28:29:01, 3.38s/it] {'loss': 0.187, 'grad_norm': 0.8434783474399654, 'learning_rate': 9.807669103658092e-06, 'epoch': 0.12} 12%|█▏ | 3974/34278 [4:30:46<28:29:01, 3.38s/it] 12%|█▏ | 3975/34278 [4:30:49<27:02:24, 3.21s/it] {'loss': 0.2028, 'grad_norm': 1.138839474688897, 'learning_rate': 9.807539310879566e-06, 'epoch': 0.12} 12%|█▏ | 3975/34278 [4:30:49<27:02:24, 3.21s/it] 12%|█▏ | 3976/34278 [4:30:55<33:39:46, 4.00s/it] {'loss': 0.1715, 'grad_norm': 0.8098813997017877, 
'learning_rate': 9.807409475180476e-06, 'epoch': 0.12} 12%|█▏ | 3976/34278 [4:30:55<33:39:46, 4.00s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fefcb0fe430>
Failed to fetch sample 3655705.
Exception: cannot identify image file <_io.BytesIO object at 0x7fefcb0fe430> 12%|█▏ | 3977/34278 [4:30:58<31:13:00, 3.71s/it] {'loss': 0.1742, 'grad_norm': 0.8476295399786722, 'learning_rate': 9.80727959656198e-06, 'epoch': 0.12} 12%|█▏ | 3977/34278 [4:30:58<31:13:00, 3.71s/it] 12%|█▏ | 3978/34278 [4:31:04<36:36:10, 4.35s/it] {'loss': 0.1936, 'grad_norm': 1.0189550480133291, 'learning_rate': 9.807149675025242e-06, 'epoch': 0.12} 12%|█▏ | 3978/34278 [4:31:04<36:36:10, 4.35s/it] 12%|█▏ | 3979/34278 [4:31:10<40:36:22, 4.82s/it] {'loss': 0.1925, 'grad_norm': 0.9583739475972958, 'learning_rate': 9.807019710571418e-06, 'epoch': 0.12} 12%|█▏ | 3979/34278 [4:31:10<40:36:22, 4.82s/it] 12%|█▏ | 3980/34278 [4:31:13<36:20:38, 4.32s/it] {'loss': 0.184, 'grad_norm': 0.947373128348233, 'learning_rate': 9.80688970320167e-06, 'epoch': 0.12} 12%|█▏ | 3980/34278 [4:31:13<36:20:38, 4.32s/it] 12%|█▏ | 3981/34278 [4:31:16<32:05:28, 3.81s/it] {'loss': 0.194, 'grad_norm': 1.0217865886958784, 'learning_rate': 9.806759652917157e-06, 'epoch': 0.12} 12%|█▏ | 3981/34278 [4:31:16<32:05:28, 3.81s/it] 12%|█▏ | 3982/34278 [4:31:19<31:12:50, 3.71s/it] {'loss': 0.1809, 'grad_norm': 0.9091592305269642, 'learning_rate': 9.806629559719042e-06, 'epoch': 0.12} 12%|█▏ | 3982/34278 [4:31:19<31:12:50, 3.71s/it] 12%|█▏ | 3983/34278 [4:31:23<31:02:29, 3.69s/it] {'loss': 0.1788, 'grad_norm': 0.927284884626031, 'learning_rate': 9.806499423608486e-06, 'epoch': 0.12} 12%|█▏ | 3983/34278 [4:31:23<31:02:29, 3.69s/it] 12%|█▏ | 3984/34278 [4:31:29<36:52:14, 4.38s/it] {'loss': 0.1802, 'grad_norm': 1.0614083593915533, 'learning_rate': 9.80636924458665e-06, 'epoch': 0.12} 12%|█▏ | 3984/34278 [4:31:29<36:52:14, 4.38s/it] 12%|█▏ | 3985/34278 [4:31:32<33:01:35, 3.92s/it] {'loss': 0.2047, 'grad_norm': 0.8477671183172, 'learning_rate': 9.806239022654699e-06, 'epoch': 0.12} 12%|█▏ | 3985/34278 [4:31:32<33:01:35, 3.92s/it] 12%|█▏ | 3986/34278 [4:31:35<32:05:18, 3.81s/it] {'loss': 0.1742, 'grad_norm': 1.0228600825612724, 
'learning_rate': 9.80610875781379e-06, 'epoch': 0.12} 12%|█▏ | 3986/34278 [4:31:35<32:05:18, 3.81s/it] 12%|█▏ | 3987/34278 [4:31:39<31:02:55, 3.69s/it] {'loss': 0.1681, 'grad_norm': 0.992616764441537, 'learning_rate': 9.805978450065092e-06, 'epoch': 0.12} 12%|█▏ | 3987/34278 [4:31:39<31:02:55, 3.69s/it] 12%|█▏ | 3988/34278 [4:31:41<28:59:35, 3.45s/it] {'loss': 0.1861, 'grad_norm': 0.9306633160170079, 'learning_rate': 9.805848099409765e-06, 'epoch': 0.12} 12%|█▏ | 3988/34278 [4:31:41<28:59:35, 3.45s/it] 12%|█▏ | 3989/34278 [4:31:47<34:46:14, 4.13s/it] {'loss': 0.1747, 'grad_norm': 1.0939088043077234, 'learning_rate': 9.805717705848972e-06, 'epoch': 0.12} 12%|█▏ | 3989/34278 [4:31:47<34:46:14, 4.13s/it] 12%|█▏ | 3990/34278 [4:31:50<32:24:33, 3.85s/it] {'loss': 0.1892, 'grad_norm': 0.8550016996193792, 'learning_rate': 9.805587269383881e-06, 'epoch': 0.12} 12%|█▏ | 3990/34278 [4:31:50<32:24:33, 3.85s/it] 12%|█▏ | 3991/34278 [4:31:54<31:49:10, 3.78s/it] {'loss': 0.1952, 'grad_norm': 1.0985057177862256, 'learning_rate': 9.805456790015652e-06, 'epoch': 0.12} 12%|█▏ | 3991/34278 [4:31:54<31:49:10, 3.78s/it] 12%|█▏ | 3992/34278 [4:31:58<32:16:02, 3.84s/it] {'loss': 0.1862, 'grad_norm': 1.2032884200598375, 'learning_rate': 9.805326267745451e-06, 'epoch': 0.12} 12%|█▏ | 3992/34278 [4:31:58<32:16:02, 3.84s/it] 12%|█▏ | 3993/34278 [4:32:04<38:12:08, 4.54s/it] {'loss': 0.1677, 'grad_norm': 0.8801044670755954, 'learning_rate': 9.805195702574446e-06, 'epoch': 0.12} 12%|█▏ | 3993/34278 [4:32:04<38:12:08, 4.54s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 12%|█▏ | 3994/34278 [4:32:07<34:56:19, 4.15s/it] {'loss': 0.2165, 'grad_norm': 0.9838842619636844, 'learning_rate': 9.805065094503801e-06, 'epoch': 0.12} 12%|█▏ | 3994/34278 [4:32:07<34:56:19, 4.15s/it] 12%|█▏ | 3995/34278 [4:32:12<34:51:46, 4.14s/it] {'loss': 0.1717, 'grad_norm': 1.036174236277911, 'learning_rate': 9.804934443534682e-06, 'epoch': 0.12} 12%|█▏ | 3995/34278 [4:32:12<34:51:46, 4.14s/it] 12%|█▏ | 3996/34278 [4:32:15<33:23:17, 3.97s/it] {'loss': 0.182, 'grad_norm': 1.0365517572217218, 'learning_rate': 9.804803749668254e-06, 'epoch': 0.12} 12%|█▏ | 3996/34278 [4:32:15<33:23:17, 3.97s/it] 12%|█▏ | 3997/34278 [4:32:21<38:47:38, 4.61s/it] {'loss': 0.2106, 'grad_norm': 1.2389885107002385, 'learning_rate': 9.804673012905686e-06, 'epoch': 0.12} 12%|█▏ | 3997/34278 [4:32:21<38:47:38, 4.61s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
 12%|█▏ | 3998/34278 [4:32:25<35:35:34, 4.23s/it] {'loss': 0.1792, 'grad_norm': 0.908669887314025, 'learning_rate': 9.804542233248144e-06, 'epoch': 0.12} 12%|█▏ | 3998/34278 [4:32:25<35:35:34, 4.23s/it] 12%|█▏ | 3999/34278 [4:32:27<31:51:59, 3.79s/it] {'loss': 0.1682, 'grad_norm': 0.7735821706347971, 'learning_rate': 9.804411410696797e-06, 'epoch': 0.12} 12%|█▏ | 3999/34278 [4:32:27<31:51:59, 3.79s/it] 12%|█▏ | 4000/34278 [4:32:30<30:18:08, 3.60s/it] {'loss': 0.1963, 'grad_norm': 0.9761508165663874, 'learning_rate': 9.804280545252812e-06, 'epoch': 0.12} 12%|█▏ | 4000/34278 [4:32:30<30:18:08, 3.60s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 12%|█▏ | 4001/34278 [4:33:03<102:13:09, 12.15s/it] {'loss': 0.1899, 'grad_norm': 0.8864981232126092, 'learning_rate': 9.804149636917355e-06, 'epoch': 0.12} 12%|█▏ | 4001/34278 [4:33:03<102:13:09, 12.15s/it] 12%|█▏ | 4002/34278 [4:33:08<84:27:50, 10.04s/it] {'loss': 0.1836, 'grad_norm': 0.8001310128164034, 'learning_rate': 9.8040186856916e-06, 'epoch': 0.12} 12%|█▏ | 4002/34278 [4:33:08<84:27:50, 10.04s/it] 12%|█▏ | 4003/34278 [4:33:11<66:41:00, 7.93s/it] {'loss': 0.1788, 'grad_norm': 0.8814383640876319, 'learning_rate': 9.80388769157671e-06, 'epoch': 0.12} 12%|█▏ | 4003/34278 [4:33:11<66:41:00, 7.93s/it] 12%|█▏ | 4004/34278 [4:33:14<54:15:01, 6.45s/it] {'loss': 0.1771, 'grad_norm': 0.8520770186053725, 'learning_rate': 9.803756654573857e-06, 'epoch': 0.12} 12%|█▏ | 4004/34278 [4:33:14<54:15:01, 6.45s/it] 12%|█▏ | 4005/34278 [4:33:17<46:34:24, 5.54s/it] {'loss': 0.1825, 'grad_norm': 0.880540957617837, 'learning_rate': 9.803625574684213e-06, 'epoch': 0.12} 12%|█▏ | 4005/34278 [4:33:17<46:34:24, 5.54s/it] 12%|█▏ | 4006/34278 [4:33:20<40:35:04, 4.83s/it] 
12%|█▏ | 4006/34278 [4:33:20<40:35:04, 4.83s/it] {'loss': 0.1881, 'grad_norm': 0.9309486508322999, 'learning_rate': 9.803494451908946e-06, 'epoch': 0.12}
12%|█▏ | 4007/34278 [4:33:23<36:33:05, 4.35s/it] {'loss': 0.1665, 'grad_norm': 0.8613043448542246, 'learning_rate': 9.803363286249228e-06, 'epoch': 0.12}
12%|█▏ | 4008/34278 [4:33:27<34:56:29, 4.16s/it] {'loss': 0.1835, 'grad_norm': 0.96365789331625, 'learning_rate': 9.803232077706229e-06, 'epoch': 0.12}
12%|█▏ | 4009/34278 [4:33:30<31:53:37, 3.79s/it] {'loss': 0.1949, 'grad_norm': 0.8145166726519757, 'learning_rate': 9.80310082628112e-06, 'epoch': 0.12}
12%|█▏ | 4010/34278 [4:33:33<29:12:16, 3.47s/it] {'loss': 0.1774, 'grad_norm': 0.8066981054531861, 'learning_rate': 9.802969531975074e-06, 'epoch': 0.12}
12%|█▏ | 4011/34278 [4:33:36<29:10:35, 3.47s/it] {'loss': 0.1775, 'grad_norm': 0.8025106983913187, 'learning_rate': 9.802838194789264e-06, 'epoch': 0.12}
12%|█▏ | 4012/34278 [4:33:42<35:40:03, 4.24s/it] {'loss': 0.1826, 'grad_norm': 0.7545697971668403, 'learning_rate': 9.802706814724857e-06, 'epoch': 0.12}
12%|█▏ | 4013/34278 [4:33:45<32:15:43, 3.84s/it] {'loss': 0.1763, 'grad_norm': 0.8854062913148097, 'learning_rate': 9.802575391783033e-06, 'epoch': 0.12}
12%|█▏ | 4014/34278 [4:33:48<30:21:40, 3.61s/it] {'loss': 0.1914, 'grad_norm': 0.886431520686121, 'learning_rate': 9.802443925964963e-06, 'epoch': 0.12}
12%|█▏ | 4015/34278 [4:33:52<31:00:53, 3.69s/it] {'loss': 0.1723, 'grad_norm': 0.819842833339432, 'learning_rate': 9.80231241727182e-06, 'epoch': 0.12}
12%|█▏ | 4016/34278 [4:33:55<28:49:52, 3.43s/it] {'loss': 0.1766, 'grad_norm': 0.777312991272999, 'learning_rate': 9.802180865704775e-06, 'epoch': 0.12}
12%|█▏ | 4017/34278 [4:33:58<28:47:06, 3.42s/it] {'loss': 0.1825, 'grad_norm': 0.9257122523737836, 'learning_rate': 9.80204927126501e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4018/34278 [4:34:02<28:03:11, 3.34s/it] {'loss': 0.1899, 'grad_norm': 0.8258775242878535, 'learning_rate': 9.801917633953693e-06, 'epoch': 0.12}
12%|█▏ | 4019/34278 [4:34:05<28:44:20, 3.42s/it] {'loss': 0.1886, 'grad_norm': 0.7616173750474217, 'learning_rate': 9.801785953772001e-06, 'epoch': 0.12}
12%|█▏ | 4020/34278 [4:34:09<30:36:53, 3.64s/it] {'loss': 0.2046, 'grad_norm': 0.9280170484458173, 'learning_rate': 9.801654230721111e-06, 'epoch': 0.12}
12%|█▏ | 4021/34278 [4:34:13<29:39:44, 3.53s/it] {'loss': 0.1842, 'grad_norm': 0.9799221946806403, 'learning_rate': 9.801522464802199e-06, 'epoch': 0.12}
12%|█▏ | 4022/34278 [4:34:16<29:17:58, 3.49s/it] {'loss': 0.1696, 'grad_norm': 0.8179915557194807, 'learning_rate': 9.80139065601644e-06, 'epoch': 0.12}
12%|█▏ | 4023/34278 [4:34:20<30:49:58, 3.67s/it] {'loss': 0.1735, 'grad_norm': 0.8595657911189106, 'learning_rate': 9.801258804365013e-06, 'epoch': 0.12}
12%|█▏ | 4024/34278 [4:34:23<28:08:45, 3.35s/it] {'loss': 0.1671, 'grad_norm': 0.9495398536792797, 'learning_rate': 9.80112690984909e-06, 'epoch': 0.12}
12%|█▏ | 4025/34278 [4:34:26<28:22:46, 3.38s/it] {'loss': 0.1657, 'grad_norm': 1.096707425460663, 'learning_rate': 9.800994972469855e-06, 'epoch': 0.12}
12%|█▏ | 4026/34278 [4:34:32<33:33:54, 3.99s/it] {'loss': 0.1806, 'grad_norm': 0.8307152374186523, 'learning_rate': 9.800862992228481e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4027/34278 [4:34:35<32:35:25, 3.88s/it] {'loss': 0.1627, 'grad_norm': 0.6971424405215413, 'learning_rate': 9.800730969126151e-06, 'epoch': 0.12}
12%|█▏ | 4028/34278 [4:34:38<30:21:19, 3.61s/it] {'loss': 0.1918, 'grad_norm': 1.1698168537208269, 'learning_rate': 9.800598903164039e-06, 'epoch': 0.12}
12%|█▏ | 4029/34278 [4:34:42<29:47:59, 3.55s/it] {'loss': 0.2005, 'grad_norm': 0.8671636053545344, 'learning_rate': 9.800466794343326e-06, 'epoch': 0.12}
12%|█▏ | 4030/34278 [4:34:45<30:07:10, 3.58s/it] {'loss': 0.1822, 'grad_norm': 0.7690461245297601, 'learning_rate': 9.800334642665193e-06, 'epoch': 0.12}
12%|█▏ | 4031/34278 [4:34:48<29:08:42, 3.47s/it] {'loss': 0.1686, 'grad_norm': 0.8673928340155402, 'learning_rate': 9.800202448130816e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4032/34278 [4:34:51<27:56:32, 3.33s/it] {'loss': 0.1915, 'grad_norm': 1.0113940236773769, 'learning_rate': 9.80007021074138e-06, 'epoch': 0.12}
12%|█▏ | 4033/34278 [4:34:56<30:29:43, 3.63s/it] {'loss': 0.1878, 'grad_norm': 0.835804990942079, 'learning_rate': 9.79993793049806e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4034/34278 [4:34:59<29:06:34, 3.46s/it] {'loss': 0.1573, 'grad_norm': 1.055479587939155, 'learning_rate': 9.799805607402042e-06, 'epoch': 0.12}
12%|█▏ | 4035/34278 [4:35:02<27:34:48, 3.28s/it] {'loss': 0.1709, 'grad_norm': 0.9922774040757671, 'learning_rate': 9.799673241454504e-06, 'epoch': 0.12}
12%|█▏ | 4036/34278 [4:35:05<28:36:45, 3.41s/it] {'loss': 0.1608, 'grad_norm': 0.8400929161992129, 'learning_rate': 9.79954083265663e-06, 'epoch': 0.12}
12%|█▏ | 4037/34278 [4:35:09<28:16:28, 3.37s/it] {'loss': 0.184, 'grad_norm': 1.0264751256358728, 'learning_rate': 9.7994083810096e-06, 'epoch': 0.12}
12%|█▏ | 4038/34278 [4:35:14<33:54:14, 4.04s/it] {'loss': 0.1754, 'grad_norm': 0.9068214960259822, 'learning_rate': 9.799275886514599e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4039/34278 [4:35:17<31:31:06, 3.75s/it] {'loss': 0.1583, 'grad_norm': 1.096105259722257, 'learning_rate': 9.799143349172809e-06, 'epoch': 0.12}
12%|█▏ | 4040/34278 [4:35:22<32:51:03, 3.91s/it] {'loss': 0.1569, 'grad_norm': 0.7592187551813588, 'learning_rate': 9.799010768985413e-06, 'epoch': 0.12}
12%|█▏ | 4041/34278 [4:35:24<30:01:54, 3.58s/it] {'loss': 0.2049, 'grad_norm': 0.9843277352889542, 'learning_rate': 9.798878145953592e-06, 'epoch': 0.12}
12%|█▏ | 4042/34278 [4:35:29<31:17:46, 3.73s/it] {'loss': 0.1993, 'grad_norm': 1.2825858480708008, 'learning_rate': 9.798745480078535e-06, 'epoch': 0.12}
12%|█▏ | 4043/34278 [4:35:33<32:10:44, 3.83s/it] {'loss': 0.1884, 'grad_norm': 0.8415361714071162, 'learning_rate': 9.798612771361423e-06, 'epoch': 0.12}
12%|█▏ | 4044/34278 [4:35:36<31:01:35, 3.69s/it] {'loss': 0.1677, 'grad_norm': 0.68541036268047, 'learning_rate': 9.798480019803442e-06, 'epoch': 0.12}
12%|█▏ | 4045/34278 [4:35:39<29:52:04, 3.56s/it] {'loss': 0.1775, 'grad_norm': 0.8833576539239999, 'learning_rate': 9.798347225405777e-06, 'epoch': 0.12}
12%|█▏ | 4046/34278 [4:35:43<29:19:33, 3.49s/it] {'loss': 0.1793, 'grad_norm': 0.7764367365558691, 'learning_rate': 9.798214388169613e-06, 'epoch': 0.12}
12%|█▏ | 4047/34278 [4:35:46<28:58:05, 3.45s/it] {'loss': 0.1672, 'grad_norm': 0.7572082043070054, 'learning_rate': 9.798081508096135e-06, 'epoch': 0.12}
12%|█▏ | 4048/34278 [4:35:49<28:43:07, 3.42s/it] {'loss': 0.178, 'grad_norm': 0.7607281881986289, 'learning_rate': 9.797948585186533e-06, 'epoch': 0.12}
12%|█▏ | 4049/34278 [4:35:52<27:23:24, 3.26s/it] {'loss': 0.2079, 'grad_norm': 1.0620435903387817, 'learning_rate': 9.79781561944199e-06, 'epoch': 0.12}
12%|█▏ | 4050/34278 [4:35:55<26:34:40, 3.17s/it] {'loss': 0.171, 'grad_norm': 0.7446112541942896, 'learning_rate': 9.797682610863695e-06, 'epoch': 0.12}
12%|█▏ | 4051/34278 [4:35:59<27:46:48, 3.31s/it] {'loss': 0.1609, 'grad_norm': 0.7523181824036657, 'learning_rate': 9.797549559452835e-06, 'epoch': 0.12}
12%|█▏ | 4052/34278 [4:36:02<27:15:35, 3.25s/it] {'loss': 0.1728, 'grad_norm': 0.7258478653299077, 'learning_rate': 9.797416465210599e-06, 'epoch': 0.12}
12%|█▏ | 4053/34278 [4:36:06<28:46:54, 3.43s/it] {'loss': 0.1783, 'grad_norm': 0.788248929247024, 'learning_rate': 9.797283328138172e-06, 'epoch': 0.12}
12%|█▏ | 4054/34278 [4:36:10<30:48:22, 3.67s/it] {'loss': 0.1922, 'grad_norm': 1.0221899508346437, 'learning_rate': 9.797150148236744e-06, 'epoch': 0.12}
12%|█▏ | 4055/34278 [4:36:13<29:45:26, 3.54s/it] {'loss': 0.195, 'grad_norm': 0.8504084637091053, 'learning_rate': 9.797016925507507e-06, 'epoch': 0.12}
12%|█▏ | 4056/34278 [4:36:16<28:23:19, 3.38s/it] {'loss': 0.1966, 'grad_norm': 0.8017278695647965, 'learning_rate': 9.796883659951648e-06, 'epoch': 0.12}
12%|█▏ | 4057/34278 [4:36:20<29:11:23, 3.48s/it] {'loss': 0.1683, 'grad_norm': 1.1323548778598158, 'learning_rate': 9.796750351570355e-06, 'epoch': 0.12}
12%|█▏ | 4058/34278 [4:36:24<31:50:15, 3.79s/it] {'loss': 0.1936, 'grad_norm': 1.1258246439497628, 'learning_rate': 9.79661700036482e-06, 'epoch': 0.12}
12%|█▏ | 4059/34278 [4:36:27<29:45:03, 3.54s/it] {'loss': 0.1841, 'grad_norm': 0.8904428235802615, 'learning_rate': 9.796483606336235e-06, 'epoch': 0.12}
12%|█▏ | 4060/34278 [4:36:30<28:22:22, 3.38s/it] {'loss': 0.1864, 'grad_norm': 1.1806870222346555, 'learning_rate': 9.796350169485789e-06, 'epoch': 0.12}
12%|█▏ | 4061/34278 [4:36:33<27:39:39, 3.30s/it] {'loss': 0.1982, 'grad_norm': 1.0329696619723054, 'learning_rate': 9.796216689814672e-06, 'epoch': 0.12}
12%|█▏ | 4062/34278 [4:36:37<29:21:54, 3.50s/it] {'loss': 0.1914, 'grad_norm': 0.9676735357582843, 'learning_rate': 9.79608316732408e-06, 'epoch': 0.12}
12%|█▏ | 4063/34278 [4:36:41<28:25:03, 3.39s/it] {'loss': 0.1786, 'grad_norm': 0.8864614157739367, 'learning_rate': 9.7959496020152e-06, 'epoch': 0.12}
12%|█▏ | 4064/34278 [4:36:47<35:06:02, 4.18s/it] {'loss': 0.1964, 'grad_norm': 1.0882041126738569, 'learning_rate': 9.795815993889229e-06, 'epoch': 0.12}
12%|█▏ | 4065/34278 [4:36:50<33:15:33, 3.96s/it] {'loss': 0.1833, 'grad_norm': 0.8471911828856352, 'learning_rate': 9.795682342947356e-06, 'epoch': 0.12}
12%|█▏ | 4066/34278 [4:36:56<37:42:08, 4.49s/it] {'loss': 0.1761, 'grad_norm': 0.7909408846675637, 'learning_rate': 9.795548649190777e-06, 'epoch': 0.12}
12%|█▏ | 4067/34278 [4:36:59<33:59:02, 4.05s/it] {'loss': 0.1947, 'grad_norm': 0.9649875727522704, 'learning_rate': 9.795414912620685e-06, 'epoch': 0.12}
12%|█▏ | 4068/34278 [4:37:02<32:12:24, 3.84s/it] {'loss': 0.1786, 'grad_norm': 0.8506455621724606, 'learning_rate': 9.79528113323827e-06, 'epoch': 0.12}
12%|█▏ | 4069/34278 [4:37:05<30:19:43, 3.61s/it] {'loss': 0.1874, 'grad_norm': 0.7179979463357128, 'learning_rate': 9.795147311044732e-06, 'epoch': 0.12}
12%|█▏ | 4070/34278 [4:37:11<36:36:07, 4.36s/it] {'loss': 0.177, 'grad_norm': 0.8867828001423357, 'learning_rate': 9.795013446041264e-06, 'epoch': 0.12}
12%|█▏ | 4071/34278 [4:37:16<37:34:00, 4.48s/it] {'loss': 0.1785, 'grad_norm': 0.7854471935045214, 'learning_rate': 9.79487953822906e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4072/34278 [4:37:20<35:44:09, 4.26s/it] {'loss': 0.1735, 'grad_norm': 0.8486496225340521, 'learning_rate': 9.794745587609318e-06, 'epoch': 0.12}
12%|█▏ | 4073/34278 [4:37:23<33:05:31, 3.94s/it] {'loss': 0.1722, 'grad_norm': 0.8685026532048004, 'learning_rate': 9.794611594183229e-06, 'epoch': 0.12}
12%|█▏ | 4074/34278 [4:37:26<30:31:47, 3.64s/it] {'loss': 0.1803, 'grad_norm': 0.9528898964468909, 'learning_rate': 9.794477557951993e-06, 'epoch': 0.12}
12%|█▏ | 4075/34278 [4:37:30<30:16:14, 3.61s/it] {'loss': 0.1779, 'grad_norm': 0.8342324505777576, 'learning_rate': 9.794343478916807e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4076/34278 [4:37:33<29:12:42, 3.48s/it] {'loss': 0.2007, 'grad_norm': 0.7742586727436434, 'learning_rate': 9.794209357078867e-06, 'epoch': 0.12}
12%|█▏ | 4077/34278 [4:37:37<32:23:33, 3.86s/it] {'loss': 0.1842, 'grad_norm': 0.9091465661381676, 'learning_rate': 9.79407519243937e-06, 'epoch': 0.12}
12%|█▏ | 4078/34278 [4:37:40<29:43:28, 3.54s/it] {'loss': 0.1462, 'grad_norm': 0.7469480493965509, 'learning_rate': 9.793940984999512e-06, 'epoch': 0.12}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41678cd030>
Failed to fetch sample 3353701. Exception: cannot identify image file <_io.BytesIO object at 0x7f41678cd030>
12%|█▏ | 4079/34278 [4:37:44<30:15:58, 3.61s/it] {'loss': 0.1947, 'grad_norm': 0.8886994889203368, 'learning_rate': 9.793806734760496e-06, 'epoch': 0.12}
12%|█▏ | 4080/34278 [4:37:47<29:11:49, 3.48s/it] {'loss': 0.1827, 'grad_norm': 0.965534648887828, 'learning_rate': 9.793672441723515e-06, 'epoch': 0.12}
12%|█▏ | 4081/34278 [4:37:51<29:53:06, 3.56s/it] {'loss': 0.1761, 'grad_norm': 0.9257063083089873, 'learning_rate': 9.793538105889775e-06, 'epoch': 0.12}
12%|█▏ | 4082/34278 [4:37:55<31:58:28, 3.81s/it] {'loss': 0.1908, 'grad_norm': 1.2208151541428913, 'learning_rate': 9.79340372726047e-06, 'epoch': 0.12}
12%|█▏ | 4083/34278 [4:37:59<31:29:38, 3.75s/it] {'loss': 0.191, 'grad_norm': 0.9993951765731436, 'learning_rate': 9.793269305836799e-06, 'epoch': 0.12}
12%|█▏ | 4084/34278 [4:38:03<31:04:58, 3.71s/it] {'loss': 0.1623, 'grad_norm': 0.8280227146034735, 'learning_rate': 9.793134841619964e-06, 'epoch': 0.12}
12%|█▏ | 4085/34278 [4:38:05<28:28:31, 3.40s/it] {'loss': 0.1563, 'grad_norm': 1.3120390396897854, 'learning_rate': 9.793000334611166e-06, 'epoch': 0.12}
12%|█▏ | 4086/34278 [4:38:09<30:27:43, 3.63s/it] {'loss': 0.1685, 'grad_norm': 0.8928568609837696, 'learning_rate': 9.792865784811604e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4087/34278 [4:38:13<30:20:13, 3.62s/it] {'loss': 0.1704, 'grad_norm': 1.0876694985665638, 'learning_rate': 9.792731192222482e-06, 'epoch': 0.12}
12%|█▏ | 4088/34278 [4:38:16<29:00:46, 3.46s/it] {'loss': 0.1743, 'grad_norm': 0.8772309262531918, 'learning_rate': 9.792596556845e-06, 'epoch': 0.12}
Token indices sequence length is longer than the specified maximum sequence length for this model (9036 > 8192). Running this sequence through the model will result in indexing errors
12%|█▏ | 4089/34278 [4:38:19<28:08:42, 3.36s/it] {'loss': 0.1548, 'grad_norm': 0.8882982565596025, 'learning_rate': 9.79246187868036e-06, 'epoch': 0.12}
12%|█▏ | 4090/34278 [4:38:23<28:35:06, 3.41s/it] {'loss': 0.1719, 'grad_norm': 0.7598220770302359, 'learning_rate': 9.792327157729762e-06, 'epoch': 0.12}
12%|█▏ | 4091/34278 [4:38:27<30:49:46, 3.68s/it] {'loss': 0.1703, 'grad_norm': 0.8979567040089305, 'learning_rate': 9.792192393994415e-06, 'epoch': 0.12}
12%|█▏ | 4092/34278 [4:38:33<36:50:28, 4.39s/it] {'loss': 0.1821, 'grad_norm': 1.0627609739083699, 'learning_rate': 9.792057587475516e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4093/34278 [4:38:37<34:36:05, 4.13s/it] {'loss': 0.1948, 'grad_norm': 0.9182593943569555, 'learning_rate': 9.79192273817427e-06, 'epoch': 0.12}
12%|█▏ | 4094/34278 [4:38:40<32:06:27, 3.83s/it] {'loss': 0.199, 'grad_norm': 0.8349549529579332, 'learning_rate': 9.791787846091883e-06, 'epoch': 0.12}
12%|█▏ | 4095/34278 [4:38:43<30:38:21, 3.65s/it] {'loss': 0.192, 'grad_norm': 0.9436604013645415, 'learning_rate': 9.79165291122956e-06, 'epoch': 0.12}
12%|█▏ | 4096/34278 [4:38:47<30:49:46, 3.68s/it] {'loss': 0.1638, 'grad_norm': 0.8038645009853541, 'learning_rate': 9.7915179335885e-06, 'epoch': 0.12}
12%|█▏ | 4097/34278 [4:38:50<29:37:42, 3.53s/it] {'loss': 0.1903, 'grad_norm': 0.9705876176402898, 'learning_rate': 9.791382913169913e-06, 'epoch': 0.12}
12%|█▏ | 4098/34278 [4:38:53<29:44:24, 3.55s/it] {'loss': 0.2078, 'grad_norm': 0.8509804031952122, 'learning_rate': 9.791247849975003e-06, 'epoch': 0.12}
12%|█▏ | 4099/34278 [4:38:56<28:13:06, 3.37s/it] {'loss': 0.1825, 'grad_norm': 1.141298404017863, 'learning_rate': 9.791112744004979e-06, 'epoch': 0.12}
12%|█▏ | 4100/34278 [4:39:00<27:51:29, 3.32s/it] {'loss': 0.1968, 'grad_norm': 0.8444499592388437, 'learning_rate': 9.79097759526104e-06, 'epoch': 0.12}
12%|█▏ | 4101/34278 [4:39:05<31:56:26, 3.81s/it] {'loss': 0.1741, 'grad_norm': 0.8467549660289233, 'learning_rate': 9.790842403744398e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4102/34278 [4:39:08<30:25:33, 3.63s/it] {'loss': 0.1772, 'grad_norm': 0.8479821220200611, 'learning_rate': 9.79070716945626e-06, 'epoch': 0.12}
12%|█▏ | 4103/34278 [4:39:11<29:14:52, 3.49s/it] {'loss': 0.1899, 'grad_norm': 0.8502502276293883, 'learning_rate': 9.79057189239783e-06, 'epoch': 0.12}
12%|█▏ | 4104/34278 [4:39:15<29:30:03, 3.52s/it] {'loss': 0.1626, 'grad_norm': 0.7865772133425387, 'learning_rate': 9.790436572570319e-06, 'epoch': 0.12}
12%|█▏ | 4105/34278 [4:39:18<28:08:29, 3.36s/it] {'loss': 0.1824, 'grad_norm': 0.962058000668101, 'learning_rate': 9.790301209974932e-06, 'epoch': 0.12}
12%|█▏ | 4106/34278 [4:39:20<26:57:56, 3.22s/it] {'loss': 0.1852, 'grad_norm': 0.7708385284061801, 'learning_rate': 9.790165804612882e-06, 'epoch': 0.12}
12%|█▏ | 4107/34278 [4:39:23<25:42:46, 3.07s/it] {'loss': 0.1766, 'grad_norm': 0.9232131298020444, 'learning_rate': 9.790030356485374e-06, 'epoch': 0.12}
12%|█▏ | 4108/34278 [4:39:27<28:24:36, 3.39s/it] {'loss': 0.1662, 'grad_norm': 0.8298980115415907, 'learning_rate': 9.789894865593619e-06, 'epoch': 0.12}
12%|█▏ | 4109/34278 [4:39:31<28:08:32, 3.36s/it] {'loss': 0.165, 'grad_norm': 0.7722703020740381, 'learning_rate': 9.789759331938826e-06, 'epoch': 0.12}
12%|█▏ | 4110/34278 [4:39:36<33:58:32, 4.05s/it] {'loss': 0.1822, 'grad_norm': 1.0514972642459577, 'learning_rate': 9.789623755522204e-06, 'epoch': 0.12}
12%|█▏ | 4111/34278 [4:39:41<36:29:49, 4.36s/it] {'loss': 0.1652, 'grad_norm': 0.6943035354131065, 'learning_rate': 9.789488136344966e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4112/34278 [4:39:45<33:41:33, 4.02s/it] {'loss': 0.1632, 'grad_norm': 0.8219632672017334, 'learning_rate': 9.78935247440832e-06, 'epoch': 0.12}
12%|█▏ | 4113/34278 [4:39:51<39:19:37, 4.69s/it] {'loss': 0.1691, 'grad_norm': 0.9181735980006912, 'learning_rate': 9.789216769713479e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4114/34278 [4:39:54<35:15:35, 4.21s/it] {'loss': 0.2045, 'grad_norm': 1.042590637169085, 'learning_rate': 9.789081022261654e-06, 'epoch': 0.12}
12%|█▏ | 4115/34278 [4:40:00<40:07:48, 4.79s/it] {'loss': 0.1585, 'grad_norm': 0.7952694115501756, 'learning_rate': 9.788945232054056e-06, 'epoch': 0.12}
12%|█▏ | 4116/34278 [4:40:04<37:06:23, 4.43s/it] {'loss': 0.1787, 'grad_norm': 0.6729379380871152, 'learning_rate': 9.788809399091899e-06, 'epoch': 0.12}
12%|█▏ | 4117/34278 [4:40:07<34:51:50, 4.16s/it] {'loss': 0.1917, 'grad_norm': 1.5782806905729891, 'learning_rate': 9.788673523376396e-06, 'epoch': 0.12}
12%|█▏ | 4118/34278 [4:40:13<38:50:03, 4.64s/it] {'loss': 0.1906, 'grad_norm': 0.9286855269409086, 'learning_rate': 9.788537604908756e-06, 'epoch': 0.12}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb20bd9bba0>
Failed to fetch sample 3087948. Exception: cannot identify image file <_io.BytesIO object at 0x7fb20bd9bba0>
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4119/34278 [4:40:17<36:17:39, 4.33s/it] {'loss': 0.1729, 'grad_norm': 1.1332457540301297, 'learning_rate': 9.788401643690197e-06, 'epoch': 0.12}
12%|█▏ | 4120/34278 [4:40:20<33:38:11, 4.02s/it] {'loss': 0.1785, 'grad_norm': 0.9568786266958319, 'learning_rate': 9.788265639721932e-06, 'epoch': 0.12}
12%|█▏ | 4121/34278 [4:40:23<30:43:57, 3.67s/it] {'loss': 0.19, 'grad_norm': 0.8098561999190378, 'learning_rate': 9.788129593005174e-06, 'epoch': 0.12}
12%|█▏ | 4122/34278 [4:40:29<36:15:18, 4.33s/it] {'loss': 0.1847, 'grad_norm': 1.0386085787577433, 'learning_rate': 9.787993503541137e-06, 'epoch': 0.12}
12%|█▏ | 4123/34278 [4:40:32<32:54:25, 3.93s/it] {'loss': 0.2145, 'grad_norm': 0.792221227936919, 'learning_rate': 9.787857371331039e-06, 'epoch': 0.12}
12%|█▏ | 4124/34278 [4:40:34<30:25:25, 3.63s/it] {'loss': 0.1812, 'grad_norm': 1.0271046503439851, 'learning_rate': 9.787721196376092e-06, 'epoch': 0.12}
12%|█▏ | 4125/34278 [4:40:37<28:20:49, 3.38s/it] {'loss': 0.192, 'grad_norm': 0.8288151132963502, 'learning_rate': 9.787584978677514e-06, 'epoch': 0.12}
12%|█▏ | 4126/34278 [4:40:43<34:27:27, 4.11s/it] {'loss': 0.1728, 'grad_norm': 0.8788552241854136, 'learning_rate': 9.787448718236519e-06, 'epoch': 0.12}
12%|█▏ | 4127/34278 [4:40:46<31:12:29, 3.73s/it] {'loss': 0.184, 'grad_norm': 0.9221041884990281, 'learning_rate': 9.787312415054325e-06, 'epoch': 0.12}
12%|█▏ | 4128/34278 [4:40:51<35:23:07, 4.23s/it] {'loss': 0.1595, 'grad_norm': 0.8464018185918841, 'learning_rate': 9.787176069132149e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4129/34278 [4:40:55<34:17:56, 4.10s/it] {'loss': 0.1811, 'grad_norm': 0.9061550624397733, 'learning_rate': 9.787039680471206e-06, 'epoch': 0.12}
12%|█▏ | 4130/34278 [4:40:58<31:33:23, 3.77s/it] {'loss': 0.1601, 'grad_norm': 0.9436511449768542, 'learning_rate': 9.786903249072717e-06, 'epoch': 0.12}
12%|█▏ | 4131/34278 [4:41:02<32:39:55, 3.90s/it] {'loss': 0.206, 'grad_norm': 0.8521825288934504, 'learning_rate': 9.786766774937898e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4132/34278 [4:41:07<33:42:00, 4.02s/it] {'loss': 0.166, 'grad_norm': 1.3319908623471917, 'learning_rate': 9.78663025806797e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4133/34278 [4:41:11<35:33:41, 4.25s/it] {'loss': 0.1735, 'grad_norm': 1.3676056687058058, 'learning_rate': 9.786493698464149e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4134/34278 [4:41:15<34:16:00, 4.09s/it] {'loss': 0.1817, 'grad_norm': 0.9383909096438793, 'learning_rate': 9.786357096127652e-06, 'epoch': 0.12}
12%|█▏ | 4135/34278 [4:41:18<31:00:45, 3.70s/it] {'loss': 0.1979, 'grad_norm': 1.2318892210635184, 'learning_rate': 9.786220451059704e-06, 'epoch': 0.12}
12%|█▏ | 4136/34278 [4:41:21<30:05:01, 3.59s/it] {'loss': 0.1712, 'grad_norm': 0.9014131174008452, 'learning_rate': 9.786083763261522e-06, 'epoch': 0.12}
12%|█▏ | 4137/34278 [4:41:24<28:15:20, 3.37s/it] {'loss': 0.1873, 'grad_norm': 0.8132711903801089, 'learning_rate': 9.785947032734326e-06, 'epoch': 0.12}
12%|█▏ | 4138/34278 [4:41:28<30:27:22, 3.64s/it] {'loss': 0.1724, 'grad_norm': 0.8642170649113723, 'learning_rate': 9.785810259479337e-06, 'epoch': 0.12}
12%|█▏ | 4139/34278 [4:41:34<36:27:38, 4.36s/it] {'loss': 0.1635, 'grad_norm': 0.8133536176102205, 'learning_rate': 9.785673443497779e-06, 'epoch': 0.12}
12%|█▏ | 4140/34278 [4:41:38<34:39:31, 4.14s/it] {'loss': 0.1992, 'grad_norm': 0.8421766063801384, 'learning_rate': 9.785536584790869e-06, 'epoch': 0.12}
12%|█▏ | 4141/34278 [4:41:44<39:37:40, 4.73s/it] {'loss': 0.1728, 'grad_norm': 0.8301407227670122, 'learning_rate': 9.78539968335983e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4142/34278 [4:41:49<39:58:31, 4.78s/it] {'loss': 0.172, 'grad_norm': 0.8329694079632481, 'learning_rate': 9.785262739205887e-06, 'epoch': 0.12}
12%|█▏ | 4143/34278 [4:41:53<37:39:21, 4.50s/it] {'loss': 0.1815, 'grad_norm': 0.8015332815545945, 'learning_rate': 9.78512575233026e-06, 'epoch': 0.12}
12%|█▏ | 4144/34278 [4:41:56<33:50:58, 4.04s/it] {'loss': 0.1662, 'grad_norm': 0.7640230894037681, 'learning_rate': 9.784988722734172e-06, 'epoch': 0.12}
12%|█▏ | 4145/34278 [4:41:59<31:05:05, 3.71s/it] {'loss': 0.1756, 'grad_norm': 0.7414765506748662, 'learning_rate': 9.784851650418847e-06, 'epoch': 0.12}
12%|█▏ | 4146/34278 [4:42:03<33:02:25, 3.95s/it] {'loss': 0.1594, 'grad_norm': 0.8478198725277386, 'learning_rate': 9.784714535385509e-06, 'epoch': 0.12}
12%|█▏ | 4147/34278 [4:42:07<32:31:09, 3.89s/it] {'loss': 0.1631, 'grad_norm': 1.2695178645519447, 'learning_rate': 9.784577377635382e-06, 'epoch': 0.12}
12%|█▏ | 4148/34278 [4:42:12<34:06:22, 4.08s/it] {'loss': 0.1609, 'grad_norm': 0.7650605253914328, 'learning_rate': 9.784440177169689e-06, 'epoch': 0.12}
12%|█▏ | 4149/34278 [4:42:15<32:06:53, 3.84s/it] {'loss': 0.2238, 'grad_norm': 1.0276395784864185, 'learning_rate': 9.784302933989657e-06, 'epoch': 0.12}
12%|█▏ | 4150/34278 [4:42:19<33:31:23, 4.01s/it] {'loss': 0.1882, 'grad_norm': 0.8494424714848874, 'learning_rate': 9.784165648096514e-06, 'epoch': 0.12}
12%|█▏ | 4151/34278 [4:42:23<32:11:06, 3.85s/it] {'loss': 0.1743, 'grad_norm': 0.8593437139387499, 'learning_rate': 9.784028319491478e-06, 'epoch': 0.12}
12%|█▏ | 4152/34278 [4:42:25<29:22:22, 3.51s/it] {'loss': 0.1742, 'grad_norm': 0.8446351444485749, 'learning_rate': 9.78389094817578e-06, 'epoch': 0.12}
12%|█▏ | 4153/34278 [4:42:29<28:34:19, 3.41s/it] {'loss': 0.1892, 'grad_norm': 0.906404467522756, 'learning_rate': 9.783753534150646e-06, 'epoch': 0.12}
12%|█▏ | 4154/34278 [4:42:31<27:03:15, 3.23s/it] {'loss': 0.1764, 'grad_norm': 0.7928934533824235, 'learning_rate': 9.783616077417301e-06, 'epoch': 0.12}
12%|█▏ | 4155/34278 [4:42:34<26:17:20, 3.14s/it] {'loss': 0.1746, 'grad_norm': 1.011787013239762, 'learning_rate': 9.783478577976976e-06, 'epoch': 0.12}
12%|█▏ | 4156/34278 [4:42:37<25:48:53, 3.09s/it] {'loss': 0.1744, 'grad_norm': 0.8092316016257747, 'learning_rate': 9.783341035830895e-06, 'epoch': 0.12}
12%|█▏ | 4157/34278 [4:42:40<25:59:51, 3.11s/it] {'loss': 0.1754, 'grad_norm': 0.7880658319412299, 'learning_rate': 9.783203450980287e-06, 'epoch': 0.12}
12%|█▏ | 4158/34278 [4:42:44<27:31:16, 3.29s/it] {'loss': 0.1914, 'grad_norm': 0.9193699472202896, 'learning_rate': 9.78306582342638e-06, 'epoch': 0.12}
12%|█▏ | 4159/34278 [4:42:50<33:26:57, 4.00s/it] {'loss': 0.1599, 'grad_norm': 0.782109094769802, 'learning_rate': 9.782928153170403e-06, 'epoch': 0.12}
12%|█▏ | 4160/34278 [4:42:53<30:42:29, 3.67s/it] {'loss': 0.197, 'grad_norm': 1.131680214875881, 'learning_rate': 9.782790440213587e-06, 'epoch': 0.12}
12%|█▏ | 4161/34278 [4:42:59<36:35:18, 4.37s/it] {'loss': 0.1773, 'grad_norm': 0.8343074545424928, 'learning_rate': 9.782652684557158e-06, 'epoch': 0.12}
12%|█▏ | 4162/34278 [4:43:03<35:32:06, 4.25s/it] {'loss': 0.174, 'grad_norm': 0.9572818164206963, 'learning_rate': 9.78251488620235e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4163/34278 [4:43:06<32:56:57, 3.94s/it] {'loss': 0.168, 'grad_norm': 0.7420543684606417, 'learning_rate': 9.782377045150387e-06, 'epoch': 0.12}
12%|█▏ | 4164/34278 [4:43:09<31:07:14, 3.72s/it] {'loss': 0.1609, 'grad_norm': 0.8974813647430494, 'learning_rate': 9.782239161402505e-06, 'epoch': 0.12}
12%|█▏ | 4165/34278 [4:43:14<34:53:09, 4.17s/it] {'loss': 0.1808, 'grad_norm': 0.8375675889457979, 'learning_rate': 9.782101234959935e-06, 'epoch': 0.12}
12%|█▏ | 4166/34278 [4:43:17<32:02:54, 3.83s/it] {'loss': 0.1869, 'grad_norm': 1.038493570210264, 'learning_rate': 9.781963265823905e-06, 'epoch': 0.12}
12%|█▏ | 4167/34278 [4:43:21<32:02:06, 3.83s/it] {'loss': 0.178, 'grad_norm': 0.9711615425737522, 'learning_rate': 9.78182525399565e-06, 'epoch': 0.12}
12%|█▏ | 4168/34278 [4:43:27<36:09:15, 4.32s/it] {'loss': 0.1514, 'grad_norm': 0.7962882984557913, 'learning_rate': 9.781687199476399e-06, 'epoch': 0.12}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 12%|█▏ | 4169/34278 [4:43:30<33:05:01, 3.96s/it] {'loss': 0.1632, 'grad_norm': 0.7542737816784294, 'learning_rate': 9.781549102267387e-06, 'epoch': 0.12} 12%|█▏ | 4169/34278 [4:43:30<33:05:01, 3.96s/it] 12%|█▏ | 4170/34278 [4:43:35<37:01:56, 4.43s/it] {'loss': 0.1702, 'grad_norm': 0.9970523601200407, 'learning_rate': 9.781410962369846e-06, 'epoch': 0.12} 12%|█▏ | 4170/34278 [4:43:35<37:01:56, 4.43s/it] 12%|█▏ | 4171/34278 [4:43:39<35:48:30, 4.28s/it] {'loss': 0.1836, 'grad_norm': 1.074944699922814, 'learning_rate': 9.78127277978501e-06, 'epoch': 0.12} 12%|█▏ | 4171/34278 [4:43:39<35:48:30, 4.28s/it] 12%|█▏ | 4172/34278 [4:43:46<41:26:59, 4.96s/it] {'loss': 0.1778, 'grad_norm': 1.052545248224118, 'learning_rate': 9.781134554514108e-06, 'epoch': 0.12} 12%|█▏ | 4172/34278 [4:43:46<41:26:59, 4.96s/it] 12%|█▏ | 4173/34278 [4:43:49<37:30:19, 4.48s/it] {'loss': 0.1685, 'grad_norm': 0.8528179908215335, 'learning_rate': 9.780996286558382e-06, 'epoch': 0.12} 12%|█▏ | 4173/34278 [4:43:49<37:30:19, 4.48s/it] 12%|█▏ | 4174/34278 [4:43:54<37:43:52, 4.51s/it] {'loss': 0.1955, 'grad_norm': 1.047169045200767, 'learning_rate': 9.780857975919063e-06, 'epoch': 0.12} 12%|█▏ | 4174/34278 [4:43:54<37:43:52, 4.51s/it] 12%|█▏ | 4175/34278 [4:43:59<40:32:30, 4.85s/it] {'loss': 0.1881, 'grad_norm': 0.9103774911103129, 'learning_rate': 9.780719622597383e-06, 'epoch': 0.12} 12%|█▏ | 4175/34278 [4:43:59<40:32:30, 4.85s/it] 12%|█▏ | 4176/34278 [4:44:03<36:52:58, 4.41s/it] {'loss': 0.1521, 'grad_norm': 0.6978801978572893, 'learning_rate': 9.78058122659458e-06, 'epoch': 0.12} 12%|█▏ | 4176/34278 [4:44:03<36:52:58, 4.41s/it] 12%|█▏ | 4177/34278 [4:44:06<33:37:05, 4.02s/it] {'loss': 0.1668, 'grad_norm': 0.8262896383805954, 'learning_rate': 9.780442787911891e-06, 'epoch': 0.12} 12%|█▏ | 4177/34278 [4:44:06<33:37:05, 4.02s/it] 12%|█▏ | 4178/34278 [4:44:09<32:14:20, 3.86s/it] {'loss': 0.1743, 'grad_norm': 0.8353675530022704, 'learning_rate': 9.780304306550547e-06, 
'epoch': 0.12} 12%|█▏ | 4178/34278 [4:44:09<32:14:20, 3.86s/it] 12%|█▏ | 4179/34278 [4:44:13<30:40:06, 3.67s/it] {'loss': 0.2063, 'grad_norm': 1.269227532562418, 'learning_rate': 9.78016578251179e-06, 'epoch': 0.12} 12%|█▏ | 4179/34278 [4:44:13<30:40:06, 3.67s/it] 12%|█▏ | 4180/34278 [4:44:15<28:32:30, 3.41s/it] {'loss': 0.1675, 'grad_norm': 0.800200258529141, 'learning_rate': 9.780027215796853e-06, 'epoch': 0.12} 12%|█▏ | 4180/34278 [4:44:15<28:32:30, 3.41s/it] 12%|█▏ | 4181/34278 [4:44:18<27:34:13, 3.30s/it] {'loss': 0.1682, 'grad_norm': 0.862996763044887, 'learning_rate': 9.779888606406974e-06, 'epoch': 0.12} 12%|█▏ | 4181/34278 [4:44:18<27:34:13, 3.30s/it] 12%|█▏ | 4182/34278 [4:44:22<27:51:16, 3.33s/it] {'loss': 0.1711, 'grad_norm': 1.0524814671615295, 'learning_rate': 9.77974995434339e-06, 'epoch': 0.12} 12%|█▏ | 4182/34278 [4:44:22<27:51:16, 3.33s/it] 12%|█▏ | 4183/34278 [4:44:26<29:13:42, 3.50s/it] {'loss': 0.1766, 'grad_norm': 0.7834384166130912, 'learning_rate': 9.77961125960734e-06, 'epoch': 0.12} 12%|█▏ | 4183/34278 [4:44:26<29:13:42, 3.50s/it] 12%|█▏ | 4184/34278 [4:44:29<28:44:09, 3.44s/it] {'loss': 0.1928, 'grad_norm': 1.0000136967483966, 'learning_rate': 9.779472522200063e-06, 'epoch': 0.12} 12%|█▏ | 4184/34278 [4:44:29<28:44:09, 3.44s/it] 12%|█▏ | 4185/34278 [4:44:33<31:12:32, 3.73s/it] {'loss': 0.207, 'grad_norm': 0.8285754849884852, 'learning_rate': 9.779333742122792e-06, 'epoch': 0.12} 12%|█▏ | 4185/34278 [4:44:33<31:12:32, 3.73s/it] 12%|█▏ | 4186/34278 [4:44:38<32:25:22, 3.88s/it] {'loss': 0.1918, 'grad_norm': 0.9061836895942934, 'learning_rate': 9.779194919376774e-06, 'epoch': 0.12} 12%|█▏ | 4186/34278 [4:44:38<32:25:22, 3.88s/it] 12%|█▏ | 4187/34278 [4:44:41<30:02:27, 3.59s/it] {'loss': 0.1832, 'grad_norm': 0.9974874319325114, 'learning_rate': 9.779056053963243e-06, 'epoch': 0.12} 12%|█▏ | 4187/34278 [4:44:41<30:02:27, 3.59s/it] 12%|█▏ | 4188/34278 [4:44:44<28:25:46, 3.40s/it] {'loss': 0.1974, 'grad_norm': 0.79496352702967, 'learning_rate': 
9.778917145883441e-06, 'epoch': 0.12} 12%|█▏ | 4188/34278 [4:44:44<28:25:46, 3.40s/it] 12%|█▏ | 4189/34278 [4:44:49<34:14:22, 4.10s/it] {'loss': 0.1795, 'grad_norm': 1.1376640314406414, 'learning_rate': 9.778778195138609e-06, 'epoch': 0.12} 12%|█▏ | 4189/34278 [4:44:49<34:14:22, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 12%|█▏ | 4190/34278 [4:44:52<31:43:54, 3.80s/it] {'loss': 0.1883, 'grad_norm': 0.8717399481860646, 'learning_rate': 9.778639201729985e-06, 'epoch': 0.12} 12%|█▏ | 4190/34278 [4:44:52<31:43:54, 3.80s/it] 12%|█▏ | 4191/34278 [4:44:59<38:02:47, 4.55s/it] {'loss': 0.1858, 'grad_norm': 0.8262123753517054, 'learning_rate': 9.77850016565881e-06, 'epoch': 0.12} 12%|█▏ | 4191/34278 [4:44:59<38:02:47, 4.55s/it] 12%|█▏ | 4192/34278 [4:45:02<34:40:51, 4.15s/it] {'loss': 0.2038, 'grad_norm': 1.0161112536908263, 'learning_rate': 9.778361086926327e-06, 'epoch': 0.12} 12%|█▏ | 4192/34278 [4:45:02<34:40:51, 4.15s/it] 12%|█▏ | 4193/34278 [4:45:07<36:28:21, 4.36s/it] {'loss': 0.1889, 'grad_norm': 0.9989675131884603, 'learning_rate': 9.778221965533776e-06, 'epoch': 0.12} 12%|█▏ | 4193/34278 [4:45:07<36:28:21, 4.36s/it] 12%|█▏ | 4194/34278 [4:45:10<32:38:48, 3.91s/it] {'loss': 0.1553, 'grad_norm': 0.886147768004657, 'learning_rate': 9.778082801482402e-06, 'epoch': 0.12} 12%|█▏ | 4194/34278 [4:45:10<32:38:48, 3.91s/it] 12%|█▏ | 4195/34278 [4:45:13<30:26:10, 3.64s/it] {'loss': 0.1862, 'grad_norm': 0.9443788967680901, 'learning_rate': 9.777943594773443e-06, 'epoch': 0.12} 12%|█▏ | 4195/34278 [4:45:13<30:26:10, 3.64s/it] 12%|█▏ | 4196/34278 [4:45:18<35:39:42, 4.27s/it] {'loss': 0.1859, 'grad_norm': 0.7090528132935483, 'learning_rate': 9.777804345408146e-06, 'epoch': 0.12} 12%|█▏ | 4196/34278 [4:45:18<35:39:42, 
4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 12%|█▏ | 4197/34278 [4:45:24<40:14:36, 4.82s/it] {'loss': 0.1881, 'grad_norm': 0.9017168604179734, 'learning_rate': 9.77766505338775e-06, 'epoch': 0.12} 12%|█▏ | 4197/34278 [4:45:24<40:14:36, 4.82s/it] 12%|█▏ | 4198/34278 [4:45:28<36:12:30, 4.33s/it] {'loss': 0.2047, 'grad_norm': 0.9176129179618445, 'learning_rate': 9.777525718713503e-06, 'epoch': 0.12} 12%|█▏ | 4198/34278 [4:45:28<36:12:30, 4.33s/it] 12%|█▏ | 4199/34278 [4:45:31<33:08:20, 3.97s/it] {'loss': 0.1971, 'grad_norm': 0.8592543265106798, 'learning_rate': 9.777386341386647e-06, 'epoch': 0.12} 12%|█▏ | 4199/34278 [4:45:31<33:08:20, 3.97s/it] 12%|█▏ | 4200/34278 [4:45:34<30:37:36, 3.67s/it] {'loss': 0.1667, 'grad_norm': 0.9434533936174614, 'learning_rate': 9.777246921408426e-06, 'epoch': 0.12} 12%|█▏ | 4200/34278 [4:45:34<30:37:36, 3.67s/it] 12%|█▏ | 4201/34278 [4:45:37<29:32:34, 3.54s/it] {'loss': 0.1919, 'grad_norm': 0.927354638086785, 'learning_rate': 9.777107458780084e-06, 'epoch': 0.12} 12%|█▏ | 4201/34278 [4:45:37<29:32:34, 3.54s/it] 12%|█▏ | 4202/34278 [4:45:40<27:52:52, 3.34s/it] {'loss': 0.2093, 'grad_norm': 0.9226859426751418, 'learning_rate': 9.776967953502869e-06, 'epoch': 0.12} 12%|█▏ | 4202/34278 [4:45:40<27:52:52, 3.34s/it] 12%|█▏ | 4203/34278 [4:45:44<30:34:07, 3.66s/it] {'loss': 0.1823, 'grad_norm': 0.9161039871550587, 'learning_rate': 9.776828405578023e-06, 'epoch': 0.12} 12%|█▏ | 4203/34278 [4:45:44<30:34:07, 3.66s/it] 12%|█▏ | 4204/34278 [4:45:48<29:49:27, 3.57s/it] {'loss': 0.1651, 'grad_norm': 0.9020688434530008, 'learning_rate': 9.776688815006792e-06, 'epoch': 0.12} 12%|█▏ | 4204/34278 [4:45:48<29:49:27, 3.57s/it] 12%|█▏ | 4205/34278 [4:45:51<28:07:52, 3.37s/it] {'loss': 0.1836, 'grad_norm': 1.037436081549395, 'learning_rate': 9.776549181790424e-06, 'epoch': 
0.12} 12%|█▏ | 4205/34278 [4:45:51<28:07:52, 3.37s/it]
12%|█▏ | 4206/34278 [4:45:54<27:19:55, 3.27s/it] {'loss': 0.1694, 'grad_norm': 0.8719546752081956, 'learning_rate': 9.776409505930167e-06, 'epoch': 0.12}
12%|█▏ | 4206/34278 [4:45:54<27:19:55, 3.27s/it]
12%|█▏ | 4207/34278 [4:45:57<28:27:18, 3.41s/it] {'loss': 0.1796, 'grad_norm': 0.8810485573848786, 'learning_rate': 9.776269787427266e-06, 'epoch': 0.12}
12%|█▏ | 4207/34278 [4:45:57<28:27:18, 3.41s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
12%|█▏ | 4208/34278 [4:46:01<28:14:34, 3.38s/it] {'loss': 0.1915, 'grad_norm': 0.9978537568358294, 'learning_rate': 9.776130026282968e-06, 'epoch': 0.12}
12%|█▏ | 4208/34278 [4:46:01<28:14:34, 3.38s/it]
12%|█▏ | 4209/34278 [4:46:03<26:41:58, 3.20s/it] {'loss': 0.1708, 'grad_norm': 0.7555369809030872, 'learning_rate': 9.77599022249852e-06, 'epoch': 0.12}
12%|█▏ | 4209/34278 [4:46:03<26:41:58, 3.20s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f7904036f70>
Failed to fetch sample 3268269. Exception: cannot identify image file <_io.BytesIO object at 0x7f7904036f70>
12%|█▏ | 4210/34278 [4:46:07<26:40:45, 3.19s/it] {'loss': 0.1572, 'grad_norm': 0.9692423805565872, 'learning_rate': 9.775850376075174e-06, 'epoch': 0.12}
12%|█▏ | 4210/34278 [4:46:07<26:40:45, 3.19s/it]
Token indices sequence length is longer than the specified maximum sequence length for this model (11077 > 8192). Running this sequence through the model will result in indexing errors
12%|█▏ | 4211/34278 [4:46:10<28:08:18, 3.37s/it] {'loss': 0.2008, 'grad_norm': 1.0618323249689408, 'learning_rate': 9.775710487014172e-06, 'epoch': 0.12}
12%|█▏ | 4211/34278 [4:46:10<28:08:18, 3.37s/it]
12%|█▏ | 4212/34278 [4:46:14<29:13:32, 3.50s/it] {'loss': 0.1839, 'grad_norm': 1.040945500262105, 'learning_rate': 9.77557055531677e-06, 'epoch': 0.12}
12%|█▏ | 4212/34278 [4:46:14<29:13:32, 3.50s/it]
12%|█▏ | 4213/34278 [4:46:18<29:43:00, 3.56s/it] {'loss': 0.2114, 'grad_norm': 1.037133168332038, 'learning_rate': 9.775430580984213e-06, 'epoch': 0.12}
12%|█▏ | 4213/34278 [4:46:18<29:43:00, 3.56s/it]
12%|█▏ | 4214/34278 [4:46:24<35:11:32, 4.21s/it] {'loss': 0.1874, 'grad_norm': 0.8061472566653828, 'learning_rate': 9.775290564017752e-06, 'epoch': 0.12}
12%|█▏ | 4214/34278 [4:46:24<35:11:32, 4.21s/it]
Token indices sequence length is longer than the specified maximum sequence length for this model (10256 > 8192).
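[editor's note] The `PIL.UnidentifiedImageError` traceback above does not kill the run: the dataset logs "Failed to fetch sample 3268269" and continues, which implies a catch-and-retry path around `_get_item`. A minimal sketch of that skip-corrupt-sample pattern, with hypothetical names (`RetryingDataset`, `max_retries`) that are not from the actual `aguvis/dataset.py`:

```python
class RetryingDataset:
    """Skip corrupt samples: if decoding index i raises, log the failure
    and deterministically fall back to a neighboring index instead of
    crashing the whole training job."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples
        self.max_retries = max_retries

    def _get_item(self, i):
        value = self.samples[i]
        if value is None:  # stand-in for PIL.UnidentifiedImageError on bad image bytes
            raise ValueError(f"cannot identify image file for sample {i}")
        return value

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)  # try another sample next
        raise RuntimeError("too many corrupt samples in a row")

ds = RetryingDataset(["img_a", None, "img_c"])
print(ds[1])  # sample 1 is corrupt, so the fallback returns "img_c"
```

One bad image per occurrence is harmless with this pattern, but a steadily growing count of these messages usually indicates corrupted shards in the object store rather than isolated bad files.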
Running this sequence through the model will result in indexing errors 12%|█▏ | 4215/34278 [4:46:27<33:50:44, 4.05s/it] {'loss': 0.1713, 'grad_norm': 0.8963591424698387, 'learning_rate': 9.775150504418638e-06, 'epoch': 0.12} 12%|█▏ | 4215/34278 [4:46:27<33:50:44, 4.05s/it] 12%|█▏ | 4216/34278 [4:46:30<31:41:51, 3.80s/it] {'loss': 0.189, 'grad_norm': 0.8630866206082455, 'learning_rate': 9.775010402188119e-06, 'epoch': 0.12} 12%|█▏ | 4216/34278 [4:46:30<31:41:51, 3.80s/it] 12%|█▏ | 4217/34278 [4:46:34<30:18:26, 3.63s/it] {'loss': 0.2046, 'grad_norm': 0.9263597454549111, 'learning_rate': 9.774870257327447e-06, 'epoch': 0.12} 12%|█▏ | 4217/34278 [4:46:34<30:18:26, 3.63s/it] 12%|█▏ | 4218/34278 [4:46:40<36:09:05, 4.33s/it] {'loss': 0.1956, 'grad_norm': 1.071914955078211, 'learning_rate': 9.774730069837872e-06, 'epoch': 0.12} 12%|█▏ | 4218/34278 [4:46:40<36:09:05, 4.33s/it] 12%|█▏ | 4219/34278 [4:46:44<35:29:23, 4.25s/it] {'loss': 0.1758, 'grad_norm': 0.8329010322263537, 'learning_rate': 9.774589839720649e-06, 'epoch': 0.12} 12%|█▏ | 4219/34278 [4:46:44<35:29:23, 4.25s/it] 12%|█▏ | 4220/34278 [4:46:47<33:13:52, 3.98s/it] {'loss': 0.1977, 'grad_norm': 0.7920640763112636, 'learning_rate': 9.774449566977027e-06, 'epoch': 0.12} 12%|█▏ | 4220/34278 [4:46:47<33:13:52, 3.98s/it] 12%|█▏ | 4221/34278 [4:46:53<39:03:16, 4.68s/it] {'loss': 0.2016, 'grad_norm': 1.008236458276534, 'learning_rate': 9.774309251608259e-06, 'epoch': 0.12} 12%|█▏ | 4221/34278 [4:46:53<39:03:16, 4.68s/it] 12%|█▏ | 4222/34278 [4:46:57<35:51:36, 4.30s/it] {'loss': 0.1833, 'grad_norm': 0.9347482520652315, 'learning_rate': 9.774168893615597e-06, 'epoch': 0.12} 12%|█▏ | 4222/34278 [4:46:57<35:51:36, 4.30s/it] 12%|█▏ | 4223/34278 [4:47:00<33:20:10, 3.99s/it] {'loss': 0.1681, 'grad_norm': 0.8640234539806687, 'learning_rate': 9.774028493000295e-06, 'epoch': 0.12} 12%|█▏ | 4223/34278 [4:47:00<33:20:10, 3.99s/it] 12%|█▏ | 4224/34278 [4:47:03<30:50:00, 3.69s/it] {'loss': 0.2026, 'grad_norm': 1.1631087202336545, 
'learning_rate': 9.773888049763606e-06, 'epoch': 0.12} 12%|█▏ | 4224/34278 [4:47:03<30:50:00, 3.69s/it] 12%|█▏ | 4225/34278 [4:47:07<32:00:08, 3.83s/it] {'loss': 0.1759, 'grad_norm': 1.0334941698045452, 'learning_rate': 9.773747563906785e-06, 'epoch': 0.12} 12%|█▏ | 4225/34278 [4:47:07<32:00:08, 3.83s/it] 12%|█▏ | 4226/34278 [4:47:11<30:36:45, 3.67s/it] {'loss': 0.1854, 'grad_norm': 0.9064466530876436, 'learning_rate': 9.773607035431085e-06, 'epoch': 0.12} 12%|█▏ | 4226/34278 [4:47:11<30:36:45, 3.67s/it] 12%|█▏ | 4227/34278 [4:47:16<33:59:36, 4.07s/it] {'loss': 0.1936, 'grad_norm': 1.819916819047121, 'learning_rate': 9.77346646433776e-06, 'epoch': 0.12} 12%|█▏ | 4227/34278 [4:47:16<33:59:36, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 12%|█▏ | 4228/34278 [4:47:19<31:15:24, 3.74s/it] {'loss': 0.1885, 'grad_norm': 0.9761879810370655, 'learning_rate': 9.773325850628069e-06, 'epoch': 0.12} 12%|█▏ | 4228/34278 [4:47:19<31:15:24, 3.74s/it] 12%|█▏ | 4229/34278 [4:47:23<33:38:08, 4.03s/it] {'loss': 0.1658, 'grad_norm': 0.7388657787543375, 'learning_rate': 9.77318519430326e-06, 'epoch': 0.12} 12%|█▏ | 4229/34278 [4:47:23<33:38:08, 4.03s/it] 12%|█▏ | 4230/34278 [4:47:27<31:48:46, 3.81s/it] {'loss': 0.1685, 'grad_norm': 0.991855344225325, 'learning_rate': 9.773044495364596e-06, 'epoch': 0.12} 12%|█▏ | 4230/34278 [4:47:27<31:48:46, 3.81s/it] 12%|█▏ | 4231/34278 [4:47:31<33:45:51, 4.05s/it] {'loss': 0.2044, 'grad_norm': 0.8834022854048016, 'learning_rate': 9.77290375381333e-06, 'epoch': 0.12} 12%|█▏ | 4231/34278 [4:47:31<33:45:51, 4.05s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11189 > 8192). 
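[editor's note] The repeated "Token indices sequence length is longer than the specified maximum sequence length" warnings (11077, 10256, 11189 vs. 8192) come from the tokenizer; if such sequences actually reach the model they can cause the indexing errors the warning mentions. A minimal sketch of a guard, assuming a hypothetical helper name and that 8192 is the effective model limit:

```python
MAX_LEN = 8192  # model's maximum sequence length, per the warnings above

def clamp_to_max_len(token_ids, max_len=MAX_LEN):
    """Truncate over-long tokenized samples so they cannot trigger
    position-embedding index errors; also report whether we cut."""
    if len(token_ids) <= max_len:
        return token_ids, False
    return token_ids[:max_len], True

ids, truncated = clamp_to_max_len(list(range(11189)))
print(len(ids), truncated)  # 8192 True
```

For multimodal samples, truncating blindly can cut image placeholder tokens mid-sequence, so filtering out over-long samples at preprocessing time is often the safer choice.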
Running this sequence through the model will result in indexing errors 12%|█▏ | 4232/34278 [4:47:35<32:43:24, 3.92s/it] {'loss': 0.2013, 'grad_norm': 0.8811956457462701, 'learning_rate': 9.772762969650717e-06, 'epoch': 0.12} 12%|█▏ | 4232/34278 [4:47:35<32:43:24, 3.92s/it] 12%|█▏ | 4233/34278 [4:47:38<31:12:36, 3.74s/it] {'loss': 0.1691, 'grad_norm': 0.9784935254653434, 'learning_rate': 9.772622142878016e-06, 'epoch': 0.12} 12%|█▏ | 4233/34278 [4:47:38<31:12:36, 3.74s/it] 12%|█▏ | 4234/34278 [4:47:41<30:18:05, 3.63s/it] {'loss': 0.1745, 'grad_norm': 0.8224157593540263, 'learning_rate': 9.772481273496486e-06, 'epoch': 0.12} 12%|█▏ | 4234/34278 [4:47:41<30:18:05, 3.63s/it] 12%|█▏ | 4235/34278 [4:47:45<29:39:06, 3.55s/it] {'loss': 0.1704, 'grad_norm': 0.9826511369228231, 'learning_rate': 9.77234036150738e-06, 'epoch': 0.12} 12%|█▏ | 4235/34278 [4:47:45<29:39:06, 3.55s/it] 12%|█▏ | 4236/34278 [4:47:51<35:52:00, 4.30s/it] {'loss': 0.1594, 'grad_norm': 0.8707132641162341, 'learning_rate': 9.77219940691196e-06, 'epoch': 0.12} 12%|█▏ | 4236/34278 [4:47:51<35:52:00, 4.30s/it] 12%|█▏ | 4237/34278 [4:47:54<32:59:02, 3.95s/it] {'loss': 0.2133, 'grad_norm': 0.8486161938388265, 'learning_rate': 9.77205840971148e-06, 'epoch': 0.12} 12%|█▏ | 4237/34278 [4:47:54<32:59:02, 3.95s/it] 12%|█▏ | 4238/34278 [4:47:58<32:51:49, 3.94s/it] {'loss': 0.18, 'grad_norm': 0.9949223452883215, 'learning_rate': 9.771917369907206e-06, 'epoch': 0.12} 12%|█▏ | 4238/34278 [4:47:58<32:51:49, 3.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 12%|█▏ | 4239/34278 [4:48:01<29:59:09, 3.59s/it] {'loss': 0.2059, 'grad_norm': 1.0058494158528097, 'learning_rate': 9.77177628750039e-06, 'epoch': 0.12} 12%|█▏ | 4239/34278 [4:48:01<29:59:09, 3.59s/it] 12%|█▏ | 4240/34278 [4:48:05<32:29:24, 3.89s/it] {'loss': 0.1639, 'grad_norm': 0.9213454019566658, 'learning_rate': 9.771635162492296e-06, 'epoch': 0.12} 12%|█▏ | 4240/34278 [4:48:05<32:29:24, 3.89s/it] 12%|█▏ | 4241/34278 [4:48:08<30:47:53, 3.69s/it] {'loss': 0.1898, 'grad_norm': 1.0291911621537848, 'learning_rate': 9.771493994884182e-06, 'epoch': 0.12} 12%|█▏ | 4241/34278 [4:48:09<30:47:53, 3.69s/it] 12%|█▏ | 4242/34278 [4:48:12<29:28:57, 3.53s/it] {'loss': 0.1783, 'grad_norm': 0.8649075291704439, 'learning_rate': 9.771352784677309e-06, 'epoch': 0.12} 12%|█▏ | 4242/34278 [4:48:12<29:28:57, 3.53s/it] 12%|█▏ | 4243/34278 [4:48:15<29:16:07, 3.51s/it] {'loss': 0.1514, 'grad_norm': 0.8785762286609688, 'learning_rate': 9.771211531872935e-06, 'epoch': 0.12} 12%|█▏ | 4243/34278 [4:48:15<29:16:07, 3.51s/it] 12%|█▏ | 4244/34278 [4:48:19<31:09:26, 3.73s/it] {'loss': 0.1826, 'grad_norm': 0.823081519999211, 'learning_rate': 9.771070236472324e-06, 'epoch': 0.12} 12%|█▏ | 4244/34278 [4:48:19<31:09:26, 3.73s/it] 12%|█▏ | 4245/34278 [4:48:23<31:27:45, 3.77s/it] {'loss': 0.1731, 'grad_norm': 0.7215599451247365, 'learning_rate': 9.77092889847674e-06, 'epoch': 0.12} 12%|█▏ | 4245/34278 [4:48:23<31:27:45, 3.77s/it] 12%|█▏ | 4246/34278 [4:48:29<36:11:37, 4.34s/it] {'loss': 0.1993, 'grad_norm': 0.9713610710868089, 'learning_rate': 9.770787517887439e-06, 'epoch': 0.12} 12%|█▏ | 4246/34278 [4:48:29<36:11:37, 4.34s/it] 12%|█▏ | 4247/34278 [4:48:32<33:25:01, 4.01s/it] {'loss': 0.1883, 'grad_norm': 0.8715957915372682, 'learning_rate': 9.770646094705687e-06, 'epoch': 0.12} 12%|█▏ | 4247/34278 [4:48:32<33:25:01, 4.01s/it] 12%|█▏ | 4248/34278 [4:48:36<33:13:19, 3.98s/it] {'loss': 0.1868, 'grad_norm': 0.8420458576843002, 'learning_rate': 
9.770504628932744e-06, 'epoch': 0.12} 12%|█▏ | 4248/34278 [4:48:36<33:13:19, 3.98s/it] 12%|█▏ | 4249/34278 [4:48:42<38:54:18, 4.66s/it] {'loss': 0.191, 'grad_norm': 0.9451975101715953, 'learning_rate': 9.770363120569876e-06, 'epoch': 0.12} 12%|█▏ | 4249/34278 [4:48:42<38:54:18, 4.66s/it] 12%|█▏ | 4250/34278 [4:48:45<34:55:53, 4.19s/it] {'loss': 0.1633, 'grad_norm': 0.9607044740168983, 'learning_rate': 9.770221569618343e-06, 'epoch': 0.12} 12%|█▏ | 4250/34278 [4:48:45<34:55:53, 4.19s/it] 12%|█▏ | 4251/34278 [4:48:51<37:23:34, 4.48s/it] {'loss': 0.1686, 'grad_norm': 0.7286356854872065, 'learning_rate': 9.770079976079414e-06, 'epoch': 0.12} 12%|█▏ | 4251/34278 [4:48:51<37:23:34, 4.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 12%|█▏ | 4252/34278 [4:48:54<34:13:38, 4.10s/it] {'loss': 0.1866, 'grad_norm': 0.8874022088049638, 'learning_rate': 9.769938339954347e-06, 'epoch': 0.12} 12%|█▏ | 4252/34278 [4:48:54<34:13:38, 4.10s/it] 12%|█▏ | 4253/34278 [4:48:57<33:15:16, 3.99s/it] {'loss': 0.1889, 'grad_norm': 1.0625998134826966, 'learning_rate': 9.76979666124441e-06, 'epoch': 0.12} 12%|█▏ | 4253/34278 [4:48:57<33:15:16, 3.99s/it] 12%|█▏ | 4254/34278 [4:49:01<31:04:46, 3.73s/it] {'loss': 0.1633, 'grad_norm': 0.8184826542103033, 'learning_rate': 9.769654939950866e-06, 'epoch': 0.12} 12%|█▏ | 4254/34278 [4:49:01<31:04:46, 3.73s/it] 12%|█▏ | 4255/34278 [4:49:05<32:28:46, 3.89s/it] {'loss': 0.1826, 'grad_norm': 1.2131584622111504, 'learning_rate': 9.769513176074982e-06, 'epoch': 0.12} 12%|█▏ | 4255/34278 [4:49:05<32:28:46, 3.89s/it] 12%|█▏ | 4256/34278 [4:49:09<33:34:37, 4.03s/it] {'loss': 0.1853, 'grad_norm': 0.9988584186668532, 'learning_rate': 9.769371369618023e-06, 'epoch': 0.12} 12%|█▏ | 4256/34278 [4:49:09<33:34:37, 4.03s/it] 12%|█▏ | 4257/34278 [4:49:13<32:58:07, 3.95s/it] {'loss': 0.1509, 'grad_norm': 1.180325587882098, 'learning_rate': 9.769229520581256e-06, 'epoch': 0.12} 12%|█▏ | 4257/34278 [4:49:13<32:58:07, 3.95s/it] 12%|█▏ | 4258/34278 [4:49:17<32:51:33, 3.94s/it] {'loss': 0.1889, 'grad_norm': 0.7234165745053014, 'learning_rate': 9.769087628965945e-06, 'epoch': 0.12} 12%|█▏ | 4258/34278 [4:49:17<32:51:33, 3.94s/it] 12%|█▏ | 4259/34278 [4:49:20<31:05:01, 3.73s/it] {'loss': 0.173, 'grad_norm': 0.8663462398173085, 'learning_rate': 9.768945694773358e-06, 'epoch': 0.12} 12%|█▏ | 4259/34278 [4:49:20<31:05:01, 3.73s/it] 12%|█▏ | 4260/34278 [4:49:24<32:09:19, 3.86s/it] {'loss': 0.182, 'grad_norm': 0.7336133954693259, 'learning_rate': 9.768803718004764e-06, 'epoch': 0.12} 12%|█▏ | 4260/34278 [4:49:24<32:09:19, 3.86s/it] 12%|█▏ | 4261/34278 [4:49:27<29:42:14, 3.56s/it] {'loss': 0.1595, 'grad_norm': 0.7633816702741599, 'learning_rate': 9.768661698661427e-06, 
'epoch': 0.12} 12%|█▏ | 4261/34278 [4:49:27<29:42:14, 3.56s/it] 12%|█▏ | 4262/34278 [4:49:32<32:27:52, 3.89s/it] {'loss': 0.1715, 'grad_norm': 0.89792804587649, 'learning_rate': 9.768519636744618e-06, 'epoch': 0.12} 12%|█▏ | 4262/34278 [4:49:32<32:27:52, 3.89s/it] 12%|█▏ | 4263/34278 [4:49:35<31:10:16, 3.74s/it] {'loss': 0.2037, 'grad_norm': 0.8208558511054609, 'learning_rate': 9.768377532255602e-06, 'epoch': 0.12} 12%|█▏ | 4263/34278 [4:49:35<31:10:16, 3.74s/it] 12%|█▏ | 4264/34278 [4:49:38<29:37:04, 3.55s/it] {'loss': 0.1795, 'grad_norm': 1.213932174256665, 'learning_rate': 9.768235385195653e-06, 'epoch': 0.12} 12%|█▏ | 4264/34278 [4:49:38<29:37:04, 3.55s/it] 12%|█▏ | 4265/34278 [4:49:42<29:36:58, 3.55s/it] {'loss': 0.1611, 'grad_norm': 0.7904329914917131, 'learning_rate': 9.768093195566033e-06, 'epoch': 0.12} 12%|█▏ | 4265/34278 [4:49:42<29:36:58, 3.55s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (14170 > 8192). Running this sequence through the model will result in indexing errors 12%|█▏ | 4266/34278 [4:49:46<30:33:04, 3.66s/it] {'loss': 0.1863, 'grad_norm': 0.7345958432260812, 'learning_rate': 9.767950963368018e-06, 'epoch': 0.12} 12%|█▏ | 4266/34278 [4:49:46<30:33:04, 3.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 12%|█▏ | 4267/34278 [4:49:50<30:39:23, 3.68s/it] {'loss': 0.159, 'grad_norm': 0.872169523460647, 'learning_rate': 9.767808688602873e-06, 'epoch': 0.12} 12%|█▏ | 4267/34278 [4:49:50<30:39:23, 3.68s/it] 12%|█▏ | 4268/34278 [4:49:56<36:26:37, 4.37s/it] {'loss': 0.1784, 'grad_norm': 0.8695611591130454, 'learning_rate': 9.767666371271871e-06, 'epoch': 0.12} 12%|█▏ | 4268/34278 [4:49:56<36:26:37, 4.37s/it] 12%|█▏ | 4269/34278 [4:49:59<35:00:58, 4.20s/it] {'loss': 0.174, 'grad_norm': 0.6730778413023714, 'learning_rate': 9.767524011376283e-06, 'epoch': 0.12} 12%|█▏ | 4269/34278 [4:49:59<35:00:58, 4.20s/it] 12%|█▏ | 4270/34278 [4:50:02<32:00:07, 3.84s/it] {'loss': 0.1973, 'grad_norm': 0.9220896534051399, 'learning_rate': 9.767381608917377e-06, 'epoch': 0.12} 12%|█▏ | 4270/34278 [4:50:02<32:00:07, 3.84s/it] 12%|█▏ | 4271/34278 [4:50:05<29:52:13, 3.58s/it] {'loss': 0.1691, 'grad_norm': 0.871915000359137, 'learning_rate': 9.767239163896427e-06, 'epoch': 0.12} 12%|█▏ | 4271/34278 [4:50:05<29:52:13, 3.58s/it] 12%|█▏ | 4272/34278 [4:50:12<36:48:54, 4.42s/it] {'loss': 0.194, 'grad_norm': 0.8734391780543423, 'learning_rate': 9.767096676314703e-06, 'epoch': 0.12} 12%|█▏ | 4272/34278 [4:50:12<36:48:54, 4.42s/it] 12%|█▏ | 4273/34278 [4:50:15<34:44:14, 4.17s/it] {'loss': 0.1925, 'grad_norm': 0.8891960491574288, 'learning_rate': 9.76695414617348e-06, 'epoch': 0.12} 12%|█▏ | 4273/34278 [4:50:15<34:44:14, 4.17s/it] 12%|█▏ | 4274/34278 [4:50:18<32:26:04, 3.89s/it] {'loss': 0.1743, 'grad_norm': 0.8286255539685276, 'learning_rate': 9.766811573474026e-06, 'epoch': 0.12} 12%|█▏ | 4274/34278 [4:50:18<32:26:04, 3.89s/it] 12%|█▏ | 4275/34278 [4:50:22<30:58:03, 3.72s/it] {'loss': 0.1531, 'grad_norm': 1.1132476951309755, 'learning_rate': 9.766668958217617e-06, 'epoch': 0.12} 12%|█▏ | 4275/34278 [4:50:22<30:58:03, 3.72s/it] 12%|█▏ | 4276/34278 [4:50:26<31:01:59, 3.72s/it] {'loss': 0.1677, 'grad_norm': 0.9154750259971727, 'learning_rate': 9.766526300405525e-06, 
'epoch': 0.12} 12%|█▏ | 4276/34278 [4:50:26<31:01:59, 3.72s/it] 12%|█▏ | 4277/34278 [4:50:29<29:57:16, 3.59s/it] {'loss': 0.1823, 'grad_norm': 0.9013459160814709, 'learning_rate': 9.766383600039025e-06, 'epoch': 0.12} 12%|█▏ | 4277/34278 [4:50:29<29:57:16, 3.59s/it] 12%|█▏ | 4278/34278 [4:50:32<28:42:05, 3.44s/it] {'loss': 0.1581, 'grad_norm': 0.9060290447522402, 'learning_rate': 9.76624085711939e-06, 'epoch': 0.12} 12%|█▏ | 4278/34278 [4:50:32<28:42:05, 3.44s/it] 12%|█▏ | 4279/34278 [4:50:35<28:09:39, 3.38s/it] {'loss': 0.1683, 'grad_norm': 0.9803903375051782, 'learning_rate': 9.766098071647892e-06, 'epoch': 0.12} 12%|█▏ | 4279/34278 [4:50:35<28:09:39, 3.38s/it] 12%|█▏ | 4280/34278 [4:50:39<29:39:53, 3.56s/it] {'loss': 0.2036, 'grad_norm': 0.8788098055543385, 'learning_rate': 9.765955243625811e-06, 'epoch': 0.12} 12%|█▏ | 4280/34278 [4:50:39<29:39:53, 3.56s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 12%|█▏ | 4281/34278 [4:50:43<29:54:04, 3.59s/it] {'loss': 0.1957, 'grad_norm': 0.9587287730384955, 'learning_rate': 9.765812373054418e-06, 'epoch': 0.12} 12%|█▏ | 4281/34278 [4:50:43<29:54:04, 3.59s/it] 12%|█▏ | 4282/34278 [4:50:46<28:50:39, 3.46s/it] {'loss': 0.1852, 'grad_norm': 0.9823218003703186, 'learning_rate': 9.76566945993499e-06, 'epoch': 0.12} 12%|█▏ | 4282/34278 [4:50:46<28:50:39, 3.46s/it] 12%|█▏ | 4283/34278 [4:50:50<29:46:03, 3.57s/it] {'loss': 0.1767, 'grad_norm': 1.0552717781488987, 'learning_rate': 9.765526504268803e-06, 'epoch': 0.12} 12%|█▏ | 4283/34278 [4:50:50<29:46:03, 3.57s/it] 12%|█▏ | 4284/34278 [4:50:54<30:14:42, 3.63s/it] {'loss': 0.1985, 'grad_norm': 1.0440471798962738, 'learning_rate': 9.765383506057134e-06, 'epoch': 0.12} 12%|█▏ | 4284/34278 [4:50:54<30:14:42, 3.63s/it] 13%|█▎ | 4285/34278 [4:50:59<34:42:11, 4.17s/it] {'loss': 0.1695, 'grad_norm': 0.8833607271954884, 'learning_rate': 9.765240465301256e-06, 'epoch': 0.13} 13%|█▎ | 4285/34278 [4:50:59<34:42:11, 4.17s/it] 13%|█▎ | 4286/34278 [4:51:03<33:19:52, 4.00s/it] {'loss': 0.202, 'grad_norm': 1.0199050563499885, 'learning_rate': 9.765097382002451e-06, 'epoch': 0.13} 13%|█▎ | 4286/34278 [4:51:03<33:19:52, 4.00s/it] 13%|█▎ | 4287/34278 [4:51:05<30:33:30, 3.67s/it] {'loss': 0.1864, 'grad_norm': 1.00389863995398, 'learning_rate': 9.764954256161994e-06, 'epoch': 0.13} 13%|█▎ | 4287/34278 [4:51:05<30:33:30, 3.67s/it] 13%|█▎ | 4288/34278 [4:51:09<30:54:38, 3.71s/it] {'loss': 0.1737, 'grad_norm': 1.0983481360195875, 'learning_rate': 9.76481108778116e-06, 'epoch': 0.13} 13%|█▎ | 4288/34278 [4:51:09<30:54:38, 3.71s/it] 13%|█▎ | 4289/34278 [4:51:13<30:49:54, 3.70s/it] {'loss': 0.1745, 'grad_norm': 0.9268256651823165, 'learning_rate': 9.764667876861234e-06, 'epoch': 0.13} 13%|█▎ | 4289/34278 [4:51:13<30:49:54, 3.70s/it] 13%|█▎ | 4290/34278 [4:51:17<31:21:21, 3.76s/it] {'loss': 0.1788, 'grad_norm': 0.7470529526993203, 'learning_rate': 9.764524623403488e-06, 
'epoch': 0.13} 13%|█▎ | 4290/34278 [4:51:17<31:21:21, 3.76s/it] 13%|█▎ | 4291/34278 [4:51:20<30:02:13, 3.61s/it] {'loss': 0.176, 'grad_norm': 1.1216784539669975, 'learning_rate': 9.764381327409204e-06, 'epoch': 0.13} 13%|█▎ | 4291/34278 [4:51:20<30:02:13, 3.61s/it] 13%|█▎ | 4292/34278 [4:51:23<28:11:17, 3.38s/it] {'loss': 0.1994, 'grad_norm': 0.9260657882814729, 'learning_rate': 9.764237988879663e-06, 'epoch': 0.13} 13%|█▎ | 4292/34278 [4:51:23<28:11:17, 3.38s/it] 13%|█▎ | 4293/34278 [4:51:28<31:25:08, 3.77s/it] {'loss': 0.1698, 'grad_norm': 0.8801649593751968, 'learning_rate': 9.76409460781614e-06, 'epoch': 0.13} 13%|█▎ | 4293/34278 [4:51:28<31:25:08, 3.77s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 13%|█▎ | 4294/34278 [4:51:31<29:40:31, 3.56s/it] {'loss': 0.1665, 'grad_norm': 0.7524840407802879, 'learning_rate': 9.76395118421992e-06, 'epoch': 0.13} 13%|█▎ | 4294/34278 [4:51:31<29:40:31, 3.56s/it] 13%|█▎ | 4295/34278 [4:51:34<29:00:19, 3.48s/it] {'loss': 0.203, 'grad_norm': 0.9182771897339053, 'learning_rate': 9.763807718092278e-06, 'epoch': 0.13} 13%|█▎ | 4295/34278 [4:51:34<29:00:19, 3.48s/it] 13%|█▎ | 4296/34278 [4:51:37<28:47:24, 3.46s/it] {'loss': 0.1744, 'grad_norm': 0.9437971667408273, 'learning_rate': 9.7636642094345e-06, 'epoch': 0.13} 13%|█▎ | 4296/34278 [4:51:37<28:47:24, 3.46s/it] 13%|█▎ | 4297/34278 [4:51:41<28:59:12, 3.48s/it] {'loss': 0.1656, 'grad_norm': 0.8554139785337814, 'learning_rate': 9.763520658247866e-06, 'epoch': 0.13} 13%|█▎ | 4297/34278 [4:51:41<28:59:12, 3.48s/it] 13%|█▎ | 4298/34278 [4:51:44<28:08:00, 3.38s/it] {'loss': 0.1814, 'grad_norm': 1.0251410191592603, 'learning_rate': 9.763377064533654e-06, 'epoch': 0.13} 13%|█▎ | 4298/34278 [4:51:44<28:08:00, 3.38s/it] 13%|█▎ | 4299/34278 [4:51:47<27:09:15, 3.26s/it] {'loss': 0.1878, 'grad_norm': 
0.8584107009287739, 'learning_rate': 9.76323342829315e-06, 'epoch': 0.13} 13%|█▎ | 4299/34278 [4:51:47<27:09:15, 3.26s/it] 13%|█▎ | 4300/34278 [4:51:50<27:31:01, 3.30s/it] {'loss': 0.1914, 'grad_norm': 1.298075008577799, 'learning_rate': 9.763089749527635e-06, 'epoch': 0.13} 13%|█▎ | 4300/34278 [4:51:51<27:31:01, 3.30s/it] 13%|█▎ | 4301/34278 [4:51:56<34:11:04, 4.11s/it] {'loss': 0.1741, 'grad_norm': 0.8654377573473938, 'learning_rate': 9.762946028238391e-06, 'epoch': 0.13} 13%|█▎ | 4301/34278 [4:51:56<34:11:04, 4.11s/it] 13%|█▎ | 4302/34278 [4:51:59<31:10:24, 3.74s/it] {'loss': 0.1893, 'grad_norm': 0.7753543465818352, 'learning_rate': 9.762802264426703e-06, 'epoch': 0.13} 13%|█▎ | 4302/34278 [4:51:59<31:10:24, 3.74s/it] 13%|█▎ | 4303/34278 [4:52:05<36:50:01, 4.42s/it] {'loss': 0.1782, 'grad_norm': 1.3344574668330917, 'learning_rate': 9.762658458093852e-06, 'epoch': 0.13} 13%|█▎ | 4303/34278 [4:52:05<36:50:01, 4.42s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 13%|█▎ | 4304/34278 [4:52:11<40:13:00, 4.83s/it] {'loss': 0.1892, 'grad_norm': 0.9834481709401277, 'learning_rate': 9.762514609241124e-06, 'epoch': 0.13} 13%|█▎ | 4304/34278 [4:52:11<40:13:00, 4.83s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4305/34278 [4:52:18<44:28:08, 5.34s/it] {'loss': 0.1835, 'grad_norm': 0.8555403034672496, 'learning_rate': 9.762370717869804e-06, 'epoch': 0.13} 13%|█▎ | 4305/34278 [4:52:18<44:28:08, 5.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 13%|█▎ | 4306/34278 [4:52:21<38:15:04, 4.59s/it] {'loss': 0.1929, 'grad_norm': 1.0901077663869283, 'learning_rate': 9.762226783981174e-06, 'epoch': 0.13} 13%|█▎ | 4306/34278 [4:52:21<38:15:04, 4.59s/it] 13%|█▎ | 4307/34278 [4:52:24<35:15:13, 4.23s/it] {'loss': 0.1909, 'grad_norm': 0.9769178606454748, 'learning_rate': 9.762082807576518e-06, 'epoch': 0.13} 13%|█▎ | 4307/34278 [4:52:24<35:15:13, 4.23s/it] 13%|█▎ | 4308/34278 [4:52:28<34:08:58, 4.10s/it] {'loss': 0.1983, 'grad_norm': 0.7929454063432803, 'learning_rate': 9.761938788657127e-06, 'epoch': 0.13} 13%|█▎ | 4308/34278 [4:52:28<34:08:58, 4.10s/it] 13%|█▎ | 4309/34278 [4:52:31<31:46:06, 3.82s/it] {'loss': 0.2156, 'grad_norm': 0.8921832339132862, 'learning_rate': 9.761794727224281e-06, 'epoch': 0.13} 13%|█▎ | 4309/34278 [4:52:31<31:46:06, 3.82s/it] 13%|█▎ | 4310/34278 [4:52:34<29:05:43, 3.50s/it] {'loss': 0.1535, 'grad_norm': 0.8545715424189586, 'learning_rate': 9.761650623279269e-06, 'epoch': 0.13} 13%|█▎ | 4310/34278 [4:52:34<29:05:43, 3.50s/it] 13%|█▎ | 4311/34278 [4:52:37<29:51:46, 3.59s/it] {'loss': 0.1788, 'grad_norm': 0.8778341189523261, 'learning_rate': 9.761506476823377e-06, 'epoch': 0.13} 13%|█▎ | 4311/34278 [4:52:37<29:51:46, 3.59s/it] 13%|█▎ | 4312/34278 [4:52:41<28:54:15, 3.47s/it] {'loss': 0.1773, 'grad_norm': 0.7884318379167016, 'learning_rate': 9.761362287857891e-06, 'epoch': 0.13} 13%|█▎ | 4312/34278 [4:52:41<28:54:15, 3.47s/it] 13%|█▎ | 4313/34278 [4:52:44<29:08:11, 3.50s/it] {'loss': 0.1728, 'grad_norm': 0.8426840395974396, 'learning_rate': 
9.761218056384102e-06, 'epoch': 0.13} 13%|█▎ | 4313/34278 [4:52:44<29:08:11, 3.50s/it] 13%|█▎ | 4314/34278 [4:52:48<28:48:31, 3.46s/it] {'loss': 0.1814, 'grad_norm': 0.9073579849848015, 'learning_rate': 9.761073782403291e-06, 'epoch': 0.13} 13%|█▎ | 4314/34278 [4:52:48<28:48:31, 3.46s/it] 13%|█▎ | 4315/34278 [4:52:51<29:57:35, 3.60s/it] {'loss': 0.1588, 'grad_norm': 0.9179197291915331, 'learning_rate': 9.760929465916752e-06, 'epoch': 0.13} 13%|█▎ | 4315/34278 [4:52:52<29:57:35, 3.60s/it] 13%|█▎ | 4316/34278 [4:52:55<28:56:54, 3.48s/it] {'loss': 0.1725, 'grad_norm': 0.8223070944303513, 'learning_rate': 9.76078510692577e-06, 'epoch': 0.13} 13%|█▎ | 4316/34278 [4:52:55<28:56:54, 3.48s/it] 13%|█▎ | 4317/34278 [4:53:01<35:12:09, 4.23s/it] {'loss': 0.1931, 'grad_norm': 0.9131029632198148, 'learning_rate': 9.760640705431636e-06, 'epoch': 0.13} 13%|█▎ | 4317/34278 [4:53:01<35:12:09, 4.23s/it] 13%|█▎ | 4318/34278 [4:53:05<35:12:55, 4.23s/it] {'loss': 0.1472, 'grad_norm': 0.8772096245229108, 'learning_rate': 9.76049626143564e-06, 'epoch': 0.13} 13%|█▎ | 4318/34278 [4:53:05<35:12:55, 4.23s/it] 13%|█▎ | 4319/34278 [4:53:11<39:39:36, 4.77s/it] {'loss': 0.1809, 'grad_norm': 0.7007596844463523, 'learning_rate': 9.760351774939068e-06, 'epoch': 0.13} 13%|█▎ | 4319/34278 [4:53:11<39:39:36, 4.77s/it] 13%|█▎ | 4320/34278 [4:53:17<42:54:57, 5.16s/it] {'loss': 0.1905, 'grad_norm': 0.8300390578037848, 'learning_rate': 9.76020724594321e-06, 'epoch': 0.13} 13%|█▎ | 4320/34278 [4:53:17<42:54:57, 5.16s/it] 13%|█▎ | 4321/34278 [4:53:21<39:39:10, 4.77s/it] {'loss': 0.1781, 'grad_norm': 1.0328419104468722, 'learning_rate': 9.76006267444936e-06, 'epoch': 0.13} 13%|█▎ | 4321/34278 [4:53:21<39:39:10, 4.77s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9577 > 8192). 
Running this sequence through the model will result in indexing errors 13%|█▎ | 4322/34278 [4:53:26<39:47:26, 4.78s/it] {'loss': 0.1619, 'grad_norm': 0.7480386916897522, 'learning_rate': 9.759918060458807e-06, 'epoch': 0.13} 13%|█▎ | 4322/34278 [4:53:26<39:47:26, 4.78s/it] 13%|█▎ | 4323/34278 [4:53:31<40:06:10, 4.82s/it] {'loss': 0.1793, 'grad_norm': 0.8698351781302592, 'learning_rate': 9.75977340397284e-06, 'epoch': 0.13} 13%|█▎ | 4323/34278 [4:53:31<40:06:10, 4.82s/it] 13%|█▎ | 4324/34278 [4:53:34<37:03:53, 4.45s/it] {'loss': 0.1735, 'grad_norm': 0.8172360464952455, 'learning_rate': 9.759628704992754e-06, 'epoch': 0.13} 13%|█▎ | 4324/34278 [4:53:34<37:03:53, 4.45s/it] 13%|█▎ | 4325/34278 [4:53:37<33:56:08, 4.08s/it] {'loss': 0.1738, 'grad_norm': 0.8593913726772656, 'learning_rate': 9.75948396351984e-06, 'epoch': 0.13} 13%|█▎ | 4325/34278 [4:53:37<33:56:08, 4.08s/it] 13%|█▎ | 4326/34278 [4:53:44<40:00:02, 4.81s/it] {'loss': 0.1669, 'grad_norm': 0.8322858175517441, 'learning_rate': 9.759339179555387e-06, 'epoch': 0.13} 13%|█▎ | 4326/34278 [4:53:44<40:00:02, 4.81s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4327/34278 [4:53:47<36:03:58, 4.34s/it] {'loss': 0.1765, 'grad_norm': 0.8524071023669001, 'learning_rate': 9.759194353100692e-06, 'epoch': 0.13} 13%|█▎ | 4327/34278 [4:53:47<36:03:58, 4.34s/it] 13%|█▎ | 4328/34278 [4:53:53<40:00:59, 4.81s/it] {'loss': 0.1711, 'grad_norm': 0.8115798583704614, 'learning_rate': 9.759049484157045e-06, 'epoch': 0.13} 13%|█▎ | 4328/34278 [4:53:53<40:00:59, 4.81s/it] 13%|█▎ | 4329/34278 [4:53:59<42:57:33, 5.16s/it] {'loss': 0.1997, 'grad_norm': 0.830874898071875, 'learning_rate': 9.758904572725739e-06, 'epoch': 0.13} 13%|█▎ | 4329/34278 [4:53:59<42:57:33, 5.16s/it] 13%|█▎ | 4330/34278 [4:54:03<39:44:30, 4.78s/it] {'loss': 0.1765, 'grad_norm': 0.8784167964965695, 'learning_rate': 9.758759618808071e-06, 'epoch': 0.13} 13%|█▎ | 4330/34278 [4:54:03<39:44:30, 4.78s/it] 13%|█▎ | 4331/34278 [4:54:06<36:12:12, 4.35s/it] {'loss': 0.1848, 'grad_norm': 0.7548423758563226, 'learning_rate': 9.75861462240533e-06, 'epoch': 0.13} 13%|█▎ | 4331/34278 [4:54:06<36:12:12, 4.35s/it] 13%|█▎ | 4332/34278 [4:54:11<36:46:22, 4.42s/it] {'loss': 0.1617, 'grad_norm': 0.844675211204546, 'learning_rate': 9.758469583518819e-06, 'epoch': 0.13} 13%|█▎ | 4332/34278 [4:54:11<36:46:22, 4.42s/it] 13%|█▎ | 4333/34278 [4:54:15<35:47:33, 4.30s/it] {'loss': 0.1688, 'grad_norm': 0.8814391936112137, 'learning_rate': 9.758324502149824e-06, 'epoch': 0.13} 13%|█▎ | 4333/34278 [4:54:15<35:47:33, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4334/34278 [4:54:20<38:36:08, 4.64s/it] {'loss': 0.1648, 'grad_norm': 0.9049808953261231, 'learning_rate': 9.758179378299644e-06, 'epoch': 0.13} 13%|█▎ | 4334/34278 [4:54:20<38:36:08, 4.64s/it] 13%|█▎ | 4335/34278 [4:54:23<34:59:11, 4.21s/it] {'loss': 0.1932, 'grad_norm': 0.941452688339845, 'learning_rate': 9.758034211969573e-06, 'epoch': 0.13} 13%|█▎ | 4335/34278 [4:54:23<34:59:11, 4.21s/it] 13%|█▎ | 4336/34278 [4:54:27<33:12:09, 3.99s/it] {'loss': 0.1624, 'grad_norm': 0.7424095216186802, 'learning_rate': 9.757889003160912e-06, 'epoch': 0.13} 13%|█▎ | 4336/34278 [4:54:27<33:12:09, 3.99s/it] 13%|█▎ | 4337/34278 [4:54:33<38:35:20, 4.64s/it] {'loss': 0.1705, 'grad_norm': 0.853858928136336, 'learning_rate': 9.757743751874951e-06, 'epoch': 0.13} 13%|█▎ | 4337/34278 [4:54:33<38:35:20, 4.64s/it] 13%|█▎ | 4338/34278 [4:54:36<34:28:23, 4.15s/it] {'loss': 0.1698, 'grad_norm': 0.83593321941497, 'learning_rate': 9.757598458112991e-06, 'epoch': 0.13} 13%|█▎ | 4338/34278 [4:54:36<34:28:23, 4.15s/it] 13%|█▎ | 4339/34278 [4:54:39<31:17:22, 3.76s/it] {'loss': 0.1504, 'grad_norm': 0.7729304014754222, 'learning_rate': 9.757453121876327e-06, 'epoch': 0.13} 13%|█▎ | 4339/34278 [4:54:39<31:17:22, 3.76s/it] 13%|█▎ | 4340/34278 [4:54:45<36:46:13, 4.42s/it] {'loss': 0.1568, 'grad_norm': 1.2136177458516852, 'learning_rate': 9.757307743166259e-06, 'epoch': 0.13} 13%|█▎ | 4340/34278 [4:54:45<36:46:13, 4.42s/it] 13%|█▎ | 4341/34278 [4:54:48<33:05:54, 3.98s/it] {'loss': 0.1737, 'grad_norm': 0.8488242378339931, 'learning_rate': 9.757162321984079e-06, 'epoch': 0.13} 13%|█▎ | 4341/34278 [4:54:48<33:05:54, 3.98s/it] 13%|█▎ | 4342/34278 [4:54:51<32:02:23, 3.85s/it] {'loss': 0.1948, 'grad_norm': 0.8264301337138541, 'learning_rate': 9.757016858331092e-06, 'epoch': 0.13} 13%|█▎ | 4342/34278 [4:54:51<32:02:23, 3.85s/it] 13%|█▎ | 4343/34278 [4:54:55<30:45:17, 3.70s/it] {'loss': 0.1746, 'grad_norm': 1.1684518053757948, 'learning_rate': 9.756871352208594e-06, 
'epoch': 0.13} 13%|█▎ | 4343/34278 [4:54:55<30:45:17, 3.70s/it] 13%|█▎ | 4344/34278 [4:55:01<37:36:43, 4.52s/it] {'loss': 0.2024, 'grad_norm': 0.9713931040968758, 'learning_rate': 9.756725803617883e-06, 'epoch': 0.13} 13%|█▎ | 4344/34278 [4:55:01<37:36:43, 4.52s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 13%|█▎ | 4345/34278 [4:55:04<33:15:55, 4.00s/it] {'loss': 0.1805, 'grad_norm': 0.903833122560911, 'learning_rate': 9.756580212560261e-06, 'epoch': 0.13} 13%|█▎ | 4345/34278 [4:55:04<33:15:55, 4.00s/it] 13%|█▎ | 4346/34278 [4:55:07<31:11:07, 3.75s/it] {'loss': 0.1978, 'grad_norm': 1.0932615376906702, 'learning_rate': 9.756434579037027e-06, 'epoch': 0.13} 13%|█▎ | 4346/34278 [4:55:07<31:11:07, 3.75s/it] 13%|█▎ | 4347/34278 [4:55:11<31:32:58, 3.79s/it] {'loss': 0.1823, 'grad_norm': 1.019024004897646, 'learning_rate': 9.75628890304948e-06, 'epoch': 0.13} 13%|█▎ | 4347/34278 [4:55:11<31:32:58, 3.79s/it] 13%|█▎ | 4348/34278 [4:55:14<29:52:47, 3.59s/it] {'loss': 0.182, 'grad_norm': 0.7929642521823096, 'learning_rate': 9.756143184598919e-06, 'epoch': 0.13} 13%|█▎ | 4348/34278 [4:55:14<29:52:47, 3.59s/it] 13%|█▎ | 4349/34278 [4:55:18<30:20:02, 3.65s/it] {'loss': 0.1728, 'grad_norm': 0.839591692305866, 'learning_rate': 9.755997423686649e-06, 'epoch': 0.13} 13%|█▎ | 4349/34278 [4:55:18<30:20:02, 3.65s/it] 13%|█▎ | 4350/34278 [4:55:22<30:47:26, 3.70s/it] {'loss': 0.1765, 'grad_norm': 0.9423603267135233, 'learning_rate': 9.75585162031397e-06, 'epoch': 0.13} 13%|█▎ | 4350/34278 [4:55:22<30:47:26, 3.70s/it] 13%|█▎ | 4351/34278 [4:55:26<31:20:36, 3.77s/it] {'loss': 0.1857, 'grad_norm': 0.8079607281592949, 'learning_rate': 9.75570577448218e-06, 'epoch': 0.13} 13%|█▎ | 4351/34278 [4:55:26<31:20:36, 3.77s/it] 13%|█▎ | 4352/34278 [4:55:29<30:50:18, 3.71s/it] {'loss': 0.1741, 'grad_norm': 
0.8822615224554138, 'learning_rate': 9.755559886192586e-06, 'epoch': 0.13} 13%|█▎ | 4352/34278 [4:55:29<30:50:18, 3.71s/it] 13%|█▎ | 4353/34278 [4:55:33<29:40:06, 3.57s/it] {'loss': 0.1711, 'grad_norm': 0.6578728844839694, 'learning_rate': 9.75541395544649e-06, 'epoch': 0.13} 13%|█▎ | 4353/34278 [4:55:33<29:40:06, 3.57s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9691 > 8192). Running this sequence through the model will result in indexing errors 13%|█▎ | 4354/34278 [4:55:37<31:17:13, 3.76s/it] {'loss': 0.1746, 'grad_norm': 0.7544640852412733, 'learning_rate': 9.755267982245192e-06, 'epoch': 0.13} 13%|█▎ | 4354/34278 [4:55:37<31:17:13, 3.76s/it] 13%|█▎ | 4355/34278 [4:55:40<29:52:49, 3.59s/it] {'loss': 0.1552, 'grad_norm': 0.8261389581100625, 'learning_rate': 9.755121966589996e-06, 'epoch': 0.13} 13%|█▎ | 4355/34278 [4:55:40<29:52:49, 3.59s/it] 13%|█▎ | 4356/34278 [4:55:46<34:53:02, 4.20s/it] {'loss': 0.1747, 'grad_norm': 0.7913161637905148, 'learning_rate': 9.754975908482207e-06, 'epoch': 0.13} 13%|█▎ | 4356/34278 [4:55:46<34:53:02, 4.20s/it] 13%|█▎ | 4357/34278 [4:55:49<31:55:51, 3.84s/it] {'loss': 0.1879, 'grad_norm': 0.8225646514876421, 'learning_rate': 9.75482980792313e-06, 'epoch': 0.13} 13%|█▎ | 4357/34278 [4:55:49<31:55:51, 3.84s/it] 13%|█▎ | 4358/34278 [4:55:53<32:44:04, 3.94s/it] {'loss': 0.2059, 'grad_norm': 0.876059043782913, 'learning_rate': 9.754683664914064e-06, 'epoch': 0.13} 13%|█▎ | 4358/34278 [4:55:53<32:44:04, 3.94s/it] 13%|█▎ | 4359/34278 [4:55:56<30:10:43, 3.63s/it] {'loss': 0.169, 'grad_norm': 0.8546291935416853, 'learning_rate': 9.75453747945632e-06, 'epoch': 0.13} 13%|█▎ | 4359/34278 [4:55:56<30:10:43, 3.63s/it] 13%|█▎ | 4360/34278 [4:55:59<29:07:39, 3.50s/it] {'loss': 0.1804, 'grad_norm': 0.9040342126127462, 'learning_rate': 9.754391251551199e-06, 'epoch': 0.13} 13%|█▎ | 4360/34278 [4:55:59<29:07:39, 3.50s/it]Token indices sequence length is longer than the specified maximum 
sequence length for this model (9519 > 8192). Running this sequence through the model will result in indexing errors 13%|█▎ | 4361/34278 [4:56:05<35:10:25, 4.23s/it] {'loss': 0.1708, 'grad_norm': 0.7900388267349989, 'learning_rate': 9.754244981200007e-06, 'epoch': 0.13} 13%|█▎ | 4361/34278 [4:56:05<35:10:25, 4.23s/it] 13%|█▎ | 4362/34278 [4:56:08<32:17:56, 3.89s/it] {'loss': 0.1904, 'grad_norm': 0.7686265863281533, 'learning_rate': 9.754098668404053e-06, 'epoch': 0.13} 13%|█▎ | 4362/34278 [4:56:08<32:17:56, 3.89s/it] 13%|█▎ | 4363/34278 [4:56:11<29:19:48, 3.53s/it] {'loss': 0.1962, 'grad_norm': 0.8614124048181928, 'learning_rate': 9.753952313164639e-06, 'epoch': 0.13} 13%|█▎ | 4363/34278 [4:56:11<29:19:48, 3.53s/it] 13%|█▎ | 4364/34278 [4:56:14<27:49:11, 3.35s/it] {'loss': 0.1648, 'grad_norm': 0.8364488152172986, 'learning_rate': 9.753805915483076e-06, 'epoch': 0.13} 13%|█▎ | 4364/34278 [4:56:14<27:49:11, 3.35s/it] 13%|█▎ | 4365/34278 [4:56:17<28:45:00, 3.46s/it] {'loss': 0.1745, 'grad_norm': 1.1712213109837195, 'learning_rate': 9.753659475360666e-06, 'epoch': 0.13} 13%|█▎ | 4365/34278 [4:56:17<28:45:00, 3.46s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 13%|█▎ | 4366/34278 [4:56:20<27:54:43, 3.36s/it] {'loss': 0.2046, 'grad_norm': 0.823577484205447, 'learning_rate': 9.75351299279872e-06, 'epoch': 0.13} 13%|█▎ | 4366/34278 [4:56:20<27:54:43, 3.36s/it] 13%|█▎ | 4367/34278 [4:56:24<27:31:43, 3.31s/it] {'loss': 0.1854, 'grad_norm': 0.8274937620034111, 'learning_rate': 9.753366467798545e-06, 'epoch': 0.13} 13%|█▎ | 4367/34278 [4:56:24<27:31:43, 3.31s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (13268 > 8192). 
Running this sequence through the model will result in indexing errors 13%|█▎ | 4368/34278 [4:56:26<26:35:12, 3.20s/it] {'loss': 0.1667, 'grad_norm': 0.7816174048127913, 'learning_rate': 9.753219900361449e-06, 'epoch': 0.13} 13%|█▎ | 4368/34278 [4:56:27<26:35:12, 3.20s/it] 13%|█▎ | 4369/34278 [4:56:30<27:43:24, 3.34s/it] {'loss': 0.1933, 'grad_norm': 0.9266467808636973, 'learning_rate': 9.75307329048874e-06, 'epoch': 0.13} 13%|█▎ | 4369/34278 [4:56:30<27:43:24, 3.34s/it] 13%|█▎ | 4370/34278 [4:56:33<26:15:54, 3.16s/it] {'loss': 0.1567, 'grad_norm': 0.8210069172112148, 'learning_rate': 9.752926638181728e-06, 'epoch': 0.13} 13%|█▎ | 4370/34278 [4:56:33<26:15:54, 3.16s/it] 13%|█▎ | 4371/34278 [4:56:36<26:04:01, 3.14s/it] {'loss': 0.1563, 'grad_norm': 0.8234231612323969, 'learning_rate': 9.75277994344172e-06, 'epoch': 0.13} 13%|█▎ | 4371/34278 [4:56:36<26:04:01, 3.14s/it] 13%|█▎ | 4372/34278 [4:56:41<30:44:52, 3.70s/it] {'loss': 0.1654, 'grad_norm': 0.9439437784972723, 'learning_rate': 9.75263320627003e-06, 'epoch': 0.13} 13%|█▎ | 4372/34278 [4:56:41<30:44:52, 3.70s/it] 13%|█▎ | 4373/34278 [4:56:44<29:01:01, 3.49s/it] {'loss': 0.187, 'grad_norm': 1.0884616616947242, 'learning_rate': 9.752486426667963e-06, 'epoch': 0.13} 13%|█▎ | 4373/34278 [4:56:44<29:01:01, 3.49s/it] 13%|█▎ | 4374/34278 [4:56:47<28:46:56, 3.46s/it] {'loss': 0.1978, 'grad_norm': 1.0232977420602125, 'learning_rate': 9.752339604636832e-06, 'epoch': 0.13} 13%|█▎ | 4374/34278 [4:56:47<28:46:56, 3.46s/it] 13%|█▎ | 4375/34278 [4:56:50<27:32:17, 3.32s/it] {'loss': 0.2103, 'grad_norm': 0.8515421355954126, 'learning_rate': 9.752192740177948e-06, 'epoch': 0.13} 13%|█▎ | 4375/34278 [4:56:50<27:32:17, 3.32s/it] 13%|█▎ | 4376/34278 [4:56:56<32:53:48, 3.96s/it] {'loss': 0.167, 'grad_norm': 0.8976042839791545, 'learning_rate': 9.752045833292622e-06, 'epoch': 0.13} 13%|█▎ | 4376/34278 [4:56:56<32:53:48, 
3.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 13%|█▎ | 4377/34278 [4:56:59<30:42:47, 3.70s/it] {'loss': 0.1835, 'grad_norm': 1.000535780969401, 'learning_rate': 9.751898883982164e-06, 'epoch': 0.13} 13%|█▎ | 4377/34278 [4:56:59<30:42:47, 3.70s/it] 13%|█▎ | 4378/34278 [4:57:02<28:55:20, 3.48s/it] {'loss': 0.2075, 'grad_norm': 0.8509745941678003, 'learning_rate': 9.751751892247888e-06, 'epoch': 0.13} 13%|█▎ | 4378/34278 [4:57:02<28:55:20, 3.48s/it] 13%|█▎ | 4379/34278 [4:57:06<29:17:49, 3.53s/it] {'loss': 0.2284, 'grad_norm': 1.074620987838019, 'learning_rate': 9.751604858091106e-06, 'epoch': 0.13} 13%|█▎ | 4379/34278 [4:57:06<29:17:49, 3.53s/it] 13%|█▎ | 4380/34278 [4:57:09<28:34:41, 3.44s/it] {'loss': 0.1752, 'grad_norm': 0.8236903064117794, 'learning_rate': 9.75145778151313e-06, 'epoch': 0.13} 13%|█▎ | 4380/34278 [4:57:09<28:34:41, 3.44s/it] 13%|█▎ | 4381/34278 [4:57:13<30:29:41, 3.67s/it] {'loss': 0.1694, 'grad_norm': 0.9561425146550531, 'learning_rate': 9.751310662515271e-06, 'epoch': 0.13} 13%|█▎ | 4381/34278 [4:57:13<30:29:41, 3.67s/it] 13%|█▎ | 4382/34278 [4:57:16<28:35:19, 3.44s/it] {'loss': 0.1628, 'grad_norm': 1.0155410375377456, 'learning_rate': 9.751163501098847e-06, 'epoch': 0.13} 13%|█▎ | 4382/34278 [4:57:16<28:35:19, 3.44s/it] 13%|█▎ | 4383/34278 [4:57:19<27:45:15, 3.34s/it] {'loss': 0.1641, 'grad_norm': 0.8281766624317595, 'learning_rate': 9.751016297265168e-06, 'epoch': 0.13} 13%|█▎ | 4383/34278 [4:57:19<27:45:15, 3.34s/it] 13%|█▎ | 4384/34278 [4:57:23<30:34:31, 3.68s/it] {'loss': 0.1824, 'grad_norm': 1.0099441175090207, 'learning_rate': 9.75086905101555e-06, 'epoch': 0.13} 13%|█▎ | 4384/34278 [4:57:23<30:34:31, 3.68s/it] 13%|█▎ | 4385/34278 [4:57:28<33:23:49, 4.02s/it] {'loss': 0.1763, 'grad_norm': 0.7999313130963264, 'learning_rate': 9.750721762351308e-06, 'epoch': 
0.13} 13%|█▎ | 4385/34278 [4:57:28<33:23:49, 4.02s/it] 13%|█▎ | 4386/34278 [4:57:34<38:12:27, 4.60s/it] {'loss': 0.1679, 'grad_norm': 1.054702610948299, 'learning_rate': 9.750574431273756e-06, 'epoch': 0.13} 13%|█▎ | 4386/34278 [4:57:34<38:12:27, 4.60s/it] 13%|█▎ | 4387/34278 [4:57:38<36:19:14, 4.37s/it] {'loss': 0.1783, 'grad_norm': 0.9946926871128318, 'learning_rate': 9.75042705778421e-06, 'epoch': 0.13} 13%|█▎ | 4387/34278 [4:57:38<36:19:14, 4.37s/it] 13%|█▎ | 4388/34278 [4:57:41<33:15:19, 4.01s/it] {'loss': 0.1932, 'grad_norm': 0.7788516142655981, 'learning_rate': 9.750279641883985e-06, 'epoch': 0.13} 13%|█▎ | 4388/34278 [4:57:41<33:15:19, 4.01s/it] 13%|█▎ | 4389/34278 [4:57:44<31:20:55, 3.78s/it] {'loss': 0.1718, 'grad_norm': 1.0465756204115466, 'learning_rate': 9.750132183574395e-06, 'epoch': 0.13} 13%|█▎ | 4389/34278 [4:57:44<31:20:55, 3.78s/it] 13%|█▎ | 4390/34278 [4:57:49<33:45:22, 4.07s/it] {'loss': 0.1865, 'grad_norm': 0.9532329728436199, 'learning_rate': 9.749984682856762e-06, 'epoch': 0.13} 13%|█▎ | 4390/34278 [4:57:49<33:45:22, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4391/34278 [4:57:53<32:41:22, 3.94s/it] {'loss': 0.1822, 'grad_norm': 0.8816784919587766, 'learning_rate': 9.749837139732398e-06, 'epoch': 0.13} 13%|█▎ | 4391/34278 [4:57:53<32:41:22, 3.94s/it] 13%|█▎ | 4392/34278 [4:57:56<30:56:27, 3.73s/it] {'loss': 0.1874, 'grad_norm': 1.217591959923788, 'learning_rate': 9.749689554202621e-06, 'epoch': 0.13} 13%|█▎ | 4392/34278 [4:57:56<30:56:27, 3.73s/it] 13%|█▎ | 4393/34278 [4:57:59<28:54:23, 3.48s/it] {'loss': 0.1637, 'grad_norm': 0.9820674644439095, 'learning_rate': 9.74954192626875e-06, 'epoch': 0.13} 13%|█▎ | 4393/34278 [4:57:59<28:54:23, 3.48s/it] 13%|█▎ | 4394/34278 [4:58:02<28:23:43, 3.42s/it] {'loss': 0.1606, 'grad_norm': 0.7625453602481914, 'learning_rate': 9.749394255932105e-06, 'epoch': 0.13} 13%|█▎ | 4394/34278 [4:58:02<28:23:43, 3.42s/it] 13%|█▎ | 4395/34278 [4:58:08<34:43:31, 4.18s/it] {'loss': 0.208, 'grad_norm': 1.0725278108650127, 'learning_rate': 9.749246543194e-06, 'epoch': 0.13} 13%|█▎ | 4395/34278 [4:58:08<34:43:31, 4.18s/it] 13%|█▎ | 4396/34278 [4:58:12<33:14:52, 4.01s/it] {'loss': 0.1826, 'grad_norm': 0.731477959555322, 'learning_rate': 9.749098788055755e-06, 'epoch': 0.13} 13%|█▎ | 4396/34278 [4:58:12<33:14:52, 4.01s/it] 13%|█▎ | 4397/34278 [4:58:15<31:53:21, 3.84s/it] {'loss': 0.2128, 'grad_norm': 0.881191244196159, 'learning_rate': 9.748950990518691e-06, 'epoch': 0.13} 13%|█▎ | 4397/34278 [4:58:15<31:53:21, 3.84s/it] 13%|█▎ | 4398/34278 [4:58:21<37:12:10, 4.48s/it] {'loss': 0.1821, 'grad_norm': 1.0291776455260686, 'learning_rate': 9.748803150584125e-06, 'epoch': 0.13} 13%|█▎ | 4398/34278 [4:58:21<37:12:10, 4.48s/it] 13%|█▎ | 4399/34278 [4:58:26<38:51:21, 4.68s/it] {'loss': 0.1605, 'grad_norm': 0.935685919356452, 'learning_rate': 9.74865526825338e-06, 'epoch': 0.13} 13%|█▎ | 4399/34278 [4:58:26<38:51:21, 4.68s/it] 13%|█▎ | 4400/34278 [4:58:30<35:27:34, 4.27s/it] {'loss': 0.1797, 'grad_norm': 0.8613374160537525, 'learning_rate': 9.748507343527772e-06, 
'epoch': 0.13} 13%|█▎ | 4400/34278 [4:58:30<35:27:34, 4.27s/it] 13%|█▎ | 4401/34278 [4:58:34<34:20:09, 4.14s/it] {'loss': 0.1987, 'grad_norm': 0.9866279611483154, 'learning_rate': 9.748359376408625e-06, 'epoch': 0.13} 13%|█▎ | 4401/34278 [4:58:34<34:20:09, 4.14s/it] 13%|█▎ | 4402/34278 [4:58:37<32:08:40, 3.87s/it] {'loss': 0.1823, 'grad_norm': 0.8524349648703374, 'learning_rate': 9.74821136689726e-06, 'epoch': 0.13} 13%|█▎ | 4402/34278 [4:58:37<32:08:40, 3.87s/it] 13%|█▎ | 4403/34278 [4:58:40<31:09:47, 3.76s/it] {'loss': 0.1784, 'grad_norm': 0.8114088040798733, 'learning_rate': 9.748063314994995e-06, 'epoch': 0.13} 13%|█▎ | 4403/34278 [4:58:40<31:09:47, 3.76s/it] 13%|█▎ | 4404/34278 [4:58:44<31:19:02, 3.77s/it] {'loss': 0.1882, 'grad_norm': 1.0120858097085856, 'learning_rate': 9.747915220703157e-06, 'epoch': 0.13} 13%|█▎ | 4404/34278 [4:58:44<31:19:02, 3.77s/it] 13%|█▎ | 4405/34278 [4:58:47<30:09:59, 3.64s/it] {'loss': 0.1698, 'grad_norm': 0.906713513376724, 'learning_rate': 9.747767084023063e-06, 'epoch': 0.13} 13%|█▎ | 4405/34278 [4:58:47<30:09:59, 3.64s/it] 13%|█▎ | 4406/34278 [4:58:50<28:03:43, 3.38s/it] {'loss': 0.1788, 'grad_norm': 0.808284806868891, 'learning_rate': 9.74761890495604e-06, 'epoch': 0.13} 13%|█▎ | 4406/34278 [4:58:50<28:03:43, 3.38s/it] 13%|█▎ | 4407/34278 [4:58:53<27:09:12, 3.27s/it] {'loss': 0.1552, 'grad_norm': 0.9157060587186935, 'learning_rate': 9.747470683503407e-06, 'epoch': 0.13} 13%|█▎ | 4407/34278 [4:58:53<27:09:12, 3.27s/it] 13%|█▎ | 4408/34278 [4:58:57<27:27:19, 3.31s/it] {'loss': 0.1783, 'grad_norm': 0.7299765371223875, 'learning_rate': 9.74732241966649e-06, 'epoch': 0.13} 13%|█▎ | 4408/34278 [4:58:57<27:27:19, 3.31s/it] 13%|█▎ | 4409/34278 [4:59:03<33:58:52, 4.10s/it] {'loss': 0.1805, 'grad_norm': 0.794202062630329, 'learning_rate': 9.747174113446612e-06, 'epoch': 0.13} 13%|█▎ | 4409/34278 [4:59:03<33:58:52, 4.10s/it] 13%|█▎ | 4410/34278 [4:59:05<31:04:51, 3.75s/it] {'loss': 0.1579, 'grad_norm': 0.9529590201063353, 
'learning_rate': 9.747025764845095e-06, 'epoch': 0.13} 13%|█▎ | 4410/34278 [4:59:06<31:04:51, 3.75s/it] 13%|█▎ | 4411/34278 [4:59:09<30:31:32, 3.68s/it] {'loss': 0.173, 'grad_norm': 0.815325678211468, 'learning_rate': 9.746877373863265e-06, 'epoch': 0.13} 13%|█▎ | 4411/34278 [4:59:09<30:31:32, 3.68s/it] 13%|█▎ | 4412/34278 [4:59:12<29:01:18, 3.50s/it] {'loss': 0.1658, 'grad_norm': 0.7731034448423537, 'learning_rate': 9.74672894050245e-06, 'epoch': 0.13} 13%|█▎ | 4412/34278 [4:59:12<29:01:18, 3.50s/it] 13%|█▎ | 4413/34278 [4:59:15<28:38:48, 3.45s/it] {'loss': 0.1806, 'grad_norm': 0.9692188961092176, 'learning_rate': 9.74658046476397e-06, 'epoch': 0.13} 13%|█▎ | 4413/34278 [4:59:15<28:38:48, 3.45s/it] 13%|█▎ | 4414/34278 [4:59:19<28:07:43, 3.39s/it] {'loss': 0.1703, 'grad_norm': 0.8441353019429177, 'learning_rate': 9.746431946649153e-06, 'epoch': 0.13} 13%|█▎ | 4414/34278 [4:59:19<28:07:43, 3.39s/it] 13%|█▎ | 4415/34278 [4:59:22<27:32:06, 3.32s/it] {'loss': 0.1687, 'grad_norm': 1.4495784388670117, 'learning_rate': 9.746283386159326e-06, 'epoch': 0.13} 13%|█▎ | 4415/34278 [4:59:22<27:32:06, 3.32s/it] 13%|█▎ | 4416/34278 [4:59:27<32:28:35, 3.92s/it] {'loss': 0.1833, 'grad_norm': 0.844591601552637, 'learning_rate': 9.746134783295813e-06, 'epoch': 0.13} 13%|█▎ | 4416/34278 [4:59:27<32:28:35, 3.92s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4417/34278 [4:59:30<29:51:45, 3.60s/it] {'loss': 0.183, 'grad_norm': 0.8216297326240757, 'learning_rate': 9.745986138059941e-06, 'epoch': 0.13} 13%|█▎ | 4417/34278 [4:59:30<29:51:45, 3.60s/it] 13%|█▎ | 4418/34278 [4:59:34<29:50:07, 3.60s/it] {'loss': 0.1558, 'grad_norm': 0.7837085449321946, 'learning_rate': 9.745837450453036e-06, 'epoch': 0.13} 13%|█▎ | 4418/34278 [4:59:34<29:50:07, 3.60s/it] 13%|█▎ | 4419/34278 [4:59:37<28:21:27, 3.42s/it] {'loss': 0.171, 'grad_norm': 0.9203670229578482, 'learning_rate': 9.745688720476431e-06, 'epoch': 0.13} 13%|█▎ | 4419/34278 [4:59:37<28:21:27, 3.42s/it] 13%|█▎ | 4420/34278 [4:59:42<33:14:16, 4.01s/it] {'loss': 0.1813, 'grad_norm': 0.9020978895465857, 'learning_rate': 9.745539948131449e-06, 'epoch': 0.13} 13%|█▎ | 4420/34278 [4:59:42<33:14:16, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4421/34278 [4:59:46<33:57:25, 4.09s/it] {'loss': 0.2127, 'grad_norm': 0.8931646113379528, 'learning_rate': 9.745391133419417e-06, 'epoch': 0.13} 13%|█▎ | 4421/34278 [4:59:46<33:57:25, 4.09s/it] 13%|█▎ | 4422/34278 [4:59:52<38:24:57, 4.63s/it] {'loss': 0.16, 'grad_norm': 1.1000981484583716, 'learning_rate': 9.74524227634167e-06, 'epoch': 0.13} 13%|█▎ | 4422/34278 [4:59:52<38:24:57, 4.63s/it] 13%|█▎ | 4423/34278 [4:59:55<34:34:13, 4.17s/it] {'loss': 0.1701, 'grad_norm': 1.0698500351691078, 'learning_rate': 9.745093376899528e-06, 'epoch': 0.13} 13%|█▎ | 4423/34278 [4:59:55<34:34:13, 4.17s/it] 13%|█▎ | 4424/34278 [4:59:59<33:36:25, 4.05s/it] {'loss': 0.2185, 'grad_norm': 0.9110995597732715, 'learning_rate': 9.744944435094327e-06, 'epoch': 0.13} 13%|█▎ | 4424/34278 [4:59:59<33:36:25, 4.05s/it] 13%|█▎ | 4425/34278 [5:00:02<31:37:43, 3.81s/it] {'loss': 0.2024, 'grad_norm': 0.9159848910732172, 'learning_rate': 9.744795450927395e-06, 'epoch': 0.13} 13%|█▎ | 4425/34278 [5:00:02<31:37:43, 3.81s/it] 13%|█▎ | 4426/34278 [5:00:06<30:59:28, 3.74s/it] {'loss': 0.1625, 'grad_norm': 0.8734090453282745, 'learning_rate': 9.744646424400062e-06, 'epoch': 0.13} 13%|█▎ | 4426/34278 [5:00:06<30:59:28, 3.74s/it] 13%|█▎ | 4427/34278 [5:00:08<28:17:33, 3.41s/it] {'loss': 0.1709, 'grad_norm': 0.950467843389007, 'learning_rate': 9.744497355513658e-06, 'epoch': 0.13} 13%|█▎ | 4427/34278 [5:00:09<28:17:33, 3.41s/it] 13%|█▎ | 4428/34278 [5:00:12<28:28:21, 3.43s/it] {'loss': 0.1609, 'grad_norm': 0.823200969360653, 'learning_rate': 9.744348244269515e-06, 'epoch': 0.13} 13%|█▎ | 4428/34278 [5:00:12<28:28:21, 3.43s/it] 13%|█▎ | 4429/34278 [5:00:15<27:08:58, 3.27s/it] {'loss': 0.1603, 'grad_norm': 0.8237340072156272, 'learning_rate': 9.744199090668963e-06, 'epoch': 0.13} 13%|█▎ | 4429/34278 [5:00:15<27:08:58, 3.27s/it] 13%|█▎ | 4430/34278 [5:00:18<26:16:39, 3.17s/it] {'loss': 0.1796, 'grad_norm': 0.8998333669220802, 'learning_rate': 9.744049894713334e-06, 
'epoch': 0.13} 13%|█▎ | 4430/34278 [5:00:18<26:16:39, 3.17s/it] 13%|█▎ | 4431/34278 [5:00:21<26:05:50, 3.15s/it] {'loss': 0.1642, 'grad_norm': 1.099354717803711, 'learning_rate': 9.74390065640396e-06, 'epoch': 0.13} 13%|█▎ | 4431/34278 [5:00:21<26:05:50, 3.15s/it] 13%|█▎ | 4432/34278 [5:00:24<25:49:05, 3.11s/it] {'loss': 0.1733, 'grad_norm': 0.8097038713295855, 'learning_rate': 9.743751375742171e-06, 'epoch': 0.13} 13%|█▎ | 4432/34278 [5:00:24<25:49:05, 3.11s/it] 13%|█▎ | 4433/34278 [5:00:27<26:05:21, 3.15s/it] {'loss': 0.1872, 'grad_norm': 0.8592724449828005, 'learning_rate': 9.743602052729307e-06, 'epoch': 0.13} 13%|█▎ | 4433/34278 [5:00:27<26:05:21, 3.15s/it] 13%|█▎ | 4434/34278 [5:00:31<26:42:36, 3.22s/it] {'loss': 0.1914, 'grad_norm': 1.0984868393219305, 'learning_rate': 9.743452687366692e-06, 'epoch': 0.13} 13%|█▎ | 4434/34278 [5:00:31<26:42:36, 3.22s/it] 13%|█▎ | 4435/34278 [5:00:37<33:58:08, 4.10s/it] {'loss': 0.1964, 'grad_norm': 0.8896499347836703, 'learning_rate': 9.743303279655666e-06, 'epoch': 0.13} 13%|█▎ | 4435/34278 [5:00:37<33:58:08, 4.10s/it] 13%|█▎ | 4436/34278 [5:00:40<31:31:09, 3.80s/it] {'loss': 0.1828, 'grad_norm': 1.0362742045145739, 'learning_rate': 9.74315382959756e-06, 'epoch': 0.13} 13%|█▎ | 4436/34278 [5:00:40<31:31:09, 3.80s/it] 13%|█▎ | 4437/34278 [5:00:46<36:18:16, 4.38s/it] {'loss': 0.1617, 'grad_norm': 0.7021469494672792, 'learning_rate': 9.743004337193708e-06, 'epoch': 0.13} 13%|█▎ | 4437/34278 [5:00:46<36:18:16, 4.38s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4438/34278 [5:00:49<33:52:06, 4.09s/it] {'loss': 0.2237, 'grad_norm': 0.8919651586134387, 'learning_rate': 9.742854802445447e-06, 'epoch': 0.13} 13%|█▎ | 4438/34278 [5:00:49<33:52:06, 4.09s/it] 13%|█▎ | 4439/34278 [5:00:52<30:50:18, 3.72s/it] {'loss': 0.1749, 'grad_norm': 0.9760275214910902, 'learning_rate': 9.74270522535411e-06, 'epoch': 0.13} 13%|█▎ | 4439/34278 [5:00:52<30:50:18, 3.72s/it] 13%|█▎ | 4440/34278 [5:00:56<31:34:19, 3.81s/it] {'loss': 0.177, 'grad_norm': 0.856505135887569, 'learning_rate': 9.742555605921033e-06, 'epoch': 0.13} 13%|█▎ | 4440/34278 [5:00:56<31:34:19, 3.81s/it] 13%|█▎ | 4441/34278 [5:01:00<31:37:32, 3.82s/it] {'loss': 0.1582, 'grad_norm': 0.8350006804059159, 'learning_rate': 9.742405944147552e-06, 'epoch': 0.13} 13%|█▎ | 4441/34278 [5:01:00<31:37:32, 3.82s/it] 13%|█▎ | 4442/34278 [5:01:03<30:52:23, 3.73s/it] {'loss': 0.1942, 'grad_norm': 0.7715643717784217, 'learning_rate': 9.742256240035001e-06, 'epoch': 0.13} 13%|█▎ | 4442/34278 [5:01:03<30:52:23, 3.73s/it] 13%|█▎ | 4443/34278 [5:01:07<31:00:18, 3.74s/it] {'loss': 0.2079, 'grad_norm': 0.9894542110207976, 'learning_rate': 9.74210649358472e-06, 'epoch': 0.13} 13%|█▎ | 4443/34278 [5:01:07<31:00:18, 3.74s/it] 13%|█▎ | 4444/34278 [5:01:10<29:18:32, 3.54s/it] {'loss': 0.1907, 'grad_norm': 0.9973062047101183, 'learning_rate': 9.741956704798045e-06, 'epoch': 0.13} 13%|█▎ | 4444/34278 [5:01:10<29:18:32, 3.54s/it] 13%|█▎ | 4445/34278 [5:01:14<30:02:38, 3.63s/it] {'loss': 0.1728, 'grad_norm': 0.8975359268533455, 'learning_rate': 9.741806873676311e-06, 'epoch': 0.13} 13%|█▎ | 4445/34278 [5:01:14<30:02:38, 3.63s/it] 13%|█▎ | 4446/34278 [5:01:17<28:46:17, 3.47s/it] {'loss': 0.1688, 'grad_norm': 0.9650392929427682, 'learning_rate': 9.741657000220858e-06, 'epoch': 0.13} 13%|█▎ | 4446/34278 [5:01:17<28:46:17, 3.47s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in 
__getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f339f2cd6c0>
Failed to fetch sample 3187194.
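The traceback above shows `Image.open` raising `PIL.UnidentifiedImageError` on undecodable bytes, after which training continues, so the dataset evidently logs "Failed to fetch sample N" and falls back to another sample rather than crashing the run. A minimal sketch of that pattern, assuming a hypothetical `SkippingImageDataset` over in-memory byte records (the real code fetches blobs from Ceph):

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader(raw: bytes) -> Image.Image:
    # Image.open raises UnidentifiedImageError when the buffer is not a decodable image
    return Image.open(io.BytesIO(raw)).convert("RGB")


class SkippingImageDataset:
    """Illustrative only: decode the image for index i, falling back to the
    next index when the stored bytes cannot be decoded."""

    def __init__(self, records, max_retries: int = 10):
        self.records = records          # stand-in for blobs fetched from object storage
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return pil_loader(self.records[i])
            except UnidentifiedImageError as e:
                # Mirror the log's "Failed to fetch sample N" message, then move on
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.records)
        raise RuntimeError("too many undecodable samples in a row")
```

The key design point is that a single corrupt record costs one warning line instead of killing a multi-day distributed job.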
Exception: cannot identify image file <_io.BytesIO object at 0x7f339f2cd6c0> 13%|█▎ | 4447/34278 [5:01:23<35:06:56, 4.24s/it] {'loss': 0.1742, 'grad_norm': 0.8087828182320042, 'learning_rate': 9.741507084433024e-06, 'epoch': 0.13} 13%|█▎ | 4447/34278 [5:01:23<35:06:56, 4.24s/it] 13%|█▎ | 4448/34278 [5:01:26<31:52:02, 3.85s/it] {'loss': 0.1692, 'grad_norm': 0.8646356277152982, 'learning_rate': 9.741357126314146e-06, 'epoch': 0.13} 13%|█▎ | 4448/34278 [5:01:26<31:52:02, 3.85s/it] 13%|█▎ | 4449/34278 [5:01:29<30:05:53, 3.63s/it] {'loss': 0.199, 'grad_norm': 1.2893248191042816, 'learning_rate': 9.741207125865562e-06, 'epoch': 0.13} 13%|█▎ | 4449/34278 [5:01:29<30:05:53, 3.63s/it] 13%|█▎ | 4450/34278 [5:01:32<28:40:30, 3.46s/it] {'loss': 0.1643, 'grad_norm': 0.8472904394910414, 'learning_rate': 9.741057083088614e-06, 'epoch': 0.13} 13%|█▎ | 4450/34278 [5:01:32<28:40:30, 3.46s/it] 13%|█▎ | 4451/34278 [5:01:38<34:53:36, 4.21s/it] {'loss': 0.1597, 'grad_norm': 0.6933867492193225, 'learning_rate': 9.74090699798464e-06, 'epoch': 0.13} 13%|█▎ | 4451/34278 [5:01:38<34:53:36, 4.21s/it] 13%|█▎ | 4452/34278 [5:01:42<33:28:14, 4.04s/it] {'loss': 0.1838, 'grad_norm': 0.8737035311152308, 'learning_rate': 9.740756870554979e-06, 'epoch': 0.13} 13%|█▎ | 4452/34278 [5:01:42<33:28:14, 4.04s/it] 13%|█▎ | 4453/34278 [5:01:45<32:26:05, 3.92s/it] {'loss': 0.1497, 'grad_norm': 0.7910643982888277, 'learning_rate': 9.740606700800974e-06, 'epoch': 0.13} 13%|█▎ | 4453/34278 [5:01:45<32:26:05, 3.92s/it] 13%|█▎ | 4454/34278 [5:01:48<29:45:12, 3.59s/it] {'loss': 0.1917, 'grad_norm': 0.8199743996277233, 'learning_rate': 9.740456488723964e-06, 'epoch': 0.13} 13%|█▎ | 4454/34278 [5:01:48<29:45:12, 3.59s/it] 13%|█▎ | 4455/34278 [5:01:51<28:48:46, 3.48s/it] {'loss': 0.1695, 'grad_norm': 0.7722978469757387, 'learning_rate': 9.74030623432529e-06, 'epoch': 0.13} 13%|█▎ | 4455/34278 [5:01:51<28:48:46, 3.48s/it] 13%|█▎ | 4456/34278 [5:01:55<28:11:14, 3.40s/it] {'loss': 0.1709, 'grad_norm': 1.308533333466574, 
'learning_rate': 9.740155937606291e-06, 'epoch': 0.13} 13%|█▎ | 4456/34278 [5:01:55<28:11:14, 3.40s/it] 13%|█▎ | 4457/34278 [5:02:00<34:19:42, 4.14s/it] {'loss': 0.1745, 'grad_norm': 0.7990116703827624, 'learning_rate': 9.740005598568314e-06, 'epoch': 0.13} 13%|█▎ | 4457/34278 [5:02:01<34:19:42, 4.14s/it] 13%|█▎ | 4458/34278 [5:02:04<33:50:41, 4.09s/it] {'loss': 0.2187, 'grad_norm': 1.0282780284630753, 'learning_rate': 9.739855217212699e-06, 'epoch': 0.13} 13%|█▎ | 4458/34278 [5:02:04<33:50:41, 4.09s/it] 13%|█▎ | 4459/34278 [5:02:08<31:36:28, 3.82s/it] {'loss': 0.1778, 'grad_norm': 0.8502975395774706, 'learning_rate': 9.739704793540786e-06, 'epoch': 0.13} 13%|█▎ | 4459/34278 [5:02:08<31:36:28, 3.82s/it] 13%|█▎ | 4460/34278 [5:02:11<29:42:51, 3.59s/it] {'loss': 0.178, 'grad_norm': 0.9208482895620984, 'learning_rate': 9.739554327553922e-06, 'epoch': 0.13} 13%|█▎ | 4460/34278 [5:02:11<29:42:51, 3.59s/it] 13%|█▎ | 4461/34278 [5:02:15<30:32:52, 3.69s/it] {'loss': 0.1888, 'grad_norm': 0.9264866785624601, 'learning_rate': 9.739403819253447e-06, 'epoch': 0.13} 13%|█▎ | 4461/34278 [5:02:15<30:32:52, 3.69s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4462/34278 [5:02:18<28:36:01, 3.45s/it] {'loss': 0.1728, 'grad_norm': 0.8794463549541848, 'learning_rate': 9.739253268640705e-06, 'epoch': 0.13} 13%|█▎ | 4462/34278 [5:02:18<28:36:01, 3.45s/it] 13%|█▎ | 4463/34278 [5:02:20<27:26:27, 3.31s/it] {'loss': 0.1627, 'grad_norm': 0.9188176098897562, 'learning_rate': 9.739102675717044e-06, 'epoch': 0.13} 13%|█▎ | 4463/34278 [5:02:21<27:26:27, 3.31s/it] 13%|█▎ | 4464/34278 [5:02:26<34:01:14, 4.11s/it] {'loss': 0.178, 'grad_norm': 0.9793199473961062, 'learning_rate': 9.738952040483804e-06, 'epoch': 0.13} 13%|█▎ | 4464/34278 [5:02:26<34:01:14, 4.11s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11185 > 8192). Running this sequence through the model will result in indexing errors 13%|█▎ | 4465/34278 [5:02:30<33:34:57, 4.06s/it] {'loss': 0.1757, 'grad_norm': 0.9646800594319975, 'learning_rate': 9.738801362942332e-06, 'epoch': 0.13} 13%|█▎ | 4465/34278 [5:02:30<33:34:57, 4.06s/it] 13%|█▎ | 4466/34278 [5:02:34<31:42:13, 3.83s/it] {'loss': 0.1992, 'grad_norm': 1.1586724336458754, 'learning_rate': 9.738650643093972e-06, 'epoch': 0.13} 13%|█▎ | 4466/34278 [5:02:34<31:42:13, 3.83s/it] 13%|█▎ | 4467/34278 [5:02:36<28:58:51, 3.50s/it] {'loss': 0.167, 'grad_norm': 1.1308472705818482, 'learning_rate': 9.738499880940071e-06, 'epoch': 0.13} 13%|█▎ | 4467/34278 [5:02:36<28:58:51, 3.50s/it] 13%|█▎ | 4468/34278 [5:02:40<28:13:36, 3.41s/it] {'loss': 0.193, 'grad_norm': 0.901168776397259, 'learning_rate': 9.738349076481975e-06, 'epoch': 0.13} 13%|█▎ | 4468/34278 [5:02:40<28:13:36, 3.41s/it] 13%|█▎ | 4469/34278 [5:02:43<27:52:45, 3.37s/it] {'loss': 0.1824, 'grad_norm': 0.9144738545975689, 'learning_rate': 9.738198229721028e-06, 'epoch': 0.13} 13%|█▎ | 4469/34278 [5:02:43<27:52:45, 3.37s/it] 13%|█▎ | 4470/34278 [5:02:46<27:20:19, 3.30s/it] {'loss': 0.2157, 'grad_norm': 0.8880825342168955, 'learning_rate': 9.738047340658578e-06, 'epoch': 0.13} 
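The interleaved tokenizer warnings (e.g. "11185 > 8192") mean some samples tokenize to more ids than the model's maximum sequence length, which would index past the positional range at run time. A minimal guard, assuming the 8192-token limit stated in the warnings and a hypothetical `clamp_token_ids` helper:

```python
MAX_MODEL_LEN = 8192  # context limit implied by the warnings (e.g. 11185 > 8192)


def clamp_token_ids(token_ids, max_len=MAX_MODEL_LEN):
    """Drop tokens beyond the model's context window; positions past max_len
    would index outside the positional-embedding table."""
    if len(token_ids) > max_len:
        print(
            f"Token indices sequence length is longer than the specified "
            f"maximum sequence length for this model ({len(token_ids)} > {max_len}); truncating"
        )
        return token_ids[:max_len]
    return list(token_ids)
```

In practice the same effect is usually obtained by passing a truncation option to the tokenizer itself, so over-long samples never reach the model.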
13%|█▎ | 4470/34278 [5:02:46<27:20:19, 3.30s/it] 13%|█▎ | 4471/34278 [5:02:49<26:31:46, 3.20s/it] {'loss': 0.1791, 'grad_norm': 0.7563688378989818, 'learning_rate': 9.737896409295974e-06, 'epoch': 0.13} 13%|█▎ | 4471/34278 [5:02:49<26:31:46, 3.20s/it] 13%|█▎ | 4472/34278 [5:02:53<28:07:48, 3.40s/it] {'loss': 0.1844, 'grad_norm': 0.9776378723900304, 'learning_rate': 9.73774543563456e-06, 'epoch': 0.13} 13%|█▎ | 4472/34278 [5:02:53<28:07:48, 3.40s/it] 13%|█▎ | 4473/34278 [5:02:58<31:49:09, 3.84s/it] {'loss': 0.1737, 'grad_norm': 0.7359968535887115, 'learning_rate': 9.737594419675687e-06, 'epoch': 0.13} 13%|█▎ | 4473/34278 [5:02:58<31:49:09, 3.84s/it] 13%|█▎ | 4474/34278 [5:03:01<29:19:50, 3.54s/it] {'loss': 0.1718, 'grad_norm': 0.7716440186722187, 'learning_rate': 9.737443361420702e-06, 'epoch': 0.13} 13%|█▎ | 4474/34278 [5:03:01<29:19:50, 3.54s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9716 > 8192). Running this sequence through the model will result in indexing errors 13%|█▎ | 4475/34278 [5:03:04<29:09:00, 3.52s/it] {'loss': 0.1585, 'grad_norm': 0.9520845386723726, 'learning_rate': 9.737292260870954e-06, 'epoch': 0.13} 13%|█▎ | 4475/34278 [5:03:04<29:09:00, 3.52s/it] 13%|█▎ | 4476/34278 [5:03:09<33:46:46, 4.08s/it] {'loss': 0.1939, 'grad_norm': 0.721596082019839, 'learning_rate': 9.737141118027791e-06, 'epoch': 0.13} 13%|█▎ | 4476/34278 [5:03:09<33:46:46, 4.08s/it] 13%|█▎ | 4477/34278 [5:03:13<31:31:46, 3.81s/it] {'loss': 0.2028, 'grad_norm': 0.8360220237465885, 'learning_rate': 9.736989932892564e-06, 'epoch': 0.13} 13%|█▎ | 4477/34278 [5:03:13<31:31:46, 3.81s/it] 13%|█▎ | 4478/34278 [5:03:16<29:14:14, 3.53s/it] {'loss': 0.1842, 'grad_norm': 1.0491231446619385, 'learning_rate': 9.73683870546662e-06, 'epoch': 0.13} 13%|█▎ | 4478/34278 [5:03:16<29:14:14, 3.53s/it] 13%|█▎ | 4479/34278 [5:03:19<28:20:52, 3.42s/it] {'loss': 0.1732, 'grad_norm': 0.8288794621008557, 'learning_rate': 9.736687435751311e-06, 
'epoch': 0.13} 13%|█▎ | 4479/34278 [5:03:19<28:20:52, 3.42s/it] 13%|█▎ | 4480/34278 [5:03:22<27:28:17, 3.32s/it] {'loss': 0.1645, 'grad_norm': 0.9218598572374194, 'learning_rate': 9.736536123747989e-06, 'epoch': 0.13} 13%|█▎ | 4480/34278 [5:03:22<27:28:17, 3.32s/it] 13%|█▎ | 4481/34278 [5:03:27<31:15:05, 3.78s/it] {'loss': 0.1692, 'grad_norm': 0.8530519074164498, 'learning_rate': 9.736384769458e-06, 'epoch': 0.13} 13%|█▎ | 4481/34278 [5:03:27<31:15:05, 3.78s/it] 13%|█▎ | 4482/34278 [5:03:30<31:31:07, 3.81s/it] {'loss': 0.1797, 'grad_norm': 0.8975258785313561, 'learning_rate': 9.736233372882701e-06, 'epoch': 0.13} 13%|█▎ | 4482/34278 [5:03:31<31:31:07, 3.81s/it] 13%|█▎ | 4483/34278 [5:03:37<37:18:12, 4.51s/it] {'loss': 0.184, 'grad_norm': 0.7132316423037498, 'learning_rate': 9.73608193402344e-06, 'epoch': 0.13} 13%|█▎ | 4483/34278 [5:03:37<37:18:12, 4.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 13%|█▎ | 4484/34278 [5:03:41<36:39:51, 4.43s/it] {'loss': 0.1705, 'grad_norm': 0.6603410548513219, 'learning_rate': 9.735930452881571e-06, 'epoch': 0.13} 13%|█▎ | 4484/34278 [5:03:41<36:39:51, 4.43s/it] 13%|█▎ | 4485/34278 [5:03:46<39:11:30, 4.74s/it] {'loss': 0.1967, 'grad_norm': 0.9550476682958892, 'learning_rate': 9.735778929458446e-06, 'epoch': 0.13} 13%|█▎ | 4485/34278 [5:03:46<39:11:30, 4.74s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4486/34278 [5:03:51<38:46:57, 4.69s/it] {'loss': 0.1805, 'grad_norm': 0.9924511364030971, 'learning_rate': 9.735627363755415e-06, 'epoch': 0.13} 13%|█▎ | 4486/34278 [5:03:51<38:46:57, 4.69s/it] 13%|█▎ | 4487/34278 [5:03:55<36:34:36, 4.42s/it] {'loss': 0.1785, 'grad_norm': 0.887661578594931, 'learning_rate': 9.735475755773836e-06, 'epoch': 0.13} 13%|█▎ | 4487/34278 [5:03:55<36:34:36, 4.42s/it] 13%|█▎ | 4488/34278 [5:03:59<35:43:35, 4.32s/it] {'loss': 0.1823, 'grad_norm': 0.7651320938224379, 'learning_rate': 9.735324105515059e-06, 'epoch': 0.13} 13%|█▎ | 4488/34278 [5:03:59<35:43:35, 4.32s/it] 13%|█▎ | 4489/34278 [5:04:02<32:34:31, 3.94s/it] {'loss': 0.1636, 'grad_norm': 0.7714400056966344, 'learning_rate': 9.735172412980439e-06, 'epoch': 0.13} 13%|█▎ | 4489/34278 [5:04:02<32:34:31, 3.94s/it] 13%|█▎ | 4490/34278 [5:04:08<37:13:08, 4.50s/it] {'loss': 0.1729, 'grad_norm': 0.8132703483276422, 'learning_rate': 9.735020678171327e-06, 'epoch': 0.13} 13%|█▎ | 4490/34278 [5:04:08<37:13:08, 4.50s/it] 13%|█▎ | 4491/34278 [5:04:11<33:24:27, 4.04s/it] {'loss': 0.1672, 'grad_norm': 0.7972538544908403, 'learning_rate': 9.734868901089084e-06, 'epoch': 0.13} 13%|█▎ | 4491/34278 [5:04:11<33:24:27, 4.04s/it] 13%|█▎ | 4492/34278 [5:04:17<38:14:28, 4.62s/it] {'loss': 0.1592, 'grad_norm': 0.8735972701058112, 'learning_rate': 9.734717081735061e-06, 'epoch': 0.13} 13%|█▎ | 4492/34278 [5:04:17<38:14:28, 4.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4493/34278 [5:04:20<35:36:39, 4.30s/it] {'loss': 0.1874, 'grad_norm': 0.9802666760505124, 'learning_rate': 9.734565220110614e-06, 'epoch': 0.13} 13%|█▎ | 4493/34278 [5:04:20<35:36:39, 4.30s/it] 13%|█▎ | 4494/34278 [5:04:24<33:41:01, 4.07s/it] {'loss': 0.1914, 'grad_norm': 0.9545295838676167, 'learning_rate': 9.7344133162171e-06, 'epoch': 0.13} 13%|█▎ | 4494/34278 [5:04:24<33:41:01, 4.07s/it] 13%|█▎ | 4495/34278 [5:04:27<31:36:22, 3.82s/it] {'loss': 0.1794, 'grad_norm': 0.7092996313604574, 'learning_rate': 9.734261370055873e-06, 'epoch': 0.13} 13%|█▎ | 4495/34278 [5:04:27<31:36:22, 3.82s/it] 13%|█▎ | 4496/34278 [5:04:30<29:18:25, 3.54s/it] {'loss': 0.1775, 'grad_norm': 0.8287097252576615, 'learning_rate': 9.734109381628289e-06, 'epoch': 0.13} 13%|█▎ | 4496/34278 [5:04:30<29:18:25, 3.54s/it] 13%|█▎ | 4497/34278 [5:04:36<35:11:56, 4.25s/it] {'loss': 0.1621, 'grad_norm': 0.9295568582263029, 'learning_rate': 9.73395735093571e-06, 'epoch': 0.13} 13%|█▎ | 4497/34278 [5:04:36<35:11:56, 4.25s/it] 13%|█▎ | 4498/34278 [5:04:39<31:38:29, 3.83s/it] {'loss': 0.1803, 'grad_norm': 0.7091365935560728, 'learning_rate': 9.733805277979488e-06, 'epoch': 0.13} 13%|█▎ | 4498/34278 [5:04:39<31:38:29, 3.83s/it] 13%|█▎ | 4499/34278 [5:04:42<30:02:30, 3.63s/it] {'loss': 0.1791, 'grad_norm': 0.8390669649611711, 'learning_rate': 9.733653162760984e-06, 'epoch': 0.13} 13%|█▎ | 4499/34278 [5:04:42<30:02:30, 3.63s/it] 13%|█▎ | 4500/34278 [5:04:45<29:02:49, 3.51s/it] {'loss': 0.1748, 'grad_norm': 0.9664208370062155, 'learning_rate': 9.733501005281552e-06, 'epoch': 0.13} 13%|█▎ | 4500/34278 [5:04:45<29:02:49, 3.51s/it] 13%|█▎ | 4501/34278 [5:04:50<33:33:59, 4.06s/it] {'loss': 0.198, 'grad_norm': 0.8713257616137842, 'learning_rate': 9.733348805542555e-06, 'epoch': 0.13} 13%|█▎ | 4501/34278 [5:04:50<33:33:59, 4.06s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: 
None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 13%|█▎ | 4502/34278 [5:04:53<30:13:07, 3.65s/it] {'loss': 0.166, 'grad_norm': 1.0050067027036291, 'learning_rate': 9.73319656354535e-06, 'epoch': 0.13} 13%|█▎ | 4502/34278 [5:04:53<30:13:07, 3.65s/it] 13%|█▎ | 4503/34278 [5:04:56<28:42:56, 3.47s/it] {'loss': 0.1673, 'grad_norm': 0.8355747823438511, 'learning_rate': 9.733044279291293e-06, 'epoch': 0.13} 13%|█▎ | 4503/34278 [5:04:56<28:42:56, 3.47s/it] 13%|█▎ | 4504/34278 [5:05:00<29:58:08, 3.62s/it] {'loss': 0.1845, 'grad_norm': 0.7444021287462947, 'learning_rate': 9.73289195278175e-06, 'epoch': 0.13} 13%|█▎ | 4504/34278 [5:05:00<29:58:08, 3.62s/it] 13%|█▎ | 4505/34278 [5:05:03<28:27:54, 3.44s/it] {'loss': 0.181, 'grad_norm': 0.9151521681337017, 'learning_rate': 9.732739584018074e-06, 'epoch': 0.13} 13%|█▎ | 4505/34278 [5:05:03<28:27:54, 3.44s/it] 13%|█▎ | 4506/34278 [5:05:06<27:14:49, 3.29s/it] {'loss': 0.1592, 'grad_norm': 0.7972987826597281, 'learning_rate': 9.732587173001631e-06, 'epoch': 0.13} 13%|█▎ | 4506/34278 [5:05:06<27:14:49, 3.29s/it] 13%|█▎ | 4507/34278 [5:05:09<26:40:47, 3.23s/it] {'loss': 0.1704, 'grad_norm': 0.8646593079278234, 'learning_rate': 9.732434719733782e-06, 'epoch': 0.13} 13%|█▎ | 4507/34278 [5:05:09<26:40:47, 3.23s/it] 13%|█▎ | 4508/34278 [5:05:12<26:30:40, 3.21s/it] {'loss': 0.1901, 'grad_norm': 0.8241084619506681, 'learning_rate': 9.732282224215881e-06, 'epoch': 0.13} 13%|█▎ | 4508/34278 [5:05:12<26:30:40, 3.21s/it] 13%|█▎ | 4509/34278 [5:05:16<27:55:34, 3.38s/it] {'loss': 0.1651, 'grad_norm': 0.9892414621713331, 'learning_rate': 9.732129686449296e-06, 'epoch': 0.13} 13%|█▎ | 4509/34278 [5:05:16<27:55:34, 3.38s/it] 13%|█▎ | 4510/34278 [5:05:20<29:10:47, 3.53s/it] {'loss': 0.188, 'grad_norm': 0.9819990085102668, 'learning_rate': 9.731977106435387e-06, 'epoch': 0.13} 13%|█▎ | 4510/34278 [5:05:20<29:10:47, 3.53s/it] 13%|█▎ | 4511/34278 [5:05:23<28:58:05, 3.50s/it] {'loss': 0.1876, 'grad_norm': 
0.9171421502044307, 'learning_rate': 9.731824484175516e-06, 'epoch': 0.13} 13%|█▎ | 4511/34278 [5:05:23<28:58:05, 3.50s/it] 13%|█▎ | 4512/34278 [5:05:27<28:56:56, 3.50s/it] {'loss': 0.179, 'grad_norm': 0.8218702361406134, 'learning_rate': 9.731671819671045e-06, 'epoch': 0.13} 13%|█▎ | 4512/34278 [5:05:27<28:56:56, 3.50s/it] 13%|█▎ | 4513/34278 [5:05:33<35:17:31, 4.27s/it] {'loss': 0.2002, 'grad_norm': 1.2322820415795661, 'learning_rate': 9.731519112923338e-06, 'epoch': 0.13} 13%|█▎ | 4513/34278 [5:05:33<35:17:31, 4.27s/it] 13%|█▎ | 4514/34278 [5:05:37<35:51:33, 4.34s/it] {'loss': 0.1608, 'grad_norm': 0.9227080892573732, 'learning_rate': 9.731366363933759e-06, 'epoch': 0.13} 13%|█▎ | 4514/34278 [5:05:37<35:51:33, 4.34s/it] 13%|█▎ | 4515/34278 [5:05:41<33:55:30, 4.10s/it] {'loss': 0.1818, 'grad_norm': 0.9087240830849821, 'learning_rate': 9.731213572703668e-06, 'epoch': 0.13} 13%|█▎ | 4515/34278 [5:05:41<33:55:30, 4.10s/it] 13%|█▎ | 4516/34278 [5:05:44<30:47:29, 3.72s/it] {'loss': 0.1703, 'grad_norm': 0.8235324858729057, 'learning_rate': 9.731060739234433e-06, 'epoch': 0.13} 13%|█▎ | 4516/34278 [5:05:44<30:47:29, 3.72s/it] 13%|█▎ | 4517/34278 [5:05:47<29:13:19, 3.53s/it] {'loss': 0.1648, 'grad_norm': 0.9362396933008768, 'learning_rate': 9.730907863527417e-06, 'epoch': 0.13} 13%|█▎ | 4517/34278 [5:05:47<29:13:19, 3.53s/it] 13%|█▎ | 4518/34278 [5:05:53<35:26:56, 4.29s/it] {'loss': 0.1865, 'grad_norm': 1.1912283627279456, 'learning_rate': 9.730754945583985e-06, 'epoch': 0.13} 13%|█▎ | 4518/34278 [5:05:53<35:26:56, 4.29s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4519/34278 [5:05:56<32:54:12, 3.98s/it] {'loss': 0.1914, 'grad_norm': 0.8935139894178387, 'learning_rate': 9.730601985405502e-06, 'epoch': 0.13} 13%|█▎ | 4519/34278 [5:05:56<32:54:12, 3.98s/it] 13%|█▎ | 4520/34278 [5:06:00<32:13:54, 3.90s/it] {'loss': 0.184, 'grad_norm': 0.973250479275411, 'learning_rate': 9.730448982993335e-06, 'epoch': 0.13} 13%|█▎ | 4520/34278 [5:06:00<32:13:54, 3.90s/it] 13%|█▎ | 4521/34278 [5:06:03<29:39:44, 3.59s/it] {'loss': 0.1975, 'grad_norm': 0.9207900746444148, 'learning_rate': 9.730295938348847e-06, 'epoch': 0.13} 13%|█▎ | 4521/34278 [5:06:03<29:39:44, 3.59s/it] 13%|█▎ | 4522/34278 [5:06:06<29:03:06, 3.51s/it] {'loss': 0.1571, 'grad_norm': 0.783714243575158, 'learning_rate': 9.730142851473407e-06, 'epoch': 0.13} 13%|█▎ | 4522/34278 [5:06:06<29:03:06, 3.51s/it] 13%|█▎ | 4523/34278 [5:06:10<30:21:06, 3.67s/it] {'loss': 0.1912, 'grad_norm': 0.9324602659366575, 'learning_rate': 9.729989722368381e-06, 'epoch': 0.13} 13%|█▎ | 4523/34278 [5:06:10<30:21:06, 3.67s/it] 13%|█▎ | 4524/34278 [5:06:14<29:52:53, 3.62s/it] {'loss': 0.1559, 'grad_norm': 0.808382609467356, 'learning_rate': 9.729836551035134e-06, 'epoch': 0.13} 13%|█▎ | 4524/34278 [5:06:14<29:52:53, 3.62s/it] 13%|█▎ | 4525/34278 [5:06:18<31:06:06, 3.76s/it] {'loss': 0.1716, 'grad_norm': 0.7921957445975134, 'learning_rate': 9.729683337475037e-06, 'epoch': 0.13} 13%|█▎ | 4525/34278 [5:06:18<31:06:06, 3.76s/it] 13%|█▎ | 4526/34278 [5:06:22<31:52:19, 3.86s/it] {'loss': 0.1857, 'grad_norm': 0.8852867994342315, 'learning_rate': 9.729530081689456e-06, 'epoch': 0.13} 13%|█▎ | 4526/34278 [5:06:22<31:52:19, 3.86s/it] 13%|█▎ | 4527/34278 [5:06:24<29:02:43, 3.51s/it] {'loss': 0.1931, 'grad_norm': 0.8749446727151079, 'learning_rate': 9.72937678367976e-06, 'epoch': 0.13} 13%|█▎ | 4527/34278 [5:06:24<29:02:43, 3.51s/it] 13%|█▎ | 4528/34278 [5:06:28<29:08:07, 3.53s/it] {'loss': 0.1641, 'grad_norm': 0.7229377908847272, 'learning_rate': 9.729223443447318e-06, 
'epoch': 0.13} 13%|█▎ | 4528/34278 [5:06:28<29:08:07, 3.53s/it] 13%|█▎ | 4529/34278 [5:06:31<28:39:31, 3.47s/it] {'loss': 0.1791, 'grad_norm': 0.9158433909304731, 'learning_rate': 9.729070060993495e-06, 'epoch': 0.13} 13%|█▎ | 4529/34278 [5:06:31<28:39:31, 3.47s/it] 13%|█▎ | 4530/34278 [5:06:35<27:55:58, 3.38s/it] {'loss': 0.1791, 'grad_norm': 0.8377665759811016, 'learning_rate': 9.728916636319666e-06, 'epoch': 0.13} 13%|█▎ | 4530/34278 [5:06:35<27:55:58, 3.38s/it] 13%|█▎ | 4531/34278 [5:06:38<28:49:52, 3.49s/it] {'loss': 0.1618, 'grad_norm': 0.6941753010427415, 'learning_rate': 9.728763169427197e-06, 'epoch': 0.13} 13%|█▎ | 4531/34278 [5:06:38<28:49:52, 3.49s/it] 13%|█▎ | 4532/34278 [5:06:44<35:06:09, 4.25s/it] {'loss': 0.1604, 'grad_norm': 0.8049438878497239, 'learning_rate': 9.72860966031746e-06, 'epoch': 0.13} 13%|█▎ | 4532/34278 [5:06:44<35:06:09, 4.25s/it] 13%|█▎ | 4533/34278 [5:06:48<32:35:06, 3.94s/it] {'loss': 0.1704, 'grad_norm': 0.8760066339377347, 'learning_rate': 9.728456108991824e-06, 'epoch': 0.13} 13%|█▎ | 4533/34278 [5:06:48<32:35:06, 3.94s/it] 13%|█▎ | 4534/34278 [5:06:50<30:03:27, 3.64s/it] {'loss': 0.1493, 'grad_norm': 0.7424387588543098, 'learning_rate': 9.728302515451661e-06, 'epoch': 0.13} 13%|█▎ | 4534/34278 [5:06:50<30:03:27, 3.64s/it] 13%|█▎ | 4535/34278 [5:06:54<29:46:53, 3.60s/it] {'loss': 0.1796, 'grad_norm': 0.9813900623889721, 'learning_rate': 9.728148879698341e-06, 'epoch': 0.13} 13%|█▎ | 4535/34278 [5:06:54<29:46:53, 3.60s/it] 13%|█▎ | 4536/34278 [5:06:57<28:50:58, 3.49s/it] {'loss': 0.1783, 'grad_norm': 0.8126444801411503, 'learning_rate': 9.727995201733238e-06, 'epoch': 0.13} 13%|█▎ | 4536/34278 [5:06:57<28:50:58, 3.49s/it] 13%|█▎ | 4537/34278 [5:07:03<34:58:51, 4.23s/it] {'loss': 0.2128, 'grad_norm': 0.8546638488854917, 'learning_rate': 9.727841481557722e-06, 'epoch': 0.13} 13%|█▎ | 4537/34278 [5:07:03<34:58:51, 4.23s/it] 13%|█▎ | 4538/34278 [5:07:09<39:31:29, 4.78s/it] {'loss': 0.1565, 'grad_norm': 0.6899971193858305, 
'learning_rate': 9.727687719173164e-06, 'epoch': 0.13} 13%|█▎ | 4538/34278 [5:07:09<39:31:29, 4.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 13%|█▎ | 4539/34278 [5:07:13<36:08:18, 4.37s/it] {'loss': 0.1578, 'grad_norm': 1.0858397714441617, 'learning_rate': 9.727533914580941e-06, 'epoch': 0.13} 13%|█▎ | 4539/34278 [5:07:13<36:08:18, 4.37s/it] 13%|█▎ | 4540/34278 [5:07:19<39:44:01, 4.81s/it] {'loss': 0.1679, 'grad_norm': 0.9089000914262344, 'learning_rate': 9.727380067782424e-06, 'epoch': 0.13} 13%|█▎ | 4540/34278 [5:07:19<39:44:01, 4.81s/it] 13%|█▎ | 4541/34278 [5:07:21<34:53:02, 4.22s/it] {'loss': 0.1622, 'grad_norm': 0.9199188357334842, 'learning_rate': 9.727226178778985e-06, 'epoch': 0.13} 13%|█▎ | 4541/34278 [5:07:21<34:53:02, 4.22s/it] 13%|█▎ | 4542/34278 [5:07:25<32:47:34, 3.97s/it] {'loss': 0.1826, 'grad_norm': 1.0201509071058543, 'learning_rate': 9.727072247572e-06, 'epoch': 0.13} 13%|█▎ | 4542/34278 [5:07:25<32:47:34, 3.97s/it] 13%|█▎ | 4543/34278 [5:07:28<30:01:19, 3.63s/it] {'loss': 0.1627, 'grad_norm': 0.8936142567745634, 'learning_rate': 9.726918274162841e-06, 'epoch': 0.13} 13%|█▎ | 4543/34278 [5:07:28<30:01:19, 3.63s/it] 13%|█▎ | 4544/34278 [5:07:31<28:29:58, 3.45s/it] {'loss': 0.1519, 'grad_norm': 0.8517450630339168, 'learning_rate': 9.726764258552885e-06, 'epoch': 0.13} 13%|█▎ | 4544/34278 [5:07:31<28:29:58, 3.45s/it] 13%|█▎ | 4545/34278 [5:07:33<26:59:45, 3.27s/it] {'loss': 0.1757, 'grad_norm': 0.9178315720289905, 'learning_rate': 9.726610200743505e-06, 'epoch': 0.13} 13%|█▎ | 4545/34278 [5:07:33<26:59:45, 3.27s/it] 13%|█▎ | 4546/34278 [5:07:36<25:56:51, 3.14s/it] {'loss': 0.1758, 'grad_norm': 1.0190663322135842, 'learning_rate': 9.726456100736079e-06, 'epoch': 0.13} 13%|█▎ | 4546/34278 [5:07:36<25:56:51, 3.14s/it] 13%|█▎ | 4547/34278 [5:07:42<31:30:21, 3.81s/it] 
{'loss': 0.2015, 'grad_norm': 1.0820984922683805, 'learning_rate': 9.72630195853198e-06, 'epoch': 0.13} 13%|█▎ | 4547/34278 [5:07:42<31:30:21, 3.81s/it] 13%|█▎ | 4548/34278 [5:07:46<32:01:52, 3.88s/it] {'loss': 0.1603, 'grad_norm': 1.0041173467043631, 'learning_rate': 9.726147774132588e-06, 'epoch': 0.13} 13%|█▎ | 4548/34278 [5:07:46<32:01:52, 3.88s/it] 13%|█▎ | 4549/34278 [5:07:50<33:46:55, 4.09s/it] {'loss': 0.1822, 'grad_norm': 0.8375708058908469, 'learning_rate': 9.725993547539274e-06, 'epoch': 0.13} 13%|█▎ | 4549/34278 [5:07:50<33:46:55, 4.09s/it] 13%|█▎ | 4550/34278 [5:07:53<31:09:57, 3.77s/it] {'loss': 0.1669, 'grad_norm': 1.068467284790401, 'learning_rate': 9.72583927875342e-06, 'epoch': 0.13} 13%|█▎ | 4550/34278 [5:07:53<31:09:57, 3.77s/it] 13%|█▎ | 4551/34278 [5:07:58<34:01:40, 4.12s/it] {'loss': 0.1908, 'grad_norm': 0.6462988019926467, 'learning_rate': 9.725684967776398e-06, 'epoch': 0.13} 13%|█▎ | 4551/34278 [5:07:58<34:01:40, 4.12s/it] 13%|█▎ | 4552/34278 [5:08:01<31:13:17, 3.78s/it] {'loss': 0.187, 'grad_norm': 0.9865048607505865, 'learning_rate': 9.725530614609592e-06, 'epoch': 0.13} 13%|█▎ | 4552/34278 [5:08:01<31:13:17, 3.78s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9900 > 8192). 
Running this sequence through the model will result in indexing errors 13%|█▎ | 4553/34278 [5:08:05<31:07:58, 3.77s/it] {'loss': 0.1827, 'grad_norm': 0.9761082259552106, 'learning_rate': 9.725376219254374e-06, 'epoch': 0.13} 13%|█▎ | 4553/34278 [5:08:05<31:07:58, 3.77s/it] 13%|█▎ | 4554/34278 [5:08:08<29:26:54, 3.57s/it] {'loss': 0.1714, 'grad_norm': 0.7043053307124566, 'learning_rate': 9.725221781712128e-06, 'epoch': 0.13} 13%|█▎ | 4554/34278 [5:08:08<29:26:54, 3.57s/it] 13%|█▎ | 4555/34278 [5:08:11<27:48:28, 3.37s/it] {'loss': 0.1495, 'grad_norm': 0.7991076445944068, 'learning_rate': 9.725067301984228e-06, 'epoch': 0.13} 13%|█▎ | 4555/34278 [5:08:11<27:48:28, 3.37s/it] 13%|█▎ | 4556/34278 [5:08:15<28:28:57, 3.45s/it] {'loss': 0.186, 'grad_norm': 0.8925024714989009, 'learning_rate': 9.724912780072055e-06, 'epoch': 0.13} 13%|█▎ | 4556/34278 [5:08:15<28:28:57, 3.45s/it] 13%|█▎ | 4557/34278 [5:08:18<27:34:15, 3.34s/it] {'loss': 0.1866, 'grad_norm': 0.9022041240709835, 'learning_rate': 9.72475821597699e-06, 'epoch': 0.13} 13%|█▎ | 4557/34278 [5:08:18<27:34:15, 3.34s/it] 13%|█▎ | 4558/34278 [5:08:21<26:56:26, 3.26s/it] {'loss': 0.1978, 'grad_norm': 0.8633724589768932, 'learning_rate': 9.724603609700409e-06, 'epoch': 0.13} 13%|█▎ | 4558/34278 [5:08:21<26:56:26, 3.26s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8336 > 8192). 
Running this sequence through the model will result in indexing errors 13%|█▎ | 4559/34278 [5:08:25<28:56:28, 3.51s/it] {'loss': 0.1618, 'grad_norm': 0.9573618809816482, 'learning_rate': 9.724448961243698e-06, 'epoch': 0.13} 13%|█▎ | 4559/34278 [5:08:25<28:56:28, 3.51s/it] 13%|█▎ | 4560/34278 [5:08:28<28:51:11, 3.50s/it] {'loss': 0.1531, 'grad_norm': 0.9401676823193406, 'learning_rate': 9.724294270608232e-06, 'epoch': 0.13} 13%|█▎ | 4560/34278 [5:08:28<28:51:11, 3.50s/it] 13%|█▎ | 4561/34278 [5:08:32<28:26:49, 3.45s/it] {'loss': 0.1691, 'grad_norm': 0.7827004137553876, 'learning_rate': 9.724139537795396e-06, 'epoch': 0.13} 13%|█▎ | 4561/34278 [5:08:32<28:26:49, 3.45s/it] 13%|█▎ | 4562/34278 [5:08:36<29:54:05, 3.62s/it] {'loss': 0.1893, 'grad_norm': 0.8200385903586284, 'learning_rate': 9.72398476280657e-06, 'epoch': 0.13} 13%|█▎ | 4562/34278 [5:08:36<29:54:05, 3.62s/it] 13%|█▎ | 4563/34278 [5:08:40<30:30:30, 3.70s/it] {'loss': 0.1903, 'grad_norm': 0.9458018411150592, 'learning_rate': 9.723829945643135e-06, 'epoch': 0.13} 13%|█▎ | 4563/34278 [5:08:40<30:30:30, 3.70s/it] 13%|█▎ | 4564/34278 [5:08:42<28:19:33, 3.43s/it] {'loss': 0.1685, 'grad_norm': 0.9489788729222979, 'learning_rate': 9.723675086306474e-06, 'epoch': 0.13} 13%|█▎ | 4564/34278 [5:08:42<28:19:33, 3.43s/it] 13%|█▎ | 4565/34278 [5:08:46<29:20:44, 3.56s/it] {'loss': 0.1712, 'grad_norm': 0.935321272037622, 'learning_rate': 9.72352018479797e-06, 'epoch': 0.13} 13%|█▎ | 4565/34278 [5:08:46<29:20:44, 3.56s/it] 13%|█▎ | 4566/34278 [5:08:49<28:10:29, 3.41s/it] {'loss': 0.1975, 'grad_norm': 1.0357709357010814, 'learning_rate': 9.723365241119004e-06, 'epoch': 0.13} 13%|█▎ | 4566/34278 [5:08:49<28:10:29, 3.41s/it] 13%|█▎ | 4567/34278 [5:08:53<27:49:06, 3.37s/it] {'loss': 0.1677, 'grad_norm': 1.2735266730368744, 'learning_rate': 9.723210255270962e-06, 'epoch': 0.13} 13%|█▎ | 4567/34278 [5:08:53<27:49:06, 3.37s/it] 13%|█▎ | 4568/34278 [5:08:59<34:35:44, 4.19s/it] {'loss': 0.1783, 'grad_norm': 0.9795401224102253, 
'learning_rate': 9.723055227255227e-06, 'epoch': 0.13} 13%|█▎ | 4568/34278 [5:08:59<34:35:44, 4.19s/it] 13%|█▎ | 4569/34278 [5:09:02<32:42:50, 3.96s/it] {'loss': 0.1731, 'grad_norm': 1.169361910063188, 'learning_rate': 9.722900157073181e-06, 'epoch': 0.13} 13%|█▎ | 4569/34278 [5:09:02<32:42:50, 3.96s/it] 13%|█▎ | 4570/34278 [5:09:06<32:19:32, 3.92s/it] {'loss': 0.1732, 'grad_norm': 0.7712404266060171, 'learning_rate': 9.72274504472621e-06, 'epoch': 0.13} 13%|█▎ | 4570/34278 [5:09:06<32:19:32, 3.92s/it] 13%|█▎ | 4571/34278 [5:09:09<30:09:49, 3.66s/it] {'loss': 0.201, 'grad_norm': 0.9771102635200523, 'learning_rate': 9.722589890215699e-06, 'epoch': 0.13} 13%|█▎ | 4571/34278 [5:09:09<30:09:49, 3.66s/it] 13%|█▎ | 4572/34278 [5:09:15<35:27:28, 4.30s/it] {'loss': 0.1674, 'grad_norm': 0.9443789669268178, 'learning_rate': 9.722434693543032e-06, 'epoch': 0.13} 13%|█▎ | 4572/34278 [5:09:15<35:27:28, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 13%|█▎ | 4573/34278 [5:09:18<31:53:06, 3.86s/it] {'loss': 0.1968, 'grad_norm': 1.069382026402891, 'learning_rate': 9.722279454709596e-06, 'epoch': 0.13} 13%|█▎ | 4573/34278 [5:09:18<31:53:06, 3.86s/it] 13%|█▎ | 4574/34278 [5:09:21<30:37:06, 3.71s/it] {'loss': 0.1645, 'grad_norm': 1.0769538937358414, 'learning_rate': 9.722124173716776e-06, 'epoch': 0.13} 13%|█▎ | 4574/34278 [5:09:21<30:37:06, 3.71s/it] 13%|█▎ | 4575/34278 [5:09:25<30:38:13, 3.71s/it] {'loss': 0.1918, 'grad_norm': 0.923235470376459, 'learning_rate': 9.72196885056596e-06, 'epoch': 0.13} 13%|█▎ | 4575/34278 [5:09:25<30:38:13, 3.71s/it] 13%|█▎ | 4576/34278 [5:09:28<28:58:54, 3.51s/it] {'loss': 0.1563, 'grad_norm': 0.7941395869754817, 'learning_rate': 9.721813485258533e-06, 'epoch': 0.13} 13%|█▎ | 4576/34278 [5:09:28<28:58:54, 3.51s/it] 13%|█▎ | 4577/34278 [5:09:32<30:02:27, 3.64s/it] {'loss': 0.1865, 'grad_norm': 1.0700898790958382, 'learning_rate': 9.72165807779588e-06, 'epoch': 0.13} 13%|█▎ | 4577/34278 [5:09:32<30:02:27, 3.64s/it] 13%|█▎ | 4578/34278 [5:09:35<30:01:54, 3.64s/it] {'loss': 0.1802, 'grad_norm': 0.9833498497939681, 'learning_rate': 9.721502628179394e-06, 'epoch': 0.13} 13%|█▎ | 4578/34278 [5:09:35<30:01:54, 3.64s/it] 13%|█▎ | 4579/34278 [5:09:40<31:55:06, 3.87s/it] {'loss': 0.1633, 'grad_norm': 0.944078838232898, 'learning_rate': 9.721347136410458e-06, 'epoch': 0.13} 13%|█▎ | 4579/34278 [5:09:40<31:55:06, 3.87s/it] 13%|█▎ | 4580/34278 [5:09:45<35:21:25, 4.29s/it] {'loss': 0.1904, 'grad_norm': 1.1012213330837473, 'learning_rate': 9.721191602490463e-06, 'epoch': 0.13} 13%|█▎ | 4580/34278 [5:09:45<35:21:25, 4.29s/it] 13%|█▎ | 4581/34278 [5:09:50<36:41:56, 4.45s/it] {'loss': 0.1858, 'grad_norm': 1.1939507249414758, 'learning_rate': 9.721036026420795e-06, 'epoch': 0.13} 13%|█▎ | 4581/34278 [5:09:50<36:41:56, 4.45s/it] 13%|█▎ | 4582/34278 [5:09:53<33:03:26, 4.01s/it] {'loss': 0.1618, 'grad_norm': 1.0069483261326841, 'learning_rate': 9.720880408202844e-06, 
'epoch': 0.13} 13%|█▎ | 4582/34278 [5:09:53<33:03:26, 4.01s/it] 13%|█▎ | 4583/34278 [5:09:59<38:00:34, 4.61s/it] {'loss': 0.1894, 'grad_norm': 1.124695251461858, 'learning_rate': 9.720724747838002e-06, 'epoch': 0.13} 13%|█▎ | 4583/34278 [5:09:59<38:00:34, 4.61s/it] 13%|█▎ | 4584/34278 [5:10:02<34:25:05, 4.17s/it] {'loss': 0.15, 'grad_norm': 1.2071403039976019, 'learning_rate': 9.720569045327655e-06, 'epoch': 0.13} 13%|█▎ | 4584/34278 [5:10:02<34:25:05, 4.17s/it] 13%|█▎ | 4585/34278 [5:10:05<32:01:09, 3.88s/it] {'loss': 0.1794, 'grad_norm': 0.8810175876828813, 'learning_rate': 9.720413300673194e-06, 'epoch': 0.13} 13%|█▎ | 4585/34278 [5:10:05<32:01:09, 3.88s/it] 13%|█▎ | 4586/34278 [5:10:11<37:04:01, 4.49s/it] {'loss': 0.213, 'grad_norm': 1.0453791704302176, 'learning_rate': 9.72025751387601e-06, 'epoch': 0.13} 13%|█▎ | 4586/34278 [5:10:11<37:04:01, 4.49s/it] 13%|█▎ | 4587/34278 [5:10:14<33:33:10, 4.07s/it] {'loss': 0.2021, 'grad_norm': 0.927080338413851, 'learning_rate': 9.720101684937494e-06, 'epoch': 0.13} 13%|█▎ | 4587/34278 [5:10:14<33:33:10, 4.07s/it] 13%|█▎ | 4588/34278 [5:10:17<30:34:53, 3.71s/it] {'loss': 0.1872, 'grad_norm': 0.779984080753791, 'learning_rate': 9.719945813859037e-06, 'epoch': 0.13} 13%|█▎ | 4588/34278 [5:10:17<30:34:53, 3.71s/it] 13%|█▎ | 4589/34278 [5:10:20<28:52:42, 3.50s/it] {'loss': 0.1697, 'grad_norm': 0.7996838449590193, 'learning_rate': 9.719789900642031e-06, 'epoch': 0.13} 13%|█▎ | 4589/34278 [5:10:20<28:52:42, 3.50s/it] 13%|█▎ | 4590/34278 [5:10:24<28:51:42, 3.50s/it] {'loss': 0.19, 'grad_norm': 1.0471543096218647, 'learning_rate': 9.719633945287867e-06, 'epoch': 0.13} 13%|█▎ | 4590/34278 [5:10:24<28:51:42, 3.50s/it] 13%|█▎ | 4591/34278 [5:10:26<27:05:48, 3.29s/it] {'loss': 0.1916, 'grad_norm': 0.8589292071998451, 'learning_rate': 9.719477947797938e-06, 'epoch': 0.13} 13%|█▎ | 4591/34278 [5:10:26<27:05:48, 3.29s/it] 13%|█▎ | 4592/34278 [5:10:29<26:00:31, 3.15s/it] {'loss': 0.1639, 'grad_norm': 0.9015449264921895, 'learning_rate': 
9.719321908173636e-06, 'epoch': 0.13} 13%|█▎ | 4592/34278 [5:10:29<26:00:31, 3.15s/it] 13%|█▎ | 4593/34278 [5:10:33<28:48:25, 3.49s/it] {'loss': 0.17, 'grad_norm': 0.7964675837696863, 'learning_rate': 9.719165826416354e-06, 'epoch': 0.13} 13%|█▎ | 4593/34278 [5:10:33<28:48:25, 3.49s/it] 13%|█▎ | 4594/34278 [5:10:40<35:06:28, 4.26s/it] {'loss': 0.1807, 'grad_norm': 0.7635652503837955, 'learning_rate': 9.719009702527488e-06, 'epoch': 0.13} 13%|█▎ | 4594/34278 [5:10:40<35:06:28, 4.26s/it] 13%|█▎ | 4595/34278 [5:10:43<34:09:38, 4.14s/it] {'loss': 0.1802, 'grad_norm': 0.8480596163432714, 'learning_rate': 9.718853536508428e-06, 'epoch': 0.13} 13%|█▎ | 4595/34278 [5:10:43<34:09:38, 4.14s/it] 13%|█▎ | 4596/34278 [5:10:46<31:00:26, 3.76s/it] {'loss': 0.1766, 'grad_norm': 0.9808208161559423, 'learning_rate': 9.718697328360571e-06, 'epoch': 0.13} 13%|█▎ | 4596/34278 [5:10:46<31:00:26, 3.76s/it] 13%|█▎ | 4597/34278 [5:10:50<29:57:12, 3.63s/it] {'loss': 0.1716, 'grad_norm': 0.7814399820174374, 'learning_rate': 9.71854107808531e-06, 'epoch': 0.13} 13%|█▎ | 4597/34278 [5:10:50<29:57:12, 3.63s/it] 13%|█▎ | 4598/34278 [5:10:53<28:32:39, 3.46s/it] {'loss': 0.19, 'grad_norm': 1.0404926267332295, 'learning_rate': 9.718384785684043e-06, 'epoch': 0.13} 13%|█▎ | 4598/34278 [5:10:53<28:32:39, 3.46s/it] 13%|█▎ | 4599/34278 [5:10:56<28:09:30, 3.42s/it] {'loss': 0.193, 'grad_norm': 0.8586317864673153, 'learning_rate': 9.71822845115816e-06, 'epoch': 0.13} 13%|█▎ | 4599/34278 [5:10:56<28:09:30, 3.42s/it] 13%|█▎ | 4600/34278 [5:10:59<27:09:20, 3.29s/it] {'loss': 0.1874, 'grad_norm': 0.8304034883877234, 'learning_rate': 9.718072074509061e-06, 'epoch': 0.13} 13%|█▎ | 4600/34278 [5:10:59<27:09:20, 3.29s/it] 13%|█▎ | 4601/34278 [5:11:05<33:50:26, 4.11s/it] {'loss': 0.2338, 'grad_norm': 0.9164153625205795, 'learning_rate': 9.717915655738142e-06, 'epoch': 0.13} 13%|█▎ | 4601/34278 [5:11:05<33:50:26, 4.11s/it] 13%|█▎ | 4602/34278 [5:11:08<31:42:21, 3.85s/it] {'loss': 0.1782, 'grad_norm': 
1.1430108183267886, 'learning_rate': 9.717759194846797e-06, 'epoch': 0.13} 13%|█▎ | 4602/34278 [5:11:08<31:42:21, 3.85s/it] 13%|█▎ | 4603/34278 [5:11:11<28:57:02, 3.51s/it] {'loss': 0.1561, 'grad_norm': 0.9137431537210429, 'learning_rate': 9.717602691836423e-06, 'epoch': 0.13} 13%|█▎ | 4603/34278 [5:11:11<28:57:02, 3.51s/it] 13%|█▎ | 4604/34278 [5:11:17<36:02:07, 4.37s/it] {'loss': 0.1889, 'grad_norm': 0.7274491086036875, 'learning_rate': 9.717446146708421e-06, 'epoch': 0.13} 13%|█▎ | 4604/34278 [5:11:17<36:02:07, 4.37s/it] 13%|█▎ | 4605/34278 [5:11:20<32:32:59, 3.95s/it] {'loss': 0.1808, 'grad_norm': 0.9989608522592939, 'learning_rate': 9.717289559464185e-06, 'epoch': 0.13} 13%|█▎ | 4605/34278 [5:11:20<32:32:59, 3.95s/it] 13%|█▎ | 4606/34278 [5:11:23<30:01:59, 3.64s/it] {'loss': 0.1676, 'grad_norm': 0.8142491245683972, 'learning_rate': 9.717132930105114e-06, 'epoch': 0.13} 13%|█▎ | 4606/34278 [5:11:23<30:01:59, 3.64s/it] 13%|█▎ | 4607/34278 [5:11:27<31:14:42, 3.79s/it] {'loss': 0.1949, 'grad_norm': 0.9805855579751301, 'learning_rate': 9.716976258632604e-06, 'epoch': 0.13} 13%|█▎ | 4607/34278 [5:11:27<31:14:42, 3.79s/it] 13%|█▎ | 4608/34278 [5:11:31<30:07:17, 3.65s/it] {'loss': 0.168, 'grad_norm': 0.8075337425541218, 'learning_rate': 9.716819545048058e-06, 'epoch': 0.13} 13%|█▎ | 4608/34278 [5:11:31<30:07:17, 3.65s/it] 13%|█▎ | 4609/34278 [5:11:33<27:58:08, 3.39s/it] {'loss': 0.1948, 'grad_norm': 1.0539063299461362, 'learning_rate': 9.716662789352872e-06, 'epoch': 0.13} 13%|█▎ | 4609/34278 [5:11:33<27:58:08, 3.39s/it] 13%|█▎ | 4610/34278 [5:11:36<26:25:21, 3.21s/it] {'loss': 0.1681, 'grad_norm': 0.8805078573360297, 'learning_rate': 9.716505991548448e-06, 'epoch': 0.13} 13%|█▎ | 4610/34278 [5:11:36<26:25:21, 3.21s/it] 13%|█▎ | 4611/34278 [5:11:40<27:38:23, 3.35s/it] {'loss': 0.1731, 'grad_norm': 0.9125942018569365, 'learning_rate': 9.716349151636183e-06, 'epoch': 0.13} 13%|█▎ | 4611/34278 [5:11:40<27:38:23, 3.35s/it] 13%|█▎ | 4612/34278 [5:11:43<27:12:45, 3.30s/it] 
{'loss': 0.1584, 'grad_norm': 0.9474453748762206, 'learning_rate': 9.716192269617482e-06, 'epoch': 0.13} 13%|█▎ | 4612/34278 [5:11:43<27:12:45, 3.30s/it] 13%|█▎ | 4613/34278 [5:11:46<25:50:15, 3.14s/it] {'loss': 0.1658, 'grad_norm': 0.8860912041637936, 'learning_rate': 9.71603534549374e-06, 'epoch': 0.13} 13%|█▎ | 4613/34278 [5:11:46<25:50:15, 3.14s/it] 13%|█▎ | 4614/34278 [5:11:52<33:02:33, 4.01s/it] {'loss': 0.1585, 'grad_norm': 0.8443991560150967, 'learning_rate': 9.715878379266359e-06, 'epoch': 0.13} 13%|█▎ | 4614/34278 [5:11:52<33:02:33, 4.01s/it] 13%|█▎ | 4615/34278 [5:11:58<37:58:20, 4.61s/it] {'loss': 0.1795, 'grad_norm': 1.0155062221408329, 'learning_rate': 9.715721370936742e-06, 'epoch': 0.13} 13%|█▎ | 4615/34278 [5:11:58<37:58:20, 4.61s/it] 13%|█▎ | 4616/34278 [5:12:01<34:09:10, 4.15s/it] {'loss': 0.1727, 'grad_norm': 0.8834955589918384, 'learning_rate': 9.715564320506292e-06, 'epoch': 0.13} 13%|█▎ | 4616/34278 [5:12:01<34:09:10, 4.15s/it] 13%|█▎ | 4617/34278 [5:12:05<32:42:58, 3.97s/it] {'loss': 0.1717, 'grad_norm': 0.9188399044549099, 'learning_rate': 9.715407227976408e-06, 'epoch': 0.13} 13%|█▎ | 4617/34278 [5:12:05<32:42:58, 3.97s/it] 13%|█▎ | 4618/34278 [5:12:08<31:46:02, 3.86s/it] {'loss': 0.1784, 'grad_norm': 1.028223764093895, 'learning_rate': 9.715250093348494e-06, 'epoch': 0.13} 13%|█▎ | 4618/34278 [5:12:08<31:46:02, 3.86s/it] 13%|█▎ | 4619/34278 [5:12:11<29:49:23, 3.62s/it] {'loss': 0.1813, 'grad_norm': 0.844104848499323, 'learning_rate': 9.715092916623954e-06, 'epoch': 0.13} 13%|█▎ | 4619/34278 [5:12:11<29:49:23, 3.62s/it] 13%|█▎ | 4620/34278 [5:12:14<28:19:57, 3.44s/it] {'loss': 0.1856, 'grad_norm': 0.9158693434032174, 'learning_rate': 9.714935697804188e-06, 'epoch': 0.13} 13%|█▎ | 4620/34278 [5:12:14<28:19:57, 3.44s/it] 13%|█▎ | 4621/34278 [5:12:20<34:34:05, 4.20s/it] {'loss': 0.1723, 'grad_norm': 0.9538583470221261, 'learning_rate': 9.714778436890604e-06, 'epoch': 0.13} 13%|█▎ | 4621/34278 [5:12:20<34:34:05, 4.20s/it] 13%|█▎ | 4622/34278 
[5:12:23<32:20:32, 3.93s/it] {'loss': 0.1954, 'grad_norm': 0.9928564515885795, 'learning_rate': 9.7146211338846e-06, 'epoch': 0.13} 13%|█▎ | 4622/34278 [5:12:24<32:20:32, 3.93s/it] 13%|█▎ | 4623/34278 [5:12:28<34:16:18, 4.16s/it] {'loss': 0.1698, 'grad_norm': 0.9387120450307491, 'learning_rate': 9.714463788787588e-06, 'epoch': 0.13} 13%|█▎ | 4623/34278 [5:12:28<34:16:18, 4.16s/it] 13%|█▎ | 4624/34278 [5:12:31<31:56:05, 3.88s/it] {'loss': 0.1907, 'grad_norm': 1.0527771502712409, 'learning_rate': 9.714306401600967e-06, 'epoch': 0.13} 13%|█▎ | 4624/34278 [5:12:31<31:56:05, 3.88s/it] 13%|█▎ | 4625/34278 [5:12:35<32:18:36, 3.92s/it] {'loss': 0.1804, 'grad_norm': 0.8933751834117948, 'learning_rate': 9.714148972326144e-06, 'epoch': 0.13} 13%|█▎ | 4625/34278 [5:12:35<32:18:36, 3.92s/it] 13%|█▎ | 4626/34278 [5:12:39<30:59:17, 3.76s/it] {'loss': 0.1694, 'grad_norm': 0.6514000135480227, 'learning_rate': 9.713991500964524e-06, 'epoch': 0.13} 13%|█▎ | 4626/34278 [5:12:39<30:59:17, 3.76s/it] 13%|█▎ | 4627/34278 [5:12:42<28:51:18, 3.50s/it] {'loss': 0.1553, 'grad_norm': 1.0321308868196355, 'learning_rate': 9.713833987517514e-06, 'epoch': 0.13} 13%|█▎ | 4627/34278 [5:12:42<28:51:18, 3.50s/it] 14%|█▎ | 4628/34278 [5:12:45<27:20:41, 3.32s/it] {'loss': 0.1678, 'grad_norm': 0.8747041239954297, 'learning_rate': 9.713676431986518e-06, 'epoch': 0.14} 14%|█▎ | 4628/34278 [5:12:45<27:20:41, 3.32s/it] 14%|█▎ | 4629/34278 [5:12:49<29:05:16, 3.53s/it] {'loss': 0.198, 'grad_norm': 0.7833899639069093, 'learning_rate': 9.713518834372946e-06, 'epoch': 0.14} 14%|█▎ | 4629/34278 [5:12:49<29:05:16, 3.53s/it] 14%|█▎ | 4630/34278 [5:12:55<35:25:22, 4.30s/it] {'loss': 0.1672, 'grad_norm': 0.76908563457908, 'learning_rate': 9.713361194678201e-06, 'epoch': 0.14} 14%|█▎ | 4630/34278 [5:12:55<35:25:22, 4.30s/it] 14%|█▎ | 4631/34278 [5:13:01<39:19:19, 4.77s/it] {'loss': 0.1817, 'grad_norm': 0.9875628885189603, 'learning_rate': 9.713203512903695e-06, 'epoch': 0.14} 14%|█▎ | 4631/34278 [5:13:01<39:19:19, 
4.77s/it] 14%|█▎ | 4632/34278 [5:13:05<39:22:15, 4.78s/it] {'loss': 0.1672, 'grad_norm': 0.9629519042401639, 'learning_rate': 9.71304578905083e-06, 'epoch': 0.14} 14%|█▎ | 4632/34278 [5:13:05<39:22:15, 4.78s/it] 14%|█▎ | 4633/34278 [5:13:09<35:51:02, 4.35s/it] {'loss': 0.186, 'grad_norm': 0.9857470706999932, 'learning_rate': 9.71288802312102e-06, 'epoch': 0.14} 14%|█▎ | 4633/34278 [5:13:09<35:51:02, 4.35s/it] 14%|█▎ | 4634/34278 [5:13:12<32:18:12, 3.92s/it] {'loss': 0.1552, 'grad_norm': 0.730565997339259, 'learning_rate': 9.712730215115668e-06, 'epoch': 0.14} 14%|█▎ | 4634/34278 [5:13:12<32:18:12, 3.92s/it] 14%|█▎ | 4635/34278 [5:13:14<29:31:32, 3.59s/it] {'loss': 0.1915, 'grad_norm': 1.1446007527994253, 'learning_rate': 9.71257236503619e-06, 'epoch': 0.14} 14%|█▎ | 4635/34278 [5:13:14<29:31:32, 3.59s/it] 14%|█▎ | 4636/34278 [5:13:18<28:38:52, 3.48s/it] {'loss': 0.1812, 'grad_norm': 0.9749549218703435, 'learning_rate': 9.712414472883987e-06, 'epoch': 0.14} 14%|█▎ | 4636/34278 [5:13:18<28:38:52, 3.48s/it] 14%|█▎ | 4637/34278 [5:13:21<27:55:08, 3.39s/it] {'loss': 0.1578, 'grad_norm': 0.7799690275589842, 'learning_rate': 9.712256538660474e-06, 'epoch': 0.14} 14%|█▎ | 4637/34278 [5:13:21<27:55:08, 3.39s/it] 14%|█▎ | 4638/34278 [5:13:25<30:23:01, 3.69s/it] {'loss': 0.1758, 'grad_norm': 0.7807747971278297, 'learning_rate': 9.712098562367059e-06, 'epoch': 0.14} 14%|█▎ | 4638/34278 [5:13:25<30:23:01, 3.69s/it] 14%|█▎ | 4639/34278 [5:13:31<34:52:44, 4.24s/it] {'loss': 0.1832, 'grad_norm': 0.8710834872583403, 'learning_rate': 9.711940544005154e-06, 'epoch': 0.14} 14%|█▎ | 4639/34278 [5:13:31<34:52:44, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▎ | 4640/34278 [5:13:35<34:35:19, 4.20s/it] {'loss': 0.1732, 'grad_norm': 0.7614578950485028, 'learning_rate': 9.711782483576168e-06, 'epoch': 0.14} 14%|█▎ | 4640/34278 [5:13:35<34:35:19, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▎ | 4641/34278 [5:13:38<32:32:26, 3.95s/it] {'loss': 0.1992, 'grad_norm': 0.8424303678910273, 'learning_rate': 9.711624381081513e-06, 'epoch': 0.14} 14%|█▎ | 4641/34278 [5:13:38<32:32:26, 3.95s/it] 14%|█▎ | 4642/34278 [5:13:43<33:34:16, 4.08s/it] {'loss': 0.1697, 'grad_norm': 0.865337928104798, 'learning_rate': 9.711466236522599e-06, 'epoch': 0.14} 14%|█▎ | 4642/34278 [5:13:43<33:34:16, 4.08s/it] 14%|█▎ | 4643/34278 [5:13:49<38:22:54, 4.66s/it] {'loss': 0.187, 'grad_norm': 0.9408710848620837, 'learning_rate': 9.71130804990084e-06, 'epoch': 0.14} 14%|█▎ | 4643/34278 [5:13:49<38:22:54, 4.66s/it] 14%|█▎ | 4644/34278 [5:13:52<34:55:29, 4.24s/it] {'loss': 0.1815, 'grad_norm': 0.8181683362796547, 'learning_rate': 9.711149821217648e-06, 'epoch': 0.14} 14%|█▎ | 4644/34278 [5:13:52<34:55:29, 4.24s/it] 14%|█▎ | 4645/34278 [5:13:55<31:45:55, 3.86s/it] {'loss': 0.1558, 'grad_norm': 0.8425298921105472, 'learning_rate': 9.710991550474435e-06, 'epoch': 0.14} 14%|█▎ | 4645/34278 [5:13:55<31:45:55, 3.86s/it] 14%|█▎ | 4646/34278 [5:13:58<29:17:43, 3.56s/it] {'loss': 0.1592, 'grad_norm': 0.8238656748496306, 'learning_rate': 9.710833237672612e-06, 'epoch': 0.14} 14%|█▎ | 4646/34278 [5:13:58<29:17:43, 3.56s/it] 14%|█▎ | 4647/34278 [5:14:04<35:09:47, 4.27s/it] {'loss': 0.1952, 'grad_norm': 1.0303863480310418, 'learning_rate': 9.710674882813598e-06, 'epoch': 0.14} 14%|█▎ | 4647/34278 [5:14:04<35:09:47, 4.27s/it] 14%|█▎ | 4648/34278 [5:14:07<31:44:25, 3.86s/it] {'loss': 0.198, 'grad_norm': 0.9194943781256945, 'learning_rate': 
9.7105164858988e-06, 'epoch': 0.14} 14%|█▎ | 4648/34278 [5:14:07<31:44:25, 3.86s/it] 14%|█▎ | 4649/34278 [5:14:10<29:25:41, 3.58s/it] {'loss': 0.1733, 'grad_norm': 0.8206790543242434, 'learning_rate': 9.710358046929636e-06, 'epoch': 0.14} 14%|█▎ | 4649/34278 [5:14:10<29:25:41, 3.58s/it] 14%|█▎ | 4650/34278 [5:14:12<27:34:07, 3.35s/it] {'loss': 0.1918, 'grad_norm': 1.107013105208648, 'learning_rate': 9.710199565907521e-06, 'epoch': 0.14} 14%|█▎ | 4650/34278 [5:14:12<27:34:07, 3.35s/it] 14%|█▎ | 4651/34278 [5:14:18<33:48:33, 4.11s/it] {'loss': 0.1512, 'grad_norm': 0.9489735749586852, 'learning_rate': 9.710041042833869e-06, 'epoch': 0.14} 14%|█▎ | 4651/34278 [5:14:18<33:48:33, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▎ | 4652/34278 [5:14:21<31:42:25, 3.85s/it] {'loss': 0.1822, 'grad_norm': 0.7685001734325031, 'learning_rate': 9.709882477710093e-06, 'epoch': 0.14} 14%|█▎ | 4652/34278 [5:14:21<31:42:25, 3.85s/it] 14%|█▎ | 4653/34278 [5:14:25<29:47:17, 3.62s/it] {'loss': 0.1782, 'grad_norm': 0.9211035392852889, 'learning_rate': 9.709723870537613e-06, 'epoch': 0.14} 14%|█▎ | 4653/34278 [5:14:25<29:47:17, 3.62s/it] 14%|█▎ | 4654/34278 [5:14:29<30:40:01, 3.73s/it] {'loss': 0.1606, 'grad_norm': 0.9523188363240555, 'learning_rate': 9.70956522131784e-06, 'epoch': 0.14} 14%|█▎ | 4654/34278 [5:14:29<30:40:01, 3.73s/it] 14%|█▎ | 4655/34278 [5:14:35<36:17:18, 4.41s/it] {'loss': 0.1775, 'grad_norm': 0.9060661785416497, 'learning_rate': 9.709406530052194e-06, 'epoch': 0.14} 14%|█▎ | 4655/34278 [5:14:35<36:17:18, 4.41s/it] 14%|█▎ | 4656/34278 [5:14:38<33:47:54, 4.11s/it] {'loss': 0.187, 'grad_norm': 0.9669192154217833, 'learning_rate': 9.709247796742091e-06, 'epoch': 0.14} 14%|█▎ | 4656/34278 [5:14:38<33:47:54, 4.11s/it] 14%|█▎ | 4657/34278 [5:14:41<30:47:03, 3.74s/it] {'loss': 0.195, 
'grad_norm': 0.9399764473212696, 'learning_rate': 9.709089021388947e-06, 'epoch': 0.14} 14%|█▎ | 4657/34278 [5:14:41<30:47:03, 3.74s/it] 14%|█▎ | 4658/34278 [5:14:45<30:54:32, 3.76s/it] {'loss': 0.1798, 'grad_norm': 0.9661483210094688, 'learning_rate': 9.708930203994182e-06, 'epoch': 0.14} 14%|█▎ | 4658/34278 [5:14:45<30:54:32, 3.76s/it] 14%|█▎ | 4659/34278 [5:14:48<29:15:59, 3.56s/it] {'loss': 0.1684, 'grad_norm': 0.9124454167609475, 'learning_rate': 9.708771344559212e-06, 'epoch': 0.14} 14%|█▎ | 4659/34278 [5:14:48<29:15:59, 3.56s/it] 14%|█▎ | 4660/34278 [5:14:53<34:11:07, 4.16s/it] {'loss': 0.1919, 'grad_norm': 1.0215319227945203, 'learning_rate': 9.708612443085454e-06, 'epoch': 0.14} 14%|█▎ | 4660/34278 [5:14:53<34:11:07, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▎ | 4661/34278 [5:14:57<32:21:30, 3.93s/it] {'loss': 0.1734, 'grad_norm': 1.0114839591203058, 'learning_rate': 9.708453499574328e-06, 'epoch': 0.14} 14%|█▎ | 4661/34278 [5:14:57<32:21:30, 3.93s/it] 14%|█▎ | 4662/34278 [5:15:00<29:42:41, 3.61s/it] {'loss': 0.177, 'grad_norm': 0.9298959485935899, 'learning_rate': 9.708294514027255e-06, 'epoch': 0.14} 14%|█▎ | 4662/34278 [5:15:00<29:42:41, 3.61s/it] 14%|█▎ | 4663/34278 [5:15:02<27:59:15, 3.40s/it] {'loss': 0.1771, 'grad_norm': 0.9542062169030524, 'learning_rate': 9.708135486445652e-06, 'epoch': 0.14} 14%|█▎ | 4663/34278 [5:15:02<27:59:15, 3.40s/it] 14%|█▎ | 4664/34278 [5:15:06<27:21:52, 3.33s/it] {'loss': 0.1553, 'grad_norm': 0.858838796975156, 'learning_rate': 9.707976416830938e-06, 'epoch': 0.14} 14%|█▎ | 4664/34278 [5:15:06<27:21:52, 3.33s/it] 14%|█▎ | 4665/34278 [5:15:09<26:54:36, 3.27s/it] {'loss': 0.18, 'grad_norm': 0.7526944223238617, 'learning_rate': 9.707817305184535e-06, 'epoch': 0.14} 14%|█▎ | 4665/34278 [5:15:09<26:54:36, 3.27s/it] 14%|█▎ | 4666/34278 
[5:15:14<32:51:51, 4.00s/it] {'loss': 0.1637, 'grad_norm': 0.7810099686966809, 'learning_rate': 9.707658151507864e-06, 'epoch': 0.14} 14%|█▎ | 4666/34278 [5:15:14<32:51:51, 4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▎ | 4667/34278 [5:15:18<31:23:47, 3.82s/it] {'loss': 0.1872, 'grad_norm': 0.9086215597119266, 'learning_rate': 9.707498955802343e-06, 'epoch': 0.14} 14%|█▎ | 4667/34278 [5:15:18<31:23:47, 3.82s/it] 14%|█▎ | 4668/34278 [5:15:21<29:24:23, 3.58s/it] {'loss': 0.1891, 'grad_norm': 0.9508524355128216, 'learning_rate': 9.707339718069397e-06, 'epoch': 0.14} 14%|█▎ | 4668/34278 [5:15:21<29:24:23, 3.58s/it] 14%|█▎ | 4669/34278 [5:15:25<30:21:29, 3.69s/it] {'loss': 0.1848, 'grad_norm': 0.7195667254981255, 'learning_rate': 9.707180438310446e-06, 'epoch': 0.14} 14%|█▎ | 4669/34278 [5:15:25<30:21:29, 3.69s/it] 14%|█▎ | 4670/34278 [5:15:28<29:50:36, 3.63s/it] {'loss': 0.182, 'grad_norm': 0.8184403771567913, 'learning_rate': 9.707021116526908e-06, 'epoch': 0.14} 14%|█▎ | 4670/34278 [5:15:28<29:50:36, 3.63s/it] 14%|█▎ | 4671/34278 [5:15:31<27:35:34, 3.36s/it] {'loss': 0.1559, 'grad_norm': 0.9658791864775528, 'learning_rate': 9.706861752720213e-06, 'epoch': 0.14} 14%|█▎ | 4671/34278 [5:15:31<27:35:34, 3.36s/it] 14%|█▎ | 4672/34278 [5:15:34<26:54:37, 3.27s/it] {'loss': 0.1546, 'grad_norm': 0.7212660341141374, 'learning_rate': 9.706702346891778e-06, 'epoch': 0.14} 14%|█▎ | 4672/34278 [5:15:34<26:54:37, 3.27s/it] 14%|█▎ | 4673/34278 [5:15:37<25:46:55, 3.14s/it] {'loss': 0.1698, 'grad_norm': 0.8810046665043058, 'learning_rate': 9.70654289904303e-06, 'epoch': 0.14} 14%|█▎ | 4673/34278 [5:15:37<25:46:55, 3.14s/it] 14%|█▎ | 4674/34278 [5:15:40<25:38:19, 3.12s/it] {'loss': 0.1819, 'grad_norm': 0.8165448817841481, 'learning_rate': 9.70638340917539e-06, 'epoch': 0.14} 14%|█▎ | 4674/34278 
[5:15:40<25:38:19, 3.12s/it] 14%|█▎ | 4675/34278 [5:15:43<25:51:44, 3.15s/it] {'loss': 0.1813, 'grad_norm': 1.0664393599702064, 'learning_rate': 9.706223877290282e-06, 'epoch': 0.14} 14%|█▎ | 4675/34278 [5:15:43<25:51:44, 3.15s/it] 14%|█▎ | 4676/34278 [5:15:47<27:10:44, 3.31s/it] {'loss': 0.1828, 'grad_norm': 0.7238391073438882, 'learning_rate': 9.70606430338913e-06, 'epoch': 0.14} 14%|█▎ | 4676/34278 [5:15:47<27:10:44, 3.31s/it] 14%|█▎ | 4677/34278 [5:15:50<26:24:02, 3.21s/it] {'loss': 0.1973, 'grad_norm': 1.005694470360366, 'learning_rate': 9.70590468747336e-06, 'epoch': 0.14} 14%|█▎ | 4677/34278 [5:15:50<26:24:02, 3.21s/it] 14%|█▎ | 4678/34278 [5:15:53<25:08:34, 3.06s/it] {'loss': 0.1871, 'grad_norm': 0.9241377635739217, 'learning_rate': 9.705745029544396e-06, 'epoch': 0.14} 14%|█▎ | 4678/34278 [5:15:53<25:08:34, 3.06s/it] 14%|█▎ | 4679/34278 [5:15:56<25:31:29, 3.10s/it] {'loss': 0.1804, 'grad_norm': 0.8179031785167225, 'learning_rate': 9.705585329603664e-06, 'epoch': 0.14} 14%|█▎ | 4679/34278 [5:15:56<25:31:29, 3.10s/it] 14%|█▎ | 4680/34278 [5:15:58<24:35:53, 2.99s/it] {'loss': 0.1732, 'grad_norm': 0.7308893257140879, 'learning_rate': 9.705425587652589e-06, 'epoch': 0.14} 14%|█▎ | 4680/34278 [5:15:59<24:35:53, 2.99s/it] 14%|█▎ | 4681/34278 [5:16:02<25:14:22, 3.07s/it] {'loss': 0.1722, 'grad_norm': 0.7528912940315003, 'learning_rate': 9.705265803692597e-06, 'epoch': 0.14} 14%|█▎ | 4681/34278 [5:16:02<25:14:22, 3.07s/it] 14%|█▎ | 4682/34278 [5:16:08<32:26:27, 3.95s/it] {'loss': 0.175, 'grad_norm': 1.052690632300158, 'learning_rate': 9.705105977725117e-06, 'epoch': 0.14} 14%|█▎ | 4682/34278 [5:16:08<32:26:27, 3.95s/it] 14%|█▎ | 4683/34278 [5:16:12<33:51:44, 4.12s/it] {'loss': 0.1625, 'grad_norm': 0.8706043181573576, 'learning_rate': 9.704946109751572e-06, 'epoch': 0.14} 14%|█▎ | 4683/34278 [5:16:12<33:51:44, 4.12s/it] 14%|█▎ | 4684/34278 [5:16:15<30:27:40, 3.71s/it] {'loss': 0.1767, 'grad_norm': 0.7828218637229405, 'learning_rate': 9.704786199773392e-06, 'epoch': 
0.14} 14%|█▎ | 4684/34278 [5:16:15<30:27:40, 3.71s/it] 14%|█▎ | 4685/34278 [5:16:18<28:38:34, 3.48s/it] {'loss': 0.1768, 'grad_norm': 1.6359450019303676, 'learning_rate': 9.704626247792006e-06, 'epoch': 0.14} 14%|█▎ | 4685/34278 [5:16:18<28:38:34, 3.48s/it] 14%|█▎ | 4686/34278 [5:16:21<28:38:54, 3.49s/it] {'loss': 0.1467, 'grad_norm': 0.8522023527214693, 'learning_rate': 9.704466253808837e-06, 'epoch': 0.14} 14%|█▎ | 4686/34278 [5:16:21<28:38:54, 3.49s/it] 14%|█▎ | 4687/34278 [5:16:25<28:10:52, 3.43s/it] {'loss': 0.1943, 'grad_norm': 0.9438185753052197, 'learning_rate': 9.70430621782532e-06, 'epoch': 0.14} 14%|█▎ | 4687/34278 [5:16:25<28:10:52, 3.43s/it] 14%|█▎ | 4688/34278 [5:16:31<34:33:11, 4.20s/it] {'loss': 0.1958, 'grad_norm': 0.8967662864085583, 'learning_rate': 9.704146139842876e-06, 'epoch': 0.14} 14%|█▎ | 4688/34278 [5:16:31<34:33:11, 4.20s/it] 14%|█▎ | 4689/34278 [5:16:37<39:19:40, 4.78s/it] {'loss': 0.2017, 'grad_norm': 0.8176074719163361, 'learning_rate': 9.70398601986294e-06, 'epoch': 0.14} 14%|█▎ | 4689/34278 [5:16:37<39:19:40, 4.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▎ | 4690/34278 [5:16:43<41:43:08, 5.08s/it] {'loss': 0.2056, 'grad_norm': 1.0065416187094867, 'learning_rate': 9.70382585788694e-06, 'epoch': 0.14} 14%|█▎ | 4690/34278 [5:16:43<41:43:08, 5.08s/it] 14%|█▎ | 4691/34278 [5:16:46<37:34:48, 4.57s/it] {'loss': 0.1881, 'grad_norm': 0.8540913616045407, 'learning_rate': 9.703665653916306e-06, 'epoch': 0.14} 14%|█▎ | 4691/34278 [5:16:46<37:34:48, 4.57s/it] 14%|█▎ | 4692/34278 [5:16:49<33:47:43, 4.11s/it] {'loss': 0.1611, 'grad_norm': 0.7964480986863118, 'learning_rate': 9.703505407952467e-06, 'epoch': 0.14} 14%|█▎ | 4692/34278 [5:16:49<33:47:43, 4.11s/it] 14%|█▎ | 4693/34278 [5:16:53<33:44:58, 4.11s/it] {'loss': 0.2064, 'grad_norm': 1.0276098799259779, 'learning_rate': 9.703345119996854e-06, 'epoch': 0.14} 14%|█▎ | 4693/34278 [5:16:53<33:44:58, 4.11s/it] 14%|█▎ | 4694/34278 [5:16:59<38:40:36, 4.71s/it] {'loss': 0.181, 'grad_norm': 0.8780151029964625, 'learning_rate': 9.7031847900509e-06, 'epoch': 0.14} 14%|█▎ | 4694/34278 [5:16:59<38:40:36, 4.71s/it] 14%|█▎ | 4695/34278 [5:17:05<42:07:41, 5.13s/it] {'loss': 0.1828, 'grad_norm': 0.9988535269115225, 'learning_rate': 9.703024418116035e-06, 'epoch': 0.14} 14%|█▎ | 4695/34278 [5:17:05<42:07:41, 5.13s/it] 14%|█▎ | 4696/34278 [5:17:11<42:57:35, 5.23s/it] {'loss': 0.1904, 'grad_norm': 0.8217531980589999, 'learning_rate': 9.702864004193689e-06, 'epoch': 0.14} 14%|█▎ | 4696/34278 [5:17:11<42:57:35, 5.23s/it] 14%|█▎ | 4697/34278 [5:17:14<37:30:18, 4.56s/it] {'loss': 0.194, 'grad_norm': 0.9369455041215813, 'learning_rate': 9.702703548285297e-06, 'epoch': 0.14} 14%|█▎ | 4697/34278 [5:17:14<37:30:18, 4.56s/it] 14%|█▎ | 4698/34278 [5:17:18<35:22:34, 4.31s/it] {'loss': 0.1897, 'grad_norm': 1.0206640713725064, 'learning_rate': 9.702543050392289e-06, 'epoch': 0.14} 14%|█▎ | 4698/34278 [5:17:18<35:22:34, 4.31s/it] 14%|█▎ | 4699/34278 [5:17:21<32:17:56, 3.93s/it] {'loss': 0.1785, 'grad_norm': 0.905883141033335, 'learning_rate': 9.702382510516101e-06, 
'epoch': 0.14} 14%|█▎ | 4699/34278 [5:17:21<32:17:56, 3.93s/it] 14%|█▎ | 4700/34278 [5:17:25<32:11:00, 3.92s/it] {'loss': 0.1628, 'grad_norm': 0.822309423685823, 'learning_rate': 9.702221928658162e-06, 'epoch': 0.14} 14%|█▎ | 4700/34278 [5:17:25<32:11:00, 3.92s/it] 14%|█▎ | 4701/34278 [5:17:28<30:15:56, 3.68s/it] {'loss': 0.1513, 'grad_norm': 0.830246206302412, 'learning_rate': 9.702061304819912e-06, 'epoch': 0.14} 14%|█▎ | 4701/34278 [5:17:28<30:15:56, 3.68s/it] 14%|█▎ | 4702/34278 [5:17:32<31:25:04, 3.82s/it] {'loss': 0.1923, 'grad_norm': 0.844493568165159, 'learning_rate': 9.70190063900278e-06, 'epoch': 0.14} 14%|█▎ | 4702/34278 [5:17:32<31:25:04, 3.82s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8080e9e2f0>
Failed to fetch sample 2667585. Exception: cannot identify image file <_io.BytesIO object at 0x7f8080e9e2f0> /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▎ | 4703/34278 [5:17:35<29:49:03, 3.63s/it] {'loss': 0.1595, 'grad_norm': 0.8159300219183769, 'learning_rate': 9.701739931208199e-06, 'epoch': 0.14} 14%|█▎ | 4703/34278 [5:17:35<29:49:03, 3.63s/it] 14%|█▎ | 4704/34278 [5:17:39<29:32:36, 3.60s/it] {'loss': 0.1657, 'grad_norm': 0.9591670417352346, 'learning_rate': 9.701579181437608e-06, 'epoch': 0.14} 14%|█▎ | 4704/34278 [5:17:39<29:32:36, 3.60s/it] 14%|█▎ | 4705/34278 [5:17:44<35:25:17, 4.31s/it] {'loss': 0.1744, 'grad_norm': 0.9397254357533835, 'learning_rate': 9.701418389692441e-06, 'epoch': 0.14} 14%|█▎ | 4705/34278 [5:17:45<35:25:17, 4.31s/it] 14%|█▎ | 4706/34278 [5:17:47<31:50:32, 3.88s/it] {'loss': 0.187, 'grad_norm': 0.8273778307663063, 'learning_rate': 9.701257555974131e-06, 'epoch': 0.14} 14%|█▎ | 4706/34278 [5:17:47<31:50:32, 3.88s/it] 14%|█▎ | 4707/34278 [5:17:50<29:52:36, 3.64s/it] {'loss': 0.1812, 'grad_norm': 1.003808430533134, 'learning_rate': 9.701096680284119e-06, 'epoch': 0.14} 14%|█▎ | 4707/34278 [5:17:50<29:52:36, 3.64s/it] 14%|█▎ | 4708/34278 [5:17:54<29:50:45, 3.63s/it] {'loss': 0.1621, 'grad_norm': 0.7965113465415409, 'learning_rate': 9.700935762623835e-06, 'epoch': 0.14} 14%|█▎ | 4708/34278 [5:17:54<29:50:45, 3.63s/it] 14%|█▎ | 4709/34278 [5:17:57<29:19:04, 3.57s/it] {'loss': 0.1799, 'grad_norm': 0.8984695013894336, 'learning_rate': 9.700774802994721e-06, 'epoch': 0.14} 14%|█▎ | 4709/34278 [5:17:57<29:19:04, 3.57s/it] 14%|█▎ | 4710/34278 [5:18:01<29:57:06, 3.65s/it] {'loss': 0.2136, 'grad_norm': 1.0562346510337288, 'learning_rate': 9.700613801398209e-06, 'epoch': 0.14} 14%|█▎ | 4710/34278 [5:18:01<29:57:06, 3.65s/it] 14%|█▎ | 4711/34278 [5:18:04<28:44:39, 
3.50s/it] {'loss': 0.179, 'grad_norm': 1.0092671338183845, 'learning_rate': 9.700452757835741e-06, 'epoch': 0.14} 14%|█▎ | 4711/34278 [5:18:04<28:44:39, 3.50s/it] 14%|█▎ | 4712/34278 [5:18:10<33:47:10, 4.11s/it] {'loss': 0.1666, 'grad_norm': 0.8689706657713749, 'learning_rate': 9.700291672308752e-06, 'epoch': 0.14} 14%|█▎ | 4712/34278 [5:18:10<33:47:10, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▎ | 4713/34278 [5:18:13<31:11:02, 3.80s/it] {'loss': 0.2214, 'grad_norm': 0.9547444978433681, 'learning_rate': 9.700130544818682e-06, 'epoch': 0.14} 14%|█▎ | 4713/34278 [5:18:13<31:11:02, 3.80s/it] 14%|█▍ | 4714/34278 [5:18:17<32:00:10, 3.90s/it] {'loss': 0.1915, 'grad_norm': 0.898366732252034, 'learning_rate': 9.69996937536697e-06, 'epoch': 0.14} 14%|█▍ | 4714/34278 [5:18:17<32:00:10, 3.90s/it] 14%|█▍ | 4715/34278 [5:18:23<37:50:57, 4.61s/it] {'loss': 0.1653, 'grad_norm': 0.7665205658010267, 'learning_rate': 9.69980816395505e-06, 'epoch': 0.14} 14%|█▍ | 4715/34278 [5:18:23<37:50:57, 4.61s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▍ | 4716/34278 [5:18:27<35:07:43, 4.28s/it] {'loss': 0.185, 'grad_norm': 0.8091459372871365, 'learning_rate': 9.699646910584366e-06, 'epoch': 0.14} 14%|█▍ | 4716/34278 [5:18:27<35:07:43, 4.28s/it] 14%|█▍ | 4717/34278 [5:18:31<33:56:00, 4.13s/it] {'loss': 0.1824, 'grad_norm': 0.9247503942827152, 'learning_rate': 9.699485615256357e-06, 'epoch': 0.14} 14%|█▍ | 4717/34278 [5:18:31<33:56:00, 4.13s/it] 14%|█▍ | 4718/34278 [5:18:34<32:57:03, 4.01s/it] {'loss': 0.1819, 'grad_norm': 0.8717922752137528, 'learning_rate': 9.699324277972462e-06, 'epoch': 0.14} 14%|█▍ | 4718/34278 [5:18:35<32:57:03, 4.01s/it] 14%|█▍ | 4719/34278 [5:18:40<35:29:28, 4.32s/it] {'loss': 0.1925, 'grad_norm': 1.0773589783394717, 'learning_rate': 9.699162898734122e-06, 'epoch': 0.14} 14%|█▍ | 4719/34278 [5:18:40<35:29:28, 4.32s/it] 14%|█▍ | 4720/34278 [5:18:45<38:03:42, 4.64s/it] {'loss': 0.197, 'grad_norm': 1.0901931893968568, 'learning_rate': 9.699001477542775e-06, 'epoch': 0.14} 14%|█▍ | 4720/34278 [5:18:45<38:03:42, 4.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▍ | 4721/34278 [5:18:47<33:00:39, 4.02s/it] {'loss': 0.1804, 'grad_norm': 1.1835489899861442, 'learning_rate': 9.698840014399867e-06, 'epoch': 0.14} 14%|█▍ | 4721/34278 [5:18:48<33:00:39, 4.02s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8196 > 8192). 
Running this sequence through the model will result in indexing errors 14%|█▍ | 4722/34278 [5:18:51<30:45:23, 3.75s/it] {'loss': 0.207, 'grad_norm': 0.9021954966986981, 'learning_rate': 9.698678509306836e-06, 'epoch': 0.14} 14%|█▍ | 4722/34278 [5:18:51<30:45:23, 3.75s/it] 14%|█▍ | 4723/34278 [5:18:54<28:57:28, 3.53s/it] {'loss': 0.175, 'grad_norm': 1.0469311591030825, 'learning_rate': 9.698516962265125e-06, 'epoch': 0.14} 14%|█▍ | 4723/34278 [5:18:54<28:57:28, 3.53s/it] 14%|█▍ | 4724/34278 [5:18:57<29:34:45, 3.60s/it] {'loss': 0.1755, 'grad_norm': 0.8073322173220286, 'learning_rate': 9.698355373276178e-06, 'epoch': 0.14} 14%|█▍ | 4724/34278 [5:18:57<29:34:45, 3.60s/it] 14%|█▍ | 4725/34278 [5:19:01<28:52:24, 3.52s/it] {'loss': 0.173, 'grad_norm': 0.8685317986389969, 'learning_rate': 9.698193742341434e-06, 'epoch': 0.14} 14%|█▍ | 4725/34278 [5:19:01<28:52:24, 3.52s/it] 14%|█▍ | 4726/34278 [5:19:04<28:00:38, 3.41s/it] {'loss': 0.1755, 'grad_norm': 0.8256525508359017, 'learning_rate': 9.698032069462338e-06, 'epoch': 0.14} 14%|█▍ | 4726/34278 [5:19:04<28:00:38, 3.41s/it] 14%|█▍ | 4727/34278 [5:19:07<27:29:20, 3.35s/it] {'loss': 0.1899, 'grad_norm': 0.9431652155983686, 'learning_rate': 9.697870354640334e-06, 'epoch': 0.14} 14%|█▍ | 4727/34278 [5:19:07<27:29:20, 3.35s/it] 14%|█▍ | 4728/34278 [5:19:10<26:00:13, 3.17s/it] {'loss': 0.1504, 'grad_norm': 1.0830611080645738, 'learning_rate': 9.697708597876863e-06, 'epoch': 0.14} 14%|█▍ | 4728/34278 [5:19:10<26:00:13, 3.17s/it] 14%|█▍ | 4729/34278 [5:19:16<32:16:09, 3.93s/it] {'loss': 0.1696, 'grad_norm': 0.7177082561141378, 'learning_rate': 9.697546799173372e-06, 'epoch': 0.14} 14%|█▍ | 4729/34278 [5:19:16<32:16:09, 3.93s/it] 14%|█▍ | 4730/34278 [5:19:22<37:28:33, 4.57s/it] {'loss': 0.1684, 'grad_norm': 1.0145093897760615, 'learning_rate': 9.697384958531307e-06, 'epoch': 0.14} 14%|█▍ | 4730/34278 [5:19:22<37:28:33, 4.57s/it] 14%|█▍ | 4731/34278 [5:19:25<35:35:24, 4.34s/it] {'loss': 0.2042, 'grad_norm': 0.9537271286997424, 
'learning_rate': 9.697223075952107e-06, 'epoch': 0.14} 14%|█▍ | 4731/34278 [5:19:25<35:35:24, 4.34s/it] 14%|█▍ | 4732/34278 [5:19:29<33:54:02, 4.13s/it] {'loss': 0.1744, 'grad_norm': 0.8615635927849443, 'learning_rate': 9.697061151437223e-06, 'epoch': 0.14} 14%|█▍ | 4732/34278 [5:19:29<33:54:02, 4.13s/it] 14%|█▍ | 4733/34278 [5:19:33<33:51:30, 4.13s/it] {'loss': 0.164, 'grad_norm': 1.0196870703943546, 'learning_rate': 9.696899184988097e-06, 'epoch': 0.14} 14%|█▍ | 4733/34278 [5:19:33<33:51:30, 4.13s/it] 14%|█▍ | 4734/34278 [5:19:38<35:19:16, 4.30s/it] {'loss': 0.1603, 'grad_norm': 0.7820486500981627, 'learning_rate': 9.696737176606177e-06, 'epoch': 0.14} 14%|█▍ | 4734/34278 [5:19:38<35:19:16, 4.30s/it] 14%|█▍ | 4735/34278 [5:19:41<32:33:48, 3.97s/it] {'loss': 0.1887, 'grad_norm': 1.0072738160492847, 'learning_rate': 9.696575126292908e-06, 'epoch': 0.14} 14%|█▍ | 4735/34278 [5:19:41<32:33:48, 3.97s/it] 14%|█▍ | 4736/34278 [5:19:45<31:26:12, 3.83s/it] {'loss': 0.1781, 'grad_norm': 1.002552660392895, 'learning_rate': 9.696413034049738e-06, 'epoch': 0.14} 14%|█▍ | 4736/34278 [5:19:45<31:26:12, 3.83s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▍ | 4737/34278 [5:19:49<31:45:51, 3.87s/it] {'loss': 0.174, 'grad_norm': 0.8051411523838466, 'learning_rate': 9.696250899878114e-06, 'epoch': 0.14} 14%|█▍ | 4737/34278 [5:19:49<31:45:51, 3.87s/it] 14%|█▍ | 4738/34278 [5:19:53<34:17:19, 4.18s/it] {'loss': 0.2058, 'grad_norm': 0.8422472358201145, 'learning_rate': 9.696088723779481e-06, 'epoch': 0.14} 14%|█▍ | 4738/34278 [5:19:53<34:17:19, 4.18s/it] 14%|█▍ | 4739/34278 [5:19:56<31:32:23, 3.84s/it] {'loss': 0.1796, 'grad_norm': 1.0801041751659373, 'learning_rate': 9.695926505755291e-06, 'epoch': 0.14} 14%|█▍ | 4739/34278 [5:19:57<31:32:23, 3.84s/it] 14%|█▍ | 4740/34278 [5:20:00<30:06:19, 3.67s/it] {'loss': 0.1877, 'grad_norm': 0.7600740521139456, 'learning_rate': 9.695764245806989e-06, 'epoch': 0.14} 14%|█▍ | 4740/34278 [5:20:00<30:06:19, 3.67s/it] 14%|█▍ | 4741/34278 [5:20:03<28:04:52, 3.42s/it] {'loss': 0.1549, 'grad_norm': 0.7945319040741914, 'learning_rate': 9.695601943936026e-06, 'epoch': 0.14} 14%|█▍ | 4741/34278 [5:20:03<28:04:52, 3.42s/it] 14%|█▍ | 4742/34278 [5:20:06<27:06:08, 3.30s/it] {'loss': 0.1774, 'grad_norm': 0.992848090373319, 'learning_rate': 9.69543960014385e-06, 'epoch': 0.14} 14%|█▍ | 4742/34278 [5:20:06<27:06:08, 3.30s/it] 14%|█▍ | 4743/34278 [5:20:10<29:08:57, 3.55s/it] {'loss': 0.1525, 'grad_norm': 0.7954009770454634, 'learning_rate': 9.695277214431909e-06, 'epoch': 0.14} 14%|█▍ | 4743/34278 [5:20:10<29:08:57, 3.55s/it] 14%|█▍ | 4744/34278 [5:20:13<27:39:49, 3.37s/it] {'loss': 0.1813, 'grad_norm': 1.0887109705714113, 'learning_rate': 9.695114786801654e-06, 'epoch': 0.14} 14%|█▍ | 4744/34278 [5:20:13<27:39:49, 3.37s/it] 14%|█▍ | 4745/34278 [5:20:16<26:29:30, 3.23s/it] {'loss': 0.1788, 'grad_norm': 0.7643587553064333, 'learning_rate': 9.694952317254535e-06, 'epoch': 0.14} 14%|█▍ | 4745/34278 [5:20:16<26:29:30, 3.23s/it] 14%|█▍ | 4746/34278 [5:20:18<25:23:29, 3.10s/it] {'loss': 0.1648, 'grad_norm': 1.371828272599082, 'learning_rate': 9.694789805792001e-06, 
'epoch': 0.14} 14%|█▍ | 4746/34278 [5:20:18<25:23:29, 3.10s/it] 14%|█▍ | 4747/34278 [5:20:22<25:55:26, 3.16s/it] {'loss': 0.1668, 'grad_norm': 0.8498365739317533, 'learning_rate': 9.694627252415507e-06, 'epoch': 0.14} 14%|█▍ | 4747/34278 [5:20:22<25:55:26, 3.16s/it] 14%|█▍ | 4748/34278 [5:20:25<25:49:55, 3.15s/it] {'loss': 0.1729, 'grad_norm': 1.2725773426236568, 'learning_rate': 9.6944646571265e-06, 'epoch': 0.14} 14%|█▍ | 4748/34278 [5:20:25<25:49:55, 3.15s/it] 14%|█▍ | 4749/34278 [5:20:31<32:33:11, 3.97s/it] {'loss': 0.1991, 'grad_norm': 0.8315757480574972, 'learning_rate': 9.694302019926433e-06, 'epoch': 0.14} 14%|█▍ | 4749/34278 [5:20:31<32:33:11, 3.97s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9555 > 8192). Running this sequence through the model will result in indexing errors 14%|█▍ | 4750/34278 [5:20:34<30:09:10, 3.68s/it] {'loss': 0.1822, 'grad_norm': 1.1038174200605573, 'learning_rate': 9.69413934081676e-06, 'epoch': 0.14} 14%|█▍ | 4750/34278 [5:20:34<30:09:10, 3.68s/it] 14%|█▍ | 4751/34278 [5:20:37<30:25:28, 3.71s/it] {'loss': 0.1935, 'grad_norm': 0.9394178159287982, 'learning_rate': 9.69397661979893e-06, 'epoch': 0.14} 14%|█▍ | 4751/34278 [5:20:37<30:25:28, 3.71s/it] 14%|█▍ | 4752/34278 [5:20:41<29:09:49, 3.56s/it] {'loss': 0.181, 'grad_norm': 0.9224771336747449, 'learning_rate': 9.693813856874399e-06, 'epoch': 0.14} 14%|█▍ | 4752/34278 [5:20:41<29:09:49, 3.56s/it] 14%|█▍ | 4753/34278 [5:20:47<34:47:22, 4.24s/it] {'loss': 0.201, 'grad_norm': 1.0966758332974174, 'learning_rate': 9.693651052044617e-06, 'epoch': 0.14} 14%|█▍ | 4753/34278 [5:20:47<34:47:22, 4.24s/it] 14%|█▍ | 4754/34278 [5:20:49<31:25:21, 3.83s/it] {'loss': 0.1764, 'grad_norm': 1.0613697191543172, 'learning_rate': 9.693488205311039e-06, 'epoch': 0.14} 14%|█▍ | 4754/34278 [5:20:49<31:25:21, 3.83s/it] 14%|█▍ | 4755/34278 [5:20:52<28:45:23, 3.51s/it] {'loss': 0.1506, 'grad_norm': 0.8008712692693377, 'learning_rate': 
9.693325316675118e-06, 'epoch': 0.14} 14%|█▍ | 4755/34278 [5:20:52<28:45:23, 3.51s/it] 14%|█▍ | 4756/34278 [5:20:55<27:21:57, 3.34s/it] {'loss': 0.1886, 'grad_norm': 0.9747624024045326, 'learning_rate': 9.69316238613831e-06, 'epoch': 0.14} 14%|█▍ | 4756/34278 [5:20:55<27:21:57, 3.34s/it] 14%|█▍ | 4757/34278 [5:20:58<27:01:02, 3.29s/it] {'loss': 0.1833, 'grad_norm': 1.0202675067414013, 'learning_rate': 9.69299941370207e-06, 'epoch': 0.14} 14%|█▍ | 4757/34278 [5:20:58<27:01:02, 3.29s/it] 14%|█▍ | 4758/34278 [5:21:03<30:51:16, 3.76s/it] {'loss': 0.1441, 'grad_norm': 0.8653840356432823, 'learning_rate': 9.692836399367849e-06, 'epoch': 0.14} 14%|█▍ | 4758/34278 [5:21:03<30:51:16, 3.76s/it] 14%|█▍ | 4759/34278 [5:21:07<30:42:27, 3.74s/it] {'loss': 0.1548, 'grad_norm': 0.8570360870727849, 'learning_rate': 9.692673343137105e-06, 'epoch': 0.14} 14%|█▍ | 4759/34278 [5:21:07<30:42:27, 3.74s/it] 14%|█▍ | 4760/34278 [5:21:10<29:40:09, 3.62s/it] {'loss': 0.1604, 'grad_norm': 0.6883782771617556, 'learning_rate': 9.692510245011295e-06, 'epoch': 0.14} 14%|█▍ | 4760/34278 [5:21:10<29:40:09, 3.62s/it] 14%|█▍ | 4761/34278 [5:21:16<35:25:42, 4.32s/it] {'loss': 0.1833, 'grad_norm': 0.8918098074714871, 'learning_rate': 9.692347104991872e-06, 'epoch': 0.14} 14%|█▍ | 4761/34278 [5:21:16<35:25:42, 4.32s/it] 14%|█▍ | 4762/34278 [5:21:19<32:51:08, 4.01s/it] {'loss': 0.2015, 'grad_norm': 0.8063028934656418, 'learning_rate': 9.692183923080296e-06, 'epoch': 0.14} 14%|█▍ | 4762/34278 [5:21:19<32:51:08, 4.01s/it] 14%|█▍ | 4763/34278 [5:21:23<31:22:06, 3.83s/it] {'loss': 0.2058, 'grad_norm': 0.8285488144862405, 'learning_rate': 9.692020699278022e-06, 'epoch': 0.14} 14%|█▍ | 4763/34278 [5:21:23<31:22:06, 3.83s/it] 14%|█▍ | 4764/34278 [5:21:26<29:14:32, 3.57s/it] {'loss': 0.1755, 'grad_norm': 0.9826245249457514, 'learning_rate': 9.691857433586506e-06, 'epoch': 0.14} 14%|█▍ | 4764/34278 [5:21:26<29:14:32, 3.57s/it] 14%|█▍ | 4765/34278 [5:21:29<29:36:59, 3.61s/it] {'loss': 0.1621, 'grad_norm': 
1.0692972601624349, 'learning_rate': 9.691694126007207e-06, 'epoch': 0.14} 14%|█▍ | 4765/34278 [5:21:30<29:36:59, 3.61s/it] 14%|█▍ | 4766/34278 [5:21:33<29:16:18, 3.57s/it] {'loss': 0.1718, 'grad_norm': 0.7243879299682265, 'learning_rate': 9.691530776541584e-06, 'epoch': 0.14} 14%|█▍ | 4766/34278 [5:21:33<29:16:18, 3.57s/it] 14%|█▍ | 4767/34278 [5:21:36<27:04:32, 3.30s/it] {'loss': 0.1684, 'grad_norm': 1.469514154173167, 'learning_rate': 9.691367385191092e-06, 'epoch': 0.14} 14%|█▍ | 4767/34278 [5:21:36<27:04:32, 3.30s/it] 14%|█▍ | 4768/34278 [5:21:39<27:15:06, 3.32s/it] {'loss': 0.1612, 'grad_norm': 1.007738393673632, 'learning_rate': 9.691203951957195e-06, 'epoch': 0.14} 14%|█▍ | 4768/34278 [5:21:39<27:15:06, 3.32s/it] 14%|█▍ | 4769/34278 [5:21:43<28:21:53, 3.46s/it] {'loss': 0.1872, 'grad_norm': 0.7627352180329647, 'learning_rate': 9.691040476841347e-06, 'epoch': 0.14} 14%|█▍ | 4769/34278 [5:21:43<28:21:53, 3.46s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▍ | 4770/34278 [5:21:46<26:55:36, 3.29s/it] {'loss': 0.1836, 'grad_norm': 1.0794946403128456, 'learning_rate': 9.69087695984501e-06, 'epoch': 0.14} 14%|█▍ | 4770/34278 [5:21:46<26:55:36, 3.29s/it] 14%|█▍ | 4771/34278 [5:21:49<27:04:14, 3.30s/it] {'loss': 0.1692, 'grad_norm': 0.8567896408660971, 'learning_rate': 9.690713400969643e-06, 'epoch': 0.14} 14%|█▍ | 4771/34278 [5:21:49<27:04:14, 3.30s/it] 14%|█▍ | 4772/34278 [5:21:52<25:58:36, 3.17s/it] {'loss': 0.1707, 'grad_norm': 0.7951352571820303, 'learning_rate': 9.690549800216707e-06, 'epoch': 0.14} 14%|█▍ | 4772/34278 [5:21:52<25:58:36, 3.17s/it] 14%|█▍ | 4773/34278 [5:21:55<25:49:30, 3.15s/it] {'loss': 0.1804, 'grad_norm': 0.8863994923542984, 'learning_rate': 9.69038615758766e-06, 'epoch': 0.14} 14%|█▍ | 4773/34278 [5:21:55<25:49:30, 3.15s/it] 14%|█▍ | 4774/34278 [5:21:58<26:16:10, 3.21s/it] {'loss': 0.1781, 'grad_norm': 1.00564861230173, 'learning_rate': 9.690222473083969e-06, 'epoch': 0.14} 14%|█▍ | 4774/34278 [5:21:58<26:16:10, 3.21s/it] 14%|█▍ | 4775/34278 [5:22:02<27:49:00, 3.39s/it] {'loss': 0.157, 'grad_norm': 0.940740162116202, 'learning_rate': 9.690058746707088e-06, 'epoch': 0.14} 14%|█▍ | 4775/34278 [5:22:02<27:49:00, 3.39s/it] 14%|█▍ | 4776/34278 [5:22:06<29:29:12, 3.60s/it] {'loss': 0.1905, 'grad_norm': 0.8229699866809916, 'learning_rate': 9.689894978458483e-06, 'epoch': 0.14} 14%|█▍ | 4776/34278 [5:22:06<29:29:12, 3.60s/it] 14%|█▍ | 4777/34278 [5:22:09<27:05:21, 3.31s/it] {'loss': 0.1481, 'grad_norm': 0.8355385203041596, 'learning_rate': 9.689731168339617e-06, 'epoch': 0.14} 14%|█▍ | 4777/34278 [5:22:09<27:05:21, 3.31s/it] 14%|█▍ | 4778/34278 [5:22:12<25:36:55, 3.13s/it] {'loss': 0.1554, 'grad_norm': 1.0131008965133432, 'learning_rate': 9.689567316351948e-06, 'epoch': 0.14} 14%|█▍ | 4778/34278 [5:22:12<25:36:55, 3.13s/it] 14%|█▍ | 4779/34278 [5:22:18<33:07:07, 4.04s/it] {'loss': 0.1821, 'grad_norm': 0.922450578193256, 'learning_rate': 9.689403422496943e-06, 
'epoch': 0.14} 14%|█▍ | 4779/34278 [5:22:18<33:07:07, 4.04s/it] 14%|█▍ | 4780/34278 [5:22:24<38:25:00, 4.69s/it] {'loss': 0.2, 'grad_norm': 0.8739646449136426, 'learning_rate': 9.689239486776062e-06, 'epoch': 0.14} 14%|█▍ | 4780/34278 [5:22:24<38:25:00, 4.69s/it] 14%|█▍ | 4781/34278 [5:22:28<35:48:38, 4.37s/it] {'loss': 0.1892, 'grad_norm': 0.7958490554860915, 'learning_rate': 9.689075509190773e-06, 'epoch': 0.14} 14%|█▍ | 4781/34278 [5:22:28<35:48:38, 4.37s/it] 14%|█▍ | 4782/34278 [5:22:30<32:07:28, 3.92s/it] {'loss': 0.1629, 'grad_norm': 1.081798561341606, 'learning_rate': 9.688911489742536e-06, 'epoch': 0.14} 14%|█▍ | 4782/34278 [5:22:30<32:07:28, 3.92s/it] 14%|█▍ | 4783/34278 [5:22:35<34:48:08, 4.25s/it] {'loss': 0.1786, 'grad_norm': 1.0034315642502445, 'learning_rate': 9.688747428432817e-06, 'epoch': 0.14} 14%|█▍ | 4783/34278 [5:22:35<34:48:08, 4.25s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▍ | 4784/34278 [5:22:39<33:23:40, 4.08s/it] {'loss': 0.1918, 'grad_norm': 1.1513257134225778, 'learning_rate': 9.68858332526308e-06, 'epoch': 0.14} 14%|█▍ | 4784/34278 [5:22:39<33:23:40, 4.08s/it] 14%|█▍ | 4785/34278 [5:22:42<30:12:30, 3.69s/it] {'loss': 0.1708, 'grad_norm': 0.931118991810217, 'learning_rate': 9.68841918023479e-06, 'epoch': 0.14} 14%|█▍ | 4785/34278 [5:22:42<30:12:30, 3.69s/it] 14%|█▍ | 4786/34278 [5:22:46<30:24:26, 3.71s/it] {'loss': 0.1728, 'grad_norm': 0.9032086134251692, 'learning_rate': 9.688254993349413e-06, 'epoch': 0.14} 14%|█▍ | 4786/34278 [5:22:46<30:24:26, 3.71s/it] 14%|█▍ | 4787/34278 [5:22:50<32:21:51, 3.95s/it] {'loss': 0.1724, 'grad_norm': 0.8804102543714083, 'learning_rate': 9.688090764608414e-06, 'epoch': 0.14} 14%|█▍ | 4787/34278 [5:22:50<32:21:51, 3.95s/it] 14%|█▍ | 4788/34278 [5:22:53<29:37:53, 3.62s/it] {'loss': 0.1996, 'grad_norm': 0.9761342170497085, 'learning_rate': 9.68792649401326e-06, 'epoch': 0.14} 14%|█▍ | 4788/34278 [5:22:53<29:37:53, 3.62s/it] 14%|█▍ | 4789/34278 [5:22:57<30:02:02, 3.67s/it] {'loss': 0.1751, 'grad_norm': 1.1328109314819435, 'learning_rate': 9.687762181565417e-06, 'epoch': 0.14} 14%|█▍ | 4789/34278 [5:22:57<30:02:02, 3.67s/it] 14%|█▍ | 4790/34278 [5:23:00<29:14:43, 3.57s/it] {'loss': 0.1617, 'grad_norm': 0.6669056177756585, 'learning_rate': 9.687597827266355e-06, 'epoch': 0.14} 14%|█▍ | 4790/34278 [5:23:00<29:14:43, 3.57s/it] 14%|█▍ | 4791/34278 [5:23:05<32:01:04, 3.91s/it] {'loss': 0.1762, 'grad_norm': 1.0562784085060106, 'learning_rate': 9.687433431117536e-06, 'epoch': 0.14} 14%|█▍ | 4791/34278 [5:23:05<32:01:04, 3.91s/it] 14%|█▍ | 4792/34278 [5:23:08<29:29:03, 3.60s/it] {'loss': 0.1845, 'grad_norm': 0.9796484153565788, 'learning_rate': 9.68726899312043e-06, 'epoch': 0.14} 14%|█▍ | 4792/34278 [5:23:08<29:29:03, 3.60s/it] 14%|█▍ | 4793/34278 [5:23:14<35:10:39, 4.30s/it] {'loss': 0.1808, 'grad_norm': 0.8837581374484309, 'learning_rate': 9.687104513276506e-06, 
'epoch': 0.14} 14%|█▍ | 4793/34278 [5:23:14<35:10:39, 4.30s/it] 14%|█▍ | 4794/34278 [5:23:17<31:54:59, 3.90s/it] {'loss': 0.1809, 'grad_norm': 0.9029351835677035, 'learning_rate': 9.686939991587231e-06, 'epoch': 0.14} 14%|█▍ | 4794/34278 [5:23:17<31:54:59, 3.90s/it] 14%|█▍ | 4795/34278 [5:23:20<30:45:59, 3.76s/it] {'loss': 0.1855, 'grad_norm': 1.073322187155975, 'learning_rate': 9.686775428054077e-06, 'epoch': 0.14} 14%|█▍ | 4795/34278 [5:23:20<30:45:59, 3.76s/it] 14%|█▍ | 4796/34278 [5:23:23<29:33:24, 3.61s/it] {'loss': 0.1758, 'grad_norm': 0.872777369415657, 'learning_rate': 9.68661082267851e-06, 'epoch': 0.14} 14%|█▍ | 4796/34278 [5:23:23<29:33:24, 3.61s/it] 14%|█▍ | 4797/34278 [5:23:28<32:06:03, 3.92s/it] {'loss': 0.1992, 'grad_norm': 1.220254433923574, 'learning_rate': 9.686446175462e-06, 'epoch': 0.14} 14%|█▍ | 4797/34278 [5:23:28<32:06:03, 3.92s/it] 14%|█▍ | 4798/34278 [5:23:32<31:29:01, 3.84s/it] {'loss': 0.1957, 'grad_norm': 1.0710973421363612, 'learning_rate': 9.686281486406016e-06, 'epoch': 0.14} 14%|█▍ | 4798/34278 [5:23:32<31:29:01, 3.84s/it] 14%|█▍ | 4799/34278 [5:23:35<30:35:23, 3.74s/it] {'loss': 0.1805, 'grad_norm': 0.8878692918807475, 'learning_rate': 9.68611675551203e-06, 'epoch': 0.14} 14%|█▍ | 4799/34278 [5:23:35<30:35:23, 3.74s/it] 14%|█▍ | 4800/34278 [5:23:39<31:29:06, 3.85s/it] {'loss': 0.1702, 'grad_norm': 1.2768733842676163, 'learning_rate': 9.685951982781515e-06, 'epoch': 0.14} 14%|█▍ | 4800/34278 [5:23:39<31:29:06, 3.85s/it] 14%|█▍ | 4801/34278 [5:23:45<37:06:37, 4.53s/it] {'loss': 0.1667, 'grad_norm': 0.7965365572186565, 'learning_rate': 9.685787168215936e-06, 'epoch': 0.14} 14%|█▍ | 4801/34278 [5:23:45<37:06:37, 4.53s/it] 14%|█▍ | 4802/34278 [5:23:48<33:09:43, 4.05s/it] {'loss': 0.1803, 'grad_norm': 0.9463080448399828, 'learning_rate': 9.68562231181677e-06, 'epoch': 0.14} 14%|█▍ | 4802/34278 [5:23:48<33:09:43, 4.05s/it] 14%|█▍ | 4803/34278 [5:23:51<30:30:35, 3.73s/it] {'loss': 0.1546, 'grad_norm': 0.8209363619499964, 'learning_rate': 
9.685457413585485e-06, 'epoch': 0.14} 14%|█▍ | 4803/34278 [5:23:51<30:30:35, 3.73s/it] 14%|█▍ | 4804/34278 [5:23:54<28:58:27, 3.54s/it] {'loss': 0.195, 'grad_norm': 0.9216286184168969, 'learning_rate': 9.685292473523556e-06, 'epoch': 0.14} 14%|█▍ | 4804/34278 [5:23:54<28:58:27, 3.54s/it] 14%|█▍ | 4805/34278 [5:23:57<27:47:12, 3.39s/it] {'loss': 0.2126, 'grad_norm': 0.7379346868485538, 'learning_rate': 9.685127491632453e-06, 'epoch': 0.14} 14%|█▍ | 4805/34278 [5:23:57<27:47:12, 3.39s/it] 14%|█▍ | 4806/34278 [5:24:00<26:42:53, 3.26s/it] {'loss': 0.1843, 'grad_norm': 0.8735232330808408, 'learning_rate': 9.68496246791365e-06, 'epoch': 0.14} 14%|█▍ | 4806/34278 [5:24:00<26:42:53, 3.26s/it] 14%|█▍ | 4807/34278 [5:24:04<27:32:40, 3.36s/it] {'loss': 0.1806, 'grad_norm': 0.7222127071955207, 'learning_rate': 9.684797402368622e-06, 'epoch': 0.14} 14%|█▍ | 4807/34278 [5:24:04<27:32:40, 3.36s/it] 14%|█▍ | 4808/34278 [5:24:07<27:43:28, 3.39s/it] {'loss': 0.1766, 'grad_norm': 0.7931973528469856, 'learning_rate': 9.684632294998839e-06, 'epoch': 0.14} 14%|█▍ | 4808/34278 [5:24:07<27:43:28, 3.39s/it] 14%|█▍ | 4809/34278 [5:24:10<26:05:18, 3.19s/it] {'loss': 0.1581, 'grad_norm': 0.8743287978832351, 'learning_rate': 9.68446714580578e-06, 'epoch': 0.14} 14%|█▍ | 4809/34278 [5:24:10<26:05:18, 3.19s/it] 14%|█▍ | 4810/34278 [5:24:14<27:12:23, 3.32s/it] {'loss': 0.1527, 'grad_norm': 0.8218092003905312, 'learning_rate': 9.684301954790914e-06, 'epoch': 0.14} 14%|█▍ | 4810/34278 [5:24:14<27:12:23, 3.32s/it] 14%|█▍ | 4811/34278 [5:24:17<27:08:52, 3.32s/it] {'loss': 0.1785, 'grad_norm': 0.8003358263025518, 'learning_rate': 9.68413672195572e-06, 'epoch': 0.14} 14%|█▍ | 4811/34278 [5:24:17<27:08:52, 3.32s/it] 14%|█▍ | 4812/34278 [5:24:20<25:36:56, 3.13s/it] {'loss': 0.1701, 'grad_norm': 0.8468720761596434, 'learning_rate': 9.683971447301672e-06, 'epoch': 0.14} 14%|█▍ | 4812/34278 [5:24:20<25:36:56, 3.13s/it] 14%|█▍ | 4813/34278 [5:24:23<25:55:25, 3.17s/it] {'loss': 0.1741, 'grad_norm': 
0.8887877124128191, 'learning_rate': 9.683806130830243e-06, 'epoch': 0.14} 14%|█▍ | 4813/34278 [5:24:23<25:55:25, 3.17s/it] 14%|█▍ | 4814/34278 [5:24:27<27:33:29, 3.37s/it] {'loss': 0.1731, 'grad_norm': 1.0279176973479827, 'learning_rate': 9.683640772542913e-06, 'epoch': 0.14} 14%|█▍ | 4814/34278 [5:24:27<27:33:29, 3.37s/it] 14%|█▍ | 4815/34278 [5:24:32<31:06:40, 3.80s/it] {'loss': 0.1852, 'grad_norm': 0.7195582202001497, 'learning_rate': 9.683475372441154e-06, 'epoch': 0.14} 14%|█▍ | 4815/34278 [5:24:32<31:06:40, 3.80s/it] 14%|█▍ | 4816/34278 [5:24:35<29:27:29, 3.60s/it] {'loss': 0.1855, 'grad_norm': 0.8020805476616278, 'learning_rate': 9.683309930526447e-06, 'epoch': 0.14} 14%|█▍ | 4816/34278 [5:24:35<29:27:29, 3.60s/it] 14%|█▍ | 4817/34278 [5:24:38<28:27:08, 3.48s/it] {'loss': 0.195, 'grad_norm': 0.9048510224345565, 'learning_rate': 9.683144446800265e-06, 'epoch': 0.14} 14%|█▍ | 4817/34278 [5:24:38<28:27:08, 3.48s/it] 14%|█▍ | 4818/34278 [5:24:41<27:38:30, 3.38s/it] {'loss': 0.2009, 'grad_norm': 0.9783988267007112, 'learning_rate': 9.682978921264091e-06, 'epoch': 0.14} 14%|█▍ | 4818/34278 [5:24:41<27:38:30, 3.38s/it] 14%|█▍ | 4819/34278 [5:24:44<26:54:18, 3.29s/it] {'loss': 0.1682, 'grad_norm': 0.854084900659183, 'learning_rate': 9.682813353919395e-06, 'epoch': 0.14} 14%|█▍ | 4819/34278 [5:24:44<26:54:18, 3.29s/it] 14%|█▍ | 4820/34278 [5:24:47<25:43:10, 3.14s/it] {'loss': 0.1763, 'grad_norm': 0.9144785598322361, 'learning_rate': 9.68264774476766e-06, 'epoch': 0.14} 14%|█▍ | 4820/34278 [5:24:47<25:43:10, 3.14s/it] 14%|█▍ | 4821/34278 [5:24:53<32:23:45, 3.96s/it] {'loss': 0.1974, 'grad_norm': 0.8744938036025505, 'learning_rate': 9.682482093810366e-06, 'epoch': 0.14} 14%|█▍ | 4821/34278 [5:24:53<32:23:45, 3.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▍ | 4822/34278 [5:24:56<30:48:44, 3.77s/it] {'loss': 0.2059, 'grad_norm': 0.9891754775443123, 'learning_rate': 9.682316401048988e-06, 'epoch': 0.14} 14%|█▍ | 4822/34278 [5:24:56<30:48:44, 3.77s/it] 14%|█▍ | 4823/34278 [5:24:59<28:24:21, 3.47s/it] {'loss': 0.1657, 'grad_norm': 0.8074593601049309, 'learning_rate': 9.682150666485007e-06, 'epoch': 0.14} 14%|█▍ | 4823/34278 [5:24:59<28:24:21, 3.47s/it] 14%|█▍ | 4824/34278 [5:25:02<28:24:20, 3.47s/it] {'loss': 0.1729, 'grad_norm': 0.8042886657050788, 'learning_rate': 9.681984890119903e-06, 'epoch': 0.14} 14%|█▍ | 4824/34278 [5:25:02<28:24:20, 3.47s/it] 14%|█▍ | 4825/34278 [5:25:05<26:17:26, 3.21s/it] {'loss': 0.2028, 'grad_norm': 1.025908523614656, 'learning_rate': 9.681819071955155e-06, 'epoch': 0.14} 14%|█▍ | 4825/34278 [5:25:05<26:17:26, 3.21s/it] 14%|█▍ | 4826/34278 [5:25:11<32:58:05, 4.03s/it] {'loss': 0.1789, 'grad_norm': 0.8273014432705452, 'learning_rate': 9.681653211992244e-06, 'epoch': 0.14} 14%|█▍ | 4826/34278 [5:25:11<32:58:05, 4.03s/it] 14%|█▍ | 4827/34278 [5:25:14<30:56:12, 3.78s/it] {'loss': 0.1711, 'grad_norm': 1.0057024319220127, 'learning_rate': 9.68148731023265e-06, 'epoch': 0.14} 14%|█▍ | 4827/34278 [5:25:14<30:56:12, 3.78s/it] 14%|█▍ | 4828/34278 [5:25:18<30:07:38, 3.68s/it] {'loss': 0.1922, 'grad_norm': 1.168127807007021, 'learning_rate': 9.681321366677858e-06, 'epoch': 0.14} 14%|█▍ | 4828/34278 [5:25:18<30:07:38, 3.68s/it] 14%|█▍ | 4829/34278 [5:25:21<29:14:28, 3.57s/it] {'loss': 0.1857, 'grad_norm': 1.0477758856695032, 'learning_rate': 9.681155381329344e-06, 'epoch': 0.14} 14%|█▍ | 4829/34278 [5:25:21<29:14:28, 3.57s/it] 14%|█▍ | 4830/34278 [5:25:24<28:18:13, 3.46s/it] {'loss': 0.1767, 'grad_norm': 0.8195480953049104, 'learning_rate': 9.680989354188593e-06, 'epoch': 0.14} 14%|█▍ | 4830/34278 [5:25:24<28:18:13, 3.46s/it] 14%|█▍ | 4831/34278 [5:25:30<34:37:21, 4.23s/it] {'loss': 0.1824, 'grad_norm': 0.9603507354533929, 'learning_rate': 
9.680823285257087e-06, 'epoch': 0.14} 14%|█▍ | 4831/34278 [5:25:30<34:37:21, 4.23s/it] 14%|█▍ | 4832/34278 [5:25:33<32:16:04, 3.94s/it] {'loss': 0.1955, 'grad_norm': 1.1537190976081713, 'learning_rate': 9.680657174536305e-06, 'epoch': 0.14} 14%|█▍ | 4832/34278 [5:25:33<32:16:04, 3.94s/it] 14%|█▍ | 4833/34278 [5:25:39<37:05:11, 4.53s/it] {'loss': 0.1614, 'grad_norm': 0.8363877872012376, 'learning_rate': 9.680491022027736e-06, 'epoch': 0.14} 14%|█▍ | 4833/34278 [5:25:39<37:05:11, 4.53s/it] 14%|█▍ | 4834/34278 [5:25:42<33:32:53, 4.10s/it] {'loss': 0.1586, 'grad_norm': 0.910452640839773, 'learning_rate': 9.68032482773286e-06, 'epoch': 0.14} 14%|█▍ | 4834/34278 [5:25:42<33:32:53, 4.10s/it] 14%|█▍ | 4835/34278 [5:25:46<31:44:33, 3.88s/it] {'loss': 0.1877, 'grad_norm': 1.0384120108467383, 'learning_rate': 9.680158591653162e-06, 'epoch': 0.14} 14%|█▍ | 4835/34278 [5:25:46<31:44:33, 3.88s/it] 14%|█▍ | 4836/34278 [5:25:49<29:44:22, 3.64s/it] {'loss': 0.181, 'grad_norm': 0.8799418821600672, 'learning_rate': 9.679992313790123e-06, 'epoch': 0.14} 14%|█▍ | 4836/34278 [5:25:49<29:44:22, 3.64s/it] 14%|█▍ | 4837/34278 [5:25:52<28:14:46, 3.45s/it] {'loss': 0.1696, 'grad_norm': 1.1052277610346195, 'learning_rate': 9.679825994145232e-06, 'epoch': 0.14} 14%|█▍ | 4837/34278 [5:25:52<28:14:46, 3.45s/it] 14%|█▍ | 4838/34278 [5:25:56<28:46:31, 3.52s/it] {'loss': 0.1689, 'grad_norm': 0.8070947761707465, 'learning_rate': 9.67965963271997e-06, 'epoch': 0.14} 14%|█▍ | 4838/34278 [5:25:56<28:46:31, 3.52s/it] 14%|█▍ | 4839/34278 [5:25:59<27:48:12, 3.40s/it] {'loss': 0.1831, 'grad_norm': 0.8348709118117703, 'learning_rate': 9.679493229515825e-06, 'epoch': 0.14} 14%|█▍ | 4839/34278 [5:25:59<27:48:12, 3.40s/it] 14%|█▍ | 4840/34278 [5:26:02<27:13:19, 3.33s/it] {'loss': 0.1851, 'grad_norm': 0.883498083009703, 'learning_rate': 9.679326784534283e-06, 'epoch': 0.14} 14%|█▍ | 4840/34278 [5:26:02<27:13:19, 3.33s/it] 14%|█▍ | 4841/34278 [5:26:05<26:47:04, 3.28s/it] {'loss': 0.1669, 'grad_norm': 
0.8407038467802971, 'learning_rate': 9.679160297776826e-06, 'epoch': 0.14} 14%|█▍ | 4841/34278 [5:26:05<26:47:04, 3.28s/it] 14%|█▍ | 4842/34278 [5:26:08<26:18:17, 3.22s/it] {'loss': 0.185, 'grad_norm': 0.8112263115764934, 'learning_rate': 9.678993769244942e-06, 'epoch': 0.14} 14%|█▍ | 4842/34278 [5:26:08<26:18:17, 3.22s/it] 14%|█▍ | 4843/34278 [5:26:12<28:30:26, 3.49s/it] {'loss': 0.172, 'grad_norm': 0.8819726568891366, 'learning_rate': 9.678827198940121e-06, 'epoch': 0.14} 14%|█▍ | 4843/34278 [5:26:12<28:30:26, 3.49s/it] 14%|█▍ | 4844/34278 [5:26:16<28:33:38, 3.49s/it] {'loss': 0.1838, 'grad_norm': 0.8196877100230318, 'learning_rate': 9.678660586863847e-06, 'epoch': 0.14} 14%|█▍ | 4844/34278 [5:26:16<28:33:38, 3.49s/it] 14%|█▍ | 4845/34278 [5:26:19<27:02:13, 3.31s/it] {'loss': 0.1889, 'grad_norm': 0.8916286389589517, 'learning_rate': 9.678493933017608e-06, 'epoch': 0.14} 14%|█▍ | 4845/34278 [5:26:19<27:02:13, 3.31s/it] 14%|█▍ | 4846/34278 [5:26:21<25:38:18, 3.14s/it] {'loss': 0.1844, 'grad_norm': 0.8081831430889657, 'learning_rate': 9.678327237402892e-06, 'epoch': 0.14} 14%|█▍ | 4846/34278 [5:26:21<25:38:18, 3.14s/it] 14%|█▍ | 4847/34278 [5:26:25<27:13:47, 3.33s/it] {'loss': 0.2088, 'grad_norm': 0.8996943013924475, 'learning_rate': 9.678160500021188e-06, 'epoch': 0.14} 14%|█▍ | 4847/34278 [5:26:25<27:13:47, 3.33s/it] 14%|█▍ | 4848/34278 [5:26:28<25:53:47, 3.17s/it] {'loss': 0.1849, 'grad_norm': 1.1877947274995793, 'learning_rate': 9.677993720873983e-06, 'epoch': 0.14} 14%|█▍ | 4848/34278 [5:26:28<25:53:47, 3.17s/it] 14%|█▍ | 4849/34278 [5:26:31<25:08:15, 3.08s/it] {'loss': 0.181, 'grad_norm': 0.8454475429653244, 'learning_rate': 9.677826899962767e-06, 'epoch': 0.14} 14%|█▍ | 4849/34278 [5:26:31<25:08:15, 3.08s/it] 14%|█▍ | 4850/34278 [5:26:34<25:15:40, 3.09s/it] {'loss': 0.2063, 'grad_norm': 0.9777729656039873, 'learning_rate': 9.677660037289029e-06, 'epoch': 0.14} 14%|█▍ | 4850/34278 [5:26:34<25:15:40, 3.09s/it] 14%|█▍ | 4851/34278 [5:26:40<32:29:07, 3.97s/it] 
{'loss': 0.1797, 'grad_norm': 0.8346924308437704, 'learning_rate': 9.67749313285426e-06, 'epoch': 0.14} 14%|█▍ | 4851/34278 [5:26:40<32:29:07, 3.97s/it] 14%|█▍ | 4852/34278 [5:26:46<37:31:05, 4.59s/it] {'loss': 0.1828, 'grad_norm': 0.8235078319478367, 'learning_rate': 9.677326186659947e-06, 'epoch': 0.14} 14%|█▍ | 4852/34278 [5:26:46<37:31:05, 4.59s/it] 14%|█▍ | 4853/34278 [5:26:49<34:04:49, 4.17s/it] {'loss': 0.1679, 'grad_norm': 0.8134056400223716, 'learning_rate': 9.677159198707582e-06, 'epoch': 0.14} 14%|█▍ | 4853/34278 [5:26:49<34:04:49, 4.17s/it] 14%|█▍ | 4854/34278 [5:26:53<34:32:54, 4.23s/it] {'loss': 0.1782, 'grad_norm': 0.9372396523329549, 'learning_rate': 9.676992168998657e-06, 'epoch': 0.14} 14%|█▍ | 4854/34278 [5:26:53<34:32:54, 4.23s/it] 14%|█▍ | 4855/34278 [5:26:58<34:17:10, 4.20s/it] {'loss': 0.1568, 'grad_norm': 0.75538528348334, 'learning_rate': 9.676825097534663e-06, 'epoch': 0.14} 14%|█▍ | 4855/34278 [5:26:58<34:17:10, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▍ | 4856/34278 [5:27:02<33:37:00, 4.11s/it] {'loss': 0.1759, 'grad_norm': 0.7854605906588132, 'learning_rate': 9.676657984317092e-06, 'epoch': 0.14} 14%|█▍ | 4856/34278 [5:27:02<33:37:00, 4.11s/it] 14%|█▍ | 4857/34278 [5:27:05<31:11:33, 3.82s/it] {'loss': 0.1794, 'grad_norm': 0.9421707495263524, 'learning_rate': 9.676490829347434e-06, 'epoch': 0.14} 14%|█▍ | 4857/34278 [5:27:05<31:11:33, 3.82s/it] 14%|█▍ | 4858/34278 [5:27:09<32:16:46, 3.95s/it] {'loss': 0.1703, 'grad_norm': 0.7743067145071496, 'learning_rate': 9.67632363262718e-06, 'epoch': 0.14} 14%|█▍ | 4858/34278 [5:27:09<32:16:46, 3.95s/it] 14%|█▍ | 4859/34278 [5:27:12<30:25:31, 3.72s/it] {'loss': 0.1743, 'grad_norm': 0.930500697699094, 'learning_rate': 9.676156394157829e-06, 'epoch': 0.14} 14%|█▍ | 4859/34278 [5:27:12<30:25:31, 3.72s/it] 14%|█▍ | 4860/34278 [5:27:15<28:14:03, 3.46s/it] {'loss': 0.1612, 'grad_norm': 0.7814997975949639, 'learning_rate': 9.675989113940866e-06, 'epoch': 0.14} 14%|█▍ | 4860/34278 [5:27:15<28:14:03, 3.46s/it] 14%|█▍ | 4861/34278 [5:27:18<27:45:15, 3.40s/it] {'loss': 0.1699, 'grad_norm': 0.8865724413664395, 'learning_rate': 9.67582179197779e-06, 'epoch': 0.14} 14%|█▍ | 4861/34278 [5:27:18<27:45:15, 3.40s/it] 14%|█▍ | 4862/34278 [5:27:21<27:02:05, 3.31s/it] {'loss': 0.183, 'grad_norm': 1.1884952321978883, 'learning_rate': 9.675654428270094e-06, 'epoch': 0.14} 14%|█▍ | 4862/34278 [5:27:21<27:02:05, 3.31s/it] 14%|█▍ | 4863/34278 [5:27:24<26:18:08, 3.22s/it] {'loss': 0.1713, 'grad_norm': 0.8074043648783656, 'learning_rate': 9.675487022819273e-06, 'epoch': 0.14} 14%|█▍ | 4863/34278 [5:27:24<26:18:08, 3.22s/it] 14%|█▍ | 4864/34278 [5:27:28<27:27:48, 3.36s/it] {'loss': 0.1705, 'grad_norm': 0.8235702417362052, 'learning_rate': 9.675319575626817e-06, 'epoch': 0.14} 14%|█▍ | 4864/34278 [5:27:28<27:27:48, 3.36s/it] 14%|█▍ | 4865/34278 [5:27:31<27:29:08, 3.36s/it] {'loss': 0.1851, 'grad_norm': 1.2325535207631664, 'learning_rate': 9.675152086694226e-06, 
'epoch': 0.14} 14%|█▍ | 4865/34278 [5:27:31<27:29:08, 3.36s/it] 14%|█▍ | 4866/34278 [5:27:35<27:02:27, 3.31s/it] {'loss': 0.183, 'grad_norm': 1.0524729024408745, 'learning_rate': 9.67498455602299e-06, 'epoch': 0.14} 14%|█▍ | 4866/34278 [5:27:35<27:02:27, 3.31s/it] 14%|█▍ | 4867/34278 [5:27:37<25:54:29, 3.17s/it] {'loss': 0.194, 'grad_norm': 1.0442032804423818, 'learning_rate': 9.674816983614611e-06, 'epoch': 0.14} 14%|█▍ | 4867/34278 [5:27:37<25:54:29, 3.17s/it] 14%|█▍ | 4868/34278 [5:27:42<29:27:13, 3.61s/it] {'loss': 0.1929, 'grad_norm': 0.958524240553488, 'learning_rate': 9.67464936947058e-06, 'epoch': 0.14} 14%|█▍ | 4868/34278 [5:27:42<29:27:13, 3.61s/it] 14%|█▍ | 4869/34278 [5:27:45<28:41:58, 3.51s/it] {'loss': 0.1904, 'grad_norm': 0.9250214990383446, 'learning_rate': 9.674481713592398e-06, 'epoch': 0.14} 14%|█▍ | 4869/34278 [5:27:45<28:41:58, 3.51s/it] 14%|█▍ | 4870/34278 [5:27:48<27:34:52, 3.38s/it] {'loss': 0.2122, 'grad_norm': 0.8600711976437861, 'learning_rate': 9.674314015981557e-06, 'epoch': 0.14} 14%|█▍ | 4870/34278 [5:27:48<27:34:52, 3.38s/it] 14%|█▍ | 4871/34278 [5:27:54<33:18:09, 4.08s/it] {'loss': 0.1628, 'grad_norm': 0.8270568012751615, 'learning_rate': 9.674146276639556e-06, 'epoch': 0.14} 14%|█▍ | 4871/34278 [5:27:54<33:18:09, 4.08s/it] 14%|█▍ | 4872/34278 [5:27:57<31:27:36, 3.85s/it] {'loss': 0.1797, 'grad_norm': 0.891043897789974, 'learning_rate': 9.673978495567895e-06, 'epoch': 0.14} 14%|█▍ | 4872/34278 [5:27:57<31:27:36, 3.85s/it] 14%|█▍ | 4873/34278 [5:28:00<28:46:11, 3.52s/it] {'loss': 0.1758, 'grad_norm': 0.9007499180467402, 'learning_rate': 9.673810672768068e-06, 'epoch': 0.14} 14%|█▍ | 4873/34278 [5:28:00<28:46:11, 3.52s/it] 14%|█▍ | 4874/34278 [5:28:03<28:02:08, 3.43s/it] {'loss': 0.1674, 'grad_norm': 0.9840940820256868, 'learning_rate': 9.673642808241574e-06, 'epoch': 0.14} 14%|█▍ | 4874/34278 [5:28:03<28:02:08, 3.43s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9532 > 8192). 
Running this sequence through the model will result in indexing errors 14%|█▍ | 4875/34278 [5:28:07<28:33:11, 3.50s/it] {'loss': 0.1663, 'grad_norm': 1.0706059805118107, 'learning_rate': 9.673474901989916e-06, 'epoch': 0.14} 14%|█▍ | 4875/34278 [5:28:07<28:33:11, 3.50s/it] 14%|█▍ | 4876/34278 [5:28:10<28:16:54, 3.46s/it] {'loss': 0.1641, 'grad_norm': 0.7539907717704584, 'learning_rate': 9.673306954014588e-06, 'epoch': 0.14} 14%|█▍ | 4876/34278 [5:28:10<28:16:54, 3.46s/it] 14%|█▍ | 4877/34278 [5:28:14<28:10:08, 3.45s/it] {'loss': 0.1917, 'grad_norm': 0.9564622588789834, 'learning_rate': 9.673138964317091e-06, 'epoch': 0.14} 14%|█▍ | 4877/34278 [5:28:14<28:10:08, 3.45s/it] 14%|█▍ | 4878/34278 [5:28:18<29:11:14, 3.57s/it] {'loss': 0.1669, 'grad_norm': 1.072023018231923, 'learning_rate': 9.672970932898923e-06, 'epoch': 0.14} 14%|█▍ | 4878/34278 [5:28:18<29:11:14, 3.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▍ | 4879/34278 [5:28:24<36:18:20, 4.45s/it] {'loss': 0.1719, 'grad_norm': 0.876912322391861, 'learning_rate': 9.67280285976159e-06, 'epoch': 0.14} 14%|█▍ | 4879/34278 [5:28:24<36:18:20, 4.45s/it] 14%|█▍ | 4880/34278 [5:28:29<37:01:20, 4.53s/it] {'loss': 0.1677, 'grad_norm': 1.0984522305127342, 'learning_rate': 9.672634744906585e-06, 'epoch': 0.14} 14%|█▍ | 4880/34278 [5:28:29<37:01:20, 4.53s/it] 14%|█▍ | 4881/34278 [5:28:32<32:58:11, 4.04s/it] {'loss': 0.1843, 'grad_norm': 0.9969050836869188, 'learning_rate': 9.672466588335414e-06, 'epoch': 0.14} 14%|█▍ | 4881/34278 [5:28:32<32:58:11, 4.04s/it] 14%|█▍ | 4882/34278 [5:28:38<37:26:28, 4.59s/it] {'loss': 0.1782, 'grad_norm': 0.8220361004029493, 'learning_rate': 9.672298390049577e-06, 'epoch': 0.14} 14%|█▍ | 4882/34278 [5:28:38<37:26:28, 4.59s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▍ | 4883/34278 [5:28:41<34:06:06, 4.18s/it] {'loss': 0.1809, 'grad_norm': 1.0229843634298637, 'learning_rate': 9.672130150050576e-06, 'epoch': 0.14} 14%|█▍ | 4883/34278 [5:28:41<34:06:06, 4.18s/it] 14%|█▍ | 4884/34278 [5:28:44<31:57:59, 3.92s/it] {'loss': 0.1731, 'grad_norm': 0.970871248674842, 'learning_rate': 9.67196186833991e-06, 'epoch': 0.14} 14%|█▍ | 4884/34278 [5:28:44<31:57:59, 3.92s/it] 14%|█▍ | 4885/34278 [5:28:47<30:06:47, 3.69s/it] {'loss': 0.1873, 'grad_norm': 1.1818675378503378, 'learning_rate': 9.671793544919086e-06, 'epoch': 0.14} 14%|█▍ | 4885/34278 [5:28:47<30:06:47, 3.69s/it] 14%|█▍ | 4886/34278 [5:28:51<28:58:47, 3.55s/it] {'loss': 0.1758, 'grad_norm': 0.9214167530946143, 'learning_rate': 9.671625179789603e-06, 'epoch': 0.14} 14%|█▍ | 4886/34278 [5:28:51<28:58:47, 3.55s/it] 14%|█▍ | 4887/34278 [5:28:55<29:55:59, 3.67s/it] {'loss': 0.1707, 'grad_norm': 0.807857849013604, 'learning_rate': 9.671456772952967e-06, 'epoch': 0.14} 14%|█▍ | 4887/34278 [5:28:55<29:55:59, 3.67s/it] 14%|█▍ | 4888/34278 [5:28:57<27:49:55, 3.41s/it] {'loss': 0.1761, 'grad_norm': 1.1217031758878113, 'learning_rate': 9.671288324410678e-06, 'epoch': 0.14} 14%|█▍ | 4888/34278 [5:28:57<27:49:55, 3.41s/it] 14%|█▍ | 4889/34278 [5:29:00<27:09:35, 3.33s/it] {'loss': 0.1554, 'grad_norm': 0.7196737452829093, 'learning_rate': 9.671119834164245e-06, 'epoch': 0.14} 14%|█▍ | 4889/34278 [5:29:00<27:09:35, 3.33s/it] 14%|█▍ | 4890/34278 [5:29:06<32:37:44, 4.00s/it] {'loss': 0.2039, 'grad_norm': 0.8343552659400554, 'learning_rate': 9.670951302215166e-06, 'epoch': 0.14} 14%|█▍ | 4890/34278 [5:29:06<32:37:44, 4.00s/it] 14%|█▍ | 4891/34278 [5:29:10<32:11:50, 3.94s/it] {'loss': 0.1932, 'grad_norm': 0.967408695123973, 'learning_rate': 9.67078272856495e-06, 'epoch': 0.14} 14%|█▍ | 4891/34278 [5:29:10<32:11:50, 3.94s/it] 14%|█▍ | 4892/34278 [5:29:13<31:06:15, 3.81s/it] {'loss': 0.1958, 'grad_norm': 0.892280770306024, 'learning_rate': 9.670614113215102e-06, 
'epoch': 0.14} 14%|█▍ | 4892/34278 [5:29:13<31:06:15, 3.81s/it] 14%|█▍ | 4893/34278 [5:29:17<30:05:19, 3.69s/it] {'loss': 0.177, 'grad_norm': 0.8341263048425164, 'learning_rate': 9.670445456167125e-06, 'epoch': 0.14} 14%|█▍ | 4893/34278 [5:29:17<30:05:19, 3.69s/it] 14%|█▍ | 4894/34278 [5:29:20<28:19:12, 3.47s/it] {'loss': 0.1557, 'grad_norm': 0.8470140746338892, 'learning_rate': 9.670276757422525e-06, 'epoch': 0.14} 14%|█▍ | 4894/34278 [5:29:20<28:19:12, 3.47s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fafd81bf240>
Failed to fetch sample 3110973.
Exception: cannot identify image file <_io.BytesIO object at 0x7fafd81bf240> 14%|█▍ | 4895/34278 [5:29:24<29:18:11, 3.59s/it] {'loss': 0.1607, 'grad_norm': 0.7642882308535696, 'learning_rate': 9.670108016982812e-06, 'epoch': 0.14} 14%|█▍ | 4895/34278 [5:29:24<29:18:11, 3.59s/it] 14%|█▍ | 4896/34278 [5:29:29<34:34:58, 4.24s/it] {'loss': 0.1648, 'grad_norm': 0.9228984753411509, 'learning_rate': 9.669939234849485e-06, 'epoch': 0.14} 14%|█▍ | 4896/34278 [5:29:29<34:34:58, 4.24s/it] 14%|█▍ | 4897/34278 [5:29:33<32:07:17, 3.94s/it] {'loss': 0.1886, 'grad_norm': 0.9083247188440113, 'learning_rate': 9.66977041102406e-06, 'epoch': 0.14} 14%|█▍ | 4897/34278 [5:29:33<32:07:17, 3.94s/it] 14%|█▍ | 4898/34278 [5:29:36<30:16:13, 3.71s/it] {'loss': 0.1827, 'grad_norm': 0.9890740184703097, 'learning_rate': 9.669601545508037e-06, 'epoch': 0.14} 14%|█▍ | 4898/34278 [5:29:36<30:16:13, 3.71s/it] 14%|█▍ | 4899/34278 [5:29:39<29:17:34, 3.59s/it] {'loss': 0.1732, 'grad_norm': 0.952680685214979, 'learning_rate': 9.669432638302926e-06, 'epoch': 0.14} 14%|█▍ | 4899/34278 [5:29:39<29:17:34, 3.59s/it] 14%|█▍ | 4900/34278 [5:29:42<28:26:53, 3.49s/it] {'loss': 0.1736, 'grad_norm': 0.9701128725983815, 'learning_rate': 9.669263689410236e-06, 'epoch': 0.14} 14%|█▍ | 4900/34278 [5:29:42<28:26:53, 3.49s/it] 14%|█▍ | 4901/34278 [5:29:45<27:44:38, 3.40s/it] {'loss': 0.1794, 'grad_norm': 0.9325190999545404, 'learning_rate': 9.669094698831474e-06, 'epoch': 0.14} 14%|█▍ | 4901/34278 [5:29:46<27:44:38, 3.40s/it] 14%|█▍ | 4902/34278 [5:29:48<26:40:16, 3.27s/it] {'loss': 0.1813, 'grad_norm': 0.9819126438419686, 'learning_rate': 9.66892566656815e-06, 'epoch': 0.14} 14%|█▍ | 4902/34278 [5:29:48<26:40:16, 3.27s/it] 14%|█▍ | 4903/34278 [5:29:52<26:32:38, 3.25s/it] {'loss': 0.1745, 'grad_norm': 0.7815883657284478, 'learning_rate': 9.668756592621771e-06, 'epoch': 0.14} 14%|█▍ | 4903/34278 [5:29:52<26:32:38, 3.25s/it] 14%|█▍ | 4904/34278 [5:29:55<26:49:40, 3.29s/it] {'loss': 0.1786, 'grad_norm': 1.0241138247997523, 
'learning_rate': 9.668587476993847e-06, 'epoch': 0.14} 14%|█▍ | 4904/34278 [5:29:55<26:49:40, 3.29s/it] 14%|█▍ | 4905/34278 [5:29:59<27:40:31, 3.39s/it] {'loss': 0.1738, 'grad_norm': 0.8584384943611965, 'learning_rate': 9.66841831968589e-06, 'epoch': 0.14} 14%|█▍ | 4905/34278 [5:29:59<27:40:31, 3.39s/it] 14%|█▍ | 4906/34278 [5:30:02<27:21:16, 3.35s/it] {'loss': 0.1784, 'grad_norm': 0.8130231945414756, 'learning_rate': 9.668249120699409e-06, 'epoch': 0.14} 14%|█▍ | 4906/34278 [5:30:02<27:21:16, 3.35s/it] 14%|█▍ | 4907/34278 [5:30:05<26:46:34, 3.28s/it] {'loss': 0.1666, 'grad_norm': 0.9657054584040975, 'learning_rate': 9.668079880035911e-06, 'epoch': 0.14} 14%|█▍ | 4907/34278 [5:30:05<26:46:34, 3.28s/it] 14%|█▍ | 4908/34278 [5:30:09<27:15:13, 3.34s/it] {'loss': 0.1716, 'grad_norm': 0.7926088134825279, 'learning_rate': 9.667910597696914e-06, 'epoch': 0.14} 14%|█▍ | 4908/34278 [5:30:09<27:15:13, 3.34s/it] 14%|█▍ | 4909/34278 [5:30:12<27:26:23, 3.36s/it] {'loss': 0.172, 'grad_norm': 0.7750102848427548, 'learning_rate': 9.667741273683924e-06, 'epoch': 0.14} 14%|█▍ | 4909/34278 [5:30:12<27:26:23, 3.36s/it] 14%|█▍ | 4910/34278 [5:30:17<31:29:51, 3.86s/it] {'loss': 0.2008, 'grad_norm': 0.945000267560802, 'learning_rate': 9.667571907998455e-06, 'epoch': 0.14} 14%|█▍ | 4910/34278 [5:30:17<31:29:51, 3.86s/it] 14%|█▍ | 4911/34278 [5:30:21<31:28:55, 3.86s/it] {'loss': 0.1951, 'grad_norm': 0.8321070111201656, 'learning_rate': 9.667402500642017e-06, 'epoch': 0.14} 14%|█▍ | 4911/34278 [5:30:21<31:28:55, 3.86s/it] 14%|█▍ | 4912/34278 [5:30:24<29:43:07, 3.64s/it] {'loss': 0.1851, 'grad_norm': 0.7348755939022059, 'learning_rate': 9.667233051616124e-06, 'epoch': 0.14} 14%|█▍ | 4912/34278 [5:30:24<29:43:07, 3.64s/it] 14%|█▍ | 4913/34278 [5:30:27<29:25:04, 3.61s/it] {'loss': 0.211, 'grad_norm': 0.998817677821531, 'learning_rate': 9.66706356092229e-06, 'epoch': 0.14} 14%|█▍ | 4913/34278 [5:30:28<29:25:04, 
3.61s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▍ | 4914/34278 [5:30:31<28:51:40, 3.54s/it] {'loss': 0.1577, 'grad_norm': 1.2869541877033552, 'learning_rate': 9.666894028562025e-06, 'epoch': 0.14} 14%|█▍ | 4914/34278 [5:30:31<28:51:40, 3.54s/it] 14%|█▍ | 4915/34278 [5:30:35<29:17:21, 3.59s/it] {'loss': 0.1902, 'grad_norm': 0.810689609054851, 'learning_rate': 9.666724454536844e-06, 'epoch': 0.14} 14%|█▍ | 4915/34278 [5:30:35<29:17:21, 3.59s/it] 14%|█▍ | 4916/34278 [5:30:38<28:18:29, 3.47s/it] {'loss': 0.1869, 'grad_norm': 0.7782743730720433, 'learning_rate': 9.666554838848262e-06, 'epoch': 0.14} 14%|█▍ | 4916/34278 [5:30:38<28:18:29, 3.47s/it] 14%|█▍ | 4917/34278 [5:30:41<28:22:31, 3.48s/it] {'loss': 0.1727, 'grad_norm': 0.7901143669137367, 'learning_rate': 9.666385181497793e-06, 'epoch': 0.14} 14%|█▍ | 4917/34278 [5:30:41<28:22:31, 3.48s/it] 14%|█▍ | 4918/34278 [5:30:45<29:05:17, 3.57s/it] {'loss': 0.1963, 'grad_norm': 0.8631548557260729, 'learning_rate': 9.66621548248695e-06, 'epoch': 0.14} 14%|█▍ | 4918/34278 [5:30:45<29:05:17, 3.57s/it] 14%|█▍ | 4919/34278 [5:30:50<31:27:58, 3.86s/it] {'loss': 0.2079, 'grad_norm': 0.9935157262006136, 'learning_rate': 9.666045741817249e-06, 'epoch': 0.14} 14%|█▍ | 4919/34278 [5:30:50<31:27:58, 3.86s/it] 14%|█▍ | 4920/34278 [5:30:53<30:00:10, 3.68s/it] {'loss': 0.1619, 'grad_norm': 0.9220926059322345, 'learning_rate': 9.665875959490205e-06, 'epoch': 0.14} 14%|█▍ | 4920/34278 [5:30:53<30:00:10, 3.68s/it] 14%|█▍ | 4921/34278 [5:30:56<28:32:41, 3.50s/it] {'loss': 0.1579, 'grad_norm': 0.8758027143288649, 'learning_rate': 9.665706135507336e-06, 'epoch': 0.14} 14%|█▍ | 4921/34278 [5:30:56<28:32:41, 3.50s/it] 14%|█▍ | 4922/34278 [5:30:59<27:56:40, 3.43s/it] {'loss': 0.171, 'grad_norm': 1.0016470592695905, 'learning_rate': 9.665536269870155e-06, 'epoch': 
0.14} 14%|█▍ | 4922/34278 [5:30:59<27:56:40, 3.43s/it] 14%|█▍ | 4923/34278 [5:31:03<29:42:49, 3.64s/it] {'loss': 0.1906, 'grad_norm': 0.8461107158112773, 'learning_rate': 9.665366362580179e-06, 'epoch': 0.14} 14%|█▍ | 4923/34278 [5:31:03<29:42:49, 3.64s/it] 14%|█▍ | 4924/34278 [5:31:06<27:47:10, 3.41s/it] {'loss': 0.1687, 'grad_norm': 0.8362304161899768, 'learning_rate': 9.665196413638929e-06, 'epoch': 0.14} 14%|█▍ | 4924/34278 [5:31:06<27:47:10, 3.41s/it] 14%|█▍ | 4925/34278 [5:31:09<26:44:47, 3.28s/it] {'loss': 0.1795, 'grad_norm': 0.9839200656933649, 'learning_rate': 9.665026423047916e-06, 'epoch': 0.14} 14%|█▍ | 4925/34278 [5:31:09<26:44:47, 3.28s/it] 14%|█▍ | 4926/34278 [5:31:12<25:58:04, 3.18s/it] {'loss': 0.1687, 'grad_norm': 0.7722141779186894, 'learning_rate': 9.664856390808661e-06, 'epoch': 0.14} 14%|█▍ | 4926/34278 [5:31:12<25:58:04, 3.18s/it] 14%|█▍ | 4927/34278 [5:31:16<27:19:43, 3.35s/it] {'loss': 0.2082, 'grad_norm': 0.9790159364229566, 'learning_rate': 9.664686316922684e-06, 'epoch': 0.14} 14%|█▍ | 4927/34278 [5:31:16<27:19:43, 3.35s/it] 14%|█▍ | 4928/34278 [5:31:19<27:09:58, 3.33s/it] {'loss': 0.1859, 'grad_norm': 0.7355735603486906, 'learning_rate': 9.664516201391501e-06, 'epoch': 0.14} 14%|█▍ | 4928/34278 [5:31:19<27:09:58, 3.33s/it] 14%|█▍ | 4929/34278 [5:31:22<27:10:59, 3.33s/it] {'loss': 0.1685, 'grad_norm': 0.7096188215676326, 'learning_rate': 9.664346044216628e-06, 'epoch': 0.14} 14%|█▍ | 4929/34278 [5:31:23<27:10:59, 3.33s/it] 14%|█▍ | 4930/34278 [5:31:25<25:58:25, 3.19s/it] {'loss': 0.1758, 'grad_norm': 0.7527802600599048, 'learning_rate': 9.66417584539959e-06, 'epoch': 0.14} 14%|█▍ | 4930/34278 [5:31:25<25:58:25, 3.19s/it] 14%|█▍ | 4931/34278 [5:31:28<25:52:58, 3.18s/it] {'loss': 0.1843, 'grad_norm': 0.9988017797907917, 'learning_rate': 9.664005604941901e-06, 'epoch': 0.14} 14%|█▍ | 4931/34278 [5:31:28<25:52:58, 3.18s/it] 14%|█▍ | 4932/34278 [5:31:32<26:03:17, 3.20s/it] {'loss': 0.1741, 'grad_norm': 0.8848600437266344, 'learning_rate': 
9.663835322845086e-06, 'epoch': 0.14} 14%|█▍ | 4932/34278 [5:31:32<26:03:17, 3.20s/it] 14%|█▍ | 4933/34278 [5:31:38<32:46:34, 4.02s/it] {'loss': 0.1653, 'grad_norm': 0.8896283950204757, 'learning_rate': 9.66366499911066e-06, 'epoch': 0.14} 14%|█▍ | 4933/34278 [5:31:38<32:46:34, 4.02s/it] 14%|█▍ | 4934/34278 [5:31:41<30:59:41, 3.80s/it] {'loss': 0.1836, 'grad_norm': 0.9126068563691186, 'learning_rate': 9.663494633740148e-06, 'epoch': 0.14} 14%|█▍ | 4934/34278 [5:31:41<30:59:41, 3.80s/it] 14%|█▍ | 4935/34278 [5:31:44<29:10:59, 3.58s/it] {'loss': 0.1843, 'grad_norm': 0.9289381424332497, 'learning_rate': 9.663324226735069e-06, 'epoch': 0.14} 14%|█▍ | 4935/34278 [5:31:44<29:10:59, 3.58s/it] 14%|█▍ | 4936/34278 [5:31:49<33:02:26, 4.05s/it] {'loss': 0.1585, 'grad_norm': 0.9131368057191285, 'learning_rate': 9.663153778096943e-06, 'epoch': 0.14} 14%|█▍ | 4936/34278 [5:31:49<33:02:26, 4.05s/it] 14%|█▍ | 4937/34278 [5:31:52<30:43:30, 3.77s/it] {'loss': 0.1739, 'grad_norm': 0.9669098843130987, 'learning_rate': 9.662983287827295e-06, 'epoch': 0.14} 14%|█▍ | 4937/34278 [5:31:52<30:43:30, 3.77s/it] 14%|█▍ | 4938/34278 [5:31:58<36:10:36, 4.44s/it] {'loss': 0.1903, 'grad_norm': 0.9790305801935686, 'learning_rate': 9.662812755927645e-06, 'epoch': 0.14} 14%|█▍ | 4938/34278 [5:31:58<36:10:36, 4.44s/it] 14%|█▍ | 4939/34278 [5:32:01<32:10:57, 3.95s/it] {'loss': 0.1882, 'grad_norm': 0.899433487635994, 'learning_rate': 9.662642182399514e-06, 'epoch': 0.14} 14%|█▍ | 4939/34278 [5:32:01<32:10:57, 3.95s/it] 14%|█▍ | 4940/34278 [5:32:04<30:22:54, 3.73s/it] {'loss': 0.1937, 'grad_norm': 1.0199047256431513, 'learning_rate': 9.662471567244428e-06, 'epoch': 0.14} 14%|█▍ | 4940/34278 [5:32:04<30:22:54, 3.73s/it] 14%|█▍ | 4941/34278 [5:32:07<28:12:41, 3.46s/it] {'loss': 0.1606, 'grad_norm': 0.8457275564071992, 'learning_rate': 9.662300910463908e-06, 'epoch': 0.14} 14%|█▍ | 4941/34278 [5:32:07<28:12:41, 3.46s/it] 14%|█▍ | 4942/34278 [5:32:11<29:50:11, 3.66s/it] {'loss': 0.1799, 'grad_norm': 
1.0844181886077786, 'learning_rate': 9.662130212059481e-06, 'epoch': 0.14} 14%|█▍ | 4942/34278 [5:32:11<29:50:11, 3.66s/it] 14%|█▍ | 4943/34278 [5:32:15<29:25:13, 3.61s/it] {'loss': 0.1576, 'grad_norm': 1.225013607112546, 'learning_rate': 9.661959472032667e-06, 'epoch': 0.14} 14%|█▍ | 4943/34278 [5:32:15<29:25:13, 3.61s/it] 14%|█▍ | 4944/34278 [5:32:18<27:38:57, 3.39s/it] {'loss': 0.183, 'grad_norm': 0.88643665251535, 'learning_rate': 9.66178869038499e-06, 'epoch': 0.14} 14%|█▍ | 4944/34278 [5:32:18<27:38:57, 3.39s/it] 14%|█▍ | 4945/34278 [5:32:21<26:49:47, 3.29s/it] {'loss': 0.2036, 'grad_norm': 1.0884878450603226, 'learning_rate': 9.661617867117978e-06, 'epoch': 0.14} 14%|█▍ | 4945/34278 [5:32:21<26:49:47, 3.29s/it] 14%|█▍ | 4946/34278 [5:32:24<25:56:48, 3.18s/it] {'loss': 0.1587, 'grad_norm': 1.194654672655845, 'learning_rate': 9.661447002233156e-06, 'epoch': 0.14} 14%|█▍ | 4946/34278 [5:32:24<25:56:48, 3.18s/it] 14%|█▍ | 4947/34278 [5:32:28<28:00:29, 3.44s/it] {'loss': 0.1819, 'grad_norm': 0.9949113935387659, 'learning_rate': 9.661276095732046e-06, 'epoch': 0.14} 14%|█▍ | 4947/34278 [5:32:28<28:00:29, 3.44s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 14%|█▍ | 4948/34278 [5:32:32<29:09:30, 3.58s/it] {'loss': 0.1672, 'grad_norm': 0.9433703823494165, 'learning_rate': 9.661105147616177e-06, 'epoch': 0.14} 14%|█▍ | 4948/34278 [5:32:32<29:09:30, 3.58s/it] 14%|█▍ | 4949/34278 [5:32:35<29:22:50, 3.61s/it] {'loss': 0.1802, 'grad_norm': 1.1335559624318436, 'learning_rate': 9.660934157887072e-06, 'epoch': 0.14} 14%|█▍ | 4949/34278 [5:32:35<29:22:50, 3.61s/it] 14%|█▍ | 4950/34278 [5:32:39<28:31:54, 3.50s/it] {'loss': 0.1969, 'grad_norm': 1.0942795572129844, 'learning_rate': 9.66076312654626e-06, 'epoch': 0.14} 14%|█▍ | 4950/34278 [5:32:39<28:31:54, 3.50s/it] 14%|█▍ | 4951/34278 [5:32:42<27:35:15, 3.39s/it] {'loss': 0.1666, 'grad_norm': 0.7748508902390537, 'learning_rate': 9.660592053595268e-06, 'epoch': 0.14} 14%|█▍ | 4951/34278 [5:32:42<27:35:15, 3.39s/it] 14%|█▍ | 4952/34278 [5:32:46<29:09:17, 3.58s/it] {'loss': 0.1774, 'grad_norm': 1.1553100246260095, 'learning_rate': 9.660420939035624e-06, 'epoch': 0.14} 14%|█▍ | 4952/34278 [5:32:46<29:09:17, 3.58s/it] 14%|█▍ | 4953/34278 [5:32:49<28:22:02, 3.48s/it] {'loss': 0.2019, 'grad_norm': 0.9604119024938216, 'learning_rate': 9.660249782868853e-06, 'epoch': 0.14} 14%|█▍ | 4953/34278 [5:32:49<28:22:02, 3.48s/it] 14%|█▍ | 4954/34278 [5:32:52<27:53:08, 3.42s/it] {'loss': 0.1765, 'grad_norm': 0.9544521640614683, 'learning_rate': 9.660078585096484e-06, 'epoch': 0.14} 14%|█▍ | 4954/34278 [5:32:52<27:53:08, 3.42s/it] 14%|█▍ | 4955/34278 [5:32:55<26:18:59, 3.23s/it] {'loss': 0.176, 'grad_norm': 1.1864981680288276, 'learning_rate': 9.659907345720046e-06, 'epoch': 0.14} 14%|█▍ | 4955/34278 [5:32:55<26:18:59, 3.23s/it] 14%|█▍ | 4956/34278 [5:32:58<26:28:01, 3.25s/it] {'loss': 0.2009, 'grad_norm': 0.8692202165735067, 'learning_rate': 9.659736064741068e-06, 'epoch': 0.14} 14%|█▍ | 4956/34278 [5:32:58<26:28:01, 3.25s/it] 14%|█▍ | 4957/34278 [5:33:02<27:04:01, 3.32s/it] {'loss': 0.1424, 'grad_norm': 0.646865467793404, 'learning_rate': 9.65956474216108e-06, 
'epoch': 0.14} 14%|█▍ | 4957/34278 [5:33:02<27:04:01, 3.32s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▍ | 4958/34278 [5:33:07<31:24:01, 3.86s/it] {'loss': 0.1845, 'grad_norm': 1.0337038399625629, 'learning_rate': 9.659393377981609e-06, 'epoch': 0.14} 14%|█▍ | 4958/34278 [5:33:07<31:24:01, 3.86s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 14%|█▍ | 4959/34278 [5:33:10<30:34:34, 3.75s/it] {'loss': 0.1727, 'grad_norm': 1.0443727583431346, 'learning_rate': 9.659221972204186e-06, 'epoch': 0.14} 14%|█▍ | 4959/34278 [5:33:10<30:34:34, 3.75s/it] 14%|█▍ | 4960/34278 [5:33:17<36:24:06, 4.47s/it] {'loss': 0.1673, 'grad_norm': 0.8110179707939673, 'learning_rate': 9.65905052483034e-06, 'epoch': 0.14} 14%|█▍ | 4960/34278 [5:33:17<36:24:06, 4.47s/it] 14%|█▍ | 4961/34278 [5:33:23<40:09:29, 4.93s/it] {'loss': 0.1705, 'grad_norm': 0.7813388231834524, 'learning_rate': 9.658879035861606e-06, 'epoch': 0.14} 14%|█▍ | 4961/34278 [5:33:23<40:09:29, 4.93s/it] 14%|█▍ | 4962/34278 [5:33:27<38:45:18, 4.76s/it] {'loss': 0.1749, 'grad_norm': 0.9773009102407664, 'learning_rate': 9.65870750529951e-06, 'epoch': 0.14} 14%|█▍ | 4962/34278 [5:33:27<38:45:18, 4.76s/it] 14%|█▍ | 4963/34278 [5:33:30<35:04:45, 4.31s/it] {'loss': 0.2042, 'grad_norm': 1.0953635856464539, 'learning_rate': 9.658535933145588e-06, 'epoch': 0.14} 14%|█▍ | 4963/34278 [5:33:30<35:04:45, 4.31s/it] 14%|█▍ | 4964/34278 [5:33:36<37:45:19, 4.64s/it] {'loss': 0.1684, 'grad_norm': 0.8310078695169273, 'learning_rate': 9.658364319401368e-06, 'epoch': 0.14} 14%|█▍ | 4964/34278 [5:33:36<37:45:19, 4.64s/it] 14%|█▍ | 4965/34278 [5:33:42<42:33:09, 5.23s/it] {'loss': 0.1942, 
'grad_norm': 0.8966136224520148, 'learning_rate': 9.658192664068382e-06, 'epoch': 0.14} 14%|█▍ | 4965/34278 [5:33:42<42:33:09, 5.23s/it] 14%|█▍ | 4966/34278 [5:33:46<38:09:42, 4.69s/it] {'loss': 0.1953, 'grad_norm': 0.9646429390948612, 'learning_rate': 9.658020967148166e-06, 'epoch': 0.14} 14%|█▍ | 4966/34278 [5:33:46<38:09:42, 4.69s/it] 14%|█▍ | 4967/34278 [5:33:50<38:05:11, 4.68s/it] {'loss': 0.188, 'grad_norm': 0.9060359924917519, 'learning_rate': 9.65784922864225e-06, 'epoch': 0.14} 14%|█▍ | 4967/34278 [5:33:50<38:05:11, 4.68s/it] 14%|█▍ | 4968/34278 [5:33:54<35:16:06, 4.33s/it] {'loss': 0.1605, 'grad_norm': 1.21107510751277, 'learning_rate': 9.657677448552167e-06, 'epoch': 0.14} 14%|█▍ | 4968/34278 [5:33:54<35:16:06, 4.33s/it] 14%|█▍ | 4969/34278 [5:33:57<32:58:14, 4.05s/it] {'loss': 0.1642, 'grad_norm': 0.8668860283561532, 'learning_rate': 9.657505626879452e-06, 'epoch': 0.14} 14%|█▍ | 4969/34278 [5:33:57<32:58:14, 4.05s/it] 14%|█▍ | 4970/34278 [5:34:00<30:41:41, 3.77s/it] {'loss': 0.1879, 'grad_norm': 1.0359011983742257, 'learning_rate': 9.65733376362564e-06, 'epoch': 0.14} 14%|█▍ | 4970/34278 [5:34:00<30:41:41, 3.77s/it] 15%|█▍ | 4971/34278 [5:34:04<29:26:45, 3.62s/it] {'loss': 0.1684, 'grad_norm': 0.7821949860670446, 'learning_rate': 9.657161858792263e-06, 'epoch': 0.15} 15%|█▍ | 4971/34278 [5:34:04<29:26:45, 3.62s/it] 15%|█▍ | 4972/34278 [5:34:07<28:30:06, 3.50s/it] {'loss': 0.1677, 'grad_norm': 0.7176589228105676, 'learning_rate': 9.656989912380857e-06, 'epoch': 0.15} 15%|█▍ | 4972/34278 [5:34:07<28:30:06, 3.50s/it] 15%|█▍ | 4973/34278 [5:34:10<27:32:20, 3.38s/it] {'loss': 0.1957, 'grad_norm': 0.8690548931633352, 'learning_rate': 9.656817924392958e-06, 'epoch': 0.15} 15%|█▍ | 4973/34278 [5:34:10<27:32:20, 3.38s/it] 15%|█▍ | 4974/34278 [5:34:13<28:02:26, 3.44s/it] {'loss': 0.2092, 'grad_norm': 0.8615092289222902, 'learning_rate': 9.656645894830098e-06, 'epoch': 0.15} 15%|█▍ | 4974/34278 [5:34:13<28:02:26, 3.44s/it] 15%|█▍ | 4975/34278 [5:34:19<33:17:21, 
4.09s/it] {'loss': 0.1725, 'grad_norm': 0.9381894088351447, 'learning_rate': 9.656473823693814e-06, 'epoch': 0.15} 15%|█▍ | 4975/34278 [5:34:19<33:17:21, 4.09s/it] 15%|█▍ | 4976/34278 [5:34:23<31:48:29, 3.91s/it] {'loss': 0.1581, 'grad_norm': 0.7216487457606322, 'learning_rate': 9.656301710985646e-06, 'epoch': 0.15} 15%|█▍ | 4976/34278 [5:34:23<31:48:29, 3.91s/it] 15%|█▍ | 4977/34278 [5:34:26<30:08:12, 3.70s/it] {'loss': 0.1655, 'grad_norm': 0.9951387042927817, 'learning_rate': 9.656129556707127e-06, 'epoch': 0.15} 15%|█▍ | 4977/34278 [5:34:26<30:08:12, 3.70s/it] 15%|█▍ | 4978/34278 [5:34:29<28:09:16, 3.46s/it] {'loss': 0.1642, 'grad_norm': 0.6943663219266645, 'learning_rate': 9.655957360859796e-06, 'epoch': 0.15} 15%|█▍ | 4978/34278 [5:34:29<28:09:16, 3.46s/it] 15%|█▍ | 4979/34278 [5:34:35<34:12:06, 4.20s/it] {'loss': 0.163, 'grad_norm': 0.745824175529699, 'learning_rate': 9.655785123445186e-06, 'epoch': 0.15} 15%|█▍ | 4979/34278 [5:34:35<34:12:06, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▍ | 4980/34278 [5:34:38<31:56:50, 3.93s/it] {'loss': 0.1852, 'grad_norm': 0.7571817541508856, 'learning_rate': 9.65561284446484e-06, 'epoch': 0.15} 15%|█▍ | 4980/34278 [5:34:38<31:56:50, 3.93s/it] 15%|█▍ | 4981/34278 [5:34:41<30:44:25, 3.78s/it] {'loss': 0.1658, 'grad_norm': 0.8569921722541516, 'learning_rate': 9.655440523920295e-06, 'epoch': 0.15} 15%|█▍ | 4981/34278 [5:34:41<30:44:25, 3.78s/it] 15%|█▍ | 4982/34278 [5:34:44<28:59:37, 3.56s/it] {'loss': 0.1627, 'grad_norm': 0.7833330263188776, 'learning_rate': 9.655268161813088e-06, 'epoch': 0.15} 15%|█▍ | 4982/34278 [5:34:44<28:59:37, 3.56s/it] 15%|█▍ | 4983/34278 [5:34:50<34:50:54, 4.28s/it] {'loss': 0.1823, 'grad_norm': 0.7938495250166756, 'learning_rate': 9.655095758144757e-06, 'epoch': 0.15} 15%|█▍ | 4983/34278 [5:34:50<34:50:54, 4.28s/it] 15%|█▍ | 4984/34278 [5:34:53<31:39:22, 3.89s/it] {'loss': 0.1646, 'grad_norm': 0.8190841000349872, 'learning_rate': 9.654923312916842e-06, 'epoch': 0.15} 15%|█▍ | 4984/34278 [5:34:53<31:39:22, 3.89s/it] 15%|█▍ | 4985/34278 [5:34:56<29:49:48, 3.67s/it] {'loss': 0.1607, 'grad_norm': 1.2172420550336633, 'learning_rate': 9.654750826130882e-06, 'epoch': 0.15} 15%|█▍ | 4985/34278 [5:34:56<29:49:48, 3.67s/it] 15%|█▍ | 4986/34278 [5:34:59<27:43:00, 3.41s/it] {'loss': 0.161, 'grad_norm': 0.8853725881410209, 'learning_rate': 9.654578297788421e-06, 'epoch': 0.15} 15%|█▍ | 4986/34278 [5:34:59<27:43:00, 3.41s/it] 15%|█▍ | 4987/34278 [5:35:02<27:11:42, 3.34s/it] {'loss': 0.1581, 'grad_norm': 0.83162721813925, 'learning_rate': 9.654405727890994e-06, 'epoch': 0.15} 15%|█▍ | 4987/34278 [5:35:02<27:11:42, 3.34s/it] 15%|█▍ | 4988/34278 [5:35:05<26:13:03, 3.22s/it] {'loss': 0.1761, 'grad_norm': 1.23119552734588, 'learning_rate': 9.654233116440144e-06, 'epoch': 0.15} 15%|█▍ | 4988/34278 [5:35:05<26:13:03, 3.22s/it] 15%|█▍ | 4989/34278 [5:35:08<25:30:56, 3.14s/it] {'loss': 0.176, 'grad_norm': 0.8811144445160094, 'learning_rate': 9.654060463437411e-06, 
'epoch': 0.15} 15%|█▍ | 4989/34278 [5:35:08<25:30:56, 3.14s/it] 15%|█▍ | 4990/34278 [5:35:13<30:23:30, 3.74s/it] {'loss': 0.1919, 'grad_norm': 0.7174372214501783, 'learning_rate': 9.65388776888434e-06, 'epoch': 0.15} 15%|█▍ | 4990/34278 [5:35:13<30:23:30, 3.74s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 4991/34278 [5:35:20<36:03:48, 4.43s/it] {'loss': 0.1632, 'grad_norm': 0.99449911960273, 'learning_rate': 9.653715032782467e-06, 'epoch': 0.15} 15%|█▍ | 4991/34278 [5:35:20<36:03:48, 4.43s/it] 15%|█▍ | 4992/34278 [5:35:22<32:26:50, 3.99s/it] {'loss': 0.2133, 'grad_norm': 0.8985792692934712, 'learning_rate': 9.653542255133339e-06, 'epoch': 0.15} 15%|█▍ | 4992/34278 [5:35:22<32:26:50, 3.99s/it] 15%|█▍ | 4993/34278 [5:35:26<31:19:37, 3.85s/it] {'loss': 0.1738, 'grad_norm': 0.8439128817625573, 'learning_rate': 9.653369435938495e-06, 'epoch': 0.15} 15%|█▍ | 4993/34278 [5:35:26<31:19:37, 3.85s/it] 15%|█▍ | 4994/34278 [5:35:30<31:30:37, 3.87s/it] {'loss': 0.2177, 'grad_norm': 0.9013739350288998, 'learning_rate': 9.65319657519948e-06, 'epoch': 0.15} 15%|█▍ | 4994/34278 [5:35:30<31:30:37, 3.87s/it] 15%|█▍ | 4995/34278 [5:35:36<36:34:16, 4.50s/it] {'loss': 0.1606, 'grad_norm': 1.1169097052433632, 'learning_rate': 9.653023672917839e-06, 'epoch': 0.15} 15%|█▍ | 4995/34278 [5:35:36<36:34:16, 4.50s/it] 15%|█▍ | 4996/34278 [5:35:39<33:52:32, 4.16s/it] {'loss': 0.1931, 'grad_norm': 0.8952326065505679, 'learning_rate': 9.65285072909511e-06, 'epoch': 0.15} 15%|█▍ | 4996/34278 [5:35:39<33:52:32, 4.16s/it] 15%|█▍ | 4997/34278 [5:35:45<38:34:46, 4.74s/it] {'loss': 0.1632, 'grad_norm': 0.8400266765729321, 'learning_rate': 9.652677743732843e-06, 'epoch': 0.15} 15%|█▍ | 4997/34278 [5:35:45<38:34:46, 4.74s/it] 15%|█▍ | 4998/34278 [5:35:50<38:01:10, 4.67s/it] {'loss': 0.1917, 'grad_norm': 
1.0135250732946663, 'learning_rate': 9.652504716832578e-06, 'epoch': 0.15} 14%|█▍ | 4998/34278 [5:35:50<38:01:10, 4.67s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 4999/34278 [5:35:55<38:52:27, 4.78s/it] {'loss': 0.1656, 'grad_norm': 0.8595442628576205, 'learning_rate': 9.652331648395863e-06, 'epoch': 0.15} 15%|█▍ | 4999/34278 [5:35:55<38:52:27, 4.78s/it] 15%|█▍ | 5000/34278 [5:36:00<39:40:03, 4.88s/it] {'loss': 0.1969, 'grad_norm': 0.9970121540001996, 'learning_rate': 9.65215853842424e-06, 'epoch': 0.15} 15%|█▍ | 5000/34278 [5:36:00<39:40:03, 4.88s/it] 15%|█▍ | 5001/34278 [5:36:03<36:12:52, 4.45s/it] {'loss': 0.174, 'grad_norm': 0.9562353762307105, 'learning_rate': 9.651985386919257e-06, 'epoch': 0.15} 15%|█▍ | 5001/34278 [5:36:03<36:12:52, 4.45s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff307edbbf0>
Failed to fetch sample 3353710.
Exception: cannot identify image file <_io.BytesIO object at 0x7ff307edbbf0> 15%|█▍ | 5002/34278 [5:36:08<36:02:50, 4.43s/it] {'loss': 0.1831, 'grad_norm': 0.8746215172434272, 'learning_rate': 9.65181219388246e-06, 'epoch': 0.15} 15%|█▍ | 5002/34278 [5:36:08<36:02:50, 4.43s/it] 15%|█▍ | 5003/34278 [5:36:13<37:14:51, 4.58s/it] {'loss': 0.1906, 'grad_norm': 0.834023705470222, 'learning_rate': 9.651638959315392e-06, 'epoch': 0.15} 15%|█▍ | 5003/34278 [5:36:13<37:14:51, 4.58s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 5004/34278 [5:36:19<41:33:45, 5.11s/it] {'loss': 0.1756, 'grad_norm': 0.8542850057204772, 'learning_rate': 9.651465683219603e-06, 'epoch': 0.15} 15%|█▍ | 5004/34278 [5:36:19<41:33:45, 5.11s/it] 15%|█▍ | 5005/34278 [5:36:23<37:25:37, 4.60s/it] {'loss': 0.1664, 'grad_norm': 0.9872404766398125, 'learning_rate': 9.65129236559664e-06, 'epoch': 0.15} 15%|█▍ | 5005/34278 [5:36:23<37:25:37, 4.60s/it] 15%|█▍ | 5006/34278 [5:36:26<33:36:53, 4.13s/it] {'loss': 0.1721, 'grad_norm': 0.9819310883974036, 'learning_rate': 9.651119006448047e-06, 'epoch': 0.15} 15%|█▍ | 5006/34278 [5:36:26<33:36:53, 4.13s/it] 15%|█▍ | 5007/34278 [5:36:30<35:06:28, 4.32s/it] {'loss': 0.1691, 'grad_norm': 0.7927647989876684, 'learning_rate': 9.650945605775374e-06, 'epoch': 0.15} 15%|█▍ | 5007/34278 [5:36:30<35:06:28, 4.32s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12433 > 8192).
Running this sequence through the model will result in indexing errors 15%|█▍ | 5008/34278 [5:36:33<32:02:47, 3.94s/it] {'loss': 0.1719, 'grad_norm': 0.8871799854751007, 'learning_rate': 9.650772163580171e-06, 'epoch': 0.15} 15%|█▍ | 5008/34278 [5:36:33<32:02:47, 3.94s/it] 15%|█▍ | 5009/34278 [5:36:37<30:04:57, 3.70s/it] {'loss': 0.2095, 'grad_norm': 1.0809928484274853, 'learning_rate': 9.650598679863983e-06, 'epoch': 0.15} 15%|█▍ | 5009/34278 [5:36:37<30:04:57, 3.70s/it] 15%|█▍ | 5010/34278 [5:36:39<27:59:12, 3.44s/it] {'loss': 0.1676, 'grad_norm': 0.7793813813735866, 'learning_rate': 9.65042515462836e-06, 'epoch': 0.15} 15%|█▍ | 5010/34278 [5:36:39<27:59:12, 3.44s/it] 15%|█▍ | 5011/34278 [5:36:43<27:34:35, 3.39s/it] {'loss': 0.1879, 'grad_norm': 1.0455750202298337, 'learning_rate': 9.65025158787485e-06, 'epoch': 0.15} 15%|█▍ | 5011/34278 [5:36:43<27:34:35, 3.39s/it] 15%|█▍ | 5012/34278 [5:36:46<26:59:57, 3.32s/it] {'loss': 0.1956, 'grad_norm': 0.8515921729199322, 'learning_rate': 9.650077979605008e-06, 'epoch': 0.15} 15%|█▍ | 5012/34278 [5:36:46<26:59:57, 3.32s/it] 15%|█▍ | 5013/34278 [5:36:49<25:59:25, 3.20s/it] {'loss': 0.1838, 'grad_norm': 0.9358857516516348, 'learning_rate': 9.649904329820377e-06, 'epoch': 0.15} 15%|█▍ | 5013/34278 [5:36:49<25:59:25, 3.20s/it] 15%|█▍ | 5014/34278 [5:36:52<25:03:27, 3.08s/it] {'loss': 0.1612, 'grad_norm': 1.471933046679515, 'learning_rate': 9.64973063852251e-06, 'epoch': 0.15} 15%|█▍ | 5014/34278 [5:36:52<25:03:27, 3.08s/it] 15%|█▍ | 5015/34278 [5:36:55<26:56:23, 3.31s/it] {'loss': 0.1826, 'grad_norm': 0.7511996901125207, 'learning_rate': 9.649556905712958e-06, 'epoch': 0.15} 15%|█▍ | 5015/34278 [5:36:55<26:56:23, 3.31s/it] 15%|█▍ | 5016/34278 [5:37:00<29:02:32, 3.57s/it] {'loss': 0.1909, 'grad_norm': 1.0837699750013672, 'learning_rate': 9.649383131393273e-06, 'epoch': 0.15} 15%|█▍ | 5016/34278 [5:37:00<29:02:32, 3.57s/it] 15%|█▍ | 5017/34278 [5:37:05<33:25:12, 4.11s/it] {'loss': 0.1615, 'grad_norm': 0.9137538164715912, 
'learning_rate': 9.649209315565005e-06, 'epoch': 0.15} 15%|█▍ | 5017/34278 [5:37:05<33:25:12, 4.11s/it] 15%|█▍ | 5018/34278 [5:37:10<35:14:41, 4.34s/it] {'loss': 0.2022, 'grad_norm': 0.9160622542964243, 'learning_rate': 9.649035458229706e-06, 'epoch': 0.15} 15%|█▍ | 5018/34278 [5:37:10<35:14:41, 4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 5019/34278 [5:37:13<32:38:49, 4.02s/it] {'loss': 0.1823, 'grad_norm': 0.9417295540153825, 'learning_rate': 9.648861559388927e-06, 'epoch': 0.15} 15%|█▍ | 5019/34278 [5:37:13<32:38:49, 4.02s/it] 15%|█▍ | 5020/34278 [5:37:16<30:17:07, 3.73s/it] {'loss': 0.2048, 'grad_norm': 0.9319882259605891, 'learning_rate': 9.648687619044222e-06, 'epoch': 0.15} 15%|█▍ | 5020/34278 [5:37:16<30:17:07, 3.73s/it] 15%|█▍ | 5021/34278 [5:37:19<29:08:56, 3.59s/it] {'loss': 0.1749, 'grad_norm': 0.90146406917979, 'learning_rate': 9.648513637197145e-06, 'epoch': 0.15} 15%|█▍ | 5021/34278 [5:37:19<29:08:56, 3.59s/it] 15%|█▍ | 5022/34278 [5:37:24<32:27:38, 3.99s/it] {'loss': 0.1848, 'grad_norm': 1.0829929748917295, 'learning_rate': 9.648339613849246e-06, 'epoch': 0.15} 15%|█▍ | 5022/34278 [5:37:24<32:27:38, 3.99s/it] 15%|█▍ | 5023/34278 [5:37:28<32:13:19, 3.97s/it] {'loss': 0.19, 'grad_norm': 1.0302658800200448, 'learning_rate': 9.648165549002082e-06, 'epoch': 0.15} 15%|█▍ | 5023/34278 [5:37:28<32:13:19, 3.97s/it] 15%|█▍ | 5024/34278 [5:37:31<29:38:26, 3.65s/it] {'loss': 0.1965, 'grad_norm': 0.911838579645531, 'learning_rate': 9.647991442657206e-06, 'epoch': 0.15} 15%|█▍ | 5024/34278 [5:37:31<29:38:26, 3.65s/it] 15%|█▍ | 5025/34278 [5:37:34<28:29:30, 3.51s/it] {'loss': 0.1928, 'grad_norm': 1.0274456874566988, 'learning_rate': 9.647817294816171e-06, 'epoch': 0.15} 15%|█▍ | 5025/34278 [5:37:34<28:29:30, 3.51s/it] 15%|█▍ | 5026/34278 [5:37:37<26:41:27, 3.28s/it] 
{'loss': 0.1619, 'grad_norm': 0.9573276296487475, 'learning_rate': 9.647643105480533e-06, 'epoch': 0.15} 15%|█▍ | 5026/34278 [5:37:37<26:41:27, 3.28s/it] 15%|█▍ | 5027/34278 [5:37:42<29:49:20, 3.67s/it] {'loss': 0.1878, 'grad_norm': 0.8000526587833882, 'learning_rate': 9.647468874651847e-06, 'epoch': 0.15} 15%|█▍ | 5027/34278 [5:37:42<29:49:20, 3.67s/it] 15%|█▍ | 5028/34278 [5:37:45<28:17:34, 3.48s/it] {'loss': 0.1555, 'grad_norm': 1.0577708390316358, 'learning_rate': 9.64729460233167e-06, 'epoch': 0.15} 15%|█▍ | 5028/34278 [5:37:45<28:17:34, 3.48s/it] 15%|█▍ | 5029/34278 [5:37:48<27:09:18, 3.34s/it] {'loss': 0.1695, 'grad_norm': 1.000180504334239, 'learning_rate': 9.647120288521552e-06, 'epoch': 0.15} 15%|█▍ | 5029/34278 [5:37:48<27:09:18, 3.34s/it] 15%|█▍ | 5030/34278 [5:37:51<26:24:31, 3.25s/it] {'loss': 0.2048, 'grad_norm': 0.862740318043579, 'learning_rate': 9.646945933223058e-06, 'epoch': 0.15} 15%|█▍ | 5030/34278 [5:37:51<26:24:31, 3.25s/it] 15%|█▍ | 5031/34278 [5:37:54<25:58:21, 3.20s/it] {'loss': 0.1508, 'grad_norm': 0.9669313044036966, 'learning_rate': 9.646771536437737e-06, 'epoch': 0.15} 15%|█▍ | 5031/34278 [5:37:54<25:58:21, 3.20s/it] 15%|█▍ | 5032/34278 [5:37:57<25:17:38, 3.11s/it] {'loss': 0.1808, 'grad_norm': 0.9811796630238051, 'learning_rate': 9.64659709816715e-06, 'epoch': 0.15} 15%|█▍ | 5032/34278 [5:37:57<25:17:38, 3.11s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8976 > 8192). 
Running this sequence through the model will result in indexing errors 15%|█▍ | 5033/34278 [5:38:01<27:05:08, 3.33s/it] {'loss': 0.1899, 'grad_norm': 0.7942673025650904, 'learning_rate': 9.646422618412853e-06, 'epoch': 0.15} 15%|█▍ | 5033/34278 [5:38:01<27:05:08, 3.33s/it] 15%|█▍ | 5034/34278 [5:38:03<25:41:07, 3.16s/it] {'loss': 0.1431, 'grad_norm': 0.8135131995389027, 'learning_rate': 9.646248097176404e-06, 'epoch': 0.15} 15%|█▍ | 5034/34278 [5:38:03<25:41:07, 3.16s/it] 15%|█▍ | 5035/34278 [5:38:07<26:01:56, 3.20s/it] {'loss': 0.1833, 'grad_norm': 0.7968946027923035, 'learning_rate': 9.646073534459362e-06, 'epoch': 0.15} 15%|█▍ | 5035/34278 [5:38:07<26:01:56, 3.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 5036/34278 [5:38:11<28:03:34, 3.45s/it] {'loss': 0.1513, 'grad_norm': 0.7283403021107399, 'learning_rate': 9.645898930263284e-06, 'epoch': 0.15} 15%|█▍ | 5036/34278 [5:38:11<28:03:34, 3.45s/it] 15%|█▍ | 5037/34278 [5:38:15<31:23:03, 3.86s/it] {'loss': 0.1943, 'grad_norm': 0.876018090340404, 'learning_rate': 9.64572428458973e-06, 'epoch': 0.15} 15%|█▍ | 5037/34278 [5:38:16<31:23:03, 3.86s/it] 15%|█▍ | 5038/34278 [5:38:19<29:24:49, 3.62s/it] {'loss': 0.193, 'grad_norm': 0.942450992235393, 'learning_rate': 9.645549597440258e-06, 'epoch': 0.15} 15%|█▍ | 5038/34278 [5:38:19<29:24:49, 3.62s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11231 > 8192). 
Running this sequence through the model will result in indexing errors 15%|█▍ | 5039/34278 [5:38:21<27:37:10, 3.40s/it] {'loss': 0.1722, 'grad_norm': 1.3212259219432179, 'learning_rate': 9.645374868816427e-06, 'epoch': 0.15} 15%|█▍ | 5039/34278 [5:38:21<27:37:10, 3.40s/it] 15%|█▍ | 5040/34278 [5:38:26<29:43:26, 3.66s/it] {'loss': 0.1932, 'grad_norm': 0.9321174895484561, 'learning_rate': 9.6452000987198e-06, 'epoch': 0.15} 15%|█▍ | 5040/34278 [5:38:26<29:43:26, 3.66s/it] 15%|█▍ | 5041/34278 [5:38:28<27:18:42, 3.36s/it] {'loss': 0.1897, 'grad_norm': 0.8246588295826806, 'learning_rate': 9.645025287151935e-06, 'epoch': 0.15} 15%|█▍ | 5041/34278 [5:38:28<27:18:42, 3.36s/it] 15%|█▍ | 5042/34278 [5:38:32<27:27:56, 3.38s/it] {'loss': 0.1841, 'grad_norm': 1.1071089887580734, 'learning_rate': 9.644850434114392e-06, 'epoch': 0.15} 15%|█▍ | 5042/34278 [5:38:32<27:27:56, 3.38s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12441 > 8192). Running this sequence through the model will result in indexing errors 15%|█▍ | 5043/34278 [5:38:38<33:55:00, 4.18s/it] {'loss': 0.1802, 'grad_norm': 0.7533652903409431, 'learning_rate': 9.644675539608735e-06, 'epoch': 0.15} 15%|█▍ | 5043/34278 [5:38:38<33:55:00, 4.18s/it] 15%|█▍ | 5044/34278 [5:38:41<31:44:23, 3.91s/it] {'loss': 0.192, 'grad_norm': 0.7793821199418741, 'learning_rate': 9.644500603636521e-06, 'epoch': 0.15} 15%|█▍ | 5044/34278 [5:38:41<31:44:23, 3.91s/it] 15%|█▍ | 5045/34278 [5:38:45<31:47:35, 3.92s/it] {'loss': 0.1674, 'grad_norm': 1.0249558726678674, 'learning_rate': 9.644325626199315e-06, 'epoch': 0.15} 15%|█▍ | 5045/34278 [5:38:45<31:47:35, 3.92s/it] 15%|█▍ | 5046/34278 [5:38:48<29:25:02, 3.62s/it] {'loss': 0.1565, 'grad_norm': 0.9035026720752898, 'learning_rate': 9.64415060729868e-06, 'epoch': 0.15} 15%|█▍ | 5046/34278 [5:38:48<29:25:02, 3.62s/it] 15%|█▍ | 5047/34278 [5:38:52<29:15:29, 3.60s/it] {'loss': 0.2052, 'grad_norm': 0.9098129543276077, 'learning_rate': 
9.643975546936177e-06, 'epoch': 0.15} 15%|█▍ | 5047/34278 [5:38:52<29:15:29, 3.60s/it] 15%|█▍ | 5048/34278 [5:38:55<28:41:54, 3.53s/it] {'loss': 0.2131, 'grad_norm': 1.145881702321572, 'learning_rate': 9.64380044511337e-06, 'epoch': 0.15} 15%|█▍ | 5048/34278 [5:38:55<28:41:54, 3.53s/it] 15%|█▍ | 5049/34278 [5:38:58<28:30:51, 3.51s/it] {'loss': 0.1987, 'grad_norm': 1.0054627880130236, 'learning_rate': 9.643625301831819e-06, 'epoch': 0.15} 15%|█▍ | 5049/34278 [5:38:58<28:30:51, 3.51s/it] 15%|█▍ | 5050/34278 [5:39:02<27:44:11, 3.42s/it] {'loss': 0.1869, 'grad_norm': 0.8709340744090021, 'learning_rate': 9.64345011709309e-06, 'epoch': 0.15} 15%|█▍ | 5050/34278 [5:39:02<27:44:11, 3.42s/it] 15%|█▍ | 5051/34278 [5:39:08<35:43:05, 4.40s/it] {'loss': 0.1622, 'grad_norm': 0.8013928944623968, 'learning_rate': 9.643274890898746e-06, 'epoch': 0.15} 15%|█▍ | 5051/34278 [5:39:08<35:43:05, 4.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▍ | 5052/34278 [5:39:11<32:47:23, 4.04s/it] {'loss': 0.167, 'grad_norm': 1.1178847635819615, 'learning_rate': 9.643099623250354e-06, 'epoch': 0.15} 15%|█▍ | 5052/34278 [5:39:11<32:47:23, 4.04s/it] 15%|█▍ | 5053/34278 [5:39:15<32:12:58, 3.97s/it] {'loss': 0.1725, 'grad_norm': 0.8633938310892648, 'learning_rate': 9.642924314149476e-06, 'epoch': 0.15} 15%|█▍ | 5053/34278 [5:39:15<32:12:58, 3.97s/it] 15%|█▍ | 5054/34278 [5:39:21<37:38:50, 4.64s/it] {'loss': 0.1859, 'grad_norm': 0.8510786116023561, 'learning_rate': 9.642748963597679e-06, 'epoch': 0.15} 15%|█▍ | 5054/34278 [5:39:21<37:38:50, 4.64s/it] 15%|█▍ | 5055/34278 [5:39:25<34:53:34, 4.30s/it] {'loss': 0.1587, 'grad_norm': 0.8381533064398632, 'learning_rate': 9.642573571596526e-06, 'epoch': 0.15} 15%|█▍ | 5055/34278 [5:39:25<34:53:34, 4.30s/it] 15%|█▍ | 5056/34278 [5:39:30<36:54:29, 4.55s/it] {'loss': 0.1804, 'grad_norm': 0.8260893979067963, 'learning_rate': 9.642398138147586e-06, 'epoch': 0.15} 15%|█▍ | 5056/34278 [5:39:30<36:54:29, 4.55s/it] 15%|█▍ | 5057/34278 [5:39:33<32:26:03, 4.00s/it] {'loss': 0.1834, 'grad_norm': 1.0002854630513247, 'learning_rate': 9.642222663252423e-06, 'epoch': 0.15} 15%|█▍ | 5057/34278 [5:39:33<32:26:03, 4.00s/it] 15%|█▍ | 5058/34278 [5:39:38<35:21:23, 4.36s/it] {'loss': 0.1655, 'grad_norm': 0.7813155893855992, 'learning_rate': 9.642047146912605e-06, 'epoch': 0.15} 15%|█▍ | 5058/34278 [5:39:38<35:21:23, 4.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▍ | 5059/34278 [5:39:41<32:39:41, 4.02s/it] {'loss': 0.1804, 'grad_norm': 1.0238358688806195, 'learning_rate': 9.641871589129696e-06, 'epoch': 0.15} 15%|█▍ | 5059/34278 [5:39:41<32:39:41, 4.02s/it] 15%|█▍ | 5060/34278 [5:39:46<33:49:17, 4.17s/it] {'loss': 0.1947, 'grad_norm': 1.0137480806042538, 'learning_rate': 9.641695989905268e-06, 'epoch': 0.15} 15%|█▍ | 5060/34278 [5:39:46<33:49:17, 4.17s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 5061/34278 [5:39:49<31:23:44, 3.87s/it] {'loss': 0.1635, 'grad_norm': 0.8568766580525737, 'learning_rate': 9.641520349240885e-06, 'epoch': 0.15} 15%|█▍ | 5061/34278 [5:39:49<31:23:44, 3.87s/it] 15%|█▍ | 5062/34278 [5:39:53<31:26:45, 3.87s/it] {'loss': 0.1587, 'grad_norm': 1.0123927327613185, 'learning_rate': 9.641344667138117e-06, 'epoch': 0.15} 15%|█▍ | 5062/34278 [5:39:53<31:26:45, 3.87s/it] 15%|█▍ | 5063/34278 [5:39:56<29:02:39, 3.58s/it] {'loss': 0.1828, 'grad_norm': 0.7770446009954005, 'learning_rate': 9.641168943598531e-06, 'epoch': 0.15} 15%|█▍ | 5063/34278 [5:39:56<29:02:39, 3.58s/it] 15%|█▍ | 5064/34278 [5:39:58<26:56:28, 3.32s/it] {'loss': 0.2229, 'grad_norm': 1.0421396794898743, 'learning_rate': 9.640993178623698e-06, 'epoch': 0.15} 15%|█▍ | 5064/34278 [5:39:58<26:56:28, 3.32s/it] 15%|█▍ | 5065/34278 [5:40:02<26:41:53, 3.29s/it] {'loss': 0.1589, 'grad_norm': 0.945971704493877, 'learning_rate': 9.640817372215184e-06, 'epoch': 0.15} 15%|█▍ | 5065/34278 [5:40:02<26:41:53, 3.29s/it] 15%|█▍ | 5066/34278 [5:40:05<27:02:05, 3.33s/it] {'loss': 0.167, 'grad_norm': 0.8100496290166211, 'learning_rate': 9.640641524374561e-06, 'epoch': 0.15} 15%|█▍ | 5066/34278 [5:40:05<27:02:05, 3.33s/it] 15%|█▍ | 5067/34278 [5:40:08<26:27:47, 3.26s/it] {'loss': 0.2038, 'grad_norm': 0.9022934117122732, 'learning_rate': 
9.6404656351034e-06, 'epoch': 0.15} 15%|█▍ | 5067/34278 [5:40:08<26:27:47, 3.26s/it] 15%|█▍ | 5068/34278 [5:40:11<26:25:21, 3.26s/it] {'loss': 0.1746, 'grad_norm': 0.940857312978776, 'learning_rate': 9.640289704403268e-06, 'epoch': 0.15} 15%|█▍ | 5068/34278 [5:40:11<26:25:21, 3.26s/it] 15%|█▍ | 5069/34278 [5:40:15<27:02:11, 3.33s/it] {'loss': 0.1631, 'grad_norm': 1.1533203706367532, 'learning_rate': 9.640113732275736e-06, 'epoch': 0.15} 15%|█▍ | 5069/34278 [5:40:15<27:02:11, 3.33s/it] 15%|█▍ | 5070/34278 [5:40:21<33:33:39, 4.14s/it] {'loss': 0.1671, 'grad_norm': 0.726215537350509, 'learning_rate': 9.639937718722379e-06, 'epoch': 0.15} 15%|█▍ | 5070/34278 [5:40:21<33:33:39, 4.14s/it] 15%|█▍ | 5071/34278 [5:40:24<31:15:44, 3.85s/it] {'loss': 0.1644, 'grad_norm': 0.94110480875373, 'learning_rate': 9.639761663744764e-06, 'epoch': 0.15} 15%|█▍ | 5071/34278 [5:40:24<31:15:44, 3.85s/it] 15%|█▍ | 5072/34278 [5:40:30<36:54:06, 4.55s/it] {'loss': 0.1642, 'grad_norm': 0.8060087313547423, 'learning_rate': 9.639585567344464e-06, 'epoch': 0.15} 15%|█▍ | 5072/34278 [5:40:30<36:54:06, 4.55s/it] 15%|█▍ | 5073/34278 [5:40:34<34:37:25, 4.27s/it] {'loss': 0.1669, 'grad_norm': 0.8795655083349191, 'learning_rate': 9.639409429523053e-06, 'epoch': 0.15} 15%|█▍ | 5073/34278 [5:40:34<34:37:25, 4.27s/it] 15%|█▍ | 5074/34278 [5:40:38<33:34:44, 4.14s/it] {'loss': 0.1813, 'grad_norm': 0.8783713990715803, 'learning_rate': 9.639233250282101e-06, 'epoch': 0.15} 15%|█▍ | 5074/34278 [5:40:38<33:34:44, 4.14s/it] 15%|█▍ | 5075/34278 [5:40:41<30:33:25, 3.77s/it] {'loss': 0.1919, 'grad_norm': 0.923042653322279, 'learning_rate': 9.639057029623183e-06, 'epoch': 0.15} 15%|█▍ | 5075/34278 [5:40:41<30:33:25, 3.77s/it] 15%|█▍ | 5076/34278 [5:40:44<30:23:27, 3.75s/it] {'loss': 0.1818, 'grad_norm': 0.8036545791712385, 'learning_rate': 9.63888076754787e-06, 'epoch': 0.15} 15%|█▍ | 5076/34278 [5:40:44<30:23:27, 3.75s/it] 15%|█▍ | 5077/34278 [5:40:47<28:00:51, 3.45s/it] {'loss': 0.1742, 'grad_norm': 
0.8408184672385143, 'learning_rate': 9.63870446405774e-06, 'epoch': 0.15} 15%|█▍ | 5077/34278 [5:40:47<28:00:51, 3.45s/it] 15%|█▍ | 5078/34278 [5:40:50<27:00:30, 3.33s/it] {'loss': 0.1864, 'grad_norm': 0.8302103753326288, 'learning_rate': 9.63852811915436e-06, 'epoch': 0.15} 15%|█▍ | 5078/34278 [5:40:50<27:00:30, 3.33s/it] 15%|█▍ | 5079/34278 [5:40:53<26:50:06, 3.31s/it] {'loss': 0.186, 'grad_norm': 0.987029581812354, 'learning_rate': 9.638351732839311e-06, 'epoch': 0.15} 15%|█▍ | 5079/34278 [5:40:53<26:50:06, 3.31s/it] 15%|█▍ | 5080/34278 [5:40:57<26:31:44, 3.27s/it] {'loss': 0.189, 'grad_norm': 0.7762550095069394, 'learning_rate': 9.638175305114163e-06, 'epoch': 0.15} 15%|█▍ | 5080/34278 [5:40:57<26:31:44, 3.27s/it] 15%|█▍ | 5081/34278 [5:41:00<25:46:55, 3.18s/it] {'loss': 0.1626, 'grad_norm': 0.948031984001899, 'learning_rate': 9.637998835980493e-06, 'epoch': 0.15} 15%|█▍ | 5081/34278 [5:41:00<25:46:55, 3.18s/it] 15%|█▍ | 5082/34278 [5:41:02<25:08:21, 3.10s/it] {'loss': 0.1575, 'grad_norm': 0.8432231161022153, 'learning_rate': 9.637822325439878e-06, 'epoch': 0.15} 15%|█▍ | 5082/34278 [5:41:02<25:08:21, 3.10s/it] 15%|█▍ | 5083/34278 [5:41:06<25:09:45, 3.10s/it] {'loss': 0.1795, 'grad_norm': 0.8277721859028909, 'learning_rate': 9.637645773493893e-06, 'epoch': 0.15} 15%|█▍ | 5083/34278 [5:41:06<25:09:45, 3.10s/it] 15%|█▍ | 5084/34278 [5:41:10<29:07:25, 3.59s/it] {'loss': 0.1863, 'grad_norm': 0.8421227542506831, 'learning_rate': 9.637469180144112e-06, 'epoch': 0.15} 15%|█▍ | 5084/34278 [5:41:10<29:07:25, 3.59s/it] 15%|█▍ | 5085/34278 [5:41:13<27:19:54, 3.37s/it] {'loss': 0.1518, 'grad_norm': 0.9640066115636375, 'learning_rate': 9.637292545392114e-06, 'epoch': 0.15} 15%|█▍ | 5085/34278 [5:41:13<27:19:54, 3.37s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10292 > 8192). 
Running this sequence through the model will result in indexing errors 15%|█▍ | 5086/34278 [5:41:17<28:34:06, 3.52s/it] {'loss': 0.1631, 'grad_norm': 0.7376568881964116, 'learning_rate': 9.637115869239475e-06, 'epoch': 0.15} 15%|█▍ | 5086/34278 [5:41:17<28:34:06, 3.52s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 5087/34278 [5:41:20<27:12:03, 3.35s/it] {'loss': 0.1839, 'grad_norm': 0.894572818155894, 'learning_rate': 9.636939151687772e-06, 'epoch': 0.15} 15%|█▍ | 5087/34278 [5:41:20<27:12:03, 3.35s/it] 15%|█▍ | 5088/34278 [5:41:26<34:31:49, 4.26s/it] {'loss': 0.1718, 'grad_norm': 0.8624785688237763, 'learning_rate': 9.636762392738583e-06, 'epoch': 0.15} 15%|█▍ | 5088/34278 [5:41:26<34:31:49, 4.26s/it] 15%|█▍ | 5089/34278 [5:41:30<32:42:35, 4.03s/it] {'loss': 0.1636, 'grad_norm': 0.8963683902542617, 'learning_rate': 9.636585592393489e-06, 'epoch': 0.15} 15%|█▍ | 5089/34278 [5:41:30<32:42:35, 4.03s/it] 15%|█▍ | 5090/34278 [5:41:33<30:30:52, 3.76s/it] {'loss': 0.1761, 'grad_norm': 0.7614672069464431, 'learning_rate': 9.636408750654062e-06, 'epoch': 0.15} 15%|█▍ | 5090/34278 [5:41:33<30:30:52, 3.76s/it] 15%|█▍ | 5091/34278 [5:41:36<28:38:41, 3.53s/it] {'loss': 0.1641, 'grad_norm': 0.8099578606723454, 'learning_rate': 9.636231867521886e-06, 'epoch': 0.15} 15%|█▍ | 5091/34278 [5:41:36<28:38:41, 3.53s/it] 15%|█▍ | 5092/34278 [5:41:39<27:23:32, 3.38s/it] {'loss': 0.1641, 'grad_norm': 0.7303341978527088, 'learning_rate': 9.636054942998538e-06, 'epoch': 0.15} 15%|█▍ | 5092/34278 [5:41:39<27:23:32, 3.38s/it] 15%|█▍ | 5093/34278 [5:41:43<29:28:18, 3.64s/it] {'loss': 0.1661, 'grad_norm': 0.9262007731517238, 'learning_rate': 9.635877977085599e-06, 'epoch': 0.15} 15%|█▍ | 5093/34278 [5:41:43<29:28:18, 3.64s/it] 15%|█▍ | 5094/34278 [5:41:47<29:42:15, 3.66s/it] {'loss': 0.1975, 'grad_norm': 
0.8515555470352184, 'learning_rate': 9.635700969784648e-06, 'epoch': 0.15} 15%|█▍ | 5094/34278 [5:41:47<29:42:15, 3.66s/it] 15%|█▍ | 5095/34278 [5:41:50<28:09:49, 3.47s/it] {'loss': 0.1685, 'grad_norm': 0.7646387338569562, 'learning_rate': 9.635523921097265e-06, 'epoch': 0.15} 15%|█▍ | 5095/34278 [5:41:50<28:09:49, 3.47s/it] 15%|█▍ | 5096/34278 [5:41:53<27:34:27, 3.40s/it] {'loss': 0.1685, 'grad_norm': 0.9120152936496411, 'learning_rate': 9.635346831025032e-06, 'epoch': 0.15} 15%|█▍ | 5096/34278 [5:41:53<27:34:27, 3.40s/it] 15%|█▍ | 5097/34278 [5:41:56<26:05:50, 3.22s/it] {'loss': 0.1688, 'grad_norm': 0.8963548524790588, 'learning_rate': 9.635169699569528e-06, 'epoch': 0.15} 15%|█▍ | 5097/34278 [5:41:56<26:05:50, 3.22s/it] 15%|█▍ | 5098/34278 [5:41:59<26:03:24, 3.21s/it] {'loss': 0.1815, 'grad_norm': 0.8247082107867015, 'learning_rate': 9.634992526732336e-06, 'epoch': 0.15} 15%|█▍ | 5098/34278 [5:41:59<26:03:24, 3.21s/it] 15%|█▍ | 5099/34278 [5:42:02<24:51:54, 3.07s/it] {'loss': 0.1901, 'grad_norm': 0.894825694058036, 'learning_rate': 9.634815312515038e-06, 'epoch': 0.15} 15%|█▍ | 5099/34278 [5:42:02<24:51:54, 3.07s/it] 15%|█▍ | 5100/34278 [5:42:05<24:19:29, 3.00s/it] {'loss': 0.1655, 'grad_norm': 0.84077347909432, 'learning_rate': 9.634638056919213e-06, 'epoch': 0.15} 15%|█▍ | 5100/34278 [5:42:05<24:19:29, 3.00s/it] 15%|█▍ | 5101/34278 [5:42:09<26:23:13, 3.26s/it] {'loss': 0.1801, 'grad_norm': 0.781795925008661, 'learning_rate': 9.634460759946449e-06, 'epoch': 0.15} 15%|█▍ | 5101/34278 [5:42:09<26:23:13, 3.26s/it] 15%|█▍ | 5102/34278 [5:42:12<27:20:31, 3.37s/it] {'loss': 0.1749, 'grad_norm': 0.9837437794935363, 'learning_rate': 9.634283421598322e-06, 'epoch': 0.15} 15%|█▍ | 5102/34278 [5:42:12<27:20:31, 3.37s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9076 > 8192). 
Running this sequence through the model will result in indexing errors 15%|█▍ | 5103/34278 [5:42:15<26:29:14, 3.27s/it] {'loss': 0.156, 'grad_norm': 0.8140802808114325, 'learning_rate': 9.63410604187642e-06, 'epoch': 0.15} 15%|█▍ | 5103/34278 [5:42:15<26:29:14, 3.27s/it] 15%|█▍ | 5104/34278 [5:42:20<29:17:09, 3.61s/it] {'loss': 0.1786, 'grad_norm': 0.9795870482107814, 'learning_rate': 9.633928620782327e-06, 'epoch': 0.15} 15%|█▍ | 5104/34278 [5:42:20<29:17:09, 3.61s/it] 15%|█▍ | 5105/34278 [5:42:23<27:24:18, 3.38s/it] {'loss': 0.1864, 'grad_norm': 0.8467942217568136, 'learning_rate': 9.633751158317624e-06, 'epoch': 0.15} 15%|█▍ | 5105/34278 [5:42:23<27:24:18, 3.38s/it] 15%|█▍ | 5106/34278 [5:42:26<27:20:01, 3.37s/it] {'loss': 0.179, 'grad_norm': 0.9165020355845057, 'learning_rate': 9.633573654483898e-06, 'epoch': 0.15} 15%|█▍ | 5106/34278 [5:42:26<27:20:01, 3.37s/it] 15%|█▍ | 5107/34278 [5:42:29<26:20:08, 3.25s/it] {'loss': 0.1839, 'grad_norm': 0.9626048326471355, 'learning_rate': 9.633396109282733e-06, 'epoch': 0.15} 15%|█▍ | 5107/34278 [5:42:29<26:20:08, 3.25s/it] 15%|█▍ | 5108/34278 [5:42:33<28:40:21, 3.54s/it] {'loss': 0.1642, 'grad_norm': 0.7993374003325108, 'learning_rate': 9.633218522715713e-06, 'epoch': 0.15} 15%|█▍ | 5108/34278 [5:42:33<28:40:21, 3.54s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▍ | 5109/34278 [5:42:36<26:57:53, 3.33s/it] {'loss': 0.1833, 'grad_norm': 0.8175438779764319, 'learning_rate': 9.633040894784423e-06, 'epoch': 0.15} 15%|█▍ | 5109/34278 [5:42:36<26:57:53, 3.33s/it] 15%|█▍ | 5110/34278 [5:42:41<31:25:22, 3.88s/it] {'loss': 0.1653, 'grad_norm': 0.8280294357075404, 'learning_rate': 9.63286322549045e-06, 'epoch': 0.15} 15%|█▍ | 5110/34278 [5:42:41<31:25:22, 3.88s/it] 15%|█▍ | 5111/34278 [5:42:47<36:22:56, 4.49s/it] {'loss': 0.2017, 'grad_norm': 0.908338783381698, 'learning_rate': 9.632685514835381e-06, 'epoch': 0.15} 15%|█▍ | 5111/34278 [5:42:47<36:22:56, 4.49s/it] 15%|█▍ | 5112/34278 [5:42:50<32:11:37, 3.97s/it] {'loss': 0.1945, 'grad_norm': 1.0516092502100833, 'learning_rate': 9.632507762820802e-06, 'epoch': 0.15} 15%|█▍ | 5112/34278 [5:42:50<32:11:37, 3.97s/it] 15%|█▍ | 5113/34278 [5:42:53<29:38:48, 3.66s/it] {'loss': 0.214, 'grad_norm': 0.8242778378158175, 'learning_rate': 9.632329969448297e-06, 'epoch': 0.15} 15%|█▍ | 5113/34278 [5:42:53<29:38:48, 3.66s/it] 15%|█▍ | 5114/34278 [5:42:56<28:14:01, 3.49s/it] {'loss': 0.188, 'grad_norm': 0.9240705779461624, 'learning_rate': 9.63215213471946e-06, 'epoch': 0.15} 15%|█▍ | 5114/34278 [5:42:56<28:14:01, 3.49s/it] 15%|█▍ | 5115/34278 [5:42:59<27:02:22, 3.34s/it] {'loss': 0.176, 'grad_norm': 0.9747253445918209, 'learning_rate': 9.631974258635872e-06, 'epoch': 0.15} 15%|█▍ | 5115/34278 [5:42:59<27:02:22, 3.34s/it] 15%|█▍ | 5116/34278 [5:43:04<32:04:10, 3.96s/it] {'loss': 0.1661, 'grad_norm': 0.7211727458179493, 'learning_rate': 9.631796341199122e-06, 'epoch': 0.15} 15%|█▍ | 5116/34278 [5:43:04<32:04:10, 3.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▍ | 5117/34278 [5:43:08<31:06:22, 3.84s/it] {'loss': 0.1683, 'grad_norm': 0.8473016855698315, 'learning_rate': 9.631618382410804e-06, 'epoch': 0.15} 15%|█▍ | 5117/34278 [5:43:08<31:06:22, 3.84s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 5118/34278 [5:43:14<36:11:17, 4.47s/it] {'loss': 0.178, 'grad_norm': 0.970757502447642, 'learning_rate': 9.631440382272498e-06, 'epoch': 0.15} 15%|█▍ | 5118/34278 [5:43:14<36:11:17, 4.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 5119/34278 [5:43:17<32:15:45, 3.98s/it] {'loss': 0.1492, 'grad_norm': 0.6904814083838131, 'learning_rate': 9.631262340785802e-06, 'epoch': 0.15} 15%|█▍ | 5119/34278 [5:43:17<32:15:45, 3.98s/it] 15%|█▍ | 5120/34278 [5:43:19<29:37:45, 3.66s/it] {'loss': 0.1924, 'grad_norm': 0.7421442896285018, 'learning_rate': 9.6310842579523e-06, 'epoch': 0.15} 15%|█▍ | 5120/34278 [5:43:19<29:37:45, 3.66s/it] 15%|█▍ | 5121/34278 [5:43:23<28:41:38, 3.54s/it] {'loss': 0.1639, 'grad_norm': 1.0651467930117666, 'learning_rate': 9.630906133773583e-06, 'epoch': 0.15} 15%|█▍ | 5121/34278 [5:43:23<28:41:38, 3.54s/it] 15%|█▍ | 5122/34278 [5:43:27<31:16:01, 3.86s/it] {'loss': 0.1725, 'grad_norm': 0.8390889273605537, 'learning_rate': 9.63072796825124e-06, 'epoch': 0.15} 15%|█▍ | 5122/34278 [5:43:27<31:16:01, 3.86s/it] 15%|█▍ | 5123/34278 [5:43:31<30:59:46, 3.83s/it] {'loss': 0.1823, 'grad_norm': 0.9309394813860895, 'learning_rate': 9.630549761386865e-06, 'epoch': 0.15} 15%|█▍ | 5123/34278 [5:43:31<30:59:46, 3.83s/it] 15%|█▍ | 5124/34278 [5:43:36<34:17:49, 4.24s/it] {'loss': 0.1674, 'grad_norm': 0.8677477915122752, 
'learning_rate': 9.630371513182047e-06, 'epoch': 0.15} 15%|█▍ | 5124/34278 [5:43:36<34:17:49, 4.24s/it] 15%|█▍ | 5125/34278 [5:43:42<37:12:26, 4.59s/it] {'loss': 0.1645, 'grad_norm': 0.7952579105541172, 'learning_rate': 9.630193223638378e-06, 'epoch': 0.15} 15%|█▍ | 5125/34278 [5:43:42<37:12:26, 4.59s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▍ | 5126/34278 [5:43:45<34:22:46, 4.25s/it] {'loss': 0.168, 'grad_norm': 0.805111346894642, 'learning_rate': 9.630014892757449e-06, 'epoch': 0.15} 15%|█▍ | 5126/34278 [5:43:45<34:22:46, 4.25s/it] 15%|█▍ | 5127/34278 [5:43:49<32:14:28, 3.98s/it] {'loss': 0.1874, 'grad_norm': 0.9688313650719023, 'learning_rate': 9.629836520540851e-06, 'epoch': 0.15} 15%|█▍ | 5127/34278 [5:43:49<32:14:28, 3.98s/it] 15%|█▍ | 5128/34278 [5:43:52<30:22:30, 3.75s/it] {'loss': 0.1825, 'grad_norm': 1.100268308538829, 'learning_rate': 9.629658106990179e-06, 'epoch': 0.15} 15%|█▍ | 5128/34278 [5:43:52<30:22:30, 3.75s/it] 15%|█▍ | 5129/34278 [5:43:55<28:41:23, 3.54s/it] {'loss': 0.1855, 'grad_norm': 0.8652801324456252, 'learning_rate': 9.629479652107024e-06, 'epoch': 0.15} 15%|█▍ | 5129/34278 [5:43:55<28:41:23, 3.54s/it] 15%|█▍ | 5130/34278 [5:43:58<28:06:08, 3.47s/it] {'loss': 0.1559, 'grad_norm': 0.8891167955846057, 'learning_rate': 9.62930115589298e-06, 'epoch': 0.15} 15%|█▍ | 5130/34278 [5:43:58<28:06:08, 3.47s/it] 15%|█▍ | 5131/34278 [5:44:04<34:14:08, 4.23s/it] {'loss': 0.198, 'grad_norm': 0.9620161577897258, 'learning_rate': 9.62912261834964e-06, 'epoch': 0.15} 15%|█▍ | 5131/34278 [5:44:04<34:14:08, 4.23s/it] 15%|█▍ | 5132/34278 [5:44:07<31:06:17, 3.84s/it] {'loss': 0.1839, 'grad_norm': 0.9557939480904629, 'learning_rate': 9.628944039478599e-06, 'epoch': 0.15} 15%|█▍ | 5132/34278 [5:44:07<31:06:17, 3.84s/it] 15%|█▍ | 5133/34278 [5:44:10<29:38:41, 3.66s/it] 
{'loss': 0.1862, 'grad_norm': 0.7887926047280494, 'learning_rate': 9.628765419281452e-06, 'epoch': 0.15} 15%|█▍ | 5133/34278 [5:44:10<29:38:41, 3.66s/it] 15%|█▍ | 5134/34278 [5:44:13<28:19:32, 3.50s/it] {'loss': 0.1729, 'grad_norm': 0.9482297956073317, 'learning_rate': 9.62858675775979e-06, 'epoch': 0.15} 15%|█▍ | 5134/34278 [5:44:13<28:19:32, 3.50s/it] 15%|█▍ | 5135/34278 [5:44:17<29:39:10, 3.66s/it] {'loss': 0.1659, 'grad_norm': 0.8407297247504261, 'learning_rate': 9.62840805491521e-06, 'epoch': 0.15} 15%|█▍ | 5135/34278 [5:44:17<29:39:10, 3.66s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10293 > 8192). Running this sequence through the model will result in indexing errors 15%|█▍ | 5136/34278 [5:44:21<28:38:11, 3.54s/it] {'loss': 0.1801, 'grad_norm': 0.7981482556755108, 'learning_rate': 9.62822931074931e-06, 'epoch': 0.15} 15%|█▍ | 5136/34278 [5:44:21<28:38:11, 3.54s/it] 15%|█▍ | 5137/34278 [5:44:27<34:28:47, 4.26s/it] {'loss': 0.2001, 'grad_norm': 0.9121878201813708, 'learning_rate': 9.62805052526368e-06, 'epoch': 0.15} 15%|█▍ | 5137/34278 [5:44:27<34:28:47, 4.26s/it] 15%|█▍ | 5138/34278 [5:44:30<32:02:36, 3.96s/it] {'loss': 0.1788, 'grad_norm': 1.113022027779779, 'learning_rate': 9.627871698459925e-06, 'epoch': 0.15} 15%|█▍ | 5138/34278 [5:44:30<32:02:36, 3.96s/it] 15%|█▍ | 5139/34278 [5:44:33<29:28:12, 3.64s/it] {'loss': 0.1934, 'grad_norm': 0.897779033375477, 'learning_rate': 9.627692830339633e-06, 'epoch': 0.15} 15%|█▍ | 5139/34278 [5:44:33<29:28:12, 3.64s/it] 15%|█▍ | 5140/34278 [5:44:36<28:14:46, 3.49s/it] {'loss': 0.1609, 'grad_norm': 0.7520491511565212, 'learning_rate': 9.627513920904403e-06, 'epoch': 0.15} 15%|█▍ | 5140/34278 [5:44:36<28:14:46, 3.49s/it] 15%|█▍ | 5141/34278 [5:44:39<27:26:52, 3.39s/it] {'loss': 0.1887, 'grad_norm': 0.8198308417034144, 'learning_rate': 9.627334970155837e-06, 'epoch': 0.15} 15%|█▍ | 5141/34278 [5:44:39<27:26:52, 3.39s/it] 15%|█▌ | 5142/34278 [5:44:42<27:00:40, 
3.34s/it] {'loss': 0.1605, 'grad_norm': 0.7567235699570818, 'learning_rate': 9.627155978095526e-06, 'epoch': 0.15} 15%|█▌ | 5142/34278 [5:44:42<27:00:40, 3.34s/it] 15%|█▌ | 5143/34278 [5:44:48<33:25:34, 4.13s/it] {'loss': 0.155, 'grad_norm': 0.8121192608338688, 'learning_rate': 9.626976944725071e-06, 'epoch': 0.15} 15%|█▌ | 5143/34278 [5:44:48<33:25:34, 4.13s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▌ | 5144/34278 [5:44:51<30:42:05, 3.79s/it] {'loss': 0.1668, 'grad_norm': 0.8404794822023604, 'learning_rate': 9.626797870046071e-06, 'epoch': 0.15} 15%|█▌ | 5144/34278 [5:44:51<30:42:05, 3.79s/it] 15%|█▌ | 5145/34278 [5:44:54<29:09:34, 3.60s/it] {'loss': 0.1675, 'grad_norm': 0.8568013721923577, 'learning_rate': 9.626618754060127e-06, 'epoch': 0.15} 15%|█▌ | 5145/34278 [5:44:54<29:09:34, 3.60s/it] 15%|█▌ | 5146/34278 [5:44:57<27:48:21, 3.44s/it] {'loss': 0.2013, 'grad_norm': 0.7932146477101287, 'learning_rate': 9.626439596768831e-06, 'epoch': 0.15} 15%|█▌ | 5146/34278 [5:44:57<27:48:21, 3.44s/it] 15%|█▌ | 5147/34278 [5:45:03<32:27:53, 4.01s/it] {'loss': 0.1901, 'grad_norm': 0.9595123753157163, 'learning_rate': 9.626260398173788e-06, 'epoch': 0.15} 15%|█▌ | 5147/34278 [5:45:03<32:27:53, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▌ | 5148/34278 [5:45:06<30:55:22, 3.82s/it] {'loss': 0.1652, 'grad_norm': 0.8542659439878874, 'learning_rate': 9.626081158276597e-06, 'epoch': 0.15} 15%|█▌ | 5148/34278 [5:45:06<30:55:22, 3.82s/it] 15%|█▌ | 5149/34278 [5:45:09<28:54:41, 3.57s/it] {'loss': 0.1802, 'grad_norm': 0.9191254664150484, 'learning_rate': 9.625901877078857e-06, 'epoch': 0.15} 15%|█▌ | 5149/34278 [5:45:09<28:54:41, 3.57s/it] 15%|█▌ | 5150/34278 [5:45:13<28:14:08, 3.49s/it] {'loss': 0.1558, 'grad_norm': 0.8044557749570149, 'learning_rate': 9.625722554582171e-06, 'epoch': 0.15} 15%|█▌ | 5150/34278 [5:45:13<28:14:08, 3.49s/it] 15%|█▌ | 5151/34278 [5:45:19<34:25:44, 4.26s/it] {'loss': 0.166, 'grad_norm': 1.1638830160852909, 'learning_rate': 9.625543190788138e-06, 'epoch': 0.15} 15%|█▌ | 5151/34278 [5:45:19<34:25:44, 4.26s/it] 15%|█▌ | 5152/34278 [5:45:21<30:57:49, 3.83s/it] {'loss': 0.1691, 'grad_norm': 0.7792950629208888, 'learning_rate': 9.625363785698358e-06, 'epoch': 0.15} 15%|█▌ | 5152/34278 [5:45:21<30:57:49, 3.83s/it] 15%|█▌ | 5153/34278 [5:45:25<31:06:48, 3.85s/it] {'loss': 0.1545, 'grad_norm': 0.9530255030838118, 'learning_rate': 9.625184339314435e-06, 'epoch': 0.15} 15%|█▌ | 5153/34278 [5:45:25<31:06:48, 3.85s/it] 15%|█▌ | 5154/34278 [5:45:29<30:47:36, 3.81s/it] {'loss': 0.1808, 'grad_norm': 0.7949708710489812, 'learning_rate': 9.625004851637972e-06, 'epoch': 0.15} 15%|█▌ | 5154/34278 [5:45:29<30:47:36, 3.81s/it] 15%|█▌ | 5155/34278 [5:45:33<31:22:06, 3.88s/it] {'loss': 0.2024, 'grad_norm': 0.884317595587077, 'learning_rate': 9.624825322670567e-06, 'epoch': 0.15} 15%|█▌ | 5155/34278 [5:45:33<31:22:06, 3.88s/it] 15%|█▌ | 5156/34278 [5:45:36<29:14:43, 3.62s/it] {'loss': 0.1817, 'grad_norm': 1.021448310381221, 'learning_rate': 9.624645752413827e-06, 'epoch': 0.15} 15%|█▌ | 5156/34278 [5:45:36<29:14:43, 3.62s/it] 15%|█▌ | 5157/34278 [5:45:42<34:39:49, 4.29s/it] {'loss': 0.166, 'grad_norm': 1.011998252678421, 'learning_rate': 9.624466140869353e-06, 
'epoch': 0.15} 15%|█▌ | 5157/34278 [5:45:42<34:39:49, 4.29s/it] 15%|█▌ | 5158/34278 [5:45:46<33:29:05, 4.14s/it] {'loss': 0.1771, 'grad_norm': 0.8965535961246643, 'learning_rate': 9.62428648803875e-06, 'epoch': 0.15} 15%|█▌ | 5158/34278 [5:45:46<33:29:05, 4.14s/it] 15%|█▌ | 5159/34278 [5:45:50<32:56:04, 4.07s/it] {'loss': 0.1823, 'grad_norm': 0.8663976474058169, 'learning_rate': 9.624106793923622e-06, 'epoch': 0.15} 15%|█▌ | 5159/34278 [5:45:50<32:56:04, 4.07s/it] 15%|█▌ | 5160/34278 [5:45:53<30:59:00, 3.83s/it] {'loss': 0.1517, 'grad_norm': 0.8580090319773263, 'learning_rate': 9.62392705852557e-06, 'epoch': 0.15} 15%|█▌ | 5160/34278 [5:45:53<30:59:00, 3.83s/it] 15%|█▌ | 5161/34278 [5:45:59<36:15:17, 4.48s/it] {'loss': 0.1695, 'grad_norm': 0.8654448618053974, 'learning_rate': 9.623747281846203e-06, 'epoch': 0.15} 15%|█▌ | 5161/34278 [5:45:59<36:15:17, 4.48s/it] 15%|█▌ | 5162/34278 [5:46:05<40:50:45, 5.05s/it] {'loss': 0.1712, 'grad_norm': 0.7153421101554999, 'learning_rate': 9.623567463887123e-06, 'epoch': 0.15} 15%|█▌ | 5162/34278 [5:46:05<40:50:45, 5.05s/it] 15%|█▌ | 5163/34278 [5:46:10<39:53:20, 4.93s/it] {'loss': 0.1591, 'grad_norm': 0.7000303542349987, 'learning_rate': 9.623387604649937e-06, 'epoch': 0.15} 15%|█▌ | 5163/34278 [5:46:10<39:53:20, 4.93s/it] 15%|█▌ | 5164/34278 [5:46:13<34:41:44, 4.29s/it] {'loss': 0.1674, 'grad_norm': 1.0505270905577375, 'learning_rate': 9.62320770413625e-06, 'epoch': 0.15} 15%|█▌ | 5164/34278 [5:46:13<34:41:44, 4.29s/it] 15%|█▌ | 5165/34278 [5:46:17<33:40:14, 4.16s/it] {'loss': 0.1835, 'grad_norm': 0.7705851421827375, 'learning_rate': 9.623027762347669e-06, 'epoch': 0.15} 15%|█▌ | 5165/34278 [5:46:17<33:40:14, 4.16s/it] 15%|█▌ | 5166/34278 [5:46:20<31:24:08, 3.88s/it] {'loss': 0.1675, 'grad_norm': 1.0398084532315381, 'learning_rate': 9.622847779285798e-06, 'epoch': 0.15} 15%|█▌ | 5166/34278 [5:46:20<31:24:08, 3.88s/it] 15%|█▌ | 5167/34278 [5:46:25<33:27:16, 4.14s/it] {'loss': 0.1654, 'grad_norm': 0.8384019112904402, 
'learning_rate': 9.622667754952246e-06, 'epoch': 0.15} 15%|█▌ | 5167/34278 [5:46:25<33:27:16, 4.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 15%|█▌ | 5168/34278 [5:46:31<38:01:43, 4.70s/it] {'loss': 0.1583, 'grad_norm': 0.9030474618235137, 'learning_rate': 9.62248768934862e-06, 'epoch': 0.15} 15%|█▌ | 5168/34278 [5:46:31<38:01:43, 4.70s/it] 15%|█▌ | 5169/34278 [5:46:35<36:59:42, 4.58s/it] {'loss': 0.1434, 'grad_norm': 0.8620656330514548, 'learning_rate': 9.62230758247653e-06, 'epoch': 0.15} 15%|█▌ | 5169/34278 [5:46:35<36:59:42, 4.58s/it] 15%|█▌ | 5170/34278 [5:46:38<33:08:46, 4.10s/it] {'loss': 0.1841, 'grad_norm': 0.9205909108268115, 'learning_rate': 9.622127434337578e-06, 'epoch': 0.15} 15%|█▌ | 5170/34278 [5:46:38<33:08:46, 4.10s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9863 > 8192). Running this sequence through the model will result in indexing errors 15%|█▌ | 5171/34278 [5:46:43<36:29:03, 4.51s/it] {'loss': 0.1992, 'grad_norm': 1.0488539207987573, 'learning_rate': 9.621947244933377e-06, 'epoch': 0.15} 15%|█▌ | 5171/34278 [5:46:43<36:29:03, 4.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (13629 > 8192). 
Running this sequence through the model will result in indexing errors 15%|█▌ | 5172/34278 [5:46:49<40:12:10, 4.97s/it] {'loss': 0.1772, 'grad_norm': 0.7687755427598137, 'learning_rate': 9.621767014265534e-06, 'epoch': 0.15} 15%|█▌ | 5172/34278 [5:46:49<40:12:10, 4.97s/it] 15%|█▌ | 5173/34278 [5:46:53<36:07:15, 4.47s/it] {'loss': 0.1617, 'grad_norm': 0.9379189448252973, 'learning_rate': 9.621586742335658e-06, 'epoch': 0.15} 15%|█▌ | 5173/34278 [5:46:53<36:07:15, 4.47s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8346 > 8192). Running this sequence through the model will result in indexing errors 15%|█▌ | 5174/34278 [5:46:56<32:57:16, 4.08s/it] {'loss': 0.1783, 'grad_norm': 0.9344745331471433, 'learning_rate': 9.62140642914536e-06, 'epoch': 0.15} 15%|█▌ | 5174/34278 [5:46:56<32:57:16, 4.08s/it] 15%|█▌ | 5175/34278 [5:46:59<30:59:29, 3.83s/it] {'loss': 0.1823, 'grad_norm': 0.9544197104113092, 'learning_rate': 9.621226074696249e-06, 'epoch': 0.15} 15%|█▌ | 5175/34278 [5:46:59<30:59:29, 3.83s/it] 15%|█▌ | 5176/34278 [5:47:03<30:42:49, 3.80s/it] {'loss': 0.1848, 'grad_norm': 0.8035634563611824, 'learning_rate': 9.621045678989933e-06, 'epoch': 0.15} 15%|█▌ | 5176/34278 [5:47:03<30:42:49, 3.80s/it] 15%|█▌ | 5177/34278 [5:47:08<35:05:58, 4.34s/it] {'loss': 0.1656, 'grad_norm': 1.2575767680881196, 'learning_rate': 9.620865242028025e-06, 'epoch': 0.15} 15%|█▌ | 5177/34278 [5:47:08<35:05:58, 4.34s/it] 15%|█▌ | 5178/34278 [5:47:12<34:11:10, 4.23s/it] {'loss': 0.1561, 'grad_norm': 0.9709566370302833, 'learning_rate': 9.620684763812135e-06, 'epoch': 0.15} 15%|█▌ | 5178/34278 [5:47:12<34:11:10, 4.23s/it] 15%|█▌ | 5179/34278 [5:47:18<38:23:50, 4.75s/it] {'loss': 0.1895, 'grad_norm': 0.8131796770604164, 'learning_rate': 9.620504244343875e-06, 'epoch': 0.15} 15%|█▌ | 5179/34278 [5:47:18<38:23:50, 4.75s/it] 15%|█▌ | 5180/34278 [5:47:21<34:01:42, 4.21s/it] {'loss': 0.1597, 'grad_norm': 0.9585894437782273, 'learning_rate': 
9.620323683624855e-06, 'epoch': 0.15} 15%|█▌ | 5180/34278 [5:47:21<34:01:42, 4.21s/it] 15%|█▌ | 5181/34278 [5:47:27<37:03:04, 4.58s/it] {'loss': 0.1802, 'grad_norm': 0.7724375926022282, 'learning_rate': 9.62014308165669e-06, 'epoch': 0.15} 15%|█▌ | 5181/34278 [5:47:27<37:03:04, 4.58s/it] 15%|█▌ | 5182/34278 [5:47:30<33:06:43, 4.10s/it] {'loss': 0.1554, 'grad_norm': 0.7903951900843166, 'learning_rate': 9.619962438440988e-06, 'epoch': 0.15} 15%|█▌ | 5182/34278 [5:47:30<33:06:43, 4.10s/it] 15%|█▌ | 5183/34278 [5:47:33<31:37:47, 3.91s/it] {'loss': 0.1607, 'grad_norm': 0.8380458263545416, 'learning_rate': 9.619781753979367e-06, 'epoch': 0.15} 15%|█▌ | 5183/34278 [5:47:33<31:37:47, 3.91s/it] 15%|█▌ | 5184/34278 [5:47:39<36:35:54, 4.53s/it] {'loss': 0.148, 'grad_norm': 1.2674925913842625, 'learning_rate': 9.619601028273436e-06, 'epoch': 0.15} 15%|█▌ | 5184/34278 [5:47:39<36:35:54, 4.53s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12432 > 8192). Running this sequence through the model will result in indexing errors 15%|█▌ | 5185/34278 [5:47:43<34:49:39, 4.31s/it] {'loss': 0.1669, 'grad_norm': 0.6770142237354774, 'learning_rate': 9.61942026132481e-06, 'epoch': 0.15} 15%|█▌ | 5185/34278 [5:47:43<34:49:39, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▌ | 5186/34278 [5:47:46<31:59:44, 3.96s/it] {'loss': 0.1922, 'grad_norm': 1.2654969480912075, 'learning_rate': 9.619239453135103e-06, 'epoch': 0.15} 15%|█▌ | 5186/34278 [5:47:46<31:59:44, 3.96s/it] 15%|█▌ | 5187/34278 [5:47:50<31:49:29, 3.94s/it] {'loss': 0.16, 'grad_norm': 0.9072409755784046, 'learning_rate': 9.619058603705927e-06, 'epoch': 0.15} 15%|█▌ | 5187/34278 [5:47:50<31:49:29, 3.94s/it] 15%|█▌ | 5188/34278 [5:47:53<30:32:18, 3.78s/it] {'loss': 0.1828, 'grad_norm': 0.9758151903911123, 'learning_rate': 9.6188777130389e-06, 'epoch': 0.15} 15%|█▌ | 5188/34278 [5:47:53<30:32:18, 3.78s/it] 15%|█▌ | 5189/34278 [5:47:57<29:36:04, 3.66s/it] {'loss': 0.1837, 'grad_norm': 0.8384463599622914, 'learning_rate': 9.618696781135635e-06, 'epoch': 0.15} 15%|█▌ | 5189/34278 [5:47:57<29:36:04, 3.66s/it] 15%|█▌ | 5190/34278 [5:48:00<28:38:57, 3.55s/it] {'loss': 0.1779, 'grad_norm': 1.0062725527060505, 'learning_rate': 9.618515807997748e-06, 'epoch': 0.15} 15%|█▌ | 5190/34278 [5:48:00<28:38:57, 3.55s/it] 15%|█▌ | 5191/34278 [5:48:03<27:38:22, 3.42s/it] {'loss': 0.2033, 'grad_norm': 0.9554657667561735, 'learning_rate': 9.618334793626855e-06, 'epoch': 0.15} 15%|█▌ | 5191/34278 [5:48:03<27:38:22, 3.42s/it] 15%|█▌ | 5192/34278 [5:48:06<26:55:17, 3.33s/it] {'loss': 0.1588, 'grad_norm': 0.7103028587542359, 'learning_rate': 9.61815373802457e-06, 'epoch': 0.15} 15%|█▌ | 5192/34278 [5:48:06<26:55:17, 3.33s/it] 15%|█▌ | 5193/34278 [5:48:09<25:57:40, 3.21s/it] {'loss': 0.165, 'grad_norm': 0.9075256364303415, 'learning_rate': 9.617972641192513e-06, 'epoch': 0.15} 15%|█▌ | 5193/34278 [5:48:09<25:57:40, 3.21s/it] 15%|█▌ | 5194/34278 [5:48:15<32:58:15, 4.08s/it] {'loss': 0.1742, 'grad_norm': 1.1186651509945547, 'learning_rate': 9.617791503132297e-06, 'epoch': 0.15} 15%|█▌ | 5194/34278 [5:48:15<32:58:15, 4.08s/it] 15%|█▌ | 5195/34278 [5:48:18<30:18:13, 3.75s/it] {'loss': 0.223, 'grad_norm': 0.916327418833708, 'learning_rate': 9.617610323845539e-06, 
'epoch': 0.15} 15%|█▌ | 5195/34278 [5:48:18<30:18:13, 3.75s/it] 15%|█▌ | 5196/34278 [5:48:22<29:47:08, 3.69s/it] {'loss': 0.1556, 'grad_norm': 0.874467535661217, 'learning_rate': 9.617429103333862e-06, 'epoch': 0.15} 15%|█▌ | 5196/34278 [5:48:22<29:47:08, 3.69s/it] 15%|█▌ | 5197/34278 [5:48:27<32:16:26, 4.00s/it] {'loss': 0.1789, 'grad_norm': 1.1124542867749956, 'learning_rate': 9.617247841598877e-06, 'epoch': 0.15} 15%|█▌ | 5197/34278 [5:48:27<32:16:26, 4.00s/it] 15%|█▌ | 5198/34278 [5:48:30<30:24:44, 3.76s/it] {'loss': 0.1895, 'grad_norm': 0.9619492305116694, 'learning_rate': 9.617066538642209e-06, 'epoch': 0.15} 15%|█▌ | 5198/34278 [5:48:30<30:24:44, 3.76s/it] 15%|█▌ | 5199/34278 [5:48:35<34:02:17, 4.21s/it] {'loss': 0.1647, 'grad_norm': 1.0891589166378086, 'learning_rate': 9.616885194465471e-06, 'epoch': 0.15} 15%|█▌ | 5199/34278 [5:48:35<34:02:17, 4.21s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▌ | 5200/34278 [5:48:39<34:36:15, 4.28s/it] {'loss': 0.1664, 'grad_norm': 0.8885211172842047, 'learning_rate': 9.616703809070283e-06, 'epoch': 0.15} 15%|█▌ | 5200/34278 [5:48:39<34:36:15, 4.28s/it] 15%|█▌ | 5201/34278 [5:48:43<32:00:09, 3.96s/it] {'loss': 0.1675, 'grad_norm': 0.8612314005430644, 'learning_rate': 9.616522382458268e-06, 'epoch': 0.15} 15%|█▌ | 5201/34278 [5:48:43<32:00:09, 3.96s/it] 15%|█▌ | 5202/34278 [5:48:46<30:20:02, 3.76s/it] {'loss': 0.1864, 'grad_norm': 0.9704916637924584, 'learning_rate': 9.616340914631041e-06, 'epoch': 0.15} 15%|█▌ | 5202/34278 [5:48:46<30:20:02, 3.76s/it] 15%|█▌ | 5203/34278 [5:48:52<35:24:17, 4.38s/it] {'loss': 0.1916, 'grad_norm': 0.8954603361269353, 'learning_rate': 9.616159405590226e-06, 'epoch': 0.15} 15%|█▌ | 5203/34278 [5:48:52<35:24:17, 4.38s/it] 15%|█▌ | 5204/34278 [5:48:56<35:01:27, 4.34s/it] {'loss': 0.1832, 'grad_norm': 1.0133855686822917, 'learning_rate': 9.615977855337442e-06, 'epoch': 0.15} 15%|█▌ | 5204/34278 [5:48:56<35:01:27, 4.34s/it] 15%|█▌ | 5205/34278 [5:48:59<31:14:17, 3.87s/it] {'loss': 0.1587, 'grad_norm': 0.8147693290478469, 'learning_rate': 9.615796263874308e-06, 'epoch': 0.15} 15%|█▌ | 5205/34278 [5:48:59<31:14:17, 3.87s/it] 15%|█▌ | 5206/34278 [5:49:02<29:32:37, 3.66s/it] {'loss': 0.1581, 'grad_norm': 1.0275112228198235, 'learning_rate': 9.615614631202449e-06, 'epoch': 0.15} 15%|█▌ | 5206/34278 [5:49:02<29:32:37, 3.66s/it] 15%|█▌ | 5207/34278 [5:49:05<28:04:27, 3.48s/it] {'loss': 0.1894, 'grad_norm': 1.1285279421671728, 'learning_rate': 9.615432957323481e-06, 'epoch': 0.15} 15%|█▌ | 5207/34278 [5:49:05<28:04:27, 3.48s/it] 15%|█▌ | 5208/34278 [5:49:08<27:50:35, 3.45s/it] {'loss': 0.1739, 'grad_norm': 0.8877871622878019, 'learning_rate': 9.615251242239033e-06, 'epoch': 0.15} 15%|█▌ | 5208/34278 [5:49:08<27:50:35, 3.45s/it] 15%|█▌ | 5209/34278 [5:49:11<26:44:07, 3.31s/it] {'loss': 0.1652, 'grad_norm': 1.050245295509508, 'learning_rate': 
9.61506948595072e-06, 'epoch': 0.15} 15%|█▌ | 5209/34278 [5:49:11<26:44:07, 3.31s/it] 15%|█▌ | 5210/34278 [5:49:15<26:11:34, 3.24s/it] {'loss': 0.1741, 'grad_norm': 0.9636502036667239, 'learning_rate': 9.614887688460171e-06, 'epoch': 0.15} 15%|█▌ | 5210/34278 [5:49:15<26:11:34, 3.24s/it] 15%|█▌ | 5211/34278 [5:49:18<27:44:24, 3.44s/it] {'loss': 0.1775, 'grad_norm': 1.020172444862389, 'learning_rate': 9.614705849769006e-06, 'epoch': 0.15} 15%|█▌ | 5211/34278 [5:49:18<27:44:24, 3.44s/it] 15%|█▌ | 5212/34278 [5:49:21<26:28:53, 3.28s/it] {'loss': 0.1719, 'grad_norm': 0.7703176891004924, 'learning_rate': 9.61452396987885e-06, 'epoch': 0.15} 15%|█▌ | 5212/34278 [5:49:21<26:28:53, 3.28s/it] 15%|█▌ | 5213/34278 [5:49:24<26:16:46, 3.26s/it] {'loss': 0.2006, 'grad_norm': 0.7928883634593038, 'learning_rate': 9.614342048791322e-06, 'epoch': 0.15} 15%|█▌ | 5213/34278 [5:49:25<26:16:46, 3.26s/it] 15%|█▌ | 5214/34278 [5:49:27<25:39:05, 3.18s/it] {'loss': 0.1738, 'grad_norm': 0.8622712038997947, 'learning_rate': 9.614160086508053e-06, 'epoch': 0.15} 15%|█▌ | 5214/34278 [5:49:28<25:39:05, 3.18s/it] 15%|█▌ | 5215/34278 [5:49:34<33:10:58, 4.11s/it] {'loss': 0.1908, 'grad_norm': 1.015876504668712, 'learning_rate': 9.613978083030663e-06, 'epoch': 0.15} 15%|█▌ | 5215/34278 [5:49:34<33:10:58, 4.11s/it] 15%|█▌ | 5216/34278 [5:49:37<30:50:32, 3.82s/it] {'loss': 0.1813, 'grad_norm': 0.9547319868253054, 'learning_rate': 9.613796038360779e-06, 'epoch': 0.15} 15%|█▌ | 5216/34278 [5:49:37<30:50:32, 3.82s/it] 15%|█▌ | 5217/34278 [5:49:42<33:43:18, 4.18s/it] {'loss': 0.178, 'grad_norm': 0.8052457579092854, 'learning_rate': 9.613613952500024e-06, 'epoch': 0.15} 15%|█▌ | 5217/34278 [5:49:42<33:43:18, 4.18s/it] 15%|█▌ | 5218/34278 [5:49:45<31:04:23, 3.85s/it] {'loss': 0.1479, 'grad_norm': 0.9372740219522526, 'learning_rate': 9.613431825450026e-06, 'epoch': 0.15} 15%|█▌ | 5218/34278 [5:49:45<31:04:23, 3.85s/it] 15%|█▌ | 5219/34278 [5:49:51<36:22:01, 4.51s/it] {'loss': 0.1575, 'grad_norm': 
1.0724994981092462, 'learning_rate': 9.613249657212408e-06, 'epoch': 0.15} 15%|█▌ | 5219/34278 [5:49:51<36:22:01, 4.51s/it] 15%|█▌ | 5220/34278 [5:49:56<36:42:42, 4.55s/it] {'loss': 0.1506, 'grad_norm': 0.8514462977175778, 'learning_rate': 9.613067447788802e-06, 'epoch': 0.15} 15%|█▌ | 5220/34278 [5:49:56<36:42:42, 4.55s/it] 15%|█▌ | 5221/34278 [5:49:59<33:15:52, 4.12s/it] {'loss': 0.1783, 'grad_norm': 1.186589938781719, 'learning_rate': 9.612885197180828e-06, 'epoch': 0.15} 15%|█▌ | 5221/34278 [5:49:59<33:15:52, 4.12s/it] 15%|█▌ | 5222/34278 [5:50:03<32:39:59, 4.05s/it] {'loss': 0.1699, 'grad_norm': 0.9758065586189032, 'learning_rate': 9.612702905390116e-06, 'epoch': 0.15} 15%|█▌ | 5222/34278 [5:50:03<32:39:59, 4.05s/it] 15%|█▌ | 5223/34278 [5:50:06<30:40:11, 3.80s/it] {'loss': 0.1536, 'grad_norm': 0.8549776954726223, 'learning_rate': 9.612520572418296e-06, 'epoch': 0.15} 15%|█▌ | 5223/34278 [5:50:06<30:40:11, 3.80s/it] 15%|█▌ | 5224/34278 [5:50:09<28:40:14, 3.55s/it] {'loss': 0.1806, 'grad_norm': 0.9629543975850083, 'learning_rate': 9.612338198266993e-06, 'epoch': 0.15} 15%|█▌ | 5224/34278 [5:50:09<28:40:14, 3.55s/it] 15%|█▌ | 5225/34278 [5:50:12<27:37:56, 3.42s/it] {'loss': 0.1792, 'grad_norm': 0.944028999761969, 'learning_rate': 9.612155782937835e-06, 'epoch': 0.15} 15%|█▌ | 5225/34278 [5:50:12<27:37:56, 3.42s/it] 15%|█▌ | 5226/34278 [5:50:16<28:16:22, 3.50s/it] {'loss': 0.1704, 'grad_norm': 0.8341628912648352, 'learning_rate': 9.61197332643245e-06, 'epoch': 0.15} 15%|█▌ | 5226/34278 [5:50:16<28:16:22, 3.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▌ | 5227/34278 [5:50:20<30:57:16, 3.84s/it] {'loss': 0.196, 'grad_norm': 0.8256586584529005, 'learning_rate': 9.61179082875247e-06, 'epoch': 0.15} 15%|█▌ | 5227/34278 [5:50:20<30:57:16, 3.84s/it] 15%|█▌ | 5228/34278 [5:50:24<29:44:15, 3.69s/it] {'loss': 0.1666, 'grad_norm': 0.8263066750080229, 'learning_rate': 9.611608289899521e-06, 'epoch': 0.15} 15%|█▌ | 5228/34278 [5:50:24<29:44:15, 3.69s/it] 15%|█▌ | 5229/34278 [5:50:27<29:31:16, 3.66s/it] {'loss': 0.1835, 'grad_norm': 0.9071922848564402, 'learning_rate': 9.611425709875234e-06, 'epoch': 0.15} 15%|█▌ | 5229/34278 [5:50:27<29:31:16, 3.66s/it] 15%|█▌ | 5230/34278 [5:50:31<28:42:45, 3.56s/it] {'loss': 0.1432, 'grad_norm': 0.6553228819023137, 'learning_rate': 9.611243088681239e-06, 'epoch': 0.15} 15%|█▌ | 5230/34278 [5:50:31<28:42:45, 3.56s/it] 15%|█▌ | 5231/34278 [5:50:34<27:48:51, 3.45s/it] {'loss': 0.1773, 'grad_norm': 0.7810850295323523, 'learning_rate': 9.611060426319168e-06, 'epoch': 0.15} 15%|█▌ | 5231/34278 [5:50:34<27:48:51, 3.45s/it] 15%|█▌ | 5232/34278 [5:50:40<33:37:35, 4.17s/it] {'loss': 0.1377, 'grad_norm': 0.9786584587614672, 'learning_rate': 9.61087772279065e-06, 'epoch': 0.15} 15%|█▌ | 5232/34278 [5:50:40<33:37:35, 4.17s/it] 15%|█▌ | 5233/34278 [5:50:43<32:34:58, 4.04s/it] {'loss': 0.1647, 'grad_norm': 0.688277715765858, 'learning_rate': 9.610694978097314e-06, 'epoch': 0.15} 15%|█▌ | 5233/34278 [5:50:43<32:34:58, 4.04s/it] 15%|█▌ | 5234/34278 [5:50:47<30:29:29, 3.78s/it] {'loss': 0.1758, 'grad_norm': 0.9339862118585256, 'learning_rate': 9.610512192240797e-06, 'epoch': 0.15} 15%|█▌ | 5234/34278 [5:50:47<30:29:29, 3.78s/it] 15%|█▌ | 5235/34278 [5:50:53<36:11:55, 4.49s/it] {'loss': 0.1923, 'grad_norm': 0.9559664169701665, 'learning_rate': 9.610329365222725e-06, 'epoch': 0.15} 15%|█▌ | 5235/34278 [5:50:53<36:11:55, 4.49s/it] 15%|█▌ | 5236/34278 [5:50:57<35:46:33, 4.43s/it] {'loss': 0.1703, 'grad_norm': 0.7826416609748704, 'learning_rate': 9.610146497044736e-06, 
'epoch': 0.15} 15%|█▌ | 5236/34278 [5:50:57<35:46:33, 4.43s/it] 15%|█▌ | 5237/34278 [5:51:01<33:52:38, 4.20s/it] {'loss': 0.2068, 'grad_norm': 0.9205570412734108, 'learning_rate': 9.609963587708457e-06, 'epoch': 0.15} 15%|█▌ | 5237/34278 [5:51:01<33:52:38, 4.20s/it] 15%|█▌ | 5238/34278 [5:51:04<31:45:26, 3.94s/it] {'loss': 0.1729, 'grad_norm': 0.9580224749521794, 'learning_rate': 9.609780637215525e-06, 'epoch': 0.15} 15%|█▌ | 5238/34278 [5:51:04<31:45:26, 3.94s/it] 15%|█▌ | 5239/34278 [5:51:08<31:56:38, 3.96s/it] {'loss': 0.1668, 'grad_norm': 0.7730007566000654, 'learning_rate': 9.609597645567572e-06, 'epoch': 0.15} 15%|█▌ | 5239/34278 [5:51:08<31:56:38, 3.96s/it] 15%|█▌ | 5240/34278 [5:51:12<31:12:34, 3.87s/it] {'loss': 0.1562, 'grad_norm': 0.8259023240302131, 'learning_rate': 9.609414612766231e-06, 'epoch': 0.15} 15%|█▌ | 5240/34278 [5:51:12<31:12:34, 3.87s/it] 15%|█▌ | 5241/34278 [5:51:18<36:25:24, 4.52s/it] {'loss': 0.1703, 'grad_norm': 0.8524120965477474, 'learning_rate': 9.609231538813137e-06, 'epoch': 0.15} 15%|█▌ | 5241/34278 [5:51:18<36:25:24, 4.52s/it] 15%|█▌ | 5242/34278 [5:51:24<40:34:59, 5.03s/it] {'loss': 0.1637, 'grad_norm': 0.7507180816390783, 'learning_rate': 9.609048423709923e-06, 'epoch': 0.15} 15%|█▌ | 5242/34278 [5:51:24<40:34:59, 5.03s/it] 15%|█▌ | 5243/34278 [5:51:27<36:18:48, 4.50s/it] {'loss': 0.1764, 'grad_norm': 1.003073067269039, 'learning_rate': 9.608865267458227e-06, 'epoch': 0.15} 15%|█▌ | 5243/34278 [5:51:27<36:18:48, 4.50s/it] 15%|█▌ | 5244/34278 [5:51:33<40:04:32, 4.97s/it] {'loss': 0.1496, 'grad_norm': 0.7127185709273672, 'learning_rate': 9.60868207005968e-06, 'epoch': 0.15} 15%|█▌ | 5244/34278 [5:51:33<40:04:32, 4.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▌ | 5245/34278 [5:51:36<35:56:02, 4.46s/it] {'loss': 0.2168, 'grad_norm': 0.772085291176157, 'learning_rate': 9.608498831515921e-06, 'epoch': 0.15} 15%|█▌ | 5245/34278 [5:51:36<35:56:02, 4.46s/it] 15%|█▌ | 5246/34278 [5:51:42<39:24:12, 4.89s/it] {'loss': 0.1735, 'grad_norm': 0.9274081634817349, 'learning_rate': 9.608315551828584e-06, 'epoch': 0.15} 15%|█▌ | 5246/34278 [5:51:42<39:24:12, 4.89s/it] 15%|█▌ | 5247/34278 [5:51:47<38:34:52, 4.78s/it] {'loss': 0.1768, 'grad_norm': 0.7450579897763286, 'learning_rate': 9.608132230999308e-06, 'epoch': 0.15} 15%|█▌ | 5247/34278 [5:51:47<38:34:52, 4.78s/it] 15%|█▌ | 5248/34278 [5:51:50<33:38:32, 4.17s/it] {'loss': 0.1442, 'grad_norm': 0.7313613290022853, 'learning_rate': 9.607948869029723e-06, 'epoch': 0.15} 15%|█▌ | 5248/34278 [5:51:50<33:38:32, 4.17s/it] 15%|█▌ | 5249/34278 [5:51:53<32:28:35, 4.03s/it] {'loss': 0.1954, 'grad_norm': 0.8460874138905079, 'learning_rate': 9.607765465921475e-06, 'epoch': 0.15} 15%|█▌ | 5249/34278 [5:51:53<32:28:35, 4.03s/it] 15%|█▌ | 5250/34278 [5:51:56<29:51:18, 3.70s/it] {'loss': 0.1687, 'grad_norm': 0.9068670462246586, 'learning_rate': 9.607582021676193e-06, 'epoch': 0.15} 15%|█▌ | 5250/34278 [5:51:56<29:51:18, 3.70s/it] 15%|█▌ | 5251/34278 [5:52:00<28:44:40, 3.56s/it] {'loss': 0.1773, 'grad_norm': 0.9946540282040286, 'learning_rate': 9.607398536295522e-06, 'epoch': 0.15} 15%|█▌ | 5251/34278 [5:52:00<28:44:40, 3.56s/it] 15%|█▌ | 5252/34278 [5:52:03<27:29:46, 3.41s/it] {'loss': 0.1719, 'grad_norm': 0.9035621429233804, 'learning_rate': 9.607215009781094e-06, 'epoch': 0.15} 15%|█▌ | 5252/34278 [5:52:03<27:29:46, 3.41s/it] 15%|█▌ | 5253/34278 [5:52:06<27:51:50, 3.46s/it] {'loss': 0.177, 'grad_norm': 0.9185277929085259, 'learning_rate': 9.607031442134554e-06, 'epoch': 0.15} 15%|█▌ | 5253/34278 [5:52:06<27:51:50, 3.46s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in 
__getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f216b9b57b0>
Failed to fetch sample 2654734.
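[Editor's note: the traceback above shows `pil_loader` raising `PIL.UnidentifiedImageError` on undecodable bytes, which the dataset then reports as a failed fetch and skips. A minimal sketch of that defensive-loading pattern, using a hypothetical `safe_pil_loader` (not the actual `aguvis.dataset` code), might look like:]

```python
import io

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes):
    """Hypothetical defensive variant of a pil_loader-style helper.

    Returns a decoded RGB image, or None when the bytes are not a valid
    image, so the caller (e.g. a Dataset.__getitem__) can log the failure
    and resample another index instead of crashing the training loop.
    """
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # Image.open is lazy; force the decode here to surface errors early
        return img.convert("RGB")
    except UnidentifiedImageError:
        # Matches the failure mode in the traceback: corrupt/non-image bytes.
        return None
```
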
Exception: cannot identify image file <_io.BytesIO object at 0x7f216b9b57b0> 15%|█▌ | 5254/34278 [5:52:10<28:52:07, 3.58s/it] {'loss': 0.1675, 'grad_norm': 0.869343631941605, 'learning_rate': 9.606847833357534e-06, 'epoch': 0.15} 15%|█▌ | 5254/34278 [5:52:10<28:52:07, 3.58s/it] 15%|█▌ | 5255/34278 [5:52:13<28:15:34, 3.51s/it] {'loss': 0.1361, 'grad_norm': 0.7384229055781494, 'learning_rate': 9.606664183451677e-06, 'epoch': 0.15} 15%|█▌ | 5255/34278 [5:52:13<28:15:34, 3.51s/it] 15%|█▌ | 5256/34278 [5:52:16<26:43:19, 3.31s/it] {'loss': 0.1833, 'grad_norm': 0.9312720913620172, 'learning_rate': 9.606480492418622e-06, 'epoch': 0.15} 15%|█▌ | 5256/34278 [5:52:16<26:43:19, 3.31s/it] 15%|█▌ | 5257/34278 [5:52:19<26:27:28, 3.28s/it] {'loss': 0.152, 'grad_norm': 0.9086396553991825, 'learning_rate': 9.606296760260008e-06, 'epoch': 0.15} 15%|█▌ | 5257/34278 [5:52:19<26:27:28, 3.28s/it] 15%|█▌ | 5258/34278 [5:52:25<32:09:27, 3.99s/it] {'loss': 0.1819, 'grad_norm': 0.8849816361045499, 'learning_rate': 9.606112986977477e-06, 'epoch': 0.15} 15%|█▌ | 5258/34278 [5:52:25<32:09:27, 3.99s/it] 15%|█▌ | 5259/34278 [5:52:32<38:08:48, 4.73s/it] {'loss': 0.1692, 'grad_norm': 0.9573284683873251, 'learning_rate': 9.605929172572668e-06, 'epoch': 0.15} 15%|█▌ | 5259/34278 [5:52:32<38:08:48, 4.73s/it] 15%|█▌ | 5260/34278 [5:52:35<34:24:42, 4.27s/it] {'loss': 0.1778, 'grad_norm': 0.9932776120380594, 'learning_rate': 9.605745317047224e-06, 'epoch': 0.15} 15%|█▌ | 5260/34278 [5:52:35<34:24:42, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▌ | 5261/34278 [5:52:41<38:38:53, 4.79s/it] {'loss': 0.1704, 'grad_norm': 0.7934079782105875, 'learning_rate': 9.605561420402786e-06, 'epoch': 0.15} 15%|█▌ | 5261/34278 [5:52:41<38:38:53, 4.79s/it] 15%|█▌ | 5262/34278 [5:52:44<33:46:01, 4.19s/it] {'loss': 0.1662, 'grad_norm': 1.0393421979411546, 'learning_rate': 9.605377482640991e-06, 'epoch': 0.15} 15%|█▌ | 5262/34278 [5:52:44<33:46:01, 4.19s/it] 15%|█▌ | 5263/34278 [5:52:49<37:17:32, 4.63s/it] {'loss': 0.1714, 'grad_norm': 0.8575229870657547, 'learning_rate': 9.60519350376349e-06, 'epoch': 0.15} 15%|█▌ | 5263/34278 [5:52:49<37:17:32, 4.63s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9823 > 8192). Running this sequence through the model will result in indexing errors 15%|█▌ | 5264/34278 [5:52:55<40:11:56, 4.99s/it] {'loss': 0.1795, 'grad_norm': 0.968177165950683, 'learning_rate': 9.605009483771919e-06, 'epoch': 0.15} 15%|█▌ | 5264/34278 [5:52:55<40:11:56, 4.99s/it] 15%|█▌ | 5265/34278 [5:53:01<42:31:39, 5.28s/it] {'loss': 0.1902, 'grad_norm': 0.8745136764727854, 'learning_rate': 9.604825422667921e-06, 'epoch': 0.15} 15%|█▌ | 5265/34278 [5:53:01<42:31:39, 5.28s/it] 15%|█▌ | 5266/34278 [5:53:04<37:46:07, 4.69s/it] {'loss': 0.1536, 'grad_norm': 0.7757863453994622, 'learning_rate': 9.604641320453143e-06, 'epoch': 0.15} 15%|█▌ | 5266/34278 [5:53:04<37:46:07, 4.69s/it] 15%|█▌ | 5267/34278 [5:53:08<35:09:48, 4.36s/it] {'loss': 0.1776, 'grad_norm': 0.7955542026906197, 'learning_rate': 9.604457177129226e-06, 'epoch': 0.15} 15%|█▌ | 5267/34278 [5:53:08<35:09:48, 4.36s/it] 15%|█▌ | 5268/34278 [5:53:11<33:01:10, 4.10s/it] {'loss': 0.202, 'grad_norm': 1.4192027347687417, 'learning_rate': 9.604272992697814e-06, 'epoch': 0.15} 15%|█▌ | 5268/34278 [5:53:11<33:01:10, 4.10s/it] 15%|█▌ | 5269/34278 [5:53:15<32:56:16, 4.09s/it] {'loss': 0.1645, 'grad_norm': 1.0294363103726043, 'learning_rate': 9.604088767160553e-06, 'epoch': 0.15} 
15%|█▌ | 5269/34278 [5:53:15<32:56:16, 4.09s/it] 15%|█▌ | 5270/34278 [5:53:19<31:03:31, 3.85s/it] {'loss': 0.1819, 'grad_norm': 0.84436034845223, 'learning_rate': 9.603904500519086e-06, 'epoch': 0.15} 15%|█▌ | 5270/34278 [5:53:19<31:03:31, 3.85s/it] 15%|█▌ | 5271/34278 [5:53:22<30:22:21, 3.77s/it] {'loss': 0.1605, 'grad_norm': 1.2536092902006988, 'learning_rate': 9.603720192775057e-06, 'epoch': 0.15} 15%|█▌ | 5271/34278 [5:53:22<30:22:21, 3.77s/it] 15%|█▌ | 5272/34278 [5:53:27<32:18:32, 4.01s/it] {'loss': 0.1796, 'grad_norm': 1.0431662330200993, 'learning_rate': 9.603535843930116e-06, 'epoch': 0.15} 15%|█▌ | 5272/34278 [5:53:27<32:18:32, 4.01s/it] 15%|█▌ | 5273/34278 [5:53:30<29:02:26, 3.60s/it] {'loss': 0.1757, 'grad_norm': 0.916858461356858, 'learning_rate': 9.603351453985903e-06, 'epoch': 0.15} 15%|█▌ | 5273/34278 [5:53:30<29:02:26, 3.60s/it] 15%|█▌ | 5274/34278 [5:53:33<27:38:39, 3.43s/it] {'loss': 0.1658, 'grad_norm': 0.9931652330333578, 'learning_rate': 9.603167022944069e-06, 'epoch': 0.15} 15%|█▌ | 5274/34278 [5:53:33<27:38:39, 3.43s/it] 15%|█▌ | 5275/34278 [5:53:36<26:33:59, 3.30s/it] {'loss': 0.1544, 'grad_norm': 1.0203396642721403, 'learning_rate': 9.602982550806259e-06, 'epoch': 0.15} 15%|█▌ | 5275/34278 [5:53:36<26:33:59, 3.30s/it] 15%|█▌ | 5276/34278 [5:53:41<32:16:31, 4.01s/it] {'loss': 0.153, 'grad_norm': 0.8700065891494176, 'learning_rate': 9.602798037574117e-06, 'epoch': 0.15} 15%|█▌ | 5276/34278 [5:53:41<32:16:31, 4.01s/it] 15%|█▌ | 5277/34278 [5:53:44<29:35:11, 3.67s/it] {'loss': 0.1747, 'grad_norm': 0.8711074668522925, 'learning_rate': 9.602613483249297e-06, 'epoch': 0.15} 15%|█▌ | 5277/34278 [5:53:44<29:35:11, 3.67s/it] 15%|█▌ | 5278/34278 [5:53:47<27:27:36, 3.41s/it] {'loss': 0.1628, 'grad_norm': 0.8864803591663832, 'learning_rate': 9.60242888783344e-06, 'epoch': 0.15} 15%|█▌ | 5278/34278 [5:53:47<27:27:36, 3.41s/it] 15%|█▌ | 5279/34278 [5:53:50<26:10:04, 3.25s/it] {'loss': 0.1876, 'grad_norm': 0.8195468392838261, 'learning_rate': 
9.602244251328197e-06, 'epoch': 0.15} 15%|█▌ | 5279/34278 [5:53:50<26:10:04, 3.25s/it] 15%|█▌ | 5280/34278 [5:53:53<27:09:22, 3.37s/it] {'loss': 0.1477, 'grad_norm': 0.7120788408881081, 'learning_rate': 9.602059573735216e-06, 'epoch': 0.15} 15%|█▌ | 5280/34278 [5:53:53<27:09:22, 3.37s/it] 15%|█▌ | 5281/34278 [5:53:57<27:02:58, 3.36s/it] {'loss': 0.191, 'grad_norm': 1.1433381576343686, 'learning_rate': 9.601874855056144e-06, 'epoch': 0.15} 15%|█▌ | 5281/34278 [5:53:57<27:02:58, 3.36s/it] 15%|█▌ | 5282/34278 [5:54:03<33:15:40, 4.13s/it] {'loss': 0.1571, 'grad_norm': 1.1968477507990083, 'learning_rate': 9.601690095292634e-06, 'epoch': 0.15} 15%|█▌ | 5282/34278 [5:54:03<33:15:40, 4.13s/it] 15%|█▌ | 5283/34278 [5:54:06<30:49:49, 3.83s/it] {'loss': 0.1788, 'grad_norm': 0.7344208807854702, 'learning_rate': 9.601505294446333e-06, 'epoch': 0.15} 15%|█▌ | 5283/34278 [5:54:06<30:49:49, 3.83s/it] 15%|█▌ | 5284/34278 [5:54:09<29:36:19, 3.68s/it] {'loss': 0.1953, 'grad_norm': 0.9059529903033163, 'learning_rate': 9.60132045251889e-06, 'epoch': 0.15} 15%|█▌ | 5284/34278 [5:54:09<29:36:19, 3.68s/it] 15%|█▌ | 5285/34278 [5:54:12<28:15:48, 3.51s/it] {'loss': 0.1667, 'grad_norm': 1.0109403347908683, 'learning_rate': 9.60113556951196e-06, 'epoch': 0.15} 15%|█▌ | 5285/34278 [5:54:12<28:15:48, 3.51s/it] 15%|█▌ | 5286/34278 [5:54:17<31:53:12, 3.96s/it] {'loss': 0.1509, 'grad_norm': 0.7689079319908673, 'learning_rate': 9.600950645427185e-06, 'epoch': 0.15} 15%|█▌ | 5286/34278 [5:54:17<31:53:12, 3.96s/it] 15%|█▌ | 5287/34278 [5:54:23<35:26:41, 4.40s/it] {'loss': 0.1778, 'grad_norm': 1.00848130810581, 'learning_rate': 9.600765680266225e-06, 'epoch': 0.15} 15%|█▌ | 5287/34278 [5:54:23<35:26:41, 4.40s/it] 15%|█▌ | 5288/34278 [5:54:26<31:47:53, 3.95s/it] {'loss': 0.1809, 'grad_norm': 0.8903844184600506, 'learning_rate': 9.600580674030724e-06, 'epoch': 0.15} 15%|█▌ | 5288/34278 [5:54:26<31:47:53, 3.95s/it] 15%|█▌ | 5289/34278 [5:54:29<30:31:39, 3.79s/it] {'loss': 0.2061, 'grad_norm': 
0.9875250119902921, 'learning_rate': 9.600395626722339e-06, 'epoch': 0.15} 15%|█▌ | 5289/34278 [5:54:29<30:31:39, 3.79s/it] 15%|█▌ | 5290/34278 [5:54:32<29:22:34, 3.65s/it] {'loss': 0.1896, 'grad_norm': 1.0680535625688914, 'learning_rate': 9.60021053834272e-06, 'epoch': 0.15} 15%|█▌ | 5290/34278 [5:54:32<29:22:34, 3.65s/it] 15%|█▌ | 5291/34278 [5:54:35<28:02:01, 3.48s/it] {'loss': 0.1605, 'grad_norm': 0.8705081594665709, 'learning_rate': 9.60002540889352e-06, 'epoch': 0.15} 15%|█▌ | 5291/34278 [5:54:35<28:02:01, 3.48s/it] 15%|█▌ | 5292/34278 [5:54:39<27:34:29, 3.42s/it] {'loss': 0.1667, 'grad_norm': 0.9134141055638958, 'learning_rate': 9.59984023837639e-06, 'epoch': 0.15} 15%|█▌ | 5292/34278 [5:54:39<27:34:29, 3.42s/it] 15%|█▌ | 5293/34278 [5:54:42<26:40:45, 3.31s/it] {'loss': 0.2085, 'grad_norm': 0.9746062664402452, 'learning_rate': 9.599655026792984e-06, 'epoch': 0.15} 15%|█▌ | 5293/34278 [5:54:42<26:40:45, 3.31s/it] 15%|█▌ | 5294/34278 [5:54:46<29:05:18, 3.61s/it] {'loss': 0.1848, 'grad_norm': 1.0177194139187573, 'learning_rate': 9.599469774144958e-06, 'epoch': 0.15} 15%|█▌ | 5294/34278 [5:54:46<29:05:18, 3.61s/it] 15%|█▌ | 5295/34278 [5:54:49<26:50:49, 3.33s/it] {'loss': 0.1558, 'grad_norm': 0.7336467370838153, 'learning_rate': 9.599284480433963e-06, 'epoch': 0.15} 15%|█▌ | 5295/34278 [5:54:49<26:50:49, 3.33s/it] 15%|█▌ | 5296/34278 [5:54:54<32:39:37, 4.06s/it] {'loss': 0.1593, 'grad_norm': 0.8934190069714771, 'learning_rate': 9.599099145661654e-06, 'epoch': 0.15} 15%|█▌ | 5296/34278 [5:54:55<32:39:37, 4.06s/it] 15%|█▌ | 5297/34278 [5:54:58<30:50:14, 3.83s/it] {'loss': 0.1686, 'grad_norm': 0.7370426149551272, 'learning_rate': 9.598913769829685e-06, 'epoch': 0.15} 15%|█▌ | 5297/34278 [5:54:58<30:50:14, 3.83s/it] 15%|█▌ | 5298/34278 [5:55:03<33:42:47, 4.19s/it] {'loss': 0.1655, 'grad_norm': 0.8063677727885334, 'learning_rate': 9.598728352939713e-06, 'epoch': 0.15} 15%|█▌ | 5298/34278 [5:55:03<33:42:47, 4.19s/it] 15%|█▌ | 5299/34278 [5:55:07<32:58:00, 4.10s/it] 
{'loss': 0.2007, 'grad_norm': 0.8655458639912662, 'learning_rate': 9.59854289499339e-06, 'epoch': 0.15} 15%|█▌ | 5299/34278 [5:55:07<32:58:00, 4.10s/it] 15%|█▌ | 5300/34278 [5:55:13<37:21:18, 4.64s/it] {'loss': 0.2051, 'grad_norm': 0.9481333670787759, 'learning_rate': 9.598357395992375e-06, 'epoch': 0.15} 15%|█▌ | 5300/34278 [5:55:13<37:21:18, 4.64s/it] 15%|█▌ | 5301/34278 [5:55:19<40:39:27, 5.05s/it] {'loss': 0.1735, 'grad_norm': 0.9178766672636066, 'learning_rate': 9.598171855938323e-06, 'epoch': 0.15} 15%|█▌ | 5301/34278 [5:55:19<40:39:27, 5.05s/it] 15%|█▌ | 5302/34278 [5:55:22<36:48:51, 4.57s/it] {'loss': 0.1885, 'grad_norm': 0.9383207793954956, 'learning_rate': 9.597986274832891e-06, 'epoch': 0.15} 15%|█▌ | 5302/34278 [5:55:22<36:48:51, 4.57s/it] 15%|█▌ | 5303/34278 [5:55:28<40:08:15, 4.99s/it] {'loss': 0.1754, 'grad_norm': 0.9201225255267969, 'learning_rate': 9.597800652677734e-06, 'epoch': 0.15} 15%|█▌ | 5303/34278 [5:55:28<40:08:15, 4.99s/it] 15%|█▌ | 5304/34278 [5:55:33<40:19:17, 5.01s/it] {'loss': 0.164, 'grad_norm': 0.8736399135909124, 'learning_rate': 9.597614989474512e-06, 'epoch': 0.15} 15%|█▌ | 5304/34278 [5:55:33<40:19:17, 5.01s/it] 15%|█▌ | 5305/34278 [5:55:36<35:04:24, 4.36s/it] {'loss': 0.1654, 'grad_norm': 0.8028640814658534, 'learning_rate': 9.597429285224879e-06, 'epoch': 0.15} 15%|█▌ | 5305/34278 [5:55:36<35:04:24, 4.36s/it] 15%|█▌ | 5306/34278 [5:55:39<31:31:37, 3.92s/it] {'loss': 0.1777, 'grad_norm': 0.8444657613859604, 'learning_rate': 9.597243539930496e-06, 'epoch': 0.15} 15%|█▌ | 5306/34278 [5:55:39<31:31:37, 3.92s/it] 15%|█▌ | 5307/34278 [5:55:44<35:00:16, 4.35s/it] {'loss': 0.176, 'grad_norm': 0.9995551637483626, 'learning_rate': 9.597057753593018e-06, 'epoch': 0.15} 15%|█▌ | 5307/34278 [5:55:44<35:00:16, 4.35s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 15%|█▌ | 5308/34278 [5:55:48<33:16:48, 4.14s/it] {'loss': 0.1801, 'grad_norm': 1.0362160814459496, 'learning_rate': 9.59687192621411e-06, 'epoch': 0.15} 15%|█▌ | 5308/34278 [5:55:48<33:16:48, 4.14s/it] 15%|█▌ | 5309/34278 [5:55:52<32:29:43, 4.04s/it] {'loss': 0.1802, 'grad_norm': 0.7848692139608957, 'learning_rate': 9.596686057795424e-06, 'epoch': 0.15} 15%|█▌ | 5309/34278 [5:55:52<32:29:43, 4.04s/it] 15%|█▌ | 5310/34278 [5:55:55<30:20:27, 3.77s/it] {'loss': 0.1841, 'grad_norm': 0.8313163715921467, 'learning_rate': 9.59650014833862e-06, 'epoch': 0.15} 15%|█▌ | 5310/34278 [5:55:55<30:20:27, 3.77s/it] 15%|█▌ | 5311/34278 [5:55:58<29:16:49, 3.64s/it] {'loss': 0.1847, 'grad_norm': 0.9399501200895383, 'learning_rate': 9.596314197845365e-06, 'epoch': 0.15} 15%|█▌ | 5311/34278 [5:55:58<29:16:49, 3.64s/it] 15%|█▌ | 5312/34278 [5:56:01<28:24:17, 3.53s/it] {'loss': 0.1573, 'grad_norm': 1.0063637393463205, 'learning_rate': 9.59612820631731e-06, 'epoch': 0.15} 15%|█▌ | 5312/34278 [5:56:01<28:24:17, 3.53s/it] 15%|█▌ | 5313/34278 [5:56:04<26:31:30, 3.30s/it] {'loss': 0.181, 'grad_norm': 0.8905078304122748, 'learning_rate': 9.595942173756121e-06, 'epoch': 0.15} 15%|█▌ | 5313/34278 [5:56:04<26:31:30, 3.30s/it] 16%|█▌ | 5314/34278 [5:56:10<33:06:08, 4.11s/it] {'loss': 0.1461, 'grad_norm': 0.8347368721986382, 'learning_rate': 9.595756100163459e-06, 'epoch': 0.16} 16%|█▌ | 5314/34278 [5:56:10<33:06:08, 4.11s/it] 16%|█▌ | 5315/34278 [5:56:14<31:58:03, 3.97s/it] {'loss': 0.1756, 'grad_norm': 0.7521674181445719, 'learning_rate': 9.59556998554098e-06, 'epoch': 0.16} 16%|█▌ | 5315/34278 [5:56:14<31:58:03, 3.97s/it] 16%|█▌ | 5316/34278 [5:56:18<31:27:15, 3.91s/it] {'loss': 0.166, 'grad_norm': 0.6519258185824471, 'learning_rate': 9.595383829890352e-06, 'epoch': 0.16} 16%|█▌ | 5316/34278 [5:56:18<31:27:15, 3.91s/it] 16%|█▌ | 5317/34278 [5:56:21<29:53:04, 3.71s/it] {'loss': 0.1663, 'grad_norm': 0.8118094863671605, 'learning_rate': 9.595197633213233e-06, 
'epoch': 0.16} 16%|█▌ | 5317/34278 [5:56:21<29:53:04, 3.71s/it] 16%|█▌ | 5318/34278 [5:56:24<27:36:37, 3.43s/it] {'loss': 0.1569, 'grad_norm': 0.8365807746765657, 'learning_rate': 9.595011395511288e-06, 'epoch': 0.16} 16%|█▌ | 5318/34278 [5:56:24<27:36:37, 3.43s/it] 16%|█▌ | 5319/34278 [5:56:28<29:04:48, 3.62s/it] {'loss': 0.1725, 'grad_norm': 0.8117970104724412, 'learning_rate': 9.594825116786177e-06, 'epoch': 0.16} 16%|█▌ | 5319/34278 [5:56:28<29:04:48, 3.62s/it] 16%|█▌ | 5320/34278 [5:56:31<28:52:46, 3.59s/it] {'loss': 0.1667, 'grad_norm': 1.0313484257999297, 'learning_rate': 9.594638797039564e-06, 'epoch': 0.16} 16%|█▌ | 5320/34278 [5:56:31<28:52:46, 3.59s/it] 16%|█▌ | 5321/34278 [5:56:35<29:03:53, 3.61s/it] {'loss': 0.1444, 'grad_norm': 0.7814312257479537, 'learning_rate': 9.594452436273113e-06, 'epoch': 0.16} 16%|█▌ | 5321/34278 [5:56:35<29:03:53, 3.61s/it] 16%|█▌ | 5322/34278 [5:56:38<28:08:29, 3.50s/it] {'loss': 0.1819, 'grad_norm': 0.8938854346748851, 'learning_rate': 9.594266034488487e-06, 'epoch': 0.16} 16%|█▌ | 5322/34278 [5:56:38<28:08:29, 3.50s/it] 16%|█▌ | 5323/34278 [5:56:41<27:34:51, 3.43s/it] {'loss': 0.1534, 'grad_norm': 0.8963589527425608, 'learning_rate': 9.594079591687352e-06, 'epoch': 0.16} 16%|█▌ | 5323/34278 [5:56:41<27:34:51, 3.43s/it] 16%|█▌ | 5324/34278 [5:56:45<27:22:46, 3.40s/it] {'loss': 0.1572, 'grad_norm': 1.015163944737604, 'learning_rate': 9.593893107871371e-06, 'epoch': 0.16} 16%|█▌ | 5324/34278 [5:56:45<27:22:46, 3.40s/it] 16%|█▌ | 5325/34278 [5:56:48<26:02:59, 3.24s/it] {'loss': 0.1667, 'grad_norm': 0.93609118151598, 'learning_rate': 9.593706583042208e-06, 'epoch': 0.16} 16%|█▌ | 5325/34278 [5:56:48<26:02:59, 3.24s/it] 16%|█▌ | 5326/34278 [5:56:51<25:42:28, 3.20s/it] {'loss': 0.1812, 'grad_norm': 0.9236317790651747, 'learning_rate': 9.593520017201528e-06, 'epoch': 0.16} 16%|█▌ | 5326/34278 [5:56:51<25:42:28, 3.20s/it] 16%|█▌ | 5327/34278 [5:56:54<25:33:35, 3.18s/it] {'loss': 0.184, 'grad_norm': 0.8454415638573578, 
'learning_rate': 9.593333410351e-06, 'epoch': 0.16} 16%|█▌ | 5327/34278 [5:56:54<25:33:35, 3.18s/it] 16%|█▌ | 5328/34278 [5:57:00<32:50:55, 4.08s/it] {'loss': 0.1783, 'grad_norm': 1.3257250309757944, 'learning_rate': 9.593146762492287e-06, 'epoch': 0.16} 16%|█▌ | 5328/34278 [5:57:00<32:50:55, 4.08s/it] 16%|█▌ | 5329/34278 [5:57:03<30:43:41, 3.82s/it] {'loss': 0.1603, 'grad_norm': 1.0337770473486, 'learning_rate': 9.592960073627055e-06, 'epoch': 0.16} 16%|█▌ | 5329/34278 [5:57:03<30:43:41, 3.82s/it] 16%|█▌ | 5330/34278 [5:57:07<29:52:17, 3.71s/it] {'loss': 0.1514, 'grad_norm': 0.9237513937216989, 'learning_rate': 9.592773343756973e-06, 'epoch': 0.16} 16%|█▌ | 5330/34278 [5:57:07<29:52:17, 3.71s/it] 16%|█▌ | 5331/34278 [5:57:10<28:18:42, 3.52s/it] {'loss': 0.1684, 'grad_norm': 1.1175713018548445, 'learning_rate': 9.592586572883709e-06, 'epoch': 0.16} 16%|█▌ | 5331/34278 [5:57:10<28:18:42, 3.52s/it] 16%|█▌ | 5332/34278 [5:57:13<27:56:04, 3.47s/it] {'loss': 0.1635, 'grad_norm': 0.9757367878718678, 'learning_rate': 9.592399761008925e-06, 'epoch': 0.16} 16%|█▌ | 5332/34278 [5:57:13<27:56:04, 3.47s/it] 16%|█▌ | 5333/34278 [5:57:18<31:55:24, 3.97s/it] {'loss': 0.2119, 'grad_norm': 1.0955431275446061, 'learning_rate': 9.592212908134295e-06, 'epoch': 0.16} 16%|█▌ | 5333/34278 [5:57:18<31:55:24, 3.97s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8603 > 8192). 
Running this sequence through the model will result in indexing errors 16%|█▌ | 5334/34278 [5:57:24<36:34:05, 4.55s/it] {'loss': 0.1719, 'grad_norm': 0.923702646775755, 'learning_rate': 9.592026014261482e-06, 'epoch': 0.16} 16%|█▌ | 5334/34278 [5:57:24<36:34:05, 4.55s/it] 16%|█▌ | 5335/34278 [5:57:27<33:19:52, 4.15s/it] {'loss': 0.1842, 'grad_norm': 0.8292061516193554, 'learning_rate': 9.59183907939216e-06, 'epoch': 0.16} 16%|█▌ | 5335/34278 [5:57:27<33:19:52, 4.15s/it] 16%|█▌ | 5336/34278 [5:57:32<33:35:21, 4.18s/it] {'loss': 0.1537, 'grad_norm': 0.9886712419812845, 'learning_rate': 9.591652103527992e-06, 'epoch': 0.16} 16%|█▌ | 5336/34278 [5:57:32<33:35:21, 4.18s/it] 16%|█▌ | 5337/34278 [5:57:35<33:00:34, 4.11s/it] {'loss': 0.16, 'grad_norm': 0.7914672609825438, 'learning_rate': 9.591465086670651e-06, 'epoch': 0.16} 16%|█▌ | 5337/34278 [5:57:36<33:00:34, 4.11s/it] 16%|█▌ | 5338/34278 [5:57:39<31:06:12, 3.87s/it] {'loss': 0.1973, 'grad_norm': 0.8589726046962344, 'learning_rate': 9.591278028821806e-06, 'epoch': 0.16} 16%|█▌ | 5338/34278 [5:57:39<31:06:12, 3.87s/it] 16%|█▌ | 5339/34278 [5:57:44<33:47:49, 4.20s/it] {'loss': 0.1521, 'grad_norm': 0.8789279455211334, 'learning_rate': 9.591090929983127e-06, 'epoch': 0.16} 16%|█▌ | 5339/34278 [5:57:44<33:47:49, 4.20s/it] 16%|█▌ | 5340/34278 [5:57:47<31:51:43, 3.96s/it] {'loss': 0.1736, 'grad_norm': 0.8498280798734861, 'learning_rate': 9.590903790156282e-06, 'epoch': 0.16} 16%|█▌ | 5340/34278 [5:57:47<31:51:43, 3.96s/it] 16%|█▌ | 5341/34278 [5:57:51<31:23:16, 3.90s/it] {'loss': 0.1713, 'grad_norm': 0.7567362231450246, 'learning_rate': 9.590716609342947e-06, 'epoch': 0.16} 16%|█▌ | 5341/34278 [5:57:51<31:23:16, 3.90s/it] 16%|█▌ | 5342/34278 [5:57:54<28:14:39, 3.51s/it] {'loss': 0.1797, 'grad_norm': 0.9352837267459277, 'learning_rate': 9.590529387544789e-06, 'epoch': 0.16} 16%|█▌ | 5342/34278 [5:57:54<28:14:39, 3.51s/it] 16%|█▌ | 5343/34278 [5:57:57<27:05:39, 3.37s/it] {'loss': 0.1724, 'grad_norm': 0.9945709590177103, 
'learning_rate': 9.59034212476348e-06, 'epoch': 0.16} 16%|█▌ | 5343/34278 [5:57:57<27:05:39, 3.37s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9702 > 8192). Running this sequence through the model will result in indexing errors 16%|█▌ | 5344/34278 [5:58:00<26:30:57, 3.30s/it] {'loss': 0.1771, 'grad_norm': 0.8160437077233418, 'learning_rate': 9.590154821000692e-06, 'epoch': 0.16} 16%|█▌ | 5344/34278 [5:58:00<26:30:57, 3.30s/it] 16%|█▌ | 5345/34278 [5:58:03<25:50:47, 3.22s/it] {'loss': 0.1703, 'grad_norm': 0.9219373694189473, 'learning_rate': 9.5899674762581e-06, 'epoch': 0.16} 16%|█▌ | 5345/34278 [5:58:03<25:50:47, 3.22s/it] 16%|█▌ | 5346/34278 [5:58:07<27:39:29, 3.44s/it] {'loss': 0.157, 'grad_norm': 0.8541268506205408, 'learning_rate': 9.589780090537371e-06, 'epoch': 0.16} 16%|█▌ | 5346/34278 [5:58:07<27:39:29, 3.44s/it] 16%|█▌ | 5347/34278 [5:58:12<32:27:41, 4.04s/it] {'loss': 0.1925, 'grad_norm': 0.8950324678253496, 'learning_rate': 9.589592663840182e-06, 'epoch': 0.16} 16%|█▌ | 5347/34278 [5:58:12<32:27:41, 4.04s/it] 16%|█▌ | 5348/34278 [5:58:15<30:01:46, 3.74s/it] {'loss': 0.1651, 'grad_norm': 0.7035913765548862, 'learning_rate': 9.589405196168204e-06, 'epoch': 0.16} 16%|█▌ | 5348/34278 [5:58:15<30:01:46, 3.74s/it] 16%|█▌ | 5349/34278 [5:58:20<31:48:16, 3.96s/it] {'loss': 0.1657, 'grad_norm': 1.6294715856673787, 'learning_rate': 9.589217687523114e-06, 'epoch': 0.16} 16%|█▌ | 5349/34278 [5:58:20<31:48:16, 3.96s/it] 16%|█▌ | 5350/34278 [5:58:23<29:29:31, 3.67s/it] {'loss': 0.1592, 'grad_norm': 0.8411972847649672, 'learning_rate': 9.589030137906584e-06, 'epoch': 0.16} 16%|█▌ | 5350/34278 [5:58:23<29:29:31, 3.67s/it] 16%|█▌ | 5351/34278 [5:58:28<33:18:53, 4.15s/it] {'loss': 0.182, 'grad_norm': 0.7574829571818203, 'learning_rate': 9.588842547320287e-06, 'epoch': 0.16} 16%|█▌ | 5351/34278 [5:58:28<33:18:53, 4.15s/it] 16%|█▌ | 5352/34278 [5:58:31<30:43:06, 3.82s/it] {'loss': 0.1912, 'grad_norm': 
0.9529049001866505, 'learning_rate': 9.588654915765901e-06, 'epoch': 0.16} 16%|█▌ | 5352/34278 [5:58:31<30:43:06, 3.82s/it] 16%|█▌ | 5353/34278 [5:58:34<28:48:59, 3.59s/it] {'loss': 0.1724, 'grad_norm': 0.895152485217695, 'learning_rate': 9.588467243245099e-06, 'epoch': 0.16} 16%|█▌ | 5353/34278 [5:58:34<28:48:59, 3.59s/it] 16%|█▌ | 5354/34278 [5:58:40<35:15:14, 4.39s/it] {'loss': 0.1535, 'grad_norm': 0.8419340032565599, 'learning_rate': 9.588279529759556e-06, 'epoch': 0.16} 16%|█▌ | 5354/34278 [5:58:40<35:15:14, 4.39s/it] 16%|█▌ | 5355/34278 [5:58:44<33:18:49, 4.15s/it] {'loss': 0.1682, 'grad_norm': 0.8098932874842786, 'learning_rate': 9.588091775310948e-06, 'epoch': 0.16} 16%|█▌ | 5355/34278 [5:58:44<33:18:49, 4.15s/it] 16%|█▌ | 5356/34278 [5:58:47<29:57:33, 3.73s/it] {'loss': 0.1503, 'grad_norm': 1.0321349408698615, 'learning_rate': 9.587903979900953e-06, 'epoch': 0.16} 16%|█▌ | 5356/34278 [5:58:47<29:57:33, 3.73s/it] 16%|█▌ | 5357/34278 [5:58:51<31:28:39, 3.92s/it] {'loss': 0.1763, 'grad_norm': 0.9084734717859104, 'learning_rate': 9.587716143531248e-06, 'epoch': 0.16} 16%|█▌ | 5357/34278 [5:58:51<31:28:39, 3.92s/it] 16%|█▌ | 5358/34278 [5:58:54<29:51:53, 3.72s/it] {'loss': 0.1778, 'grad_norm': 0.8982855549872955, 'learning_rate': 9.587528266203505e-06, 'epoch': 0.16} 16%|█▌ | 5358/34278 [5:58:54<29:51:53, 3.72s/it] 16%|█▌ | 5359/34278 [5:58:57<28:15:10, 3.52s/it] {'loss': 0.1855, 'grad_norm': 0.8508686200377119, 'learning_rate': 9.587340347919406e-06, 'epoch': 0.16} 16%|█▌ | 5359/34278 [5:58:57<28:15:10, 3.52s/it] 16%|█▌ | 5360/34278 [5:59:01<27:57:53, 3.48s/it] {'loss': 0.1845, 'grad_norm': 1.0722445709322017, 'learning_rate': 9.587152388680628e-06, 'epoch': 0.16} 16%|█▌ | 5360/34278 [5:59:01<27:57:53, 3.48s/it] 16%|█▌ | 5361/34278 [5:59:04<27:41:12, 3.45s/it] {'loss': 0.1822, 'grad_norm': 0.9633100778239042, 'learning_rate': 9.586964388488849e-06, 'epoch': 0.16} 16%|█▌ | 5361/34278 [5:59:04<27:41:12, 3.45s/it] 16%|█▌ | 5362/34278 [5:59:07<26:55:44, 3.35s/it] 
{'loss': 0.2008, 'grad_norm': 1.1789964108458322, 'learning_rate': 9.586776347345745e-06, 'epoch': 0.16} 16%|█▌ | 5362/34278 [5:59:07<26:55:44, 3.35s/it] 16%|█▌ | 5363/34278 [5:59:13<32:06:28, 4.00s/it] {'loss': 0.1583, 'grad_norm': 0.9854937256141431, 'learning_rate': 9.586588265252999e-06, 'epoch': 0.16} 16%|█▌ | 5363/34278 [5:59:13<32:06:28, 4.00s/it] 16%|█▌ | 5364/34278 [5:59:17<32:34:10, 4.06s/it] {'loss': 0.1722, 'grad_norm': 0.8743567094915753, 'learning_rate': 9.586400142212287e-06, 'epoch': 0.16} 16%|█▌ | 5364/34278 [5:59:17<32:34:10, 4.06s/it] 16%|█▌ | 5365/34278 [5:59:21<32:07:57, 4.00s/it] {'loss': 0.1827, 'grad_norm': 1.2650349148349425, 'learning_rate': 9.58621197822529e-06, 'epoch': 0.16} 16%|█▌ | 5365/34278 [5:59:21<32:07:57, 4.00s/it] 16%|█▌ | 5366/34278 [5:59:23<29:08:56, 3.63s/it] {'loss': 0.1711, 'grad_norm': 0.6444370402660897, 'learning_rate': 9.586023773293687e-06, 'epoch': 0.16} 16%|█▌ | 5366/34278 [5:59:24<29:08:56, 3.63s/it] 16%|█▌ | 5367/34278 [5:59:26<27:34:06, 3.43s/it] {'loss': 0.1852, 'grad_norm': 0.9065177099854337, 'learning_rate': 9.585835527419157e-06, 'epoch': 0.16} 16%|█▌ | 5367/34278 [5:59:26<27:34:06, 3.43s/it] 16%|█▌ | 5368/34278 [5:59:30<27:49:34, 3.47s/it] {'loss': 0.1482, 'grad_norm': 0.9754269080467419, 'learning_rate': 9.585647240603384e-06, 'epoch': 0.16} 16%|█▌ | 5368/34278 [5:59:30<27:49:34, 3.47s/it] 16%|█▌ | 5369/34278 [5:59:35<32:09:42, 4.01s/it] {'loss': 0.1816, 'grad_norm': 0.645413530190737, 'learning_rate': 9.585458912848048e-06, 'epoch': 0.16} 16%|█▌ | 5369/34278 [5:59:35<32:09:42, 4.01s/it] 16%|█▌ | 5370/34278 [5:59:38<30:00:12, 3.74s/it] {'loss': 0.1865, 'grad_norm': 0.7349710144313402, 'learning_rate': 9.585270544154825e-06, 'epoch': 0.16} 16%|█▌ | 5370/34278 [5:59:38<30:00:12, 3.74s/it] 16%|█▌ | 5371/34278 [5:59:43<32:33:10, 4.05s/it] {'loss': 0.1839, 'grad_norm': 0.8156158013215924, 'learning_rate': 9.585082134525405e-06, 'epoch': 0.16} 16%|█▌ | 5371/34278 [5:59:43<32:33:10, 
4.05s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 16%|█▌ | 5372/34278 [5:59:47<32:48:21, 4.09s/it] {'loss': 0.2123, 'grad_norm': 0.7953523323285103, 'learning_rate': 9.584893683961464e-06, 'epoch': 0.16} 16%|█▌ | 5372/34278 [5:59:47<32:48:21, 4.09s/it] 16%|█▌ | 5373/34278 [5:59:50<30:20:46, 3.78s/it] {'loss': 0.1591, 'grad_norm': 0.7576460182732079, 'learning_rate': 9.58470519246469e-06, 'epoch': 0.16} 16%|█▌ | 5373/34278 [5:59:50<30:20:46, 3.78s/it] 16%|█▌ | 5374/34278 [5:59:56<35:20:02, 4.40s/it] {'loss': 0.174, 'grad_norm': 0.7287874841201041, 'learning_rate': 9.58451666003676e-06, 'epoch': 0.16} 16%|█▌ | 5374/34278 [5:59:56<35:20:02, 4.40s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8897 > 8192). Running this sequence through the model will result in indexing errors 16%|█▌ | 5375/34278 [5:59:59<31:55:11, 3.98s/it] {'loss': 0.1652, 'grad_norm': 0.880261036992491, 'learning_rate': 9.58432808667936e-06, 'epoch': 0.16} 16%|█▌ | 5375/34278 [5:59:59<31:55:11, 3.98s/it] 16%|█▌ | 5376/34278 [6:00:02<29:59:00, 3.73s/it] {'loss': 0.1956, 'grad_norm': 0.7200438830764648, 'learning_rate': 9.584139472394173e-06, 'epoch': 0.16} 16%|█▌ | 5376/34278 [6:00:02<29:59:00, 3.73s/it] 16%|█▌ | 5377/34278 [6:00:05<27:58:36, 3.48s/it] {'loss': 0.1845, 'grad_norm': 0.8144091771627586, 'learning_rate': 9.583950817182883e-06, 'epoch': 0.16} 16%|█▌ | 5377/34278 [6:00:05<27:58:36, 3.48s/it] 16%|█▌ | 5378/34278 [6:00:08<26:43:55, 3.33s/it] {'loss': 0.1918, 'grad_norm': 1.0797429701934214, 'learning_rate': 9.583762121047175e-06, 'epoch': 0.16} 16%|█▌ | 5378/34278 [6:00:08<26:43:55, 3.33s/it] 16%|█▌ | 5379/34278 [6:00:13<29:07:03, 3.63s/it] {'loss': 0.236, 'grad_norm': 0.8390675918028163, 'learning_rate': 9.583573383988733e-06, 'epoch': 0.16} 16%|█▌ | 
5379/34278 [6:00:13<29:07:03, 3.63s/it] 16%|█▌ | 5380/34278 [6:00:16<27:59:23, 3.49s/it] {'loss': 0.1648, 'grad_norm': 0.830419468352586, 'learning_rate': 9.583384606009243e-06, 'epoch': 0.16} 16%|█▌ | 5380/34278 [6:00:16<27:59:23, 3.49s/it] 16%|█▌ | 5381/34278 [6:00:22<33:33:24, 4.18s/it] {'loss': 0.1763, 'grad_norm': 0.7595095834019951, 'learning_rate': 9.583195787110387e-06, 'epoch': 0.16} 16%|█▌ | 5381/34278 [6:00:22<33:33:24, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 16%|█▌ | 5382/34278 [6:00:26<33:01:12, 4.11s/it] {'loss': 0.1583, 'grad_norm': 0.8040826984867856, 'learning_rate': 9.583006927293855e-06, 'epoch': 0.16} 16%|█▌ | 5382/34278 [6:00:26<33:01:12, 4.11s/it] 16%|█▌ | 5383/34278 [6:00:30<34:16:14, 4.27s/it] {'loss': 0.1714, 'grad_norm': 0.9431321511186578, 'learning_rate': 9.582818026561332e-06, 'epoch': 0.16} 16%|█▌ | 5383/34278 [6:00:30<34:16:14, 4.27s/it] 16%|█▌ | 5384/34278 [6:00:34<32:23:17, 4.04s/it] {'loss': 0.1668, 'grad_norm': 0.976150288729968, 'learning_rate': 9.5826290849145e-06, 'epoch': 0.16} 16%|█▌ | 5384/34278 [6:00:34<32:23:17, 4.04s/it] 16%|█▌ | 5385/34278 [6:00:37<29:53:20, 3.72s/it] {'loss': 0.1677, 'grad_norm': 0.6976835131639112, 'learning_rate': 9.582440102355052e-06, 'epoch': 0.16} 16%|█▌ | 5385/34278 [6:00:37<29:53:20, 3.72s/it] 16%|█▌ | 5386/34278 [6:00:41<31:45:55, 3.96s/it] {'loss': 0.1982, 'grad_norm': 1.0044919521364357, 'learning_rate': 9.582251078884672e-06, 'epoch': 0.16} 16%|█▌ | 5386/34278 [6:00:41<31:45:55, 3.96s/it] 16%|█▌ | 5387/34278 [6:00:47<35:41:21, 4.45s/it] {'loss': 0.1784, 'grad_norm': 0.8706452783430578, 'learning_rate': 9.58206201450505e-06, 'epoch': 0.16} 16%|█▌ | 5387/34278 [6:00:47<35:41:21, 4.45s/it] 16%|█▌ | 5388/34278 [6:00:53<39:37:18, 4.94s/it] {'loss': 0.168, 'grad_norm': 0.8441239421029116, 'learning_rate': 
9.58187290921787e-06, 'epoch': 0.16} 16%|█▌ | 5388/34278 [6:00:53<39:37:18, 4.94s/it] 16%|█▌ | 5389/34278 [6:00:56<36:33:20, 4.56s/it] {'loss': 0.1832, 'grad_norm': 1.0036271562359225, 'learning_rate': 9.581683763024825e-06, 'epoch': 0.16} 16%|█▌ | 5389/34278 [6:00:57<36:33:20, 4.56s/it] 16%|█▌ | 5390/34278 [6:01:00<33:08:23, 4.13s/it] {'loss': 0.1699, 'grad_norm': 0.7096517906693629, 'learning_rate': 9.5814945759276e-06, 'epoch': 0.16} 16%|█▌ | 5390/34278 [6:01:00<33:08:23, 4.13s/it] 16%|█▌ | 5391/34278 [6:01:03<30:54:52, 3.85s/it] {'loss': 0.1939, 'grad_norm': 0.9350427729280126, 'learning_rate': 9.581305347927883e-06, 'epoch': 0.16} 16%|█▌ | 5391/34278 [6:01:03<30:54:52, 3.85s/it] 16%|█▌ | 5392/34278 [6:01:06<28:57:13, 3.61s/it] {'loss': 0.1818, 'grad_norm': 0.8232720239413225, 'learning_rate': 9.581116079027367e-06, 'epoch': 0.16} 16%|█▌ | 5392/34278 [6:01:06<28:57:13, 3.61s/it] 16%|█▌ | 5393/34278 [6:01:12<34:31:57, 4.30s/it] {'loss': 0.1906, 'grad_norm': 0.8040089296347235, 'learning_rate': 9.580926769227741e-06, 'epoch': 0.16} 16%|█▌ | 5393/34278 [6:01:12<34:31:57, 4.30s/it] 16%|█▌ | 5394/34278 [6:01:15<31:10:06, 3.88s/it] {'loss': 0.1559, 'grad_norm': 0.8168039532541982, 'learning_rate': 9.580737418530693e-06, 'epoch': 0.16} 16%|█▌ | 5394/34278 [6:01:15<31:10:06, 3.88s/it] 16%|█▌ | 5395/34278 [6:01:18<29:57:52, 3.73s/it] {'loss': 0.1698, 'grad_norm': 0.8357454538670241, 'learning_rate': 9.580548026937915e-06, 'epoch': 0.16} 16%|█▌ | 5395/34278 [6:01:18<29:57:52, 3.73s/it] 16%|█▌ | 5396/34278 [6:01:21<28:51:02, 3.60s/it] {'loss': 0.1648, 'grad_norm': 0.8003691384268423, 'learning_rate': 9.580358594451098e-06, 'epoch': 0.16} 16%|█▌ | 5396/34278 [6:01:21<28:51:02, 3.60s/it] 16%|█▌ | 5397/34278 [6:01:25<29:33:57, 3.69s/it] {'loss': 0.182, 'grad_norm': 0.9345653440989051, 'learning_rate': 9.580169121071934e-06, 'epoch': 0.16} 16%|█▌ | 5397/34278 [6:01:25<29:33:57, 3.69s/it] 16%|█▌ | 5398/34278 [6:01:29<30:39:52, 3.82s/it] {'loss': 0.1632, 'grad_norm': 
0.86565735878832, 'learning_rate': 9.579979606802112e-06, 'epoch': 0.16} 16%|█▌ | 5398/34278 [6:01:29<30:39:52, 3.82s/it] 16%|█▌ | 5399/34278 [6:01:33<30:27:40, 3.80s/it] {'loss': 0.2039, 'grad_norm': 0.9821047596796536, 'learning_rate': 9.579790051643325e-06, 'epoch': 0.16} 16%|█▌ | 5399/34278 [6:01:33<30:27:40, 3.80s/it] 16%|█▌ | 5400/34278 [6:01:36<28:10:26, 3.51s/it] {'loss': 0.1609, 'grad_norm': 0.9166169957795464, 'learning_rate': 9.579600455597266e-06, 'epoch': 0.16} 16%|█▌ | 5400/34278 [6:01:36<28:10:26, 3.51s/it] 16%|█▌ | 5401/34278 [6:01:40<28:20:39, 3.53s/it] {'loss': 0.1557, 'grad_norm': 0.9635580581460742, 'learning_rate': 9.579410818665628e-06, 'epoch': 0.16} 16%|█▌ | 5401/34278 [6:01:40<28:20:39, 3.53s/it] 16%|█▌ | 5402/34278 [6:01:42<26:41:22, 3.33s/it] {'loss': 0.1944, 'grad_norm': 1.080706368444192, 'learning_rate': 9.579221140850104e-06, 'epoch': 0.16} 16%|█▌ | 5402/34278 [6:01:42<26:41:22, 3.33s/it] 16%|█▌ | 5403/34278 [6:01:45<26:06:12, 3.25s/it] {'loss': 0.1764, 'grad_norm': 0.927293969695493, 'learning_rate': 9.579031422152387e-06, 'epoch': 0.16} 16%|█▌ | 5403/34278 [6:01:46<26:06:12, 3.25s/it] 16%|█▌ | 5404/34278 [6:01:49<25:34:58, 3.19s/it] {'loss': 0.1652, 'grad_norm': 0.9799503631241685, 'learning_rate': 9.57884166257417e-06, 'epoch': 0.16} 16%|█▌ | 5404/34278 [6:01:49<25:34:58, 3.19s/it] 16%|█▌ | 5405/34278 [6:01:53<27:43:25, 3.46s/it] {'loss': 0.188, 'grad_norm': 1.0106878771654144, 'learning_rate': 9.578651862117148e-06, 'epoch': 0.16} 16%|█▌ | 5405/34278 [6:01:53<27:43:25, 3.46s/it] 16%|█▌ | 5406/34278 [6:01:56<28:25:15, 3.54s/it] {'loss': 0.1628, 'grad_norm': 0.7422579840142393, 'learning_rate': 9.578462020783013e-06, 'epoch': 0.16} 16%|█▌ | 5406/34278 [6:01:56<28:25:15, 3.54s/it] 16%|█▌ | 5407/34278 [6:02:02<34:01:43, 4.24s/it] {'loss': 0.1757, 'grad_norm': 1.3446219409699107, 'learning_rate': 9.578272138573463e-06, 'epoch': 0.16} 16%|█▌ | 5407/34278 [6:02:02<34:01:43, 4.24s/it] 16%|█▌ | 5408/34278 [6:02:05<31:20:36, 3.91s/it] 
{'loss': 0.1895, 'grad_norm': 1.0520443887428077, 'learning_rate': 9.578082215490194e-06, 'epoch': 0.16} 16%|█▌ | 5408/34278 [6:02:05<31:20:36, 3.91s/it] 16%|█▌ | 5409/34278 [6:02:08<29:00:57, 3.62s/it] {'loss': 0.1933, 'grad_norm': 0.8377729339645991, 'learning_rate': 9.577892251534899e-06, 'epoch': 0.16} 16%|█▌ | 5409/34278 [6:02:08<29:00:57, 3.62s/it] 16%|█▌ | 5410/34278 [6:02:11<27:45:34, 3.46s/it] {'loss': 0.1654, 'grad_norm': 0.743670948833949, 'learning_rate': 9.577702246709275e-06, 'epoch': 0.16} 16%|█▌ | 5410/34278 [6:02:11<27:45:34, 3.46s/it] 16%|█▌ | 5411/34278 [6:02:17<32:30:45, 4.05s/it] {'loss': 0.1672, 'grad_norm': 1.030117423630722, 'learning_rate': 9.577512201015017e-06, 'epoch': 0.16} 16%|█▌ | 5411/34278 [6:02:17<32:30:45, 4.05s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 16%|█▌ | 5412/34278 [6:02:20<30:40:32, 3.83s/it] {'loss': 0.1775, 'grad_norm': 0.9125179739569904, 'learning_rate': 9.577322114453823e-06, 'epoch': 0.16} 16%|█▌ | 5412/34278 [6:02:20<30:40:32, 3.83s/it] 16%|█▌ | 5413/34278 [6:02:23<29:15:55, 3.65s/it] {'loss': 0.1844, 'grad_norm': 0.9642911522184233, 'learning_rate': 9.57713198702739e-06, 'epoch': 0.16} 16%|█▌ | 5413/34278 [6:02:23<29:15:55, 3.65s/it] 16%|█▌ | 5414/34278 [6:02:29<33:11:20, 4.14s/it] {'loss': 0.159, 'grad_norm': 0.8439583916956757, 'learning_rate': 9.576941818737417e-06, 'epoch': 0.16} 16%|█▌ | 5414/34278 [6:02:29<33:11:20, 4.14s/it] 16%|█▌ | 5415/34278 [6:02:32<31:34:39, 3.94s/it] {'loss': 0.1573, 'grad_norm': 0.7592182479014896, 'learning_rate': 9.576751609585598e-06, 'epoch': 0.16} 16%|█▌ | 5415/34278 [6:02:32<31:34:39, 3.94s/it] 16%|█▌ | 5416/34278 [6:02:36<30:49:06, 3.84s/it] {'loss': 0.1709, 'grad_norm': 0.9401020753462049, 'learning_rate': 9.576561359573634e-06, 'epoch': 0.16} 16%|█▌ | 5416/34278 [6:02:36<30:49:06, 3.84s/it] 
16%|█▌ | 5417/34278 [6:02:39<28:29:06, 3.55s/it] {'loss': 0.1919, 'grad_norm': 0.9758425237649551, 'learning_rate': 9.576371068703223e-06, 'epoch': 0.16} 16%|█▌ | 5417/34278 [6:02:39<28:29:06, 3.55s/it] 16%|█▌ | 5418/34278 [6:02:42<28:08:44, 3.51s/it] {'loss': 0.1742, 'grad_norm': 0.7927983301156055, 'learning_rate': 9.576180736976063e-06, 'epoch': 0.16} 16%|█▌ | 5418/34278 [6:02:42<28:08:44, 3.51s/it] 16%|█▌ | 5419/34278 [6:02:45<27:00:41, 3.37s/it] {'loss': 0.1855, 'grad_norm': 0.9042913679129185, 'learning_rate': 9.575990364393854e-06, 'epoch': 0.16} 16%|█▌ | 5419/34278 [6:02:45<27:00:41, 3.37s/it] 16%|█▌ | 5420/34278 [6:02:48<26:25:15, 3.30s/it] {'loss': 0.1931, 'grad_norm': 1.0121199501897107, 'learning_rate': 9.575799950958296e-06, 'epoch': 0.16} 16%|█▌ | 5420/34278 [6:02:48<26:25:15, 3.30s/it] 16%|█▌ | 5421/34278 [6:02:51<26:24:05, 3.29s/it] {'loss': 0.1656, 'grad_norm': 0.7928701851499547, 'learning_rate': 9.575609496671087e-06, 'epoch': 0.16} 16%|█▌ | 5421/34278 [6:02:51<26:24:05, 3.29s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9716 > 8192). 
Running this sequence through the model will result in indexing errors 16%|█▌ | 5422/34278 [6:02:55<26:05:34, 3.26s/it] {'loss': 0.1773, 'grad_norm': 0.9124917297475127, 'learning_rate': 9.57541900153393e-06, 'epoch': 0.16} 16%|█▌ | 5422/34278 [6:02:55<26:05:34, 3.26s/it] 16%|█▌ | 5423/34278 [6:02:58<25:52:35, 3.23s/it] {'loss': 0.1692, 'grad_norm': 0.9829388883479042, 'learning_rate': 9.575228465548523e-06, 'epoch': 0.16} 16%|█▌ | 5423/34278 [6:02:58<25:52:35, 3.23s/it] 16%|█▌ | 5424/34278 [6:03:02<28:26:04, 3.55s/it] {'loss': 0.1558, 'grad_norm': 0.7949378524931011, 'learning_rate': 9.57503788871657e-06, 'epoch': 0.16} 16%|█▌ | 5424/34278 [6:03:02<28:26:04, 3.55s/it] 16%|█▌ | 5425/34278 [6:03:06<29:20:54, 3.66s/it] {'loss': 0.1726, 'grad_norm': 0.8954911948894755, 'learning_rate': 9.57484727103977e-06, 'epoch': 0.16} 16%|█▌ | 5425/34278 [6:03:06<29:20:54, 3.66s/it] 16%|█▌ | 5426/34278 [6:03:10<29:09:35, 3.64s/it] {'loss': 0.1642, 'grad_norm': 0.7890650899596373, 'learning_rate': 9.574656612519826e-06, 'epoch': 0.16} 16%|█▌ | 5426/34278 [6:03:10<29:09:35, 3.64s/it] 16%|█▌ | 5427/34278 [6:03:13<27:42:55, 3.46s/it] {'loss': 0.168, 'grad_norm': 0.8069001450485412, 'learning_rate': 9.57446591315844e-06, 'epoch': 0.16} 16%|█▌ | 5427/34278 [6:03:13<27:42:55, 3.46s/it] 16%|█▌ | 5428/34278 [6:03:16<27:50:46, 3.47s/it] {'loss': 0.209, 'grad_norm': 0.9666639566313857, 'learning_rate': 9.574275172957312e-06, 'epoch': 0.16} 16%|█▌ | 5428/34278 [6:03:16<27:50:46, 3.47s/it] 16%|█▌ | 5429/34278 [6:03:19<27:31:11, 3.43s/it] {'loss': 0.1843, 'grad_norm': 0.9358702572827285, 'learning_rate': 9.57408439191815e-06, 'epoch': 0.16} 16%|█▌ | 5429/34278 [6:03:20<27:31:11, 3.43s/it] 16%|█▌ | 5430/34278 [6:03:26<33:58:27, 4.24s/it] {'loss': 0.1591, 'grad_norm': 0.7631703200514474, 'learning_rate': 9.573893570042654e-06, 'epoch': 0.16} 16%|█▌ | 5430/34278 [6:03:26<33:58:27, 4.24s/it] 16%|█▌ | 5431/34278 [6:03:32<37:57:07, 4.74s/it] {'loss': 0.1791, 'grad_norm': 1.1370567339111364, 
[Training log excerpt, steps 5431–5618 of 34278 (16%, epoch 0.16). Each step's tqdm progress-bar line was printed twice by interleaved process output; duplicates collapsed and per-step metric dicts summarized below.]

 16%|█▌ | 5431/34278 [6:03:32<37:57:07, 4.74s/it]
 ...
 16%|█▋ | 5618/34278 [6:15:14<27:30:42, 3.46s/it]

Per-step metrics over this range: loss 0.14–0.21, grad_norm 0.64–1.49, learning_rate decaying from 9.5737e-06 to 9.5373e-06, epoch 0.16 throughout, step time roughly 3.0–5.1 s/it.

Warnings emitted during this range (first occurrence kept verbatim; repeated copies removed):

/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Token indices sequence length is longer than the specified maximum sequence length for this model (11546 > 8192). Running this sequence through the model will result in indexing errors
Running this sequence through the model will result in indexing errors 16%|█▋ | 5619/34278 [6:15:17<26:46:20, 3.36s/it] {'loss': 0.1641, 'grad_norm': 0.828578056936563, 'learning_rate': 9.537096986220177e-06, 'epoch': 0.16} 16%|█▋ | 5619/34278 [6:15:17<26:46:20, 3.36s/it] 16%|█▋ | 5620/34278 [6:15:20<25:26:54, 3.20s/it] {'loss': 0.1866, 'grad_norm': 0.9639280325185331, 'learning_rate': 9.536898436769273e-06, 'epoch': 0.16} 16%|█▋ | 5620/34278 [6:15:20<25:26:54, 3.20s/it] 16%|█▋ | 5621/34278 [6:15:23<25:26:28, 3.20s/it] {'loss': 0.1743, 'grad_norm': 0.7248176740815467, 'learning_rate': 9.536699846814023e-06, 'epoch': 0.16} 16%|█▋ | 5621/34278 [6:15:23<25:26:28, 3.20s/it] 16%|█▋ | 5622/34278 [6:15:26<24:58:23, 3.14s/it] {'loss': 0.1711, 'grad_norm': 0.8192479076642077, 'learning_rate': 9.536501216356198e-06, 'epoch': 0.16} 16%|█▋ | 5622/34278 [6:15:26<24:58:23, 3.14s/it] 16%|█▋ | 5623/34278 [6:15:29<25:32:24, 3.21s/it] {'loss': 0.2011, 'grad_norm': 0.8799247150203047, 'learning_rate': 9.536302545397575e-06, 'epoch': 0.16} 16%|█▋ | 5623/34278 [6:15:29<25:32:24, 3.21s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10228 > 8192). 
Running this sequence through the model will result in indexing errors 16%|█▋ | 5624/34278 [6:15:36<32:49:52, 4.12s/it] {'loss': 0.1714, 'grad_norm': 0.7990696505438856, 'learning_rate': 9.536103833939924e-06, 'epoch': 0.16} 16%|█▋ | 5624/34278 [6:15:36<32:49:52, 4.12s/it] 16%|█▋ | 5625/34278 [6:15:39<30:19:23, 3.81s/it] {'loss': 0.1907, 'grad_norm': 0.8583153072459949, 'learning_rate': 9.535905081985022e-06, 'epoch': 0.16} 16%|█▋ | 5625/34278 [6:15:39<30:19:23, 3.81s/it] 16%|█▋ | 5626/34278 [6:15:45<35:43:08, 4.49s/it] {'loss': 0.149, 'grad_norm': 0.8126369465430842, 'learning_rate': 9.53570628953464e-06, 'epoch': 0.16} 16%|█▋ | 5626/34278 [6:15:45<35:43:08, 4.49s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 16%|█▋ | 5627/34278 [6:15:50<38:45:04, 4.87s/it] {'loss': 0.1939, 'grad_norm': 0.8111059379127915, 'learning_rate': 9.535507456590559e-06, 'epoch': 0.16} 16%|█▋ | 5627/34278 [6:15:50<38:45:04, 4.87s/it] 16%|█▋ | 5628/34278 [6:15:54<34:25:21, 4.33s/it] {'loss': 0.1913, 'grad_norm': 0.7953149377586634, 'learning_rate': 9.535308583154546e-06, 'epoch': 0.16} 16%|█▋ | 5628/34278 [6:15:54<34:25:21, 4.33s/it] 16%|█▋ | 5629/34278 [6:15:56<31:08:48, 3.91s/it] {'loss': 0.1657, 'grad_norm': 1.0358783893962589, 'learning_rate': 9.535109669228383e-06, 'epoch': 0.16} 16%|█▋ | 5629/34278 [6:15:57<31:08:48, 3.91s/it] 16%|█▋ | 5630/34278 [6:16:01<31:46:55, 3.99s/it] {'loss': 0.1766, 'grad_norm': 0.9209626254686679, 'learning_rate': 9.534910714813843e-06, 'epoch': 0.16} 16%|█▋ | 5630/34278 [6:16:01<31:46:55, 3.99s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11555 > 8192). 
Running this sequence through the model will result in indexing errors 16%|█▋ | 5631/34278 [6:16:06<35:19:26, 4.44s/it] {'loss': 0.171, 'grad_norm': 0.7619437338287154, 'learning_rate': 9.534711719912701e-06, 'epoch': 0.16} 16%|█▋ | 5631/34278 [6:16:06<35:19:26, 4.44s/it] 16%|█▋ | 5632/34278 [6:16:09<31:50:05, 4.00s/it] {'loss': 0.1589, 'grad_norm': 1.101362466756387, 'learning_rate': 9.534512684526738e-06, 'epoch': 0.16} 16%|█▋ | 5632/34278 [6:16:09<31:50:05, 4.00s/it] 16%|█▋ | 5633/34278 [6:16:13<30:45:04, 3.86s/it] {'loss': 0.1726, 'grad_norm': 0.9187218551616347, 'learning_rate': 9.534313608657728e-06, 'epoch': 0.16} 16%|█▋ | 5633/34278 [6:16:13<30:45:04, 3.86s/it] 16%|█▋ | 5634/34278 [6:16:16<29:01:22, 3.65s/it] {'loss': 0.1282, 'grad_norm': 0.6625857365164486, 'learning_rate': 9.534114492307447e-06, 'epoch': 0.16} 16%|█▋ | 5634/34278 [6:16:16<29:01:22, 3.65s/it] 16%|█▋ | 5635/34278 [6:16:19<28:52:07, 3.63s/it] {'loss': 0.2211, 'grad_norm': 1.0032110101859075, 'learning_rate': 9.533915335477675e-06, 'epoch': 0.16} 16%|█▋ | 5635/34278 [6:16:19<28:52:07, 3.63s/it] 16%|█▋ | 5636/34278 [6:16:23<27:46:06, 3.49s/it] {'loss': 0.1721, 'grad_norm': 0.9061869707204406, 'learning_rate': 9.53371613817019e-06, 'epoch': 0.16} 16%|█▋ | 5636/34278 [6:16:23<27:46:06, 3.49s/it] 16%|█▋ | 5637/34278 [6:16:28<31:37:56, 3.98s/it] {'loss': 0.1486, 'grad_norm': 0.8174193509889471, 'learning_rate': 9.533516900386768e-06, 'epoch': 0.16} 16%|█▋ | 5637/34278 [6:16:28<31:37:56, 3.98s/it] 16%|█▋ | 5638/34278 [6:16:32<32:35:25, 4.10s/it] {'loss': 0.195, 'grad_norm': 0.848974270108626, 'learning_rate': 9.53331762212919e-06, 'epoch': 0.16} 16%|█▋ | 5638/34278 [6:16:32<32:35:25, 4.10s/it] 16%|█▋ | 5639/34278 [6:16:35<29:29:17, 3.71s/it] {'loss': 0.1512, 'grad_norm': 0.8537514937918482, 'learning_rate': 9.533118303399234e-06, 'epoch': 0.16} 16%|█▋ | 5639/34278 [6:16:35<29:29:17, 3.71s/it] 16%|█▋ | 5640/34278 [6:16:39<29:47:05, 3.74s/it] {'loss': 0.1543, 'grad_norm': 0.7619147168312305, 
'learning_rate': 9.53291894419868e-06, 'epoch': 0.16} 16%|█▋ | 5640/34278 [6:16:39<29:47:05, 3.74s/it] 16%|█▋ | 5641/34278 [6:16:42<29:33:38, 3.72s/it] {'loss': 0.1475, 'grad_norm': 0.7312178056896909, 'learning_rate': 9.53271954452931e-06, 'epoch': 0.16} 16%|█▋ | 5641/34278 [6:16:42<29:33:38, 3.72s/it] 16%|█▋ | 5642/34278 [6:16:46<28:37:22, 3.60s/it] {'loss': 0.1967, 'grad_norm': 0.8892306734556975, 'learning_rate': 9.5325201043929e-06, 'epoch': 0.16} 16%|█▋ | 5642/34278 [6:16:46<28:37:22, 3.60s/it] 16%|█▋ | 5643/34278 [6:16:49<26:54:53, 3.38s/it] {'loss': 0.1904, 'grad_norm': 0.8135042187040571, 'learning_rate': 9.53232062379123e-06, 'epoch': 0.16} 16%|█▋ | 5643/34278 [6:16:49<26:54:53, 3.38s/it] 16%|█▋ | 5644/34278 [6:16:52<27:16:51, 3.43s/it] {'loss': 0.1686, 'grad_norm': 0.8037126020990946, 'learning_rate': 9.532121102726088e-06, 'epoch': 0.16} 16%|█▋ | 5644/34278 [6:16:52<27:16:51, 3.43s/it] 16%|█▋ | 5645/34278 [6:16:56<27:46:02, 3.49s/it] {'loss': 0.1906, 'grad_norm': 0.8641625415937917, 'learning_rate': 9.531921541199249e-06, 'epoch': 0.16} 16%|█▋ | 5645/34278 [6:16:56<27:46:02, 3.49s/it] 16%|█▋ | 5646/34278 [6:16:59<28:10:54, 3.54s/it] {'loss': 0.1827, 'grad_norm': 1.0199354918080799, 'learning_rate': 9.531721939212497e-06, 'epoch': 0.16} 16%|█▋ | 5646/34278 [6:16:59<28:10:54, 3.54s/it] 16%|█▋ | 5647/34278 [6:17:04<30:29:38, 3.83s/it] {'loss': 0.1537, 'grad_norm': 0.8532780844787409, 'learning_rate': 9.53152229676761e-06, 'epoch': 0.16} 16%|█▋ | 5647/34278 [6:17:04<30:29:38, 3.83s/it] 16%|█▋ | 5648/34278 [6:17:07<28:48:44, 3.62s/it] {'loss': 0.1719, 'grad_norm': 0.9380098937926005, 'learning_rate': 9.531322613866378e-06, 'epoch': 0.16} 16%|█▋ | 5648/34278 [6:17:07<28:48:44, 3.62s/it] 16%|█▋ | 5649/34278 [6:17:11<29:25:15, 3.70s/it] {'loss': 0.167, 'grad_norm': 0.7853122808500244, 'learning_rate': 9.531122890510577e-06, 'epoch': 0.16} 16%|█▋ | 5649/34278 [6:17:11<29:25:15, 3.70s/it] 16%|█▋ | 5650/34278 [6:17:14<28:11:40, 3.55s/it] {'loss': 0.1777, 
'grad_norm': 0.9987451694364654, 'learning_rate': 9.530923126701994e-06, 'epoch': 0.16} 16%|█▋ | 5650/34278 [6:17:14<28:11:40, 3.55s/it] 16%|█▋ | 5651/34278 [6:17:20<33:58:20, 4.27s/it] {'loss': 0.1762, 'grad_norm': 1.13444692949112, 'learning_rate': 9.530723322442408e-06, 'epoch': 0.16} 16%|█▋ | 5651/34278 [6:17:20<33:58:20, 4.27s/it] 16%|█▋ | 5652/34278 [6:17:23<30:53:40, 3.89s/it] {'loss': 0.1584, 'grad_norm': 0.9127974602991187, 'learning_rate': 9.530523477733608e-06, 'epoch': 0.16} 16%|█▋ | 5652/34278 [6:17:23<30:53:40, 3.89s/it] 16%|█▋ | 5653/34278 [6:17:26<28:44:55, 3.62s/it] {'loss': 0.1649, 'grad_norm': 0.98414473118197, 'learning_rate': 9.530323592577376e-06, 'epoch': 0.16} 16%|█▋ | 5653/34278 [6:17:26<28:44:55, 3.62s/it] 16%|█▋ | 5654/34278 [6:17:29<28:16:24, 3.56s/it] {'loss': 0.1655, 'grad_norm': 0.9241482100282952, 'learning_rate': 9.530123666975498e-06, 'epoch': 0.16} 16%|█▋ | 5654/34278 [6:17:29<28:16:24, 3.56s/it] 16%|█▋ | 5655/34278 [6:17:33<27:57:44, 3.52s/it] {'loss': 0.1782, 'grad_norm': 0.8354171263977225, 'learning_rate': 9.529923700929753e-06, 'epoch': 0.16} 16%|█▋ | 5655/34278 [6:17:33<27:57:44, 3.52s/it] 17%|█▋ | 5656/34278 [6:17:36<26:48:43, 3.37s/it] {'loss': 0.1907, 'grad_norm': 1.1181879737222529, 'learning_rate': 9.529723694441935e-06, 'epoch': 0.17} 17%|█▋ | 5656/34278 [6:17:36<26:48:43, 3.37s/it] 17%|█▋ | 5657/34278 [6:17:39<26:09:49, 3.29s/it] {'loss': 0.1565, 'grad_norm': 0.7588624883101615, 'learning_rate': 9.529523647513824e-06, 'epoch': 0.17} 17%|█▋ | 5657/34278 [6:17:39<26:09:49, 3.29s/it] 17%|█▋ | 5658/34278 [6:17:42<26:13:54, 3.30s/it] {'loss': 0.1788, 'grad_norm': 0.9046596801968325, 'learning_rate': 9.529323560147204e-06, 'epoch': 0.17} 17%|█▋ | 5658/34278 [6:17:42<26:13:54, 3.30s/it] 17%|█▋ | 5659/34278 [6:17:46<26:53:37, 3.38s/it] {'loss': 0.1992, 'grad_norm': 0.8841903567900284, 'learning_rate': 9.529123432343868e-06, 'epoch': 0.17} 17%|█▋ | 5659/34278 [6:17:46<26:53:37, 3.38s/it] 17%|█▋ | 5660/34278 [6:17:49<26:40:12, 
3.35s/it] {'loss': 0.158, 'grad_norm': 0.8972858405509329, 'learning_rate': 9.528923264105597e-06, 'epoch': 0.17} 17%|█▋ | 5660/34278 [6:17:49<26:40:12, 3.35s/it] 17%|█▋ | 5661/34278 [6:17:55<32:56:32, 4.14s/it] {'loss': 0.1641, 'grad_norm': 0.8204773792832021, 'learning_rate': 9.528723055434182e-06, 'epoch': 0.17} 17%|█▋ | 5661/34278 [6:17:55<32:56:32, 4.14s/it] 17%|█▋ | 5662/34278 [6:17:58<29:36:49, 3.73s/it] {'loss': 0.186, 'grad_norm': 0.8422016494152981, 'learning_rate': 9.528522806331409e-06, 'epoch': 0.17} 17%|█▋ | 5662/34278 [6:17:58<29:36:49, 3.73s/it] 17%|█▋ | 5663/34278 [6:18:01<27:47:17, 3.50s/it] {'loss': 0.1788, 'grad_norm': 1.0036955772986154, 'learning_rate': 9.528322516799064e-06, 'epoch': 0.17} 17%|█▋ | 5663/34278 [6:18:01<27:47:17, 3.50s/it] 17%|█▋ | 5664/34278 [6:18:05<28:39:58, 3.61s/it] {'loss': 0.1682, 'grad_norm': 0.8658475774572192, 'learning_rate': 9.528122186838935e-06, 'epoch': 0.17} 17%|█▋ | 5664/34278 [6:18:05<28:39:58, 3.61s/it] 17%|█▋ | 5665/34278 [6:18:10<32:32:34, 4.09s/it] {'loss': 0.196, 'grad_norm': 0.9343986033765772, 'learning_rate': 9.527921816452815e-06, 'epoch': 0.17} 17%|█▋ | 5665/34278 [6:18:10<32:32:34, 4.09s/it] 17%|█▋ | 5666/34278 [6:18:14<31:56:59, 4.02s/it] {'loss': 0.1598, 'grad_norm': 0.9523178297434219, 'learning_rate': 9.527721405642489e-06, 'epoch': 0.17} 17%|█▋ | 5666/34278 [6:18:14<31:56:59, 4.02s/it] 17%|█▋ | 5667/34278 [6:18:17<30:29:24, 3.84s/it] {'loss': 0.1837, 'grad_norm': 0.8053235060679232, 'learning_rate': 9.527520954409748e-06, 'epoch': 0.17} 17%|█▋ | 5667/34278 [6:18:17<30:29:24, 3.84s/it] 17%|█▋ | 5668/34278 [6:18:23<36:06:24, 4.54s/it] {'loss': 0.1632, 'grad_norm': 0.9254610811105273, 'learning_rate': 9.527320462756379e-06, 'epoch': 0.17} 17%|█▋ | 5668/34278 [6:18:23<36:06:24, 4.54s/it] 17%|█▋ | 5669/34278 [6:18:27<33:58:43, 4.28s/it] {'loss': 0.1702, 'grad_norm': 0.8281553470358143, 'learning_rate': 9.527119930684174e-06, 'epoch': 0.17} 17%|█▋ | 5669/34278 [6:18:27<33:58:43, 4.28s/it] 17%|█▋ | 
5670/34278 [6:18:31<32:06:25, 4.04s/it] {'loss': 0.1847, 'grad_norm': 0.6681916431569204, 'learning_rate': 9.526919358194923e-06, 'epoch': 0.17} 17%|█▋ | 5670/34278 [6:18:31<32:06:25, 4.04s/it] 17%|█▋ | 5671/34278 [6:18:34<30:34:04, 3.85s/it] {'loss': 0.1643, 'grad_norm': 0.8700015870060569, 'learning_rate': 9.526718745290418e-06, 'epoch': 0.17} 17%|█▋ | 5671/34278 [6:18:34<30:34:04, 3.85s/it] 17%|█▋ | 5672/34278 [6:18:40<34:56:49, 4.40s/it] {'loss': 0.188, 'grad_norm': 0.9119618151662121, 'learning_rate': 9.526518091972447e-06, 'epoch': 0.17} 17%|█▋ | 5672/34278 [6:18:40<34:56:49, 4.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 17%|█▋ | 5673/34278 [6:18:43<32:07:07, 4.04s/it] {'loss': 0.1717, 'grad_norm': 0.7895264606281064, 'learning_rate': 9.526317398242803e-06, 'epoch': 0.17} 17%|█▋ | 5673/34278 [6:18:43<32:07:07, 4.04s/it] 17%|█▋ | 5674/34278 [6:18:46<30:03:34, 3.78s/it] {'loss': 0.1682, 'grad_norm': 0.8419253020973396, 'learning_rate': 9.52611666410328e-06, 'epoch': 0.17} 17%|█▋ | 5674/34278 [6:18:46<30:03:34, 3.78s/it] 17%|█▋ | 5675/34278 [6:18:49<28:00:03, 3.52s/it] {'loss': 0.1559, 'grad_norm': 0.9489149000767232, 'learning_rate': 9.525915889555666e-06, 'epoch': 0.17} 17%|█▋ | 5675/34278 [6:18:49<28:00:03, 3.52s/it] 17%|█▋ | 5676/34278 [6:18:53<29:01:31, 3.65s/it] {'loss': 0.1551, 'grad_norm': 1.1362243233797817, 'learning_rate': 9.525715074601756e-06, 'epoch': 0.17} 17%|█▋ | 5676/34278 [6:18:53<29:01:31, 3.65s/it] 17%|█▋ | 5677/34278 [6:18:56<27:44:08, 3.49s/it] {'loss': 0.1671, 'grad_norm': 0.8007023063084524, 'learning_rate': 9.525514219243342e-06, 'epoch': 0.17} 17%|█▋ | 5677/34278 [6:18:56<27:44:08, 3.49s/it] 17%|█▋ | 5678/34278 [6:19:00<28:40:55, 3.61s/it] {'loss': 0.1884, 'grad_norm': 0.8760805163055179, 'learning_rate': 9.525313323482217e-06, 'epoch': 0.17} 17%|█▋ | 5678/34278 [6:19:00<28:40:55, 3.61s/it] 17%|█▋ | 5679/34278 [6:19:04<30:43:46, 3.87s/it] {'loss': 0.1659, 'grad_norm': 0.7956839749398408, 'learning_rate': 9.525112387320177e-06, 'epoch': 0.17} 17%|█▋ | 5679/34278 [6:19:04<30:43:46, 3.87s/it] 17%|█▋ | 5680/34278 [6:19:08<29:34:40, 3.72s/it] {'loss': 0.1632, 'grad_norm': 0.970887167937794, 'learning_rate': 9.524911410759012e-06, 'epoch': 0.17} 17%|█▋ | 5680/34278 [6:19:08<29:34:40, 3.72s/it] 17%|█▋ | 5681/34278 [6:19:11<27:45:05, 3.49s/it] {'loss': 0.1861, 'grad_norm': 1.3408322576460419, 'learning_rate': 9.524710393800518e-06, 'epoch': 0.17} 17%|█▋ | 5681/34278 [6:19:11<27:45:05, 3.49s/it] 17%|█▋ | 5682/34278 [6:19:17<33:29:02, 4.22s/it] {'loss': 0.1553, 'grad_norm': 0.9953309589317046, 'learning_rate': 
9.524509336446489e-06, 'epoch': 0.17} 17%|█▋ | 5682/34278 [6:19:17<33:29:02, 4.22s/it] 17%|█▋ | 5683/34278 [6:19:20<31:40:43, 3.99s/it] {'loss': 0.1746, 'grad_norm': 1.0691731422030517, 'learning_rate': 9.524308238698723e-06, 'epoch': 0.17} 17%|█▋ | 5683/34278 [6:19:20<31:40:43, 3.99s/it] 17%|█▋ | 5684/34278 [6:19:23<29:37:06, 3.73s/it] {'loss': 0.1469, 'grad_norm': 0.9751236644786784, 'learning_rate': 9.52410710055901e-06, 'epoch': 0.17} 17%|█▋ | 5684/34278 [6:19:23<29:37:06, 3.73s/it] 17%|█▋ | 5685/34278 [6:19:26<28:14:06, 3.55s/it] {'loss': 0.1604, 'grad_norm': 1.376478960450591, 'learning_rate': 9.52390592202915e-06, 'epoch': 0.17} 17%|█▋ | 5685/34278 [6:19:26<28:14:06, 3.55s/it] 17%|█▋ | 5686/34278 [6:19:29<26:53:55, 3.39s/it] {'loss': 0.181, 'grad_norm': 0.9059184036656142, 'learning_rate': 9.523704703110939e-06, 'epoch': 0.17} 17%|█▋ | 5686/34278 [6:19:29<26:53:55, 3.39s/it] 17%|█▋ | 5687/34278 [6:19:32<25:32:04, 3.22s/it] {'loss': 0.1871, 'grad_norm': 0.9531920355015756, 'learning_rate': 9.523503443806173e-06, 'epoch': 0.17} 17%|█▋ | 5687/34278 [6:19:32<25:32:04, 3.22s/it] 17%|█▋ | 5688/34278 [6:19:37<28:29:29, 3.59s/it] {'loss': 0.1468, 'grad_norm': 1.3610865414447926, 'learning_rate': 9.523302144116647e-06, 'epoch': 0.17} 17%|█▋ | 5688/34278 [6:19:37<28:29:29, 3.59s/it] 17%|█▋ | 5689/34278 [6:19:40<27:20:23, 3.44s/it] {'loss': 0.1886, 'grad_norm': 0.9879132279272322, 'learning_rate': 9.523100804044159e-06, 'epoch': 0.17} 17%|█▋ | 5689/34278 [6:19:40<27:20:23, 3.44s/it] 17%|█▋ | 5690/34278 [6:19:44<28:45:21, 3.62s/it] {'loss': 0.17, 'grad_norm': 0.9880328281509605, 'learning_rate': 9.522899423590507e-06, 'epoch': 0.17} 17%|█▋ | 5690/34278 [6:19:44<28:45:21, 3.62s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8653 > 8192). 
Running this sequence through the model will result in indexing errors 17%|█▋ | 5691/34278 [6:19:48<29:44:59, 3.75s/it] {'loss': 0.1827, 'grad_norm': 0.8304979236237056, 'learning_rate': 9.52269800275749e-06, 'epoch': 0.17} 17%|█▋ | 5691/34278 [6:19:48<29:44:59, 3.75s/it] 17%|█▋ | 5692/34278 [6:19:51<28:20:45, 3.57s/it] {'loss': 0.1884, 'grad_norm': 0.9552655816701444, 'learning_rate': 9.522496541546901e-06, 'epoch': 0.17} 17%|█▋ | 5692/34278 [6:19:51<28:20:45, 3.57s/it] 17%|█▋ | 5693/34278 [6:19:54<27:15:08, 3.43s/it] {'loss': 0.1643, 'grad_norm': 0.8358397925988778, 'learning_rate': 9.522295039960544e-06, 'epoch': 0.17} 17%|█▋ | 5693/34278 [6:19:54<27:15:08, 3.43s/it] 17%|█▋ | 5694/34278 [6:19:57<26:20:36, 3.32s/it] {'loss': 0.1639, 'grad_norm': 0.6558762570348208, 'learning_rate': 9.522093498000218e-06, 'epoch': 0.17} 17%|█▋ | 5694/34278 [6:19:57<26:20:36, 3.32s/it] 17%|█▋ | 5695/34278 [6:20:00<25:58:18, 3.27s/it] {'loss': 0.1665, 'grad_norm': 0.9007862062668994, 'learning_rate': 9.521891915667722e-06, 'epoch': 0.17} 17%|█▋ | 5695/34278 [6:20:00<25:58:18, 3.27s/it] 17%|█▋ | 5696/34278 [6:20:04<26:49:01, 3.38s/it] {'loss': 0.1949, 'grad_norm': 0.8837016889555012, 'learning_rate': 9.52169029296485e-06, 'epoch': 0.17} 17%|█▋ | 5696/34278 [6:20:04<26:49:01, 3.38s/it] 17%|█▋ | 5697/34278 [6:20:07<26:23:31, 3.32s/it] {'loss': 0.1772, 'grad_norm': 0.7108733949655556, 'learning_rate': 9.521488629893411e-06, 'epoch': 0.17} 17%|█▋ | 5697/34278 [6:20:07<26:23:31, 3.32s/it] 17%|█▋ | 5698/34278 [6:20:10<26:03:28, 3.28s/it] {'loss': 0.1789, 'grad_norm': 1.7374926645259825, 'learning_rate': 9.521286926455198e-06, 'epoch': 0.17} 17%|█▋ | 5698/34278 [6:20:10<26:03:28, 3.28s/it] 17%|█▋ | 5699/34278 [6:20:16<31:46:10, 4.00s/it] {'loss': 0.1643, 'grad_norm': 1.0873970809436402, 'learning_rate': 9.521085182652016e-06, 'epoch': 0.17} 17%|█▋ | 5699/34278 [6:20:16<31:46:10, 4.00s/it] 17%|█▋ | 5700/34278 [6:20:19<29:50:32, 3.76s/it] {'loss': 0.1598, 'grad_norm': 0.9235896302946089, 
'learning_rate': 9.520883398485665e-06, 'epoch': 0.17} 17%|█▋ | 5700/34278 [6:20:19<29:50:32, 3.76s/it] 17%|█▋ | 5701/34278 [6:20:23<30:53:25, 3.89s/it] {'loss': 0.1547, 'grad_norm': 0.8480519275285644, 'learning_rate': 9.520681573957944e-06, 'epoch': 0.17} 17%|█▋ | 5701/34278 [6:20:23<30:53:25, 3.89s/it] 17%|█▋ | 5702/34278 [6:20:26<28:51:16, 3.64s/it] {'loss': 0.1657, 'grad_norm': 0.8684837851224382, 'learning_rate': 9.520479709070661e-06, 'epoch': 0.17} 17%|█▋ | 5702/34278 [6:20:26<28:51:16, 3.64s/it] 17%|█▋ | 5703/34278 [6:20:32<33:53:09, 4.27s/it] {'loss': 0.209, 'grad_norm': 0.8678892323263708, 'learning_rate': 9.52027780382561e-06, 'epoch': 0.17} 17%|█▋ | 5703/34278 [6:20:32<33:53:09, 4.27s/it] 17%|█▋ | 5704/34278 [6:20:36<31:53:51, 4.02s/it] {'loss': 0.1808, 'grad_norm': 0.7701939936552961, 'learning_rate': 9.5200758582246e-06, 'epoch': 0.17} 17%|█▋ | 5704/34278 [6:20:36<31:53:51, 4.02s/it] 17%|█▋ | 5705/34278 [6:20:39<29:55:47, 3.77s/it] {'loss': 0.1687, 'grad_norm': 0.7743930184201232, 'learning_rate': 9.519873872269431e-06, 'epoch': 0.17} 17%|█▋ | 5705/34278 [6:20:39<29:55:47, 3.77s/it] 17%|█▋ | 5706/34278 [6:20:45<35:02:18, 4.41s/it] {'loss': 0.1751, 'grad_norm': 0.8668160425028448, 'learning_rate': 9.519671845961908e-06, 'epoch': 0.17} 17%|█▋ | 5706/34278 [6:20:45<35:02:18, 4.41s/it] 17%|█▋ | 5707/34278 [6:20:48<31:53:08, 4.02s/it] {'loss': 0.1701, 'grad_norm': 1.0076935631153972, 'learning_rate': 9.519469779303833e-06, 'epoch': 0.17} 17%|█▋ | 5707/34278 [6:20:48<31:53:08, 4.02s/it] 17%|█▋ | 5708/34278 [6:20:51<30:03:54, 3.79s/it] {'loss': 0.1487, 'grad_norm': 0.7363457026736757, 'learning_rate': 9.519267672297013e-06, 'epoch': 0.17} 17%|█▋ | 5708/34278 [6:20:51<30:03:54, 3.79s/it] 17%|█▋ | 5709/34278 [6:20:55<31:14:10, 3.94s/it] {'loss': 0.1532, 'grad_norm': 0.8646562493640063, 'learning_rate': 9.519065524943247e-06, 'epoch': 0.17} 17%|█▋ | 5709/34278 [6:20:55<31:14:10, 
3.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 17%|█▋ | 5710/34278 [6:20:58<29:01:48, 3.66s/it] {'loss': 0.1767, 'grad_norm': 0.9669843846088726, 'learning_rate': 9.518863337244344e-06, 'epoch': 0.17} 17%|█▋ | 5710/34278 [6:20:58<29:01:48, 3.66s/it] 17%|█▋ | 5711/34278 [6:21:01<27:25:01, 3.46s/it] {'loss': 0.149, 'grad_norm': 0.7631156015200443, 'learning_rate': 9.518661109202107e-06, 'epoch': 0.17} 17%|█▋ | 5711/34278 [6:21:01<27:25:01, 3.46s/it] 17%|█▋ | 5712/34278 [6:21:04<26:45:58, 3.37s/it] {'loss': 0.1494, 'grad_norm': 0.9086542199023313, 'learning_rate': 9.518458840818343e-06, 'epoch': 0.17} 17%|█▋ | 5712/34278 [6:21:05<26:45:58, 3.37s/it] 17%|█▋ | 5713/34278 [6:21:09<28:17:45, 3.57s/it] {'loss': 0.1917, 'grad_norm': 0.9339741301798014, 'learning_rate': 9.518256532094859e-06, 'epoch': 0.17} 17%|█▋ | 5713/34278 [6:21:09<28:17:45, 3.57s/it] 17%|█▋ | 5714/34278 [6:21:12<27:36:12, 3.48s/it] {'loss': 0.1496, 'grad_norm': 0.9002561742502497, 'learning_rate': 9.518054183033456e-06, 'epoch': 0.17} 17%|█▋ | 5714/34278 [6:21:12<27:36:12, 3.48s/it] 17%|█▋ | 5715/34278 [6:21:16<29:25:06, 3.71s/it] {'loss': 0.1735, 'grad_norm': 0.8132471465484027, 'learning_rate': 9.517851793635946e-06, 'epoch': 0.17} 17%|█▋ | 5715/34278 [6:21:16<29:25:06, 3.71s/it] 17%|█▋ | 5716/34278 [6:21:19<28:03:44, 3.54s/it] {'loss': 0.1886, 'grad_norm': 0.8527596315922845, 'learning_rate': 9.517649363904132e-06, 'epoch': 0.17} 17%|█▋ | 5716/34278 [6:21:19<28:03:44, 3.54s/it] 17%|█▋ | 5717/34278 [6:21:22<27:19:17, 3.44s/it] {'loss': 0.1692, 'grad_norm': 0.8714980015972134, 'learning_rate': 9.517446893839824e-06, 'epoch': 0.17} 17%|█▋ | 5717/34278 [6:21:22<27:19:17, 3.44s/it] 17%|█▋ | 5718/34278 [6:21:28<33:20:58, 4.20s/it] {'loss': 0.1681, 'grad_norm': 0.8587323457269065, 'learning_rate': 9.517244383444829e-06, 
'epoch': 0.17} 17%|█▋ | 5718/34278 [6:21:28<33:20:58, 4.20s/it] 17%|█▋ | 5719/34278 [6:21:31<30:22:04, 3.83s/it] {'loss': 0.1441, 'grad_norm': 0.9271190757581552, 'learning_rate': 9.517041832720953e-06, 'epoch': 0.17} 17%|█▋ | 5719/34278 [6:21:31<30:22:04, 3.83s/it] 17%|█▋ | 5720/34278 [6:21:36<33:25:50, 4.21s/it] {'loss': 0.1813, 'grad_norm': 0.9111026655110478, 'learning_rate': 9.516839241670006e-06, 'epoch': 0.17} 17%|█▋ | 5720/34278 [6:21:36<33:25:50, 4.21s/it] 17%|█▋ | 5721/34278 [6:21:40<31:28:14, 3.97s/it] {'loss': 0.1993, 'grad_norm': 0.8138309657015205, 'learning_rate': 9.516636610293798e-06, 'epoch': 0.17} 17%|█▋ | 5721/34278 [6:21:40<31:28:14, 3.97s/it] 17%|█▋ | 5722/34278 [6:21:46<36:20:04, 4.58s/it] {'loss': 0.1764, 'grad_norm': 0.9979093129062674, 'learning_rate': 9.516433938594137e-06, 'epoch': 0.17} 17%|█▋ | 5722/34278 [6:21:46<36:20:04, 4.58s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12684 > 8192). Running this sequence through the model will result in indexing errors 17%|█▋ | 5723/34278 [6:21:49<32:42:12, 4.12s/it] {'loss': 0.1888, 'grad_norm': 0.8876233363953347, 'learning_rate': 9.51623122657283e-06, 'epoch': 0.17} 17%|█▋ | 5723/34278 [6:21:49<32:42:12, 4.12s/it] 17%|█▋ | 5724/34278 [6:21:53<31:37:41, 3.99s/it] {'loss': 0.1652, 'grad_norm': 0.7661865290715371, 'learning_rate': 9.516028474231689e-06, 'epoch': 0.17} 17%|█▋ | 5724/34278 [6:21:53<31:37:41, 3.99s/it] 17%|█▋ | 5725/34278 [6:21:56<30:10:08, 3.80s/it] {'loss': 0.1705, 'grad_norm': 0.7970528798299262, 'learning_rate': 9.515825681572523e-06, 'epoch': 0.17} 17%|█▋ | 5725/34278 [6:21:56<30:10:08, 3.80s/it] 17%|█▋ | 5726/34278 [6:22:02<34:45:44, 4.38s/it] {'loss': 0.1685, 'grad_norm': 0.8731151303887534, 'learning_rate': 9.515622848597145e-06, 'epoch': 0.17} 17%|█▋ | 5726/34278 [6:22:02<34:45:44, 4.38s/it] 17%|█▋ | 5727/34278 [6:22:04<30:37:56, 3.86s/it] {'loss': 0.2072, 'grad_norm': 0.8423270951003244, 'learning_rate': 
9.515419975307365e-06, 'epoch': 0.17} 17%|█▋ | 5727/34278 [6:22:04<30:37:56, 3.86s/it] 17%|█▋ | 5728/34278 [6:22:08<29:10:41, 3.68s/it] {'loss': 0.1861, 'grad_norm': 0.9563534421355293, 'learning_rate': 9.515217061704991e-06, 'epoch': 0.17} 17%|█▋ | 5728/34278 [6:22:08<29:10:41, 3.68s/it] 17%|█▋ | 5729/34278 [6:22:12<31:59:32, 4.03s/it] {'loss': 0.1862, 'grad_norm': 1.0794263591197344, 'learning_rate': 9.515014107791839e-06, 'epoch': 0.17} 17%|█▋ | 5729/34278 [6:22:12<31:59:32, 4.03s/it] 17%|█▋ | 5730/34278 [6:22:15<29:01:43, 3.66s/it] {'loss': 0.167, 'grad_norm': 0.8482002265219688, 'learning_rate': 9.514811113569718e-06, 'epoch': 0.17} 17%|█▋ | 5730/34278 [6:22:15<29:01:43, 3.66s/it] 17%|█▋ | 5731/34278 [6:22:19<28:12:48, 3.56s/it] {'loss': 0.1649, 'grad_norm': 0.8831824460817326, 'learning_rate': 9.514608079040441e-06, 'epoch': 0.17} 17%|█▋ | 5731/34278 [6:22:19<28:12:48, 3.56s/it] 17%|█▋ | 5732/34278 [6:22:23<29:38:26, 3.74s/it] {'loss': 0.168, 'grad_norm': 0.8771172281238426, 'learning_rate': 9.51440500420582e-06, 'epoch': 0.17} 17%|█▋ | 5732/34278 [6:22:23<29:38:26, 3.74s/it] 17%|█▋ | 5733/34278 [6:22:29<35:01:26, 4.42s/it] {'loss': 0.1793, 'grad_norm': 0.801982788928741, 'learning_rate': 9.51420188906767e-06, 'epoch': 0.17} 17%|█▋ | 5733/34278 [6:22:29<35:01:26, 4.42s/it] 17%|█▋ | 5734/34278 [6:22:32<32:40:36, 4.12s/it] {'loss': 0.179, 'grad_norm': 0.9045361378829273, 'learning_rate': 9.513998733627802e-06, 'epoch': 0.17} 17%|█▋ | 5734/34278 [6:22:32<32:40:36, 4.12s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb15de5fbf0>
Failed to fetch sample 2696824. Exception: cannot identify image file <_io.BytesIO object at 0x7fb15de5fbf0>
17%|█▋ | 5735/34278 [6:22:35<28:59:23, 3.66s/it] {'loss': 0.1709, 'grad_norm': 0.9572368123826093, 'learning_rate': 9.513795537888032e-06, 'epoch': 0.17} 17%|█▋ | 5735/34278 [6:22:35<28:59:23, 3.66s/it] 17%|█▋ | 5736/34278 [6:22:39<30:43:52, 3.88s/it] {'loss': 0.1658, 'grad_norm': 0.9885183697072522, 'learning_rate': 9.513592301850174e-06, 'epoch': 0.17} 17%|█▋ | 5736/34278 [6:22:39<30:43:52, 3.88s/it] 17%|█▋ | 5737/34278 [6:22:43<30:14:31, 3.81s/it] {'loss': 0.1757, 'grad_norm': 0.8311614188640513, 'learning_rate': 9.51338902551604e-06, 'epoch': 0.17} 17%|█▋ | 5737/34278 [6:22:43<30:14:31, 3.81s/it] 17%|█▋ | 5738/34278 [6:22:49<35:18:20, 4.45s/it] {'loss': 0.1685, 'grad_norm': 0.9266043594963513, 'learning_rate': 9.513185708887445e-06, 'epoch': 0.17} 17%|█▋ | 5738/34278 [6:22:49<35:18:20, 4.45s/it] 17%|█▋ | 5739/34278 [6:22:52<32:11:31, 4.06s/it] {'loss': 0.1695, 'grad_norm': 0.8195980583339016, 'learning_rate': 9.512982351966207e-06, 'epoch': 0.17} 17%|█▋ | 5739/34278 [6:22:52<32:11:31, 4.06s/it] 17%|█▋ | 5740/34278
[6:22:56<31:33:12, 3.98s/it] {'loss': 0.173, 'grad_norm': 0.8522298747866902, 'learning_rate': 9.51277895475414e-06, 'epoch': 0.17} 17%|█▋ | 5740/34278 [6:22:56<31:33:12, 3.98s/it] 17%|█▋ | 5741/34278 [6:22:59<30:45:06, 3.88s/it] {'loss': 0.1659, 'grad_norm': 0.8191427424235472, 'learning_rate': 9.51257551725306e-06, 'epoch': 0.17} 17%|█▋ | 5741/34278 [6:22:59<30:45:06, 3.88s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8296 > 8192). Running this sequence through the model will result in indexing errors 17%|█▋ | 5742/34278 [6:23:06<36:53:00, 4.65s/it] {'loss': 0.1802, 'grad_norm': 0.883421807881772, 'learning_rate': 9.512372039464782e-06, 'epoch': 0.17} 17%|█▋ | 5742/34278 [6:23:06<36:53:00, 4.65s/it] 17%|█▋ | 5743/34278 [6:23:09<33:35:43, 4.24s/it] {'loss': 0.1639, 'grad_norm': 0.8813050107409939, 'learning_rate': 9.512168521391123e-06, 'epoch': 0.17} 17%|█▋ | 5743/34278 [6:23:09<33:35:43, 4.24s/it] 17%|█▋ | 5744/34278 [6:23:12<31:42:39, 4.00s/it] {'loss': 0.194, 'grad_norm': 0.9627359759151015, 'learning_rate': 9.511964963033902e-06, 'epoch': 0.17} 17%|█▋ | 5744/34278 [6:23:12<31:42:39, 4.00s/it] 17%|█▋ | 5745/34278 [6:23:15<29:00:33, 3.66s/it] {'loss': 0.1694, 'grad_norm': 0.7925154895715254, 'learning_rate': 9.511761364394935e-06, 'epoch': 0.17} 17%|█▋ | 5745/34278 [6:23:15<29:00:33, 3.66s/it] 17%|█▋ | 5746/34278 [6:23:18<27:46:44, 3.50s/it] {'loss': 0.1515, 'grad_norm': 0.850554932296648, 'learning_rate': 9.51155772547604e-06, 'epoch': 0.17} 17%|█▋ | 5746/34278 [6:23:18<27:46:44, 3.50s/it] 17%|█▋ | 5747/34278 [6:23:23<30:56:16, 3.90s/it] {'loss': 0.1725, 'grad_norm': 0.8708859149242727, 'learning_rate': 9.511354046279032e-06, 'epoch': 0.17} 17%|█▋ | 5747/34278 [6:23:23<30:56:16, 3.90s/it] 17%|█▋ | 5748/34278 [6:23:30<37:06:51, 4.68s/it] {'loss': 0.1995, 'grad_norm': 0.909647459641582, 'learning_rate': 9.511150326805734e-06, 'epoch': 0.17} 17%|█▋ | 5748/34278 [6:23:30<37:06:51, 4.68s/it] 17%|█▋ | 
5749/34278 [6:23:33<33:45:33, 4.26s/it] {'loss': 0.1752, 'grad_norm': 0.8624369587185624, 'learning_rate': 9.510946567057963e-06, 'epoch': 0.17} 17%|█▋ | 5749/34278 [6:23:33<33:45:33, 4.26s/it] 17%|█▋ | 5750/34278 [6:23:37<32:30:04, 4.10s/it] {'loss': 0.1655, 'grad_norm': 0.8705414073551789, 'learning_rate': 9.510742767037538e-06, 'epoch': 0.17} 17%|█▋ | 5750/34278 [6:23:37<32:30:04, 4.10s/it] 17%|█▋ | 5751/34278 [6:23:41<31:41:34, 4.00s/it] {'loss': 0.1691, 'grad_norm': 0.8458393316577807, 'learning_rate': 9.510538926746276e-06, 'epoch': 0.17} 17%|█▋ | 5751/34278 [6:23:41<31:41:34, 4.00s/it] 17%|█▋ | 5752/34278 [6:23:44<29:19:46, 3.70s/it] {'loss': 0.1847, 'grad_norm': 0.7781652817375657, 'learning_rate': 9.510335046186001e-06, 'epoch': 0.17} 17%|█▋ | 5752/34278 [6:23:44<29:19:46, 3.70s/it] 17%|█▋ | 5753/34278 [6:23:50<34:55:38, 4.41s/it] {'loss': 0.1997, 'grad_norm': 0.7684798531988254, 'learning_rate': 9.510131125358532e-06, 'epoch': 0.17} 17%|█▋ | 5753/34278 [6:23:50<34:55:38, 4.41s/it] 17%|█▋ | 5754/34278 [6:23:53<31:58:53, 4.04s/it] {'loss': 0.1543, 'grad_norm': 0.7430952463355108, 'learning_rate': 9.509927164265688e-06, 'epoch': 0.17} 17%|█▋ | 5754/34278 [6:23:53<31:58:53, 4.04s/it] 17%|█▋ | 5755/34278 [6:23:58<34:22:51, 4.34s/it] {'loss': 0.1617, 'grad_norm': 0.7541261204553376, 'learning_rate': 9.509723162909292e-06, 'epoch': 0.17} 17%|█▋ | 5755/34278 [6:23:58<34:22:51, 4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 17%|█▋ | 5756/34278 [6:24:01<31:34:24, 3.99s/it] {'loss': 0.168, 'grad_norm': 1.0395113794202016, 'learning_rate': 9.509519121291164e-06, 'epoch': 0.17} 17%|█▋ | 5756/34278 [6:24:01<31:34:24, 3.99s/it] 17%|█▋ | 5757/34278 [6:24:04<29:51:25, 3.77s/it] {'loss': 0.1668, 'grad_norm': 0.8457258943799788, 'learning_rate': 9.509315039413126e-06, 'epoch': 0.17} 17%|█▋ | 5757/34278 [6:24:04<29:51:25, 3.77s/it] 17%|█▋ | 5758/34278 [6:24:08<28:56:58, 3.65s/it] {'loss': 0.1658, 'grad_norm': 0.8008591685751763, 'learning_rate': 9.509110917276997e-06, 'epoch': 0.17} 17%|█▋ | 5758/34278 [6:24:08<28:56:58, 3.65s/it] 17%|█▋ | 5759/34278 [6:24:13<33:45:44, 4.26s/it] {'loss': 0.1943, 'grad_norm': 1.0079926273568525, 'learning_rate': 9.508906754884603e-06, 'epoch': 0.17} 17%|█▋ | 5759/34278 [6:24:13<33:45:44, 4.26s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 17%|█▋ | 5760/34278 [6:24:19<38:07:00, 4.81s/it] {'loss': 0.1741, 'grad_norm': 0.796659282858451, 'learning_rate': 9.508702552237768e-06, 'epoch': 0.17} 17%|█▋ | 5760/34278 [6:24:19<38:07:00, 4.81s/it] 17%|█▋ | 5761/34278 [6:24:25<40:56:58, 5.17s/it] {'loss': 0.1626, 'grad_norm': 0.9530830112998195, 'learning_rate': 9.508498309338313e-06, 'epoch': 0.17} 17%|█▋ | 5761/34278 [6:24:25<40:56:58, 5.17s/it] 17%|█▋ | 5762/34278 [6:24:29<35:57:07, 4.54s/it] {'loss': 0.2005, 'grad_norm': 0.8356891526275132, 'learning_rate': 9.50829402618806e-06, 'epoch': 0.17} 17%|█▋ | 5762/34278 [6:24:29<35:57:07, 4.54s/it] 17%|█▋ | 5763/34278 [6:24:31<31:39:24, 4.00s/it] {'loss': 0.1966, 'grad_norm': 0.9700998337612351, 'learning_rate': 9.508089702788835e-06, 'epoch': 0.17} 17%|█▋ | 5763/34278 [6:24:31<31:39:24, 4.00s/it] 17%|█▋ | 5764/34278 [6:24:35<30:10:20, 3.81s/it] {'loss': 0.1479, 'grad_norm': 0.7978492663470175, 'learning_rate': 9.50788533914246e-06, 'epoch': 0.17} 17%|█▋ | 5764/34278 [6:24:35<30:10:20, 3.81s/it] 17%|█▋ | 5765/34278 [6:24:38<29:09:11, 3.68s/it] {'loss': 0.1873, 'grad_norm': 0.8288991693189907, 'learning_rate': 9.507680935250762e-06, 'epoch': 0.17} 17%|█▋ | 5765/34278 [6:24:38<29:09:11, 3.68s/it] 17%|█▋ | 5766/34278 [6:24:41<27:14:20, 3.44s/it] {'loss': 0.1828, 'grad_norm': 0.7963707692136316, 'learning_rate': 9.507476491115564e-06, 'epoch': 0.17} 17%|█▋ | 5766/34278 [6:24:41<27:14:20, 3.44s/it] 17%|█▋ | 5767/34278 [6:24:44<25:33:19, 3.23s/it] {'loss': 0.1466, 'grad_norm': 0.8758297637996928, 'learning_rate': 9.507272006738692e-06, 'epoch': 0.17} 17%|█▋ | 5767/34278 [6:24:44<25:33:19, 3.23s/it] 17%|█▋ | 5768/34278 [6:24:47<26:24:52, 3.34s/it] {'loss': 0.1904, 'grad_norm': 0.8754145738446034, 'learning_rate': 9.50706748212197e-06, 'epoch': 0.17} 17%|█▋ | 5768/34278 [6:24:47<26:24:52, 3.34s/it] 17%|█▋ | 5769/34278 [6:24:51<27:45:06, 3.50s/it] {'loss': 0.1713, 'grad_norm': 0.7414681198536065, 'learning_rate': 9.506862917267228e-06, 
'epoch': 0.17} 17%|█▋ | 5769/34278 [6:24:51<27:45:06, 3.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 17%|█▋ | 5770/34278 [6:24:54<26:25:32, 3.34s/it] {'loss': 0.1419, 'grad_norm': 0.7989072402561423, 'learning_rate': 9.506658312176288e-06, 'epoch': 0.17} 17%|█▋ | 5770/34278 [6:24:54<26:25:32, 3.34s/it] 17%|█▋ | 5771/34278 [6:24:59<29:22:58, 3.71s/it] {'loss': 0.1562, 'grad_norm': 0.7659770789703081, 'learning_rate': 9.506453666850982e-06, 'epoch': 0.17} 17%|█▋ | 5771/34278 [6:24:59<29:22:58, 3.71s/it] 17%|█▋ | 5772/34278 [6:25:02<27:59:39, 3.54s/it] {'loss': 0.1646, 'grad_norm': 0.9861733022695788, 'learning_rate': 9.50624898129313e-06, 'epoch': 0.17} 17%|█▋ | 5772/34278 [6:25:02<27:59:39, 3.54s/it] 17%|█▋ | 5773/34278 [6:25:05<28:26:09, 3.59s/it] {'loss': 0.1901, 'grad_norm': 0.9031594025140651, 'learning_rate': 9.506044255504563e-06, 'epoch': 0.17} 17%|█▋ | 5773/34278 [6:25:05<28:26:09, 3.59s/it] 17%|█▋ | 5774/34278 [6:25:08<27:03:36, 3.42s/it] {'loss': 0.1716, 'grad_norm': 0.9649187871318449, 'learning_rate': 9.50583948948711e-06, 'epoch': 0.17} 17%|█▋ | 5774/34278 [6:25:09<27:03:36, 3.42s/it] 17%|█▋ | 5775/34278 [6:25:12<27:08:20, 3.43s/it] {'loss': 0.1537, 'grad_norm': 0.9163837986095745, 'learning_rate': 9.505634683242595e-06, 'epoch': 0.17} 17%|█▋ | 5775/34278 [6:25:12<27:08:20, 3.43s/it] 17%|█▋ | 5776/34278 [6:25:16<29:34:19, 3.74s/it] {'loss': 0.16, 'grad_norm': 0.7913032036590982, 'learning_rate': 9.505429836772852e-06, 'epoch': 0.17} 17%|█▋ | 5776/34278 [6:25:16<29:34:19, 3.74s/it] 17%|█▋ | 5777/34278 [6:25:20<28:13:46, 3.57s/it] {'loss': 0.1684, 'grad_norm': 1.0419250495818513, 'learning_rate': 9.505224950079705e-06, 'epoch': 0.17} 17%|█▋ | 5777/34278 [6:25:20<28:13:46, 3.57s/it] 17%|█▋ | 5778/34278 [6:25:26<33:58:20, 4.29s/it] {'loss': 0.1549, 'grad_norm': 
0.8186132470280999, 'learning_rate': 9.505020023164985e-06, 'epoch': 0.17} 17%|█▋ | 5778/34278 [6:25:26<33:58:20, 4.29s/it] 17%|█▋ | 5779/34278 [6:25:29<31:09:50, 3.94s/it] {'loss': 0.1798, 'grad_norm': 0.9145764220434629, 'learning_rate': 9.504815056030523e-06, 'epoch': 0.17} 17%|█▋ | 5779/34278 [6:25:29<31:09:50, 3.94s/it] 17%|█▋ | 5780/34278 [6:25:32<29:10:14, 3.68s/it] {'loss': 0.1757, 'grad_norm': 0.7939721453720862, 'learning_rate': 9.504610048678148e-06, 'epoch': 0.17} 17%|█▋ | 5780/34278 [6:25:32<29:10:14, 3.68s/it] 17%|█▋ | 5781/34278 [6:25:34<26:52:04, 3.39s/it] {'loss': 0.1634, 'grad_norm': 0.9123615104748429, 'learning_rate': 9.504405001109688e-06, 'epoch': 0.17} 17%|█▋ | 5781/34278 [6:25:34<26:52:04, 3.39s/it] 17%|█▋ | 5782/34278 [6:25:38<27:06:10, 3.42s/it] {'loss': 0.1433, 'grad_norm': 0.7837052496776843, 'learning_rate': 9.504199913326977e-06, 'epoch': 0.17} 17%|█▋ | 5782/34278 [6:25:38<27:06:10, 3.42s/it] 17%|█▋ | 5783/34278 [6:25:41<27:02:41, 3.42s/it] {'loss': 0.1827, 'grad_norm': 0.9136821640560577, 'learning_rate': 9.503994785331845e-06, 'epoch': 0.17} 17%|█▋ | 5783/34278 [6:25:41<27:02:41, 3.42s/it] 17%|█▋ | 5784/34278 [6:25:45<28:40:02, 3.62s/it] {'loss': 0.1697, 'grad_norm': 0.8988171773056757, 'learning_rate': 9.50378961712612e-06, 'epoch': 0.17} 17%|█▋ | 5784/34278 [6:25:45<28:40:02, 3.62s/it] 17%|█▋ | 5785/34278 [6:25:49<27:36:29, 3.49s/it] {'loss': 0.1913, 'grad_norm': 0.8454932841127984, 'learning_rate': 9.50358440871164e-06, 'epoch': 0.17} 17%|█▋ | 5785/34278 [6:25:49<27:36:29, 3.49s/it] 17%|█▋ | 5786/34278 [6:25:53<28:45:25, 3.63s/it] {'loss': 0.1785, 'grad_norm': 0.9256260902701139, 'learning_rate': 9.50337916009023e-06, 'epoch': 0.17} 17%|█▋ | 5786/34278 [6:25:53<28:45:25, 3.63s/it] 17%|█▋ | 5787/34278 [6:25:56<27:16:36, 3.45s/it] {'loss': 0.1585, 'grad_norm': 0.8611191499739325, 'learning_rate': 9.503173871263728e-06, 'epoch': 0.17} 17%|█▋ | 5787/34278 [6:25:56<27:16:36, 3.45s/it] 17%|█▋ | 5788/34278 [6:25:59<26:37:14, 3.36s/it] 
{'loss': 0.188, 'grad_norm': 1.1466914106383315, 'learning_rate': 9.502968542233963e-06, 'epoch': 0.17} 17%|█▋ | 5788/34278 [6:25:59<26:37:14, 3.36s/it] 17%|█▋ | 5789/34278 [6:26:02<25:50:10, 3.26s/it] {'loss': 0.1756, 'grad_norm': 0.855778321669791, 'learning_rate': 9.502763173002772e-06, 'epoch': 0.17} 17%|█▋ | 5789/34278 [6:26:02<25:50:10, 3.26s/it] 17%|█▋ | 5790/34278 [6:26:08<31:39:01, 4.00s/it] {'loss': 0.1736, 'grad_norm': 1.1559620482353958, 'learning_rate': 9.502557763571984e-06, 'epoch': 0.17} 17%|█▋ | 5790/34278 [6:26:08<31:39:01, 4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 17%|█▋ | 5791/34278 [6:26:11<30:17:24, 3.83s/it] {'loss': 0.1652, 'grad_norm': 0.8539528088437137, 'learning_rate': 9.502352313943437e-06, 'epoch': 0.17} 17%|█▋ | 5791/34278 [6:26:11<30:17:24, 3.83s/it] 17%|█▋ | 5792/34278 [6:26:17<34:55:32, 4.41s/it] {'loss': 0.1729, 'grad_norm': 1.0127769894227874, 'learning_rate': 9.502146824118964e-06, 'epoch': 0.17} 17%|█▋ | 5792/34278 [6:26:17<34:55:32, 4.41s/it] 17%|█▋ | 5793/34278 [6:26:20<31:46:11, 4.02s/it] {'loss': 0.1648, 'grad_norm': 1.0727713735757451, 'learning_rate': 9.501941294100397e-06, 'epoch': 0.17} 17%|█▋ | 5793/34278 [6:26:20<31:46:11, 4.02s/it] 17%|█▋ | 5794/34278 [6:26:23<29:43:37, 3.76s/it] {'loss': 0.159, 'grad_norm': 0.9718664482621123, 'learning_rate': 9.501735723889573e-06, 'epoch': 0.17} 17%|█▋ | 5794/34278 [6:26:23<29:43:37, 3.76s/it] 17%|█▋ | 5795/34278 [6:26:27<29:18:42, 3.70s/it] {'loss': 0.1744, 'grad_norm': 1.3117615410056271, 'learning_rate': 9.501530113488326e-06, 'epoch': 0.17} 17%|█▋ | 5795/34278 [6:26:27<29:18:42, 3.70s/it] 17%|█▋ | 5796/34278 [6:26:30<28:12:14, 3.56s/it] {'loss': 0.1749, 'grad_norm': 1.0712446014810333, 'learning_rate': 9.501324462898495e-06, 'epoch': 0.17} 17%|█▋ | 5796/34278 [6:26:30<28:12:14, 3.56s/it] 
17%|█▋ | 5797/34278 [6:26:36<33:52:13, 4.28s/it] {'loss': 0.1892, 'grad_norm': 0.8634629738870782, 'learning_rate': 9.501118772121913e-06, 'epoch': 0.17}
Token indices sequence length is longer than the specified maximum sequence length for this model (10423 > 8192). Running this sequence through the model will result in indexing errors
17%|█▋ | 5798/34278 [6:26:39<30:56:26, 3.91s/it] {'loss': 0.1707, 'grad_norm': 1.0273353474251192, 'learning_rate': 9.500913041160417e-06, 'epoch': 0.17}
17%|█▋ | 5799/34278 [6:26:42<30:08:19, 3.81s/it] {'loss': 0.1939, 'grad_norm': 1.0731080401598547, 'learning_rate': 9.500707270015846e-06, 'epoch': 0.17}
17%|█▋ | 5800/34278 [6:26:46<28:51:35, 3.65s/it] {'loss': 0.186, 'grad_norm': 1.0145773911709222, 'learning_rate': 9.500501458690031e-06, 'epoch': 0.17}
17%|█▋ | 5801/34278 [6:26:52<34:24:28, 4.35s/it] {'loss': 0.1807, 'grad_norm': 0.8661426553472247, 'learning_rate': 9.500295607184815e-06, 'epoch': 0.17}
17%|█▋ | 5802/34278 [6:26:55<31:05:42, 3.93s/it] {'loss': 0.166, 'grad_norm': 0.8685623315834823, 'learning_rate': 9.500089715502035e-06, 'epoch': 0.17}
17%|█▋ | 5803/34278 [6:27:00<35:27:38, 4.48s/it] {'loss': 0.1672, 'grad_norm': 0.8529004373329677, 'learning_rate': 9.499883783643526e-06, 'epoch': 0.17}
17%|█▋ | 5804/34278 [6:27:04<33:12:44, 4.20s/it] {'loss': 0.162, 'grad_norm': 0.6636763544981219, 'learning_rate': 9.499677811611133e-06, 'epoch': 0.17}
17%|█▋ | 5805/34278 [6:27:07<31:37:00, 4.00s/it] {'loss': 0.168, 'grad_norm': 0.7199486842416523, 'learning_rate': 9.499471799406687e-06, 'epoch': 0.17}
17%|█▋ | 5806/34278 [6:27:13<36:29:29, 4.61s/it] {'loss': 0.1727, 'grad_norm': 0.7058273381665765, 'learning_rate': 9.49926574703203e-06, 'epoch': 0.17}
17%|█▋ | 5807/34278 [6:27:16<32:14:44, 4.08s/it] {'loss': 0.1457, 'grad_norm': 0.7683216964440348, 'learning_rate': 9.499059654489005e-06, 'epoch': 0.17}
17%|█▋ | 5808/34278 [6:27:20<31:34:46, 3.99s/it] {'loss': 0.1513, 'grad_norm': 0.7008463940438889, 'learning_rate': 9.498853521779449e-06, 'epoch': 0.17}
17%|█▋ | 5809/34278 [6:27:23<28:43:44, 3.63s/it] {'loss': 0.186, 'grad_norm': 0.77511154420062, 'learning_rate': 9.498647348905203e-06, 'epoch': 0.17}
17%|█▋ | 5810/34278 [6:27:27<29:10:32, 3.69s/it] {'loss': 0.1804, 'grad_norm': 0.8665380179551119, 'learning_rate': 9.498441135868107e-06, 'epoch': 0.17}
17%|█▋ | 5811/34278 [6:27:30<27:08:28, 3.43s/it] {'loss': 0.1512, 'grad_norm': 0.864757819281507, 'learning_rate': 9.498234882670003e-06, 'epoch': 0.17}
17%|█▋ | 5812/34278 [6:27:35<33:06:56, 4.19s/it] {'loss': 0.1806, 'grad_norm': 0.670650331227537, 'learning_rate': 9.49802858931273e-06, 'epoch': 0.17}
17%|█▋ | 5813/34278 [6:27:40<32:57:52, 4.17s/it] {'loss': 0.1798, 'grad_norm': 0.9059467872314915, 'learning_rate': 9.497822255798132e-06, 'epoch': 0.17}
17%|█▋ | 5814/34278 [6:27:44<32:24:49, 4.10s/it] {'loss': 0.1619, 'grad_norm': 0.936561903060316, 'learning_rate': 9.497615882128053e-06, 'epoch': 0.17}
17%|█▋ | 5815/34278 [6:27:49<36:30:36, 4.62s/it] {'loss': 0.1629, 'grad_norm': 0.8561048927543008, 'learning_rate': 9.497409468304331e-06, 'epoch': 0.17}
17%|█▋ | 5816/34278 [6:27:53<34:18:58, 4.34s/it] {'loss': 0.1657, 'grad_norm': 1.0939282699945563, 'learning_rate': 9.49720301432881e-06, 'epoch': 0.17}
17%|█▋ | 5817/34278 [6:27:58<34:59:46, 4.43s/it] {'loss': 0.1787, 'grad_norm': 1.3073161896135992, 'learning_rate': 9.496996520203336e-06, 'epoch': 0.17}
17%|█▋ | 5818/34278 [6:28:01<31:35:16, 4.00s/it] {'loss': 0.1585, 'grad_norm': 1.4015824687904928, 'learning_rate': 9.496789985929749e-06, 'epoch': 0.17}
17%|█▋ | 5819/34278 [6:28:04<29:11:24, 3.69s/it] {'loss': 0.156, 'grad_norm': 0.9853956862235628, 'learning_rate': 9.496583411509897e-06, 'epoch': 0.17}
17%|█▋ | 5820/34278 [6:28:10<34:41:58, 4.39s/it] {'loss': 0.166, 'grad_norm': 0.7535386398845096, 'learning_rate': 9.49637679694562e-06, 'epoch': 0.17}
17%|█▋ | 5821/34278 [6:28:12<30:44:57, 3.89s/it] {'loss': 0.2054, 'grad_norm': 1.1852578585965288, 'learning_rate': 9.496170142238763e-06, 'epoch': 0.17}
17%|█▋ | 5822/34278 [6:28:16<29:20:50, 3.71s/it] {'loss': 0.1493, 'grad_norm': 0.6938151094907697, 'learning_rate': 9.495963447391174e-06, 'epoch': 0.17}
17%|█▋ | 5823/34278 [6:28:19<27:32:52, 3.49s/it] {'loss': 0.1709, 'grad_norm': 0.825948110000058, 'learning_rate': 9.495756712404695e-06, 'epoch': 0.17}
17%|█▋ | 5824/34278 [6:28:21<25:38:20, 3.24s/it] {'loss': 0.1946, 'grad_norm': 0.8517842125407908, 'learning_rate': 9.495549937281177e-06, 'epoch': 0.17}
17%|█▋ | 5825/34278 [6:28:25<26:38:16, 3.37s/it] {'loss': 0.146, 'grad_norm': 0.8141468243310551, 'learning_rate': 9.495343122022458e-06, 'epoch': 0.17}
17%|█▋ | 5826/34278 [6:28:31<33:25:38, 4.23s/it] {'loss': 0.1805, 'grad_norm': 0.8520375754287067, 'learning_rate': 9.495136266630392e-06, 'epoch': 0.17}
17%|█▋ | 5827/34278 [6:28:35<31:10:28, 3.94s/it] {'loss': 0.2049, 'grad_norm': 0.8924959092288862, 'learning_rate': 9.49492937110682e-06, 'epoch': 0.17}
17%|█▋ | 5828/34278 [6:28:38<30:23:19, 3.85s/it] {'loss': 0.1619, 'grad_norm': 0.7921049782065184, 'learning_rate': 9.494722435453593e-06, 'epoch': 0.17}
17%|█▋ | 5829/34278 [6:28:41<28:45:53, 3.64s/it] {'loss': 0.1754, 'grad_norm': 0.8075579509681048, 'learning_rate': 9.494515459672557e-06, 'epoch': 0.17}
17%|█▋ | 5830/34278 [6:28:45<27:52:34, 3.53s/it] {'loss': 0.169, 'grad_norm': 0.924799922280922, 'learning_rate': 9.49430844376556e-06, 'epoch': 0.17}
17%|█▋ | 5831/34278 [6:28:48<28:04:50, 3.55s/it] {'loss': 0.1633, 'grad_norm': 0.7704035544183194, 'learning_rate': 9.494101387734448e-06, 'epoch': 0.17}
17%|█▋ | 5832/34278 [6:28:51<26:39:44, 3.37s/it] {'loss': 0.1566, 'grad_norm': 0.7869394969098592, 'learning_rate': 9.493894291581074e-06, 'epoch': 0.17}
17%|█▋ | 5833/34278 [6:28:55<27:55:02, 3.53s/it] {'loss': 0.1649, 'grad_norm': 0.9816539631901909, 'learning_rate': 9.493687155307285e-06, 'epoch': 0.17}
17%|█▋ | 5834/34278 [6:28:58<26:37:51, 3.37s/it] {'loss': 0.173, 'grad_norm': 0.8285838160023006, 'learning_rate': 9.493479978914928e-06, 'epoch': 0.17}
17%|█▋ | 5835/34278 [6:29:01<25:23:54, 3.21s/it] {'loss': 0.1633, 'grad_norm': 0.7251819896058284, 'learning_rate': 9.493272762405856e-06, 'epoch': 0.17}
17%|█▋ | 5836/34278 [6:29:07<31:35:17, 4.00s/it] {'loss': 0.1884, 'grad_norm': 0.8053394908046079, 'learning_rate': 9.493065505781916e-06, 'epoch': 0.17}
17%|█▋ | 5837/34278 [6:29:10<30:53:39, 3.91s/it] {'loss': 0.1711, 'grad_norm': 0.8806605956908382, 'learning_rate': 9.49285820904496e-06, 'epoch': 0.17}
17%|█▋ | 5838/34278 [6:29:13<28:06:55, 3.56s/it] {'loss': 0.1613, 'grad_norm': 0.7246739029732067, 'learning_rate': 9.492650872196839e-06, 'epoch': 0.17}
17%|█▋ | 5839/34278 [6:29:16<27:33:00, 3.49s/it] {'loss': 0.1658, 'grad_norm': 0.8886602726063461, 'learning_rate': 9.492443495239404e-06, 'epoch': 0.17}
17%|█▋ | 5840/34278 [6:29:19<26:08:01, 3.31s/it] {'loss': 0.1753, 'grad_norm': 0.8578512789545912, 'learning_rate': 9.492236078174504e-06, 'epoch': 0.17}
17%|█▋ | 5841/34278 [6:29:26<32:57:27, 4.17s/it] {'loss': 0.1645, 'grad_norm': 1.030892003656916, 'learning_rate': 9.492028621003994e-06, 'epoch': 0.17}
17%|█▋ | 5842/34278 [6:29:32<38:03:40, 4.82s/it] {'loss': 0.1726, 'grad_norm': 1.062780621892427, 'learning_rate': 9.491821123729725e-06, 'epoch': 0.17}
17%|█▋ | 5843/34278 [6:29:35<34:30:48, 4.37s/it] {'loss': 0.1743, 'grad_norm': 0.944916376038252, 'learning_rate': 9.49161358635355e-06, 'epoch': 0.17}
17%|█▋ | 5844/34278 [6:29:39<32:22:16, 4.10s/it] {'loss': 0.198, 'grad_norm': 0.9317109319704673, 'learning_rate': 9.49140600887732e-06, 'epoch': 0.17}
17%|█▋ | 5845/34278 [6:29:42<29:22:50, 3.72s/it] {'loss': 0.1691, 'grad_norm': 0.8489490810089662, 'learning_rate': 9.49119839130289e-06, 'epoch': 0.17}
17%|█▋ | 5846/34278 [6:29:45<29:09:59, 3.69s/it] {'loss': 0.1644, 'grad_norm': 0.9000019676171469, 'learning_rate': 9.49099073363211e-06, 'epoch': 0.17}
17%|█▋ | 5847/34278 [6:29:50<31:30:13, 3.99s/it] {'loss': 0.1541, 'grad_norm': 0.6412596210619251, 'learning_rate': 9.49078303586684e-06, 'epoch': 0.17}
17%|█▋ | 5848/34278 [6:29:53<30:00:47, 3.80s/it] {'loss': 0.1775, 'grad_norm': 0.8121724387267714, 'learning_rate': 9.49057529800893e-06, 'epoch': 0.17}
17%|█▋ | 5849/34278 [6:29:58<31:26:03, 3.98s/it] {'loss': 0.1587, 'grad_norm': 0.7549442879290451, 'learning_rate': 9.490367520060236e-06, 'epoch': 0.17}
17%|█▋ | 5850/34278 [6:30:02<32:42:53, 4.14s/it] {'loss': 0.1524, 'grad_norm': 0.8276613389561046, 'learning_rate': 9.490159702022611e-06, 'epoch': 0.17}
17%|█▋ | 5851/34278 [6:30:05<29:50:25, 3.78s/it] {'loss': 0.1403, 'grad_norm': 0.6336513971427156, 'learning_rate': 9.489951843897916e-06, 'epoch': 0.17}
17%|█▋ | 5852/34278 [6:30:11<35:42:41, 4.52s/it] {'loss': 0.1801, 'grad_norm': 0.835059243274728, 'learning_rate': 9.489743945688e-06, 'epoch': 0.17}
17%|█▋ | 5853/34278 [6:30:14<32:08:44, 4.07s/it] {'loss': 0.1545, 'grad_norm': 0.8635128132339861, 'learning_rate': 9.489536007394721e-06, 'epoch': 0.17}
17%|█▋ | 5854/34278 [6:30:17<29:52:08, 3.78s/it] {'loss': 0.1942, 'grad_norm': 0.937145630045578, 'learning_rate': 9.489328029019939e-06, 'epoch': 0.17}
17%|█▋ | 5855/34278 [6:30:20<27:33:14, 3.49s/it] {'loss': 0.1652, 'grad_norm': 0.8666681316470812, 'learning_rate': 9.489120010565506e-06, 'epoch': 0.17}
17%|█▋ | 5856/34278 [6:30:23<26:52:04, 3.40s/it] {'loss': 0.159, 'grad_norm': 0.9248298409223987, 'learning_rate': 9.488911952033283e-06, 'epoch': 0.17}
17%|█▋ | 5857/34278 [6:30:27<27:03:01, 3.43s/it] {'loss': 0.1645, 'grad_norm': 1.0064340540443737, 'learning_rate': 9.488703853425125e-06, 'epoch': 0.17}
17%|█▋ | 5858/34278 [6:30:30<27:08:42, 3.44s/it] {'loss': 0.1615, 'grad_norm': 0.9921130219170589, 'learning_rate': 9.48849571474289e-06, 'epoch': 0.17}
17%|█▋ | 5859/34278 [6:30:34<26:40:39, 3.38s/it] {'loss': 0.1696, 'grad_norm': 0.7842710062827105, 'learning_rate': 9.488287535988437e-06, 'epoch': 0.17}
17%|█▋ | 5860/34278 [6:30:39<31:59:28, 4.05s/it] {'loss': 0.1512, 'grad_norm': 1.2510016437280238, 'learning_rate': 9.488079317163624e-06, 'epoch': 0.17}
17%|█▋ | 5861/34278 [6:30:43<31:25:29, 3.98s/it] {'loss': 0.1964, 'grad_norm': 1.1097337672361536, 'learning_rate': 9.48787105827031e-06, 'epoch': 0.17}
17%|█▋ | 5862/34278 [6:30:46<28:48:37, 3.65s/it] {'loss': 0.188, 'grad_norm': 0.9184401922265988, 'learning_rate': 9.487662759310354e-06, 'epoch': 0.17}
17%|█▋ | 5863/34278 [6:30:49<28:28:32, 3.61s/it] {'loss': 0.1921, 'grad_norm': 0.8779127180149895, 'learning_rate': 9.487454420285618e-06, 'epoch': 0.17}
17%|█▋ | 5864/34278 [6:30:52<27:05:57, 3.43s/it] {'loss': 0.1722, 'grad_norm': 1.0451577051652654, 'learning_rate': 9.48724604119796e-06, 'epoch': 0.17}
17%|█▋ | 5865/34278 [6:30:58<33:07:51, 4.20s/it] {'loss': 0.1486, 'grad_norm': 1.0543682482136674, 'learning_rate': 9.487037622049238e-06, 'epoch': 0.17}
17%|█▋ | 5866/34278 [6:31:04<35:09:50, 4.46s/it] {'loss': 0.161, 'grad_norm': 0.9540005552983775, 'learning_rate': 9.486829162841318e-06, 'epoch': 0.17}
17%|█▋ | 5867/34278 [6:31:06<31:28:39, 3.99s/it] {'loss': 0.1627, 'grad_norm': 0.9103475364595928, 'learning_rate': 9.486620663576058e-06, 'epoch': 0.17}
17%|█▋ | 5868/34278 [6:31:09<28:55:21, 3.66s/it] {'loss': 0.1894, 'grad_norm': 0.9717469794001211, 'learning_rate': 9.486412124255318e-06, 'epoch': 0.17}
17%|█▋ | 5869/34278 [6:31:12<27:45:54, 3.52s/it] {'loss': 0.1707, 'grad_norm': 0.7894868945012453, 'learning_rate': 9.486203544880963e-06, 'epoch': 0.17}
17%|█▋ | 5870/34278 [6:31:16<28:38:45, 3.63s/it] {'loss': 0.1624, 'grad_norm': 0.9958722679967967, 'learning_rate': 9.485994925454853e-06, 'epoch': 0.17}
17%|█▋ | 5871/34278 [6:31:22<34:01:33, 4.31s/it] {'loss': 0.1894, 'grad_norm': 0.9983706914619377, 'learning_rate': 9.485786265978852e-06, 'epoch': 0.17}
17%|█▋ | 5872/34278 [6:31:26<31:59:13, 4.05s/it] {'loss': 0.187, 'grad_norm': 0.8709152103796894, 'learning_rate': 9.485577566454822e-06, 'epoch': 0.17}
17%|█▋ | 5873/34278 [6:31:30<32:24:28, 4.11s/it] {'loss': 0.1687, 'grad_norm': 0.7704062122345586, 'learning_rate': 9.485368826884625e-06, 'epoch': 0.17}
17%|█▋ | 5874/34278 [6:31:34<33:05:25, 4.19s/it] {'loss': 0.1797, 'grad_norm': 0.9113342942131873, 'learning_rate': 9.485160047270128e-06, 'epoch': 0.17}
17%|█▋ | 5875/34278 [6:31:38<31:57:27, 4.05s/it] {'loss': 0.1787, 'grad_norm': 0.8294673446858788, 'learning_rate': 9.48495122761319e-06, 'epoch': 0.17}
17%|█▋ | 5876/34278 [6:31:43<34:13:58, 4.34s/it] {'loss': 0.1792, 'grad_norm': 0.7857387256772692, 'learning_rate': 9.48474236791568e-06, 'epoch': 0.17}
17%|█▋ | 5877/34278 [6:31:49<38:38:36, 4.90s/it] {'loss': 0.179, 'grad_norm': 0.7558495006071084, 'learning_rate': 9.484533468179461e-06, 'epoch': 0.17}
17%|█▋ | 5878/34278 [6:31:52<34:06:11, 4.32s/it] {'loss': 0.1888, 'grad_norm': 0.7936706599984249, 'learning_rate': 9.484324528406397e-06, 'epoch': 0.17}
17%|█▋ | 5879/34278 [6:31:55<31:18:02, 3.97s/it] {'loss': 0.1719, 'grad_norm': 0.7705994564390533, 'learning_rate': 9.484115548598353e-06, 'epoch': 0.17}
17%|█▋ | 5880/34278 [6:32:00<33:03:57, 4.19s/it] {'loss': 0.169, 'grad_norm': 0.7123561091210682, 'learning_rate': 9.483906528757199e-06, 'epoch': 0.17}
17%|█▋ | 5881/34278 [6:32:03<29:58:38, 3.80s/it] {'loss': 0.1585, 'grad_norm': 0.7658641762756876, 'learning_rate': 9.483697468884795e-06, 'epoch': 0.17}
17%|█▋ | 5882/34278 [6:32:08<31:54:16, 4.04s/it] {'loss': 0.1663, 'grad_norm': 0.7731858572594492, 'learning_rate': 9.483488368983012e-06, 'epoch': 0.17}
17%|█▋ | 5883/34278 [6:32:11<30:07:30, 3.82s/it] {'loss': 0.1838, 'grad_norm': 0.723524529100536, 'learning_rate': 9.483279229053715e-06, 'epoch': 0.17}
17%|█▋ | 5884/34278 [6:32:14<28:15:35, 3.58s/it] {'loss': 0.156, 'grad_norm': 0.837544800571379, 'learning_rate': 9.48307004909877e-06, 'epoch': 0.17}
17%|█▋ | 5885/34278 [6:32:17<27:48:39, 3.53s/it] {'loss': 0.1569, 'grad_norm': 0.7445408836660294, 'learning_rate': 9.482860829120046e-06, 'epoch': 0.17}
17%|█▋ | 5886/34278 [6:32:21<29:02:26, 3.68s/it] {'loss': 0.1529, 'grad_norm': 0.9026515150135537, 'learning_rate': 9.482651569119412e-06, 'epoch': 0.17}
Token indices sequence length is longer than the specified maximum sequence length for this model (10530 > 8192). Running this sequence through the model will result in indexing errors
17%|█▋ | 5887/34278 [6:32:25<28:09:42, 3.57s/it] {'loss': 0.1881, 'grad_norm': 0.8318603556547081, 'learning_rate': 9.482442269098734e-06, 'epoch': 0.17}
17%|█▋ | 5888/34278 [6:32:29<30:43:30, 3.90s/it] {'loss': 0.1774, 'grad_norm': 0.8781507598297048, 'learning_rate': 9.482232929059882e-06, 'epoch': 0.17}
17%|█▋ | 5889/34278 [6:32:33<29:05:04, 3.69s/it] {'loss': 0.1579, 'grad_norm': 0.8663416585325672, 'learning_rate': 9.482023549004725e-06, 'epoch': 0.17}
17%|█▋ | 5890/34278 [6:32:39<34:32:47, 4.38s/it] {'loss': 0.163, 'grad_norm': 0.9783300054513878, 'learning_rate': 9.48181412893513e-06, 'epoch': 0.17}
17%|█▋ | 5891/34278 [6:32:42<31:23:35, 3.98s/it] {'loss': 0.1858, 'grad_norm': 1.0992628919712968, 'learning_rate': 9.481604668852969e-06, 'epoch': 0.17}
17%|█▋ | 5892/34278 [6:32:45<29:13:00, 3.71s/it] {'loss': 0.1917, 'grad_norm': 1.474379319261719, 'learning_rate': 9.48139516876011e-06, 'epoch': 0.17}
17%|█▋ | 5893/34278 [6:32:50<33:32:27, 4.25s/it] {'loss': 0.1567, 'grad_norm': 0.9871554734600607, 'learning_rate': 9.481185628658427e-06, 'epoch': 0.17}
17%|█▋ | 5894/34278 [6:32:53<30:26:13, 3.86s/it] {'loss': 0.1584, 'grad_norm': 0.8606364597631231, 'learning_rate': 9.480976048549788e-06, 'epoch': 0.17}
17%|█▋ | 5895/34278 [6:32:58<32:44:55, 4.15s/it] {'loss': 0.1546, 'grad_norm': 1.0110607347591787, 'learning_rate': 9.480766428436064e-06, 'epoch': 0.17}
17%|█▋ | 5896/34278 [6:33:01<29:58:45, 3.80s/it] {'loss': 0.1724, 'grad_norm': 0.8948162546528683, 'learning_rate': 9.480556768319127e-06, 'epoch': 0.17}
17%|█▋ | 5897/34278 [6:33:06<31:47:08, 4.03s/it] {'loss': 0.1653, 'grad_norm': 1.041592621819511, 'learning_rate': 9.480347068200848e-06, 'epoch': 0.17}
17%|█▋ | 5898/34278 [6:33:12<36:59:51, 4.69s/it] {'loss': 0.1618, 'grad_norm': 0.8787071543385294, 'learning_rate': 9.480137328083102e-06, 'epoch': 0.17}
17%|█▋ | 5899/34278 [6:33:15<33:22:25, 4.23s/it] {'loss': 0.1756, 'grad_norm': 0.9065587370815846, 'learning_rate': 9.479927547967758e-06, 'epoch': 0.17}
17%|█▋ | 5900/34278 [6:33:18<29:47:22, 3.78s/it] {'loss': 0.1534, 'grad_norm': 1.1462145546323987, 'learning_rate': 9.47971772785669e-06, 'epoch': 0.17}
17%|█▋ | 5901/34278 [6:33:21<29:06:33, 3.69s/it] {'loss': 0.1838, 'grad_norm': 0.8907073665411045, 'learning_rate': 9.479507867751772e-06, 'epoch': 0.17}
17%|█▋ | 5902/34278 [6:33:25<30:17:41, 3.84s/it] {'loss': 0.1477, 'grad_norm': 0.8302950690582827, 'learning_rate': 9.479297967654877e-06, 'epoch': 0.17}
17%|█▋ | 5903/34278 [6:33:29<29:50:33, 3.79s/it] {'loss': 0.1661, 'grad_norm': 1.0954433713833198, 'learning_rate': 9.479088027567879e-06, 'epoch': 0.17}
17%|█▋ | 5904/34278 [6:33:32<28:49:38, 3.66s/it] {'loss': 0.1754, 'grad_norm': 0.8691451504944712, 'learning_rate': 9.478878047492653e-06, 'epoch': 0.17}
17%|█▋ | 5905/34278 [6:33:36<27:42:13, 3.52s/it] {'loss': 0.1695, 'grad_norm': 1.0099113901889492, 'learning_rate': 9.478668027431071e-06, 'epoch': 0.17}
17%|█▋ | 5906/34278 [6:33:39<28:08:10, 3.57s/it] {'loss': 0.1894, 'grad_norm': 0.9850315987586054, 'learning_rate': 9.478457967385013e-06, 'epoch': 0.17}
17%|█▋ | 5907/34278 [6:33:43<29:11:54, 3.70s/it] {'loss': 0.1422, 'grad_norm': 0.7299713781503557, 'learning_rate': 9.47824786735635e-06, 'epoch': 0.17}
17%|█▋ | 5908/34278 [6:33:46<27:22:15, 3.47s/it] {'loss': 0.1688, 'grad_norm': 0.8527877793748475, 'learning_rate':
9.478037727346959e-06, 'epoch': 0.17} 17%|█▋ | 5908/34278 [6:33:46<27:22:15, 3.47s/it] 17%|█▋ | 5909/34278 [6:33:49<26:33:31, 3.37s/it] {'loss': 0.1654, 'grad_norm': 0.7605342388667926, 'learning_rate': 9.477827547358716e-06, 'epoch': 0.17} 17%|█▋ | 5909/34278 [6:33:49<26:33:31, 3.37s/it] 17%|█▋ | 5910/34278 [6:33:53<26:12:13, 3.33s/it] {'loss': 0.1578, 'grad_norm': 0.7355909282845711, 'learning_rate': 9.477617327393496e-06, 'epoch': 0.17} 17%|█▋ | 5910/34278 [6:33:53<26:12:13, 3.33s/it] 17%|█▋ | 5911/34278 [6:33:56<26:48:37, 3.40s/it] {'loss': 0.1666, 'grad_norm': 0.895120318100169, 'learning_rate': 9.47740706745318e-06, 'epoch': 0.17} 17%|█▋ | 5911/34278 [6:33:56<26:48:37, 3.40s/it] 17%|█▋ | 5912/34278 [6:34:00<27:03:34, 3.43s/it] {'loss': 0.185, 'grad_norm': 0.7411542237171944, 'learning_rate': 9.47719676753964e-06, 'epoch': 0.17} 17%|█▋ | 5912/34278 [6:34:00<27:03:34, 3.43s/it] 17%|█▋ | 5913/34278 [6:34:03<26:44:37, 3.39s/it] {'loss': 0.1589, 'grad_norm': 0.710531623170141, 'learning_rate': 9.476986427654759e-06, 'epoch': 0.17} 17%|█▋ | 5913/34278 [6:34:03<26:44:37, 3.39s/it] 17%|█▋ | 5914/34278 [6:34:06<25:56:44, 3.29s/it] {'loss': 0.1717, 'grad_norm': 0.7799338090969948, 'learning_rate': 9.476776047800412e-06, 'epoch': 0.17} 17%|█▋ | 5914/34278 [6:34:06<25:56:44, 3.29s/it] 17%|█▋ | 5915/34278 [6:34:10<28:39:33, 3.64s/it] {'loss': 0.1505, 'grad_norm': 0.6765085921456918, 'learning_rate': 9.476565627978473e-06, 'epoch': 0.17} 17%|█▋ | 5915/34278 [6:34:10<28:39:33, 3.64s/it] 17%|█▋ | 5916/34278 [6:34:13<27:15:51, 3.46s/it] {'loss': 0.1527, 'grad_norm': 1.1168412809581576, 'learning_rate': 9.47635516819083e-06, 'epoch': 0.17} 17%|█▋ | 5916/34278 [6:34:13<27:15:51, 3.46s/it] 17%|█▋ | 5917/34278 [6:34:17<26:34:29, 3.37s/it] {'loss': 0.1811, 'grad_norm': 0.776208522254885, 'learning_rate': 9.476144668439353e-06, 'epoch': 0.17} 17%|█▋ | 5917/34278 [6:34:17<26:34:29, 3.37s/it] 17%|█▋ | 5918/34278 [6:34:20<27:22:11, 3.47s/it] {'loss': 0.164, 'grad_norm': 
0.8716613828854245, 'learning_rate': 9.475934128725926e-06, 'epoch': 0.17} 17%|█▋ | 5918/34278 [6:34:20<27:22:11, 3.47s/it] 17%|█▋ | 5919/34278 [6:34:24<27:13:12, 3.46s/it] {'loss': 0.1558, 'grad_norm': 0.7464377444385695, 'learning_rate': 9.475723549052427e-06, 'epoch': 0.17} 17%|█▋ | 5919/34278 [6:34:24<27:13:12, 3.46s/it] 17%|█▋ | 5920/34278 [6:34:27<27:43:22, 3.52s/it] {'loss': 0.1651, 'grad_norm': 0.8552805410092627, 'learning_rate': 9.475512929420739e-06, 'epoch': 0.17} 17%|█▋ | 5920/34278 [6:34:27<27:43:22, 3.52s/it] 17%|█▋ | 5921/34278 [6:34:32<30:39:54, 3.89s/it] {'loss': 0.1598, 'grad_norm': 0.8271860599322377, 'learning_rate': 9.475302269832736e-06, 'epoch': 0.17} 17%|█▋ | 5921/34278 [6:34:32<30:39:54, 3.89s/it] 17%|█▋ | 5922/34278 [6:34:36<30:29:46, 3.87s/it] {'loss': 0.1584, 'grad_norm': 0.9967403670391106, 'learning_rate': 9.475091570290306e-06, 'epoch': 0.17} 17%|█▋ | 5922/34278 [6:34:36<30:29:46, 3.87s/it] 17%|█▋ | 5923/34278 [6:34:42<35:09:43, 4.46s/it] {'loss': 0.1686, 'grad_norm': 0.9750872399469732, 'learning_rate': 9.474880830795326e-06, 'epoch': 0.17} 17%|█▋ | 5923/34278 [6:34:42<35:09:43, 4.46s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 17%|█▋ | 5924/34278 [6:34:46<33:11:36, 4.21s/it] {'loss': 0.1659, 'grad_norm': 0.9847673654342468, 'learning_rate': 9.474670051349677e-06, 'epoch': 0.17} 17%|█▋ | 5924/34278 [6:34:46<33:11:36, 4.21s/it] 17%|█▋ | 5925/34278 [6:34:49<30:57:46, 3.93s/it] {'loss': 0.1882, 'grad_norm': 1.5394662452136487, 'learning_rate': 9.474459231955243e-06, 'epoch': 0.17} 17%|█▋ | 5925/34278 [6:34:49<30:57:46, 3.93s/it] 17%|█▋ | 5926/34278 [6:34:53<31:26:55, 3.99s/it] {'loss': 0.1705, 'grad_norm': 0.8466411139553673, 'learning_rate': 9.474248372613904e-06, 'epoch': 0.17} 17%|█▋ | 5926/34278 [6:34:53<31:26:55, 3.99s/it] 17%|█▋ | 5927/34278 [6:34:59<36:00:40, 4.57s/it] {'loss': 0.1929, 'grad_norm': 0.797873163935156, 'learning_rate': 9.474037473327546e-06, 'epoch': 0.17} 17%|█▋ | 5927/34278 [6:34:59<36:00:40, 4.57s/it] 17%|█▋ | 5928/34278 [6:35:02<33:26:40, 4.25s/it] {'loss': 0.1621, 'grad_norm': 1.0790938186326624, 'learning_rate': 9.473826534098048e-06, 'epoch': 0.17} 17%|█▋ | 5928/34278 [6:35:02<33:26:40, 4.25s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f06c7ef3100>
Failed to fetch sample 2898121. Exception: cannot identify image file <_io.BytesIO object at 0x7f06c7ef3100>
17%|█▋ | 5929/34278 [6:35:05<30:48:20, 3.91s/it] {'loss': 0.1808, 'grad_norm': 0.9646935693715456, 'learning_rate': 9.473615554927294e-06, 'epoch': 0.17} 17%|█▋ | 5929/34278 [6:35:05<30:48:20, 3.91s/it] 17%|█▋ | 5930/34278 [6:35:10<32:03:24, 4.07s/it] {'loss': 0.1757, 'grad_norm': 0.8499971109968554, 'learning_rate': 9.473404535817168e-06, 'epoch': 0.17} 17%|█▋ | 5930/34278 [6:35:10<32:03:24, 4.07s/it] 17%|█▋ | 5931/34278 [6:35:13<29:29:34, 3.75s/it] {'loss': 0.1693, 'grad_norm': 0.8560382926739467, 'learning_rate': 9.473193476769556e-06, 'epoch': 0.17} 17%|█▋ | 5931/34278 [6:35:13<29:29:34, 3.75s/it] 17%|█▋ | 5932/34278 [6:35:17<29:22:48, 3.73s/it] {'loss': 0.1765, 'grad_norm': 1.0118514806139574, 'learning_rate': 9.47298237778634e-06, 'epoch': 0.17} 17%|█▋ | 5932/34278 [6:35:17<29:22:48, 3.73s/it] 17%|█▋ | 5933/34278 [6:35:20<28:13:51, 3.59s/it] {'loss': 0.1825, 'grad_norm': 0.8925696467654138, 'learning_rate': 9.472771238869404e-06, 'epoch': 0.17} 17%|█▋ | 5933/34278 [6:35:20<28:13:51, 3.59s/it] 17%|█▋ | 5934/34278 [6:35:26<34:04:30, 4.33s/it] {'loss': 0.1787, 'grad_norm': 1.1306236346184735, 'learning_rate': 9.472560060020635e-06, 'epoch': 0.17} 17%|█▋ | 5934/34278 [6:35:26<34:04:30, 4.33s/it] 17%|█▋ | 5935/34278 [6:35:30<34:38:05, 4.40s/it] {'loss': 0.1656, 'grad_norm': 1.1127923557527213, 'learning_rate': 9.472348841241917e-06, 'epoch': 0.17} 17%|█▋ | 5935/34278 [6:35:30<34:38:05, 4.40s/it] 17%|█▋ | 5936/34278 [6:35:34<32:12:01, 4.09s/it] {'loss': 0.1771, 'grad_norm': 
0.9330077890733786, 'learning_rate': 9.472137582535137e-06, 'epoch': 0.17} 17%|█▋ | 5936/34278 [6:35:34<32:12:01, 4.09s/it] 17%|█▋ | 5937/34278 [6:35:37<29:18:16, 3.72s/it] {'loss': 0.1713, 'grad_norm': 0.8874432218424393, 'learning_rate': 9.47192628390218e-06, 'epoch': 0.17} 17%|█▋ | 5937/34278 [6:35:37<29:18:16, 3.72s/it] 17%|█▋ | 5938/34278 [6:35:41<31:42:51, 4.03s/it] {'loss': 0.1793, 'grad_norm': 0.7874904272003118, 'learning_rate': 9.471714945344932e-06, 'epoch': 0.17} 17%|█▋ | 5938/34278 [6:35:41<31:42:51, 4.03s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 17%|█▋ | 5939/34278 [6:35:46<31:58:01, 4.06s/it] {'loss': 0.1784, 'grad_norm': 1.0186968902520714, 'learning_rate': 9.471503566865281e-06, 'epoch': 0.17} 17%|█▋ | 5939/34278 [6:35:46<31:58:01, 4.06s/it] 17%|█▋ | 5940/34278 [6:35:49<29:28:08, 3.74s/it] {'loss': 0.1898, 'grad_norm': 0.7667923188504596, 'learning_rate': 9.471292148465113e-06, 'epoch': 0.17} 17%|█▋ | 5940/34278 [6:35:49<29:28:08, 3.74s/it] 17%|█▋ | 5941/34278 [6:35:52<28:09:42, 3.58s/it] {'loss': 0.1546, 'grad_norm': 0.7736002263705937, 'learning_rate': 9.471080690146316e-06, 'epoch': 0.17} 17%|█▋ | 5941/34278 [6:35:52<28:09:42, 3.58s/it] 17%|█▋ | 5942/34278 [6:35:55<27:20:12, 3.47s/it] {'loss': 0.1596, 'grad_norm': 0.8722639941577405, 'learning_rate': 9.470869191910779e-06, 'epoch': 0.17} 17%|█▋ | 5942/34278 [6:35:55<27:20:12, 3.47s/it] 17%|█▋ | 5943/34278 [6:36:00<29:57:38, 3.81s/it] {'loss': 0.1701, 'grad_norm': 0.9742425619725633, 'learning_rate': 9.47065765376039e-06, 'epoch': 0.17} 17%|█▋ | 5943/34278 [6:36:00<29:57:38, 3.81s/it] 17%|█▋ | 5944/34278 [6:36:02<27:45:02, 3.53s/it] {'loss': 0.1874, 'grad_norm': 0.8977959599437407, 'learning_rate': 9.470446075697033e-06, 'epoch': 0.17} 17%|█▋ | 5944/34278 [6:36:02<27:45:02, 3.53s/it] 17%|█▋ | 5945/34278 
[6:36:06<27:57:18, 3.55s/it] {'loss': 0.1744, 'grad_norm': 0.8078948420257451, 'learning_rate': 9.470234457722604e-06, 'epoch': 0.17} 17%|█▋ | 5945/34278 [6:36:06<27:57:18, 3.55s/it] 17%|█▋ | 5946/34278 [6:36:11<32:04:37, 4.08s/it] {'loss': 0.1822, 'grad_norm': 0.8299434972604273, 'learning_rate': 9.470022799838986e-06, 'epoch': 0.17} 17%|█▋ | 5946/34278 [6:36:11<32:04:37, 4.08s/it] 17%|█▋ | 5947/34278 [6:36:14<29:52:27, 3.80s/it] {'loss': 0.1774, 'grad_norm': 0.9917905356144868, 'learning_rate': 9.469811102048074e-06, 'epoch': 0.17} 17%|█▋ | 5947/34278 [6:36:15<29:52:27, 3.80s/it] 17%|█▋ | 5948/34278 [6:36:18<29:39:04, 3.77s/it] {'loss': 0.153, 'grad_norm': 0.7140443255715782, 'learning_rate': 9.469599364351756e-06, 'epoch': 0.17} 17%|█▋ | 5948/34278 [6:36:18<29:39:04, 3.77s/it] 17%|█▋ | 5949/34278 [6:36:22<29:54:09, 3.80s/it] {'loss': 0.1676, 'grad_norm': 0.9216111115796793, 'learning_rate': 9.46938758675192e-06, 'epoch': 0.17} 17%|█▋ | 5949/34278 [6:36:22<29:54:09, 3.80s/it] 17%|█▋ | 5950/34278 [6:36:25<28:55:21, 3.68s/it] {'loss': 0.1472, 'grad_norm': 0.8813427707493571, 'learning_rate': 9.46917576925046e-06, 'epoch': 0.17} 17%|█▋ | 5950/34278 [6:36:25<28:55:21, 3.68s/it] 17%|█▋ | 5951/34278 [6:36:31<34:15:23, 4.35s/it] {'loss': 0.1761, 'grad_norm': 0.745755009849987, 'learning_rate': 9.468963911849264e-06, 'epoch': 0.17} 17%|█▋ | 5951/34278 [6:36:31<34:15:23, 4.35s/it] 17%|█▋ | 5952/34278 [6:36:35<31:58:35, 4.06s/it] {'loss': 0.1837, 'grad_norm': 0.9162878895306341, 'learning_rate': 9.468752014550227e-06, 'epoch': 0.17} 17%|█▋ | 5952/34278 [6:36:35<31:58:35, 4.06s/it] 17%|█▋ | 5953/34278 [6:36:38<30:38:27, 3.89s/it] {'loss': 0.1837, 'grad_norm': 0.9514592588014141, 'learning_rate': 9.468540077355237e-06, 'epoch': 0.17} 17%|█▋ | 5953/34278 [6:36:38<30:38:27, 3.89s/it] 17%|█▋ | 5954/34278 [6:36:42<29:31:07, 3.75s/it] {'loss': 0.1692, 'grad_norm': 0.8191123760490736, 'learning_rate': 9.468328100266189e-06, 'epoch': 0.17} 17%|█▋ | 5954/34278 [6:36:42<29:31:07, 
3.75s/it] 17%|█▋ | 5955/34278 [6:36:48<34:43:19, 4.41s/it] {'loss': 0.1745, 'grad_norm': 0.8295743832141615, 'learning_rate': 9.468116083284972e-06, 'epoch': 0.17} 17%|█▋ | 5955/34278 [6:36:48<34:43:19, 4.41s/it] 17%|█▋ | 5956/34278 [6:36:50<30:48:08, 3.92s/it] {'loss': 0.1653, 'grad_norm': 0.9984093734614587, 'learning_rate': 9.467904026413485e-06, 'epoch': 0.17} 17%|█▋ | 5956/34278 [6:36:50<30:48:08, 3.92s/it] 17%|█▋ | 5957/34278 [6:36:53<28:15:39, 3.59s/it] {'loss': 0.1516, 'grad_norm': 0.8485915407094916, 'learning_rate': 9.467691929653615e-06, 'epoch': 0.17} 17%|█▋ | 5957/34278 [6:36:53<28:15:39, 3.59s/it] 17%|█▋ | 5958/34278 [6:36:59<33:39:28, 4.28s/it] {'loss': 0.1734, 'grad_norm': 0.8760360025884906, 'learning_rate': 9.46747979300726e-06, 'epoch': 0.17} 17%|█▋ | 5958/34278 [6:36:59<33:39:28, 4.28s/it] 17%|█▋ | 5959/34278 [6:37:02<31:06:00, 3.95s/it] {'loss': 0.1604, 'grad_norm': 0.974936331869323, 'learning_rate': 9.46726761647631e-06, 'epoch': 0.17} 17%|█▋ | 5959/34278 [6:37:02<31:06:00, 3.95s/it] 17%|█▋ | 5960/34278 [6:37:05<28:52:22, 3.67s/it] {'loss': 0.1659, 'grad_norm': 0.8487539303292866, 'learning_rate': 9.467055400062661e-06, 'epoch': 0.17} 17%|█▋ | 5960/34278 [6:37:05<28:52:22, 3.67s/it] 17%|█▋ | 5961/34278 [6:37:09<29:32:31, 3.76s/it] {'loss': 0.1537, 'grad_norm': 0.879925549034562, 'learning_rate': 9.466843143768208e-06, 'epoch': 0.17} 17%|█▋ | 5961/34278 [6:37:09<29:32:31, 3.76s/it] 17%|█▋ | 5962/34278 [6:37:13<29:10:10, 3.71s/it] {'loss': 0.1588, 'grad_norm': 1.0531656602568154, 'learning_rate': 9.466630847594846e-06, 'epoch': 0.17} 17%|█▋ | 5962/34278 [6:37:13<29:10:10, 3.71s/it] 17%|█▋ | 5963/34278 [6:37:16<27:57:20, 3.55s/it] {'loss': 0.1529, 'grad_norm': 0.8055294366853316, 'learning_rate': 9.46641851154447e-06, 'epoch': 0.17} 17%|█▋ | 5963/34278 [6:37:16<27:57:20, 3.55s/it] 17%|█▋ | 5964/34278 [6:37:19<27:05:33, 3.44s/it] {'loss': 0.1388, 'grad_norm': 0.8963424247946293, 'learning_rate': 9.466206135618976e-06, 'epoch': 0.17} 17%|█▋ | 
5964/34278 [6:37:19<27:05:33, 3.44s/it] 17%|█▋ | 5965/34278 [6:37:24<29:14:45, 3.72s/it] {'loss': 0.1421, 'grad_norm': 0.8520309048234174, 'learning_rate': 9.46599371982026e-06, 'epoch': 0.17} 17%|█▋ | 5965/34278 [6:37:24<29:14:45, 3.72s/it] 17%|█▋ | 5966/34278 [6:37:27<27:25:30, 3.49s/it] {'loss': 0.145, 'grad_norm': 0.9403170645611213, 'learning_rate': 9.465781264150218e-06, 'epoch': 0.17} 17%|█▋ | 5966/34278 [6:37:27<27:25:30, 3.49s/it] 17%|█▋ | 5967/34278 [6:37:29<25:47:34, 3.28s/it] {'loss': 0.1815, 'grad_norm': 0.7752863862495293, 'learning_rate': 9.465568768610746e-06, 'epoch': 0.17} 17%|█▋ | 5967/34278 [6:37:29<25:47:34, 3.28s/it] 17%|█▋ | 5968/34278 [6:37:36<32:32:49, 4.14s/it] {'loss': 0.1653, 'grad_norm': 0.7940347796017446, 'learning_rate': 9.465356233203744e-06, 'epoch': 0.17} 17%|█▋ | 5968/34278 [6:37:36<32:32:49, 4.14s/it] 17%|█▋ | 5969/34278 [6:37:38<29:46:02, 3.79s/it] {'loss': 0.155, 'grad_norm': 0.9184169748005556, 'learning_rate': 9.465143657931107e-06, 'epoch': 0.17} 17%|█▋ | 5969/34278 [6:37:39<29:46:02, 3.79s/it] 17%|█▋ | 5970/34278 [6:37:42<28:15:17, 3.59s/it] {'loss': 0.1706, 'grad_norm': 0.8519137706543816, 'learning_rate': 9.464931042794732e-06, 'epoch': 0.17} 17%|█▋ | 5970/34278 [6:37:42<28:15:17, 3.59s/it] 17%|█▋ | 5971/34278 [6:37:48<34:02:16, 4.33s/it] {'loss': 0.1962, 'grad_norm': 1.0760462073750925, 'learning_rate': 9.464718387796519e-06, 'epoch': 0.17} 17%|█▋ | 5971/34278 [6:37:48<34:02:16, 4.33s/it] 17%|█▋ | 5972/34278 [6:37:51<31:41:28, 4.03s/it] {'loss': 0.1602, 'grad_norm': 0.9306352113149406, 'learning_rate': 9.464505692938366e-06, 'epoch': 0.17} 17%|█▋ | 5972/34278 [6:37:51<31:41:28, 4.03s/it] 17%|█▋ | 5973/34278 [6:37:54<29:54:38, 3.80s/it] {'loss': 0.1799, 'grad_norm': 0.9626143433325164, 'learning_rate': 9.464292958222173e-06, 'epoch': 0.17} 17%|█▋ | 5973/34278 [6:37:54<29:54:38, 3.80s/it] 17%|█▋ | 5974/34278 [6:37:57<28:23:07, 3.61s/it] {'loss': 0.2107, 'grad_norm': 0.8169859941832072, 'learning_rate': 
9.464080183649838e-06, 'epoch': 0.17} 17%|█▋ | 5974/34278 [6:37:57<28:23:07, 3.61s/it] 17%|█▋ | 5975/34278 [6:38:01<27:40:39, 3.52s/it] {'loss': 0.1526, 'grad_norm': 0.8663734991313604, 'learning_rate': 9.46386736922326e-06, 'epoch': 0.17} 17%|█▋ | 5975/34278 [6:38:01<27:40:39, 3.52s/it] 17%|█▋ | 5976/34278 [6:38:04<26:18:59, 3.35s/it] {'loss': 0.1616, 'grad_norm': 0.9345566173290707, 'learning_rate': 9.46365451494434e-06, 'epoch': 0.17} 17%|█▋ | 5976/34278 [6:38:04<26:18:59, 3.35s/it] 17%|█▋ | 5977/34278 [6:38:07<25:47:41, 3.28s/it] {'loss': 0.1564, 'grad_norm': 0.768155821852064, 'learning_rate': 9.463441620814978e-06, 'epoch': 0.17} 17%|█▋ | 5977/34278 [6:38:07<25:47:41, 3.28s/it] 17%|█▋ | 5978/34278 [6:38:10<26:31:22, 3.37s/it] {'loss': 0.1618, 'grad_norm': 0.9104263530019562, 'learning_rate': 9.463228686837073e-06, 'epoch': 0.17} 17%|█▋ | 5978/34278 [6:38:10<26:31:22, 3.37s/it] 17%|█▋ | 5979/34278 [6:38:15<28:40:00, 3.65s/it] {'loss': 0.1752, 'grad_norm': 0.7977559775174695, 'learning_rate': 9.463015713012531e-06, 'epoch': 0.17} 17%|█▋ | 5979/34278 [6:38:15<28:40:00, 3.65s/it] 17%|█▋ | 5980/34278 [6:38:18<26:49:41, 3.41s/it] {'loss': 0.1526, 'grad_norm': 0.8462361553626403, 'learning_rate': 9.462802699343248e-06, 'epoch': 0.17} 17%|█▋ | 5980/34278 [6:38:18<26:49:41, 3.41s/it] 17%|█▋ | 5981/34278 [6:38:20<25:28:50, 3.24s/it] {'loss': 0.2182, 'grad_norm': 0.9133740290586382, 'learning_rate': 9.462589645831128e-06, 'epoch': 0.17} 17%|█▋ | 5981/34278 [6:38:20<25:28:50, 3.24s/it] 17%|█▋ | 5982/34278 [6:38:26<30:48:10, 3.92s/it] {'loss': 0.1599, 'grad_norm': 0.7961643263590523, 'learning_rate': 9.462376552478074e-06, 'epoch': 0.17} 17%|█▋ | 5982/34278 [6:38:26<30:48:10, 3.92s/it] 17%|█▋ | 5983/34278 [6:38:30<31:23:13, 3.99s/it] {'loss': 0.1813, 'grad_norm': 0.8578588468885399, 'learning_rate': 9.462163419285987e-06, 'epoch': 0.17} 17%|█▋ | 5983/34278 [6:38:30<31:23:13, 3.99s/it] 17%|█▋ | 5984/34278 [6:38:33<29:19:11, 3.73s/it] {'loss': 0.1771, 'grad_norm': 
1.0301119862778012, 'learning_rate': 9.46195024625677e-06, 'epoch': 0.17} 17%|█▋ | 5984/34278 [6:38:33<29:19:11, 3.73s/it] 17%|█▋ | 5985/34278 [6:38:37<28:36:45, 3.64s/it] {'loss': 0.154, 'grad_norm': 0.9098537210614265, 'learning_rate': 9.461737033392327e-06, 'epoch': 0.17} 17%|█▋ | 5985/34278 [6:38:37<28:36:45, 3.64s/it] 17%|█▋ | 5986/34278 [6:38:40<27:38:05, 3.52s/it] {'loss': 0.1792, 'grad_norm': 1.021547156446205, 'learning_rate': 9.461523780694559e-06, 'epoch': 0.17} 17%|█▋ | 5986/34278 [6:38:40<27:38:05, 3.52s/it] 17%|█▋ | 5987/34278 [6:38:43<26:49:47, 3.41s/it] {'loss': 0.1887, 'grad_norm': 1.0632900295887269, 'learning_rate': 9.461310488165373e-06, 'epoch': 0.17} 17%|█▋ | 5987/34278 [6:38:43<26:49:47, 3.41s/it] 17%|█▋ | 5988/34278 [6:38:46<25:34:34, 3.25s/it] {'loss': 0.1698, 'grad_norm': 1.296644032438599, 'learning_rate': 9.461097155806673e-06, 'epoch': 0.17} 17%|█▋ | 5988/34278 [6:38:46<25:34:34, 3.25s/it] 17%|█▋ | 5989/34278 [6:38:50<26:25:04, 3.36s/it] {'loss': 0.1681, 'grad_norm': 0.9021366184339357, 'learning_rate': 9.46088378362036e-06, 'epoch': 0.17} 17%|█▋ | 5989/34278 [6:38:50<26:25:04, 3.36s/it] 17%|█▋ | 5990/34278 [6:38:53<26:11:19, 3.33s/it] {'loss': 0.2058, 'grad_norm': 1.1251424074743566, 'learning_rate': 9.460670371608345e-06, 'epoch': 0.17} 17%|█▋ | 5990/34278 [6:38:53<26:11:19, 3.33s/it] 17%|█▋ | 5991/34278 [6:38:56<25:56:21, 3.30s/it] {'loss': 0.1645, 'grad_norm': 0.9416358799561906, 'learning_rate': 9.460456919772527e-06, 'epoch': 0.17} 17%|█▋ | 5991/34278 [6:38:56<25:56:21, 3.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 17%|█▋ | 5992/34278 [6:38:59<24:39:56, 3.14s/it] {'loss': 0.1857, 'grad_norm': 1.1801873635148534, 'learning_rate': 9.460243428114815e-06, 'epoch': 0.17} 17%|█▋ | 5992/34278 [6:38:59<24:39:56, 3.14s/it] 17%|█▋ | 5993/34278 [6:39:05<31:23:04, 3.99s/it] {'loss': 0.1506, 'grad_norm': 0.9746959799548469, 'learning_rate': 9.460029896637115e-06, 'epoch': 0.17} 17%|█▋ | 5993/34278 [6:39:05<31:23:04, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 17%|█▋ | 5994/34278 [6:39:09<31:03:48, 3.95s/it] {'loss': 0.1684, 'grad_norm': 1.036935427850256, 'learning_rate': 9.459816325341331e-06, 'epoch': 0.17} 17%|█▋ | 5994/34278 [6:39:09<31:03:48, 3.95s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 17%|█▋ | 5995/34278 [6:39:12<30:15:01, 3.85s/it] {'loss': 0.1819, 'grad_norm': 0.7227916773233188, 'learning_rate': 9.459602714229373e-06, 'epoch': 0.17} 17%|█▋ | 5995/34278 [6:39:12<30:15:01, 3.85s/it] 17%|█▋ | 5996/34278 [6:39:15<28:11:35, 3.59s/it] {'loss': 0.1471, 'grad_norm': 0.7485200504083205, 'learning_rate': 9.459389063303147e-06, 'epoch': 0.17} 17%|█▋ | 5996/34278 [6:39:15<28:11:35, 3.59s/it] 17%|█▋ | 5997/34278 [6:39:19<28:01:46, 3.57s/it] {'loss': 0.191, 'grad_norm': 0.9782731977611967, 'learning_rate': 9.45917537256456e-06, 'epoch': 0.17} 17%|█▋ | 5997/34278 [6:39:19<28:01:46, 3.57s/it] 17%|█▋ | 5998/34278 [6:39:22<27:17:13, 3.47s/it] {'loss': 0.1485, 'grad_norm': 0.8880707327511158, 'learning_rate': 9.458961642015518e-06, 'epoch': 0.17} 17%|█▋ | 5998/34278 [6:39:22<27:17:13, 3.47s/it] 18%|█▊ | 5999/34278 [6:39:26<28:50:01, 3.67s/it] {'loss': 0.1749, 'grad_norm': 0.8374546938347205, 'learning_rate': 9.458747871657931e-06, 'epoch': 0.18} 18%|█▊ | 5999/34278 [6:39:26<28:50:01, 3.67s/it] 18%|█▊ | 6000/34278 [6:39:29<27:51:38, 3.55s/it] {'loss': 0.157, 'grad_norm': 1.3244317301623223, 'learning_rate': 9.45853406149371e-06, 'epoch': 0.18} 18%|█▊ | 6000/34278 [6:39:29<27:51:38, 3.55s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 18%|█▊ | 6001/34278 [6:40:05<104:15:21, 13.27s/it] {'loss': 0.1953, 'grad_norm': 1.331792773017602, 'learning_rate': 9.45832021152476e-06, 'epoch': 0.18} 18%|█▊ | 6001/34278 [6:40:05<104:15:21, 13.27s/it] 18%|█▊ | 6002/34278 [6:40:08<79:52:30, 10.17s/it] {'loss': 0.1745, 'grad_norm': 0.8670937692733209, 'learning_rate': 9.458106321752992e-06, 'epoch': 0.18} 18%|█▊ | 6002/34278 [6:40:08<79:52:30, 10.17s/it] 18%|█▊ | 6003/34278 [6:40:11<63:06:18, 8.03s/it] {'loss': 0.1731, 'grad_norm': 1.064234831429125, 'learning_rate': 9.457892392180313e-06, 'epoch': 0.18} 18%|█▊ | 6003/34278 [6:40:11<63:06:18, 8.03s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f216b9ff010>
Failed to fetch sample 3268267. Exception: cannot identify image file <_io.BytesIO object at 0x7f216b9ff010>
18%|█▊ | 6004/34278 [6:40:15<52:47:48, 6.72s/it] {'loss': 0.1863, 'grad_norm': 0.9256814528488202, 'learning_rate': 9.457678422808636e-06, 'epoch': 0.18} 18%|█▊ | 6004/34278 [6:40:15<52:47:48, 6.72s/it] 18%|█▊ | 6005/34278 [6:40:20<48:13:23, 6.14s/it] {'loss': 0.1429, 'grad_norm': 1.1655334845881768, 'learning_rate': 9.45746441363987e-06, 'epoch': 0.18} 18%|█▊ | 6005/34278 [6:40:20<48:13:23, 6.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 18%|█▊ | 6006/34278 [6:40:23<41:21:40, 5.27s/it] {'loss': 0.1785, 'grad_norm': 0.9539661582528656, 'learning_rate': 9.457250364675926e-06, 'epoch': 0.18} 18%|█▊ | 6006/34278 [6:40:23<41:21:40, 5.27s/it] 18%|█▊ | 6007/34278 [6:40:27<39:25:53, 5.02s/it] {'loss': 0.1558, 'grad_norm': 0.8417616364845183, 'learning_rate': 9.457036275918714e-06, 'epoch': 0.18} 18%|█▊ | 6007/34278 [6:40:27<39:25:53, 5.02s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
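The `UnidentifiedImageError` tracebacks above abort only the sample, not the run: `__getitem__` catches the exception, logs `Failed to fetch sample …`, and training continues with the next step. A minimal sketch of that fault-tolerant `__getitem__` pattern (hypothetical class name and fallback policy; the actual `dataset.py` implementation is not shown in the log):

```python
class FaultTolerantDataset:
    """Sketch of the retry-on-bad-sample pattern the log implies.

    `_get_item` stands in for the real decode path (image loading via
    PIL etc.); a `None` record simulates a corrupt image that would
    raise PIL.UnidentifiedImageError. Hypothetical, not the actual
    aguvis dataset.py code.
    """

    def __init__(self, samples, max_retries=10):
        self.samples = samples
        self.max_retries = max_retries

    def _get_item(self, i):
        record = self.samples[i]
        if record is None:  # stand-in for a corrupt/undecodable image
            raise ValueError(f"cannot identify image file for sample {i}")
        return record

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                # Mirror the log line: report the bad sample, then move on.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)  # deterministic fallback index
        raise RuntimeError("too many consecutive unreadable samples")
```

With this shape, a corrupt record costs one logged line plus a substitute fetch instead of crashing all ranks, which matches the log: each traceback is immediately followed by the next training step.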
Gradients will be None warnings.warn( 18%|█▊ | 6008/34278 [6:40:31<35:38:46, 4.54s/it] {'loss': 0.1854, 'grad_norm': 1.041565807270326, 'learning_rate': 9.456822147370149e-06, 'epoch': 0.18} 18%|█▊ | 6008/34278 [6:40:31<35:38:46, 4.54s/it] 18%|█▊ | 6009/34278 [6:40:35<35:21:47, 4.50s/it] {'loss': 0.1599, 'grad_norm': 0.7819352945132542, 'learning_rate': 9.456607979032137e-06, 'epoch': 0.18} 18%|█▊ | 6009/34278 [6:40:35<35:21:47, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 18%|█▊ | 6010/34278 [6:40:38<31:43:55, 4.04s/it] {'loss': 0.1703, 'grad_norm': 0.9423586350948675, 'learning_rate': 9.456393770906594e-06, 'epoch': 0.18} 18%|█▊ | 6010/34278 [6:40:38<31:43:55, 4.04s/it] 18%|█▊ | 6011/34278 [6:40:44<36:06:44, 4.60s/it] {'loss': 0.1773, 'grad_norm': 0.8588015736703734, 'learning_rate': 9.45617952299543e-06, 'epoch': 0.18} 18%|█▊ | 6011/34278 [6:40:44<36:06:44, 4.60s/it] 18%|█▊ | 6012/34278 [6:40:49<37:25:15, 4.77s/it] {'loss': 0.1753, 'grad_norm': 0.7507797498074688, 'learning_rate': 9.455965235300559e-06, 'epoch': 0.18} 18%|█▊ | 6012/34278 [6:40:49<37:25:15, 4.77s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10348 > 8192). 
Running this sequence through the model will result in indexing errors 18%|█▊ | 6013/34278 [6:40:55<40:10:48, 5.12s/it] {'loss': 0.18, 'grad_norm': 1.2850760496932208, 'learning_rate': 9.455750907823895e-06, 'epoch': 0.18} 18%|█▊ | 6013/34278 [6:40:55<40:10:48, 5.12s/it] 18%|█▊ | 6014/34278 [6:41:02<43:03:11, 5.48s/it] {'loss': 0.1579, 'grad_norm': 0.8619489869393079, 'learning_rate': 9.45553654056735e-06, 'epoch': 0.18} 18%|█▊ | 6014/34278 [6:41:02<43:03:11, 5.48s/it] 18%|█▊ | 6015/34278 [6:41:05<37:44:17, 4.81s/it] {'loss': 0.1606, 'grad_norm': 0.7063174720726706, 'learning_rate': 9.45532213353284e-06, 'epoch': 0.18} 18%|█▊ | 6015/34278 [6:41:05<37:44:17, 4.81s/it] 18%|█▊ | 6016/34278 [6:41:11<40:49:13, 5.20s/it] {'loss': 0.1626, 'grad_norm': 0.7749751225775768, 'learning_rate': 9.455107686722276e-06, 'epoch': 0.18} 18%|█▊ | 6016/34278 [6:41:11<40:49:13, 5.20s/it] 18%|█▊ | 6017/34278 [6:41:14<35:54:49, 4.57s/it] {'loss': 0.1955, 'grad_norm': 0.8938572402846047, 'learning_rate': 9.454893200137574e-06, 'epoch': 0.18} 18%|█▊ | 6017/34278 [6:41:14<35:54:49, 4.57s/it] 18%|█▊ | 6018/34278 [6:41:17<32:37:38, 4.16s/it] {'loss': 0.1532, 'grad_norm': 0.7553586630209235, 'learning_rate': 9.45467867378065e-06, 'epoch': 0.18} 18%|█▊ | 6018/34278 [6:41:17<32:37:38, 4.16s/it] 18%|█▊ | 6019/34278 [6:41:20<30:18:35, 3.86s/it] {'loss': 0.1577, 'grad_norm': 0.8913931513797135, 'learning_rate': 9.454464107653418e-06, 'epoch': 0.18} 18%|█▊ | 6019/34278 [6:41:20<30:18:35, 3.86s/it] 18%|█▊ | 6020/34278 [6:41:26<33:36:34, 4.28s/it] {'loss': 0.1942, 'grad_norm': 0.8958707414497841, 'learning_rate': 9.454249501757794e-06, 'epoch': 0.18} 18%|█▊ | 6020/34278 [6:41:26<33:36:34, 4.28s/it] 18%|█▊ | 6021/34278 [6:41:29<30:45:34, 3.92s/it] {'loss': 0.1688, 'grad_norm': 1.0879702654646908, 'learning_rate': 9.454034856095693e-06, 'epoch': 0.18} 18%|█▊ | 6021/34278 [6:41:29<30:45:34, 3.92s/it] 18%|█▊ | 6022/34278 [6:41:35<35:53:16, 4.57s/it] {'loss': 0.1803, 'grad_norm': 0.7765588735381302, 
'learning_rate': 9.45382017066903e-06, 'epoch': 0.18} 18%|█▊ | 6022/34278 [6:41:35<35:53:16, 4.57s/it] 18%|█▊ | 6023/34278 [6:41:38<33:25:48, 4.26s/it] {'loss': 0.1716, 'grad_norm': 0.8144183889369842, 'learning_rate': 9.453605445479727e-06, 'epoch': 0.18} 18%|█▊ | 6023/34278 [6:41:38<33:25:48, 4.26s/it] 18%|█▊ | 6024/34278 [6:41:41<30:49:53, 3.93s/it] {'loss': 0.16, 'grad_norm': 0.7901507030206922, 'learning_rate': 9.453390680529696e-06, 'epoch': 0.18} 18%|█▊ | 6024/34278 [6:41:41<30:49:53, 3.93s/it] 18%|█▊ | 6025/34278 [6:41:45<30:41:22, 3.91s/it] {'loss': 0.181, 'grad_norm': 0.9581655271792159, 'learning_rate': 9.453175875820857e-06, 'epoch': 0.18} 18%|█▊ | 6025/34278 [6:41:45<30:41:22, 3.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 18%|█▊ | 6026/34278 [6:41:49<29:03:02, 3.70s/it] {'loss': 0.1694, 'grad_norm': 0.7957862568945363, 'learning_rate': 9.452961031355128e-06, 'epoch': 0.18} 18%|█▊ | 6026/34278 [6:41:49<29:03:02, 3.70s/it] 18%|█▊ | 6027/34278 [6:41:52<27:28:05, 3.50s/it] {'loss': 0.1731, 'grad_norm': 0.8253825296684637, 'learning_rate': 9.452746147134423e-06, 'epoch': 0.18} 18%|█▊ | 6027/34278 [6:41:52<27:28:05, 3.50s/it] 18%|█▊ | 6028/34278 [6:41:55<27:04:29, 3.45s/it] {'loss': 0.148, 'grad_norm': 0.8902491986234154, 'learning_rate': 9.452531223160665e-06, 'epoch': 0.18} 18%|█▊ | 6028/34278 [6:41:55<27:04:29, 3.45s/it] 18%|█▊ | 6029/34278 [6:41:58<26:22:37, 3.36s/it] {'loss': 0.1761, 'grad_norm': 0.860239722845713, 'learning_rate': 9.452316259435771e-06, 'epoch': 0.18} 18%|█▊ | 6029/34278 [6:41:58<26:22:37, 3.36s/it] 18%|█▊ | 6030/34278 [6:42:02<28:05:32, 3.58s/it] {'loss': 0.1718, 'grad_norm': 1.2344045246556863, 'learning_rate': 9.45210125596166e-06, 'epoch': 0.18} 18%|█▊ | 6030/34278 [6:42:02<28:05:32, 3.58s/it] 18%|█▊ | 6031/34278 [6:42:05<26:33:46, 3.39s/it] {'loss': 
0.1543, 'grad_norm': 0.9847896196009573, 'learning_rate': 9.451886212740253e-06, 'epoch': 0.18} 18%|█▊ | 6031/34278 [6:42:05<26:33:46, 3.39s/it] 18%|█▊ | 6032/34278 [6:42:08<25:27:05, 3.24s/it] {'loss': 0.1601, 'grad_norm': 0.8664705599311466, 'learning_rate': 9.45167112977347e-06, 'epoch': 0.18} 18%|█▊ | 6032/34278 [6:42:08<25:27:05, 3.24s/it] 18%|█▊ | 6033/34278 [6:42:11<25:27:25, 3.24s/it] {'loss': 0.1798, 'grad_norm': 1.0049382101091084, 'learning_rate': 9.451456007063227e-06, 'epoch': 0.18} 18%|█▊ | 6033/34278 [6:42:11<25:27:25, 3.24s/it] 18%|█▊ | 6034/34278 [6:42:15<25:37:48, 3.27s/it] {'loss': 0.167, 'grad_norm': 0.8925356302001541, 'learning_rate': 9.451240844611447e-06, 'epoch': 0.18} 18%|█▊ | 6034/34278 [6:42:15<25:37:48, 3.27s/it] 18%|█▊ | 6035/34278 [6:42:20<31:49:15, 4.06s/it] {'loss': 0.1752, 'grad_norm': 0.7737694120149674, 'learning_rate': 9.451025642420053e-06, 'epoch': 0.18} 18%|█▊ | 6035/34278 [6:42:21<31:49:15, 4.06s/it] 18%|█▊ | 6036/34278 [6:42:26<36:19:06, 4.63s/it] {'loss': 0.1857, 'grad_norm': 0.997220088503365, 'learning_rate': 9.450810400490964e-06, 'epoch': 0.18} 18%|█▊ | 6036/34278 [6:42:26<36:19:06, 4.63s/it] 18%|█▊ | 6037/34278 [6:42:30<34:00:05, 4.33s/it] {'loss': 0.1308, 'grad_norm': 0.5647631873998366, 'learning_rate': 9.450595118826102e-06, 'epoch': 0.18} 18%|█▊ | 6037/34278 [6:42:30<34:00:05, 4.33s/it] 18%|█▊ | 6038/34278 [6:42:34<32:03:18, 4.09s/it] {'loss': 0.1719, 'grad_norm': 0.8906089630960201, 'learning_rate': 9.450379797427389e-06, 'epoch': 0.18} 18%|█▊ | 6038/34278 [6:42:34<32:03:18, 4.09s/it] 18%|█▊ | 6039/34278 [6:42:37<29:46:51, 3.80s/it] {'loss': 0.1563, 'grad_norm': 0.8681627581345246, 'learning_rate': 9.450164436296749e-06, 'epoch': 0.18} 18%|█▊ | 6039/34278 [6:42:37<29:46:51, 3.80s/it] 18%|█▊ | 6040/34278 [6:42:40<28:10:50, 3.59s/it] {'loss': 0.1704, 'grad_norm': 1.049205464938567, 'learning_rate': 9.449949035436103e-06, 'epoch': 0.18} 18%|█▊ | 6040/34278 [6:42:40<28:10:50, 3.59s/it] 18%|█▊ | 6041/34278 
[6:42:43<26:42:32, 3.41s/it] {'loss': 0.1629, 'grad_norm': 0.8109953914533022, 'learning_rate': 9.449733594847372e-06, 'epoch': 0.18} 18%|█▊ | 6041/34278 [6:42:43<26:42:32, 3.41s/it] 18%|█▊ | 6042/34278 [6:42:46<25:48:37, 3.29s/it] {'loss': 0.1474, 'grad_norm': 0.8814713554775707, 'learning_rate': 9.449518114532484e-06, 'epoch': 0.18} 18%|█▊ | 6042/34278 [6:42:46<25:48:37, 3.29s/it] 18%|█▊ | 6043/34278 [6:42:50<28:40:11, 3.66s/it] {'loss': 0.1661, 'grad_norm': 0.8461899653006055, 'learning_rate': 9.449302594493359e-06, 'epoch': 0.18} 18%|█▊ | 6043/34278 [6:42:50<28:40:11, 3.66s/it] 18%|█▊ | 6044/34278 [6:42:53<27:23:39, 3.49s/it] {'loss': 0.166, 'grad_norm': 0.8465339571038941, 'learning_rate': 9.449087034731924e-06, 'epoch': 0.18} 18%|█▊ | 6044/34278 [6:42:53<27:23:39, 3.49s/it] 18%|█▊ | 6045/34278 [6:42:58<30:29:22, 3.89s/it] {'loss': 0.1616, 'grad_norm': 0.8806843254973223, 'learning_rate': 9.448871435250102e-06, 'epoch': 0.18} 18%|█▊ | 6045/34278 [6:42:58<30:29:22, 3.89s/it] 18%|█▊ | 6046/34278 [6:43:01<28:18:16, 3.61s/it] {'loss': 0.1627, 'grad_norm': 0.89434729703085, 'learning_rate': 9.448655796049817e-06, 'epoch': 0.18} 18%|█▊ | 6046/34278 [6:43:01<28:18:16, 3.61s/it] 18%|█▊ | 6047/34278 [6:43:04<26:49:04, 3.42s/it] {'loss': 0.1562, 'grad_norm': 0.6465515443049981, 'learning_rate': 9.448440117132995e-06, 'epoch': 0.18} 18%|█▊ | 6047/34278 [6:43:04<26:49:04, 3.42s/it] 18%|█▊ | 6048/34278 [6:43:07<25:42:52, 3.28s/it] {'loss': 0.2078, 'grad_norm': 0.9649083594734409, 'learning_rate': 9.448224398501562e-06, 'epoch': 0.18} 18%|█▊ | 6048/34278 [6:43:07<25:42:52, 3.28s/it] 18%|█▊ | 6049/34278 [6:43:11<26:13:25, 3.34s/it] {'loss': 0.1997, 'grad_norm': 1.0466966296126328, 'learning_rate': 9.448008640157444e-06, 'epoch': 0.18} 18%|█▊ | 6049/34278 [6:43:11<26:13:25, 3.34s/it] 18%|█▊ | 6050/34278 [6:43:14<25:54:40, 3.30s/it] {'loss': 0.1664, 'grad_norm': 0.8800708082618894, 'learning_rate': 9.447792842102566e-06, 'epoch': 0.18} 18%|█▊ | 6050/34278 [6:43:14<25:54:40, 
3.30s/it] 18%|█▊ | 6051/34278 [6:43:17<25:07:35, 3.20s/it] {'loss': 0.1781, 'grad_norm': 1.063802355330643, 'learning_rate': 9.447577004338855e-06, 'epoch': 0.18} 18%|█▊ | 6051/34278 [6:43:17<25:07:35, 3.20s/it] 18%|█▊ | 6052/34278 [6:43:23<32:47:14, 4.18s/it] {'loss': 0.1667, 'grad_norm': 0.9923137827728826, 'learning_rate': 9.447361126868238e-06, 'epoch': 0.18} 18%|█▊ | 6052/34278 [6:43:23<32:47:14, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 18%|█▊ | 6053/34278 [6:43:28<32:52:39, 4.19s/it] {'loss': 0.1577, 'grad_norm': 0.7367424048833521, 'learning_rate': 9.447145209692643e-06, 'epoch': 0.18} 18%|█▊ | 6053/34278 [6:43:28<32:52:39, 4.19s/it] 18%|█▊ | 6054/34278 [6:43:33<35:22:24, 4.51s/it] {'loss': 0.1612, 'grad_norm': 0.8162899202253272, 'learning_rate': 9.446929252813997e-06, 'epoch': 0.18} 18%|█▊ | 6054/34278 [6:43:33<35:22:24, 4.51s/it] 18%|█▊ | 6055/34278 [6:43:36<31:34:46, 4.03s/it] {'loss': 0.1548, 'grad_norm': 0.7287134917054725, 'learning_rate': 9.446713256234229e-06, 'epoch': 0.18} 18%|█▊ | 6055/34278 [6:43:36<31:34:46, 4.03s/it] 18%|█▊ | 6056/34278 [6:43:40<31:40:52, 4.04s/it] {'loss': 0.1878, 'grad_norm': 0.9588223252770661, 'learning_rate': 9.446497219955266e-06, 'epoch': 0.18} 18%|█▊ | 6056/34278 [6:43:40<31:40:52, 4.04s/it] 18%|█▊ | 6057/34278 [6:43:45<35:17:42, 4.50s/it] {'loss': 0.1373, 'grad_norm': 0.7985875635412174, 'learning_rate': 9.446281143979038e-06, 'epoch': 0.18} 18%|█▊ | 6057/34278 [6:43:45<35:17:42, 4.50s/it] 18%|█▊ | 6058/34278 [6:43:49<32:28:31, 4.14s/it] {'loss': 0.1828, 'grad_norm': 0.8723456676721568, 'learning_rate': 9.446065028307472e-06, 'epoch': 0.18} 18%|█▊ | 6058/34278 [6:43:49<32:28:31, 4.14s/it] 18%|█▊ | 6059/34278 [6:43:53<31:56:39, 4.08s/it] {'loss': 0.1608, 'grad_norm': 0.8644260320732804, 'learning_rate': 9.4458488729425e-06, 'epoch': 
0.18} 18%|█▊ | 6059/34278 [6:43:53<31:56:39, 4.08s/it] 18%|█▊ | 6060/34278 [6:43:57<32:33:17, 4.15s/it] {'loss': 0.1579, 'grad_norm': 0.8467008843915851, 'learning_rate': 9.44563267788605e-06, 'epoch': 0.18} 18%|█▊ | 6060/34278 [6:43:57<32:33:17, 4.15s/it] 18%|█▊ | 6061/34278 [6:44:01<32:35:55, 4.16s/it] {'loss': 0.176, 'grad_norm': 0.8381171799356435, 'learning_rate': 9.445416443140052e-06, 'epoch': 0.18} 18%|█▊ | 6061/34278 [6:44:01<32:35:55, 4.16s/it] 18%|█▊ | 6062/34278 [6:44:04<30:02:06, 3.83s/it] {'loss': 0.1741, 'grad_norm': 0.6937196730004178, 'learning_rate': 9.445200168706438e-06, 'epoch': 0.18} 18%|█▊ | 6062/34278 [6:44:04<30:02:06, 3.83s/it] 18%|█▊ | 6063/34278 [6:44:07<27:37:24, 3.52s/it] {'loss': 0.1656, 'grad_norm': 1.3537163352995736, 'learning_rate': 9.444983854587138e-06, 'epoch': 0.18} 18%|█▊ | 6063/34278 [6:44:07<27:37:24, 3.52s/it] 18%|█▊ | 6064/34278 [6:44:13<33:33:10, 4.28s/it] {'loss': 0.1757, 'grad_norm': 0.883871849743838, 'learning_rate': 9.444767500784084e-06, 'epoch': 0.18} 18%|█▊ | 6064/34278 [6:44:13<33:33:10, 4.28s/it] 18%|█▊ | 6065/34278 [6:44:19<37:50:27, 4.83s/it] {'loss': 0.1602, 'grad_norm': 0.7363846114493716, 'learning_rate': 9.444551107299205e-06, 'epoch': 0.18} 18%|█▊ | 6065/34278 [6:44:19<37:50:27, 4.83s/it] 18%|█▊ | 6066/34278 [6:44:23<35:18:28, 4.51s/it] {'loss': 0.1573, 'grad_norm': 0.8035527678014244, 'learning_rate': 9.444334674134437e-06, 'epoch': 0.18} 18%|█▊ | 6066/34278 [6:44:23<35:18:28, 4.51s/it] 18%|█▊ | 6067/34278 [6:44:26<31:42:41, 4.05s/it] {'loss': 0.1568, 'grad_norm': 0.9608067326938806, 'learning_rate': 9.444118201291707e-06, 'epoch': 0.18} 18%|█▊ | 6067/34278 [6:44:26<31:42:41, 4.05s/it] 18%|█▊ | 6068/34278 [6:44:29<29:54:47, 3.82s/it] {'loss': 0.1816, 'grad_norm': 0.957312868090363, 'learning_rate': 9.443901688772953e-06, 'epoch': 0.18} 18%|█▊ | 6068/34278 [6:44:29<29:54:47, 3.82s/it] 18%|█▊ | 6069/34278 [6:44:32<28:38:27, 3.66s/it] {'loss': 0.1401, 'grad_norm': 0.8634941496582655, 'learning_rate': 
9.443685136580105e-06, 'epoch': 0.18} 18%|█▊ | 6069/34278 [6:44:32<28:38:27, 3.66s/it] 18%|█▊ | 6070/34278 [6:44:35<26:49:37, 3.42s/it] {'loss': 0.1834, 'grad_norm': 0.8787652676909244, 'learning_rate': 9.443468544715097e-06, 'epoch': 0.18} 18%|█▊ | 6070/34278 [6:44:35<26:49:37, 3.42s/it] 18%|█▊ | 6071/34278 [6:44:39<27:59:38, 3.57s/it] {'loss': 0.1573, 'grad_norm': 0.8212803919043005, 'learning_rate': 9.443251913179862e-06, 'epoch': 0.18} 18%|█▊ | 6071/34278 [6:44:39<27:59:38, 3.57s/it] 18%|█▊ | 6072/34278 [6:44:44<29:58:02, 3.82s/it] {'loss': 0.1568, 'grad_norm': 0.73746500851273, 'learning_rate': 9.443035241976335e-06, 'epoch': 0.18} 18%|█▊ | 6072/34278 [6:44:44<29:58:02, 3.82s/it] 18%|█▊ | 6073/34278 [6:44:47<27:56:20, 3.57s/it] {'loss': 0.1791, 'grad_norm': 0.7926757313992104, 'learning_rate': 9.442818531106451e-06, 'epoch': 0.18} 18%|█▊ | 6073/34278 [6:44:47<27:56:20, 3.57s/it] 18%|█▊ | 6074/34278 [6:44:52<33:19:48, 4.25s/it] {'loss': 0.1981, 'grad_norm': 0.9429842205284127, 'learning_rate': 9.442601780572141e-06, 'epoch': 0.18} 18%|█▊ | 6074/34278 [6:44:52<33:19:48, 4.25s/it] 18%|█▊ | 6075/34278 [6:44:57<33:53:23, 4.33s/it] {'loss': 0.1437, 'grad_norm': 0.833712576794287, 'learning_rate': 9.442384990375344e-06, 'epoch': 0.18} 18%|█▊ | 6075/34278 [6:44:57<33:53:23, 4.33s/it] 18%|█▊ | 6076/34278 [6:45:03<37:58:06, 4.85s/it] {'loss': 0.1653, 'grad_norm': 0.7937003278803161, 'learning_rate': 9.442168160517995e-06, 'epoch': 0.18} 18%|█▊ | 6076/34278 [6:45:03<37:58:06, 4.85s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 18%|█▊ | 6077/34278 [6:45:08<37:46:24, 4.82s/it] {'loss': 0.1773, 'grad_norm': 0.9252279333681923, 'learning_rate': 9.44195129100203e-06, 'epoch': 0.18} 18%|█▊ | 6077/34278 [6:45:08<37:46:24, 4.82s/it] 18%|█▊ | 6078/34278 [6:45:11<33:08:52, 4.23s/it] {'loss': 0.1501, 'grad_norm': 1.01782313691626, 'learning_rate': 9.441734381829382e-06, 'epoch': 0.18} 18%|█▊ | 6078/34278 [6:45:11<33:08:52, 4.23s/it] 18%|█▊ | 6079/34278 [6:45:14<32:01:20, 4.09s/it] {'loss': 0.1502, 'grad_norm': 0.9169277443816802, 'learning_rate': 9.441517433001992e-06, 'epoch': 0.18} 18%|█▊ | 6079/34278 [6:45:14<32:01:20, 4.09s/it] 18%|█▊ | 6080/34278 [6:45:17<29:36:22, 3.78s/it] {'loss': 0.1448, 'grad_norm': 0.6476156203642134, 'learning_rate': 9.441300444521792e-06, 'epoch': 0.18} 18%|█▊ | 6080/34278 [6:45:17<29:36:22, 3.78s/it] 18%|█▊ | 6081/34278 [6:45:22<31:20:13, 4.00s/it] {'loss': 0.162, 'grad_norm': 0.7252667035537854, 'learning_rate': 9.441083416390725e-06, 'epoch': 0.18} 18%|█▊ | 6081/34278 [6:45:22<31:20:13, 4.00s/it] 18%|█▊ | 6082/34278 [6:45:27<34:01:46, 4.34s/it] {'loss': 0.1607, 'grad_norm': 0.9443395614628517, 'learning_rate': 9.440866348610723e-06, 'epoch': 0.18} 18%|█▊ | 6082/34278 [6:45:27<34:01:46, 4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 18%|█▊ | 6083/34278 [6:45:31<32:46:13, 4.18s/it] {'loss': 0.1833, 'grad_norm': 0.7600945171755237, 'learning_rate': 9.440649241183727e-06, 'epoch': 0.18} 18%|█▊ | 6083/34278 [6:45:31<32:46:13, 4.18s/it] 18%|█▊ | 6084/34278 [6:45:34<29:58:33, 3.83s/it] {'loss': 0.1689, 'grad_norm': 0.931479269840474, 'learning_rate': 9.440432094111675e-06, 'epoch': 0.18} 18%|█▊ | 6084/34278 [6:45:34<29:58:33, 3.83s/it] 18%|█▊ | 6085/34278 [6:45:37<28:42:46, 3.67s/it] {'loss': 0.1665, 'grad_norm': 0.7627085732521647, 'learning_rate': 9.440214907396506e-06, 'epoch': 0.18} 18%|█▊ | 6085/34278 [6:45:37<28:42:46, 3.67s/it] 18%|█▊ | 6086/34278 [6:45:43<34:16:03, 4.38s/it] {'loss': 0.1539, 'grad_norm': 0.7402417189600208, 'learning_rate': 9.439997681040156e-06, 'epoch': 0.18} 18%|█▊ | 6086/34278 [6:45:43<34:16:03, 4.38s/it] 18%|█▊ | 6087/34278 [6:45:50<39:42:20, 5.07s/it] {'loss': 0.1491, 'grad_norm': 0.7909033891310475, 'learning_rate': 9.439780415044568e-06, 'epoch': 0.18} 18%|█▊ | 6087/34278 [6:45:50<39:42:20, 5.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 18%|█▊ | 6088/34278 [6:45:53<34:27:31, 4.40s/it] {'loss': 0.1773, 'grad_norm': 0.9789034030402775, 'learning_rate': 9.439563109411682e-06, 'epoch': 0.18} 18%|█▊ | 6088/34278 [6:45:53<34:27:31, 4.40s/it] 18%|█▊ | 6089/34278 [6:45:57<33:35:17, 4.29s/it] {'loss': 0.1579, 'grad_norm': 1.0910629320412395, 'learning_rate': 9.439345764143434e-06, 'epoch': 0.18} 18%|█▊ | 6089/34278 [6:45:57<33:35:17, 4.29s/it] 18%|█▊ | 6090/34278 [6:46:00<31:00:49, 3.96s/it] {'loss': 0.1447, 'grad_norm': 0.8578377522873836, 'learning_rate': 9.439128379241767e-06, 'epoch': 0.18} 18%|█▊ | 6090/34278 [6:46:00<31:00:49, 3.96s/it] 18%|█▊ | 6091/34278 [6:46:06<36:04:52, 4.61s/it] {'loss': 0.1731, 'grad_norm': 1.120834663212433, 'learning_rate': 9.438910954708622e-06, 'epoch': 0.18} 18%|█▊ | 6091/34278 [6:46:06<36:04:52, 4.61s/it] 18%|█▊ | 6092/34278 [6:46:10<34:45:12, 4.44s/it] {'loss': 0.1675, 'grad_norm': 0.7903505977482612, 'learning_rate': 9.43869349054594e-06, 'epoch': 0.18} 18%|█▊ | 6092/34278 [6:46:10<34:45:12, 4.44s/it] 18%|█▊ | 6093/34278 [6:46:14<32:29:36, 4.15s/it] {'loss': 0.1647, 'grad_norm': 0.8174964146133391, 'learning_rate': 9.438475986755661e-06, 'epoch': 0.18} 18%|█▊ | 6093/34278 [6:46:14<32:29:36, 4.15s/it] 18%|█▊ | 6094/34278 [6:46:16<29:30:54, 3.77s/it] {'loss': 0.1748, 'grad_norm': 0.6848338727131013, 'learning_rate': 9.438258443339729e-06, 'epoch': 0.18} 18%|█▊ | 6094/34278 [6:46:16<29:30:54, 3.77s/it] 18%|█▊ | 6095/34278 [6:46:20<28:16:16, 3.61s/it] {'loss': 0.18, 'grad_norm': 0.8040857348148285, 'learning_rate': 9.438040860300085e-06, 'epoch': 0.18} 18%|█▊ | 6095/34278 [6:46:20<28:16:16, 3.61s/it] 18%|█▊ | 6096/34278 [6:46:23<27:15:40, 3.48s/it] {'loss': 0.1864, 'grad_norm': 0.9080380906748584, 'learning_rate': 9.437823237638672e-06, 'epoch': 0.18} 18%|█▊ | 6096/34278 [6:46:23<27:15:40, 3.48s/it] 18%|█▊ | 6097/34278 [6:46:27<27:39:57, 3.53s/it] {'loss': 0.1682, 'grad_norm': 0.7563824475411975, 'learning_rate': 9.43760557535743e-06, 
'epoch': 0.18} 18%|█▊ | 6097/34278 [6:46:27<27:39:57, 3.53s/it] 18%|█▊ | 6098/34278 [6:46:29<26:00:10, 3.32s/it] {'loss': 0.1731, 'grad_norm': 0.7708359204805659, 'learning_rate': 9.437387873458308e-06, 'epoch': 0.18} 18%|█▊ | 6098/34278 [6:46:29<26:00:10, 3.32s/it] 18%|█▊ | 6099/34278 [6:46:33<25:54:40, 3.31s/it] {'loss': 0.1842, 'grad_norm': 0.8988494689525325, 'learning_rate': 9.437170131943245e-06, 'epoch': 0.18} 18%|█▊ | 6099/34278 [6:46:33<25:54:40, 3.31s/it] 18%|█▊ | 6100/34278 [6:46:38<30:45:31, 3.93s/it] {'loss': 0.1932, 'grad_norm': 0.881316888063764, 'learning_rate': 9.436952350814187e-06, 'epoch': 0.18} 18%|█▊ | 6100/34278 [6:46:38<30:45:31, 3.93s/it] 18%|█▊ | 6101/34278 [6:46:43<33:44:49, 4.31s/it] {'loss': 0.1842, 'grad_norm': 0.8412463896369021, 'learning_rate': 9.436734530073078e-06, 'epoch': 0.18} 18%|█▊ | 6101/34278 [6:46:43<33:44:49, 4.31s/it] 18%|█▊ | 6102/34278 [6:46:50<38:27:36, 4.91s/it] {'loss': 0.139, 'grad_norm': 0.7500046491642948, 'learning_rate': 9.43651666972186e-06, 'epoch': 0.18} 18%|█▊ | 6102/34278 [6:46:50<38:27:36, 4.91s/it] 18%|█▊ | 6103/34278 [6:46:53<34:14:13, 4.37s/it] {'loss': 0.1901, 'grad_norm': 1.1570837505681435, 'learning_rate': 9.436298769762481e-06, 'epoch': 0.18} 18%|█▊ | 6103/34278 [6:46:53<34:14:13, 4.37s/it] 18%|█▊ | 6104/34278 [6:46:56<31:03:22, 3.97s/it] {'loss': 0.1489, 'grad_norm': 0.7852084294455182, 'learning_rate': 9.436080830196888e-06, 'epoch': 0.18} 18%|█▊ | 6104/34278 [6:46:56<31:03:22, 3.97s/it] 18%|█▊ | 6105/34278 [6:46:59<29:45:58, 3.80s/it] {'loss': 0.1664, 'grad_norm': 0.7920356544226356, 'learning_rate': 9.435862851027023e-06, 'epoch': 0.18} 18%|█▊ | 6105/34278 [6:46:59<29:45:58, 3.80s/it] 18%|█▊ | 6106/34278 [6:47:05<34:47:05, 4.45s/it] {'loss': 0.1818, 'grad_norm': 0.8037907284038206, 'learning_rate': 9.435644832254831e-06, 'epoch': 0.18} 18%|█▊ | 6106/34278 [6:47:05<34:47:05, 4.45s/it] 18%|█▊ | 6107/34278 [6:47:08<31:01:12, 3.96s/it] {'loss': 0.1751, 'grad_norm': 0.9146267883701465, 
'learning_rate': 9.435426773882264e-06, 'epoch': 0.18} 18%|█▊ | 6107/34278 [6:47:08<31:01:12, 3.96s/it] 18%|█▊ | 6108/34278 [6:47:13<34:04:49, 4.36s/it] {'loss': 0.1675, 'grad_norm': 0.8506247433730693, 'learning_rate': 9.435208675911263e-06, 'epoch': 0.18} 18%|█▊ | 6108/34278 [6:47:13<34:04:49, 4.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 18%|█▊ | 6109/34278 [6:47:16<31:16:49, 4.00s/it] {'loss': 0.1766, 'grad_norm': 1.2478055350424464, 'learning_rate': 9.43499053834378e-06, 'epoch': 0.18} 18%|█▊ | 6109/34278 [6:47:16<31:16:49, 4.00s/it] 18%|█▊ | 6110/34278 [6:47:19<29:17:48, 3.74s/it] {'loss': 0.1632, 'grad_norm': 0.7861486485923962, 'learning_rate': 9.434772361181759e-06, 'epoch': 0.18} 18%|█▊ | 6110/34278 [6:47:19<29:17:48, 3.74s/it] 18%|█▊ | 6111/34278 [6:47:24<31:03:05, 3.97s/it] {'loss': 0.1731, 'grad_norm': 0.9131489960481203, 'learning_rate': 9.434554144427148e-06, 'epoch': 0.18} 18%|█▊ | 6111/34278 [6:47:24<31:03:05, 3.97s/it] 18%|█▊ | 6112/34278 [6:47:27<29:41:16, 3.79s/it] {'loss': 0.1839, 'grad_norm': 1.1593414358106822, 'learning_rate': 9.434335888081898e-06, 'epoch': 0.18} 18%|█▊ | 6112/34278 [6:47:27<29:41:16, 3.79s/it] 18%|█▊ | 6113/34278 [6:47:30<28:05:00, 3.59s/it] {'loss': 0.1533, 'grad_norm': 0.847344521852335, 'learning_rate': 9.434117592147955e-06, 'epoch': 0.18} 18%|█▊ | 6113/34278 [6:47:30<28:05:00, 3.59s/it] 18%|█▊ | 6114/34278 [6:47:33<26:24:56, 3.38s/it] {'loss': 0.1768, 'grad_norm': 1.0958624282003797, 'learning_rate': 9.43389925662727e-06, 'epoch': 0.18} 18%|█▊ | 6114/34278 [6:47:33<26:24:56, 3.38s/it] 18%|█▊ | 6115/34278 [6:47:39<32:18:08, 4.13s/it] {'loss': 0.1707, 'grad_norm': 1.2150248763347455, 'learning_rate': 9.433680881521789e-06, 'epoch': 0.18} 18%|█▊ | 6115/34278 [6:47:39<32:18:08, 4.13s/it] 18%|█▊ | 6116/34278 [6:47:43<30:42:36, 3.93s/it] 
{'loss': 0.169, 'grad_norm': 0.898320883463539, 'learning_rate': 9.433462466833462e-06, 'epoch': 0.18} 18%|█▊ | 6116/34278 [6:47:43<30:42:36, 3.93s/it] 18%|█▊ | 6117/34278 [6:47:47<31:24:51, 4.02s/it] {'loss': 0.1774, 'grad_norm': 1.0818057778123038, 'learning_rate': 9.433244012564245e-06, 'epoch': 0.18} 18%|█▊ | 6117/34278 [6:47:47<31:24:51, 4.02s/it] 18%|█▊ | 6118/34278 [6:47:50<29:39:26, 3.79s/it] {'loss': 0.1719, 'grad_norm': 1.093611499526309, 'learning_rate': 9.433025518716081e-06, 'epoch': 0.18} 18%|█▊ | 6118/34278 [6:47:50<29:39:26, 3.79s/it] 18%|█▊ | 6119/34278 [6:47:56<35:03:35, 4.48s/it] {'loss': 0.1773, 'grad_norm': 1.04629029446073, 'learning_rate': 9.432806985290924e-06, 'epoch': 0.18} 18%|█▊ | 6119/34278 [6:47:56<35:03:35, 4.48s/it] 18%|█▊ | 6120/34278 [6:48:00<32:32:09, 4.16s/it] {'loss': 0.1721, 'grad_norm': 0.9329986990371699, 'learning_rate': 9.432588412290725e-06, 'epoch': 0.18} 18%|█▊ | 6120/34278 [6:48:00<32:32:09, 4.16s/it] 18%|█▊ | 6121/34278 [6:48:04<33:32:03, 4.29s/it] {'loss': 0.1983, 'grad_norm': 0.8700545346597031, 'learning_rate': 9.432369799717434e-06, 'epoch': 0.18} 18%|█▊ | 6121/34278 [6:48:04<33:32:03, 4.29s/it] 18%|█▊ | 6122/34278 [6:48:07<30:59:02, 3.96s/it] {'loss': 0.1828, 'grad_norm': 1.0542415237391032, 'learning_rate': 9.432151147573003e-06, 'epoch': 0.18} 18%|█▊ | 6122/34278 [6:48:07<30:59:02, 3.96s/it] 18%|█▊ | 6123/34278 [6:48:12<32:00:29, 4.09s/it] {'loss': 0.1351, 'grad_norm': 0.838955245856994, 'learning_rate': 9.431932455859384e-06, 'epoch': 0.18} 18%|█▊ | 6123/34278 [6:48:12<32:00:29, 4.09s/it] 18%|█▊ | 6124/34278 [6:48:16<31:35:06, 4.04s/it] {'loss': 0.2086, 'grad_norm': 1.3967139817878313, 'learning_rate': 9.431713724578531e-06, 'epoch': 0.18} 18%|█▊ | 6124/34278 [6:48:16<31:35:06, 4.04s/it] 18%|█▊ | 6125/34278 [6:48:19<28:51:39, 3.69s/it] {'loss': 0.1567, 'grad_norm': 0.9763115375116108, 'learning_rate': 9.431494953732396e-06, 'epoch': 0.18} 18%|█▊ | 6125/34278 [6:48:19<28:51:39, 3.69s/it] 18%|█▊ | 6126/34278 
[6:48:23<31:21:27, 4.01s/it] {'loss': 0.1768, 'grad_norm': 0.849450274946255, 'learning_rate': 9.431276143322933e-06, 'epoch': 0.18} 18%|█▊ | 6126/34278 [6:48:23<31:21:27, 4.01s/it] 18%|█▊ | 6127/34278 [6:48:30<37:32:52, 4.80s/it] {'loss': 0.1791, 'grad_norm': 1.176015247895556, 'learning_rate': 9.431057293352093e-06, 'epoch': 0.18} 18%|█▊ | 6127/34278 [6:48:30<37:32:52, 4.80s/it] 18%|█▊ | 6128/34278 [6:48:34<34:51:01, 4.46s/it] {'loss': 0.2019, 'grad_norm': 1.2749422903424745, 'learning_rate': 9.430838403821831e-06, 'epoch': 0.18} 18%|█▊ | 6128/34278 [6:48:34<34:51:01, 4.46s/it] 18%|█▊ | 6129/34278 [6:48:37<32:41:59, 4.18s/it] {'loss': 0.1769, 'grad_norm': 0.9438674709566486, 'learning_rate': 9.430619474734102e-06, 'epoch': 0.18} 18%|█▊ | 6129/34278 [6:48:37<32:41:59, 4.18s/it] 18%|█▊ | 6130/34278 [6:48:41<31:17:35, 4.00s/it] {'loss': 0.2025, 'grad_norm': 0.8042485643091488, 'learning_rate': 9.43040050609086e-06, 'epoch': 0.18} 18%|█▊ | 6130/34278 [6:48:41<31:17:35, 4.00s/it] 18%|█▊ | 6131/34278 [6:48:44<28:51:37, 3.69s/it] {'loss': 0.1685, 'grad_norm': 1.0309599888690277, 'learning_rate': 9.43018149789406e-06, 'epoch': 0.18} 18%|█▊ | 6131/34278 [6:48:44<28:51:37, 3.69s/it] 18%|█▊ | 6132/34278 [6:48:47<27:06:49, 3.47s/it] {'loss': 0.1466, 'grad_norm': 0.7531252493687638, 'learning_rate': 9.429962450145657e-06, 'epoch': 0.18} 18%|█▊ | 6132/34278 [6:48:47<27:06:49, 3.47s/it] 18%|█▊ | 6133/34278 [6:48:51<29:07:30, 3.73s/it] {'loss': 0.1838, 'grad_norm': 0.8056546655740853, 'learning_rate': 9.429743362847608e-06, 'epoch': 0.18} 18%|█▊ | 6133/34278 [6:48:51<29:07:30, 3.73s/it] 18%|█▊ | 6134/34278 [6:48:54<27:30:46, 3.52s/it] {'loss': 0.2005, 'grad_norm': 0.9787697867974474, 'learning_rate': 9.429524236001866e-06, 'epoch': 0.18} 18%|█▊ | 6134/34278 [6:48:54<27:30:46, 3.52s/it] 18%|█▊ | 6135/34278 [6:48:57<26:15:04, 3.36s/it] {'loss': 0.1482, 'grad_norm': 0.7721888307938807, 'learning_rate': 9.429305069610389e-06, 'epoch': 0.18} 18%|█▊ | 6135/34278 [6:48:57<26:15:04, 
3.36s/it] 18%|█▊ | 6136/34278 [6:49:00<26:14:06, 3.36s/it] {'loss': 0.1672, 'grad_norm': 0.7173122780538939, 'learning_rate': 9.429085863675135e-06, 'epoch': 0.18} 18%|█▊ | 6136/34278 [6:49:00<26:14:06, 3.36s/it] 18%|█▊ | 6137/34278 [6:49:04<26:46:46, 3.43s/it] {'loss': 0.17, 'grad_norm': 0.7886093796379801, 'learning_rate': 9.42886661819806e-06, 'epoch': 0.18} 18%|█▊ | 6137/34278 [6:49:04<26:46:46, 3.43s/it] 18%|█▊ | 6138/34278 [6:49:07<26:09:11, 3.35s/it] {'loss': 0.148, 'grad_norm': 0.6716444684628302, 'learning_rate': 9.42864733318112e-06, 'epoch': 0.18} 18%|█▊ | 6138/34278 [6:49:07<26:09:11, 3.35s/it] 18%|█▊ | 6139/34278 [6:49:13<32:25:49, 4.15s/it] {'loss': 0.1501, 'grad_norm': 0.7653181569944403, 'learning_rate': 9.428428008626274e-06, 'epoch': 0.18} 18%|█▊ | 6139/34278 [6:49:13<32:25:49, 4.15s/it] 18%|█▊ | 6140/34278 [6:49:19<36:29:55, 4.67s/it] {'loss': 0.1726, 'grad_norm': 0.7644070134856724, 'learning_rate': 9.42820864453548e-06, 'epoch': 0.18} 18%|█▊ | 6140/34278 [6:49:19<36:29:55, 4.67s/it] 18%|█▊ | 6141/34278 [6:49:24<36:49:58, 4.71s/it] {'loss': 0.1565, 'grad_norm': 0.9368679198269921, 'learning_rate': 9.427989240910695e-06, 'epoch': 0.18} 18%|█▊ | 6141/34278 [6:49:24<36:49:58, 4.71s/it] 18%|█▊ | 6142/34278 [6:49:30<40:22:50, 5.17s/it] {'loss': 0.1487, 'grad_norm': 0.6316129360047353, 'learning_rate': 9.42776979775388e-06, 'epoch': 0.18} 18%|█▊ | 6142/34278 [6:49:30<40:22:50, 5.17s/it] 18%|█▊ | 6143/34278 [6:49:34<38:05:06, 4.87s/it] {'loss': 0.1758, 'grad_norm': 0.8383230480926974, 'learning_rate': 9.427550315066994e-06, 'epoch': 0.18} 18%|█▊ | 6143/34278 [6:49:34<38:05:06, 4.87s/it] 18%|█▊ | 6144/34278 [6:49:38<34:42:49, 4.44s/it] {'loss': 0.1881, 'grad_norm': 0.9890782465178287, 'learning_rate': 9.427330792851996e-06, 'epoch': 0.18} 18%|█▊ | 6144/34278 [6:49:38<34:42:49, 4.44s/it] 18%|█▊ | 6145/34278 [6:49:41<31:40:37, 4.05s/it] {'loss': 0.1745, 'grad_norm': 0.7080945023913502, 'learning_rate': 9.427111231110844e-06, 'epoch': 0.18} 18%|█▊ | 
6145/34278 [6:49:41<31:40:37, 4.05s/it] 18%|█▊ | 6146/34278 [6:49:44<29:17:43, 3.75s/it] {'loss': 0.1557, 'grad_norm': 0.7666605079265063, 'learning_rate': 9.4268916298455e-06, 'epoch': 0.18} 18%|█▊ | 6146/34278 [6:49:44<29:17:43, 3.75s/it] 18%|█▊ | 6147/34278 [6:49:48<29:29:47, 3.77s/it] {'loss': 0.1538, 'grad_norm': 0.8405628010532938, 'learning_rate': 9.426671989057926e-06, 'epoch': 0.18} 18%|█▊ | 6147/34278 [6:49:48<29:29:47, 3.77s/it] 18%|█▊ | 6148/34278 [6:49:51<27:50:13, 3.56s/it] {'loss': 0.1742, 'grad_norm': 0.8001625757953447, 'learning_rate': 9.42645230875008e-06, 'epoch': 0.18} 18%|█▊ | 6148/34278 [6:49:51<27:50:13, 3.56s/it] 18%|█▊ | 6149/34278 [6:49:54<26:08:13, 3.35s/it] {'loss': 0.1515, 'grad_norm': 0.9332369139694799, 'learning_rate': 9.426232588923925e-06, 'epoch': 0.18} 18%|█▊ | 6149/34278 [6:49:54<26:08:13, 3.35s/it] 18%|█▊ | 6150/34278 [6:49:56<24:53:05, 3.18s/it] {'loss': 0.1644, 'grad_norm': 0.8351790934983703, 'learning_rate': 9.426012829581421e-06, 'epoch': 0.18} 18%|█▊ | 6150/34278 [6:49:56<24:53:05, 3.18s/it] 18%|█▊ | 6151/34278 [6:50:02<29:52:50, 3.82s/it] {'loss': 0.182, 'grad_norm': 0.7994829000612014, 'learning_rate': 9.42579303072453e-06, 'epoch': 0.18} 18%|█▊ | 6151/34278 [6:50:02<29:52:50, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 18%|█▊ | 6152/34278 [6:50:05<28:02:22, 3.59s/it] {'loss': 0.1886, 'grad_norm': 0.959677623640591, 'learning_rate': 9.425573192355219e-06, 'epoch': 0.18} 18%|█▊ | 6152/34278 [6:50:05<28:02:22, 3.59s/it] 18%|█▊ | 6153/34278 [6:50:10<32:06:40, 4.11s/it] {'loss': 0.1647, 'grad_norm': 0.9195168526872746, 'learning_rate': 9.425353314475445e-06, 'epoch': 0.18} 18%|█▊ | 6153/34278 [6:50:10<32:06:40, 4.11s/it] 18%|█▊ | 6154/34278 [6:50:14<31:29:16, 4.03s/it] {'loss': 0.158, 'grad_norm': 0.711177053303128, 'learning_rate': 9.425133397087171e-06, 'epoch': 0.18} 18%|█▊ | 6154/34278 [6:50:14<31:29:16, 4.03s/it] 18%|█▊ | 6155/34278 [6:50:18<30:15:17, 3.87s/it] {'loss': 0.155, 'grad_norm': 0.6299993751647488, 'learning_rate': 9.424913440192366e-06, 'epoch': 0.18} 18%|█▊ | 6155/34278 [6:50:18<30:15:17, 3.87s/it] 18%|█▊ | 6156/34278 [6:50:23<34:26:45, 4.41s/it] {'loss': 0.1545, 'grad_norm': 0.7729505405505678, 'learning_rate': 9.424693443792988e-06, 'epoch': 0.18} 18%|█▊ | 6156/34278 [6:50:23<34:26:45, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 18%|█▊ | 6157/34278 [6:50:29<37:11:37, 4.76s/it] {'loss': 0.1654, 'grad_norm': 1.0723102235306599, 'learning_rate': 9.424473407891003e-06, 'epoch': 0.18} 18%|█▊ | 6157/34278 [6:50:29<37:11:37, 4.76s/it] 18%|█▊ | 6158/34278 [6:50:32<33:28:18, 4.29s/it] {'loss': 0.1494, 'grad_norm': 0.7710784361163612, 'learning_rate': 9.424253332488377e-06, 'epoch': 0.18} 18%|█▊ | 6158/34278 [6:50:32<33:28:18, 4.29s/it] 18%|█▊ | 6159/34278 [6:50:35<30:35:24, 3.92s/it] {'loss': 0.1604, 'grad_norm': 0.8657758464748335, 'learning_rate': 9.424033217587072e-06, 'epoch': 0.18} 18%|█▊ | 6159/34278 [6:50:35<30:35:24, 3.92s/it] 18%|█▊ | 6160/34278 [6:50:38<28:38:28, 3.67s/it] {'loss': 0.1773, 'grad_norm': 0.9032855261951235, 'learning_rate': 9.423813063189056e-06, 'epoch': 0.18} 18%|█▊ | 6160/34278 [6:50:38<28:38:28, 3.67s/it] 18%|█▊ | 6161/34278 [6:50:42<29:19:47, 3.76s/it] {'loss': 0.1609, 'grad_norm': 0.8345559494797617, 'learning_rate': 9.423592869296292e-06, 'epoch': 0.18} 18%|█▊ | 6161/34278 [6:50:42<29:19:47, 3.76s/it] 18%|█▊ | 6162/34278 [6:50:45<28:04:54, 3.60s/it] {'loss': 0.1889, 'grad_norm': 0.8849683920236567, 'learning_rate': 9.423372635910748e-06, 'epoch': 0.18} 18%|█▊ | 6162/34278 [6:50:45<28:04:54, 3.60s/it] 18%|█▊ | 6163/34278 [6:50:49<27:28:00, 3.52s/it] {'loss': 0.174, 'grad_norm': 0.9609974502826502, 'learning_rate': 9.42315236303439e-06, 'epoch': 0.18} 18%|█▊ | 6163/34278 [6:50:49<27:28:00, 3.52s/it] 18%|█▊ | 6164/34278 [6:50:53<28:33:48, 3.66s/it] {'loss': 0.1553, 'grad_norm': 0.9045587451641919, 'learning_rate': 9.42293205066918e-06, 'epoch': 0.18} 18%|█▊ | 6164/34278 [6:50:53<28:33:48, 3.66s/it] 18%|█▊ | 6165/34278 [6:50:57<29:22:28, 3.76s/it] {'loss': 0.1562, 'grad_norm': 0.9235731750614937, 'learning_rate': 9.422711698817091e-06, 'epoch': 0.18} 18%|█▊ | 6165/34278 [6:50:57<29:22:28, 3.76s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in 
__getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f62455d2890> Failed to fetch sample 3110974. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f62455d2890> 18%|█▊ | 6166/34278 [6:51:00<28:44:01, 3.68s/it] {'loss': 0.1538, 'grad_norm': 0.9983722343381372, 'learning_rate': 9.422491307480085e-06, 'epoch': 0.18} 18%|█▊ | 6166/34278 [6:51:00<28:44:01, 3.68s/it] 18%|█▊ | 6167/34278 [6:51:04<28:08:53, 3.60s/it] {'loss': 0.1516, 'grad_norm': 1.040014406683864, 'learning_rate': 9.422270876660136e-06, 'epoch': 0.18} 18%|█▊ | 6167/34278 [6:51:04<28:08:53, 3.60s/it] 18%|█▊ | 6168/34278 [6:51:07<27:31:14, 3.52s/it] {'loss': 0.1742, 'grad_norm': 0.907915871955659, 'learning_rate': 9.422050406359207e-06, 'epoch': 0.18} 18%|█▊ | 6168/34278 [6:51:07<27:31:14, 3.52s/it] 18%|█▊ | 6169/34278 [6:51:13<33:32:03, 4.29s/it] {'loss': 0.1555, 'grad_norm': 0.8433086264631569, 'learning_rate': 9.421829896579267e-06, 'epoch': 0.18} 18%|█▊ | 6169/34278 [6:51:13<33:32:03, 4.29s/it] 18%|█▊ | 6170/34278 [6:51:16<31:47:51, 4.07s/it] {'loss': 0.1832, 'grad_norm': 1.1187874230437564, 'learning_rate': 9.421609347322285e-06, 'epoch': 0.18} 18%|█▊ | 6170/34278 [6:51:16<31:47:51, 4.07s/it] 18%|█▊ | 6171/34278 [6:51:23<36:42:48, 4.70s/it] {'loss': 0.1468, 'grad_norm': 0.8590340789382764, 'learning_rate': 9.42138875859023e-06, 'epoch': 0.18} 18%|█▊ | 6171/34278 [6:51:23<36:42:48, 4.70s/it] 18%|█▊ | 6172/34278 [6:51:29<40:37:20, 5.20s/it] {'loss': 0.1675, 'grad_norm': 0.7310620734230633, 'learning_rate': 9.421168130385074e-06, 'epoch': 0.18} 18%|█▊ | 6172/34278 [6:51:29<40:37:20, 5.20s/it] 18%|█▊ | 6173/34278 [6:51:33<36:56:51, 4.73s/it] {'loss': 0.171, 'grad_norm': 0.9963718411069702, 'learning_rate': 9.420947462708783e-06, 'epoch': 0.18} 18%|█▊ | 6173/34278 [6:51:33<36:56:51, 4.73s/it] 18%|█▊ | 6174/34278 [6:51:36<33:03:55, 4.24s/it] {'loss': 0.1751, 'grad_norm': 0.9020944829065396, 'learning_rate': 9.420726755563327e-06, 'epoch': 0.18} 18%|█▊ | 6174/34278 [6:51:36<33:03:55, 4.24s/it] 18%|█▊ | 6175/34278 [6:51:40<32:48:15, 4.20s/it] {'loss': 0.1562, 'grad_norm': 0.7587580020082834, 
'learning_rate': 9.42050600895068e-06, 'epoch': 0.18} 18%|█▊ | 6175/34278 [6:51:40<32:48:15, 4.20s/it] 18%|█▊ | 6176/34278 [6:51:43<30:35:09, 3.92s/it] {'loss': 0.1786, 'grad_norm': 0.922642096759342, 'learning_rate': 9.42028522287281e-06, 'epoch': 0.18} 18%|█▊ | 6176/34278 [6:51:43<30:35:09, 3.92s/it] 18%|█▊ | 6177/34278 [6:51:49<35:10:16, 4.51s/it] {'loss': 0.1998, 'grad_norm': 0.8950283400370009, 'learning_rate': 9.420064397331688e-06, 'epoch': 0.18} 18%|█▊ | 6177/34278 [6:51:49<35:10:16, 4.51s/it] 18%|█▊ | 6178/34278 [6:51:52<32:13:50, 4.13s/it] {'loss': 0.147, 'grad_norm': 0.7929709305026523, 'learning_rate': 9.419843532329287e-06, 'epoch': 0.18} 18%|█▊ | 6178/34278 [6:51:52<32:13:50, 4.13s/it] 18%|█▊ | 6179/34278 [6:51:58<36:22:37, 4.66s/it] {'loss': 0.1641, 'grad_norm': 1.2104020486551266, 'learning_rate': 9.419622627867577e-06, 'epoch': 0.18} 18%|█▊ | 6179/34278 [6:51:58<36:22:37, 4.66s/it] 18%|█▊ | 6180/34278 [6:52:02<33:27:35, 4.29s/it] {'loss': 0.1713, 'grad_norm': 0.8553677422338132, 'learning_rate': 9.419401683948533e-06, 'epoch': 0.18} 18%|█▊ | 6180/34278 [6:52:02<33:27:35, 4.29s/it] 18%|█▊ | 6181/34278 [6:52:05<30:52:04, 3.96s/it] {'loss': 0.186, 'grad_norm': 0.8737367582203313, 'learning_rate': 9.419180700574123e-06, 'epoch': 0.18} 18%|█▊ | 6181/34278 [6:52:05<30:52:04, 3.96s/it] 18%|█▊ | 6182/34278 [6:52:11<35:46:10, 4.58s/it] {'loss': 0.1863, 'grad_norm': 0.8067083291331099, 'learning_rate': 9.418959677746325e-06, 'epoch': 0.18} 18%|█▊ | 6182/34278 [6:52:11<35:46:10, 4.58s/it] 18%|█▊ | 6183/34278 [6:52:14<32:45:45, 4.20s/it] {'loss': 0.1524, 'grad_norm': 0.7832841294836334, 'learning_rate': 9.418738615467108e-06, 'epoch': 0.18} 18%|█▊ | 6183/34278 [6:52:14<32:45:45, 4.20s/it] 18%|█▊ | 6184/34278 [6:52:20<37:17:48, 4.78s/it] {'loss': 0.1591, 'grad_norm': 0.9379152920288453, 'learning_rate': 9.41851751373845e-06, 'epoch': 0.18} 18%|█▊ | 6184/34278 [6:52:20<37:17:48, 4.78s/it] 18%|█▊ | 6185/34278 [6:52:24<33:55:14, 4.35s/it] {'loss': 0.173, 
'grad_norm': 0.8878302057374515, 'learning_rate': 9.41829637256232e-06, 'epoch': 0.18} 18%|█▊ | 6185/34278 [6:52:24<33:55:14, 4.35s/it] 18%|█▊ | 6186/34278 [6:52:27<32:05:50, 4.11s/it] {'loss': 0.1816, 'grad_norm': 0.9087940608673092, 'learning_rate': 9.418075191940697e-06, 'epoch': 0.18} 18%|█▊ | 6186/34278 [6:52:27<32:05:50, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 18%|█▊ | 6187/34278 [6:52:31<31:37:10, 4.05s/it] {'loss': 0.1604, 'grad_norm': 0.7582838323836925, 'learning_rate': 9.417853971875553e-06, 'epoch': 0.18} 18%|█▊ | 6187/34278 [6:52:31<31:37:10, 4.05s/it] 18%|█▊ | 6188/34278 [6:52:37<34:59:17, 4.48s/it] {'loss': 0.1652, 'grad_norm': 1.3421748469345602, 'learning_rate': 9.417632712368861e-06, 'epoch': 0.18} 18%|█▊ | 6188/34278 [6:52:37<34:59:17, 4.48s/it] 18%|█▊ | 6189/34278 [6:52:40<31:55:08, 4.09s/it] {'loss': 0.1735, 'grad_norm': 0.818452488353429, 'learning_rate': 9.417411413422601e-06, 'epoch': 0.18} 18%|█▊ | 6189/34278 [6:52:40<31:55:08, 4.09s/it] 18%|█▊ | 6190/34278 [6:52:43<30:40:44, 3.93s/it] {'loss': 0.1565, 'grad_norm': 0.8319082879299589, 'learning_rate': 9.417190075038745e-06, 'epoch': 0.18} 18%|█▊ | 6190/34278 [6:52:43<30:40:44, 3.93s/it] 18%|█▊ | 6191/34278 [6:52:48<32:28:25, 4.16s/it] {'loss': 0.142, 'grad_norm': 0.9572358486376007, 'learning_rate': 9.416968697219272e-06, 'epoch': 0.18} 18%|█▊ | 6191/34278 [6:52:48<32:28:25, 4.16s/it] 18%|█▊ | 6192/34278 [6:52:52<32:34:56, 4.18s/it] {'loss': 0.181, 'grad_norm': 1.0652700399568353, 'learning_rate': 9.416747279966155e-06, 'epoch': 0.18} 18%|█▊ | 6192/34278 [6:52:52<32:34:56, 4.18s/it] 18%|█▊ | 6193/34278 [6:52:56<31:28:16, 4.03s/it] {'loss': 0.1587, 'grad_norm': 0.8093986188290544, 'learning_rate': 9.416525823281375e-06, 'epoch': 0.18} 18%|█▊ | 6193/34278 [6:52:56<31:28:16, 4.03s/it]Token indices sequence 
length is longer than the specified maximum sequence length for this model (8736 > 8192). Running this sequence through the model will result in indexing errors 18%|█▊ | 6194/34278 [6:52:59<28:52:23, 3.70s/it] {'loss': 0.1844, 'grad_norm': 0.8793735767853701, 'learning_rate': 9.416304327166905e-06, 'epoch': 0.18} 18%|█▊ | 6194/34278 [6:52:59<28:52:23, 3.70s/it] 18%|█▊ | 6195/34278 [6:53:03<29:27:16, 3.78s/it] {'loss': 0.1934, 'grad_norm': 0.9718214223440156, 'learning_rate': 9.416082791624726e-06, 'epoch': 0.18} 18%|█▊ | 6195/34278 [6:53:03<29:27:16, 3.78s/it] 18%|█▊ | 6196/34278 [6:53:06<27:26:09, 3.52s/it] {'loss': 0.1361, 'grad_norm': 0.756008513886222, 'learning_rate': 9.415861216656812e-06, 'epoch': 0.18} 18%|█▊ | 6196/34278 [6:53:06<27:26:09, 3.52s/it] 18%|█▊ | 6197/34278 [6:53:11<32:12:48, 4.13s/it] {'loss': 0.1675, 'grad_norm': 0.9071404958173924, 'learning_rate': 9.415639602265144e-06, 'epoch': 0.18} 18%|█▊ | 6197/34278 [6:53:11<32:12:48, 4.13s/it] 18%|█▊ | 6198/34278 [6:53:14<29:43:20, 3.81s/it] {'loss': 0.1699, 'grad_norm': 0.8786545741210041, 'learning_rate': 9.4154179484517e-06, 'epoch': 0.18} 18%|█▊ | 6198/34278 [6:53:14<29:43:20, 3.81s/it] 18%|█▊ | 6199/34278 [6:53:17<27:55:13, 3.58s/it] {'loss': 0.1499, 'grad_norm': 0.9263575442534925, 'learning_rate': 9.415196255218457e-06, 'epoch': 0.18} 18%|█▊ | 6199/34278 [6:53:17<27:55:13, 3.58s/it] 18%|█▊ | 6200/34278 [6:53:20<26:44:48, 3.43s/it] {'loss': 0.1437, 'grad_norm': 0.7796762701779799, 'learning_rate': 9.414974522567398e-06, 'epoch': 0.18} 18%|█▊ | 6200/34278 [6:53:20<26:44:48, 3.43s/it] 18%|█▊ | 6201/34278 [6:53:24<26:56:10, 3.45s/it] {'loss': 0.1977, 'grad_norm': 0.866025414208799, 'learning_rate': 9.414752750500499e-06, 'epoch': 0.18} 18%|█▊ | 6201/34278 [6:53:24<26:56:10, 3.45s/it] 18%|█▊ | 6202/34278 [6:53:28<28:01:50, 3.59s/it] {'loss': 0.1613, 'grad_norm': 1.186550089311323, 'learning_rate': 9.414530939019741e-06, 'epoch': 0.18} 18%|█▊ | 6202/34278 [6:53:28<28:01:50, 3.59s/it] 18%|█▊ | 
6203/34278 [6:53:32<29:23:54, 3.77s/it] {'loss': 0.1958, 'grad_norm': 1.0800366163797837, 'learning_rate': 9.414309088127105e-06, 'epoch': 0.18} 18%|█▊ | 6203/34278 [6:53:32<29:23:54, 3.77s/it] 18%|█▊ | 6204/34278 [6:53:35<27:00:59, 3.46s/it] {'loss': 0.1769, 'grad_norm': 0.9683499345321749, 'learning_rate': 9.414087197824573e-06, 'epoch': 0.18} 18%|█▊ | 6204/34278 [6:53:35<27:00:59, 3.46s/it] 18%|█▊ | 6205/34278 [6:53:38<26:53:40, 3.45s/it] {'loss': 0.165, 'grad_norm': 1.0385735295901457, 'learning_rate': 9.413865268114123e-06, 'epoch': 0.18} 18%|█▊ | 6205/34278 [6:53:38<26:53:40, 3.45s/it] 18%|█▊ | 6206/34278 [6:53:42<27:36:24, 3.54s/it] {'loss': 0.1531, 'grad_norm': 0.929342593065302, 'learning_rate': 9.413643298997736e-06, 'epoch': 0.18} 18%|█▊ | 6206/34278 [6:53:42<27:36:24, 3.54s/it] 18%|█▊ | 6207/34278 [6:53:45<25:52:55, 3.32s/it] {'loss': 0.1556, 'grad_norm': 0.7853337979334546, 'learning_rate': 9.413421290477397e-06, 'epoch': 0.18} 18%|█▊ | 6207/34278 [6:53:45<25:52:55, 3.32s/it] 18%|█▊ | 6208/34278 [6:53:48<25:06:35, 3.22s/it] {'loss': 0.1745, 'grad_norm': 0.8194819974039269, 'learning_rate': 9.413199242555086e-06, 'epoch': 0.18} 18%|█▊ | 6208/34278 [6:53:48<25:06:35, 3.22s/it] 18%|█▊ | 6209/34278 [6:53:51<25:46:24, 3.31s/it] {'loss': 0.154, 'grad_norm': 0.7724744253414053, 'learning_rate': 9.412977155232787e-06, 'epoch': 0.18} 18%|█▊ | 6209/34278 [6:53:51<25:46:24, 3.31s/it] 18%|█▊ | 6210/34278 [6:53:54<25:26:10, 3.26s/it] {'loss': 0.174, 'grad_norm': 0.8542435209665531, 'learning_rate': 9.412755028512478e-06, 'epoch': 0.18} 18%|█▊ | 6210/34278 [6:53:54<25:26:10, 3.26s/it] 18%|█▊ | 6211/34278 [6:53:59<29:00:01, 3.72s/it] {'loss': 0.1657, 'grad_norm': 1.022489064681061, 'learning_rate': 9.412532862396149e-06, 'epoch': 0.18} 18%|█▊ | 6211/34278 [6:53:59<29:00:01, 3.72s/it] 18%|█▊ | 6212/34278 [6:54:02<27:34:08, 3.54s/it] {'loss': 0.2094, 'grad_norm': 0.7578179979500707, 'learning_rate': 9.412310656885779e-06, 'epoch': 0.18} 18%|█▊ | 6212/34278 
[6:54:02<27:34:08, 3.54s/it] 18%|█▊ | 6213/34278 [6:54:08<32:57:24, 4.23s/it] {'loss': 0.1478, 'grad_norm': 0.8060204496253309, 'learning_rate': 9.412088411983352e-06, 'epoch': 0.18} 18%|█▊ | 6213/34278 [6:54:08<32:57:24, 4.23s/it] 18%|█▊ | 6214/34278 [6:54:14<37:46:56, 4.85s/it] {'loss': 0.1604, 'grad_norm': 0.9909780923516098, 'learning_rate': 9.411866127690855e-06, 'epoch': 0.18} 18%|█▊ | 6214/34278 [6:54:14<37:46:56, 4.85s/it] 18%|█▊ | 6215/34278 [6:54:19<36:29:47, 4.68s/it] {'loss': 0.1631, 'grad_norm': 0.7047233474518533, 'learning_rate': 9.411643804010266e-06, 'epoch': 0.18} 18%|█▊ | 6215/34278 [6:54:19<36:29:47, 4.68s/it] 18%|█▊ | 6216/34278 [6:54:22<34:00:23, 4.36s/it] {'loss': 0.1676, 'grad_norm': 0.8575426263434166, 'learning_rate': 9.411421440943577e-06, 'epoch': 0.18} 18%|█▊ | 6216/34278 [6:54:22<34:00:23, 4.36s/it] 18%|█▊ | 6217/34278 [6:54:25<30:38:00, 3.93s/it] {'loss': 0.1601, 'grad_norm': 1.0595142798092263, 'learning_rate': 9.411199038492771e-06, 'epoch': 0.18} 18%|█▊ | 6217/34278 [6:54:25<30:38:00, 3.93s/it] 18%|█▊ | 6218/34278 [6:54:30<32:48:44, 4.21s/it] {'loss': 0.1621, 'grad_norm': 0.8622059520119328, 'learning_rate': 9.410976596659833e-06, 'epoch': 0.18} 18%|█▊ | 6218/34278 [6:54:30<32:48:44, 4.21s/it] 18%|█▊ | 6219/34278 [6:54:33<30:13:31, 3.88s/it] {'loss': 0.1605, 'grad_norm': 0.9781827108687683, 'learning_rate': 9.410754115446747e-06, 'epoch': 0.18} 18%|█▊ | 6219/34278 [6:54:33<30:13:31, 3.88s/it] 18%|█▊ | 6220/34278 [6:54:36<27:57:01, 3.59s/it] {'loss': 0.1858, 'grad_norm': 0.8949547567745179, 'learning_rate': 9.410531594855503e-06, 'epoch': 0.18} 18%|█▊ | 6220/34278 [6:54:36<27:57:01, 3.59s/it] 18%|█▊ | 6221/34278 [6:54:40<29:16:19, 3.76s/it] {'loss': 0.1556, 'grad_norm': 0.9421490204150713, 'learning_rate': 9.410309034888086e-06, 'epoch': 0.18} 18%|█▊ | 6221/34278 [6:54:40<29:16:19, 3.76s/it] 18%|█▊ | 6222/34278 [6:54:44<30:01:13, 3.85s/it] {'loss': 0.1776, 'grad_norm': 0.9222360795511418, 'learning_rate': 9.410086435546481e-06, 
'epoch': 0.18} 18%|█▊ | 6222/34278 [6:54:44<30:01:13, 3.85s/it] 18%|█▊ | 6223/34278 [6:54:48<28:34:07, 3.67s/it] {'loss': 0.1931, 'grad_norm': 0.9042954423969205, 'learning_rate': 9.409863796832679e-06, 'epoch': 0.18} 18%|█▊ | 6223/34278 [6:54:48<28:34:07, 3.67s/it] 18%|█▊ | 6224/34278 [6:54:50<26:20:18, 3.38s/it] {'loss': 0.1938, 'grad_norm': 0.87669143043865, 'learning_rate': 9.409641118748665e-06, 'epoch': 0.18} 18%|█▊ | 6224/34278 [6:54:50<26:20:18, 3.38s/it] 18%|█▊ | 6225/34278 [6:54:54<26:16:04, 3.37s/it] {'loss': 0.1572, 'grad_norm': 0.8911654886162883, 'learning_rate': 9.409418401296429e-06, 'epoch': 0.18} 18%|█▊ | 6225/34278 [6:54:54<26:16:04, 3.37s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 18%|█▊ | 6226/34278 [6:54:57<25:21:11, 3.25s/it] {'loss': 0.1601, 'grad_norm': 0.9490587471244761, 'learning_rate': 9.409195644477955e-06, 'epoch': 0.18} 18%|█▊ | 6226/34278 [6:54:57<25:21:11, 3.25s/it] 18%|█▊ | 6227/34278 [6:55:01<27:39:54, 3.55s/it] {'loss': 0.1904, 'grad_norm': 0.7109438871780286, 'learning_rate': 9.408972848295237e-06, 'epoch': 0.18} 18%|█▊ | 6227/34278 [6:55:01<27:39:54, 3.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 18%|█▊ | 6228/34278 [6:55:04<26:11:18, 3.36s/it] {'loss': 0.1664, 'grad_norm': 0.9835123319739566, 'learning_rate': 9.408750012750262e-06, 'epoch': 0.18} 18%|█▊ | 6228/34278 [6:55:04<26:11:18, 3.36s/it] 18%|█▊ | 6229/34278 [6:55:07<25:43:38, 3.30s/it] {'loss': 0.1545, 'grad_norm': 0.9472216549479102, 'learning_rate': 9.408527137845019e-06, 'epoch': 0.18} 18%|█▊ | 6229/34278 [6:55:07<25:43:38, 3.30s/it] 18%|█▊ | 6230/34278 [6:55:10<25:13:15, 3.24s/it] {'loss': 0.1521, 'grad_norm': 0.6824306815751001, 'learning_rate': 9.408304223581497e-06, 'epoch': 0.18} 18%|█▊ | 6230/34278 [6:55:10<25:13:15, 3.24s/it] 18%|█▊ | 6231/34278 [6:55:13<24:23:44, 3.13s/it] {'loss': 0.2012, 'grad_norm': 0.8559321147756367, 'learning_rate': 9.40808126996169e-06, 'epoch': 0.18} 18%|█▊ | 6231/34278 [6:55:13<24:23:44, 3.13s/it] 18%|█▊ | 6232/34278 [6:55:16<24:05:47, 3.09s/it] {'loss': 0.1629, 'grad_norm': 1.04243711595298, 'learning_rate': 9.407858276987582e-06, 'epoch': 0.18} 18%|█▊ | 6232/34278 [6:55:16<24:05:47, 3.09s/it] 18%|█▊ | 6233/34278 [6:55:19<24:27:02, 3.14s/it] {'loss': 0.1714, 'grad_norm': 0.781091931137516, 'learning_rate': 9.407635244661171e-06, 'epoch': 0.18} 18%|█▊ | 6233/34278 [6:55:19<24:27:02, 3.14s/it] 18%|█▊ | 6234/34278 [6:55:23<25:43:37, 3.30s/it] {'loss': 0.1809, 'grad_norm': 0.7351836000670849, 'learning_rate': 9.407412172984443e-06, 'epoch': 0.18} 18%|█▊ | 6234/34278 [6:55:23<25:43:37, 3.30s/it] 18%|█▊ | 6235/34278 [6:55:26<24:25:03, 3.13s/it] {'loss': 0.1832, 'grad_norm': 0.8888704776542667, 'learning_rate': 9.407189061959391e-06, 'epoch': 0.18} 18%|█▊ | 6235/34278 [6:55:26<24:25:03, 3.13s/it] 18%|█▊ | 6236/34278 [6:55:29<25:48:52, 3.31s/it] {'loss': 0.1396, 'grad_norm': 0.9210496492979364, 'learning_rate': 9.406965911588009e-06, 'epoch': 0.18} 18%|█▊ | 6236/34278 [6:55:29<25:48:52, 3.31s/it] 18%|█▊ | 6237/34278 [6:55:33<26:19:26, 3.38s/it] {'loss': 0.1838, 'grad_norm': 0.7581127058051876, 'learning_rate': 9.406742721872283e-06, 
'epoch': 0.18} 18%|█▊ | 6237/34278 [6:55:33<26:19:26, 3.38s/it] 18%|█▊ | 6238/34278 [6:55:36<26:24:04, 3.39s/it] {'loss': 0.1777, 'grad_norm': 0.7802841499323447, 'learning_rate': 9.406519492814215e-06, 'epoch': 0.18} 18%|█▊ | 6238/34278 [6:55:36<26:24:04, 3.39s/it] 18%|█▊ | 6239/34278 [6:55:40<26:19:44, 3.38s/it] {'loss': 0.1714, 'grad_norm': 1.0567663023915306, 'learning_rate': 9.406296224415791e-06, 'epoch': 0.18} 18%|█▊ | 6239/34278 [6:55:40<26:19:44, 3.38s/it] 18%|█▊ | 6240/34278 [6:55:46<32:26:29, 4.17s/it] {'loss': 0.1707, 'grad_norm': 1.0572082746664182, 'learning_rate': 9.406072916679006e-06, 'epoch': 0.18} 18%|█▊ | 6240/34278 [6:55:46<32:26:29, 4.17s/it] 18%|█▊ | 6241/34278 [6:55:49<30:37:54, 3.93s/it] {'loss': 0.1631, 'grad_norm': 1.1517742435805653, 'learning_rate': 9.405849569605853e-06, 'epoch': 0.18} 18%|█▊ | 6241/34278 [6:55:49<30:37:54, 3.93s/it] 18%|█▊ | 6242/34278 [6:55:52<28:07:33, 3.61s/it] {'loss': 0.1626, 'grad_norm': 0.9723575965767852, 'learning_rate': 9.405626183198329e-06, 'epoch': 0.18} 18%|█▊ | 6242/34278 [6:55:52<28:07:33, 3.61s/it] 18%|█▊ | 6243/34278 [6:55:55<27:01:34, 3.47s/it] {'loss': 0.1844, 'grad_norm': 0.8955581516573967, 'learning_rate': 9.405402757458424e-06, 'epoch': 0.18} 18%|█▊ | 6243/34278 [6:55:55<27:01:34, 3.47s/it] 18%|█▊ | 6244/34278 [6:55:58<26:32:04, 3.41s/it] {'loss': 0.1528, 'grad_norm': 0.8263967085292345, 'learning_rate': 9.405179292388135e-06, 'epoch': 0.18} 18%|█▊ | 6244/34278 [6:55:58<26:32:04, 3.41s/it] 18%|█▊ | 6245/34278 [6:56:02<27:16:45, 3.50s/it] {'loss': 0.1688, 'grad_norm': 0.8216943605496201, 'learning_rate': 9.404955787989458e-06, 'epoch': 0.18} 18%|█▊ | 6245/34278 [6:56:02<27:16:45, 3.50s/it] 18%|█▊ | 6246/34278 [6:56:05<27:10:22, 3.49s/it] {'loss': 0.1562, 'grad_norm': 1.2210662319131085, 'learning_rate': 9.404732244264387e-06, 'epoch': 0.18} 18%|█▊ | 6246/34278 [6:56:06<27:10:22, 3.49s/it] 18%|█▊ | 6247/34278 [6:56:09<26:24:08, 3.39s/it] {'loss': 0.1441, 'grad_norm': 0.8952438572608528, 
'learning_rate': 9.404508661214918e-06, 'epoch': 0.18} 18%|█▊ | 6247/34278 [6:56:09<26:24:08, 3.39s/it] 18%|█▊ | 6248/34278 [6:56:12<25:55:16, 3.33s/it] {'loss': 0.1582, 'grad_norm': 0.809408536103548, 'learning_rate': 9.404285038843047e-06, 'epoch': 0.18} 18%|█▊ | 6248/34278 [6:56:12<25:55:16, 3.33s/it] 18%|█▊ | 6249/34278 [6:56:15<25:43:46, 3.30s/it] {'loss': 0.1508, 'grad_norm': 0.6595809122634859, 'learning_rate': 9.404061377150771e-06, 'epoch': 0.18} 18%|█▊ | 6249/34278 [6:56:15<25:43:46, 3.30s/it] 18%|█▊ | 6250/34278 [6:56:18<25:15:58, 3.25s/it] {'loss': 0.1513, 'grad_norm': 0.7862198516734406, 'learning_rate': 9.403837676140084e-06, 'epoch': 0.18} 18%|█▊ | 6250/34278 [6:56:18<25:15:58, 3.25s/it] 18%|█▊ | 6251/34278 [6:56:22<26:54:42, 3.46s/it] {'loss': 0.1689, 'grad_norm': 0.6850742799073719, 'learning_rate': 9.403613935812988e-06, 'epoch': 0.18} 18%|█▊ | 6251/34278 [6:56:22<26:54:42, 3.46s/it] 18%|█▊ | 6252/34278 [6:56:25<25:41:55, 3.30s/it] {'loss': 0.1385, 'grad_norm': 0.8220992034464922, 'learning_rate': 9.403390156171477e-06, 'epoch': 0.18} 18%|█▊ | 6252/34278 [6:56:25<25:41:55, 3.30s/it] 18%|█▊ | 6253/34278 [6:56:29<26:13:07, 3.37s/it] {'loss': 0.1675, 'grad_norm': 0.9788715151395558, 'learning_rate': 9.40316633721755e-06, 'epoch': 0.18} 18%|█▊ | 6253/34278 [6:56:29<26:13:07, 3.37s/it] 18%|█▊ | 6254/34278 [6:56:35<32:25:20, 4.17s/it] {'loss': 0.207, 'grad_norm': 0.8508562582919345, 'learning_rate': 9.402942478953205e-06, 'epoch': 0.18} 18%|█▊ | 6254/34278 [6:56:35<32:25:20, 4.17s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 18%|█▊ | 6255/34278 [6:56:40<35:04:38, 4.51s/it] {'loss': 0.1678, 'grad_norm': 0.970477424704942, 'learning_rate': 9.402718581380442e-06, 'epoch': 0.18} 18%|█▊ | 6255/34278 [6:56:40<35:04:38, 4.51s/it] 18%|█▊ | 6256/34278 [6:56:43<31:28:11, 4.04s/it] {'loss': 0.1542, 'grad_norm': 0.8971265181992876, 'learning_rate': 9.402494644501256e-06, 'epoch': 0.18} 18%|█▊ | 6256/34278 [6:56:43<31:28:11, 4.04s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9361 > 8192). Running this sequence through the model will result in indexing errors 18%|█▊ | 6257/34278 [6:56:47<31:37:19, 4.06s/it] {'loss': 0.1632, 'grad_norm': 0.9824876826399018, 'learning_rate': 9.402270668317651e-06, 'epoch': 0.18} 18%|█▊ | 6257/34278 [6:56:47<31:37:19, 4.06s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12031 > 8192). Running this sequence through the model will result in indexing errors 18%|█▊ | 6258/34278 [6:56:52<33:07:07, 4.26s/it] {'loss': 0.1619, 'grad_norm': 0.6949440436484277, 'learning_rate': 9.402046652831623e-06, 'epoch': 0.18} 18%|█▊ | 6258/34278 [6:56:52<33:07:07, 4.26s/it] 18%|█▊ | 6259/34278 [6:56:55<30:24:37, 3.91s/it] {'loss': 0.1644, 'grad_norm': 0.8770120753793275, 'learning_rate': 9.401822598045173e-06, 'epoch': 0.18} 18%|█▊ | 6259/34278 [6:56:55<30:24:37, 3.91s/it] 18%|█▊ | 6260/34278 [6:56:58<29:31:40, 3.79s/it] {'loss': 0.1695, 'grad_norm': 0.9068548662885453, 'learning_rate': 9.401598503960303e-06, 'epoch': 0.18} 18%|█▊ | 6260/34278 [6:56:58<29:31:40, 3.79s/it] 18%|█▊ | 6261/34278 [6:57:01<27:35:58, 3.55s/it] {'loss': 0.1571, 'grad_norm': 0.8946411605336688, 'learning_rate': 9.401374370579013e-06, 'epoch': 0.18} 18%|█▊ | 6261/34278 [6:57:01<27:35:58, 3.55s/it] 18%|█▊ | 6262/34278 [6:57:07<31:32:15, 4.05s/it] {'loss': 0.1602, 'grad_norm': 0.8074587967392921, 'learning_rate': 9.401150197903301e-06, 'epoch': 0.18} 18%|█▊ | 6262/34278 
[6:57:07<31:32:15, 4.05s/it] 18%|█▊ | 6263/34278 [6:57:12<35:27:13, 4.56s/it] {'loss': 0.1817, 'grad_norm': 2.60264728155018, 'learning_rate': 9.400925985935172e-06, 'epoch': 0.18} 18%|█▊ | 6263/34278 [6:57:12<35:27:13, 4.56s/it] 18%|█▊ | 6264/34278 [6:57:16<34:23:31, 4.42s/it] {'loss': 0.172, 'grad_norm': 0.9913172170775648, 'learning_rate': 9.400701734676628e-06, 'epoch': 0.18} 18%|█▊ | 6264/34278 [6:57:16<34:23:31, 4.42s/it] 18%|█▊ | 6265/34278 [6:57:20<33:12:23, 4.27s/it] {'loss': 0.1919, 'grad_norm': 0.8779597607438039, 'learning_rate': 9.400477444129667e-06, 'epoch': 0.18} 18%|█▊ | 6265/34278 [6:57:20<33:12:23, 4.27s/it] 18%|█▊ | 6266/34278 [6:57:23<29:20:23, 3.77s/it] {'loss': 0.1685, 'grad_norm': 0.8429548045429296, 'learning_rate': 9.400253114296293e-06, 'epoch': 0.18} 18%|█▊ | 6266/34278 [6:57:23<29:20:23, 3.77s/it] 18%|█▊ | 6267/34278 [6:57:26<27:43:23, 3.56s/it] {'loss': 0.1846, 'grad_norm': 0.9852071087522531, 'learning_rate': 9.400028745178512e-06, 'epoch': 0.18} 18%|█▊ | 6267/34278 [6:57:26<27:43:23, 3.56s/it] 18%|█▊ | 6268/34278 [6:57:29<26:26:45, 3.40s/it] {'loss': 0.1751, 'grad_norm': 1.0386223536856265, 'learning_rate': 9.399804336778325e-06, 'epoch': 0.18} 18%|█▊ | 6268/34278 [6:57:29<26:26:45, 3.40s/it] 18%|█▊ | 6269/34278 [6:57:33<27:24:15, 3.52s/it] {'loss': 0.1652, 'grad_norm': 0.7428465970760484, 'learning_rate': 9.399579889097733e-06, 'epoch': 0.18} 18%|█▊ | 6269/34278 [6:57:33<27:24:15, 3.52s/it] 18%|█▊ | 6270/34278 [6:57:39<33:06:31, 4.26s/it] {'loss': 0.1747, 'grad_norm': 0.8416234696604418, 'learning_rate': 9.399355402138743e-06, 'epoch': 0.18} 18%|█▊ | 6270/34278 [6:57:39<33:06:31, 4.26s/it] 18%|█▊ | 6271/34278 [6:57:41<29:33:19, 3.80s/it] {'loss': 0.1621, 'grad_norm': 0.8880831660665703, 'learning_rate': 9.399130875903357e-06, 'epoch': 0.18} 18%|█▊ | 6271/34278 [6:57:41<29:33:19, 3.80s/it] 18%|█▊ | 6272/34278 [6:57:46<32:14:03, 4.14s/it] {'loss': 0.176, 'grad_norm': 0.8646320623649186, 'learning_rate': 9.398906310393582e-06, 'epoch': 
0.18} 18%|█▊ | 6272/34278 [6:57:46<32:14:03, 4.14s/it] 18%|█▊ | 6273/34278 [6:57:50<30:01:08, 3.86s/it] {'loss': 0.1634, 'grad_norm': 0.8468516507308228, 'learning_rate': 9.398681705611423e-06, 'epoch': 0.18} 18%|█▊ | 6273/34278 [6:57:50<30:01:08, 3.86s/it] 18%|█▊ | 6274/34278 [6:57:53<28:22:23, 3.65s/it] {'loss': 0.1806, 'grad_norm': 1.0238616595287175, 'learning_rate': 9.39845706155888e-06, 'epoch': 0.18} 18%|█▊ | 6274/34278 [6:57:53<28:22:23, 3.65s/it] 18%|█▊ | 6275/34278 [6:57:56<26:55:17, 3.46s/it] {'loss': 0.1695, 'grad_norm': 0.9580535676433123, 'learning_rate': 9.398232378237965e-06, 'epoch': 0.18} 18%|█▊ | 6275/34278 [6:57:56<26:55:17, 3.46s/it] 18%|█▊ | 6276/34278 [6:57:59<26:20:25, 3.39s/it] {'loss': 0.176, 'grad_norm': 0.954693563100793, 'learning_rate': 9.398007655650682e-06, 'epoch': 0.18} 18%|█▊ | 6276/34278 [6:57:59<26:20:25, 3.39s/it] 18%|█▊ | 6277/34278 [6:58:03<28:31:18, 3.67s/it] {'loss': 0.1505, 'grad_norm': 0.7206433187720213, 'learning_rate': 9.397782893799036e-06, 'epoch': 0.18} 18%|█▊ | 6277/34278 [6:58:03<28:31:18, 3.67s/it] 18%|█▊ | 6278/34278 [6:58:07<28:39:58, 3.69s/it] {'loss': 0.1624, 'grad_norm': 0.8757530542891719, 'learning_rate': 9.397558092685033e-06, 'epoch': 0.18} 18%|█▊ | 6278/34278 [6:58:07<28:39:58, 3.69s/it] 18%|█▊ | 6279/34278 [6:58:10<27:41:48, 3.56s/it] {'loss': 0.1636, 'grad_norm': 1.0543168020175973, 'learning_rate': 9.397333252310682e-06, 'epoch': 0.18} 18%|█▊ | 6279/34278 [6:58:10<27:41:48, 3.56s/it] 18%|█▊ | 6280/34278 [6:58:16<33:37:01, 4.32s/it] {'loss': 0.1827, 'grad_norm': 0.778576249760065, 'learning_rate': 9.39710837267799e-06, 'epoch': 0.18} 18%|█▊ | 6280/34278 [6:58:16<33:37:01, 4.32s/it] 18%|█▊ | 6281/34278 [6:58:20<31:09:07, 4.01s/it] {'loss': 0.2028, 'grad_norm': 1.1641877106989917, 'learning_rate': 9.396883453788964e-06, 'epoch': 0.18} 18%|█▊ | 6281/34278 [6:58:20<31:09:07, 4.01s/it] 18%|█▊ | 6282/34278 [6:58:23<30:11:05, 3.88s/it] {'loss': 0.1523, 'grad_norm': 1.1271654281740786, 'learning_rate': 
9.39665849564561e-06, 'epoch': 0.18} 18%|█▊ | 6282/34278 [6:58:23<30:11:05, 3.88s/it] 18%|█▊ | 6283/34278 [6:58:27<28:40:28, 3.69s/it] {'loss': 0.168, 'grad_norm': 0.8299374316477975, 'learning_rate': 9.396433498249939e-06, 'epoch': 0.18} 18%|█▊ | 6283/34278 [6:58:27<28:40:28, 3.69s/it] 18%|█▊ | 6284/34278 [6:58:29<26:41:20, 3.43s/it] {'loss': 0.1718, 'grad_norm': 1.5415467716062783, 'learning_rate': 9.396208461603962e-06, 'epoch': 0.18} 18%|█▊ | 6284/34278 [6:58:29<26:41:20, 3.43s/it] 18%|█▊ | 6285/34278 [6:58:32<25:05:31, 3.23s/it] {'loss': 0.1458, 'grad_norm': 1.0033100132751955, 'learning_rate': 9.395983385709683e-06, 'epoch': 0.18} 18%|█▊ | 6285/34278 [6:58:32<25:05:31, 3.23s/it] 18%|█▊ | 6286/34278 [6:58:37<29:20:46, 3.77s/it] {'loss': 0.165, 'grad_norm': 0.7665849418163303, 'learning_rate': 9.395758270569114e-06, 'epoch': 0.18} 18%|█▊ | 6286/34278 [6:58:37<29:20:46, 3.77s/it] 18%|█▊ | 6287/34278 [6:58:41<30:31:07, 3.93s/it] {'loss': 0.1542, 'grad_norm': 0.9567717284645186, 'learning_rate': 9.395533116184266e-06, 'epoch': 0.18} 18%|█▊ | 6287/34278 [6:58:41<30:31:07, 3.93s/it] 18%|█▊ | 6288/34278 [6:58:47<34:08:36, 4.39s/it] {'loss': 0.1763, 'grad_norm': 0.8993418807931096, 'learning_rate': 9.395307922557145e-06, 'epoch': 0.18} 18%|█▊ | 6288/34278 [6:58:47<34:08:36, 4.39s/it] 18%|█▊ | 6289/34278 [6:58:51<33:08:24, 4.26s/it] {'loss': 0.1565, 'grad_norm': 0.8079546539093717, 'learning_rate': 9.395082689689765e-06, 'epoch': 0.18} 18%|█▊ | 6289/34278 [6:58:51<33:08:24, 4.26s/it] 18%|█▊ | 6290/34278 [6:58:55<32:32:47, 4.19s/it] {'loss': 0.1899, 'grad_norm': 0.9748613091499144, 'learning_rate': 9.394857417584137e-06, 'epoch': 0.18} 18%|█▊ | 6290/34278 [6:58:55<32:32:47, 4.19s/it] 18%|█▊ | 6291/34278 [6:58:58<30:09:37, 3.88s/it] {'loss': 0.1431, 'grad_norm': 0.974800626920447, 'learning_rate': 9.394632106242271e-06, 'epoch': 0.18} 18%|█▊ | 6291/34278 [6:58:58<30:09:37, 3.88s/it]Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fafd81b0950> Failed to fetch sample 2701720. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fafd81b0950> 18%|█▊ | 6292/34278 [6:59:01<27:35:19, 3.55s/it] {'loss': 0.1732, 'grad_norm': 1.2020729522605325, 'learning_rate': 9.394406755666177e-06, 'epoch': 0.18} 18%|█▊ | 6292/34278 [6:59:01<27:35:19, 3.55s/it] 18%|█▊ | 6293/34278 [6:59:06<30:23:24, 3.91s/it] {'loss': 0.1771, 'grad_norm': 1.1276256250163865, 'learning_rate': 9.39418136585787e-06, 'epoch': 0.18} 18%|█▊ | 6293/34278 [6:59:06<30:23:24, 3.91s/it] 18%|█▊ | 6294/34278 [6:59:09<29:14:47, 3.76s/it] {'loss': 0.172, 'grad_norm': 0.6975061214800776, 'learning_rate': 9.393955936819362e-06, 'epoch': 0.18} 18%|█▊ | 6294/34278 [6:59:09<29:14:47, 3.76s/it] 18%|█▊ | 6295/34278 [6:59:12<27:22:00, 3.52s/it] {'loss': 0.1851, 'grad_norm': 0.9135330478235544, 'learning_rate': 9.393730468552661e-06, 'epoch': 0.18} 18%|█▊ | 6295/34278 [6:59:12<27:22:00, 3.52s/it] 18%|█▊ | 6296/34278 [6:59:15<25:54:23, 3.33s/it] {'loss': 0.1727, 'grad_norm': 0.9041104249236712, 'learning_rate': 9.393504961059786e-06, 'epoch': 0.18} 18%|█▊ | 6296/34278 [6:59:15<25:54:23, 3.33s/it] 18%|█▊ | 6297/34278 [6:59:19<26:54:01, 3.46s/it] {'loss': 0.1659, 'grad_norm': 0.8974073959120973, 'learning_rate': 9.393279414342747e-06, 'epoch': 0.18} 18%|█▊ | 6297/34278 [6:59:19<26:54:01, 3.46s/it] 18%|█▊ | 6298/34278 [6:59:23<28:44:57, 3.70s/it] {'loss': 0.1657, 'grad_norm': 0.9090586042105718, 'learning_rate': 9.393053828403558e-06, 'epoch': 0.18} 18%|█▊ | 6298/34278 [6:59:23<28:44:57, 3.70s/it] 18%|█▊ | 6299/34278 [6:59:28<32:06:14, 4.13s/it] {'loss': 0.169, 'grad_norm': 1.0353462341734232, 'learning_rate': 9.392828203244232e-06, 'epoch': 0.18} 18%|█▊ | 6299/34278 [6:59:28<32:06:14, 4.13s/it] 18%|█▊ | 6300/34278 [6:59:31<29:21:12, 3.78s/it] {'loss': 0.1686, 'grad_norm': 0.8231975707209924, 'learning_rate': 9.392602538866785e-06, 'epoch': 0.18} 18%|█▊ | 6300/34278 [6:59:31<29:21:12, 3.78s/it] 18%|█▊ | 6301/34278 [6:59:36<32:27:14, 4.18s/it] {'loss': 0.1641, 'grad_norm': 0.7398106825842896, 
'learning_rate': 9.39237683527323e-06, 'epoch': 0.18} 18%|█▊ | 6301/34278 [6:59:36<32:27:14, 4.18s/it] 18%|█▊ | 6302/34278 [6:59:42<36:50:52, 4.74s/it] {'loss': 0.1572, 'grad_norm': 0.9598872308409644, 'learning_rate': 9.392151092465587e-06, 'epoch': 0.18} 18%|█▊ | 6302/34278 [6:59:42<36:50:52, 4.74s/it] 18%|█▊ | 6303/34278 [6:59:45<33:04:09, 4.26s/it] {'loss': 0.1482, 'grad_norm': 0.7734106660152992, 'learning_rate': 9.391925310445863e-06, 'epoch': 0.18} 18%|█▊ | 6303/34278 [6:59:45<33:04:09, 4.26s/it] 18%|█▊ | 6304/34278 [6:59:51<37:24:11, 4.81s/it] {'loss': 0.1516, 'grad_norm': 0.7760284564417853, 'learning_rate': 9.391699489216082e-06, 'epoch': 0.18} 18%|█▊ | 6304/34278 [6:59:51<37:24:11, 4.81s/it] 18%|█▊ | 6305/34278 [6:59:55<34:35:18, 4.45s/it] {'loss': 0.1794, 'grad_norm': 0.9199543724306547, 'learning_rate': 9.391473628778253e-06, 'epoch': 0.18} 18%|█▊ | 6305/34278 [6:59:55<34:35:18, 4.45s/it] 18%|█▊ | 6306/34278 [7:00:00<36:14:27, 4.66s/it] {'loss': 0.156, 'grad_norm': 0.8064664006424641, 'learning_rate': 9.391247729134399e-06, 'epoch': 0.18} 18%|█▊ | 6306/34278 [7:00:00<36:14:27, 4.66s/it] 18%|█▊ | 6307/34278 [7:00:04<33:26:46, 4.30s/it] {'loss': 0.1597, 'grad_norm': 0.9541207820094149, 'learning_rate': 9.391021790286532e-06, 'epoch': 0.18} 18%|█▊ | 6307/34278 [7:00:04<33:26:46, 4.30s/it] 18%|█▊ | 6308/34278 [7:00:06<30:06:10, 3.87s/it] {'loss': 0.15, 'grad_norm': 0.835873642125506, 'learning_rate': 9.39079581223667e-06, 'epoch': 0.18} 18%|█▊ | 6308/34278 [7:00:06<30:06:10, 3.87s/it] 18%|█▊ | 6309/34278 [7:00:09<27:46:44, 3.58s/it] {'loss': 0.1751, 'grad_norm': 0.913013134531752, 'learning_rate': 9.390569794986833e-06, 'epoch': 0.18} 18%|█▊ | 6309/34278 [7:00:09<27:46:44, 3.58s/it] 18%|█▊ | 6310/34278 [7:00:14<29:43:51, 3.83s/it] {'loss': 0.1839, 'grad_norm': 0.8186151914454821, 'learning_rate': 9.390343738539036e-06, 'epoch': 0.18} 18%|█▊ | 6310/34278 [7:00:14<29:43:51, 3.83s/it] 18%|█▊ | 6311/34278 [7:00:17<28:36:02, 3.68s/it] {'loss': 0.1607, 
'grad_norm': 0.989796088579576, 'learning_rate': 9.390117642895298e-06, 'epoch': 0.18} 18%|█▊ | 6311/34278 [7:00:17<28:36:02, 3.68s/it] 18%|█▊ | 6312/34278 [7:00:20<26:42:01, 3.44s/it] {'loss': 0.1621, 'grad_norm': 0.766298593376058, 'learning_rate': 9.389891508057638e-06, 'epoch': 0.18} 18%|█▊ | 6312/34278 [7:00:20<26:42:01, 3.44s/it] 18%|█▊ | 6313/34278 [7:00:23<24:50:21, 3.20s/it] {'loss': 0.1754, 'grad_norm': 0.9280216493572327, 'learning_rate': 9.389665334028073e-06, 'epoch': 0.18} 18%|█▊ | 6313/34278 [7:00:23<24:50:21, 3.20s/it] 18%|█▊ | 6314/34278 [7:00:29<31:27:55, 4.05s/it] {'loss': 0.1873, 'grad_norm': 0.8973743583496462, 'learning_rate': 9.389439120808625e-06, 'epoch': 0.18} 18%|█▊ | 6314/34278 [7:00:29<31:27:55, 4.05s/it] 18%|█▊ | 6315/34278 [7:00:33<31:26:56, 4.05s/it] {'loss': 0.1678, 'grad_norm': 0.8473033153608642, 'learning_rate': 9.389212868401313e-06, 'epoch': 0.18} 18%|█▊ | 6315/34278 [7:00:33<31:26:56, 4.05s/it] 18%|█▊ | 6316/34278 [7:00:36<28:59:00, 3.73s/it] {'loss': 0.1688, 'grad_norm': 0.9923850099865504, 'learning_rate': 9.388986576808156e-06, 'epoch': 0.18} 18%|█▊ | 6316/34278 [7:00:36<28:59:00, 3.73s/it] 18%|█▊ | 6317/34278 [7:00:39<28:52:14, 3.72s/it] {'loss': 0.1696, 'grad_norm': 1.0566776003326883, 'learning_rate': 9.388760246031175e-06, 'epoch': 0.18} 18%|█▊ | 6317/34278 [7:00:39<28:52:14, 3.72s/it] 18%|█▊ | 6318/34278 [7:00:42<26:34:22, 3.42s/it] {'loss': 0.1494, 'grad_norm': 1.0754251362690752, 'learning_rate': 9.38853387607239e-06, 'epoch': 0.18} 18%|█▊ | 6318/34278 [7:00:42<26:34:22, 3.42s/it] 18%|█▊ | 6319/34278 [7:00:45<26:20:26, 3.39s/it] {'loss': 0.1728, 'grad_norm': 0.7831028392985439, 'learning_rate': 9.388307466933821e-06, 'epoch': 0.18} 18%|█▊ | 6319/34278 [7:00:45<26:20:26, 3.39s/it] 18%|█▊ | 6320/34278 [7:00:49<25:48:13, 3.32s/it] {'loss': 0.1511, 'grad_norm': 0.8504901453125424, 'learning_rate': 9.388081018617492e-06, 'epoch': 0.18} 18%|█▊ | 6320/34278 [7:00:49<25:48:13, 3.32s/it] 18%|█▊ | 6321/34278 [7:00:52<25:28:31, 
3.28s/it] {'loss': 0.164, 'grad_norm': 1.0756459690975153, 'learning_rate': 9.387854531125421e-06, 'epoch': 0.18} 18%|█▊ | 6321/34278 [7:00:52<25:28:31, 3.28s/it] 18%|█▊ | 6322/34278 [7:00:57<29:59:20, 3.86s/it] {'loss': 0.1823, 'grad_norm': 0.9622875988951577, 'learning_rate': 9.387628004459633e-06, 'epoch': 0.18} 18%|█▊ | 6322/34278 [7:00:57<29:59:20, 3.86s/it] 18%|█▊ | 6323/34278 [7:01:03<34:54:10, 4.49s/it] {'loss': 0.1958, 'grad_norm': 0.8239741278722938, 'learning_rate': 9.387401438622151e-06, 'epoch': 0.18} 18%|█▊ | 6323/34278 [7:01:03<34:54:10, 4.49s/it] 18%|█▊ | 6324/34278 [7:01:06<31:14:21, 4.02s/it] {'loss': 0.1561, 'grad_norm': 0.8432333835187112, 'learning_rate': 9.387174833614996e-06, 'epoch': 0.18} 18%|█▊ | 6324/34278 [7:01:06<31:14:21, 4.02s/it] 18%|█▊ | 6325/34278 [7:01:12<35:46:19, 4.61s/it] {'loss': 0.1761, 'grad_norm': 0.9037546123012228, 'learning_rate': 9.38694818944019e-06, 'epoch': 0.18} 18%|█▊ | 6325/34278 [7:01:12<35:46:19, 4.61s/it] 18%|█▊ | 6326/34278 [7:01:15<32:57:12, 4.24s/it] {'loss': 0.1706, 'grad_norm': 0.6479765240275243, 'learning_rate': 9.386721506099759e-06, 'epoch': 0.18} 18%|█▊ | 6326/34278 [7:01:15<32:57:12, 4.24s/it] 18%|█▊ | 6327/34278 [7:01:19<32:25:12, 4.18s/it] {'loss': 0.1498, 'grad_norm': 0.8840684744698483, 'learning_rate': 9.386494783595725e-06, 'epoch': 0.18} 18%|█▊ | 6327/34278 [7:01:19<32:25:12, 4.18s/it] 18%|█▊ | 6328/34278 [7:01:23<32:00:11, 4.12s/it] {'loss': 0.1699, 'grad_norm': 4.114698133905036, 'learning_rate': 9.386268021930114e-06, 'epoch': 0.18} 18%|█▊ | 6328/34278 [7:01:23<32:00:11, 4.12s/it] 18%|█▊ | 6329/34278 [7:01:30<37:06:14, 4.78s/it] {'loss': 0.1751, 'grad_norm': 0.8557881661633471, 'learning_rate': 9.386041221104947e-06, 'epoch': 0.18} 18%|█▊ | 6329/34278 [7:01:30<37:06:14, 4.78s/it] 18%|█▊ | 6330/34278 [7:01:33<34:08:48, 4.40s/it] {'loss': 0.1611, 'grad_norm': 0.6900398003115896, 'learning_rate': 9.385814381122252e-06, 'epoch': 0.18} 18%|█▊ | 6330/34278 [7:01:33<34:08:48, 4.40s/it] 18%|█▊ | 
6331/34278 [7:01:37<32:20:34, 4.17s/it] {'loss': 0.1589, 'grad_norm': 0.9162916362396664, 'learning_rate': 9.385587501984056e-06, 'epoch': 0.18} 18%|█▊ | 6331/34278 [7:01:37<32:20:34, 4.17s/it] 18%|█▊ | 6332/34278 [7:01:41<32:29:42, 4.19s/it] {'loss': 0.1482, 'grad_norm': 0.8271291333589972, 'learning_rate': 9.385360583692378e-06, 'epoch': 0.18} 18%|█▊ | 6332/34278 [7:01:41<32:29:42, 4.19s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 18%|█▊ | 6333/34278 [7:01:46<35:01:01, 4.51s/it] {'loss': 0.1528, 'grad_norm': 0.7268643631329941, 'learning_rate': 9.385133626249247e-06, 'epoch': 0.18} 18%|█▊ | 6333/34278 [7:01:46<35:01:01, 4.51s/it] 18%|█▊ | 6334/34278 [7:01:49<31:00:41, 4.00s/it] {'loss': 0.1775, 'grad_norm': 1.0353747666604984, 'learning_rate': 9.384906629656692e-06, 'epoch': 0.18} 18%|█▊ | 6334/34278 [7:01:49<31:00:41, 4.00s/it] 18%|█▊ | 6335/34278 [7:01:55<34:31:32, 4.45s/it] {'loss': 0.171, 'grad_norm': 0.9373739253374651, 'learning_rate': 9.384679593916737e-06, 'epoch': 0.18} 18%|█▊ | 6335/34278 [7:01:55<34:31:32, 4.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
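The recurring `UserWarning: None of the inputs have requires_grad=True` above is typical when `torch.utils.checkpoint` in reentrant mode wraps a module whose inputs are all detached (e.g. activations coming out of a frozen vision tower); the checkpointed segment then produces no gradients. A minimal reproduction and fix, as a sketch only (this is generic PyTorch, not the trainer's actual code; HF models expose `enable_input_require_grads()` for the same purpose):

```python
import torch
from torch.utils.checkpoint import checkpoint

layer = torch.nn.Linear(8, 8)
x = torch.randn(2, 8)  # a leaf without requires_grad: reentrant
                       # checkpointing warns and yields None gradients

# Marking the checkpointed input as requiring grad restores gradient flow
# through the checkpointed segment:
x.requires_grad_(True)
y = checkpoint(layer, x, use_reentrant=True)
y.sum().backward()
# layer.weight.grad is now populated instead of None
```

If the warning is harmless in context (the frozen part genuinely needs no gradients), it can also simply be ignored, but it is worth confirming that the trainable parameters downstream still receive gradients.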
Gradients will be None warnings.warn( 18%|█▊ | 6336/34278 [7:01:58<31:35:38, 4.07s/it] {'loss': 0.1529, 'grad_norm': 0.7342795308586916, 'learning_rate': 9.384452519031409e-06, 'epoch': 0.18} 18%|█▊ | 6336/34278 [7:01:58<31:35:38, 4.07s/it] 18%|█▊ | 6337/34278 [7:02:01<29:55:31, 3.86s/it] {'loss': 0.1691, 'grad_norm': 0.9723050522783112, 'learning_rate': 9.384225405002736e-06, 'epoch': 0.18} 18%|█▊ | 6337/34278 [7:02:01<29:55:31, 3.86s/it] 18%|█▊ | 6338/34278 [7:02:04<28:25:05, 3.66s/it] {'loss': 0.1648, 'grad_norm': 0.9092930360712453, 'learning_rate': 9.383998251832744e-06, 'epoch': 0.18} 18%|█▊ | 6338/34278 [7:02:04<28:25:05, 3.66s/it] 18%|█▊ | 6339/34278 [7:02:07<26:41:18, 3.44s/it] {'loss': 0.1938, 'grad_norm': 0.8127884031882446, 'learning_rate': 9.383771059523464e-06, 'epoch': 0.18} 18%|█▊ | 6339/34278 [7:02:07<26:41:18, 3.44s/it] 18%|█▊ | 6340/34278 [7:02:11<28:36:31, 3.69s/it] {'loss': 0.1617, 'grad_norm': 0.8100007733503917, 'learning_rate': 9.383543828076923e-06, 'epoch': 0.18} 18%|█▊ | 6340/34278 [7:02:11<28:36:31, 3.69s/it] 18%|█▊ | 6341/34278 [7:02:15<27:18:24, 3.52s/it] {'loss': 0.1598, 'grad_norm': 1.0994614799445628, 'learning_rate': 9.383316557495145e-06, 'epoch': 0.18} 18%|█▊ | 6341/34278 [7:02:15<27:18:24, 3.52s/it] 19%|█▊ | 6342/34278 [7:02:18<26:04:40, 3.36s/it] {'loss': 0.1669, 'grad_norm': 0.7678430777969951, 'learning_rate': 9.383089247780168e-06, 'epoch': 0.19} 19%|█▊ | 6342/34278 [7:02:18<26:04:40, 3.36s/it] 19%|█▊ | 6343/34278 [7:02:21<26:13:34, 3.38s/it] {'loss': 0.1539, 'grad_norm': 0.8686584777523905, 'learning_rate': 9.382861898934013e-06, 'epoch': 0.19} 19%|█▊ | 6343/34278 [7:02:21<26:13:34, 3.38s/it] 19%|█▊ | 6344/34278 [7:02:24<26:24:49, 3.40s/it] {'loss': 0.1845, 'grad_norm': 1.0690350375174442, 'learning_rate': 9.382634510958714e-06, 'epoch': 0.19} 19%|█▊ | 6344/34278 [7:02:24<26:24:49, 3.40s/it] 19%|█▊ | 6345/34278 [7:02:27<24:41:49, 3.18s/it] {'loss': 0.1728, 'grad_norm': 0.8535316947589281, 'learning_rate': 
9.382407083856302e-06, 'epoch': 0.19} 19%|█▊ | 6345/34278 [7:02:27<24:41:49, 3.18s/it] 19%|█▊ | 6346/34278 [7:02:31<26:02:15, 3.36s/it] {'loss': 0.1627, 'grad_norm': 0.9349007679845841, 'learning_rate': 9.382179617628804e-06, 'epoch': 0.19} 19%|█▊ | 6346/34278 [7:02:31<26:02:15, 3.36s/it] 19%|█▊ | 6347/34278 [7:02:34<26:05:22, 3.36s/it] {'loss': 0.1641, 'grad_norm': 0.9874725412467277, 'learning_rate': 9.381952112278254e-06, 'epoch': 0.19} 19%|█▊ | 6347/34278 [7:02:34<26:05:22, 3.36s/it] 19%|█▊ | 6348/34278 [7:02:38<26:06:37, 3.37s/it] {'loss': 0.2031, 'grad_norm': 0.9545658165985447, 'learning_rate': 9.38172456780668e-06, 'epoch': 0.19} 19%|█▊ | 6348/34278 [7:02:38<26:06:37, 3.37s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff00c1c6e80>
Failed to fetch sample 2717417. Exception: cannot identify image file <_io.BytesIO object at 0x7ff00c1c6e80>
19%|█▊ | 6349/34278 [7:02:43<31:04:47, 4.01s/it] {'loss': 0.1796, 'grad_norm': 0.859255529423847, 'learning_rate': 9.381496984216117e-06, 'epoch': 0.19} 19%|█▊ | 6349/34278 [7:02:43<31:04:47, 4.01s/it] 19%|█▊ | 6350/34278 [7:02:49<35:43:18, 4.60s/it] {'loss': 0.1616, 'grad_norm': 0.8752842002519265, 'learning_rate': 9.381269361508593e-06, 'epoch': 0.19} 19%|█▊ | 6350/34278 [7:02:49<35:43:18, 4.60s/it] 19%|█▊ | 6351/34278 [7:02:52<31:55:50, 4.12s/it] {'loss': 0.1531, 'grad_norm': 0.7403846602310657, 'learning_rate': 9.381041699686143e-06, 'epoch': 0.19} 19%|█▊ | 6351/34278 [7:02:52<31:55:50, 4.12s/it] 19%|█▊ | 6352/34278 [7:02:55<30:13:42, 3.90s/it] {'loss': 0.1728, 'grad_norm': 0.9101251262560319, 'learning_rate': 9.380813998750798e-06, 'epoch': 0.19} 19%|█▊ | 6352/34278 [7:02:56<30:13:42, 3.90s/it] 19%|█▊ | 6353/34278 [7:02:59<28:26:27, 3.67s/it] {'loss': 0.1902, 'grad_norm': 0.9226859299655494, 'learning_rate': 9.380586258704592e-06, 'epoch': 0.19} 19%|█▊ | 6353/34278 [7:02:59<28:26:27, 3.67s/it] 19%|█▊ | 6354/34278 [7:03:05<33:41:30, 4.34s/it] {'loss': 0.1512, 'grad_norm': 0.735729210753132, 'learning_rate': 9.380358479549556e-06, 'epoch': 0.19} 19%|█▊ | 6354/34278 [7:03:05<33:41:30, 4.34s/it] 19%|█▊ | 6355/34278 [7:03:07<30:20:00, 3.91s/it] {'loss': 0.1516, 'grad_norm': 0.7761692285022256, 'learning_rate': 9.380130661287728e-06, 'epoch': 0.19} 19%|█▊ | 6355/34278 [7:03:07<30:20:00, 3.91s/it] 19%|█▊ | 6356/34278 [7:03:12<32:23:26, 4.18s/it] {'loss': 0.1607, 'grad_norm': 0.9170830195308721, 'learning_rate': 9.379902803921135e-06, 'epoch': 0.19} 19%|█▊ | 6356/34278 [7:03:12<32:23:26, 4.18s/it] 19%|█▊ | 6357/34278 [7:03:16<30:35:40, 3.94s/it] {'loss': 0.164, 'grad_norm': 0.8316013696108943, 'learning_rate': 9.379674907451819e-06, 'epoch': 0.19} 19%|█▊ | 6357/34278 [7:03:16<30:35:40, 3.94s/it] 19%|█▊ | 6358/34278 [7:03:19<29:58:35,
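The `UnidentifiedImageError` tracebacks above come from `pil_loader` receiving bytes that PIL cannot decode (truncated or corrupt objects fetched from storage); the dataset then logs `Failed to fetch sample …` and falls back to another index. A defensive loader in the shape of the traceback's `pil_loader` could look like the following sketch (names mirror the traceback, but this is not the project's actual code):

```python
import io
from PIL import Image, UnidentifiedImageError

def pil_loader_safe(img_bytes: bytes):
    """Decode image bytes, returning None instead of raising on bad data."""
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # force a full decode so truncated files fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        # Caller can skip or resample this index instead of crashing the step.
        return None
```

Returning `None` lets `__getitem__` retry cleanly; logging the failing sample id (as the dataset already does with `Failed to fetch sample 2717417`) makes it possible to purge corrupt records from the index offline.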
3.87s/it] {'loss': 0.1611, 'grad_norm': 0.8536503482264336, 'learning_rate': 9.379446971881808e-06, 'epoch': 0.19} 19%|█▊ | 6358/34278 [7:03:19<29:58:35, 3.87s/it] 19%|█▊ | 6359/34278 [7:03:26<35:48:41, 4.62s/it] {'loss': 0.1661, 'grad_norm': 0.839788580451009, 'learning_rate': 9.379218997213143e-06, 'epoch': 0.19} 19%|█▊ | 6359/34278 [7:03:26<35:48:41, 4.62s/it] 19%|█▊ | 6360/34278 [7:03:29<32:49:05, 4.23s/it] {'loss': 0.1651, 'grad_norm': 0.8795878200249163, 'learning_rate': 9.378990983447855e-06, 'epoch': 0.19} 19%|█▊ | 6360/34278 [7:03:29<32:49:05, 4.23s/it] 19%|█▊ | 6361/34278 [7:03:36<38:13:19, 4.93s/it] {'loss': 0.1843, 'grad_norm': 0.9092622775474988, 'learning_rate': 9.37876293058798e-06, 'epoch': 0.19} 19%|█▊ | 6361/34278 [7:03:36<38:13:19, 4.93s/it] 19%|█▊ | 6362/34278 [7:03:40<37:25:29, 4.83s/it] {'loss': 0.1548, 'grad_norm': 0.8255685089027337, 'learning_rate': 9.378534838635556e-06, 'epoch': 0.19} 19%|█▊ | 6362/34278 [7:03:40<37:25:29, 4.83s/it] 19%|█▊ | 6363/34278 [7:03:43<33:00:21, 4.26s/it] {'loss': 0.1846, 'grad_norm': 0.9788167032986756, 'learning_rate': 9.378306707592618e-06, 'epoch': 0.19} 19%|█▊ | 6363/34278 [7:03:43<33:00:21, 4.26s/it] 19%|█▊ | 6364/34278 [7:03:46<30:16:51, 3.91s/it] {'loss': 0.1678, 'grad_norm': 0.8886286713130412, 'learning_rate': 9.378078537461203e-06, 'epoch': 0.19} 19%|█▊ | 6364/34278 [7:03:46<30:16:51, 3.91s/it] 19%|█▊ | 6365/34278 [7:03:50<29:15:26, 3.77s/it] {'loss': 0.1522, 'grad_norm': 0.888688783427601, 'learning_rate': 9.377850328243348e-06, 'epoch': 0.19} 19%|█▊ | 6365/34278 [7:03:50<29:15:26, 3.77s/it] 19%|█▊ | 6366/34278 [7:03:53<27:06:53, 3.50s/it] {'loss': 0.1448, 'grad_norm': 0.6460136648863666, 'learning_rate': 9.377622079941089e-06, 'epoch': 0.19} 19%|█▊ | 6366/34278 [7:03:53<27:06:53, 3.50s/it] 19%|█▊ | 6367/34278 [7:03:57<28:44:28, 3.71s/it] {'loss': 0.1849, 'grad_norm': 1.0355527099034294, 'learning_rate': 9.377393792556466e-06, 'epoch': 0.19} 19%|█▊ | 6367/34278 [7:03:57<28:44:28, 3.71s/it] 19%|█▊ | 
6368/34278 [7:04:03<34:03:01, 4.39s/it] {'loss': 0.1641, 'grad_norm': 0.9430101147387856, 'learning_rate': 9.377165466091516e-06, 'epoch': 0.19} 19%|█▊ | 6368/34278 [7:04:03<34:03:01, 4.39s/it] 19%|█▊ | 6369/34278 [7:04:06<31:08:43, 4.02s/it] {'loss': 0.1634, 'grad_norm': 0.7912800017061814, 'learning_rate': 9.376937100548277e-06, 'epoch': 0.19} 19%|█▊ | 6369/34278 [7:04:06<31:08:43, 4.02s/it] 19%|█▊ | 6370/34278 [7:04:09<29:13:37, 3.77s/it] {'loss': 0.1537, 'grad_norm': 1.2790306723717684, 'learning_rate': 9.376708695928791e-06, 'epoch': 0.19} 19%|█▊ | 6370/34278 [7:04:09<29:13:37, 3.77s/it] 19%|█▊ | 6371/34278 [7:04:13<28:58:25, 3.74s/it] {'loss': 0.1554, 'grad_norm': 0.9569122567998494, 'learning_rate': 9.376480252235091e-06, 'epoch': 0.19} 19%|█▊ | 6371/34278 [7:04:13<28:58:25, 3.74s/it] 19%|█▊ | 6372/34278 [7:04:18<33:02:44, 4.26s/it] {'loss': 0.1696, 'grad_norm': 0.8385870624766267, 'learning_rate': 9.376251769469223e-06, 'epoch': 0.19} 19%|█▊ | 6372/34278 [7:04:18<33:02:44, 4.26s/it] 19%|█▊ | 6373/34278 [7:04:21<30:23:18, 3.92s/it] {'loss': 0.1739, 'grad_norm': 1.2551188792359957, 'learning_rate': 9.376023247633224e-06, 'epoch': 0.19} 19%|█▊ | 6373/34278 [7:04:21<30:23:18, 3.92s/it] 19%|█▊ | 6374/34278 [7:04:24<28:19:25, 3.65s/it] {'loss': 0.1553, 'grad_norm': 0.9203606008213495, 'learning_rate': 9.375794686729132e-06, 'epoch': 0.19} 19%|█▊ | 6374/34278 [7:04:24<28:19:25, 3.65s/it] 19%|█▊ | 6375/34278 [7:04:27<26:32:32, 3.42s/it] {'loss': 0.1753, 'grad_norm': 0.9664355287797862, 'learning_rate': 9.37556608675899e-06, 'epoch': 0.19} 19%|█▊ | 6375/34278 [7:04:27<26:32:32, 3.42s/it] 19%|█▊ | 6376/34278 [7:04:31<26:40:57, 3.44s/it] {'loss': 0.1992, 'grad_norm': 0.8985595808477231, 'learning_rate': 9.375337447724839e-06, 'epoch': 0.19} 19%|█▊ | 6376/34278 [7:04:31<26:40:57, 3.44s/it] 19%|█▊ | 6377/34278 [7:04:34<26:24:30, 3.41s/it] {'loss': 0.1645, 'grad_norm': 0.8896127544730188, 'learning_rate': 9.37510876962872e-06, 'epoch': 0.19} 19%|█▊ | 6377/34278 
[7:04:34<26:24:30, 3.41s/it] 19%|█▊ | 6378/34278 [7:04:37<24:48:58, 3.20s/it] {'loss': 0.1495, 'grad_norm': 0.9140614984575676, 'learning_rate': 9.374880052472674e-06, 'epoch': 0.19} 19%|█▊ | 6378/34278 [7:04:37<24:48:58, 3.20s/it] 19%|█▊ | 6379/34278 [7:04:43<30:59:05, 4.00s/it] {'loss': 0.154, 'grad_norm': 0.8551668307596201, 'learning_rate': 9.374651296258743e-06, 'epoch': 0.19} 19%|█▊ | 6379/34278 [7:04:43<30:59:05, 4.00s/it] 19%|█▊ | 6380/34278 [7:04:46<29:03:10, 3.75s/it] {'loss': 0.1641, 'grad_norm': 0.7345254800454083, 'learning_rate': 9.374422500988971e-06, 'epoch': 0.19} 19%|█▊ | 6380/34278 [7:04:46<29:03:10, 3.75s/it] 19%|█▊ | 6381/34278 [7:04:50<30:50:18, 3.98s/it] {'loss': 0.2001, 'grad_norm': 0.8967007878719371, 'learning_rate': 9.374193666665397e-06, 'epoch': 0.19} 19%|█▊ | 6381/34278 [7:04:50<30:50:18, 3.98s/it] 19%|█▊ | 6382/34278 [7:04:53<28:42:19, 3.70s/it] {'loss': 0.1617, 'grad_norm': 1.0206969373347077, 'learning_rate': 9.373964793290067e-06, 'epoch': 0.19} 19%|█▊ | 6382/34278 [7:04:53<28:42:19, 3.70s/it] 19%|█▊ | 6383/34278 [7:04:58<30:51:04, 3.98s/it] {'loss': 0.1779, 'grad_norm': 0.9382054746335238, 'learning_rate': 9.373735880865024e-06, 'epoch': 0.19} 19%|█▊ | 6383/34278 [7:04:58<30:51:04, 3.98s/it] 19%|█▊ | 6384/34278 [7:05:02<29:54:54, 3.86s/it] {'loss': 0.1626, 'grad_norm': 0.7812344511784418, 'learning_rate': 9.373506929392311e-06, 'epoch': 0.19} 19%|█▊ | 6384/34278 [7:05:02<29:54:54, 3.86s/it] 19%|█▊ | 6385/34278 [7:05:05<28:32:33, 3.68s/it] {'loss': 0.1775, 'grad_norm': 0.9776708510589419, 'learning_rate': 9.373277938873973e-06, 'epoch': 0.19} 19%|█▊ | 6385/34278 [7:05:05<28:32:33, 3.68s/it] 19%|█▊ | 6386/34278 [7:05:11<33:33:12, 4.33s/it] {'loss': 0.1786, 'grad_norm': 0.9224752017283132, 'learning_rate': 9.373048909312052e-06, 'epoch': 0.19} 19%|█▊ | 6386/34278 [7:05:11<33:33:12, 4.33s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8389 > 8192). 
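The `Token indices sequence length is longer than the specified maximum sequence length` warnings (8389 here, 11120 later, against an 8192 limit) mean some samples tokenize past the model's maximum context; if such a sequence reaches the model unchanged it can index past the position-embedding table. A minimal guard, as a sketch only (`truncate_ids` and the 8192 constant are assumptions read off the warning, not the project's code):

```python
MAX_LEN = 8192  # model context limit reported in the warning

def truncate_ids(input_ids: list, max_len: int = MAX_LEN) -> list:
    """Clip an over-long token sequence so position ids stay in range."""
    if len(input_ids) > max_len:
        # Keeping the head is the simplest policy; dropping or re-chunking
        # the sample in the dataset are common alternatives.
        return input_ids[:max_len]
    return input_ids
```

A stricter alternative is to filter such samples out during preprocessing, so the loss is never computed on a conversation whose tail was clipped mid-turn.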
Running this sequence through the model will result in indexing errors 19%|█▊ | 6387/34278 [7:05:17<37:18:06, 4.81s/it] {'loss': 0.1732, 'grad_norm': 0.6783865099924923, 'learning_rate': 9.372819840708594e-06, 'epoch': 0.19} 19%|█▊ | 6387/34278 [7:05:17<37:18:06, 4.81s/it] 19%|█▊ | 6388/34278 [7:05:20<33:09:01, 4.28s/it] {'loss': 0.1851, 'grad_norm': 1.3051642832191346, 'learning_rate': 9.372590733065645e-06, 'epoch': 0.19} 19%|█▊ | 6388/34278 [7:05:20<33:09:01, 4.28s/it] 19%|█▊ | 6389/34278 [7:05:23<31:18:43, 4.04s/it] {'loss': 0.1774, 'grad_norm': 1.0003059103132785, 'learning_rate': 9.37236158638525e-06, 'epoch': 0.19} 19%|█▊ | 6389/34278 [7:05:23<31:18:43, 4.04s/it] 19%|█▊ | 6390/34278 [7:05:28<32:34:18, 4.20s/it] {'loss': 0.1906, 'grad_norm': 1.2418311183576725, 'learning_rate': 9.372132400669456e-06, 'epoch': 0.19} 19%|█▊ | 6390/34278 [7:05:28<32:34:18, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▊ | 6391/34278 [7:05:31<30:52:40, 3.99s/it] {'loss': 0.1789, 'grad_norm': 1.1159368494090276, 'learning_rate': 9.371903175920306e-06, 'epoch': 0.19} 19%|█▊ | 6391/34278 [7:05:31<30:52:40, 3.99s/it] 19%|█▊ | 6392/34278 [7:05:34<28:11:59, 3.64s/it] {'loss': 0.1568, 'grad_norm': 0.8045398393384725, 'learning_rate': 9.371673912139847e-06, 'epoch': 0.19} 19%|█▊ | 6392/34278 [7:05:34<28:11:59, 3.64s/it] 19%|█▊ | 6393/34278 [7:05:38<28:11:48, 3.64s/it] {'loss': 0.1633, 'grad_norm': 0.8406425441848276, 'learning_rate': 9.371444609330129e-06, 'epoch': 0.19} 19%|█▊ | 6393/34278 [7:05:38<28:11:48, 3.64s/it] 19%|█▊ | 6394/34278 [7:05:42<30:52:40, 3.99s/it] {'loss': 0.1641, 'grad_norm': 0.7074436767919758, 'learning_rate': 9.371215267493195e-06, 'epoch': 0.19} 19%|█▊ | 6394/34278 [7:05:42<30:52:40, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▊ | 6395/34278 [7:05:46<29:18:16, 3.78s/it] {'loss': 0.1834, 'grad_norm': 0.8501370586374717, 'learning_rate': 9.370985886631097e-06, 'epoch': 0.19} 19%|█▊ | 6395/34278 [7:05:46<29:18:16, 3.78s/it] 19%|█▊ | 6396/34278 [7:05:49<27:30:04, 3.55s/it] {'loss': 0.1542, 'grad_norm': 0.9463208895737822, 'learning_rate': 9.370756466745879e-06, 'epoch': 0.19} 19%|█▊ | 6396/34278 [7:05:49<27:30:04, 3.55s/it] 19%|█▊ | 6397/34278 [7:05:52<26:53:54, 3.47s/it] {'loss': 0.1657, 'grad_norm': 0.8719077905712171, 'learning_rate': 9.37052700783959e-06, 'epoch': 0.19} 19%|█▊ | 6397/34278 [7:05:52<26:53:54, 3.47s/it] 19%|█▊ | 6398/34278 [7:05:57<29:47:53, 3.85s/it] {'loss': 0.1524, 'grad_norm': 0.9821077085761606, 'learning_rate': 9.37029750991428e-06, 'epoch': 0.19} 19%|█▊ | 6398/34278 [7:05:57<29:47:53, 3.85s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▊ | 6399/34278 [7:06:00<28:18:38, 3.66s/it] {'loss': 0.1602, 'grad_norm': 0.8425957668888753, 'learning_rate': 9.370067972971998e-06, 'epoch': 0.19} 19%|█▊ | 6399/34278 [7:06:00<28:18:38, 3.66s/it] 19%|█▊ | 6400/34278 [7:06:03<27:53:57, 3.60s/it] {'loss': 0.1914, 'grad_norm': 0.9156979950842294, 'learning_rate': 9.369838397014792e-06, 'epoch': 0.19} 19%|█▊ | 6400/34278 [7:06:03<27:53:57, 3.60s/it] 19%|█▊ | 6401/34278 [7:06:08<30:59:45, 4.00s/it] {'loss': 0.1723, 'grad_norm': 0.9284921112275565, 'learning_rate': 9.36960878204471e-06, 'epoch': 0.19} 19%|█▊ | 6401/34278 [7:06:08<30:59:45, 4.00s/it] 19%|█▊ | 6402/34278 [7:06:11<28:35:38, 3.69s/it] {'loss': 0.1695, 'grad_norm': 0.813841492557956, 'learning_rate': 9.369379128063807e-06, 'epoch': 0.19} 19%|█▊ | 6402/34278 [7:06:11<28:35:38, 3.69s/it] 19%|█▊ | 6403/34278 [7:06:14<27:08:37, 3.51s/it] {'loss': 0.1486, 'grad_norm': 1.0530021092744695, 'learning_rate': 9.369149435074127e-06, 'epoch': 0.19} 19%|█▊ | 6403/34278 [7:06:14<27:08:37, 3.51s/it] 19%|█▊ | 6404/34278 [7:06:18<27:15:14, 3.52s/it] {'loss': 0.1685, 'grad_norm': 0.8517192153127459, 'learning_rate': 9.368919703077726e-06, 'epoch': 0.19} 19%|█▊ | 6404/34278 [7:06:18<27:15:14, 3.52s/it] 19%|█▊ | 6405/34278 [7:06:21<25:56:26, 3.35s/it] {'loss': 0.1774, 'grad_norm': 0.9136196497971955, 'learning_rate': 9.368689932076651e-06, 'epoch': 0.19} 19%|█▊ | 6405/34278 [7:06:21<25:56:26, 3.35s/it] 19%|█▊ | 6406/34278 [7:06:24<25:40:36, 3.32s/it] {'loss': 0.1422, 'grad_norm': 0.6810571973131726, 'learning_rate': 9.368460122072958e-06, 'epoch': 0.19} 19%|█▊ | 6406/34278 [7:06:24<25:40:36, 3.32s/it] 19%|█▊ | 6407/34278 [7:06:28<25:58:23, 3.35s/it] {'loss': 0.1691, 'grad_norm': 0.98660818876648, 'learning_rate': 9.368230273068694e-06, 'epoch': 0.19} 19%|█▊ | 6407/34278 [7:06:28<25:58:23, 3.35s/it] 19%|█▊ | 6408/34278 [7:06:31<26:06:44, 3.37s/it] {'loss': 0.1296, 'grad_norm': 0.6836505378334928, 'learning_rate': 9.368000385065914e-06, 
'epoch': 0.19} 19%|█▊ | 6408/34278 [7:06:31<26:06:44, 3.37s/it] 19%|█▊ | 6409/34278 [7:06:34<25:00:33, 3.23s/it] {'loss': 0.1753, 'grad_norm': 1.1529083326425609, 'learning_rate': 9.367770458066668e-06, 'epoch': 0.19} 19%|█▊ | 6409/34278 [7:06:34<25:00:33, 3.23s/it] 19%|█▊ | 6410/34278 [7:06:37<24:07:09, 3.12s/it] {'loss': 0.188, 'grad_norm': 0.9984352785617496, 'learning_rate': 9.36754049207301e-06, 'epoch': 0.19} 19%|█▊ | 6410/34278 [7:06:37<24:07:09, 3.12s/it] 19%|█▊ | 6411/34278 [7:06:41<25:36:08, 3.31s/it] {'loss': 0.1552, 'grad_norm': 0.8513446484147696, 'learning_rate': 9.367310487086994e-06, 'epoch': 0.19} 19%|█▊ | 6411/34278 [7:06:41<25:36:08, 3.31s/it] 19%|█▊ | 6412/34278 [7:06:46<30:04:18, 3.88s/it] {'loss': 0.1618, 'grad_norm': 0.8994726050458148, 'learning_rate': 9.367080443110672e-06, 'epoch': 0.19} 19%|█▊ | 6412/34278 [7:06:46<30:04:18, 3.88s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 19%|█▊ | 6413/34278 [7:06:49<28:59:21, 3.75s/it] {'loss': 0.1932, 'grad_norm': 0.9579227745596848, 'learning_rate': 9.366850360146098e-06, 'epoch': 0.19} 19%|█▊ | 6413/34278 [7:06:49<28:59:21, 3.75s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11120 > 8192). 
Running this sequence through the model will result in indexing errors 19%|█▊ | 6414/34278 [7:06:53<30:06:39, 3.89s/it] {'loss': 0.1573, 'grad_norm': 0.836438245202908, 'learning_rate': 9.366620238195327e-06, 'epoch': 0.19} 19%|█▊ | 6414/34278 [7:06:53<30:06:39, 3.89s/it] 19%|█▊ | 6415/34278 [7:06:57<29:23:13, 3.80s/it] {'loss': 0.1408, 'grad_norm': 0.9597197293795949, 'learning_rate': 9.366390077260413e-06, 'epoch': 0.19} 19%|█▊ | 6415/34278 [7:06:57<29:23:13, 3.80s/it] 19%|█▊ | 6416/34278 [7:07:00<28:02:09, 3.62s/it] {'loss': 0.1932, 'grad_norm': 0.878411391925992, 'learning_rate': 9.366159877343411e-06, 'epoch': 0.19} 19%|█▊ | 6416/34278 [7:07:00<28:02:09, 3.62s/it] 19%|█▊ | 6417/34278 [7:07:04<28:03:48, 3.63s/it] {'loss': 0.1526, 'grad_norm': 0.8044571836578218, 'learning_rate': 9.365929638446375e-06, 'epoch': 0.19} 19%|█▊ | 6417/34278 [7:07:04<28:03:48, 3.63s/it] 19%|█▊ | 6418/34278 [7:07:10<33:45:34, 4.36s/it] {'loss': 0.1624, 'grad_norm': 0.8660836589451121, 'learning_rate': 9.365699360571361e-06, 'epoch': 0.19} 19%|█▊ | 6418/34278 [7:07:10<33:45:34, 4.36s/it] 19%|█▊ | 6419/34278 [7:07:13<30:36:38, 3.96s/it] {'loss': 0.17, 'grad_norm': 0.7468442113023918, 'learning_rate': 9.365469043720428e-06, 'epoch': 0.19} 19%|█▊ | 6419/34278 [7:07:13<30:36:38, 3.96s/it] 19%|█▊ | 6420/34278 [7:07:16<28:32:39, 3.69s/it] {'loss': 0.152, 'grad_norm': 0.9647577885839785, 'learning_rate': 9.365238687895626e-06, 'epoch': 0.19} 19%|█▊ | 6420/34278 [7:07:16<28:32:39, 3.69s/it] 19%|█▊ | 6421/34278 [7:07:19<27:27:20, 3.55s/it] {'loss': 0.1507, 'grad_norm': 0.8733550915431748, 'learning_rate': 9.365008293099017e-06, 'epoch': 0.19} 19%|█▊ | 6421/34278 [7:07:19<27:27:20, 3.55s/it] 19%|█▊ | 6422/34278 [7:07:25<33:13:24, 4.29s/it] {'loss': 0.1737, 'grad_norm': 0.833192434914469, 'learning_rate': 9.364777859332656e-06, 'epoch': 0.19} 19%|█▊ | 6422/34278 [7:07:25<33:13:24, 4.29s/it] 19%|█▊ | 6423/34278 [7:07:29<31:51:02, 4.12s/it] {'loss': 0.1855, 'grad_norm': 0.943764908474735, 
'learning_rate': 9.364547386598599e-06, 'epoch': 0.19} 19%|█▊ | 6423/34278 [7:07:29<31:51:02, 4.12s/it] 19%|█▊ | 6424/34278 [7:07:33<30:32:18, 3.95s/it] {'loss': 0.1575, 'grad_norm': 0.9674986000953253, 'learning_rate': 9.364316874898906e-06, 'epoch': 0.19} 19%|█▊ | 6424/34278 [7:07:33<30:32:18, 3.95s/it] 19%|█▊ | 6425/34278 [7:07:36<28:16:54, 3.66s/it] {'loss': 0.1818, 'grad_norm': 0.8465644267118999, 'learning_rate': 9.364086324235634e-06, 'epoch': 0.19} 19%|█▊ | 6425/34278 [7:07:36<28:16:54, 3.66s/it] 19%|█▊ | 6426/34278 [7:07:41<32:59:38, 4.26s/it] {'loss': 0.158, 'grad_norm': 3.682477680581137, 'learning_rate': 9.36385573461084e-06, 'epoch': 0.19} 19%|█▊ | 6426/34278 [7:07:41<32:59:38, 4.26s/it] 19%|█▊ | 6427/34278 [7:07:44<29:41:17, 3.84s/it] {'loss': 0.1399, 'grad_norm': 1.1389325051338943, 'learning_rate': 9.363625106026585e-06, 'epoch': 0.19} 19%|█▊ | 6427/34278 [7:07:44<29:41:17, 3.84s/it] 19%|█▉ | 6428/34278 [7:07:47<27:37:48, 3.57s/it] {'loss': 0.1548, 'grad_norm': 0.9335206135751585, 'learning_rate': 9.363394438484926e-06, 'epoch': 0.19} 19%|█▉ | 6428/34278 [7:07:47<27:37:48, 3.57s/it] 19%|█▉ | 6429/34278 [7:07:50<26:12:57, 3.39s/it] {'loss': 0.1612, 'grad_norm': 0.9756779180721152, 'learning_rate': 9.363163731987924e-06, 'epoch': 0.19} 19%|█▉ | 6429/34278 [7:07:50<26:12:57, 3.39s/it] 19%|█▉ | 6430/34278 [7:07:53<25:55:03, 3.35s/it] {'loss': 0.1745, 'grad_norm': 0.9415783249737719, 'learning_rate': 9.362932986537636e-06, 'epoch': 0.19} 19%|█▉ | 6430/34278 [7:07:53<25:55:03, 3.35s/it] 19%|█▉ | 6431/34278 [7:07:56<25:04:08, 3.24s/it] {'loss': 0.1721, 'grad_norm': 0.91126906562831, 'learning_rate': 9.362702202136125e-06, 'epoch': 0.19} 19%|█▉ | 6431/34278 [7:07:56<25:04:08, 3.24s/it] 19%|█▉ | 6432/34278 [7:08:02<31:43:04, 4.10s/it] {'loss': 0.1499, 'grad_norm': 0.86616612377215, 'learning_rate': 9.36247137878545e-06, 'epoch': 0.19} 19%|█▉ | 6432/34278 [7:08:02<31:43:04, 
4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 19%|█▉ | 6433/34278 [7:08:06<29:58:32, 3.88s/it] {'loss': 0.1665, 'grad_norm': 1.0357674865729583, 'learning_rate': 9.362240516487672e-06, 'epoch': 0.19} 19%|█▉ | 6433/34278 [7:08:06<29:58:32, 3.88s/it] 19%|█▉ | 6434/34278 [7:08:09<27:35:16, 3.57s/it] {'loss': 0.1454, 'grad_norm': 0.82598036222361, 'learning_rate': 9.362009615244852e-06, 'epoch': 0.19} 19%|█▉ | 6434/34278 [7:08:09<27:35:16, 3.57s/it] 19%|█▉ | 6435/34278 [7:08:11<26:14:12, 3.39s/it] {'loss': 0.1768, 'grad_norm': 0.8879978864420757, 'learning_rate': 9.36177867505905e-06, 'epoch': 0.19} 19%|█▉ | 6435/34278 [7:08:11<26:14:12, 3.39s/it] 19%|█▉ | 6436/34278 [7:08:15<26:25:27, 3.42s/it] {'loss': 0.1527, 'grad_norm': 0.8781657754241868, 'learning_rate': 9.36154769593233e-06, 'epoch': 0.19} 19%|█▉ | 6436/34278 [7:08:15<26:25:27, 3.42s/it] 19%|█▉ | 6437/34278 [7:08:18<25:53:43, 3.35s/it] {'loss': 0.153, 'grad_norm': 0.9536049107109157, 'learning_rate': 9.361316677866756e-06, 'epoch': 0.19} 19%|█▉ | 6437/34278 [7:08:18<25:53:43, 3.35s/it] 19%|█▉ | 6438/34278 [7:08:23<30:03:26, 3.89s/it] {'loss': 0.1847, 'grad_norm': 0.8754771873434788, 'learning_rate': 9.361085620864384e-06, 'epoch': 0.19} 19%|█▉ | 6438/34278 [7:08:23<30:03:26, 3.89s/it] 19%|█▉ | 6439/34278 [7:08:27<28:41:11, 3.71s/it] {'loss': 0.1598, 'grad_norm': 0.9788522664013113, 'learning_rate': 9.360854524927283e-06, 'epoch': 0.19} 19%|█▉ | 6439/34278 [7:08:27<28:41:11, 3.71s/it] 19%|█▉ | 6440/34278 [7:08:30<27:48:02, 3.60s/it] {'loss': 0.1575, 'grad_norm': 0.8847500326674329, 'learning_rate': 9.360623390057513e-06, 'epoch': 0.19} 19%|█▉ | 6440/34278 [7:08:30<27:48:02, 3.60s/it] 19%|█▉ | 6441/34278 [7:08:33<26:20:58, 3.41s/it] {'loss': 0.1628, 'grad_norm': 0.8954633065214586, 'learning_rate': 9.36039221625714e-06, 'epoch': 
0.19} 19%|█▉ | 6441/34278 [7:08:33<26:20:58, 3.41s/it] 19%|█▉ | 6442/34278 [7:08:36<26:16:33, 3.40s/it] {'loss': 0.1855, 'grad_norm': 1.0958594723145838, 'learning_rate': 9.360161003528225e-06, 'epoch': 0.19} 19%|█▉ | 6442/34278 [7:08:36<26:16:33, 3.40s/it] 19%|█▉ | 6443/34278 [7:08:40<26:43:17, 3.46s/it] {'loss': 0.1757, 'grad_norm': 1.0493007576651754, 'learning_rate': 9.359929751872832e-06, 'epoch': 0.19} 19%|█▉ | 6443/34278 [7:08:40<26:43:17, 3.46s/it] 19%|█▉ | 6444/34278 [7:08:43<26:39:39, 3.45s/it] {'loss': 0.1831, 'grad_norm': 0.7734016970896587, 'learning_rate': 9.359698461293029e-06, 'epoch': 0.19} 19%|█▉ | 6444/34278 [7:08:43<26:39:39, 3.45s/it] 19%|█▉ | 6445/34278 [7:08:47<26:20:38, 3.41s/it] {'loss': 0.1845, 'grad_norm': 0.7670175090011172, 'learning_rate': 9.359467131790878e-06, 'epoch': 0.19} 19%|█▉ | 6445/34278 [7:08:47<26:20:38, 3.41s/it] 19%|█▉ | 6446/34278 [7:08:52<30:41:20, 3.97s/it] {'loss': 0.1661, 'grad_norm': 0.837747572703878, 'learning_rate': 9.359235763368444e-06, 'epoch': 0.19} 19%|█▉ | 6446/34278 [7:08:52<30:41:20, 3.97s/it] 19%|█▉ | 6447/34278 [7:08:57<32:54:38, 4.26s/it] {'loss': 0.18, 'grad_norm': 0.9472609193995318, 'learning_rate': 9.359004356027796e-06, 'epoch': 0.19} 19%|█▉ | 6447/34278 [7:08:57<32:54:38, 4.26s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▉ | 6448/34278 [7:09:00<29:31:50, 3.82s/it] {'loss': 0.1691, 'grad_norm': 0.7727561093733268, 'learning_rate': 9.358772909770996e-06, 'epoch': 0.19} 19%|█▉ | 6448/34278 [7:09:00<29:31:50, 3.82s/it] 19%|█▉ | 6449/34278 [7:09:02<27:01:31, 3.50s/it] {'loss': 0.1788, 'grad_norm': 0.9422799353347984, 'learning_rate': 9.358541424600112e-06, 'epoch': 0.19} 19%|█▉ | 6449/34278 [7:09:02<27:01:31, 3.50s/it] 19%|█▉ | 6450/34278 [7:09:05<25:37:58, 3.32s/it] {'loss': 0.2013, 'grad_norm': 1.0040165688649716, 'learning_rate': 9.358309900517212e-06, 'epoch': 0.19} 19%|█▉ | 6450/34278 [7:09:05<25:37:58, 3.32s/it] 19%|█▉ | 6451/34278 [7:09:10<28:18:21, 3.66s/it] {'loss': 0.1488, 'grad_norm': 0.6921997078968785, 'learning_rate': 9.358078337524362e-06, 'epoch': 0.19} 19%|█▉ | 6451/34278 [7:09:10<28:18:21, 3.66s/it] 19%|█▉ | 6452/34278 [7:09:13<27:56:04, 3.61s/it] {'loss': 0.1971, 'grad_norm': 0.8853309572382455, 'learning_rate': 9.357846735623627e-06, 'epoch': 0.19} 19%|█▉ | 6452/34278 [7:09:13<27:56:04, 3.61s/it] 19%|█▉ | 6453/34278 [7:09:17<27:27:31, 3.55s/it] {'loss': 0.1524, 'grad_norm': 0.9426241570187159, 'learning_rate': 9.357615094817076e-06, 'epoch': 0.19} 19%|█▉ | 6453/34278 [7:09:17<27:27:31, 3.55s/it] 19%|█▉ | 6454/34278 [7:09:20<27:42:18, 3.58s/it] {'loss': 0.1897, 'grad_norm': 0.876801031963035, 'learning_rate': 9.35738341510678e-06, 'epoch': 0.19} 19%|█▉ | 6454/34278 [7:09:20<27:42:18, 3.58s/it] 19%|█▉ | 6455/34278 [7:09:24<27:18:43, 3.53s/it] {'loss': 0.1811, 'grad_norm': 0.8590444765798828, 'learning_rate': 9.357151696494805e-06, 'epoch': 0.19} 19%|█▉ | 6455/34278 [7:09:24<27:18:43, 3.53s/it] 19%|█▉ | 6456/34278 [7:09:27<26:59:20, 3.49s/it] {'loss': 0.1648, 'grad_norm': 1.041073665471181, 'learning_rate': 9.356919938983217e-06, 'epoch': 0.19} 19%|█▉ | 6456/34278 [7:09:27<26:59:20, 3.49s/it] 19%|█▉ | 6457/34278 [7:09:30<26:11:24, 3.39s/it] {'loss': 0.1568, 'grad_norm': 0.7810445562439615, 'learning_rate': 9.35668814257409e-06, 
'epoch': 0.19} 19%|█▉ | 6457/34278 [7:09:30<26:11:24, 3.39s/it] 19%|█▉ | 6458/34278 [7:09:34<27:27:54, 3.55s/it] {'loss': 0.149, 'grad_norm': 0.9808872059364543, 'learning_rate': 9.356456307269493e-06, 'epoch': 0.19} 19%|█▉ | 6458/34278 [7:09:34<27:27:54, 3.55s/it] 19%|█▉ | 6459/34278 [7:09:37<26:07:38, 3.38s/it] {'loss': 0.1566, 'grad_norm': 0.7957062531699588, 'learning_rate': 9.35622443307149e-06, 'epoch': 0.19} 19%|█▉ | 6459/34278 [7:09:37<26:07:38, 3.38s/it] 19%|█▉ | 6460/34278 [7:09:40<25:26:57, 3.29s/it] {'loss': 0.1707, 'grad_norm': 0.9969133059553652, 'learning_rate': 9.355992519982159e-06, 'epoch': 0.19} 19%|█▉ | 6460/34278 [7:09:40<25:26:57, 3.29s/it] 19%|█▉ | 6461/34278 [7:09:45<29:51:35, 3.86s/it] {'loss': 0.1634, 'grad_norm': 0.9009970339068955, 'learning_rate': 9.355760568003564e-06, 'epoch': 0.19} 19%|█▉ | 6461/34278 [7:09:45<29:51:35, 3.86s/it] 19%|█▉ | 6462/34278 [7:09:48<27:42:41, 3.59s/it] {'loss': 0.158, 'grad_norm': 0.9034640625249046, 'learning_rate': 9.35552857713778e-06, 'epoch': 0.19} 19%|█▉ | 6462/34278 [7:09:48<27:42:41, 3.59s/it] 19%|█▉ | 6463/34278 [7:09:54<31:51:52, 4.12s/it] {'loss': 0.1796, 'grad_norm': 0.8307151872442313, 'learning_rate': 9.355296547386876e-06, 'epoch': 0.19} 19%|█▉ | 6463/34278 [7:09:54<31:51:52, 4.12s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▉ | 6464/34278 [7:09:57<30:09:10, 3.90s/it] {'loss': 0.1559, 'grad_norm': 0.7823639156560412, 'learning_rate': 9.355064478752925e-06, 'epoch': 0.19} 19%|█▉ | 6464/34278 [7:09:57<30:09:10, 3.90s/it] 19%|█▉ | 6465/34278 [7:10:01<30:40:34, 3.97s/it] {'loss': 0.1643, 'grad_norm': 1.1364863653210286, 'learning_rate': 9.354832371237996e-06, 'epoch': 0.19} 19%|█▉ | 6465/34278 [7:10:01<30:40:34, 3.97s/it] 19%|█▉ | 6466/34278 [7:10:05<28:58:47, 3.75s/it] {'loss': 0.1766, 'grad_norm': 0.819555976878176, 'learning_rate': 9.354600224844166e-06, 'epoch': 0.19} 19%|█▉ | 6466/34278 [7:10:05<28:58:47, 3.75s/it] 19%|█▉ | 6467/34278 [7:10:08<29:16:59, 3.79s/it] {'loss': 0.1578, 'grad_norm': 0.8021540114947207, 'learning_rate': 9.354368039573502e-06, 'epoch': 0.19} 19%|█▉ | 6467/34278 [7:10:08<29:16:59, 3.79s/it] 19%|█▉ | 6468/34278 [7:10:12<28:47:37, 3.73s/it] {'loss': 0.1749, 'grad_norm': 1.0294397190067515, 'learning_rate': 9.354135815428081e-06, 'epoch': 0.19} 19%|█▉ | 6468/34278 [7:10:12<28:47:37, 3.73s/it] 19%|█▉ | 6469/34278 [7:10:15<27:50:49, 3.60s/it] {'loss': 0.1583, 'grad_norm': 0.8596814036866786, 'learning_rate': 9.353903552409975e-06, 'epoch': 0.19} 19%|█▉ | 6469/34278 [7:10:15<27:50:49, 3.60s/it] 19%|█▉ | 6470/34278 [7:10:18<26:37:19, 3.45s/it] {'loss': 0.1912, 'grad_norm': 0.9579756939625307, 'learning_rate': 9.353671250521257e-06, 'epoch': 0.19} 19%|█▉ | 6470/34278 [7:10:18<26:37:19, 3.45s/it] 19%|█▉ | 6471/34278 [7:10:24<32:03:13, 4.15s/it] {'loss': 0.1682, 'grad_norm': 0.8536816187458158, 'learning_rate': 9.353438909764e-06, 'epoch': 0.19} 19%|█▉ | 6471/34278 [7:10:24<32:03:13, 4.15s/it] 19%|█▉ | 6472/34278 [7:10:28<31:30:13, 4.08s/it] {'loss': 0.1575, 'grad_norm': 0.8471358060067807, 'learning_rate': 9.353206530140282e-06, 'epoch': 0.19} 19%|█▉ | 6472/34278 [7:10:28<31:30:13, 4.08s/it] 19%|█▉ | 6473/34278 [7:10:32<30:04:16, 3.89s/it] {'loss': 0.1705, 'grad_norm': 0.7423011039671563, 'learning_rate': 9.352974111652174e-06, 
'epoch': 0.19} 19%|█▉ | 6473/34278 [7:10:32<30:04:16, 3.89s/it] 19%|█▉ | 6474/34278 [7:10:35<28:10:27, 3.65s/it] {'loss': 0.1919, 'grad_norm': 0.872510793966477, 'learning_rate': 9.352741654301752e-06, 'epoch': 0.19} 19%|█▉ | 6474/34278 [7:10:35<28:10:27, 3.65s/it] 19%|█▉ | 6475/34278 [7:10:38<26:52:24, 3.48s/it] {'loss': 0.176, 'grad_norm': 0.8843322687907549, 'learning_rate': 9.352509158091092e-06, 'epoch': 0.19} 19%|█▉ | 6475/34278 [7:10:38<26:52:24, 3.48s/it] 19%|█▉ | 6476/34278 [7:10:41<26:36:50, 3.45s/it] {'loss': 0.1737, 'grad_norm': 0.9153013391100571, 'learning_rate': 9.35227662302227e-06, 'epoch': 0.19} 19%|█▉ | 6476/34278 [7:10:41<26:36:50, 3.45s/it] 19%|█▉ | 6477/34278 [7:10:45<28:10:31, 3.65s/it] {'loss': 0.1556, 'grad_norm': 0.8340567504416462, 'learning_rate': 9.35204404909736e-06, 'epoch': 0.19} 19%|█▉ | 6477/34278 [7:10:45<28:10:31, 3.65s/it] 19%|█▉ | 6478/34278 [7:10:49<28:08:48, 3.64s/it] {'loss': 0.147, 'grad_norm': 0.7850765747011559, 'learning_rate': 9.35181143631844e-06, 'epoch': 0.19} 19%|█▉ | 6478/34278 [7:10:49<28:08:48, 3.64s/it] 19%|█▉ | 6479/34278 [7:10:52<26:31:51, 3.44s/it] {'loss': 0.1667, 'grad_norm': 1.04531413559154, 'learning_rate': 9.351578784687589e-06, 'epoch': 0.19} 19%|█▉ | 6479/34278 [7:10:52<26:31:51, 3.44s/it] 19%|█▉ | 6480/34278 [7:10:55<25:35:41, 3.31s/it] {'loss': 0.1461, 'grad_norm': 0.8802548522615019, 'learning_rate': 9.351346094206878e-06, 'epoch': 0.19} 19%|█▉ | 6480/34278 [7:10:55<25:35:41, 3.31s/it] 19%|█▉ | 6481/34278 [7:10:59<27:28:24, 3.56s/it] {'loss': 0.161, 'grad_norm': 0.8610718168542294, 'learning_rate': 9.351113364878388e-06, 'epoch': 0.19} 19%|█▉ | 6481/34278 [7:10:59<27:28:24, 3.56s/it] 19%|█▉ | 6482/34278 [7:11:03<29:17:46, 3.79s/it] {'loss': 0.1791, 'grad_norm': 0.9291714448392332, 'learning_rate': 9.350880596704199e-06, 'epoch': 0.19} 19%|█▉ | 6482/34278 [7:11:03<29:17:46, 
3.79s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 19%|█▉ | 6483/34278 [7:11:07<29:38:39, 3.84s/it] {'loss': 0.1772, 'grad_norm': 1.0332328719886352, 'learning_rate': 9.350647789686384e-06, 'epoch': 0.19} 19%|█▉ | 6483/34278 [7:11:07<29:38:39, 3.84s/it] 19%|█▉ | 6484/34278 [7:11:10<27:50:19, 3.61s/it] {'loss': 0.1466, 'grad_norm': 0.7525244706407437, 'learning_rate': 9.350414943827027e-06, 'epoch': 0.19} 19%|█▉ | 6484/34278 [7:11:10<27:50:19, 3.61s/it] 19%|█▉ | 6485/34278 [7:11:16<31:39:30, 4.10s/it] {'loss': 0.1645, 'grad_norm': 1.052568781939131, 'learning_rate': 9.350182059128202e-06, 'epoch': 0.19} 19%|█▉ | 6485/34278 [7:11:16<31:39:30, 4.10s/it] 19%|█▉ | 6486/34278 [7:11:19<29:37:36, 3.84s/it] {'loss': 0.1478, 'grad_norm': 0.8692515713275414, 'learning_rate': 9.34994913559199e-06, 'epoch': 0.19} 19%|█▉ | 6486/34278 [7:11:19<29:37:36, 3.84s/it] 19%|█▉ | 6487/34278 [7:11:22<27:18:15, 3.54s/it] {'loss': 0.1699, 'grad_norm': 0.7641666212824467, 'learning_rate': 9.34971617322047e-06, 'epoch': 0.19} 19%|█▉ | 6487/34278 [7:11:22<27:18:15, 3.54s/it] 19%|█▉ | 6488/34278 [7:11:24<25:44:46, 3.34s/it] {'loss': 0.1503, 'grad_norm': 1.012211156978612, 'learning_rate': 9.349483172015723e-06, 'epoch': 0.19} 19%|█▉ | 6488/34278 [7:11:24<25:44:46, 3.34s/it] 19%|█▉ | 6489/34278 [7:11:28<26:06:30, 3.38s/it] {'loss': 0.1756, 'grad_norm': 0.8705933276035973, 'learning_rate': 9.349250131979829e-06, 'epoch': 0.19} 19%|█▉ | 6489/34278 [7:11:28<26:06:30, 3.38s/it] 19%|█▉ | 6490/34278 [7:11:31<25:13:52, 3.27s/it] {'loss': 0.1706, 'grad_norm': 0.8260686852987795, 'learning_rate': 9.349017053114868e-06, 'epoch': 0.19} 19%|█▉ | 6490/34278 [7:11:31<25:13:52, 3.27s/it] 19%|█▉ | 6491/34278 [7:11:34<25:17:49, 3.28s/it] {'loss': 0.1625, 'grad_norm': 1.0823426214485259, 'learning_rate': 9.34878393542292e-06, 'epoch': 
0.19} 19%|█▉ | 6491/34278 [7:11:34<25:17:49, 3.28s/it] 19%|█▉ | 6492/34278 [7:11:38<25:39:00, 3.32s/it] {'loss': 0.1488, 'grad_norm': 0.838019795588882, 'learning_rate': 9.348550778906069e-06, 'epoch': 0.19} 19%|█▉ | 6492/34278 [7:11:38<25:39:00, 3.32s/it] 19%|█▉ | 6493/34278 [7:11:41<25:49:24, 3.35s/it] {'loss': 0.161, 'grad_norm': 0.8252509549584727, 'learning_rate': 9.348317583566393e-06, 'epoch': 0.19} 19%|█▉ | 6493/34278 [7:11:41<25:49:24, 3.35s/it] 19%|█▉ | 6494/34278 [7:11:45<26:24:12, 3.42s/it] {'loss': 0.1573, 'grad_norm': 0.8667037898451724, 'learning_rate': 9.348084349405977e-06, 'epoch': 0.19} 19%|█▉ | 6494/34278 [7:11:45<26:24:12, 3.42s/it] 19%|█▉ | 6495/34278 [7:11:48<25:46:00, 3.34s/it] {'loss': 0.1616, 'grad_norm': 0.9632097702897997, 'learning_rate': 9.347851076426902e-06, 'epoch': 0.19} 19%|█▉ | 6495/34278 [7:11:48<25:46:00, 3.34s/it] 19%|█▉ | 6496/34278 [7:11:51<24:34:20, 3.18s/it] {'loss': 0.1696, 'grad_norm': 0.9263513470808169, 'learning_rate': 9.347617764631248e-06, 'epoch': 0.19} 19%|█▉ | 6496/34278 [7:11:51<24:34:20, 3.18s/it] 19%|█▉ | 6497/34278 [7:11:54<24:29:33, 3.17s/it] {'loss': 0.1867, 'grad_norm': 0.9603998868666372, 'learning_rate': 9.347384414021103e-06, 'epoch': 0.19} 19%|█▉ | 6497/34278 [7:11:54<24:29:33, 3.17s/it] 19%|█▉ | 6498/34278 [7:11:57<24:26:17, 3.17s/it] {'loss': 0.1646, 'grad_norm': 0.9443097850177389, 'learning_rate': 9.347151024598547e-06, 'epoch': 0.19} 19%|█▉ | 6498/34278 [7:11:57<24:26:17, 3.17s/it] 19%|█▉ | 6499/34278 [7:12:01<25:35:51, 3.32s/it] {'loss': 0.191, 'grad_norm': 0.9662622519989555, 'learning_rate': 9.346917596365663e-06, 'epoch': 0.19} 19%|█▉ | 6499/34278 [7:12:01<25:35:51, 3.32s/it] 19%|█▉ | 6500/34278 [7:12:05<28:06:01, 3.64s/it] {'loss': 0.1623, 'grad_norm': 0.7954713778166729, 'learning_rate': 9.346684129324539e-06, 'epoch': 0.19} 19%|█▉ | 6500/34278 [7:12:05<28:06:01, 
3.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 19%|█▉ | 6501/34278 [7:12:08<26:56:17, 3.49s/it] {'loss': 0.1905, 'grad_norm': 0.8878743061399259, 'learning_rate': 9.346450623477255e-06, 'epoch': 0.19} 19%|█▉ | 6501/34278 [7:12:08<26:56:17, 3.49s/it] 19%|█▉ | 6502/34278 [7:12:11<26:08:31, 3.39s/it] {'loss': 0.1586, 'grad_norm': 1.0539982976897557, 'learning_rate': 9.346217078825898e-06, 'epoch': 0.19} 19%|█▉ | 6502/34278 [7:12:11<26:08:31, 3.39s/it] 19%|█▉ | 6503/34278 [7:12:17<32:08:15, 4.17s/it] {'loss': 0.1772, 'grad_norm': 0.796267607966624, 'learning_rate': 9.345983495372552e-06, 'epoch': 0.19} 19%|█▉ | 6503/34278 [7:12:17<32:08:15, 4.17s/it] 19%|█▉ | 6504/34278 [7:12:22<32:55:54, 4.27s/it] {'loss': 0.1952, 'grad_norm': 0.9537548330363622, 'learning_rate': 9.345749873119304e-06, 'epoch': 0.19} 19%|█▉ | 6504/34278 [7:12:22<32:55:54, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 19%|█▉ | 6505/34278 [7:12:25<30:46:55, 3.99s/it] {'loss': 0.1787, 'grad_norm': 1.1008248733668338, 'learning_rate': 9.345516212068237e-06, 'epoch': 0.19} 19%|█▉ | 6505/34278 [7:12:25<30:46:55, 3.99s/it] 19%|█▉ | 6506/34278 [7:12:30<32:55:41, 4.27s/it] {'loss': 0.1526, 'grad_norm': 0.9523573448915854, 'learning_rate': 9.34528251222144e-06, 'epoch': 0.19} 19%|█▉ | 6506/34278 [7:12:30<32:55:41, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▉ | 6507/34278 [7:12:33<30:20:53, 3.93s/it] {'loss': 0.1711, 'grad_norm': 0.9023001720464379, 'learning_rate': 9.345048773580995e-06, 'epoch': 0.19} 19%|█▉ | 6507/34278 [7:12:33<30:20:53, 3.93s/it] 19%|█▉ | 6508/34278 [7:12:37<29:26:00, 3.82s/it] {'loss': 0.1614, 'grad_norm': 0.9134559758736757, 'learning_rate': 9.344814996148995e-06, 'epoch': 0.19} 19%|█▉ | 6508/34278 [7:12:37<29:26:00, 3.82s/it] 19%|█▉ | 6509/34278 [7:12:40<28:37:36, 3.71s/it] {'loss': 0.1853, 'grad_norm': 0.9209440705402696, 'learning_rate': 9.344581179927523e-06, 'epoch': 0.19} 19%|█▉ | 6509/34278 [7:12:40<28:37:36, 3.71s/it] 19%|█▉ | 6510/34278 [7:12:43<27:19:50, 3.54s/it] {'loss': 0.203, 'grad_norm': 1.0425855002494953, 'learning_rate': 9.344347324918667e-06, 'epoch': 0.19} 19%|█▉ | 6510/34278 [7:12:43<27:19:50, 3.54s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8662 > 8192). Running this sequence through the model will result in indexing errors 19%|█▉ | 6511/34278 [7:12:47<26:44:01, 3.47s/it] {'loss': 0.1594, 'grad_norm': 0.9322617122236974, 'learning_rate': 9.344113431124517e-06, 'epoch': 0.19} 19%|█▉ | 6511/34278 [7:12:47<26:44:01, 3.47s/it] 19%|█▉ | 6512/34278 [7:12:51<29:31:51, 3.83s/it] {'loss': 0.1669, 'grad_norm': 0.7838747514885989, 'learning_rate': 9.343879498547157e-06, 'epoch': 0.19} 19%|█▉ | 6512/34278 [7:12:51<29:31:51, 3.83s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▉ | 6513/34278 [7:12:55<30:16:33, 3.93s/it] {'loss': 0.1763, 'grad_norm': 1.0160814798959792, 'learning_rate': 9.343645527188678e-06, 'epoch': 0.19} 19%|█▉ | 6513/34278 [7:12:55<30:16:33, 3.93s/it] 19%|█▉ | 6514/34278 [7:12:59<28:21:24, 3.68s/it] {'loss': 0.159, 'grad_norm': 0.9135837272844288, 'learning_rate': 9.34341151705117e-06, 'epoch': 0.19} 19%|█▉ | 6514/34278 [7:12:59<28:21:24, 3.68s/it] 19%|█▉ | 6515/34278 [7:13:03<29:46:45, 3.86s/it] {'loss': 0.1617, 'grad_norm': 0.7831207396365385, 'learning_rate': 9.34317746813672e-06, 'epoch': 0.19} 19%|█▉ | 6515/34278 [7:13:03<29:46:45, 3.86s/it] 19%|█▉ | 6516/34278 [7:13:09<35:27:49, 4.60s/it] {'loss': 0.1857, 'grad_norm': 0.8770832195483749, 'learning_rate': 9.342943380447417e-06, 'epoch': 0.19} 19%|█▉ | 6516/34278 [7:13:09<35:27:49, 4.60s/it] 19%|█▉ | 6517/34278 [7:13:12<32:25:12, 4.20s/it] {'loss': 0.1663, 'grad_norm': 0.8567349049776124, 'learning_rate': 9.342709253985356e-06, 'epoch': 0.19} 19%|█▉ | 6517/34278 [7:13:12<32:25:12, 4.20s/it] 19%|█▉ | 6518/34278 [7:13:16<30:00:26, 3.89s/it] {'loss': 0.1398, 'grad_norm': 0.6983197993077767, 'learning_rate': 9.342475088752621e-06, 'epoch': 0.19} 19%|█▉ | 6518/34278 [7:13:16<30:00:26, 3.89s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8839 > 8192). 
Running this sequence through the model will result in indexing errors 19%|█▉ | 6519/34278 [7:13:19<27:46:27, 3.60s/it] {'loss': 0.1837, 'grad_norm': 0.8021887784195425, 'learning_rate': 9.342240884751305e-06, 'epoch': 0.19} 19%|█▉ | 6519/34278 [7:13:19<27:46:27, 3.60s/it] 19%|█▉ | 6520/34278 [7:13:24<32:54:39, 4.27s/it] {'loss': 0.1656, 'grad_norm': 0.8892876153441276, 'learning_rate': 9.342006641983499e-06, 'epoch': 0.19} 19%|█▉ | 6520/34278 [7:13:25<32:54:39, 4.27s/it] 19%|█▉ | 6521/34278 [7:13:28<31:47:28, 4.12s/it] {'loss': 0.1438, 'grad_norm': 0.8278555770237782, 'learning_rate': 9.341772360451294e-06, 'epoch': 0.19} 19%|█▉ | 6521/34278 [7:13:28<31:47:28, 4.12s/it] 19%|█▉ | 6522/34278 [7:13:32<30:06:44, 3.91s/it] {'loss': 0.1524, 'grad_norm': 0.9707716915155568, 'learning_rate': 9.341538040156783e-06, 'epoch': 0.19} 19%|█▉ | 6522/34278 [7:13:32<30:06:44, 3.91s/it] 19%|█▉ | 6523/34278 [7:13:34<27:33:46, 3.58s/it] {'loss': 0.178, 'grad_norm': 0.9330398926275478, 'learning_rate': 9.341303681102056e-06, 'epoch': 0.19} 19%|█▉ | 6523/34278 [7:13:34<27:33:46, 3.58s/it] 19%|█▉ | 6524/34278 [7:13:39<31:01:42, 4.02s/it] {'loss': 0.1646, 'grad_norm': 0.7281296627655914, 'learning_rate': 9.341069283289207e-06, 'epoch': 0.19} 19%|█▉ | 6524/34278 [7:13:39<31:01:42, 4.02s/it] 19%|█▉ | 6525/34278 [7:13:42<28:27:04, 3.69s/it] {'loss': 0.1605, 'grad_norm': 1.0189311532604206, 'learning_rate': 9.340834846720326e-06, 'epoch': 0.19} 19%|█▉ | 6525/34278 [7:13:42<28:27:04, 3.69s/it] 19%|█▉ | 6526/34278 [7:13:45<26:44:24, 3.47s/it] {'loss': 0.1555, 'grad_norm': 0.7388679711664593, 'learning_rate': 9.340600371397508e-06, 'epoch': 0.19} 19%|█▉ | 6526/34278 [7:13:45<26:44:24, 3.47s/it] 19%|█▉ | 6527/34278 [7:13:48<25:34:17, 3.32s/it] {'loss': 0.1565, 'grad_norm': 0.9170058175685646, 'learning_rate': 9.340365857322846e-06, 'epoch': 0.19} 19%|█▉ | 6527/34278 [7:13:48<25:34:17, 3.32s/it] 19%|█▉ | 6528/34278 [7:13:51<24:49:32, 3.22s/it] {'loss': 0.1416, 'grad_norm': 0.7596638777324773, 
'learning_rate': 9.340131304498435e-06, 'epoch': 0.19} 19%|█▉ | 6528/34278 [7:13:51<24:49:32, 3.22s/it] 19%|█▉ | 6529/34278 [7:13:55<25:30:28, 3.31s/it] {'loss': 0.187, 'grad_norm': 0.8463017137810688, 'learning_rate': 9.339896712926367e-06, 'epoch': 0.19} 19%|█▉ | 6529/34278 [7:13:55<25:30:28, 3.31s/it] 19%|█▉ | 6530/34278 [7:13:58<24:49:01, 3.22s/it] {'loss': 0.1685, 'grad_norm': 0.7594154273046081, 'learning_rate': 9.339662082608739e-06, 'epoch': 0.19} 19%|█▉ | 6530/34278 [7:13:58<24:49:01, 3.22s/it] 19%|█▉ | 6531/34278 [7:14:02<27:38:28, 3.59s/it] {'loss': 0.16, 'grad_norm': 0.7411891091664682, 'learning_rate': 9.33942741354764e-06, 'epoch': 0.19} 19%|█▉ | 6531/34278 [7:14:02<27:38:28, 3.59s/it] 19%|█▉ | 6532/34278 [7:14:05<26:37:55, 3.46s/it] {'loss': 0.1768, 'grad_norm': 0.9121398575667056, 'learning_rate': 9.339192705745172e-06, 'epoch': 0.19} 19%|█▉ | 6532/34278 [7:14:05<26:37:55, 3.46s/it] 19%|█▉ | 6533/34278 [7:14:09<26:05:40, 3.39s/it] {'loss': 0.1651, 'grad_norm': 0.8243088908698623, 'learning_rate': 9.338957959203427e-06, 'epoch': 0.19} 19%|█▉ | 6533/34278 [7:14:09<26:05:40, 3.39s/it] 19%|█▉ | 6534/34278 [7:14:12<25:26:08, 3.30s/it] {'loss': 0.1877, 'grad_norm': 0.8719208184465156, 'learning_rate': 9.3387231739245e-06, 'epoch': 0.19} 19%|█▉ | 6534/34278 [7:14:12<25:26:08, 3.30s/it] 19%|█▉ | 6535/34278 [7:14:15<25:40:36, 3.33s/it] {'loss': 0.1681, 'grad_norm': 0.752749898025691, 'learning_rate': 9.338488349910489e-06, 'epoch': 0.19} 19%|█▉ | 6535/34278 [7:14:15<25:40:36, 3.33s/it] 19%|█▉ | 6536/34278 [7:14:21<31:43:55, 4.12s/it] {'loss': 0.1636, 'grad_norm': 1.0676027126913896, 'learning_rate': 9.33825348716349e-06, 'epoch': 0.19} 19%|█▉ | 6536/34278 [7:14:21<31:43:55, 4.12s/it] 19%|█▉ | 6537/34278 [7:14:24<29:17:03, 3.80s/it] {'loss': 0.1743, 'grad_norm': 0.745378376347966, 'learning_rate': 9.338018585685599e-06, 'epoch': 0.19} 19%|█▉ | 6537/34278 [7:14:24<29:17:03, 3.80s/it] 19%|█▉ | 6538/34278 [7:14:28<28:19:37, 3.68s/it] {'loss': 0.172, 'grad_norm': 
0.867800714762912, 'learning_rate': 9.337783645478912e-06, 'epoch': 0.19} 19%|█▉ | 6538/34278 [7:14:28<28:19:37, 3.68s/it] 19%|█▉ | 6539/34278 [7:14:31<26:56:39, 3.50s/it] {'loss': 0.2088, 'grad_norm': 1.0383533865490961, 'learning_rate': 9.337548666545532e-06, 'epoch': 0.19} 19%|█▉ | 6539/34278 [7:14:31<26:56:39, 3.50s/it] 19%|█▉ | 6540/34278 [7:14:34<27:32:28, 3.57s/it] {'loss': 0.1437, 'grad_norm': 0.8019896158769209, 'learning_rate': 9.33731364888755e-06, 'epoch': 0.19} 19%|█▉ | 6540/34278 [7:14:34<27:32:28, 3.57s/it] 19%|█▉ | 6541/34278 [7:14:38<27:14:23, 3.54s/it] {'loss': 0.1537, 'grad_norm': 0.7264352087170607, 'learning_rate': 9.337078592507069e-06, 'epoch': 0.19} 19%|█▉ | 6541/34278 [7:14:38<27:14:23, 3.54s/it] 19%|█▉ | 6542/34278 [7:14:41<27:22:20, 3.55s/it] {'loss': 0.1805, 'grad_norm': 0.9476453646977872, 'learning_rate': 9.336843497406184e-06, 'epoch': 0.19} 19%|█▉ | 6542/34278 [7:14:41<27:22:20, 3.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 19%|█▉ | 6543/34278 [7:14:48<33:28:43, 4.35s/it] {'loss': 0.1565, 'grad_norm': 0.8229482466231757, 'learning_rate': 9.336608363586997e-06, 'epoch': 0.19} 19%|█▉ | 6543/34278 [7:14:48<33:28:43, 4.35s/it] 19%|█▉ | 6544/34278 [7:14:52<34:11:54, 4.44s/it] {'loss': 0.1693, 'grad_norm': 0.8411750734928479, 'learning_rate': 9.336373191051604e-06, 'epoch': 0.19} 19%|█▉ | 6544/34278 [7:14:52<34:11:54, 4.44s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▉ | 6545/34278 [7:14:55<31:13:01, 4.05s/it] {'loss': 0.1423, 'grad_norm': 0.752537013957642, 'learning_rate': 9.336137979802107e-06, 'epoch': 0.19} 19%|█▉ | 6545/34278 [7:14:55<31:13:01, 4.05s/it] 19%|█▉ | 6546/34278 [7:15:01<34:15:33, 4.45s/it] {'loss': 0.1862, 'grad_norm': 0.813754702244634, 'learning_rate': 9.335902729840606e-06, 'epoch': 0.19} 19%|█▉ | 6546/34278 [7:15:01<34:15:33, 4.45s/it] 19%|█▉ | 6547/34278 [7:15:05<32:57:07, 4.28s/it] {'loss': 0.1607, 'grad_norm': 0.7726436207337717, 'learning_rate': 9.3356674411692e-06, 'epoch': 0.19} 19%|█▉ | 6547/34278 [7:15:05<32:57:07, 4.28s/it] 19%|█▉ | 6548/34278 [7:15:08<30:44:06, 3.99s/it] {'loss': 0.1869, 'grad_norm': 0.8806303552398813, 'learning_rate': 9.33543211378999e-06, 'epoch': 0.19} 19%|█▉ | 6548/34278 [7:15:08<30:44:06, 3.99s/it] 19%|█▉ | 6549/34278 [7:15:11<29:06:11, 3.78s/it] {'loss': 0.1593, 'grad_norm': 0.7724633385704062, 'learning_rate': 9.335196747705077e-06, 'epoch': 0.19} 19%|█▉ | 6549/34278 [7:15:11<29:06:11, 3.78s/it] 19%|█▉ | 6550/34278 [7:15:14<26:44:07, 3.47s/it] {'loss': 0.1327, 'grad_norm': 1.007481687845384, 'learning_rate': 9.334961342916563e-06, 'epoch': 0.19} 19%|█▉ | 6550/34278 [7:15:14<26:44:07, 3.47s/it] 19%|█▉ | 6551/34278 [7:15:18<27:02:07, 3.51s/it] {'loss': 0.1467, 'grad_norm': 0.7255188714256837, 'learning_rate': 9.334725899426549e-06, 'epoch': 0.19} 19%|█▉ | 6551/34278 [7:15:18<27:02:07, 3.51s/it] 19%|█▉ | 6552/34278 [7:15:21<26:34:51, 3.45s/it] {'loss': 0.1678, 'grad_norm': 0.8636454349993008, 'learning_rate': 9.334490417237137e-06, 'epoch': 0.19} 19%|█▉ | 6552/34278 [7:15:21<26:34:51, 3.45s/it] 19%|█▉ | 6553/34278 [7:15:24<25:32:19, 3.32s/it] {'loss': 0.1508, 'grad_norm': 0.8242913205173432, 'learning_rate': 9.334254896350428e-06, 'epoch': 0.19} 19%|█▉ | 6553/34278 [7:15:24<25:32:19, 3.32s/it] 19%|█▉ | 6554/34278 [7:15:28<27:30:37, 3.57s/it] {'loss': 0.1515, 'grad_norm': 1.1292800022059526, 'learning_rate': 9.334019336768525e-06, 
'epoch': 0.19} 19%|█▉ | 6554/34278 [7:15:28<27:30:37, 3.57s/it] 19%|█▉ | 6555/34278 [7:15:31<25:38:41, 3.33s/it] {'loss': 0.1525, 'grad_norm': 0.884273781257435, 'learning_rate': 9.333783738493534e-06, 'epoch': 0.19} 19%|█▉ | 6555/34278 [7:15:31<25:38:41, 3.33s/it] 19%|█▉ | 6556/34278 [7:15:34<25:00:43, 3.25s/it] {'loss': 0.1616, 'grad_norm': 0.9425489409246627, 'learning_rate': 9.333548101527557e-06, 'epoch': 0.19} 19%|█▉ | 6556/34278 [7:15:34<25:00:43, 3.25s/it] 19%|█▉ | 6557/34278 [7:15:37<25:02:54, 3.25s/it] {'loss': 0.1548, 'grad_norm': 0.9218937211022523, 'learning_rate': 9.333312425872696e-06, 'epoch': 0.19} 19%|█▉ | 6557/34278 [7:15:37<25:02:54, 3.25s/it] 19%|█▉ | 6558/34278 [7:15:40<25:12:59, 3.27s/it] {'loss': 0.1552, 'grad_norm': 0.8745561746227835, 'learning_rate': 9.333076711531055e-06, 'epoch': 0.19} 19%|█▉ | 6558/34278 [7:15:41<25:12:59, 3.27s/it] 19%|█▉ | 6559/34278 [7:15:45<27:59:25, 3.64s/it] {'loss': 0.1627, 'grad_norm': 0.971277358473962, 'learning_rate': 9.33284095850474e-06, 'epoch': 0.19} 19%|█▉ | 6559/34278 [7:15:45<27:59:25, 3.64s/it] 19%|█▉ | 6560/34278 [7:15:49<29:00:38, 3.77s/it] {'loss': 0.1605, 'grad_norm': 0.729642279874078, 'learning_rate': 9.332605166795857e-06, 'epoch': 0.19} 19%|█▉ | 6560/34278 [7:15:49<29:00:38, 3.77s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▉ | 6561/34278 [7:15:53<30:24:18, 3.95s/it] {'loss': 0.1486, 'grad_norm': 0.9982954152299268, 'learning_rate': 9.332369336406508e-06, 'epoch': 0.19} 19%|█▉ | 6561/34278 [7:15:53<30:24:18, 3.95s/it] 19%|█▉ | 6562/34278 [7:15:57<29:38:43, 3.85s/it] {'loss': 0.1635, 'grad_norm': 1.000662881706075, 'learning_rate': 9.332133467338799e-06, 'epoch': 0.19} 19%|█▉ | 6562/34278 [7:15:57<29:38:43, 3.85s/it] 19%|█▉ | 6563/34278 [7:16:00<27:39:33, 3.59s/it] {'loss': 0.1439, 'grad_norm': 0.7443616070218757, 'learning_rate': 9.331897559594839e-06, 'epoch': 0.19} 19%|█▉ | 6563/34278 [7:16:00<27:39:33, 3.59s/it] 19%|█▉ | 6564/34278 [7:16:03<26:30:28, 3.44s/it] {'loss': 0.1745, 'grad_norm': 1.0920462291255546, 'learning_rate': 9.33166161317673e-06, 'epoch': 0.19} 19%|█▉ | 6564/34278 [7:16:03<26:30:28, 3.44s/it] 19%|█▉ | 6565/34278 [7:16:09<32:11:25, 4.18s/it] {'loss': 0.1842, 'grad_norm': 0.8604818796068482, 'learning_rate': 9.33142562808658e-06, 'epoch': 0.19} 19%|█▉ | 6565/34278 [7:16:09<32:11:25, 4.18s/it] 19%|█▉ | 6566/34278 [7:16:12<29:51:58, 3.88s/it] {'loss': 0.1855, 'grad_norm': 0.9014934031732198, 'learning_rate': 9.331189604326498e-06, 'epoch': 0.19} 19%|█▉ | 6566/34278 [7:16:12<29:51:58, 3.88s/it] 19%|█▉ | 6567/34278 [7:16:17<30:56:32, 4.02s/it] {'loss': 0.1642, 'grad_norm': 1.158240541171249, 'learning_rate': 9.330953541898587e-06, 'epoch': 0.19} 19%|█▉ | 6567/34278 [7:16:17<30:56:32, 4.02s/it] 19%|█▉ | 6568/34278 [7:16:22<33:24:45, 4.34s/it] {'loss': 0.165, 'grad_norm': 1.0939223699238565, 'learning_rate': 9.330717440804957e-06, 'epoch': 0.19} 19%|█▉ | 6568/34278 [7:16:22<33:24:45, 4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▉ | 6569/34278 [7:16:27<35:47:34, 4.65s/it] {'loss': 0.1647, 'grad_norm': 0.7959937577769631, 'learning_rate': 9.330481301047716e-06, 'epoch': 0.19} 19%|█▉ | 6569/34278 [7:16:27<35:47:34, 4.65s/it] 19%|█▉ | 6570/34278 [7:16:30<31:56:27, 4.15s/it] {'loss': 0.1833, 'grad_norm': 0.8363215560981384, 'learning_rate': 9.330245122628972e-06, 'epoch': 0.19} 19%|█▉ | 6570/34278 [7:16:30<31:56:27, 4.15s/it] 19%|█▉ | 6571/34278 [7:16:33<29:07:16, 3.78s/it] {'loss': 0.1692, 'grad_norm': 1.0241992590915194, 'learning_rate': 9.33000890555083e-06, 'epoch': 0.19} 19%|█▉ | 6571/34278 [7:16:33<29:07:16, 3.78s/it] 19%|█▉ | 6572/34278 [7:16:36<27:53:31, 3.62s/it] {'loss': 0.1904, 'grad_norm': 0.8207735379074976, 'learning_rate': 9.329772649815407e-06, 'epoch': 0.19} 19%|█▉ | 6572/34278 [7:16:36<27:53:31, 3.62s/it] 19%|█▉ | 6573/34278 [7:16:39<26:54:36, 3.50s/it] {'loss': 0.1896, 'grad_norm': 0.9699830671805612, 'learning_rate': 9.329536355424804e-06, 'epoch': 0.19} 19%|█▉ | 6573/34278 [7:16:39<26:54:36, 3.50s/it] 19%|█▉ | 6574/34278 [7:16:42<25:49:59, 3.36s/it] {'loss': 0.171, 'grad_norm': 0.8962596024080848, 'learning_rate': 9.329300022381135e-06, 'epoch': 0.19} 19%|█▉ | 6574/34278 [7:16:42<25:49:59, 3.36s/it] 19%|█▉ | 6575/34278 [7:16:45<24:26:43, 3.18s/it] {'loss': 0.1658, 'grad_norm': 0.8742934378036707, 'learning_rate': 9.329063650686511e-06, 'epoch': 0.19} 19%|█▉ | 6575/34278 [7:16:45<24:26:43, 3.18s/it] 19%|█▉ | 6576/34278 [7:16:50<29:02:19, 3.77s/it] {'loss': 0.1735, 'grad_norm': 0.8093306750612865, 'learning_rate': 9.328827240343037e-06, 'epoch': 0.19} 19%|█▉ | 6576/34278 [7:16:50<29:02:19, 3.77s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0cff46b330> Failed to fetch sample 2735735. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f0cff46b330>
19%|█▉ | 6577/34278 [7:16:53<27:23:32, 3.56s/it] {'loss': 0.1673, 'grad_norm': 0.7606877618978442, 'learning_rate': 9.328590791352828e-06, 'epoch': 0.19}
19%|█▉ | 6577/34278 [7:16:53<27:23:32, 3.56s/it]
19%|█▉ | 6578/34278 [7:16:58<29:12:15, 3.80s/it] {'loss': 0.1843, 'grad_norm': 0.973003170760142, 'learning_rate': 9.328354303717995e-06, 'epoch': 0.19}
19%|█▉ | 6578/34278 [7:16:58<29:12:15, 3.80s/it]
19%|█▉ | 6579/34278 [7:17:04<34:14:02, 4.45s/it] {'loss': 0.1704, 'grad_norm': 0.7516757091467464, 'learning_rate': 9.328117777440647e-06, 'epoch': 0.19}
19%|█▉ | 6579/34278 [7:17:04<34:14:02, 4.45s/it]
19%|█▉ | 6580/34278 [7:17:10<38:13:44, 4.97s/it] {'loss': 0.1511, 'grad_norm': 0.8252779180987707, 'learning_rate': 9.327881212522896e-06, 'epoch': 0.19}
19%|█▉ | 6580/34278 [7:17:10<38:13:44, 4.97s/it]
19%|█▉ | 6581/34278 [7:17:13<34:01:45, 4.42s/it] {'loss': 0.1692, 'grad_norm': 0.898972360986517, 'learning_rate': 9.327644608966855e-06, 'epoch': 0.19}
19%|█▉ | 6581/34278 [7:17:13<34:01:45, 4.42s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fefca82c680>
Failed to fetch sample 3423926.
Exception: cannot identify image file <_io.BytesIO object at 0x7fefca82c680>
19%|█▉ | 6582/34278 [7:17:17<31:58:32, 4.16s/it] {'loss': 0.1602, 'grad_norm': 0.9227271030833828, 'learning_rate': 9.327407966774635e-06, 'epoch': 0.19}
19%|█▉ | 6582/34278 [7:17:17<31:58:32, 4.16s/it]
19%|█▉ | 6583/34278 [7:17:20<30:38:37, 3.98s/it] {'loss': 0.1936, 'grad_norm': 0.9116754764672668, 'learning_rate': 9.327171285948352e-06, 'epoch': 0.19}
19%|█▉ | 6583/34278 [7:17:20<30:38:37, 3.98s/it]
19%|█▉ | 6584/34278 [7:17:23<29:07:01, 3.78s/it] {'loss': 0.1517, 'grad_norm': 0.9654934836715419, 'learning_rate': 9.326934566490116e-06, 'epoch': 0.19}
19%|█▉ | 6584/34278 [7:17:24<29:07:01, 3.78s/it]
19%|█▉ | 6585/34278 [7:17:30<34:56:14, 4.54s/it] {'loss': 0.1479, 'grad_norm': 0.8335776057812551, 'learning_rate': 9.326697808402041e-06, 'epoch': 0.19}
19%|█▉ | 6585/34278 [7:17:30<34:56:14, 4.54s/it]
19%|█▉ | 6586/34278 [7:17:33<31:15:24, 4.06s/it] {'loss': 0.1683, 'grad_norm': 0.8741283696287376, 'learning_rate': 9.32646101168624e-06, 'epoch': 0.19}
19%|█▉ | 6586/34278 [7:17:33<31:15:24, 4.06s/it]
19%|█▉ | 6587/34278 [7:17:36<29:57:16, 3.89s/it] {'loss': 0.1514, 'grad_norm': 0.8851147245285534, 'learning_rate': 9.326224176344829e-06, 'epoch': 0.19}
19%|█▉ | 6587/34278 [7:17:36<29:57:16, 3.89s/it]
19%|█▉ | 6588/34278 [7:17:39<28:11:50, 3.67s/it] {'loss': 0.1776, 'grad_norm': 1.0776747378112042, 'learning_rate': 9.32598730237992e-06, 'epoch': 0.19}
19%|█▉ | 6588/34278 [7:17:39<28:11:50, 3.67s/it]
19%|█▉ | 6589/34278 [7:17:43<27:08:50, 3.53s/it]
{'loss': 0.174, 'grad_norm': 0.9799316250456251, 'learning_rate': 9.32575038979363e-06, 'epoch': 0.19} 19%|█▉ | 6589/34278 [7:17:43<27:08:50, 3.53s/it] 19%|█▉ | 6590/34278 [7:17:49<32:49:54, 4.27s/it] {'loss': 0.1701, 'grad_norm': 0.8183463022592024, 'learning_rate': 9.325513438588073e-06, 'epoch': 0.19} 19%|█▉ | 6590/34278 [7:17:49<32:49:54, 4.27s/it] 19%|█▉ | 6591/34278 [7:17:52<31:23:35, 4.08s/it] {'loss': 0.16, 'grad_norm': 0.8361733681857653, 'learning_rate': 9.325276448765365e-06, 'epoch': 0.19} 19%|█▉ | 6591/34278 [7:17:52<31:23:35, 4.08s/it] 19%|█▉ | 6592/34278 [7:17:58<35:00:51, 4.55s/it] {'loss': 0.1581, 'grad_norm': 0.7967788134473966, 'learning_rate': 9.325039420327621e-06, 'epoch': 0.19} 19%|█▉ | 6592/34278 [7:17:58<35:00:51, 4.55s/it] 19%|█▉ | 6593/34278 [7:18:04<38:24:59, 5.00s/it] {'loss': 0.1748, 'grad_norm': 0.894301695688426, 'learning_rate': 9.324802353276957e-06, 'epoch': 0.19} 19%|█▉ | 6593/34278 [7:18:04<38:24:59, 5.00s/it] 19%|█▉ | 6594/34278 [7:18:08<35:15:42, 4.59s/it] {'loss': 0.1728, 'grad_norm': 0.6669028858586441, 'learning_rate': 9.324565247615491e-06, 'epoch': 0.19} 19%|█▉ | 6594/34278 [7:18:08<35:15:42, 4.59s/it] 19%|█▉ | 6595/34278 [7:18:11<31:46:31, 4.13s/it] {'loss': 0.1743, 'grad_norm': 0.8436065735195872, 'learning_rate': 9.324328103345338e-06, 'epoch': 0.19} 19%|█▉ | 6595/34278 [7:18:11<31:46:31, 4.13s/it] 19%|█▉ | 6596/34278 [7:18:17<35:56:31, 4.67s/it] {'loss': 0.184, 'grad_norm': 0.8692035951921453, 'learning_rate': 9.324090920468615e-06, 'epoch': 0.19} 19%|█▉ | 6596/34278 [7:18:17<35:56:31, 4.67s/it] 19%|█▉ | 6597/34278 [7:18:20<33:57:48, 4.42s/it] {'loss': 0.1774, 'grad_norm': 0.9045271630083936, 'learning_rate': 9.323853698987443e-06, 'epoch': 0.19} 19%|█▉ | 6597/34278 [7:18:20<33:57:48, 4.42s/it] 19%|█▉ | 6598/34278 [7:18:24<31:11:52, 4.06s/it] {'loss': 0.152, 'grad_norm': 0.6962124244886452, 'learning_rate': 9.323616438903937e-06, 'epoch': 0.19} 19%|█▉ | 6598/34278 [7:18:24<31:11:52, 4.06s/it] 19%|█▉ | 6599/34278 
[7:18:27<30:37:31, 3.98s/it] {'loss': 0.1851, 'grad_norm': 0.8647424905362106, 'learning_rate': 9.323379140220215e-06, 'epoch': 0.19} 19%|█▉ | 6599/34278 [7:18:27<30:37:31, 3.98s/it] 19%|█▉ | 6600/34278 [7:18:30<28:36:54, 3.72s/it] {'loss': 0.1748, 'grad_norm': 0.8542025098778888, 'learning_rate': 9.323141802938395e-06, 'epoch': 0.19} 19%|█▉ | 6600/34278 [7:18:30<28:36:54, 3.72s/it] 19%|█▉ | 6601/34278 [7:18:35<30:37:44, 3.98s/it] {'loss': 0.1704, 'grad_norm': 0.8022133668432814, 'learning_rate': 9.322904427060598e-06, 'epoch': 0.19} 19%|█▉ | 6601/34278 [7:18:35<30:37:44, 3.98s/it] 19%|█▉ | 6602/34278 [7:18:39<29:34:19, 3.85s/it] {'loss': 0.1938, 'grad_norm': 0.9583753138003067, 'learning_rate': 9.322667012588942e-06, 'epoch': 0.19} 19%|█▉ | 6602/34278 [7:18:39<29:34:19, 3.85s/it] 19%|█▉ | 6603/34278 [7:18:45<35:23:18, 4.60s/it] {'loss': 0.1499, 'grad_norm': 0.7933999801245497, 'learning_rate': 9.322429559525548e-06, 'epoch': 0.19} 19%|█▉ | 6603/34278 [7:18:45<35:23:18, 4.60s/it] 19%|█▉ | 6604/34278 [7:18:49<34:18:41, 4.46s/it] {'loss': 0.1401, 'grad_norm': 0.8192991203228889, 'learning_rate': 9.322192067872533e-06, 'epoch': 0.19} 19%|█▉ | 6604/34278 [7:18:49<34:18:41, 4.46s/it] 19%|█▉ | 6605/34278 [7:18:52<31:21:27, 4.08s/it] {'loss': 0.167, 'grad_norm': 0.7750423447305869, 'learning_rate': 9.321954537632019e-06, 'epoch': 0.19} 19%|█▉ | 6605/34278 [7:18:52<31:21:27, 4.08s/it] 19%|█▉ | 6606/34278 [7:18:57<33:35:08, 4.37s/it] {'loss': 0.1652, 'grad_norm': 0.8832348182547236, 'learning_rate': 9.321716968806127e-06, 'epoch': 0.19} 19%|█▉ | 6606/34278 [7:18:57<33:35:08, 4.37s/it] 19%|█▉ | 6607/34278 [7:19:01<30:50:35, 4.01s/it] {'loss': 0.1885, 'grad_norm': 0.9731916544546582, 'learning_rate': 9.32147936139698e-06, 'epoch': 0.19} 19%|█▉ | 6607/34278 [7:19:01<30:50:35, 4.01s/it] 19%|█▉ | 6608/34278 [7:19:06<35:17:44, 4.59s/it] {'loss': 0.1893, 'grad_norm': 0.9056726077735234, 'learning_rate': 9.321241715406694e-06, 'epoch': 0.19} 19%|█▉ | 6608/34278 [7:19:06<35:17:44, 
4.59s/it] 19%|█▉ | 6609/34278 [7:19:09<31:31:06, 4.10s/it] {'loss': 0.1966, 'grad_norm': 0.9771268289081955, 'learning_rate': 9.321004030837394e-06, 'epoch': 0.19} 19%|█▉ | 6609/34278 [7:19:09<31:31:06, 4.10s/it] 19%|█▉ | 6610/34278 [7:19:13<29:30:36, 3.84s/it] {'loss': 0.1908, 'grad_norm': 0.9250638256199035, 'learning_rate': 9.320766307691202e-06, 'epoch': 0.19} 19%|█▉ | 6610/34278 [7:19:13<29:30:36, 3.84s/it] 19%|█▉ | 6611/34278 [7:19:17<30:10:26, 3.93s/it] {'loss': 0.1897, 'grad_norm': 1.1064866746245714, 'learning_rate': 9.32052854597024e-06, 'epoch': 0.19} 19%|█▉ | 6611/34278 [7:19:17<30:10:26, 3.93s/it] 19%|█▉ | 6612/34278 [7:19:20<28:05:11, 3.65s/it] {'loss': 0.1275, 'grad_norm': 1.2970316922890326, 'learning_rate': 9.32029074567663e-06, 'epoch': 0.19} 19%|█▉ | 6612/34278 [7:19:20<28:05:11, 3.65s/it] 19%|█▉ | 6613/34278 [7:19:23<26:21:50, 3.43s/it] {'loss': 0.1836, 'grad_norm': 0.9121026402023582, 'learning_rate': 9.320052906812495e-06, 'epoch': 0.19} 19%|█▉ | 6613/34278 [7:19:23<26:21:50, 3.43s/it] 19%|█▉ | 6614/34278 [7:19:26<26:01:05, 3.39s/it] {'loss': 0.1575, 'grad_norm': 0.9468523757153974, 'learning_rate': 9.31981502937996e-06, 'epoch': 0.19} 19%|█▉ | 6614/34278 [7:19:26<26:01:05, 3.39s/it] 19%|█▉ | 6615/34278 [7:19:30<27:01:01, 3.52s/it] {'loss': 0.18, 'grad_norm': 0.8629423786920889, 'learning_rate': 9.319577113381147e-06, 'epoch': 0.19} 19%|█▉ | 6615/34278 [7:19:30<27:01:01, 3.52s/it] 19%|█▉ | 6616/34278 [7:19:34<27:33:09, 3.59s/it] {'loss': 0.1722, 'grad_norm': 0.7974688953391694, 'learning_rate': 9.319339158818182e-06, 'epoch': 0.19} 19%|█▉ | 6616/34278 [7:19:34<27:33:09, 3.59s/it] 19%|█▉ | 6617/34278 [7:19:36<26:00:14, 3.38s/it] {'loss': 0.1878, 'grad_norm': 0.8848472276388092, 'learning_rate': 9.319101165693187e-06, 'epoch': 0.19} 19%|█▉ | 6617/34278 [7:19:36<26:00:14, 3.38s/it] 19%|█▉ | 6618/34278 [7:19:41<28:49:58, 3.75s/it] {'loss': 0.1773, 'grad_norm': 0.7751275570666104, 'learning_rate': 9.318863134008288e-06, 'epoch': 0.19} 19%|█▉ | 
6618/34278 [7:19:41<28:49:58, 3.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 19%|█▉ | 6619/34278 [7:19:47<34:03:15, 4.43s/it] {'loss': 0.1602, 'grad_norm': 0.8620195648849519, 'learning_rate': 9.31862506376561e-06, 'epoch': 0.19} 19%|█▉ | 6619/34278 [7:19:47<34:03:15, 4.43s/it] 19%|█▉ | 6620/34278 [7:19:53<37:44:07, 4.91s/it] {'loss': 0.1572, 'grad_norm': 0.6632416890148962, 'learning_rate': 9.318386954967278e-06, 'epoch': 0.19} 19%|█▉ | 6620/34278 [7:19:53<37:44:07, 4.91s/it] 19%|█▉ | 6621/34278 [7:19:56<32:55:27, 4.29s/it] {'loss': 0.1557, 'grad_norm': 0.9913204114730793, 'learning_rate': 9.318148807615418e-06, 'epoch': 0.19} 19%|█▉ | 6621/34278 [7:19:56<32:55:27, 4.29s/it] 19%|█▉ | 6622/34278 [7:20:02<37:18:00, 4.86s/it] {'loss': 0.1736, 'grad_norm': 0.8709380792413741, 'learning_rate': 9.317910621712156e-06, 'epoch': 0.19} 19%|█▉ | 6622/34278 [7:20:02<37:18:00, 4.86s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 19%|█▉ | 6623/34278 [7:20:08<39:46:31, 5.18s/it] {'loss': 0.172, 'grad_norm': 0.8025506641134886, 'learning_rate': 9.31767239725962e-06, 'epoch': 0.19} 19%|█▉ | 6623/34278 [7:20:08<39:46:31, 5.18s/it] 19%|█▉ | 6624/34278 [7:20:12<37:35:29, 4.89s/it] {'loss': 0.1639, 'grad_norm': 0.6836281445296057, 'learning_rate': 9.317434134259934e-06, 'epoch': 0.19} 19%|█▉ | 6624/34278 [7:20:12<37:35:29, 4.89s/it] 19%|█▉ | 6625/34278 [7:20:18<38:30:04, 5.01s/it] {'loss': 0.1513, 'grad_norm': 1.0082377785568835, 'learning_rate': 9.317195832715228e-06, 'epoch': 0.19} 19%|█▉ | 6625/34278 [7:20:18<38:30:04, 5.01s/it] 19%|█▉ | 6626/34278 [7:20:23<39:50:23, 5.19s/it] {'loss': 0.1743, 'grad_norm': 0.9049552447867992, 'learning_rate': 9.31695749262763e-06, 'epoch': 0.19} 19%|█▉ | 6626/34278 [7:20:23<39:50:23, 5.19s/it] 19%|█▉ | 6627/34278 [7:20:27<35:36:58, 4.64s/it] {'loss': 0.1335, 'grad_norm': 0.6619899587092459, 'learning_rate': 9.316719113999263e-06, 'epoch': 0.19} 19%|█▉ | 6627/34278 [7:20:27<35:36:58, 4.64s/it] 19%|█▉ | 6628/34278 [7:20:30<31:55:09, 4.16s/it] {'loss': 0.1442, 'grad_norm': 0.7845890178964112, 'learning_rate': 9.316480696832259e-06, 'epoch': 0.19} 19%|█▉ | 6628/34278 [7:20:30<31:55:09, 4.16s/it] 19%|█▉ | 6629/34278 [7:20:33<30:12:52, 3.93s/it] {'loss': 0.16, 'grad_norm': 0.8581096565647223, 'learning_rate': 9.316242241128746e-06, 'epoch': 0.19} 19%|█▉ | 6629/34278 [7:20:33<30:12:52, 3.93s/it] 19%|█▉ | 6630/34278 [7:20:36<27:37:24, 3.60s/it] {'loss': 0.1667, 'grad_norm': 0.8871473185494342, 'learning_rate': 9.316003746890854e-06, 'epoch': 0.19} 19%|█▉ | 6630/34278 [7:20:36<27:37:24, 3.60s/it] 19%|█▉ | 6631/34278 [7:20:40<27:59:27, 3.64s/it] {'loss': 0.171, 'grad_norm': 0.72978329018481, 'learning_rate': 9.315765214120709e-06, 'epoch': 0.19} 19%|█▉ | 6631/34278 [7:20:40<27:59:27, 3.64s/it] 19%|█▉ | 6632/34278 [7:20:44<30:11:31, 3.93s/it] {'loss': 0.162, 'grad_norm': 0.9565276574491248, 'learning_rate': 9.315526642820443e-06, 
'epoch': 0.19} 19%|█▉ | 6632/34278 [7:20:44<30:11:31, 3.93s/it] 19%|█▉ | 6633/34278 [7:20:47<28:03:54, 3.65s/it] {'loss': 0.1533, 'grad_norm': 0.8492242129049904, 'learning_rate': 9.315288032992185e-06, 'epoch': 0.19} 19%|█▉ | 6633/34278 [7:20:47<28:03:54, 3.65s/it] 19%|█▉ | 6634/34278 [7:20:51<27:55:34, 3.64s/it] {'loss': 0.1306, 'grad_norm': 0.8202558106143849, 'learning_rate': 9.315049384638065e-06, 'epoch': 0.19} 19%|█▉ | 6634/34278 [7:20:51<27:55:34, 3.64s/it] 19%|█▉ | 6635/34278 [7:20:54<27:00:50, 3.52s/it] {'loss': 0.164, 'grad_norm': 0.9556762645442757, 'learning_rate': 9.314810697760214e-06, 'epoch': 0.19} 19%|█▉ | 6635/34278 [7:20:54<27:00:50, 3.52s/it] 19%|█▉ | 6636/34278 [7:20:58<27:07:04, 3.53s/it] {'loss': 0.2004, 'grad_norm': 0.970958578848858, 'learning_rate': 9.314571972360765e-06, 'epoch': 0.19} 19%|█▉ | 6636/34278 [7:20:58<27:07:04, 3.53s/it] 19%|█▉ | 6637/34278 [7:21:02<28:03:42, 3.65s/it] {'loss': 0.1925, 'grad_norm': 1.076134050832174, 'learning_rate': 9.314333208441847e-06, 'epoch': 0.19} 19%|█▉ | 6637/34278 [7:21:02<28:03:42, 3.65s/it] 19%|█▉ | 6638/34278 [7:21:05<26:41:59, 3.48s/it] {'loss': 0.1581, 'grad_norm': 0.7572545491179042, 'learning_rate': 9.314094406005592e-06, 'epoch': 0.19} 19%|█▉ | 6638/34278 [7:21:05<26:41:59, 3.48s/it] 19%|█▉ | 6639/34278 [7:21:08<27:27:40, 3.58s/it] {'loss': 0.1512, 'grad_norm': 0.7839825173434335, 'learning_rate': 9.31385556505413e-06, 'epoch': 0.19} 19%|█▉ | 6639/34278 [7:21:08<27:27:40, 3.58s/it] 19%|█▉ | 6640/34278 [7:21:14<33:04:52, 4.31s/it] {'loss': 0.18, 'grad_norm': 0.8591181566493377, 'learning_rate': 9.313616685589596e-06, 'epoch': 0.19} 19%|█▉ | 6640/34278 [7:21:14<33:04:52, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
19%|█▉ | 6641/34278 [7:21:17<29:09:09, 3.80s/it] {'loss': 0.1809, 'grad_norm': 0.956879252265813, 'learning_rate': 9.313377767614125e-06, 'epoch': 0.19}
19%|█▉ | 6641/34278 [7:21:17<29:09:09, 3.80s/it]
19%|█▉ | 6642/34278 [7:21:20<27:40:23, 3.60s/it] {'loss': 0.1644, 'grad_norm': 0.9351799785382513, 'learning_rate': 9.313138811129844e-06, 'epoch': 0.19}
19%|█▉ | 6642/34278 [7:21:20<27:40:23, 3.60s/it]
19%|█▉ | 6643/34278 [7:21:26<33:01:52, 4.30s/it] {'loss': 0.1649, 'grad_norm': 0.9192455379752127, 'learning_rate': 9.31289981613889e-06, 'epoch': 0.19}
19%|█▉ | 6643/34278 [7:21:26<33:01:52, 4.30s/it]
19%|█▉ | 6644/34278 [7:21:30<31:29:19, 4.10s/it] {'loss': 0.1714, 'grad_norm': 0.931037002354566, 'learning_rate': 9.312660782643397e-06, 'epoch': 0.19}
19%|█▉ | 6644/34278 [7:21:30<31:29:19, 4.10s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0cab1bfe70>
Failed to fetch sample 3641698.
Exception: cannot identify image file <_io.BytesIO object at 0x7f0cab1bfe70>
19%|█▉ | 6645/34278 [7:21:33<30:04:35, 3.92s/it] {'loss': 0.1797, 'grad_norm': 0.9645718792541462, 'learning_rate': 9.312421710645496e-06, 'epoch': 0.19}
19%|█▉ | 6645/34278 [7:21:33<30:04:35, 3.92s/it]
19%|█▉ | 6646/34278 [7:21:36<28:11:56, 3.67s/it] {'loss': 0.1747, 'grad_norm': 0.891172809150048, 'learning_rate': 9.312182600147325e-06, 'epoch': 0.19}
19%|█▉ | 6646/34278 [7:21:36<28:11:56, 3.67s/it]
19%|█▉ | 6647/34278 [7:21:41<30:08:18, 3.93s/it] {'loss': 0.171, 'grad_norm': 0.912154300639407, 'learning_rate': 9.311943451151017e-06, 'epoch': 0.19}
19%|█▉ | 6647/34278 [7:21:41<30:08:18, 3.93s/it]
19%|█▉ | 6648/34278 [7:21:47<34:51:56, 4.54s/it] {'loss': 0.1668, 'grad_norm': 0.965505005244721, 'learning_rate': 9.311704263658707e-06, 'epoch': 0.19}
19%|█▉ | 6648/34278 [7:21:47<34:51:56, 4.54s/it]
19%|█▉ | 6649/34278 [7:21:50<31:09:09, 4.06s/it] {'loss': 0.1679, 'grad_norm': 0.8875713134156251, 'learning_rate': 9.311465037672532e-06, 'epoch': 0.19}
19%|█▉ | 6649/34278 [7:21:50<31:09:09, 4.06s/it]
19%|█▉ | 6650/34278 [7:21:54<32:29:26, 4.23s/it] {'loss': 0.165, 'grad_norm': 1.018908109861898, 'learning_rate': 9.311225773194624e-06, 'epoch': 0.19}
19%|█▉ | 6650/34278 [7:21:54<32:29:26, 4.23s/it]
19%|█▉ | 6651/34278 [7:21:58<31:28:22, 4.10s/it] {'loss': 0.1462, 'grad_norm': 1.0440928186899154, 'learning_rate': 9.310986470227121e-06, 'epoch': 0.19}
19%|█▉ | 6651/34278 [7:21:58<31:28:22, 4.10s/it]
19%|█▉ | 6652/34278 [7:22:01<29:16:28, 3.81s/it] {'loss': 0.1764, 'grad_norm': 0.9394903747176593, 'learning_rate': 9.310747128772162e-06, 'epoch': 0.19}
19%|█▉ | 6652/34278 [7:22:01<29:16:28, 3.81s/it]
19%|█▉ |
6653/34278 [7:22:05<29:16:08, 3.81s/it] {'loss': 0.1659, 'grad_norm': 0.8371496225635469, 'learning_rate': 9.31050774883188e-06, 'epoch': 0.19} 19%|█▉ | 6653/34278 [7:22:05<29:16:08, 3.81s/it] 19%|█▉ | 6654/34278 [7:22:08<27:26:02, 3.58s/it] {'loss': 0.1525, 'grad_norm': 0.8341934815383878, 'learning_rate': 9.310268330408417e-06, 'epoch': 0.19} 19%|█▉ | 6654/34278 [7:22:08<27:26:02, 3.58s/it] 19%|█▉ | 6655/34278 [7:22:11<25:45:21, 3.36s/it] {'loss': 0.1597, 'grad_norm': 1.1010564884598677, 'learning_rate': 9.310028873503905e-06, 'epoch': 0.19} 19%|█▉ | 6655/34278 [7:22:11<25:45:21, 3.36s/it] 19%|█▉ | 6656/34278 [7:22:14<24:58:45, 3.26s/it] {'loss': 0.1514, 'grad_norm': 0.7642353216518087, 'learning_rate': 9.309789378120483e-06, 'epoch': 0.19} 19%|█▉ | 6656/34278 [7:22:14<24:58:45, 3.26s/it] 19%|█▉ | 6657/34278 [7:22:17<24:43:52, 3.22s/it] {'loss': 0.1936, 'grad_norm': 0.9972370208399856, 'learning_rate': 9.309549844260292e-06, 'epoch': 0.19} 19%|█▉ | 6657/34278 [7:22:17<24:43:52, 3.22s/it] 19%|█▉ | 6658/34278 [7:22:20<24:37:25, 3.21s/it] {'loss': 0.1709, 'grad_norm': 1.077879262495552, 'learning_rate': 9.309310271925469e-06, 'epoch': 0.19} 19%|█▉ | 6658/34278 [7:22:20<24:37:25, 3.21s/it] 19%|█▉ | 6659/34278 [7:22:25<27:11:09, 3.54s/it] {'loss': 0.1722, 'grad_norm': 0.9146424859094537, 'learning_rate': 9.309070661118151e-06, 'epoch': 0.19} 19%|█▉ | 6659/34278 [7:22:25<27:11:09, 3.54s/it] 19%|█▉ | 6660/34278 [7:22:28<26:35:32, 3.47s/it] {'loss': 0.1599, 'grad_norm': 0.959592205810473, 'learning_rate': 9.30883101184048e-06, 'epoch': 0.19} 19%|█▉ | 6660/34278 [7:22:28<26:35:32, 3.47s/it] 19%|█▉ | 6661/34278 [7:22:34<32:31:26, 4.24s/it] {'loss': 0.1405, 'grad_norm': 0.7038250088223772, 'learning_rate': 9.308591324094594e-06, 'epoch': 0.19} 19%|█▉ | 6661/34278 [7:22:34<32:31:26, 4.24s/it] 19%|█▉ | 6662/34278 [7:22:37<29:39:06, 3.87s/it] {'loss': 0.152, 'grad_norm': 0.9424024148519394, 'learning_rate': 9.308351597882632e-06, 'epoch': 0.19} 19%|█▉ | 6662/34278 
[7:22:37<29:39:06, 3.87s/it] 19%|█▉ | 6663/34278 [7:22:40<27:51:17, 3.63s/it] {'loss': 0.1676, 'grad_norm': 0.8046587631286805, 'learning_rate': 9.308111833206737e-06, 'epoch': 0.19} 19%|█▉ | 6663/34278 [7:22:40<27:51:17, 3.63s/it] 19%|█▉ | 6664/34278 [7:22:43<27:03:27, 3.53s/it] {'loss': 0.1734, 'grad_norm': 0.9187185870845379, 'learning_rate': 9.307872030069049e-06, 'epoch': 0.19} 19%|█▉ | 6664/34278 [7:22:43<27:03:27, 3.53s/it] 19%|█▉ | 6665/34278 [7:22:47<26:21:48, 3.44s/it] {'loss': 0.1574, 'grad_norm': 0.779844975332377, 'learning_rate': 9.307632188471707e-06, 'epoch': 0.19} 19%|█▉ | 6665/34278 [7:22:47<26:21:48, 3.44s/it] 19%|█▉ | 6666/34278 [7:22:50<25:26:25, 3.32s/it] {'loss': 0.1611, 'grad_norm': 0.945219663686819, 'learning_rate': 9.30739230841685e-06, 'epoch': 0.19} 19%|█▉ | 6666/34278 [7:22:50<25:26:25, 3.32s/it] 19%|█▉ | 6667/34278 [7:22:54<26:56:48, 3.51s/it] {'loss': 0.1833, 'grad_norm': 0.8842010436067574, 'learning_rate': 9.307152389906626e-06, 'epoch': 0.19} 19%|█▉ | 6667/34278 [7:22:54<26:56:48, 3.51s/it] 19%|█▉ | 6668/34278 [7:22:57<27:15:05, 3.55s/it] {'loss': 0.1665, 'grad_norm': 0.9778749000964558, 'learning_rate': 9.306912432943173e-06, 'epoch': 0.19} 19%|█▉ | 6668/34278 [7:22:57<27:15:05, 3.55s/it] 19%|█▉ | 6669/34278 [7:23:00<26:28:05, 3.45s/it] {'loss': 0.169, 'grad_norm': 0.793338859736364, 'learning_rate': 9.306672437528635e-06, 'epoch': 0.19} 19%|█▉ | 6669/34278 [7:23:00<26:28:05, 3.45s/it] 19%|█▉ | 6670/34278 [7:23:03<25:27:07, 3.32s/it] {'loss': 0.154, 'grad_norm': 0.7503759724553432, 'learning_rate': 9.306432403665152e-06, 'epoch': 0.19} 19%|█▉ | 6670/34278 [7:23:03<25:27:07, 3.32s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9802 > 8192). 
Running this sequence through the model will result in indexing errors 19%|█▉ | 6671/34278 [7:23:08<29:01:05, 3.78s/it] {'loss': 0.1485, 'grad_norm': 0.7981291064907011, 'learning_rate': 9.30619233135487e-06, 'epoch': 0.19} 19%|█▉ | 6671/34278 [7:23:08<29:01:05, 3.78s/it] 19%|█▉ | 6672/34278 [7:23:14<33:47:40, 4.41s/it] {'loss': 0.1587, 'grad_norm': 0.8824631369306233, 'learning_rate': 9.30595222059993e-06, 'epoch': 0.19} 19%|█▉ | 6672/34278 [7:23:14<33:47:40, 4.41s/it] 19%|█▉ | 6673/34278 [7:23:20<37:17:24, 4.86s/it] {'loss': 0.1897, 'grad_norm': 0.9097674315717833, 'learning_rate': 9.305712071402474e-06, 'epoch': 0.19} 19%|█▉ | 6673/34278 [7:23:20<37:17:24, 4.86s/it] 19%|█▉ | 6674/34278 [7:23:24<35:46:02, 4.66s/it] {'loss': 0.1516, 'grad_norm': 0.9764717255707349, 'learning_rate': 9.305471883764651e-06, 'epoch': 0.19} 19%|█▉ | 6674/34278 [7:23:25<35:46:02, 4.66s/it] 19%|█▉ | 6675/34278 [7:23:29<34:46:16, 4.53s/it] {'loss': 0.1533, 'grad_norm': 0.857674852944612, 'learning_rate': 9.305231657688605e-06, 'epoch': 0.19} 19%|█▉ | 6675/34278 [7:23:29<34:46:16, 4.53s/it] 19%|█▉ | 6676/34278 [7:23:31<30:48:18, 4.02s/it] {'loss': 0.1458, 'grad_norm': 0.7699677407374467, 'learning_rate': 9.304991393176475e-06, 'epoch': 0.19} 19%|█▉ | 6676/34278 [7:23:31<30:48:18, 4.02s/it] 19%|█▉ | 6677/34278 [7:23:34<28:29:49, 3.72s/it] {'loss': 0.1924, 'grad_norm': 0.8300309033541458, 'learning_rate': 9.30475109023041e-06, 'epoch': 0.19} 19%|█▉ | 6677/34278 [7:23:34<28:29:49, 3.72s/it] 19%|█▉ | 6678/34278 [7:23:40<32:19:37, 4.22s/it] {'loss': 0.1483, 'grad_norm': 0.8280787865320262, 'learning_rate': 9.304510748852558e-06, 'epoch': 0.19} 19%|█▉ | 6678/34278 [7:23:40<32:19:37, 4.22s/it] 19%|█▉ | 6679/34278 [7:23:46<36:42:41, 4.79s/it] {'loss': 0.1768, 'grad_norm': 0.8621778447224153, 'learning_rate': 9.304270369045058e-06, 'epoch': 0.19} 19%|█▉ | 6679/34278 [7:23:46<36:42:41, 4.79s/it] 19%|█▉ | 6680/34278 [7:23:51<38:03:45, 4.97s/it] {'loss': 0.1647, 'grad_norm': 0.8403947736527188, 
'learning_rate': 9.30402995081006e-06, 'epoch': 0.19} 19%|█▉ | 6680/34278 [7:23:51<38:03:45, 4.97s/it] 19%|█▉ | 6681/34278 [7:23:56<38:14:09, 4.99s/it] {'loss': 0.173, 'grad_norm': 1.1243189869466546, 'learning_rate': 9.303789494149711e-06, 'epoch': 0.19} 19%|█▉ | 6681/34278 [7:23:56<38:14:09, 4.99s/it] 19%|█▉ | 6682/34278 [7:24:00<34:18:47, 4.48s/it] {'loss': 0.1583, 'grad_norm': 0.7889869801713401, 'learning_rate': 9.303548999066157e-06, 'epoch': 0.19} 19%|█▉ | 6682/34278 [7:24:00<34:18:47, 4.48s/it] 19%|█▉ | 6683/34278 [7:24:06<37:46:12, 4.93s/it] {'loss': 0.1637, 'grad_norm': 0.9309794684973345, 'learning_rate': 9.303308465561544e-06, 'epoch': 0.19} 19%|█▉ | 6683/34278 [7:24:06<37:46:12, 4.93s/it] 19%|█▉ | 6684/34278 [7:24:08<33:05:37, 4.32s/it] {'loss': 0.1983, 'grad_norm': 0.877032678167129, 'learning_rate': 9.303067893638022e-06, 'epoch': 0.19} 19%|█▉ | 6684/34278 [7:24:08<33:05:37, 4.32s/it] 20%|█▉ | 6685/34278 [7:24:12<30:32:09, 3.98s/it] {'loss': 0.1406, 'grad_norm': 0.7429751270285164, 'learning_rate': 9.302827283297736e-06, 'epoch': 0.2} 20%|█▉ | 6685/34278 [7:24:12<30:32:09, 3.98s/it] 20%|█▉ | 6686/34278 [7:24:18<35:02:00, 4.57s/it] {'loss': 0.1559, 'grad_norm': 1.1250560779172676, 'learning_rate': 9.302586634542835e-06, 'epoch': 0.2} 20%|█▉ | 6686/34278 [7:24:18<35:02:00, 4.57s/it] 20%|█▉ | 6687/34278 [7:24:24<38:11:21, 4.98s/it] {'loss': 0.1703, 'grad_norm': 0.6895416128858841, 'learning_rate': 9.302345947375469e-06, 'epoch': 0.2} 20%|█▉ | 6687/34278 [7:24:24<38:11:21, 4.98s/it] 20%|█▉ | 6688/34278 [7:24:27<34:06:25, 4.45s/it] {'loss': 0.1504, 'grad_norm': 0.7863778257084978, 'learning_rate': 9.302105221797784e-06, 'epoch': 0.2} 20%|█▉ | 6688/34278 [7:24:27<34:06:25, 4.45s/it] 20%|█▉ | 6689/34278 [7:24:30<30:47:41, 4.02s/it] {'loss': 0.1656, 'grad_norm': 1.0091930647466942, 'learning_rate': 9.30186445781193e-06, 'epoch': 0.2} 20%|█▉ | 6689/34278 [7:24:30<30:47:41, 4.02s/it] 20%|█▉ | 6690/34278 [7:24:33<29:36:24, 3.86s/it] {'loss': 0.1483, 
'grad_norm': 0.7497217945700104, 'learning_rate': 9.301623655420058e-06, 'epoch': 0.2} 20%|█▉ | 6690/34278 [7:24:33<29:36:24, 3.86s/it] 20%|█▉ | 6691/34278 [7:24:37<29:15:54, 3.82s/it] {'loss': 0.1734, 'grad_norm': 0.7333323914970696, 'learning_rate': 9.301382814624318e-06, 'epoch': 0.2} 20%|█▉ | 6691/34278 [7:24:37<29:15:54, 3.82s/it] 20%|█▉ | 6692/34278 [7:24:40<27:39:50, 3.61s/it] {'loss': 0.1787, 'grad_norm': 1.0619694790705123, 'learning_rate': 9.301141935426856e-06, 'epoch': 0.2} 20%|█▉ | 6692/34278 [7:24:40<27:39:50, 3.61s/it] 20%|█▉ | 6693/34278 [7:24:43<26:52:05, 3.51s/it] {'loss': 0.1586, 'grad_norm': 1.1363839954284976, 'learning_rate': 9.300901017829827e-06, 'epoch': 0.2} 20%|█▉ | 6693/34278 [7:24:43<26:52:05, 3.51s/it] 20%|█▉ | 6694/34278 [7:24:46<25:19:14, 3.30s/it] {'loss': 0.1578, 'grad_norm': 0.9425364829644454, 'learning_rate': 9.300660061835382e-06, 'epoch': 0.2} 20%|█▉ | 6694/34278 [7:24:46<25:19:14, 3.30s/it] 20%|█▉ | 6695/34278 [7:24:49<24:09:43, 3.15s/it] {'loss': 0.1702, 'grad_norm': 0.8252125484495676, 'learning_rate': 9.30041906744567e-06, 'epoch': 0.2} 20%|█▉ | 6695/34278 [7:24:49<24:09:43, 3.15s/it] 20%|█▉ | 6696/34278 [7:24:52<24:05:38, 3.14s/it] {'loss': 0.1566, 'grad_norm': 0.9493848757992757, 'learning_rate': 9.30017803466284e-06, 'epoch': 0.2} 20%|█▉ | 6696/34278 [7:24:52<24:05:38, 3.14s/it] 20%|█▉ | 6697/34278 [7:24:55<23:48:07, 3.11s/it] {'loss': 0.1722, 'grad_norm': 1.0631928542107913, 'learning_rate': 9.299936963489051e-06, 'epoch': 0.2} 20%|█▉ | 6697/34278 [7:24:55<23:48:07, 3.11s/it] 20%|█▉ | 6698/34278 [7:24:58<23:46:55, 3.10s/it] {'loss': 0.1579, 'grad_norm': 0.9191586911053998, 'learning_rate': 9.29969585392645e-06, 'epoch': 0.2} 20%|█▉ | 6698/34278 [7:24:58<23:46:55, 3.10s/it] 20%|█▉ | 6699/34278 [7:25:02<24:55:02, 3.25s/it] {'loss': 0.172, 'grad_norm': 0.9179787981754665, 'learning_rate': 9.299454705977191e-06, 'epoch': 0.2} 20%|█▉ | 6699/34278 [7:25:02<24:55:02, 3.25s/it] 20%|█▉ | 6700/34278 [7:25:05<25:48:58, 3.37s/it] 
{'loss': 0.157, 'grad_norm': 0.8691153998650513, 'learning_rate': 9.299213519643427e-06, 'epoch': 0.2} 20%|█▉ | 6700/34278 [7:25:06<25:48:58, 3.37s/it] 20%|█▉ | 6701/34278 [7:25:10<28:05:08, 3.67s/it] {'loss': 0.1592, 'grad_norm': 0.9333121570377566, 'learning_rate': 9.298972294927308e-06, 'epoch': 0.2} 20%|█▉ | 6701/34278 [7:25:10<28:05:08, 3.67s/it] 20%|█▉ | 6702/34278 [7:25:16<33:59:42, 4.44s/it] {'loss': 0.1698, 'grad_norm': 0.736950193272381, 'learning_rate': 9.298731031830994e-06, 'epoch': 0.2} 20%|█▉ | 6702/34278 [7:25:16<33:59:42, 4.44s/it] 20%|█▉ | 6703/34278 [7:25:22<36:59:15, 4.83s/it] {'loss': 0.1426, 'grad_norm': 0.9342355920214414, 'learning_rate': 9.298489730356635e-06, 'epoch': 0.2} 20%|█▉ | 6703/34278 [7:25:22<36:59:15, 4.83s/it] 20%|█▉ | 6704/34278 [7:25:25<33:04:23, 4.32s/it] {'loss': 0.1654, 'grad_norm': 0.8999560289479994, 'learning_rate': 9.298248390506387e-06, 'epoch': 0.2} 20%|█▉ | 6704/34278 [7:25:25<33:04:23, 4.32s/it] 20%|█▉ | 6705/34278 [7:25:28<29:59:59, 3.92s/it] {'loss': 0.158, 'grad_norm': 0.7436024642026525, 'learning_rate': 9.2980070122824e-06, 'epoch': 0.2} 20%|█▉ | 6705/34278 [7:25:28<29:59:59, 3.92s/it] 20%|█▉ | 6706/34278 [7:25:31<27:28:45, 3.59s/it] {'loss': 0.1698, 'grad_norm': 0.836181561382301, 'learning_rate': 9.297765595686834e-06, 'epoch': 0.2} 20%|█▉ | 6706/34278 [7:25:31<27:28:45, 3.59s/it] 20%|█▉ | 6707/34278 [7:25:34<26:39:20, 3.48s/it] {'loss': 0.1443, 'grad_norm': 0.8837979522452214, 'learning_rate': 9.297524140721843e-06, 'epoch': 0.2} 20%|█▉ | 6707/34278 [7:25:34<26:39:20, 3.48s/it] 20%|█▉ | 6708/34278 [7:25:37<26:03:21, 3.40s/it] {'loss': 0.1788, 'grad_norm': 0.8967727658671764, 'learning_rate': 9.297282647389583e-06, 'epoch': 0.2} 20%|█▉ | 6708/34278 [7:25:37<26:03:21, 3.40s/it] 20%|█▉ | 6709/34278 [7:25:40<25:01:06, 3.27s/it] {'loss': 0.1827, 'grad_norm': 0.8503853960943315, 'learning_rate': 9.297041115692208e-06, 'epoch': 0.2} 20%|█▉ | 6709/34278 [7:25:40<25:01:06, 3.27s/it] 20%|█▉ | 6710/34278 
[7:25:43<24:34:32, 3.21s/it] {'loss': 0.1689, 'grad_norm': 1.06943952060511, 'learning_rate': 9.296799545631876e-06, 'epoch': 0.2} 20%|█▉ | 6710/34278 [7:25:43<24:34:32, 3.21s/it] 20%|█▉ | 6711/34278 [7:25:49<30:54:31, 4.04s/it] {'loss': 0.1471, 'grad_norm': 0.7823124754458123, 'learning_rate': 9.296557937210745e-06, 'epoch': 0.2} 20%|█▉ | 6711/34278 [7:25:49<30:54:31, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 20%|█▉ | 6712/34278 [7:25:54<33:21:56, 4.36s/it] {'loss': 0.1509, 'grad_norm': 0.8053578511032365, 'learning_rate': 9.296316290430969e-06, 'epoch': 0.2} 20%|█▉ | 6712/34278 [7:25:54<33:21:56, 4.36s/it] 20%|█▉ | 6713/34278 [7:25:57<30:04:33, 3.93s/it] {'loss': 0.1734, 'grad_norm': 0.8349161213608354, 'learning_rate': 9.296074605294707e-06, 'epoch': 0.2} 20%|█▉ | 6713/34278 [7:25:57<30:04:33, 3.93s/it] 20%|█▉ | 6714/34278 [7:26:03<34:34:21, 4.52s/it] {'loss': 0.1694, 'grad_norm': 0.8981238701740619, 'learning_rate': 9.295832881804116e-06, 'epoch': 0.2} 20%|█▉ | 6714/34278 [7:26:03<34:34:21, 4.52s/it] 20%|█▉ | 6715/34278 [7:26:06<31:19:51, 4.09s/it] {'loss': 0.1537, 'grad_norm': 1.0839142407035054, 'learning_rate': 9.295591119961356e-06, 'epoch': 0.2} 20%|█▉ | 6715/34278 [7:26:06<31:19:51, 4.09s/it] 20%|█▉ | 6716/34278 [7:26:09<29:10:36, 3.81s/it] {'loss': 0.1895, 'grad_norm': 0.8004759947306443, 'learning_rate': 9.295349319768583e-06, 'epoch': 0.2} 20%|█▉ | 6716/34278 [7:26:09<29:10:36, 3.81s/it] 20%|█▉ | 6717/34278 [7:26:13<27:55:19, 3.65s/it] {'loss': 0.1678, 'grad_norm': 1.0750397339890871, 'learning_rate': 9.295107481227957e-06, 'epoch': 0.2} 20%|█▉ | 6717/34278 [7:26:13<27:55:19, 3.65s/it] 20%|█▉ | 6718/34278 [7:26:17<28:44:05, 3.75s/it] {'loss': 0.1782, 'grad_norm': 1.0142931560315016, 'learning_rate': 9.294865604341635e-06, 'epoch': 0.2} 20%|█▉ | 6718/34278 
[7:26:17<28:44:05, 3.75s/it] 20%|█▉ | 6719/34278 [7:26:20<28:23:11, 3.71s/it] {'loss': 0.1819, 'grad_norm': 0.7769502369432942, 'learning_rate': 9.29462368911178e-06, 'epoch': 0.2} 20%|█▉ | 6719/34278 [7:26:20<28:23:11, 3.71s/it] 20%|█▉ | 6720/34278 [7:26:25<30:54:39, 4.04s/it] {'loss': 0.1722, 'grad_norm': 0.8453196871331251, 'learning_rate': 9.29438173554055e-06, 'epoch': 0.2} 20%|█▉ | 6720/34278 [7:26:25<30:54:39, 4.04s/it] 20%|█▉ | 6721/34278 [7:26:28<28:35:33, 3.74s/it] {'loss': 0.1702, 'grad_norm': 0.8773114967078133, 'learning_rate': 9.294139743630104e-06, 'epoch': 0.2} 20%|█▉ | 6721/34278 [7:26:28<28:35:33, 3.74s/it] 20%|█▉ | 6722/34278 [7:26:31<27:28:47, 3.59s/it] {'loss': 0.1367, 'grad_norm': 0.8901471248144578, 'learning_rate': 9.293897713382603e-06, 'epoch': 0.2} 20%|█▉ | 6722/34278 [7:26:31<27:28:47, 3.59s/it] 20%|█▉ | 6723/34278 [7:26:34<25:56:45, 3.39s/it] {'loss': 0.1786, 'grad_norm': 0.7341032379100687, 'learning_rate': 9.29365564480021e-06, 'epoch': 0.2} 20%|█▉ | 6723/34278 [7:26:34<25:56:45, 3.39s/it] 20%|█▉ | 6724/34278 [7:26:40<31:33:35, 4.12s/it] {'loss': 0.1861, 'grad_norm': 0.8538138627001078, 'learning_rate': 9.293413537885083e-06, 'epoch': 0.2} 20%|█▉ | 6724/34278 [7:26:40<31:33:35, 4.12s/it] 20%|█▉ | 6725/34278 [7:26:44<32:01:42, 4.18s/it] {'loss': 0.1492, 'grad_norm': 0.9065472952693894, 'learning_rate': 9.293171392639385e-06, 'epoch': 0.2} 20%|█▉ | 6725/34278 [7:26:44<32:01:42, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 20%|█▉ | 6726/34278 [7:26:50<35:41:57, 4.66s/it] {'loss': 0.1601, 'grad_norm': 0.9563289558009515, 'learning_rate': 9.292929209065278e-06, 'epoch': 0.2} 20%|█▉ | 6726/34278 [7:26:50<35:41:57, 4.66s/it] 20%|█▉ | 6727/34278 [7:26:54<34:07:34, 4.46s/it] {'loss': 0.19, 'grad_norm': 0.8153129262051362, 'learning_rate': 9.292686987164924e-06, 'epoch': 0.2} 20%|█▉ | 6727/34278 [7:26:54<34:07:34, 4.46s/it] 20%|█▉ | 6728/34278 [7:26:58<31:41:39, 4.14s/it] {'loss': 0.1473, 'grad_norm': 0.9845470112599826, 'learning_rate': 9.292444726940485e-06, 'epoch': 0.2} 20%|█▉ | 6728/34278 [7:26:58<31:41:39, 4.14s/it] 20%|█▉ | 6729/34278 [7:27:01<28:58:20, 3.79s/it] {'loss': 0.1435, 'grad_norm': 0.8607922807974953, 'learning_rate': 9.292202428394124e-06, 'epoch': 0.2} 20%|█▉ | 6729/34278 [7:27:01<28:58:20, 3.79s/it] 20%|█▉ | 6730/34278 [7:27:07<34:03:35, 4.45s/it] {'loss': 0.1621, 'grad_norm': 0.7267509290603898, 'learning_rate': 9.291960091528004e-06, 'epoch': 0.2} 20%|█▉ | 6730/34278 [7:27:07<34:03:35, 4.45s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11851 > 8192). 
Running this sequence through the model will result in indexing errors 20%|█▉ | 6731/34278 [7:27:13<37:41:49, 4.93s/it] {'loss': 0.1583, 'grad_norm': 0.8054969037959844, 'learning_rate': 9.29171771634429e-06, 'epoch': 0.2} 20%|█▉ | 6731/34278 [7:27:13<37:41:49, 4.93s/it] 20%|█▉ | 6732/34278 [7:27:16<35:22:11, 4.62s/it] {'loss': 0.185, 'grad_norm': 1.136024857434458, 'learning_rate': 9.291475302845145e-06, 'epoch': 0.2} 20%|█▉ | 6732/34278 [7:27:17<35:22:11, 4.62s/it] 20%|█▉ | 6733/34278 [7:27:22<38:19:46, 5.01s/it] {'loss': 0.1788, 'grad_norm': 0.8106671548634049, 'learning_rate': 9.291232851032733e-06, 'epoch': 0.2} 20%|█▉ | 6733/34278 [7:27:22<38:19:46, 5.01s/it] 20%|█▉ | 6734/34278 [7:27:26<35:06:45, 4.59s/it] {'loss': 0.1515, 'grad_norm': 1.0138559362243085, 'learning_rate': 9.290990360909218e-06, 'epoch': 0.2} 20%|█▉ | 6734/34278 [7:27:26<35:06:45, 4.59s/it] 20%|█▉ | 6735/34278 [7:27:29<31:36:42, 4.13s/it] {'loss': 0.1487, 'grad_norm': 0.9685956393538463, 'learning_rate': 9.290747832476765e-06, 'epoch': 0.2} 20%|█▉ | 6735/34278 [7:27:29<31:36:42, 4.13s/it] 20%|█▉ | 6736/34278 [7:27:33<30:01:21, 3.92s/it] {'loss': 0.1553, 'grad_norm': 1.0979140692203073, 'learning_rate': 9.29050526573754e-06, 'epoch': 0.2} 20%|█▉ | 6736/34278 [7:27:33<30:01:21, 3.92s/it] 20%|█▉ | 6737/34278 [7:27:35<27:50:43, 3.64s/it] {'loss': 0.176, 'grad_norm': 0.9914402686672533, 'learning_rate': 9.290262660693708e-06, 'epoch': 0.2} 20%|█▉ | 6737/34278 [7:27:36<27:50:43, 3.64s/it] 20%|█▉ | 6738/34278 [7:27:40<29:00:19, 3.79s/it] {'loss': 0.2007, 'grad_norm': 0.8714868605471512, 'learning_rate': 9.290020017347434e-06, 'epoch': 0.2} 20%|█▉ | 6738/34278 [7:27:40<29:00:19, 3.79s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
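The tokenizer warning above ("Token indices sequence length is longer than the specified maximum sequence length for this model (11851 > 8192)") is emitted at tokenization time, not by the model, so the usual guard is to truncate or skip over-long samples before the forward pass. A minimal sketch of that guard, with illustrative names (`MAX_LEN`, `truncate_ids` are not from the actual training code):

```python
# Hedged sketch: cap tokenized sequences at the model's max length so the
# "Token indices sequence length is longer than ..." warning never turns
# into an indexing error downstream. Names here are illustrative only.
MAX_LEN = 8192  # the model max seen in the log (8192)

def truncate_ids(input_ids, max_len=MAX_LEN):
    """Keep at most max_len token ids, preserving the head of the sequence."""
    if len(input_ids) <= max_len:
        return input_ids
    return input_ids[:max_len]

ids = list(range(11851))          # stand-in for the over-long sample in the log
assert len(truncate_ids(ids)) == 8192
assert truncate_ids([1, 2, 3]) == [1, 2, 3]
```

In practice this is usually done via the tokenizer's own `truncation=True` / `max_length=` options rather than by hand; the sketch only shows the invariant being enforced.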
Gradients will be None warnings.warn( 20%|█▉ | 6739/34278 [7:27:43<28:40:30, 3.75s/it] {'loss': 0.1886, 'grad_norm': 0.962420922455051, 'learning_rate': 9.289777335700888e-06, 'epoch': 0.2} 20%|█▉ | 6739/34278 [7:27:43<28:40:30, 3.75s/it] 20%|█▉ | 6740/34278 [7:27:47<27:39:18, 3.62s/it] {'loss': 0.1545, 'grad_norm': 0.8631583971632991, 'learning_rate': 9.289534615756231e-06, 'epoch': 0.2} 20%|█▉ | 6740/34278 [7:27:47<27:39:18, 3.62s/it] 20%|█▉ | 6741/34278 [7:27:50<26:14:37, 3.43s/it] {'loss': 0.1394, 'grad_norm': 0.7641851756402492, 'learning_rate': 9.289291857515634e-06, 'epoch': 0.2} 20%|█▉ | 6741/34278 [7:27:50<26:14:37, 3.43s/it] 20%|█▉ | 6742/34278 [7:27:53<25:18:51, 3.31s/it] {'loss': 0.1712, 'grad_norm': 0.8290259161847129, 'learning_rate': 9.289049060981264e-06, 'epoch': 0.2} 20%|█▉ | 6742/34278 [7:27:53<25:18:51, 3.31s/it] 20%|█▉ | 6743/34278 [7:27:56<25:18:41, 3.31s/it] {'loss': 0.1477, 'grad_norm': 0.7554586638487255, 'learning_rate': 9.288806226155288e-06, 'epoch': 0.2} 20%|█▉ | 6743/34278 [7:27:56<25:18:41, 3.31s/it] 20%|█▉ | 6744/34278 [7:27:59<24:31:19, 3.21s/it] {'loss': 0.1939, 'grad_norm': 0.811628315137839, 'learning_rate': 9.288563353039873e-06, 'epoch': 0.2} 20%|█▉ | 6744/34278 [7:27:59<24:31:19, 3.21s/it] 20%|█▉ | 6745/34278 [7:28:04<28:03:48, 3.67s/it] {'loss': 0.1664, 'grad_norm': 0.8483973942457574, 'learning_rate': 9.288320441637189e-06, 'epoch': 0.2} 20%|█▉ | 6745/34278 [7:28:04<28:03:48, 3.67s/it] 20%|█▉ | 6746/34278 [7:28:08<29:50:54, 3.90s/it] {'loss': 0.1835, 'grad_norm': 0.8667848777043733, 'learning_rate': 9.288077491949403e-06, 'epoch': 0.2} 20%|█▉ | 6746/34278 [7:28:08<29:50:54, 3.90s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 20%|█▉ | 6747/34278 [7:28:14<34:17:51, 4.48s/it] {'loss': 0.191, 'grad_norm': 0.871493082736332, 'learning_rate': 9.287834503978685e-06, 'epoch': 0.2} 20%|█▉ | 6747/34278 [7:28:14<34:17:51, 4.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 20%|█▉ | 6748/34278 [7:28:20<37:45:45, 4.94s/it] {'loss': 0.1839, 'grad_norm': 0.7598035949504574, 'learning_rate': 9.287591477727205e-06, 'epoch': 0.2} 20%|█▉ | 6748/34278 [7:28:20<37:45:45, 4.94s/it] 20%|█▉ | 6749/34278 [7:28:23<34:14:50, 4.48s/it] {'loss': 0.1614, 'grad_norm': 0.8706855653558321, 'learning_rate': 9.28734841319713e-06, 'epoch': 0.2} 20%|█▉ | 6749/34278 [7:28:23<34:14:50, 4.48s/it] 20%|█▉ | 6750/34278 [7:28:27<31:26:55, 4.11s/it] {'loss': 0.1579, 'grad_norm': 0.7810263676089523, 'learning_rate': 9.287105310390634e-06, 'epoch': 0.2} 20%|█▉ | 6750/34278 [7:28:27<31:26:55, 4.11s/it] 20%|█▉ | 6751/34278 [7:28:32<34:07:57, 4.46s/it] {'loss': 0.1645, 'grad_norm': 0.9410537912137862, 'learning_rate': 9.286862169309886e-06, 'epoch': 0.2} 20%|█▉ | 6751/34278 [7:28:32<34:07:57, 4.46s/it] 20%|█▉ | 6752/34278 [7:28:35<31:25:30, 4.11s/it] {'loss': 0.1611, 'grad_norm': 1.301653554428064, 'learning_rate': 9.286618989957053e-06, 'epoch': 0.2} 20%|█▉ | 6752/34278 [7:28:35<31:25:30, 4.11s/it] 20%|█▉ | 6753/34278 [7:28:40<34:12:32, 4.47s/it] {'loss': 0.1556, 'grad_norm': 1.0884767791477747, 'learning_rate': 9.286375772334309e-06, 'epoch': 0.2} 20%|█▉ | 6753/34278 [7:28:41<34:12:32, 4.47s/it] 20%|█▉ | 6754/34278 [7:28:44<32:18:25, 4.23s/it] {'loss': 0.1581, 'grad_norm': 0.6899878328774506, 'learning_rate': 9.286132516443826e-06, 'epoch': 0.2} 20%|█▉ | 6754/34278 [7:28:44<32:18:25, 4.23s/it] 20%|█▉ | 6755/34278 [7:28:47<29:25:09, 3.85s/it] {'loss': 0.1603, 'grad_norm': 0.8429315060403528, 'learning_rate': 
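The recurring `UserWarning: None of the inputs have requires_grad=True. Gradients will be None` from `torch/utils/checkpoint.py` typically means gradient checkpointing is wrapping a module whose inputs (e.g. the output of a frozen embedding) carry no `grad_fn`. A common remedy, sketched here under the assumption that this is the cause (the toy `Linear` stands in for the checkpointed block; HF Transformers users can call `model.enable_input_require_grads()` instead):

```python
import torch
from torch.utils.checkpoint import checkpoint

# Hedged sketch: make the checkpointed block's input differentiable so
# reentrant checkpointing can propagate gradients, and/or use the
# non-reentrant variant (use_reentrant=False), which handles this case.
x = torch.randn(2, 4)       # stand-in for a frozen embedding's output
x.requires_grad_(True)      # gives checkpointing a differentiable input

layer = torch.nn.Linear(4, 4)
out = checkpoint(layer, x, use_reentrant=False)
out.sum().backward()
assert layer.weight.grad is not None   # gradients flow, no warning
```

Either fix alone is usually sufficient; the warning in this log is harmless only if the checkpointed blocks are genuinely frozen.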
9.285889222287776e-06, 'epoch': 0.2} 20%|█▉ | 6755/34278 [7:28:47<29:25:09, 3.85s/it] 20%|█▉ | 6756/34278 [7:28:51<28:28:51, 3.73s/it] {'loss': 0.2059, 'grad_norm': 0.9048267404546941, 'learning_rate': 9.28564588986833e-06, 'epoch': 0.2} 20%|█▉ | 6756/34278 [7:28:51<28:28:51, 3.73s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9229 > 8192). Running this sequence through the model will result in indexing errors 20%|█▉ | 6757/34278 [7:28:54<27:22:37, 3.58s/it] {'loss': 0.1671, 'grad_norm': 0.7372171872163484, 'learning_rate': 9.285402519187659e-06, 'epoch': 0.2} 20%|█▉ | 6757/34278 [7:28:54<27:22:37, 3.58s/it] 20%|█▉ | 6758/34278 [7:28:57<26:43:56, 3.50s/it] {'loss': 0.1521, 'grad_norm': 0.7491177044705533, 'learning_rate': 9.285159110247938e-06, 'epoch': 0.2} 20%|█▉ | 6758/34278 [7:28:57<26:43:56, 3.50s/it] 20%|█▉ | 6759/34278 [7:29:00<26:22:54, 3.45s/it] {'loss': 0.1657, 'grad_norm': 0.9188720580649349, 'learning_rate': 9.28491566305134e-06, 'epoch': 0.2} 20%|█▉ | 6759/34278 [7:29:00<26:22:54, 3.45s/it] 20%|█▉ | 6760/34278 [7:29:03<25:26:04, 3.33s/it] {'loss': 0.1511, 'grad_norm': 0.766881910798833, 'learning_rate': 9.284672177600039e-06, 'epoch': 0.2} 20%|█▉ | 6760/34278 [7:29:03<25:26:04, 3.33s/it] 20%|█▉ | 6761/34278 [7:29:07<26:30:41, 3.47s/it] {'loss': 0.149, 'grad_norm': 0.8821957120476288, 'learning_rate': 9.284428653896207e-06, 'epoch': 0.2} 20%|█▉ | 6761/34278 [7:29:07<26:30:41, 3.47s/it] 20%|█▉ | 6762/34278 [7:29:10<25:22:08, 3.32s/it] {'loss': 0.1632, 'grad_norm': 0.8651784938194986, 'learning_rate': 9.284185091942017e-06, 'epoch': 0.2} 20%|█▉ | 6762/34278 [7:29:10<25:22:08, 3.32s/it] 20%|█▉ | 6763/34278 [7:29:13<24:07:53, 3.16s/it] {'loss': 0.1706, 'grad_norm': 0.7624688335799735, 'learning_rate': 9.283941491739648e-06, 'epoch': 0.2} 20%|█▉ | 6763/34278 [7:29:13<24:07:53, 3.16s/it] 20%|█▉ | 6764/34278 [7:29:16<24:30:18, 3.21s/it] {'loss': 0.1788, 'grad_norm': 1.0473505300517565, 'learning_rate': 
9.28369785329127e-06, 'epoch': 0.2} 20%|█▉ | 6764/34278 [7:29:16<24:30:18, 3.21s/it] 20%|█▉ | 6765/34278 [7:29:20<25:14:06, 3.30s/it] {'loss': 0.1759, 'grad_norm': 0.8783393953173931, 'learning_rate': 9.283454176599059e-06, 'epoch': 0.2} 20%|█▉ | 6765/34278 [7:29:20<25:14:06, 3.30s/it] 20%|█▉ | 6766/34278 [7:29:23<24:37:01, 3.22s/it] {'loss': 0.1809, 'grad_norm': 1.090863210412366, 'learning_rate': 9.283210461665195e-06, 'epoch': 0.2} 20%|█▉ | 6766/34278 [7:29:23<24:37:01, 3.22s/it] 20%|█▉ | 6767/34278 [7:29:26<24:58:45, 3.27s/it] {'loss': 0.1513, 'grad_norm': 0.8816239928570382, 'learning_rate': 9.282966708491848e-06, 'epoch': 0.2} 20%|█▉ | 6767/34278 [7:29:26<24:58:45, 3.27s/it] 20%|█▉ | 6768/34278 [7:29:30<26:53:16, 3.52s/it] {'loss': 0.1764, 'grad_norm': 0.906288892074761, 'learning_rate': 9.282722917081197e-06, 'epoch': 0.2} 20%|█▉ | 6768/34278 [7:29:30<26:53:16, 3.52s/it] 20%|█▉ | 6769/34278 [7:29:34<26:59:34, 3.53s/it] {'loss': 0.1577, 'grad_norm': 0.8922413127003459, 'learning_rate': 9.282479087435419e-06, 'epoch': 0.2} 20%|█▉ | 6769/34278 [7:29:34<26:59:34, 3.53s/it] 20%|█▉ | 6770/34278 [7:29:37<25:52:35, 3.39s/it] {'loss': 0.176, 'grad_norm': 0.8980575146322631, 'learning_rate': 9.28223521955669e-06, 'epoch': 0.2} 20%|█▉ | 6770/34278 [7:29:37<25:52:35, 3.39s/it] 20%|█▉ | 6771/34278 [7:29:40<25:05:39, 3.28s/it] {'loss': 0.1475, 'grad_norm': 0.8317359960658007, 'learning_rate': 9.281991313447185e-06, 'epoch': 0.2} 20%|█▉ | 6771/34278 [7:29:40<25:05:39, 3.28s/it] 20%|█▉ | 6772/34278 [7:29:43<25:05:45, 3.28s/it] {'loss': 0.1707, 'grad_norm': 0.9007402706647583, 'learning_rate': 9.281747369109086e-06, 'epoch': 0.2} 20%|█▉ | 6772/34278 [7:29:43<25:05:45, 3.28s/it] 20%|█▉ | 6773/34278 [7:29:46<24:19:40, 3.18s/it] {'loss': 0.1502, 'grad_norm': 1.072492056741432, 'learning_rate': 9.281503386544569e-06, 'epoch': 0.2} 20%|█▉ | 6773/34278 [7:29:46<24:19:40, 3.18s/it] 20%|█▉ | 6774/34278 [7:29:52<30:44:43, 4.02s/it] {'loss': 0.1737, 'grad_norm': 0.8813739309462332, 
'learning_rate': 9.281259365755811e-06, 'epoch': 0.2} 20%|█▉ | 6774/34278 [7:29:52<30:44:43, 4.02s/it] 20%|█▉ | 6775/34278 [7:29:56<29:07:54, 3.81s/it] {'loss': 0.1608, 'grad_norm': 0.8453346845434963, 'learning_rate': 9.28101530674499e-06, 'epoch': 0.2} 20%|█▉ | 6775/34278 [7:29:56<29:07:54, 3.81s/it] 20%|█▉ | 6776/34278 [7:30:02<33:59:22, 4.45s/it] {'loss': 0.1457, 'grad_norm': 0.8030093121854524, 'learning_rate': 9.280771209514287e-06, 'epoch': 0.2} 20%|█▉ | 6776/34278 [7:30:02<33:59:22, 4.45s/it] 20%|█▉ | 6777/34278 [7:30:05<30:49:08, 4.03s/it] {'loss': 0.174, 'grad_norm': 0.9315923295272862, 'learning_rate': 9.280527074065881e-06, 'epoch': 0.2} 20%|█▉ | 6777/34278 [7:30:05<30:49:08, 4.03s/it] 20%|█▉ | 6778/34278 [7:30:09<31:12:53, 4.09s/it] {'loss': 0.155, 'grad_norm': 1.0099377696711123, 'learning_rate': 9.280282900401953e-06, 'epoch': 0.2} 20%|█▉ | 6778/34278 [7:30:09<31:12:53, 4.09s/it] 20%|█▉ | 6779/34278 [7:30:14<33:18:16, 4.36s/it] {'loss': 0.1823, 'grad_norm': 0.876679878840367, 'learning_rate': 9.280038688524678e-06, 'epoch': 0.2} 20%|█▉ | 6779/34278 [7:30:14<33:18:16, 4.36s/it] 20%|█▉ | 6780/34278 [7:30:20<38:23:04, 5.03s/it] {'loss': 0.175, 'grad_norm': 1.1406207717128962, 'learning_rate': 9.279794438436241e-06, 'epoch': 0.2} 20%|█▉ | 6780/34278 [7:30:20<38:23:04, 5.03s/it] 20%|█▉ | 6781/34278 [7:30:24<34:11:34, 4.48s/it] {'loss': 0.1669, 'grad_norm': 0.8593591024214584, 'learning_rate': 9.279550150138821e-06, 'epoch': 0.2} 20%|█▉ | 6781/34278 [7:30:24<34:11:34, 4.48s/it] 20%|█▉ | 6782/34278 [7:30:27<31:21:15, 4.11s/it] {'loss': 0.1493, 'grad_norm': 1.119386990221924, 'learning_rate': 9.279305823634599e-06, 'epoch': 0.2} 20%|█▉ | 6782/34278 [7:30:27<31:21:15, 4.11s/it] 20%|█▉ | 6783/34278 [7:30:31<31:01:51, 4.06s/it] {'loss': 0.1497, 'grad_norm': 0.8571760555090274, 'learning_rate': 9.279061458925755e-06, 'epoch': 0.2} 20%|█▉ | 6783/34278 [7:30:31<31:01:51, 4.06s/it] 20%|█▉ | 6784/34278 [7:30:34<29:05:18, 3.81s/it] {'loss': 0.1583, 'grad_norm': 
1.1002761353387693, 'learning_rate': 9.278817056014473e-06, 'epoch': 0.2} 20%|█▉ | 6784/34278 [7:30:34<29:05:18, 3.81s/it] 20%|█▉ | 6785/34278 [7:30:37<27:33:54, 3.61s/it] {'loss': 0.2065, 'grad_norm': 1.0908781638352976, 'learning_rate': 9.278572614902932e-06, 'epoch': 0.2} 20%|█▉ | 6785/34278 [7:30:37<27:33:54, 3.61s/it] 20%|█▉ | 6786/34278 [7:30:41<27:04:23, 3.55s/it] {'loss': 0.1693, 'grad_norm': 0.851372087478325, 'learning_rate': 9.278328135593318e-06, 'epoch': 0.2} 20%|█▉ | 6786/34278 [7:30:41<27:04:23, 3.55s/it] 20%|█▉ | 6787/34278 [7:30:47<33:27:16, 4.38s/it] {'loss': 0.1552, 'grad_norm': 0.8643394174725874, 'learning_rate': 9.278083618087811e-06, 'epoch': 0.2} 20%|█▉ | 6787/34278 [7:30:47<33:27:16, 4.38s/it] 20%|█▉ | 6788/34278 [7:30:53<37:32:30, 4.92s/it] {'loss': 0.1732, 'grad_norm': 0.7977493439466831, 'learning_rate': 9.277839062388594e-06, 'epoch': 0.2} 20%|█▉ | 6788/34278 [7:30:53<37:32:30, 4.92s/it] 20%|█▉ | 6789/34278 [7:30:57<35:34:50, 4.66s/it] {'loss': 0.1775, 'grad_norm': 0.7624833243387342, 'learning_rate': 9.277594468497853e-06, 'epoch': 0.2} 20%|█▉ | 6789/34278 [7:30:57<35:34:50, 4.66s/it] 20%|█▉ | 6790/34278 [7:31:01<33:18:06, 4.36s/it] {'loss': 0.1724, 'grad_norm': 0.8444387072539988, 'learning_rate': 9.277349836417769e-06, 'epoch': 0.2} 20%|█▉ | 6790/34278 [7:31:01<33:18:06, 4.36s/it] 20%|█▉ | 6791/34278 [7:31:04<30:55:59, 4.05s/it] {'loss': 0.1716, 'grad_norm': 0.8496209699680165, 'learning_rate': 9.277105166150525e-06, 'epoch': 0.2} 20%|█▉ | 6791/34278 [7:31:04<30:55:59, 4.05s/it] 20%|█▉ | 6792/34278 [7:31:08<30:45:49, 4.03s/it] {'loss': 0.1821, 'grad_norm': 0.9393124415834895, 'learning_rate': 9.276860457698308e-06, 'epoch': 0.2} 20%|█▉ | 6792/34278 [7:31:08<30:45:49, 4.03s/it] 20%|█▉ | 6793/34278 [7:31:11<29:26:16, 3.86s/it] {'loss': 0.1688, 'grad_norm': 0.8848031449719974, 'learning_rate': 9.276615711063303e-06, 'epoch': 0.2} 20%|█▉ | 6793/34278 [7:31:11<29:26:16, 3.86s/it] 20%|█▉ | 6794/34278 [7:31:15<27:35:26, 3.61s/it] {'loss': 
0.1739, 'grad_norm': 0.9007435531593639, 'learning_rate': 9.276370926247693e-06, 'epoch': 0.2} 20%|█▉ | 6794/34278 [7:31:15<27:35:26, 3.61s/it] 20%|█▉ | 6795/34278 [7:31:18<26:19:51, 3.45s/it] {'loss': 0.1599, 'grad_norm': 0.9604942915988172, 'learning_rate': 9.276126103253664e-06, 'epoch': 0.2} 20%|█▉ | 6795/34278 [7:31:18<26:19:51, 3.45s/it] 20%|█▉ | 6796/34278 [7:31:21<25:38:27, 3.36s/it] {'loss': 0.1715, 'grad_norm': 0.8461491811556062, 'learning_rate': 9.275881242083402e-06, 'epoch': 0.2} 20%|█▉ | 6796/34278 [7:31:21<25:38:27, 3.36s/it] 20%|█▉ | 6797/34278 [7:31:27<31:45:52, 4.16s/it] {'loss': 0.1992, 'grad_norm': 1.0300730779632639, 'learning_rate': 9.275636342739094e-06, 'epoch': 0.2} 20%|█▉ | 6797/34278 [7:31:27<31:45:52, 4.16s/it] 20%|█▉ | 6798/34278 [7:31:32<34:02:49, 4.46s/it] {'loss': 0.1727, 'grad_norm': 0.8445314481632383, 'learning_rate': 9.275391405222923e-06, 'epoch': 0.2} 20%|█▉ | 6798/34278 [7:31:32<34:02:49, 4.46s/it] 20%|█▉ | 6799/34278 [7:31:35<30:20:27, 3.97s/it] {'loss': 0.1654, 'grad_norm': 0.7359376197202013, 'learning_rate': 9.27514642953708e-06, 'epoch': 0.2} 20%|█▉ | 6799/34278 [7:31:35<30:20:27, 3.97s/it] 20%|█▉ | 6800/34278 [7:31:41<34:47:21, 4.56s/it] {'loss': 0.158, 'grad_norm': 0.8443801360351817, 'learning_rate': 9.274901415683751e-06, 'epoch': 0.2} 20%|█▉ | 6800/34278 [7:31:41<34:47:21, 4.56s/it] 20%|█▉ | 6801/34278 [7:31:44<32:49:54, 4.30s/it] {'loss': 0.1787, 'grad_norm': 0.8307639478949574, 'learning_rate': 9.27465636366512e-06, 'epoch': 0.2} 20%|█▉ | 6801/34278 [7:31:44<32:49:54, 4.30s/it] 20%|█▉ | 6802/34278 [7:31:48<30:28:46, 3.99s/it] {'loss': 0.1918, 'grad_norm': 0.9858139091472421, 'learning_rate': 9.27441127348338e-06, 'epoch': 0.2} 20%|█▉ | 6802/34278 [7:31:48<30:28:46, 3.99s/it] 20%|█▉ | 6803/34278 [7:31:52<30:45:10, 4.03s/it] {'loss': 0.1699, 'grad_norm': 0.8464730211325557, 'learning_rate': 9.274166145140715e-06, 'epoch': 0.2} 20%|█▉ | 6803/34278 [7:31:52<30:45:10, 4.03s/it] 20%|█▉ | 6804/34278 [7:31:55<29:32:13, 
3.87s/it] {'loss': 0.1559, 'grad_norm': 0.7180746493841713, 'learning_rate': 9.273920978639315e-06, 'epoch': 0.2} 20%|█▉ | 6804/34278 [7:31:55<29:32:13, 3.87s/it] 20%|█▉ | 6805/34278 [7:31:58<27:47:29, 3.64s/it] {'loss': 0.1622, 'grad_norm': 0.799114963785012, 'learning_rate': 9.27367577398137e-06, 'epoch': 0.2} 20%|█▉ | 6805/34278 [7:31:58<27:47:29, 3.64s/it] 20%|█▉ | 6806/34278 [7:32:05<33:31:31, 4.39s/it] {'loss': 0.1439, 'grad_norm': 0.6447383763126631, 'learning_rate': 9.273430531169068e-06, 'epoch': 0.2} 20%|█▉ | 6806/34278 [7:32:05<33:31:31, 4.39s/it] 20%|█▉ | 6807/34278 [7:32:10<36:53:05, 4.83s/it] {'loss': 0.1633, 'grad_norm': 0.7725223467058463, 'learning_rate': 9.273185250204597e-06, 'epoch': 0.2} 20%|█▉ | 6807/34278 [7:32:10<36:53:05, 4.83s/it] 20%|█▉ | 6808/34278 [7:32:14<32:54:00, 4.31s/it] {'loss': 0.1544, 'grad_norm': 0.7575513821876597, 'learning_rate': 9.272939931090148e-06, 'epoch': 0.2} 20%|█▉ | 6808/34278 [7:32:14<32:54:00, 4.31s/it] 20%|█▉ | 6809/34278 [7:32:20<37:32:58, 4.92s/it] {'loss': 0.1687, 'grad_norm': 0.7586924842351225, 'learning_rate': 9.272694573827914e-06, 'epoch': 0.2} 20%|█▉ | 6809/34278 [7:32:20<37:32:58, 4.92s/it] 20%|█▉ | 6810/34278 [7:32:23<33:31:06, 4.39s/it] {'loss': 0.1749, 'grad_norm': 0.9272205458155205, 'learning_rate': 9.272449178420079e-06, 'epoch': 0.2} 20%|█▉ | 6810/34278 [7:32:23<33:31:06, 4.39s/it] 20%|█▉ | 6811/34278 [7:32:26<31:26:14, 4.12s/it] {'loss': 0.1564, 'grad_norm': 0.7176069199236317, 'learning_rate': 9.27220374486884e-06, 'epoch': 0.2} 20%|█▉ | 6811/34278 [7:32:27<31:26:14, 4.12s/it] 20%|█▉ | 6812/34278 [7:32:33<35:48:48, 4.69s/it] {'loss': 0.1696, 'grad_norm': 0.7930359692764781, 'learning_rate': 9.271958273176385e-06, 'epoch': 0.2} 20%|█▉ | 6812/34278 [7:32:33<35:48:48, 4.69s/it] 20%|█▉ | 6813/34278 [7:32:39<38:49:36, 5.09s/it] {'loss': 0.1754, 'grad_norm': 0.9127337094725932, 'learning_rate': 9.271712763344907e-06, 'epoch': 0.2} 20%|█▉ | 6813/34278 [7:32:39<38:49:36, 5.09s/it] 20%|█▉ | 6814/34278 
[7:32:43<36:18:33, 4.76s/it] {'loss': 0.1817, 'grad_norm': 0.8029291901645138, 'learning_rate': 9.271467215376598e-06, 'epoch': 0.2} 20%|█▉ | 6814/34278 [7:32:43<36:18:33, 4.76s/it] 20%|█▉ | 6815/34278 [7:32:46<33:12:09, 4.35s/it] {'loss': 0.1618, 'grad_norm': 2.336362348610908, 'learning_rate': 9.271221629273647e-06, 'epoch': 0.2} 20%|█▉ | 6815/34278 [7:32:46<33:12:09, 4.35s/it] 20%|█▉ | 6816/34278 [7:32:49<30:26:01, 3.99s/it] {'loss': 0.1604, 'grad_norm': 0.986410040723693, 'learning_rate': 9.27097600503825e-06, 'epoch': 0.2} 20%|█▉ | 6816/34278 [7:32:49<30:26:01, 3.99s/it] 20%|█▉ | 6817/34278 [7:32:52<27:30:57, 3.61s/it] {'loss': 0.1528, 'grad_norm': 0.9145616410938826, 'learning_rate': 9.2707303426726e-06, 'epoch': 0.2} 20%|█▉ | 6817/34278 [7:32:52<27:30:57, 3.61s/it] 20%|█▉ | 6818/34278 [7:32:56<29:13:01, 3.83s/it] {'loss': 0.1506, 'grad_norm': 0.8020578464674823, 'learning_rate': 9.270484642178888e-06, 'epoch': 0.2} 20%|█▉ | 6818/34278 [7:32:56<29:13:01, 3.83s/it] 20%|█▉ | 6819/34278 [7:32:59<27:51:38, 3.65s/it] {'loss': 0.1644, 'grad_norm': 0.8017613401330342, 'learning_rate': 9.270238903559307e-06, 'epoch': 0.2} 20%|█▉ | 6819/34278 [7:32:59<27:51:38, 3.65s/it] 20%|█▉ | 6820/34278 [7:33:03<26:52:05, 3.52s/it] {'loss': 0.1661, 'grad_norm': 0.9630723985747422, 'learning_rate': 9.269993126816055e-06, 'epoch': 0.2} 20%|█▉ | 6820/34278 [7:33:03<26:52:05, 3.52s/it] 20%|█▉ | 6821/34278 [7:33:06<27:08:37, 3.56s/it] {'loss': 0.1656, 'grad_norm': 1.071793350005467, 'learning_rate': 9.269747311951322e-06, 'epoch': 0.2} 20%|█▉ | 6821/34278 [7:33:06<27:08:37, 3.56s/it] 20%|█▉ | 6822/34278 [7:33:10<26:30:28, 3.48s/it] {'loss': 0.167, 'grad_norm': 0.7592639567566531, 'learning_rate': 9.269501458967306e-06, 'epoch': 0.2} 20%|█▉ | 6822/34278 [7:33:10<26:30:28, 3.48s/it] 20%|█▉ | 6823/34278 [7:33:13<26:20:32, 3.45s/it] {'loss': 0.1683, 'grad_norm': 0.9551099958261103, 'learning_rate': 9.269255567866199e-06, 'epoch': 0.2} 20%|█▉ | 6823/34278 [7:33:13<26:20:32, 3.45s/it] 20%|█▉ 
| 6824/34278 [7:33:16<25:36:22, 3.36s/it] {'loss': 0.1573, 'grad_norm': 0.8111207539993681, 'learning_rate': 9.269009638650198e-06, 'epoch': 0.2} 20%|█▉ | 6824/34278 [7:33:16<25:36:22, 3.36s/it] 20%|█▉ | 6825/34278 [7:33:19<25:12:55, 3.31s/it] {'loss': 0.1663, 'grad_norm': 0.9348714473789744, 'learning_rate': 9.268763671321497e-06, 'epoch': 0.2} 20%|█▉ | 6825/34278 [7:33:19<25:12:55, 3.31s/it] 20%|█▉ | 6826/34278 [7:33:23<25:31:43, 3.35s/it] {'loss': 0.1808, 'grad_norm': 0.9722626260327435, 'learning_rate': 9.268517665882294e-06, 'epoch': 0.2} 20%|█▉ | 6826/34278 [7:33:23<25:31:43, 3.35s/it] 20%|█▉ | 6827/34278 [7:33:30<35:02:52, 4.60s/it] {'loss': 0.1769, 'grad_norm': 0.9170527580958073, 'learning_rate': 9.268271622334784e-06, 'epoch': 0.2} 20%|█▉ | 6827/34278 [7:33:30<35:02:52, 4.60s/it] 20%|█▉ | 6828/34278 [7:33:33<31:40:44, 4.15s/it] {'loss': 0.1879, 'grad_norm': 1.014970958407438, 'learning_rate': 9.268025540681163e-06, 'epoch': 0.2} 20%|█▉ | 6828/34278 [7:33:33<31:40:44, 4.15s/it] 20%|█▉ | 6829/34278 [7:33:37<30:07:50, 3.95s/it] {'loss': 0.1487, 'grad_norm': 0.8491689365116207, 'learning_rate': 9.26777942092363e-06, 'epoch': 0.2} 20%|█▉ | 6829/34278 [7:33:37<30:07:50, 3.95s/it] 20%|█▉ | 6830/34278 [7:33:40<29:08:51, 3.82s/it] {'loss': 0.1828, 'grad_norm': 0.8266477970103245, 'learning_rate': 9.26753326306438e-06, 'epoch': 0.2} 20%|█▉ | 6830/34278 [7:33:40<29:08:51, 3.82s/it] 20%|█▉ | 6831/34278 [7:33:44<29:34:50, 3.88s/it] {'loss': 0.1635, 'grad_norm': 0.9012240304171691, 'learning_rate': 9.267287067105612e-06, 'epoch': 0.2} 20%|█▉ | 6831/34278 [7:33:44<29:34:50, 3.88s/it] 20%|█▉ | 6832/34278 [7:33:48<28:10:30, 3.70s/it] {'loss': 0.1647, 'grad_norm': 0.8504390585172055, 'learning_rate': 9.267040833049525e-06, 'epoch': 0.2} 20%|█▉ | 6832/34278 [7:33:48<28:10:30, 3.70s/it] 20%|█▉ | 6833/34278 [7:33:51<27:01:01, 3.54s/it] {'loss': 0.1581, 'grad_norm': 0.7854031137197991, 'learning_rate': 9.266794560898315e-06, 'epoch': 0.2} 20%|█▉ | 6833/34278 [7:33:51<27:01:01, 
3.54s/it] 20%|█▉ | 6834/34278 [7:33:57<32:53:26, 4.31s/it] {'loss': 0.1446, 'grad_norm': 0.74083613402986, 'learning_rate': 9.266548250654183e-06, 'epoch': 0.2} 20%|█▉ | 6834/34278 [7:33:57<32:53:26, 4.31s/it] 20%|█▉ | 6835/34278 [7:34:01<31:56:46, 4.19s/it] {'loss': 0.1766, 'grad_norm': 0.8604736649988246, 'learning_rate': 9.266301902319326e-06, 'epoch': 0.2} 20%|█▉ | 6835/34278 [7:34:01<31:56:46, 4.19s/it] 20%|█▉ | 6836/34278 [7:34:04<29:29:05, 3.87s/it] {'loss': 0.1608, 'grad_norm': 0.7926394245167371, 'learning_rate': 9.266055515895945e-06, 'epoch': 0.2} 20%|█▉ | 6836/34278 [7:34:04<29:29:05, 3.87s/it] 20%|█▉ | 6837/34278 [7:34:07<28:19:34, 3.72s/it] {'loss': 0.1459, 'grad_norm': 0.616829938563071, 'learning_rate': 9.265809091386236e-06, 'epoch': 0.2} 20%|█▉ | 6837/34278 [7:34:07<28:19:34, 3.72s/it] 20%|█▉ | 6838/34278 [7:34:13<32:50:17, 4.31s/it] {'loss': 0.1735, 'grad_norm': 0.8577998647176115, 'learning_rate': 9.265562628792402e-06, 'epoch': 0.2} 20%|█▉ | 6838/34278 [7:34:13<32:50:17, 4.31s/it] 20%|█▉ | 6839/34278 [7:34:16<29:14:17, 3.84s/it] {'loss': 0.18, 'grad_norm': 0.9264624005572271, 'learning_rate': 9.265316128116647e-06, 'epoch': 0.2} 20%|█▉ | 6839/34278 [7:34:16<29:14:17, 3.84s/it] 20%|█▉ | 6840/34278 [7:34:20<29:13:05, 3.83s/it] {'loss': 0.1875, 'grad_norm': 0.829671413022701, 'learning_rate': 9.265069589361165e-06, 'epoch': 0.2} 20%|█▉ | 6840/34278 [7:34:20<29:13:05, 3.83s/it] 20%|█▉ | 6841/34278 [7:34:23<27:32:17, 3.61s/it] {'loss': 0.1673, 'grad_norm': 0.7875201566196365, 'learning_rate': 9.264823012528159e-06, 'epoch': 0.2} 20%|█▉ | 6841/34278 [7:34:23<27:32:17, 3.61s/it] 20%|█▉ | 6842/34278 [7:34:26<27:59:39, 3.67s/it] {'loss': 0.1629, 'grad_norm': 0.7874558435542366, 'learning_rate': 9.264576397619832e-06, 'epoch': 0.2} 20%|█▉ | 6842/34278 [7:34:26<27:59:39, 3.67s/it] 20%|█▉ | 6843/34278 [7:34:30<27:11:02, 3.57s/it] {'loss': 0.1466, 'grad_norm': 0.7724462576193868, 'learning_rate': 9.264329744638385e-06, 'epoch': 0.2} 20%|█▉ | 6843/34278 
[7:34:30<27:11:02, 3.57s/it] 20%|█▉ | 6844/34278 [7:34:34<29:08:54, 3.82s/it] {'loss': 0.1862, 'grad_norm': 0.7668766098109513, 'learning_rate': 9.264083053586022e-06, 'epoch': 0.2} 20%|█▉ | 6844/34278 [7:34:34<29:08:54, 3.82s/it] 20%|█▉ | 6845/34278 [7:34:39<32:05:17, 4.21s/it] {'loss': 0.1803, 'grad_norm': 0.773502420006225, 'learning_rate': 9.263836324464942e-06, 'epoch': 0.2} 20%|█▉ | 6845/34278 [7:34:39<32:05:17, 4.21s/it] 20%|█▉ | 6846/34278 [7:34:43<31:15:25, 4.10s/it] {'loss': 0.1441, 'grad_norm': 0.7835549402281338, 'learning_rate': 9.263589557277349e-06, 'epoch': 0.2} 20%|█▉ | 6846/34278 [7:34:43<31:15:25, 4.10s/it] 20%|█▉ | 6847/34278 [7:34:46<28:55:36, 3.80s/it] {'loss': 0.1921, 'grad_norm': 0.8700013437396111, 'learning_rate': 9.263342752025446e-06, 'epoch': 0.2} 20%|█▉ | 6847/34278 [7:34:46<28:55:36, 3.80s/it] 20%|█▉ | 6848/34278 [7:34:49<27:42:04, 3.64s/it] {'loss': 0.1758, 'grad_norm': 0.8383673342033546, 'learning_rate': 9.263095908711436e-06, 'epoch': 0.2} 20%|█▉ | 6848/34278 [7:34:50<27:42:04, 3.64s/it] 20%|█▉ | 6849/34278 [7:34:53<27:16:14, 3.58s/it] {'loss': 0.1747, 'grad_norm': 0.9235433147483293, 'learning_rate': 9.262849027337524e-06, 'epoch': 0.2} 20%|█▉ | 6849/34278 [7:34:53<27:16:14, 3.58s/it] 20%|█▉ | 6850/34278 [7:34:56<26:51:50, 3.53s/it] {'loss': 0.1463, 'grad_norm': 0.8038146197528632, 'learning_rate': 9.262602107905913e-06, 'epoch': 0.2} 20%|█▉ | 6850/34278 [7:34:56<26:51:50, 3.53s/it] 20%|█▉ | 6851/34278 [7:35:02<31:24:54, 4.12s/it] {'loss': 0.1313, 'grad_norm': 0.752886841622253, 'learning_rate': 9.26235515041881e-06, 'epoch': 0.2} 20%|█▉ | 6851/34278 [7:35:02<31:24:54, 4.12s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 20%|█▉ | 6852/34278 [7:35:08<36:02:21, 4.73s/it] {'loss': 0.1682, 'grad_norm': 0.9008267139088517, 'learning_rate': 9.262108154878415e-06, 'epoch': 0.2} 20%|█▉ | 6852/34278 [7:35:08<36:02:21, 4.73s/it] 20%|█▉ | 6853/34278 [7:35:11<32:41:55, 4.29s/it] {'loss': 0.1679, 'grad_norm': 0.9045358933870375, 'learning_rate': 9.261861121286938e-06, 'epoch': 0.2} 20%|█▉ | 6853/34278 [7:35:11<32:41:55, 4.29s/it] 20%|█▉ | 6854/34278 [7:35:15<30:20:47, 3.98s/it] {'loss': 0.158, 'grad_norm': 1.1205794896514105, 'learning_rate': 9.261614049646581e-06, 'epoch': 0.2} 20%|█▉ | 6854/34278 [7:35:15<30:20:47, 3.98s/it] 20%|█▉ | 6855/34278 [7:35:18<28:29:20, 3.74s/it] {'loss': 0.1611, 'grad_norm': 0.9444648051004103, 'learning_rate': 9.261366939959552e-06, 'epoch': 0.2} 20%|█▉ | 6855/34278 [7:35:18<28:29:20, 3.74s/it] 20%|██ | 6856/34278 [7:35:21<27:30:17, 3.61s/it] {'loss': 0.1622, 'grad_norm': 0.8900281994126368, 'learning_rate': 9.261119792228056e-06, 'epoch': 0.2} 20%|██ | 6856/34278 [7:35:21<27:30:17, 3.61s/it] 20%|██ | 6857/34278 [7:35:27<32:43:12, 4.30s/it] {'loss': 0.1554, 'grad_norm': 0.9793048129383088, 'learning_rate': 9.260872606454299e-06, 'epoch': 0.2} 20%|██ | 6857/34278 [7:35:27<32:43:12, 4.30s/it] 20%|██ | 6858/34278 [7:35:30<30:38:08, 4.02s/it] {'loss': 0.1402, 'grad_norm': 0.7866144396388639, 'learning_rate': 9.260625382640489e-06, 'epoch': 0.2} 20%|██ | 6858/34278 [7:35:30<30:38:08, 4.02s/it] 20%|██ | 6859/34278 [7:35:34<30:04:13, 3.95s/it] {'loss': 0.1975, 'grad_norm': 1.1020344557217077, 'learning_rate': 9.260378120788833e-06, 'epoch': 0.2} 20%|██ | 6859/34278 [7:35:34<30:04:13, 3.95s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9295 > 8192). 
Running this sequence through the model will result in indexing errors 20%|██ | 6860/34278 [7:35:40<34:44:09, 4.56s/it] {'loss': 0.1812, 'grad_norm': 0.9935447255693488, 'learning_rate': 9.260130820901539e-06, 'epoch': 0.2} 20%|██ | 6860/34278 [7:35:40<34:44:09, 4.56s/it] 20%|██ | 6861/34278 [7:35:46<37:57:57, 4.99s/it] {'loss': 0.1681, 'grad_norm': 0.92842363614548, 'learning_rate': 9.259883482980812e-06, 'epoch': 0.2} 20%|██ | 6861/34278 [7:35:46<37:57:57, 4.99s/it] 20%|██ | 6862/34278 [7:35:49<33:23:33, 4.38s/it] {'loss': 0.1464, 'grad_norm': 0.8056007956260339, 'learning_rate': 9.259636107028863e-06, 'epoch': 0.2} 20%|██ | 6862/34278 [7:35:49<33:23:33, 4.38s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff307eee2f0>
Failed to fetch sample 2751684. Exception: cannot identify image file <_io.BytesIO object at 0x7ff307eee2f0>
20%|██ | 6863/34278 [7:35:53<33:35:05, 4.41s/it] {'loss': 0.1628, 'grad_norm': 0.8201984804203223, 'learning_rate': 9.2593886930479e-06, 'epoch': 0.2} 20%|██ | 6863/34278 [7:35:54<33:35:05, 4.41s/it] 20%|██ | 6864/34278 [7:35:57<31:13:39, 4.10s/it] {'loss': 0.1631, 'grad_norm': 0.9561164466426443, 'learning_rate': 9.259141241040132e-06, 'epoch': 0.2} 20%|██ | 6864/34278 [7:35:57<31:13:39, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 20%|██ | 6865/34278 [7:36:02<32:40:37, 4.29s/it] {'loss': 0.1916, 'grad_norm': 0.7606171656600601, 'learning_rate': 9.258893751007768e-06, 'epoch': 0.2} 20%|██ | 6865/34278 [7:36:02<32:40:37, 4.29s/it] 20%|██ | 6866/34278 [7:36:05<30:24:22, 3.99s/it] {'loss': 0.1785, 'grad_norm': 0.9893724394233364, 'learning_rate': 9.258646222953014e-06, 'epoch': 0.2} 20%|██ | 6866/34278 [7:36:05<30:24:22, 3.99s/it] 20%|██ | 6867/34278 [7:36:08<29:20:33, 3.85s/it] {'loss': 0.1832, 'grad_norm': 1.0492910901475423, 'learning_rate': 9.258398656878086e-06, 'epoch': 0.2} 20%|██ | 6867/34278 [7:36:08<29:20:33, 3.85s/it] 20%|██ | 6868/34278 [7:36:14<32:48:27, 4.31s/it] {'loss': 0.1447, 'grad_norm': 0.8239922216935243, 'learning_rate': 9.25815105278519e-06, 'epoch': 0.2} 20%|██ | 6868/34278 [7:36:14<32:48:27, 4.31s/it] 20%|██ | 6869/34278 [7:36:17<30:22:20, 3.99s/it] {'loss': 0.156, 'grad_norm': 0.9857818821904483, 'learning_rate': 9.257903410676542e-06, 'epoch': 0.2} 20%|██ | 6869/34278 [7:36:17<30:22:20, 3.99s/it] 20%|██ | 6870/34278 [7:36:23<35:02:39, 4.60s/it] {'loss': 0.1626, 'grad_norm': 1.1030047783364014, 'learning_rate': 9.257655730554343e-06, 'epoch': 0.2} 20%|██ | 6870/34278 [7:36:23<35:02:39, 4.60s/it] 20%|██ | 6871/34278 [7:36:26<30:58:15,
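The `PIL.UnidentifiedImageError` above is followed by "Failed to fetch sample 2751684" and training continues, which suggests the dataset catches the error and falls back to another sample. A minimal version of that resample-on-failure pattern is sketched below; `RetryingDataset` and `loader` are illustrative stand-ins, not the actual `aguvis/dataset.py` API:

```python
import random

# Hedged sketch: skip corrupt records (e.g. PIL.UnidentifiedImageError from
# Image.open on a bad byte stream) by resampling a different index, rather
# than crashing the whole training run on one unreadable image.
class RetryingDataset:
    def __init__(self, records, loader, max_retries=10):
        self.records = records
        self.loader = loader              # may raise on a corrupt image
        self.max_retries = max_retries

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.loader(self.records[i])
            except Exception as e:        # e.g. PIL.UnidentifiedImageError
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randrange(len(self.records))  # try another sample
        raise RuntimeError("too many corrupt samples in a row")

def loader(rec):
    """Toy loader: raises on a 'corrupt' record, like Image.open would."""
    if rec == "corrupt":
        raise ValueError("cannot identify image file")
    return rec

ds = RetryingDataset(["ok-0", "corrupt", "ok-2"], loader)
assert ds[0] == "ok-0"
```

The trade-off of silent resampling is that systematically corrupt shards go unnoticed; counting the logged failures (as this run's log lines allow) is the usual safeguard.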
4.07s/it] {'loss': 0.1441, 'grad_norm': 0.7547724311474497, 'learning_rate': 9.257408012420814e-06, 'epoch': 0.2} 20%|██ | 6871/34278 [7:36:26<30:58:15, 4.07s/it] 20%|██ | 6872/34278 [7:36:30<30:38:20, 4.02s/it] {'loss': 0.1588, 'grad_norm': 0.8284604181580133, 'learning_rate': 9.25716025627816e-06, 'epoch': 0.2} 20%|██ | 6872/34278 [7:36:30<30:38:20, 4.02s/it] 20%|██ | 6873/34278 [7:36:33<28:15:14, 3.71s/it] {'loss': 0.1478, 'grad_norm': 0.8664436845875464, 'learning_rate': 9.256912462128598e-06, 'epoch': 0.2} 20%|██ | 6873/34278 [7:36:33<28:15:14, 3.71s/it] 20%|██ | 6874/34278 [7:36:36<26:33:39, 3.49s/it] {'loss': 0.1738, 'grad_norm': 0.7916567029228911, 'learning_rate': 9.256664629974336e-06, 'epoch': 0.2} 20%|██ | 6874/34278 [7:36:36<26:33:39, 3.49s/it] 20%|██ | 6875/34278 [7:36:42<32:16:25, 4.24s/it] {'loss': 0.1574, 'grad_norm': 0.876323632801491, 'learning_rate': 9.256416759817589e-06, 'epoch': 0.2} 20%|██ | 6875/34278 [7:36:42<32:16:25, 4.24s/it] 20%|██ | 6876/34278 [7:36:45<29:50:24, 3.92s/it] {'loss': 0.1627, 'grad_norm': 0.8757684449739846, 'learning_rate': 9.256168851660568e-06, 'epoch': 0.2} 20%|██ | 6876/34278 [7:36:45<29:50:24, 3.92s/it] 20%|██ | 6877/34278 [7:36:51<34:58:34, 4.60s/it] {'loss': 0.166, 'grad_norm': 0.7331586042635607, 'learning_rate': 9.255920905505489e-06, 'epoch': 0.2} 20%|██ | 6877/34278 [7:36:51<34:58:34, 4.60s/it] 20%|██ | 6878/34278 [7:36:54<31:42:20, 4.17s/it] {'loss': 0.1572, 'grad_norm': 0.884088409347574, 'learning_rate': 9.255672921354564e-06, 'epoch': 0.2} 20%|██ | 6878/34278 [7:36:54<31:42:20, 4.17s/it] 20%|██ | 6879/34278 [7:36:59<33:33:42, 4.41s/it] {'loss': 0.1738, 'grad_norm': 0.9423527980206594, 'learning_rate': 9.255424899210006e-06, 'epoch': 0.2} 20%|██ | 6879/34278 [7:36:59<33:33:42, 4.41s/it] 20%|██ | 6880/34278 [7:37:03<31:06:56, 4.09s/it] {'loss': 0.1736, 'grad_norm': 0.7542517853422843, 'learning_rate': 9.255176839074031e-06, 'epoch': 0.2} 20%|██ | 6880/34278 [7:37:03<31:06:56, 4.09s/it] 20%|██ | 6881/34278 
[7:37:05<28:07:36, 3.70s/it] {'loss': 0.1571, 'grad_norm': 0.9210894675280078, 'learning_rate': 9.254928740948854e-06, 'epoch': 0.2} 20%|██ | 6881/34278 [7:37:05<28:07:36, 3.70s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9798 > 8192). Running this sequence through the model will result in indexing errors 20%|██ | 6882/34278 [7:37:11<33:16:02, 4.37s/it] {'loss': 0.1654, 'grad_norm': 0.9075573167169664, 'learning_rate': 9.254680604836688e-06, 'epoch': 0.2} 20%|██ | 6882/34278 [7:37:11<33:16:02, 4.37s/it] 20%|██ | 6883/34278 [7:37:16<34:02:53, 4.47s/it] {'loss': 0.1576, 'grad_norm': 0.7357175761969743, 'learning_rate': 9.254432430739749e-06, 'epoch': 0.2} 20%|██ | 6883/34278 [7:37:16<34:02:53, 4.47s/it] 20%|██ | 6884/34278 [7:37:19<30:57:50, 4.07s/it] {'loss': 0.1757, 'grad_norm': 0.9293557786696476, 'learning_rate': 9.254184218660252e-06, 'epoch': 0.2} 20%|██ | 6884/34278 [7:37:19<30:57:50, 4.07s/it] 20%|██ | 6885/34278 [7:37:22<28:55:59, 3.80s/it] {'loss': 0.1973, 'grad_norm': 1.2702381815754837, 'learning_rate': 9.253935968600416e-06, 'epoch': 0.2} 20%|██ | 6885/34278 [7:37:22<28:55:59, 3.80s/it] 20%|██ | 6886/34278 [7:37:29<34:27:58, 4.53s/it] {'loss': 0.1494, 'grad_norm': 0.8137709259145418, 'learning_rate': 9.253687680562454e-06, 'epoch': 0.2} 20%|██ | 6886/34278 [7:37:29<34:27:58, 4.53s/it] 20%|██ | 6887/34278 [7:37:35<37:55:41, 4.98s/it] {'loss': 0.1455, 'grad_norm': 0.7591368911767536, 'learning_rate': 9.253439354548583e-06, 'epoch': 0.2} 20%|██ | 6887/34278 [7:37:35<37:55:41, 4.98s/it] 20%|██ | 6888/34278 [7:37:39<37:15:46, 4.90s/it] {'loss': 0.177, 'grad_norm': 0.8431311808072077, 'learning_rate': 9.253190990561022e-06, 'epoch': 0.2} 20%|██ | 6888/34278 [7:37:39<37:15:46, 4.90s/it] 20%|██ | 6889/34278 [7:37:42<32:12:20, 4.23s/it] {'loss': 0.1629, 'grad_norm': 0.7439925566693707, 'learning_rate': 9.252942588601988e-06, 'epoch': 0.2} 20%|██ | 6889/34278 [7:37:42<32:12:20, 4.23s/it] 20%|██ | 6890/34278 
[7:37:45<29:25:03, 3.87s/it] {'loss': 0.1783, 'grad_norm': 0.8463428058909843, 'learning_rate': 9.252694148673695e-06, 'epoch': 0.2} 20%|██ | 6890/34278 [7:37:45<29:25:03, 3.87s/it] 20%|██ | 6891/34278 [7:37:48<27:19:05, 3.59s/it] {'loss': 0.17, 'grad_norm': 0.8698538347710973, 'learning_rate': 9.252445670778367e-06, 'epoch': 0.2} 20%|██ | 6891/34278 [7:37:48<27:19:05, 3.59s/it] 20%|██ | 6892/34278 [7:37:51<25:26:59, 3.35s/it] {'loss': 0.1683, 'grad_norm': 1.0189241222151606, 'learning_rate': 9.252197154918217e-06, 'epoch': 0.2} 20%|██ | 6892/34278 [7:37:51<25:26:59, 3.35s/it] 20%|██ | 6893/34278 [7:37:54<26:03:37, 3.43s/it] {'loss': 0.1963, 'grad_norm': 0.7984973411853468, 'learning_rate': 9.251948601095466e-06, 'epoch': 0.2} 20%|██ | 6893/34278 [7:37:54<26:03:37, 3.43s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11999 > 8192). Running this sequence through the model will result in indexing errors 20%|██ | 6894/34278 [7:37:57<25:12:30, 3.31s/it] {'loss': 0.1655, 'grad_norm': 1.1090263605234885, 'learning_rate': 9.251700009312334e-06, 'epoch': 0.2} 20%|██ | 6894/34278 [7:37:57<25:12:30, 3.31s/it] 20%|██ | 6895/34278 [7:38:01<25:09:24, 3.31s/it] {'loss': 0.166, 'grad_norm': 0.7521774243577772, 'learning_rate': 9.25145137957104e-06, 'epoch': 0.2} 20%|██ | 6895/34278 [7:38:01<25:09:24, 3.31s/it] 20%|██ | 6896/34278 [7:38:04<24:08:52, 3.17s/it] {'loss': 0.163, 'grad_norm': 0.8901929590357675, 'learning_rate': 9.251202711873802e-06, 'epoch': 0.2} 20%|██ | 6896/34278 [7:38:04<24:08:52, 3.17s/it] 20%|██ | 6897/34278 [7:38:08<26:12:28, 3.45s/it] {'loss': 0.1647, 'grad_norm': 0.8387379421239792, 'learning_rate': 9.25095400622284e-06, 'epoch': 0.2} 20%|██ | 6897/34278 [7:38:08<26:12:28, 3.45s/it] 20%|██ | 6898/34278 [7:38:14<32:36:44, 4.29s/it] {'loss': 0.169, 'grad_norm': 0.7271202499149118, 'learning_rate': 9.250705262620376e-06, 'epoch': 0.2} 20%|██ | 6898/34278 [7:38:14<32:36:44, 4.29s/it] 20%|██ | 6899/34278 
[7:38:17<29:39:01, 3.90s/it] {'loss': 0.1608, 'grad_norm': 0.7917441056416124, 'learning_rate': 9.25045648106863e-06, 'epoch': 0.2} 20%|██ | 6899/34278 [7:38:17<29:39:01, 3.90s/it] 20%|██ | 6900/34278 [7:38:21<30:56:43, 4.07s/it] {'loss': 0.1534, 'grad_norm': 1.0166959242851203, 'learning_rate': 9.250207661569824e-06, 'epoch': 0.2} 20%|██ | 6900/34278 [7:38:21<30:56:43, 4.07s/it] 20%|██ | 6901/34278 [7:38:25<30:28:21, 4.01s/it] {'loss': 0.1824, 'grad_norm': 0.7654424818295686, 'learning_rate': 9.249958804126178e-06, 'epoch': 0.2} 20%|██ | 6901/34278 [7:38:25<30:28:21, 4.01s/it] 20%|██ | 6902/34278 [7:38:29<30:42:07, 4.04s/it] {'loss': 0.1843, 'grad_norm': 0.9651451193769214, 'learning_rate': 9.249709908739914e-06, 'epoch': 0.2} 20%|██ | 6902/34278 [7:38:29<30:42:07, 4.04s/it] 20%|██ | 6903/34278 [7:38:35<35:06:20, 4.62s/it] {'loss': 0.1569, 'grad_norm': 0.8894528565331724, 'learning_rate': 9.249460975413256e-06, 'epoch': 0.2} 20%|██ | 6903/34278 [7:38:35<35:06:20, 4.62s/it] 20%|██ | 6904/34278 [7:38:39<33:48:27, 4.45s/it] {'loss': 0.16, 'grad_norm': 0.6423539728835258, 'learning_rate': 9.249212004148424e-06, 'epoch': 0.2} 20%|██ | 6904/34278 [7:38:39<33:48:27, 4.45s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11734 > 8192). 
Running this sequence through the model will result in indexing errors 20%|██ | 6905/34278 [7:38:43<31:10:43, 4.10s/it] {'loss': 0.1559, 'grad_norm': 0.8063827987347563, 'learning_rate': 9.248962994947641e-06, 'epoch': 0.2} 20%|██ | 6905/34278 [7:38:43<31:10:43, 4.10s/it] 20%|██ | 6906/34278 [7:38:48<34:15:17, 4.51s/it] {'loss': 0.1608, 'grad_norm': 0.8624478514379726, 'learning_rate': 9.248713947813131e-06, 'epoch': 0.2} 20%|██ | 6906/34278 [7:38:48<34:15:17, 4.51s/it] 20%|██ | 6907/34278 [7:38:51<30:35:36, 4.02s/it] {'loss': 0.1734, 'grad_norm': 0.8950892712865031, 'learning_rate': 9.248464862747117e-06, 'epoch': 0.2} 20%|██ | 6907/34278 [7:38:51<30:35:36, 4.02s/it] 20%|██ | 6908/34278 [7:38:54<28:41:26, 3.77s/it] {'loss': 0.1794, 'grad_norm': 0.8905745182797603, 'learning_rate': 9.248215739751825e-06, 'epoch': 0.2} 20%|██ | 6908/34278 [7:38:54<28:41:26, 3.77s/it] 20%|██ | 6909/34278 [7:38:57<26:46:06, 3.52s/it] {'loss': 0.1573, 'grad_norm': 0.9989720835810473, 'learning_rate': 9.247966578829476e-06, 'epoch': 0.2} 20%|██ | 6909/34278 [7:38:57<26:46:06, 3.52s/it] 20%|██ | 6910/34278 [7:39:00<25:06:47, 3.30s/it] {'loss': 0.1749, 'grad_norm': 0.8291663064917253, 'learning_rate': 9.247717379982293e-06, 'epoch': 0.2} 20%|██ | 6910/34278 [7:39:00<25:06:47, 3.30s/it] 20%|██ | 6911/34278 [7:39:03<24:56:37, 3.28s/it] {'loss': 0.1698, 'grad_norm': 0.9065232306006418, 'learning_rate': 9.247468143212505e-06, 'epoch': 0.2} 20%|██ | 6911/34278 [7:39:03<24:56:37, 3.28s/it] 20%|██ | 6912/34278 [7:39:06<25:02:11, 3.29s/it] {'loss': 0.1598, 'grad_norm': 0.7636604866786924, 'learning_rate': 9.247218868522335e-06, 'epoch': 0.2} 20%|██ | 6912/34278 [7:39:06<25:02:11, 3.29s/it] 20%|██ | 6913/34278 [7:39:10<26:16:22, 3.46s/it] {'loss': 0.1608, 'grad_norm': 0.7669056755864361, 'learning_rate': 9.24696955591401e-06, 'epoch': 0.2} 20%|██ | 6913/34278 [7:39:10<26:16:22, 3.46s/it] 20%|██ | 6914/34278 [7:39:13<25:31:52, 3.36s/it] {'loss': 0.164, 'grad_norm': 0.8165590403692065, 
'learning_rate': 9.246720205389752e-06, 'epoch': 0.2} 20%|██ | 6914/34278 [7:39:13<25:31:52, 3.36s/it] 20%|██ | 6915/34278 [7:39:17<25:26:29, 3.35s/it] {'loss': 0.169, 'grad_norm': 0.6512869425367765, 'learning_rate': 9.246470816951792e-06, 'epoch': 0.2} 20%|██ | 6915/34278 [7:39:17<25:26:29, 3.35s/it] 20%|██ | 6916/34278 [7:39:23<31:56:01, 4.20s/it] {'loss': 0.1714, 'grad_norm': 0.851582519614119, 'learning_rate': 9.246221390602353e-06, 'epoch': 0.2} 20%|██ | 6916/34278 [7:39:23<31:56:01, 4.20s/it] 20%|██ | 6917/34278 [7:39:29<36:16:02, 4.77s/it] {'loss': 0.1591, 'grad_norm': 0.8819487941044123, 'learning_rate': 9.245971926343664e-06, 'epoch': 0.2} 20%|██ | 6917/34278 [7:39:29<36:16:02, 4.77s/it] 20%|██ | 6918/34278 [7:39:35<38:34:45, 5.08s/it] {'loss': 0.1553, 'grad_norm': 0.9313966169414754, 'learning_rate': 9.245722424177953e-06, 'epoch': 0.2} 20%|██ | 6918/34278 [7:39:35<38:34:45, 5.08s/it] 20%|██ | 6919/34278 [7:39:41<41:45:33, 5.49s/it] {'loss': 0.157, 'grad_norm': 0.691431593294132, 'learning_rate': 9.245472884107442e-06, 'epoch': 0.2} 20%|██ | 6919/34278 [7:39:41<41:45:33, 5.49s/it] 20%|██ | 6920/34278 [7:39:44<36:24:54, 4.79s/it] {'loss': 0.1683, 'grad_norm': 1.0897206048393633, 'learning_rate': 9.245223306134364e-06, 'epoch': 0.2} 20%|██ | 6920/34278 [7:39:44<36:24:54, 4.79s/it] 20%|██ | 6921/34278 [7:39:50<39:17:12, 5.17s/it] {'loss': 0.1312, 'grad_norm': 0.7985922287025494, 'learning_rate': 9.244973690260947e-06, 'epoch': 0.2} 20%|██ | 6921/34278 [7:39:51<39:17:12, 5.17s/it] 20%|██ | 6922/34278 [7:39:56<40:51:38, 5.38s/it] {'loss': 0.1776, 'grad_norm': 1.1762938233012288, 'learning_rate': 9.244724036489416e-06, 'epoch': 0.2} 20%|██ | 6922/34278 [7:39:56<40:51:38, 5.38s/it] 20%|██ | 6923/34278 [7:39:59<35:44:09, 4.70s/it] {'loss': 0.1612, 'grad_norm': 0.6709814809299784, 'learning_rate': 9.244474344822003e-06, 'epoch': 0.2} 20%|██ | 6923/34278 [7:39:59<35:44:09, 4.70s/it] 20%|██ | 6924/34278 [7:40:02<31:38:29, 4.16s/it] {'loss': 0.1623, 'grad_norm': 
0.8355430771473132, 'learning_rate': 9.244224615260939e-06, 'epoch': 0.2} 20%|██ | 6924/34278 [7:40:02<31:38:29, 4.16s/it] 20%|██ | 6925/34278 [7:40:05<28:44:18, 3.78s/it] {'loss': 0.1451, 'grad_norm': 0.841972104666194, 'learning_rate': 9.243974847808447e-06, 'epoch': 0.2} 20%|██ | 6925/34278 [7:40:05<28:44:18, 3.78s/it] 20%|██ | 6926/34278 [7:40:10<31:23:12, 4.13s/it] {'loss': 0.1629, 'grad_norm': 0.9060411147309869, 'learning_rate': 9.243725042466762e-06, 'epoch': 0.2} 20%|██ | 6926/34278 [7:40:10<31:23:12, 4.13s/it] 20%|██ | 6927/34278 [7:40:13<28:09:28, 3.71s/it] {'loss': 0.1508, 'grad_norm': 1.0164509121189214, 'learning_rate': 9.243475199238115e-06, 'epoch': 0.2} 20%|██ | 6927/34278 [7:40:13<28:09:28, 3.71s/it] 20%|██ | 6928/34278 [7:40:16<26:51:31, 3.54s/it] {'loss': 0.1521, 'grad_norm': 1.2165006713459436, 'learning_rate': 9.243225318124731e-06, 'epoch': 0.2} 20%|██ | 6928/34278 [7:40:16<26:51:31, 3.54s/it] 20%|██ | 6929/34278 [7:40:20<26:40:29, 3.51s/it] {'loss': 0.1364, 'grad_norm': 0.9375090116468068, 'learning_rate': 9.242975399128846e-06, 'epoch': 0.2} 20%|██ | 6929/34278 [7:40:20<26:40:29, 3.51s/it] 20%|██ | 6930/34278 [7:40:23<26:10:04, 3.44s/it] {'loss': 0.1709, 'grad_norm': 0.7444053922033216, 'learning_rate': 9.242725442252689e-06, 'epoch': 0.2} 20%|██ | 6930/34278 [7:40:23<26:10:04, 3.44s/it] 20%|██ | 6931/34278 [7:40:27<27:23:46, 3.61s/it] {'loss': 0.1975, 'grad_norm': 0.8938477042452997, 'learning_rate': 9.242475447498494e-06, 'epoch': 0.2} 20%|██ | 6931/34278 [7:40:27<27:23:46, 3.61s/it] 20%|██ | 6932/34278 [7:40:31<27:46:24, 3.66s/it] {'loss': 0.175, 'grad_norm': 0.9444066284799371, 'learning_rate': 9.242225414868489e-06, 'epoch': 0.2} 20%|██ | 6932/34278 [7:40:31<27:46:24, 3.66s/it] 20%|██ | 6933/34278 [7:40:34<26:50:19, 3.53s/it] {'loss': 0.1668, 'grad_norm': 0.7975877128495011, 'learning_rate': 9.241975344364908e-06, 'epoch': 0.2} 20%|██ | 6933/34278 [7:40:34<26:50:19, 3.53s/it] 20%|██ | 6934/34278 [7:40:38<27:56:40, 3.68s/it] {'loss': 
0.1775, 'grad_norm': 0.9344549468260859, 'learning_rate': 9.241725235989984e-06, 'epoch': 0.2} 20%|██ | 6934/34278 [7:40:38<27:56:40, 3.68s/it] 20%|██ | 6935/34278 [7:40:41<27:17:33, 3.59s/it] {'loss': 0.2053, 'grad_norm': 0.9183123396344244, 'learning_rate': 9.24147508974595e-06, 'epoch': 0.2} 20%|██ | 6935/34278 [7:40:41<27:17:33, 3.59s/it] 20%|██ | 6936/34278 [7:40:46<30:51:43, 4.06s/it] {'loss': 0.1748, 'grad_norm': 0.812772993081817, 'learning_rate': 9.24122490563504e-06, 'epoch': 0.2} 20%|██ | 6936/34278 [7:40:46<30:51:43, 4.06s/it] 20%|██ | 6937/34278 [7:40:49<27:41:17, 3.65s/it] {'loss': 0.1757, 'grad_norm': 1.2402670599204466, 'learning_rate': 9.240974683659484e-06, 'epoch': 0.2} 20%|██ | 6937/34278 [7:40:49<27:41:17, 3.65s/it] 20%|██ | 6938/34278 [7:40:52<26:05:06, 3.43s/it] {'loss': 0.1974, 'grad_norm': 0.9170751278495316, 'learning_rate': 9.24072442382152e-06, 'epoch': 0.2} 20%|██ | 6938/34278 [7:40:52<26:05:06, 3.43s/it] 20%|██ | 6939/34278 [7:40:55<25:02:11, 3.30s/it] {'loss': 0.1603, 'grad_norm': 1.0772371356975892, 'learning_rate': 9.240474126123382e-06, 'epoch': 0.2} 20%|██ | 6939/34278 [7:40:55<25:02:11, 3.30s/it] 20%|██ | 6940/34278 [7:40:59<25:38:53, 3.38s/it] {'loss': 0.1687, 'grad_norm': 0.8619979857339938, 'learning_rate': 9.240223790567301e-06, 'epoch': 0.2} 20%|██ | 6940/34278 [7:40:59<25:38:53, 3.38s/it] 20%|██ | 6941/34278 [7:41:02<25:24:14, 3.35s/it] {'loss': 0.1663, 'grad_norm': 0.7450528065605044, 'learning_rate': 9.239973417155514e-06, 'epoch': 0.2} 20%|██ | 6941/34278 [7:41:02<25:24:14, 3.35s/it] 20%|██ | 6942/34278 [7:41:05<25:17:32, 3.33s/it] {'loss': 0.1615, 'grad_norm': 0.929227600853273, 'learning_rate': 9.239723005890259e-06, 'epoch': 0.2} 20%|██ | 6942/34278 [7:41:05<25:17:32, 3.33s/it] 20%|██ | 6943/34278 [7:41:08<24:45:36, 3.26s/it] {'loss': 0.1539, 'grad_norm': 0.612880897375697, 'learning_rate': 9.239472556773767e-06, 'epoch': 0.2} 20%|██ | 6943/34278 [7:41:08<24:45:36, 3.26s/it] 20%|██ | 6944/34278 [7:41:11<24:21:35, 
3.21s/it] {'loss': 0.1586, 'grad_norm': 0.797816385622781, 'learning_rate': 9.239222069808278e-06, 'epoch': 0.2} 20%|██ | 6944/34278 [7:41:11<24:21:35, 3.21s/it] 20%|██ | 6945/34278 [7:41:14<23:44:36, 3.13s/it] {'loss': 0.1911, 'grad_norm': 0.8512931410273533, 'learning_rate': 9.238971544996024e-06, 'epoch': 0.2} 20%|██ | 6945/34278 [7:41:14<23:44:36, 3.13s/it] 20%|██ | 6946/34278 [7:41:18<24:08:01, 3.18s/it] {'loss': 0.1887, 'grad_norm': 0.7940492984333843, 'learning_rate': 9.238720982339244e-06, 'epoch': 0.2} 20%|██ | 6946/34278 [7:41:18<24:08:01, 3.18s/it] 20%|██ | 6947/34278 [7:41:21<24:42:11, 3.25s/it] {'loss': 0.1605, 'grad_norm': 0.9662901286894028, 'learning_rate': 9.238470381840177e-06, 'epoch': 0.2} 20%|██ | 6947/34278 [7:41:21<24:42:11, 3.25s/it] 20%|██ | 6948/34278 [7:41:24<24:49:31, 3.27s/it] {'loss': 0.1469, 'grad_norm': 0.7812194229483999, 'learning_rate': 9.238219743501056e-06, 'epoch': 0.2} 20%|██ | 6948/34278 [7:41:24<24:49:31, 3.27s/it] 20%|██ | 6949/34278 [7:41:28<25:37:51, 3.38s/it] {'loss': 0.1756, 'grad_norm': 0.9627478070169841, 'learning_rate': 9.237969067324122e-06, 'epoch': 0.2} 20%|██ | 6949/34278 [7:41:28<25:37:51, 3.38s/it] 20%|██ | 6950/34278 [7:41:31<25:25:00, 3.35s/it] {'loss': 0.1743, 'grad_norm': 0.9180610650279988, 'learning_rate': 9.237718353311614e-06, 'epoch': 0.2} 20%|██ | 6950/34278 [7:41:31<25:25:00, 3.35s/it] 20%|██ | 6951/34278 [7:41:34<25:00:51, 3.30s/it] {'loss': 0.1642, 'grad_norm': 0.8145939137932461, 'learning_rate': 9.237467601465765e-06, 'epoch': 0.2} 20%|██ | 6951/34278 [7:41:34<25:00:51, 3.30s/it] 20%|██ | 6952/34278 [7:41:40<29:34:56, 3.90s/it] {'loss': 0.1877, 'grad_norm': 0.9541162411630298, 'learning_rate': 9.237216811788818e-06, 'epoch': 0.2} 20%|██ | 6952/34278 [7:41:40<29:34:56, 3.90s/it] 20%|██ | 6953/34278 [7:41:43<27:13:47, 3.59s/it] {'loss': 0.1525, 'grad_norm': 0.7818593113436227, 'learning_rate': 9.23696598428301e-06, 'epoch': 0.2} 20%|██ | 6953/34278 [7:41:43<27:13:47, 3.59s/it] 20%|██ | 6954/34278 
[7:41:49<32:42:09, 4.31s/it] {'loss': 0.1879, 'grad_norm': 0.8353926491125307, 'learning_rate': 9.236715118950584e-06, 'epoch': 0.2} 20%|██ | 6954/34278 [7:41:49<32:42:09, 4.31s/it] 20%|██ | 6955/34278 [7:41:51<29:22:25, 3.87s/it] {'loss': 0.1641, 'grad_norm': 0.8569013518121356, 'learning_rate': 9.236464215793773e-06, 'epoch': 0.2} 20%|██ | 6955/34278 [7:41:51<29:22:25, 3.87s/it] 20%|██ | 6956/34278 [7:41:54<27:30:17, 3.62s/it] {'loss': 0.1595, 'grad_norm': 0.7441027716221941, 'learning_rate': 9.236213274814822e-06, 'epoch': 0.2} 20%|██ | 6956/34278 [7:41:54<27:30:17, 3.62s/it] 20%|██ | 6957/34278 [7:41:58<28:06:59, 3.70s/it] {'loss': 0.1535, 'grad_norm': 0.6849361560051208, 'learning_rate': 9.23596229601597e-06, 'epoch': 0.2} 20%|██ | 6957/34278 [7:41:58<28:06:59, 3.70s/it] 20%|██ | 6958/34278 [7:42:02<29:12:36, 3.85s/it] {'loss': 0.1598, 'grad_norm': 0.7500331412109894, 'learning_rate': 9.23571127939946e-06, 'epoch': 0.2} 20%|██ | 6958/34278 [7:42:03<29:12:36, 3.85s/it] 20%|██ | 6959/34278 [7:42:06<28:37:40, 3.77s/it] {'loss': 0.1946, 'grad_norm': 0.8118918095758154, 'learning_rate': 9.23546022496753e-06, 'epoch': 0.2} 20%|██ | 6959/34278 [7:42:06<28:37:40, 3.77s/it] 20%|██ | 6960/34278 [7:42:09<26:18:43, 3.47s/it] {'loss': 0.1733, 'grad_norm': 0.8545793980969125, 'learning_rate': 9.23520913272242e-06, 'epoch': 0.2} 20%|██ | 6960/34278 [7:42:09<26:18:43, 3.47s/it] 20%|██ | 6961/34278 [7:42:14<30:06:17, 3.97s/it] {'loss': 0.1637, 'grad_norm': 0.816879846775412, 'learning_rate': 9.234958002666377e-06, 'epoch': 0.2} 20%|██ | 6961/34278 [7:42:14<30:06:17, 3.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 20%|██ | 6962/34278 [7:42:17<28:52:43, 3.81s/it] {'loss': 0.1909, 'grad_norm': 0.8082137698355487, 'learning_rate': 9.234706834801637e-06, 'epoch': 0.2} 20%|██ | 6962/34278 [7:42:17<28:52:43, 3.81s/it] 20%|██ | 6963/34278 [7:42:20<27:10:18, 3.58s/it] {'loss': 0.1437, 'grad_norm': 0.886717342632263, 'learning_rate': 9.234455629130447e-06, 'epoch': 0.2} 20%|██ | 6963/34278 [7:42:20<27:10:18, 3.58s/it] 20%|██ | 6964/34278 [7:42:26<32:39:29, 4.30s/it] {'loss': 0.1669, 'grad_norm': 0.7627804041244418, 'learning_rate': 9.234204385655048e-06, 'epoch': 0.2} 20%|██ | 6964/34278 [7:42:26<32:39:29, 4.30s/it] 20%|██ | 6965/34278 [7:42:30<30:47:12, 4.06s/it] {'loss': 0.155, 'grad_norm': 0.9082853257968745, 'learning_rate': 9.233953104377683e-06, 'epoch': 0.2} 20%|██ | 6965/34278 [7:42:30<30:47:12, 4.06s/it] 20%|██ | 6966/34278 [7:42:33<28:10:58, 3.71s/it] {'loss': 0.1533, 'grad_norm': 0.7534748533281638, 'learning_rate': 9.233701785300594e-06, 'epoch': 0.2} 20%|██ | 6966/34278 [7:42:33<28:10:58, 3.71s/it] 20%|██ | 6967/34278 [7:42:36<26:48:14, 3.53s/it] {'loss': 0.1481, 'grad_norm': 0.8623488199709958, 'learning_rate': 9.233450428426028e-06, 'epoch': 0.2} 20%|██ | 6967/34278 [7:42:36<26:48:14, 3.53s/it] 20%|██ | 6968/34278 [7:42:39<26:09:28, 3.45s/it] {'loss': 0.1712, 'grad_norm': 0.9881855600799769, 'learning_rate': 9.233199033756225e-06, 'epoch': 0.2} 20%|██ | 6968/34278 [7:42:39<26:09:28, 3.45s/it] 20%|██ | 6969/34278 [7:42:44<28:28:21, 3.75s/it] {'loss': 0.1582, 'grad_norm': 0.844767954923981, 'learning_rate': 9.232947601293434e-06, 'epoch': 0.2} 20%|██ | 6969/34278 [7:42:44<28:28:21, 3.75s/it] 20%|██ | 6970/34278 [7:42:47<28:11:43, 3.72s/it] {'loss': 0.1791, 'grad_norm': 0.8276002682189507, 'learning_rate': 9.232696131039896e-06, 'epoch': 0.2} 20%|██ | 6970/34278 [7:42:47<28:11:43, 3.72s/it] 20%|██ | 6971/34278 [7:42:53<32:31:52, 4.29s/it] {'loss': 0.1616, 'grad_norm': 0.8617800403501159, 'learning_rate': 9.232444622997856e-06, 
'epoch': 0.2} 20%|██ | 6971/34278 [7:42:53<32:31:52, 4.29s/it] 20%|██ | 6972/34278 [7:42:59<37:08:26, 4.90s/it] {'loss': 0.1654, 'grad_norm': 0.8659663059040512, 'learning_rate': 9.232193077169564e-06, 'epoch': 0.2} 20%|██ | 6972/34278 [7:42:59<37:08:26, 4.90s/it] 20%|██ | 6973/34278 [7:43:03<34:46:29, 4.58s/it] {'loss': 0.151, 'grad_norm': 0.779554960350323, 'learning_rate': 9.23194149355726e-06, 'epoch': 0.2} 20%|██ | 6973/34278 [7:43:03<34:46:29, 4.58s/it] 20%|██ | 6974/34278 [7:43:06<31:42:31, 4.18s/it] {'loss': 0.1754, 'grad_norm': 0.8464240910273081, 'learning_rate': 9.231689872163193e-06, 'epoch': 0.2} 20%|██ | 6974/34278 [7:43:06<31:42:31, 4.18s/it] 20%|██ | 6975/34278 [7:43:09<29:01:37, 3.83s/it] {'loss': 0.1564, 'grad_norm': 0.8346599537505763, 'learning_rate': 9.23143821298961e-06, 'epoch': 0.2} 20%|██ | 6975/34278 [7:43:09<29:01:37, 3.83s/it] 20%|██ | 6976/34278 [7:43:13<28:46:19, 3.79s/it] {'loss': 0.169, 'grad_norm': 0.7887283247170853, 'learning_rate': 9.231186516038756e-06, 'epoch': 0.2} 20%|██ | 6976/34278 [7:43:13<28:46:19, 3.79s/it] 20%|██ | 6977/34278 [7:43:16<27:04:50, 3.57s/it] {'loss': 0.1657, 'grad_norm': 0.8313811211179252, 'learning_rate': 9.230934781312879e-06, 'epoch': 0.2} 20%|██ | 6977/34278 [7:43:16<27:04:50, 3.57s/it] 20%|██ | 6978/34278 [7:43:19<25:48:01, 3.40s/it] {'loss': 0.1501, 'grad_norm': 0.6988999318097441, 'learning_rate': 9.230683008814226e-06, 'epoch': 0.2} 20%|██ | 6978/34278 [7:43:19<25:48:01, 3.40s/it] 20%|██ | 6979/34278 [7:43:24<30:03:02, 3.96s/it] {'loss': 0.1589, 'grad_norm': 0.9094972559651215, 'learning_rate': 9.230431198545045e-06, 'epoch': 0.2} 20%|██ | 6979/34278 [7:43:24<30:03:02, 3.96s/it] 20%|██ | 6980/34278 [7:43:28<29:05:13, 3.84s/it] {'loss': 0.1537, 'grad_norm': 0.8904694721866652, 'learning_rate': 9.230179350507584e-06, 'epoch': 0.2} 20%|██ | 6980/34278 [7:43:28<29:05:13, 3.84s/it] 20%|██ | 6981/34278 [7:43:31<28:07:52, 3.71s/it] {'loss': 0.1606, 'grad_norm': 0.9547194753711763, 'learning_rate': 
9.229927464704094e-06, 'epoch': 0.2} 20%|██ | 6981/34278 [7:43:31<28:07:52, 3.71s/it] 20%|██ | 6982/34278 [7:43:34<26:43:40, 3.53s/it] {'loss': 0.1509, 'grad_norm': 0.9622437302424728, 'learning_rate': 9.22967554113682e-06, 'epoch': 0.2} 20%|██ | 6982/34278 [7:43:34<26:43:40, 3.53s/it] 20%|██ | 6983/34278 [7:43:39<28:15:27, 3.73s/it] {'loss': 0.1726, 'grad_norm': 0.8020621072325536, 'learning_rate': 9.22942357980801e-06, 'epoch': 0.2} 20%|██ | 6983/34278 [7:43:39<28:15:27, 3.73s/it] 20%|██ | 6984/34278 [7:43:42<26:54:53, 3.55s/it] {'loss': 0.1567, 'grad_norm': 0.8312982184894929, 'learning_rate': 9.229171580719917e-06, 'epoch': 0.2} 20%|██ | 6984/34278 [7:43:42<26:54:53, 3.55s/it] 20%|██ | 6985/34278 [7:43:45<26:59:21, 3.56s/it] {'loss': 0.1985, 'grad_norm': 0.9668106852012995, 'learning_rate': 9.228919543874793e-06, 'epoch': 0.2} 20%|██ | 6985/34278 [7:43:45<26:59:21, 3.56s/it] 20%|██ | 6986/34278 [7:43:48<25:57:24, 3.42s/it] {'loss': 0.1732, 'grad_norm': 0.8657568338629704, 'learning_rate': 9.22866746927488e-06, 'epoch': 0.2} 20%|██ | 6986/34278 [7:43:48<25:57:24, 3.42s/it] 20%|██ | 6987/34278 [7:43:54<30:21:49, 4.01s/it] {'loss': 0.1663, 'grad_norm': 0.8623396974875376, 'learning_rate': 9.228415356922437e-06, 'epoch': 0.2} 20%|██ | 6987/34278 [7:43:54<30:21:49, 4.01s/it] 20%|██ | 6988/34278 [7:43:59<33:12:07, 4.38s/it] {'loss': 0.198, 'grad_norm': 0.8453525214965825, 'learning_rate': 9.228163206819709e-06, 'epoch': 0.2} 20%|██ | 6988/34278 [7:43:59<33:12:07, 4.38s/it] 20%|██ | 6989/34278 [7:44:05<36:39:59, 4.84s/it] {'loss': 0.1661, 'grad_norm': 0.9125294274115585, 'learning_rate': 9.22791101896895e-06, 'epoch': 0.2} 20%|██ | 6989/34278 [7:44:05<36:39:59, 4.84s/it] 20%|██ | 6990/34278 [7:44:09<34:05:50, 4.50s/it] {'loss': 0.1639, 'grad_norm': 0.8576353726170916, 'learning_rate': 9.227658793372412e-06, 'epoch': 0.2} 20%|██ | 6990/34278 [7:44:09<34:05:50, 4.50s/it] 20%|██ | 6991/34278 [7:44:12<31:26:54, 4.15s/it] {'loss': 0.2013, 'grad_norm': 0.8394250192109743, 
'learning_rate': 9.227406530032343e-06, 'epoch': 0.2} 20%|██ | 6991/34278 [7:44:12<31:26:54, 4.15s/it] 20%|██ | 6992/34278 [7:44:15<28:08:02, 3.71s/it] {'loss': 0.1581, 'grad_norm': 0.8698217646400507, 'learning_rate': 9.227154228951e-06, 'epoch': 0.2} 20%|██ | 6992/34278 [7:44:15<28:08:02, 3.71s/it] 20%|██ | 6993/34278 [7:44:18<26:59:08, 3.56s/it] {'loss': 0.1646, 'grad_norm': 0.7342443622768361, 'learning_rate': 9.226901890130632e-06, 'epoch': 0.2} 20%|██ | 6993/34278 [7:44:18<26:59:08, 3.56s/it] 20%|██ | 6994/34278 [7:44:22<27:48:57, 3.67s/it] {'loss': 0.1632, 'grad_norm': 0.7905139694958258, 'learning_rate': 9.226649513573494e-06, 'epoch': 0.2} 20%|██ | 6994/34278 [7:44:22<27:48:57, 3.67s/it] 20%|██ | 6995/34278 [7:44:25<27:20:32, 3.61s/it] {'loss': 0.1966, 'grad_norm': 0.9604884114024702, 'learning_rate': 9.226397099281837e-06, 'epoch': 0.2} 20%|██ | 6995/34278 [7:44:25<27:20:32, 3.61s/it] 20%|██ | 6996/34278 [7:44:31<31:40:04, 4.18s/it] {'loss': 0.1489, 'grad_norm': 0.8188683052092606, 'learning_rate': 9.226144647257916e-06, 'epoch': 0.2} 20%|██ | 6996/34278 [7:44:31<31:40:04, 4.18s/it] 20%|██ | 6997/34278 [7:44:35<32:35:37, 4.30s/it] {'loss': 0.16, 'grad_norm': 1.0209871621250723, 'learning_rate': 9.225892157503983e-06, 'epoch': 0.2} 20%|██ | 6997/34278 [7:44:35<32:35:37, 4.30s/it] 20%|██ | 6998/34278 [7:44:39<30:11:12, 3.98s/it] {'loss': 0.1827, 'grad_norm': 0.9416078696474081, 'learning_rate': 9.225639630022295e-06, 'epoch': 0.2} 20%|██ | 6998/34278 [7:44:39<30:11:12, 3.98s/it] 20%|██ | 6999/34278 [7:44:42<27:40:08, 3.65s/it] {'loss': 0.175, 'grad_norm': 0.9224229764513623, 'learning_rate': 9.225387064815106e-06, 'epoch': 0.2} 20%|██ | 6999/34278 [7:44:42<27:40:08, 3.65s/it] 20%|██ | 7000/34278 [7:44:44<25:58:38, 3.43s/it] {'loss': 0.1511, 'grad_norm': 1.0303395643999491, 'learning_rate': 9.225134461884668e-06, 'epoch': 0.2} 20%|██ | 7000/34278 [7:44:44<25:58:38, 3.43s/it] 20%|██ | 7001/34278 [7:44:50<31:05:19, 4.10s/it] {'loss': 0.1639, 'grad_norm': 
0.82603145282564, 'learning_rate': 9.224881821233239e-06, 'epoch': 0.2} 20%|██ | 7001/34278 [7:44:50<31:05:19, 4.10s/it] 20%|██ | 7002/34278 [7:44:53<29:08:22, 3.85s/it] {'loss': 0.1844, 'grad_norm': 0.9880245574139768, 'learning_rate': 9.224629142863075e-06, 'epoch': 0.2} 20%|██ | 7002/34278 [7:44:53<29:08:22, 3.85s/it] 20%|██ | 7003/34278 [7:44:57<28:15:10, 3.73s/it] {'loss': 0.1764, 'grad_norm': 1.2395735111046278, 'learning_rate': 9.224376426776428e-06, 'epoch': 0.2} 20%|██ | 7003/34278 [7:44:57<28:15:10, 3.73s/it] 20%|██ | 7004/34278 [7:45:03<33:03:57, 4.36s/it] {'loss': 0.1789, 'grad_norm': 0.7913835154235777, 'learning_rate': 9.224123672975557e-06, 'epoch': 0.2} 20%|██ | 7004/34278 [7:45:03<33:03:57, 4.36s/it] 20%|██ | 7005/34278 [7:45:06<29:53:57, 3.95s/it] {'loss': 0.1538, 'grad_norm': 1.1300233207007448, 'learning_rate': 9.22387088146272e-06, 'epoch': 0.2} 20%|██ | 7005/34278 [7:45:06<29:53:57, 3.95s/it] 20%|██ | 7006/34278 [7:45:08<27:15:18, 3.60s/it] {'loss': 0.1743, 'grad_norm': 1.1307792259323688, 'learning_rate': 9.223618052240171e-06, 'epoch': 0.2} 20%|██ | 7006/34278 [7:45:08<27:15:18, 3.60s/it] 20%|██ | 7007/34278 [7:45:11<25:23:55, 3.35s/it] {'loss': 0.1587, 'grad_norm': 0.8735797992845247, 'learning_rate': 9.22336518531017e-06, 'epoch': 0.2} 20%|██ | 7007/34278 [7:45:11<25:23:55, 3.35s/it] 20%|██ | 7008/34278 [7:45:16<28:42:54, 3.79s/it] {'loss': 0.1574, 'grad_norm': 0.796213194462923, 'learning_rate': 9.223112280674971e-06, 'epoch': 0.2} 20%|██ | 7008/34278 [7:45:16<28:42:54, 3.79s/it] 20%|██ | 7009/34278 [7:45:22<33:41:47, 4.45s/it] {'loss': 0.171, 'grad_norm': 0.9041885368326816, 'learning_rate': 9.222859338336834e-06, 'epoch': 0.2} 20%|██ | 7009/34278 [7:45:22<33:41:47, 4.45s/it] 20%|██ | 7010/34278 [7:45:25<30:36:11, 4.04s/it] {'loss': 0.1529, 'grad_norm': 0.7192505392225091, 'learning_rate': 9.222606358298017e-06, 'epoch': 0.2} 20%|██ | 7010/34278 [7:45:25<30:36:11, 4.04s/it] 20%|██ | 7011/34278 [7:45:31<35:15:50, 4.66s/it] {'loss': 
0.1694, 'grad_norm': 0.8520647621195595, 'learning_rate': 9.222353340560779e-06, 'epoch': 0.2} 20%|██ | 7011/34278 [7:45:31<35:15:50, 4.66s/it] 20%|██ | 7012/34278 [7:45:34<32:00:33, 4.23s/it] {'loss': 0.1527, 'grad_norm': 0.7528888103249979, 'learning_rate': 9.222100285127376e-06, 'epoch': 0.2} 20%|██ | 7012/34278 [7:45:34<32:00:33, 4.23s/it] 20%|██ | 7013/34278 [7:45:38<30:23:35, 4.01s/it] {'loss': 0.1966, 'grad_norm': 0.8277757246979951, 'learning_rate': 9.221847192000072e-06, 'epoch': 0.2} 20%|██ | 7013/34278 [7:45:38<30:23:35, 4.01s/it] 20%|██ | 7014/34278 [7:45:43<32:35:00, 4.30s/it] {'loss': 0.1638, 'grad_norm': 0.762915048514747, 'learning_rate': 9.221594061181122e-06, 'epoch': 0.2} 20%|██ | 7014/34278 [7:45:43<32:35:00, 4.30s/it] 20%|██ | 7015/34278 [7:45:47<31:56:26, 4.22s/it] {'loss': 0.182, 'grad_norm': 0.9475856252908226, 'learning_rate': 9.22134089267279e-06, 'epoch': 0.2} 20%|██ | 7015/34278 [7:45:47<31:56:26, 4.22s/it] 20%|██ | 7016/34278 [7:45:53<36:10:23, 4.78s/it] {'loss': 0.144, 'grad_norm': 0.8304613080203108, 'learning_rate': 9.221087686477335e-06, 'epoch': 0.2} 20%|██ | 7016/34278 [7:45:53<36:10:23, 4.78s/it] 20%|██ | 7017/34278 [7:45:56<32:21:00, 4.27s/it] {'loss': 0.1406, 'grad_norm': 0.9407708063085051, 'learning_rate': 9.220834442597015e-06, 'epoch': 0.2} 20%|██ | 7017/34278 [7:45:56<32:21:00, 4.27s/it] 20%|██ | 7018/34278 [7:46:02<35:31:58, 4.69s/it] {'loss': 0.1919, 'grad_norm': 0.9594131857982843, 'learning_rate': 9.220581161034093e-06, 'epoch': 0.2} 20%|██ | 7018/34278 [7:46:02<35:31:58, 4.69s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f06ec804630>
Failed to fetch sample 2691937.
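The two UnidentifiedImageError tracebacks in this log show the run surviving corrupt images: the dataset prints "Failed to fetch sample N. Exception: ..." and training continues at the next step. A minimal sketch of that skip-and-retry pattern, assuming a generic index-to-sample loader (the names here are hypothetical, not the actual aguvis dataset.py code):

```python
class SkipCorruptSamples:
    """Wrap an unreliable sample loader so that a corrupt sample
    (e.g. PIL.UnidentifiedImageError on a bad image) is logged and
    replaced by a neighboring index instead of killing the run."""

    def __init__(self, load_fn, num_samples, max_retries=10):
        self.load_fn = load_fn          # callable: index -> sample, may raise
        self.num_samples = num_samples
        self.max_retries = max_retries

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.load_fn(i)
            except Exception as e:
                # mirrors the "Failed to fetch sample N. Exception: ..." lines
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % self.num_samples  # fall back to the next index
        raise RuntimeError("too many consecutive corrupt samples")
```

A real implementation might pick the replacement index at random rather than sequentially; the deterministic fallback above just keeps the sketch easy to reason about.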
Exception: cannot identify image file <_io.BytesIO object at 0x7f06ec804630> 20%|██ | 7019/34278 [7:46:05<33:13:27, 4.39s/it] {'loss': 0.1613, 'grad_norm': 0.8012598194062235, 'learning_rate': 9.22032784179083e-06, 'epoch': 0.2} 20%|██ | 7019/34278 [7:46:05<33:13:27, 4.39s/it] 20%|██ | 7020/34278 [7:46:09<31:21:20, 4.14s/it] {'loss': 0.1524, 'grad_norm': 0.783222446437833, 'learning_rate': 9.220074484869488e-06, 'epoch': 0.2} 20%|██ | 7020/34278 [7:46:09<31:21:20, 4.14s/it] 20%|██ | 7021/34278 [7:46:13<31:12:14, 4.12s/it] {'loss': 0.1649, 'grad_norm': 1.0735242632621904, 'learning_rate': 9.219821090272326e-06, 'epoch': 0.2} 20%|██ | 7021/34278 [7:46:13<31:12:14, 4.12s/it] 20%|██ | 7022/34278 [7:46:17<31:29:49, 4.16s/it] {'loss': 0.1657, 'grad_norm': 0.8837234931799185, 'learning_rate': 9.219567658001613e-06, 'epoch': 0.2} 20%|██ | 7022/34278 [7:46:17<31:29:49, 4.16s/it] 20%|██ | 7023/34278 [7:46:24<36:04:30, 4.77s/it] {'loss': 0.1556, 'grad_norm': 0.8152130609318874, 'learning_rate': 9.219314188059605e-06, 'epoch': 0.2} 20%|██ | 7023/34278 [7:46:24<36:04:30, 4.77s/it] 20%|██ | 7024/34278 [7:46:28<35:49:13, 4.73s/it] {'loss': 0.1939, 'grad_norm': 0.8369414123535003, 'learning_rate': 9.219060680448567e-06, 'epoch': 0.2} 20%|██ | 7024/34278 [7:46:28<35:49:13, 4.73s/it] 20%|██ | 7025/34278 [7:46:31<32:16:21, 4.26s/it] {'loss': 0.1925, 'grad_norm': 1.0024558097070586, 'learning_rate': 9.218807135170763e-06, 'epoch': 0.2} 20%|██ | 7025/34278 [7:46:31<32:16:21, 4.26s/it] 20%|██ | 7026/34278 [7:46:37<36:18:19, 4.80s/it] {'loss': 0.1798, 'grad_norm': 0.7609945916463682, 'learning_rate': 9.218553552228454e-06, 'epoch': 0.2} 20%|██ | 7026/34278 [7:46:37<36:18:19, 4.80s/it] 21%|██ | 7027/34278 [7:46:41<33:51:31, 4.47s/it] {'loss': 0.1609, 'grad_norm': 0.8195889686366729, 'learning_rate': 9.218299931623907e-06, 'epoch': 0.21} 21%|██ | 7027/34278 [7:46:41<33:51:31, 4.47s/it] 21%|██ | 7028/34278 [7:46:44<30:17:22, 4.00s/it] {'loss': 0.1663, 'grad_norm': 0.8855074580997848, 
'learning_rate': 9.218046273359385e-06, 'epoch': 0.21} 21%|██ | 7028/34278 [7:46:44<30:17:22, 4.00s/it] 21%|██ | 7029/34278 [7:46:47<27:52:23, 3.68s/it] {'loss': 0.1615, 'grad_norm': 0.8058363475589431, 'learning_rate': 9.217792577437154e-06, 'epoch': 0.21} 21%|██ | 7029/34278 [7:46:47<27:52:23, 3.68s/it] 21%|██ | 7030/34278 [7:46:52<31:39:39, 4.18s/it] {'loss': 0.184, 'grad_norm': 0.7965888061691665, 'learning_rate': 9.217538843859477e-06, 'epoch': 0.21} 21%|██ | 7030/34278 [7:46:52<31:39:39, 4.18s/it] 21%|██ | 7031/34278 [7:46:55<28:37:06, 3.78s/it] {'loss': 0.1978, 'grad_norm': 0.9233542755276904, 'learning_rate': 9.217285072628621e-06, 'epoch': 0.21} 21%|██ | 7031/34278 [7:46:55<28:37:06, 3.78s/it] 21%|██ | 7032/34278 [7:46:58<27:12:02, 3.59s/it] {'loss': 0.1549, 'grad_norm': 0.7417683680594019, 'learning_rate': 9.217031263746849e-06, 'epoch': 0.21} 21%|██ | 7032/34278 [7:46:58<27:12:02, 3.59s/it] 21%|██ | 7033/34278 [7:47:01<25:32:37, 3.38s/it] {'loss': 0.1561, 'grad_norm': 0.7508384502098701, 'learning_rate': 9.216777417216429e-06, 'epoch': 0.21} 21%|██ | 7033/34278 [7:47:01<25:32:37, 3.38s/it] 21%|██ | 7034/34278 [7:47:07<31:41:36, 4.19s/it] {'loss': 0.1619, 'grad_norm': 0.7734620855738245, 'learning_rate': 9.216523533039628e-06, 'epoch': 0.21} 21%|██ | 7034/34278 [7:47:07<31:41:36, 4.19s/it] 21%|██ | 7035/34278 [7:47:10<28:41:26, 3.79s/it] {'loss': 0.1733, 'grad_norm': 0.8909058400944175, 'learning_rate': 9.21626961121871e-06, 'epoch': 0.21} 21%|██ | 7035/34278 [7:47:10<28:41:26, 3.79s/it] 21%|██ | 7036/34278 [7:47:13<27:33:26, 3.64s/it] {'loss': 0.1708, 'grad_norm': 0.8887479590938959, 'learning_rate': 9.216015651755944e-06, 'epoch': 0.21} 21%|██ | 7036/34278 [7:47:13<27:33:26, 3.64s/it] 21%|██ | 7037/34278 [7:47:19<33:04:03, 4.37s/it] {'loss': 0.1761, 'grad_norm': 0.7810672670879556, 'learning_rate': 9.215761654653597e-06, 'epoch': 0.21} 21%|██ | 7037/34278 [7:47:19<33:04:03, 4.37s/it] 21%|██ | 7038/34278 [7:47:26<37:38:04, 4.97s/it] {'loss': 0.1416, 
'grad_norm': 0.825915603229344, 'learning_rate': 9.215507619913937e-06, 'epoch': 0.21} 21%|██ | 7038/34278 [7:47:26<37:38:04, 4.97s/it] 21%|██ | 7039/34278 [7:47:32<39:47:45, 5.26s/it] {'loss': 0.1604, 'grad_norm': 0.9494067374597331, 'learning_rate': 9.215253547539229e-06, 'epoch': 0.21} 21%|██ | 7039/34278 [7:47:32<39:47:45, 5.26s/it] 21%|██ | 7040/34278 [7:47:35<35:38:15, 4.71s/it] {'loss': 0.1665, 'grad_norm': 0.849334261808857, 'learning_rate': 9.214999437531746e-06, 'epoch': 0.21} 21%|██ | 7040/34278 [7:47:35<35:38:15, 4.71s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7041/34278 [7:47:39<33:50:28, 4.47s/it] {'loss': 0.168, 'grad_norm': 0.9562283780863475, 'learning_rate': 9.214745289893753e-06, 'epoch': 0.21} 21%|██ | 7041/34278 [7:47:39<33:50:28, 4.47s/it] 21%|██ | 7042/34278 [7:47:42<30:52:27, 4.08s/it] {'loss': 0.1501, 'grad_norm': 0.8395458214040364, 'learning_rate': 9.21449110462752e-06, 'epoch': 0.21} 21%|██ | 7042/34278 [7:47:42<30:52:27, 4.08s/it] 21%|██ | 7043/34278 [7:47:45<28:37:14, 3.78s/it] {'loss': 0.134, 'grad_norm': 0.8971796442230011, 'learning_rate': 9.214236881735317e-06, 'epoch': 0.21} 21%|██ | 7043/34278 [7:47:45<28:37:14, 3.78s/it] 21%|██ | 7044/34278 [7:47:48<26:52:21, 3.55s/it] {'loss': 0.1744, 'grad_norm': 1.0300220366056503, 'learning_rate': 9.213982621219413e-06, 'epoch': 0.21} 21%|██ | 7044/34278 [7:47:48<26:52:21, 3.55s/it] 21%|██ | 7045/34278 [7:47:52<26:35:27, 3.52s/it] {'loss': 0.1748, 'grad_norm': 1.1743529491957736, 'learning_rate': 9.213728323082079e-06, 'epoch': 0.21} 21%|██ | 7045/34278 [7:47:52<26:35:27, 3.52s/it] 21%|██ | 7046/34278 [7:47:55<25:24:54, 3.36s/it] {'loss': 0.1632, 'grad_norm': 1.0193126074031889, 'learning_rate': 9.213473987325583e-06, 'epoch': 0.21} 21%|██ | 7046/34278 [7:47:55<25:24:54, 3.36s/it] 21%|██ | 7047/34278 
[7:48:01<31:28:47, 4.16s/it] {'loss': 0.1524, 'grad_norm': 0.8004089414962124, 'learning_rate': 9.213219613952198e-06, 'epoch': 0.21} 21%|██ | 7047/34278 [7:48:01<31:28:47, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7048/34278 [7:48:04<29:29:54, 3.90s/it] {'loss': 0.172, 'grad_norm': 1.0382022058331408, 'learning_rate': 9.212965202964192e-06, 'epoch': 0.21} 21%|██ | 7048/34278 [7:48:04<29:29:54, 3.90s/it] 21%|██ | 7049/34278 [7:48:07<27:54:50, 3.69s/it] {'loss': 0.1494, 'grad_norm': 0.8307365765560831, 'learning_rate': 9.212710754363841e-06, 'epoch': 0.21} 21%|██ | 7049/34278 [7:48:07<27:54:50, 3.69s/it] 21%|██ | 7050/34278 [7:48:10<26:11:30, 3.46s/it] {'loss': 0.1748, 'grad_norm': 0.89444786051651, 'learning_rate': 9.212456268153414e-06, 'epoch': 0.21} 21%|██ | 7050/34278 [7:48:10<26:11:30, 3.46s/it] 21%|██ | 7051/34278 [7:48:14<25:58:01, 3.43s/it] {'loss': 0.16, 'grad_norm': 0.7110872416229941, 'learning_rate': 9.212201744335182e-06, 'epoch': 0.21} 21%|██ | 7051/34278 [7:48:14<25:58:01, 3.43s/it] 21%|██ | 7052/34278 [7:48:16<24:36:01, 3.25s/it] {'loss': 0.1777, 'grad_norm': 0.9553424011884314, 'learning_rate': 9.211947182911418e-06, 'epoch': 0.21} 21%|██ | 7052/34278 [7:48:16<24:36:01, 3.25s/it] 21%|██ | 7053/34278 [7:48:19<23:55:01, 3.16s/it] {'loss': 0.1692, 'grad_norm': 0.8427944379759794, 'learning_rate': 9.211692583884395e-06, 'epoch': 0.21} 21%|██ | 7053/34278 [7:48:19<23:55:01, 3.16s/it] 21%|██ | 7054/34278 [7:48:24<26:02:17, 3.44s/it] {'loss': 0.1734, 'grad_norm': 0.8912967009010678, 'learning_rate': 9.211437947256387e-06, 'epoch': 0.21} 21%|██ | 7054/34278 [7:48:24<26:02:17, 3.44s/it] 21%|██ | 7055/34278 [7:48:27<26:32:05, 3.51s/it] {'loss': 0.1605, 'grad_norm': 1.0440606064314335, 'learning_rate': 9.211183273029667e-06, 'epoch': 0.21} 21%|██ | 7055/34278 
[7:48:27<26:32:05, 3.51s/it] 21%|██ | 7056/34278 [7:48:31<27:10:14, 3.59s/it] {'loss': 0.174, 'grad_norm': 0.8608823932440356, 'learning_rate': 9.210928561206507e-06, 'epoch': 0.21} 21%|██ | 7056/34278 [7:48:31<27:10:14, 3.59s/it] 21%|██ | 7057/34278 [7:48:34<25:39:47, 3.39s/it] {'loss': 0.1549, 'grad_norm': 0.8798085720541676, 'learning_rate': 9.210673811789181e-06, 'epoch': 0.21} 21%|██ | 7057/34278 [7:48:34<25:39:47, 3.39s/it] 21%|██ | 7058/34278 [7:48:37<25:23:04, 3.36s/it] {'loss': 0.1979, 'grad_norm': 1.0416204542312937, 'learning_rate': 9.210419024779967e-06, 'epoch': 0.21} 21%|██ | 7058/34278 [7:48:37<25:23:04, 3.36s/it] 21%|██ | 7059/34278 [7:48:41<26:11:52, 3.46s/it] {'loss': 0.164, 'grad_norm': 0.8238960482510895, 'learning_rate': 9.210164200181133e-06, 'epoch': 0.21} 21%|██ | 7059/34278 [7:48:41<26:11:52, 3.46s/it] 21%|██ | 7060/34278 [7:48:44<25:52:06, 3.42s/it] {'loss': 0.1715, 'grad_norm': 0.9591263439518445, 'learning_rate': 9.209909337994963e-06, 'epoch': 0.21} 21%|██ | 7060/34278 [7:48:44<25:52:06, 3.42s/it] 21%|██ | 7061/34278 [7:48:50<31:35:33, 4.18s/it] {'loss': 0.1736, 'grad_norm': 0.8637896079605812, 'learning_rate': 9.209654438223724e-06, 'epoch': 0.21} 21%|██ | 7061/34278 [7:48:50<31:35:33, 4.18s/it] 21%|██ | 7062/34278 [7:48:54<31:20:53, 4.15s/it] {'loss': 0.1771, 'grad_norm': 0.8204201986572738, 'learning_rate': 9.209399500869695e-06, 'epoch': 0.21} 21%|██ | 7062/34278 [7:48:54<31:20:53, 4.15s/it] 21%|██ | 7063/34278 [7:48:57<29:19:05, 3.88s/it] {'loss': 0.1635, 'grad_norm': 0.8331051108982461, 'learning_rate': 9.209144525935154e-06, 'epoch': 0.21} 21%|██ | 7063/34278 [7:48:57<29:19:05, 3.88s/it] 21%|██ | 7064/34278 [7:49:03<33:47:23, 4.47s/it] {'loss': 0.1653, 'grad_norm': 0.8698195907117675, 'learning_rate': 9.208889513422374e-06, 'epoch': 0.21} 21%|██ | 7064/34278 [7:49:03<33:47:23, 4.47s/it] 21%|██ | 7065/34278 [7:49:08<34:31:39, 4.57s/it] {'loss': 0.1462, 'grad_norm': 0.7944483701447789, 'learning_rate': 9.208634463333634e-06, 
'epoch': 0.21} 21%|██ | 7065/34278 [7:49:08<34:31:39, 4.57s/it] 21%|██ | 7066/34278 [7:49:14<38:08:12, 5.05s/it] {'loss': 0.1849, 'grad_norm': 0.9980133372433863, 'learning_rate': 9.20837937567121e-06, 'epoch': 0.21} 21%|██ | 7066/34278 [7:49:14<38:08:12, 5.05s/it] 21%|██ | 7067/34278 [7:49:18<35:54:34, 4.75s/it] {'loss': 0.1949, 'grad_norm': 0.8137617243493083, 'learning_rate': 9.20812425043738e-06, 'epoch': 0.21} 21%|██ | 7067/34278 [7:49:18<35:54:34, 4.75s/it] 21%|██ | 7068/34278 [7:49:24<37:17:45, 4.93s/it] {'loss': 0.1379, 'grad_norm': 0.9273455101518907, 'learning_rate': 9.20786908763442e-06, 'epoch': 0.21} 21%|██ | 7068/34278 [7:49:24<37:17:45, 4.93s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7069/34278 [7:49:28<35:33:30, 4.70s/it] {'loss': 0.1676, 'grad_norm': 0.8629397425055924, 'learning_rate': 9.20761388726461e-06, 'epoch': 0.21} 21%|██ | 7069/34278 [7:49:28<35:33:30, 4.70s/it] 21%|██ | 7070/34278 [7:49:33<36:55:55, 4.89s/it] {'loss': 0.1484, 'grad_norm': 0.6833022223183531, 'learning_rate': 9.207358649330229e-06, 'epoch': 0.21} 21%|██ | 7070/34278 [7:49:33<36:55:55, 4.89s/it] 21%|██ | 7071/34278 [7:49:36<33:18:10, 4.41s/it] {'loss': 0.1598, 'grad_norm': 0.8137189089894815, 'learning_rate': 9.207103373833553e-06, 'epoch': 0.21} 21%|██ | 7071/34278 [7:49:36<33:18:10, 4.41s/it] 21%|██ | 7072/34278 [7:49:43<37:38:27, 4.98s/it] {'loss': 0.1588, 'grad_norm': 0.9811256991406814, 'learning_rate': 9.206848060776861e-06, 'epoch': 0.21} 21%|██ | 7072/34278 [7:49:43<37:38:27, 4.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 21%|██ | 7073/34278 [7:49:46<33:17:23, 4.41s/it] {'loss': 0.1549, 'grad_norm': 0.8532945645803387, 'learning_rate': 9.206592710162436e-06, 'epoch': 0.21} 21%|██ | 7073/34278 [7:49:46<33:17:23, 4.41s/it] 21%|██ | 7074/34278 [7:49:50<32:39:38, 4.32s/it] {'loss': 0.1729, 'grad_norm': 0.8748392553037043, 'learning_rate': 9.206337321992554e-06, 'epoch': 0.21} 21%|██ | 7074/34278 [7:49:50<32:39:38, 4.32s/it] 21%|██ | 7075/34278 [7:49:53<30:46:59, 4.07s/it] {'loss': 0.1427, 'grad_norm': 0.7947230153235141, 'learning_rate': 9.206081896269498e-06, 'epoch': 0.21} 21%|██ | 7075/34278 [7:49:53<30:46:59, 4.07s/it] 21%|██ | 7076/34278 [7:49:57<29:58:44, 3.97s/it] {'loss': 0.1874, 'grad_norm': 0.8735131842080504, 'learning_rate': 9.205826432995547e-06, 'epoch': 0.21} 21%|██ | 7076/34278 [7:49:57<29:58:44, 3.97s/it] 21%|██ | 7077/34278 [7:50:00<27:57:15, 3.70s/it] {'loss': 0.1574, 'grad_norm': 0.904135571112561, 'learning_rate': 9.20557093217298e-06, 'epoch': 0.21} 21%|██ | 7077/34278 [7:50:00<27:57:15, 3.70s/it] 21%|██ | 7078/34278 [7:50:06<31:44:52, 4.20s/it] {'loss': 0.152, 'grad_norm': 1.1024578813330324, 'learning_rate': 9.20531539380408e-06, 'epoch': 0.21} 21%|██ | 7078/34278 [7:50:06<31:44:52, 4.20s/it] 21%|██ | 7079/34278 [7:50:09<29:17:38, 3.88s/it] {'loss': 0.1478, 'grad_norm': 0.8570271618694294, 'learning_rate': 9.205059817891128e-06, 'epoch': 0.21} 21%|██ | 7079/34278 [7:50:09<29:17:38, 3.88s/it] 21%|██ | 7080/34278 [7:50:14<32:02:12, 4.24s/it] {'loss': 0.1632, 'grad_norm': 0.7449770153439637, 'learning_rate': 9.204804204436406e-06, 'epoch': 0.21} 21%|██ | 7080/34278 [7:50:14<32:02:12, 4.24s/it] 21%|██ | 7081/34278 [7:50:18<32:01:50, 4.24s/it] {'loss': 0.2087, 'grad_norm': 0.9955024876594831, 'learning_rate': 9.204548553442196e-06, 'epoch': 0.21} 21%|██ | 7081/34278 [7:50:18<32:01:50, 4.24s/it] 21%|██ | 7082/34278 [7:50:21<28:55:27, 3.83s/it] {'loss': 0.1488, 'grad_norm': 0.8777825750110577, 'learning_rate': 9.204292864910781e-06, 
'epoch': 0.21} 21%|██ | 7082/34278 [7:50:21<28:55:27, 3.83s/it] 21%|██ | 7083/34278 [7:50:27<33:27:22, 4.43s/it] {'loss': 0.1598, 'grad_norm': 0.8349816278410988, 'learning_rate': 9.204037138844441e-06, 'epoch': 0.21} 21%|██ | 7083/34278 [7:50:27<33:27:22, 4.43s/it] 21%|██ | 7084/34278 [7:50:32<34:32:36, 4.57s/it] {'loss': 0.182, 'grad_norm': 1.2555677252554858, 'learning_rate': 9.203781375245465e-06, 'epoch': 0.21} 21%|██ | 7084/34278 [7:50:32<34:32:36, 4.57s/it] 21%|██ | 7085/34278 [7:50:37<35:07:34, 4.65s/it] {'loss': 0.1719, 'grad_norm': 0.8331678704155605, 'learning_rate': 9.203525574116127e-06, 'epoch': 0.21} 21%|██ | 7085/34278 [7:50:37<35:07:34, 4.65s/it] 21%|██ | 7086/34278 [7:50:40<32:17:06, 4.27s/it] {'loss': 0.1465, 'grad_norm': 0.7213427739702032, 'learning_rate': 9.20326973545872e-06, 'epoch': 0.21} 21%|██ | 7086/34278 [7:50:40<32:17:06, 4.27s/it] 21%|██ | 7087/34278 [7:50:43<29:26:40, 3.90s/it] {'loss': 0.1567, 'grad_norm': 0.7682660254920037, 'learning_rate': 9.203013859275523e-06, 'epoch': 0.21} 21%|██ | 7087/34278 [7:50:43<29:26:40, 3.90s/it] 21%|██ | 7088/34278 [7:50:46<27:00:39, 3.58s/it] {'loss': 0.1708, 'grad_norm': 0.8731578594256073, 'learning_rate': 9.202757945568822e-06, 'epoch': 0.21} 21%|██ | 7088/34278 [7:50:46<27:00:39, 3.58s/it] 21%|██ | 7089/34278 [7:50:49<26:17:55, 3.48s/it] {'loss': 0.145, 'grad_norm': 0.8333449796435296, 'learning_rate': 9.2025019943409e-06, 'epoch': 0.21} 21%|██ | 7089/34278 [7:50:49<26:17:55, 3.48s/it] 21%|██ | 7090/34278 [7:50:53<26:35:49, 3.52s/it] {'loss': 0.1705, 'grad_norm': 0.8760915723100529, 'learning_rate': 9.202246005594045e-06, 'epoch': 0.21} 21%|██ | 7090/34278 [7:50:53<26:35:49, 3.52s/it] 21%|██ | 7091/34278 [7:50:56<25:29:08, 3.37s/it] {'loss': 0.1764, 'grad_norm': 1.0430655009673637, 'learning_rate': 9.20198997933054e-06, 'epoch': 0.21} 21%|██ | 7091/34278 [7:50:56<25:29:08, 3.37s/it] 21%|██ | 7092/34278 [7:50:59<25:06:38, 3.33s/it] {'loss': 0.1528, 'grad_norm': 0.8727107465988335, 
'learning_rate': 9.201733915552672e-06, 'epoch': 0.21} 21%|██ | 7092/34278 [7:50:59<25:06:38, 3.33s/it] 21%|██ | 7093/34278 [7:51:02<25:04:50, 3.32s/it] {'loss': 0.1685, 'grad_norm': 0.9070862177074983, 'learning_rate': 9.201477814262727e-06, 'epoch': 0.21} 21%|██ | 7093/34278 [7:51:02<25:04:50, 3.32s/it] 21%|██ | 7094/34278 [7:51:05<24:15:41, 3.21s/it] {'loss': 0.1574, 'grad_norm': 0.9758127645458515, 'learning_rate': 9.20122167546299e-06, 'epoch': 0.21} 21%|██ | 7094/34278 [7:51:05<24:15:41, 3.21s/it] 21%|██ | 7095/34278 [7:51:12<31:37:39, 4.19s/it] {'loss': 0.1737, 'grad_norm': 0.8160619992259059, 'learning_rate': 9.20096549915575e-06, 'epoch': 0.21} 21%|██ | 7095/34278 [7:51:12<31:37:39, 4.19s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7096/34278 [7:51:15<30:35:43, 4.05s/it] {'loss': 0.1451, 'grad_norm': 0.7248935104136316, 'learning_rate': 9.200709285343292e-06, 'epoch': 0.21} 21%|██ | 7096/34278 [7:51:15<30:35:43, 4.05s/it] 21%|██ | 7097/34278 [7:51:18<28:08:06, 3.73s/it] {'loss': 0.1441, 'grad_norm': 0.7895891594620774, 'learning_rate': 9.200453034027903e-06, 'epoch': 0.21} 21%|██ | 7097/34278 [7:51:18<28:08:06, 3.73s/it] 21%|██ | 7098/34278 [7:51:22<28:30:02, 3.77s/it] {'loss': 0.1796, 'grad_norm': 0.9970660714634246, 'learning_rate': 9.200196745211873e-06, 'epoch': 0.21} 21%|██ | 7098/34278 [7:51:22<28:30:02, 3.77s/it] 21%|██ | 7099/34278 [7:51:26<27:41:30, 3.67s/it] {'loss': 0.1676, 'grad_norm': 0.7827229750158712, 'learning_rate': 9.19994041889749e-06, 'epoch': 0.21} 21%|██ | 7099/34278 [7:51:26<27:41:30, 3.67s/it] 21%|██ | 7100/34278 [7:51:29<27:35:31, 3.65s/it] {'loss': 0.1454, 'grad_norm': 0.7112071647046179, 'learning_rate': 9.19968405508704e-06, 'epoch': 0.21} 21%|██ | 7100/34278 [7:51:29<27:35:31, 3.65s/it] 21%|██ | 7101/34278 [7:51:33<26:41:17, 3.54s/it] 
{'loss': 0.168, 'grad_norm': 0.6660471009508432, 'learning_rate': 9.199427653782815e-06, 'epoch': 0.21} 21%|██ | 7101/34278 [7:51:33<26:41:17, 3.54s/it] 21%|██ | 7102/34278 [7:51:36<27:10:51, 3.60s/it] {'loss': 0.1601, 'grad_norm': 0.8299462389483362, 'learning_rate': 9.199171214987103e-06, 'epoch': 0.21} 21%|██ | 7102/34278 [7:51:36<27:10:51, 3.60s/it] 21%|██ | 7103/34278 [7:51:40<26:44:37, 3.54s/it] {'loss': 0.148, 'grad_norm': 0.8344605365563602, 'learning_rate': 9.198914738702191e-06, 'epoch': 0.21} 21%|██ | 7103/34278 [7:51:40<26:44:37, 3.54s/it] 21%|██ | 7104/34278 [7:51:44<28:00:30, 3.71s/it] {'loss': 0.1748, 'grad_norm': 0.7109615234153235, 'learning_rate': 9.19865822493037e-06, 'epoch': 0.21} 21%|██ | 7104/34278 [7:51:44<28:00:30, 3.71s/it] 21%|██ | 7105/34278 [7:51:47<27:36:39, 3.66s/it] {'loss': 0.1813, 'grad_norm': 0.9412419045345006, 'learning_rate': 9.198401673673934e-06, 'epoch': 0.21} 21%|██ | 7105/34278 [7:51:47<27:36:39, 3.66s/it] 21%|██ | 7106/34278 [7:51:52<28:55:46, 3.83s/it] {'loss': 0.1892, 'grad_norm': 1.1276383291887195, 'learning_rate': 9.198145084935167e-06, 'epoch': 0.21} 21%|██ | 7106/34278 [7:51:52<28:55:46, 3.83s/it] 21%|██ | 7107/34278 [7:51:55<28:47:35, 3.81s/it] {'loss': 0.1624, 'grad_norm': 0.9710707860135948, 'learning_rate': 9.197888458716364e-06, 'epoch': 0.21} 21%|██ | 7107/34278 [7:51:55<28:47:35, 3.81s/it] 21%|██ | 7108/34278 [7:51:59<27:39:21, 3.66s/it] {'loss': 0.1566, 'grad_norm': 1.019701360818089, 'learning_rate': 9.197631795019815e-06, 'epoch': 0.21} 21%|██ | 7108/34278 [7:51:59<27:39:21, 3.66s/it] 21%|██ | 7109/34278 [7:52:02<27:12:05, 3.60s/it] {'loss': 0.1758, 'grad_norm': 0.74507133497317, 'learning_rate': 9.197375093847811e-06, 'epoch': 0.21} 21%|██ | 7109/34278 [7:52:02<27:12:05, 3.60s/it] 21%|██ | 7110/34278 [7:52:05<25:27:23, 3.37s/it] {'loss': 0.1707, 'grad_norm': 1.0144998879685472, 'learning_rate': 9.197118355202644e-06, 'epoch': 0.21} 21%|██ | 7110/34278 [7:52:05<25:27:23, 3.37s/it] 21%|██ | 7111/34278 
[7:52:10<28:22:32, 3.76s/it] {'loss': 0.1722, 'grad_norm': 0.8741559351263685, 'learning_rate': 9.196861579086607e-06, 'epoch': 0.21} 21%|██ | 7111/34278 [7:52:10<28:22:32, 3.76s/it] 21%|██ | 7112/34278 [7:52:13<27:15:05, 3.61s/it] {'loss': 0.1478, 'grad_norm': 0.6842181467468504, 'learning_rate': 9.196604765501991e-06, 'epoch': 0.21} 21%|██ | 7112/34278 [7:52:13<27:15:05, 3.61s/it] 21%|██ | 7113/34278 [7:52:17<28:13:49, 3.74s/it] {'loss': 0.1753, 'grad_norm': 0.9035682230126743, 'learning_rate': 9.196347914451089e-06, 'epoch': 0.21} 21%|██ | 7113/34278 [7:52:17<28:13:49, 3.74s/it] 21%|██ | 7114/34278 [7:52:21<30:02:52, 3.98s/it] {'loss': 0.1662, 'grad_norm': 0.802764908227016, 'learning_rate': 9.196091025936195e-06, 'epoch': 0.21} 21%|██ | 7114/34278 [7:52:21<30:02:52, 3.98s/it] 21%|██ | 7115/34278 [7:52:24<27:38:42, 3.66s/it] {'loss': 0.155, 'grad_norm': 0.8312720985285272, 'learning_rate': 9.195834099959604e-06, 'epoch': 0.21} 21%|██ | 7115/34278 [7:52:24<27:38:42, 3.66s/it] 21%|██ | 7116/34278 [7:52:27<25:46:15, 3.42s/it] {'loss': 0.171, 'grad_norm': 0.7328392867043356, 'learning_rate': 9.195577136523606e-06, 'epoch': 0.21} 21%|██ | 7116/34278 [7:52:27<25:46:15, 3.42s/it] 21%|██ | 7117/34278 [7:52:31<25:35:11, 3.39s/it] {'loss': 0.1784, 'grad_norm': 0.7986201166657328, 'learning_rate': 9.195320135630496e-06, 'epoch': 0.21} 21%|██ | 7117/34278 [7:52:31<25:35:11, 3.39s/it] 21%|██ | 7118/34278 [7:52:35<27:13:32, 3.61s/it] {'loss': 0.1426, 'grad_norm': 0.8296984793649945, 'learning_rate': 9.19506309728257e-06, 'epoch': 0.21} 21%|██ | 7118/34278 [7:52:35<27:13:32, 3.61s/it] 21%|██ | 7119/34278 [7:52:38<27:02:47, 3.59s/it] {'loss': 0.1379, 'grad_norm': 0.6494697354782675, 'learning_rate': 9.194806021482123e-06, 'epoch': 0.21} 21%|██ | 7119/34278 [7:52:38<27:02:47, 3.59s/it] 21%|██ | 7120/34278 [7:52:41<26:10:58, 3.47s/it] {'loss': 0.1491, 'grad_norm': 1.2721754865246282, 'learning_rate': 9.194548908231448e-06, 'epoch': 0.21} 21%|██ | 7120/34278 [7:52:41<26:10:58, 
3.47s/it] 21%|██ | 7121/34278 [7:52:45<25:54:54, 3.44s/it] {'loss': 0.1481, 'grad_norm': 1.114672137703868, 'learning_rate': 9.194291757532842e-06, 'epoch': 0.21} 21%|██ | 7121/34278 [7:52:45<25:54:54, 3.44s/it] 21%|██ | 7122/34278 [7:52:48<25:21:14, 3.36s/it] {'loss': 0.1589, 'grad_norm': 0.8637457423044791, 'learning_rate': 9.194034569388602e-06, 'epoch': 0.21} 21%|██ | 7122/34278 [7:52:48<25:21:14, 3.36s/it] 21%|██ | 7123/34278 [7:52:51<25:06:49, 3.33s/it] {'loss': 0.1765, 'grad_norm': 0.7333022679873099, 'learning_rate': 9.193777343801021e-06, 'epoch': 0.21} 21%|██ | 7123/34278 [7:52:51<25:06:49, 3.33s/it] 21%|██ | 7124/34278 [7:52:55<26:05:13, 3.46s/it] {'loss': 0.1891, 'grad_norm': 0.9906453369889608, 'learning_rate': 9.193520080772398e-06, 'epoch': 0.21} 21%|██ | 7124/34278 [7:52:55<26:05:13, 3.46s/it] 21%|██ | 7125/34278 [7:52:58<24:57:03, 3.31s/it] {'loss': 0.2019, 'grad_norm': 0.9500924370608755, 'learning_rate': 9.193262780305028e-06, 'epoch': 0.21} 21%|██ | 7125/34278 [7:52:58<24:57:03, 3.31s/it] 21%|██ | 7126/34278 [7:53:01<24:45:32, 3.28s/it] {'loss': 0.1505, 'grad_norm': 1.052800775250883, 'learning_rate': 9.193005442401209e-06, 'epoch': 0.21} 21%|██ | 7126/34278 [7:53:01<24:45:32, 3.28s/it] 21%|██ | 7127/34278 [7:53:05<25:25:49, 3.37s/it] {'loss': 0.1727, 'grad_norm': 0.7881281559280178, 'learning_rate': 9.192748067063238e-06, 'epoch': 0.21} 21%|██ | 7127/34278 [7:53:05<25:25:49, 3.37s/it] 21%|██ | 7128/34278 [7:53:08<25:41:54, 3.41s/it] {'loss': 0.1557, 'grad_norm': 0.9240133278018553, 'learning_rate': 9.192490654293414e-06, 'epoch': 0.21} 21%|██ | 7128/34278 [7:53:08<25:41:54, 3.41s/it] 21%|██ | 7129/34278 [7:53:12<27:39:52, 3.67s/it] {'loss': 0.1691, 'grad_norm': 0.9479760475904528, 'learning_rate': 9.192233204094034e-06, 'epoch': 0.21} 21%|██ | 7129/34278 [7:53:12<27:39:52, 3.67s/it] 21%|██ | 7130/34278 [7:53:18<32:55:57, 4.37s/it] {'loss': 0.1541, 'grad_norm': 0.9927279076327339, 'learning_rate': 9.191975716467397e-06, 'epoch': 0.21} 21%|██ | 
7130/34278 [7:53:18<32:55:57, 4.37s/it] 21%|██ | 7131/34278 [7:53:22<30:12:25, 4.01s/it] {'loss': 0.1408, 'grad_norm': 0.9893259101543747, 'learning_rate': 9.1917181914158e-06, 'epoch': 0.21} 21%|██ | 7131/34278 [7:53:22<30:12:25, 4.01s/it] 21%|██ | 7132/34278 [7:53:26<30:03:53, 3.99s/it] {'loss': 0.1874, 'grad_norm': 0.895924776365021, 'learning_rate': 9.191460628941544e-06, 'epoch': 0.21} 21%|██ | 7132/34278 [7:53:26<30:03:53, 3.99s/it] 21%|██ | 7133/34278 [7:53:29<28:22:39, 3.76s/it] {'loss': 0.1894, 'grad_norm': 0.9570681155974422, 'learning_rate': 9.191203029046929e-06, 'epoch': 0.21} 21%|██ | 7133/34278 [7:53:29<28:22:39, 3.76s/it] 21%|██ | 7134/34278 [7:53:32<27:31:16, 3.65s/it] {'loss': 0.1959, 'grad_norm': 1.0782237602145626, 'learning_rate': 9.190945391734254e-06, 'epoch': 0.21} 21%|██ | 7134/34278 [7:53:32<27:31:16, 3.65s/it] 21%|██ | 7135/34278 [7:53:37<31:14:09, 4.14s/it] {'loss': 0.1554, 'grad_norm': 0.879667010937371, 'learning_rate': 9.190687717005818e-06, 'epoch': 0.21} 21%|██ | 7135/34278 [7:53:38<31:14:09, 4.14s/it] 21%|██ | 7136/34278 [7:53:43<33:17:53, 4.42s/it] {'loss': 0.173, 'grad_norm': 1.7162003841975093, 'learning_rate': 9.190430004863924e-06, 'epoch': 0.21} 21%|██ | 7136/34278 [7:53:43<33:17:53, 4.42s/it] 21%|██ | 7137/34278 [7:53:47<32:44:46, 4.34s/it] {'loss': 0.1772, 'grad_norm': 1.114298507950001, 'learning_rate': 9.190172255310869e-06, 'epoch': 0.21} 21%|██ | 7137/34278 [7:53:47<32:44:46, 4.34s/it] 21%|██ | 7138/34278 [7:53:50<29:50:35, 3.96s/it] {'loss': 0.1714, 'grad_norm': 0.9187882525918382, 'learning_rate': 9.18991446834896e-06, 'epoch': 0.21} 21%|██ | 7138/34278 [7:53:50<29:50:35, 3.96s/it] 21%|██ | 7139/34278 [7:53:53<28:02:19, 3.72s/it] {'loss': 0.1503, 'grad_norm': 0.8934628611463499, 'learning_rate': 9.189656643980492e-06, 'epoch': 0.21} 21%|██ | 7139/34278 [7:53:53<28:02:19, 3.72s/it] 21%|██ | 7140/34278 [7:53:56<26:35:33, 3.53s/it] {'loss': 0.1561, 'grad_norm': 0.8729740494879913, 'learning_rate': 9.189398782207771e-06, 
'epoch': 0.21} 21%|██ | 7140/34278 [7:53:56<26:35:33, 3.53s/it] 21%|██ | 7141/34278 [7:53:59<25:17:56, 3.36s/it] {'loss': 0.1615, 'grad_norm': 0.8532934030012483, 'learning_rate': 9.189140883033097e-06, 'epoch': 0.21} 21%|██ | 7141/34278 [7:53:59<25:17:56, 3.36s/it] 21%|██ | 7142/34278 [7:54:02<23:56:53, 3.18s/it] {'loss': 0.175, 'grad_norm': 0.8884195549436464, 'learning_rate': 9.188882946458773e-06, 'epoch': 0.21} 21%|██ | 7142/34278 [7:54:02<23:56:53, 3.18s/it] 21%|██ | 7143/34278 [7:54:08<30:29:56, 4.05s/it] {'loss': 0.1552, 'grad_norm': 0.9929578120059851, 'learning_rate': 9.188624972487101e-06, 'epoch': 0.21} 21%|██ | 7143/34278 [7:54:08<30:29:56, 4.05s/it] 21%|██ | 7144/34278 [7:54:13<33:35:55, 4.46s/it] {'loss': 0.1798, 'grad_norm': 0.8440531149573288, 'learning_rate': 9.188366961120386e-06, 'epoch': 0.21} 21%|██ | 7144/34278 [7:54:13<33:35:55, 4.46s/it] 21%|██ | 7145/34278 [7:54:16<29:55:08, 3.97s/it] {'loss': 0.1861, 'grad_norm': 0.7885896114796663, 'learning_rate': 9.188108912360932e-06, 'epoch': 0.21} 21%|██ | 7145/34278 [7:54:16<29:55:08, 3.97s/it] 21%|██ | 7146/34278 [7:54:20<30:33:24, 4.05s/it] {'loss': 0.177, 'grad_norm': 1.0741397692528705, 'learning_rate': 9.18785082621104e-06, 'epoch': 0.21} 21%|██ | 7146/34278 [7:54:20<30:33:24, 4.05s/it] 21%|██ | 7147/34278 [7:54:26<34:59:25, 4.64s/it] {'loss': 0.1751, 'grad_norm': 0.7888149571309065, 'learning_rate': 9.187592702673017e-06, 'epoch': 0.21} 21%|██ | 7147/34278 [7:54:26<34:59:25, 4.64s/it] 21%|██ | 7148/34278 [7:54:30<31:45:46, 4.21s/it] {'loss': 0.167, 'grad_norm': 0.8882939061613188, 'learning_rate': 9.187334541749165e-06, 'epoch': 0.21} 21%|██ | 7148/34278 [7:54:30<31:45:46, 4.21s/it] 21%|██ | 7149/34278 [7:54:33<29:17:44, 3.89s/it] {'loss': 0.1871, 'grad_norm': 1.1154958535430273, 'learning_rate': 9.187076343441787e-06, 'epoch': 0.21} 21%|██ | 7149/34278 [7:54:33<29:17:44, 3.89s/it] 21%|██ | 7150/34278 [7:54:36<27:38:39, 3.67s/it] {'loss': 0.1558, 'grad_norm': 0.8528943675779154, 
'learning_rate': 9.186818107753195e-06, 'epoch': 0.21} 21%|██ | 7150/34278 [7:54:36<27:38:39, 3.67s/it] 21%|██ | 7151/34278 [7:54:42<33:20:31, 4.42s/it] {'loss': 0.1457, 'grad_norm': 0.9105786245507598, 'learning_rate': 9.18655983468569e-06, 'epoch': 0.21} 21%|██ | 7151/34278 [7:54:42<33:20:31, 4.42s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7152/34278 [7:54:45<30:35:45, 4.06s/it] {'loss': 0.2017, 'grad_norm': 0.8758734781424876, 'learning_rate': 9.186301524241576e-06, 'epoch': 0.21} 21%|██ | 7152/34278 [7:54:45<30:35:45, 4.06s/it] 21%|██ | 7153/34278 [7:54:51<35:08:38, 4.66s/it] {'loss': 0.1895, 'grad_norm': 0.9515679452598051, 'learning_rate': 9.186043176423162e-06, 'epoch': 0.21} 21%|██ | 7153/34278 [7:54:51<35:08:38, 4.66s/it] 21%|██ | 7154/34278 [7:54:55<32:48:56, 4.36s/it] {'loss': 0.1914, 'grad_norm': 0.9761232825843177, 'learning_rate': 9.185784791232755e-06, 'epoch': 0.21} 21%|██ | 7154/34278 [7:54:55<32:48:56, 4.36s/it] 21%|██ | 7155/34278 [7:54:58<29:44:50, 3.95s/it] {'loss': 0.1627, 'grad_norm': 0.7893318345979233, 'learning_rate': 9.185526368672662e-06, 'epoch': 0.21} 21%|██ | 7155/34278 [7:54:58<29:44:50, 3.95s/it] 21%|██ | 7156/34278 [7:55:02<30:27:56, 4.04s/it] {'loss': 0.1758, 'grad_norm': 0.9538821863437764, 'learning_rate': 9.185267908745186e-06, 'epoch': 0.21} 21%|██ | 7156/34278 [7:55:02<30:27:56, 4.04s/it] 21%|██ | 7157/34278 [7:55:05<28:00:33, 3.72s/it] {'loss': 0.1539, 'grad_norm': 0.9263212016878108, 'learning_rate': 9.185009411452638e-06, 'epoch': 0.21} 21%|██ | 7157/34278 [7:55:05<28:00:33, 3.72s/it] 21%|██ | 7158/34278 [7:55:08<26:59:25, 3.58s/it] {'loss': 0.1606, 'grad_norm': 0.7255824406567902, 'learning_rate': 9.184750876797325e-06, 'epoch': 0.21} 21%|██ | 7158/34278 [7:55:08<26:59:25, 3.58s/it] 21%|██ | 7159/34278 [7:55:11<25:15:00, 3.35s/it] 
{'loss': 0.153, 'grad_norm': 0.8900029933857889, 'learning_rate': 9.184492304781555e-06, 'epoch': 0.21} 21%|██ | 7159/34278 [7:55:11<25:15:00, 3.35s/it] 21%|██ | 7160/34278 [7:55:15<26:30:00, 3.52s/it] {'loss': 0.1592, 'grad_norm': 0.8014889704446233, 'learning_rate': 9.184233695407635e-06, 'epoch': 0.21} 21%|██ | 7160/34278 [7:55:15<26:30:00, 3.52s/it] 21%|██ | 7161/34278 [7:55:21<31:18:43, 4.16s/it] {'loss': 0.1887, 'grad_norm': 0.8585024131393774, 'learning_rate': 9.18397504867788e-06, 'epoch': 0.21} 21%|██ | 7161/34278 [7:55:21<31:18:43, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7162/34278 [7:55:24<29:46:55, 3.95s/it] {'loss': 0.1509, 'grad_norm': 0.6927086297948429, 'learning_rate': 9.18371636459459e-06, 'epoch': 0.21} 21%|██ | 7162/34278 [7:55:24<29:46:55, 3.95s/it] 21%|██ | 7163/34278 [7:55:31<35:08:56, 4.67s/it] {'loss': 0.1887, 'grad_norm': 0.8285743828056754, 'learning_rate': 9.183457643160082e-06, 'epoch': 0.21} 21%|██ | 7163/34278 [7:55:31<35:08:56, 4.67s/it] 21%|██ | 7164/34278 [7:55:34<31:42:43, 4.21s/it] {'loss': 0.1712, 'grad_norm': 0.861001703495805, 'learning_rate': 9.183198884376661e-06, 'epoch': 0.21} 21%|██ | 7164/34278 [7:55:34<31:42:43, 4.21s/it] 21%|██ | 7165/34278 [7:55:40<36:20:36, 4.83s/it] {'loss': 0.1697, 'grad_norm': 0.7323659507296036, 'learning_rate': 9.18294008824664e-06, 'epoch': 0.21} 21%|██ | 7165/34278 [7:55:40<36:20:36, 4.83s/it] 21%|██ | 7166/34278 [7:55:43<32:40:49, 4.34s/it] {'loss': 0.168, 'grad_norm': 0.7159030178356711, 'learning_rate': 9.182681254772327e-06, 'epoch': 0.21} 21%|██ | 7166/34278 [7:55:43<32:40:49, 4.34s/it] 21%|██ | 7167/34278 [7:55:47<31:34:11, 4.19s/it] {'loss': 0.1709, 'grad_norm': 0.7549284021070887, 'learning_rate': 9.182422383956036e-06, 'epoch': 0.21} 21%|██ | 7167/34278 [7:55:47<31:34:11, 4.19s/it]Token 
indices sequence length is longer than the specified maximum sequence length for this model (10763 > 8192). Running this sequence through the model will result in indexing errors 21%|██ | 7168/34278 [7:55:51<30:02:12, 3.99s/it] {'loss': 0.1554, 'grad_norm': 0.94434557018474, 'learning_rate': 9.182163475800077e-06, 'epoch': 0.21} 21%|██ | 7168/34278 [7:55:51<30:02:12, 3.99s/it] 21%|██ | 7169/34278 [7:55:53<27:22:07, 3.63s/it] {'loss': 0.1729, 'grad_norm': 0.8347875207314545, 'learning_rate': 9.181904530306757e-06, 'epoch': 0.21} 21%|██ | 7169/34278 [7:55:53<27:22:07, 3.63s/it] 21%|██ | 7170/34278 [7:55:57<26:57:25, 3.58s/it] {'loss': 0.1366, 'grad_norm': 0.6783003229252659, 'learning_rate': 9.181645547478395e-06, 'epoch': 0.21} 21%|██ | 7170/34278 [7:55:57<26:57:25, 3.58s/it] 21%|██ | 7171/34278 [7:56:00<26:44:58, 3.55s/it] {'loss': 0.165, 'grad_norm': 1.1050779346558968, 'learning_rate': 9.1813865273173e-06, 'epoch': 0.21} 21%|██ | 7171/34278 [7:56:00<26:44:58, 3.55s/it] 21%|██ | 7172/34278 [7:56:04<27:03:16, 3.59s/it] {'loss': 0.1742, 'grad_norm': 0.8185826575869648, 'learning_rate': 9.181127469825784e-06, 'epoch': 0.21} 21%|██ | 7172/34278 [7:56:04<27:03:16, 3.59s/it] 21%|██ | 7173/34278 [7:56:07<26:39:48, 3.54s/it] {'loss': 0.1655, 'grad_norm': 1.0439897072743844, 'learning_rate': 9.180868375006158e-06, 'epoch': 0.21} 21%|██ | 7173/34278 [7:56:07<26:39:48, 3.54s/it] 21%|██ | 7174/34278 [7:56:10<25:18:01, 3.36s/it] {'loss': 0.1684, 'grad_norm': 1.0412342674908153, 'learning_rate': 9.180609242860739e-06, 'epoch': 0.21} 21%|██ | 7174/34278 [7:56:10<25:18:01, 3.36s/it] 21%|██ | 7175/34278 [7:56:13<24:39:15, 3.27s/it] {'loss': 0.1593, 'grad_norm': 0.8897105234040097, 'learning_rate': 9.180350073391838e-06, 'epoch': 0.21} 21%|██ | 7175/34278 [7:56:13<24:39:15, 3.27s/it] 21%|██ | 7176/34278 [7:56:17<25:06:22, 3.33s/it] {'loss': 0.1723, 'grad_norm': 0.8714788467791885, 'learning_rate': 9.18009086660177e-06, 'epoch': 0.21} 21%|██ | 7176/34278 [7:56:17<25:06:22, 3.33s/it] 
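The tokenizer warning interleaved a few lines above ("Token indices sequence length is longer than the specified maximum sequence length for this model (10763 > 8192)") means one sample's token ids exceed the model's context window; positions past the limit would index beyond the positional-embedding table, which is the "indexing errors" the warning refers to. A minimal sketch of the usual guard, truncating over-length sequences before they reach the model — the 8192 limit comes from the warning, while the function name and shape are illustrative, not the actual aguvis preprocessing code:

```python
MAX_SEQ_LEN = 8192  # model context window, per the tokenizer warning in the log


def clamp_to_context(input_ids, max_len=MAX_SEQ_LEN):
    """Truncate a token-id sequence to the model's maximum length.

    Ids past max_len would map to positions the model has no embeddings
    for; clamping here avoids an out-of-range lookup inside the model.
    Illustrative sketch only.
    """
    if len(input_ids) <= max_len:
        return input_ids
    return input_ids[:max_len]
```

In practice, one would apply this (or drop the sample outright) inside the dataset's preprocessing step, so the decision is made once per sample rather than inside the forward pass.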
21%|██ | 7177/34278 [7:56:20<24:15:35, 3.22s/it] {'loss': 0.135, 'grad_norm': 0.7005511708820562, 'learning_rate': 9.179831622492847e-06, 'epoch': 0.21} 21%|██ | 7177/34278 [7:56:20<24:15:35, 3.22s/it] 21%|██ | 7178/34278 [7:56:23<25:04:22, 3.33s/it] {'loss': 0.149, 'grad_norm': 1.2540627711913974, 'learning_rate': 9.179572341067387e-06, 'epoch': 0.21} 21%|██ | 7178/34278 [7:56:23<25:04:22, 3.33s/it] 21%|██ | 7179/34278 [7:56:29<30:58:52, 4.12s/it] {'loss': 0.1518, 'grad_norm': 0.7243322612171238, 'learning_rate': 9.179313022327703e-06, 'epoch': 0.21} 21%|██ | 7179/34278 [7:56:29<30:58:52, 4.12s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7180/34278 [7:56:32<28:24:56, 3.78s/it] {'loss': 0.1772, 'grad_norm': 0.8735805655729467, 'learning_rate': 9.17905366627611e-06, 'epoch': 0.21} 21%|██ | 7180/34278 [7:56:32<28:24:56, 3.78s/it] 21%|██ | 7181/34278 [7:56:36<27:37:34, 3.67s/it] {'loss': 0.16, 'grad_norm': 1.2367216747115835, 'learning_rate': 9.178794272914921e-06, 'epoch': 0.21} 21%|██ | 7181/34278 [7:56:36<27:37:34, 3.67s/it] 21%|██ | 7182/34278 [7:56:40<29:36:10, 3.93s/it] {'loss': 0.1558, 'grad_norm': 0.7919243540922308, 'learning_rate': 9.178534842246457e-06, 'epoch': 0.21} 21%|██ | 7182/34278 [7:56:40<29:36:10, 3.93s/it] 21%|██ | 7183/34278 [7:56:43<27:26:38, 3.65s/it] {'loss': 0.1633, 'grad_norm': 0.6764322676637853, 'learning_rate': 9.17827537427303e-06, 'epoch': 0.21} 21%|██ | 7183/34278 [7:56:43<27:26:38, 3.65s/it] 21%|██ | 7184/34278 [7:56:49<32:55:23, 4.37s/it] {'loss': 0.169, 'grad_norm': 0.8948139087747934, 'learning_rate': 9.178015868996959e-06, 'epoch': 0.21} 21%|██ | 7184/34278 [7:56:49<32:55:23, 4.37s/it] 21%|██ | 7185/34278 [7:56:53<30:15:36, 4.02s/it] {'loss': 0.1812, 'grad_norm': 0.9265613587274324, 'learning_rate': 9.17775632642056e-06, 'epoch': 0.21} 21%|██ | 
7185/34278 [7:56:53<30:15:36, 4.02s/it] 21%|██ | 7186/34278 [7:56:59<34:54:45, 4.64s/it] {'loss': 0.1568, 'grad_norm': 0.8292242281181695, 'learning_rate': 9.177496746546148e-06, 'epoch': 0.21} 21%|██ | 7186/34278 [7:56:59<34:54:45, 4.64s/it] 21%|██ | 7187/34278 [7:57:02<31:43:24, 4.22s/it] {'loss': 0.1551, 'grad_norm': 0.7472986987443127, 'learning_rate': 9.177237129376043e-06, 'epoch': 0.21} 21%|██ | 7187/34278 [7:57:02<31:43:24, 4.22s/it] 21%|██ | 7188/34278 [7:57:05<28:42:11, 3.81s/it] {'loss': 0.181, 'grad_norm': 0.9179470707908803, 'learning_rate': 9.176977474912563e-06, 'epoch': 0.21} 21%|██ | 7188/34278 [7:57:05<28:42:11, 3.81s/it] 21%|██ | 7189/34278 [7:57:08<26:25:56, 3.51s/it] {'loss': 0.1752, 'grad_norm': 0.9507366317154728, 'learning_rate': 9.176717783158023e-06, 'epoch': 0.21} 21%|██ | 7189/34278 [7:57:08<26:25:56, 3.51s/it] 21%|██ | 7190/34278 [7:57:11<25:33:11, 3.40s/it] {'loss': 0.1425, 'grad_norm': 0.7908172868930405, 'learning_rate': 9.176458054114746e-06, 'epoch': 0.21} 21%|██ | 7190/34278 [7:57:11<25:33:11, 3.40s/it] 21%|██ | 7191/34278 [7:57:14<25:02:27, 3.33s/it] {'loss': 0.1714, 'grad_norm': 0.9594138180980587, 'learning_rate': 9.176198287785048e-06, 'epoch': 0.21} 21%|██ | 7191/34278 [7:57:14<25:02:27, 3.33s/it] 21%|██ | 7192/34278 [7:57:18<25:44:29, 3.42s/it] {'loss': 0.1654, 'grad_norm': 0.868901899928297, 'learning_rate': 9.175938484171248e-06, 'epoch': 0.21} 21%|██ | 7192/34278 [7:57:18<25:44:29, 3.42s/it] 21%|██ | 7193/34278 [7:57:24<31:45:36, 4.22s/it] {'loss': 0.1818, 'grad_norm': 0.8398311088406083, 'learning_rate': 9.175678643275668e-06, 'epoch': 0.21} 21%|██ | 7193/34278 [7:57:24<31:45:36, 4.22s/it] 21%|██ | 7194/34278 [7:57:28<31:41:31, 4.21s/it] {'loss': 0.1681, 'grad_norm': 0.8050506810692823, 'learning_rate': 9.175418765100624e-06, 'epoch': 0.21} 21%|██ | 7194/34278 [7:57:28<31:41:31, 4.21s/it] 21%|██ | 7195/34278 [7:57:31<29:44:48, 3.95s/it] {'loss': 0.1586, 'grad_norm': 0.8963428993314191, 'learning_rate': 
9.175158849648438e-06, 'epoch': 0.21} 21%|██ | 7195/34278 [7:57:31<29:44:48, 3.95s/it] 21%|██ | 7196/34278 [7:57:34<27:46:38, 3.69s/it] {'loss': 0.1618, 'grad_norm': 0.8480606598967553, 'learning_rate': 9.17489889692143e-06, 'epoch': 0.21} 21%|██ | 7196/34278 [7:57:34<27:46:38, 3.69s/it] 21%|██ | 7197/34278 [7:57:38<27:14:57, 3.62s/it] {'loss': 0.1765, 'grad_norm': 0.7181614838963151, 'learning_rate': 9.174638906921921e-06, 'epoch': 0.21} 21%|██ | 7197/34278 [7:57:38<27:14:57, 3.62s/it] 21%|██ | 7198/34278 [7:57:41<25:37:37, 3.41s/it] {'loss': 0.1534, 'grad_norm': 0.850916230145168, 'learning_rate': 9.174378879652235e-06, 'epoch': 0.21} 21%|██ | 7198/34278 [7:57:41<25:37:37, 3.41s/it] 21%|██ | 7199/34278 [7:57:43<24:19:59, 3.23s/it] {'loss': 0.1849, 'grad_norm': 0.9370392528764342, 'learning_rate': 9.17411881511469e-06, 'epoch': 0.21} 21%|██ | 7199/34278 [7:57:43<24:19:59, 3.23s/it] 21%|██ | 7200/34278 [7:57:46<23:21:45, 3.11s/it] {'loss': 0.1609, 'grad_norm': 0.9624771284835438, 'learning_rate': 9.173858713311606e-06, 'epoch': 0.21} 21%|██ | 7200/34278 [7:57:46<23:21:45, 3.11s/it] 21%|██ | 7201/34278 [7:57:52<29:54:33, 3.98s/it] {'loss': 0.1477, 'grad_norm': 0.7370014896462503, 'learning_rate': 9.17359857424531e-06, 'epoch': 0.21} 21%|██ | 7201/34278 [7:57:52<29:54:33, 3.98s/it] 21%|██ | 7202/34278 [7:57:58<34:28:17, 4.58s/it] {'loss': 0.1738, 'grad_norm': 1.030148428794689, 'learning_rate': 9.173338397918123e-06, 'epoch': 0.21} 21%|██ | 7202/34278 [7:57:58<34:28:17, 4.58s/it] 21%|██ | 7203/34278 [7:58:01<30:52:38, 4.11s/it] {'loss': 0.1721, 'grad_norm': 0.9207997100627249, 'learning_rate': 9.173078184332366e-06, 'epoch': 0.21} 21%|██ | 7203/34278 [7:58:01<30:52:38, 4.11s/it] 21%|██ | 7204/34278 [7:58:05<31:07:08, 4.14s/it] {'loss': 0.1439, 'grad_norm': 0.7433046817468101, 'learning_rate': 9.172817933490364e-06, 'epoch': 0.21} 21%|██ | 7204/34278 [7:58:05<31:07:08, 4.14s/it] 21%|██ | 7205/34278 [7:58:08<28:31:25, 3.79s/it] {'loss': 0.1634, 'grad_norm': 
1.0273490729952162, 'learning_rate': 9.172557645394438e-06, 'epoch': 0.21} 21%|██ | 7205/34278 [7:58:08<28:31:25, 3.79s/it] 21%|██ | 7206/34278 [7:58:11<26:21:25, 3.50s/it] {'loss': 0.1654, 'grad_norm': 0.9397161896137732, 'learning_rate': 9.172297320046915e-06, 'epoch': 0.21} 21%|██ | 7206/34278 [7:58:11<26:21:25, 3.50s/it] 21%|██ | 7207/34278 [7:58:15<26:19:07, 3.50s/it] {'loss': 0.1482, 'grad_norm': 0.8099268542749183, 'learning_rate': 9.172036957450116e-06, 'epoch': 0.21} 21%|██ | 7207/34278 [7:58:15<26:19:07, 3.50s/it] 21%|██ | 7208/34278 [7:58:20<30:24:24, 4.04s/it] {'loss': 0.1539, 'grad_norm': 0.8430220199734931, 'learning_rate': 9.171776557606368e-06, 'epoch': 0.21} 21%|██ | 7208/34278 [7:58:20<30:24:24, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7209/34278 [7:58:24<29:05:37, 3.87s/it] {'loss': 0.2128, 'grad_norm': 1.2139852278920045, 'learning_rate': 9.171516120517993e-06, 'epoch': 0.21} 21%|██ | 7209/34278 [7:58:24<29:05:37, 3.87s/it] 21%|██ | 7210/34278 [7:58:29<33:11:06, 4.41s/it] {'loss': 0.1805, 'grad_norm': 0.9566087572947806, 'learning_rate': 9.17125564618732e-06, 'epoch': 0.21} 21%|██ | 7210/34278 [7:58:29<33:11:06, 4.41s/it] 21%|██ | 7211/34278 [7:58:34<33:07:23, 4.41s/it] {'loss': 0.1394, 'grad_norm': 0.7991018451759162, 'learning_rate': 9.170995134616673e-06, 'epoch': 0.21} 21%|██ | 7211/34278 [7:58:34<33:07:23, 4.41s/it] 21%|██ | 7212/34278 [7:58:37<30:40:47, 4.08s/it] {'loss': 0.1541, 'grad_norm': 0.9455057662208112, 'learning_rate': 9.170734585808376e-06, 'epoch': 0.21} 21%|██ | 7212/34278 [7:58:37<30:40:47, 4.08s/it] 21%|██ | 7213/34278 [7:58:43<34:25:24, 4.58s/it] {'loss': 0.184, 'grad_norm': 0.8655907379066675, 'learning_rate': 9.170473999764755e-06, 'epoch': 0.21} 21%|██ | 7213/34278 [7:58:43<34:25:24, 4.58s/it] 21%|██ | 7214/34278 
[7:58:46<30:59:49, 4.12s/it] {'loss': 0.1639, 'grad_norm': 0.8199372371606958, 'learning_rate': 9.17021337648814e-06, 'epoch': 0.21} 21%|██ | 7214/34278 [7:58:46<30:59:49, 4.12s/it] 21%|██ | 7215/34278 [7:58:49<28:50:27, 3.84s/it] {'loss': 0.143, 'grad_norm': 0.6649741533327053, 'learning_rate': 9.169952715980854e-06, 'epoch': 0.21} 21%|██ | 7215/34278 [7:58:49<28:50:27, 3.84s/it] 21%|██ | 7216/34278 [7:58:53<28:37:58, 3.81s/it] {'loss': 0.191, 'grad_norm': 0.9405125419912356, 'learning_rate': 9.169692018245226e-06, 'epoch': 0.21} 21%|██ | 7216/34278 [7:58:53<28:37:58, 3.81s/it] 21%|██ | 7217/34278 [7:58:57<29:33:56, 3.93s/it] {'loss': 0.1738, 'grad_norm': 0.8292947727476876, 'learning_rate': 9.169431283283583e-06, 'epoch': 0.21} 21%|██ | 7217/34278 [7:58:57<29:33:56, 3.93s/it] 21%|██ | 7218/34278 [7:59:00<27:06:49, 3.61s/it] {'loss': 0.1563, 'grad_norm': 0.7121011638614227, 'learning_rate': 9.169170511098254e-06, 'epoch': 0.21} 21%|██ | 7218/34278 [7:59:00<27:06:49, 3.61s/it] 21%|██ | 7219/34278 [7:59:03<26:54:22, 3.58s/it] {'loss': 0.1494, 'grad_norm': 0.846206531882245, 'learning_rate': 9.168909701691564e-06, 'epoch': 0.21} 21%|██ | 7219/34278 [7:59:03<26:54:22, 3.58s/it] 21%|██ | 7220/34278 [7:59:06<25:32:12, 3.40s/it] {'loss': 0.1695, 'grad_norm': 0.9946974153910175, 'learning_rate': 9.168648855065844e-06, 'epoch': 0.21} 21%|██ | 7220/34278 [7:59:06<25:32:12, 3.40s/it] 21%|██ | 7221/34278 [7:59:10<26:15:27, 3.49s/it] {'loss': 0.1745, 'grad_norm': 0.7830084892918774, 'learning_rate': 9.168387971223422e-06, 'epoch': 0.21} 21%|██ | 7221/34278 [7:59:10<26:15:27, 3.49s/it] 21%|██ | 7222/34278 [7:59:16<32:48:29, 4.37s/it] {'loss': 0.1511, 'grad_norm': 0.8729232446653769, 'learning_rate': 9.16812705016663e-06, 'epoch': 0.21} 21%|██ | 7222/34278 [7:59:16<32:48:29, 4.37s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 21%|██ | 7223/34278 [7:59:20<30:24:53, 4.05s/it] {'loss': 0.1821, 'grad_norm': 1.106811527660954, 'learning_rate': 9.16786609189779e-06, 'epoch': 0.21} 21%|██ | 7223/34278 [7:59:20<30:24:53, 4.05s/it] 21%|██ | 7224/34278 [7:59:23<28:39:44, 3.81s/it] {'loss': 0.1542, 'grad_norm': 0.8419729335873963, 'learning_rate': 9.167605096419238e-06, 'epoch': 0.21} 21%|██ | 7224/34278 [7:59:23<28:39:44, 3.81s/it] 21%|██ | 7225/34278 [7:59:26<27:30:32, 3.66s/it] {'loss': 0.1516, 'grad_norm': 0.9634351687740887, 'learning_rate': 9.167344063733305e-06, 'epoch': 0.21} 21%|██ | 7225/34278 [7:59:26<27:30:32, 3.66s/it] 21%|██ | 7226/34278 [7:59:29<26:02:50, 3.47s/it] {'loss': 0.1505, 'grad_norm': 0.7983317844393885, 'learning_rate': 9.167082993842317e-06, 'epoch': 0.21} 21%|██ | 7226/34278 [7:59:29<26:02:50, 3.47s/it] 21%|██ | 7227/34278 [7:59:32<25:21:40, 3.38s/it] {'loss': 0.1335, 'grad_norm': 0.7538471682521021, 'learning_rate': 9.166821886748607e-06, 'epoch': 0.21} 21%|██ | 7227/34278 [7:59:32<25:21:40, 3.38s/it] 21%|██ | 7228/34278 [7:59:36<26:43:09, 3.56s/it] {'loss': 0.1605, 'grad_norm': 0.9316829504194767, 'learning_rate': 9.166560742454507e-06, 'epoch': 0.21} 21%|██ | 7228/34278 [7:59:36<26:43:09, 3.56s/it] 21%|██ | 7229/34278 [7:59:40<27:13:53, 3.62s/it] {'loss': 0.19, 'grad_norm': 0.8741634697097257, 'learning_rate': 9.166299560962346e-06, 'epoch': 0.21} 21%|██ | 7229/34278 [7:59:40<27:13:53, 3.62s/it] 21%|██ | 7230/34278 [7:59:45<29:08:02, 3.88s/it] {'loss': 0.1578, 'grad_norm': 0.8083318365286651, 'learning_rate': 9.166038342274458e-06, 'epoch': 0.21} 21%|██ | 7230/34278 [7:59:45<29:08:02, 3.88s/it] 21%|██ | 7231/34278 [7:59:48<27:20:14, 3.64s/it] {'loss': 0.1791, 'grad_norm': 0.8987026413554663, 'learning_rate': 9.165777086393173e-06, 'epoch': 0.21} 21%|██ | 7231/34278 [7:59:48<27:20:14, 3.64s/it] 21%|██ | 7232/34278 [7:59:51<25:54:46, 3.45s/it] {'loss': 0.1622, 'grad_norm': 0.9635897845510202, 'learning_rate': 9.165515793320824e-06, 
'epoch': 0.21} 21%|██ | 7232/34278 [7:59:51<25:54:46, 3.45s/it] 21%|██ | 7233/34278 [7:59:54<25:23:25, 3.38s/it] {'loss': 0.1745, 'grad_norm': 0.9937506673765454, 'learning_rate': 9.165254463059747e-06, 'epoch': 0.21} 21%|██ | 7233/34278 [7:59:54<25:23:25, 3.38s/it] 21%|██ | 7234/34278 [7:59:57<24:29:35, 3.26s/it] {'loss': 0.17, 'grad_norm': 0.8573155118927874, 'learning_rate': 9.164993095612271e-06, 'epoch': 0.21} 21%|██ | 7234/34278 [7:59:57<24:29:35, 3.26s/it] 21%|██ | 7235/34278 [8:00:01<25:30:25, 3.40s/it] {'loss': 0.1857, 'grad_norm': 1.0406524258434822, 'learning_rate': 9.164731690980732e-06, 'epoch': 0.21} 21%|██ | 7235/34278 [8:00:01<25:30:25, 3.40s/it] 21%|██ | 7236/34278 [8:00:07<31:44:28, 4.23s/it] {'loss': 0.1587, 'grad_norm': 0.8987827523359867, 'learning_rate': 9.16447024916746e-06, 'epoch': 0.21} 21%|██ | 7236/34278 [8:00:07<31:44:28, 4.23s/it] 21%|██ | 7237/34278 [8:00:10<29:07:31, 3.88s/it] {'loss': 0.1444, 'grad_norm': 0.6673609456296926, 'learning_rate': 9.164208770174795e-06, 'epoch': 0.21} 21%|██ | 7237/34278 [8:00:10<29:07:31, 3.88s/it] 21%|██ | 7238/34278 [8:00:14<29:08:56, 3.88s/it] {'loss': 0.1545, 'grad_norm': 0.859594804297381, 'learning_rate': 9.163947254005066e-06, 'epoch': 0.21} 21%|██ | 7238/34278 [8:00:14<29:08:56, 3.88s/it] 21%|██ | 7239/34278 [8:00:17<27:16:21, 3.63s/it] {'loss': 0.15, 'grad_norm': 0.7358956790046004, 'learning_rate': 9.163685700660611e-06, 'epoch': 0.21} 21%|██ | 7239/34278 [8:00:17<27:16:21, 3.63s/it] 21%|██ | 7240/34278 [8:00:20<27:00:13, 3.60s/it] {'loss': 0.1729, 'grad_norm': 0.7973373779002741, 'learning_rate': 9.163424110143763e-06, 'epoch': 0.21} 21%|██ | 7240/34278 [8:00:20<27:00:13, 3.60s/it] 21%|██ | 7241/34278 [8:00:24<26:12:41, 3.49s/it] {'loss': 0.1332, 'grad_norm': 0.7203175182166265, 'learning_rate': 9.16316248245686e-06, 'epoch': 0.21} 21%|██ | 7241/34278 [8:00:24<26:12:41, 3.49s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9617 > 8192). 
Running this sequence through the model will result in indexing errors 21%|██ | 7242/34278 [8:00:27<25:22:22, 3.38s/it] {'loss': 0.1525, 'grad_norm': 0.8740975631117702, 'learning_rate': 9.162900817602235e-06, 'epoch': 0.21} 21%|██ | 7242/34278 [8:00:27<25:22:22, 3.38s/it] 21%|██ | 7243/34278 [8:00:30<24:51:50, 3.31s/it] {'loss': 0.1545, 'grad_norm': 0.7848806097179016, 'learning_rate': 9.162639115582226e-06, 'epoch': 0.21} 21%|██ | 7243/34278 [8:00:30<24:51:50, 3.31s/it] 21%|██ | 7244/34278 [8:00:34<26:44:31, 3.56s/it] {'loss': 0.1618, 'grad_norm': 0.8666497284228482, 'learning_rate': 9.16237737639917e-06, 'epoch': 0.21} 21%|██ | 7244/34278 [8:00:34<26:44:31, 3.56s/it] 21%|██ | 7245/34278 [8:00:39<29:19:25, 3.91s/it] {'loss': 0.1736, 'grad_norm': 0.7492361144512795, 'learning_rate': 9.162115600055398e-06, 'epoch': 0.21} 21%|██ | 7245/34278 [8:00:39<29:19:25, 3.91s/it] 21%|██ | 7246/34278 [8:00:42<28:55:13, 3.85s/it] {'loss': 0.1597, 'grad_norm': 0.7576215749565726, 'learning_rate': 9.161853786553256e-06, 'epoch': 0.21} 21%|██ | 7246/34278 [8:00:42<28:55:13, 3.85s/it] 21%|██ | 7247/34278 [8:00:46<28:06:47, 3.74s/it] {'loss': 0.1486, 'grad_norm': 0.8438264160498878, 'learning_rate': 9.161591935895073e-06, 'epoch': 0.21} 21%|██ | 7247/34278 [8:00:46<28:06:47, 3.74s/it] 21%|██ | 7248/34278 [8:00:49<26:56:05, 3.59s/it] {'loss': 0.1666, 'grad_norm': 0.7609494232054379, 'learning_rate': 9.161330048083194e-06, 'epoch': 0.21} 21%|██ | 7248/34278 [8:00:49<26:56:05, 3.59s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 21%|██ | 7249/34278 [8:00:52<25:09:33, 3.35s/it] {'loss': 0.1615, 'grad_norm': 0.8909393589085635, 'learning_rate': 9.161068123119953e-06, 'epoch': 0.21} 21%|██ | 7249/34278 [8:00:52<25:09:33, 3.35s/it] 21%|██ | 7250/34278 [8:00:56<25:53:16, 3.45s/it] {'loss': 0.1446, 'grad_norm': 0.8393390508080685, 'learning_rate': 9.160806161007687e-06, 'epoch': 0.21} 21%|██ | 7250/34278 [8:00:56<25:53:16, 3.45s/it] 21%|██ | 7251/34278 [8:01:02<31:30:21, 4.20s/it] {'loss': 0.1617, 'grad_norm': 0.7310636219773928, 'learning_rate': 9.16054416174874e-06, 'epoch': 0.21} 21%|██ | 7251/34278 [8:01:02<31:30:21, 4.20s/it] 21%|██ | 7252/34278 [8:01:06<32:24:04, 4.32s/it] {'loss': 0.1734, 'grad_norm': 0.8136839895612361, 'learning_rate': 9.160282125345445e-06, 'epoch': 0.21} 21%|██ | 7252/34278 [8:01:06<32:24:04, 4.32s/it] 21%|██ | 7253/34278 [8:01:09<29:29:27, 3.93s/it] {'loss': 0.1512, 'grad_norm': 0.8147171789853841, 'learning_rate': 9.160020051800148e-06, 'epoch': 0.21} 21%|██ | 7253/34278 [8:01:09<29:29:27, 3.93s/it] 21%|██ | 7254/34278 [8:01:12<27:25:47, 3.65s/it] {'loss': 0.1528, 'grad_norm': 0.9635672195950745, 'learning_rate': 9.159757941115181e-06, 'epoch': 0.21} 21%|██ | 7254/34278 [8:01:12<27:25:47, 3.65s/it] 21%|██ | 7255/34278 [8:01:16<27:57:43, 3.73s/it] {'loss': 0.1742, 'grad_norm': 0.7038737130414471, 'learning_rate': 9.15949579329289e-06, 'epoch': 0.21} 21%|██ | 7255/34278 [8:01:16<27:57:43, 3.73s/it] 21%|██ | 7256/34278 [8:01:19<27:07:17, 3.61s/it] {'loss': 0.1661, 'grad_norm': 0.8915309327020401, 'learning_rate': 9.159233608335614e-06, 'epoch': 0.21} 21%|██ | 7256/34278 [8:01:19<27:07:17, 3.61s/it] 21%|██ | 7257/34278 [8:01:23<26:04:25, 3.47s/it] {'loss': 0.1285, 'grad_norm': 0.7964819668145793, 'learning_rate': 9.158971386245691e-06, 'epoch': 0.21} 21%|██ | 7257/34278 [8:01:23<26:04:25, 3.47s/it] 21%|██ | 7258/34278 [8:01:28<31:25:12, 4.19s/it] {'loss': 0.1775, 'grad_norm': 0.8556872958705937, 'learning_rate': 
9.158709127025468e-06, 'epoch': 0.21} 21%|██ | 7258/34278 [8:01:28<31:25:12, 4.19s/it] 21%|██ | 7259/34278 [8:01:32<29:00:22, 3.86s/it] {'loss': 0.1612, 'grad_norm': 0.7919850483184171, 'learning_rate': 9.15844683067728e-06, 'epoch': 0.21} 21%|██ | 7259/34278 [8:01:32<29:00:22, 3.86s/it] 21%|██ | 7260/34278 [8:01:34<26:29:41, 3.53s/it] {'loss': 0.1723, 'grad_norm': 0.8689093484287964, 'learning_rate': 9.15818449720347e-06, 'epoch': 0.21} 21%|██ | 7260/34278 [8:01:34<26:29:41, 3.53s/it] 21%|██ | 7261/34278 [8:01:40<32:00:17, 4.26s/it] {'loss': 0.1561, 'grad_norm': 1.0878408856516633, 'learning_rate': 9.157922126606385e-06, 'epoch': 0.21} 21%|██ | 7261/34278 [8:01:40<32:00:17, 4.26s/it] 21%|██ | 7262/34278 [8:01:43<29:08:38, 3.88s/it] {'loss': 0.1507, 'grad_norm': 0.7054973680891888, 'learning_rate': 9.157659718888362e-06, 'epoch': 0.21} 21%|██ | 7262/34278 [8:01:43<29:08:38, 3.88s/it] 21%|██ | 7263/34278 [8:01:48<31:47:00, 4.24s/it] {'loss': 0.1776, 'grad_norm': 0.7655988458983077, 'learning_rate': 9.157397274051745e-06, 'epoch': 0.21} 21%|██ | 7263/34278 [8:01:48<31:47:00, 4.24s/it] 21%|██ | 7264/34278 [8:01:52<31:35:04, 4.21s/it] {'loss': 0.2037, 'grad_norm': 0.8851712451319848, 'learning_rate': 9.157134792098878e-06, 'epoch': 0.21} 21%|██ | 7264/34278 [8:01:52<31:35:04, 4.21s/it] 21%|██ | 7265/34278 [8:01:59<35:51:28, 4.78s/it] {'loss': 0.1795, 'grad_norm': 0.9438857408147793, 'learning_rate': 9.156872273032104e-06, 'epoch': 0.21} 21%|██ | 7265/34278 [8:01:59<35:51:28, 4.78s/it] 21%|██ | 7266/34278 [8:02:04<38:28:22, 5.13s/it] {'loss': 0.1696, 'grad_norm': 0.9911627622933965, 'learning_rate': 9.156609716853768e-06, 'epoch': 0.21} 21%|██ | 7266/34278 [8:02:05<38:28:22, 5.13s/it] 21%|██ | 7267/34278 [8:02:11<40:41:28, 5.42s/it] {'loss': 0.1657, 'grad_norm': 0.8586388532504238, 'learning_rate': 9.156347123566211e-06, 'epoch': 0.21} 21%|██ | 7267/34278 [8:02:11<40:41:28, 5.42s/it] 21%|██ | 7268/34278 [8:02:14<35:20:42, 4.71s/it] {'loss': 0.1542, 'grad_norm': 
0.9447965772581178, 'learning_rate': 9.15608449317178e-06, 'epoch': 0.21} 21%|██ | 7268/34278 [8:02:14<35:20:42, 4.71s/it] 21%|██ | 7269/34278 [8:02:17<32:20:44, 4.31s/it] {'loss': 0.179, 'grad_norm': 1.0570953122206987, 'learning_rate': 9.15582182567282e-06, 'epoch': 0.21} 21%|██ | 7269/34278 [8:02:17<32:20:44, 4.31s/it] 21%|██ | 7270/34278 [8:02:22<34:55:34, 4.66s/it] {'loss': 0.1824, 'grad_norm': 0.9138467528532017, 'learning_rate': 9.155559121071673e-06, 'epoch': 0.21} 21%|██ | 7270/34278 [8:02:22<34:55:34, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7271/34278 [8:02:29<39:01:00, 5.20s/it] {'loss': 0.1544, 'grad_norm': 0.7993348248640859, 'learning_rate': 9.155296379370686e-06, 'epoch': 0.21} 21%|██ | 7271/34278 [8:02:29<39:01:00, 5.20s/it] 21%|██ | 7272/34278 [8:02:34<39:31:59, 5.27s/it] {'loss': 0.1529, 'grad_norm': 0.9384783663292866, 'learning_rate': 9.155033600572206e-06, 'epoch': 0.21} 21%|██ | 7272/34278 [8:02:34<39:31:59, 5.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 21%|██ | 7273/34278 [8:02:41<41:53:26, 5.58s/it] {'loss': 0.1603, 'grad_norm': 0.9040254271351434, 'learning_rate': 9.154770784678577e-06, 'epoch': 0.21} 21%|██ | 7273/34278 [8:02:41<41:53:26, 5.58s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 21%|██ | 7274/34278 [8:02:44<36:52:38, 4.92s/it] {'loss': 0.1622, 'grad_norm': 0.8428159003755334, 'learning_rate': 9.154507931692146e-06, 'epoch': 0.21} 21%|██ | 7274/34278 [8:02:44<36:52:38, 4.92s/it] 21%|██ | 7275/34278 [8:02:49<36:48:41, 4.91s/it] {'loss': 0.1672, 'grad_norm': 0.932883922513483, 'learning_rate': 9.154245041615262e-06, 'epoch': 0.21} 21%|██ | 7275/34278 [8:02:49<36:48:41, 4.91s/it] 21%|██ | 7276/34278 [8:02:53<33:54:27, 4.52s/it] {'loss': 0.1557, 'grad_norm': 0.6227746897426399, 'learning_rate': 9.153982114450268e-06, 'epoch': 0.21} 21%|██ | 7276/34278 [8:02:53<33:54:27, 4.52s/it] 21%|██ | 7277/34278 [8:02:55<30:19:36, 4.04s/it] {'loss': 0.1764, 'grad_norm': 0.9073164938194024, 'learning_rate': 9.153719150199513e-06, 'epoch': 0.21} 21%|██ | 7277/34278 [8:02:56<30:19:36, 4.04s/it] 21%|██ | 7278/34278 [8:03:01<34:43:03, 4.63s/it] {'loss': 0.1545, 'grad_norm': 1.0442867199960852, 'learning_rate': 9.153456148865347e-06, 'epoch': 0.21} 21%|██ | 7278/34278 [8:03:02<34:43:03, 4.63s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9318 > 8192). 
Running this sequence through the model will result in indexing errors 21%|██ | 7279/34278 [8:03:05<32:18:18, 4.31s/it] {'loss': 0.1575, 'grad_norm': 0.6690553638554411, 'learning_rate': 9.153193110450115e-06, 'epoch': 0.21} 21%|██ | 7279/34278 [8:03:05<32:18:18, 4.31s/it] 21%|██ | 7280/34278 [8:03:08<30:01:25, 4.00s/it] {'loss': 0.1572, 'grad_norm': 0.8853732465828301, 'learning_rate': 9.152930034956166e-06, 'epoch': 0.21} 21%|██ | 7280/34278 [8:03:08<30:01:25, 4.00s/it] 21%|██ | 7281/34278 [8:03:14<34:05:34, 4.55s/it] {'loss': 0.1563, 'grad_norm': 0.8426211020092063, 'learning_rate': 9.152666922385849e-06, 'epoch': 0.21} 21%|██ | 7281/34278 [8:03:14<34:05:34, 4.55s/it] 21%|██ | 7282/34278 [8:03:17<31:15:39, 4.17s/it] {'loss': 0.1511, 'grad_norm': 0.7788186702083829, 'learning_rate': 9.152403772741514e-06, 'epoch': 0.21} 21%|██ | 7282/34278 [8:03:17<31:15:39, 4.17s/it] 21%|██ | 7283/34278 [8:03:21<29:34:30, 3.94s/it] {'loss': 0.1744, 'grad_norm': 0.822945340131927, 'learning_rate': 9.152140586025509e-06, 'epoch': 0.21} 21%|██ | 7283/34278 [8:03:21<29:34:30, 3.94s/it] 21%|██ | 7284/34278 [8:03:25<30:42:48, 4.10s/it] {'loss': 0.1634, 'grad_norm': 0.9471169841068209, 'learning_rate': 9.151877362240182e-06, 'epoch': 0.21} 21%|██ | 7284/34278 [8:03:26<30:42:48, 4.10s/it] 21%|██▏ | 7285/34278 [8:03:32<35:34:42, 4.75s/it] {'loss': 0.1814, 'grad_norm': 0.8465634649866716, 'learning_rate': 9.151614101387886e-06, 'epoch': 0.21} 21%|██▏ | 7285/34278 [8:03:32<35:34:42, 4.75s/it] 21%|██▏ | 7286/34278 [8:03:37<37:44:40, 5.03s/it] {'loss': 0.1372, 'grad_norm': 1.2962259187476264, 'learning_rate': 9.151350803470971e-06, 'epoch': 0.21} 21%|██▏ | 7286/34278 [8:03:37<37:44:40, 5.03s/it] 21%|██▏ | 7287/34278 [8:03:41<34:16:51, 4.57s/it] {'loss': 0.1836, 'grad_norm': 0.9542451308170714, 'learning_rate': 9.151087468491788e-06, 'epoch': 0.21} 21%|██▏ | 7287/34278 [8:03:41<34:16:51, 4.57s/it] 21%|██▏ | 7288/34278 [8:03:44<32:03:42, 4.28s/it] {'loss': 0.1786, 'grad_norm': 
1.0098810004640373, 'learning_rate': 9.150824096452686e-06, 'epoch': 0.21} 21%|██▏ | 7288/34278 [8:03:44<32:03:42, 4.28s/it] 21%|██▏ | 7289/34278 [8:03:47<29:07:58, 3.89s/it] {'loss': 0.1796, 'grad_norm': 1.0573907358946364, 'learning_rate': 9.150560687356018e-06, 'epoch': 0.21} 21%|██▏ | 7289/34278 [8:03:47<29:07:58, 3.89s/it] 21%|██▏ | 7290/34278 [8:03:51<28:45:56, 3.84s/it] {'loss': 0.1736, 'grad_norm': 1.3555653612269074, 'learning_rate': 9.150297241204134e-06, 'epoch': 0.21} 21%|██▏ | 7290/34278 [8:03:51<28:45:56, 3.84s/it] 21%|██▏ | 7291/34278 [8:03:55<28:10:54, 3.76s/it] {'loss': 0.1935, 'grad_norm': 0.9332835706812773, 'learning_rate': 9.150033757999389e-06, 'epoch': 0.21} 21%|██▏ | 7291/34278 [8:03:55<28:10:54, 3.76s/it] 21%|██▏ | 7292/34278 [8:03:59<29:05:11, 3.88s/it] {'loss': 0.1537, 'grad_norm': 0.9551991891457929, 'learning_rate': 9.149770237744132e-06, 'epoch': 0.21} 21%|██▏ | 7292/34278 [8:03:59<29:05:11, 3.88s/it] 21%|██▏ | 7293/34278 [8:04:02<27:10:47, 3.63s/it] {'loss': 0.1569, 'grad_norm': 1.1512656443502813, 'learning_rate': 9.149506680440715e-06, 'epoch': 0.21} 21%|██▏ | 7293/34278 [8:04:02<27:10:47, 3.63s/it] 21%|██▏ | 7294/34278 [8:04:06<27:19:55, 3.65s/it] {'loss': 0.1679, 'grad_norm': 0.893945134974164, 'learning_rate': 9.149243086091495e-06, 'epoch': 0.21} 21%|██▏ | 7294/34278 [8:04:06<27:19:55, 3.65s/it] 21%|██▏ | 7295/34278 [8:04:09<27:25:07, 3.66s/it] {'loss': 0.1753, 'grad_norm': 0.9385221012246392, 'learning_rate': 9.148979454698824e-06, 'epoch': 0.21} 21%|██▏ | 7295/34278 [8:04:09<27:25:07, 3.66s/it] 21%|██▏ | 7296/34278 [8:04:13<26:50:37, 3.58s/it] {'loss': 0.1542, 'grad_norm': 0.8300356824294304, 'learning_rate': 9.148715786265054e-06, 'epoch': 0.21} 21%|██▏ | 7296/34278 [8:04:13<26:50:37, 3.58s/it] 21%|██▏ | 7297/34278 [8:04:19<32:23:08, 4.32s/it] {'loss': 0.1686, 'grad_norm': 0.857791008708655, 'learning_rate': 9.148452080792538e-06, 'epoch': 0.21} 21%|██▏ | 7297/34278 [8:04:19<32:23:08, 4.32s/it] 21%|██▏ | 7298/34278 
[8:04:22<30:44:25, 4.10s/it] {'loss': 0.1893, 'grad_norm': 0.7634129287782235, 'learning_rate': 9.148188338283635e-06, 'epoch': 0.21} 21%|██▏ | 7298/34278 [8:04:22<30:44:25, 4.10s/it] 21%|██▏ | 7299/34278 [8:04:25<28:18:05, 3.78s/it] {'loss': 0.1527, 'grad_norm': 0.7691221227717636, 'learning_rate': 9.147924558740694e-06, 'epoch': 0.21} 21%|██▏ | 7299/34278 [8:04:25<28:18:05, 3.78s/it] 21%|██▏ | 7300/34278 [8:04:31<32:25:19, 4.33s/it] {'loss': 0.1605, 'grad_norm': 0.8425605026794849, 'learning_rate': 9.147660742166075e-06, 'epoch': 0.21} 21%|██▏ | 7300/34278 [8:04:31<32:25:19, 4.33s/it] 21%|██▏ | 7301/34278 [8:04:37<35:49:47, 4.78s/it] {'loss': 0.1768, 'grad_norm': 0.9963815151871954, 'learning_rate': 9.14739688856213e-06, 'epoch': 0.21} 21%|██▏ | 7301/34278 [8:04:37<35:49:47, 4.78s/it] 21%|██▏ | 7302/34278 [8:04:40<31:31:44, 4.21s/it] {'loss': 0.1563, 'grad_norm': 0.8846975435104927, 'learning_rate': 9.147132997931216e-06, 'epoch': 0.21} 21%|██▏ | 7302/34278 [8:04:40<31:31:44, 4.21s/it] 21%|██▏ | 7303/34278 [8:04:42<28:36:06, 3.82s/it] {'loss': 0.169, 'grad_norm': 0.8220305453751654, 'learning_rate': 9.146869070275688e-06, 'epoch': 0.21} 21%|██▏ | 7303/34278 [8:04:43<28:36:06, 3.82s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11731 > 8192). Running this sequence through the model will result in indexing errors 21%|██▏ | 7304/34278 [8:04:46<27:16:50, 3.64s/it] {'loss': 0.1514, 'grad_norm': 0.7534101736131481, 'learning_rate': 9.146605105597904e-06, 'epoch': 0.21} 21%|██▏ | 7304/34278 [8:04:46<27:16:50, 3.64s/it] 21%|██▏ | 7305/34278 [8:04:50<28:58:14, 3.87s/it] {'loss': 0.1675, 'grad_norm': 1.0260436057186861, 'learning_rate': 9.146341103900219e-06, 'epoch': 0.21} 21%|██▏ | 7305/34278 [8:04:50<28:58:14, 3.87s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn(
7306/34278 [8:04:54<28:43:43, 3.83s/it] {'loss': 0.1613, 'grad_norm': 0.7428550466098834, 'learning_rate': 9.14607706518499e-06, 'epoch': 0.21}
7307/34278 [8:04:57<27:57:45, 3.73s/it] {'loss': 0.169, 'grad_norm': 0.8292004776562788, 'learning_rate': 9.145812989454576e-06, 'epoch': 0.21}
7308/34278 [8:05:00<26:19:14, 3.51s/it] {'loss': 0.1321, 'grad_norm': 0.8611035287697096, 'learning_rate': 9.145548876711332e-06, 'epoch': 0.21}
7309/34278 [8:05:04<25:27:19, 3.40s/it] {'loss': 0.1811, 'grad_norm': 0.7735265669200234, 'learning_rate': 9.145284726957618e-06, 'epoch': 0.21}
7310/34278 [8:05:09<31:10:53, 4.16s/it] {'loss': 0.1535, 'grad_norm': 0.757016431699202, 'learning_rate': 9.14502054019579e-06, 'epoch': 0.21}
7311/34278 [8:05:16<35:44:48, 4.77s/it] {'loss': 0.167, 'grad_norm': 0.7863331618232147, 'learning_rate': 9.14475631642821e-06, 'epoch': 0.21}
7312/34278 [8:05:22<38:32:50, 5.15s/it] {'loss': 0.1575, 'grad_norm': 0.8302171844504787, 'learning_rate': 9.144492055657234e-06, 'epoch': 0.21}
7313/34278 [8:05:25<34:46:17, 4.64s/it] {'loss': 0.1636, 'grad_norm': 0.9145860360312988, 'learning_rate': 9.144227757885222e-06, 'epoch': 0.21}
7314/34278 [8:05:28<31:21:33, 4.19s/it] {'loss': 0.1355, 'grad_norm': 1.0497092239261154, 'learning_rate': 9.143963423114534e-06, 'epoch': 0.21}
7315/34278 [8:05:32<29:27:06, 3.93s/it] {'loss': 0.1707, 'grad_norm': 0.924946311542116, 'learning_rate': 9.143699051347533e-06, 'epoch': 0.21}
7316/34278 [8:05:35<29:20:45, 3.92s/it] {'loss': 0.1656, 'grad_norm': 0.8193654544468313, 'learning_rate': 9.14343464258657e-06, 'epoch': 0.21}
7317/34278 [8:05:39<27:31:43, 3.68s/it] {'loss': 0.1508, 'grad_norm': 0.8506878470010961, 'learning_rate': 9.143170196834016e-06, 'epoch': 0.21}
7318/34278 [8:05:44<31:04:21, 4.15s/it] {'loss': 0.1316, 'grad_norm': 0.7402661330754645, 'learning_rate': 9.142905714092228e-06, 'epoch': 0.21}
7319/34278 [8:05:47<29:37:37, 3.96s/it] {'loss': 0.17, 'grad_norm': 0.8419823467258659, 'learning_rate': 9.142641194363565e-06, 'epoch': 0.21}
7320/34278 [8:05:51<28:10:17, 3.76s/it] {'loss': 0.1707, 'grad_norm': 0.9193798138031342, 'learning_rate': 9.142376637650389e-06, 'epoch': 0.21}
7321/34278 [8:05:54<26:12:28, 3.50s/it] {'loss': 0.1492, 'grad_norm': 0.7789488088692962, 'learning_rate': 9.142112043955065e-06, 'epoch': 0.21}
7322/34278 [8:05:57<25:31:23, 3.41s/it] {'loss': 0.1623, 'grad_norm': 0.8795139713047154, 'learning_rate': 9.141847413279955e-06, 'epoch': 0.21}
7323/34278 [8:06:00<24:50:16, 3.32s/it] {'loss': 0.1824, 'grad_norm': 0.848520747485592, 'learning_rate': 9.141582745627418e-06, 'epoch': 0.21}
7324/34278 [8:06:05<29:02:45, 3.88s/it] {'loss': 0.2026, 'grad_norm': 0.8085062744672866, 'learning_rate': 9.141318040999818e-06, 'epoch': 0.21}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
7325/34278 [8:06:08<27:24:33, 3.66s/it] {'loss': 0.1367, 'grad_norm': 1.1710800232932104, 'learning_rate': 9.14105329939952e-06, 'epoch': 0.21}
7326/34278 [8:06:13<30:38:46, 4.09s/it] {'loss': 0.1695, 'grad_norm': 0.8742825967810263, 'learning_rate': 9.140788520828887e-06, 'epoch': 0.21}
7327/34278 [8:06:17<30:00:26, 4.01s/it] {'loss': 0.159, 'grad_norm': 0.9654315392518187, 'learning_rate': 9.140523705290284e-06, 'epoch': 0.21}
7328/34278 [8:06:20<28:29:12, 3.81s/it] {'loss': 0.1342, 'grad_norm': 0.756589814868092, 'learning_rate': 9.140258852786073e-06, 'epoch': 0.21}
7329/34278 [8:06:24<27:41:21, 3.70s/it] {'loss': 0.1719, 'grad_norm': 0.838268804707351, 'learning_rate': 9.139993963318619e-06, 'epoch': 0.21}
7330/34278 [8:06:27<25:37:24, 3.42s/it] {'loss': 0.1564, 'grad_norm': 0.7408867817376356, 'learning_rate': 9.139729036890286e-06, 'epoch': 0.21}
7331/34278 [8:06:30<24:51:36, 3.32s/it] {'loss': 0.1464, 'grad_norm': 0.7836876194083487, 'learning_rate': 9.139464073503442e-06, 'epoch': 0.21}
7332/34278 [8:06:32<23:27:51, 3.13s/it] {'loss': 0.168, 'grad_norm': 0.8026603789293204, 'learning_rate': 9.13919907316045e-06, 'epoch': 0.21}
Token indices sequence length is longer than the specified maximum sequence length for this model (9896 > 8192). Running this sequence through the model will result in indexing errors
7333/34278 [8:06:36<25:00:25, 3.34s/it] {'loss': 0.1767, 'grad_norm': 1.1129692994111364, 'learning_rate': 9.138934035863676e-06, 'epoch': 0.21}
7334/34278 [8:06:39<24:42:36, 3.30s/it] {'loss': 0.1605, 'grad_norm': 0.9043989670577581, 'learning_rate': 9.138668961615489e-06, 'epoch': 0.21}
7335/34278 [8:06:43<25:19:13, 3.38s/it] {'loss': 0.1579, 'grad_norm': 0.761803105172563, 'learning_rate': 9.138403850418252e-06, 'epoch': 0.21}
7336/34278 [8:06:49<31:07:26, 4.16s/it] {'loss': 0.1919, 'grad_norm': 0.9006980494791728, 'learning_rate': 9.138138702274334e-06, 'epoch': 0.21}
7337/34278 [8:06:53<30:09:28, 4.03s/it] {'loss': 0.1497, 'grad_norm': 1.0927969594980038, 'learning_rate': 9.137873517186102e-06, 'epoch': 0.21}
7338/34278 [8:06:56<29:04:24, 3.89s/it] {'loss': 0.1471, 'grad_norm': 0.7630838897546133, 'learning_rate': 9.137608295155922e-06, 'epoch': 0.21}
7339/34278 [8:06:59<26:50:31, 3.59s/it] {'loss': 0.1565, 'grad_norm': 0.7360159065532706, 'learning_rate': 9.137343036186163e-06, 'epoch': 0.21}
7340/34278 [8:07:02<25:35:10, 3.42s/it] {'loss': 0.1567, 'grad_norm': 0.8045461629263935, 'learning_rate': 9.137077740279193e-06, 'epoch': 0.21}
7341/34278 [8:07:05<24:36:43, 3.29s/it] {'loss': 0.2215, 'grad_norm': 0.8868403361881234, 'learning_rate': 9.13681240743738e-06, 'epoch': 0.21}
7342/34278 [8:07:09<25:28:56, 3.41s/it] {'loss': 0.1478, 'grad_norm': 0.7991777175910662, 'learning_rate': 9.136547037663095e-06, 'epoch': 0.21}
7343/34278 [8:07:14<29:00:23, 3.88s/it] {'loss': 0.1701, 'grad_norm': 0.8704840435846436, 'learning_rate': 9.136281630958706e-06, 'epoch': 0.21}
7344/34278 [8:07:18<28:35:06, 3.82s/it] {'loss': 0.161, 'grad_norm': 0.7646456628845033, 'learning_rate': 9.13601618732658e-06, 'epoch': 0.21}
7345/34278 [8:07:24<33:41:34, 4.50s/it] {'loss': 0.1771, 'grad_norm': 0.8046625131504545, 'learning_rate': 9.135750706769089e-06, 'epoch': 0.21}
7346/34278 [8:07:26<29:58:31, 4.01s/it] {'loss': 0.1893, 'grad_norm': 0.8329365527287422, 'learning_rate': 9.135485189288604e-06, 'epoch': 0.21}
7347/34278 [8:07:30<27:44:47, 3.71s/it] {'loss': 0.1723, 'grad_norm': 0.8606383080312506, 'learning_rate': 9.135219634887493e-06, 'epoch': 0.21}
7348/34278 [8:07:33<26:39:59, 3.56s/it] {'loss': 0.1775, 'grad_norm': 0.8321246251854761, 'learning_rate': 9.134954043568131e-06, 'epoch': 0.21}
7349/34278 [8:07:37<28:52:29, 3.86s/it] {'loss': 0.156, 'grad_norm': 0.9277947646560082, 'learning_rate': 9.134688415332885e-06, 'epoch': 0.21}
7350/34278 [8:07:40<27:15:31, 3.64s/it] {'loss': 0.1367, 'grad_norm': 0.8517468199609406, 'learning_rate': 9.134422750184127e-06, 'epoch': 0.21}
7351/34278 [8:07:46<30:38:50, 4.10s/it] {'loss': 0.1661, 'grad_norm': 1.2604531790201514, 'learning_rate': 9.13415704812423e-06, 'epoch': 0.21}
7352/34278 [8:07:49<28:24:28, 3.80s/it] {'loss': 0.1515, 'grad_norm': 0.787396494273231, 'learning_rate': 9.133891309155565e-06, 'epoch': 0.21}
7353/34278 [8:07:52<27:33:23, 3.68s/it] {'loss': 0.1761, 'grad_norm': 1.0771416045998476, 'learning_rate': 9.133625533280505e-06, 'epoch': 0.21}
7354/34278 [8:07:56<27:34:53, 3.69s/it] {'loss': 0.1564, 'grad_norm': 0.8482116276404471, 'learning_rate': 9.133359720501425e-06, 'epoch': 0.21}
7355/34278 [8:08:00<28:53:14, 3.86s/it] {'loss': 0.1711, 'grad_norm': 0.9357043607950191, 'learning_rate': 9.133093870820695e-06, 'epoch': 0.21}
7356/34278 [8:08:04<28:15:32, 3.78s/it] {'loss': 0.1593, 'grad_norm': 0.8300983990280097, 'learning_rate': 9.132827984240691e-06, 'epoch': 0.21}
7357/34278 [8:08:07<26:14:28, 3.51s/it] {'loss': 0.1616, 'grad_norm': 0.725622974870402, 'learning_rate': 9.132562060763784e-06, 'epoch': 0.21}
7358/34278 [8:08:13<31:58:22, 4.28s/it] {'loss': 0.184, 'grad_norm': 0.9195976435212265, 'learning_rate': 9.13229610039235e-06, 'epoch': 0.21}
7359/34278 [8:08:16<30:08:15, 4.03s/it] {'loss': 0.2024, 'grad_norm': 0.8605882375424254, 'learning_rate': 9.132030103128762e-06, 'epoch': 0.21}
7360/34278 [8:08:19<27:44:25, 3.71s/it] {'loss': 0.1897, 'grad_norm': 0.907525777273798, 'learning_rate': 9.131764068975397e-06, 'epoch': 0.21}
7361/34278 [8:08:23<28:35:20, 3.82s/it] {'loss': 0.1466, 'grad_norm': 0.9433606626723489, 'learning_rate': 9.131497997934627e-06, 'epoch': 0.21}
7362/34278 [8:08:27<27:50:44, 3.72s/it] {'loss': 0.1554, 'grad_norm': 0.8210241521435729, 'learning_rate': 9.13123189000883e-06, 'epoch': 0.21}
7363/34278 [8:08:29<25:57:42, 3.47s/it] {'loss': 0.1662, 'grad_norm': 0.8116618100076691, 'learning_rate': 9.130965745200382e-06, 'epoch': 0.21}
7364/34278 [8:08:33<25:47:41, 3.45s/it] {'loss': 0.1887, 'grad_norm': 1.2190663889972422, 'learning_rate': 9.130699563511656e-06, 'epoch': 0.21}
7365/34278 [8:08:38<28:37:59, 3.83s/it] {'loss': 0.1604, 'grad_norm': 0.8565025067762257, 'learning_rate': 9.130433344945032e-06, 'epoch': 0.21}
7366/34278 [8:08:41<26:34:58, 3.56s/it] {'loss': 0.1591, 'grad_norm': 0.8834188656468229, 'learning_rate': 9.130167089502884e-06, 'epoch': 0.21}
7367/34278 [8:08:45<27:42:37, 3.71s/it] {'loss': 0.1381, 'grad_norm': 1.143879753765153, 'learning_rate': 9.12990079718759e-06, 'epoch': 0.21}
7368/34278 [8:08:48<27:20:33, 3.66s/it] {'loss': 0.155, 'grad_norm': 0.8172458099029768, 'learning_rate': 9.129634468001529e-06, 'epoch': 0.21}
7369/34278 [8:08:51<25:48:37, 3.45s/it] {'loss': 0.154, 'grad_norm': 0.8025807980795531, 'learning_rate': 9.129368101947076e-06, 'epoch': 0.21}
7370/34278 [8:08:54<24:34:21, 3.29s/it] {'loss': 0.1945, 'grad_norm': 0.974295012210877, 'learning_rate': 9.12910169902661e-06, 'epoch': 0.22}
7371/34278 [8:09:00<30:30:07, 4.08s/it] {'loss': 0.1682, 'grad_norm': 0.8587266874156988, 'learning_rate': 9.128835259242511e-06, 'epoch': 0.22}
7372/34278 [8:09:03<27:54:59, 3.74s/it] {'loss': 0.1767, 'grad_norm': 0.8151983296692307, 'learning_rate': 9.128568782597155e-06, 'epoch': 0.22}
7373/34278 [8:09:06<27:36:54, 3.70s/it] {'loss': 0.1704, 'grad_norm': 0.7577545340507429, 'learning_rate': 9.128302269092925e-06, 'epoch': 0.22}
7374/34278 [8:09:09<26:06:51, 3.49s/it] {'loss': 0.1669, 'grad_norm': 0.8399926028136064, 'learning_rate': 9.128035718732196e-06, 'epoch': 0.22}
7375/34278 [8:09:12<24:30:16, 3.28s/it] {'loss': 0.1634, 'grad_norm': 0.9750137504931491, 'learning_rate': 9.12776913151735e-06, 'epoch': 0.22}
7376/34278 [8:09:15<24:02:06, 3.22s/it] {'loss': 0.157, 'grad_norm': 0.8003817450825019, 'learning_rate': 9.127502507450765e-06, 'epoch': 0.22}
7377/34278 [8:09:18<23:23:53, 3.13s/it] {'loss': 0.1671, 'grad_norm': 0.8293215185825179, 'learning_rate': 9.127235846534826e-06, 'epoch': 0.22}
7378/34278 [8:09:21<22:43:55, 3.04s/it] {'loss': 0.1662, 'grad_norm': 1.05160228048288, 'learning_rate': 9.126969148771907e-06, 'epoch': 0.22}
7379/34278 [8:09:24<22:53:03, 3.06s/it] {'loss': 0.1695, 'grad_norm': 0.7691951627365682, 'learning_rate': 9.126702414164395e-06, 'epoch': 0.22}
7380/34278 [8:09:28<25:26:36, 3.41s/it] {'loss': 0.1517, 'grad_norm': 0.8939838121579913, 'learning_rate': 9.126435642714669e-06, 'epoch': 0.22}
7381/34278 [8:09:33<27:40:55, 3.71s/it] {'loss': 0.1707, 'grad_norm': 0.8513543627829114, 'learning_rate': 9.12616883442511e-06, 'epoch': 0.22}
7382/34278 [8:09:36<26:20:16, 3.53s/it] {'loss': 0.1994, 'grad_norm': 0.9092042406280318, 'learning_rate': 9.1259019892981e-06, 'epoch': 0.22}
7383/34278 [8:09:40<27:08:17, 3.63s/it] {'loss': 0.1679, 'grad_norm': 0.9000988792296255, 'learning_rate': 9.125635107336024e-06, 'epoch': 0.22}
7384/34278 [8:09:43<26:46:28, 3.58s/it] {'loss': 0.1758, 'grad_norm': 0.908123743882174, 'learning_rate': 9.125368188541262e-06, 'epoch': 0.22}
7385/34278 [8:09:46<25:40:31, 3.44s/it] {'loss': 0.176, 'grad_norm': 0.8767753044865089, 'learning_rate': 9.125101232916196e-06, 'epoch': 0.22}
7386/34278 [8:09:53<32:12:24, 4.31s/it] {'loss': 0.1538, 'grad_norm': 0.9149316587210717, 'learning_rate': 9.124834240463212e-06, 'epoch': 0.22}
7387/34278 [8:09:56<29:28:05, 3.95s/it] {'loss': 0.1675, 'grad_norm': 0.8840923678120068, 'learning_rate': 9.124567211184693e-06, 'epoch': 0.22}
7388/34278 [8:10:01<33:08:18, 4.44s/it] {'loss': 0.2106, 'grad_norm': 0.8313796305740393, 'learning_rate': 9.124300145083022e-06, 'epoch': 0.22}
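The tokenizer warning logged above ("Token indices sequence length is longer than the specified maximum sequence length for this model (9896 > 8192)") is advisory until the over-long sample is actually fed to the model. A minimal sketch of the usual mitigation, clipping token-id sequences before batching; the function name and the keep_tail policy are illustrative assumptions, not this repo's actual code:

```python
def truncate_token_ids(token_ids, max_len=8192, keep_tail=False):
    """Clip an over-long token-id sequence to max_len.

    keep_tail=True keeps the most recent tokens instead of the prefix,
    which is often the better choice for chat-style samples where the
    target answer sits at the end of the sequence.
    """
    if len(token_ids) <= max_len:
        return token_ids
    return token_ids[-max_len:] if keep_tail else token_ids[:max_len]
```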
7389/34278 [8:10:07<36:44:21, 4.92s/it] {'loss': 0.1649, 'grad_norm': 1.081647456332479, 'learning_rate': 9.124033042160583e-06, 'epoch': 0.22}
7390/34278 [8:10:12<34:55:51, 4.68s/it] {'loss': 0.1427, 'grad_norm': 0.7099284823286249, 'learning_rate': 9.123765902419764e-06, 'epoch': 0.22}
7391/34278 [8:10:15<31:58:59, 4.28s/it] {'loss': 0.1486, 'grad_norm': 0.8659710311845469, 'learning_rate': 9.123498725862946e-06, 'epoch': 0.22}
7392/34278 [8:10:18<28:12:24, 3.78s/it] {'loss': 0.1584, 'grad_norm': 0.8675062325411816, 'learning_rate': 9.123231512492513e-06, 'epoch': 0.22}
7393/34278 [8:10:21<26:55:18, 3.60s/it] {'loss': 0.1762, 'grad_norm': 0.8166116693541626, 'learning_rate': 9.122964262310858e-06, 'epoch': 0.22}
7394/34278 [8:10:27<32:10:03, 4.31s/it] {'loss': 0.1694, 'grad_norm': 0.8307253049564872, 'learning_rate': 9.12269697532036e-06, 'epoch': 0.22}
7395/34278 [8:10:30<29:39:21, 3.97s/it] {'loss': 0.1825, 'grad_norm': 0.835149841674723, 'learning_rate': 9.122429651523408e-06, 'epoch': 0.22}
7396/34278 [8:10:33<27:44:28, 3.72s/it] {'loss': 0.1541, 'grad_norm': 1.039023270171752, 'learning_rate': 9.122162290922387e-06, 'epoch': 0.22}
7397/34278 [8:10:36<26:31:51, 3.55s/it] {'loss': 0.1706, 'grad_norm': 1.2286048092311068, 'learning_rate': 9.121894893519688e-06, 'epoch': 0.22}
7398/34278 [8:10:42<32:14:57, 4.32s/it] {'loss': 0.1796, 'grad_norm': 0.8429100222118212, 'learning_rate': 9.121627459317693e-06, 'epoch': 0.22}
7399/34278 [8:10:46<30:20:25, 4.06s/it] {'loss': 0.1771, 'grad_norm': 0.911287718055003, 'learning_rate': 9.121359988318792e-06, 'epoch': 0.22}
7400/34278 [8:10:49<28:36:40, 3.83s/it] {'loss': 0.1553, 'grad_norm': 0.9175234071784473, 'learning_rate': 9.121092480525374e-06, 'epoch': 0.22}
7401/34278 [8:10:52<27:44:42, 3.72s/it] {'loss': 0.1682, 'grad_norm': 0.9441039377984193, 'learning_rate': 9.120824935939824e-06, 'epoch': 0.22}
7402/34278 [8:10:56<26:34:53, 3.56s/it] {'loss': 0.1871, 'grad_norm': 0.7742096126268675, 'learning_rate': 9.120557354564534e-06, 'epoch': 0.22}
7403/34278 [8:10:59<26:59:18, 3.62s/it] {'loss': 0.1873, 'grad_norm': 0.8067408667463365, 'learning_rate': 9.120289736401892e-06, 'epoch': 0.22}
7404/34278 [8:11:05<31:54:26, 4.27s/it] {'loss': 0.1776, 'grad_norm': 0.7300165533170915, 'learning_rate': 9.120022081454286e-06, 'epoch': 0.22}
7405/34278 [8:11:11<35:51:21, 4.80s/it] {'loss': 0.1571, 'grad_norm': 0.7454749963610728, 'learning_rate': 9.119754389724107e-06, 'epoch': 0.22}
7406/34278 [8:11:14<31:36:54, 4.24s/it] {'loss': 0.1824, 'grad_norm': 0.7374077050525888, 'learning_rate': 9.119486661213744e-06, 'epoch': 0.22}
7407/34278 [8:11:17<28:48:53, 3.86s/it] {'loss': 0.1446, 'grad_norm': 0.8220961724986665, 'learning_rate': 9.119218895925588e-06, 'epoch': 0.22}
7408/34278 [8:11:23<33:52:03, 4.54s/it] {'loss': 0.195, 'grad_norm': 0.7783355121923526, 'learning_rate': 9.118951093862028e-06, 'epoch': 0.22}
7409/34278 [8:11:27<31:25:12, 4.21s/it] {'loss': 0.1704, 'grad_norm': 0.7202058289041149, 'learning_rate': 9.118683255025457e-06, 'epoch': 0.22}
7410/34278 [8:11:32<34:42:53, 4.65s/it] {'loss': 0.1622, 'grad_norm': 0.8610059331317936, 'learning_rate': 9.118415379418265e-06, 'epoch': 0.22}
7411/34278 [8:11:36<32:13:45, 4.32s/it] {'loss': 0.1421, 'grad_norm': 0.7001027600457649, 'learning_rate': 9.118147467042844e-06, 'epoch': 0.22}
7412/34278 [8:11:41<34:25:09, 4.61s/it] {'loss': 0.1784, 'grad_norm': 0.8022121923333063, 'learning_rate': 9.117879517901584e-06, 'epoch': 0.22}
7413/34278 [8:11:45<32:41:36, 4.38s/it] {'loss': 0.1584, 'grad_norm': 0.9036733519026036, 'learning_rate': 9.11761153199688e-06, 'epoch': 0.22}
7414/34278 [8:11:51<36:52:18, 4.94s/it] {'loss': 0.1569, 'grad_norm': 0.7732901311394017, 'learning_rate': 9.117343509331122e-06, 'epoch': 0.22}
7415/34278 [8:11:55<34:00:18, 4.56s/it] {'loss': 0.1618, 'grad_norm': 0.8897167987673777, 'learning_rate': 9.117075449906704e-06, 'epoch': 0.22}
7416/34278 [8:11:58<30:12:41, 4.05s/it] {'loss': 0.1841, 'grad_norm': 0.9203266233694063, 'learning_rate': 9.11680735372602e-06, 'epoch': 0.22}
7417/34278 [8:12:02<29:59:26, 4.02s/it] {'loss': 0.167, 'grad_norm': 0.9771782649620638, 'learning_rate': 9.116539220791464e-06, 'epoch': 0.22}
7418/34278 [8:12:06<29:43:06, 3.98s/it] {'loss': 0.1596, 'grad_norm': 0.7923600665984015, 'learning_rate': 9.116271051105428e-06, 'epoch': 0.22}
7419/34278 [8:12:09<28:20:02, 3.80s/it] {'loss': 0.1747, 'grad_norm': 0.9232057168625178, 'learning_rate': 9.116002844670304e-06, 'epoch': 0.22}
7420/34278 [8:12:13<28:40:09, 3.84s/it] {'loss': 0.1449, 'grad_norm': 0.7897568343124842, 'learning_rate': 9.115734601488492e-06, 'epoch': 0.22}
7421/34278 [8:12:16<27:16:01, 3.65s/it] {'loss': 0.1414, 'grad_norm': 0.9717566598530014, 'learning_rate': 9.115466321562384e-06, 'epoch': 0.22}
7422/34278 [8:12:19<26:14:10, 3.52s/it] {'loss': 0.1689, 'grad_norm': 0.7384852427970718, 'learning_rate': 9.115198004894371e-06, 'epoch': 0.22}
7423/34278 [8:12:23<25:29:39, 3.42s/it] {'loss': 0.1844, 'grad_norm': 0.9349622934838513, 'learning_rate': 9.114929651486857e-06, 'epoch': 0.22}
7424/34278 [8:12:26<24:57:45, 3.35s/it] {'loss': 0.153, 'grad_norm': 1.1331587925570314, 'learning_rate': 9.114661261342232e-06, 'epoch': 0.22}
7425/34278 [8:12:29<24:06:21, 3.23s/it] {'loss': 0.1398, 'grad_norm': 0.8569517521357359, 'learning_rate': 9.114392834462895e-06, 'epoch': 0.22}
7426/34278 [8:12:33<26:41:34, 3.58s/it] {'loss': 0.1609, 'grad_norm': 1.0594142407397262, 'learning_rate': 9.114124370851238e-06, 'epoch': 0.22}
7427/34278 [8:12:36<25:24:47, 3.41s/it] {'loss': 0.1944, 'grad_norm': 0.9872499379799294, 'learning_rate': 9.113855870509664e-06, 'epoch': 0.22}
7428/34278 [8:12:39<24:20:12, 3.26s/it] {'loss': 0.16, 'grad_norm': 0.9348282425197145, 'learning_rate': 9.113587333440566e-06, 'epoch': 0.22}
7429/34278 [8:12:43<26:08:32, 3.51s/it] {'loss': 0.1529, 'grad_norm': 0.8385280054899792, 'learning_rate': 9.11331875964634e-06, 'epoch': 0.22}
7430/34278 [8:12:46<25:42:54, 3.45s/it] {'loss': 0.1605, 'grad_norm': 0.8559787857518748, 'learning_rate': 9.113050149129387e-06, 'epoch': 0.22}
7431/34278 [8:12:49<24:37:37, 3.30s/it] {'loss': 0.161, 'grad_norm': 0.8517007332578688, 'learning_rate': 9.112781501892105e-06, 'epoch': 0.22}
7432/34278 [8:12:53<24:32:39, 3.29s/it] {'loss': 0.1422, 'grad_norm': 1.1456108258528437, 'learning_rate': 9.112512817936892e-06, 'epoch': 0.22}
7433/34278 [8:12:59<31:25:23, 4.21s/it] {'loss': 0.1636, 'grad_norm': 0.9161175606348492, 'learning_rate': 9.112244097266144e-06, 'epoch': 0.22}
7434/34278 [8:13:02<29:08:21, 3.91s/it] {'loss': 0.1416, 'grad_norm': 1.0118412085103796, 'learning_rate': 9.111975339882265e-06, 'epoch': 0.22}
7435/34278 [8:13:05<26:38:51, 3.57s/it] {'loss': 0.156, 'grad_norm': 0.7363456828612303, 'learning_rate': 9.11170654578765e-06, 'epoch': 0.22}
7436/34278 [8:13:09<26:30:06, 3.55s/it] {'loss': 0.1583, 'grad_norm': 0.7820229076664653, 'learning_rate': 9.1114377149847e-06, 'epoch': 0.22}
7437/34278 [8:13:12<25:59:06, 3.49s/it] {'loss': 0.1769, 'grad_norm': 1.0942678168691158, 'learning_rate': 9.11116884747582e-06, 'epoch': 0.22}
7438/34278 [8:13:16<26:53:24, 3.61s/it] {'loss': 0.1871, 'grad_norm': 0.9447698383398426, 'learning_rate': 9.1108999432634e-06, 'epoch': 0.22}
7439/34278 [8:13:19<26:29:00, 3.55s/it] {'loss': 0.1936, 'grad_norm': 0.8157624572363903, 'learning_rate': 9.11063100234985e-06, 'epoch': 0.22}
7440/34278 [8:13:22<24:51:23, 3.33s/it] {'loss': 0.1633, 'grad_norm': 0.9421469363701053, 'learning_rate': 9.110362024737566e-06, 'epoch': 0.22}
7441/34278 [8:13:27<28:13:48, 3.79s/it] {'loss': 0.1627, 'grad_norm': 0.9607965558693335, 'learning_rate': 9.110093010428953e-06, 'epoch': 0.22}
7442/34278 [8:13:30<26:56:07, 3.61s/it] {'loss': 0.1966, 'grad_norm': 0.8354674897137242, 'learning_rate': 9.10982395942641e-06, 'epoch': 0.22}
7443/34278 [8:13:33<25:59:45, 3.49s/it] {'loss': 0.1484, 'grad_norm': 0.906809921024923, 'learning_rate': 9.10955487173234e-06, 'epoch': 0.22}
7444/34278 [8:13:37<25:40:52, 3.45s/it] {'loss': 0.1425, 'grad_norm': 0.980781509983675, 'learning_rate': 9.109285747349145e-06, 'epoch': 0.22}
7445/34278 [8:13:40<25:07:50, 3.37s/it] {'loss': 0.1559, 'grad_norm': 0.8914872052533028, 'learning_rate': 9.109016586279227e-06, 'epoch': 0.22}
7446/34278 [8:13:43<24:13:06, 3.25s/it] {'loss': 0.1585, 'grad_norm': 0.7714414266749681, 'learning_rate': 9.10874738852499e-06, 'epoch': 0.22}
7447/34278 [8:13:46<24:16:29, 3.26s/it] {'loss': 0.1649, 'grad_norm': 1.0786465918729728, 'learning_rate': 9.108478154088838e-06, 'epoch': 0.22}
7448/34278 [8:13:49<24:02:10, 3.23s/it] {'loss': 0.1651, 'grad_norm': 0.8838986337349987, 'learning_rate': 9.108208882973172e-06, 'epoch': 0.22}
7449/34278 [8:13:52<23:58:08, 3.22s/it] {'loss': 0.1799, 'grad_norm': 0.7998682490861097, 'learning_rate': 9.1079395751804e-06, 'epoch': 0.22}
7450/34278 [8:13:56<23:58:43, 3.22s/it] {'loss': 0.1488, 'grad_norm': 1.121077534441, 'learning_rate': 9.107670230712924e-06, 'epoch': 0.22}
7451/34278 [8:13:59<25:18:38, 3.40s/it] {'loss': 0.1773, 'grad_norm': 0.8781274845047359, 'learning_rate': 9.107400849573148e-06, 'epoch': 0.22}
7452/34278 [8:14:03<26:27:45, 3.55s/it] {'loss': 0.1399, 'grad_norm': 0.9108741953398324, 'learning_rate': 9.107131431763479e-06, 'epoch': 0.22}
7453/34278 [8:14:07<25:40:47, 3.45s/it] {'loss': 0.1823, 'grad_norm': 1.4747423636955517, 'learning_rate': 9.106861977286319e-06, 'epoch': 0.22}
7454/34278 [8:14:12<31:03:32, 4.17s/it] {'loss': 0.1478, 'grad_norm': 0.9139258343487516, 'learning_rate': 9.106592486144077e-06, 'epoch': 0.22}
7455/34278 [8:14:16<28:46:42, 3.86s/it] {'loss': 0.175, 'grad_norm': 0.9833107826241818, 'learning_rate': 9.106322958339156e-06, 'epoch': 0.22}
7456/34278 [8:14:20<29:21:23, 3.94s/it] {'loss': 0.1738, 'grad_norm': 0.9696490842829171, 'learning_rate': 9.106053393873965e-06, 'epoch': 0.22}
7457/34278 [8:14:23<27:41:21, 3.72s/it] {'loss': 0.1756, 'grad_norm': 1.2310078909985334, 'learning_rate': 9.105783792750909e-06, 'epoch': 0.22}
7458/34278 [8:14:27<27:34:05, 3.70s/it] {'loss': 0.1577, 'grad_norm': 0.7618525861111929, 'learning_rate': 9.105514154972397e-06, 'epoch': 0.22}
7459/34278 [8:14:29<25:33:54, 3.43s/it] {'loss': 0.1793, 'grad_norm': 0.9952546015463827, 'learning_rate': 9.105244480540833e-06, 'epoch': 0.22}
7460/34278 [8:14:33<25:42:39, 3.45s/it] {'loss': 0.1626, 'grad_norm': 0.9626963435260152, 'learning_rate': 9.104974769458626e-06, 'epoch': 0.22}
7461/34278 [8:14:36<25:39:06, 3.44s/it] {'loss': 0.1722, 'grad_norm': 0.9606094758968395, 'learning_rate': 9.104705021728185e-06, 'epoch': 0.22}
7462/34278 [8:14:43<32:33:00, 4.37s/it] {'loss': 0.1693, 'grad_norm': 0.9611724485693799, 'learning_rate': 9.104435237351918e-06, 'epoch': 0.22}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2acff7bbf0>
Failed to fetch sample 2898118. Exception: cannot identify image file <_io.BytesIO object at 0x7f2acff7bbf0>
7463/34278 [8:14:46<29:34:32, 3.97s/it] {'loss': 0.1714, 'grad_norm': 0.994146211050947, 'learning_rate': 9.104165416332232e-06, 'epoch': 0.22}
7464/34278 [8:14:50<29:38:15, 3.98s/it] {'loss': 0.1474, 'grad_norm': 0.7237656428923633, 'learning_rate': 9.103895558671538e-06, 'epoch': 0.22}
7465/34278 [8:14:53<27:08:57, 3.65s/it] {'loss': 0.1638, 'grad_norm': 0.853741497086034, 'learning_rate': 9.103625664372244e-06, 'epoch': 0.22}
7466/34278 [8:14:56<26:32:42, 3.56s/it] {'loss': 0.1592, 'grad_norm': 0.9604956953178158, 'learning_rate': 9.10335573343676e-06, 'epoch': 0.22}
7467/34278 [8:15:02<32:11:12, 4.32s/it] {'loss': 0.1483, 'grad_norm': 0.7211730382503233, 'learning_rate': 9.103085765867494e-06, 'epoch': 0.22}
7468/34278 [8:15:05<29:00:35, 3.90s/it] {'loss': 0.1737, 'grad_norm': 0.9003289263920882, 'learning_rate': 9.102815761666857e-06, 'epoch': 0.22}
7469/34278 [8:15:10<30:39:49, 4.12s/it] {'loss': 0.1767, 'grad_norm': 1.063103587280332, 'learning_rate': 9.102545720837264e-06, 'epoch': 0.22}
7470/34278 [8:15:13<28:56:02, 3.89s/it] {'loss': 0.1749, 'grad_norm': 0.9651305024078388, 'learning_rate': 9.102275643381118e-06, 'epoch': 0.22}
7471/34278 [8:15:18<31:28:54, 4.23s/it] {'loss': 0.1778, 'grad_norm': 0.7664733929527952, 'learning_rate': 9.102005529300837e-06, 'epoch': 0.22}
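The PIL.UnidentifiedImageError above was caught rather than fatal: the log prints "Failed to fetch sample 2898118. Exception: ..." and training continues, which suggests the dataset's __getitem__ falls back to another sample. A hedged sketch of that skip-and-retry pattern, with fetch_fn standing in for the real tcs_loader/pil_loader chain (names and retry policy are assumptions, not the repo's code):

```python
import random

def get_item_with_retry(index, fetch_fn, num_samples, max_retries=10):
    """Fetch one training sample; on a decode error (e.g. a corrupt
    image that PIL cannot identify), log it and retry with a randomly
    drawn replacement index instead of crashing the dataloader worker."""
    for _ in range(max_retries):
        try:
            return fetch_fn(index)
        except Exception as exc:
            print(f"Failed to fetch sample {index}. Exception: {exc}")
            index = random.randrange(num_samples)
    raise RuntimeError(f"gave up after {max_retries} corrupt samples")
```

The trade-off is silent data loss: each corrupt sample is replaced by a random one, so persistent failures are worth counting and alerting on rather than only printing.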
22%|██▏ | 7472/34278 [8:15:22<30:31:35, 4.10s/it] {'loss': 0.1598, 'grad_norm': 0.7476178142610409, 'learning_rate': 9.10173537859883e-06, 'epoch': 0.22} 22%|██▏ | 7472/34278 [8:15:22<30:31:35, 4.10s/it] 22%|██▏ | 7473/34278 [8:15:25<28:03:17, 3.77s/it] {'loss': 0.1597, 'grad_norm': 0.788871726907985, 'learning_rate': 9.101465191277507e-06, 'epoch': 0.22} 22%|██▏ | 7473/34278 [8:15:25<28:03:17, 3.77s/it] 22%|██▏ | 7474/34278 [8:15:28<26:22:56, 3.54s/it] {'loss': 0.1542, 'grad_norm': 0.8464570107689131, 'learning_rate': 9.101194967339284e-06, 'epoch': 0.22} 22%|██▏ | 7474/34278 [8:15:28<26:22:56, 3.54s/it] 22%|██▏ | 7475/34278 [8:15:31<25:48:57, 3.47s/it] {'loss': 0.1511, 'grad_norm': 0.9205452933907281, 'learning_rate': 9.100924706786568e-06, 'epoch': 0.22} 22%|██▏ | 7475/34278 [8:15:31<25:48:57, 3.47s/it] 22%|██▏ | 7476/34278 [8:15:34<24:57:16, 3.35s/it] {'loss': 0.1639, 'grad_norm': 0.7740632248814412, 'learning_rate': 9.100654409621779e-06, 'epoch': 0.22} 22%|██▏ | 7476/34278 [8:15:34<24:57:16, 3.35s/it] 22%|██▏ | 7477/34278 [8:15:37<24:42:44, 3.32s/it] {'loss': 0.1458, 'grad_norm': 0.6725205946707393, 'learning_rate': 9.100384075847324e-06, 'epoch': 0.22} 22%|██▏ | 7477/34278 [8:15:37<24:42:44, 3.32s/it] 22%|██▏ | 7478/34278 [8:15:42<27:54:03, 3.75s/it] {'loss': 0.1436, 'grad_norm': 0.8691746230253524, 'learning_rate': 9.10011370546562e-06, 'epoch': 0.22} 22%|██▏ | 7478/34278 [8:15:42<27:54:03, 3.75s/it] 22%|██▏ | 7479/34278 [8:15:46<26:59:21, 3.63s/it] {'loss': 0.2026, 'grad_norm': 0.8966021726089877, 'learning_rate': 9.099843298479079e-06, 'epoch': 0.22} 22%|██▏ | 7479/34278 [8:15:46<26:59:21, 3.63s/it] 22%|██▏ | 7480/34278 [8:15:52<32:11:45, 4.33s/it] {'loss': 0.1721, 'grad_norm': 1.525537760543073, 'learning_rate': 9.099572854890115e-06, 'epoch': 0.22} 22%|██▏ | 7480/34278 [8:15:52<32:11:45, 4.33s/it] 22%|██▏ | 7481/34278 [8:15:55<29:26:07, 3.95s/it] {'loss': 0.1698, 'grad_norm': 1.0278026389305943, 'learning_rate': 9.099302374701145e-06, 'epoch': 0.22} 
22%|██▏ | 7481/34278 [8:15:55<29:26:07, 3.95s/it] 22%|██▏ | 7482/34278 [8:15:59<29:51:43, 4.01s/it] {'loss': 0.1575, 'grad_norm': 0.810566816600142, 'learning_rate': 9.09903185791458e-06, 'epoch': 0.22} 22%|██▏ | 7482/34278 [8:15:59<29:51:43, 4.01s/it] 22%|██▏ | 7483/34278 [8:16:05<34:45:31, 4.67s/it] {'loss': 0.1806, 'grad_norm': 0.9454530453508962, 'learning_rate': 9.098761304532839e-06, 'epoch': 0.22} 22%|██▏ | 7483/34278 [8:16:05<34:45:31, 4.67s/it] 22%|██▏ | 7484/34278 [8:16:11<37:41:04, 5.06s/it] {'loss': 0.136, 'grad_norm': 0.6411644748981236, 'learning_rate': 9.098490714558335e-06, 'epoch': 0.22} 22%|██▏ | 7484/34278 [8:16:11<37:41:04, 5.06s/it] 22%|██▏ | 7485/34278 [8:16:14<33:02:23, 4.44s/it] {'loss': 0.1502, 'grad_norm': 0.9555150607957652, 'learning_rate': 9.098220087993484e-06, 'epoch': 0.22} 22%|██▏ | 7485/34278 [8:16:14<33:02:23, 4.44s/it] 22%|██▏ | 7486/34278 [8:16:18<32:37:21, 4.38s/it] {'loss': 0.1755, 'grad_norm': 0.7725450469261882, 'learning_rate': 9.0979494248407e-06, 'epoch': 0.22} 22%|██▏ | 7486/34278 [8:16:18<32:37:21, 4.38s/it] 22%|██▏ | 7487/34278 [8:16:24<36:21:38, 4.89s/it] {'loss': 0.1487, 'grad_norm': 0.7376261609439652, 'learning_rate': 9.097678725102406e-06, 'epoch': 0.22} 22%|██▏ | 7487/34278 [8:16:24<36:21:38, 4.89s/it] 22%|██▏ | 7488/34278 [8:16:27<31:49:44, 4.28s/it] {'loss': 0.158, 'grad_norm': 0.9009890967373166, 'learning_rate': 9.097407988781012e-06, 'epoch': 0.22} 22%|██▏ | 7488/34278 [8:16:27<31:49:44, 4.28s/it] 22%|██▏ | 7489/34278 [8:16:32<34:18:17, 4.61s/it] {'loss': 0.1578, 'grad_norm': 0.9162151433377764, 'learning_rate': 9.097137215878938e-06, 'epoch': 0.22} 22%|██▏ | 7489/34278 [8:16:32<34:18:17, 4.61s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 22%|██▏ | 7490/34278 [8:16:38<36:12:25, 4.87s/it] {'loss': 0.1824, 'grad_norm': 0.8071439964557588, 'learning_rate': 9.096866406398601e-06, 'epoch': 0.22} 22%|██▏ | 7490/34278 [8:16:38<36:12:25, 4.87s/it] 22%|██▏ | 7491/34278 [8:16:41<32:30:38, 4.37s/it] {'loss': 0.175, 'grad_norm': 0.8452596310026971, 'learning_rate': 9.096595560342418e-06, 'epoch': 0.22} 22%|██▏ | 7491/34278 [8:16:41<32:30:38, 4.37s/it] 22%|██▏ | 7492/34278 [8:16:44<28:59:37, 3.90s/it] {'loss': 0.1785, 'grad_norm': 0.9880919817564738, 'learning_rate': 9.09632467771281e-06, 'epoch': 0.22} 22%|██▏ | 7492/34278 [8:16:44<28:59:37, 3.90s/it] 22%|██▏ | 7493/34278 [8:16:47<27:21:52, 3.68s/it] {'loss': 0.1407, 'grad_norm': 0.8639885120516009, 'learning_rate': 9.096053758512193e-06, 'epoch': 0.22} 22%|██▏ | 7493/34278 [8:16:47<27:21:52, 3.68s/it] 22%|██▏ | 7494/34278 [8:16:51<28:01:53, 3.77s/it] {'loss': 0.1597, 'grad_norm': 0.9298884048629106, 'learning_rate': 9.095782802742983e-06, 'epoch': 0.22} 22%|██▏ | 7494/34278 [8:16:51<28:01:53, 3.77s/it] 22%|██▏ | 7495/34278 [8:16:55<28:37:17, 3.85s/it] {'loss': 0.1693, 'grad_norm': 0.8963426486967087, 'learning_rate': 9.095511810407605e-06, 'epoch': 0.22} 22%|██▏ | 7495/34278 [8:16:55<28:37:17, 3.85s/it] 22%|██▏ | 7496/34278 [8:16:58<26:39:02, 3.58s/it] {'loss': 0.152, 'grad_norm': 0.779138913716008, 'learning_rate': 9.095240781508472e-06, 'epoch': 0.22} 22%|██▏ | 7496/34278 [8:16:58<26:39:02, 3.58s/it] 22%|██▏ | 7497/34278 [8:17:03<30:02:25, 4.04s/it] {'loss': 0.1658, 'grad_norm': 0.98885334805735, 'learning_rate': 9.09496971604801e-06, 'epoch': 0.22} 22%|██▏ | 7497/34278 [8:17:03<30:02:25, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 22%|██▏ | 7498/34278 [8:17:09<34:43:23, 4.67s/it] {'loss': 0.171, 'grad_norm': 1.0373383670255278, 'learning_rate': 9.094698614028635e-06, 'epoch': 0.22} 22%|██▏ | 7498/34278 [8:17:09<34:43:23, 4.67s/it] 22%|██▏ | 7499/34278 [8:17:15<37:06:55, 4.99s/it] {'loss': 0.1718, 'grad_norm': 0.9342057049525488, 'learning_rate': 9.094427475452767e-06, 'epoch': 0.22} 22%|██▏ | 7499/34278 [8:17:15<37:06:55, 4.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 22%|██▏ | 7500/34278 [8:17:18<33:12:48, 4.47s/it] {'loss': 0.1728, 'grad_norm': 0.9690013882216534, 'learning_rate': 9.09415630032283e-06, 'epoch': 0.22} 22%|██▏ | 7500/34278 [8:17:18<33:12:48, 4.47s/it] 22%|██▏ | 7501/34278 [8:17:22<31:12:26, 4.20s/it] {'loss': 0.1509, 'grad_norm': 0.8017953998969738, 'learning_rate': 9.09388508864124e-06, 'epoch': 0.22} 22%|██▏ | 7501/34278 [8:17:22<31:12:26, 4.20s/it] 22%|██▏ | 7502/34278 [8:17:25<29:53:31, 4.02s/it] {'loss': 0.1743, 'grad_norm': 1.2085718131544307, 'learning_rate': 9.093613840410423e-06, 'epoch': 0.22} 22%|██▏ | 7502/34278 [8:17:25<29:53:31, 4.02s/it] 22%|██▏ | 7503/34278 [8:17:29<27:50:34, 3.74s/it] {'loss': 0.1228, 'grad_norm': 0.9052288263657564, 'learning_rate': 9.0933425556328e-06, 'epoch': 0.22} 22%|██▏ | 7503/34278 [8:17:29<27:50:34, 3.74s/it] 22%|██▏ | 7504/34278 [8:17:35<32:53:43, 4.42s/it] {'loss': 0.1555, 'grad_norm': 0.7426645505106202, 'learning_rate': 9.09307123431079e-06, 'epoch': 0.22} 22%|██▏ | 7504/34278 [8:17:35<32:53:43, 4.42s/it] 22%|██▏ | 7505/34278 [8:17:38<30:59:48, 4.17s/it] {'loss': 0.1653, 'grad_norm': 0.8284735612852144, 'learning_rate': 9.092799876446818e-06, 'epoch': 0.22} 22%|██▏ | 7505/34278 [8:17:38<30:59:48, 4.17s/it] 22%|██▏ | 7506/34278 [8:17:41<28:36:20, 3.85s/it] {'loss': 0.1621, 'grad_norm': 1.1483666041753051, 
'learning_rate': 9.092528482043306e-06, 'epoch': 0.22} 22%|██▏ | 7506/34278 [8:17:41<28:36:20, 3.85s/it] 22%|██▏ | 7507/34278 [8:17:45<27:20:07, 3.68s/it] {'loss': 0.156, 'grad_norm': 0.8718060940875548, 'learning_rate': 9.092257051102675e-06, 'epoch': 0.22} 22%|██▏ | 7507/34278 [8:17:45<27:20:07, 3.68s/it] 22%|██▏ | 7508/34278 [8:17:48<26:11:07, 3.52s/it] {'loss': 0.1579, 'grad_norm': 0.9244857952426913, 'learning_rate': 9.091985583627352e-06, 'epoch': 0.22} 22%|██▏ | 7508/34278 [8:17:48<26:11:07, 3.52s/it] 22%|██▏ | 7509/34278 [8:17:51<26:32:54, 3.57s/it] {'loss': 0.159, 'grad_norm': 0.9375404157081058, 'learning_rate': 9.091714079619758e-06, 'epoch': 0.22} 22%|██▏ | 7509/34278 [8:17:51<26:32:54, 3.57s/it] 22%|██▏ | 7510/34278 [8:17:55<26:11:33, 3.52s/it] {'loss': 0.1669, 'grad_norm': 0.9444245221277218, 'learning_rate': 9.091442539082317e-06, 'epoch': 0.22} 22%|██▏ | 7510/34278 [8:17:55<26:11:33, 3.52s/it] 22%|██▏ | 7511/34278 [8:17:58<25:57:33, 3.49s/it] {'loss': 0.1575, 'grad_norm': 0.8695707828850077, 'learning_rate': 9.091170962017453e-06, 'epoch': 0.22} 22%|██▏ | 7511/34278 [8:17:58<25:57:33, 3.49s/it] 22%|██▏ | 7512/34278 [8:18:03<28:29:31, 3.83s/it] {'loss': 0.1587, 'grad_norm': 0.9718117313019489, 'learning_rate': 9.090899348427593e-06, 'epoch': 0.22} 22%|██▏ | 7512/34278 [8:18:03<28:29:31, 3.83s/it] 22%|██▏ | 7513/34278 [8:18:06<26:46:05, 3.60s/it] {'loss': 0.1736, 'grad_norm': 0.9249537069936109, 'learning_rate': 9.090627698315159e-06, 'epoch': 0.22} 22%|██▏ | 7513/34278 [8:18:06<26:46:05, 3.60s/it] 22%|██▏ | 7514/34278 [8:18:10<26:58:22, 3.63s/it] {'loss': 0.1854, 'grad_norm': 1.0613644382339964, 'learning_rate': 9.090356011682578e-06, 'epoch': 0.22} 22%|██▏ | 7514/34278 [8:18:10<26:58:22, 3.63s/it] 22%|██▏ | 7515/34278 [8:18:14<28:33:33, 3.84s/it] {'loss': 0.1618, 'grad_norm': 0.7595003707986981, 'learning_rate': 9.090084288532276e-06, 'epoch': 0.22} 22%|██▏ | 7515/34278 [8:18:14<28:33:33, 3.84s/it] 22%|██▏ | 7516/34278 [8:18:17<26:54:08, 3.62s/it] 
{'loss': 0.1818, 'grad_norm': 0.8861136072560637, 'learning_rate': 9.089812528866674e-06, 'epoch': 0.22} 22%|██▏ | 7516/34278 [8:18:17<26:54:08, 3.62s/it] 22%|██▏ | 7517/34278 [8:18:20<26:06:39, 3.51s/it] {'loss': 0.1688, 'grad_norm': 1.1073248286947113, 'learning_rate': 9.089540732688205e-06, 'epoch': 0.22} 22%|██▏ | 7517/34278 [8:18:20<26:06:39, 3.51s/it] 22%|██▏ | 7518/34278 [8:18:25<28:46:37, 3.87s/it] {'loss': 0.1597, 'grad_norm': 0.8579089096518323, 'learning_rate': 9.089268899999293e-06, 'epoch': 0.22} 22%|██▏ | 7518/34278 [8:18:25<28:46:37, 3.87s/it] 22%|██▏ | 7519/34278 [8:18:28<27:16:54, 3.67s/it] {'loss': 0.1766, 'grad_norm': 0.8120764675150153, 'learning_rate': 9.088997030802364e-06, 'epoch': 0.22} 22%|██▏ | 7519/34278 [8:18:28<27:16:54, 3.67s/it] 22%|██▏ | 7520/34278 [8:18:31<25:57:53, 3.49s/it] {'loss': 0.1622, 'grad_norm': 1.160647013834858, 'learning_rate': 9.088725125099844e-06, 'epoch': 0.22} 22%|██▏ | 7520/34278 [8:18:31<25:57:53, 3.49s/it] 22%|██▏ | 7521/34278 [8:18:34<24:26:37, 3.29s/it] {'loss': 0.1598, 'grad_norm': 0.9285213051213117, 'learning_rate': 9.088453182894165e-06, 'epoch': 0.22} 22%|██▏ | 7521/34278 [8:18:34<24:26:37, 3.29s/it] 22%|██▏ | 7522/34278 [8:18:38<25:30:03, 3.43s/it] {'loss': 0.1688, 'grad_norm': 0.8613692267922717, 'learning_rate': 9.08818120418775e-06, 'epoch': 0.22} 22%|██▏ | 7522/34278 [8:18:38<25:30:03, 3.43s/it] 22%|██▏ | 7523/34278 [8:18:41<24:29:48, 3.30s/it] {'loss': 0.1662, 'grad_norm': 0.8943126674686942, 'learning_rate': 9.08790918898303e-06, 'epoch': 0.22} 22%|██▏ | 7523/34278 [8:18:41<24:29:48, 3.30s/it] 22%|██▏ | 7524/34278 [8:18:44<24:16:04, 3.27s/it] {'loss': 0.1621, 'grad_norm': 0.7885484357016502, 'learning_rate': 9.087637137282432e-06, 'epoch': 0.22} 22%|██▏ | 7524/34278 [8:18:44<24:16:04, 3.27s/it] 22%|██▏ | 7525/34278 [8:18:47<23:04:08, 3.10s/it] {'loss': 0.1645, 'grad_norm': 0.7457779815187887, 'learning_rate': 9.087365049088386e-06, 'epoch': 0.22} 22%|██▏ | 7525/34278 [8:18:47<23:04:08, 3.10s/it] 
22%|██▏ | 7526/34278 [8:18:50<23:19:36, 3.14s/it] {'loss': 0.1432, 'grad_norm': 0.9917753671200845, 'learning_rate': 9.08709292440332e-06, 'epoch': 0.22} 22%|██▏ | 7526/34278 [8:18:50<23:19:36, 3.14s/it] 22%|██▏ | 7527/34278 [8:18:53<23:36:47, 3.18s/it] {'loss': 0.1587, 'grad_norm': 1.0063350040424661, 'learning_rate': 9.086820763229665e-06, 'epoch': 0.22} 22%|██▏ | 7527/34278 [8:18:53<23:36:47, 3.18s/it] 22%|██▏ | 7528/34278 [8:18:56<23:21:36, 3.14s/it] {'loss': 0.1676, 'grad_norm': 1.1737765979425956, 'learning_rate': 9.086548565569848e-06, 'epoch': 0.22} 22%|██▏ | 7528/34278 [8:18:56<23:21:36, 3.14s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8431 > 8192). Running this sequence through the model will result in indexing errors 22%|██▏ | 7529/34278 [8:19:02<29:31:53, 3.97s/it] {'loss': 0.1493, 'grad_norm': 0.9962712694939887, 'learning_rate': 9.086276331426302e-06, 'epoch': 0.22} 22%|██▏ | 7529/34278 [8:19:02<29:31:53, 3.97s/it] 22%|██▏ | 7530/34278 [8:19:05<27:54:29, 3.76s/it] {'loss': 0.1781, 'grad_norm': 1.0197100473107363, 'learning_rate': 9.086004060801456e-06, 'epoch': 0.22} 22%|██▏ | 7530/34278 [8:19:05<27:54:29, 3.76s/it] 22%|██▏ | 7531/34278 [8:19:10<30:26:23, 4.10s/it] {'loss': 0.1709, 'grad_norm': 0.8887979084517902, 'learning_rate': 9.085731753697741e-06, 'epoch': 0.22} 22%|██▏ | 7531/34278 [8:19:10<30:26:23, 4.10s/it] 22%|██▏ | 7532/34278 [8:19:13<27:28:44, 3.70s/it] {'loss': 0.1477, 'grad_norm': 1.0573885465579034, 'learning_rate': 9.085459410117589e-06, 'epoch': 0.22} 22%|██▏ | 7532/34278 [8:19:13<27:28:44, 3.70s/it] 22%|██▏ | 7533/34278 [8:19:16<26:41:54, 3.59s/it] {'loss': 0.1606, 'grad_norm': 0.8718632363111908, 'learning_rate': 9.085187030063432e-06, 'epoch': 0.22} 22%|██▏ | 7533/34278 [8:19:16<26:41:54, 3.59s/it] 22%|██▏ | 7534/34278 [8:19:20<26:06:57, 3.52s/it] {'loss': 0.1665, 'grad_norm': 0.7976871165360285, 'learning_rate': 9.084914613537699e-06, 'epoch': 0.22} 22%|██▏ | 7534/34278 
[8:19:20<26:06:57, 3.52s/it] 22%|██▏ | 7535/34278 [8:19:23<25:45:48, 3.47s/it] {'loss': 0.1699, 'grad_norm': 0.8654040256073582, 'learning_rate': 9.084642160542823e-06, 'epoch': 0.22} 22%|██▏ | 7535/34278 [8:19:23<25:45:48, 3.47s/it] 22%|██▏ | 7536/34278 [8:19:27<27:04:06, 3.64s/it] {'loss': 0.1834, 'grad_norm': 0.9982965024961783, 'learning_rate': 9.084369671081237e-06, 'epoch': 0.22} 22%|██▏ | 7536/34278 [8:19:27<27:04:06, 3.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 22%|██▏ | 7537/34278 [8:19:30<25:25:24, 3.42s/it] {'loss': 0.168, 'grad_norm': 0.7991190223835759, 'learning_rate': 9.084097145155372e-06, 'epoch': 0.22} 22%|██▏ | 7537/34278 [8:19:30<25:25:24, 3.42s/it] 22%|██▏ | 7538/34278 [8:19:34<27:30:29, 3.70s/it] {'loss': 0.1568, 'grad_norm': 0.8384227842961963, 'learning_rate': 9.083824582767667e-06, 'epoch': 0.22} 22%|██▏ | 7538/34278 [8:19:34<27:30:29, 3.70s/it] 22%|██▏ | 7539/34278 [8:19:40<32:28:14, 4.37s/it] {'loss': 0.1895, 'grad_norm': 0.8639148551820021, 'learning_rate': 9.083551983920546e-06, 'epoch': 0.22} 22%|██▏ | 7539/34278 [8:19:40<32:28:14, 4.37s/it] 22%|██▏ | 7540/34278 [8:19:44<30:21:34, 4.09s/it] {'loss': 0.16, 'grad_norm': 0.8776760204043199, 'learning_rate': 9.083279348616451e-06, 'epoch': 0.22} 22%|██▏ | 7540/34278 [8:19:44<30:21:34, 4.09s/it] 22%|██▏ | 7541/34278 [8:19:47<29:04:34, 3.91s/it] {'loss': 0.1624, 'grad_norm': 0.8397931986688365, 'learning_rate': 9.083006676857813e-06, 'epoch': 0.22} 22%|██▏ | 7541/34278 [8:19:47<29:04:34, 3.91s/it] 22%|██▏ | 7542/34278 [8:19:50<26:26:20, 3.56s/it] {'loss': 0.1431, 'grad_norm': 0.9393977801830081, 'learning_rate': 9.082733968647064e-06, 'epoch': 0.22} 22%|██▏ | 7542/34278 [8:19:50<26:26:20, 3.56s/it] 22%|██▏ | 7543/34278 [8:19:57<32:50:27, 4.42s/it] {'loss': 0.1519, 'grad_norm': 1.2585098990210004, 
'learning_rate': 9.082461223986643e-06, 'epoch': 0.22} 22%|██▏ | 7543/34278 [8:19:57<32:50:27, 4.42s/it] 22%|██▏ | 7544/34278 [8:20:00<30:12:30, 4.07s/it] {'loss': 0.1834, 'grad_norm': 0.8099130895397219, 'learning_rate': 9.08218844287898e-06, 'epoch': 0.22} 22%|██▏ | 7544/34278 [8:20:00<30:12:30, 4.07s/it] 22%|██▏ | 7545/34278 [8:20:03<28:33:12, 3.85s/it] {'loss': 0.1707, 'grad_norm': 0.8935366257897372, 'learning_rate': 9.081915625326516e-06, 'epoch': 0.22} 22%|██▏ | 7545/34278 [8:20:03<28:33:12, 3.85s/it] 22%|██▏ | 7546/34278 [8:20:06<27:01:05, 3.64s/it] {'loss': 0.1765, 'grad_norm': 0.9456191841995787, 'learning_rate': 9.081642771331681e-06, 'epoch': 0.22} 22%|██▏ | 7546/34278 [8:20:06<27:01:05, 3.64s/it] 22%|██▏ | 7547/34278 [8:20:11<30:21:22, 4.09s/it] {'loss': 0.1672, 'grad_norm': 1.0163265319686992, 'learning_rate': 9.081369880896916e-06, 'epoch': 0.22} 22%|██▏ | 7547/34278 [8:20:11<30:21:22, 4.09s/it] 22%|██▏ | 7548/34278 [8:20:15<30:03:49, 4.05s/it] {'loss': 0.167, 'grad_norm': 1.0068121967005164, 'learning_rate': 9.081096954024653e-06, 'epoch': 0.22} 22%|██▏ | 7548/34278 [8:20:15<30:03:49, 4.05s/it] 22%|██▏ | 7549/34278 [8:20:21<32:53:35, 4.43s/it] {'loss': 0.1774, 'grad_norm': 0.8631234713501891, 'learning_rate': 9.080823990717332e-06, 'epoch': 0.22} 22%|██▏ | 7549/34278 [8:20:21<32:53:35, 4.43s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 22%|██▏ | 7550/34278 [8:20:24<31:19:42, 4.22s/it] {'loss': 0.1621, 'grad_norm': 0.938971277102497, 'learning_rate': 9.080550990977388e-06, 'epoch': 0.22} 22%|██▏ | 7550/34278 [8:20:24<31:19:42, 4.22s/it] 22%|██▏ | 7551/34278 [8:20:29<32:45:51, 4.41s/it] {'loss': 0.1563, 'grad_norm': 0.9393416019374851, 'learning_rate': 9.08027795480726e-06, 'epoch': 0.22} 22%|██▏ | 7551/34278 [8:20:29<32:45:51, 4.41s/it] 22%|██▏ | 7552/34278 [8:20:33<30:53:42, 4.16s/it] {'loss': 0.1699, 'grad_norm': 0.9137967490648851, 'learning_rate': 9.080004882209384e-06, 'epoch': 0.22} 22%|██▏ | 7552/34278 [8:20:33<30:53:42, 4.16s/it] 22%|██▏ | 7553/34278 [8:20:36<28:07:10, 3.79s/it] {'loss': 0.1408, 'grad_norm': 0.7496392812503637, 'learning_rate': 9.079731773186196e-06, 'epoch': 0.22} 22%|██▏ | 7553/34278 [8:20:36<28:07:10, 3.79s/it] 22%|██▏ | 7554/34278 [8:20:40<28:57:21, 3.90s/it] {'loss': 0.1679, 'grad_norm': 0.9376169250918007, 'learning_rate': 9.079458627740139e-06, 'epoch': 0.22} 22%|██▏ | 7554/34278 [8:20:40<28:57:21, 3.90s/it] 22%|██▏ | 7555/34278 [8:20:46<33:38:20, 4.53s/it] {'loss': 0.1712, 'grad_norm': 0.7795422893143354, 'learning_rate': 9.079185445873649e-06, 'epoch': 0.22} 22%|██▏ | 7555/34278 [8:20:46<33:38:20, 4.53s/it] 22%|██▏ | 7556/34278 [8:20:49<29:38:20, 3.99s/it] {'loss': 0.1687, 'grad_norm': 0.8337107539120889, 'learning_rate': 9.078912227589166e-06, 'epoch': 0.22} 22%|██▏ | 7556/34278 [8:20:49<29:38:20, 3.99s/it] 22%|██▏ | 7557/34278 [8:20:52<28:23:51, 3.83s/it] {'loss': 0.1454, 'grad_norm': 0.8084468214276382, 'learning_rate': 9.078638972889126e-06, 'epoch': 0.22} 22%|██▏ | 7557/34278 [8:20:52<28:23:51, 3.83s/it] 22%|██▏ | 7558/34278 [8:20:55<26:32:27, 3.58s/it] {'loss': 0.1607, 'grad_norm': 0.7236410382389844, 'learning_rate': 9.078365681775974e-06, 'epoch': 0.22} 22%|██▏ | 7558/34278 [8:20:55<26:32:27, 3.58s/it] 22%|██▏ | 7559/34278 [8:20:58<25:11:12, 3.39s/it] {'loss': 0.1388, 'grad_norm': 0.8600214654896388, 'learning_rate': 
9.078092354252143e-06, 'epoch': 0.22} 22%|██▏ | 7559/34278 [8:20:58<25:11:12, 3.39s/it] 22%|██▏ | 7560/34278 [8:21:01<24:01:56, 3.24s/it] {'loss': 0.175, 'grad_norm': 1.0482883562274232, 'learning_rate': 9.07781899032008e-06, 'epoch': 0.22} 22%|██▏ | 7560/34278 [8:21:01<24:01:56, 3.24s/it] 22%|██▏ | 7561/34278 [8:21:06<28:16:08, 3.81s/it] {'loss': 0.1719, 'grad_norm': 0.7623423869872636, 'learning_rate': 9.077545589982221e-06, 'epoch': 0.22} 22%|██▏ | 7561/34278 [8:21:06<28:16:08, 3.81s/it] 22%|██▏ | 7562/34278 [8:21:09<26:41:35, 3.60s/it] {'loss': 0.1391, 'grad_norm': 0.8428118785734672, 'learning_rate': 9.077272153241008e-06, 'epoch': 0.22} 22%|██▏ | 7562/34278 [8:21:09<26:41:35, 3.60s/it] 22%|██▏ | 7563/34278 [8:21:12<25:11:38, 3.40s/it] {'loss': 0.1574, 'grad_norm': 0.8976367068085978, 'learning_rate': 9.076998680098883e-06, 'epoch': 0.22} 22%|██▏ | 7563/34278 [8:21:12<25:11:38, 3.40s/it] 22%|██▏ | 7564/34278 [8:21:16<25:27:17, 3.43s/it] {'loss': 0.1437, 'grad_norm': 0.7699393351335566, 'learning_rate': 9.076725170558289e-06, 'epoch': 0.22} 22%|██▏ | 7564/34278 [8:21:16<25:27:17, 3.43s/it] 22%|██▏ | 7565/34278 [8:21:19<24:45:39, 3.34s/it] {'loss': 0.1804, 'grad_norm': 0.8454398342400627, 'learning_rate': 9.076451624621665e-06, 'epoch': 0.22} 22%|██▏ | 7565/34278 [8:21:19<24:45:39, 3.34s/it] 22%|██▏ | 7566/34278 [8:21:22<24:28:43, 3.30s/it] {'loss': 0.1415, 'grad_norm': 0.838918798589249, 'learning_rate': 9.076178042291453e-06, 'epoch': 0.22} 22%|██▏ | 7566/34278 [8:21:22<24:28:43, 3.30s/it] 22%|██▏ | 7567/34278 [8:21:25<23:45:46, 3.20s/it] {'loss': 0.1506, 'grad_norm': 0.8760292330861484, 'learning_rate': 9.075904423570096e-06, 'epoch': 0.22} 22%|██▏ | 7567/34278 [8:21:25<23:45:46, 3.20s/it] 22%|██▏ | 7568/34278 [8:21:30<28:37:36, 3.86s/it] {'loss': 0.1394, 'grad_norm': 0.8392243222768452, 'learning_rate': 9.075630768460037e-06, 'epoch': 0.22} 22%|██▏ | 7568/34278 [8:21:30<28:37:36, 3.86s/it] 22%|██▏ | 7569/34278 [8:21:33<26:45:47, 3.61s/it] {'loss': 0.1623, 
'grad_norm': 0.7150461773229624, 'learning_rate': 9.075357076963723e-06, 'epoch': 0.22} 22%|██▏ | 7569/34278 [8:21:33<26:45:47, 3.61s/it] 22%|██▏ | 7570/34278 [8:21:37<27:07:49, 3.66s/it] {'loss': 0.1661, 'grad_norm': 1.021782821192606, 'learning_rate': 9.07508334908359e-06, 'epoch': 0.22} 22%|██▏ | 7570/34278 [8:21:37<27:07:49, 3.66s/it] 22%|██▏ | 7571/34278 [8:21:40<25:43:49, 3.47s/it] {'loss': 0.1565, 'grad_norm': 0.7182875658508563, 'learning_rate': 9.074809584822087e-06, 'epoch': 0.22} 22%|██▏ | 7571/34278 [8:21:40<25:43:49, 3.47s/it] 22%|██▏ | 7572/34278 [8:21:45<29:25:39, 3.97s/it] {'loss': 0.1697, 'grad_norm': 0.7924182059793997, 'learning_rate': 9.074535784181658e-06, 'epoch': 0.22} 22%|██▏ | 7572/34278 [8:21:45<29:25:39, 3.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 22%|██▏ | 7573/34278 [8:21:50<30:22:03, 4.09s/it] {'loss': 0.1507, 'grad_norm': 0.7716242050220606, 'learning_rate': 9.074261947164744e-06, 'epoch': 0.22} 22%|██▏ | 7573/34278 [8:21:50<30:22:03, 4.09s/it] 22%|██▏ | 7574/34278 [8:21:53<28:26:40, 3.83s/it] {'loss': 0.1559, 'grad_norm': 0.8161076577769065, 'learning_rate': 9.073988073773792e-06, 'epoch': 0.22} 22%|██▏ | 7574/34278 [8:21:53<28:26:40, 3.83s/it] 22%|██▏ | 7575/34278 [8:21:57<28:45:25, 3.88s/it] {'loss': 0.1756, 'grad_norm': 0.9744163088086125, 'learning_rate': 9.07371416401125e-06, 'epoch': 0.22} 22%|██▏ | 7575/34278 [8:21:57<28:45:25, 3.88s/it] 22%|██▏ | 7576/34278 [8:22:01<29:59:44, 4.04s/it] {'loss': 0.1575, 'grad_norm': 0.9554848713519216, 'learning_rate': 9.073440217879557e-06, 'epoch': 0.22} 22%|██▏ | 7576/34278 [8:22:01<29:59:44, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 22%|██▏ | 7577/34278 [8:22:04<27:49:25, 3.75s/it] {'loss': 0.141, 'grad_norm': 0.7954637106628107, 'learning_rate': 9.073166235381163e-06, 'epoch': 0.22} 22%|██▏ | 7577/34278 [8:22:04<27:49:25, 3.75s/it] 22%|██▏ | 7578/34278 [8:22:08<27:10:14, 3.66s/it] {'loss': 0.1734, 'grad_norm': 0.977347338980649, 'learning_rate': 9.072892216518513e-06, 'epoch': 0.22} 22%|██▏ | 7578/34278 [8:22:08<27:10:14, 3.66s/it] 22%|██▏ | 7579/34278 [8:22:11<26:36:38, 3.59s/it] {'loss': 0.1746, 'grad_norm': 0.8007874726654638, 'learning_rate': 9.072618161294056e-06, 'epoch': 0.22} 22%|██▏ | 7579/34278 [8:22:11<26:36:38, 3.59s/it] 22%|██▏ | 7580/34278 [8:22:14<25:26:43, 3.43s/it] {'loss': 0.1724, 'grad_norm': 0.8751881298397141, 'learning_rate': 9.072344069710234e-06, 'epoch': 0.22} 22%|██▏ | 7580/34278 [8:22:14<25:26:43, 3.43s/it] 22%|██▏ | 7581/34278 [8:22:18<25:34:57, 3.45s/it] {'loss': 0.1496, 'grad_norm': 0.8967593436275193, 'learning_rate': 9.072069941769497e-06, 'epoch': 0.22} 22%|██▏ | 7581/34278 [8:22:18<25:34:57, 3.45s/it] 22%|██▏ | 7582/34278 [8:22:21<25:18:18, 3.41s/it] {'loss': 0.1673, 'grad_norm': 0.8527490070747805, 'learning_rate': 9.071795777474291e-06, 'epoch': 0.22} 22%|██▏ | 7582/34278 [8:22:21<25:18:18, 3.41s/it] 22%|██▏ | 7583/34278 [8:22:24<24:55:44, 3.36s/it] {'loss': 0.1588, 'grad_norm': 0.7119002806549058, 'learning_rate': 9.071521576827066e-06, 'epoch': 0.22} 22%|██▏ | 7583/34278 [8:22:24<24:55:44, 3.36s/it] 22%|██▏ | 7584/34278 [8:22:27<24:17:52, 3.28s/it] {'loss': 0.1735, 'grad_norm': 0.7191548340724161, 'learning_rate': 9.071247339830266e-06, 'epoch': 0.22} 22%|██▏ | 7584/34278 [8:22:27<24:17:52, 3.28s/it] 22%|██▏ | 7585/34278 [8:22:30<23:47:48, 3.21s/it] {'loss': 0.1472, 'grad_norm': 0.7932341849874444, 'learning_rate': 9.070973066486343e-06, 'epoch': 0.22} 22%|██▏ | 7585/34278 [8:22:30<23:47:48, 3.21s/it] 22%|██▏ | 7586/34278 [8:22:34<23:40:12, 3.19s/it] {'loss': 0.1782, 'grad_norm': 0.9788378489903121, 'learning_rate': 
9.070698756797744e-06, 'epoch': 0.22} 22%|██▏ | 7586/34278 [8:22:34<23:40:12, 3.19s/it] 22%|██▏ | 7587/34278 [8:22:37<24:09:50, 3.26s/it] {'loss': 0.1493, 'grad_norm': 0.670224463952022, 'learning_rate': 9.070424410766918e-06, 'epoch': 0.22} 22%|██▏ | 7587/34278 [8:22:37<24:09:50, 3.26s/it] 22%|██▏ | 7588/34278 [8:22:40<23:25:56, 3.16s/it] {'loss': 0.1666, 'grad_norm': 1.100490778659628, 'learning_rate': 9.070150028396315e-06, 'epoch': 0.22} 22%|██▏ | 7588/34278 [8:22:40<23:25:56, 3.16s/it] 22%|██▏ | 7589/34278 [8:22:43<23:50:22, 3.22s/it] {'loss': 0.1372, 'grad_norm': 0.9397402986306163, 'learning_rate': 9.069875609688384e-06, 'epoch': 0.22} 22%|██▏ | 7589/34278 [8:22:43<23:50:22, 3.22s/it] 22%|██▏ | 7590/34278 [8:22:48<26:04:32, 3.52s/it] {'loss': 0.1703, 'grad_norm': 0.8819686288122192, 'learning_rate': 9.069601154645575e-06, 'epoch': 0.22} 22%|██▏ | 7590/34278 [8:22:48<26:04:32, 3.52s/it] 22%|██▏ | 7591/34278 [8:22:51<25:42:24, 3.47s/it] {'loss': 0.1793, 'grad_norm': 1.0019783126541315, 'learning_rate': 9.06932666327034e-06, 'epoch': 0.22} 22%|██▏ | 7591/34278 [8:22:51<25:42:24, 3.47s/it] 22%|██▏ | 7592/34278 [8:22:57<31:36:56, 4.27s/it] {'loss': 0.1506, 'grad_norm': 0.8283424701988963, 'learning_rate': 9.069052135565126e-06, 'epoch': 0.22} 22%|██▏ | 7592/34278 [8:22:57<31:36:56, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 22%|██▏ | 7593/34278 [8:23:03<36:06:25, 4.87s/it] {'loss': 0.1609, 'grad_norm': 1.0498749053308172, 'learning_rate': 9.068777571532385e-06, 'epoch': 0.22} 22%|██▏ | 7593/34278 [8:23:03<36:06:25, 4.87s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 22%|██▏ | 7594/34278 [8:23:07<32:52:07, 4.43s/it] {'loss': 0.1591, 'grad_norm': 0.845997786865884, 'learning_rate': 9.06850297117457e-06, 'epoch': 0.22} 22%|██▏ | 7594/34278 [8:23:07<32:52:07, 4.43s/it] 22%|██▏ | 7595/34278 [8:23:13<36:25:44, 4.91s/it] {'loss': 0.1713, 'grad_norm': 0.9371159908291806, 'learning_rate': 9.068228334494133e-06, 'epoch': 0.22} 22%|██▏ | 7595/34278 [8:23:13<36:25:44, 4.91s/it] 22%|██▏ | 7596/34278 [8:23:17<34:04:50, 4.60s/it] {'loss': 0.1835, 'grad_norm': 1.0625213095938626, 'learning_rate': 9.067953661493524e-06, 'epoch': 0.22} 22%|██▏ | 7596/34278 [8:23:17<34:04:50, 4.60s/it] 22%|██▏ | 7597/34278 [8:23:20<30:30:58, 4.12s/it] {'loss': 0.1658, 'grad_norm': 0.9265621786408575, 'learning_rate': 9.067678952175196e-06, 'epoch': 0.22} 22%|██▏ | 7597/34278 [8:23:20<30:30:58, 4.12s/it] 22%|██▏ | 7598/34278 [8:23:23<28:18:38, 3.82s/it] {'loss': 0.1519, 'grad_norm': 0.8570825544727586, 'learning_rate': 9.067404206541601e-06, 'epoch': 0.22} 22%|██▏ | 7598/34278 [8:23:23<28:18:38, 3.82s/it] 22%|██▏ | 7599/34278 [8:23:27<29:25:55, 3.97s/it] {'loss': 0.1553, 'grad_norm': 0.8283111861076646, 'learning_rate': 9.067129424595191e-06, 'epoch': 0.22} 22%|██▏ | 7599/34278 [8:23:27<29:25:55, 3.97s/it] 22%|██▏ | 7600/34278 [8:23:30<27:07:51, 3.66s/it] {'loss': 0.162, 'grad_norm': 0.7113587454962451, 'learning_rate': 9.066854606338422e-06, 'epoch': 0.22} 22%|██▏ | 7600/34278 [8:23:30<27:07:51, 3.66s/it] 22%|██▏ | 7601/34278 [8:23:34<27:22:33, 3.69s/it] {'loss': 0.1607, 'grad_norm': 0.7252270028494985, 'learning_rate': 9.066579751773745e-06, 'epoch': 0.22} 22%|██▏ | 7601/34278 [8:23:34<27:22:33, 3.69s/it] 22%|██▏ | 7602/34278 [8:23:37<27:07:24, 3.66s/it] {'loss': 0.1618, 'grad_norm': 0.734893412520523, 'learning_rate': 9.066304860903616e-06, 'epoch': 0.22} 22%|██▏ | 7602/34278 [8:23:38<27:07:24, 3.66s/it] 22%|██▏ | 7603/34278 [8:23:41<27:48:46, 3.75s/it] {'loss': 0.1661, 'grad_norm': 0.887711150723681, 'learning_rate': 
9.066029933730486e-06, 'epoch': 0.22} 22%|██▏ | 7603/34278 [8:23:41<27:48:46, 3.75s/it] 22%|██▏ | 7604/34278 [8:23:44<26:27:24, 3.57s/it] {'loss': 0.1698, 'grad_norm': 0.9686223705673456, 'learning_rate': 9.065754970256813e-06, 'epoch': 0.22} 22%|██▏ | 7604/34278 [8:23:44<26:27:24, 3.57s/it] 22%|██▏ | 7605/34278 [8:23:47<25:01:52, 3.38s/it] {'loss': 0.1545, 'grad_norm': 0.6550977232812051, 'learning_rate': 9.06547997048505e-06, 'epoch': 0.22} 22%|██▏ | 7605/34278 [8:23:47<25:01:52, 3.38s/it] 22%|██▏ | 7606/34278 [8:23:50<23:33:40, 3.18s/it] {'loss': 0.1617, 'grad_norm': 0.8286044566302697, 'learning_rate': 9.065204934417654e-06, 'epoch': 0.22} 22%|██▏ | 7606/34278 [8:23:50<23:33:40, 3.18s/it] 22%|██▏ | 7607/34278 [8:23:53<23:50:48, 3.22s/it] {'loss': 0.17, 'grad_norm': 0.9547923364302132, 'learning_rate': 9.064929862057075e-06, 'epoch': 0.22} 22%|██▏ | 7607/34278 [8:23:53<23:50:48, 3.22s/it] 22%|██▏ | 7608/34278 [8:23:58<26:47:48, 3.62s/it] {'loss': 0.1646, 'grad_norm': 0.7569291356378777, 'learning_rate': 9.064654753405775e-06, 'epoch': 0.22} 22%|██▏ | 7608/34278 [8:23:58<26:47:48, 3.62s/it] 22%|██▏ | 7609/34278 [8:24:01<26:27:29, 3.57s/it] {'loss': 0.1888, 'grad_norm': 1.008866085330164, 'learning_rate': 9.064379608466207e-06, 'epoch': 0.22} 22%|██▏ | 7609/34278 [8:24:01<26:27:29, 3.57s/it] 22%|██▏ | 7610/34278 [8:24:08<32:01:38, 4.32s/it] {'loss': 0.1932, 'grad_norm': 0.8024048623313427, 'learning_rate': 9.064104427240828e-06, 'epoch': 0.22} 22%|██▏ | 7610/34278 [8:24:08<32:01:38, 4.32s/it] 22%|██▏ | 7611/34278 [8:24:12<32:19:18, 4.36s/it] {'loss': 0.1537, 'grad_norm': 0.796585074083653, 'learning_rate': 9.063829209732096e-06, 'epoch': 0.22} 22%|██▏ | 7611/34278 [8:24:12<32:19:18, 4.36s/it] 22%|██▏ | 7612/34278 [8:24:15<30:03:20, 4.06s/it] {'loss': 0.1483, 'grad_norm': 0.7401358079354325, 'learning_rate': 9.063553955942465e-06, 'epoch': 0.22} 22%|██▏ | 7612/34278 [8:24:15<30:03:20, 4.06s/it] 22%|██▏ | 7613/34278 [8:24:19<29:18:44, 3.96s/it] {'loss': 0.1658, 
'grad_norm': 0.7607351143856173, 'learning_rate': 9.063278665874396e-06, 'epoch': 0.22} 22%|██▏ | 7613/34278 [8:24:19<29:18:44, 3.96s/it] 22%|██▏ | 7614/34278 [8:24:24<31:46:27, 4.29s/it] {'loss': 0.1508, 'grad_norm': 0.7940261184520345, 'learning_rate': 9.063003339530342e-06, 'epoch': 0.22} 22%|██▏ | 7614/34278 [8:24:24<31:46:27, 4.29s/it] 22%|██▏ | 7615/34278 [8:24:30<35:20:12, 4.77s/it] {'loss': 0.1631, 'grad_norm': 0.71268964953337, 'learning_rate': 9.062727976912769e-06, 'epoch': 0.22} 22%|██▏ | 7615/34278 [8:24:30<35:20:12, 4.77s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 22%|██▏ | 7616/34278 [8:24:34<33:45:32, 4.56s/it] {'loss': 0.1882, 'grad_norm': 0.9682479161327682, 'learning_rate': 9.062452578024128e-06, 'epoch': 0.22} 22%|██▏ | 7616/34278 [8:24:34<33:45:32, 4.56s/it] 22%|██▏ | 7617/34278 [8:24:38<31:30:17, 4.25s/it] {'loss': 0.1651, 'grad_norm': 0.7615084534557112, 'learning_rate': 9.062177142866879e-06, 'epoch': 0.22} 22%|██▏ | 7617/34278 [8:24:38<31:30:17, 4.25s/it] 22%|██▏ | 7618/34278 [8:24:41<30:31:56, 4.12s/it] {'loss': 0.1679, 'grad_norm': 0.9499185985324785, 'learning_rate': 9.061901671443483e-06, 'epoch': 0.22} 22%|██▏ | 7618/34278 [8:24:41<30:31:56, 4.12s/it] 22%|██▏ | 7619/34278 [8:24:45<28:36:44, 3.86s/it] {'loss': 0.1599, 'grad_norm': 0.7525132228183636, 'learning_rate': 9.061626163756398e-06, 'epoch': 0.22} 22%|██▏ | 7619/34278 [8:24:45<28:36:44, 3.86s/it] 22%|██▏ | 7620/34278 [8:24:49<29:32:29, 3.99s/it] {'loss': 0.1937, 'grad_norm': 0.7786034564604447, 'learning_rate': 9.061350619808086e-06, 'epoch': 0.22} 22%|██▏ | 7620/34278 [8:24:49<29:32:29, 3.99s/it] 22%|██▏ | 7621/34278 [8:24:55<33:08:58, 4.48s/it] {'loss': 0.1712, 'grad_norm': 0.9028832343859958, 'learning_rate': 9.061075039601003e-06, 'epoch': 0.22} 22%|██▏ | 7621/34278 [8:24:55<33:08:58, 4.48s/it] 
22%|██▏ | 7622/34278 [8:24:58<30:47:30, 4.16s/it] {'loss': 0.1555, 'grad_norm': 0.7954470304362796, 'learning_rate': 9.060799423137615e-06, 'epoch': 0.22} 22%|██▏ | 7622/34278 [8:24:58<30:47:30, 4.16s/it] 22%|██▏ | 7623/34278 [8:25:02<29:23:04, 3.97s/it] {'loss': 0.1547, 'grad_norm': 0.9866484570970352, 'learning_rate': 9.060523770420376e-06, 'epoch': 0.22} 22%|██▏ | 7623/34278 [8:25:02<29:23:04, 3.97s/it] 22%|██▏ | 7624/34278 [8:25:04<26:49:55, 3.62s/it] {'loss': 0.1512, 'grad_norm': 0.7820834776528715, 'learning_rate': 9.060248081451752e-06, 'epoch': 0.22} 22%|██▏ | 7624/34278 [8:25:04<26:49:55, 3.62s/it] 22%|██▏ | 7625/34278 [8:25:07<25:29:07, 3.44s/it] {'loss': 0.1533, 'grad_norm': 0.8186999136404977, 'learning_rate': 9.059972356234202e-06, 'epoch': 0.22} 22%|██▏ | 7625/34278 [8:25:07<25:29:07, 3.44s/it] 22%|██▏ | 7626/34278 [8:25:11<25:21:07, 3.42s/it] {'loss': 0.1582, 'grad_norm': 0.991945352238341, 'learning_rate': 9.059696594770186e-06, 'epoch': 0.22} 22%|██▏ | 7626/34278 [8:25:11<25:21:07, 3.42s/it] 22%|██▏ | 7627/34278 [8:25:15<27:56:27, 3.77s/it] {'loss': 0.154, 'grad_norm': 0.8035245190630039, 'learning_rate': 9.059420797062169e-06, 'epoch': 0.22} 22%|██▏ | 7627/34278 [8:25:15<27:56:27, 3.77s/it] 22%|██▏ | 7628/34278 [8:25:20<28:50:01, 3.89s/it] {'loss': 0.1683, 'grad_norm': 1.0123527087147737, 'learning_rate': 9.059144963112612e-06, 'epoch': 0.22} 22%|██▏ | 7628/34278 [8:25:20<28:50:01, 3.89s/it] 22%|██▏ | 7629/34278 [8:25:24<29:46:39, 4.02s/it] {'loss': 0.1624, 'grad_norm': 0.7261878728564278, 'learning_rate': 9.058869092923979e-06, 'epoch': 0.22} 22%|██▏ | 7629/34278 [8:25:24<29:46:39, 4.02s/it] 22%|██▏ | 7630/34278 [8:25:27<27:35:49, 3.73s/it] {'loss': 0.1308, 'grad_norm': 0.8661584585519259, 'learning_rate': 9.058593186498731e-06, 'epoch': 0.22} 22%|██▏ | 7630/34278 [8:25:27<27:35:49, 3.73s/it] 22%|██▏ | 7631/34278 [8:25:30<26:03:22, 3.52s/it] {'loss': 0.1872, 'grad_norm': 0.8825084494799464, 'learning_rate': 9.058317243839333e-06, 'epoch': 0.22} 
22%|██▏ | 7631/34278 [8:25:30<26:03:22, 3.52s/it] 22%|██▏ | 7632/34278 [8:25:33<25:09:05, 3.40s/it] {'loss': 0.1744, 'grad_norm': 0.782826852717805, 'learning_rate': 9.058041264948244e-06, 'epoch': 0.22} 22%|██▏ | 7632/34278 [8:25:33<25:09:05, 3.40s/it] 22%|██▏ | 7633/34278 [8:25:36<24:51:45, 3.36s/it] {'loss': 0.165, 'grad_norm': 0.7472818214781765, 'learning_rate': 9.057765249827935e-06, 'epoch': 0.22} 22%|██▏ | 7633/34278 [8:25:36<24:51:45, 3.36s/it] 22%|██▏ | 7634/34278 [8:25:39<24:06:02, 3.26s/it] {'loss': 0.1589, 'grad_norm': 0.8827334418567251, 'learning_rate': 9.057489198480864e-06, 'epoch': 0.22} 22%|██▏ | 7634/34278 [8:25:39<24:06:02, 3.26s/it] 22%|██▏ | 7635/34278 [8:25:42<22:52:15, 3.09s/it] {'loss': 0.1557, 'grad_norm': 0.7843369468588055, 'learning_rate': 9.057213110909499e-06, 'epoch': 0.22} 22%|██▏ | 7635/34278 [8:25:42<22:52:15, 3.09s/it] 22%|██▏ | 7636/34278 [8:25:46<24:02:37, 3.25s/it] {'loss': 0.1531, 'grad_norm': 0.8039448951636196, 'learning_rate': 9.056936987116304e-06, 'epoch': 0.22} 22%|██▏ | 7636/34278 [8:25:46<24:02:37, 3.25s/it] 22%|██▏ | 7637/34278 [8:25:49<24:46:05, 3.35s/it] {'loss': 0.1699, 'grad_norm': 0.9018143498707185, 'learning_rate': 9.056660827103744e-06, 'epoch': 0.22} 22%|██▏ | 7637/34278 [8:25:49<24:46:05, 3.35s/it] 22%|██▏ | 7638/34278 [8:25:53<24:43:47, 3.34s/it] {'loss': 0.1643, 'grad_norm': 0.8453876767618507, 'learning_rate': 9.056384630874284e-06, 'epoch': 0.22} 22%|██▏ | 7638/34278 [8:25:53<24:43:47, 3.34s/it] 22%|██▏ | 7639/34278 [8:25:56<24:18:10, 3.28s/it] {'loss': 0.1428, 'grad_norm': 1.112609241081302, 'learning_rate': 9.056108398430392e-06, 'epoch': 0.22} 22%|██▏ | 7639/34278 [8:25:56<24:18:10, 3.28s/it] 22%|██▏ | 7640/34278 [8:26:00<25:44:24, 3.48s/it] {'loss': 0.1922, 'grad_norm': 1.1699841161145523, 'learning_rate': 9.055832129774531e-06, 'epoch': 0.22} 22%|██▏ | 7640/34278 [8:26:00<25:44:24, 
3.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 22%|██▏ | 7641/34278 [8:26:03<24:52:26, 3.36s/it] {'loss': 0.1632, 'grad_norm': 0.7491678728972175, 'learning_rate': 9.05555582490917e-06, 'epoch': 0.22} 22%|██▏ | 7641/34278 [8:26:03<24:52:26, 3.36s/it] 22%|██▏ | 7642/34278 [8:26:06<24:09:08, 3.26s/it] {'loss': 0.1593, 'grad_norm': 1.2193349843145231, 'learning_rate': 9.055279483836773e-06, 'epoch': 0.22} 22%|██▏ | 7642/34278 [8:26:06<24:09:08, 3.26s/it] 22%|██▏ | 7643/34278 [8:26:09<24:15:51, 3.28s/it] {'loss': 0.1545, 'grad_norm': 0.9511142377842252, 'learning_rate': 9.05500310655981e-06, 'epoch': 0.22} 22%|██▏ | 7643/34278 [8:26:09<24:15:51, 3.28s/it] 22%|██▏ | 7644/34278 [8:26:12<23:52:01, 3.23s/it] {'loss': 0.1463, 'grad_norm': 0.8744041500940533, 'learning_rate': 9.054726693080748e-06, 'epoch': 0.22} 22%|██▏ | 7644/34278 [8:26:12<23:52:01, 3.23s/it] 22%|██▏ | 7645/34278 [8:26:16<25:36:12, 3.46s/it] {'loss': 0.1832, 'grad_norm': 1.0206027985283834, 'learning_rate': 9.054450243402054e-06, 'epoch': 0.22} 22%|██▏ | 7645/34278 [8:26:16<25:36:12, 3.46s/it] 22%|██▏ | 7646/34278 [8:26:20<25:37:49, 3.46s/it] {'loss': 0.1564, 'grad_norm': 0.9566518601858032, 'learning_rate': 9.054173757526195e-06, 'epoch': 0.22} 22%|██▏ | 7646/34278 [8:26:20<25:37:49, 3.46s/it] 22%|██▏ | 7647/34278 [8:26:25<30:21:11, 4.10s/it] {'loss': 0.2013, 'grad_norm': 0.7813565535186373, 'learning_rate': 9.05389723545564e-06, 'epoch': 0.22} 22%|██▏ | 7647/34278 [8:26:25<30:21:11, 4.10s/it] 22%|██▏ | 7648/34278 [8:26:29<28:58:03, 3.92s/it] {'loss': 0.1319, 'grad_norm': 0.7598699186360195, 'learning_rate': 9.053620677192859e-06, 'epoch': 0.22} 22%|██▏ | 7648/34278 [8:26:29<28:58:03, 3.92s/it] 22%|██▏ | 7649/34278 [8:26:32<26:58:28, 3.65s/it] {'loss': 0.173, 'grad_norm': 0.9025666236147157, 'learning_rate': 
9.05334408274032e-06, 'epoch': 0.22} 22%|██▏ | 7649/34278 [8:26:32<26:58:28, 3.65s/it] 22%|██▏ | 7650/34278 [8:26:38<32:27:59, 4.39s/it] {'loss': 0.1639, 'grad_norm': 0.984611023588771, 'learning_rate': 9.053067452100493e-06, 'epoch': 0.22} 22%|██▏ | 7650/34278 [8:26:38<32:27:59, 4.39s/it] 22%|██▏ | 7651/34278 [8:26:44<36:18:54, 4.91s/it] {'loss': 0.1552, 'grad_norm': 0.6207124157241103, 'learning_rate': 9.052790785275848e-06, 'epoch': 0.22} 22%|██▏ | 7651/34278 [8:26:44<36:18:54, 4.91s/it] 22%|██▏ | 7652/34278 [8:26:48<34:11:18, 4.62s/it] {'loss': 0.1614, 'grad_norm': 0.7937699128727909, 'learning_rate': 9.052514082268853e-06, 'epoch': 0.22} 22%|██▏ | 7652/34278 [8:26:48<34:11:18, 4.62s/it] 22%|██▏ | 7653/34278 [8:26:52<33:41:26, 4.56s/it] {'loss': 0.1566, 'grad_norm': 0.9311623697082175, 'learning_rate': 9.052237343081982e-06, 'epoch': 0.22} 22%|██▏ | 7653/34278 [8:26:52<33:41:26, 4.56s/it] 22%|██▏ | 7654/34278 [8:26:55<29:52:46, 4.04s/it] {'loss': 0.1512, 'grad_norm': 0.779768384518718, 'learning_rate': 9.051960567717702e-06, 'epoch': 0.22} 22%|██▏ | 7654/34278 [8:26:55<29:52:46, 4.04s/it] 22%|██▏ | 7655/34278 [8:26:58<27:25:21, 3.71s/it] {'loss': 0.1609, 'grad_norm': 0.783404962765794, 'learning_rate': 9.051683756178484e-06, 'epoch': 0.22} 22%|██▏ | 7655/34278 [8:26:58<27:25:21, 3.71s/it] 22%|██▏ | 7656/34278 [8:27:02<27:31:15, 3.72s/it] {'loss': 0.186, 'grad_norm': 1.0170932451725594, 'learning_rate': 9.051406908466803e-06, 'epoch': 0.22} 22%|██▏ | 7656/34278 [8:27:02<27:31:15, 3.72s/it] 22%|██▏ | 7657/34278 [8:27:08<32:52:34, 4.45s/it] {'loss': 0.1659, 'grad_norm': 0.9983335483662694, 'learning_rate': 9.051130024585125e-06, 'epoch': 0.22} 22%|██▏ | 7657/34278 [8:27:08<32:52:34, 4.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 22%|██▏ | 7658/34278 [8:27:14<37:19:17, 5.05s/it] {'loss': 0.1702, 'grad_norm': 0.9680024818605943, 'learning_rate': 9.050853104535927e-06, 'epoch': 0.22} 22%|██▏ | 7658/34278 [8:27:14<37:19:17, 5.05s/it] 22%|██▏ | 7659/34278 [8:27:17<32:45:19, 4.43s/it] {'loss': 0.1733, 'grad_norm': 0.9655365420641593, 'learning_rate': 9.05057614832168e-06, 'epoch': 0.22} 22%|██▏ | 7659/34278 [8:27:17<32:45:19, 4.43s/it] 22%|██▏ | 7660/34278 [8:27:21<29:52:03, 4.04s/it] {'loss': 0.1479, 'grad_norm': 0.8928412249761869, 'learning_rate': 9.050299155944857e-06, 'epoch': 0.22} 22%|██▏ | 7660/34278 [8:27:21<29:52:03, 4.04s/it] 22%|██▏ | 7661/34278 [8:27:24<27:44:13, 3.75s/it] {'loss': 0.1679, 'grad_norm': 0.7972993805394739, 'learning_rate': 9.050022127407928e-06, 'epoch': 0.22} 22%|██▏ | 7661/34278 [8:27:24<27:44:13, 3.75s/it] 22%|██▏ | 7662/34278 [8:27:27<26:04:05, 3.53s/it] {'loss': 0.1443, 'grad_norm': 0.9849039289849973, 'learning_rate': 9.049745062713368e-06, 'epoch': 0.22} 22%|██▏ | 7662/34278 [8:27:27<26:04:05, 3.53s/it] 22%|██▏ | 7663/34278 [8:27:29<24:06:45, 3.26s/it] {'loss': 0.1489, 'grad_norm': 0.9891067194514748, 'learning_rate': 9.049467961863652e-06, 'epoch': 0.22} 22%|██▏ | 7663/34278 [8:27:29<24:06:45, 3.26s/it] 22%|██▏ | 7664/34278 [8:27:32<23:06:05, 3.12s/it] {'loss': 0.1379, 'grad_norm': 0.8719737764767784, 'learning_rate': 9.049190824861254e-06, 'epoch': 0.22} 22%|██▏ | 7664/34278 [8:27:32<23:06:05, 3.12s/it] 22%|██▏ | 7665/34278 [8:27:35<23:33:21, 3.19s/it] {'loss': 0.1486, 'grad_norm': 1.1786497385711123, 'learning_rate': 9.048913651708643e-06, 'epoch': 0.22} 22%|██▏ | 7665/34278 [8:27:35<23:33:21, 3.19s/it] 22%|██▏ | 7666/34278 [8:27:39<23:36:38, 3.19s/it] {'loss': 0.1503, 'grad_norm': 0.9145057571417332, 'learning_rate': 9.048636442408302e-06, 'epoch': 0.22} 22%|██▏ | 7666/34278 [8:27:39<23:36:38, 3.19s/it] 22%|██▏ | 7667/34278 [8:27:43<26:47:29, 3.62s/it] {'loss': 0.1748, 'grad_norm': 0.8945697892923155, 'learning_rate': 
9.0483591969627e-06, 'epoch': 0.22} 22%|██▏ | 7667/34278 [8:27:43<26:47:29, 3.62s/it] 22%|██▏ | 7668/34278 [8:27:49<31:26:05, 4.25s/it] {'loss': 0.168, 'grad_norm': 0.9387624666252707, 'learning_rate': 9.048081915374312e-06, 'epoch': 0.22} 22%|██▏ | 7668/34278 [8:27:49<31:26:05, 4.25s/it] 22%|██▏ | 7669/34278 [8:27:53<30:36:16, 4.14s/it] {'loss': 0.1904, 'grad_norm': 1.0523629310613862, 'learning_rate': 9.047804597645615e-06, 'epoch': 0.22} 22%|██▏ | 7669/34278 [8:27:53<30:36:16, 4.14s/it] 22%|██▏ | 7670/34278 [8:27:56<28:02:49, 3.79s/it] {'loss': 0.1773, 'grad_norm': 0.9639725962769458, 'learning_rate': 9.047527243779086e-06, 'epoch': 0.22} 22%|██▏ | 7670/34278 [8:27:56<28:02:49, 3.79s/it] 22%|██▏ | 7671/34278 [8:28:00<28:07:03, 3.80s/it] {'loss': 0.1709, 'grad_norm': 0.7872886284236622, 'learning_rate': 9.047249853777201e-06, 'epoch': 0.22} 22%|██▏ | 7671/34278 [8:28:00<28:07:03, 3.80s/it] 22%|██▏ | 7672/34278 [8:28:03<27:04:07, 3.66s/it] {'loss': 0.1828, 'grad_norm': 0.9870747744907474, 'learning_rate': 9.046972427642434e-06, 'epoch': 0.22} 22%|██▏ | 7672/34278 [8:28:03<27:04:07, 3.66s/it] 22%|██▏ | 7673/34278 [8:28:07<27:54:32, 3.78s/it] {'loss': 0.1641, 'grad_norm': 0.8699741891418625, 'learning_rate': 9.046694965377263e-06, 'epoch': 0.22} 22%|██▏ | 7673/34278 [8:28:07<27:54:32, 3.78s/it] 22%|██▏ | 7674/34278 [8:28:10<26:03:58, 3.53s/it] {'loss': 0.1732, 'grad_norm': 0.9056957844811336, 'learning_rate': 9.046417466984165e-06, 'epoch': 0.22} 22%|██▏ | 7674/34278 [8:28:10<26:03:58, 3.53s/it] 22%|██▏ | 7675/34278 [8:28:14<27:51:33, 3.77s/it] {'loss': 0.1731, 'grad_norm': 0.943514829476528, 'learning_rate': 9.046139932465618e-06, 'epoch': 0.22} 22%|██▏ | 7675/34278 [8:28:14<27:51:33, 3.77s/it] 22%|██▏ | 7676/34278 [8:28:18<27:56:39, 3.78s/it] {'loss': 0.1441, 'grad_norm': 0.8878351010150347, 'learning_rate': 9.045862361824101e-06, 'epoch': 0.22} 22%|██▏ | 7676/34278 [8:28:18<27:56:39, 3.78s/it] 22%|██▏ | 7677/34278 [8:28:21<26:11:14, 3.54s/it] {'loss': 0.1479, 
'grad_norm': 1.0017549404017092, 'learning_rate': 9.04558475506209e-06, 'epoch': 0.22} 22%|██▏ | 7677/34278 [8:28:21<26:11:14, 3.54s/it] 22%|██▏ | 7678/34278 [8:28:25<26:19:06, 3.56s/it] {'loss': 0.1555, 'grad_norm': 0.7349079912287529, 'learning_rate': 9.045307112182064e-06, 'epoch': 0.22} 22%|██▏ | 7678/34278 [8:28:25<26:19:06, 3.56s/it] 22%|██▏ | 7679/34278 [8:28:28<26:35:57, 3.60s/it] {'loss': 0.1641, 'grad_norm': 0.8843178662615621, 'learning_rate': 9.045029433186502e-06, 'epoch': 0.22} 22%|██▏ | 7679/34278 [8:28:28<26:35:57, 3.60s/it] 22%|██▏ | 7680/34278 [8:28:31<24:53:21, 3.37s/it] {'loss': 0.162, 'grad_norm': 1.0304141828281947, 'learning_rate': 9.044751718077883e-06, 'epoch': 0.22} 22%|██▏ | 7680/34278 [8:28:31<24:53:21, 3.37s/it] 22%|██▏ | 7681/34278 [8:28:34<24:00:31, 3.25s/it] {'loss': 0.1535, 'grad_norm': 0.7978965128000045, 'learning_rate': 9.044473966858684e-06, 'epoch': 0.22} 22%|██▏ | 7681/34278 [8:28:34<24:00:31, 3.25s/it] 22%|██▏ | 7682/34278 [8:28:38<24:53:32, 3.37s/it] {'loss': 0.1747, 'grad_norm': 0.8422392322066434, 'learning_rate': 9.044196179531389e-06, 'epoch': 0.22} 22%|██▏ | 7682/34278 [8:28:38<24:53:32, 3.37s/it] 22%|██▏ | 7683/34278 [8:28:44<31:28:07, 4.26s/it] {'loss': 0.1619, 'grad_norm': 0.9792872894157268, 'learning_rate': 9.043918356098476e-06, 'epoch': 0.22} 22%|██▏ | 7683/34278 [8:28:44<31:28:07, 4.26s/it] 22%|██▏ | 7684/34278 [8:28:47<28:10:34, 3.81s/it] {'loss': 0.1542, 'grad_norm': 0.8599377481402988, 'learning_rate': 9.043640496562425e-06, 'epoch': 0.22} 22%|██▏ | 7684/34278 [8:28:47<28:10:34, 3.81s/it] 22%|██▏ | 7685/34278 [8:28:50<27:19:06, 3.70s/it] {'loss': 0.153, 'grad_norm': 0.884419926074693, 'learning_rate': 9.043362600925717e-06, 'epoch': 0.22} 22%|██▏ | 7685/34278 [8:28:50<27:19:06, 3.70s/it] 22%|██▏ | 7686/34278 [8:28:56<32:03:49, 4.34s/it] {'loss': 0.1569, 'grad_norm': 0.947556215129273, 'learning_rate': 9.043084669190832e-06, 'epoch': 0.22} 22%|██▏ | 7686/34278 [8:28:56<32:03:49, 4.34s/it] 22%|██▏ | 7687/34278 
[8:28:59<29:35:47, 4.01s/it] {'loss': 0.1658, 'grad_norm': 0.7913255157996661, 'learning_rate': 9.042806701360254e-06, 'epoch': 0.22} 22%|██▏ | 7687/34278 [8:28:59<29:35:47, 4.01s/it] 22%|██▏ | 7688/34278 [8:29:03<27:30:44, 3.72s/it] {'loss': 0.1452, 'grad_norm': 0.7619208615017694, 'learning_rate': 9.042528697436461e-06, 'epoch': 0.22} 22%|██▏ | 7688/34278 [8:29:03<27:30:44, 3.72s/it] 22%|██▏ | 7689/34278 [8:29:05<25:12:41, 3.41s/it] {'loss': 0.1651, 'grad_norm': 0.8566179978048031, 'learning_rate': 9.042250657421938e-06, 'epoch': 0.22} 22%|██▏ | 7689/34278 [8:29:05<25:12:41, 3.41s/it] 22%|██▏ | 7690/34278 [8:29:09<25:44:45, 3.49s/it] {'loss': 0.1654, 'grad_norm': 0.8134848864736269, 'learning_rate': 9.041972581319165e-06, 'epoch': 0.22} 22%|██▏ | 7690/34278 [8:29:09<25:44:45, 3.49s/it] 22%|██▏ | 7691/34278 [8:29:15<31:25:48, 4.26s/it] {'loss': 0.1654, 'grad_norm': 0.7432956393861663, 'learning_rate': 9.041694469130628e-06, 'epoch': 0.22} 22%|██▏ | 7691/34278 [8:29:15<31:25:48, 4.26s/it] 22%|██▏ | 7692/34278 [8:29:18<28:44:02, 3.89s/it] {'loss': 0.143, 'grad_norm': 0.7811840870532569, 'learning_rate': 9.041416320858804e-06, 'epoch': 0.22} 22%|██▏ | 7692/34278 [8:29:18<28:44:02, 3.89s/it] 22%|██▏ | 7693/34278 [8:29:24<33:16:24, 4.51s/it] {'loss': 0.1653, 'grad_norm': 0.8061525347883657, 'learning_rate': 9.041138136506183e-06, 'epoch': 0.22} 22%|██▏ | 7693/34278 [8:29:24<33:16:24, 4.51s/it] 22%|██▏ | 7694/34278 [8:29:28<33:04:25, 4.48s/it] {'loss': 0.1478, 'grad_norm': 0.688503899823051, 'learning_rate': 9.040859916075244e-06, 'epoch': 0.22} 22%|██▏ | 7694/34278 [8:29:28<33:04:25, 4.48s/it] 22%|██▏ | 7695/34278 [8:29:32<31:56:08, 4.32s/it] {'loss': 0.139, 'grad_norm': 0.9075490749612564, 'learning_rate': 9.040581659568472e-06, 'epoch': 0.22} 22%|██▏ | 7695/34278 [8:29:32<31:56:08, 4.32s/it] 22%|██▏ | 7696/34278 [8:29:35<29:00:04, 3.93s/it] {'loss': 0.1364, 'grad_norm': 0.6928728556934488, 'learning_rate': 9.040303366988353e-06, 'epoch': 0.22} 22%|██▏ | 7696/34278 
[8:29:35<29:00:04, 3.93s/it] 22%|██▏ | 7697/34278 [8:29:41<33:44:08, 4.57s/it] {'loss': 0.1437, 'grad_norm': 0.6761159505016567, 'learning_rate': 9.04002503833737e-06, 'epoch': 0.22} 22%|██▏ | 7697/34278 [8:29:41<33:44:08, 4.57s/it] 22%|██▏ | 7698/34278 [8:29:45<31:55:23, 4.32s/it] {'loss': 0.1497, 'grad_norm': 0.8425281803151635, 'learning_rate': 9.039746673618007e-06, 'epoch': 0.22} 22%|██▏ | 7698/34278 [8:29:45<31:55:23, 4.32s/it] 22%|██▏ | 7699/34278 [8:29:49<31:03:31, 4.21s/it] {'loss': 0.1809, 'grad_norm': 0.9503518187886093, 'learning_rate': 9.039468272832749e-06, 'epoch': 0.22} 22%|██▏ | 7699/34278 [8:29:49<31:03:31, 4.21s/it] 22%|██▏ | 7700/34278 [8:29:52<28:13:13, 3.82s/it] {'loss': 0.1531, 'grad_norm': 0.9150172811661464, 'learning_rate': 9.039189835984085e-06, 'epoch': 0.22} 22%|██▏ | 7700/34278 [8:29:52<28:13:13, 3.82s/it] 22%|██▏ | 7701/34278 [8:29:56<28:47:22, 3.90s/it] {'loss': 0.1634, 'grad_norm': 0.9210411178398046, 'learning_rate': 9.038911363074495e-06, 'epoch': 0.22} 22%|██▏ | 7701/34278 [8:29:56<28:47:22, 3.90s/it] 22%|██▏ | 7702/34278 [8:29:59<26:42:24, 3.62s/it] {'loss': 0.1898, 'grad_norm': 0.783002610336514, 'learning_rate': 9.038632854106473e-06, 'epoch': 0.22} 22%|██▏ | 7702/34278 [8:29:59<26:42:24, 3.62s/it] 22%|██▏ | 7703/34278 [8:30:04<29:30:53, 4.00s/it] {'loss': 0.1468, 'grad_norm': 0.7402063151144914, 'learning_rate': 9.038354309082498e-06, 'epoch': 0.22} 22%|██▏ | 7703/34278 [8:30:04<29:30:53, 4.00s/it] 22%|██▏ | 7704/34278 [8:30:08<29:22:49, 3.98s/it] {'loss': 0.1436, 'grad_norm': 0.7865639516467352, 'learning_rate': 9.038075728005061e-06, 'epoch': 0.22} 22%|██▏ | 7704/34278 [8:30:08<29:22:49, 3.98s/it] 22%|██▏ | 7705/34278 [8:30:11<28:24:25, 3.85s/it] {'loss': 0.1687, 'grad_norm': 0.8337676546211781, 'learning_rate': 9.037797110876645e-06, 'epoch': 0.22} 22%|██▏ | 7705/34278 [8:30:11<28:24:25, 3.85s/it] 22%|██▏ | 7706/34278 [8:30:14<26:36:07, 3.60s/it] {'loss': 0.145, 'grad_norm': 0.7662961401587532, 'learning_rate': 
9.037518457699744e-06, 'epoch': 0.22} 22%|██▏ | 7706/34278 [8:30:14<26:36:07, 3.60s/it] 22%|██▏ | 7707/34278 [8:30:17<25:08:55, 3.41s/it] {'loss': 0.1467, 'grad_norm': 1.0198900699113442, 'learning_rate': 9.03723976847684e-06, 'epoch': 0.22} 22%|██▏ | 7707/34278 [8:30:17<25:08:55, 3.41s/it] 22%|██▏ | 7708/34278 [8:30:20<24:31:19, 3.32s/it] {'loss': 0.1569, 'grad_norm': 0.6969708413849717, 'learning_rate': 9.036961043210424e-06, 'epoch': 0.22} 22%|██▏ | 7708/34278 [8:30:20<24:31:19, 3.32s/it] 22%|██▏ | 7709/34278 [8:30:26<28:55:18, 3.92s/it] {'loss': 0.1604, 'grad_norm': 0.7766844709857786, 'learning_rate': 9.036682281902984e-06, 'epoch': 0.22} 22%|██▏ | 7709/34278 [8:30:26<28:55:18, 3.92s/it] 22%|██▏ | 7710/34278 [8:30:29<27:14:21, 3.69s/it] {'loss': 0.1539, 'grad_norm': 0.8059386561736319, 'learning_rate': 9.036403484557005e-06, 'epoch': 0.22} 22%|██▏ | 7710/34278 [8:30:29<27:14:21, 3.69s/it] 22%|██▏ | 7711/34278 [8:30:32<25:44:41, 3.49s/it] {'loss': 0.1607, 'grad_norm': 0.865283687640427, 'learning_rate': 9.036124651174983e-06, 'epoch': 0.22} 22%|██▏ | 7711/34278 [8:30:32<25:44:41, 3.49s/it] 22%|██▏ | 7712/34278 [8:30:35<24:38:04, 3.34s/it] {'loss': 0.1407, 'grad_norm': 0.7670100636936602, 'learning_rate': 9.035845781759403e-06, 'epoch': 0.22} 22%|██▏ | 7712/34278 [8:30:35<24:38:04, 3.34s/it] 23%|██▎ | 7713/34278 [8:30:38<23:46:14, 3.22s/it] {'loss': 0.1828, 'grad_norm': 1.0289788523648595, 'learning_rate': 9.035566876312754e-06, 'epoch': 0.23} 23%|██▎ | 7713/34278 [8:30:38<23:46:14, 3.22s/it] 23%|██▎ | 7714/34278 [8:30:43<27:25:03, 3.72s/it] {'loss': 0.158, 'grad_norm': 0.8626691997769976, 'learning_rate': 9.035287934837529e-06, 'epoch': 0.23} 23%|██▎ | 7714/34278 [8:30:43<27:25:03, 3.72s/it] 23%|██▎ | 7715/34278 [8:30:48<30:49:47, 4.18s/it] {'loss': 0.1708, 'grad_norm': 0.9646817622704134, 'learning_rate': 9.035008957336215e-06, 'epoch': 0.23} 23%|██▎ | 7715/34278 [8:30:48<30:49:47, 4.18s/it] 23%|██▎ | 7716/34278 [8:30:51<29:04:52, 3.94s/it] {'loss': 0.1681, 
'grad_norm': 0.8368005802837244, 'learning_rate': 9.034729943811304e-06, 'epoch': 0.23} 23%|██▎ | 7716/34278 [8:30:51<29:04:52, 3.94s/it] 23%|██▎ | 7717/34278 [8:30:55<27:16:47, 3.70s/it] {'loss': 0.1519, 'grad_norm': 0.8280718769062629, 'learning_rate': 9.034450894265288e-06, 'epoch': 0.23} 23%|██▎ | 7717/34278 [8:30:55<27:16:47, 3.70s/it] 23%|██▎ | 7718/34278 [8:30:58<27:02:57, 3.67s/it] {'loss': 0.1667, 'grad_norm': 1.0617486656027086, 'learning_rate': 9.034171808700657e-06, 'epoch': 0.23} 23%|██▎ | 7718/34278 [8:30:58<27:02:57, 3.67s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 7719/34278 [8:31:04<32:28:24, 4.40s/it] {'loss': 0.1751, 'grad_norm': 0.912307137057485, 'learning_rate': 9.033892687119905e-06, 'epoch': 0.23} 23%|██▎ | 7719/34278 [8:31:04<32:28:24, 4.40s/it] 23%|██▎ | 7720/34278 [8:31:07<29:20:10, 3.98s/it] {'loss': 0.1835, 'grad_norm': 0.9032676593843462, 'learning_rate': 9.03361352952552e-06, 'epoch': 0.23} 23%|██▎ | 7720/34278 [8:31:07<29:20:10, 3.98s/it] 23%|██▎ | 7721/34278 [8:31:13<33:27:31, 4.54s/it] {'loss': 0.1802, 'grad_norm': 1.1243956755698408, 'learning_rate': 9.033334335919997e-06, 'epoch': 0.23} 23%|██▎ | 7721/34278 [8:31:13<33:27:31, 4.54s/it] 23%|██▎ | 7722/34278 [8:31:16<30:20:49, 4.11s/it] {'loss': 0.1652, 'grad_norm': 0.8308440720513559, 'learning_rate': 9.033055106305828e-06, 'epoch': 0.23} 23%|██▎ | 7722/34278 [8:31:16<30:20:49, 4.11s/it] 23%|██▎ | 7723/34278 [8:31:19<27:38:38, 3.75s/it] {'loss': 0.1764, 'grad_norm': 0.7229023191329865, 'learning_rate': 9.032775840685505e-06, 'epoch': 0.23} 23%|██▎ | 7723/34278 [8:31:19<27:38:38, 3.75s/it] 23%|██▎ | 7724/34278 [8:31:22<26:32:51, 3.60s/it] {'loss': 0.1576, 'grad_norm': 0.8759747801209383, 'learning_rate': 9.032496539061523e-06, 'epoch': 0.23} 23%|██▎ | 7724/34278 [8:31:22<26:32:51, 3.60s/it] 
23%|██▎ | 7725/34278 [8:31:26<26:42:24, 3.62s/it] {'loss': 0.1818, 'grad_norm': 0.8923359849564969, 'learning_rate': 9.032217201436374e-06, 'epoch': 0.23} 23%|██▎ | 7725/34278 [8:31:26<26:42:24, 3.62s/it] 23%|██▎ | 7726/34278 [8:31:30<27:20:00, 3.71s/it] {'loss': 0.1587, 'grad_norm': 0.7070430046591983, 'learning_rate': 9.031937827812552e-06, 'epoch': 0.23} 23%|██▎ | 7726/34278 [8:31:30<27:20:00, 3.71s/it] 23%|██▎ | 7727/34278 [8:31:33<25:29:38, 3.46s/it] {'loss': 0.1787, 'grad_norm': 0.7723221493766538, 'learning_rate': 9.031658418192553e-06, 'epoch': 0.23} 23%|██▎ | 7727/34278 [8:31:33<25:29:38, 3.46s/it] 23%|██▎ | 7728/34278 [8:31:37<26:33:57, 3.60s/it] {'loss': 0.1611, 'grad_norm': 0.9411068171141572, 'learning_rate': 9.031378972578867e-06, 'epoch': 0.23} 23%|██▎ | 7728/34278 [8:31:37<26:33:57, 3.60s/it] 23%|██▎ | 7729/34278 [8:31:40<25:06:23, 3.40s/it] {'loss': 0.1493, 'grad_norm': 0.7716916797419265, 'learning_rate': 9.031099490973996e-06, 'epoch': 0.23} 23%|██▎ | 7729/34278 [8:31:40<25:06:23, 3.40s/it] 23%|██▎ | 7730/34278 [8:31:43<24:58:20, 3.39s/it] {'loss': 0.1593, 'grad_norm': 0.7524008958908079, 'learning_rate': 9.030819973380429e-06, 'epoch': 0.23} 23%|██▎ | 7730/34278 [8:31:43<24:58:20, 3.39s/it] 23%|██▎ | 7731/34278 [8:31:47<26:58:40, 3.66s/it] {'loss': 0.1501, 'grad_norm': 0.9948539072114604, 'learning_rate': 9.030540419800664e-06, 'epoch': 0.23} 23%|██▎ | 7731/34278 [8:31:47<26:58:40, 3.66s/it] 23%|██▎ | 7732/34278 [8:31:51<27:48:49, 3.77s/it] {'loss': 0.1596, 'grad_norm': 0.7505041602274153, 'learning_rate': 9.030260830237195e-06, 'epoch': 0.23} 23%|██▎ | 7732/34278 [8:31:51<27:48:49, 3.77s/it] 23%|██▎ | 7733/34278 [8:31:55<26:55:16, 3.65s/it] {'loss': 0.15, 'grad_norm': 0.7834396248817043, 'learning_rate': 9.029981204692521e-06, 'epoch': 0.23} 23%|██▎ | 7733/34278 [8:31:55<26:55:16, 3.65s/it] 23%|██▎ | 7734/34278 [8:31:58<25:29:08, 3.46s/it] {'loss': 0.155, 'grad_norm': 0.9118209380807698, 'learning_rate': 9.029701543169136e-06, 'epoch': 0.23} 
23%|██▎ | 7734/34278 [8:31:58<25:29:08, 3.46s/it] 23%|██▎ | 7735/34278 [8:32:01<24:54:14, 3.38s/it] {'loss': 0.157, 'grad_norm': 0.7907698722510623, 'learning_rate': 9.029421845669537e-06, 'epoch': 0.23} 23%|██▎ | 7735/34278 [8:32:01<24:54:14, 3.38s/it] 23%|██▎ | 7736/34278 [8:32:04<25:14:46, 3.42s/it] {'loss': 0.1624, 'grad_norm': 0.9267610415957518, 'learning_rate': 9.029142112196224e-06, 'epoch': 0.23} 23%|██▎ | 7736/34278 [8:32:04<25:14:46, 3.42s/it] 23%|██▎ | 7737/34278 [8:32:08<25:26:53, 3.45s/it] {'loss': 0.1484, 'grad_norm': 0.7296141636833037, 'learning_rate': 9.02886234275169e-06, 'epoch': 0.23} 23%|██▎ | 7737/34278 [8:32:08<25:26:53, 3.45s/it] 23%|██▎ | 7738/34278 [8:32:11<24:41:46, 3.35s/it] {'loss': 0.1591, 'grad_norm': 0.7530974192188826, 'learning_rate': 9.028582537338434e-06, 'epoch': 0.23} 23%|██▎ | 7738/34278 [8:32:11<24:41:46, 3.35s/it] 23%|██▎ | 7739/34278 [8:32:15<26:51:30, 3.64s/it] {'loss': 0.1734, 'grad_norm': 0.854773736858789, 'learning_rate': 9.028302695958956e-06, 'epoch': 0.23} 23%|██▎ | 7739/34278 [8:32:15<26:51:30, 3.64s/it] 23%|██▎ | 7740/34278 [8:32:19<26:35:10, 3.61s/it] {'loss': 0.1708, 'grad_norm': 0.8437420622238747, 'learning_rate': 9.028022818615753e-06, 'epoch': 0.23} 23%|██▎ | 7740/34278 [8:32:19<26:35:10, 3.61s/it] 23%|██▎ | 7741/34278 [8:32:23<27:42:26, 3.76s/it] {'loss': 0.1676, 'grad_norm': 0.8625507996665871, 'learning_rate': 9.027742905311324e-06, 'epoch': 0.23} 23%|██▎ | 7741/34278 [8:32:23<27:42:26, 3.76s/it] 23%|██▎ | 7742/34278 [8:32:26<26:17:23, 3.57s/it] {'loss': 0.1985, 'grad_norm': 0.8691371780488519, 'learning_rate': 9.02746295604817e-06, 'epoch': 0.23} 23%|██▎ | 7742/34278 [8:32:26<26:17:23, 3.57s/it] 23%|██▎ | 7743/34278 [8:32:31<29:59:37, 4.07s/it] {'loss': 0.1826, 'grad_norm': 0.9278230265563486, 'learning_rate': 9.027182970828786e-06, 'epoch': 0.23} 23%|██▎ | 7743/34278 [8:32:31<29:59:37, 4.07s/it] 23%|██▎ | 7744/34278 [8:32:35<29:04:47, 3.95s/it] {'loss': 0.1448, 'grad_norm': 0.7856522119252362, 
'learning_rate': 9.026902949655673e-06, 'epoch': 0.23} 23%|██▎ | 7744/34278 [8:32:35<29:04:47, 3.95s/it] 23%|██▎ | 7745/34278 [8:32:38<27:53:53, 3.79s/it] {'loss': 0.1626, 'grad_norm': 0.9002523993168426, 'learning_rate': 9.026622892531333e-06, 'epoch': 0.23} 23%|██▎ | 7745/34278 [8:32:38<27:53:53, 3.79s/it] 23%|██▎ | 7746/34278 [8:32:41<25:43:05, 3.49s/it] {'loss': 0.1374, 'grad_norm': 0.7387181122476415, 'learning_rate': 9.026342799458265e-06, 'epoch': 0.23} 23%|██▎ | 7746/34278 [8:32:41<25:43:05, 3.49s/it] 23%|██▎ | 7747/34278 [8:32:44<25:04:09, 3.40s/it] {'loss': 0.1786, 'grad_norm': 1.1592372854476964, 'learning_rate': 9.026062670438969e-06, 'epoch': 0.23} 23%|██▎ | 7747/34278 [8:32:44<25:04:09, 3.40s/it] 23%|██▎ | 7748/34278 [8:32:48<24:30:42, 3.33s/it] {'loss': 0.1643, 'grad_norm': 0.8886203635134633, 'learning_rate': 9.025782505475947e-06, 'epoch': 0.23} 23%|██▎ | 7748/34278 [8:32:48<24:30:42, 3.33s/it] 23%|██▎ | 7749/34278 [8:32:51<23:39:18, 3.21s/it] {'loss': 0.1575, 'grad_norm': 0.8658568934398153, 'learning_rate': 9.0255023045717e-06, 'epoch': 0.23} 23%|██▎ | 7749/34278 [8:32:51<23:39:18, 3.21s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10614 > 8192). 
Running this sequence through the model will result in indexing errors 23%|██▎ | 7750/34278 [8:32:56<28:45:19, 3.90s/it] {'loss': 0.1712, 'grad_norm': 0.8613802901972075, 'learning_rate': 9.025222067728729e-06, 'epoch': 0.23} 23%|██▎ | 7750/34278 [8:32:56<28:45:19, 3.90s/it] 23%|██▎ | 7751/34278 [8:32:59<27:40:15, 3.76s/it] {'loss': 0.1639, 'grad_norm': 0.8466326761128169, 'learning_rate': 9.024941794949536e-06, 'epoch': 0.23} 23%|██▎ | 7751/34278 [8:33:00<27:40:15, 3.76s/it] 23%|██▎ | 7752/34278 [8:33:05<32:31:44, 4.41s/it] {'loss': 0.1487, 'grad_norm': 0.9454126268379517, 'learning_rate': 9.024661486236624e-06, 'epoch': 0.23} 23%|██▎ | 7752/34278 [8:33:05<32:31:44, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 7753/34278 [8:33:09<31:23:34, 4.26s/it] {'loss': 0.1684, 'grad_norm': 1.190906457612595, 'learning_rate': 9.024381141592495e-06, 'epoch': 0.23} 23%|██▎ | 7753/34278 [8:33:09<31:23:34, 4.26s/it] 23%|██▎ | 7754/34278 [8:33:13<28:59:43, 3.94s/it] {'loss': 0.159, 'grad_norm': 0.8217592094029972, 'learning_rate': 9.024100761019652e-06, 'epoch': 0.23} 23%|██▎ | 7754/34278 [8:33:13<28:59:43, 3.94s/it] 23%|██▎ | 7755/34278 [8:33:18<33:29:42, 4.55s/it] {'loss': 0.1481, 'grad_norm': 0.736328682964108, 'learning_rate': 9.023820344520597e-06, 'epoch': 0.23} 23%|██▎ | 7755/34278 [8:33:19<33:29:42, 4.55s/it] 23%|██▎ | 7756/34278 [8:33:21<29:30:45, 4.01s/it] {'loss': 0.161, 'grad_norm': 1.043382123408031, 'learning_rate': 9.023539892097837e-06, 'epoch': 0.23} 23%|██▎ | 7756/34278 [8:33:21<29:30:45, 4.01s/it] 23%|██▎ | 7757/34278 [8:33:25<29:46:47, 4.04s/it] {'loss': 0.1741, 'grad_norm': 0.7643653720367733, 'learning_rate': 9.02325940375387e-06, 'epoch': 0.23} 23%|██▎ | 7757/34278 [8:33:25<29:46:47, 4.04s/it] 23%|██▎ | 7758/34278 [8:33:29<27:47:54, 3.77s/it] {'loss': 0.1941, 
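The tokenizer warning above (10614 > 8192) means at least one sample exceeds the model's maximum sequence length, which can cause position-embedding indexing errors rather than a clean failure. A hedged sketch of a pre-filter that drops overlong samples while keeping the problem visible in the log; `count_tokens` is a hypothetical whitespace stand-in for the real tokenizer, and `max_len=8192` mirrors the limit in the warning:

```python
def count_tokens(text: str) -> int:
    # Hypothetical stand-in for a real subword tokenizer; a whitespace
    # split only approximates the true token count.
    return len(text.split())

def filter_overlong(samples, max_len=8192):
    """Drop samples whose token count exceeds the model context window,
    logging each drop so data problems stay visible in the training log."""
    kept = []
    for i, text in enumerate(samples):
        n = count_tokens(text)
        if n > max_len:
            print(f"Skipping sample {i}: {n} > {max_len} tokens")
            continue
        kept.append(text)
    return kept
```

Running such a filter once over the dataset manifest is usually cheaper than paying for truncation logic inside `__getitem__` on every epoch.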
'grad_norm': 0.8874120344038124, 'learning_rate': 9.022978879491207e-06, 'epoch': 0.23} 23%|██▎ | 7758/34278 [8:33:29<27:47:54, 3.77s/it] 23%|██▎ | 7759/34278 [8:33:32<26:48:06, 3.64s/it] {'loss': 0.1883, 'grad_norm': 0.831561926407437, 'learning_rate': 9.022698319312346e-06, 'epoch': 0.23} 23%|██▎ | 7759/34278 [8:33:32<26:48:06, 3.64s/it] 23%|██▎ | 7760/34278 [8:33:36<28:16:29, 3.84s/it] {'loss': 0.1562, 'grad_norm': 0.9981630667292514, 'learning_rate': 9.022417723219797e-06, 'epoch': 0.23} 23%|██▎ | 7760/34278 [8:33:36<28:16:29, 3.84s/it] 23%|██▎ | 7761/34278 [8:33:41<29:27:46, 4.00s/it] {'loss': 0.1651, 'grad_norm': 1.485055949321117, 'learning_rate': 9.02213709121606e-06, 'epoch': 0.23} 23%|██▎ | 7761/34278 [8:33:41<29:27:46, 4.00s/it] 23%|██▎ | 7762/34278 [8:33:45<29:26:24, 4.00s/it] {'loss': 0.1634, 'grad_norm': 1.0950257317704974, 'learning_rate': 9.021856423303645e-06, 'epoch': 0.23} 23%|██▎ | 7762/34278 [8:33:45<29:26:24, 4.00s/it] 23%|██▎ | 7763/34278 [8:33:48<27:18:44, 3.71s/it] {'loss': 0.1706, 'grad_norm': 1.3654749199431029, 'learning_rate': 9.021575719485056e-06, 'epoch': 0.23} 23%|██▎ | 7763/34278 [8:33:48<27:18:44, 3.71s/it] 23%|██▎ | 7764/34278 [8:33:50<25:10:22, 3.42s/it] {'loss': 0.1572, 'grad_norm': 1.133425702358128, 'learning_rate': 9.0212949797628e-06, 'epoch': 0.23} 23%|██▎ | 7764/34278 [8:33:50<25:10:22, 3.42s/it] 23%|██▎ | 7765/34278 [8:33:56<30:29:16, 4.14s/it] {'loss': 0.1636, 'grad_norm': 0.8283512805983546, 'learning_rate': 9.02101420413938e-06, 'epoch': 0.23} 23%|██▎ | 7765/34278 [8:33:56<30:29:16, 4.14s/it] 23%|██▎ | 7766/34278 [8:34:07<45:58:03, 6.24s/it] {'loss': 0.1703, 'grad_norm': 1.1206211136306208, 'learning_rate': 9.020733392617306e-06, 'epoch': 0.23} 23%|██▎ | 7766/34278 [8:34:07<45:58:03, 6.24s/it] 23%|██▎ | 7767/34278 [8:34:11<39:34:17, 5.37s/it] {'loss': 0.1867, 'grad_norm': 0.9820767132312471, 'learning_rate': 9.020452545199084e-06, 'epoch': 0.23} 23%|██▎ | 7767/34278 [8:34:11<39:34:17, 5.37s/it] 23%|██▎ | 7768/34278 
[8:34:15<37:18:55, 5.07s/it] {'loss': 0.1444, 'grad_norm': 0.7701110439822542, 'learning_rate': 9.020171661887223e-06, 'epoch': 0.23} 23%|██▎ | 7768/34278 [8:34:15<37:18:55, 5.07s/it] 23%|██▎ | 7769/34278 [8:34:19<34:19:08, 4.66s/it] {'loss': 0.1785, 'grad_norm': 0.8374307375015951, 'learning_rate': 9.019890742684227e-06, 'epoch': 0.23} 23%|██▎ | 7769/34278 [8:34:19<34:19:08, 4.66s/it] 23%|██▎ | 7770/34278 [8:34:22<32:17:06, 4.38s/it] {'loss': 0.183, 'grad_norm': 1.0855926879485878, 'learning_rate': 9.019609787592607e-06, 'epoch': 0.23} 23%|██▎ | 7770/34278 [8:34:22<32:17:06, 4.38s/it] 23%|██▎ | 7771/34278 [8:34:29<36:24:30, 4.94s/it] {'loss': 0.1556, 'grad_norm': 0.881521948980321, 'learning_rate': 9.01932879661487e-06, 'epoch': 0.23} 23%|██▎ | 7771/34278 [8:34:29<36:24:30, 4.94s/it] 23%|██▎ | 7772/34278 [8:34:31<31:44:51, 4.31s/it] {'loss': 0.1902, 'grad_norm': 0.8624881531492817, 'learning_rate': 9.019047769753527e-06, 'epoch': 0.23} 23%|██▎ | 7772/34278 [8:34:32<31:44:51, 4.31s/it] 23%|██▎ | 7773/34278 [8:34:45<53:07:18, 7.22s/it] {'loss': 0.1273, 'grad_norm': 0.765844828474478, 'learning_rate': 9.018766707011082e-06, 'epoch': 0.23} 23%|██▎ | 7773/34278 [8:34:45<53:07:18, 7.22s/it] 23%|██▎ | 7774/34278 [8:34:48<43:08:44, 5.86s/it] {'loss': 0.1399, 'grad_norm': 0.9658090031841545, 'learning_rate': 9.018485608390048e-06, 'epoch': 0.23} 23%|██▎ | 7774/34278 [8:34:48<43:08:44, 5.86s/it] 23%|██▎ | 7775/34278 [8:35:03<62:59:29, 8.56s/it] {'loss': 0.1321, 'grad_norm': 0.7638279854341149, 'learning_rate': 9.018204473892935e-06, 'epoch': 0.23} 23%|██▎ | 7775/34278 [8:35:03<62:59:29, 8.56s/it] 23%|██▎ | 7776/34278 [8:35:08<55:22:39, 7.52s/it] {'loss': 0.1457, 'grad_norm': 0.7406486751853761, 'learning_rate': 9.017923303522251e-06, 'epoch': 0.23} 23%|██▎ | 7776/34278 [8:35:08<55:22:39, 7.52s/it] 23%|██▎ | 7777/34278 [8:35:22<69:02:52, 9.38s/it] {'loss': 0.1568, 'grad_norm': 0.7685794357804425, 'learning_rate': 9.017642097280506e-06, 'epoch': 0.23} 23%|██▎ | 7777/34278 
[8:35:22<69:02:52, 9.38s/it] 23%|██▎ | 7778/34278 [8:35:37<80:58:27, 11.00s/it] {'loss': 0.1365, 'grad_norm': 0.7109188143174116, 'learning_rate': 9.017360855170212e-06, 'epoch': 0.23} 23%|██▎ | 7778/34278 [8:35:37<80:58:27, 11.00s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f16a37de750>
Failed to fetch sample 2735738.
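Note that the UnidentifiedImageError above is non-fatal: the dataset logs "Failed to fetch sample 2735738." and the step counter keeps advancing, which implies `__getitem__` catches decode failures and retries with a different index. A minimal sketch of that skip-and-retry pattern; the class name `RobustDataset`, the `loader` callable, and the fall-back-to-next-index policy are illustrative assumptions, not the actual aguvis code:

```python
class RobustDataset:
    """Dataset wrapper that skips records whose payload cannot be decoded.

    `loader` stands in for the real decode path (e.g. PIL's Image.open over
    bytes fetched from Ceph) and may raise on a corrupt record.
    """

    def __init__(self, records, loader, max_retries=10):
        self.records = records
        self.loader = loader
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.loader(self.records[i])
            except Exception as exc:
                # Mirrors the log's "Failed to fetch sample ... Exception: ..." pair.
                print(f"Failed to fetch sample {i}.")
                print(f"Exception: {exc}")
                i = (i + 1) % len(self.records)  # fall back to a neighboring index
        raise RuntimeError(f"gave up after {self.max_retries} corrupt samples")


def demo_loader(record):
    # Hypothetical decoder: fails on a marker value, "decodes" everything else.
    if record == "corrupt":
        raise ValueError("cannot identify image file")
    return record.upper()


ds = RobustDataset(["ok", "corrupt", "fine"], demo_loader)
```

The real implementation may resample a random index instead of stepping to the next one; the key property either way is that a single bad image costs one warning rather than killing the run.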
Exception: cannot identify image file <_io.BytesIO object at 0x7f16a37de750> 23%|██▎ | 7779/34278 [8:35:48<81:18:08, 11.05s/it] {'loss': 0.1551, 'grad_norm': 0.8266958434797916, 'learning_rate': 9.01707957719388e-06, 'epoch': 0.23} 23%|██▎ | 7779/34278 [8:35:48<81:18:08, 11.05s/it] 23%|██▎ | 7780/34278 [8:36:04<92:30:19, 12.57s/it] {'loss': 0.1662, 'grad_norm': 0.8098626300968494, 'learning_rate': 9.01679826335402e-06, 'epoch': 0.23} 23%|██▎ | 7780/34278 [8:36:04<92:30:19, 12.57s/it] 23%|██▎ | 7781/34278 [8:36:15<89:00:50, 12.09s/it] {'loss': 0.1809, 'grad_norm': 0.987600792423462, 'learning_rate': 9.016516913653144e-06, 'epoch': 0.23} 23%|██▎ | 7781/34278 [8:36:15<89:00:50, 12.09s/it] 23%|██▎ | 7782/34278 [8:36:30<95:58:44, 13.04s/it] {'loss': 0.1485, 'grad_norm': 0.7391288497416076, 'learning_rate': 9.016235528093764e-06, 'epoch': 0.23} 23%|██▎ | 7782/34278 [8:36:30<95:58:44, 13.04s/it] 23%|██▎ | 7783/34278 [8:36:46<101:23:21, 13.78s/it] {'loss': 0.1455, 'grad_norm': 0.7452053286443837, 'learning_rate': 9.015954106678391e-06, 'epoch': 0.23} 23%|██▎ | 7783/34278 [8:36:46<101:23:21, 13.78s/it] 23%|██▎ | 7784/34278 [8:37:18<141:55:57, 19.29s/it] {'loss': 0.179, 'grad_norm': 0.8935608228872053, 'learning_rate': 9.01567264940954e-06, 'epoch': 0.23} 23%|██▎ | 7784/34278 [8:37:18<141:55:57, 19.29s/it] 23%|██▎ | 7785/34278 [8:37:21<107:11:54, 14.57s/it] {'loss': 0.1481, 'grad_norm': 0.6328135737272962, 'learning_rate': 9.01539115628972e-06, 'epoch': 0.23} 23%|██▎ | 7785/34278 [8:37:21<107:11:54, 14.57s/it] 23%|██▎ | 7786/34278 [8:37:38<112:26:10, 15.28s/it] {'loss': 0.1723, 'grad_norm': 0.8840860145091277, 'learning_rate': 9.01510962732145e-06, 'epoch': 0.23} 23%|██▎ | 7786/34278 [8:37:38<112:26:10, 15.28s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7787/34278 [8:37:49<102:36:33, 13.94s/it] {'loss': 0.1682, 'grad_norm': 0.867258923074141, 'learning_rate': 9.014828062507237e-06, 'epoch': 0.23} 23%|██▎ | 7787/34278 [8:37:49<102:36:33, 13.94s/it] 23%|██▎ | 7788/34278 [8:38:22<145:03:58, 19.71s/it] {'loss': 0.1663, 'grad_norm': 0.7557198857562272, 'learning_rate': 9.014546461849597e-06, 'epoch': 0.23} 23%|██▎ | 7788/34278 [8:38:22<145:03:58, 19.71s/it] 23%|██▎ | 7789/34278 [8:38:49<160:27:32, 21.81s/it] {'loss': 0.1603, 'grad_norm': 0.9233793516447816, 'learning_rate': 9.014264825351046e-06, 'epoch': 0.23} 23%|██▎ | 7789/34278 [8:38:49<160:27:32, 21.81s/it] 23%|██▎ | 7790/34278 [8:39:00<137:11:37, 18.65s/it] {'loss': 0.1673, 'grad_norm': 0.8904760694334602, 'learning_rate': 9.013983153014097e-06, 'epoch': 0.23} 23%|██▎ | 7790/34278 [8:39:00<137:11:37, 18.65s/it] 23%|██▎ | 7791/34278 [8:39:32<166:23:41, 22.62s/it] {'loss': 0.1546, 'grad_norm': 0.9100790280131235, 'learning_rate': 9.013701444841262e-06, 'epoch': 0.23} 23%|██▎ | 7791/34278 [8:39:32<166:23:41, 22.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7792/34278 [8:39:44<142:40:47, 19.39s/it] {'loss': 0.1516, 'grad_norm': 0.825827433111236, 'learning_rate': 9.013419700835062e-06, 'epoch': 0.23} 23%|██▎ | 7792/34278 [8:39:44<142:40:47, 19.39s/it] 23%|██▎ | 7793/34278 [8:40:19<176:07:13, 23.94s/it] {'loss': 0.1378, 'grad_norm': 0.714287197207715, 'learning_rate': 9.013137920998007e-06, 'epoch': 0.23} 23%|██▎ | 7793/34278 [8:40:19<176:07:13, 23.94s/it] 23%|██▎ | 7794/34278 [8:40:22<130:23:34, 17.72s/it] {'loss': 0.1691, 'grad_norm': 1.1462027908735675, 'learning_rate': 9.012856105332615e-06, 'epoch': 0.23} 23%|██▎ | 7794/34278 [8:40:22<130:23:34, 17.72s/it] 23%|██▎ | 7795/34278 [8:40:52<156:55:14, 21.33s/it] {'loss': 0.1668, 'grad_norm': 1.0480933585120638, 'learning_rate': 9.012574253841401e-06, 'epoch': 0.23} 23%|██▎ | 7795/34278 [8:40:52<156:55:14, 21.33s/it] 23%|██▎ | 7796/34278 [8:41:07<144:01:23, 19.58s/it] {'loss': 0.1717, 'grad_norm': 0.8537650131964035, 'learning_rate': 9.012292366526884e-06, 'epoch': 0.23} 23%|██▎ | 7796/34278 [8:41:07<144:01:23, 19.58s/it] 23%|██▎ | 7797/34278 [8:41:24<137:18:48, 18.67s/it] {'loss': 0.1705, 'grad_norm': 0.9896746025215346, 'learning_rate': 9.012010443391578e-06, 'epoch': 0.23} 23%|██▎ | 7797/34278 [8:41:24<137:18:48, 18.67s/it] 23%|██▎ | 7798/34278 [8:41:48<149:44:39, 20.36s/it] {'loss': 0.1754, 'grad_norm': 0.9716721601903818, 'learning_rate': 9.011728484438e-06, 'epoch': 0.23} 23%|██▎ | 7798/34278 [8:41:48<149:44:39, 20.36s/it] 23%|██▎ | 7799/34278 [8:41:51<111:47:49, 15.20s/it] {'loss': 0.1485, 'grad_norm': 0.840118546389893, 'learning_rate': 9.011446489668667e-06, 'epoch': 0.23} 23%|██▎ | 7799/34278 [8:41:51<111:47:49, 15.20s/it] 23%|██▎ | 7800/34278 [8:42:02<103:22:26, 14.05s/it] {'loss': 0.1449, 'grad_norm': 0.7677962194515048, 'learning_rate': 9.011164459086099e-06, 'epoch': 0.23} 23%|██▎ | 7800/34278 [8:42:02<103:22:26, 14.05s/it] 23%|██▎ | 7801/34278 [8:42:17<105:09:50, 14.30s/it] {'loss': 0.1818, 'grad_norm': 
0.9554203066954745, 'learning_rate': 9.010882392692812e-06, 'epoch': 0.23} 23%|██▎ | 7801/34278 [8:42:17<105:09:50, 14.30s/it] 23%|██▎ | 7802/34278 [8:42:32<105:07:48, 14.29s/it] {'loss': 0.1599, 'grad_norm': 1.0770749099152734, 'learning_rate': 9.010600290491323e-06, 'epoch': 0.23} 23%|██▎ | 7802/34278 [8:42:32<105:07:48, 14.29s/it] 23%|██▎ | 7803/34278 [8:42:35<80:57:16, 11.01s/it] {'loss': 0.1906, 'grad_norm': 0.7743675527840735, 'learning_rate': 9.010318152484152e-06, 'epoch': 0.23} 23%|██▎ | 7803/34278 [8:42:35<80:57:16, 11.01s/it] 23%|██▎ | 7804/34278 [8:42:41<69:41:14, 9.48s/it] {'loss': 0.1739, 'grad_norm': 1.1281114450625802, 'learning_rate': 9.01003597867382e-06, 'epoch': 0.23} 23%|██▎ | 7804/34278 [8:42:41<69:41:14, 9.48s/it] 23%|██▎ | 7805/34278 [8:42:44<55:29:03, 7.55s/it] {'loss': 0.1616, 'grad_norm': 1.1674191060660375, 'learning_rate': 9.00975376906284e-06, 'epoch': 0.23} 23%|██▎ | 7805/34278 [8:42:44<55:29:03, 7.55s/it] 23%|██▎ | 7806/34278 [8:42:56<64:45:34, 8.81s/it] {'loss': 0.1633, 'grad_norm': 0.9705020625119505, 'learning_rate': 9.009471523653742e-06, 'epoch': 0.23} 23%|██▎ | 7806/34278 [8:42:56<64:45:34, 8.81s/it] 23%|██▎ | 7807/34278 [8:43:07<70:00:33, 9.52s/it] {'loss': 0.1524, 'grad_norm': 0.7453195533992221, 'learning_rate': 9.009189242449034e-06, 'epoch': 0.23} 23%|██▎ | 7807/34278 [8:43:07<70:00:33, 9.52s/it] 23%|██▎ | 7808/34278 [8:43:12<61:09:01, 8.32s/it] {'loss': 0.1605, 'grad_norm': 0.8832157465935807, 'learning_rate': 9.008906925451243e-06, 'epoch': 0.23} 23%|██▎ | 7808/34278 [8:43:12<61:09:01, 8.32s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7809/34278 [8:43:18<55:12:01, 7.51s/it] {'loss': 0.181, 'grad_norm': 1.1004881089961562, 'learning_rate': 9.008624572662888e-06, 'epoch': 0.23} 23%|██▎ | 7809/34278 [8:43:18<55:12:01, 7.51s/it] 23%|██▎ | 7810/34278 [8:43:21<45:29:36, 6.19s/it] {'loss': 0.1729, 'grad_norm': 0.7527910908048386, 'learning_rate': 9.00834218408649e-06, 'epoch': 0.23} 23%|██▎ | 7810/34278 [8:43:21<45:29:36, 6.19s/it] 23%|██▎ | 7811/34278 [8:43:46<87:53:58, 11.96s/it] {'loss': 0.1585, 'grad_norm': 0.7625762769082549, 'learning_rate': 9.00805975972457e-06, 'epoch': 0.23} 23%|██▎ | 7811/34278 [8:43:46<87:53:58, 11.96s/it] 23%|██▎ | 7812/34278 [8:43:50<70:03:52, 9.53s/it] {'loss': 0.1595, 'grad_norm': 0.9127263062150434, 'learning_rate': 9.007777299579649e-06, 'epoch': 0.23} 23%|██▎ | 7812/34278 [8:43:50<70:03:52, 9.53s/it] 23%|██▎ | 7813/34278 [8:44:17<107:11:27, 14.58s/it] {'loss': 0.1672, 'grad_norm': 0.8264397871512847, 'learning_rate': 9.007494803654249e-06, 'epoch': 0.23} 23%|██▎ | 7813/34278 [8:44:17<107:11:27, 14.58s/it] 23%|██▎ | 7814/34278 [8:44:19<81:03:27, 11.03s/it] {'loss': 0.1555, 'grad_norm': 0.9024746727207784, 'learning_rate': 9.007212271950892e-06, 'epoch': 0.23} 23%|██▎ | 7814/34278 [8:44:19<81:03:27, 11.03s/it] 23%|██▎ | 7815/34278 [8:44:22<63:21:33, 8.62s/it] {'loss': 0.1411, 'grad_norm': 0.7173904426501031, 'learning_rate': 9.006929704472101e-06, 'epoch': 0.23} 23%|██▎ | 7815/34278 [8:44:22<63:21:33, 8.62s/it] 23%|██▎ | 7816/34278 [8:44:26<51:18:08, 6.98s/it] {'loss': 0.1432, 'grad_norm': 0.7899023405750852, 'learning_rate': 9.006647101220398e-06, 'epoch': 0.23} 23%|██▎ | 7816/34278 [8:44:26<51:18:08, 6.98s/it] 23%|██▎ | 7817/34278 [8:44:29<42:53:54, 5.84s/it] {'loss': 0.1841, 'grad_norm': 1.022143416772875, 'learning_rate': 9.006364462198306e-06, 'epoch': 0.23} 23%|██▎ | 7817/34278 [8:44:29<42:53:54, 5.84s/it] 23%|██▎ | 7818/34278 [8:44:32<37:01:13, 5.04s/it] {'loss': 0.1545, 'grad_norm': 0.9567311761695345, 
'learning_rate': 9.006081787408348e-06, 'epoch': 0.23} 23%|██▎ | 7818/34278 [8:44:32<37:01:13, 5.04s/it] 23%|██▎ | 7819/34278 [8:44:35<33:06:46, 4.51s/it] {'loss': 0.1541, 'grad_norm': 0.9918120469913401, 'learning_rate': 9.005799076853048e-06, 'epoch': 0.23} 23%|██▎ | 7819/34278 [8:44:35<33:06:46, 4.51s/it] 23%|██▎ | 7820/34278 [8:44:39<32:36:28, 4.44s/it] {'loss': 0.1706, 'grad_norm': 0.9430094332135609, 'learning_rate': 9.00551633053493e-06, 'epoch': 0.23} 23%|██▎ | 7820/34278 [8:44:39<32:36:28, 4.44s/it] 23%|██▎ | 7821/34278 [8:44:42<28:45:57, 3.91s/it] {'loss': 0.1478, 'grad_norm': 0.8282651104457783, 'learning_rate': 9.005233548456518e-06, 'epoch': 0.23} 23%|██▎ | 7821/34278 [8:44:42<28:45:57, 3.91s/it] 23%|██▎ | 7822/34278 [8:44:46<29:37:51, 4.03s/it] {'loss': 0.1705, 'grad_norm': 1.0210830831906763, 'learning_rate': 9.004950730620338e-06, 'epoch': 0.23} 23%|██▎ | 7822/34278 [8:44:46<29:37:51, 4.03s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7823/34278 [8:44:50<28:31:26, 3.88s/it] {'loss': 0.1779, 'grad_norm': 0.9474757314337756, 'learning_rate': 9.004667877028915e-06, 'epoch': 0.23} 23%|██▎ | 7823/34278 [8:44:50<28:31:26, 3.88s/it] 23%|██▎ | 7824/34278 [8:44:55<31:05:38, 4.23s/it] {'loss': 0.1598, 'grad_norm': 0.8493551949810291, 'learning_rate': 9.004384987684771e-06, 'epoch': 0.23} 23%|██▎ | 7824/34278 [8:44:55<31:05:38, 4.23s/it] 23%|██▎ | 7825/34278 [8:44:58<29:04:30, 3.96s/it] {'loss': 0.1329, 'grad_norm': 0.9577551249206481, 'learning_rate': 9.004102062590437e-06, 'epoch': 0.23} 23%|██▎ | 7825/34278 [8:44:58<29:04:30, 3.96s/it] 23%|██▎ | 7826/34278 [8:45:03<29:57:14, 4.08s/it] {'loss': 0.1712, 'grad_norm': 0.9244488280811329, 'learning_rate': 9.003819101748432e-06, 'epoch': 0.23} 23%|██▎ | 7826/34278 [8:45:03<29:57:14, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 7827/34278 [8:45:05<27:10:46, 3.70s/it] {'loss': 0.1776, 'grad_norm': 0.9143973487115579, 'learning_rate': 9.003536105161288e-06, 'epoch': 0.23} 23%|██▎ | 7827/34278 [8:45:06<27:10:46, 3.70s/it] 23%|██▎ | 7828/34278 [8:45:09<26:48:23, 3.65s/it] {'loss': 0.1707, 'grad_norm': 0.9323695028120131, 'learning_rate': 9.003253072831529e-06, 'epoch': 0.23} 23%|██▎ | 7828/34278 [8:45:09<26:48:23, 3.65s/it] 23%|██▎ | 7829/34278 [8:45:14<30:08:10, 4.10s/it] {'loss': 0.1712, 'grad_norm': 1.087380405820929, 'learning_rate': 9.00297000476168e-06, 'epoch': 0.23} 23%|██▎ | 7829/34278 [8:45:14<30:08:10, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7830/34278 [8:45:17<27:52:48, 3.79s/it] {'loss': 0.1462, 'grad_norm': 0.7641626718912142, 'learning_rate': 9.002686900954275e-06, 'epoch': 0.23} 23%|██▎ | 7830/34278 [8:45:17<27:52:48, 3.79s/it] 23%|██▎ | 7831/34278 [8:45:20<25:41:50, 3.50s/it] {'loss': 0.1595, 'grad_norm': 0.8528325098822728, 'learning_rate': 9.002403761411832e-06, 'epoch': 0.23} 23%|██▎ | 7831/34278 [8:45:20<25:41:50, 3.50s/it] 23%|██▎ | 7832/34278 [8:45:23<24:33:16, 3.34s/it] {'loss': 0.1678, 'grad_norm': 0.9642626444010138, 'learning_rate': 9.002120586136887e-06, 'epoch': 0.23} 23%|██▎ | 7832/34278 [8:45:23<24:33:16, 3.34s/it] 23%|██▎ | 7833/34278 [8:45:26<24:05:40, 3.28s/it] {'loss': 0.1442, 'grad_norm': 0.9639017949911085, 'learning_rate': 9.001837375131963e-06, 'epoch': 0.23} 23%|██▎ | 7833/34278 [8:45:26<24:05:40, 3.28s/it] 23%|██▎ | 7834/34278 [8:45:29<23:35:49, 3.21s/it] {'loss': 0.1438, 'grad_norm': 0.8683937244315321, 'learning_rate': 9.00155412839959e-06, 'epoch': 0.23} 23%|██▎ | 7834/34278 [8:45:29<23:35:49, 3.21s/it] 23%|██▎ | 7835/34278 [8:45:33<23:58:27, 3.26s/it] {'loss': 0.1417, 'grad_norm': 0.7520997942617577, 'learning_rate': 9.001270845942298e-06, 'epoch': 0.23} 23%|██▎ | 7835/34278 [8:45:33<23:58:27, 3.26s/it] 23%|██▎ | 7836/34278 [8:45:37<26:23:42, 3.59s/it] {'loss': 0.1924, 'grad_norm': 0.9781807377410655, 'learning_rate': 9.000987527762614e-06, 'epoch': 0.23} 23%|██▎ | 7836/34278 [8:45:37<26:23:42, 3.59s/it] 23%|██▎ | 7837/34278 [8:45:40<24:57:00, 3.40s/it] {'loss': 0.1753, 'grad_norm': 0.839190644885023, 'learning_rate': 9.000704173863071e-06, 'epoch': 0.23} 23%|██▎ | 7837/34278 [8:45:40<24:57:00, 3.40s/it] 23%|██▎ | 7838/34278 [8:45:46<30:53:30, 4.21s/it] {'loss': 0.1635, 'grad_norm': 0.6885528433710029, 'learning_rate': 9.000420784246194e-06, 'epoch': 0.23} 23%|██▎ | 7838/34278 [8:45:46<30:53:30, 
4.21s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 7839/34278 [8:45:49<28:27:27, 3.87s/it] {'loss': 0.1565, 'grad_norm': 0.7794359405780938, 'learning_rate': 9.000137358914516e-06, 'epoch': 0.23} 23%|██▎ | 7839/34278 [8:45:49<28:27:27, 3.87s/it] 23%|██▎ | 7840/34278 [8:45:53<27:37:45, 3.76s/it] {'loss': 0.1522, 'grad_norm': 0.7247611656491247, 'learning_rate': 8.999853897870565e-06, 'epoch': 0.23} 23%|██▎ | 7840/34278 [8:45:53<27:37:45, 3.76s/it] 23%|██▎ | 7841/34278 [8:45:56<26:14:51, 3.57s/it] {'loss': 0.1601, 'grad_norm': 1.0571012770404538, 'learning_rate': 8.999570401116874e-06, 'epoch': 0.23} 23%|██▎ | 7841/34278 [8:45:56<26:14:51, 3.57s/it] 23%|██▎ | 7842/34278 [8:46:02<31:54:21, 4.34s/it] {'loss': 0.1723, 'grad_norm': 0.9208776416346971, 'learning_rate': 8.999286868655974e-06, 'epoch': 0.23} 23%|██▎ | 7842/34278 [8:46:02<31:54:21, 4.34s/it] 23%|██▎ | 7843/34278 [8:46:05<29:17:23, 3.99s/it] {'loss': 0.15, 'grad_norm': 0.8248710677339088, 'learning_rate': 8.999003300490396e-06, 'epoch': 0.23} 23%|██▎ | 7843/34278 [8:46:05<29:17:23, 3.99s/it] 23%|██▎ | 7844/34278 [8:46:11<33:38:36, 4.58s/it] {'loss': 0.1621, 'grad_norm': 0.7639701251864959, 'learning_rate': 8.99871969662267e-06, 'epoch': 0.23} 23%|██▎ | 7844/34278 [8:46:11<33:38:36, 4.58s/it] 23%|██▎ | 7845/34278 [8:46:14<29:36:41, 4.03s/it] {'loss': 0.1783, 'grad_norm': 1.0376494613981613, 'learning_rate': 8.998436057055332e-06, 'epoch': 0.23} 23%|██▎ | 7845/34278 [8:46:14<29:36:41, 4.03s/it] 23%|██▎ | 7846/34278 [8:46:17<28:16:07, 3.85s/it] {'loss': 0.1491, 'grad_norm': 0.7757586736623177, 'learning_rate': 8.998152381790907e-06, 'epoch': 0.23} 23%|██▎ | 7846/34278 [8:46:17<28:16:07, 3.85s/it] 23%|██▎ | 7847/34278 [8:46:21<27:26:52, 3.74s/it] {'loss': 0.1548, 'grad_norm': 0.8881310885196493, 'learning_rate': 
8.997868670831935e-06, 'epoch': 0.23} 23%|██▎ | 7847/34278 [8:46:21<27:26:52, 3.74s/it] 23%|██▎ | 7848/34278 [8:46:27<33:26:17, 4.55s/it] {'loss': 0.1647, 'grad_norm': 0.7430788008706929, 'learning_rate': 8.997584924180945e-06, 'epoch': 0.23} 23%|██▎ | 7848/34278 [8:46:27<33:26:17, 4.55s/it] 23%|██▎ | 7849/34278 [8:46:30<30:11:32, 4.11s/it] {'loss': 0.1621, 'grad_norm': 0.8635293715316253, 'learning_rate': 8.99730114184047e-06, 'epoch': 0.23} 23%|██▎ | 7849/34278 [8:46:30<30:11:32, 4.11s/it] 23%|██▎ | 7850/34278 [8:46:34<29:04:52, 3.96s/it] {'loss': 0.1434, 'grad_norm': 0.78795988793354, 'learning_rate': 8.997017323813046e-06, 'epoch': 0.23} 23%|██▎ | 7850/34278 [8:46:34<29:04:52, 3.96s/it] 23%|██▎ | 7851/34278 [8:46:39<31:57:42, 4.35s/it] {'loss': 0.1809, 'grad_norm': 0.7522582486832059, 'learning_rate': 8.996733470101204e-06, 'epoch': 0.23} 23%|██▎ | 7851/34278 [8:46:39<31:57:42, 4.35s/it] 23%|██▎ | 7852/34278 [8:46:42<29:31:22, 4.02s/it] {'loss': 0.1564, 'grad_norm': 0.731333929964769, 'learning_rate': 8.99644958070748e-06, 'epoch': 0.23} 23%|██▎ | 7852/34278 [8:46:42<29:31:22, 4.02s/it] 23%|██▎ | 7853/34278 [8:46:45<27:33:54, 3.76s/it] {'loss': 0.1587, 'grad_norm': 0.8006664639301642, 'learning_rate': 8.99616565563441e-06, 'epoch': 0.23} 23%|██▎ | 7853/34278 [8:46:45<27:33:54, 3.76s/it] 23%|██▎ | 7854/34278 [8:46:49<26:54:21, 3.67s/it] {'loss': 0.1699, 'grad_norm': 0.7840244405115172, 'learning_rate': 8.995881694884526e-06, 'epoch': 0.23} 23%|██▎ | 7854/34278 [8:46:49<26:54:21, 3.67s/it] 23%|██▎ | 7855/34278 [8:46:52<25:19:22, 3.45s/it] {'loss': 0.144, 'grad_norm': 0.7834716981915015, 'learning_rate': 8.995597698460364e-06, 'epoch': 0.23} 23%|██▎ | 7855/34278 [8:46:52<25:19:22, 3.45s/it] 23%|██▎ | 7856/34278 [8:46:57<28:56:21, 3.94s/it] {'loss': 0.1868, 'grad_norm': 0.7539945566652915, 'learning_rate': 8.99531366636446e-06, 'epoch': 0.23} 23%|██▎ | 7856/34278 [8:46:57<28:56:21, 3.94s/it] 23%|██▎ | 7857/34278 [8:47:01<29:59:22, 4.09s/it] {'loss': 0.1511, 
'grad_norm': 0.8070913680316629, 'learning_rate': 8.99502959859935e-06, 'epoch': 0.23} 23%|██▎ | 7857/34278 [8:47:01<29:59:22, 4.09s/it] 23%|██▎ | 7858/34278 [8:47:04<27:42:22, 3.78s/it] {'loss': 0.1549, 'grad_norm': 0.7146344547838658, 'learning_rate': 8.994745495167567e-06, 'epoch': 0.23} 23%|██▎ | 7858/34278 [8:47:04<27:42:22, 3.78s/it] 23%|██▎ | 7859/34278 [8:47:09<30:15:24, 4.12s/it] {'loss': 0.1788, 'grad_norm': 0.8230625572382745, 'learning_rate': 8.994461356071651e-06, 'epoch': 0.23} 23%|██▎ | 7859/34278 [8:47:09<30:15:24, 4.12s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 7860/34278 [8:47:12<27:12:11, 3.71s/it] {'loss': 0.1405, 'grad_norm': 0.9757606580104359, 'learning_rate': 8.99417718131414e-06, 'epoch': 0.23} 23%|██▎ | 7860/34278 [8:47:12<27:12:11, 3.71s/it] 23%|██▎ | 7861/34278 [8:47:15<26:10:02, 3.57s/it] {'loss': 0.1302, 'grad_norm': 0.7860021861901404, 'learning_rate': 8.993892970897564e-06, 'epoch': 0.23} 23%|██▎ | 7861/34278 [8:47:15<26:10:02, 3.57s/it] 23%|██▎ | 7862/34278 [8:47:21<31:43:01, 4.32s/it] {'loss': 0.1536, 'grad_norm': 0.8665656633073543, 'learning_rate': 8.993608724824467e-06, 'epoch': 0.23} 23%|██▎ | 7862/34278 [8:47:21<31:43:01, 4.32s/it] 23%|██▎ | 7863/34278 [8:47:25<29:49:48, 4.07s/it] {'loss': 0.1569, 'grad_norm': 0.7053428463513204, 'learning_rate': 8.993324443097387e-06, 'epoch': 0.23} 23%|██▎ | 7863/34278 [8:47:25<29:49:48, 4.07s/it] 23%|██▎ | 7864/34278 [8:47:28<28:07:33, 3.83s/it] {'loss': 0.1651, 'grad_norm': 0.8591552881685272, 'learning_rate': 8.993040125718857e-06, 'epoch': 0.23} 23%|██▎ | 7864/34278 [8:47:28<28:07:33, 3.83s/it] 23%|██▎ | 7865/34278 [8:47:31<26:54:03, 3.67s/it] {'loss': 0.1622, 'grad_norm': 0.8655036985855517, 'learning_rate': 8.992755772691418e-06, 'epoch': 0.23} 23%|██▎ | 7865/34278 [8:47:32<26:54:03, 3.67s/it] 
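The recurring UserWarning from torch/utils/checkpoint.py:87 ("None of the inputs have requires_grad=True. Gradients will be None") means a gradient-checkpointed segment received only requires_grad=False tensors, so autograd treats the segment's output as detached and no gradients flow back through it. If the warning covers trainable layers, the usual fix is to force the segment's input to require grad before checkpointing, which is what HF Transformers' `enable_input_require_grads()` does via a hook on the embedding output. A small reproduction and fix, assuming a recent torch and the legacy `use_reentrant=True` path that matches the warning in this log:

```python
import warnings

import torch
from torch.utils.checkpoint import checkpoint

lin = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # leaf input; requires_grad defaults to False

# Reproduce the warning: the checkpointed segment sees no tensor that
# requires grad, so its output is cut out of the autograd graph.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    y = checkpoint(lin, x, use_reentrant=True)
assert any("requires_grad" in str(w.message) for w in caught)
assert not y.requires_grad  # "Gradients will be None" for this segment

# Fix: make the segment input require grad before checkpointing.
# (HF Transformers' model.enable_input_require_grads() installs a forward
# hook that does this to the embedding output.)
x.requires_grad_(True)
y = checkpoint(lin, x, use_reentrant=True)
y.sum().backward()
assert lin.weight.grad is not None
```

In a run like this one the warning is often benign noise from frozen submodules, but it is worth confirming that the layers you intend to train are not the ones being detached.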
23%|██▎ | 7866/34278 [8:47:35<25:45:30, 3.51s/it] {'loss': 0.1664, 'grad_norm': 0.8914369730508105, 'learning_rate': 8.99247138401761e-06, 'epoch': 0.23} 23%|██▎ | 7866/34278 [8:47:35<25:45:30, 3.51s/it] 23%|██▎ | 7867/34278 [8:47:37<24:20:08, 3.32s/it] {'loss': 0.1658, 'grad_norm': 0.941520779873375, 'learning_rate': 8.99218695969997e-06, 'epoch': 0.23} 23%|██▎ | 7867/34278 [8:47:37<24:20:08, 3.32s/it] 23%|██▎ | 7868/34278 [8:47:41<24:18:08, 3.31s/it] {'loss': 0.1598, 'grad_norm': 0.9019841936882262, 'learning_rate': 8.991902499741036e-06, 'epoch': 0.23} 23%|██▎ | 7868/34278 [8:47:41<24:18:08, 3.31s/it] 23%|██▎ | 7869/34278 [8:47:44<24:11:57, 3.30s/it] {'loss': 0.1542, 'grad_norm': 1.0984679069219512, 'learning_rate': 8.991618004143353e-06, 'epoch': 0.23} 23%|██▎ | 7869/34278 [8:47:44<24:11:57, 3.30s/it] 23%|██▎ | 7870/34278 [8:47:47<23:04:08, 3.14s/it] {'loss': 0.2469, 'grad_norm': 30.538307535441056, 'learning_rate': 8.991333472909455e-06, 'epoch': 0.23} 23%|██▎ | 7870/34278 [8:47:47<23:04:08, 3.14s/it] 23%|██▎ | 7871/34278 [8:47:50<22:12:21, 3.03s/it] {'loss': 0.1521, 'grad_norm': 0.9871575245444739, 'learning_rate': 8.991048906041884e-06, 'epoch': 0.23} 23%|██▎ | 7871/34278 [8:47:50<22:12:21, 3.03s/it] 23%|██▎ | 7872/34278 [8:47:52<21:23:03, 2.92s/it] {'loss': 0.1661, 'grad_norm': 7.0030216908220515, 'learning_rate': 8.990764303543183e-06, 'epoch': 0.23} 23%|██▎ | 7872/34278 [8:47:52<21:23:03, 2.92s/it] 23%|██▎ | 7873/34278 [8:47:55<21:39:18, 2.95s/it] {'loss': 0.2029, 'grad_norm': 0.8611120724498316, 'learning_rate': 8.99047966541589e-06, 'epoch': 0.23} 23%|██▎ | 7873/34278 [8:47:55<21:39:18, 2.95s/it] 23%|██▎ | 7874/34278 [8:47:59<23:20:40, 3.18s/it] {'loss': 0.1964, 'grad_norm': 1.0826007546623617, 'learning_rate': 8.990194991662547e-06, 'epoch': 0.23} 23%|██▎ | 7874/34278 [8:47:59<23:20:40, 3.18s/it] 23%|██▎ | 7875/34278 [8:48:02<22:24:42, 3.06s/it] {'loss': 0.1797, 'grad_norm': 0.9013871563643052, 'learning_rate': 8.989910282285696e-06, 'epoch': 0.23} 
23%|██▎ | 7875/34278 [8:48:02<22:24:42, 3.06s/it] 23%|██▎ | 7876/34278 [8:48:05<22:30:28, 3.07s/it] {'loss': 0.1487, 'grad_norm': 0.6976822409689709, 'learning_rate': 8.989625537287879e-06, 'epoch': 0.23} 23%|██▎ | 7876/34278 [8:48:05<22:30:28, 3.07s/it] 23%|██▎ | 7877/34278 [8:48:08<22:27:11, 3.06s/it] {'loss': 0.1594, 'grad_norm': 0.8141668230560807, 'learning_rate': 8.989340756671637e-06, 'epoch': 0.23} 23%|██▎ | 7877/34278 [8:48:08<22:27:11, 3.06s/it] 23%|██▎ | 7878/34278 [8:48:11<22:11:33, 3.03s/it] {'loss': 0.1853, 'grad_norm': 0.797237042164959, 'learning_rate': 8.989055940439513e-06, 'epoch': 0.23} 23%|██▎ | 7878/34278 [8:48:11<22:11:33, 3.03s/it] 23%|██▎ | 7879/34278 [8:48:17<28:58:05, 3.95s/it] {'loss': 0.1854, 'grad_norm': 0.8427973492284777, 'learning_rate': 8.98877108859405e-06, 'epoch': 0.23} 23%|██▎ | 7879/34278 [8:48:17<28:58:05, 3.95s/it] 23%|██▎ | 7880/34278 [8:48:20<26:57:39, 3.68s/it] {'loss': 0.1588, 'grad_norm': 0.7827418496927713, 'learning_rate': 8.98848620113779e-06, 'epoch': 0.23} 23%|██▎ | 7880/34278 [8:48:20<26:57:39, 3.68s/it] 23%|██▎ | 7881/34278 [8:48:24<27:17:17, 3.72s/it] {'loss': 0.1877, 'grad_norm': 0.8137145924122848, 'learning_rate': 8.988201278073279e-06, 'epoch': 0.23} 23%|██▎ | 7881/34278 [8:48:24<27:17:17, 3.72s/it] 23%|██▎ | 7882/34278 [8:48:30<32:32:13, 4.44s/it] {'loss': 0.145, 'grad_norm': 0.7185630838548984, 'learning_rate': 8.987916319403058e-06, 'epoch': 0.23} 23%|██▎ | 7882/34278 [8:48:30<32:32:13, 4.44s/it] 23%|██▎ | 7883/34278 [8:48:35<35:00:21, 4.77s/it] {'loss': 0.1573, 'grad_norm': 0.6945072273188773, 'learning_rate': 8.987631325129672e-06, 'epoch': 0.23} 23%|██▎ | 7883/34278 [8:48:36<35:00:21, 4.77s/it] 23%|██▎ | 7884/34278 [8:48:39<33:04:07, 4.51s/it] {'loss': 0.1542, 'grad_norm': 0.6783951637049451, 'learning_rate': 8.987346295255665e-06, 'epoch': 0.23} 23%|██▎ | 7884/34278 [8:48:39<33:04:07, 4.51s/it] 23%|██▎ | 7885/34278 [8:48:42<29:11:37, 3.98s/it] {'loss': 0.1544, 'grad_norm': 0.8143489706937942, 
'learning_rate': 8.987061229783583e-06, 'epoch': 0.23} 23%|██▎ | 7885/34278 [8:48:42<29:11:37, 3.98s/it] 23%|██▎ | 7886/34278 [8:48:45<27:37:29, 3.77s/it] {'loss': 0.1507, 'grad_norm': 0.9324773063743967, 'learning_rate': 8.98677612871597e-06, 'epoch': 0.23} 23%|██▎ | 7886/34278 [8:48:45<27:37:29, 3.77s/it] 23%|██▎ | 7887/34278 [8:48:49<26:19:10, 3.59s/it] {'loss': 0.1712, 'grad_norm': 0.8604850345630668, 'learning_rate': 8.986490992055371e-06, 'epoch': 0.23} 23%|██▎ | 7887/34278 [8:48:49<26:19:10, 3.59s/it] 23%|██▎ | 7888/34278 [8:48:52<26:11:38, 3.57s/it] {'loss': 0.1654, 'grad_norm': 0.9366105297218565, 'learning_rate': 8.986205819804332e-06, 'epoch': 0.23} 23%|██▎ | 7888/34278 [8:48:52<26:11:38, 3.57s/it] 23%|██▎ | 7889/34278 [8:48:58<31:17:30, 4.27s/it] {'loss': 0.1563, 'grad_norm': 0.7698870467790884, 'learning_rate': 8.9859206119654e-06, 'epoch': 0.23} 23%|██▎ | 7889/34278 [8:48:58<31:17:30, 4.27s/it] 23%|██▎ | 7890/34278 [8:49:02<30:49:32, 4.21s/it] {'loss': 0.1482, 'grad_norm': 0.8892805451272776, 'learning_rate': 8.98563536854112e-06, 'epoch': 0.23} 23%|██▎ | 7890/34278 [8:49:02<30:49:32, 4.21s/it] 23%|██▎ | 7891/34278 [8:49:06<29:33:23, 4.03s/it] {'loss': 0.1876, 'grad_norm': 0.831349679074709, 'learning_rate': 8.985350089534039e-06, 'epoch': 0.23} 23%|██▎ | 7891/34278 [8:49:06<29:33:23, 4.03s/it] 23%|██▎ | 7892/34278 [8:49:09<27:08:55, 3.70s/it] {'loss': 0.1659, 'grad_norm': 0.9095725014546474, 'learning_rate': 8.985064774946704e-06, 'epoch': 0.23} 23%|██▎ | 7892/34278 [8:49:09<27:08:55, 3.70s/it] 23%|██▎ | 7893/34278 [8:49:12<26:30:38, 3.62s/it] {'loss': 0.1585, 'grad_norm': 0.6884617720193952, 'learning_rate': 8.98477942478166e-06, 'epoch': 0.23} 23%|██▎ | 7893/34278 [8:49:12<26:30:38, 3.62s/it] 23%|██▎ | 7894/34278 [8:49:15<25:01:49, 3.42s/it] {'loss': 0.156, 'grad_norm': 0.7678957873083491, 'learning_rate': 8.984494039041458e-06, 'epoch': 0.23} 23%|██▎ | 7894/34278 [8:49:15<25:01:49, 3.42s/it] 23%|██▎ | 7895/34278 [8:49:18<24:49:04, 3.39s/it] 
{'loss': 0.1651, 'grad_norm': 0.8958766686510108, 'learning_rate': 8.984208617728645e-06, 'epoch': 0.23} 23%|██▎ | 7895/34278 [8:49:18<24:49:04, 3.39s/it] 23%|██▎ | 7896/34278 [8:49:22<25:07:34, 3.43s/it] {'loss': 0.1646, 'grad_norm': 0.7220839846605178, 'learning_rate': 8.983923160845766e-06, 'epoch': 0.23} 23%|██▎ | 7896/34278 [8:49:22<25:07:34, 3.43s/it] 23%|██▎ | 7897/34278 [8:49:28<31:12:20, 4.26s/it] {'loss': 0.1296, 'grad_norm': 0.7298317309624487, 'learning_rate': 8.983637668395375e-06, 'epoch': 0.23} 23%|██▎ | 7897/34278 [8:49:28<31:12:20, 4.26s/it] 23%|██▎ | 7898/34278 [8:49:31<28:25:56, 3.88s/it] {'loss': 0.1598, 'grad_norm': 0.9141541064688687, 'learning_rate': 8.983352140380017e-06, 'epoch': 0.23} 23%|██▎ | 7898/34278 [8:49:31<28:25:56, 3.88s/it] 23%|██▎ | 7899/34278 [8:49:34<25:49:51, 3.53s/it] {'loss': 0.1861, 'grad_norm': 0.867744659092262, 'learning_rate': 8.983066576802241e-06, 'epoch': 0.23} 23%|██▎ | 7899/34278 [8:49:34<25:49:51, 3.53s/it] 23%|██▎ | 7900/34278 [8:49:37<24:13:11, 3.31s/it] {'loss': 0.1694, 'grad_norm': 0.9990059552793454, 'learning_rate': 8.9827809776646e-06, 'epoch': 0.23} 23%|██▎ | 7900/34278 [8:49:37<24:13:11, 3.31s/it] 23%|██▎ | 7901/34278 [8:49:40<23:40:33, 3.23s/it] {'loss': 0.1724, 'grad_norm': 0.8735981646512109, 'learning_rate': 8.98249534296964e-06, 'epoch': 0.23} 23%|██▎ | 7901/34278 [8:49:40<23:40:33, 3.23s/it] 23%|██▎ | 7902/34278 [8:49:43<23:51:51, 3.26s/it] {'loss': 0.1522, 'grad_norm': 0.8882672213679386, 'learning_rate': 8.98220967271991e-06, 'epoch': 0.23} 23%|██▎ | 7902/34278 [8:49:43<23:51:51, 3.26s/it] 23%|██▎ | 7903/34278 [8:49:47<24:48:06, 3.39s/it] {'loss': 0.1771, 'grad_norm': 0.9497935493498831, 'learning_rate': 8.981923966917965e-06, 'epoch': 0.23} 23%|██▎ | 7903/34278 [8:49:47<24:48:06, 3.39s/it] 23%|██▎ | 7904/34278 [8:49:50<25:20:53, 3.46s/it] {'loss': 0.1736, 'grad_norm': 0.8874610411280186, 'learning_rate': 8.981638225566352e-06, 'epoch': 0.23} 23%|██▎ | 7904/34278 [8:49:50<25:20:53, 3.46s/it] 
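The per-step records in this log ({'loss': ..., 'grad_norm': ..., 'learning_rate': ..., 'epoch': ...} interleaved with tqdm progress bars) are hard to eyeball but easy to scrape; outliers such as the grad_norm of ~30.5 at step 7870 stand out immediately once parsed. A stdlib-only sketch of such a parser; the regex follows the step/dict layout seen in this log, and the sample line below is a shortened stand-in for a real one:

```python
import ast
import re

# "<step>/<total> [<timing>] {'loss': ...}" as emitted before each metrics dict.
STEP_RE = re.compile(r"(\d+)/\d+ \[[^\]]*\] (\{'loss':[^}]*\})")


def parse_log(text):
    """Map step number -> metrics dict from a flattened trainer log."""
    metrics = {}
    for step, payload in STEP_RE.findall(text):
        metrics[int(step)] = ast.literal_eval(payload)
    return metrics


# Shortened stand-in for a real log line (bar characters elided).
sample = (
    "23%|X | 7870/34278 [8:47:47<23:04:08, 3.14s/it] "
    "{'loss': 0.2469, 'grad_norm': 30.538307535441056, "
    "'learning_rate': 8.991333472909455e-06, 'epoch': 0.23} "
    "23%|X | 7870/34278 [8:47:47<23:04:08, 3.14s/it]"
)
stats = parse_log(sample)
spikes = {s: m for s, m in stats.items() if m["grad_norm"] > 5}
```

The trailing tqdm echo after each dict has no metrics payload, so the regex naturally deduplicates the doubled progress lines; from there the loss and grad-norm curves can be plotted or thresholded however you like.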
23%|██▎ | 7905/34278 [8:49:54<25:29:11, 3.48s/it] {'loss': 0.159, 'grad_norm': 0.9846342420793052, 'learning_rate': 8.981352448667625e-06, 'epoch': 0.23} 23%|██▎ | 7905/34278 [8:49:54<25:29:11, 3.48s/it] 23%|██▎ | 7906/34278 [8:49:58<26:20:01, 3.59s/it] {'loss': 0.1644, 'grad_norm': 0.9571905763581517, 'learning_rate': 8.981066636224334e-06, 'epoch': 0.23} 23%|██▎ | 7906/34278 [8:49:58<26:20:01, 3.59s/it] 23%|██▎ | 7907/34278 [8:50:00<24:46:26, 3.38s/it] {'loss': 0.1565, 'grad_norm': 1.000066371424318, 'learning_rate': 8.980780788239029e-06, 'epoch': 0.23} 23%|██▎ | 7907/34278 [8:50:00<24:46:26, 3.38s/it] 23%|██▎ | 7908/34278 [8:50:06<28:26:03, 3.88s/it] {'loss': 0.1396, 'grad_norm': 0.8388845856794662, 'learning_rate': 8.980494904714263e-06, 'epoch': 0.23} 23%|██▎ | 7908/34278 [8:50:06<28:26:03, 3.88s/it] 23%|██▎ | 7909/34278 [8:50:09<26:48:22, 3.66s/it] {'loss': 0.1526, 'grad_norm': 1.1958278189996134, 'learning_rate': 8.98020898565259e-06, 'epoch': 0.23} 23%|██▎ | 7909/34278 [8:50:09<26:48:22, 3.66s/it] 23%|██▎ | 7910/34278 [8:50:12<25:54:13, 3.54s/it] {'loss': 0.1575, 'grad_norm': 0.8659713240342617, 'learning_rate': 8.979923031056561e-06, 'epoch': 0.23} 23%|██▎ | 7910/34278 [8:50:12<25:54:13, 3.54s/it] 23%|██▎ | 7911/34278 [8:50:15<24:38:44, 3.36s/it] {'loss': 0.1713, 'grad_norm': 0.8888592808462227, 'learning_rate': 8.979637040928728e-06, 'epoch': 0.23} 23%|██▎ | 7911/34278 [8:50:15<24:38:44, 3.36s/it] 23%|██▎ | 7912/34278 [8:50:19<26:20:54, 3.60s/it] {'loss': 0.1779, 'grad_norm': 1.0438234571076548, 'learning_rate': 8.979351015271648e-06, 'epoch': 0.23} 23%|██▎ | 7912/34278 [8:50:19<26:20:54, 3.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7913/34278 [8:50:23<26:43:25, 3.65s/it] {'loss': 0.1784, 'grad_norm': 0.9340396543834354, 'learning_rate': 8.979064954087871e-06, 'epoch': 0.23} 23%|██▎ | 7913/34278 [8:50:23<26:43:25, 3.65s/it] 23%|██▎ | 7914/34278 [8:50:26<24:48:35, 3.39s/it] {'loss': 0.1523, 'grad_norm': 0.8961219508440874, 'learning_rate': 8.97877885737995e-06, 'epoch': 0.23} 23%|██▎ | 7914/34278 [8:50:26<24:48:35, 3.39s/it] 23%|██▎ | 7915/34278 [8:50:29<25:00:30, 3.42s/it] {'loss': 0.1603, 'grad_norm': 0.9385047700926765, 'learning_rate': 8.978492725150444e-06, 'epoch': 0.23} 23%|██▎ | 7915/34278 [8:50:29<25:00:30, 3.42s/it] 23%|██▎ | 7916/34278 [8:50:33<25:35:37, 3.50s/it] {'loss': 0.1799, 'grad_norm': 1.1180766056341944, 'learning_rate': 8.978206557401903e-06, 'epoch': 0.23} 23%|██▎ | 7916/34278 [8:50:33<25:35:37, 3.50s/it] 23%|██▎ | 7917/34278 [8:50:37<26:43:57, 3.65s/it] {'loss': 0.1895, 'grad_norm': 0.8924194484470929, 'learning_rate': 8.977920354136885e-06, 'epoch': 0.23} 23%|██▎ | 7917/34278 [8:50:37<26:43:57, 3.65s/it] 23%|██▎ | 7918/34278 [8:50:42<30:03:32, 4.11s/it] {'loss': 0.1388, 'grad_norm': 0.7494284756242796, 'learning_rate': 8.977634115357942e-06, 'epoch': 0.23} 23%|██▎ | 7918/34278 [8:50:42<30:03:32, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7919/34278 [8:50:45<27:14:49, 3.72s/it] {'loss': 0.1608, 'grad_norm': 0.7633926722825112, 'learning_rate': 8.977347841067631e-06, 'epoch': 0.23} 23%|██▎ | 7919/34278 [8:50:45<27:14:49, 3.72s/it] 23%|██▎ | 7920/34278 [8:50:48<25:11:59, 3.44s/it] {'loss': 0.1641, 'grad_norm': 0.8638611396990575, 'learning_rate': 8.97706153126851e-06, 'epoch': 0.23} 23%|██▎ | 7920/34278 [8:50:48<25:11:59, 3.44s/it] 23%|██▎ | 7921/34278 [8:50:51<24:28:40, 3.34s/it] {'loss': 0.1378, 'grad_norm': 0.780354734871524, 'learning_rate': 8.976775185963131e-06, 'epoch': 0.23} 23%|██▎ | 7921/34278 [8:50:51<24:28:40, 3.34s/it] 23%|██▎ | 7922/34278 [8:50:56<27:58:43, 3.82s/it] {'loss': 0.193, 'grad_norm': 0.8743108969033693, 'learning_rate': 8.976488805154054e-06, 'epoch': 0.23} 23%|██▎ | 7922/34278 [8:50:56<27:58:43, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7923/34278 [8:50:59<27:48:33, 3.80s/it] {'loss': 0.1547, 'grad_norm': 0.8404789222654602, 'learning_rate': 8.976202388843833e-06, 'epoch': 0.23} 23%|██▎ | 7923/34278 [8:50:59<27:48:33, 3.80s/it] 23%|██▎ | 7924/34278 [8:51:05<32:30:56, 4.44s/it] {'loss': 0.1795, 'grad_norm': 0.858721831932792, 'learning_rate': 8.975915937035029e-06, 'epoch': 0.23} 23%|██▎ | 7924/34278 [8:51:05<32:30:56, 4.44s/it] 23%|██▎ | 7925/34278 [8:51:11<35:36:45, 4.86s/it] {'loss': 0.1759, 'grad_norm': 0.8571207713848548, 'learning_rate': 8.975629449730194e-06, 'epoch': 0.23} 23%|██▎ | 7925/34278 [8:51:11<35:36:45, 4.86s/it] 23%|██▎ | 7926/34278 [8:51:16<35:23:12, 4.83s/it] {'loss': 0.16, 'grad_norm': 0.9479269031813717, 'learning_rate': 8.975342926931888e-06, 'epoch': 0.23} 23%|██▎ | 7926/34278 [8:51:16<35:23:12, 4.83s/it] 23%|██▎ | 7927/34278 [8:51:19<31:29:42, 4.30s/it] {'loss': 0.1819, 'grad_norm': 0.8689659096373238, 'learning_rate': 8.97505636864267e-06, 'epoch': 0.23} 23%|██▎ | 7927/34278 [8:51:19<31:29:42, 4.30s/it] 23%|██▎ | 7928/34278 [8:51:22<28:33:14, 3.90s/it] {'loss': 0.167, 'grad_norm': 0.7629244776263481, 'learning_rate': 8.974769774865097e-06, 'epoch': 0.23} 23%|██▎ | 7928/34278 [8:51:22<28:33:14, 3.90s/it] 23%|██▎ | 7929/34278 [8:51:28<33:03:22, 4.52s/it] {'loss': 0.1928, 'grad_norm': 0.8319382999924249, 'learning_rate': 8.97448314560173e-06, 'epoch': 0.23} 23%|██▎ | 7929/34278 [8:51:28<33:03:22, 4.52s/it] 23%|██▎ | 7930/34278 [8:51:32<31:45:34, 4.34s/it] {'loss': 0.1476, 'grad_norm': 0.7636448846716699, 'learning_rate': 8.974196480855126e-06, 'epoch': 0.23} 23%|██▎ | 7930/34278 [8:51:32<31:45:34, 4.34s/it] 23%|██▎ | 7931/34278 [8:51:38<36:08:47, 4.94s/it] {'loss': 0.1553, 'grad_norm': 0.7868278619232064, 'learning_rate': 8.973909780627845e-06, 'epoch': 0.23} 23%|██▎ | 7931/34278 [8:51:38<36:08:47, 
4.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 7932/34278 [8:51:41<32:17:21, 4.41s/it] {'loss': 0.1468, 'grad_norm': 1.783456156342513, 'learning_rate': 8.973623044922444e-06, 'epoch': 0.23} 23%|██▎ | 7932/34278 [8:51:41<32:17:21, 4.41s/it] 23%|██▎ | 7933/34278 [8:51:45<31:04:51, 4.25s/it] {'loss': 0.1935, 'grad_norm': 0.7361392006785855, 'learning_rate': 8.973336273741487e-06, 'epoch': 0.23} 23%|██▎ | 7933/34278 [8:51:45<31:04:51, 4.25s/it] 23%|██▎ | 7934/34278 [8:51:48<28:34:16, 3.90s/it] {'loss': 0.1595, 'grad_norm': 0.6799103841962983, 'learning_rate': 8.973049467087531e-06, 'epoch': 0.23} 23%|██▎ | 7934/34278 [8:51:48<28:34:16, 3.90s/it] 23%|██▎ | 7935/34278 [8:51:54<32:23:08, 4.43s/it] {'loss': 0.1556, 'grad_norm': 0.8197974864462231, 'learning_rate': 8.972762624963139e-06, 'epoch': 0.23} 23%|██▎ | 7935/34278 [8:51:54<32:23:08, 4.43s/it] 23%|██▎ | 7936/34278 [8:51:57<30:00:12, 4.10s/it] {'loss': 0.1486, 'grad_norm': 0.9292101613609204, 'learning_rate': 8.972475747370869e-06, 'epoch': 0.23} 23%|██▎ | 7936/34278 [8:51:57<30:00:12, 4.10s/it] 23%|██▎ | 7937/34278 [8:52:02<31:40:04, 4.33s/it] {'loss': 0.1508, 'grad_norm': 0.8351032308641211, 'learning_rate': 8.972188834313285e-06, 'epoch': 0.23} 23%|██▎ | 7937/34278 [8:52:02<31:40:04, 4.33s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7938/34278 [8:52:05<28:40:56, 3.92s/it] {'loss': 0.1729, 'grad_norm': 0.8236472852016963, 'learning_rate': 8.971901885792947e-06, 'epoch': 0.23} 23%|██▎ | 7938/34278 [8:52:05<28:40:56, 3.92s/it] 23%|██▎ | 7939/34278 [8:52:08<27:33:23, 3.77s/it] {'loss': 0.1546, 'grad_norm': 0.754782621058893, 'learning_rate': 8.971614901812417e-06, 'epoch': 0.23} 23%|██▎ | 7939/34278 [8:52:09<27:33:23, 3.77s/it] 23%|██▎ | 7940/34278 [8:52:12<26:49:40, 3.67s/it] {'loss': 0.1498, 'grad_norm': 0.8184346728606533, 'learning_rate': 8.971327882374257e-06, 'epoch': 0.23} 23%|██▎ | 7940/34278 [8:52:12<26:49:40, 3.67s/it] 23%|██▎ | 7941/34278 [8:52:15<25:08:11, 3.44s/it] {'loss': 0.1587, 'grad_norm': 0.7842135080527304, 'learning_rate': 8.97104082748103e-06, 'epoch': 0.23} 23%|██▎ | 7941/34278 [8:52:15<25:08:11, 3.44s/it] 23%|██▎ | 7942/34278 [8:52:18<24:47:24, 3.39s/it] {'loss': 0.1545, 'grad_norm': 0.8003555749338849, 'learning_rate': 8.970753737135298e-06, 'epoch': 0.23} 23%|██▎ | 7942/34278 [8:52:18<24:47:24, 3.39s/it] 23%|██▎ | 7943/34278 [8:52:21<24:16:17, 3.32s/it] {'loss': 0.1682, 'grad_norm': 0.9322813589322437, 'learning_rate': 8.970466611339625e-06, 'epoch': 0.23} 23%|██▎ | 7943/34278 [8:52:21<24:16:17, 3.32s/it] 23%|██▎ | 7944/34278 [8:52:25<25:37:54, 3.50s/it] {'loss': 0.1509, 'grad_norm': 0.7959947942319586, 'learning_rate': 8.970179450096574e-06, 'epoch': 0.23} 23%|██▎ | 7944/34278 [8:52:25<25:37:54, 3.50s/it] 23%|██▎ | 7945/34278 [8:52:28<24:24:19, 3.34s/it] {'loss': 0.1536, 'grad_norm': 0.7854743779984855, 'learning_rate': 8.96989225340871e-06, 'epoch': 0.23} 23%|██▎ | 7945/34278 [8:52:28<24:24:19, 3.34s/it] 23%|██▎ | 7946/34278 [8:52:31<24:08:34, 3.30s/it] {'loss': 0.1548, 'grad_norm': 1.0302643387878918, 'learning_rate': 8.969605021278594e-06, 'epoch': 0.23} 23%|██▎ | 7946/34278 [8:52:31<24:08:34, 3.30s/it] 23%|██▎ | 7947/34278 [8:52:34<23:05:17, 3.16s/it] {'loss': 0.1501, 'grad_norm': 0.8798981554529599, 'learning_rate': 
8.969317753708792e-06, 'epoch': 0.23} 23%|██▎ | 7947/34278 [8:52:34<23:05:17, 3.16s/it] 23%|██▎ | 7948/34278 [8:52:37<23:07:10, 3.16s/it] {'loss': 0.1587, 'grad_norm': 0.8036381869669064, 'learning_rate': 8.96903045070187e-06, 'epoch': 0.23} 23%|██▎ | 7948/34278 [8:52:37<23:07:10, 3.16s/it] 23%|██▎ | 7949/34278 [8:52:40<22:39:33, 3.10s/it] {'loss': 0.1757, 'grad_norm': 0.981191173155209, 'learning_rate': 8.968743112260389e-06, 'epoch': 0.23} 23%|██▎ | 7949/34278 [8:52:40<22:39:33, 3.10s/it] 23%|██▎ | 7950/34278 [8:52:46<28:28:32, 3.89s/it] {'loss': 0.1604, 'grad_norm': 0.9158301363845617, 'learning_rate': 8.968455738386919e-06, 'epoch': 0.23} 23%|██▎ | 7950/34278 [8:52:46<28:28:32, 3.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 7951/34278 [8:52:51<30:07:29, 4.12s/it] {'loss': 0.1801, 'grad_norm': 1.0966531889290108, 'learning_rate': 8.968168329084022e-06, 'epoch': 0.23} 23%|██▎ | 7951/34278 [8:52:51<30:07:29, 4.12s/it] 23%|██▎ | 7952/34278 [8:52:54<27:52:26, 3.81s/it] {'loss': 0.1644, 'grad_norm': 0.9422267898448748, 'learning_rate': 8.967880884354267e-06, 'epoch': 0.23} 23%|██▎ | 7952/34278 [8:52:54<27:52:26, 3.81s/it] 23%|██▎ | 7953/34278 [8:52:57<26:32:28, 3.63s/it] {'loss': 0.1493, 'grad_norm': 0.7561762332547298, 'learning_rate': 8.967593404200219e-06, 'epoch': 0.23} 23%|██▎ | 7953/34278 [8:52:57<26:32:28, 3.63s/it] 23%|██▎ | 7954/34278 [8:53:01<27:47:25, 3.80s/it] {'loss': 0.1709, 'grad_norm': 1.1758815200025448, 'learning_rate': 8.967305888624442e-06, 'epoch': 0.23} 23%|██▎ | 7954/34278 [8:53:01<27:47:25, 3.80s/it] 23%|██▎ | 7955/34278 [8:53:04<26:27:17, 3.62s/it] {'loss': 0.1836, 'grad_norm': 0.9566726405483177, 'learning_rate': 8.967018337629508e-06, 'epoch': 0.23} 23%|██▎ | 7955/34278 [8:53:04<26:27:17, 3.62s/it] 23%|██▎ | 7956/34278 [8:53:08<25:30:49, 3.49s/it] 
{'loss': 0.1681, 'grad_norm': 0.7946192356490421, 'learning_rate': 8.966730751217978e-06, 'epoch': 0.23} 23%|██▎ | 7956/34278 [8:53:08<25:30:49, 3.49s/it] 23%|██▎ | 7957/34278 [8:53:11<24:37:15, 3.37s/it] {'loss': 0.1519, 'grad_norm': 0.9160009772815291, 'learning_rate': 8.966443129392426e-06, 'epoch': 0.23} 23%|██▎ | 7957/34278 [8:53:11<24:37:15, 3.37s/it] 23%|██▎ | 7958/34278 [8:53:14<24:02:47, 3.29s/it] {'loss': 0.1702, 'grad_norm': 1.0918327574228717, 'learning_rate': 8.966155472155414e-06, 'epoch': 0.23} 23%|██▎ | 7958/34278 [8:53:14<24:02:47, 3.29s/it] 23%|██▎ | 7959/34278 [8:53:17<23:49:05, 3.26s/it] {'loss': 0.155, 'grad_norm': 0.7626379469549405, 'learning_rate': 8.965867779509513e-06, 'epoch': 0.23} 23%|██▎ | 7959/34278 [8:53:17<23:49:05, 3.26s/it] 23%|██▎ | 7960/34278 [8:53:20<24:01:44, 3.29s/it] {'loss': 0.1588, 'grad_norm': 0.8618726573903422, 'learning_rate': 8.965580051457292e-06, 'epoch': 0.23} 23%|██▎ | 7960/34278 [8:53:20<24:01:44, 3.29s/it] 23%|██▎ | 7961/34278 [8:53:24<24:40:27, 3.38s/it] {'loss': 0.1753, 'grad_norm': 0.9285347313650374, 'learning_rate': 8.96529228800132e-06, 'epoch': 0.23} 23%|██▎ | 7961/34278 [8:53:24<24:40:27, 3.38s/it] 23%|██▎ | 7962/34278 [8:53:28<25:25:54, 3.48s/it] {'loss': 0.1694, 'grad_norm': 0.8710843645255506, 'learning_rate': 8.965004489144165e-06, 'epoch': 0.23} 23%|██▎ | 7962/34278 [8:53:28<25:25:54, 3.48s/it] 23%|██▎ | 7963/34278 [8:53:32<26:31:55, 3.63s/it] {'loss': 0.1671, 'grad_norm': 0.7944912694806492, 'learning_rate': 8.964716654888395e-06, 'epoch': 0.23} 23%|██▎ | 7963/34278 [8:53:32<26:31:55, 3.63s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7964/34278 [8:53:34<24:34:56, 3.36s/it] {'loss': 0.1552, 'grad_norm': 0.805655434799871, 'learning_rate': 8.964428785236581e-06, 'epoch': 0.23} 23%|██▎ | 7964/34278 [8:53:34<24:34:56, 3.36s/it] 23%|██▎ | 7965/34278 [8:53:41<31:20:57, 4.29s/it] {'loss': 0.1476, 'grad_norm': 0.8562198493417296, 'learning_rate': 8.964140880191294e-06, 'epoch': 0.23} 23%|██▎ | 7965/34278 [8:53:41<31:20:57, 4.29s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 7966/34278 [8:53:45<32:07:11, 4.39s/it] {'loss': 0.1662, 'grad_norm': 0.7119304754090852, 'learning_rate': 8.963852939755104e-06, 'epoch': 0.23} 23%|██▎ | 7966/34278 [8:53:45<32:07:11, 4.39s/it] 23%|██▎ | 7967/34278 [8:53:49<30:00:48, 4.11s/it] {'loss': 0.1595, 'grad_norm': 0.9471470810961862, 'learning_rate': 8.96356496393058e-06, 'epoch': 0.23} 23%|██▎ | 7967/34278 [8:53:49<30:00:48, 4.11s/it] 23%|██▎ | 7968/34278 [8:53:52<27:40:26, 3.79s/it] {'loss': 0.1747, 'grad_norm': 0.802465025689359, 'learning_rate': 8.963276952720294e-06, 'epoch': 0.23} 23%|██▎ | 7968/34278 [8:53:52<27:40:26, 3.79s/it] 23%|██▎ | 7969/34278 [8:53:55<26:47:57, 3.67s/it] {'loss': 0.1743, 'grad_norm': 0.653049839552109, 'learning_rate': 8.96298890612682e-06, 'epoch': 0.23} 23%|██▎ | 7969/34278 [8:53:55<26:47:57, 3.67s/it] 23%|██▎ | 7970/34278 [8:53:58<25:44:29, 3.52s/it] {'loss': 0.1519, 'grad_norm': 0.7449545503708133, 'learning_rate': 8.962700824152724e-06, 'epoch': 0.23} 23%|██▎ | 7970/34278 [8:53:58<25:44:29, 3.52s/it] 23%|██▎ | 7971/34278 [8:54:01<24:10:45, 3.31s/it] {'loss': 0.1643, 'grad_norm': 0.859759788278501, 'learning_rate': 8.962412706800583e-06, 'epoch': 0.23} 23%|██▎ | 7971/34278 [8:54:01<24:10:45, 3.31s/it] 23%|██▎ | 7972/34278 [8:54:05<25:38:43, 3.51s/it] {'loss': 0.1623, 'grad_norm': 0.7912390424457756, 
'learning_rate': 8.962124554072966e-06, 'epoch': 0.23} 23%|██▎ | 7972/34278 [8:54:05<25:38:43, 3.51s/it] 23%|██▎ | 7973/34278 [8:54:08<24:27:46, 3.35s/it] {'loss': 0.1567, 'grad_norm': 0.96985934893583, 'learning_rate': 8.961836365972448e-06, 'epoch': 0.23} 23%|██▎ | 7973/34278 [8:54:08<24:27:46, 3.35s/it] 23%|██▎ | 7974/34278 [8:54:13<26:32:05, 3.63s/it] {'loss': 0.1413, 'grad_norm': 0.8698070988113134, 'learning_rate': 8.9615481425016e-06, 'epoch': 0.23} 23%|██▎ | 7974/34278 [8:54:13<26:32:05, 3.63s/it] 23%|██▎ | 7975/34278 [8:54:15<24:49:44, 3.40s/it] {'loss': 0.1888, 'grad_norm': 1.1161478727879717, 'learning_rate': 8.961259883662997e-06, 'epoch': 0.23} 23%|██▎ | 7975/34278 [8:54:15<24:49:44, 3.40s/it] 23%|██▎ | 7976/34278 [8:54:18<23:48:53, 3.26s/it] {'loss': 0.1671, 'grad_norm': 0.9863245588204235, 'learning_rate': 8.960971589459208e-06, 'epoch': 0.23} 23%|██▎ | 7976/34278 [8:54:18<23:48:53, 3.26s/it] 23%|██▎ | 7977/34278 [8:54:21<23:22:43, 3.20s/it] {'loss': 0.1938, 'grad_norm': 1.340980192673474, 'learning_rate': 8.960683259892813e-06, 'epoch': 0.23} 23%|██▎ | 7977/34278 [8:54:21<23:22:43, 3.20s/it] 23%|██▎ | 7978/34278 [8:54:24<22:53:49, 3.13s/it] {'loss': 0.1596, 'grad_norm': 1.0935669382378899, 'learning_rate': 8.960394894966383e-06, 'epoch': 0.23} 23%|██▎ | 7978/34278 [8:54:24<22:53:49, 3.13s/it] 23%|██▎ | 7979/34278 [8:54:27<22:09:43, 3.03s/it] {'loss': 0.155, 'grad_norm': 0.7201980386984678, 'learning_rate': 8.960106494682492e-06, 'epoch': 0.23} 23%|██▎ | 7979/34278 [8:54:27<22:09:43, 3.03s/it] 23%|██▎ | 7980/34278 [8:54:30<22:29:06, 3.08s/it] {'loss': 0.1618, 'grad_norm': 0.9488219835295263, 'learning_rate': 8.959818059043717e-06, 'epoch': 0.23} 23%|██▎ | 7980/34278 [8:54:30<22:29:06, 3.08s/it] 23%|██▎ | 7981/34278 [8:54:34<24:30:35, 3.36s/it] {'loss': 0.1788, 'grad_norm': 1.0172603853180795, 'learning_rate': 8.959529588052631e-06, 'epoch': 0.23} 23%|██▎ | 7981/34278 [8:54:34<24:30:35, 3.36s/it] 23%|██▎ | 7982/34278 [8:54:37<22:50:31, 3.13s/it] 
{'loss': 0.1527, 'grad_norm': 0.7545652274753679, 'learning_rate': 8.959241081711811e-06, 'epoch': 0.23} 23%|██▎ | 7982/34278 [8:54:37<22:50:31, 3.13s/it] 23%|██▎ | 7983/34278 [8:54:42<27:26:00, 3.76s/it] {'loss': 0.1716, 'grad_norm': 0.9252378447180483, 'learning_rate': 8.95895254002383e-06, 'epoch': 0.23} 23%|██▎ | 7983/34278 [8:54:42<27:26:00, 3.76s/it] 23%|██▎ | 7984/34278 [8:54:47<29:15:40, 4.01s/it] {'loss': 0.157, 'grad_norm': 0.7860081958102194, 'learning_rate': 8.958663962991265e-06, 'epoch': 0.23} 23%|██▎ | 7984/34278 [8:54:47<29:15:40, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 7985/34278 [8:54:52<31:26:08, 4.30s/it] {'loss': 0.1515, 'grad_norm': 1.022959670385792, 'learning_rate': 8.958375350616695e-06, 'epoch': 0.23} 23%|██▎ | 7985/34278 [8:54:52<31:26:08, 4.30s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7986/34278 [8:54:55<30:02:59, 4.11s/it] {'loss': 0.1801, 'grad_norm': 0.8943316343739773, 'learning_rate': 8.958086702902695e-06, 'epoch': 0.23} 23%|██▎ | 7986/34278 [8:54:55<30:02:59, 4.11s/it] 23%|██▎ | 7987/34278 [8:54:59<29:20:10, 4.02s/it] {'loss': 0.1685, 'grad_norm': 1.0773939516632338, 'learning_rate': 8.957798019851842e-06, 'epoch': 0.23} 23%|██▎ | 7987/34278 [8:54:59<29:20:10, 4.02s/it] 23%|██▎ | 7988/34278 [8:55:02<26:56:37, 3.69s/it] {'loss': 0.1691, 'grad_norm': 0.9049791383744568, 'learning_rate': 8.957509301466712e-06, 'epoch': 0.23} 23%|██▎ | 7988/34278 [8:55:02<26:56:37, 3.69s/it] 23%|██▎ | 7989/34278 [8:55:06<26:52:32, 3.68s/it] {'loss': 0.1465, 'grad_norm': 0.8968414027056072, 'learning_rate': 8.957220547749884e-06, 'epoch': 0.23} 23%|██▎ | 7989/34278 [8:55:06<26:52:32, 3.68s/it] 23%|██▎ | 7990/34278 [8:55:09<25:21:12, 3.47s/it] {'loss': 0.1811, 'grad_norm': 0.9636828632979421, 'learning_rate': 8.956931758703935e-06, 'epoch': 0.23} 23%|██▎ | 7990/34278 [8:55:09<25:21:12, 3.47s/it] 23%|██▎ | 7991/34278 [8:55:12<24:07:09, 3.30s/it] {'loss': 0.1709, 'grad_norm': 0.9914752249113326, 'learning_rate': 8.956642934331446e-06, 'epoch': 0.23} 23%|██▎ | 7991/34278 [8:55:12<24:07:09, 3.30s/it] 23%|██▎ | 7992/34278 [8:55:15<24:58:13, 3.42s/it] {'loss': 0.1863, 'grad_norm': 1.088550261313748, 'learning_rate': 8.956354074634992e-06, 'epoch': 0.23} 23%|██▎ | 7992/34278 [8:55:15<24:58:13, 3.42s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 23%|██▎ | 7993/34278 [8:55:18<23:33:58, 3.23s/it] {'loss': 0.1504, 'grad_norm': 1.1213752797523344, 'learning_rate': 8.956065179617153e-06, 'epoch': 0.23} 23%|██▎ | 7993/34278 [8:55:18<23:33:58, 3.23s/it] 23%|██▎ | 7994/34278 [8:55:21<23:43:14, 3.25s/it] {'loss': 0.1528, 'grad_norm': 1.5068470923028956, 'learning_rate': 8.955776249280508e-06, 'epoch': 0.23} 23%|██▎ | 7994/34278 [8:55:21<23:43:14, 3.25s/it] 23%|██▎ | 7995/34278 [8:55:25<23:31:36, 3.22s/it] {'loss': 0.2054, 'grad_norm': 1.0904897897433061, 'learning_rate': 8.955487283627638e-06, 'epoch': 0.23} 23%|██▎ | 7995/34278 [8:55:25<23:31:36, 3.22s/it] 23%|██▎ | 7996/34278 [8:55:28<23:00:03, 3.15s/it] {'loss': 0.1348, 'grad_norm': 1.0699673947294661, 'learning_rate': 8.955198282661122e-06, 'epoch': 0.23} 23%|██▎ | 7996/34278 [8:55:28<23:00:03, 3.15s/it] 23%|██▎ | 7997/34278 [8:55:31<24:02:09, 3.29s/it] {'loss': 0.1811, 'grad_norm': 0.7775833248102009, 'learning_rate': 8.954909246383539e-06, 'epoch': 0.23} 23%|██▎ | 7997/34278 [8:55:31<24:02:09, 3.29s/it] 23%|██▎ | 7998/34278 [8:55:35<24:19:39, 3.33s/it] {'loss': 0.1642, 'grad_norm': 0.9471724998179966, 'learning_rate': 8.95462017479747e-06, 'epoch': 0.23} 23%|██▎ | 7998/34278 [8:55:35<24:19:39, 3.33s/it] 23%|██▎ | 7999/34278 [8:55:37<22:53:11, 3.14s/it] {'loss': 0.1598, 'grad_norm': 0.8766472407340916, 'learning_rate': 8.954331067905498e-06, 'epoch': 0.23} 23%|██▎ | 7999/34278 [8:55:37<22:53:11, 3.14s/it] 23%|██▎ | 8000/34278 [8:55:40<22:45:23, 3.12s/it] {'loss': 0.1601, 'grad_norm': 0.7779864407460095, 'learning_rate': 8.9540419257102e-06, 'epoch': 0.23}
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 23%|██▎ | 8001/34278 [8:56:15<92:37:06, 12.69s/it] {'loss': 0.1677, 'grad_norm': 0.8068738332383827, 'learning_rate': 8.953752748214161e-06, 'epoch': 0.23} 23%|██▎ | 8001/34278 [8:56:15<92:37:06, 12.69s/it] 23%|██▎ | 8002/34278 [8:56:19<72:04:57, 9.88s/it] {'loss': 0.1731, 'grad_norm': 0.9702547364467246, 'learning_rate': 8.953463535419962e-06, 'epoch': 0.23} 23%|██▎ | 8002/34278 [8:56:19<72:04:57, 9.88s/it] 23%|██▎ | 8003/34278 [8:56:22<56:55:30, 7.80s/it] {'loss': 0.1867, 'grad_norm': 0.8231585920754951, 'learning_rate': 8.953174287330182e-06, 'epoch': 0.23} 23%|██▎ | 8003/34278 [8:56:22<56:55:30, 7.80s/it] 23%|██▎ | 8004/34278 [8:56:25<46:58:59, 6.44s/it] {'loss': 0.1697, 'grad_norm': 0.7479409714255466, 'learning_rate': 8.952885003947407e-06, 'epoch': 0.23} 23%|██▎ | 8004/34278 [8:56:25<46:58:59, 6.44s/it] 23%|██▎ | 8005/34278 [8:56:28<39:39:11, 5.43s/it] {'loss': 0.1417, 'grad_norm': 0.7061660965729089, 'learning_rate': 8.95259568527422e-06, 'epoch': 0.23} 23%|██▎ | 8005/34278 [8:56:28<39:39:11, 5.43s/it] 23%|██▎ | 8006/34278 [8:56:32<35:42:51, 4.89s/it] {'loss': 0.2055, 'grad_norm': 1.0359937169217408, 'learning_rate': 8.952306331313199e-06, 'epoch': 0.23} 23%|██▎ | 8006/34278 [8:56:32<35:42:51, 4.89s/it] 23%|██▎ | 8007/34278 [8:56:35<31:19:49, 4.29s/it] {'loss': 0.1684, 'grad_norm': 0.8176126070362488, 'learning_rate': 8.952016942066932e-06, 'epoch': 0.23} 23%|██▎ | 8007/34278 [8:56:35<31:19:49, 4.29s/it] 23%|██▎ | 8008/34278 [8:56:39<30:47:32, 4.22s/it] {'loss': 0.1641, 'grad_norm': 1.0809415238229225, 'learning_rate': 8.951727517538001e-06, 'epoch': 0.23} 23%|██▎ | 8008/34278 [8:56:39<30:47:32, 4.22s/it] 23%|██▎ | 8009/34278 [8:56:42<28:58:04, 3.97s/it] {'loss': 0.1588, 'grad_norm': 0.8048304969169773, 'learning_rate': 8.951438057728991e-06, 'epoch': 0.23} 23%|██▎ | 8009/34278 [8:56:42<28:58:04, 3.97s/it] 23%|██▎ | 8010/34278 [8:56:47<31:41:31, 4.34s/it] {'loss': 0.1657, 'grad_norm': 0.9538286575976014, 
'learning_rate': 8.951148562642485e-06, 'epoch': 0.23} 23%|██▎ | 8010/34278 [8:56:47<31:41:31, 4.34s/it] 23%|██▎ | 8011/34278 [8:56:53<35:15:53, 4.83s/it] {'loss': 0.1975, 'grad_norm': 1.1010655732932972, 'learning_rate': 8.950859032281068e-06, 'epoch': 0.23} 23%|██▎ | 8011/34278 [8:56:53<35:15:53, 4.83s/it] 23%|██▎ | 8012/34278 [8:56:58<35:18:56, 4.84s/it] {'loss': 0.1913, 'grad_norm': 0.8423336991321413, 'learning_rate': 8.950569466647322e-06, 'epoch': 0.23} 23%|██▎ | 8012/34278 [8:56:58<35:18:56, 4.84s/it] 23%|██▎ | 8013/34278 [8:57:03<35:16:19, 4.83s/it] {'loss': 0.1685, 'grad_norm': 0.9953928472588061, 'learning_rate': 8.950279865743838e-06, 'epoch': 0.23} 23%|██▎ | 8013/34278 [8:57:03<35:16:19, 4.83s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 23%|██▎ | 8014/34278 [8:57:06<31:23:47, 4.30s/it] {'loss': 0.1487, 'grad_norm': 0.7296554387113371, 'learning_rate': 8.949990229573198e-06, 'epoch': 0.23} 23%|██▎ | 8014/34278 [8:57:06<31:23:47, 4.30s/it] 23%|██▎ | 8015/34278 [8:57:09<28:55:44, 3.97s/it] {'loss': 0.145, 'grad_norm': 0.7998938992446264, 'learning_rate': 8.949700558137986e-06, 'epoch': 0.23} 23%|██▎ | 8015/34278 [8:57:09<28:55:44, 3.97s/it] 23%|██▎ | 8016/34278 [8:57:12<26:30:27, 3.63s/it] {'loss': 0.1876, 'grad_norm': 0.8282895543551169, 'learning_rate': 8.949410851440793e-06, 'epoch': 0.23} 23%|██▎ | 8016/34278 [8:57:12<26:30:27, 3.63s/it] 23%|██▎ | 8017/34278 [8:57:18<31:34:48, 4.33s/it] {'loss': 0.1445, 'grad_norm': 0.7432614557343605, 'learning_rate': 8.949121109484202e-06, 'epoch': 0.23} 23%|██▎ | 8017/34278 [8:57:18<31:34:48, 4.33s/it] 23%|██▎ | 8018/34278 [8:57:23<33:52:03, 4.64s/it] {'loss': 0.1824, 'grad_norm': 0.827381807714969, 'learning_rate': 8.9488313322708e-06, 'epoch': 0.23} 23%|██▎ | 8018/34278 [8:57:23<33:52:03, 4.64s/it] 23%|██▎ | 8019/34278 
[8:57:26<30:27:08, 4.17s/it] {'loss': 0.1404, 'grad_norm': 0.7444319638690382, 'learning_rate': 8.948541519803174e-06, 'epoch': 0.23} 23%|██▎ | 8019/34278 [8:57:26<30:27:08, 4.17s/it] 23%|██▎ | 8020/34278 [8:57:29<27:11:38, 3.73s/it] {'loss': 0.1606, 'grad_norm': 0.7989216545674781, 'learning_rate': 8.948251672083913e-06, 'epoch': 0.23} 23%|██▎ | 8020/34278 [8:57:29<27:11:38, 3.73s/it] 23%|██▎ | 8021/34278 [8:57:32<25:47:33, 3.54s/it] {'loss': 0.1628, 'grad_norm': 0.8655703677849157, 'learning_rate': 8.947961789115602e-06, 'epoch': 0.23} 23%|██▎ | 8021/34278 [8:57:32<25:47:33, 3.54s/it] 23%|██▎ | 8022/34278 [8:57:38<30:54:44, 4.24s/it] {'loss': 0.1626, 'grad_norm': 0.878427661366001, 'learning_rate': 8.947671870900833e-06, 'epoch': 0.23} 23%|██▎ | 8022/34278 [8:57:38<30:54:44, 4.24s/it] 23%|██▎ | 8023/34278 [8:57:41<27:40:11, 3.79s/it] {'loss': 0.1424, 'grad_norm': 0.7137478691908138, 'learning_rate': 8.94738191744219e-06, 'epoch': 0.23} 23%|██▎ | 8023/34278 [8:57:41<27:40:11, 3.79s/it] 23%|██▎ | 8024/34278 [8:57:44<26:35:48, 3.65s/it] {'loss': 0.174, 'grad_norm': 0.9131009998198611, 'learning_rate': 8.947091928742265e-06, 'epoch': 0.23} 23%|██▎ | 8024/34278 [8:57:44<26:35:48, 3.65s/it] 23%|██▎ | 8025/34278 [8:57:47<25:49:40, 3.54s/it] {'loss': 0.157, 'grad_norm': 0.8295889966807191, 'learning_rate': 8.946801904803643e-06, 'epoch': 0.23} 23%|██▎ | 8025/34278 [8:57:47<25:49:40, 3.54s/it] 23%|██▎ | 8026/34278 [8:57:51<25:34:35, 3.51s/it] {'loss': 0.1654, 'grad_norm': 0.679940778230621, 'learning_rate': 8.946511845628917e-06, 'epoch': 0.23} 23%|██▎ | 8026/34278 [8:57:51<25:34:35, 3.51s/it] 23%|██▎ | 8027/34278 [8:57:57<31:05:00, 4.26s/it] {'loss': 0.1355, 'grad_norm': 0.6610192945988974, 'learning_rate': 8.946221751220676e-06, 'epoch': 0.23} 23%|██▎ | 8027/34278 [8:57:57<31:05:00, 4.26s/it] 23%|██▎ | 8028/34278 [8:58:00<28:05:48, 3.85s/it] {'loss': 0.1647, 'grad_norm': 3.442276636815862, 'learning_rate': 8.945931621581511e-06, 'epoch': 0.23} 23%|██▎ | 8028/34278 
[8:58:00<28:05:48, 3.85s/it]
 23%|██▎       | 8029/34278 [8:58:03<27:39:16, 3.79s/it] {'loss': 0.1424, 'grad_norm': 0.8839468671166052, 'learning_rate': 8.945641456714007e-06, 'epoch': 0.23}
 23%|██▎       | 8030/34278 [8:58:07<26:50:00, 3.68s/it] {'loss': 0.1647, 'grad_norm': 0.8889270339695862, 'learning_rate': 8.94535125662076e-06, 'epoch': 0.23}
 23%|██▎       | 8031/34278 [8:58:10<26:09:05, 3.59s/it] {'loss': 0.1728, 'grad_norm': 1.0277466748950153, 'learning_rate': 8.94506102130436e-06, 'epoch': 0.23}
 23%|██▎       | 8032/34278 [8:58:13<25:12:01, 3.46s/it] {'loss': 0.1326, 'grad_norm': 0.8233028025233522, 'learning_rate': 8.944770750767393e-06, 'epoch': 0.23}
 23%|██▎       | 8033/34278 [8:58:17<24:58:05, 3.42s/it] {'loss': 0.1366, 'grad_norm': 0.7082904575322121, 'learning_rate': 8.944480445012458e-06, 'epoch': 0.23}
 23%|██▎       | 8034/34278 [8:58:20<23:51:53, 3.27s/it] {'loss': 0.1695, 'grad_norm': 0.7853682305585855, 'learning_rate': 8.94419010404214e-06, 'epoch': 0.23}
 23%|██▎       | 8035/34278 [8:58:23<24:50:21, 3.41s/it] {'loss': 0.1645, 'grad_norm': 0.805043308464724, 'learning_rate': 8.943899727859038e-06, 'epoch': 0.23}
 23%|██▎       | 8036/34278 [8:58:28<27:41:49, 3.80s/it] {'loss': 0.155, 'grad_norm': 0.8223772365652124, 'learning_rate': 8.943609316465739e-06, 'epoch': 0.23}
 23%|██▎       | 8037/34278 [8:58:31<26:21:43, 3.62s/it] {'loss': 0.1343, 'grad_norm': 0.8983760952745956, 'learning_rate': 8.943318869864836e-06, 'epoch': 0.23}
 23%|██▎       | 8038/34278 [8:58:38<32:14:22, 4.42s/it] {'loss': 0.1403, 'grad_norm': 0.7157276833187896, 'learning_rate': 8.943028388058925e-06, 'epoch': 0.23}
 23%|██▎       | 8039/34278 [8:58:44<35:55:24, 4.93s/it] {'loss': 0.1321, 'grad_norm': 0.7825010396865133, 'learning_rate': 8.942737871050598e-06, 'epoch': 0.23}
 23%|██▎       | 8040/34278 [8:58:50<38:36:14, 5.30s/it] {'loss': 0.1929, 'grad_norm': 0.8076007038951147, 'learning_rate': 8.942447318842449e-06, 'epoch': 0.23}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 23%|██▎       | 8041/34278 [8:58:53<34:42:46, 4.76s/it] {'loss': 0.1736, 'grad_norm': 0.8279545523036859, 'learning_rate': 8.94215673143707e-06, 'epoch': 0.23}
 23%|██▎       | 8042/34278 [8:58:56<30:23:15, 4.17s/it] {'loss': 0.1557, 'grad_norm': 0.7908824169183024, 'learning_rate': 8.941866108837058e-06, 'epoch': 0.23}
 23%|██▎       | 8043/34278 [8:59:00<28:51:31, 3.96s/it] {'loss': 0.1511, 'grad_norm': 0.8955458930943784, 'learning_rate': 8.941575451045006e-06, 'epoch': 0.23}
 23%|██▎       | 8044/34278 [8:59:05<31:03:31, 4.26s/it] {'loss': 0.1823, 'grad_norm': 0.8377022011248544, 'learning_rate': 8.941284758063508e-06, 'epoch': 0.23}
 23%|██▎       | 8045/34278 [8:59:08<28:20:17, 3.89s/it] {'loss': 0.1541, 'grad_norm': 2.101691490561202, 'learning_rate': 8.940994029895162e-06, 'epoch': 0.23}
 23%|██▎       | 8046/34278 [8:59:11<26:27:57, 3.63s/it] {'loss': 0.1395, 'grad_norm': 0.8641975686700145, 'learning_rate': 8.940703266542561e-06, 'epoch': 0.23}
 23%|██▎       | 8047/34278 [8:59:14<26:17:33, 3.61s/it] {'loss': 0.1741, 'grad_norm': 0.7837571588736078, 'learning_rate': 8.940412468008303e-06, 'epoch': 0.23}
Token indices sequence length is longer than the specified maximum sequence length for this model (11687 > 8192). Running this sequence through the model will result in indexing errors
 23%|██▎       | 8048/34278 [8:59:19<28:24:04, 3.90s/it] {'loss': 0.1627, 'grad_norm': 0.8798153568066842, 'learning_rate': 8.940121634294983e-06, 'epoch': 0.23}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 23%|██▎       | 8049/34278 [8:59:22<26:41:35, 3.66s/it] {'loss': 0.1569, 'grad_norm': 0.8248862615696155, 'learning_rate': 8.939830765405198e-06, 'epoch': 0.23}
 23%|██▎       | 8050/34278 [8:59:25<24:39:21, 3.38s/it] {'loss': 0.1523, 'grad_norm': 1.2675451855792133, 'learning_rate': 8.939539861341544e-06, 'epoch': 0.23}
 23%|██▎       | 8051/34278 [8:59:28<24:16:09, 3.33s/it] {'loss': 0.1699, 'grad_norm': 0.8146104364588431, 'learning_rate': 8.939248922106618e-06, 'epoch': 0.23}
 23%|██▎       | 8052/34278 [8:59:30<22:50:57, 3.14s/it] {'loss': 0.1512, 'grad_norm': 0.8045537304185456, 'learning_rate': 8.938957947703019e-06, 'epoch': 0.23}
 23%|██▎       | 8053/34278 [8:59:34<24:01:39, 3.30s/it] {'loss': 0.1859, 'grad_norm': 0.9183869283521147, 'learning_rate': 8.938666938133343e-06, 'epoch': 0.23}
 23%|██▎       | 8054/34278 [8:59:37<23:46:15, 3.26s/it] {'loss': 0.1604, 'grad_norm': 0.8897729225953223, 'learning_rate': 8.938375893400189e-06, 'epoch': 0.23}
 23%|██▎       | 8055/34278 [8:59:43<29:48:43, 4.09s/it] {'loss': 0.1525, 'grad_norm': 0.7725779472047589, 'learning_rate': 8.938084813506155e-06, 'epoch': 0.23}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 24%|██▎       | 8056/34278 [8:59:46<26:45:09, 3.67s/it] {'loss': 0.1627, 'grad_norm': 0.7989758230798651, 'learning_rate': 8.937793698453841e-06, 'epoch': 0.24}
 24%|██▎       | 8057/34278 [8:59:49<25:05:55, 3.45s/it] {'loss': 0.1553, 'grad_norm': 0.8581415588098958, 'learning_rate': 8.937502548245844e-06, 'epoch': 0.24}
 24%|██▎       | 8058/34278 [8:59:54<29:03:53, 3.99s/it] {'loss': 0.1514, 'grad_norm': 0.9810036306537716, 'learning_rate': 8.937211362884764e-06, 'epoch': 0.24}
 24%|██▎       | 8059/34278 [8:59:58<29:40:12, 4.07s/it] {'loss': 0.1743, 'grad_norm': 0.702512980123612, 'learning_rate': 8.9369201423732e-06, 'epoch': 0.24}
 24%|██▎       | 8060/34278 [9:00:02<27:33:53, 3.78s/it] {'loss': 0.1582, 'grad_norm': 0.9888176350519338, 'learning_rate': 8.936628886713754e-06, 'epoch': 0.24}
 24%|██▎       | 8061/34278 [9:00:05<26:26:02, 3.63s/it] {'loss': 0.1596, 'grad_norm': 0.9156496686672432, 'learning_rate': 8.936337595909024e-06, 'epoch': 0.24}
 24%|██▎       | 8062/34278 [9:00:08<25:09:03, 3.45s/it] {'loss': 0.152, 'grad_norm': 0.7957213009908428, 'learning_rate': 8.936046269961614e-06, 'epoch': 0.24}
 24%|██▎       | 8063/34278 [9:00:11<24:20:49, 3.34s/it] {'loss': 0.2111, 'grad_norm': 0.9959649863010526, 'learning_rate': 8.93575490887412e-06, 'epoch': 0.24}
 24%|██▎       | 8064/34278 [9:00:15<24:44:03, 3.40s/it] {'loss': 0.1481, 'grad_norm': 1.0618771730079335, 'learning_rate': 8.935463512649147e-06, 'epoch': 0.24}
 24%|██▎       | 8065/34278 [9:00:18<25:09:32, 3.46s/it] {'loss': 0.1599, 'grad_norm': 0.8232608343299364, 'learning_rate': 8.935172081289293e-06, 'epoch': 0.24}
 24%|██▎       | 8066/34278 [9:00:21<24:15:57, 3.33s/it] {'loss': 0.154, 'grad_norm': 0.9889686791904171, 'learning_rate': 8.934880614797166e-06, 'epoch': 0.24}
 24%|██▎       | 8067/34278 [9:00:24<23:22:08, 3.21s/it] {'loss': 0.1548, 'grad_norm': 0.8758211257441239, 'learning_rate': 8.934589113175363e-06, 'epoch': 0.24}
 24%|██▎       | 8068/34278 [9:00:27<22:55:59, 3.15s/it] {'loss': 0.1443, 'grad_norm': 0.861350412594102, 'learning_rate': 8.934297576426487e-06, 'epoch': 0.24}
 24%|██▎       | 8069/34278 [9:00:30<23:01:00, 3.16s/it] {'loss': 0.154, 'grad_norm': 0.8082345622066743, 'learning_rate': 8.93400600455314e-06, 'epoch': 0.24}
 24%|██▎       | 8070/34278 [9:00:34<25:14:19, 3.47s/it] {'loss': 0.1865, 'grad_norm': 1.0046091692315662, 'learning_rate': 8.933714397557928e-06, 'epoch': 0.24}
 24%|██▎       | 8071/34278 [9:00:38<25:18:31, 3.48s/it] {'loss': 0.1387, 'grad_norm': 0.9800636252704941, 'learning_rate': 8.933422755443453e-06, 'epoch': 0.24}
 24%|██▎       | 8072/34278 [9:00:41<25:14:32, 3.47s/it] {'loss': 0.1666, 'grad_norm': 6.8535591367862825, 'learning_rate': 8.933131078212318e-06, 'epoch': 0.24}
 24%|██▎       | 8073/34278 [9:00:44<24:22:29, 3.35s/it] {'loss': 0.1554, 'grad_norm': 1.024408699368027, 'learning_rate': 8.932839365867127e-06, 'epoch': 0.24}
 24%|██▎       | 8074/34278 [9:00:47<23:14:20, 3.19s/it] {'loss': 0.1703, 'grad_norm': 0.8963925660044842, 'learning_rate': 8.932547618410486e-06, 'epoch': 0.24}
 24%|██▎       | 8075/34278 [9:00:50<22:28:43, 3.09s/it] {'loss': 0.1505, 'grad_norm': 0.7748710339714768, 'learning_rate': 8.932255835845e-06, 'epoch': 0.24}
Token indices sequence length is longer than the specified maximum sequence length for this model (8518 > 8192). Running this sequence through the model will result in indexing errors
 24%|██▎       | 8076/34278 [9:00:55<26:25:59, 3.63s/it] {'loss': 0.1912, 'grad_norm': 3.705315847541083, 'learning_rate': 8.931964018173272e-06, 'epoch': 0.24}
 24%|██▎       | 8077/34278 [9:00:59<26:38:30, 3.66s/it] {'loss': 0.1723, 'grad_norm': 0.830467918815383, 'learning_rate': 8.931672165397907e-06, 'epoch': 0.24}
 24%|██▎       | 8078/34278 [9:01:02<25:38:41, 3.52s/it] {'loss': 0.1519, 'grad_norm': 0.784473307751724, 'learning_rate': 8.931380277521511e-06, 'epoch': 0.24}
 24%|██▎       | 8079/34278 [9:01:08<31:17:54, 4.30s/it] {'loss': 0.1603, 'grad_norm': 0.9859581011216128, 'learning_rate': 8.931088354546691e-06, 'epoch': 0.24}
 24%|██▎       | 8080/34278 [9:01:12<29:41:52, 4.08s/it] {'loss': 0.176, 'grad_norm': 0.8925307079247121, 'learning_rate': 8.930796396476051e-06, 'epoch': 0.24}
 24%|██▎       | 8081/34278 [9:01:15<28:00:49, 3.85s/it] {'loss': 0.1947, 'grad_norm': 0.7635710131029193, 'learning_rate': 8.930504403312201e-06, 'epoch': 0.24}
 24%|██▎       | 8082/34278 [9:01:19<28:01:21, 3.85s/it] {'loss': 0.1627, 'grad_norm': 0.8758109918925939, 'learning_rate': 8.930212375057747e-06, 'epoch': 0.24}
 24%|██▎       | 8083/34278 [9:01:25<33:08:13, 4.55s/it] {'loss': 0.1575, 'grad_norm': 0.9129551107737623, 'learning_rate': 8.929920311715293e-06, 'epoch': 0.24}
 24%|██▎       | 8084/34278 [9:01:31<36:05:07, 4.96s/it] {'loss': 0.1373, 'grad_norm': 0.6878454110352461, 'learning_rate': 8.92962821328745e-06, 'epoch': 0.24}
 24%|██▎       | 8085/34278 [9:01:34<31:58:00, 4.39s/it] {'loss': 0.1652, 'grad_norm': 0.8301824701868347, 'learning_rate': 8.929336079776822e-06, 'epoch': 0.24}
 24%|██▎       | 8086/34278 [9:01:37<28:56:05, 3.98s/it] {'loss': 0.1528, 'grad_norm': 0.8809332968442258, 'learning_rate': 8.929043911186021e-06, 'epoch': 0.24}
 24%|██▎       | 8087/34278 [9:01:43<33:21:36, 4.59s/it] {'loss': 0.1596, 'grad_norm': 0.8597079340281912, 'learning_rate': 8.928751707517655e-06, 'epoch': 0.24}
 24%|██▎       | 8088/34278 [9:01:46<29:16:20, 4.02s/it] {'loss': 0.1846, 'grad_norm': 0.8951525859435204, 'learning_rate': 8.92845946877433e-06, 'epoch': 0.24}
 24%|██▎       | 8089/34278 [9:01:49<28:09:02, 3.87s/it] {'loss': 0.1852, 'grad_norm': 0.8740482636392535, 'learning_rate': 8.92816719495866e-06, 'epoch': 0.24}
 24%|██▎       | 8090/34278 [9:01:55<31:42:37, 4.36s/it] {'loss': 0.166, 'grad_norm': 0.7349794775328545, 'learning_rate': 8.927874886073247e-06, 'epoch': 0.24}
 24%|██▎       | 8091/34278 [9:01:58<29:40:07, 4.08s/it] {'loss': 0.1689, 'grad_norm': 0.9817526374427119, 'learning_rate': 8.927582542120707e-06, 'epoch': 0.24}
 24%|██▎       | 8092/34278 [9:02:02<28:23:43, 3.90s/it] {'loss': 0.1799, 'grad_norm': 0.9985307535530884, 'learning_rate': 8.927290163103646e-06, 'epoch': 0.24}
 24%|██▎       | 8093/34278 [9:02:05<27:27:45, 3.78s/it] {'loss': 0.1252, 'grad_norm': 0.7443694265539429, 'learning_rate': 8.926997749024677e-06, 'epoch': 0.24}
 24%|██▎       | 8094/34278 [9:02:08<25:58:46, 3.57s/it] {'loss': 0.1545, 'grad_norm': 1.0655059388918109, 'learning_rate': 8.926705299886408e-06, 'epoch': 0.24}
 24%|██▎       | 8095/34278 [9:02:13<29:38:25, 4.08s/it] {'loss': 0.1596, 'grad_norm': 0.7822390842209348, 'learning_rate': 8.926412815691454e-06, 'epoch': 0.24}
 24%|██▎       | 8096/34278 [9:02:16<27:10:06, 3.74s/it] {'loss': 0.1466, 'grad_norm': 0.6411630374731159, 'learning_rate': 8.926120296442421e-06, 'epoch': 0.24}
 24%|██▎       | 8097/34278 [9:02:22<30:44:51, 4.23s/it] {'loss': 0.1542, 'grad_norm': 0.7433286019275946, 'learning_rate': 8.925827742141926e-06, 'epoch': 0.24}
 24%|██▎       | 8098/34278 [9:02:25<28:16:11, 3.89s/it] {'loss': 0.1556, 'grad_norm': 0.8266822077356416, 'learning_rate': 8.925535152792577e-06, 'epoch': 0.24}
 24%|██▎       | 8099/34278 [9:02:28<27:35:42, 3.79s/it] {'loss': 0.1962, 'grad_norm': 0.8447328919697852, 'learning_rate': 8.925242528396986e-06, 'epoch': 0.24}
 24%|██▎       | 8100/34278 [9:02:32<27:35:10, 3.79s/it] {'loss': 0.1705, 'grad_norm': 0.867242488085671, 'learning_rate': 8.924949868957769e-06, 'epoch': 0.24}
 24%|██▎       | 8101/34278 [9:02:36<26:56:51, 3.71s/it] {'loss': 0.176, 'grad_norm': 0.861202562787116, 'learning_rate': 8.924657174477535e-06, 'epoch': 0.24}
 24%|██▎       | 8102/34278 [9:02:39<24:52:13, 3.42s/it] {'loss': 0.1737, 'grad_norm': 0.8235656420001373, 'learning_rate': 8.924364444958898e-06, 'epoch': 0.24}
 24%|██▎       | 8103/34278 [9:02:42<24:41:35, 3.40s/it] {'loss': 0.177, 'grad_norm': 0.7579535439089422, 'learning_rate': 8.924071680404474e-06, 'epoch': 0.24}
 24%|██▎       | 8104/34278 [9:02:48<30:24:52, 4.18s/it] {'loss': 0.1625, 'grad_norm': 0.8908424358661887, 'learning_rate': 8.923778880816874e-06, 'epoch': 0.24}
 24%|██▎       | 8105/34278 [9:02:53<31:53:11, 4.39s/it] {'loss': 0.1823, 'grad_norm': 0.9356466203247811, 'learning_rate': 8.923486046198712e-06, 'epoch': 0.24}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 24%|██▎       | 8106/34278 [9:02:56<30:15:29, 4.16s/it] {'loss': 0.1761, 'grad_norm': 0.7974110500492371, 'learning_rate': 8.923193176552604e-06, 'epoch': 0.24}
 24%|██▎       | 8107/34278 [9:03:02<33:11:31, 4.57s/it] {'loss': 0.1369, 'grad_norm': 0.9337107920765044, 'learning_rate': 8.922900271881163e-06, 'epoch': 0.24}
 24%|██▎       | 8108/34278 [9:03:05<30:51:10, 4.24s/it] {'loss': 0.1502, 'grad_norm': 0.7518255024182023, 'learning_rate': 8.922607332187005e-06, 'epoch': 0.24}
 24%|██▎       | 8109/34278 [9:03:08<27:44:12, 3.82s/it] {'loss': 0.1581, 'grad_norm': 0.8621728466275934, 'learning_rate': 8.922314357472745e-06, 'epoch': 0.24}
 24%|██▎       | 8110/34278 [9:03:11<26:15:00, 3.61s/it] {'loss': 0.1499, 'grad_norm': 1.0851661029928277, 'learning_rate': 8.922021347741e-06, 'epoch': 0.24}
 24%|██▎       | 8111/34278 [9:03:14<25:01:40, 3.44s/it] {'loss': 0.141, 'grad_norm': 0.79383978632219, 'learning_rate': 8.921728302994385e-06, 'epoch': 0.24}
 24%|██▎       | 8112/34278 [9:03:20<30:47:30, 4.24s/it] {'loss': 0.1548, 'grad_norm': 0.8758827526338778, 'learning_rate': 8.921435223235514e-06, 'epoch': 0.24}
 24%|██▎       | 8113/34278 [9:03:25<30:26:46, 4.19s/it] {'loss': 0.1923, 'grad_norm': 1.003926531533097, 'learning_rate': 8.921142108467007e-06, 'epoch': 0.24}
 24%|██▎       | 8114/34278 [9:03:28<27:50:52, 3.83s/it] {'loss': 0.1885, 'grad_norm': 0.7985070966603274, 'learning_rate': 8.920848958691479e-06, 'epoch': 0.24}
 24%|██▎       | 8115/34278 [9:03:32<28:53:48, 3.98s/it] {'loss': 0.1591, 'grad_norm': 0.8518616953368908, 'learning_rate': 8.920555773911547e-06, 'epoch': 0.24}
 24%|██▎       | 8116/34278 [9:03:35<27:30:08, 3.78s/it] {'loss': 0.1658, 'grad_norm': 0.9892504056410393, 'learning_rate': 8.920262554129828e-06, 'epoch': 0.24}
 24%|██▎       | 8117/34278 [9:03:39<28:36:09, 3.94s/it] {'loss': 0.1509, 'grad_norm': 0.9199013954920274, 'learning_rate': 8.919969299348943e-06, 'epoch': 0.24}
 24%|██▎       | 8118/34278 [9:03:43<26:55:18, 3.70s/it] {'loss': 0.1915, 'grad_norm': 1.0071317843558194, 'learning_rate': 8.919676009571508e-06, 'epoch': 0.24}
 24%|██▎       | 8119/34278 [9:03:47<29:27:18, 4.05s/it] {'loss': 0.1676, 'grad_norm': 0.8415660230966282, 'learning_rate': 8.919382684800138e-06, 'epoch': 0.24}
 24%|██▎       | 8120/34278 [9:03:51<28:22:24, 3.90s/it] {'loss': 0.1498, 'grad_norm': 1.0731547107912802, 'learning_rate': 8.919089325037457e-06, 'epoch': 0.24}
 24%|██▎       | 8121/34278 [9:03:54<26:52:38, 3.70s/it] {'loss': 0.1551, 'grad_norm': 0.9496190979236668, 'learning_rate': 8.918795930286084e-06, 'epoch': 0.24}
 24%|██▎       | 8122/34278 [9:03:57<25:32:13, 3.51s/it] {'loss': 0.1679, 'grad_norm': 0.731065271012625, 'learning_rate': 8.918502500548633e-06, 'epoch': 0.24}
 24%|██▎       | 8123/34278 [9:04:01<24:56:01, 3.43s/it] {'loss': 0.1899, 'grad_norm': 1.0046200625359758, 'learning_rate': 8.91820903582773e-06, 'epoch': 0.24}
 24%|██▎       | 8124/34278 [9:04:04<24:58:25, 3.44s/it] {'loss': 0.1892, 'grad_norm': 1.064180628499699, 'learning_rate': 8.91791553612599e-06, 'epoch': 0.24}
 24%|██▎       | 8125/34278 [9:04:07<24:35:32, 3.39s/it] {'loss': 0.174, 'grad_norm': 0.7097016029175054, 'learning_rate': 8.917622001446035e-06, 'epoch': 0.24}
 24%|██▎       | 8126/34278 [9:04:11<25:00:32, 3.44s/it] {'loss': 0.1612, 'grad_norm': 0.9185933572329031, 'learning_rate': 8.917328431790488e-06, 'epoch': 0.24}
 24%|██▎       | 8127/34278 [9:04:16<29:00:45, 3.99s/it] {'loss': 0.1731, 'grad_norm': 0.9006503602478779, 'learning_rate': 8.917034827161969e-06, 'epoch': 0.24}
 24%|██▎       | 8128/34278 [9:04:19<26:01:05, 3.58s/it] {'loss': 0.1305, 'grad_norm': 0.8489097841051876, 'learning_rate': 8.916741187563094e-06, 'epoch': 0.24}
 24%|██▎       | 8129/34278 [9:04:24<28:30:55, 3.93s/it] {'loss': 0.1641, 'grad_norm': 0.9279074904873551, 'learning_rate': 8.91644751299649e-06, 'epoch': 0.24}
 24%|██▎       | 8130/34278 [9:04:27<26:51:53, 3.70s/it] {'loss': 0.1335, 'grad_norm': 0.6544628540452366, 'learning_rate': 8.91615380346478e-06, 'epoch': 0.24}
 24%|██▎       | 8131/34278 [9:04:30<26:14:00, 3.61s/it] {'loss': 0.1345, 'grad_norm': 0.7925203958586841, 'learning_rate': 8.915860058970582e-06, 'epoch': 0.24}
 24%|██▎       | 8132/34278 [9:04:33<25:46:08, 3.55s/it] {'loss': 0.1621, 'grad_norm': 0.7910895955053553, 'learning_rate': 8.91556627951652e-06, 'epoch': 0.24}
 24%|██▎       | 8133/34278 [9:04:39<30:32:41, 4.21s/it] {'loss': 0.1854, 'grad_norm': 0.7104863015512473, 'learning_rate': 8.915272465105218e-06, 'epoch': 0.24}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 24%|██▎       | 8134/34278 [9:04:43<29:36:57, 4.08s/it] {'loss': 0.1868, 'grad_norm': 1.0036378804966084, 'learning_rate': 8.914978615739297e-06, 'epoch': 0.24}
 24%|██▎       | 8135/34278 [9:04:46<26:45:35, 3.68s/it] {'loss': 0.1628, 'grad_norm': 0.8674590168823036, 'learning_rate': 8.914684731421382e-06, 'epoch': 0.24}
 24%|██▎       | 8136/34278 [9:04:51<29:26:32, 4.05s/it] {'loss': 0.1769, 'grad_norm': 0.8672015430406163, 'learning_rate': 8.914390812154094e-06, 'epoch': 0.24}
 24%|██▎       | 8137/34278 [9:04:56<31:06:37, 4.28s/it] {'loss': 0.161, 'grad_norm': 0.6706117907279768, 'learning_rate': 8.914096857940062e-06, 'epoch': 0.24}
 24%|██▎       | 8138/34278 [9:04:59<28:43:17, 3.96s/it] {'loss': 0.1873, 'grad_norm': 0.8967270688761205, 'learning_rate': 8.913802868781907e-06, 'epoch': 0.24}
 24%|██▎       | 8139/34278 [9:05:02<26:40:05, 3.67s/it] {'loss': 0.1524, 'grad_norm': 0.7705670336362179, 'learning_rate': 8.913508844682255e-06, 'epoch': 0.24}
 24%|██▎       | 8140/34278 [9:05:06<28:54:52, 3.98s/it] {'loss': 0.1529, 'grad_norm': 0.822662568821872, 'learning_rate': 8.91321478564373e-06, 'epoch': 0.24}
 24%|██▎       | 8141/34278 [9:05:12<33:16:36, 4.58s/it] {'loss': 0.1762, 'grad_norm': 0.8236205950678361, 'learning_rate': 8.912920691668957e-06, 'epoch': 0.24}
 24%|██▍       | 8142/34278 [9:05:16<31:12:30, 4.30s/it] {'loss': 0.1395, 'grad_norm': 0.8853210785081229, 'learning_rate': 8.912626562760563e-06, 'epoch': 0.24}
 24%|██▍       | 8143/34278 [9:05:22<35:04:55, 4.83s/it] {'loss': 0.1518, 'grad_norm': 0.7586904563452705, 'learning_rate': 8.912332398921171e-06, 'epoch': 0.24}
 24%|██▍       | 8144/34278 [9:05:26<32:16:35, 4.45s/it] {'loss': 0.1772, 'grad_norm': 0.8651132750866162, 'learning_rate': 8.91203820015341e-06, 'epoch': 0.24}
 24%|██▍       | 8145/34278 [9:05:32<35:50:32, 4.94s/it] {'loss': 0.1534, 'grad_norm': 0.9343750341887734, 'learning_rate': 8.911743966459908e-06, 'epoch': 0.24}
 24%|██▍       | 8146/34278 [9:05:35<31:56:31, 4.40s/it] {'loss': 0.1513, 'grad_norm': 0.8400181663770149, 'learning_rate': 8.911449697843286e-06, 'epoch': 0.24}
 24%|██▍       | 8147/34278 [9:05:41<35:31:45, 4.89s/it] {'loss': 0.1513, 'grad_norm': 0.8103376553564225, 'learning_rate': 8.911155394306177e-06, 'epoch': 0.24}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 24%|██▍       | 8148/34278 [9:05:47<37:37:26, 5.18s/it] {'loss': 0.1707, 'grad_norm': 1.040231236289833, 'learning_rate': 8.910861055851208e-06, 'epoch': 0.24}
 24%|██▍       | 8149/34278 [9:05:50<32:28:37, 4.47s/it] {'loss': 0.1475, 'grad_norm': 0.7340572763322634, 'learning_rate': 8.910566682481001e-06, 'epoch': 0.24}
 24%|██▍       | 8150/34278 [9:05:54<32:11:27, 4.44s/it] {'loss': 0.147, 'grad_norm': 0.8822008953144147, 'learning_rate': 8.91027227419819e-06, 'epoch': 0.24}
 24%|██▍       | 8151/34278 [9:05:58<30:36:40, 4.22s/it] {'loss': 0.1408, 'grad_norm': 0.7228118620003549, 'learning_rate': 8.909977831005403e-06, 'epoch': 0.24}
 24%|██▍       | 8152/34278 [9:06:01<29:16:14, 4.03s/it] {'loss': 0.1888, 'grad_norm': 1.1023290703753612, 'learning_rate': 8.909683352905267e-06, 'epoch': 0.24}
 24%|██▍       | 8153/34278 [9:06:04<27:06:47, 3.74s/it] {'loss': 0.1543, 'grad_norm': 0.8220589699407617, 'learning_rate': 8.90938883990041e-06, 'epoch': 0.24}
 24%|██▍       | 8154/34278 [9:06:11<32:34:46, 4.49s/it] {'loss': 0.157, 'grad_norm': 0.9489623908894284, 'learning_rate': 8.909094291993464e-06, 'epoch': 0.24}
 24%|██▍       | 8155/34278 [9:06:14<29:36:22, 4.08s/it] {'loss': 0.1531, 'grad_norm': 0.7905877103053048, 'learning_rate': 8.908799709187057e-06, 'epoch': 0.24}
 24%|██▍       | 8156/34278 [9:06:18<30:42:00, 4.23s/it] {'loss': 0.1514, 'grad_norm': 0.8957629797758214, 'learning_rate': 8.908505091483819e-06, 'epoch': 0.24}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 24%|██▍       | 8157/34278 [9:06:22<29:53:13, 4.12s/it] {'loss': 0.1827, 'grad_norm': 0.9786727450738257, 'learning_rate': 8.90821043888638e-06, 'epoch': 0.24}
 24%|██▍       | 8158/34278 [9:06:26<29:25:07, 4.05s/it] {'loss': 0.1646, 'grad_norm': 0.7937521194492156, 'learning_rate': 8.907915751397372e-06, 'epoch': 0.24}
 24%|██▍       | 8159/34278 [9:06:29<27:22:38, 3.77s/it] {'loss': 0.1629, 'grad_norm': 0.702109595124551, 'learning_rate': 8.907621029019425e-06, 'epoch': 0.24}
 24%|██▍       | 8160/34278 [9:06:32<25:44:47, 3.55s/it] {'loss': 0.1617, 'grad_norm': 0.876485020854691, 'learning_rate': 8.907326271755171e-06, 'epoch': 0.24}
 24%|██▍       | 8161/34278 [9:06:37<28:53:55, 3.98s/it] {'loss': 0.1543, 'grad_norm': 0.8172164952933865, 'learning_rate': 8.90703147960724e-06, 'epoch': 0.24}
 24%|██▍       | 8162/34278 [9:06:40<26:38:01, 3.67s/it] {'loss': 0.1711, 'grad_norm': 1.0589003402978017, 'learning_rate': 8.906736652578264e-06, 'epoch': 0.24}
 24%|██▍       | 8163/34278 [9:06:43<25:03:22, 3.45s/it] {'loss': 0.1676, 'grad_norm': 0.8320959954646029, 'learning_rate': 8.906441790670877e-06, 'epoch': 0.24}
 24%|██▍       | 8164/34278 [9:06:49<31:15:09, 4.31s/it] {'loss': 0.1526, 'grad_norm': 0.7943563348025561, 'learning_rate': 8.906146893887708e-06, 'epoch': 0.24}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 24%|██▍       | 8165/34278 [9:06:53<28:46:06, 3.97s/it] {'loss': 0.1519, 'grad_norm': 0.9243283488288019, 'learning_rate': 8.905851962231393e-06, 'epoch': 0.24}
 24%|██▍       | 8166/34278 [9:06:56<27:07:20, 3.74s/it] {'loss': 0.1452, 'grad_norm': 0.6483119265403959, 'learning_rate': 8.905556995704566e-06, 'epoch': 0.24}
 24%|██▍       | 8167/34278 [9:07:02<31:55:27, 4.40s/it] {'loss': 0.1974, 'grad_norm': 1.0631456207996153, 'learning_rate': 8.905261994309857e-06, 'epoch': 0.24}
 24%|██▍       | 8168/34278 [9:07:05<28:53:41, 3.98s/it] {'loss': 0.1361, 'grad_norm': 0.7426459150555751, 'learning_rate': 8.9049669580499e-06, 'epoch': 0.24}
 24%|██▍       | 8169/34278 [9:07:08<26:53:09, 3.71s/it] {'loss': 0.1559, 'grad_norm': 0.7508657482341585, 'learning_rate': 8.904671886927334e-06, 'epoch': 0.24}
 24%|██▍       | 8170/34278 [9:07:11<26:30:39, 3.66s/it] {'loss': 0.1678, 'grad_norm': 0.9359231183611243, 'learning_rate': 8.904376780944786e-06, 'epoch': 0.24}
 24%|██▍       | 8171/34278 [9:07:14<25:07:49, 3.47s/it] {'loss': 0.1437, 'grad_norm': 0.8394916958169752, 'learning_rate': 8.904081640104895e-06, 'epoch': 0.24}
 24%|██▍       | 8172/34278 [9:07:18<25:11:57, 3.47s/it] {'loss': 0.1254, 'grad_norm': 0.7594288412337721, 'learning_rate': 8.903786464410295e-06, 'epoch': 0.24}
 24%|██▍       | 8173/34278 [9:07:23<29:53:04, 4.12s/it] {'loss': 0.1525, 'grad_norm': 0.8464185946780548, 'learning_rate': 8.903491253863622e-06, 'epoch': 0.24}
 24%|██▍       | 8174/34278 [9:07:27<28:36:21, 3.95s/it] {'loss': 0.1619, 'grad_norm': 0.8671937451759861, 'learning_rate': 8.903196008467511e-06, 'epoch': 0.24}
 24%|██▍       | 8175/34278 [9:07:30<26:02:49, 3.59s/it] {'loss': 0.1697, 'grad_norm': 0.9450577389185771, 'learning_rate': 8.902900728224597e-06, 'epoch': 0.24}
 24%|██▍       | 8176/34278 [9:07:33<25:47:34, 3.56s/it] {'loss': 0.1456, 'grad_norm': 0.6310459193767253, 'learning_rate': 8.902605413137517e-06, 'epoch': 0.24}
 24%|██▍       | 8177/34278 [9:07:40<31:54:09, 4.40s/it] {'loss': 0.1753, 'grad_norm': 0.8438666745577079, 'learning_rate': 8.902310063208907e-06, 'epoch': 0.24}
 24%|██▍       | 8178/34278 [9:07:43<29:18:26, 4.04s/it] {'loss': 0.1585, 'grad_norm': 0.7813218821139011, 'learning_rate': 8.902014678441406e-06, 'epoch': 0.24}
 24%|██▍       | 8179/34278 [9:07:49<33:24:44, 4.61s/it] {'loss': 0.179, 'grad_norm': 0.7184763683271836, 'learning_rate': 8.90171925883765e-06, 'epoch': 0.24}
 24%|██▍       | 8180/34278 [9:07:52<30:35:24, 4.22s/it] {'loss': 0.1457, 'grad_norm': 0.788171670428232, 'learning_rate': 8.901423804400273e-06, 'epoch': 0.24}
 24%|██▍       | 8181/34278 [9:07:55<27:33:31, 3.80s/it] {'loss': 0.147, 'grad_norm': 0.8076824354053939, 'learning_rate': 8.901128315131917e-06, 'epoch': 0.24}
 24%|██▍       | 8182/34278 [9:08:01<31:56:01, 4.41s/it] {'loss': 0.1428, 'grad_norm': 0.7033556840440839, 'learning_rate': 8.900832791035218e-06, 'epoch': 0.24}
 24%|██▍       | 8183/34278 [9:08:05<30:54:27, 4.26s/it] {'loss': 0.1681, 'grad_norm': 0.7371450897019787, 'learning_rate': 8.900537232112816e-06, 'epoch': 0.24}
 24%|██▍       | 8184/34278 [9:08:08<28:06:43, 3.88s/it] {'loss': 0.1788, 'grad_norm': 0.7652072647129703, 'learning_rate': 8.90024163836735e-06, 'epoch': 0.24}
 24%|██▍       | 8185/34278 [9:08:13<30:44:06, 4.24s/it] {'loss': 0.1649, 'grad_norm': 0.9637137372175031, 'learning_rate': 8.899946009801455e-06, 'epoch': 0.24}
 24%|██▍       | 8186/34278 [9:08:16<28:50:01, 3.98s/it] {'loss': 0.1829, 'grad_norm': 0.8459098693789464, 'learning_rate': 8.899650346417773e-06, 'epoch': 0.24}
 24%|██▍       | 8187/34278 [9:08:19<27:05:31, 3.74s/it] {'loss': 0.1615, 'grad_norm': 1.0011578724095007, 'learning_rate': 8.899354648218947e-06, 'epoch': 0.24}
 24%|██▍       | 8188/34278 [9:08:22<25:05:55, 3.46s/it] {'loss': 0.1681, 'grad_norm': 0.7948233494079993, 'learning_rate': 8.899058915207611e-06, 'epoch': 0.24}
 24%|██▍       | 8189/34278 [9:08:26<26:18:45, 3.63s/it] {'loss': 0.1338, 'grad_norm': 0.9046111595988586, 'learning_rate': 8.898763147386408e-06, 'epoch': 0.24}
 24%|██▍       | 8190/34278 [9:08:30<27:30:12, 3.80s/it] {'loss': 0.1772, 'grad_norm': 0.8962855423839565, 'learning_rate': 8.898467344757979e-06, 'epoch': 0.24}
 24%|██▍       | 8191/34278 [9:08:34<26:33:56, 3.67s/it] {'loss': 0.1607, 'grad_norm': 0.7896687967471685, 'learning_rate': 8.898171507324964e-06, 'epoch': 0.24}
 24%|██▍       | 8192/34278 [9:08:37<26:12:34, 3.62s/it] {'loss': 0.1556, 'grad_norm': 1.0707583006921217, 'learning_rate': 8.897875635090005e-06, 'epoch': 0.24}
 24%|██▍       | 8193/34278 [9:08:41<26:08:46, 3.61s/it] {'loss': 0.1516, 'grad_norm': 0.909136269454888, 'learning_rate': 8.89757972805574e-06, 'epoch': 0.24}
 24%|██▍       | 8194/34278 [9:08:44<25:16:46, 3.49s/it] {'loss': 0.1511, 'grad_norm': 0.8802805681779381, 'learning_rate': 8.897283786224817e-06, 'epoch': 0.24}
 24%|██▍       | 8195/34278 [9:08:50<30:41:34, 4.24s/it] {'loss': 0.1702, 'grad_norm': 0.749464207805033, 'learning_rate': 8.896987809599874e-06, 'epoch': 0.24}
 24%|██▍       | 8196/34278 [9:08:54<29:31:04, 4.07s/it] {'loss': 0.1477, 'grad_norm': 0.6808348804245334, 'learning_rate': 8.896691798183552e-06, 'epoch': 0.24}
 24%|██▍       | 8197/34278 [9:08:58<30:05:25, 4.15s/it] {'loss': 0.1527, 'grad_norm': 0.7649554312826676, 'learning_rate': 8.896395751978498e-06, 'epoch': 0.24}
 24%|██▍       | 8198/34278 [9:09:01<28:34:44, 3.94s/it] {'loss': 0.1826, 'grad_norm': 0.9094753134461215, 'learning_rate': 8.896099670987351e-06, 'epoch': 0.24}
 24%|██▍       | 8199/34278 [9:09:04<26:16:09, 3.63s/it] {'loss': 0.1668, 'grad_norm': 0.8199112485425468, 'learning_rate': 8.895803555212757e-06, 'epoch': 0.24}
 24%|██▍       | 8200/34278 [9:09:08<27:27:28, 3.79s/it] {'loss': 0.1617, 'grad_norm': 0.7843508250661938, 'learning_rate': 8.89550740465736e-06, 'epoch': 0.24}
 24%|██▍       | 8201/34278 [9:09:12<27:08:53, 3.75s/it] {'loss': 0.1416, 'grad_norm': 0.7023619564626988, 'learning_rate': 8.895211219323802e-06, 'epoch': 0.24}
 24%|██▍       | 8202/34278 [9:09:16<27:54:08, 3.85s/it] {'loss': 0.1609, 'grad_norm': 0.8051786204993654, 'learning_rate': 8.894914999214727e-06, 'epoch': 0.24}
 24%|██▍       | 8203/34278 [9:09:20<28:04:08, 3.88s/it] {'loss': 0.1716, 'grad_norm': 0.7610729256588492, 'learning_rate': 8.894618744332783e-06, 'epoch': 0.24}
 24%|██▍       | 8204/34278 [9:09:25<29:17:37, 4.04s/it] {'loss': 0.1743, 'grad_norm': 0.8722026870288849, 'learning_rate': 8.89432245468061e-06, 'epoch': 0.24}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 24%|██▍       | 8205/34278 [9:09:31<33:39:51, 4.65s/it] {'loss': 0.1566, 'grad_norm': 0.7253988078472988, 'learning_rate': 8.894026130260858e-06, 'epoch': 0.24}
 24%|██▍       | 8206/34278 [9:09:34<31:02:11, 4.29s/it] {'loss': 0.1722, 'grad_norm': 0.7558594801533177, 'learning_rate': 8.89372977107617e-06, 'epoch': 0.24}
 24%|██▍       | 8207/34278 [9:09:37<28:12:15, 3.89s/it] {'loss': 0.1419, 'grad_norm': 0.715904179026281, 'learning_rate': 8.89343337712919e-06, 'epoch': 0.24}
 24%|██▍       | 8208/34278 [9:09:41<28:50:34, 3.98s/it] {'loss': 0.1581, 'grad_norm': 0.8812508029464208, 'learning_rate': 8.893136948422569e-06, 'epoch': 0.24}
 24%|██▍       | 8209/34278 [9:09:45<29:12:20, 4.03s/it] {'loss': 0.1779, 'grad_norm': 0.8586838495986173, 'learning_rate': 8.89284048495895e-06, 'epoch': 0.24}
 24%|██▍       | 8210/34278 [9:09:51<33:33:11, 4.63s/it] {'loss': 0.1473, 'grad_norm': 0.7406003304511822, 'learning_rate': 8.892543986740979e-06, 'epoch': 0.24}
 24%|██▍       | 8211/34278 [9:09:55<30:18:50, 4.19s/it] {'loss': 0.1515, 'grad_norm': 0.8979035192711243, 'learning_rate': 8.892247453771306e-06, 'epoch': 0.24}
 24%|██▍       | 8212/34278 [9:09:58<29:38:31, 4.09s/it] {'loss': 0.1394, 'grad_norm': 0.6963097051903574, 'learning_rate': 8.891950886052576e-06, 'epoch': 0.24}
 24%|██▍       | 8213/34278 [9:10:02<27:47:29, 3.84s/it] {'loss': 0.165, 'grad_norm': 1.267749439137429, 'learning_rate': 8.891654283587438e-06, 'epoch': 0.24}
 24%|██▍       | 8214/34278 [9:10:07<30:33:57, 4.22s/it] {'loss': 0.1672, 'grad_norm': 1.1354387491417997, 'learning_rate': 8.891357646378538e-06, 'epoch': 0.24}
 24%|██▍       | 8215/34278 [9:10:10<27:50:33, 3.85s/it] {'loss': 0.1744, 'grad_norm': 0.7444284210006101, 'learning_rate': 8.891060974428528e-06, 'epoch': 0.24}
 24%|██▍       | 8216/34278 [9:10:13<27:10:19, 3.75s/it] {'loss': 0.1379, 'grad_norm': 0.6855454737424209, 'learning_rate': 8.890764267740053e-06, 'epoch': 0.24}
 24%|██▍       | 8217/34278 [9:10:17<26:50:49, 3.71s/it] {'loss': 0.1595, 'grad_norm': 0.9024297879509084, 'learning_rate': 8.890467526315765e-06, 'epoch': 0.24}
 24%|██▍       | 8218/34278 [9:10:23<31:40:12, 4.38s/it] {'loss': 0.1689, 'grad_norm': 0.7834463539446351, 'learning_rate': 8.89017075015831e-06, 'epoch': 0.24}
 24%|██▍       | 8219/34278 [9:10:26<29:16:32, 4.04s/it] {'loss': 0.1837, 'grad_norm': 0.988813715135792, 'learning_rate': 8.889873939270341e-06, 'epoch': 0.24}
 24%|██▍       | 8220/34278 [9:10:29<27:01:07, 3.73s/it] {'loss': 0.1419, 'grad_norm': 0.7267918856114794, 'learning_rate': 8.889577093654504e-06, 'epoch': 0.24}
 24%|██▍       | 8221/34278 [9:10:32<24:53:02, 3.44s/it] {'loss': 0.1544, 'grad_norm': 0.7787868909140904, 'learning_rate': 8.889280213313454e-06, 'epoch': 0.24}
24%|██▍ | 8222/34278 [9:10:36<26:12:06, 3.62s/it] {'loss': 0.1623, 'grad_norm': 1.0797811006635678, 'learning_rate': 8.888983298249838e-06, 'epoch': 0.24} 24%|██▍ | 8222/34278 [9:10:36<26:12:06, 3.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 24%|██▍ | 8223/34278 [9:10:39<25:13:34, 3.49s/it] {'loss': 0.1674, 'grad_norm': 0.8443592671417744, 'learning_rate': 8.888686348466305e-06, 'epoch': 0.24} 24%|██▍ | 8223/34278 [9:10:39<25:13:34, 3.49s/it] 24%|██▍ | 8224/34278 [9:10:42<24:29:56, 3.39s/it] {'loss': 0.1826, 'grad_norm': 1.0697659872943215, 'learning_rate': 8.88838936396551e-06, 'epoch': 0.24} 24%|██▍ | 8224/34278 [9:10:42<24:29:56, 3.39s/it] 24%|██▍ | 8225/34278 [9:10:45<23:11:09, 3.20s/it] {'loss': 0.1366, 'grad_norm': 0.7664760824723789, 'learning_rate': 8.888092344750103e-06, 'epoch': 0.24} 24%|██▍ | 8225/34278 [9:10:45<23:11:09, 3.20s/it] 24%|██▍ | 8226/34278 [9:10:48<23:43:46, 3.28s/it] {'loss': 0.1776, 'grad_norm': 0.9754768917787945, 'learning_rate': 8.887795290822736e-06, 'epoch': 0.24} 24%|██▍ | 8226/34278 [9:10:48<23:43:46, 3.28s/it] 24%|██▍ | 8227/34278 [9:10:55<29:58:31, 4.14s/it] {'loss': 0.1644, 'grad_norm': 0.8573701254119962, 'learning_rate': 8.887498202186062e-06, 'epoch': 0.24} 24%|██▍ | 8227/34278 [9:10:55<29:58:31, 4.14s/it] 24%|██▍ | 8228/34278 [9:10:58<28:51:37, 3.99s/it] {'loss': 0.1358, 'grad_norm': 0.9220847422993835, 'learning_rate': 8.88720107884273e-06, 'epoch': 0.24} 24%|██▍ | 8228/34278 [9:10:58<28:51:37, 3.99s/it] 24%|██▍ | 8229/34278 [9:11:04<33:09:07, 4.58s/it] {'loss': 0.1627, 'grad_norm': 0.9619544322343004, 'learning_rate': 8.886903920795396e-06, 'epoch': 0.24} 24%|██▍ | 8229/34278 [9:11:04<33:09:07, 4.58s/it] 24%|██▍ | 8230/34278 [9:11:08<31:00:45, 4.29s/it] {'loss': 0.1609, 'grad_norm': 0.9877543280422053, 'learning_rate': 8.88660672804671e-06, 
'epoch': 0.24} 24%|██▍ | 8230/34278 [9:11:08<31:00:45, 4.29s/it] 24%|██▍ | 8231/34278 [9:11:13<31:53:19, 4.41s/it] {'loss': 0.1835, 'grad_norm': 0.770908935040861, 'learning_rate': 8.886309500599328e-06, 'epoch': 0.24} 24%|██▍ | 8231/34278 [9:11:13<31:53:19, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 24%|██▍ | 8232/34278 [9:11:16<29:54:00, 4.13s/it] {'loss': 0.1808, 'grad_norm': 1.1567854629304466, 'learning_rate': 8.886012238455903e-06, 'epoch': 0.24} 24%|██▍ | 8232/34278 [9:11:16<29:54:00, 4.13s/it] 24%|██▍ | 8233/34278 [9:11:19<27:21:26, 3.78s/it] {'loss': 0.1654, 'grad_norm': 1.1332016046396984, 'learning_rate': 8.885714941619087e-06, 'epoch': 0.24} 24%|██▍ | 8233/34278 [9:11:19<27:21:26, 3.78s/it] 24%|██▍ | 8234/34278 [9:11:22<25:51:39, 3.57s/it] {'loss': 0.1418, 'grad_norm': 0.9002091516933857, 'learning_rate': 8.885417610091536e-06, 'epoch': 0.24} 24%|██▍ | 8234/34278 [9:11:22<25:51:39, 3.57s/it] 24%|██▍ | 8235/34278 [9:11:25<24:32:26, 3.39s/it] {'loss': 0.1466, 'grad_norm': 1.2171802257120898, 'learning_rate': 8.885120243875905e-06, 'epoch': 0.24} 24%|██▍ | 8235/34278 [9:11:25<24:32:26, 3.39s/it] 24%|██▍ | 8236/34278 [9:11:28<23:21:46, 3.23s/it] {'loss': 0.1697, 'grad_norm': 0.9221963422303984, 'learning_rate': 8.884822842974847e-06, 'epoch': 0.24} 24%|██▍ | 8236/34278 [9:11:28<23:21:46, 3.23s/it] 24%|██▍ | 8237/34278 [9:11:32<24:49:49, 3.43s/it] {'loss': 0.1695, 'grad_norm': 0.9012719303878594, 'learning_rate': 8.88452540739102e-06, 'epoch': 0.24} 24%|██▍ | 8237/34278 [9:11:32<24:49:49, 3.43s/it] 24%|██▍ | 8238/34278 [9:11:38<30:38:42, 4.24s/it] {'loss': 0.1563, 'grad_norm': 0.8564097549638308, 'learning_rate': 8.884227937127076e-06, 'epoch': 0.24} 24%|██▍ | 8238/34278 [9:11:38<30:38:42, 4.24s/it] 24%|██▍ | 8239/34278 [9:11:41<28:03:58, 3.88s/it] {'loss': 0.1455, 
'grad_norm': 0.8370971704673468, 'learning_rate': 8.883930432185673e-06, 'epoch': 0.24} 24%|██▍ | 8239/34278 [9:11:41<28:03:58, 3.88s/it] 24%|██▍ | 8240/34278 [9:11:47<32:04:04, 4.43s/it] {'loss': 0.1617, 'grad_norm': 1.0679698209456254, 'learning_rate': 8.883632892569466e-06, 'epoch': 0.24} 24%|██▍ | 8240/34278 [9:11:47<32:04:04, 4.43s/it] 24%|██▍ | 8241/34278 [9:11:50<28:53:45, 4.00s/it] {'loss': 0.1589, 'grad_norm': 1.118725383962962, 'learning_rate': 8.88333531828111e-06, 'epoch': 0.24} 24%|██▍ | 8241/34278 [9:11:50<28:53:45, 4.00s/it] 24%|██▍ | 8242/34278 [9:11:53<26:30:42, 3.67s/it] {'loss': 0.1632, 'grad_norm': 0.9374701175420922, 'learning_rate': 8.883037709323263e-06, 'epoch': 0.24} 24%|██▍ | 8242/34278 [9:11:53<26:30:42, 3.67s/it] 24%|██▍ | 8243/34278 [9:11:56<25:14:43, 3.49s/it] {'loss': 0.1641, 'grad_norm': 0.894106912041009, 'learning_rate': 8.882740065698586e-06, 'epoch': 0.24} 24%|██▍ | 8243/34278 [9:11:56<25:14:43, 3.49s/it] 24%|██▍ | 8244/34278 [9:11:59<24:46:46, 3.43s/it] {'loss': 0.1426, 'grad_norm': 0.88794811179921, 'learning_rate': 8.882442387409729e-06, 'epoch': 0.24} 24%|██▍ | 8244/34278 [9:11:59<24:46:46, 3.43s/it] 24%|██▍ | 8245/34278 [9:12:02<24:37:32, 3.41s/it] {'loss': 0.1781, 'grad_norm': 0.8991441307822194, 'learning_rate': 8.882144674459354e-06, 'epoch': 0.24} 24%|██▍ | 8245/34278 [9:12:02<24:37:32, 3.41s/it] 24%|██▍ | 8246/34278 [9:12:08<30:05:49, 4.16s/it] {'loss': 0.1424, 'grad_norm': 0.8587688129744776, 'learning_rate': 8.88184692685012e-06, 'epoch': 0.24} 24%|██▍ | 8246/34278 [9:12:08<30:05:49, 4.16s/it] 24%|██▍ | 8247/34278 [9:12:12<28:55:29, 4.00s/it] {'loss': 0.1662, 'grad_norm': 1.0713316984903607, 'learning_rate': 8.88154914458468e-06, 'epoch': 0.24} 24%|██▍ | 8247/34278 [9:12:12<28:55:29, 4.00s/it] 24%|██▍ | 8248/34278 [9:12:18<33:10:15, 4.59s/it] {'loss': 0.1503, 'grad_norm': 0.8101948080509492, 'learning_rate': 8.881251327665699e-06, 'epoch': 0.24} 24%|██▍ | 8248/34278 [9:12:18<33:10:15, 
4.59s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 24%|██▍ | 8249/34278 [9:12:21<30:30:45, 4.22s/it] {'loss': 0.1642, 'grad_norm': 0.7635063527922878, 'learning_rate': 8.88095347609583e-06, 'epoch': 0.24} 24%|██▍ | 8249/34278 [9:12:21<30:30:45, 4.22s/it] 24%|██▍ | 8250/34278 [9:12:24<28:39:14, 3.96s/it] {'loss': 0.163, 'grad_norm': 0.882359914169995, 'learning_rate': 8.880655589877737e-06, 'epoch': 0.24} 24%|██▍ | 8250/34278 [9:12:24<28:39:14, 3.96s/it] 24%|██▍ | 8251/34278 [9:12:28<27:01:28, 3.74s/it] {'loss': 0.1562, 'grad_norm': 0.6826838905996303, 'learning_rate': 8.880357669014077e-06, 'epoch': 0.24} 24%|██▍ | 8251/34278 [9:12:28<27:01:28, 3.74s/it] 24%|██▍ | 8252/34278 [9:12:31<25:21:43, 3.51s/it] {'loss': 0.1518, 'grad_norm': 0.7851893250684556, 'learning_rate': 8.88005971350751e-06, 'epoch': 0.24} 24%|██▍ | 8252/34278 [9:12:31<25:21:43, 3.51s/it] 24%|██▍ | 8253/34278 [9:12:36<30:17:16, 4.19s/it] {'loss': 0.1783, 'grad_norm': 0.8064569725792964, 'learning_rate': 8.879761723360695e-06, 'epoch': 0.24} 24%|██▍ | 8253/34278 [9:12:36<30:17:16, 4.19s/it] 24%|██▍ | 8254/34278 [9:12:40<29:35:54, 4.09s/it] {'loss': 0.1567, 'grad_norm': 0.7934433704487039, 'learning_rate': 8.879463698576294e-06, 'epoch': 0.24} 24%|██▍ | 8254/34278 [9:12:40<29:35:54, 4.09s/it] 24%|██▍ | 8255/34278 [9:12:44<27:58:42, 3.87s/it] {'loss': 0.1509, 'grad_norm': 0.7199144588868956, 'learning_rate': 8.879165639156968e-06, 'epoch': 0.24} 24%|██▍ | 8255/34278 [9:12:44<27:58:42, 3.87s/it] 24%|██▍ | 8256/34278 [9:12:47<26:25:07, 3.65s/it] {'loss': 0.1502, 'grad_norm': 0.8761875305665132, 'learning_rate': 8.878867545105377e-06, 'epoch': 0.24} 24%|██▍ | 8256/34278 [9:12:47<26:25:07, 3.65s/it] 24%|██▍ | 8257/34278 [9:12:50<24:58:30, 3.46s/it] {'loss': 0.1408, 'grad_norm': 1.013003472781489, 'learning_rate': 
8.87856941642418e-06, 'epoch': 0.24} 24%|██▍ | 8257/34278 [9:12:50<24:58:30, 3.46s/it] 24%|██▍ | 8258/34278 [9:12:56<30:58:44, 4.29s/it] {'loss': 0.1632, 'grad_norm': 0.9854636634111242, 'learning_rate': 8.878271253116044e-06, 'epoch': 0.24} 24%|██▍ | 8258/34278 [9:12:56<30:58:44, 4.29s/it] 24%|██▍ | 8259/34278 [9:13:00<30:26:31, 4.21s/it] {'loss': 0.2009, 'grad_norm': 0.7951311412085198, 'learning_rate': 8.877973055183629e-06, 'epoch': 0.24} 24%|██▍ | 8259/34278 [9:13:00<30:26:31, 4.21s/it] 24%|██▍ | 8260/34278 [9:13:03<27:44:35, 3.84s/it] {'loss': 0.1964, 'grad_norm': 1.0186181133174064, 'learning_rate': 8.877674822629595e-06, 'epoch': 0.24} 24%|██▍ | 8260/34278 [9:13:03<27:44:35, 3.84s/it] 24%|██▍ | 8261/34278 [9:13:06<26:32:57, 3.67s/it] {'loss': 0.1573, 'grad_norm': 0.9672603265242955, 'learning_rate': 8.877376555456604e-06, 'epoch': 0.24} 24%|██▍ | 8261/34278 [9:13:06<26:32:57, 3.67s/it] 24%|██▍ | 8262/34278 [9:13:10<26:23:25, 3.65s/it] {'loss': 0.1908, 'grad_norm': 0.8894235143992626, 'learning_rate': 8.877078253667321e-06, 'epoch': 0.24} 24%|██▍ | 8262/34278 [9:13:10<26:23:25, 3.65s/it] 24%|██▍ | 8263/34278 [9:13:14<26:44:40, 3.70s/it] {'loss': 0.1466, 'grad_norm': 0.8754001686628599, 'learning_rate': 8.876779917264412e-06, 'epoch': 0.24} 24%|██▍ | 8263/34278 [9:13:14<26:44:40, 3.70s/it] 24%|██▍ | 8264/34278 [9:13:17<25:49:17, 3.57s/it] {'loss': 0.157, 'grad_norm': 1.0628663388415525, 'learning_rate': 8.876481546250535e-06, 'epoch': 0.24} 24%|██▍ | 8264/34278 [9:13:17<25:49:17, 3.57s/it] 24%|██▍ | 8265/34278 [9:13:20<24:42:34, 3.42s/it] {'loss': 0.145, 'grad_norm': 0.9723407257991477, 'learning_rate': 8.876183140628355e-06, 'epoch': 0.24} 24%|██▍ | 8265/34278 [9:13:20<24:42:34, 3.42s/it] 24%|██▍ | 8266/34278 [9:13:25<28:54:59, 4.00s/it] {'loss': 0.1625, 'grad_norm': 0.9469882403125053, 'learning_rate': 8.87588470040054e-06, 'epoch': 0.24} 24%|██▍ | 8266/34278 [9:13:25<28:54:59, 4.00s/it] 24%|██▍ | 8267/34278 [9:13:29<27:40:22, 3.83s/it] {'loss': 0.1407, 
'grad_norm': 0.8699907055802648, 'learning_rate': 8.87558622556975e-06, 'epoch': 0.24} 24%|██▍ | 8267/34278 [9:13:29<27:40:22, 3.83s/it] 24%|██▍ | 8268/34278 [9:13:32<26:06:10, 3.61s/it] {'loss': 0.1565, 'grad_norm': 0.9010022926823396, 'learning_rate': 8.875287716138651e-06, 'epoch': 0.24} 24%|██▍ | 8268/34278 [9:13:32<26:06:10, 3.61s/it] 24%|██▍ | 8269/34278 [9:13:35<25:48:13, 3.57s/it] {'loss': 0.147, 'grad_norm': 0.8853167383305223, 'learning_rate': 8.87498917210991e-06, 'epoch': 0.24} 24%|██▍ | 8269/34278 [9:13:35<25:48:13, 3.57s/it] 24%|██▍ | 8270/34278 [9:13:39<26:44:58, 3.70s/it] {'loss': 0.1628, 'grad_norm': 0.7458998076653564, 'learning_rate': 8.87469059348619e-06, 'epoch': 0.24} 24%|██▍ | 8270/34278 [9:13:39<26:44:58, 3.70s/it] 24%|██▍ | 8271/34278 [9:13:43<25:33:09, 3.54s/it] {'loss': 0.1468, 'grad_norm': 0.9189429055503623, 'learning_rate': 8.874391980270157e-06, 'epoch': 0.24} 24%|██▍ | 8271/34278 [9:13:43<25:33:09, 3.54s/it] 24%|██▍ | 8272/34278 [9:13:46<25:57:24, 3.59s/it] {'loss': 0.15, 'grad_norm': 0.7682745223848754, 'learning_rate': 8.874093332464477e-06, 'epoch': 0.24} 24%|██▍ | 8272/34278 [9:13:46<25:57:24, 3.59s/it] 24%|██▍ | 8273/34278 [9:13:50<25:26:49, 3.52s/it] {'loss': 0.1593, 'grad_norm': 0.7591197071634669, 'learning_rate': 8.873794650071819e-06, 'epoch': 0.24} 24%|██▍ | 8273/34278 [9:13:50<25:26:49, 3.52s/it] 24%|██▍ | 8274/34278 [9:13:53<24:39:11, 3.41s/it] {'loss': 0.1581, 'grad_norm': 1.1246996911394547, 'learning_rate': 8.873495933094844e-06, 'epoch': 0.24} 24%|██▍ | 8274/34278 [9:13:53<24:39:11, 3.41s/it] 24%|██▍ | 8275/34278 [9:13:59<29:59:40, 4.15s/it] {'loss': 0.1543, 'grad_norm': 0.7076735575515133, 'learning_rate': 8.873197181536223e-06, 'epoch': 0.24} 24%|██▍ | 8275/34278 [9:13:59<29:59:40, 4.15s/it] 24%|██▍ | 8276/34278 [9:14:02<27:35:43, 3.82s/it] {'loss': 0.1755, 'grad_norm': 0.9029068568351564, 'learning_rate': 8.872898395398624e-06, 'epoch': 0.24} 24%|██▍ | 8276/34278 [9:14:02<27:35:43, 3.82s/it] 24%|██▍ | 8277/34278 
[9:14:05<26:39:35, 3.69s/it] {'loss': 0.1629, 'grad_norm': 0.7549673909441204, 'learning_rate': 8.87259957468471e-06, 'epoch': 0.24} 24%|██▍ | 8277/34278 [9:14:05<26:39:35, 3.69s/it] 24%|██▍ | 8278/34278 [9:14:08<25:14:17, 3.49s/it] {'loss': 0.1514, 'grad_norm': 0.7575082299161752, 'learning_rate': 8.872300719397152e-06, 'epoch': 0.24} 24%|██▍ | 8278/34278 [9:14:08<25:14:17, 3.49s/it] 24%|██▍ | 8279/34278 [9:14:14<29:16:17, 4.05s/it] {'loss': 0.1987, 'grad_norm': 0.8352503239311698, 'learning_rate': 8.872001829538619e-06, 'epoch': 0.24} 24%|██▍ | 8279/34278 [9:14:14<29:16:17, 4.05s/it] 24%|██▍ | 8280/34278 [9:14:17<28:10:33, 3.90s/it] {'loss': 0.167, 'grad_norm': 0.7337178680236806, 'learning_rate': 8.871702905111776e-06, 'epoch': 0.24} 24%|██▍ | 8280/34278 [9:14:17<28:10:33, 3.90s/it] 24%|██▍ | 8281/34278 [9:14:20<26:33:32, 3.68s/it] {'loss': 0.1666, 'grad_norm': 0.7666388414207655, 'learning_rate': 8.871403946119294e-06, 'epoch': 0.24} 24%|██▍ | 8281/34278 [9:14:20<26:33:32, 3.68s/it] 24%|██▍ | 8282/34278 [9:14:24<26:21:43, 3.65s/it] {'loss': 0.174, 'grad_norm': 0.8030265459227783, 'learning_rate': 8.871104952563843e-06, 'epoch': 0.24} 24%|██▍ | 8282/34278 [9:14:24<26:21:43, 3.65s/it] 24%|██▍ | 8283/34278 [9:14:29<29:00:57, 4.02s/it] {'loss': 0.1606, 'grad_norm': 0.906392696766294, 'learning_rate': 8.870805924448091e-06, 'epoch': 0.24} 24%|██▍ | 8283/34278 [9:14:29<29:00:57, 4.02s/it] 24%|██▍ | 8284/34278 [9:14:34<32:40:01, 4.52s/it] {'loss': 0.1489, 'grad_norm': 0.6662394594558032, 'learning_rate': 8.870506861774708e-06, 'epoch': 0.24} 24%|██▍ | 8284/34278 [9:14:34<32:40:01, 4.52s/it] 24%|██▍ | 8285/34278 [9:14:38<31:39:54, 4.39s/it] {'loss': 0.1632, 'grad_norm': 0.8848343698281591, 'learning_rate': 8.870207764546363e-06, 'epoch': 0.24} 24%|██▍ | 8285/34278 [9:14:39<31:39:54, 4.39s/it] 24%|██▍ | 8286/34278 [9:14:42<30:36:39, 4.24s/it] {'loss': 0.1734, 'grad_norm': 0.9298968774587074, 'learning_rate': 8.869908632765727e-06, 'epoch': 0.24} 24%|██▍ | 8286/34278 
[9:14:42<30:36:39, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 24%|██▍ | 8287/34278 [9:14:46<30:05:23, 4.17s/it] {'loss': 0.1832, 'grad_norm': 0.8343997794773508, 'learning_rate': 8.86960946643547e-06, 'epoch': 0.24} 24%|██▍ | 8287/34278 [9:14:46<30:05:23, 4.17s/it] 24%|██▍ | 8288/34278 [9:14:52<34:02:13, 4.71s/it] {'loss': 0.1592, 'grad_norm': 1.220524871783636, 'learning_rate': 8.869310265558264e-06, 'epoch': 0.24} 24%|██▍ | 8288/34278 [9:14:52<34:02:13, 4.71s/it] 24%|██▍ | 8289/34278 [9:14:56<30:55:47, 4.28s/it] {'loss': 0.1787, 'grad_norm': 0.8262776044773047, 'learning_rate': 8.869011030136781e-06, 'epoch': 0.24} 24%|██▍ | 8289/34278 [9:14:56<30:55:47, 4.28s/it] 24%|██▍ | 8290/34278 [9:14:59<28:50:02, 3.99s/it] {'loss': 0.1785, 'grad_norm': 1.1037398922421195, 'learning_rate': 8.868711760173688e-06, 'epoch': 0.24} 24%|██▍ | 8290/34278 [9:14:59<28:50:02, 3.99s/it] 24%|██▍ | 8291/34278 [9:15:02<26:15:26, 3.64s/it] {'loss': 0.1632, 'grad_norm': 0.7803727084389832, 'learning_rate': 8.868412455671663e-06, 'epoch': 0.24} 24%|██▍ | 8291/34278 [9:15:02<26:15:26, 3.64s/it] 24%|██▍ | 8292/34278 [9:15:06<27:45:54, 3.85s/it] {'loss': 0.184, 'grad_norm': 0.9759382256345664, 'learning_rate': 8.868113116633374e-06, 'epoch': 0.24} 24%|██▍ | 8292/34278 [9:15:06<27:45:54, 3.85s/it] 24%|██▍ | 8293/34278 [9:15:12<31:57:46, 4.43s/it] {'loss': 0.1361, 'grad_norm': 0.887876272034532, 'learning_rate': 8.867813743061493e-06, 'epoch': 0.24} 24%|██▍ | 8293/34278 [9:15:12<31:57:46, 4.43s/it] 24%|██▍ | 8294/34278 [9:15:16<30:17:22, 4.20s/it] {'loss': 0.1425, 'grad_norm': 0.807529058890851, 'learning_rate': 8.867514334958696e-06, 'epoch': 0.24} 24%|██▍ | 8294/34278 [9:15:16<30:17:22, 4.20s/it] 24%|██▍ | 8295/34278 [9:15:18<27:26:46, 3.80s/it] {'loss': 0.1573, 'grad_norm': 0.7849043958733483, 
'learning_rate': 8.867214892327653e-06, 'epoch': 0.24} 24%|██▍ | 8295/34278 [9:15:18<27:26:46, 3.80s/it] 24%|██▍ | 8296/34278 [9:15:22<27:02:21, 3.75s/it] {'loss': 0.1952, 'grad_norm': 0.9015404177738828, 'learning_rate': 8.86691541517104e-06, 'epoch': 0.24} 24%|██▍ | 8296/34278 [9:15:22<27:02:21, 3.75s/it] 24%|██▍ | 8297/34278 [9:15:25<25:43:05, 3.56s/it] {'loss': 0.152, 'grad_norm': 0.9219305059684217, 'learning_rate': 8.866615903491529e-06, 'epoch': 0.24} 24%|██▍ | 8297/34278 [9:15:25<25:43:05, 3.56s/it] 24%|██▍ | 8298/34278 [9:15:28<24:33:40, 3.40s/it] {'loss': 0.161, 'grad_norm': 0.8353303290023836, 'learning_rate': 8.866316357291793e-06, 'epoch': 0.24} 24%|██▍ | 8298/34278 [9:15:28<24:33:40, 3.40s/it] 24%|██▍ | 8299/34278 [9:15:31<23:48:29, 3.30s/it] {'loss': 0.1634, 'grad_norm': 0.8584782625626277, 'learning_rate': 8.866016776574509e-06, 'epoch': 0.24} 24%|██▍ | 8299/34278 [9:15:31<23:48:29, 3.30s/it] 24%|██▍ | 8300/34278 [9:15:37<29:55:18, 4.15s/it] {'loss': 0.1501, 'grad_norm': 0.8010399387149826, 'learning_rate': 8.865717161342348e-06, 'epoch': 0.24} 24%|██▍ | 8300/34278 [9:15:37<29:55:18, 4.15s/it] 24%|██▍ | 8301/34278 [9:15:42<31:49:03, 4.41s/it] {'loss': 0.1468, 'grad_norm': 0.8403769802589146, 'learning_rate': 8.86541751159799e-06, 'epoch': 0.24} 24%|██▍ | 8301/34278 [9:15:42<31:49:03, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 24%|██▍ | 8302/34278 [9:15:45<28:12:16, 3.91s/it] {'loss': 0.1571, 'grad_norm': 0.9557938237322129, 'learning_rate': 8.865117827344106e-06, 'epoch': 0.24} 24%|██▍ | 8302/34278 [9:15:45<28:12:16, 3.91s/it] 24%|██▍ | 8303/34278 [9:15:48<26:32:12, 3.68s/it] {'loss': 0.1533, 'grad_norm': 1.036574215317314, 'learning_rate': 8.864818108583372e-06, 'epoch': 0.24} 24%|██▍ | 8303/34278 [9:15:48<26:32:12, 3.68s/it] 24%|██▍ | 8304/34278 [9:15:53<28:07:15, 3.90s/it] {'loss': 0.1583, 'grad_norm': 1.0665467470789982, 'learning_rate': 8.864518355318465e-06, 'epoch': 0.24} 24%|██▍ | 8304/34278 [9:15:53<28:07:15, 3.90s/it] 24%|██▍ | 8305/34278 [9:15:56<25:58:45, 3.60s/it] {'loss': 0.16, 'grad_norm': 0.8899892947644182, 'learning_rate': 8.864218567552061e-06, 'epoch': 0.24} 24%|██▍ | 8305/34278 [9:15:56<25:58:45, 3.60s/it] 24%|██▍ | 8306/34278 [9:15:58<24:04:45, 3.34s/it] {'loss': 0.1801, 'grad_norm': 0.8293750946816629, 'learning_rate': 8.863918745286836e-06, 'epoch': 0.24} 24%|██▍ | 8306/34278 [9:15:58<24:04:45, 3.34s/it] 24%|██▍ | 8307/34278 [9:16:02<25:47:55, 3.58s/it] {'loss': 0.1587, 'grad_norm': 0.9905375946112777, 'learning_rate': 8.863618888525466e-06, 'epoch': 0.24} 24%|██▍ | 8307/34278 [9:16:02<25:47:55, 3.58s/it] 24%|██▍ | 8308/34278 [9:16:06<24:54:16, 3.45s/it] {'loss': 0.1318, 'grad_norm': 0.6796720878474997, 'learning_rate': 8.863318997270628e-06, 'epoch': 0.24} 24%|██▍ | 8308/34278 [9:16:06<24:54:16, 3.45s/it] 24%|██▍ | 8309/34278 [9:16:09<24:19:34, 3.37s/it] {'loss': 0.1567, 'grad_norm': 0.8107026806590917, 'learning_rate': 8.863019071525004e-06, 'epoch': 0.24} 24%|██▍ | 8309/34278 [9:16:09<24:19:34, 3.37s/it] 24%|██▍ | 8310/34278 [9:16:15<30:18:43, 4.20s/it] {'loss': 0.1704, 'grad_norm': 1.2963652228907871, 'learning_rate': 8.862719111291265e-06, 'epoch': 0.24} 24%|██▍ | 8310/34278 [9:16:15<30:18:43, 4.20s/it] 24%|██▍ | 8311/34278 [9:16:21<33:55:03, 4.70s/it] {'loss': 0.168, 'grad_norm': 0.7919308649984564, 'learning_rate': 
8.862419116572091e-06, 'epoch': 0.24} 24%|██▍ | 8311/34278 [9:16:21<33:55:03, 4.70s/it] 24%|██▍ | 8312/34278 [9:16:25<32:18:06, 4.48s/it] {'loss': 0.1485, 'grad_norm': 0.7014596902903338, 'learning_rate': 8.862119087370164e-06, 'epoch': 0.24} 24%|██▍ | 8312/34278 [9:16:25<32:18:06, 4.48s/it] 24%|██▍ | 8313/34278 [9:16:28<29:08:59, 4.04s/it] {'loss': 0.1802, 'grad_norm': 0.8910306467453531, 'learning_rate': 8.861819023688158e-06, 'epoch': 0.24} 24%|██▍ | 8313/34278 [9:16:28<29:08:59, 4.04s/it] 24%|██▍ | 8314/34278 [9:16:31<27:09:44, 3.77s/it] {'loss': 0.1311, 'grad_norm': 0.6840901742528571, 'learning_rate': 8.861518925528753e-06, 'epoch': 0.24} 24%|██▍ | 8314/34278 [9:16:31<27:09:44, 3.77s/it] 24%|██▍ | 8315/34278 [9:16:37<31:50:29, 4.42s/it] {'loss': 0.1485, 'grad_norm': 0.6571646828440548, 'learning_rate': 8.861218792894631e-06, 'epoch': 0.24} 24%|██▍ | 8315/34278 [9:16:37<31:50:29, 4.42s/it] 24%|██▍ | 8316/34278 [9:16:40<28:22:09, 3.93s/it] {'loss': 0.1413, 'grad_norm': 0.9296671671227267, 'learning_rate': 8.860918625788468e-06, 'epoch': 0.24} 24%|██▍ | 8316/34278 [9:16:40<28:22:09, 3.93s/it] 24%|██▍ | 8317/34278 [9:16:43<26:41:01, 3.70s/it] {'loss': 0.1631, 'grad_norm': 0.7680292881436243, 'learning_rate': 8.860618424212945e-06, 'epoch': 0.24} 24%|██▍ | 8317/34278 [9:16:43<26:41:01, 3.70s/it] 24%|██▍ | 8318/34278 [9:16:47<26:47:27, 3.72s/it] {'loss': 0.1505, 'grad_norm': 0.7553524615331095, 'learning_rate': 8.860318188170744e-06, 'epoch': 0.24} 24%|██▍ | 8318/34278 [9:16:47<26:47:27, 3.72s/it] 24%|██▍ | 8319/34278 [9:16:50<25:55:55, 3.60s/it] {'loss': 0.1543, 'grad_norm': 0.7744109809950512, 'learning_rate': 8.860017917664543e-06, 'epoch': 0.24} 24%|██▍ | 8319/34278 [9:16:50<25:55:55, 3.60s/it] 24%|██▍ | 8320/34278 [9:16:53<25:35:13, 3.55s/it] {'loss': 0.1649, 'grad_norm': 0.741122023611664, 'learning_rate': 8.859717612697023e-06, 'epoch': 0.24} 24%|██▍ | 8320/34278 [9:16:53<25:35:13, 3.55s/it] 24%|██▍ | 8321/34278 [9:16:56<23:58:30, 3.33s/it] {'loss': 0.146, 
'grad_norm': 0.8842410624110942, 'learning_rate': 8.859417273270866e-06, 'epoch': 0.24} 24%|██▍ | 8321/34278 [9:16:56<23:58:30, 3.33s/it] 24%|██▍ | 8322/34278 [9:17:01<26:24:25, 3.66s/it] {'loss': 0.1991, 'grad_norm': 0.8396582509731948, 'learning_rate': 8.859116899388752e-06, 'epoch': 0.24} 24%|██▍ | 8322/34278 [9:17:01<26:24:25, 3.66s/it] 24%|██▍ | 8323/34278 [9:17:04<25:27:41, 3.53s/it] {'loss': 0.1617, 'grad_norm': 0.8399060739518684, 'learning_rate': 8.858816491053364e-06, 'epoch': 0.24} 24%|██▍ | 8323/34278 [9:17:04<25:27:41, 3.53s/it] 24%|██▍ | 8324/34278 [9:17:08<26:05:44, 3.62s/it] {'loss': 0.1553, 'grad_norm': 0.932869570379997, 'learning_rate': 8.858516048267383e-06, 'epoch': 0.24} 24%|██▍ | 8324/34278 [9:17:08<26:05:44, 3.62s/it] 24%|██▍ | 8325/34278 [9:17:11<25:46:38, 3.58s/it] {'loss': 0.1521, 'grad_norm': 0.9219004107075606, 'learning_rate': 8.85821557103349e-06, 'epoch': 0.24} 24%|██▍ | 8325/34278 [9:17:11<25:46:38, 3.58s/it] 24%|██▍ | 8326/34278 [9:17:14<24:42:45, 3.43s/it] {'loss': 0.1229, 'grad_norm': 0.7569424998936463, 'learning_rate': 8.857915059354373e-06, 'epoch': 0.24} 24%|██▍ | 8326/34278 [9:17:14<24:42:45, 3.43s/it] 24%|██▍ | 8327/34278 [9:17:20<28:52:51, 4.01s/it] {'loss': 0.155, 'grad_norm': 0.9706743467715019, 'learning_rate': 8.85761451323271e-06, 'epoch': 0.24} 24%|██▍ | 8327/34278 [9:17:20<28:52:51, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 24%|██▍ | 8328/34278 [9:17:22<26:09:46, 3.63s/it] {'loss': 0.1528, 'grad_norm': 0.9094527668747412, 'learning_rate': 8.857313932671186e-06, 'epoch': 0.24} 24%|██▍ | 8328/34278 [9:17:22<26:09:46, 3.63s/it] 24%|██▍ | 8329/34278 [9:17:25<24:54:55, 3.46s/it] {'loss': 0.1449, 'grad_norm': 0.9351577674569229, 'learning_rate': 8.857013317672484e-06, 'epoch': 0.24} 24%|██▍ | 8329/34278 [9:17:25<24:54:55, 3.46s/it] 24%|██▍ | 8330/34278 [9:17:29<24:18:58, 3.37s/it] {'loss': 0.144, 'grad_norm': 0.7590114397369029, 'learning_rate': 8.856712668239287e-06, 'epoch': 0.24} 24%|██▍ | 8330/34278 [9:17:29<24:18:58, 3.37s/it] 24%|██▍ | 8331/34278 [9:17:33<25:59:13, 3.61s/it] {'loss': 0.1637, 'grad_norm': 0.8761829605476987, 'learning_rate': 8.85641198437428e-06, 'epoch': 0.24} 24%|██▍ | 8331/34278 [9:17:33<25:59:13, 3.61s/it] 24%|██▍ | 8332/34278 [9:17:38<28:40:02, 3.98s/it] {'loss': 0.1459, 'grad_norm': 0.9394578365099572, 'learning_rate': 8.856111266080149e-06, 'epoch': 0.24} 24%|██▍ | 8332/34278 [9:17:38<28:40:02, 3.98s/it] 24%|██▍ | 8333/34278 [9:17:43<32:01:09, 4.44s/it] {'loss': 0.1548, 'grad_norm': 0.7659881061961028, 'learning_rate': 8.855810513359574e-06, 'epoch': 0.24} 24%|██▍ | 8333/34278 [9:17:43<32:01:09, 4.44s/it] 24%|██▍ | 8334/34278 [9:17:49<34:10:54, 4.74s/it] {'loss': 0.1614, 'grad_norm': 0.8870597339994251, 'learning_rate': 8.855509726215247e-06, 'epoch': 0.24} 24%|██▍ | 8334/34278 [9:17:49<34:10:54, 4.74s/it] 24%|██▍ | 8335/34278 [9:17:54<36:36:10, 5.08s/it] {'loss': 0.1602, 'grad_norm': 0.9362402687968784, 'learning_rate': 8.855208904649848e-06, 'epoch': 0.24} 24%|██▍ | 8335/34278 [9:17:54<36:36:10, 5.08s/it] 24%|██▍ | 8336/34278 [9:17:58<32:25:00, 4.50s/it] {'loss': 0.1549, 'grad_norm': 0.920572264713605, 'learning_rate': 8.854908048666064e-06, 'epoch': 0.24} 24%|██▍ | 8336/34278 [9:17:58<32:25:00, 4.50s/it] 24%|██▍ | 8337/34278 [9:18:01<29:35:52, 4.11s/it] {'loss': 0.1584, 'grad_norm': 0.7838104690539485, 'learning_rate': 
8.85460715826658e-06, 'epoch': 0.24} 24%|██▍ | 8337/34278 [9:18:01<29:35:52, 4.11s/it] 24%|██▍ | 8338/34278 [9:18:06<32:26:23, 4.50s/it] {'loss': 0.1636, 'grad_norm': 0.8296277648008216, 'learning_rate': 8.854306233454085e-06, 'epoch': 0.24} 24%|██▍ | 8338/34278 [9:18:06<32:26:23, 4.50s/it] 24%|██▍ | 8339/34278 [9:18:09<28:52:34, 4.01s/it] {'loss': 0.1583, 'grad_norm': 0.9261275509176963, 'learning_rate': 8.854005274231264e-06, 'epoch': 0.24} 24%|██▍ | 8339/34278 [9:18:09<28:52:34, 4.01s/it] 24%|██▍ | 8340/34278 [9:18:12<26:36:42, 3.69s/it] {'loss': 0.151, 'grad_norm': 0.7284314849938635, 'learning_rate': 8.853704280600803e-06, 'epoch': 0.24} 24%|██▍ | 8340/34278 [9:18:12<26:36:42, 3.69s/it] 24%|██▍ | 8341/34278 [9:18:15<25:11:10, 3.50s/it] {'loss': 0.1499, 'grad_norm': 0.855592558560831, 'learning_rate': 8.853403252565391e-06, 'epoch': 0.24} 24%|██▍ | 8341/34278 [9:18:15<25:11:10, 3.50s/it] 24%|██▍ | 8342/34278 [9:18:18<23:33:38, 3.27s/it] {'loss': 0.1929, 'grad_norm': 1.0361960340942988, 'learning_rate': 8.853102190127714e-06, 'epoch': 0.24} 24%|██▍ | 8342/34278 [9:18:18<23:33:38, 3.27s/it] 24%|██▍ | 8343/34278 [9:18:23<27:32:33, 3.82s/it] {'loss': 0.1676, 'grad_norm': 0.8606824807586467, 'learning_rate': 8.852801093290461e-06, 'epoch': 0.24} 24%|██▍ | 8343/34278 [9:18:23<27:32:33, 3.82s/it] 24%|██▍ | 8344/34278 [9:18:26<26:27:45, 3.67s/it] {'loss': 0.1475, 'grad_norm': 0.802848091187224, 'learning_rate': 8.852499962056321e-06, 'epoch': 0.24} 24%|██▍ | 8344/34278 [9:18:26<26:27:45, 3.67s/it] 24%|██▍ | 8345/34278 [9:18:29<25:32:06, 3.54s/it] {'loss': 0.1586, 'grad_norm': 0.941050678114156, 'learning_rate': 8.852198796427978e-06, 'epoch': 0.24} 24%|██▍ | 8345/34278 [9:18:29<25:32:06, 3.54s/it] 24%|██▍ | 8346/34278 [9:18:32<23:58:21, 3.33s/it] {'loss': 0.1487, 'grad_norm': 0.8658456740404586, 'learning_rate': 8.851897596408125e-06, 'epoch': 0.24} 24%|██▍ | 8346/34278 [9:18:32<23:58:21, 3.33s/it] 24%|██▍ | 8347/34278 [9:18:36<24:43:27, 3.43s/it] {'loss': 0.1837, 
'grad_norm': 0.7557942088182621, 'learning_rate': 8.85159636199945e-06, 'epoch': 0.24} 24%|██▍ | 8347/34278 [9:18:36<24:43:27, 3.43s/it] 24%|██▍ | 8348/34278 [9:18:40<25:54:22, 3.60s/it] {'loss': 0.1666, 'grad_norm': 0.9696259629692668, 'learning_rate': 8.851295093204642e-06, 'epoch': 0.24} 24%|██▍ | 8348/34278 [9:18:40<25:54:22, 3.60s/it] 24%|██▍ | 8349/34278 [9:18:43<25:35:25, 3.55s/it] {'loss': 0.1538, 'grad_norm': 1.1376486002663677, 'learning_rate': 8.850993790026391e-06, 'epoch': 0.24} 24%|██▍ | 8349/34278 [9:18:43<25:35:25, 3.55s/it] 24%|██▍ | 8350/34278 [9:18:46<24:29:58, 3.40s/it] {'loss': 0.1669, 'grad_norm': 0.7397184643381797, 'learning_rate': 8.850692452467387e-06, 'epoch': 0.24} 24%|██▍ | 8350/34278 [9:18:46<24:29:58, 3.40s/it] 24%|██▍ | 8351/34278 [9:18:49<23:46:34, 3.30s/it] {'loss': 0.1752, 'grad_norm': 1.0140075529701373, 'learning_rate': 8.850391080530319e-06, 'epoch': 0.24} 24%|██▍ | 8351/34278 [9:18:49<23:46:34, 3.30s/it] 24%|██▍ | 8352/34278 [9:18:53<23:24:25, 3.25s/it] {'loss': 0.1754, 'grad_norm': 1.0352251266250718, 'learning_rate': 8.850089674217879e-06, 'epoch': 0.24} 24%|██▍ | 8352/34278 [9:18:53<23:24:25, 3.25s/it] 24%|██▍ | 8353/34278 [9:18:55<22:22:56, 3.11s/it] {'loss': 0.1357, 'grad_norm': 0.7166052530629659, 'learning_rate': 8.849788233532759e-06, 'epoch': 0.24} 24%|██▍ | 8353/34278 [9:18:55<22:22:56, 3.11s/it] 24%|██▍ | 8354/34278 [9:19:01<27:48:22, 3.86s/it] {'loss': 0.1563, 'grad_norm': 0.8722454672563336, 'learning_rate': 8.849486758477647e-06, 'epoch': 0.24} 24%|██▍ | 8354/34278 [9:19:01<27:48:22, 3.86s/it] 24%|██▍ | 8355/34278 [9:19:05<27:31:58, 3.82s/it] {'loss': 0.1512, 'grad_norm': 0.787635601188435, 'learning_rate': 8.849185249055236e-06, 'epoch': 0.24} 24%|██▍ | 8355/34278 [9:19:05<27:31:58, 3.82s/it] 24%|██▍ | 8356/34278 [9:19:08<26:17:01, 3.65s/it] {'loss': 0.1546, 'grad_norm': 0.7581295989698724, 'learning_rate': 8.848883705268219e-06, 'epoch': 0.24} 24%|██▍ | 8356/34278 [9:19:08<26:17:01, 3.65s/it] 24%|██▍ | 
8357/34278 [9:19:11<25:23:18, 3.53s/it] {'loss': 0.1618, 'grad_norm': 0.850561381936157, 'learning_rate': 8.848582127119285e-06, 'epoch': 0.24} 24%|██▍ | 8357/34278 [9:19:11<25:23:18, 3.53s/it]
24%|██▍ | 8358/34278 [9:19:17<31:13:54, 4.34s/it] {'loss': 0.1723, 'grad_norm': 0.7886592090366925, 'learning_rate': 8.84828051461113e-06, 'epoch': 0.24} 24%|██▍ | 8358/34278 [9:19:17<31:13:54, 4.34s/it]
24%|██▍ | 8359/34278 [9:19:23<33:49:12, 4.70s/it] {'loss': 0.1518, 'grad_norm': 0.8313972585497709, 'learning_rate': 8.847978867746446e-06, 'epoch': 0.24} 24%|██▍ | 8359/34278 [9:19:23<33:49:12, 4.70s/it]
24%|██▍ | 8360/34278 [9:19:28<34:13:59, 4.75s/it] {'loss': 0.1642, 'grad_norm': 0.8607898970493084, 'learning_rate': 8.847677186527924e-06, 'epoch': 0.24} 24%|██▍ | 8360/34278 [9:19:28<34:13:59, 4.75s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
24%|██▍ | 8361/34278 [9:19:32<33:56:39, 4.72s/it] {'loss': 0.1494, 'grad_norm': 0.7871688502088359, 'learning_rate': 8.84737547095826e-06, 'epoch': 0.24} 24%|██▍ | 8361/34278 [9:19:32<33:56:39, 4.72s/it]
24%|██▍ | 8362/34278 [9:19:36<30:39:42, 4.26s/it] {'loss': 0.1779, 'grad_norm': 0.8363080403057463, 'learning_rate': 8.847073721040145e-06, 'epoch': 0.24} 24%|██▍ | 8362/34278 [9:19:36<30:39:42, 4.26s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0e403e4680>
Failed to fetch sample 3291474. Exception: cannot identify image file <_io.BytesIO object at 0x7f0e403e4680>
24%|██▍ | 8363/34278 [9:19:39<29:01:06, 4.03s/it] {'loss': 0.1782, 'grad_norm': 0.8484381925285275, 'learning_rate': 8.846771936776275e-06, 'epoch': 0.24} 24%|██▍ | 8363/34278 [9:19:39<29:01:06, 4.03s/it]
24%|██▍ | 8364/34278 [9:19:44<30:50:18, 4.28s/it] {'loss': 0.1451, 'grad_norm': 1.0067802342523158, 'learning_rate': 8.846470118169343e-06, 'epoch': 0.24} 24%|██▍ | 8364/34278 [9:19:44<30:50:18, 4.28s/it]
24%|██▍ | 8365/34278 [9:19:47<28:29:52, 3.96s/it] {'loss': 0.148, 'grad_norm': 0.8955634481195281, 'learning_rate': 8.846168265222044e-06, 'epoch': 0.24} 24%|██▍ | 8365/34278 [9:19:47<28:29:52, 3.96s/it]
24%|██▍ | 8366/34278 [9:19:50<26:36:52, 3.70s/it] {'loss': 0.1753, 'grad_norm': 1.0444161973369948, 'learning_rate': 8.845866377937073e-06, 'epoch': 0.24} 24%|██▍ | 8366/34278 [9:19:50<26:36:52, 3.70s/it]
24%|██▍ | 8367/34278 [9:19:54<26:44:28, 3.72s/it] {'loss': 0.1451, 'grad_norm': 0.906266243851301, 'learning_rate': 8.845564456317124e-06, 'epoch': 0.24} 24%|██▍ | 8367/34278 [9:19:54<26:44:28, 3.72s/it]
24%|██▍ | 8368/34278 [9:19:57<25:57:46, 3.61s/it] {'loss': 0.1526, 'grad_norm': 1.0563299544092395,
'learning_rate': 8.845262500364896e-06, 'epoch': 0.24} 24%|██▍ | 8368/34278 [9:19:57<25:57:46, 3.61s/it] 24%|██▍ | 8369/34278 [9:20:01<24:58:29, 3.47s/it] {'loss': 0.1537, 'grad_norm': 0.8454233917176577, 'learning_rate': 8.84496051008308e-06, 'epoch': 0.24} 24%|██▍ | 8369/34278 [9:20:01<24:58:29, 3.47s/it] 24%|██▍ | 8370/34278 [9:20:04<24:04:16, 3.34s/it] {'loss': 0.1633, 'grad_norm': 1.0749430110370448, 'learning_rate': 8.844658485474376e-06, 'epoch': 0.24} 24%|██▍ | 8370/34278 [9:20:04<24:04:16, 3.34s/it] 24%|██▍ | 8371/34278 [9:20:07<23:40:39, 3.29s/it] {'loss': 0.1558, 'grad_norm': 1.0658178285046038, 'learning_rate': 8.844356426541476e-06, 'epoch': 0.24} 24%|██▍ | 8371/34278 [9:20:07<23:40:39, 3.29s/it] 24%|██▍ | 8372/34278 [9:20:10<23:43:29, 3.30s/it] {'loss': 0.1927, 'grad_norm': 0.8166177664164356, 'learning_rate': 8.844054333287081e-06, 'epoch': 0.24} 24%|██▍ | 8372/34278 [9:20:10<23:43:29, 3.30s/it] 24%|██▍ | 8373/34278 [9:20:13<23:08:07, 3.22s/it] {'loss': 0.1737, 'grad_norm': 1.1520638365934748, 'learning_rate': 8.84375220571389e-06, 'epoch': 0.24} 24%|██▍ | 8373/34278 [9:20:13<23:08:07, 3.22s/it] 24%|██▍ | 8374/34278 [9:20:17<24:46:14, 3.44s/it] {'loss': 0.1859, 'grad_norm': 0.9214527141771043, 'learning_rate': 8.843450043824593e-06, 'epoch': 0.24} 24%|██▍ | 8374/34278 [9:20:17<24:46:14, 3.44s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 24%|██▍ | 8375/34278 [9:20:21<25:14:33, 3.51s/it] {'loss': 0.1419, 'grad_norm': 0.7492326427282036, 'learning_rate': 8.843147847621893e-06, 'epoch': 0.24} 24%|██▍ | 8375/34278 [9:20:21<25:14:33, 3.51s/it] 24%|██▍ | 8376/34278 [9:20:24<24:25:50, 3.40s/it] {'loss': 0.1607, 'grad_norm': 0.7411821285949995, 'learning_rate': 8.842845617108485e-06, 'epoch': 0.24} 24%|██▍ | 8376/34278 [9:20:24<24:25:50, 3.40s/it] 24%|██▍ | 8377/34278 [9:20:27<23:33:29, 3.27s/it] {'loss': 0.1481, 'grad_norm': 0.7154000436837367, 'learning_rate': 8.842543352287069e-06, 'epoch': 0.24} 24%|██▍ | 8377/34278 [9:20:27<23:33:29, 3.27s/it] 24%|██▍ | 8378/34278 [9:20:30<23:13:01, 3.23s/it] {'loss': 0.1662, 'grad_norm': 0.9138082082050378, 'learning_rate': 8.842241053160345e-06, 'epoch': 0.24} 24%|██▍ | 8378/34278 [9:20:30<23:13:01, 3.23s/it] 24%|██▍ | 8379/34278 [9:20:33<23:18:34, 3.24s/it] {'loss': 0.1418, 'grad_norm': 0.964306699508199, 'learning_rate': 8.841938719731008e-06, 'epoch': 0.24} 24%|██▍ | 8379/34278 [9:20:33<23:18:34, 3.24s/it] 24%|██▍ | 8380/34278 [9:20:39<29:26:54, 4.09s/it] {'loss': 0.1754, 'grad_norm': 0.7727967753547276, 'learning_rate': 8.841636352001762e-06, 'epoch': 0.24} 24%|██▍ | 8380/34278 [9:20:39<29:26:54, 4.09s/it] 24%|██▍ | 8381/34278 [9:20:42<27:10:20, 3.78s/it] {'loss': 0.1444, 'grad_norm': 0.8789704182148401, 'learning_rate': 8.841333949975302e-06, 'epoch': 0.24} 24%|██▍ | 8381/34278 [9:20:42<27:10:20, 3.78s/it] 24%|██▍ | 8382/34278 [9:20:45<25:21:32, 3.53s/it] {'loss': 0.1458, 'grad_norm': 0.7411244681921415, 'learning_rate': 8.84103151365433e-06, 'epoch': 0.24} 24%|██▍ | 8382/34278 [9:20:45<25:21:32, 3.53s/it] 24%|██▍ | 8383/34278 [9:20:49<24:56:22, 3.47s/it] {'loss': 0.1606, 'grad_norm': 0.8487622912626777, 'learning_rate': 8.840729043041545e-06, 'epoch': 0.24} 24%|██▍ | 8383/34278 [9:20:49<24:56:22, 3.47s/it] 24%|██▍ | 8384/34278 [9:20:52<23:45:56, 3.30s/it] {'loss': 0.1865, 'grad_norm': 0.9033530842049082, 'learning_rate': 
8.840426538139647e-06, 'epoch': 0.24} 24%|██▍ | 8384/34278 [9:20:52<23:45:56, 3.30s/it] 24%|██▍ | 8385/34278 [9:20:55<22:55:17, 3.19s/it] {'loss': 0.1704, 'grad_norm': 0.8992042468042295, 'learning_rate': 8.84012399895134e-06, 'epoch': 0.24} 24%|██▍ | 8385/34278 [9:20:55<22:55:17, 3.19s/it] 24%|██▍ | 8386/34278 [9:20:59<24:38:46, 3.43s/it] {'loss': 0.1592, 'grad_norm': 0.7081280719070593, 'learning_rate': 8.83982142547932e-06, 'epoch': 0.24} 24%|██▍ | 8386/34278 [9:20:59<24:38:46, 3.43s/it] 24%|██▍ | 8387/34278 [9:21:01<23:28:27, 3.26s/it] {'loss': 0.1369, 'grad_norm': 1.0275869945524918, 'learning_rate': 8.839518817726293e-06, 'epoch': 0.24} 24%|██▍ | 8387/34278 [9:21:01<23:28:27, 3.26s/it] 24%|██▍ | 8388/34278 [9:21:05<23:29:07, 3.27s/it] {'loss': 0.1523, 'grad_norm': 0.7760137392426738, 'learning_rate': 8.839216175694957e-06, 'epoch': 0.24} 24%|██▍ | 8388/34278 [9:21:05<23:29:07, 3.27s/it] 24%|██▍ | 8389/34278 [9:21:08<22:40:54, 3.15s/it] {'loss': 0.1679, 'grad_norm': 0.8148903229715854, 'learning_rate': 8.838913499388018e-06, 'epoch': 0.24} 24%|██▍ | 8389/34278 [9:21:08<22:40:54, 3.15s/it] 24%|██▍ | 8390/34278 [9:21:11<22:27:04, 3.12s/it] {'loss': 0.1422, 'grad_norm': 0.9797201186920089, 'learning_rate': 8.838610788808173e-06, 'epoch': 0.24} 24%|██▍ | 8390/34278 [9:21:11<22:27:04, 3.12s/it] 24%|██▍ | 8391/34278 [9:21:14<22:03:14, 3.07s/it] {'loss': 0.1759, 'grad_norm': 1.0186898166413612, 'learning_rate': 8.838308043958128e-06, 'epoch': 0.24} 24%|██▍ | 8391/34278 [9:21:14<22:03:14, 3.07s/it] 24%|██▍ | 8392/34278 [9:21:16<21:39:18, 3.01s/it] {'loss': 0.1646, 'grad_norm': 1.027818751354689, 'learning_rate': 8.838005264840585e-06, 'epoch': 0.24} 24%|██▍ | 8392/34278 [9:21:16<21:39:18, 3.01s/it] 24%|██▍ | 8393/34278 [9:21:20<23:48:48, 3.31s/it] {'loss': 0.1519, 'grad_norm': 1.225498641750817, 'learning_rate': 8.837702451458248e-06, 'epoch': 0.24} 24%|██▍ | 8393/34278 [9:21:20<23:48:48, 3.31s/it] 24%|██▍ | 8394/34278 [9:21:23<23:07:34, 3.22s/it] {'loss': 0.1519, 
'grad_norm': 0.9275017187124127, 'learning_rate': 8.83739960381382e-06, 'epoch': 0.24} 24%|██▍ | 8394/34278 [9:21:23<23:07:34, 3.22s/it] 24%|██▍ | 8395/34278 [9:21:26<22:02:13, 3.07s/it] {'loss': 0.1524, 'grad_norm': 1.0713892531659432, 'learning_rate': 8.837096721910004e-06, 'epoch': 0.24} 24%|██▍ | 8395/34278 [9:21:26<22:02:13, 3.07s/it] 24%|██▍ | 8396/34278 [9:21:30<23:20:45, 3.25s/it] {'loss': 0.1648, 'grad_norm': 1.162071648834617, 'learning_rate': 8.836793805749504e-06, 'epoch': 0.24} 24%|██▍ | 8396/34278 [9:21:30<23:20:45, 3.25s/it] 24%|██▍ | 8397/34278 [9:21:33<23:42:14, 3.30s/it] {'loss': 0.1681, 'grad_norm': 0.7598401323146186, 'learning_rate': 8.836490855335026e-06, 'epoch': 0.24} 24%|██▍ | 8397/34278 [9:21:33<23:42:14, 3.30s/it] 24%|██▍ | 8398/34278 [9:21:36<23:18:09, 3.24s/it] {'loss': 0.1494, 'grad_norm': 0.7747140963682789, 'learning_rate': 8.83618787066927e-06, 'epoch': 0.24} 24%|██▍ | 8398/34278 [9:21:36<23:18:09, 3.24s/it] 25%|██▍ | 8399/34278 [9:21:39<22:45:13, 3.17s/it] {'loss': 0.1581, 'grad_norm': 0.8215032281001173, 'learning_rate': 8.835884851754948e-06, 'epoch': 0.25} 25%|██▍ | 8399/34278 [9:21:39<22:45:13, 3.17s/it] 25%|██▍ | 8400/34278 [9:21:45<29:05:47, 4.05s/it] {'loss': 0.155, 'grad_norm': 0.7220283583031136, 'learning_rate': 8.83558179859476e-06, 'epoch': 0.25} 25%|██▍ | 8400/34278 [9:21:45<29:05:47, 4.05s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 25%|██▍ | 8401/34278 [9:21:49<28:05:16, 3.91s/it] {'loss': 0.1304, 'grad_norm': 0.6641240567457919, 'learning_rate': 8.835278711191414e-06, 'epoch': 0.25} 25%|██▍ | 8401/34278 [9:21:49<28:05:16, 3.91s/it] 25%|██▍ | 8402/34278 [9:21:54<29:23:43, 4.09s/it] {'loss': 0.1456, 'grad_norm': 0.7538929804627872, 'learning_rate': 8.834975589547616e-06, 'epoch': 0.25} 25%|██▍ | 8402/34278 [9:21:54<29:23:43, 4.09s/it] 25%|██▍ | 8403/34278 [9:21:57<27:23:14, 3.81s/it] {'loss': 0.1367, 'grad_norm': 0.8066346810016369, 'learning_rate': 8.83467243366607e-06, 'epoch': 0.25} 25%|██▍ | 8403/34278 [9:21:57<27:23:14, 3.81s/it] 25%|██▍ | 8404/34278 [9:22:01<28:48:32, 4.01s/it] {'loss': 0.1419, 'grad_norm': 0.7457877340021333, 'learning_rate': 8.834369243549484e-06, 'epoch': 0.25} 25%|██▍ | 8404/34278 [9:22:01<28:48:32, 4.01s/it] 25%|██▍ | 8405/34278 [9:22:04<26:56:17, 3.75s/it] {'loss': 0.1444, 'grad_norm': 0.687374925843236, 'learning_rate': 8.834066019200566e-06, 'epoch': 0.25} 25%|██▍ | 8405/34278 [9:22:04<26:56:17, 3.75s/it] 25%|██▍ | 8406/34278 [9:22:07<24:46:51, 3.45s/it] {'loss': 0.1415, 'grad_norm': 0.9135412221902482, 'learning_rate': 8.83376276062202e-06, 'epoch': 0.25} 25%|██▍ | 8406/34278 [9:22:07<24:46:51, 3.45s/it] 25%|██▍ | 8407/34278 [9:22:10<24:42:06, 3.44s/it] {'loss': 0.1384, 'grad_norm': 0.8808876465180047, 'learning_rate': 8.833459467816557e-06, 'epoch': 0.25} 25%|██▍ | 8407/34278 [9:22:10<24:42:06, 3.44s/it] 25%|██▍ | 8408/34278 [9:22:17<30:24:51, 4.23s/it] {'loss': 0.1786, 'grad_norm': 0.7754192923566724, 'learning_rate': 8.833156140786883e-06, 'epoch': 0.25} 25%|██▍ | 8408/34278 [9:22:17<30:24:51, 4.23s/it] 25%|██▍ | 8409/34278 [9:22:20<28:02:05, 3.90s/it] {'loss': 0.18, 'grad_norm': 0.8651372266868499, 'learning_rate': 8.832852779535704e-06, 'epoch': 0.25} 25%|██▍ | 8409/34278 [9:22:20<28:02:05, 3.90s/it] 25%|██▍ | 8410/34278 [9:22:23<27:22:14, 3.81s/it] {'loss': 0.1696, 'grad_norm': 0.8809853052651581, 'learning_rate': 
8.832549384065732e-06, 'epoch': 0.25} 25%|██▍ | 8410/34278 [9:22:23<27:22:14, 3.81s/it] 25%|██▍ | 8411/34278 [9:22:26<25:31:26, 3.55s/it] {'loss': 0.1438, 'grad_norm': 0.8133090326836228, 'learning_rate': 8.832245954379674e-06, 'epoch': 0.25} 25%|██▍ | 8411/34278 [9:22:26<25:31:26, 3.55s/it] 25%|██▍ | 8412/34278 [9:22:29<24:13:16, 3.37s/it] {'loss': 0.1598, 'grad_norm': 0.9548720252177634, 'learning_rate': 8.831942490480238e-06, 'epoch': 0.25} 25%|██▍ | 8412/34278 [9:22:29<24:13:16, 3.37s/it] 25%|██▍ | 8413/34278 [9:22:32<23:36:07, 3.29s/it] {'loss': 0.164, 'grad_norm': 0.9654900151038924, 'learning_rate': 8.831638992370136e-06, 'epoch': 0.25} 25%|██▍ | 8413/34278 [9:22:32<23:36:07, 3.29s/it] 25%|██▍ | 8414/34278 [9:22:35<23:24:11, 3.26s/it] {'loss': 0.1499, 'grad_norm': 1.0402333658753493, 'learning_rate': 8.831335460052075e-06, 'epoch': 0.25} 25%|██▍ | 8414/34278 [9:22:35<23:24:11, 3.26s/it] 25%|██▍ | 8415/34278 [9:22:41<29:15:45, 4.07s/it] {'loss': 0.1668, 'grad_norm': 1.0229072464196518, 'learning_rate': 8.831031893528765e-06, 'epoch': 0.25} 25%|██▍ | 8415/34278 [9:22:41<29:15:45, 4.07s/it] 25%|██▍ | 8416/34278 [9:22:46<31:05:20, 4.33s/it] {'loss': 0.1671, 'grad_norm': 1.1475131038509692, 'learning_rate': 8.830728292802917e-06, 'epoch': 0.25} 25%|██▍ | 8416/34278 [9:22:46<31:05:20, 4.33s/it] 25%|██▍ | 8417/34278 [9:22:49<28:30:59, 3.97s/it] {'loss': 0.1521, 'grad_norm': 0.9151730725841057, 'learning_rate': 8.830424657877241e-06, 'epoch': 0.25} 25%|██▍ | 8417/34278 [9:22:49<28:30:59, 3.97s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12980 > 8192). 
Running this sequence through the model will result in indexing errors
25%|██▍ | 8418/34278 [9:22:55<30:48:00, 4.29s/it] {'loss': 0.1581, 'grad_norm': 0.7484544291478948, 'learning_rate': 8.830120988754448e-06, 'epoch': 0.25} 25%|██▍ | 8418/34278 [9:22:55<30:48:00, 4.29s/it]
25%|██▍ | 8419/34278 [9:22:58<28:25:53, 3.96s/it] {'loss': 0.1717, 'grad_norm': 0.835116926310824, 'learning_rate': 8.82981728543725e-06, 'epoch': 0.25} 25%|██▍ | 8419/34278 [9:22:58<28:25:53, 3.96s/it]
25%|██▍ | 8420/34278 [9:23:01<25:58:17, 3.62s/it] {'loss': 0.186, 'grad_norm': 2.437720013564385, 'learning_rate': 8.829513547928357e-06, 'epoch': 0.25} 25%|██▍ | 8420/34278 [9:23:01<25:58:17, 3.62s/it]
25%|██▍ | 8421/34278 [9:23:04<25:11:00, 3.51s/it] {'loss': 0.1402, 'grad_norm': 0.9567922993993235, 'learning_rate': 8.829209776230481e-06, 'epoch': 0.25} 25%|██▍ | 8421/34278 [9:23:04<25:11:00, 3.51s/it]
25%|██▍ | 8422/34278 [9:23:07<23:55:39, 3.33s/it] {'loss': 0.1562, 'grad_norm': 0.6785539352419789, 'learning_rate': 8.828905970346333e-06, 'epoch': 0.25} 25%|██▍ | 8422/34278 [9:23:07<23:55:39, 3.33s/it]
25%|██▍ | 8423/34278 [9:23:10<24:47:18, 3.45s/it] {'loss': 0.1565, 'grad_norm': 1.0096327952414297, 'learning_rate': 8.82860213027863e-06, 'epoch': 0.25} 25%|██▍ | 8423/34278 [9:23:10<24:47:18, 3.45s/it]
25%|██▍ | 8424/34278 [9:23:13<23:55:53, 3.33s/it] {'loss': 0.1736, 'grad_norm': 0.9654423138572058, 'learning_rate': 8.828298256030078e-06, 'epoch': 0.25} 25%|██▍ | 8424/34278 [9:23:14<23:55:53, 3.33s/it]
25%|██▍ | 8425/34278 [9:23:19<28:21:44, 3.95s/it] {'loss': 0.1433, 'grad_norm': 0.869416426857183, 'learning_rate': 8.827994347603395e-06, 'epoch': 0.25} 25%|██▍ | 8425/34278 [9:23:19<28:21:44, 3.95s/it]
25%|██▍ | 8426/34278 [9:23:25<33:05:45, 4.61s/it] {'loss': 0.1526, 'grad_norm': 0.654907225293584, 'learning_rate': 8.82769040500129e-06, 'epoch': 0.25} 25%|██▍ | 8426/34278 [9:23:25<33:05:45, 4.61s/it]
25%|██▍ | 8427/34278 [9:23:28<29:54:42, 4.17s/it] {'loss': 0.1591, 'grad_norm':
1.0479087753183833, 'learning_rate': 8.827386428226481e-06, 'epoch': 0.25} 25%|██▍ | 8427/34278 [9:23:28<29:54:42, 4.17s/it] 25%|██▍ | 8428/34278 [9:23:31<27:24:06, 3.82s/it] {'loss': 0.1478, 'grad_norm': 0.86967230802673, 'learning_rate': 8.827082417281679e-06, 'epoch': 0.25} 25%|██▍ | 8428/34278 [9:23:31<27:24:06, 3.82s/it] 25%|██▍ | 8429/34278 [9:23:35<26:37:27, 3.71s/it] {'loss': 0.1431, 'grad_norm': 0.7681872088185056, 'learning_rate': 8.826778372169599e-06, 'epoch': 0.25} 25%|██▍ | 8429/34278 [9:23:35<26:37:27, 3.71s/it] 25%|██▍ | 8430/34278 [9:23:38<26:50:10, 3.74s/it] {'loss': 0.1547, 'grad_norm': 0.9663204434008695, 'learning_rate': 8.826474292892954e-06, 'epoch': 0.25} 25%|██▍ | 8430/34278 [9:23:39<26:50:10, 3.74s/it] 25%|██▍ | 8431/34278 [9:23:44<31:10:39, 4.34s/it] {'loss': 0.1549, 'grad_norm': 0.8851904730109313, 'learning_rate': 8.82617017945446e-06, 'epoch': 0.25} 25%|██▍ | 8431/34278 [9:23:44<31:10:39, 4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 25%|██▍ | 8432/34278 [9:23:47<27:46:14, 3.87s/it] {'loss': 0.1739, 'grad_norm': 0.7937661421652855, 'learning_rate': 8.825866031856833e-06, 'epoch': 0.25} 25%|██▍ | 8432/34278 [9:23:47<27:46:14, 3.87s/it] 25%|██▍ | 8433/34278 [9:23:51<27:44:46, 3.86s/it] {'loss': 0.1457, 'grad_norm': 0.7341655041980866, 'learning_rate': 8.825561850102788e-06, 'epoch': 0.25} 25%|██▍ | 8433/34278 [9:23:51<27:44:46, 3.86s/it] 25%|██▍ | 8434/34278 [9:23:56<30:59:43, 4.32s/it] {'loss': 0.1962, 'grad_norm': 1.0127929178350177, 'learning_rate': 8.82525763419504e-06, 'epoch': 0.25} 25%|██▍ | 8434/34278 [9:23:56<30:59:43, 4.32s/it] 25%|██▍ | 8435/34278 [9:24:00<30:03:03, 4.19s/it] {'loss': 0.1924, 'grad_norm': 0.9877385349978934, 'learning_rate': 8.824953384136305e-06, 'epoch': 0.25} 25%|██▍ | 8435/34278 [9:24:00<30:03:03, 4.19s/it] 25%|██▍ | 8436/34278 [9:24:04<30:17:10, 4.22s/it] {'loss': 0.1591, 'grad_norm': 0.8526552191336515, 'learning_rate': 8.824649099929297e-06, 'epoch': 0.25} 25%|██▍ | 8436/34278 [9:24:04<30:17:10, 4.22s/it] 25%|██▍ | 8437/34278 [9:24:07<27:27:34, 3.83s/it] {'loss': 0.1717, 'grad_norm': 0.8486905635216188, 'learning_rate': 8.824344781576736e-06, 'epoch': 0.25} 25%|██▍ | 8437/34278 [9:24:07<27:27:34, 3.83s/it] 25%|██▍ | 8438/34278 [9:24:11<27:27:57, 3.83s/it] {'loss': 0.1603, 'grad_norm': 0.9146396786883396, 'learning_rate': 8.82404042908134e-06, 'epoch': 0.25} 25%|██▍ | 8438/34278 [9:24:11<27:27:57, 3.83s/it] 25%|██▍ | 8439/34278 [9:24:14<25:42:06, 3.58s/it] {'loss': 0.1741, 'grad_norm': 0.9111314156629063, 'learning_rate': 8.823736042445822e-06, 'epoch': 0.25} 25%|██▍ | 8439/34278 [9:24:14<25:42:06, 3.58s/it] 25%|██▍ | 8440/34278 [9:24:18<26:00:00, 3.62s/it] {'loss': 0.1656, 'grad_norm': 0.7504101195768871, 'learning_rate': 8.8234316216729e-06, 'epoch': 0.25} 25%|██▍ | 8440/34278 [9:24:18<26:00:00, 
3.62s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 25%|██▍ | 8441/34278 [9:24:24<31:54:54, 4.45s/it] {'loss': 0.1782, 'grad_norm': 0.8826741149882912, 'learning_rate': 8.823127166765296e-06, 'epoch': 0.25} 25%|██▍ | 8441/34278 [9:24:24<31:54:54, 4.45s/it] 25%|██▍ | 8442/34278 [9:24:28<31:14:11, 4.35s/it] {'loss': 0.1427, 'grad_norm': 1.079721249237678, 'learning_rate': 8.822822677725725e-06, 'epoch': 0.25} 25%|██▍ | 8442/34278 [9:24:28<31:14:11, 4.35s/it] 25%|██▍ | 8443/34278 [9:24:32<29:48:20, 4.15s/it] {'loss': 0.1471, 'grad_norm': 0.8250626002008481, 'learning_rate': 8.822518154556904e-06, 'epoch': 0.25} 25%|██▍ | 8443/34278 [9:24:32<29:48:20, 4.15s/it] 25%|██▍ | 8444/34278 [9:24:37<32:22:52, 4.51s/it] {'loss': 0.1564, 'grad_norm': 2.595163956846552, 'learning_rate': 8.822213597261553e-06, 'epoch': 0.25} 25%|██▍ | 8444/34278 [9:24:37<32:22:52, 4.51s/it] 25%|██▍ | 8445/34278 [9:24:40<28:37:37, 3.99s/it] {'loss': 0.1628, 'grad_norm': 1.028256115729576, 'learning_rate': 8.821909005842393e-06, 'epoch': 0.25} 25%|██▍ | 8445/34278 [9:24:40<28:37:37, 3.99s/it] 25%|██▍ | 8446/34278 [9:24:43<26:04:59, 3.63s/it] {'loss': 0.1517, 'grad_norm': 0.8793545771654575, 'learning_rate': 8.821604380302141e-06, 'epoch': 0.25} 25%|██▍ | 8446/34278 [9:24:43<26:04:59, 3.63s/it] 25%|██▍ | 8447/34278 [9:24:47<26:57:17, 3.76s/it] {'loss': 0.1579, 'grad_norm': 0.6827770932932468, 'learning_rate': 8.82129972064352e-06, 'epoch': 0.25} 25%|██▍ | 8447/34278 [9:24:47<26:57:17, 3.76s/it] 25%|██▍ | 8448/34278 [9:24:51<26:37:23, 3.71s/it] {'loss': 0.1639, 'grad_norm': 0.8235675301541698, 'learning_rate': 8.820995026869244e-06, 'epoch': 0.25} 25%|██▍ | 8448/34278 [9:24:51<26:37:23, 3.71s/it] 25%|██▍ | 8449/34278 [9:24:54<25:30:47, 3.56s/it] {'loss': 0.1788, 'grad_norm': 0.9152598512939685, 'learning_rate': 
8.820690298982037e-06, 'epoch': 0.25} 25%|██▍ | 8449/34278 [9:24:54<25:30:47, 3.56s/it] 25%|██▍ | 8450/34278 [9:24:57<24:07:48, 3.36s/it] {'loss': 0.1414, 'grad_norm': 0.840929740765244, 'learning_rate': 8.82038553698462e-06, 'epoch': 0.25} 25%|██▍ | 8450/34278 [9:24:57<24:07:48, 3.36s/it] 25%|██▍ | 8451/34278 [9:25:00<24:20:29, 3.39s/it] {'loss': 0.1521, 'grad_norm': 0.7697768271526885, 'learning_rate': 8.820080740879713e-06, 'epoch': 0.25} 25%|██▍ | 8451/34278 [9:25:00<24:20:29, 3.39s/it] 25%|██▍ | 8452/34278 [9:25:04<24:33:32, 3.42s/it] {'loss': 0.1517, 'grad_norm': 3.034581115019779, 'learning_rate': 8.819775910670036e-06, 'epoch': 0.25} 25%|██▍ | 8452/34278 [9:25:04<24:33:32, 3.42s/it] 25%|██▍ | 8453/34278 [9:25:07<23:27:59, 3.27s/it] {'loss': 0.1552, 'grad_norm': 0.8956114868858843, 'learning_rate': 8.819471046358313e-06, 'epoch': 0.25} 25%|██▍ | 8453/34278 [9:25:07<23:27:59, 3.27s/it] 25%|██▍ | 8454/34278 [9:25:10<23:24:18, 3.26s/it] {'loss': 0.1694, 'grad_norm': 1.075827573742953, 'learning_rate': 8.819166147947263e-06, 'epoch': 0.25} 25%|██▍ | 8454/34278 [9:25:10<23:24:18, 3.26s/it] 25%|██▍ | 8455/34278 [9:25:14<24:29:51, 3.42s/it] {'loss': 0.1531, 'grad_norm': 0.7881868658598621, 'learning_rate': 8.81886121543961e-06, 'epoch': 0.25} 25%|██▍ | 8455/34278 [9:25:14<24:29:51, 3.42s/it] 25%|██▍ | 8456/34278 [9:25:18<27:35:28, 3.85s/it] {'loss': 0.1895, 'grad_norm': 0.729149444478052, 'learning_rate': 8.818556248838075e-06, 'epoch': 0.25} 25%|██▍ | 8456/34278 [9:25:18<27:35:28, 3.85s/it] 25%|██▍ | 8457/34278 [9:25:22<26:17:15, 3.67s/it] {'loss': 0.1638, 'grad_norm': 0.8361171197534403, 'learning_rate': 8.818251248145382e-06, 'epoch': 0.25} 25%|██▍ | 8457/34278 [9:25:22<26:17:15, 3.67s/it] 25%|██▍ | 8458/34278 [9:25:25<26:31:43, 3.70s/it] {'loss': 0.17, 'grad_norm': 0.8258725227718289, 'learning_rate': 8.817946213364254e-06, 'epoch': 0.25} 25%|██▍ | 8458/34278 [9:25:25<26:31:43, 3.70s/it] 25%|██▍ | 8459/34278 [9:25:29<25:16:53, 3.53s/it] {'loss': 0.1714, 
'grad_norm': 1.0109646348966774, 'learning_rate': 8.817641144497413e-06, 'epoch': 0.25} 25%|██▍ | 8459/34278 [9:25:29<25:16:53, 3.53s/it] 25%|██▍ | 8460/34278 [9:25:33<26:39:43, 3.72s/it] {'loss': 0.1489, 'grad_norm': 0.8278978580142614, 'learning_rate': 8.817336041547582e-06, 'epoch': 0.25} 25%|██▍ | 8460/34278 [9:25:33<26:39:43, 3.72s/it] 25%|██▍ | 8461/34278 [9:25:36<26:18:35, 3.67s/it] {'loss': 0.1738, 'grad_norm': 0.8980349193116316, 'learning_rate': 8.817030904517488e-06, 'epoch': 0.25} 25%|██▍ | 8461/34278 [9:25:36<26:18:35, 3.67s/it] 25%|██▍ | 8462/34278 [9:25:41<27:51:05, 3.88s/it] {'loss': 0.1641, 'grad_norm': 1.0060006588202914, 'learning_rate': 8.816725733409852e-06, 'epoch': 0.25} 25%|██▍ | 8462/34278 [9:25:41<27:51:05, 3.88s/it] 25%|██▍ | 8463/34278 [9:25:44<26:17:39, 3.67s/it] {'loss': 0.1559, 'grad_norm': 1.1469604360476755, 'learning_rate': 8.8164205282274e-06, 'epoch': 0.25} 25%|██▍ | 8463/34278 [9:25:44<26:17:39, 3.67s/it] 25%|██▍ | 8464/34278 [9:25:49<29:25:27, 4.10s/it] {'loss': 0.1691, 'grad_norm': 0.9779373329827378, 'learning_rate': 8.816115288972857e-06, 'epoch': 0.25} 25%|██▍ | 8464/34278 [9:25:49<29:25:27, 4.10s/it] 25%|██▍ | 8465/34278 [9:25:52<26:47:47, 3.74s/it] {'loss': 0.1711, 'grad_norm': 1.5967700458984158, 'learning_rate': 8.815810015648947e-06, 'epoch': 0.25} 25%|██▍ | 8465/34278 [9:25:52<26:47:47, 3.74s/it] 25%|██▍ | 8466/34278 [9:25:56<27:25:03, 3.82s/it] {'loss': 0.1709, 'grad_norm': 0.9709621836414194, 'learning_rate': 8.815504708258398e-06, 'epoch': 0.25} 25%|██▍ | 8466/34278 [9:25:56<27:25:03, 3.82s/it] 25%|██▍ | 8467/34278 [9:25:59<25:56:12, 3.62s/it] {'loss': 0.1869, 'grad_norm': 0.9165203118739379, 'learning_rate': 8.815199366803932e-06, 'epoch': 0.25} 25%|██▍ | 8467/34278 [9:25:59<25:56:12, 3.62s/it] 25%|██▍ | 8468/34278 [9:26:05<31:42:48, 4.42s/it] {'loss': 0.1504, 'grad_norm': 0.8066612396376657, 'learning_rate': 8.814893991288277e-06, 'epoch': 0.25} 25%|██▍ | 8468/34278 [9:26:05<31:42:48, 4.42s/it] 25%|██▍ | 
8469/34278 [9:26:09<29:17:48, 4.09s/it] {'loss': 0.179, 'grad_norm': 0.7944070270789971, 'learning_rate': 8.814588581714158e-06, 'epoch': 0.25} 25%|██▍ | 8469/34278 [9:26:09<29:17:48, 4.09s/it] 25%|██▍ | 8470/34278 [9:26:12<26:54:33, 3.75s/it] {'loss': 0.1585, 'grad_norm': 0.8136541392797165, 'learning_rate': 8.814283138084305e-06, 'epoch': 0.25} 25%|██▍ | 8470/34278 [9:26:12<26:54:33, 3.75s/it] 25%|██▍ | 8471/34278 [9:26:15<25:29:57, 3.56s/it] {'loss': 0.1446, 'grad_norm': 0.8970606040478348, 'learning_rate': 8.813977660401442e-06, 'epoch': 0.25} 25%|██▍ | 8471/34278 [9:26:15<25:29:57, 3.56s/it] 25%|██▍ | 8472/34278 [9:26:18<25:27:13, 3.55s/it] {'loss': 0.1317, 'grad_norm': 0.6884550119953908, 'learning_rate': 8.813672148668296e-06, 'epoch': 0.25} 25%|██▍ | 8472/34278 [9:26:18<25:27:13, 3.55s/it] 25%|██▍ | 8473/34278 [9:26:21<24:42:03, 3.45s/it] {'loss': 0.1705, 'grad_norm': 0.792880746072291, 'learning_rate': 8.813366602887596e-06, 'epoch': 0.25} 25%|██▍ | 8473/34278 [9:26:21<24:42:03, 3.45s/it] 25%|██▍ | 8474/34278 [9:26:25<24:16:45, 3.39s/it] {'loss': 0.1696, 'grad_norm': 0.8942780596943223, 'learning_rate': 8.81306102306207e-06, 'epoch': 0.25} 25%|██▍ | 8474/34278 [9:26:25<24:16:45, 3.39s/it] 25%|██▍ | 8475/34278 [9:26:27<22:47:14, 3.18s/it] {'loss': 0.1902, 'grad_norm': 0.9207073793628455, 'learning_rate': 8.812755409194444e-06, 'epoch': 0.25} 25%|██▍ | 8475/34278 [9:26:27<22:47:14, 3.18s/it] 25%|██▍ | 8476/34278 [9:26:31<24:10:16, 3.37s/it] {'loss': 0.1565, 'grad_norm': 0.7593633274691847, 'learning_rate': 8.81244976128745e-06, 'epoch': 0.25} 25%|██▍ | 8476/34278 [9:26:31<24:10:16, 3.37s/it] 25%|██▍ | 8477/34278 [9:26:37<29:10:43, 4.07s/it] {'loss': 0.1913, 'grad_norm': 0.9856712072108497, 'learning_rate': 8.812144079343814e-06, 'epoch': 0.25} 25%|██▍ | 8477/34278 [9:26:37<29:10:43, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 25%|██▍ | 8478/34278 [9:26:40<28:09:04, 3.93s/it] {'loss': 0.1612, 'grad_norm': 0.9154848599457953, 'learning_rate': 8.811838363366263e-06, 'epoch': 0.25} 25%|██▍ | 8478/34278 [9:26:40<28:09:04, 3.93s/it] 25%|██▍ | 8479/34278 [9:26:46<31:37:12, 4.41s/it] {'loss': 0.1938, 'grad_norm': 0.7981364221196089, 'learning_rate': 8.811532613357532e-06, 'epoch': 0.25} 25%|██▍ | 8479/34278 [9:26:46<31:37:12, 4.41s/it] 25%|██▍ | 8480/34278 [9:26:49<28:56:24, 4.04s/it] {'loss': 0.1933, 'grad_norm': 0.8215717143394589, 'learning_rate': 8.811226829320347e-06, 'epoch': 0.25} 25%|██▍ | 8480/34278 [9:26:49<28:56:24, 4.04s/it] 25%|██▍ | 8481/34278 [9:26:53<27:32:38, 3.84s/it] {'loss': 0.16, 'grad_norm': 0.8889634002739583, 'learning_rate': 8.810921011257439e-06, 'epoch': 0.25} 25%|██▍ | 8481/34278 [9:26:53<27:32:38, 3.84s/it] 25%|██▍ | 8482/34278 [9:26:56<25:51:16, 3.61s/it] {'loss': 0.1527, 'grad_norm': 0.7136621367510201, 'learning_rate': 8.810615159171539e-06, 'epoch': 0.25} 25%|██▍ | 8482/34278 [9:26:56<25:51:16, 3.61s/it] 25%|██▍ | 8483/34278 [9:26:59<25:16:05, 3.53s/it] {'loss': 0.1462, 'grad_norm': 0.768581865624633, 'learning_rate': 8.810309273065374e-06, 'epoch': 0.25} 25%|██▍ | 8483/34278 [9:26:59<25:16:05, 3.53s/it] 25%|██▍ | 8484/34278 [9:27:02<24:45:56, 3.46s/it] {'loss': 0.1498, 'grad_norm': 0.7718778497787705, 'learning_rate': 8.810003352941679e-06, 'epoch': 0.25} 25%|██▍ | 8484/34278 [9:27:02<24:45:56, 3.46s/it] 25%|██▍ | 8485/34278 [9:27:06<24:21:36, 3.40s/it] {'loss': 0.1766, 'grad_norm': 0.8938330577938709, 'learning_rate': 8.809697398803183e-06, 'epoch': 0.25} 25%|██▍ | 8485/34278 [9:27:06<24:21:36, 3.40s/it] 25%|██▍ | 8486/34278 [9:27:10<25:40:40, 3.58s/it] {'loss': 0.1661, 'grad_norm': 0.7270221163387063, 'learning_rate': 8.809391410652618e-06, 'epoch': 0.25} 25%|██▍ | 8486/34278 [9:27:10<25:40:40, 3.58s/it] 25%|██▍ | 8487/34278 [9:27:16<31:03:52, 4.34s/it] {'loss': 0.167, 'grad_norm': 0.8740024671496396, 
'learning_rate': 8.809085388492716e-06, 'epoch': 0.25} 25%|██▍ | 8487/34278 [9:27:16<31:03:52, 4.34s/it] 25%|██▍ | 8488/34278 [9:27:18<27:34:33, 3.85s/it] {'loss': 0.1993, 'grad_norm': 1.0996909584243546, 'learning_rate': 8.808779332326208e-06, 'epoch': 0.25} 25%|██▍ | 8488/34278 [9:27:18<27:34:33, 3.85s/it] 25%|██▍ | 8489/34278 [9:27:22<26:08:36, 3.65s/it] {'loss': 0.1428, 'grad_norm': 0.8376569971712228, 'learning_rate': 8.808473242155828e-06, 'epoch': 0.25} 25%|██▍ | 8489/34278 [9:27:22<26:08:36, 3.65s/it] 25%|██▍ | 8490/34278 [9:27:26<26:58:52, 3.77s/it] {'loss': 0.1546, 'grad_norm': 0.8162306345311063, 'learning_rate': 8.808167117984308e-06, 'epoch': 0.25} 25%|██▍ | 8490/34278 [9:27:26<26:58:52, 3.77s/it] 25%|██▍ | 8491/34278 [9:27:29<25:22:12, 3.54s/it] {'loss': 0.2119, 'grad_norm': 1.145727284508113, 'learning_rate': 8.807860959814381e-06, 'epoch': 0.25} 25%|██▍ | 8491/34278 [9:27:29<25:22:12, 3.54s/it] 25%|██▍ | 8492/34278 [9:27:35<31:00:49, 4.33s/it] {'loss': 0.1759, 'grad_norm': 0.8791723337086851, 'learning_rate': 8.807554767648782e-06, 'epoch': 0.25} 25%|██▍ | 8492/34278 [9:27:35<31:00:49, 4.33s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
[training progress, steps 8493-8513 of 34278, epoch 0.25: loss 0.138-0.183, grad_norm 0.61-1.53, learning_rate 8.8072e-06 -> 8.8011e-06, ~3.1-4.2 s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbf269975b0>
Failed to fetch sample 2654739. Exception: cannot identify image file <_io.BytesIO object at 0x7fbf269975b0>
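The traceback shows a single corrupt image aborting a sample fetch inside `__getitem__`. A common mitigation is to catch decode failures and fall back to another sample so one bad record does not kill a training step. The sketch below is a minimal, hypothetical version: the `RetryingDataset` class, its raw-bytes record layout, and the next-index fallback policy are assumptions for illustration, not the project's actual `aguvis/dataset.py` code.

```python
# Hedged sketch of a decode-tolerant dataset; names and record layout are
# assumptions, not the project's real implementation.
import io

from PIL import Image, UnidentifiedImageError


def pil_loader(img_bytes: bytes) -> Image.Image:
    """Decode raw bytes into an RGB PIL image; raises UnidentifiedImageError on corrupt data."""
    buff = io.BytesIO(img_bytes)
    img = Image.open(buff)
    return img.convert("RGB")


class RetryingDataset:
    """Index-based fetch that falls back to the next sample when decoding fails."""

    def __init__(self, records, max_retries: int = 10):
        self.records = records  # list of raw image byte strings (assumption)
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return pil_loader(self.records[i])
            except (UnidentifiedImageError, OSError) as e:
                # Mirrors the log's "Failed to fetch sample ..." message,
                # but recovers instead of propagating the exception.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.records)  # deterministic fallback
        raise RuntimeError("too many consecutive corrupt samples")
```

Skipping forward deterministically keeps the epoch length stable; a real loader might instead resample randomly or drop the record from its index.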
[training progress, steps 8514-8664 of 34278, epoch 0.25: loss 0.127-0.200, grad_norm 0.62-1.31, learning_rate 8.8008e-06 -> 8.7544e-06, ~3.0-5.6 s/it; the checkpoint.py:87 UserWarning "None of the inputs have requires_grad=True. Gradients will be None" recurred roughly every 15-20 steps throughout this window]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 25%|██▌ | 8665/34278 [9:38:19<26:52:10, 3.78s/it] {'loss': 0.1613, 'grad_norm': 0.8324400735924496, 'learning_rate': 8.754074265583391e-06, 'epoch': 0.25} 25%|██▌ | 8665/34278 [9:38:19<26:52:10, 3.78s/it] 25%|██▌ | 8666/34278 [9:38:22<24:40:15, 3.47s/it] {'loss': 0.1457, 'grad_norm': 0.8231664541683732, 'learning_rate': 8.753762200055323e-06, 'epoch': 0.25} 25%|██▌ | 8666/34278 [9:38:22<24:40:15, 3.47s/it] 25%|██▌ | 8667/34278 [9:38:26<25:57:05, 3.65s/it] {'loss': 0.148, 'grad_norm': 0.7615114736289824, 'learning_rate': 8.75345010101456e-06, 'epoch': 0.25} 25%|██▌ | 8667/34278 [9:38:26<25:57:05, 3.65s/it] 25%|██▌ | 8668/34278 [9:38:30<26:14:59, 3.69s/it] {'loss': 0.1614, 'grad_norm': 0.6634927283572258, 'learning_rate': 8.753137968463891e-06, 'epoch': 0.25} 25%|██▌ | 8668/34278 [9:38:30<26:14:59, 3.69s/it] 25%|██▌ | 8669/34278 [9:38:34<27:01:30, 3.80s/it] {'loss': 0.1622, 'grad_norm': 0.7274235145822803, 'learning_rate': 8.752825802406104e-06, 'epoch': 0.25} 25%|██▌ | 8669/34278 [9:38:34<27:01:30, 3.80s/it] 25%|██▌ | 8670/34278 [9:38:38<28:05:06, 3.95s/it] {'loss': 0.1695, 'grad_norm': 0.8576516402955692, 'learning_rate': 8.752513602843984e-06, 'epoch': 0.25} 25%|██▌ | 8670/34278 [9:38:38<28:05:06, 3.95s/it] 25%|██▌ | 8671/34278 [9:38:41<25:41:25, 3.61s/it] {'loss': 0.156, 'grad_norm': 3.6622315857967838, 'learning_rate': 8.752201369780317e-06, 'epoch': 0.25} 25%|██▌ | 8671/34278 [9:38:41<25:41:25, 3.61s/it] 25%|██▌ | 8672/34278 [9:38:44<24:37:28, 3.46s/it] {'loss': 0.1438, 'grad_norm': 1.0653715826093315, 'learning_rate': 8.751889103217892e-06, 'epoch': 0.25} 25%|██▌ | 8672/34278 [9:38:44<24:37:28, 3.46s/it] 25%|██▌ | 8673/34278 [9:38:49<26:58:14, 3.79s/it] {'loss': 0.1403, 'grad_norm': 0.9536320888241338, 'learning_rate': 8.751576803159495e-06, 'epoch': 0.25} 25%|██▌ | 8673/34278 [9:38:49<26:58:14, 3.79s/it] 25%|██▌ | 8674/34278 [9:38:53<27:18:20, 3.84s/it] {'loss': 0.1532, 'grad_norm': 0.9389850085117988, 'learning_rate': 
8.751264469607919e-06, 'epoch': 0.25} 25%|██▌ | 8674/34278 [9:38:53<27:18:20, 3.84s/it] 25%|██▌ | 8675/34278 [9:38:56<26:07:29, 3.67s/it] {'loss': 0.1612, 'grad_norm': 0.9264477765314943, 'learning_rate': 8.750952102565949e-06, 'epoch': 0.25} 25%|██▌ | 8675/34278 [9:38:56<26:07:29, 3.67s/it] 25%|██▌ | 8676/34278 [9:38:59<24:55:08, 3.50s/it] {'loss': 0.1576, 'grad_norm': 0.7896236278290961, 'learning_rate': 8.750639702036372e-06, 'epoch': 0.25} 25%|██▌ | 8676/34278 [9:38:59<24:55:08, 3.50s/it] 25%|██▌ | 8677/34278 [9:39:03<24:47:22, 3.49s/it] {'loss': 0.1816, 'grad_norm': 1.089660135807044, 'learning_rate': 8.75032726802198e-06, 'epoch': 0.25} 25%|██▌ | 8677/34278 [9:39:03<24:47:22, 3.49s/it] 25%|██▌ | 8678/34278 [9:39:08<28:42:03, 4.04s/it] {'loss': 0.167, 'grad_norm': 0.7963842850599411, 'learning_rate': 8.75001480052556e-06, 'epoch': 0.25} 25%|██▌ | 8678/34278 [9:39:08<28:42:03, 4.04s/it] 25%|██▌ | 8679/34278 [9:39:14<32:41:17, 4.60s/it] {'loss': 0.159, 'grad_norm': 0.8047992319509558, 'learning_rate': 8.749702299549908e-06, 'epoch': 0.25} 25%|██▌ | 8679/34278 [9:39:14<32:41:17, 4.60s/it] 25%|██▌ | 8680/34278 [9:39:17<30:09:54, 4.24s/it] {'loss': 0.1534, 'grad_norm': 0.8361255619347895, 'learning_rate': 8.749389765097805e-06, 'epoch': 0.25} 25%|██▌ | 8680/34278 [9:39:17<30:09:54, 4.24s/it] 25%|██▌ | 8681/34278 [9:39:20<27:31:19, 3.87s/it] {'loss': 0.1673, 'grad_norm': 0.8121742773997878, 'learning_rate': 8.749077197172044e-06, 'epoch': 0.25} 25%|██▌ | 8681/34278 [9:39:20<27:31:19, 3.87s/it] 25%|██▌ | 8682/34278 [9:39:23<26:02:24, 3.66s/it] {'loss': 0.174, 'grad_norm': 0.7805005133041115, 'learning_rate': 8.74876459577542e-06, 'epoch': 0.25} 25%|██▌ | 8682/34278 [9:39:23<26:02:24, 3.66s/it] 25%|██▌ | 8683/34278 [9:39:27<26:29:27, 3.73s/it] {'loss': 0.1759, 'grad_norm': 0.7274078253360706, 'learning_rate': 8.748451960910718e-06, 'epoch': 0.25} 25%|██▌ | 8683/34278 [9:39:27<26:29:27, 3.73s/it] 25%|██▌ | 8684/34278 [9:39:31<25:36:27, 3.60s/it] {'loss': 0.2104, 
'grad_norm': 1.0429367089407238, 'learning_rate': 8.748139292580733e-06, 'epoch': 0.25} 25%|██▌ | 8684/34278 [9:39:31<25:36:27, 3.60s/it] 25%|██▌ | 8685/34278 [9:39:34<24:17:04, 3.42s/it] {'loss': 0.1652, 'grad_norm': 0.8506984585250397, 'learning_rate': 8.747826590788256e-06, 'epoch': 0.25} 25%|██▌ | 8685/34278 [9:39:34<24:17:04, 3.42s/it] 25%|██▌ | 8686/34278 [9:39:38<25:36:52, 3.60s/it] {'loss': 0.1337, 'grad_norm': 1.1421366549445577, 'learning_rate': 8.747513855536077e-06, 'epoch': 0.25} 25%|██▌ | 8686/34278 [9:39:38<25:36:52, 3.60s/it] 25%|██▌ | 8687/34278 [9:39:41<25:04:36, 3.53s/it] {'loss': 0.1407, 'grad_norm': 1.1113881068498233, 'learning_rate': 8.747201086826989e-06, 'epoch': 0.25} 25%|██▌ | 8687/34278 [9:39:41<25:04:36, 3.53s/it] 25%|██▌ | 8688/34278 [9:39:45<26:18:38, 3.70s/it] {'loss': 0.174, 'grad_norm': 0.9908063541335966, 'learning_rate': 8.746888284663784e-06, 'epoch': 0.25} 25%|██▌ | 8688/34278 [9:39:45<26:18:38, 3.70s/it] 25%|██▌ | 8689/34278 [9:39:48<25:19:13, 3.56s/it] {'loss': 0.1406, 'grad_norm': 0.7712802509983608, 'learning_rate': 8.746575449049255e-06, 'epoch': 0.25} 25%|██▌ | 8689/34278 [9:39:48<25:19:13, 3.56s/it] 25%|██▌ | 8690/34278 [9:39:54<30:25:24, 4.28s/it] {'loss': 0.1513, 'grad_norm': 0.7431100563887824, 'learning_rate': 8.746262579986194e-06, 'epoch': 0.25} 25%|██▌ | 8690/34278 [9:39:54<30:25:24, 4.28s/it] 25%|██▌ | 8691/34278 [9:40:00<34:02:49, 4.79s/it] {'loss': 0.1537, 'grad_norm': 0.7644407130429836, 'learning_rate': 8.745949677477396e-06, 'epoch': 0.25} 25%|██▌ | 8691/34278 [9:40:00<34:02:49, 4.79s/it] 25%|██▌ | 8692/34278 [9:40:03<30:22:52, 4.27s/it] {'loss': 0.1978, 'grad_norm': 0.9870168221703469, 'learning_rate': 8.745636741525654e-06, 'epoch': 0.25} 25%|██▌ | 8692/34278 [9:40:03<30:22:52, 4.27s/it] 25%|██▌ | 8693/34278 [9:40:06<27:48:00, 3.91s/it] {'loss': 0.1409, 'grad_norm': 0.8982865652069405, 'learning_rate': 8.745323772133761e-06, 'epoch': 0.25} 25%|██▌ | 8693/34278 [9:40:06<27:48:00, 3.91s/it] 25%|██▌ | 
8694/34278 [9:40:09<25:44:25, 3.62s/it] {'loss': 0.1413, 'grad_norm': 0.9042036941833452, 'learning_rate': 8.745010769304509e-06, 'epoch': 0.25} 25%|██▌ | 8694/34278 [9:40:09<25:44:25, 3.62s/it] 25%|██▌ | 8695/34278 [9:40:13<24:44:14, 3.48s/it] {'loss': 0.177, 'grad_norm': 0.9012370192724426, 'learning_rate': 8.744697733040696e-06, 'epoch': 0.25} 25%|██▌ | 8695/34278 [9:40:13<24:44:14, 3.48s/it]
Token indices sequence length is longer than the specified maximum sequence length for this model (11494 > 8192). Running this sequence through the model will result in indexing errors
25%|██▌ | 8696/34278 [9:40:18<29:46:53, 4.19s/it] {'loss': 0.1522, 'grad_norm': 0.7438421399011913, 'learning_rate': 8.744384663345118e-06, 'epoch': 0.25} 25%|██▌ | 8696/34278 [9:40:18<29:46:53, 4.19s/it] 25%|██▌ | 8697/34278 [9:40:22<28:32:47, 4.02s/it] {'loss': 0.1901, 'grad_norm': 0.843966325383359, 'learning_rate': 8.744071560220567e-06, 'epoch': 0.25} 25%|██▌ | 8697/34278 [9:40:22<28:32:47, 4.02s/it] 25%|██▌ | 8698/34278 [9:40:26<28:02:34, 3.95s/it] {'loss': 0.1535, 'grad_norm': 0.7354098162433835, 'learning_rate': 8.743758423669837e-06, 'epoch': 0.25} 25%|██▌ | 8698/34278 [9:40:26<28:02:34, 3.95s/it] 25%|██▌ | 8699/34278 [9:40:29<26:47:49, 3.77s/it] {'loss': 0.1744, 'grad_norm': 1.2117025504985062, 'learning_rate': 8.743445253695725e-06, 'epoch': 0.25} 25%|██▌ | 8699/34278 [9:40:29<26:47:49, 3.77s/it] 25%|██▌ | 8700/34278 [9:40:34<29:27:54, 4.15s/it] {'loss': 0.1653, 'grad_norm': 0.9362173993868674, 'learning_rate': 8.743132050301031e-06, 'epoch': 0.25} 25%|██▌ | 8700/34278 [9:40:34<29:27:54, 4.15s/it] 25%|██▌ | 8701/34278 [9:40:37<27:30:15, 3.87s/it] {'loss': 0.15, 'grad_norm': 0.8310313706059644, 'learning_rate': 8.742818813488545e-06, 'epoch': 0.25} 25%|██▌ | 8701/34278 [9:40:37<27:30:15, 3.87s/it] 25%|██▌ | 8702/34278 [9:40:40<25:14:27, 3.55s/it] {'loss': 0.1394, 'grad_norm': 0.8139816868476868, 'learning_rate': 8.742505543261066e-06, 'epoch': 0.25} 25%|██▌ | 8702/34278
[9:40:40<25:14:27, 3.55s/it] 25%|██▌ | 8703/34278 [9:40:43<23:31:40, 3.31s/it] {'loss': 0.159, 'grad_norm': 0.7197049985343916, 'learning_rate': 8.742192239621391e-06, 'epoch': 0.25} 25%|██▌ | 8703/34278 [9:40:43<23:31:40, 3.31s/it] 25%|██▌ | 8704/34278 [9:40:46<22:48:55, 3.21s/it] {'loss': 0.1603, 'grad_norm': 0.7073837646585768, 'learning_rate': 8.741878902572318e-06, 'epoch': 0.25} 25%|██▌ | 8704/34278 [9:40:46<22:48:55, 3.21s/it] 25%|██▌ | 8705/34278 [9:40:50<24:43:48, 3.48s/it] {'loss': 0.1687, 'grad_norm': 0.7041714988132413, 'learning_rate': 8.741565532116643e-06, 'epoch': 0.25} 25%|██▌ | 8705/34278 [9:40:50<24:43:48, 3.48s/it] 25%|██▌ | 8706/34278 [9:40:53<24:00:27, 3.38s/it] {'loss': 0.1433, 'grad_norm': 0.7522600859004025, 'learning_rate': 8.741252128257164e-06, 'epoch': 0.25} 25%|██▌ | 8706/34278 [9:40:53<24:00:27, 3.38s/it] 25%|██▌ | 8707/34278 [9:40:56<23:32:18, 3.31s/it] {'loss': 0.1498, 'grad_norm': 0.7407409471862808, 'learning_rate': 8.740938690996678e-06, 'epoch': 0.25} 25%|██▌ | 8707/34278 [9:40:56<23:32:18, 3.31s/it] 25%|██▌ | 8708/34278 [9:41:00<23:52:35, 3.36s/it] {'loss': 0.1435, 'grad_norm': 0.8667489606012565, 'learning_rate': 8.740625220337987e-06, 'epoch': 0.25} 25%|██▌ | 8708/34278 [9:41:00<23:52:35, 3.36s/it] 25%|██▌ | 8709/34278 [9:41:03<23:59:00, 3.38s/it] {'loss': 0.1362, 'grad_norm': 0.6547311529300278, 'learning_rate': 8.740311716283884e-06, 'epoch': 0.25} 25%|██▌ | 8709/34278 [9:41:03<23:59:00, 3.38s/it] 25%|██▌ | 8710/34278 [9:41:09<29:56:52, 4.22s/it] {'loss': 0.1499, 'grad_norm': 0.8371369414539442, 'learning_rate': 8.739998178837172e-06, 'epoch': 0.25} 25%|██▌ | 8710/34278 [9:41:09<29:56:52, 4.22s/it] 25%|██▌ | 8711/34278 [9:41:15<33:24:33, 4.70s/it] {'loss': 0.1419, 'grad_norm': 0.8356802734640116, 'learning_rate': 8.739684608000651e-06, 'epoch': 0.25} 25%|██▌ | 8711/34278 [9:41:15<33:24:33, 4.70s/it] 25%|██▌ | 8712/34278 [9:41:19<30:25:52, 4.29s/it] {'loss': 0.1506, 'grad_norm': 0.7991211929141547, 'learning_rate': 
8.739371003777117e-06, 'epoch': 0.25} 25%|██▌ | 8712/34278 [9:41:19<30:25:52, 4.29s/it] 25%|██▌ | 8713/34278 [9:41:22<27:54:37, 3.93s/it] {'loss': 0.1538, 'grad_norm': 0.7427853021170802, 'learning_rate': 8.73905736616937e-06, 'epoch': 0.25} 25%|██▌ | 8713/34278 [9:41:22<27:54:37, 3.93s/it] 25%|██▌ | 8714/34278 [9:41:25<27:22:58, 3.86s/it] {'loss': 0.1589, 'grad_norm': 0.8113598601341298, 'learning_rate': 8.738743695180214e-06, 'epoch': 0.25} 25%|██▌ | 8714/34278 [9:41:25<27:22:58, 3.86s/it] 25%|██▌ | 8715/34278 [9:41:30<29:55:46, 4.21s/it] {'loss': 0.1637, 'grad_norm': 0.8569630973636471, 'learning_rate': 8.738429990812445e-06, 'epoch': 0.25} 25%|██▌ | 8715/34278 [9:41:30<29:55:46, 4.21s/it] 25%|██▌ | 8716/34278 [9:41:34<28:03:22, 3.95s/it] {'loss': 0.16, 'grad_norm': 0.9100324245795195, 'learning_rate': 8.738116253068866e-06, 'epoch': 0.25} 25%|██▌ | 8716/34278 [9:41:34<28:03:22, 3.95s/it] 25%|██▌ | 8717/34278 [9:41:40<32:26:59, 4.57s/it] {'loss': 0.1536, 'grad_norm': 0.8638207685398084, 'learning_rate': 8.737802481952277e-06, 'epoch': 0.25} 25%|██▌ | 8717/34278 [9:41:40<32:26:59, 4.57s/it] 25%|██▌ | 8718/34278 [9:41:45<34:43:28, 4.89s/it] {'loss': 0.1394, 'grad_norm': 0.845447471564934, 'learning_rate': 8.73748867746548e-06, 'epoch': 0.25} 25%|██▌ | 8718/34278 [9:41:45<34:43:28, 4.89s/it] 25%|██▌ | 8719/34278 [9:41:49<31:10:53, 4.39s/it] {'loss': 0.1692, 'grad_norm': 0.873617854106875, 'learning_rate': 8.737174839611277e-06, 'epoch': 0.25} 25%|██▌ | 8719/34278 [9:41:49<31:10:53, 4.39s/it] 25%|██▌ | 8720/34278 [9:41:52<28:15:25, 3.98s/it] {'loss': 0.1448, 'grad_norm': 0.9311736290655294, 'learning_rate': 8.736860968392469e-06, 'epoch': 0.25} 25%|██▌ | 8720/34278 [9:41:52<28:15:25, 3.98s/it] 25%|██▌ | 8721/34278 [9:41:55<26:36:26, 3.75s/it] {'loss': 0.1659, 'grad_norm': 1.1191446734340702, 'learning_rate': 8.736547063811858e-06, 'epoch': 0.25} 25%|██▌ | 8721/34278 [9:41:55<26:36:26, 3.75s/it] 25%|██▌ | 8722/34278 [9:41:58<25:49:59, 3.64s/it] {'loss': 0.1582, 
'grad_norm': 1.1348476700458443, 'learning_rate': 8.736233125872247e-06, 'epoch': 0.25} 25%|██▌ | 8722/34278 [9:41:58<25:49:59, 3.64s/it] 25%|██▌ | 8723/34278 [9:42:01<23:52:33, 3.36s/it] {'loss': 0.1621, 'grad_norm': 1.1460856678321898, 'learning_rate': 8.735919154576438e-06, 'epoch': 0.25} 25%|██▌ | 8723/34278 [9:42:01<23:52:33, 3.36s/it] 25%|██▌ | 8724/34278 [9:42:04<23:48:41, 3.35s/it] {'loss': 0.1658, 'grad_norm': 1.1746904149793125, 'learning_rate': 8.735605149927236e-06, 'epoch': 0.25} 25%|██▌ | 8724/34278 [9:42:04<23:48:41, 3.35s/it] 25%|██▌ | 8725/34278 [9:42:07<22:52:46, 3.22s/it] {'loss': 0.1516, 'grad_norm': 1.329046099020268, 'learning_rate': 8.735291111927441e-06, 'epoch': 0.25} 25%|██▌ | 8725/34278 [9:42:07<22:52:46, 3.22s/it] 25%|██▌ | 8726/34278 [9:42:10<22:40:16, 3.19s/it] {'loss': 0.1882, 'grad_norm': 1.1640390457061447, 'learning_rate': 8.73497704057986e-06, 'epoch': 0.25} 25%|██▌ | 8726/34278 [9:42:10<22:40:16, 3.19s/it] 25%|██▌ | 8727/34278 [9:42:13<22:14:49, 3.13s/it] {'loss': 0.1573, 'grad_norm': 1.0280124658537055, 'learning_rate': 8.734662935887295e-06, 'epoch': 0.25} 25%|██▌ | 8727/34278 [9:42:13<22:14:49, 3.13s/it] 25%|██▌ | 8728/34278 [9:42:16<21:51:42, 3.08s/it] {'loss': 0.17, 'grad_norm': 0.9639943602244389, 'learning_rate': 8.73434879785255e-06, 'epoch': 0.25} 25%|██▌ | 8728/34278 [9:42:16<21:51:42, 3.08s/it] 25%|██▌ | 8729/34278 [9:42:20<22:18:26, 3.14s/it] {'loss': 0.1587, 'grad_norm': 0.825427524115082, 'learning_rate': 8.734034626478432e-06, 'epoch': 0.25} 25%|██▌ | 8729/34278 [9:42:20<22:18:26, 3.14s/it] 25%|██▌ | 8730/34278 [9:42:23<23:02:15, 3.25s/it] {'loss': 0.1663, 'grad_norm': 0.9130862061586684, 'learning_rate': 8.733720421767744e-06, 'epoch': 0.25} 25%|██▌ | 8730/34278 [9:42:23<23:02:15, 3.25s/it] 25%|██▌ | 8731/34278 [9:42:29<28:40:08, 4.04s/it] {'loss': 0.1486, 'grad_norm': 0.8498724775905401, 'learning_rate': 8.733406183723293e-06, 'epoch': 0.25} 25%|██▌ | 8731/34278 [9:42:29<28:40:08, 4.04s/it] 25%|██▌ | 8732/34278 
[9:42:32<25:57:06, 3.66s/it] {'loss': 0.1389, 'grad_norm': 1.0775770106421836, 'learning_rate': 8.73309191234788e-06, 'epoch': 0.25} 25%|██▌ | 8732/34278 [9:42:32<25:57:06, 3.66s/it] 25%|██▌ | 8733/34278 [9:42:36<27:33:47, 3.88s/it] {'loss': 0.1382, 'grad_norm': 0.7640333168428327, 'learning_rate': 8.732777607644314e-06, 'epoch': 0.25} 25%|██▌ | 8733/34278 [9:42:36<27:33:47, 3.88s/it] 25%|██▌ | 8734/34278 [9:42:40<26:31:24, 3.74s/it] {'loss': 0.1836, 'grad_norm': 0.8952833004604634, 'learning_rate': 8.7324632696154e-06, 'epoch': 0.25} 25%|██▌ | 8734/34278 [9:42:40<26:31:24, 3.74s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2a7bffd080>
Failed to fetch sample 2854381.
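The traceback above ends in `PIL.UnidentifiedImageError`: `Image.open` could not decode the bytes fetched for sample 2854381, and the run logs "Failed to fetch sample ..." and keeps going. A minimal sketch of that skip-and-resample pattern (the helper name `fetch_with_fallback` and the random-retry policy are illustrative assumptions, not the repository's actual code):

```python
import random

from PIL import UnidentifiedImageError


def fetch_with_fallback(dataset, i, max_retries=10, rng=random):
    """Try to load sample i; on a corrupt image, log the failure and
    fall back to a randomly chosen other sample (hypothetical policy)."""
    for _ in range(max_retries):
        try:
            return dataset[i]
        except UnidentifiedImageError as e:
            # Mirrors the "Failed to fetch sample ... Exception: ..." lines in the log.
            print(f"Failed to fetch sample {i}. Exception: {e}")
            i = rng.randrange(len(dataset))
    raise RuntimeError(f"gave up after {max_retries} corrupt samples")
```

With a guard like this, one truncated or non-image record in object storage only costs a resample instead of aborting a multi-day run.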
Exception: cannot identify image file <_io.BytesIO object at 0x7f2a7bffd080> 25%|██▌ | 8735/34278 [9:42:43<25:45:50, 3.63s/it] {'loss': 0.1611, 'grad_norm': 0.849439438598427, 'learning_rate': 8.732148898263946e-06, 'epoch': 0.25} 25%|██▌ | 8735/34278 [9:42:43<25:45:50, 3.63s/it] 25%|██▌ | 8736/34278 [9:42:47<26:48:52, 3.78s/it] {'loss': 0.196, 'grad_norm': 1.0782415390561326, 'learning_rate': 8.73183449359276e-06, 'epoch': 0.25} 25%|██▌ | 8736/34278 [9:42:47<26:48:52, 3.78s/it] 25%|██▌ | 8737/34278 [9:42:51<26:12:26, 3.69s/it] {'loss': 0.2017, 'grad_norm': 1.0005187587540993, 'learning_rate': 8.731520055604642e-06, 'epoch': 0.25} 25%|██▌ | 8737/34278 [9:42:51<26:12:26, 3.69s/it] 25%|██▌ | 8738/34278 [9:42:54<25:01:20, 3.53s/it] {'loss': 0.1589, 'grad_norm': 0.7896312246550041, 'learning_rate': 8.731205584302406e-06, 'epoch': 0.25} 25%|██▌ | 8738/34278 [9:42:54<25:01:20, 3.53s/it] 25%|██▌ | 8739/34278 [9:42:57<24:19:40, 3.43s/it] {'loss': 0.1515, 'grad_norm': 0.944690480624181, 'learning_rate': 8.730891079688856e-06, 'epoch': 0.25} 25%|██▌ | 8739/34278 [9:42:57<24:19:40, 3.43s/it] 25%|██▌ | 8740/34278 [9:43:03<30:28:24, 4.30s/it] {'loss': 0.1718, 'grad_norm': 1.028373765804265, 'learning_rate': 8.730576541766803e-06, 'epoch': 0.25} 25%|██▌ | 8740/34278 [9:43:03<30:28:24, 4.30s/it] 26%|██▌ | 8741/34278 [9:43:06<27:14:34, 3.84s/it] {'loss': 0.1606, 'grad_norm': 0.9030042930995393, 'learning_rate': 8.730261970539052e-06, 'epoch': 0.26} 26%|██▌ | 8741/34278 [9:43:06<27:14:34, 3.84s/it] 26%|██▌ | 8742/34278 [9:43:09<25:37:02, 3.61s/it] {'loss': 0.1404, 'grad_norm': 0.8539913410412551, 'learning_rate': 8.729947366008413e-06, 'epoch': 0.26} 26%|██▌ | 8742/34278 [9:43:09<25:37:02, 3.61s/it] 26%|██▌ | 8743/34278 [9:43:14<27:43:42, 3.91s/it] {'loss': 0.1377, 'grad_norm': 0.8820475575043571, 'learning_rate': 8.729632728177695e-06, 'epoch': 0.26} 26%|██▌ | 8743/34278 [9:43:14<27:43:42, 3.91s/it] 26%|██▌ | 8744/34278 [9:43:17<26:56:25, 3.80s/it] {'loss': 0.1599, 'grad_norm': 
0.7477643030610125, 'learning_rate': 8.729318057049704e-06, 'epoch': 0.26} 26%|██▌ | 8744/34278 [9:43:17<26:56:25, 3.80s/it] 26%|██▌ | 8745/34278 [9:43:20<25:03:07, 3.53s/it] {'loss': 0.1551, 'grad_norm': 0.7396971754806884, 'learning_rate': 8.729003352627255e-06, 'epoch': 0.26} 26%|██▌ | 8745/34278 [9:43:20<25:03:07, 3.53s/it] 26%|██▌ | 8746/34278 [9:43:24<25:59:43, 3.67s/it] {'loss': 0.1372, 'grad_norm': 1.1071026612148385, 'learning_rate': 8.728688614913152e-06, 'epoch': 0.26} 26%|██▌ | 8746/34278 [9:43:24<25:59:43, 3.67s/it] 26%|██▌ | 8747/34278 [9:43:28<26:05:53, 3.68s/it] {'loss': 0.1521, 'grad_norm': 0.9409646097818506, 'learning_rate': 8.728373843910207e-06, 'epoch': 0.26} 26%|██▌ | 8747/34278 [9:43:28<26:05:53, 3.68s/it] 26%|██▌ | 8748/34278 [9:43:34<31:17:23, 4.41s/it] {'loss': 0.1638, 'grad_norm': 0.9150115926042979, 'learning_rate': 8.728059039621231e-06, 'epoch': 0.26} 26%|██▌ | 8748/34278 [9:43:34<31:17:23, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 26%|██▌ | 8749/34278 [9:43:40<35:23:47, 4.99s/it] {'loss': 0.1609, 'grad_norm': 0.9937005959563189, 'learning_rate': 8.727744202049035e-06, 'epoch': 0.26} 26%|██▌ | 8749/34278 [9:43:40<35:23:47, 4.99s/it] 26%|██▌ | 8750/34278 [9:43:46<37:16:05, 5.26s/it] {'loss': 0.1703, 'grad_norm': 0.8958513950848999, 'learning_rate': 8.727429331196426e-06, 'epoch': 0.26} 26%|██▌ | 8750/34278 [9:43:46<37:16:05, 5.26s/it] 26%|██▌ | 8751/34278 [9:43:52<39:33:56, 5.58s/it] {'loss': 0.1424, 'grad_norm': 0.8248501867962532, 'learning_rate': 8.72711442706622e-06, 'epoch': 0.26} 26%|██▌ | 8751/34278 [9:43:52<39:33:56, 5.58s/it] 26%|██▌ | 8752/34278 [9:43:59<40:34:36, 5.72s/it] {'loss': 0.1659, 'grad_norm': 0.9571136844958456, 'learning_rate': 8.726799489661225e-06, 'epoch': 0.26} 26%|██▌ | 8752/34278 [9:43:59<40:34:36, 5.72s/it] 26%|██▌ | 8753/34278 [9:44:02<35:57:47, 5.07s/it] {'loss': 0.1616, 'grad_norm': 1.0667095653183198, 'learning_rate': 8.726484518984256e-06, 'epoch': 0.26} 26%|██▌ | 8753/34278 [9:44:02<35:57:47, 5.07s/it] 26%|██▌ | 8754/34278 [9:44:08<37:52:52, 5.34s/it] {'loss': 0.1573, 'grad_norm': 0.8618680921339075, 'learning_rate': 8.72616951503812e-06, 'epoch': 0.26} 26%|██▌ | 8754/34278 [9:44:08<37:52:52, 5.34s/it] 26%|██▌ | 8755/34278 [9:44:11<32:30:30, 4.59s/it] {'loss': 0.179, 'grad_norm': 0.8299326664354898, 'learning_rate': 8.725854477825632e-06, 'epoch': 0.26} 26%|██▌ | 8755/34278 [9:44:11<32:30:30, 4.59s/it] 26%|██▌ | 8756/34278 [9:44:14<29:37:49, 4.18s/it] {'loss': 0.1391, 'grad_norm': 0.8713894055508062, 'learning_rate': 8.725539407349606e-06, 'epoch': 0.26} 26%|██▌ | 8756/34278 [9:44:14<29:37:49, 4.18s/it] 26%|██▌ | 8757/34278 [9:44:17<27:37:57, 3.90s/it] {'loss': 0.1533, 'grad_norm': 0.8007962491614893, 'learning_rate': 8.725224303612854e-06, 'epoch': 0.26} 26%|██▌ | 8757/34278 [9:44:17<27:37:57, 3.90s/it] 26%|██▌ | 8758/34278 [9:44:23<32:06:10, 4.53s/it] {'loss': 0.1811, 'grad_norm': 0.971945767080681, 'learning_rate': 
8.724909166618187e-06, 'epoch': 0.26} 26%|██▌ | 8758/34278 [9:44:23<32:06:10, 4.53s/it] 26%|██▌ | 8759/34278 [9:44:26<28:34:08, 4.03s/it] {'loss': 0.1613, 'grad_norm': 0.8146285945171134, 'learning_rate': 8.724593996368422e-06, 'epoch': 0.26} 26%|██▌ | 8759/34278 [9:44:26<28:34:08, 4.03s/it] 26%|██▌ | 8760/34278 [9:44:29<26:17:34, 3.71s/it] {'loss': 0.1497, 'grad_norm': 0.8374369013562248, 'learning_rate': 8.72427879286637e-06, 'epoch': 0.26} 26%|██▌ | 8760/34278 [9:44:29<26:17:34, 3.71s/it] 26%|██▌ | 8761/34278 [9:44:33<25:44:29, 3.63s/it] {'loss': 0.1552, 'grad_norm': 0.9625265198360471, 'learning_rate': 8.723963556114847e-06, 'epoch': 0.26} 26%|██▌ | 8761/34278 [9:44:33<25:44:29, 3.63s/it] 26%|██▌ | 8762/34278 [9:44:39<30:36:25, 4.32s/it] {'loss': 0.1567, 'grad_norm': 0.7911600922588395, 'learning_rate': 8.723648286116664e-06, 'epoch': 0.26} 26%|██▌ | 8762/34278 [9:44:39<30:36:25, 4.32s/it] 26%|██▌ | 8763/34278 [9:44:42<28:39:07, 4.04s/it] {'loss': 0.16, 'grad_norm': 0.6944363270822046, 'learning_rate': 8.723332982874639e-06, 'epoch': 0.26} 26%|██▌ | 8763/34278 [9:44:42<28:39:07, 4.04s/it] 26%|██▌ | 8764/34278 [9:44:47<31:40:17, 4.47s/it] {'loss': 0.1689, 'grad_norm': 0.7716485153147591, 'learning_rate': 8.723017646391587e-06, 'epoch': 0.26} 26%|██▌ | 8764/34278 [9:44:47<31:40:17, 4.47s/it] 26%|██▌ | 8765/34278 [9:44:52<31:21:48, 4.43s/it] {'loss': 0.1541, 'grad_norm': 0.8354276672058369, 'learning_rate': 8.722702276670323e-06, 'epoch': 0.26} 26%|██▌ | 8765/34278 [9:44:52<31:21:48, 4.43s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 26%|██▌ | 8766/34278 [9:44:55<28:49:24, 4.07s/it] {'loss': 0.1488, 'grad_norm': 0.695295395783904, 'learning_rate': 8.72238687371366e-06, 'epoch': 0.26} 26%|██▌ | 8766/34278 [9:44:55<28:49:24, 4.07s/it] 26%|██▌ | 8767/34278 [9:45:01<33:17:08, 4.70s/it] {'loss': 0.1492, 'grad_norm': 0.7860631251591896, 'learning_rate': 8.722071437524415e-06, 'epoch': 0.26} 26%|██▌ | 8767/34278 [9:45:01<33:17:08, 4.70s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3bda4fd30>
Failed to fetch sample 2691936.
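Alongside the corrupt-image failures, this log repeatedly shows the tokenizer warning "Token indices sequence length is longer than the specified maximum sequence length for this model (13867 > 8192)". The warning comes from the tokenizer, which does not truncate by itself. A dependency-free sketch of clipping oversized sequences before collation (the keep-left policy is an assumption; the 8192 limit is taken from the warning):

```python
def clip_to_context(input_ids, max_len=8192, keep="left"):
    """Clip a token-id sequence to the model's maximum context length.

    keep="left" retains the beginning of the sequence,
    keep="right" retains the end.
    """
    if len(input_ids) <= max_len:
        return input_ids
    return input_ids[:max_len] if keep == "left" else input_ids[-max_len:]
```

Without a clip (or an upstream filter on sample length), an 11494- or 13867-token sample can index past the position limit and produce the "indexing errors" the warning predicts.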
Exception: cannot identify image file <_io.BytesIO object at 0x7fa3bda4fd30> 26%|██▌ | 8768/34278 [9:45:07<34:51:56, 4.92s/it] {'loss': 0.1689, 'grad_norm': 0.8912421457723493, 'learning_rate': 8.721755968105406e-06, 'epoch': 0.26} 26%|██▌ | 8768/34278 [9:45:07<34:51:56, 4.92s/it] 26%|██▌ | 8769/34278 [9:45:09<30:37:32, 4.32s/it] {'loss': 0.1569, 'grad_norm': 0.7598400483680359, 'learning_rate': 8.721440465459448e-06, 'epoch': 0.26} 26%|██▌ | 8769/34278 [9:45:10<30:37:32, 4.32s/it] 26%|██▌ | 8770/34278 [9:45:14<30:00:20, 4.23s/it] {'loss': 0.1437, 'grad_norm': 0.857580910539667, 'learning_rate': 8.721124929589358e-06, 'epoch': 0.26} 26%|██▌ | 8770/34278 [9:45:14<30:00:20, 4.23s/it] 26%|██▌ | 8771/34278 [9:45:17<27:41:04, 3.91s/it] {'loss': 0.1589, 'grad_norm': 0.6838812636405465, 'learning_rate': 8.720809360497953e-06, 'epoch': 0.26} 26%|██▌ | 8771/34278 [9:45:17<27:41:04, 3.91s/it] 26%|██▌ | 8772/34278 [9:45:20<26:17:14, 3.71s/it] {'loss': 0.1506, 'grad_norm': 0.9153310537215729, 'learning_rate': 8.720493758188049e-06, 'epoch': 0.26} 26%|██▌ | 8772/34278 [9:45:20<26:17:14, 3.71s/it] 26%|██▌ | 8773/34278 [9:45:24<26:29:53, 3.74s/it] {'loss': 0.152, 'grad_norm': 0.82615409438108, 'learning_rate': 8.720178122662466e-06, 'epoch': 0.26} 26%|██▌ | 8773/34278 [9:45:24<26:29:53, 3.74s/it] 26%|██▌ | 8774/34278 [9:45:27<26:02:00, 3.67s/it] {'loss': 0.1637, 'grad_norm': 0.8605169310453629, 'learning_rate': 8.71986245392402e-06, 'epoch': 0.26} 26%|██▌ | 8774/34278 [9:45:27<26:02:00, 3.67s/it] 26%|██▌ | 8775/34278 [9:45:31<25:57:34, 3.66s/it] {'loss': 0.1457, 'grad_norm': 0.9090589536210341, 'learning_rate': 8.719546751975531e-06, 'epoch': 0.26} 26%|██▌ | 8775/34278 [9:45:31<25:57:34, 3.66s/it] 26%|██▌ | 8776/34278 [9:45:34<25:49:02, 3.64s/it] {'loss': 0.1443, 'grad_norm': 0.7772837033436167, 'learning_rate': 8.719231016819817e-06, 'epoch': 0.26} 26%|██▌ | 8776/34278 [9:45:35<25:49:02, 3.64s/it] 26%|██▌ | 8777/34278 [9:45:39<27:10:58, 3.84s/it] {'loss': 0.1499, 'grad_norm': 
1.0667870199913696, 'learning_rate': 8.718915248459697e-06, 'epoch': 0.26} 26%|██▌ | 8777/34278 [9:45:39<27:10:58, 3.84s/it] 26%|██▌ | 8778/34278 [9:45:44<30:32:01, 4.31s/it] {'loss': 0.1692, 'grad_norm': 0.8081053989600095, 'learning_rate': 8.718599446897987e-06, 'epoch': 0.26} 26%|██▌ | 8778/34278 [9:45:44<30:32:01, 4.31s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Token indices sequence length is longer than the specified maximum sequence length for this model (13867 > 8192). Running this sequence through the model will result in indexing errors
26%|██▌ | 8779/34278 [9:45:48<29:34:49, 4.18s/it] {'loss': 0.1746, 'grad_norm': 0.983564666362862, 'learning_rate': 8.718283612137508e-06, 'epoch': 0.26} 26%|██▌ | 8779/34278 [9:45:48<29:34:49, 4.18s/it] 26%|██▌ | 8780/34278 [9:45:51<27:28:42, 3.88s/it] {'loss': 0.1757, 'grad_norm': 0.8528117300126145, 'learning_rate': 8.717967744181084e-06, 'epoch': 0.26} 26%|██▌ | 8780/34278 [9:45:51<27:28:42, 3.88s/it] 26%|██▌ | 8781/34278 [9:45:54<25:15:46, 3.57s/it] {'loss': 0.1547, 'grad_norm': 0.741080976782169, 'learning_rate': 8.717651843031529e-06, 'epoch': 0.26} 26%|██▌ | 8781/34278 [9:45:54<25:15:46, 3.57s/it] 26%|██▌ | 8782/34278 [9:45:57<24:31:01, 3.46s/it] {'loss': 0.1674, 'grad_norm': 1.1153756248130342, 'learning_rate': 8.717335908691667e-06, 'epoch': 0.26} 26%|██▌ | 8782/34278 [9:45:57<24:31:01, 3.46s/it] 26%|██▌ | 8783/34278 [9:46:00<23:31:49, 3.32s/it] {'loss': 0.1452, 'grad_norm': 0.8409951078676309, 'learning_rate': 8.717019941164317e-06, 'epoch': 0.26} 26%|██▌ | 8783/34278 [9:46:00<23:31:49, 3.32s/it] 26%|██▌ | 8784/34278 [9:46:03<22:55:31, 3.24s/it] {'loss': 0.1911, 'grad_norm': 0.6809635702314836, 'learning_rate': 8.7167039404523e-06, 'epoch': 0.26} 26%|██▌ | 8784/34278 [9:46:03<22:55:31, 3.24s/it] 26%|██▌ | 8785/34278
[9:46:06<22:20:13, 3.15s/it] {'loss': 0.1412, 'grad_norm': 1.1079818340434244, 'learning_rate': 8.71638790655844e-06, 'epoch': 0.26} 26%|██▌ | 8785/34278 [9:46:06<22:20:13, 3.15s/it] 26%|██▌ | 8786/34278 [9:46:10<22:36:29, 3.19s/it] {'loss': 0.1679, 'grad_norm': 0.9838662069191417, 'learning_rate': 8.716071839485552e-06, 'epoch': 0.26} 26%|██▌ | 8786/34278 [9:46:10<22:36:29, 3.19s/it] 26%|██▌ | 8787/34278 [9:46:13<23:15:24, 3.28s/it] {'loss': 0.1546, 'grad_norm': 0.6506205885962708, 'learning_rate': 8.715755739236464e-06, 'epoch': 0.26} 26%|██▌ | 8787/34278 [9:46:13<23:15:24, 3.28s/it] 26%|██▌ | 8788/34278 [9:46:16<22:42:37, 3.21s/it] {'loss': 0.1545, 'grad_norm': 0.718062480563348, 'learning_rate': 8.715439605813994e-06, 'epoch': 0.26} 26%|██▌ | 8788/34278 [9:46:16<22:42:37, 3.21s/it] 26%|██▌ | 8789/34278 [9:46:19<22:42:50, 3.21s/it] {'loss': 0.141, 'grad_norm': 0.8612418407748099, 'learning_rate': 8.715123439220968e-06, 'epoch': 0.26} 26%|██▌ | 8789/34278 [9:46:19<22:42:50, 3.21s/it] 26%|██▌ | 8790/34278 [9:46:24<26:22:16, 3.72s/it] {'loss': 0.1696, 'grad_norm': 0.634017237768269, 'learning_rate': 8.714807239460206e-06, 'epoch': 0.26} 26%|██▌ | 8790/34278 [9:46:24<26:22:16, 3.72s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 26%|██▌ | 8791/34278 [9:46:28<25:30:30, 3.60s/it] {'loss': 0.155, 'grad_norm': 0.7110658042582967, 'learning_rate': 8.714491006534532e-06, 'epoch': 0.26} 26%|██▌ | 8791/34278 [9:46:28<25:30:30, 3.60s/it] 26%|██▌ | 8792/34278 [9:46:31<24:35:36, 3.47s/it] {'loss': 0.1521, 'grad_norm': 1.2230597945265116, 'learning_rate': 8.714174740446769e-06, 'epoch': 0.26} 26%|██▌ | 8792/34278 [9:46:31<24:35:36, 3.47s/it] 26%|██▌ | 8793/34278 [9:46:37<30:02:25, 4.24s/it] {'loss': 0.179, 'grad_norm': 0.7733159226290556, 'learning_rate': 8.713858441199741e-06, 'epoch': 0.26} 26%|██▌ | 8793/34278 [9:46:37<30:02:25, 4.24s/it] 26%|██▌ | 8794/34278 [9:46:41<30:00:12, 4.24s/it] {'loss': 0.1584, 'grad_norm': 0.9072653388804234, 'learning_rate': 8.713542108796271e-06, 'epoch': 0.26} 26%|██▌ | 8794/34278 [9:46:41<30:00:12, 4.24s/it] 26%|██▌ | 8795/34278 [9:46:45<28:38:31, 4.05s/it] {'loss': 0.1729, 'grad_norm': 1.1565474412328696, 'learning_rate': 8.713225743239183e-06, 'epoch': 0.26} 26%|██▌ | 8795/34278 [9:46:45<28:38:31, 4.05s/it] 26%|██▌ | 8796/34278 [9:46:48<26:51:25, 3.79s/it] {'loss': 0.1369, 'grad_norm': 0.8551553752200075, 'learning_rate': 8.712909344531302e-06, 'epoch': 0.26} 26%|██▌ | 8796/34278 [9:46:48<26:51:25, 3.79s/it] 26%|██▌ | 8797/34278 [9:46:54<31:39:41, 4.47s/it] {'loss': 0.1688, 'grad_norm': 0.9607543688190029, 'learning_rate': 8.712592912675454e-06, 'epoch': 0.26} 26%|██▌ | 8797/34278 [9:46:54<31:39:41, 4.47s/it] 26%|██▌ | 8798/34278 [9:46:57<29:37:36, 4.19s/it] {'loss': 0.1917, 'grad_norm': 0.9504308927374339, 'learning_rate': 8.712276447674462e-06, 'epoch': 0.26} 26%|██▌ | 8798/34278 [9:46:57<29:37:36, 4.19s/it] 26%|██▌ | 8799/34278 [9:47:03<31:43:29, 4.48s/it] {'loss': 0.183, 'grad_norm': 1.0782985099457012, 'learning_rate': 8.711959949531152e-06, 'epoch': 0.26} 26%|██▌ | 8799/34278 [9:47:03<31:43:29, 4.48s/it] 26%|██▌ | 8800/34278 [9:47:06<29:22:28, 4.15s/it] {'loss': 0.1563, 'grad_norm': 1.1792973780465728, 'learning_rate': 
8.71164341824835e-06, 'epoch': 0.26} 26%|██▌ | 8800/34278 [9:47:06<29:22:28, 4.15s/it] 26%|██▌ | 8801/34278 [9:47:10<29:03:09, 4.11s/it] {'loss': 0.1761, 'grad_norm': 0.8086254196557137, 'learning_rate': 8.711326853828881e-06, 'epoch': 0.26} 26%|██▌ | 8801/34278 [9:47:10<29:03:09, 4.11s/it] 26%|██▌ | 8802/34278 [9:47:13<26:19:10, 3.72s/it] {'loss': 0.1523, 'grad_norm': 0.8422364878747902, 'learning_rate': 8.711010256275572e-06, 'epoch': 0.26} 26%|██▌ | 8802/34278 [9:47:13<26:19:10, 3.72s/it] 26%|██▌ | 8803/34278 [9:47:16<24:47:20, 3.50s/it] {'loss': 0.1478, 'grad_norm': 1.0218613293672558, 'learning_rate': 8.710693625591249e-06, 'epoch': 0.26} 26%|██▌ | 8803/34278 [9:47:16<24:47:20, 3.50s/it] 26%|██▌ | 8804/34278 [9:47:19<24:42:06, 3.49s/it] {'loss': 0.1506, 'grad_norm': 1.067926623148478, 'learning_rate': 8.71037696177874e-06, 'epoch': 0.26} 26%|██▌ | 8804/34278 [9:47:19<24:42:06, 3.49s/it] 26%|██▌ | 8805/34278 [9:47:23<25:03:42, 3.54s/it] {'loss': 0.1609, 'grad_norm': 0.8521451983426845, 'learning_rate': 8.710060264840872e-06, 'epoch': 0.26} 26%|██▌ | 8805/34278 [9:47:23<25:03:42, 3.54s/it] 26%|██▌ | 8806/34278 [9:47:26<24:02:48, 3.40s/it] {'loss': 0.1773, 'grad_norm': 1.0538250897279264, 'learning_rate': 8.70974353478047e-06, 'epoch': 0.26} 26%|██▌ | 8806/34278 [9:47:26<24:02:48, 3.40s/it] 26%|██▌ | 8807/34278 [9:47:30<26:27:07, 3.74s/it] {'loss': 0.1599, 'grad_norm': 0.8845549457939076, 'learning_rate': 8.709426771600363e-06, 'epoch': 0.26} 26%|██▌ | 8807/34278 [9:47:30<26:27:07, 3.74s/it] 26%|██▌ | 8808/34278 [9:47:33<24:45:53, 3.50s/it] {'loss': 0.1897, 'grad_norm': 1.0316144930372386, 'learning_rate': 8.70910997530338e-06, 'epoch': 0.26} 26%|██▌ | 8808/34278 [9:47:33<24:45:53, 3.50s/it] 26%|██▌ | 8809/34278 [9:47:39<28:23:02, 4.01s/it] {'loss': 0.1785, 'grad_norm': 0.8275803806540443, 'learning_rate': 8.70879314589235e-06, 'epoch': 0.26} 26%|██▌ | 8809/34278 [9:47:39<28:23:02, 4.01s/it] 26%|██▌ | 8810/34278 [9:47:42<26:15:10, 3.71s/it] {'loss': 0.1811, 
'grad_norm': 0.9629761451867813, 'learning_rate': 8.708476283370098e-06, 'epoch': 0.26} 26%|██▌ | 8810/34278 [9:47:42<26:15:10, 3.71s/it] 26%|██▌ | 8811/34278 [9:47:46<27:10:37, 3.84s/it] {'loss': 0.158, 'grad_norm': 0.9505803037569778, 'learning_rate': 8.708159387739456e-06, 'epoch': 0.26} 26%|██▌ | 8811/34278 [9:47:46<27:10:37, 3.84s/it] 26%|██▌ | 8812/34278 [9:47:49<26:05:16, 3.69s/it] {'loss': 0.1709, 'grad_norm': 0.7218256849695398, 'learning_rate': 8.70784245900325e-06, 'epoch': 0.26} 26%|██▌ | 8812/34278 [9:47:49<26:05:16, 3.69s/it] 26%|██▌ | 8813/34278 [9:47:53<26:11:00, 3.70s/it] {'loss': 0.1731, 'grad_norm': 0.8765072740167568, 'learning_rate': 8.707525497164316e-06, 'epoch': 0.26} 26%|██▌ | 8813/34278 [9:47:53<26:11:00, 3.70s/it] 26%|██▌ | 8814/34278 [9:47:57<26:24:10, 3.73s/it] {'loss': 0.1466, 'grad_norm': 0.8155519387229843, 'learning_rate': 8.707208502225476e-06, 'epoch': 0.26} 26%|██▌ | 8814/34278 [9:47:57<26:24:10, 3.73s/it] 26%|██▌ | 8815/34278 [9:48:00<26:23:57, 3.73s/it] {'loss': 0.1564, 'grad_norm': 0.7103380025975924, 'learning_rate': 8.706891474189566e-06, 'epoch': 0.26} 26%|██▌ | 8815/34278 [9:48:00<26:23:57, 3.73s/it] 26%|██▌ | 8816/34278 [9:48:05<27:58:17, 3.95s/it] {'loss': 0.148, 'grad_norm': 0.7816166744788744, 'learning_rate': 8.706574413059411e-06, 'epoch': 0.26} 26%|██▌ | 8816/34278 [9:48:05<27:58:17, 3.95s/it] 26%|██▌ | 8817/34278 [9:48:08<27:10:06, 3.84s/it] {'loss': 0.1337, 'grad_norm': 0.6888803989050883, 'learning_rate': 8.706257318837846e-06, 'epoch': 0.26} 26%|██▌ | 8817/34278 [9:48:08<27:10:06, 3.84s/it] 26%|██▌ | 8818/34278 [9:48:14<31:21:55, 4.44s/it] {'loss': 0.1365, 'grad_norm': 0.7799227031136653, 'learning_rate': 8.7059401915277e-06, 'epoch': 0.26} 26%|██▌ | 8818/34278 [9:48:14<31:21:55, 4.44s/it] 26%|██▌ | 8819/34278 [9:48:17<28:33:11, 4.04s/it] {'loss': 0.1838, 'grad_norm': 0.8660913193244866, 'learning_rate': 8.705623031131805e-06, 'epoch': 0.26} 26%|██▌ | 8819/34278 [9:48:17<28:33:11, 4.04s/it] 26%|██▌ | 8820/34278 
[9:48:20<25:31:24, 3.61s/it] {'loss': 0.1331, 'grad_norm': 0.6483985644983833, 'learning_rate': 8.70530583765299e-06, 'epoch': 0.26} 26%|██▌ | 8820/34278 [9:48:20<25:31:24, 3.61s/it] 26%|██▌ | 8821/34278 [9:48:26<31:12:31, 4.41s/it] {'loss': 0.1595, 'grad_norm': 1.2646915220802024, 'learning_rate': 8.704988611094093e-06, 'epoch': 0.26} 26%|██▌ | 8821/34278 [9:48:26<31:12:31, 4.41s/it] 26%|██▌ | 8822/34278 [9:48:30<29:58:50, 4.24s/it] {'loss': 0.1565, 'grad_norm': 0.7979583065309406, 'learning_rate': 8.704671351457941e-06, 'epoch': 0.26} 26%|██▌ | 8822/34278 [9:48:30<29:58:50, 4.24s/it] 26%|██▌ | 8823/34278 [9:48:34<28:34:21, 4.04s/it] {'loss': 0.1467, 'grad_norm': 0.7451983996609904, 'learning_rate': 8.704354058747366e-06, 'epoch': 0.26} 26%|██▌ | 8823/34278 [9:48:34<28:34:21, 4.04s/it] 26%|██▌ | 8824/34278 [9:48:39<31:12:06, 4.41s/it] {'loss': 0.1668, 'grad_norm': 1.0089067781043577, 'learning_rate': 8.704036732965202e-06, 'epoch': 0.26} 26%|██▌ | 8824/34278 [9:48:39<31:12:06, 4.41s/it] 26%|██▌ | 8825/34278 [9:48:45<34:27:09, 4.87s/it] {'loss': 0.1518, 'grad_norm': 1.0520293847921796, 'learning_rate': 8.703719374114283e-06, 'epoch': 0.26} 26%|██▌ | 8825/34278 [9:48:45<34:27:09, 4.87s/it] 26%|██▌ | 8826/34278 [9:48:48<31:11:37, 4.41s/it] {'loss': 0.1662, 'grad_norm': 0.8712651565054161, 'learning_rate': 8.703401982197444e-06, 'epoch': 0.26} 26%|██▌ | 8826/34278 [9:48:48<31:11:37, 4.41s/it] 26%|██▌ | 8827/34278 [9:48:51<28:22:05, 4.01s/it] {'loss': 0.1497, 'grad_norm': 0.9988059629732003, 'learning_rate': 8.703084557217513e-06, 'epoch': 0.26} 26%|██▌ | 8827/34278 [9:48:51<28:22:05, 4.01s/it] 26%|██▌ | 8828/34278 [9:48:54<26:08:15, 3.70s/it] {'loss': 0.1649, 'grad_norm': 0.8589199021224142, 'learning_rate': 8.702767099177328e-06, 'epoch': 0.26} 26%|██▌ | 8828/34278 [9:48:54<26:08:15, 3.70s/it] 26%|██▌ | 8829/34278 [9:48:59<28:10:23, 3.99s/it] {'loss': 0.1294, 'grad_norm': 0.7726342037518046, 'learning_rate': 8.702449608079722e-06, 'epoch': 0.26} 26%|██▌ | 8829/34278 
[9:48:59<28:10:23, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 26%|██▌ | 8830/34278 [9:49:02<26:55:39, 3.81s/it] {'loss': 0.1504, 'grad_norm': 0.7079604464746598, 'learning_rate': 8.70213208392753e-06, 'epoch': 0.26} 26%|██▌ | 8830/34278 [9:49:02<26:55:39, 3.81s/it] 26%|██▌ | 8831/34278 [9:49:05<25:14:07, 3.57s/it] {'loss': 0.1645, 'grad_norm': 0.8668711322604143, 'learning_rate': 8.701814526723588e-06, 'epoch': 0.26} 26%|██▌ | 8831/34278 [9:49:05<25:14:07, 3.57s/it] 26%|██▌ | 8832/34278 [9:49:08<24:17:01, 3.44s/it] {'loss': 0.1598, 'grad_norm': 0.8689087172820066, 'learning_rate': 8.701496936470728e-06, 'epoch': 0.26} 26%|██▌ | 8832/34278 [9:49:08<24:17:01, 3.44s/it] 26%|██▌ | 8833/34278 [9:49:12<23:53:28, 3.38s/it] {'loss': 0.1745, 'grad_norm': 0.729254569499863, 'learning_rate': 8.701179313171787e-06, 'epoch': 0.26} 26%|██▌ | 8833/34278 [9:49:12<23:53:28, 3.38s/it] 26%|██▌ | 8834/34278 [9:49:15<23:43:08, 3.36s/it] {'loss': 0.161, 'grad_norm': 0.7750600007600736, 'learning_rate': 8.7008616568296e-06, 'epoch': 0.26} 26%|██▌ | 8834/34278 [9:49:15<23:43:08, 3.36s/it] 26%|██▌ | 8835/34278 [9:49:20<27:21:24, 3.87s/it] {'loss': 0.1961, 'grad_norm': 1.1576861073778233, 'learning_rate': 8.700543967447005e-06, 'epoch': 0.26} 26%|██▌ | 8835/34278 [9:49:20<27:21:24, 3.87s/it] 26%|██▌ | 8836/34278 [9:49:23<25:28:51, 3.61s/it] {'loss': 0.1596, 'grad_norm': 0.7936500970083683, 'learning_rate': 8.700226245026838e-06, 'epoch': 0.26} 26%|██▌ | 8836/34278 [9:49:23<25:28:51, 3.61s/it] 26%|██▌ | 8837/34278 [9:49:26<24:57:35, 3.53s/it] {'loss': 0.1334, 'grad_norm': 0.82538331026121, 'learning_rate': 8.699908489571931e-06, 'epoch': 0.26} 26%|██▌ | 8837/34278 [9:49:26<24:57:35, 3.53s/it] 26%|██▌ | 8838/34278 [9:49:30<24:40:34, 3.49s/it] {'loss': 0.1621, 'grad_norm': 0.7553905612520914, 'learning_rate': 
8.699590701085125e-06, 'epoch': 0.26} 26%|██▌ | 8838/34278 [9:49:30<24:40:34, 3.49s/it] 26%|██▌ | 8839/34278 [9:49:33<25:03:24, 3.55s/it] {'loss': 0.161, 'grad_norm': 0.7823037908029749, 'learning_rate': 8.699272879569258e-06, 'epoch': 0.26} 26%|██▌ | 8839/34278 [9:49:34<25:03:24, 3.55s/it] 26%|██▌ | 8840/34278 [9:49:37<24:53:11, 3.52s/it] {'loss': 0.1622, 'grad_norm': 0.8909308772809414, 'learning_rate': 8.698955025027165e-06, 'epoch': 0.26} 26%|██▌ | 8840/34278 [9:49:37<24:53:11, 3.52s/it] 26%|██▌ | 8841/34278 [9:49:40<23:59:43, 3.40s/it] {'loss': 0.147, 'grad_norm': 0.6483082566578128, 'learning_rate': 8.698637137461685e-06, 'epoch': 0.26} 26%|██▌ | 8841/34278 [9:49:40<23:59:43, 3.40s/it] 26%|██▌ | 8842/34278 [9:49:44<24:25:37, 3.46s/it] {'loss': 0.1652, 'grad_norm': 1.1071666630884744, 'learning_rate': 8.698319216875656e-06, 'epoch': 0.26} 26%|██▌ | 8842/34278 [9:49:44<24:25:37, 3.46s/it] 26%|██▌ | 8843/34278 [9:49:46<22:56:01, 3.25s/it] {'loss': 0.1398, 'grad_norm': 0.8406033754314253, 'learning_rate': 8.698001263271914e-06, 'epoch': 0.26} 26%|██▌ | 8843/34278 [9:49:46<22:56:01, 3.25s/it] 26%|██▌ | 8844/34278 [9:49:50<23:19:07, 3.30s/it] {'loss': 0.1765, 'grad_norm': 0.7163675769592432, 'learning_rate': 8.697683276653302e-06, 'epoch': 0.26} 26%|██▌ | 8844/34278 [9:49:50<23:19:07, 3.30s/it] 26%|██▌ | 8845/34278 [9:49:55<28:07:18, 3.98s/it] {'loss': 0.1617, 'grad_norm': 0.8801074754734578, 'learning_rate': 8.697365257022654e-06, 'epoch': 0.26} 26%|██▌ | 8845/34278 [9:49:55<28:07:18, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 26%|██▌ | 8846/34278 [9:49:59<26:35:11, 3.76s/it] {'loss': 0.1408, 'grad_norm': 0.8879008470690003, 'learning_rate': 8.697047204382813e-06, 'epoch': 0.26} 26%|██▌ | 8846/34278 [9:49:59<26:35:11, 3.76s/it] 26%|██▌ | 8847/34278 [9:50:02<26:24:51, 3.74s/it] {'loss': 0.1424, 'grad_norm': 0.7716951003806196, 'learning_rate': 8.696729118736618e-06, 'epoch': 0.26} 26%|██▌ | 8847/34278 [9:50:02<26:24:51, 3.74s/it] 26%|██▌ | 8848/34278 [9:50:08<30:46:32, 4.36s/it] {'loss': 0.1855, 'grad_norm': 0.8899398981891741, 'learning_rate': 8.696411000086906e-06, 'epoch': 0.26} 26%|██▌ | 8848/34278 [9:50:08<30:46:32, 4.36s/it] 26%|██▌ | 8849/34278 [9:50:12<29:30:42, 4.18s/it] {'loss': 0.1559, 'grad_norm': 0.8839715875952767, 'learning_rate': 8.69609284843652e-06, 'epoch': 0.26} 26%|██▌ | 8849/34278 [9:50:12<29:30:42, 4.18s/it] 26%|██▌ | 8850/34278 [9:50:15<26:42:43, 3.78s/it] {'loss': 0.1678, 'grad_norm': 1.0712819701383225, 'learning_rate': 8.695774663788299e-06, 'epoch': 0.26} 26%|██▌ | 8850/34278 [9:50:15<26:42:43, 3.78s/it] 26%|██▌ | 8851/34278 [9:50:21<31:33:07, 4.47s/it] {'loss': 0.1464, 'grad_norm': 0.9844537685094489, 'learning_rate': 8.695456446145084e-06, 'epoch': 0.26} 26%|██▌ | 8851/34278 [9:50:21<31:33:07, 4.47s/it] 26%|██▌ | 8852/34278 [9:50:24<28:24:47, 4.02s/it] {'loss': 0.1625, 'grad_norm': 1.0907816708370612, 'learning_rate': 8.695138195509715e-06, 'epoch': 0.26} 26%|██▌ | 8852/34278 [9:50:24<28:24:47, 4.02s/it] 26%|██▌ | 8853/34278 [9:50:27<26:38:40, 3.77s/it] {'loss': 0.1758, 'grad_norm': 1.0740844466516966, 'learning_rate': 8.694819911885034e-06, 'epoch': 0.26} 26%|██▌ | 8853/34278 [9:50:27<26:38:40, 3.77s/it] 26%|██▌ | 8854/34278 [9:50:30<25:43:45, 3.64s/it] {'loss': 0.153, 'grad_norm': 0.8891126516343939, 'learning_rate': 8.694501595273887e-06, 'epoch': 0.26} 26%|██▌ | 8854/34278 [9:50:30<25:43:45, 3.64s/it] 26%|██▌ | 8855/34278 [9:50:34<26:04:10, 3.69s/it] {'loss': 0.1636, 'grad_norm': 0.8336304641396416, 'learning_rate': 
8.694183245679108e-06, 'epoch': 0.26} 26%|██▌ | 8855/34278 [9:50:34<26:04:10, 3.69s/it] 26%|██▌ | 8856/34278 [9:50:40<30:49:27, 4.37s/it] {'loss': 0.1687, 'grad_norm': 0.9731712741496811, 'learning_rate': 8.693864863103546e-06, 'epoch': 0.26} 26%|██▌ | 8856/34278 [9:50:40<30:49:27, 4.37s/it] 26%|██▌ | 8857/34278 [9:50:44<29:08:56, 4.13s/it] {'loss': 0.173, 'grad_norm': 0.9648990726717422, 'learning_rate': 8.693546447550036e-06, 'epoch': 0.26} 26%|██▌ | 8857/34278 [9:50:44<29:08:56, 4.13s/it] 26%|██▌ | 8858/34278 [9:50:47<28:06:34, 3.98s/it] {'loss': 0.1546, 'grad_norm': 0.904415710537177, 'learning_rate': 8.693227999021428e-06, 'epoch': 0.26} 26%|██▌ | 8858/34278 [9:50:47<28:06:34, 3.98s/it] 26%|██▌ | 8859/34278 [9:50:51<26:34:17, 3.76s/it] {'loss': 0.1497, 'grad_norm': 0.7976806557549195, 'learning_rate': 8.69290951752056e-06, 'epoch': 0.26} 26%|██▌ | 8859/34278 [9:50:51<26:34:17, 3.76s/it] 26%|██▌ | 8860/34278 [9:50:54<25:13:59, 3.57s/it] {'loss': 0.1555, 'grad_norm': 0.8187815959775875, 'learning_rate': 8.69259100305028e-06, 'epoch': 0.26} 26%|██▌ | 8860/34278 [9:50:54<25:13:59, 3.57s/it] 26%|██▌ | 8861/34278 [9:50:57<25:17:20, 3.58s/it] {'loss': 0.1534, 'grad_norm': 0.7853515194328271, 'learning_rate': 8.692272455613427e-06, 'epoch': 0.26} 26%|██▌ | 8861/34278 [9:50:57<25:17:20, 3.58s/it] 26%|██▌ | 8862/34278 [9:51:00<24:23:45, 3.46s/it] {'loss': 0.1656, 'grad_norm': 0.7234433521282186, 'learning_rate': 8.691953875212848e-06, 'epoch': 0.26} 26%|██▌ | 8862/34278 [9:51:00<24:23:45, 3.46s/it] 26%|██▌ | 8863/34278 [9:51:06<29:41:39, 4.21s/it] {'loss': 0.1457, 'grad_norm': 0.8750074985201797, 'learning_rate': 8.691635261851385e-06, 'epoch': 0.26} 26%|██▌ | 8863/34278 [9:51:06<29:41:39, 4.21s/it] 26%|██▌ | 8864/34278 [9:51:10<28:03:56, 3.98s/it] {'loss': 0.1459, 'grad_norm': 0.7847218231122696, 'learning_rate': 8.691316615531885e-06, 'epoch': 0.26} 26%|██▌ | 8864/34278 [9:51:10<28:03:56, 3.98s/it] 26%|██▌ | 8865/34278 [9:51:14<27:35:03, 3.91s/it] {'loss': 0.1697, 
'grad_norm': 0.7680291020795362, 'learning_rate': 8.690997936257191e-06, 'epoch': 0.26} 26%|██▌ | 8865/34278 [9:51:14<27:35:03, 3.91s/it] 26%|██▌ | 8866/34278 [9:51:17<25:45:04, 3.65s/it] {'loss': 0.156, 'grad_norm': 0.6764550047370509, 'learning_rate': 8.690679224030149e-06, 'epoch': 0.26} 26%|██▌ | 8866/34278 [9:51:17<25:45:04, 3.65s/it] 26%|██▌ | 8867/34278 [9:51:22<29:49:13, 4.22s/it] {'loss': 0.1414, 'grad_norm': 0.8041923507195612, 'learning_rate': 8.690360478853603e-06, 'epoch': 0.26} 26%|██▌ | 8867/34278 [9:51:22<29:49:13, 4.22s/it] 26%|██▌ | 8868/34278 [9:51:27<30:46:32, 4.36s/it] {'loss': 0.145, 'grad_norm': 0.8314780058372228, 'learning_rate': 8.6900417007304e-06, 'epoch': 0.26} 26%|██▌ | 8868/34278 [9:51:27<30:46:32, 4.36s/it] 26%|██▌ | 8869/34278 [9:51:31<30:11:08, 4.28s/it] {'loss': 0.1571, 'grad_norm': 0.8227874731554615, 'learning_rate': 8.689722889663386e-06, 'epoch': 0.26} 26%|██▌ | 8869/34278 [9:51:31<30:11:08, 4.28s/it] 26%|██▌ | 8870/34278 [9:51:34<28:01:16, 3.97s/it] {'loss': 0.1593, 'grad_norm': 0.7284739481406283, 'learning_rate': 8.689404045655406e-06, 'epoch': 0.26} 26%|██▌ | 8870/34278 [9:51:34<28:01:16, 3.97s/it] 26%|██▌ | 8871/34278 [9:51:39<29:20:57, 4.16s/it] {'loss': 0.1556, 'grad_norm': 1.277646345397803, 'learning_rate': 8.689085168709309e-06, 'epoch': 0.26} 26%|██▌ | 8871/34278 [9:51:39<29:20:57, 4.16s/it] 26%|██▌ | 8872/34278 [9:51:45<33:11:37, 4.70s/it] {'loss': 0.1789, 'grad_norm': 0.8730722697824033, 'learning_rate': 8.688766258827938e-06, 'epoch': 0.26} 26%|██▌ | 8872/34278 [9:51:45<33:11:37, 4.70s/it] 26%|██▌ | 8873/34278 [9:51:48<29:55:23, 4.24s/it] {'loss': 0.1462, 'grad_norm': 0.8774586082077487, 'learning_rate': 8.688447316014144e-06, 'epoch': 0.26} 26%|██▌ | 8873/34278 [9:51:48<29:55:23, 4.24s/it] 26%|██▌ | 8874/34278 [9:51:54<33:15:18, 4.71s/it] {'loss': 0.1662, 'grad_norm': 0.7898122849796687, 'learning_rate': 8.688128340270772e-06, 'epoch': 0.26} 26%|██▌ | 8874/34278 [9:51:54<33:15:18, 4.71s/it] 26%|██▌ | 8875/34278 
[9:51:56<29:04:01, 4.12s/it] {'loss': 0.147, 'grad_norm': 0.7686793143659327, 'learning_rate': 8.68780933160067e-06, 'epoch': 0.26} 26%|██▌ | 8875/34278 [9:51:57<29:04:01, 4.12s/it] 26%|██▌ | 8876/34278 [9:52:00<27:18:44, 3.87s/it] {'loss': 0.1646, 'grad_norm': 0.9601607688264278, 'learning_rate': 8.687490290006689e-06, 'epoch': 0.26} 26%|██▌ | 8876/34278 [9:52:00<27:18:44, 3.87s/it] 26%|██▌ | 8877/34278 [9:52:06<32:06:54, 4.55s/it] {'loss': 0.1979, 'grad_norm': 0.9082518558436797, 'learning_rate': 8.687171215491673e-06, 'epoch': 0.26} 26%|██▌ | 8877/34278 [9:52:06<32:06:54, 4.55s/it] 26%|██▌ | 8878/34278 [9:52:10<31:30:22, 4.47s/it] {'loss': 0.1427, 'grad_norm': 0.761382025265103, 'learning_rate': 8.686852108058472e-06, 'epoch': 0.26} 26%|██▌ | 8878/34278 [9:52:10<31:30:22, 4.47s/it] 26%|██▌ | 8879/34278 [9:52:16<34:34:37, 4.90s/it] {'loss': 0.1495, 'grad_norm': 0.7154054704953288, 'learning_rate': 8.686532967709938e-06, 'epoch': 0.26} 26%|██▌ | 8879/34278 [9:52:16<34:34:37, 4.90s/it] 26%|██▌ | 8880/34278 [9:52:19<31:21:54, 4.45s/it] {'loss': 0.1534, 'grad_norm': 0.7285945560701206, 'learning_rate': 8.686213794448914e-06, 'epoch': 0.26} 26%|██▌ | 8880/34278 [9:52:19<31:21:54, 4.45s/it] 26%|██▌ | 8881/34278 [9:52:25<32:50:40, 4.66s/it] {'loss': 0.148, 'grad_norm': 0.8378752925761715, 'learning_rate': 8.685894588278256e-06, 'epoch': 0.26} 26%|██▌ | 8881/34278 [9:52:25<32:50:40, 4.66s/it] 26%|██▌ | 8882/34278 [9:52:28<29:31:43, 4.19s/it] {'loss': 0.164, 'grad_norm': 0.9615955944272917, 'learning_rate': 8.685575349200812e-06, 'epoch': 0.26} 26%|██▌ | 8882/34278 [9:52:28<29:31:43, 4.19s/it] 26%|██▌ | 8883/34278 [9:52:31<27:26:27, 3.89s/it] {'loss': 0.1464, 'grad_norm': 0.7677578971911746, 'learning_rate': 8.685256077219428e-06, 'epoch': 0.26} 26%|██▌ | 8883/34278 [9:52:31<27:26:27, 3.89s/it] 26%|██▌ | 8884/34278 [9:52:35<26:48:30, 3.80s/it] {'loss': 0.1421, 'grad_norm': 0.877176485121053, 'learning_rate': 8.684936772336961e-06, 'epoch': 0.26} 26%|██▌ | 8884/34278 
[9:52:35<26:48:30, 3.80s/it] 26%|██▌ | 8885/34278 [9:52:37<24:54:19, 3.53s/it] {'loss': 0.1452, 'grad_norm': 0.9925513683909988, 'learning_rate': 8.684617434556255e-06, 'epoch': 0.26} 26%|██▌ | 8885/34278 [9:52:37<24:54:19, 3.53s/it] 26%|██▌ | 8886/34278 [9:52:43<30:05:58, 4.27s/it] {'loss': 0.1629, 'grad_norm': 0.8638704131644256, 'learning_rate': 8.684298063880166e-06, 'epoch': 0.26} 26%|██▌ | 8886/34278 [9:52:43<30:05:58, 4.27s/it] 26%|██▌ | 8887/34278 [9:52:49<33:57:14, 4.81s/it] {'loss': 0.168, 'grad_norm': 0.7770560173656657, 'learning_rate': 8.683978660311542e-06, 'epoch': 0.26} 26%|██▌ | 8887/34278 [9:52:50<33:57:14, 4.81s/it] 26%|██▌ | 8888/34278 [9:52:54<32:14:04, 4.57s/it] {'loss': 0.1867, 'grad_norm': 0.8800620468543845, 'learning_rate': 8.683659223853238e-06, 'epoch': 0.26} 26%|██▌ | 8888/34278 [9:52:54<32:14:04, 4.57s/it] 26%|██▌ | 8889/34278 [9:52:56<28:50:34, 4.09s/it] {'loss': 0.1581, 'grad_norm': 0.9077449434044605, 'learning_rate': 8.683339754508102e-06, 'epoch': 0.26} 26%|██▌ | 8889/34278 [9:52:56<28:50:34, 4.09s/it] 26%|██▌ | 8890/34278 [9:53:01<30:11:43, 4.28s/it] {'loss': 0.152, 'grad_norm': 0.7541267098540116, 'learning_rate': 8.683020252278988e-06, 'epoch': 0.26} 26%|██▌ | 8890/34278 [9:53:01<30:11:43, 4.28s/it] 26%|██▌ | 8891/34278 [9:53:04<27:43:18, 3.93s/it] {'loss': 0.1671, 'grad_norm': 0.9230556082843885, 'learning_rate': 8.68270071716875e-06, 'epoch': 0.26} 26%|██▌ | 8891/34278 [9:53:04<27:43:18, 3.93s/it] 26%|██▌ | 8892/34278 [9:53:10<31:05:09, 4.41s/it] {'loss': 0.1989, 'grad_norm': 0.9330205850452694, 'learning_rate': 8.682381149180239e-06, 'epoch': 0.26} 26%|██▌ | 8892/34278 [9:53:10<31:05:09, 4.41s/it] 26%|██▌ | 8893/34278 [9:53:13<27:57:06, 3.96s/it] {'loss': 0.1331, 'grad_norm': 0.8778281677680428, 'learning_rate': 8.682061548316307e-06, 'epoch': 0.26} 26%|██▌ | 8893/34278 [9:53:13<27:57:06, 3.96s/it] 26%|██▌ | 8894/34278 [9:53:16<25:36:34, 3.63s/it] {'loss': 0.1745, 'grad_norm': 0.8308052619745485, 'learning_rate': 
8.681741914579807e-06, 'epoch': 0.26} 26%|██▌ | 8894/34278 [9:53:16<25:36:34, 3.63s/it] 26%|██▌ | 8895/34278 [9:53:20<28:10:06, 4.00s/it] {'loss': 0.1615, 'grad_norm': 0.7996080524419938, 'learning_rate': 8.681422247973596e-06, 'epoch': 0.26} 26%|██▌ | 8895/34278 [9:53:20<28:10:06, 4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 26%|██▌ | 8896/34278 [9:53:25<28:57:21, 4.11s/it] {'loss': 0.1636, 'grad_norm': 0.9020034260693374, 'learning_rate': 8.681102548500526e-06, 'epoch': 0.26} 26%|██▌ | 8896/34278 [9:53:25<28:57:21, 4.11s/it] 26%|██▌ | 8897/34278 [9:53:29<28:38:03, 4.06s/it] {'loss': 0.1743, 'grad_norm': 0.8887946298156156, 'learning_rate': 8.68078281616345e-06, 'epoch': 0.26} 26%|██▌ | 8897/34278 [9:53:29<28:38:03, 4.06s/it] 26%|██▌ | 8898/34278 [9:53:33<29:07:02, 4.13s/it] {'loss': 0.1558, 'grad_norm': 0.7243999100222906, 'learning_rate': 8.680463050965227e-06, 'epoch': 0.26} 26%|██▌ | 8898/34278 [9:53:33<29:07:02, 4.13s/it] 26%|██▌ | 8899/34278 [9:53:37<28:38:24, 4.06s/it] {'loss': 0.1774, 'grad_norm': 0.8516601822735672, 'learning_rate': 8.680143252908704e-06, 'epoch': 0.26} 26%|██▌ | 8899/34278 [9:53:37<28:38:24, 4.06s/it] 26%|██▌ | 8900/34278 [9:53:41<27:51:40, 3.95s/it] {'loss': 0.1433, 'grad_norm': 0.9026393380363806, 'learning_rate': 8.679823421996745e-06, 'epoch': 0.26} 26%|██▌ | 8900/34278 [9:53:41<27:51:40, 3.95s/it] 26%|██▌ | 8901/34278 [9:53:45<29:15:37, 4.15s/it] {'loss': 0.1449, 'grad_norm': 0.6900676970466267, 'learning_rate': 8.679503558232197e-06, 'epoch': 0.26} 26%|██▌ | 8901/34278 [9:53:45<29:15:37, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 26%|██▌ | 8902/34278 [9:53:48<26:10:16, 3.71s/it] {'loss': 0.1735, 'grad_norm': 0.9630417895997558, 'learning_rate': 8.679183661617923e-06, 'epoch': 0.26} 26%|██▌ | 8902/34278 [9:53:48<26:10:16, 3.71s/it] 26%|██▌ | 8903/34278 [9:53:51<24:57:09, 3.54s/it] {'loss': 0.1499, 'grad_norm': 0.76408418310192, 'learning_rate': 8.678863732156773e-06, 'epoch': 0.26} 26%|██▌ | 8903/34278 [9:53:51<24:57:09, 3.54s/it] 26%|██▌ | 8904/34278 [9:53:57<29:29:31, 4.18s/it] {'loss': 0.1333, 'grad_norm': 0.8353763808398833, 'learning_rate': 8.678543769851606e-06, 'epoch': 0.26} 26%|██▌ | 8904/34278 [9:53:57<29:29:31, 4.18s/it] 26%|██▌ | 8905/34278 [9:54:02<31:32:13, 4.47s/it] {'loss': 0.1486, 'grad_norm': 0.7340719249282294, 'learning_rate': 8.678223774705279e-06, 'epoch': 0.26} 26%|██▌ | 8905/34278 [9:54:02<31:32:13, 4.47s/it] 26%|██▌ | 8906/34278 [9:54:05<28:33:18, 4.05s/it] {'loss': 0.1886, 'grad_norm': 0.7884093911045592, 'learning_rate': 8.677903746720648e-06, 'epoch': 0.26} 26%|██▌ | 8906/34278 [9:54:05<28:33:18, 4.05s/it] 26%|██▌ | 8907/34278 [9:54:09<28:45:51, 4.08s/it] {'loss': 0.169, 'grad_norm': 0.9833278264474146, 'learning_rate': 8.677583685900572e-06, 'epoch': 0.26} 26%|██▌ | 8907/34278 [9:54:09<28:45:51, 4.08s/it] 26%|██▌ | 8908/34278 [9:54:16<33:37:46, 4.77s/it] {'loss': 0.1481, 'grad_norm': 0.9930250670443083, 'learning_rate': 8.677263592247905e-06, 'epoch': 0.26} 26%|██▌ | 8908/34278 [9:54:16<33:37:46, 4.77s/it] 26%|██▌ | 8909/34278 [9:54:19<30:10:25, 4.28s/it] {'loss': 0.1324, 'grad_norm': 0.9653846703152127, 'learning_rate': 8.676943465765506e-06, 'epoch': 0.26} 26%|██▌ | 8909/34278 [9:54:19<30:10:25, 4.28s/it] 26%|██▌ | 8910/34278 [9:54:23<30:07:03, 4.27s/it] {'loss': 0.1332, 'grad_norm': 0.9202175390113022, 'learning_rate': 8.676623306456235e-06, 'epoch': 0.26} 26%|██▌ | 8910/34278 [9:54:23<30:07:03, 4.27s/it] 26%|██▌ | 8911/34278 [9:54:27<29:22:13, 4.17s/it] {'loss': 0.1493, 'grad_norm': 0.8731167581220094, 'learning_rate': 
8.676303114322948e-06, 'epoch': 0.26} 26%|██▌ | 8911/34278 [9:54:27<29:22:13, 4.17s/it] 26%|██▌ | 8912/34278 [9:54:33<33:32:03, 4.76s/it] {'loss': 0.1739, 'grad_norm': 0.886676344309571, 'learning_rate': 8.675982889368503e-06, 'epoch': 0.26} 26%|██▌ | 8912/34278 [9:54:33<33:32:03, 4.76s/it] 26%|██▌ | 8913/34278 [9:54:39<36:56:21, 5.24s/it] {'loss': 0.1703, 'grad_norm': 0.7860685323189874, 'learning_rate': 8.675662631595762e-06, 'epoch': 0.26} 26%|██▌ | 8913/34278 [9:54:39<36:56:21, 5.24s/it] 26%|██▌ | 8914/34278 [9:54:43<32:40:40, 4.64s/it] {'loss': 0.1552, 'grad_norm': 1.1572269862127893, 'learning_rate': 8.675342341007582e-06, 'epoch': 0.26} 26%|██▌ | 8914/34278 [9:54:43<32:40:40, 4.64s/it] 26%|██▌ | 8915/34278 [9:54:46<29:59:55, 4.26s/it] {'loss': 0.1511, 'grad_norm': 0.7990010528377889, 'learning_rate': 8.675022017606824e-06, 'epoch': 0.26} 26%|██▌ | 8915/34278 [9:54:46<29:59:55, 4.26s/it] 26%|██▌ | 8916/34278 [9:54:49<28:22:42, 4.03s/it] {'loss': 0.1535, 'grad_norm': 0.9849423360542807, 'learning_rate': 8.674701661396345e-06, 'epoch': 0.26} 26%|██▌ | 8916/34278 [9:54:49<28:22:42, 4.03s/it] 26%|██▌ | 8917/34278 [9:54:52<25:44:48, 3.65s/it] {'loss': 0.1351, 'grad_norm': 0.7597937933589964, 'learning_rate': 8.674381272379008e-06, 'epoch': 0.26} 26%|██▌ | 8917/34278 [9:54:52<25:44:48, 3.65s/it] 26%|██▌ | 8918/34278 [9:54:55<24:42:00, 3.51s/it] {'loss': 0.154, 'grad_norm': 0.8573456198285051, 'learning_rate': 8.674060850557673e-06, 'epoch': 0.26} 26%|██▌ | 8918/34278 [9:54:55<24:42:00, 3.51s/it] 26%|██▌ | 8919/34278 [9:54:58<23:36:33, 3.35s/it] {'loss': 0.1729, 'grad_norm': 0.9188513446682759, 'learning_rate': 8.673740395935198e-06, 'epoch': 0.26} 26%|██▌ | 8919/34278 [9:54:58<23:36:33, 3.35s/it] 26%|██▌ | 8920/34278 [9:55:02<23:08:33, 3.29s/it] {'loss': 0.1777, 'grad_norm': 1.2466147536443686, 'learning_rate': 8.673419908514447e-06, 'epoch': 0.26} 26%|██▌ | 8920/34278 [9:55:02<23:08:33, 3.29s/it] 26%|██▌ | 8921/34278 [9:55:05<23:26:23, 3.33s/it] {'loss': 0.1642, 
'grad_norm': 1.0470505186834715, 'learning_rate': 8.67309938829828e-06, 'epoch': 0.26} 26%|██▌ | 8921/34278 [9:55:05<23:26:23, 3.33s/it] 26%|██▌ | 8922/34278 [9:55:08<23:11:45, 3.29s/it] {'loss': 0.1586, 'grad_norm': 1.1318805959368543, 'learning_rate': 8.672778835289556e-06, 'epoch': 0.26} 26%|██▌ | 8922/34278 [9:55:08<23:11:45, 3.29s/it] 26%|██▌ | 8923/34278 [9:55:11<22:24:51, 3.18s/it] {'loss': 0.1549, 'grad_norm': 0.8774068696043981, 'learning_rate': 8.672458249491143e-06, 'epoch': 0.26} 26%|██▌ | 8923/34278 [9:55:11<22:24:51, 3.18s/it] 26%|██▌ | 8924/34278 [9:55:15<23:29:04, 3.33s/it] {'loss': 0.1942, 'grad_norm': 0.9910230944103623, 'learning_rate': 8.672137630905897e-06, 'epoch': 0.26} 26%|██▌ | 8924/34278 [9:55:15<23:29:04, 3.33s/it] 26%|██▌ | 8925/34278 [9:55:18<23:00:40, 3.27s/it] {'loss': 0.1459, 'grad_norm': 1.0426900066770715, 'learning_rate': 8.671816979536682e-06, 'epoch': 0.26} 26%|██▌ | 8925/34278 [9:55:18<23:00:40, 3.27s/it] 26%|██▌ | 8926/34278 [9:55:22<24:33:25, 3.49s/it] {'loss': 0.1679, 'grad_norm': 0.872076369217735, 'learning_rate': 8.671496295386363e-06, 'epoch': 0.26} 26%|██▌ | 8926/34278 [9:55:22<24:33:25, 3.49s/it] 26%|██▌ | 8927/34278 [9:55:26<25:29:57, 3.62s/it] {'loss': 0.1801, 'grad_norm': 0.7784845171279845, 'learning_rate': 8.671175578457803e-06, 'epoch': 0.26} 26%|██▌ | 8927/34278 [9:55:26<25:29:57, 3.62s/it] 26%|██▌ | 8928/34278 [9:55:29<25:11:28, 3.58s/it] {'loss': 0.1559, 'grad_norm': 1.1143806082736285, 'learning_rate': 8.670854828753862e-06, 'epoch': 0.26} 26%|██▌ | 8928/34278 [9:55:29<25:11:28, 3.58s/it] 26%|██▌ | 8929/34278 [9:55:36<30:59:14, 4.40s/it] {'loss': 0.1738, 'grad_norm': 0.8231650852474093, 'learning_rate': 8.670534046277405e-06, 'epoch': 0.26} 26%|██▌ | 8929/34278 [9:55:36<30:59:14, 4.40s/it] 26%|██▌ | 8930/34278 [9:55:38<27:42:59, 3.94s/it] {'loss': 0.1518, 'grad_norm': 0.7152319986232941, 'learning_rate': 8.670213231031299e-06, 'epoch': 0.26} 26%|██▌ | 8930/34278 [9:55:38<27:42:59, 3.94s/it] 26%|██▌ | 
8931/34278 [9:55:45<32:11:45, 4.57s/it] {'loss': 0.1901, 'grad_norm': 0.8815074491040681, 'learning_rate': 8.669892383018402e-06, 'epoch': 0.26} 26%|██▌ | 8931/34278 [9:55:45<32:11:45, 4.57s/it] 26%|██▌ | 8932/34278 [9:55:48<29:16:54, 4.16s/it] {'loss': 0.1405, 'grad_norm': 0.9281452692370107, 'learning_rate': 8.669571502241582e-06, 'epoch': 0.26} 26%|██▌ | 8932/34278 [9:55:48<29:16:54, 4.16s/it] 26%|██▌ | 8933/34278 [9:55:54<33:19:48, 4.73s/it] {'loss': 0.1507, 'grad_norm': 0.843511794898961, 'learning_rate': 8.669250588703706e-06, 'epoch': 0.26} 26%|██▌ | 8933/34278 [9:55:54<33:19:48, 4.73s/it] 26%|██▌ | 8934/34278 [9:55:57<30:34:24, 4.34s/it] {'loss': 0.1737, 'grad_norm': 0.8313034900109824, 'learning_rate': 8.668929642407634e-06, 'epoch': 0.26} 26%|██▌ | 8934/34278 [9:55:57<30:34:24, 4.34s/it] 26%|██▌ | 8935/34278 [9:56:00<28:15:14, 4.01s/it] {'loss': 0.1558, 'grad_norm': 0.8363379483768809, 'learning_rate': 8.668608663356237e-06, 'epoch': 0.26} 26%|██▌ | 8935/34278 [9:56:00<28:15:14, 4.01s/it] 26%|██▌ | 8936/34278 [9:56:05<30:06:40, 4.28s/it] {'loss': 0.1657, 'grad_norm': 0.8525148222277982, 'learning_rate': 8.668287651552377e-06, 'epoch': 0.26} 26%|██▌ | 8936/34278 [9:56:05<30:06:40, 4.28s/it] 26%|██▌ | 8937/34278 [9:56:08<27:27:01, 3.90s/it] {'loss': 0.1445, 'grad_norm': 0.8191935090884097, 'learning_rate': 8.66796660699892e-06, 'epoch': 0.26} 26%|██▌ | 8937/34278 [9:56:08<27:27:01, 3.90s/it] 26%|██▌ | 8938/34278 [9:56:12<26:25:38, 3.75s/it] {'loss': 0.1542, 'grad_norm': 0.8618948197418193, 'learning_rate': 8.667645529698731e-06, 'epoch': 0.26} 26%|██▌ | 8938/34278 [9:56:12<26:25:38, 3.75s/it] 26%|██▌ | 8939/34278 [9:56:15<25:34:37, 3.63s/it] {'loss': 0.1305, 'grad_norm': 0.8259232628387545, 'learning_rate': 8.66732441965468e-06, 'epoch': 0.26} 26%|██▌ | 8939/34278 [9:56:15<25:34:37, 3.63s/it] 26%|██▌ | 8940/34278 [9:56:18<24:43:23, 3.51s/it] {'loss': 0.1302, 'grad_norm': 0.710543349995988, 'learning_rate': 8.667003276869632e-06, 'epoch': 0.26} 26%|██▌ | 
8940/34278 [9:56:18<24:43:23, 3.51s/it] 26%|██▌ | 8941/34278 [9:56:25<31:08:03, 4.42s/it] {'loss': 0.1849, 'grad_norm': 0.8812651562659243, 'learning_rate': 8.666682101346456e-06, 'epoch': 0.26} 26%|██▌ | 8941/34278 [9:56:25<31:08:03, 4.42s/it] 26%|██▌ | 8942/34278 [9:56:31<34:54:00, 4.96s/it] {'loss': 0.172, 'grad_norm': 0.8899631092618145, 'learning_rate': 8.666360893088015e-06, 'epoch': 0.26} 26%|██▌ | 8942/34278 [9:56:31<34:54:00, 4.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 26%|██▌ | 8943/34278 [9:56:34<30:57:38, 4.40s/it] {'loss': 0.1272, 'grad_norm': 0.8699659262285109, 'learning_rate': 8.666039652097178e-06, 'epoch': 0.26} 26%|██▌ | 8943/34278 [9:56:34<30:57:38, 4.40s/it] 26%|██▌ | 8944/34278 [9:56:39<32:34:11, 4.63s/it] {'loss': 0.1593, 'grad_norm': 0.8170363413887268, 'learning_rate': 8.665718378376816e-06, 'epoch': 0.26} 26%|██▌ | 8944/34278 [9:56:39<32:34:11, 4.63s/it] 26%|██▌ | 8945/34278 [9:56:43<30:01:24, 4.27s/it] {'loss': 0.1715, 'grad_norm': 0.980424642282014, 'learning_rate': 8.665397071929796e-06, 'epoch': 0.26} 26%|██▌ | 8945/34278 [9:56:43<30:01:24, 4.27s/it] 26%|██▌ | 8946/34278 [9:56:48<32:45:41, 4.66s/it] {'loss': 0.1482, 'grad_norm': 0.8161710016929798, 'learning_rate': 8.665075732758985e-06, 'epoch': 0.26} 26%|██▌ | 8946/34278 [9:56:48<32:45:41, 4.66s/it] 26%|██▌ | 8947/34278 [9:56:54<34:22:37, 4.89s/it] {'loss': 0.1674, 'grad_norm': 0.8648771296327689, 'learning_rate': 8.664754360867252e-06, 'epoch': 0.26} 26%|██▌ | 8947/34278 [9:56:54<34:22:37, 4.89s/it] 26%|██▌ | 8948/34278 [9:56:57<30:18:10, 4.31s/it] {'loss': 0.1571, 'grad_norm': 1.1281315541060049, 'learning_rate': 8.664432956257468e-06, 'epoch': 0.26} 26%|██▌ | 8948/34278 [9:56:57<30:18:10, 4.31s/it] 26%|██▌ | 8949/34278 [9:57:00<27:20:08, 3.89s/it] {'loss': 0.1841, 'grad_norm': 0.9988097399923852, 
'learning_rate': 8.664111518932501e-06, 'epoch': 0.26} 26%|██▌ | 8949/34278 [9:57:00<27:20:08, 3.89s/it] 26%|██▌ | 8950/34278 [9:57:06<31:49:21, 4.52s/it] {'loss': 0.1608, 'grad_norm': 0.9501992764518179, 'learning_rate': 8.663790048895222e-06, 'epoch': 0.26} 26%|██▌ | 8950/34278 [9:57:06<31:49:21, 4.52s/it] 26%|██▌ | 8951/34278 [9:57:09<30:10:48, 4.29s/it] {'loss': 0.1523, 'grad_norm': 0.9569829937773743, 'learning_rate': 8.6634685461485e-06, 'epoch': 0.26} 26%|██▌ | 8951/34278 [9:57:09<30:10:48, 4.29s/it] 26%|██▌ | 8952/34278 [9:57:12<27:31:04, 3.91s/it] {'loss': 0.1558, 'grad_norm': 1.122425336542205, 'learning_rate': 8.663147010695202e-06, 'epoch': 0.26} 26%|██▌ | 8952/34278 [9:57:12<27:31:04, 3.91s/it] 26%|██▌ | 8953/34278 [9:57:17<29:07:41, 4.14s/it] {'loss': 0.1465, 'grad_norm': 1.0707956436614317, 'learning_rate': 8.662825442538206e-06, 'epoch': 0.26} 26%|██▌ | 8953/34278 [9:57:17<29:07:41, 4.14s/it] 26%|██▌ | 8954/34278 [9:57:21<27:38:16, 3.93s/it] {'loss': 0.1436, 'grad_norm': 0.6557527786271601, 'learning_rate': 8.662503841680377e-06, 'epoch': 0.26} 26%|██▌ | 8954/34278 [9:57:21<27:38:16, 3.93s/it] 26%|██▌ | 8955/34278 [9:57:24<26:59:15, 3.84s/it] {'loss': 0.1681, 'grad_norm': 1.0398305195766397, 'learning_rate': 8.662182208124588e-06, 'epoch': 0.26} 26%|██▌ | 8955/34278 [9:57:24<26:59:15, 3.84s/it] 26%|██▌ | 8956/34278 [9:57:27<25:18:22, 3.60s/it] {'loss': 0.172, 'grad_norm': 1.0251881764708946, 'learning_rate': 8.661860541873712e-06, 'epoch': 0.26} 26%|██▌ | 8956/34278 [9:57:27<25:18:22, 3.60s/it] 26%|██▌ | 8957/34278 [9:57:31<24:38:52, 3.50s/it] {'loss': 0.166, 'grad_norm': 1.1542249340298345, 'learning_rate': 8.661538842930617e-06, 'epoch': 0.26} 26%|██▌ | 8957/34278 [9:57:31<24:38:52, 3.50s/it] 26%|██▌ | 8958/34278 [9:57:34<23:36:28, 3.36s/it] {'loss': 0.1572, 'grad_norm': 0.8961954416241569, 'learning_rate': 8.661217111298179e-06, 'epoch': 0.26} 26%|██▌ | 8958/34278 [9:57:34<23:36:28, 3.36s/it] 26%|██▌ | 8959/34278 [9:57:37<23:39:04, 3.36s/it] 
{'loss': 0.1571, 'grad_norm': 1.0721378267867974, 'learning_rate': 8.660895346979268e-06, 'epoch': 0.26} 26%|██▌ | 8959/34278 [9:57:37<23:39:04, 3.36s/it] 26%|██▌ | 8960/34278 [9:57:43<29:29:13, 4.19s/it] {'loss': 0.1747, 'grad_norm': 1.233492676885101, 'learning_rate': 8.660573549976755e-06, 'epoch': 0.26} 26%|██▌ | 8960/34278 [9:57:43<29:29:13, 4.19s/it] 26%|██▌ | 8961/34278 [9:57:48<31:31:48, 4.48s/it] {'loss': 0.1454, 'grad_norm': 0.7274276762163758, 'learning_rate': 8.66025172029352e-06, 'epoch': 0.26} 26%|██▌ | 8961/34278 [9:57:48<31:31:48, 4.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 26%|██▌ | 8962/34278 [9:57:51<28:42:37, 4.08s/it] {'loss': 0.1604, 'grad_norm': 0.8377190903857192, 'learning_rate': 8.65992985793243e-06, 'epoch': 0.26} 26%|██▌ | 8962/34278 [9:57:51<28:42:37, 4.08s/it] 26%|██▌ | 8963/34278 [9:57:54<26:33:28, 3.78s/it] {'loss': 0.1585, 'grad_norm': 1.08457859515976, 'learning_rate': 8.659607962896356e-06, 'epoch': 0.26} 26%|██▌ | 8963/34278 [9:57:54<26:33:28, 3.78s/it] 26%|██▌ | 8964/34278 [9:57:57<24:44:55, 3.52s/it] {'loss': 0.1327, 'grad_norm': 0.692446274962691, 'learning_rate': 8.65928603518818e-06, 'epoch': 0.26} 26%|██▌ | 8964/34278 [9:57:57<24:44:55, 3.52s/it] 26%|██▌ | 8965/34278 [9:58:00<24:00:18, 3.41s/it] {'loss': 0.1535, 'grad_norm': 0.9453876214076944, 'learning_rate': 8.65896407481077e-06, 'epoch': 0.26} 26%|██▌ | 8965/34278 [9:58:01<24:00:18, 3.41s/it] 26%|██▌ | 8966/34278 [9:58:05<25:47:08, 3.67s/it] {'loss': 0.1611, 'grad_norm': 0.8171866133815451, 'learning_rate': 8.658642081767003e-06, 'epoch': 0.26} 26%|██▌ | 8966/34278 [9:58:05<25:47:08, 3.67s/it] 26%|██▌ | 8967/34278 [9:58:08<24:19:18, 3.46s/it] {'loss': 0.1756, 'grad_norm': 0.7267011421363825, 'learning_rate': 8.658320056059752e-06, 'epoch': 0.26} 26%|██▌ | 8967/34278 [9:58:08<24:19:18, 
3.46s/it] 26%|██▌ | 8968/34278 [9:58:11<23:31:46, 3.35s/it] {'loss': 0.1624, 'grad_norm': 0.6845048571919239, 'learning_rate': 8.657997997691893e-06, 'epoch': 0.26} 26%|██▌ | 8968/34278 [9:58:11<23:31:46, 3.35s/it] 26%|██▌ | 8969/34278 [9:58:14<24:11:32, 3.44s/it] {'loss': 0.1786, 'grad_norm': 0.8526214663746644, 'learning_rate': 8.657675906666301e-06, 'epoch': 0.26} 26%|██▌ | 8969/34278 [9:58:14<24:11:32, 3.44s/it] 26%|██▌ | 8970/34278 [9:58:17<23:02:04, 3.28s/it] {'loss': 0.144, 'grad_norm': 0.7279969395521828, 'learning_rate': 8.657353782985853e-06, 'epoch': 0.26} 26%|██▌ | 8970/34278 [9:58:17<23:02:04, 3.28s/it] 26%|██▌ | 8971/34278 [9:58:21<23:07:50, 3.29s/it] {'loss': 0.2039, 'grad_norm': 6.897200323623024, 'learning_rate': 8.657031626653423e-06, 'epoch': 0.26} 26%|██▌ | 8971/34278 [9:58:21<23:07:50, 3.29s/it] 26%|██▌ | 8972/34278 [9:58:27<29:41:46, 4.22s/it] {'loss': 0.1441, 'grad_norm': 0.9029048669461691, 'learning_rate': 8.656709437671886e-06, 'epoch': 0.26} 26%|██▌ | 8972/34278 [9:58:27<29:41:46, 4.22s/it] 26%|██▌ | 8973/34278 [9:58:31<28:29:58, 4.05s/it] {'loss': 0.1268, 'grad_norm': 0.793420907658017, 'learning_rate': 8.656387216044122e-06, 'epoch': 0.26} 26%|██▌ | 8973/34278 [9:58:31<28:29:58, 4.05s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f16a37a5f80>
Failed to fetch sample 2696825. Exception: cannot identify image file <_io.BytesIO object at 0x7f16a37a5f80>
26%|██▌ | 8974/34278 [9:58:34<26:51:48, 3.82s/it] {'loss': 0.1498, 'grad_norm': 0.7511495038593089, 'learning_rate': 8.656064961773006e-06, 'epoch': 0.26} 26%|██▌ | 8974/34278 [9:58:34<26:51:48, 3.82s/it] 26%|██▌ | 8975/34278 [9:58:37<26:03:24, 3.71s/it] {'loss': 0.1448, 'grad_norm': 0.9088885928694428, 'learning_rate': 8.655742674861414e-06, 'epoch': 0.26} 26%|██▌ | 8975/34278 [9:58:37<26:03:24, 3.71s/it] 26%|██▌ | 8976/34278 [9:58:43<29:31:54, 4.20s/it] {'loss': 0.1746, 'grad_norm': 0.6745868296167258, 'learning_rate': 8.655420355312224e-06, 'epoch': 0.26} 26%|██▌ | 8976/34278 [9:58:43<29:31:54, 4.20s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 26%|██▌ | 8977/34278 [9:58:46<28:19:41, 4.03s/it] {'loss': 0.1735, 'grad_norm': 1.1519673108011803, 'learning_rate': 8.655098003128312e-06, 'epoch': 0.26} 26%|██▌ | 8977/34278 [9:58:46<28:19:41, 4.03s/it] 26%|██▌ | 8978/34278 [9:58:50<27:53:04, 3.97s/it] {'loss': 0.1455, 'grad_norm': 1.231659580622008, 'learning_rate': 8.654775618312561e-06, 'epoch': 0.26} 26%|██▌ | 8978/34278 [9:58:50<27:53:04, 3.97s/it] 26%|██▌ | 8979/34278 [9:58:55<29:30:35, 4.20s/it] {'loss': 0.1909, 'grad_norm': 0.7365798841092727, 'learning_rate': 8.654453200867842e-06, 'epoch': 0.26} 26%|██▌ | 8979/34278 [9:58:55<29:30:35, 4.20s/it] 26%|██▌ | 8980/34278 [9:58:58<27:45:19, 3.95s/it] {'loss': 0.156, 'grad_norm': 0.8951672978577805, 'learning_rate': 8.654130750797041e-06, 'epoch': 0.26} 26%|██▌ | 8980/34278 [9:58:58<27:45:19, 3.95s/it] 26%|██▌ | 8981/34278 [9:59:01<25:26:21, 3.62s/it] {'loss': 0.1511, 'grad_norm': 1.0670266903104302, 'learning_rate': 8.65380826810303e-06, 'epoch': 0.26} 26%|██▌ | 8981/34278 [9:59:01<25:26:21, 3.62s/it] 26%|██▌ | 8982/34278 [9:59:04<24:39:41, 3.51s/it] {'loss': 0.1423, 'grad_norm': 0.7877643841195099, 'learning_rate': 8.653485752788692e-06, 'epoch': 0.26} 26%|██▌ | 8982/34278 [9:59:04<24:39:41, 3.51s/it] 26%|██▌ | 8983/34278 [9:59:10<29:47:21, 4.24s/it] {'loss': 0.1511, 'grad_norm': 0.863593991044593, 'learning_rate': 8.653163204856906e-06, 'epoch': 0.26} 26%|██▌ | 8983/34278 [9:59:10<29:47:21, 4.24s/it] 26%|██▌ | 8984/34278 [9:59:14<27:29:32, 3.91s/it] {'loss': 0.1583, 'grad_norm': 0.8138077050375636, 'learning_rate': 8.65284062431055e-06, 'epoch': 0.26} 26%|██▌ | 8984/34278 [9:59:14<27:29:32, 3.91s/it] 26%|██▌ | 8985/34278 [9:59:17<25:32:27, 3.64s/it] {'loss': 0.1727, 'grad_norm': 0.7585225714676208, 'learning_rate': 8.652518011152507e-06, 'epoch': 0.26} 26%|██▌ | 8985/34278 [9:59:17<25:32:27, 3.64s/it] 26%|██▌ | 8986/34278 [9:59:20<24:31:07, 3.49s/it] {'loss': 0.1228, 'grad_norm': 0.9090473772890441, 'learning_rate': 
8.652195365385652e-06, 'epoch': 0.26} 26%|██▌ | 8986/34278 [9:59:20<24:31:07, 3.49s/it] 26%|██▌ | 8987/34278 [9:59:23<24:17:22, 3.46s/it] {'loss': 0.1982, 'grad_norm': 0.7627118636691527, 'learning_rate': 8.651872687012871e-06, 'epoch': 0.26} 26%|██▌ | 8987/34278 [9:59:23<24:17:22, 3.46s/it] 26%|██▌ | 8988/34278 [9:59:26<24:00:18, 3.42s/it] {'loss': 0.1645, 'grad_norm': 0.7793849449518128, 'learning_rate': 8.651549976037042e-06, 'epoch': 0.26} 26%|██▌ | 8988/34278 [9:59:26<24:00:18, 3.42s/it] 26%|██▌ | 8989/34278 [9:59:29<22:36:41, 3.22s/it] {'loss': 0.1493, 'grad_norm': 0.6968624570367113, 'learning_rate': 8.651227232461045e-06, 'epoch': 0.26} 26%|██▌ | 8989/34278 [9:59:29<22:36:41, 3.22s/it] 26%|██▌ | 8990/34278 [9:59:32<22:21:04, 3.18s/it] {'loss': 0.197, 'grad_norm': 0.8330253397591032, 'learning_rate': 8.650904456287765e-06, 'epoch': 0.26} 26%|██▌ | 8990/34278 [9:59:32<22:21:04, 3.18s/it] 26%|██▌ | 8991/34278 [9:59:35<21:51:32, 3.11s/it] {'loss': 0.1647, 'grad_norm': 0.9423238449724195, 'learning_rate': 8.65058164752008e-06, 'epoch': 0.26} 26%|██▌ | 8991/34278 [9:59:35<21:51:32, 3.11s/it] 26%|██▌ | 8992/34278 [9:59:38<21:19:32, 3.04s/it] {'loss': 0.1633, 'grad_norm': 0.7795221338227699, 'learning_rate': 8.650258806160874e-06, 'epoch': 0.26} 26%|██▌ | 8992/34278 [9:59:38<21:19:32, 3.04s/it] 26%|██▌ | 8993/34278 [9:59:41<21:37:25, 3.08s/it] {'loss': 0.1692, 'grad_norm': 0.9439610475194946, 'learning_rate': 8.649935932213029e-06, 'epoch': 0.26} 26%|██▌ | 8993/34278 [9:59:41<21:37:25, 3.08s/it] 26%|██▌ | 8994/34278 [9:59:44<20:56:55, 2.98s/it] {'loss': 0.1469, 'grad_norm': 1.2520198627267944, 'learning_rate': 8.649613025679428e-06, 'epoch': 0.26} 26%|██▌ | 8994/34278 [9:59:44<20:56:55, 2.98s/it] 26%|██▌ | 8995/34278 [9:59:49<25:34:24, 3.64s/it] {'loss': 0.1635, 'grad_norm': 0.8153986117003251, 'learning_rate': 8.649290086562952e-06, 'epoch': 0.26} 26%|██▌ | 8995/34278 [9:59:49<25:34:24, 3.64s/it] 26%|██▌ | 8996/34278 [9:59:54<28:00:49, 3.99s/it] {'loss': 0.1399, 
'grad_norm': 0.9706801494347831, 'learning_rate': 8.648967114866485e-06, 'epoch': 0.26} 26%|██▌ | 8996/34278 [9:59:54<28:00:49, 3.99s/it] 26%|██▌ | 8997/34278 [9:59:57<26:43:53, 3.81s/it] {'loss': 0.158, 'grad_norm': 1.0048217905972299, 'learning_rate': 8.648644110592912e-06, 'epoch': 0.26} 26%|██▌ | 8997/34278 [9:59:57<26:43:53, 3.81s/it] 26%|██▋ | 8998/34278 [10:00:01<25:59:32, 3.70s/it] {'loss': 0.1535, 'grad_norm': 0.6717974086650561, 'learning_rate': 8.648321073745113e-06, 'epoch': 0.26} 26%|██▋ | 8998/34278 [10:00:01<25:59:32, 3.70s/it] 26%|██▋ | 8999/34278 [10:00:04<24:07:45, 3.44s/it] {'loss': 0.1781, 'grad_norm': 1.2447905270372501, 'learning_rate': 8.647998004325977e-06, 'epoch': 0.26} 26%|██▋ | 8999/34278 [10:00:04<24:07:45, 3.44s/it] 26%|██▋ | 9000/34278 [10:00:08<26:52:15, 3.83s/it] {'loss': 0.158, 'grad_norm': 0.9021571185221067, 'learning_rate': 8.647674902338384e-06, 'epoch': 0.26} 26%|██▋ | 9000/34278 [10:00:08<26:52:15, 3.83s/it] 26%|██▋ | 9001/34278 [10:00:13<27:42:02, 3.95s/it] {'loss': 0.1625, 'grad_norm': 0.7515560073859457, 'learning_rate': 8.647351767785221e-06, 'epoch': 0.26} 26%|██▋ | 9001/34278 [10:00:13<27:42:02, 3.95s/it] 26%|██▋ | 9002/34278 [10:00:17<29:32:21, 4.21s/it] {'loss': 0.1646, 'grad_norm': 0.8906606297136825, 'learning_rate': 8.647028600669373e-06, 'epoch': 0.26} 26%|██▋ | 9002/34278 [10:00:17<29:32:21, 4.21s/it] 26%|██▋ | 9003/34278 [10:00:20<26:57:35, 3.84s/it] {'loss': 0.1549, 'grad_norm': 0.7556433769020066, 'learning_rate': 8.646705400993722e-06, 'epoch': 0.26} 26%|██▋ | 9003/34278 [10:00:20<26:57:35, 3.84s/it] 26%|██▋ | 9004/34278 [10:00:25<29:27:55, 4.20s/it] {'loss': 0.1482, 'grad_norm': 0.7594058956556977, 'learning_rate': 8.646382168761159e-06, 'epoch': 0.26} 26%|██▋ | 9004/34278 [10:00:25<29:27:55, 4.20s/it] 26%|██▋ | 9005/34278 [10:00:29<27:27:53, 3.91s/it] {'loss': 0.1623, 'grad_norm': 0.7329656671449547, 'learning_rate': 8.646058903974563e-06, 'epoch': 0.26} 26%|██▋ | 9005/34278 [10:00:29<27:27:53, 3.91s/it] 
26%|██▋ | 9006/34278 [10:00:35<32:14:23, 4.59s/it] {'loss': 0.1415, 'grad_norm': 0.7762564542528988, 'learning_rate': 8.645735606636825e-06, 'epoch': 0.26} 26%|██▋ | 9006/34278 [10:00:35<32:14:23, 4.59s/it] 26%|██▋ | 9007/34278 [10:00:38<28:32:18, 4.07s/it] {'loss': 0.1517, 'grad_norm': 0.9856730080145799, 'learning_rate': 8.645412276750829e-06, 'epoch': 0.26} 26%|██▋ | 9007/34278 [10:00:38<28:32:18, 4.07s/it] 26%|██▋ | 9008/34278 [10:00:43<30:23:16, 4.33s/it] {'loss': 0.1643, 'grad_norm': 0.7124863843563982, 'learning_rate': 8.645088914319464e-06, 'epoch': 0.26} 26%|██▋ | 9008/34278 [10:00:43<30:23:16, 4.33s/it] 26%|██▋ | 9009/34278 [10:00:46<27:22:17, 3.90s/it] {'loss': 0.1647, 'grad_norm': 0.9274028579170007, 'learning_rate': 8.644765519345615e-06, 'epoch': 0.26} 26%|██▋ | 9009/34278 [10:00:46<27:22:17, 3.90s/it] 26%|██▋ | 9010/34278 [10:00:50<28:32:27, 4.07s/it] {'loss': 0.1537, 'grad_norm': 0.9549257484451333, 'learning_rate': 8.644442091832168e-06, 'epoch': 0.26} 26%|██▋ | 9010/34278 [10:00:50<28:32:27, 4.07s/it] 26%|██▋ | 9011/34278 [10:00:53<26:53:28, 3.83s/it] {'loss': 0.1314, 'grad_norm': 0.6818318593964277, 'learning_rate': 8.644118631782014e-06, 'epoch': 0.26} 26%|██▋ | 9011/34278 [10:00:53<26:53:28, 3.83s/it] 26%|██▋ | 9012/34278 [10:00:57<26:28:06, 3.77s/it] {'loss': 0.1442, 'grad_norm': 0.8348523017673742, 'learning_rate': 8.643795139198037e-06, 'epoch': 0.26} 26%|██▋ | 9012/34278 [10:00:57<26:28:06, 3.77s/it] 26%|██▋ | 9013/34278 [10:01:03<31:05:18, 4.43s/it] {'loss': 0.1873, 'grad_norm': 0.8854657649503088, 'learning_rate': 8.643471614083127e-06, 'epoch': 0.26} 26%|██▋ | 9013/34278 [10:01:03<31:05:18, 4.43s/it] 26%|██▋ | 9014/34278 [10:01:09<34:04:59, 4.86s/it] {'loss': 0.1267, 'grad_norm': 0.7212956182481403, 'learning_rate': 8.643148056440174e-06, 'epoch': 0.26} 26%|██▋ | 9014/34278 [10:01:09<34:04:59, 4.86s/it] 26%|██▋ | 9015/34278 [10:01:13<33:13:45, 4.74s/it] {'loss': 0.1537, 'grad_norm': 0.7784757188427204, 'learning_rate': 
8.642824466272065e-06, 'epoch': 0.26} 26%|██▋ | 9015/34278 [10:01:13<33:13:45, 4.74s/it] 26%|██▋ | 9016/34278 [10:01:16<29:30:28, 4.21s/it] {'loss': 0.1672, 'grad_norm': 0.9231990584958093, 'learning_rate': 8.642500843581687e-06, 'epoch': 0.26} 26%|██▋ | 9016/34278 [10:01:16<29:30:28, 4.21s/it] 26%|██▋ | 9017/34278 [10:01:19<27:34:10, 3.93s/it] {'loss': 0.1497, 'grad_norm': 0.8723099299784459, 'learning_rate': 8.64217718837193e-06, 'epoch': 0.26} 26%|██▋ | 9017/34278 [10:01:19<27:34:10, 3.93s/it] 26%|██▋ | 9018/34278 [10:01:23<26:25:16, 3.77s/it] {'loss': 0.1549, 'grad_norm': 0.9186162309499485, 'learning_rate': 8.641853500645685e-06, 'epoch': 0.26} 26%|██▋ | 9018/34278 [10:01:23<26:25:16, 3.77s/it] 26%|██▋ | 9019/34278 [10:01:26<25:18:09, 3.61s/it] {'loss': 0.1778, 'grad_norm': 0.8548084882452498, 'learning_rate': 8.641529780405843e-06, 'epoch': 0.26} 26%|██▋ | 9019/34278 [10:01:26<25:18:09, 3.61s/it] 26%|██▋ | 9020/34278 [10:01:32<30:27:10, 4.34s/it] {'loss': 0.1745, 'grad_norm': 0.9450437817969356, 'learning_rate': 8.641206027655293e-06, 'epoch': 0.26} 26%|██▋ | 9020/34278 [10:01:32<30:27:10, 4.34s/it] 26%|██▋ | 9021/34278 [10:01:38<33:40:24, 4.80s/it] {'loss': 0.1454, 'grad_norm': 0.8052277640189358, 'learning_rate': 8.640882242396922e-06, 'epoch': 0.26} 26%|██▋ | 9021/34278 [10:01:38<33:40:24, 4.80s/it] 26%|██▋ | 9022/34278 [10:01:41<30:16:01, 4.31s/it] {'loss': 0.1511, 'grad_norm': 0.9179526845228608, 'learning_rate': 8.640558424633625e-06, 'epoch': 0.26} 26%|██▋ | 9022/34278 [10:01:41<30:16:01, 4.31s/it] 26%|██▋ | 9023/34278 [10:01:44<27:51:54, 3.97s/it] {'loss': 0.1674, 'grad_norm': 0.9192983316214682, 'learning_rate': 8.640234574368292e-06, 'epoch': 0.26} 26%|██▋ | 9023/34278 [10:01:44<27:51:54, 3.97s/it] 26%|██▋ | 9024/34278 [10:01:47<25:21:08, 3.61s/it] {'loss': 0.1589, 'grad_norm': 0.9721749546684811, 'learning_rate': 8.639910691603815e-06, 'epoch': 0.26} 26%|██▋ | 9024/34278 [10:01:47<25:21:08, 3.61s/it] 26%|██▋ | 9025/34278 [10:01:53<30:20:28, 
4.33s/it] {'loss': 0.1858, 'grad_norm': 0.8077257913425597, 'learning_rate': 8.63958677634308e-06, 'epoch': 0.26} 26%|██▋ | 9025/34278 [10:01:53<30:20:28, 4.33s/it] 26%|██▋ | 9026/34278 [10:01:56<28:21:05, 4.04s/it] {'loss': 0.1564, 'grad_norm': 0.738405800801672, 'learning_rate': 8.639262828588988e-06, 'epoch': 0.26} 26%|██▋ | 9026/34278 [10:01:56<28:21:05, 4.04s/it] 26%|██▋ | 9027/34278 [10:02:00<26:31:03, 3.78s/it] {'loss': 0.1719, 'grad_norm': 0.7852468710522374, 'learning_rate': 8.638938848344422e-06, 'epoch': 0.26} 26%|██▋ | 9027/34278 [10:02:00<26:31:03, 3.78s/it] 26%|██▋ | 9028/34278 [10:02:02<24:31:56, 3.50s/it] {'loss': 0.1497, 'grad_norm': 0.7730085408186513, 'learning_rate': 8.63861483561228e-06, 'epoch': 0.26} 26%|██▋ | 9028/34278 [10:02:03<24:31:56, 3.50s/it] 26%|██▋ | 9029/34278 [10:02:05<23:27:20, 3.34s/it] {'loss': 0.167, 'grad_norm': 0.9883000188980213, 'learning_rate': 8.638290790395453e-06, 'epoch': 0.26} 26%|██▋ | 9029/34278 [10:02:05<23:27:20, 3.34s/it] 26%|██▋ | 9030/34278 [10:02:09<23:08:30, 3.30s/it] {'loss': 0.1643, 'grad_norm': 0.8436573459162998, 'learning_rate': 8.637966712696837e-06, 'epoch': 0.26} 26%|██▋ | 9030/34278 [10:02:09<23:08:30, 3.30s/it] 26%|██▋ | 9031/34278 [10:02:13<25:24:33, 3.62s/it] {'loss': 0.1694, 'grad_norm': 0.7092549122404046, 'learning_rate': 8.637642602519321e-06, 'epoch': 0.26} 26%|██▋ | 9031/34278 [10:02:13<25:24:33, 3.62s/it] 26%|██▋ | 9032/34278 [10:02:16<24:51:26, 3.54s/it] {'loss': 0.1352, 'grad_norm': 1.0446338585028105, 'learning_rate': 8.6373184598658e-06, 'epoch': 0.26} 26%|██▋ | 9032/34278 [10:02:16<24:51:26, 3.54s/it] 26%|██▋ | 9033/34278 [10:02:20<25:35:02, 3.65s/it] {'loss': 0.1584, 'grad_norm': 0.7027271973633433, 'learning_rate': 8.636994284739167e-06, 'epoch': 0.26} 26%|██▋ | 9033/34278 [10:02:20<25:35:02, 3.65s/it] 26%|██▋ | 9034/34278 [10:02:24<25:50:19, 3.68s/it] {'loss': 0.1623, 'grad_norm': 0.9778596732464395, 'learning_rate': 8.636670077142319e-06, 'epoch': 0.26} 26%|██▋ | 9034/34278 
[10:02:24<25:50:19, 3.68s/it] 26%|██▋ | 9035/34278 [10:02:27<24:22:41, 3.48s/it] {'loss': 0.1393, 'grad_norm': 0.8677475585979214, 'learning_rate': 8.636345837078149e-06, 'epoch': 0.26} 26%|██▋ | 9035/34278 [10:02:27<24:22:41, 3.48s/it] 26%|██▋ | 9036/34278 [10:02:31<24:17:28, 3.46s/it] {'loss': 0.1377, 'grad_norm': 0.8155722533322951, 'learning_rate': 8.63602156454955e-06, 'epoch': 0.26} 26%|██▋ | 9036/34278 [10:02:31<24:17:28, 3.46s/it] 26%|██▋ | 9037/34278 [10:02:33<22:53:16, 3.26s/it] {'loss': 0.1707, 'grad_norm': 0.9343654193152432, 'learning_rate': 8.63569725955942e-06, 'epoch': 0.26} 26%|██▋ | 9037/34278 [10:02:33<22:53:16, 3.26s/it] 26%|██▋ | 9038/34278 [10:02:36<22:08:42, 3.16s/it] {'loss': 0.1643, 'grad_norm': 0.920493289169441, 'learning_rate': 8.63537292211065e-06, 'epoch': 0.26} 26%|██▋ | 9038/34278 [10:02:36<22:08:42, 3.16s/it] 26%|██▋ | 9039/34278 [10:02:40<23:01:23, 3.28s/it] {'loss': 0.1527, 'grad_norm': 1.1822121690421965, 'learning_rate': 8.63504855220614e-06, 'epoch': 0.26} 26%|██▋ | 9039/34278 [10:02:40<23:01:23, 3.28s/it] 26%|██▋ | 9040/34278 [10:02:45<26:48:45, 3.82s/it] {'loss': 0.1497, 'grad_norm': 0.8324666194814841, 'learning_rate': 8.634724149848785e-06, 'epoch': 0.26} 26%|██▋ | 9040/34278 [10:02:45<26:48:45, 3.82s/it] 26%|██▋ | 9041/34278 [10:02:51<31:20:42, 4.47s/it] {'loss': 0.1547, 'grad_norm': 0.8232415364228624, 'learning_rate': 8.634399715041479e-06, 'epoch': 0.26} 26%|██▋ | 9041/34278 [10:02:51<31:20:42, 4.47s/it] 26%|██▋ | 9042/34278 [10:02:56<32:23:36, 4.62s/it] {'loss': 0.1638, 'grad_norm': 1.0870219534930903, 'learning_rate': 8.634075247787121e-06, 'epoch': 0.26} 26%|██▋ | 9042/34278 [10:02:56<32:23:36, 4.62s/it] 26%|██▋ | 9043/34278 [10:02:59<29:33:29, 4.22s/it] {'loss': 0.1396, 'grad_norm': 0.8903388992881184, 'learning_rate': 8.633750748088608e-06, 'epoch': 0.26} 26%|██▋ | 9043/34278 [10:02:59<29:33:29, 4.22s/it] 26%|██▋ | 9044/34278 [10:03:03<28:44:42, 4.10s/it] {'loss': 0.1733, 'grad_norm': 1.0044263782715226, 
'learning_rate': 8.633426215948833e-06, 'epoch': 0.26} 26%|██▋ | 9044/34278 [10:03:03<28:44:42, 4.10s/it] 26%|██▋ | 9045/34278 [10:03:07<29:18:24, 4.18s/it] {'loss': 0.1823, 'grad_norm': 1.2698529262262845, 'learning_rate': 8.633101651370696e-06, 'epoch': 0.26} 26%|██▋ | 9045/34278 [10:03:07<29:18:24, 4.18s/it] 26%|██▋ | 9046/34278 [10:03:10<26:59:08, 3.85s/it] {'loss': 0.2038, 'grad_norm': 1.0729459899868792, 'learning_rate': 8.632777054357098e-06, 'epoch': 0.26} 26%|██▋ | 9046/34278 [10:03:10<26:59:08, 3.85s/it] 26%|██▋ | 9047/34278 [10:03:14<27:01:35, 3.86s/it] {'loss': 0.163, 'grad_norm': 0.7010983361020153, 'learning_rate': 8.632452424910932e-06, 'epoch': 0.26} 26%|██▋ | 9047/34278 [10:03:14<27:01:35, 3.86s/it] 26%|██▋ | 9048/34278 [10:03:18<25:58:23, 3.71s/it] {'loss': 0.1752, 'grad_norm': 1.0570119784118244, 'learning_rate': 8.632127763035096e-06, 'epoch': 0.26} 26%|██▋ | 9048/34278 [10:03:18<25:58:23, 3.71s/it] 26%|██▋ | 9049/34278 [10:03:21<24:51:05, 3.55s/it] {'loss': 0.1549, 'grad_norm': 1.0045126713576789, 'learning_rate': 8.631803068732493e-06, 'epoch': 0.26} 26%|██▋ | 9049/34278 [10:03:21<24:51:05, 3.55s/it] 26%|██▋ | 9050/34278 [10:03:24<24:46:27, 3.54s/it] {'loss': 0.1407, 'grad_norm': 0.966544410497631, 'learning_rate': 8.631478342006019e-06, 'epoch': 0.26} 26%|██▋ | 9050/34278 [10:03:24<24:46:27, 3.54s/it] 26%|██▋ | 9051/34278 [10:03:28<24:28:45, 3.49s/it] {'loss': 0.1771, 'grad_norm': 0.9947474879539469, 'learning_rate': 8.631153582858571e-06, 'epoch': 0.26} 26%|██▋ | 9051/34278 [10:03:28<24:28:45, 3.49s/it] 26%|██▋ | 9052/34278 [10:03:33<29:15:15, 4.17s/it] {'loss': 0.1617, 'grad_norm': 0.8973227562563453, 'learning_rate': 8.630828791293053e-06, 'epoch': 0.26} 26%|██▋ | 9052/34278 [10:03:33<29:15:15, 4.17s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 26%|██▋ | 9053/34278 [10:03:37<27:24:42, 3.91s/it] {'loss': 0.1372, 'grad_norm': 1.1181779623509178, 'learning_rate': 8.63050396731236e-06, 'epoch': 0.26} 26%|██▋ | 9053/34278 [10:03:37<27:24:42, 3.91s/it] 26%|██▋ | 9054/34278 [10:03:40<25:22:20, 3.62s/it] {'loss': 0.1598, 'grad_norm': 0.8669198873518774, 'learning_rate': 8.630179110919396e-06, 'epoch': 0.26} 26%|██▋ | 9054/34278 [10:03:40<25:22:20, 3.62s/it] 26%|██▋ | 9055/34278 [10:03:45<29:30:20, 4.21s/it] {'loss': 0.148, 'grad_norm': 0.8858988726842243, 'learning_rate': 8.62985422211706e-06, 'epoch': 0.26} 26%|██▋ | 9055/34278 [10:03:45<29:30:20, 4.21s/it] 26%|██▋ | 9056/34278 [10:03:48<26:19:39, 3.76s/it] {'loss': 0.1687, 'grad_norm': 1.0085743706771297, 'learning_rate': 8.629529300908252e-06, 'epoch': 0.26} 26%|██▋ | 9056/34278 [10:03:48<26:19:39, 3.76s/it] 26%|██▋ | 9057/34278 [10:03:51<24:30:11, 3.50s/it] {'loss': 0.1473, 'grad_norm': 0.7803668383702542, 'learning_rate': 8.629204347295871e-06, 'epoch': 0.26} 26%|██▋ | 9057/34278 [10:03:51<24:30:11, 3.50s/it] 26%|██▋ | 9058/34278 [10:03:54<24:15:08, 3.46s/it] {'loss': 0.1715, 'grad_norm': 0.9095071387023369, 'learning_rate': 8.628879361282822e-06, 'epoch': 0.26} 26%|██▋ | 9058/34278 [10:03:54<24:15:08, 3.46s/it] 26%|██▋ | 9059/34278 [10:03:57<23:04:12, 3.29s/it] {'loss': 0.1432, 'grad_norm': 0.7505477609686817, 'learning_rate': 8.628554342872001e-06, 'epoch': 0.26} 26%|██▋ | 9059/34278 [10:03:57<23:04:12, 3.29s/it] 26%|██▋ | 9060/34278 [10:04:01<23:20:10, 3.33s/it] {'loss': 0.1556, 'grad_norm': 0.799754619139009, 'learning_rate': 8.628229292066317e-06, 'epoch': 0.26} 26%|██▋ | 9060/34278 [10:04:01<23:20:10, 3.33s/it] 26%|██▋ | 9061/34278 [10:04:04<23:50:33, 3.40s/it] {'loss': 0.1534, 'grad_norm': 0.8988449170981903, 'learning_rate': 8.627904208868667e-06, 'epoch': 0.26} 26%|██▋ | 9061/34278 [10:04:04<23:50:33, 3.40s/it] 26%|██▋ | 9062/34278 [10:04:07<22:52:06, 3.26s/it] {'loss': 0.144, 'grad_norm': 0.8730500814357809, 
'learning_rate': 8.627579093281954e-06, 'epoch': 0.26} 26%|██▋ | 9062/34278 [10:04:07<22:52:06, 3.26s/it] 26%|██▋ | 9063/34278 [10:04:10<22:49:06, 3.26s/it] {'loss': 0.1594, 'grad_norm': 0.8647646197225216, 'learning_rate': 8.62725394530908e-06, 'epoch': 0.26} 26%|██▋ | 9063/34278 [10:04:10<22:49:06, 3.26s/it] 26%|██▋ | 9064/34278 [10:04:13<21:53:42, 3.13s/it] {'loss': 0.1342, 'grad_norm': 0.704882418996132, 'learning_rate': 8.62692876495295e-06, 'epoch': 0.26} 26%|██▋ | 9064/34278 [10:04:13<21:53:42, 3.13s/it] 26%|██▋ | 9065/34278 [10:04:17<23:12:06, 3.31s/it] {'loss': 0.146, 'grad_norm': 0.8287469086871684, 'learning_rate': 8.626603552216463e-06, 'epoch': 0.26} 26%|██▋ | 9065/34278 [10:04:17<23:12:06, 3.31s/it] 26%|██▋ | 9066/34278 [10:04:23<28:54:57, 4.13s/it] {'loss': 0.1673, 'grad_norm': 0.7172902656043784, 'learning_rate': 8.626278307102527e-06, 'epoch': 0.26} 26%|██▋ | 9066/34278 [10:04:23<28:54:57, 4.13s/it] 26%|██▋ | 9067/34278 [10:04:26<26:07:42, 3.73s/it] {'loss': 0.1723, 'grad_norm': 0.8710773055210546, 'learning_rate': 8.625953029614045e-06, 'epoch': 0.26} 26%|██▋ | 9067/34278 [10:04:26<26:07:42, 3.73s/it] 26%|██▋ | 9068/34278 [10:04:30<27:38:47, 3.95s/it] {'loss': 0.1723, 'grad_norm': 0.9036504981894505, 'learning_rate': 8.625627719753919e-06, 'epoch': 0.26} 26%|██▋ | 9068/34278 [10:04:30<27:38:47, 3.95s/it] 26%|██▋ | 9069/34278 [10:04:34<26:23:11, 3.77s/it] {'loss': 0.1419, 'grad_norm': 0.827967608980461, 'learning_rate': 8.625302377525055e-06, 'epoch': 0.26} 26%|██▋ | 9069/34278 [10:04:34<26:23:11, 3.77s/it] 26%|██▋ | 9070/34278 [10:04:37<25:40:53, 3.67s/it] {'loss': 0.1516, 'grad_norm': 1.1107301199535424, 'learning_rate': 8.624977002930356e-06, 'epoch': 0.26} 26%|██▋ | 9070/34278 [10:04:37<25:40:53, 3.67s/it] 26%|██▋ | 9071/34278 [10:04:40<23:38:28, 3.38s/it] {'loss': 0.156, 'grad_norm': 1.0366582363229322, 'learning_rate': 8.624651595972729e-06, 'epoch': 0.26} 26%|██▋ | 9071/34278 [10:04:40<23:38:28, 3.38s/it] 26%|██▋ | 9072/34278 
26%|██▋ | 9072/34278 [10:04:46<29:12:14, 4.17s/it] {'loss': 0.1832, 'grad_norm': 0.8720778348750352, 'learning_rate': 8.624326156655075e-06, 'epoch': 0.26}
26%|██▋ | 9073/34278 [10:04:49<28:26:36, 4.06s/it] {'loss': 0.1605, 'grad_norm': 1.1158055232479938, 'learning_rate': 8.624000684980305e-06, 'epoch': 0.26}
26%|██▋ | 9074/34278 [10:04:52<25:45:17, 3.68s/it] {'loss': 0.1451, 'grad_norm': 0.9359495379112925, 'learning_rate': 8.62367518095132e-06, 'epoch': 0.26}
26%|██▋ | 9075/34278 [10:04:55<24:03:53, 3.44s/it] {'loss': 0.1413, 'grad_norm': 0.797724665799639, 'learning_rate': 8.623349644571029e-06, 'epoch': 0.26}
26%|██▋ | 9076/34278 [10:04:58<23:30:47, 3.36s/it] {'loss': 0.1632, 'grad_norm': 1.147284726854106, 'learning_rate': 8.623024075842337e-06, 'epoch': 0.26}
26%|██▋ | 9077/34278 [10:05:01<22:58:55, 3.28s/it] {'loss': 0.182, 'grad_norm': 0.8992657168090905, 'learning_rate': 8.622698474768151e-06, 'epoch': 0.26}
26%|██▋ | 9078/34278 [10:05:05<23:12:05, 3.31s/it] {'loss': 0.1639, 'grad_norm': 0.7815571394446682, 'learning_rate': 8.622372841351378e-06, 'epoch': 0.26}
26%|██▋ | 9079/34278 [10:05:08<22:49:35, 3.26s/it] {'loss': 0.1688, 'grad_norm': 0.8564192048276966, 'learning_rate': 8.622047175594926e-06, 'epoch': 0.26}
26%|██▋ | 9080/34278 [10:05:11<23:02:32, 3.29s/it] {'loss': 0.1461, 'grad_norm': 0.8074776318284307, 'learning_rate': 8.6217214775017e-06, 'epoch': 0.26}
26%|██▋ | 9081/34278 [10:05:15<22:56:52, 3.28s/it] {'loss': 0.1711, 'grad_norm': 0.7674874346703437, 'learning_rate': 8.62139574707461e-06, 'epoch': 0.26}
26%|██▋ | 9082/34278 [10:05:18<23:35:21, 3.37s/it] {'loss': 0.143, 'grad_norm': 0.9988993533004416, 'learning_rate': 8.621069984316562e-06, 'epoch': 0.26}
26%|██▋ | 9083/34278 [10:05:22<23:40:34, 3.38s/it] {'loss': 0.1508, 'grad_norm': 0.7562372592021596, 'learning_rate': 8.620744189230468e-06, 'epoch': 0.26}
27%|██▋ | 9084/34278 [10:05:25<24:46:19, 3.54s/it] {'loss': 0.1884, 'grad_norm': 0.7517946916366848, 'learning_rate': 8.620418361819231e-06, 'epoch': 0.27}
27%|██▋ | 9085/34278 [10:05:29<24:09:16, 3.45s/it] {'loss': 0.149, 'grad_norm': 0.8642858531473507, 'learning_rate': 8.620092502085766e-06, 'epoch': 0.27}
27%|██▋ | 9086/34278 [10:05:32<23:56:17, 3.42s/it] {'loss': 0.1563, 'grad_norm': 0.8279634536047271, 'learning_rate': 8.619766610032978e-06, 'epoch': 0.27}
27%|██▋ | 9087/34278 [10:05:37<27:31:29, 3.93s/it] {'loss': 0.1529, 'grad_norm': 1.0595198042791127, 'learning_rate': 8.619440685663777e-06, 'epoch': 0.27}
27%|██▋ | 9088/34278 [10:05:41<26:38:22, 3.81s/it] {'loss': 0.1795, 'grad_norm': 0.8305228327930336, 'learning_rate': 8.619114728981076e-06, 'epoch': 0.27}
27%|██▋ | 9089/34278 [10:05:47<31:43:03, 4.53s/it] {'loss': 0.132, 'grad_norm': 0.7461641349306766, 'learning_rate': 8.61878873998778e-06, 'epoch': 0.27}
27%|██▋ | 9090/34278 [10:05:50<28:38:24, 4.09s/it] {'loss': 0.1636, 'grad_norm': 1.1785531779312541, 'learning_rate': 8.618462718686803e-06, 'epoch': 0.27}
27%|██▋ | 9091/34278 [10:05:53<26:05:11, 3.73s/it] {'loss': 0.1692, 'grad_norm': 0.7808261703990087, 'learning_rate': 8.618136665081056e-06, 'epoch': 0.27}
27%|██▋ | 9092/34278 [10:05:56<25:04:39, 3.58s/it] {'loss': 0.1771, 'grad_norm': 0.9399072229359386, 'learning_rate': 8.617810579173448e-06, 'epoch': 0.27}
27%|██▋ | 9093/34278 [10:06:02<29:57:44, 4.28s/it] {'loss': 0.1472, 'grad_norm': 0.8716638403910036, 'learning_rate': 8.61748446096689e-06, 'epoch': 0.27}
27%|██▋ | 9094/34278 [10:06:05<28:10:24, 4.03s/it] {'loss': 0.1618, 'grad_norm': 0.8539748051850683, 'learning_rate': 8.617158310464295e-06, 'epoch': 0.27}
27%|██▋ | 9095/34278 [10:06:08<25:47:30, 3.69s/it] {'loss': 0.1631, 'grad_norm': 0.7179318488689149, 'learning_rate': 8.616832127668573e-06, 'epoch': 0.27}
27%|██▋ | 9096/34278 [10:06:13<28:39:04, 4.10s/it] {'loss': 0.1398, 'grad_norm': 1.0042392548906347, 'learning_rate': 8.616505912582638e-06, 'epoch': 0.27}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
27%|██▋ | 9097/34278 [10:06:17<26:37:00, 3.81s/it] {'loss': 0.1655, 'grad_norm': 0.8236697306505159, 'learning_rate': 8.616179665209402e-06, 'epoch': 0.27}
27%|██▋ | 9098/34278 [10:06:20<25:36:55, 3.66s/it] {'loss': 0.1774, 'grad_norm': 0.825549036235817, 'learning_rate': 8.615853385551776e-06, 'epoch': 0.27}
27%|██▋ | 9099/34278 [10:06:23<23:42:53, 3.39s/it] {'loss': 0.1542, 'grad_norm': 0.9269318908416707, 'learning_rate': 8.615527073612675e-06, 'epoch': 0.27}
27%|██▋ | 9100/34278 [10:06:28<28:22:16, 4.06s/it] {'loss': 0.1437, 'grad_norm': 0.7275122950988118, 'learning_rate': 8.615200729395011e-06, 'epoch': 0.27}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
27%|██▋ | 9101/34278 [10:06:31<26:15:00, 3.75s/it] {'loss': 0.1539, 'grad_norm': 0.614408299811067, 'learning_rate': 8.614874352901698e-06, 'epoch': 0.27}
27%|██▋ | 9102/34278 [10:06:34<24:33:05, 3.51s/it] {'loss': 0.1675, 'grad_norm': 0.8434856908633734, 'learning_rate': 8.61454794413565e-06, 'epoch': 0.27}
27%|██▋ | 9103/34278 [10:06:37<23:17:39, 3.33s/it] {'loss': 0.173, 'grad_norm': 0.7641500041163647, 'learning_rate': 8.61422150309978e-06, 'epoch': 0.27}
27%|██▋ | 9104/34278 [10:06:40<22:25:43, 3.21s/it] {'loss': 0.1704, 'grad_norm': 0.8790201535354943, 'learning_rate': 8.613895029797003e-06, 'epoch': 0.27}
27%|██▋ | 9105/34278 [10:06:43<22:04:15, 3.16s/it] {'loss': 0.1482, 'grad_norm': 0.7619920842925546, 'learning_rate': 8.613568524230235e-06, 'epoch': 0.27}
27%|██▋ | 9106/34278 [10:06:46<21:49:29, 3.12s/it] {'loss': 0.1458, 'grad_norm': 0.9080016098777615, 'learning_rate': 8.61324198640239e-06, 'epoch': 0.27}
27%|██▋ | 9107/34278 [10:06:49<21:06:19, 3.02s/it] {'loss': 0.1588, 'grad_norm': 0.8153199610735942, 'learning_rate': 8.612915416316383e-06, 'epoch': 0.27}
27%|██▋ | 9108/34278 [10:06:52<22:14:55, 3.18s/it] {'loss': 0.1576, 'grad_norm': 0.7773071241995667, 'learning_rate': 8.612588813975128e-06, 'epoch': 0.27}
27%|██▋ | 9109/34278 [10:06:56<23:15:57, 3.33s/it] {'loss': 0.1743, 'grad_norm': 0.7211843174080956, 'learning_rate': 8.612262179381546e-06, 'epoch': 0.27}
27%|██▋ | 9110/34278 [10:07:00<24:25:24, 3.49s/it] {'loss': 0.158, 'grad_norm': 0.9762819829991857, 'learning_rate': 8.611935512538546e-06, 'epoch': 0.27}
27%|██▋ | 9111/34278 [10:07:03<24:18:32, 3.48s/it] {'loss': 0.1683, 'grad_norm': 0.8956469604263455, 'learning_rate': 8.611608813449049e-06, 'epoch': 0.27}
27%|██▋ | 9112/34278 [10:07:09<29:26:41, 4.21s/it] {'loss': 0.1602, 'grad_norm': 0.8204892734320686, 'learning_rate': 8.61128208211597e-06, 'epoch': 0.27}
27%|██▋ | 9113/34278 [10:07:13<27:09:23, 3.88s/it] {'loss': 0.1442, 'grad_norm': 0.7809749207214982, 'learning_rate': 8.610955318542228e-06, 'epoch': 0.27}
27%|██▋ | 9114/34278 [10:07:16<26:18:04, 3.76s/it] {'loss': 0.1586, 'grad_norm': 0.9870598039420582, 'learning_rate': 8.610628522730739e-06, 'epoch': 0.27}
27%|██▋ | 9115/34278 [10:07:20<25:52:09, 3.70s/it] {'loss': 0.1703, 'grad_norm': 1.0519012197283568, 'learning_rate': 8.61030169468442e-06, 'epoch': 0.27}
27%|██▋ | 9116/34278 [10:07:23<24:26:15, 3.50s/it] {'loss': 0.1786, 'grad_norm': 0.8214479126276892, 'learning_rate': 8.60997483440619e-06, 'epoch': 0.27}
27%|██▋ | 9117/34278 [10:07:28<27:30:15, 3.94s/it] {'loss': 0.1677, 'grad_norm': 0.8068036418888651, 'learning_rate': 8.609647941898965e-06, 'epoch': 0.27}
27%|██▋ | 9118/34278 [10:07:33<31:39:43, 4.53s/it] {'loss': 0.1665, 'grad_norm': 1.0475768614955818, 'learning_rate': 8.609321017165666e-06, 'epoch': 0.27}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
27%|██▋ | 9119/34278 [10:07:36<28:28:17, 4.07s/it] {'loss': 0.1467, 'grad_norm': 0.9301065775618507, 'learning_rate': 8.60899406020921e-06, 'epoch': 0.27}
27%|██▋ | 9120/34278 [10:07:39<26:11:08, 3.75s/it] {'loss': 0.2, 'grad_norm': 0.8931493725532034, 'learning_rate': 8.608667071032518e-06, 'epoch': 0.27}
27%|██▋ | 9121/34278 [10:07:45<30:58:42, 4.43s/it] {'loss': 0.1879, 'grad_norm': 0.9897605891146277, 'learning_rate': 8.608340049638505e-06, 'epoch': 0.27}
27%|██▋ | 9122/34278 [10:07:50<30:21:00, 4.34s/it] {'loss': 0.1386, 'grad_norm': 0.9371094007725559, 'learning_rate': 8.608012996030096e-06, 'epoch': 0.27}
27%|██▋ | 9123/34278 [10:07:53<28:01:16, 4.01s/it] {'loss': 0.157, 'grad_norm': 0.7642659985859662, 'learning_rate': 8.607685910210207e-06, 'epoch': 0.27}
27%|██▋ | 9124/34278 [10:07:56<27:16:00, 3.90s/it] {'loss': 0.1592, 'grad_norm': 0.760785170955767, 'learning_rate': 8.607358792181758e-06, 'epoch': 0.27}
27%|██▋ | 9125/34278 [10:08:00<25:24:43, 3.64s/it] {'loss': 0.1596, 'grad_norm': 0.9363709350453202, 'learning_rate': 8.607031641947674e-06, 'epoch': 0.27}
27%|██▋ | 9126/34278 [10:08:05<28:57:34, 4.14s/it] {'loss': 0.1592, 'grad_norm': 0.9553890540930121, 'learning_rate': 8.60670445951087e-06, 'epoch': 0.27}
27%|██▋ | 9127/34278 [10:08:09<28:16:57, 4.05s/it] {'loss': 0.164, 'grad_norm': 0.8108281807974532, 'learning_rate': 8.606377244874272e-06, 'epoch': 0.27}
27%|██▋ | 9128/34278 [10:08:13<27:53:15, 3.99s/it] {'loss': 0.1456, 'grad_norm': 0.744083984081861, 'learning_rate': 8.606049998040798e-06, 'epoch': 0.27}
27%|██▋ | 9129/34278 [10:08:17<29:08:08, 4.17s/it] {'loss': 0.1505, 'grad_norm': 0.9122152357575725, 'learning_rate': 8.60572271901337e-06, 'epoch': 0.27}
27%|██▋ | 9130/34278 [10:08:20<27:04:29, 3.88s/it] {'loss': 0.1513, 'grad_norm': 0.9656027725681182, 'learning_rate': 8.60539540779491e-06, 'epoch': 0.27}
27%|██▋ | 9131/34278 [10:08:24<27:05:19, 3.88s/it] {'loss': 0.1507, 'grad_norm': 0.6944694313875288, 'learning_rate': 8.60506806438834e-06, 'epoch': 0.27}
27%|██▋ | 9132/34278 [10:08:28<27:38:01, 3.96s/it] {'loss': 0.1453, 'grad_norm': 1.3818471606419696, 'learning_rate': 8.604740688796585e-06, 'epoch': 0.27}
27%|██▋ | 9133/34278 [10:08:32<26:54:42, 3.85s/it] {'loss': 0.1693, 'grad_norm': 1.0641447817434493, 'learning_rate': 8.604413281022563e-06, 'epoch': 0.27}
27%|██▋ | 9134/34278 [10:08:36<26:38:42, 3.81s/it] {'loss': 0.1424, 'grad_norm': 0.8243456389069569, 'learning_rate': 8.604085841069202e-06, 'epoch': 0.27}
27%|██▋ | 9135/34278 [10:08:39<25:47:20, 3.69s/it] {'loss': 0.1505, 'grad_norm': 1.0053326829075988, 'learning_rate': 8.60375836893942e-06, 'epoch': 0.27}
27%|██▋ | 9136/34278 [10:08:45<29:38:02, 4.24s/it] {'loss': 0.153, 'grad_norm': 0.9620008886312827, 'learning_rate': 8.603430864636147e-06, 'epoch': 0.27}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
27%|██▋ | 9137/34278 [10:08:48<27:04:19, 3.88s/it] {'loss': 0.1624, 'grad_norm': 0.9968262005020653, 'learning_rate': 8.603103328162303e-06, 'epoch': 0.27}
27%|██▋ | 9138/34278 [10:08:51<25:18:49, 3.62s/it] {'loss': 0.1381, 'grad_norm': 0.8984514152344192, 'learning_rate': 8.602775759520812e-06, 'epoch': 0.27}
27%|██▋ | 9139/34278 [10:08:53<23:27:06, 3.36s/it] {'loss': 0.1482, 'grad_norm': 0.9913964856948402, 'learning_rate': 8.602448158714598e-06, 'epoch': 0.27}
27%|██▋ | 9140/34278 [10:08:57<23:00:33, 3.30s/it] {'loss': 0.1453, 'grad_norm': 0.8265462729925408, 'learning_rate': 8.602120525746588e-06, 'epoch': 0.27}
27%|██▋ | 9141/34278 [10:09:00<22:48:50, 3.27s/it] {'loss': 0.1562, 'grad_norm': 0.8347403841944075, 'learning_rate': 8.601792860619704e-06, 'epoch': 0.27}
27%|██▋ | 9142/34278 [10:09:03<22:01:06, 3.15s/it] {'loss': 0.167, 'grad_norm': 0.8428107837326312, 'learning_rate': 8.601465163336875e-06, 'epoch': 0.27}
27%|██▋ | 9143/34278 [10:09:07<24:16:09, 3.48s/it] {'loss': 0.1616, 'grad_norm': 0.9401110801896505, 'learning_rate': 8.601137433901026e-06, 'epoch': 0.27}
27%|██▋ | 9144/34278 [10:09:13<29:26:00, 4.22s/it] {'loss': 0.161, 'grad_norm': 0.7493122862994414, 'learning_rate': 8.600809672315079e-06, 'epoch': 0.27}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
27%|██▋ | 9145/34278 [10:09:16<27:25:52, 3.93s/it] {'loss': 0.1646, 'grad_norm': 0.9069616054462428, 'learning_rate': 8.600481878581963e-06, 'epoch': 0.27}
27%|██▋ | 9146/34278 [10:09:19<25:34:46, 3.66s/it] {'loss': 0.1826, 'grad_norm': 0.7983840351539716, 'learning_rate': 8.600154052704606e-06, 'epoch': 0.27}
27%|██▋ | 9147/34278 [10:09:23<25:31:38, 3.66s/it] {'loss': 0.1525, 'grad_norm': 0.9543317972020042, 'learning_rate': 8.599826194685932e-06, 'epoch': 0.27}
27%|██▋ | 9148/34278 [10:09:26<23:51:23, 3.42s/it] {'loss': 0.1865, 'grad_norm': 0.883120951423925, 'learning_rate': 8.599498304528869e-06, 'epoch': 0.27}
27%|██▋ | 9149/34278 [10:09:30<25:48:38, 3.70s/it] {'loss': 0.1692, 'grad_norm': 0.819372229915627, 'learning_rate': 8.599170382236343e-06, 'epoch': 0.27}
27%|██▋ | 9150/34278 [10:09:34<26:54:59, 3.86s/it] {'loss': 0.1698, 'grad_norm': 0.8996655475799555, 'learning_rate': 8.598842427811286e-06, 'epoch': 0.27}
27%|██▋ | 9151/34278 [10:09:38<27:48:08, 3.98s/it] {'loss': 0.1663, 'grad_norm': 0.7135545026287398, 'learning_rate': 8.598514441256622e-06, 'epoch': 0.27}
27%|██▋ | 9152/34278 [10:09:41<25:26:55, 3.65s/it] {'loss': 0.1815, 'grad_norm': 0.8362320184332745, 'learning_rate': 8.59818642257528e-06, 'epoch': 0.27}
27%|██▋ | 9153/34278 [10:09:44<24:01:27, 3.44s/it] {'loss': 0.1485, 'grad_norm': 0.996673344357169, 'learning_rate': 8.597858371770189e-06, 'epoch': 0.27}
27%|██▋ | 9154/34278 [10:09:47<23:09:17, 3.32s/it] {'loss': 0.1591, 'grad_norm': 0.8816343547125585, 'learning_rate': 8.597530288844275e-06, 'epoch': 0.27}
27%|██▋ | 9155/34278 [10:09:50<21:56:07, 3.14s/it] {'loss': 0.1665, 'grad_norm': 0.8489802573039555, 'learning_rate': 8.597202173800471e-06, 'epoch': 0.27}
27%|██▋ | 9156/34278 [10:09:54<24:13:56, 3.47s/it] {'loss': 0.1559, 'grad_norm': 1.0257458096727483, 'learning_rate': 8.596874026641705e-06, 'epoch': 0.27}
27%|██▋ | 9157/34278 [10:09:59<25:51:02, 3.70s/it] {'loss': 0.1564, 'grad_norm': 0.9692393709333937, 'learning_rate': 8.596545847370904e-06, 'epoch': 0.27}
27%|██▋ | 9158/34278 [10:10:04<29:36:50, 4.24s/it] {'loss': 0.1649, 'grad_norm': 0.9106118102650673, 'learning_rate': 8.596217635991004e-06, 'epoch': 0.27}
27%|██▋ | 9159/34278 [10:10:07<26:59:43, 3.87s/it] {'loss': 0.1709, 'grad_norm': 0.9422265893505043, 'learning_rate': 8.59588939250493e-06, 'epoch': 0.27}
27%|██▋ | 9160/34278 [10:10:12<28:22:42, 4.07s/it] {'loss': 0.163, 'grad_norm': 0.8309488586446533, 'learning_rate': 8.595561116915613e-06, 'epoch': 0.27}
27%|██▋ | 9161/34278 [10:10:15<26:51:17, 3.85s/it] {'loss': 0.1828, 'grad_norm': 0.9041403780788585, 'learning_rate': 8.595232809225987e-06, 'epoch': 0.27}
27%|██▋ | 9162/34278 [10:10:21<32:33:58, 4.67s/it] {'loss': 0.1361, 'grad_norm': 0.8975794841273685, 'learning_rate': 8.594904469438979e-06, 'epoch': 0.27}
27%|██▋ | 9163/34278 [10:10:25<30:01:01, 4.30s/it] {'loss': 0.1608, 'grad_norm': 1.0592758026772573, 'learning_rate': 8.594576097557521e-06, 'epoch': 0.27}
27%|██▋ | 9164/34278 [10:10:29<29:09:37, 4.18s/it] {'loss': 0.1619, 'grad_norm': 0.7650386674472878, 'learning_rate': 8.594247693584547e-06, 'epoch': 0.27}
27%|██▋ | 9165/34278 [10:10:35<32:55:51, 4.72s/it] {'loss': 0.1519, 'grad_norm': 0.840847459811344, 'learning_rate': 8.593919257522988e-06, 'epoch': 0.27}
Token indices sequence length is longer than the specified maximum sequence length for this model (9308 > 8192). Running this sequence through the model will result in indexing errors
27%|██▋ | 9166/34278 [10:10:38<29:01:28, 4.16s/it] {'loss': 0.1611, 'grad_norm': 0.9773620966100263, 'learning_rate': 8.593590789375775e-06, 'epoch': 0.27}
27%|██▋ | 9167/34278 [10:10:44<33:27:02, 4.80s/it] {'loss': 0.1359, 'grad_norm': 0.8697579112754361, 'learning_rate': 8.59326228914584e-06, 'epoch': 0.27}
27%|██▋ | 9168/34278 [10:10:49<34:46:11, 4.98s/it] {'loss': 0.139, 'grad_norm': 0.9788890258356004, 'learning_rate': 8.59293375683612e-06, 'epoch': 0.27}
27%|██▋ | 9169/34278 [10:10:52<30:40:27, 4.40s/it] {'loss': 0.1617, 'grad_norm': 0.8896247117464511, 'learning_rate': 8.592605192449543e-06, 'epoch': 0.27}
27%|██▋ | 9170/34278 [10:10:58<34:02:25, 4.88s/it] {'loss': 0.1648, 'grad_norm': 1.0945036165838098, 'learning_rate': 8.592276595989045e-06, 'epoch': 0.27}
27%|██▋ | 9171/34278 [10:11:01<30:03:22, 4.31s/it] {'loss': 0.156, 'grad_norm': 1.1723938840064165, 'learning_rate': 8.59194796745756e-06, 'epoch': 0.27}
27%|██▋ | 9172/34278 [10:11:08<34:17:24, 4.92s/it] {'loss': 0.1815, 'grad_norm': 0.944501312743487, 'learning_rate': 8.591619306858019e-06, 'epoch': 0.27}
27%|██▋ | 9173/34278 [10:11:11<30:35:12, 4.39s/it] {'loss': 0.1816, 'grad_norm': 0.9320193758861964, 'learning_rate': 8.59129061419336e-06, 'epoch': 0.27}
27%|██▋ | 9174/34278 [10:11:14<27:44:21, 3.98s/it] {'loss': 0.1503, 'grad_norm': 1.0080757883311802, 'learning_rate': 8.590961889466514e-06, 'epoch': 0.27}
27%|██▋ | 9175/34278 [10:11:18<28:28:48, 4.08s/it] {'loss': 0.1685, 'grad_norm': 0.9988390895799796, 'learning_rate': 8.590633132680419e-06, 'epoch': 0.27}
27%|██▋ | 9176/34278 [10:11:21<25:59:58, 3.73s/it] {'loss': 0.1659, 'grad_norm': 0.9220635792551475, 'learning_rate': 8.590304343838008e-06, 'epoch': 0.27}
27%|██▋ | 9177/34278 [10:11:24<25:03:22, 3.59s/it] {'loss': 0.1321, 'grad_norm': 0.7333628790025075, 'learning_rate': 8.589975522942218e-06, 'epoch': 0.27}
27%|██▋ | 9178/34278 [10:11:28<24:29:08, 3.51s/it] {'loss': 0.1724, 'grad_norm': 1.079258263839422, 'learning_rate': 8.589646669995983e-06, 'epoch': 0.27}
27%|██▋ | 9179/34278 [10:11:32<25:06:13, 3.60s/it] {'loss': 0.1295, 'grad_norm': 0.8545169594228665, 'learning_rate': 8.589317785002238e-06, 'epoch': 0.27}
27%|██▋ | 9180/34278 [10:11:35<25:32:46, 3.66s/it] {'loss': 0.1383, 'grad_norm': 0.7275041417697529, 'learning_rate': 8.588988867963922e-06, 'epoch': 0.27}
27%|██▋ | 9181/34278 [10:11:38<23:45:32, 3.41s/it] {'loss': 0.1805, 'grad_norm': 1.0811129047409436, 'learning_rate': 8.58865991888397e-06, 'epoch': 0.27}
27%|██▋ | 9182/34278 [10:11:42<23:42:25, 3.40s/it] {'loss': 0.1751, 'grad_norm': 0.8539191484350386, 'learning_rate': 8.588330937765318e-06, 'epoch': 0.27}
27%|██▋ | 9183/34278 [10:11:45<22:58:20, 3.30s/it] {'loss': 0.1371, 'grad_norm': 0.7555783785508675, 'learning_rate': 8.588001924610905e-06, 'epoch': 0.27}
27%|██▋ | 9184/34278 [10:11:48<22:28:38, 3.22s/it] {'loss': 0.1533, 'grad_norm': 0.7832880033592341, 'learning_rate': 8.587672879423668e-06, 'epoch': 0.27}
27%|██▋ | 9185/34278 [10:11:53<27:51:28, 4.00s/it] {'loss': 0.1607, 'grad_norm': 0.7548938877031783, 'learning_rate': 8.587343802206543e-06, 'epoch': 0.27}
27%|██▋ | 9186/34278 [10:11:59<30:17:15, 4.35s/it] {'loss': 0.1623, 'grad_norm': 0.7232071859426559, 'learning_rate': 8.587014692962468e-06, 'epoch': 0.27}
27%|██▋ | 9187/34278 [10:12:05<33:51:38, 4.86s/it] {'loss': 0.1472, 'grad_norm': 0.7363097351957697, 'learning_rate': 8.586685551694384e-06, 'epoch': 0.27}
27%|██▋ | 9188/34278 [10:12:07<29:39:07, 4.25s/it] {'loss': 0.1571, 'grad_norm': 0.8764554487660825, 'learning_rate': 8.586356378405228e-06, 'epoch': 0.27}
27%|██▋ | 9189/34278 [10:12:11<27:15:44, 3.91s/it] {'loss': 0.1431, 'grad_norm': 0.6740339109139587, 'learning_rate': 8.586027173097935e-06, 'epoch': 0.27}
27%|██▋ | 9190/34278 [10:12:17<31:32:19, 4.53s/it] {'loss': 0.1666, 'grad_norm': 0.7955090497724493, 'learning_rate': 8.58569793577545e-06, 'epoch': 0.27}
27%|██▋ | 9191/34278 [10:12:20<28:28:28, 4.09s/it] {'loss': 0.1699, 'grad_norm': 0.8110620248158532, 'learning_rate': 8.58536866644071e-06, 'epoch': 0.27}
27%|██▋ | 9192/34278 [10:12:26<32:37:58, 4.68s/it] {'loss': 0.1531, 'grad_norm': 0.9302065781587839, 'learning_rate': 8.585039365096652e-06, 'epoch': 0.27}
27%|██▋ | 9193/34278 [10:12:29<30:38:16, 4.40s/it] {'loss': 0.1318, 'grad_norm': 0.7246321762602458, 'learning_rate': 8.584710031746222e-06, 'epoch': 0.27}
27%|██▋ | 9194/34278 [10:12:33<28:14:21, 4.05s/it] {'loss': 0.1788, 'grad_norm': 0.8120322319483088, 'learning_rate': 8.584380666392354e-06, 'epoch': 0.27}
27%|██▋ | 9195/34278 [10:12:38<30:54:53, 4.44s/it] {'loss': 0.1534, 'grad_norm': 0.8096852428493483, 'learning_rate': 8.584051269037992e-06, 'epoch': 0.27}
27%|██▋ | 9196/34278 [10:12:42<29:01:48, 4.17s/it] {'loss': 0.1392, 'grad_norm': 0.8341933111889306, 'learning_rate': 8.583721839686074e-06, 'epoch': 0.27}
27%|██▋ | 9197/34278 [10:12:45<28:32:10, 4.10s/it] {'loss': 0.1547, 'grad_norm': 0.8384538346422006, 'learning_rate': 8.583392378339546e-06, 'epoch': 0.27}
27%|██▋ | 9198/34278 [10:12:48<26:16:19, 3.77s/it] {'loss': 0.1398, 'grad_norm': 0.8337948487907447, 'learning_rate': 8.583062885001345e-06, 'epoch': 0.27}
27%|██▋ | 9199/34278 [10:12:54<30:08:20, 4.33s/it] {'loss': 0.1388, 'grad_norm': 0.9001017741716905, 'learning_rate': 8.582733359674413e-06, 'epoch': 0.27}
27%|██▋ | 9200/34278 [10:12:58<28:19:17, 4.07s/it] {'loss': 0.1401, 'grad_norm': 0.8210396932280165, 'learning_rate': 8.582403802361694e-06, 'epoch': 0.27}
27%|██▋ | 9201/34278 [10:13:01<27:43:35, 3.98s/it] {'loss': 0.1399, 'grad_norm': 0.8767992996359955, 'learning_rate': 8.58207421306613e-06, 'epoch': 0.27}
27%|██▋ | 9202/34278 [10:13:04<25:00:50, 3.59s/it] {'loss': 0.1668, 'grad_norm': 0.7541149616576392, 'learning_rate': 8.58174459179066e-06, 'epoch': 0.27}
27%|██▋ | 9203/34278 [10:13:07<23:42:39, 3.40s/it] {'loss': 0.1596, 'grad_norm': 1.0436433724993603, 'learning_rate': 8.58141493853823e-06, 'epoch': 0.27}
27%|██▋ | 9204/34278 [10:13:10<22:40:20, 3.26s/it] {'loss': 0.1518, 'grad_norm': 0.839183303192571, 'learning_rate': 8.581085253311783e-06, 'epoch': 0.27}
27%|██▋ | 9205/34278 [10:13:13<22:12:16, 3.19s/it] {'loss': 0.1646, 'grad_norm': 0.885748581867198, 'learning_rate': 8.580755536114262e-06, 'epoch': 0.27}
27%|██▋ | 9206/34278 [10:13:16<22:10:47, 3.18s/it] {'loss': 0.1846, 'grad_norm': 0.9010468031908282, 'learning_rate': 8.58042578694861e-06, 'epoch': 0.27}
27%|██▋ | 9207/34278 [10:13:20<23:07:33, 3.32s/it] {'loss': 0.1526, 'grad_norm': 0.9110955868109781, 'learning_rate': 8.580096005817771e-06, 'epoch': 0.27}
27%|██▋ | 9208/34278 [10:13:23<23:23:04, 3.36s/it] {'loss': 0.1607, 'grad_norm': 0.7531201583190602, 'learning_rate': 8.57976619272469e-06, 'epoch': 0.27}
27%|██▋ | 9209/34278 [10:13:30<30:30:27, 4.38s/it] {'loss': 0.1371, 'grad_norm': 0.7406639129407381, 'learning_rate': 8.57943634767231e-06, 'epoch': 0.27}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc776ae3b50>
Failed to fetch sample 2691938. Exception: cannot identify image file <_io.BytesIO object at 0x7fc776ae3b50>
27%|██▋ | 9210/34278 [10:13:33<28:06:53, 4.04s/it] {'loss': 0.1679, 'grad_norm': 0.8488613798188147, 'learning_rate': 8.579106470663578e-06, 'epoch': 0.27}
27%|██▋ | 9211/34278 [10:13:40<33:35:40, 4.82s/it] {'loss': 0.1632, 'grad_norm': 0.8229960816134502, 'learning_rate': 8.578776561701438e-06, 'epoch': 0.27}
27%|██▋ | 9212/34278 [10:13:43<29:59:03, 4.31s/it] {'loss': 0.1689, 'grad_norm': 1.1316756306441187, 'learning_rate': 8.578446620788834e-06, 'epoch': 0.27}
27%|██▋ | 9213/34278 [10:13:46<27:48:17, 3.99s/it] {'loss': 0.177, 'grad_norm': 0.7629383223594314, 'learning_rate': 8.578116647928714e-06, 'epoch': 0.27}
27%|██▋ | 9214/34278 [10:13:50<26:25:05, 3.79s/it] {'loss': 0.1656, 'grad_norm': 1.161795672400416, 'learning_rate': 8.577786643124022e-06, 'epoch': 0.27}
27%|██▋ | 9215/34278 [10:13:55<30:17:33, 4.35s/it] {'loss': 0.1644, 'grad_norm': 0.7668669152188602, 'learning_rate': 8.577456606377704e-06, 'epoch': 0.27}
27%|██▋ | 9216/34278 [10:13:58<27:25:32, 3.94s/it] {'loss': 0.1525, 'grad_norm': 0.7187454993662719, 'learning_rate': 8.577126537692707e-06, 'epoch': 0.27}
27%|██▋ | 9217/34278 [10:14:01<25:17:48, 3.63s/it] {'loss': 0.182, 'grad_norm': 0.98976293249415, 'learning_rate': 8.576796437071982e-06, 'epoch': 0.27}
27%|██▋ | 9218/34278 [10:14:07<30:02:24, 4.32s/it] {'loss': 0.1478, 'grad_norm': 0.7323738644458602, 'learning_rate': 8.576466304518469e-06, 'epoch': 0.27}
27%|██▋ | 9219/34278 [10:14:12<31:19:25, 4.50s/it] {'loss': 0.1584, 'grad_norm': 0.7744388673034995, 'learning_rate': 8.57613614003512e-06, 'epoch': 0.27}
27%|██▋ | 9220/34278 [10:14:15<27:34:17, 3.96s/it] {'loss': 0.1464, 'grad_norm': 1.4568329485976723, 'learning_rate': 8.57580594362488e-06, 'epoch': 0.27}
27%|██▋ | 9221/34278 [10:14:18<25:53:22, 3.72s/it] {'loss': 0.1603, 'grad_norm': 0.8150362305649753, 'learning_rate': 8.5754757152907e-06, 'epoch': 0.27}
27%|██▋ | 9222/34278 [10:14:24<30:46:30, 4.42s/it] {'loss': 0.1578, 'grad_norm': 0.9979086478524237, 'learning_rate': 8.575145455035525e-06, 'epoch': 0.27}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
27%|██▋ | 9223/34278 [10:14:27<27:42:51, 3.98s/it] {'loss': 0.1614, 'grad_norm': 0.8383760282009125, 'learning_rate': 8.574815162862305e-06, 'epoch': 0.27}
27%|██▋ | 9224/34278 [10:14:30<25:12:41, 3.62s/it] {'loss': 0.1571, 'grad_norm': 0.8110399966876206, 'learning_rate': 8.574484838773988e-06, 'epoch': 0.27}
27%|██▋ | 9225/34278 [10:14:33<23:58:55, 3.45s/it] {'loss': 0.1386, 'grad_norm': 0.9286612870673736, 'learning_rate': 8.574154482773524e-06, 'epoch': 0.27}
27%|██▋ | 9226/34278 [10:14:39<30:10:07, 4.34s/it] {'loss': 0.1884, 'grad_norm': 0.8942256144728984, 'learning_rate': 8.573824094863863e-06, 'epoch': 0.27}
27%|██▋ | 9227/34278 [10:14:42<28:19:17, 4.07s/it] {'loss': 0.1827, 'grad_norm': 0.8160018496012329, 'learning_rate': 8.573493675047953e-06, 'epoch': 0.27}
27%|██▋ | 9228/34278 [10:14:45<25:50:37, 3.71s/it] {'loss': 0.1467, 'grad_norm': 0.9950852190666365, 'learning_rate': 8.573163223328744e-06, 'epoch': 0.27}
27%|██▋ | 9229/34278 [10:14:50<27:44:44, 3.99s/it] {'loss': 0.1525, 'grad_norm': 0.9832431634164503, 'learning_rate': 8.572832739709187e-06, 'epoch': 0.27}
27%|██▋ | 9230/34278 [10:14:53<26:00:34, 3.74s/it] {'loss': 0.1583, 'grad_norm': 0.8231481339351706, 'learning_rate': 8.572502224192233e-06, 'epoch': 0.27}
27%|██▋ | 9231/34278 [10:14:57<25:24:49, 3.65s/it] {'loss': 0.1417, 'grad_norm': 0.924052550024724, 'learning_rate': 8.572171676780832e-06, 'epoch': 0.27}
27%|██▋ | 9232/34278 [10:15:01<27:43:37, 3.99s/it] {'loss': 0.1314, 'grad_norm': 0.8464288011978807, 'learning_rate': 8.571841097477933e-06, 'epoch': 0.27}
27%|██▋ | 9233/34278 [10:15:04<25:40:36, 3.69s/it] {'loss': 0.1333, 'grad_norm': 0.8420754487561416, 'learning_rate': 8.571510486286492e-06, 'epoch': 0.27}
27%|██▋ | 9234/34278 [10:15:08<25:29:32, 3.66s/it] {'loss': 0.182, 'grad_norm': 1.0327153316506936, 'learning_rate': 8.571179843209457e-06, 'epoch': 0.27}
27%|██▋ | 9235/34278 [10:15:11<23:59:18, 3.45s/it] {'loss': 0.1758, 'grad_norm': 1.0459509761829675, 'learning_rate': 8.57084916824978e-06, 'epoch': 0.27}
27%|██▋ | 9236/34278 [10:15:15<24:47:32, 3.56s/it] {'loss': 0.148, 'grad_norm': 0.9712306776005475, 'learning_rate': 8.570518461410415e-06, 'epoch': 0.27}
27%|██▋ | 9237/34278 [10:15:19<25:13:58, 3.63s/it] {'loss': 0.1654, 'grad_norm': 0.7930466124645379, 'learning_rate': 8.570187722694312e-06, 'epoch': 0.27}
27%|██▋ | 9238/34278 [10:15:22<24:11:32, 3.48s/it] {'loss': 0.1626, 'grad_norm': 0.780649466370968, 'learning_rate': 8.569856952104427e-06, 'epoch': 0.27}
27%|██▋ | 9239/34278 [10:15:25<22:59:54, 3.31s/it] {'loss': 0.1627, 'grad_norm': 0.9327302768809439, 'learning_rate': 8.56952614964371e-06, 'epoch': 0.27}
27%|██▋ | 9240/34278 [10:15:28<23:25:39, 3.37s/it] {'loss': 0.1435, 'grad_norm': 0.8298760840573467, 'learning_rate': 8.569195315315117e-06, 'epoch': 0.27}
27%|██▋ | 9241/34278 [10:15:32<23:38:12, 3.40s/it] {'loss': 0.1935, 'grad_norm': 0.866636920406715, 'learning_rate': 8.568864449121599e-06, 'epoch': 0.27}
27%|██▋ | 9242/34278 [10:15:35<23:46:07, 3.42s/it] {'loss': 0.1839, 'grad_norm': 0.8687460907291265, 'learning_rate': 8.568533551066113e-06, 'epoch': 0.27}
27%|██▋ | 9243/34278 [10:15:38<23:19:37, 3.35s/it] {'loss': 0.15, 'grad_norm': 0.9308339552873933, 'learning_rate': 8.56820262115161e-06, 'epoch': 0.27}
27%|██▋ | 9244/34278 [10:15:42<23:31:45, 3.38s/it] {'loss': 0.1576, 'grad_norm': 1.3052139536501206, 'learning_rate': 8.567871659381047e-06, 'epoch': 0.27}
27%|██▋ | 9245/34278 [10:15:45<22:59:47, 3.31s/it] {'loss': 0.1902, 'grad_norm': 0.9432496306131262, 'learning_rate': 8.567540665757375e-06, 'epoch': 0.27}
27%|██▋ | 9246/34278 [10:15:48<23:45:16, 3.42s/it] {'loss': 0.1457, 'grad_norm': 0.7887585699622204, 'learning_rate': 8.567209640283553e-06, 'epoch': 0.27}
27%|██▋ | 9247/34278 [10:15:54<28:18:25, 4.07s/it] {'loss': 0.1569, 'grad_norm': 0.8241288397490717, 'learning_rate': 8.566878582962534e-06, 'epoch': 0.27}
27%|██▋ | 9248/34278 [10:15:57<26:36:04, 3.83s/it] {'loss': 0.1625, 'grad_norm': 0.9992394244678918, 'learning_rate': 8.566547493797278e-06, 'epoch': 0.27}
27%|██▋ | 9249/34278 [10:16:02<28:49:16, 4.15s/it] {'loss': 0.1481, 'grad_norm': 0.8744530226184944, 'learning_rate': 8.566216372790735e-06, 'epoch': 0.27}
27%|██▋ | 9250/34278 [10:16:05<25:57:33, 3.73s/it] {'loss': 0.1335, 'grad_norm': 0.8531297003913738, 'learning_rate': 8.565885219945862e-06, 'epoch': 0.27}
27%|██▋ | 9251/34278 [10:16:08<24:37:11, 3.54s/it] {'loss': 0.1431, 'grad_norm': 0.721705219093551, 'learning_rate': 8.565554035265618e-06, 'epoch':
0.27} 27%|██▋ | 9251/34278 [10:16:08<24:37:11, 3.54s/it] 27%|██▋ | 9252/34278 [10:16:11<23:14:09, 3.34s/it] {'loss': 0.1788, 'grad_norm': 0.8786835574171571, 'learning_rate': 8.565222818752959e-06, 'epoch': 0.27} 27%|██▋ | 9252/34278 [10:16:11<23:14:09, 3.34s/it] 27%|██▋ | 9253/34278 [10:16:17<28:03:51, 4.04s/it] {'loss': 0.1617, 'grad_norm': 0.7672552773990936, 'learning_rate': 8.564891570410842e-06, 'epoch': 0.27} 27%|██▋ | 9253/34278 [10:16:17<28:03:51, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 27%|██▋ | 9254/34278 [10:16:22<31:33:02, 4.54s/it] {'loss': 0.1427, 'grad_norm': 0.7098568549686243, 'learning_rate': 8.564560290242224e-06, 'epoch': 0.27} 27%|██▋ | 9254/34278 [10:16:22<31:33:02, 4.54s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 27%|██▋ | 9255/34278 [10:16:26<28:47:49, 4.14s/it] {'loss': 0.1575, 'grad_norm': 0.685298869254431, 'learning_rate': 8.564228978250062e-06, 'epoch': 0.27} 27%|██▋ | 9255/34278 [10:16:26<28:47:49, 4.14s/it] 27%|██▋ | 9256/34278 [10:16:30<29:48:48, 4.29s/it] {'loss': 0.1572, 'grad_norm': 0.9341513204785777, 'learning_rate': 8.563897634437316e-06, 'epoch': 0.27} 27%|██▋ | 9256/34278 [10:16:30<29:48:48, 4.29s/it] 27%|██▋ | 9257/34278 [10:16:37<34:31:50, 4.97s/it] {'loss': 0.1654, 'grad_norm': 0.6749370421864948, 'learning_rate': 8.563566258806942e-06, 'epoch': 0.27} 27%|██▋ | 9257/34278 [10:16:37<34:31:50, 4.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 27%|██▋ | 9258/34278 [10:16:40<30:56:17, 4.45s/it] {'loss': 0.1533, 'grad_norm': 0.6732788259830977, 'learning_rate': 8.5632348513619e-06, 'epoch': 0.27} 27%|██▋ | 9258/34278 [10:16:40<30:56:17, 4.45s/it] 27%|██▋ | 9259/34278 [10:16:43<27:42:56, 3.99s/it] {'loss': 0.1599, 'grad_norm': 0.9100626040318655, 'learning_rate': 8.562903412105146e-06, 'epoch': 0.27} 27%|██▋ | 9259/34278 [10:16:43<27:42:56, 3.99s/it] 27%|██▋ | 9260/34278 [10:16:46<26:20:41, 3.79s/it] {'loss': 0.167, 'grad_norm': 0.8611613812154474, 'learning_rate': 8.562571941039641e-06, 'epoch': 0.27} 27%|██▋ | 9260/34278 [10:16:46<26:20:41, 3.79s/it] 27%|██▋ | 9261/34278 [10:16:52<30:11:50, 4.35s/it] {'loss': 0.1409, 'grad_norm': 0.9697066162839673, 'learning_rate': 8.562240438168345e-06, 'epoch': 0.27} 27%|██▋ | 9261/34278 [10:16:52<30:11:50, 4.35s/it] 27%|██▋ | 9262/34278 [10:16:55<27:40:43, 3.98s/it] {'loss': 0.1657, 'grad_norm': 0.9169661997507168, 'learning_rate': 8.561908903494216e-06, 'epoch': 0.27} 27%|██▋ | 9262/34278 [10:16:55<27:40:43, 3.98s/it] 27%|██▋ | 9263/34278 [10:17:01<31:49:13, 4.58s/it] {'loss': 0.1351, 'grad_norm': 0.6697914017206752, 'learning_rate': 8.561577337020217e-06, 'epoch': 0.27} 27%|██▋ | 9263/34278 [10:17:01<31:49:13, 4.58s/it] 27%|██▋ | 9264/34278 [10:17:04<27:51:24, 4.01s/it] {'loss': 0.1519, 'grad_norm': 0.6985498204266574, 'learning_rate': 8.561245738749302e-06, 'epoch': 0.27} 27%|██▋ | 9264/34278 [10:17:04<27:51:24, 4.01s/it] 27%|██▋ | 9265/34278 [10:17:07<26:06:12, 3.76s/it] {'loss': 0.1522, 'grad_norm': 0.6572420921352242, 'learning_rate': 8.560914108684437e-06, 'epoch': 0.27} 27%|██▋ | 9265/34278 [10:17:07<26:06:12, 3.76s/it] 27%|██▋ | 9266/34278 [10:17:11<26:23:10, 3.80s/it] {'loss': 0.1455, 'grad_norm': 0.8025213477402136, 'learning_rate': 8.560582446828582e-06, 'epoch': 0.27} 27%|██▋ | 9266/34278 [10:17:11<26:23:10, 3.80s/it] 27%|██▋ | 9267/34278 [10:17:14<25:26:28, 3.66s/it] {'loss': 0.1438, 'grad_norm': 0.8256942591098151, 
'learning_rate': 8.560250753184695e-06, 'epoch': 0.27} 27%|██▋ | 9267/34278 [10:17:14<25:26:28, 3.66s/it] 27%|██▋ | 9268/34278 [10:17:17<24:38:08, 3.55s/it] {'loss': 0.1566, 'grad_norm': 1.6282283347267548, 'learning_rate': 8.559919027755741e-06, 'epoch': 0.27} 27%|██▋ | 9268/34278 [10:17:17<24:38:08, 3.55s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9359 > 8192). Running this sequence through the model will result in indexing errors 27%|██▋ | 9269/34278 [10:17:21<24:55:26, 3.59s/it] {'loss': 0.152, 'grad_norm': 0.8542265396487017, 'learning_rate': 8.55958727054468e-06, 'epoch': 0.27} 27%|██▋ | 9269/34278 [10:17:21<24:55:26, 3.59s/it] 27%|██▋ | 9270/34278 [10:17:24<24:27:00, 3.52s/it] {'loss': 0.1656, 'grad_norm': 0.8206370962531879, 'learning_rate': 8.559255481554471e-06, 'epoch': 0.27} 27%|██▋ | 9270/34278 [10:17:24<24:27:00, 3.52s/it] 27%|██▋ | 9271/34278 [10:17:27<23:12:19, 3.34s/it] {'loss': 0.1317, 'grad_norm': 0.6080706314194168, 'learning_rate': 8.558923660788081e-06, 'epoch': 0.27} 27%|██▋ | 9271/34278 [10:17:27<23:12:19, 3.34s/it] 27%|██▋ | 9272/34278 [10:17:31<23:34:05, 3.39s/it] {'loss': 0.145, 'grad_norm': 0.8862959009360586, 'learning_rate': 8.558591808248469e-06, 'epoch': 0.27} 27%|██▋ | 9272/34278 [10:17:31<23:34:05, 3.39s/it] 27%|██▋ | 9273/34278 [10:17:34<22:45:17, 3.28s/it] {'loss': 0.1855, 'grad_norm': 0.8417583707498439, 'learning_rate': 8.5582599239386e-06, 'epoch': 0.27} 27%|██▋ | 9273/34278 [10:17:34<22:45:17, 3.28s/it] 27%|██▋ | 9274/34278 [10:17:39<26:10:37, 3.77s/it] {'loss': 0.1513, 'grad_norm': 0.6334134148413211, 'learning_rate': 8.557928007861433e-06, 'epoch': 0.27} 27%|██▋ | 9274/34278 [10:17:39<26:10:37, 3.77s/it] 27%|██▋ | 9275/34278 [10:17:42<25:54:54, 3.73s/it] {'loss': 0.1381, 'grad_norm': 0.7808936451747582, 'learning_rate': 8.557596060019936e-06, 'epoch': 0.27} 27%|██▋ | 9275/34278 [10:17:42<25:54:54, 3.73s/it] 27%|██▋ | 9276/34278 [10:17:47<28:00:28, 4.03s/it] {'loss': 
0.1418, 'grad_norm': 0.8780040742489097, 'learning_rate': 8.557264080417071e-06, 'epoch': 0.27} 27%|██▋ | 9276/34278 [10:17:47<28:00:28, 4.03s/it] 27%|██▋ | 9277/34278 [10:17:50<25:41:49, 3.70s/it] {'loss': 0.1608, 'grad_norm': 0.788857754357467, 'learning_rate': 8.556932069055803e-06, 'epoch': 0.27} 27%|██▋ | 9277/34278 [10:17:50<25:41:49, 3.70s/it] 27%|██▋ | 9278/34278 [10:17:55<27:26:30, 3.95s/it] {'loss': 0.1721, 'grad_norm': 0.8843795580702432, 'learning_rate': 8.556600025939092e-06, 'epoch': 0.27} 27%|██▋ | 9278/34278 [10:17:55<27:26:30, 3.95s/it] 27%|██▋ | 9279/34278 [10:17:58<25:23:07, 3.66s/it] {'loss': 0.1824, 'grad_norm': 1.071660232202477, 'learning_rate': 8.556267951069906e-06, 'epoch': 0.27} 27%|██▋ | 9279/34278 [10:17:58<25:23:07, 3.66s/it] 27%|██▋ | 9280/34278 [10:18:02<27:37:33, 3.98s/it] {'loss': 0.1354, 'grad_norm': 0.880605245956067, 'learning_rate': 8.555935844451209e-06, 'epoch': 0.27} 27%|██▋ | 9280/34278 [10:18:02<27:37:33, 3.98s/it] 27%|██▋ | 9281/34278 [10:18:05<25:35:17, 3.69s/it] {'loss': 0.14, 'grad_norm': 0.7316997735124465, 'learning_rate': 8.555603706085965e-06, 'epoch': 0.27} 27%|██▋ | 9281/34278 [10:18:05<25:35:17, 3.69s/it] 27%|██▋ | 9282/34278 [10:18:11<30:29:29, 4.39s/it] {'loss': 0.1342, 'grad_norm': 1.1402191306652094, 'learning_rate': 8.55527153597714e-06, 'epoch': 0.27} 27%|██▋ | 9282/34278 [10:18:11<30:29:29, 4.39s/it] 27%|██▋ | 9283/34278 [10:18:17<32:53:33, 4.74s/it] {'loss': 0.1573, 'grad_norm': 0.95009564932012, 'learning_rate': 8.5549393341277e-06, 'epoch': 0.27} 27%|██▋ | 9283/34278 [10:18:17<32:53:33, 4.74s/it] 27%|██▋ | 9284/34278 [10:18:20<29:34:27, 4.26s/it] {'loss': 0.1513, 'grad_norm': 0.6920552480734122, 'learning_rate': 8.554607100540609e-06, 'epoch': 0.27} 27%|██▋ | 9284/34278 [10:18:20<29:34:27, 4.26s/it] 27%|██▋ | 9285/34278 [10:18:24<29:45:07, 4.29s/it] {'loss': 0.1697, 'grad_norm': 1.107356065150215, 'learning_rate': 8.554274835218834e-06, 'epoch': 0.27} 27%|██▋ | 9285/34278 [10:18:24<29:45:07, 4.29s/it] 
27%|██▋ | 9286/34278 [10:18:29<30:12:19, 4.35s/it] {'loss': 0.1527, 'grad_norm': 0.8283999436443544, 'learning_rate': 8.553942538165344e-06, 'epoch': 0.27} 27%|██▋ | 9286/34278 [10:18:29<30:12:19, 4.35s/it] 27%|██▋ | 9287/34278 [10:18:32<27:59:05, 4.03s/it] {'loss': 0.1472, 'grad_norm': 0.8953118067974883, 'learning_rate': 8.5536102093831e-06, 'epoch': 0.27} 27%|██▋ | 9287/34278 [10:18:32<27:59:05, 4.03s/it] 27%|██▋ | 9288/34278 [10:18:36<27:39:14, 3.98s/it] {'loss': 0.2096, 'grad_norm': 0.8188603250966501, 'learning_rate': 8.553277848875077e-06, 'epoch': 0.27} 27%|██▋ | 9288/34278 [10:18:36<27:39:14, 3.98s/it] 27%|██▋ | 9289/34278 [10:18:39<26:27:58, 3.81s/it] {'loss': 0.1616, 'grad_norm': 1.0244274232072392, 'learning_rate': 8.552945456644234e-06, 'epoch': 0.27} 27%|██▋ | 9289/34278 [10:18:39<26:27:58, 3.81s/it] 27%|██▋ | 9290/34278 [10:18:43<25:24:28, 3.66s/it] {'loss': 0.1605, 'grad_norm': 0.8385400445991816, 'learning_rate': 8.552613032693545e-06, 'epoch': 0.27} 27%|██▋ | 9290/34278 [10:18:43<25:24:28, 3.66s/it] 27%|██▋ | 9291/34278 [10:18:46<24:10:54, 3.48s/it] {'loss': 0.1561, 'grad_norm': 0.729594948400735, 'learning_rate': 8.552280577025972e-06, 'epoch': 0.27} 27%|██▋ | 9291/34278 [10:18:46<24:10:54, 3.48s/it] 27%|██▋ | 9292/34278 [10:18:49<22:59:32, 3.31s/it] {'loss': 0.1773, 'grad_norm': 0.9424074248978331, 'learning_rate': 8.551948089644487e-06, 'epoch': 0.27} 27%|██▋ | 9292/34278 [10:18:49<22:59:32, 3.31s/it] 27%|██▋ | 9293/34278 [10:18:52<22:54:24, 3.30s/it] {'loss': 0.1766, 'grad_norm': 0.8850560771163402, 'learning_rate': 8.551615570552058e-06, 'epoch': 0.27} 27%|██▋ | 9293/34278 [10:18:52<22:54:24, 3.30s/it] 27%|██▋ | 9294/34278 [10:18:55<23:17:28, 3.36s/it] {'loss': 0.1595, 'grad_norm': 0.7019283025275048, 'learning_rate': 8.551283019751652e-06, 'epoch': 0.27} 27%|██▋ | 9294/34278 [10:18:55<23:17:28, 3.36s/it] 27%|██▋ | 9295/34278 [10:19:01<28:39:33, 4.13s/it] {'loss': 0.2028, 'grad_norm': 0.7876402084548211, 'learning_rate': 
8.550950437246239e-06, 'epoch': 0.27} 27%|██▋ | 9295/34278 [10:19:01<28:39:33, 4.13s/it] 27%|██▋ | 9296/34278 [10:19:04<26:05:14, 3.76s/it] {'loss': 0.2062, 'grad_norm': 0.9726648168405307, 'learning_rate': 8.55061782303879e-06, 'epoch': 0.27} 27%|██▋ | 9296/34278 [10:19:04<26:05:14, 3.76s/it] 27%|██▋ | 9297/34278 [10:19:10<30:52:35, 4.45s/it] {'loss': 0.1621, 'grad_norm': 0.743316445583261, 'learning_rate': 8.550285177132271e-06, 'epoch': 0.27} 27%|██▋ | 9297/34278 [10:19:10<30:52:35, 4.45s/it] 27%|██▋ | 9298/34278 [10:19:14<29:13:05, 4.21s/it] {'loss': 0.1589, 'grad_norm': 0.897330137360762, 'learning_rate': 8.549952499529654e-06, 'epoch': 0.27} 27%|██▋ | 9298/34278 [10:19:14<29:13:05, 4.21s/it] 27%|██▋ | 9299/34278 [10:19:18<29:27:42, 4.25s/it] {'loss': 0.1563, 'grad_norm': 0.8456954463841175, 'learning_rate': 8.54961979023391e-06, 'epoch': 0.27} 27%|██▋ | 9299/34278 [10:19:18<29:27:42, 4.25s/it] 27%|██▋ | 9300/34278 [10:19:21<27:05:07, 3.90s/it] {'loss': 0.1468, 'grad_norm': 0.8642499948240627, 'learning_rate': 8.549287049248006e-06, 'epoch': 0.27} 27%|██▋ | 9300/34278 [10:19:21<27:05:07, 3.90s/it] 27%|██▋ | 9301/34278 [10:19:25<27:09:09, 3.91s/it] {'loss': 0.1511, 'grad_norm': 0.8681786148279045, 'learning_rate': 8.548954276574914e-06, 'epoch': 0.27} 27%|██▋ | 9301/34278 [10:19:25<27:09:09, 3.91s/it] 27%|██▋ | 9302/34278 [10:19:29<25:38:33, 3.70s/it] {'loss': 0.199, 'grad_norm': 0.8314107658375232, 'learning_rate': 8.548621472217606e-06, 'epoch': 0.27} 27%|██▋ | 9302/34278 [10:19:29<25:38:33, 3.70s/it] 27%|██▋ | 9303/34278 [10:19:35<30:23:31, 4.38s/it] {'loss': 0.1512, 'grad_norm': 0.883653189288504, 'learning_rate': 8.548288636179053e-06, 'epoch': 0.27} 27%|██▋ | 9303/34278 [10:19:35<30:23:31, 4.38s/it] 27%|██▋ | 9304/34278 [10:19:40<33:15:32, 4.79s/it] {'loss': 0.1523, 'grad_norm': 0.9456358639701622, 'learning_rate': 8.547955768462226e-06, 'epoch': 0.27} 27%|██▋ | 9304/34278 [10:19:40<33:15:32, 4.79s/it] 27%|██▋ | 9305/34278 [10:19:43<29:42:56, 4.28s/it] 
{'loss': 0.1696, 'grad_norm': 0.7569096835080417, 'learning_rate': 8.547622869070096e-06, 'epoch': 0.27} 27%|██▋ | 9305/34278 [10:19:43<29:42:56, 4.28s/it] 27%|██▋ | 9306/34278 [10:19:47<28:46:30, 4.15s/it] {'loss': 0.1561, 'grad_norm': 1.030717279030035, 'learning_rate': 8.547289938005638e-06, 'epoch': 0.27} 27%|██▋ | 9306/34278 [10:19:47<28:46:30, 4.15s/it] 27%|██▋ | 9307/34278 [10:19:50<26:33:53, 3.83s/it] {'loss': 0.1362, 'grad_norm': 0.9282325955141839, 'learning_rate': 8.54695697527182e-06, 'epoch': 0.27} 27%|██▋ | 9307/34278 [10:19:50<26:33:53, 3.83s/it] 27%|██▋ | 9308/34278 [10:19:54<25:57:13, 3.74s/it] {'loss': 0.1802, 'grad_norm': 1.1271936814291548, 'learning_rate': 8.546623980871617e-06, 'epoch': 0.27} 27%|██▋ | 9308/34278 [10:19:54<25:57:13, 3.74s/it] 27%|██▋ | 9309/34278 [10:19:57<25:18:32, 3.65s/it] {'loss': 0.1555, 'grad_norm': 1.0773611159059693, 'learning_rate': 8.546290954808004e-06, 'epoch': 0.27} 27%|██▋ | 9309/34278 [10:19:57<25:18:32, 3.65s/it] 27%|██▋ | 9310/34278 [10:20:02<28:23:41, 4.09s/it] {'loss': 0.1732, 'grad_norm': 0.8407925582473406, 'learning_rate': 8.54595789708395e-06, 'epoch': 0.27} 27%|██▋ | 9310/34278 [10:20:02<28:23:41, 4.09s/it] 27%|██▋ | 9311/34278 [10:20:06<26:43:49, 3.85s/it] {'loss': 0.1375, 'grad_norm': 0.9216617540670842, 'learning_rate': 8.54562480770243e-06, 'epoch': 0.27} 27%|██▋ | 9311/34278 [10:20:06<26:43:49, 3.85s/it] 27%|██▋ | 9312/34278 [10:20:12<31:11:20, 4.50s/it] {'loss': 0.1558, 'grad_norm': 0.9849609811805082, 'learning_rate': 8.54529168666642e-06, 'epoch': 0.27} 27%|██▋ | 9312/34278 [10:20:12<31:11:20, 4.50s/it] 27%|██▋ | 9313/34278 [10:20:15<29:13:58, 4.22s/it] {'loss': 0.1605, 'grad_norm': 0.8263233447640331, 'learning_rate': 8.544958533978891e-06, 'epoch': 0.27} 27%|██▋ | 9313/34278 [10:20:15<29:13:58, 4.22s/it] 27%|██▋ | 9314/34278 [10:20:18<26:15:43, 3.79s/it] {'loss': 0.1453, 'grad_norm': 0.7228264966155231, 'learning_rate': 8.544625349642818e-06, 'epoch': 0.27} 27%|██▋ | 9314/34278 
[10:20:18<26:15:43, 3.79s/it] 27%|██▋ | 9315/34278 [10:20:23<28:59:57, 4.18s/it] {'loss': 0.1437, 'grad_norm': 0.8717108165395171, 'learning_rate': 8.544292133661178e-06, 'epoch': 0.27} 27%|██▋ | 9315/34278 [10:20:23<28:59:57, 4.18s/it] 27%|██▋ | 9316/34278 [10:20:27<28:03:05, 4.05s/it] {'loss': 0.1644, 'grad_norm': 0.6765522309707471, 'learning_rate': 8.543958886036942e-06, 'epoch': 0.27} 27%|██▋ | 9316/34278 [10:20:27<28:03:05, 4.05s/it] 27%|██▋ | 9317/34278 [10:20:30<25:45:04, 3.71s/it] {'loss': 0.1614, 'grad_norm': 0.9209086600834943, 'learning_rate': 8.543625606773088e-06, 'epoch': 0.27} 27%|██▋ | 9317/34278 [10:20:30<25:45:04, 3.71s/it] 27%|██▋ | 9318/34278 [10:20:36<30:28:38, 4.40s/it] {'loss': 0.1314, 'grad_norm': 0.6893649129641594, 'learning_rate': 8.543292295872591e-06, 'epoch': 0.27} 27%|██▋ | 9318/34278 [10:20:36<30:28:38, 4.40s/it] 27%|██▋ | 9319/34278 [10:20:42<33:56:45, 4.90s/it] {'loss': 0.1766, 'grad_norm': 1.004828097157871, 'learning_rate': 8.542958953338424e-06, 'epoch': 0.27} 27%|██▋ | 9319/34278 [10:20:42<33:56:45, 4.90s/it] 27%|██▋ | 9320/34278 [10:20:45<30:33:41, 4.41s/it] {'loss': 0.1761, 'grad_norm': 0.8221329832359012, 'learning_rate': 8.542625579173567e-06, 'epoch': 0.27} 27%|██▋ | 9320/34278 [10:20:45<30:33:41, 4.41s/it] 27%|██▋ | 9321/34278 [10:20:48<28:09:16, 4.06s/it] {'loss': 0.1485, 'grad_norm': 0.7769165503015246, 'learning_rate': 8.542292173380994e-06, 'epoch': 0.27} 27%|██▋ | 9321/34278 [10:20:48<28:09:16, 4.06s/it] 27%|██▋ | 9322/34278 [10:20:52<28:05:15, 4.05s/it] {'loss': 0.1413, 'grad_norm': 0.8324451091458328, 'learning_rate': 8.541958735963683e-06, 'epoch': 0.27} 27%|██▋ | 9322/34278 [10:20:52<28:05:15, 4.05s/it] 27%|██▋ | 9323/34278 [10:20:56<27:29:54, 3.97s/it] {'loss': 0.1442, 'grad_norm': 0.7786202238810676, 'learning_rate': 8.54162526692461e-06, 'epoch': 0.27} 27%|██▋ | 9323/34278 [10:20:56<27:29:54, 3.97s/it] 27%|██▋ | 9324/34278 [10:21:01<28:35:13, 4.12s/it] {'loss': 0.1439, 'grad_norm': 0.74539655946189, 
'learning_rate': 8.541291766266751e-06, 'epoch': 0.27} 27%|██▋ | 9324/34278 [10:21:01<28:35:13, 4.12s/it] 27%|██▋ | 9325/34278 [10:21:04<26:11:52, 3.78s/it] {'loss': 0.1492, 'grad_norm': 0.7872587596658237, 'learning_rate': 8.540958233993084e-06, 'epoch': 0.27} 27%|██▋ | 9325/34278 [10:21:04<26:11:52, 3.78s/it] 27%|██▋ | 9326/34278 [10:21:07<25:58:54, 3.75s/it] {'loss': 0.1496, 'grad_norm': 0.9161060159698041, 'learning_rate': 8.540624670106587e-06, 'epoch': 0.27} 27%|██▋ | 9326/34278 [10:21:07<25:58:54, 3.75s/it] 27%|██▋ | 9327/34278 [10:21:11<26:16:54, 3.79s/it] {'loss': 0.1731, 'grad_norm': 0.7337597554029629, 'learning_rate': 8.54029107461024e-06, 'epoch': 0.27} 27%|██▋ | 9327/34278 [10:21:11<26:16:54, 3.79s/it] 27%|██▋ | 9328/34278 [10:21:14<25:10:13, 3.63s/it] {'loss': 0.1601, 'grad_norm': 0.9281427391751094, 'learning_rate': 8.539957447507019e-06, 'epoch': 0.27} 27%|██▋ | 9328/34278 [10:21:15<25:10:13, 3.63s/it] 27%|██▋ | 9329/34278 [10:21:18<24:55:32, 3.60s/it] {'loss': 0.1496, 'grad_norm': 0.8651842214497207, 'learning_rate': 8.539623788799903e-06, 'epoch': 0.27} 27%|██▋ | 9329/34278 [10:21:18<24:55:32, 3.60s/it] 27%|██▋ | 9330/34278 [10:21:21<23:42:15, 3.42s/it] {'loss': 0.1468, 'grad_norm': 0.6802650700621498, 'learning_rate': 8.53929009849187e-06, 'epoch': 0.27} 27%|██▋ | 9330/34278 [10:21:21<23:42:15, 3.42s/it] 27%|██▋ | 9331/34278 [10:21:24<22:55:54, 3.31s/it] {'loss': 0.1367, 'grad_norm': 1.1047225626386818, 'learning_rate': 8.5389563765859e-06, 'epoch': 0.27} 27%|██▋ | 9331/34278 [10:21:24<22:55:54, 3.31s/it] 27%|██▋ | 9332/34278 [10:21:30<28:21:29, 4.09s/it] {'loss': 0.1797, 'grad_norm': 0.9881449060735117, 'learning_rate': 8.538622623084973e-06, 'epoch': 0.27} 27%|██▋ | 9332/34278 [10:21:30<28:21:29, 4.09s/it] 27%|██▋ | 9333/34278 [10:21:36<32:09:13, 4.64s/it] {'loss': 0.1803, 'grad_norm': 0.7753754384464604, 'learning_rate': 8.538288837992066e-06, 'epoch': 0.27} 27%|██▋ | 9333/34278 [10:21:36<32:09:13, 
4.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 27%|██▋ | 9334/34278 [10:21:39<28:37:57, 4.13s/it] {'loss': 0.1688, 'grad_norm': 1.1058551155594245, 'learning_rate': 8.537955021310162e-06, 'epoch': 0.27} 27%|██▋ | 9334/34278 [10:21:39<28:37:57, 4.13s/it] 27%|██▋ | 9335/34278 [10:21:42<26:50:05, 3.87s/it] {'loss': 0.1629, 'grad_norm': 0.7792812827582498, 'learning_rate': 8.537621173042241e-06, 'epoch': 0.27} 27%|██▋ | 9335/34278 [10:21:42<26:50:05, 3.87s/it] 27%|██▋ | 9336/34278 [10:21:48<30:35:10, 4.41s/it] {'loss': 0.1583, 'grad_norm': 0.8340503383022526, 'learning_rate': 8.537287293191283e-06, 'epoch': 0.27} 27%|██▋ | 9336/34278 [10:21:48<30:35:10, 4.41s/it] 27%|██▋ | 9337/34278 [10:21:51<27:29:50, 3.97s/it] {'loss': 0.1425, 'grad_norm': 0.7750323342153583, 'learning_rate': 8.536953381760266e-06, 'epoch': 0.27} 27%|██▋ | 9337/34278 [10:21:51<27:29:50, 3.97s/it] 27%|██▋ | 9338/34278 [10:21:54<26:03:19, 3.76s/it] {'loss': 0.1596, 'grad_norm': 0.8793720364119025, 'learning_rate': 8.536619438752176e-06, 'epoch': 0.27} 27%|██▋ | 9338/34278 [10:21:54<26:03:19, 3.76s/it] 27%|██▋ | 9339/34278 [10:21:57<25:06:44, 3.63s/it] {'loss': 0.1577, 'grad_norm': 0.7476810066439837, 'learning_rate': 8.536285464169992e-06, 'epoch': 0.27} 27%|██▋ | 9339/34278 [10:21:57<25:06:44, 3.63s/it] 27%|██▋ | 9340/34278 [10:22:01<25:17:17, 3.65s/it] {'loss': 0.1679, 'grad_norm': 0.7617103675329747, 'learning_rate': 8.535951458016693e-06, 'epoch': 0.27} 27%|██▋ | 9340/34278 [10:22:01<25:17:17, 3.65s/it] 27%|██▋ | 9341/34278 [10:22:05<25:17:03, 3.65s/it] {'loss': 0.1488, 'grad_norm': 0.8757811291725777, 'learning_rate': 8.535617420295267e-06, 'epoch': 0.27} 27%|██▋ | 9341/34278 [10:22:05<25:17:03, 3.65s/it] 27%|██▋ | 9342/34278 [10:22:08<25:10:45, 3.64s/it] {'loss': 0.17, 'grad_norm': 0.9223600901517369, 
'learning_rate': 8.53528335100869e-06, 'epoch': 0.27} 27%|██▋ | 9342/34278 [10:22:08<25:10:45, 3.64s/it] 27%|██▋ | 9343/34278 [10:22:11<24:10:59, 3.49s/it] {'loss': 0.153, 'grad_norm': 0.9502397521295001, 'learning_rate': 8.534949250159947e-06, 'epoch': 0.27} 27%|██▋ | 9343/34278 [10:22:11<24:10:59, 3.49s/it] 27%|██▋ | 9344/34278 [10:22:15<24:20:46, 3.52s/it] {'loss': 0.1478, 'grad_norm': 0.8614990086082593, 'learning_rate': 8.534615117752024e-06, 'epoch': 0.27} 27%|██▋ | 9344/34278 [10:22:15<24:20:46, 3.52s/it] 27%|██▋ | 9345/34278 [10:22:19<25:22:07, 3.66s/it] {'loss': 0.1413, 'grad_norm': 0.7105929316816207, 'learning_rate': 8.534280953787899e-06, 'epoch': 0.27} 27%|██▋ | 9345/34278 [10:22:19<25:22:07, 3.66s/it] 27%|██▋ | 9346/34278 [10:22:22<24:14:11, 3.50s/it] {'loss': 0.1684, 'grad_norm': 1.1303127016511665, 'learning_rate': 8.533946758270556e-06, 'epoch': 0.27} 27%|██▋ | 9346/34278 [10:22:22<24:14:11, 3.50s/it] 27%|██▋ | 9347/34278 [10:22:26<24:55:14, 3.60s/it] {'loss': 0.1393, 'grad_norm': 0.7069785066889098, 'learning_rate': 8.533612531202981e-06, 'epoch': 0.27} 27%|██▋ | 9347/34278 [10:22:26<24:55:14, 3.60s/it] 27%|██▋ | 9348/34278 [10:22:32<30:00:08, 4.33s/it] {'loss': 0.1656, 'grad_norm': 0.7996427312250106, 'learning_rate': 8.533278272588159e-06, 'epoch': 0.27} 27%|██▋ | 9348/34278 [10:22:32<30:00:08, 4.33s/it] 27%|██▋ | 9349/34278 [10:22:35<27:33:32, 3.98s/it] {'loss': 0.1747, 'grad_norm': 1.0170876726172817, 'learning_rate': 8.53294398242907e-06, 'epoch': 0.27} 27%|██▋ | 9349/34278 [10:22:35<27:33:32, 3.98s/it] 27%|██▋ | 9350/34278 [10:22:41<31:36:13, 4.56s/it] {'loss': 0.1608, 'grad_norm': 0.7867301607587156, 'learning_rate': 8.5326096607287e-06, 'epoch': 0.27} 27%|██▋ | 9350/34278 [10:22:41<31:36:13, 4.56s/it] 27%|██▋ | 9351/34278 [10:22:45<29:12:56, 4.22s/it] {'loss': 0.1786, 'grad_norm': 0.8138361970107523, 'learning_rate': 8.532275307490034e-06, 'epoch': 0.27} 27%|██▋ | 9351/34278 [10:22:45<29:12:56, 4.22s/it] 27%|██▋ | 9352/34278 
[10:22:50<31:09:46, 4.50s/it] {'loss': 0.1555, 'grad_norm': 0.899526759375119, 'learning_rate': 8.531940922716058e-06, 'epoch': 0.27} 27%|██▋ | 9352/34278 [10:22:50<31:09:46, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 27%|██▋ | 9353/34278 [10:22:54<30:26:41, 4.40s/it] {'loss': 0.1737, 'grad_norm': 0.8441496100724186, 'learning_rate': 8.531606506409757e-06, 'epoch': 0.27} 27%|██▋ | 9353/34278 [10:22:54<30:26:41, 4.40s/it] 27%|██▋ | 9354/34278 [10:22:57<28:07:40, 4.06s/it] {'loss': 0.1447, 'grad_norm': 0.8824631930854943, 'learning_rate': 8.531272058574116e-06, 'epoch': 0.27} 27%|██▋ | 9354/34278 [10:22:57<28:07:40, 4.06s/it] 27%|██▋ | 9355/34278 [10:23:02<29:56:49, 4.33s/it] {'loss': 0.169, 'grad_norm': 0.9767803676552924, 'learning_rate': 8.530937579212122e-06, 'epoch': 0.27} 27%|██▋ | 9355/34278 [10:23:02<29:56:49, 4.33s/it] 27%|██▋ | 9356/34278 [10:23:05<26:50:53, 3.88s/it] {'loss': 0.168, 'grad_norm': 1.5006568498457795, 'learning_rate': 8.530603068326759e-06, 'epoch': 0.27} 27%|██▋ | 9356/34278 [10:23:05<26:50:53, 3.88s/it] 27%|██▋ | 9357/34278 [10:23:08<25:31:23, 3.69s/it] {'loss': 0.1273, 'grad_norm': 0.8441209861961562, 'learning_rate': 8.530268525921015e-06, 'epoch': 0.27} 27%|██▋ | 9357/34278 [10:23:08<25:31:23, 3.69s/it] 27%|██▋ | 9358/34278 [10:23:12<25:52:04, 3.74s/it] {'loss': 0.1558, 'grad_norm': 1.0782498860134957, 'learning_rate': 8.529933951997875e-06, 'epoch': 0.27} 27%|██▋ | 9358/34278 [10:23:12<25:52:04, 3.74s/it] 27%|██▋ | 9359/34278 [10:23:15<24:29:32, 3.54s/it] {'loss': 0.1514, 'grad_norm': 0.655681288690383, 'learning_rate': 8.52959934656033e-06, 'epoch': 0.27} 27%|██▋ | 9359/34278 [10:23:15<24:29:32, 3.54s/it] 27%|██▋ | 9360/34278 [10:23:18<24:04:54, 3.48s/it] {'loss': 0.177, 'grad_norm': 1.041019869902648, 'learning_rate': 8.529264709611362e-06, 'epoch': 
0.27} 27%|██▋ | 9360/34278 [10:23:18<24:04:54, 3.48s/it] 27%|██▋ | 9361/34278 [10:23:24<28:43:05, 4.15s/it] {'loss': 0.1569, 'grad_norm': 0.8222929072925065, 'learning_rate': 8.528930041153962e-06, 'epoch': 0.27} 27%|██▋ | 9361/34278 [10:23:24<28:43:05, 4.15s/it] 27%|██▋ | 9362/34278 [10:23:28<27:55:53, 4.04s/it] {'loss': 0.149, 'grad_norm': 0.6549911906259803, 'learning_rate': 8.528595341191117e-06, 'epoch': 0.27} 27%|██▋ | 9362/34278 [10:23:28<27:55:53, 4.04s/it] 27%|██▋ | 9363/34278 [10:23:31<25:31:58, 3.69s/it] {'loss': 0.159, 'grad_norm': 0.9660138123216552, 'learning_rate': 8.528260609725816e-06, 'epoch': 0.27} 27%|██▋ | 9363/34278 [10:23:31<25:31:58, 3.69s/it] 27%|██▋ | 9364/34278 [10:23:34<25:01:08, 3.62s/it] {'loss': 0.1625, 'grad_norm': 0.8154555060234866, 'learning_rate': 8.527925846761046e-06, 'epoch': 0.27} 27%|██▋ | 9364/34278 [10:23:34<25:01:08, 3.62s/it] 27%|██▋ | 9365/34278 [10:23:38<25:45:10, 3.72s/it] {'loss': 0.1676, 'grad_norm': 0.783703359651396, 'learning_rate': 8.527591052299797e-06, 'epoch': 0.27} 27%|██▋ | 9365/34278 [10:23:38<25:45:10, 3.72s/it] 27%|██▋ | 9366/34278 [10:23:41<24:31:47, 3.54s/it] {'loss': 0.1566, 'grad_norm': 0.9158468762553383, 'learning_rate': 8.527256226345056e-06, 'epoch': 0.27} 27%|██▋ | 9366/34278 [10:23:41<24:31:47, 3.54s/it] 27%|██▋ | 9367/34278 [10:23:44<23:24:52, 3.38s/it] {'loss': 0.1441, 'grad_norm': 0.9611408206840761, 'learning_rate': 8.526921368899815e-06, 'epoch': 0.27} 27%|██▋ | 9367/34278 [10:23:44<23:24:52, 3.38s/it] 27%|██▋ | 9368/34278 [10:23:48<23:10:25, 3.35s/it] {'loss': 0.1453, 'grad_norm': 0.8413751222970743, 'learning_rate': 8.52658647996706e-06, 'epoch': 0.27} 27%|██▋ | 9368/34278 [10:23:48<23:10:25, 3.35s/it] 27%|██▋ | 9369/34278 [10:23:52<24:24:22, 3.53s/it] {'loss': 0.1636, 'grad_norm': 0.9248861010144814, 'learning_rate': 8.526251559549783e-06, 'epoch': 0.27} 27%|██▋ | 9369/34278 [10:23:52<24:24:22, 3.53s/it] 27%|██▋ | 9370/34278 [10:23:54<22:53:55, 3.31s/it] {'loss': 0.1564, 'grad_norm': 
0.9136868728947342, 'learning_rate': 8.525916607650975e-06, 'epoch': 0.27} 27%|██▋ | 9370/34278 [10:23:54<22:53:55, 3.31s/it] 27%|██▋ | 9371/34278 [10:23:57<22:37:09, 3.27s/it] {'loss': 0.1588, 'grad_norm': 0.9328527593817365, 'learning_rate': 8.525581624273624e-06, 'epoch': 0.27} 27%|██▋ | 9371/34278 [10:23:57<22:37:09, 3.27s/it] 27%|██▋ | 9372/34278 [10:24:00<22:03:58, 3.19s/it] {'loss': 0.1625, 'grad_norm': 0.9736891820902451, 'learning_rate': 8.525246609420724e-06, 'epoch': 0.27} 27%|██▋ | 9372/34278 [10:24:01<22:03:58, 3.19s/it] 27%|██▋ | 9373/34278 [10:24:06<26:43:29, 3.86s/it] {'loss': 0.161, 'grad_norm': 0.9497070190972525, 'learning_rate': 8.524911563095262e-06, 'epoch': 0.27} 27%|██▋ | 9373/34278 [10:24:06<26:43:29, 3.86s/it] 27%|██▋ | 9374/34278 [10:24:09<25:42:29, 3.72s/it] {'loss': 0.144, 'grad_norm': 0.927790040062506, 'learning_rate': 8.524576485300231e-06, 'epoch': 0.27} 27%|██▋ | 9374/34278 [10:24:09<25:42:29, 3.72s/it] 27%|██▋ | 9375/34278 [10:24:13<24:40:06, 3.57s/it] {'loss': 0.1349, 'grad_norm': 0.7191343826786696, 'learning_rate': 8.524241376038623e-06, 'epoch': 0.27} 27%|██▋ | 9375/34278 [10:24:13<24:40:06, 3.57s/it] 27%|██▋ | 9376/34278 [10:24:16<25:30:48, 3.69s/it] {'loss': 0.1508, 'grad_norm': 0.944773784415641, 'learning_rate': 8.523906235313428e-06, 'epoch': 0.27} 27%|██▋ | 9376/34278 [10:24:16<25:30:48, 3.69s/it] 27%|██▋ | 9377/34278 [10:24:23<31:02:03, 4.49s/it] {'loss': 0.1475, 'grad_norm': 0.7617543791306891, 'learning_rate': 8.52357106312764e-06, 'epoch': 0.27} 27%|██▋ | 9377/34278 [10:24:23<31:02:03, 4.49s/it] 27%|██▋ | 9378/34278 [10:24:27<29:32:54, 4.27s/it] {'loss': 0.1479, 'grad_norm': 0.8210786254523115, 'learning_rate': 8.523235859484253e-06, 'epoch': 0.27} 27%|██▋ | 9378/34278 [10:24:27<29:32:54, 4.27s/it] 27%|██▋ | 9379/34278 [10:24:31<30:29:15, 4.41s/it] {'loss': 0.1673, 'grad_norm': 0.9006378895303033, 'learning_rate': 8.522900624386254e-06, 'epoch': 0.27} 27%|██▋ | 9379/34278 [10:24:31<30:29:15, 4.41s/it] 27%|██▋ | 
[Training log, steps 9380–9531 of 34278, epoch 0.27–0.28. Duplicate tqdm progress-bar echoes collapsed; routine per-step records elided except around warnings and errors. Over this span, loss ranged 0.12–0.20, grad_norm 0.59–1.95, and the learning rate decayed from 8.5226e-06 to 8.4719e-06 at roughly 3.1–5.3 s/it.]

27%|██▋ | 9380/34278 [10:24:37<33:21:12, 4.82s/it] {'loss': 0.1475, 'grad_norm': 0.7785757867311436, 'learning_rate': 8.522565357836642e-06, 'epoch': 0.27}
...
27%|██▋ | 9393/34278 [10:25:30<26:06:31, 3.78s/it] {'loss': 0.1308, 'grad_norm': 1.0161531778115314, 'learning_rate': 8.518204031956913e-06, 'epoch': 0.27}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[The same UserWarning recurred after steps 9411, 9441, 9464, 9482, 9486, 9490, and 9521.]
...
27%|██▋ | 9415/34278 [10:26:54<24:49:47, 3.60s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8dcef0f240>
Failed to fetch sample 3641702. Exception: cannot identify image file <_io.BytesIO object at 0x7f8dcef0f240>
[Identical tracebacks recurred for sample 3657300 (after step 9447), sample 2691939 (after step 9493), and sample 3058469 (after step 9504); training continued past each failure.]
...
27%|██▋ | 9421/34278 [10:27:15<23:04:04, 3.34s/it]
Token indices sequence length is longer than the specified maximum sequence length for this model (10483 > 8192). Running this sequence through the model will result in indexing errors
[The same tokenizer warning recurred after step 9514 (9469 > 8192).]
...
28%|██▊ | 9527/34278 [10:33:52<24:20:54, 3.54s/it] {'loss': 0.1428, 'grad_norm': 0.7297864166420638, 'learning_rate': 8.472940667025497e-06, 'epoch': 0.28}
28%|██▊ | 9528/34278 [10:33:55<23:38:07, 3.44s/it] {'loss': 0.1531, 'grad_norm': 0.91214494122196, 'learning_rate': 8.472600778790709e-06, 'epoch': 0.28}
28%|██▊ | 9529/34278 [10:33:59<25:02:32, 3.64s/it] {'loss': 0.1624, 'grad_norm': 0.8338650660903385, 'learning_rate': 8.472260859553369e-06, 'epoch': 0.28}
28%|██▊ | 9530/34278 [10:34:02<23:52:02, 3.47s/it] {'loss': 0.1509, 'grad_norm': 0.8308162285551156, 'learning_rate': 8.471920909316514e-06, 'epoch': 0.28}
28%|██▊ | 9531/34278 [10:34:05<21:53:05, 3.18s/it] {'loss': 0.1563, 'grad_norm': 0.8998302888977185,
'learning_rate': 8.47158092808318e-06, 'epoch': 0.28} 28%|██▊ | 9531/34278 [10:34:05<21:53:05, 3.18s/it] 28%|██▊ | 9532/34278 [10:34:08<22:20:27, 3.25s/it] {'loss': 0.1407, 'grad_norm': 0.9354212032607615, 'learning_rate': 8.471240915856396e-06, 'epoch': 0.28} 28%|██▊ | 9532/34278 [10:34:08<22:20:27, 3.25s/it] 28%|██▊ | 9533/34278 [10:34:12<22:47:36, 3.32s/it] {'loss': 0.1387, 'grad_norm': 0.7051327334938483, 'learning_rate': 8.470900872639203e-06, 'epoch': 0.28} 28%|██▊ | 9533/34278 [10:34:12<22:47:36, 3.32s/it] 28%|██▊ | 9534/34278 [10:34:15<22:44:47, 3.31s/it] {'loss': 0.174, 'grad_norm': 0.8094406200554851, 'learning_rate': 8.470560798434636e-06, 'epoch': 0.28} 28%|██▊ | 9534/34278 [10:34:15<22:44:47, 3.31s/it] 28%|██▊ | 9535/34278 [10:34:18<22:07:19, 3.22s/it] {'loss': 0.175, 'grad_norm': 0.9467595326622934, 'learning_rate': 8.47022069324573e-06, 'epoch': 0.28} 28%|██▊ | 9535/34278 [10:34:18<22:07:19, 3.22s/it] 28%|██▊ | 9536/34278 [10:34:21<22:10:36, 3.23s/it] {'loss': 0.174, 'grad_norm': 0.8008647499990702, 'learning_rate': 8.469880557075525e-06, 'epoch': 0.28} 28%|██▊ | 9536/34278 [10:34:21<22:10:36, 3.23s/it] 28%|██▊ | 9537/34278 [10:34:25<22:59:26, 3.35s/it] {'loss': 0.1502, 'grad_norm': 0.9201909051532778, 'learning_rate': 8.469540389927052e-06, 'epoch': 0.28} 28%|██▊ | 9537/34278 [10:34:25<22:59:26, 3.35s/it] 28%|██▊ | 9538/34278 [10:34:30<27:43:58, 4.04s/it] {'loss': 0.1471, 'grad_norm': 0.7352171064832371, 'learning_rate': 8.46920019180335e-06, 'epoch': 0.28} 28%|██▊ | 9538/34278 [10:34:30<27:43:58, 4.04s/it] 28%|██▊ | 9539/34278 [10:34:34<26:25:06, 3.84s/it] {'loss': 0.1441, 'grad_norm': 0.9244963989311303, 'learning_rate': 8.468859962707459e-06, 'epoch': 0.28} 28%|██▊ | 9539/34278 [10:34:34<26:25:06, 3.84s/it] 28%|██▊ | 9540/34278 [10:34:37<25:03:13, 3.65s/it] {'loss': 0.152, 'grad_norm': 0.839838171042912, 'learning_rate': 8.468519702642413e-06, 'epoch': 0.28} 28%|██▊ | 9540/34278 [10:34:37<25:03:13, 3.65s/it] 28%|██▊ | 9541/34278 
[10:34:40<23:57:48, 3.49s/it] {'loss': 0.1429, 'grad_norm': 0.7189624686847728, 'learning_rate': 8.468179411611252e-06, 'epoch': 0.28} 28%|██▊ | 9541/34278 [10:34:40<23:57:48, 3.49s/it] 28%|██▊ | 9542/34278 [10:34:44<24:17:04, 3.53s/it] {'loss': 0.1433, 'grad_norm': 0.8024451840621278, 'learning_rate': 8.467839089617011e-06, 'epoch': 0.28} 28%|██▊ | 9542/34278 [10:34:44<24:17:04, 3.53s/it] 28%|██▊ | 9543/34278 [10:34:50<30:02:46, 4.37s/it] {'loss': 0.1524, 'grad_norm': 0.9133265010862672, 'learning_rate': 8.467498736662732e-06, 'epoch': 0.28} 28%|██▊ | 9543/34278 [10:34:50<30:02:46, 4.37s/it] 28%|██▊ | 9544/34278 [10:34:53<27:27:26, 4.00s/it] {'loss': 0.1634, 'grad_norm': 0.7505854001962619, 'learning_rate': 8.467158352751453e-06, 'epoch': 0.28} 28%|██▊ | 9544/34278 [10:34:53<27:27:26, 4.00s/it] 28%|██▊ | 9545/34278 [10:34:57<27:46:12, 4.04s/it] {'loss': 0.1807, 'grad_norm': 0.8928417787444326, 'learning_rate': 8.466817937886211e-06, 'epoch': 0.28} 28%|██▊ | 9545/34278 [10:34:57<27:46:12, 4.04s/it] 28%|██▊ | 9546/34278 [10:35:00<25:57:22, 3.78s/it] {'loss': 0.1535, 'grad_norm': 1.0277759705271292, 'learning_rate': 8.466477492070046e-06, 'epoch': 0.28} 28%|██▊ | 9546/34278 [10:35:00<25:57:22, 3.78s/it] 28%|██▊ | 9547/34278 [10:35:05<27:12:02, 3.96s/it] {'loss': 0.1444, 'grad_norm': 0.7637001525940128, 'learning_rate': 8.466137015305997e-06, 'epoch': 0.28} 28%|██▊ | 9547/34278 [10:35:05<27:12:02, 3.96s/it] 28%|██▊ | 9548/34278 [10:35:10<30:09:51, 4.39s/it] {'loss': 0.1741, 'grad_norm': 0.7501390401123026, 'learning_rate': 8.465796507597106e-06, 'epoch': 0.28} 28%|██▊ | 9548/34278 [10:35:10<30:09:51, 4.39s/it] 28%|██▊ | 9549/34278 [10:35:13<27:25:50, 3.99s/it] {'loss': 0.1468, 'grad_norm': 0.9450607457332094, 'learning_rate': 8.465455968946409e-06, 'epoch': 0.28} 28%|██▊ | 9549/34278 [10:35:13<27:25:50, 3.99s/it] 28%|██▊ | 9550/34278 [10:35:19<30:13:45, 4.40s/it] {'loss': 0.1581, 'grad_norm': 0.8179128995100446, 'learning_rate': 8.465115399356948e-06, 'epoch': 0.28} 
28%|██▊ | 9550/34278 [10:35:19<30:13:45, 4.40s/it] 28%|██▊ | 9551/34278 [10:35:25<34:26:27, 5.01s/it] {'loss': 0.1693, 'grad_norm': 0.8206187225207779, 'learning_rate': 8.464774798831766e-06, 'epoch': 0.28} 28%|██▊ | 9551/34278 [10:35:25<34:26:27, 5.01s/it] 28%|██▊ | 9552/34278 [10:35:28<29:52:39, 4.35s/it] {'loss': 0.1726, 'grad_norm': 1.0282827859906551, 'learning_rate': 8.464434167373901e-06, 'epoch': 0.28} 28%|██▊ | 9552/34278 [10:35:28<29:52:39, 4.35s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa5ba189670>
Failed to fetch sample 3412263.
Exception: cannot identify image file <_io.BytesIO object at 0x7fa5ba189670> 28%|██▊ | 9553/34278 [10:35:31<28:01:04, 4.08s/it] {'loss': 0.1881, 'grad_norm': 0.8731238924757797, 'learning_rate': 8.464093504986395e-06, 'epoch': 0.28} 28%|██▊ | 9553/34278 [10:35:31<28:01:04, 4.08s/it] 28%|██▊ | 9554/34278 [10:35:34<26:05:37, 3.80s/it] {'loss': 0.1401, 'grad_norm': 0.9374981958347302, 'learning_rate': 8.463752811672289e-06, 'epoch': 0.28} 28%|██▊ | 9554/34278 [10:35:34<26:05:37, 3.80s/it] 28%|██▊ | 9555/34278 [10:35:40<30:29:24, 4.44s/it] {'loss': 0.1391, 'grad_norm': 0.7063176768742483, 'learning_rate': 8.463412087434624e-06, 'epoch': 0.28} 28%|██▊ | 9555/34278 [10:35:40<30:29:24, 4.44s/it] 28%|██▊ | 9556/34278 [10:35:46<33:23:08, 4.86s/it] {'loss': 0.1707, 'grad_norm': 0.8843815844052086, 'learning_rate': 8.463071332276442e-06, 'epoch': 0.28} 28%|██▊ | 9556/34278 [10:35:46<33:23:08, 4.86s/it] 28%|██▊ | 9557/34278 [10:35:50<30:19:48, 4.42s/it] {'loss': 0.1432, 'grad_norm': 1.0458012025569847, 'learning_rate': 8.462730546200788e-06, 'epoch': 0.28} 28%|██▊ | 9557/34278 [10:35:50<30:19:48, 4.42s/it] 28%|██▊ | 9558/34278 [10:35:56<34:31:28, 5.03s/it] {'loss': 0.1602, 'grad_norm': 0.8036876244199916, 'learning_rate': 8.4623897292107e-06, 'epoch': 0.28} 28%|██▊ | 9558/34278 [10:35:56<34:31:28, 5.03s/it] 28%|██▊ | 9559/34278 [10:35:59<30:01:27, 4.37s/it] {'loss': 0.1565, 'grad_norm': 0.9752595131467424, 'learning_rate': 8.462048881309226e-06, 'epoch': 0.28} 28%|██▊ | 9559/34278 [10:35:59<30:01:27, 4.37s/it] 28%|██▊ | 9560/34278 [10:36:02<27:54:59, 4.07s/it] {'loss': 0.1741, 'grad_norm': 1.0282166667261705, 'learning_rate': 8.461708002499405e-06, 'epoch': 0.28} 28%|██▊ | 9560/34278 [10:36:02<27:54:59, 4.07s/it] 28%|██▊ | 9561/34278 [10:36:06<27:19:36, 3.98s/it] {'loss': 0.1444, 'grad_norm': 0.8807169214872331, 'learning_rate': 8.46136709278428e-06, 'epoch': 0.28} 28%|██▊ | 9561/34278 [10:36:06<27:19:36, 3.98s/it] 28%|██▊ | 9562/34278 [10:36:09<25:32:08, 3.72s/it] {'loss': 
0.1609, 'grad_norm': 0.8867295328036336, 'learning_rate': 8.461026152166896e-06, 'epoch': 0.28} 28%|██▊ | 9562/34278 [10:36:09<25:32:08, 3.72s/it] 28%|██▊ | 9563/34278 [10:36:15<30:04:36, 4.38s/it] {'loss': 0.1756, 'grad_norm': 1.0239980761347185, 'learning_rate': 8.460685180650297e-06, 'epoch': 0.28} 28%|██▊ | 9563/34278 [10:36:15<30:04:36, 4.38s/it] 28%|██▊ | 9564/34278 [10:36:18<26:55:02, 3.92s/it] {'loss': 0.133, 'grad_norm': 0.8283350317446394, 'learning_rate': 8.460344178237528e-06, 'epoch': 0.28} 28%|██▊ | 9564/34278 [10:36:18<26:55:02, 3.92s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10836 > 8192). Running this sequence through the model will result in indexing errors 28%|██▊ | 9565/34278 [10:36:22<27:09:25, 3.96s/it] {'loss': 0.1457, 'grad_norm': 0.9790011158432254, 'learning_rate': 8.460003144931632e-06, 'epoch': 0.28} 28%|██▊ | 9565/34278 [10:36:22<27:09:25, 3.96s/it] 28%|██▊ | 9566/34278 [10:36:25<25:02:45, 3.65s/it] {'loss': 0.1542, 'grad_norm': 0.9965715972386453, 'learning_rate': 8.459662080735653e-06, 'epoch': 0.28} 28%|██▊ | 9566/34278 [10:36:25<25:02:45, 3.65s/it] 28%|██▊ | 9567/34278 [10:36:31<29:52:14, 4.35s/it] {'loss': 0.1477, 'grad_norm': 0.7819092938540751, 'learning_rate': 8.459320985652635e-06, 'epoch': 0.28} 28%|██▊ | 9567/34278 [10:36:31<29:52:14, 4.35s/it] 28%|██▊ | 9568/34278 [10:36:37<34:18:06, 5.00s/it] {'loss': 0.1371, 'grad_norm': 0.8493448943608398, 'learning_rate': 8.458979859685628e-06, 'epoch': 0.28} 28%|██▊ | 9568/34278 [10:36:37<34:18:06, 5.00s/it] 28%|██▊ | 9569/34278 [10:36:41<30:23:51, 4.43s/it] {'loss': 0.1449, 'grad_norm': 1.0223759261181469, 'learning_rate': 8.458638702837673e-06, 'epoch': 0.28} 28%|██▊ | 9569/34278 [10:36:41<30:23:51, 4.43s/it] 28%|██▊ | 9570/34278 [10:36:46<32:12:52, 4.69s/it] {'loss': 0.144, 'grad_norm': 0.9055193244735283, 'learning_rate': 8.45829751511182e-06, 'epoch': 0.28} 28%|██▊ | 9570/34278 [10:36:46<32:12:52, 
4.69s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 28%|██▊ | 9571/34278 [10:36:49<28:03:14, 4.09s/it] {'loss': 0.1478, 'grad_norm': 0.9511476417475678, 'learning_rate': 8.457956296511109e-06, 'epoch': 0.28} 28%|██▊ | 9571/34278 [10:36:49<28:03:14, 4.09s/it] 28%|██▊ | 9572/34278 [10:36:51<25:42:35, 3.75s/it] {'loss': 0.1505, 'grad_norm': 0.9198111942266042, 'learning_rate': 8.457615047038592e-06, 'epoch': 0.28} 28%|██▊ | 9572/34278 [10:36:51<25:42:35, 3.75s/it] 28%|██▊ | 9573/34278 [10:36:55<25:42:30, 3.75s/it] {'loss': 0.1462, 'grad_norm': 0.698641892426292, 'learning_rate': 8.45727376669731e-06, 'epoch': 0.28} 28%|██▊ | 9573/34278 [10:36:55<25:42:30, 3.75s/it] 28%|██▊ | 9574/34278 [10:36:58<24:12:21, 3.53s/it] {'loss': 0.136, 'grad_norm': 0.9362354068875892, 'learning_rate': 8.456932455490317e-06, 'epoch': 0.28} 28%|██▊ | 9574/34278 [10:36:58<24:12:21, 3.53s/it] 28%|██▊ | 9575/34278 [10:37:02<25:17:58, 3.69s/it] {'loss': 0.161, 'grad_norm': 0.982646526197458, 'learning_rate': 8.456591113420656e-06, 'epoch': 0.28} 28%|██▊ | 9575/34278 [10:37:02<25:17:58, 3.69s/it] 28%|██▊ | 9576/34278 [10:37:06<24:24:25, 3.56s/it] {'loss': 0.1616, 'grad_norm': 0.7905049599781792, 'learning_rate': 8.45624974049137e-06, 'epoch': 0.28} 28%|██▊ | 9576/34278 [10:37:06<24:24:25, 3.56s/it] 28%|██▊ | 9577/34278 [10:37:08<23:10:54, 3.38s/it] {'loss': 0.1476, 'grad_norm': 0.7745858645449902, 'learning_rate': 8.455908336705515e-06, 'epoch': 0.28} 28%|██▊ | 9577/34278 [10:37:08<23:10:54, 3.38s/it] 28%|██▊ | 9578/34278 [10:37:12<24:13:18, 3.53s/it] {'loss': 0.1436, 'grad_norm': 0.941160734241985, 'learning_rate': 8.455566902066138e-06, 'epoch': 0.28} 28%|██▊ | 9578/34278 [10:37:12<24:13:18, 3.53s/it] 28%|██▊ | 9579/34278 [10:37:16<24:21:30, 3.55s/it] {'loss': 0.1519, 'grad_norm': 0.9515311332832856, 'learning_rate': 
8.45522543657628e-06, 'epoch': 0.28} 28%|██▊ | 9579/34278 [10:37:16<24:21:30, 3.55s/it] 28%|██▊ | 9580/34278 [10:37:19<23:07:12, 3.37s/it] {'loss': 0.1649, 'grad_norm': 0.7975605752727278, 'learning_rate': 8.454883940238995e-06, 'epoch': 0.28} 28%|██▊ | 9580/34278 [10:37:19<23:07:12, 3.37s/it] 28%|██▊ | 9581/34278 [10:37:23<25:01:54, 3.65s/it] {'loss': 0.1767, 'grad_norm': 0.9336212618189144, 'learning_rate': 8.454542413057335e-06, 'epoch': 0.28} 28%|██▊ | 9581/34278 [10:37:23<25:01:54, 3.65s/it] 28%|██▊ | 9582/34278 [10:37:26<23:36:59, 3.44s/it] {'loss': 0.1596, 'grad_norm': 1.0130532601655777, 'learning_rate': 8.45420085503434e-06, 'epoch': 0.28} 28%|██▊ | 9582/34278 [10:37:26<23:36:59, 3.44s/it] 28%|██▊ | 9583/34278 [10:37:30<23:28:42, 3.42s/it] {'loss': 0.1501, 'grad_norm': 1.213368749096683, 'learning_rate': 8.453859266173065e-06, 'epoch': 0.28} 28%|██▊ | 9583/34278 [10:37:30<23:28:42, 3.42s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10474 > 8192). 
Running this sequence through the model will result in indexing errors 28%|██▊ | 9584/34278 [10:37:33<22:48:44, 3.33s/it] {'loss': 0.1771, 'grad_norm': 0.8912051490678606, 'learning_rate': 8.453517646476561e-06, 'epoch': 0.28} 28%|██▊ | 9584/34278 [10:37:33<22:48:44, 3.33s/it] 28%|██▊ | 9585/34278 [10:37:37<25:43:59, 3.75s/it] {'loss': 0.1435, 'grad_norm': 1.0348282245629992, 'learning_rate': 8.453175995947876e-06, 'epoch': 0.28} 28%|██▊ | 9585/34278 [10:37:37<25:43:59, 3.75s/it] 28%|██▊ | 9586/34278 [10:37:40<23:43:15, 3.46s/it] {'loss': 0.1538, 'grad_norm': 0.9865785864347076, 'learning_rate': 8.452834314590059e-06, 'epoch': 0.28} 28%|██▊ | 9586/34278 [10:37:40<23:43:15, 3.46s/it] 28%|██▊ | 9587/34278 [10:37:44<24:04:26, 3.51s/it] {'loss': 0.1444, 'grad_norm': 1.4387595409228264, 'learning_rate': 8.452492602406162e-06, 'epoch': 0.28} 28%|██▊ | 9587/34278 [10:37:44<24:04:26, 3.51s/it] 28%|██▊ | 9588/34278 [10:37:47<24:02:10, 3.50s/it] {'loss': 0.1581, 'grad_norm': 0.9294671559167269, 'learning_rate': 8.452150859399234e-06, 'epoch': 0.28} 28%|██▊ | 9588/34278 [10:37:47<24:02:10, 3.50s/it] 28%|██▊ | 9589/34278 [10:37:53<29:14:05, 4.26s/it] {'loss': 0.1433, 'grad_norm': 0.9480256468071755, 'learning_rate': 8.451809085572327e-06, 'epoch': 0.28} 28%|██▊ | 9589/34278 [10:37:53<29:14:05, 4.26s/it] 28%|██▊ | 9590/34278 [10:37:56<26:34:05, 3.87s/it] {'loss': 0.1634, 'grad_norm': 0.8572738889356397, 'learning_rate': 8.451467280928494e-06, 'epoch': 0.28} 28%|██▊ | 9590/34278 [10:37:56<26:34:05, 3.87s/it] 28%|██▊ | 9591/34278 [10:38:00<25:29:03, 3.72s/it] {'loss': 0.1709, 'grad_norm': 0.8235818251921146, 'learning_rate': 8.451125445470784e-06, 'epoch': 0.28} 28%|██▊ | 9591/34278 [10:38:00<25:29:03, 3.72s/it] 28%|██▊ | 9592/34278 [10:38:03<23:48:46, 3.47s/it] {'loss': 0.1835, 'grad_norm': 0.9078272402439236, 'learning_rate': 8.450783579202251e-06, 'epoch': 0.28} 28%|██▊ | 9592/34278 [10:38:03<23:48:46, 3.47s/it] 28%|██▊ | 9593/34278 [10:38:06<22:48:25, 3.33s/it] {'loss': 
0.1569, 'grad_norm': 0.8456187301304922, 'learning_rate': 8.450441682125944e-06, 'epoch': 0.28} 28%|██▊ | 9593/34278 [10:38:06<22:48:25, 3.33s/it] 28%|██▊ | 9594/34278 [10:38:10<24:22:41, 3.56s/it] {'loss': 0.176, 'grad_norm': 0.8309217526061093, 'learning_rate': 8.45009975424492e-06, 'epoch': 0.28} 28%|██▊ | 9594/34278 [10:38:10<24:22:41, 3.56s/it] 28%|██▊ | 9595/34278 [10:38:12<22:47:30, 3.32s/it] {'loss': 0.1537, 'grad_norm': 0.9969650659314814, 'learning_rate': 8.449757795562229e-06, 'epoch': 0.28} 28%|██▊ | 9595/34278 [10:38:12<22:47:30, 3.32s/it] 28%|██▊ | 9596/34278 [10:38:15<22:08:31, 3.23s/it] {'loss': 0.1211, 'grad_norm': 0.6570499741744189, 'learning_rate': 8.44941580608092e-06, 'epoch': 0.28} 28%|██▊ | 9596/34278 [10:38:15<22:08:31, 3.23s/it] 28%|██▊ | 9597/34278 [10:38:18<21:41:39, 3.16s/it] {'loss': 0.1848, 'grad_norm': 0.9583072138632722, 'learning_rate': 8.449073785804054e-06, 'epoch': 0.28} 28%|██▊ | 9597/34278 [10:38:18<21:41:39, 3.16s/it] 28%|██▊ | 9598/34278 [10:38:22<21:34:49, 3.15s/it] {'loss': 0.1884, 'grad_norm': 0.9519186905717256, 'learning_rate': 8.448731734734678e-06, 'epoch': 0.28} 28%|██▊ | 9598/34278 [10:38:22<21:34:49, 3.15s/it] 28%|██▊ | 9599/34278 [10:38:25<22:52:52, 3.34s/it] {'loss': 0.1632, 'grad_norm': 0.8923949258864584, 'learning_rate': 8.448389652875852e-06, 'epoch': 0.28} 28%|██▊ | 9599/34278 [10:38:25<22:52:52, 3.34s/it] 28%|██▊ | 9600/34278 [10:38:28<22:10:50, 3.24s/it] {'loss': 0.1603, 'grad_norm': 1.0578728689403207, 'learning_rate': 8.448047540230624e-06, 'epoch': 0.28} 28%|██▊ | 9600/34278 [10:38:28<22:10:50, 3.24s/it] 28%|██▊ | 9601/34278 [10:38:34<26:40:25, 3.89s/it] {'loss': 0.1552, 'grad_norm': 0.7595943667885049, 'learning_rate': 8.447705396802051e-06, 'epoch': 0.28} 28%|██▊ | 9601/34278 [10:38:34<26:40:25, 3.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 28%|██▊ | 9602/34278 [10:38:37<25:34:56, 3.73s/it] {'loss': 0.1491, 'grad_norm': 0.7514324199529084, 'learning_rate': 8.447363222593186e-06, 'epoch': 0.28} 28%|██▊ | 9602/34278 [10:38:37<25:34:56, 3.73s/it] 28%|██▊ | 9603/34278 [10:38:41<25:15:15, 3.68s/it] {'loss': 0.1555, 'grad_norm': 0.978949857641314, 'learning_rate': 8.447021017607087e-06, 'epoch': 0.28} 28%|██▊ | 9603/34278 [10:38:41<25:15:15, 3.68s/it] 28%|██▊ | 9604/34278 [10:38:47<29:46:41, 4.34s/it] {'loss': 0.1505, 'grad_norm': 0.8809412748670732, 'learning_rate': 8.446678781846806e-06, 'epoch': 0.28} 28%|██▊ | 9604/34278 [10:38:47<29:46:41, 4.34s/it] 28%|██▊ | 9605/34278 [10:38:51<30:29:52, 4.45s/it] {'loss': 0.1383, 'grad_norm': 0.7514678066878479, 'learning_rate': 8.4463365153154e-06, 'epoch': 0.28} 28%|██▊ | 9605/34278 [10:38:51<30:29:52, 4.45s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9243 > 8192). Running this sequence through the model will result in indexing errors 28%|██▊ | 9606/34278 [10:38:56<30:48:29, 4.50s/it] {'loss': 0.1549, 'grad_norm': 0.7846690672438622, 'learning_rate': 8.445994218015923e-06, 'epoch': 0.28} 28%|██▊ | 9606/34278 [10:38:56<30:48:29, 4.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 28%|██▊ | 9607/34278 [10:39:00<30:57:13, 4.52s/it] {'loss': 0.142, 'grad_norm': 0.9113320114314601, 'learning_rate': 8.445651889951435e-06, 'epoch': 0.28} 28%|██▊ | 9607/34278 [10:39:00<30:57:13, 4.52s/it] 28%|██▊ | 9608/34278 [10:39:04<29:53:17, 4.36s/it] {'loss': 0.1468, 'grad_norm': 0.7934733096175987, 'learning_rate': 8.445309531124988e-06, 'epoch': 0.28} 28%|██▊ | 9608/34278 [10:39:04<29:53:17, 4.36s/it] 28%|██▊ | 9609/34278 [10:39:07<27:12:21, 3.97s/it] {'loss': 0.1757, 'grad_norm': 0.7543301037519203, 'learning_rate': 8.44496714153964e-06, 'epoch': 0.28} 28%|██▊ | 9609/34278 [10:39:08<27:12:21, 3.97s/it] 28%|██▊ | 9610/34278 [10:39:11<25:22:42, 3.70s/it] {'loss': 0.1423, 'grad_norm': 0.8455391980296363, 'learning_rate': 8.444624721198447e-06, 'epoch': 0.28} 28%|██▊ | 9610/34278 [10:39:11<25:22:42, 3.70s/it] 28%|██▊ | 9611/34278 [10:39:14<24:11:25, 3.53s/it] {'loss': 0.1475, 'grad_norm': 0.9840328904892477, 'learning_rate': 8.444282270104467e-06, 'epoch': 0.28} 28%|██▊ | 9611/34278 [10:39:14<24:11:25, 3.53s/it] 28%|██▊ | 9612/34278 [10:39:17<24:04:11, 3.51s/it] {'loss': 0.1761, 'grad_norm': 0.7572645399233074, 'learning_rate': 8.443939788260757e-06, 'epoch': 0.28} 28%|██▊ | 9612/34278 [10:39:17<24:04:11, 3.51s/it] 28%|██▊ | 9613/34278 [10:39:21<25:20:27, 3.70s/it] {'loss': 0.1563, 'grad_norm': 1.0025945055821288, 'learning_rate': 8.443597275670376e-06, 'epoch': 0.28} 28%|██▊ | 9613/34278 [10:39:21<25:20:27, 3.70s/it] 28%|██▊ | 9614/34278 [10:39:25<25:08:13, 3.67s/it] {'loss': 0.1589, 'grad_norm': 0.9968358292689676, 'learning_rate': 8.44325473233638e-06, 'epoch': 0.28} 28%|██▊ | 9614/34278 [10:39:25<25:08:13, 3.67s/it] 28%|██▊ | 9615/34278 [10:39:28<24:54:02, 3.63s/it] {'loss': 0.1516, 'grad_norm': 0.8747427905731093, 'learning_rate': 8.442912158261828e-06, 'epoch': 0.28} 28%|██▊ | 9615/34278 [10:39:28<24:54:02, 3.63s/it] 28%|██▊ | 9616/34278 [10:39:31<22:50:48, 3.34s/it] {'loss': 0.1297, 'grad_norm': 0.8879882495231916, 
'learning_rate': 8.442569553449777e-06, 'epoch': 0.28} 28%|██▊ | 9616/34278 [10:39:31<22:50:48, 3.34s/it] 28%|██▊ | 9617/34278 [10:39:34<21:31:51, 3.14s/it] {'loss': 0.1619, 'grad_norm': 0.8453425011322785, 'learning_rate': 8.442226917903287e-06, 'epoch': 0.28} 28%|██▊ | 9617/34278 [10:39:34<21:31:51, 3.14s/it] 28%|██▊ | 9618/34278 [10:39:37<22:00:12, 3.21s/it] {'loss': 0.1631, 'grad_norm': 0.9752282385155893, 'learning_rate': 8.441884251625419e-06, 'epoch': 0.28} 28%|██▊ | 9618/34278 [10:39:37<22:00:12, 3.21s/it] 28%|██▊ | 9619/34278 [10:39:40<22:02:41, 3.22s/it] {'loss': 0.1432, 'grad_norm': 1.0441464487675278, 'learning_rate': 8.441541554619228e-06, 'epoch': 0.28} 28%|██▊ | 9619/34278 [10:39:40<22:02:41, 3.22s/it] 28%|██▊ | 9620/34278 [10:39:44<22:59:07, 3.36s/it] {'loss': 0.1543, 'grad_norm': 0.838889876502053, 'learning_rate': 8.441198826887776e-06, 'epoch': 0.28} 28%|██▊ | 9620/34278 [10:39:44<22:59:07, 3.36s/it] 28%|██▊ | 9621/34278 [10:39:50<28:13:21, 4.12s/it] {'loss': 0.1393, 'grad_norm': 0.9042981784842837, 'learning_rate': 8.440856068434122e-06, 'epoch': 0.28} 28%|██▊ | 9621/34278 [10:39:50<28:13:21, 4.12s/it] 28%|██▊ | 9622/34278 [10:39:53<26:58:16, 3.94s/it] {'loss': 0.1466, 'grad_norm': 0.7590671300959784, 'learning_rate': 8.440513279261327e-06, 'epoch': 0.28} 28%|██▊ | 9622/34278 [10:39:53<26:58:16, 3.94s/it] 28%|██▊ | 9623/34278 [10:39:58<28:23:53, 4.15s/it] {'loss': 0.1441, 'grad_norm': 0.786209439269673, 'learning_rate': 8.44017045937245e-06, 'epoch': 0.28} 28%|██▊ | 9623/34278 [10:39:58<28:23:53, 4.15s/it] 28%|██▊ | 9624/34278 [10:40:02<27:50:10, 4.06s/it] {'loss': 0.1631, 'grad_norm': 0.8337456741412318, 'learning_rate': 8.439827608770552e-06, 'epoch': 0.28} 28%|██▊ | 9624/34278 [10:40:02<27:50:10, 4.06s/it] 28%|██▊ | 9625/34278 [10:40:07<29:24:57, 4.30s/it] {'loss': 0.1478, 'grad_norm': 0.699361095548608, 'learning_rate': 8.439484727458696e-06, 'epoch': 0.28} 28%|██▊ | 9625/34278 [10:40:07<29:24:57, 4.30s/it] 28%|██▊ | 9626/34278 
[10:40:11<29:25:25, 4.30s/it] {'loss': 0.1605, 'grad_norm': 0.7906992172976698, 'learning_rate': 8.43914181543994e-06, 'epoch': 0.28} 28%|██▊ | 9626/34278 [10:40:11<29:25:25, 4.30s/it] 28%|██▊ | 9627/34278 [10:40:14<26:52:45, 3.93s/it] {'loss': 0.1668, 'grad_norm': 0.716958191351726, 'learning_rate': 8.438798872717349e-06, 'epoch': 0.28} 28%|██▊ | 9627/34278 [10:40:14<26:52:45, 3.93s/it] 28%|██▊ | 9628/34278 [10:40:18<26:36:53, 3.89s/it] {'loss': 0.1394, 'grad_norm': 0.9336564764011152, 'learning_rate': 8.43845589929398e-06, 'epoch': 0.28} 28%|██▊ | 9628/34278 [10:40:18<26:36:53, 3.89s/it] 28%|██▊ | 9629/34278 [10:40:21<25:04:56, 3.66s/it] {'loss': 0.1426, 'grad_norm': 0.921589429799974, 'learning_rate': 8.438112895172899e-06, 'epoch': 0.28} 28%|██▊ | 9629/34278 [10:40:21<25:04:56, 3.66s/it] 28%|██▊ | 9630/34278 [10:40:25<24:35:32, 3.59s/it] {'loss': 0.1551, 'grad_norm': 0.722214393748142, 'learning_rate': 8.437769860357166e-06, 'epoch': 0.28} 28%|██▊ | 9630/34278 [10:40:25<24:35:32, 3.59s/it] 28%|██▊ | 9631/34278 [10:40:28<24:30:56, 3.58s/it] {'loss': 0.1826, 'grad_norm': 0.7821484680067312, 'learning_rate': 8.437426794849845e-06, 'epoch': 0.28} 28%|██▊ | 9631/34278 [10:40:28<24:30:56, 3.58s/it] 28%|██▊ | 9632/34278 [10:40:31<23:02:07, 3.36s/it] {'loss': 0.1399, 'grad_norm': 0.8398165397558471, 'learning_rate': 8.437083698653998e-06, 'epoch': 0.28} 28%|██▊ | 9632/34278 [10:40:31<23:02:07, 3.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 28%|██▊ | 9633/34278 [10:40:35<23:49:04, 3.48s/it] {'loss': 0.1588, 'grad_norm': 1.0337390383108171, 'learning_rate': 8.436740571772689e-06, 'epoch': 0.28} 28%|██▊ | 9633/34278 [10:40:35<23:49:04, 3.48s/it] 28%|██▊ | 9634/34278 [10:40:38<22:57:01, 3.35s/it] {'loss': 0.1614, 'grad_norm': 0.7992678715725394, 'learning_rate': 8.436397414208979e-06, 'epoch': 0.28} 28%|██▊ | 9634/34278 [10:40:38<22:57:01, 3.35s/it] 28%|██▊ | 9635/34278 [10:40:41<21:47:55, 3.18s/it] {'loss': 0.1623, 'grad_norm': 0.7817699595069072, 'learning_rate': 8.436054225965933e-06, 'epoch': 0.28} 28%|██▊ | 9635/34278 [10:40:41<21:47:55, 3.18s/it] 28%|██▊ | 9636/34278 [10:40:44<21:22:41, 3.12s/it] {'loss': 0.147, 'grad_norm': 0.822464347563449, 'learning_rate': 8.435711007046616e-06, 'epoch': 0.28} 28%|██▊ | 9636/34278 [10:40:44<21:22:41, 3.12s/it] 28%|██▊ | 9637/34278 [10:40:47<23:04:48, 3.37s/it] {'loss': 0.15, 'grad_norm': 0.8761331626179581, 'learning_rate': 8.435367757454092e-06, 'epoch': 0.28} 28%|██▊ | 9637/34278 [10:40:47<23:04:48, 3.37s/it] 28%|██▊ | 9638/34278 [10:40:51<22:30:49, 3.29s/it] {'loss': 0.1492, 'grad_norm': 0.8090730886916009, 'learning_rate': 8.435024477191426e-06, 'epoch': 0.28} 28%|██▊ | 9638/34278 [10:40:51<22:30:49, 3.29s/it] 28%|██▊ | 9639/34278 [10:40:54<22:18:44, 3.26s/it] {'loss': 0.1448, 'grad_norm': 0.832891485574726, 'learning_rate': 8.434681166261679e-06, 'epoch': 0.28} 28%|██▊ | 9639/34278 [10:40:54<22:18:44, 3.26s/it] 28%|██▊ | 9640/34278 [10:40:57<22:14:53, 3.25s/it] {'loss': 0.1782, 'grad_norm': 0.8954288258009087, 'learning_rate': 8.434337824667918e-06, 'epoch': 0.28} 28%|██▊ | 9640/34278 [10:40:57<22:14:53, 3.25s/it] 28%|██▊ | 9641/34278 [10:41:00<21:57:38, 3.21s/it] {'loss': 0.1666, 'grad_norm': 0.7956031882393465, 'learning_rate': 8.43399445241321e-06, 'epoch': 0.28} 28%|██▊ | 9641/34278 [10:41:00<21:57:38, 3.21s/it] 28%|██▊ | 9642/34278 [10:41:05<26:03:32, 3.81s/it] {'loss': 0.1595, 'grad_norm': 1.387686999700781, 
[training progress, duplicate tqdm refresh lines removed: steps 9642 → 9830 of 34278 (28% → 29%), elapsed 10:41:05 → 10:53:00, ~3.0–5.1 s/it]
{'loss': 0.1276–0.1976, 'grad_norm': 0.6169–1.5380, 'learning_rate': 8.433651e-06 → 8.368550e-06, 'epoch': 0.28 → 0.29}
Token indices sequence length is longer than the specified maximum sequence length for this model (10348 > 8192). Running this sequence through the model will result in indexing errors
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
29%|██▊ | 9830/34278 [10:53:00<26:04:46, 3.84s/it] 29%|██▊ | 9831/34278 [10:53:02<23:49:24, 3.51s/it] {'loss': 0.145, 'grad_norm': 0.8581258318130291, 'learning_rate': 8.368200956031011e-06, 'epoch': 0.29} 29%|██▊ | 9831/34278 [10:53:02<23:49:24, 3.51s/it] 29%|██▊ | 9832/34278 [10:53:05<22:31:15, 3.32s/it] {'loss': 0.1416, 'grad_norm': 1.087307890910858, 'learning_rate': 8.367851784061371e-06, 'epoch': 0.29} 29%|██▊ | 9832/34278 [10:53:05<22:31:15, 3.32s/it] 29%|██▊ | 9833/34278 [10:53:09<23:51:43, 3.51s/it] {'loss': 0.1525, 'grad_norm': 0.9742143876335019, 'learning_rate': 8.367502582024354e-06, 'epoch': 0.29} 29%|██▊ | 9833/34278 [10:53:09<23:51:43, 3.51s/it] 29%|██▊ | 9834/34278 [10:53:15<29:15:13, 4.31s/it] {'loss': 0.2012, 'grad_norm': 1.0880570097246778, 'learning_rate': 8.367153349923078e-06, 'epoch': 0.29} 29%|██▊ | 9834/34278 [10:53:15<29:15:13, 4.31s/it] 29%|██▊ | 9835/34278 [10:53:18<26:23:05, 3.89s/it] {'loss': 0.1496, 'grad_norm': 1.1997412533739402, 'learning_rate': 8.366804087760662e-06, 'epoch': 0.29} 29%|██▊ | 9835/34278 [10:53:18<26:23:05, 3.89s/it] 29%|██▊ | 9836/34278 [10:53:21<24:46:31, 3.65s/it] {'loss': 0.1756, 'grad_norm': 0.8698108124035253, 'learning_rate': 8.366454795540221e-06, 'epoch': 0.29} 29%|██▊ | 9836/34278 [10:53:21<24:46:31, 3.65s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6244c5aac0> Failed to fetch sample 3646668. Exception: cannot identify image file <_io.BytesIO object at 0x7f6244c5aac0> 29%|██▊ | 9837/34278 [10:53:28<29:56:35, 4.41s/it] {'loss': 0.1822, 'grad_norm': 1.1371208532155463, 'learning_rate': 8.366105473264877e-06, 'epoch': 0.29} 29%|██▊ | 9837/34278 [10:53:28<29:56:35, 4.41s/it] 29%|██▊ | 9838/34278 [10:53:33<31:38:52, 4.66s/it] {'loss': 0.1827, 'grad_norm': 1.349272065080481, 'learning_rate': 8.365756120937746e-06, 'epoch': 0.29} 29%|██▊ | 9838/34278 [10:53:33<31:38:52, 4.66s/it] 29%|██▊ | 9839/34278 [10:53:39<34:26:33, 5.07s/it] {'loss': 0.1712, 'grad_norm': 1.159707127945847, 'learning_rate': 8.365406738561948e-06, 'epoch': 0.29} 29%|██▊ | 9839/34278 [10:53:39<34:26:33, 5.07s/it] 29%|██▊ | 9840/34278 [10:53:42<30:24:42, 4.48s/it] {'loss': 0.1624, 'grad_norm': 0.7695853217531026, 'learning_rate': 8.365057326140602e-06, 'epoch': 0.29} 29%|██▊ | 9840/34278 [10:53:42<30:24:42, 4.48s/it] 29%|██▊ | 9841/34278 [10:53:45<28:16:06, 4.16s/it] {'loss': 0.1629, 'grad_norm': 1.2176623436540903, 'learning_rate': 8.364707883676826e-06, 'epoch': 0.29} 29%|██▊ | 9841/34278 [10:53:45<28:16:06, 4.16s/it] 29%|██▊ | 9842/34278 [10:53:48<25:40:56, 3.78s/it] {'loss': 0.1564, 'grad_norm': 0.7969760569180943, 'learning_rate': 8.364358411173742e-06, 'epoch': 0.29} 29%|██▊ | 9842/34278 
[10:53:48<25:40:56, 3.78s/it] 29%|██▊ | 9843/34278 [10:53:52<24:41:20, 3.64s/it] {'loss': 0.1545, 'grad_norm': 0.7393559798164078, 'learning_rate': 8.36400890863447e-06, 'epoch': 0.29} 29%|██▊ | 9843/34278 [10:53:52<24:41:20, 3.64s/it] 29%|██▊ | 9844/34278 [10:53:55<24:53:36, 3.67s/it] {'loss': 0.1378, 'grad_norm': 0.7461860357689802, 'learning_rate': 8.363659376062129e-06, 'epoch': 0.29} 29%|██▊ | 9844/34278 [10:53:55<24:53:36, 3.67s/it] 29%|██▊ | 9845/34278 [10:53:58<23:43:54, 3.50s/it] {'loss': 0.1397, 'grad_norm': 0.9308486159446256, 'learning_rate': 8.36330981345984e-06, 'epoch': 0.29} 29%|██▊ | 9845/34278 [10:53:58<23:43:54, 3.50s/it] 29%|██▊ | 9846/34278 [10:54:02<24:18:28, 3.58s/it] {'loss': 0.1647, 'grad_norm': 0.8344962007898478, 'learning_rate': 8.362960220830725e-06, 'epoch': 0.29} 29%|██▊ | 9846/34278 [10:54:02<24:18:28, 3.58s/it] 29%|██▊ | 9847/34278 [10:54:05<22:58:22, 3.39s/it] {'loss': 0.1698, 'grad_norm': 0.7315788043770404, 'learning_rate': 8.362610598177904e-06, 'epoch': 0.29} 29%|██▊ | 9847/34278 [10:54:05<22:58:22, 3.39s/it] 29%|██▊ | 9848/34278 [10:54:08<22:19:33, 3.29s/it] {'loss': 0.1553, 'grad_norm': 0.9402750814257732, 'learning_rate': 8.362260945504497e-06, 'epoch': 0.29} 29%|██▊ | 9848/34278 [10:54:08<22:19:33, 3.29s/it] 29%|██▊ | 9849/34278 [10:54:11<22:11:18, 3.27s/it] {'loss': 0.1546, 'grad_norm': 0.8061549441004859, 'learning_rate': 8.361911262813628e-06, 'epoch': 0.29} 29%|██▊ | 9849/34278 [10:54:11<22:11:18, 3.27s/it] 29%|██▊ | 9850/34278 [10:54:17<27:46:32, 4.09s/it] {'loss': 0.1336, 'grad_norm': 0.847045834650828, 'learning_rate': 8.361561550108417e-06, 'epoch': 0.29} 29%|██▊ | 9850/34278 [10:54:17<27:46:32, 4.09s/it] 29%|██▊ | 9851/34278 [10:54:21<26:40:53, 3.93s/it] {'loss': 0.1699, 'grad_norm': 0.8061753933894321, 'learning_rate': 8.361211807391987e-06, 'epoch': 0.29} 29%|██▊ | 9851/34278 [10:54:21<26:40:53, 3.93s/it] 29%|██▊ | 9852/34278 [10:54:24<25:36:46, 3.77s/it] {'loss': 0.15, 'grad_norm': 0.8444227995422279, 
'learning_rate': 8.36086203466746e-06, 'epoch': 0.29} 29%|██▊ | 9852/34278 [10:54:24<25:36:46, 3.77s/it] 29%|██▊ | 9853/34278 [10:54:30<29:07:10, 4.29s/it] {'loss': 0.167, 'grad_norm': 0.8212005765713866, 'learning_rate': 8.36051223193796e-06, 'epoch': 0.29} 29%|██▊ | 9853/34278 [10:54:30<29:07:10, 4.29s/it] 29%|██▊ | 9854/34278 [10:54:33<27:29:39, 4.05s/it] {'loss': 0.1541, 'grad_norm': 0.7896960182652282, 'learning_rate': 8.360162399206609e-06, 'epoch': 0.29} 29%|██▊ | 9854/34278 [10:54:33<27:29:39, 4.05s/it] 29%|██▉ | 9855/34278 [10:54:37<26:30:34, 3.91s/it] {'loss': 0.156, 'grad_norm': 0.7054874119732321, 'learning_rate': 8.35981253647653e-06, 'epoch': 0.29} 29%|██▉ | 9855/34278 [10:54:37<26:30:34, 3.91s/it] 29%|██▉ | 9856/34278 [10:54:40<24:28:27, 3.61s/it] {'loss': 0.1455, 'grad_norm': 0.794441548546251, 'learning_rate': 8.359462643750847e-06, 'epoch': 0.29} 29%|██▉ | 9856/34278 [10:54:40<24:28:27, 3.61s/it] 29%|██▉ | 9857/34278 [10:54:44<26:10:42, 3.86s/it] {'loss': 0.1522, 'grad_norm': 0.7243266466759475, 'learning_rate': 8.359112721032682e-06, 'epoch': 0.29} 29%|██▉ | 9857/34278 [10:54:44<26:10:42, 3.86s/it] 29%|██▉ | 9858/34278 [10:54:49<27:14:19, 4.02s/it] {'loss': 0.1588, 'grad_norm': 0.854715048031843, 'learning_rate': 8.358762768325162e-06, 'epoch': 0.29} 29%|██▉ | 9858/34278 [10:54:49<27:14:19, 4.02s/it] 29%|██▉ | 9859/34278 [10:54:52<25:30:32, 3.76s/it] {'loss': 0.1573, 'grad_norm': 0.7871523200061623, 'learning_rate': 8.35841278563141e-06, 'epoch': 0.29} 29%|██▉ | 9859/34278 [10:54:52<25:30:32, 3.76s/it] 29%|██▉ | 9860/34278 [10:54:56<25:59:56, 3.83s/it] {'loss': 0.1682, 'grad_norm': 0.9531408083364188, 'learning_rate': 8.358062772954549e-06, 'epoch': 0.29} 29%|██▉ | 9860/34278 [10:54:56<25:59:56, 3.83s/it] 29%|██▉ | 9861/34278 [10:55:02<30:46:20, 4.54s/it] {'loss': 0.147, 'grad_norm': 0.6812932180373655, 'learning_rate': 8.357712730297707e-06, 'epoch': 0.29} 29%|██▉ | 9861/34278 [10:55:02<30:46:20, 4.54s/it] 29%|██▉ | 9862/34278 
[10:55:06<28:53:53, 4.26s/it] {'loss': 0.1545, 'grad_norm': 0.7838031131738635, 'learning_rate': 8.357362657664005e-06, 'epoch': 0.29} 29%|██▉ | 9862/34278 [10:55:06<28:53:53, 4.26s/it] 29%|██▉ | 9863/34278 [10:55:10<28:50:05, 4.25s/it] {'loss': 0.1631, 'grad_norm': 1.1692794571002676, 'learning_rate': 8.357012555056571e-06, 'epoch': 0.29} 29%|██▉ | 9863/34278 [10:55:10<28:50:05, 4.25s/it] 29%|██▉ | 9864/34278 [10:55:13<26:22:39, 3.89s/it] {'loss': 0.1505, 'grad_norm': 0.759695566501756, 'learning_rate': 8.356662422478532e-06, 'epoch': 0.29} 29%|██▉ | 9864/34278 [10:55:13<26:22:39, 3.89s/it] 29%|██▉ | 9865/34278 [10:55:18<29:40:24, 4.38s/it] {'loss': 0.179, 'grad_norm': 0.8668270726138102, 'learning_rate': 8.356312259933013e-06, 'epoch': 0.29} 29%|██▉ | 9865/34278 [10:55:18<29:40:24, 4.38s/it] 29%|██▉ | 9866/34278 [10:55:23<29:38:13, 4.37s/it] {'loss': 0.1636, 'grad_norm': 1.2673528714091171, 'learning_rate': 8.355962067423135e-06, 'epoch': 0.29} 29%|██▉ | 9866/34278 [10:55:23<29:38:13, 4.37s/it] 29%|██▉ | 9867/34278 [10:55:25<26:15:43, 3.87s/it] {'loss': 0.1697, 'grad_norm': 0.8678672769322022, 'learning_rate': 8.355611844952033e-06, 'epoch': 0.29} 29%|██▉ | 9867/34278 [10:55:26<26:15:43, 3.87s/it] 29%|██▉ | 9868/34278 [10:55:29<25:55:03, 3.82s/it] {'loss': 0.1515, 'grad_norm': 0.8174206486742935, 'learning_rate': 8.355261592522828e-06, 'epoch': 0.29} 29%|██▉ | 9868/34278 [10:55:29<25:55:03, 3.82s/it] 29%|██▉ | 9869/34278 [10:55:34<27:29:40, 4.06s/it] {'loss': 0.1491, 'grad_norm': 0.7988968203734829, 'learning_rate': 8.354911310138647e-06, 'epoch': 0.29} 29%|██▉ | 9869/34278 [10:55:34<27:29:40, 4.06s/it] 29%|██▉ | 9870/34278 [10:55:37<25:45:06, 3.80s/it] {'loss': 0.1627, 'grad_norm': 0.6804670031915753, 'learning_rate': 8.354560997802622e-06, 'epoch': 0.29} 29%|██▉ | 9870/34278 [10:55:37<25:45:06, 3.80s/it] 29%|██▉ | 9871/34278 [10:55:40<23:56:32, 3.53s/it] {'loss': 0.1439, 'grad_norm': 0.7502123958402307, 'learning_rate': 8.354210655517876e-06, 'epoch': 0.29} 
29%|██▉ | 9871/34278 [10:55:40<23:56:32, 3.53s/it] 29%|██▉ | 9872/34278 [10:55:44<24:34:51, 3.63s/it] {'loss': 0.1749, 'grad_norm': 0.9163857814759543, 'learning_rate': 8.353860283287535e-06, 'epoch': 0.29} 29%|██▉ | 9872/34278 [10:55:44<24:34:51, 3.63s/it] 29%|██▉ | 9873/34278 [10:55:47<23:21:50, 3.45s/it] {'loss': 0.1671, 'grad_norm': 0.7503514897510583, 'learning_rate': 8.353509881114734e-06, 'epoch': 0.29} 29%|██▉ | 9873/34278 [10:55:47<23:21:50, 3.45s/it] 29%|██▉ | 9874/34278 [10:55:51<24:03:46, 3.55s/it] {'loss': 0.1709, 'grad_norm': 0.9986993786625777, 'learning_rate': 8.353159449002595e-06, 'epoch': 0.29} 29%|██▉ | 9874/34278 [10:55:51<24:03:46, 3.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 29%|██▉ | 9875/34278 [10:55:53<22:43:43, 3.35s/it] {'loss': 0.1723, 'grad_norm': 0.7901623980860604, 'learning_rate': 8.352808986954251e-06, 'epoch': 0.29} 29%|██▉ | 9875/34278 [10:55:53<22:43:43, 3.35s/it] 29%|██▉ | 9876/34278 [10:55:56<21:57:30, 3.24s/it] {'loss': 0.1469, 'grad_norm': 0.6467608429051405, 'learning_rate': 8.352458494972825e-06, 'epoch': 0.29} 29%|██▉ | 9876/34278 [10:55:56<21:57:30, 3.24s/it] 29%|██▉ | 9877/34278 [10:55:59<21:34:17, 3.18s/it] {'loss': 0.1693, 'grad_norm': 0.7686440350854351, 'learning_rate': 8.352107973061455e-06, 'epoch': 0.29} 29%|██▉ | 9877/34278 [10:55:59<21:34:17, 3.18s/it] 29%|██▉ | 9878/34278 [10:56:02<21:04:46, 3.11s/it] {'loss': 0.145, 'grad_norm': 0.7673670580493841, 'learning_rate': 8.351757421223262e-06, 'epoch': 0.29} 29%|██▉ | 9878/34278 [10:56:02<21:04:46, 3.11s/it] 29%|██▉ | 9879/34278 [10:56:07<24:27:08, 3.61s/it] {'loss': 0.1473, 'grad_norm': 0.7527342040107813, 'learning_rate': 8.351406839461378e-06, 'epoch': 0.29} 29%|██▉ | 9879/34278 [10:56:07<24:27:08, 3.61s/it] 29%|██▉ | 9880/34278 [10:56:10<22:35:57, 3.33s/it] {'loss': 0.1328, 
'grad_norm': 0.9030520454763455, 'learning_rate': 8.351056227778935e-06, 'epoch': 0.29} 29%|██▉ | 9880/34278 [10:56:10<22:35:57, 3.33s/it] 29%|██▉ | 9881/34278 [10:56:13<21:52:51, 3.23s/it] {'loss': 0.1554, 'grad_norm': 0.826526670234512, 'learning_rate': 8.350705586179063e-06, 'epoch': 0.29} 29%|██▉ | 9881/34278 [10:56:13<21:52:51, 3.23s/it] 29%|██▉ | 9882/34278 [10:56:16<21:49:50, 3.22s/it] {'loss': 0.1413, 'grad_norm': 0.9628114119687834, 'learning_rate': 8.35035491466489e-06, 'epoch': 0.29} 29%|██▉ | 9882/34278 [10:56:16<21:49:50, 3.22s/it] 29%|██▉ | 9883/34278 [10:56:19<21:43:47, 3.21s/it] {'loss': 0.1451, 'grad_norm': 0.7046764569947076, 'learning_rate': 8.350004213239549e-06, 'epoch': 0.29} 29%|██▉ | 9883/34278 [10:56:19<21:43:47, 3.21s/it] 29%|██▉ | 9884/34278 [10:56:25<27:20:04, 4.03s/it] {'loss': 0.1656, 'grad_norm': 0.7790342088529321, 'learning_rate': 8.349653481906169e-06, 'epoch': 0.29} 29%|██▉ | 9884/34278 [10:56:25<27:20:04, 4.03s/it] 29%|██▉ | 9885/34278 [10:56:28<24:32:04, 3.62s/it] {'loss': 0.1703, 'grad_norm': 0.9868218322854813, 'learning_rate': 8.349302720667883e-06, 'epoch': 0.29} 29%|██▉ | 9885/34278 [10:56:28<24:32:04, 3.62s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10187 > 8192). 
Running this sequence through the model will result in indexing errors 29%|██▉ | 9886/34278 [10:56:31<23:55:54, 3.53s/it] {'loss': 0.137, 'grad_norm': 0.7137767194711725, 'learning_rate': 8.348951929527822e-06, 'epoch': 0.29} 29%|██▉ | 9886/34278 [10:56:31<23:55:54, 3.53s/it] 29%|██▉ | 9887/34278 [10:56:34<23:08:28, 3.42s/it] {'loss': 0.1455, 'grad_norm': 0.8227141535054573, 'learning_rate': 8.348601108489118e-06, 'epoch': 0.29} 29%|██▉ | 9887/34278 [10:56:34<23:08:28, 3.42s/it] 29%|██▉ | 9888/34278 [10:56:38<23:47:52, 3.51s/it] {'loss': 0.1618, 'grad_norm': 0.8436155558799002, 'learning_rate': 8.348250257554902e-06, 'epoch': 0.29} 29%|██▉ | 9888/34278 [10:56:38<23:47:52, 3.51s/it] 29%|██▉ | 9889/34278 [10:56:42<24:09:21, 3.57s/it] {'loss': 0.1252, 'grad_norm': 1.0102542206494416, 'learning_rate': 8.347899376728307e-06, 'epoch': 0.29} 29%|██▉ | 9889/34278 [10:56:42<24:09:21, 3.57s/it] 29%|██▉ | 9890/34278 [10:56:48<29:29:37, 4.35s/it] {'loss': 0.1235, 'grad_norm': 0.6229948350671283, 'learning_rate': 8.347548466012464e-06, 'epoch': 0.29} 29%|██▉ | 9890/34278 [10:56:48<29:29:37, 4.35s/it] 29%|██▉ | 9891/34278 [10:56:51<26:32:20, 3.92s/it] {'loss': 0.154, 'grad_norm': 0.7487960006338579, 'learning_rate': 8.34719752541051e-06, 'epoch': 0.29} 29%|██▉ | 9891/34278 [10:56:51<26:32:20, 3.92s/it] 29%|██▉ | 9892/34278 [10:56:54<25:09:04, 3.71s/it] {'loss': 0.1546, 'grad_norm': 0.8203898754336488, 'learning_rate': 8.346846554925577e-06, 'epoch': 0.29} 29%|██▉ | 9892/34278 [10:56:54<25:09:04, 3.71s/it] 29%|██▉ | 9893/34278 [10:57:00<28:56:38, 4.27s/it] {'loss': 0.1654, 'grad_norm': 0.741354464163945, 'learning_rate': 8.346495554560794e-06, 'epoch': 0.29} 29%|██▉ | 9893/34278 [10:57:00<28:56:38, 4.27s/it] 29%|██▉ | 9894/34278 [10:57:03<26:36:14, 3.93s/it] {'loss': 0.1563, 'grad_norm': 0.928620964154592, 'learning_rate': 8.346144524319298e-06, 'epoch': 0.29} 29%|██▉ | 9894/34278 [10:57:03<26:36:14, 3.93s/it] 29%|██▉ | 9895/34278 [10:57:07<26:12:23, 3.87s/it] {'loss': 0.1405, 
'grad_norm': 0.7041305464807207, 'learning_rate': 8.345793464204221e-06, 'epoch': 0.29} 29%|██▉ | 9895/34278 [10:57:07<26:12:23, 3.87s/it] 29%|██▉ | 9896/34278 [10:57:11<26:43:16, 3.95s/it] {'loss': 0.1347, 'grad_norm': 0.7809452957929224, 'learning_rate': 8.345442374218702e-06, 'epoch': 0.29} 29%|██▉ | 9896/34278 [10:57:11<26:43:16, 3.95s/it] 29%|██▉ | 9897/34278 [10:57:17<31:01:11, 4.58s/it] {'loss': 0.139, 'grad_norm': 0.7657322015755201, 'learning_rate': 8.34509125436587e-06, 'epoch': 0.29} 29%|██▉ | 9897/34278 [10:57:17<31:01:11, 4.58s/it] 29%|██▉ | 9898/34278 [10:57:20<29:12:59, 4.31s/it] {'loss': 0.1622, 'grad_norm': 0.7689742927200562, 'learning_rate': 8.344740104648862e-06, 'epoch': 0.29} 29%|██▉ | 9898/34278 [10:57:20<29:12:59, 4.31s/it] 29%|██▉ | 9899/34278 [10:57:25<29:48:00, 4.40s/it] {'loss': 0.1568, 'grad_norm': 0.9481928855239199, 'learning_rate': 8.344388925070812e-06, 'epoch': 0.29} 29%|██▉ | 9899/34278 [10:57:25<29:48:00, 4.40s/it] 29%|██▉ | 9900/34278 [10:57:31<32:57:14, 4.87s/it] {'loss': 0.1382, 'grad_norm': 0.7098603340041709, 'learning_rate': 8.344037715634859e-06, 'epoch': 0.29} 29%|██▉ | 9900/34278 [10:57:31<32:57:14, 4.87s/it] 29%|██▉ | 9901/34278 [10:57:37<35:14:24, 5.20s/it] {'loss': 0.1657, 'grad_norm': 1.4781048055873522, 'learning_rate': 8.343686476344132e-06, 'epoch': 0.29} 29%|██▉ | 9901/34278 [10:57:37<35:14:24, 5.20s/it] 29%|██▉ | 9902/34278 [10:57:40<31:02:06, 4.58s/it] {'loss': 0.1345, 'grad_norm': 0.9229871341863847, 'learning_rate': 8.343335207201773e-06, 'epoch': 0.29} 29%|██▉ | 9902/34278 [10:57:40<31:02:06, 4.58s/it] 29%|██▉ | 9903/34278 [10:57:43<27:32:07, 4.07s/it] {'loss': 0.1233, 'grad_norm': 0.7753778008196123, 'learning_rate': 8.342983908210915e-06, 'epoch': 0.29} 29%|██▉ | 9903/34278 [10:57:43<27:32:07, 4.07s/it] 29%|██▉ | 9904/34278 [10:57:46<25:48:59, 3.81s/it] {'loss': 0.1416, 'grad_norm': 0.8450474439015068, 'learning_rate': 8.342632579374693e-06, 'epoch': 0.29} 29%|██▉ | 9904/34278 [10:57:46<25:48:59, 3.81s/it] 
29%|██▉ | 9905/34278 [10:57:49<24:27:49, 3.61s/it] {'loss': 0.1737, 'grad_norm': 0.7643050333338456, 'learning_rate': 8.342281220696246e-06, 'epoch': 0.29} 29%|██▉ | 9905/34278 [10:57:49<24:27:49, 3.61s/it] 29%|██▉ | 9906/34278 [10:57:52<23:31:22, 3.47s/it] {'loss': 0.1298, 'grad_norm': 0.8086534856495711, 'learning_rate': 8.341929832178712e-06, 'epoch': 0.29} 29%|██▉ | 9906/34278 [10:57:52<23:31:22, 3.47s/it] 29%|██▉ | 9907/34278 [10:57:56<24:23:54, 3.60s/it] {'loss': 0.1546, 'grad_norm': 1.0202434340116664, 'learning_rate': 8.341578413825224e-06, 'epoch': 0.29} 29%|██▉ | 9907/34278 [10:57:56<24:23:54, 3.60s/it] 29%|██▉ | 9908/34278 [10:58:00<24:45:07, 3.66s/it] {'loss': 0.1483, 'grad_norm': 0.8856129755694068, 'learning_rate': 8.341226965638922e-06, 'epoch': 0.29} 29%|██▉ | 9908/34278 [10:58:00<24:45:07, 3.66s/it] 29%|██▉ | 9909/34278 [10:58:03<23:47:37, 3.52s/it] {'loss': 0.1575, 'grad_norm': 1.0051947137453312, 'learning_rate': 8.340875487622944e-06, 'epoch': 0.29} 29%|██▉ | 9909/34278 [10:58:03<23:47:37, 3.52s/it] 29%|██▉ | 9910/34278 [10:58:08<27:06:24, 4.00s/it] {'loss': 0.1697, 'grad_norm': 0.8923354587071173, 'learning_rate': 8.340523979780426e-06, 'epoch': 0.29} 29%|██▉ | 9910/34278 [10:58:09<27:06:24, 4.00s/it] 29%|██▉ | 9911/34278 [10:58:11<24:53:34, 3.68s/it] {'loss': 0.138, 'grad_norm': 0.7912794953994675, 'learning_rate': 8.340172442114509e-06, 'epoch': 0.29} 29%|██▉ | 9911/34278 [10:58:11<24:53:34, 3.68s/it] 29%|██▉ | 9912/34278 [10:58:14<23:14:12, 3.43s/it] {'loss': 0.1494, 'grad_norm': 0.7861918709653906, 'learning_rate': 8.33982087462833e-06, 'epoch': 0.29} 29%|██▉ | 9912/34278 [10:58:14<23:14:12, 3.43s/it] 29%|██▉ | 9913/34278 [10:58:17<22:22:53, 3.31s/it] {'loss': 0.1475, 'grad_norm': 0.9112625258963118, 'learning_rate': 8.339469277325025e-06, 'epoch': 0.29} 29%|██▉ | 9913/34278 [10:58:17<22:22:53, 3.31s/it] 29%|██▉ | 9914/34278 [10:58:23<28:03:14, 4.15s/it] {'loss': 0.1506, 'grad_norm': 0.8893112375319981, 'learning_rate': 
8.339117650207738e-06, 'epoch': 0.29} 29%|██▉ | 9914/34278 [10:58:23<28:03:14, 4.15s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 29%|██▉ | 9915/34278 [10:58:28<28:37:22, 4.23s/it] {'loss': 0.1437, 'grad_norm': 0.9022641279176083, 'learning_rate': 8.338765993279604e-06, 'epoch': 0.29} 29%|██▉ | 9915/34278 [10:58:28<28:37:22, 4.23s/it] 29%|██▉ | 9916/34278 [10:58:31<26:40:41, 3.94s/it] {'loss': 0.1698, 'grad_norm': 0.9962674200604096, 'learning_rate': 8.338414306543764e-06, 'epoch': 0.29} 29%|██▉ | 9916/34278 [10:58:31<26:40:41, 3.94s/it] 29%|██▉ | 9917/34278 [10:58:35<26:05:57, 3.86s/it] {'loss': 0.1334, 'grad_norm': 0.8840193722246299, 'learning_rate': 8.33806259000336e-06, 'epoch': 0.29} 29%|██▉ | 9917/34278 [10:58:35<26:05:57, 3.86s/it] 29%|██▉ | 9918/34278 [10:58:40<28:32:56, 4.22s/it] {'loss': 0.1557, 'grad_norm': 0.8890595472257062, 'learning_rate': 8.337710843661528e-06, 'epoch': 0.29} 29%|██▉ | 9918/34278 [10:58:40<28:32:56, 4.22s/it] 29%|██▉ | 9919/34278 [10:58:43<26:25:52, 3.91s/it] {'loss': 0.1654, 'grad_norm': 0.8783307524752315, 'learning_rate': 8.337359067521411e-06, 'epoch': 0.29} 29%|██▉ | 9919/34278 [10:58:43<26:25:52, 3.91s/it] 29%|██▉ | 9920/34278 [10:58:46<24:25:17, 3.61s/it] {'loss': 0.1601, 'grad_norm': 0.7667862558799786, 'learning_rate': 8.33700726158615e-06, 'epoch': 0.29} 29%|██▉ | 9920/34278 [10:58:46<24:25:17, 3.61s/it] 29%|██▉ | 9921/34278 [10:58:49<23:23:23, 3.46s/it] {'loss': 0.1531, 'grad_norm': 1.0607707908253314, 'learning_rate': 8.336655425858885e-06, 'epoch': 0.29} 29%|██▉ | 9921/34278 [10:58:49<23:23:23, 3.46s/it] 29%|██▉ | 9922/34278 [10:58:54<25:55:11, 3.83s/it] {'loss': 0.1605, 'grad_norm': 0.8227189793116355, 'learning_rate': 8.336303560342756e-06, 'epoch': 0.29} 29%|██▉ | 9922/34278 [10:58:54<25:55:11, 3.83s/it] 29%|██▉ | 9923/34278 
[10:59:00<30:07:54, 4.45s/it] {'loss': 0.1693, 'grad_norm': 0.8533704546216876, 'learning_rate': 8.335951665040904e-06, 'epoch': 0.29} 29%|██▉ | 9923/34278 [10:59:00<30:07:54, 4.45s/it] 29%|██▉ | 9924/34278 [10:59:03<28:02:09, 4.14s/it] {'loss': 0.1656, 'grad_norm': 0.8029944727248988, 'learning_rate': 8.335599739956474e-06, 'epoch': 0.29} 29%|██▉ | 9924/34278 [10:59:03<28:02:09, 4.14s/it] 29%|██▉ | 9925/34278 [10:59:06<26:03:50, 3.85s/it] {'loss': 0.1341, 'grad_norm': 0.8797297804836803, 'learning_rate': 8.335247785092604e-06, 'epoch': 0.29} 29%|██▉ | 9925/34278 [10:59:06<26:03:50, 3.85s/it] 29%|██▉ | 9926/34278 [10:59:10<25:06:10, 3.71s/it] {'loss': 0.1602, 'grad_norm': 0.8583529494083441, 'learning_rate': 8.33489580045244e-06, 'epoch': 0.29} 29%|██▉ | 9926/34278 [10:59:10<25:06:10, 3.71s/it] 29%|██▉ | 9927/34278 [10:59:13<23:42:46, 3.51s/it] {'loss': 0.1736, 'grad_norm': 0.7376563991148708, 'learning_rate': 8.334543786039122e-06, 'epoch': 0.29} 29%|██▉ | 9927/34278 [10:59:13<23:42:46, 3.51s/it] 29%|██▉ | 9928/34278 [10:59:18<28:18:14, 4.18s/it] {'loss': 0.1708, 'grad_norm': 0.8326224275218932, 'learning_rate': 8.33419174185579e-06, 'epoch': 0.29} 29%|██▉ | 9928/34278 [10:59:18<28:18:14, 4.18s/it] 29%|██▉ | 9929/34278 [10:59:23<29:59:33, 4.43s/it] {'loss': 0.1532, 'grad_norm': 0.7862764218821267, 'learning_rate': 8.333839667905594e-06, 'epoch': 0.29} 29%|██▉ | 9929/34278 [10:59:23<29:59:33, 4.43s/it] 29%|██▉ | 9930/34278 [10:59:28<30:21:21, 4.49s/it] {'loss': 0.1631, 'grad_norm': 0.6965774250218687, 'learning_rate': 8.333487564191672e-06, 'epoch': 0.29} 29%|██▉ | 9930/34278 [10:59:28<30:21:21, 4.49s/it] 29%|██▉ | 9931/34278 [10:59:31<27:28:57, 4.06s/it] {'loss': 0.1421, 'grad_norm': 0.7734103847833171, 'learning_rate': 8.333135430717167e-06, 'epoch': 0.29} 29%|██▉ | 9931/34278 [10:59:31<27:28:57, 4.06s/it] 29%|██▉ | 9932/34278 [10:59:35<26:19:43, 3.89s/it] {'loss': 0.1544, 'grad_norm': 0.7402186181779997, 'learning_rate': 8.332783267485227e-06, 'epoch': 0.29} 
29%|██▉ | 9932/34278 [10:59:35<26:19:43, 3.89s/it] 29%|██▉ | 9933/34278 [10:59:39<27:10:25, 4.02s/it] {'loss': 0.1569, 'grad_norm': 0.8362642679456843, 'learning_rate': 8.332431074498992e-06, 'epoch': 0.29} 29%|██▉ | 9933/34278 [10:59:39<27:10:25, 4.02s/it] 29%|██▉ | 9934/34278 [10:59:42<25:38:56, 3.79s/it] {'loss': 0.14, 'grad_norm': 0.8623462724671603, 'learning_rate': 8.33207885176161e-06, 'epoch': 0.29} 29%|██▉ | 9934/34278 [10:59:42<25:38:56, 3.79s/it] 29%|██▉ | 9935/34278 [10:59:45<24:29:20, 3.62s/it] {'loss': 0.1729, 'grad_norm': 0.8445223702334346, 'learning_rate': 8.331726599276221e-06, 'epoch': 0.29} 29%|██▉ | 9935/34278 [10:59:45<24:29:20, 3.62s/it] 29%|██▉ | 9936/34278 [10:59:49<25:04:57, 3.71s/it] {'loss': 0.1449, 'grad_norm': 0.885988204287692, 'learning_rate': 8.331374317045974e-06, 'epoch': 0.29} 29%|██▉ | 9936/34278 [10:59:49<25:04:57, 3.71s/it] 29%|██▉ | 9937/34278 [10:59:55<28:17:12, 4.18s/it] {'loss': 0.1503, 'grad_norm': 0.7388917651173174, 'learning_rate': 8.33102200507401e-06, 'epoch': 0.29} 29%|██▉ | 9937/34278 [10:59:55<28:17:12, 4.18s/it] 29%|██▉ | 9938/34278 [10:59:59<29:37:27, 4.38s/it] {'loss': 0.158, 'grad_norm': 0.7618862505541767, 'learning_rate': 8.33066966336348e-06, 'epoch': 0.29} 29%|██▉ | 9938/34278 [10:59:59<29:37:27, 4.38s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 29%|██▉ | 9939/34278 [11:00:05<31:09:48, 4.61s/it] {'loss': 0.1407, 'grad_norm': 0.7039575963698365, 'learning_rate': 8.330317291917525e-06, 'epoch': 0.29} 29%|██▉ | 9939/34278 [11:00:05<31:09:48, 4.61s/it] 29%|██▉ | 9940/34278 [11:00:08<27:53:18, 4.13s/it] {'loss': 0.1624, 'grad_norm': 0.9483758954090866, 'learning_rate': 8.32996489073929e-06, 'epoch': 0.29} 29%|██▉ | 9940/34278 [11:00:08<27:53:18, 4.13s/it] 29%|██▉ | 9941/34278 [11:00:10<25:12:20, 3.73s/it] {'loss': 0.1492, 'grad_norm': 0.7212525909556974, 'learning_rate': 8.329612459831926e-06, 'epoch': 0.29} 29%|██▉ | 9941/34278 [11:00:10<25:12:20, 3.73s/it] 29%|██▉ | 9942/34278 [11:00:13<23:23:40, 3.46s/it] {'loss': 0.1365, 'grad_norm': 0.8573451154135282, 'learning_rate': 8.329259999198577e-06, 'epoch': 0.29} 29%|██▉ | 9942/34278 [11:00:13<23:23:40, 3.46s/it] 29%|██▉ | 9943/34278 [11:00:17<23:19:38, 3.45s/it] {'loss': 0.1544, 'grad_norm': 0.9432659078236447, 'learning_rate': 8.328907508842388e-06, 'epoch': 0.29} 29%|██▉ | 9943/34278 [11:00:17<23:19:38, 3.45s/it] 29%|██▉ | 9944/34278 [11:00:20<22:43:44, 3.36s/it] {'loss': 0.1493, 'grad_norm': 0.7315033120337017, 'learning_rate': 8.328554988766509e-06, 'epoch': 0.29} 29%|██▉ | 9944/34278 [11:00:20<22:43:44, 3.36s/it] 29%|██▉ | 9945/34278 [11:00:23<22:10:45, 3.28s/it] {'loss': 0.1396, 'grad_norm': 1.167455282089146, 'learning_rate': 8.328202438974083e-06, 'epoch': 0.29} 29%|██▉ | 9945/34278 [11:00:23<22:10:45, 3.28s/it] 29%|██▉ | 9946/34278 [11:00:27<23:22:30, 3.46s/it] {'loss': 0.1559, 'grad_norm': 1.1215715354094669, 'learning_rate': 8.327849859468263e-06, 'epoch': 0.29} 29%|██▉ | 9946/34278 [11:00:27<23:22:30, 3.46s/it] 29%|██▉ | 9947/34278 [11:00:32<27:21:05, 4.05s/it] {'loss': 0.1417, 'grad_norm': 0.8945220203736282, 'learning_rate': 8.327497250252192e-06, 'epoch': 0.29} 29%|██▉ | 9947/34278 [11:00:32<27:21:05, 4.05s/it] 29%|██▉ | 9948/34278 [11:00:35<25:18:18, 3.74s/it] {'loss': 0.1674, 'grad_norm': 1.0190752539257046, 
'learning_rate': 8.327144611329022e-06, 'epoch': 0.29} 29%|██▉ | 9948/34278 [11:00:35<25:18:18, 3.74s/it] 29%|██▉ | 9949/34278 [11:00:38<23:37:58, 3.50s/it] {'loss': 0.1578, 'grad_norm': 0.933557154708929, 'learning_rate': 8.326791942701895e-06, 'epoch': 0.29} 29%|██▉ | 9949/34278 [11:00:38<23:37:58, 3.50s/it] 29%|██▉ | 9950/34278 [11:00:41<22:32:03, 3.33s/it] {'loss': 0.1636, 'grad_norm': 0.9863857215165585, 'learning_rate': 8.326439244373968e-06, 'epoch': 0.29} 29%|██▉ | 9950/34278 [11:00:41<22:32:03, 3.33s/it] 29%|██▉ | 9951/34278 [11:00:45<23:52:06, 3.53s/it] {'loss': 0.1558, 'grad_norm': 1.4159947206301777, 'learning_rate': 8.326086516348384e-06, 'epoch': 0.29} 29%|██▉ | 9951/34278 [11:00:45<23:52:06, 3.53s/it] 29%|██▉ | 9952/34278 [11:00:48<22:54:34, 3.39s/it] {'loss': 0.1575, 'grad_norm': 0.8933563458370728, 'learning_rate': 8.325733758628292e-06, 'epoch': 0.29} 29%|██▉ | 9952/34278 [11:00:48<22:54:34, 3.39s/it] 29%|██▉ | 9953/34278 [11:00:52<23:31:02, 3.48s/it] {'loss': 0.1594, 'grad_norm': 1.0723245638303787, 'learning_rate': 8.325380971216846e-06, 'epoch': 0.29} 29%|██▉ | 9953/34278 [11:00:52<23:31:02, 3.48s/it] 29%|██▉ | 9954/34278 [11:00:55<22:48:47, 3.38s/it] {'loss': 0.1939, 'grad_norm': 0.9162685575737721, 'learning_rate': 8.325028154117191e-06, 'epoch': 0.29} 29%|██▉ | 9954/34278 [11:00:55<22:48:47, 3.38s/it] 29%|██▉ | 9955/34278 [11:00:58<22:08:25, 3.28s/it] {'loss': 0.1496, 'grad_norm': 0.7659311648089028, 'learning_rate': 8.324675307332478e-06, 'epoch': 0.29} 29%|██▉ | 9955/34278 [11:00:58<22:08:25, 3.28s/it] 29%|██▉ | 9956/34278 [11:01:01<21:14:56, 3.15s/it] {'loss': 0.1618, 'grad_norm': 0.9303261328242991, 'learning_rate': 8.324322430865858e-06, 'epoch': 0.29} 29%|██▉ | 9956/34278 [11:01:01<21:14:56, 3.15s/it] 29%|██▉ | 9957/34278 [11:01:04<21:25:42, 3.17s/it] {'loss': 0.137, 'grad_norm': 0.8380924897880033, 'learning_rate': 8.32396952472048e-06, 'epoch': 0.29} 29%|██▉ | 9957/34278 [11:01:04<21:25:42, 3.17s/it] 29%|██▉ | 9958/34278 
[11:01:10<27:10:59, 4.02s/it] {'loss': 0.1697, 'grad_norm': 1.0817912252194481, 'learning_rate': 8.323616588899497e-06, 'epoch': 0.29} 29%|██▉ | 9958/34278 [11:01:10<27:10:59, 4.02s/it] 29%|██▉ | 9959/34278 [11:01:15<28:28:17, 4.21s/it] {'loss': 0.1583, 'grad_norm': 0.7405987750868062, 'learning_rate': 8.323263623406057e-06, 'epoch': 0.29} 29%|██▉ | 9959/34278 [11:01:15<28:28:17, 4.21s/it] 29%|██▉ | 9960/34278 [11:01:19<28:55:40, 4.28s/it] {'loss': 0.1495, 'grad_norm': 1.1149496128661358, 'learning_rate': 8.322910628243314e-06, 'epoch': 0.29} 29%|██▉ | 9960/34278 [11:01:19<28:55:40, 4.28s/it] 29%|██▉ | 9961/34278 [11:01:23<27:40:48, 4.10s/it] {'loss': 0.1503, 'grad_norm': 0.8458534683375639, 'learning_rate': 8.322557603414418e-06, 'epoch': 0.29} 29%|██▉ | 9961/34278 [11:01:23<27:40:48, 4.10s/it] 29%|██▉ | 9962/34278 [11:01:26<25:47:26, 3.82s/it] {'loss': 0.1401, 'grad_norm': 0.9372275666730926, 'learning_rate': 8.322204548922521e-06, 'epoch': 0.29} 29%|██▉ | 9962/34278 [11:01:26<25:47:26, 3.82s/it] 29%|██▉ | 9963/34278 [11:01:29<24:31:04, 3.63s/it] {'loss': 0.1817, 'grad_norm': 0.7629056453068565, 'learning_rate': 8.321851464770775e-06, 'epoch': 0.29} 29%|██▉ | 9963/34278 [11:01:29<24:31:04, 3.63s/it] 29%|██▉ | 9964/34278 [11:01:34<27:18:55, 4.04s/it] {'loss': 0.1607, 'grad_norm': 1.1246679773874921, 'learning_rate': 8.321498350962331e-06, 'epoch': 0.29} 29%|██▉ | 9964/34278 [11:01:34<27:18:55, 4.04s/it] 29%|██▉ | 9965/34278 [11:01:38<25:48:13, 3.82s/it] {'loss': 0.1478, 'grad_norm': 0.8833611714238749, 'learning_rate': 8.321145207500343e-06, 'epoch': 0.29} 29%|██▉ | 9965/34278 [11:01:38<25:48:13, 3.82s/it] 29%|██▉ | 9966/34278 [11:01:40<23:38:34, 3.50s/it] {'loss': 0.1447, 'grad_norm': 0.8184608900826866, 'learning_rate': 8.320792034387964e-06, 'epoch': 0.29} 29%|██▉ | 9966/34278 [11:01:40<23:38:34, 3.50s/it] 29%|██▉ | 9967/34278 [11:01:44<23:55:52, 3.54s/it] {'loss': 0.164, 'grad_norm': 0.9607555983475776, 'learning_rate': 8.320438831628345e-06, 'epoch': 0.29} 
29%|██▉ | 9967/34278 [11:01:44<23:55:52, 3.54s/it] 29%|██▉ | 9968/34278 [11:01:49<26:21:18, 3.90s/it] {'loss': 0.1523, 'grad_norm': 0.888310835837436, 'learning_rate': 8.320085599224642e-06, 'epoch': 0.29} 29%|██▉ | 9968/34278 [11:01:49<26:21:18, 3.90s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 29%|██▉ | 9969/34278 [11:01:52<24:42:23, 3.66s/it] {'loss': 0.1589, 'grad_norm': 0.8757880741813339, 'learning_rate': 8.319732337180008e-06, 'epoch': 0.29} 29%|██▉ | 9969/34278 [11:01:52<24:42:23, 3.66s/it] 29%|██▉ | 9970/34278 [11:01:58<29:12:52, 4.33s/it] {'loss': 0.1686, 'grad_norm': 0.8857833465418704, 'learning_rate': 8.319379045497595e-06, 'epoch': 0.29} 29%|██▉ | 9970/34278 [11:01:58<29:12:52, 4.33s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 29%|██▉ | 9971/34278 [11:02:00<26:05:40, 3.86s/it] {'loss': 0.1522, 'grad_norm': 1.0051605070308964, 'learning_rate': 8.319025724180559e-06, 'epoch': 0.29} 29%|██▉ | 9971/34278 [11:02:00<26:05:40, 3.86s/it] 29%|██▉ | 9972/34278 [11:02:05<27:06:42, 4.02s/it] {'loss': 0.1533, 'grad_norm': 0.8221400217366555, 'learning_rate': 8.318672373232053e-06, 'epoch': 0.29} 29%|██▉ | 9972/34278 [11:02:05<27:06:42, 4.02s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 29%|██▉ | 9973/34278 [11:02:09<27:08:31, 4.02s/it] {'loss': 0.1668, 'grad_norm': 0.8176771708565067, 'learning_rate': 8.318318992655232e-06, 'epoch': 0.29} 29%|██▉ | 9973/34278 [11:02:09<27:08:31, 4.02s/it] 29%|██▉ | 9974/34278 [11:02:12<25:17:45, 3.75s/it] {'loss': 0.1623, 'grad_norm': 0.8854928575385405, 'learning_rate': 8.317965582453251e-06, 'epoch': 0.29} 29%|██▉ | 9974/34278 [11:02:12<25:17:45, 3.75s/it] 29%|██▉ | 9975/34278 [11:02:15<24:55:27, 3.69s/it] {'loss': 0.1705, 'grad_norm': 0.8171128503970739, 'learning_rate': 8.317612142629268e-06, 'epoch': 0.29} 29%|██▉ | 9975/34278 [11:02:16<24:55:27, 3.69s/it] 29%|██▉ | 9976/34278 [11:02:19<23:58:03, 3.55s/it] {'loss': 0.1538, 'grad_norm': 0.9079090036597574, 'learning_rate': 8.317258673186432e-06, 'epoch': 0.29} 29%|██▉ | 9976/34278 [11:02:19<23:58:03, 3.55s/it] 29%|██▉ | 9977/34278 [11:02:24<26:52:33, 3.98s/it] {'loss': 0.1505, 'grad_norm': 0.7968587160417797, 'learning_rate': 8.316905174127906e-06, 'epoch': 0.29} 29%|██▉ | 9977/34278 [11:02:24<26:52:33, 3.98s/it] 29%|██▉ | 9978/34278 [11:02:27<24:45:09, 3.67s/it] {'loss': 0.1415, 'grad_norm': 0.8164199405906625, 'learning_rate': 8.31655164545684e-06, 'epoch': 0.29} 29%|██▉ | 9978/34278 [11:02:27<24:45:09, 3.67s/it] 29%|██▉ | 9979/34278 [11:02:30<23:32:44, 3.49s/it] {'loss': 0.1704, 'grad_norm': 0.8377358511468971, 'learning_rate': 8.316198087176393e-06, 'epoch': 0.29} 29%|██▉ | 9979/34278 [11:02:30<23:32:44, 3.49s/it] 29%|██▉ | 9980/34278 [11:02:36<29:02:52, 4.30s/it] {'loss': 0.161, 'grad_norm': 0.6789145992788284, 'learning_rate': 8.315844499289722e-06, 'epoch': 0.29} 29%|██▉ | 9980/34278 [11:02:36<29:02:52, 4.30s/it] 29%|██▉ | 9981/34278 [11:02:39<26:52:34, 3.98s/it] {'loss': 0.1962, 'grad_norm': 0.9878176562475328, 'learning_rate': 8.315490881799982e-06, 'epoch': 0.29} 29%|██▉ | 9981/34278 [11:02:39<26:52:34, 3.98s/it] 29%|██▉ | 9982/34278 [11:02:42<25:11:20, 3.73s/it] {'loss': 0.1341, 'grad_norm': 0.7551533792819878, 
'learning_rate': 8.315137234710332e-06, 'epoch': 0.29} 29%|██▉ | 9982/34278 [11:02:42<25:11:20, 3.73s/it] 29%|██▉ | 9983/34278 [11:02:48<29:19:26, 4.35s/it] {'loss': 0.1564, 'grad_norm': 1.044286035733997, 'learning_rate': 8.314783558023927e-06, 'epoch': 0.29} 29%|██▉ | 9983/34278 [11:02:48<29:19:26, 4.35s/it] 29%|██▉ | 9984/34278 [11:02:54<32:45:06, 4.85s/it] {'loss': 0.1464, 'grad_norm': 0.7874113543302953, 'learning_rate': 8.314429851743927e-06, 'epoch': 0.29} 29%|██▉ | 9984/34278 [11:02:54<32:45:06, 4.85s/it] 29%|██▉ | 9985/34278 [11:03:00<35:21:59, 5.24s/it] {'loss': 0.1596, 'grad_norm': 0.9118496172043951, 'learning_rate': 8.314076115873485e-06, 'epoch': 0.29} 29%|██▉ | 9985/34278 [11:03:00<35:21:59, 5.24s/it] 29%|██▉ | 9986/34278 [11:03:07<37:48:54, 5.60s/it] {'loss': 0.1763, 'grad_norm': 0.966762188743754, 'learning_rate': 8.313722350415767e-06, 'epoch': 0.29} 29%|██▉ | 9986/34278 [11:03:07<37:48:54, 5.60s/it] 29%|██▉ | 9987/34278 [11:03:10<33:56:27, 5.03s/it] {'loss': 0.1515, 'grad_norm': 0.8326195256442548, 'learning_rate': 8.313368555373925e-06, 'epoch': 0.29} 29%|██▉ | 9987/34278 [11:03:10<33:56:27, 5.03s/it] 29%|██▉ | 9988/34278 [11:03:15<32:52:30, 4.87s/it] {'loss': 0.1868, 'grad_norm': 0.9867023969084737, 'learning_rate': 8.313014730751119e-06, 'epoch': 0.29} 29%|██▉ | 9988/34278 [11:03:15<32:52:30, 4.87s/it] 29%|██▉ | 9989/34278 [11:03:18<28:53:16, 4.28s/it] {'loss': 0.1834, 'grad_norm': 0.9944314437123973, 'learning_rate': 8.312660876550509e-06, 'epoch': 0.29} 29%|██▉ | 9989/34278 [11:03:18<28:53:16, 4.28s/it] 29%|██▉ | 9990/34278 [11:03:23<31:36:16, 4.68s/it] {'loss': 0.1385, 'grad_norm': 0.7193657700096948, 'learning_rate': 8.312306992775254e-06, 'epoch': 0.29} 29%|██▉ | 9990/34278 [11:03:23<31:36:16, 4.68s/it] 29%|██▉ | 9991/34278 [11:03:28<30:22:31, 4.50s/it] {'loss': 0.1443, 'grad_norm': 0.7827293906850198, 'learning_rate': 8.311953079428511e-06, 'epoch': 0.29} 29%|██▉ | 9991/34278 [11:03:28<30:22:31, 4.50s/it] 29%|██▉ | 9992/34278 
[11:03:34<33:30:24, 4.97s/it] {'loss': 0.1345, 'grad_norm': 0.775521931058426, 'learning_rate': 8.311599136513443e-06, 'epoch': 0.29} 29%|██▉ | 9992/34278 [11:03:34<33:30:24, 4.97s/it] 29%|██▉ | 9993/34278 [11:03:38<31:43:49, 4.70s/it] {'loss': 0.1337, 'grad_norm': 0.7918526820206877, 'learning_rate': 8.311245164033208e-06, 'epoch': 0.29} 29%|██▉ | 9993/34278 [11:03:38<31:43:49, 4.70s/it] 29%|██▉ | 9994/34278 [11:03:42<31:12:21, 4.63s/it] {'loss': 0.14, 'grad_norm': 0.9449828723661597, 'learning_rate': 8.310891161990967e-06, 'epoch': 0.29} 29%|██▉ | 9994/34278 [11:03:42<31:12:21, 4.63s/it] 29%|██▉ | 9995/34278 [11:03:46<28:52:41, 4.28s/it] {'loss': 0.1706, 'grad_norm': 0.8535034442896857, 'learning_rate': 8.31053713038988e-06, 'epoch': 0.29} 29%|██▉ | 9995/34278 [11:03:46<28:52:41, 4.28s/it] 29%|██▉ | 9996/34278 [11:03:49<26:53:52, 3.99s/it] {'loss': 0.1319, 'grad_norm': 0.9759757025382537, 'learning_rate': 8.31018306923311e-06, 'epoch': 0.29} 29%|██▉ | 9996/34278 [11:03:49<26:53:52, 3.99s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9702 > 8192). 
Running this sequence through the model will result in indexing errors 29%|██▉ | 9997/34278 [11:03:52<25:59:12, 3.85s/it] {'loss': 0.1462, 'grad_norm': 0.7421867198696321, 'learning_rate': 8.30982897852381e-06, 'epoch': 0.29} 29%|██▉ | 9997/34278 [11:03:52<25:59:12, 3.85s/it] 29%|██▉ | 9998/34278 [11:03:55<24:01:04, 3.56s/it] {'loss': 0.1441, 'grad_norm': 0.8340210160364849, 'learning_rate': 8.309474858265153e-06, 'epoch': 0.29} 29%|██▉ | 9998/34278 [11:03:55<24:01:04, 3.56s/it] 29%|██▉ | 9999/34278 [11:03:58<23:01:42, 3.41s/it] {'loss': 0.1611, 'grad_norm': 0.9196255388910509, 'learning_rate': 8.309120708460291e-06, 'epoch': 0.29} 29%|██▉ | 9999/34278 [11:03:58<23:01:42, 3.41s/it] 29%|██▉ | 10000/34278 [11:04:01<21:52:22, 3.24s/it] {'loss': 0.1376, 'grad_norm': 0.7995903341022464, 'learning_rate': 8.30876652911239e-06, 'epoch': 0.29} 29%|██▉ | 10000/34278 [11:04:01<21:52:22, 3.24s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None
of the inputs have requires_grad=True. Gradients will be None warnings.warn( 29%|██▉ | 10001/34278 [11:04:35<84:33:11, 12.54s/it] {'loss': 0.1346, 'grad_norm': 0.5775818656754369, 'learning_rate': 8.308412320224612e-06, 'epoch': 0.29} 29%|██▉ | 10001/34278 [11:04:35<84:33:11, 12.54s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 29%|██▉ | 10002/34278 [11:04:40<69:03:16, 10.24s/it] {'loss': 0.1564, 'grad_norm': 0.7585769294015465, 'learning_rate': 8.30805808180012e-06, 'epoch': 0.29} 29%|██▉ | 10002/34278 [11:04:40<69:03:16, 10.24s/it] 29%|██▉ | 10003/34278 [11:04:44<55:01:07, 8.16s/it] {'loss': 0.1724, 'grad_norm': 0.9220451686588651, 'learning_rate': 8.307703813842071e-06, 'epoch': 0.29} 29%|██▉ | 10003/34278 [11:04:44<55:01:07, 8.16s/it] 29%|██▉ | 10004/34278 [11:04:47<45:18:08, 6.72s/it] {'loss': 0.1286, 'grad_norm': 0.7012222953328154, 'learning_rate': 8.307349516353634e-06, 'epoch': 0.29} 29%|██▉ | 10004/34278 [11:04:47<45:18:08, 6.72s/it] 29%|██▉ | 10005/34278 [11:04:50<37:38:17, 5.58s/it] {'loss': 0.1679, 'grad_norm': 0.7923834248307075, 'learning_rate': 8.306995189337973e-06, 'epoch': 0.29} 29%|██▉ | 10005/34278 [11:04:50<37:38:17, 5.58s/it] 29%|██▉ | 10006/34278 [11:04:53<32:35:06, 4.83s/it] {'loss': 0.1706, 'grad_norm': 0.946432255528313, 'learning_rate': 8.306640832798242e-06, 'epoch': 0.29} 29%|██▉ | 10006/34278 [11:04:53<32:35:06, 4.83s/it] 29%|██▉ | 10007/34278 [11:04:56<29:09:01, 4.32s/it] {'loss': 0.1492, 'grad_norm': 0.8277375669619255, 'learning_rate': 8.306286446737616e-06, 'epoch': 0.29} 29%|██▉ | 10007/34278 [11:04:56<29:09:01, 4.32s/it] 29%|██▉ | 10008/34278 [11:04:59<26:54:01, 3.99s/it] {'loss': 0.1566, 'grad_norm': 0.9954771463628386, 'learning_rate': 8.305932031159253e-06, 'epoch': 0.29} 29%|██▉ | 10008/34278 [11:04:59<26:54:01, 3.99s/it] 29%|██▉ | 10009/34278 
[11:05:02<24:39:05, 3.66s/it] {'loss': 0.1625, 'grad_norm': 0.8562541686694175, 'learning_rate': 8.305577586066317e-06, 'epoch': 0.29} 29%|██▉ | 10009/34278 [11:05:02<24:39:05, 3.66s/it] 29%|██▉ | 10010/34278 [11:05:05<23:22:29, 3.47s/it] {'loss': 0.1524, 'grad_norm': 1.2764191456792593, 'learning_rate': 8.305223111461975e-06, 'epoch': 0.29} 29%|██▉ | 10010/34278 [11:05:05<23:22:29, 3.47s/it] 29%|██▉ | 10011/34278 [11:05:11<27:43:34, 4.11s/it] {'loss': 0.1657, 'grad_norm': 1.1348108065448188, 'learning_rate': 8.30486860734939e-06, 'epoch': 0.29} 29%|██▉ | 10011/34278 [11:05:11<27:43:34, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 29%|██▉ | 10012/34278 [11:05:15<27:19:23, 4.05s/it] {'loss': 0.1542, 'grad_norm': 0.6247148365727904, 'learning_rate': 8.304514073731724e-06, 'epoch': 0.29} 29%|██▉ | 10012/34278 [11:05:15<27:19:23, 4.05s/it] 29%|██▉ | 10013/34278 [11:05:18<25:11:44, 3.74s/it] {'loss': 0.1557, 'grad_norm': 0.8604647433551958, 'learning_rate': 8.304159510612149e-06, 'epoch': 0.29} 29%|██▉ | 10013/34278 [11:05:18<25:11:44, 3.74s/it] 29%|██▉ | 10014/34278 [11:05:23<27:11:58, 4.04s/it] {'loss': 0.1381, 'grad_norm': 1.0468673531886143, 'learning_rate': 8.303804917993825e-06, 'epoch': 0.29} 29%|██▉ | 10014/34278 [11:05:23<27:11:58, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 29%|██▉ | 10015/34278 [11:05:28<31:01:46, 4.60s/it] {'loss': 0.1417, 'grad_norm': 0.9221026240612864, 'learning_rate': 8.303450295879917e-06, 'epoch': 0.29} 29%|██▉ | 10015/34278 [11:05:28<31:01:46, 4.60s/it] 29%|██▉ | 10016/34278 [11:05:33<30:30:14, 4.53s/it] {'loss': 0.1761, 'grad_norm': 0.9779188760797921, 'learning_rate': 8.303095644273598e-06, 'epoch': 0.29} 29%|██▉ | 10016/34278 [11:05:33<30:30:14, 4.53s/it] 29%|██▉ | 10017/34278 [11:05:36<27:19:20, 4.05s/it] {'loss': 0.1462, 'grad_norm': 0.95628535233947, 'learning_rate': 8.302740963178026e-06, 'epoch': 0.29} 29%|██▉ | 10017/34278 [11:05:36<27:19:20, 4.05s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9407 > 8192). Running this sequence through the model will result in indexing errors 29%|██▉ | 10018/34278 [11:05:39<25:44:32, 3.82s/it] {'loss': 0.1274, 'grad_norm': 0.8146698359549647, 'learning_rate': 8.302386252596372e-06, 'epoch': 0.29} 29%|██▉ | 10018/34278 [11:05:39<25:44:32, 3.82s/it] 29%|██▉ | 10019/34278 [11:05:42<24:37:10, 3.65s/it] {'loss': 0.1451, 'grad_norm': 1.0327793173177984, 'learning_rate': 8.302031512531802e-06, 'epoch': 0.29} 29%|██▉ | 10019/34278 [11:05:42<24:37:10, 3.65s/it] 29%|██▉ | 10020/34278 [11:05:45<23:24:41, 3.47s/it] {'loss': 0.1518, 'grad_norm': 0.783726058461353, 'learning_rate': 8.301676742987484e-06, 'epoch': 0.29} 29%|██▉ | 10020/34278 [11:05:45<23:24:41, 3.47s/it] 29%|██▉ | 10021/34278 [11:05:49<24:14:42, 3.60s/it] {'loss': 0.1537, 'grad_norm': 0.7114796690993442, 'learning_rate': 8.301321943966583e-06, 'epoch': 0.29} 29%|██▉ | 10021/34278 [11:05:49<24:14:42, 3.60s/it] 29%|██▉ | 10022/34278 [11:05:53<24:00:24, 3.56s/it] {'loss': 0.1378, 'grad_norm': 0.8343504034340079, 'learning_rate': 8.30096711547227e-06, 'epoch': 0.29} 29%|██▉ | 10022/34278 [11:05:53<24:00:24, 3.56s/it] 29%|██▉ | 10023/34278 [11:05:56<23:27:10, 3.48s/it] {'loss': 0.1599, 'grad_norm': 1.0848139303363526, 
'learning_rate': 8.300612257507707e-06, 'epoch': 0.29} 29%|██▉ | 10023/34278 [11:05:56<23:27:10, 3.48s/it] 29%|██▉ | 10024/34278 [11:06:00<23:36:30, 3.50s/it] {'loss': 0.1409, 'grad_norm': 0.7767597599214419, 'learning_rate': 8.300257370076069e-06, 'epoch': 0.29} 29%|██▉ | 10024/34278 [11:06:00<23:36:30, 3.50s/it] 29%|██▉ | 10025/34278 [11:06:04<24:41:46, 3.67s/it] {'loss': 0.1302, 'grad_norm': 0.8318580216756729, 'learning_rate': 8.29990245318052e-06, 'epoch': 0.29} 29%|██▉ | 10025/34278 [11:06:04<24:41:46, 3.67s/it] 29%|██▉ | 10026/34278 [11:06:07<23:53:58, 3.55s/it] {'loss': 0.1367, 'grad_norm': 0.9190345106170331, 'learning_rate': 8.299547506824228e-06, 'epoch': 0.29} 29%|██▉ | 10026/34278 [11:06:07<23:53:58, 3.55s/it] 29%|██▉ | 10027/34278 [11:06:10<22:48:23, 3.39s/it] {'loss': 0.141, 'grad_norm': 0.8454636319416081, 'learning_rate': 8.299192531010365e-06, 'epoch': 0.29} 29%|██▉ | 10027/34278 [11:06:10<22:48:23, 3.39s/it] 29%|██▉ | 10028/34278 [11:06:13<21:48:14, 3.24s/it] {'loss': 0.1216, 'grad_norm': 0.9496611352931372, 'learning_rate': 8.298837525742099e-06, 'epoch': 0.29} 29%|██▉ | 10028/34278 [11:06:13<21:48:14, 3.24s/it] 29%|██▉ | 10029/34278 [11:06:16<22:01:06, 3.27s/it] {'loss': 0.1559, 'grad_norm': 1.0058824718148227, 'learning_rate': 8.298482491022597e-06, 'epoch': 0.29} 29%|██▉ | 10029/34278 [11:06:16<22:01:06, 3.27s/it] 29%|██▉ | 10030/34278 [11:06:22<27:39:48, 4.11s/it] {'loss': 0.1563, 'grad_norm': 0.7105909931539826, 'learning_rate': 8.298127426855032e-06, 'epoch': 0.29} 29%|██▉ | 10030/34278 [11:06:22<27:39:48, 4.11s/it] 29%|██▉ | 10031/34278 [11:06:28<30:51:50, 4.58s/it] {'loss': 0.1431, 'grad_norm': 0.8594947440795527, 'learning_rate': 8.297772333242572e-06, 'epoch': 0.29} 29%|██▉ | 10031/34278 [11:06:28<30:51:50, 4.58s/it] 29%|██▉ | 10032/34278 [11:06:31<27:56:24, 4.15s/it] {'loss': 0.1788, 'grad_norm': 0.9180978214471736, 'learning_rate': 8.29741721018839e-06, 'epoch': 0.29} 29%|██▉ | 10032/34278 [11:06:31<27:56:24, 4.15s/it] 29%|██▉ | 
10033/34278 [11:06:36<29:43:07, 4.41s/it] {'loss': 0.1425, 'grad_norm': 0.8572780249185475, 'learning_rate': 8.297062057695653e-06, 'epoch': 0.29} 29%|██▉ | 10033/34278 [11:06:36<29:43:07, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 29%|██▉ | 10034/34278 [11:06:42<32:56:39, 4.89s/it] {'loss': 0.1757, 'grad_norm': 0.8718833186795538, 'learning_rate': 8.296706875767533e-06, 'epoch': 0.29} 29%|██▉ | 10034/34278 [11:06:42<32:56:39, 4.89s/it] 29%|██▉ | 10035/34278 [11:06:46<30:53:53, 4.59s/it] {'loss': 0.1799, 'grad_norm': 1.0267974594728448, 'learning_rate': 8.2963516644072e-06, 'epoch': 0.29} 29%|██▉ | 10035/34278 [11:06:46<30:53:53, 4.59s/it] 29%|██▉ | 10036/34278 [11:06:50<29:52:57, 4.44s/it] {'loss': 0.1456, 'grad_norm': 0.9630029145702377, 'learning_rate': 8.295996423617828e-06, 'epoch': 0.29} 29%|██▉ | 10036/34278 [11:06:50<29:52:57, 4.44s/it] 29%|██▉ | 10037/34278 [11:06:54<28:50:17, 4.28s/it] {'loss': 0.1544, 'grad_norm': 0.9891852313316698, 'learning_rate': 8.295641153402586e-06, 'epoch': 0.29} 29%|██▉ | 10037/34278 [11:06:54<28:50:17, 4.28s/it] 29%|██▉ | 10038/34278 [11:06:57<27:13:03, 4.04s/it] {'loss': 0.1688, 'grad_norm': 0.8636462159642979, 'learning_rate': 8.295285853764647e-06, 'epoch': 0.29} 29%|██▉ | 10038/34278 [11:06:57<27:13:03, 4.04s/it] 29%|██▉ | 10039/34278 [11:07:00<25:10:08, 3.74s/it] {'loss': 0.1628, 'grad_norm': 0.9216727137755006, 'learning_rate': 8.294930524707181e-06, 'epoch': 0.29} 29%|██▉ | 10039/34278 [11:07:00<25:10:08, 3.74s/it] 29%|██▉ | 10040/34278 [11:07:05<26:44:02, 3.97s/it] {'loss': 0.16, 'grad_norm': 1.1656075453781285, 'learning_rate': 8.294575166233364e-06, 'epoch': 0.29} 29%|██▉ | 10040/34278 [11:07:05<26:44:02, 3.97s/it] 29%|██▉ | 10041/34278 [11:07:08<24:40:19, 3.66s/it] {'loss': 0.1477, 'grad_norm': 0.8517564097130488, 'learning_rate': 
8.294219778346366e-06, 'epoch': 0.29} 29%|██▉ | 10041/34278 [11:07:08<24:40:19, 3.66s/it] 29%|██▉ | 10042/34278 [11:07:12<25:03:00, 3.72s/it] {'loss': 0.1482, 'grad_norm': 0.8107151679766771, 'learning_rate': 8.293864361049358e-06, 'epoch': 0.29} 29%|██▉ | 10042/34278 [11:07:12<25:03:00, 3.72s/it] 29%|██▉ | 10043/34278 [11:07:15<23:58:09, 3.56s/it] {'loss': 0.1508, 'grad_norm': 0.7619916270717272, 'learning_rate': 8.293508914345517e-06, 'epoch': 0.29} 29%|██▉ | 10043/34278 [11:07:15<23:58:09, 3.56s/it] 29%|██▉ | 10044/34278 [11:07:21<29:05:29, 4.32s/it] {'loss': 0.1631, 'grad_norm': 0.7420486994078926, 'learning_rate': 8.293153438238015e-06, 'epoch': 0.29} 29%|██▉ | 10044/34278 [11:07:21<29:05:29, 4.32s/it] 29%|██▉ | 10045/34278 [11:07:24<26:03:28, 3.87s/it] {'loss': 0.1728, 'grad_norm': 0.7447317873850063, 'learning_rate': 8.292797932730023e-06, 'epoch': 0.29} 29%|██▉ | 10045/34278 [11:07:24<26:03:28, 3.87s/it] 29%|██▉ | 10046/34278 [11:07:28<26:10:56, 3.89s/it] {'loss': 0.1699, 'grad_norm': 0.7478213728555458, 'learning_rate': 8.292442397824721e-06, 'epoch': 0.29} 29%|██▉ | 10046/34278 [11:07:28<26:10:56, 3.89s/it] 29%|██▉ | 10047/34278 [11:07:31<25:42:55, 3.82s/it] {'loss': 0.1519, 'grad_norm': 0.6485024024547189, 'learning_rate': 8.292086833525277e-06, 'epoch': 0.29} 29%|██▉ | 10047/34278 [11:07:31<25:42:55, 3.82s/it] 29%|██▉ | 10048/34278 [11:07:34<24:05:13, 3.58s/it] {'loss': 0.1584, 'grad_norm': 0.7590094261702236, 'learning_rate': 8.291731239834865e-06, 'epoch': 0.29} 29%|██▉ | 10048/34278 [11:07:34<24:05:13, 3.58s/it] 29%|██▉ | 10049/34278 [11:07:38<23:40:34, 3.52s/it] {'loss': 0.1418, 'grad_norm': 0.8811821365401096, 'learning_rate': 8.291375616756666e-06, 'epoch': 0.29} 29%|██▉ | 10049/34278 [11:07:38<23:40:34, 3.52s/it] 29%|██▉ | 10050/34278 [11:07:42<25:23:09, 3.77s/it] {'loss': 0.2009, 'grad_norm': 0.8239299166996248, 'learning_rate': 8.291019964293852e-06, 'epoch': 0.29} 29%|██▉ | 10050/34278 [11:07:42<25:23:09, 3.77s/it] 29%|██▉ | 10051/34278 
[11:07:47<26:51:16, 3.99s/it] {'loss': 0.1417, 'grad_norm': 0.6186205133828201, 'learning_rate': 8.290664282449594e-06, 'epoch': 0.29} 29%|██▉ | 10051/34278 [11:07:47<26:51:16, 3.99s/it] 29%|██▉ | 10052/34278 [11:07:53<31:16:33, 4.65s/it] {'loss': 0.138, 'grad_norm': 0.6901163615293291, 'learning_rate': 8.290308571227073e-06, 'epoch': 0.29} 29%|██▉ | 10052/34278 [11:07:53<31:16:33, 4.65s/it] 29%|██▉ | 10053/34278 [11:07:56<28:39:33, 4.26s/it] {'loss': 0.1611, 'grad_norm': 0.8567262364175189, 'learning_rate': 8.289952830629462e-06, 'epoch': 0.29} 29%|██▉ | 10053/34278 [11:07:56<28:39:33, 4.26s/it] 29%|██▉ | 10054/34278 [11:07:59<26:30:54, 3.94s/it] {'loss': 0.1556, 'grad_norm': 0.826080934820219, 'learning_rate': 8.289597060659937e-06, 'epoch': 0.29} 29%|██▉ | 10054/34278 [11:07:59<26:30:54, 3.94s/it] 29%|██▉ | 10055/34278 [11:08:03<25:02:37, 3.72s/it] {'loss': 0.1523, 'grad_norm': 0.741746620216231, 'learning_rate': 8.289241261321674e-06, 'epoch': 0.29} 29%|██▉ | 10055/34278 [11:08:03<25:02:37, 3.72s/it] 29%|██▉ | 10056/34278 [11:08:06<24:22:59, 3.62s/it] {'loss': 0.1785, 'grad_norm': 0.8285011445736965, 'learning_rate': 8.288885432617853e-06, 'epoch': 0.29} 29%|██▉ | 10056/34278 [11:08:06<24:22:59, 3.62s/it] 29%|██▉ | 10057/34278 [11:08:09<23:00:53, 3.42s/it] {'loss': 0.1475, 'grad_norm': 0.7628317405006118, 'learning_rate': 8.288529574551645e-06, 'epoch': 0.29} 29%|██▉ | 10057/34278 [11:08:09<23:00:53, 3.42s/it] 29%|██▉ | 10058/34278 [11:08:13<23:17:36, 3.46s/it] {'loss': 0.1475, 'grad_norm': 0.7375996428106932, 'learning_rate': 8.288173687126231e-06, 'epoch': 0.29} 29%|██▉ | 10058/34278 [11:08:13<23:17:36, 3.46s/it] 29%|██▉ | 10059/34278 [11:08:16<22:23:18, 3.33s/it] {'loss': 0.1627, 'grad_norm': 0.9312608079105043, 'learning_rate': 8.287817770344789e-06, 'epoch': 0.29} 29%|██▉ | 10059/34278 [11:08:16<22:23:18, 3.33s/it] 29%|██▉ | 10060/34278 [11:08:21<27:30:06, 4.09s/it] {'loss': 0.1598, 'grad_norm': 0.6759824880556607, 'learning_rate': 8.287461824210491e-06, 
[training log excerpt: steps 10060–10253 of 34278 (~29–30%, epoch 0.29–0.30); per-step records of the form {'loss': ..., 'grad_norm': ..., 'learning_rate': ..., 'epoch': ...} with loss fluctuating roughly 0.12–0.19, grad_norm roughly 0.63–1.71, learning_rate decaying from 8.287e-06 to 8.218e-06, and throughput of about 3.0–5.0 s/it; each step line is duplicated by the tqdm progress-bar refresh, and the final record is truncated]
[recurring warning at steps 10095, 10183, 10221, 10225, and 10236: /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None]
8.218218587743192e-06, 'epoch': 0.3} 30%|██▉ | 10253/34278 [11:20:37<22:06:20, 3.31s/it] 30%|██▉ | 10254/34278 [11:20:41<22:36:26, 3.39s/it] {'loss': 0.1464, 'grad_norm': 0.7197261281266153, 'learning_rate': 8.217857007580888e-06, 'epoch': 0.3} 30%|██▉ | 10254/34278 [11:20:41<22:36:26, 3.39s/it] 30%|██▉ | 10255/34278 [11:20:45<24:17:27, 3.64s/it] {'loss': 0.1701, 'grad_norm': 0.9457110521780498, 'learning_rate': 8.217495398690324e-06, 'epoch': 0.3} 30%|██▉ | 10255/34278 [11:20:45<24:17:27, 3.64s/it] 30%|██▉ | 10256/34278 [11:20:49<24:09:15, 3.62s/it] {'loss': 0.1465, 'grad_norm': 0.9163208966796982, 'learning_rate': 8.21713376107473e-06, 'epoch': 0.3} 30%|██▉ | 10256/34278 [11:20:49<24:09:15, 3.62s/it] 30%|██▉ | 10257/34278 [11:20:52<23:00:22, 3.45s/it] {'loss': 0.1586, 'grad_norm': 0.8249674709456052, 'learning_rate': 8.216772094737332e-06, 'epoch': 0.3} 30%|██▉ | 10257/34278 [11:20:52<23:00:22, 3.45s/it] 30%|██▉ | 10258/34278 [11:20:55<22:17:41, 3.34s/it] {'loss': 0.1451, 'grad_norm': 0.8585279812762839, 'learning_rate': 8.216410399681365e-06, 'epoch': 0.3} 30%|██▉ | 10258/34278 [11:20:55<22:17:41, 3.34s/it] 30%|██▉ | 10259/34278 [11:20:58<21:55:01, 3.28s/it] {'loss': 0.1662, 'grad_norm': 0.6816281737593278, 'learning_rate': 8.21604867591005e-06, 'epoch': 0.3} 30%|██▉ | 10259/34278 [11:20:58<21:55:01, 3.28s/it] 30%|██▉ | 10260/34278 [11:21:02<22:59:35, 3.45s/it] {'loss': 0.1723, 'grad_norm': 0.9843714814859522, 'learning_rate': 8.215686923426622e-06, 'epoch': 0.3} 30%|██▉ | 10260/34278 [11:21:02<22:59:35, 3.45s/it] 30%|██▉ | 10261/34278 [11:21:05<22:53:40, 3.43s/it] {'loss': 0.1229, 'grad_norm': 0.7630391592359387, 'learning_rate': 8.215325142234307e-06, 'epoch': 0.3} 30%|██▉ | 10261/34278 [11:21:05<22:53:40, 3.43s/it] 30%|██▉ | 10262/34278 [11:21:10<25:09:02, 3.77s/it] {'loss': 0.1385, 'grad_norm': 0.9016404250362807, 'learning_rate': 8.214963332336339e-06, 'epoch': 0.3} 30%|██▉ | 10262/34278 [11:21:10<25:09:02, 3.77s/it] 30%|██▉ | 10263/34278 
[11:21:16<29:53:54, 4.48s/it] {'loss': 0.1648, 'grad_norm': 0.8493932257954774, 'learning_rate': 8.214601493735942e-06, 'epoch': 0.3} 30%|██▉ | 10263/34278 [11:21:16<29:53:54, 4.48s/it] 30%|██▉ | 10264/34278 [11:21:22<33:08:11, 4.97s/it] {'loss': 0.1739, 'grad_norm': 0.9210046561471733, 'learning_rate': 8.214239626436354e-06, 'epoch': 0.3} 30%|██▉ | 10264/34278 [11:21:22<33:08:11, 4.97s/it] 30%|██▉ | 10265/34278 [11:21:27<32:21:15, 4.85s/it] {'loss': 0.1584, 'grad_norm': 0.7523427517724842, 'learning_rate': 8.2138777304408e-06, 'epoch': 0.3} 30%|██▉ | 10265/34278 [11:21:27<32:21:15, 4.85s/it] 30%|██▉ | 10266/34278 [11:21:30<30:19:10, 4.55s/it] {'loss': 0.1504, 'grad_norm': 0.8284269398140005, 'learning_rate': 8.213515805752513e-06, 'epoch': 0.3} 30%|██▉ | 10266/34278 [11:21:30<30:19:10, 4.55s/it] 30%|██▉ | 10267/34278 [11:21:34<28:13:00, 4.23s/it] {'loss': 0.1479, 'grad_norm': 0.7850184968983689, 'learning_rate': 8.213153852374726e-06, 'epoch': 0.3} 30%|██▉ | 10267/34278 [11:21:34<28:13:00, 4.23s/it] 30%|██▉ | 10268/34278 [11:21:37<25:43:23, 3.86s/it] {'loss': 0.1479, 'grad_norm': 0.9958689700645952, 'learning_rate': 8.212791870310665e-06, 'epoch': 0.3} 30%|██▉ | 10268/34278 [11:21:37<25:43:23, 3.86s/it] 30%|██▉ | 10269/34278 [11:21:40<24:22:00, 3.65s/it] {'loss': 0.1367, 'grad_norm': 0.7823202795779842, 'learning_rate': 8.212429859563569e-06, 'epoch': 0.3} 30%|██▉ | 10269/34278 [11:21:40<24:22:00, 3.65s/it] 30%|██▉ | 10270/34278 [11:21:43<22:40:41, 3.40s/it] {'loss': 0.1482, 'grad_norm': 0.9517726455881705, 'learning_rate': 8.212067820136663e-06, 'epoch': 0.3} 30%|██▉ | 10270/34278 [11:21:43<22:40:41, 3.40s/it] 30%|██▉ | 10271/34278 [11:21:46<22:38:08, 3.39s/it] {'loss': 0.1494, 'grad_norm': 0.747457871674271, 'learning_rate': 8.211705752033183e-06, 'epoch': 0.3} 30%|██▉ | 10271/34278 [11:21:46<22:38:08, 3.39s/it] 30%|██▉ | 10272/34278 [11:21:50<22:23:41, 3.36s/it] {'loss': 0.1229, 'grad_norm': 0.6483071425870026, 'learning_rate': 8.211343655256361e-06, 'epoch': 
0.3} 30%|██▉ | 10272/34278 [11:21:50<22:23:41, 3.36s/it] 30%|██▉ | 10273/34278 [11:21:52<21:12:33, 3.18s/it] {'loss': 0.1444, 'grad_norm': 0.9479736087280959, 'learning_rate': 8.210981529809432e-06, 'epoch': 0.3} 30%|██▉ | 10273/34278 [11:21:52<21:12:33, 3.18s/it] 30%|██▉ | 10274/34278 [11:21:56<22:15:49, 3.34s/it] {'loss': 0.15, 'grad_norm': 0.6231169873050983, 'learning_rate': 8.210619375695622e-06, 'epoch': 0.3} 30%|██▉ | 10274/34278 [11:21:56<22:15:49, 3.34s/it] 30%|██▉ | 10275/34278 [11:21:59<20:50:56, 3.13s/it] {'loss': 0.1386, 'grad_norm': 0.8844057892049493, 'learning_rate': 8.210257192918172e-06, 'epoch': 0.3} 30%|██▉ | 10275/34278 [11:21:59<20:50:56, 3.13s/it] 30%|██▉ | 10276/34278 [11:22:05<26:42:03, 4.00s/it] {'loss': 0.1394, 'grad_norm': 0.7838326925631378, 'learning_rate': 8.20989498148031e-06, 'epoch': 0.3} 30%|██▉ | 10276/34278 [11:22:05<26:42:03, 4.00s/it] 30%|██▉ | 10277/34278 [11:22:08<25:48:36, 3.87s/it] {'loss': 0.14, 'grad_norm': 0.7596321222748872, 'learning_rate': 8.209532741385273e-06, 'epoch': 0.3} 30%|██▉ | 10277/34278 [11:22:08<25:48:36, 3.87s/it] 30%|██▉ | 10278/34278 [11:22:12<26:13:41, 3.93s/it] {'loss': 0.1774, 'grad_norm': 0.7995005177917673, 'learning_rate': 8.209170472636293e-06, 'epoch': 0.3} 30%|██▉ | 10278/34278 [11:22:12<26:13:41, 3.93s/it] 30%|██▉ | 10279/34278 [11:22:16<25:08:38, 3.77s/it] {'loss': 0.1642, 'grad_norm': 0.8847720040873738, 'learning_rate': 8.208808175236607e-06, 'epoch': 0.3} 30%|██▉ | 10279/34278 [11:22:16<25:08:38, 3.77s/it] 30%|██▉ | 10280/34278 [11:22:19<25:03:21, 3.76s/it] {'loss': 0.1556, 'grad_norm': 0.7847791299400547, 'learning_rate': 8.208445849189445e-06, 'epoch': 0.3} 30%|██▉ | 10280/34278 [11:22:19<25:03:21, 3.76s/it] 30%|██▉ | 10281/34278 [11:22:22<23:22:49, 3.51s/it] {'loss': 0.1385, 'grad_norm': 0.8683120041271374, 'learning_rate': 8.208083494498045e-06, 'epoch': 0.3} 30%|██▉ | 10281/34278 [11:22:22<23:22:49, 3.51s/it] 30%|██▉ | 10282/34278 [11:22:26<22:55:27, 3.44s/it] {'loss': 0.1565, 
'grad_norm': 0.9240785679067222, 'learning_rate': 8.207721111165643e-06, 'epoch': 0.3} 30%|██▉ | 10282/34278 [11:22:26<22:55:27, 3.44s/it] 30%|██▉ | 10283/34278 [11:22:32<27:47:43, 4.17s/it] {'loss': 0.1437, 'grad_norm': 0.7920139812191519, 'learning_rate': 8.207358699195471e-06, 'epoch': 0.3} 30%|██▉ | 10283/34278 [11:22:32<27:47:43, 4.17s/it] 30%|███ | 10284/34278 [11:22:37<31:14:59, 4.69s/it] {'loss': 0.1624, 'grad_norm': 1.0384454371715393, 'learning_rate': 8.206996258590767e-06, 'epoch': 0.3} 30%|███ | 10284/34278 [11:22:37<31:14:59, 4.69s/it] 30%|███ | 10285/34278 [11:22:40<27:30:15, 4.13s/it] {'loss': 0.147, 'grad_norm': 1.1790725332448648, 'learning_rate': 8.206633789354766e-06, 'epoch': 0.3} 30%|███ | 10285/34278 [11:22:40<27:30:15, 4.13s/it] 30%|███ | 10286/34278 [11:22:46<30:08:39, 4.52s/it] {'loss': 0.1624, 'grad_norm': 0.7010278425790094, 'learning_rate': 8.206271291490704e-06, 'epoch': 0.3} 30%|███ | 10286/34278 [11:22:46<30:08:39, 4.52s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 30%|███ | 10287/34278 [11:22:49<27:46:41, 4.17s/it] {'loss': 0.1483, 'grad_norm': 1.2912617849744015, 'learning_rate': 8.205908765001817e-06, 'epoch': 0.3} 30%|███ | 10287/34278 [11:22:49<27:46:41, 4.17s/it] 30%|███ | 10288/34278 [11:22:52<25:28:13, 3.82s/it] {'loss': 0.1336, 'grad_norm': 1.1345470992794995, 'learning_rate': 8.205546209891341e-06, 'epoch': 0.3} 30%|███ | 10288/34278 [11:22:52<25:28:13, 3.82s/it] 30%|███ | 10289/34278 [11:22:56<25:06:45, 3.77s/it] {'loss': 0.1351, 'grad_norm': 0.7795416089869779, 'learning_rate': 8.205183626162515e-06, 'epoch': 0.3} 30%|███ | 10289/34278 [11:22:56<25:06:45, 3.77s/it] 30%|███ | 10290/34278 [11:23:02<30:29:00, 4.57s/it] {'loss': 0.1624, 'grad_norm': 0.9744407377689432, 'learning_rate': 8.204821013818576e-06, 'epoch': 0.3} 30%|███ | 10290/34278 [11:23:02<30:29:00, 4.57s/it] 30%|███ | 10291/34278 [11:23:05<27:49:20, 4.18s/it] {'loss': 0.1511, 'grad_norm': 1.1623602615663815, 'learning_rate': 8.204458372862757e-06, 'epoch': 0.3} 30%|███ | 10291/34278 [11:23:05<27:49:20, 4.18s/it] 30%|███ | 10292/34278 [11:23:08<25:27:07, 3.82s/it] {'loss': 0.1634, 'grad_norm': 1.5405276564005022, 'learning_rate': 8.2040957032983e-06, 'epoch': 0.3} 30%|███ | 10292/34278 [11:23:08<25:27:07, 3.82s/it] 30%|███ | 10293/34278 [11:23:14<29:55:33, 4.49s/it] {'loss': 0.1395, 'grad_norm': 0.8384976869936589, 'learning_rate': 8.203733005128443e-06, 'epoch': 0.3} 30%|███ | 10293/34278 [11:23:14<29:55:33, 4.49s/it] 30%|███ | 10294/34278 [11:23:18<27:27:20, 4.12s/it] {'loss': 0.1606, 'grad_norm': 1.538099235954295, 'learning_rate': 8.203370278356422e-06, 'epoch': 0.3} 30%|███ | 10294/34278 [11:23:18<27:27:20, 4.12s/it] 30%|███ | 10295/34278 [11:23:21<25:37:41, 3.85s/it] {'loss': 0.1685, 'grad_norm': 0.9786164778485515, 'learning_rate': 8.203007522985474e-06, 'epoch': 0.3} 30%|███ | 10295/34278 [11:23:21<25:37:41, 3.85s/it] 30%|███ | 10296/34278 [11:23:25<25:14:09, 3.79s/it] {'loss': 0.168, 'grad_norm': 
0.9661251732423624, 'learning_rate': 8.202644739018839e-06, 'epoch': 0.3} 30%|███ | 10296/34278 [11:23:25<25:14:09, 3.79s/it] 30%|███ | 10297/34278 [11:23:28<25:31:44, 3.83s/it] {'loss': 0.1399, 'grad_norm': 0.6613973036691084, 'learning_rate': 8.20228192645976e-06, 'epoch': 0.3} 30%|███ | 10297/34278 [11:23:28<25:31:44, 3.83s/it] 30%|███ | 10298/34278 [11:23:32<24:03:40, 3.61s/it] {'loss': 0.1361, 'grad_norm': 0.8107857653177807, 'learning_rate': 8.201919085311468e-06, 'epoch': 0.3} 30%|███ | 10298/34278 [11:23:32<24:03:40, 3.61s/it] 30%|███ | 10299/34278 [11:23:35<23:53:01, 3.59s/it] {'loss': 0.154, 'grad_norm': 0.9302113463508935, 'learning_rate': 8.20155621557721e-06, 'epoch': 0.3} 30%|███ | 10299/34278 [11:23:35<23:53:01, 3.59s/it] 30%|███ | 10300/34278 [11:23:39<23:49:57, 3.58s/it] {'loss': 0.1506, 'grad_norm': 0.725640049113617, 'learning_rate': 8.20119331726022e-06, 'epoch': 0.3} 30%|███ | 10300/34278 [11:23:39<23:49:57, 3.58s/it] 30%|███ | 10301/34278 [11:23:43<24:35:40, 3.69s/it] {'loss': 0.1611, 'grad_norm': 1.125724549815784, 'learning_rate': 8.200830390363741e-06, 'epoch': 0.3} 30%|███ | 10301/34278 [11:23:43<24:35:40, 3.69s/it] 30%|███ | 10302/34278 [11:23:46<24:28:14, 3.67s/it] {'loss': 0.1486, 'grad_norm': 0.987039406035594, 'learning_rate': 8.200467434891013e-06, 'epoch': 0.3} 30%|███ | 10302/34278 [11:23:46<24:28:14, 3.67s/it] 30%|███ | 10303/34278 [11:23:52<28:47:11, 4.32s/it] {'loss': 0.1359, 'grad_norm': 0.7619418183972609, 'learning_rate': 8.200104450845276e-06, 'epoch': 0.3} 30%|███ | 10303/34278 [11:23:52<28:47:11, 4.32s/it] 30%|███ | 10304/34278 [11:23:55<26:42:30, 4.01s/it] {'loss': 0.1679, 'grad_norm': 0.9464439971922779, 'learning_rate': 8.19974143822977e-06, 'epoch': 0.3} 30%|███ | 10304/34278 [11:23:55<26:42:30, 4.01s/it] 30%|███ | 10305/34278 [11:24:00<28:28:46, 4.28s/it] {'loss': 0.146, 'grad_norm': 0.9102843331183711, 'learning_rate': 8.199378397047737e-06, 'epoch': 0.3} 30%|███ | 10305/34278 [11:24:00<28:28:46, 
4.28s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 30%|███ | 10306/34278 [11:24:03<25:47:20, 3.87s/it] {'loss': 0.1658, 'grad_norm': 0.8757140850972511, 'learning_rate': 8.199015327302416e-06, 'epoch': 0.3} 30%|███ | 10306/34278 [11:24:03<25:47:20, 3.87s/it] 30%|███ | 10307/34278 [11:24:09<29:47:17, 4.47s/it] {'loss': 0.2044, 'grad_norm': 1.0102006920627888, 'learning_rate': 8.19865222899705e-06, 'epoch': 0.3} 30%|███ | 10307/34278 [11:24:09<29:47:17, 4.47s/it] 30%|███ | 10308/34278 [11:24:13<28:00:50, 4.21s/it] {'loss': 0.1744, 'grad_norm': 1.0609644563273757, 'learning_rate': 8.198289102134883e-06, 'epoch': 0.3} 30%|███ | 10308/34278 [11:24:13<28:00:50, 4.21s/it] 30%|███ | 10309/34278 [11:24:16<25:52:34, 3.89s/it] {'loss': 0.171, 'grad_norm': 0.938656871725238, 'learning_rate': 8.197925946719152e-06, 'epoch': 0.3} 30%|███ | 10309/34278 [11:24:16<25:52:34, 3.89s/it] 30%|███ | 10310/34278 [11:24:22<30:09:17, 4.53s/it] {'loss': 0.1357, 'grad_norm': 0.8870932026683985, 'learning_rate': 8.197562762753102e-06, 'epoch': 0.3} 30%|███ | 10310/34278 [11:24:22<30:09:17, 4.53s/it] 30%|███ | 10311/34278 [11:24:25<27:03:23, 4.06s/it] {'loss': 0.1605, 'grad_norm': 0.8816972483532031, 'learning_rate': 8.197199550239974e-06, 'epoch': 0.3} 30%|███ | 10311/34278 [11:24:25<27:03:23, 4.06s/it] 30%|███ | 10312/34278 [11:24:28<24:53:51, 3.74s/it] {'loss': 0.1447, 'grad_norm': 0.9581892569146573, 'learning_rate': 8.196836309183014e-06, 'epoch': 0.3} 30%|███ | 10312/34278 [11:24:28<24:53:51, 3.74s/it] 30%|███ | 10313/34278 [11:24:31<24:03:14, 3.61s/it] {'loss': 0.1508, 'grad_norm': 0.8903482159640526, 'learning_rate': 8.19647303958546e-06, 'epoch': 0.3} 30%|███ | 10313/34278 [11:24:31<24:03:14, 3.61s/it] 30%|███ | 10314/34278 [11:24:34<23:12:10, 3.49s/it] {'loss': 0.1396, 'grad_norm': 1.0125581683334, 
'learning_rate': 8.19610974145056e-06, 'epoch': 0.3} 30%|███ | 10314/34278 [11:24:34<23:12:10, 3.49s/it] 30%|███ | 10315/34278 [11:24:38<23:01:24, 3.46s/it] {'loss': 0.1712, 'grad_norm': 0.9137559289052214, 'learning_rate': 8.195746414781554e-06, 'epoch': 0.3} 30%|███ | 10315/34278 [11:24:38<23:01:24, 3.46s/it] 30%|███ | 10316/34278 [11:24:41<21:49:37, 3.28s/it] {'loss': 0.1784, 'grad_norm': 0.9293851882221098, 'learning_rate': 8.195383059581685e-06, 'epoch': 0.3} 30%|███ | 10316/34278 [11:24:41<21:49:37, 3.28s/it] 30%|███ | 10317/34278 [11:24:44<21:26:06, 3.22s/it] {'loss': 0.1461, 'grad_norm': 0.7644480786179568, 'learning_rate': 8.195019675854201e-06, 'epoch': 0.3} 30%|███ | 10317/34278 [11:24:44<21:26:06, 3.22s/it] 30%|███ | 10318/34278 [11:24:47<21:24:07, 3.22s/it] {'loss': 0.1467, 'grad_norm': 0.954902379280209, 'learning_rate': 8.194656263602345e-06, 'epoch': 0.3} 30%|███ | 10318/34278 [11:24:47<21:24:07, 3.22s/it] 30%|███ | 10319/34278 [11:24:50<21:55:32, 3.29s/it] {'loss': 0.1612, 'grad_norm': 0.9365848295917715, 'learning_rate': 8.194292822829359e-06, 'epoch': 0.3} 30%|███ | 10319/34278 [11:24:50<21:55:32, 3.29s/it] 30%|███ | 10320/34278 [11:24:53<21:35:02, 3.24s/it] {'loss': 0.1601, 'grad_norm': 0.889638080171425, 'learning_rate': 8.19392935353849e-06, 'epoch': 0.3} 30%|███ | 10320/34278 [11:24:53<21:35:02, 3.24s/it] 30%|███ | 10321/34278 [11:24:56<20:46:40, 3.12s/it] {'loss': 0.1455, 'grad_norm': 0.900540095228079, 'learning_rate': 8.193565855732982e-06, 'epoch': 0.3} 30%|███ | 10321/34278 [11:24:56<20:46:40, 3.12s/it] 30%|███ | 10322/34278 [11:25:02<26:21:26, 3.96s/it] {'loss': 0.162, 'grad_norm': 0.6722095940519582, 'learning_rate': 8.193202329416079e-06, 'epoch': 0.3} 30%|███ | 10322/34278 [11:25:02<26:21:26, 3.96s/it] 30%|███ | 10323/34278 [11:25:05<24:22:34, 3.66s/it] {'loss': 0.1422, 'grad_norm': 1.0135786558021573, 'learning_rate': 8.19283877459103e-06, 'epoch': 0.3} 30%|███ | 10323/34278 [11:25:05<24:22:34, 3.66s/it]Traceback (most recent call 
last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa75b74d8a0>
Failed to fetch sample 3221373.
Exception: cannot identify image file <_io.BytesIO object at 0x7fa75b74d8a0> 30%|███ | 10324/34278 [11:25:09<24:54:30, 3.74s/it] {'loss': 0.1362, 'grad_norm': 0.9075304908466894, 'learning_rate': 8.192475191261078e-06, 'epoch': 0.3} 30%|███ | 10324/34278 [11:25:09<24:54:30, 3.74s/it] 30%|███ | 10325/34278 [11:25:12<23:07:21, 3.48s/it] {'loss': 0.168, 'grad_norm': 0.9742175270809147, 'learning_rate': 8.19211157942947e-06, 'epoch': 0.3} 30%|███ | 10325/34278 [11:25:12<23:07:21, 3.48s/it] 30%|███ | 10326/34278 [11:25:15<22:02:38, 3.31s/it] {'loss': 0.1618, 'grad_norm': 0.8020497956000485, 'learning_rate': 8.19174793909945e-06, 'epoch': 0.3} 30%|███ | 10326/34278 [11:25:15<22:02:38, 3.31s/it] 30%|███ | 10327/34278 [11:25:18<21:34:38, 3.24s/it] {'loss': 0.1741, 'grad_norm': 0.8269449779623378, 'learning_rate': 8.191384270274267e-06, 'epoch': 0.3} 30%|███ | 10327/34278 [11:25:18<21:34:38, 3.24s/it] 30%|███ | 10328/34278 [11:25:22<22:53:53, 3.44s/it] {'loss': 0.1693, 'grad_norm': 0.8019645339230919, 'learning_rate': 8.191020572957168e-06, 'epoch': 0.3} 30%|███ | 10328/34278 [11:25:22<22:53:53, 3.44s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8540 > 8192). 
Running this sequence through the model will result in indexing errors 30%|███ | 10329/34278 [11:25:25<22:45:34, 3.42s/it] {'loss': 0.1385, 'grad_norm': 0.8147885163158093, 'learning_rate': 8.190656847151399e-06, 'epoch': 0.3} 30%|███ | 10329/34278 [11:25:25<22:45:34, 3.42s/it] 30%|███ | 10330/34278 [11:25:29<22:56:17, 3.45s/it] {'loss': 0.1617, 'grad_norm': 0.621541687023796, 'learning_rate': 8.190293092860206e-06, 'epoch': 0.3} 30%|███ | 10330/34278 [11:25:29<22:56:17, 3.45s/it] 30%|███ | 10331/34278 [11:25:32<21:57:59, 3.30s/it] {'loss': 0.171, 'grad_norm': 0.7998469469603753, 'learning_rate': 8.18992931008684e-06, 'epoch': 0.3} 30%|███ | 10331/34278 [11:25:32<21:57:59, 3.30s/it] 30%|███ | 10332/34278 [11:25:36<23:14:54, 3.50s/it] {'loss': 0.148, 'grad_norm': 0.6410746273269085, 'learning_rate': 8.189565498834545e-06, 'epoch': 0.3} 30%|███ | 10332/34278 [11:25:36<23:14:54, 3.50s/it] 30%|███ | 10333/34278 [11:25:41<26:24:04, 3.97s/it] {'loss': 0.1833, 'grad_norm': 0.7768540062680194, 'learning_rate': 8.18920165910657e-06, 'epoch': 0.3} 30%|███ | 10333/34278 [11:25:41<26:24:04, 3.97s/it] 30%|███ | 10334/34278 [11:25:44<25:51:17, 3.89s/it] {'loss': 0.1354, 'grad_norm': 0.7328658348242143, 'learning_rate': 8.188837790906166e-06, 'epoch': 0.3} 30%|███ | 10334/34278 [11:25:44<25:51:17, 3.89s/it] 30%|███ | 10335/34278 [11:25:48<24:15:15, 3.65s/it] {'loss': 0.1334, 'grad_norm': 0.6725912272901395, 'learning_rate': 8.18847389423658e-06, 'epoch': 0.3} 30%|███ | 10335/34278 [11:25:48<24:15:15, 3.65s/it] 30%|███ | 10336/34278 [11:25:51<23:33:59, 3.54s/it] {'loss': 0.1394, 'grad_norm': 0.6057666850636925, 'learning_rate': 8.188109969101057e-06, 'epoch': 0.3} 30%|███ | 10336/34278 [11:25:51<23:33:59, 3.54s/it] 30%|███ | 10337/34278 [11:25:54<23:11:56, 3.49s/it] {'loss': 0.161, 'grad_norm': 0.9733823505793422, 'learning_rate': 8.187746015502851e-06, 'epoch': 0.3} 30%|███ | 10337/34278 [11:25:54<23:11:56, 3.49s/it] 30%|███ | 10338/34278 [11:26:00<27:00:03, 4.06s/it] {'loss': 
0.1712, 'grad_norm': 0.7831182117923976, 'learning_rate': 8.187382033445209e-06, 'epoch': 0.3} 30%|███ | 10338/34278 [11:26:00<27:00:03, 4.06s/it] 30%|███ | 10339/34278 [11:26:02<24:24:26, 3.67s/it] {'loss': 0.1333, 'grad_norm': 0.9387900721378101, 'learning_rate': 8.187018022931383e-06, 'epoch': 0.3} 30%|███ | 10339/34278 [11:26:02<24:24:26, 3.67s/it] 30%|███ | 10340/34278 [11:26:09<29:45:19, 4.47s/it] {'loss': 0.1591, 'grad_norm': 0.882741465257527, 'learning_rate': 8.18665398396462e-06, 'epoch': 0.3} 30%|███ | 10340/34278 [11:26:09<29:45:19, 4.47s/it] 30%|███ | 10341/34278 [11:26:15<32:56:01, 4.95s/it] {'loss': 0.1591, 'grad_norm': 0.7772272118956491, 'learning_rate': 8.186289916548169e-06, 'epoch': 0.3} 30%|███ | 10341/34278 [11:26:15<32:56:01, 4.95s/it] 30%|███ | 10342/34278 [11:26:18<28:36:27, 4.30s/it] {'loss': 0.152, 'grad_norm': 0.7726661431253278, 'learning_rate': 8.185925820685283e-06, 'epoch': 0.3} 30%|███ | 10342/34278 [11:26:18<28:36:27, 4.30s/it] 30%|███ | 10343/34278 [11:26:21<26:37:46, 4.01s/it] {'loss': 0.167, 'grad_norm': 0.8159915507142284, 'learning_rate': 8.185561696379213e-06, 'epoch': 0.3} 30%|███ | 10343/34278 [11:26:21<26:37:46, 4.01s/it] 30%|███ | 10344/34278 [11:26:24<25:02:49, 3.77s/it] {'loss': 0.1474, 'grad_norm': 0.7668837526204888, 'learning_rate': 8.185197543633207e-06, 'epoch': 0.3} 30%|███ | 10344/34278 [11:26:24<25:02:49, 3.77s/it] 30%|███ | 10345/34278 [11:26:27<23:25:53, 3.52s/it] {'loss': 0.1538, 'grad_norm': 0.7665248750590504, 'learning_rate': 8.18483336245052e-06, 'epoch': 0.3} 30%|███ | 10345/34278 [11:26:27<23:25:53, 3.52s/it] 30%|███ | 10346/34278 [11:26:33<28:14:35, 4.25s/it] {'loss': 0.1423, 'grad_norm': 0.8241660718574091, 'learning_rate': 8.1844691528344e-06, 'epoch': 0.3} 30%|███ | 10346/34278 [11:26:33<28:14:35, 4.25s/it] 30%|███ | 10347/34278 [11:26:39<31:24:55, 4.73s/it] {'loss': 0.1321, 'grad_norm': 0.7919856246679993, 'learning_rate': 8.1841049147881e-06, 'epoch': 0.3} 30%|███ | 10347/34278 [11:26:39<31:24:55, 
4.73s/it]
30%|███ | 10348/34278 [11:26:42<27:47:18, 4.18s/it] {'loss': 0.1747, 'grad_norm': 0.742251352204128, 'learning_rate': 8.183740648314871e-06, 'epoch': 0.3} 30%|███ | 10348/34278 [11:26:42<27:47:18, 4.18s/it]
30%|███ | 10349/34278 [11:26:46<27:23:17, 4.12s/it] {'loss': 0.1432, 'grad_norm': 0.8004211233653133, 'learning_rate': 8.183376353417965e-06, 'epoch': 0.3} 30%|███ | 10349/34278 [11:26:46<27:23:17, 4.12s/it]
30%|███ | 10350/34278 [11:26:49<24:52:35, 3.74s/it] {'loss': 0.1413, 'grad_norm': 0.9220228761361982, 'learning_rate': 8.183012030100634e-06, 'epoch': 0.3} 30%|███ | 10350/34278 [11:26:49<24:52:35, 3.74s/it]
30%|███ | 10351/34278 [11:26:52<24:16:39, 3.65s/it] {'loss': 0.1532, 'grad_norm': 0.656116739552237, 'learning_rate': 8.182647678366133e-06, 'epoch': 0.3} 30%|███ | 10351/34278 [11:26:52<24:16:39, 3.65s/it]
30%|███ | 10352/34278 [11:26:55<22:54:02, 3.45s/it] {'loss': 0.1712, 'grad_norm': 0.8473274892563499, 'learning_rate': 8.182283298217712e-06, 'epoch': 0.3} 30%|███ | 10352/34278 [11:26:55<22:54:02, 3.45s/it]
30%|███ | 10353/34278 [11:26:58<21:28:37, 3.23s/it] {'loss': 0.175, 'grad_norm': 0.9386840779750911, 'learning_rate': 8.181918889658626e-06, 'epoch': 0.3} 30%|███ | 10353/34278 [11:26:58<21:28:37, 3.23s/it]
30%|███ | 10354/34278 [11:27:02<23:01:27, 3.46s/it] {'loss': 0.1527, 'grad_norm': 0.7958658354844564, 'learning_rate': 8.18155445269213e-06, 'epoch': 0.3} 30%|███ | 10354/34278 [11:27:02<23:01:27, 3.46s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41678e8400>
Failed to fetch sample 3255715. Exception: cannot identify image file <_io.BytesIO object at 0x7f41678e8400>
30%|███ | 10355/34278 [11:27:05<22:55:51, 3.45s/it] {'loss': 0.1483, 'grad_norm': 0.8519466938766371, 'learning_rate': 8.181189987321472e-06, 'epoch': 0.3} 30%|███ | 10355/34278 [11:27:05<22:55:51, 3.45s/it]
30%|███ | 10356/34278 [11:27:09<24:38:48, 3.71s/it] {'loss': 0.1497, 'grad_norm': 0.847514887919152, 'learning_rate': 8.180825493549911e-06, 'epoch': 0.3} 30%|███ | 10356/34278 [11:27:09<24:38:48, 3.71s/it]
30%|███ | 10357/34278 [11:27:13<24:06:32, 3.63s/it] {'loss': 0.1508, 'grad_norm': 0.7475612548723383, 'learning_rate': 8.180460971380699e-06, 'epoch': 0.3} 30%|███ | 10357/34278 [11:27:13<24:06:32, 3.63s/it]
30%|███ | 10358/34278 [11:27:16<24:00:27, 3.61s/it] {'loss': 0.1632, 'grad_norm': 0.838551162543459, 'learning_rate': 8.18009642081709e-06, 'epoch': 0.3} 30%|███ | 10358/34278 [11:27:16<24:00:27, 3.61s/it]
30%|███ | 10359/34278 [11:27:23<29:00:55, 4.37s/it] {'loss': 0.1561, 'grad_norm': 0.9474378218034406, 'learning_rate': 8.17973184186234e-06, 'epoch': 0.3} 30%|███ | 10359/34278 [11:27:23<29:00:55, 4.37s/it]
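The tracebacks above show individual records whose bytes PIL cannot decode (`UnidentifiedImageError`), after which the loader logs `Failed to fetch sample …` and training continues with another sample. A common way to get that behavior is to catch the decode error inside `__getitem__` and fall back to a different record. A minimal sketch of that pattern (all names here are hypothetical; the actual `dataset.py` from the log is not shown):

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader(img_bytes: bytes) -> Image.Image:
    """Decode raw bytes into an RGB PIL image; raises UnidentifiedImageError
    for corrupted or non-image bytes, as in the tracebacks above."""
    return Image.open(io.BytesIO(img_bytes)).convert("RGB")


class FallbackImageDataset:
    """Toy dataset: on a decode failure, log it and fall back to the next
    record instead of crashing the whole training run (sketch only)."""

    def __init__(self, records, max_retries: int = 10):
        self.records = records          # list of raw image byte strings
        self.max_retries = max_retries  # give up after this many bad records

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return pil_loader(self.records[i])
            except UnidentifiedImageError as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.records)  # deterministic fallback
        raise RuntimeError("too many undecodable samples in a row")
```

Skipping bad records keeps a long run alive, at the cost of silently changing the effective sampling distribution, so the failure count is worth monitoring.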
30%|███ | 10360/34278 [11:27:26<27:10:10, 4.09s/it] {'loss': 0.1462, 'grad_norm': 0.9161082185613947, 'learning_rate': 8.179367234519704e-06, 'epoch': 0.3} 30%|███ | 10360/34278 [11:27:26<27:10:10, 4.09s/it] 30%|███ | 10361/34278 [11:27:30<26:13:24, 3.95s/it] {'loss': 0.1411, 'grad_norm': 0.8657332818263342, 'learning_rate': 8.179002598792435e-06, 'epoch': 0.3} 30%|███ | 10361/34278 [11:27:30<26:13:24, 3.95s/it] 30%|███ | 10362/34278 [11:27:35<30:01:55, 4.52s/it] {'loss': 0.1604, 'grad_norm': 0.9058986082532352, 'learning_rate': 8.17863793468379e-06, 'epoch': 0.3} 30%|███ | 10362/34278 [11:27:35<30:01:55, 4.52s/it] 30%|███ | 10363/34278 [11:27:41<31:18:22, 4.71s/it] {'loss': 0.1455, 'grad_norm': 0.7833653167162915, 'learning_rate': 8.178273242197025e-06, 'epoch': 0.3} 30%|███ | 10363/34278 [11:27:41<31:18:22, 4.71s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8442 > 8192). Running this sequence through the model will result in indexing errors 30%|███ | 10364/34278 [11:27:45<29:51:53, 4.50s/it] {'loss': 0.1505, 'grad_norm': 0.752484372612084, 'learning_rate': 8.177908521335395e-06, 'epoch': 0.3} 30%|███ | 10364/34278 [11:27:45<29:51:53, 4.50s/it] 30%|███ | 10365/34278 [11:27:51<33:29:48, 5.04s/it] {'loss': 0.1355, 'grad_norm': 0.8873062199169135, 'learning_rate': 8.177543772102155e-06, 'epoch': 0.3} 30%|███ | 10365/34278 [11:27:51<33:29:48, 5.04s/it] 30%|███ | 10366/34278 [11:27:54<30:18:20, 4.56s/it] {'loss': 0.1776, 'grad_norm': 0.8772193812904566, 'learning_rate': 8.177178994500564e-06, 'epoch': 0.3} 30%|███ | 10366/34278 [11:27:54<30:18:20, 4.56s/it] 30%|███ | 10367/34278 [11:27:58<29:17:50, 4.41s/it] {'loss': 0.1451, 'grad_norm': 0.7696517234416018, 'learning_rate': 8.176814188533877e-06, 'epoch': 0.3} 30%|███ | 10367/34278 [11:27:58<29:17:50, 4.41s/it] 30%|███ | 10368/34278 [11:28:04<31:05:16, 4.68s/it] {'loss': 0.1495, 'grad_norm': 0.8426114226823732, 'learning_rate': 8.17644935420535e-06, 'epoch': 0.3} 
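The tokenizer warnings in this log (`8540 > 8192` and `8442 > 8192`) flag sequences longer than the model's maximum length; unless such samples are truncated or dropped before the forward pass, position embeddings and attention masks can index out of range. A minimal guard, under the assumption that simple right-truncation is acceptable (the log does not show how the training code actually handles these samples):

```python
MODEL_MAX_LEN = 8192  # maximum sequence length reported in the warning


def truncate_ids(input_ids: list[int], max_len: int = MODEL_MAX_LEN) -> list[int]:
    """Clip a token-id sequence to the model's maximum length.

    Right-truncation is the simplest policy; a real pipeline might instead
    drop the sample, or truncate from the left to keep the most recent
    context in front of the model.
    """
    return input_ids if len(input_ids) <= max_len else input_ids[:max_len]
```

For multimodal samples like these, truncation must not cut through image placeholder tokens, so dropping the over-length sample entirely is often the safer choice.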
30%|███ | 10368/34278 [11:28:04<31:05:16, 4.68s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 30%|███ | 10369/34278 [11:28:07<27:17:48, 4.11s/it] {'loss': 0.1502, 'grad_norm': 0.8623379632947579, 'learning_rate': 8.176084491518245e-06, 'epoch': 0.3} 30%|███ | 10369/34278 [11:28:07<27:17:48, 4.11s/it] 30%|███ | 10370/34278 [11:28:10<26:09:55, 3.94s/it] {'loss': 0.1406, 'grad_norm': 0.7152243468324411, 'learning_rate': 8.175719600475813e-06, 'epoch': 0.3} 30%|███ | 10370/34278 [11:28:10<26:09:55, 3.94s/it] 30%|███ | 10371/34278 [11:28:13<23:50:42, 3.59s/it] {'loss': 0.1512, 'grad_norm': 0.9131510402685786, 'learning_rate': 8.175354681081316e-06, 'epoch': 0.3} 30%|███ | 10371/34278 [11:28:13<23:50:42, 3.59s/it] 30%|███ | 10372/34278 [11:28:17<25:16:29, 3.81s/it] {'loss': 0.1558, 'grad_norm': 0.8318978801392292, 'learning_rate': 8.174989733338009e-06, 'epoch': 0.3} 30%|███ | 10372/34278 [11:28:17<25:16:29, 3.81s/it] 30%|███ | 10373/34278 [11:28:20<23:19:02, 3.51s/it] {'loss': 0.1456, 'grad_norm': 0.7242443691616607, 'learning_rate': 8.174624757249153e-06, 'epoch': 0.3} 30%|███ | 10373/34278 [11:28:20<23:19:02, 3.51s/it] 30%|███ | 10374/34278 [11:28:24<23:27:29, 3.53s/it] {'loss': 0.1537, 'grad_norm': 0.8414540738241345, 'learning_rate': 8.174259752818003e-06, 'epoch': 0.3} 30%|███ | 10374/34278 [11:28:24<23:27:29, 3.53s/it] 30%|███ | 10375/34278 [11:28:27<23:01:57, 3.47s/it] {'loss': 0.1369, 'grad_norm': 0.7253542838346363, 'learning_rate': 8.173894720047821e-06, 'epoch': 0.3} 30%|███ | 10375/34278 [11:28:27<23:01:57, 3.47s/it] 30%|███ | 10376/34278 [11:28:31<23:53:03, 3.60s/it] {'loss': 0.1605, 'grad_norm': 0.7510016927922919, 'learning_rate': 8.173529658941865e-06, 'epoch': 0.3} 30%|███ | 10376/34278 [11:28:31<23:53:03, 3.60s/it] 30%|███ | 10377/34278 [11:28:34<23:18:48, 3.51s/it] {'loss': 
0.1385, 'grad_norm': 0.7228815788623432, 'learning_rate': 8.173164569503393e-06, 'epoch': 0.3} 30%|███ | 10377/34278 [11:28:34<23:18:48, 3.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 30%|███ | 10378/34278 [11:28:40<28:36:26, 4.31s/it] {'loss': 0.2099, 'grad_norm': 0.947579151274672, 'learning_rate': 8.172799451735666e-06, 'epoch': 0.3} 30%|███ | 10378/34278 [11:28:40<28:36:26, 4.31s/it] 30%|███ | 10379/34278 [11:28:46<30:48:46, 4.64s/it] {'loss': 0.1683, 'grad_norm': 0.9877209902427233, 'learning_rate': 8.17243430564194e-06, 'epoch': 0.3} 30%|███ | 10379/34278 [11:28:46<30:48:46, 4.64s/it] 30%|███ | 10380/34278 [11:28:49<27:11:29, 4.10s/it] {'loss': 0.1508, 'grad_norm': 0.8461806348773305, 'learning_rate': 8.172069131225481e-06, 'epoch': 0.3} 30%|███ | 10380/34278 [11:28:49<27:11:29, 4.10s/it] 30%|███ | 10381/34278 [11:28:55<31:05:05, 4.68s/it] {'loss': 0.1471, 'grad_norm': 1.0047050747683886, 'learning_rate': 8.171703928489548e-06, 'epoch': 0.3} 30%|███ | 10381/34278 [11:28:55<31:05:05, 4.68s/it] 30%|███ | 10382/34278 [11:29:00<33:21:19, 5.03s/it] {'loss': 0.1327, 'grad_norm': 0.7623453747136857, 'learning_rate': 8.171338697437394e-06, 'epoch': 0.3} 30%|███ | 10382/34278 [11:29:00<33:21:19, 5.03s/it] 30%|███ | 10383/34278 [11:29:04<30:23:26, 4.58s/it] {'loss': 0.1624, 'grad_norm': 0.7533753878572584, 'learning_rate': 8.170973438072289e-06, 'epoch': 0.3} 30%|███ | 10383/34278 [11:29:04<30:23:26, 4.58s/it] 30%|███ | 10384/34278 [11:29:07<27:07:50, 4.09s/it] {'loss': 0.1401, 'grad_norm': 0.8229887691219726, 'learning_rate': 8.170608150397489e-06, 'epoch': 0.3} 30%|███ | 10384/34278 [11:29:07<27:07:50, 4.09s/it] 30%|███ | 10385/34278 [11:29:11<27:47:20, 4.19s/it] {'loss': 0.1565, 'grad_norm': 0.6903285197247508, 'learning_rate': 8.170242834416256e-06, 'epoch': 0.3} 30%|███ | 10385/34278 
[11:29:11<27:47:20, 4.19s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 30%|███ | 10386/34278 [11:29:17<31:12:16, 4.70s/it] {'loss': 0.1414, 'grad_norm': 0.8827581146553132, 'learning_rate': 8.169877490131852e-06, 'epoch': 0.3} 30%|███ | 10386/34278 [11:29:17<31:12:16, 4.70s/it] 30%|███ | 10387/34278 [11:29:23<33:39:16, 5.07s/it] {'loss': 0.1255, 'grad_norm': 0.8311994245631472, 'learning_rate': 8.16951211754754e-06, 'epoch': 0.3} 30%|███ | 10387/34278 [11:29:23<33:39:16, 5.07s/it] 30%|███ | 10388/34278 [11:29:26<30:09:21, 4.54s/it] {'loss': 0.1647, 'grad_norm': 0.9045715324433118, 'learning_rate': 8.169146716666578e-06, 'epoch': 0.3} 30%|███ | 10388/34278 [11:29:26<30:09:21, 4.54s/it] 30%|███ | 10389/34278 [11:29:30<28:41:43, 4.32s/it] {'loss': 0.1418, 'grad_norm': 0.7350612393133354, 'learning_rate': 8.168781287492232e-06, 'epoch': 0.3} 30%|███ | 10389/34278 [11:29:30<28:41:43, 4.32s/it] 30%|███ | 10390/34278 [11:29:36<32:16:39, 4.86s/it] {'loss': 0.1561, 'grad_norm': 0.898199394934361, 'learning_rate': 8.168415830027762e-06, 'epoch': 0.3} 30%|███ | 10390/34278 [11:29:36<32:16:39, 4.86s/it] 30%|███ | 10391/34278 [11:29:40<29:12:27, 4.40s/it] {'loss': 0.1499, 'grad_norm': 0.8829793775558368, 'learning_rate': 8.168050344276434e-06, 'epoch': 0.3} 30%|███ | 10391/34278 [11:29:40<29:12:27, 4.40s/it] 30%|███ | 10392/34278 [11:29:43<26:36:14, 4.01s/it] {'loss': 0.1282, 'grad_norm': 0.8110120390463106, 'learning_rate': 8.167684830241506e-06, 'epoch': 0.3} 30%|███ | 10392/34278 [11:29:43<26:36:14, 4.01s/it] 30%|███ | 10393/34278 [11:29:46<24:34:11, 3.70s/it] {'loss': 0.1314, 'grad_norm': 0.7874125730460129, 'learning_rate': 8.167319287926247e-06, 'epoch': 0.3} 30%|███ | 10393/34278 [11:29:46<24:34:11, 3.70s/it] 30%|███ | 10394/34278 [11:29:49<23:41:29, 3.57s/it] {'loss': 0.1629, 'grad_norm': 
0.8854226381926515, 'learning_rate': 8.166953717333915e-06, 'epoch': 0.3} 30%|███ | 10394/34278 [11:29:49<23:41:29, 3.57s/it] 30%|███ | 10395/34278 [11:29:52<22:18:14, 3.36s/it] {'loss': 0.1611, 'grad_norm': 0.9192712835689084, 'learning_rate': 8.166588118467778e-06, 'epoch': 0.3} 30%|███ | 10395/34278 [11:29:52<22:18:14, 3.36s/it] 30%|███ | 10396/34278 [11:29:56<23:15:23, 3.51s/it] {'loss': 0.1592, 'grad_norm': 1.1209985968149403, 'learning_rate': 8.166222491331097e-06, 'epoch': 0.3} 30%|███ | 10396/34278 [11:29:56<23:15:23, 3.51s/it] 30%|███ | 10397/34278 [11:30:02<28:56:39, 4.36s/it] {'loss': 0.143, 'grad_norm': 0.8058659321648536, 'learning_rate': 8.165856835927138e-06, 'epoch': 0.3} 30%|███ | 10397/34278 [11:30:02<28:56:39, 4.36s/it] 30%|███ | 10398/34278 [11:30:06<28:39:23, 4.32s/it] {'loss': 0.1882, 'grad_norm': 0.8609479958453766, 'learning_rate': 8.165491152259163e-06, 'epoch': 0.3} 30%|███ | 10398/34278 [11:30:06<28:39:23, 4.32s/it] 30%|███ | 10399/34278 [11:30:09<26:09:58, 3.94s/it] {'loss': 0.1528, 'grad_norm': 0.71334878740112, 'learning_rate': 8.165125440330443e-06, 'epoch': 0.3} 30%|███ | 10399/34278 [11:30:09<26:09:58, 3.94s/it] 30%|███ | 10400/34278 [11:30:12<24:10:21, 3.64s/it] {'loss': 0.1399, 'grad_norm': 0.8731792207689436, 'learning_rate': 8.164759700144235e-06, 'epoch': 0.3} 30%|███ | 10400/34278 [11:30:12<24:10:21, 3.64s/it] 30%|███ | 10401/34278 [11:30:16<25:05:37, 3.78s/it] {'loss': 0.1398, 'grad_norm': 0.7985744200953458, 'learning_rate': 8.16439393170381e-06, 'epoch': 0.3} 30%|███ | 10401/34278 [11:30:16<25:05:37, 3.78s/it] 30%|███ | 10402/34278 [11:30:21<27:24:37, 4.13s/it] {'loss': 0.1645, 'grad_norm': 0.8038517488074439, 'learning_rate': 8.164028135012429e-06, 'epoch': 0.3} 30%|███ | 10402/34278 [11:30:21<27:24:37, 4.13s/it] 30%|███ | 10403/34278 [11:30:27<29:55:18, 4.51s/it] {'loss': 0.1447, 'grad_norm': 0.8762013050995305, 'learning_rate': 8.163662310073362e-06, 'epoch': 0.3} 30%|███ | 10403/34278 [11:30:27<29:55:18, 4.51s/it] 
30%|███ | 10404/34278 [11:30:30<28:02:20, 4.23s/it] {'loss': 0.1365, 'grad_norm': 0.8191903536601476, 'learning_rate': 8.163296456889873e-06, 'epoch': 0.3} 30%|███ | 10404/34278 [11:30:30<28:02:20, 4.23s/it] 30%|███ | 10405/34278 [11:30:34<25:57:04, 3.91s/it] {'loss': 0.1434, 'grad_norm': 0.7931625614152237, 'learning_rate': 8.162930575465228e-06, 'epoch': 0.3} 30%|███ | 10405/34278 [11:30:34<25:57:04, 3.91s/it] 30%|███ | 10406/34278 [11:30:36<23:48:29, 3.59s/it] {'loss': 0.1678, 'grad_norm': 0.742726212924904, 'learning_rate': 8.162564665802693e-06, 'epoch': 0.3} 30%|███ | 10406/34278 [11:30:36<23:48:29, 3.59s/it] 30%|███ | 10407/34278 [11:30:41<25:08:20, 3.79s/it] {'loss': 0.1287, 'grad_norm': 0.8429872191417145, 'learning_rate': 8.162198727905536e-06, 'epoch': 0.3} 30%|███ | 10407/34278 [11:30:41<25:08:20, 3.79s/it] 30%|███ | 10408/34278 [11:30:44<23:51:34, 3.60s/it] {'loss': 0.1505, 'grad_norm': 0.7898703474677157, 'learning_rate': 8.161832761777024e-06, 'epoch': 0.3} 30%|███ | 10408/34278 [11:30:44<23:51:34, 3.60s/it] 30%|███ | 10409/34278 [11:30:47<22:53:39, 3.45s/it] {'loss': 0.1725, 'grad_norm': 0.7891222648082646, 'learning_rate': 8.161466767420426e-06, 'epoch': 0.3} 30%|███ | 10409/34278 [11:30:47<22:53:39, 3.45s/it] 30%|███ | 10410/34278 [11:30:51<24:04:14, 3.63s/it] {'loss': 0.1568, 'grad_norm': 0.8572627779806057, 'learning_rate': 8.161100744839004e-06, 'epoch': 0.3} 30%|███ | 10410/34278 [11:30:51<24:04:14, 3.63s/it] 30%|███ | 10411/34278 [11:30:57<28:34:19, 4.31s/it] {'loss': 0.1539, 'grad_norm': 0.8907723320209793, 'learning_rate': 8.160734694036031e-06, 'epoch': 0.3} 30%|███ | 10411/34278 [11:30:57<28:34:19, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 30%|███ | 10412/34278 [11:31:00<26:22:11, 3.98s/it] {'loss': 0.1557, 'grad_norm': 0.8754776835040876, 'learning_rate': 8.160368615014771e-06, 'epoch': 0.3} 30%|███ | 10412/34278 [11:31:00<26:22:11, 3.98s/it] 30%|███ | 10413/34278 [11:31:03<24:17:41, 3.66s/it] {'loss': 0.1639, 'grad_norm': 0.7988711441501878, 'learning_rate': 8.160002507778497e-06, 'epoch': 0.3} 30%|███ | 10413/34278 [11:31:03<24:17:41, 3.66s/it] 30%|███ | 10414/34278 [11:31:06<23:38:44, 3.57s/it] {'loss': 0.167, 'grad_norm': 1.0929980824225913, 'learning_rate': 8.159636372330475e-06, 'epoch': 0.3} 30%|███ | 10414/34278 [11:31:06<23:38:44, 3.57s/it] 30%|███ | 10415/34278 [11:31:12<27:22:02, 4.13s/it] {'loss': 0.1495, 'grad_norm': 1.0097993171818347, 'learning_rate': 8.159270208673973e-06, 'epoch': 0.3} 30%|███ | 10415/34278 [11:31:12<27:22:02, 4.13s/it] 30%|███ | 10416/34278 [11:31:15<25:25:42, 3.84s/it] {'loss': 0.1539, 'grad_norm': 0.802350305488741, 'learning_rate': 8.15890401681226e-06, 'epoch': 0.3} 30%|███ | 10416/34278 [11:31:15<25:25:42, 3.84s/it] 30%|███ | 10417/34278 [11:31:18<23:39:31, 3.57s/it] {'loss': 0.1612, 'grad_norm': 0.8516468241319436, 'learning_rate': 8.158537796748607e-06, 'epoch': 0.3} 30%|███ | 10417/34278 [11:31:18<23:39:31, 3.57s/it] 30%|███ | 10418/34278 [11:31:21<22:34:33, 3.41s/it] {'loss': 0.1745, 'grad_norm': 0.8920408566504613, 'learning_rate': 8.158171548486281e-06, 'epoch': 0.3} 30%|███ | 10418/34278 [11:31:21<22:34:33, 3.41s/it] 30%|███ | 10419/34278 [11:31:24<21:26:49, 3.24s/it] {'loss': 0.1687, 'grad_norm': 0.7488362337002084, 'learning_rate': 8.157805272028557e-06, 'epoch': 0.3} 30%|███ | 10419/34278 [11:31:24<21:26:49, 3.24s/it] 30%|███ | 10420/34278 [11:31:27<20:54:06, 3.15s/it] {'loss': 0.1696, 'grad_norm': 0.6807176760151589, 'learning_rate': 8.157438967378697e-06, 'epoch': 0.3} 30%|███ | 10420/34278 [11:31:27<20:54:06, 3.15s/it] 30%|███ | 10421/34278 [11:31:30<22:00:55, 3.32s/it] {'loss': 0.1498, 'grad_norm': 
0.8600650742579082, 'learning_rate': 8.157072634539977e-06, 'epoch': 0.3} 30%|███ | 10421/34278 [11:31:30<22:00:55, 3.32s/it] 30%|███ | 10422/34278 [11:31:34<22:21:29, 3.37s/it] {'loss': 0.1672, 'grad_norm': 0.7670456470979753, 'learning_rate': 8.156706273515667e-06, 'epoch': 0.3} 30%|███ | 10422/34278 [11:31:34<22:21:29, 3.37s/it] 30%|███ | 10423/34278 [11:31:40<28:07:43, 4.24s/it] {'loss': 0.1683, 'grad_norm': 0.7617168129050363, 'learning_rate': 8.156339884309038e-06, 'epoch': 0.3} 30%|███ | 10423/34278 [11:31:40<28:07:43, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 30%|███ | 10424/34278 [11:31:43<26:11:18, 3.95s/it] {'loss': 0.1537, 'grad_norm': 1.0888207429500198, 'learning_rate': 8.155973466923359e-06, 'epoch': 0.3} 30%|███ | 10424/34278 [11:31:43<26:11:18, 3.95s/it] 30%|███ | 10425/34278 [11:31:47<24:51:15, 3.75s/it] {'loss': 0.1798, 'grad_norm': 0.9988337315375342, 'learning_rate': 8.155607021361903e-06, 'epoch': 0.3} 30%|███ | 10425/34278 [11:31:47<24:51:15, 3.75s/it] 30%|███ | 10426/34278 [11:31:52<27:09:10, 4.10s/it] {'loss': 0.1542, 'grad_norm': 0.7287563782009567, 'learning_rate': 8.155240547627938e-06, 'epoch': 0.3} 30%|███ | 10426/34278 [11:31:52<27:09:10, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 30%|███ | 10427/34278 [11:31:55<26:09:14, 3.95s/it] {'loss': 0.1574, 'grad_norm': 0.755339563244242, 'learning_rate': 8.15487404572474e-06, 'epoch': 0.3} 30%|███ | 10427/34278 [11:31:55<26:09:14, 3.95s/it] 30%|███ | 10428/34278 [11:31:59<25:11:22, 3.80s/it] {'loss': 0.1942, 'grad_norm': 0.8623245463594157, 'learning_rate': 8.154507515655581e-06, 'epoch': 0.3} 30%|███ | 10428/34278 [11:31:59<25:11:22, 3.80s/it] 30%|███ | 10429/34278 [11:32:02<24:11:05, 3.65s/it] {'loss': 0.1322, 'grad_norm': 0.7249600682792448, 'learning_rate': 8.15414095742373e-06, 'epoch': 0.3} 30%|███ | 10429/34278 [11:32:02<24:11:05, 3.65s/it] 30%|███ | 10430/34278 [11:32:05<22:50:26, 3.45s/it] {'loss': 0.1464, 'grad_norm': 0.7004999686107767, 'learning_rate': 8.153774371032464e-06, 'epoch': 0.3} 30%|███ | 10430/34278 [11:32:05<22:50:26, 3.45s/it] 30%|███ | 10431/34278 [11:32:08<22:00:00, 3.32s/it] {'loss': 0.161, 'grad_norm': 0.8112538713363775, 'learning_rate': 8.15340775648505e-06, 'epoch': 0.3} 30%|███ | 10431/34278 [11:32:08<22:00:00, 3.32s/it] 30%|███ | 10432/34278 [11:32:11<21:44:51, 3.28s/it] {'loss': 0.1521, 'grad_norm': 0.8591419625633835, 'learning_rate': 8.153041113784767e-06, 'epoch': 0.3} 30%|███ | 10432/34278 [11:32:11<21:44:51, 3.28s/it] 30%|███ | 10433/34278 [11:32:17<27:10:48, 4.10s/it] {'loss': 0.155, 'grad_norm': 0.8700572414342698, 'learning_rate': 8.152674442934885e-06, 'epoch': 0.3} 30%|███ | 10433/34278 [11:32:17<27:10:48, 4.10s/it] 30%|███ | 10434/34278 [11:32:20<25:09:04, 3.80s/it] {'loss': 0.1467, 'grad_norm': 0.7908035965519012, 'learning_rate': 8.152307743938677e-06, 'epoch': 0.3} 30%|███ | 10434/34278 [11:32:20<25:09:04, 3.80s/it] 30%|███ | 10435/34278 [11:32:26<29:25:37, 4.44s/it] {'loss': 0.173, 'grad_norm': 0.8839608715563377, 'learning_rate': 8.151941016799419e-06, 'epoch': 0.3} 30%|███ | 10435/34278 [11:32:26<29:25:37, 4.44s/it] 30%|███ | 10436/34278 [11:32:30<28:34:11, 4.31s/it] {'loss': 0.1415, 'grad_norm': 
0.7360087090301753, 'learning_rate': 8.151574261520383e-06, 'epoch': 0.3} 30%|███ | 10436/34278 [11:32:30<28:34:11, 4.31s/it] 30%|███ | 10437/34278 [11:32:33<26:03:08, 3.93s/it] {'loss': 0.1459, 'grad_norm': 0.6797613757354378, 'learning_rate': 8.151207478104845e-06, 'epoch': 0.3} 30%|███ | 10437/34278 [11:32:33<26:03:08, 3.93s/it] 30%|███ | 10438/34278 [11:32:39<29:48:35, 4.50s/it] {'loss': 0.1582, 'grad_norm': 0.9030865435243874, 'learning_rate': 8.15084066655608e-06, 'epoch': 0.3} 30%|███ | 10438/34278 [11:32:39<29:48:35, 4.50s/it] 30%|███ | 10439/34278 [11:32:42<26:30:15, 4.00s/it] {'loss': 0.1622, 'grad_norm': 0.6909089962920174, 'learning_rate': 8.150473826877362e-06, 'epoch': 0.3} 30%|███ | 10439/34278 [11:32:42<26:30:15, 4.00s/it] 30%|███ | 10440/34278 [11:32:45<24:50:51, 3.75s/it] {'loss': 0.1495, 'grad_norm': 0.6740257320563846, 'learning_rate': 8.150106959071964e-06, 'epoch': 0.3} 30%|███ | 10440/34278 [11:32:45<24:50:51, 3.75s/it] 30%|███ | 10441/34278 [11:32:49<25:08:10, 3.80s/it] {'loss': 0.1684, 'grad_norm': 0.9058561789803754, 'learning_rate': 8.149740063143164e-06, 'epoch': 0.3} 30%|███ | 10441/34278 [11:32:49<25:08:10, 3.80s/it] 30%|███ | 10442/34278 [11:32:52<23:34:29, 3.56s/it] {'loss': 0.1491, 'grad_norm': 0.7488456272778348, 'learning_rate': 8.149373139094234e-06, 'epoch': 0.3} 30%|███ | 10442/34278 [11:32:52<23:34:29, 3.56s/it] 30%|███ | 10443/34278 [11:32:58<27:42:01, 4.18s/it] {'loss': 0.1731, 'grad_norm': 0.8715300170098201, 'learning_rate': 8.149006186928456e-06, 'epoch': 0.3} 30%|███ | 10443/34278 [11:32:58<27:42:01, 4.18s/it] 30%|███ | 10444/34278 [11:33:01<26:06:24, 3.94s/it] {'loss': 0.1624, 'grad_norm': 0.6919558722174817, 'learning_rate': 8.148639206649102e-06, 'epoch': 0.3} 30%|███ | 10444/34278 [11:33:01<26:06:24, 3.94s/it] 30%|███ | 10445/34278 [11:33:04<24:51:33, 3.76s/it] {'loss': 0.1586, 'grad_norm': 0.7923866396440833, 'learning_rate': 8.148272198259447e-06, 'epoch': 0.3} 30%|███ | 10445/34278 [11:33:04<24:51:33, 3.76s/it] 
30%|███ | 10446/34278 [11:33:08<24:31:56, 3.71s/it] {'loss': 0.126, 'grad_norm': 0.8144296132624399, 'learning_rate': 8.14790516176277e-06, 'epoch': 0.3} 30%|███ | 10446/34278 [11:33:08<24:31:56, 3.71s/it] 30%|███ | 10447/34278 [11:33:11<24:07:27, 3.64s/it] {'loss': 0.1407, 'grad_norm': 0.7364901169217135, 'learning_rate': 8.147538097162348e-06, 'epoch': 0.3} 30%|███ | 10447/34278 [11:33:11<24:07:27, 3.64s/it] 30%|███ | 10448/34278 [11:33:15<23:53:20, 3.61s/it] {'loss': 0.157, 'grad_norm': 0.9646854588622465, 'learning_rate': 8.147171004461456e-06, 'epoch': 0.3} 30%|███ | 10448/34278 [11:33:15<23:53:20, 3.61s/it] 30%|███ | 10449/34278 [11:33:18<22:44:03, 3.43s/it] {'loss': 0.1316, 'grad_norm': 0.7595725187358502, 'learning_rate': 8.146803883663374e-06, 'epoch': 0.3} 30%|███ | 10449/34278 [11:33:18<22:44:03, 3.43s/it] 30%|███ | 10450/34278 [11:33:21<22:05:32, 3.34s/it] {'loss': 0.1509, 'grad_norm': 0.8946622909608126, 'learning_rate': 8.146436734771377e-06, 'epoch': 0.3} 30%|███ | 10450/34278 [11:33:21<22:05:32, 3.34s/it] 30%|███ | 10451/34278 [11:33:27<27:46:56, 4.20s/it] {'loss': 0.1618, 'grad_norm': 1.0007198457280269, 'learning_rate': 8.146069557788745e-06, 'epoch': 0.3} 30%|███ | 10451/34278 [11:33:27<27:46:56, 4.20s/it] 30%|███ | 10452/34278 [11:33:30<24:59:22, 3.78s/it] {'loss': 0.1501, 'grad_norm': 0.9129991902060763, 'learning_rate': 8.145702352718754e-06, 'epoch': 0.3} 30%|███ | 10452/34278 [11:33:30<24:59:22, 3.78s/it] 30%|███ | 10453/34278 [11:33:33<23:53:19, 3.61s/it] {'loss': 0.1555, 'grad_norm': 0.8097005749941277, 'learning_rate': 8.145335119564683e-06, 'epoch': 0.3} 30%|███ | 10453/34278 [11:33:33<23:53:19, 3.61s/it] 30%|███ | 10454/34278 [11:33:37<24:11:50, 3.66s/it] {'loss': 0.1869, 'grad_norm': 0.9170231266424669, 'learning_rate': 8.144967858329813e-06, 'epoch': 0.3} 30%|███ | 10454/34278 [11:33:37<24:11:50, 3.66s/it] 31%|███ | 10455/34278 [11:33:40<23:23:21, 3.53s/it] {'loss': 0.1622, 'grad_norm': 0.9509365711142003, 'learning_rate': 
8.14460056901742e-06, 'epoch': 0.31} 31%|███ | 10455/34278 [11:33:40<23:23:21, 3.53s/it]
31%|███ | 10456/34278 [11:33:44<22:39:08, 3.42s/it] {'loss': 0.1453, 'grad_norm': 0.9542346725135301, 'learning_rate': 8.144233251630782e-06, 'epoch': 0.31} 31%|███ | 10456/34278 [11:33:44<22:39:08, 3.42s/it]
31%|███ | 10457/34278 [11:33:47<22:03:51, 3.33s/it] {'loss': 0.1483, 'grad_norm': 0.7056139469173172, 'learning_rate': 8.14386590617318e-06, 'epoch': 0.31} 31%|███ | 10457/34278 [11:33:47<22:03:51, 3.33s/it]
31%|███ | 10458/34278 [11:33:53<27:25:40, 4.15s/it] {'loss': 0.1475, 'grad_norm': 0.8549857988433522, 'learning_rate': 8.143498532647897e-06, 'epoch': 0.31} 31%|███ | 10458/34278 [11:33:53<27:25:40, 4.15s/it]
31%|███ | 10459/34278 [11:33:56<25:25:06, 3.84s/it] {'loss': 0.16, 'grad_norm': 1.0269668522883968, 'learning_rate': 8.143131131058208e-06, 'epoch': 0.31} 31%|███ | 10459/34278 [11:33:56<25:25:06, 3.84s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa5ba938ea0>
Failed to fetch sample 3721455. Exception: cannot identify image file <_io.BytesIO object at 0x7fa5ba938ea0>
31%|███ | 10460/34278 [11:33:59<23:51:05, 3.61s/it] {'loss': 0.1167, 'grad_norm': 0.7689087023102948, 'learning_rate': 8.142763701407392e-06, 'epoch': 0.31} 31%|███ | 10460/34278 [11:33:59<23:51:05, 3.61s/it]
31%|███ | 10461/34278 [11:34:03<24:40:03, 3.73s/it] {'loss': 0.1608, 'grad_norm': 1.0460279979785334, 'learning_rate': 8.142396243698735e-06, 'epoch': 0.31} 31%|███ | 10461/34278 [11:34:03<24:40:03, 3.73s/it]
31%|███ | 10462/34278 [11:34:07<24:46:53, 3.75s/it] {'loss': 0.1536, 'grad_norm': 0.9350770804843989, 'learning_rate': 8.142028757935512e-06, 'epoch': 0.31} 31%|███ | 10462/34278 [11:34:07<24:46:53, 3.75s/it]
31%|███ | 10463/34278 [11:34:11<25:03:50, 3.79s/it] {'loss': 0.1754, 'grad_norm': 1.0405166274056603, 'learning_rate': 8.141661244121008e-06, 'epoch': 0.31} 31%|███ | 10463/34278 [11:34:11<25:03:50, 3.79s/it]
31%|███ | 10464/34278 [11:34:14<24:22:27, 3.68s/it] {'loss': 0.1338, 'grad_norm': 0.6683346329780173, 'learning_rate': 8.141293702258503e-06, 'epoch': 0.31} 31%|███ | 10464/34278 [11:34:14<24:22:27, 3.68s/it]
31%|███ | 10465/34278 [11:34:20<29:49:35, 4.51s/it] {'loss': 0.1357, 'grad_norm': 0.6545227732087646, 'learning_rate': 8.140926132351276e-06, 'epoch': 0.31} 31%|███ | 10465/34278 [11:34:20<29:49:35, 4.51s/it]
31%|███ | 10466/34278 [11:34:24<27:47:17, 4.20s/it] {'loss': 0.1387, 'grad_norm': 0.8560348701733025, 'learning_rate': 8.140558534402612e-06, 'epoch': 0.31} 31%|███ | 10466/34278 [11:34:24<27:47:17, 4.20s/it]
31%|███ | 10467/34278 [11:34:27<26:04:17, 3.94s/it] {'loss': 0.1431, 'grad_norm': 0.6970725379754402, 'learning_rate': 8.14019090841579e-06, 'epoch': 0.31} 31%|███ |
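The `UnidentifiedImageError` above is logged as "Failed to fetch sample 3721455" and training continues, which suggests the dataset swallows the exception and serves a different sample instead. A hedged sketch of that pattern — class and parameter names are hypothetical, not the actual aguvis `dataset.py` code:

```python
# Hypothetical sketch (not the real dataset implementation): on a corrupt
# sample, log the failure in the same style as the training log and fall
# back deterministically to the next index so the DataLoader never stalls.

class FallbackDataset:
    def __init__(self, samples, load_fn, max_retries=10):
        self.samples = samples          # raw sample records
        self.load_fn = load_fn          # decodes one record; may raise
        self.max_retries = max_retries  # bound on consecutive bad samples

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.load_fn(self.samples[i])
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)  # try a neighbouring sample
        raise RuntimeError("too many consecutive corrupt samples")
```

The trade-off of such a fallback is silent distribution skew: corrupt samples are replaced by their neighbours, so persistent data corruption only shows up in the logs, not in the loss.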
10467/34278 [11:34:27<26:04:17, 3.94s/it] 31%|███ | 10468/34278 [11:34:31<24:41:59, 3.73s/it] {'loss': 0.1352, 'grad_norm': 0.7710512787846989, 'learning_rate': 8.139823254394093e-06, 'epoch': 0.31} 31%|███ | 10468/34278 [11:34:31<24:41:59, 3.73s/it] 31%|███ | 10469/34278 [11:34:34<23:17:24, 3.52s/it] {'loss': 0.1635, 'grad_norm': 0.8962857019792468, 'learning_rate': 8.139455572340805e-06, 'epoch': 0.31} 31%|███ | 10469/34278 [11:34:34<23:17:24, 3.52s/it] 31%|███ | 10470/34278 [11:34:38<25:20:39, 3.83s/it] {'loss': 0.1646, 'grad_norm': 0.905799204557261, 'learning_rate': 8.139087862259207e-06, 'epoch': 0.31} 31%|███ | 10470/34278 [11:34:38<25:20:39, 3.83s/it] 31%|███ | 10471/34278 [11:34:41<23:46:01, 3.59s/it] {'loss': 0.1542, 'grad_norm': 0.8407657457303579, 'learning_rate': 8.138720124152579e-06, 'epoch': 0.31} 31%|███ | 10471/34278 [11:34:41<23:46:01, 3.59s/it] 31%|███ | 10472/34278 [11:34:47<29:07:44, 4.40s/it] {'loss': 0.1626, 'grad_norm': 0.8183441760590756, 'learning_rate': 8.13835235802421e-06, 'epoch': 0.31} 31%|███ | 10472/34278 [11:34:47<29:07:44, 4.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 31%|███ | 10473/34278 [11:34:50<25:34:19, 3.87s/it] {'loss': 0.1529, 'grad_norm': 0.763458797159557, 'learning_rate': 8.137984563877379e-06, 'epoch': 0.31} 31%|███ | 10473/34278 [11:34:50<25:34:19, 3.87s/it] 31%|███ | 10474/34278 [11:34:56<30:04:17, 4.55s/it] {'loss': 0.1529, 'grad_norm': 0.937601898474738, 'learning_rate': 8.137616741715371e-06, 'epoch': 0.31} 31%|███ | 10474/34278 [11:34:56<30:04:17, 4.55s/it] 31%|███ | 10475/34278 [11:35:00<29:11:22, 4.41s/it] {'loss': 0.1453, 'grad_norm': 0.9490012727975684, 'learning_rate': 8.137248891541471e-06, 'epoch': 0.31} 31%|███ | 10475/34278 [11:35:00<29:11:22, 4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 31%|███ | 10476/34278 [11:35:06<32:15:23, 4.88s/it] {'loss': 0.1378, 'grad_norm': 0.8531084592155752, 'learning_rate': 8.136881013358961e-06, 'epoch': 0.31} 31%|███ | 10476/34278 [11:35:06<32:15:23, 4.88s/it] 31%|███ | 10477/34278 [11:35:09<28:33:30, 4.32s/it] {'loss': 0.1579, 'grad_norm': 0.8723797164563764, 'learning_rate': 8.136513107171125e-06, 'epoch': 0.31} 31%|███ | 10477/34278 [11:35:09<28:33:30, 4.32s/it] 31%|███ | 10478/34278 [11:35:15<31:35:21, 4.78s/it] {'loss': 0.161, 'grad_norm': 0.8086968325029635, 'learning_rate': 8.13614517298125e-06, 'epoch': 0.31} 31%|███ | 10478/34278 [11:35:15<31:35:21, 4.78s/it] 31%|███ | 10479/34278 [11:35:18<28:01:42, 4.24s/it] {'loss': 0.146, 'grad_norm': 0.8715856525622632, 'learning_rate': 8.13577721079262e-06, 'epoch': 0.31} 31%|███ | 10479/34278 [11:35:18<28:01:42, 4.24s/it] 31%|███ | 10480/34278 [11:35:22<27:28:25, 4.16s/it] {'loss': 0.1557, 'grad_norm': 0.7591362339179039, 'learning_rate': 8.13540922060852e-06, 'epoch': 0.31} 31%|███ | 10480/34278 [11:35:22<27:28:25, 4.16s/it] 31%|███ | 10481/34278 [11:35:26<26:14:25, 3.97s/it] {'loss': 0.127, 
'grad_norm': 0.8213478670060186, 'learning_rate': 8.135041202432233e-06, 'epoch': 0.31} 31%|███ | 10481/34278 [11:35:26<26:14:25, 3.97s/it] 31%|███ | 10482/34278 [11:35:30<27:10:55, 4.11s/it] {'loss': 0.1474, 'grad_norm': 0.8292046129731733, 'learning_rate': 8.134673156267048e-06, 'epoch': 0.31} 31%|███ | 10482/34278 [11:35:30<27:10:55, 4.11s/it] 31%|███ | 10483/34278 [11:35:36<31:39:25, 4.79s/it] {'loss': 0.1731, 'grad_norm': 0.7640714590123188, 'learning_rate': 8.134305082116247e-06, 'epoch': 0.31} 31%|███ | 10483/34278 [11:35:36<31:39:25, 4.79s/it] 31%|███ | 10484/34278 [11:35:40<28:55:15, 4.38s/it] {'loss': 0.1592, 'grad_norm': 0.7413570950614466, 'learning_rate': 8.133936979983122e-06, 'epoch': 0.31} 31%|███ | 10484/34278 [11:35:40<28:55:15, 4.38s/it] 31%|███ | 10485/34278 [11:35:43<27:14:06, 4.12s/it] {'loss': 0.1588, 'grad_norm': 0.9120628833778192, 'learning_rate': 8.133568849870953e-06, 'epoch': 0.31} 31%|███ | 10485/34278 [11:35:43<27:14:06, 4.12s/it] 31%|███ | 10486/34278 [11:35:48<28:25:35, 4.30s/it] {'loss': 0.1248, 'grad_norm': 0.7695217672867839, 'learning_rate': 8.13320069178303e-06, 'epoch': 0.31} 31%|███ | 10486/34278 [11:35:48<28:25:35, 4.30s/it] 31%|███ | 10487/34278 [11:35:51<25:48:10, 3.90s/it] {'loss': 0.1553, 'grad_norm': 0.999547933643516, 'learning_rate': 8.13283250572264e-06, 'epoch': 0.31} 31%|███ | 10487/34278 [11:35:51<25:48:10, 3.90s/it] 31%|███ | 10488/34278 [11:35:56<28:36:54, 4.33s/it] {'loss': 0.1727, 'grad_norm': 0.8916076244179395, 'learning_rate': 8.132464291693068e-06, 'epoch': 0.31} 31%|███ | 10488/34278 [11:35:56<28:36:54, 4.33s/it] 31%|███ | 10489/34278 [11:35:59<25:23:10, 3.84s/it] {'loss': 0.1527, 'grad_norm': 0.6844763225824567, 'learning_rate': 8.132096049697604e-06, 'epoch': 0.31} 31%|███ | 10489/34278 [11:35:59<25:23:10, 3.84s/it] 31%|███ | 10490/34278 [11:36:02<23:44:36, 3.59s/it] {'loss': 0.159, 'grad_norm': 0.9559178165337375, 'learning_rate': 8.131727779739533e-06, 'epoch': 0.31} 31%|███ | 10490/34278 
[11:36:02<23:44:36, 3.59s/it] 31%|███ | 10491/34278 [11:36:05<22:28:22, 3.40s/it] {'loss': 0.181, 'grad_norm': 0.9284463156598932, 'learning_rate': 8.131359481822145e-06, 'epoch': 0.31} 31%|███ | 10491/34278 [11:36:05<22:28:22, 3.40s/it] 31%|███ | 10492/34278 [11:36:08<21:13:17, 3.21s/it] {'loss': 0.1523, 'grad_norm': 0.781265055697093, 'learning_rate': 8.130991155948726e-06, 'epoch': 0.31} 31%|███ | 10492/34278 [11:36:08<21:13:17, 3.21s/it] 31%|███ | 10493/34278 [11:36:14<26:32:52, 4.02s/it] {'loss': 0.13, 'grad_norm': 0.7559726874149129, 'learning_rate': 8.130622802122566e-06, 'epoch': 0.31} 31%|███ | 10493/34278 [11:36:14<26:32:52, 4.02s/it] 31%|███ | 10494/34278 [11:36:18<26:32:17, 4.02s/it] {'loss': 0.1477, 'grad_norm': 0.9472351681259725, 'learning_rate': 8.130254420346954e-06, 'epoch': 0.31} 31%|███ | 10494/34278 [11:36:18<26:32:17, 4.02s/it] 31%|███ | 10495/34278 [11:36:21<25:56:30, 3.93s/it] {'loss': 0.1725, 'grad_norm': 0.8739245150334625, 'learning_rate': 8.129886010625176e-06, 'epoch': 0.31} 31%|███ | 10495/34278 [11:36:21<25:56:30, 3.93s/it] 31%|███ | 10496/34278 [11:36:26<27:56:37, 4.23s/it] {'loss': 0.1354, 'grad_norm': 0.9484586419264308, 'learning_rate': 8.129517572960523e-06, 'epoch': 0.31} 31%|███ | 10496/34278 [11:36:26<27:56:37, 4.23s/it] 31%|███ | 10497/34278 [11:36:31<28:08:55, 4.26s/it] {'loss': 0.1495, 'grad_norm': 1.0090227898368496, 'learning_rate': 8.129149107356285e-06, 'epoch': 0.31} 31%|███ | 10497/34278 [11:36:31<28:08:55, 4.26s/it] 31%|███ | 10498/34278 [11:36:37<31:47:53, 4.81s/it] {'loss': 0.1388, 'grad_norm': 0.9908528918921414, 'learning_rate': 8.12878061381575e-06, 'epoch': 0.31} 31%|███ | 10498/34278 [11:36:37<31:47:53, 4.81s/it] 31%|███ | 10499/34278 [11:36:40<27:54:07, 4.22s/it] {'loss': 0.1626, 'grad_norm': 1.005759873694505, 'learning_rate': 8.12841209234221e-06, 'epoch': 0.31} 31%|███ | 10499/34278 [11:36:40<27:54:07, 4.22s/it] 31%|███ | 10500/34278 [11:36:43<25:10:11, 3.81s/it] {'loss': 0.155, 'grad_norm': 
0.977183583589827, 'learning_rate': 8.128043542938953e-06, 'epoch': 0.31} 31%|███ | 10500/34278 [11:36:43<25:10:11, 3.81s/it]
31%|███ | 10501/34278 [11:36:46<23:54:41, 3.62s/it] {'loss': 0.1639, 'grad_norm': 0.9762998814957362, 'learning_rate': 8.12767496560927e-06, 'epoch': 0.31}
31%|███ | 10502/34278 [11:36:52<29:50:42, 4.52s/it] {'loss': 0.1751, 'grad_norm': 0.8543228431308588, 'learning_rate': 8.127306360356451e-06, 'epoch': 0.31}
31%|███ | 10503/34278 [11:36:55<26:39:26, 4.04s/it] {'loss': 0.1697, 'grad_norm': 0.9352274648423903, 'learning_rate': 8.126937727183789e-06, 'epoch': 0.31}
31%|███ | 10504/34278 [11:36:58<24:52:13, 3.77s/it] {'loss': 0.1798, 'grad_norm': 1.011660987303486, 'learning_rate': 8.12656906609457e-06, 'epoch': 0.31}
31%|███ | 10505/34278 [11:37:02<24:10:12, 3.66s/it] {'loss': 0.1365, 'grad_norm': 0.8967067042704103, 'learning_rate': 8.12620037709209e-06, 'epoch': 0.31}
31%|███ | 10506/34278 [11:37:05<22:21:52, 3.39s/it] {'loss': 0.171, 'grad_norm': 1.0385156300922904, 'learning_rate': 8.125831660179642e-06, 'epoch': 0.31}
31%|███ | 10507/34278 [11:37:08<22:14:08, 3.37s/it] {'loss': 0.18, 'grad_norm': 1.5670291613771283, 'learning_rate': 8.125462915360511e-06, 'epoch': 0.31}
31%|███ | 10508/34278 [11:37:14<27:25:03, 4.15s/it] {'loss': 0.163, 'grad_norm': 0.9472429438047364, 'learning_rate': 8.125094142637997e-06, 'epoch': 0.31}
31%|███ | 10509/34278 [11:37:17<25:04:04, 3.80s/it] {'loss': 0.1461, 'grad_norm': 0.764220114055355, 'learning_rate': 8.124725342015387e-06, 'epoch': 0.31}
31%|███ | 10510/34278 [11:37:23<29:16:04, 4.43s/it] {'loss': 0.1493, 'grad_norm': 0.9179438054200285, 'learning_rate': 8.124356513495975e-06, 'epoch': 0.31}
31%|███ | 10511/34278 [11:37:26<26:34:53, 4.03s/it] {'loss': 0.165, 'grad_norm': 0.8480395922489338, 'learning_rate': 8.123987657083054e-06, 'epoch': 0.31}
31%|███ | 10512/34278 [11:37:28<23:51:50, 3.61s/it] {'loss': 0.1409, 'grad_norm': 0.6859944965232568, 'learning_rate': 8.123618772779917e-06, 'epoch': 0.31}
31%|███ | 10513/34278 [11:37:32<23:33:16, 3.57s/it] {'loss': 0.1642, 'grad_norm': 0.7930331392558869, 'learning_rate': 8.123249860589856e-06, 'epoch': 0.31}
31%|███ | 10514/34278 [11:37:36<23:58:32, 3.63s/it] {'loss': 0.1315, 'grad_norm': 0.8710376750806804, 'learning_rate': 8.122880920516167e-06, 'epoch': 0.31}
31%|███ | 10515/34278 [11:37:39<22:32:58, 3.42s/it] {'loss': 0.1536, 'grad_norm': 0.8011444341675579, 'learning_rate': 8.122511952562143e-06, 'epoch': 0.31}
31%|███ | 10516/34278 [11:37:42<22:05:08, 3.35s/it] {'loss': 0.1644, 'grad_norm': 1.2824621487969918, 'learning_rate': 8.122142956731078e-06, 'epoch': 0.31}
31%|███ | 10517/34278 [11:37:46<23:21:41, 3.54s/it] {'loss': 0.1442, 'grad_norm': 0.9464360122593795, 'learning_rate': 8.121773933026265e-06, 'epoch': 0.31}
31%|███ | 10518/34278 [11:37:49<22:58:13, 3.48s/it] {'loss': 0.1433, 'grad_norm': 0.9458128911218686, 'learning_rate': 8.121404881451e-06, 'epoch': 0.31}
31%|███ | 10519/34278 [11:37:52<22:14:56, 3.37s/it] {'loss': 0.1413, 'grad_norm': 0.9214215518317815, 'learning_rate': 8.121035802008577e-06, 'epoch': 0.31}
31%|███ | 10520/34278 [11:37:56<22:17:51, 3.38s/it] {'loss': 0.1555, 'grad_norm': 1.0435395184292795, 'learning_rate': 8.120666694702292e-06, 'epoch': 0.31}
31%|███ | 10521/34278 [11:37:59<22:24:58, 3.40s/it] {'loss': 0.1479, 'grad_norm': 0.9804105724680121, 'learning_rate': 8.12029755953544e-06, 'epoch': 0.31}
31%|███ | 10522/34278 [11:38:03<22:37:23, 3.43s/it] {'loss': 0.1397, 'grad_norm': 0.8337310791436562, 'learning_rate': 8.119928396511315e-06, 'epoch': 0.31}
31%|███ | 10523/34278 [11:38:06<22:29:18, 3.41s/it] {'loss': 0.1714, 'grad_norm': 0.8634544296137432, 'learning_rate': 8.119559205633213e-06, 'epoch': 0.31}
31%|███ | 10524/34278 [11:38:10<22:52:17, 3.47s/it] {'loss': 0.1663, 'grad_norm': 1.511782194331731, 'learning_rate': 8.119189986904435e-06, 'epoch': 0.31}
31%|███ | 10525/34278 [11:38:13<23:23:38, 3.55s/it] {'loss': 0.1587, 'grad_norm': 0.9658302254436144, 'learning_rate': 8.11882074032827e-06, 'epoch': 0.31}
31%|███ | 10526/34278 [11:38:19<27:32:26, 4.17s/it] {'loss': 0.155, 'grad_norm': 0.8059627139149879, 'learning_rate': 8.11845146590802e-06, 'epoch': 0.31}
31%|███ | 10527/34278 [11:38:22<25:26:08, 3.86s/it] {'loss': 0.1771, 'grad_norm': 0.7921105237179331, 'learning_rate': 8.118082163646979e-06, 'epoch': 0.31}
31%|███ | 10528/34278 [11:38:26<25:27:47, 3.86s/it] {'loss': 0.1532, 'grad_norm': 0.8326608499445625, 'learning_rate': 8.117712833548443e-06, 'epoch': 0.31}
31%|███ | 10529/34278 [11:38:29<24:10:16, 3.66s/it] {'loss': 0.179, 'grad_norm': 0.9911650912599619, 'learning_rate': 8.117343475615714e-06, 'epoch': 0.31}
31%|███ | 10530/34278 [11:38:33<24:05:50, 3.65s/it] {'loss': 0.1483, 'grad_norm': 0.7040185556638828, 'learning_rate': 8.116974089852085e-06, 'epoch': 0.31}
31%|███ | 10531/34278 [11:38:37<25:31:20, 3.87s/it] {'loss': 0.1892, 'grad_norm': 1.1280098054724013, 'learning_rate': 8.116604676260855e-06, 'epoch': 0.31}
31%|███ | 10532/34278 [11:38:41<24:53:19, 3.77s/it] {'loss': 0.1301, 'grad_norm': 0.6794934403780349, 'learning_rate': 8.116235234845324e-06, 'epoch': 0.31}
31%|███ | 10533/34278 [11:38:46<28:04:10, 4.26s/it] {'loss': 0.1644, 'grad_norm': 0.7077044569622198, 'learning_rate': 8.115865765608789e-06, 'epoch': 0.31}
31%|███ | 10534/34278 [11:38:50<26:41:40, 4.05s/it] {'loss': 0.1698, 'grad_norm': 0.7466181617760245, 'learning_rate': 8.115496268554545e-06, 'epoch': 0.31}
31%|███ | 10535/34278 [11:38:53<25:17:01, 3.83s/it] {'loss': 0.1364, 'grad_norm': 0.6420098664136437, 'learning_rate': 8.115126743685897e-06, 'epoch': 0.31}
31%|███ | 10536/34278 [11:38:58<28:03:41, 4.25s/it] {'loss': 0.1362, 'grad_norm': 0.8882542711176362, 'learning_rate': 8.114757191006141e-06, 'epoch': 0.31}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
31%|███ | 10537/34278 [11:39:01<26:15:40, 3.98s/it] {'loss': 0.1501, 'grad_norm': 1.1086496996804374, 'learning_rate': 8.114387610518574e-06, 'epoch': 0.31}
31%|███ | 10538/34278 [11:39:05<25:46:06, 3.91s/it] {'loss': 0.1461, 'grad_norm': 0.7485858463299588, 'learning_rate': 8.1140180022265e-06, 'epoch': 0.31}
31%|███ | 10539/34278 [11:39:08<24:07:29, 3.66s/it] {'loss': 0.1257, 'grad_norm': 1.022310161568353, 'learning_rate': 8.113648366133218e-06, 'epoch': 0.31}
31%|███ | 10540/34278 [11:39:13<25:16:38, 3.83s/it] {'loss': 0.1386, 'grad_norm': 0.8740042421782029, 'learning_rate': 8.113278702242025e-06, 'epoch': 0.31}
31%|███ | 10541/34278 [11:39:16<23:43:36, 3.60s/it] {'loss': 0.1383, 'grad_norm': 1.071707657078678, 'learning_rate': 8.112909010556222e-06, 'epoch': 0.31}
31%|███ | 10542/34278 [11:39:21<27:19:33, 4.14s/it] {'loss': 0.16, 'grad_norm': 1.1158674356285905, 'learning_rate': 8.11253929107911e-06, 'epoch': 0.31}
31%|███ | 10543/34278 [11:39:24<25:08:48, 3.81s/it] {'loss': 0.1356, 'grad_norm': 0.8078489878682861, 'learning_rate': 8.112169543813992e-06, 'epoch': 0.31}
31%|███ | 10544/34278 [11:39:27<23:28:17, 3.56s/it] {'loss': 0.1359, 'grad_norm': 0.8692041404439562, 'learning_rate': 8.111799768764169e-06, 'epoch': 0.31}
31%|███ | 10545/34278 [11:39:30<22:02:08, 3.34s/it] {'loss': 0.1697, 'grad_norm': 1.1023630639028916, 'learning_rate': 8.111429965932938e-06, 'epoch': 0.31}
31%|███ | 10546/34278 [11:39:34<22:49:38, 3.46s/it] {'loss': 0.1631, 'grad_norm': 0.8951416247908781, 'learning_rate': 8.111060135323601e-06, 'epoch': 0.31}
31%|███ | 10547/34278 [11:39:37<23:33:34, 3.57s/it] {'loss': 0.1515, 'grad_norm': 1.138681929210952, 'learning_rate': 8.110690276939466e-06, 'epoch': 0.31}
31%|███ | 10548/34278 [11:39:40<22:16:14, 3.38s/it] {'loss': 0.1498, 'grad_norm': 1.1975022017271477, 'learning_rate': 8.110320390783828e-06, 'epoch': 0.31}
31%|███ | 10549/34278 [11:39:44<22:13:46, 3.37s/it] {'loss': 0.1731, 'grad_norm': 0.780744692482144, 'learning_rate': 8.109950476859993e-06, 'epoch': 0.31}
31%|███ | 10550/34278 [11:39:47<22:01:46, 3.34s/it] {'loss': 0.137, 'grad_norm': 0.7934698933211289, 'learning_rate': 8.109580535171262e-06, 'epoch': 0.31}
31%|███ | 10551/34278 [11:39:51<23:09:52, 3.51s/it] {'loss': 0.1326, 'grad_norm': 1.0296175217764811, 'learning_rate': 8.10921056572094e-06, 'epoch': 0.31}
31%|███ | 10552/34278 [11:39:54<22:58:27, 3.49s/it] {'loss': 0.1429, 'grad_norm': 0.7560402079317322, 'learning_rate': 8.108840568512326e-06, 'epoch': 0.31}
31%|███ | 10553/34278 [11:39:58<23:51:40, 3.62s/it] {'loss': 0.1237, 'grad_norm': 0.6237040766499032, 'learning_rate': 8.108470543548728e-06, 'epoch': 0.31}
31%|███ | 10554/34278 [11:40:01<22:36:37, 3.43s/it] {'loss': 0.1441, 'grad_norm': 1.0722082215091353, 'learning_rate': 8.108100490833444e-06, 'epoch': 0.31}
31%|███ | 10555/34278 [11:40:06<24:16:11, 3.68s/it] {'loss': 0.1475, 'grad_norm': 0.8508576037764862, 'learning_rate': 8.107730410369783e-06, 'epoch': 0.31}
31%|███ | 10556/34278 [11:40:09<22:59:42, 3.49s/it] {'loss': 0.1687, 'grad_norm': 0.7939273351996611, 'learning_rate': 8.107360302161047e-06, 'epoch': 0.31}
31%|███ | 10557/34278 [11:40:11<21:44:55, 3.30s/it] {'loss': 0.1441, 'grad_norm': 0.7899107972686913, 'learning_rate': 8.106990166210539e-06, 'epoch': 0.31}
31%|███ | 10558/34278 [11:40:15<21:25:41, 3.25s/it] {'loss': 0.1463, 'grad_norm': 0.902306511065696, 'learning_rate': 8.106620002521564e-06, 'epoch': 0.31}
31%|███ | 10559/34278 [11:40:20<25:03:59, 3.80s/it] {'loss': 0.1644, 'grad_norm': 0.8155122241241145, 'learning_rate': 8.106249811097428e-06, 'epoch': 0.31}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
31%|███ | 10560/34278 [11:40:22<23:11:40, 3.52s/it] {'loss': 0.1453, 'grad_norm': 0.8296850751204471, 'learning_rate': 8.105879591941436e-06, 'epoch': 0.31}
31%|███ | 10561/34278 [11:40:27<24:15:32, 3.68s/it] {'loss': 0.181, 'grad_norm': 0.8270095028586417, 'learning_rate': 8.10550934505689e-06, 'epoch': 0.31}
31%|███ | 10562/34278 [11:40:30<22:50:03, 3.47s/it] {'loss': 0.1396, 'grad_norm': 0.8338867825945535, 'learning_rate': 8.1051390704471e-06, 'epoch': 0.31}
31%|███ | 10563/34278 [11:40:33<22:10:11, 3.37s/it] {'loss': 0.1719, 'grad_norm': 0.8411356780114958, 'learning_rate': 8.10476876811537e-06, 'epoch': 0.31}
31%|███ | 10564/34278 [11:40:35<21:05:08, 3.20s/it] {'loss': 0.1436, 'grad_norm': 0.8384720164246909, 'learning_rate': 8.104398438065004e-06, 'epoch': 0.31}
31%|███ | 10565/34278 [11:40:40<22:51:17, 3.47s/it] {'loss': 0.1643, 'grad_norm': 1.586304716249349, 'learning_rate': 8.10402808029931e-06, 'epoch': 0.31}
31%|███ | 10566/34278 [11:40:43<23:14:41, 3.53s/it] {'loss': 0.1438, 'grad_norm': 0.7091876904890907, 'learning_rate': 8.103657694821597e-06, 'epoch': 0.31}
31%|███ | 10567/34278 [11:40:49<28:01:36, 4.26s/it] {'loss': 0.1282, 'grad_norm': 0.6861675408497042, 'learning_rate': 8.103287281635165e-06, 'epoch': 0.31}
31%|███ | 10568/34278 [11:40:52<25:27:14, 3.86s/it] {'loss': 0.142, 'grad_norm': 0.8024224508177042, 'learning_rate': 8.102916840743327e-06, 'epoch': 0.31}
31%|███ | 10569/34278 [11:40:55<23:43:11, 3.60s/it] {'loss': 0.1523, 'grad_norm': 0.739856121913459, 'learning_rate': 8.102546372149389e-06, 'epoch': 0.31}
31%|███ | 10570/34278 [11:40:58<22:09:19, 3.36s/it] {'loss': 0.156, 'grad_norm': 0.7441723765909202, 'learning_rate': 8.102175875856655e-06, 'epoch': 0.31}
31%|███ | 10571/34278 [11:41:01<21:36:21, 3.28s/it] {'loss': 0.1452, 'grad_norm': 0.8616164805014292, 'learning_rate': 8.101805351868438e-06, 'epoch': 0.31}
31%|███ | 10572/34278 [11:41:05<22:43:10, 3.45s/it] {'loss': 0.1618, 'grad_norm': 0.7507438067787525, 'learning_rate': 8.101434800188042e-06, 'epoch': 0.31}
31%|███ | 10573/34278 [11:41:08<21:33:03, 3.27s/it] {'loss': 0.1563, 'grad_norm': 0.6780369471077534, 'learning_rate': 8.101064220818776e-06, 'epoch': 0.31}
31%|███ | 10574/34278 [11:41:14<27:48:02, 4.22s/it] {'loss': 0.1392, 'grad_norm': 0.6803403032015081, 'learning_rate': 8.10069361376395e-06, 'epoch': 0.31}
31%|███ | 10575/34278 [11:41:19<29:57:59, 4.55s/it] {'loss': 0.1521, 'grad_norm': 0.7246904738957859, 'learning_rate': 8.100322979026872e-06, 'epoch': 0.31}
31%|███ | 10576/34278 [11:41:23<27:13:07, 4.13s/it] {'loss': 0.1332, 'grad_norm': 0.781055731214032, 'learning_rate': 8.099952316610849e-06, 'epoch': 0.31}
31%|███ | 10577/34278 [11:41:27<26:50:16, 4.08s/it] {'loss': 0.1525, 'grad_norm': 0.7652344949530506, 'learning_rate': 8.099581626519193e-06, 'epoch': 0.31}
31%|███ | 10578/34278 [11:41:32<29:20:16, 4.46s/it] {'loss': 0.1789, 'grad_norm': 0.7697622437224937, 'learning_rate': 8.099210908755213e-06, 'epoch': 0.31}
31%|███ | 10579/34278 [11:41:37<30:59:20, 4.71s/it] {'loss': 0.1519, 'grad_norm': 1.137822249189753, 'learning_rate': 8.098840163322215e-06, 'epoch': 0.31}
31%|███ | 10580/34278 [11:41:42<31:04:59, 4.72s/it] {'loss': 0.1594, 'grad_norm': 0.9379822407843574, 'learning_rate': 8.098469390223514e-06, 'epoch': 0.31}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
31%|███ | 10581/34278 [11:41:46<29:03:13, 4.41s/it] {'loss': 0.1458, 'grad_norm': 0.7880585471282775, 'learning_rate': 8.098098589462416e-06, 'epoch': 0.31}
Token indices sequence length is longer than the specified maximum sequence length for this model (8603 > 8192). Running this sequence through the model will result in indexing errors
31%|███ | 10582/34278 [11:41:49<26:12:11, 3.98s/it] {'loss': 0.1373, 'grad_norm': 1.0291747194947232, 'learning_rate': 8.097727761042236e-06, 'epoch': 0.31}
31%|███ | 10583/34278 [11:41:55<30:00:27, 4.56s/it] {'loss': 0.1317, 'grad_norm': 0.7812957602030183, 'learning_rate': 8.09735690496628e-06, 'epoch': 0.31}
31%|███ | 10584/34278 [11:42:00<32:38:43, 4.96s/it] {'loss': 0.1727, 'grad_norm': 0.8718213140293359, 'learning_rate': 8.096986021237863e-06, 'epoch': 0.31}
31%|███ | 10585/34278 [11:42:04<29:38:55, 4.50s/it] {'loss': 0.1537, 'grad_norm': 0.9414986040552127, 'learning_rate': 8.096615109860291e-06, 'epoch': 0.31}
31%|███ | 10586/34278 [11:42:07<26:42:13, 4.06s/it] {'loss': 0.122, 'grad_norm': 0.7669372121470045, 'learning_rate': 8.09624417083688e-06, 'epoch': 0.31}
31%|███ | 10587/34278 [11:42:10<25:14:20, 3.84s/it] {'loss': 0.166, 'grad_norm': 0.762580240102507, 'learning_rate': 8.09587320417094e-06, 'epoch': 0.31}
31%|███ | 10588/34278 [11:42:13<24:07:20, 3.67s/it] {'loss': 0.1337, 'grad_norm': 0.6681375948337287, 'learning_rate': 8.095502209865785e-06, 'epoch': 0.31}
31%|███ | 10589/34278 [11:42:18<26:13:56, 3.99s/it] {'loss': 0.1658, 'grad_norm': 0.8626854000460653, 'learning_rate': 8.095131187924723e-06, 'epoch': 0.31}
31%|███ | 10590/34278 [11:42:22<25:12:54, 3.83s/it] {'loss': 0.1515, 'grad_norm': 0.7997351499456614, 'learning_rate': 8.09476013835107e-06, 'epoch': 0.31}
31%|███ | 10591/34278 [11:42:27<28:08:22, 4.28s/it] {'loss': 0.1401, 'grad_norm': 0.9376171703942249, 'learning_rate': 8.094389061148135e-06, 'epoch': 0.31}
31%|███ | 10592/34278 [11:42:30<25:40:32, 3.90s/it] {'loss': 0.1524, 'grad_norm': 0.7558847608908864, 'learning_rate': 8.094017956319236e-06, 'epoch': 0.31}
31%|███ | 10593/34278 [11:42:34<26:40:12, 4.05s/it] {'loss': 0.1326, 'grad_norm': 0.8543530620818583, 'learning_rate': 8.093646823867683e-06, 'epoch': 0.31}
31%|███ | 10594/34278 [11:42:37<24:38:30, 3.75s/it] {'loss': 0.1584, 'grad_norm': 0.7740183128550507, 'learning_rate': 8.093275663796787e-06, 'epoch': 0.31}
31%|███ | 10595/34278 [11:42:41<24:06:28, 3.66s/it] {'loss': 0.1485, 'grad_norm': 0.8451521187831812, 'learning_rate': 8.092904476109867e-06, 'epoch': 0.31}
31%|███ | 10596/34278 [11:42:44<22:56:43, 3.49s/it] {'loss': 0.1547, 'grad_norm': 0.7527218799395663, 'learning_rate': 8.092533260810234e-06, 'epoch': 0.31}
31%|███ | 10597/34278 [11:42:47<22:17:19, 3.39s/it] {'loss': 0.1334, 'grad_norm': 0.8702787459635344, 'learning_rate': 8.0921620179012e-06, 'epoch': 0.31}
31%|███ | 10598/34278 [11:42:51<22:25:15, 3.41s/it] {'loss': 0.1384, 'grad_norm': 0.8224779270059287, 'learning_rate': 8.091790747386084e-06, 'epoch': 0.31}
31%|███ | 10599/34278 [11:42:54<22:28:11, 3.42s/it] {'loss': 0.1703, 'grad_norm': 0.9221937293844968, 'learning_rate': 8.091419449268197e-06, 'epoch': 0.31}
31%|███ | 10600/34278 [11:42:57<22:16:08, 3.39s/it] {'loss': 0.136, 'grad_norm': 0.9491475623337832, 'learning_rate': 8.091048123550855e-06, 'epoch': 0.31}
31%|███ | 10601/34278 [11:43:02<24:43:30, 3.76s/it] {'loss': 0.1482, 'grad_norm': 0.7995688228938687, 'learning_rate': 8.090676770237374e-06, 'epoch': 0.31}
31%|███ | 10602/34278 [11:43:06<24:17:46, 3.69s/it] {'loss': 0.1453, 'grad_norm': 0.9582166128890751, 'learning_rate': 8.090305389331069e-06, 'epoch': 0.31}
31%|███ | 10603/34278 [11:43:09<24:30:29, 3.73s/it] {'loss': 0.1476, 'grad_norm': 0.9879063390849203, 'learning_rate': 8.089933980835254e-06, 'epoch': 0.31}
31%|███ | 10604/34278 [11:43:13<24:43:56, 3.76s/it] {'loss': 0.1451, 'grad_norm': 0.9051906040954101, 'learning_rate': 8.089562544753247e-06, 'epoch': 0.31}
31%|███ | 10605/34278 [11:43:16<23:05:50, 3.51s/it] {'loss': 0.1574, 'grad_norm': 1.0179094349743891, 'learning_rate': 8.089191081088364e-06, 'epoch': 0.31}
31%|███ | 10606/34278 [11:43:20<22:51:23, 3.48s/it] {'loss': 0.1905, 'grad_norm': 0.8367263083938471, 'learning_rate': 8.088819589843919e-06, 'epoch': 0.31}
31%|███ | 10607/34278 [11:43:23<23:03:26, 3.51s/it] {'loss': 0.1341, 'grad_norm': 1.0329858362778113, 'learning_rate': 8.08844807102323e-06, 'epoch': 0.31}
31%|███ | 10608/34278 [11:43:28<25:38:29, 3.90s/it] {'loss': 0.1603, 'grad_norm': 1.1484767834595642, 'learning_rate': 8.088076524629613e-06, 'epoch': 0.31}
31%|███ | 10609/34278 [11:43:31<23:33:40, 3.58s/it] {'loss': 0.1458, 'grad_norm': 1.0068689908140545, 'learning_rate': 8.087704950666388e-06, 'epoch': 0.31}
31%|███ | 10610/34278 [11:43:34<22:41:14, 3.45s/it] {'loss': 0.1787, 'grad_norm': 1.0035871570590194, 'learning_rate': 8.08733334913687e-06, 'epoch': 0.31}
31%|███ | 10611/34278 [11:43:38<23:34:47, 3.59s/it] {'loss': 0.1485, 'grad_norm': 0.841565240388496, 'learning_rate': 8.086961720044374e-06, 'epoch': 0.31}
31%|███ | 10612/34278 [11:43:41<23:33:22, 3.58s/it] {'loss': 0.1374, 'grad_norm': 1.01173294811488, 'learning_rate': 8.086590063392224e-06, 'epoch': 0.31}
31%|███ | 10613/34278 [11:43:44<22:00:03, 3.35s/it] {'loss': 0.1599, 'grad_norm': 1.083260824888384, 'learning_rate': 8.086218379183735e-06, 'epoch': 0.31}
31%|███ | 10614/34278 [11:43:48<23:12:49, 3.53s/it] {'loss': 0.1554, 'grad_norm': 0.8371162877497406, 'learning_rate': 8.085846667422224e-06, 'epoch': 0.31}
31%|███ | 10615/34278 [11:43:54<26:51:35, 4.09s/it] {'loss': 0.1601, 'grad_norm': 0.9618314941000575, 'learning_rate': 8.08547492811101e-06, 'epoch': 0.31}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
31%|███ | 10616/34278 [11:43:56<24:09:37, 3.68s/it] {'loss': 0.1595, 'grad_norm': 0.8562001405810296, 'learning_rate': 8.085103161253413e-06, 'epoch': 0.31}
31%|███ | 10617/34278 [11:44:00<24:08:53, 3.67s/it] {'loss': 0.1459, 'grad_norm': 0.7229896698756507, 'learning_rate': 8.084731366852752e-06, 'epoch': 0.31}
31%|███ | 10618/34278 [11:44:04<24:20:13, 3.70s/it] {'loss': 0.1332, 'grad_norm': 0.7145254810818518, 'learning_rate': 8.084359544912344e-06, 'epoch': 0.31}
31%|███ | 10619/34278 [11:44:09<26:44:08, 4.07s/it] {'loss': 0.1589, 'grad_norm': 0.7985107534143561, 'learning_rate': 8.08398769543551e-06, 'epoch': 0.31}
31%|███ | 10620/34278 [11:44:13<28:14:01, 4.30s/it] {'loss': 0.1632, 'grad_norm': 0.8866769673396308, 'learning_rate': 8.083615818425573e-06, 'epoch': 0.31}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
31%|███ | 10621/34278 [11:44:17<26:54:59, 4.10s/it] {'loss': 0.1417, 'grad_norm': 0.9127343915015304, 'learning_rate': 8.083243913885848e-06, 'epoch': 0.31}
31%|███ | 10622/34278 [11:44:21<25:44:29, 3.92s/it] {'loss': 0.1552, 'grad_norm': 0.6578977562893924, 'learning_rate': 8.082871981819658e-06, 'epoch': 0.31}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
31%|███ | 10623/34278 [11:44:25<26:03:06, 3.96s/it] {'loss': 0.1483, 'grad_norm': 1.1127677504484195, 'learning_rate': 8.082500022230323e-06, 'epoch': 0.31}
31%|███ | 10624/34278 [11:44:27<23:50:41, 3.63s/it] {'loss': 0.186, 'grad_norm': 0.8847830017787616, 'learning_rate': 8.082128035121162e-06, 'epoch': 0.31}
31%|███ | 10625/34278 [11:44:32<26:06:47, 3.97s/it] {'loss': 0.1247, 'grad_norm': 0.6800510056270985, 'learning_rate': 8.081756020495501e-06, 'epoch': 0.31}
31%|███ | 10626/34278 [11:44:39<31:01:04, 4.72s/it] {'loss': 0.1543, 'grad_norm': 0.7799657725364644, 'learning_rate': 8.081383978356655e-06, 'epoch': 0.31}
31%|███ | 10627/34278 [11:44:42<28:57:55, 4.41s/it] {'loss': 0.153, 'grad_norm': 0.8811745190164654, 'learning_rate': 8.08101190870795e-06, 'epoch': 0.31}
31%|███ | 10628/34278 [11:44:46<26:31:00, 4.04s/it] {'loss': 0.1504, 'grad_norm': 0.7482189825762555, 'learning_rate': 8.080639811552704e-06, 'epoch': 0.31}
31%|███ | 10629/34278 [11:44:49<25:41:57, 3.91s/it] {'loss': 0.1514, 'grad_norm': 0.7595443605436023, 'learning_rate': 8.080267686894244e-06, 'epoch': 0.31}
31%|███ | 10630/34278 [11:44:52<24:23:40, 3.71s/it] {'loss': 0.1475, 'grad_norm': 0.7780154206427352, 'learning_rate': 8.079895534735887e-06, 'epoch': 0.31}
31%|███ | 10631/34278 [11:44:56<23:45:33, 3.62s/it] {'loss': 0.1656, 'grad_norm': 0.7048107343833214, 'learning_rate': 8.07952335508096e-06, 'epoch': 0.31}
31%|███ | 10632/34278 [11:45:00<25:03:43, 3.82s/it] {'loss': 0.1604, 'grad_norm': 0.7852511955892731, 'learning_rate': 8.079151147932783e-06, 'epoch': 0.31}
31%|███ | 10633/34278 [11:45:04<25:32:25, 3.89s/it] {'loss': 0.1579, 'grad_norm': 0.8862456259772208, 'learning_rate': 8.078778913294677e-06, 'epoch': 0.31}
31%|███ | 10634/34278 [11:45:07<23:06:42, 3.52s/it] {'loss': 0.1588, 'grad_norm': 0.859633153948346, 'learning_rate': 8.078406651169972e-06, 'epoch': 0.31}
31%|███ | 10635/34278 [11:45:10<22:25:24, 3.41s/it] {'loss': 0.1706, 'grad_norm': 0.7830934120308404, 'learning_rate': 8.078034361561986e-06, 'epoch': 0.31}
31%|███ | 10636/34278 [11:45:13<21:41:34, 3.30s/it] {'loss': 0.1322, 'grad_norm': 0.6885264756111578, 'learning_rate': 8.077662044474043e-06, 'epoch': 0.31}
31%|███ | 10637/34278 [11:45:17<23:47:25, 3.62s/it] {'loss': 0.1569, 'grad_norm': 0.9129535942341924, 'learning_rate': 8.077289699909467e-06, 'epoch': 0.31}
31%|███ | 10638/34278 [11:45:21<22:46:10, 3.47s/it] {'loss': 0.1634, 'grad_norm': 1.5824290449546556, 'learning_rate': 8.076917327871585e-06, 'epoch': 0.31}
31%|███ | 10639/34278 [11:45:24<21:52:17, 3.33s/it] {'loss': 0.1311, 'grad_norm': 0.6993724461831868, 'learning_rate': 8.07654492836372e-06, 'epoch': 0.31}
31%|███ | 10640/34278 [11:45:27<22:05:34, 3.36s/it] {'loss': 0.1416, 'grad_norm': 0.7090772943109829, 'learning_rate': 8.076172501389194e-06, 'epoch': 0.31}
31%|███ | 10641/34278 [11:45:30<21:55:05, 3.34s/it] {'loss': 0.1214, 'grad_norm': 0.7689088325897413, 'learning_rate': 8.075800046951336e-06, 'epoch': 0.31}
31%|███ | 10642/34278 [11:45:33<21:32:42, 3.28s/it] {'loss': 0.1551, 'grad_norm': 0.9255989646444487, 'learning_rate': 8.075427565053471e-06, 'epoch': 0.31}
31%|███ | 10643/34278 [11:45:39<26:50:29, 4.09s/it] {'loss': 0.1643, 'grad_norm': 0.9407014546893899, 'learning_rate': 8.07505505569892e-06, 'epoch': 0.31}
31%|███ | 10644/34278 [11:45:44<28:41:02, 4.37s/it] {'loss': 0.1692, 'grad_norm': 0.6631045486438311, 'learning_rate': 8.074682518891013e-06, 'epoch': 0.31}
31%|███ | 10645/34278 [11:45:47<25:57:28, 3.95s/it] {'loss': 0.1507, 'grad_norm': 0.7748464554000842, 'learning_rate': 8.074309954633074e-06, 'epoch': 0.31}
31%|███ | 10646/34278 [11:45:50<24:08:38, 3.68s/it] {'loss': 0.1413, 'grad_norm': 0.7765330736455465, 'learning_rate': 8.07393736292843e-06, 'epoch': 0.31}
31%|███ | 10647/34278 [11:45:54<24:41:24, 3.76s/it] {'loss': 0.1758, 'grad_norm': 0.8579947591245294, 'learning_rate': 8.073564743780407e-06, 'epoch': 0.31}
31%|███ | 10648/34278 [11:45:59<27:15:09, 4.15s/it] {'loss': 0.137, 'grad_norm': 0.7410292246732326, 'learning_rate': 8.07319209719233e-06, 'epoch': 0.31}
31%|███ | 10649/34278 [11:46:02<24:48:00, 3.78s/it] {'loss': 0.1572, 'grad_norm': 0.7330946753608799, 'learning_rate': 8.072819423167529e-06, 'epoch': 0.31}
31%|███ | 10650/34278 [11:46:07<26:00:16, 3.96s/it] {'loss': 0.144, 'grad_norm': 0.9172111121028844, 'learning_rate': 8.07244672170933e-06, 'epoch': 0.31}
31%|███ | 10651/34278 [11:46:11<25:40:53, 3.91s/it] {'loss': 0.165, 'grad_norm': 0.8347018719207745, 'learning_rate': 8.07207399282106e-06, 'epoch': 0.31}
31%|███ | 10652/34278 [11:46:14<24:50:32, 3.79s/it] {'loss': 0.1516, 'grad_norm': 0.9026895241321442, 'learning_rate': 8.071701236506046e-06, 'epoch': 0.31}
31%|███ | 10653/34278 [11:46:20<28:20:05, 4.32s/it] {'loss': 0.1541, 'grad_norm': 0.7653364453481548, 'learning_rate': 8.071328452767616e-06, 'epoch': 0.31}
31%|███ | 10654/34278 [11:46:24<28:31:58, 4.35s/it] {'loss': 0.1536, 'grad_norm': 0.842937097174414, 'learning_rate': 8.0709556416091e-06, 'epoch': 0.31}
31%|███ | 10655/34278 [11:46:27<25:36:20, 3.90s/it] {'loss': 0.1593, 'grad_norm': 0.7441741293056079, 'learning_rate': 8.070582803033827e-06, 'epoch': 0.31}
31%|███ | 10656/34278 [11:46:30<23:51:35, 3.64s/it] {'loss': 0.1361, 'grad_norm': 0.8379730241885758, 'learning_rate': 8.07020993704512e-06, 'epoch': 0.31}
31%|███ | 10657/34278 [11:46:33<23:05:15, 3.52s/it] {'loss': 0.1453, 'grad_norm': 0.8639332739638844, 'learning_rate': 8.069837043646313e-06, 'epoch': 0.31}
31%|███ | 10658/34278 [11:46:37<23:47:22, 3.63s/it] {'loss': 0.1624, 'grad_norm': 0.8095227989886883, 'learning_rate': 8.069464122840736e-06, 'epoch': 0.31}
31%|███ | 10659/34278 [11:46:40<22:39:17, 3.45s/it] {'loss': 0.1314, 'grad_norm': 0.8009859399076495, 'learning_rate': 8.069091174631713e-06, 'epoch': 0.31}
31%|███ | 10660/34278 [11:46:43<21:50:05, 3.33s/it] {'loss': 0.1241, 'grad_norm': 0.7809940192959718, 'learning_rate': 8.068718199022578e-06, 'epoch': 0.31}
31%|███ | 10661/34278 [11:46:49<26:05:59, 3.98s/it] {'loss': 0.1445, 'grad_norm': 0.8787555741360006, 'learning_rate': 8.06834519601666e-06, 'epoch': 0.31}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
31%|███ | 10662/34278 [11:46:52<25:53:03, 3.95s/it] {'loss': 0.1576, 'grad_norm': 0.8156662472298689, 'learning_rate': 8.067972165617287e-06, 'epoch': 0.31}
31%|███ | 10663/34278 [11:46:55<24:00:45, 3.66s/it] {'loss': 0.1546, 'grad_norm': 0.8065762398079646, 'learning_rate': 8.067599107827793e-06, 'epoch': 0.31}
31%|███ | 10664/34278 [11:46:59<23:02:29, 3.51s/it] {'loss': 0.1406, 'grad_norm': 0.7447673881203032, 'learning_rate': 8.067226022651505e-06, 'epoch': 0.31}
31%|███ | 10665/34278 [11:47:04<27:39:39, 4.22s/it] {'loss': 0.1521, 'grad_norm': 0.8428734686193561, 'learning_rate': 8.066852910091754e-06, 'epoch': 0.31}
31%|███ | 10666/34278 [11:47:07<25:11:14, 3.84s/it] {'loss': 0.1539, 'grad_norm': 0.7604425710000424, 'learning_rate': 8.066479770151875e-06, 'epoch': 0.31}
31%|███ | 10667/34278 [11:47:11<24:37:14, 3.75s/it] {'loss': 0.1547, 'grad_norm': 0.8009054114318241, 'learning_rate': 8.066106602835195e-06, 'epoch': 0.31}
31%|███ | 10668/34278 [11:47:17<29:30:44, 4.50s/it] {'loss': 0.1528, 'grad_norm': 1.0157049129411353, 'learning_rate': 8.065733408145047e-06, 'epoch': 0.31}
31%|███ | 10669/34278 [11:47:21<27:48:45, 4.24s/it] {'loss': 0.1535, 'grad_norm': 0.970607898681344, 'learning_rate': 8.065360186084764e-06, 'epoch': 0.31}
31%|███ | 10670/34278 [11:47:24<25:35:50, 3.90s/it] {'loss': 0.1637, 'grad_norm': 0.692521856646399, 'learning_rate': 8.064986936657678e-06, 'epoch': 0.31}
31%|███ | 10671/34278 [11:47:27<24:20:58, 3.71s/it] {'loss': 0.18, 'grad_norm': 0.9277228764146348, 'learning_rate': 8.064613659867117e-06, 'epoch': 0.31}
31%|███ | 10672/34278 [11:47:30<23:05:04, 3.52s/it] {'loss': 0.1336, 'grad_norm': 1.0267082516333221, 'learning_rate': 8.06424035571642e-06, 'epoch': 0.31}
31%|███ | 10673/34278 [11:47:35<24:33:39, 3.75s/it] {'loss': 0.1424, 'grad_norm': 0.7567316283677745, 'learning_rate': 8.063867024208915e-06, 'epoch': 0.31}
31%|███ | 10674/34278 [11:47:38<23:36:51, 3.60s/it] {'loss': 0.1521, 'grad_norm': 0.8372580592986668, 'learning_rate': 8.063493665347937e-06, 'epoch': 0.31}
31%|███ | 10675/34278 [11:47:43<26:28:44, 4.04s/it] {'loss': 0.1568, 'grad_norm': 0.8257915263991669, 'learning_rate': 8.063120279136818e-06, 'epoch': 0.31}
31%|███ | 10676/34278 [11:47:49<30:46:47, 4.69s/it] {'loss': 0.1512, 'grad_norm': 0.6840641979015937, 'learning_rate': 8.062746865578894e-06, 'epoch': 0.31}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
31%|███ | 10677/34278 [11:47:52<27:59:44, 4.27s/it] {'loss': 0.1588, 'grad_norm': 0.956027467205039, 'learning_rate': 8.062373424677497e-06, 'epoch': 0.31}
31%|███ | 10678/34278 [11:47:56<26:43:49, 4.08s/it] {'loss': 0.132, 'grad_norm': 0.74587657090834, 'learning_rate': 8.061999956435959e-06, 'epoch': 0.31}
31%|███ | 10679/34278 [11:47:59<24:56:04, 3.80s/it] {'loss': 0.1306, 'grad_norm': 0.7686836235919776, 'learning_rate': 8.061626460857618e-06, 'epoch': 0.31}
31%|███ | 10680/34278 [11:48:02<23:47:25, 3.63s/it] {'loss': 0.1406, 'grad_norm': 0.6952941128726813, 'learning_rate': 8.061252937945807e-06, 'epoch': 0.31}
31%|███ | 10681/34278 [11:48:05<22:27:21, 3.43s/it] {'loss': 0.1355, 'grad_norm': 0.7551025388799266, 'learning_rate': 8.06087938770386e-06, 'epoch': 0.31}
31%|███ | 10682/34278 [11:48:11<27:05:05, 4.13s/it] {'loss': 0.123, 'grad_norm': 0.7968590949508415, 'learning_rate': 8.060505810135113e-06, 'epoch': 0.31}
31%|███ | 10683/34278 [11:48:15<26:37:31, 4.06s/it] {'loss': 0.1587, 'grad_norm': 0.7104227466698682, 'learning_rate': 8.0601322052429e-06, 'epoch': 0.31}
31%|███ | 10684/34278 [11:48:20<28:05:50, 4.29s/it] {'loss': 0.137, 'grad_norm': 0.770712191753551, 'learning_rate': 8.059758573030559e-06, 'epoch': 0.31}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 31%|███ | 10685/34278 [11:48:24<27:24:19, 4.18s/it] {'loss': 0.1379, 'grad_norm': 1.0207202104356439, 'learning_rate': 8.059384913501422e-06, 'epoch': 0.31} 31%|███ | 10685/34278 [11:48:24<27:24:19, 4.18s/it] 31%|███ | 10686/34278 [11:48:28<27:38:17, 4.22s/it] {'loss': 0.1337, 'grad_norm': 0.9498034214744107, 'learning_rate': 8.059011226658826e-06, 'epoch': 0.31} 31%|███ | 10686/34278 [11:48:28<27:38:17, 4.22s/it] 31%|███ | 10687/34278 [11:48:32<26:14:56, 4.01s/it] {'loss': 0.1565, 'grad_norm': 0.7442823311245105, 'learning_rate': 8.05863751250611e-06, 'epoch': 0.31} 31%|███ | 10687/34278 [11:48:32<26:14:56, 4.01s/it] 31%|███ | 10688/34278 [11:48:35<24:24:04, 3.72s/it] {'loss': 0.1645, 'grad_norm': 0.8086803621633196, 'learning_rate': 8.058263771046608e-06, 'epoch': 0.31} 31%|███ | 10688/34278 [11:48:35<24:24:04, 3.72s/it] 31%|███ | 10689/34278 [11:48:38<23:29:43, 3.59s/it] {'loss': 0.13, 'grad_norm': 0.7835500579927944, 'learning_rate': 8.057890002283657e-06, 'epoch': 0.31} 31%|███ | 10689/34278 [11:48:38<23:29:43, 3.59s/it] 31%|███ | 10690/34278 [11:48:43<25:52:24, 3.95s/it] {'loss': 0.1867, 'grad_norm': 0.8949264288312289, 'learning_rate': 8.057516206220594e-06, 'epoch': 0.31} 31%|███ | 10690/34278 [11:48:43<25:52:24, 3.95s/it] 31%|███ | 10691/34278 [11:48:46<23:42:07, 3.62s/it] {'loss': 0.1439, 'grad_norm': 0.763094435902557, 'learning_rate': 8.057142382860757e-06, 'epoch': 0.31} 31%|███ | 10691/34278 [11:48:46<23:42:07, 3.62s/it] 31%|███ | 10692/34278 [11:48:50<24:17:51, 3.71s/it] {'loss': 0.1707, 'grad_norm': 1.1047382814335238, 'learning_rate': 8.05676853220748e-06, 'epoch': 0.31} 31%|███ | 10692/34278 [11:48:50<24:17:51, 3.71s/it] 31%|███ | 10693/34278 [11:48:53<24:32:14, 3.75s/it] {'loss': 0.1338, 'grad_norm': 1.0420450594740067, 'learning_rate': 8.056394654264107e-06, 'epoch': 0.31} 31%|███ | 10693/34278 [11:48:53<24:32:14, 3.75s/it] 31%|███ | 10694/34278 [11:48:57<23:47:56, 3.63s/it] {'loss': 0.1528, 'grad_norm': 
1.1649076445090332, 'learning_rate': 8.056020749033968e-06, 'epoch': 0.31} 31%|███ | 10694/34278 [11:48:57<23:47:56, 3.63s/it] 31%|███ | 10695/34278 [11:49:00<22:37:57, 3.45s/it] {'loss': 0.1424, 'grad_norm': 0.9895356237269789, 'learning_rate': 8.055646816520409e-06, 'epoch': 0.31} 31%|███ | 10695/34278 [11:49:00<22:37:57, 3.45s/it] 31%|███ | 10696/34278 [11:49:06<27:52:06, 4.25s/it] {'loss': 0.1454, 'grad_norm': 1.155915915417647, 'learning_rate': 8.05527285672676e-06, 'epoch': 0.31} 31%|███ | 10696/34278 [11:49:06<27:52:06, 4.25s/it] 31%|███ | 10697/34278 [11:49:10<28:03:00, 4.28s/it] {'loss': 0.1343, 'grad_norm': 0.8550922229798301, 'learning_rate': 8.05489886965637e-06, 'epoch': 0.31} 31%|███ | 10697/34278 [11:49:10<28:03:00, 4.28s/it] 31%|███ | 10698/34278 [11:49:14<26:19:45, 4.02s/it] {'loss': 0.1417, 'grad_norm': 1.0372996907065362, 'learning_rate': 8.054524855312568e-06, 'epoch': 0.31} 31%|███ | 10698/34278 [11:49:14<26:19:45, 4.02s/it] 31%|███ | 10699/34278 [11:49:17<24:29:10, 3.74s/it] {'loss': 0.149, 'grad_norm': 0.8225030740209992, 'learning_rate': 8.0541508136987e-06, 'epoch': 0.31} 31%|███ | 10699/34278 [11:49:17<24:29:10, 3.74s/it] 31%|███ | 10700/34278 [11:49:20<23:43:38, 3.62s/it] {'loss': 0.1408, 'grad_norm': 1.0389846090648762, 'learning_rate': 8.053776744818102e-06, 'epoch': 0.31} 31%|███ | 10700/34278 [11:49:20<23:43:38, 3.62s/it] 31%|███ | 10701/34278 [11:49:23<22:52:43, 3.49s/it] {'loss': 0.1606, 'grad_norm': 0.8972509564498101, 'learning_rate': 8.053402648674113e-06, 'epoch': 0.31} 31%|███ | 10701/34278 [11:49:23<22:52:43, 3.49s/it] 31%|███ | 10702/34278 [11:49:27<22:30:57, 3.44s/it] {'loss': 0.1475, 'grad_norm': 0.7994333451254089, 'learning_rate': 8.053028525270075e-06, 'epoch': 0.31} 31%|███ | 10702/34278 [11:49:27<22:30:57, 3.44s/it] 31%|███ | 10703/34278 [11:49:30<21:47:18, 3.33s/it] {'loss': 0.1556, 'grad_norm': 1.2633577308984363, 'learning_rate': 8.052654374609326e-06, 'epoch': 0.31} 31%|███ | 10703/34278 [11:49:30<21:47:18, 
3.33s/it] 31%|███ | 10704/34278 [11:49:33<21:28:47, 3.28s/it] {'loss': 0.1432, 'grad_norm': 1.0440090002034648, 'learning_rate': 8.052280196695209e-06, 'epoch': 0.31} 31%|███ | 10704/34278 [11:49:33<21:28:47, 3.28s/it] 31%|███ | 10705/34278 [11:49:36<20:56:06, 3.20s/it] {'loss': 0.1416, 'grad_norm': 0.7322160840033916, 'learning_rate': 8.051905991531061e-06, 'epoch': 0.31} 31%|███ | 10705/34278 [11:49:36<20:56:06, 3.20s/it] 31%|███ | 10706/34278 [11:49:39<20:50:34, 3.18s/it] {'loss': 0.1613, 'grad_norm': 0.9900805434997059, 'learning_rate': 8.051531759120228e-06, 'epoch': 0.31} 31%|███ | 10706/34278 [11:49:39<20:50:34, 3.18s/it] 31%|███ | 10707/34278 [11:49:42<20:32:06, 3.14s/it] {'loss': 0.131, 'grad_norm': 0.9174698172214244, 'learning_rate': 8.051157499466044e-06, 'epoch': 0.31} 31%|███ | 10707/34278 [11:49:42<20:32:06, 3.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 31%|███ | 10708/34278 [11:49:47<23:28:22, 3.59s/it] {'loss': 0.1499, 'grad_norm': 0.7010178794851797, 'learning_rate': 8.050783212571857e-06, 'epoch': 0.31} 31%|███ | 10708/34278 [11:49:47<23:28:22, 3.59s/it] 31%|███ | 10709/34278 [11:49:51<25:08:49, 3.84s/it] {'loss': 0.1613, 'grad_norm': 0.9298159267560002, 'learning_rate': 8.050408898441005e-06, 'epoch': 0.31} 31%|███ | 10709/34278 [11:49:51<25:08:49, 3.84s/it] 31%|███ | 10710/34278 [11:49:56<26:48:09, 4.09s/it] {'loss': 0.1572, 'grad_norm': 0.8912738723340679, 'learning_rate': 8.050034557076831e-06, 'epoch': 0.31} 31%|███ | 10710/34278 [11:49:56<26:48:09, 4.09s/it] 31%|███ | 10711/34278 [11:50:02<30:39:35, 4.68s/it] {'loss': 0.1328, 'grad_norm': 0.7617971180421135, 'learning_rate': 8.049660188482677e-06, 'epoch': 0.31} 31%|███ | 10711/34278 [11:50:02<30:39:35, 4.68s/it] 31%|███▏ | 10712/34278 [11:50:05<27:05:45, 4.14s/it] {'loss': 0.1429, 'grad_norm': 0.6849817819796101, 'learning_rate': 8.049285792661882e-06, 'epoch': 0.31} 31%|███▏ | 10712/34278 [11:50:05<27:05:45, 4.14s/it] 31%|███▏ | 10713/34278 [11:50:08<24:35:27, 3.76s/it] {'loss': 0.1484, 'grad_norm': 0.912788931517718, 'learning_rate': 8.048911369617794e-06, 'epoch': 0.31} 31%|███▏ | 10713/34278 [11:50:08<24:35:27, 3.76s/it] 31%|███▏ | 10714/34278 [11:50:13<28:22:40, 4.34s/it] {'loss': 0.1559, 'grad_norm': 0.8351000231911664, 'learning_rate': 8.048536919353753e-06, 'epoch': 0.31} 31%|███▏ | 10714/34278 [11:50:13<28:22:40, 4.34s/it] 31%|███▏ | 10715/34278 [11:50:19<30:15:16, 4.62s/it] {'loss': 0.1551, 'grad_norm': 0.7026278300710169, 'learning_rate': 8.048162441873102e-06, 'epoch': 0.31} 31%|███▏ | 10715/34278 [11:50:19<30:15:16, 4.62s/it] 31%|███▏ | 10716/34278 [11:50:22<27:05:00, 4.14s/it] {'loss': 0.1316, 'grad_norm': 0.6711410878079361, 'learning_rate': 8.047787937179183e-06, 'epoch': 0.31} 31%|███▏ | 10716/34278 [11:50:22<27:05:00, 4.14s/it] 31%|███▏ | 10717/34278 [11:50:25<24:51:28, 3.80s/it] {'loss': 0.1463, 
'grad_norm': 0.7666660355141117, 'learning_rate': 8.047413405275344e-06, 'epoch': 0.31} 31%|███▏ | 10717/34278 [11:50:25<24:51:28, 3.80s/it] 31%|███▏ | 10718/34278 [11:50:28<23:32:02, 3.60s/it] {'loss': 0.1822, 'grad_norm': 0.8413731982089626, 'learning_rate': 8.047038846164923e-06, 'epoch': 0.31} 31%|███▏ | 10718/34278 [11:50:28<23:32:02, 3.60s/it] 31%|███▏ | 10719/34278 [11:50:31<23:12:39, 3.55s/it] {'loss': 0.1576, 'grad_norm': 0.8177612017088078, 'learning_rate': 8.046664259851267e-06, 'epoch': 0.31} 31%|███▏ | 10719/34278 [11:50:31<23:12:39, 3.55s/it] 31%|███▏ | 10720/34278 [11:50:34<22:17:36, 3.41s/it] {'loss': 0.1275, 'grad_norm': 0.9037697769111159, 'learning_rate': 8.046289646337719e-06, 'epoch': 0.31} 31%|███▏ | 10720/34278 [11:50:34<22:17:36, 3.41s/it] 31%|███▏ | 10721/34278 [11:50:38<22:45:07, 3.48s/it] {'loss': 0.1588, 'grad_norm': 0.8146306173853757, 'learning_rate': 8.045915005627626e-06, 'epoch': 0.31} 31%|███▏ | 10721/34278 [11:50:38<22:45:07, 3.48s/it] 31%|███▏ | 10722/34278 [11:50:43<25:30:41, 3.90s/it] {'loss': 0.1623, 'grad_norm': 0.8398277328802198, 'learning_rate': 8.045540337724329e-06, 'epoch': 0.31} 31%|███▏ | 10722/34278 [11:50:43<25:30:41, 3.90s/it] 31%|███▏ | 10723/34278 [11:50:46<23:45:19, 3.63s/it] {'loss': 0.1473, 'grad_norm': 0.6977706410600809, 'learning_rate': 8.045165642631176e-06, 'epoch': 0.31} 31%|███▏ | 10723/34278 [11:50:46<23:45:19, 3.63s/it] 31%|███▏ | 10724/34278 [11:50:50<24:09:11, 3.69s/it] {'loss': 0.1282, 'grad_norm': 0.8347512600324639, 'learning_rate': 8.044790920351512e-06, 'epoch': 0.31} 31%|███▏ | 10724/34278 [11:50:50<24:09:11, 3.69s/it] 31%|███▏ | 10725/34278 [11:50:54<24:53:44, 3.81s/it] {'loss': 0.1446, 'grad_norm': 1.0121807184248806, 'learning_rate': 8.044416170888681e-06, 'epoch': 0.31} 31%|███▏ | 10725/34278 [11:50:54<24:53:44, 3.81s/it] 31%|███▏ | 10726/34278 [11:50:59<29:00:39, 4.43s/it] {'loss': 0.1518, 'grad_norm': 0.6374140196195595, 'learning_rate': 8.044041394246027e-06, 'epoch': 0.31} 31%|███▏ | 
10726/34278 [11:50:59<29:00:39, 4.43s/it] 31%|███▏ | 10727/34278 [11:51:05<31:57:16, 4.88s/it] {'loss': 0.1559, 'grad_norm': 1.0939177543901855, 'learning_rate': 8.0436665904269e-06, 'epoch': 0.31} 31%|███▏ | 10727/34278 [11:51:05<31:57:16, 4.88s/it] 31%|███▏ | 10728/34278 [11:51:08<27:50:57, 4.26s/it] {'loss': 0.1361, 'grad_norm': 1.0342650290269129, 'learning_rate': 8.043291759434643e-06, 'epoch': 0.31} 31%|███▏ | 10728/34278 [11:51:08<27:50:57, 4.26s/it] 31%|███▏ | 10729/34278 [11:51:14<31:14:32, 4.78s/it] {'loss': 0.16, 'grad_norm': 0.906841026379668, 'learning_rate': 8.042916901272606e-06, 'epoch': 0.31} 31%|███▏ | 10729/34278 [11:51:14<31:14:32, 4.78s/it] 31%|███▏ | 10730/34278 [11:51:17<27:59:35, 4.28s/it] {'loss': 0.1565, 'grad_norm': 0.7827500777722924, 'learning_rate': 8.042542015944133e-06, 'epoch': 0.31} 31%|███▏ | 10730/34278 [11:51:17<27:59:35, 4.28s/it] 31%|███▏ | 10731/34278 [11:51:21<26:52:17, 4.11s/it] {'loss': 0.1701, 'grad_norm': 0.9417677569325281, 'learning_rate': 8.04216710345257e-06, 'epoch': 0.31} 31%|███▏ | 10731/34278 [11:51:21<26:52:17, 4.11s/it] 31%|███▏ | 10732/34278 [11:51:25<25:44:41, 3.94s/it] {'loss': 0.1428, 'grad_norm': 0.6544675059020303, 'learning_rate': 8.041792163801266e-06, 'epoch': 0.31} 31%|███▏ | 10732/34278 [11:51:25<25:44:41, 3.94s/it] 31%|███▏ | 10733/34278 [11:51:28<24:41:01, 3.77s/it] {'loss': 0.1716, 'grad_norm': 0.7878177342915902, 'learning_rate': 8.041417196993565e-06, 'epoch': 0.31} 31%|███▏ | 10733/34278 [11:51:28<24:41:01, 3.77s/it] 31%|███▏ | 10734/34278 [11:51:32<25:56:04, 3.97s/it] {'loss': 0.1472, 'grad_norm': 0.8376969242944549, 'learning_rate': 8.041042203032821e-06, 'epoch': 0.31} 31%|███▏ | 10734/34278 [11:51:32<25:56:04, 3.97s/it] 31%|███▏ | 10735/34278 [11:51:36<24:33:06, 3.75s/it] {'loss': 0.1382, 'grad_norm': 0.9388497991176934, 'learning_rate': 8.040667181922378e-06, 'epoch': 0.31} 31%|███▏ | 10735/34278 [11:51:36<24:33:06, 3.75s/it] 31%|███▏ | 10736/34278 [11:51:40<25:52:48, 3.96s/it] {'loss': 
0.1428, 'grad_norm': 0.7996431337234259, 'learning_rate': 8.040292133665582e-06, 'epoch': 0.31} 31%|███▏ | 10736/34278 [11:51:40<25:52:48, 3.96s/it] 31%|███▏ | 10737/34278 [11:51:46<30:02:22, 4.59s/it] {'loss': 0.1388, 'grad_norm': 0.9414314353418634, 'learning_rate': 8.039917058265784e-06, 'epoch': 0.31} 31%|███▏ | 10737/34278 [11:51:46<30:02:22, 4.59s/it] 31%|███▏ | 10738/34278 [11:51:50<28:41:58, 4.39s/it] {'loss': 0.1438, 'grad_norm': 1.0338513178501352, 'learning_rate': 8.039541955726333e-06, 'epoch': 0.31} 31%|███▏ | 10738/34278 [11:51:50<28:41:58, 4.39s/it] 31%|███▏ | 10739/34278 [11:51:53<25:59:55, 3.98s/it] {'loss': 0.146, 'grad_norm': 0.9033452820449029, 'learning_rate': 8.039166826050577e-06, 'epoch': 0.31} 31%|███▏ | 10739/34278 [11:51:53<25:59:55, 3.98s/it] 31%|███▏ | 10740/34278 [11:51:56<24:15:29, 3.71s/it] {'loss': 0.1303, 'grad_norm': 0.9429621006855641, 'learning_rate': 8.038791669241865e-06, 'epoch': 0.31} 31%|███▏ | 10740/34278 [11:51:56<24:15:29, 3.71s/it] 31%|███▏ | 10741/34278 [11:51:59<23:09:28, 3.54s/it] {'loss': 0.1469, 'grad_norm': 0.7545239946383687, 'learning_rate': 8.038416485303546e-06, 'epoch': 0.31} 31%|███▏ | 10741/34278 [11:51:59<23:09:28, 3.54s/it] 31%|███▏ | 10742/34278 [11:52:05<26:57:03, 4.12s/it] {'loss': 0.1571, 'grad_norm': 0.8022933400590296, 'learning_rate': 8.03804127423897e-06, 'epoch': 0.31} 31%|███▏ | 10742/34278 [11:52:05<26:57:03, 4.12s/it] 31%|███▏ | 10743/34278 [11:52:08<25:42:20, 3.93s/it] {'loss': 0.19, 'grad_norm': 0.9467000448849283, 'learning_rate': 8.037666036051489e-06, 'epoch': 0.31} 31%|███▏ | 10743/34278 [11:52:08<25:42:20, 3.93s/it] 31%|███▏ | 10744/34278 [11:52:14<29:53:31, 4.57s/it] {'loss': 0.1495, 'grad_norm': 0.7750292485174162, 'learning_rate': 8.037290770744448e-06, 'epoch': 0.31} 31%|███▏ | 10744/34278 [11:52:14<29:53:31, 4.57s/it] 31%|███▏ | 10745/34278 [11:52:18<28:44:23, 4.40s/it] {'loss': 0.1524, 'grad_norm': 1.0258822155581384, 'learning_rate': 8.036915478321201e-06, 'epoch': 0.31} 31%|███▏ 
| 10745/34278 [11:52:18<28:44:23, 4.40s/it] 31%|███▏ | 10746/34278 [11:52:22<28:05:47, 4.30s/it] {'loss': 0.1536, 'grad_norm': 0.8124866017833455, 'learning_rate': 8.036540158785097e-06, 'epoch': 0.31} 31%|███▏ | 10746/34278 [11:52:22<28:05:47, 4.30s/it] 31%|███▏ | 10747/34278 [11:52:27<29:14:52, 4.47s/it] {'loss': 0.132, 'grad_norm': 1.0427948302415047, 'learning_rate': 8.036164812139487e-06, 'epoch': 0.31} 31%|███▏ | 10747/34278 [11:52:27<29:14:52, 4.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 31%|███▏ | 10748/34278 [11:52:30<26:15:03, 4.02s/it] {'loss': 0.1376, 'grad_norm': 0.8445958957902322, 'learning_rate': 8.035789438387724e-06, 'epoch': 0.31} 31%|███▏ | 10748/34278 [11:52:30<26:15:03, 4.02s/it] 31%|███▏ | 10749/34278 [11:52:33<24:19:47, 3.72s/it] {'loss': 0.1911, 'grad_norm': 1.2067884629560839, 'learning_rate': 8.035414037533156e-06, 'epoch': 0.31} 31%|███▏ | 10749/34278 [11:52:33<24:19:47, 3.72s/it] 31%|███▏ | 10750/34278 [11:52:37<25:13:46, 3.86s/it] {'loss': 0.1509, 'grad_norm': 1.1192234183737655, 'learning_rate': 8.035038609579138e-06, 'epoch': 0.31} 31%|███▏ | 10750/34278 [11:52:37<25:13:46, 3.86s/it] 31%|███▏ | 10751/34278 [11:52:40<23:30:46, 3.60s/it] {'loss': 0.1631, 'grad_norm': 0.8019585745421046, 'learning_rate': 8.034663154529018e-06, 'epoch': 0.31} 31%|███▏ | 10751/34278 [11:52:40<23:30:46, 3.60s/it] 31%|███▏ | 10752/34278 [11:52:44<23:19:47, 3.57s/it] {'loss': 0.1314, 'grad_norm': 1.1192644625980521, 'learning_rate': 8.03428767238615e-06, 'epoch': 0.31} 31%|███▏ | 10752/34278 [11:52:44<23:19:47, 3.57s/it] 31%|███▏ | 10753/34278 [11:52:50<28:01:21, 4.29s/it] {'loss': 0.1622, 'grad_norm': 0.8762285213604124, 'learning_rate': 8.033912163153886e-06, 'epoch': 0.31} 31%|███▏ | 10753/34278 [11:52:50<28:01:21, 4.29s/it] 31%|███▏ | 10754/34278 [11:52:53<26:33:55, 
4.07s/it] {'loss': 0.1578, 'grad_norm': 0.8767256040478457, 'learning_rate': 8.03353662683558e-06, 'epoch': 0.31} 31%|███▏ | 10754/34278 [11:52:53<26:33:55, 4.07s/it] 31%|███▏ | 10755/34278 [11:52:57<24:44:10, 3.79s/it] {'loss': 0.1516, 'grad_norm': 0.999344346633624, 'learning_rate': 8.033161063434582e-06, 'epoch': 0.31} 31%|███▏ | 10755/34278 [11:52:57<24:44:10, 3.79s/it] 31%|███▏ | 10756/34278 [11:52:59<22:57:12, 3.51s/it] {'loss': 0.143, 'grad_norm': 0.7808600078335992, 'learning_rate': 8.032785472954246e-06, 'epoch': 0.31} 31%|███▏ | 10756/34278 [11:52:59<22:57:12, 3.51s/it] 31%|███▏ | 10757/34278 [11:53:05<27:31:38, 4.21s/it] {'loss': 0.1265, 'grad_norm': 0.8899712922793491, 'learning_rate': 8.032409855397925e-06, 'epoch': 0.31} 31%|███▏ | 10757/34278 [11:53:05<27:31:38, 4.21s/it] 31%|███▏ | 10758/34278 [11:53:09<25:34:41, 3.92s/it] {'loss': 0.1348, 'grad_norm': 0.8781754079412691, 'learning_rate': 8.032034210768973e-06, 'epoch': 0.31} 31%|███▏ | 10758/34278 [11:53:09<25:34:41, 3.92s/it] 31%|███▏ | 10759/34278 [11:53:15<29:57:26, 4.59s/it] {'loss': 0.1515, 'grad_norm': 0.9344278429889267, 'learning_rate': 8.031658539070744e-06, 'epoch': 0.31} 31%|███▏ | 10759/34278 [11:53:15<29:57:26, 4.59s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9537 > 8192). 
Running this sequence through the model will result in indexing errors 31%|███▏ | 10760/34278 [11:53:18<26:57:24, 4.13s/it] {'loss': 0.1305, 'grad_norm': 0.7558407745735243, 'learning_rate': 8.03128284030659e-06, 'epoch': 0.31} 31%|███▏ | 10760/34278 [11:53:18<26:57:24, 4.13s/it] 31%|███▏ | 10761/34278 [11:53:24<30:42:51, 4.70s/it] {'loss': 0.1572, 'grad_norm': 0.9094098942735137, 'learning_rate': 8.030907114479866e-06, 'epoch': 0.31} 31%|███▏ | 10761/34278 [11:53:24<30:42:51, 4.70s/it] 31%|███▏ | 10762/34278 [11:53:28<29:01:57, 4.44s/it] {'loss': 0.1671, 'grad_norm': 0.8515731402323798, 'learning_rate': 8.03053136159393e-06, 'epoch': 0.31} 31%|███▏ | 10762/34278 [11:53:28<29:01:57, 4.44s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file 
<_io.BytesIO object at 0x7ff00c1ad5d0> Failed to fetch sample 3058468. Exception: cannot identify image file <_io.BytesIO object at 0x7ff00c1ad5d0> 31%|███▏ | 10763/34278 [11:53:31<26:53:56, 4.12s/it] {'loss': 0.1429, 'grad_norm': 0.8192516539648291, 'learning_rate': 8.030155581652131e-06, 'epoch': 0.31} 31%|███▏ | 10763/34278 [11:53:31<26:53:56, 4.12s/it] 31%|███▏ | 10764/34278 [11:53:34<24:47:38, 3.80s/it] {'loss': 0.1488, 'grad_norm': 0.7003987349239982, 'learning_rate': 8.029779774657827e-06, 'epoch': 0.31} 31%|███▏ | 10764/34278 [11:53:34<24:47:38, 3.80s/it] 31%|███▏ | 10765/34278 [11:53:38<24:46:38, 3.79s/it] {'loss': 0.1438, 'grad_norm': 0.6648401101509428, 'learning_rate': 8.029403940614372e-06, 'epoch': 0.31} 31%|███▏ | 10765/34278 [11:53:38<24:46:38, 3.79s/it] 31%|███▏ | 10766/34278 [11:53:42<25:56:18, 3.97s/it] {'loss': 0.1357, 'grad_norm': 1.1170959064644317, 'learning_rate': 8.029028079525124e-06, 'epoch': 0.31} 31%|███▏ | 10766/34278 [11:53:42<25:56:18, 3.97s/it] 31%|███▏ | 10767/34278 [11:53:48<30:02:34, 4.60s/it] {'loss': 0.1645, 'grad_norm': 0.8448134227537687, 'learning_rate': 8.028652191393432e-06, 'epoch': 0.31} 31%|███▏ | 10767/34278 [11:53:48<30:02:34, 4.60s/it] 31%|███▏ | 10768/34278 [11:53:54<32:59:36, 5.05s/it] {'loss': 0.1488, 'grad_norm': 0.7307092293862033, 'learning_rate': 8.028276276222658e-06, 'epoch': 0.31} 31%|███▏ | 10768/34278 [11:53:54<32:59:36, 5.05s/it] 31%|███▏ | 10769/34278 [11:53:57<28:46:15, 4.41s/it] {'loss': 0.1589, 'grad_norm': 0.9418361270201677, 'learning_rate': 8.027900334016158e-06, 'epoch': 0.31} 31%|███▏ | 10769/34278 [11:53:57<28:46:15, 4.41s/it] 31%|███▏ | 10770/34278 [11:54:00<25:48:07, 3.95s/it] {'loss': 0.1523, 'grad_norm': 0.9507383502112244, 'learning_rate': 8.027524364777285e-06, 'epoch': 0.31} 31%|███▏ | 10770/34278 [11:54:00<25:48:07, 3.95s/it] 31%|███▏ | 10771/34278 [11:54:03<23:58:24, 3.67s/it] {'loss': 0.1177, 'grad_norm': 0.8114521425027372, 'learning_rate': 8.027148368509398e-06, 'epoch': 0.31} 
31%|███▏ | 10771/34278 [11:54:03<23:58:24, 3.67s/it] 31%|███▏ | 10772/34278 [11:54:08<26:21:35, 4.04s/it] {'loss': 0.1436, 'grad_norm': 0.9654237897383231, 'learning_rate': 8.026772345215853e-06, 'epoch': 0.31} 31%|███▏ | 10772/34278 [11:54:08<26:21:35, 4.04s/it] 31%|███▏ | 10773/34278 [11:54:11<24:08:36, 3.70s/it] {'loss': 0.1462, 'grad_norm': 0.7326560639180407, 'learning_rate': 8.026396294900007e-06, 'epoch': 0.31} 31%|███▏ | 10773/34278 [11:54:11<24:08:36, 3.70s/it] 31%|███▏ | 10774/34278 [11:54:14<22:27:06, 3.44s/it] {'loss': 0.1469, 'grad_norm': 0.9610973891969635, 'learning_rate': 8.026020217565217e-06, 'epoch': 0.31} 31%|███▏ | 10774/34278 [11:54:14<22:27:06, 3.44s/it] 31%|███▏ | 10775/34278 [11:54:17<21:33:12, 3.30s/it] {'loss': 0.1193, 'grad_norm': 0.7081117753533893, 'learning_rate': 8.02564411321484e-06, 'epoch': 0.31} 31%|███▏ | 10775/34278 [11:54:17<21:33:12, 3.30s/it] 31%|███▏ | 10776/34278 [11:54:20<21:55:46, 3.36s/it] {'loss': 0.1724, 'grad_norm': 0.8779128326353343, 'learning_rate': 8.025267981852236e-06, 'epoch': 0.31} 31%|███▏ | 10776/34278 [11:54:20<21:55:46, 3.36s/it] 31%|███▏ | 10777/34278 [11:54:23<21:30:09, 3.29s/it] {'loss': 0.139, 'grad_norm': 0.7002203640783967, 'learning_rate': 8.024891823480763e-06, 'epoch': 0.31} 31%|███▏ | 10777/34278 [11:54:23<21:30:09, 3.29s/it] 31%|███▏ | 10778/34278 [11:54:27<22:01:51, 3.37s/it] {'loss': 0.1699, 'grad_norm': 0.8890647628148157, 'learning_rate': 8.024515638103775e-06, 'epoch': 0.31} 31%|███▏ | 10778/34278 [11:54:27<22:01:51, 3.37s/it] 31%|███▏ | 10779/34278 [11:54:31<24:07:14, 3.70s/it] {'loss': 0.1491, 'grad_norm': 0.7395894130367539, 'learning_rate': 8.024139425724636e-06, 'epoch': 0.31} 31%|███▏ | 10779/34278 [11:54:31<24:07:14, 3.70s/it] 31%|███▏ | 10780/34278 [11:54:36<26:05:36, 4.00s/it] {'loss': 0.1569, 'grad_norm': 0.7167755306461832, 'learning_rate': 8.023763186346701e-06, 'epoch': 0.31} 31%|███▏ | 10780/34278 [11:54:36<26:05:36, 
4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 31%|███▏ | 10781/34278 [11:54:39<24:19:45, 3.73s/it] {'loss': 0.1428, 'grad_norm': 0.8959477083050045, 'learning_rate': 8.023386919973328e-06, 'epoch': 0.31} 31%|███▏ | 10781/34278 [11:54:39<24:19:45, 3.73s/it] 31%|███▏ | 10782/34278 [11:54:44<26:01:39, 3.99s/it] {'loss': 0.1534, 'grad_norm': 0.7926966857231422, 'learning_rate': 8.023010626607881e-06, 'epoch': 0.31} 31%|███▏ | 10782/34278 [11:54:44<26:01:39, 3.99s/it] 31%|███▏ | 10783/34278 [11:54:47<24:34:12, 3.76s/it] {'loss': 0.1517, 'grad_norm': 0.7519276625120004, 'learning_rate': 8.022634306253717e-06, 'epoch': 0.31} 31%|███▏ | 10783/34278 [11:54:47<24:34:12, 3.76s/it] 31%|███▏ | 10784/34278 [11:54:50<23:05:39, 3.54s/it] {'loss': 0.1564, 'grad_norm': 0.7797314713469649, 'learning_rate': 8.022257958914194e-06, 'epoch': 0.31} 31%|███▏ | 10784/34278 [11:54:50<23:05:39, 3.54s/it] 31%|███▏ | 10785/34278 [11:54:53<22:25:12, 3.44s/it] {'loss': 0.1461, 'grad_norm': 1.3093424623422203, 'learning_rate': 8.021881584592672e-06, 'epoch': 0.31} 31%|███▏ | 10785/34278 [11:54:53<22:25:12, 3.44s/it] 31%|███▏ | 10786/34278 [11:54:59<26:23:46, 4.05s/it] {'loss': 0.1674, 'grad_norm': 1.0210827254759813, 'learning_rate': 8.021505183292515e-06, 'epoch': 0.31} 31%|███▏ | 10786/34278 [11:54:59<26:23:46, 4.05s/it] 31%|███▏ | 10787/34278 [11:55:02<24:26:57, 3.75s/it] {'loss': 0.1486, 'grad_norm': 0.8662462238752835, 'learning_rate': 8.02112875501708e-06, 'epoch': 0.31} 31%|███▏ | 10787/34278 [11:55:02<24:26:57, 3.75s/it] 31%|███▏ | 10788/34278 [11:55:05<23:38:05, 3.62s/it] {'loss': 0.1325, 'grad_norm': 0.8316937972387971, 'learning_rate': 8.02075229976973e-06, 'epoch': 0.31} 31%|███▏ | 10788/34278 [11:55:05<23:38:05, 3.62s/it] 31%|███▏ | 10789/34278 [11:55:09<23:53:22, 3.66s/it] {'loss': 0.1685, 'grad_norm': 
1.1354955370376583, 'learning_rate': 8.020375817553824e-06, 'epoch': 0.31} 31%|███▏ | 10789/34278 [11:55:09<23:53:22, 3.66s/it] 31%|███▏ | 10790/34278 [11:55:12<22:58:40, 3.52s/it] {'loss': 0.1638, 'grad_norm': 0.9533332775851375, 'learning_rate': 8.019999308372724e-06, 'epoch': 0.31} 31%|███▏ | 10790/34278 [11:55:12<22:58:40, 3.52s/it] 31%|███▏ | 10791/34278 [11:55:15<21:58:09, 3.37s/it] {'loss': 0.1686, 'grad_norm': 0.9078033304961471, 'learning_rate': 8.01962277222979e-06, 'epoch': 0.31} 31%|███▏ | 10791/34278 [11:55:15<21:58:09, 3.37s/it] 31%|███▏ | 10792/34278 [11:55:19<22:25:08, 3.44s/it] {'loss': 0.1622, 'grad_norm': 0.9491410550554914, 'learning_rate': 8.019246209128384e-06, 'epoch': 0.31} 31%|███▏ | 10792/34278 [11:55:19<22:25:08, 3.44s/it] 31%|███▏ | 10793/34278 [11:55:25<27:15:56, 4.18s/it] {'loss': 0.1465, 'grad_norm': 0.8494628393581969, 'learning_rate': 8.01886961907187e-06, 'epoch': 0.31} 31%|███▏ | 10793/34278 [11:55:25<27:15:56, 4.18s/it] 31%|███▏ | 10794/34278 [11:55:29<27:26:37, 4.21s/it] {'loss': 0.145, 'grad_norm': 1.2820199107453916, 'learning_rate': 8.018493002063608e-06, 'epoch': 0.31} 31%|███▏ | 10794/34278 [11:55:29<27:26:37, 4.21s/it] 31%|███▏ | 10795/34278 [11:55:32<25:40:26, 3.94s/it] {'loss': 0.1601, 'grad_norm': 1.037733081029843, 'learning_rate': 8.018116358106962e-06, 'epoch': 0.31} 31%|███▏ | 10795/34278 [11:55:32<25:40:26, 3.94s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = 
    load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f16ba1a27a0>
Failed to fetch sample 3187198. Exception: cannot identify image file <_io.BytesIO object at 0x7f16ba1a27a0>

[Training progress log, steps 10796-10987 of 34278 (31-32%, epoch 0.32): loss fluctuating between ~0.12 and ~0.20; grad_norm mostly 0.6-1.3, with occasional spikes (1.96 at step 10810, 3.69 at step 10824); learning_rate decaying smoothly from 8.0177e-06 to ~7.9457e-06; throughput ~3.1-4.8 s/it.]

/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
0.8821171060557782, 'learning_rate': 7.94530546131562e-06, 'epoch': 0.32} 32%|███▏ | 10987/34278 [12:07:27<22:23:41, 3.46s/it] 32%|███▏ | 10988/34278 [12:07:32<24:21:51, 3.77s/it] {'loss': 0.1404, 'grad_norm': 0.9948303760975354, 'learning_rate': 7.944923679558176e-06, 'epoch': 0.32} 32%|███▏ | 10988/34278 [12:07:32<24:21:51, 3.77s/it] 32%|███▏ | 10989/34278 [12:07:35<23:26:06, 3.62s/it] {'loss': 0.1322, 'grad_norm': 0.8341498066454957, 'learning_rate': 7.944541871509159e-06, 'epoch': 0.32} 32%|███▏ | 10989/34278 [12:07:35<23:26:06, 3.62s/it] 32%|███▏ | 10990/34278 [12:07:40<26:35:48, 4.11s/it] {'loss': 0.1305, 'grad_norm': 0.7838082232009056, 'learning_rate': 7.944160037171973e-06, 'epoch': 0.32} 32%|███▏ | 10990/34278 [12:07:40<26:35:48, 4.11s/it] 32%|███▏ | 10991/34278 [12:07:46<29:50:26, 4.61s/it] {'loss': 0.1544, 'grad_norm': 0.7141231750828113, 'learning_rate': 7.94377817655003e-06, 'epoch': 0.32} 32%|███▏ | 10991/34278 [12:07:46<29:50:26, 4.61s/it] 32%|███▏ | 10992/34278 [12:07:49<27:32:11, 4.26s/it] {'loss': 0.1526, 'grad_norm': 0.8018060602628841, 'learning_rate': 7.943396289646738e-06, 'epoch': 0.32} 32%|███▏ | 10992/34278 [12:07:49<27:32:11, 4.26s/it] 32%|███▏ | 10993/34278 [12:07:54<28:08:41, 4.35s/it] {'loss': 0.1424, 'grad_norm': 0.7334660975972801, 'learning_rate': 7.943014376465508e-06, 'epoch': 0.32} 32%|███▏ | 10993/34278 [12:07:54<28:08:41, 4.35s/it] 32%|███▏ | 10994/34278 [12:07:57<25:46:37, 3.99s/it] {'loss': 0.1454, 'grad_norm': 0.704839933028303, 'learning_rate': 7.942632437009746e-06, 'epoch': 0.32} 32%|███▏ | 10994/34278 [12:07:57<25:46:37, 3.99s/it] 32%|███▏ | 10995/34278 [12:08:02<27:07:04, 4.19s/it] {'loss': 0.1636, 'grad_norm': 0.8012705308577012, 'learning_rate': 7.942250471282864e-06, 'epoch': 0.32} 32%|███▏ | 10995/34278 [12:08:02<27:07:04, 4.19s/it] 32%|███▏ | 10996/34278 [12:08:05<24:57:09, 3.86s/it] {'loss': 0.1471, 'grad_norm': 0.7911693022338174, 'learning_rate': 7.941868479288276e-06, 'epoch': 0.32} 32%|███▏ | 10996/34278 
[12:08:05<24:57:09, 3.86s/it] 32%|███▏ | 10997/34278 [12:08:10<28:28:37, 4.40s/it] {'loss': 0.1414, 'grad_norm': 0.7842991048808973, 'learning_rate': 7.941486461029384e-06, 'epoch': 0.32} 32%|███▏ | 10997/34278 [12:08:10<28:28:37, 4.40s/it] 32%|███▏ | 10998/34278 [12:08:13<25:46:07, 3.98s/it] {'loss': 0.158, 'grad_norm': 0.719470077422427, 'learning_rate': 7.941104416509604e-06, 'epoch': 0.32} 32%|███▏ | 10998/34278 [12:08:13<25:46:07, 3.98s/it] 32%|███▏ | 10999/34278 [12:08:16<23:57:14, 3.70s/it] {'loss': 0.1439, 'grad_norm': 0.8242210322970973, 'learning_rate': 7.940722345732347e-06, 'epoch': 0.32} 32%|███▏ | 10999/34278 [12:08:16<23:57:14, 3.70s/it] 32%|███▏ | 11000/34278 [12:08:21<25:37:29, 3.96s/it] {'loss': 0.1579, 'grad_norm': 0.8489609518286017, 'learning_rate': 7.940340248701022e-06, 'epoch': 0.32} 32%|███▏ | 11000/34278 [12:08:21<25:37:29, 3.96s/it] 32%|███▏ | 11001/34278 [12:08:25<25:20:20, 3.92s/it] {'loss': 0.1464, 'grad_norm': 0.8551238030790933, 'learning_rate': 7.939958125419042e-06, 'epoch': 0.32} 32%|███▏ | 11001/34278 [12:08:25<25:20:20, 3.92s/it] 32%|███▏ | 11002/34278 [12:08:28<24:28:00, 3.78s/it] {'loss': 0.1482, 'grad_norm': 0.7940226839705632, 'learning_rate': 7.939575975889817e-06, 'epoch': 0.32} 32%|███▏ | 11002/34278 [12:08:28<24:28:00, 3.78s/it] 32%|███▏ | 11003/34278 [12:08:31<23:02:53, 3.56s/it] {'loss': 0.126, 'grad_norm': 1.1013519935521343, 'learning_rate': 7.93919380011676e-06, 'epoch': 0.32} 32%|███▏ | 11003/34278 [12:08:31<23:02:53, 3.56s/it] 32%|███▏ | 11004/34278 [12:08:35<22:32:41, 3.49s/it] {'loss': 0.1517, 'grad_norm': 0.9228344579150133, 'learning_rate': 7.938811598103282e-06, 'epoch': 0.32} 32%|███▏ | 11004/34278 [12:08:35<22:32:41, 3.49s/it] 32%|███▏ | 11005/34278 [12:08:41<28:36:17, 4.42s/it] {'loss': 0.1824, 'grad_norm': 1.227069686520391, 'learning_rate': 7.938429369852796e-06, 'epoch': 0.32} 32%|███▏ | 11005/34278 [12:08:41<28:36:17, 4.42s/it] 32%|███▏ | 11006/34278 [12:08:44<25:46:49, 3.99s/it] {'loss': 0.1245, 
'grad_norm': 0.761631095083584, 'learning_rate': 7.938047115368713e-06, 'epoch': 0.32} 32%|███▏ | 11006/34278 [12:08:44<25:46:49, 3.99s/it] 32%|███▏ | 11007/34278 [12:08:47<23:49:24, 3.69s/it] {'loss': 0.1436, 'grad_norm': 1.0703644652778053, 'learning_rate': 7.937664834654449e-06, 'epoch': 0.32} 32%|███▏ | 11007/34278 [12:08:47<23:49:24, 3.69s/it] 32%|███▏ | 11008/34278 [12:08:52<26:34:15, 4.11s/it] {'loss': 0.1511, 'grad_norm': 0.9210100666132759, 'learning_rate': 7.937282527713412e-06, 'epoch': 0.32} 32%|███▏ | 11008/34278 [12:08:52<26:34:15, 4.11s/it] 32%|███▏ | 11009/34278 [12:08:55<24:24:21, 3.78s/it] {'loss': 0.1563, 'grad_norm': 0.8369537948127395, 'learning_rate': 7.93690019454902e-06, 'epoch': 0.32} 32%|███▏ | 11009/34278 [12:08:55<24:24:21, 3.78s/it] 32%|███▏ | 11010/34278 [12:08:58<22:40:55, 3.51s/it] {'loss': 0.1387, 'grad_norm': 1.1750987802114254, 'learning_rate': 7.936517835164682e-06, 'epoch': 0.32} 32%|███▏ | 11010/34278 [12:08:58<22:40:55, 3.51s/it] 32%|███▏ | 11011/34278 [12:09:02<22:17:57, 3.45s/it] {'loss': 0.1705, 'grad_norm': 1.078128425755581, 'learning_rate': 7.936135449563815e-06, 'epoch': 0.32} 32%|███▏ | 11011/34278 [12:09:02<22:17:57, 3.45s/it] 32%|███▏ | 11012/34278 [12:09:04<20:56:14, 3.24s/it] {'loss': 0.1482, 'grad_norm': 1.0341634095727026, 'learning_rate': 7.935753037749832e-06, 'epoch': 0.32} 32%|███▏ | 11012/34278 [12:09:04<20:56:14, 3.24s/it] 32%|███▏ | 11013/34278 [12:09:07<20:12:23, 3.13s/it] {'loss': 0.165, 'grad_norm': 1.003273278719217, 'learning_rate': 7.935370599726147e-06, 'epoch': 0.32} 32%|███▏ | 11013/34278 [12:09:07<20:12:23, 3.13s/it] 32%|███▏ | 11014/34278 [12:09:11<20:49:58, 3.22s/it] {'loss': 0.1592, 'grad_norm': 0.8501567233419036, 'learning_rate': 7.93498813549617e-06, 'epoch': 0.32} 32%|███▏ | 11014/34278 [12:09:11<20:49:58, 3.22s/it] 32%|███▏ | 11015/34278 [12:09:14<20:26:41, 3.16s/it] {'loss': 0.1757, 'grad_norm': 1.1143881601142083, 'learning_rate': 7.934605645063325e-06, 'epoch': 0.32} 32%|███▏ | 
11015/34278 [12:09:14<20:26:41, 3.16s/it] 32%|███▏ | 11016/34278 [12:09:17<21:17:21, 3.29s/it] {'loss': 0.1674, 'grad_norm': 0.7427794374614133, 'learning_rate': 7.934223128431017e-06, 'epoch': 0.32} 32%|███▏ | 11016/34278 [12:09:17<21:17:21, 3.29s/it] 32%|███▏ | 11017/34278 [12:09:20<20:48:32, 3.22s/it] {'loss': 0.144, 'grad_norm': 0.9468726487888137, 'learning_rate': 7.93384058560267e-06, 'epoch': 0.32} 32%|███▏ | 11017/34278 [12:09:20<20:48:32, 3.22s/it] 32%|███▏ | 11018/34278 [12:09:26<25:40:14, 3.97s/it] {'loss': 0.1369, 'grad_norm': 0.7587061259930971, 'learning_rate': 7.933458016581691e-06, 'epoch': 0.32} 32%|███▏ | 11018/34278 [12:09:26<25:40:14, 3.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 32%|███▏ | 11019/34278 [12:09:29<24:05:14, 3.73s/it] {'loss': 0.161, 'grad_norm': 0.8365927342007075, 'learning_rate': 7.9330754213715e-06, 'epoch': 0.32} 32%|███▏ | 11019/34278 [12:09:29<24:05:14, 3.73s/it] 32%|███▏ | 11020/34278 [12:09:32<22:30:44, 3.48s/it] {'loss': 0.1702, 'grad_norm': 0.7843770152117566, 'learning_rate': 7.932692799975513e-06, 'epoch': 0.32} 32%|███▏ | 11020/34278 [12:09:32<22:30:44, 3.48s/it] 32%|███▏ | 11021/34278 [12:09:36<23:54:09, 3.70s/it] {'loss': 0.1456, 'grad_norm': 0.8703596720960927, 'learning_rate': 7.932310152397142e-06, 'epoch': 0.32} 32%|███▏ | 11021/34278 [12:09:36<23:54:09, 3.70s/it] 32%|███▏ | 11022/34278 [12:09:40<23:03:27, 3.57s/it] {'loss': 0.156, 'grad_norm': 1.0118251000382694, 'learning_rate': 7.931927478639809e-06, 'epoch': 0.32} 32%|███▏ | 11022/34278 [12:09:40<23:03:27, 3.57s/it] 32%|███▏ | 11023/34278 [12:09:43<23:05:50, 3.58s/it] {'loss': 0.1544, 'grad_norm': 0.6723312241330154, 'learning_rate': 7.931544778706925e-06, 'epoch': 0.32} 32%|███▏ | 11023/34278 [12:09:43<23:05:50, 3.58s/it] 32%|███▏ | 11024/34278 [12:09:46<22:08:49, 3.43s/it] 
{'loss': 0.157, 'grad_norm': 0.6346271035640576, 'learning_rate': 7.93116205260191e-06, 'epoch': 0.32} 32%|███▏ | 11024/34278 [12:09:46<22:08:49, 3.43s/it] 32%|███▏ | 11025/34278 [12:09:51<23:50:49, 3.69s/it] {'loss': 0.1293, 'grad_norm': 0.6450277752550774, 'learning_rate': 7.93077930032818e-06, 'epoch': 0.32} 32%|███▏ | 11025/34278 [12:09:51<23:50:49, 3.69s/it] 32%|███▏ | 11026/34278 [12:09:54<23:00:40, 3.56s/it] {'loss': 0.164, 'grad_norm': 0.7727609392974565, 'learning_rate': 7.930396521889152e-06, 'epoch': 0.32} 32%|███▏ | 11026/34278 [12:09:54<23:00:40, 3.56s/it] 32%|███▏ | 11027/34278 [12:10:00<27:48:57, 4.31s/it] {'loss': 0.1368, 'grad_norm': 0.8947902710093486, 'learning_rate': 7.930013717288244e-06, 'epoch': 0.32} 32%|███▏ | 11027/34278 [12:10:00<27:48:57, 4.31s/it] 32%|███▏ | 11028/34278 [12:10:03<25:31:07, 3.95s/it] {'loss': 0.1491, 'grad_norm': 0.7523660713209434, 'learning_rate': 7.929630886528874e-06, 'epoch': 0.32} 32%|███▏ | 11028/34278 [12:10:03<25:31:07, 3.95s/it] 32%|███▏ | 11029/34278 [12:10:06<24:08:26, 3.74s/it] {'loss': 0.1517, 'grad_norm': 1.0094263864221134, 'learning_rate': 7.929248029614455e-06, 'epoch': 0.32} 32%|███▏ | 11029/34278 [12:10:06<24:08:26, 3.74s/it] 32%|███▏ | 11030/34278 [12:10:09<22:46:13, 3.53s/it] {'loss': 0.156, 'grad_norm': 0.8769132684330191, 'learning_rate': 7.928865146548411e-06, 'epoch': 0.32} 32%|███▏ | 11030/34278 [12:10:09<22:46:13, 3.53s/it] 32%|███▏ | 11031/34278 [12:10:12<21:46:56, 3.37s/it] {'loss': 0.1731, 'grad_norm': 0.8737246949168511, 'learning_rate': 7.928482237334159e-06, 'epoch': 0.32} 32%|███▏ | 11031/34278 [12:10:12<21:46:56, 3.37s/it] 32%|███▏ | 11032/34278 [12:10:15<20:28:48, 3.17s/it] {'loss': 0.1587, 'grad_norm': 0.8785322342147993, 'learning_rate': 7.928099301975116e-06, 'epoch': 0.32} 32%|███▏ | 11032/34278 [12:10:15<20:28:48, 3.17s/it] 32%|███▏ | 11033/34278 [12:10:21<25:45:37, 3.99s/it] {'loss': 0.1502, 'grad_norm': 0.9112884315348686, 'learning_rate': 7.927716340474701e-06, 'epoch': 0.32} 
32%|███▏ | 11033/34278 [12:10:21<25:45:37, 3.99s/it] 32%|███▏ | 11034/34278 [12:10:27<29:46:46, 4.61s/it] {'loss': 0.1507, 'grad_norm': 0.7559231869687969, 'learning_rate': 7.927333352836334e-06, 'epoch': 0.32} 32%|███▏ | 11034/34278 [12:10:27<29:46:46, 4.61s/it] 32%|███▏ | 11035/34278 [12:10:30<26:55:15, 4.17s/it] {'loss': 0.134, 'grad_norm': 0.8276737793134261, 'learning_rate': 7.926950339063435e-06, 'epoch': 0.32} 32%|███▏ | 11035/34278 [12:10:30<26:55:15, 4.17s/it] 32%|███▏ | 11036/34278 [12:10:33<24:37:17, 3.81s/it] {'loss': 0.1249, 'grad_norm': 1.1149869744579006, 'learning_rate': 7.92656729915942e-06, 'epoch': 0.32} 32%|███▏ | 11036/34278 [12:10:33<24:37:17, 3.81s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8685 > 8192). Running this sequence through the model will result in indexing errors 32%|███▏ | 11037/34278 [12:10:36<23:05:42, 3.58s/it] {'loss': 0.1415, 'grad_norm': 0.9181834269609487, 'learning_rate': 7.926184233127711e-06, 'epoch': 0.32} 32%|███▏ | 11037/34278 [12:10:36<23:05:42, 3.58s/it] 32%|███▏ | 11038/34278 [12:10:39<21:59:24, 3.41s/it] {'loss': 0.1581, 'grad_norm': 0.7895080741571071, 'learning_rate': 7.925801140971728e-06, 'epoch': 0.32} 32%|███▏ | 11038/34278 [12:10:39<21:59:24, 3.41s/it] 32%|███▏ | 11039/34278 [12:10:42<21:56:36, 3.40s/it] {'loss': 0.1574, 'grad_norm': 0.8239369710506671, 'learning_rate': 7.92541802269489e-06, 'epoch': 0.32} 32%|███▏ | 11039/34278 [12:10:42<21:56:36, 3.40s/it] 32%|███▏ | 11040/34278 [12:10:46<23:04:23, 3.57s/it] {'loss': 0.1439, 'grad_norm': 1.0015644202281127, 'learning_rate': 7.925034878300619e-06, 'epoch': 0.32} 32%|███▏ | 11040/34278 [12:10:46<23:04:23, 3.57s/it] 32%|███▏ | 11041/34278 [12:10:50<23:24:39, 3.63s/it] {'loss': 0.1629, 'grad_norm': 0.8838377900826695, 'learning_rate': 7.924651707792337e-06, 'epoch': 0.32} 32%|███▏ | 11041/34278 [12:10:50<23:24:39, 3.63s/it] 32%|███▏ | 11042/34278 [12:10:53<21:36:23, 3.35s/it] {'loss': 0.1552, 
'grad_norm': 0.9723878533223169, 'learning_rate': 7.924268511173459e-06, 'epoch': 0.32} 32%|███▏ | 11042/34278 [12:10:53<21:36:23, 3.35s/it] 32%|███▏ | 11043/34278 [12:10:57<22:14:09, 3.45s/it] {'loss': 0.1437, 'grad_norm': 0.8082210243148918, 'learning_rate': 7.923885288447413e-06, 'epoch': 0.32} 32%|███▏ | 11043/34278 [12:10:57<22:14:09, 3.45s/it] 32%|███▏ | 11044/34278 [12:11:00<22:02:38, 3.42s/it] {'loss': 0.1501, 'grad_norm': 1.0104745462052036, 'learning_rate': 7.923502039617615e-06, 'epoch': 0.32} 32%|███▏ | 11044/34278 [12:11:00<22:02:38, 3.42s/it] 32%|███▏ | 11045/34278 [12:11:03<22:03:56, 3.42s/it] {'loss': 0.1871, 'grad_norm': 0.9943546720506817, 'learning_rate': 7.923118764687489e-06, 'epoch': 0.32} 32%|███▏ | 11045/34278 [12:11:03<22:03:56, 3.42s/it] 32%|███▏ | 11046/34278 [12:11:07<23:01:22, 3.57s/it] {'loss': 0.1912, 'grad_norm': 0.9721145965729276, 'learning_rate': 7.922735463660455e-06, 'epoch': 0.32} 32%|███▏ | 11046/34278 [12:11:07<23:01:22, 3.57s/it] 32%|███▏ | 11047/34278 [12:11:11<23:14:53, 3.60s/it] {'loss': 0.1494, 'grad_norm': 0.7190358049891815, 'learning_rate': 7.922352136539938e-06, 'epoch': 0.32} 32%|███▏ | 11047/34278 [12:11:11<23:14:53, 3.60s/it] 32%|███▏ | 11048/34278 [12:11:15<23:23:04, 3.62s/it] {'loss': 0.1492, 'grad_norm': 0.9486864584518475, 'learning_rate': 7.921968783329362e-06, 'epoch': 0.32} 32%|███▏ | 11048/34278 [12:11:15<23:23:04, 3.62s/it] 32%|███▏ | 11049/34278 [12:11:18<22:09:03, 3.43s/it] {'loss': 0.1556, 'grad_norm': 1.1308214754558508, 'learning_rate': 7.921585404032142e-06, 'epoch': 0.32} 32%|███▏ | 11049/34278 [12:11:18<22:09:03, 3.43s/it] 32%|███▏ | 11050/34278 [12:11:21<22:00:27, 3.41s/it] {'loss': 0.1849, 'grad_norm': 0.8976733890933533, 'learning_rate': 7.921201998651707e-06, 'epoch': 0.32} 32%|███▏ | 11050/34278 [12:11:21<22:00:27, 3.41s/it] 32%|███▏ | 11051/34278 [12:11:24<21:58:30, 3.41s/it] {'loss': 0.163, 'grad_norm': 0.7042217237944887, 'learning_rate': 7.920818567191476e-06, 'epoch': 0.32} 32%|███▏ | 
11051/34278 [12:11:24<21:58:30, 3.41s/it] 32%|███▏ | 11052/34278 [12:11:27<21:29:16, 3.33s/it] {'loss': 0.1505, 'grad_norm': 0.9241326609308739, 'learning_rate': 7.920435109654877e-06, 'epoch': 0.32} 32%|███▏ | 11052/34278 [12:11:27<21:29:16, 3.33s/it] 32%|███▏ | 11053/34278 [12:11:30<20:48:03, 3.22s/it] {'loss': 0.1374, 'grad_norm': 0.7560781844894718, 'learning_rate': 7.920051626045326e-06, 'epoch': 0.32} 32%|███▏ | 11053/34278 [12:11:30<20:48:03, 3.22s/it] 32%|███▏ | 11054/34278 [12:11:34<20:44:07, 3.21s/it] {'loss': 0.1693, 'grad_norm': 0.8588511064051029, 'learning_rate': 7.919668116366254e-06, 'epoch': 0.32} 32%|███▏ | 11054/34278 [12:11:34<20:44:07, 3.21s/it] 32%|███▏ | 11055/34278 [12:11:37<20:46:59, 3.22s/it] {'loss': 0.1531, 'grad_norm': 0.7864607662435174, 'learning_rate': 7.919284580621082e-06, 'epoch': 0.32} 32%|███▏ | 11055/34278 [12:11:37<20:46:59, 3.22s/it] 32%|███▏ | 11056/34278 [12:11:40<20:43:21, 3.21s/it] {'loss': 0.1386, 'grad_norm': 0.7873508435241552, 'learning_rate': 7.918901018813234e-06, 'epoch': 0.32} 32%|███▏ | 11056/34278 [12:11:40<20:43:21, 3.21s/it] 32%|███▏ | 11057/34278 [12:11:43<19:31:39, 3.03s/it] {'loss': 0.1156, 'grad_norm': 0.7088069197804802, 'learning_rate': 7.918517430946135e-06, 'epoch': 0.32} 32%|███▏ | 11057/34278 [12:11:43<19:31:39, 3.03s/it] 32%|███▏ | 11058/34278 [12:11:46<20:45:56, 3.22s/it] {'loss': 0.1333, 'grad_norm': 0.7059647387036411, 'learning_rate': 7.91813381702321e-06, 'epoch': 0.32} 32%|███▏ | 11058/34278 [12:11:46<20:45:56, 3.22s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10474 > 8192). 
Running this sequence through the model will result in indexing errors 32%|███▏ | 11059/34278 [12:11:52<25:55:58, 4.02s/it] {'loss': 0.1361, 'grad_norm': 0.666454605594955, 'learning_rate': 7.917750177047881e-06, 'epoch': 0.32} 32%|███▏ | 11059/34278 [12:11:52<25:55:58, 4.02s/it] 32%|███▏ | 11060/34278 [12:11:55<23:37:04, 3.66s/it] {'loss': 0.1457, 'grad_norm': 1.2113628974280057, 'learning_rate': 7.917366511023575e-06, 'epoch': 0.32} 32%|███▏ | 11060/34278 [12:11:55<23:37:04, 3.66s/it] 32%|███▏ | 11061/34278 [12:11:58<22:35:44, 3.50s/it] {'loss': 0.1524, 'grad_norm': 0.842500261956832, 'learning_rate': 7.916982818953718e-06, 'epoch': 0.32} 32%|███▏ | 11061/34278 [12:11:58<22:35:44, 3.50s/it] 32%|███▏ | 11062/34278 [12:12:01<21:30:48, 3.34s/it] {'loss': 0.1482, 'grad_norm': 0.8034266029849304, 'learning_rate': 7.916599100841734e-06, 'epoch': 0.32} 32%|███▏ | 11062/34278 [12:12:01<21:30:48, 3.34s/it] 32%|███▏ | 11063/34278 [12:12:04<20:29:53, 3.18s/it] {'loss': 0.1457, 'grad_norm': 0.9418898273074237, 'learning_rate': 7.916215356691051e-06, 'epoch': 0.32} 32%|███▏ | 11063/34278 [12:12:04<20:29:53, 3.18s/it] 32%|███▏ | 11064/34278 [12:12:07<20:54:23, 3.24s/it] {'loss': 0.1658, 'grad_norm': 0.8991728969743075, 'learning_rate': 7.915831586505092e-06, 'epoch': 0.32} 32%|███▏ | 11064/34278 [12:12:07<20:54:23, 3.24s/it] 32%|███▏ | 11065/34278 [12:12:11<22:22:49, 3.47s/it] {'loss': 0.1569, 'grad_norm': 0.7920795098995894, 'learning_rate': 7.915447790287285e-06, 'epoch': 0.32} 32%|███▏ | 11065/34278 [12:12:11<22:22:49, 3.47s/it] 32%|███▏ | 11066/34278 [12:12:14<21:31:29, 3.34s/it] {'loss': 0.1798, 'grad_norm': 0.8132999221667033, 'learning_rate': 7.915063968041055e-06, 'epoch': 0.32} 32%|███▏ | 11066/34278 [12:12:14<21:31:29, 3.34s/it] 32%|███▏ | 11067/34278 [12:12:18<21:58:56, 3.41s/it] {'loss': 0.1545, 'grad_norm': 1.0027852714766794, 'learning_rate': 7.914680119769831e-06, 'epoch': 0.32} 32%|███▏ | 11067/34278 [12:12:18<21:58:56, 3.41s/it] 32%|███▏ | 11068/34278 
[12:12:24<27:02:59, 4.20s/it] {'loss': 0.1309, 'grad_norm': 0.6407065559864056, 'learning_rate': 7.91429624547704e-06, 'epoch': 0.32} 32%|███▏ | 11068/34278 [12:12:24<27:02:59, 4.20s/it] 32%|███▏ | 11069/34278 [12:12:27<25:40:19, 3.98s/it] {'loss': 0.1461, 'grad_norm': 0.8188800605209654, 'learning_rate': 7.913912345166106e-06, 'epoch': 0.32} 32%|███▏ | 11069/34278 [12:12:27<25:40:19, 3.98s/it] 32%|███▏ | 11070/34278 [12:12:30<23:01:04, 3.57s/it] {'loss': 0.1815, 'grad_norm': 0.8840368480843117, 'learning_rate': 7.91352841884046e-06, 'epoch': 0.32} 32%|███▏ | 11070/34278 [12:12:30<23:01:04, 3.57s/it] 32%|███▏ | 11071/34278 [12:12:34<23:06:36, 3.58s/it] {'loss': 0.1514, 'grad_norm': 0.963373660572464, 'learning_rate': 7.913144466503524e-06, 'epoch': 0.32} 32%|███▏ | 11071/34278 [12:12:34<23:06:36, 3.58s/it] 32%|███▏ | 11072/34278 [12:12:38<23:42:17, 3.68s/it] {'loss': 0.1659, 'grad_norm': 0.847454837134214, 'learning_rate': 7.912760488158732e-06, 'epoch': 0.32} 32%|███▏ | 11072/34278 [12:12:38<23:42:17, 3.68s/it] 32%|███▏ | 11073/34278 [12:12:41<22:29:49, 3.49s/it] {'loss': 0.1512, 'grad_norm': 0.8394342933315185, 'learning_rate': 7.91237648380951e-06, 'epoch': 0.32} 32%|███▏ | 11073/34278 [12:12:41<22:29:49, 3.49s/it] 32%|███▏ | 11074/34278 [12:12:44<21:59:40, 3.41s/it] {'loss': 0.1424, 'grad_norm': 0.988287441046759, 'learning_rate': 7.911992453459286e-06, 'epoch': 0.32} 32%|███▏ | 11074/34278 [12:12:44<21:59:40, 3.41s/it] 32%|███▏ | 11075/34278 [12:12:47<21:38:34, 3.36s/it] {'loss': 0.1503, 'grad_norm': 0.8230831543372418, 'learning_rate': 7.911608397111488e-06, 'epoch': 0.32} 32%|███▏ | 11075/34278 [12:12:47<21:38:34, 3.36s/it] 32%|███▏ | 11076/34278 [12:12:52<24:08:21, 3.75s/it] {'loss': 0.1641, 'grad_norm': 0.8519448947315523, 'learning_rate': 7.911224314769546e-06, 'epoch': 0.32} 32%|███▏ | 11076/34278 [12:12:52<24:08:21, 3.75s/it] 32%|███▏ | 11077/34278 [12:12:55<22:35:48, 3.51s/it] {'loss': 0.1523, 'grad_norm': 0.8377550743800041, 'learning_rate': 
7.910840206436888e-06, 'epoch': 0.32} 32%|███▏ | 11077/34278 [12:12:55<22:35:48, 3.51s/it] 32%|███▏ | 11078/34278 [12:12:58<22:18:02, 3.46s/it] {'loss': 0.1445, 'grad_norm': 0.610019788807047, 'learning_rate': 7.910456072116944e-06, 'epoch': 0.32} 32%|███▏ | 11078/34278 [12:12:58<22:18:02, 3.46s/it] 32%|███▏ | 11079/34278 [12:13:04<26:55:59, 4.18s/it] {'loss': 0.1551, 'grad_norm': 0.8807595995263549, 'learning_rate': 7.910071911813142e-06, 'epoch': 0.32} 32%|███▏ | 11079/34278 [12:13:04<26:55:59, 4.18s/it] 32%|███▏ | 11080/34278 [12:13:10<30:50:48, 4.79s/it] {'loss': 0.1506, 'grad_norm': 0.7677249527011757, 'learning_rate': 7.909687725528911e-06, 'epoch': 0.32} 32%|███▏ | 11080/34278 [12:13:10<30:50:48, 4.79s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 32%|███▏ | 11081/34278 [12:13:13<27:56:49, 4.34s/it] {'loss': 0.1361, 'grad_norm': 0.7463840512551806, 'learning_rate': 7.909303513267685e-06, 'epoch': 0.32} 32%|███▏ | 11081/34278 [12:13:13<27:56:49, 4.34s/it] 32%|███▏ | 11082/34278 [12:13:17<26:08:17, 4.06s/it] {'loss': 0.1363, 'grad_norm': 1.12408880891437, 'learning_rate': 7.908919275032892e-06, 'epoch': 0.32} 32%|███▏ | 11082/34278 [12:13:17<26:08:17, 4.06s/it] 32%|███▏ | 11083/34278 [12:13:20<25:13:27, 3.91s/it] {'loss': 0.1705, 'grad_norm': 0.7552353405072342, 'learning_rate': 7.90853501082796e-06, 'epoch': 0.32} 32%|███▏ | 11083/34278 [12:13:20<25:13:27, 3.91s/it] 32%|███▏ | 11084/34278 [12:13:24<24:59:43, 3.88s/it] {'loss': 0.1795, 'grad_norm': 0.7364333369784888, 'learning_rate': 7.908150720656324e-06, 'epoch': 0.32} 32%|███▏ | 11084/34278 [12:13:24<24:59:43, 3.88s/it] 32%|███▏ | 11085/34278 [12:13:28<25:32:58, 3.97s/it] {'loss': 0.1496, 'grad_norm': 0.9176463827324679, 'learning_rate': 7.907766404521414e-06, 'epoch': 0.32} 32%|███▏ | 11085/34278 [12:13:28<25:32:58, 3.97s/it] 
32%|███▏ | 11086/34278 [12:13:32<25:15:58, 3.92s/it] {'loss': 0.1337, 'grad_norm': 0.8070896589297977, 'learning_rate': 7.907382062426656e-06, 'epoch': 0.32} 32%|███▏ | 11086/34278 [12:13:32<25:15:58, 3.92s/it] 32%|███▏ | 11087/34278 [12:13:36<25:19:27, 3.93s/it] {'loss': 0.1577, 'grad_norm': 0.8071384790599688, 'learning_rate': 7.906997694375486e-06, 'epoch': 0.32} 32%|███▏ | 11087/34278 [12:13:36<25:19:27, 3.93s/it] 32%|███▏ | 11088/34278 [12:13:40<24:40:18, 3.83s/it] {'loss': 0.1642, 'grad_norm': 0.8882272638805864, 'learning_rate': 7.906613300371336e-06, 'epoch': 0.32} 32%|███▏ | 11088/34278 [12:13:40<24:40:18, 3.83s/it] 32%|███▏ | 11089/34278 [12:13:43<23:57:26, 3.72s/it] {'loss': 0.1372, 'grad_norm': 0.9243385594022651, 'learning_rate': 7.906228880417635e-06, 'epoch': 0.32} 32%|███▏ | 11089/34278 [12:13:43<23:57:26, 3.72s/it] 32%|███▏ | 11090/34278 [12:13:47<23:43:39, 3.68s/it] {'loss': 0.1654, 'grad_norm': 0.7258597190739321, 'learning_rate': 7.905844434517816e-06, 'epoch': 0.32} 32%|███▏ | 11090/34278 [12:13:47<23:43:39, 3.68s/it] 32%|███▏ | 11091/34278 [12:13:50<23:06:28, 3.59s/it] {'loss': 0.1519, 'grad_norm': 0.7576725544155756, 'learning_rate': 7.905459962675313e-06, 'epoch': 0.32} 32%|███▏ | 11091/34278 [12:13:50<23:06:28, 3.59s/it] 32%|███▏ | 11092/34278 [12:13:54<23:58:08, 3.72s/it] {'loss': 0.1644, 'grad_norm': 0.8156263374114935, 'learning_rate': 7.905075464893555e-06, 'epoch': 0.32} 32%|███▏ | 11092/34278 [12:13:54<23:58:08, 3.72s/it] 32%|███▏ | 11093/34278 [12:13:58<23:15:44, 3.61s/it] {'loss': 0.1293, 'grad_norm': 0.9372645601270756, 'learning_rate': 7.904690941175979e-06, 'epoch': 0.32} 32%|███▏ | 11093/34278 [12:13:58<23:15:44, 3.61s/it] 32%|███▏ | 11094/34278 [12:14:01<22:40:34, 3.52s/it] {'loss': 0.125, 'grad_norm': 0.6610553053673265, 'learning_rate': 7.904306391526012e-06, 'epoch': 0.32} 32%|███▏ | 11094/34278 [12:14:01<22:40:34, 3.52s/it] 32%|███▏ | 11095/34278 [12:14:05<24:24:29, 3.79s/it] {'loss': 0.1426, 'grad_norm': 
0.7163195473998811, 'learning_rate': 7.903921815947095e-06, 'epoch': 0.32} 32%|███▏ | 11095/34278 [12:14:05<24:24:29, 3.79s/it] 32%|███▏ | 11096/34278 [12:14:09<24:40:40, 3.83s/it] {'loss': 0.1623, 'grad_norm': 1.1733843832401145, 'learning_rate': 7.903537214442656e-06, 'epoch': 0.32} 32%|███▏ | 11096/34278 [12:14:09<24:40:40, 3.83s/it] 32%|███▏ | 11097/34278 [12:14:12<22:55:53, 3.56s/it] {'loss': 0.1539, 'grad_norm': 1.0320280774661348, 'learning_rate': 7.90315258701613e-06, 'epoch': 0.32} 32%|███▏ | 11097/34278 [12:14:12<22:55:53, 3.56s/it] 32%|███▏ | 11098/34278 [12:14:17<25:16:27, 3.93s/it] {'loss': 0.1355, 'grad_norm': 1.084036003611658, 'learning_rate': 7.90276793367095e-06, 'epoch': 0.32} 32%|███▏ | 11098/34278 [12:14:17<25:16:27, 3.93s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 32%|███▏ | 11099/34278 [12:14:20<23:56:58, 3.72s/it] {'loss': 0.164, 'grad_norm': 0.8556990140036362, 'learning_rate': 7.902383254410551e-06, 'epoch': 0.32} 32%|███▏ | 11099/34278 [12:14:20<23:56:58, 3.72s/it] 32%|███▏ | 11100/34278 [12:14:25<25:43:46, 4.00s/it] {'loss': 0.1372, 'grad_norm': 0.8520628457765326, 'learning_rate': 7.901998549238368e-06, 'epoch': 0.32} 32%|███▏ | 11100/34278 [12:14:25<25:43:46, 4.00s/it] 32%|███▏ | 11101/34278 [12:14:28<23:42:02, 3.68s/it] {'loss': 0.1467, 'grad_norm': 1.0005421369998007, 'learning_rate': 7.901613818157834e-06, 'epoch': 0.32} 32%|███▏ | 11101/34278 [12:14:28<23:42:02, 3.68s/it] 32%|███▏ | 11102/34278 [12:14:31<22:57:26, 3.57s/it] {'loss': 0.1362, 'grad_norm': 1.1171458073610518, 'learning_rate': 7.901229061172385e-06, 'epoch': 0.32} 32%|███▏ | 11102/34278 [12:14:31<22:57:26, 3.57s/it] 32%|███▏ | 11103/34278 [12:14:34<22:23:56, 3.48s/it] {'loss': 0.1517, 'grad_norm': 0.8278324943556908, 'learning_rate': 7.900844278285456e-06, 'epoch': 0.32} 32%|███▏ | 
11103/34278 [12:14:34<22:23:56, 3.48s/it]
 32%|███▏     | 11104/34278 [12:14:37<21:52:35, 3.40s/it] {'loss': 0.1347, 'grad_norm': 0.8315488441333372, 'learning_rate': 7.900459469500479e-06, 'epoch': 0.32}
 32%|███▏     | 11105/34278 [12:14:40<20:40:30, 3.21s/it] {'loss': 0.159, 'grad_norm': 1.2264931770295593, 'learning_rate': 7.900074634820895e-06, 'epoch': 0.32}
 32%|███▏     | 11106/34278 [12:14:43<20:24:13, 3.17s/it] {'loss': 0.1328, 'grad_norm': 0.8233679230374752, 'learning_rate': 7.899689774250135e-06, 'epoch': 0.32}
 32%|███▏     | 11107/34278 [12:14:46<19:31:45, 3.03s/it] {'loss': 0.1586, 'grad_norm': 0.8688961977315816, 'learning_rate': 7.899304887791639e-06, 'epoch': 0.32}
 32%|███▏     | 11108/34278 [12:14:49<20:05:21, 3.12s/it] {'loss': 0.1552, 'grad_norm': 0.9345347455118921, 'learning_rate': 7.89891997544884e-06, 'epoch': 0.32}
 32%|███▏     | 11109/34278 [12:14:54<22:06:47, 3.44s/it] {'loss': 0.1317, 'grad_norm': 0.9791120219413809, 'learning_rate': 7.898535037225175e-06, 'epoch': 0.32}
 32%|███▏     | 11110/34278 [12:14:57<21:12:24, 3.30s/it] {'loss': 0.1489, 'grad_norm': 0.9586664401398748, 'learning_rate': 7.898150073124082e-06, 'epoch': 0.32}
 32%|███▏     | 11111/34278 [12:15:00<21:52:56, 3.40s/it] {'loss': 0.1444, 'grad_norm': 0.9522432142225266, 'learning_rate': 7.897765083148996e-06, 'epoch': 0.32}
 32%|███▏     | 11112/34278 [12:15:05<23:57:53, 3.72s/it] {'loss': 0.1707, 'grad_norm': 0.8064721428674649, 'learning_rate': 7.897380067303358e-06, 'epoch': 0.32}
 32%|███▏     | 11113/34278 [12:15:08<23:38:50, 3.67s/it] {'loss': 0.1688, 'grad_norm': 1.0066213801822907, 'learning_rate': 7.896995025590599e-06, 'epoch': 0.32}
 32%|███▏     | 11114/34278 [12:15:11<21:51:38, 3.40s/it] {'loss': 0.1603, 'grad_norm': 0.998585894294648, 'learning_rate': 7.896609958014161e-06, 'epoch': 0.32}
 32%|███▏     | 11115/34278 [12:15:16<25:06:58, 3.90s/it] {'loss': 0.1472, 'grad_norm': 0.7110054043003348, 'learning_rate': 7.896224864577481e-06, 'epoch': 0.32}
 32%|███▏     | 11116/34278 [12:15:19<23:39:37, 3.68s/it] {'loss': 0.159, 'grad_norm': 1.0472696116758133, 'learning_rate': 7.895839745283995e-06, 'epoch': 0.32}
 32%|███▏     | 11117/34278 [12:15:22<22:26:36, 3.49s/it] {'loss': 0.1261, 'grad_norm': 1.0111319698013854, 'learning_rate': 7.895454600137146e-06, 'epoch': 0.32}
 32%|███▏     | 11118/34278 [12:15:26<22:23:42, 3.48s/it] {'loss': 0.1294, 'grad_norm': 0.9057512398346269, 'learning_rate': 7.895069429140368e-06, 'epoch': 0.32}
 32%|███▏     | 11119/34278 [12:15:32<27:33:43, 4.28s/it] {'loss': 0.1548, 'grad_norm': 0.7968962884674599, 'learning_rate': 7.894684232297102e-06, 'epoch': 0.32}
Token indices sequence length is longer than the specified maximum sequence length for this model (9815 > 8192). Running this sequence through the model will result in indexing errors
 32%|███▏     | 11120/34278 [12:15:36<26:27:29, 4.11s/it] {'loss': 0.146, 'grad_norm': 0.9322557724599413, 'learning_rate': 7.894299009610785e-06, 'epoch': 0.32}
 32%|███▏     | 11121/34278 [12:15:40<26:18:43, 4.09s/it] {'loss': 0.1448, 'grad_norm': 0.8162066814814293, 'learning_rate': 7.89391376108486e-06, 'epoch': 0.32}
 32%|███▏     | 11122/34278 [12:15:43<24:15:54, 3.77s/it] {'loss': 0.1897, 'grad_norm': 0.880392781658322, 'learning_rate': 7.89352848672276e-06, 'epoch': 0.32}
 32%|███▏     | 11123/34278 [12:15:46<23:42:48, 3.69s/it] {'loss': 0.1256, 'grad_norm': 0.9233284810847567, 'learning_rate': 7.893143186527932e-06, 'epoch': 0.32}
 32%|███▏     | 11124/34278 [12:15:51<25:08:21, 3.91s/it] {'loss': 0.1541, 'grad_norm': 0.7914260869194623, 'learning_rate': 7.892757860503811e-06, 'epoch': 0.32}
 32%|███▏     | 11125/34278 [12:15:54<23:30:25, 3.66s/it] {'loss': 0.1603, 'grad_norm': 1.0147958123374983, 'learning_rate': 7.892372508653836e-06, 'epoch': 0.32}
 32%|███▏     | 11126/34278 [12:16:00<27:54:49, 4.34s/it] {'loss': 0.1393, 'grad_norm': 0.778283953114621, 'learning_rate': 7.891987130981453e-06, 'epoch': 0.32}
 32%|███▏     | 11127/34278 [12:16:02<25:05:43, 3.90s/it] {'loss': 0.1487, 'grad_norm': 0.9710467584953915, 'learning_rate': 7.891601727490097e-06, 'epoch': 0.32}
 32%|███▏     | 11128/34278 [12:16:05<22:50:50, 3.55s/it] {'loss': 0.1341, 'grad_norm': 0.7689397872836617, 'learning_rate': 7.891216298183211e-06, 'epoch': 0.32}
 32%|███▏     | 11129/34278 [12:16:10<25:31:11, 3.97s/it] {'loss': 0.1435, 'grad_norm': 0.8023403353438116, 'learning_rate': 7.890830843064238e-06, 'epoch': 0.32}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 32%|███▏     | 11130/34278 [12:16:16<29:24:13, 4.57s/it] {'loss': 0.1591, 'grad_norm': 0.8413398113707, 'learning_rate': 7.890445362136617e-06, 'epoch': 0.32}
 32%|███▏     | 11131/34278 [12:16:19<26:03:02, 4.05s/it] {'loss': 0.1716, 'grad_norm': 0.6539052901783666, 'learning_rate': 7.890059855403788e-06, 'epoch': 0.32}
 32%|███▏     | 11132/34278 [12:16:22<24:10:30, 3.76s/it] {'loss': 0.14, 'grad_norm': 0.963041856513202, 'learning_rate': 7.889674322869197e-06, 'epoch': 0.32}
 32%|███▏     | 11133/34278 [12:16:26<24:53:23, 3.87s/it] {'loss': 0.1558, 'grad_norm': 0.7747670818321699, 'learning_rate': 7.889288764536283e-06, 'epoch': 0.32}
 32%|███▏     | 11134/34278 [12:16:32<28:41:45, 4.46s/it] {'loss': 0.1487, 'grad_norm': 0.7081080602780688, 'learning_rate': 7.888903180408487e-06, 'epoch': 0.32}
 32%|███▏     | 11135/34278 [12:16:35<25:44:52, 4.01s/it] {'loss': 0.1484, 'grad_norm': 0.7767953386113995, 'learning_rate': 7.888517570489254e-06, 'epoch': 0.32}
 32%|███▏     | 11136/34278 [12:16:39<25:36:47, 3.98s/it] {'loss': 0.16, 'grad_norm': 0.9086030933297429, 'learning_rate': 7.888131934782025e-06, 'epoch': 0.32}
 32%|███▏     | 11137/34278 [12:16:43<26:37:57, 4.14s/it] {'loss': 0.1743, 'grad_norm': 0.8442980583402525, 'learning_rate': 7.887746273290244e-06, 'epoch': 0.32}
 32%|███▏     | 11138/34278 [12:16:47<26:33:19, 4.13s/it] {'loss': 0.1633, 'grad_norm': 0.781528325486156, 'learning_rate': 7.887360586017355e-06, 'epoch': 0.32}
 32%|███▏     | 11139/34278 [12:16:51<25:34:04, 3.98s/it] {'loss': 0.1407, 'grad_norm': 0.8827801295303677, 'learning_rate': 7.886974872966797e-06, 'epoch': 0.32}
 32%|███▏     | 11140/34278 [12:16:55<24:41:13, 3.84s/it] {'loss': 0.152, 'grad_norm': 0.861913956074121, 'learning_rate': 7.88658913414202e-06, 'epoch': 0.32}
 33%|███▎     | 11141/34278 [12:16:58<24:10:05, 3.76s/it] {'loss': 0.1577, 'grad_norm': 0.7879894827965718, 'learning_rate': 7.88620336954646e-06, 'epoch': 0.33}
 33%|███▎     | 11142/34278 [12:17:04<28:23:00, 4.42s/it] {'loss': 0.1499, 'grad_norm': 0.9862164150006236, 'learning_rate': 7.885817579183568e-06, 'epoch': 0.33}
 33%|███▎     | 11143/34278 [12:17:10<31:24:48, 4.89s/it] {'loss': 0.1492, 'grad_norm': 0.7817851699096838, 'learning_rate': 7.885431763056785e-06, 'epoch': 0.33}
 33%|███▎     | 11144/34278 [12:17:13<28:03:07, 4.37s/it] {'loss': 0.1532, 'grad_norm': 0.7994504919131414, 'learning_rate': 7.885045921169558e-06, 'epoch': 0.33}
 33%|███▎     | 11145/34278 [12:17:16<25:49:34, 4.02s/it] {'loss': 0.1661, 'grad_norm': 0.8545913647459151, 'learning_rate': 7.884660053525328e-06, 'epoch': 0.33}
 33%|███▎     | 11146/34278 [12:17:21<26:55:54, 4.19s/it] {'loss': 0.1312, 'grad_norm': 0.7178098624736637, 'learning_rate': 7.88427416012754e-06, 'epoch': 0.33}
 33%|███▎     | 11147/34278 [12:17:27<30:29:44, 4.75s/it] {'loss': 0.1677, 'grad_norm': 0.9187691784701346, 'learning_rate': 7.883888240979645e-06, 'epoch': 0.33}
 33%|███▎     | 11148/34278 [12:17:31<29:05:45, 4.53s/it] {'loss': 0.1505, 'grad_norm': 0.8330749505255329, 'learning_rate': 7.883502296085082e-06, 'epoch': 0.33}
 33%|███▎     | 11149/34278 [12:17:34<26:36:48, 4.14s/it] {'loss': 0.1354, 'grad_norm': 0.9117442196393197, 'learning_rate': 7.883116325447297e-06, 'epoch': 0.33}
 33%|███▎     | 11150/34278 [12:17:37<24:24:28, 3.80s/it] {'loss': 0.1235, 'grad_norm': 0.8664809256906031, 'learning_rate': 7.88273032906974e-06, 'epoch': 0.33}
 33%|███▎     | 11151/34278 [12:17:40<22:59:15, 3.58s/it] {'loss': 0.1585, 'grad_norm': 0.8885645752805146, 'learning_rate': 7.882344306955854e-06, 'epoch': 0.33}
 33%|███▎     | 11152/34278 [12:17:44<22:45:05, 3.54s/it] {'loss': 0.1667, 'grad_norm': 1.043108426199016, 'learning_rate': 7.881958259109086e-06, 'epoch': 0.33}
 33%|███▎     | 11153/34278 [12:17:47<22:37:03, 3.52s/it] {'loss': 0.1454, 'grad_norm': 0.797157519242401, 'learning_rate': 7.881572185532883e-06, 'epoch': 0.33}
 33%|███▎     | 11154/34278 [12:17:51<21:57:39, 3.42s/it] {'loss': 0.1498, 'grad_norm': 0.7372472441294787, 'learning_rate': 7.881186086230692e-06, 'epoch': 0.33}
 33%|███▎     | 11155/34278 [12:17:53<20:51:13, 3.25s/it] {'loss': 0.1338, 'grad_norm': 0.8927634683470496, 'learning_rate': 7.880799961205958e-06, 'epoch': 0.33}
 33%|███▎     | 11156/34278 [12:17:57<21:25:54, 3.34s/it] {'loss': 0.1573, 'grad_norm': 0.9679172359122158, 'learning_rate': 7.880413810462131e-06, 'epoch': 0.33}
 33%|███▎     | 11157/34278 [12:18:01<21:59:52, 3.43s/it] {'loss': 0.1401, 'grad_norm': 0.8530429571627943, 'learning_rate': 7.880027634002656e-06, 'epoch': 0.33}
 33%|███▎     | 11158/34278 [12:18:04<21:17:31, 3.32s/it] {'loss': 0.1489, 'grad_norm': 1.0792902672644435, 'learning_rate': 7.879641431830982e-06, 'epoch': 0.33}
 33%|███▎     | 11159/34278 [12:18:07<21:54:35, 3.41s/it] {'loss': 0.151, 'grad_norm': 0.8090864583869836, 'learning_rate': 7.879255203950558e-06, 'epoch': 0.33}
 33%|███▎     | 11160/34278 [12:18:10<21:02:08, 3.28s/it] {'loss': 0.1628, 'grad_norm': 0.8242460238232276, 'learning_rate': 7.87886895036483e-06, 'epoch': 0.33}
 33%|███▎     | 11161/34278 [12:18:15<23:48:35, 3.71s/it] {'loss': 0.1436, 'grad_norm': 0.955244334399679, 'learning_rate': 7.878482671077245e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 33%|███▎     | 11162/34278 [12:18:18<22:44:28, 3.54s/it] {'loss': 0.1548, 'grad_norm': 0.8717935363950352, 'learning_rate': 7.878096366091257e-06, 'epoch': 0.33}
 33%|███▎     | 11163/34278 [12:18:21<21:55:49, 3.42s/it] {'loss': 0.1532, 'grad_norm': 0.7861828283605923, 'learning_rate': 7.87771003541031e-06, 'epoch': 0.33}
 33%|███▎     | 11164/34278 [12:18:25<23:15:06, 3.62s/it] {'loss': 0.1376, 'grad_norm': 1.0991238720235803, 'learning_rate': 7.877323679037856e-06, 'epoch': 0.33}
 33%|███▎     | 11165/34278 [12:18:29<22:59:04, 3.58s/it] {'loss': 0.1605, 'grad_norm': 0.8782449589718162, 'learning_rate': 7.876937296977343e-06, 'epoch': 0.33}
 33%|███▎     | 11166/34278 [12:18:35<28:12:30, 4.39s/it] {'loss': 0.1545, 'grad_norm': 0.9904516402225627, 'learning_rate': 7.87655088923222e-06, 'epoch': 0.33}
 33%|███▎     | 11167/34278 [12:18:39<26:42:49, 4.16s/it] {'loss': 0.1526, 'grad_norm': 0.926925223891559, 'learning_rate': 7.876164455805936e-06, 'epoch': 0.33}
 33%|███▎     | 11168/34278 [12:18:42<25:46:09, 4.01s/it] {'loss': 0.1456, 'grad_norm': 1.0697371220281324, 'learning_rate': 7.875777996701945e-06, 'epoch': 0.33}
 33%|███▎     | 11169/34278 [12:18:45<24:02:57, 3.75s/it] {'loss': 0.1539, 'grad_norm': 0.8192916929295959, 'learning_rate': 7.875391511923694e-06, 'epoch': 0.33}
 33%|███▎     | 11170/34278 [12:18:49<22:51:49, 3.56s/it] {'loss': 0.1766, 'grad_norm': 0.9150330513863495, 'learning_rate': 7.875005001474634e-06, 'epoch': 0.33}
 33%|███▎     | 11171/34278 [12:18:53<25:13:42, 3.93s/it] {'loss': 0.1422, 'grad_norm': 0.9828826048360307, 'learning_rate': 7.874618465358214e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 33%|███▎     | 11172/34278 [12:19:00<29:41:46, 4.63s/it] {'loss': 0.1409, 'grad_norm': 0.6383145733164322, 'learning_rate': 7.874231903577888e-06, 'epoch': 0.33}
 33%|███▎     | 11173/34278 [12:19:06<32:08:16, 5.01s/it] {'loss': 0.1356, 'grad_norm': 0.7695161636554368, 'learning_rate': 7.873845316137105e-06, 'epoch': 0.33}
 33%|███▎     | 11174/34278 [12:19:09<28:43:18, 4.48s/it] {'loss': 0.1471, 'grad_norm': 1.0856298287608799, 'learning_rate': 7.873458703039318e-06, 'epoch': 0.33}
 33%|███▎     | 11175/34278 [12:19:14<29:51:52, 4.65s/it] {'loss': 0.1394, 'grad_norm': 0.9055813711090364, 'learning_rate': 7.873072064287977e-06, 'epoch': 0.33}
 33%|███▎     | 11176/34278 [12:19:17<26:50:56, 4.18s/it] {'loss': 0.1183, 'grad_norm': 0.7847444403890237, 'learning_rate': 7.872685399886534e-06, 'epoch': 0.33}
 33%|███▎     | 11177/34278 [12:19:20<25:10:31, 3.92s/it] {'loss': 0.1475, 'grad_norm': 0.906607351781194, 'learning_rate': 7.872298709838442e-06, 'epoch': 0.33}
 33%|███▎     | 11178/34278 [12:19:26<29:12:56, 4.55s/it] {'loss': 0.1605, 'grad_norm': 0.7337407212504137, 'learning_rate': 7.871911994147153e-06, 'epoch': 0.33}
 33%|███▎     | 11179/34278 [12:19:29<25:52:28, 4.03s/it] {'loss': 0.1473, 'grad_norm': 0.9478752233020101, 'learning_rate': 7.871525252816118e-06, 'epoch': 0.33}
 33%|███▎     | 11180/34278 [12:19:32<23:44:44, 3.70s/it] {'loss': 0.1634, 'grad_norm': 1.0254850132373712, 'learning_rate': 7.871138485848792e-06, 'epoch': 0.33}
 33%|███▎     | 11181/34278 [12:19:35<22:14:44, 3.47s/it] {'loss': 0.1402, 'grad_norm': 0.8695701462858811, 'learning_rate': 7.870751693248629e-06, 'epoch': 0.33}
 33%|███▎     | 11182/34278 [12:19:38<21:18:23, 3.32s/it] {'loss': 0.1551, 'grad_norm': 0.8789685242400097, 'learning_rate': 7.870364875019077e-06, 'epoch': 0.33}
 33%|███▎     | 11183/34278 [12:19:41<20:52:54, 3.26s/it] {'loss': 0.1532, 'grad_norm': 1.056578533196887, 'learning_rate': 7.869978031163595e-06, 'epoch': 0.33}
 33%|███▎     | 11184/34278 [12:19:44<20:32:29, 3.20s/it] {'loss': 0.1528, 'grad_norm': 0.8102537330696241, 'learning_rate': 7.869591161685632e-06, 'epoch': 0.33}
 33%|███▎     | 11185/34278 [12:19:47<20:44:55, 3.23s/it] {'loss': 0.137, 'grad_norm': 0.676497318379484, 'learning_rate': 7.869204266588646e-06, 'epoch': 0.33}
 33%|███▎     | 11186/34278 [12:19:51<21:15:29, 3.31s/it] {'loss': 0.1326, 'grad_norm': 0.7706067624671409, 'learning_rate': 7.868817345876087e-06, 'epoch': 0.33}
 33%|███▎     | 11187/34278 [12:19:54<21:03:23, 3.28s/it] {'loss': 0.1532, 'grad_norm': 0.9601753302633754, 'learning_rate': 7.868430399551414e-06, 'epoch': 0.33}
 33%|███▎     | 11188/34278 [12:20:00<26:19:08, 4.10s/it] {'loss': 0.1731, 'grad_norm': 0.8017299868137221, 'learning_rate': 7.868043427618079e-06, 'epoch': 0.33}
 33%|███▎     | 11189/34278 [12:20:03<24:26:51, 3.81s/it] {'loss': 0.1564, 'grad_norm': 0.9979207894961692, 'learning_rate': 7.867656430079536e-06, 'epoch': 0.33}
 33%|███▎     | 11190/34278 [12:20:06<22:57:59, 3.58s/it] {'loss': 0.1531, 'grad_norm': 0.8489733284107609, 'learning_rate': 7.867269406939241e-06, 'epoch': 0.33}
 33%|███▎     | 11191/34278 [12:20:09<21:30:22, 3.35s/it] {'loss': 0.1477, 'grad_norm': 1.2871244862954452, 'learning_rate': 7.86688235820065e-06, 'epoch': 0.33}
 33%|███▎     | 11192/34278 [12:20:12<21:14:53, 3.31s/it] {'loss': 0.1592, 'grad_norm': 0.8434115513925409, 'learning_rate': 7.866495283867217e-06, 'epoch': 0.33}
 33%|███▎     | 11193/34278 [12:20:16<20:52:53, 3.26s/it] {'loss': 0.1339, 'grad_norm': 0.7868800112584352, 'learning_rate': 7.866108183942398e-06, 'epoch': 0.33}
 33%|███▎     | 11194/34278 [12:20:20<23:29:28, 3.66s/it] {'loss': 0.145, 'grad_norm': 0.7322802973463202, 'learning_rate': 7.86572105842965e-06, 'epoch': 0.33}
 33%|███▎     | 11195/34278 [12:20:23<22:38:45, 3.53s/it] {'loss': 0.1633, 'grad_norm': 0.7045795705883792, 'learning_rate': 7.865333907332428e-06, 'epoch': 0.33}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f61c8f373d0>
Failed to fetch sample 3686876. Exception: cannot identify image file <_io.BytesIO object at 0x7f61c8f373d0>
 33%|███▎     | 11196/34278 [12:20:27<22:11:48, 3.46s/it] {'loss': 0.1372, 'grad_norm': 0.8515529749835823, 'learning_rate': 7.864946730654189e-06, 'epoch': 0.33}
 33%|███▎     | 11197/34278 [12:20:30<21:33:23, 3.36s/it] {'loss': 0.1445, 'grad_norm': 0.8143587191431548, 'learning_rate': 7.864559528398389e-06, 'epoch': 0.33}
 33%|███▎     | 11198/34278 [12:20:33<20:59:32, 3.27s/it] {'loss': 0.1749, 'grad_norm': 0.7803727200147857, 'learning_rate': 7.864172300568486e-06, 'epoch': 0.33}
 33%|███▎     | 11199/34278 [12:20:36<20:22:24, 3.18s/it] {'loss': 0.1546, 'grad_norm': 0.8890237745265267, 'learning_rate': 7.863785047167937e-06, 'epoch': 0.33}
 33%|███▎     | 11200/34278 [12:20:39<20:40:17, 3.22s/it] {'loss': 0.1536, 'grad_norm': 0.8192458116294992, 'learning_rate': 7.863397768200199e-06, 'epoch': 0.33}
 33%|███▎     | 11201/34278 [12:20:43<22:02:03, 3.44s/it] {'loss': 0.166, 'grad_norm': 0.8109583909799909, 'learning_rate': 7.863010463668727e-06, 'epoch': 0.33}
 33%|███▎     | 11202/34278 [12:20:46<21:48:48, 3.40s/it] {'loss': 0.1344, 'grad_norm': 0.7512780401103101, 'learning_rate': 7.862623133576983e-06, 'epoch': 0.33}
 33%|███▎     | 11203/34278 [12:20:52<25:36:10, 3.99s/it] {'loss': 0.1426, 'grad_norm': 0.8751608725724604, 'learning_rate': 7.862235777928421e-06, 'epoch': 0.33}
 33%|███▎     | 11204/34278 [12:20:55<24:12:50, 3.78s/it] {'loss': 0.1424, 'grad_norm': 0.7561860223338784, 'learning_rate': 7.861848396726503e-06, 'epoch': 0.33}
 33%|███▎     | 11205/34278 [12:20:58<22:37:57, 3.53s/it] {'loss': 0.1652, 'grad_norm': 1.0115884453789517, 'learning_rate': 7.861460989974687e-06, 'epoch': 0.33}
 33%|███▎     | 11206/34278 [12:21:01<21:24:58, 3.34s/it] {'loss': 0.1373, 'grad_norm': 0.7903954517382313, 'learning_rate': 7.86107355767643e-06, 'epoch': 0.33}
 33%|███▎     | 11207/34278 [12:21:06<25:14:03, 3.94s/it] {'loss': 0.1402, 'grad_norm': 0.7381217575098457, 'learning_rate': 7.860686099835189e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 33%|███▎     | 11208/34278 [12:21:12<28:06:43, 4.39s/it] {'loss': 0.1529, 'grad_norm': 0.8546309919055587, 'learning_rate': 7.860298616454427e-06, 'epoch': 0.33}
 33%|███▎     | 11209/34278 [12:21:15<26:18:15, 4.10s/it] {'loss': 0.154, 'grad_norm': 0.7078441242209368, 'learning_rate': 7.8599111075376e-06, 'epoch': 0.33}
 33%|███▎     | 11210/34278 [12:21:21<30:07:00, 4.70s/it] {'loss': 0.1312, 'grad_norm': 1.0809135431051031, 'learning_rate': 7.85952357308817e-06, 'epoch': 0.33}
 33%|███▎     | 11211/34278 [12:21:25<28:25:21, 4.44s/it] {'loss': 0.1228, 'grad_norm': 0.9314678237368279, 'learning_rate': 7.8591360131096e-06, 'epoch': 0.33}
 33%|███▎     | 11212/34278 [12:21:31<32:06:27, 5.01s/it] {'loss': 0.1342, 'grad_norm': 0.9749281045999717, 'learning_rate': 7.85874842760534e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 33%|███▎     | 11213/34278 [12:21:36<30:48:34, 4.81s/it] {'loss': 0.14, 'grad_norm': 0.8503717148202816, 'learning_rate': 7.85836081657886e-06, 'epoch': 0.33}
 33%|███▎     | 11214/34278 [12:21:42<33:03:44, 5.16s/it] {'loss': 0.1725, 'grad_norm': 0.7811678816801917, 'learning_rate': 7.857973180033615e-06, 'epoch': 0.33}
 33%|███▎     | 11215/34278 [12:21:47<33:16:27, 5.19s/it] {'loss': 0.1597, 'grad_norm': 0.8289429545831913, 'learning_rate': 7.85758551797307e-06, 'epoch': 0.33}
 33%|███▎     | 11216/34278 [12:21:51<30:07:43, 4.70s/it] {'loss': 0.1655, 'grad_norm': 0.8870308805479863, 'learning_rate': 7.857197830400683e-06, 'epoch': 0.33}
 33%|███▎     | 11217/34278 [12:21:54<27:24:20, 4.28s/it] {'loss': 0.1552, 'grad_norm': 0.6846381544502131, 'learning_rate': 7.856810117319916e-06, 'epoch': 0.33}
 33%|███▎     | 11218/34278 [12:21:57<25:45:57, 4.02s/it] {'loss': 0.1317, 'grad_norm': 0.778537156230549, 'learning_rate': 7.85642237873423e-06, 'epoch': 0.33}
 33%|███▎     | 11219/34278 [12:22:00<23:55:25, 3.74s/it] {'loss': 0.1648, 'grad_norm': 0.741098038821696, 'learning_rate': 7.856034614647087e-06, 'epoch': 0.33}
 33%|███▎     | 11220/34278 [12:22:06<27:12:30, 4.25s/it] {'loss': 0.1472, 'grad_norm': 0.7653311032549058, 'learning_rate': 7.855646825061948e-06, 'epoch': 0.33}
 33%|███▎     | 11221/34278 [12:22:09<24:37:44, 3.85s/it] {'loss': 0.1336, 'grad_norm': 0.5955863209548142, 'learning_rate': 7.855259009982275e-06, 'epoch': 0.33}
 33%|███▎     | 11222/34278 [12:22:12<23:09:19, 3.62s/it] {'loss': 0.1552, 'grad_norm': 0.7148012467713774, 'learning_rate': 7.854871169411533e-06, 'epoch': 0.33}
 33%|███▎     | 11223/34278 [12:22:15<22:26:34, 3.50s/it] {'loss': 0.1856, 'grad_norm': 0.8127493975852811, 'learning_rate': 7.854483303353182e-06, 'epoch': 0.33}
 33%|███▎     | 11224/34278 [12:22:18<21:06:07, 3.30s/it] {'loss': 0.1233, 'grad_norm': 0.6241468024838582, 'learning_rate': 7.854095411810688e-06, 'epoch': 0.33}
 33%|███▎     | 11225/34278 [12:22:21<20:40:56, 3.23s/it] {'loss': 0.1457, 'grad_norm': 0.8904881057328751, 'learning_rate': 7.853707494787508e-06, 'epoch': 0.33}
 33%|███▎     | 11226/34278 [12:22:24<20:08:14, 3.14s/it] {'loss': 0.1591, 'grad_norm': 0.9114078043253493, 'learning_rate': 7.85331955228711e-06, 'epoch': 0.33}
 33%|███▎     | 11227/34278 [12:22:28<21:15:14, 3.32s/it] {'loss': 0.1543, 'grad_norm': 0.8735414792196101, 'learning_rate': 7.852931584312955e-06, 'epoch': 0.33}
 33%|███▎     | 11228/34278 [12:22:31<21:26:15, 3.35s/it] {'loss': 0.1556, 'grad_norm': 1.187150524167911, 'learning_rate': 7.85254359086851e-06, 'epoch': 0.33}
 33%|███▎     | 11229/34278 [12:22:35<23:12:07, 3.62s/it] {'loss': 0.1389, 'grad_norm': 0.896711636972266, 'learning_rate': 7.852155571957237e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 33%|███▎     | 11230/34278 [12:22:40<24:32:02, 3.83s/it] {'loss': 0.1527, 'grad_norm': 1.2077028882700518, 'learning_rate': 7.851767527582597e-06, 'epoch': 0.33}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f931c173ec0>
Failed to fetch sample 2735740. Exception: cannot identify image file <_io.BytesIO object at 0x7f931c173ec0>
 33%|███▎     | 11231/34278 [12:22:43<23:11:32, 3.62s/it] {'loss': 0.1503, 'grad_norm': 0.7263188814374241, 'learning_rate': 7.851379457748058e-06, 'epoch': 0.33}
 33%|███▎     | 11232/34278 [12:22:46<23:04:55, 3.61s/it] {'loss': 0.1095, 'grad_norm': 0.7322433282618662, 'learning_rate': 7.850991362457086e-06, 'epoch': 0.33}
 33%|███▎     | 11233/34278 [12:22:52<27:32:04, 4.30s/it] {'loss': 0.1504, 'grad_norm': 0.8758994807569604, 'learning_rate': 7.850603241713143e-06, 'epoch': 0.33}
 33%|███▎     | 11234/34278 [12:22:55<25:21:13, 3.96s/it] {'loss': 0.1395, 'grad_norm': 0.8231619583819102, 'learning_rate': 7.850215095519693e-06, 'epoch': 0.33}
 33%|███▎     | 11235/34278 [12:22:59<24:53:22, 3.89s/it] {'loss': 0.1651, 'grad_norm': 0.7710888808662312, 'learning_rate': 7.849826923880205e-06, 'epoch': 0.33}
 33%|███▎     | 11236/34278 [12:23:05<29:09:37, 4.56s/it] {'loss': 0.1653, 'grad_norm': 0.8255779225335644, 'learning_rate': 7.849438726798142e-06, 'epoch': 0.33}
 33%|███▎     | 11237/34278 [12:23:08<26:07:21, 4.08s/it] {'loss': 0.1373, 'grad_norm': 1.2851151130354712, 'learning_rate': 7.84905050427697e-06, 'epoch': 0.33}
 33%|███▎     | 11238/34278 [12:23:11<23:36:04, 3.69s/it] {'loss': 0.1725, 'grad_norm': 0.9665058607869698, 'learning_rate': 7.848662256320155e-06, 'epoch': 0.33}
 33%|███▎     | 11239/34278 [12:23:14<22:21:53, 3.49s/it] {'loss': 0.1347, 'grad_norm': 0.7286947726251944, 'learning_rate': 7.848273982931164e-06, 'epoch': 0.33}
 33%|███▎     | 11240/34278 [12:23:20<27:03:34, 4.23s/it] {'loss': 0.1782, 'grad_norm': 0.9619457127205155, 'learning_rate': 7.847885684113463e-06, 'epoch': 0.33}
 33%|███▎     | 11241/34278 [12:23:24<27:45:23, 4.34s/it] {'loss': 0.1584, 'grad_norm': 1.3098646305027817, 'learning_rate': 7.847497359870517e-06, 'epoch': 0.33}
 33%|███▎     | 11242/34278 [12:23:28<26:46:36, 4.18s/it] {'loss': 0.1514, 'grad_norm': 0.7374484823036807, 'learning_rate': 7.847109010205796e-06, 'epoch': 0.33}
 33%|███▎     | 11243/34278 [12:23:34<29:35:50, 4.63s/it] {'loss': 0.1322, 'grad_norm': 0.847237314622174, 'learning_rate': 7.846720635122765e-06, 'epoch': 0.33}
 33%|███▎     | 11244/34278 [12:23:39<30:37:38, 4.79s/it] {'loss': 0.16, 'grad_norm': 0.9292241527720175, 'learning_rate': 7.84633223462489e-06, 'epoch': 0.33}
 33%|███▎     | 11245/34278 [12:23:42<27:24:37, 4.28s/it] {'loss': 0.1466, 'grad_norm': 0.8562620323172302, 'learning_rate': 7.845943808715643e-06, 'epoch': 0.33}
 33%|███▎     | 11246/34278 [12:23:48<31:01:49, 4.85s/it] {'loss': 0.1357, 'grad_norm': 0.7034178594607732, 'learning_rate': 7.845555357398488e-06, 'epoch': 0.33}
 33%|███▎     | 11247/34278 [12:23:52<29:25:29, 4.60s/it] {'loss': 0.1457, 'grad_norm': 0.8738043470643679, 'learning_rate': 7.845166880676894e-06, 'epoch': 0.33}
 33%|███▎     | 11248/34278 [12:23:56<26:54:15, 4.21s/it] {'loss': 0.1391, 'grad_norm': 0.7889693949055656, 'learning_rate': 7.844778378554328e-06, 'epoch': 0.33}
 33%|███▎     | 11249/34278 [12:24:02<31:23:55, 4.91s/it] {'loss': 0.1621, 'grad_norm': 0.9570973099032345, 'learning_rate': 7.844389851034262e-06, 'epoch': 0.33}
 33%|███▎     | 11250/34278 [12:24:07<31:08:10, 4.87s/it] {'loss': 0.1403, 'grad_norm': 1.0272368292421759, 'learning_rate': 7.84400129812016e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 33%|███▎     | 11251/34278 [12:24:13<33:19:37, 5.21s/it] {'loss': 0.1505, 'grad_norm': 0.7281150320970144, 'learning_rate': 7.843612719815495e-06, 'epoch': 0.33}
 33%|███▎     | 11252/34278 [12:24:19<35:03:20, 5.48s/it] {'loss': 0.1709, 'grad_norm': 0.9004117672538315, 'learning_rate': 7.843224116123735e-06, 'epoch': 0.33}
 33%|███▎     | 11253/34278 [12:24:23<31:10:53, 4.88s/it] {'loss': 0.1388, 'grad_norm': 0.9160051166247537, 'learning_rate': 7.842835487048347e-06, 'epoch': 0.33}
 33%|███▎     | 11254/34278 [12:24:26<29:09:48, 4.56s/it] {'loss': 0.1462, 'grad_norm': 0.6360534837243493, 'learning_rate': 7.842446832592805e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 33%|███▎     | 11255/34278 [12:24:29<26:18:30, 4.11s/it] {'loss': 0.1816, 'grad_norm': 0.8936577309356724, 'learning_rate': 7.842058152760573e-06, 'epoch': 0.33}
 33%|███▎     | 11256/34278 [12:24:34<27:09:04, 4.25s/it] {'loss': 0.1429, 'grad_norm': 0.8079007857039887, 'learning_rate': 7.841669447555126e-06, 'epoch': 0.33}
 33%|███▎     | 11257/34278 [12:24:37<25:20:23, 3.96s/it] {'loss': 0.1311, 'grad_norm': 0.8079294421546867, 'learning_rate': 7.841280716979933e-06, 'epoch': 0.33}
 33%|███▎     | 11258/34278 [12:24:42<27:07:29, 4.24s/it] {'loss': 0.1484, 'grad_norm': 0.6798051594740071, 'learning_rate': 7.840891961038464e-06, 'epoch': 0.33}
 33%|███▎     | 11259/34278 [12:24:45<24:51:39, 3.89s/it] {'loss': 0.1597, 'grad_norm': 0.8981734630181988, 'learning_rate': 7.840503179734188e-06, 'epoch': 0.33}
 33%|███▎     | 11260/34278 [12:24:50<26:37:06, 4.16s/it] {'loss': 0.1344, 'grad_norm': 0.79375380572684, 'learning_rate': 7.840114373070579e-06, 'epoch': 0.33}
 33%|███▎     | 11261/34278 [12:24:54<25:57:00, 4.06s/it] {'loss': 0.1217, 'grad_norm': 0.8088605455674179, 'learning_rate': 7.839725541051106e-06, 'epoch': 0.33}
 33%|███▎     | 11262/34278 [12:24:57<24:10:11, 3.78s/it] {'loss': 0.1379, 'grad_norm': 0.7509650738130004, 'learning_rate': 7.839336683679241e-06, 'epoch': 0.33}
 33%|███▎     | 11263/34278 [12:25:00<22:51:58, 3.58s/it] {'loss': 0.1506, 'grad_norm': 0.9784735787095662, 'learning_rate': 7.838947800958459e-06, 'epoch': 0.33}
 33%|███▎     | 11264/34278 [12:25:04<22:34:26, 3.53s/it] {'loss': 0.1491, 'grad_norm': 0.9860942217770315, 'learning_rate': 7.838558892892226e-06, 'epoch': 0.33}
 33%|███▎     | 11265/34278 [12:25:09<26:22:32, 4.13s/it] {'loss': 0.1514, 'grad_norm': 0.8076819378761262, 'learning_rate': 7.838169959484017e-06, 'epoch': 0.33}
 33%|███▎     | 11266/34278 [12:25:13<25:10:24, 3.94s/it] {'loss': 0.1586, 'grad_norm': 0.9269881626634534, 'learning_rate': 7.837781000737306e-06, 'epoch': 0.33}
 33%|███▎     | 11267/34278 [12:25:16<23:33:18, 3.69s/it] {'loss': 0.1646, 'grad_norm': 0.7327365327819917, 'learning_rate': 7.837392016655562e-06, 'epoch': 0.33}
 33%|███▎     | 11268/34278 [12:25:20<23:57:57, 3.75s/it] {'loss': 0.1245, 'grad_norm': 0.831318890265991, 'learning_rate': 7.837003007242258e-06, 'epoch': 0.33}
 33%|███▎     | 11269/34278 [12:25:23<23:11:03, 3.63s/it] {'loss': 0.1414, 'grad_norm': 0.8241302883260383, 'learning_rate': 7.83661397250087e-06, 'epoch': 0.33}
 33%|███▎     | 11270/34278 [12:25:27<23:09:46, 3.62s/it] {'loss': 0.1463, 'grad_norm': 0.9117858533668711, 'learning_rate': 7.83622491243487e-06, 'epoch': 0.33}
 33%|███▎     | 11271/34278 [12:25:30<22:44:35, 3.56s/it] {'loss': 0.1549, 'grad_norm': 0.9879726116733071, 'learning_rate': 7.835835827047731e-06, 'epoch': 0.33}
 33%|███▎     | 11272/34278 [12:25:34<23:25:51, 3.67s/it] {'loss': 0.1554, 'grad_norm': 0.8261101970410129, 'learning_rate': 7.835446716342926e-06, 'epoch': 0.33}
 33%|███▎     | 11273/34278 [12:25:39<27:07:15, 4.24s/it] {'loss': 0.1415, 'grad_norm': 0.6414598425966459, 'learning_rate': 7.83505758032393e-06, 'epoch': 0.33}
| 11273/34278 [12:25:39<27:07:15, 4.24s/it] 33%|███▎ | 11274/34278 [12:25:42<24:26:21, 3.82s/it] {'loss': 0.1532, 'grad_norm': 0.7659185417449016, 'learning_rate': 7.834668418994216e-06, 'epoch': 0.33} 33%|███▎ | 11274/34278 [12:25:42<24:26:21, 3.82s/it] 33%|███▎ | 11275/34278 [12:25:45<23:11:52, 3.63s/it] {'loss': 0.1509, 'grad_norm': 1.0189263475884969, 'learning_rate': 7.834279232357261e-06, 'epoch': 0.33} 33%|███▎ | 11275/34278 [12:25:45<23:11:52, 3.63s/it] 33%|███▎ | 11276/34278 [12:25:49<22:36:51, 3.54s/it] {'loss': 0.1503, 'grad_norm': 0.6077407312814666, 'learning_rate': 7.833890020416537e-06, 'epoch': 0.33} 33%|███▎ | 11276/34278 [12:25:49<22:36:51, 3.54s/it] 33%|███▎ | 11277/34278 [12:25:52<21:44:35, 3.40s/it] {'loss': 0.1685, 'grad_norm': 0.8050501152685532, 'learning_rate': 7.833500783175518e-06, 'epoch': 0.33} 33%|███▎ | 11277/34278 [12:25:52<21:44:35, 3.40s/it] 33%|███▎ | 11278/34278 [12:25:55<20:43:40, 3.24s/it] {'loss': 0.135, 'grad_norm': 0.7442021567327869, 'learning_rate': 7.833111520637681e-06, 'epoch': 0.33} 33%|███▎ | 11278/34278 [12:25:55<20:43:40, 3.24s/it] 33%|███▎ | 11279/34278 [12:25:58<20:12:53, 3.16s/it] {'loss': 0.1499, 'grad_norm': 0.7083448176879351, 'learning_rate': 7.832722232806503e-06, 'epoch': 0.33} 33%|███▎ | 11279/34278 [12:25:58<20:12:53, 3.16s/it] 33%|███▎ | 11280/34278 [12:26:01<19:34:37, 3.06s/it] {'loss': 0.1408, 'grad_norm': 0.715504077425142, 'learning_rate': 7.832332919685452e-06, 'epoch': 0.33} 33%|███▎ | 11280/34278 [12:26:01<19:34:37, 3.06s/it] 33%|███▎ | 11281/34278 [12:26:04<19:22:48, 3.03s/it] {'loss': 0.142, 'grad_norm': 0.9422272669677965, 'learning_rate': 7.831943581278011e-06, 'epoch': 0.33} 33%|███▎ | 11281/34278 [12:26:04<19:22:48, 3.03s/it] 33%|███▎ | 11282/34278 [12:26:07<20:08:22, 3.15s/it] {'loss': 0.1437, 'grad_norm': 0.704431642332087, 'learning_rate': 7.831554217587655e-06, 'epoch': 0.33} 33%|███▎ | 11282/34278 [12:26:07<20:08:22, 3.15s/it] 33%|███▎ | 11283/34278 [12:26:11<21:24:49, 3.35s/it] 
{'loss': 0.1325, 'grad_norm': 0.6947801321932113, 'learning_rate': 7.831164828617858e-06, 'epoch': 0.33}
33%|███▎ | 11284/34278 [12:26:17<26:24:09, 4.13s/it] {'loss': 0.1795, 'grad_norm': 0.7415241606652632, 'learning_rate': 7.830775414372099e-06, 'epoch': 0.33}
33%|███▎ | 11285/34278 [12:26:20<24:08:55, 3.78s/it] {'loss': 0.1756, 'grad_norm': 0.8856096077059094, 'learning_rate': 7.830385974853852e-06, 'epoch': 0.33}
33%|███▎ | 11286/34278 [12:26:23<22:39:43, 3.55s/it] {'loss': 0.1392, 'grad_norm': 1.024118628830476, 'learning_rate': 7.829996510066594e-06, 'epoch': 0.33}
33%|███▎ | 11287/34278 [12:26:27<24:33:14, 3.84s/it] {'loss': 0.1477, 'grad_norm': 0.9095232382551075, 'learning_rate': 7.829607020013802e-06, 'epoch': 0.33}
33%|███▎ | 11288/34278 [12:26:30<22:32:17, 3.53s/it] {'loss': 0.1559, 'grad_norm': 0.9476127453849916, 'learning_rate': 7.829217504698957e-06, 'epoch': 0.33}
33%|███▎ | 11289/34278 [12:26:34<24:03:55, 3.77s/it] {'loss': 0.1477, 'grad_norm': 0.8061722285929731, 'learning_rate': 7.82882796412553e-06, 'epoch': 0.33}
33%|███▎ | 11290/34278 [12:26:38<24:14:46, 3.80s/it] {'loss': 0.1429, 'grad_norm': 0.6870770038574902, 'learning_rate': 7.828438398297005e-06, 'epoch': 0.33}
33%|███▎ | 11291/34278 [12:26:41<22:32:43, 3.53s/it] {'loss': 0.1488, 'grad_norm': 0.9215153764572268, 'learning_rate': 7.828048807216854e-06, 'epoch': 0.33}
33%|███▎ | 11292/34278 [12:26:45<23:04:07, 3.61s/it] {'loss': 0.1558, 'grad_norm': 0.8951236392736993, 'learning_rate': 7.827659190888562e-06, 'epoch': 0.33}
33%|███▎ | 11293/34278 [12:26:48<22:06:46, 3.46s/it] {'loss': 0.1737, 'grad_norm': 0.9029770747599283, 'learning_rate': 7.827269549315602e-06, 'epoch': 0.33}
33%|███▎ | 11294/34278 [12:26:51<21:23:41, 3.35s/it] {'loss': 0.134, 'grad_norm': 0.926310766870844, 'learning_rate': 7.826879882501455e-06, 'epoch': 0.33}
33%|███▎ | 11295/34278 [12:26:54<21:20:58, 3.34s/it] {'loss': 0.1479, 'grad_norm': 0.8409863761331066, 'learning_rate': 7.826490190449596e-06, 'epoch': 0.33}
33%|███▎ | 11296/34278 [12:26:58<21:07:48, 3.31s/it] {'loss': 0.1388, 'grad_norm': 0.8437637679566358, 'learning_rate': 7.826100473163512e-06, 'epoch': 0.33}
33%|███▎ | 11297/34278 [12:27:04<27:07:11, 4.25s/it] {'loss': 0.156, 'grad_norm': 0.891714974199195, 'learning_rate': 7.825710730646676e-06, 'epoch': 0.33}
33%|███▎ | 11298/34278 [12:27:07<25:20:20, 3.97s/it] {'loss': 0.1492, 'grad_norm': 0.897468335129447, 'learning_rate': 7.825320962902568e-06, 'epoch': 0.33}
33%|███▎ | 11299/34278 [12:27:10<23:21:44, 3.66s/it] {'loss': 0.126, 'grad_norm': 0.6932552940129606, 'learning_rate': 7.82493116993467e-06, 'epoch': 0.33}
33%|███▎ | 11300/34278 [12:27:15<25:55:30, 4.06s/it] {'loss': 0.144, 'grad_norm': 0.8449570914458303, 'learning_rate': 7.82454135174646e-06, 'epoch': 0.33}
33%|███▎ | 11301/34278 [12:27:19<25:37:59, 4.02s/it] {'loss': 0.1432, 'grad_norm': 0.9087588850523538, 'learning_rate': 7.82415150834142e-06, 'epoch': 0.33}
33%|███▎ | 11302/34278 [12:27:23<24:13:47, 3.80s/it] {'loss': 0.1441, 'grad_norm': 0.7557663548751011, 'learning_rate': 7.823761639723029e-06, 'epoch': 0.33}
33%|███▎ | 11303/34278 [12:27:26<23:10:03, 3.63s/it] {'loss': 0.143, 'grad_norm': 0.887443115216238, 'learning_rate': 7.823371745894768e-06, 'epoch': 0.33}
33%|███▎ | 11304/34278 [12:27:32<27:47:01, 4.35s/it] {'loss': 0.1434, 'grad_norm': 0.7834573258451816, 'learning_rate': 7.822981826860118e-06, 'epoch': 0.33}
33%|███▎ | 11305/34278 [12:27:35<25:58:42, 4.07s/it] {'loss': 0.1396, 'grad_norm': 0.8623294560083001, 'learning_rate': 7.822591882622562e-06, 'epoch': 0.33}
33%|███▎ | 11306/34278 [12:27:38<23:55:15, 3.75s/it] {'loss': 0.1551, 'grad_norm': 0.9454631376020142, 'learning_rate': 7.822201913185577e-06, 'epoch': 0.33}
33%|███▎ | 11307/34278 [12:27:43<25:33:51, 4.01s/it] {'loss': 0.1896, 'grad_norm': 0.8654563307718793, 'learning_rate': 7.821811918552647e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
33%|███▎ | 11308/34278 [12:27:47<26:01:32, 4.08s/it] {'loss': 0.1817, 'grad_norm': 0.953560744709995, 'learning_rate': 7.821421898727255e-06, 'epoch': 0.33}
33%|███▎ | 11309/34278 [12:27:53<29:50:02, 4.68s/it] {'loss': 0.133, 'grad_norm': 0.8154254191760685, 'learning_rate': 7.821031853712881e-06, 'epoch': 0.33}
33%|███▎ | 11310/34278 [12:27:57<27:15:05, 4.27s/it] {'loss': 0.1518, 'grad_norm': 0.9257272792921443, 'learning_rate': 7.82064178351301e-06, 'epoch': 0.33}
33%|███▎ | 11311/34278 [12:28:00<24:55:16, 3.91s/it] {'loss': 0.1299, 'grad_norm': 0.803301284297927, 'learning_rate': 7.820251688131121e-06, 'epoch': 0.33}
33%|███▎ | 11312/34278 [12:28:05<27:28:26, 4.31s/it] {'loss': 0.1424, 'grad_norm': 0.818206556440267, 'learning_rate': 7.819861567570699e-06, 'epoch': 0.33}
33%|███▎ | 11313/34278 [12:28:08<25:07:59, 3.94s/it] {'loss': 0.1454, 'grad_norm': 0.9667275436284948, 'learning_rate': 7.819471421835224e-06, 'epoch': 0.33}
33%|███▎ | 11314/34278 [12:28:11<24:15:41, 3.80s/it] {'loss': 0.132, 'grad_norm': 0.9182018915794946, 'learning_rate': 7.819081250928184e-06, 'epoch': 0.33}
33%|███▎ | 11315/34278 [12:28:17<27:42:48, 4.34s/it] {'loss': 0.1572, 'grad_norm': 0.9555021564657732, 'learning_rate': 7.818691054853056e-06, 'epoch': 0.33}
33%|███▎ | 11316/34278 [12:28:20<25:17:22, 3.96s/it] {'loss': 0.1366, 'grad_norm': 0.7809252659139025, 'learning_rate': 7.81830083361333e-06, 'epoch': 0.33}
33%|███▎ | 11317/34278 [12:28:24<24:18:33, 3.81s/it] {'loss': 0.1532, 'grad_norm': 0.9081021460098465, 'learning_rate': 7.817910587212486e-06, 'epoch': 0.33}
33%|███▎ | 11318/34278 [12:28:29<26:58:44, 4.23s/it] {'loss': 0.1382, 'grad_norm': 0.8547658909787655, 'learning_rate': 7.81752031565401e-06, 'epoch': 0.33}
33%|███▎ | 11319/34278 [12:28:32<24:54:14, 3.90s/it] {'loss': 0.1544, 'grad_norm': 0.8022174461139985, 'learning_rate': 7.817130018941383e-06, 'epoch': 0.33}
33%|███▎ | 11320/34278 [12:28:35<23:01:12, 3.61s/it] {'loss': 0.127, 'grad_norm': 0.97159226609784, 'learning_rate': 7.816739697078094e-06, 'epoch': 0.33}
33%|███▎ | 11321/34278 [12:28:39<25:00:43, 3.92s/it] {'loss': 0.1342, 'grad_norm': 0.853486364731287, 'learning_rate': 7.816349350067625e-06, 'epoch': 0.33}
33%|███▎ | 11322/34278 [12:28:43<23:35:40, 3.70s/it] {'loss': 0.1628, 'grad_norm': 1.0613319647337767, 'learning_rate': 7.81595897791346e-06, 'epoch': 0.33}
33%|███▎ | 11323/34278 [12:28:46<22:19:29, 3.50s/it] {'loss': 0.1689, 'grad_norm': 0.9607553484038386, 'learning_rate': 7.815568580619087e-06, 'epoch': 0.33}
33%|███▎ | 11324/34278 [12:28:48<20:48:34, 3.26s/it] {'loss': 0.1398, 'grad_norm': 0.9772394208745946, 'learning_rate': 7.815178158187991e-06, 'epoch': 0.33}
33%|███▎ | 11325/34278 [12:28:52<21:04:44, 3.31s/it] {'loss': 0.1434, 'grad_norm': 1.2193309788058466, 'learning_rate': 7.814787710623652e-06, 'epoch': 0.33}
33%|███▎ | 11326/34278 [12:28:55<20:49:17, 3.27s/it] {'loss': 0.1672, 'grad_norm': 1.0430780770787516, 'learning_rate': 7.814397237929564e-06, 'epoch': 0.33}
33%|███▎ | 11327/34278 [12:28:59<21:34:07, 3.38s/it] {'loss': 0.14, 'grad_norm': 0.9829767299141772, 'learning_rate': 7.814006740109208e-06, 'epoch': 0.33}
33%|███▎ | 11328/34278 [12:29:01<20:20:22, 3.19s/it] {'loss': 0.1486, 'grad_norm': 1.1205465339136425, 'learning_rate': 7.813616217166071e-06, 'epoch': 0.33}
33%|███▎ | 11329/34278 [12:29:04<19:47:28, 3.10s/it] {'loss': 0.1601, 'grad_norm': 0.8423887911422739, 'learning_rate': 7.813225669103641e-06, 'epoch': 0.33}
33%|███▎ | 11330/34278 [12:29:07<20:00:12, 3.14s/it] {'loss': 0.1799, 'grad_norm': 0.7399347113271597, 'learning_rate': 7.812835095925404e-06, 'epoch': 0.33}
33%|███▎ | 11331/34278 [12:29:13<25:15:33, 3.96s/it] {'loss': 0.1327, 'grad_norm': 1.0475515746343944, 'learning_rate': 7.812444497634847e-06, 'epoch': 0.33}
33%|███▎ | 11332/34278 [12:29:17<23:41:24, 3.72s/it] {'loss': 0.1253, 'grad_norm': 0.7769497032823693, 'learning_rate': 7.812053874235455e-06, 'epoch': 0.33}
33%|███▎ | 11333/34278 [12:29:19<22:13:42, 3.49s/it] {'loss': 0.1522, 'grad_norm': 0.8137021183200754, 'learning_rate': 7.811663225730718e-06, 'epoch': 0.33}
33%|███▎ | 11334/34278 [12:29:23<22:25:39, 3.52s/it] {'loss': 0.1333, 'grad_norm': 0.6840079985260334, 'learning_rate': 7.811272552124125e-06, 'epoch': 0.33}
33%|███▎ | 11335/34278 [12:29:26<21:54:49, 3.44s/it] {'loss': 0.1583, 'grad_norm': 0.8757888851495554, 'learning_rate': 7.81088185341916e-06, 'epoch': 0.33}
33%|███▎ | 11336/34278 [12:29:33<27:26:43, 4.31s/it] {'loss': 0.152, 'grad_norm': 0.9502181614649482, 'learning_rate': 7.810491129619314e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
33%|███▎ | 11337/34278 [12:29:36<25:53:57, 4.06s/it] {'loss': 0.1318, 'grad_norm': 1.0521417441197338, 'learning_rate': 7.810100380728072e-06, 'epoch': 0.33}
33%|███▎ | 11338/34278 [12:29:39<23:52:37, 3.75s/it] {'loss': 0.159, 'grad_norm': 0.7521275234108433, 'learning_rate': 7.809709606748926e-06, 'epoch': 0.33}
33%|███▎ | 11339/34278 [12:29:42<22:48:58, 3.58s/it] {'loss': 0.1591, 'grad_norm': 0.9068589984650683, 'learning_rate': 7.809318807685364e-06, 'epoch': 0.33}
33%|███▎ | 11340/34278 [12:29:46<22:36:55, 3.55s/it] {'loss': 0.141, 'grad_norm': 0.9063226841119603, 'learning_rate': 7.808927983540873e-06, 'epoch': 0.33}
33%|███▎ | 11341/34278 [12:29:49<22:03:57, 3.46s/it] {'loss': 0.1406, 'grad_norm': 1.023090238082688, 'learning_rate': 7.808537134318944e-06, 'epoch': 0.33}
33%|███▎ | 11342/34278 [12:29:53<22:48:36, 3.58s/it] {'loss': 0.1472, 'grad_norm': 0.9189415234848043, 'learning_rate': 7.808146260023067e-06, 'epoch': 0.33}
33%|███▎ | 11343/34278 [12:29:56<22:28:21, 3.53s/it] {'loss': 0.1763, 'grad_norm': 0.9280011152793862, 'learning_rate': 7.807755360656727e-06, 'epoch': 0.33}
33%|███▎ | 11344/34278 [12:30:00<22:02:22, 3.46s/it] {'loss': 0.1477, 'grad_norm': 1.0233840700040582, 'learning_rate': 7.807364436223422e-06, 'epoch': 0.33}
33%|███▎ | 11345/34278 [12:30:03<22:14:57, 3.49s/it] {'loss': 0.1357, 'grad_norm': 1.161617052132787, 'learning_rate': 7.806973486726634e-06, 'epoch': 0.33}
33%|███▎ | 11346/34278 [12:30:06<21:37:06, 3.39s/it] {'loss': 0.1408, 'grad_norm': 1.083374635455969, 'learning_rate': 7.806582512169859e-06, 'epoch': 0.33}
33%|███▎ | 11347/34278 [12:30:10<22:56:31, 3.60s/it] {'loss': 0.1467, 'grad_norm': 0.9179876388683236, 'learning_rate': 7.806191512556584e-06, 'epoch': 0.33}
33%|███▎ | 11348/34278 [12:30:13<21:30:10, 3.38s/it] {'loss': 0.1539, 'grad_norm': 0.9674249558014094, 'learning_rate': 7.805800487890302e-06, 'epoch': 0.33}
33%|███▎ | 11349/34278 [12:30:17<23:00:51, 3.61s/it] {'loss': 0.1404, 'grad_norm': 1.1368088176756828, 'learning_rate': 7.805409438174502e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
33%|███▎ | 11350/34278 [12:30:23<27:32:41, 4.32s/it] {'loss': 0.1312, 'grad_norm': 0.8900375496043085, 'learning_rate': 7.805018363412677e-06, 'epoch': 0.33}
33%|███▎ | 11351/34278 [12:30:27<26:19:09, 4.13s/it] {'loss': 0.155, 'grad_norm': 0.7199386376933307, 'learning_rate': 7.804627263608317e-06, 'epoch': 0.33}
33%|███▎ | 11352/34278 [12:30:30<24:48:30, 3.90s/it] {'loss': 0.1585, 'grad_norm': 1.0488604656936606, 'learning_rate': 7.804236138764916e-06, 'epoch': 0.33}
33%|███▎ | 11353/34278 [12:30:33<22:31:43, 3.54s/it] {'loss': 0.1404, 'grad_norm': 0.9974434078440045, 'learning_rate': 7.803844988885962e-06, 'epoch': 0.33}
33%|███▎ | 11354/34278 [12:30:37<23:06:44, 3.63s/it] {'loss': 0.1821, 'grad_norm': 0.8802585061595912, 'learning_rate': 7.803453813974951e-06, 'epoch': 0.33}
33%|███▎ | 11355/34278 [12:30:40<22:32:41, 3.54s/it] {'loss': 0.1803, 'grad_norm': 0.9447214851208002, 'learning_rate': 7.803062614035372e-06, 'epoch': 0.33}
33%|███▎ | 11356/34278 [12:30:44<22:44:18, 3.57s/it] {'loss': 0.1629, 'grad_norm': 1.341498285531634, 'learning_rate': 7.802671389070721e-06, 'epoch': 0.33}
33%|███▎ | 11357/34278 [12:30:50<27:06:46, 4.26s/it] {'loss': 0.1138, 'grad_norm': 0.8542320307106278, 'learning_rate': 7.802280139084489e-06, 'epoch': 0.33}
33%|███▎ | 11358/34278 [12:30:54<26:07:45, 4.10s/it] {'loss': 0.1682, 'grad_norm': 0.8783499068564741, 'learning_rate': 7.801888864080166e-06, 'epoch': 0.33}
33%|███▎ | 11359/34278 [12:30:57<23:51:20, 3.75s/it] {'loss': 0.1627, 'grad_norm': 0.9882950313021468, 'learning_rate': 7.80149756406125e-06, 'epoch': 0.33}
33%|███▎ | 11360/34278 [12:30:59<21:56:13, 3.45s/it] {'loss': 0.1412, 'grad_norm': 0.7591211774130825, 'learning_rate': 7.801106239031233e-06, 'epoch': 0.33}
33%|███▎ | 11361/34278 [12:31:04<24:37:13, 3.87s/it] {'loss': 0.1457, 'grad_norm': 0.7556389361556377, 'learning_rate': 7.800714888993607e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
33%|███▎ | 11362/34278 [12:31:08<25:13:08, 3.96s/it] {'loss': 0.1707, 'grad_norm': 0.9666078241016921, 'learning_rate': 7.800323513951867e-06, 'epoch': 0.33}
33%|███▎ | 11363/34278 [12:31:13<26:04:39, 4.10s/it] {'loss': 0.1347, 'grad_norm': 0.8416880894335402, 'learning_rate': 7.799932113909508e-06, 'epoch': 0.33}
33%|███▎ | 11364/34278 [12:31:16<23:49:18, 3.74s/it] {'loss': 0.1154, 'grad_norm': 0.7060471191911536, 'learning_rate': 7.799540688870024e-06, 'epoch': 0.33}
33%|███▎ | 11365/34278 [12:31:19<23:41:28, 3.72s/it] {'loss': 0.1428, 'grad_norm': 0.9235554422842465, 'learning_rate': 7.799149238836908e-06, 'epoch': 0.33}
33%|███▎ | 11366/34278 [12:31:24<25:32:48, 4.01s/it] {'loss': 0.1502, 'grad_norm': 0.794240013814919, 'learning_rate': 7.798757763813656e-06, 'epoch': 0.33}
Token indices sequence length is longer than the specified maximum sequence length for this model (11236 > 8192).
Running this sequence through the model will result in indexing errors
33%|███▎ | 11367/34278 [12:31:27<23:30:14, 3.69s/it] {'loss': 0.1691, 'grad_norm': 0.7457371350575628, 'learning_rate': 7.798366263803763e-06, 'epoch': 0.33}
33%|███▎ | 11368/34278 [12:31:30<23:02:42, 3.62s/it] {'loss': 0.1499, 'grad_norm': 0.8171198498176778, 'learning_rate': 7.797974738810723e-06, 'epoch': 0.33}
33%|███▎ | 11369/34278 [12:31:37<27:56:29, 4.39s/it] {'loss': 0.1799, 'grad_norm': 0.7518992320645509, 'learning_rate': 7.797583188838033e-06, 'epoch': 0.33}
33%|███▎ | 11370/34278 [12:31:40<25:47:13, 4.05s/it] {'loss': 0.1528, 'grad_norm': 0.9249122309323343, 'learning_rate': 7.79719161388919e-06, 'epoch': 0.33}
33%|███▎ | 11371/34278 [12:31:43<23:38:41, 3.72s/it] {'loss': 0.1468, 'grad_norm': 0.908845387575393, 'learning_rate': 7.796800013967685e-06, 'epoch': 0.33}
33%|███▎ | 11372/34278 [12:31:47<23:54:04, 3.76s/it] {'loss': 0.164, 'grad_norm': 0.7230178464134541, 'learning_rate': 7.79640838907702e-06, 'epoch': 0.33}
33%|███▎ | 11373/34278 [12:31:50<22:47:19, 3.58s/it] {'loss': 0.1611, 'grad_norm': 0.8837016122612601, 'learning_rate': 7.796016739220686e-06, 'epoch': 0.33}
33%|███▎ | 11374/34278 [12:31:53<22:08:18, 3.48s/it] {'loss': 0.1473, 'grad_norm': 1.1638407280343497, 'learning_rate': 7.795625064402184e-06, 'epoch': 0.33}
33%|███▎ | 11375/34278 [12:31:57<22:31:36, 3.54s/it] {'loss': 0.1458, 'grad_norm': 1.0006773533680604, 'learning_rate': 7.795233364625008e-06, 'epoch': 0.33}
33%|███▎ | 11376/34278 [12:32:01<23:21:18, 3.67s/it] {'loss': 0.1576, 'grad_norm': 0.87323873384313, 'learning_rate': 7.794841639892655e-06, 'epoch': 0.33}
33%|███▎ | 11377/34278 [12:32:04<22:24:31, 3.52s/it] {'loss': 0.1535, 'grad_norm': 0.7583736870201176, 'learning_rate': 7.794449890208624e-06, 'epoch': 0.33}
33%|███▎ | 11378/34278 [12:32:07<22:25:54, 3.53s/it] {'loss': 0.1434, 'grad_norm': 0.907097595137542, 'learning_rate': 7.794058115576411e-06, 'epoch': 0.33}
33%|███▎ | 11379/34278 [12:32:12<23:35:45, 3.71s/it] {'loss': 0.1501, 'grad_norm': 1.078460085956634, 'learning_rate': 7.793666315999514e-06, 'epoch': 0.33}
33%|███▎ | 11380/34278 [12:32:16<25:04:54, 3.94s/it] {'loss': 0.1457, 'grad_norm': 0.7874561281958534, 'learning_rate': 7.793274491481431e-06, 'epoch': 0.33}
33%|███▎ | 11381/34278 [12:32:19<22:44:39, 3.58s/it] {'loss': 0.1608, 'grad_norm': 0.9717033244524153, 'learning_rate': 7.792882642025662e-06, 'epoch': 0.33}
33%|███▎ | 11382/34278 [12:32:22<21:40:06, 3.41s/it] {'loss': 0.1338, 'grad_norm': 1.1234856164189455, 'learning_rate': 7.7924907676357e-06, 'epoch': 0.33}
33%|███▎ | 11383/34278 [12:32:25<21:35:15, 3.39s/it] {'loss': 0.1479, 'grad_norm': 0.6348800428920081, 'learning_rate': 7.79209886831505e-06, 'epoch': 0.33}
33%|███▎ | 11384/34278 [12:32:29<22:07:36, 3.48s/it] {'loss': 0.1586, 'grad_norm': 0.9411558584280161, 'learning_rate': 7.791706944067207e-06, 'epoch': 0.33}
33%|███▎ | 11385/34278 [12:32:32<20:57:42, 3.30s/it] {'loss': 0.1839, 'grad_norm': 0.9933133911246549, 'learning_rate': 7.79131499489567e-06, 'epoch': 0.33}
33%|███▎ | 11386/34278 [12:32:34<19:54:54, 3.13s/it] {'loss': 0.1702, 'grad_norm': 0.8544700110182338, 'learning_rate': 7.79092302080394e-06, 'epoch': 0.33}
33%|███▎ | 11387/34278 [12:32:38<20:33:07, 3.23s/it] {'loss': 0.1392, 'grad_norm': 0.8343955580679631, 'learning_rate': 7.790531021795516e-06, 'epoch': 0.33}
33%|███▎ | 11388/34278 [12:32:41<20:12:03, 3.18s/it] {'loss': 0.1613, 'grad_norm': 0.8935637225075622, 'learning_rate': 7.790138997873895e-06, 'epoch': 0.33}
33%|███▎ | 11389/34278 [12:32:44<19:57:19, 3.14s/it] {'loss': 0.1307, 'grad_norm': 0.7621996166956655, 'learning_rate': 7.789746949042582e-06, 'epoch': 0.33}
33%|███▎ | 11390/34278 [12:32:48<21:19:13, 3.35s/it] {'loss': 0.1692, 'grad_norm': 0.8073744013821377, 'learning_rate': 7.789354875305074e-06, 'epoch': 0.33}
33%|███▎ | 11391/34278 [12:32:51<21:38:49, 3.40s/it] {'loss': 0.165, 'grad_norm': 0.8965316595742587, 'learning_rate': 7.788962776664867e-06, 'epoch': 0.33}
33%|███▎ | 11392/34278 [12:32:54<20:29:48, 3.22s/it] {'loss': 0.1411, 'grad_norm': 0.7934239369806525, 'learning_rate': 7.78857065312547e-06, 'epoch': 0.33}
Token indices sequence length is longer than the specified maximum sequence length for this model (8767 > 8192). Running this sequence through the model will result in indexing errors
33%|███▎ | 11393/34278 [12:32:57<20:33:25, 3.23s/it] {'loss': 0.1432, 'grad_norm': 0.8277492835564576, 'learning_rate': 7.78817850469038e-06, 'epoch': 0.33}
33%|███▎ | 11394/34278 [12:33:01<20:15:25, 3.19s/it] {'loss': 0.1502, 'grad_norm': 0.7552515074734151, 'learning_rate': 7.787786331363097e-06, 'epoch': 0.33}
33%|███▎ | 11395/34278 [12:33:03<19:36:10, 3.08s/it] {'loss': 0.1521, 'grad_norm': 0.9938117671261162, 'learning_rate': 7.787394133147125e-06, 'epoch': 0.33}
33%|███▎ | 11396/34278 [12:33:08<21:40:37, 3.41s/it] {'loss': 0.1471, 'grad_norm': 0.8481254081862278, 'learning_rate': 7.787001910045962e-06, 'epoch': 0.33}
33%|███▎ | 11397/34278 [12:33:11<21:10:18, 3.33s/it] {'loss': 0.1685, 'grad_norm': 0.8684717639587272, 'learning_rate': 7.786609662063109e-06, 'epoch': 0.33}
33%|███▎ | 11398/34278 [12:33:13<20:11:10, 3.18s/it] {'loss': 0.1614, 'grad_norm': 0.9166406583040055, 'learning_rate': 7.786217389202073e-06, 'epoch': 0.33}
33%|███▎ | 11399/34278 [12:33:17<20:35:32, 3.24s/it] {'loss': 0.1466, 'grad_norm': 0.7848013886076809, 'learning_rate': 7.785825091466352e-06, 'epoch': 0.33}
33%|███▎ | 11400/34278 [12:33:20<19:59:19, 3.15s/it] {'loss': 0.1657, 'grad_norm': 0.6776690801310096, 'learning_rate': 7.78543276885945e-06, 'epoch': 0.33}
33%|███▎ | 11401/34278 [12:33:23<20:39:05, 3.25s/it] {'loss': 0.1528, 'grad_norm': 1.0952363863083787, 'learning_rate': 7.785040421384871e-06, 'epoch': 0.33}
33%|███▎ | 11402/34278 [12:33:29<25:49:42, 4.06s/it] {'loss': 0.1511, 'grad_norm': 0.7588286368887359, 'learning_rate': 7.784648049046114e-06, 'epoch': 0.33}
33%|███▎ | 11403/34278 [12:33:32<23:19:06, 3.67s/it] {'loss': 0.1444, 'grad_norm': 0.8805903717398548, 'learning_rate': 7.784255651846684e-06, 'epoch': 0.33}
33%|███▎ | 11404/34278 [12:33:35<22:44:54, 3.58s/it] {'loss': 0.134, 'grad_norm': 0.7726353599224017, 'learning_rate': 7.783863229790085e-06, 'epoch': 0.33}
33%|███▎ | 11405/34278 [12:33:39<23:17:10, 3.67s/it] {'loss': 0.137, 'grad_norm': 0.7227797656123466, 'learning_rate': 7.783470782879818e-06, 'epoch': 0.33}
33%|███▎ | 11406/34278 [12:33:44<26:06:35, 4.11s/it] {'loss': 0.1406, 'grad_norm': 0.6219612835543467, 'learning_rate': 7.783078311119389e-06, 'epoch': 0.33}
33%|███▎ | 11407/34278 [12:33:51<30:18:10, 4.77s/it] {'loss': 0.1339, 'grad_norm': 0.7399841750316177, 'learning_rate': 7.782685814512303e-06, 'epoch': 0.33}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
33%|███▎ | 11408/34278 [12:33:54<27:27:47, 4.32s/it] {'loss': 0.1421, 'grad_norm': 0.5929710896739792, 'learning_rate': 7.782293293062062e-06, 'epoch': 0.33}
33%|███▎ | 11409/34278 [12:33:57<25:29:37, 4.01s/it] {'loss': 0.1606, 'grad_norm': 0.6908336840206548, 'learning_rate': 7.781900746772169e-06, 'epoch': 0.33}
33%|███▎ | 11410/34278 [12:34:01<24:57:31, 3.93s/it] {'loss': 0.1692, 'grad_norm': 0.7495708806481434, 'learning_rate': 7.78150817564613e-06, 'epoch': 0.33}
33%|███▎ | 11411/34278 [12:34:07<28:55:52, 4.55s/it] {'loss': 0.1226, 'grad_norm': 0.8252047910093573, 'learning_rate': 7.781115579687452e-06, 'epoch': 0.33}
33%|███▎ | 11412/34278 [12:34:13<31:47:55, 5.01s/it] {'loss': 0.168, 'grad_norm': 0.8055027083945296, 'learning_rate': 7.780722958899637e-06, 'epoch': 0.33}
33%|███▎ | 11413/34278 [12:34:16<27:34:50, 4.34s/it] {'loss': 0.1467, 'grad_norm': 0.7067676823503098, 'learning_rate': 7.78033031328619e-06, 'epoch': 0.33}
33%|███▎ | 11414/34278 [12:34:19<25:10:34, 3.96s/it] {'loss': 0.1352, 'grad_norm': 0.9506405613626454, 'learning_rate': 7.779937642850618e-06, 'epoch': 0.33}
33%|███▎ | 11415/34278 [12:34:25<29:15:54, 4.61s/it] {'loss': 0.1617, 'grad_norm': 1.0165299854068452, 'learning_rate': 7.779544947596428e-06, 'epoch': 0.33}
33%|███▎ | 11416/34278 [12:34:32<32:50:19, 5.17s/it] {'loss': 0.1506, 'grad_norm': 0.7459070939394024, 'learning_rate': 7.779152227527124e-06, 'epoch': 0.33}
33%|███▎ | 11417/34278 [12:34:35<29:21:30, 4.62s/it] {'loss':
0.1498, 'grad_norm': 0.9564467771215571, 'learning_rate': 7.778759482646213e-06, 'epoch': 0.33} 33%|███▎ | 11417/34278 [12:34:35<29:21:30, 4.62s/it] 33%|███▎ | 11418/34278 [12:34:38<26:43:44, 4.21s/it] {'loss': 0.1232, 'grad_norm': 0.8113753157089846, 'learning_rate': 7.778366712957198e-06, 'epoch': 0.33} 33%|███▎ | 11418/34278 [12:34:38<26:43:44, 4.21s/it] 33%|███▎ | 11419/34278 [12:34:42<25:25:20, 4.00s/it] {'loss': 0.1415, 'grad_norm': 0.8962795501339426, 'learning_rate': 7.77797391846359e-06, 'epoch': 0.33} 33%|███▎ | 11419/34278 [12:34:42<25:25:20, 4.00s/it] 33%|███▎ | 11420/34278 [12:34:45<23:25:34, 3.69s/it] {'loss': 0.1635, 'grad_norm': 0.9959166871234717, 'learning_rate': 7.777581099168894e-06, 'epoch': 0.33} 33%|███▎ | 11420/34278 [12:34:45<23:25:34, 3.69s/it] 33%|███▎ | 11421/34278 [12:34:47<21:52:10, 3.44s/it] {'loss': 0.1437, 'grad_norm': 0.8844159556240272, 'learning_rate': 7.777188255076616e-06, 'epoch': 0.33} 33%|███▎ | 11421/34278 [12:34:48<21:52:10, 3.44s/it] 33%|███▎ | 11422/34278 [12:34:51<21:36:48, 3.40s/it] {'loss': 0.1365, 'grad_norm': 0.8218999750495313, 'learning_rate': 7.776795386190265e-06, 'epoch': 0.33} 33%|███▎ | 11422/34278 [12:34:51<21:36:48, 3.40s/it] 33%|███▎ | 11423/34278 [12:34:56<24:50:07, 3.91s/it] {'loss': 0.1316, 'grad_norm': 0.8954471981358914, 'learning_rate': 7.77640249251335e-06, 'epoch': 0.33} 33%|███▎ | 11423/34278 [12:34:56<24:50:07, 3.91s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 33%|███▎ | 11424/34278 [12:35:02<29:16:57, 4.61s/it] {'loss': 0.1469, 'grad_norm': 0.8534036207605681, 'learning_rate': 7.776009574049373e-06, 'epoch': 0.33} 33%|███▎ | 11424/34278 [12:35:02<29:16:57, 4.61s/it] 33%|███▎ | 11425/34278 [12:35:05<26:21:01, 4.15s/it] {'loss': 0.1331, 'grad_norm': 0.8490377272865366, 'learning_rate': 7.775616630801846e-06, 'epoch': 0.33} 33%|███▎ | 11425/34278 [12:35:05<26:21:01, 4.15s/it] 33%|███▎ | 11426/34278 [12:35:09<25:40:28, 4.04s/it] {'loss': 0.1599, 'grad_norm': 0.888002051237779, 'learning_rate': 7.775223662774276e-06, 'epoch': 0.33} 33%|███▎ | 11426/34278 [12:35:09<25:40:28, 4.04s/it] 33%|███▎ | 11427/34278 [12:35:15<28:42:02, 4.52s/it] {'loss': 0.1525, 'grad_norm': 0.7401795024990379, 'learning_rate': 7.774830669970172e-06, 'epoch': 0.33} 33%|███▎ | 11427/34278 [12:35:15<28:42:02, 4.52s/it] 33%|███▎ | 11428/34278 [12:35:21<31:30:51, 4.97s/it] {'loss': 0.1568, 'grad_norm': 1.0355651212089865, 'learning_rate': 7.774437652393042e-06, 'epoch': 0.33} 33%|███▎ | 11428/34278 [12:35:21<31:30:51, 4.97s/it] 33%|███▎ | 11429/34278 [12:35:26<33:07:09, 5.22s/it] {'loss': 0.1581, 'grad_norm': 0.7684071493659119, 'learning_rate': 7.774044610046396e-06, 'epoch': 0.33} 33%|███▎ | 11429/34278 [12:35:26<33:07:09, 5.22s/it] 33%|███▎ | 11430/34278 [12:35:29<28:56:12, 4.56s/it] {'loss': 0.1529, 'grad_norm': 0.7654692111072509, 'learning_rate': 7.77365154293374e-06, 'epoch': 0.33} 33%|███▎ | 11430/34278 [12:35:30<28:56:12, 4.56s/it] 33%|███▎ | 11431/34278 [12:35:36<31:42:58, 5.00s/it] {'loss': 0.1444, 'grad_norm': 0.8563814345870975, 'learning_rate': 7.773258451058587e-06, 'epoch': 0.33} 33%|███▎ | 11431/34278 [12:35:36<31:42:58, 5.00s/it] 33%|███▎ | 11432/34278 [12:35:39<28:07:10, 4.43s/it] {'loss': 0.1467, 'grad_norm': 0.7146638452884608, 'learning_rate': 7.772865334424444e-06, 'epoch': 0.33} 33%|███▎ | 11432/34278 [12:35:39<28:07:10, 4.43s/it] 33%|███▎ | 11433/34278 [12:35:42<25:37:52, 4.04s/it] {'loss': 
0.1572, 'grad_norm': 0.8164867364796478, 'learning_rate': 7.772472193034821e-06, 'epoch': 0.33} 33%|███▎ | 11433/34278 [12:35:42<25:37:52, 4.04s/it] 33%|███▎ | 11434/34278 [12:35:45<23:57:44, 3.78s/it] {'loss': 0.1618, 'grad_norm': 0.9721989059492175, 'learning_rate': 7.772079026893229e-06, 'epoch': 0.33} 33%|███▎ | 11434/34278 [12:35:45<23:57:44, 3.78s/it] 33%|███▎ | 11435/34278 [12:35:48<23:14:54, 3.66s/it] {'loss': 0.123, 'grad_norm': 0.9092877986747195, 'learning_rate': 7.771685836003175e-06, 'epoch': 0.33} 33%|███▎ | 11435/34278 [12:35:48<23:14:54, 3.66s/it] 33%|███▎ | 11436/34278 [12:35:51<22:00:23, 3.47s/it] {'loss': 0.1313, 'grad_norm': 0.8869602588924006, 'learning_rate': 7.771292620368173e-06, 'epoch': 0.33} 33%|███▎ | 11436/34278 [12:35:51<22:00:23, 3.47s/it] 33%|███▎ | 11437/34278 [12:35:54<21:20:49, 3.36s/it] {'loss': 0.1567, 'grad_norm': 0.8444640841614083, 'learning_rate': 7.770899379991732e-06, 'epoch': 0.33} 33%|███▎ | 11437/34278 [12:35:54<21:20:49, 3.36s/it] 33%|███▎ | 11438/34278 [12:35:57<20:42:04, 3.26s/it] {'loss': 0.181, 'grad_norm': 1.1219937013782346, 'learning_rate': 7.770506114877364e-06, 'epoch': 0.33} 33%|███▎ | 11438/34278 [12:35:57<20:42:04, 3.26s/it] 33%|███▎ | 11439/34278 [12:36:01<20:32:09, 3.24s/it] {'loss': 0.1154, 'grad_norm': 0.9832192985802704, 'learning_rate': 7.770112825028578e-06, 'epoch': 0.33} 33%|███▎ | 11439/34278 [12:36:01<20:32:09, 3.24s/it] 33%|███▎ | 11440/34278 [12:36:07<25:43:13, 4.05s/it] {'loss': 0.1601, 'grad_norm': 0.7641647964742511, 'learning_rate': 7.769719510448886e-06, 'epoch': 0.33} 33%|███▎ | 11440/34278 [12:36:07<25:43:13, 4.05s/it] 33%|███▎ | 11441/34278 [12:36:09<23:05:27, 3.64s/it] {'loss': 0.147, 'grad_norm': 1.0673713528525672, 'learning_rate': 7.769326171141797e-06, 'epoch': 0.33} 33%|███▎ | 11441/34278 [12:36:09<23:05:27, 3.64s/it] 33%|███▎ | 11442/34278 [12:36:13<22:24:20, 3.53s/it] {'loss': 0.1354, 'grad_norm': 1.0112107621321438, 'learning_rate': 7.768932807110828e-06, 'epoch': 0.33} 
33%|███▎ | 11442/34278 [12:36:13<22:24:20, 3.53s/it] 33%|███▎ | 11443/34278 [12:36:19<27:05:13, 4.27s/it] {'loss': 0.1398, 'grad_norm': 0.5999351158126687, 'learning_rate': 7.768539418359487e-06, 'epoch': 0.33} 33%|███▎ | 11443/34278 [12:36:19<27:05:13, 4.27s/it] 33%|███▎ | 11444/34278 [12:36:23<27:21:24, 4.31s/it] {'loss': 0.1294, 'grad_norm': 0.7862293484002059, 'learning_rate': 7.768146004891287e-06, 'epoch': 0.33} 33%|███▎ | 11444/34278 [12:36:23<27:21:24, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 33%|███▎ | 11445/34278 [12:36:26<24:36:43, 3.88s/it] {'loss': 0.1478, 'grad_norm': 0.7798420221815294, 'learning_rate': 7.767752566709739e-06, 'epoch': 0.33} 33%|███▎ | 11445/34278 [12:36:26<24:36:43, 3.88s/it] 33%|███▎ | 11446/34278 [12:36:29<23:50:26, 3.76s/it] {'loss': 0.1767, 'grad_norm': 1.0460569583065815, 'learning_rate': 7.767359103818357e-06, 'epoch': 0.33} 33%|███▎ | 11446/34278 [12:36:29<23:50:26, 3.76s/it] 33%|███▎ | 11447/34278 [12:36:33<22:54:52, 3.61s/it] {'loss': 0.1661, 'grad_norm': 0.7185436296585129, 'learning_rate': 7.766965616220655e-06, 'epoch': 0.33} 33%|███▎ | 11447/34278 [12:36:33<22:54:52, 3.61s/it] 33%|███▎ | 11448/34278 [12:36:36<21:58:30, 3.47s/it] {'loss': 0.1636, 'grad_norm': 1.0710920071104186, 'learning_rate': 7.766572103920144e-06, 'epoch': 0.33} 33%|███▎ | 11448/34278 [12:36:36<21:58:30, 3.47s/it] 33%|███▎ | 11449/34278 [12:36:39<21:13:43, 3.35s/it] {'loss': 0.1599, 'grad_norm': 0.8645450446415387, 'learning_rate': 7.766178566920338e-06, 'epoch': 0.33} 33%|███▎ | 11449/34278 [12:36:39<21:13:43, 3.35s/it] 33%|███▎ | 11450/34278 [12:36:42<20:18:12, 3.20s/it] {'loss': 0.1291, 'grad_norm': 0.8817345169230677, 'learning_rate': 7.76578500522475e-06, 'epoch': 0.33} 33%|███▎ | 11450/34278 [12:36:42<20:18:12, 3.20s/it] 33%|███▎ | 11451/34278 
[12:36:48<25:36:26, 4.04s/it] {'loss': 0.1468, 'grad_norm': 0.9304675892587817, 'learning_rate': 7.765391418836893e-06, 'epoch': 0.33} 33%|███▎ | 11451/34278 [12:36:48<25:36:26, 4.04s/it] 33%|███▎ | 11452/34278 [12:36:51<23:53:59, 3.77s/it] {'loss': 0.1332, 'grad_norm': 0.7462799203420152, 'learning_rate': 7.764997807760283e-06, 'epoch': 0.33} 33%|███▎ | 11452/34278 [12:36:51<23:53:59, 3.77s/it] 33%|███▎ | 11453/34278 [12:36:54<22:48:03, 3.60s/it] {'loss': 0.1615, 'grad_norm': 0.8722627404862797, 'learning_rate': 7.764604171998432e-06, 'epoch': 0.33} 33%|███▎ | 11453/34278 [12:36:54<22:48:03, 3.60s/it] 33%|███▎ | 11454/34278 [12:37:00<27:12:30, 4.29s/it] {'loss': 0.1536, 'grad_norm': 0.937752211240302, 'learning_rate': 7.764210511554854e-06, 'epoch': 0.33} 33%|███▎ | 11454/34278 [12:37:00<27:12:30, 4.29s/it] 33%|███▎ | 11455/34278 [12:37:03<25:02:23, 3.95s/it] {'loss': 0.1522, 'grad_norm': 0.777923375392146, 'learning_rate': 7.763816826433066e-06, 'epoch': 0.33} 33%|███▎ | 11455/34278 [12:37:03<25:02:23, 3.95s/it] 33%|███▎ | 11456/34278 [12:37:06<23:35:46, 3.72s/it] {'loss': 0.155, 'grad_norm': 1.0454405741657025, 'learning_rate': 7.76342311663658e-06, 'epoch': 0.33} 33%|███▎ | 11456/34278 [12:37:06<23:35:46, 3.72s/it] 33%|███▎ | 11457/34278 [12:37:09<22:30:24, 3.55s/it] {'loss': 0.1381, 'grad_norm': 0.9664911830553712, 'learning_rate': 7.763029382168912e-06, 'epoch': 0.33} 33%|███▎ | 11457/34278 [12:37:09<22:30:24, 3.55s/it] 33%|███▎ | 11458/34278 [12:37:14<25:06:02, 3.96s/it] {'loss': 0.1464, 'grad_norm': 0.9646019848903161, 'learning_rate': 7.762635623033577e-06, 'epoch': 0.33} 33%|███▎ | 11458/34278 [12:37:14<25:06:02, 3.96s/it] 33%|███▎ | 11459/34278 [12:37:18<23:57:15, 3.78s/it] {'loss': 0.1415, 'grad_norm': 0.959057745452574, 'learning_rate': 7.76224183923409e-06, 'epoch': 0.33} 33%|███▎ | 11459/34278 [12:37:18<23:57:15, 3.78s/it] 33%|███▎ | 11460/34278 [12:37:23<26:42:05, 4.21s/it] {'loss': 0.1317, 'grad_norm': 0.815788890676571, 'learning_rate': 
7.76184803077397e-06, 'epoch': 0.33} 33%|███▎ | 11460/34278 [12:37:23<26:42:05, 4.21s/it] 33%|███▎ | 11461/34278 [12:37:26<24:44:00, 3.90s/it] {'loss': 0.1191, 'grad_norm': 0.6938665614662191, 'learning_rate': 7.761454197656728e-06, 'epoch': 0.33} 33%|███▎ | 11461/34278 [12:37:26<24:44:00, 3.90s/it] 33%|███▎ | 11462/34278 [12:37:29<22:29:11, 3.55s/it] {'loss': 0.1425, 'grad_norm': 0.8584014957548999, 'learning_rate': 7.761060339885882e-06, 'epoch': 0.33} 33%|███▎ | 11462/34278 [12:37:29<22:29:11, 3.55s/it] 33%|███▎ | 11463/34278 [12:37:32<21:17:36, 3.36s/it] {'loss': 0.1584, 'grad_norm': 0.9439635147918899, 'learning_rate': 7.76066645746495e-06, 'epoch': 0.33} 33%|███▎ | 11463/34278 [12:37:32<21:17:36, 3.36s/it] 33%|███▎ | 11464/34278 [12:37:35<20:43:08, 3.27s/it] {'loss': 0.1487, 'grad_norm': 0.7275435427161401, 'learning_rate': 7.760272550397446e-06, 'epoch': 0.33} 33%|███▎ | 11464/34278 [12:37:35<20:43:08, 3.27s/it] 33%|███▎ | 11465/34278 [12:37:38<20:18:54, 3.21s/it] {'loss': 0.1596, 'grad_norm': 0.8284264484764152, 'learning_rate': 7.759878618686886e-06, 'epoch': 0.33} 33%|███▎ | 11465/34278 [12:37:38<20:18:54, 3.21s/it] 33%|███▎ | 11466/34278 [12:37:41<21:08:14, 3.34s/it] {'loss': 0.1761, 'grad_norm': 0.9728561120807099, 'learning_rate': 7.759484662336792e-06, 'epoch': 0.33} 33%|███▎ | 11466/34278 [12:37:41<21:08:14, 3.34s/it] 33%|███▎ | 11467/34278 [12:37:48<26:23:22, 4.16s/it] {'loss': 0.1312, 'grad_norm': 0.7633814891351184, 'learning_rate': 7.759090681350676e-06, 'epoch': 0.33} 33%|███▎ | 11467/34278 [12:37:48<26:23:22, 4.16s/it] 33%|███▎ | 11468/34278 [12:37:53<29:46:09, 4.70s/it] {'loss': 0.153, 'grad_norm': 0.6563076696473864, 'learning_rate': 7.758696675732057e-06, 'epoch': 0.33} 33%|███▎ | 11468/34278 [12:37:53<29:46:09, 4.70s/it] 33%|███▎ | 11469/34278 [12:37:56<26:25:58, 4.17s/it] {'loss': 0.1594, 'grad_norm': 0.907111826757, 'learning_rate': 7.758302645484451e-06, 'epoch': 0.33} 33%|███▎ | 11469/34278 [12:37:56<26:25:58, 4.17s/it] 33%|███▎ | 
11470/34278 [12:37:59<24:06:35, 3.81s/it] {'loss': 0.1394, 'grad_norm': 0.7887719592653158, 'learning_rate': 7.75790859061138e-06, 'epoch': 0.33} 33%|███▎ | 11470/34278 [12:37:59<24:06:35, 3.81s/it] 33%|███▎ | 11471/34278 [12:38:02<22:04:45, 3.49s/it] {'loss': 0.1366, 'grad_norm': 0.6941328921783568, 'learning_rate': 7.757514511116358e-06, 'epoch': 0.33} 33%|███▎ | 11471/34278 [12:38:02<22:04:45, 3.49s/it] 33%|███▎ | 11472/34278 [12:38:06<22:17:57, 3.52s/it] {'loss': 0.1467, 'grad_norm': 0.7424400834815343, 'learning_rate': 7.757120407002904e-06, 'epoch': 0.33} 33%|███▎ | 11472/34278 [12:38:06<22:17:57, 3.52s/it] 33%|███▎ | 11473/34278 [12:38:09<21:29:22, 3.39s/it] {'loss': 0.1444, 'grad_norm': 0.7666029824850172, 'learning_rate': 7.75672627827454e-06, 'epoch': 0.33} 33%|███▎ | 11473/34278 [12:38:09<21:29:22, 3.39s/it] 33%|███▎ | 11474/34278 [12:38:11<20:04:02, 3.17s/it] {'loss': 0.1442, 'grad_norm': 0.8542415574993383, 'learning_rate': 7.75633212493478e-06, 'epoch': 0.33} 33%|███▎ | 11474/34278 [12:38:11<20:04:02, 3.17s/it] 33%|███▎ | 11475/34278 [12:38:14<19:15:27, 3.04s/it] {'loss': 0.1456, 'grad_norm': 0.8128021005906819, 'learning_rate': 7.755937946987144e-06, 'epoch': 0.33} 33%|███▎ | 11475/34278 [12:38:14<19:15:27, 3.04s/it] 33%|███▎ | 11476/34278 [12:38:20<23:59:55, 3.79s/it] {'loss': 0.1631, 'grad_norm': 0.882984283684404, 'learning_rate': 7.755543744435153e-06, 'epoch': 0.33} 33%|███▎ | 11476/34278 [12:38:20<23:59:55, 3.79s/it] 33%|███▎ | 11477/34278 [12:38:24<25:39:51, 4.05s/it] {'loss': 0.1695, 'grad_norm': 0.8129933914221654, 'learning_rate': 7.755149517282325e-06, 'epoch': 0.33} 33%|███▎ | 11477/34278 [12:38:24<25:39:51, 4.05s/it] 33%|███▎ | 11478/34278 [12:38:28<24:26:34, 3.86s/it] {'loss': 0.144, 'grad_norm': 0.8860049986178933, 'learning_rate': 7.75475526553218e-06, 'epoch': 0.33} 33%|███▎ | 11478/34278 [12:38:28<24:26:34, 3.86s/it] 33%|███▎ | 11479/34278 [12:38:33<27:50:59, 4.40s/it] {'loss': 0.1469, 'grad_norm': 0.8283335356209518, 
'learning_rate': 7.754360989188237e-06, 'epoch': 0.33} 33%|███▎ | 11479/34278 [12:38:33<27:50:59, 4.40s/it] 33%|███▎ | 11480/34278 [12:38:37<25:31:50, 4.03s/it] {'loss': 0.1456, 'grad_norm': 0.635544225279093, 'learning_rate': 7.753966688254018e-06, 'epoch': 0.33} 33%|███▎ | 11480/34278 [12:38:37<25:31:50, 4.03s/it] 33%|███▎ | 11481/34278 [12:38:43<29:13:45, 4.62s/it] {'loss': 0.1827, 'grad_norm': 1.1415888434866515, 'learning_rate': 7.75357236273304e-06, 'epoch': 0.33} 33%|███▎ | 11481/34278 [12:38:43<29:13:45, 4.62s/it] 33%|███▎ | 11482/34278 [12:38:45<25:29:04, 4.02s/it] {'loss': 0.1437, 'grad_norm': 0.9347864709686792, 'learning_rate': 7.753178012628826e-06, 'epoch': 0.33} 33%|███▎ | 11482/34278 [12:38:45<25:29:04, 4.02s/it] 33%|███▎ | 11483/34278 [12:38:49<25:30:50, 4.03s/it] {'loss': 0.1429, 'grad_norm': 0.6804694142483877, 'learning_rate': 7.752783637944897e-06, 'epoch': 0.33} 33%|███▎ | 11483/34278 [12:38:49<25:30:50, 4.03s/it] 34%|███▎ | 11484/34278 [12:38:55<28:18:32, 4.47s/it] {'loss': 0.1703, 'grad_norm': 1.253721368214493, 'learning_rate': 7.752389238684773e-06, 'epoch': 0.34} 34%|███▎ | 11484/34278 [12:38:55<28:18:32, 4.47s/it] 34%|███▎ | 11485/34278 [12:38:58<25:19:23, 4.00s/it] {'loss': 0.1159, 'grad_norm': 1.0616431022335604, 'learning_rate': 7.751994814851973e-06, 'epoch': 0.34} 34%|███▎ | 11485/34278 [12:38:58<25:19:23, 4.00s/it] 34%|███▎ | 11486/34278 [12:39:03<28:12:01, 4.45s/it] {'loss': 0.158, 'grad_norm': 0.8984131766963803, 'learning_rate': 7.751600366450021e-06, 'epoch': 0.34} 34%|███▎ | 11486/34278 [12:39:03<28:12:01, 4.45s/it] 34%|███▎ | 11487/34278 [12:39:06<25:53:28, 4.09s/it] {'loss': 0.1545, 'grad_norm': 0.9184852145861464, 'learning_rate': 7.751205893482438e-06, 'epoch': 0.34} 34%|███▎ | 11487/34278 [12:39:06<25:53:28, 4.09s/it] 34%|███▎ | 11488/34278 [12:39:10<24:42:12, 3.90s/it] {'loss': 0.1354, 'grad_norm': 0.7671535794109193, 'learning_rate': 7.750811395952745e-06, 'epoch': 0.34} 34%|███▎ | 11488/34278 [12:39:10<24:42:12, 
3.90s/it] 34%|███▎ | 11489/34278 [12:39:13<23:27:44, 3.71s/it] {'loss': 0.1406, 'grad_norm': 0.8323228137742448, 'learning_rate': 7.750416873864464e-06, 'epoch': 0.34} 34%|███▎ | 11489/34278 [12:39:13<23:27:44, 3.71s/it] 34%|███▎ | 11490/34278 [12:39:17<23:35:39, 3.73s/it] {'loss': 0.1619, 'grad_norm': 0.777114115614653, 'learning_rate': 7.75002232722112e-06, 'epoch': 0.34} 34%|███▎ | 11490/34278 [12:39:17<23:35:39, 3.73s/it] 34%|███▎ | 11491/34278 [12:39:22<25:31:28, 4.03s/it] {'loss': 0.1614, 'grad_norm': 0.8828507213690756, 'learning_rate': 7.749627756026232e-06, 'epoch': 0.34} 34%|███▎ | 11491/34278 [12:39:22<25:31:28, 4.03s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 34%|███▎ | 11492/34278 [12:39:25<23:40:04, 3.74s/it] {'loss': 0.1747, 'grad_norm': 0.790047056780413, 'learning_rate': 7.749233160283323e-06, 'epoch': 0.34} 34%|███▎ | 11492/34278 [12:39:25<23:40:04, 3.74s/it] 34%|███▎ | 11493/34278 [12:39:28<22:30:02, 3.56s/it] {'loss': 0.1744, 'grad_norm': 0.9592747218110751, 'learning_rate': 7.748838539995918e-06, 'epoch': 0.34} 34%|███▎ | 11493/34278 [12:39:28<22:30:02, 3.56s/it] 34%|███▎ | 11494/34278 [12:39:31<22:07:23, 3.50s/it] {'loss': 0.1469, 'grad_norm': 0.7547276464424753, 'learning_rate': 7.748443895167539e-06, 'epoch': 0.34} 34%|███▎ | 11494/34278 [12:39:31<22:07:23, 3.50s/it] 34%|███▎ | 11495/34278 [12:39:34<20:23:23, 3.22s/it] {'loss': 0.1523, 'grad_norm': 0.7952485905164294, 'learning_rate': 7.748049225801706e-06, 'epoch': 0.34} 34%|███▎ | 11495/34278 [12:39:34<20:23:23, 3.22s/it] 34%|███▎ | 11496/34278 [12:39:40<25:53:20, 4.09s/it] {'loss': 0.1455, 'grad_norm': 1.1703727583077554, 'learning_rate': 7.747654531901949e-06, 'epoch': 0.34} 34%|███▎ | 11496/34278 [12:39:40<25:53:20, 4.09s/it] 34%|███▎ | 11497/34278 [12:39:43<24:50:11, 3.92s/it] {'loss': 0.1323, 'grad_norm': 
0.7693174535264024, 'learning_rate': 7.747259813471786e-06, 'epoch': 0.34} 34%|███▎ | 11497/34278 [12:39:43<24:50:11, 3.92s/it] 34%|███▎ | 11498/34278 [12:39:46<22:45:47, 3.60s/it] {'loss': 0.1522, 'grad_norm': 0.7369482111349072, 'learning_rate': 7.746865070514744e-06, 'epoch': 0.34} 34%|███▎ | 11498/34278 [12:39:46<22:45:47, 3.60s/it] 34%|███▎ | 11499/34278 [12:39:50<22:20:31, 3.53s/it] {'loss': 0.1771, 'grad_norm': 0.9564162516945413, 'learning_rate': 7.746470303034347e-06, 'epoch': 0.34} 34%|███▎ | 11499/34278 [12:39:50<22:20:31, 3.53s/it] 34%|███▎ | 11500/34278 [12:39:52<20:58:08, 3.31s/it] {'loss': 0.1578, 'grad_norm': 0.8108868752021353, 'learning_rate': 7.746075511034119e-06, 'epoch': 0.34} 34%|███▎ | 11500/34278 [12:39:52<20:58:08, 3.31s/it] 34%|███▎ | 11501/34278 [12:39:56<21:44:45, 3.44s/it] {'loss': 0.1497, 'grad_norm': 0.7785537166136959, 'learning_rate': 7.745680694517582e-06, 'epoch': 0.34} 34%|███▎ | 11501/34278 [12:39:56<21:44:45, 3.44s/it] 34%|███▎ | 11502/34278 [12:40:03<27:26:38, 4.34s/it] {'loss': 0.1813, 'grad_norm': 0.958205389114564, 'learning_rate': 7.745285853488264e-06, 'epoch': 0.34} 34%|███▎ | 11502/34278 [12:40:03<27:26:38, 4.34s/it] 34%|███▎ | 11503/34278 [12:40:06<25:54:27, 4.10s/it] {'loss': 0.1475, 'grad_norm': 0.9537185309746866, 'learning_rate': 7.74489098794969e-06, 'epoch': 0.34} 34%|███▎ | 11503/34278 [12:40:06<25:54:27, 4.10s/it] 34%|███▎ | 11504/34278 [12:40:09<24:13:08, 3.83s/it] {'loss': 0.1361, 'grad_norm': 0.6389980035757383, 'learning_rate': 7.744496097905385e-06, 'epoch': 0.34} 34%|███▎ | 11504/34278 [12:40:09<24:13:08, 3.83s/it] 34%|███▎ | 11505/34278 [12:40:12<22:28:35, 3.55s/it] {'loss': 0.1943, 'grad_norm': 0.800956536064156, 'learning_rate': 7.744101183358874e-06, 'epoch': 0.34} 34%|███▎ | 11505/34278 [12:40:12<22:28:35, 3.55s/it] 34%|███▎ | 11506/34278 [12:40:16<22:02:51, 3.49s/it] {'loss': 0.1681, 'grad_norm': 0.923807856964023, 'learning_rate': 7.743706244313682e-06, 'epoch': 0.34} 34%|███▎ | 11506/34278 
[12:40:16<22:02:51, 3.49s/it] 34%|███▎ | 11507/34278 [12:40:19<21:46:30, 3.44s/it] {'loss': 0.1356, 'grad_norm': 0.7330873874583181, 'learning_rate': 7.743311280773335e-06, 'epoch': 0.34} 34%|███▎ | 11507/34278 [12:40:19<21:46:30, 3.44s/it] 34%|███▎ | 11508/34278 [12:40:22<21:52:59, 3.46s/it] {'loss': 0.1694, 'grad_norm': 0.7937806466044719, 'learning_rate': 7.742916292741363e-06, 'epoch': 0.34} 34%|███▎ | 11508/34278 [12:40:22<21:52:59, 3.46s/it] 34%|███▎ | 11509/34278 [12:40:26<21:05:47, 3.34s/it] {'loss': 0.1381, 'grad_norm': 1.0402736819907141, 'learning_rate': 7.742521280221286e-06, 'epoch': 0.34} 34%|███▎ | 11509/34278 [12:40:26<21:05:47, 3.34s/it] 34%|███▎ | 11510/34278 [12:40:28<19:52:55, 3.14s/it] {'loss': 0.1355, 'grad_norm': 0.6845691795976966, 'learning_rate': 7.742126243216635e-06, 'epoch': 0.34} 34%|███▎ | 11510/34278 [12:40:28<19:52:55, 3.14s/it] 34%|███▎ | 11511/34278 [12:40:32<20:50:11, 3.29s/it] {'loss': 0.1573, 'grad_norm': 0.9618473390404858, 'learning_rate': 7.741731181730933e-06, 'epoch': 0.34} 34%|███▎ | 11511/34278 [12:40:32<20:50:11, 3.29s/it] 34%|███▎ | 11512/34278 [12:40:35<19:59:54, 3.16s/it] {'loss': 0.1521, 'grad_norm': 0.9770323611181168, 'learning_rate': 7.741336095767713e-06, 'epoch': 0.34} 34%|███▎ | 11512/34278 [12:40:35<19:59:54, 3.16s/it] 34%|███▎ | 11513/34278 [12:40:38<19:31:44, 3.09s/it] {'loss': 0.1747, 'grad_norm': 0.7055360652263694, 'learning_rate': 7.740940985330497e-06, 'epoch': 0.34} 34%|███▎ | 11513/34278 [12:40:38<19:31:44, 3.09s/it] 34%|███▎ | 11514/34278 [12:40:41<19:29:16, 3.08s/it] {'loss': 0.1533, 'grad_norm': 0.9706015849531999, 'learning_rate': 7.740545850422813e-06, 'epoch': 0.34} 34%|███▎ | 11514/34278 [12:40:41<19:29:16, 3.08s/it] 34%|███▎ | 11515/34278 [12:40:44<19:25:42, 3.07s/it] {'loss': 0.1536, 'grad_norm': 0.8807825946758363, 'learning_rate': 7.740150691048192e-06, 'epoch': 0.34} 34%|███▎ | 11515/34278 [12:40:44<19:25:42, 3.07s/it] 34%|███▎ | 11516/34278 [12:40:47<19:11:59, 3.04s/it] {'loss': 0.1347, 
'grad_norm': 0.8525706200236469, 'learning_rate': 7.73975550721016e-06, 'epoch': 0.34} 34%|███▎ | 11516/34278 [12:40:47<19:11:59, 3.04s/it] 34%|███▎ | 11517/34278 [12:40:52<23:36:56, 3.74s/it] {'loss': 0.1617, 'grad_norm': 0.7138250781620418, 'learning_rate': 7.739360298912243e-06, 'epoch': 0.34} 34%|███▎ | 11517/34278 [12:40:52<23:36:56, 3.74s/it] 34%|███▎ | 11518/34278 [12:40:58<27:26:12, 4.34s/it] {'loss': 0.1487, 'grad_norm': 0.8366224157733974, 'learning_rate': 7.738965066157973e-06, 'epoch': 0.34} 34%|███▎ | 11518/34278 [12:40:58<27:26:12, 4.34s/it] 34%|███▎ | 11519/34278 [12:41:01<25:30:21, 4.03s/it] {'loss': 0.1802, 'grad_norm': 0.9277490964824728, 'learning_rate': 7.738569808950875e-06, 'epoch': 0.34} 34%|███▎ | 11519/34278 [12:41:01<25:30:21, 4.03s/it] 34%|███▎ | 11520/34278 [12:41:06<26:37:00, 4.21s/it] {'loss': 0.1388, 'grad_norm': 0.8420714910620926, 'learning_rate': 7.738174527294481e-06, 'epoch': 0.34} 34%|███▎ | 11520/34278 [12:41:06<26:37:00, 4.21s/it] 34%|███▎ | 11521/34278 [12:41:08<23:48:20, 3.77s/it] {'loss': 0.1344, 'grad_norm': 0.8219743641842782, 'learning_rate': 7.737779221192317e-06, 'epoch': 0.34} 34%|███▎ | 11521/34278 [12:41:08<23:48:20, 3.77s/it] 34%|███▎ | 11522/34278 [12:41:14<27:58:08, 4.42s/it] {'loss': 0.1721, 'grad_norm': 0.8191873161261526, 'learning_rate': 7.737383890647915e-06, 'epoch': 0.34} 34%|███▎ | 11522/34278 [12:41:14<27:58:08, 4.42s/it] 34%|███▎ | 11523/34278 [12:41:20<30:49:04, 4.88s/it] {'loss': 0.1642, 'grad_norm': 0.7910467427290071, 'learning_rate': 7.736988535664803e-06, 'epoch': 0.34} 34%|███▎ | 11523/34278 [12:41:20<30:49:04, 4.88s/it] 34%|███▎ | 11524/34278 [12:41:24<27:47:26, 4.40s/it] {'loss': 0.1749, 'grad_norm': 0.9137471482546433, 'learning_rate': 7.73659315624651e-06, 'epoch': 0.34} 34%|███▎ | 11524/34278 [12:41:24<27:47:26, 4.40s/it] 34%|███▎ | 11525/34278 [12:41:30<30:44:27, 4.86s/it] {'loss': 0.1739, 'grad_norm': 0.8395428256052949, 'learning_rate': 7.736197752396566e-06, 'epoch': 0.34} 34%|███▎ | 
11525/34278 [12:41:30<30:44:27, 4.86s/it] 34%|███▎ | 11526/34278 [12:41:33<28:09:58, 4.46s/it] {'loss': 0.1773, 'grad_norm': 0.7931969987369323, 'learning_rate': 7.735802324118503e-06, 'epoch': 0.34} 34%|███▎ | 11526/34278 [12:41:33<28:09:58, 4.46s/it] 34%|███▎ | 11527/34278 [12:41:36<25:29:14, 4.03s/it] {'loss': 0.162, 'grad_norm': 0.760285127003303, 'learning_rate': 7.73540687141585e-06, 'epoch': 0.34} 34%|███▎ | 11527/34278 [12:41:36<25:29:14, 4.03s/it] 34%|███▎ | 11528/34278 [12:41:39<23:19:49, 3.69s/it] {'loss': 0.1632, 'grad_norm': 0.7667508518337527, 'learning_rate': 7.735011394292136e-06, 'epoch': 0.34} 34%|███▎ | 11528/34278 [12:41:39<23:19:49, 3.69s/it] 34%|███▎ | 11529/34278 [12:41:42<22:48:58, 3.61s/it] {'loss': 0.1462, 'grad_norm': 0.659323906184719, 'learning_rate': 7.734615892750895e-06, 'epoch': 0.34} 34%|███▎ | 11529/34278 [12:41:42<22:48:58, 3.61s/it] 34%|███▎ | 11530/34278 [12:41:47<23:43:18, 3.75s/it] {'loss': 0.1387, 'grad_norm': 0.7444712343622968, 'learning_rate': 7.734220366795655e-06, 'epoch': 0.34} 34%|███▎ | 11530/34278 [12:41:47<23:43:18, 3.75s/it] 34%|███▎ | 11531/34278 [12:41:50<23:23:31, 3.70s/it] {'loss': 0.1288, 'grad_norm': 0.9170292521706552, 'learning_rate': 7.733824816429948e-06, 'epoch': 0.34} 34%|███▎ | 11531/34278 [12:41:50<23:23:31, 3.70s/it] 34%|███▎ | 11532/34278 [12:41:54<22:51:03, 3.62s/it] {'loss': 0.1354, 'grad_norm': 0.8803357947564434, 'learning_rate': 7.733429241657306e-06, 'epoch': 0.34} 34%|███▎ | 11532/34278 [12:41:54<22:51:03, 3.62s/it] 34%|███▎ | 11533/34278 [12:41:59<27:13:22, 4.31s/it] {'loss': 0.1425, 'grad_norm': 0.8195762284716492, 'learning_rate': 7.73303364248126e-06, 'epoch': 0.34} 34%|███▎ | 11533/34278 [12:41:59<27:13:22, 4.31s/it] 34%|███▎ | 11534/34278 [12:42:02<24:35:49, 3.89s/it] {'loss': 0.1501, 'grad_norm': 0.778225354573916, 'learning_rate': 7.732638018905343e-06, 'epoch': 0.34} 34%|███▎ | 11534/34278 [12:42:02<24:35:49, 3.89s/it] 34%|███▎ | 11535/34278 [12:42:08<27:30:29, 4.35s/it] {'loss': 
0.1262, 'grad_norm': 0.8929304391087508, 'learning_rate': 7.732242370933085e-06, 'epoch': 0.34} 34%|███▎ | 11535/34278 [12:42:08<27:30:29, 4.35s/it] 34%|███▎ | 11536/34278 [12:42:11<25:54:18, 4.10s/it] {'loss': 0.1244, 'grad_norm': 0.7725661324199767, 'learning_rate': 7.731846698568021e-06, 'epoch': 0.34} 34%|███▎ | 11536/34278 [12:42:11<25:54:18, 4.10s/it] 34%|███▎ | 11537/34278 [12:42:14<23:23:01, 3.70s/it] {'loss': 0.1502, 'grad_norm': 0.8611246450752171, 'learning_rate': 7.73145100181368e-06, 'epoch': 0.34} 34%|███▎ | 11537/34278 [12:42:14<23:23:01, 3.70s/it] 34%|███▎ | 11538/34278 [12:42:17<21:49:23, 3.45s/it] {'loss': 0.1326, 'grad_norm': 0.9232382153814258, 'learning_rate': 7.731055280673598e-06, 'epoch': 0.34} 34%|███▎ | 11538/34278 [12:42:17<21:49:23, 3.45s/it] 34%|███▎ | 11539/34278 [12:42:20<21:12:39, 3.36s/it] {'loss': 0.1585, 'grad_norm': 0.80485965733028, 'learning_rate': 7.730659535151306e-06, 'epoch': 0.34} 34%|███▎ | 11539/34278 [12:42:20<21:12:39, 3.36s/it] 34%|███▎ | 11540/34278 [12:42:24<21:40:13, 3.43s/it] {'loss': 0.1658, 'grad_norm': 0.7852042836798314, 'learning_rate': 7.730263765250337e-06, 'epoch': 0.34} 34%|███▎ | 11540/34278 [12:42:24<21:40:13, 3.43s/it] 34%|███▎ | 11541/34278 [12:42:27<21:04:28, 3.34s/it] {'loss': 0.1592, 'grad_norm': 0.8940431546891692, 'learning_rate': 7.729867970974223e-06, 'epoch': 0.34} 34%|███▎ | 11541/34278 [12:42:27<21:04:28, 3.34s/it] 34%|███▎ | 11542/34278 [12:42:30<20:56:37, 3.32s/it] {'loss': 0.117, 'grad_norm': 0.8670945497825935, 'learning_rate': 7.729472152326503e-06, 'epoch': 0.34} 34%|███▎ | 11542/34278 [12:42:30<20:56:37, 3.32s/it] 34%|███▎ | 11543/34278 [12:42:34<22:00:25, 3.48s/it] {'loss': 0.1563, 'grad_norm': 0.8225740785398498, 'learning_rate': 7.729076309310704e-06, 'epoch': 0.34} 34%|███▎ | 11543/34278 [12:42:34<22:00:25, 3.48s/it] 34%|███▎ | 11544/34278 [12:42:38<23:08:58, 3.67s/it] {'loss': 0.1558, 'grad_norm': 1.127576641006544, 'learning_rate': 7.728680441930366e-06, 'epoch': 0.34} 34%|███▎ 
| 11544/34278 [12:42:38<23:08:58, 3.67s/it] 34%|███▎ | 11545/34278 [12:42:41<22:33:48, 3.57s/it] {'loss': 0.1561, 'grad_norm': 0.8866329896476555, 'learning_rate': 7.72828455018902e-06, 'epoch': 0.34} 34%|███▎ | 11545/34278 [12:42:41<22:33:48, 3.57s/it] 34%|███▎ | 11546/34278 [12:42:45<22:49:26, 3.61s/it] {'loss': 0.1528, 'grad_norm': 0.8556623051275783, 'learning_rate': 7.727888634090199e-06, 'epoch': 0.34} 34%|███▎ | 11546/34278 [12:42:45<22:49:26, 3.61s/it] 34%|███▎ | 11547/34278 [12:42:49<23:19:50, 3.69s/it] {'loss': 0.142, 'grad_norm': 0.9138167291437244, 'learning_rate': 7.72749269363744e-06, 'epoch': 0.34} 34%|███▎ | 11547/34278 [12:42:49<23:19:50, 3.69s/it] 34%|███▎ | 11548/34278 [12:42:52<21:51:10, 3.46s/it] {'loss': 0.1659, 'grad_norm': 0.9106381062019776, 'learning_rate': 7.727096728834278e-06, 'epoch': 0.34} 34%|███▎ | 11548/34278 [12:42:52<21:51:10, 3.46s/it] 34%|███▎ | 11549/34278 [12:42:55<20:35:19, 3.26s/it] {'loss': 0.159, 'grad_norm': 0.9148809918401275, 'learning_rate': 7.726700739684247e-06, 'epoch': 0.34} 34%|███▎ | 11549/34278 [12:42:55<20:35:19, 3.26s/it] 34%|███▎ | 11550/34278 [12:43:01<25:42:47, 4.07s/it] {'loss': 0.1596, 'grad_norm': 0.9319620451235113, 'learning_rate': 7.726304726190884e-06, 'epoch': 0.34} 34%|███▎ | 11550/34278 [12:43:01<25:42:47, 4.07s/it] 34%|███▎ | 11551/34278 [12:43:04<24:13:29, 3.84s/it] {'loss': 0.1625, 'grad_norm': 0.9805622581121656, 'learning_rate': 7.725908688357722e-06, 'epoch': 0.34} 34%|███▎ | 11551/34278 [12:43:04<24:13:29, 3.84s/it] 34%|███▎ | 11552/34278 [12:43:07<22:52:09, 3.62s/it] {'loss': 0.1566, 'grad_norm': 0.7912166294319537, 'learning_rate': 7.725512626188299e-06, 'epoch': 0.34} 34%|███▎ | 11552/34278 [12:43:07<22:52:09, 3.62s/it] 34%|███▎ | 11553/34278 [12:43:11<22:47:18, 3.61s/it] {'loss': 0.1409, 'grad_norm': 0.9273319767245617, 'learning_rate': 7.725116539686148e-06, 'epoch': 0.34} 34%|███▎ | 11553/34278 [12:43:11<22:47:18, 3.61s/it] 34%|███▎ | 11554/34278 [12:43:17<27:09:45, 4.30s/it] 
{'loss': 0.1334, 'grad_norm': 0.7738242382317029, 'learning_rate': 7.72472042885481e-06, 'epoch': 0.34} 34%|███▎ | 11554/34278 [12:43:17<27:09:45, 4.30s/it] 34%|███▎ | 11555/34278 [12:43:20<26:02:44, 4.13s/it] {'loss': 0.1564, 'grad_norm': 0.9688612676545774, 'learning_rate': 7.724324293697816e-06, 'epoch': 0.34} 34%|███▎ | 11555/34278 [12:43:20<26:02:44, 4.13s/it] 34%|███▎ | 11556/34278 [12:43:24<25:06:34, 3.98s/it] {'loss': 0.1633, 'grad_norm': 0.9992497320618804, 'learning_rate': 7.723928134218705e-06, 'epoch': 0.34} 34%|███▎ | 11556/34278 [12:43:24<25:06:34, 3.98s/it] 34%|███▎ | 11557/34278 [12:43:29<26:58:43, 4.27s/it] {'loss': 0.158, 'grad_norm': 0.8453637314931073, 'learning_rate': 7.723531950421014e-06, 'epoch': 0.34} 34%|███▎ | 11557/34278 [12:43:29<26:58:43, 4.27s/it] 34%|███▎ | 11558/34278 [12:43:33<26:16:05, 4.16s/it] {'loss': 0.1397, 'grad_norm': 0.9307297852232389, 'learning_rate': 7.72313574230828e-06, 'epoch': 0.34} 34%|███▎ | 11558/34278 [12:43:33<26:16:05, 4.16s/it] 34%|███▎ | 11559/34278 [12:43:39<30:31:32, 4.84s/it] {'loss': 0.1594, 'grad_norm': 0.8159144809842509, 'learning_rate': 7.722739509884042e-06, 'epoch': 0.34} 34%|███▎ | 11559/34278 [12:43:39<30:31:32, 4.84s/it] 34%|███▎ | 11560/34278 [12:43:44<29:39:48, 4.70s/it] {'loss': 0.1831, 'grad_norm': 0.8272914173842891, 'learning_rate': 7.722343253151834e-06, 'epoch': 0.34} 34%|███▎ | 11560/34278 [12:43:44<29:39:48, 4.70s/it] 34%|███▎ | 11561/34278 [12:43:47<26:29:26, 4.20s/it] {'loss': 0.1327, 'grad_norm': 0.8692708278798568, 'learning_rate': 7.721946972115196e-06, 'epoch': 0.34} 34%|███▎ | 11561/34278 [12:43:47<26:29:26, 4.20s/it] 34%|███▎ | 11562/34278 [12:43:50<25:40:57, 4.07s/it] {'loss': 0.1526, 'grad_norm': 1.0081766452662235, 'learning_rate': 7.721550666777664e-06, 'epoch': 0.34} 34%|███▎ | 11562/34278 [12:43:50<25:40:57, 4.07s/it] 34%|███▎ | 11563/34278 [12:43:54<24:08:21, 3.83s/it] {'loss': 0.1307, 'grad_norm': 0.8039508529224917, 'learning_rate': 7.721154337142778e-06, 'epoch': 
0.34} 34%|███▎ | 11563/34278 [12:43:54<24:08:21, 3.83s/it] 34%|███▎ | 11564/34278 [12:43:57<22:31:42, 3.57s/it] {'loss': 0.1278, 'grad_norm': 0.8050979085130708, 'learning_rate': 7.720757983214076e-06, 'epoch': 0.34} 34%|███▎ | 11564/34278 [12:43:57<22:31:42, 3.57s/it] 34%|███▎ | 11565/34278 [12:44:00<22:15:34, 3.53s/it] {'loss': 0.1462, 'grad_norm': 0.6830516599371543, 'learning_rate': 7.720361604995097e-06, 'epoch': 0.34} 34%|███▎ | 11565/34278 [12:44:00<22:15:34, 3.53s/it] 34%|███▎ | 11566/34278 [12:44:03<21:12:37, 3.36s/it] {'loss': 0.1356, 'grad_norm': 0.8294060227749831, 'learning_rate': 7.719965202489377e-06, 'epoch': 0.34} 34%|███▎ | 11566/34278 [12:44:03<21:12:37, 3.36s/it] 34%|███▎ | 11567/34278 [12:44:07<21:40:34, 3.44s/it] {'loss': 0.1421, 'grad_norm': 0.8635004014556787, 'learning_rate': 7.71956877570046e-06, 'epoch': 0.34} 34%|███▎ | 11567/34278 [12:44:07<21:40:34, 3.44s/it] 34%|███▎ | 11568/34278 [12:44:11<22:29:20, 3.56s/it] {'loss': 0.131, 'grad_norm': 0.7720892884340611, 'learning_rate': 7.719172324631878e-06, 'epoch': 0.34} 34%|███▎ | 11568/34278 [12:44:11<22:29:20, 3.56s/it] 34%|███▍ | 11569/34278 [12:44:14<21:29:56, 3.41s/it] {'loss': 0.1339, 'grad_norm': 0.7483209597107527, 'learning_rate': 7.718775849287178e-06, 'epoch': 0.34} 34%|███▍ | 11569/34278 [12:44:14<21:29:56, 3.41s/it] 34%|███▍ | 11570/34278 [12:44:17<21:37:24, 3.43s/it] {'loss': 0.1766, 'grad_norm': 1.015752418732121, 'learning_rate': 7.718379349669893e-06, 'epoch': 0.34} 34%|███▍ | 11570/34278 [12:44:17<21:37:24, 3.43s/it] 34%|███▍ | 11571/34278 [12:44:20<20:27:54, 3.24s/it] {'loss': 0.1597, 'grad_norm': 1.0326672410961844, 'learning_rate': 7.71798282578357e-06, 'epoch': 0.34} 34%|███▍ | 11571/34278 [12:44:20<20:27:54, 3.24s/it] 34%|███▍ | 11572/34278 [12:44:23<20:36:56, 3.27s/it] {'loss': 0.1536, 'grad_norm': 0.9156662847768858, 'learning_rate': 7.717586277631744e-06, 'epoch': 0.34} 34%|███▍ | 11572/34278 [12:44:23<20:36:56, 3.27s/it] 34%|███▍ | 11573/34278 [12:44:26<19:51:33, 
3.15s/it] {'loss': 0.1333, 'grad_norm': 0.9008508855862385, 'learning_rate': 7.717189705217954e-06, 'epoch': 0.34} 34%|███▍ | 11573/34278 [12:44:26<19:51:33, 3.15s/it] 34%|███▍ | 11574/34278 [12:44:29<20:15:15, 3.21s/it] {'loss': 0.1627, 'grad_norm': 1.0395423797248333, 'learning_rate': 7.716793108545745e-06, 'epoch': 0.34} 34%|███▍ | 11574/34278 [12:44:29<20:15:15, 3.21s/it] 34%|███▍ | 11575/34278 [12:44:33<20:12:58, 3.21s/it] {'loss': 0.1613, 'grad_norm': 0.9719764033060374, 'learning_rate': 7.716396487618655e-06, 'epoch': 0.34} 34%|███▍ | 11575/34278 [12:44:33<20:12:58, 3.21s/it] 34%|███▍ | 11576/34278 [12:44:36<20:34:17, 3.26s/it] {'loss': 0.1527, 'grad_norm': 0.9309991623526762, 'learning_rate': 7.715999842440225e-06, 'epoch': 0.34} 34%|███▍ | 11576/34278 [12:44:36<20:34:17, 3.26s/it] 34%|███▍ | 11577/34278 [12:44:40<21:51:34, 3.47s/it] {'loss': 0.198, 'grad_norm': 0.9842918455596753, 'learning_rate': 7.715603173013999e-06, 'epoch': 0.34} 34%|███▍ | 11577/34278 [12:44:40<21:51:34, 3.47s/it] 34%|███▍ | 11578/34278 [12:44:43<21:24:27, 3.40s/it] {'loss': 0.1587, 'grad_norm': 1.4516828031751667, 'learning_rate': 7.715206479343516e-06, 'epoch': 0.34} 34%|███▍ | 11578/34278 [12:44:43<21:24:27, 3.40s/it] 34%|███▍ | 11579/34278 [12:44:47<22:14:44, 3.53s/it] {'loss': 0.1418, 'grad_norm': 0.9777438508037208, 'learning_rate': 7.714809761432317e-06, 'epoch': 0.34} 34%|███▍ | 11579/34278 [12:44:47<22:14:44, 3.53s/it] 34%|███▍ | 11580/34278 [12:44:51<22:26:27, 3.56s/it] {'loss': 0.1748, 'grad_norm': 0.9632788395810549, 'learning_rate': 7.714413019283942e-06, 'epoch': 0.34} 34%|███▍ | 11580/34278 [12:44:51<22:26:27, 3.56s/it] 34%|███▍ | 11581/34278 [12:44:54<22:32:12, 3.57s/it] {'loss': 0.1432, 'grad_norm': 0.8574549089734549, 'learning_rate': 7.714016252901939e-06, 'epoch': 0.34} 34%|███▍ | 11581/34278 [12:44:54<22:32:12, 3.57s/it] 34%|███▍ | 11582/34278 [12:44:58<22:00:10, 3.49s/it] {'loss': 0.1449, 'grad_norm': 0.9000887676744005, 'learning_rate': 7.713619462289846e-06, 
'epoch': 0.34} 34%|███▍ | 11582/34278 [12:44:58<22:00:10, 3.49s/it] 34%|███▍ | 11583/34278 [12:45:01<22:34:04, 3.58s/it] {'loss': 0.1383, 'grad_norm': 0.9491616826860819, 'learning_rate': 7.713222647451203e-06, 'epoch': 0.34} 34%|███▍ | 11583/34278 [12:45:01<22:34:04, 3.58s/it] 34%|███▍ | 11584/34278 [12:45:04<21:15:10, 3.37s/it] {'loss': 0.1482, 'grad_norm': 0.8886926381518871, 'learning_rate': 7.71282580838956e-06, 'epoch': 0.34} 34%|███▍ | 11584/34278 [12:45:04<21:15:10, 3.37s/it] 34%|███▍ | 11585/34278 [12:45:07<20:39:01, 3.28s/it] {'loss': 0.1478, 'grad_norm': 0.7507167077278053, 'learning_rate': 7.712428945108454e-06, 'epoch': 0.34} 34%|███▍ | 11585/34278 [12:45:07<20:39:01, 3.28s/it] 34%|███▍ | 11586/34278 [12:45:13<26:01:10, 4.13s/it] {'loss': 0.1477, 'grad_norm': 0.9189021330503667, 'learning_rate': 7.712032057611431e-06, 'epoch': 0.34} 34%|███▍ | 11586/34278 [12:45:13<26:01:10, 4.13s/it] 34%|███▍ | 11587/34278 [12:45:17<24:23:39, 3.87s/it] {'loss': 0.1583, 'grad_norm': 1.0128260734012735, 'learning_rate': 7.711635145902032e-06, 'epoch': 0.34} 34%|███▍ | 11587/34278 [12:45:17<24:23:39, 3.87s/it] 34%|███▍ | 11588/34278 [12:45:20<23:12:45, 3.68s/it] {'loss': 0.1577, 'grad_norm': 0.8575179334345707, 'learning_rate': 7.711238209983802e-06, 'epoch': 0.34} 34%|███▍ | 11588/34278 [12:45:20<23:12:45, 3.68s/it] 34%|███▍ | 11589/34278 [12:45:23<21:39:41, 3.44s/it] {'loss': 0.1535, 'grad_norm': 0.9998616785880221, 'learning_rate': 7.710841249860286e-06, 'epoch': 0.34} 34%|███▍ | 11589/34278 [12:45:23<21:39:41, 3.44s/it] 34%|███▍ | 11590/34278 [12:45:27<22:33:02, 3.58s/it] {'loss': 0.1438, 'grad_norm': 0.9767503548015507, 'learning_rate': 7.710444265535024e-06, 'epoch': 0.34} 34%|███▍ | 11590/34278 [12:45:27<22:33:02, 3.58s/it] 34%|███▍ | 11591/34278 [12:45:30<22:04:21, 3.50s/it] {'loss': 0.1472, 'grad_norm': 0.7505459740062331, 'learning_rate': 7.710047257011564e-06, 'epoch': 0.34} 34%|███▍ | 11591/34278 [12:45:30<22:04:21, 3.50s/it] 34%|███▍ | 11592/34278 
[12:45:33<21:49:33, 3.46s/it] {'loss': 0.1196, 'grad_norm': 0.989067891001346, 'learning_rate': 7.709650224293449e-06, 'epoch': 0.34} 34%|███▍ | 11592/34278 [12:45:33<21:49:33, 3.46s/it] 34%|███▍ | 11593/34278 [12:45:38<24:21:08, 3.86s/it] {'loss': 0.1322, 'grad_norm': 0.8147623034459972, 'learning_rate': 7.709253167384223e-06, 'epoch': 0.34} 34%|███▍ | 11593/34278 [12:45:38<24:21:08, 3.86s/it] 34%|███▍ | 11594/34278 [12:45:45<29:00:53, 4.60s/it] {'loss': 0.145, 'grad_norm': 0.7033094865173153, 'learning_rate': 7.708856086287432e-06, 'epoch': 0.34} 34%|███▍ | 11594/34278 [12:45:45<29:00:53, 4.60s/it] 34%|███▍ | 11595/34278 [12:45:48<26:13:06, 4.16s/it] {'loss': 0.1682, 'grad_norm': 1.0309814820693728, 'learning_rate': 7.708458981006621e-06, 'epoch': 0.34} 34%|███▍ | 11595/34278 [12:45:48<26:13:06, 4.16s/it] 34%|███▍ | 11596/34278 [12:45:51<24:09:34, 3.83s/it] {'loss': 0.1753, 'grad_norm': 0.775960565931343, 'learning_rate': 7.708061851545334e-06, 'epoch': 0.34} 34%|███▍ | 11596/34278 [12:45:51<24:09:34, 3.83s/it] 34%|███▍ | 11597/34278 [12:45:57<28:22:59, 4.51s/it] {'loss': 0.1498, 'grad_norm': 0.8237789546111267, 'learning_rate': 7.707664697907117e-06, 'epoch': 0.34} 34%|███▍ | 11597/34278 [12:45:57<28:22:59, 4.51s/it] 34%|███▍ | 11598/34278 [12:46:00<25:13:02, 4.00s/it] {'loss': 0.1742, 'grad_norm': 0.9349919105975405, 'learning_rate': 7.707267520095515e-06, 'epoch': 0.34} 34%|███▍ | 11598/34278 [12:46:00<25:13:02, 4.00s/it] 34%|███▍ | 11599/34278 [12:46:02<22:52:24, 3.63s/it] {'loss': 0.1268, 'grad_norm': 1.0083016500343505, 'learning_rate': 7.70687031811408e-06, 'epoch': 0.34} 34%|███▍ | 11599/34278 [12:46:02<22:52:24, 3.63s/it] 34%|███▍ | 11600/34278 [12:46:05<21:51:05, 3.47s/it] {'loss': 0.118, 'grad_norm': 0.697713882631908, 'learning_rate': 7.706473091966347e-06, 'epoch': 0.34} 34%|███▍ | 11600/34278 [12:46:05<21:51:05, 3.47s/it] 34%|███▍ | 11601/34278 [12:46:09<21:58:00, 3.49s/it] {'loss': 0.1489, 'grad_norm': 0.7643882171336065, 'learning_rate': 
7.706075841655871e-06, 'epoch': 0.34} 34%|███▍ | 11601/34278 [12:46:09<21:58:00, 3.49s/it] 34%|███▍ | 11602/34278 [12:46:13<22:06:12, 3.51s/it] {'loss': 0.1532, 'grad_norm': 1.0678200592313596, 'learning_rate': 7.705678567186195e-06, 'epoch': 0.34} 34%|███▍ | 11602/34278 [12:46:13<22:06:12, 3.51s/it] 34%|███▍ | 11603/34278 [12:46:16<21:58:54, 3.49s/it] {'loss': 0.1302, 'grad_norm': 0.8535030518138836, 'learning_rate': 7.705281268560866e-06, 'epoch': 0.34} 34%|███▍ | 11603/34278 [12:46:16<21:58:54, 3.49s/it] 34%|███▍ | 11604/34278 [12:46:22<27:28:51, 4.36s/it] {'loss': 0.1661, 'grad_norm': 0.7600060163055182, 'learning_rate': 7.704883945783435e-06, 'epoch': 0.34} 34%|███▍ | 11604/34278 [12:46:22<27:28:51, 4.36s/it] 34%|███▍ | 11605/34278 [12:46:28<30:33:55, 4.85s/it] {'loss': 0.147, 'grad_norm': 1.0692788141133338, 'learning_rate': 7.704486598857444e-06, 'epoch': 0.34} 34%|███▍ | 11605/34278 [12:46:28<30:33:55, 4.85s/it] 34%|███▍ | 11606/34278 [12:46:32<28:31:09, 4.53s/it] {'loss': 0.1467, 'grad_norm': 0.9700778543064547, 'learning_rate': 7.70408922778644e-06, 'epoch': 0.34} 34%|███▍ | 11606/34278 [12:46:32<28:31:09, 4.53s/it] 34%|███▍ | 11607/34278 [12:46:35<25:09:04, 3.99s/it] {'loss': 0.135, 'grad_norm': 0.9525649354280223, 'learning_rate': 7.703691832573975e-06, 'epoch': 0.34} 34%|███▍ | 11607/34278 [12:46:35<25:09:04, 3.99s/it] 34%|███▍ | 11608/34278 [12:46:38<23:19:22, 3.70s/it] {'loss': 0.1381, 'grad_norm': 0.8816075139805843, 'learning_rate': 7.703294413223595e-06, 'epoch': 0.34} 34%|███▍ | 11608/34278 [12:46:38<23:19:22, 3.70s/it] 34%|███▍ | 11609/34278 [12:46:41<22:17:50, 3.54s/it] {'loss': 0.1508, 'grad_norm': 0.7983909998112839, 'learning_rate': 7.702896969738847e-06, 'epoch': 0.34} 34%|███▍ | 11609/34278 [12:46:41<22:17:50, 3.54s/it] 34%|███▍ | 11610/34278 [12:46:44<21:25:03, 3.40s/it] {'loss': 0.1558, 'grad_norm': 0.9203609233880439, 'learning_rate': 7.702499502123281e-06, 'epoch': 0.34} 34%|███▍ | 11610/34278 [12:46:44<21:25:03, 3.40s/it] 34%|███▍ | 
11611/34278 [12:46:47<20:42:55, 3.29s/it] {'loss': 0.1439, 'grad_norm': 0.6744051911156428, 'learning_rate': 7.702102010380444e-06, 'epoch': 0.34} 34%|███▍ | 11611/34278 [12:46:47<20:42:55, 3.29s/it] 34%|███▍ | 11612/34278 [12:46:51<22:06:55, 3.51s/it] {'loss': 0.1645, 'grad_norm': 0.7199704773594848, 'learning_rate': 7.701704494513885e-06, 'epoch': 0.34} 34%|███▍ | 11612/34278 [12:46:51<22:06:55, 3.51s/it] 34%|███▍ | 11613/34278 [12:46:57<26:53:02, 4.27s/it] {'loss': 0.1357, 'grad_norm': 0.8217566485644069, 'learning_rate': 7.701306954527153e-06, 'epoch': 0.34} 34%|███▍ | 11613/34278 [12:46:57<26:53:02, 4.27s/it] 34%|███▍ | 11614/34278 [12:47:00<24:40:50, 3.92s/it] {'loss': 0.1558, 'grad_norm': 0.8422794302581642, 'learning_rate': 7.700909390423798e-06, 'epoch': 0.34} 34%|███▍ | 11614/34278 [12:47:00<24:40:50, 3.92s/it] 34%|███▍ | 11615/34278 [12:47:04<23:56:00, 3.80s/it] {'loss': 0.1409, 'grad_norm': 0.7310318262358183, 'learning_rate': 7.70051180220737e-06, 'epoch': 0.34} 34%|███▍ | 11615/34278 [12:47:04<23:56:00, 3.80s/it] 34%|███▍ | 11616/34278 [12:47:07<23:04:47, 3.67s/it] {'loss': 0.1392, 'grad_norm': 3.798914361404054, 'learning_rate': 7.700114189881413e-06, 'epoch': 0.34} 34%|███▍ | 11616/34278 [12:47:07<23:04:47, 3.67s/it] 34%|███▍ | 11617/34278 [12:47:10<21:57:48, 3.49s/it] {'loss': 0.1639, 'grad_norm': 0.8806185465121645, 'learning_rate': 7.699716553449485e-06, 'epoch': 0.34} 34%|███▍ | 11617/34278 [12:47:10<21:57:48, 3.49s/it] 34%|███▍ | 11618/34278 [12:47:15<24:21:22, 3.87s/it] {'loss': 0.1384, 'grad_norm': 0.8395571390425997, 'learning_rate': 7.699318892915131e-06, 'epoch': 0.34} 34%|███▍ | 11618/34278 [12:47:15<24:21:22, 3.87s/it] 34%|███▍ | 11619/34278 [12:47:18<22:42:18, 3.61s/it] {'loss': 0.159, 'grad_norm': 0.8303897802809496, 'learning_rate': 7.698921208281903e-06, 'epoch': 0.34} 34%|███▍ | 11619/34278 [12:47:18<22:42:18, 3.61s/it] 34%|███▍ | 11620/34278 [12:47:21<21:56:59, 3.49s/it] {'loss': 0.146, 'grad_norm': 0.8165403889974459, 
'learning_rate': 7.69852349955335e-06, 'epoch': 0.34} 34%|███▍ | 11620/34278 [12:47:21<21:56:59, 3.49s/it] 34%|███▍ | 11621/34278 [12:47:25<21:37:44, 3.44s/it] {'loss': 0.1588, 'grad_norm': 1.092760362693043, 'learning_rate': 7.698125766733023e-06, 'epoch': 0.34} 34%|███▍ | 11621/34278 [12:47:25<21:37:44, 3.44s/it] 34%|███▍ | 11622/34278 [12:47:31<26:20:10, 4.18s/it] {'loss': 0.1317, 'grad_norm': 0.6902754686469906, 'learning_rate': 7.697728009824475e-06, 'epoch': 0.34} 34%|███▍ | 11622/34278 [12:47:31<26:20:10, 4.18s/it] 34%|███▍ | 11623/34278 [12:47:34<24:30:18, 3.89s/it] {'loss': 0.1541, 'grad_norm': 0.7312169230775266, 'learning_rate': 7.697330228831254e-06, 'epoch': 0.34} 34%|███▍ | 11623/34278 [12:47:34<24:30:18, 3.89s/it] 34%|███▍ | 11624/34278 [12:47:38<25:24:38, 4.04s/it] {'loss': 0.1491, 'grad_norm': 0.8231684244030799, 'learning_rate': 7.696932423756912e-06, 'epoch': 0.34} 34%|███▍ | 11624/34278 [12:47:38<25:24:38, 4.04s/it] 34%|███▍ | 11625/34278 [12:47:41<23:36:41, 3.75s/it] {'loss': 0.1329, 'grad_norm': 0.9415065153653359, 'learning_rate': 7.696534594605e-06, 'epoch': 0.34} 34%|███▍ | 11625/34278 [12:47:41<23:36:41, 3.75s/it] 34%|███▍ | 11626/34278 [12:47:44<22:03:09, 3.50s/it] {'loss': 0.1679, 'grad_norm': 0.792732553868953, 'learning_rate': 7.696136741379073e-06, 'epoch': 0.34} 34%|███▍ | 11626/34278 [12:47:44<22:03:09, 3.50s/it] 34%|███▍ | 11627/34278 [12:47:47<21:06:26, 3.35s/it] {'loss': 0.1394, 'grad_norm': 0.7594829618379453, 'learning_rate': 7.69573886408268e-06, 'epoch': 0.34} 34%|███▍ | 11627/34278 [12:47:47<21:06:26, 3.35s/it] 34%|███▍ | 11628/34278 [12:47:50<20:47:27, 3.30s/it] {'loss': 0.1308, 'grad_norm': 0.9546364425628039, 'learning_rate': 7.695340962719376e-06, 'epoch': 0.34} 34%|███▍ | 11628/34278 [12:47:50<20:47:27, 3.30s/it] 34%|███▍ | 11629/34278 [12:47:54<21:35:16, 3.43s/it] {'loss': 0.1607, 'grad_norm': 0.7239429493125663, 'learning_rate': 7.69494303729271e-06, 'epoch': 0.34} 34%|███▍ | 11629/34278 [12:47:54<21:35:16, 3.43s/it] 
34%|███▍ | 11630/34278 [12:47:58<23:20:52, 3.71s/it] {'loss': 0.1538, 'grad_norm': 0.7648747255782211, 'learning_rate': 7.694545087806236e-06, 'epoch': 0.34} 34%|███▍ | 11630/34278 [12:47:58<23:20:52, 3.71s/it] 34%|███▍ | 11631/34278 [12:48:04<27:20:41, 4.35s/it] {'loss': 0.1606, 'grad_norm': 1.0398673908202867, 'learning_rate': 7.694147114263505e-06, 'epoch': 0.34} 34%|███▍ | 11631/34278 [12:48:04<27:20:41, 4.35s/it] 34%|███▍ | 11632/34278 [12:48:07<24:54:37, 3.96s/it] {'loss': 0.134, 'grad_norm': 0.8509038630853565, 'learning_rate': 7.693749116668073e-06, 'epoch': 0.34} 34%|███▍ | 11632/34278 [12:48:07<24:54:37, 3.96s/it] 34%|███▍ | 11633/34278 [12:48:10<23:19:21, 3.71s/it] {'loss': 0.1486, 'grad_norm': 0.6808845366630925, 'learning_rate': 7.69335109502349e-06, 'epoch': 0.34} 34%|███▍ | 11633/34278 [12:48:10<23:19:21, 3.71s/it] 34%|███▍ | 11634/34278 [12:48:16<26:28:04, 4.21s/it] {'loss': 0.157, 'grad_norm': 0.9596622071022206, 'learning_rate': 7.692953049333315e-06, 'epoch': 0.34} 34%|███▍ | 11634/34278 [12:48:16<26:28:04, 4.21s/it] 34%|███▍ | 11635/34278 [12:48:19<24:53:10, 3.96s/it] {'loss': 0.1492, 'grad_norm': 0.9780170561775249, 'learning_rate': 7.692554979601097e-06, 'epoch': 0.34} 34%|███▍ | 11635/34278 [12:48:19<24:53:10, 3.96s/it] 34%|███▍ | 11636/34278 [12:48:22<23:14:18, 3.69s/it] {'loss': 0.1423, 'grad_norm': 0.7947887071631937, 'learning_rate': 7.69215688583039e-06, 'epoch': 0.34} 34%|███▍ | 11636/34278 [12:48:22<23:14:18, 3.69s/it] 34%|███▍ | 11637/34278 [12:48:26<24:00:11, 3.82s/it] {'loss': 0.1439, 'grad_norm': 0.7950864769771145, 'learning_rate': 7.69175876802475e-06, 'epoch': 0.34} 34%|███▍ | 11637/34278 [12:48:26<24:00:11, 3.82s/it] 34%|███▍ | 11638/34278 [12:48:29<22:39:20, 3.60s/it] {'loss': 0.1345, 'grad_norm': 0.852636176725188, 'learning_rate': 7.691360626187729e-06, 'epoch': 0.34} 34%|███▍ | 11638/34278 [12:48:29<22:39:20, 3.60s/it] 34%|███▍ | 11639/34278 [12:48:32<21:14:51, 3.38s/it] {'loss': 0.1546, 'grad_norm': 0.8749306756916018, 
'learning_rate': 7.690962460322883e-06, 'epoch': 0.34} 34%|███▍ | 11639/34278 [12:48:32<21:14:51, 3.38s/it] 34%|███▍ | 11640/34278 [12:48:36<21:55:52, 3.49s/it] {'loss': 0.1427, 'grad_norm': 0.7573621451009913, 'learning_rate': 7.690564270433766e-06, 'epoch': 0.34} 34%|███▍ | 11640/34278 [12:48:36<21:55:52, 3.49s/it] 34%|███▍ | 11641/34278 [12:48:42<26:56:21, 4.28s/it] {'loss': 0.1364, 'grad_norm': 0.6883258193192641, 'learning_rate': 7.690166056523935e-06, 'epoch': 0.34} 34%|███▍ | 11641/34278 [12:48:42<26:56:21, 4.28s/it] 34%|███▍ | 11642/34278 [12:48:45<24:12:37, 3.85s/it] {'loss': 0.1739, 'grad_norm': 0.8212558475232992, 'learning_rate': 7.689767818596943e-06, 'epoch': 0.34} 34%|███▍ | 11642/34278 [12:48:45<24:12:37, 3.85s/it] 34%|███▍ | 11643/34278 [12:48:51<28:30:43, 4.53s/it] {'loss': 0.1603, 'grad_norm': 1.0178685133479788, 'learning_rate': 7.689369556656346e-06, 'epoch': 0.34} 34%|███▍ | 11643/34278 [12:48:51<28:30:43, 4.53s/it] 34%|███▍ | 11644/34278 [12:48:55<26:18:25, 4.18s/it] {'loss': 0.1419, 'grad_norm': 0.7929640467630086, 'learning_rate': 7.6889712707057e-06, 'epoch': 0.34} 34%|███▍ | 11644/34278 [12:48:55<26:18:25, 4.18s/it] 34%|███▍ | 11645/34278 [12:49:00<28:39:12, 4.56s/it] {'loss': 0.1648, 'grad_norm': 0.8285062651424151, 'learning_rate': 7.68857296074856e-06, 'epoch': 0.34} 34%|███▍ | 11645/34278 [12:49:00<28:39:12, 4.56s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 34%|███▍ | 11646/34278 [12:49:04<28:21:18, 4.51s/it] {'loss': 0.1465, 'grad_norm': 0.7637143786134057, 'learning_rate': 7.688174626788483e-06, 'epoch': 0.34} 34%|███▍ | 11646/34278 [12:49:04<28:21:18, 4.51s/it] 34%|███▍ | 11647/34278 [12:49:08<27:14:29, 4.33s/it] {'loss': 0.1341, 'grad_norm': 0.8038705124572267, 'learning_rate': 7.687776268829024e-06, 'epoch': 0.34} 34%|███▍ | 11647/34278 [12:49:08<27:14:29, 4.33s/it] 34%|███▍ | 11648/34278 [12:49:14<30:22:35, 4.83s/it] {'loss': 0.1316, 'grad_norm': 0.6997760841537921, 'learning_rate': 7.687377886873739e-06, 'epoch': 0.34} 34%|███▍ | 11648/34278 [12:49:14<30:22:35, 4.83s/it] 34%|███▍ | 11649/34278 [12:49:19<29:49:21, 4.74s/it] {'loss': 0.1584, 'grad_norm': 0.8494265532655626, 'learning_rate': 7.686979480926189e-06, 'epoch': 0.34} 34%|███▍ | 11649/34278 [12:49:19<29:49:21, 4.74s/it] 34%|███▍ | 11650/34278 [12:49:22<26:40:45, 4.24s/it] {'loss': 0.153, 'grad_norm': 0.8115325270981985, 'learning_rate': 7.686581050989925e-06, 'epoch': 0.34} 34%|███▍ | 11650/34278 [12:49:22<26:40:45, 4.24s/it] 34%|███▍ | 11651/34278 [12:49:26<27:18:22, 4.34s/it] {'loss': 0.1444, 'grad_norm': 0.6196469535696476, 'learning_rate': 7.686182597068505e-06, 'epoch': 0.34} 34%|███▍ | 11651/34278 [12:49:26<27:18:22, 4.34s/it] 34%|███▍ | 11652/34278 [12:49:30<25:42:41, 4.09s/it] {'loss': 0.1488, 'grad_norm': 0.8742196977812604, 'learning_rate': 7.685784119165492e-06, 'epoch': 0.34} 34%|███▍ | 11652/34278 [12:49:30<25:42:41, 4.09s/it] 34%|███▍ | 11653/34278 [12:49:33<23:23:04, 3.72s/it] {'loss': 0.149, 'grad_norm': 0.9336099521443803, 'learning_rate': 7.685385617284437e-06, 'epoch': 0.34} 34%|███▍ | 11653/34278 [12:49:33<23:23:04, 3.72s/it] 34%|███▍ | 11654/34278 [12:49:36<22:02:10, 3.51s/it] {'loss': 0.1323, 'grad_norm': 0.8556779058735722, 'learning_rate': 7.684987091428902e-06, 'epoch': 0.34} 34%|███▍ | 11654/34278 [12:49:36<22:02:10, 3.51s/it] 34%|███▍ | 11655/34278 [12:49:39<22:14:43, 3.54s/it] {'loss': 
0.1431, 'grad_norm': 0.9249586535401686, 'learning_rate': 7.684588541602443e-06, 'epoch': 0.34} 34%|███▍ | 11655/34278 [12:49:39<22:14:43, 3.54s/it] 34%|███▍ | 11656/34278 [12:49:43<21:24:41, 3.41s/it] {'loss': 0.1416, 'grad_norm': 1.0598896664425277, 'learning_rate': 7.684189967808616e-06, 'epoch': 0.34} 34%|███▍ | 11656/34278 [12:49:43<21:24:41, 3.41s/it] 34%|███▍ | 11657/34278 [12:49:47<23:51:32, 3.80s/it] {'loss': 0.1554, 'grad_norm': 0.813634062026962, 'learning_rate': 7.683791370050984e-06, 'epoch': 0.34} 34%|███▍ | 11657/34278 [12:49:47<23:51:32, 3.80s/it] 34%|███▍ | 11658/34278 [12:49:53<27:59:36, 4.46s/it] {'loss': 0.1618, 'grad_norm': 1.0084842884915317, 'learning_rate': 7.683392748333102e-06, 'epoch': 0.34} 34%|███▍ | 11658/34278 [12:49:53<27:59:36, 4.46s/it] 34%|███▍ | 11659/34278 [12:49:56<25:26:31, 4.05s/it] {'loss': 0.1681, 'grad_norm': 1.2608013321156923, 'learning_rate': 7.682994102658532e-06, 'epoch': 0.34} 34%|███▍ | 11659/34278 [12:49:56<25:26:31, 4.05s/it] 34%|███▍ | 11660/34278 [12:50:00<24:05:12, 3.83s/it] {'loss': 0.1473, 'grad_norm': 0.7856307772707307, 'learning_rate': 7.68259543303083e-06, 'epoch': 0.34} 34%|███▍ | 11660/34278 [12:50:00<24:05:12, 3.83s/it] 34%|███▍ | 11661/34278 [12:50:03<23:20:30, 3.72s/it] {'loss': 0.1544, 'grad_norm': 1.024086214458485, 'learning_rate': 7.682196739453556e-06, 'epoch': 0.34} 34%|███▍ | 11661/34278 [12:50:03<23:20:30, 3.72s/it] 34%|███▍ | 11662/34278 [12:50:07<24:01:37, 3.82s/it] {'loss': 0.1893, 'grad_norm': 1.2069008199184548, 'learning_rate': 7.68179802193027e-06, 'epoch': 0.34} 34%|███▍ | 11662/34278 [12:50:07<24:01:37, 3.82s/it] 34%|███▍ | 11663/34278 [12:50:12<25:41:32, 4.09s/it] {'loss': 0.1641, 'grad_norm': 0.872775111768538, 'learning_rate': 7.681399280464531e-06, 'epoch': 0.34} 34%|███▍ | 11663/34278 [12:50:12<25:41:32, 4.09s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have 
requires_grad=True. Gradients will be None warnings.warn( 34%|███▍ | 11664/34278 [12:50:15<24:12:41, 3.85s/it] {'loss': 0.144, 'grad_norm': 0.8029842648616166, 'learning_rate': 7.681000515059901e-06, 'epoch': 0.34} 34%|███▍ | 11664/34278 [12:50:15<24:12:41, 3.85s/it] 34%|███▍ | 11665/34278 [12:50:21<26:57:37, 4.29s/it] {'loss': 0.1396, 'grad_norm': 1.0516458477893995, 'learning_rate': 7.680601725719937e-06, 'epoch': 0.34} 34%|███▍ | 11665/34278 [12:50:21<26:57:37, 4.29s/it] 34%|███▍ | 11666/34278 [12:50:24<25:01:26, 3.98s/it] {'loss': 0.1601, 'grad_norm': 0.9434842839220801, 'learning_rate': 7.680202912448201e-06, 'epoch': 0.34} 34%|███▍ | 11666/34278 [12:50:24<25:01:26, 3.98s/it] 34%|███▍ | 11667/34278 [12:50:27<23:48:51, 3.79s/it] {'loss': 0.1498, 'grad_norm': 0.737427446621569, 'learning_rate': 7.679804075248254e-06, 'epoch': 0.34} 34%|███▍ | 11667/34278 [12:50:27<23:48:51, 3.79s/it] 34%|███▍ | 11668/34278 [12:50:30<22:48:03, 3.63s/it] {'loss': 0.1346, 'grad_norm': 0.7840859302324004, 'learning_rate': 7.679405214123654e-06, 'epoch': 0.34} 34%|███▍ | 11668/34278 [12:50:30<22:48:03, 3.63s/it] 34%|███▍ | 11669/34278 [12:50:33<21:17:47, 3.39s/it] {'loss': 0.1316, 'grad_norm': 0.8400931942200677, 'learning_rate': 7.679006329077965e-06, 'epoch': 0.34} 34%|███▍ | 11669/34278 [12:50:33<21:17:47, 3.39s/it] 34%|███▍ | 11670/34278 [12:50:36<20:21:28, 3.24s/it] {'loss': 0.1637, 'grad_norm': 0.758598806223022, 'learning_rate': 7.678607420114747e-06, 'epoch': 0.34} 34%|███▍ | 11670/34278 [12:50:36<20:21:28, 3.24s/it] 34%|███▍ | 11671/34278 [12:50:42<25:58:45, 4.14s/it] {'loss': 0.1394, 'grad_norm': 0.8306524681387498, 'learning_rate': 7.678208487237562e-06, 'epoch': 0.34} 34%|███▍ | 11671/34278 [12:50:42<25:58:45, 4.14s/it] 34%|███▍ | 11672/34278 [12:50:47<26:38:41, 4.24s/it] {'loss': 0.1478, 'grad_norm': 0.7991370783091845, 'learning_rate': 7.677809530449971e-06, 'epoch': 0.34} 34%|███▍ | 11672/34278 [12:50:47<26:38:41, 4.24s/it] 34%|███▍ | 11673/34278 [12:50:50<25:19:45, 
4.03s/it] {'loss': 0.1399, 'grad_norm': 0.6720491765664105, 'learning_rate': 7.677410549755534e-06, 'epoch': 0.34} 34%|███▍ | 11673/34278 [12:50:50<25:19:45, 4.03s/it] 34%|███▍ | 11674/34278 [12:50:53<23:00:11, 3.66s/it] {'loss': 0.1362, 'grad_norm': 0.765124001772518, 'learning_rate': 7.677011545157818e-06, 'epoch': 0.34} 34%|███▍ | 11674/34278 [12:50:53<23:00:11, 3.66s/it] 34%|███▍ | 11675/34278 [12:50:56<21:32:53, 3.43s/it] {'loss': 0.1415, 'grad_norm': 1.0082223645553332, 'learning_rate': 7.676612516660379e-06, 'epoch': 0.34} 34%|███▍ | 11675/34278 [12:50:56<21:32:53, 3.43s/it] 34%|███▍ | 11676/34278 [12:51:00<22:21:49, 3.56s/it] {'loss': 0.1751, 'grad_norm': 0.6765023902159439, 'learning_rate': 7.676213464266783e-06, 'epoch': 0.34} 34%|███▍ | 11676/34278 [12:51:00<22:21:49, 3.56s/it] 34%|███▍ | 11677/34278 [12:51:03<21:27:33, 3.42s/it] {'loss': 0.1436, 'grad_norm': 0.8235423491168042, 'learning_rate': 7.675814387980592e-06, 'epoch': 0.34} 34%|███▍ | 11677/34278 [12:51:03<21:27:33, 3.42s/it] 34%|███▍ | 11678/34278 [12:51:08<24:57:00, 3.97s/it] {'loss': 0.1483, 'grad_norm': 0.8448033539821688, 'learning_rate': 7.67541528780537e-06, 'epoch': 0.34} 34%|███▍ | 11678/34278 [12:51:08<24:57:00, 3.97s/it] 34%|███▍ | 11679/34278 [12:51:11<23:21:25, 3.72s/it] {'loss': 0.1615, 'grad_norm': 0.8426377749357006, 'learning_rate': 7.67501616374468e-06, 'epoch': 0.34} 34%|███▍ | 11679/34278 [12:51:11<23:21:25, 3.72s/it] 34%|███▍ | 11680/34278 [12:51:15<22:31:44, 3.59s/it] {'loss': 0.16, 'grad_norm': 0.8540467357121467, 'learning_rate': 7.67461701580208e-06, 'epoch': 0.34} 34%|███▍ | 11680/34278 [12:51:15<22:31:44, 3.59s/it] 34%|███▍ | 11681/34278 [12:51:18<21:59:02, 3.50s/it] {'loss': 0.1469, 'grad_norm': 0.8699308629755351, 'learning_rate': 7.674217843981142e-06, 'epoch': 0.34} 34%|███▍ | 11681/34278 [12:51:18<21:59:02, 3.50s/it] 34%|███▍ | 11682/34278 [12:51:21<20:49:15, 3.32s/it] {'loss': 0.1409, 'grad_norm': 1.067218892242633, 'learning_rate': 7.673818648285423e-06, 
'epoch': 0.34} 34%|███▍ | 11682/34278 [12:51:21<20:49:15, 3.32s/it] 34%|███▍ | 11683/34278 [12:51:27<26:08:54, 4.17s/it] {'loss': 0.1502, 'grad_norm': 0.9097203156946464, 'learning_rate': 7.67341942871849e-06, 'epoch': 0.34} 34%|███▍ | 11683/34278 [12:51:27<26:08:54, 4.17s/it] 34%|███▍ | 11684/34278 [12:51:30<24:10:46, 3.85s/it] {'loss': 0.1156, 'grad_norm': 0.7491748931649446, 'learning_rate': 7.673020185283908e-06, 'epoch': 0.34} 34%|███▍ | 11684/34278 [12:51:30<24:10:46, 3.85s/it] 34%|███▍ | 11685/34278 [12:51:35<25:47:18, 4.11s/it] {'loss': 0.1329, 'grad_norm': 0.7259473015696555, 'learning_rate': 7.672620917985238e-06, 'epoch': 0.34} 34%|███▍ | 11685/34278 [12:51:35<25:47:18, 4.11s/it] 34%|███▍ | 11686/34278 [12:51:38<24:17:35, 3.87s/it] {'loss': 0.1607, 'grad_norm': 0.8042559195369055, 'learning_rate': 7.672221626826046e-06, 'epoch': 0.34} 34%|███▍ | 11686/34278 [12:51:38<24:17:35, 3.87s/it] 34%|███▍ | 11687/34278 [12:51:41<22:39:35, 3.61s/it] {'loss': 0.1669, 'grad_norm': 0.8026800332869615, 'learning_rate': 7.671822311809899e-06, 'epoch': 0.34} 34%|███▍ | 11687/34278 [12:51:41<22:39:35, 3.61s/it] 34%|███▍ | 11688/34278 [12:51:44<21:26:52, 3.42s/it] {'loss': 0.1606, 'grad_norm': 0.7856470094621225, 'learning_rate': 7.671422972940359e-06, 'epoch': 0.34} 34%|███▍ | 11688/34278 [12:51:44<21:26:52, 3.42s/it] 34%|███▍ | 11689/34278 [12:51:50<26:48:22, 4.27s/it] {'loss': 0.1456, 'grad_norm': 0.6992651121871641, 'learning_rate': 7.671023610220993e-06, 'epoch': 0.34} 34%|███▍ | 11689/34278 [12:51:50<26:48:22, 4.27s/it] 34%|███▍ | 11690/34278 [12:51:56<29:00:10, 4.62s/it] {'loss': 0.1348, 'grad_norm': 0.8107099706788065, 'learning_rate': 7.670624223655367e-06, 'epoch': 0.34} 34%|███▍ | 11690/34278 [12:51:56<29:00:10, 4.62s/it] 34%|███▍ | 11691/34278 [12:51:59<26:36:04, 4.24s/it] {'loss': 0.1322, 'grad_norm': 0.598836061017337, 'learning_rate': 7.670224813247043e-06, 'epoch': 0.34} 34%|███▍ | 11691/34278 [12:51:59<26:36:04, 4.24s/it] 34%|███▍ | 11692/34278 
[12:52:03<24:48:50, 3.96s/it] {'loss': 0.1429, 'grad_norm': 0.8186424373152729, 'learning_rate': 7.66982537899959e-06, 'epoch': 0.34} 34%|███▍ | 11692/34278 [12:52:03<24:48:50, 3.96s/it] 34%|███▍ | 11693/34278 [12:52:06<23:39:42, 3.77s/it] {'loss': 0.1651, 'grad_norm': 1.0788267547690011, 'learning_rate': 7.669425920916575e-06, 'epoch': 0.34} 34%|███▍ | 11693/34278 [12:52:06<23:39:42, 3.77s/it] 34%|███▍ | 11694/34278 [12:52:12<28:01:48, 4.47s/it] {'loss': 0.1215, 'grad_norm': 0.7094197262715586, 'learning_rate': 7.669026439001562e-06, 'epoch': 0.34} 34%|███▍ | 11694/34278 [12:52:12<28:01:48, 4.47s/it] 34%|███▍ | 11695/34278 [12:52:16<27:58:19, 4.46s/it] {'loss': 0.1395, 'grad_norm': 0.8876359519809253, 'learning_rate': 7.668626933258117e-06, 'epoch': 0.34} 34%|███▍ | 11695/34278 [12:52:16<27:58:19, 4.46s/it] 34%|███▍ | 11696/34278 [12:52:21<29:09:34, 4.65s/it] {'loss': 0.1361, 'grad_norm': 1.0110935606247058, 'learning_rate': 7.668227403689807e-06, 'epoch': 0.34} 34%|███▍ | 11696/34278 [12:52:21<29:09:34, 4.65s/it] 34%|███▍ | 11697/34278 [12:52:27<30:08:16, 4.80s/it] {'loss': 0.1535, 'grad_norm': 0.6144065129193836, 'learning_rate': 7.667827850300203e-06, 'epoch': 0.34} 34%|███▍ | 11697/34278 [12:52:27<30:08:16, 4.80s/it] 34%|███▍ | 11698/34278 [12:52:30<26:51:58, 4.28s/it] {'loss': 0.16, 'grad_norm': 0.8677026963363644, 'learning_rate': 7.667428273092867e-06, 'epoch': 0.34} 34%|███▍ | 11698/34278 [12:52:30<26:51:58, 4.28s/it] 34%|███▍ | 11699/34278 [12:52:33<24:49:48, 3.96s/it] {'loss': 0.1571, 'grad_norm': 0.8629451257224124, 'learning_rate': 7.667028672071368e-06, 'epoch': 0.34} 34%|███▍ | 11699/34278 [12:52:33<24:49:48, 3.96s/it] 34%|███▍ | 11700/34278 [12:52:37<24:24:41, 3.89s/it] {'loss': 0.1829, 'grad_norm': 0.8449410464990044, 'learning_rate': 7.666629047239273e-06, 'epoch': 0.34} 34%|███▍ | 11700/34278 [12:52:37<24:24:41, 3.89s/it] 34%|███▍ | 11701/34278 [12:52:40<23:18:50, 3.72s/it] {'loss': 0.1771, 'grad_norm': 0.8818627338487651, 'learning_rate': 
7.666229398600151e-06, 'epoch': 0.34} 34%|███▍ | 11701/34278 [12:52:40<23:18:50, 3.72s/it] 34%|███▍ | 11702/34278 [12:52:43<22:21:19, 3.56s/it] {'loss': 0.1712, 'grad_norm': 1.0868901220828913, 'learning_rate': 7.66582972615757e-06, 'epoch': 0.34} 34%|███▍ | 11702/34278 [12:52:43<22:21:19, 3.56s/it] 34%|███▍ | 11703/34278 [12:52:46<21:54:14, 3.49s/it] {'loss': 0.1461, 'grad_norm': 0.836277865875183, 'learning_rate': 7.665430029915098e-06, 'epoch': 0.34} 34%|███▍ | 11703/34278 [12:52:46<21:54:14, 3.49s/it] 34%|███▍ | 11704/34278 [12:52:50<21:21:00, 3.40s/it] {'loss': 0.1406, 'grad_norm': 0.9585987858847443, 'learning_rate': 7.665030309876303e-06, 'epoch': 0.34} 34%|███▍ | 11704/34278 [12:52:50<21:21:00, 3.40s/it] 34%|███▍ | 11705/34278 [12:52:53<20:57:01, 3.34s/it] {'loss': 0.1582, 'grad_norm': 0.9428673267000058, 'learning_rate': 7.664630566044751e-06, 'epoch': 0.34} 34%|███▍ | 11705/34278 [12:52:53<20:57:01, 3.34s/it] 34%|███▍ | 11706/34278 [12:52:59<25:49:28, 4.12s/it] {'loss': 0.1325, 'grad_norm': 0.8198290317941089, 'learning_rate': 7.664230798424016e-06, 'epoch': 0.34} 34%|███▍ | 11706/34278 [12:52:59<25:49:28, 4.12s/it] 34%|███▍ | 11707/34278 [12:53:02<24:51:03, 3.96s/it] {'loss': 0.1335, 'grad_norm': 0.9226597031973783, 'learning_rate': 7.663831007017664e-06, 'epoch': 0.34} 34%|███▍ | 11707/34278 [12:53:02<24:51:03, 3.96s/it] 34%|███▍ | 11708/34278 [12:53:06<23:54:52, 3.81s/it] {'loss': 0.1528, 'grad_norm': 0.8968307048251073, 'learning_rate': 7.663431191829263e-06, 'epoch': 0.34} 34%|███▍ | 11708/34278 [12:53:06<23:54:52, 3.81s/it] 34%|███▍ | 11709/34278 [12:53:09<23:08:18, 3.69s/it] {'loss': 0.165, 'grad_norm': 0.9982291680294357, 'learning_rate': 7.663031352862387e-06, 'epoch': 0.34} 34%|███▍ | 11709/34278 [12:53:09<23:08:18, 3.69s/it]
Token indices sequence length is longer than the specified maximum sequence length for this model (10317 > 8192). Running this sequence through the model will result in indexing errors
34%|███▍ | 11710/34278 [12:53:13<23:16:56, 3.71s/it] {'loss': 0.1399, 'grad_norm': 0.8121613049726315, 'learning_rate': 7.6626314901206e-06, 'epoch': 0.34} 34%|███▍ | 11710/34278 [12:53:13<23:16:56, 3.71s/it] 34%|███▍ | 11711/34278 [12:53:17<23:10:59, 3.70s/it] {'loss': 0.1857, 'grad_norm': 0.8579351196202621, 'learning_rate': 7.662231603607475e-06, 'epoch': 0.34} 34%|███▍ | 11711/34278 [12:53:17<23:10:59, 3.70s/it] 34%|███▍ | 11712/34278 [12:53:20<22:16:24, 3.55s/it] {'loss': 0.1689, 'grad_norm': 0.9538851908123045, 'learning_rate': 7.661831693326584e-06, 'epoch': 0.34} 34%|███▍ | 11712/34278 [12:53:20<22:16:24, 3.55s/it] 34%|███▍ | 11713/34278 [12:53:24<23:37:07, 3.77s/it] {'loss': 0.1455, 'grad_norm': 0.7149473699449731, 'learning_rate': 7.661431759281492e-06, 'epoch': 0.34} 34%|███▍ | 11713/34278 [12:53:24<23:37:07, 3.77s/it] 34%|███▍ | 11714/34278 [12:53:28<23:31:12, 3.75s/it] {'loss': 0.1673, 'grad_norm': 1.1132623450476096, 'learning_rate': 7.661031801475776e-06, 'epoch': 0.34} 34%|███▍ | 11714/34278 [12:53:28<23:31:12, 3.75s/it] 34%|███▍ | 11715/34278 [12:53:32<24:05:52, 3.84s/it] {'loss': 0.1537, 'grad_norm': 0.721488298031133, 'learning_rate': 7.660631819913001e-06, 'epoch': 0.34} 34%|███▍ | 11715/34278 [12:53:32<24:05:52, 3.84s/it] 34%|███▍ | 11716/34278 [12:53:35<23:01:17, 3.67s/it] {'loss': 0.1426, 'grad_norm': 0.6243591437984605, 'learning_rate': 7.66023181459674e-06, 'epoch': 0.34} 34%|███▍ | 11716/34278 [12:53:35<23:01:17, 3.67s/it] 34%|███▍ | 11717/34278 [12:53:39<24:01:51, 3.83s/it] {'loss': 0.1305, 'grad_norm': 0.7133498745963897, 'learning_rate': 7.659831785530567e-06, 'epoch': 0.34} 34%|███▍ | 11717/34278 [12:53:39<24:01:51, 3.83s/it] 34%|███▍ | 11718/34278 [12:53:43<23:49:51, 3.80s/it] {'loss': 0.1555, 'grad_norm': 0.8109652739888527, 'learning_rate': 7.659431732718048e-06, 'epoch': 0.34} 34%|███▍ | 11718/34278 [12:53:43<23:49:51, 3.80s/it] 34%|███▍ | 11719/34278
[12:53:46<22:45:00, 3.63s/it] {'loss': 0.1695, 'grad_norm': 0.8429149586885538, 'learning_rate': 7.659031656162759e-06, 'epoch': 0.34} 34%|███▍ | 11719/34278 [12:53:46<22:45:00, 3.63s/it] 34%|███▍ | 11720/34278 [12:53:50<21:59:39, 3.51s/it] {'loss': 0.175, 'grad_norm': 0.9783021579542683, 'learning_rate': 7.65863155586827e-06, 'epoch': 0.34} 34%|███▍ | 11720/34278 [12:53:50<21:59:39, 3.51s/it] 34%|███▍ | 11721/34278 [12:53:52<20:33:41, 3.28s/it] {'loss': 0.1139, 'grad_norm': 0.7455761986059449, 'learning_rate': 7.658231431838153e-06, 'epoch': 0.34} 34%|███▍ | 11721/34278 [12:53:52<20:33:41, 3.28s/it] 34%|███▍ | 11722/34278 [12:53:56<20:28:15, 3.27s/it] {'loss': 0.1547, 'grad_norm': 0.8614572722605297, 'learning_rate': 7.657831284075978e-06, 'epoch': 0.34} 34%|███▍ | 11722/34278 [12:53:56<20:28:15, 3.27s/it] 34%|███▍ | 11723/34278 [12:54:00<21:51:52, 3.49s/it] {'loss': 0.1469, 'grad_norm': 0.8069408183015282, 'learning_rate': 7.657431112585323e-06, 'epoch': 0.34} 34%|███▍ | 11723/34278 [12:54:00<21:51:52, 3.49s/it] 34%|███▍ | 11724/34278 [12:54:03<20:49:05, 3.32s/it] {'loss': 0.1525, 'grad_norm': 1.0893315245729782, 'learning_rate': 7.657030917369757e-06, 'epoch': 0.34} 34%|███▍ | 11724/34278 [12:54:03<20:49:05, 3.32s/it] 34%|███▍ | 11725/34278 [12:54:05<19:44:36, 3.15s/it] {'loss': 0.1449, 'grad_norm': 0.9923218410567376, 'learning_rate': 7.656630698432852e-06, 'epoch': 0.34} 34%|███▍ | 11725/34278 [12:54:05<19:44:36, 3.15s/it] 34%|███▍ | 11726/34278 [12:54:08<19:43:46, 3.15s/it] {'loss': 0.1511, 'grad_norm': 0.8452843397484929, 'learning_rate': 7.656230455778182e-06, 'epoch': 0.34} 34%|███▍ | 11726/34278 [12:54:08<19:43:46, 3.15s/it]
Token indices sequence length is longer than the specified maximum sequence length for this model (12615 > 8192). Running this sequence through the model will result in indexing errors
34%|███▍ | 11727/34278 [12:54:12<20:07:53, 3.21s/it] {'loss': 0.163, 'grad_norm': 0.7588875919560713, 'learning_rate': 7.655830189409322e-06, 'epoch': 0.34} 34%|███▍ | 11727/34278 [12:54:12<20:07:53, 3.21s/it] 34%|███▍ | 11728/34278 [12:54:15<20:32:38, 3.28s/it] {'loss': 0.1403, 'grad_norm': 0.8346468307359247, 'learning_rate': 7.655429899329843e-06, 'epoch': 0.34} 34%|███▍ | 11728/34278 [12:54:15<20:32:38, 3.28s/it] 34%|███▍ | 11729/34278 [12:54:21<24:48:12, 3.96s/it] {'loss': 0.1581, 'grad_norm': 0.9571316650537509, 'learning_rate': 7.65502958554332e-06, 'epoch': 0.34} 34%|███▍ | 11729/34278 [12:54:21<24:48:12, 3.96s/it] 34%|███▍ | 11730/34278 [12:54:24<23:55:50, 3.82s/it] {'loss': 0.1318, 'grad_norm': 0.9824069479427872, 'learning_rate': 7.654629248053326e-06, 'epoch': 0.34} 34%|███▍ | 11730/34278 [12:54:24<23:55:50, 3.82s/it] 34%|███▍ | 11731/34278 [12:54:27<21:58:11, 3.51s/it] {'loss': 0.1619, 'grad_norm': 0.9142646902186613, 'learning_rate': 7.654228886863437e-06, 'epoch': 0.34} 34%|███▍ | 11731/34278 [12:54:27<21:58:11, 3.51s/it] 34%|███▍ | 11732/34278 [12:54:30<21:24:23, 3.42s/it] {'loss': 0.1552, 'grad_norm': 1.007233342140085, 'learning_rate': 7.653828501977228e-06, 'epoch': 0.34} 34%|███▍ | 11732/34278 [12:54:30<21:24:23, 3.42s/it] 34%|███▍ | 11733/34278 [12:54:35<23:02:01, 3.68s/it] {'loss': 0.1476, 'grad_norm': 0.7545373944008383, 'learning_rate': 7.653428093398268e-06, 'epoch': 0.34} 34%|███▍ | 11733/34278 [12:54:35<23:02:01, 3.68s/it] 34%|███▍ | 11734/34278 [12:54:37<21:34:09, 3.44s/it] {'loss': 0.1419, 'grad_norm': 1.0079163395605315, 'learning_rate': 7.653027661130137e-06, 'epoch': 0.34} 34%|███▍ | 11734/34278 [12:54:37<21:34:09, 3.44s/it] 34%|███▍ | 11735/34278 [12:54:41<21:24:30, 3.42s/it] {'loss': 0.1355, 'grad_norm': 1.177233676016444, 'learning_rate': 7.652627205176409e-06, 'epoch': 0.34} 34%|███▍ | 11735/34278 [12:54:41<21:24:30, 3.42s/it] 34%|███▍ | 11736/34278
[12:54:44<21:27:41, 3.43s/it] {'loss': 0.1316, 'grad_norm': 0.8320629231908665, 'learning_rate': 7.652226725540657e-06, 'epoch': 0.34} 34%|███▍ | 11736/34278 [12:54:44<21:27:41, 3.43s/it] 34%|███▍ | 11737/34278 [12:54:48<22:00:07, 3.51s/it] {'loss': 0.1291, 'grad_norm': 1.0215268112603635, 'learning_rate': 7.651826222226459e-06, 'epoch': 0.34} 34%|███▍ | 11737/34278 [12:54:48<22:00:07, 3.51s/it] 34%|███▍ | 11738/34278 [12:54:54<26:40:36, 4.26s/it] {'loss': 0.1575, 'grad_norm': 0.7397867723421131, 'learning_rate': 7.651425695237388e-06, 'epoch': 0.34} 34%|███▍ | 11738/34278 [12:54:54<26:40:36, 4.26s/it] 34%|███▍ | 11739/34278 [12:54:57<24:48:48, 3.96s/it] {'loss': 0.1633, 'grad_norm': 0.8591103044847006, 'learning_rate': 7.651025144577025e-06, 'epoch': 0.34} 34%|███▍ | 11739/34278 [12:54:57<24:48:48, 3.96s/it] 34%|███▍ | 11740/34278 [12:55:01<25:11:20, 4.02s/it] {'loss': 0.1551, 'grad_norm': 1.1377176191831462, 'learning_rate': 7.650624570248938e-06, 'epoch': 0.34} 34%|███▍ | 11740/34278 [12:55:01<25:11:20, 4.02s/it] 34%|███▍ | 11741/34278 [12:55:04<23:17:00, 3.72s/it] {'loss': 0.1306, 'grad_norm': 0.5858570252888864, 'learning_rate': 7.650223972256709e-06, 'epoch': 0.34} 34%|███▍ | 11741/34278 [12:55:04<23:17:00, 3.72s/it] 34%|███▍ | 11742/34278 [12:55:08<22:09:55, 3.54s/it] {'loss': 0.139, 'grad_norm': 1.1229020819776705, 'learning_rate': 7.649823350603915e-06, 'epoch': 0.34} 34%|███▍ | 11742/34278 [12:55:08<22:09:55, 3.54s/it] 34%|███▍ | 11743/34278 [12:55:12<22:58:23, 3.67s/it] {'loss': 0.1496, 'grad_norm': 0.9165859861890355, 'learning_rate': 7.649422705294127e-06, 'epoch': 0.34} 34%|███▍ | 11743/34278 [12:55:12<22:58:23, 3.67s/it] 34%|███▍ | 11744/34278 [12:55:15<22:19:26, 3.57s/it] {'loss': 0.1327, 'grad_norm': 0.7792939136773086, 'learning_rate': 7.64902203633093e-06, 'epoch': 0.34} 34%|███▍ | 11744/34278 [12:55:15<22:19:26, 3.57s/it] 34%|███▍ | 11745/34278 [12:55:19<23:18:27, 3.72s/it] {'loss': 0.1557, 'grad_norm': 0.8641812268876443, 'learning_rate': 
7.648621343717895e-06, 'epoch': 0.34} 34%|███▍ | 11745/34278 [12:55:19<23:18:27, 3.72s/it] 34%|███▍ | 11746/34278 [12:55:25<27:20:24, 4.37s/it] {'loss': 0.1321, 'grad_norm': 0.9967717067789429, 'learning_rate': 7.648220627458597e-06, 'epoch': 0.34} 34%|███▍ | 11746/34278 [12:55:25<27:20:24, 4.37s/it] 34%|███▍ | 11747/34278 [12:55:29<27:01:27, 4.32s/it] {'loss': 0.1454, 'grad_norm': 0.7088341018010219, 'learning_rate': 7.647819887556621e-06, 'epoch': 0.34} 34%|███▍ | 11747/34278 [12:55:29<27:01:27, 4.32s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f06a7fae6b0>
Failed to fetch sample 3423928. Exception: cannot identify image file <_io.BytesIO object at 0x7f06a7fae6b0>
34%|███▍ | 11748/34278 [12:55:32<25:03:16, 4.00s/it] {'loss': 0.1451, 'grad_norm': 0.9249612003418474, 'learning_rate': 7.647419124015543e-06, 'epoch': 0.34} 34%|███▍ | 11748/34278 [12:55:32<25:03:16, 4.00s/it] 34%|███▍ | 11749/34278 [12:55:37<26:33:36, 4.24s/it] {'loss': 0.1641, 'grad_norm': 0.8937051285563443, 'learning_rate': 7.647018336838936e-06, 'epoch': 0.34} 34%|███▍ | 11749/34278 [12:55:37<26:33:36, 4.24s/it] 34%|███▍ | 11750/34278 [12:55:43<29:43:02, 4.75s/it] {'loss': 0.1453, 'grad_norm': 0.9790850300978163, 'learning_rate': 7.646617526030381e-06, 'epoch': 0.34} 34%|███▍ | 11750/34278 [12:55:43<29:43:02, 4.75s/it] 34%|███▍ | 11751/34278 [12:55:46<27:09:48, 4.34s/it] {'loss': 0.1433, 'grad_norm': 0.7835615237928236, 'learning_rate': 7.64621669159346e-06, 'epoch': 0.34} 34%|███▍ | 11751/34278 [12:55:46<27:09:48, 4.34s/it] 34%|███▍ | 11752/34278 [12:55:49<24:42:03, 3.95s/it] {'loss': 0.1608, 'grad_norm': 0.9354865998205882, 'learning_rate': 7.645815833531745e-06, 'epoch': 0.34} 34%|███▍ | 11752/34278 [12:55:49<24:42:03, 3.95s/it] 34%|███▍ | 11753/34278 [12:55:53<23:04:54, 3.69s/it] {'loss': 0.1621, 'grad_norm': 0.97072884507175, 'learning_rate': 7.645414951848817e-06, 'epoch': 0.34} 34%|███▍ | 11753/34278 [12:55:53<23:04:54, 3.69s/it] 34%|███▍ | 11754/34278 [12:55:59<27:34:42, 4.41s/it] {'loss': 0.1611, 'grad_norm': 0.810817602561764, 'learning_rate': 7.64501404654826e-06, 'epoch': 0.34} 34%|███▍ | 11754/34278 [12:55:59<27:34:42, 4.41s/it] 34%|███▍ | 11755/34278 [12:56:02<24:51:41, 3.97s/it] {'loss': 0.1463, 'grad_norm': 0.8599322383836334, 'learning_rate': 7.644613117633644e-06, 'epoch': 0.34} 34%|███▍ | 11755/34278 [12:56:02<24:51:41, 3.97s/it] 34%|███▍ | 11756/34278 [12:56:05<24:13:29, 3.87s/it] {'loss': 0.1782, 'grad_norm': 0.9766877972562575, 'learning_rate': 7.644212165108556e-06, 'epoch': 0.34} 34%|███▍ | 11756/34278 [12:56:05<24:13:29, 3.87s/it] 34%|███▍ | 11757/34278
[12:56:08<23:01:07, 3.68s/it] {'loss': 0.1624, 'grad_norm': 1.0184043242109055, 'learning_rate': 7.643811188976574e-06, 'epoch': 0.34} 34%|███▍ | 11757/34278 [12:56:08<23:01:07, 3.68s/it] 34%|███▍ | 11758/34278 [12:56:14<26:49:02, 4.29s/it] {'loss': 0.1539, 'grad_norm': 0.9744970603175473, 'learning_rate': 7.643410189241275e-06, 'epoch': 0.34} 34%|███▍ | 11758/34278 [12:56:14<26:49:02, 4.29s/it] 34%|███▍ | 11759/34278 [12:56:19<27:31:45, 4.40s/it] {'loss': 0.1264, 'grad_norm': 0.8742113603224083, 'learning_rate': 7.643009165906242e-06, 'epoch': 0.34} 34%|███▍ | 11759/34278 [12:56:19<27:31:45, 4.40s/it] 34%|███▍ | 11760/34278 [12:56:22<25:18:44, 4.05s/it] {'loss': 0.1735, 'grad_norm': 1.0238609436396693, 'learning_rate': 7.642608118975055e-06, 'epoch': 0.34} 34%|███▍ | 11760/34278 [12:56:22<25:18:44, 4.05s/it] 34%|███▍ | 11761/34278 [12:56:27<26:17:45, 4.20s/it] {'loss': 0.1312, 'grad_norm': 0.9282281248682216, 'learning_rate': 7.64220704845129e-06, 'epoch': 0.34} 34%|███▍ | 11761/34278 [12:56:27<26:17:45, 4.20s/it] 34%|███▍ | 11762/34278 [12:56:30<24:25:53, 3.91s/it] {'loss': 0.1294, 'grad_norm': 0.6502956815705189, 'learning_rate': 7.641805954338534e-06, 'epoch': 0.34} 34%|███▍ | 11762/34278 [12:56:30<24:25:53, 3.91s/it] 34%|███▍ | 11763/34278 [12:56:33<23:31:35, 3.76s/it] {'loss': 0.1338, 'grad_norm': 0.9077142918053643, 'learning_rate': 7.641404836640365e-06, 'epoch': 0.34} 34%|███▍ | 11763/34278 [12:56:33<23:31:35, 3.76s/it] 34%|███▍ | 11764/34278 [12:56:36<21:45:49, 3.48s/it] {'loss': 0.1656, 'grad_norm': 0.8776814631969013, 'learning_rate': 7.641003695360363e-06, 'epoch': 0.34} 34%|███▍ | 11764/34278 [12:56:36<21:45:49, 3.48s/it] 34%|███▍ | 11765/34278 [12:56:40<23:17:55, 3.73s/it] {'loss': 0.1539, 'grad_norm': 0.6837570423818243, 'learning_rate': 7.640602530502112e-06, 'epoch': 0.34} 34%|███▍ | 11765/34278 [12:56:40<23:17:55, 3.73s/it] 34%|███▍ | 11766/34278 [12:56:43<21:27:19, 3.43s/it] {'loss': 0.1592, 'grad_norm': 1.0231023425151644, 'learning_rate': 
7.64020134206919e-06, 'epoch': 0.34} 34%|███▍ | 11766/34278 [12:56:43<21:27:19, 3.43s/it] 34%|███▍ | 11767/34278 [12:56:47<22:05:41, 3.53s/it] {'loss': 0.1513, 'grad_norm': 0.7795675692279443, 'learning_rate': 7.639800130065183e-06, 'epoch': 0.34} 34%|███▍ | 11767/34278 [12:56:47<22:05:41, 3.53s/it] 34%|███▍ | 11768/34278 [12:56:51<22:46:54, 3.64s/it] {'loss': 0.1579, 'grad_norm': 0.9129465443806678, 'learning_rate': 7.639398894493668e-06, 'epoch': 0.34} 34%|███▍ | 11768/34278 [12:56:51<22:46:54, 3.64s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
34%|███▍ | 11769/34278 [12:56:54<21:54:03, 3.50s/it] {'loss': 0.1726, 'grad_norm': 0.740825249618372, 'learning_rate': 7.638997635358232e-06, 'epoch': 0.34} 34%|███▍ | 11769/34278 [12:56:54<21:54:03, 3.50s/it] 34%|███▍ | 11770/34278 [12:56:57<21:26:36, 3.43s/it] {'loss': 0.1361, 'grad_norm': 0.7092008977067692, 'learning_rate': 7.638596352662453e-06, 'epoch': 0.34} 34%|███▍ | 11770/34278 [12:56:57<21:26:36, 3.43s/it] 34%|███▍ | 11771/34278 [12:57:01<22:06:59, 3.54s/it] {'loss': 0.1497, 'grad_norm': 0.8115283013936152, 'learning_rate': 7.638195046409918e-06, 'epoch': 0.34} 34%|███▍ | 11771/34278 [12:57:01<22:06:59, 3.54s/it] 34%|███▍ | 11772/34278 [12:57:04<21:18:16, 3.41s/it] {'loss': 0.1359, 'grad_norm': 0.7232949261510621, 'learning_rate': 7.637793716604208e-06, 'epoch': 0.34} 34%|███▍ | 11772/34278 [12:57:04<21:18:16, 3.41s/it] 34%|███▍ | 11773/34278 [12:57:08<21:23:33, 3.42s/it] {'loss': 0.1502, 'grad_norm': 0.7038991828709446, 'learning_rate': 7.637392363248901e-06, 'epoch': 0.34} 34%|███▍ | 11773/34278 [12:57:08<21:23:33, 3.42s/it] 34%|███▍ | 11774/34278 [12:57:11<20:45:25, 3.32s/it] {'loss': 0.1607, 'grad_norm': 0.8078874550307183, 'learning_rate': 7.636990986347588e-06, 'epoch': 0.34} 34%|███▍ | 11774/34278 [12:57:11<20:45:25, 3.32s/it]
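The traceback above ends with "Failed to fetch sample 3423928. Exception: ..." yet the step counter keeps advancing, which suggests the dataset's `__getitem__` catches decode failures (such as `PIL.UnidentifiedImageError` on a corrupt image) and substitutes another sample rather than crashing the run. Below is a minimal, self-contained sketch of that retry pattern; it is not the aguvis code, and the class name, the `None`-as-corrupt-sample stand-in, and the fall-back-to-next-index policy are all illustrative assumptions.

```python
class RobustDataset:
    """Sketch (hypothetical, not the aguvis implementation) of a dataset whose
    __getitem__ survives corrupt samples: log the failure and fall back to a
    neighboring sample instead of raising into the training loop."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples          # raw records; None stands in for a corrupt image
        self.max_retries = max_retries

    def _get_item(self, i):
        record = self.samples[i]
        if record is None:              # stand-in for an undecodable image payload
            raise ValueError(f"cannot identify image file for sample {i}")
        return record

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:      # mirrors the "Failed to fetch sample N" log line
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)   # deterministic fallback: try the next sample
        raise RuntimeError("too many consecutive bad samples")
```

Real implementations often pick the replacement index at random rather than sequentially; the deterministic variant here just keeps the sketch easy to reason about.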
34%|███▍ | 11775/34278 [12:57:13<19:45:08, 3.16s/it] {'loss': 0.1629, 'grad_norm': 0.7599301363984657, 'learning_rate': 7.636589585903849e-06, 'epoch': 0.34} 34%|███▍ | 11775/34278 [12:57:13<19:45:08, 3.16s/it] 34%|███▍ | 11776/34278 [12:57:19<24:55:09, 3.99s/it] {'loss': 0.1298, 'grad_norm': 0.7400366455527537, 'learning_rate': 7.636188161921265e-06, 'epoch': 0.34} 34%|███▍ | 11776/34278 [12:57:19<24:55:09, 3.99s/it] 34%|███▍ | 11777/34278 [12:57:26<28:59:46, 4.64s/it] {'loss': 0.1799, 'grad_norm': 0.8050150329011148, 'learning_rate': 7.635786714403426e-06, 'epoch': 0.34} 34%|███▍ | 11777/34278 [12:57:26<28:59:46, 4.64s/it] 34%|███▍ | 11778/34278 [12:57:29<26:14:22, 4.20s/it] {'loss': 0.1495, 'grad_norm': 0.8460019406267215, 'learning_rate': 7.63538524335391e-06, 'epoch': 0.34} 34%|███▍ | 11778/34278 [12:57:29<26:14:22, 4.20s/it] 34%|███▍ | 11779/34278 [12:57:32<24:05:58, 3.86s/it] {'loss': 0.1403, 'grad_norm': 0.7025849164842259, 'learning_rate': 7.634983748776303e-06, 'epoch': 0.34} 34%|███▍ | 11779/34278 [12:57:32<24:05:58, 3.86s/it] 34%|███▍ | 11780/34278 [12:57:35<22:39:22, 3.63s/it] {'loss': 0.159, 'grad_norm': 0.7814189906227804, 'learning_rate': 7.634582230674192e-06, 'epoch': 0.34} 34%|███▍ | 11780/34278 [12:57:35<22:39:22, 3.63s/it] 34%|███▍ | 11781/34278 [12:57:41<27:08:18, 4.34s/it] {'loss': 0.1195, 'grad_norm': 0.9038548466785211, 'learning_rate': 7.63418068905116e-06, 'epoch': 0.34} 34%|███▍ | 11781/34278 [12:57:41<27:08:18, 4.34s/it] 34%|███▍ | 11782/34278 [12:57:47<30:17:07, 4.85s/it] {'loss': 0.1377, 'grad_norm': 0.7281775028636094, 'learning_rate': 7.63377912391079e-06, 'epoch': 0.34} 34%|███▍ | 11782/34278 [12:57:47<30:17:07, 4.85s/it] 34%|███▍ | 11783/34278 [12:57:50<27:12:51, 4.36s/it] {'loss': 0.1449, 'grad_norm': 0.7515869071367169, 'learning_rate': 7.63337753525667e-06, 'epoch': 0.34} 34%|███▍ | 11783/34278 [12:57:50<27:12:51, 4.36s/it] 34%|███▍ | 11784/34278 [12:57:53<24:42:32, 3.95s/it] {'loss': 0.1326, 'grad_norm': 1.0078354578427962, 
'learning_rate': 7.632975923092384e-06, 'epoch': 0.34} 34%|███▍ | 11784/34278 [12:57:53<24:42:32, 3.95s/it] 34%|███▍ | 11785/34278 [12:57:56<22:51:38, 3.66s/it] {'loss': 0.1323, 'grad_norm': 0.7075062353104524, 'learning_rate': 7.632574287421516e-06, 'epoch': 0.34} 34%|███▍ | 11785/34278 [12:57:56<22:51:38, 3.66s/it] 34%|███▍ | 11786/34278 [12:57:59<22:20:38, 3.58s/it] {'loss': 0.1375, 'grad_norm': 1.0379271170357378, 'learning_rate': 7.632172628247654e-06, 'epoch': 0.34} 34%|███▍ | 11786/34278 [12:57:59<22:20:38, 3.58s/it] 34%|███▍ | 11787/34278 [12:58:03<21:44:57, 3.48s/it] {'loss': 0.1518, 'grad_norm': 0.7948980531470224, 'learning_rate': 7.631770945574384e-06, 'epoch': 0.34} 34%|███▍ | 11787/34278 [12:58:03<21:44:57, 3.48s/it] 34%|███▍ | 11788/34278 [12:58:07<23:29:10, 3.76s/it] {'loss': 0.1317, 'grad_norm': 0.9554102668718361, 'learning_rate': 7.63136923940529e-06, 'epoch': 0.34} 34%|███▍ | 11788/34278 [12:58:07<23:29:10, 3.76s/it] 34%|███▍ | 11789/34278 [12:58:10<22:19:28, 3.57s/it] {'loss': 0.1472, 'grad_norm': 0.7869855229794803, 'learning_rate': 7.63096750974396e-06, 'epoch': 0.34} 34%|███▍ | 11789/34278 [12:58:10<22:19:28, 3.57s/it] 34%|███▍ | 11790/34278 [12:58:13<21:15:56, 3.40s/it] {'loss': 0.156, 'grad_norm': 0.8554123739345731, 'learning_rate': 7.630565756593981e-06, 'epoch': 0.34} 34%|███▍ | 11790/34278 [12:58:13<21:15:56, 3.40s/it] 34%|███▍ | 11791/34278 [12:58:17<21:39:48, 3.47s/it] {'loss': 0.1298, 'grad_norm': 0.8192941417051375, 'learning_rate': 7.630163979958938e-06, 'epoch': 0.34} 34%|███▍ | 11791/34278 [12:58:17<21:39:48, 3.47s/it] 34%|███▍ | 11792/34278 [12:58:28<35:44:08, 5.72s/it] {'loss': 0.1516, 'grad_norm': 0.7793168006203326, 'learning_rate': 7.629762179842419e-06, 'epoch': 0.34} 34%|███▍ | 11792/34278 [12:58:28<35:44:08, 5.72s/it] 34%|███▍ | 11793/34278 [12:58:31<31:45:51, 5.09s/it] {'loss': 0.1571, 'grad_norm': 0.9064907156623201, 'learning_rate': 7.629360356248012e-06, 'epoch': 0.34} 34%|███▍ | 11793/34278 [12:58:31<31:45:51, 
5.09s/it] 34%|███▍ | 11794/34278 [12:58:34<27:49:01, 4.45s/it] {'loss': 0.139, 'grad_norm': 0.8137239435787438, 'learning_rate': 7.628958509179303e-06, 'epoch': 0.34} 34%|███▍ | 11794/34278 [12:58:34<27:49:01, 4.45s/it] 34%|███▍ | 11795/34278 [12:58:39<27:07:58, 4.34s/it] {'loss': 0.1451, 'grad_norm': 1.104659530404001, 'learning_rate': 7.628556638639879e-06, 'epoch': 0.34} 34%|███▍ | 11795/34278 [12:58:39<27:07:58, 4.34s/it] 34%|███▍ | 11796/34278 [12:58:42<25:43:18, 4.12s/it] {'loss': 0.1328, 'grad_norm': 0.8586459575034965, 'learning_rate': 7.628154744633329e-06, 'epoch': 0.34} 34%|███▍ | 11796/34278 [12:58:42<25:43:18, 4.12s/it] 34%|███▍ | 11797/34278 [12:58:46<24:37:55, 3.94s/it] {'loss': 0.1368, 'grad_norm': 0.7810139291249382, 'learning_rate': 7.627752827163242e-06, 'epoch': 0.34} 34%|███▍ | 11797/34278 [12:58:46<24:37:55, 3.94s/it] 34%|███▍ | 11798/34278 [12:58:57<38:42:54, 6.20s/it] {'loss': 0.1574, 'grad_norm': 0.9871821266778147, 'learning_rate': 7.627350886233203e-06, 'epoch': 0.34} 34%|███▍ | 11798/34278 [12:58:57<38:42:54, 6.20s/it] 34%|███▍ | 11799/34278 [12:59:00<33:07:05, 5.30s/it] {'loss': 0.1251, 'grad_norm': 0.7171893193030499, 'learning_rate': 7.626948921846805e-06, 'epoch': 0.34} 34%|███▍ | 11799/34278 [12:59:00<33:07:05, 5.30s/it] 34%|███▍ | 11800/34278 [12:59:18<55:35:17, 8.90s/it] {'loss': 0.1502, 'grad_norm': 0.8632420151489926, 'learning_rate': 7.6265469340076326e-06, 'epoch': 0.34} 34%|███▍ | 11800/34278 [12:59:18<55:35:17, 8.90s/it] 34%|███▍ | 11801/34278 [12:59:21<46:04:24, 7.38s/it] {'loss': 0.1432, 'grad_norm': 0.8329398766342871, 'learning_rate': 7.6261449227192765e-06, 'epoch': 0.34} 34%|███▍ | 11801/34278 [12:59:21<46:04:24, 7.38s/it] 34%|███▍ | 11802/34278 [12:59:28<43:39:15, 6.99s/it] {'loss': 0.1604, 'grad_norm': 0.7117900534539481, 'learning_rate': 7.625742887985325e-06, 'epoch': 0.34} 34%|███▍ | 11802/34278 [12:59:28<43:39:15, 6.99s/it] 34%|███▍ | 11803/34278 [12:59:41<55:49:01, 8.94s/it] {'loss': 0.1484, 'grad_norm': 
0.7999454698812005, 'learning_rate': 7.6253408298093665e-06, 'epoch': 0.34} 34%|███▍ | 11803/34278 [12:59:41<55:49:01, 8.94s/it] 34%|███▍ | 11804/34278 [12:59:46<48:50:04, 7.82s/it] {'loss': 0.1777, 'grad_norm': 0.9287764574396962, 'learning_rate': 7.6249387481949954e-06, 'epoch': 0.34} 34%|███▍ | 11804/34278 [12:59:46<48:50:04, 7.82s/it] 34%|███▍ | 11805/34278 [12:59:57<54:27:24, 8.72s/it] {'loss': 0.1488, 'grad_norm': 0.6733447502490658, 'learning_rate': 7.624536643145796e-06, 'epoch': 0.34} 34%|███▍ | 11805/34278 [12:59:57<54:27:24, 8.72s/it] 34%|███▍ | 11806/34278 [13:00:02<48:06:49, 7.71s/it] {'loss': 0.1527, 'grad_norm': 0.7819878196454474, 'learning_rate': 7.624134514665359e-06, 'epoch': 0.34} 34%|███▍ | 11806/34278 [13:00:02<48:06:49, 7.71s/it] 34%|███▍ | 11807/34278 [13:00:16<59:27:48, 9.53s/it] {'loss': 0.1405, 'grad_norm': 0.7754314146898684, 'learning_rate': 7.623732362757277e-06, 'epoch': 0.34} 34%|███▍ | 11807/34278 [13:00:16<59:27:48, 9.53s/it] 34%|███▍ | 11808/34278 [13:00:19<47:34:26, 7.62s/it] {'loss': 0.1505, 'grad_norm': 0.9180180567783497, 'learning_rate': 7.6233301874251375e-06, 'epoch': 0.34} 34%|███▍ | 11808/34278 [13:00:19<47:34:26, 7.62s/it] 34%|███▍ | 11809/34278 [13:00:45<81:36:26, 13.08s/it] {'loss': 0.1502, 'grad_norm': 0.7774195654853466, 'learning_rate': 7.622927988672533e-06, 'epoch': 0.34} 34%|███▍ | 11809/34278 [13:00:45<81:36:26, 13.08s/it] 34%|███▍ | 11810/34278 [13:01:07<98:28:17, 15.78s/it] {'loss': 0.1508, 'grad_norm': 0.829149529881248, 'learning_rate': 7.622525766503054e-06, 'epoch': 0.34} 34%|███▍ | 11810/34278 [13:01:07<98:28:17, 15.78s/it] 34%|███▍ | 11811/34278 [13:01:12<77:18:53, 12.39s/it] {'loss': 0.1468, 'grad_norm': 0.7174062064908221, 'learning_rate': 7.62212352092029e-06, 'epoch': 0.34} 34%|███▍ | 11811/34278 [13:01:12<77:18:53, 12.39s/it] 34%|███▍ | 11812/34278 [13:01:23<74:40:51, 11.97s/it] {'loss': 0.1492, 'grad_norm': 0.6797694666968168, 'learning_rate': 7.6217212519278335e-06, 'epoch': 0.34} 34%|███▍ | 
11812/34278 [13:01:23<74:40:51, 11.97s/it] 34%|███▍ | 11813/34278 [13:01:57<116:59:08, 18.75s/it] {'loss': 0.1412, 'grad_norm': 0.7676823786348645, 'learning_rate': 7.621318959529276e-06, 'epoch': 0.34} 34%|███▍ | 11813/34278 [13:01:57<116:59:08, 18.75s/it] 34%|███▍ | 11814/34278 [13:02:24<131:27:52, 21.07s/it] {'loss': 0.1587, 'grad_norm': 0.8530918494192405, 'learning_rate': 7.620916643728209e-06, 'epoch': 0.34} 34%|███▍ | 11814/34278 [13:02:24<131:27:52, 21.07s/it] 34%|███▍ | 11815/34278 [13:02:36<114:22:38, 18.33s/it] {'loss': 0.164, 'grad_norm': 0.8594938308878028, 'learning_rate': 7.620514304528223e-06, 'epoch': 0.34} 34%|███▍ | 11815/34278 [13:02:36<114:22:38, 18.33s/it] 34%|███▍ | 11816/34278 [13:02:51<109:32:44, 17.56s/it] {'loss': 0.1455, 'grad_norm': 0.7989164744621123, 'learning_rate': 7.62011194193291e-06, 'epoch': 0.34} 34%|███▍ | 11816/34278 [13:02:51<109:32:44, 17.56s/it] 34%|███▍ | 11817/34278 [13:03:13<117:21:08, 18.81s/it] {'loss': 0.185, 'grad_norm': 1.052688992387018, 'learning_rate': 7.619709555945865e-06, 'epoch': 0.34} 34%|███▍ | 11817/34278 [13:03:13<117:21:08, 18.81s/it] 34%|███▍ | 11818/34278 [13:03:35<122:26:27, 19.63s/it] {'loss': 0.122, 'grad_norm': 0.6429489131274733, 'learning_rate': 7.619307146570677e-06, 'epoch': 0.34} 34%|███▍ | 11818/34278 [13:03:35<122:26:27, 19.63s/it] 34%|███▍ | 11819/34278 [13:03:52<117:18:11, 18.80s/it] {'loss': 0.1481, 'grad_norm': 0.803905314838356, 'learning_rate': 7.618904713810941e-06, 'epoch': 0.34} 34%|███▍ | 11819/34278 [13:03:52<117:18:11, 18.80s/it] 34%|███▍ | 11820/34278 [13:04:03<104:13:38, 16.71s/it] {'loss': 0.1467, 'grad_norm': 1.0731413585649892, 'learning_rate': 7.618502257670249e-06, 'epoch': 0.34} 34%|███▍ | 11820/34278 [13:04:03<104:13:38, 16.71s/it] 34%|███▍ | 11821/34278 [13:04:14<93:34:27, 15.00s/it] {'loss': 0.1305, 'grad_norm': 0.639650152004615, 'learning_rate': 7.618099778152193e-06, 'epoch': 0.34} 34%|███▍ | 11821/34278 [13:04:14<93:34:27, 15.00s/it] 34%|███▍ | 11822/34278 
[13:04:37<107:20:38, 17.21s/it] {'loss': 0.1682, 'grad_norm': 0.7677202402257982, 'learning_rate': 7.617697275260367e-06, 'epoch': 0.34} 34%|███▍ | 11822/34278 [13:04:37<107:20:38, 17.21s/it] 34%|███▍ | 11823/34278 [13:04:59<115:59:23, 18.60s/it] {'loss': 0.1397, 'grad_norm': 1.022652750176267, 'learning_rate': 7.6172947489983655e-06, 'epoch': 0.34} 34%|███▍ | 11823/34278 [13:04:59<115:59:23, 18.60s/it] 34%|███▍ | 11824/34278 [13:05:15<112:14:42, 18.00s/it] {'loss': 0.1339, 'grad_norm': 0.7521072658515839, 'learning_rate': 7.616892199369781e-06, 'epoch': 0.34} 34%|███▍ | 11824/34278 [13:05:15<112:14:42, 18.00s/it] 34%|███▍ | 11825/34278 [13:05:31<107:35:29, 17.25s/it] {'loss': 0.1631, 'grad_norm': 0.7864432038689395, 'learning_rate': 7.616489626378207e-06, 'epoch': 0.34} 34%|███▍ | 11825/34278 [13:05:31<107:35:29, 17.25s/it] 35%|███▍ | 11826/34278 [13:05:47<105:07:00, 16.85s/it] {'loss': 0.1544, 'grad_norm': 0.7365787358449442, 'learning_rate': 7.616087030027239e-06, 'epoch': 0.35} 35%|███▍ | 11826/34278 [13:05:47<105:07:00, 16.85s/it] 35%|███▍ | 11827/34278 [13:06:13<122:37:51, 19.66s/it] {'loss': 0.1677, 'grad_norm': 0.9077261418591541, 'learning_rate': 7.6156844103204704e-06, 'epoch': 0.35} 35%|███▍ | 11827/34278 [13:06:13<122:37:51, 19.66s/it] 35%|███▍ | 11828/34278 [13:06:24<107:20:33, 17.21s/it] {'loss': 0.1502, 'grad_norm': 0.7962660675824412, 'learning_rate': 7.615281767261495e-06, 'epoch': 0.35} 35%|███▍ | 11828/34278 [13:06:24<107:20:33, 17.21s/it] 35%|███▍ | 11829/34278 [13:06:36<96:07:01, 15.41s/it] {'loss': 0.1372, 'grad_norm': 0.8082546928000954, 'learning_rate': 7.6148791008539106e-06, 'epoch': 0.35} 35%|███▍ | 11829/34278 [13:06:36<96:07:01, 15.41s/it] 35%|███▍ | 11830/34278 [13:06:58<109:15:27, 17.52s/it] {'loss': 0.1753, 'grad_norm': 0.8922996356975215, 'learning_rate': 7.614476411101308e-06, 'epoch': 0.35} 35%|███▍ | 11830/34278 [13:06:58<109:15:27, 17.52s/it] 35%|███▍ | 11831/34278 [13:07:09<97:43:17, 15.67s/it] {'loss': 0.1533, 'grad_norm': 
0.8510290452679441, 'learning_rate': 7.614073698007285e-06, 'epoch': 0.35} 35%|███▍ | 11831/34278 [13:07:09<97:43:17, 15.67s/it] 35%|███▍ | 11832/34278 [13:07:26<99:29:13, 15.96s/it] {'loss': 0.1509, 'grad_norm': 0.8544577967758579, 'learning_rate': 7.613670961575435e-06, 'epoch': 0.35} 35%|███▍ | 11832/34278 [13:07:26<99:29:13, 15.96s/it] 35%|███▍ | 11833/34278 [13:07:31<79:44:28, 12.79s/it] {'loss': 0.1526, 'grad_norm': 1.0018614761194795, 'learning_rate': 7.613268201809354e-06, 'epoch': 0.35} 35%|███▍ | 11833/34278 [13:07:31<79:44:28, 12.79s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
35%|███▍ | 11834/34278 [13:07:45<81:07:00, 13.01s/it] {'loss': 0.1476, 'grad_norm': 1.0104117854874037, 'learning_rate': 7.61286541871264e-06, 'epoch': 0.35} 35%|███▍ | 11834/34278 [13:07:45<81:07:00, 13.01s/it] 35%|███▍ | 11835/34278 [13:08:04<92:14:21, 14.80s/it] {'loss': 0.1153, 'grad_norm': 0.6568999429253323, 'learning_rate': 7.612462612288886e-06, 'epoch': 0.35} 35%|███▍ | 11835/34278 [13:08:04<92:14:21, 14.80s/it] 35%|███▍ | 11836/34278 [13:08:18<91:00:01, 14.60s/it] {'loss': 0.1575, 'grad_norm': 0.9294501296868387, 'learning_rate': 7.61205978254169e-06, 'epoch': 0.35} 35%|███▍ | 11836/34278 [13:08:18<91:00:01, 14.60s/it] 35%|███▍ | 11837/34278 [13:08:34<92:40:16, 14.87s/it] {'loss': 0.1731, 'grad_norm': 1.1657132330027167, 'learning_rate': 7.611656929474649e-06, 'epoch': 0.35} 35%|███▍ | 11837/34278 [13:08:34<92:40:16, 14.87s/it] 35%|███▍ | 11838/34278 [13:08:39<75:46:51, 12.16s/it] {'loss': 0.1704, 'grad_norm': 0.9563649950542031, 'learning_rate': 7.611254053091357e-06, 'epoch': 0.35} 35%|███▍ | 11838/34278 [13:08:39<75:46:51, 12.16s/it] 35%|███▍ | 11839/34278 [13:08:53<78:31:59, 12.60s/it] {'loss': 0.1684, 'grad_norm': 1.2000026011577534, 'learning_rate': 7.610851153395413e-06, 'epoch': 0.35}
35%|███▍ | 11839/34278 [13:08:53<78:31:59, 12.60s/it] 35%|███▍ | 11840/34278 [13:08:56<60:59:11, 9.78s/it] {'loss': 0.1356, 'grad_norm': 0.7921110737620093, 'learning_rate': 7.6104482303904126e-06, 'epoch': 0.35} 35%|███▍ | 11840/34278 [13:08:56<60:59:11, 9.78s/it] 35%|███▍ | 11841/34278 [13:08:59<47:57:35, 7.70s/it] {'loss': 0.1611, 'grad_norm': 0.9640731752079257, 'learning_rate': 7.610045284079954e-06, 'epoch': 0.35} 35%|███▍ | 11841/34278 [13:08:59<47:57:35, 7.70s/it] 35%|███▍ | 11842/34278 [13:09:05<43:48:49, 7.03s/it] {'loss': 0.1499, 'grad_norm': 0.9162487112195041, 'learning_rate': 7.609642314467633e-06, 'epoch': 0.35} 35%|███▍ | 11842/34278 [13:09:05<43:48:49, 7.03s/it] 35%|███▍ | 11843/34278 [13:09:20<59:57:22, 9.62s/it] {'loss': 0.1494, 'grad_norm': 0.8670030361238367, 'learning_rate': 7.609239321557049e-06, 'epoch': 0.35} 35%|███▍ | 11843/34278 [13:09:20<59:57:22, 9.62s/it] 35%|███▍ | 11844/34278 [13:09:31<62:05:27, 9.96s/it] {'loss': 0.1318, 'grad_norm': 0.6570273748431265, 'learning_rate': 7.608836305351799e-06, 'epoch': 0.35} 35%|███▍ | 11844/34278 [13:09:31<62:05:27, 9.96s/it] 35%|███▍ | 11845/34278 [13:09:47<73:02:00, 11.72s/it] {'loss': 0.1615, 'grad_norm': 0.8564481053229913, 'learning_rate': 7.608433265855482e-06, 'epoch': 0.35} 35%|███▍ | 11845/34278 [13:09:47<73:02:00, 11.72s/it] 35%|███▍ | 11846/34278 [13:09:58<72:22:41, 11.62s/it] {'loss': 0.1593, 'grad_norm': 0.9063894688631816, 'learning_rate': 7.608030203071695e-06, 'epoch': 0.35} 35%|███▍ | 11846/34278 [13:09:58<72:22:41, 11.62s/it] 35%|███▍ | 11847/34278 [13:10:01<56:34:43, 9.08s/it] {'loss': 0.1485, 'grad_norm': 0.7773547283597553, 'learning_rate': 7.607627117004038e-06, 'epoch': 0.35} 35%|███▍ | 11847/34278 [13:10:01<56:34:43, 9.08s/it] 35%|███▍ | 11848/34278 [13:10:04<44:59:35, 7.22s/it] {'loss': 0.1692, 'grad_norm': 0.7893258556823853, 'learning_rate': 7.607224007656107e-06, 'epoch': 0.35} 35%|███▍ | 11848/34278 [13:10:04<44:59:35, 7.22s/it] 35%|███▍ | 11849/34278 
[13:10:10<42:33:33, 6.83s/it] {'loss': 0.1544, 'grad_norm': 0.8479660892855598, 'learning_rate': 7.606820875031504e-06, 'epoch': 0.35} 35%|███▍ | 11849/34278 [13:10:10<42:33:33, 6.83s/it] 35%|███▍ | 11850/34278 [13:10:14<37:03:37, 5.95s/it] {'loss': 0.1518, 'grad_norm': 0.803579369190686, 'learning_rate': 7.606417719133825e-06, 'epoch': 0.35} 35%|███▍ | 11850/34278 [13:10:14<37:03:37, 5.95s/it] 35%|███▍ | 11851/34278 [13:10:20<36:39:36, 5.88s/it] {'loss': 0.1504, 'grad_norm': 0.8354596794096185, 'learning_rate': 7.6060145399666704e-06, 'epoch': 0.35} 35%|███▍ | 11851/34278 [13:10:20<36:39:36, 5.88s/it] 35%|███▍ | 11852/34278 [13:10:23<32:12:18, 5.17s/it] {'loss': 0.1693, 'grad_norm': 0.8353572247515243, 'learning_rate': 7.605611337533643e-06, 'epoch': 0.35} 35%|███▍ | 11852/34278 [13:10:23<32:12:18, 5.17s/it] 35%|███▍ | 11853/34278 [13:10:26<27:33:03, 4.42s/it] {'loss': 0.1195, 'grad_norm': 0.7725720728359323, 'learning_rate': 7.6052081118383355e-06, 'epoch': 0.35} 35%|███▍ | 11853/34278 [13:10:26<27:33:03, 4.42s/it] 35%|███▍ | 11854/34278 [13:10:30<26:23:38, 4.24s/it] {'loss': 0.1787, 'grad_norm': 0.8027182717158562, 'learning_rate': 7.604804862884356e-06, 'epoch': 0.35} 35%|███▍ | 11854/34278 [13:10:30<26:23:38, 4.24s/it] 35%|███▍ | 11855/34278 [13:10:33<24:48:10, 3.98s/it] {'loss': 0.1328, 'grad_norm': 0.7192842937875643, 'learning_rate': 7.604401590675299e-06, 'epoch': 0.35} 35%|███▍ | 11855/34278 [13:10:33<24:48:10, 3.98s/it] 35%|███▍ | 11856/34278 [13:10:39<28:19:51, 4.55s/it] {'loss': 0.1352, 'grad_norm': 0.6950695808910768, 'learning_rate': 7.603998295214765e-06, 'epoch': 0.35} 35%|███▍ | 11856/34278 [13:10:39<28:19:51, 4.55s/it] 35%|███▍ | 11857/34278 [13:10:45<30:57:22, 4.97s/it] {'loss': 0.1266, 'grad_norm': 0.7661365595616834, 'learning_rate': 7.603594976506356e-06, 'epoch': 0.35} 35%|███▍ | 11857/34278 [13:10:45<30:57:22, 4.97s/it] 35%|███▍ | 11858/34278 [13:10:51<33:12:52, 5.33s/it] {'loss': 0.1726, 'grad_norm': 0.6651939436178301, 'learning_rate': 
7.6031916345536735e-06, 'epoch': 0.35} 35%|███▍ | 11858/34278 [13:10:51<33:12:52, 5.33s/it] 35%|███▍ | 11859/34278 [13:10:57<33:39:14, 5.40s/it] {'loss': 0.1596, 'grad_norm': 0.8730659487962338, 'learning_rate': 7.602788269360318e-06, 'epoch': 0.35} 35%|███▍ | 11859/34278 [13:10:57<33:39:14, 5.40s/it] 35%|███▍ | 11860/34278 [13:11:00<30:37:20, 4.92s/it] {'loss': 0.1703, 'grad_norm': 0.8727372307097581, 'learning_rate': 7.602384880929889e-06, 'epoch': 0.35} 35%|███▍ | 11860/34278 [13:11:00<30:37:20, 4.92s/it] 35%|███▍ | 11861/34278 [13:11:04<28:18:54, 4.55s/it] {'loss': 0.1466, 'grad_norm': 0.8503142625307699, 'learning_rate': 7.6019814692659885e-06, 'epoch': 0.35} 35%|███▍ | 11861/34278 [13:11:04<28:18:54, 4.55s/it] 35%|███▍ | 11862/34278 [13:11:07<25:29:34, 4.09s/it] {'loss': 0.1417, 'grad_norm': 0.9260690869444146, 'learning_rate': 7.601578034372221e-06, 'epoch': 0.35} 35%|███▍ | 11862/34278 [13:11:07<25:29:34, 4.09s/it] 35%|███▍ | 11863/34278 [13:11:11<25:07:10, 4.03s/it] {'loss': 0.1424, 'grad_norm': 0.9450116606750869, 'learning_rate': 7.601174576252184e-06, 'epoch': 0.35} 35%|███▍ | 11863/34278 [13:11:11<25:07:10, 4.03s/it] 35%|███▍ | 11864/34278 [13:11:14<23:36:54, 3.79s/it] {'loss': 0.1172, 'grad_norm': 0.9138215423529065, 'learning_rate': 7.600771094909483e-06, 'epoch': 0.35} 35%|███▍ | 11864/34278 [13:11:14<23:36:54, 3.79s/it] 35%|███▍ | 11865/34278 [13:11:19<24:37:06, 3.95s/it] {'loss': 0.1481, 'grad_norm': 0.7713805072391263, 'learning_rate': 7.600367590347716e-06, 'epoch': 0.35} 35%|███▍ | 11865/34278 [13:11:19<24:37:06, 3.95s/it] 35%|███▍ | 11866/34278 [13:11:22<22:41:06, 3.64s/it] {'loss': 0.1858, 'grad_norm': 0.879120531850407, 'learning_rate': 7.59996406257049e-06, 'epoch': 0.35} 35%|███▍ | 11866/34278 [13:11:22<22:41:06, 3.64s/it] 35%|███▍ | 11867/34278 [13:11:24<21:21:02, 3.43s/it] {'loss': 0.1262, 'grad_norm': 0.7535421844174716, 'learning_rate': 7.599560511581406e-06, 'epoch': 0.35} 35%|███▍ | 11867/34278 [13:11:24<21:21:02, 3.43s/it] 35%|███▍ 
| 11868/34278 [13:11:28<21:43:37, 3.49s/it] {'loss': 0.1309, 'grad_norm': 0.8378352398830269, 'learning_rate': 7.5991569373840625e-06, 'epoch': 0.35} 35%|███▍ | 11868/34278 [13:11:28<21:43:37, 3.49s/it] 35%|███▍ | 11869/34278 [13:11:31<20:59:31, 3.37s/it] {'loss': 0.1248, 'grad_norm': 0.8339734934011489, 'learning_rate': 7.59875333998207e-06, 'epoch': 0.35} 35%|███▍ | 11869/34278 [13:11:31<20:59:31, 3.37s/it] 35%|███▍ | 11870/34278 [13:11:36<24:13:25, 3.89s/it] {'loss': 0.1437, 'grad_norm': 0.7902759412143556, 'learning_rate': 7.598349719379028e-06, 'epoch': 0.35} 35%|███▍ | 11870/34278 [13:11:36<24:13:25, 3.89s/it] 35%|███▍ | 11871/34278 [13:11:41<26:12:34, 4.21s/it] {'loss': 0.1287, 'grad_norm': 0.8554018337484433, 'learning_rate': 7.597946075578538e-06, 'epoch': 0.35} 35%|███▍ | 11871/34278 [13:11:41<26:12:34, 4.21s/it] 35%|███▍ | 11872/34278 [13:11:44<23:37:02, 3.79s/it] {'loss': 0.1411, 'grad_norm': 0.9772872600690051, 'learning_rate': 7.5975424085842064e-06, 'epoch': 0.35} 35%|███▍ | 11872/34278 [13:11:44<23:37:02, 3.79s/it] 35%|███▍ | 11873/34278 [13:11:47<22:27:44, 3.61s/it] {'loss': 0.1531, 'grad_norm': 0.8488484589746388, 'learning_rate': 7.597138718399637e-06, 'epoch': 0.35} 35%|███▍ | 11873/34278 [13:11:47<22:27:44, 3.61s/it] 35%|███▍ | 11874/34278 [13:11:51<21:55:26, 3.52s/it] {'loss': 0.1408, 'grad_norm': 0.8447135368301548, 'learning_rate': 7.596735005028433e-06, 'epoch': 0.35} 35%|███▍ | 11874/34278 [13:11:51<21:55:26, 3.52s/it] 35%|███▍ | 11875/34278 [13:11:54<21:26:57, 3.45s/it] {'loss': 0.1532, 'grad_norm': 0.9235467300621414, 'learning_rate': 7.596331268474198e-06, 'epoch': 0.35} 35%|███▍ | 11875/34278 [13:11:54<21:26:57, 3.45s/it] 35%|███▍ | 11876/34278 [13:11:57<20:32:31, 3.30s/it] {'loss': 0.1313, 'grad_norm': 0.7567157832898448, 'learning_rate': 7.595927508740537e-06, 'epoch': 0.35} 35%|███▍ | 11876/34278 [13:11:57<20:32:31, 3.30s/it] 35%|███▍ | 11877/34278 [13:12:00<20:02:16, 3.22s/it] {'loss': 0.1479, 'grad_norm': 1.0210908962802436, 
'learning_rate': 7.595523725831055e-06, 'epoch': 0.35} 35%|███▍ | 11877/34278 [13:12:00<20:02:16, 3.22s/it] 35%|███▍ | 11878/34278 [13:12:04<22:07:29, 3.56s/it] {'loss': 0.1532, 'grad_norm': 0.9050302294750344, 'learning_rate': 7.595119919749358e-06, 'epoch': 0.35} 35%|███▍ | 11878/34278 [13:12:04<22:07:29, 3.56s/it] 35%|███▍ | 11879/34278 [13:12:08<22:17:40, 3.58s/it] {'loss': 0.147, 'grad_norm': 0.8667936149681241, 'learning_rate': 7.594716090499049e-06, 'epoch': 0.35} 35%|███▍ | 11879/34278 [13:12:08<22:17:40, 3.58s/it] 35%|███▍ | 11880/34278 [13:12:12<22:34:26, 3.63s/it] {'loss': 0.1464, 'grad_norm': 1.254160825988959, 'learning_rate': 7.5943122380837334e-06, 'epoch': 0.35} 35%|███▍ | 11880/34278 [13:12:12<22:34:26, 3.63s/it] 35%|███▍ | 11881/34278 [13:12:18<27:20:04, 4.39s/it] {'loss': 0.161, 'grad_norm': 1.0749824320900403, 'learning_rate': 7.5939083625070186e-06, 'epoch': 0.35} 35%|███▍ | 11881/34278 [13:12:18<27:20:04, 4.39s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (8469 > 8192). 
Running this sequence through the model will result in indexing errors 35%|███▍ | 11882/34278 [13:12:21<24:42:49, 3.97s/it] {'loss': 0.1449, 'grad_norm': 0.8301352679499988, 'learning_rate': 7.593504463772509e-06, 'epoch': 0.35} 35%|███▍ | 11882/34278 [13:12:21<24:42:49, 3.97s/it] 35%|███▍ | 11883/34278 [13:12:26<26:25:52, 4.25s/it] {'loss': 0.1406, 'grad_norm': 0.7502731131233293, 'learning_rate': 7.59310054188381e-06, 'epoch': 0.35} 35%|███▍ | 11883/34278 [13:12:26<26:25:52, 4.25s/it] 35%|███▍ | 11884/34278 [13:12:28<23:24:55, 3.76s/it] {'loss': 0.1627, 'grad_norm': 1.1800001635301305, 'learning_rate': 7.592696596844528e-06, 'epoch': 0.35} 35%|███▍ | 11884/34278 [13:12:28<23:24:55, 3.76s/it] 35%|███▍ | 11885/34278 [13:12:31<22:17:07, 3.58s/it] {'loss': 0.1413, 'grad_norm': 0.9820867231760797, 'learning_rate': 7.592292628658272e-06, 'epoch': 0.35} 35%|███▍ | 11885/34278 [13:12:31<22:17:07, 3.58s/it] 35%|███▍ | 11886/34278 [13:12:34<21:19:36, 3.43s/it] {'loss': 0.126, 'grad_norm': 0.6702341534760974, 'learning_rate': 7.591888637328645e-06, 'epoch': 0.35} 35%|███▍ | 11886/34278 [13:12:35<21:19:36, 3.43s/it] 35%|███▍ | 11887/34278 [13:12:38<21:52:27, 3.52s/it] {'loss': 0.1646, 'grad_norm': 1.0147294314014843, 'learning_rate': 7.591484622859254e-06, 'epoch': 0.35} 35%|███▍ | 11887/34278 [13:12:38<21:52:27, 3.52s/it] 35%|███▍ | 11888/34278 [13:12:42<21:48:05, 3.51s/it] {'loss': 0.1637, 'grad_norm': 1.0580514292145637, 'learning_rate': 7.591080585253709e-06, 'epoch': 0.35} 35%|███▍ | 11888/34278 [13:12:42<21:48:05, 3.51s/it] 35%|███▍ | 11889/34278 [13:12:48<26:13:04, 4.22s/it] {'loss': 0.1382, 'grad_norm': 0.7188400047709667, 'learning_rate': 7.590676524515612e-06, 'epoch': 0.35} 35%|███▍ | 11889/34278 [13:12:48<26:13:04, 4.22s/it] 35%|███▍ | 11890/34278 [13:12:50<23:38:15, 3.80s/it] {'loss': 0.1584, 'grad_norm': 0.8577118374671773, 'learning_rate': 7.5902724406485765e-06, 'epoch': 0.35} 35%|███▍ | 11890/34278 [13:12:50<23:38:15, 3.80s/it] 35%|███▍ | 11891/34278 
[13:12:54<22:32:49, 3.63s/it] {'loss': 0.1305, 'grad_norm': 0.8155399706740338, 'learning_rate': 7.589868333656205e-06, 'epoch': 0.35} 35%|███▍ | 11891/34278 [13:12:54<22:32:49, 3.63s/it] 35%|███▍ | 11892/34278 [13:12:57<22:05:00, 3.55s/it] {'loss': 0.1433, 'grad_norm': 0.9305157557534992, 'learning_rate': 7.5894642035421085e-06, 'epoch': 0.35} 35%|███▍ | 11892/34278 [13:12:57<22:05:00, 3.55s/it] 35%|███▍ | 11893/34278 [13:13:00<21:48:56, 3.51s/it] {'loss': 0.1369, 'grad_norm': 0.6943071711003896, 'learning_rate': 7.589060050309893e-06, 'epoch': 0.35} 35%|███▍ | 11893/34278 [13:13:00<21:48:56, 3.51s/it] 35%|███▍ | 11894/34278 [13:13:04<22:24:10, 3.60s/it] {'loss': 0.1411, 'grad_norm': 0.7759315503155931, 'learning_rate': 7.588655873963169e-06, 'epoch': 0.35} 35%|███▍ | 11894/34278 [13:13:04<22:24:10, 3.60s/it] 35%|███▍ | 11895/34278 [13:13:10<27:02:47, 4.35s/it] {'loss': 0.1446, 'grad_norm': 0.7894836418023452, 'learning_rate': 7.58825167450554e-06, 'epoch': 0.35} 35%|███▍ | 11895/34278 [13:13:10<27:02:47, 4.35s/it] 35%|███▍ | 11896/34278 [13:13:14<25:58:25, 4.18s/it] {'loss': 0.1408, 'grad_norm': 0.8018921402564896, 'learning_rate': 7.58784745194062e-06, 'epoch': 0.35} 35%|███▍ | 11896/34278 [13:13:14<25:58:25, 4.18s/it] 35%|███▍ | 11897/34278 [13:13:18<25:07:52, 4.04s/it] {'loss': 0.1768, 'grad_norm': 0.7823097961516359, 'learning_rate': 7.587443206272016e-06, 'epoch': 0.35} 35%|███▍ | 11897/34278 [13:13:18<25:07:52, 4.04s/it] 35%|███▍ | 11898/34278 [13:13:24<29:16:50, 4.71s/it] {'loss': 0.1427, 'grad_norm': 0.5954033579328916, 'learning_rate': 7.587038937503336e-06, 'epoch': 0.35} 35%|███▍ | 11898/34278 [13:13:24<29:16:50, 4.71s/it] 35%|███▍ | 11899/34278 [13:13:27<26:25:35, 4.25s/it] {'loss': 0.1302, 'grad_norm': 0.8447046836073026, 'learning_rate': 7.586634645638192e-06, 'epoch': 0.35} 35%|███▍ | 11899/34278 [13:13:27<26:25:35, 4.25s/it] 35%|███▍ | 11900/34278 [13:13:30<24:01:07, 3.86s/it] {'loss': 0.1341, 'grad_norm': 1.320328461967567, 'learning_rate': 
7.586230330680189e-06, 'epoch': 0.35} 35%|███▍ | 11900/34278 [13:13:30<24:01:07, 3.86s/it] 35%|███▍ | 11901/34278 [13:13:36<26:54:57, 4.33s/it] {'loss': 0.1734, 'grad_norm': 0.8787874159944483, 'learning_rate': 7.58582599263294e-06, 'epoch': 0.35} 35%|███▍ | 11901/34278 [13:13:36<26:54:57, 4.33s/it] 35%|███▍ | 11902/34278 [13:13:40<27:07:38, 4.36s/it] {'loss': 0.1544, 'grad_norm': 0.8941425430177624, 'learning_rate': 7.585421631500053e-06, 'epoch': 0.35} 35%|███▍ | 11902/34278 [13:13:40<27:07:38, 4.36s/it] 35%|███▍ | 11903/34278 [13:13:43<24:59:18, 4.02s/it] {'loss': 0.1424, 'grad_norm': 0.901562631007388, 'learning_rate': 7.585017247285139e-06, 'epoch': 0.35} 35%|███▍ | 11903/34278 [13:13:43<24:59:18, 4.02s/it] 35%|███▍ | 11904/34278 [13:13:49<28:20:59, 4.56s/it] {'loss': 0.1381, 'grad_norm': 0.86713749606017, 'learning_rate': 7.5846128399918085e-06, 'epoch': 0.35} 35%|███▍ | 11904/34278 [13:13:49<28:20:59, 4.56s/it] 35%|███▍ | 11905/34278 [13:13:53<27:32:06, 4.43s/it] {'loss': 0.1874, 'grad_norm': 0.7800106219324874, 'learning_rate': 7.5842084096236725e-06, 'epoch': 0.35} 35%|███▍ | 11905/34278 [13:13:53<27:32:06, 4.43s/it] 35%|███▍ | 11906/34278 [13:13:57<25:55:36, 4.17s/it] {'loss': 0.1495, 'grad_norm': 0.8805537185990067, 'learning_rate': 7.5838039561843394e-06, 'epoch': 0.35} 35%|███▍ | 11906/34278 [13:13:57<25:55:36, 4.17s/it] 35%|███▍ | 11907/34278 [13:14:00<23:20:22, 3.76s/it] {'loss': 0.1582, 'grad_norm': 1.0926782261149017, 'learning_rate': 7.58339947967742e-06, 'epoch': 0.35} 35%|███▍ | 11907/34278 [13:14:00<23:20:22, 3.76s/it] 35%|███▍ | 11908/34278 [13:14:04<24:07:22, 3.88s/it] {'loss': 0.1602, 'grad_norm': 0.7949222814397244, 'learning_rate': 7.58299498010653e-06, 'epoch': 0.35} 35%|███▍ | 11908/34278 [13:14:04<24:07:22, 3.88s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 35%|███▍ | 11909/34278 [13:14:08<24:37:07, 3.96s/it] {'loss': 0.1401, 'grad_norm': 1.0383029204101395, 'learning_rate': 7.582590457475277e-06, 'epoch': 0.35} 35%|███▍ | 11909/34278 [13:14:08<24:37:07, 3.96s/it] 35%|███▍ | 11910/34278 [13:14:14<28:29:48, 4.59s/it] {'loss': 0.157, 'grad_norm': 0.9646818690641682, 'learning_rate': 7.58218591178727e-06, 'epoch': 0.35} 35%|███▍ | 11910/34278 [13:14:14<28:29:48, 4.59s/it] 35%|███▍ | 11911/34278 [13:14:17<25:45:32, 4.15s/it] {'loss': 0.1291, 'grad_norm': 0.6976252489915159, 'learning_rate': 7.581781343046125e-06, 'epoch': 0.35} 35%|███▍ | 11911/34278 [13:14:17<25:45:32, 4.15s/it] 35%|███▍ | 11912/34278 [13:14:23<29:17:45, 4.72s/it] {'loss': 0.1608, 'grad_norm': 0.7660069983827438, 'learning_rate': 7.581376751255453e-06, 'epoch': 0.35} 35%|███▍ | 11912/34278 [13:14:23<29:17:45, 4.72s/it] 35%|███▍ | 11913/34278 [13:14:26<26:44:00, 4.30s/it] {'loss': 0.1423, 'grad_norm': 1.0084938181599934, 'learning_rate': 7.580972136418865e-06, 'epoch': 0.35} 35%|███▍ | 11913/34278 [13:14:27<26:44:00, 4.30s/it] 35%|███▍ | 11914/34278 [13:14:29<24:06:32, 3.88s/it] {'loss': 0.1482, 'grad_norm': 0.8818337085252713, 'learning_rate': 7.580567498539975e-06, 'epoch': 0.35} 35%|███▍ | 11914/34278 [13:14:29<24:06:32, 3.88s/it] 35%|███▍ | 11915/34278 [13:14:33<23:21:24, 3.76s/it] {'loss': 0.1395, 'grad_norm': 0.8588129928066195, 'learning_rate': 7.580162837622394e-06, 'epoch': 0.35} 35%|███▍ | 11915/34278 [13:14:33<23:21:24, 3.76s/it] 35%|███▍ | 11916/34278 [13:14:36<22:56:52, 3.69s/it] {'loss': 0.1566, 'grad_norm': 0.7396995712715371, 'learning_rate': 7.579758153669736e-06, 'epoch': 0.35} 35%|███▍ | 11916/34278 [13:14:36<22:56:52, 3.69s/it] 35%|███▍ | 11917/34278 [13:14:39<21:21:21, 3.44s/it] {'loss': 0.1488, 'grad_norm': 1.061775140027179, 'learning_rate': 7.579353446685611e-06, 'epoch': 0.35} 35%|███▍ | 11917/34278 [13:14:39<21:21:21, 3.44s/it] 35%|███▍ | 11918/34278 [13:14:42<20:41:13, 3.33s/it] {'loss': 
0.1277, 'grad_norm': 1.0504367619558557, 'learning_rate': 7.578948716673636e-06, 'epoch': 0.35} 35%|███▍ | 11918/34278 [13:14:42<20:41:13, 3.33s/it] 35%|███▍ | 11919/34278 [13:14:45<20:09:21, 3.25s/it] {'loss': 0.1393, 'grad_norm': 0.6785031052925815, 'learning_rate': 7.578543963637422e-06, 'epoch': 0.35} 35%|███▍ | 11919/34278 [13:14:45<20:09:21, 3.25s/it] 35%|███▍ | 11920/34278 [13:14:49<21:45:25, 3.50s/it] {'loss': 0.1549, 'grad_norm': 0.9635945082363817, 'learning_rate': 7.578139187580582e-06, 'epoch': 0.35} 35%|███▍ | 11920/34278 [13:14:49<21:45:25, 3.50s/it] 35%|███▍ | 11921/34278 [13:14:53<20:56:18, 3.37s/it] {'loss': 0.1622, 'grad_norm': 0.8271091528319552, 'learning_rate': 7.57773438850673e-06, 'epoch': 0.35} 35%|███▍ | 11921/34278 [13:14:53<20:56:18, 3.37s/it] 35%|███▍ | 11922/34278 [13:14:56<20:39:08, 3.33s/it] {'loss': 0.1502, 'grad_norm': 0.8516236273741031, 'learning_rate': 7.577329566419482e-06, 'epoch': 0.35} 35%|███▍ | 11922/34278 [13:14:56<20:39:08, 3.33s/it] 35%|███▍ | 11923/34278 [13:14:59<20:43:53, 3.34s/it] {'loss': 0.1534, 'grad_norm': 1.3892241652855426, 'learning_rate': 7.5769247213224515e-06, 'epoch': 0.35} 35%|███▍ | 11923/34278 [13:14:59<20:43:53, 3.34s/it] 35%|███▍ | 11924/34278 [13:15:03<22:23:07, 3.61s/it] {'loss': 0.1528, 'grad_norm': 0.8022308581565714, 'learning_rate': 7.576519853219253e-06, 'epoch': 0.35} 35%|███▍ | 11924/34278 [13:15:03<22:23:07, 3.61s/it] 35%|███▍ | 11925/34278 [13:15:07<22:01:39, 3.55s/it] {'loss': 0.1344, 'grad_norm': 0.9506101986804958, 'learning_rate': 7.576114962113499e-06, 'epoch': 0.35} 35%|███▍ | 11925/34278 [13:15:07<22:01:39, 3.55s/it] 35%|███▍ | 11926/34278 [13:15:12<25:26:00, 4.10s/it] {'loss': 0.1411, 'grad_norm': 0.6805201034378913, 'learning_rate': 7.575710048008804e-06, 'epoch': 0.35} 35%|███▍ | 11926/34278 [13:15:12<25:26:00, 4.10s/it] 35%|███▍ | 11927/34278 [13:15:18<28:21:50, 4.57s/it] {'loss': 0.1488, 'grad_norm': 0.9126539739122491, 'learning_rate': 7.575305110908789e-06, 'epoch': 0.35} 
35%|███▍ | 11927/34278 [13:15:18<28:21:50, 4.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 35%|███▍ | 11928/34278 [13:15:22<27:38:42, 4.45s/it] {'loss': 0.1382, 'grad_norm': 0.7806815575921464, 'learning_rate': 7.57490015081706e-06, 'epoch': 0.35} 35%|███▍ | 11928/34278 [13:15:22<27:38:42, 4.45s/it] 35%|███▍ | 11929/34278 [13:15:25<25:05:54, 4.04s/it] {'loss': 0.1357, 'grad_norm': 1.022223679229018, 'learning_rate': 7.5744951677372405e-06, 'epoch': 0.35} 35%|███▍ | 11929/34278 [13:15:25<25:05:54, 4.04s/it] 35%|███▍ | 11930/34278 [13:15:28<23:01:40, 3.71s/it] {'loss': 0.1598, 'grad_norm': 1.0693245096665438, 'learning_rate': 7.574090161672941e-06, 'epoch': 0.35} 35%|███▍ | 11930/34278 [13:15:28<23:01:40, 3.71s/it] 35%|███▍ | 11931/34278 [13:15:32<22:43:13, 3.66s/it] {'loss': 0.1458, 'grad_norm': 1.0079428372937314, 'learning_rate': 7.573685132627779e-06, 'epoch': 0.35} 35%|███▍ | 11931/34278 [13:15:32<22:43:13, 3.66s/it] 35%|███▍ | 11932/34278 [13:15:37<25:31:21, 4.11s/it] {'loss': 0.1711, 'grad_norm': 0.8614096510807508, 'learning_rate': 7.573280080605372e-06, 'epoch': 0.35} 35%|███▍ | 11932/34278 [13:15:37<25:31:21, 4.11s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 35%|███▍ | 11933/34278 [13:15:40<23:32:27, 3.79s/it] {'loss': 0.1649, 'grad_norm': 0.8788365902587741, 'learning_rate': 7.5728750056093324e-06, 'epoch': 0.35} 35%|███▍ | 11933/34278 [13:15:40<23:32:27, 3.79s/it] 35%|███▍ | 11934/34278 [13:15:43<21:57:12, 3.54s/it] {'loss': 0.154, 'grad_norm': 1.0518207279438658, 'learning_rate': 7.572469907643281e-06, 'epoch': 0.35} 35%|███▍ | 11934/34278 [13:15:43<21:57:12, 3.54s/it] 35%|███▍ | 11935/34278 [13:15:46<20:53:09, 3.37s/it] {'loss': 0.1525, 'grad_norm': 0.8490076901076699, 'learning_rate': 7.572064786710831e-06, 'epoch': 0.35} 35%|███▍ | 11935/34278 [13:15:46<20:53:09, 3.37s/it] 35%|███▍ | 11936/34278 [13:15:51<24:04:33, 3.88s/it] {'loss': 0.1479, 'grad_norm': 1.1160961260912805, 'learning_rate': 7.571659642815601e-06, 'epoch': 0.35} 35%|███▍ | 11936/34278 [13:15:51<24:04:33, 3.88s/it] 35%|███▍ | 11937/34278 [13:15:55<24:34:09, 3.96s/it] {'loss': 0.1856, 'grad_norm': 0.9834525667146082, 'learning_rate': 7.571254475961207e-06, 'epoch': 0.35} 35%|███▍ | 11937/34278 [13:15:55<24:34:09, 3.96s/it] 35%|███▍ | 11938/34278 [13:15:58<23:26:17, 3.78s/it] {'loss': 0.1222, 'grad_norm': 0.9704653302409625, 'learning_rate': 7.570849286151268e-06, 'epoch': 0.35} 35%|███▍ | 11938/34278 [13:15:58<23:26:17, 3.78s/it] 35%|███▍ | 11939/34278 [13:16:01<22:17:52, 3.59s/it] {'loss': 0.1481, 'grad_norm': 0.7723015389239746, 'learning_rate': 7.570444073389401e-06, 'epoch': 0.35} 35%|███▍ | 11939/34278 [13:16:01<22:17:52, 3.59s/it] 35%|███▍ | 11940/34278 [13:16:07<26:46:13, 4.31s/it] {'loss': 0.1449, 'grad_norm': 1.0652574036750242, 'learning_rate': 7.570038837679221e-06, 'epoch': 0.35} 35%|███▍ | 11940/34278 [13:16:07<26:46:13, 4.31s/it] 35%|███▍ | 11941/34278 [13:16:12<26:46:59, 4.32s/it] {'loss': 0.1347, 'grad_norm': 0.9229097044709855, 'learning_rate': 7.569633579024349e-06, 'epoch': 0.35} 35%|███▍ | 11941/34278 [13:16:12<26:46:59, 4.32s/it] 35%|███▍ | 11942/34278 [13:16:15<24:41:16, 3.98s/it] {'loss': 
0.1501, 'grad_norm': 1.0344299975079618, 'learning_rate': 7.569228297428401e-06, 'epoch': 0.35} 35%|███▍ | 11942/34278 [13:16:15<24:41:16, 3.98s/it] 35%|███▍ | 11943/34278 [13:16:21<28:24:09, 4.58s/it] {'loss': 0.1346, 'grad_norm': 1.0146508843935946, 'learning_rate': 7.568822992894996e-06, 'epoch': 0.35} 35%|███▍ | 11943/34278 [13:16:21<28:24:09, 4.58s/it] 35%|███▍ | 11944/34278 [13:16:24<26:07:50, 4.21s/it] {'loss': 0.1345, 'grad_norm': 0.7724812330039791, 'learning_rate': 7.5684176654277544e-06, 'epoch': 0.35} 35%|███▍ | 11944/34278 [13:16:24<26:07:50, 4.21s/it] 35%|███▍ | 11945/34278 [13:16:29<26:12:02, 4.22s/it] {'loss': 0.1524, 'grad_norm': 0.7955342858813292, 'learning_rate': 7.568012315030291e-06, 'epoch': 0.35} 35%|███▍ | 11945/34278 [13:16:29<26:12:02, 4.22s/it] 35%|███▍ | 11946/34278 [13:16:31<23:25:21, 3.78s/it] {'loss': 0.1633, 'grad_norm': 0.8702249736454624, 'learning_rate': 7.567606941706227e-06, 'epoch': 0.35} 35%|███▍ | 11946/34278 [13:16:31<23:25:21, 3.78s/it] 35%|███▍ | 11947/34278 [13:16:37<26:42:16, 4.31s/it] {'loss': 0.1411, 'grad_norm': 0.7052427597011488, 'learning_rate': 7.567201545459182e-06, 'epoch': 0.35} 35%|███▍ | 11947/34278 [13:16:37<26:42:16, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 35%|███▍ | 11948/34278 [13:16:40<24:16:05, 3.91s/it] {'loss': 0.136, 'grad_norm': 0.7754105467385218, 'learning_rate': 7.566796126292775e-06, 'epoch': 0.35} 35%|███▍ | 11948/34278 [13:16:40<24:16:05, 3.91s/it] 35%|███▍ | 11949/34278 [13:16:43<22:29:41, 3.63s/it] {'loss': 0.1383, 'grad_norm': 0.9373317031533087, 'learning_rate': 7.566390684210623e-06, 'epoch': 0.35} 35%|███▍ | 11949/34278 [13:16:43<22:29:41, 3.63s/it] 35%|███▍ | 11950/34278 [13:16:47<23:01:25, 3.71s/it] {'loss': 0.1305, 'grad_norm': 0.750597646362002, 'learning_rate': 7.565985219216348e-06, 'epoch': 0.35} 35%|███▍ | 11950/34278 [13:16:47<23:01:25, 3.71s/it] 35%|███▍ | 11951/34278 [13:16:51<23:33:57, 3.80s/it] {'loss': 0.1549, 'grad_norm': 0.7445725181939357, 'learning_rate': 7.56557973131357e-06, 'epoch': 0.35} 35%|███▍ | 11951/34278 [13:16:51<23:33:57, 3.80s/it] 35%|███▍ | 11952/34278 [13:16:54<23:10:51, 3.74s/it] {'loss': 0.1535, 'grad_norm': 1.1197940421993258, 'learning_rate': 7.565174220505908e-06, 'epoch': 0.35} 35%|███▍ | 11952/34278 [13:16:54<23:10:51, 3.74s/it] 35%|███▍ | 11953/34278 [13:16:57<21:41:18, 3.50s/it] {'loss': 0.1635, 'grad_norm': 0.8255686489900675, 'learning_rate': 7.5647686867969836e-06, 'epoch': 0.35} 35%|███▍ | 11953/34278 [13:16:57<21:41:18, 3.50s/it] 35%|███▍ | 11954/34278 [13:17:00<21:11:23, 3.42s/it] {'loss': 0.153, 'grad_norm': 0.8335188400060184, 'learning_rate': 7.564363130190417e-06, 'epoch': 0.35} 35%|███▍ | 11954/34278 [13:17:00<21:11:23, 3.42s/it] 35%|███▍ | 11955/34278 [13:17:04<20:55:09, 3.37s/it] {'loss': 0.1337, 'grad_norm': 0.9209087781811074, 'learning_rate': 7.563957550689829e-06, 'epoch': 0.35} 35%|███▍ | 11955/34278 [13:17:04<20:55:09, 3.37s/it] 35%|███▍ | 11956/34278 [13:17:07<19:57:03, 3.22s/it] {'loss': 0.165, 'grad_norm': 0.8560534645406798, 'learning_rate': 7.56355194829884e-06, 'epoch': 0.35} 35%|███▍ | 11956/34278 [13:17:07<19:57:03, 3.22s/it] 35%|███▍ | 11957/34278 [13:17:10<20:22:44, 3.29s/it] {'loss': 
0.1406, 'grad_norm': 0.8093598768193273, 'learning_rate': 7.563146323021069e-06, 'epoch': 0.35} 35%|███▍ | 11957/34278 [13:17:10<20:22:44, 3.29s/it] 35%|███▍ | 11958/34278 [13:17:15<23:58:50, 3.87s/it] {'loss': 0.1449, 'grad_norm': 0.6959558351912577, 'learning_rate': 7.56274067486014e-06, 'epoch': 0.35} 35%|███▍ | 11958/34278 [13:17:15<23:58:50, 3.87s/it] 35%|███▍ | 11959/34278 [13:17:18<22:21:16, 3.61s/it] {'loss': 0.1488, 'grad_norm': 0.7469802476050259, 'learning_rate': 7.562335003819676e-06, 'epoch': 0.35} 35%|███▍ | 11959/34278 [13:17:18<22:21:16, 3.61s/it] 35%|███▍ | 11960/34278 [13:17:21<21:23:29, 3.45s/it] {'loss': 0.1309, 'grad_norm': 0.7686951957740379, 'learning_rate': 7.561929309903295e-06, 'epoch': 0.35} 35%|███▍ | 11960/34278 [13:17:21<21:23:29, 3.45s/it] 35%|███▍ | 11961/34278 [13:17:24<20:41:46, 3.34s/it] {'loss': 0.1376, 'grad_norm': 0.6687390943009407, 'learning_rate': 7.561523593114621e-06, 'epoch': 0.35} 35%|███▍ | 11961/34278 [13:17:24<20:41:46, 3.34s/it] 35%|███▍ | 11962/34278 [13:17:28<20:58:19, 3.38s/it] {'loss': 0.136, 'grad_norm': 0.7062388428726224, 'learning_rate': 7.561117853457277e-06, 'epoch': 0.35} 35%|███▍ | 11962/34278 [13:17:28<20:58:19, 3.38s/it] 35%|███▍ | 11963/34278 [13:17:34<25:39:27, 4.14s/it] {'loss': 0.1302, 'grad_norm': 0.8304008654549321, 'learning_rate': 7.560712090934883e-06, 'epoch': 0.35} 35%|███▍ | 11963/34278 [13:17:34<25:39:27, 4.14s/it] 35%|███▍ | 11964/34278 [13:17:38<25:09:35, 4.06s/it] {'loss': 0.1174, 'grad_norm': 1.1233598241761529, 'learning_rate': 7.560306305551064e-06, 'epoch': 0.35} 35%|███▍ | 11964/34278 [13:17:38<25:09:35, 4.06s/it] 35%|███▍ | 11965/34278 [13:17:41<23:22:03, 3.77s/it] {'loss': 0.1574, 'grad_norm': 0.6913879268733832, 'learning_rate': 7.5599004973094404e-06, 'epoch': 0.35} 35%|███▍ | 11965/34278 [13:17:41<23:22:03, 3.77s/it] 35%|███▍ | 11966/34278 [13:17:44<22:23:27, 3.61s/it] {'loss': 0.153, 'grad_norm': 1.0013549561427575, 'learning_rate': 7.559494666213636e-06, 'epoch': 0.35} 
35%|███▍ | 11966/34278 [13:17:44<22:23:27, 3.61s/it] 35%|███▍ | 11967/34278 [13:17:48<23:52:00, 3.85s/it] {'loss': 0.1576, 'grad_norm': 0.732861297331376, 'learning_rate': 7.559088812267274e-06, 'epoch': 0.35} 35%|███▍ | 11967/34278 [13:17:48<23:52:00, 3.85s/it] 35%|███▍ | 11968/34278 [13:17:52<23:44:57, 3.83s/it] {'loss': 0.1535, 'grad_norm': 0.8105003688198452, 'learning_rate': 7.55868293547398e-06, 'epoch': 0.35} 35%|███▍ | 11968/34278 [13:17:52<23:44:57, 3.83s/it] 35%|███▍ | 11969/34278 [13:17:55<22:44:08, 3.67s/it] {'loss': 0.141, 'grad_norm': 0.8230910520647171, 'learning_rate': 7.558277035837373e-06, 'epoch': 0.35} 35%|███▍ | 11969/34278 [13:17:55<22:44:08, 3.67s/it] 35%|███▍ | 11970/34278 [13:17:59<22:00:09, 3.55s/it] {'loss': 0.1311, 'grad_norm': 0.7551049755951319, 'learning_rate': 7.5578711133610815e-06, 'epoch': 0.35} 35%|███▍ | 11970/34278 [13:17:59<22:00:09, 3.55s/it] 35%|███▍ | 11971/34278 [13:18:02<21:41:39, 3.50s/it] {'loss': 0.1257, 'grad_norm': 0.8307429867540659, 'learning_rate': 7.557465168048726e-06, 'epoch': 0.35} 35%|███▍ | 11971/34278 [13:18:02<21:41:39, 3.50s/it] 35%|███▍ | 11972/34278 [13:18:08<26:12:52, 4.23s/it] {'loss': 0.1416, 'grad_norm': 0.8900528984135553, 'learning_rate': 7.557059199903933e-06, 'epoch': 0.35} 35%|███▍ | 11972/34278 [13:18:08<26:12:52, 4.23s/it] 35%|███▍ | 11973/34278 [13:18:11<24:43:17, 3.99s/it] {'loss': 0.1301, 'grad_norm': 0.7253443237601457, 'learning_rate': 7.556653208930325e-06, 'epoch': 0.35} 35%|███▍ | 11973/34278 [13:18:11<24:43:17, 3.99s/it] 35%|███▍ | 11974/34278 [13:18:17<28:19:06, 4.57s/it] {'loss': 0.1566, 'grad_norm': 0.9300315031950055, 'learning_rate': 7.556247195131527e-06, 'epoch': 0.35} 35%|███▍ | 11974/34278 [13:18:17<28:19:06, 4.57s/it] 35%|███▍ | 11975/34278 [13:18:20<25:01:19, 4.04s/it] {'loss': 0.1435, 'grad_norm': 0.7510839853240795, 'learning_rate': 7.555841158511166e-06, 'epoch': 0.35} 35%|███▍ | 11975/34278 [13:18:20<25:01:19, 4.04s/it] 35%|███▍ | 11976/34278 [13:18:27<29:45:38, 
4.80s/it] {'loss': 0.1615, 'grad_norm': 0.7661954174756337, 'learning_rate': 7.555435099072864e-06, 'epoch': 0.35} 35%|███▍ | 11976/34278 [13:18:27<29:45:38, 4.80s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 35%|███▍ | 11977/34278 [13:18:30<26:54:24, 4.34s/it] {'loss': 0.1311, 'grad_norm': 0.9991595133539672, 'learning_rate': 7.555029016820248e-06, 'epoch': 0.35} 35%|███▍ | 11977/34278 [13:18:30<26:54:24, 4.34s/it] 35%|███▍ | 11978/34278 [13:18:33<24:00:07, 3.87s/it] {'loss': 0.127, 'grad_norm': 0.8549913344929823, 'learning_rate': 7.554622911756943e-06, 'epoch': 0.35} 35%|███▍ | 11978/34278 [13:18:33<24:00:07, 3.87s/it] 35%|███▍ | 11979/34278 [13:18:37<24:42:36, 3.99s/it] {'loss': 0.154, 'grad_norm': 1.4073594749639478, 'learning_rate': 7.554216783886573e-06, 'epoch': 0.35} 35%|███▍ | 11979/34278 [13:18:37<24:42:36, 3.99s/it] 35%|███▍ | 11980/34278 [13:18:41<24:56:22, 4.03s/it] {'loss': 0.1617, 'grad_norm': 0.8859617198424669, 'learning_rate': 7.553810633212766e-06, 'epoch': 0.35} 35%|███▍ | 11980/34278 [13:18:41<24:56:22, 4.03s/it] 35%|███▍ | 11981/34278 [13:18:45<23:49:20, 3.85s/it] {'loss': 0.1383, 'grad_norm': 0.8206997272813257, 'learning_rate': 7.553404459739149e-06, 'epoch': 0.35} 35%|███▍ | 11981/34278 [13:18:45<23:49:20, 3.85s/it] 35%|███▍ | 11982/34278 [13:18:48<22:37:44, 3.65s/it] {'loss': 0.1281, 'grad_norm': 0.9161949749226932, 'learning_rate': 7.552998263469344e-06, 'epoch': 0.35} 35%|███▍ | 11982/34278 [13:18:48<22:37:44, 3.65s/it] 35%|███▍ | 11983/34278 [13:18:51<21:29:37, 3.47s/it] {'loss': 0.1389, 'grad_norm': 1.0145605532643704, 'learning_rate': 7.552592044406981e-06, 'epoch': 0.35} 35%|███▍ | 11983/34278 [13:18:51<21:29:37, 3.47s/it] 35%|███▍ | 11984/34278 [13:18:54<20:52:09, 3.37s/it] {'loss': 0.1581, 'grad_norm': 1.0319637108006643, 'learning_rate': 
7.552185802555687e-06, 'epoch': 0.35} 35%|███▍ | 11984/34278 [13:18:54<20:52:09, 3.37s/it] 35%|███▍ | 11985/34278 [13:18:57<20:00:15, 3.23s/it] {'loss': 0.1383, 'grad_norm': 0.8741925693978755, 'learning_rate': 7.551779537919086e-06, 'epoch': 0.35} 35%|███▍ | 11985/34278 [13:18:57<20:00:15, 3.23s/it] 35%|███▍ | 11986/34278 [13:19:00<19:42:13, 3.18s/it] {'loss': 0.1572, 'grad_norm': 1.0485253410979751, 'learning_rate': 7.551373250500806e-06, 'epoch': 0.35} 35%|███▍ | 11986/34278 [13:19:00<19:42:13, 3.18s/it] 35%|███▍ | 11987/34278 [13:19:04<20:44:01, 3.35s/it] {'loss': 0.1624, 'grad_norm': 0.8319919877014683, 'learning_rate': 7.550966940304476e-06, 'epoch': 0.35} 35%|███▍ | 11987/34278 [13:19:04<20:44:01, 3.35s/it] 35%|███▍ | 11988/34278 [13:19:09<24:23:42, 3.94s/it] {'loss': 0.1885, 'grad_norm': 1.2454719213792826, 'learning_rate': 7.550560607333721e-06, 'epoch': 0.35} 35%|███▍ | 11988/34278 [13:19:09<24:23:42, 3.94s/it] 35%|███▍ | 11989/34278 [13:19:13<23:37:34, 3.82s/it] {'loss': 0.1508, 'grad_norm': 1.0859185053467137, 'learning_rate': 7.55015425159217e-06, 'epoch': 0.35} 35%|███▍ | 11989/34278 [13:19:13<23:37:34, 3.82s/it] 35%|███▍ | 11990/34278 [13:19:16<23:02:46, 3.72s/it] {'loss': 0.1772, 'grad_norm': 0.8276987795363928, 'learning_rate': 7.549747873083451e-06, 'epoch': 0.35} 35%|███▍ | 11990/34278 [13:19:16<23:02:46, 3.72s/it] 35%|███▍ | 11991/34278 [13:19:21<25:54:47, 4.19s/it] {'loss': 0.1423, 'grad_norm': 0.7641150272930679, 'learning_rate': 7.549341471811192e-06, 'epoch': 0.35} 35%|███▍ | 11991/34278 [13:19:21<25:54:47, 4.19s/it] 35%|███▍ | 11992/34278 [13:19:24<23:30:57, 3.80s/it] {'loss': 0.1496, 'grad_norm': 0.9790484771841722, 'learning_rate': 7.54893504777902e-06, 'epoch': 0.35} 35%|███▍ | 11992/34278 [13:19:24<23:30:57, 3.80s/it] 35%|███▍ | 11993/34278 [13:19:27<22:13:06, 3.59s/it] {'loss': 0.1474, 'grad_norm': 0.6392963206767323, 'learning_rate': 7.548528600990565e-06, 'epoch': 0.35} 35%|███▍ | 11993/34278 [13:19:27<22:13:06, 3.59s/it] 35%|███▍ | 
11994/34278 [13:19:31<22:01:56, 3.56s/it] {'loss': 0.1316, 'grad_norm': 0.7092662469286853, 'learning_rate': 7.548122131449455e-06, 'epoch': 0.35} 35%|███▍ | 11994/34278 [13:19:31<22:01:56, 3.56s/it] 35%|███▍ | 11995/34278 [13:19:34<22:06:43, 3.57s/it] {'loss': 0.1359, 'grad_norm': 0.7921631611319702, 'learning_rate': 7.547715639159319e-06, 'epoch': 0.35} 35%|███▍ | 11995/34278 [13:19:34<22:06:43, 3.57s/it] 35%|███▍ | 11996/34278 [13:19:37<21:06:02, 3.41s/it] {'loss': 0.1493, 'grad_norm': 0.7064379363786392, 'learning_rate': 7.547309124123785e-06, 'epoch': 0.35} 35%|███▍ | 11996/34278 [13:19:37<21:06:02, 3.41s/it] 35%|███▍ | 11997/34278 [13:19:41<20:52:02, 3.37s/it] {'loss': 0.1337, 'grad_norm': 0.7506060493415203, 'learning_rate': 7.546902586346483e-06, 'epoch': 0.35} 35%|███▍ | 11997/34278 [13:19:41<20:52:02, 3.37s/it] 35%|███▌ | 11998/34278 [13:19:44<20:24:04, 3.30s/it] {'loss': 0.1353, 'grad_norm': 0.6260097911179058, 'learning_rate': 7.5464960258310435e-06, 'epoch': 0.35} 35%|███▌ | 11998/34278 [13:19:44<20:24:04, 3.30s/it] 35%|███▌ | 11999/34278 [13:19:47<20:01:26, 3.24s/it] {'loss': 0.1387, 'grad_norm': 0.8763805662986133, 'learning_rate': 7.546089442581097e-06, 'epoch': 0.35} 35%|███▌ | 11999/34278 [13:19:47<20:01:26, 3.24s/it] 35%|███▌ | 12000/34278 [13:19:50<19:12:37, 3.10s/it] {'loss': 0.1679, 'grad_norm': 0.8494966847252503, 'learning_rate': 7.545682836600269e-06, 'epoch': 0.35} 35%|███▌ | 12000/34278 [13:19:50<19:12:37, 3.10s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 35%|███▌ | 12001/34278 [13:20:24<77:13:55, 12.48s/it] {'loss': 0.1616, 'grad_norm': 0.7976569160323592, 'learning_rate': 7.5452762078921935e-06, 'epoch': 0.35} 35%|███▌ | 12001/34278 [13:20:24<77:13:55, 12.48s/it] 35%|███▌ | 12002/34278 [13:20:28<60:33:10, 9.79s/it] {'loss': 0.1499, 'grad_norm': 0.9691712783063527, 'learning_rate': 7.544869556460501e-06, 'epoch': 0.35} 35%|███▌ | 12002/34278 [13:20:28<60:33:10, 9.79s/it] 35%|███▌ | 12003/34278 [13:20:31<48:29:03, 7.84s/it] {'loss': 0.1234, 'grad_norm': 0.7424149450252084, 'learning_rate': 7.544462882308818e-06, 'epoch': 0.35} 35%|███▌ | 12003/34278 [13:20:31<48:29:03, 7.84s/it] 35%|███▌ | 12004/34278 [13:20:35<41:55:29, 6.78s/it] {'loss': 0.1315, 'grad_norm': 0.7619736975676252, 'learning_rate': 7.54405618544078e-06, 'epoch': 0.35} 35%|███▌ | 12004/34278 [13:20:35<41:55:29, 6.78s/it] 35%|███▌ | 12005/34278 [13:20:38<35:03:58, 5.67s/it] {'loss': 0.1285, 'grad_norm': 0.809507942534479, 'learning_rate': 7.543649465860015e-06, 'epoch': 0.35} 35%|███▌ | 12005/34278 [13:20:38<35:03:58, 5.67s/it] 35%|███▌ | 12006/34278 [13:20:42<31:39:44, 5.12s/it] {'loss': 0.1695, 'grad_norm': 0.939740138966658, 'learning_rate': 7.543242723570154e-06, 'epoch': 0.35} 35%|███▌ | 12006/34278 [13:20:42<31:39:44, 5.12s/it] 35%|███▌ | 12007/34278 [13:20:45<27:40:21, 4.47s/it] {'loss': 0.1522, 'grad_norm': 0.9345311448229102, 'learning_rate': 7.54283595857483e-06, 'epoch': 0.35} 35%|███▌ | 12007/34278 [13:20:45<27:40:21, 4.47s/it] 35%|███▌ | 12008/34278 [13:20:49<26:06:24, 4.22s/it] {'loss': 0.1683, 'grad_norm': 0.8543695874272043, 'learning_rate': 7.542429170877672e-06, 'epoch': 0.35} 35%|███▌ | 12008/34278 [13:20:49<26:06:24, 4.22s/it] 35%|███▌ | 12009/34278 [13:20:52<23:50:53, 3.86s/it] {'loss': 0.1223, 'grad_norm': 1.0671623136526542, 'learning_rate': 7.542022360482315e-06, 'epoch': 0.35} 35%|███▌ | 12009/34278 [13:20:52<23:50:53, 3.86s/it] 35%|███▌ | 12010/34278 [13:20:55<23:12:32, 3.75s/it] {'loss': 
0.1378, 'grad_norm': 0.7888800643287854, 'learning_rate': 7.54161552739239e-06, 'epoch': 0.35} 35%|███▌ | 12010/34278 [13:20:55<23:12:32, 3.75s/it] 35%|███▌ | 12011/34278 [13:21:01<27:11:28, 4.40s/it] {'loss': 0.1497, 'grad_norm': 0.8173120407487832, 'learning_rate': 7.541208671611526e-06, 'epoch': 0.35} 35%|███▌ | 12011/34278 [13:21:01<27:11:28, 4.40s/it] 35%|███▌ | 12012/34278 [13:21:06<27:33:26, 4.46s/it] {'loss': 0.1572, 'grad_norm': 1.160749866646878, 'learning_rate': 7.5408017931433585e-06, 'epoch': 0.35} 35%|███▌ | 12012/34278 [13:21:06<27:33:26, 4.46s/it] 35%|███▌ | 12013/34278 [13:21:12<29:59:36, 4.85s/it] {'loss': 0.1611, 'grad_norm': 0.9610673319523814, 'learning_rate': 7.540394891991519e-06, 'epoch': 0.35} 35%|███▌ | 12013/34278 [13:21:12<29:59:36, 4.85s/it] 35%|███▌ | 12014/34278 [13:21:17<32:01:56, 5.18s/it] {'loss': 0.1463, 'grad_norm': 0.8450580634548185, 'learning_rate': 7.539987968159641e-06, 'epoch': 0.35} 35%|███▌ | 12014/34278 [13:21:17<32:01:56, 5.18s/it] 35%|███▌ | 12015/34278 [13:21:21<28:44:14, 4.65s/it] {'loss': 0.1259, 'grad_norm': 0.9142269293585659, 'learning_rate': 7.539581021651357e-06, 'epoch': 0.35} 35%|███▌ | 12015/34278 [13:21:21<28:44:14, 4.65s/it] 35%|███▌ | 12016/34278 [13:21:24<26:07:07, 4.22s/it] {'loss': 0.1502, 'grad_norm': 0.923217182014225, 'learning_rate': 7.539174052470299e-06, 'epoch': 0.35} 35%|███▌ | 12016/34278 [13:21:24<26:07:07, 4.22s/it] 35%|███▌ | 12017/34278 [13:21:30<29:17:51, 4.74s/it] {'loss': 0.1492, 'grad_norm': 0.9057778560682554, 'learning_rate': 7.5387670606201e-06, 'epoch': 0.35} 35%|███▌ | 12017/34278 [13:21:30<29:17:51, 4.74s/it] 35%|███▌ | 12018/34278 [13:21:34<28:12:52, 4.56s/it] {'loss': 0.1529, 'grad_norm': 1.0498472866033246, 'learning_rate': 7.538360046104396e-06, 'epoch': 0.35} 35%|███▌ | 12018/34278 [13:21:34<28:12:52, 4.56s/it] 35%|███▌ | 12019/34278 [13:21:37<25:34:16, 4.14s/it] {'loss': 0.1377, 'grad_norm': 0.8125394527005271, 'learning_rate': 7.537953008926821e-06, 'epoch': 0.35} 35%|███▌ 
| 12019/34278 [13:21:37<25:34:16, 4.14s/it] 35%|███▌ | 12020/34278 [13:21:41<23:55:02, 3.87s/it] {'loss': 0.1473, 'grad_norm': 0.8165615823996295, 'learning_rate': 7.537545949091005e-06, 'epoch': 0.35} 35%|███▌ | 12020/34278 [13:21:41<23:55:02, 3.87s/it] 35%|███▌ | 12021/34278 [13:21:43<22:09:02, 3.58s/it] {'loss': 0.1346, 'grad_norm': 0.7409799015249711, 'learning_rate': 7.5371388666005866e-06, 'epoch': 0.35} 35%|███▌ | 12021/34278 [13:21:43<22:09:02, 3.58s/it] 35%|███▌ | 12022/34278 [13:21:46<21:03:22, 3.41s/it] {'loss': 0.1342, 'grad_norm': 0.8383167096198173, 'learning_rate': 7.536731761459197e-06, 'epoch': 0.35} 35%|███▌ | 12022/34278 [13:21:47<21:03:22, 3.41s/it] 35%|███▌ | 12023/34278 [13:21:52<24:49:54, 4.02s/it] {'loss': 0.1586, 'grad_norm': 0.7865520515596042, 'learning_rate': 7.536324633670471e-06, 'epoch': 0.35} 35%|███▌ | 12023/34278 [13:21:52<24:49:54, 4.02s/it] 35%|███▌ | 12024/34278 [13:21:55<22:50:04, 3.69s/it] {'loss': 0.1237, 'grad_norm': 0.757824135026902, 'learning_rate': 7.535917483238047e-06, 'epoch': 0.35} 35%|███▌ | 12024/34278 [13:21:55<22:50:04, 3.69s/it] 35%|███▌ | 12025/34278 [13:21:58<22:36:46, 3.66s/it] {'loss': 0.144, 'grad_norm': 0.8755553636142435, 'learning_rate': 7.535510310165555e-06, 'epoch': 0.35} 35%|███▌ | 12025/34278 [13:21:58<22:36:46, 3.66s/it] 35%|███▌ | 12026/34278 [13:22:04<27:00:33, 4.37s/it] {'loss': 0.1465, 'grad_norm': 0.755078719919549, 'learning_rate': 7.535103114456631e-06, 'epoch': 0.35} 35%|███▌ | 12026/34278 [13:22:04<27:00:33, 4.37s/it] 35%|███▌ | 12027/34278 [13:22:08<25:11:07, 4.07s/it] {'loss': 0.1559, 'grad_norm': 0.9402033884650173, 'learning_rate': 7.534695896114913e-06, 'epoch': 0.35} 35%|███▌ | 12027/34278 [13:22:08<25:11:07, 4.07s/it] 35%|███▌ | 12028/34278 [13:22:12<25:07:11, 4.06s/it] {'loss': 0.1448, 'grad_norm': 0.8540738477485491, 'learning_rate': 7.5342886551440355e-06, 'epoch': 0.35} 35%|███▌ | 12028/34278 [13:22:12<25:07:11, 4.06s/it] 35%|███▌ | 12029/34278 [13:22:16<25:49:11, 4.18s/it] 
{'loss': 0.154, 'grad_norm': 0.6425799279128513, 'learning_rate': 7.533881391547633e-06, 'epoch': 0.35} 35%|███▌ | 12029/34278 [13:22:16<25:49:11, 4.18s/it] 35%|███▌ | 12030/34278 [13:22:19<23:53:12, 3.87s/it] {'loss': 0.1432, 'grad_norm': 0.8609842832208886, 'learning_rate': 7.533474105329343e-06, 'epoch': 0.35} 35%|███▌ | 12030/34278 [13:22:19<23:53:12, 3.87s/it] 35%|███▌ | 12031/34278 [13:22:23<22:53:17, 3.70s/it] {'loss': 0.1636, 'grad_norm': 0.718611385625933, 'learning_rate': 7.5330667964928006e-06, 'epoch': 0.35} 35%|███▌ | 12031/34278 [13:22:23<22:53:17, 3.70s/it] 35%|███▌ | 12032/34278 [13:22:27<23:13:18, 3.76s/it] {'loss': 0.145, 'grad_norm': 0.7689415807473821, 'learning_rate': 7.5326594650416415e-06, 'epoch': 0.35} 35%|███▌ | 12032/34278 [13:22:27<23:13:18, 3.76s/it] 35%|███▌ | 12033/34278 [13:22:31<23:21:47, 3.78s/it] {'loss': 0.1612, 'grad_norm': 0.6523245428096088, 'learning_rate': 7.532252110979505e-06, 'epoch': 0.35} 35%|███▌ | 12033/34278 [13:22:31<23:21:47, 3.78s/it] 35%|███▌ | 12034/34278 [13:22:34<22:48:43, 3.69s/it] {'loss': 0.1475, 'grad_norm': 0.8062477362030178, 'learning_rate': 7.531844734310025e-06, 'epoch': 0.35} 35%|███▌ | 12034/34278 [13:22:34<22:48:43, 3.69s/it] 35%|███▌ | 12035/34278 [13:22:38<23:44:54, 3.84s/it] {'loss': 0.1475, 'grad_norm': 0.7632641767439492, 'learning_rate': 7.53143733503684e-06, 'epoch': 0.35} 35%|███▌ | 12035/34278 [13:22:38<23:44:54, 3.84s/it] 35%|███▌ | 12036/34278 [13:22:41<22:32:23, 3.65s/it] {'loss': 0.1442, 'grad_norm': 0.8321838797573419, 'learning_rate': 7.5310299131635874e-06, 'epoch': 0.35} 35%|███▌ | 12036/34278 [13:22:41<22:32:23, 3.65s/it] 35%|███▌ | 12037/34278 [13:22:44<21:30:22, 3.48s/it] {'loss': 0.1425, 'grad_norm': 0.753205481611474, 'learning_rate': 7.530622468693905e-06, 'epoch': 0.35} 35%|███▌ | 12037/34278 [13:22:44<21:30:22, 3.48s/it] 35%|███▌ | 12038/34278 [13:22:48<21:05:22, 3.41s/it] {'loss': 0.1732, 'grad_norm': 0.8974771231836078, 'learning_rate': 7.530215001631426e-06, 'epoch': 
0.35} 35%|███▌ | 12038/34278 [13:22:48<21:05:22, 3.41s/it] 35%|███▌ | 12039/34278 [13:22:51<20:44:16, 3.36s/it] {'loss': 0.1299, 'grad_norm': 0.6152625349741981, 'learning_rate': 7.5298075119797945e-06, 'epoch': 0.35} 35%|███▌ | 12039/34278 [13:22:51<20:44:16, 3.36s/it] 35%|███▌ | 12040/34278 [13:22:55<21:34:27, 3.49s/it] {'loss': 0.1226, 'grad_norm': 0.7619754294519129, 'learning_rate': 7.529399999742644e-06, 'epoch': 0.35} 35%|███▌ | 12040/34278 [13:22:55<21:34:27, 3.49s/it] 35%|███▌ | 12041/34278 [13:23:00<24:38:36, 3.99s/it] {'loss': 0.1615, 'grad_norm': 0.7399284271059957, 'learning_rate': 7.528992464923615e-06, 'epoch': 0.35} 35%|███▌ | 12041/34278 [13:23:00<24:38:36, 3.99s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 35%|███▌ | 12042/34278 [13:23:03<22:35:45, 3.66s/it] {'loss': 0.1632, 'grad_norm': 0.963118361864318, 'learning_rate': 7.528584907526343e-06, 'epoch': 0.35} 35%|███▌ | 12042/34278 [13:23:03<22:35:45, 3.66s/it] 35%|███▌ | 12043/34278 [13:23:06<21:19:57, 3.45s/it] {'loss': 0.1532, 'grad_norm': 0.8250067029946331, 'learning_rate': 7.52817732755447e-06, 'epoch': 0.35} 35%|███▌ | 12043/34278 [13:23:06<21:19:57, 3.45s/it] 35%|███▌ | 12044/34278 [13:23:12<26:16:26, 4.25s/it] {'loss': 0.168, 'grad_norm': 0.8625709749218062, 'learning_rate': 7.5277697250116335e-06, 'epoch': 0.35} 35%|███▌ | 12044/34278 [13:23:12<26:16:26, 4.25s/it] 35%|███▌ | 12045/34278 [13:23:15<24:15:56, 3.93s/it] {'loss': 0.1555, 'grad_norm': 0.8102563031029681, 'learning_rate': 7.527362099901472e-06, 'epoch': 0.35} 35%|███▌ | 12045/34278 [13:23:15<24:15:56, 3.93s/it] 35%|███▌ | 12046/34278 [13:23:18<23:10:17, 3.75s/it] {'loss': 0.1348, 'grad_norm': 0.8905492280962982, 'learning_rate': 7.526954452227626e-06, 'epoch': 0.35} 35%|███▌ | 12046/34278 [13:23:18<23:10:17, 3.75s/it] 35%|███▌ | 12047/34278 
[13:23:23<25:29:22, 4.13s/it] {'loss': 0.1553, 'grad_norm': 0.8863970891763994, 'learning_rate': 7.526546781993731e-06, 'epoch': 0.35} 35%|███▌ | 12047/34278 [13:23:24<25:29:22, 4.13s/it] 35%|███▌ | 12048/34278 [13:23:27<25:08:46, 4.07s/it] {'loss': 0.1264, 'grad_norm': 0.7320287163447757, 'learning_rate': 7.5261390892034315e-06, 'epoch': 0.35} 35%|███▌ | 12048/34278 [13:23:27<25:08:46, 4.07s/it] 35%|███▌ | 12049/34278 [13:23:30<22:31:16, 3.65s/it] {'loss': 0.1497, 'grad_norm': 0.8328094008963891, 'learning_rate': 7.525731373860365e-06, 'epoch': 0.35} 35%|███▌ | 12049/34278 [13:23:30<22:31:16, 3.65s/it] 35%|███▌ | 12050/34278 [13:23:34<22:18:20, 3.61s/it] {'loss': 0.1501, 'grad_norm': 0.8290748429832105, 'learning_rate': 7.525323635968171e-06, 'epoch': 0.35} 35%|███▌ | 12050/34278 [13:23:34<22:18:20, 3.61s/it] 35%|███▌ | 12051/34278 [13:23:37<21:54:01, 3.55s/it] {'loss': 0.1314, 'grad_norm': 0.7785605742897119, 'learning_rate': 7.524915875530493e-06, 'epoch': 0.35} 35%|███▌ | 12051/34278 [13:23:37<21:54:01, 3.55s/it] 35%|███▌ | 12052/34278 [13:23:41<22:52:36, 3.71s/it] {'loss': 0.1189, 'grad_norm': 0.7153148581576247, 'learning_rate': 7.524508092550968e-06, 'epoch': 0.35} 35%|███▌ | 12052/34278 [13:23:41<22:52:36, 3.71s/it] 35%|███▌ | 12053/34278 [13:23:44<21:51:51, 3.54s/it] {'loss': 0.1462, 'grad_norm': 1.00055863901744, 'learning_rate': 7.524100287033235e-06, 'epoch': 0.35} 35%|███▌ | 12053/34278 [13:23:44<21:51:51, 3.54s/it] 35%|███▌ | 12054/34278 [13:23:47<21:07:19, 3.42s/it] {'loss': 0.1563, 'grad_norm': 0.7853007709733817, 'learning_rate': 7.52369245898094e-06, 'epoch': 0.35} 35%|███▌ | 12054/34278 [13:23:47<21:07:19, 3.42s/it] 35%|███▌ | 12055/34278 [13:23:50<20:06:23, 3.26s/it] {'loss': 0.1374, 'grad_norm': 0.7423218991480369, 'learning_rate': 7.523284608397718e-06, 'epoch': 0.35} 35%|███▌ | 12055/34278 [13:23:50<20:06:23, 3.26s/it] 35%|███▌ | 12056/34278 [13:23:54<21:49:18, 3.54s/it] {'loss': 0.1408, 'grad_norm': 0.9201515724969784, 'learning_rate': 
7.522876735287217e-06, 'epoch': 0.35} 35%|███▌ | 12056/34278 [13:23:54<21:49:18, 3.54s/it] 35%|███▌ | 12057/34278 [13:23:57<20:56:03, 3.39s/it] {'loss': 0.1389, 'grad_norm': 0.7792873540258134, 'learning_rate': 7.522468839653072e-06, 'epoch': 0.35} 35%|███▌ | 12057/34278 [13:23:57<20:56:03, 3.39s/it] 35%|███▌ | 12058/34278 [13:24:03<24:45:23, 4.01s/it] {'loss': 0.1244, 'grad_norm': 0.7504657217148921, 'learning_rate': 7.522060921498928e-06, 'epoch': 0.35} 35%|███▌ | 12058/34278 [13:24:03<24:45:23, 4.01s/it] 35%|███▌ | 12059/34278 [13:24:07<24:07:39, 3.91s/it] {'loss': 0.1385, 'grad_norm': 0.7462839313186306, 'learning_rate': 7.521652980828427e-06, 'epoch': 0.35} 35%|███▌ | 12059/34278 [13:24:07<24:07:39, 3.91s/it] 35%|███▌ | 12060/34278 [13:24:10<22:35:42, 3.66s/it] {'loss': 0.1484, 'grad_norm': 0.7949877922378598, 'learning_rate': 7.521245017645209e-06, 'epoch': 0.35} 35%|███▌ | 12060/34278 [13:24:10<22:35:42, 3.66s/it] 35%|███▌ | 12061/34278 [13:24:16<27:04:51, 4.39s/it] {'loss': 0.1352, 'grad_norm': 0.7792433715977214, 'learning_rate': 7.520837031952919e-06, 'epoch': 0.35} 35%|███▌ | 12061/34278 [13:24:16<27:04:51, 4.39s/it] 35%|███▌ | 12062/34278 [13:24:19<25:25:33, 4.12s/it] {'loss': 0.1487, 'grad_norm': 0.7798477650475799, 'learning_rate': 7.520429023755196e-06, 'epoch': 0.35} 35%|███▌ | 12062/34278 [13:24:19<25:25:33, 4.12s/it] 35%|███▌ | 12063/34278 [13:24:22<23:30:18, 3.81s/it] {'loss': 0.1606, 'grad_norm': 0.7851479605761774, 'learning_rate': 7.520020993055686e-06, 'epoch': 0.35} 35%|███▌ | 12063/34278 [13:24:22<23:30:18, 3.81s/it] 35%|███▌ | 12064/34278 [13:24:25<21:45:25, 3.53s/it] {'loss': 0.1318, 'grad_norm': 0.7797851748101124, 'learning_rate': 7.5196129398580296e-06, 'epoch': 0.35} 35%|███▌ | 12064/34278 [13:24:25<21:45:25, 3.53s/it] 35%|███▌ | 12065/34278 [13:24:31<26:08:25, 4.24s/it] {'loss': 0.1762, 'grad_norm': 0.9523367546966441, 'learning_rate': 7.51920486416587e-06, 'epoch': 0.35} 35%|███▌ | 12065/34278 [13:24:31<26:08:25, 4.24s/it] 35%|███▌ 
| 12066/34278 [13:24:34<23:26:36, 3.80s/it] {'loss': 0.1298, 'grad_norm': 0.7943619545766352, 'learning_rate': 7.518796765982851e-06, 'epoch': 0.35} 35%|███▌ | 12066/34278 [13:24:34<23:26:36, 3.80s/it] 35%|███▌ | 12067/34278 [13:24:37<21:42:51, 3.52s/it] {'loss': 0.1368, 'grad_norm': 0.8896828727948359, 'learning_rate': 7.518388645312615e-06, 'epoch': 0.35} 35%|███▌ | 12067/34278 [13:24:37<21:42:51, 3.52s/it] 35%|███▌ | 12068/34278 [13:24:40<21:54:26, 3.55s/it] {'loss': 0.1214, 'grad_norm': 0.7809993755085402, 'learning_rate': 7.517980502158806e-06, 'epoch': 0.35} 35%|███▌ | 12068/34278 [13:24:40<21:54:26, 3.55s/it] 35%|███▌ | 12069/34278 [13:24:44<21:30:31, 3.49s/it] {'loss': 0.1715, 'grad_norm': 0.9034489189629171, 'learning_rate': 7.51757233652507e-06, 'epoch': 0.35} 35%|███▌ | 12069/34278 [13:24:44<21:30:31, 3.49s/it] 35%|███▌ | 12070/34278 [13:24:47<20:30:38, 3.32s/it] {'loss': 0.1403, 'grad_norm': 0.7698354845054213, 'learning_rate': 7.5171641484150484e-06, 'epoch': 0.35} 35%|███▌ | 12070/34278 [13:24:47<20:30:38, 3.32s/it] 35%|███▌ | 12071/34278 [13:24:50<19:50:54, 3.22s/it] {'loss': 0.1399, 'grad_norm': 0.9194387834832348, 'learning_rate': 7.516755937832386e-06, 'epoch': 0.35} 35%|███▌ | 12071/34278 [13:24:50<19:50:54, 3.22s/it] 35%|███▌ | 12072/34278 [13:24:54<21:13:40, 3.44s/it] {'loss': 0.1295, 'grad_norm': 2.4641941778338987, 'learning_rate': 7.516347704780726e-06, 'epoch': 0.35} 35%|███▌ | 12072/34278 [13:24:54<21:13:40, 3.44s/it] 35%|███▌ | 12073/34278 [13:24:59<25:03:40, 4.06s/it] {'loss': 0.1486, 'grad_norm': 0.6255735670416824, 'learning_rate': 7.5159394492637175e-06, 'epoch': 0.35} 35%|███▌ | 12073/34278 [13:24:59<25:03:40, 4.06s/it] 35%|███▌ | 12074/34278 [13:25:02<22:43:08, 3.68s/it] {'loss': 0.1256, 'grad_norm': 0.815201660035071, 'learning_rate': 7.5155311712849995e-06, 'epoch': 0.35} 35%|███▌ | 12074/34278 [13:25:02<22:43:08, 3.68s/it] 35%|███▌ | 12075/34278 [13:25:07<25:03:40, 4.06s/it] {'loss': 0.1495, 'grad_norm': 1.3017787395294462, 
'learning_rate': 7.515122870848222e-06, 'epoch': 0.35} 35%|███▌ | 12075/34278 [13:25:07<25:03:40, 4.06s/it] 35%|███▌ | 12076/34278 [13:25:11<25:52:13, 4.19s/it] {'loss': 0.1836, 'grad_norm': 0.8425767756467087, 'learning_rate': 7.5147145479570275e-06, 'epoch': 0.35} 35%|███▌ | 12076/34278 [13:25:11<25:52:13, 4.19s/it] 35%|███▌ | 12077/34278 [13:25:14<23:17:23, 3.78s/it] {'loss': 0.1378, 'grad_norm': 0.8544688733768933, 'learning_rate': 7.514306202615059e-06, 'epoch': 0.35} 35%|███▌ | 12077/34278 [13:25:14<23:17:23, 3.78s/it] 35%|███▌ | 12078/34278 [13:25:17<22:12:05, 3.60s/it] {'loss': 0.1688, 'grad_norm': 0.8529122529537075, 'learning_rate': 7.513897834825967e-06, 'epoch': 0.35} 35%|███▌ | 12078/34278 [13:25:17<22:12:05, 3.60s/it] 35%|███▌ | 12079/34278 [13:25:21<22:31:46, 3.65s/it] {'loss': 0.1547, 'grad_norm': 0.8849542775247677, 'learning_rate': 7.513489444593396e-06, 'epoch': 0.35} 35%|███▌ | 12079/34278 [13:25:21<22:31:46, 3.65s/it] 35%|███▌ | 12080/34278 [13:25:24<21:14:48, 3.45s/it] {'loss': 0.1611, 'grad_norm': 0.7622060557294766, 'learning_rate': 7.51308103192099e-06, 'epoch': 0.35} 35%|███▌ | 12080/34278 [13:25:24<21:14:48, 3.45s/it] 35%|███▌ | 12081/34278 [13:25:28<22:02:46, 3.58s/it] {'loss': 0.1543, 'grad_norm': 0.6766312717303754, 'learning_rate': 7.512672596812397e-06, 'epoch': 0.35} 35%|███▌ | 12081/34278 [13:25:28<22:02:46, 3.58s/it] 35%|███▌ | 12082/34278 [13:25:32<23:52:51, 3.87s/it] {'loss': 0.1495, 'grad_norm': 0.7355151435826048, 'learning_rate': 7.512264139271264e-06, 'epoch': 0.35} 35%|███▌ | 12082/34278 [13:25:32<23:52:51, 3.87s/it] 35%|███▌ | 12083/34278 [13:25:38<27:45:32, 4.50s/it] {'loss': 0.1688, 'grad_norm': 0.8755780711695437, 'learning_rate': 7.511855659301232e-06, 'epoch': 0.35} 35%|███▌ | 12083/34278 [13:25:38<27:45:32, 4.50s/it] 35%|███▌ | 12084/34278 [13:25:44<29:53:46, 4.85s/it] {'loss': 0.1417, 'grad_norm': 0.9116905314087498, 'learning_rate': 7.511447156905958e-06, 'epoch': 0.35} 35%|███▌ | 12084/34278 [13:25:44<29:53:46, 
4.85s/it] 35%|███▌ | 12085/34278 [13:25:49<29:11:40, 4.74s/it] {'loss': 0.1485, 'grad_norm': 0.859793818173384, 'learning_rate': 7.511038632089081e-06, 'epoch': 0.35} 35%|███▌ | 12085/34278 [13:25:49<29:11:40, 4.74s/it] 35%|███▌ | 12086/34278 [13:25:53<28:58:47, 4.70s/it] {'loss': 0.1417, 'grad_norm': 0.8647161525227471, 'learning_rate': 7.510630084854249e-06, 'epoch': 0.35} 35%|███▌ | 12086/34278 [13:25:53<28:58:47, 4.70s/it] 35%|███▌ | 12087/34278 [13:25:56<25:48:05, 4.19s/it] {'loss': 0.1461, 'grad_norm': 0.8573088874550703, 'learning_rate': 7.510221515205113e-06, 'epoch': 0.35} 35%|███▌ | 12087/34278 [13:25:56<25:48:05, 4.19s/it] 35%|███▌ | 12088/34278 [13:26:00<24:16:39, 3.94s/it] {'loss': 0.1323, 'grad_norm': 0.8025920440243713, 'learning_rate': 7.509812923145318e-06, 'epoch': 0.35} 35%|███▌ | 12088/34278 [13:26:00<24:16:39, 3.94s/it] 35%|███▌ | 12089/34278 [13:26:03<23:02:15, 3.74s/it] {'loss': 0.1675, 'grad_norm': 0.9133724786435535, 'learning_rate': 7.509404308678512e-06, 'epoch': 0.35} 35%|███▌ | 12089/34278 [13:26:03<23:02:15, 3.74s/it] 35%|███▌ | 12090/34278 [13:26:06<21:31:09, 3.49s/it] {'loss': 0.1413, 'grad_norm': 1.091990138478178, 'learning_rate': 7.5089956718083435e-06, 'epoch': 0.35} 35%|███▌ | 12090/34278 [13:26:06<21:31:09, 3.49s/it] 35%|███▌ | 12091/34278 [13:26:10<23:29:04, 3.81s/it] {'loss': 0.1661, 'grad_norm': 0.8776644025683882, 'learning_rate': 7.508587012538462e-06, 'epoch': 0.35} 35%|███▌ | 12091/34278 [13:26:10<23:29:04, 3.81s/it] 35%|███▌ | 12092/34278 [13:26:14<22:33:18, 3.66s/it] {'loss': 0.1553, 'grad_norm': 0.931144277232217, 'learning_rate': 7.508178330872512e-06, 'epoch': 0.35} 35%|███▌ | 12092/34278 [13:26:14<22:33:18, 3.66s/it] 35%|███▌ | 12093/34278 [13:26:17<21:31:15, 3.49s/it] {'loss': 0.1362, 'grad_norm': 0.9601895583663163, 'learning_rate': 7.507769626814145e-06, 'epoch': 0.35} 35%|███▌ | 12093/34278 [13:26:17<21:31:15, 3.49s/it] 35%|███▌ | 12094/34278 [13:26:20<21:03:53, 3.42s/it] {'loss': 0.1468, 'grad_norm': 
0.8446935053892277, 'learning_rate': 7.507360900367011e-06, 'epoch': 0.35}
35%|███▌ | 12095/34278 [13:26:23<20:27:00, 3.32s/it] {'loss': 0.1702, 'grad_norm': 0.8883699535563186, 'learning_rate': 7.5069521515347565e-06, 'epoch': 0.35}
35%|███▌ | 12096/34278 [13:26:28<23:44:50, 3.85s/it] {'loss': 0.1325, 'grad_norm': 0.9515818571757315, 'learning_rate': 7.506543380321032e-06, 'epoch': 0.35}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f20e47b2070>
Failed to fetch sample 2854382.
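The traceback above shows a corrupted image that `PIL.Image.open` cannot decode; the dataset logs the failure and training continues at the next step, so the bad sample is evidently skipped rather than fatal. Below is a minimal, illustrative sketch of that skip-and-retry pattern. The class, the `_get_item` name (taken from the traceback), and the deterministic next-index fallback are assumptions for illustration, not the repository's actual code.

```python
class RobustDataset:
    """Illustrative dataset whose __getitem__ skips samples that fail to
    decode (e.g. a corrupted image raising PIL.UnidentifiedImageError)
    instead of crashing the whole training run."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples
        self.max_retries = max_retries

    def _get_item(self, i):
        # Hypothetical decode step; raises on corrupted entries,
        # mirroring Image.open failing on an unreadable buffer.
        value = self.samples[i]
        if value is None:
            raise ValueError(f"cannot identify image file at index {i}")
        return value

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                # Matches the log's "Failed to fetch sample N. Exception: ..." pattern.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)  # assumed fallback: try the next sample
        raise RuntimeError("too many consecutive unreadable samples")


ds = RobustDataset(["img0", None, "img2"])
print(ds[0])  # -> img0
print(ds[1])  # logs the failure for index 1, then returns img2
```

A real loader might instead resample a random index or drop the sample from the epoch; the key point is that decode errors are caught inside `__getitem__` so one bad file costs a log line, not the run.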
Exception: cannot identify image file <_io.BytesIO object at 0x7f20e47b2070>
35%|███▌ | 12097/34278 [13:26:32<23:32:27, 3.82s/it] {'loss': 0.1505, 'grad_norm': 0.7833744001885308, 'learning_rate': 7.5061345867294875e-06, 'epoch': 0.35}
35%|███▌ | 12098/34278 [13:26:37<26:32:37, 4.31s/it] {'loss': 0.1545, 'grad_norm': 0.8904929112833724, 'learning_rate': 7.505725770763769e-06, 'epoch': 0.35}
35%|███▌ | 12099/34278 [13:26:41<24:38:29, 4.00s/it] {'loss': 0.1457, 'grad_norm': 0.7462988140846417, 'learning_rate': 7.505316932427531e-06, 'epoch': 0.35}
35%|███▌ | 12100/34278 [13:26:44<22:59:12, 3.73s/it] {'loss': 0.1401, 'grad_norm': 0.8340269984700273, 'learning_rate': 7.504908071724422e-06, 'epoch': 0.35}
35%|███▌ | 12101/34278 [13:26:47<22:13:06, 3.61s/it] {'loss': 0.1387, 'grad_norm': 0.9812345267265725, 'learning_rate': 7.5044991886580895e-06, 'epoch': 0.35}
35%|███▌ | 12102/34278 [13:26:50<20:45:59, 3.37s/it] {'loss': 0.1306, 'grad_norm': 0.7395035014847263, 'learning_rate': 7.504090283232188e-06, 'epoch': 0.35}
35%|███▌ | 12103/34278 [13:26:56<25:16:30, 4.10s/it] {'loss': 0.1692, 'grad_norm': 0.9028502130789045, 'learning_rate': 7.503681355450365e-06, 'epoch': 0.35}
35%|███▌ | 12104/34278 [13:27:01<28:27:36, 4.62s/it] {'loss': 0.1315, 'grad_norm': 0.7143190591113868, 'learning_rate': 7.503272405316273e-06, 'epoch': 0.35}
35%|███▌ | 12105/34278 [13:27:05<27:15:35, 4.43s/it] {'loss': 0.1384, 'grad_norm': 0.6872319562248489, 'learning_rate': 7.502863432833563e-06, 'epoch': 0.35}
35%|███▌ | 
12106/34278 [13:27:09<25:05:28, 4.07s/it] {'loss': 0.1786, 'grad_norm': 0.9539852257019577, 'learning_rate': 7.502454438005886e-06, 'epoch': 0.35} 35%|███▌ | 12106/34278 [13:27:09<25:05:28, 4.07s/it] 35%|███▌ | 12107/34278 [13:27:12<22:53:07, 3.72s/it] {'loss': 0.1495, 'grad_norm': 0.9964917694248208, 'learning_rate': 7.502045420836892e-06, 'epoch': 0.35} 35%|███▌ | 12107/34278 [13:27:12<22:53:07, 3.72s/it] 35%|███▌ | 12108/34278 [13:27:16<24:25:03, 3.96s/it] {'loss': 0.157, 'grad_norm': 0.990353130855816, 'learning_rate': 7.501636381330234e-06, 'epoch': 0.35} 35%|███▌ | 12108/34278 [13:27:16<24:25:03, 3.96s/it] 35%|███▌ | 12109/34278 [13:27:20<24:23:56, 3.96s/it] {'loss': 0.156, 'grad_norm': 1.0891538696380902, 'learning_rate': 7.5012273194895655e-06, 'epoch': 0.35} 35%|███▌ | 12109/34278 [13:27:20<24:23:56, 3.96s/it] 35%|███▌ | 12110/34278 [13:27:23<22:48:54, 3.71s/it] {'loss': 0.1466, 'grad_norm': 1.0180733110929292, 'learning_rate': 7.500818235318533e-06, 'epoch': 0.35} 35%|███▌ | 12110/34278 [13:27:23<22:48:54, 3.71s/it] 35%|███▌ | 12111/34278 [13:27:26<21:39:24, 3.52s/it] {'loss': 0.1498, 'grad_norm': 1.0325203116977864, 'learning_rate': 7.5004091288207956e-06, 'epoch': 0.35} 35%|███▌ | 12111/34278 [13:27:26<21:39:24, 3.52s/it] 35%|███▌ | 12112/34278 [13:27:29<20:48:28, 3.38s/it] {'loss': 0.1304, 'grad_norm': 0.6384166936673309, 'learning_rate': 7.500000000000001e-06, 'epoch': 0.35} 35%|███▌ | 12112/34278 [13:27:29<20:48:28, 3.38s/it] 35%|███▌ | 12113/34278 [13:27:32<19:45:37, 3.21s/it] {'loss': 0.1439, 'grad_norm': 0.8528566954497445, 'learning_rate': 7.499590848859802e-06, 'epoch': 0.35} 35%|███▌ | 12113/34278 [13:27:32<19:45:37, 3.21s/it] 35%|███▌ | 12114/34278 [13:27:35<19:34:03, 3.18s/it] {'loss': 0.1413, 'grad_norm': 0.8353246956365478, 'learning_rate': 7.499181675403855e-06, 'epoch': 0.35} 35%|███▌ | 12114/34278 [13:27:35<19:34:03, 3.18s/it] 35%|███▌ | 12115/34278 [13:27:38<19:24:52, 3.15s/it] {'loss': 0.1422, 'grad_norm': 0.782769096074687, 
'learning_rate': 7.49877247963581e-06, 'epoch': 0.35} 35%|███▌ | 12115/34278 [13:27:38<19:24:52, 3.15s/it] 35%|███▌ | 12116/34278 [13:27:41<18:28:22, 3.00s/it] {'loss': 0.1436, 'grad_norm': 0.748058712896468, 'learning_rate': 7.49836326155932e-06, 'epoch': 0.35} 35%|███▌ | 12116/34278 [13:27:41<18:28:22, 3.00s/it] 35%|███▌ | 12117/34278 [13:27:45<19:50:57, 3.22s/it] {'loss': 0.1508, 'grad_norm': 0.8964088295486463, 'learning_rate': 7.4979540211780396e-06, 'epoch': 0.35} 35%|███▌ | 12117/34278 [13:27:45<19:50:57, 3.22s/it] 35%|███▌ | 12118/34278 [13:27:49<22:06:41, 3.59s/it] {'loss': 0.1356, 'grad_norm': 0.7038898495325068, 'learning_rate': 7.497544758495622e-06, 'epoch': 0.35} 35%|███▌ | 12118/34278 [13:27:49<22:06:41, 3.59s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 35%|███▌ | 12119/34278 [13:27:52<21:27:17, 3.49s/it] {'loss': 0.1367, 'grad_norm': 0.6770117758968943, 'learning_rate': 7.497135473515719e-06, 'epoch': 0.35} 35%|███▌ | 12119/34278 [13:27:52<21:27:17, 3.49s/it] 35%|███▌ | 12120/34278 [13:27:55<20:35:19, 3.35s/it] {'loss': 0.1538, 'grad_norm': 0.9081026508989675, 'learning_rate': 7.49672616624199e-06, 'epoch': 0.35} 35%|███▌ | 12120/34278 [13:27:55<20:35:19, 3.35s/it] 35%|███▌ | 12121/34278 [13:27:58<19:57:02, 3.24s/it] {'loss': 0.1468, 'grad_norm': 0.9105090322175228, 'learning_rate': 7.496316836678085e-06, 'epoch': 0.35} 35%|███▌ | 12121/34278 [13:27:58<19:57:02, 3.24s/it] 35%|███▌ | 12122/34278 [13:28:02<19:40:41, 3.20s/it] {'loss': 0.1595, 'grad_norm': 1.0005440326874793, 'learning_rate': 7.495907484827658e-06, 'epoch': 0.35} 35%|███▌ | 12122/34278 [13:28:02<19:40:41, 3.20s/it] 35%|███▌ | 12123/34278 [13:28:05<20:53:13, 3.39s/it] {'loss': 0.1313, 'grad_norm': 0.943506731773164, 'learning_rate': 7.495498110694364e-06, 'epoch': 0.35} 35%|███▌ | 12123/34278 
[13:28:05<20:53:13, 3.39s/it]
35%|███▌ | 12124/34278 [13:28:08<19:34:39, 3.18s/it] {'loss': 0.1451, 'grad_norm': 0.9540944309194603, 'learning_rate': 7.495088714281862e-06, 'epoch': 0.35}
35%|███▌ | 12125/34278 [13:28:11<19:49:14, 3.22s/it] {'loss': 0.1469, 'grad_norm': 1.1399722373912111, 'learning_rate': 7.494679295593801e-06, 'epoch': 0.35}
35%|███▌ | 12126/34278 [13:28:17<24:04:19, 3.91s/it] {'loss': 0.1518, 'grad_norm': 0.8379482755652226, 'learning_rate': 7.49426985463384e-06, 'epoch': 0.35}
35%|███▌ | 12127/34278 [13:28:20<23:25:21, 3.81s/it] {'loss': 0.1539, 'grad_norm': 0.889161350385342, 'learning_rate': 7.493860391405632e-06, 'epoch': 0.35}
35%|███▌ | 12128/34278 [13:28:25<24:38:09, 4.00s/it] {'loss': 0.1491, 'grad_norm': 1.0597870595233896, 'learning_rate': 7.4934509059128334e-06, 'epoch': 0.35}
35%|███▌ | 12129/34278 [13:28:28<23:46:53, 3.87s/it] {'loss': 0.1492, 'grad_norm': 0.7716787255958263, 'learning_rate': 7.493041398159102e-06, 'epoch': 0.35}
35%|███▌ | 12130/34278 [13:28:31<21:53:18, 3.56s/it] {'loss': 0.1298, 'grad_norm': 0.7270262602920922, 'learning_rate': 7.49263186814809e-06, 'epoch': 0.35}
35%|███▌ | 12131/34278 [13:28:34<20:31:57, 3.34s/it] {'loss': 0.149, 'grad_norm': 1.020961829125233, 'learning_rate': 7.492222315883458e-06, 'epoch': 0.35}
35%|███▌ | 12132/34278 [13:28:39<24:10:37, 3.93s/it] {'loss': 0.1781, 'grad_norm': 0.8551619094051521, 'learning_rate': 7.491812741368859e-06, 'epoch': 0.35}
35%|███▌ | 12133/34278 [13:28:42<22:23:59, 3.64s/it] {'loss': 0.1584, 'grad_norm': 0.7148504836663766, 'learning_rate': 7.491403144607951e-06, 'epoch': 0.35}
35%|███▌ | 12134/34278 [13:28:45<21:15:53, 3.46s/it] {'loss': 0.1528, 'grad_norm': 0.7310860409671713, 'learning_rate': 7.490993525604389e-06, 'epoch': 0.35}
35%|███▌ | 12135/34278 [13:28:49<20:54:28, 3.40s/it] {'loss': 0.142, 'grad_norm': 0.8109727413891358, 'learning_rate': 7.490583884361834e-06, 'epoch': 0.35}
35%|███▌ | 12136/34278 [13:28:53<22:27:10, 3.65s/it] {'loss': 0.1671, 'grad_norm': 0.719566074690827, 'learning_rate': 7.49017422088394e-06, 'epoch': 0.35}
35%|███▌ | 12137/34278 [13:28:57<22:18:25, 3.63s/it] {'loss': 0.15, 'grad_norm': 0.7792648338872127, 'learning_rate': 7.489764535174363e-06, 'epoch': 0.35}
35%|███▌ | 12138/34278 [13:29:03<27:26:18, 4.46s/it] {'loss': 0.1703, 'grad_norm': 1.1983618739593251, 'learning_rate': 7.489354827236765e-06, 'epoch': 0.35}
35%|███▌ | 12139/34278 [13:29:06<24:41:50, 4.02s/it] {'loss': 0.133, 'grad_norm': 0.8303239451422616, 'learning_rate': 7.4889450970748e-06, 'epoch': 0.35}
35%|███▌ | 12140/34278 [13:29:12<28:59:31, 4.71s/it] {'loss': 0.1457, 'grad_norm': 0.9016886227944668, 'learning_rate': 7.488535344692127e-06, 'epoch': 0.35}
35%|███▌ | 12141/34278 [13:29:18<31:03:59, 5.05s/it] {'loss': 0.1544, 'grad_norm': 0.8093533800280821, 'learning_rate': 7.488125570092406e-06, 'epoch': 0.35}
35%|███▌ | 12142/34278 [13:29:22<28:33:02, 4.64s/it] {'loss': 0.1475, 'grad_norm': 0.8239954533166405, 'learning_rate': 7.487715773279293e-06, 'epoch': 0.35}
35%|███▌ | 12143/34278 [13:29:28<30:57:37, 5.04s/it] {'loss': 0.1468, 'grad_norm': 1.099868567715278, 'learning_rate': 7.4873059542564465e-06, 'epoch': 0.35}
35%|███▌ | 12144/34278 [13:29:31<28:17:36, 4.60s/it] {'loss': 0.1479, 'grad_norm': 1.0263479613663975, 'learning_rate': 7.486896113027528e-06, 'epoch': 0.35}
35%|███▌ | 12145/34278 [13:29:35<26:08:47, 4.25s/it] {'loss': 0.1572, 'grad_norm': 0.7351293633729539, 'learning_rate': 7.486486249596194e-06, 'epoch': 0.35}
35%|███▌ | 12146/34278 [13:29:38<24:45:19, 4.03s/it] {'loss': 0.1359, 'grad_norm': 0.7289195572148732, 'learning_rate': 7.486076363966104e-06, 'epoch': 0.35}
35%|███▌ | 12147/34278 [13:29:41<23:04:36, 3.75s/it] {'loss': 0.1309, 'grad_norm': 0.9182815224095634, 'learning_rate': 7.485666456140918e-06, 'epoch': 0.35}
35%|███▌ | 12148/34278 [13:29:45<21:56:44, 3.57s/it] {'loss': 0.1657, 'grad_norm': 0.8835430736644039, 'learning_rate': 7.485256526124295e-06, 'epoch': 0.35}
35%|███▌ | 12149/34278 [13:29:48<22:25:09, 3.65s/it] {'loss': 0.1398, 'grad_norm': 0.8613889013926775, 'learning_rate': 7.484846573919895e-06, 'epoch': 0.35}
35%|███▌ | 12150/34278 [13:29:53<24:20:03, 3.96s/it] {'loss': 0.1483, 'grad_norm': 0.9474985870617246, 'learning_rate': 7.484436599531377e-06, 'epoch': 0.35}
35%|███▌ | 12151/34278 [13:29:59<27:49:44, 4.53s/it] {'loss': 0.156, 'grad_norm': 0.8719680967907784, 'learning_rate': 7.484026602962405e-06, 'epoch': 0.35}
35%|███▌ | 12152/34278 [13:30:02<25:13:30, 4.10s/it] {'loss': 0.1311, 'grad_norm': 0.9192150275876452, 'learning_rate': 7.483616584216633e-06, 'epoch': 0.35}
35%|███▌ | 12153/34278 [13:30:06<24:05:22, 3.92s/it] {'loss': 0.1274, 'grad_norm': 0.8408136692254884, 'learning_rate': 7.483206543297727e-06, 'epoch': 0.35}
35%|███▌ | 12154/34278 [13:30:09<22:45:48, 3.70s/it] {'loss': 0.1455, 'grad_norm': 1.3566829517458583, 'learning_rate': 7.482796480209346e-06, 'epoch': 0.35}
35%|███▌ | 12155/34278 [13:30:12<22:11:28, 3.61s/it] {'loss': 0.1445, 'grad_norm': 0.9014132017831294, 'learning_rate': 7.48238639495515e-06, 'epoch': 0.35}
35%|███▌ | 12156/34278 [13:30:17<24:12:11, 3.94s/it] {'loss': 0.1471, 'grad_norm': 0.8463275021589733, 'learning_rate': 7.481976287538802e-06, 'epoch': 0.35}
35%|███▌ | 12157/34278 [13:30:21<24:36:34, 4.01s/it] {'loss': 0.1442, 'grad_norm': 0.9980583664623567, 'learning_rate': 7.481566157963961e-06, 'epoch': 0.35}
35%|███▌ | 12158/34278 [13:30:24<22:23:56, 3.65s/it] {'loss': 0.1308, 'grad_norm': 0.8458981667551823, 'learning_rate': 7.481156006234289e-06, 'epoch': 0.35}
35%|███▌ | 12159/34278 [13:30:27<21:00:43, 3.42s/it] {'loss': 0.1355, 'grad_norm': 0.8842798179502824, 'learning_rate': 7.480745832353451e-06, 'epoch': 0.35}
35%|███▌ | 12160/34278 [13:30:30<21:35:34, 3.51s/it] {'loss': 0.1516, 'grad_norm': 0.9249057662349253, 'learning_rate': 7.480335636325104e-06, 'epoch': 0.35}
35%|███▌ | 12161/34278 [13:30:34<21:50:48, 3.56s/it] {'loss': 0.1372, 'grad_norm': 0.9525575932168225, 'learning_rate': 7.479925418152914e-06, 'epoch': 0.35}
35%|███▌ | 12162/34278 [13:30:37<20:50:50, 3.39s/it] {'loss': 0.1445, 'grad_norm': 0.9058629692852751, 'learning_rate': 7.479515177840542e-06, 'epoch': 0.35}
35%|███▌ | 12163/34278 [13:30:40<20:14:43, 3.30s/it] {'loss': 0.1602, 'grad_norm': 0.93263703609085, 'learning_rate': 7.479104915391649e-06, 'epoch': 0.35}
35%|███▌ | 12164/34278 [13:30:44<21:46:30, 3.54s/it] {'loss': 0.1168, 'grad_norm': 0.8515894085441673, 'learning_rate': 7.478694630809899e-06, 'epoch': 0.35}
35%|███▌ | 12165/34278 [13:30:47<20:39:51, 3.36s/it] {'loss': 0.1408, 'grad_norm': 0.8805126690114949, 'learning_rate': 7.478284324098957e-06, 'epoch': 0.35}
35%|███▌ | 12166/34278 [13:30:50<19:51:44, 3.23s/it] {'loss': 0.1591, 'grad_norm': 0.7426349012566554, 'learning_rate': 7.4778739952624835e-06, 'epoch': 0.35}
35%|███▌ | 12167/34278 [13:30:53<19:03:57, 3.10s/it] {'loss': 0.1356, 'grad_norm': 0.6397230883071294, 'learning_rate': 7.477463644304141e-06, 'epoch': 0.35}
35%|███▌ | 12168/34278 [13:30:57<19:57:50, 3.25s/it] {'loss': 0.1417, 'grad_norm': 0.8982364452423646, 'learning_rate': 7.477053271227596e-06, 'epoch': 0.35}
36%|███▌ | 12169/34278 [13:30:59<19:18:51, 3.14s/it] {'loss': 0.1737, 'grad_norm': 0.8447914658551411, 'learning_rate': 7.47664287603651e-06, 'epoch': 0.36}
36%|███▌ | 12170/34278 [13:31:03<19:23:55, 3.16s/it] {'loss': 0.1503, 'grad_norm': 0.8370376273579871, 'learning_rate': 7.476232458734547e-06, 'epoch': 0.36}
36%|███▌ | 12171/34278 [13:31:06<20:09:39, 3.28s/it] {'loss': 0.129, 'grad_norm': 0.8019143284953446, 'learning_rate': 7.475822019325374e-06, 'epoch': 0.36}
36%|███▌ | 12172/34278 [13:31:11<22:16:50, 3.63s/it] {'loss': 0.1345, 'grad_norm': 0.9026727682417437, 'learning_rate': 7.475411557812652e-06, 'epoch': 0.36}
36%|███▌ | 12173/34278 [13:31:14<21:16:51, 3.47s/it] {'loss': 0.1414, 'grad_norm': 0.8508773884433103, 'learning_rate': 7.4750010742000445e-06, 'epoch': 0.36}
36%|███▌ | 12174/34278 [13:31:17<20:07:48, 3.28s/it] {'loss': 0.1418, 'grad_norm': 0.9824204270516677, 'learning_rate': 7.474590568491222e-06, 'epoch': 0.36}
36%|███▌ | 12175/34278 [13:31:19<19:27:59, 3.17s/it] {'loss': 0.1409, 'grad_norm': 0.8934801781268447, 'learning_rate': 7.474180040689842e-06, 'epoch': 0.36}
36%|███▌ | 12176/34278 [13:31:22<19:05:25, 3.11s/it] {'loss': 0.1524, 'grad_norm': 0.9221120609784726, 'learning_rate': 7.473769490799575e-06, 'epoch': 0.36}
36%|███▌ | 12177/34278 [13:31:28<23:44:50, 3.87s/it] {'loss': 0.1619, 'grad_norm': 0.9789817786148268, 'learning_rate': 7.473358918824085e-06, 'epoch': 0.36}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
36%|███▌ | 12178/34278 [13:31:31<22:30:38, 3.67s/it] {'loss': 0.1468, 'grad_norm': 0.7818843977047737, 'learning_rate': 7.472948324767035e-06, 'epoch': 0.36}
36%|███▌ | 12179/34278 [13:31:35<21:47:30, 3.55s/it] {'loss': 0.1323, 'grad_norm': 0.8606870497211695, 'learning_rate': 7.472537708632095e-06, 'epoch': 0.36}
36%|███▌ | 12180/34278 [13:31:39<22:39:04, 3.69s/it] {'loss': 0.1628, 'grad_norm': 0.8906836695819145, 'learning_rate': 7.472127070422928e-06, 'epoch': 0.36}
36%|███▌ | 12181/34278 [13:31:42<22:04:53, 3.60s/it] {'loss': 0.122, 'grad_norm': 0.8088359794507107, 'learning_rate': 7.4717164101432e-06, 'epoch': 0.36}
36%|███▌ | 12182/34278 [13:31:45<21:20:39, 3.48s/it] {'loss': 0.1158, 'grad_norm': 0.8566020698157975, 'learning_rate': 7.471305727796579e-06, 'epoch': 0.36}
36%|███▌ | 12183/34278 [13:31:50<24:38:31, 4.02s/it] {'loss': 0.1742, 'grad_norm': 0.8750096258289879, 'learning_rate': 7.470895023386728e-06, 'epoch': 0.36}
36%|███▌ | 12184/34278 [13:31:54<23:28:06, 3.82s/it] {'loss': 0.1523, 'grad_norm': 0.8400584190518494, 'learning_rate': 7.470484296917319e-06, 'epoch': 0.36}
36%|███▌ | 12185/34278 [13:31:58<23:32:15, 3.84s/it] {'loss': 0.146, 'grad_norm': 1.110400780793465, 'learning_rate': 7.470073548392014e-06, 'epoch': 0.36}
36%|███▌ | 12186/34278 [13:32:01<23:18:57, 3.80s/it] {'loss': 0.1428, 'grad_norm': 0.8514545642807553, 'learning_rate': 7.469662777814484e-06, 'epoch': 0.36}
36%|███▌ | 12187/34278 [13:32:05<22:19:26, 3.64s/it] {'loss': 0.1775, 'grad_norm': 0.8435930628979348, 'learning_rate': 7.469251985188392e-06, 'epoch': 0.36}
36%|███▌ | 12188/34278 [13:32:07<20:40:02, 3.37s/it] {'loss': 0.1388, 'grad_norm': 0.727108203502303, 'learning_rate': 7.468841170517408e-06, 'epoch': 0.36}
36%|███▌ | 12189/34278 [13:32:10<19:34:11, 3.19s/it] {'loss': 0.1307, 'grad_norm': 1.5890076218904514, 'learning_rate': 7.468430333805201e-06, 'epoch': 0.36}
36%|███▌ | 12190/34278 [13:32:16<24:46:34, 4.04s/it] {'loss': 0.1615, 'grad_norm': 0.8050188464842346, 'learning_rate': 7.468019475055436e-06, 'epoch': 0.36}
36%|███▌ | 12191/34278 [13:32:19<22:47:07, 3.71s/it] {'loss': 0.1486, 'grad_norm': 0.6650859130140865, 'learning_rate': 7.467608594271782e-06, 'epoch': 0.36}
36%|███▌ | 12192/34278 [13:32:22<21:10:58, 3.45s/it] {'loss': 0.1446, 'grad_norm': 0.8896317278270026, 'learning_rate': 7.467197691457908e-06, 'epoch': 0.36}
36%|███▌ | 12193/34278 [13:32:25<20:30:01, 3.34s/it] {'loss': 0.1321, 'grad_norm': 0.7923808324329165, 'learning_rate': 7.466786766617482e-06, 'epoch': 0.36}
36%|███▌ | 12194/34278 [13:32:28<20:15:06, 3.30s/it] {'loss': 0.1333, 'grad_norm': 0.8962565546218314, 'learning_rate': 7.466375819754173e-06, 'epoch': 0.36}
36%|███▌ | 12195/34278 [13:32:31<19:50:49, 3.24s/it] {'loss': 0.1294, 'grad_norm': 0.8146336102797297, 'learning_rate': 7.46596485087165e-06, 'epoch': 0.36}
36%|███▌ | 12196/34278 [13:32:35<20:02:56, 3.27s/it] {'loss': 0.1432, 'grad_norm': 0.8661790351439472, 'learning_rate': 7.465553859973581e-06, 'epoch': 0.36}
36%|███▌ | 12197/34278 [13:32:39<22:07:16, 3.61s/it] {'loss': 0.154, 'grad_norm': 0.8719026272812894, 'learning_rate': 7.465142847063634e-06, 'epoch': 0.36}
36%|███▌ | 12198/34278 [13:32:42<21:11:17, 3.45s/it] {'loss': 0.1184, 'grad_norm': 0.6680003649006871, 'learning_rate': 7.464731812145483e-06, 'epoch': 0.36}
36%|███▌ | 12199/34278 [13:32:47<24:23:48, 3.98s/it] {'loss': 0.1653, 'grad_norm': 0.7488816345308577, 'learning_rate': 7.464320755222793e-06, 'epoch': 0.36}
36%|███▌ | 12200/34278 [13:32:51<23:59:58, 3.91s/it] {'loss': 0.1359, 'grad_norm': 0.8290221349289797, 'learning_rate': 7.4639096762992345e-06, 'epoch': 0.36}
36%|███▌ | 12201/34278 [13:32:55<24:27:24, 3.99s/it] {'loss': 0.1589, 'grad_norm': 1.0037579553709008, 'learning_rate': 7.463498575378482e-06, 'epoch': 0.36}
36%|███▌ | 12202/34278 [13:32:59<24:04:08, 3.93s/it] {'loss': 0.1403, 'grad_norm': 0.7962303807839208, 'learning_rate': 7.463087452464199e-06, 'epoch': 0.36}
36%|███▌ | 12203/34278 [13:33:03<23:15:40, 3.79s/it] {'loss': 0.1664, 'grad_norm': 0.8988662046657548, 'learning_rate': 7.462676307560059e-06, 'epoch': 0.36}
36%|███▌ | 12204/34278 [13:33:06<23:03:33, 3.76s/it] {'loss': 0.1488, 'grad_norm': 0.8946106540425235, 'learning_rate': 7.462265140669735e-06, 'epoch': 0.36}
36%|███▌ | 12205/34278 [13:33:12<26:02:58, 4.25s/it] {'loss': 0.1566, 'grad_norm': 0.866532976364385, 'learning_rate': 7.461853951796895e-06, 'epoch': 0.36}
36%|███▌ | 12206/34278 [13:33:14<23:17:02, 3.80s/it] {'loss': 0.1466, 'grad_norm': 0.8344186923159563, 'learning_rate': 7.4614427409452116e-06, 'epoch': 0.36}
36%|███▌ | 12207/34278 [13:33:20<27:13:48, 4.44s/it] {'loss': 0.1577, 'grad_norm': 1.290096770545583, 'learning_rate': 7.461031508118354e-06, 'epoch': 0.36}
36%|███▌ | 12208/34278 [13:33:27<30:50:09, 5.03s/it] {'loss': 0.1697, 'grad_norm': 0.8999158766283113, 'learning_rate': 7.4606202533199945e-06, 'epoch': 0.36}
36%|███▌ | 12209/34278 [13:33:31<29:27:40, 4.81s/it] {'loss': 0.1698, 'grad_norm': 0.9195358674976805, 'learning_rate': 7.460208976553804e-06, 'epoch': 0.36}
36%|███▌ | 12210/34278 [13:33:34<26:53:53, 4.39s/it] {'loss': 0.1179, 'grad_norm': 1.03287815865787, 'learning_rate': 7.459797677823456e-06, 'epoch': 0.36}
36%|███▌ | 12211/34278 [13:33:38<26:03:37, 4.25s/it] {'loss': 0.1521, 'grad_norm': 0.8481792372747801, 'learning_rate': 7.4593863571326204e-06, 'epoch': 0.36}
36%|███▌ | 12212/34278 [13:33:42<24:24:42, 3.98s/it] {'loss': 0.1374, 'grad_norm': 0.905091407798776, 'learning_rate': 7.458975014484972e-06, 'epoch': 0.36}
36%|███▌ | 12213/34278 [13:33:45<22:44:22, 3.71s/it] {'loss': 0.1383, 'grad_norm': 0.9903840552254256, 'learning_rate': 7.458563649884182e-06, 'epoch': 0.36}
36%|███▌ | 12214/34278 [13:33:48<21:30:13, 3.51s/it] {'loss': 0.188, 'grad_norm': 0.8566119220499994, 'learning_rate': 7.458152263333921e-06, 'epoch': 0.36}
36%|███▌ | 12215/34278 [13:33:51<21:25:03, 3.49s/it] {'loss': 0.1359, 'grad_norm': 0.806007458321855, 'learning_rate': 7.457740854837865e-06, 'epoch': 0.36}
36%|███▌ | 12216/34278 [13:33:57<26:05:00, 4.26s/it] {'loss': 0.1636, 'grad_norm': 0.846336649082963, 'learning_rate': 7.457329424399685e-06, 'epoch': 0.36}
36%|███▌ | 12217/34278 [13:34:00<23:35:23, 3.85s/it] {'loss': 0.1261, 'grad_norm': 0.9295740844894287, 'learning_rate': 7.456917972023052e-06, 'epoch': 0.36}
36%|███▌ | 12218/34278 [13:34:04<24:06:23, 3.93s/it] {'loss': 0.1388, 'grad_norm': 0.6394264915549989, 'learning_rate': 7.456506497711644e-06, 'epoch': 0.36}
36%|███▌ | 12219/34278 [13:34:08<23:14:11, 3.79s/it] {'loss': 0.1593, 'grad_norm': 0.8709909836795148, 'learning_rate': 7.456095001469135e-06, 'epoch': 0.36}
36%|███▌ | 12220/34278 [13:34:12<24:49:08, 4.05s/it] {'loss': 0.1728, 'grad_norm': 0.9421989226538512, 'learning_rate': 7.455683483299192e-06, 'epoch': 0.36}
36%|███▌ | 12221/34278 [13:34:18<28:05:50, 4.59s/it] {'loss': 0.1569, 'grad_norm': 0.8111037998968257, 'learning_rate': 7.455271943205495e-06, 'epoch': 0.36}
36%|███▌ | 12222/34278 [13:34:21<25:17:29, 4.13s/it] {'loss': 0.1377, 'grad_norm': 0.8724484550037249, 'learning_rate': 7.4548603811917155e-06, 'epoch': 0.36}
36%|███▌ | 12223/34278 [13:34:24<23:20:36, 3.81s/it] {'loss': 0.1614, 'grad_norm': 0.7079865680622673, 'learning_rate': 7.454448797261529e-06, 'epoch': 0.36}
36%|███▌ | 12224/34278 [13:34:27<21:39:50, 3.54s/it] {'loss': 0.1599, 'grad_norm': 0.7457006911986671, 'learning_rate': 7.45403719141861e-06, 'epoch': 0.36}
36%|███▌ | 12225/34278 [13:34:31<21:37:30, 3.53s/it] {'loss': 0.1179, 'grad_norm': 0.7775430973464645, 'learning_rate': 7.453625563666631e-06, 'epoch': 0.36}
36%|███▌ | 12226/34278 [13:34:35<22:44:18, 3.71s/it] {'loss': 0.136, 'grad_norm': 0.70813607212652, 'learning_rate': 7.4532139140092694e-06, 'epoch': 0.36}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
36%|███▌ | 12227/34278 [13:34:38<21:43:36, 3.55s/it] {'loss': 0.1359, 'grad_norm': 0.757494936279595, 'learning_rate': 7.452802242450201e-06, 'epoch': 0.36}
36%|███▌ | 12228/34278 [13:34:41<21:10:42, 3.46s/it] {'loss': 0.1293, 'grad_norm': 0.7858158925237995, 'learning_rate': 7.452390548993098e-06, 'epoch': 0.36}
36%|███▌ | 12229/34278 [13:34:44<20:20:12, 3.32s/it] {'loss': 0.1476, 'grad_norm': 0.7511279403357667, 'learning_rate': 7.451978833641639e-06, 'epoch': 0.36}
36%|███▌ | 12230/34278 [13:34:47<19:36:40, 3.20s/it] {'loss': 0.1462, 'grad_norm': 0.7953282518094266, 'learning_rate': 7.451567096399497e-06, 'epoch': 0.36}
36%|███▌ | 12231/34278 [13:34:51<20:25:55, 3.34s/it] {'loss': 0.1433, 'grad_norm': 0.7967242197118525, 'learning_rate': 7.45115533727035e-06, 'epoch': 0.36}
36%|███▌ | 12232/34278 [13:34:54<20:04:59, 3.28s/it] {'loss': 0.1494, 'grad_norm': 0.9466414202769508, 'learning_rate': 7.450743556257874e-06, 'epoch': 0.36}
36%|███▌ | 12233/34278 [13:34:58<20:38:04, 3.37s/it] {'loss': 0.1479, 'grad_norm': 0.7434133201655584, 'learning_rate': 7.450331753365743e-06, 'epoch': 0.36}
36%|███▌ | 12234/34278 [13:35:00<19:35:07, 3.20s/it] {'loss': 0.1489, 'grad_norm': 1.0130427033764855, 'learning_rate': 7.449919928597637e-06, 'epoch': 0.36}
36%|███▌ | 12235/34278 [13:35:03<18:51:36, 3.08s/it] {'loss': 0.1357, 'grad_norm': 1.0454899133306563, 'learning_rate': 7.449508081957228e-06, 'epoch': 0.36}
36%|███▌ | 12236/34278 [13:35:09<24:21:52, 3.98s/it] {'loss': 0.1651, 'grad_norm': 1.060813850156249, 'learning_rate': 7.449096213448198e-06, 'epoch': 0.36}
36%|███▌ | 12237/34278 [13:35:13<24:23:00, 3.98s/it] {'loss': 0.1484, 'grad_norm': 0.9166046971015583, 'learning_rate': 7.44868432307422e-06, 'epoch': 0.36}
36%|███▌ | 12238/34278 [13:35:16<22:27:10, 3.67s/it] {'loss': 0.1642, 'grad_norm': 1.0708409711345588, 'learning_rate': 7.448272410838975e-06, 'epoch': 0.36}
36%|███▌ | 12239/34278 [13:35:22<26:13:35, 4.28s/it] {'loss': 0.1529, 'grad_norm': 0.9934534541385831, 'learning_rate': 7.447860476746136e-06, 'epoch': 0.36}
36%|███▌ | 12240/34278 [13:35:26<26:19:31, 4.30s/it] {'loss': 0.1325, 'grad_norm': 0.8925425957204665, 'learning_rate': 7.447448520799384e-06, 'epoch': 0.36}
36%|███▌ | 12241/34278 [13:35:30<25:34:32, 4.18s/it] {'loss': 0.1586, 'grad_norm': 0.8122120542844333, 'learning_rate': 7.447036543002396e-06, 'epoch': 0.36}
36%|███▌ | 12242/34278 [13:35:34<24:08:31, 3.94s/it] {'loss': 0.1258, 'grad_norm': 0.7963038408521792, 'learning_rate': 7.4466245433588495e-06, 'epoch': 0.36}
36%|███▌ | 12243/34278 [13:35:37<22:24:19, 3.66s/it] {'loss': 0.1231, 'grad_norm': 0.8484007574660379, 'learning_rate': 7.4462125218724236e-06, 'epoch': 0.36}
36%|███▌ | 12244/34278 [13:35:40<21:27:55, 3.51s/it] {'loss': 0.1538, 'grad_norm': 0.71054399049409, 'learning_rate': 7.445800478546796e-06, 'epoch': 0.36}
36%|███▌ | 12245/34278 [13:35:44<23:35:50, 3.86s/it] {'loss': 0.1727, 'grad_norm': 0.8263265182463267, 'learning_rate': 7.445388413385646e-06, 'epoch': 0.36}
36%|███▌ | 12246/34278 [13:35:49<24:08:32, 3.94s/it] {'loss': 0.1351, 'grad_norm': 0.7054479997653809, 'learning_rate': 7.444976326392652e-06, 'epoch': 0.36}
36%|███▌ | 12247/34278 [13:35:53<24:44:42, 4.04s/it] {'loss': 0.1744, 'grad_norm': 0.8755809091412908, 'learning_rate': 7.444564217571491e-06, 'epoch': 0.36}
36%|███▌ | 12248/34278 [13:35:57<24:20:20, 3.98s/it] {'loss': 0.1346, 'grad_norm': 1.0175376639742526, 'learning_rate': 7.444152086925847e-06, 'epoch': 0.36}
36%|███▌ | 12249/34278 [13:35:59<21:58:35, 3.59s/it] {'loss': 0.1399, 'grad_norm': 0.7427987981336198, 'learning_rate': 7.443739934459397e-06, 'epoch': 0.36}
36%|███▌ | 12250/34278 [13:36:02<20:25:24, 3.34s/it] {'loss': 0.1365, 'grad_norm': 0.933542900671776, 'learning_rate': 7.443327760175817e-06, 'epoch': 0.36}
36%|███▌ | 12251/34278 [13:36:06<21:39:28, 3.54s/it] {'loss': 0.1341, 'grad_norm': 0.8583530360675189, 'learning_rate': 7.442915564078793e-06, 'epoch': 0.36}
36%|███▌ | 12252/34278 [13:36:10<21:26:20, 3.50s/it] {'loss': 0.1376, 'grad_norm': 0.7579398069238198, 'learning_rate': 7.442503346172001e-06, 'epoch': 0.36}
36%|███▌ | 12253/34278 [13:36:13<21:10:33, 3.46s/it] {'loss': 0.1316, 'grad_norm': 0.9293926173443993, 'learning_rate': 7.4420911064591215e-06, 'epoch': 0.36}
36%|███▌ | 12254/34278 [13:36:17<21:26:51, 3.51s/it] {'loss': 0.1417, 'grad_norm': 0.6904763213668176, 'learning_rate': 7.441678844943836e-06, 'epoch': 0.36}
36%|███▌ | 12255/34278 [13:36:21<23:13:29, 3.80s/it] {'loss': 0.1549, 'grad_norm': 0.8402918439439921, 'learning_rate': 7.441266561629825e-06, 'epoch': 0.36}
36%|███▌ | 12256/34278 [13:36:24<21:20:41, 3.49s/it] {'loss': 0.1352, 'grad_norm': 0.7303999063960172, 'learning_rate': 7.440854256520769e-06, 'epoch': 0.36}
36%|███▌ | 12257/34278 [13:36:27<20:00:06, 3.27s/it] {'loss': 0.1388, 'grad_norm': 0.9822577952961637, 'learning_rate': 7.440441929620348e-06, 'epoch': 0.36}
36%|███▌ | 12258/34278 [13:36:30<19:56:05, 3.26s/it] {'loss': 0.1602, 'grad_norm': 0.9098563102938438, 'learning_rate': 7.4400295809322445e-06, 'epoch': 0.36}
36%|███▌ | 12259/34278 [13:36:33<19:40:32, 3.22s/it] {'loss': 0.1449, 'grad_norm': 0.9816882972557667, 'learning_rate': 7.439617210460139e-06, 'epoch': 0.36}
36%|███▌ | 12260/34278 [13:36:36<18:58:27, 3.10s/it] {'loss': 0.1276, 'grad_norm': 0.9220034516470308, 'learning_rate': 7.439204818207715e-06, 'epoch': 0.36}
36%|███▌ | 12261/34278 [13:36:39<19:06:37, 3.12s/it] {'loss': 0.1563, 'grad_norm': 0.79823479876437, 'learning_rate': 7.438792404178652e-06, 'epoch': 0.36}
36%|███▌ | 12262/34278 [13:36:45<24:17:58, 3.97s/it] {'loss': 0.1494, 'grad_norm': 0.841057332417277, 'learning_rate': 7.4383799683766315e-06, 'epoch': 0.36}
36%|███▌ | 12263/34278 [13:36:48<22:28:33, 3.68s/it] {'loss': 0.1473, 'grad_norm': 0.8624899017320787, 'learning_rate': 7.437967510805336e-06, 'epoch': 0.36}
36%|███▌ | 12264/34278 [13:36:51<20:57:00, 3.43s/it] {'loss': 0.1142, 'grad_norm': 0.8338898053412516, 'learning_rate': 7.4375550314684505e-06, 'epoch': 0.36}
36%|███▌ | 12265/34278 [13:36:55<22:23:38, 3.66s/it] {'loss': 0.1393, 'grad_norm': 0.7141194529104241, 'learning_rate': 7.437142530369654e-06, 'epoch': 0.36}
36%|███▌ | 12266/34278 [13:36:58<20:43:46, 3.39s/it] {'loss': 0.1293, 'grad_norm': 0.9102631886380496, 'learning_rate': 7.436730007512633e-06, 'epoch': 0.36}
36%|███▌ | 12267/34278 [13:37:01<21:15:01, 3.48s/it] {'loss': 0.1278, 'grad_norm': 0.8405538358882885, 'learning_rate': 7.436317462901068e-06, 'epoch': 0.36}
36%|███▌ | 12268/34278 [13:37:05<21:45:39, 3.56s/it] {'loss': 0.1511, 'grad_norm': 0.9766458337122229, 'learning_rate': 7.43590489653864e-06, 'epoch': 0.36}
36%|███▌ | 12269/34278 [13:37:10<23:29:22, 3.84s/it] {'loss': 0.1767, 'grad_norm': 0.8098832418681778, 'learning_rate': 7.4354923084290364e-06, 'epoch': 0.36}
36%|███▌ | 12270/34278 [13:37:13<23:05:34, 3.78s/it] {'loss': 0.1393, 'grad_norm': 0.8814530567633131, 'learning_rate': 7.435079698575939e-06, 'epoch': 0.36}
36%|███▌ | 12271/34278 [13:37:16<21:54:25, 3.58s/it] {'loss': 0.1381, 'grad_norm': 0.864062979438122, 'learning_rate': 7.43466706698303e-06, 'epoch': 0.36}
36%|███▌ | 12272/34278 [13:37:20<21:31:53, 3.52s/it] {'loss': 0.1365, 'grad_norm': 0.801081478137863, 'learning_rate': 7.434254413653995e-06, 'epoch': 0.36}
36%|███▌ | 12273/34278 [13:37:23<20:47:54, 3.40s/it] {'loss': 0.144, 'grad_norm': 0.6406910898646578, 'learning_rate': 7.433841738592518e-06, 'epoch': 0.36}
36%|███▌ | 12274/34278 [13:37:28<23:48:12, 3.89s/it] {'loss': 0.1604, 'grad_norm': 0.7635709007643025, 'learning_rate': 7.433429041802282e-06, 'epoch': 0.36}
36%|███▌ | 12275/34278 [13:37:31<22:19:59, 3.65s/it] {'loss': 0.1479, 'grad_norm': 0.8966671507682924, 'learning_rate': 7.433016323286975e-06, 'epoch': 0.36}
36%|███▌ | 12276/34278 [13:37:35<23:16:58, 3.81s/it] {'loss': 0.1483, 'grad_norm': 0.7248138408881539, 'learning_rate': 7.432603583050277e-06, 'epoch': 0.36}
36%|███▌ | 12277/34278 [13:37:38<21:52:39, 3.58s/it] {'loss': 0.1753, 'grad_norm': 0.8051256495884842, 'learning_rate': 7.432190821095875e-06, 'epoch': 0.36}
36%|███▌ | 12278/34278 [13:37:41<21:15:34, 3.48s/it] {'loss': 0.1109, 'grad_norm': 0.7255186023301744, 'learning_rate': 7.431778037427455e-06, 'epoch': 0.36}
36%|███▌ | 12279/34278 [13:37:45<20:47:24, 3.40s/it] {'loss': 0.1254, 'grad_norm': 0.7527019200543248, 'learning_rate': 7.431365232048701e-06, 'epoch': 0.36}
36%|███▌ | 12280/34278 [13:37:48<19:44:36, 3.23s/it] {'loss': 0.1373, 'grad_norm': 0.7222836276234812, 'learning_rate': 7.430952404963298e-06, 'epoch': 0.36}
36%|███▌ | 12281/34278 [13:37:51<19:28:10, 3.19s/it] {'loss': 0.1524, 'grad_norm': 0.8216852059432174, 'learning_rate': 7.430539556174933e-06, 'epoch': 0.36}
36%|███▌ | 12282/34278 [13:37:55<22:33:15, 3.69s/it] {'loss': 0.156, 'grad_norm': 0.9893838644930824, 'learning_rate': 7.43012668568729e-06, 'epoch': 0.36}
36%|███▌ | 12283/34278 [13:37:59<21:24:53, 3.51s/it] {'loss': 0.1526, 'grad_norm': 0.969389833464752, 'learning_rate': 7.429713793504056e-06, 'epoch': 0.36}
36%|███▌ | 12284/34278 [13:38:02<20:42:44, 3.39s/it] {'loss': 0.147, 'grad_norm': 0.8041972180659, 'learning_rate': 7.429300879628918e-06, 'epoch': 0.36}
36%|███▌ | 12285/34278 [13:38:05<19:50:50, 3.25s/it] {'loss': 0.1642, 'grad_norm': 1.1458077909766509, 'learning_rate': 7.428887944065562e-06, 'epoch': 0.36}
36%|███▌ | 12286/34278 [13:38:11<25:01:51, 4.10s/it] {'loss': 0.1826, 'grad_norm': 1.0715062884145345, 'learning_rate': 7.428474986817673e-06, 'epoch': 0.36}
36%|███▌ | 12287/34278 [13:38:15<24:56:50, 4.08s/it] {'loss': 0.156, 'grad_norm': 0.7704009576124625, 'learning_rate': 7.42806200788894e-06, 'epoch': 0.36}
36%|███▌ | 12288/34278 [13:38:19<25:17:00, 4.14s/it] {'loss': 0.1522, 'grad_norm': 1.4323846304424777, 'learning_rate': 7.427649007283049e-06, 'epoch': 0.36}
36%|███▌ | 12289/34278 [13:38:23<25:54:50, 4.24s/it] {'loss': 0.1407, 'grad_norm': 0.8563152652701729, 'learning_rate': 7.4272359850036865e-06, 'epoch': 0.36}
36%|███▌ | 12290/34278 [13:38:30<29:48:33, 4.88s/it] {'loss': 0.1297, 'grad_norm': 0.7655571933224661, 'learning_rate': 7.426822941054541e-06, 'epoch': 0.36}
36%|███▌ | 12291/34278 [13:38:36<32:35:37, 5.34s/it] {'loss': 0.1415, 'grad_norm': 1.1216758641583837, 'learning_rate': 7.4264098754393e-06, 'epoch': 0.36}
36%|███▌ | 12292/34278 [13:38:40<28:58:44, 4.75s/it] {'loss': 0.1496, 'grad_norm': 0.9519263359126535, 'learning_rate': 7.42599678816165e-06, 'epoch': 0.36}
36%|███▌ | 12293/34278 [13:38:43<25:55:52, 4.25s/it] {'loss': 0.1518, 'grad_norm': 0.8922317014353787, 'learning_rate': 7.42558367922528e-06, 'epoch': 0.36}
36%|███▌ | 12294/34278 [13:38:47<25:33:33, 4.19s/it] {'loss': 0.1518, 'grad_norm': 0.8629977456094764, 'learning_rate': 7.42517054863388e-06, 'epoch': 0.36}
36%|███▌ | 12295/34278 [13:38:50<23:56:02, 3.92s/it] {'loss': 0.1398, 'grad_norm': 0.6720919543875122, 'learning_rate': 7.424757396391133e-06, 'epoch': 0.36}
36%|███▌ | 12296/34278 [13:38:53<22:31:03, 3.69s/it] {'loss': 0.1722, 'grad_norm': 0.8753834844981437, 'learning_rate': 7.424344222500734e-06, 'epoch': 0.36}
36%|███▌ | 12297/34278 [13:38:59<26:55:08, 4.41s/it] {'loss': 0.1486, 'grad_norm': 0.8353090656853341, 'learning_rate': 7.423931026966365e-06, 'epoch': 0.36}
36%|███▌ | 12298/34278 [13:39:02<24:19:45, 3.98s/it] {'loss': 0.172, 'grad_norm': 0.769762163570692, 'learning_rate': 7.4235178097917216e-06, 'epoch': 0.36}
36%|███▌ | 12299/34278 [13:39:05<22:56:22, 3.76s/it] {'loss': 0.1523, 'grad_norm': 0.7878399871303298, 'learning_rate': 7.4231045709804885e-06, 'epoch': 0.36}
36%|███▌ | 12300/34278 [13:39:09<21:38:32, 3.55s/it] {'loss': 0.1317, 'grad_norm': 0.8206114113125107, 'learning_rate': 7.422691310536355e-06, 'epoch': 0.36}
36%|███▌ | 12301/34278 [13:39:12<20:49:26, 3.41s/it] {'loss': 0.1348, 'grad_norm': 0.7403662408992112, 'learning_rate': 7.422278028463013e-06, 'epoch': 0.36}
36%|███▌ | 12302/34278 [13:39:15<20:44:36, 3.40s/it] {'loss': 0.1471, 'grad_norm': 0.8872450680061393, 'learning_rate': 7.421864724764152e-06, 'epoch': 0.36}
36%|███▌ | 12303/34278 [13:39:18<20:12:12, 3.31s/it] {'loss': 0.1527, 'grad_norm': 0.7791741098502287, 'learning_rate': 7.421451399443459e-06, 'epoch': 0.36}
36%|███▌ | 12304/34278 [13:39:23<22:30:23, 3.69s/it] {'loss': 0.1365, 'grad_norm': 0.685358201498907, 'learning_rate': 7.421038052504627e-06, 'epoch': 0.36}
36%|███▌ | 12305/34278 [13:39:26<22:34:41, 3.70s/it] {'loss': 0.1728, 'grad_norm': 0.8881283338308029, 'learning_rate': 7.4206246839513455e-06, 'epoch': 0.36}
36%|███▌ | 12306/34278 [13:39:29<20:39:47, 3.39s/it] {'loss': 0.152, 'grad_norm': 1.0459925805075425, 'learning_rate': 7.420211293787305e-06, 'epoch': 0.36}
36%|███▌ | 12307/34278 [13:39:32<19:56:54, 3.27s/it] {'loss': 0.1317, 'grad_norm': 0.8036929620852763, 'learning_rate': 7.419797882016193e-06, 'epoch': 0.36}
36%|███▌ | 12308/34278 [13:39:35<19:29:27, 3.19s/it] {'loss': 0.1336, 'grad_norm': 0.8080802933071737, 'learning_rate': 7.419384448641706e-06, 'epoch': 0.36}
36%|███▌ | 12309/34278 [13:39:38<19:24:35, 3.18s/it] {'loss': 0.1296, 'grad_norm': 0.8066517583138296, 'learning_rate': 7.418970993667531e-06, 'epoch': 0.36}
36%|███▌ | 12310/34278 [13:39:41<19:19:52, 3.17s/it] {'loss': 0.1548, 'grad_norm': 1.0865065477623663, 'learning_rate': 7.41855751709736e-06, 'epoch': 0.36}
36%|███▌ | 12311/34278 [13:39:47<24:27:41, 4.01s/it] {'loss': 0.1738, 'grad_norm': 0.8762441744657048, 'learning_rate': 7.418144018934888e-06, 'epoch': 0.36}
36%|███▌ | 12312/34278 [13:39:50<22:40:19, 3.72s/it] {'loss': 0.1347, 'grad_norm': 1.0911604048878483, 'learning_rate': 7.417730499183801e-06, 'epoch': 0.36}
36%|███▌ | 12313/34278 [13:39:56<27:03:15, 4.43s/it] {'loss': 0.1437, 'grad_norm': 0.8934094349034861, 'learning_rate': 7.417316957847793e-06, 'epoch': 0.36}
36%|███▌ | 12314/34278 [13:39:59<24:29:47, 4.02s/it] {'loss': 0.1341, 'grad_norm': 0.9355503288008398, 'learning_rate': 7.416903394930556e-06, 'epoch': 0.36}
36%|███▌ | 12315/34278 [13:40:03<23:33:50, 3.86s/it] {'loss': 0.1835, 'grad_norm': 0.9136214836862947, 'learning_rate': 7.416489810435783e-06, 'epoch': 0.36}
36%|███▌ | 12316/34278 [13:40:07<23:06:43, 3.79s/it] {'loss': 0.1408, 'grad_norm': 0.9682915154151832, 'learning_rate': 7.4160762043671664e-06, 'epoch': 0.36}
36%|███▌ | 12317/34278 [13:40:10<22:32:06, 3.69s/it] {'loss': 0.1561, 'grad_norm': 0.9504391712661863, 'learning_rate': 7.415662576728397e-06, 'epoch': 0.36}
36%|███▌ | 12318/34278 [13:40:14<22:37:51, 3.71s/it] {'loss': 0.122, 'grad_norm': 0.8876264488166852, 'learning_rate': 7.41524892752317e-06, 'epoch':
0.36} 36%|███▌ | 12318/34278 [13:40:14<22:37:51, 3.71s/it] 36%|███▌ | 12319/34278 [13:40:17<21:53:47, 3.59s/it] {'loss': 0.1666, 'grad_norm': 0.9609503698594272, 'learning_rate': 7.414835256755176e-06, 'epoch': 0.36} 36%|███▌ | 12319/34278 [13:40:17<21:53:47, 3.59s/it] 36%|███▌ | 12320/34278 [13:40:21<21:48:08, 3.57s/it] {'loss': 0.1255, 'grad_norm': 0.7358350385194634, 'learning_rate': 7.41442156442811e-06, 'epoch': 0.36} 36%|███▌ | 12320/34278 [13:40:21<21:48:08, 3.57s/it] 36%|███▌ | 12321/34278 [13:40:24<22:05:14, 3.62s/it] {'loss': 0.1373, 'grad_norm': 0.8460819344876249, 'learning_rate': 7.414007850545666e-06, 'epoch': 0.36} 36%|███▌ | 12321/34278 [13:40:24<22:05:14, 3.62s/it] 36%|███▌ | 12322/34278 [13:40:30<26:30:14, 4.35s/it] {'loss': 0.1689, 'grad_norm': 0.8859639325453393, 'learning_rate': 7.4135941151115335e-06, 'epoch': 0.36} 36%|███▌ | 12322/34278 [13:40:30<26:30:14, 4.35s/it] 36%|███▌ | 12323/34278 [13:40:33<23:50:45, 3.91s/it] {'loss': 0.1613, 'grad_norm': 0.9864525481893218, 'learning_rate': 7.41318035812941e-06, 'epoch': 0.36} 36%|███▌ | 12323/34278 [13:40:33<23:50:45, 3.91s/it] 36%|███▌ | 12324/34278 [13:40:40<28:23:28, 4.66s/it] {'loss': 0.1358, 'grad_norm': 0.8165972592542372, 'learning_rate': 7.4127665796029905e-06, 'epoch': 0.36} 36%|███▌ | 12324/34278 [13:40:40<28:23:28, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 36%|███▌ | 12325/34278 [13:40:46<30:51:56, 5.06s/it] {'loss': 0.1519, 'grad_norm': 0.8562266100046408, 'learning_rate': 7.412352779535963e-06, 'epoch': 0.36} 36%|███▌ | 12325/34278 [13:40:46<30:51:56, 5.06s/it] 36%|███▌ | 12326/34278 [13:40:50<29:47:48, 4.89s/it] {'loss': 0.1581, 'grad_norm': 0.912280309723881, 'learning_rate': 7.411938957932029e-06, 'epoch': 0.36} 36%|███▌ | 12326/34278 [13:40:50<29:47:48, 4.89s/it] 36%|███▌ | 12327/34278 [13:40:53<26:04:23, 4.28s/it] {'loss': 0.1407, 'grad_norm': 0.7717685849872788, 'learning_rate': 7.411525114794877e-06, 'epoch': 0.36} 36%|███▌ | 12327/34278 [13:40:53<26:04:23, 4.28s/it] 36%|███▌ | 12328/34278 [13:40:56<24:15:16, 3.98s/it] {'loss': 0.1538, 'grad_norm': 0.8565656156353816, 'learning_rate': 7.411111250128207e-06, 'epoch': 0.36} 36%|███▌ | 12328/34278 [13:40:56<24:15:16, 3.98s/it] 36%|███▌ | 12329/34278 [13:41:01<25:48:12, 4.23s/it] {'loss': 0.1503, 'grad_norm': 0.8023797838850455, 'learning_rate': 7.4106973639357104e-06, 'epoch': 0.36} 36%|███▌ | 12329/34278 [13:41:01<25:48:12, 4.23s/it] 36%|███▌ | 12330/34278 [13:41:04<24:03:57, 3.95s/it] {'loss': 0.1441, 'grad_norm': 0.8821948443676356, 'learning_rate': 7.4102834562210825e-06, 'epoch': 0.36} 36%|███▌ | 12330/34278 [13:41:04<24:03:57, 3.95s/it] 36%|███▌ | 12331/34278 [13:41:08<22:39:02, 3.72s/it] {'loss': 0.1563, 'grad_norm': 0.7279747327238699, 'learning_rate': 7.4098695269880205e-06, 'epoch': 0.36} 36%|███▌ | 12331/34278 [13:41:08<22:39:02, 3.72s/it] 36%|███▌ | 12332/34278 [13:41:12<24:26:34, 4.01s/it] {'loss': 0.1355, 'grad_norm': 0.9236086564391722, 'learning_rate': 7.4094555762402174e-06, 'epoch': 0.36} 36%|███▌ | 12332/34278 [13:41:12<24:26:34, 4.01s/it] 36%|███▌ | 12333/34278 [13:41:18<26:54:35, 4.41s/it] {'loss': 0.1609, 'grad_norm': 0.8103902030532772, 'learning_rate': 7.409041603981371e-06, 'epoch': 0.36} 36%|███▌ | 12333/34278 [13:41:18<26:54:35, 
4.41s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 36%|███▌ | 12334/34278 [13:41:21<24:50:09, 4.07s/it] {'loss': 0.1522, 'grad_norm': 0.8249193407165809, 'learning_rate': 7.408627610215176e-06, 'epoch': 0.36} 36%|███▌ | 12334/34278 [13:41:21<24:50:09, 4.07s/it] 36%|███▌ | 12335/34278 [13:41:24<22:25:30, 3.68s/it] {'loss': 0.1183, 'grad_norm': 0.6871405848971597, 'learning_rate': 7.408213594945328e-06, 'epoch': 0.36} 36%|███▌ | 12335/34278 [13:41:24<22:25:30, 3.68s/it] 36%|███▌ | 12336/34278 [13:41:27<21:13:24, 3.48s/it] {'loss': 0.1368, 'grad_norm': 0.7755076899987711, 'learning_rate': 7.4077995581755255e-06, 'epoch': 0.36} 36%|███▌ | 12336/34278 [13:41:27<21:13:24, 3.48s/it] 36%|███▌ | 12337/34278 [13:41:30<21:14:24, 3.49s/it] {'loss': 0.1535, 'grad_norm': 0.9100559507869117, 'learning_rate': 7.407385499909462e-06, 'epoch': 0.36} 36%|███▌ | 12337/34278 [13:41:30<21:14:24, 3.49s/it] 36%|███▌ | 12338/34278 [13:41:33<20:38:36, 3.39s/it] {'loss': 0.1419, 'grad_norm': 0.7937661174648724, 'learning_rate': 7.406971420150837e-06, 'epoch': 0.36} 36%|███▌ | 12338/34278 [13:41:33<20:38:36, 3.39s/it] 36%|███▌ | 12339/34278 [13:41:36<19:44:26, 3.24s/it] {'loss': 0.1518, 'grad_norm': 0.8959987231263276, 'learning_rate': 7.406557318903344e-06, 'epoch': 0.36} 36%|███▌ | 12339/34278 [13:41:36<19:44:26, 3.24s/it] 36%|███▌ | 12340/34278 [13:41:43<25:23:58, 4.17s/it] {'loss': 0.1796, 'grad_norm': 0.8906269978011393, 'learning_rate': 7.406143196170681e-06, 'epoch': 0.36} 36%|███▌ | 12340/34278 [13:41:43<25:23:58, 4.17s/it] 36%|███▌ | 12341/34278 [13:41:46<23:06:41, 3.79s/it] {'loss': 0.1536, 'grad_norm': 0.8072700045665477, 'learning_rate': 7.405729051956548e-06, 'epoch': 0.36} 36%|███▌ | 12341/34278 [13:41:46<23:06:41, 3.79s/it] 36%|███▌ | 12342/34278 [13:41:52<27:19:25, 4.48s/it] {'loss': 0.1429, 
'grad_norm': 1.0264394048547325, 'learning_rate': 7.405314886264639e-06, 'epoch': 0.36} 36%|███▌ | 12342/34278 [13:41:52<27:19:25, 4.48s/it] 36%|███▌ | 12343/34278 [13:41:54<24:08:59, 3.96s/it] {'loss': 0.15, 'grad_norm': 0.807448160087179, 'learning_rate': 7.404900699098654e-06, 'epoch': 0.36} 36%|███▌ | 12343/34278 [13:41:54<24:08:59, 3.96s/it] 36%|███▌ | 12344/34278 [13:41:58<22:58:25, 3.77s/it] {'loss': 0.1373, 'grad_norm': 0.6762839773443348, 'learning_rate': 7.404486490462289e-06, 'epoch': 0.36} 36%|███▌ | 12344/34278 [13:41:58<22:58:25, 3.77s/it] 36%|███▌ | 12345/34278 [13:42:01<22:00:15, 3.61s/it] {'loss': 0.1419, 'grad_norm': 1.0202249359976419, 'learning_rate': 7.404072260359243e-06, 'epoch': 0.36} 36%|███▌ | 12345/34278 [13:42:01<22:00:15, 3.61s/it] 36%|███▌ | 12346/34278 [13:42:04<21:21:26, 3.51s/it] {'loss': 0.1341, 'grad_norm': 0.8772031024908503, 'learning_rate': 7.403658008793213e-06, 'epoch': 0.36} 36%|███▌ | 12346/34278 [13:42:04<21:21:26, 3.51s/it] 36%|███▌ | 12347/34278 [13:42:08<21:01:51, 3.45s/it] {'loss': 0.1659, 'grad_norm': 0.8549827882854905, 'learning_rate': 7.4032437357678985e-06, 'epoch': 0.36} 36%|███▌ | 12347/34278 [13:42:08<21:01:51, 3.45s/it] 36%|███▌ | 12348/34278 [13:42:13<24:52:16, 4.08s/it] {'loss': 0.143, 'grad_norm': 1.0450883888744094, 'learning_rate': 7.4028294412869985e-06, 'epoch': 0.36} 36%|███▌ | 12348/34278 [13:42:13<24:52:16, 4.08s/it] 36%|███▌ | 12349/34278 [13:42:16<23:33:39, 3.87s/it] {'loss': 0.1542, 'grad_norm': 0.7713575426833317, 'learning_rate': 7.40241512535421e-06, 'epoch': 0.36} 36%|███▌ | 12349/34278 [13:42:17<23:33:39, 3.87s/it] 36%|███▌ | 12350/34278 [13:42:21<24:20:03, 4.00s/it] {'loss': 0.1447, 'grad_norm': 0.8330528390504065, 'learning_rate': 7.402000787973232e-06, 'epoch': 0.36} 36%|███▌ | 12350/34278 [13:42:21<24:20:03, 4.00s/it] 36%|███▌ | 12351/34278 [13:42:24<22:23:25, 3.68s/it] {'loss': 0.1709, 'grad_norm': 1.094625320039597, 'learning_rate': 7.401586429147767e-06, 'epoch': 0.36} 36%|███▌ | 
12351/34278 [13:42:24<22:23:25, 3.68s/it] 36%|███▌ | 12352/34278 [13:42:27<21:14:46, 3.49s/it] {'loss': 0.1559, 'grad_norm': 0.7813934111383724, 'learning_rate': 7.401172048881509e-06, 'epoch': 0.36} 36%|███▌ | 12352/34278 [13:42:27<21:14:46, 3.49s/it] 36%|███▌ | 12353/34278 [13:42:30<21:06:06, 3.46s/it] {'loss': 0.128, 'grad_norm': 0.7804873840379437, 'learning_rate': 7.400757647178162e-06, 'epoch': 0.36} 36%|███▌ | 12353/34278 [13:42:30<21:06:06, 3.46s/it] 36%|███▌ | 12354/34278 [13:42:36<25:46:39, 4.23s/it] {'loss': 0.1431, 'grad_norm': 0.8758749781555031, 'learning_rate': 7.400343224041422e-06, 'epoch': 0.36} 36%|███▌ | 12354/34278 [13:42:36<25:46:39, 4.23s/it] 36%|███▌ | 12355/34278 [13:42:39<23:35:48, 3.87s/it] {'loss': 0.1481, 'grad_norm': 0.8377924261046195, 'learning_rate': 7.399928779474991e-06, 'epoch': 0.36} 36%|███▌ | 12355/34278 [13:42:39<23:35:48, 3.87s/it] 36%|███▌ | 12356/34278 [13:42:42<21:47:45, 3.58s/it] {'loss': 0.1439, 'grad_norm': 0.7853079153083155, 'learning_rate': 7.39951431348257e-06, 'epoch': 0.36} 36%|███▌ | 12356/34278 [13:42:42<21:47:45, 3.58s/it] 36%|███▌ | 12357/34278 [13:42:47<23:44:55, 3.90s/it] {'loss': 0.1273, 'grad_norm': 0.9347527446114469, 'learning_rate': 7.399099826067857e-06, 'epoch': 0.36} 36%|███▌ | 12357/34278 [13:42:47<23:44:55, 3.90s/it] 36%|███▌ | 12358/34278 [13:42:50<22:53:42, 3.76s/it] {'loss': 0.1287, 'grad_norm': 0.9243839040542723, 'learning_rate': 7.398685317234554e-06, 'epoch': 0.36} 36%|███▌ | 12358/34278 [13:42:50<22:53:42, 3.76s/it] 36%|███▌ | 12359/34278 [13:42:53<21:58:06, 3.61s/it] {'loss': 0.1491, 'grad_norm': 0.836372029203595, 'learning_rate': 7.398270786986361e-06, 'epoch': 0.36} 36%|███▌ | 12359/34278 [13:42:53<21:58:06, 3.61s/it] 36%|███▌ | 12360/34278 [13:42:57<20:59:56, 3.45s/it] {'loss': 0.1451, 'grad_norm': 1.074532348994187, 'learning_rate': 7.397856235326979e-06, 'epoch': 0.36} 36%|███▌ | 12360/34278 [13:42:57<20:59:56, 3.45s/it] 36%|███▌ | 12361/34278 [13:43:01<22:42:10, 3.73s/it] {'loss': 
0.1262, 'grad_norm': 0.8130342485484173, 'learning_rate': 7.397441662260109e-06, 'epoch': 0.36} 36%|███▌ | 12361/34278 [13:43:01<22:42:10, 3.73s/it] 36%|███▌ | 12362/34278 [13:43:05<22:39:20, 3.72s/it] {'loss': 0.144, 'grad_norm': 0.9157609290069918, 'learning_rate': 7.3970270677894505e-06, 'epoch': 0.36} 36%|███▌ | 12362/34278 [13:43:05<22:39:20, 3.72s/it] 36%|███▌ | 12363/34278 [13:43:08<21:53:58, 3.60s/it] {'loss': 0.1699, 'grad_norm': 0.9774480662919182, 'learning_rate': 7.396612451918709e-06, 'epoch': 0.36} 36%|███▌ | 12363/34278 [13:43:08<21:53:58, 3.60s/it] 36%|███▌ | 12364/34278 [13:43:11<20:57:16, 3.44s/it] {'loss': 0.171, 'grad_norm': 0.7751576073896935, 'learning_rate': 7.396197814651582e-06, 'epoch': 0.36} 36%|███▌ | 12364/34278 [13:43:11<20:57:16, 3.44s/it] 36%|███▌ | 12365/34278 [13:43:17<25:40:19, 4.22s/it] {'loss': 0.1491, 'grad_norm': 1.026047430528191, 'learning_rate': 7.3957831559917735e-06, 'epoch': 0.36} 36%|███▌ | 12365/34278 [13:43:17<25:40:19, 4.22s/it] 36%|███▌ | 12366/34278 [13:43:20<23:58:06, 3.94s/it] {'loss': 0.1352, 'grad_norm': 1.1578198490582807, 'learning_rate': 7.395368475942985e-06, 'epoch': 0.36} 36%|███▌ | 12366/34278 [13:43:20<23:58:06, 3.94s/it] 36%|███▌ | 12367/34278 [13:43:24<24:19:13, 4.00s/it] {'loss': 0.1442, 'grad_norm': 0.7358419225212605, 'learning_rate': 7.394953774508918e-06, 'epoch': 0.36} 36%|███▌ | 12367/34278 [13:43:25<24:19:13, 4.00s/it] 36%|███▌ | 12368/34278 [13:43:28<23:59:46, 3.94s/it] {'loss': 0.1521, 'grad_norm': 0.7855099920570201, 'learning_rate': 7.3945390516932765e-06, 'epoch': 0.36} 36%|███▌ | 12368/34278 [13:43:28<23:59:46, 3.94s/it] 36%|███▌ | 12369/34278 [13:43:32<22:56:17, 3.77s/it] {'loss': 0.1627, 'grad_norm': 0.882718394909379, 'learning_rate': 7.394124307499762e-06, 'epoch': 0.36} 36%|███▌ | 12369/34278 [13:43:32<22:56:17, 3.77s/it] 36%|███▌ | 12370/34278 [13:43:36<24:33:17, 4.03s/it] {'loss': 0.1425, 'grad_norm': 1.1041984778019813, 'learning_rate': 7.393709541932076e-06, 'epoch': 0.36} 
36%|███▌ | 12370/34278 [13:43:36<24:33:17, 4.03s/it] 36%|███▌ | 12371/34278 [13:43:40<23:45:01, 3.90s/it] {'loss': 0.1487, 'grad_norm': 0.8117247656766673, 'learning_rate': 7.393294754993924e-06, 'epoch': 0.36} 36%|███▌ | 12371/34278 [13:43:40<23:45:01, 3.90s/it] 36%|███▌ | 12372/34278 [13:43:44<24:55:09, 4.10s/it] {'loss': 0.1738, 'grad_norm': 0.9860067015206712, 'learning_rate': 7.392879946689007e-06, 'epoch': 0.36} 36%|███▌ | 12372/34278 [13:43:44<24:55:09, 4.10s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 36%|███▌ | 12373/34278 [13:43:47<22:58:52, 3.78s/it] {'loss': 0.1522, 'grad_norm': 0.9697711162001428, 'learning_rate': 7.39246511702103e-06, 'epoch': 0.36} 36%|███▌ | 12373/34278 [13:43:47<22:58:52, 3.78s/it] 36%|███▌ | 12374/34278 [13:43:51<22:53:48, 3.76s/it] {'loss': 0.1507, 'grad_norm': 0.790772645726191, 'learning_rate': 7.3920502659936936e-06, 'epoch': 0.36} 36%|███▌ | 12374/34278 [13:43:51<22:53:48, 3.76s/it] 36%|███▌ | 12375/34278 [13:43:54<21:28:13, 3.53s/it] {'loss': 0.1264, 'grad_norm': 0.708130741606113, 'learning_rate': 7.3916353936107045e-06, 'epoch': 0.36} 36%|███▌ | 12375/34278 [13:43:54<21:28:13, 3.53s/it] 36%|███▌ | 12376/34278 [13:43:58<21:34:27, 3.55s/it] {'loss': 0.1417, 'grad_norm': 1.089621074722289, 'learning_rate': 7.3912204998757656e-06, 'epoch': 0.36} 36%|███▌ | 12376/34278 [13:43:58<21:34:27, 3.55s/it] 36%|███▌ | 12377/34278 [13:44:01<21:00:31, 3.45s/it] {'loss': 0.1423, 'grad_norm': 0.9019908227128198, 'learning_rate': 7.390805584792581e-06, 'epoch': 0.36} 36%|███▌ | 12377/34278 [13:44:01<21:00:31, 3.45s/it] 36%|███▌ | 12378/34278 [13:44:05<21:31:08, 3.54s/it] {'loss': 0.147, 'grad_norm': 0.8293264150623696, 'learning_rate': 7.390390648364855e-06, 'epoch': 0.36} 36%|███▌ | 12378/34278 [13:44:05<21:31:08, 3.54s/it] 36%|███▌ | 12379/34278 
[13:44:10<24:37:41, 4.05s/it] {'loss': 0.1489, 'grad_norm': 0.9178389304117276, 'learning_rate': 7.389975690596292e-06, 'epoch': 0.36} 36%|███▌ | 12379/34278 [13:44:10<24:37:41, 4.05s/it] 36%|███▌ | 12380/34278 [13:44:16<28:44:49, 4.73s/it] {'loss': 0.1414, 'grad_norm': 1.0360618918651492, 'learning_rate': 7.389560711490595e-06, 'epoch': 0.36} 36%|███▌ | 12380/34278 [13:44:16<28:44:49, 4.73s/it] 36%|███▌ | 12381/34278 [13:44:19<25:44:25, 4.23s/it] {'loss': 0.1349, 'grad_norm': 0.6703535165588835, 'learning_rate': 7.389145711051473e-06, 'epoch': 0.36} 36%|███▌ | 12381/34278 [13:44:19<25:44:25, 4.23s/it] 36%|███▌ | 12382/34278 [13:44:22<23:19:28, 3.83s/it] {'loss': 0.1432, 'grad_norm': 0.6804167249294869, 'learning_rate': 7.388730689282626e-06, 'epoch': 0.36} 36%|███▌ | 12382/34278 [13:44:22<23:19:28, 3.83s/it] 36%|███▌ | 12383/34278 [13:44:28<26:19:17, 4.33s/it] {'loss': 0.1632, 'grad_norm': 0.7354037221891911, 'learning_rate': 7.388315646187763e-06, 'epoch': 0.36} 36%|███▌ | 12383/34278 [13:44:28<26:19:17, 4.33s/it] 36%|███▌ | 12384/34278 [13:44:34<29:25:17, 4.84s/it] {'loss': 0.1374, 'grad_norm': 0.805908468689585, 'learning_rate': 7.3879005817705886e-06, 'epoch': 0.36} 36%|███▌ | 12384/34278 [13:44:34<29:25:17, 4.84s/it] 36%|███▌ | 12385/34278 [13:44:37<26:55:23, 4.43s/it] {'loss': 0.1663, 'grad_norm': 1.2220610790779731, 'learning_rate': 7.387485496034805e-06, 'epoch': 0.36} 36%|███▌ | 12385/34278 [13:44:37<26:55:23, 4.43s/it] 36%|███▌ | 12386/34278 [13:44:41<25:57:14, 4.27s/it] {'loss': 0.1361, 'grad_norm': 0.6820647865653887, 'learning_rate': 7.387070388984123e-06, 'epoch': 0.36} 36%|███▌ | 12386/34278 [13:44:41<25:57:14, 4.27s/it] 36%|███▌ | 12387/34278 [13:44:47<29:04:10, 4.78s/it] {'loss': 0.1438, 'grad_norm': 0.9643924075132558, 'learning_rate': 7.386655260622247e-06, 'epoch': 0.36} 36%|███▌ | 12387/34278 [13:44:47<29:04:10, 4.78s/it] 36%|███▌ | 12388/34278 [13:44:52<29:48:09, 4.90s/it] {'loss': 0.1533, 'grad_norm': 0.8253438292834724, 'learning_rate': 
7.386240110952881e-06, 'epoch': 0.36} 36%|███▌ | 12388/34278 [13:44:52<29:48:09, 4.90s/it] 36%|███▌ | 12389/34278 [13:44:56<27:33:21, 4.53s/it] {'loss': 0.1348, 'grad_norm': 0.931771747767439, 'learning_rate': 7.385824939979735e-06, 'epoch': 0.36} 36%|███▌ | 12389/34278 [13:44:56<27:33:21, 4.53s/it] 36%|███▌ | 12390/34278 [13:45:00<25:47:38, 4.24s/it] {'loss': 0.164, 'grad_norm': 0.915816847488074, 'learning_rate': 7.385409747706511e-06, 'epoch': 0.36} 36%|███▌ | 12390/34278 [13:45:00<25:47:38, 4.24s/it] 36%|███▌ | 12391/34278 [13:45:02<23:04:16, 3.79s/it] {'loss': 0.1439, 'grad_norm': 0.8121290138072851, 'learning_rate': 7.38499453413692e-06, 'epoch': 0.36} 36%|███▌ | 12391/34278 [13:45:02<23:04:16, 3.79s/it] 36%|███▌ | 12392/34278 [13:45:05<22:00:44, 3.62s/it] {'loss': 0.1581, 'grad_norm': 0.7327947356452752, 'learning_rate': 7.3845792992746665e-06, 'epoch': 0.36} 36%|███▌ | 12392/34278 [13:45:06<22:00:44, 3.62s/it] 36%|███▌ | 12393/34278 [13:45:09<21:24:50, 3.52s/it] {'loss': 0.1267, 'grad_norm': 0.8533791669557264, 'learning_rate': 7.384164043123458e-06, 'epoch': 0.36} 36%|███▌ | 12393/34278 [13:45:09<21:24:50, 3.52s/it] 36%|███▌ | 12394/34278 [13:45:15<26:07:03, 4.30s/it] {'loss': 0.1297, 'grad_norm': 0.8947048415332206, 'learning_rate': 7.383748765687002e-06, 'epoch': 0.36} 36%|███▌ | 12394/34278 [13:45:15<26:07:03, 4.30s/it] 36%|███▌ | 12395/34278 [13:45:18<23:31:30, 3.87s/it] {'loss': 0.1426, 'grad_norm': 0.9297131469072029, 'learning_rate': 7.383333466969007e-06, 'epoch': 0.36} 36%|███▌ | 12395/34278 [13:45:18<23:31:30, 3.87s/it] 36%|███▌ | 12396/34278 [13:45:21<22:23:09, 3.68s/it] {'loss': 0.1448, 'grad_norm': 0.9030085673039439, 'learning_rate': 7.38291814697318e-06, 'epoch': 0.36} 36%|███▌ | 12396/34278 [13:45:21<22:23:09, 3.68s/it] 36%|███▌ | 12397/34278 [13:45:24<21:19:23, 3.51s/it] {'loss': 0.1374, 'grad_norm': 0.7871505040381325, 'learning_rate': 7.382502805703227e-06, 'epoch': 0.36} 36%|███▌ | 12397/34278 [13:45:24<21:19:23, 3.51s/it] 36%|███▌ | 
12398/34278 [13:45:28<21:17:03, 3.50s/it] {'loss': 0.1723, 'grad_norm': 0.7660459682425128, 'learning_rate': 7.382087443162859e-06, 'epoch': 0.36} 36%|███▌ | 12398/34278 [13:45:28<21:17:03, 3.50s/it] 36%|███▌ | 12399/34278 [13:45:31<21:05:30, 3.47s/it] {'loss': 0.1542, 'grad_norm': 0.7032781138080652, 'learning_rate': 7.381672059355782e-06, 'epoch': 0.36} 36%|███▌ | 12399/34278 [13:45:31<21:05:30, 3.47s/it] 36%|███▌ | 12400/34278 [13:45:34<19:41:11, 3.24s/it] {'loss': 0.1524, 'grad_norm': 1.1822475436946003, 'learning_rate': 7.3812566542857055e-06, 'epoch': 0.36} 36%|███▌ | 12400/34278 [13:45:34<19:41:11, 3.24s/it] 36%|███▌ | 12401/34278 [13:45:37<19:55:36, 3.28s/it] {'loss': 0.1587, 'grad_norm': 0.8368114519356725, 'learning_rate': 7.3808412279563394e-06, 'epoch': 0.36} 36%|███▌ | 12401/34278 [13:45:37<19:55:36, 3.28s/it] 36%|███▌ | 12402/34278 [13:45:40<19:10:57, 3.16s/it] {'loss': 0.157, 'grad_norm': 0.8324356038995239, 'learning_rate': 7.38042578037139e-06, 'epoch': 0.36} 36%|███▌ | 12402/34278 [13:45:40<19:10:57, 3.16s/it] 36%|███▌ | 12403/34278 [13:45:43<18:57:10, 3.12s/it] {'loss': 0.1458, 'grad_norm': 0.6767675483795794, 'learning_rate': 7.380010311534568e-06, 'epoch': 0.36} 36%|███▌ | 12403/34278 [13:45:43<18:57:10, 3.12s/it] 36%|███▌ | 12404/34278 [13:45:47<20:09:04, 3.32s/it] {'loss': 0.1446, 'grad_norm': 0.6801981630620667, 'learning_rate': 7.3795948214495816e-06, 'epoch': 0.36} 36%|███▌ | 12404/34278 [13:45:47<20:09:04, 3.32s/it] 36%|███▌ | 12405/34278 [13:45:50<19:26:08, 3.20s/it] {'loss': 0.1338, 'grad_norm': 0.7067335317127671, 'learning_rate': 7.379179310120139e-06, 'epoch': 0.36} 36%|███▌ | 12405/34278 [13:45:50<19:26:08, 3.20s/it] 36%|███▌ | 12406/34278 [13:45:53<19:15:52, 3.17s/it] {'loss': 0.1409, 'grad_norm': 0.9515411579017032, 'learning_rate': 7.378763777549955e-06, 'epoch': 0.36} 36%|███▌ | 12406/34278 [13:45:53<19:15:52, 3.17s/it] 36%|███▌ | 12407/34278 [13:45:57<20:54:15, 3.44s/it] {'loss': 0.1328, 'grad_norm': 0.8145976168747101, 
'learning_rate': 7.378348223742734e-06, 'epoch': 0.36} 36%|███▌ | 12407/34278 [13:45:57<20:54:15, 3.44s/it] 36%|███▌ | 12408/34278 [13:46:00<20:58:23, 3.45s/it] {'loss': 0.1318, 'grad_norm': 0.7151310623421635, 'learning_rate': 7.377932648702189e-06, 'epoch': 0.36} 36%|███▌ | 12408/34278 [13:46:00<20:58:23, 3.45s/it] 36%|███▌ | 12409/34278 [13:46:04<22:17:01, 3.67s/it] {'loss': 0.1712, 'grad_norm': 0.8839560193938235, 'learning_rate': 7.377517052432027e-06, 'epoch': 0.36} 36%|███▌ | 12409/34278 [13:46:05<22:17:01, 3.67s/it] 36%|███▌ | 12410/34278 [13:46:08<21:10:43, 3.49s/it] {'loss': 0.1531, 'grad_norm': 0.8707843326698027, 'learning_rate': 7.377101434935961e-06, 'epoch': 0.36} 36%|███▌ | 12410/34278 [13:46:08<21:10:43, 3.49s/it] 36%|███▌ | 12411/34278 [13:46:11<20:50:19, 3.43s/it] {'loss': 0.1403, 'grad_norm': 0.8353989689392917, 'learning_rate': 7.376685796217702e-06, 'epoch': 0.36} 36%|███▌ | 12411/34278 [13:46:11<20:50:19, 3.43s/it] 36%|███▌ | 12412/34278 [13:46:14<20:32:03, 3.38s/it] {'loss': 0.1662, 'grad_norm': 0.7556286038576497, 'learning_rate': 7.376270136280958e-06, 'epoch': 0.36} 36%|███▌ | 12412/34278 [13:46:14<20:32:03, 3.38s/it] 36%|███▌ | 12413/34278 [13:46:17<20:02:49, 3.30s/it] {'loss': 0.1254, 'grad_norm': 1.0941445251976925, 'learning_rate': 7.375854455129443e-06, 'epoch': 0.36} 36%|███▌ | 12413/34278 [13:46:17<20:02:49, 3.30s/it] 36%|███▌ | 12414/34278 [13:46:20<19:23:27, 3.19s/it] {'loss': 0.1476, 'grad_norm': 0.8740104712910368, 'learning_rate': 7.375438752766864e-06, 'epoch': 0.36} 36%|███▌ | 12414/34278 [13:46:20<19:23:27, 3.19s/it] 36%|███▌ | 12415/34278 [13:46:25<22:41:38, 3.74s/it] {'loss': 0.1488, 'grad_norm': 0.7790841822211408, 'learning_rate': 7.375023029196937e-06, 'epoch': 0.36} 36%|███▌ | 12415/34278 [13:46:25<22:41:38, 3.74s/it] 36%|███▌ | 12416/34278 [13:46:30<25:14:51, 4.16s/it] {'loss': 0.149, 'grad_norm': 0.8175314120822311, 'learning_rate': 7.374607284423373e-06, 'epoch': 0.36} 36%|███▌ | 12416/34278 [13:46:30<25:14:51, 
4.16s/it] 36%|███▌ | 12417/34278 [13:46:33<22:32:39, 3.71s/it] {'loss': 0.1689, 'grad_norm': 1.1122828174219652, 'learning_rate': 7.374191518449878e-06, 'epoch': 0.36} 36%|███▌ | 12417/34278 [13:46:33<22:32:39, 3.71s/it] 36%|███▌ | 12418/34278 [13:46:39<26:07:53, 4.30s/it] {'loss': 0.1331, 'grad_norm': 0.9378285633316117, 'learning_rate': 7.373775731280172e-06, 'epoch': 0.36} 36%|███▌ | 12418/34278 [13:46:39<26:07:53, 4.30s/it] 36%|███▌ | 12419/34278 [13:46:44<28:09:33, 4.64s/it] {'loss': 0.1541, 'grad_norm': 1.301318761980158, 'learning_rate': 7.37335992291796e-06, 'epoch': 0.36} 36%|███▌ | 12419/34278 [13:46:44<28:09:33, 4.64s/it] 36%|███▌ | 12420/34278 [13:46:47<24:54:44, 4.10s/it] {'loss': 0.1633, 'grad_norm': 0.8942643632562767, 'learning_rate': 7.3729440933669575e-06, 'epoch': 0.36} 36%|███▌ | 12420/34278 [13:46:47<24:54:44, 4.10s/it] 36%|███▌ | 12421/34278 [13:46:53<27:47:21, 4.58s/it] {'loss': 0.1183, 'grad_norm': 0.8358110494033888, 'learning_rate': 7.372528242630878e-06, 'epoch': 0.36} 36%|███▌ | 12421/34278 [13:46:53<27:47:21, 4.58s/it] 36%|███▌ | 12422/34278 [13:46:56<24:49:10, 4.09s/it] {'loss': 0.1443, 'grad_norm': 1.0317164305132445, 'learning_rate': 7.372112370713431e-06, 'epoch': 0.36} 36%|███▌ | 12422/34278 [13:46:56<24:49:10, 4.09s/it] 36%|███▌ | 12423/34278 [13:46:59<23:14:19, 3.83s/it] {'loss': 0.1569, 'grad_norm': 0.8983204452817084, 'learning_rate': 7.371696477618333e-06, 'epoch': 0.36} 36%|███▌ | 12423/34278 [13:46:59<23:14:19, 3.83s/it] 36%|███▌ | 12424/34278 [13:47:02<21:44:45, 3.58s/it] {'loss': 0.1423, 'grad_norm': 0.8926004064186805, 'learning_rate': 7.3712805633492935e-06, 'epoch': 0.36} 36%|███▌ | 12424/34278 [13:47:02<21:44:45, 3.58s/it] 36%|███▌ | 12425/34278 [13:47:08<25:58:11, 4.28s/it] {'loss': 0.1456, 'grad_norm': 0.8670794270314263, 'learning_rate': 7.370864627910027e-06, 'epoch': 0.36} 36%|███▌ | 12425/34278 [13:47:08<25:58:11, 4.28s/it] 36%|███▋ | 12426/34278 [13:47:12<25:12:19, 4.15s/it] {'loss': 0.1407, 'grad_norm': 
0.8985125181143718, 'learning_rate': 7.370448671304248e-06, 'epoch': 0.36} 36%|███▋ | 12426/34278 [13:47:12<25:12:19, 4.15s/it] 36%|███▋ | 12427/34278 [13:47:15<23:12:01, 3.82s/it] {'loss': 0.1499, 'grad_norm': 0.7063648041598328, 'learning_rate': 7.370032693535669e-06, 'epoch': 0.36} 36%|███▋ | 12427/34278 [13:47:15<23:12:01, 3.82s/it] 36%|███▋ | 12428/34278 [13:47:20<26:35:56, 4.38s/it] {'loss': 0.148, 'grad_norm': 0.8830294779583203, 'learning_rate': 7.369616694608004e-06, 'epoch': 0.36} 36%|███▋ | 12428/34278 [13:47:20<26:35:56, 4.38s/it] 36%|███▋ | 12429/34278 [13:47:24<25:11:19, 4.15s/it] {'loss': 0.1704, 'grad_norm': 0.8577546795229164, 'learning_rate': 7.369200674524966e-06, 'epoch': 0.36} 36%|███▋ | 12429/34278 [13:47:24<25:11:19, 4.15s/it] 36%|███▋ | 12430/34278 [13:47:27<23:26:20, 3.86s/it] {'loss': 0.1719, 'grad_norm': 0.7825779464405499, 'learning_rate': 7.36878463329027e-06, 'epoch': 0.36} 36%|███▋ | 12430/34278 [13:47:27<23:26:20, 3.86s/it] 36%|███▋ | 12431/34278 [13:47:31<22:53:33, 3.77s/it] {'loss': 0.136, 'grad_norm': 0.738730076741277, 'learning_rate': 7.368368570907633e-06, 'epoch': 0.36} 36%|███▋ | 12431/34278 [13:47:31<22:53:33, 3.77s/it] 36%|███▋ | 12432/34278 [13:47:36<24:53:53, 4.10s/it] {'loss': 0.1406, 'grad_norm': 0.8400014819160159, 'learning_rate': 7.367952487380763e-06, 'epoch': 0.36} 36%|███▋ | 12432/34278 [13:47:36<24:53:53, 4.10s/it] 36%|███▋ | 12433/34278 [13:47:41<28:15:00, 4.66s/it] {'loss': 0.1586, 'grad_norm': 0.7990516261637564, 'learning_rate': 7.367536382713381e-06, 'epoch': 0.36} 36%|███▋ | 12433/34278 [13:47:42<28:15:00, 4.66s/it] 36%|███▋ | 12434/34278 [13:47:45<26:04:10, 4.30s/it] {'loss': 0.1413, 'grad_norm': 0.8288024477863183, 'learning_rate': 7.367120256909198e-06, 'epoch': 0.36} 36%|███▋ | 12434/34278 [13:47:45<26:04:10, 4.30s/it] 36%|███▋ | 12435/34278 [13:47:49<25:21:38, 4.18s/it] {'loss': 0.1455, 'grad_norm': 1.0154243653961876, 'learning_rate': 7.366704109971929e-06, 'epoch': 0.36} 36%|███▋ | 12435/34278 
[13:47:49<25:21:38, 4.18s/it]
36%|███▋ | 12436/34278 [13:47:53<25:45:55, 4.25s/it] {'loss': 0.1483, 'grad_norm': 0.8138546523351705, 'learning_rate': 7.366287941905295e-06, 'epoch': 0.36}
36%|███▋ | 12437/34278 [13:47:56<23:16:00, 3.84s/it] {'loss': 0.1437, 'grad_norm': 0.7551388897055035, 'learning_rate': 7.365871752713003e-06, 'epoch': 0.36}
36%|███▋ | 12438/34278 [13:47:59<21:44:23, 3.58s/it] {'loss': 0.1355, 'grad_norm': 0.8987613971071026, 'learning_rate': 7.365455542398775e-06, 'epoch': 0.36}
36%|███▋ | 12439/34278 [13:48:05<25:15:35, 4.16s/it] {'loss': 0.1478, 'grad_norm': 0.8256795075390594, 'learning_rate': 7.365039310966324e-06, 'epoch': 0.36}
36%|███▋ | 12440/34278 [13:48:09<25:09:26, 4.15s/it] {'loss': 0.1734, 'grad_norm': 0.8169888938021671, 'learning_rate': 7.364623058419367e-06, 'epoch': 0.36}
36%|███▋ | 12441/34278 [13:48:12<23:34:31, 3.89s/it] {'loss': 0.1717, 'grad_norm': 0.925222291897439, 'learning_rate': 7.364206784761618e-06, 'epoch': 0.36}
36%|███▋ | 12442/34278 [13:48:15<21:35:51, 3.56s/it] {'loss': 0.1408, 'grad_norm': 0.9039461568838147, 'learning_rate': 7.363790489996797e-06, 'epoch': 0.36}
36%|███▋ | 12443/34278 [13:48:18<20:47:44, 3.43s/it] {'loss': 0.1447, 'grad_norm': 0.8004050674599028, 'learning_rate': 7.363374174128619e-06, 'epoch': 0.36}
36%|███▋ | 12444/34278 [13:48:21<20:17:21, 3.35s/it] {'loss': 0.1367, 'grad_norm': 0.8243223765501508, 'learning_rate': 7.362957837160799e-06, 'epoch': 0.36}
36%|███▋ | 12445/34278 [13:48:25<21:23:48, 3.53s/it] {'loss': 0.1543, 'grad_norm': 0.7991490427144883, 'learning_rate': 7.362541479097056e-06, 'epoch': 0.36}
36%|███▋ | 12446/34278 [13:48:28<19:54:15, 3.28s/it] {'loss': 0.1516, 'grad_norm': 1.0016952136525967, 'learning_rate': 7.3621250999411085e-06, 'epoch': 0.36}
36%|███▋ | 12447/34278 [13:48:31<20:28:29, 3.38s/it] {'loss': 0.1734, 'grad_norm': 1.0991795309553871, 'learning_rate': 7.36170869969667e-06, 'epoch': 0.36}
36%|███▋ | 12448/34278 [13:48:36<23:14:52, 3.83s/it] {'loss': 0.1499, 'grad_norm': 0.8366821099669521, 'learning_rate': 7.361292278367461e-06, 'epoch': 0.36}
36%|███▋ | 12449/34278 [13:48:40<23:30:26, 3.88s/it] {'loss': 0.1349, 'grad_norm': 0.7070275659050894, 'learning_rate': 7.360875835957198e-06, 'epoch': 0.36}
36%|███▋ | 12450/34278 [13:48:43<22:17:20, 3.68s/it] {'loss': 0.1388, 'grad_norm': 0.7723239819916525, 'learning_rate': 7.360459372469598e-06, 'epoch': 0.36}
36%|███▋ | 12451/34278 [13:48:47<22:09:57, 3.66s/it] {'loss': 0.155, 'grad_norm': 0.6756298929571691, 'learning_rate': 7.360042887908382e-06, 'epoch': 0.36}
36%|███▋ | 12452/34278 [13:48:51<21:57:05, 3.62s/it] {'loss': 0.1327, 'grad_norm': 0.8827125519419131, 'learning_rate': 7.359626382277265e-06, 'epoch': 0.36}
36%|███▋ | 12453/34278 [13:48:54<21:36:37, 3.56s/it] {'loss': 0.1076, 'grad_norm': 0.8041012971349712, 'learning_rate': 7.359209855579968e-06, 'epoch': 0.36}
36%|███▋ | 12454/34278 [13:48:57<20:57:28, 3.46s/it] {'loss': 0.1375, 'grad_norm': 0.697372531692577, 'learning_rate': 7.358793307820209e-06, 'epoch': 0.36}
36%|███▋ | 12455/34278 [13:49:00<20:16:55, 3.35s/it] {'loss': 0.1726, 'grad_norm': 1.3237811105848196, 'learning_rate': 7.358376739001704e-06, 'epoch': 0.36}
36%|███▋ | 12456/34278 [13:49:05<21:50:21, 3.60s/it] {'loss': 0.1337, 'grad_norm': 0.9758560351141198, 'learning_rate': 7.357960149128177e-06, 'epoch': 0.36}
36%|███▋ | 12457/34278 [13:49:09<22:31:19, 3.72s/it] {'loss': 0.1452, 'grad_norm': 0.7721145331937025, 'learning_rate': 7.357543538203344e-06, 'epoch': 0.36}
36%|███▋ | 12458/34278 [13:49:12<21:25:09, 3.53s/it] {'loss': 0.1342, 'grad_norm': 0.8972062164794726, 'learning_rate': 7.357126906230926e-06, 'epoch': 0.36}
36%|███▋ | 12459/34278 [13:49:16<22:18:44, 3.68s/it] {'loss': 0.1753, 'grad_norm': 0.9051163989848191, 'learning_rate': 7.35671025321464e-06, 'epoch': 0.36}
36%|███▋ | 12460/34278 [13:49:20<23:37:23, 3.90s/it] {'loss': 0.1451, 'grad_norm': 1.1198352697714742, 'learning_rate': 7.356293579158207e-06, 'epoch': 0.36}
36%|███▋ | 12461/34278 [13:49:23<21:35:50, 3.56s/it] {'loss': 0.1333, 'grad_norm': 1.0176748330465637, 'learning_rate': 7.355876884065349e-06, 'epoch': 0.36}
36%|███▋ | 12462/34278 [13:49:26<20:41:11, 3.41s/it] {'loss': 0.1503, 'grad_norm': 0.9384466077659679, 'learning_rate': 7.355460167939783e-06, 'epoch': 0.36}
36%|███▋ | 12463/34278 [13:49:29<20:01:07, 3.30s/it] {'loss': 0.1331, 'grad_norm': 0.8246058816548979, 'learning_rate': 7.3550434307852335e-06, 'epoch': 0.36}
36%|███▋ | 12464/34278 [13:49:32<20:14:35, 3.34s/it] {'loss': 0.1419, 'grad_norm': 1.0233503954893863, 'learning_rate': 7.354626672605416e-06, 'epoch': 0.36}
36%|███▋ | 12465/34278 [13:49:35<19:43:23, 3.26s/it] {'loss': 0.1296, 'grad_norm': 0.8133834003284872, 'learning_rate': 7.354209893404054e-06, 'epoch': 0.36}
36%|███▋ | 12466/34278 [13:49:39<20:56:41, 3.46s/it] {'loss': 0.1569, 'grad_norm': 1.2913316892089426, 'learning_rate': 7.353793093184869e-06, 'epoch': 0.36}
36%|███▋ | 12467/34278 [13:49:42<19:39:12, 3.24s/it] {'loss': 0.1505, 'grad_norm': 0.8334656644246954, 'learning_rate': 7.353376271951581e-06, 'epoch': 0.36}
36%|███▋ | 12468/34278 [13:49:46<21:27:01, 3.54s/it] {'loss': 0.1605, 'grad_norm': 0.8517273287822625, 'learning_rate': 7.352959429707911e-06, 'epoch': 0.36}
36%|███▋ | 12469/34278 [13:49:51<23:06:31, 3.81s/it] {'loss': 0.1591, 'grad_norm': 0.8995970232561108, 'learning_rate': 7.3525425664575815e-06, 'epoch': 0.36}
36%|███▋ | 12470/34278 [13:49:56<26:08:44, 4.32s/it] {'loss': 0.1557, 'grad_norm': 0.7371374702805029, 'learning_rate': 7.352125682204313e-06, 'epoch': 0.36}
36%|███▋ | 12471/34278 [13:50:01<27:33:09, 4.55s/it] {'loss': 0.1453, 'grad_norm': 0.9365223917183233, 'learning_rate': 7.351708776951828e-06, 'epoch': 0.36}
36%|███▋ | 12472/34278 [13:50:08<30:28:10, 5.03s/it] {'loss': 0.1408, 'grad_norm': 0.8667120873011884, 'learning_rate': 7.351291850703848e-06, 'epoch': 0.36}
36%|███▋ | 12473/34278 [13:50:14<32:59:19, 5.45s/it] {'loss': 0.1465, 'grad_norm': 0.7788669940383607, 'learning_rate': 7.350874903464097e-06, 'epoch': 0.36}
36%|███▋ | 12474/34278 [13:50:17<28:04:03, 4.63s/it] {'loss': 0.1449, 'grad_norm': 0.9206504844211706, 'learning_rate': 7.350457935236295e-06, 'epoch': 0.36}
36%|███▋ | 12475/34278 [13:50:19<24:42:04, 4.08s/it] {'loss': 0.156, 'grad_norm': 1.1593282146380695, 'learning_rate': 7.350040946024165e-06, 'epoch': 0.36}
36%|███▋ | 12476/34278 [13:50:24<25:17:25, 4.18s/it] {'loss': 0.136, 'grad_norm': 0.9126505802412705, 'learning_rate': 7.349623935831432e-06, 'epoch': 0.36}
36%|███▋ | 12477/34278 [13:50:29<27:35:40, 4.56s/it] {'loss': 0.1421, 'grad_norm': 0.7666088844096738, 'learning_rate': 7.349206904661816e-06, 'epoch': 0.36}
36%|███▋ | 12478/34278 [13:50:33<25:53:37, 4.28s/it] {'loss': 0.1389, 'grad_norm': 0.7484162238205191, 'learning_rate': 7.348789852519043e-06, 'epoch': 0.36}
36%|███▋ | 12479/34278 [13:50:37<24:49:55, 4.10s/it] {'loss': 0.1389, 'grad_norm': 0.8097442914824786, 'learning_rate': 7.348372779406834e-06, 'epoch': 0.36}
36%|███▋ | 12480/34278 [13:50:42<26:46:22, 4.42s/it] {'loss': 0.1251, 'grad_norm': 0.7987503546657555, 'learning_rate': 7.347955685328912e-06, 'epoch': 0.36}
36%|███▋ | 12481/34278 [13:50:45<24:54:55, 4.12s/it] {'loss': 0.1302, 'grad_norm': 0.9288490047922632, 'learning_rate': 7.347538570289005e-06, 'epoch': 0.36}
36%|███▋ | 12482/34278 [13:50:50<25:27:05, 4.20s/it] {'loss': 0.1229, 'grad_norm': 0.6750726528592491, 'learning_rate': 7.347121434290834e-06, 'epoch': 0.36}
36%|███▋ | 12483/34278 [13:50:53<23:37:47, 3.90s/it] {'loss': 0.1453, 'grad_norm': 0.7746199922297062, 'learning_rate': 7.346704277338122e-06, 'epoch': 0.36}
36%|███▋ | 12484/34278 [13:50:56<21:30:34, 3.55s/it] {'loss': 0.1409, 'grad_norm': 0.873953568339805, 'learning_rate': 7.346287099434593e-06, 'epoch': 0.36}
36%|███▋ | 12485/34278 [13:50:58<20:24:53, 3.37s/it] {'loss': 0.1519, 'grad_norm': 0.7120896229151646, 'learning_rate': 7.345869900583975e-06, 'epoch': 0.36}
36%|███▋ | 12486/34278 [13:51:03<22:20:38, 3.69s/it] {'loss': 0.1612, 'grad_norm': 0.8220019306038706, 'learning_rate': 7.345452680789989e-06, 'epoch': 0.36}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
36%|███▋ | 12487/34278 [13:51:06<21:09:53, 3.50s/it] {'loss': 0.1742, 'grad_norm': 0.8239607206109042, 'learning_rate': 7.345035440056363e-06, 'epoch': 0.36}
36%|███▋ | 12488/34278 [13:51:10<21:25:51, 3.54s/it] {'loss': 0.1548, 'grad_norm': 0.9195671088239025, 'learning_rate': 7.34461817838682e-06, 'epoch': 0.36}
36%|███▋ | 12489/34278 [13:51:13<22:03:37, 3.64s/it] {'loss': 0.1483, 'grad_norm': 0.820900638310747, 'learning_rate': 7.344200895785083e-06, 'epoch': 0.36}
36%|███▋ | 12490/34278 [13:51:17<21:25:33, 3.54s/it] {'loss': 0.1403, 'grad_norm': 0.9859117442782211, 'learning_rate': 7.343783592254883e-06, 'epoch': 0.36}
36%|███▋ | 12491/34278 [13:51:23<26:22:59, 4.36s/it] {'loss': 0.1631, 'grad_norm': 0.9502163373819177, 'learning_rate': 7.3433662677999426e-06, 'epoch': 0.36}
36%|███▋ | 12492/34278 [13:51:27<25:00:38, 4.13s/it] {'loss': 0.1289, 'grad_norm': 0.86215746008853, 'learning_rate': 7.342948922423985e-06, 'epoch': 0.36}
36%|███▋ | 12493/34278 [13:51:29<22:34:01, 3.73s/it] {'loss': 0.1184, 'grad_norm': 1.028385262362884, 'learning_rate': 7.342531556130742e-06, 'epoch': 0.36}
36%|███▋ | 12494/34278 [13:51:35<25:46:25, 4.26s/it] {'loss': 0.1511, 'grad_norm': 0.8530250849243711, 'learning_rate': 7.342114168923935e-06, 'epoch': 0.36}
36%|███▋ | 12495/34278 [13:51:41<28:54:39, 4.78s/it] {'loss': 0.1756, 'grad_norm': 0.9723990594033373, 'learning_rate': 7.341696760807291e-06, 'epoch': 0.36}
36%|███▋ | 12496/34278 [13:51:45<27:01:53, 4.47s/it] {'loss': 0.1366, 'grad_norm': 0.9925847724373023, 'learning_rate': 7.341279331784539e-06, 'epoch': 0.36}
36%|███▋ | 12497/34278 [13:51:48<24:31:20, 4.05s/it] {'loss': 0.1519, 'grad_norm': 1.0303219629002769, 'learning_rate': 7.340861881859403e-06, 'epoch': 0.36}
36%|███▋ | 12498/34278 [13:51:51<23:14:40, 3.84s/it] {'loss': 0.1422, 'grad_norm': 0.9082810844900719, 'learning_rate': 7.34044441103561e-06, 'epoch': 0.36}
36%|███▋ | 12499/34278 [13:51:54<21:37:58, 3.58s/it] {'loss': 0.1354, 'grad_norm': 0.927028182869787, 'learning_rate': 7.340026919316889e-06, 'epoch': 0.36}
36%|███▋ | 12500/34278 [13:51:57<20:48:12, 3.44s/it] {'loss': 0.1121, 'grad_norm': 0.7169585676932424, 'learning_rate': 7.339609406706966e-06, 'epoch': 0.36}
36%|███▋ | 12501/34278 [13:52:03<24:16:41, 4.01s/it] {'loss': 0.1406, 'grad_norm': 0.7099762171401988, 'learning_rate': 7.339191873209569e-06, 'epoch': 0.36}
36%|███▋ | 12502/34278 [13:52:05<22:10:54, 3.67s/it] {'loss': 0.165, 'grad_norm': 0.8390462407599057, 'learning_rate': 7.3387743188284255e-06, 'epoch': 0.36}
36%|███▋ | 12503/34278 [13:52:08<20:41:03, 3.42s/it] {'loss': 0.1242, 'grad_norm': 0.7523429675407453, 'learning_rate': 7.338356743567264e-06, 'epoch': 0.36}
36%|███▋ | 12504/34278 [13:52:12<21:02:59, 3.48s/it] {'loss': 0.1673, 'grad_norm': 0.9066002377405343, 'learning_rate': 7.3379391474298085e-06, 'epoch': 0.36}
36%|███▋ | 12505/34278 [13:52:15<19:43:56, 3.26s/it] {'loss': 0.1544, 'grad_norm': 1.3086197754302376, 'learning_rate': 7.337521530419793e-06, 'epoch': 0.36}
36%|███▋ | 12506/34278 [13:52:18<19:59:11, 3.30s/it] {'loss': 0.1546, 'grad_norm': 1.0820332839361442, 'learning_rate': 7.337103892540945e-06, 'epoch': 0.36}
36%|███▋ | 12507/34278 [13:52:21<20:06:34, 3.33s/it] {'loss': 0.1554, 'grad_norm': 0.7992437926444342, 'learning_rate': 7.336686233796988e-06, 'epoch': 0.36}
36%|███▋ | 12508/34278 [13:52:25<19:53:26, 3.29s/it] {'loss': 0.118, 'grad_norm': 0.9363009022825539, 'learning_rate': 7.336268554191657e-06, 'epoch': 0.36}
36%|███▋ | 12509/34278 [13:52:28<19:26:28, 3.22s/it] {'loss': 0.1483, 'grad_norm': 1.084394193415658, 'learning_rate': 7.335850853728675e-06, 'epoch': 0.36}
36%|███▋ | 12510/34278 [13:52:31<19:30:19, 3.23s/it] {'loss': 0.1494, 'grad_norm': 0.9791888893527011, 'learning_rate': 7.335433132411775e-06, 'epoch': 0.36}
36%|███▋ | 12511/34278 [13:52:35<21:13:29, 3.51s/it] {'loss': 0.1357, 'grad_norm': 0.7291919732518494, 'learning_rate': 7.335015390244688e-06, 'epoch': 0.36}
37%|███▋ | 12512/34278 [13:52:38<20:38:37, 3.41s/it] {'loss': 0.1765, 'grad_norm': 0.8973636101942085, 'learning_rate': 7.334597627231138e-06, 'epoch': 0.37}
37%|███▋ | 12513/34278 [13:52:43<22:45:05, 3.76s/it] {'loss': 0.1397, 'grad_norm': 0.8235104568007249, 'learning_rate': 7.334179843374859e-06, 'epoch': 0.37}
37%|███▋ | 12514/34278 [13:52:48<24:34:51, 4.07s/it] {'loss': 0.1459, 'grad_norm': 0.8596961124766973, 'learning_rate': 7.333762038679579e-06, 'epoch': 0.37}
37%|███▋ | 12515/34278 [13:52:54<28:57:56, 4.79s/it] {'loss': 0.1492, 'grad_norm': 0.8554809964876028, 'learning_rate': 7.3333442131490294e-06, 'epoch': 0.37}
37%|███▋ | 12516/34278 [13:52:57<25:43:10, 4.25s/it] {'loss': 0.1575, 'grad_norm': 0.8191381647651743, 'learning_rate': 7.332926366786939e-06, 'epoch': 0.37}
37%|███▋ | 12517/34278 [13:53:03<28:41:18, 4.75s/it] {'loss': 0.1343, 'grad_norm': 0.9718655156205231, 'learning_rate': 7.33250849959704e-06, 'epoch': 0.37}
37%|███▋ | 12518/34278 [13:53:07<27:33:34, 4.56s/it] {'loss': 0.1414, 'grad_norm': 0.8546476213944995, 'learning_rate': 7.3320906115830615e-06, 'epoch': 0.37}
37%|███▋ | 12519/34278 [13:53:10<24:29:18, 4.05s/it] {'loss': 0.1709, 'grad_norm': 1.244414597345418, 'learning_rate': 7.331672702748733e-06, 'epoch': 0.37}
37%|███▋ | 12520/34278 [13:53:13<22:32:38, 3.73s/it] {'loss': 0.1346, 'grad_norm': 0.6982264029886699, 'learning_rate': 7.331254773097789e-06, 'epoch': 0.37}
37%|███▋ | 12521/34278 [13:53:17<22:42:29, 3.76s/it] {'loss': 0.1182, 'grad_norm': 0.9942275033912196, 'learning_rate': 7.33083682263396e-06, 'epoch': 0.37}
37%|███▋ | 12522/34278 [13:53:20<22:23:18, 3.70s/it] {'loss': 0.1383, 'grad_norm': 0.8936472742380306, 'learning_rate': 7.330418851360974e-06, 'epoch': 0.37}
37%|███▋ | 12523/34278 [13:53:24<22:24:19, 3.71s/it] {'loss': 0.1462, 'grad_norm': 0.7166159321473673, 'learning_rate': 7.330000859282567e-06, 'epoch': 0.37}
37%|███▋ | 12524/34278 [13:53:28<22:32:46, 3.73s/it] {'loss': 0.1607, 'grad_norm': 0.9633409343417314, 'learning_rate': 7.329582846402467e-06, 'epoch': 0.37}
37%|███▋ | 12525/34278 [13:53:31<21:34:43, 3.57s/it] {'loss': 0.1804, 'grad_norm': 1.1610880501000653, 'learning_rate': 7.329164812724405e-06, 'epoch': 0.37}
37%|███▋ | 12526/34278 [13:53:34<20:37:31, 3.41s/it] {'loss': 0.1408, 'grad_norm': 0.5569456887478516, 'learning_rate': 7.32874675825212e-06, 'epoch': 0.37}
37%|███▋ | 12527/34278 [13:53:38<21:01:59, 3.48s/it] {'loss': 0.1547, 'grad_norm': 1.125308199847849, 'learning_rate': 7.328328682989338e-06, 'epoch': 0.37}
37%|███▋ | 12528/34278 [13:53:42<22:52:38, 3.79s/it] {'loss': 0.1535, 'grad_norm': 1.0911837240978715, 'learning_rate': 7.327910586939794e-06, 'epoch': 0.37}
37%|███▋ | 12529/34278 [13:53:45<21:35:39, 3.57s/it] {'loss': 0.1287, 'grad_norm': 0.6655995335581377, 'learning_rate': 7.327492470107218e-06, 'epoch': 0.37}
37%|███▋ | 12530/34278 [13:53:49<22:00:21, 3.64s/it] {'loss': 0.138, 'grad_norm': 0.8633033130423697, 'learning_rate': 7.327074332495348e-06, 'epoch': 0.37}
37%|███▋ | 12531/34278 [13:53:52<20:43:36, 3.43s/it] {'loss': 0.1588, 'grad_norm': 1.029517089838608, 'learning_rate': 7.326656174107911e-06, 'epoch': 0.37}
37%|███▋ | 12532/34278 [13:53:55<20:09:11, 3.34s/it] {'loss': 0.1316, 'grad_norm': 0.8222599758116758, 'learning_rate': 7.326237994948644e-06, 'epoch': 0.37}
37%|███▋ | 12533/34278 [13:54:01<24:54:28, 4.12s/it] {'loss': 0.1559, 'grad_norm': 0.815122704375594, 'learning_rate': 7.325819795021281e-06, 'epoch': 0.37}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
37%|███▋ | 12534/34278 [13:54:04<23:30:46, 3.89s/it] {'loss': 0.1494, 'grad_norm': 0.9597640113146232, 'learning_rate': 7.325401574329551e-06, 'epoch': 0.37}
37%|███▋ | 12535/34278 [13:54:09<25:19:31, 4.19s/it] {'loss': 0.151, 'grad_norm': 0.8239063456490182, 'learning_rate': 7.3249833328771935e-06, 'epoch': 0.37}
37%|███▋ | 12536/34278 [13:54:16<29:19:44, 4.86s/it] {'loss': 0.1573, 'grad_norm': 0.9384049183735227, 'learning_rate': 7.3245650706679395e-06, 'epoch': 0.37}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
37%|███▋ | 12537/34278 [13:54:20<27:22:13, 4.53s/it] {'loss': 0.1367, 'grad_norm': 0.836803788745504, 'learning_rate': 7.324146787705522e-06, 'epoch': 0.37}
37%|███▋ | 12538/34278 [13:54:23<24:44:31, 4.10s/it] {'loss': 0.1714, 'grad_norm': 0.9357429364670283, 'learning_rate': 7.323728483993678e-06, 'epoch': 0.37}
37%|███▋ | 12539/34278 [13:54:26<24:00:10, 3.97s/it] {'loss': 0.1551, 'grad_norm': 0.7367031054890589, 'learning_rate': 7.323310159536141e-06, 'epoch': 0.37}
37%|███▋ | 12540/34278 [13:54:30<22:44:06, 3.77s/it] {'loss': 0.1333, 'grad_norm': 0.8075364455756667, 'learning_rate': 7.322891814336645e-06, 'epoch': 0.37}
37%|███▋ | 12541/34278 [13:54:33<22:24:04, 3.71s/it] {'loss': 0.1683, 'grad_norm': 0.833073740749177, 'learning_rate': 7.3224734483989254e-06, 'epoch': 0.37}
37%|███▋ | 12542/34278 [13:54:36<21:34:04, 3.57s/it]
{'loss': 0.1647, 'grad_norm': 0.871719146026141, 'learning_rate': 7.322055061726717e-06, 'epoch': 0.37}
37%|███▋ | 12543/34278 [13:54:40<21:09:31, 3.50s/it] {'loss': 0.1398, 'grad_norm': 0.9603750499039191, 'learning_rate': 7.321636654323756e-06, 'epoch': 0.37}
37%|███▋ | 12544/34278 [13:54:46<26:00:12, 4.31s/it] {'loss': 0.1707, 'grad_norm': 0.8743735121129615, 'learning_rate': 7.321218226193777e-06, 'epoch': 0.37}
37%|███▋ | 12545/34278 [13:54:49<24:18:00, 4.03s/it] {'loss': 0.1939, 'grad_norm': 0.9361800895713882, 'learning_rate': 7.320799777340516e-06, 'epoch': 0.37}
37%|███▋ | 12546/34278 [13:54:55<26:37:48, 4.41s/it] {'loss': 0.1726, 'grad_norm': 1.4397172012087283, 'learning_rate': 7.320381307767708e-06, 'epoch': 0.37}
37%|███▋ | 12547/34278 [13:54:58<25:35:47, 4.24s/it] {'loss': 0.134, 'grad_norm': 0.7590116649866849, 'learning_rate': 7.319962817479089e-06, 'epoch': 0.37}
37%|███▋ | 12548/34278 [13:55:02<24:02:19, 3.98s/it] {'loss': 0.139, 'grad_norm': 0.9971170486586559, 'learning_rate': 7.319544306478398e-06, 'epoch': 0.37}
37%|███▋ | 12549/34278 [13:55:06<23:55:39, 3.96s/it] {'loss': 0.1427, 'grad_norm': 1.0121016816886785, 'learning_rate': 7.3191257747693664e-06, 'epoch': 0.37}
37%|███▋ | 12550/34278 [13:55:09<21:43:43, 3.60s/it] {'loss': 0.1373, 'grad_norm': 1.0037128635150039, 'learning_rate': 7.318707222355735e-06, 'epoch': 0.37}
37%|███▋ | 12551/34278 [13:55:12<20:54:19, 3.46s/it] {'loss': 0.1292, 'grad_norm': 1.1971441068620639, 'learning_rate': 7.318288649241241e-06, 'epoch': 0.37}
37%|███▋ | 12552/34278 [13:55:15<19:46:46, 3.28s/it] {'loss': 0.1645, 'grad_norm': 0.8520980034299164, 'learning_rate': 7.317870055429615e-06, 'epoch': 0.37}
37%|███▋ | 12553/34278 [13:55:18<20:29:26, 3.40s/it] {'loss': 0.1567, 'grad_norm': 0.8578629480944141, 'learning_rate': 7.317451440924602e-06, 'epoch': 0.37}
37%|███▋ | 12554/34278 [13:55:22<21:54:30, 3.63s/it] {'loss': 0.1311, 'grad_norm': 0.8119415694644823, 'learning_rate': 7.317032805729935e-06, 'epoch': 0.37}
37%|███▋ | 12555/34278 [13:55:26<21:05:18, 3.49s/it] {'loss': 0.1482, 'grad_norm': 1.067426569165156, 'learning_rate': 7.31661414984935e-06, 'epoch': 0.37}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8dcef2ff60>
Failed to fetch sample 3686874. Exception: cannot identify image file <_io.BytesIO object at 0x7f8dcef2ff60>
37%|███▋ | 12556/34278 [13:55:29<20:05:08, 3.33s/it] {'loss': 0.1483, 'grad_norm': 0.7879180641463571, 'learning_rate': 7.3161954732865906e-06, 'epoch': 0.37}
37%|███▋ | 12557/34278 [13:55:34<24:03:11, 3.99s/it] {'loss': 0.1277, 'grad_norm': 0.8850431311253087, 'learning_rate': 7.315776776045388e-06, 'epoch': 0.37}
37%|███▋ | 12558/34278 [13:55:37<23:01:25, 3.82s/it] {'loss': 0.1464, 'grad_norm': 0.7589116034713146, 'learning_rate': 7.315358058129485e-06, 'epoch': 0.37}
37%|███▋ | 12559/34278 [13:55:41<22:29:30, 3.73s/it] {'loss': 0.1578, 'grad_norm': 0.8775888047374377, 'learning_rate': 7.314939319542617e-06, 'epoch': 0.37}
37%|███▋ | 12560/34278 [13:55:44<21:14:17, 3.52s/it] {'loss': 0.1452, 'grad_norm': 0.9138593029972256, 'learning_rate': 7.314520560288522e-06, 'epoch': 0.37}
37%|███▋ | 12561/34278 [13:55:48<21:18:44, 3.53s/it] {'loss': 0.1611, 'grad_norm': 0.9309638777077405, 'learning_rate': 7.314101780370942e-06, 'epoch': 0.37}
37%|███▋ | 12562/34278 [13:55:50<20:05:25, 3.33s/it] {'loss': 0.1687, 'grad_norm': 0.8200125293243199, 'learning_rate': 7.313682979793614e-06, 'epoch': 0.37}
37%|███▋ | 12563/34278 [13:55:56<24:33:00, 4.07s/it] {'loss': 0.1602, 'grad_norm': 0.7819398768318264, 'learning_rate': 7.313264158560276e-06, 'epoch': 0.37}
37%|███▋ | 12564/34278 [13:55:59<22:55:07, 3.80s/it] {'loss': 0.1535, 'grad_norm': 0.960687783706917, 'learning_rate': 7.312845316674667e-06, 'epoch': 0.37}
37%|███▋ | 12565/34278 [13:56:03<21:40:35, 3.59s/it] {'loss': 0.1455, 'grad_norm': 0.806436860397053, 'learning_rate': 7.312426454140528e-06, 'epoch': 0.37}
37%|███▋ | 12566/34278 [13:56:05<20:32:12, 3.41s/it] {'loss': 0.1467, 'grad_norm': 0.8037049708673916, 'learning_rate': 7.312007570961598e-06, 'epoch': 0.37}
37%|███▋ | 12567/34278 [13:56:10<22:39:30, 3.76s/it] {'loss': 0.1775, 'grad_norm': 0.8475448014865734, 'learning_rate': 7.311588667141615e-06, 'epoch': 0.37}
37%|███▋ | 12568/34278 [13:56:16<26:45:42, 4.44s/it] {'loss': 0.1507, 'grad_norm': 0.8330063194912772, 'learning_rate': 7.311169742684321e-06, 'epoch': 0.37}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
37%|███▋ | 12569/34278 [13:56:19<23:53:58, 3.96s/it] {'loss': 0.1654, 'grad_norm': 0.9139978268192165, 'learning_rate': 7.3107507975934555e-06, 'epoch': 0.37}
37%|███▋ | 12570/34278 [13:56:23<23:24:13, 3.88s/it] {'loss': 0.123, 'grad_norm': 0.7029375873164558, 'learning_rate': 7.3103318318727566e-06, 'epoch': 0.37}
37%|███▋ | 12571/34278 [13:56:26<22:09:20, 3.67s/it] {'loss': 0.1627, 'grad_norm': 0.7456527961582567, 'learning_rate': 7.30991284552597e-06, 'epoch': 0.37}
37%|███▋ | 12572/34278 [13:56:29<21:17:16, 3.53s/it] {'loss': 0.1414, 'grad_norm': 0.8667322478230863, 'learning_rate': 7.309493838556832e-06, 'epoch': 0.37}
37%|███▋ | 12573/34278 [13:56:33<22:12:54, 3.68s/it] {'loss': 0.1484, 'grad_norm': 0.9750652424929281, 'learning_rate': 7.309074810969083e-06, 'epoch': 0.37}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
37%|███▋ | 12574/34278 [13:56:39<25:30:06, 4.23s/it] {'loss': 0.1632, 'grad_norm': 0.751521085709621, 'learning_rate': 7.308655762766466e-06, 'epoch': 0.37}
37%|███▋ | 12575/34278 [13:56:42<23:55:53, 3.97s/it] {'loss': 0.114, 'grad_norm': 0.8418233374470161, 'learning_rate': 7.30823669395272e-06, 'epoch': 0.37}
37%|███▋ | 12576/34278 [13:56:45<22:26:01, 3.72s/it] {'loss': 0.1416, 'grad_norm': 0.8811982961030276, 'learning_rate': 7.30781760453159e-06, 'epoch': 0.37}
37%|███▋ | 12577/34278 [13:56:48<20:30:49, 3.40s/it] {'loss': 0.1682, 'grad_norm': 1.04891068844708, 'learning_rate': 7.307398494506814e-06, 'epoch': 0.37}
37%|███▋ | 12578/34278 [13:56:51<19:50:11, 3.29s/it] {'loss': 0.1396, 'grad_norm': 1.0826757853124749, 'learning_rate': 7.306979363882136e-06, 'epoch': 0.37}
37%|███▋ | 12579/34278 [13:56:54<20:22:43, 3.38s/it] {'loss': 0.1379, 'grad_norm': 0.7940050144481171, 'learning_rate': 7.306560212661295e-06, 'epoch': 0.37}
37%|███▋ | 12580/34278 [13:56:57<19:55:43, 3.31s/it] {'loss': 0.1508, 'grad_norm': 0.6952751578376332, 'learning_rate': 7.306141040848037e-06, 'epoch': 0.37}
37%|███▋ | 12581/34278 [13:57:01<19:33:40, 3.25s/it] {'loss': 0.1702, 'grad_norm': 0.9643883866133751, 'learning_rate': 7.305721848446103e-06, 'epoch': 0.37}
37%|███▋ | 12582/34278 [13:57:07<24:26:35, 4.06s/it] {'loss': 0.1437, 'grad_norm': 0.7867314299628712, 'learning_rate': 7.305302635459233e-06, 'epoch': 0.37}
37%|███▋ | 12583/34278 [13:57:11<26:02:00, 4.32s/it] {'loss': 0.1812, 'grad_norm': 1.0084647801481508, 'learning_rate': 7.304883401891173e-06, 'epoch': 0.37}
37%|███▋ | 12584/34278 [13:57:17<27:26:49, 4.55s/it] {'loss': 0.1561, 'grad_norm': 0.8342827322264076, 'learning_rate': 7.304464147745662e-06, 'epoch': 0.37}
37%|███▋ | 12585/34278 [13:57:22<28:43:23, 4.77s/it] {'loss': 0.1453, 'grad_norm': 0.8581378104788102, 'learning_rate': 7.3040448730264455e-06, 'epoch': 0.37}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
37%|███▋ | 12586/34278 [13:57:25<25:58:28, 4.31s/it] {'loss': 0.1307, 'grad_norm': 0.9456648386883989, 'learning_rate': 7.303625577737269e-06, 'epoch': 0.37}
37%|███▋ | 12587/34278 [13:57:29<26:10:39, 4.34s/it] {'loss': 0.1276, 'grad_norm': 0.84759138025627, 'learning_rate': 7.303206261881871e-06, 'epoch': 0.37}
37%|███▋ | 12588/34278 [13:57:32<23:27:53, 3.89s/it] {'loss': 0.1237, 'grad_norm': 0.6764654554263654, 'learning_rate': 7.302786925463998e-06, 'epoch': 0.37}
37%|███▋ | 12589/34278 [13:57:36<22:26:22, 3.72s/it] {'loss': 0.1582, 'grad_norm': 0.8086057539763226, 'learning_rate': 7.302367568487393e-06, 'epoch': 0.37}
37%|███▋ | 12590/34278 [13:57:42<26:55:18, 4.47s/it] {'loss': 0.1749, 'grad_norm': 1.0471987049130558, 'learning_rate': 7.3019481909558e-06, 'epoch': 0.37}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
37%|███▋ | 12591/34278 [13:57:45<24:59:10, 4.15s/it] {'loss': 0.1754, 'grad_norm': 0.7825322907667858, 'learning_rate': 7.301528792872963e-06, 'epoch': 0.37}
37%|███▋ | 12592/34278 [13:57:48<22:50:53, 3.79s/it] {'loss': 0.1592, 'grad_norm': 0.808594072427891, 'learning_rate': 7.301109374242626e-06, 'epoch': 0.37}
37%|███▋ | 12593/34278 [13:57:52<23:07:37, 3.84s/it] {'loss': 0.1381, 'grad_norm': 0.7378181528008292, 'learning_rate': 7.300689935068534e-06, 'epoch': 0.37}
37%|███▋ | 12594/34278 [13:57:55<21:47:51, 3.62s/it] {'loss': 0.1713, 'grad_norm': 1.1631232788413999, 'learning_rate': 7.3002704753544316e-06, 'epoch': 0.37}
37%|███▋ | 12595/34278 [13:57:58<20:31:12, 3.41s/it] {'loss': 0.1346, 'grad_norm': 0.7272131576528931, 'learning_rate': 7.299850995104063e-06, 'epoch': 0.37}
37%|███▋ | 12596/34278 [13:58:01<20:19:30, 3.37s/it] {'loss': 0.1414, 'grad_norm': 0.7224715044228271, 'learning_rate': 7.2994314943211755e-06, 'epoch': 0.37}
37%|███▋ | 12597/34278 [13:58:06<22:52:52, 3.80s/it] {'loss': 0.145, 'grad_norm': 0.84799081098376, 'learning_rate': 7.299011973009511e-06, 'epoch': 0.37}
37%|███▋ | 12598/34278 [13:58:10<21:54:01, 3.64s/it] {'loss': 0.1574, 'grad_norm': 1.0170357671998496, 'learning_rate': 7.298592431172818e-06, 'epoch': 0.37}
37%|███▋ | 12599/34278 [13:58:13<21:20:52, 3.55s/it] {'loss': 0.1352, 'grad_norm': 0.8404918700923496, 'learning_rate': 7.2981728688148365e-06, 'epoch': 0.37}
37%|███▋ | 12600/34278 [13:58:16<20:54:09, 3.47s/it] {'loss': 0.1271, 'grad_norm': 0.8144727261788373, 'learning_rate': 7.297753285939319e-06, 'epoch': 0.37}
37%|███▋ | 12601/34278 [13:58:20<20:42:34, 3.44s/it] {'loss': 0.1326, 'grad_norm': 1.0073862448209447, 'learning_rate': 7.297333682550009e-06, 'epoch': 0.37}
37%|███▋ | 12602/34278 [13:58:22<19:35:15, 3.25s/it] {'loss': 0.1415, 'grad_norm': 1.1080962498161437, 'learning_rate': 7.296914058650653e-06, 'epoch': 0.37}
37%|███▋ | 12603/34278 [13:58:26<20:30:52, 3.41s/it] {'loss': 0.1668, 'grad_norm': 0.7716627843074678, 'learning_rate': 7.296494414244996e-06, 'epoch': 0.37}
37%|███▋ | 12604/34278 [13:58:31<22:33:20, 3.75s/it] {'loss': 0.1642, 'grad_norm': 0.9608901366792665, 'learning_rate': 7.296074749336785e-06, 'epoch': 0.37}
37%|███▋ | 12605/34278 [13:58:34<21:40:57, 3.60s/it] {'loss': 0.1473, 'grad_norm': 0.7199984408702625, 'learning_rate': 7.295655063929765e-06, 'epoch': 0.37}
37%|███▋ | 12606/34278 [13:58:37<21:12:30, 3.52s/it] {'loss': 0.1569, 'grad_norm': 0.8288821214606242, 'learning_rate': 7.295235358027686e-06, 'epoch': 0.37}
37%|███▋ | 12607/34278 [13:58:42<22:54:28, 3.81s/it] {'loss': 0.1401, 'grad_norm': 0.8112232057577572, 'learning_rate': 7.294815631634294e-06, 'epoch': 0.37}
37%|███▋ | 12608/34278 [13:58:46<23:41:27, 3.94s/it] {'loss': 0.1356, 'grad_norm': 0.654042417879684, 'learning_rate': 7.294395884753336e-06, 'epoch': 0.37}
37%|███▋ | 12609/34278 [13:58:49<22:25:07, 3.72s/it] {'loss': 0.1169, 'grad_norm': 0.6115312187425573, 'learning_rate': 7.293976117388558e-06, 'epoch': 0.37}
37%|███▋ | 12610/34278 [13:58:52<21:06:27, 3.51s/it] {'loss': 0.1531, 'grad_norm': 0.8874601262272724, 'learning_rate': 7.29355632954371e-06, 'epoch': 0.37}
37%|███▋ | 12611/34278 [13:58:56<21:52:41, 3.64s/it] {'loss': 0.1137, 'grad_norm': 0.7446788851902356, 'learning_rate': 7.293136521222538e-06, 'epoch': 0.37}
37%|███▋ | 12612/34278 [13:58:59<20:20:39, 3.38s/it] {'loss': 0.1368, 'grad_norm': 0.7548866430434173, 'learning_rate': 7.292716692428791e-06, 'epoch': 0.37}
37%|███▋ | 12613/34278 [13:59:02<20:25:36, 3.39s/it] {'loss': 0.1469, 'grad_norm': 0.7365736692314704, 'learning_rate': 7.292296843166217e-06, 'epoch': 0.37}
37%|███▋ | 12614/34278 [13:59:06<21:18:47, 3.54s/it] {'loss': 0.1535, 'grad_norm': 0.8264816518176756, 'learning_rate': 7.291876973438562e-06, 'epoch': 0.37}
37%|███▋ | 12615/34278 [13:59:12<25:47:08, 4.29s/it] {'loss': 0.1689, 'grad_norm': 0.7813558982894001, 'learning_rate': 7.291457083249578e-06, 'epoch': 0.37}
37%|███▋ | 12616/34278 [13:59:16<23:54:37, 3.97s/it] {'loss': 0.1541, 'grad_norm': 0.8491973925937982, 'learning_rate': 7.291037172603013e-06, 'epoch': 0.37}
37%|███▋ | 12617/34278 [13:59:19<22:48:44, 3.79s/it] {'loss': 0.1657, 'grad_norm': 0.7364897979429192, 'learning_rate': 7.2906172415026136e-06, 'epoch': 0.37}
37%|███▋ | 12618/34278 [13:59:23<22:33:23, 3.75s/it] {'loss': 0.1297, 'grad_norm': 0.7998676568471853, 'learning_rate': 7.290197289952131e-06, 'epoch': 0.37}
37%|███▋
| 12619/34278 [13:59:26<22:12:25, 3.69s/it] {'loss': 0.1625, 'grad_norm': 0.828428368684619, 'learning_rate': 7.289777317955313e-06, 'epoch': 0.37} 37%|███▋ | 12619/34278 [13:59:26<22:12:25, 3.69s/it] 37%|███▋ | 12620/34278 [13:59:30<21:53:42, 3.64s/it] {'loss': 0.1485, 'grad_norm': 0.855784368364791, 'learning_rate': 7.289357325515911e-06, 'epoch': 0.37} 37%|███▋ | 12620/34278 [13:59:30<21:53:42, 3.64s/it] 37%|███▋ | 12621/34278 [13:59:33<21:49:47, 3.63s/it] {'loss': 0.1478, 'grad_norm': 0.8547418600021226, 'learning_rate': 7.288937312637673e-06, 'epoch': 0.37} 37%|███▋ | 12621/34278 [13:59:33<21:49:47, 3.63s/it] 37%|███▋ | 12622/34278 [13:59:39<25:58:10, 4.32s/it] {'loss': 0.1301, 'grad_norm': 0.964073973781457, 'learning_rate': 7.288517279324349e-06, 'epoch': 0.37} 37%|███▋ | 12622/34278 [13:59:39<25:58:10, 4.32s/it] 37%|███▋ | 12623/34278 [13:59:44<26:52:59, 4.47s/it] {'loss': 0.1701, 'grad_norm': 0.8274583125272957, 'learning_rate': 7.2880972255796875e-06, 'epoch': 0.37} 37%|███▋ | 12623/34278 [13:59:44<26:52:59, 4.47s/it] 37%|███▋ | 12624/34278 [13:59:49<28:35:22, 4.75s/it] {'loss': 0.1423, 'grad_norm': 0.982394475670459, 'learning_rate': 7.287677151407442e-06, 'epoch': 0.37} 37%|███▋ | 12624/34278 [13:59:49<28:35:22, 4.75s/it] 37%|███▋ | 12625/34278 [13:59:52<25:28:44, 4.24s/it] {'loss': 0.1225, 'grad_norm': 0.787345466973106, 'learning_rate': 7.28725705681136e-06, 'epoch': 0.37} 37%|███▋ | 12625/34278 [13:59:52<25:28:44, 4.24s/it] 37%|███▋ | 12626/34278 [13:59:57<25:31:41, 4.24s/it] {'loss': 0.1431, 'grad_norm': 0.9727213667441302, 'learning_rate': 7.286836941795193e-06, 'epoch': 0.37} 37%|███▋ | 12626/34278 [13:59:57<25:31:41, 4.24s/it] 37%|███▋ | 12627/34278 [14:00:03<29:08:26, 4.85s/it] {'loss': 0.1422, 'grad_norm': 0.9185978711724994, 'learning_rate': 7.286416806362693e-06, 'epoch': 0.37} 37%|███▋ | 12627/34278 [14:00:03<29:08:26, 4.85s/it] 37%|███▋ | 12628/34278 [14:00:06<25:51:11, 4.30s/it] {'loss': 0.1366, 'grad_norm': 0.8118258793856865, 
'learning_rate': 7.285996650517608e-06, 'epoch': 0.37} 37%|███▋ | 12628/34278 [14:00:06<25:51:11, 4.30s/it] 37%|███▋ | 12629/34278 [14:00:09<24:24:58, 4.06s/it] {'loss': 0.1542, 'grad_norm': 1.1994577940098725, 'learning_rate': 7.285576474263692e-06, 'epoch': 0.37} 37%|███▋ | 12629/34278 [14:00:09<24:24:58, 4.06s/it] 37%|███▋ | 12630/34278 [14:00:13<23:51:27, 3.97s/it] {'loss': 0.1546, 'grad_norm': 1.0416878269668777, 'learning_rate': 7.285156277604693e-06, 'epoch': 0.37} 37%|███▋ | 12630/34278 [14:00:13<23:51:27, 3.97s/it] 37%|███▋ | 12631/34278 [14:00:19<27:00:44, 4.49s/it] {'loss': 0.1426, 'grad_norm': 1.0730921371275317, 'learning_rate': 7.284736060544366e-06, 'epoch': 0.37} 37%|███▋ | 12631/34278 [14:00:19<27:00:44, 4.49s/it] 37%|███▋ | 12632/34278 [14:00:22<24:31:49, 4.08s/it] {'loss': 0.1414, 'grad_norm': 0.8199071874414507, 'learning_rate': 7.284315823086459e-06, 'epoch': 0.37} 37%|███▋ | 12632/34278 [14:00:22<24:31:49, 4.08s/it] 37%|███▋ | 12633/34278 [14:00:25<23:10:07, 3.85s/it] {'loss': 0.133, 'grad_norm': 0.9402127848174042, 'learning_rate': 7.283895565234729e-06, 'epoch': 0.37} 37%|███▋ | 12633/34278 [14:00:25<23:10:07, 3.85s/it] 37%|███▋ | 12634/34278 [14:00:29<22:23:07, 3.72s/it] {'loss': 0.1707, 'grad_norm': 1.0920864126210375, 'learning_rate': 7.283475286992923e-06, 'epoch': 0.37} 37%|███▋ | 12634/34278 [14:00:29<22:23:07, 3.72s/it] 37%|███▋ | 12635/34278 [14:00:32<20:44:47, 3.45s/it] {'loss': 0.1367, 'grad_norm': 0.8404151566503486, 'learning_rate': 7.283054988364793e-06, 'epoch': 0.37} 37%|███▋ | 12635/34278 [14:00:32<20:44:47, 3.45s/it] 37%|███▋ | 12636/34278 [14:00:36<23:02:14, 3.83s/it] {'loss': 0.134, 'grad_norm': 0.7585589900651192, 'learning_rate': 7.282634669354094e-06, 'epoch': 0.37} 37%|███▋ | 12636/34278 [14:00:36<23:02:14, 3.83s/it] 37%|███▋ | 12637/34278 [14:00:39<21:33:15, 3.59s/it] {'loss': 0.1346, 'grad_norm': 0.9278302596057294, 'learning_rate': 7.282214329964578e-06, 'epoch': 0.37} 37%|███▋ | 12637/34278 [14:00:39<21:33:15, 
3.59s/it] 37%|███▋ | 12638/34278 [14:00:42<20:24:18, 3.39s/it] {'loss': 0.1634, 'grad_norm': 0.7470972843530774, 'learning_rate': 7.2817939701999974e-06, 'epoch': 0.37} 37%|███▋ | 12638/34278 [14:00:42<20:24:18, 3.39s/it] 37%|███▋ | 12639/34278 [14:00:45<19:40:27, 3.27s/it] {'loss': 0.1637, 'grad_norm': 0.8306930217254754, 'learning_rate': 7.281373590064105e-06, 'epoch': 0.37} 37%|███▋ | 12639/34278 [14:00:45<19:40:27, 3.27s/it] 37%|███▋ | 12640/34278 [14:00:51<24:29:38, 4.08s/it] {'loss': 0.1651, 'grad_norm': 0.9285373974943016, 'learning_rate': 7.280953189560653e-06, 'epoch': 0.37} 37%|███▋ | 12640/34278 [14:00:51<24:29:38, 4.08s/it] 37%|███▋ | 12641/34278 [14:00:55<23:33:25, 3.92s/it] {'loss': 0.1332, 'grad_norm': 0.9310102750005027, 'learning_rate': 7.280532768693396e-06, 'epoch': 0.37} 37%|███▋ | 12641/34278 [14:00:55<23:33:25, 3.92s/it] 37%|███▋ | 12642/34278 [14:00:58<22:35:40, 3.76s/it] {'loss': 0.1651, 'grad_norm': 0.786755285060495, 'learning_rate': 7.280112327466087e-06, 'epoch': 0.37} 37%|███▋ | 12642/34278 [14:00:58<22:35:40, 3.76s/it] 37%|███▋ | 12643/34278 [14:01:01<21:42:26, 3.61s/it] {'loss': 0.1301, 'grad_norm': 0.9027386530738184, 'learning_rate': 7.27969186588248e-06, 'epoch': 0.37} 37%|███▋ | 12643/34278 [14:01:01<21:42:26, 3.61s/it] 37%|███▋ | 12644/34278 [14:01:05<20:46:54, 3.46s/it] {'loss': 0.1395, 'grad_norm': 0.8144305451701089, 'learning_rate': 7.2792713839463255e-06, 'epoch': 0.37} 37%|███▋ | 12644/34278 [14:01:05<20:46:54, 3.46s/it] 37%|███▋ | 12645/34278 [14:01:08<20:07:24, 3.35s/it] {'loss': 0.1436, 'grad_norm': 1.0296444669740603, 'learning_rate': 7.2788508816613836e-06, 'epoch': 0.37} 37%|███▋ | 12645/34278 [14:01:08<20:07:24, 3.35s/it] 37%|███▋ | 12646/34278 [14:01:11<19:58:48, 3.33s/it] {'loss': 0.1472, 'grad_norm': 0.9730056098013248, 'learning_rate': 7.278430359031403e-06, 'epoch': 0.37} 37%|███▋ | 12646/34278 [14:01:11<19:58:48, 3.33s/it] 37%|███▋ | 12647/34278 [14:01:14<19:56:03, 3.32s/it] {'loss': 0.138, 'grad_norm': 
0.7994043618478935, 'learning_rate': 7.278009816060141e-06, 'epoch': 0.37} 37%|███▋ | 12647/34278 [14:01:14<19:56:03, 3.32s/it] 37%|███▋ | 12648/34278 [14:01:17<19:31:00, 3.25s/it] {'loss': 0.1426, 'grad_norm': 0.9740903092945568, 'learning_rate': 7.277589252751351e-06, 'epoch': 0.37} 37%|███▋ | 12648/34278 [14:01:17<19:31:00, 3.25s/it] 37%|███▋ | 12649/34278 [14:01:20<18:58:18, 3.16s/it] {'loss': 0.1581, 'grad_norm': 0.9514339440379055, 'learning_rate': 7.277168669108787e-06, 'epoch': 0.37} 37%|███▋ | 12649/34278 [14:01:20<18:58:18, 3.16s/it] 37%|███▋ | 12650/34278 [14:01:23<18:43:25, 3.12s/it] {'loss': 0.1339, 'grad_norm': 0.8376829055759069, 'learning_rate': 7.276748065136206e-06, 'epoch': 0.37} 37%|███▋ | 12650/34278 [14:01:23<18:43:25, 3.12s/it] 37%|███▋ | 12651/34278 [14:01:28<20:55:55, 3.48s/it] {'loss': 0.1326, 'grad_norm': 0.6932765483000005, 'learning_rate': 7.27632744083736e-06, 'epoch': 0.37} 37%|███▋ | 12651/34278 [14:01:28<20:55:55, 3.48s/it] 37%|███▋ | 12652/34278 [14:01:34<26:11:23, 4.36s/it] {'loss': 0.1264, 'grad_norm': 1.0511895405972749, 'learning_rate': 7.2759067962160075e-06, 'epoch': 0.37} 37%|███▋ | 12652/34278 [14:01:34<26:11:23, 4.36s/it] 37%|███▋ | 12653/34278 [14:01:39<27:53:37, 4.64s/it] {'loss': 0.1458, 'grad_norm': 1.0100299524399472, 'learning_rate': 7.275486131275903e-06, 'epoch': 0.37} 37%|███▋ | 12653/34278 [14:01:39<27:53:37, 4.64s/it] 37%|███▋ | 12654/34278 [14:01:45<30:10:40, 5.02s/it] {'loss': 0.1526, 'grad_norm': 0.9109570724650197, 'learning_rate': 7.2750654460208e-06, 'epoch': 0.37} 37%|███▋ | 12654/34278 [14:01:45<30:10:40, 5.02s/it] 37%|███▋ | 12655/34278 [14:01:48<26:22:35, 4.39s/it] {'loss': 0.1385, 'grad_norm': 0.8927953356704228, 'learning_rate': 7.274644740454458e-06, 'epoch': 0.37} 37%|███▋ | 12655/34278 [14:01:48<26:22:35, 4.39s/it] 37%|███▋ | 12656/34278 [14:01:51<24:27:15, 4.07s/it] {'loss': 0.15, 'grad_norm': 0.8107226697996354, 'learning_rate': 7.274224014580627e-06, 'epoch': 0.37} 37%|███▋ | 12656/34278 
[14:01:51<24:27:15, 4.07s/it] 37%|███▋ | 12657/34278 [14:01:54<22:01:47, 3.67s/it] {'loss': 0.1292, 'grad_norm': 0.7177052389467927, 'learning_rate': 7.27380326840307e-06, 'epoch': 0.37} 37%|███▋ | 12657/34278 [14:01:54<22:01:47, 3.67s/it] 37%|███▋ | 12658/34278 [14:01:59<24:19:18, 4.05s/it] {'loss': 0.1596, 'grad_norm': 0.8994355904918047, 'learning_rate': 7.27338250192554e-06, 'epoch': 0.37} 37%|███▋ | 12658/34278 [14:01:59<24:19:18, 4.05s/it] 37%|███▋ | 12659/34278 [14:02:02<22:53:27, 3.81s/it] {'loss': 0.1586, 'grad_norm': 0.7671412830821046, 'learning_rate': 7.2729617151517915e-06, 'epoch': 0.37} 37%|███▋ | 12659/34278 [14:02:02<22:53:27, 3.81s/it] 37%|███▋ | 12660/34278 [14:02:06<22:46:18, 3.79s/it] {'loss': 0.1628, 'grad_norm': 0.7200823778342188, 'learning_rate': 7.272540908085586e-06, 'epoch': 0.37} 37%|███▋ | 12660/34278 [14:02:06<22:46:18, 3.79s/it] 37%|███▋ | 12661/34278 [14:02:09<21:08:04, 3.52s/it] {'loss': 0.1355, 'grad_norm': 0.932197094073567, 'learning_rate': 7.272120080730677e-06, 'epoch': 0.37} 37%|███▋ | 12661/34278 [14:02:09<21:08:04, 3.52s/it] 37%|███▋ | 12662/34278 [14:02:13<21:18:41, 3.55s/it] {'loss': 0.1338, 'grad_norm': 0.7493755613652879, 'learning_rate': 7.271699233090821e-06, 'epoch': 0.37} 37%|███▋ | 12662/34278 [14:02:13<21:18:41, 3.55s/it] 37%|███▋ | 12663/34278 [14:02:16<20:33:48, 3.42s/it] {'loss': 0.1663, 'grad_norm': 0.8691183903945077, 'learning_rate': 7.271278365169778e-06, 'epoch': 0.37} 37%|███▋ | 12663/34278 [14:02:16<20:33:48, 3.42s/it] 37%|███▋ | 12664/34278 [14:02:20<21:39:10, 3.61s/it] {'loss': 0.1309, 'grad_norm': 0.7487734779311649, 'learning_rate': 7.270857476971303e-06, 'epoch': 0.37} 37%|███▋ | 12664/34278 [14:02:20<21:39:10, 3.61s/it] 37%|███▋ | 12665/34278 [14:02:26<26:03:20, 4.34s/it] {'loss': 0.1271, 'grad_norm': 0.6663862528649757, 'learning_rate': 7.270436568499156e-06, 'epoch': 0.37} 37%|███▋ | 12665/34278 [14:02:26<26:03:20, 4.34s/it] 37%|███▋ | 12666/34278 [14:02:31<27:54:07, 4.65s/it] {'loss': 0.1346, 
'grad_norm': 0.8112441244218074, 'learning_rate': 7.270015639757092e-06, 'epoch': 0.37} 37%|███▋ | 12666/34278 [14:02:31<27:54:07, 4.65s/it] 37%|███▋ | 12667/34278 [14:02:34<24:46:41, 4.13s/it] {'loss': 0.1395, 'grad_norm': 3.906884028660531, 'learning_rate': 7.269594690748871e-06, 'epoch': 0.37} 37%|███▋ | 12667/34278 [14:02:34<24:46:41, 4.13s/it] 37%|███▋ | 12668/34278 [14:02:39<25:42:23, 4.28s/it] {'loss': 0.1533, 'grad_norm': 0.9087683544262717, 'learning_rate': 7.26917372147825e-06, 'epoch': 0.37} 37%|███▋ | 12668/34278 [14:02:39<25:42:23, 4.28s/it] 37%|███▋ | 12669/34278 [14:02:43<26:10:29, 4.36s/it] {'loss': 0.1395, 'grad_norm': 0.7699315262554577, 'learning_rate': 7.268752731948987e-06, 'epoch': 0.37} 37%|███▋ | 12669/34278 [14:02:43<26:10:29, 4.36s/it] 37%|███▋ | 12670/34278 [14:02:47<24:25:05, 4.07s/it] {'loss': 0.1527, 'grad_norm': 0.8308879806137981, 'learning_rate': 7.268331722164843e-06, 'epoch': 0.37} 37%|███▋ | 12670/34278 [14:02:47<24:25:05, 4.07s/it] 37%|███▋ | 12671/34278 [14:02:50<22:23:18, 3.73s/it] {'loss': 0.1404, 'grad_norm': 0.7691284472237025, 'learning_rate': 7.267910692129574e-06, 'epoch': 0.37} 37%|███▋ | 12671/34278 [14:02:50<22:23:18, 3.73s/it] 37%|███▋ | 12672/34278 [14:02:53<21:25:04, 3.57s/it] {'loss': 0.1552, 'grad_norm': 0.7064038777247387, 'learning_rate': 7.267489641846938e-06, 'epoch': 0.37} 37%|███▋ | 12672/34278 [14:02:53<21:25:04, 3.57s/it] 37%|███▋ | 12673/34278 [14:02:56<20:37:04, 3.44s/it] {'loss': 0.1534, 'grad_norm': 0.8410894472372811, 'learning_rate': 7.267068571320699e-06, 'epoch': 0.37} 37%|███▋ | 12673/34278 [14:02:56<20:37:04, 3.44s/it] 37%|███▋ | 12674/34278 [14:03:00<21:38:22, 3.61s/it] {'loss': 0.1376, 'grad_norm': 0.954164419387724, 'learning_rate': 7.26664748055461e-06, 'epoch': 0.37} 37%|███▋ | 12674/34278 [14:03:00<21:38:22, 3.61s/it] 37%|███▋ | 12675/34278 [14:03:06<25:42:02, 4.28s/it] {'loss': 0.154, 'grad_norm': 1.1417621661994168, 'learning_rate': 7.266226369552436e-06, 'epoch': 0.37} 37%|███▋ | 
12675/34278 [14:03:06<25:42:02, 4.28s/it] 37%|███▋ | 12676/34278 [14:03:10<25:20:12, 4.22s/it] {'loss': 0.1463, 'grad_norm': 0.6217955345989012, 'learning_rate': 7.265805238317933e-06, 'epoch': 0.37} 37%|███▋ | 12676/34278 [14:03:10<25:20:12, 4.22s/it] 37%|███▋ | 12677/34278 [14:03:13<23:27:11, 3.91s/it] {'loss': 0.1723, 'grad_norm': 0.8034245069093413, 'learning_rate': 7.2653840868548595e-06, 'epoch': 0.37} 37%|███▋ | 12677/34278 [14:03:13<23:27:11, 3.91s/it] 37%|███▋ | 12678/34278 [14:03:16<22:10:17, 3.70s/it] {'loss': 0.1395, 'grad_norm': 0.8025368610914108, 'learning_rate': 7.264962915166981e-06, 'epoch': 0.37} 37%|███▋ | 12678/34278 [14:03:16<22:10:17, 3.70s/it] 37%|███▋ | 12679/34278 [14:03:19<21:01:28, 3.50s/it] {'loss': 0.1656, 'grad_norm': 0.8857000147703661, 'learning_rate': 7.264541723258053e-06, 'epoch': 0.37} 37%|███▋ | 12679/34278 [14:03:19<21:01:28, 3.50s/it] 37%|███▋ | 12680/34278 [14:03:23<20:46:17, 3.46s/it] {'loss': 0.1508, 'grad_norm': 0.7973792407931395, 'learning_rate': 7.264120511131837e-06, 'epoch': 0.37} 37%|███▋ | 12680/34278 [14:03:23<20:46:17, 3.46s/it] 37%|███▋ | 12681/34278 [14:03:29<26:00:08, 4.33s/it] {'loss': 0.148, 'grad_norm': 0.8282811444616076, 'learning_rate': 7.263699278792093e-06, 'epoch': 0.37} 37%|███▋ | 12681/34278 [14:03:29<26:00:08, 4.33s/it] 37%|███▋ | 12682/34278 [14:03:32<23:49:14, 3.97s/it] {'loss': 0.1383, 'grad_norm': 0.9754627730566209, 'learning_rate': 7.263278026242583e-06, 'epoch': 0.37} 37%|███▋ | 12682/34278 [14:03:32<23:49:14, 3.97s/it] 37%|███▋ | 12683/34278 [14:03:36<24:09:15, 4.03s/it] {'loss': 0.1785, 'grad_norm': 0.8821581951445242, 'learning_rate': 7.2628567534870665e-06, 'epoch': 0.37} 37%|███▋ | 12683/34278 [14:03:36<24:09:15, 4.03s/it] 37%|███▋ | 12684/34278 [14:03:40<24:13:24, 4.04s/it] {'loss': 0.1419, 'grad_norm': 0.8014611317129382, 'learning_rate': 7.2624354605293045e-06, 'epoch': 0.37} 37%|███▋ | 12684/34278 [14:03:40<24:13:24, 4.04s/it] 37%|███▋ | 12685/34278 [14:03:44<22:40:56, 3.78s/it] 
{'loss': 0.174, 'grad_norm': 0.928112475567267, 'learning_rate': 7.26201414737306e-06, 'epoch': 0.37} 37%|███▋ | 12685/34278 [14:03:44<22:40:56, 3.78s/it] 37%|███▋ | 12686/34278 [14:03:48<23:10:27, 3.86s/it] {'loss': 0.1672, 'grad_norm': 0.9527295221348488, 'learning_rate': 7.261592814022094e-06, 'epoch': 0.37} 37%|███▋ | 12686/34278 [14:03:48<23:10:27, 3.86s/it] 37%|███▋ | 12687/34278 [14:03:51<22:21:05, 3.73s/it] {'loss': 0.1641, 'grad_norm': 0.8100111062669616, 'learning_rate': 7.2611714604801655e-06, 'epoch': 0.37} 37%|███▋ | 12687/34278 [14:03:51<22:21:05, 3.73s/it] 37%|███▋ | 12688/34278 [14:03:54<20:48:10, 3.47s/it] {'loss': 0.1624, 'grad_norm': 1.088862419658833, 'learning_rate': 7.260750086751039e-06, 'epoch': 0.37} 37%|███▋ | 12688/34278 [14:03:54<20:48:10, 3.47s/it] 37%|███▋ | 12689/34278 [14:03:57<19:51:37, 3.31s/it] {'loss': 0.1674, 'grad_norm': 0.8741832877252597, 'learning_rate': 7.260328692838475e-06, 'epoch': 0.37} 37%|███▋ | 12689/34278 [14:03:57<19:51:37, 3.31s/it] 37%|███▋ | 12690/34278 [14:04:01<21:42:35, 3.62s/it] {'loss': 0.163, 'grad_norm': 0.8113410708118879, 'learning_rate': 7.259907278746237e-06, 'epoch': 0.37} 37%|███▋ | 12690/34278 [14:04:01<21:42:35, 3.62s/it] 37%|███▋ | 12691/34278 [14:04:04<20:18:58, 3.39s/it] {'loss': 0.1351, 'grad_norm': 0.6559126878198583, 'learning_rate': 7.2594858444780845e-06, 'epoch': 0.37} 37%|███▋ | 12691/34278 [14:04:04<20:18:58, 3.39s/it] 37%|███▋ | 12692/34278 [14:04:07<20:11:27, 3.37s/it] {'loss': 0.1363, 'grad_norm': 1.057230150043581, 'learning_rate': 7.259064390037781e-06, 'epoch': 0.37} 37%|███▋ | 12692/34278 [14:04:07<20:11:27, 3.37s/it] 37%|███▋ | 12693/34278 [14:04:11<20:47:25, 3.47s/it] {'loss': 0.1516, 'grad_norm': 0.8382173020004834, 'learning_rate': 7.258642915429093e-06, 'epoch': 0.37} 37%|███▋ | 12693/34278 [14:04:11<20:47:25, 3.47s/it] 37%|███▋ | 12694/34278 [14:04:15<21:06:43, 3.52s/it] {'loss': 0.1599, 'grad_norm': 0.8541340696797689, 'learning_rate': 7.258221420655778e-06, 'epoch': 0.37} 
37%|███▋ | 12694/34278 [14:04:15<21:06:43, 3.52s/it] 37%|███▋ | 12695/34278 [14:04:18<20:17:51, 3.39s/it] {'loss': 0.1588, 'grad_norm': 0.9839161636709558, 'learning_rate': 7.257799905721602e-06, 'epoch': 0.37} 37%|███▋ | 12695/34278 [14:04:18<20:17:51, 3.39s/it] 37%|███▋ | 12696/34278 [14:04:21<19:58:57, 3.33s/it] {'loss': 0.143, 'grad_norm': 0.8469433734573926, 'learning_rate': 7.257378370630328e-06, 'epoch': 0.37} 37%|███▋ | 12696/34278 [14:04:21<19:58:57, 3.33s/it] 37%|███▋ | 12697/34278 [14:04:24<20:07:49, 3.36s/it] {'loss': 0.155, 'grad_norm': 0.813809899925821, 'learning_rate': 7.256956815385718e-06, 'epoch': 0.37} 37%|███▋ | 12697/34278 [14:04:24<20:07:49, 3.36s/it] 37%|███▋ | 12698/34278 [14:04:27<19:38:39, 3.28s/it] {'loss': 0.1422, 'grad_norm': 0.9687633725390876, 'learning_rate': 7.2565352399915354e-06, 'epoch': 0.37} 37%|███▋ | 12698/34278 [14:04:27<19:38:39, 3.28s/it] 37%|███▋ | 12699/34278 [14:04:31<19:50:09, 3.31s/it] {'loss': 0.1287, 'grad_norm': 0.8074508913582549, 'learning_rate': 7.256113644451547e-06, 'epoch': 0.37} 37%|███▋ | 12699/34278 [14:04:31<19:50:09, 3.31s/it] 37%|███▋ | 12700/34278 [14:04:35<21:59:35, 3.67s/it] {'loss': 0.1462, 'grad_norm': 0.9016226971122039, 'learning_rate': 7.2556920287695135e-06, 'epoch': 0.37} 37%|███▋ | 12700/34278 [14:04:35<21:59:35, 3.67s/it] 37%|███▋ | 12701/34278 [14:04:40<23:13:38, 3.88s/it] {'loss': 0.1451, 'grad_norm': 0.9994119263648709, 'learning_rate': 7.2552703929491995e-06, 'epoch': 0.37} 37%|███▋ | 12701/34278 [14:04:40<23:13:38, 3.88s/it] 37%|███▋ | 12702/34278 [14:04:43<21:19:39, 3.56s/it] {'loss': 0.1363, 'grad_norm': 0.8768566263661491, 'learning_rate': 7.254848736994371e-06, 'epoch': 0.37} 37%|███▋ | 12702/34278 [14:04:43<21:19:39, 3.56s/it] 37%|███▋ | 12703/34278 [14:04:47<23:13:32, 3.88s/it] {'loss': 0.1594, 'grad_norm': 0.7873476512517228, 'learning_rate': 7.254427060908791e-06, 'epoch': 0.37} 37%|███▋ | 12703/34278 [14:04:47<23:13:32, 3.88s/it] 37%|███▋ | 12704/34278 [14:04:50<21:47:00, 
3.63s/it] {'loss': 0.1426, 'grad_norm': 1.0958104625118537, 'learning_rate': 7.254005364696223e-06, 'epoch': 0.37} 37%|███▋ | 12704/34278 [14:04:50<21:47:00, 3.63s/it] 37%|███▋ | 12705/34278 [14:04:53<21:03:44, 3.51s/it] {'loss': 0.1423, 'grad_norm': 0.6734979153631149, 'learning_rate': 7.253583648360435e-06, 'epoch': 0.37} 37%|███▋ | 12705/34278 [14:04:53<21:03:44, 3.51s/it] 37%|███▋ | 12706/34278 [14:04:57<20:24:00, 3.40s/it] {'loss': 0.1423, 'grad_norm': 0.8778050796298381, 'learning_rate': 7.253161911905188e-06, 'epoch': 0.37} 37%|███▋ | 12706/34278 [14:04:57<20:24:00, 3.40s/it] 37%|███▋ | 12707/34278 [14:05:01<21:31:00, 3.59s/it] {'loss': 0.1225, 'grad_norm': 1.2332592993197071, 'learning_rate': 7.25274015533425e-06, 'epoch': 0.37} 37%|███▋ | 12707/34278 [14:05:01<21:31:00, 3.59s/it] 37%|███▋ | 12708/34278 [14:05:07<26:15:21, 4.38s/it] {'loss': 0.1251, 'grad_norm': 0.7214949137311216, 'learning_rate': 7.252318378651388e-06, 'epoch': 0.37} 37%|███▋ | 12708/34278 [14:05:07<26:15:21, 4.38s/it] 37%|███▋ | 12709/34278 [14:05:10<23:48:21, 3.97s/it] {'loss': 0.1438, 'grad_norm': 0.8838682258775027, 'learning_rate': 7.251896581860364e-06, 'epoch': 0.37} 37%|███▋ | 12709/34278 [14:05:10<23:48:21, 3.97s/it] 37%|███▋ | 12710/34278 [14:05:14<23:42:02, 3.96s/it] {'loss': 0.1368, 'grad_norm': 0.8865027838061383, 'learning_rate': 7.2514747649649445e-06, 'epoch': 0.37} 37%|███▋ | 12710/34278 [14:05:14<23:42:02, 3.96s/it] 37%|███▋ | 12711/34278 [14:05:17<21:53:45, 3.65s/it] {'loss': 0.1391, 'grad_norm': 0.6501735305599404, 'learning_rate': 7.2510529279688955e-06, 'epoch': 0.37} 37%|███▋ | 12711/34278 [14:05:17<21:53:45, 3.65s/it] 37%|███▋ | 12712/34278 [14:05:20<21:49:40, 3.64s/it] {'loss': 0.1523, 'grad_norm': 0.7626504494552498, 'learning_rate': 7.250631070875983e-06, 'epoch': 0.37} 37%|███▋ | 12712/34278 [14:05:20<21:49:40, 3.64s/it] 37%|███▋ | 12713/34278 [14:05:26<26:11:11, 4.37s/it] {'loss': 0.1926, 'grad_norm': 0.8341725753743278, 'learning_rate': 7.250209193689975e-06, 
'epoch': 0.37} 37%|███▋ | 12713/34278 [14:05:26<26:11:11, 4.37s/it] 37%|███▋ | 12714/34278 [14:05:30<24:04:55, 4.02s/it] {'loss': 0.136, 'grad_norm': 0.8088679688713538, 'learning_rate': 7.249787296414635e-06, 'epoch': 0.37} 37%|███▋ | 12714/34278 [14:05:30<24:04:55, 4.02s/it] 37%|███▋ | 12715/34278 [14:05:35<25:36:00, 4.27s/it] {'loss': 0.1432, 'grad_norm': 0.6777045927186891, 'learning_rate': 7.249365379053731e-06, 'epoch': 0.37} 37%|███▋ | 12715/34278 [14:05:35<25:36:00, 4.27s/it] 37%|███▋ | 12716/34278 [14:05:38<23:48:47, 3.98s/it] {'loss': 0.1321, 'grad_norm': 0.736509969660584, 'learning_rate': 7.248943441611031e-06, 'epoch': 0.37} 37%|███▋ | 12716/34278 [14:05:38<23:48:47, 3.98s/it] 37%|███▋ | 12717/34278 [14:05:42<23:35:53, 3.94s/it] {'loss': 0.1631, 'grad_norm': 0.9220189050883129, 'learning_rate': 7.248521484090299e-06, 'epoch': 0.37} 37%|███▋ | 12717/34278 [14:05:42<23:35:53, 3.94s/it] 37%|███▋ | 12718/34278 [14:05:46<23:47:03, 3.97s/it] {'loss': 0.1554, 'grad_norm': 0.6296469905412683, 'learning_rate': 7.248099506495307e-06, 'epoch': 0.37} 37%|███▋ | 12718/34278 [14:05:46<23:47:03, 3.97s/it] 37%|███▋ | 12719/34278 [14:05:52<27:27:40, 4.59s/it] {'loss': 0.1568, 'grad_norm': 0.8702416146894282, 'learning_rate': 7.247677508829816e-06, 'epoch': 0.37} 37%|███▋ | 12719/34278 [14:05:52<27:27:40, 4.59s/it] 37%|███▋ | 12720/34278 [14:05:57<28:19:22, 4.73s/it] {'loss': 0.143, 'grad_norm': 0.7698777512516692, 'learning_rate': 7.2472554910976e-06, 'epoch': 0.37} 37%|███▋ | 12720/34278 [14:05:57<28:19:22, 4.73s/it] 37%|███▋ | 12721/34278 [14:06:00<25:51:06, 4.32s/it] {'loss': 0.1263, 'grad_norm': 0.6924246629798406, 'learning_rate': 7.246833453302422e-06, 'epoch': 0.37} 37%|███▋ | 12721/34278 [14:06:00<25:51:06, 4.32s/it] 37%|███▋ | 12722/34278 [14:06:03<23:37:25, 3.95s/it] {'loss': 0.1511, 'grad_norm': 0.7094621667323893, 'learning_rate': 7.24641139544805e-06, 'epoch': 0.37} 37%|███▋ | 12722/34278 [14:06:03<23:37:25, 3.95s/it] 37%|███▋ | 12723/34278 
[14:06:07<23:43:12, 3.96s/it] {'loss': 0.1508, 'grad_norm': 0.7122842266152872, 'learning_rate': 7.2459893175382546e-06, 'epoch': 0.37} 37%|███▋ | 12723/34278 [14:06:07<23:43:12, 3.96s/it] 37%|███▋ | 12724/34278 [14:06:10<21:43:32, 3.63s/it] {'loss': 0.1574, 'grad_norm': 0.9425692565536175, 'learning_rate': 7.245567219576803e-06, 'epoch': 0.37} 37%|███▋ | 12724/34278 [14:06:10<21:43:32, 3.63s/it] 37%|███▋ | 12725/34278 [14:06:13<20:54:59, 3.49s/it] {'loss': 0.1379, 'grad_norm': 0.8105275883169979, 'learning_rate': 7.2451451015674624e-06, 'epoch': 0.37} 37%|███▋ | 12725/34278 [14:06:13<20:54:59, 3.49s/it] 37%|███▋ | 12726/34278 [14:06:16<19:45:12, 3.30s/it] {'loss': 0.1395, 'grad_norm': 0.7417886750217544, 'learning_rate': 7.244722963514002e-06, 'epoch': 0.37} 37%|███▋ | 12726/34278 [14:06:16<19:45:12, 3.30s/it] 37%|███▋ | 12727/34278 [14:06:19<18:49:16, 3.14s/it] {'loss': 0.1376, 'grad_norm': 0.9187036576494949, 'learning_rate': 7.244300805420192e-06, 'epoch': 0.37} 37%|███▋ | 12727/34278 [14:06:19<18:49:16, 3.14s/it] 37%|███▋ | 12728/34278 [14:06:22<18:29:01, 3.09s/it] {'loss': 0.1265, 'grad_norm': 0.672931933546487, 'learning_rate': 7.2438786272897995e-06, 'epoch': 0.37} 37%|███▋ | 12728/34278 [14:06:22<18:29:01, 3.09s/it] 37%|███▋ | 12729/34278 [14:06:25<18:32:57, 3.10s/it] {'loss': 0.1437, 'grad_norm': 0.8197851589142929, 'learning_rate': 7.243456429126594e-06, 'epoch': 0.37} 37%|███▋ | 12729/34278 [14:06:25<18:32:57, 3.10s/it] 37%|███▋ | 12730/34278 [14:06:28<18:31:00, 3.09s/it] {'loss': 0.1314, 'grad_norm': 0.694977998895702, 'learning_rate': 7.243034210934345e-06, 'epoch': 0.37} 37%|███▋ | 12730/34278 [14:06:28<18:31:00, 3.09s/it] 37%|███▋ | 12731/34278 [14:06:31<17:56:20, 3.00s/it] {'loss': 0.1516, 'grad_norm': 0.7052585724435444, 'learning_rate': 7.242611972716823e-06, 'epoch': 0.37} 37%|███▋ | 12731/34278 [14:06:31<17:56:20, 3.00s/it] 37%|███▋ | 12732/34278 [14:06:34<17:53:37, 2.99s/it] {'loss': 0.1368, 'grad_norm': 1.0067913676235247, 'learning_rate': 
7.2421897144777965e-06, 'epoch': 0.37} 37%|███▋ | 12732/34278 [14:06:34<17:53:37, 2.99s/it] 37%|███▋ | 12733/34278 [14:06:38<20:02:37, 3.35s/it] {'loss': 0.1538, 'grad_norm': 0.9442444614065237, 'learning_rate': 7.2417674362210365e-06, 'epoch': 0.37} 37%|███▋ | 12733/34278 [14:06:38<20:02:37, 3.35s/it] 37%|███▋ | 12734/34278 [14:06:44<24:43:19, 4.13s/it] {'loss': 0.1667, 'grad_norm': 0.957447244826272, 'learning_rate': 7.241345137950309e-06, 'epoch': 0.37} 37%|███▋ | 12734/34278 [14:06:44<24:43:19, 4.13s/it] 37%|███▋ | 12735/34278 [14:06:47<23:32:18, 3.93s/it] {'loss': 0.1609, 'grad_norm': 0.8537917080419062, 'learning_rate': 7.24092281966939e-06, 'epoch': 0.37} 37%|███▋ | 12735/34278 [14:06:47<23:32:18, 3.93s/it] 37%|███▋ | 12736/34278 [14:06:53<26:45:35, 4.47s/it] {'loss': 0.1281, 'grad_norm': 0.7917483530332348, 'learning_rate': 7.2405004813820465e-06, 'epoch': 0.37} 37%|███▋ | 12736/34278 [14:06:53<26:45:35, 4.47s/it] 37%|███▋ | 12737/34278 [14:06:56<23:59:22, 4.01s/it] {'loss': 0.1313, 'grad_norm': 1.5691676737995557, 'learning_rate': 7.240078123092047e-06, 'epoch': 0.37} 37%|███▋ | 12737/34278 [14:06:56<23:59:22, 4.01s/it] 37%|███▋ | 12738/34278 [14:07:00<23:23:25, 3.91s/it] {'loss': 0.1512, 'grad_norm': 0.868321333198384, 'learning_rate': 7.2396557448031675e-06, 'epoch': 0.37} 37%|███▋ | 12738/34278 [14:07:00<23:23:25, 3.91s/it] 37%|███▋ | 12739/34278 [14:07:03<22:07:29, 3.70s/it] {'loss': 0.1534, 'grad_norm': 0.7526311165689189, 'learning_rate': 7.239233346519176e-06, 'epoch': 0.37} 37%|███▋ | 12739/34278 [14:07:03<22:07:29, 3.70s/it] 37%|███▋ | 12740/34278 [14:07:07<23:10:36, 3.87s/it] {'loss': 0.1577, 'grad_norm': 0.8363204816108493, 'learning_rate': 7.238810928243842e-06, 'epoch': 0.37} 37%|███▋ | 12740/34278 [14:07:07<23:10:36, 3.87s/it] 37%|███▋ | 12741/34278 [14:07:10<21:27:26, 3.59s/it] {'loss': 0.1489, 'grad_norm': 0.9236118860998855, 'learning_rate': 7.238388489980941e-06, 'epoch': 0.37} 37%|███▋ | 12741/34278 [14:07:10<21:27:26, 3.59s/it] 37%|███▋ 
[training progress, steps 12742–12923 of 34278 (epoch 0.37 → 0.38): loss 0.114–0.178, grad_norm 0.59–1.50, learning_rate 7.238e-06 → 7.161e-06, ~3.1–5.6 s/it; per-step tqdm lines condensed]

At steps 12746 and 12776:
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(

At step 12803:
Token indices sequence length is longer than the specified maximum sequence length for this model (13665 > 8192). Running this sequence through the model will result in indexing errors

At step 12895:
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb3c187d030>
Failed to fetch sample 2654740. Exception: cannot identify image file <_io.BytesIO object at 0x7fb3c187d030>

[training continues through step 12923 without interruption]
4.95s/it] 38%|███▊ | 12924/34278 [14:18:35<27:24:02, 4.62s/it] {'loss': 0.1605, 'grad_norm': 1.10500134042551, 'learning_rate': 7.160749701091576e-06, 'epoch': 0.38} 38%|███▊ | 12924/34278 [14:18:35<27:24:02, 4.62s/it] 38%|███▊ | 12925/34278 [14:18:38<25:43:54, 4.34s/it] {'loss': 0.1173, 'grad_norm': 0.8126912899723046, 'learning_rate': 7.160323649489221e-06, 'epoch': 0.38} 38%|███▊ | 12925/34278 [14:18:38<25:43:54, 4.34s/it] 38%|███▊ | 12926/34278 [14:18:42<24:23:29, 4.11s/it] {'loss': 0.1391, 'grad_norm': 0.8426200014264674, 'learning_rate': 7.159897578600014e-06, 'epoch': 0.38} 38%|███▊ | 12926/34278 [14:18:42<24:23:29, 4.11s/it] 38%|███▊ | 12927/34278 [14:18:45<22:07:35, 3.73s/it] {'loss': 0.1502, 'grad_norm': 0.8834136266003784, 'learning_rate': 7.1594714884277564e-06, 'epoch': 0.38} 38%|███▊ | 12927/34278 [14:18:45<22:07:35, 3.73s/it] 38%|███▊ | 12928/34278 [14:18:48<20:42:26, 3.49s/it] {'loss': 0.1424, 'grad_norm': 1.0371223791268027, 'learning_rate': 7.1590453789762525e-06, 'epoch': 0.38} 38%|███▊ | 12928/34278 [14:18:48<20:42:26, 3.49s/it] 38%|███▊ | 12929/34278 [14:18:53<24:39:13, 4.16s/it] {'loss': 0.1144, 'grad_norm': 0.9614534965231251, 'learning_rate': 7.158619250249307e-06, 'epoch': 0.38} 38%|███▊ | 12929/34278 [14:18:54<24:39:13, 4.16s/it] 38%|███▊ | 12930/34278 [14:18:56<22:16:47, 3.76s/it] {'loss': 0.1418, 'grad_norm': 1.0766179123513977, 'learning_rate': 7.158193102250724e-06, 'epoch': 0.38} 38%|███▊ | 12930/34278 [14:18:56<22:16:47, 3.76s/it] 38%|███▊ | 12931/34278 [14:19:00<22:07:44, 3.73s/it] {'loss': 0.1536, 'grad_norm': 0.7851736856001515, 'learning_rate': 7.157766934984308e-06, 'epoch': 0.38} 38%|███▊ | 12931/34278 [14:19:00<22:07:44, 3.73s/it] 38%|███▊ | 12932/34278 [14:19:03<20:58:47, 3.54s/it] {'loss': 0.1436, 'grad_norm': 1.1128148525327382, 'learning_rate': 7.157340748453864e-06, 'epoch': 0.38} 38%|███▊ | 12932/34278 [14:19:03<20:58:47, 3.54s/it] 38%|███▊ | 12933/34278 [14:19:09<25:17:21, 4.27s/it] {'loss': 0.1474, 'grad_norm': 
0.97632633608461, 'learning_rate': 7.1569145426631985e-06, 'epoch': 0.38} 38%|███▊ | 12933/34278 [14:19:09<25:17:21, 4.27s/it] 38%|███▊ | 12934/34278 [14:19:12<22:43:53, 3.83s/it] {'loss': 0.1593, 'grad_norm': 0.8353249735371481, 'learning_rate': 7.156488317616111e-06, 'epoch': 0.38} 38%|███▊ | 12934/34278 [14:19:12<22:43:53, 3.83s/it] 38%|███▊ | 12935/34278 [14:19:15<21:02:55, 3.55s/it] {'loss': 0.126, 'grad_norm': 0.6876320909100034, 'learning_rate': 7.156062073316414e-06, 'epoch': 0.38} 38%|███▊ | 12935/34278 [14:19:15<21:02:55, 3.55s/it] 38%|███▊ | 12936/34278 [14:19:19<21:37:44, 3.65s/it] {'loss': 0.1413, 'grad_norm': 0.9650734550691265, 'learning_rate': 7.155635809767909e-06, 'epoch': 0.38} 38%|███▊ | 12936/34278 [14:19:19<21:37:44, 3.65s/it] 38%|███▊ | 12937/34278 [14:19:22<21:13:27, 3.58s/it] {'loss': 0.1696, 'grad_norm': 1.3062486775554714, 'learning_rate': 7.1552095269744e-06, 'epoch': 0.38} 38%|███▊ | 12937/34278 [14:19:22<21:13:27, 3.58s/it] 38%|███▊ | 12938/34278 [14:19:28<25:40:12, 4.33s/it] {'loss': 0.1399, 'grad_norm': 0.7289393306992145, 'learning_rate': 7.154783224939697e-06, 'epoch': 0.38} 38%|███▊ | 12938/34278 [14:19:28<25:40:12, 4.33s/it] 38%|███▊ | 12939/34278 [14:19:31<23:35:18, 3.98s/it] {'loss': 0.1212, 'grad_norm': 1.007827707825707, 'learning_rate': 7.154356903667604e-06, 'epoch': 0.38} 38%|███▊ | 12939/34278 [14:19:31<23:35:18, 3.98s/it] 38%|███▊ | 12940/34278 [14:19:37<26:42:17, 4.51s/it] {'loss': 0.1374, 'grad_norm': 0.953456318475476, 'learning_rate': 7.153930563161926e-06, 'epoch': 0.38} 38%|███▊ | 12940/34278 [14:19:37<26:42:17, 4.51s/it] 38%|███▊ | 12941/34278 [14:19:40<24:12:21, 4.08s/it] {'loss': 0.1574, 'grad_norm': 0.7861556916629966, 'learning_rate': 7.15350420342647e-06, 'epoch': 0.38} 38%|███▊ | 12941/34278 [14:19:40<24:12:21, 4.08s/it] 38%|███▊ | 12942/34278 [14:19:44<23:19:18, 3.94s/it] {'loss': 0.1313, 'grad_norm': 1.035179138482058, 'learning_rate': 7.1530778244650425e-06, 'epoch': 0.38} 38%|███▊ | 12942/34278 
[14:19:44<23:19:18, 3.94s/it] 38%|███▊ | 12943/34278 [14:19:47<22:09:01, 3.74s/it] {'loss': 0.1394, 'grad_norm': 0.7924070480833347, 'learning_rate': 7.1526514262814495e-06, 'epoch': 0.38} 38%|███▊ | 12943/34278 [14:19:47<22:09:01, 3.74s/it] 38%|███▊ | 12944/34278 [14:19:53<26:25:54, 4.46s/it] {'loss': 0.1638, 'grad_norm': 1.0319999178575172, 'learning_rate': 7.1522250088795e-06, 'epoch': 0.38} 38%|███▊ | 12944/34278 [14:19:53<26:25:54, 4.46s/it] 38%|███▊ | 12945/34278 [14:19:57<24:37:02, 4.15s/it] {'loss': 0.1532, 'grad_norm': 0.8222215758395307, 'learning_rate': 7.1517985722630005e-06, 'epoch': 0.38} 38%|███▊ | 12945/34278 [14:19:57<24:37:02, 4.15s/it] 38%|███▊ | 12946/34278 [14:19:59<22:12:22, 3.75s/it] {'loss': 0.1387, 'grad_norm': 1.104517354869253, 'learning_rate': 7.151372116435753e-06, 'epoch': 0.38} 38%|███▊ | 12946/34278 [14:19:59<22:12:22, 3.75s/it] 38%|███▊ | 12947/34278 [14:20:03<21:14:00, 3.58s/it] {'loss': 0.148, 'grad_norm': 0.9845709166685269, 'learning_rate': 7.150945641401571e-06, 'epoch': 0.38} 38%|███▊ | 12947/34278 [14:20:03<21:14:00, 3.58s/it] 38%|███▊ | 12948/34278 [14:20:06<20:55:59, 3.53s/it] {'loss': 0.1431, 'grad_norm': 0.8761587873510057, 'learning_rate': 7.150519147164261e-06, 'epoch': 0.38} 38%|███▊ | 12948/34278 [14:20:06<20:55:59, 3.53s/it] 38%|███▊ | 12949/34278 [14:20:09<20:51:04, 3.52s/it] {'loss': 0.1531, 'grad_norm': 1.0675377919863407, 'learning_rate': 7.150092633727627e-06, 'epoch': 0.38} 38%|███▊ | 12949/34278 [14:20:09<20:51:04, 3.52s/it] 38%|███▊ | 12950/34278 [14:20:12<19:29:53, 3.29s/it] {'loss': 0.1831, 'grad_norm': 0.8704741403693905, 'learning_rate': 7.149666101095482e-06, 'epoch': 0.38} 38%|███▊ | 12950/34278 [14:20:12<19:29:53, 3.29s/it] 38%|███▊ | 12951/34278 [14:20:18<24:20:23, 4.11s/it] {'loss': 0.1243, 'grad_norm': 0.71824553247317, 'learning_rate': 7.149239549271629e-06, 'epoch': 0.38} 38%|███▊ | 12951/34278 [14:20:18<24:20:23, 4.11s/it] 38%|███▊ | 12952/34278 [14:20:21<22:06:49, 3.73s/it] {'loss': 0.2166, 
'grad_norm': 0.9730730253029378, 'learning_rate': 7.148812978259878e-06, 'epoch': 0.38} 38%|███▊ | 12952/34278 [14:20:21<22:06:49, 3.73s/it] 38%|███▊ | 12953/34278 [14:20:26<23:17:29, 3.93s/it] {'loss': 0.1237, 'grad_norm': 0.8428157727675645, 'learning_rate': 7.148386388064039e-06, 'epoch': 0.38} 38%|███▊ | 12953/34278 [14:20:26<23:17:29, 3.93s/it] 38%|███▊ | 12954/34278 [14:20:31<25:22:46, 4.28s/it] {'loss': 0.1371, 'grad_norm': 0.8006041165399888, 'learning_rate': 7.14795977868792e-06, 'epoch': 0.38} 38%|███▊ | 12954/34278 [14:20:31<25:22:46, 4.28s/it] 38%|███▊ | 12955/34278 [14:20:34<22:56:51, 3.87s/it] {'loss': 0.1249, 'grad_norm': 0.9305105700919174, 'learning_rate': 7.147533150135327e-06, 'epoch': 0.38} 38%|███▊ | 12955/34278 [14:20:34<22:56:51, 3.87s/it] 38%|███▊ | 12956/34278 [14:20:37<22:46:59, 3.85s/it] {'loss': 0.1459, 'grad_norm': 1.0914651019831614, 'learning_rate': 7.147106502410071e-06, 'epoch': 0.38} 38%|███▊ | 12956/34278 [14:20:37<22:46:59, 3.85s/it] 38%|███▊ | 12957/34278 [14:20:43<26:37:43, 4.50s/it] {'loss': 0.1446, 'grad_norm': 0.885704244639512, 'learning_rate': 7.146679835515962e-06, 'epoch': 0.38} 38%|███▊ | 12957/34278 [14:20:43<26:37:43, 4.50s/it] 38%|███▊ | 12958/34278 [14:20:47<24:29:51, 4.14s/it] {'loss': 0.1447, 'grad_norm': 0.958352530010404, 'learning_rate': 7.146253149456806e-06, 'epoch': 0.38} 38%|███▊ | 12958/34278 [14:20:47<24:29:51, 4.14s/it] 38%|███▊ | 12959/34278 [14:20:50<22:47:47, 3.85s/it] {'loss': 0.1344, 'grad_norm': 0.9641981998357005, 'learning_rate': 7.145826444236415e-06, 'epoch': 0.38} 38%|███▊ | 12959/34278 [14:20:50<22:47:47, 3.85s/it] 38%|███▊ | 12960/34278 [14:20:53<21:03:35, 3.56s/it] {'loss': 0.1658, 'grad_norm': 0.8479666204307287, 'learning_rate': 7.1453997198586e-06, 'epoch': 0.38} 38%|███▊ | 12960/34278 [14:20:53<21:03:35, 3.56s/it] 38%|███▊ | 12961/34278 [14:20:59<25:40:27, 4.34s/it] {'loss': 0.1219, 'grad_norm': 0.7779690532002743, 'learning_rate': 7.144972976327164e-06, 'epoch': 0.38} 38%|███▊ | 
12961/34278 [14:20:59<25:40:27, 4.34s/it] 38%|███▊ | 12962/34278 [14:21:02<23:25:14, 3.96s/it] {'loss': 0.1535, 'grad_norm': 1.2539229754779762, 'learning_rate': 7.144546213645924e-06, 'epoch': 0.38} 38%|███▊ | 12962/34278 [14:21:02<23:25:14, 3.96s/it] 38%|███▊ | 12963/34278 [14:21:05<21:55:30, 3.70s/it] {'loss': 0.1582, 'grad_norm': 0.7621838741709405, 'learning_rate': 7.144119431818689e-06, 'epoch': 0.38} 38%|███▊ | 12963/34278 [14:21:05<21:55:30, 3.70s/it] 38%|███▊ | 12964/34278 [14:21:09<23:11:11, 3.92s/it] {'loss': 0.1305, 'grad_norm': 0.735321625004359, 'learning_rate': 7.1436926308492645e-06, 'epoch': 0.38} 38%|███▊ | 12964/34278 [14:21:09<23:11:11, 3.92s/it] 38%|███▊ | 12965/34278 [14:21:15<26:45:08, 4.52s/it] {'loss': 0.1298, 'grad_norm': 0.959420232020522, 'learning_rate': 7.1432658107414665e-06, 'epoch': 0.38} 38%|███▊ | 12965/34278 [14:21:15<26:45:08, 4.52s/it] 38%|███▊ | 12966/34278 [14:21:19<24:24:09, 4.12s/it] {'loss': 0.1726, 'grad_norm': 0.7998303281463885, 'learning_rate': 7.142838971499101e-06, 'epoch': 0.38} 38%|███▊ | 12966/34278 [14:21:19<24:24:09, 4.12s/it] 38%|███▊ | 12967/34278 [14:21:22<23:09:27, 3.91s/it] {'loss': 0.1813, 'grad_norm': 0.9765520966833411, 'learning_rate': 7.142412113125981e-06, 'epoch': 0.38} 38%|███▊ | 12967/34278 [14:21:22<23:09:27, 3.91s/it] 38%|███▊ | 12968/34278 [14:21:26<22:40:08, 3.83s/it] {'loss': 0.1365, 'grad_norm': 0.7140966566107453, 'learning_rate': 7.141985235625918e-06, 'epoch': 0.38} 38%|███▊ | 12968/34278 [14:21:26<22:40:08, 3.83s/it] 38%|███▊ | 12969/34278 [14:21:31<24:41:02, 4.17s/it] {'loss': 0.1651, 'grad_norm': 1.017120231765769, 'learning_rate': 7.141558339002721e-06, 'epoch': 0.38} 38%|███▊ | 12969/34278 [14:21:31<24:41:02, 4.17s/it] 38%|███▊ | 12970/34278 [14:21:34<24:15:02, 4.10s/it] {'loss': 0.1353, 'grad_norm': 0.9148143008092731, 'learning_rate': 7.141131423260204e-06, 'epoch': 0.38} 38%|███▊ | 12970/34278 [14:21:35<24:15:02, 4.10s/it] 38%|███▊ | 12971/34278 [14:21:38<22:51:22, 3.86s/it] 
{'loss': 0.1281, 'grad_norm': 0.7676960095955689, 'learning_rate': 7.140704488402175e-06, 'epoch': 0.38} 38%|███▊ | 12971/34278 [14:21:38<22:51:22, 3.86s/it] 38%|███▊ | 12972/34278 [14:21:41<22:07:50, 3.74s/it] {'loss': 0.1282, 'grad_norm': 0.7908730580557988, 'learning_rate': 7.1402775344324485e-06, 'epoch': 0.38} 38%|███▊ | 12972/34278 [14:21:41<22:07:50, 3.74s/it] 38%|███▊ | 12973/34278 [14:21:46<23:32:46, 3.98s/it] {'loss': 0.1518, 'grad_norm': 1.0511472798732373, 'learning_rate': 7.1398505613548345e-06, 'epoch': 0.38} 38%|███▊ | 12973/34278 [14:21:46<23:32:46, 3.98s/it] 38%|███▊ | 12974/34278 [14:21:52<26:37:02, 4.50s/it] {'loss': 0.1521, 'grad_norm': 0.771793661445887, 'learning_rate': 7.1394235691731454e-06, 'epoch': 0.38} 38%|███▊ | 12974/34278 [14:21:52<26:37:02, 4.50s/it] 38%|███▊ | 12975/34278 [14:21:55<24:09:44, 4.08s/it] {'loss': 0.1351, 'grad_norm': 0.6078147772896247, 'learning_rate': 7.1389965578911946e-06, 'epoch': 0.38} 38%|███▊ | 12975/34278 [14:21:55<24:09:44, 4.08s/it] 38%|███▊ | 12976/34278 [14:21:59<24:39:41, 4.17s/it] {'loss': 0.1486, 'grad_norm': 1.0553111873365606, 'learning_rate': 7.138569527512791e-06, 'epoch': 0.38} 38%|███▊ | 12976/34278 [14:21:59<24:39:41, 4.17s/it] 38%|███▊ | 12977/34278 [14:22:03<24:25:37, 4.13s/it] {'loss': 0.1513, 'grad_norm': 0.8961806256233492, 'learning_rate': 7.13814247804175e-06, 'epoch': 0.38} 38%|███▊ | 12977/34278 [14:22:03<24:25:37, 4.13s/it] 38%|███▊ | 12978/34278 [14:22:07<24:59:26, 4.22s/it] {'loss': 0.131, 'grad_norm': 0.8875527816798312, 'learning_rate': 7.137715409481884e-06, 'epoch': 0.38} 38%|███▊ | 12978/34278 [14:22:07<24:59:26, 4.22s/it] 38%|███▊ | 12979/34278 [14:22:11<23:31:27, 3.98s/it] {'loss': 0.1421, 'grad_norm': 0.7366436156084128, 'learning_rate': 7.137288321837005e-06, 'epoch': 0.38} 38%|███▊ | 12979/34278 [14:22:11<23:31:27, 3.98s/it] 38%|███▊ | 12980/34278 [14:22:14<22:19:04, 3.77s/it] {'loss': 0.1209, 'grad_norm': 0.6727231513794392, 'learning_rate': 7.136861215110926e-06, 'epoch': 
0.38} 38%|███▊ | 12980/34278 [14:22:14<22:19:04, 3.77s/it] 38%|███▊ | 12981/34278 [14:22:18<22:27:02, 3.80s/it] {'loss': 0.1485, 'grad_norm': 0.8167695900482286, 'learning_rate': 7.1364340893074605e-06, 'epoch': 0.38} 38%|███▊ | 12981/34278 [14:22:18<22:27:02, 3.80s/it] 38%|███▊ | 12982/34278 [14:22:21<20:51:07, 3.52s/it] {'loss': 0.1541, 'grad_norm': 0.7501467648092552, 'learning_rate': 7.13600694443042e-06, 'epoch': 0.38} 38%|███▊ | 12982/34278 [14:22:21<20:51:07, 3.52s/it] 38%|███▊ | 12983/34278 [14:22:25<21:09:10, 3.58s/it] {'loss': 0.1574, 'grad_norm': 0.6557522570342736, 'learning_rate': 7.135579780483621e-06, 'epoch': 0.38} 38%|███▊ | 12983/34278 [14:22:25<21:09:10, 3.58s/it] 38%|███▊ | 12984/34278 [14:22:28<20:28:25, 3.46s/it] {'loss': 0.1316, 'grad_norm': 0.6872754374578743, 'learning_rate': 7.1351525974708756e-06, 'epoch': 0.38} 38%|███▊ | 12984/34278 [14:22:28<20:28:25, 3.46s/it] 38%|███▊ | 12985/34278 [14:22:31<19:51:23, 3.36s/it] {'loss': 0.1539, 'grad_norm': 0.8949837223811141, 'learning_rate': 7.134725395395997e-06, 'epoch': 0.38} 38%|███▊ | 12985/34278 [14:22:31<19:51:23, 3.36s/it] 38%|███▊ | 12986/34278 [14:22:37<23:50:48, 4.03s/it] {'loss': 0.1363, 'grad_norm': 0.7704934776306336, 'learning_rate': 7.1342981742627996e-06, 'epoch': 0.38} 38%|███▊ | 12986/34278 [14:22:37<23:50:48, 4.03s/it] 38%|███▊ | 12987/34278 [14:22:40<23:25:30, 3.96s/it] {'loss': 0.1555, 'grad_norm': 0.6718863266055916, 'learning_rate': 7.133870934075098e-06, 'epoch': 0.38} 38%|███▊ | 12987/34278 [14:22:40<23:25:30, 3.96s/it] 38%|███▊ | 12988/34278 [14:22:44<22:35:41, 3.82s/it] {'loss': 0.1389, 'grad_norm': 0.8177509554857887, 'learning_rate': 7.133443674836705e-06, 'epoch': 0.38} 38%|███▊ | 12988/34278 [14:22:44<22:35:41, 3.82s/it] 38%|███▊ | 12989/34278 [14:22:48<23:11:28, 3.92s/it] {'loss': 0.1379, 'grad_norm': 0.7196452370218708, 'learning_rate': 7.133016396551438e-06, 'epoch': 0.38} 38%|███▊ | 12989/34278 [14:22:48<23:11:28, 3.92s/it] 38%|███▊ | 12990/34278 
[14:22:54<27:21:21, 4.63s/it] {'loss': 0.1585, 'grad_norm': 0.9257648968287675, 'learning_rate': 7.132589099223108e-06, 'epoch': 0.38} 38%|███▊ | 12990/34278 [14:22:54<27:21:21, 4.63s/it] 38%|███▊ | 12991/34278 [14:22:57<24:27:54, 4.14s/it] {'loss': 0.144, 'grad_norm': 0.9373042126222249, 'learning_rate': 7.132161782855533e-06, 'epoch': 0.38} 38%|███▊ | 12991/34278 [14:22:57<24:27:54, 4.14s/it] 38%|███▊ | 12992/34278 [14:23:00<22:12:54, 3.76s/it] {'loss': 0.1406, 'grad_norm': 0.9099163048196881, 'learning_rate': 7.131734447452525e-06, 'epoch': 0.38} 38%|███▊ | 12992/34278 [14:23:00<22:12:54, 3.76s/it] 38%|███▊ | 12993/34278 [14:23:06<26:41:20, 4.51s/it] {'loss': 0.1458, 'grad_norm': 0.9153891170724695, 'learning_rate': 7.131307093017902e-06, 'epoch': 0.38} 38%|███▊ | 12993/34278 [14:23:06<26:41:20, 4.51s/it] 38%|███▊ | 12994/34278 [14:23:10<24:33:24, 4.15s/it] {'loss': 0.1382, 'grad_norm': 0.8824702923566206, 'learning_rate': 7.130879719555477e-06, 'epoch': 0.38} 38%|███▊ | 12994/34278 [14:23:10<24:33:24, 4.15s/it] 38%|███▊ | 12995/34278 [14:23:13<22:55:49, 3.88s/it] {'loss': 0.1244, 'grad_norm': 0.7086844386030828, 'learning_rate': 7.130452327069068e-06, 'epoch': 0.38} 38%|███▊ | 12995/34278 [14:23:13<22:55:49, 3.88s/it] 38%|███▊ | 12996/34278 [14:23:19<26:14:55, 4.44s/it] {'loss': 0.1161, 'grad_norm': 0.9866606689997597, 'learning_rate': 7.130024915562488e-06, 'epoch': 0.38} 38%|███▊ | 12996/34278 [14:23:19<26:14:55, 4.44s/it] 38%|███▊ | 12997/34278 [14:23:24<27:00:30, 4.57s/it] {'loss': 0.1569, 'grad_norm': 0.7773186359509144, 'learning_rate': 7.129597485039554e-06, 'epoch': 0.38} 38%|███▊ | 12997/34278 [14:23:24<27:00:30, 4.57s/it] 38%|███▊ | 12998/34278 [14:23:27<25:18:08, 4.28s/it] {'loss': 0.134, 'grad_norm': 0.7173249266886189, 'learning_rate': 7.129170035504084e-06, 'epoch': 0.38} 38%|███▊ | 12998/34278 [14:23:27<25:18:08, 4.28s/it] 38%|███▊ | 12999/34278 [14:23:30<22:29:47, 3.81s/it] {'loss': 0.1384, 'grad_norm': 0.7722682844543123, 'learning_rate': 
7.1287425669598896e-06, 'epoch': 0.38}
38%|███▊ | 13000/34278 [14:23:36<26:17:10, 4.45s/it] {'loss': 0.123, 'grad_norm': 0.7188234623188318, 'learning_rate': 7.128315079410792e-06, 'epoch': 0.38}
38%|███▊ | 13001/34278 [14:23:40<25:32:18, 4.32s/it] {'loss': 0.1349, 'grad_norm': 0.7432564586110186, 'learning_rate': 7.1278875728606035e-06, 'epoch': 0.38}
38%|███▊ | 13002/34278 [14:23:43<23:38:15, 4.00s/it] {'loss': 0.1161, 'grad_norm': 0.7902680911141535, 'learning_rate': 7.127460047313144e-06, 'epoch': 0.38}
38%|███▊ | 13003/34278 [14:23:48<25:58:24, 4.40s/it] {'loss': 0.1601, 'grad_norm': 2.027281163123131, 'learning_rate': 7.127032502772229e-06, 'epoch': 0.38}
38%|███▊ | 13004/34278 [14:23:53<26:21:40, 4.46s/it] {'loss': 0.1426, 'grad_norm': 0.7195166679385525, 'learning_rate': 7.1266049392416745e-06, 'epoch': 0.38}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
38%|███▊ | 13005/34278 [14:23:56<24:15:24, 4.10s/it] {'loss': 0.149, 'grad_norm': 0.8912337214678676, 'learning_rate': 7.126177356725299e-06, 'epoch': 0.38}
38%|███▊ | 13006/34278 [14:23:59<22:40:27, 3.84s/it] {'loss': 0.138, 'grad_norm': 0.8818951914893414, 'learning_rate': 7.1257497552269205e-06, 'epoch': 0.38}
38%|███▊ | 13007/34278 [14:24:02<21:06:30, 3.57s/it] {'loss': 0.144, 'grad_norm': 0.8222407696262598, 'learning_rate': 7.1253221347503545e-06, 'epoch': 0.38}
38%|███▊ | 13008/34278 [14:24:06<20:52:23, 3.53s/it] {'loss': 0.1586, 'grad_norm': 0.6831361651877006, 'learning_rate': 7.1248944952994204e-06, 'epoch': 0.38}
38%|███▊ | 13009/34278 [14:24:09<19:50:31, 3.36s/it] {'loss': 0.1533, 'grad_norm': 0.9650011422116275, 'learning_rate': 7.124466836877936e-06, 'epoch': 0.38}
38%|███▊ | 13010/34278 [14:24:12<18:53:03, 3.20s/it] {'loss': 0.1564, 'grad_norm': 0.8199863927822962, 'learning_rate': 7.12403915948972e-06, 'epoch': 0.38}
38%|███▊ | 13011/34278 [14:24:15<18:20:25, 3.10s/it] {'loss': 0.1476, 'grad_norm': 0.7205981841075796, 'learning_rate': 7.123611463138585e-06, 'epoch': 0.38}
38%|███▊ | 13012/34278 [14:24:18<18:06:31, 3.07s/it] {'loss': 0.149, 'grad_norm': 0.8891586204009748, 'learning_rate': 7.123183747828357e-06, 'epoch': 0.38}
38%|███▊ | 13013/34278 [14:24:21<18:13:48, 3.09s/it] {'loss': 0.1216, 'grad_norm': 0.7511475967066265, 'learning_rate': 7.122756013562853e-06, 'epoch': 0.38}
38%|███▊ | 13014/34278 [14:24:24<19:09:19, 3.24s/it] {'loss': 0.1317, 'grad_norm': 0.7172693632880247, 'learning_rate': 7.122328260345887e-06, 'epoch': 0.38}
38%|███▊ | 13015/34278 [14:24:28<19:28:50, 3.30s/it] {'loss': 0.1508, 'grad_norm': 0.753777054067735, 'learning_rate': 7.1219004881812824e-06, 'epoch': 0.38}
38%|███▊ | 13016/34278 [14:24:31<19:33:36, 3.31s/it] {'loss': 0.1519, 'grad_norm': 0.7407290835739581, 'learning_rate': 7.1214726970728566e-06, 'epoch': 0.38}
38%|███▊ | 13017/34278 [14:24:36<22:51:53, 3.87s/it] {'loss': 0.1323, 'grad_norm': 0.8642151428868987, 'learning_rate': 7.121044887024428e-06, 'epoch': 0.38}
38%|███▊ | 13018/34278 [14:24:39<21:22:31, 3.62s/it] {'loss': 0.1499, 'grad_norm': 0.8638983260524838, 'learning_rate': 7.120617058039818e-06, 'epoch': 0.38}
38%|███▊ | 13019/34278 [14:24:45<24:25:56, 4.14s/it] {'loss': 0.1285, 'grad_norm': 0.6458164310364573, 'learning_rate': 7.120189210122846e-06, 'epoch': 0.38}
38%|███▊ | 13020/34278 [14:24:49<24:36:19, 4.17s/it] {'loss': 0.1395, 'grad_norm': 1.307966356872709, 'learning_rate': 7.11976134327733e-06, 'epoch': 0.38}
38%|███▊ | 13021/34278 [14:24:55<27:53:36, 4.72s/it] {'loss': 0.1521, 'grad_norm': 0.8898874170684433, 'learning_rate': 7.119333457507089e-06, 'epoch': 0.38}
38%|███▊ | 13022/34278 [14:24:58<25:57:40, 4.40s/it] {'loss': 0.1528, 'grad_norm': 1.0613668946563664, 'learning_rate': 7.118905552815946e-06, 'epoch': 0.38}
38%|███▊ | 13023/34278 [14:25:04<28:48:25, 4.88s/it] {'loss': 0.1492, 'grad_norm': 0.8766091331398129, 'learning_rate': 7.118477629207721e-06, 'epoch': 0.38}
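The UserWarning from torch/utils/checkpoint.py a few steps earlier ("None of the inputs have requires_grad=True") fires when every tensor handed to an activation-checkpointed block is detached from autograd, so recomputation in backward has nothing to differentiate and gradients come back as None. A minimal sketch reproducing the fixed case, assuming a recent PyTorch that accepts the use_reentrant flag (this is illustrative, not the training code from this run):

```python
import torch
from torch.utils.checkpoint import checkpoint

def block(x):
    # Toy stand-in for a checkpointed transformer block.
    return torch.relu(x @ x.t()).sum()

# Warning case: an input with requires_grad=False would trigger the
# "None of the inputs have requires_grad=True" UserWarning and yield
# None gradients after backward.
frozen_in = torch.randn(4, 4)

# Fixed case: at least one checkpointed input carries requires_grad=True.
# HF-style trainers often achieve this by making the embedding output
# require grad when gradient checkpointing is enabled with frozen embeddings.
live_in = torch.randn(4, 4, requires_grad=True)
loss = checkpoint(block, live_in, use_reentrant=False)
loss.backward()
```

After backward, live_in.grad is populated; with frozen_in it would be None.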
38%|███▊ | 13024/34278 [14:25:08<26:02:32, 4.41s/it] {'loss': 0.1617, 'grad_norm': 0.8012986552951176, 'learning_rate': 7.1180496866862325e-06, 'epoch': 0.38}
38%|███▊ | 13025/34278 [14:25:11<23:43:28, 4.02s/it] {'loss': 0.1431, 'grad_norm': 1.0196559398984344, 'learning_rate': 7.1176217252553035e-06, 'epoch': 0.38}
38%|███▊ | 13026/34278 [14:25:14<22:27:54, 3.81s/it] {'loss': 0.1444, 'grad_norm': 0.7495552245143045, 'learning_rate': 7.117193744918751e-06, 'epoch': 0.38}
38%|███▊ | 13027/34278 [14:25:18<21:52:13, 3.70s/it] {'loss': 0.1675, 'grad_norm': 0.7002159722388028, 'learning_rate': 7.116765745680399e-06, 'epoch': 0.38}
38%|███▊ | 13028/34278 [14:25:21<20:58:23, 3.55s/it] {'loss': 0.157, 'grad_norm': 0.9144575591113243, 'learning_rate': 7.116337727544069e-06, 'epoch': 0.38}
38%|███▊ | 13029/34278 [14:25:24<20:17:25, 3.44s/it] {'loss': 0.1237, 'grad_norm': 0.8715475008276659, 'learning_rate': 7.115909690513578e-06, 'epoch': 0.38}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa5237ac9f0>
Failed to fetch sample 3412264.
Exception: cannot identify image file <_io.BytesIO object at 0x7fa5237ac9f0>
38%|███▊ | 13030/34278 [14:25:27<19:31:47, 3.31s/it] {'loss': 0.1552, 'grad_norm': 0.898523550872658, 'learning_rate': 7.1154816345927545e-06, 'epoch': 0.38}
38%|███▊ | 13031/34278 [14:25:30<19:00:53, 3.22s/it] {'loss': 0.1262, 'grad_norm': 1.0962867135529077, 'learning_rate': 7.1150535597854135e-06, 'epoch': 0.38}
38%|███▊ | 13032/34278 [14:25:33<19:10:40, 3.25s/it] {'loss': 0.1273, 'grad_norm': 0.821932644955272, 'learning_rate': 7.11462546609538e-06, 'epoch': 0.38}
38%|███▊ | 13033/34278 [14:25:39<24:07:29, 4.09s/it] {'loss': 0.152, 'grad_norm': 0.9093602231829235, 'learning_rate': 7.114197353526474e-06, 'epoch': 0.38}
38%|███▊ | 13034/34278 [14:25:45<26:33:56, 4.50s/it] {'loss': 0.1424, 'grad_norm': 0.7960080562623917, 'learning_rate': 7.1137692220825196e-06, 'epoch': 0.38}
38%|███▊ | 13035/34278 [14:25:48<23:32:28, 3.99s/it] {'loss': 0.1409, 'grad_norm': 0.8044031144666941, 'learning_rate': 7.113341071767338e-06, 'epoch': 0.38}
38%|███▊ | 13035/34278 [14:25:48<23:32:28, 3.99s/it] 38%|███▊ | 13036/34278 [14:25:50<21:10:54, 3.59s/it] {'loss': 0.1299, 'grad_norm': 0.6914698331714102, 'learning_rate': 7.112912902584752e-06, 'epoch': 0.38} 38%|███▊ | 13036/34278 [14:25:50<21:10:54, 3.59s/it] 38%|███▊ | 13037/34278 [14:25:53<20:10:18, 3.42s/it] {'loss': 0.1319, 'grad_norm': 0.7620713834767586, 'learning_rate': 7.112484714538584e-06, 'epoch': 0.38} 38%|███▊ | 13037/34278 [14:25:53<20:10:18, 3.42s/it] 38%|███▊ | 13038/34278 [14:25:56<19:05:25, 3.24s/it] {'loss': 0.1477, 'grad_norm': 0.9237652496892056, 'learning_rate': 7.1120565076326565e-06, 'epoch': 0.38} 38%|███▊ | 13038/34278 [14:25:56<19:05:25, 3.24s/it] 38%|███▊ | 13039/34278 [14:25:59<18:23:05, 3.12s/it] {'loss': 0.1348, 'grad_norm': 0.9166525200244503, 'learning_rate': 7.1116282818707924e-06, 'epoch': 0.38} 38%|███▊ | 13039/34278 [14:25:59<18:23:05, 3.12s/it] 38%|███▊ | 13040/34278 [14:26:05<23:25:49, 3.97s/it] {'loss': 0.1348, 'grad_norm': 0.6828208987571173, 'learning_rate': 7.111200037256816e-06, 'epoch': 0.38} 38%|███▊ | 13040/34278 [14:26:05<23:25:49, 3.97s/it] 38%|███▊ | 13041/34278 [14:26:08<22:27:01, 3.81s/it] {'loss': 0.1427, 'grad_norm': 1.1488067358228407, 'learning_rate': 7.110771773794548e-06, 'epoch': 0.38} 38%|███▊ | 13041/34278 [14:26:08<22:27:01, 3.81s/it] 38%|███▊ | 13042/34278 [14:26:12<21:28:48, 3.64s/it] {'loss': 0.1495, 'grad_norm': 0.882183824494278, 'learning_rate': 7.110343491487815e-06, 'epoch': 0.38} 38%|███▊ | 13042/34278 [14:26:12<21:28:48, 3.64s/it] 38%|███▊ | 13043/34278 [14:26:15<20:39:15, 3.50s/it] {'loss': 0.1413, 'grad_norm': 0.8755335334471303, 'learning_rate': 7.109915190340439e-06, 'epoch': 0.38} 38%|███▊ | 13043/34278 [14:26:15<20:39:15, 3.50s/it] 38%|███▊ | 13044/34278 [14:26:19<20:55:52, 3.55s/it] {'loss': 0.1446, 'grad_norm': 0.7890303215104586, 'learning_rate': 7.109486870356243e-06, 'epoch': 0.38} 38%|███▊ | 13044/34278 [14:26:19<20:55:52, 3.55s/it] 38%|███▊ | 13045/34278 [14:26:22<21:27:46, 
3.64s/it] {'loss': 0.1505, 'grad_norm': 0.8572201015853493, 'learning_rate': 7.1090585315390525e-06, 'epoch': 0.38} 38%|███▊ | 13045/34278 [14:26:22<21:27:46, 3.64s/it] 38%|███▊ | 13046/34278 [14:26:25<20:27:31, 3.47s/it] {'loss': 0.1593, 'grad_norm': 0.9668789016106413, 'learning_rate': 7.108630173892691e-06, 'epoch': 0.38} 38%|███▊ | 13046/34278 [14:26:25<20:27:31, 3.47s/it] 38%|███▊ | 13047/34278 [14:26:29<20:02:48, 3.40s/it] {'loss': 0.1483, 'grad_norm': 0.7463194331800527, 'learning_rate': 7.108201797420983e-06, 'epoch': 0.38} 38%|███▊ | 13047/34278 [14:26:29<20:02:48, 3.40s/it] 38%|███▊ | 13048/34278 [14:26:32<20:19:27, 3.45s/it] {'loss': 0.1519, 'grad_norm': 0.8313397874101502, 'learning_rate': 7.107773402127751e-06, 'epoch': 0.38} 38%|███▊ | 13048/34278 [14:26:32<20:19:27, 3.45s/it] 38%|███▊ | 13049/34278 [14:26:35<19:39:11, 3.33s/it] {'loss': 0.1429, 'grad_norm': 1.0873651459527833, 'learning_rate': 7.107344988016822e-06, 'epoch': 0.38} 38%|███▊ | 13049/34278 [14:26:35<19:39:11, 3.33s/it] 38%|███▊ | 13050/34278 [14:26:42<25:14:30, 4.28s/it] {'loss': 0.1449, 'grad_norm': 0.764576326893396, 'learning_rate': 7.1069165550920205e-06, 'epoch': 0.38} 38%|███▊ | 13050/34278 [14:26:42<25:14:30, 4.28s/it] 38%|███▊ | 13051/34278 [14:26:45<23:58:32, 4.07s/it] {'loss': 0.1521, 'grad_norm': 1.3986039501018053, 'learning_rate': 7.106488103357171e-06, 'epoch': 0.38} 38%|███▊ | 13051/34278 [14:26:45<23:58:32, 4.07s/it] 38%|███▊ | 13052/34278 [14:26:49<22:38:01, 3.84s/it] {'loss': 0.1485, 'grad_norm': 1.060778432603687, 'learning_rate': 7.106059632816098e-06, 'epoch': 0.38} 38%|███▊ | 13052/34278 [14:26:49<22:38:01, 3.84s/it] 38%|███▊ | 13053/34278 [14:26:55<26:23:01, 4.47s/it] {'loss': 0.1331, 'grad_norm': 0.9756185368049597, 'learning_rate': 7.105631143472628e-06, 'epoch': 0.38} 38%|███▊ | 13053/34278 [14:26:55<26:23:01, 4.47s/it] 38%|███▊ | 13054/34278 [14:26:58<24:01:27, 4.07s/it] {'loss': 0.1438, 'grad_norm': 1.0338252019646097, 'learning_rate': 7.105202635330586e-06, 
'epoch': 0.38} 38%|███▊ | 13054/34278 [14:26:58<24:01:27, 4.07s/it] 38%|███▊ | 13055/34278 [14:27:01<22:41:10, 3.85s/it] {'loss': 0.1547, 'grad_norm': 0.8616962415237442, 'learning_rate': 7.104774108393797e-06, 'epoch': 0.38} 38%|███▊ | 13055/34278 [14:27:01<22:41:10, 3.85s/it] 38%|███▊ | 13056/34278 [14:27:04<21:16:18, 3.61s/it] {'loss': 0.1371, 'grad_norm': 0.7960991086389061, 'learning_rate': 7.104345562666086e-06, 'epoch': 0.38} 38%|███▊ | 13056/34278 [14:27:04<21:16:18, 3.61s/it] 38%|███▊ | 13057/34278 [14:27:08<21:28:07, 3.64s/it] {'loss': 0.1285, 'grad_norm': 1.0248147656572244, 'learning_rate': 7.1039169981512825e-06, 'epoch': 0.38} 38%|███▊ | 13057/34278 [14:27:08<21:28:07, 3.64s/it] 38%|███▊ | 13058/34278 [14:27:11<20:40:57, 3.51s/it] {'loss': 0.1336, 'grad_norm': 0.9138140422863517, 'learning_rate': 7.103488414853209e-06, 'epoch': 0.38} 38%|███▊ | 13058/34278 [14:27:11<20:40:57, 3.51s/it] 38%|███▊ | 13059/34278 [14:27:16<24:06:36, 4.09s/it] {'loss': 0.1356, 'grad_norm': 0.8605081926102031, 'learning_rate': 7.103059812775693e-06, 'epoch': 0.38} 38%|███▊ | 13059/34278 [14:27:17<24:06:36, 4.09s/it] 38%|███▊ | 13060/34278 [14:27:23<27:38:10, 4.69s/it] {'loss': 0.165, 'grad_norm': 0.8019811823044132, 'learning_rate': 7.102631191922561e-06, 'epoch': 0.38} 38%|███▊ | 13060/34278 [14:27:23<27:38:10, 4.69s/it] 38%|███▊ | 13061/34278 [14:27:27<26:42:35, 4.53s/it] {'loss': 0.1649, 'grad_norm': 0.9293022660798531, 'learning_rate': 7.10220255229764e-06, 'epoch': 0.38} 38%|███▊ | 13061/34278 [14:27:27<26:42:35, 4.53s/it] 38%|███▊ | 13062/34278 [14:27:30<24:23:31, 4.14s/it] {'loss': 0.1581, 'grad_norm': 1.134806044489274, 'learning_rate': 7.101773893904756e-06, 'epoch': 0.38} 38%|███▊ | 13062/34278 [14:27:30<24:23:31, 4.14s/it] 38%|███▊ | 13063/34278 [14:27:33<22:54:43, 3.89s/it] {'loss': 0.1561, 'grad_norm': 0.855958036612699, 'learning_rate': 7.101345216747737e-06, 'epoch': 0.38} 38%|███▊ | 13063/34278 [14:27:33<22:54:43, 3.89s/it] 38%|███▊ | 13064/34278 
[14:27:36<21:33:27, 3.66s/it] {'loss': 0.1393, 'grad_norm': 1.1020334433221302, 'learning_rate': 7.100916520830409e-06, 'epoch': 0.38} 38%|███▊ | 13064/34278 [14:27:36<21:33:27, 3.66s/it] 38%|███▊ | 13065/34278 [14:27:43<26:23:10, 4.48s/it] {'loss': 0.157, 'grad_norm': 0.903005414426244, 'learning_rate': 7.1004878061565995e-06, 'epoch': 0.38} 38%|███▊ | 13065/34278 [14:27:43<26:23:10, 4.48s/it] 38%|███▊ | 13066/34278 [14:27:46<23:54:29, 4.06s/it] {'loss': 0.1682, 'grad_norm': 0.7590175862443029, 'learning_rate': 7.100059072730136e-06, 'epoch': 0.38} 38%|███▊ | 13066/34278 [14:27:46<23:54:29, 4.06s/it] 38%|███▊ | 13067/34278 [14:27:49<22:04:49, 3.75s/it] {'loss': 0.1373, 'grad_norm': 0.7066565815456609, 'learning_rate': 7.0996303205548486e-06, 'epoch': 0.38} 38%|███▊ | 13067/34278 [14:27:49<22:04:49, 3.75s/it] 38%|███▊ | 13068/34278 [14:27:53<22:12:13, 3.77s/it] {'loss': 0.1505, 'grad_norm': 0.7771638731156558, 'learning_rate': 7.099201549634561e-06, 'epoch': 0.38} 38%|███▊ | 13068/34278 [14:27:53<22:12:13, 3.77s/it] 38%|███▊ | 13069/34278 [14:27:56<21:33:09, 3.66s/it] {'loss': 0.1556, 'grad_norm': 0.7399993588177523, 'learning_rate': 7.098772759973104e-06, 'epoch': 0.38} 38%|███▊ | 13069/34278 [14:27:56<21:33:09, 3.66s/it] 38%|███▊ | 13070/34278 [14:27:59<20:00:45, 3.40s/it] {'loss': 0.1546, 'grad_norm': 0.693851392144675, 'learning_rate': 7.098343951574305e-06, 'epoch': 0.38} 38%|███▊ | 13070/34278 [14:27:59<20:00:45, 3.40s/it] 38%|███▊ | 13071/34278 [14:28:02<19:28:53, 3.31s/it] {'loss': 0.1422, 'grad_norm': 0.697594934361742, 'learning_rate': 7.097915124441991e-06, 'epoch': 0.38} 38%|███▊ | 13071/34278 [14:28:02<19:28:53, 3.31s/it] 38%|███▊ | 13072/34278 [14:28:05<19:48:56, 3.36s/it] {'loss': 0.1237, 'grad_norm': 0.6922999767936167, 'learning_rate': 7.097486278579993e-06, 'epoch': 0.38} 38%|███▊ | 13072/34278 [14:28:05<19:48:56, 3.36s/it] 38%|███▊ | 13073/34278 [14:28:09<19:45:52, 3.36s/it] {'loss': 0.1779, 'grad_norm': 0.9004585109504485, 'learning_rate': 
7.097057413992136e-06, 'epoch': 0.38} 38%|███▊ | 13073/34278 [14:28:09<19:45:52, 3.36s/it] 38%|███▊ | 13074/34278 [14:28:12<19:52:23, 3.37s/it] {'loss': 0.1257, 'grad_norm': 0.820602806872622, 'learning_rate': 7.096628530682253e-06, 'epoch': 0.38} 38%|███▊ | 13074/34278 [14:28:12<19:52:23, 3.37s/it] 38%|███▊ | 13075/34278 [14:28:15<19:15:54, 3.27s/it] {'loss': 0.1286, 'grad_norm': 0.7324287376510917, 'learning_rate': 7.096199628654171e-06, 'epoch': 0.38} 38%|███▊ | 13075/34278 [14:28:15<19:15:54, 3.27s/it] 38%|███▊ | 13076/34278 [14:28:19<19:38:22, 3.33s/it] {'loss': 0.128, 'grad_norm': 0.7892389193274874, 'learning_rate': 7.095770707911718e-06, 'epoch': 0.38} 38%|███▊ | 13076/34278 [14:28:19<19:38:22, 3.33s/it] 38%|███▊ | 13077/34278 [14:28:22<19:12:32, 3.26s/it] {'loss': 0.1447, 'grad_norm': 0.9323040225014304, 'learning_rate': 7.0953417684587255e-06, 'epoch': 0.38} 38%|███▊ | 13077/34278 [14:28:22<19:12:32, 3.26s/it] 38%|███▊ | 13078/34278 [14:28:26<20:23:06, 3.46s/it] {'loss': 0.1501, 'grad_norm': 0.7199735658137739, 'learning_rate': 7.094912810299021e-06, 'epoch': 0.38} 38%|███▊ | 13078/34278 [14:28:26<20:23:06, 3.46s/it] 38%|███▊ | 13079/34278 [14:28:29<20:04:17, 3.41s/it] {'loss': 0.1658, 'grad_norm': 1.115182105137788, 'learning_rate': 7.094483833436435e-06, 'epoch': 0.38} 38%|███▊ | 13079/34278 [14:28:29<20:04:17, 3.41s/it] 38%|███▊ | 13080/34278 [14:28:32<19:53:50, 3.38s/it] {'loss': 0.1457, 'grad_norm': 0.8367324438425654, 'learning_rate': 7.094054837874798e-06, 'epoch': 0.38} 38%|███▊ | 13080/34278 [14:28:32<19:53:50, 3.38s/it] 38%|███▊ | 13081/34278 [14:28:36<20:37:02, 3.50s/it] {'loss': 0.1598, 'grad_norm': 0.8292608413906478, 'learning_rate': 7.093625823617939e-06, 'epoch': 0.38} 38%|███▊ | 13081/34278 [14:28:36<20:37:02, 3.50s/it] 38%|███▊ | 13082/34278 [14:28:40<20:43:44, 3.52s/it] {'loss': 0.1188, 'grad_norm': 0.9212773809358213, 'learning_rate': 7.0931967906696885e-06, 'epoch': 0.38} 38%|███▊ | 13082/34278 [14:28:40<20:43:44, 3.52s/it] 38%|███▊ | 
13083/34278 [14:28:43<20:14:51, 3.44s/it] {'loss': 0.1668, 'grad_norm': 0.6563577012682009, 'learning_rate': 7.092767739033877e-06, 'epoch': 0.38} 38%|███▊ | 13083/34278 [14:28:43<20:14:51, 3.44s/it] 38%|███▊ | 13084/34278 [14:28:49<24:59:35, 4.25s/it] {'loss': 0.1635, 'grad_norm': 0.9401194032639766, 'learning_rate': 7.092338668714333e-06, 'epoch': 0.38} 38%|███▊ | 13084/34278 [14:28:49<24:59:35, 4.25s/it] 38%|███▊ | 13085/34278 [14:28:52<23:19:45, 3.96s/it] {'loss': 0.1319, 'grad_norm': 0.9556742187372048, 'learning_rate': 7.0919095797148915e-06, 'epoch': 0.38} 38%|███▊ | 13085/34278 [14:28:52<23:19:45, 3.96s/it] 38%|███▊ | 13086/34278 [14:28:58<26:47:34, 4.55s/it] {'loss': 0.1431, 'grad_norm': 0.7710741032344843, 'learning_rate': 7.091480472039378e-06, 'epoch': 0.38} 38%|███▊ | 13086/34278 [14:28:58<26:47:34, 4.55s/it] 38%|███▊ | 13087/34278 [14:29:02<25:47:13, 4.38s/it] {'loss': 0.1552, 'grad_norm': 0.7858500377077914, 'learning_rate': 7.091051345691628e-06, 'epoch': 0.38} 38%|███▊ | 13087/34278 [14:29:02<25:47:13, 4.38s/it] 38%|███▊ | 13088/34278 [14:29:08<28:43:41, 4.88s/it] {'loss': 0.1448, 'grad_norm': 1.3675361224142528, 'learning_rate': 7.090622200675471e-06, 'epoch': 0.38} 38%|███▊ | 13088/34278 [14:29:08<28:43:41, 4.88s/it] 38%|███▊ | 13089/34278 [14:29:12<27:14:44, 4.63s/it] {'loss': 0.1455, 'grad_norm': 0.8874926855421642, 'learning_rate': 7.090193036994737e-06, 'epoch': 0.38} 38%|███▊ | 13089/34278 [14:29:12<27:14:44, 4.63s/it] 38%|███▊ | 13090/34278 [14:29:18<29:34:20, 5.02s/it] {'loss': 0.1553, 'grad_norm': 0.9731620384555049, 'learning_rate': 7.089763854653259e-06, 'epoch': 0.38} 38%|███▊ | 13090/34278 [14:29:18<29:34:20, 5.02s/it] 38%|███▊ | 13091/34278 [14:29:23<28:06:59, 4.78s/it] {'loss': 0.1485, 'grad_norm': 1.0635409840006158, 'learning_rate': 7.089334653654868e-06, 'epoch': 0.38} 38%|███▊ | 13091/34278 [14:29:23<28:06:59, 4.78s/it] 38%|███▊ | 13092/34278 [14:29:26<25:03:45, 4.26s/it] {'loss': 0.1582, 'grad_norm': 0.8482488111094789, 
'learning_rate': 7.088905434003396e-06, 'epoch': 0.38} 38%|███▊ | 13092/34278 [14:29:26<25:03:45, 4.26s/it] 38%|███▊ | 13093/34278 [14:29:29<23:42:34, 4.03s/it] {'loss': 0.1472, 'grad_norm': 1.218399775578868, 'learning_rate': 7.088476195702675e-06, 'epoch': 0.38} 38%|███▊ | 13093/34278 [14:29:29<23:42:34, 4.03s/it] 38%|███▊ | 13094/34278 [14:29:35<26:38:54, 4.53s/it] {'loss': 0.1344, 'grad_norm': 0.7200220502156465, 'learning_rate': 7.088046938756536e-06, 'epoch': 0.38} 38%|███▊ | 13094/34278 [14:29:35<26:38:54, 4.53s/it] 38%|███▊ | 13095/34278 [14:29:39<25:18:07, 4.30s/it] {'loss': 0.1283, 'grad_norm': 0.8267643247609752, 'learning_rate': 7.0876176631688144e-06, 'epoch': 0.38} 38%|███▊ | 13095/34278 [14:29:39<25:18:07, 4.30s/it] 38%|███▊ | 13096/34278 [14:29:42<23:37:57, 4.02s/it] {'loss': 0.1716, 'grad_norm': 0.7760840832630073, 'learning_rate': 7.0871883689433396e-06, 'epoch': 0.38} 38%|███▊ | 13096/34278 [14:29:42<23:37:57, 4.02s/it] 38%|███▊ | 13097/34278 [14:29:46<24:27:14, 4.16s/it] {'loss': 0.141, 'grad_norm': 0.5994343031843402, 'learning_rate': 7.086759056083945e-06, 'epoch': 0.38} 38%|███▊ | 13097/34278 [14:29:46<24:27:14, 4.16s/it] 38%|███▊ | 13098/34278 [14:29:50<23:29:21, 3.99s/it] {'loss': 0.1295, 'grad_norm': 0.7332000190947274, 'learning_rate': 7.086329724594464e-06, 'epoch': 0.38} 38%|███▊ | 13098/34278 [14:29:50<23:29:21, 3.99s/it] 38%|███▊ | 13099/34278 [14:29:56<27:17:15, 4.64s/it] {'loss': 0.1495, 'grad_norm': 0.7322465399146401, 'learning_rate': 7.0859003744787296e-06, 'epoch': 0.38} 38%|███▊ | 13099/34278 [14:29:56<27:17:15, 4.64s/it] 38%|███▊ | 13100/34278 [14:29:59<25:00:45, 4.25s/it] {'loss': 0.1384, 'grad_norm': 0.7286991271251654, 'learning_rate': 7.085471005740575e-06, 'epoch': 0.38} 38%|███▊ | 13100/34278 [14:29:59<25:00:45, 4.25s/it] 38%|███▊ | 13101/34278 [14:30:06<28:17:17, 4.81s/it] {'loss': 0.1475, 'grad_norm': 0.6617685520670704, 'learning_rate': 7.085041618383831e-06, 'epoch': 0.38} 38%|███▊ | 13101/34278 [14:30:06<28:17:17, 
4.81s/it] 38%|███▊ | 13102/34278 [14:30:09<26:08:16, 4.44s/it] {'loss': 0.1403, 'grad_norm': 0.8567712621782794, 'learning_rate': 7.084612212412336e-06, 'epoch': 0.38} 38%|███▊ | 13102/34278 [14:30:09<26:08:16, 4.44s/it] 38%|███▊ | 13103/34278 [14:30:15<28:43:49, 4.88s/it] {'loss': 0.1615, 'grad_norm': 0.8779255613544709, 'learning_rate': 7.08418278782992e-06, 'epoch': 0.38} 38%|███▊ | 13103/34278 [14:30:15<28:43:49, 4.88s/it] 38%|███▊ | 13104/34278 [14:30:18<25:37:21, 4.36s/it] {'loss': 0.144, 'grad_norm': 0.740055980900757, 'learning_rate': 7.083753344640415e-06, 'epoch': 0.38} 38%|███▊ | 13104/34278 [14:30:18<25:37:21, 4.36s/it] 38%|███▊ | 13105/34278 [14:30:21<23:06:31, 3.93s/it] {'loss': 0.1327, 'grad_norm': 0.8713575859818685, 'learning_rate': 7.083323882847661e-06, 'epoch': 0.38} 38%|███▊ | 13105/34278 [14:30:21<23:06:31, 3.93s/it] 38%|███▊ | 13106/34278 [14:30:24<21:44:17, 3.70s/it] {'loss': 0.138, 'grad_norm': 0.8144970970507229, 'learning_rate': 7.082894402455487e-06, 'epoch': 0.38} 38%|███▊ | 13106/34278 [14:30:24<21:44:17, 3.70s/it] 38%|███▊ | 13107/34278 [14:30:27<19:56:29, 3.39s/it] {'loss': 0.1428, 'grad_norm': 0.7131895739917116, 'learning_rate': 7.08246490346773e-06, 'epoch': 0.38} 38%|███▊ | 13107/34278 [14:30:27<19:56:29, 3.39s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12637 > 8192). 
Running this sequence through the model will result in indexing errors 38%|███▊ | 13108/34278 [14:30:33<25:11:02, 4.28s/it] {'loss': 0.1426, 'grad_norm': 0.8303449996677388, 'learning_rate': 7.082035385888222e-06, 'epoch': 0.38} 38%|███▊ | 13108/34278 [14:30:33<25:11:02, 4.28s/it] 38%|███▊ | 13109/34278 [14:30:37<23:40:11, 4.03s/it] {'loss': 0.1676, 'grad_norm': 0.8555770600779227, 'learning_rate': 7.081605849720799e-06, 'epoch': 0.38} 38%|███▊ | 13109/34278 [14:30:37<23:40:11, 4.03s/it] 38%|███▊ | 13110/34278 [14:30:40<21:46:49, 3.70s/it] {'loss': 0.1326, 'grad_norm': 0.7928736545267687, 'learning_rate': 7.081176294969298e-06, 'epoch': 0.38} 38%|███▊ | 13110/34278 [14:30:40<21:46:49, 3.70s/it] 38%|███▊ | 13111/34278 [14:30:46<26:08:19, 4.45s/it] {'loss': 0.1356, 'grad_norm': 0.9716808502627015, 'learning_rate': 7.08074672163755e-06, 'epoch': 0.38} 38%|███▊ | 13111/34278 [14:30:46<26:08:19, 4.45s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 38%|███▊ | 13112/34278 [14:30:49<24:07:15, 4.10s/it] {'loss': 0.145, 'grad_norm': 0.800261002745498, 'learning_rate': 7.080317129729392e-06, 'epoch': 0.38} 38%|███▊ | 13112/34278 [14:30:49<24:07:15, 4.10s/it] 38%|███▊ | 13113/34278 [14:30:54<25:58:42, 4.42s/it] {'loss': 0.1428, 'grad_norm': 0.8046887385196647, 'learning_rate': 7.079887519248661e-06, 'epoch': 0.38} 38%|███▊ | 13113/34278 [14:30:54<25:58:42, 4.42s/it] 38%|███▊ | 13114/34278 [14:30:58<23:53:55, 4.07s/it] {'loss': 0.1237, 'grad_norm': 0.7150547588897157, 'learning_rate': 7.079457890199188e-06, 'epoch': 0.38} 38%|███▊ | 13114/34278 [14:30:58<23:53:55, 4.07s/it] 38%|███▊ | 13115/34278 [14:31:01<22:55:36, 3.90s/it] {'loss': 0.1412, 'grad_norm': 0.6849095150734505, 'learning_rate': 7.0790282425848145e-06, 'epoch': 0.38} 38%|███▊ | 13115/34278 [14:31:01<22:55:36, 3.90s/it] 38%|███▊ | 13116/34278 [14:31:04<21:56:15, 3.73s/it] {'loss': 0.1403, 'grad_norm': 0.9194422123303154, 'learning_rate': 7.07859857640937e-06, 'epoch': 0.38} 38%|███▊ | 13116/34278 [14:31:04<21:56:15, 3.73s/it] 38%|███▊ | 13117/34278 [14:31:08<22:13:22, 3.78s/it] {'loss': 0.1443, 'grad_norm': 0.8675519789641049, 'learning_rate': 7.0781688916766965e-06, 'epoch': 0.38} 38%|███▊ | 13117/34278 [14:31:08<22:13:22, 3.78s/it] 38%|███▊ | 13118/34278 [14:31:12<21:28:38, 3.65s/it] {'loss': 0.1138, 'grad_norm': 0.7194996006589511, 'learning_rate': 7.0777391883906265e-06, 'epoch': 0.38} 38%|███▊ | 13118/34278 [14:31:12<21:28:38, 3.65s/it] 38%|███▊ | 13119/34278 [14:31:15<20:21:29, 3.46s/it] {'loss': 0.1591, 'grad_norm': 0.9397140559046668, 'learning_rate': 7.077309466554996e-06, 'epoch': 0.38} 38%|███▊ | 13119/34278 [14:31:15<20:21:29, 3.46s/it] 38%|███▊ | 13120/34278 [14:31:21<25:35:31, 4.35s/it] {'loss': 0.1613, 'grad_norm': 0.7076092649402206, 'learning_rate': 7.076879726173643e-06, 'epoch': 0.38} 38%|███▊ | 13120/34278 [14:31:21<25:35:31, 4.35s/it] 38%|███▊ | 13121/34278 [14:31:24<23:17:16, 3.96s/it] {'loss': 
0.1294, 'grad_norm': 0.7465014711959881, 'learning_rate': 7.0764499672504035e-06, 'epoch': 0.38} 38%|███▊ | 13121/34278 [14:31:24<23:17:16, 3.96s/it] 38%|███▊ | 13122/34278 [14:31:27<21:56:25, 3.73s/it] {'loss': 0.1346, 'grad_norm': 0.8638462707328708, 'learning_rate': 7.0760201897891145e-06, 'epoch': 0.38} 38%|███▊ | 13122/34278 [14:31:27<21:56:25, 3.73s/it] 38%|███▊ | 13123/34278 [14:31:31<22:17:40, 3.79s/it] {'loss': 0.1529, 'grad_norm': 0.8553734920619475, 'learning_rate': 7.075590393793612e-06, 'epoch': 0.38} 38%|███▊ | 13123/34278 [14:31:31<22:17:40, 3.79s/it] 38%|███▊ | 13124/34278 [14:31:34<20:46:27, 3.54s/it] {'loss': 0.1461, 'grad_norm': 0.7383771376980681, 'learning_rate': 7.075160579267734e-06, 'epoch': 0.38} 38%|███▊ | 13124/34278 [14:31:34<20:46:27, 3.54s/it] 38%|███▊ | 13125/34278 [14:31:38<20:19:33, 3.46s/it] {'loss': 0.1432, 'grad_norm': 0.6253729740034147, 'learning_rate': 7.074730746215319e-06, 'epoch': 0.38} 38%|███▊ | 13125/34278 [14:31:38<20:19:33, 3.46s/it] 38%|███▊ | 13126/34278 [14:31:43<23:31:00, 4.00s/it] {'loss': 0.1548, 'grad_norm': 0.8786975519092382, 'learning_rate': 7.074300894640202e-06, 'epoch': 0.38} 38%|███▊ | 13126/34278 [14:31:43<23:31:00, 4.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 38%|███▊ | 13127/34278 [14:31:46<21:18:52, 3.63s/it] {'loss': 0.139, 'grad_norm': 0.8880575694395363, 'learning_rate': 7.073871024546224e-06, 'epoch': 0.38} 38%|███▊ | 13127/34278 [14:31:46<21:18:52, 3.63s/it] 38%|███▊ | 13128/34278 [14:31:49<20:29:06, 3.49s/it] {'loss': 0.1441, 'grad_norm': 0.775338023325945, 'learning_rate': 7.073441135937218e-06, 'epoch': 0.38} 38%|███▊ | 13128/34278 [14:31:49<20:29:06, 3.49s/it] 38%|███▊ | 13129/34278 [14:31:54<22:55:55, 3.90s/it] {'loss': 0.1584, 'grad_norm': 0.7930982428849666, 'learning_rate': 7.073011228817026e-06, 'epoch': 0.38} 38%|███▊ | 13129/34278 [14:31:54<22:55:55, 3.90s/it] 38%|███▊ | 13130/34278 [14:31:57<22:00:02, 3.75s/it] {'loss': 0.1488, 'grad_norm': 0.911810955626411, 'learning_rate': 7.072581303189485e-06, 'epoch': 0.38} 38%|███▊ | 13130/34278 [14:31:57<22:00:02, 3.75s/it] 38%|███▊ | 13131/34278 [14:32:00<20:42:51, 3.53s/it] {'loss': 0.1604, 'grad_norm': 0.9098857458468019, 'learning_rate': 7.072151359058431e-06, 'epoch': 0.38} 38%|███▊ | 13131/34278 [14:32:00<20:42:51, 3.53s/it] 38%|███▊ | 13132/34278 [14:32:03<19:47:36, 3.37s/it] {'loss': 0.1378, 'grad_norm': 0.7879204619318778, 'learning_rate': 7.071721396427706e-06, 'epoch': 0.38} 38%|███▊ | 13132/34278 [14:32:03<19:47:36, 3.37s/it] 38%|███▊ | 13133/34278 [14:32:06<18:42:31, 3.19s/it] {'loss': 0.1387, 'grad_norm': 0.8966769901371093, 'learning_rate': 7.071291415301147e-06, 'epoch': 0.38} 38%|███▊ | 13133/34278 [14:32:06<18:42:31, 3.19s/it] 38%|███▊ | 13134/34278 [14:32:09<19:22:33, 3.30s/it] {'loss': 0.1482, 'grad_norm': 1.0341074388373561, 'learning_rate': 7.070861415682591e-06, 'epoch': 0.38} 38%|███▊ | 13134/34278 [14:32:09<19:22:33, 3.30s/it] 38%|███▊ | 13135/34278 [14:32:12<18:25:04, 3.14s/it] {'loss': 0.1419, 'grad_norm': 1.2304936071236645, 'learning_rate': 7.0704313975758795e-06, 'epoch': 0.38} 38%|███▊ | 13135/34278 [14:32:12<18:25:04, 3.14s/it] 38%|███▊ | 13136/34278 [14:32:16<19:00:01, 3.24s/it] {'loss': 
0.1381, 'grad_norm': 1.0014628454773198, 'learning_rate': 7.07000136098485e-06, 'epoch': 0.38} 38%|███▊ | 13136/34278 [14:32:16<19:00:01, 3.24s/it] 38%|███▊ | 13137/34278 [14:32:19<19:01:47, 3.24s/it] {'loss': 0.1453, 'grad_norm': 1.0036603629484904, 'learning_rate': 7.069571305913344e-06, 'epoch': 0.38} 38%|███▊ | 13137/34278 [14:32:19<19:01:47, 3.24s/it] 38%|███▊ | 13138/34278 [14:32:22<18:47:25, 3.20s/it] {'loss': 0.1406, 'grad_norm': 1.1866706294109137, 'learning_rate': 7.0691412323651985e-06, 'epoch': 0.38} 38%|███▊ | 13138/34278 [14:32:22<18:47:25, 3.20s/it] 38%|███▊ | 13139/34278 [14:32:25<18:03:04, 3.07s/it] {'loss': 0.1393, 'grad_norm': 0.8985576384858317, 'learning_rate': 7.0687111403442545e-06, 'epoch': 0.38} 38%|███▊ | 13139/34278 [14:32:25<18:03:04, 3.07s/it] 38%|███▊ | 13140/34278 [14:32:28<17:35:19, 3.00s/it] {'loss': 0.1672, 'grad_norm': 0.9242137471340901, 'learning_rate': 7.068281029854352e-06, 'epoch': 0.38} 38%|███▊ | 13140/34278 [14:32:28<17:35:19, 3.00s/it] 38%|███▊ | 13141/34278 [14:32:31<18:22:06, 3.13s/it] {'loss': 0.169, 'grad_norm': 0.8816153269856454, 'learning_rate': 7.067850900899328e-06, 'epoch': 0.38} 38%|███▊ | 13141/34278 [14:32:31<18:22:06, 3.13s/it] 38%|███▊ | 13142/34278 [14:32:35<19:12:11, 3.27s/it] {'loss': 0.1299, 'grad_norm': 0.7916098566601042, 'learning_rate': 7.067420753483026e-06, 'epoch': 0.38} 38%|███▊ | 13142/34278 [14:32:35<19:12:11, 3.27s/it] 38%|███▊ | 13143/34278 [14:32:38<19:40:05, 3.35s/it] {'loss': 0.1358, 'grad_norm': 0.7133509619734466, 'learning_rate': 7.066990587609286e-06, 'epoch': 0.38} 38%|███▊ | 13143/34278 [14:32:38<19:40:05, 3.35s/it] 38%|███▊ | 13144/34278 [14:32:41<18:35:08, 3.17s/it] {'loss': 0.1394, 'grad_norm': 0.8723191442988169, 'learning_rate': 7.066560403281946e-06, 'epoch': 0.38} 38%|███▊ | 13144/34278 [14:32:41<18:35:08, 3.17s/it] 38%|███▊ | 13145/34278 [14:32:44<18:33:47, 3.16s/it] {'loss': 0.137, 'grad_norm': 0.7490395641106506, 'learning_rate': 7.06613020050485e-06, 'epoch': 0.38} 
38%|███▊ | 13145/34278 [14:32:44<18:33:47, 3.16s/it] 38%|███▊ | 13146/34278 [14:32:47<17:46:14, 3.03s/it] {'loss': 0.1422, 'grad_norm': 0.8581450018827587, 'learning_rate': 7.065699979281834e-06, 'epoch': 0.38} 38%|███▊ | 13146/34278 [14:32:47<17:46:14, 3.03s/it] 38%|███▊ | 13147/34278 [14:32:50<18:28:39, 3.15s/it] {'loss': 0.1629, 'grad_norm': 0.778933914965665, 'learning_rate': 7.065269739616744e-06, 'epoch': 0.38} 38%|███▊ | 13147/34278 [14:32:50<18:28:39, 3.15s/it] 38%|███▊ | 13148/34278 [14:32:54<19:34:05, 3.33s/it] {'loss': 0.1286, 'grad_norm': 0.8985223450460263, 'learning_rate': 7.064839481513417e-06, 'epoch': 0.38} 38%|███▊ | 13148/34278 [14:32:54<19:34:05, 3.33s/it] 38%|███▊ | 13149/34278 [14:32:57<19:32:21, 3.33s/it] {'loss': 0.1746, 'grad_norm': 0.758957710394968, 'learning_rate': 7.064409204975696e-06, 'epoch': 0.38} 38%|███▊ | 13149/34278 [14:32:57<19:32:21, 3.33s/it] 38%|███▊ | 13150/34278 [14:33:00<18:59:42, 3.24s/it] {'loss': 0.1471, 'grad_norm': 0.7899024126793673, 'learning_rate': 7.0639789100074255e-06, 'epoch': 0.38} 38%|███▊ | 13150/34278 [14:33:00<18:59:42, 3.24s/it] 38%|███▊ | 13151/34278 [14:33:04<19:20:53, 3.30s/it] {'loss': 0.1378, 'grad_norm': 0.6982124085047007, 'learning_rate': 7.06354859661244e-06, 'epoch': 0.38} 38%|███▊ | 13151/34278 [14:33:04<19:20:53, 3.30s/it] 38%|███▊ | 13152/34278 [14:33:07<20:10:34, 3.44s/it] {'loss': 0.1433, 'grad_norm': 0.7941912606468112, 'learning_rate': 7.0631182647945884e-06, 'epoch': 0.38} 38%|███▊ | 13152/34278 [14:33:07<20:10:34, 3.44s/it] 38%|███▊ | 13153/34278 [14:33:11<21:06:01, 3.60s/it] {'loss': 0.1489, 'grad_norm': 0.8363233650555554, 'learning_rate': 7.062687914557708e-06, 'epoch': 0.38} 38%|███▊ | 13153/34278 [14:33:11<21:06:01, 3.60s/it] 38%|███▊ | 13154/34278 [14:33:18<25:45:33, 4.39s/it] {'loss': 0.1674, 'grad_norm': 0.829309201774025, 'learning_rate': 7.062257545905642e-06, 'epoch': 0.38} 38%|███▊ | 13154/34278 [14:33:18<25:45:33, 4.39s/it] 38%|███▊ | 13155/34278 [14:33:21<23:40:38, 
4.04s/it] {'loss': 0.1302, 'grad_norm': 0.7630820142471877, 'learning_rate': 7.061827158842234e-06, 'epoch': 0.38} 38%|███▊ | 13155/34278 [14:33:21<23:40:38, 4.04s/it] 38%|███▊ | 13156/34278 [14:33:25<24:31:46, 4.18s/it] {'loss': 0.1648, 'grad_norm': 0.7884424514086901, 'learning_rate': 7.061396753371323e-06, 'epoch': 0.38} 38%|███▊ | 13156/34278 [14:33:25<24:31:46, 4.18s/it] 38%|███▊ | 13157/34278 [14:33:28<22:25:41, 3.82s/it] {'loss': 0.1626, 'grad_norm': 0.8154456882331151, 'learning_rate': 7.060966329496757e-06, 'epoch': 0.38} 38%|███▊ | 13157/34278 [14:33:28<22:25:41, 3.82s/it] 38%|███▊ | 13158/34278 [14:33:32<22:16:29, 3.80s/it] {'loss': 0.1508, 'grad_norm': 1.0104256131077718, 'learning_rate': 7.060535887222373e-06, 'epoch': 0.38} 38%|███▊ | 13158/34278 [14:33:32<22:16:29, 3.80s/it] 38%|███▊ | 13159/34278 [14:33:36<22:21:11, 3.81s/it] {'loss': 0.1506, 'grad_norm': 0.7977958642526866, 'learning_rate': 7.060105426552018e-06, 'epoch': 0.38} 38%|███▊ | 13159/34278 [14:33:36<22:21:11, 3.81s/it] 38%|███▊ | 13160/34278 [14:33:39<21:46:20, 3.71s/it] {'loss': 0.1234, 'grad_norm': 1.062430256860251, 'learning_rate': 7.0596749474895344e-06, 'epoch': 0.38} 38%|███▊ | 13160/34278 [14:33:39<21:46:20, 3.71s/it] 38%|███▊ | 13161/34278 [14:33:43<21:01:42, 3.58s/it] {'loss': 0.1573, 'grad_norm': 0.8411417714500453, 'learning_rate': 7.059244450038762e-06, 'epoch': 0.38} 38%|███▊ | 13161/34278 [14:33:43<21:01:42, 3.58s/it] 38%|███▊ | 13162/34278 [14:33:46<20:04:45, 3.42s/it] {'loss': 0.1486, 'grad_norm': 0.8339153474587397, 'learning_rate': 7.058813934203549e-06, 'epoch': 0.38} 38%|███▊ | 13162/34278 [14:33:46<20:04:45, 3.42s/it] 38%|███▊ | 13163/34278 [14:33:49<19:11:01, 3.27s/it] {'loss': 0.1361, 'grad_norm': 0.9894703493526382, 'learning_rate': 7.058383399987736e-06, 'epoch': 0.38} 38%|███▊ | 13163/34278 [14:33:49<19:11:01, 3.27s/it] 38%|███▊ | 13164/34278 [14:33:52<19:01:08, 3.24s/it] {'loss': 0.1368, 'grad_norm': 1.129688469946021, 'learning_rate': 7.057952847395166e-06, 
'epoch': 0.38} 38%|███▊ | 13164/34278 [14:33:52<19:01:08, 3.24s/it] 38%|███▊ | 13165/34278 [14:33:55<18:33:00, 3.16s/it] {'loss': 0.1551, 'grad_norm': 0.7600692887581971, 'learning_rate': 7.057522276429686e-06, 'epoch': 0.38} 38%|███▊ | 13165/34278 [14:33:55<18:33:00, 3.16s/it] 38%|███▊ | 13166/34278 [14:33:58<19:10:06, 3.27s/it] {'loss': 0.1595, 'grad_norm': 1.2455831289902193, 'learning_rate': 7.057091687095138e-06, 'epoch': 0.38} 38%|███▊ | 13166/34278 [14:33:58<19:10:06, 3.27s/it] 38%|███▊ | 13167/34278 [14:34:01<18:10:45, 3.10s/it] {'loss': 0.1375, 'grad_norm': 1.0068548520666945, 'learning_rate': 7.056661079395366e-06, 'epoch': 0.38} 38%|███▊ | 13167/34278 [14:34:01<18:10:45, 3.10s/it] 38%|███▊ | 13168/34278 [14:34:04<18:37:04, 3.18s/it] {'loss': 0.1468, 'grad_norm': 0.7662769792406645, 'learning_rate': 7.056230453334214e-06, 'epoch': 0.38} 38%|███▊ | 13168/34278 [14:34:04<18:37:04, 3.18s/it] 38%|███▊ | 13169/34278 [14:34:09<21:54:33, 3.74s/it] {'loss': 0.1484, 'grad_norm': 0.9628656124681166, 'learning_rate': 7.055799808915529e-06, 'epoch': 0.38} 38%|███▊ | 13169/34278 [14:34:09<21:54:33, 3.74s/it] 38%|███▊ | 13170/34278 [14:34:14<22:41:32, 3.87s/it] {'loss': 0.1185, 'grad_norm': 0.8008510300215405, 'learning_rate': 7.0553691461431536e-06, 'epoch': 0.38} 38%|███▊ | 13170/34278 [14:34:14<22:41:32, 3.87s/it] 38%|███▊ | 13171/34278 [14:34:16<20:49:31, 3.55s/it] {'loss': 0.1371, 'grad_norm': 0.7761874409154677, 'learning_rate': 7.054938465020933e-06, 'epoch': 0.38} 38%|███▊ | 13171/34278 [14:34:16<20:49:31, 3.55s/it] 38%|███▊ | 13172/34278 [14:34:20<20:24:05, 3.48s/it] {'loss': 0.1527, 'grad_norm': 2.5130980677053087, 'learning_rate': 7.054507765552712e-06, 'epoch': 0.38} 38%|███▊ | 13172/34278 [14:34:20<20:24:05, 3.48s/it] 38%|███▊ | 13173/34278 [14:34:23<19:32:37, 3.33s/it] {'loss': 0.1329, 'grad_norm': 1.1640321970693506, 'learning_rate': 7.054077047742336e-06, 'epoch': 0.38} 38%|███▊ | 13173/34278 [14:34:23<19:32:37, 3.33s/it] 38%|███▊ | 13174/34278 
[14:34:26<18:57:47, 3.23s/it] {'loss': 0.1514, 'grad_norm': 1.0871499764143346, 'learning_rate': 7.053646311593651e-06, 'epoch': 0.38} 38%|███▊ | 13174/34278 [14:34:26<18:57:47, 3.23s/it] 38%|███▊ | 13175/34278 [14:34:32<23:48:49, 4.06s/it] {'loss': 0.1531, 'grad_norm': 0.7590588990257113, 'learning_rate': 7.053215557110503e-06, 'epoch': 0.38} 38%|███▊ | 13175/34278 [14:34:32<23:48:49, 4.06s/it] 38%|███▊ | 13176/34278 [14:34:35<22:44:05, 3.88s/it] {'loss': 0.1473, 'grad_norm': 0.929740989737801, 'learning_rate': 7.052784784296735e-06, 'epoch': 0.38} 38%|███▊ | 13176/34278 [14:34:35<22:44:05, 3.88s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f3d6c25f830>
Failed to fetch sample 3353702.
Exception: cannot identify image file <_io.BytesIO object at 0x7f3d6c25f830> 38%|███▊ | 13177/34278 [14:34:39<21:54:08, 3.74s/it] {'loss': 0.1456, 'grad_norm': 1.277307219791055, 'learning_rate': 7.052353993156196e-06, 'epoch': 0.38} 38%|███▊ | 13177/34278 [14:34:39<21:54:08, 3.74s/it] 38%|███▊ | 13178/34278 [14:34:42<21:31:06, 3.67s/it] {'loss': 0.1575, 'grad_norm': 0.9094694212534672, 'learning_rate': 7.051923183692728e-06, 'epoch': 0.38} 38%|███▊ | 13178/34278 [14:34:42<21:31:06, 3.67s/it] 38%|███▊ | 13179/34278 [14:34:46<22:41:38, 3.87s/it] {'loss': 0.1427, 'grad_norm': 0.8590938719441122, 'learning_rate': 7.0514923559101814e-06, 'epoch': 0.38} 38%|███▊ | 13179/34278 [14:34:46<22:41:38, 3.87s/it] 38%|███▊ | 13180/34278 [14:34:49<20:51:33, 3.56s/it] {'loss': 0.1211, 'grad_norm': 0.8323348273163871, 'learning_rate': 7.0510615098124005e-06, 'epoch': 0.38} 38%|███▊ | 13180/34278 [14:34:49<20:51:33, 3.56s/it] 38%|███▊ | 13181/34278 [14:34:52<19:39:59, 3.36s/it] {'loss': 0.1386, 'grad_norm': 0.7958251264863333, 'learning_rate': 7.0506306454032326e-06, 'epoch': 0.38} 38%|███▊ | 13181/34278 [14:34:52<19:39:59, 3.36s/it] 38%|███▊ | 13182/34278 [14:34:55<19:08:18, 3.27s/it] {'loss': 0.1261, 'grad_norm': 0.9837312411046082, 'learning_rate': 7.050199762686522e-06, 'epoch': 0.38} 38%|███▊ | 13182/34278 [14:34:55<19:08:18, 3.27s/it] 38%|███▊ | 13183/34278 [14:35:01<23:46:16, 4.06s/it] {'loss': 0.1474, 'grad_norm': 0.7655586884409743, 'learning_rate': 7.04976886166612e-06, 'epoch': 0.38} 38%|███▊ | 13183/34278 [14:35:01<23:46:16, 4.06s/it] 38%|███▊ | 13184/34278 [14:35:05<23:10:40, 3.96s/it] {'loss': 0.1354, 'grad_norm': 0.7334294555695986, 'learning_rate': 7.049337942345868e-06, 'epoch': 0.38} 38%|███▊ | 13184/34278 [14:35:05<23:10:40, 3.96s/it] 38%|███▊ | 13185/34278 [14:35:08<21:56:22, 3.74s/it] {'loss': 0.1414, 'grad_norm': 0.8762714129354264, 'learning_rate': 7.048907004729619e-06, 'epoch': 0.38} 38%|███▊ | 13185/34278 [14:35:08<21:56:22, 3.74s/it] 38%|███▊ | 13186/34278 
[14:35:14<26:22:48, 4.50s/it] {'loss': 0.146, 'grad_norm': 0.7870794442118512, 'learning_rate': 7.048476048821215e-06, 'epoch': 0.38} 38%|███▊ | 13186/34278 [14:35:14<26:22:48, 4.50s/it] 38%|███▊ | 13187/34278 [14:35:18<24:36:47, 4.20s/it] {'loss': 0.1389, 'grad_norm': 1.01210760370242, 'learning_rate': 7.048045074624508e-06, 'epoch': 0.38} 38%|███▊ | 13187/34278 [14:35:18<24:36:47, 4.20s/it] 38%|███▊ | 13188/34278 [14:35:21<22:57:58, 3.92s/it] {'loss': 0.1475, 'grad_norm': 0.7542164408527335, 'learning_rate': 7.047614082143342e-06, 'epoch': 0.38} 38%|███▊ | 13188/34278 [14:35:21<22:57:58, 3.92s/it] 38%|███▊ | 13189/34278 [14:35:24<20:47:58, 3.55s/it] {'loss': 0.134, 'grad_norm': 0.8052815624525067, 'learning_rate': 7.047183071381566e-06, 'epoch': 0.38} 38%|███▊ | 13189/34278 [14:35:24<20:47:58, 3.55s/it] 38%|███▊ | 13190/34278 [14:35:27<20:22:56, 3.48s/it] {'loss': 0.1905, 'grad_norm': 0.962643863131615, 'learning_rate': 7.046752042343029e-06, 'epoch': 0.38} 38%|███▊ | 13190/34278 [14:35:27<20:22:56, 3.48s/it] 38%|███▊ | 13191/34278 [14:35:30<19:56:54, 3.41s/it] {'loss': 0.1385, 'grad_norm': 0.9266481751109104, 'learning_rate': 7.046320995031578e-06, 'epoch': 0.38} 38%|███▊ | 13191/34278 [14:35:30<19:56:54, 3.41s/it] 38%|███▊ | 13192/34278 [14:35:37<24:48:41, 4.24s/it] {'loss': 0.1332, 'grad_norm': 0.6338790428294409, 'learning_rate': 7.045889929451063e-06, 'epoch': 0.38} 38%|███▊ | 13192/34278 [14:35:37<24:48:41, 4.24s/it] 38%|███▊ | 13193/34278 [14:35:42<27:40:59, 4.73s/it] {'loss': 0.1191, 'grad_norm': 0.7303123833273253, 'learning_rate': 7.045458845605329e-06, 'epoch': 0.38} 38%|███▊ | 13193/34278 [14:35:42<27:40:59, 4.73s/it] 38%|███▊ | 13194/34278 [14:35:47<27:13:36, 4.65s/it] {'loss': 0.1405, 'grad_norm': 0.8025559681918394, 'learning_rate': 7.045027743498227e-06, 'epoch': 0.38} 38%|███▊ | 13194/34278 [14:35:47<27:13:36, 4.65s/it] 38%|███▊ | 13195/34278 [14:35:50<24:26:41, 4.17s/it] {'loss': 0.1535, 'grad_norm': 0.8078721313870865, 'learning_rate': 
7.044596623133607e-06, 'epoch': 0.38} 38%|███▊ | 13195/34278 [14:35:50<24:26:41, 4.17s/it] 38%|███▊ | 13196/34278 [14:35:53<22:48:03, 3.89s/it] {'loss': 0.1308, 'grad_norm': 0.8735492269731023, 'learning_rate': 7.044165484515315e-06, 'epoch': 0.38} 38%|███▊ | 13196/34278 [14:35:53<22:48:03, 3.89s/it] 38%|███▊ | 13197/34278 [14:35:57<21:59:00, 3.75s/it] {'loss': 0.1489, 'grad_norm': 0.8228735997223994, 'learning_rate': 7.043734327647202e-06, 'epoch': 0.38} 38%|███▊ | 13197/34278 [14:35:57<21:59:00, 3.75s/it] 39%|███▊ | 13198/34278 [14:36:01<23:36:56, 4.03s/it] {'loss': 0.1392, 'grad_norm': 0.6732863492182765, 'learning_rate': 7.043303152533119e-06, 'epoch': 0.39} 39%|███▊ | 13198/34278 [14:36:01<23:36:56, 4.03s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 39%|███▊ | 13199/34278 [14:36:06<24:26:32, 4.17s/it] {'loss': 0.2068, 'grad_norm': 0.8134206062713474, 'learning_rate': 7.042871959176909e-06, 'epoch': 0.39} 39%|███▊ | 13199/34278 [14:36:06<24:26:32, 4.17s/it] 39%|███▊ | 13200/34278 [14:36:09<22:17:49, 3.81s/it] {'loss': 0.1358, 'grad_norm': 0.838629080481506, 'learning_rate': 7.0424407475824285e-06, 'epoch': 0.39} 39%|███▊ | 13200/34278 [14:36:09<22:17:49, 3.81s/it] 39%|███▊ | 13201/34278 [14:36:12<21:12:19, 3.62s/it] {'loss': 0.1642, 'grad_norm': 0.8938669022470852, 'learning_rate': 7.042009517753525e-06, 'epoch': 0.39} 39%|███▊ | 13201/34278 [14:36:12<21:12:19, 3.62s/it] 39%|███▊ | 13202/34278 [14:36:15<20:42:13, 3.54s/it] {'loss': 0.13, 'grad_norm': 0.8037364863431298, 'learning_rate': 7.041578269694047e-06, 'epoch': 0.39} 39%|███▊ | 13202/34278 [14:36:15<20:42:13, 3.54s/it] 39%|███▊ | 13203/34278 [14:36:19<20:49:27, 3.56s/it] {'loss': 0.1806, 'grad_norm': 1.0073783371034495, 'learning_rate': 7.041147003407845e-06, 'epoch': 0.39} 39%|███▊ | 13203/34278 [14:36:19<20:49:27, 3.56s/it] 
39%|███▊ | 13204/34278 [14:36:22<19:28:09, 3.33s/it] {'loss': 0.1547, 'grad_norm': 0.8013140474535293, 'learning_rate': 7.04071571889877e-06, 'epoch': 0.39} 39%|███▊ | 13204/34278 [14:36:22<19:28:09, 3.33s/it] 39%|███▊ | 13205/34278 [14:36:26<20:47:12, 3.55s/it] {'loss': 0.1378, 'grad_norm': 0.8313544827264453, 'learning_rate': 7.040284416170673e-06, 'epoch': 0.39} 39%|███▊ | 13205/34278 [14:36:26<20:47:12, 3.55s/it] 39%|███▊ | 13206/34278 [14:36:29<19:42:02, 3.37s/it] {'loss': 0.1379, 'grad_norm': 1.025391914818843, 'learning_rate': 7.039853095227404e-06, 'epoch': 0.39} 39%|███▊ | 13206/34278 [14:36:29<19:42:02, 3.37s/it] 39%|███▊ | 13207/34278 [14:36:32<20:06:16, 3.43s/it] {'loss': 0.1182, 'grad_norm': 0.7223974582346451, 'learning_rate': 7.039421756072814e-06, 'epoch': 0.39} 39%|███▊ | 13207/34278 [14:36:32<20:06:16, 3.43s/it] 39%|███▊ | 13208/34278 [14:36:38<24:19:12, 4.16s/it] {'loss': 0.1189, 'grad_norm': 0.7760834738644458, 'learning_rate': 7.038990398710751e-06, 'epoch': 0.39} 39%|███▊ | 13208/34278 [14:36:38<24:19:12, 4.16s/it] 39%|███▊ | 13209/34278 [14:36:42<24:37:04, 4.21s/it] {'loss': 0.1506, 'grad_norm': 1.148551727361324, 'learning_rate': 7.03855902314507e-06, 'epoch': 0.39} 39%|███▊ | 13209/34278 [14:36:42<24:37:04, 4.21s/it] 39%|███▊ | 13210/34278 [14:36:48<27:43:11, 4.74s/it] {'loss': 0.1336, 'grad_norm': 0.7109625283204699, 'learning_rate': 7.0381276293796204e-06, 'epoch': 0.39} 39%|███▊ | 13210/34278 [14:36:48<27:43:11, 4.74s/it] 39%|███▊ | 13211/34278 [14:36:51<24:33:23, 4.20s/it] {'loss': 0.1532, 'grad_norm': 1.0315526502324017, 'learning_rate': 7.0376962174182536e-06, 'epoch': 0.39} 39%|███▊ | 13211/34278 [14:36:51<24:33:23, 4.20s/it] 39%|███▊ | 13212/34278 [14:36:57<26:39:13, 4.55s/it] {'loss': 0.1611, 'grad_norm': 0.7497842649954283, 'learning_rate': 7.037264787264823e-06, 'epoch': 0.39} 39%|███▊ | 13212/34278 [14:36:57<26:39:13, 
4.55s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
39%|███▊ | 13213/34278 [14:37:01<26:46:35, 4.58s/it] {'loss': 0.1495, 'grad_norm': 1.0430596271386305, 'learning_rate': 7.036833338923177e-06, 'epoch': 0.39} 39%|███▊ | 13213/34278 [14:37:01<26:46:35, 4.58s/it] 39%|███▊ | 13214/34278 [14:37:05<24:38:20, 4.21s/it] {'loss': 0.1399, 'grad_norm': 0.7477285953945649, 'learning_rate': 7.03640187239717e-06, 'epoch': 0.39} 39%|███▊ | 13214/34278 [14:37:05<24:38:20, 4.21s/it] 39%|███▊ | 13215/34278 [14:37:08<22:39:39, 3.87s/it] {'loss': 0.1332, 'grad_norm': 1.1289120089937266, 'learning_rate': 7.035970387690652e-06, 'epoch': 0.39} 39%|███▊ | 13215/34278 [14:37:08<22:39:39, 3.87s/it] 39%|███▊ | 13216/34278 [14:37:11<21:05:09, 3.60s/it] {'loss': 0.1217, 'grad_norm': 0.93720400927447, 'learning_rate': 7.035538884807478e-06, 'epoch': 0.39} 39%|███▊ | 13216/34278 [14:37:11<21:05:09, 3.60s/it] 39%|███▊ | 13217/34278 [14:37:14<20:20:56, 3.48s/it] {'loss': 0.1256, 'grad_norm': 0.7350282797287049, 'learning_rate': 7.035107363751499e-06, 'epoch': 0.39} 39%|███▊ | 13217/34278 [14:37:14<20:20:56, 3.48s/it] 39%|███▊ | 13218/34278 [14:37:18<20:30:50, 3.51s/it] {'loss': 0.1359, 'grad_norm': 1.0483884820992884, 'learning_rate': 7.034675824526566e-06, 'epoch': 0.39} 39%|███▊ | 13218/34278 [14:37:18<20:30:50, 3.51s/it] 39%|███▊ | 13219/34278 [14:37:20<18:53:19, 3.23s/it] {'loss': 0.121, 'grad_norm': 0.9309292583366686, 'learning_rate': 7.034244267136533e-06, 'epoch': 0.39} 39%|███▊ | 13219/34278 [14:37:20<18:53:19, 3.23s/it] 39%|███▊ | 13220/34278 [14:37:23<18:17:24, 3.13s/it] {'loss': 0.1367, 'grad_norm': 1.0145449977165903, 'learning_rate': 7.033812691585253e-06, 'epoch': 0.39} 39%|███▊ | 13220/34278 [14:37:23<18:17:24, 3.13s/it] 39%|███▊ | 13221/34278 [14:37:26<18:23:17, 3.14s/it] {'loss': 0.1324, 'grad_norm':
0.8552576548131363, 'learning_rate': 7.033381097876578e-06, 'epoch': 0.39} 39%|███▊ | 13221/34278 [14:37:26<18:23:17, 3.14s/it] 39%|███▊ | 13222/34278 [14:37:31<21:15:56, 3.64s/it] {'loss': 0.1327, 'grad_norm': 1.1621448997547186, 'learning_rate': 7.032949486014364e-06, 'epoch': 0.39} 39%|███▊ | 13222/34278 [14:37:31<21:15:56, 3.64s/it] 39%|███▊ | 13223/34278 [14:37:34<19:54:20, 3.40s/it] {'loss': 0.1475, 'grad_norm': 0.960819941259101, 'learning_rate': 7.032517856002461e-06, 'epoch': 0.39} 39%|███▊ | 13223/34278 [14:37:34<19:54:20, 3.40s/it] 39%|███▊ | 13224/34278 [14:37:37<18:49:11, 3.22s/it] {'loss': 0.1235, 'grad_norm': 0.9398773742574359, 'learning_rate': 7.0320862078447235e-06, 'epoch': 0.39} 39%|███▊ | 13224/34278 [14:37:37<18:49:11, 3.22s/it] 39%|███▊ | 13225/34278 [14:37:40<18:31:37, 3.17s/it] {'loss': 0.1163, 'grad_norm': 0.8729047289873096, 'learning_rate': 7.0316545415450065e-06, 'epoch': 0.39} 39%|███▊ | 13225/34278 [14:37:40<18:31:37, 3.17s/it] 39%|███▊ | 13226/34278 [14:37:44<20:29:39, 3.50s/it] {'loss': 0.1569, 'grad_norm': 1.0931647627080159, 'learning_rate': 7.0312228571071614e-06, 'epoch': 0.39} 39%|███▊ | 13226/34278 [14:37:44<20:29:39, 3.50s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 39%|███▊ | 13227/34278 [14:37:47<19:22:30, 3.31s/it] {'loss': 0.1408, 'grad_norm': 1.0550094070111407, 'learning_rate': 7.030791154535045e-06, 'epoch': 0.39} 39%|███▊ | 13227/34278 [14:37:47<19:22:30, 3.31s/it] 39%|███▊ | 13228/34278 [14:37:50<19:54:06, 3.40s/it] {'loss': 0.1593, 'grad_norm': 0.7257333429410203, 'learning_rate': 7.03035943383251e-06, 'epoch': 0.39} 39%|███▊ | 13228/34278 [14:37:50<19:54:06, 3.40s/it] 39%|███▊ | 13229/34278 [14:37:53<19:18:16, 3.30s/it] {'loss': 0.1429, 'grad_norm': 0.959882441307829, 'learning_rate': 7.029927695003408e-06, 'epoch': 0.39} 39%|███▊ | 13229/34278 [14:37:54<19:18:16, 3.30s/it] 39%|███▊ | 13230/34278 [14:37:57<19:06:22, 3.27s/it] {'loss': 0.141, 'grad_norm': 0.7871133713404713, 'learning_rate': 7.029495938051599e-06, 'epoch': 0.39} 39%|███▊ | 13230/34278 [14:37:57<19:06:22, 3.27s/it] 39%|███▊ | 13231/34278 [14:38:00<18:39:28, 3.19s/it] {'loss': 0.1525, 'grad_norm': 0.7285164992562347, 'learning_rate': 7.029064162980934e-06, 'epoch': 0.39} 39%|███▊ | 13231/34278 [14:38:00<18:39:28, 3.19s/it] 39%|███▊ | 13232/34278 [14:38:03<18:08:18, 3.10s/it] {'loss': 0.1383, 'grad_norm': 0.708274471085333, 'learning_rate': 7.028632369795267e-06, 'epoch': 0.39} 39%|███▊ | 13232/34278 [14:38:03<18:08:18, 3.10s/it] 39%|███▊ | 13233/34278 [14:38:08<22:59:18, 3.93s/it] {'loss': 0.1407, 'grad_norm': 0.9021603207471135, 'learning_rate': 7.028200558498457e-06, 'epoch': 0.39} 39%|███▊ | 13233/34278 [14:38:08<22:59:18, 3.93s/it] 39%|███▊ | 13234/34278 [14:38:12<22:14:32, 3.80s/it] {'loss': 0.1314, 'grad_norm': 0.7355522210034191, 'learning_rate': 7.0277687290943555e-06, 'epoch': 0.39} 39%|███▊ | 13234/34278 [14:38:12<22:14:32, 3.80s/it] 39%|███▊ | 13235/34278 [14:38:18<26:05:10, 4.46s/it] {'loss': 0.1462, 'grad_norm': 0.9074463689364065, 'learning_rate': 7.027336881586818e-06, 'epoch': 0.39} 39%|███▊ | 13235/34278 [14:38:18<26:05:10, 4.46s/it] 39%|███▊ | 13236/34278 [14:38:21<23:33:48, 4.03s/it] {'loss': 
0.1215, 'grad_norm': 0.704192428681642, 'learning_rate': 7.026905015979702e-06, 'epoch': 0.39} 39%|███▊ | 13236/34278 [14:38:21<23:33:48, 4.03s/it] 39%|███▊ | 13237/34278 [14:38:27<27:21:37, 4.68s/it] {'loss': 0.1287, 'grad_norm': 0.6116731736115757, 'learning_rate': 7.026473132276862e-06, 'epoch': 0.39} 39%|███▊ | 13237/34278 [14:38:27<27:21:37, 4.68s/it] 39%|███▊ | 13238/34278 [14:38:31<25:16:13, 4.32s/it] {'loss': 0.145, 'grad_norm': 0.9547230536130067, 'learning_rate': 7.026041230482152e-06, 'epoch': 0.39} 39%|███▊ | 13238/34278 [14:38:31<25:16:13, 4.32s/it] 39%|███▊ | 13239/34278 [14:38:37<27:56:37, 4.78s/it] {'loss': 0.1458, 'grad_norm': 0.8319729792584203, 'learning_rate': 7.02560931059943e-06, 'epoch': 0.39} 39%|███▊ | 13239/34278 [14:38:37<27:56:37, 4.78s/it] 39%|███▊ | 13240/34278 [14:38:42<29:05:33, 4.98s/it] {'loss': 0.1336, 'grad_norm': 0.7360962858435293, 'learning_rate': 7.025177372632554e-06, 'epoch': 0.39} 39%|███▊ | 13240/34278 [14:38:42<29:05:33, 4.98s/it] 39%|███▊ | 13241/34278 [14:38:45<26:14:17, 4.49s/it] {'loss': 0.1566, 'grad_norm': 0.7169362741111335, 'learning_rate': 7.0247454165853746e-06, 'epoch': 0.39} 39%|███▊ | 13241/34278 [14:38:45<26:14:17, 4.49s/it] 39%|███▊ | 13242/34278 [14:38:49<24:58:38, 4.27s/it] {'loss': 0.1573, 'grad_norm': 0.842506903764658, 'learning_rate': 7.024313442461753e-06, 'epoch': 0.39} 39%|███▊ | 13242/34278 [14:38:49<24:58:38, 4.27s/it] 39%|███▊ | 13243/34278 [14:38:53<24:57:24, 4.27s/it] {'loss': 0.144, 'grad_norm': 0.7194110795473797, 'learning_rate': 7.023881450265544e-06, 'epoch': 0.39} 39%|███▊ | 13243/34278 [14:38:53<24:57:24, 4.27s/it] 39%|███▊ | 13244/34278 [14:38:59<27:46:28, 4.75s/it] {'loss': 0.128, 'grad_norm': 0.7806393290389207, 'learning_rate': 7.023449440000605e-06, 'epoch': 0.39} 39%|███▊ | 13244/34278 [14:38:59<27:46:28, 4.75s/it] 39%|███▊ | 13245/34278 [14:39:04<27:50:32, 4.77s/it] {'loss': 0.1547, 'grad_norm': 1.0850628310588, 'learning_rate': 7.023017411670792e-06, 'epoch': 0.39} 39%|███▊ | 
13245/34278 [14:39:04<27:50:32, 4.77s/it] 39%|███▊ | 13246/34278 [14:39:07<24:23:57, 4.18s/it] {'loss': 0.1176, 'grad_norm': 0.7937628761505633, 'learning_rate': 7.022585365279963e-06, 'epoch': 0.39} 39%|███▊ | 13246/34278 [14:39:07<24:23:57, 4.18s/it] 39%|███▊ | 13247/34278 [14:39:13<27:38:57, 4.73s/it] {'loss': 0.1545, 'grad_norm': 1.016898103575818, 'learning_rate': 7.022153300831974e-06, 'epoch': 0.39} 39%|███▊ | 13247/34278 [14:39:13<27:38:57, 4.73s/it] 39%|███▊ | 13248/34278 [14:39:16<24:49:42, 4.25s/it] {'loss': 0.1433, 'grad_norm': 0.8049622544118625, 'learning_rate': 7.021721218330684e-06, 'epoch': 0.39} 39%|███▊ | 13248/34278 [14:39:16<24:49:42, 4.25s/it] 39%|███▊ | 13249/34278 [14:39:19<22:43:14, 3.89s/it] {'loss': 0.1645, 'grad_norm': 0.9263397793203935, 'learning_rate': 7.021289117779948e-06, 'epoch': 0.39} 39%|███▊ | 13249/34278 [14:39:19<22:43:14, 3.89s/it] 39%|███▊ | 13250/34278 [14:39:22<21:11:56, 3.63s/it] {'loss': 0.1427, 'grad_norm': 1.1031347446761475, 'learning_rate': 7.020856999183626e-06, 'epoch': 0.39} 39%|███▊ | 13250/34278 [14:39:22<21:11:56, 3.63s/it] 39%|███▊ | 13251/34278 [14:39:25<20:52:02, 3.57s/it] {'loss': 0.1485, 'grad_norm': 1.4868112119703023, 'learning_rate': 7.020424862545576e-06, 'epoch': 0.39} 39%|███▊ | 13251/34278 [14:39:25<20:52:02, 3.57s/it] 39%|███▊ | 13252/34278 [14:39:29<19:55:15, 3.41s/it] {'loss': 0.1403, 'grad_norm': 1.1050750802395453, 'learning_rate': 7.019992707869655e-06, 'epoch': 0.39} 39%|███▊ | 13252/34278 [14:39:29<19:55:15, 3.41s/it] 39%|███▊ | 13253/34278 [14:39:31<18:45:25, 3.21s/it] {'loss': 0.1497, 'grad_norm': 0.9099726322467829, 'learning_rate': 7.019560535159719e-06, 'epoch': 0.39} 39%|███▊ | 13253/34278 [14:39:31<18:45:25, 3.21s/it] 39%|███▊ | 13254/34278 [14:39:35<19:26:29, 3.33s/it] {'loss': 0.1517, 'grad_norm': 1.1939293331015322, 'learning_rate': 7.019128344419631e-06, 'epoch': 0.39} 39%|███▊ | 13254/34278 [14:39:35<19:26:29, 3.33s/it] 39%|███▊ | 13255/34278 [14:39:38<18:39:19, 3.19s/it] 
{'loss': 0.146, 'grad_norm': 1.2814198179595297, 'learning_rate': 7.018696135653248e-06, 'epoch': 0.39} 39%|███▊ | 13255/34278 [14:39:38<18:39:19, 3.19s/it] 39%|███▊ | 13256/34278 [14:39:41<18:57:43, 3.25s/it] {'loss': 0.1398, 'grad_norm': 0.7766726570343186, 'learning_rate': 7.018263908864424e-06, 'epoch': 0.39} 39%|███▊ | 13256/34278 [14:39:41<18:57:43, 3.25s/it] 39%|███▊ | 13257/34278 [14:39:44<18:14:41, 3.12s/it] {'loss': 0.1347, 'grad_norm': 0.7753074895094298, 'learning_rate': 7.017831664057026e-06, 'epoch': 0.39} 39%|███▊ | 13257/34278 [14:39:44<18:14:41, 3.12s/it] 39%|███▊ | 13258/34278 [14:39:47<17:58:29, 3.08s/it] {'loss': 0.127, 'grad_norm': 0.8792880022194638, 'learning_rate': 7.0173994012349066e-06, 'epoch': 0.39} 39%|███▊ | 13258/34278 [14:39:47<17:58:29, 3.08s/it] 39%|███▊ | 13259/34278 [14:39:50<18:01:09, 3.09s/it] {'loss': 0.1777, 'grad_norm': 0.9984398289008541, 'learning_rate': 7.016967120401925e-06, 'epoch': 0.39} 39%|███▊ | 13259/34278 [14:39:50<18:01:09, 3.09s/it] 39%|███▊ | 13260/34278 [14:39:53<17:39:41, 3.03s/it] {'loss': 0.1692, 'grad_norm': 0.8389228509562606, 'learning_rate': 7.016534821561947e-06, 'epoch': 0.39} 39%|███▊ | 13260/34278 [14:39:53<17:39:41, 3.03s/it] 39%|███▊ | 13261/34278 [14:39:56<18:05:40, 3.10s/it] {'loss': 0.1249, 'grad_norm': 0.8388674127325592, 'learning_rate': 7.016102504718824e-06, 'epoch': 0.39} 39%|███▊ | 13261/34278 [14:39:56<18:05:40, 3.10s/it] 39%|███▊ | 13262/34278 [14:40:02<23:36:25, 4.04s/it] {'loss': 0.1502, 'grad_norm': 0.7382572353971572, 'learning_rate': 7.015670169876419e-06, 'epoch': 0.39} 39%|███▊ | 13262/34278 [14:40:02<23:36:25, 4.04s/it] 39%|███▊ | 13263/34278 [14:40:06<22:12:09, 3.80s/it] {'loss': 0.1485, 'grad_norm': 0.8735915659877994, 'learning_rate': 7.015237817038594e-06, 'epoch': 0.39} 39%|███▊ | 13263/34278 [14:40:06<22:12:09, 3.80s/it] 39%|███▊ | 13264/34278 [14:40:09<21:20:39, 3.66s/it] {'loss': 0.1262, 'grad_norm': 0.9981665930892641, 'learning_rate': 7.014805446209205e-06, 'epoch': 
0.39} 39%|███▊ | 13264/34278 [14:40:09<21:20:39, 3.66s/it] 39%|███▊ | 13265/34278 [14:40:13<21:19:29, 3.65s/it] {'loss': 0.1389, 'grad_norm': 0.6620816291523073, 'learning_rate': 7.014373057392115e-06, 'epoch': 0.39} 39%|███▊ | 13265/34278 [14:40:13<21:19:29, 3.65s/it] 39%|███▊ | 13266/34278 [14:40:16<20:24:29, 3.50s/it] {'loss': 0.1435, 'grad_norm': 0.7700687214631495, 'learning_rate': 7.013940650591182e-06, 'epoch': 0.39} 39%|███▊ | 13266/34278 [14:40:16<20:24:29, 3.50s/it] 39%|███▊ | 13267/34278 [14:40:19<19:30:25, 3.34s/it] {'loss': 0.1269, 'grad_norm': 1.1417505579337457, 'learning_rate': 7.01350822581027e-06, 'epoch': 0.39} 39%|███▊ | 13267/34278 [14:40:19<19:30:25, 3.34s/it] 39%|███▊ | 13268/34278 [14:40:22<19:01:20, 3.26s/it] {'loss': 0.1709, 'grad_norm': 0.9066213369944394, 'learning_rate': 7.013075783053235e-06, 'epoch': 0.39} 39%|███▊ | 13268/34278 [14:40:22<19:01:20, 3.26s/it] 39%|███▊ | 13269/34278 [14:40:25<18:58:17, 3.25s/it] {'loss': 0.132, 'grad_norm': 0.6524092610583785, 'learning_rate': 7.012643322323941e-06, 'epoch': 0.39} 39%|███▊ | 13269/34278 [14:40:25<18:58:17, 3.25s/it] 39%|███▊ | 13270/34278 [14:40:28<18:40:41, 3.20s/it] {'loss': 0.128, 'grad_norm': 0.8563818341021124, 'learning_rate': 7.012210843626248e-06, 'epoch': 0.39} 39%|███▊ | 13270/34278 [14:40:28<18:40:41, 3.20s/it] 39%|███▊ | 13271/34278 [14:40:32<18:59:46, 3.26s/it] {'loss': 0.137, 'grad_norm': 0.9548941536063597, 'learning_rate': 7.011778346964015e-06, 'epoch': 0.39} 39%|███▊ | 13271/34278 [14:40:32<18:59:46, 3.26s/it] 39%|███▊ | 13272/34278 [14:40:38<23:51:05, 4.09s/it] {'loss': 0.1452, 'grad_norm': 0.9557896221863059, 'learning_rate': 7.011345832341109e-06, 'epoch': 0.39} 39%|███▊ | 13272/34278 [14:40:38<23:51:05, 4.09s/it] 39%|███▊ | 13273/34278 [14:40:41<22:18:56, 3.82s/it] {'loss': 0.1533, 'grad_norm': 0.9963487168990761, 'learning_rate': 7.0109132997613845e-06, 'epoch': 0.39} 39%|███▊ | 13273/34278 [14:40:41<22:18:56, 3.82s/it] 39%|███▊ | 13274/34278 [14:40:45<22:18:54, 
3.82s/it] {'loss': 0.1653, 'grad_norm': 0.998368277925134, 'learning_rate': 7.010480749228706e-06, 'epoch': 0.39} 39%|███▊ | 13274/34278 [14:40:45<22:18:54, 3.82s/it] 39%|███▊ | 13275/34278 [14:40:48<21:36:12, 3.70s/it] {'loss': 0.1196, 'grad_norm': 0.8492515173009113, 'learning_rate': 7.010048180746938e-06, 'epoch': 0.39} 39%|███▊ | 13275/34278 [14:40:48<21:36:12, 3.70s/it] 39%|███▊ | 13276/34278 [14:40:54<25:39:40, 4.40s/it] {'loss': 0.1659, 'grad_norm': 0.967085633972716, 'learning_rate': 7.009615594319937e-06, 'epoch': 0.39} 39%|███▊ | 13276/34278 [14:40:54<25:39:40, 4.40s/it] 39%|███▊ | 13277/34278 [14:40:58<25:17:27, 4.34s/it] {'loss': 0.1476, 'grad_norm': 0.8020883773551066, 'learning_rate': 7.0091829899515684e-06, 'epoch': 0.39} 39%|███▊ | 13277/34278 [14:40:58<25:17:27, 4.34s/it] 39%|███▊ | 13278/34278 [14:41:01<23:23:29, 4.01s/it] {'loss': 0.1428, 'grad_norm': 0.9411406020672687, 'learning_rate': 7.008750367645694e-06, 'epoch': 0.39} 39%|███▊ | 13278/34278 [14:41:01<23:23:29, 4.01s/it] 39%|███▊ | 13279/34278 [14:41:05<21:52:45, 3.75s/it] {'loss': 0.1338, 'grad_norm': 0.864130362302704, 'learning_rate': 7.008317727406175e-06, 'epoch': 0.39} 39%|███▊ | 13279/34278 [14:41:05<21:52:45, 3.75s/it] 39%|███▊ | 13280/34278 [14:41:08<20:37:28, 3.54s/it] {'loss': 0.1323, 'grad_norm': 0.7606777040854541, 'learning_rate': 7.007885069236876e-06, 'epoch': 0.39} 39%|███▊ | 13280/34278 [14:41:08<20:37:28, 3.54s/it] 39%|███▊ | 13281/34278 [14:41:11<19:58:04, 3.42s/it] {'loss': 0.157, 'grad_norm': 0.8536732065673777, 'learning_rate': 7.0074523931416585e-06, 'epoch': 0.39} 39%|███▊ | 13281/34278 [14:41:11<19:58:04, 3.42s/it] 39%|███▊ | 13282/34278 [14:41:14<19:06:11, 3.28s/it] {'loss': 0.1519, 'grad_norm': 1.2402798134330753, 'learning_rate': 7.007019699124385e-06, 'epoch': 0.39} 39%|███▊ | 13282/34278 [14:41:14<19:06:11, 3.28s/it] 39%|███▉ | 13283/34278 [14:41:17<18:39:31, 3.20s/it] {'loss': 0.1507, 'grad_norm': 0.9339407090209275, 'learning_rate': 7.006586987188917e-06, 
'epoch': 0.39} 39%|███▉ | 13283/34278 [14:41:17<18:39:31, 3.20s/it] 39%|███▉ | 13284/34278 [14:41:22<21:32:47, 3.69s/it] {'loss': 0.1617, 'grad_norm': 1.0502150476663425, 'learning_rate': 7.006154257339121e-06, 'epoch': 0.39} 39%|███▉ | 13284/34278 [14:41:22<21:32:47, 3.69s/it] 39%|███▉ | 13285/34278 [14:41:24<19:38:23, 3.37s/it] {'loss': 0.1357, 'grad_norm': 1.1563302839339988, 'learning_rate': 7.00572150957886e-06, 'epoch': 0.39} 39%|███▉ | 13285/34278 [14:41:24<19:38:23, 3.37s/it] 39%|███▉ | 13286/34278 [14:41:30<24:25:05, 4.19s/it] {'loss': 0.1296, 'grad_norm': 0.8922102358620652, 'learning_rate': 7.005288743911994e-06, 'epoch': 0.39} 39%|███▉ | 13286/34278 [14:41:30<24:25:05, 4.19s/it] 39%|███▉ | 13287/34278 [14:41:34<23:20:29, 4.00s/it] {'loss': 0.1306, 'grad_norm': 0.8754802960953985, 'learning_rate': 7.004855960342389e-06, 'epoch': 0.39} 39%|███▉ | 13287/34278 [14:41:34<23:20:29, 4.00s/it] 39%|███▉ | 13288/34278 [14:41:40<26:47:42, 4.60s/it] {'loss': 0.1411, 'grad_norm': 0.9678799900693552, 'learning_rate': 7.00442315887391e-06, 'epoch': 0.39} 39%|███▉ | 13288/34278 [14:41:40<26:47:42, 4.60s/it] 39%|███▉ | 13289/34278 [14:41:44<25:46:22, 4.42s/it] {'loss': 0.1213, 'grad_norm': 1.3212304721783794, 'learning_rate': 7.003990339510417e-06, 'epoch': 0.39} 39%|███▉ | 13289/34278 [14:41:44<25:46:22, 4.42s/it] 39%|███▉ | 13290/34278 [14:41:47<23:45:17, 4.07s/it] {'loss': 0.1385, 'grad_norm': 0.8836057308626918, 'learning_rate': 7.003557502255779e-06, 'epoch': 0.39} 39%|███▉ | 13290/34278 [14:41:47<23:45:17, 4.07s/it] 39%|███▉ | 13291/34278 [14:41:50<21:57:30, 3.77s/it] {'loss': 0.1624, 'grad_norm': 0.9399206699618007, 'learning_rate': 7.003124647113857e-06, 'epoch': 0.39} 39%|███▉ | 13291/34278 [14:41:50<21:57:30, 3.77s/it] 39%|███▉ | 13292/34278 [14:41:53<21:02:44, 3.61s/it] {'loss': 0.1562, 'grad_norm': 0.8276290543556397, 'learning_rate': 7.002691774088517e-06, 'epoch': 0.39} 39%|███▉ | 13292/34278 [14:41:53<21:02:44, 3.61s/it] 39%|███▉ | 13293/34278 
[14:41:57<21:24:51, 3.67s/it] {'loss': 0.1451, 'grad_norm': 0.9554146122010095, 'learning_rate': 7.002258883183621e-06, 'epoch': 0.39} 39%|███▉ | 13293/34278 [14:41:57<21:24:51, 3.67s/it] 39%|███▉ | 13294/34278 [14:42:01<22:20:05, 3.83s/it] {'loss': 0.1402, 'grad_norm': 0.8170182403350386, 'learning_rate': 7.001825974403038e-06, 'epoch': 0.39} 39%|███▉ | 13294/34278 [14:42:01<22:20:05, 3.83s/it] 39%|███▉ | 13295/34278 [14:42:05<21:47:35, 3.74s/it] {'loss': 0.139, 'grad_norm': 0.7558694011352974, 'learning_rate': 7.001393047750629e-06, 'epoch': 0.39} 39%|███▉ | 13295/34278 [14:42:05<21:47:35, 3.74s/it] 39%|███▉ | 13296/34278 [14:42:08<20:32:59, 3.53s/it] {'loss': 0.1372, 'grad_norm': 0.698040528931186, 'learning_rate': 7.000960103230261e-06, 'epoch': 0.39} 39%|███▉ | 13296/34278 [14:42:08<20:32:59, 3.53s/it] 39%|███▉ | 13297/34278 [14:42:12<20:29:44, 3.52s/it] {'loss': 0.1233, 'grad_norm': 0.931884498583288, 'learning_rate': 7.000527140845801e-06, 'epoch': 0.39} 39%|███▉ | 13297/34278 [14:42:12<20:29:44, 3.52s/it] 39%|███▉ | 13298/34278 [14:42:15<19:52:27, 3.41s/it] {'loss': 0.1408, 'grad_norm': 0.9023811833610386, 'learning_rate': 7.000094160601109e-06, 'epoch': 0.39} 39%|███▉ | 13298/34278 [14:42:15<19:52:27, 3.41s/it] 39%|███▉ | 13299/34278 [14:42:18<19:31:00, 3.35s/it] {'loss': 0.1636, 'grad_norm': 0.8712635751305197, 'learning_rate': 6.999661162500056e-06, 'epoch': 0.39} 39%|███▉ | 13299/34278 [14:42:18<19:31:00, 3.35s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f962cfa9d00>
Failed to fetch sample 3588840. Exception: cannot identify image file <_io.BytesIO object at 0x7f962cfa9d00>
39%|███▉ | 13300/34278 [14:42:21<18:47:29, 3.22s/it] {'loss': 0.1234, 'grad_norm': 0.9558331606284869, 'learning_rate': 6.999228146546504e-06, 'epoch': 0.39} 39%|███▉ | 13300/34278 [14:42:21<18:47:29, 3.22s/it] 39%|███▉ | 13301/34278 [14:42:25<19:56:59, 3.42s/it] {'loss': 0.153, 'grad_norm': 0.9861588803013244, 'learning_rate': 6.998795112744321e-06, 'epoch': 0.39} 39%|███▉ | 13301/34278 [14:42:25<19:56:59, 3.42s/it] 39%|███▉ | 13302/34278 [14:42:28<19:13:12, 3.30s/it] {'loss': 0.1598, 'grad_norm': 1.109512376081367, 'learning_rate': 6.9983620610973725e-06, 'epoch': 0.39} 39%|███▉ | 13302/34278 [14:42:28<19:13:12, 3.30s/it] 39%|███▉ | 13303/34278 [14:42:31<18:42:15, 3.21s/it] {'loss': 0.1363, 'grad_norm': 0.8147009888384774, 'learning_rate': 6.997928991609525e-06, 'epoch': 0.39} 39%|███▉ | 13303/34278 [14:42:31<18:42:15, 3.21s/it] 39%|███▉ | 13304/34278 [14:42:34<19:36:40, 3.37s/it] {'loss': 0.1644, 'grad_norm': 0.9339738820826606, 'learning_rate': 6.997495904284643e-06, 'epoch': 0.39} 39%|███▉ | 13304/34278
[14:42:34<19:36:40, 3.37s/it] 39%|███▉ | 13305/34278 [14:42:38<19:27:35, 3.34s/it] {'loss': 0.1512, 'grad_norm': 0.7825712474625479, 'learning_rate': 6.9970627991265964e-06, 'epoch': 0.39} 39%|███▉ | 13305/34278 [14:42:38<19:27:35, 3.34s/it] 39%|███▉ | 13306/34278 [14:42:44<24:00:48, 4.12s/it] {'loss': 0.1671, 'grad_norm': 0.9293887480382017, 'learning_rate': 6.9966296761392485e-06, 'epoch': 0.39} 39%|███▉ | 13306/34278 [14:42:44<24:00:48, 4.12s/it] 39%|███▉ | 13307/34278 [14:42:47<21:48:26, 3.74s/it] {'loss': 0.1322, 'grad_norm': 0.8516994438558391, 'learning_rate': 6.9961965353264675e-06, 'epoch': 0.39} 39%|███▉ | 13307/34278 [14:42:47<21:48:26, 3.74s/it] 39%|███▉ | 13308/34278 [14:42:52<24:33:29, 4.22s/it] {'loss': 0.1544, 'grad_norm': 0.9549232915876097, 'learning_rate': 6.995763376692121e-06, 'epoch': 0.39} 39%|███▉ | 13308/34278 [14:42:52<24:33:29, 4.22s/it] 39%|███▉ | 13309/34278 [14:42:55<22:18:42, 3.83s/it] {'loss': 0.1342, 'grad_norm': 1.013295450340856, 'learning_rate': 6.995330200240075e-06, 'epoch': 0.39} 39%|███▉ | 13309/34278 [14:42:55<22:18:42, 3.83s/it] 39%|███▉ | 13310/34278 [14:42:59<22:55:14, 3.94s/it] {'loss': 0.1104, 'grad_norm': 0.6954548142395225, 'learning_rate': 6.994897005974197e-06, 'epoch': 0.39} 39%|███▉ | 13310/34278 [14:42:59<22:55:14, 3.94s/it] 39%|███▉ | 13311/34278 [14:43:02<21:44:02, 3.73s/it] {'loss': 0.1254, 'grad_norm': 0.8169893974116368, 'learning_rate': 6.9944637938983555e-06, 'epoch': 0.39} 39%|███▉ | 13311/34278 [14:43:02<21:44:02, 3.73s/it] 39%|███▉ | 13312/34278 [14:43:05<20:56:56, 3.60s/it] {'loss': 0.1555, 'grad_norm': 1.1914272777175088, 'learning_rate': 6.994030564016418e-06, 'epoch': 0.39} 39%|███▉ | 13312/34278 [14:43:06<20:56:56, 3.60s/it] 39%|███▉ | 13313/34278 [14:43:09<20:54:08, 3.59s/it] {'loss': 0.154, 'grad_norm': 0.8296728806910973, 'learning_rate': 6.993597316332249e-06, 'epoch': 0.39} 39%|███▉ | 13313/34278 [14:43:09<20:54:08, 3.59s/it] 39%|███▉ | 13314/34278 [14:43:12<19:25:24, 3.34s/it] {'loss': 
0.1387, 'grad_norm': 0.8159415283837939, 'learning_rate': 6.9931640508497215e-06, 'epoch': 0.39} 39%|███▉ | 13314/34278 [14:43:12<19:25:24, 3.34s/it] 39%|███▉ | 13315/34278 [14:43:17<22:47:13, 3.91s/it] {'loss': 0.1463, 'grad_norm': 1.022868637756655, 'learning_rate': 6.9927307675727005e-06, 'epoch': 0.39} 39%|███▉ | 13315/34278 [14:43:17<22:47:13, 3.91s/it] 39%|███▉ | 13316/34278 [14:43:20<21:49:22, 3.75s/it] {'loss': 0.1829, 'grad_norm': 0.9184870012599347, 'learning_rate': 6.9922974665050534e-06, 'epoch': 0.39} 39%|███▉ | 13316/34278 [14:43:20<21:49:22, 3.75s/it] 39%|███▉ | 13317/34278 [14:43:25<23:02:47, 3.96s/it] {'loss': 0.1591, 'grad_norm': 1.0162453871884387, 'learning_rate': 6.991864147650653e-06, 'epoch': 0.39} 39%|███▉ | 13317/34278 [14:43:25<23:02:47, 3.96s/it] 39%|███▉ | 13318/34278 [14:43:29<24:06:47, 4.14s/it] {'loss': 0.1392, 'grad_norm': 0.6958059252750741, 'learning_rate': 6.991430811013363e-06, 'epoch': 0.39} 39%|███▉ | 13318/34278 [14:43:29<24:06:47, 4.14s/it] 39%|███▉ | 13319/34278 [14:43:33<22:13:30, 3.82s/it] {'loss': 0.1395, 'grad_norm': 1.5786897357763856, 'learning_rate': 6.990997456597054e-06, 'epoch': 0.39} 39%|███▉ | 13319/34278 [14:43:33<22:13:30, 3.82s/it] 39%|███▉ | 13320/34278 [14:43:37<22:42:27, 3.90s/it] {'loss': 0.149, 'grad_norm': 0.7972810695226562, 'learning_rate': 6.990564084405595e-06, 'epoch': 0.39} 39%|███▉ | 13320/34278 [14:43:37<22:42:27, 3.90s/it] 39%|███▉ | 13321/34278 [14:43:43<26:24:37, 4.54s/it] {'loss': 0.1416, 'grad_norm': 0.8777016036161785, 'learning_rate': 6.990130694442857e-06, 'epoch': 0.39} 39%|███▉ | 13321/34278 [14:43:43<26:24:37, 4.54s/it] 39%|███▉ | 13322/34278 [14:43:46<24:21:20, 4.18s/it] {'loss': 0.1377, 'grad_norm': 0.8780807240183021, 'learning_rate': 6.989697286712705e-06, 'epoch': 0.39} 39%|███▉ | 13322/34278 [14:43:46<24:21:20, 4.18s/it] 39%|███▉ | 13323/34278 [14:43:49<22:46:23, 3.91s/it] {'loss': 0.1512, 'grad_norm': 0.9176289988609246, 'learning_rate': 6.9892638612190125e-06, 'epoch': 0.39} 
39%|███▉ | 13323/34278 [14:43:49<22:46:23, 3.91s/it] 39%|███▉ | 13324/34278 [14:43:52<21:14:24, 3.65s/it] {'loss': 0.1515, 'grad_norm': 0.803467416298351, 'learning_rate': 6.988830417965645e-06, 'epoch': 0.39} 39%|███▉ | 13324/34278 [14:43:52<21:14:24, 3.65s/it] 39%|███▉ | 13325/34278 [14:43:56<20:27:43, 3.52s/it] {'loss': 0.1527, 'grad_norm': 0.811414526116803, 'learning_rate': 6.988396956956476e-06, 'epoch': 0.39} 39%|███▉ | 13325/34278 [14:43:56<20:27:43, 3.52s/it] 39%|███▉ | 13326/34278 [14:43:59<20:15:53, 3.48s/it] {'loss': 0.1594, 'grad_norm': 0.725293480837676, 'learning_rate': 6.987963478195373e-06, 'epoch': 0.39} 39%|███▉ | 13326/34278 [14:43:59<20:15:53, 3.48s/it] 39%|███▉ | 13327/34278 [14:44:05<24:33:54, 4.22s/it] {'loss': 0.141, 'grad_norm': 0.8542884036141146, 'learning_rate': 6.9875299816862075e-06, 'epoch': 0.39} 39%|███▉ | 13327/34278 [14:44:05<24:33:54, 4.22s/it] 39%|███▉ | 13328/34278 [14:44:09<24:09:36, 4.15s/it] {'loss': 0.1649, 'grad_norm': 0.9948677236482435, 'learning_rate': 6.987096467432847e-06, 'epoch': 0.39} 39%|███▉ | 13328/34278 [14:44:09<24:09:36, 4.15s/it] 39%|███▉ | 13329/34278 [14:44:12<22:23:49, 3.85s/it] {'loss': 0.1173, 'grad_norm': 0.8379567960194645, 'learning_rate': 6.986662935439165e-06, 'epoch': 0.39} 39%|███▉ | 13329/34278 [14:44:12<22:23:49, 3.85s/it] 39%|███▉ | 13330/34278 [14:44:15<21:24:17, 3.68s/it] {'loss': 0.1329, 'grad_norm': 0.9104299231378886, 'learning_rate': 6.98622938570903e-06, 'epoch': 0.39} 39%|███▉ | 13330/34278 [14:44:15<21:24:17, 3.68s/it] 39%|███▉ | 13331/34278 [14:44:19<21:30:26, 3.70s/it] {'loss': 0.134, 'grad_norm': 1.2147220613229637, 'learning_rate': 6.985795818246313e-06, 'epoch': 0.39} 39%|███▉ | 13331/34278 [14:44:19<21:30:26, 3.70s/it] 39%|███▉ | 13332/34278 [14:44:22<19:47:34, 3.40s/it] {'loss': 0.1697, 'grad_norm': 1.1428833879959506, 'learning_rate': 6.985362233054887e-06, 'epoch': 0.39} 39%|███▉ | 13332/34278 [14:44:22<19:47:34, 3.40s/it] 39%|███▉ | 13333/34278 [14:44:26<21:18:55, 3.66s/it] 
{'loss': 0.1414, 'grad_norm': 0.7664877068593712, 'learning_rate': 6.984928630138619e-06, 'epoch': 0.39} 39%|███▉ | 13333/34278 [14:44:26<21:18:55, 3.66s/it] 39%|███▉ | 13334/34278 [14:44:30<21:28:09, 3.69s/it] {'loss': 0.1476, 'grad_norm': 0.7097562592324536, 'learning_rate': 6.984495009501381e-06, 'epoch': 0.39} 39%|███▉ | 13334/34278 [14:44:30<21:28:09, 3.69s/it] 39%|███▉ | 13335/34278 [14:44:33<20:23:43, 3.51s/it] {'loss': 0.1442, 'grad_norm': 0.9832394396281896, 'learning_rate': 6.984061371147047e-06, 'epoch': 0.39} 39%|███▉ | 13335/34278 [14:44:33<20:23:43, 3.51s/it] 39%|███▉ | 13336/34278 [14:44:39<24:53:41, 4.28s/it] {'loss': 0.1457, 'grad_norm': 0.8274496434558746, 'learning_rate': 6.983627715079487e-06, 'epoch': 0.39} 39%|███▉ | 13336/34278 [14:44:39<24:53:41, 4.28s/it] 39%|███▉ | 13337/34278 [14:44:43<24:17:09, 4.18s/it] {'loss': 0.1492, 'grad_norm': 0.7244927244327911, 'learning_rate': 6.98319404130257e-06, 'epoch': 0.39} 39%|███▉ | 13337/34278 [14:44:43<24:17:09, 4.18s/it] 39%|███▉ | 13338/34278 [14:44:46<22:55:18, 3.94s/it] {'loss': 0.1507, 'grad_norm': 1.043038759570755, 'learning_rate': 6.982760349820172e-06, 'epoch': 0.39} 39%|███▉ | 13338/34278 [14:44:46<22:55:18, 3.94s/it] 39%|███▉ | 13339/34278 [14:44:49<21:12:27, 3.65s/it] {'loss': 0.1362, 'grad_norm': 0.9942063133788643, 'learning_rate': 6.9823266406361625e-06, 'epoch': 0.39} 39%|███▉ | 13339/34278 [14:44:49<21:12:27, 3.65s/it] 39%|███▉ | 13340/34278 [14:44:55<25:16:51, 4.35s/it] {'loss': 0.1769, 'grad_norm': 0.8422806446955047, 'learning_rate': 6.981892913754414e-06, 'epoch': 0.39} 39%|███▉ | 13340/34278 [14:44:55<25:16:51, 4.35s/it] 39%|███▉ | 13341/34278 [14:44:58<22:56:34, 3.94s/it] {'loss': 0.1208, 'grad_norm': 0.8708336803825205, 'learning_rate': 6.981459169178799e-06, 'epoch': 0.39} 39%|███▉ | 13341/34278 [14:44:58<22:56:34, 3.94s/it] 39%|███▉ | 13342/34278 [14:45:01<21:19:29, 3.67s/it] {'loss': 0.156, 'grad_norm': 0.82161203254612, 'learning_rate': 6.98102540691319e-06, 'epoch': 0.39} 
39%|███▉ | 13342/34278 [14:45:01<21:19:29, 3.67s/it] 39%|███▉ | 13343/34278 [14:45:05<21:04:12, 3.62s/it] {'loss': 0.1447, 'grad_norm': 0.8041841095006205, 'learning_rate': 6.980591626961457e-06, 'epoch': 0.39} 39%|███▉ | 13343/34278 [14:45:05<21:04:12, 3.62s/it] 39%|███▉ | 13344/34278 [14:45:11<25:39:16, 4.41s/it] {'loss': 0.1366, 'grad_norm': 0.9053273291678036, 'learning_rate': 6.980157829327476e-06, 'epoch': 0.39} 39%|███▉ | 13344/34278 [14:45:11<25:39:16, 4.41s/it] 39%|███▉ | 13345/34278 [14:45:14<23:00:13, 3.96s/it] {'loss': 0.1458, 'grad_norm': 0.8936512087957569, 'learning_rate': 6.979724014015119e-06, 'epoch': 0.39} 39%|███▉ | 13345/34278 [14:45:14<23:00:13, 3.96s/it] 39%|███▉ | 13346/34278 [14:45:17<21:49:59, 3.76s/it] {'loss': 0.144, 'grad_norm': 0.9754935926371973, 'learning_rate': 6.979290181028258e-06, 'epoch': 0.39} 39%|███▉ | 13346/34278 [14:45:17<21:49:59, 3.76s/it] 39%|███▉ | 13347/34278 [14:45:21<21:29:26, 3.70s/it] {'loss': 0.1295, 'grad_norm': 0.6794009994474512, 'learning_rate': 6.978856330370768e-06, 'epoch': 0.39} 39%|███▉ | 13347/34278 [14:45:21<21:29:26, 3.70s/it] 39%|███▉ | 13348/34278 [14:45:24<20:14:34, 3.48s/it] {'loss': 0.1337, 'grad_norm': 0.7867241800532104, 'learning_rate': 6.97842246204652e-06, 'epoch': 0.39} 39%|███▉ | 13348/34278 [14:45:24<20:14:34, 3.48s/it] 39%|███▉ | 13349/34278 [14:45:27<19:08:41, 3.29s/it] {'loss': 0.1414, 'grad_norm': 0.714191117427281, 'learning_rate': 6.977988576059387e-06, 'epoch': 0.39} 39%|███▉ | 13349/34278 [14:45:27<19:08:41, 3.29s/it] 39%|███▉ | 13350/34278 [14:45:30<18:43:42, 3.22s/it] {'loss': 0.1369, 'grad_norm': 0.8831457005697209, 'learning_rate': 6.977554672413247e-06, 'epoch': 0.39} 39%|███▉ | 13350/34278 [14:45:30<18:43:42, 3.22s/it] 39%|███▉ | 13351/34278 [14:45:33<18:49:18, 3.24s/it] {'loss': 0.153, 'grad_norm': 0.662207327080798, 'learning_rate': 6.97712075111197e-06, 'epoch': 0.39} 39%|███▉ | 13351/34278 [14:45:33<18:49:18, 3.24s/it] 39%|███▉ | 13352/34278 [14:45:36<18:13:33, 3.14s/it] 
{'loss': 0.1417, 'grad_norm': 0.9824106029973495, 'learning_rate': 6.97668681215943e-06, 'epoch': 0.39} 39%|███▉ | 13352/34278 [14:45:36<18:13:33, 3.14s/it] 39%|███▉ | 13353/34278 [14:45:39<18:43:30, 3.22s/it] {'loss': 0.1465, 'grad_norm': 0.7797362553881462, 'learning_rate': 6.976252855559504e-06, 'epoch': 0.39} 39%|███▉ | 13353/34278 [14:45:39<18:43:30, 3.22s/it] 39%|███▉ | 13354/34278 [14:45:43<19:44:11, 3.40s/it] {'loss': 0.161, 'grad_norm': 0.7245514585214006, 'learning_rate': 6.975818881316062e-06, 'epoch': 0.39} 39%|███▉ | 13354/34278 [14:45:43<19:44:11, 3.40s/it] 39%|███▉ | 13355/34278 [14:45:49<24:12:16, 4.16s/it] {'loss': 0.1372, 'grad_norm': 0.6144009335532287, 'learning_rate': 6.975384889432981e-06, 'epoch': 0.39} 39%|███▉ | 13355/34278 [14:45:49<24:12:16, 4.16s/it] 39%|███▉ | 13356/34278 [14:45:52<22:57:20, 3.95s/it] {'loss': 0.14, 'grad_norm': 0.8314983607142742, 'learning_rate': 6.974950879914136e-06, 'epoch': 0.39} 39%|███▉ | 13356/34278 [14:45:52<22:57:20, 3.95s/it] 39%|███▉ | 13357/34278 [14:45:56<21:31:20, 3.70s/it] {'loss': 0.109, 'grad_norm': 0.7598820440301378, 'learning_rate': 6.9745168527634024e-06, 'epoch': 0.39} 39%|███▉ | 13357/34278 [14:45:56<21:31:20, 3.70s/it] 39%|███▉ | 13358/34278 [14:45:59<21:00:23, 3.61s/it] {'loss': 0.1733, 'grad_norm': 0.667888523275534, 'learning_rate': 6.974082807984651e-06, 'epoch': 0.39} 39%|███▉ | 13358/34278 [14:45:59<21:00:23, 3.61s/it] 39%|███▉ | 13359/34278 [14:46:03<20:56:10, 3.60s/it] {'loss': 0.1278, 'grad_norm': 0.7147882986652229, 'learning_rate': 6.973648745581761e-06, 'epoch': 0.39} 39%|███▉ | 13359/34278 [14:46:03<20:56:10, 3.60s/it] 39%|███▉ | 13360/34278 [14:46:05<19:49:32, 3.41s/it] {'loss': 0.1422, 'grad_norm': 0.9553106230970906, 'learning_rate': 6.973214665558606e-06, 'epoch': 0.39} 39%|███▉ | 13360/34278 [14:46:05<19:49:32, 3.41s/it] 39%|███▉ | 13361/34278 [14:46:11<23:21:17, 4.02s/it] {'loss': 0.1396, 'grad_norm': 0.7756088344563122, 'learning_rate': 6.972780567919061e-06, 'epoch': 0.39} 
39%|███▉ | 13361/34278 [14:46:11<23:21:17, 4.02s/it] 39%|███▉ | 13362/34278 [14:46:15<22:58:49, 3.96s/it] {'loss': 0.1432, 'grad_norm': 0.847104542280691, 'learning_rate': 6.972346452667003e-06, 'epoch': 0.39} 39%|███▉ | 13362/34278 [14:46:15<22:58:49, 3.96s/it] 39%|███▉ | 13363/34278 [14:46:19<23:20:07, 4.02s/it] {'loss': 0.1497, 'grad_norm': 0.8712932944498577, 'learning_rate': 6.971912319806306e-06, 'epoch': 0.39} 39%|███▉ | 13363/34278 [14:46:19<23:20:07, 4.02s/it] 39%|███▉ | 13364/34278 [14:46:22<21:41:43, 3.73s/it] {'loss': 0.1272, 'grad_norm': 0.6655785423789258, 'learning_rate': 6.971478169340846e-06, 'epoch': 0.39} 39%|███▉ | 13364/34278 [14:46:22<21:41:43, 3.73s/it] 39%|███▉ | 13365/34278 [14:46:26<22:40:29, 3.90s/it] {'loss': 0.1402, 'grad_norm': 0.7828140569257632, 'learning_rate': 6.971044001274502e-06, 'epoch': 0.39} 39%|███▉ | 13365/34278 [14:46:26<22:40:29, 3.90s/it] 39%|███▉ | 13366/34278 [14:46:30<21:34:53, 3.72s/it] {'loss': 0.1665, 'grad_norm': 0.888757757668912, 'learning_rate': 6.970609815611146e-06, 'epoch': 0.39} 39%|███▉ | 13366/34278 [14:46:30<21:34:53, 3.72s/it] 39%|███▉ | 13367/34278 [14:46:33<20:30:53, 3.53s/it] {'loss': 0.1469, 'grad_norm': 1.1091736288901854, 'learning_rate': 6.970175612354655e-06, 'epoch': 0.39} 39%|███▉ | 13367/34278 [14:46:33<20:30:53, 3.53s/it] 39%|███▉ | 13368/34278 [14:46:36<20:29:48, 3.53s/it] {'loss': 0.1707, 'grad_norm': 0.873249582296831, 'learning_rate': 6.969741391508907e-06, 'epoch': 0.39} 39%|███▉ | 13368/34278 [14:46:36<20:29:48, 3.53s/it] 39%|███▉ | 13369/34278 [14:46:42<24:11:29, 4.17s/it] {'loss': 0.1505, 'grad_norm': 0.7237473938954652, 'learning_rate': 6.969307153077779e-06, 'epoch': 0.39} 39%|███▉ | 13369/34278 [14:46:42<24:11:29, 4.17s/it] 39%|███▉ | 13370/34278 [14:46:45<21:39:13, 3.73s/it] {'loss': 0.1459, 'grad_norm': 0.8722380633299138, 'learning_rate': 6.968872897065147e-06, 'epoch': 0.39} 39%|███▉ | 13370/34278 [14:46:45<21:39:13, 3.73s/it] 39%|███▉ | 13371/34278 [14:46:48<20:48:38, 
3.58s/it] {'loss': 0.141, 'grad_norm': 0.8690157356299223, 'learning_rate': 6.9684386234748866e-06, 'epoch': 0.39} 39%|███▉ | 13371/34278 [14:46:48<20:48:38, 3.58s/it] 39%|███▉ | 13372/34278 [14:46:51<20:44:24, 3.57s/it] {'loss': 0.121, 'grad_norm': 0.7648898813933638, 'learning_rate': 6.968004332310877e-06, 'epoch': 0.39} 39%|███▉ | 13372/34278 [14:46:51<20:44:24, 3.57s/it] 39%|███▉ | 13373/34278 [14:46:54<19:43:26, 3.40s/it] {'loss': 0.1603, 'grad_norm': 0.6912289339659948, 'learning_rate': 6.967570023576993e-06, 'epoch': 0.39} 39%|███▉ | 13373/34278 [14:46:54<19:43:26, 3.40s/it] 39%|███▉ | 13374/34278 [14:46:57<19:04:59, 3.29s/it] {'loss': 0.1539, 'grad_norm': 0.7626884920461074, 'learning_rate': 6.967135697277114e-06, 'epoch': 0.39} 39%|███▉ | 13374/34278 [14:46:57<19:04:59, 3.29s/it] 39%|███▉ | 13375/34278 [14:47:02<21:07:01, 3.64s/it] {'loss': 0.125, 'grad_norm': 0.9089730229210611, 'learning_rate': 6.96670135341512e-06, 'epoch': 0.39} 39%|███▉ | 13375/34278 [14:47:02<21:07:01, 3.64s/it] 39%|███▉ | 13376/34278 [14:47:05<21:04:39, 3.63s/it] {'loss': 0.1466, 'grad_norm': 0.7886555626671125, 'learning_rate': 6.966266991994881e-06, 'epoch': 0.39} 39%|███▉ | 13376/34278 [14:47:05<21:04:39, 3.63s/it] 39%|███▉ | 13377/34278 [14:47:10<22:48:44, 3.93s/it] {'loss': 0.1729, 'grad_norm': 0.980473515537376, 'learning_rate': 6.965832613020284e-06, 'epoch': 0.39} 39%|███▉ | 13377/34278 [14:47:10<22:48:44, 3.93s/it] 39%|███▉ | 13378/34278 [14:47:13<21:02:43, 3.63s/it] {'loss': 0.1566, 'grad_norm': 0.8537764799859051, 'learning_rate': 6.9653982164952e-06, 'epoch': 0.39} 39%|███▉ | 13378/34278 [14:47:13<21:02:43, 3.63s/it] 39%|███▉ | 13379/34278 [14:47:18<23:07:01, 3.98s/it] {'loss': 0.1522, 'grad_norm': 1.0210186184483199, 'learning_rate': 6.96496380242351e-06, 'epoch': 0.39} 39%|███▉ | 13379/34278 [14:47:18<23:07:01, 3.98s/it] 39%|███▉ | 13380/34278 [14:47:21<21:39:34, 3.73s/it] {'loss': 0.1384, 'grad_norm': 0.8263969947570391, 'learning_rate': 6.964529370809095e-06, 
'epoch': 0.39} 39%|███▉ | 13380/34278 [14:47:21<21:39:34, 3.73s/it] 39%|███▉ | 13381/34278 [14:47:27<25:45:58, 4.44s/it] {'loss': 0.1613, 'grad_norm': 0.9348121446069635, 'learning_rate': 6.964094921655828e-06, 'epoch': 0.39} 39%|███▉ | 13381/34278 [14:47:27<25:45:58, 4.44s/it] 39%|███▉ | 13382/34278 [14:47:30<23:42:40, 4.09s/it] {'loss': 0.1304, 'grad_norm': 0.9743860353075883, 'learning_rate': 6.963660454967591e-06, 'epoch': 0.39} 39%|███▉ | 13382/34278 [14:47:30<23:42:40, 4.09s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41677b3970>
Failed to fetch sample 3110972.
Exception: cannot identify image file <_io.BytesIO object at 0x7f41677b3970> 39%|███▉ | 13383/34278 [14:47:36<25:58:09, 4.47s/it] {'loss': 0.1601, 'grad_norm': 0.712965059417837, 'learning_rate': 6.963225970748262e-06, 'epoch': 0.39} 39%|███▉ | 13383/34278 [14:47:36<25:58:09, 4.47s/it] 39%|███▉ | 13384/34278 [14:47:39<23:40:43, 4.08s/it] {'loss': 0.1621, 'grad_norm': 0.8278049793772592, 'learning_rate': 6.96279146900172e-06, 'epoch': 0.39} 39%|███▉ | 13384/34278 [14:47:39<23:40:43, 4.08s/it] 39%|███▉ | 13385/34278 [14:47:42<22:04:17, 3.80s/it] {'loss': 0.1639, 'grad_norm': 0.8431031487562697, 'learning_rate': 6.962356949731846e-06, 'epoch': 0.39} 39%|███▉ | 13385/34278 [14:47:42<22:04:17, 3.80s/it] 39%|███▉ | 13386/34278 [14:47:47<23:39:46, 4.08s/it] {'loss': 0.1272, 'grad_norm': 0.8061050375402079, 'learning_rate': 6.961922412942517e-06, 'epoch': 0.39} 39%|███▉ | 13386/34278 [14:47:47<23:39:46, 4.08s/it] 39%|███▉ | 13387/34278 [14:47:50<21:35:01, 3.72s/it] {'loss': 0.1391, 'grad_norm': 0.6606946199278728, 'learning_rate': 6.9614878586376125e-06, 'epoch': 0.39} 39%|███▉ | 13387/34278 [14:47:50<21:35:01, 3.72s/it] 39%|███▉ | 13388/34278 [14:47:52<20:11:41, 3.48s/it] {'loss': 0.1427, 'grad_norm': 0.9617599754709144, 'learning_rate': 6.961053286821012e-06, 'epoch': 0.39} 39%|███▉ | 13388/34278 [14:47:52<20:11:41, 3.48s/it] 39%|███▉ | 13389/34278 [14:47:58<24:32:48, 4.23s/it] {'loss': 0.1594, 'grad_norm': 0.9996814338324447, 'learning_rate': 6.960618697496597e-06, 'epoch': 0.39} 39%|███▉ | 13389/34278 [14:47:58<24:32:48, 4.23s/it] 39%|███▉ | 13390/34278 [14:48:02<23:14:12, 4.00s/it] {'loss': 0.1336, 'grad_norm': 0.7699719287102674, 'learning_rate': 6.960184090668245e-06, 'epoch': 0.39} 39%|███▉ | 13390/34278 [14:48:02<23:14:12, 4.00s/it] 39%|███▉ | 13391/34278 [14:48:06<23:29:20, 4.05s/it] {'loss': 0.1605, 'grad_norm': 0.8229926159042056, 'learning_rate': 6.959749466339839e-06, 'epoch': 0.39} 39%|███▉ | 13391/34278 [14:48:06<23:29:20, 4.05s/it] 39%|███▉ | 13392/34278 
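The traceback above ends with the dataset logging "Failed to fetch sample 3110972." and training then continues, which suggests `__getitem__` swallows the `PIL.UnidentifiedImageError` and retries with another sample. A minimal sketch of that pattern, using a hypothetical `RobustImageDataset` over raw image bytes (the actual `aguvis/dataset.py` implementation differs):

```python
import io
import random

from PIL import Image, UnidentifiedImageError


class RobustImageDataset:
    """Hypothetical sketch: when a sample's image bytes cannot be decoded,
    log the failure and retry with a different random sample instead of
    letting the exception kill the dataloader worker."""

    def __init__(self, records, max_retries=10):
        self.records = records  # each record: {"image_bytes": bytes, "label": ...}
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def _get_item(self, i):
        buff = io.BytesIO(self.records[i]["image_bytes"])
        img = Image.open(buff)  # raises UnidentifiedImageError on corrupt bytes
        img.load()  # force the decode now so corruption surfaces here, not later
        return {"image": img, "label": self.records[i]["label"]}

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except (UnidentifiedImageError, OSError) as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randint(0, len(self.records) - 1)
        raise RuntimeError("too many consecutive unreadable samples")
```

Retrying with a substitute sample keeps the global batch shape identical across ranks, which matters here: in distributed training a single rank raising mid-step would desynchronize the collective ops.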
[14:48:09<21:54:12, 3.78s/it] {'loss': 0.1386, 'grad_norm': 0.8933858508674338, 'learning_rate': 6.959314824515258e-06, 'epoch': 0.39} 39%|███▉ | 13392/34278 [14:48:09<21:54:12, 3.78s/it] 39%|███▉ | 13393/34278 [14:48:13<21:35:48, 3.72s/it] {'loss': 0.137, 'grad_norm': 0.7196626303364317, 'learning_rate': 6.95888016519838e-06, 'epoch': 0.39} 39%|███▉ | 13393/34278 [14:48:13<21:35:48, 3.72s/it] 39%|███▉ | 13394/34278 [14:48:16<20:34:45, 3.55s/it] {'loss': 0.1375, 'grad_norm': 0.8334043516529545, 'learning_rate': 6.958445488393088e-06, 'epoch': 0.39} 39%|███▉ | 13394/34278 [14:48:16<20:34:45, 3.55s/it] 39%|███▉ | 13395/34278 [14:48:22<25:00:46, 4.31s/it] {'loss': 0.1373, 'grad_norm': 0.7419972030408932, 'learning_rate': 6.958010794103263e-06, 'epoch': 0.39} 39%|███▉ | 13395/34278 [14:48:22<25:00:46, 4.31s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 39%|███▉ | 13396/34278 [14:48:26<24:34:15, 4.24s/it] {'loss': 0.1451, 'grad_norm': 0.8216999238492296, 'learning_rate': 6.957576082332784e-06, 'epoch': 0.39} 39%|███▉ | 13396/34278 [14:48:26<24:34:15, 4.24s/it] 39%|███▉ | 13397/34278 [14:48:29<22:50:12, 3.94s/it] {'loss': 0.1328, 'grad_norm': 0.7050367296930518, 'learning_rate': 6.9571413530855345e-06, 'epoch': 0.39} 39%|███▉ | 13397/34278 [14:48:29<22:50:12, 3.94s/it] 39%|███▉ | 13398/34278 [14:48:33<22:21:41, 3.86s/it] {'loss': 0.1341, 'grad_norm': 0.8815943297971746, 'learning_rate': 6.956706606365393e-06, 'epoch': 0.39} 39%|███▉ | 13398/34278 [14:48:33<22:21:41, 3.86s/it] 39%|███▉ | 13399/34278 [14:48:36<21:11:17, 3.65s/it] {'loss': 0.1364, 'grad_norm': 0.9252955919633908, 'learning_rate': 6.956271842176242e-06, 'epoch': 0.39} 39%|███▉ | 13399/34278 [14:48:36<21:11:17, 3.65s/it] 39%|███▉ | 13400/34278 [14:48:40<20:53:28, 3.60s/it] {'loss': 0.1286, 'grad_norm': 0.8930728103288988, 'learning_rate': 6.9558370605219634e-06, 'epoch': 0.39} 39%|███▉ | 13400/34278 [14:48:40<20:53:28, 3.60s/it] 39%|███▉ | 13401/34278 [14:48:43<19:59:00, 3.45s/it] {'loss': 0.1381, 'grad_norm': 0.9163816687984387, 'learning_rate': 6.955402261406439e-06, 'epoch': 0.39} 39%|███▉ | 13401/34278 [14:48:43<19:59:00, 3.45s/it] 39%|███▉ | 13402/34278 [14:48:46<19:05:11, 3.29s/it] {'loss': 0.1574, 'grad_norm': 0.7680791483130143, 'learning_rate': 6.954967444833549e-06, 'epoch': 0.39} 39%|███▉ | 13402/34278 [14:48:46<19:05:11, 3.29s/it] 39%|███▉ | 13403/34278 [14:48:49<18:52:58, 3.26s/it] {'loss': 0.1443, 'grad_norm': 1.0069978034553984, 'learning_rate': 6.954532610807176e-06, 'epoch': 0.39} 39%|███▉ | 13403/34278 [14:48:49<18:52:58, 3.26s/it] 39%|███▉ | 13404/34278 [14:48:55<23:25:57, 4.04s/it] {'loss': 0.1377, 'grad_norm': 0.8988432582345715, 'learning_rate': 6.954097759331204e-06, 'epoch': 0.39} 39%|███▉ | 13404/34278 [14:48:55<23:25:57, 4.04s/it] 39%|███▉ | 13405/34278 [14:48:58<22:29:33, 3.88s/it] 
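The recurring `torch/utils/checkpoint.py:87` UserWarning ("None of the inputs have requires_grad=True. Gradients will be None") comes from reentrant activation checkpointing when the checkpointed block receives only tensors with `requires_grad=False`, typically frozen or detached embedding outputs feeding the first checkpointed layer. A minimal repro and one common remedy, assuming recent PyTorch (the layer and shapes are illustrative, not the training model; HF Trainer applies the same fix via `model.enable_input_require_grads()`):

```python
import warnings

import torch
import torch.utils.checkpoint as cp

layer = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # requires_grad=False -> reentrant checkpoint warns

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    cp.checkpoint(layer, x, use_reentrant=True)  # emits the UserWarning

# Remedy: make the checkpointed input require grad so the recomputation
# graph is built and gradients flow to the layer's parameters.
x.requires_grad_(True)
out = cp.checkpoint(layer, x, use_reentrant=True)
out.sum().backward()
```

The non-reentrant path (`use_reentrant=False`) avoids this failure mode entirely and is the recommended default in newer PyTorch releases.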
{'loss': 0.155, 'grad_norm': 0.7066239225388304, 'learning_rate': 6.953662890409512e-06, 'epoch': 0.39} 39%|███▉ | 13405/34278 [14:48:58<22:29:33, 3.88s/it] 39%|███▉ | 13406/34278 [14:49:01<21:23:30, 3.69s/it] {'loss': 0.1424, 'grad_norm': 1.1597307283036862, 'learning_rate': 6.9532280040459855e-06, 'epoch': 0.39} 39%|███▉ | 13406/34278 [14:49:02<21:23:30, 3.69s/it] 39%|███▉ | 13407/34278 [14:49:07<24:57:40, 4.31s/it] {'loss': 0.1663, 'grad_norm': 0.8405187499121104, 'learning_rate': 6.952793100244506e-06, 'epoch': 0.39} 39%|███▉ | 13407/34278 [14:49:07<24:57:40, 4.31s/it] 39%|███▉ | 13408/34278 [14:49:13<27:39:08, 4.77s/it] {'loss': 0.1335, 'grad_norm': 0.7460433118762413, 'learning_rate': 6.952358179008954e-06, 'epoch': 0.39} 39%|███▉ | 13408/34278 [14:49:13<27:39:08, 4.77s/it] 39%|███▉ | 13409/34278 [14:49:16<24:28:16, 4.22s/it] {'loss': 0.151, 'grad_norm': 1.059038470736294, 'learning_rate': 6.951923240343217e-06, 'epoch': 0.39} 39%|███▉ | 13409/34278 [14:49:16<24:28:16, 4.22s/it] 39%|███▉ | 13410/34278 [14:49:19<22:46:21, 3.93s/it] {'loss': 0.1437, 'grad_norm': 0.7679939106601765, 'learning_rate': 6.951488284251173e-06, 'epoch': 0.39} 39%|███▉ | 13410/34278 [14:49:19<22:46:21, 3.93s/it] 39%|███▉ | 13411/34278 [14:49:25<25:30:54, 4.40s/it] {'loss': 0.144, 'grad_norm': 0.8507353232572379, 'learning_rate': 6.9510533107367066e-06, 'epoch': 0.39} 39%|███▉ | 13411/34278 [14:49:25<25:30:54, 4.40s/it] 39%|███▉ | 13412/34278 [14:49:28<23:12:08, 4.00s/it] {'loss': 0.1485, 'grad_norm': 1.205972558740319, 'learning_rate': 6.950618319803704e-06, 'epoch': 0.39} 39%|███▉ | 13412/34278 [14:49:28<23:12:08, 4.00s/it] 39%|███▉ | 13413/34278 [14:49:31<21:34:29, 3.72s/it] {'loss': 0.1662, 'grad_norm': 1.0747486516695628, 'learning_rate': 6.950183311456046e-06, 'epoch': 0.39} 39%|███▉ | 13413/34278 [14:49:31<21:34:29, 3.72s/it] 39%|███▉ | 13414/34278 [14:49:34<21:08:04, 3.65s/it] {'loss': 0.1401, 'grad_norm': 0.8034709340146459, 'learning_rate': 6.9497482856976175e-06, 'epoch': 
0.39} 39%|███▉ | 13414/34278 [14:49:34<21:08:04, 3.65s/it] 39%|███▉ | 13415/34278 [14:49:40<25:06:28, 4.33s/it] {'loss': 0.1371, 'grad_norm': 1.2788093437326604, 'learning_rate': 6.949313242532301e-06, 'epoch': 0.39} 39%|███▉ | 13415/34278 [14:49:40<25:06:28, 4.33s/it] 39%|███▉ | 13416/34278 [14:49:43<22:52:46, 3.95s/it] {'loss': 0.1456, 'grad_norm': 0.8976758945386955, 'learning_rate': 6.94887818196398e-06, 'epoch': 0.39} 39%|███▉ | 13416/34278 [14:49:43<22:52:46, 3.95s/it] 39%|███▉ | 13417/34278 [14:49:47<21:32:57, 3.72s/it] {'loss': 0.1163, 'grad_norm': 0.6312697081513999, 'learning_rate': 6.948443103996543e-06, 'epoch': 0.39} 39%|███▉ | 13417/34278 [14:49:47<21:32:57, 3.72s/it] 39%|███▉ | 13418/34278 [14:49:50<20:36:00, 3.56s/it] {'loss': 0.1408, 'grad_norm': 0.775990863607962, 'learning_rate': 6.948008008633868e-06, 'epoch': 0.39} 39%|███▉ | 13418/34278 [14:49:50<20:36:00, 3.56s/it] 39%|███▉ | 13419/34278 [14:49:53<19:42:54, 3.40s/it] {'loss': 0.1793, 'grad_norm': 0.9341190483712725, 'learning_rate': 6.947572895879844e-06, 'epoch': 0.39} 39%|███▉ | 13419/34278 [14:49:53<19:42:54, 3.40s/it] 39%|███▉ | 13420/34278 [14:49:57<20:27:57, 3.53s/it] {'loss': 0.1297, 'grad_norm': 0.6071239175759338, 'learning_rate': 6.947137765738354e-06, 'epoch': 0.39} 39%|███▉ | 13420/34278 [14:49:57<20:27:57, 3.53s/it] 39%|███▉ | 13421/34278 [14:50:00<19:39:22, 3.39s/it] {'loss': 0.1071, 'grad_norm': 0.6793513193876647, 'learning_rate': 6.946702618213284e-06, 'epoch': 0.39} 39%|███▉ | 13421/34278 [14:50:00<19:39:22, 3.39s/it] 39%|███▉ | 13422/34278 [14:50:04<21:05:06, 3.64s/it] {'loss': 0.1516, 'grad_norm': 0.7551884994473822, 'learning_rate': 6.946267453308518e-06, 'epoch': 0.39} 39%|███▉ | 13422/34278 [14:50:04<21:05:06, 3.64s/it] 39%|███▉ | 13423/34278 [14:50:07<20:31:14, 3.54s/it] {'loss': 0.1603, 'grad_norm': 0.780841912088871, 'learning_rate': 6.945832271027937e-06, 'epoch': 0.39} 39%|███▉ | 13423/34278 [14:50:07<20:31:14, 3.54s/it] 39%|███▉ | 13424/34278 [14:50:12<21:57:37, 
3.79s/it] {'loss': 0.1259, 'grad_norm': 0.7038816360485793, 'learning_rate': 6.945397071375433e-06, 'epoch': 0.39} 39%|███▉ | 13424/34278 [14:50:12<21:57:37, 3.79s/it] 39%|███▉ | 13425/34278 [14:50:16<22:56:20, 3.96s/it] {'loss': 0.1238, 'grad_norm': 0.7328245116045685, 'learning_rate': 6.944961854354888e-06, 'epoch': 0.39} 39%|███▉ | 13425/34278 [14:50:16<22:56:20, 3.96s/it] 39%|███▉ | 13426/34278 [14:50:22<26:26:42, 4.57s/it] {'loss': 0.1537, 'grad_norm': 1.2517861853783918, 'learning_rate': 6.944526619970187e-06, 'epoch': 0.39} 39%|███▉ | 13426/34278 [14:50:22<26:26:42, 4.57s/it] 39%|███▉ | 13427/34278 [14:50:26<25:09:38, 4.34s/it] {'loss': 0.155, 'grad_norm': 0.743241132460899, 'learning_rate': 6.944091368225218e-06, 'epoch': 0.39} 39%|███▉ | 13427/34278 [14:50:26<25:09:38, 4.34s/it] 39%|███▉ | 13428/34278 [14:50:29<22:47:06, 3.93s/it] {'loss': 0.1292, 'grad_norm': 0.6935536329792923, 'learning_rate': 6.9436560991238635e-06, 'epoch': 0.39} 39%|███▉ | 13428/34278 [14:50:29<22:47:06, 3.93s/it] 39%|███▉ | 13429/34278 [14:50:32<21:34:32, 3.73s/it] {'loss': 0.1563, 'grad_norm': 0.7756549210802175, 'learning_rate': 6.943220812670013e-06, 'epoch': 0.39} 39%|███▉ | 13429/34278 [14:50:32<21:34:32, 3.73s/it] 39%|███▉ | 13430/34278 [14:50:36<22:12:47, 3.84s/it] {'loss': 0.1534, 'grad_norm': 0.7414237234309615, 'learning_rate': 6.94278550886755e-06, 'epoch': 0.39} 39%|███▉ | 13430/34278 [14:50:36<22:12:47, 3.84s/it] 39%|███▉ | 13431/34278 [14:50:40<22:22:15, 3.86s/it] {'loss': 0.1469, 'grad_norm': 0.8426514748873748, 'learning_rate': 6.942350187720361e-06, 'epoch': 0.39} 39%|███▉ | 13431/34278 [14:50:40<22:22:15, 3.86s/it] 39%|███▉ | 13432/34278 [14:50:44<22:17:11, 3.85s/it] {'loss': 0.1314, 'grad_norm': 0.7615308497476291, 'learning_rate': 6.941914849232336e-06, 'epoch': 0.39} 39%|███▉ | 13432/34278 [14:50:44<22:17:11, 3.85s/it] 39%|███▉ | 13433/34278 [14:50:50<26:08:16, 4.51s/it] {'loss': 0.1398, 'grad_norm': 0.7505485574867814, 'learning_rate': 6.941479493407356e-06, 
'epoch': 0.39} 39%|███▉ | 13433/34278 [14:50:50<26:08:16, 4.51s/it] 39%|███▉ | 13434/34278 [14:50:53<23:45:48, 4.10s/it] {'loss': 0.1497, 'grad_norm': 0.9822685707588608, 'learning_rate': 6.9410441202493115e-06, 'epoch': 0.39} 39%|███▉ | 13434/34278 [14:50:53<23:45:48, 4.10s/it] 39%|███▉ | 13435/34278 [14:50:56<22:06:37, 3.82s/it] {'loss': 0.1599, 'grad_norm': 0.8013111594305649, 'learning_rate': 6.940608729762088e-06, 'epoch': 0.39} 39%|███▉ | 13435/34278 [14:50:56<22:06:37, 3.82s/it] 39%|███▉ | 13436/34278 [14:51:02<25:55:21, 4.48s/it] {'loss': 0.1313, 'grad_norm': 0.7633362671154748, 'learning_rate': 6.940173321949574e-06, 'epoch': 0.39} 39%|███▉ | 13436/34278 [14:51:02<25:55:21, 4.48s/it] 39%|███▉ | 13437/34278 [14:51:07<26:28:14, 4.57s/it] {'loss': 0.1291, 'grad_norm': 0.7355768933169572, 'learning_rate': 6.9397378968156555e-06, 'epoch': 0.39} 39%|███▉ | 13437/34278 [14:51:07<26:28:14, 4.57s/it] 39%|███▉ | 13438/34278 [14:51:11<24:45:21, 4.28s/it] {'loss': 0.1692, 'grad_norm': 0.8647746265161046, 'learning_rate': 6.9393024543642195e-06, 'epoch': 0.39} 39%|███▉ | 13438/34278 [14:51:11<24:45:21, 4.28s/it] 39%|███▉ | 13439/34278 [14:51:14<22:59:11, 3.97s/it] {'loss': 0.1305, 'grad_norm': 0.8840514013776822, 'learning_rate': 6.938866994599156e-06, 'epoch': 0.39} 39%|███▉ | 13439/34278 [14:51:14<22:59:11, 3.97s/it] 39%|███▉ | 13440/34278 [14:51:17<21:43:29, 3.75s/it] {'loss': 0.1401, 'grad_norm': 0.8111839257480463, 'learning_rate': 6.938431517524349e-06, 'epoch': 0.39} 39%|███▉ | 13440/34278 [14:51:17<21:43:29, 3.75s/it] 39%|███▉ | 13441/34278 [14:51:23<25:17:25, 4.37s/it] {'loss': 0.1444, 'grad_norm': 1.1505431095945406, 'learning_rate': 6.937996023143687e-06, 'epoch': 0.39} 39%|███▉ | 13441/34278 [14:51:23<25:17:25, 4.37s/it] 39%|███▉ | 13442/34278 [14:51:26<23:44:36, 4.10s/it] {'loss': 0.1443, 'grad_norm': 1.006634496209286, 'learning_rate': 6.937560511461062e-06, 'epoch': 0.39} 39%|███▉ | 13442/34278 [14:51:26<23:44:36, 4.10s/it] 39%|███▉ | 13443/34278 
[14:51:29<21:43:08, 3.75s/it] {'loss': 0.1428, 'grad_norm': 0.7314659622389533, 'learning_rate': 6.937124982480358e-06, 'epoch': 0.39} 39%|███▉ | 13443/34278 [14:51:29<21:43:08, 3.75s/it] 39%|███▉ | 13444/34278 [14:51:32<20:39:56, 3.57s/it] {'loss': 0.1324, 'grad_norm': 0.839361978663376, 'learning_rate': 6.936689436205464e-06, 'epoch': 0.39} 39%|███▉ | 13444/34278 [14:51:32<20:39:56, 3.57s/it] 39%|███▉ | 13445/34278 [14:51:36<20:23:11, 3.52s/it] {'loss': 0.1252, 'grad_norm': 0.7894591607237862, 'learning_rate': 6.936253872640269e-06, 'epoch': 0.39} 39%|███▉ | 13445/34278 [14:51:36<20:23:11, 3.52s/it] 39%|███▉ | 13446/34278 [14:51:39<19:47:42, 3.42s/it] {'loss': 0.1134, 'grad_norm': 0.8514628703439183, 'learning_rate': 6.935818291788663e-06, 'epoch': 0.39} 39%|███▉ | 13446/34278 [14:51:39<19:47:42, 3.42s/it] 39%|███▉ | 13447/34278 [14:51:42<19:32:12, 3.38s/it] {'loss': 0.1392, 'grad_norm': 0.7229729683223703, 'learning_rate': 6.935382693654532e-06, 'epoch': 0.39} 39%|███▉ | 13447/34278 [14:51:42<19:32:12, 3.38s/it] 39%|███▉ | 13448/34278 [14:51:46<20:18:03, 3.51s/it] {'loss': 0.1354, 'grad_norm': 0.866114985431533, 'learning_rate': 6.934947078241767e-06, 'epoch': 0.39} 39%|███▉ | 13448/34278 [14:51:46<20:18:03, 3.51s/it] 39%|███▉ | 13449/34278 [14:51:49<19:04:15, 3.30s/it] {'loss': 0.1407, 'grad_norm': 1.0060248452344418, 'learning_rate': 6.934511445554257e-06, 'epoch': 0.39} 39%|███▉ | 13449/34278 [14:51:49<19:04:15, 3.30s/it] 39%|███▉ | 13450/34278 [14:51:52<19:02:16, 3.29s/it] {'loss': 0.1301, 'grad_norm': 0.920134761062441, 'learning_rate': 6.934075795595889e-06, 'epoch': 0.39} 39%|███▉ | 13450/34278 [14:51:52<19:02:16, 3.29s/it] 39%|███▉ | 13451/34278 [14:51:58<23:54:06, 4.13s/it] {'loss': 0.1618, 'grad_norm': 1.0085656115031563, 'learning_rate': 6.933640128370556e-06, 'epoch': 0.39} 39%|███▉ | 13451/34278 [14:51:58<23:54:06, 4.13s/it] 39%|███▉ | 13452/34278 [14:52:02<23:50:11, 4.12s/it] {'loss': 0.1413, 'grad_norm': 0.9004872747952627, 'learning_rate': 
6.933204443882144e-06, 'epoch': 0.39} 39%|███▉ | 13452/34278 [14:52:02<23:50:11, 4.12s/it] 39%|███▉ | 13453/34278 [14:52:05<21:42:57, 3.75s/it] {'loss': 0.1173, 'grad_norm': 0.8948913061908531, 'learning_rate': 6.932768742134545e-06, 'epoch': 0.39} 39%|███▉ | 13453/34278 [14:52:05<21:42:57, 3.75s/it] 39%|███▉ | 13454/34278 [14:52:09<21:55:48, 3.79s/it] {'loss': 0.1371, 'grad_norm': 1.1737819694857237, 'learning_rate': 6.932333023131647e-06, 'epoch': 0.39} 39%|███▉ | 13454/34278 [14:52:09<21:55:48, 3.79s/it] 39%|███▉ | 13455/34278 [14:52:13<21:36:29, 3.74s/it] {'loss': 0.1329, 'grad_norm': 0.8046936114660276, 'learning_rate': 6.9318972868773425e-06, 'epoch': 0.39} 39%|███▉ | 13455/34278 [14:52:13<21:36:29, 3.74s/it] 39%|███▉ | 13456/34278 [14:52:17<21:37:36, 3.74s/it] {'loss': 0.1575, 'grad_norm': 0.7641640235500119, 'learning_rate': 6.931461533375518e-06, 'epoch': 0.39} 39%|███▉ | 13456/34278 [14:52:17<21:37:36, 3.74s/it] 39%|███▉ | 13457/34278 [14:52:21<23:36:19, 4.08s/it] {'loss': 0.1237, 'grad_norm': 0.8387902511728224, 'learning_rate': 6.931025762630069e-06, 'epoch': 0.39} 39%|███▉ | 13457/34278 [14:52:21<23:36:19, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 39%|███▉ | 13458/34278 [14:52:25<22:50:40, 3.95s/it] {'loss': 0.1409, 'grad_norm': 0.8401778315870334, 'learning_rate': 6.930589974644881e-06, 'epoch': 0.39} 39%|███▉ | 13458/34278 [14:52:25<22:50:40, 3.95s/it] 39%|███▉ | 13459/34278 [14:52:28<20:54:46, 3.62s/it] {'loss': 0.1375, 'grad_norm': 1.0466468885461653, 'learning_rate': 6.930154169423849e-06, 'epoch': 0.39} 39%|███▉ | 13459/34278 [14:52:28<20:54:46, 3.62s/it] 39%|███▉ | 13460/34278 [14:52:31<20:07:11, 3.48s/it] {'loss': 0.1496, 'grad_norm': 1.0032165642616175, 'learning_rate': 6.929718346970858e-06, 'epoch': 0.39} 39%|███▉ | 13460/34278 [14:52:31<20:07:11, 3.48s/it] 39%|███▉ | 13461/34278 [14:52:35<20:11:49, 3.49s/it] {'loss': 0.1302, 'grad_norm': 1.0900803003505035, 'learning_rate': 6.929282507289804e-06, 'epoch': 0.39} 39%|███▉ | 13461/34278 [14:52:35<20:11:49, 3.49s/it] 39%|███▉ | 13462/34278 [14:52:39<21:38:01, 3.74s/it] {'loss': 0.1875, 'grad_norm': 0.8295190180096297, 'learning_rate': 6.928846650384575e-06, 'epoch': 0.39} 39%|███▉ | 13462/34278 [14:52:39<21:38:01, 3.74s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 39%|███▉ | 13463/34278 [14:52:42<21:01:39, 3.64s/it] {'loss': 0.1562, 'grad_norm': 1.060580655896145, 'learning_rate': 6.928410776259065e-06, 'epoch': 0.39} 39%|███▉ | 13463/34278 [14:52:42<21:01:39, 3.64s/it] 39%|███▉ | 13464/34278 [14:52:47<23:16:45, 4.03s/it] {'loss': 0.1248, 'grad_norm': 0.9378510304402614, 'learning_rate': 6.927974884917163e-06, 'epoch': 0.39} 39%|███▉ | 13464/34278 [14:52:47<23:16:45, 4.03s/it] 39%|███▉ | 13465/34278 [14:52:51<22:56:37, 3.97s/it] {'loss': 0.1579, 'grad_norm': 0.7853299827127223, 'learning_rate': 6.927538976362762e-06, 'epoch': 0.39} 39%|███▉ | 13465/34278 [14:52:51<22:56:37, 3.97s/it] 39%|███▉ | 13466/34278 [14:52:54<20:52:58, 3.61s/it] {'loss': 0.1588, 'grad_norm': 1.0478984293700828, 'learning_rate': 6.9271030505997535e-06, 'epoch': 0.39} 39%|███▉ | 13466/34278 [14:52:54<20:52:58, 3.61s/it] 39%|███▉ | 13467/34278 [14:52:57<19:32:10, 3.38s/it] {'loss': 0.1735, 'grad_norm': 1.0617111371064418, 'learning_rate': 6.92666710763203e-06, 'epoch': 0.39} 39%|███▉ | 13467/34278 [14:52:57<19:32:10, 3.38s/it] 39%|███▉ | 13468/34278 [14:53:02<23:23:21, 4.05s/it] {'loss': 0.1472, 'grad_norm': 0.8479044160090914, 'learning_rate': 6.926231147463481e-06, 'epoch': 0.39} 39%|███▉ | 13468/34278 [14:53:02<23:23:21, 4.05s/it] 39%|███▉ | 13469/34278 [14:53:05<21:40:50, 3.75s/it] {'loss': 0.1569, 'grad_norm': 1.0243936652671943, 'learning_rate': 6.925795170098e-06, 'epoch': 0.39} 39%|███▉ | 13469/34278 [14:53:05<21:40:50, 3.75s/it] 39%|███▉ | 13470/34278 [14:53:11<25:48:49, 4.47s/it] {'loss': 0.1321, 'grad_norm': 1.0772117900981126, 'learning_rate': 6.92535917553948e-06, 'epoch': 0.39} 39%|███▉ | 13470/34278 [14:53:11<25:48:49, 4.47s/it] 39%|███▉ | 13471/34278 [14:53:14<22:59:27, 3.98s/it] {'loss': 0.1268, 'grad_norm': 0.8288398664135171, 'learning_rate': 6.924923163791811e-06, 'epoch': 0.39} 39%|███▉ | 13471/34278 [14:53:14<22:59:27, 3.98s/it] 39%|███▉ | 13472/34278 [14:53:18<22:23:18, 3.87s/it] {'loss': 
0.1864, 'grad_norm': 1.0995106201194746, 'learning_rate': 6.92448713485889e-06, 'epoch': 0.39} 39%|███▉ | 13472/34278 [14:53:18<22:23:18, 3.87s/it] 39%|███▉ | 13473/34278 [14:53:23<23:48:20, 4.12s/it] {'loss': 0.1525, 'grad_norm': 1.0389480358599599, 'learning_rate': 6.924051088744606e-06, 'epoch': 0.39} 39%|███▉ | 13473/34278 [14:53:23<23:48:20, 4.12s/it] 39%|███▉ | 13474/34278 [14:53:27<23:39:40, 4.09s/it] {'loss': 0.1595, 'grad_norm': 0.8380733324749676, 'learning_rate': 6.923615025452854e-06, 'epoch': 0.39} 39%|███▉ | 13474/34278 [14:53:27<23:39:40, 4.09s/it] 39%|███▉ | 13475/34278 [14:53:30<22:26:41, 3.88s/it] {'loss': 0.1691, 'grad_norm': 0.9623144556101351, 'learning_rate': 6.923178944987525e-06, 'epoch': 0.39} 39%|███▉ | 13475/34278 [14:53:30<22:26:41, 3.88s/it] 39%|███▉ | 13476/34278 [14:53:34<23:20:17, 4.04s/it] {'loss': 0.1632, 'grad_norm': 0.9322633572796867, 'learning_rate': 6.922742847352515e-06, 'epoch': 0.39} 39%|███▉ | 13476/34278 [14:53:34<23:20:17, 4.04s/it] 39%|███▉ | 13477/34278 [14:53:39<23:35:12, 4.08s/it] {'loss': 0.1625, 'grad_norm': 0.8082087471222191, 'learning_rate': 6.922306732551716e-06, 'epoch': 0.39} 39%|███▉ | 13477/34278 [14:53:39<23:35:12, 4.08s/it] 39%|███▉ | 13478/34278 [14:53:45<26:53:39, 4.65s/it] {'loss': 0.1624, 'grad_norm': 0.8813435088048738, 'learning_rate': 6.92187060058902e-06, 'epoch': 0.39} 39%|███▉ | 13478/34278 [14:53:45<26:53:39, 4.65s/it] 39%|███▉ | 13479/34278 [14:53:51<29:18:01, 5.07s/it] {'loss': 0.1414, 'grad_norm': 0.7664965159860045, 'learning_rate': 6.921434451468323e-06, 'epoch': 0.39} 39%|███▉ | 13479/34278 [14:53:51<29:18:01, 5.07s/it] 39%|███▉ | 13480/34278 [14:53:54<26:18:29, 4.55s/it] {'loss': 0.1285, 'grad_norm': 0.7388090242828428, 'learning_rate': 6.9209982851935165e-06, 'epoch': 0.39} 39%|███▉ | 13480/34278 [14:53:54<26:18:29, 4.55s/it] 39%|███▉ | 13481/34278 [14:53:57<22:56:37, 3.97s/it] {'loss': 0.124, 'grad_norm': 0.7635029803592907, 'learning_rate': 6.920562101768498e-06, 'epoch': 0.39} 
39%|███▉ | 13481/34278 [14:53:57<22:56:37, 3.97s/it] 39%|███▉ | 13482/34278 [14:54:00<21:34:43, 3.74s/it] {'loss': 0.1263, 'grad_norm': 0.6998395776978557, 'learning_rate': 6.920125901197159e-06, 'epoch': 0.39} 39%|███▉ | 13482/34278 [14:54:00<21:34:43, 3.74s/it] 39%|███▉ | 13483/34278 [14:54:04<21:56:08, 3.80s/it] {'loss': 0.1464, 'grad_norm': 0.6459048921764976, 'learning_rate': 6.919689683483392e-06, 'epoch': 0.39} 39%|███▉ | 13483/34278 [14:54:04<21:56:08, 3.80s/it] 39%|███▉ | 13484/34278 [14:54:07<20:48:51, 3.60s/it] {'loss': 0.1255, 'grad_norm': 0.8050426883242157, 'learning_rate': 6.919253448631097e-06, 'epoch': 0.39} 39%|███▉ | 13484/34278 [14:54:07<20:48:51, 3.60s/it] 39%|███▉ | 13485/34278 [14:54:10<20:11:35, 3.50s/it] {'loss': 0.1559, 'grad_norm': 0.7900914807108079, 'learning_rate': 6.918817196644163e-06, 'epoch': 0.39} 39%|███▉ | 13485/34278 [14:54:10<20:11:35, 3.50s/it] 39%|███▉ | 13486/34278 [14:54:16<24:26:52, 4.23s/it] {'loss': 0.1344, 'grad_norm': 0.7080388530461845, 'learning_rate': 6.918380927526488e-06, 'epoch': 0.39} 39%|███▉ | 13486/34278 [14:54:16<24:26:52, 4.23s/it] 39%|███▉ | 13487/34278 [14:54:20<23:04:47, 4.00s/it] {'loss': 0.14, 'grad_norm': 0.7611724563438523, 'learning_rate': 6.917944641281966e-06, 'epoch': 0.39} 39%|███▉ | 13487/34278 [14:54:20<23:04:47, 4.00s/it] 39%|███▉ | 13488/34278 [14:54:25<25:21:10, 4.39s/it] {'loss': 0.1348, 'grad_norm': 0.8121858794534704, 'learning_rate': 6.917508337914493e-06, 'epoch': 0.39} 39%|███▉ | 13488/34278 [14:54:25<25:21:10, 4.39s/it] 39%|███▉ | 13489/34278 [14:54:28<23:26:01, 4.06s/it] {'loss': 0.1295, 'grad_norm': 0.873557338320025, 'learning_rate': 6.9170720174279615e-06, 'epoch': 0.39} 39%|███▉ | 13489/34278 [14:54:28<23:26:01, 4.06s/it] 39%|███▉ | 13490/34278 [14:54:34<26:40:59, 4.62s/it] {'loss': 0.154, 'grad_norm': 0.6910521841080438, 'learning_rate': 6.91663567982627e-06, 'epoch': 0.39} 39%|███▉ | 13490/34278 [14:54:34<26:40:59, 4.62s/it] 39%|███▉ | 13491/34278 [14:54:37<24:16:51, 
4.21s/it] {'loss': 0.1504, 'grad_norm': 0.9495453741411941, 'learning_rate': 6.9161993251133135e-06, 'epoch': 0.39} 39%|███▉ | 13491/34278 [14:54:37<24:16:51, 4.21s/it] 39%|███▉ | 13492/34278 [14:54:40<22:20:57, 3.87s/it] {'loss': 0.1413, 'grad_norm': 0.99695846990306, 'learning_rate': 6.915762953292985e-06, 'epoch': 0.39} 39%|███▉ | 13492/34278 [14:54:40<22:20:57, 3.87s/it] 39%|███▉ | 13493/34278 [14:54:45<24:07:03, 4.18s/it] {'loss': 0.1543, 'grad_norm': 0.9585221962636161, 'learning_rate': 6.915326564369183e-06, 'epoch': 0.39} 39%|███▉ | 13493/34278 [14:54:45<24:07:03, 4.18s/it] 39%|███▉ | 13494/34278 [14:54:48<21:50:52, 3.78s/it] {'loss': 0.143, 'grad_norm': 0.872098656123676, 'learning_rate': 6.914890158345802e-06, 'epoch': 0.39} 39%|███▉ | 13494/34278 [14:54:48<21:50:52, 3.78s/it] 39%|███▉ | 13495/34278 [14:54:52<21:32:01, 3.73s/it] {'loss': 0.139, 'grad_norm': 0.9465268504551351, 'learning_rate': 6.91445373522674e-06, 'epoch': 0.39} 39%|███▉ | 13495/34278 [14:54:52<21:32:01, 3.73s/it] 39%|███▉ | 13496/34278 [14:54:55<20:25:55, 3.54s/it] {'loss': 0.1464, 'grad_norm': 0.8381989221421502, 'learning_rate': 6.91401729501589e-06, 'epoch': 0.39} 39%|███▉ | 13496/34278 [14:54:55<20:25:55, 3.54s/it] 39%|███▉ | 13497/34278 [14:54:58<20:22:20, 3.53s/it] {'loss': 0.1449, 'grad_norm': 1.0099884439074915, 'learning_rate': 6.913580837717153e-06, 'epoch': 0.39} 39%|███▉ | 13497/34278 [14:54:58<20:22:20, 3.53s/it] 39%|███▉ | 13498/34278 [14:55:04<24:22:14, 4.22s/it] {'loss': 0.1398, 'grad_norm': 0.8391478691306493, 'learning_rate': 6.9131443633344205e-06, 'epoch': 0.39} 39%|███▉ | 13498/34278 [14:55:04<24:22:14, 4.22s/it] 39%|███▉ | 13499/34278 [14:55:09<24:54:13, 4.31s/it] {'loss': 0.145, 'grad_norm': 0.8009808303418667, 'learning_rate': 6.912707871871595e-06, 'epoch': 0.39} 39%|███▉ | 13499/34278 [14:55:09<24:54:13, 4.31s/it] 39%|███▉ | 13500/34278 [14:55:12<22:48:21, 3.95s/it] {'loss': 0.1391, 'grad_norm': 0.8945467445430153, 'learning_rate': 6.9122713633325674e-06, 
'epoch': 0.39} 39%|███▉ | 13500/34278 [14:55:12<22:48:21, 3.95s/it] 39%|███▉ | 13501/34278 [14:55:15<22:06:33, 3.83s/it] {'loss': 0.1238, 'grad_norm': 1.0227978992708744, 'learning_rate': 6.911834837721239e-06, 'epoch': 0.39} 39%|███▉ | 13501/34278 [14:55:15<22:06:33, 3.83s/it] 39%|███▉ | 13502/34278 [14:55:18<20:35:21, 3.57s/it] {'loss': 0.147, 'grad_norm': 0.8293197513229508, 'learning_rate': 6.911398295041506e-06, 'epoch': 0.39} 39%|███▉ | 13502/34278 [14:55:18<20:35:21, 3.57s/it] 39%|███▉ | 13503/34278 [14:55:22<20:09:24, 3.49s/it] {'loss': 0.1383, 'grad_norm': 1.8600840272680774, 'learning_rate': 6.910961735297265e-06, 'epoch': 0.39} 39%|███▉ | 13503/34278 [14:55:22<20:09:24, 3.49s/it] 39%|███▉ | 13504/34278 [14:55:25<19:47:31, 3.43s/it] {'loss': 0.134, 'grad_norm': 1.1716484242902854, 'learning_rate': 6.910525158492413e-06, 'epoch': 0.39} 39%|███▉ | 13504/34278 [14:55:25<19:47:31, 3.43s/it] 39%|███▉ | 13505/34278 [14:55:32<25:15:40, 4.38s/it] {'loss': 0.1427, 'grad_norm': 0.8756867062019763, 'learning_rate': 6.910088564630848e-06, 'epoch': 0.39} 39%|███▉ | 13505/34278 [14:55:32<25:15:40, 4.38s/it] 39%|███▉ | 13506/34278 [14:55:35<23:17:01, 4.04s/it] {'loss': 0.1647, 'grad_norm': 1.069655355328582, 'learning_rate': 6.909651953716469e-06, 'epoch': 0.39} 39%|███▉ | 13506/34278 [14:55:35<23:17:01, 4.04s/it] 39%|███▉ | 13507/34278 [14:55:38<21:38:03, 3.75s/it] {'loss': 0.1522, 'grad_norm': 0.9480922403444754, 'learning_rate': 6.9092153257531735e-06, 'epoch': 0.39} 39%|███▉ | 13507/34278 [14:55:38<21:38:03, 3.75s/it] 39%|███▉ | 13508/34278 [14:55:41<21:15:38, 3.69s/it] {'loss': 0.1612, 'grad_norm': 0.8298992396590289, 'learning_rate': 6.90877868074486e-06, 'epoch': 0.39} 39%|███▉ | 13508/34278 [14:55:41<21:15:38, 3.69s/it] 39%|███▉ | 13509/34278 [14:55:45<21:50:30, 3.79s/it] {'loss': 0.1712, 'grad_norm': 0.8143344578505003, 'learning_rate': 6.908342018695424e-06, 'epoch': 0.39} 39%|███▉ | 13509/34278 [14:55:45<21:50:30, 3.79s/it] 39%|███▉ | 13510/34278 
[14:55:49<20:50:53, 3.61s/it] {'loss': 0.1536, 'grad_norm': 0.9898841472787201, 'learning_rate': 6.907905339608768e-06, 'epoch': 0.39} 39%|███▉ | 13510/34278 [14:55:49<20:50:53, 3.61s/it] 39%|███▉ | 13511/34278 [14:55:52<20:18:47, 3.52s/it] {'loss': 0.1527, 'grad_norm': 0.9673297942709088, 'learning_rate': 6.907468643488788e-06, 'epoch': 0.39} 39%|███▉ | 13511/34278 [14:55:52<20:18:47, 3.52s/it] 39%|███▉ | 13512/34278 [14:55:56<20:35:12, 3.57s/it] {'loss': 0.1442, 'grad_norm': 0.7036049003008338, 'learning_rate': 6.907031930339384e-06, 'epoch': 0.39} 39%|███▉ | 13512/34278 [14:55:56<20:35:12, 3.57s/it] 39%|███▉ | 13513/34278 [14:56:00<22:36:04, 3.92s/it] {'loss': 0.1375, 'grad_norm': 0.9303468860530448, 'learning_rate': 6.906595200164452e-06, 'epoch': 0.39} 39%|███▉ | 13513/34278 [14:56:00<22:36:04, 3.92s/it] 39%|███▉ | 13514/34278 [14:56:04<22:03:44, 3.83s/it] {'loss': 0.1303, 'grad_norm': 0.8843318750722188, 'learning_rate': 6.906158452967895e-06, 'epoch': 0.39} 39%|███▉ | 13514/34278 [14:56:04<22:03:44, 3.83s/it] 39%|███▉ | 13515/34278 [14:56:07<21:00:15, 3.64s/it] {'loss': 0.1471, 'grad_norm': 0.8193348564637531, 'learning_rate': 6.905721688753611e-06, 'epoch': 0.39} 39%|███▉ | 13515/34278 [14:56:07<21:00:15, 3.64s/it] 39%|███▉ | 13516/34278 [14:56:10<20:18:06, 3.52s/it] {'loss': 0.1309, 'grad_norm': 0.8446213699654381, 'learning_rate': 6.905284907525496e-06, 'epoch': 0.39} 39%|███▉ | 13516/34278 [14:56:10<20:18:06, 3.52s/it] 39%|███▉ | 13517/34278 [14:56:16<23:05:11, 4.00s/it] {'loss': 0.1643, 'grad_norm': 0.8891430358286819, 'learning_rate': 6.9048481092874545e-06, 'epoch': 0.39} 39%|███▉ | 13517/34278 [14:56:16<23:05:11, 4.00s/it] 39%|███▉ | 13518/34278 [14:56:21<26:30:23, 4.60s/it] {'loss': 0.1563, 'grad_norm': 0.7375669001339739, 'learning_rate': 6.9044112940433825e-06, 'epoch': 0.39} 39%|███▉ | 13518/34278 [14:56:22<26:30:23, 4.60s/it] 39%|███▉ | 13519/34278 [14:56:25<24:08:52, 4.19s/it] {'loss': 0.1512, 'grad_norm': 0.7680674624667125, 'learning_rate': 
6.903974461797182e-06, 'epoch': 0.39} 39%|███▉ | 13519/34278 [14:56:25<24:08:52, 4.19s/it] 39%|███▉ | 13520/34278 [14:56:27<21:39:04, 3.75s/it] {'loss': 0.1507, 'grad_norm': 0.8104362183563902, 'learning_rate': 6.903537612552752e-06, 'epoch': 0.39} 39%|███▉ | 13520/34278 [14:56:27<21:39:04, 3.75s/it] 39%|███▉ | 13521/34278 [14:56:30<20:10:39, 3.50s/it] {'loss': 0.1476, 'grad_norm': 0.7865974776961054, 'learning_rate': 6.903100746313992e-06, 'epoch': 0.39} 39%|███▉ | 13521/34278 [14:56:30<20:10:39, 3.50s/it] 39%|███▉ | 13522/34278 [14:56:34<20:00:58, 3.47s/it] {'loss': 0.1507, 'grad_norm': 0.9050056481214198, 'learning_rate': 6.902663863084803e-06, 'epoch': 0.39} 39%|███▉ | 13522/34278 [14:56:34<20:00:58, 3.47s/it] 39%|███▉ | 13523/34278 [14:56:37<20:22:06, 3.53s/it] {'loss': 0.1403, 'grad_norm': 0.9189139522724291, 'learning_rate': 6.902226962869085e-06, 'epoch': 0.39} 39%|███▉ | 13523/34278 [14:56:37<20:22:06, 3.53s/it] 39%|███▉ | 13524/34278 [14:56:41<20:33:30, 3.57s/it] {'loss': 0.1543, 'grad_norm': 1.0383020721973253, 'learning_rate': 6.90179004567074e-06, 'epoch': 0.39} 39%|███▉ | 13524/34278 [14:56:41<20:33:30, 3.57s/it] 39%|███▉ | 13525/34278 [14:56:44<19:56:04, 3.46s/it] {'loss': 0.1621, 'grad_norm': 0.8674877597307317, 'learning_rate': 6.9013531114936664e-06, 'epoch': 0.39} 39%|███▉ | 13525/34278 [14:56:44<19:56:04, 3.46s/it] 39%|███▉ | 13526/34278 [14:56:47<18:54:09, 3.28s/it] {'loss': 0.1304, 'grad_norm': 0.9702152424831368, 'learning_rate': 6.900916160341766e-06, 'epoch': 0.39} 39%|███▉ | 13526/34278 [14:56:47<18:54:09, 3.28s/it] 39%|███▉ | 13527/34278 [14:56:50<18:46:27, 3.26s/it] {'loss': 0.1557, 'grad_norm': 0.8763961594611642, 'learning_rate': 6.90047919221894e-06, 'epoch': 0.39} 39%|███▉ | 13527/34278 [14:56:50<18:46:27, 3.26s/it] 39%|███▉ | 13528/34278 [14:56:56<23:13:23, 4.03s/it] {'loss': 0.1639, 'grad_norm': 0.8700884550279984, 'learning_rate': 6.90004220712909e-06, 'epoch': 0.39} 39%|███▉ | 13528/34278 [14:56:56<23:13:23, 4.03s/it] 39%|███▉ | 
13529/34278 [14:57:00<22:14:19, 3.86s/it] {'loss': 0.1496, 'grad_norm': 0.7770719129623878, 'learning_rate': 6.899605205076118e-06, 'epoch': 0.39} 39%|███▉ | 13529/34278 [14:57:00<22:14:19, 3.86s/it] 39%|███▉ | 13530/34278 [14:57:03<21:14:32, 3.69s/it] {'loss': 0.1551, 'grad_norm': 0.9202876285706476, 'learning_rate': 6.899168186063922e-06, 'epoch': 0.39} 39%|███▉ | 13530/34278 [14:57:03<21:14:32, 3.69s/it] 39%|███▉ | 13531/34278 [14:57:06<19:59:38, 3.47s/it] {'loss': 0.1592, 'grad_norm': 0.760391271866602, 'learning_rate': 6.898731150096405e-06, 'epoch': 0.39} 39%|███▉ | 13531/34278 [14:57:06<19:59:38, 3.47s/it] 39%|███▉ | 13532/34278 [14:57:09<18:59:21, 3.30s/it] {'loss': 0.1259, 'grad_norm': 0.6977262503265874, 'learning_rate': 6.898294097177472e-06, 'epoch': 0.39} 39%|███▉ | 13532/34278 [14:57:09<18:59:21, 3.30s/it] 39%|███▉ | 13533/34278 [14:57:12<18:36:41, 3.23s/it] {'loss': 0.1295, 'grad_norm': 0.8164748693757632, 'learning_rate': 6.897857027311021e-06, 'epoch': 0.39} 39%|███▉ | 13533/34278 [14:57:12<18:36:41, 3.23s/it] 39%|███▉ | 13534/34278 [14:57:15<18:43:38, 3.25s/it] {'loss': 0.1544, 'grad_norm': 0.7817836696599647, 'learning_rate': 6.897419940500957e-06, 'epoch': 0.39} 39%|███▉ | 13534/34278 [14:57:15<18:43:38, 3.25s/it] 39%|███▉ | 13535/34278 [14:57:18<17:49:42, 3.09s/it] {'loss': 0.1492, 'grad_norm': 0.8495968672593023, 'learning_rate': 6.8969828367511795e-06, 'epoch': 0.39} 39%|███▉ | 13535/34278 [14:57:18<17:49:42, 3.09s/it] 39%|███▉ | 13536/34278 [14:57:21<18:31:31, 3.22s/it] {'loss': 0.1255, 'grad_norm': 0.8039639482440903, 'learning_rate': 6.896545716065591e-06, 'epoch': 0.39} 39%|███▉ | 13536/34278 [14:57:21<18:31:31, 3.22s/it] 39%|███▉ | 13537/34278 [14:57:25<19:14:57, 3.34s/it] {'loss': 0.1574, 'grad_norm': 0.9095502768520879, 'learning_rate': 6.896108578448098e-06, 'epoch': 0.39} 39%|███▉ | 13537/34278 [14:57:25<19:14:57, 3.34s/it] 39%|███▉ | 13538/34278 [14:57:28<18:59:16, 3.30s/it] {'loss': 0.1813, 'grad_norm': 0.8898926453378159, 
'learning_rate': 6.8956714239025976e-06, 'epoch': 0.39} 39%|███▉ | 13538/34278 [14:57:28<18:59:16, 3.30s/it] 39%|███▉ | 13539/34278 [14:57:31<18:41:12, 3.24s/it] {'loss': 0.146, 'grad_norm': 0.9486733008022501, 'learning_rate': 6.895234252432996e-06, 'epoch': 0.39} 39%|███▉ | 13539/34278 [14:57:31<18:41:12, 3.24s/it] 40%|███▉ | 13540/34278 [14:57:35<18:45:48, 3.26s/it] {'loss': 0.1532, 'grad_norm': 1.0384175536627858, 'learning_rate': 6.894797064043196e-06, 'epoch': 0.4} 40%|███▉ | 13540/34278 [14:57:35<18:45:48, 3.26s/it] 40%|███▉ | 13541/34278 [14:57:40<21:54:32, 3.80s/it] {'loss': 0.1528, 'grad_norm': 0.7370307247044519, 'learning_rate': 6.894359858737099e-06, 'epoch': 0.4} 40%|███▉ | 13541/34278 [14:57:40<21:54:32, 3.80s/it] 40%|███▉ | 13542/34278 [14:57:46<25:44:22, 4.47s/it] {'loss': 0.1347, 'grad_norm': 0.8463856413184923, 'learning_rate': 6.893922636518612e-06, 'epoch': 0.4} 40%|███▉ | 13542/34278 [14:57:46<25:44:22, 4.47s/it] 40%|███▉ | 13543/34278 [14:57:51<26:22:16, 4.58s/it] {'loss': 0.1662, 'grad_norm': 0.9238022439981455, 'learning_rate': 6.893485397391633e-06, 'epoch': 0.4} 40%|███▉ | 13543/34278 [14:57:51<26:22:16, 4.58s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 40%|███▉ | 13544/34278 [14:57:54<24:09:40, 4.20s/it] {'loss': 0.1378, 'grad_norm': 0.7820990008330788, 'learning_rate': 6.89304814136007e-06, 'epoch': 0.4} 40%|███▉ | 13544/34278 [14:57:54<24:09:40, 4.20s/it] 40%|███▉ | 13545/34278 [14:57:57<22:05:42, 3.84s/it] {'loss': 0.1963, 'grad_norm': 1.0888335436583485, 'learning_rate': 6.892610868427824e-06, 'epoch': 0.4} 40%|███▉ | 13545/34278 [14:57:57<22:05:42, 3.84s/it] 40%|███▉ | 13546/34278 [14:58:01<22:25:01, 3.89s/it] {'loss': 0.1188, 'grad_norm': 0.9156504059078454, 'learning_rate': 6.8921735785988e-06, 'epoch': 0.4} 40%|███▉ | 13546/34278 [14:58:01<22:25:01, 3.89s/it] 40%|███▉ | 13547/34278 [14:58:04<21:43:45, 3.77s/it] {'loss': 0.1421, 'grad_norm': 0.8877364499479483, 'learning_rate': 6.891736271876903e-06, 'epoch': 0.4} 40%|███▉ | 13547/34278 [14:58:04<21:43:45, 3.77s/it] 40%|███▉ | 13548/34278 [14:58:09<23:51:15, 4.14s/it] {'loss': 0.1265, 'grad_norm': 0.758630855278139, 'learning_rate': 6.8912989482660365e-06, 'epoch': 0.4} 40%|███▉ | 13548/34278 [14:58:09<23:51:15, 4.14s/it] 40%|███▉ | 13549/34278 [14:58:13<22:04:31, 3.83s/it] {'loss': 0.1463, 'grad_norm': 0.8116769721432958, 'learning_rate': 6.890861607770103e-06, 'epoch': 0.4} 40%|███▉ | 13549/34278 [14:58:13<22:04:31, 3.83s/it] 40%|███▉ | 13550/34278 [14:58:15<20:19:47, 3.53s/it] {'loss': 0.1325, 'grad_norm': 0.7628585711814966, 'learning_rate': 6.890424250393009e-06, 'epoch': 0.4} 40%|███▉ | 13550/34278 [14:58:15<20:19:47, 3.53s/it] 40%|███▉ | 13551/34278 [14:58:19<20:00:38, 3.48s/it] {'loss': 0.1433, 'grad_norm': 0.7730110063315181, 'learning_rate': 6.889986876138659e-06, 'epoch': 0.4} 40%|███▉ | 13551/34278 [14:58:19<20:00:38, 3.48s/it] 40%|███▉ | 13552/34278 [14:58:22<19:54:12, 3.46s/it] {'loss': 0.1457, 'grad_norm': 0.7051345359852466, 'learning_rate': 6.889549485010957e-06, 'epoch': 0.4} 40%|███▉ | 13552/34278 [14:58:22<19:54:12, 3.46s/it] 40%|███▉ | 13553/34278 [14:58:26<20:59:43, 3.65s/it] {'loss': 0.1305, 
'grad_norm': 1.1425198602171238, 'learning_rate': 6.889112077013808e-06, 'epoch': 0.4} 40%|███▉ | 13553/34278 [14:58:26<20:59:43, 3.65s/it] 40%|███▉ | 13554/34278 [14:58:29<20:19:19, 3.53s/it] {'loss': 0.1214, 'grad_norm': 0.6616249465596958, 'learning_rate': 6.888674652151117e-06, 'epoch': 0.4} 40%|███▉ | 13554/34278 [14:58:29<20:19:19, 3.53s/it] 40%|███▉ | 13555/34278 [14:58:32<19:26:35, 3.38s/it] {'loss': 0.1596, 'grad_norm': 0.7753742877667316, 'learning_rate': 6.88823721042679e-06, 'epoch': 0.4} 40%|███▉ | 13555/34278 [14:58:32<19:26:35, 3.38s/it] 40%|███▉ | 13556/34278 [14:58:36<19:54:39, 3.46s/it] {'loss': 0.1646, 'grad_norm': 0.9304339527547232, 'learning_rate': 6.887799751844732e-06, 'epoch': 0.4} 40%|███▉ | 13556/34278 [14:58:36<19:54:39, 3.46s/it] 40%|███▉ | 13557/34278 [14:58:42<23:58:41, 4.17s/it] {'loss': 0.1346, 'grad_norm': 0.7840134291357677, 'learning_rate': 6.8873622764088495e-06, 'epoch': 0.4} 40%|███▉ | 13557/34278 [14:58:42<23:58:41, 4.17s/it] 40%|███▉ | 13558/34278 [14:58:48<27:07:04, 4.71s/it] {'loss': 0.1479, 'grad_norm': 0.7153077068367624, 'learning_rate': 6.886924784123046e-06, 'epoch': 0.4} 40%|███▉ | 13558/34278 [14:58:48<27:07:04, 4.71s/it] 40%|███▉ | 13559/34278 [14:58:51<24:08:22, 4.19s/it] {'loss': 0.128, 'grad_norm': 0.7315921902262976, 'learning_rate': 6.8864872749912296e-06, 'epoch': 0.4} 40%|███▉ | 13559/34278 [14:58:51<24:08:22, 4.19s/it] 40%|███▉ | 13560/34278 [14:58:54<21:50:54, 3.80s/it] {'loss': 0.1436, 'grad_norm': 0.8785516750944411, 'learning_rate': 6.886049749017304e-06, 'epoch': 0.4} 40%|███▉ | 13560/34278 [14:58:54<21:50:54, 3.80s/it] 40%|███▉ | 13561/34278 [14:58:57<20:52:48, 3.63s/it] {'loss': 0.1454, 'grad_norm': 0.8587352973909724, 'learning_rate': 6.885612206205175e-06, 'epoch': 0.4} 40%|███▉ | 13561/34278 [14:58:57<20:52:48, 3.63s/it] 40%|███▉ | 13562/34278 [14:59:01<21:49:17, 3.79s/it] {'loss': 0.1262, 'grad_norm': 0.7551958300928429, 'learning_rate': 6.885174646558754e-06, 'epoch': 0.4} 40%|███▉ | 13562/34278 
[14:59:01<21:49:17, 3.79s/it] 40%|███▉ | 13563/34278 [14:59:04<20:37:29, 3.58s/it] {'loss': 0.1461, 'grad_norm': 0.6814931549444815, 'learning_rate': 6.8847370700819415e-06, 'epoch': 0.4} 40%|███▉ | 13563/34278 [14:59:04<20:37:29, 3.58s/it] 40%|███▉ | 13564/34278 [14:59:08<21:00:55, 3.65s/it] {'loss': 0.1431, 'grad_norm': 0.7909599958417884, 'learning_rate': 6.8842994767786466e-06, 'epoch': 0.4} 40%|███▉ | 13564/34278 [14:59:08<21:00:55, 3.65s/it] 40%|███▉ | 13565/34278 [14:59:11<20:28:18, 3.56s/it] {'loss': 0.1372, 'grad_norm': 0.7321566528171322, 'learning_rate': 6.883861866652776e-06, 'epoch': 0.4} 40%|███▉ | 13565/34278 [14:59:11<20:28:18, 3.56s/it] 40%|███▉ | 13566/34278 [14:59:15<20:15:58, 3.52s/it] {'loss': 0.1453, 'grad_norm': 0.8094397867003983, 'learning_rate': 6.883424239708236e-06, 'epoch': 0.4} 40%|███▉ | 13566/34278 [14:59:15<20:15:58, 3.52s/it] 40%|███▉ | 13567/34278 [14:59:18<19:39:57, 3.42s/it] {'loss': 0.14, 'grad_norm': 0.8227919385291982, 'learning_rate': 6.882986595948935e-06, 'epoch': 0.4} 40%|███▉ | 13567/34278 [14:59:18<19:39:57, 3.42s/it] 40%|███▉ | 13568/34278 [14:59:21<19:15:09, 3.35s/it] {'loss': 0.1081, 'grad_norm': 0.7330943694807599, 'learning_rate': 6.882548935378778e-06, 'epoch': 0.4} 40%|███▉ | 13568/34278 [14:59:21<19:15:09, 3.35s/it] 40%|███▉ | 13569/34278 [14:59:25<20:08:10, 3.50s/it] {'loss': 0.1501, 'grad_norm': 0.9788886462500496, 'learning_rate': 6.8821112580016734e-06, 'epoch': 0.4} 40%|███▉ | 13569/34278 [14:59:25<20:08:10, 3.50s/it] 40%|███▉ | 13570/34278 [14:59:31<24:19:32, 4.23s/it] {'loss': 0.1696, 'grad_norm': 0.85249551709532, 'learning_rate': 6.881673563821529e-06, 'epoch': 0.4} 40%|███▉ | 13570/34278 [14:59:31<24:19:32, 4.23s/it] 40%|███▉ | 13571/34278 [14:59:34<22:32:24, 3.92s/it] {'loss': 0.1185, 'grad_norm': 0.8054367868561896, 'learning_rate': 6.881235852842253e-06, 'epoch': 0.4} 40%|███▉ | 13571/34278 [14:59:34<22:32:24, 3.92s/it] 40%|███▉ | 13572/34278 [14:59:38<22:04:12, 3.84s/it] {'loss': 0.1567, 
'grad_norm': 0.9278896903449173, 'learning_rate': 6.880798125067752e-06, 'epoch': 0.4} 40%|███▉ | 13572/34278 [14:59:38<22:04:12, 3.84s/it] 40%|███▉ | 13573/34278 [14:59:42<22:15:57, 3.87s/it] {'loss': 0.1617, 'grad_norm': 0.8227234897426949, 'learning_rate': 6.880360380501934e-06, 'epoch': 0.4} 40%|███▉ | 13573/34278 [14:59:42<22:15:57, 3.87s/it] 40%|███▉ | 13574/34278 [14:59:45<20:32:43, 3.57s/it] {'loss': 0.1452, 'grad_norm': 1.1067236153092375, 'learning_rate': 6.879922619148709e-06, 'epoch': 0.4} 40%|███▉ | 13574/34278 [14:59:45<20:32:43, 3.57s/it] 40%|███▉ | 13575/34278 [14:59:48<19:51:01, 3.45s/it] {'loss': 0.1243, 'grad_norm': 0.8642894651343764, 'learning_rate': 6.879484841011981e-06, 'epoch': 0.4} 40%|███▉ | 13575/34278 [14:59:48<19:51:01, 3.45s/it] 40%|███▉ | 13576/34278 [14:59:51<19:43:02, 3.43s/it] {'loss': 0.1472, 'grad_norm': 0.9586605647173144, 'learning_rate': 6.8790470460956625e-06, 'epoch': 0.4} 40%|███▉ | 13576/34278 [14:59:51<19:43:02, 3.43s/it] 40%|███▉ | 13577/34278 [14:59:54<19:17:20, 3.35s/it] {'loss': 0.1367, 'grad_norm': 0.9086036286416684, 'learning_rate': 6.878609234403661e-06, 'epoch': 0.4} 40%|███▉ | 13577/34278 [14:59:54<19:17:20, 3.35s/it] 40%|███▉ | 13578/34278 [14:59:57<18:38:01, 3.24s/it] {'loss': 0.1354, 'grad_norm': 0.8747271647171905, 'learning_rate': 6.878171405939883e-06, 'epoch': 0.4} 40%|███▉ | 13578/34278 [14:59:57<18:38:01, 3.24s/it] 40%|███▉ | 13579/34278 [15:00:03<23:17:37, 4.05s/it] {'loss': 0.1206, 'grad_norm': 0.8553925512113649, 'learning_rate': 6.8777335607082415e-06, 'epoch': 0.4} 40%|███▉ | 13579/34278 [15:00:03<23:17:37, 4.05s/it] 40%|███▉ | 13580/34278 [15:00:07<21:47:56, 3.79s/it] {'loss': 0.1282, 'grad_norm': 0.7722446172228927, 'learning_rate': 6.8772956987126415e-06, 'epoch': 0.4} 40%|███▉ | 13580/34278 [15:00:07<21:47:56, 3.79s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f96a23f16c0> Failed to fetch sample 3646662. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f96a23f16c0> 40%|███▉ | 13581/34278 [15:00:10<20:51:47, 3.63s/it] {'loss': 0.1769, 'grad_norm': 0.9474777884206562, 'learning_rate': 6.876857819956993e-06, 'epoch': 0.4} 40%|███▉ | 13581/34278 [15:00:10<20:51:47, 3.63s/it] 40%|███▉ | 13582/34278 [15:00:13<19:39:22, 3.42s/it] {'loss': 0.1277, 'grad_norm': 0.7971638941395826, 'learning_rate': 6.876419924445208e-06, 'epoch': 0.4} 40%|███▉ | 13582/34278 [15:00:13<19:39:22, 3.42s/it] 40%|███▉ | 13583/34278 [15:00:16<18:54:35, 3.29s/it] {'loss': 0.151, 'grad_norm': 0.7513084934398179, 'learning_rate': 6.875982012181192e-06, 'epoch': 0.4} 40%|███▉ | 13583/34278 [15:00:16<18:54:35, 3.29s/it] 40%|███▉ | 13584/34278 [15:00:19<18:15:42, 3.18s/it] {'loss': 0.1452, 'grad_norm': 0.8124671974017118, 'learning_rate': 6.875544083168857e-06, 'epoch': 0.4} 40%|███▉ | 13584/34278 [15:00:19<18:15:42, 3.18s/it] 40%|███▉ | 13585/34278 [15:00:21<17:30:52, 3.05s/it] {'loss': 0.1395, 'grad_norm': 0.9957779640245273, 'learning_rate': 6.875106137412112e-06, 'epoch': 0.4} 40%|███▉ | 13585/34278 [15:00:21<17:30:52, 3.05s/it] 40%|███▉ | 13586/34278 [15:00:25<17:49:57, 3.10s/it] {'loss': 0.1231, 'grad_norm': 0.8307337501671527, 'learning_rate': 6.874668174914867e-06, 'epoch': 0.4} 40%|███▉ | 13586/34278 [15:00:25<17:49:57, 3.10s/it] 40%|███▉ | 13587/34278 [15:00:28<17:47:46, 3.10s/it] {'loss': 0.1488, 'grad_norm': 0.907476529519294, 'learning_rate': 6.874230195681032e-06, 'epoch': 0.4} 40%|███▉ | 13587/34278 [15:00:28<17:47:46, 3.10s/it] 40%|███▉ | 13588/34278 [15:00:31<18:47:32, 3.27s/it] {'loss': 0.1439, 'grad_norm': 1.0410896304727788, 'learning_rate': 6.8737921997145175e-06, 'epoch': 0.4} 40%|███▉ | 13588/34278 [15:00:31<18:47:32, 3.27s/it] 40%|███▉ | 13589/34278 [15:00:35<19:07:14, 3.33s/it] {'loss': 0.1469, 'grad_norm': 0.9864157463871159, 'learning_rate': 6.8733541870192345e-06, 'epoch': 0.4} 40%|███▉ | 13589/34278 [15:00:35<19:07:14, 3.33s/it] 40%|███▉ | 13590/34278 
[15:00:39<20:19:21, 3.54s/it] {'loss': 0.1379, 'grad_norm': 1.1234569276469955, 'learning_rate': 6.87291615759909e-06, 'epoch': 0.4} 40%|███▉ | 13590/34278 [15:00:39<20:19:21, 3.54s/it] 40%|███▉ | 13591/34278 [15:00:42<19:44:42, 3.44s/it] {'loss': 0.1488, 'grad_norm': 1.0268759686645481, 'learning_rate': 6.872478111457999e-06, 'epoch': 0.4} 40%|███▉ | 13591/34278 [15:00:42<19:44:42, 3.44s/it] 40%|███▉ | 13592/34278 [15:00:45<19:23:38, 3.38s/it] {'loss': 0.1311, 'grad_norm': 0.8466032986845579, 'learning_rate': 6.8720400485998705e-06, 'epoch': 0.4} 40%|███▉ | 13592/34278 [15:00:45<19:23:38, 3.38s/it] 40%|███▉ | 13593/34278 [15:00:48<19:01:55, 3.31s/it] {'loss': 0.1406, 'grad_norm': 0.8212839396277573, 'learning_rate': 6.871601969028614e-06, 'epoch': 0.4} 40%|███▉ | 13593/34278 [15:00:48<19:01:55, 3.31s/it] 40%|███▉ | 13594/34278 [15:00:52<18:49:00, 3.27s/it] {'loss': 0.1567, 'grad_norm': 0.80401134328649, 'learning_rate': 6.871163872748144e-06, 'epoch': 0.4} 40%|███▉ | 13594/34278 [15:00:52<18:49:00, 3.27s/it] 40%|███▉ | 13595/34278 [15:00:55<18:53:05, 3.29s/it] {'loss': 0.1368, 'grad_norm': 0.9520256305586496, 'learning_rate': 6.870725759762369e-06, 'epoch': 0.4} 40%|███▉ | 13595/34278 [15:00:55<18:53:05, 3.29s/it] 40%|███▉ | 13596/34278 [15:01:00<21:09:20, 3.68s/it] {'loss': 0.1294, 'grad_norm': 0.7466076877669924, 'learning_rate': 6.870287630075198e-06, 'epoch': 0.4} 40%|███▉ | 13596/34278 [15:01:00<21:09:20, 3.68s/it] 40%|███▉ | 13597/34278 [15:01:03<20:07:17, 3.50s/it] {'loss': 0.1671, 'grad_norm': 0.8538214083824109, 'learning_rate': 6.8698494836905494e-06, 'epoch': 0.4} 40%|███▉ | 13597/34278 [15:01:03<20:07:17, 3.50s/it] 40%|███▉ | 13598/34278 [15:01:06<19:23:52, 3.38s/it] {'loss': 0.1356, 'grad_norm': 0.9462874808801192, 'learning_rate': 6.8694113206123305e-06, 'epoch': 0.4} 40%|███▉ | 13598/34278 [15:01:06<19:23:52, 3.38s/it] 40%|███▉ | 13599/34278 [15:01:09<18:37:11, 3.24s/it] {'loss': 0.1427, 'grad_norm': 0.7981356657539314, 'learning_rate': 
6.868973140844453e-06, 'epoch': 0.4} 40%|███▉ | 13599/34278 [15:01:09<18:37:11, 3.24s/it] 40%|███▉ | 13600/34278 [15:01:11<17:51:18, 3.11s/it] {'loss': 0.1233, 'grad_norm': 0.8499232209187003, 'learning_rate': 6.868534944390828e-06, 'epoch': 0.4} 40%|███▉ | 13600/34278 [15:01:11<17:51:18, 3.11s/it] 40%|███▉ | 13601/34278 [15:01:17<22:51:48, 3.98s/it] {'loss': 0.1829, 'grad_norm': 0.6653544955477789, 'learning_rate': 6.868096731255371e-06, 'epoch': 0.4} 40%|███▉ | 13601/34278 [15:01:17<22:51:48, 3.98s/it] 40%|███▉ | 13602/34278 [15:01:21<21:58:07, 3.83s/it] {'loss': 0.1641, 'grad_norm': 1.0097988328038208, 'learning_rate': 6.867658501441991e-06, 'epoch': 0.4} 40%|███▉ | 13602/34278 [15:01:21<21:58:07, 3.83s/it] 40%|███▉ | 13603/34278 [15:01:27<25:39:45, 4.47s/it] {'loss': 0.1121, 'grad_norm': 0.8557864152496123, 'learning_rate': 6.867220254954602e-06, 'epoch': 0.4} 40%|███▉ | 13603/34278 [15:01:27<25:39:45, 4.47s/it] 40%|███▉ | 13604/34278 [15:01:31<24:48:15, 4.32s/it] {'loss': 0.1562, 'grad_norm': 0.9184535543711677, 'learning_rate': 6.866781991797118e-06, 'epoch': 0.4} 40%|███▉ | 13604/34278 [15:01:31<24:48:15, 4.32s/it] 40%|███▉ | 13605/34278 [15:01:37<27:22:33, 4.77s/it] {'loss': 0.1425, 'grad_norm': 0.7206300689972719, 'learning_rate': 6.866343711973446e-06, 'epoch': 0.4} 40%|███▉ | 13605/34278 [15:01:37<27:22:33, 4.77s/it] 40%|███▉ | 13606/34278 [15:01:43<29:26:25, 5.13s/it] {'loss': 0.1546, 'grad_norm': 0.9796407534578669, 'learning_rate': 6.865905415487506e-06, 'epoch': 0.4} 40%|███▉ | 13606/34278 [15:01:43<29:26:25, 5.13s/it] 40%|███▉ | 13607/34278 [15:01:46<25:54:08, 4.51s/it] {'loss': 0.1535, 'grad_norm': 1.1280364906262048, 'learning_rate': 6.8654671023432085e-06, 'epoch': 0.4} 40%|███▉ | 13607/34278 [15:01:46<25:54:08, 4.51s/it] 40%|███▉ | 13608/34278 [15:01:49<23:17:32, 4.06s/it] {'loss': 0.1372, 'grad_norm': 0.8940280196917338, 'learning_rate': 6.865028772544464e-06, 'epoch': 0.4} 40%|███▉ | 13608/34278 [15:01:49<23:17:32, 4.06s/it] 40%|███▉ | 
13609/34278 [15:01:54<25:22:08, 4.42s/it] {'loss': 0.1337, 'grad_norm': 1.0597706161073654, 'learning_rate': 6.8645904260951905e-06, 'epoch': 0.4} 40%|███▉ | 13609/34278 [15:01:54<25:22:08, 4.42s/it] 40%|███▉ | 13610/34278 [15:01:57<23:18:59, 4.06s/it] {'loss': 0.1434, 'grad_norm': 0.7998501927757913, 'learning_rate': 6.864152062999297e-06, 'epoch': 0.4} 40%|███▉ | 13610/34278 [15:01:57<23:18:59, 4.06s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9633 > 8192). Running this sequence through the model will result in indexing errors 40%|███▉ | 13611/34278 [15:02:02<24:38:47, 4.29s/it] {'loss': 0.1447, 'grad_norm': 0.9850858201656993, 'learning_rate': 6.863713683260696e-06, 'epoch': 0.4} 40%|███▉ | 13611/34278 [15:02:02<24:38:47, 4.29s/it] 40%|███▉ | 13612/34278 [15:02:05<22:55:06, 3.99s/it] {'loss': 0.1464, 'grad_norm': 0.7793562916951422, 'learning_rate': 6.863275286883308e-06, 'epoch': 0.4} 40%|███▉ | 13612/34278 [15:02:05<22:55:06, 3.99s/it] 40%|███▉ | 13613/34278 [15:02:08<21:01:08, 3.66s/it] {'loss': 0.1352, 'grad_norm': 0.7311343531268348, 'learning_rate': 6.862836873871043e-06, 'epoch': 0.4} 40%|███▉ | 13613/34278 [15:02:08<21:01:08, 3.66s/it] 40%|███▉ | 13614/34278 [15:02:12<20:29:00, 3.57s/it] {'loss': 0.1684, 'grad_norm': 0.8864138963850705, 'learning_rate': 6.862398444227813e-06, 'epoch': 0.4} 40%|███▉ | 13614/34278 [15:02:12<20:29:00, 3.57s/it] 40%|███▉ | 13615/34278 [15:02:15<20:11:10, 3.52s/it] {'loss': 0.1689, 'grad_norm': 0.8371271507064795, 'learning_rate': 6.861959997957537e-06, 'epoch': 0.4} 40%|███▉ | 13615/34278 [15:02:15<20:11:10, 3.52s/it] 40%|███▉ | 13616/34278 [15:02:20<22:27:26, 3.91s/it] {'loss': 0.154, 'grad_norm': 0.759781032712083, 'learning_rate': 6.861521535064124e-06, 'epoch': 0.4} 40%|███▉ | 13616/34278 [15:02:20<22:27:26, 3.91s/it] 40%|███▉ | 13617/34278 [15:02:23<21:03:36, 3.67s/it] {'loss': 0.1781, 'grad_norm': 0.814907302855488, 'learning_rate': 6.861083055551492e-06, 
'epoch': 0.4} 40%|███▉ | 13617/34278 [15:02:23<21:03:36, 3.67s/it] 40%|███▉ | 13618/34278 [15:02:29<25:16:39, 4.40s/it] {'loss': 0.1365, 'grad_norm': 0.9859344627740455, 'learning_rate': 6.860644559423555e-06, 'epoch': 0.4} 40%|███▉ | 13618/34278 [15:02:29<25:16:39, 4.40s/it] 40%|███▉ | 13619/34278 [15:02:35<28:17:11, 4.93s/it] {'loss': 0.1287, 'grad_norm': 0.5475142826293896, 'learning_rate': 6.860206046684229e-06, 'epoch': 0.4} 40%|███▉ | 13619/34278 [15:02:35<28:17:11, 4.93s/it] 40%|███▉ | 13620/34278 [15:02:38<24:51:31, 4.33s/it] {'loss': 0.1729, 'grad_norm': 1.0550977484094017, 'learning_rate': 6.859767517337425e-06, 'epoch': 0.4} 40%|███▉ | 13620/34278 [15:02:38<24:51:31, 4.33s/it] 40%|███▉ | 13621/34278 [15:02:41<21:56:13, 3.82s/it] {'loss': 0.145, 'grad_norm': 0.7835031097060281, 'learning_rate': 6.859328971387062e-06, 'epoch': 0.4} 40%|███▉ | 13621/34278 [15:02:41<21:56:13, 3.82s/it] 40%|███▉ | 13622/34278 [15:02:44<21:08:54, 3.69s/it] {'loss': 0.135, 'grad_norm': 0.7424203829276208, 'learning_rate': 6.858890408837054e-06, 'epoch': 0.4} 40%|███▉ | 13622/34278 [15:02:44<21:08:54, 3.69s/it] 40%|███▉ | 13623/34278 [15:02:47<20:11:00, 3.52s/it] {'loss': 0.1366, 'grad_norm': 0.7568285103001061, 'learning_rate': 6.858451829691314e-06, 'epoch': 0.4} 40%|███▉ | 13623/34278 [15:02:47<20:11:00, 3.52s/it] 40%|███▉ | 13624/34278 [15:02:51<20:58:20, 3.66s/it] {'loss': 0.1382, 'grad_norm': 0.9075307612597083, 'learning_rate': 6.858013233953762e-06, 'epoch': 0.4} 40%|███▉ | 13624/34278 [15:02:51<20:58:20, 3.66s/it] 40%|███▉ | 13625/34278 [15:02:54<19:50:48, 3.46s/it] {'loss': 0.1301, 'grad_norm': 0.8407214729416896, 'learning_rate': 6.85757462162831e-06, 'epoch': 0.4} 40%|███▉ | 13625/34278 [15:02:54<19:50:48, 3.46s/it] 40%|███▉ | 13626/34278 [15:03:00<23:57:34, 4.18s/it] {'loss': 0.1441, 'grad_norm': 0.7890127917039195, 'learning_rate': 6.857135992718875e-06, 'epoch': 0.4} 40%|███▉ | 13626/34278 [15:03:00<23:57:34, 4.18s/it] 40%|███▉ | 13627/34278 [15:03:04<23:12:03, 
4.04s/it] {'loss': 0.1199, 'grad_norm': 0.8125349100335384, 'learning_rate': 6.856697347229375e-06, 'epoch': 0.4} 40%|███▉ | 13627/34278 [15:03:04<23:12:03, 4.04s/it] 40%|███▉ | 13628/34278 [15:03:08<23:43:44, 4.14s/it] {'loss': 0.1468, 'grad_norm': 0.9964706228097383, 'learning_rate': 6.856258685163724e-06, 'epoch': 0.4} 40%|███▉ | 13628/34278 [15:03:08<23:43:44, 4.14s/it] 40%|███▉ | 13629/34278 [15:03:11<22:06:12, 3.85s/it] {'loss': 0.1579, 'grad_norm': 0.6975779232087724, 'learning_rate': 6.855820006525838e-06, 'epoch': 0.4} 40%|███▉ | 13629/34278 [15:03:11<22:06:12, 3.85s/it] 40%|███▉ | 13630/34278 [15:03:15<22:09:05, 3.86s/it] {'loss': 0.168, 'grad_norm': 0.9108399539603893, 'learning_rate': 6.855381311319633e-06, 'epoch': 0.4} 40%|███▉ | 13630/34278 [15:03:15<22:09:05, 3.86s/it] 40%|███▉ | 13631/34278 [15:03:19<21:53:38, 3.82s/it] {'loss': 0.1458, 'grad_norm': 1.0386356101843057, 'learning_rate': 6.854942599549028e-06, 'epoch': 0.4} 40%|███▉ | 13631/34278 [15:03:19<21:53:38, 3.82s/it] 40%|███▉ | 13632/34278 [15:03:25<26:16:03, 4.58s/it] {'loss': 0.1123, 'grad_norm': 0.6874329075659427, 'learning_rate': 6.854503871217937e-06, 'epoch': 0.4} 40%|███▉ | 13632/34278 [15:03:25<26:16:03, 4.58s/it] 40%|███▉ | 13633/34278 [15:03:30<25:58:38, 4.53s/it] {'loss': 0.149, 'grad_norm': 0.8603785550408117, 'learning_rate': 6.854065126330279e-06, 'epoch': 0.4} 40%|███▉ | 13633/34278 [15:03:30<25:58:38, 4.53s/it] 40%|███▉ | 13634/34278 [15:03:33<23:43:39, 4.14s/it] {'loss': 0.1348, 'grad_norm': 0.985375576240278, 'learning_rate': 6.853626364889972e-06, 'epoch': 0.4} 40%|███▉ | 13634/34278 [15:03:33<23:43:39, 4.14s/it] 40%|███▉ | 13635/34278 [15:03:37<23:06:48, 4.03s/it] {'loss': 0.1339, 'grad_norm': 0.885232077058918, 'learning_rate': 6.853187586900927e-06, 'epoch': 0.4} 40%|███▉ | 13635/34278 [15:03:37<23:06:48, 4.03s/it] 40%|███▉ | 13636/34278 [15:03:40<22:30:59, 3.93s/it] {'loss': 0.1265, 'grad_norm': 0.8530249718279882, 'learning_rate': 6.852748792367069e-06, 'epoch': 0.4} 
40%|███▉ | 13636/34278 [15:03:40<22:30:59, 3.93s/it] 40%|███▉ | 13637/34278 [15:03:44<21:16:38, 3.71s/it] {'loss': 0.1629, 'grad_norm': 0.8670925090412506, 'learning_rate': 6.852309981292311e-06, 'epoch': 0.4} 40%|███▉ | 13637/34278 [15:03:44<21:16:38, 3.71s/it] 40%|███▉ | 13638/34278 [15:03:46<19:48:53, 3.46s/it] {'loss': 0.1393, 'grad_norm': 1.0442530482132857, 'learning_rate': 6.851871153680572e-06, 'epoch': 0.4} 40%|███▉ | 13638/34278 [15:03:46<19:48:53, 3.46s/it] 40%|███▉ | 13639/34278 [15:03:50<19:32:27, 3.41s/it] {'loss': 0.1215, 'grad_norm': 1.0169475190648418, 'learning_rate': 6.851432309535769e-06, 'epoch': 0.4} 40%|███▉ | 13639/34278 [15:03:50<19:32:27, 3.41s/it] 40%|███▉ | 13640/34278 [15:03:53<20:07:43, 3.51s/it] {'loss': 0.1738, 'grad_norm': 0.9061511494073102, 'learning_rate': 6.8509934488618205e-06, 'epoch': 0.4} 40%|███▉ | 13640/34278 [15:03:54<20:07:43, 3.51s/it] 40%|███▉ | 13641/34278 [15:03:58<22:37:45, 3.95s/it] {'loss': 0.1317, 'grad_norm': 0.964644869692467, 'learning_rate': 6.850554571662643e-06, 'epoch': 0.4} 40%|███▉ | 13641/34278 [15:03:58<22:37:45, 3.95s/it] 40%|███▉ | 13642/34278 [15:04:01<21:02:22, 3.67s/it] {'loss': 0.1435, 'grad_norm': 1.1840934691961553, 'learning_rate': 6.850115677942159e-06, 'epoch': 0.4} 40%|███▉ | 13642/34278 [15:04:01<21:02:22, 3.67s/it] 40%|███▉ | 13643/34278 [15:04:05<20:59:11, 3.66s/it] {'loss': 0.1275, 'grad_norm': 0.6787396057643946, 'learning_rate': 6.8496767677042816e-06, 'epoch': 0.4} 40%|███▉ | 13643/34278 [15:04:05<20:59:11, 3.66s/it] 40%|███▉ | 13644/34278 [15:04:09<21:09:31, 3.69s/it] {'loss': 0.1441, 'grad_norm': 0.8043946569604296, 'learning_rate': 6.849237840952933e-06, 'epoch': 0.4} 40%|███▉ | 13644/34278 [15:04:09<21:09:31, 3.69s/it] 40%|███▉ | 13645/34278 [15:04:13<21:03:01, 3.67s/it] {'loss': 0.1477, 'grad_norm': 1.013178519533018, 'learning_rate': 6.8487988976920286e-06, 'epoch': 0.4} 40%|███▉ | 13645/34278 [15:04:13<21:03:01, 3.67s/it] 40%|███▉ | 13646/34278 [15:04:18<23:46:06, 4.15s/it] 
{'loss': 0.1303, 'grad_norm': 0.686504512547161, 'learning_rate': 6.84835993792549e-06, 'epoch': 0.4}
40%|███▉ | 13647/34278 [15:04:22<23:46:49, 4.15s/it] {'loss': 0.1974, 'grad_norm': 1.4665453513914628, 'learning_rate': 6.847920961657235e-06, 'epoch': 0.4}
40%|███▉ | 13648/34278 [15:04:27<25:06:43, 4.38s/it] {'loss': 0.1443, 'grad_norm': 0.8053405976829205, 'learning_rate': 6.847481968891183e-06, 'epoch': 0.4}
40%|███▉ | 13649/34278 [15:04:30<23:35:56, 4.12s/it] {'loss': 0.1338, 'grad_norm': 0.7239652844664418, 'learning_rate': 6.847042959631253e-06, 'epoch': 0.4}
40%|███▉ | 13650/34278 [15:04:35<25:07:46, 4.39s/it] {'loss': 0.1483, 'grad_norm': 0.8310026643884398, 'learning_rate': 6.846603933881364e-06, 'epoch': 0.4}
40%|███▉ | 13651/34278 [15:04:41<27:57:34, 4.88s/it] {'loss': 0.1319, 'grad_norm': 0.7626654642274614, 'learning_rate': 6.846164891645436e-06, 'epoch': 0.4}
40%|███▉ | 13652/34278 [15:04:45<25:18:37, 4.42s/it] {'loss': 0.1735, 'grad_norm': 0.804123077163358, 'learning_rate': 6.84572583292739e-06, 'epoch': 0.4}
40%|███▉ | 13653/34278 [15:04:48<22:45:32, 3.97s/it] {'loss': 0.1313, 'grad_norm': 0.9864754575496317, 'learning_rate': 6.845286757731142e-06, 'epoch': 0.4}
Token indices sequence length is longer than the specified maximum sequence length for this model (13632 > 8192).
Running this sequence through the model will result in indexing errors
40%|███▉ | 13654/34278 [15:04:51<21:05:38, 3.68s/it] {'loss': 0.1441, 'grad_norm': 0.82570891583435, 'learning_rate': 6.844847666060617e-06, 'epoch': 0.4}
40%|███▉ | 13655/34278 [15:04:54<20:20:52, 3.55s/it] {'loss': 0.1235, 'grad_norm': 0.9266173515462345, 'learning_rate': 6.844408557919731e-06, 'epoch': 0.4}
40%|███▉ | 13656/34278 [15:04:57<19:52:26, 3.47s/it] {'loss': 0.1462, 'grad_norm': 0.7049451437364153, 'learning_rate': 6.843969433312404e-06, 'epoch': 0.4}
40%|███▉ | 13657/34278 [15:05:00<19:30:01, 3.40s/it] {'loss': 0.1295, 'grad_norm': 0.761751592673055, 'learning_rate': 6.8435302922425606e-06, 'epoch': 0.4}
40%|███▉ | 13658/34278 [15:05:04<19:13:46, 3.36s/it] {'loss': 0.1522, 'grad_norm': 0.7271112240939605, 'learning_rate': 6.843091134714117e-06, 'epoch': 0.4}
40%|███▉ | 13659/34278 [15:05:07<18:56:37, 3.31s/it] {'loss': 0.131, 'grad_norm': 0.9028992364270692, 'learning_rate': 6.842651960730997e-06, 'epoch': 0.4}
40%|███▉ | 13660/34278 [15:05:10<18:55:51, 3.31s/it] {'loss': 0.1241, 'grad_norm': 0.685376002217138, 'learning_rate': 6.842212770297121e-06, 'epoch': 0.4}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None
  warnings.warn(
40%|███▉ | 13661/34278 [15:05:16<23:47:58, 4.16s/it] {'loss': 0.1232, 'grad_norm': 0.7221454328676604, 'learning_rate': 6.8417735634164075e-06, 'epoch': 0.4}
40%|███▉ | 13662/34278 [15:05:20<22:51:01, 3.99s/it] {'loss': 0.1513, 'grad_norm': 1.018529010873739, 'learning_rate': 6.841334340092779e-06, 'epoch': 0.4}
40%|███▉ | 13663/34278 [15:05:24<22:58:45, 4.01s/it] {'loss': 0.1615, 'grad_norm': 0.9545248989083854, 'learning_rate': 6.840895100330159e-06, 'epoch': 0.4}
40%|███▉ | 13664/34278 [15:05:27<21:11:18, 3.70s/it] {'loss': 0.1313, 'grad_norm': 0.7540961625778743, 'learning_rate': 6.840455844132465e-06, 'epoch': 0.4}
40%|███▉ | 13665/34278 [15:05:30<20:06:04, 3.51s/it] {'loss': 0.1515, 'grad_norm': 0.769118084821783, 'learning_rate': 6.840016571503622e-06, 'epoch': 0.4}
40%|███▉ | 13666/34278 [15:05:33<19:23:40, 3.39s/it] {'loss': 0.1445, 'grad_norm': 0.8988004881390081, 'learning_rate': 6.8395772824475494e-06, 'epoch': 0.4}
40%|███▉ | 13667/34278 [15:05:36<18:40:17, 3.26s/it] {'loss': 0.1395, 'grad_norm': 0.9666765933995999, 'learning_rate': 6.839137976968171e-06, 'epoch': 0.4}
40%|███▉ | 13668/34278 [15:05:40<19:31:47, 3.41s/it] {'loss': 0.1276, 'grad_norm': 0.8060915256768564, 'learning_rate': 6.838698655069406e-06, 'epoch': 0.4}
40%|███▉ | 13669/34278 [15:05:44<20:14:07, 3.53s/it] {'loss': 0.1384, 'grad_norm': 0.7311542730813468, 'learning_rate': 6.83825931675518e-06, 'epoch': 0.4}
40%|███▉ | 13670/34278 [15:05:50<24:43:22, 4.32s/it] {'loss': 0.1395,
'grad_norm': 0.836450015113188, 'learning_rate': 6.8378199620294126e-06, 'epoch': 0.4}
40%|███▉ | 13671/34278 [15:05:53<23:08:57, 4.04s/it] {'loss': 0.1684, 'grad_norm': 0.9349376752476497, 'learning_rate': 6.837380590896028e-06, 'epoch': 0.4}
40%|███▉ | 13672/34278 [15:05:57<22:59:05, 4.02s/it] {'loss': 0.1472, 'grad_norm': 0.7928181811245086, 'learning_rate': 6.836941203358947e-06, 'epoch': 0.4}
40%|███▉ | 13673/34278 [15:06:00<21:15:09, 3.71s/it] {'loss': 0.1435, 'grad_norm': 0.7648200196571773, 'learning_rate': 6.836501799422095e-06, 'epoch': 0.4}
40%|███▉ | 13674/34278 [15:06:04<21:59:31, 3.84s/it] {'loss': 0.192, 'grad_norm': 0.9884217383835263, 'learning_rate': 6.836062379089393e-06, 'epoch': 0.4}
40%|███▉ | 13675/34278 [15:06:08<21:03:00, 3.68s/it] {'loss': 0.1224, 'grad_norm': 0.8091423813958553, 'learning_rate': 6.8356229423647636e-06, 'epoch': 0.4}
40%|███▉ | 13676/34278 [15:06:14<24:52:25, 4.35s/it] {'loss': 0.1444, 'grad_norm': 0.9697528805961072, 'learning_rate': 6.83518348925213e-06, 'epoch': 0.4}
40%|███▉ | 13677/34278 [15:06:16<22:27:59, 3.93s/it] {'loss': 0.1361, 'grad_norm': 0.8652562414182313, 'learning_rate': 6.834744019755419e-06, 'epoch': 0.4}
40%|███▉ | 13678/34278 [15:06:19<20:41:11, 3.62s/it] {'loss': 0.1439, 'grad_norm': 0.899832730824932, 'learning_rate': 6.8343045338785495e-06, 'epoch': 0.4}
40%|███▉ | 13679/34278 [15:06:23<21:20:49, 3.73s/it] {'loss': 0.1555, 'grad_norm': 1.0012039375355022, 'learning_rate': 6.833865031625448e-06, 'epoch': 0.4}
40%|███▉ | 13680/34278 [15:06:26<20:16:45, 3.54s/it] {'loss': 0.1648, 'grad_norm': 1.0975808984182382, 'learning_rate': 6.833425513000036e-06, 'epoch': 0.4}
40%|███▉ | 13681/34278 [15:06:29<19:06:07, 3.34s/it] {'loss': 0.1328, 'grad_norm': 0.6872895853973296, 'learning_rate': 6.8329859780062395e-06, 'epoch': 0.4}
40%|███▉ | 13682/34278 [15:06:33<18:58:19, 3.32s/it] {'loss': 0.1425, 'grad_norm': 0.7616114784834984, 'learning_rate': 6.832546426647983e-06, 'epoch': 0.4}
40%|███▉ | 13683/34278 [15:06:36<18:28:23, 3.23s/it] {'loss': 0.1538, 'grad_norm': 0.8331362403542113, 'learning_rate': 6.832106858929186e-06, 'epoch': 0.4}
40%|███▉ | 13684/34278 [15:06:39<18:57:16, 3.31s/it] {'loss': 0.1409, 'grad_norm': 0.6674530606614226, 'learning_rate': 6.831667274853779e-06, 'epoch': 0.4}
40%|███▉ | 13685/34278 [15:06:43<19:43:50, 3.45s/it] {'loss': 0.1325, 'grad_norm': 0.7260908785878939, 'learning_rate': 6.831227674425684e-06, 'epoch': 0.4}
40%|███▉ | 13686/34278 [15:06:46<19:35:23, 3.42s/it] {'loss': 0.1601, 'grad_norm': 0.974648249879995, 'learning_rate': 6.830788057648824e-06, 'epoch': 0.4}
40%|███▉ | 13687/34278 [15:06:52<24:13:07, 4.23s/it] {'loss': 0.1522, 'grad_norm': 0.8437984069947647, 'learning_rate': 6.830348424527126e-06, 'epoch': 0.4}
40%|███▉ | 13688/34278 [15:06:57<25:11:03, 4.40s/it] {'loss': 0.1467, 'grad_norm': 0.7086230213971858, 'learning_rate': 6.829908775064514e-06, 'epoch': 0.4}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
40%|███▉ | 13689/34278 [15:07:00<22:41:32, 3.97s/it] {'loss': 0.1451, 'grad_norm': 0.9228994496897659, 'learning_rate': 6.829469109264915e-06, 'epoch': 0.4}
40%|███▉ | 13690/34278 [15:07:03<21:23:19, 3.74s/it] {'loss': 0.1356, 'grad_norm': 0.9013430047366084, 'learning_rate': 6.82902942713225e-06, 'epoch': 0.4}
40%|███▉ | 13691/34278 [15:07:07<20:22:11, 3.56s/it] {'loss': 0.1452, 'grad_norm': 0.7251464735188569, 'learning_rate': 6.828589728670447e-06, 'epoch': 0.4}
40%|███▉ | 13692/34278 [15:07:09<19:23:14, 3.39s/it] {'loss': 0.1348, 'grad_norm': 0.7455558487477596, 'learning_rate': 6.828150013883433e-06, 'epoch': 0.4}
40%|███▉ | 13693/34278 [15:07:15<23:39:53, 4.14s/it] {'loss': 0.1729, 'grad_norm': 0.9193250319521461, 'learning_rate': 6.8277102827751305e-06, 'epoch': 0.4}
40%|███▉ | 13694/34278 [15:07:18<21:37:11, 3.78s/it] {'loss': 0.1405, 'grad_norm': 0.9482756342979466, 'learning_rate': 6.827270535349469e-06, 'epoch': 0.4}
40%|███▉ | 13695/34278 [15:07:21<20:13:38, 3.54s/it] {'loss': 0.143, 'grad_norm': 0.9751141984532288, 'learning_rate': 6.826830771610371e-06, 'epoch': 0.4}
40%|███▉ | 13696/34278 [15:07:24<18:47:57, 3.29s/it] {'loss': 0.141, 'grad_norm': 1.044181444455976, 'learning_rate': 6.8263909915617646e-06, 'epoch': 0.4}
40%|███▉ | 13697/34278 [15:07:28<19:25:41, 3.40s/it] {'loss': 0.1363, 'grad_norm':
0.8687203231252904, 'learning_rate': 6.825951195207575e-06, 'epoch': 0.4}
40%|███▉ | 13698/34278 [15:07:31<20:02:26, 3.51s/it] {'loss': 0.1484, 'grad_norm': 0.8530636291008991, 'learning_rate': 6.825511382551729e-06, 'epoch': 0.4}
40%|███▉ | 13699/34278 [15:07:38<24:34:20, 4.30s/it] {'loss': 0.1423, 'grad_norm': 0.9079172967032355, 'learning_rate': 6.825071553598152e-06, 'epoch': 0.4}
40%|███▉ | 13700/34278 [15:07:40<22:01:48, 3.85s/it] {'loss': 0.1404, 'grad_norm': 0.8637818657784768, 'learning_rate': 6.824631708350774e-06, 'epoch': 0.4}
40%|███▉ | 13701/34278 [15:07:43<20:19:27, 3.56s/it] {'loss': 0.1404, 'grad_norm': 0.9649551259914709, 'learning_rate': 6.824191846813517e-06, 'epoch': 0.4}
40%|███▉ | 13702/34278 [15:07:47<19:50:19, 3.47s/it] {'loss': 0.1235, 'grad_norm': 0.9593491688199949, 'learning_rate': 6.8237519689903145e-06, 'epoch': 0.4}
40%|███▉ | 13703/34278 [15:07:50<19:15:27, 3.37s/it] {'loss': 0.1352, 'grad_norm': 1.0040722232054902, 'learning_rate': 6.823312074885087e-06, 'epoch': 0.4}
40%|███▉ | 13704/34278 [15:07:53<19:34:10, 3.42s/it] {'loss': 0.1529, 'grad_norm': 0.8886624316938437, 'learning_rate': 6.822872164501765e-06, 'epoch': 0.4}
40%|███▉ | 13705/34278 [15:07:57<20:11:41, 3.53s/it] {'loss': 0.1644, 'grad_norm': 0.7400950797262666, 'learning_rate': 6.822432237844275e-06, 'epoch': 0.4}
40%|███▉ | 13706/34278 [15:08:01<20:59:44, 3.67s/it] {'loss': 0.1345, 'grad_norm': 0.8593681653423954, 'learning_rate': 6.821992294916546e-06, 'epoch': 0.4}
40%|███▉ | 13707/34278 [15:08:04<19:43:42, 3.45s/it] {'loss': 0.1315, 'grad_norm': 1.0256629022542472, 'learning_rate': 6.821552335722504e-06, 'epoch': 0.4}
40%|███▉ | 13708/34278 [15:08:09<23:05:20, 4.04s/it] {'loss': 0.1226, 'grad_norm': 0.7381785662894601, 'learning_rate': 6.821112360266079e-06, 'epoch': 0.4}
40%|███▉ | 13709/34278 [15:08:12<20:55:29, 3.66s/it] {'loss': 0.1328, 'grad_norm': 1.0433674533863682, 'learning_rate': 6.820672368551198e-06, 'epoch': 0.4}
40%|███▉ | 13710/34278 [15:08:16<20:54:58, 3.66s/it] {'loss': 0.1311, 'grad_norm': 0.7546346570667153, 'learning_rate': 6.8202323605817854e-06, 'epoch': 0.4}
40%|███▉ | 13711/34278 [15:08:19<20:15:33, 3.55s/it] {'loss': 0.1355, 'grad_norm': 1.12112052236558, 'learning_rate': 6.819792336361775e-06, 'epoch': 0.4}
40%|████ | 13712/34278 [15:08:24<23:10:38, 4.06s/it] {'loss': 0.1349, 'grad_norm': 0.697245421308078, 'learning_rate': 6.819352295895093e-06, 'epoch': 0.4}
40%|████ | 13713/34278 [15:08:27<21:09:51, 3.70s/it] {'loss': 0.1358, 'grad_norm': 0.8054019674828651, 'learning_rate': 6.818912239185666e-06, 'epoch': 0.4}
40%|████ | 13714/34278 [15:08:30<19:47:18, 3.46s/it] {'loss': 0.1464, 'grad_norm': 0.6075957489490795, 'learning_rate': 6.8184721662374285e-06, 'epoch': 0.4}
40%|████ | 13715/34278 [15:08:33<19:25:28, 3.40s/it] {'loss': 0.1593, 'grad_norm': 0.7445938503867925, 'learning_rate': 6.818032077054304e-06, 'epoch': 0.4}
40%|████ | 13716/34278 [15:08:37<20:05:48, 3.52s/it] {'loss': 0.1461,
'grad_norm': 0.628204010319937, 'learning_rate': 6.817591971640221e-06, 'epoch': 0.4}
40%|████ | 13717/34278 [15:08:41<20:46:55, 3.64s/it] {'loss': 0.1495, 'grad_norm': 0.733320092149337, 'learning_rate': 6.817151849999114e-06, 'epoch': 0.4}
40%|████ | 13718/34278 [15:08:45<21:18:59, 3.73s/it] {'loss': 0.1189, 'grad_norm': 0.8620401004457822, 'learning_rate': 6.8167117121349065e-06, 'epoch': 0.4}
40%|████ | 13719/34278 [15:08:48<19:46:36, 3.46s/it] {'loss': 0.1396, 'grad_norm': 0.699680097500003, 'learning_rate': 6.8162715580515324e-06, 'epoch': 0.4}
40%|████ | 13720/34278 [15:08:51<19:44:16, 3.46s/it] {'loss': 0.1318, 'grad_norm': 0.650314855260613, 'learning_rate': 6.815831387752918e-06, 'epoch': 0.4}
40%|████ | 13721/34278 [15:08:57<23:13:06, 4.07s/it] {'loss': 0.128, 'grad_norm': 0.7255972396355026, 'learning_rate': 6.815391201242996e-06, 'epoch': 0.4}
40%|████ | 13722/34278 [15:09:00<21:57:02, 3.84s/it] {'loss': 0.1406, 'grad_norm': 0.9121216969140034, 'learning_rate': 6.8149509985256935e-06, 'epoch': 0.4}
40%|████ | 13723/34278 [15:09:04<21:45:27, 3.81s/it] {'loss': 0.1463, 'grad_norm': 0.7030736458426388, 'learning_rate': 6.814510779604942e-06, 'epoch': 0.4}
40%|████ | 13724/34278 [15:09:10<25:00:16, 4.38s/it] {'loss': 0.1815, 'grad_norm': 0.8533675344816903, 'learning_rate': 6.814070544484672e-06, 'epoch': 0.4}
40%|████ | 13725/34278 [15:09:15<27:26:56, 4.81s/it] {'loss': 0.1594, 'grad_norm': 0.8935322043974434, 'learning_rate': 6.813630293168811e-06, 'epoch': 0.4}
40%|████ | 13726/34278 [15:09:19<24:56:28, 4.37s/it] {'loss': 0.1313, 'grad_norm': 0.6425807864399719, 'learning_rate': 6.813190025661294e-06, 'epoch': 0.4}
40%|████ | 13727/34278 [15:09:22<23:11:02, 4.06s/it] {'loss': 0.1411, 'grad_norm': 1.0053122257531732, 'learning_rate': 6.8127497419660495e-06, 'epoch': 0.4}
40%|████ | 13728/34278 [15:09:25<21:50:13, 3.83s/it] {'loss': 0.1133, 'grad_norm': 0.7250950628887969, 'learning_rate': 6.8123094420870065e-06, 'epoch': 0.4}
40%|████ | 13729/34278 [15:09:30<22:35:21, 3.96s/it] {'loss': 0.1864, 'grad_norm': 1.0213365217185995, 'learning_rate': 6.811869126028099e-06, 'epoch': 0.4}
40%|████ | 13730/34278 [15:09:33<22:04:31, 3.87s/it] {'loss': 0.1475, 'grad_norm': 0.6960357927011381, 'learning_rate': 6.811428793793255e-06, 'epoch': 0.4}
40%|████ | 13731/34278 [15:09:36<21:03:25, 3.69s/it] {'loss': 0.1352, 'grad_norm': 0.7879005124766262, 'learning_rate': 6.810988445386406e-06, 'epoch': 0.4}
40%|████ | 13732/34278 [15:09:42<24:16:02, 4.25s/it] {'loss': 0.1369, 'grad_norm': 0.8364188457649364, 'learning_rate': 6.810548080811487e-06, 'epoch': 0.4}
40%|████ | 13733/34278 [15:09:46<23:31:50, 4.12s/it] {'loss': 0.1345, 'grad_norm': 0.8757253023910055, 'learning_rate': 6.810107700072427e-06, 'epoch': 0.4}
40%|████ | 13734/34278 [15:09:49<22:14:41, 3.90s/it] {'loss': 0.1249, 'grad_norm': 0.6263167723830406, 'learning_rate': 6.809667303173156e-06, 'epoch': 0.4}
40%|████ | 13735/34278 [15:09:52<20:47:54, 3.64s/it] {'loss': 0.1325,
'grad_norm': 0.7551049383541194, 'learning_rate': 6.809226890117609e-06, 'epoch': 0.4}
40%|████ | 13736/34278 [15:09:56<20:42:55, 3.63s/it] {'loss': 0.1575, 'grad_norm': 0.7788834650234634, 'learning_rate': 6.8087864609097154e-06, 'epoch': 0.4}
40%|████ | 13737/34278 [15:09:59<20:23:50, 3.57s/it] {'loss': 0.1224, 'grad_norm': 0.8587180472217778, 'learning_rate': 6.8083460155534075e-06, 'epoch': 0.4}
40%|████ | 13738/34278 [15:10:02<19:35:55, 3.44s/it] {'loss': 0.1848, 'grad_norm': 1.0791596637441496, 'learning_rate': 6.807905554052619e-06, 'epoch': 0.4}
40%|████ | 13739/34278 [15:10:05<18:39:20, 3.27s/it] {'loss': 0.1581, 'grad_norm': 0.8389357021892137, 'learning_rate': 6.8074650764112815e-06, 'epoch': 0.4}
40%|████ | 13740/34278 [15:10:12<24:17:47, 4.26s/it] {'loss': 0.1397, 'grad_norm': 0.7995316925510891, 'learning_rate': 6.807024582633325e-06, 'epoch': 0.4}
40%|████ | 13741/34278 [15:10:15<22:02:14, 3.86s/it] {'loss': 0.1596, 'grad_norm': 0.7666141779090255, 'learning_rate': 6.806584072722686e-06, 'epoch': 0.4}
40%|████ | 13742/34278 [15:10:18<19:58:53, 3.50s/it] {'loss': 0.1502, 'grad_norm': 0.7958196756378516, 'learning_rate': 6.806143546683297e-06, 'epoch': 0.4}
40%|████ | 13743/34278 [15:10:24<24:21:14, 4.27s/it] {'loss': 0.1631, 'grad_norm': 0.8472601015465253, 'learning_rate': 6.8057030045190866e-06, 'epoch': 0.4}
40%|████ | 13744/34278 [15:10:29<25:55:26, 4.54s/it] {'loss': 0.1466, 'grad_norm': 0.7462214077864776, 'learning_rate': 6.805262446233993e-06, 'epoch': 0.4}
40%|████ | 13745/34278 [15:10:33<24:47:04, 4.35s/it] {'loss': 0.1179, 'grad_norm': 0.7412816926561623, 'learning_rate': 6.804821871831947e-06, 'epoch': 0.4}
40%|████ | 13746/34278 [15:10:36<23:25:44, 4.11s/it] {'loss': 0.1392, 'grad_norm': 0.7237460861176316, 'learning_rate': 6.804381281316881e-06, 'epoch': 0.4}
40%|████ | 13747/34278 [15:10:40<23:06:46, 4.05s/it] {'loss': 0.1333, 'grad_norm': 0.7882012158566257, 'learning_rate': 6.803940674692732e-06, 'epoch': 0.4}
40%|████ | 13748/34278 [15:10:43<21:45:23, 3.82s/it] {'loss': 0.1558, 'grad_norm': 0.7731565601826743, 'learning_rate': 6.80350005196343e-06, 'epoch': 0.4}
40%|████ | 13749/34278 [15:10:46<20:29:58, 3.59s/it] {'loss': 0.1547, 'grad_norm': 0.7430322833263876, 'learning_rate': 6.803059413132909e-06, 'epoch': 0.4}
40%|████ | 13750/34278 [15:10:50<19:36:26, 3.44s/it] {'loss': 0.1714, 'grad_norm': 0.8498513542633379, 'learning_rate': 6.802618758205105e-06, 'epoch': 0.4}
40%|████ | 13751/34278 [15:10:53<19:21:13, 3.39s/it] {'loss': 0.1469, 'grad_norm': 1.0071163344423861, 'learning_rate': 6.802178087183951e-06, 'epoch': 0.4}
40%|████ | 13752/34278 [15:10:57<20:04:06, 3.52s/it] {'loss': 0.1431, 'grad_norm': 0.9074375671638603, 'learning_rate': 6.801737400073381e-06, 'epoch': 0.4}
40%|████ | 13753/34278 [15:11:00<20:26:43, 3.59s/it] {'loss': 0.1714, 'grad_norm': 0.8852197129256496, 'learning_rate': 6.80129669687733e-06, 'epoch': 0.4}
40%|████ | 13754/34278 [15:11:06<23:12:59, 4.07s/it] {'loss': 0.1385,
'grad_norm': 0.6976559907887856, 'learning_rate': 6.800855977599732e-06, 'epoch': 0.4}
40%|████ | 13755/34278 [15:11:09<21:49:35, 3.83s/it] {'loss': 0.1416, 'grad_norm': 0.879096578156981, 'learning_rate': 6.80041524224452e-06, 'epoch': 0.4}
40%|████ | 13756/34278 [15:11:12<20:58:51, 3.68s/it] {'loss': 0.1414, 'grad_norm': 0.7184119635969515, 'learning_rate': 6.799974490815633e-06, 'epoch': 0.4}
40%|████ | 13757/34278 [15:11:15<20:02:55, 3.52s/it] {'loss': 0.1153, 'grad_norm': 0.8268474876082909, 'learning_rate': 6.799533723317003e-06, 'epoch': 0.4}
40%|████ | 13758/34278 [15:11:19<19:35:19, 3.44s/it] {'loss': 0.1534, 'grad_norm': 0.9037195838981106, 'learning_rate': 6.799092939752564e-06, 'epoch': 0.4}
40%|████ | 13759/34278 [15:11:23<21:20:10, 3.74s/it] {'loss': 0.1586, 'grad_norm': 0.7823091877787984, 'learning_rate': 6.798652140126255e-06, 'epoch': 0.4}
40%|████ | 13760/34278 [15:11:26<20:04:33, 3.52s/it] {'loss': 0.1513, 'grad_norm': 0.9153302275706843, 'learning_rate': 6.798211324442008e-06, 'epoch': 0.4}
40%|████ | 13761/34278 [15:11:29<19:50:05, 3.48s/it] {'loss': 0.1422, 'grad_norm': 0.7831550550923494, 'learning_rate': 6.7977704927037595e-06, 'epoch': 0.4}
40%|████ | 13762/34278 [15:11:33<19:14:22, 3.38s/it] {'loss': 0.1663, 'grad_norm': 0.7939853484381711, 'learning_rate': 6.797329644915445e-06, 'epoch': 0.4}
40%|████ | 13763/34278 [15:11:36<18:52:59, 3.31s/it] {'loss': 0.1304, 'grad_norm': 0.5919398204852676, 'learning_rate': 6.796888781081e-06, 'epoch': 0.4}
40%|████ | 13764/34278 [15:11:39<18:18:07, 3.21s/it] {'loss': 0.1343, 'grad_norm': 0.8825133697731564, 'learning_rate': 6.796447901204362e-06, 'epoch': 0.4}
40%|████ | 13765/34278 [15:11:42<19:05:15, 3.35s/it] {'loss': 0.1412, 'grad_norm': 0.8454693897606342, 'learning_rate': 6.796007005289465e-06, 'epoch': 0.4}
40%|████ | 13766/34278 [15:11:45<18:41:06, 3.28s/it] {'loss': 0.1245, 'grad_norm': 0.9068838026048358, 'learning_rate': 6.795566093340247e-06, 'epoch': 0.4}
40%|████ | 13767/34278 [15:11:48<17:52:00, 3.14s/it] {'loss': 0.1392, 'grad_norm': 0.8999832981873501, 'learning_rate': 6.795125165360643e-06, 'epoch': 0.4}
40%|████ | 13768/34278 [15:11:51<17:57:09, 3.15s/it] {'loss': 0.1506, 'grad_norm': 0.8406459172297751, 'learning_rate': 6.79468422135459e-06, 'epoch': 0.4}
40%|████ | 13769/34278 [15:11:55<18:10:25, 3.19s/it] {'loss': 0.1271, 'grad_norm': 0.8693911267360379, 'learning_rate': 6.794243261326025e-06, 'epoch': 0.4}
40%|████ | 13770/34278 [15:11:58<19:02:56, 3.34s/it] {'loss': 0.1581, 'grad_norm': 0.8783531792446729, 'learning_rate': 6.7938022852788845e-06, 'epoch': 0.4}
40%|████ | 13771/34278 [15:12:01<18:29:54, 3.25s/it] {'loss': 0.1473, 'grad_norm': 0.9904763397049472, 'learning_rate': 6.793361293217105e-06, 'epoch': 0.4}
40%|████ | 13772/34278 [15:12:05<18:29:28, 3.25s/it] {'loss': 0.1223, 'grad_norm': 0.8116511921077783, 'learning_rate': 6.792920285144624e-06, 'epoch': 0.4}
40%|████ | 13773/34278 [15:12:08<18:23:37, 3.23s/it] {'loss': 0.1488,
'grad_norm': 1.127403999058117, 'learning_rate': 6.792479261065379e-06, 'epoch': 0.4}
40%|████ | 13774/34278 [15:12:11<18:02:27, 3.17s/it] {'loss': 0.1434, 'grad_norm': 0.7887529480367309, 'learning_rate': 6.792038220983308e-06, 'epoch': 0.4}
40%|████ | 13775/34278 [15:12:15<19:35:20, 3.44s/it] {'loss': 0.1776, 'grad_norm': 0.8417388114250335, 'learning_rate': 6.791597164902346e-06, 'epoch': 0.4}
40%|████ | 13776/34278 [15:12:19<21:07:59, 3.71s/it] {'loss': 0.1535, 'grad_norm': 0.9964791878290662, 'learning_rate': 6.791156092826434e-06, 'epoch': 0.4}
40%|████ | 13777/34278 [15:12:24<22:57:32, 4.03s/it] {'loss': 0.1346, 'grad_norm': 0.8273039584371675, 'learning_rate': 6.790715004759506e-06, 'epoch': 0.4}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None
  warnings.warn(
40%|████ | 13778/34278 [15:12:28<22:06:51, 3.88s/it] {'loss': 0.1111, 'grad_norm': 0.658956263251214, 'learning_rate': 6.790273900705502e-06, 'epoch': 0.4}
40%|████ | 13779/34278 [15:12:33<23:53:43, 4.20s/it] {'loss': 0.1333, 'grad_norm': 0.9148364028349939, 'learning_rate': 6.789832780668362e-06, 'epoch': 0.4}
40%|████ | 13780/34278 [15:12:35<21:27:11, 3.77s/it] {'loss': 0.1378, 'grad_norm': 0.976282860395019, 'learning_rate': 6.78939164465202e-06, 'epoch': 0.4}
40%|████ | 13781/34278 [15:12:39<21:27:20, 3.77s/it] {'loss': 0.1425, 'grad_norm': 0.8420970097414102, 'learning_rate': 6.788950492660417e-06, 'epoch': 0.4}
40%|████ | 13782/34278 [15:12:43<21:23:12, 3.76s/it] {'loss': 0.1568, 'grad_norm': 0.8257070823340206, 'learning_rate': 6.788509324697492e-06, 'epoch': 0.4}
40%|████ | 13783/34278 [15:12:49<25:28:52, 4.48s/it] {'loss': 0.1807, 'grad_norm': 0.9999952522008241, 'learning_rate': 6.7880681407671835e-06, 'epoch': 0.4}
40%|████ | 13784/34278 [15:12:54<26:04:39, 4.58s/it] {'loss': 0.1382, 'grad_norm': 1.4350027163849897, 'learning_rate': 6.787626940873427e-06, 'epoch': 0.4}
40%|████ | 13785/34278 [15:12:58<24:34:19, 4.32s/it] {'loss': 0.1454, 'grad_norm': 0.944498623984756, 'learning_rate': 6.787185725020166e-06, 'epoch': 0.4}
40%|████ | 13786/34278 [15:13:01<22:40:19, 3.98s/it] {'loss': 0.1152, 'grad_norm': 0.8353992898859052, 'learning_rate': 6.7867444932113365e-06, 'epoch': 0.4}
40%|████ | 13787/34278 [15:13:04<22:08:28, 3.89s/it] {'loss': 0.1368,
'grad_norm': 0.9867069702543034, 'learning_rate': 6.7863032454508786e-06, 'epoch': 0.4}
40%|████ | 13788/34278 [15:13:10<25:44:28, 4.52s/it] {'loss': 0.1765, 'grad_norm': 1.0051752584566038, 'learning_rate': 6.785861981742732e-06, 'epoch': 0.4}
40%|████ | 13789/34278 [15:13:13<22:47:56, 4.01s/it] {'loss': 0.1785, 'grad_norm': 0.981644628910491, 'learning_rate': 6.785420702090837e-06, 'epoch': 0.4}
40%|████ | 13790/34278 [15:13:19<26:08:49, 4.59s/it] {'loss': 0.1426, 'grad_norm': 0.8079146604087067, 'learning_rate': 6.7849794064991306e-06, 'epoch': 0.4}
40%|████ | 13791/34278 [15:13:23<24:39:17, 4.33s/it] {'loss': 0.181, 'grad_norm': 0.9380987146825817, 'learning_rate': 6.784538094971555e-06, 'epoch': 0.4}
40%|████ | 13792/34278 [15:13:27<24:31:03, 4.31s/it] {'loss': 0.1339, 'grad_norm': 0.8754134722520672, 'learning_rate': 6.784096767512048e-06, 'epoch': 0.4}
40%|████ | 13793/34278 [15:13:32<24:55:55, 4.38s/it] {'loss': 0.1466, 'grad_norm': 0.851925529761588, 'learning_rate': 6.783655424124551e-06, 'epoch': 0.4}
40%|████ | 13794/34278 [15:13:38<27:45:10, 4.88s/it] {'loss': 0.1496, 'grad_norm': 0.7035318098262453, 'learning_rate': 6.783214064813007e-06, 'epoch': 0.4}
40%|████ | 13795/34278 [15:13:42<26:09:05, 4.60s/it] {'loss': 0.1548, 'grad_norm': 0.9123551210521693, 'learning_rate': 6.782772689581352e-06, 'epoch': 0.4}
40%|████ | 13796/34278 [15:13:44<22:54:08, 4.03s/it] {'loss': 0.1428, 'grad_norm': 0.807820870976712, 'learning_rate': 6.782331298433527e-06, 'epoch': 0.4}
[15:13:44<22:54:08, 4.03s/it] 40%|████ | 13797/34278 [15:13:48<22:18:46, 3.92s/it] {'loss': 0.164, 'grad_norm': 0.937659808531094, 'learning_rate': 6.781889891373475e-06, 'epoch': 0.4} 40%|████ | 13797/34278 [15:13:48<22:18:46, 3.92s/it] 40%|████ | 13798/34278 [15:13:52<22:14:32, 3.91s/it] {'loss': 0.1585, 'grad_norm': 0.8513003427067009, 'learning_rate': 6.781448468405134e-06, 'epoch': 0.4} 40%|████ | 13798/34278 [15:13:52<22:14:32, 3.91s/it] 40%|████ | 13799/34278 [15:13:55<21:07:24, 3.71s/it] {'loss': 0.1357, 'grad_norm': 0.6954541883374559, 'learning_rate': 6.781007029532447e-06, 'epoch': 0.4} 40%|████ | 13799/34278 [15:13:55<21:07:24, 3.71s/it] 40%|████ | 13800/34278 [15:13:59<20:29:07, 3.60s/it] {'loss': 0.1552, 'grad_norm': 1.1038917546829061, 'learning_rate': 6.780565574759355e-06, 'epoch': 0.4} 40%|████ | 13800/34278 [15:13:59<20:29:07, 3.60s/it] 40%|████ | 13801/34278 [15:14:02<19:52:37, 3.49s/it] {'loss': 0.1531, 'grad_norm': 0.9587137507817821, 'learning_rate': 6.780124104089797e-06, 'epoch': 0.4} 40%|████ | 13801/34278 [15:14:02<19:52:37, 3.49s/it] 40%|████ | 13802/34278 [15:14:05<19:21:01, 3.40s/it] {'loss': 0.1333, 'grad_norm': 0.9066835913109763, 'learning_rate': 6.779682617527716e-06, 'epoch': 0.4} 40%|████ | 13802/34278 [15:14:05<19:21:01, 3.40s/it] 40%|████ | 13803/34278 [15:14:10<22:05:12, 3.88s/it] {'loss': 0.1264, 'grad_norm': 0.8230750795006987, 'learning_rate': 6.779241115077055e-06, 'epoch': 0.4} 40%|████ | 13803/34278 [15:14:10<22:05:12, 3.88s/it] 40%|████ | 13804/34278 [15:14:13<20:27:29, 3.60s/it] {'loss': 0.1504, 'grad_norm': 0.7413107909208608, 'learning_rate': 6.778799596741754e-06, 'epoch': 0.4} 40%|████ | 13804/34278 [15:14:13<20:27:29, 3.60s/it] 40%|████ | 13805/34278 [15:14:17<20:42:36, 3.64s/it] {'loss': 0.1257, 'grad_norm': 0.9141975771969796, 'learning_rate': 6.778358062525754e-06, 'epoch': 0.4} 40%|████ | 13805/34278 [15:14:17<20:42:36, 3.64s/it] 40%|████ | 13806/34278 [15:14:20<19:24:52, 3.41s/it] {'loss': 0.1517, 
'grad_norm': 0.7411382445326381, 'learning_rate': 6.7779165124329996e-06, 'epoch': 0.4} 40%|████ | 13806/34278 [15:14:20<19:24:52, 3.41s/it] 40%|████ | 13807/34278 [15:14:23<18:43:25, 3.29s/it] {'loss': 0.1217, 'grad_norm': 0.8542320160237078, 'learning_rate': 6.777474946467429e-06, 'epoch': 0.4} 40%|████ | 13807/34278 [15:14:23<18:43:25, 3.29s/it] 40%|████ | 13808/34278 [15:14:26<18:51:16, 3.32s/it] {'loss': 0.1413, 'grad_norm': 0.8449350094060898, 'learning_rate': 6.777033364632985e-06, 'epoch': 0.4} 40%|████ | 13808/34278 [15:14:26<18:51:16, 3.32s/it] 40%|████ | 13809/34278 [15:14:32<23:25:17, 4.12s/it] {'loss': 0.125, 'grad_norm': 0.6907324546454325, 'learning_rate': 6.776591766933615e-06, 'epoch': 0.4} 40%|████ | 13809/34278 [15:14:32<23:25:17, 4.12s/it] 40%|████ | 13810/34278 [15:14:35<21:52:27, 3.85s/it] {'loss': 0.1598, 'grad_norm': 0.8094075390524268, 'learning_rate': 6.776150153373256e-06, 'epoch': 0.4} 40%|████ | 13810/34278 [15:14:35<21:52:27, 3.85s/it] 40%|████ | 13811/34278 [15:14:38<20:31:21, 3.61s/it] {'loss': 0.1332, 'grad_norm': 0.7738383526866942, 'learning_rate': 6.775708523955853e-06, 'epoch': 0.4} 40%|████ | 13811/34278 [15:14:38<20:31:21, 3.61s/it] 40%|████ | 13812/34278 [15:14:41<19:20:14, 3.40s/it] {'loss': 0.1307, 'grad_norm': 1.0297724380694226, 'learning_rate': 6.775266878685347e-06, 'epoch': 0.4} 40%|████ | 13812/34278 [15:14:41<19:20:14, 3.40s/it] 40%|████ | 13813/34278 [15:14:47<23:45:24, 4.18s/it] {'loss': 0.1213, 'grad_norm': 0.6461842201212494, 'learning_rate': 6.774825217565683e-06, 'epoch': 0.4} 40%|████ | 13813/34278 [15:14:47<23:45:24, 4.18s/it] 40%|████ | 13814/34278 [15:14:53<26:41:24, 4.70s/it] {'loss': 0.1348, 'grad_norm': 0.8215650821296505, 'learning_rate': 6.774383540600802e-06, 'epoch': 0.4} 40%|████ | 13814/34278 [15:14:53<26:41:24, 4.70s/it] 40%|████ | 13815/34278 [15:14:57<24:59:13, 4.40s/it] {'loss': 0.1274, 'grad_norm': 0.7159503319524297, 'learning_rate': 6.773941847794649e-06, 'epoch': 0.4} 40%|████ | 13815/34278 
[15:14:57<24:59:13, 4.40s/it] 40%|████ | 13816/34278 [15:15:01<24:57:40, 4.39s/it] {'loss': 0.1448, 'grad_norm': 1.0293789422131108, 'learning_rate': 6.773500139151168e-06, 'epoch': 0.4} 40%|████ | 13816/34278 [15:15:01<24:57:40, 4.39s/it] 40%|████ | 13817/34278 [15:15:06<25:45:53, 4.53s/it] {'loss': 0.1541, 'grad_norm': 0.784740719277153, 'learning_rate': 6.7730584146743e-06, 'epoch': 0.4} 40%|████ | 13817/34278 [15:15:06<25:45:53, 4.53s/it] 40%|████ | 13818/34278 [15:15:09<23:10:09, 4.08s/it] {'loss': 0.1421, 'grad_norm': 0.7981040480949031, 'learning_rate': 6.772616674367989e-06, 'epoch': 0.4} 40%|████ | 13818/34278 [15:15:09<23:10:09, 4.08s/it] 40%|████ | 13819/34278 [15:15:14<24:27:08, 4.30s/it] {'loss': 0.1348, 'grad_norm': 1.3657133072490035, 'learning_rate': 6.772174918236181e-06, 'epoch': 0.4} 40%|████ | 13819/34278 [15:15:14<24:27:08, 4.30s/it] 40%|████ | 13820/34278 [15:15:17<22:48:06, 4.01s/it] {'loss': 0.1657, 'grad_norm': 1.1758712817903028, 'learning_rate': 6.771733146282816e-06, 'epoch': 0.4} 40%|████ | 13820/34278 [15:15:17<22:48:06, 4.01s/it] 40%|████ | 13821/34278 [15:15:20<20:44:02, 3.65s/it] {'loss': 0.1284, 'grad_norm': 0.7212966744394569, 'learning_rate': 6.7712913585118434e-06, 'epoch': 0.4} 40%|████ | 13821/34278 [15:15:20<20:44:02, 3.65s/it] 40%|████ | 13822/34278 [15:15:25<23:28:06, 4.13s/it] {'loss': 0.1612, 'grad_norm': 0.8898275224673643, 'learning_rate': 6.770849554927203e-06, 'epoch': 0.4} 40%|████ | 13822/34278 [15:15:25<23:28:06, 4.13s/it] 40%|████ | 13823/34278 [15:15:28<21:44:30, 3.83s/it] {'loss': 0.1539, 'grad_norm': 1.1733907420083407, 'learning_rate': 6.77040773553284e-06, 'epoch': 0.4} 40%|████ | 13823/34278 [15:15:28<21:44:30, 3.83s/it] 40%|████ | 13824/34278 [15:15:32<20:58:55, 3.69s/it] {'loss': 0.136, 'grad_norm': 0.8545844898165131, 'learning_rate': 6.7699659003327e-06, 'epoch': 0.4} 40%|████ | 13824/34278 [15:15:32<20:58:55, 3.69s/it] 40%|████ | 13825/34278 [15:15:35<19:35:55, 3.45s/it] {'loss': 0.1172, 'grad_norm': 
0.6994677993077235, 'learning_rate': 6.769524049330727e-06, 'epoch': 0.4} 40%|████ | 13825/34278 [15:15:35<19:35:55, 3.45s/it] 40%|████ | 13826/34278 [15:15:41<24:32:16, 4.32s/it] {'loss': 0.1431, 'grad_norm': 1.154375089571428, 'learning_rate': 6.769082182530866e-06, 'epoch': 0.4} 40%|████ | 13826/34278 [15:15:41<24:32:16, 4.32s/it] 40%|████ | 13827/34278 [15:15:44<22:41:52, 4.00s/it] {'loss': 0.1682, 'grad_norm': 0.9552074751750567, 'learning_rate': 6.76864029993706e-06, 'epoch': 0.4} 40%|████ | 13827/34278 [15:15:44<22:41:52, 4.00s/it] 40%|████ | 13828/34278 [15:15:47<21:30:04, 3.79s/it] {'loss': 0.1644, 'grad_norm': 0.9023359896616971, 'learning_rate': 6.768198401553258e-06, 'epoch': 0.4} 40%|████ | 13828/34278 [15:15:47<21:30:04, 3.79s/it] 40%|████ | 13829/34278 [15:15:51<21:07:24, 3.72s/it] {'loss': 0.135, 'grad_norm': 0.9605314386736236, 'learning_rate': 6.767756487383401e-06, 'epoch': 0.4} 40%|████ | 13829/34278 [15:15:51<21:07:24, 3.72s/it] 40%|████ | 13830/34278 [15:15:54<19:53:28, 3.50s/it] {'loss': 0.1452, 'grad_norm': 1.0007858002830456, 'learning_rate': 6.767314557431437e-06, 'epoch': 0.4} 40%|████ | 13830/34278 [15:15:54<19:53:28, 3.50s/it] 40%|████ | 13831/34278 [15:15:58<21:33:46, 3.80s/it] {'loss': 0.148, 'grad_norm': 0.7484454973488388, 'learning_rate': 6.76687261170131e-06, 'epoch': 0.4} 40%|████ | 13831/34278 [15:15:58<21:33:46, 3.80s/it] 40%|████ | 13832/34278 [15:16:01<19:48:26, 3.49s/it] {'loss': 0.1243, 'grad_norm': 0.7623243774816232, 'learning_rate': 6.766430650196966e-06, 'epoch': 0.4} 40%|████ | 13832/34278 [15:16:01<19:48:26, 3.49s/it] 40%|████ | 13833/34278 [15:16:07<24:01:34, 4.23s/it] {'loss': 0.1515, 'grad_norm': 0.8400999008001412, 'learning_rate': 6.76598867292235e-06, 'epoch': 0.4} 40%|████ | 13833/34278 [15:16:07<24:01:34, 4.23s/it] 40%|████ | 13834/34278 [15:16:11<22:31:44, 3.97s/it] {'loss': 0.1226, 'grad_norm': 0.7423313540985967, 'learning_rate': 6.765546679881412e-06, 'epoch': 0.4} 40%|████ | 13834/34278 
[15:16:11<22:31:44, 3.97s/it] 40%|████ | 13835/34278 [15:16:15<22:32:42, 3.97s/it] {'loss': 0.1341, 'grad_norm': 0.8349194568129509, 'learning_rate': 6.765104671078091e-06, 'epoch': 0.4} 40%|████ | 13835/34278 [15:16:15<22:32:42, 3.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 40%|████ | 13836/34278 [15:16:21<26:13:22, 4.62s/it] {'loss': 0.1444, 'grad_norm': 0.9111840040080945, 'learning_rate': 6.764662646516339e-06, 'epoch': 0.4} 40%|████ | 13836/34278 [15:16:21<26:13:22, 4.62s/it] 40%|████ | 13837/34278 [15:16:24<23:42:46, 4.18s/it] {'loss': 0.1619, 'grad_norm': 0.8620917387428572, 'learning_rate': 6.7642206062001e-06, 'epoch': 0.4} 40%|████ | 13837/34278 [15:16:24<23:42:46, 4.18s/it] 40%|████ | 13838/34278 [15:16:27<21:32:03, 3.79s/it] {'loss': 0.1235, 'grad_norm': 1.0970345348337347, 'learning_rate': 6.763778550133319e-06, 'epoch': 0.4} 40%|████ | 13838/34278 [15:16:27<21:32:03, 3.79s/it] 40%|████ | 13839/34278 [15:16:31<21:39:43, 3.82s/it] {'loss': 0.1236, 'grad_norm': 0.7091912525413521, 'learning_rate': 6.763336478319946e-06, 'epoch': 0.4} 40%|████ | 13839/34278 [15:16:31<21:39:43, 3.82s/it] 40%|████ | 13840/34278 [15:16:34<21:04:50, 3.71s/it] {'loss': 0.1356, 'grad_norm': 0.9205896424246612, 'learning_rate': 6.762894390763926e-06, 'epoch': 0.4} 40%|████ | 13840/34278 [15:16:34<21:04:50, 3.71s/it] 40%|████ | 13841/34278 [15:16:39<23:51:37, 4.20s/it] {'loss': 0.1302, 'grad_norm': 0.6969497000615098, 'learning_rate': 6.762452287469203e-06, 'epoch': 0.4} 40%|████ | 13841/34278 [15:16:39<23:51:37, 4.20s/it] 40%|████ | 13842/34278 [15:16:44<24:12:26, 4.26s/it] {'loss': 0.1581, 'grad_norm': 0.9055544492457576, 'learning_rate': 6.762010168439729e-06, 'epoch': 0.4} 40%|████ | 13842/34278 [15:16:44<24:12:26, 4.26s/it] 40%|████ | 13843/34278 [15:16:48<23:46:23, 4.19s/it] {'loss': 0.1268, 
'grad_norm': 0.9098400181396813, 'learning_rate': 6.7615680336794485e-06, 'epoch': 0.4} 40%|████ | 13843/34278 [15:16:48<23:46:23, 4.19s/it] 40%|████ | 13844/34278 [15:16:51<22:18:31, 3.93s/it] {'loss': 0.1566, 'grad_norm': 0.8242693953386512, 'learning_rate': 6.761125883192309e-06, 'epoch': 0.4} 40%|████ | 13844/34278 [15:16:51<22:18:31, 3.93s/it] 40%|████ | 13845/34278 [15:16:54<21:21:25, 3.76s/it] {'loss': 0.1531, 'grad_norm': 0.729184034319107, 'learning_rate': 6.7606837169822585e-06, 'epoch': 0.4} 40%|████ | 13845/34278 [15:16:55<21:21:25, 3.76s/it] 40%|████ | 13846/34278 [15:16:58<20:45:11, 3.66s/it] {'loss': 0.132, 'grad_norm': 0.681249633724076, 'learning_rate': 6.7602415350532425e-06, 'epoch': 0.4} 40%|████ | 13846/34278 [15:16:58<20:45:11, 3.66s/it] 40%|████ | 13847/34278 [15:17:01<20:09:55, 3.55s/it] {'loss': 0.1305, 'grad_norm': 0.8098940345284736, 'learning_rate': 6.759799337409212e-06, 'epoch': 0.4} 40%|████ | 13847/34278 [15:17:01<20:09:55, 3.55s/it] 40%|████ | 13848/34278 [15:17:04<19:27:36, 3.43s/it] {'loss': 0.1411, 'grad_norm': 0.9028666825782868, 'learning_rate': 6.759357124054113e-06, 'epoch': 0.4} 40%|████ | 13848/34278 [15:17:04<19:27:36, 3.43s/it] 40%|████ | 13849/34278 [15:17:11<24:07:11, 4.25s/it] {'loss': 0.1659, 'grad_norm': 0.7851334979480659, 'learning_rate': 6.758914894991892e-06, 'epoch': 0.4} 40%|████ | 13849/34278 [15:17:11<24:07:11, 4.25s/it] 40%|████ | 13850/34278 [15:17:13<21:45:00, 3.83s/it] {'loss': 0.1306, 'grad_norm': 0.8001877139594606, 'learning_rate': 6.7584726502264994e-06, 'epoch': 0.4} 40%|████ | 13850/34278 [15:17:13<21:45:00, 3.83s/it] 40%|████ | 13851/34278 [15:17:19<25:30:06, 4.49s/it] {'loss': 0.1512, 'grad_norm': 0.8583107281229556, 'learning_rate': 6.7580303897618845e-06, 'epoch': 0.4} 40%|████ | 13851/34278 [15:17:19<25:30:06, 4.49s/it] 40%|████ | 13852/34278 [15:17:24<26:09:17, 4.61s/it] {'loss': 0.147, 'grad_norm': 0.9444092669818012, 'learning_rate': 6.757588113601993e-06, 'epoch': 0.4} 40%|████ | 
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8b677772e0>
Failed to fetch sample 3291477. Exception: cannot identify image file <_io.BytesIO object at 0x7f8b677772e0>
[steps 13853–13861/34278, 40%, epoch 0.4: loss 0.120–0.159, grad_norm 0.72–1.24, learning_rate 6.7571e-06 → 6.7536e-06, ~3.8–4.5 s/it]
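The UnidentifiedImageError above is evidently caught upstream (the run logs "Failed to fetch sample … Exception: …" and keeps training), which implies a catch-and-resample pattern in the dataset's `__getitem__`. A minimal, self-contained sketch of that pattern, kept dependency-free; the names `ResilientDataset`, `load_fn`, and `max_retries` are illustrative assumptions, not the actual `aguvis/dataset.py` API:

```python
import random


class ResilientDataset:
    """Map-style dataset that resamples when a sample cannot be decoded.

    load_fn takes an index and returns the decoded sample, raising on
    corrupt data (e.g. PIL.UnidentifiedImageError from Image.open).
    """

    def __init__(self, size, load_fn, max_retries=10, seed=0):
        self.size = size
        self.load_fn = load_fn
        self.max_retries = max_retries
        self.rng = random.Random(seed)  # seeded so retries are reproducible

    def __len__(self):
        return self.size

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.load_fn(i)
            except Exception as e:
                # Mirror the log's behaviour: report the failure, then fall
                # back to a randomly chosen index instead of crashing the run.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = self.rng.randrange(self.size)
        raise RuntimeError(f"gave up after {self.max_retries} unreadable samples")
```

One corrupt image then costs one log line and one resample instead of killing a multi-day run; the trade-off is that persistent storage failures surface only as repeated log lines, so the retry count should stay small.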
[steps 13862–13898/34278: loss 0.121–0.174, grad_norm 0.65–1.69, learning_rate 6.7532e-06 → 6.7372e-06, ~3.2–4.9 s/it; the checkpoint.py:87 UserWarning (inputs lack requires_grad=True) recurs at step 13872; epoch counter ticks 0.40 → 0.41 and progress 40% → 41% at step 13883]
 41%|████ | 13899/34278 [15:20:29<19:40:44, 3.48s/it] {'loss': 0.1476, 'grad_norm': 0.9073627970640993, 'learning_rate': 6.736783502593588e-06, 'epoch': 0.41}
Token indices sequence length is longer than the specified maximum sequence length for this model (9171 > 8192). Running this sequence through the model will result in indexing errors
[steps 13900–13907/34278, 41%, epoch 0.41: loss 0.129–0.154, grad_norm 0.65–1.26, learning_rate 6.7363e-06 → 6.7332e-06, ~3.2–4.3 s/it]
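The tokenizer warning at step 13899 (9171 > 8192) means at least one conversation exceeds the model's context window; left unhandled, such samples can index past the position-embedding table exactly as the message predicts. A small sketch of a pre-flight length guard; `fit_to_context` is a hypothetical helper, not part of the training code, and 8192 is taken from the warning itself:

```python
def fit_to_context(token_ids, max_len=8192, keep="front"):
    """Clamp a tokenized sample to the model's maximum sequence length.

    Sequences longer than max_len otherwise trigger warnings like
    'Token indices sequence length is longer than the specified maximum
    sequence length for this model (9171 > 8192)'. keep="front" retains
    the earliest tokens (system prompt and first turns), keep="back"
    the most recent ones.
    """
    if len(token_ids) <= max_len:
        return list(token_ids)
    if keep == "front":
        return list(token_ids[:max_len])
    return list(token_ids[-max_len:])
```

Filtering or clamping in the dataset, before collation, is cheaper than letting the forward pass fail; for multimodal samples the image placeholder tokens must survive the cut, which is why keeping the front of the sequence is usually the safer default here.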
6.733238869672798e-06, 'epoch': 0.41} 41%|████ | 13907/34278 [15:20:59<24:29:43, 4.33s/it] 41%|████ | 13908/34278 [15:21:03<22:48:51, 4.03s/it] {'loss': 0.1449, 'grad_norm': 0.824372124571805, 'learning_rate': 6.732795720883418e-06, 'epoch': 0.41} 41%|████ | 13908/34278 [15:21:03<22:48:51, 4.03s/it] 41%|████ | 13909/34278 [15:21:06<21:17:47, 3.76s/it] {'loss': 0.1457, 'grad_norm': 0.8756119621790452, 'learning_rate': 6.732352556624054e-06, 'epoch': 0.41} 41%|████ | 13909/34278 [15:21:06<21:17:47, 3.76s/it] 41%|████ | 13910/34278 [15:21:10<21:33:05, 3.81s/it] {'loss': 0.1546, 'grad_norm': 0.8489565094040106, 'learning_rate': 6.731909376898655e-06, 'epoch': 0.41} 41%|████ | 13910/34278 [15:21:10<21:33:05, 3.81s/it] 41%|████ | 13911/34278 [15:21:13<20:17:46, 3.59s/it] {'loss': 0.1331, 'grad_norm': 0.8823960475572548, 'learning_rate': 6.731466181711187e-06, 'epoch': 0.41} 41%|████ | 13911/34278 [15:21:13<20:17:46, 3.59s/it] 41%|████ | 13912/34278 [15:21:19<24:19:26, 4.30s/it] {'loss': 0.1301, 'grad_norm': 0.9194805386582351, 'learning_rate': 6.7310229710656e-06, 'epoch': 0.41} 41%|████ | 13912/34278 [15:21:19<24:19:26, 4.30s/it] 41%|████ | 13913/34278 [15:21:23<24:16:46, 4.29s/it] {'loss': 0.1783, 'grad_norm': 0.8924955956329476, 'learning_rate': 6.730579744965853e-06, 'epoch': 0.41} 41%|████ | 13913/34278 [15:21:23<24:16:46, 4.29s/it] 41%|████ | 13914/34278 [15:21:26<22:26:42, 3.97s/it] {'loss': 0.1293, 'grad_norm': 0.7918824527035137, 'learning_rate': 6.730136503415905e-06, 'epoch': 0.41} 41%|████ | 13914/34278 [15:21:26<22:26:42, 3.97s/it] 41%|████ | 13915/34278 [15:21:32<25:56:49, 4.59s/it] {'loss': 0.1413, 'grad_norm': 1.1705718073496392, 'learning_rate': 6.72969324641971e-06, 'epoch': 0.41} 41%|████ | 13915/34278 [15:21:32<25:56:49, 4.59s/it] 41%|████ | 13916/34278 [15:21:35<23:21:11, 4.13s/it] {'loss': 0.1422, 'grad_norm': 1.3340861208821262, 'learning_rate': 6.7292499739812265e-06, 'epoch': 0.41} 41%|████ | 13916/34278 [15:21:35<23:21:11, 4.13s/it] 41%|████ | 
13917/34278 [15:21:39<21:41:52, 3.84s/it] {'loss': 0.153, 'grad_norm': 0.7533503981511502, 'learning_rate': 6.7288066861044135e-06, 'epoch': 0.41} 41%|████ | 13917/34278 [15:21:39<21:41:52, 3.84s/it] 41%|████ | 13918/34278 [15:21:43<22:00:51, 3.89s/it] {'loss': 0.1093, 'grad_norm': 0.7670577018488539, 'learning_rate': 6.728363382793226e-06, 'epoch': 0.41} 41%|████ | 13918/34278 [15:21:43<22:00:51, 3.89s/it] 41%|████ | 13919/34278 [15:21:46<21:28:19, 3.80s/it] {'loss': 0.1281, 'grad_norm': 1.0263661022388535, 'learning_rate': 6.727920064051623e-06, 'epoch': 0.41} 41%|████ | 13919/34278 [15:21:46<21:28:19, 3.80s/it] 41%|████ | 13920/34278 [15:21:50<20:50:58, 3.69s/it] {'loss': 0.1392, 'grad_norm': 1.0922248593278725, 'learning_rate': 6.727476729883562e-06, 'epoch': 0.41} 41%|████ | 13920/34278 [15:21:50<20:50:58, 3.69s/it] 41%|████ | 13921/34278 [15:21:53<19:37:00, 3.47s/it] {'loss': 0.1514, 'grad_norm': 0.8025404990280155, 'learning_rate': 6.727033380293e-06, 'epoch': 0.41} 41%|████ | 13921/34278 [15:21:53<19:37:00, 3.47s/it] 41%|████ | 13922/34278 [15:21:56<20:15:58, 3.58s/it] {'loss': 0.1371, 'grad_norm': 0.9046361301964011, 'learning_rate': 6.726590015283898e-06, 'epoch': 0.41} 41%|████ | 13922/34278 [15:21:56<20:15:58, 3.58s/it] 41%|████ | 13923/34278 [15:22:00<19:22:23, 3.43s/it] {'loss': 0.1564, 'grad_norm': 0.9155983421745632, 'learning_rate': 6.726146634860211e-06, 'epoch': 0.41} 41%|████ | 13923/34278 [15:22:00<19:22:23, 3.43s/it] 41%|████ | 13924/34278 [15:22:02<18:15:56, 3.23s/it] {'loss': 0.1161, 'grad_norm': 0.9383638023490466, 'learning_rate': 6.725703239025902e-06, 'epoch': 0.41} 41%|████ | 13924/34278 [15:22:02<18:15:56, 3.23s/it] 41%|████ | 13925/34278 [15:22:08<22:51:25, 4.04s/it] {'loss': 0.1147, 'grad_norm': 0.7331267287852953, 'learning_rate': 6.7252598277849224e-06, 'epoch': 0.41} 41%|████ | 13925/34278 [15:22:08<22:51:25, 4.04s/it] 41%|████ | 13926/34278 [15:22:14<25:26:47, 4.50s/it] {'loss': 0.14, 'grad_norm': 0.7813416010787895, 
'learning_rate': 6.724816401141238e-06, 'epoch': 0.41} 41%|████ | 13926/34278 [15:22:14<25:26:47, 4.50s/it] 41%|████ | 13927/34278 [15:22:17<23:20:14, 4.13s/it] {'loss': 0.1195, 'grad_norm': 0.7237744179297166, 'learning_rate': 6.724372959098804e-06, 'epoch': 0.41} 41%|████ | 13927/34278 [15:22:17<23:20:14, 4.13s/it] 41%|████ | 13928/34278 [15:22:20<21:03:52, 3.73s/it] {'loss': 0.1245, 'grad_norm': 0.7361917382200347, 'learning_rate': 6.723929501661577e-06, 'epoch': 0.41} 41%|████ | 13928/34278 [15:22:20<21:03:52, 3.73s/it] 41%|████ | 13929/34278 [15:22:26<24:50:22, 4.39s/it] {'loss': 0.1464, 'grad_norm': 0.8156374258026021, 'learning_rate': 6.7234860288335226e-06, 'epoch': 0.41} 41%|████ | 13929/34278 [15:22:26<24:50:22, 4.39s/it] 41%|████ | 13930/34278 [15:22:32<27:29:38, 4.86s/it] {'loss': 0.127, 'grad_norm': 0.892837999734048, 'learning_rate': 6.723042540618594e-06, 'epoch': 0.41} 41%|████ | 13930/34278 [15:22:32<27:29:38, 4.86s/it] 41%|████ | 13931/34278 [15:22:35<25:05:37, 4.44s/it] {'loss': 0.1363, 'grad_norm': 0.7440565252222753, 'learning_rate': 6.722599037020754e-06, 'epoch': 0.41} 41%|████ | 13931/34278 [15:22:35<25:05:37, 4.44s/it] 41%|████ | 13932/34278 [15:22:39<23:40:00, 4.19s/it] {'loss': 0.1534, 'grad_norm': 0.983731475863947, 'learning_rate': 6.722155518043961e-06, 'epoch': 0.41} 41%|████ | 13932/34278 [15:22:39<23:40:00, 4.19s/it] 41%|████ | 13933/34278 [15:22:42<22:00:19, 3.89s/it] {'loss': 0.1436, 'grad_norm': 0.8136877659832364, 'learning_rate': 6.721711983692174e-06, 'epoch': 0.41} 41%|████ | 13933/34278 [15:22:42<22:00:19, 3.89s/it] 41%|████ | 13934/34278 [15:22:46<22:28:41, 3.98s/it] {'loss': 0.1349, 'grad_norm': 0.7771830696086671, 'learning_rate': 6.721268433969354e-06, 'epoch': 0.41} 41%|████ | 13934/34278 [15:22:46<22:28:41, 3.98s/it] 41%|████ | 13935/34278 [15:22:50<22:02:17, 3.90s/it] {'loss': 0.1331, 'grad_norm': 0.8047228295417147, 'learning_rate': 6.720824868879461e-06, 'epoch': 0.41} 41%|████ | 13935/34278 [15:22:50<22:02:17, 
3.90s/it] 41%|████ | 13936/34278 [15:22:53<20:34:57, 3.64s/it] {'loss': 0.1345, 'grad_norm': 1.1618088355115954, 'learning_rate': 6.720381288426453e-06, 'epoch': 0.41} 41%|████ | 13936/34278 [15:22:53<20:34:57, 3.64s/it] 41%|████ | 13937/34278 [15:22:57<21:32:00, 3.81s/it] {'loss': 0.13, 'grad_norm': 0.8696414737240781, 'learning_rate': 6.719937692614291e-06, 'epoch': 0.41} 41%|████ | 13937/34278 [15:22:57<21:32:00, 3.81s/it] 41%|████ | 13938/34278 [15:23:00<19:51:55, 3.52s/it] {'loss': 0.1376, 'grad_norm': 0.8758969354676212, 'learning_rate': 6.719494081446938e-06, 'epoch': 0.41} 41%|████ | 13938/34278 [15:23:00<19:51:55, 3.52s/it] 41%|████ | 13939/34278 [15:23:03<19:05:54, 3.38s/it] {'loss': 0.144, 'grad_norm': 0.9729964811323002, 'learning_rate': 6.719050454928352e-06, 'epoch': 0.41} 41%|████ | 13939/34278 [15:23:03<19:05:54, 3.38s/it] 41%|████ | 13940/34278 [15:23:07<20:04:46, 3.55s/it] {'loss': 0.1451, 'grad_norm': 0.7917882584987335, 'learning_rate': 6.718606813062491e-06, 'epoch': 0.41} 41%|████ | 13940/34278 [15:23:07<20:04:46, 3.55s/it] 41%|████ | 13941/34278 [15:23:10<19:12:12, 3.40s/it] {'loss': 0.1478, 'grad_norm': 1.034237100355872, 'learning_rate': 6.718163155853324e-06, 'epoch': 0.41} 41%|████ | 13941/34278 [15:23:10<19:12:12, 3.40s/it] 41%|████ | 13942/34278 [15:23:13<18:40:09, 3.30s/it] {'loss': 0.131, 'grad_norm': 0.8205713409659606, 'learning_rate': 6.717719483304802e-06, 'epoch': 0.41} 41%|████ | 13942/34278 [15:23:13<18:40:09, 3.30s/it] 41%|████ | 13943/34278 [15:23:16<18:19:03, 3.24s/it] {'loss': 0.1312, 'grad_norm': 0.9173721626722815, 'learning_rate': 6.717275795420891e-06, 'epoch': 0.41} 41%|████ | 13943/34278 [15:23:16<18:19:03, 3.24s/it] 41%|████ | 13944/34278 [15:23:20<19:34:20, 3.47s/it] {'loss': 0.127, 'grad_norm': 0.8580739255366375, 'learning_rate': 6.716832092205553e-06, 'epoch': 0.41} 41%|████ | 13944/34278 [15:23:20<19:34:20, 3.47s/it] 41%|████ | 13945/34278 [15:23:24<19:41:27, 3.49s/it] {'loss': 0.1455, 'grad_norm': 
0.7384111949999307, 'learning_rate': 6.716388373662748e-06, 'epoch': 0.41} 41%|████ | 13945/34278 [15:23:24<19:41:27, 3.49s/it] 41%|████ | 13946/34278 [15:23:29<22:02:40, 3.90s/it] {'loss': 0.1337, 'grad_norm': 0.7810865620045411, 'learning_rate': 6.7159446397964365e-06, 'epoch': 0.41} 41%|████ | 13946/34278 [15:23:29<22:02:40, 3.90s/it] 41%|████ | 13947/34278 [15:23:34<23:56:41, 4.24s/it] {'loss': 0.1504, 'grad_norm': 0.608604351621167, 'learning_rate': 6.71550089061058e-06, 'epoch': 0.41} 41%|████ | 13947/34278 [15:23:34<23:56:41, 4.24s/it] 41%|████ | 13948/34278 [15:23:37<22:24:03, 3.97s/it] {'loss': 0.1451, 'grad_norm': 0.7273311664070611, 'learning_rate': 6.715057126109144e-06, 'epoch': 0.41} 41%|████ | 13948/34278 [15:23:37<22:24:03, 3.97s/it] 41%|████ | 13949/34278 [15:23:40<21:23:57, 3.79s/it] {'loss': 0.1259, 'grad_norm': 0.8768176956995616, 'learning_rate': 6.714613346296084e-06, 'epoch': 0.41} 41%|████ | 13949/34278 [15:23:40<21:23:57, 3.79s/it] 41%|████ | 13950/34278 [15:23:46<23:57:20, 4.24s/it] {'loss': 0.1361, 'grad_norm': 0.6964834610105543, 'learning_rate': 6.7141695511753665e-06, 'epoch': 0.41} 41%|████ | 13950/34278 [15:23:46<23:57:20, 4.24s/it] 41%|████ | 13951/34278 [15:23:50<24:27:07, 4.33s/it] {'loss': 0.1344, 'grad_norm': 0.8443813847819621, 'learning_rate': 6.7137257407509535e-06, 'epoch': 0.41} 41%|████ | 13951/34278 [15:23:50<24:27:07, 4.33s/it] 41%|████ | 13952/34278 [15:23:53<22:23:28, 3.97s/it] {'loss': 0.1478, 'grad_norm': 0.7819794995447436, 'learning_rate': 6.7132819150268055e-06, 'epoch': 0.41} 41%|████ | 13952/34278 [15:23:53<22:23:28, 3.97s/it] 41%|████ | 13953/34278 [15:23:56<20:27:03, 3.62s/it] {'loss': 0.1512, 'grad_norm': 1.0522374776671786, 'learning_rate': 6.712838074006886e-06, 'epoch': 0.41} 41%|████ | 13953/34278 [15:23:56<20:27:03, 3.62s/it] 41%|████ | 13954/34278 [15:24:01<22:15:31, 3.94s/it] {'loss': 0.124, 'grad_norm': 0.7433847992382673, 'learning_rate': 6.712394217695157e-06, 'epoch': 0.41} 41%|████ | 13954/34278 
[15:24:01<22:15:31, 3.94s/it]
41%|████ | 13955/34278 [15:24:04<21:44:50, 3.85s/it] {'loss': 0.1505, 'grad_norm': 0.8055643101783762, 'learning_rate': 6.71195034609558e-06, 'epoch': 0.41}
41%|████ | 13956/34278 [15:24:09<22:19:30, 3.95s/it] {'loss': 0.1362, 'grad_norm': 0.8761014103587407, 'learning_rate': 6.711506459212121e-06, 'epoch': 0.41}
41%|████ | 13957/34278 [15:24:11<20:17:32, 3.59s/it] {'loss': 0.1293, 'grad_norm': 0.8862245770951604, 'learning_rate': 6.7110625570487396e-06, 'epoch': 0.41}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f09f1346570>
Failed to fetch sample 2654735. Exception: cannot identify image file <_io.BytesIO object at 0x7f09f1346570>
41%|████ | 13958/34278 [15:24:17<23:09:50, 4.10s/it] {'loss': 0.1555, 'grad_norm': 0.7771320225609402, 'learning_rate': 6.7106186396094e-06, 'epoch': 0.41}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
41%|████ | 13959/34278 [15:24:20<21:33:20, 3.82s/it] {'loss': 0.1249, 'grad_norm': 0.8519494872743051, 'learning_rate': 6.710174706898066e-06, 'epoch': 0.41}
41%|████ | 13960/34278 [15:24:25<24:28:21, 4.34s/it] {'loss': 0.141, 'grad_norm': 0.8705566898812466, 'learning_rate': 6.7097307589187e-06, 'epoch': 0.41}
41%|████ | 13961/34278 [15:24:29<22:32:47, 4.00s/it] {'loss': 0.1804, 'grad_norm': 0.8819784237527349, 'learning_rate': 6.709286795675267e-06, 'epoch': 0.41}
41%|████ | 13962/34278 [15:24:33<22:48:50, 4.04s/it] {'loss': 0.1475, 'grad_norm': 0.8451911639438492, 'learning_rate': 6.708842817171728e-06, 'epoch': 0.41}
41%|████ | 13963/34278 [15:24:39<26:12:55, 4.65s/it] {'loss': 0.1511, 'grad_norm': 0.8277653220416917, 'learning_rate': 6.708398823412048e-06, 'epoch': 0.41}
41%|████ | 13964/34278 [15:24:42<23:54:29, 4.24s/it] {'loss': 0.1412, 'grad_norm': 0.7493702203092666, 'learning_rate': 6.707954814400194e-06, 'epoch': 0.41}
41%|████ | 13965/34278 [15:24:45<22:20:42, 3.96s/it] {'loss': 0.1299, 'grad_norm': 1.0245989458513236, 'learning_rate': 6.707510790140125e-06, 'epoch': 0.41}
41%|████ | 13965/34278 [15:24:45<22:20:42,
3.96s/it]
41%|████ | 13966/34278 [15:24:50<24:06:39, 4.27s/it] {'loss': 0.1336, 'grad_norm': 0.9824661100612829, 'learning_rate': 6.707066750635808e-06, 'epoch': 0.41}
41%|████ | 13967/34278 [15:24:56<27:08:58, 4.81s/it] {'loss': 0.1147, 'grad_norm': 0.6584938280855426, 'learning_rate': 6.706622695891205e-06, 'epoch': 0.41}
41%|████ | 13968/34278 [15:25:00<25:19:06, 4.49s/it] {'loss': 0.1276, 'grad_norm': 1.0934168843934498, 'learning_rate': 6.7061786259102836e-06, 'epoch': 0.41}
41%|████ | 13969/34278 [15:25:03<22:37:13, 4.01s/it] {'loss': 0.1469, 'grad_norm': 1.0310521905153107, 'learning_rate': 6.705734540697007e-06, 'epoch': 0.41}
41%|████ | 13970/34278 [15:25:06<20:31:41, 3.64s/it] {'loss': 0.1515, 'grad_norm': 0.8286522643462688, 'learning_rate': 6.705290440255339e-06, 'epoch': 0.41}
41%|████ | 13971/34278 [15:25:09<19:29:39, 3.46s/it] {'loss': 0.1248, 'grad_norm': 0.9236250938075553, 'learning_rate': 6.704846324589245e-06, 'epoch': 0.41}
41%|████ | 13972/34278 [15:25:13<20:23:45, 3.62s/it] {'loss': 0.1536, 'grad_norm': 1.0354918369542758, 'learning_rate': 6.704402193702688e-06, 'epoch': 0.41}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fef350c8590>
Failed to fetch sample 2701717. Exception: cannot identify image file <_io.BytesIO object at 0x7fef350c8590>
41%|████ | 13973/34278 [15:25:18<23:45:35, 4.21s/it] {'loss': 0.1354, 'grad_norm': 0.8376302250466713, 'learning_rate': 6.703958047599638e-06, 'epoch': 0.41}
41%|████ | 13974/34278 [15:25:21<21:35:28, 3.83s/it] {'loss': 0.1143, 'grad_norm': 0.8339055544052391, 'learning_rate': 6.703513886284057e-06, 'epoch': 0.41}
41%|████ | 13975/34278 [15:25:25<20:21:16, 3.61s/it] {'loss': 0.1284, 'grad_norm': 0.7567474029921512, 'learning_rate': 6.703069709759908e-06, 'epoch': 0.41}
41%|████ | 13976/34278 [15:25:30<23:48:56, 4.22s/it] {'loss': 0.1504, 'grad_norm': 0.9610262830773799, 'learning_rate': 6.702625518031163e-06, 'epoch': 0.41}
41%|████ | 13977/34278 [15:25:33<21:53:10, 3.88s/it] {'loss': 0.1793, 'grad_norm': 0.8865308435034972, 'learning_rate': 6.702181311101782e-06, 'epoch': 0.41}
41%|████ | 13977/34278
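The two UnidentifiedImageError tracebacks above show `pil_loader` in `dataset.py` raising on corrupt image bytes, after which the dataset logs "Failed to fetch sample N" and training continues. A minimal sketch of that kind of defensive loading is below; `safe_pil_loader` and `RetryingDataset` are hypothetical names for illustration, not the actual `dataset.py` code, and the resample-on-failure policy is an assumption inferred from the log.

```python
import io
import random

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes):
    """Decode image bytes, returning None instead of raising on corrupt data.

    Hypothetical guard around the pil_loader seen in the traceback.
    """
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # force the decode now so errors surface here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None


class RetryingDataset:
    """Wraps raw samples; on a bad sample, falls back to a random other index."""

    def __init__(self, samples, max_retries=64):
        self.samples = samples
        self.max_retries = max_retries

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            img = safe_pil_loader(self.samples[i])
            if img is not None:
                return img
            print(f"Failed to fetch sample {i}, resampling")
            i = random.randrange(len(self.samples))
        raise RuntimeError("too many corrupt samples in a row")

    def __len__(self):
        return len(self.samples)
```

Forcing the decode with `img.load()` matters: `Image.open` is lazy, so without it a truncated file can still blow up later inside the training step instead of in the loader where it can be caught.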
[15:25:33<21:53:10, 3.88s/it] 41%|████ | 13978/34278 [15:25:36<20:19:46, 3.61s/it] {'loss': 0.1374, 'grad_norm': 0.6574608969593219, 'learning_rate': 6.7017370889757316e-06, 'epoch': 0.41} 41%|████ | 13978/34278 [15:25:36<20:19:46, 3.61s/it] 41%|████ | 13979/34278 [15:25:42<24:18:16, 4.31s/it] {'loss': 0.136, 'grad_norm': 1.0120752558987616, 'learning_rate': 6.701292851656981e-06, 'epoch': 0.41} 41%|████ | 13979/34278 [15:25:42<24:18:16, 4.31s/it] 41%|████ | 13980/34278 [15:25:45<22:09:09, 3.93s/it] {'loss': 0.1435, 'grad_norm': 1.9644773530936295, 'learning_rate': 6.700848599149492e-06, 'epoch': 0.41} 41%|████ | 13980/34278 [15:25:45<22:09:09, 3.93s/it] 41%|████ | 13981/34278 [15:25:48<20:39:21, 3.66s/it] {'loss': 0.1643, 'grad_norm': 1.160227490228442, 'learning_rate': 6.7004043314572334e-06, 'epoch': 0.41} 41%|████ | 13981/34278 [15:25:48<20:39:21, 3.66s/it] 41%|████ | 13982/34278 [15:25:52<20:22:46, 3.61s/it] {'loss': 0.1537, 'grad_norm': 0.7789937767422863, 'learning_rate': 6.699960048584171e-06, 'epoch': 0.41} 41%|████ | 13982/34278 [15:25:52<20:22:46, 3.61s/it] 41%|████ | 13983/34278 [15:25:55<20:09:33, 3.58s/it] {'loss': 0.162, 'grad_norm': 0.8362555771905549, 'learning_rate': 6.699515750534271e-06, 'epoch': 0.41} 41%|████ | 13983/34278 [15:25:55<20:09:33, 3.58s/it] 41%|████ | 13984/34278 [15:25:59<19:42:07, 3.50s/it] {'loss': 0.1443, 'grad_norm': 1.1339844282933316, 'learning_rate': 6.699071437311499e-06, 'epoch': 0.41} 41%|████ | 13984/34278 [15:25:59<19:42:07, 3.50s/it] 41%|████ | 13985/34278 [15:26:05<23:56:25, 4.25s/it] {'loss': 0.1389, 'grad_norm': 0.9865979361972813, 'learning_rate': 6.6986271089198255e-06, 'epoch': 0.41} 41%|████ | 13985/34278 [15:26:05<23:56:25, 4.25s/it] 41%|████ | 13986/34278 [15:26:10<26:15:31, 4.66s/it] {'loss': 0.1508, 'grad_norm': 0.6972531061018157, 'learning_rate': 6.698182765363213e-06, 'epoch': 0.41} 41%|████ | 13986/34278 [15:26:10<26:15:31, 4.66s/it] 41%|████ | 13987/34278 [15:26:13<23:25:53, 4.16s/it] {'loss': 0.1378, 
'grad_norm': 0.6979498330401717, 'learning_rate': 6.69773840664563e-06, 'epoch': 0.41} 41%|████ | 13987/34278 [15:26:13<23:25:53, 4.16s/it] 41%|████ | 13988/34278 [15:26:16<20:58:51, 3.72s/it] {'loss': 0.1461, 'grad_norm': 0.8524600649031473, 'learning_rate': 6.697294032771044e-06, 'epoch': 0.41} 41%|████ | 13988/34278 [15:26:16<20:58:51, 3.72s/it] 41%|████ | 13989/34278 [15:26:19<19:57:05, 3.54s/it] {'loss': 0.1661, 'grad_norm': 0.7584244475405565, 'learning_rate': 6.696849643743423e-06, 'epoch': 0.41} 41%|████ | 13989/34278 [15:26:19<19:57:05, 3.54s/it] 41%|████ | 13990/34278 [15:26:23<20:34:58, 3.65s/it] {'loss': 0.1431, 'grad_norm': 0.7086604018047018, 'learning_rate': 6.69640523956673e-06, 'epoch': 0.41} 41%|████ | 13990/34278 [15:26:23<20:34:58, 3.65s/it] 41%|████ | 13991/34278 [15:26:26<19:51:21, 3.52s/it] {'loss': 0.1413, 'grad_norm': 0.829624793440343, 'learning_rate': 6.69596082024494e-06, 'epoch': 0.41} 41%|████ | 13991/34278 [15:26:26<19:51:21, 3.52s/it] 41%|████ | 13992/34278 [15:26:29<18:55:00, 3.36s/it] {'loss': 0.156, 'grad_norm': 0.891454242123997, 'learning_rate': 6.695516385782015e-06, 'epoch': 0.41} 41%|████ | 13992/34278 [15:26:29<18:55:00, 3.36s/it] 41%|████ | 13993/34278 [15:26:33<20:16:24, 3.60s/it] {'loss': 0.1315, 'grad_norm': 0.7184089098495967, 'learning_rate': 6.6950719361819235e-06, 'epoch': 0.41} 41%|████ | 13993/34278 [15:26:33<20:16:24, 3.60s/it] 41%|████ | 13994/34278 [15:26:37<20:36:42, 3.66s/it] {'loss': 0.14, 'grad_norm': 0.8348011185440816, 'learning_rate': 6.694627471448637e-06, 'epoch': 0.41} 41%|████ | 13994/34278 [15:26:37<20:36:42, 3.66s/it] 41%|████ | 13995/34278 [15:26:42<22:01:56, 3.91s/it] {'loss': 0.1303, 'grad_norm': 0.7968066873687387, 'learning_rate': 6.694182991586119e-06, 'epoch': 0.41} 41%|████ | 13995/34278 [15:26:42<22:01:56, 3.91s/it] 41%|████ | 13996/34278 [15:26:45<20:51:23, 3.70s/it] {'loss': 0.1311, 'grad_norm': 0.824678742292664, 'learning_rate': 6.69373849659834e-06, 'epoch': 0.41} 41%|████ | 
13996/34278 [15:26:45<20:51:23, 3.70s/it]
41%|████ | 13997/34278 [15:26:49<22:04:49, 3.92s/it] {'loss': 0.1352, 'grad_norm': 1.0167257240538659, 'learning_rate': 6.693293986489269e-06, 'epoch': 0.41}
41%|████ | 13998/34278 [15:26:52<20:59:33, 3.73s/it] {'loss': 0.1482, 'grad_norm': 1.0375429848464421, 'learning_rate': 6.692849461262871e-06, 'epoch': 0.41}
41%|████ | 13999/34278 [15:26:57<22:25:02, 3.98s/it] {'loss': 0.1226, 'grad_norm': 0.8255680474348037, 'learning_rate': 6.692404920923119e-06, 'epoch': 0.41}
41%|████ | 14000/34278 [15:27:00<20:42:16, 3.68s/it] {'loss': 0.1601, 'grad_norm': 1.3039704058943398, 'learning_rate': 6.69196036547398e-06, 'epoch': 0.41}
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
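The burst of "Set eos token id to 151658 … Set eos token id back to 151645" messages indicates the run temporarily switches the eos token to the `<|diff_marker|>` id (151658) around a generation/checkpoint step, then restores `<|im_end|>` (151645) and the generation-config list [151645, 151643]. A restore-safe version of that swap can be sketched with a context manager; `temporary_eos` is a hypothetical helper, and plain dicts stand in for the real tokenizer and generation-config objects.

```python
from contextlib import contextmanager


@contextmanager
def temporary_eos(tokenizer, gen_config, eos_id):
    """Temporarily swap eos ids and always restore them on exit.

    Mirrors the 'Set eos token id to …' / 'Set eos token id back to …'
    log lines; `tokenizer` and `gen_config` are dicts used as stand-ins
    for the actual objects in the training code.
    """
    saved_tok = tokenizer["eos_token_id"]
    saved_gen = gen_config["eos_token_id"]
    tokenizer["eos_token_id"] = eos_id
    gen_config["eos_token_id"] = [eos_id]
    print(f"Set eos token id to {eos_id}")
    try:
        yield
    finally:
        # runs even if the body raises, so the swap cannot leak
        tokenizer["eos_token_id"] = saved_tok
        gen_config["eos_token_id"] = saved_gen
        print(f"Set eos token id back to {saved_tok}")
```

Notably, one rank in the log prints "Set eos token id back to [151658]", i.e. it "restored" the diff-marker id instead of `<|im_end|>`; that is exactly the kind of missed or misordered restore a `try`/`finally` guard like this avoids.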
Gradients will be None warnings.warn( 41%|████ | 14001/34278 [15:27:34<72:25:37, 12.86s/it] {'loss': 0.1416, 'grad_norm': 1.0278586590171908, 'learning_rate': 6.6915157949194235e-06, 'epoch': 0.41} 41%|████ | 14001/34278 [15:27:34<72:25:37, 12.86s/it] 41%|████ | 14002/34278 [15:27:37<56:02:08, 9.95s/it] {'loss': 0.1479, 'grad_norm': 0.8303631477247092, 'learning_rate': 6.691071209263416e-06, 'epoch': 0.41} 41%|████ | 14002/34278 [15:27:37<56:02:08, 9.95s/it] 41%|████ | 14003/34278 [15:27:42<46:55:58, 8.33s/it] {'loss': 0.1316, 'grad_norm': 1.3375067488748817, 'learning_rate': 6.690626608509929e-06, 'epoch': 0.41} 41%|████ | 14003/34278 [15:27:42<46:55:58, 8.33s/it] 41%|████ | 14004/34278 [15:27:48<42:42:07, 7.58s/it] {'loss': 0.1434, 'grad_norm': 0.8226352275152343, 'learning_rate': 6.690181992662932e-06, 'epoch': 0.41} 41%|████ | 14004/34278 [15:27:48<42:42:07, 7.58s/it] 41%|████ | 14005/34278 [15:27:53<37:53:18, 6.73s/it] {'loss': 0.163, 'grad_norm': 0.8093347680830837, 'learning_rate': 6.689737361726392e-06, 'epoch': 0.41} 41%|████ | 14005/34278 [15:27:53<37:53:18, 6.73s/it] 41%|████ | 14006/34278 [15:27:56<31:27:14, 5.59s/it] {'loss': 0.1386, 'grad_norm': 1.2951534940925888, 'learning_rate': 6.689292715704282e-06, 'epoch': 0.41} 41%|████ | 14006/34278 [15:27:56<31:27:14, 5.59s/it] 41%|████ | 14007/34278 [15:27:59<27:28:53, 4.88s/it] {'loss': 0.1399, 'grad_norm': 0.946998839121098, 'learning_rate': 6.6888480546005695e-06, 'epoch': 0.41} 41%|████ | 14007/34278 [15:27:59<27:28:53, 4.88s/it] 41%|████ | 14008/34278 [15:28:02<24:43:04, 4.39s/it] {'loss': 0.1551, 'grad_norm': 0.8178770682230518, 'learning_rate': 6.688403378419224e-06, 'epoch': 0.41} 41%|████ | 14008/34278 [15:28:02<24:43:04, 4.39s/it] 41%|████ | 14009/34278 [15:28:05<22:10:39, 3.94s/it] {'loss': 0.1274, 'grad_norm': 0.6966696986146201, 'learning_rate': 6.687958687164217e-06, 'epoch': 0.41} 41%|████ | 14009/34278 [15:28:05<22:10:39, 3.94s/it] 41%|████ | 14010/34278 [15:28:08<21:16:25, 3.78s/it] 
{'loss': 0.1575, 'grad_norm': 0.7511313940706461, 'learning_rate': 6.6875139808395175e-06, 'epoch': 0.41} 41%|████ | 14010/34278 [15:28:08<21:16:25, 3.78s/it] 41%|████ | 14011/34278 [15:28:12<20:50:10, 3.70s/it] {'loss': 0.152, 'grad_norm': 0.9318682540544598, 'learning_rate': 6.687069259449095e-06, 'epoch': 0.41} 41%|████ | 14011/34278 [15:28:12<20:50:10, 3.70s/it] 41%|████ | 14012/34278 [15:28:15<20:18:54, 3.61s/it] {'loss': 0.1517, 'grad_norm': 0.8470311858234206, 'learning_rate': 6.686624522996922e-06, 'epoch': 0.41} 41%|████ | 14012/34278 [15:28:15<20:18:54, 3.61s/it] 41%|████ | 14013/34278 [15:28:18<19:38:38, 3.49s/it] {'loss': 0.1488, 'grad_norm': 0.7094907056735673, 'learning_rate': 6.686179771486967e-06, 'epoch': 0.41} 41%|████ | 14013/34278 [15:28:18<19:38:38, 3.49s/it] 41%|████ | 14014/34278 [15:28:24<24:02:08, 4.27s/it] {'loss': 0.1317, 'grad_norm': 0.9403389917136222, 'learning_rate': 6.685735004923203e-06, 'epoch': 0.41} 41%|████ | 14014/34278 [15:28:25<24:02:08, 4.27s/it] 41%|████ | 14015/34278 [15:28:28<22:55:18, 4.07s/it] {'loss': 0.1355, 'grad_norm': 0.797529144309354, 'learning_rate': 6.685290223309598e-06, 'epoch': 0.41} 41%|████ | 14015/34278 [15:28:28<22:55:18, 4.07s/it] 41%|████ | 14016/34278 [15:28:33<24:57:18, 4.43s/it] {'loss': 0.1369, 'grad_norm': 0.7167927773370429, 'learning_rate': 6.684845426650126e-06, 'epoch': 0.41} 41%|████ | 14016/34278 [15:28:33<24:57:18, 4.43s/it] 41%|████ | 14017/34278 [15:28:37<23:02:41, 4.09s/it] {'loss': 0.1257, 'grad_norm': 0.6890553891716754, 'learning_rate': 6.684400614948754e-06, 'epoch': 0.41} 41%|████ | 14017/34278 [15:28:37<23:02:41, 4.09s/it] 41%|████ | 14018/34278 [15:28:43<26:14:37, 4.66s/it] {'loss': 0.1415, 'grad_norm': 0.7739541831571997, 'learning_rate': 6.683955788209455e-06, 'epoch': 0.41} 41%|████ | 14018/34278 [15:28:43<26:14:37, 4.66s/it] 41%|████ | 14019/34278 [15:28:46<24:22:06, 4.33s/it] {'loss': 0.1455, 'grad_norm': 1.0386203791338848, 'learning_rate': 6.6835109464362035e-06, 'epoch': 
0.41} 41%|████ | 14019/34278 [15:28:46<24:22:06, 4.33s/it] 41%|████ | 14020/34278 [15:28:49<22:09:48, 3.94s/it] {'loss': 0.1272, 'grad_norm': 0.8341902669434255, 'learning_rate': 6.683066089632965e-06, 'epoch': 0.41} 41%|████ | 14020/34278 [15:28:49<22:09:48, 3.94s/it] 41%|████ | 14021/34278 [15:28:53<21:07:48, 3.76s/it] {'loss': 0.123, 'grad_norm': 0.7716372682979363, 'learning_rate': 6.682621217803718e-06, 'epoch': 0.41} 41%|████ | 14021/34278 [15:28:53<21:07:48, 3.76s/it] 41%|████ | 14022/34278 [15:28:57<22:34:36, 4.01s/it] {'loss': 0.1428, 'grad_norm': 0.86523064711934, 'learning_rate': 6.682176330952428e-06, 'epoch': 0.41} 41%|████ | 14022/34278 [15:28:57<22:34:36, 4.01s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12605 > 8192). Running this sequence through the model will result in indexing errors 41%|████ | 14023/34278 [15:29:03<25:24:09, 4.51s/it] {'loss': 0.1674, 'grad_norm': 0.9150572348856957, 'learning_rate': 6.681731429083068e-06, 'epoch': 0.41} 41%|████ | 14023/34278 [15:29:03<25:24:09, 4.51s/it] 41%|████ | 14024/34278 [15:29:06<23:27:06, 4.17s/it] {'loss': 0.1531, 'grad_norm': 0.8572875063355055, 'learning_rate': 6.681286512199614e-06, 'epoch': 0.41} 41%|████ | 14024/34278 [15:29:06<23:27:06, 4.17s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
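The tokenizer warning above ("Token indices sequence length is longer than the specified maximum sequence length for this model (12605 > 8192)") means one sample tokenizes past the model's maximum context and would cause indexing errors if fed through unmodified. A minimal clamp is sketched below; `truncate_token_ids` is an illustrative helper, not the training code's actual handling, and real pipelines often prefer to drop or split such a sample, since blind truncation can cut multimodal placeholder tokens in half.

```python
def truncate_token_ids(token_ids, max_length, keep_tail=True):
    """Clamp a token-id sequence to at most max_length ids.

    keep_tail=True keeps the most recent context (the end of the
    sequence); keep_tail=False keeps the beginning instead.
    """
    if len(token_ids) <= max_length:
        return token_ids
    return token_ids[-max_length:] if keep_tail else token_ids[:max_length]
```

With the limits from the log, a 12605-token sample would be clamped to the model's 8192-token window (or, more safely, skipped the way the corrupt-image samples are).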
Gradients will be None warnings.warn( 41%|████ | 14025/34278 [15:29:09<21:30:39, 3.82s/it] {'loss': 0.1212, 'grad_norm': 0.8999123305672735, 'learning_rate': 6.680841580306035e-06, 'epoch': 0.41} 41%|████ | 14025/34278 [15:29:09<21:30:39, 3.82s/it] 41%|████ | 14026/34278 [15:29:12<20:23:57, 3.63s/it] {'loss': 0.1492, 'grad_norm': 0.7772689415316842, 'learning_rate': 6.6803966334063035e-06, 'epoch': 0.41} 41%|████ | 14026/34278 [15:29:12<20:23:57, 3.63s/it] 41%|████ | 14027/34278 [15:29:16<20:11:43, 3.59s/it] {'loss': 0.1471, 'grad_norm': 1.009921908207947, 'learning_rate': 6.67995167150439e-06, 'epoch': 0.41} 41%|████ | 14027/34278 [15:29:16<20:11:43, 3.59s/it] 41%|████ | 14028/34278 [15:29:22<24:04:00, 4.28s/it] {'loss': 0.1364, 'grad_norm': 1.6112817363116887, 'learning_rate': 6.679506694604271e-06, 'epoch': 0.41} 41%|████ | 14028/34278 [15:29:22<24:04:00, 4.28s/it] 41%|████ | 14029/34278 [15:29:28<26:55:39, 4.79s/it] {'loss': 0.1575, 'grad_norm': 1.0200856189028351, 'learning_rate': 6.679061702709916e-06, 'epoch': 0.41} 41%|████ | 14029/34278 [15:29:28<26:55:39, 4.79s/it] 41%|████ | 14030/34278 [15:29:31<24:34:39, 4.37s/it] {'loss': 0.1415, 'grad_norm': 0.7929669231238768, 'learning_rate': 6.6786166958253e-06, 'epoch': 0.41} 41%|████ | 14030/34278 [15:29:31<24:34:39, 4.37s/it] 41%|████ | 14031/34278 [15:29:37<27:18:29, 4.86s/it] {'loss': 0.1445, 'grad_norm': 0.7426867630516113, 'learning_rate': 6.678171673954394e-06, 'epoch': 0.41} 41%|████ | 14031/34278 [15:29:37<27:18:29, 4.86s/it] 41%|████ | 14032/34278 [15:29:43<29:00:40, 5.16s/it] {'loss': 0.1343, 'grad_norm': 0.7855405902335418, 'learning_rate': 6.677726637101172e-06, 'epoch': 0.41} 41%|████ | 14032/34278 [15:29:43<29:00:40, 5.16s/it] 41%|████ | 14033/34278 [15:29:46<25:28:18, 4.53s/it] {'loss': 0.1424, 'grad_norm': 0.9421841198000807, 'learning_rate': 6.677281585269607e-06, 'epoch': 0.41} 41%|████ | 14033/34278 [15:29:46<25:28:18, 4.53s/it] 41%|████ | 14034/34278 [15:29:52<28:02:23, 4.99s/it] {'loss': 
0.1483, 'grad_norm': 0.6584107554121822, 'learning_rate': 6.676836518463674e-06, 'epoch': 0.41} 41%|████ | 14034/34278 [15:29:52<28:02:23, 4.99s/it] 41%|████ | 14035/34278 [15:29:55<24:31:35, 4.36s/it] {'loss': 0.1567, 'grad_norm': 0.8858738294690848, 'learning_rate': 6.676391436687343e-06, 'epoch': 0.41} 41%|████ | 14035/34278 [15:29:55<24:31:35, 4.36s/it] 41%|████ | 14036/34278 [15:30:00<26:06:46, 4.64s/it] {'loss': 0.1418, 'grad_norm': 0.881739182410847, 'learning_rate': 6.67594633994459e-06, 'epoch': 0.41} 41%|████ | 14036/34278 [15:30:00<26:06:46, 4.64s/it] 41%|████ | 14037/34278 [15:30:04<23:37:56, 4.20s/it] {'loss': 0.1675, 'grad_norm': 0.870955021038668, 'learning_rate': 6.67550122823939e-06, 'epoch': 0.41} 41%|████ | 14037/34278 [15:30:04<23:37:56, 4.20s/it] 41%|████ | 14038/34278 [15:30:10<26:47:21, 4.76s/it] {'loss': 0.146, 'grad_norm': 1.054631251684788, 'learning_rate': 6.675056101575711e-06, 'epoch': 0.41} 41%|████ | 14038/34278 [15:30:10<26:47:21, 4.76s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
[Training log excerpt, steps 14039–14215 of 34278 (41%, epoch 0.41). Each step reports a metrics dict of the form {'loss': ..., 'grad_norm': ..., 'learning_rate': ..., 'epoch': 0.41}; over this window loss fluctuates between ~0.115 and ~0.186, grad_norm between ~0.62 and ~1.99, the learning rate decays smoothly from 6.6746e-06 to 6.5965e-06, and throughput varies between ~3.3 and ~5.2 s/it. A UserWarning from torch/utils/checkpoint.py:87 — "None of the inputs have requires_grad=True. Gradients will be None" — fires intermittently, e.g. around steps 14084, 14089, 14170, and 14205. Duplicate progress-bar refresh lines removed.]
4.00s/it] {'loss': 0.1466, 'grad_norm': 0.8156922230844491, 'learning_rate': 6.596036781981909e-06, 'epoch': 0.41} 41%|████▏ | 14215/34278 [15:41:33<22:16:14, 4.00s/it] 41%|████▏ | 14216/34278 [15:41:39<25:52:10, 4.64s/it] {'loss': 0.1423, 'grad_norm': 0.6906246025773246, 'learning_rate': 6.595589056143255e-06, 'epoch': 0.41} 41%|████▏ | 14216/34278 [15:41:39<25:52:10, 4.64s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 41%|████▏ | 14217/34278 [15:41:42<23:22:44, 4.20s/it] {'loss': 0.1397, 'grad_norm': 0.777090503405836, 'learning_rate': 6.59514131605956e-06, 'epoch': 0.41} 41%|████▏ | 14217/34278 [15:41:42<23:22:44, 4.20s/it] 41%|████▏ | 14218/34278 [15:41:46<23:19:37, 4.19s/it] {'loss': 0.1652, 'grad_norm': 0.8214239367704292, 'learning_rate': 6.594693561734826e-06, 'epoch': 0.41} 41%|████▏ | 14218/34278 [15:41:46<23:19:37, 4.19s/it] 41%|████▏ | 14219/34278 [15:41:49<21:46:00, 3.91s/it] {'loss': 0.1308, 'grad_norm': 0.8431376353961676, 'learning_rate': 6.594245793173049e-06, 'epoch': 0.41} 41%|████▏ | 14219/34278 [15:41:49<21:46:00, 3.91s/it] 41%|████▏ | 14220/34278 [15:41:55<24:16:49, 4.36s/it] {'loss': 0.1349, 'grad_norm': 0.7942742018087572, 'learning_rate': 6.593798010378223e-06, 'epoch': 0.41} 41%|████▏ | 14220/34278 [15:41:55<24:16:49, 4.36s/it] 41%|████▏ | 14221/34278 [15:41:57<21:33:51, 3.87s/it] {'loss': 0.1354, 'grad_norm': 0.8673458762180498, 'learning_rate': 6.593350213354353e-06, 'epoch': 0.41} 41%|████▏ | 14221/34278 [15:41:57<21:33:51, 3.87s/it] 41%|████▏ | 14222/34278 [15:42:03<25:11:20, 4.52s/it] {'loss': 0.1356, 'grad_norm': 0.7163674722069435, 'learning_rate': 6.59290240210543e-06, 'epoch': 0.41} 41%|████▏ | 14222/34278 [15:42:03<25:11:20, 4.52s/it] 41%|████▏ | 14223/34278 [15:42:07<22:58:36, 4.12s/it] {'loss': 0.1536, 'grad_norm': 0.7939042595607231, 'learning_rate': 
6.592454576635454e-06, 'epoch': 0.41} 41%|████▏ | 14223/34278 [15:42:07<22:58:36, 4.12s/it] 41%|████▏ | 14224/34278 [15:42:10<21:20:52, 3.83s/it] {'loss': 0.1406, 'grad_norm': 0.7638234883573528, 'learning_rate': 6.592006736948425e-06, 'epoch': 0.41} 41%|████▏ | 14224/34278 [15:42:10<21:20:52, 3.83s/it] 41%|████▏ | 14225/34278 [15:42:13<21:00:20, 3.77s/it] {'loss': 0.1451, 'grad_norm': 0.6860195885204661, 'learning_rate': 6.59155888304834e-06, 'epoch': 0.41} 41%|████▏ | 14225/34278 [15:42:13<21:00:20, 3.77s/it] 42%|████▏ | 14226/34278 [15:42:17<20:27:59, 3.67s/it] {'loss': 0.1486, 'grad_norm': 0.6717922327951451, 'learning_rate': 6.5911110149391976e-06, 'epoch': 0.42} 42%|████▏ | 14226/34278 [15:42:17<20:27:59, 3.67s/it] 42%|████▏ | 14227/34278 [15:42:20<20:12:53, 3.63s/it] {'loss': 0.1322, 'grad_norm': 0.7714242591097691, 'learning_rate': 6.590663132624995e-06, 'epoch': 0.42} 42%|████▏ | 14227/34278 [15:42:20<20:12:53, 3.63s/it] 42%|████▏ | 14228/34278 [15:42:24<20:12:24, 3.63s/it] {'loss': 0.1491, 'grad_norm': 0.71453038723994, 'learning_rate': 6.590215236109731e-06, 'epoch': 0.42} 42%|████▏ | 14228/34278 [15:42:24<20:12:24, 3.63s/it] 42%|████▏ | 14229/34278 [15:42:30<24:00:59, 4.31s/it] {'loss': 0.134, 'grad_norm': 0.8724219654697126, 'learning_rate': 6.589767325397407e-06, 'epoch': 0.42} 42%|████▏ | 14229/34278 [15:42:30<24:00:59, 4.31s/it] 42%|████▏ | 14230/34278 [15:42:33<22:30:31, 4.04s/it] {'loss': 0.152, 'grad_norm': 0.9593656382652455, 'learning_rate': 6.589319400492018e-06, 'epoch': 0.42} 42%|████▏ | 14230/34278 [15:42:33<22:30:31, 4.04s/it] 42%|████▏ | 14231/34278 [15:42:39<25:40:07, 4.61s/it] {'loss': 0.13, 'grad_norm': 0.9514974888951511, 'learning_rate': 6.588871461397567e-06, 'epoch': 0.42} 42%|████▏ | 14231/34278 [15:42:39<25:40:07, 4.61s/it] 42%|████▏ | 14232/34278 [15:42:43<24:00:21, 4.31s/it] {'loss': 0.1552, 'grad_norm': 1.1342312671634571, 'learning_rate': 6.588423508118048e-06, 'epoch': 0.42} 42%|████▏ | 14232/34278 [15:42:43<24:00:21, 
4.31s/it] 42%|████▏ | 14233/34278 [15:42:49<26:41:41, 4.79s/it] {'loss': 0.142, 'grad_norm': 1.0483684124579047, 'learning_rate': 6.587975540657465e-06, 'epoch': 0.42} 42%|████▏ | 14233/34278 [15:42:49<26:41:41, 4.79s/it] 42%|████▏ | 14234/34278 [15:42:52<23:36:27, 4.24s/it] {'loss': 0.1457, 'grad_norm': 1.0360670981700808, 'learning_rate': 6.587527559019815e-06, 'epoch': 0.42} 42%|████▏ | 14234/34278 [15:42:52<23:36:27, 4.24s/it] 42%|████▏ | 14235/34278 [15:42:56<24:15:40, 4.36s/it] {'loss': 0.1359, 'grad_norm': 0.9073539394442789, 'learning_rate': 6.5870795632090965e-06, 'epoch': 0.42} 42%|████▏ | 14235/34278 [15:42:56<24:15:40, 4.36s/it] 42%|████▏ | 14236/34278 [15:42:59<21:29:30, 3.86s/it] {'loss': 0.1496, 'grad_norm': 1.0480011025979166, 'learning_rate': 6.586631553229313e-06, 'epoch': 0.42} 42%|████▏ | 14236/34278 [15:42:59<21:29:30, 3.86s/it] 42%|████▏ | 14237/34278 [15:43:02<20:01:38, 3.60s/it] {'loss': 0.1516, 'grad_norm': 0.7141874492710977, 'learning_rate': 6.5861835290844615e-06, 'epoch': 0.42} 42%|████▏ | 14237/34278 [15:43:02<20:01:38, 3.60s/it] 42%|████▏ | 14238/34278 [15:43:05<19:21:25, 3.48s/it] {'loss': 0.136, 'grad_norm': 0.7419194638758977, 'learning_rate': 6.585735490778541e-06, 'epoch': 0.42} 42%|████▏ | 14238/34278 [15:43:05<19:21:25, 3.48s/it] 42%|████▏ | 14239/34278 [15:43:08<18:29:25, 3.32s/it] {'loss': 0.1412, 'grad_norm': 0.6361997929822755, 'learning_rate': 6.585287438315553e-06, 'epoch': 0.42} 42%|████▏ | 14239/34278 [15:43:08<18:29:25, 3.32s/it] 42%|████▏ | 14240/34278 [15:43:12<18:51:50, 3.39s/it] {'loss': 0.1341, 'grad_norm': 1.7003927530082934, 'learning_rate': 6.5848393716994966e-06, 'epoch': 0.42} 42%|████▏ | 14240/34278 [15:43:12<18:51:50, 3.39s/it] 42%|████▏ | 14241/34278 [15:43:18<23:13:57, 4.17s/it] {'loss': 0.1401, 'grad_norm': 0.8609963655650889, 'learning_rate': 6.5843912909343734e-06, 'epoch': 0.42} 42%|████▏ | 14241/34278 [15:43:18<23:13:57, 4.17s/it] 42%|████▏ | 14242/34278 [15:43:21<21:40:12, 3.89s/it] {'loss': 0.1525, 
'grad_norm': 0.8800828096276959, 'learning_rate': 6.583943196024182e-06, 'epoch': 0.42} 42%|████▏ | 14242/34278 [15:43:21<21:40:12, 3.89s/it] 42%|████▏ | 14243/34278 [15:43:27<25:35:36, 4.60s/it] {'loss': 0.1498, 'grad_norm': 0.735639549049615, 'learning_rate': 6.583495086972924e-06, 'epoch': 0.42} 42%|████▏ | 14243/34278 [15:43:27<25:35:36, 4.60s/it] 42%|████▏ | 14244/34278 [15:43:30<23:15:04, 4.18s/it] {'loss': 0.1597, 'grad_norm': 1.018300855877259, 'learning_rate': 6.5830469637846e-06, 'epoch': 0.42} 42%|████▏ | 14244/34278 [15:43:31<23:15:04, 4.18s/it] 42%|████▏ | 14245/34278 [15:43:34<21:33:34, 3.87s/it] {'loss': 0.1591, 'grad_norm': 0.79913859611663, 'learning_rate': 6.582598826463211e-06, 'epoch': 0.42} 42%|████▏ | 14245/34278 [15:43:34<21:33:34, 3.87s/it] 42%|████▏ | 14246/34278 [15:43:37<20:58:53, 3.77s/it] {'loss': 0.1286, 'grad_norm': 0.827353692461927, 'learning_rate': 6.58215067501276e-06, 'epoch': 0.42} 42%|████▏ | 14246/34278 [15:43:37<20:58:53, 3.77s/it] 42%|████▏ | 14247/34278 [15:43:41<21:09:08, 3.80s/it] {'loss': 0.135, 'grad_norm': 0.67302947878698, 'learning_rate': 6.5817025094372415e-06, 'epoch': 0.42} 42%|████▏ | 14247/34278 [15:43:41<21:09:08, 3.80s/it] 42%|████▏ | 14248/34278 [15:43:44<19:44:38, 3.55s/it] {'loss': 0.1242, 'grad_norm': 1.0326698294250154, 'learning_rate': 6.581254329740663e-06, 'epoch': 0.42} 42%|████▏ | 14248/34278 [15:43:44<19:44:38, 3.55s/it] 42%|████▏ | 14249/34278 [15:43:50<23:36:37, 4.24s/it] {'loss': 0.1433, 'grad_norm': 0.7923780509932484, 'learning_rate': 6.580806135927021e-06, 'epoch': 0.42} 42%|████▏ | 14249/34278 [15:43:50<23:36:37, 4.24s/it] 42%|████▏ | 14250/34278 [15:43:53<21:21:01, 3.84s/it] {'loss': 0.1393, 'grad_norm': 0.8635468660717607, 'learning_rate': 6.580357928000321e-06, 'epoch': 0.42} 42%|████▏ | 14250/34278 [15:43:53<21:21:01, 3.84s/it] 42%|████▏ | 14251/34278 [15:43:56<20:34:44, 3.70s/it] {'loss': 0.1444, 'grad_norm': 1.1185468153332023, 'learning_rate': 6.579909705964562e-06, 'epoch': 0.42} 
42%|████▏ | 14251/34278 [15:43:56<20:34:44, 3.70s/it] 42%|████▏ | 14252/34278 [15:44:00<20:13:53, 3.64s/it] {'loss': 0.1466, 'grad_norm': 0.9428650205713817, 'learning_rate': 6.5794614698237465e-06, 'epoch': 0.42} 42%|████▏ | 14252/34278 [15:44:00<20:13:53, 3.64s/it] 42%|████▏ | 14253/34278 [15:44:03<19:16:59, 3.47s/it] {'loss': 0.1425, 'grad_norm': 0.7958016525600325, 'learning_rate': 6.579013219581876e-06, 'epoch': 0.42} 42%|████▏ | 14253/34278 [15:44:03<19:16:59, 3.47s/it] 42%|████▏ | 14254/34278 [15:44:06<19:08:41, 3.44s/it] {'loss': 0.1507, 'grad_norm': 1.004094928846351, 'learning_rate': 6.578564955242952e-06, 'epoch': 0.42} 42%|████▏ | 14254/34278 [15:44:06<19:08:41, 3.44s/it] 42%|████▏ | 14255/34278 [15:44:09<17:58:24, 3.23s/it] {'loss': 0.1497, 'grad_norm': 0.7654458813658859, 'learning_rate': 6.578116676810979e-06, 'epoch': 0.42} 42%|████▏ | 14255/34278 [15:44:09<17:58:24, 3.23s/it] 42%|████▏ | 14256/34278 [15:44:12<17:33:25, 3.16s/it] {'loss': 0.1243, 'grad_norm': 1.0215519857910305, 'learning_rate': 6.577668384289955e-06, 'epoch': 0.42} 42%|████▏ | 14256/34278 [15:44:12<17:33:25, 3.16s/it] 42%|████▏ | 14257/34278 [15:44:16<18:31:57, 3.33s/it] {'loss': 0.13, 'grad_norm': 0.8578334702034305, 'learning_rate': 6.577220077683884e-06, 'epoch': 0.42} 42%|████▏ | 14257/34278 [15:44:16<18:31:57, 3.33s/it] 42%|████▏ | 14258/34278 [15:44:22<22:59:42, 4.13s/it] {'loss': 0.1471, 'grad_norm': 0.7564010945740879, 'learning_rate': 6.57677175699677e-06, 'epoch': 0.42} 42%|████▏ | 14258/34278 [15:44:22<22:59:42, 4.13s/it] 42%|████▏ | 14259/34278 [15:44:27<25:03:01, 4.50s/it] {'loss': 0.1234, 'grad_norm': 0.9085508334254474, 'learning_rate': 6.576323422232612e-06, 'epoch': 0.42} 42%|████▏ | 14259/34278 [15:44:27<25:03:01, 4.50s/it] 42%|████▏ | 14260/34278 [15:44:30<23:01:47, 4.14s/it] {'loss': 0.1196, 'grad_norm': 0.8234747830665352, 'learning_rate': 6.575875073395417e-06, 'epoch': 0.42} 42%|████▏ | 14260/34278 [15:44:30<23:01:47, 4.14s/it] 42%|████▏ | 14261/34278 
[15:44:34<22:32:39, 4.05s/it] {'loss': 0.1273, 'grad_norm': 0.7241153612141866, 'learning_rate': 6.5754267104891855e-06, 'epoch': 0.42} 42%|████▏ | 14261/34278 [15:44:34<22:32:39, 4.05s/it] 42%|████▏ | 14262/34278 [15:44:38<21:45:44, 3.91s/it] {'loss': 0.1653, 'grad_norm': 0.696367475586553, 'learning_rate': 6.574978333517918e-06, 'epoch': 0.42} 42%|████▏ | 14262/34278 [15:44:38<21:45:44, 3.91s/it] 42%|████▏ | 14263/34278 [15:44:41<20:32:28, 3.69s/it] {'loss': 0.1326, 'grad_norm': 0.7856989032450266, 'learning_rate': 6.574529942485623e-06, 'epoch': 0.42} 42%|████▏ | 14263/34278 [15:44:41<20:32:28, 3.69s/it] 42%|████▏ | 14264/34278 [15:44:44<20:24:03, 3.67s/it] {'loss': 0.1446, 'grad_norm': 0.9594314337708004, 'learning_rate': 6.574081537396299e-06, 'epoch': 0.42} 42%|████▏ | 14264/34278 [15:44:44<20:24:03, 3.67s/it] 42%|████▏ | 14265/34278 [15:44:50<22:43:09, 4.09s/it] {'loss': 0.1201, 'grad_norm': 0.7745130245349363, 'learning_rate': 6.573633118253951e-06, 'epoch': 0.42} 42%|████▏ | 14265/34278 [15:44:50<22:43:09, 4.09s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 42%|████▏ | 14266/34278 [15:44:53<20:57:15, 3.77s/it] {'loss': 0.1628, 'grad_norm': 0.8539250642295578, 'learning_rate': 6.5731846850625824e-06, 'epoch': 0.42} 42%|████▏ | 14266/34278 [15:44:53<20:57:15, 3.77s/it] 42%|████▏ | 14267/34278 [15:44:55<19:17:15, 3.47s/it] {'loss': 0.1347, 'grad_norm': 0.8183359594746717, 'learning_rate': 6.572736237826196e-06, 'epoch': 0.42} 42%|████▏ | 14267/34278 [15:44:55<19:17:15, 3.47s/it] 42%|████▏ | 14268/34278 [15:44:59<19:21:54, 3.48s/it] {'loss': 0.1516, 'grad_norm': 0.8686788464309594, 'learning_rate': 6.572287776548797e-06, 'epoch': 0.42} 42%|████▏ | 14268/34278 [15:44:59<19:21:54, 3.48s/it] 42%|████▏ | 14269/34278 [15:45:02<18:55:03, 3.40s/it] {'loss': 0.1465, 'grad_norm': 0.9002621558730742, 'learning_rate': 6.571839301234386e-06, 'epoch': 0.42} 42%|████▏ | 14269/34278 [15:45:02<18:55:03, 3.40s/it] 42%|████▏ | 14270/34278 [15:45:05<18:51:16, 3.39s/it] {'loss': 0.129, 'grad_norm': 0.8924838825693661, 'learning_rate': 6.571390811886971e-06, 'epoch': 0.42} 42%|████▏ | 14270/34278 [15:45:05<18:51:16, 3.39s/it] 42%|████▏ | 14271/34278 [15:45:09<19:18:47, 3.48s/it] {'loss': 0.1241, 'grad_norm': 0.7981411053326004, 'learning_rate': 6.570942308510553e-06, 'epoch': 0.42} 42%|████▏ | 14271/34278 [15:45:09<19:18:47, 3.48s/it] 42%|████▏ | 14272/34278 [15:45:12<18:31:08, 3.33s/it] {'loss': 0.1224, 'grad_norm': 1.0778965375656522, 'learning_rate': 6.570493791109137e-06, 'epoch': 0.42} 42%|████▏ | 14272/34278 [15:45:12<18:31:08, 3.33s/it] 42%|████▏ | 14273/34278 [15:45:15<17:32:35, 3.16s/it] {'loss': 0.1176, 'grad_norm': 1.0137178303005545, 'learning_rate': 6.570045259686728e-06, 'epoch': 0.42} 42%|████▏ | 14273/34278 [15:45:15<17:32:35, 3.16s/it] 42%|████▏ | 14274/34278 [15:45:18<17:01:15, 3.06s/it] {'loss': 0.1361, 'grad_norm': 0.8599627669058896, 'learning_rate': 6.569596714247328e-06, 'epoch': 0.42} 42%|████▏ | 14274/34278 [15:45:18<17:01:15, 3.06s/it] 42%|████▏ | 14275/34278 [15:45:22<19:01:40, 
3.42s/it] {'loss': 0.1503, 'grad_norm': 1.2350963687471905, 'learning_rate': 6.569148154794945e-06, 'epoch': 0.42} 42%|████▏ | 14275/34278 [15:45:22<19:01:40, 3.42s/it] 42%|████▏ | 14276/34278 [15:45:26<20:43:04, 3.73s/it] {'loss': 0.1484, 'grad_norm': 1.1456342598343054, 'learning_rate': 6.568699581333583e-06, 'epoch': 0.42} 42%|████▏ | 14276/34278 [15:45:26<20:43:04, 3.73s/it] 42%|████▏ | 14277/34278 [15:45:30<20:42:43, 3.73s/it] {'loss': 0.15, 'grad_norm': 0.896590792772618, 'learning_rate': 6.568250993867242e-06, 'epoch': 0.42} 42%|████▏ | 14277/34278 [15:45:30<20:42:43, 3.73s/it] 42%|████▏ | 14278/34278 [15:45:35<22:36:35, 4.07s/it] {'loss': 0.1306, 'grad_norm': 0.9495093826242231, 'learning_rate': 6.567802392399934e-06, 'epoch': 0.42} 42%|████▏ | 14278/34278 [15:45:35<22:36:35, 4.07s/it] 42%|████▏ | 14279/34278 [15:45:38<21:03:13, 3.79s/it] {'loss': 0.117, 'grad_norm': 2.4680776091761025, 'learning_rate': 6.567353776935659e-06, 'epoch': 0.42} 42%|████▏ | 14279/34278 [15:45:38<21:03:13, 3.79s/it] 42%|████▏ | 14280/34278 [15:45:44<24:25:17, 4.40s/it] {'loss': 0.1496, 'grad_norm': 1.0618009436803515, 'learning_rate': 6.566905147478422e-06, 'epoch': 0.42} 42%|████▏ | 14280/34278 [15:45:44<24:25:17, 4.40s/it] 42%|████▏ | 14281/34278 [15:45:47<22:53:33, 4.12s/it] {'loss': 0.1631, 'grad_norm': 0.9761818292036695, 'learning_rate': 6.5664565040322325e-06, 'epoch': 0.42} 42%|████▏ | 14281/34278 [15:45:47<22:53:33, 4.12s/it] 42%|████▏ | 14282/34278 [15:45:51<22:13:33, 4.00s/it] {'loss': 0.145, 'grad_norm': 0.8427616603059416, 'learning_rate': 6.566007846601092e-06, 'epoch': 0.42} 42%|████▏ | 14282/34278 [15:45:51<22:13:33, 4.00s/it] 42%|████▏ | 14283/34278 [15:45:55<22:03:44, 3.97s/it] {'loss': 0.1511, 'grad_norm': 1.1511932611745979, 'learning_rate': 6.565559175189008e-06, 'epoch': 0.42} 42%|████▏ | 14283/34278 [15:45:55<22:03:44, 3.97s/it] 42%|████▏ | 14284/34278 [15:45:58<20:09:00, 3.63s/it] {'loss': 0.1333, 'grad_norm': 1.165155613581096, 'learning_rate': 
6.565110489799985e-06, 'epoch': 0.42} 42%|████▏ | 14284/34278 [15:45:58<20:09:00, 3.63s/it] 42%|████▏ | 14285/34278 [15:46:02<20:17:09, 3.65s/it] {'loss': 0.1482, 'grad_norm': 0.8810995836888235, 'learning_rate': 6.564661790438029e-06, 'epoch': 0.42} 42%|████▏ | 14285/34278 [15:46:02<20:17:09, 3.65s/it] 42%|████▏ | 14286/34278 [15:46:07<23:22:02, 4.21s/it] {'loss': 0.1291, 'grad_norm': 0.9295967316792687, 'learning_rate': 6.564213077107147e-06, 'epoch': 0.42} 42%|████▏ | 14286/34278 [15:46:07<23:22:02, 4.21s/it] 42%|████▏ | 14287/34278 [15:46:10<21:19:47, 3.84s/it] {'loss': 0.1496, 'grad_norm': 1.298575314455602, 'learning_rate': 6.563764349811342e-06, 'epoch': 0.42} 42%|████▏ | 14287/34278 [15:46:10<21:19:47, 3.84s/it] 42%|████▏ | 14288/34278 [15:46:16<25:08:00, 4.53s/it] {'loss': 0.1503, 'grad_norm': 0.8940517436562201, 'learning_rate': 6.563315608554624e-06, 'epoch': 0.42} 42%|████▏ | 14288/34278 [15:46:16<25:08:00, 4.53s/it] 42%|████▏ | 14289/34278 [15:46:19<22:21:16, 4.03s/it] {'loss': 0.1398, 'grad_norm': 0.949590246522986, 'learning_rate': 6.562866853340997e-06, 'epoch': 0.42} 42%|████▏ | 14289/34278 [15:46:19<22:21:16, 4.03s/it] 42%|████▏ | 14290/34278 [15:46:22<21:15:35, 3.83s/it] {'loss': 0.1338, 'grad_norm': 0.7319174477402393, 'learning_rate': 6.562418084174467e-06, 'epoch': 0.42} 42%|████▏ | 14290/34278 [15:46:22<21:15:35, 3.83s/it] 42%|████▏ | 14291/34278 [15:46:25<19:27:17, 3.50s/it] {'loss': 0.1373, 'grad_norm': 0.8444589471790982, 'learning_rate': 6.561969301059044e-06, 'epoch': 0.42} 42%|████▏ | 14291/34278 [15:46:25<19:27:17, 3.50s/it] 42%|████▏ | 14292/34278 [15:46:29<19:42:49, 3.55s/it] {'loss': 0.1312, 'grad_norm': 0.9852698717575947, 'learning_rate': 6.561520503998728e-06, 'epoch': 0.42} 42%|████▏ | 14292/34278 [15:46:29<19:42:49, 3.55s/it] 42%|████▏ | 14293/34278 [15:46:32<19:01:50, 3.43s/it] {'loss': 0.1603, 'grad_norm': 0.8153453148167985, 'learning_rate': 6.561071692997533e-06, 'epoch': 0.42} 42%|████▏ | 14293/34278 [15:46:32<19:01:50, 
3.43s/it] 42%|████▏ | 14294/34278 [15:46:35<18:11:23, 3.28s/it] {'loss': 0.1622, 'grad_norm': 0.8232336892224774, 'learning_rate': 6.560622868059461e-06, 'epoch': 0.42} 42%|████▏ | 14294/34278 [15:46:35<18:11:23, 3.28s/it] 42%|████▏ | 14295/34278 [15:46:38<18:29:23, 3.33s/it] {'loss': 0.176, 'grad_norm': 0.912693790723682, 'learning_rate': 6.56017402918852e-06, 'epoch': 0.42} 42%|████▏ | 14295/34278 [15:46:38<18:29:23, 3.33s/it] 42%|████▏ | 14296/34278 [15:46:42<18:18:09, 3.30s/it] {'loss': 0.1303, 'grad_norm': 0.7135774955610511, 'learning_rate': 6.559725176388719e-06, 'epoch': 0.42} 42%|████▏ | 14296/34278 [15:46:42<18:18:09, 3.30s/it] 42%|████▏ | 14297/34278 [15:46:45<18:04:04, 3.26s/it] {'loss': 0.1459, 'grad_norm': 0.8108366963627028, 'learning_rate': 6.559276309664064e-06, 'epoch': 0.42} 42%|████▏ | 14297/34278 [15:46:45<18:04:04, 3.26s/it] 42%|████▏ | 14298/34278 [15:46:49<19:07:51, 3.45s/it] {'loss': 0.1343, 'grad_norm': 0.8494333748366347, 'learning_rate': 6.558827429018562e-06, 'epoch': 0.42} 42%|████▏ | 14298/34278 [15:46:49<19:07:51, 3.45s/it] 42%|████▏ | 14299/34278 [15:46:52<18:24:29, 3.32s/it] {'loss': 0.1463, 'grad_norm': 0.66545406291422, 'learning_rate': 6.5583785344562204e-06, 'epoch': 0.42} 42%|████▏ | 14299/34278 [15:46:52<18:24:29, 3.32s/it] 42%|████▏ | 14300/34278 [15:46:56<19:24:11, 3.50s/it] {'loss': 0.1539, 'grad_norm': 1.0578226466632783, 'learning_rate': 6.557929625981048e-06, 'epoch': 0.42} 42%|████▏ | 14300/34278 [15:46:56<19:24:11, 3.50s/it] 42%|████▏ | 14301/34278 [15:46:59<18:42:11, 3.37s/it] {'loss': 0.1684, 'grad_norm': 0.7599889971263503, 'learning_rate': 6.557480703597051e-06, 'epoch': 0.42} 42%|████▏ | 14301/34278 [15:46:59<18:42:11, 3.37s/it] 42%|████▏ | 14302/34278 [15:47:02<18:31:20, 3.34s/it] {'loss': 0.1659, 'grad_norm': 0.7508250181418441, 'learning_rate': 6.5570317673082385e-06, 'epoch': 0.42} 42%|████▏ | 14302/34278 [15:47:02<18:31:20, 3.34s/it] 42%|████▏ | 14303/34278 [15:47:05<17:56:35, 3.23s/it] {'loss': 0.1467, 
'grad_norm': 0.8196772922829928, 'learning_rate': 6.5565828171186175e-06, 'epoch': 0.42} 42%|████▏ | 14303/34278 [15:47:05<17:56:35, 3.23s/it] 42%|████▏ | 14304/34278 [15:47:09<18:49:49, 3.39s/it] {'loss': 0.1399, 'grad_norm': 0.8000356929982854, 'learning_rate': 6.556133853032197e-06, 'epoch': 0.42} 42%|████▏ | 14304/34278 [15:47:09<18:49:49, 3.39s/it] 42%|████▏ | 14305/34278 [15:47:12<18:35:03, 3.35s/it] {'loss': 0.1308, 'grad_norm': 0.8587873979531085, 'learning_rate': 6.555684875052985e-06, 'epoch': 0.42} 42%|████▏ | 14305/34278 [15:47:12<18:35:03, 3.35s/it] 42%|████▏ | 14306/34278 [15:47:15<17:41:18, 3.19s/it] {'loss': 0.1305, 'grad_norm': 0.7785555743013844, 'learning_rate': 6.555235883184991e-06, 'epoch': 0.42} 42%|████▏ | 14306/34278 [15:47:15<17:41:18, 3.19s/it] 42%|████▏ | 14307/34278 [15:47:18<17:08:45, 3.09s/it] {'loss': 0.1392, 'grad_norm': 0.7203983435626452, 'learning_rate': 6.55478687743222e-06, 'epoch': 0.42} 42%|████▏ | 14307/34278 [15:47:18<17:08:45, 3.09s/it] 42%|████▏ | 14308/34278 [15:47:22<19:19:44, 3.48s/it] {'loss': 0.1395, 'grad_norm': 0.8982224874390651, 'learning_rate': 6.554337857798686e-06, 'epoch': 0.42} 42%|████▏ | 14308/34278 [15:47:22<19:19:44, 3.48s/it] 42%|████▏ | 14309/34278 [15:47:25<18:23:03, 3.31s/it] {'loss': 0.1334, 'grad_norm': 0.9033788887895698, 'learning_rate': 6.553888824288393e-06, 'epoch': 0.42} 42%|████▏ | 14309/34278 [15:47:25<18:23:03, 3.31s/it] 42%|████▏ | 14310/34278 [15:47:28<18:19:59, 3.31s/it] {'loss': 0.1487, 'grad_norm': 0.880630137005619, 'learning_rate': 6.55343977690535e-06, 'epoch': 0.42} 42%|████▏ | 14310/34278 [15:47:28<18:19:59, 3.31s/it] 42%|████▏ | 14311/34278 [15:47:34<22:19:59, 4.03s/it] {'loss': 0.1225, 'grad_norm': 0.7806417589295644, 'learning_rate': 6.55299071565357e-06, 'epoch': 0.42} 42%|████▏ | 14311/34278 [15:47:34<22:19:59, 4.03s/it] 42%|████▏ | 14312/34278 [15:47:37<20:18:18, 3.66s/it] {'loss': 0.1645, 'grad_norm': 0.9294498284452497, 'learning_rate': 6.552541640537058e-06, 'epoch': 
0.42} 42%|████▏ | 14312/34278 [15:47:37<20:18:18, 3.66s/it] 42%|████▏ | 14313/34278 [15:47:40<19:14:54, 3.47s/it] {'loss': 0.1535, 'grad_norm': 0.9978673562227008, 'learning_rate': 6.552092551559825e-06, 'epoch': 0.42} 42%|████▏ | 14313/34278 [15:47:40<19:14:54, 3.47s/it] 42%|████▏ | 14314/34278 [15:47:43<18:46:44, 3.39s/it] {'loss': 0.1527, 'grad_norm': 0.9282543665195457, 'learning_rate': 6.55164344872588e-06, 'epoch': 0.42} 42%|████▏ | 14314/34278 [15:47:43<18:46:44, 3.39s/it] 42%|████▏ | 14315/34278 [15:47:46<18:22:31, 3.31s/it] {'loss': 0.129, 'grad_norm': 0.8826576051342357, 'learning_rate': 6.551194332039235e-06, 'epoch': 0.42} 42%|████▏ | 14315/34278 [15:47:46<18:22:31, 3.31s/it] 42%|████▏ | 14316/34278 [15:47:52<22:43:22, 4.10s/it] {'loss': 0.1515, 'grad_norm': 0.8600713731233143, 'learning_rate': 6.550745201503894e-06, 'epoch': 0.42} 42%|████▏ | 14316/34278 [15:47:52<22:43:22, 4.10s/it] 42%|████▏ | 14317/34278 [15:47:56<22:36:19, 4.08s/it] {'loss': 0.1312, 'grad_norm': 1.198180508330796, 'learning_rate': 6.550296057123872e-06, 'epoch': 0.42} 42%|████▏ | 14317/34278 [15:47:56<22:36:19, 4.08s/it] 42%|████▏ | 14318/34278 [15:47:59<20:32:34, 3.71s/it] {'loss': 0.1515, 'grad_norm': 0.94208104984621, 'learning_rate': 6.549846898903176e-06, 'epoch': 0.42} 42%|████▏ | 14318/34278 [15:47:59<20:32:34, 3.71s/it] 42%|████▏ | 14319/34278 [15:48:02<19:31:17, 3.52s/it] {'loss': 0.156, 'grad_norm': 0.8037888861901492, 'learning_rate': 6.549397726845817e-06, 'epoch': 0.42} 42%|████▏ | 14319/34278 [15:48:02<19:31:17, 3.52s/it] 42%|████▏ | 14320/34278 [15:48:05<18:43:55, 3.38s/it] {'loss': 0.143, 'grad_norm': 0.8551748986067956, 'learning_rate': 6.548948540955806e-06, 'epoch': 0.42} 42%|████▏ | 14320/34278 [15:48:05<18:43:55, 3.38s/it] 42%|████▏ | 14321/34278 [15:48:08<18:11:45, 3.28s/it] {'loss': 0.1407, 'grad_norm': 0.9323192723203955, 'learning_rate': 6.548499341237152e-06, 'epoch': 0.42} 42%|████▏ | 14321/34278 [15:48:08<18:11:45, 3.28s/it] 42%|████▏ | 14322/34278 
[15:48:11<17:53:50, 3.23s/it] {'loss': 0.1391, 'grad_norm': 0.7603429502452196, 'learning_rate': 6.548050127693865e-06, 'epoch': 0.42} 42%|████▏ | 14322/34278 [15:48:11<17:53:50, 3.23s/it] 42%|████▏ | 14323/34278 [15:48:17<22:12:53, 4.01s/it] {'loss': 0.1343, 'grad_norm': 1.425711060718987, 'learning_rate': 6.547600900329957e-06, 'epoch': 0.42} 42%|████▏ | 14323/34278 [15:48:17<22:12:53, 4.01s/it] 42%|████▏ | 14324/34278 [15:48:20<20:29:38, 3.70s/it] {'loss': 0.139, 'grad_norm': 0.9718586473008777, 'learning_rate': 6.547151659149435e-06, 'epoch': 0.42} 42%|████▏ | 14324/34278 [15:48:20<20:29:38, 3.70s/it] 42%|████▏ | 14325/34278 [15:48:23<19:58:58, 3.61s/it] {'loss': 0.1285, 'grad_norm': 0.9740842615198593, 'learning_rate': 6.546702404156313e-06, 'epoch': 0.42} 42%|████▏ | 14325/34278 [15:48:23<19:58:58, 3.61s/it] 42%|████▏ | 14326/34278 [15:48:27<19:45:27, 3.56s/it] {'loss': 0.1466, 'grad_norm': 0.7313133104116794, 'learning_rate': 6.546253135354603e-06, 'epoch': 0.42} 42%|████▏ | 14326/34278 [15:48:27<19:45:27, 3.56s/it] 42%|████▏ | 14327/34278 [15:48:32<21:42:14, 3.92s/it] {'loss': 0.1342, 'grad_norm': 0.64246601816848, 'learning_rate': 6.545803852748314e-06, 'epoch': 0.42} 42%|████▏ | 14327/34278 [15:48:32<21:42:14, 3.92s/it] 42%|████▏ | 14328/34278 [15:48:35<20:18:44, 3.67s/it] {'loss': 0.1233, 'grad_norm': 0.805222785127508, 'learning_rate': 6.545354556341457e-06, 'epoch': 0.42} 42%|████▏ | 14328/34278 [15:48:35<20:18:44, 3.67s/it] 42%|████▏ | 14329/34278 [15:48:40<23:15:22, 4.20s/it] {'loss': 0.145, 'grad_norm': 0.7740098512065345, 'learning_rate': 6.544905246138042e-06, 'epoch': 0.42} 42%|████▏ | 14329/34278 [15:48:40<23:15:22, 4.20s/it] 42%|████▏ | 14330/34278 [15:48:43<21:47:30, 3.93s/it] {'loss': 0.136, 'grad_norm': 0.8370453777839455, 'learning_rate': 6.544455922142084e-06, 'epoch': 0.42} 42%|████▏ | 14330/34278 [15:48:43<21:47:30, 3.93s/it] 42%|████▏ | 14331/34278 [15:48:46<20:11:56, 3.65s/it] {'loss': 0.1557, 'grad_norm': 0.855682185482877, 
'learning_rate': 6.54400658435759e-06, 'epoch': 0.42} 42%|████▏ | 14331/34278 [15:48:46<20:11:56, 3.65s/it] 42%|████▏ | 14332/34278 [15:48:49<18:47:05, 3.39s/it] {'loss': 0.1391, 'grad_norm': 1.056530004972702, 'learning_rate': 6.543557232788574e-06, 'epoch': 0.42} 42%|████▏ | 14332/34278 [15:48:49<18:47:05, 3.39s/it] 42%|████▏ | 14333/34278 [15:48:53<19:03:28, 3.44s/it] {'loss': 0.153, 'grad_norm': 0.9034832075931041, 'learning_rate': 6.543107867439049e-06, 'epoch': 0.42} 42%|████▏ | 14333/34278 [15:48:53<19:03:28, 3.44s/it] 42%|████▏ | 14334/34278 [15:48:56<18:54:25, 3.41s/it] {'loss': 0.1275, 'grad_norm': 0.6181331059510731, 'learning_rate': 6.542658488313024e-06, 'epoch': 0.42} 42%|████▏ | 14334/34278 [15:48:56<18:54:25, 3.41s/it] 42%|████▏ | 14335/34278 [15:49:02<23:00:19, 4.15s/it] {'loss': 0.1293, 'grad_norm': 0.8372381582547589, 'learning_rate': 6.542209095414512e-06, 'epoch': 0.42} 42%|████▏ | 14335/34278 [15:49:02<23:00:19, 4.15s/it] 42%|████▏ | 14336/34278 [15:49:05<21:25:49, 3.87s/it] {'loss': 0.1261, 'grad_norm': 1.1441327183955639, 'learning_rate': 6.541759688747528e-06, 'epoch': 0.42} 42%|████▏ | 14336/34278 [15:49:05<21:25:49, 3.87s/it] 42%|████▏ | 14337/34278 [15:49:11<24:56:19, 4.50s/it] {'loss': 0.1454, 'grad_norm': 0.8883941768455438, 'learning_rate': 6.541310268316079e-06, 'epoch': 0.42} 42%|████▏ | 14337/34278 [15:49:11<24:56:19, 4.50s/it] 42%|████▏ | 14338/34278 [15:49:14<22:22:41, 4.04s/it] {'loss': 0.1317, 'grad_norm': 1.0577726495234712, 'learning_rate': 6.5408608341241805e-06, 'epoch': 0.42} 42%|████▏ | 14338/34278 [15:49:14<22:22:41, 4.04s/it] 42%|████▏ | 14339/34278 [15:49:17<20:56:22, 3.78s/it] {'loss': 0.1311, 'grad_norm': 1.3522374229605751, 'learning_rate': 6.5404113861758446e-06, 'epoch': 0.42} 42%|████▏ | 14339/34278 [15:49:17<20:56:22, 3.78s/it] 42%|████▏ | 14340/34278 [15:49:22<21:50:00, 3.94s/it] {'loss': 0.1569, 'grad_norm': 0.8630830853602386, 'learning_rate': 6.539961924475083e-06, 'epoch': 0.42} 42%|████▏ | 14340/34278 
[training log excerpt: steps 14341-14515 of 34278 (42%), epoch 0.42]
loss: ~0.11-0.19 per step; grad_norm mostly ~0.7-1.2 (spikes: 4.41 at step 14365, 2.08 at step 14498); learning_rate decaying from 6.5395e-06 (step 14341) to 6.4611e-06 (step 14515); step time ~3-5 s/it.
Recurring warning (emitted several times during this span):
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
3.54s/it] 42%|████▏ | 14516/34278 [16:00:53<21:01:47, 3.83s/it] {'loss': 0.1399, 'grad_norm': 0.8092654480157292, 'learning_rate': 6.460646170708445e-06, 'epoch': 0.42} 42%|████▏ | 14516/34278 [16:00:53<21:01:47, 3.83s/it] 42%|████▏ | 14517/34278 [16:00:56<19:59:03, 3.64s/it] {'loss': 0.1293, 'grad_norm': 0.9282686039633216, 'learning_rate': 6.460194338152276e-06, 'epoch': 0.42} 42%|████▏ | 14517/34278 [16:00:56<19:59:03, 3.64s/it] 42%|████▏ | 14518/34278 [16:00:59<19:13:51, 3.50s/it] {'loss': 0.1208, 'grad_norm': 0.7488744299677202, 'learning_rate': 6.459742492559842e-06, 'epoch': 0.42} 42%|████▏ | 14518/34278 [16:00:59<19:13:51, 3.50s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd5a52ba9d0>
Failed to fetch sample 3412266. Exception: cannot identify image file <_io.BytesIO object at 0x7fd5a52ba9d0>
42%|████▏ | 14519/34278 [16:01:02<18:11:12, 3.31s/it] {'loss': 0.1464, 'grad_norm': 0.919009376017385, 'learning_rate': 6.459290633935172e-06, 'epoch': 0.42} 42%|████▏ | 14519/34278 [16:01:02<18:11:12, 3.31s/it] 42%|████▏ | 14520/34278 [16:01:06<19:52:19, 3.62s/it] {'loss': 0.1453, 'grad_norm': 0.8989568180018619, 'learning_rate': 6.458838762282306e-06, 'epoch': 0.42} 42%|████▏ | 14520/34278 [16:01:06<19:52:19, 3.62s/it] 42%|████▏ | 14521/34278 [16:01:09<18:31:34, 3.38s/it] {'loss': 0.1225, 'grad_norm': 0.7095194367777338, 'learning_rate': 6.458386877605276e-06, 'epoch': 0.42} 42%|████▏ | 14521/34278 [16:01:09<18:31:34, 3.38s/it] 42%|████▏ | 14522/34278 [16:01:13<18:30:09, 3.37s/it] {'loss': 0.1393, 'grad_norm': 0.8215149469125433, 'learning_rate': 6.457934979908115e-06, 'epoch': 0.42} 42%|████▏ | 14522/34278 [16:01:13<18:30:09, 3.37s/it] 42%|████▏ | 14523/34278 [16:01:16<18:22:35, 3.35s/it] {'loss': 0.1368, 'grad_norm': 0.7256366165278979, 'learning_rate': 6.45748306919486e-06, 'epoch': 0.42} 42%|████▏ | 14523/34278 [16:01:16<18:22:35, 3.35s/it] 42%|████▏ | 14524/34278 [16:01:19<17:43:05, 3.23s/it] {'loss': 0.1318, 'grad_norm': 0.7054796777177985, 'learning_rate': 6.457031145469543e-06, 'epoch': 0.42} 42%|████▏ | 14524/34278 [16:01:19<17:43:05, 3.23s/it] 42%|████▏ | 14525/34278 [16:01:22<18:10:57, 3.31s/it] {'loss': 0.1462, 'grad_norm': 0.8383542805823619, 'learning_rate': 6.4565792087362e-06, 'epoch': 0.42} 42%|████▏ | 14525/34278 [16:01:22<18:10:57, 3.31s/it] 42%|████▏ | 14526/34278 [16:01:26<19:18:24, 3.52s/it] {'loss': 0.1517, 'grad_norm': 0.6398447224104609, 'learning_rate': 6.456127258998866e-06, 'epoch': 0.42} 42%|████▏ | 14526/34278 [16:01:26<19:18:24, 3.52s/it] 42%|████▏ | 14527/34278 [16:01:32<22:47:38, 4.15s/it] {'loss': 0.1441, 'grad_norm': 0.7920031033802036, 'learning_rate': 6.455675296261574e-06, 'epoch': 0.42} 42%|████▏ | 14527/34278 [16:01:32<22:47:38, 4.15s/it]
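[editor's note] The `Failed to fetch sample 3412266. Exception: ...` line shows the dataset catches the `PIL.UnidentifiedImageError` and keeps training instead of crashing. A minimal, hypothetical sketch of that skip-and-retry pattern (the `RobustImageDataset` class and the deterministic next-sample fallback are assumptions; only the `_get_item`/`__getitem__` split and the log message mirror the traceback):

```python
from PIL import Image, UnidentifiedImageError


class RobustImageDataset:
    """Hypothetical sketch: skip unreadable images instead of crashing the run."""

    def __init__(self, paths):
        self.paths = paths  # list of image file paths

    def _get_item(self, i):
        # Image.open()/load() raise UnidentifiedImageError or OSError on
        # corrupt or truncated bytes, as in the traceback above.
        with open(self.paths[i], "rb") as f:
            img = Image.open(f)
            img.load()  # force the decode while the file is still open
            return img

    def __getitem__(self, i):
        for _ in range(len(self.paths)):  # bounded number of retries
            try:
                return self._get_item(i)
            except (UnidentifiedImageError, OSError) as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.paths)  # fall back to the next sample
        raise RuntimeError("no readable sample found")
```

Bounding the retries keeps a systematically broken shard from looping forever; a real loader might instead resample a random index or re-fetch from object storage.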
42%|████▏ | 14528/34278 [16:01:35<21:42:43, 3.96s/it] {'loss': 0.1362, 'grad_norm': 0.8937732273513436, 'learning_rate': 6.455223320528361e-06, 'epoch': 0.42} 42%|████▏ | 14528/34278 [16:01:35<21:42:43, 3.96s/it] 42%|████▏ | 14529/34278 [16:01:39<21:16:38, 3.88s/it] {'loss': 0.1426, 'grad_norm': 0.9071346261385165, 'learning_rate': 6.454771331803262e-06, 'epoch': 0.42} 42%|████▏ | 14529/34278 [16:01:39<21:16:38, 3.88s/it] 42%|████▏ | 14530/34278 [16:01:45<24:38:40, 4.49s/it] {'loss': 0.1337, 'grad_norm': 0.734115501364661, 'learning_rate': 6.454319330090313e-06, 'epoch': 0.42} 42%|████▏ | 14530/34278 [16:01:45<24:38:40, 4.49s/it] 42%|████▏ | 14531/34278 [16:01:49<23:49:30, 4.34s/it] {'loss': 0.1339, 'grad_norm': 0.9098881502519791, 'learning_rate': 6.453867315393546e-06, 'epoch': 0.42} 42%|████▏ | 14531/34278 [16:01:49<23:49:30, 4.34s/it] 42%|████▏ | 14532/34278 [16:01:53<22:24:23, 4.09s/it] {'loss': 0.11, 'grad_norm': 1.0463011771926354, 'learning_rate': 6.453415287717e-06, 'epoch': 0.42} 42%|████▏ | 14532/34278 [16:01:53<22:24:23, 4.09s/it] 42%|████▏ | 14533/34278 [16:01:59<25:43:12, 4.69s/it] {'loss': 0.1252, 'grad_norm': 0.8406661020750411, 'learning_rate': 6.45296324706471e-06, 'epoch': 0.42} 42%|████▏ | 14533/34278 [16:01:59<25:43:12, 4.69s/it] 42%|████▏ | 14534/34278 [16:02:01<22:36:09, 4.12s/it] {'loss': 0.1392, 'grad_norm': 0.7746094591515468, 'learning_rate': 6.452511193440708e-06, 'epoch': 0.42} 42%|████▏ | 14534/34278 [16:02:01<22:36:09, 4.12s/it] 42%|████▏ | 14535/34278 [16:02:05<21:05:11, 3.84s/it] {'loss': 0.1379, 'grad_norm': 1.0186412532074003, 'learning_rate': 6.452059126849035e-06, 'epoch': 0.42} 42%|████▏ | 14535/34278 [16:02:05<21:05:11, 3.84s/it] 42%|████▏ | 14536/34278 [16:02:11<25:16:27, 4.61s/it] {'loss': 0.1355, 'grad_norm': 0.9899846271410743, 'learning_rate': 6.451607047293726e-06, 'epoch': 0.42} 42%|████▏ | 14536/34278 [16:02:11<25:16:27, 4.61s/it] 42%|████▏ | 14537/34278 [16:02:15<23:28:34, 4.28s/it] {'loss': 0.1294, 'grad_norm': 
0.7079457275795795, 'learning_rate': 6.451154954778813e-06, 'epoch': 0.42} 42%|████▏ | 14537/34278 [16:02:15<23:28:34, 4.28s/it] 42%|████▏ | 14538/34278 [16:02:18<21:31:29, 3.93s/it] {'loss': 0.1384, 'grad_norm': 0.8899821635349057, 'learning_rate': 6.4507028493083365e-06, 'epoch': 0.42} 42%|████▏ | 14538/34278 [16:02:18<21:31:29, 3.93s/it] 42%|████▏ | 14539/34278 [16:02:22<21:35:09, 3.94s/it] {'loss': 0.1489, 'grad_norm': 1.3216708054820439, 'learning_rate': 6.4502507308863316e-06, 'epoch': 0.42} 42%|████▏ | 14539/34278 [16:02:22<21:35:09, 3.94s/it] 42%|████▏ | 14540/34278 [16:02:28<25:02:31, 4.57s/it] {'loss': 0.1283, 'grad_norm': 0.7976546901755325, 'learning_rate': 6.449798599516833e-06, 'epoch': 0.42} 42%|████▏ | 14540/34278 [16:02:28<25:02:31, 4.57s/it] 42%|████▏ | 14541/34278 [16:02:31<23:32:06, 4.29s/it] {'loss': 0.1446, 'grad_norm': 0.8192831887327774, 'learning_rate': 6.44934645520388e-06, 'epoch': 0.42} 42%|████▏ | 14541/34278 [16:02:31<23:32:06, 4.29s/it] 42%|████▏ | 14542/34278 [16:02:35<22:08:58, 4.04s/it] {'loss': 0.1393, 'grad_norm': 1.0154037535246627, 'learning_rate': 6.448894297951507e-06, 'epoch': 0.42} 42%|████▏ | 14542/34278 [16:02:35<22:08:58, 4.04s/it] 42%|████▏ | 14543/34278 [16:02:38<20:56:37, 3.82s/it] {'loss': 0.1407, 'grad_norm': 0.8159730911855505, 'learning_rate': 6.448442127763752e-06, 'epoch': 0.42} 42%|████▏ | 14543/34278 [16:02:38<20:56:37, 3.82s/it] 42%|████▏ | 14544/34278 [16:02:41<19:40:50, 3.59s/it] {'loss': 0.1503, 'grad_norm': 1.1149308617173088, 'learning_rate': 6.447989944644651e-06, 'epoch': 0.42} 42%|████▏ | 14544/34278 [16:02:41<19:40:50, 3.59s/it] 42%|████▏ | 14545/34278 [16:02:45<19:23:55, 3.54s/it] {'loss': 0.1495, 'grad_norm': 0.6467155633479809, 'learning_rate': 6.447537748598241e-06, 'epoch': 0.42} 42%|████▏ | 14545/34278 [16:02:45<19:23:55, 3.54s/it] 42%|████▏ | 14546/34278 [16:02:47<18:17:23, 3.34s/it] {'loss': 0.141, 'grad_norm': 1.0370129458736717, 'learning_rate': 6.447085539628562e-06, 'epoch': 0.42} 
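[editor's note] The recurring `torch/utils/checkpoint.py:87` UserWarning in this stream is emitted at forward time by `check_backward_validity` whenever no tensor input to a checkpointed segment requires grad, which is typical when gradient checkpointing wraps layers fed by frozen embeddings. A minimal sketch reproducing and then avoiding it, assuming torch >= 2.0 (`use_reentrant=True` matches the reentrant code path this warning belongs to):

```python
import warnings

import torch
from torch.utils.checkpoint import checkpoint

layer = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # leaf tensor, requires_grad=False

# 1) No input requires grad -> the warning from the log fires at forward time.
with warnings.catch_warnings(record=True) as before:
    warnings.simplefilter("always")
    checkpoint(layer, x, use_reentrant=True)

# 2) Making the segment input differentiable (what HF Transformers'
#    model.enable_input_require_grads() arranges via a forward hook on the
#    input embeddings) avoids the warning and lets gradients flow through
#    the checkpointed block.
x.requires_grad_(True)
with warnings.catch_warnings(record=True) as after:
    warnings.simplefilter("always")
    out = checkpoint(layer, x, use_reentrant=True)
out.sum().backward()
```

Under reentrant checkpointing the warning can signal that a block with all-frozen inputs will receive no gradients, so in HF Trainer setups the usual companion to `gradient_checkpointing_enable()` is `enable_input_require_grads()`.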
42%|████▏ | 14546/34278 [16:02:47<18:17:23, 3.34s/it] 42%|████▏ | 14547/34278 [16:02:51<18:19:39, 3.34s/it] {'loss': 0.1542, 'grad_norm': 0.8784085698016646, 'learning_rate': 6.446633317739646e-06, 'epoch': 0.42} 42%|████▏ | 14547/34278 [16:02:51<18:19:39, 3.34s/it] 42%|████▏ | 14548/34278 [16:02:54<17:58:27, 3.28s/it] {'loss': 0.1434, 'grad_norm': 0.8093690792754681, 'learning_rate': 6.446181082935534e-06, 'epoch': 0.42} 42%|████▏ | 14548/34278 [16:02:54<17:58:27, 3.28s/it] 42%|████▏ | 14549/34278 [16:02:58<19:00:14, 3.47s/it] {'loss': 0.1478, 'grad_norm': 1.4952964465652518, 'learning_rate': 6.445728835220262e-06, 'epoch': 0.42} 42%|████▏ | 14549/34278 [16:02:58<19:00:14, 3.47s/it] 42%|████▏ | 14550/34278 [16:03:01<18:39:47, 3.41s/it] {'loss': 0.1148, 'grad_norm': 0.6586631728253155, 'learning_rate': 6.44527657459787e-06, 'epoch': 0.42} 42%|████▏ | 14550/34278 [16:03:01<18:39:47, 3.41s/it] 42%|████▏ | 14551/34278 [16:03:04<18:31:48, 3.38s/it] {'loss': 0.1425, 'grad_norm': 0.7988990831046745, 'learning_rate': 6.444824301072391e-06, 'epoch': 0.42} 42%|████▏ | 14551/34278 [16:03:04<18:31:48, 3.38s/it] 42%|████▏ | 14552/34278 [16:03:08<18:31:17, 3.38s/it] {'loss': 0.149, 'grad_norm': 0.7756335620708912, 'learning_rate': 6.4443720146478675e-06, 'epoch': 0.42} 42%|████▏ | 14552/34278 [16:03:08<18:31:17, 3.38s/it] 42%|████▏ | 14553/34278 [16:03:11<18:03:20, 3.30s/it] {'loss': 0.1458, 'grad_norm': 0.7363998507458365, 'learning_rate': 6.443919715328336e-06, 'epoch': 0.42} 42%|████▏ | 14553/34278 [16:03:11<18:03:20, 3.30s/it] 42%|████▏ | 14554/34278 [16:03:15<18:54:10, 3.45s/it] {'loss': 0.1527, 'grad_norm': 0.8815222354786677, 'learning_rate': 6.4434674031178314e-06, 'epoch': 0.42} 42%|████▏ | 14554/34278 [16:03:15<18:54:10, 3.45s/it] 42%|████▏ | 14555/34278 [16:03:18<18:30:38, 3.38s/it] {'loss': 0.1362, 'grad_norm': 0.924317890620181, 'learning_rate': 6.443015078020397e-06, 'epoch': 0.42} 42%|████▏ | 14555/34278 [16:03:18<18:30:38, 3.38s/it] 42%|████▏ | 14556/34278 
[16:03:21<18:09:55, 3.32s/it] {'loss': 0.1226, 'grad_norm': 0.6482306932868144, 'learning_rate': 6.442562740040067e-06, 'epoch': 0.42} 42%|████▏ | 14556/34278 [16:03:21<18:09:55, 3.32s/it] 42%|████▏ | 14557/34278 [16:03:25<19:25:30, 3.55s/it] {'loss': 0.1348, 'grad_norm': 0.8306777726432536, 'learning_rate': 6.442110389180881e-06, 'epoch': 0.42} 42%|████▏ | 14557/34278 [16:03:25<19:25:30, 3.55s/it] 42%|████▏ | 14558/34278 [16:03:29<19:10:48, 3.50s/it] {'loss': 0.1212, 'grad_norm': 0.8132825556688807, 'learning_rate': 6.4416580254468795e-06, 'epoch': 0.42} 42%|████▏ | 14558/34278 [16:03:29<19:10:48, 3.50s/it] 42%|████▏ | 14559/34278 [16:03:32<18:54:44, 3.45s/it] {'loss': 0.1524, 'grad_norm': 0.8170276126492356, 'learning_rate': 6.441205648842097e-06, 'epoch': 0.42} 42%|████▏ | 14559/34278 [16:03:32<18:54:44, 3.45s/it] 42%|████▏ | 14560/34278 [16:03:38<22:32:59, 4.12s/it] {'loss': 0.1311, 'grad_norm': 0.8744322554012237, 'learning_rate': 6.440753259370575e-06, 'epoch': 0.42} 42%|████▏ | 14560/34278 [16:03:38<22:32:59, 4.12s/it] 42%|████▏ | 14561/34278 [16:03:41<21:58:31, 4.01s/it] {'loss': 0.1267, 'grad_norm': 1.0156791192303587, 'learning_rate': 6.440300857036354e-06, 'epoch': 0.42} 42%|████▏ | 14561/34278 [16:03:41<21:58:31, 4.01s/it] 42%|████▏ | 14562/34278 [16:03:45<22:13:20, 4.06s/it] {'loss': 0.1473, 'grad_norm': 0.977216342020889, 'learning_rate': 6.439848441843469e-06, 'epoch': 0.42} 42%|████▏ | 14562/34278 [16:03:45<22:13:20, 4.06s/it] 42%|████▏ | 14563/34278 [16:03:48<20:12:03, 3.69s/it] {'loss': 0.1433, 'grad_norm': 0.9484923266452084, 'learning_rate': 6.439396013795961e-06, 'epoch': 0.42} 42%|████▏ | 14563/34278 [16:03:48<20:12:03, 3.69s/it] 42%|████▏ | 14564/34278 [16:03:51<19:19:04, 3.53s/it] {'loss': 0.1459, 'grad_norm': 0.9721265502642883, 'learning_rate': 6.438943572897869e-06, 'epoch': 0.42} 42%|████▏ | 14564/34278 [16:03:51<19:19:04, 3.53s/it] 42%|████▏ | 14565/34278 [16:03:57<22:03:01, 4.03s/it] {'loss': 0.169, 'grad_norm': 0.8066516855911635, 
'learning_rate': 6.4384911191532316e-06, 'epoch': 0.42} 42%|████▏ | 14565/34278 [16:03:57<22:03:01, 4.03s/it] 42%|████▏ | 14566/34278 [16:04:00<21:25:58, 3.91s/it] {'loss': 0.1259, 'grad_norm': 1.0033952960392376, 'learning_rate': 6.43803865256609e-06, 'epoch': 0.42} 42%|████▏ | 14566/34278 [16:04:00<21:25:58, 3.91s/it] 42%|████▏ | 14567/34278 [16:04:03<20:03:36, 3.66s/it] {'loss': 0.165, 'grad_norm': 1.0417315083259715, 'learning_rate': 6.437586173140482e-06, 'epoch': 0.42} 42%|████▏ | 14567/34278 [16:04:03<20:03:36, 3.66s/it] 42%|████▏ | 14568/34278 [16:04:09<24:03:34, 4.39s/it] {'loss': 0.1449, 'grad_norm': 0.9037656702375, 'learning_rate': 6.43713368088045e-06, 'epoch': 0.42} 42%|████▏ | 14568/34278 [16:04:09<24:03:34, 4.39s/it] 43%|████▎ | 14569/34278 [16:04:13<21:59:30, 4.02s/it] {'loss': 0.1257, 'grad_norm': 0.7885990323875087, 'learning_rate': 6.436681175790028e-06, 'epoch': 0.43} 43%|████▎ | 14569/34278 [16:04:13<21:59:30, 4.02s/it] 43%|████▎ | 14570/34278 [16:04:16<20:58:03, 3.83s/it] {'loss': 0.1671, 'grad_norm': 0.8877062285620676, 'learning_rate': 6.4362286578732626e-06, 'epoch': 0.43} 43%|████▎ | 14570/34278 [16:04:16<20:58:03, 3.83s/it] 43%|████▎ | 14571/34278 [16:04:19<19:53:08, 3.63s/it] {'loss': 0.1491, 'grad_norm': 0.9317443602253012, 'learning_rate': 6.4357761271341876e-06, 'epoch': 0.43} 43%|████▎ | 14571/34278 [16:04:19<19:53:08, 3.63s/it] 43%|████▎ | 14572/34278 [16:04:25<23:43:59, 4.34s/it] {'loss': 0.1796, 'grad_norm': 1.0402581857014654, 'learning_rate': 6.435323583576847e-06, 'epoch': 0.43} 43%|████▎ | 14572/34278 [16:04:25<23:43:59, 4.34s/it] 43%|████▎ | 14573/34278 [16:04:29<22:48:56, 4.17s/it] {'loss': 0.1152, 'grad_norm': 0.9707473246841538, 'learning_rate': 6.434871027205282e-06, 'epoch': 0.43} 43%|████▎ | 14573/34278 [16:04:29<22:48:56, 4.17s/it] 43%|████▎ | 14574/34278 [16:04:33<22:53:12, 4.18s/it] {'loss': 0.1311, 'grad_norm': 0.8164273297113386, 'learning_rate': 6.434418458023529e-06, 'epoch': 0.43} 43%|████▎ | 14574/34278 
[16:04:33<22:53:12, 4.18s/it] 43%|████▎ | 14575/34278 [16:04:36<21:17:46, 3.89s/it] {'loss': 0.1492, 'grad_norm': 0.8475827553944394, 'learning_rate': 6.433965876035631e-06, 'epoch': 0.43} 43%|████▎ | 14575/34278 [16:04:36<21:17:46, 3.89s/it] 43%|████▎ | 14576/34278 [16:04:40<21:37:19, 3.95s/it] {'loss': 0.1274, 'grad_norm': 0.7173770494645649, 'learning_rate': 6.433513281245628e-06, 'epoch': 0.43} 43%|████▎ | 14576/34278 [16:04:40<21:37:19, 3.95s/it] 43%|████▎ | 14577/34278 [16:04:44<20:15:22, 3.70s/it] {'loss': 0.1228, 'grad_norm': 0.8127496231813877, 'learning_rate': 6.43306067365756e-06, 'epoch': 0.43} 43%|████▎ | 14577/34278 [16:04:44<20:15:22, 3.70s/it] 43%|████▎ | 14578/34278 [16:04:49<22:25:08, 4.10s/it] {'loss': 0.1198, 'grad_norm': 0.6928186794485005, 'learning_rate': 6.43260805327547e-06, 'epoch': 0.43} 43%|████▎ | 14578/34278 [16:04:49<22:25:08, 4.10s/it] 43%|████▎ | 14579/34278 [16:04:52<20:55:20, 3.82s/it] {'loss': 0.1304, 'grad_norm': 0.7726298241645552, 'learning_rate': 6.432155420103396e-06, 'epoch': 0.43} 43%|████▎ | 14579/34278 [16:04:52<20:55:20, 3.82s/it] 43%|████▎ | 14580/34278 [16:04:55<19:48:33, 3.62s/it] {'loss': 0.1569, 'grad_norm': 0.7400268183665162, 'learning_rate': 6.431702774145381e-06, 'epoch': 0.43} 43%|████▎ | 14580/34278 [16:04:55<19:48:33, 3.62s/it] 43%|████▎ | 14581/34278 [16:04:58<18:55:12, 3.46s/it] {'loss': 0.1326, 'grad_norm': 0.9906871144526957, 'learning_rate': 6.4312501154054655e-06, 'epoch': 0.43} 43%|████▎ | 14581/34278 [16:04:58<18:55:12, 3.46s/it] 43%|████▎ | 14582/34278 [16:05:01<18:30:32, 3.38s/it] {'loss': 0.132, 'grad_norm': 0.7263777764166814, 'learning_rate': 6.430797443887689e-06, 'epoch': 0.43} 43%|████▎ | 14582/34278 [16:05:01<18:30:32, 3.38s/it] 43%|████▎ | 14583/34278 [16:05:07<22:29:27, 4.11s/it] {'loss': 0.1349, 'grad_norm': 0.7966484002152375, 'learning_rate': 6.430344759596096e-06, 'epoch': 0.43} 43%|████▎ | 14583/34278 [16:05:07<22:29:27, 4.11s/it] 43%|████▎ | 14584/34278 [16:05:10<21:08:12, 3.86s/it] 
{'loss': 0.15, 'grad_norm': 0.8534298329644914, 'learning_rate': 6.429892062534726e-06, 'epoch': 0.43} 43%|████▎ | 14584/34278 [16:05:10<21:08:12, 3.86s/it] 43%|████▎ | 14585/34278 [16:05:15<22:44:12, 4.16s/it] {'loss': 0.1386, 'grad_norm': 1.0236583563235566, 'learning_rate': 6.429439352707623e-06, 'epoch': 0.43} 43%|████▎ | 14585/34278 [16:05:15<22:44:12, 4.16s/it] 43%|████▎ | 14586/34278 [16:05:21<25:35:29, 4.68s/it] {'loss': 0.1505, 'grad_norm': 0.7415576548091419, 'learning_rate': 6.428986630118824e-06, 'epoch': 0.43} 43%|████▎ | 14586/34278 [16:05:21<25:35:29, 4.68s/it] 43%|████▎ | 14587/34278 [16:05:24<23:02:26, 4.21s/it] {'loss': 0.133, 'grad_norm': 0.7181493695752493, 'learning_rate': 6.428533894772373e-06, 'epoch': 0.43} 43%|████▎ | 14587/34278 [16:05:24<23:02:26, 4.21s/it] 43%|████▎ | 14588/34278 [16:05:27<21:29:22, 3.93s/it] {'loss': 0.1403, 'grad_norm': 0.7921312425054876, 'learning_rate': 6.428081146672315e-06, 'epoch': 0.43} 43%|████▎ | 14588/34278 [16:05:27<21:29:22, 3.93s/it] 43%|████▎ | 14589/34278 [16:05:30<19:51:01, 3.63s/it] {'loss': 0.1396, 'grad_norm': 0.8902890918129877, 'learning_rate': 6.427628385822688e-06, 'epoch': 0.43} 43%|████▎ | 14589/34278 [16:05:30<19:51:01, 3.63s/it] 43%|████▎ | 14590/34278 [16:05:37<24:19:05, 4.45s/it] {'loss': 0.1617, 'grad_norm': 0.8471636344004445, 'learning_rate': 6.427175612227535e-06, 'epoch': 0.43} 43%|████▎ | 14590/34278 [16:05:37<24:19:05, 4.45s/it] 43%|████▎ | 14591/34278 [16:05:40<22:19:56, 4.08s/it] {'loss': 0.1448, 'grad_norm': 0.9145104986876964, 'learning_rate': 6.4267228258909e-06, 'epoch': 0.43} 43%|████▎ | 14591/34278 [16:05:40<22:19:56, 4.08s/it] 43%|████▎ | 14592/34278 [16:05:43<20:46:43, 3.80s/it] {'loss': 0.1206, 'grad_norm': 1.274479869068229, 'learning_rate': 6.426270026816824e-06, 'epoch': 0.43} 43%|████▎ | 14592/34278 [16:05:43<20:46:43, 3.80s/it] 43%|████▎ | 14593/34278 [16:05:47<20:23:30, 3.73s/it] {'loss': 0.1352, 'grad_norm': 0.6782055025083349, 'learning_rate': 
6.425817215009349e-06, 'epoch': 0.43} 43%|████▎ | 14593/34278 [16:05:47<20:23:30, 3.73s/it] 43%|████▎ | 14594/34278 [16:05:50<19:13:19, 3.52s/it] {'loss': 0.1496, 'grad_norm': 0.8963187453287486, 'learning_rate': 6.425364390472518e-06, 'epoch': 0.43} 43%|████▎ | 14594/34278 [16:05:50<19:13:19, 3.52s/it] 43%|████▎ | 14595/34278 [16:05:52<17:45:43, 3.25s/it] {'loss': 0.1437, 'grad_norm': 0.7295762114599001, 'learning_rate': 6.424911553210376e-06, 'epoch': 0.43} 43%|████▎ | 14595/34278 [16:05:52<17:45:43, 3.25s/it] 43%|████▎ | 14596/34278 [16:05:56<17:48:41, 3.26s/it] {'loss': 0.122, 'grad_norm': 0.7534925172133695, 'learning_rate': 6.4244587032269615e-06, 'epoch': 0.43} 43%|████▎ | 14596/34278 [16:05:56<17:48:41, 3.26s/it] 43%|████▎ | 14597/34278 [16:06:00<19:12:11, 3.51s/it] {'loss': 0.1509, 'grad_norm': 0.7328190284111709, 'learning_rate': 6.424005840526321e-06, 'epoch': 0.43} 43%|████▎ | 14597/34278 [16:06:00<19:12:11, 3.51s/it] 43%|████▎ | 14598/34278 [16:06:03<18:50:59, 3.45s/it] {'loss': 0.1511, 'grad_norm': 0.7813073881683321, 'learning_rate': 6.423552965112496e-06, 'epoch': 0.43} 43%|████▎ | 14598/34278 [16:06:03<18:50:59, 3.45s/it] 43%|████▎ | 14599/34278 [16:06:06<17:59:11, 3.29s/it] {'loss': 0.1505, 'grad_norm': 0.9991050549661694, 'learning_rate': 6.42310007698953e-06, 'epoch': 0.43} 43%|████▎ | 14599/34278 [16:06:06<17:59:11, 3.29s/it] 43%|████▎ | 14600/34278 [16:06:09<18:04:57, 3.31s/it] {'loss': 0.1314, 'grad_norm': 0.8524195923266077, 'learning_rate': 6.4226471761614675e-06, 'epoch': 0.43} 43%|████▎ | 14600/34278 [16:06:09<18:04:57, 3.31s/it] 43%|████▎ | 14601/34278 [16:06:12<17:45:23, 3.25s/it] {'loss': 0.1393, 'grad_norm': 0.9428984302763539, 'learning_rate': 6.422194262632349e-06, 'epoch': 0.43} 43%|████▎ | 14601/34278 [16:06:12<17:45:23, 3.25s/it] 43%|████▎ | 14602/34278 [16:06:15<17:11:50, 3.15s/it] {'loss': 0.1408, 'grad_norm': 0.7506115387752303, 'learning_rate': 6.421741336406218e-06, 'epoch': 0.43} 43%|████▎ | 14602/34278 [16:06:15<17:11:50, 
3.15s/it] 43%|████▎ | 14603/34278 [16:06:18<16:53:05, 3.09s/it] {'loss': 0.1451, 'grad_norm': 0.8425262154277137, 'learning_rate': 6.4212883974871236e-06, 'epoch': 0.43} 43%|████▎ | 14603/34278 [16:06:18<16:53:05, 3.09s/it] 43%|████▎ | 14604/34278 [16:06:22<18:08:12, 3.32s/it] {'loss': 0.1551, 'grad_norm': 0.8511437934435212, 'learning_rate': 6.4208354458791035e-06, 'epoch': 0.43} 43%|████▎ | 14604/34278 [16:06:22<18:08:12, 3.32s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
43%|████▎ | 14605/34278 [16:06:26<18:54:55, 3.46s/it] {'loss': 0.1641, 'grad_norm': 0.881979714026643, 'learning_rate': 6.420382481586203e-06, 'epoch': 0.43} 43%|████▎ | 14605/34278 [16:06:26<18:54:55, 3.46s/it] 43%|████▎ | 14606/34278 [16:06:32<23:23:04, 4.28s/it] {'loss': 0.1662, 'grad_norm': 0.9978972110053544, 'learning_rate': 6.419929504612469e-06, 'epoch': 0.43} 43%|████▎ | 14606/34278 [16:06:32<23:23:04, 4.28s/it] 43%|████▎ | 14607/34278 [16:06:35<21:58:49, 4.02s/it] {'loss': 0.1579, 'grad_norm': 0.8786849002723329, 'learning_rate': 6.419476514961942e-06, 'epoch': 0.43} 43%|████▎ | 14607/34278 [16:06:35<21:58:49, 4.02s/it] 43%|████▎ | 14608/34278 [16:06:39<20:37:45, 3.78s/it] {'loss': 0.1448, 'grad_norm': 1.0140316415914195, 'learning_rate': 6.419023512638667e-06, 'epoch': 0.43} 43%|████▎ | 14608/34278 [16:06:39<20:37:45, 3.78s/it] 43%|████▎ | 14609/34278 [16:06:42<19:07:27, 3.50s/it] {'loss': 0.1406, 'grad_norm': 0.9005428459177359, 'learning_rate': 6.41857049764669e-06, 'epoch': 0.43} 43%|████▎ | 14609/34278 [16:06:42<19:07:27, 3.50s/it] 43%|████▎ | 14610/34278 [16:06:45<19:22:33, 3.55s/it] {'loss': 0.1507, 'grad_norm': 0.7864340508166872, 'learning_rate': 6.418117469990053e-06, 'epoch': 0.43} 43%|████▎ | 14610/34278 [16:06:45<19:22:33, 3.55s/it] 43%|████▎ | 14611/34278 [16:06:48<18:16:06, 3.34s/it] {'loss':
0.1246, 'grad_norm': 0.8805637404277392, 'learning_rate': 6.417664429672803e-06, 'epoch': 0.43} 43%|████▎ | 14611/34278 [16:06:48<18:16:06, 3.34s/it] 43%|████▎ | 14612/34278 [16:06:52<20:00:12, 3.66s/it] {'loss': 0.157, 'grad_norm': 0.8653322469341848, 'learning_rate': 6.417211376698982e-06, 'epoch': 0.43} 43%|████▎ | 14612/34278 [16:06:52<20:00:12, 3.66s/it] 43%|████▎ | 14613/34278 [16:06:56<19:04:22, 3.49s/it] {'loss': 0.1357, 'grad_norm': 0.7611503716150947, 'learning_rate': 6.416758311072638e-06, 'epoch': 0.43} 43%|████▎ | 14613/34278 [16:06:56<19:04:22, 3.49s/it] 43%|████▎ | 14614/34278 [16:07:01<21:37:19, 3.96s/it] {'loss': 0.1477, 'grad_norm': 0.7678749486288232, 'learning_rate': 6.416305232797813e-06, 'epoch': 0.43} 43%|████▎ | 14614/34278 [16:07:01<21:37:19, 3.96s/it] 43%|████▎ | 14615/34278 [16:07:04<21:06:09, 3.86s/it] {'loss': 0.1613, 'grad_norm': 0.7823079024797566, 'learning_rate': 6.415852141878553e-06, 'epoch': 0.43} 43%|████▎ | 14615/34278 [16:07:04<21:06:09, 3.86s/it] 43%|████▎ | 14616/34278 [16:07:07<19:30:11, 3.57s/it] {'loss': 0.1425, 'grad_norm': 0.8640111274829947, 'learning_rate': 6.415399038318903e-06, 'epoch': 0.43} 43%|████▎ | 14616/34278 [16:07:07<19:30:11, 3.57s/it] 43%|████▎ | 14617/34278 [16:07:11<19:14:28, 3.52s/it] {'loss': 0.1532, 'grad_norm': 0.9216988519330479, 'learning_rate': 6.414945922122908e-06, 'epoch': 0.43} 43%|████▎ | 14617/34278 [16:07:11<19:14:28, 3.52s/it] 43%|████▎ | 14618/34278 [16:07:14<19:03:00, 3.49s/it] {'loss': 0.1167, 'grad_norm': 0.791367072581288, 'learning_rate': 6.414492793294615e-06, 'epoch': 0.43} 43%|████▎ | 14618/34278 [16:07:14<19:03:00, 3.49s/it] 43%|████▎ | 14619/34278 [16:07:17<18:00:24, 3.30s/it] {'loss': 0.1591, 'grad_norm': 0.879186370276052, 'learning_rate': 6.414039651838066e-06, 'epoch': 0.43} 43%|████▎ | 14619/34278 [16:07:17<18:00:24, 3.30s/it] 43%|████▎ | 14620/34278 [16:07:22<21:33:10, 3.95s/it] {'loss': 0.1443, 'grad_norm': 0.976453014064819, 'learning_rate': 6.41358649775731e-06, 
'epoch': 0.43} 43%|████▎ | 14620/34278 [16:07:22<21:33:10, 3.95s/it] 43%|████▎ | 14621/34278 [16:07:27<23:04:23, 4.23s/it] {'loss': 0.1367, 'grad_norm': 0.7046471432503829, 'learning_rate': 6.413133331056391e-06, 'epoch': 0.43} 43%|████▎ | 14621/34278 [16:07:27<23:04:23, 4.23s/it] 43%|████▎ | 14622/34278 [16:07:31<22:31:02, 4.12s/it] {'loss': 0.1399, 'grad_norm': 1.0266429249007463, 'learning_rate': 6.412680151739354e-06, 'epoch': 0.43} 43%|████▎ | 14622/34278 [16:07:31<22:31:02, 4.12s/it] 43%|████▎ | 14623/34278 [16:07:35<21:30:12, 3.94s/it] {'loss': 0.1357, 'grad_norm': 0.7657002313823391, 'learning_rate': 6.412226959810246e-06, 'epoch': 0.43} 43%|████▎ | 14623/34278 [16:07:35<21:30:12, 3.94s/it] 43%|████▎ | 14624/34278 [16:07:38<20:03:08, 3.67s/it] {'loss': 0.126, 'grad_norm': 0.9034638933677264, 'learning_rate': 6.411773755273114e-06, 'epoch': 0.43} 43%|████▎ | 14624/34278 [16:07:38<20:03:08, 3.67s/it] 43%|████▎ | 14625/34278 [16:07:41<19:12:30, 3.52s/it] {'loss': 0.1174, 'grad_norm': 0.7096191941575555, 'learning_rate': 6.411320538132002e-06, 'epoch': 0.43} 43%|████▎ | 14625/34278 [16:07:41<19:12:30, 3.52s/it] 43%|████▎ | 14626/34278 [16:07:47<23:22:05, 4.28s/it] {'loss': 0.1377, 'grad_norm': 0.6205600714182188, 'learning_rate': 6.410867308390958e-06, 'epoch': 0.43} 43%|████▎ | 14626/34278 [16:07:47<23:22:05, 4.28s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
43%|████▎ | 14627/34278 [16:07:50<21:57:10, 4.02s/it] {'loss': 0.1359, 'grad_norm': 0.830338251694489, 'learning_rate': 6.410414066054026e-06, 'epoch': 0.43} 43%|████▎ | 14627/34278 [16:07:50<21:57:10, 4.02s/it] 43%|████▎ | 14628/34278 [16:07:54<20:44:56, 3.80s/it] {'loss': 0.1316, 'grad_norm': 0.8140714428121494, 'learning_rate': 6.409960811125256e-06, 'epoch': 0.43} 43%|████▎ | 14628/34278 [16:07:54<20:44:56, 3.80s/it] 43%|████▎ | 14629/34278 [16:07:57<20:05:11, 3.68s/it] {'loss': 0.1445, 'grad_norm': 1.0873122870007383, 'learning_rate': 6.40950754360869e-06, 'epoch': 0.43} 43%|████▎ | 14629/34278 [16:07:57<20:05:11, 3.68s/it] 43%|████▎ | 14630/34278 [16:08:00<19:28:27, 3.57s/it] {'loss': 0.132, 'grad_norm': 0.7581253703358832, 'learning_rate': 6.40905426350838e-06, 'epoch': 0.43} 43%|████▎ | 14630/34278 [16:08:00<19:28:27, 3.57s/it] 43%|████▎ | 14631/34278 [16:08:05<20:44:35, 3.80s/it] {'loss': 0.1509, 'grad_norm': 0.8040443130093091, 'learning_rate': 6.408600970828367e-06, 'epoch': 0.43} 43%|████▎ | 14631/34278 [16:08:05<20:44:35, 3.80s/it] 43%|████▎ | 14632/34278 [16:08:08<19:46:43, 3.62s/it] {'loss': 0.1342, 'grad_norm': 0.8831884676715346, 'learning_rate': 6.408147665572701e-06, 'epoch': 0.43} 43%|████▎ | 14632/34278 [16:08:08<19:46:43, 3.62s/it] 43%|████▎ | 14633/34278 [16:08:12<19:58:59, 3.66s/it] {'loss': 0.1393, 'grad_norm': 0.9123740565031518, 'learning_rate': 6.407694347745431e-06, 'epoch': 0.43} 43%|████▎ | 14633/34278 [16:08:12<19:58:59, 3.66s/it] 43%|████▎ | 14634/34278 [16:08:15<19:23:14, 3.55s/it] {'loss': 0.1462, 'grad_norm': 2.6416471712195997, 'learning_rate': 6.407241017350601e-06, 'epoch': 0.43} 43%|████▎ | 14634/34278 [16:08:15<19:23:14, 3.55s/it] 43%|████▎ | 14635/34278 [16:08:21<22:55:53, 4.20s/it] {'loss': 0.1611, 'grad_norm': 0.8061560738824025, 'learning_rate': 6.406787674392259e-06, 'epoch': 0.43} 43%|████▎ | 14635/34278 [16:08:21<22:55:53, 4.20s/it] 43%|████▎ | 14636/34278 [16:08:25<22:58:57,
4.21s/it] {'loss': 0.1451, 'grad_norm': 0.9363533775038286, 'learning_rate': 6.406334318874452e-06, 'epoch': 0.43} 43%|████▎ | 14636/34278 [16:08:25<22:58:57, 4.21s/it] 43%|████▎ | 14637/34278 [16:08:28<21:07:14, 3.87s/it] {'loss': 0.1448, 'grad_norm': 0.867880207892632, 'learning_rate': 6.4058809508012285e-06, 'epoch': 0.43} 43%|████▎ | 14637/34278 [16:08:28<21:07:14, 3.87s/it] 43%|████▎ | 14638/34278 [16:08:31<20:00:34, 3.67s/it] {'loss': 0.1204, 'grad_norm': 0.8735382344438637, 'learning_rate': 6.405427570176635e-06, 'epoch': 0.43} 43%|████▎ | 14638/34278 [16:08:31<20:00:34, 3.67s/it] 43%|████▎ | 14639/34278 [16:08:34<18:44:31, 3.44s/it] {'loss': 0.145, 'grad_norm': 0.9445295126908109, 'learning_rate': 6.40497417700472e-06, 'epoch': 0.43} 43%|████▎ | 14639/34278 [16:08:34<18:44:31, 3.44s/it] 43%|████▎ | 14640/34278 [16:08:37<18:25:31, 3.38s/it] {'loss': 0.1316, 'grad_norm': 0.8897433233589293, 'learning_rate': 6.404520771289531e-06, 'epoch': 0.43} 43%|████▎ | 14640/34278 [16:08:37<18:25:31, 3.38s/it] 43%|████▎ | 14641/34278 [16:08:41<18:53:42, 3.46s/it] {'loss': 0.1382, 'grad_norm': 1.2586564701146228, 'learning_rate': 6.404067353035115e-06, 'epoch': 0.43} 43%|████▎ | 14641/34278 [16:08:41<18:53:42, 3.46s/it] 43%|████▎ | 14642/34278 [16:08:44<18:54:10, 3.47s/it] {'loss': 0.1609, 'grad_norm': 0.9972580081860094, 'learning_rate': 6.403613922245522e-06, 'epoch': 0.43} 43%|████▎ | 14642/34278 [16:08:44<18:54:10, 3.47s/it] 43%|████▎ | 14643/34278 [16:08:50<22:55:45, 4.20s/it] {'loss': 0.1524, 'grad_norm': 0.8452344557746346, 'learning_rate': 6.403160478924799e-06, 'epoch': 0.43} 43%|████▎ | 14643/34278 [16:08:50<22:55:45, 4.20s/it] 43%|████▎ | 14644/34278 [16:08:54<21:33:51, 3.95s/it] {'loss': 0.1462, 'grad_norm': 1.3583397959402321, 'learning_rate': 6.402707023076993e-06, 'epoch': 0.43} 43%|████▎ | 14644/34278 [16:08:54<21:33:51, 3.95s/it] 43%|████▎ | 14645/34278 [16:08:57<20:00:55, 3.67s/it] {'loss': 0.1483, 'grad_norm': 1.063121484670113, 'learning_rate': 
6.402253554706155e-06, 'epoch': 0.43} 43%|████▎ | 14645/34278 [16:08:57<20:00:55, 3.67s/it] 43%|████▎ | 14646/34278 [16:09:00<19:18:21, 3.54s/it] {'loss': 0.1359, 'grad_norm': 0.8858529375043536, 'learning_rate': 6.401800073816331e-06, 'epoch': 0.43} 43%|████▎ | 14646/34278 [16:09:00<19:18:21, 3.54s/it] 43%|████▎ | 14647/34278 [16:09:04<20:09:51, 3.70s/it] {'loss': 0.1426, 'grad_norm': 1.106632331475958, 'learning_rate': 6.401346580411571e-06, 'epoch': 0.43} 43%|████▎ | 14647/34278 [16:09:04<20:09:51, 3.70s/it] 43%|████▎ | 14648/34278 [16:09:07<19:12:18, 3.52s/it] {'loss': 0.1442, 'grad_norm': 1.1434067015461586, 'learning_rate': 6.400893074495923e-06, 'epoch': 0.43} 43%|████▎ | 14648/34278 [16:09:07<19:12:18, 3.52s/it] 43%|████▎ | 14649/34278 [16:09:10<18:15:03, 3.35s/it] {'loss': 0.1307, 'grad_norm': 0.961129580601906, 'learning_rate': 6.4004395560734366e-06, 'epoch': 0.43} 43%|████▎ | 14649/34278 [16:09:10<18:15:03, 3.35s/it] 43%|████▎ | 14650/34278 [16:09:13<17:58:48, 3.30s/it] {'loss': 0.1172, 'grad_norm': 1.217218499848346, 'learning_rate': 6.39998602514816e-06, 'epoch': 0.43} 43%|████▎ | 14650/34278 [16:09:13<17:58:48, 3.30s/it] 43%|████▎ | 14651/34278 [16:09:16<17:16:51, 3.17s/it] {'loss': 0.1245, 'grad_norm': 0.9946967120904634, 'learning_rate': 6.399532481724142e-06, 'epoch': 0.43} 43%|████▎ | 14651/34278 [16:09:16<17:16:51, 3.17s/it] 43%|████▎ | 14652/34278 [16:09:19<17:30:56, 3.21s/it] {'loss': 0.123, 'grad_norm': 0.7041848791125103, 'learning_rate': 6.399078925805432e-06, 'epoch': 0.43} 43%|████▎ | 14652/34278 [16:09:19<17:30:56, 3.21s/it] 43%|████▎ | 14653/34278 [16:09:22<17:14:53, 3.16s/it] {'loss': 0.1329, 'grad_norm': 1.0551097008424255, 'learning_rate': 6.398625357396079e-06, 'epoch': 0.43} 43%|████▎ | 14653/34278 [16:09:22<17:14:53, 3.16s/it] 43%|████▎ | 14654/34278 [16:09:26<18:38:01, 3.42s/it] {'loss': 0.1304, 'grad_norm': 1.2890936070936188, 'learning_rate': 6.398171776500132e-06, 'epoch': 0.43} 43%|████▎ | 14654/34278 [16:09:26<18:38:01, 
3.42s/it] 43%|████▎ | 14655/34278 [16:09:30<19:05:03, 3.50s/it] {'loss': 0.1334, 'grad_norm': 0.7244104729640806, 'learning_rate': 6.397718183121644e-06, 'epoch': 0.43} 43%|████▎ | 14655/34278 [16:09:30<19:05:03, 3.50s/it] 43%|████▎ | 14656/34278 [16:09:35<20:47:29, 3.81s/it] {'loss': 0.1486, 'grad_norm': 0.7180189347348244, 'learning_rate': 6.397264577264659e-06, 'epoch': 0.43} 43%|████▎ | 14656/34278 [16:09:35<20:47:29, 3.81s/it] 43%|████▎ | 14657/34278 [16:09:38<19:50:43, 3.64s/it] {'loss': 0.1218, 'grad_norm': 0.7370659794570515, 'learning_rate': 6.396810958933231e-06, 'epoch': 0.43} 43%|████▎ | 14657/34278 [16:09:38<19:50:43, 3.64s/it] 43%|████▎ | 14658/34278 [16:09:41<18:49:58, 3.46s/it] {'loss': 0.1288, 'grad_norm': 1.2249162954026172, 'learning_rate': 6.396357328131408e-06, 'epoch': 0.43} 43%|████▎ | 14658/34278 [16:09:41<18:49:58, 3.46s/it] 43%|████▎ | 14659/34278 [16:09:44<18:10:59, 3.34s/it] {'loss': 0.1374, 'grad_norm': 0.7801917318751426, 'learning_rate': 6.3959036848632395e-06, 'epoch': 0.43} 43%|████▎ | 14659/34278 [16:09:44<18:10:59, 3.34s/it] 43%|████▎ | 14660/34278 [16:09:47<17:58:35, 3.30s/it] {'loss': 0.1112, 'grad_norm': 0.6327083136322517, 'learning_rate': 6.395450029132777e-06, 'epoch': 0.43} 43%|████▎ | 14660/34278 [16:09:47<17:58:35, 3.30s/it] 43%|████▎ | 14661/34278 [16:09:53<22:10:03, 4.07s/it] {'loss': 0.1371, 'grad_norm': 0.9239410078834523, 'learning_rate': 6.39499636094407e-06, 'epoch': 0.43} 43%|████▎ | 14661/34278 [16:09:53<22:10:03, 4.07s/it] 43%|████▎ | 14662/34278 [16:09:56<20:50:52, 3.83s/it] {'loss': 0.1303, 'grad_norm': 0.7501179303141055, 'learning_rate': 6.394542680301165e-06, 'epoch': 0.43} 43%|████▎ | 14662/34278 [16:09:56<20:50:52, 3.83s/it] 43%|████▎ | 14663/34278 [16:10:01<22:35:48, 4.15s/it] {'loss': 0.1485, 'grad_norm': 0.7814820705795176, 'learning_rate': 6.3940889872081205e-06, 'epoch': 0.43} 43%|████▎ | 14663/34278 [16:10:01<22:35:48, 4.15s/it] 43%|████▎ | 14664/34278 [16:10:04<20:36:16, 3.78s/it] {'loss': 0.1244, 
'grad_norm': 0.7109996794925011, 'learning_rate': 6.39363528166898e-06, 'epoch': 0.43} 43%|████▎ | 14664/34278 [16:10:04<20:36:16, 3.78s/it] 43%|████▎ | 14665/34278 [16:10:07<19:11:48, 3.52s/it] {'loss': 0.1306, 'grad_norm': 0.907689957657468, 'learning_rate': 6.393181563687798e-06, 'epoch': 0.43} 43%|████▎ | 14665/34278 [16:10:07<19:11:48, 3.52s/it] 43%|████▎ | 14666/34278 [16:10:10<18:08:05, 3.33s/it] {'loss': 0.1283, 'grad_norm': 0.7765637906615048, 'learning_rate': 6.3927278332686215e-06, 'epoch': 0.43} 43%|████▎ | 14666/34278 [16:10:10<18:08:05, 3.33s/it] 43%|████▎ | 14667/34278 [16:10:13<17:55:15, 3.29s/it] {'loss': 0.1428, 'grad_norm': 0.8711063592658128, 'learning_rate': 6.392274090415505e-06, 'epoch': 0.43} 43%|████▎ | 14667/34278 [16:10:13<17:55:15, 3.29s/it] 43%|████▎ | 14668/34278 [16:10:17<18:43:50, 3.44s/it] {'loss': 0.1452, 'grad_norm': 0.9415437304505929, 'learning_rate': 6.391820335132497e-06, 'epoch': 0.43} 43%|████▎ | 14668/34278 [16:10:17<18:43:50, 3.44s/it] 43%|████▎ | 14669/34278 [16:10:20<18:55:45, 3.48s/it] {'loss': 0.1427, 'grad_norm': 0.6453696198741923, 'learning_rate': 6.391366567423649e-06, 'epoch': 0.43} 43%|████▎ | 14669/34278 [16:10:20<18:55:45, 3.48s/it] 43%|████▎ | 14670/34278 [16:10:24<18:16:14, 3.35s/it] {'loss': 0.1602, 'grad_norm': 1.0611456114369375, 'learning_rate': 6.390912787293012e-06, 'epoch': 0.43} 43%|████▎ | 14670/34278 [16:10:24<18:16:14, 3.35s/it] 43%|████▎ | 14671/34278 [16:10:26<17:10:57, 3.15s/it] {'loss': 0.131, 'grad_norm': 0.8196304221773512, 'learning_rate': 6.390458994744638e-06, 'epoch': 0.43} 43%|████▎ | 14671/34278 [16:10:26<17:10:57, 3.15s/it] 43%|████▎ | 14672/34278 [16:10:30<17:31:46, 3.22s/it] {'loss': 0.1619, 'grad_norm': 0.9389071162102353, 'learning_rate': 6.390005189782579e-06, 'epoch': 0.43} 43%|████▎ | 14672/34278 [16:10:30<17:31:46, 3.22s/it] 43%|████▎ | 14673/34278 [16:10:33<17:13:24, 3.16s/it] {'loss': 0.1505, 'grad_norm': 0.920338543674002, 'learning_rate': 6.389551372410886e-06, 'epoch': 
0.43} 43%|████▎ | 14673/34278 [16:10:33<17:13:24, 3.16s/it] 43%|████▎ | 14674/34278 [16:10:36<17:48:08, 3.27s/it] {'loss': 0.1366, 'grad_norm': 0.8812668029713562, 'learning_rate': 6.389097542633608e-06, 'epoch': 0.43} 43%|████▎ | 14674/34278 [16:10:36<17:48:08, 3.27s/it] 43%|████▎ | 14675/34278 [16:10:39<17:06:35, 3.14s/it] {'loss': 0.1411, 'grad_norm': 0.8265439211426261, 'learning_rate': 6.388643700454801e-06, 'epoch': 0.43} 43%|████▎ | 14675/34278 [16:10:39<17:06:35, 3.14s/it] 43%|████▎ | 14676/34278 [16:10:42<17:17:04, 3.17s/it] {'loss': 0.1625, 'grad_norm': 1.0629184186352492, 'learning_rate': 6.388189845878513e-06, 'epoch': 0.43} 43%|████▎ | 14676/34278 [16:10:42<17:17:04, 3.17s/it] 43%|████▎ | 14677/34278 [16:10:45<17:07:22, 3.14s/it] {'loss': 0.1404, 'grad_norm': 0.9247386992323844, 'learning_rate': 6.387735978908797e-06, 'epoch': 0.43} 43%|████▎ | 14677/34278 [16:10:45<17:07:22, 3.14s/it] 43%|████▎ | 14678/34278 [16:10:48<17:05:36, 3.14s/it] {'loss': 0.1414, 'grad_norm': 0.9480840655829462, 'learning_rate': 6.387282099549707e-06, 'epoch': 0.43} 43%|████▎ | 14678/34278 [16:10:48<17:05:36, 3.14s/it] 43%|████▎ | 14679/34278 [16:10:54<20:50:03, 3.83s/it] {'loss': 0.1297, 'grad_norm': 0.6551231763411762, 'learning_rate': 6.386828207805292e-06, 'epoch': 0.43} 43%|████▎ | 14679/34278 [16:10:54<20:50:03, 3.83s/it] 43%|████▎ | 14680/34278 [16:10:57<19:57:05, 3.66s/it] {'loss': 0.1209, 'grad_norm': 0.7151633617029399, 'learning_rate': 6.386374303679607e-06, 'epoch': 0.43} 43%|████▎ | 14680/34278 [16:10:57<19:57:05, 3.66s/it] 43%|████▎ | 14681/34278 [16:11:00<18:35:07, 3.41s/it] {'loss': 0.1431, 'grad_norm': 1.0121783239850635, 'learning_rate': 6.385920387176703e-06, 'epoch': 0.43} 43%|████▎ | 14681/34278 [16:11:00<18:35:07, 3.41s/it] 43%|████▎ | 14682/34278 [16:11:03<17:36:11, 3.23s/it] {'loss': 0.1454, 'grad_norm': 1.028630375965957, 'learning_rate': 6.385466458300632e-06, 'epoch': 0.43} 43%|████▎ | 14682/34278 [16:11:03<17:36:11, 3.23s/it] 43%|████▎ | 14683/34278 
[16:11:06<17:36:28, 3.23s/it] {'loss': 0.1477, 'grad_norm': 0.9145442814616266, 'learning_rate': 6.385012517055448e-06, 'epoch': 0.43} 43%|████▎ | 14683/34278 [16:11:06<17:36:28, 3.23s/it] 43%|████▎ | 14684/34278 [16:11:09<17:17:48, 3.18s/it] {'loss': 0.17, 'grad_norm': 0.8999939217874624, 'learning_rate': 6.384558563445203e-06, 'epoch': 0.43} 43%|████▎ | 14684/34278 [16:11:09<17:17:48, 3.18s/it] 43%|████▎ | 14685/34278 [16:11:13<18:24:35, 3.38s/it] {'loss': 0.1426, 'grad_norm': 0.8934297591872845, 'learning_rate': 6.384104597473948e-06, 'epoch': 0.43} 43%|████▎ | 14685/34278 [16:11:13<18:24:35, 3.38s/it] 43%|████▎ | 14686/34278 [16:11:16<18:02:21, 3.31s/it] {'loss': 0.1505, 'grad_norm': 0.9772242512820827, 'learning_rate': 6.383650619145738e-06, 'epoch': 0.43} 43%|████▎ | 14686/34278 [16:11:16<18:02:21, 3.31s/it] 43%|████▎ | 14687/34278 [16:11:22<22:57:33, 4.22s/it] {'loss': 0.1528, 'grad_norm': 0.7706815869965222, 'learning_rate': 6.383196628464627e-06, 'epoch': 0.43} 43%|████▎ | 14687/34278 [16:11:22<22:57:33, 4.22s/it] 43%|████▎ | 14688/34278 [16:11:25<20:56:55, 3.85s/it] {'loss': 0.1521, 'grad_norm': 0.8005157520564964, 'learning_rate': 6.382742625434667e-06, 'epoch': 0.43} 43%|████▎ | 14688/34278 [16:11:25<20:56:55, 3.85s/it] 43%|████▎ | 14689/34278 [16:11:29<20:05:32, 3.69s/it] {'loss': 0.1391, 'grad_norm': 1.0033328094096647, 'learning_rate': 6.382288610059908e-06, 'epoch': 0.43} 43%|████▎ | 14689/34278 [16:11:29<20:05:32, 3.69s/it] 43%|████▎ | 14690/34278 [16:11:33<21:22:48, 3.93s/it] {'loss': 0.1515, 'grad_norm': 0.8924927853595284, 'learning_rate': 6.3818345823444094e-06, 'epoch': 0.43} 43%|████▎ | 14690/34278 [16:11:33<21:22:48, 3.93s/it] 43%|████▎ | 14691/34278 [16:11:36<20:06:21, 3.70s/it] {'loss': 0.1419, 'grad_norm': 0.8280280209418701, 'learning_rate': 6.38138054229222e-06, 'epoch': 0.43} 43%|████▎ | 14691/34278 [16:11:36<20:06:21, 3.70s/it] 43%|████▎ | 14692/34278 [16:11:40<19:46:40, 3.64s/it] {'loss': 0.1614, 'grad_norm': 0.8792603523785534, 
'learning_rate': 6.380926489907394e-06, 'epoch': 0.43} 43%|████▎ | 14692/34278 [16:11:40<19:46:40, 3.64s/it] 43%|████▎ | 14693/34278 [16:11:43<18:53:15, 3.47s/it] {'loss': 0.1259, 'grad_norm': 0.9324739732655796, 'learning_rate': 6.380472425193989e-06, 'epoch': 0.43} 43%|████▎ | 14693/34278 [16:11:43<18:53:15, 3.47s/it] 43%|████▎ | 14694/34278 [16:11:47<19:33:29, 3.60s/it] {'loss': 0.1331, 'grad_norm': 0.7579055524898174, 'learning_rate': 6.380018348156054e-06, 'epoch': 0.43} 43%|████▎ | 14694/34278 [16:11:47<19:33:29, 3.60s/it] 43%|████▎ | 14695/34278 [16:11:53<23:31:39, 4.33s/it] {'loss': 0.1603, 'grad_norm': 0.7546100364855088, 'learning_rate': 6.379564258797644e-06, 'epoch': 0.43} 43%|████▎ | 14695/34278 [16:11:53<23:31:39, 4.33s/it] 43%|████▎ | 14696/34278 [16:11:56<21:05:17, 3.88s/it] {'loss': 0.1404, 'grad_norm': 0.9605596683417, 'learning_rate': 6.379110157122815e-06, 'epoch': 0.43} 43%|████▎ | 14696/34278 [16:11:56<21:05:17, 3.88s/it] 43%|████▎ | 14697/34278 [16:12:00<21:39:53, 3.98s/it] {'loss': 0.1459, 'grad_norm': 0.9019800145363126, 'learning_rate': 6.378656043135618e-06, 'epoch': 0.43} 43%|████▎ | 14697/34278 [16:12:00<21:39:53, 3.98s/it] 43%|████▎ | 14698/34278 [16:12:03<20:12:07, 3.71s/it] {'loss': 0.1395, 'grad_norm': 1.2597422167241183, 'learning_rate': 6.37820191684011e-06, 'epoch': 0.43} 43%|████▎ | 14698/34278 [16:12:03<20:12:07, 3.71s/it] 43%|████▎ | 14699/34278 [16:12:07<21:10:10, 3.89s/it] {'loss': 0.1407, 'grad_norm': 0.7761258828569346, 'learning_rate': 6.377747778240344e-06, 'epoch': 0.43} 43%|████▎ | 14699/34278 [16:12:07<21:10:10, 3.89s/it] 43%|████▎ | 14700/34278 [16:12:11<21:04:39, 3.88s/it] {'loss': 0.1284, 'grad_norm': 0.6569514069900847, 'learning_rate': 6.377293627340374e-06, 'epoch': 0.43} 43%|████▎ | 14700/34278 [16:12:11<21:04:39, 3.88s/it] 43%|████▎ | 14701/34278 [16:12:16<22:49:08, 4.20s/it] {'loss': 0.1438, 'grad_norm': 0.7491732766116863, 'learning_rate': 6.376839464144257e-06, 'epoch': 0.43} 43%|████▎ | 14701/34278 
[16:12:16<22:49:08, 4.20s/it] 43%|████▎ | 14702/34278 [16:12:19<20:45:24, 3.82s/it] {'loss': 0.1654, 'grad_norm': 0.9238875908563786, 'learning_rate': 6.376385288656044e-06, 'epoch': 0.43} 43%|████▎ | 14702/34278 [16:12:19<20:45:24, 3.82s/it] 43%|████▎ | 14703/34278 [16:12:22<19:33:28, 3.60s/it] {'loss': 0.1444, 'grad_norm': 0.7969852178845906, 'learning_rate': 6.3759311008797945e-06, 'epoch': 0.43} 43%|████▎ | 14703/34278 [16:12:22<19:33:28, 3.60s/it] 43%|████▎ | 14704/34278 [16:12:25<18:37:48, 3.43s/it] {'loss': 0.1431, 'grad_norm': 0.7045271765828216, 'learning_rate': 6.3754769008195576e-06, 'epoch': 0.43} 43%|████▎ | 14704/34278 [16:12:25<18:37:48, 3.43s/it] 43%|████▎ | 14705/34278 [16:12:28<18:00:31, 3.31s/it] {'loss': 0.151, 'grad_norm': 1.0154308113320514, 'learning_rate': 6.375022688479393e-06, 'epoch': 0.43} 43%|████▎ | 14705/34278 [16:12:28<18:00:31, 3.31s/it] 43%|████▎ | 14706/34278 [16:12:32<18:54:58, 3.48s/it] {'loss': 0.1422, 'grad_norm': 0.9667412064910351, 'learning_rate': 6.374568463863353e-06, 'epoch': 0.43} 43%|████▎ | 14706/34278 [16:12:32<18:54:58, 3.48s/it] 43%|████▎ | 14707/34278 [16:12:35<18:12:52, 3.35s/it] {'loss': 0.1273, 'grad_norm': 0.7824217836663594, 'learning_rate': 6.374114226975494e-06, 'epoch': 0.43} 43%|████▎ | 14707/34278 [16:12:35<18:12:52, 3.35s/it] 43%|████▎ | 14708/34278 [16:12:38<17:50:05, 3.28s/it] {'loss': 0.1203, 'grad_norm': 1.1880867361015324, 'learning_rate': 6.3736599778198725e-06, 'epoch': 0.43} 43%|████▎ | 14708/34278 [16:12:38<17:50:05, 3.28s/it] 43%|████▎ | 14709/34278 [16:12:42<18:35:28, 3.42s/it] {'loss': 0.1455, 'grad_norm': 1.0002914702130004, 'learning_rate': 6.373205716400543e-06, 'epoch': 0.43} 43%|████▎ | 14709/34278 [16:12:42<18:35:28, 3.42s/it] 43%|████▎ | 14710/34278 [16:12:45<17:18:50, 3.19s/it] {'loss': 0.1555, 'grad_norm': 1.0439607342833814, 'learning_rate': 6.372751442721559e-06, 'epoch': 0.43} 43%|████▎ | 14710/34278 [16:12:45<17:18:50, 3.19s/it] 43%|████▎ | 14711/34278 [16:12:48<17:20:23, 
3.19s/it] {'loss': 0.1154, 'grad_norm': 1.0342263842925121, 'learning_rate': 6.372297156786978e-06, 'epoch': 0.43} 43%|████▎ | 14711/34278 [16:12:48<17:20:23, 3.19s/it] 43%|████▎ | 14712/34278 [16:12:51<17:42:24, 3.26s/it] {'loss': 0.1457, 'grad_norm': 0.9284556281035707, 'learning_rate': 6.371842858600856e-06, 'epoch': 0.43} 43%|████▎ | 14712/34278 [16:12:51<17:42:24, 3.26s/it] 43%|████▎ | 14713/34278 [16:12:55<17:49:50, 3.28s/it] {'loss': 0.135, 'grad_norm': 0.9488620328940125, 'learning_rate': 6.3713885481672476e-06, 'epoch': 0.43} 43%|████▎ | 14713/34278 [16:12:55<17:49:50, 3.28s/it] 43%|████▎ | 14714/34278 [16:12:58<17:19:49, 3.19s/it] {'loss': 0.1406, 'grad_norm': 0.9318273083737971, 'learning_rate': 6.37093422549021e-06, 'epoch': 0.43} 43%|████▎ | 14714/34278 [16:12:58<17:19:49, 3.19s/it] 43%|████▎ | 14715/34278 [16:13:02<19:43:26, 3.63s/it] {'loss': 0.1315, 'grad_norm': 0.6390769448191116, 'learning_rate': 6.3704798905737995e-06, 'epoch': 0.43} 43%|████▎ | 14715/34278 [16:13:02<19:43:26, 3.63s/it] 43%|████▎ | 14716/34278 [16:13:05<18:24:33, 3.39s/it] {'loss': 0.144, 'grad_norm': 0.8955550043126647, 'learning_rate': 6.3700255434220714e-06, 'epoch': 0.43} 43%|████▎ | 14716/34278 [16:13:05<18:24:33, 3.39s/it] 43%|████▎ | 14717/34278 [16:13:09<18:32:53, 3.41s/it] {'loss': 0.1457, 'grad_norm': 0.8678171250181997, 'learning_rate': 6.3695711840390826e-06, 'epoch': 0.43} 43%|████▎ | 14717/34278 [16:13:09<18:32:53, 3.41s/it] 43%|████▎ | 14718/34278 [16:13:12<18:22:13, 3.38s/it] {'loss': 0.1627, 'grad_norm': 0.797499210866375, 'learning_rate': 6.36911681242889e-06, 'epoch': 0.43} 43%|████▎ | 14718/34278 [16:13:12<18:22:13, 3.38s/it] 43%|████▎ | 14719/34278 [16:13:15<18:00:50, 3.32s/it] {'loss': 0.1451, 'grad_norm': 0.9453525502046212, 'learning_rate': 6.368662428595548e-06, 'epoch': 0.43} 43%|████▎ | 14719/34278 [16:13:15<18:00:50, 3.32s/it] 43%|████▎ | 14720/34278 [16:13:20<20:12:26, 3.72s/it] {'loss': 0.1609, 'grad_norm': 0.6540759317187861, 'learning_rate': 
6.368208032543115e-06, 'epoch': 0.43} 43%|████▎ | 14720/34278 [16:13:20<20:12:26, 3.72s/it] 43%|████▎ | 14721/34278 [16:13:23<20:10:58, 3.72s/it] {'loss': 0.1464, 'grad_norm': 0.8733203342358636, 'learning_rate': 6.367753624275648e-06, 'epoch': 0.43} 43%|████▎ | 14721/34278 [16:13:23<20:10:58, 3.72s/it] 43%|████▎ | 14722/34278 [16:13:29<23:05:59, 4.25s/it] {'loss': 0.1104, 'grad_norm': 0.8924426659836293, 'learning_rate': 6.367299203797202e-06, 'epoch': 0.43} 43%|████▎ | 14722/34278 [16:13:29<23:05:59, 4.25s/it] 43%|████▎ | 14723/34278 [16:13:33<23:12:22, 4.27s/it] {'loss': 0.1509, 'grad_norm': 0.7593034303139613, 'learning_rate': 6.366844771111835e-06, 'epoch': 0.43} 43%|████▎ | 14723/34278 [16:13:33<23:12:22, 4.27s/it] 43%|████▎ | 14724/34278 [16:13:37<22:22:26, 4.12s/it] {'loss': 0.1425, 'grad_norm': 0.8137304725602087, 'learning_rate': 6.366390326223605e-06, 'epoch': 0.43} 43%|████▎ | 14724/34278 [16:13:37<22:22:26, 4.12s/it] 43%|████▎ | 14725/34278 [16:13:40<21:05:02, 3.88s/it] {'loss': 0.1279, 'grad_norm': 1.0327459850613934, 'learning_rate': 6.365935869136568e-06, 'epoch': 0.43} 43%|████▎ | 14725/34278 [16:13:40<21:05:02, 3.88s/it] 43%|████▎ | 14726/34278 [16:13:46<23:53:30, 4.40s/it] {'loss': 0.1454, 'grad_norm': 0.6340026185915157, 'learning_rate': 6.365481399854782e-06, 'epoch': 0.43} 43%|████▎ | 14726/34278 [16:13:46<23:53:30, 4.40s/it] 43%|████▎ | 14727/34278 [16:13:52<26:39:59, 4.91s/it] {'loss': 0.1585, 'grad_norm': 0.7648429592184687, 'learning_rate': 6.365026918382303e-06, 'epoch': 0.43} 43%|████▎ | 14727/34278 [16:13:52<26:39:59, 4.91s/it] 43%|████▎ | 14728/34278 [16:13:55<23:56:02, 4.41s/it] {'loss': 0.1483, 'grad_norm': 1.2101507521140271, 'learning_rate': 6.36457242472319e-06, 'epoch': 0.43} 43%|████▎ | 14728/34278 [16:13:55<23:56:02, 4.41s/it] 43%|████▎ | 14729/34278 [16:14:01<26:26:39, 4.87s/it] {'loss': 0.1363, 'grad_norm': 0.7692745981042659, 'learning_rate': 6.3641179188815e-06, 'epoch': 0.43} 43%|████▎ | 14729/34278 [16:14:01<26:26:39, 
4.87s/it] 43%|████▎ | 14730/34278 [16:14:06<26:06:34, 4.81s/it] {'loss': 0.1166, 'grad_norm': 0.6070084951386207, 'learning_rate': 6.363663400861291e-06, 'epoch': 0.43} 43%|████▎ | 14730/34278 [16:14:06<26:06:34, 4.81s/it] 43%|████▎ | 14731/34278 [16:14:09<23:17:30, 4.29s/it] {'loss': 0.1697, 'grad_norm': 0.9497120536580956, 'learning_rate': 6.363208870666621e-06, 'epoch': 0.43} 43%|████▎ | 14731/34278 [16:14:09<23:17:30, 4.29s/it] 43%|████▎ | 14732/34278 [16:14:12<21:03:11, 3.88s/it] {'loss': 0.133, 'grad_norm': 0.9847011205316754, 'learning_rate': 6.362754328301548e-06, 'epoch': 0.43} 43%|████▎ | 14732/34278 [16:14:12<21:03:11, 3.88s/it] 43%|████▎ | 14733/34278 [16:14:18<25:27:21, 4.69s/it] {'loss': 0.1421, 'grad_norm': 0.7350037064593803, 'learning_rate': 6.36229977377013e-06, 'epoch': 0.43} 43%|████▎ | 14733/34278 [16:14:18<25:27:21, 4.69s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 43%|████▎ | 14734/34278 [16:14:21<22:33:54, 4.16s/it] {'loss': 0.1665, 'grad_norm': 0.8404109498205895, 'learning_rate': 6.361845207076423e-06, 'epoch': 0.43} 43%|████▎ | 14734/34278 [16:14:21<22:33:54, 4.16s/it] 43%|████▎ | 14735/34278 [16:14:25<21:59:15, 4.05s/it] {'loss': 0.1252, 'grad_norm': 0.829169797032689, 'learning_rate': 6.361390628224488e-06, 'epoch': 0.43} 43%|████▎ | 14735/34278 [16:14:25<21:59:15, 4.05s/it] 43%|████▎ | 14736/34278 [16:14:28<20:41:16, 3.81s/it] {'loss': 0.141, 'grad_norm': 0.9767726647577462, 'learning_rate': 6.3609360372183834e-06, 'epoch': 0.43} 43%|████▎ | 14736/34278 [16:14:28<20:41:16, 3.81s/it] 43%|████▎ | 14737/34278 [16:14:31<19:12:27, 3.54s/it] {'loss': 0.1345, 'grad_norm': 1.0249799083053788, 'learning_rate': 6.360481434062164e-06, 'epoch': 0.43} 43%|████▎ | 14737/34278 [16:14:31<19:12:27, 3.54s/it] 43%|████▎ | 14738/34278 [16:14:34<18:18:29, 3.37s/it] {'loss': 0.1477, 'grad_norm': 1.0543286085668138, 'learning_rate': 6.360026818759894e-06, 'epoch': 0.43} 43%|████▎ | 14738/34278 [16:14:34<18:18:29, 3.37s/it] 43%|████▎ | 14739/34278 [16:14:37<18:02:40, 3.32s/it] {'loss': 0.1285, 'grad_norm': 0.8235070795515166, 'learning_rate': 6.359572191315629e-06, 'epoch': 0.43} 43%|████▎ | 14739/34278 [16:14:37<18:02:40, 3.32s/it] 43%|████▎ | 14740/34278 [16:14:41<17:38:13, 3.25s/it] {'loss': 0.1303, 'grad_norm': 0.7234196639753296, 'learning_rate': 6.359117551733427e-06, 'epoch': 0.43} 43%|████▎ | 14740/34278 [16:14:41<17:38:13, 3.25s/it] 43%|████▎ | 14741/34278 [16:14:44<17:45:58, 3.27s/it] {'loss': 0.1565, 'grad_norm': 0.7910670223734967, 'learning_rate': 6.358662900017348e-06, 'epoch': 0.43} 43%|████▎ | 14741/34278 [16:14:44<17:45:58, 3.27s/it] 43%|████▎ | 14742/34278 [16:14:48<19:23:37, 3.57s/it] {'loss': 0.1157, 'grad_norm': 1.1594702431343762, 'learning_rate': 6.358208236171451e-06, 'epoch': 0.43} 43%|████▎ | 14742/34278 [16:14:48<19:23:37, 3.57s/it] 43%|████▎ | 14743/34278 [16:14:51<19:02:46, 
3.51s/it] {'loss': 0.1321, 'grad_norm': 0.7541241378096104, 'learning_rate': 6.357753560199795e-06, 'epoch': 0.43} 43%|████▎ | 14743/34278 [16:14:52<19:02:46, 3.51s/it] 43%|████▎ | 14744/34278 [16:14:56<20:46:53, 3.83s/it] {'loss': 0.1454, 'grad_norm': 0.7988039846351058, 'learning_rate': 6.35729887210644e-06, 'epoch': 0.43} 43%|████▎ | 14744/34278 [16:14:56<20:46:53, 3.83s/it] 43%|████▎ | 14745/34278 [16:15:00<20:15:43, 3.73s/it] {'loss': 0.1423, 'grad_norm': 0.8143874563860272, 'learning_rate': 6.356844171895444e-06, 'epoch': 0.43} 43%|████▎ | 14745/34278 [16:15:00<20:15:43, 3.73s/it] 43%|████▎ | 14746/34278 [16:15:06<23:49:26, 4.39s/it] {'loss': 0.1291, 'grad_norm': 0.8809231822137538, 'learning_rate': 6.356389459570868e-06, 'epoch': 0.43} 43%|████▎ | 14746/34278 [16:15:06<23:49:26, 4.39s/it] 43%|████▎ | 14747/34278 [16:15:09<21:59:58, 4.06s/it] {'loss': 0.1326, 'grad_norm': 0.7402927413175663, 'learning_rate': 6.35593473513677e-06, 'epoch': 0.43} 43%|████▎ | 14747/34278 [16:15:09<21:59:58, 4.06s/it] 43%|████▎ | 14748/34278 [16:15:12<19:58:19, 3.68s/it] {'loss': 0.1348, 'grad_norm': 0.8641214382003821, 'learning_rate': 6.355479998597211e-06, 'epoch': 0.43} 43%|████▎ | 14748/34278 [16:15:12<19:58:19, 3.68s/it] 43%|████▎ | 14749/34278 [16:15:18<23:45:00, 4.38s/it] {'loss': 0.1471, 'grad_norm': 1.1837464571437748, 'learning_rate': 6.355025249956249e-06, 'epoch': 0.43} 43%|████▎ | 14749/34278 [16:15:18<23:45:00, 4.38s/it] 43%|████▎ | 14750/34278 [16:15:21<21:54:18, 4.04s/it] {'loss': 0.1292, 'grad_norm': 0.7054205118269635, 'learning_rate': 6.354570489217946e-06, 'epoch': 0.43} 43%|████▎ | 14750/34278 [16:15:21<21:54:18, 4.04s/it] 43%|████▎ | 14751/34278 [16:15:25<22:52:54, 4.22s/it] {'loss': 0.1482, 'grad_norm': 0.9815155821544417, 'learning_rate': 6.35411571638636e-06, 'epoch': 0.43} 43%|████▎ | 14751/34278 [16:15:26<22:52:54, 4.22s/it] 43%|████▎ | 14752/34278 [16:15:29<21:55:25, 4.04s/it] {'loss': 0.1495, 'grad_norm': 0.7517246662679853, 'learning_rate': 
6.353660931465553e-06, 'epoch': 0.43} 43%|████▎ | 14752/34278 [16:15:29<21:55:25, 4.04s/it] 43%|████▎ | 14753/34278 [16:15:32<19:54:02, 3.67s/it] {'loss': 0.1376, 'grad_norm': 0.8042249174324039, 'learning_rate': 6.353206134459585e-06, 'epoch': 0.43} 43%|████▎ | 14753/34278 [16:15:32<19:54:02, 3.67s/it] 43%|████▎ | 14754/34278 [16:15:36<20:10:01, 3.72s/it] {'loss': 0.1348, 'grad_norm': 0.7419311244258114, 'learning_rate': 6.352751325372515e-06, 'epoch': 0.43} 43%|████▎ | 14754/34278 [16:15:36<20:10:01, 3.72s/it] 43%|████▎ | 14755/34278 [16:15:39<19:53:37, 3.67s/it] {'loss': 0.1289, 'grad_norm': 0.7243243445421994, 'learning_rate': 6.352296504208404e-06, 'epoch': 0.43} 43%|████▎ | 14755/34278 [16:15:39<19:53:37, 3.67s/it] 43%|████▎ | 14756/34278 [16:15:43<20:43:56, 3.82s/it] {'loss': 0.1236, 'grad_norm': 0.7399725796332371, 'learning_rate': 6.351841670971313e-06, 'epoch': 0.43} 43%|████▎ | 14756/34278 [16:15:43<20:43:56, 3.82s/it] 43%|████▎ | 14757/34278 [16:15:46<19:22:15, 3.57s/it] {'loss': 0.1465, 'grad_norm': 0.7262352627221309, 'learning_rate': 6.3513868256653e-06, 'epoch': 0.43} 43%|████▎ | 14757/34278 [16:15:46<19:22:15, 3.57s/it] 43%|████▎ | 14758/34278 [16:15:50<19:18:13, 3.56s/it] {'loss': 0.1353, 'grad_norm': 0.982726738830854, 'learning_rate': 6.350931968294432e-06, 'epoch': 0.43} 43%|████▎ | 14758/34278 [16:15:50<19:18:13, 3.56s/it] 43%|████▎ | 14759/34278 [16:15:53<19:08:55, 3.53s/it] {'loss': 0.1286, 'grad_norm': 0.8203260531758041, 'learning_rate': 6.3504770988627625e-06, 'epoch': 0.43} 43%|████▎ | 14759/34278 [16:15:53<19:08:55, 3.53s/it] 43%|████▎ | 14760/34278 [16:15:57<18:43:28, 3.45s/it] {'loss': 0.1239, 'grad_norm': 0.8721819696136724, 'learning_rate': 6.350022217374358e-06, 'epoch': 0.43} 43%|████▎ | 14760/34278 [16:15:57<18:43:28, 3.45s/it] 43%|████▎ | 14761/34278 [16:16:00<18:08:56, 3.35s/it] {'loss': 0.125, 'grad_norm': 0.924114420605603, 'learning_rate': 6.349567323833277e-06, 'epoch': 0.43} 43%|████▎ | 14761/34278 [16:16:00<18:08:56, 
3.35s/it] 43%|████▎ | 14762/34278 [16:16:03<17:44:33, 3.27s/it] {'loss': 0.143, 'grad_norm': 0.9240299533530848, 'learning_rate': 6.349112418243579e-06, 'epoch': 0.43} 43%|████▎ | 14762/34278 [16:16:03<17:44:33, 3.27s/it] 43%|████▎ | 14763/34278 [16:16:09<22:09:40, 4.09s/it] {'loss': 0.133, 'grad_norm': 0.7282105079622296, 'learning_rate': 6.3486575006093295e-06, 'epoch': 0.43} 43%|████▎ | 14763/34278 [16:16:09<22:09:40, 4.09s/it] 43%|████▎ | 14764/34278 [16:16:14<23:40:28, 4.37s/it] {'loss': 0.1231, 'grad_norm': 0.8709789885973791, 'learning_rate': 6.348202570934588e-06, 'epoch': 0.43} 43%|████▎ | 14764/34278 [16:16:14<23:40:28, 4.37s/it] 43%|████▎ | 14765/34278 [16:16:18<23:39:29, 4.36s/it] {'loss': 0.1609, 'grad_norm': 1.1936752809308744, 'learning_rate': 6.347747629223415e-06, 'epoch': 0.43} 43%|████▎ | 14765/34278 [16:16:18<23:39:29, 4.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 43%|████▎ | 14766/34278 [16:16:22<22:07:35, 4.08s/it] {'loss': 0.1489, 'grad_norm': 0.9580558806630077, 'learning_rate': 6.347292675479872e-06, 'epoch': 0.43} 43%|████▎ | 14766/34278 [16:16:22<22:07:35, 4.08s/it] 43%|████▎ | 14767/34278 [16:16:27<24:28:40, 4.52s/it] {'loss': 0.1415, 'grad_norm': 0.8337521198576897, 'learning_rate': 6.346837709708023e-06, 'epoch': 0.43} 43%|████▎ | 14767/34278 [16:16:27<24:28:40, 4.52s/it] 43%|████▎ | 14768/34278 [16:16:31<23:08:02, 4.27s/it] {'loss': 0.1323, 'grad_norm': 0.9152428122636086, 'learning_rate': 6.34638273191193e-06, 'epoch': 0.43} 43%|████▎ | 14768/34278 [16:16:31<23:08:02, 4.27s/it] 43%|████▎ | 14769/34278 [16:16:35<22:24:31, 4.14s/it] {'loss': 0.1692, 'grad_norm': 1.162065846525339, 'learning_rate': 6.34592774209565e-06, 'epoch': 0.43} 43%|████▎ | 14769/34278 [16:16:35<22:24:31, 4.14s/it] 43%|████▎ | 14770/34278 [16:16:39<21:59:48, 4.06s/it] {'loss': 0.1455, 'grad_norm': 0.8715189540513543, 'learning_rate': 6.345472740263251e-06, 'epoch': 0.43} 43%|████▎ | 14770/34278 [16:16:39<21:59:48, 4.06s/it] 43%|████▎ | 14771/34278 [16:16:42<20:18:03, 3.75s/it] {'loss': 0.1382, 'grad_norm': 0.9425556467260457, 'learning_rate': 6.345017726418792e-06, 'epoch': 0.43} 43%|████▎ | 14771/34278 [16:16:42<20:18:03, 3.75s/it] 43%|████▎ | 14772/34278 [16:16:48<23:55:01, 4.41s/it] {'loss': 0.1311, 'grad_norm': 1.1258620194234759, 'learning_rate': 6.344562700566334e-06, 'epoch': 0.43} 43%|████▎ | 14772/34278 [16:16:48<23:55:01, 4.41s/it] 43%|████▎ | 14773/34278 [16:16:51<22:28:24, 4.15s/it] {'loss': 0.1332, 'grad_norm': 0.9633974929046847, 'learning_rate': 6.344107662709943e-06, 'epoch': 0.43} 43%|████▎ | 14773/34278 [16:16:51<22:28:24, 4.15s/it] 43%|████▎ | 14774/34278 [16:16:54<20:30:56, 3.79s/it] {'loss': 0.1058, 'grad_norm': 0.7794259895237825, 'learning_rate': 6.343652612853679e-06, 'epoch': 0.43} 43%|████▎ | 14774/34278 [16:16:54<20:30:56, 3.79s/it] 43%|████▎ | 14775/34278 [16:16:57<19:00:45, 
3.51s/it] {'loss': 0.1167, 'grad_norm': 1.0422691345915542, 'learning_rate': 6.343197551001605e-06, 'epoch': 0.43}
43%|████▎ | 14776/34278 [16:17:01<20:35:15, 3.80s/it] {'loss': 0.1456, 'grad_norm': 0.9600283141702454, 'learning_rate': 6.342742477157784e-06, 'epoch': 0.43}
43%|████▎ | 14777/34278 [16:17:05<20:02:48, 3.70s/it] {'loss': 0.1537, 'grad_norm': 0.8345360247156994, 'learning_rate': 6.3422873913262796e-06, 'epoch': 0.43}
43%|████▎ | 14778/34278 [16:17:08<18:46:12, 3.47s/it] {'loss': 0.1425, 'grad_norm': 0.9786213226755879, 'learning_rate': 6.341832293511152e-06, 'epoch': 0.43}
43%|████▎ | 14779/34278 [16:17:11<18:55:55, 3.50s/it] {'loss': 0.1461, 'grad_norm': 0.9059194181708891, 'learning_rate': 6.341377183716469e-06, 'epoch': 0.43}
43%|████▎ | 14780/34278 [16:17:18<23:34:18, 4.35s/it] {'loss': 0.1464, 'grad_norm': 0.6796709303229915, 'learning_rate': 6.340922061946288e-06, 'epoch': 0.43}
43%|████▎ | 14781/34278 [16:17:22<24:07:28, 4.45s/it] {'loss': 0.1427, 'grad_norm': 0.788282698161525, 'learning_rate': 6.3404669282046745e-06, 'epoch': 0.43}
43%|████▎ | 14782/34278 [16:17:27<24:15:22, 4.48s/it] {'loss': 0.1189, 'grad_norm': 0.9207172393032127, 'learning_rate': 6.340011782495694e-06, 'epoch': 0.43}
43%|████▎ | 14783/34278 [16:17:30<22:11:01, 4.10s/it] {'loss': 0.1365, 'grad_norm': 0.7501648475307601, 'learning_rate': 6.339556624823409e-06, 'epoch': 0.43}
43%|████▎ | 14784/34278 [16:17:33<19:58:07, 3.69s/it] {'loss': 0.1503, 'grad_norm': 0.7625215407292547, 'learning_rate': 6.339101455191881e-06, 'epoch': 0.43}
43%|████▎ | 14785/34278 [16:17:37<19:58:53, 3.69s/it] {'loss': 0.1318, 'grad_norm': 0.8133516307255715, 'learning_rate': 6.338646273605175e-06, 'epoch': 0.43}
43%|████▎ | 14786/34278 [16:17:40<19:40:26, 3.63s/it] {'loss': 0.1322, 'grad_norm': 0.6745848108498254, 'learning_rate': 6.338191080067354e-06, 'epoch': 0.43}
43%|████▎ | 14787/34278 [16:17:43<19:03:27, 3.52s/it] {'loss': 0.1542, 'grad_norm': 0.6268664280390904, 'learning_rate': 6.337735874582482e-06, 'epoch': 0.43}
43%|████▎ | 14788/34278 [16:17:46<17:59:55, 3.32s/it] {'loss': 0.1309, 'grad_norm': 0.7997227880190497, 'learning_rate': 6.337280657154625e-06, 'epoch': 0.43}
43%|████▎ | 14789/34278 [16:17:49<17:38:45, 3.26s/it] {'loss': 0.1286, 'grad_norm': 0.703119308180586, 'learning_rate': 6.336825427787845e-06, 'epoch': 0.43}
43%|████▎ | 14790/34278 [16:17:53<18:28:18, 3.41s/it] {'loss': 0.1321, 'grad_norm': 0.7206278096767809, 'learning_rate': 6.336370186486207e-06, 'epoch': 0.43}
43%|████▎ | 14791/34278 [16:17:56<17:37:57, 3.26s/it] {'loss': 0.1137, 'grad_norm': 0.7961418270278572, 'learning_rate': 6.335914933253775e-06, 'epoch': 0.43}
43%|████▎ | 14792/34278 [16:17:59<17:14:23, 3.19s/it] {'loss': 0.1316, 'grad_norm': 1.242814905356728, 'learning_rate': 6.335459668094612e-06, 'epoch': 0.43}
43%|████▎ | 14793/34278 [16:18:04<20:32:34, 3.80s/it] {'loss': 0.1186, 'grad_norm': 0.774466874132768, 'learning_rate': 6.335004391012786e-06, 'epoch': 0.43}
43%|████▎ | 14794/34278 [16:18:07<19:08:29, 3.54s/it] {'loss': 0.135, 'grad_norm': 2.4898668528798806, 'learning_rate': 6.334549102012357e-06, 'epoch': 0.43}
43%|████▎ | 14795/34278 [16:18:11<19:12:00, 3.55s/it] {'loss': 0.1519, 'grad_norm': 0.8596545447807837, 'learning_rate': 6.334093801097395e-06, 'epoch': 0.43}
43%|████▎ | 14796/34278 [16:18:16<21:50:17, 4.04s/it] {'loss': 0.1436, 'grad_norm': 0.844686921865426, 'learning_rate': 6.333638488271961e-06, 'epoch': 0.43}
43%|████▎ | 14797/34278 [16:18:22<25:10:39, 4.65s/it] {'loss': 0.1603, 'grad_norm': 0.9872144068219241, 'learning_rate': 6.33318316354012e-06, 'epoch': 0.43}
43%|████▎ | 14798/34278 [16:18:26<24:09:12, 4.46s/it] {'loss': 0.1258, 'grad_norm': 0.749878854173997, 'learning_rate': 6.332727826905939e-06, 'epoch': 0.43}
43%|████▎ | 14799/34278 [16:18:32<26:45:34, 4.95s/it] {'loss': 0.156, 'grad_norm': 0.7914791480108483, 'learning_rate': 6.33227247837348e-06, 'epoch': 0.43}
43%|████▎ | 14800/34278 [16:18:36<24:17:51, 4.49s/it] {'loss': 0.1167, 'grad_norm': 0.7703284842657782, 'learning_rate': 6.331817117946814e-06, 'epoch': 0.43}
43%|████▎ | 14801/34278 [16:18:40<23:39:14, 4.37s/it] {'loss': 0.1318, 'grad_norm': 0.7347550592703608, 'learning_rate': 6.33136174563e-06, 'epoch': 0.43}
43%|████▎ | 14802/34278 [16:18:43<22:20:38, 4.13s/it] {'loss': 0.1351, 'grad_norm': 0.8049458505427863, 'learning_rate': 6.330906361427106e-06, 'epoch': 0.43}
43%|████▎ | 14803/34278 [16:18:46<20:54:08, 3.86s/it] {'loss': 0.1359, 'grad_norm': 0.8281217131378312, 'learning_rate': 6.330450965342199e-06, 'epoch': 0.43}
43%|████▎ | 14804/34278 [16:18:51<21:40:08, 4.01s/it] {'loss': 0.1542, 'grad_norm': 0.8203016360463499, 'learning_rate': 6.329995557379344e-06, 'epoch': 0.43}
43%|████▎ | 14805/34278 [16:18:55<22:32:26, 4.17s/it] {'loss': 0.152, 'grad_norm': 0.748088965862053, 'learning_rate': 6.329540137542605e-06, 'epoch': 0.43}
43%|████▎ | 14806/34278 [16:18:58<20:45:39, 3.84s/it] {'loss': 0.1433, 'grad_norm': 0.8582098892345322, 'learning_rate': 6.329084705836049e-06, 'epoch': 0.43}
43%|████▎ | 14807/34278 [16:19:01<19:30:45, 3.61s/it] {'loss': 0.1301, 'grad_norm': 0.8470891127566365, 'learning_rate': 6.328629262263741e-06, 'epoch': 0.43}
43%|████▎ | 14808/34278 [16:19:05<18:56:24, 3.50s/it] {'loss': 0.1521, 'grad_norm': 0.7133027560523214, 'learning_rate': 6.328173806829751e-06, 'epoch': 0.43}
43%|████▎ | 14809/34278 [16:19:08<18:48:56, 3.48s/it] {'loss': 0.1244, 'grad_norm': 0.9062881346168233, 'learning_rate': 6.3277183395381405e-06, 'epoch': 0.43}
43%|████▎ | 14810/34278 [16:19:13<21:38:45, 4.00s/it] {'loss': 0.123, 'grad_norm': 0.7081085915181456, 'learning_rate': 6.3272628603929775e-06, 'epoch': 0.43}
43%|████▎ | 14811/34278 [16:19:16<20:12:07, 3.74s/it] {'loss': 0.1323, 'grad_norm': 0.7425128005166141, 'learning_rate': 6.3268073693983275e-06, 'epoch': 0.43}
43%|████▎ | 14812/34278 [16:19:23<23:54:38, 4.42s/it] {'loss': 0.1224, 'grad_norm': 0.8403564542402145, 'learning_rate': 6.3263518665582606e-06, 'epoch': 0.43}
43%|████▎ | 14813/34278 [16:19:26<21:52:13, 4.04s/it] {'loss': 0.1504, 'grad_norm': 0.7591884647532012, 'learning_rate': 6.32589635187684e-06, 'epoch': 0.43}
43%|████▎ | 14814/34278 [16:19:29<19:54:26, 3.68s/it] {'loss': 0.1317, 'grad_norm': 0.7700852059272205, 'learning_rate': 6.325440825358131e-06, 'epoch': 0.43}
43%|████▎ | 14815/34278 [16:19:32<18:49:03, 3.48s/it] {'loss': 0.1345, 'grad_norm': 0.79680827071898, 'learning_rate': 6.324985287006206e-06, 'epoch': 0.43}
43%|████▎ | 14816/34278 [16:19:35<18:41:49, 3.46s/it] {'loss': 0.125, 'grad_norm': 0.9241577630626693, 'learning_rate': 6.324529736825127e-06, 'epoch': 0.43}
43%|████▎ | 14817/34278 [16:19:38<18:43:35, 3.46s/it] {'loss': 0.1365, 'grad_norm': 0.8675621657450447, 'learning_rate': 6.324074174818961e-06, 'epoch': 0.43}
43%|████▎ | 14818/34278 [16:19:42<18:12:36, 3.37s/it] {'loss': 0.1414, 'grad_norm': 0.709867319033397, 'learning_rate': 6.323618600991781e-06, 'epoch': 0.43}
43%|████▎ | 14819/34278 [16:19:45<17:50:50, 3.30s/it] {'loss': 0.1476, 'grad_norm': 0.7159572321632133, 'learning_rate': 6.323163015347648e-06, 'epoch': 0.43}
43%|████▎ | 14820/34278 [16:19:49<19:07:44, 3.54s/it] {'loss': 0.1443, 'grad_norm': 0.7680476626555585, 'learning_rate': 6.322707417890631e-06, 'epoch': 0.43}
43%|████▎ | 14821/34278 [16:19:52<18:25:34, 3.41s/it] {'loss': 0.1141, 'grad_norm': 0.8288063399065261, 'learning_rate': 6.322251808624799e-06, 'epoch': 0.43}
43%|████▎ | 14822/34278 [16:19:58<22:18:34, 4.13s/it] {'loss': 0.1378, 'grad_norm': 0.6191127988079769, 'learning_rate': 6.321796187554217e-06, 'epoch': 0.43}
43%|████▎ | 14823/34278 [16:20:01<20:39:51, 3.82s/it] {'loss': 0.1489, 'grad_norm': 1.4845953736268853, 'learning_rate': 6.321340554682955e-06, 'epoch': 0.43}
43%|████▎ | 14824/34278 [16:20:05<20:45:40, 3.84s/it] {'loss': 0.1437, 'grad_norm': 1.2885937859330068, 'learning_rate': 6.320884910015079e-06, 'epoch': 0.43}
43%|████▎ | 14825/34278 [16:20:07<18:46:02, 3.47s/it] {'loss': 0.1098, 'grad_norm': 0.8138087163816917, 'learning_rate': 6.320429253554661e-06, 'epoch': 0.43}
43%|████▎ | 14826/34278 [16:20:12<20:17:40, 3.76s/it] {'loss': 0.1451, 'grad_norm': 0.8907509065064294, 'learning_rate': 6.319973585305762e-06, 'epoch': 0.43}
43%|████▎ | 14827/34278 [16:20:17<22:29:05, 4.16s/it] {'loss': 0.1552, 'grad_norm': 0.8669764743772568, 'learning_rate': 6.319517905272455e-06, 'epoch': 0.43}
43%|████▎ | 14828/34278 [16:20:21<23:01:09, 4.26s/it] {'loss': 0.1477, 'grad_norm': 1.0586071764642637, 'learning_rate': 6.319062213458808e-06, 'epoch': 0.43}
43%|████▎ | 14829/34278 [16:20:24<21:03:29, 3.90s/it] {'loss': 0.1427, 'grad_norm': 0.8915242741557635, 'learning_rate': 6.318606509868888e-06, 'epoch': 0.43}
43%|████▎ | 14830/34278 [16:20:28<20:41:54, 3.83s/it] {'loss': 0.13, 'grad_norm': 0.7399806161453074, 'learning_rate': 6.318150794506765e-06, 'epoch': 0.43}
43%|████▎ | 14831/34278 [16:20:31<19:27:41, 3.60s/it] {'loss': 0.1449, 'grad_norm': 0.939475842359072, 'learning_rate': 6.317695067376506e-06, 'epoch': 0.43}
43%|████▎ | 14832/34278 [16:20:34<18:00:28, 3.33s/it] {'loss': 0.114, 'grad_norm': 0.7400291112559871, 'learning_rate': 6.3172393284821775e-06, 'epoch': 0.43}
43%|████▎ | 14833/34278 [16:20:38<18:34:06, 3.44s/it] {'loss': 0.1268, 'grad_norm': 0.7331330322028462, 'learning_rate': 6.316783577827854e-06, 'epoch': 0.43}
43%|████▎ | 14834/34278 [16:20:41<17:58:36, 3.33s/it] {'loss': 0.1552, 'grad_norm': 1.0497645154772755, 'learning_rate': 6.3163278154176e-06, 'epoch': 0.43}
43%|████▎ | 14835/34278 [16:20:46<20:51:00, 3.86s/it] {'loss': 0.1213, 'grad_norm': 0.9624530482482117, 'learning_rate': 6.315872041255484e-06, 'epoch': 0.43}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
43%|████▎ | 14836/34278 [16:20:49<20:01:44, 3.71s/it] {'loss': 0.142, 'grad_norm': 1.0323126643004623, 'learning_rate': 6.3154162553455775e-06, 'epoch': 0.43}
43%|████▎ | 14837/34278 [16:20:52<18:52:35, 3.50s/it] {'loss': 0.1549, 'grad_norm': 0.980677544834582, 'learning_rate': 6.31496045769195e-06, 'epoch': 0.43}
43%|████▎ | 14838/34278 [16:20:56<20:18:54, 3.76s/it] {'loss': 0.1512, 'grad_norm': 0.8966870514937749, 'learning_rate': 6.314504648298667e-06, 'epoch': 0.43}
43%|████▎ | 14839/34278 [16:20:59<18:58:26, 3.51s/it] {'loss': 0.1196, 'grad_norm': 0.7708401591962507, 'learning_rate': 6.3140488271698015e-06, 'epoch': 0.43}
43%|████▎ | 14840/34278 [16:21:02<17:53:51, 3.31s/it] {'loss': 0.152, 'grad_norm': 0.9611765463011959, 'learning_rate': 6.3135929943094235e-06, 'epoch': 0.43}
43%|████▎ | 14841/34278 [16:21:05<17:49:43, 3.30s/it] {'loss': 0.1158, 'grad_norm': 0.7808004179677782, 'learning_rate': 6.313137149721597e-06, 'epoch': 0.43}
43%|████▎ | 14842/34278 [16:21:08<17:02:33, 3.16s/it] {'loss': 0.135, 'grad_norm': 0.7507308527471663, 'learning_rate': 6.312681293410399e-06, 'epoch': 0.43}
43%|████▎ | 14843/34278 [16:21:12<18:06:31, 3.35s/it] {'loss': 0.1319, 'grad_norm': 0.7333275014877074, 'learning_rate': 6.312225425379896e-06, 'epoch': 0.43}
43%|████▎ | 14844/34278 [16:21:16<18:40:51, 3.46s/it] {'loss': 0.1585, 'grad_norm': 0.7817531313835508, 'learning_rate': 6.311769545634154e-06, 'epoch': 0.43}
43%|████▎ | 14845/34278 [16:21:20<19:04:04, 3.53s/it] {'loss': 0.1431, 'grad_norm': 0.8548138312330888, 'learning_rate': 6.311313654177249e-06, 'epoch': 0.43}
43%|████▎ | 14846/34278 [16:21:23<19:45:04, 3.66s/it] {'loss': 0.1541, 'grad_norm': 0.7211573466060593, 'learning_rate': 6.310857751013248e-06, 'epoch': 0.43}
43%|████▎ | 14847/34278 [16:21:30<23:52:33, 4.42s/it] {'loss': 0.1229, 'grad_norm': 0.645913438630516, 'learning_rate': 6.3104018361462225e-06, 'epoch': 0.43}
43%|████▎ | 14848/34278 [16:21:33<21:52:42, 4.05s/it] {'loss': 0.139, 'grad_norm': 0.9447276864129122, 'learning_rate': 6.309945909580243e-06, 'epoch': 0.43}
43%|████▎ | 14849/34278 [16:21:37<21:31:37, 3.99s/it] {'loss': 0.1307, 'grad_norm': 0.7900679300978004, 'learning_rate': 6.309489971319378e-06, 'epoch': 0.43}
43%|████▎ | 14850/34278 [16:21:40<20:19:22, 3.77s/it] {'loss': 0.156, 'grad_norm': 0.8623032168405524, 'learning_rate': 6.309034021367699e-06, 'epoch': 0.43}
43%|████▎ | 14851/34278 [16:21:45<22:08:41, 4.10s/it] {'loss': 0.1723, 'grad_norm': 1.346691040609426, 'learning_rate': 6.308578059729278e-06, 'epoch': 0.43}
43%|████▎ | 14852/34278 [16:21:48<21:08:46, 3.92s/it] {'loss': 0.1391, 'grad_norm': 1.0181550369805636, 'learning_rate': 6.308122086408184e-06, 'epoch': 0.43}
43%|████▎ | 14853/34278 [16:21:52<20:23:59, 3.78s/it] {'loss': 0.1405, 'grad_norm': 0.66617601220997, 'learning_rate': 6.307666101408487e-06, 'epoch': 0.43}
43%|████▎ | 14854/34278 [16:21:56<21:28:12, 3.98s/it] {'loss': 0.1717, 'grad_norm': 1.0803410521516597, 'learning_rate': 6.30721010473426e-06, 'epoch': 0.43}
43%|████▎ | 14855/34278 [16:22:00<20:30:09, 3.80s/it] {'loss': 0.1304, 'grad_norm': 0.9935671643146751, 'learning_rate': 6.306754096389575e-06, 'epoch': 0.43}
43%|████▎ | 14856/34278 [16:22:03<20:13:25, 3.75s/it] {'loss': 0.1719, 'grad_norm': 0.9995445759687146, 'learning_rate': 6.306298076378499e-06, 'epoch': 0.43}
43%|████▎ | 14857/34278 [16:22:06<19:10:52, 3.56s/it] {'loss': 0.1458, 'grad_norm': 0.8930670840373487, 'learning_rate': 6.305842044705105e-06, 'epoch': 0.43}
43%|████▎ | 14858/34278 [16:22:12<23:10:43, 4.30s/it] {'loss': 0.1315, 'grad_norm': 0.729476831124679, 'learning_rate': 6.305386001373468e-06, 'epoch': 0.43}
43%|████▎ | 14859/34278 [16:22:16<21:31:02, 3.99s/it] {'loss': 0.1395, 'grad_norm': 0.8435005806158672, 'learning_rate': 6.3049299463876535e-06, 'epoch': 0.43}
43%|████▎ | 14860/34278 [16:22:19<20:26:58, 3.79s/it] {'loss': 0.1497, 'grad_norm': 0.9560165677874352, 'learning_rate': 6.304473879751738e-06, 'epoch': 0.43}
43%|████▎ | 14861/34278 [16:22:22<19:02:14, 3.53s/it] {'loss': 0.1444, 'grad_norm': 0.8345507564547471, 'learning_rate': 6.3040178014697905e-06, 'epoch': 0.43}
43%|████▎ | 14862/34278 [16:22:25<18:27:33, 3.42s/it] {'loss': 0.1392, 'grad_norm': 1.1152359866359134, 'learning_rate': 6.303561711545883e-06, 'epoch': 0.43}
43%|████▎ | 14863/34278 [16:22:29<18:43:47, 3.47s/it] {'loss': 0.1363, 'grad_norm': 0.8206332388495481, 'learning_rate': 6.303105609984087e-06, 'epoch': 0.43}
43%|████▎ | 14864/34278 [16:22:33<20:11:11, 3.74s/it] {'loss': 0.141, 'grad_norm': 0.7987801003957105, 'learning_rate': 6.302649496788476e-06, 'epoch': 0.43}
43%|████▎ | 14865/34278 [16:22:36<19:33:24, 3.63s/it] {'loss': 0.1303, 'grad_norm': 0.8427239224657411, 'learning_rate': 6.3021933719631215e-06, 'epoch': 0.43}
43%|████▎ | 14866/34278 [16:22:40<19:17:51, 3.58s/it] {'loss': 0.1272, 'grad_norm': 0.851809512482157, 'learning_rate': 6.301737235512096e-06, 'epoch': 0.43}
43%|████▎ | 14867/34278 [16:22:44<20:22:33, 3.78s/it] {'loss': 0.1356, 'grad_norm': 0.760481169540387, 'learning_rate': 6.301281087439469e-06, 'epoch': 0.43}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
43%|████▎ | 14868/34278 [16:22:48<20:10:42, 3.74s/it] {'loss': 0.1262, 'grad_norm': 0.8702583160884054, 'learning_rate': 6.300824927749317e-06, 'epoch': 0.43}
43%|████▎ | 14869/34278 [16:22:51<19:10:20, 3.56s/it] {'loss': 0.1411, 'grad_norm': 0.9830728223049329, 'learning_rate': 6.300368756445709e-06, 'epoch': 0.43}
43%|████▎ | 14870/34278 [16:22:57<23:04:16, 4.28s/it] {'loss': 0.1248, 'grad_norm': 0.7699137388317029, 'learning_rate': 6.299912573532723e-06, 'epoch': 0.43}
43%|████▎ | 14871/34278 [16:23:01<22:13:25, 4.12s/it] {'loss': 0.1584, 'grad_norm': 0.8193503189857513, 'learning_rate': 6.299456379014424e-06, 'epoch': 0.43}
43%|████▎ | 14872/34278 [16:23:04<20:20:46, 3.77s/it] {'loss': 0.1465, 'grad_norm': 0.7466214930730316, 'learning_rate': 6.299000172894889e-06, 'epoch': 0.43}
43%|████▎ | 14873/34278 [16:23:07<19:04:19, 3.54s/it] {'loss': 0.1463, 'grad_norm': 0.9512890462432595, 'learning_rate': 6.298543955178192e-06, 'epoch': 0.43}
43%|████▎ | 14874/34278 [16:23:10<19:05:06, 3.54s/it] {'loss': 0.1606, 'grad_norm': 0.9847601613888907, 'learning_rate': 6.298087725868403e-06, 'epoch': 0.43}
43%|████▎ | 14875/34278 [16:23:14<20:25:43, 3.79s/it] {'loss': 0.1286, 'grad_norm': 0.678758495208975, 'learning_rate': 6.2976314849695985e-06, 'epoch': 0.43}
43%|████▎ | 14876/34278 [16:23:18<19:51:30, 3.68s/it] {'loss': 0.132, 'grad_norm': 0.8780937813226559, 'learning_rate': 6.297175232485849e-06, 'epoch': 0.43}
43%|████▎ | 14877/34278 [16:23:21<19:26:38, 3.61s/it] {'loss': 0.1433, 'grad_norm': 1.015683950220094, 'learning_rate': 6.296718968421228e-06, 'epoch': 0.43}
43%|████▎ | 14878/34278 [16:23:26<20:46:23, 3.85s/it] {'loss': 0.1223, 'grad_norm': 0.7916238415866884, 'learning_rate': 6.296262692779811e-06, 'epoch': 0.43}
43%|████▎ | 14879/34278 [16:23:32<23:58:43, 4.45s/it] {'loss': 0.1095, 'grad_norm': 0.5392514728974622, 'learning_rate': 6.295806405565668e-06, 'epoch': 0.43}
43%|████▎ | 14880/34278 [16:23:35<22:24:28, 4.16s/it] {'loss': 0.1337, 'grad_norm': 0.8330809840747785, 'learning_rate': 6.295350106782877e-06, 'epoch': 0.43}
43%|████▎ | 14881/34278 [16:23:39<21:32:52, 4.00s/it] {'loss': 0.14, 'grad_norm': 1.262089795742695, 'learning_rate': 6.294893796435508e-06, 'epoch': 0.43}
43%|████▎ | 14882/34278 [16:23:42<20:22:43, 3.78s/it] {'loss': 0.1564, 'grad_norm': 0.866794026353123, 'learning_rate': 6.294437474527637e-06, 'epoch': 0.43}
43%|████▎ | 14883/34278 [16:23:47<22:52:52, 4.25s/it] {'loss': 0.1326, 'grad_norm': 0.7113868590294488, 'learning_rate': 6.293981141063336e-06, 'epoch': 0.43}
43%|████▎ | 14884/34278 [16:23:50<20:24:26, 3.79s/it] {'loss': 0.1393, 'grad_norm': 0.9040129207199873, 'learning_rate': 6.293524796046683e-06, 'epoch': 0.43}
43%|████▎ | 14885/34278 [16:23:53<19:47:56, 3.68s/it] {'loss': 0.1279, 'grad_norm': 0.9098606553163996, 'learning_rate': 6.293068439481749e-06, 'epoch': 0.43}
43%|████▎ | 14886/34278 [16:23:56<18:28:48, 3.43s/it] {'loss': 0.1355, 'grad_norm': 0.9560544364486673, 'learning_rate': 6.2926120713726055e-06, 'epoch': 0.43}
43%|████▎ | 14887/34278 [16:23:59<18:00:31, 3.34s/it] {'loss': 0.1274, 'grad_norm': 0.9017628328457967, 'learning_rate': 6.292155691723331e-06, 'epoch': 0.43}
43%|████▎ | 14888/34278 [16:24:02<17:22:00, 3.22s/it] {'loss': 0.1381, 'grad_norm': 1.0174906176820488, 'learning_rate': 6.291699300538001e-06, 'epoch': 0.43}
43%|████▎ | 14889/34278 [16:24:05<16:52:51, 3.13s/it] {'loss': 0.1125, 'grad_norm': 0.7896600965768121, 'learning_rate': 6.291242897820686e-06, 'epoch': 0.43}
43%|████▎ | 14890/34278 [16:24:09<17:08:03, 3.18s/it] {'loss': 0.141, 'grad_norm': 0.8736816177417097, 'learning_rate': 6.290786483575465e-06, 'epoch': 0.43}
43%|████▎ | 14891/34278 [16:24:12<17:00:07, 3.16s/it] {'loss': 0.1431, 'grad_norm': 0.8666296258123337, 'learning_rate': 6.290330057806408e-06, 'epoch': 0.43}
43%|████▎ | 14892/34278 [16:24:15<17:41:03, 3.28s/it] {'loss': 0.1285, 'grad_norm': 0.9022593673584454, 'learning_rate': 6.289873620517594e-06, 'epoch': 0.43}
43%|████▎ | 14893/34278 [16:24:19<18:42:37, 3.47s/it] {'loss': 0.1192, 'grad_norm': 0.7726111119294767, 'learning_rate': 6.289417171713095e-06, 'epoch': 0.43}
43%|████▎ | 14894/34278 [16:24:23<19:17:53, 3.58s/it] {'loss': 0.1531, 'grad_norm': 0.9764752659765082, 'learning_rate': 6.288960711396987e-06, 'epoch': 0.43}
43%|████▎ | 14895/34278 [16:24:26<18:38:32, 3.46s/it] {'loss': 0.1348, 'grad_norm': 1.1802016135698505, 'learning_rate': 6.288504239573348e-06, 'epoch': 0.43}
43%|████▎ | 14896/34278 [16:24:30<19:37:38, 3.65s/it] {'loss': 0.1306, 'grad_norm': 0.7652544979387879, 'learning_rate': 6.2880477562462475e-06, 'epoch': 0.43}
43%|████▎ | 14897/34278 [16:24:34<19:28:36, 3.62s/it] {'loss': 0.1353, 'grad_norm': 0.9221428203868074, 'learning_rate': 6.287591261419765e-06, 'epoch': 0.43}
43%|████▎ | 14898/34278 [16:24:37<19:15:04, 3.58s/it] {'loss': 0.1694, 'grad_norm': 0.9333945401115743, 'learning_rate': 6.287134755097977e-06, 'epoch': 0.43}
43%|████▎ | 14899/34278 [16:24:43<23:04:29, 4.29s/it] {'loss': 0.1511, 'grad_norm': 0.997088890826675, 'learning_rate': 6.2866782372849555e-06, 'epoch': 0.43}
43%|████▎ | 14900/34278 [16:24:48<23:21:24, 4.34s/it] {'loss': 0.1322, 'grad_norm': 0.7862373800922435, 'learning_rate': 6.286221707984778e-06, 'epoch': 0.43}
43%|████▎ | 14901/34278 [16:24:52<23:55:03, 4.44s/it] {'loss': 0.1375, 'grad_norm': 0.8032756680491813, 'learning_rate': 6.28576516720152e-06, 'epoch': 0.43}
43%|████▎ | 14902/34278 [16:24:56<21:45:15, 4.04s/it] {'loss': 0.1441, 'grad_norm': 0.8915256699252233, 'learning_rate': 6.285308614939259e-06, 'epoch': 0.43}
43%|████▎ | 14903/34278 [16:24:59<20:16:12, 3.77s/it] {'loss': 0.1562, 'grad_norm': 0.8143586950469169, 'learning_rate': 6.284852051202069e-06, 'epoch': 0.43}
43%|████▎ | 14904/34278 [16:25:03<20:53:00, 3.88s/it] {'loss': 0.1588, 'grad_norm': 0.8097365289970733, 'learning_rate': 6.284395475994024e-06, 'epoch': 0.43}
43%|████▎ | 14905/34278 [16:25:06<20:15:40, 3.77s/it] {'loss': 0.1258, 'grad_norm': 0.6316064871648317, 'learning_rate': 6.283938889319205e-06, 'epoch': 0.43}
43%|████▎ | 14906/34278 [16:25:09<19:13:49, 3.57s/it] {'loss': 0.1332, 'grad_norm': 0.7199241791446711, 'learning_rate': 6.283482291181686e-06, 'epoch': 0.43}
43%|████▎ | 14907/34278 [16:25:13<18:35:57, 3.46s/it] {'loss': 0.1295, 'grad_norm': 0.9549830821817287, 'learning_rate': 6.283025681585544e-06, 'epoch': 0.43}
43%|████▎ | 14908/34278 [16:25:19<22:31:41, 4.19s/it] {'loss': 0.132, 'grad_norm': 0.5998824774171313, 'learning_rate': 6.282569060534854e-06, 'epoch': 0.43}
43%|████▎ | 14909/34278 [16:25:22<20:43:54, 3.85s/it] {'loss': 0.1778, 'grad_norm': 0.8522325645392712, 'learning_rate': 6.2821124280336934e-06, 'epoch': 0.43}
43%|████▎ | 14910/34278 [16:25:25<19:14:06, 3.58s/it] {'loss': 0.1334, 'grad_norm': 0.9510984597363902, 'learning_rate': 6.28165578408614e-06, 'epoch': 0.43}
44%|████▎ | 14911/34278 [16:25:31<23:16:26, 4.33s/it] {'loss': 0.1362, 'grad_norm': 0.7994132652116095, 'learning_rate': 6.281199128696269e-06, 'epoch': 0.44}
44%|████▎ | 14912/34278 [16:25:34<21:26:57, 3.99s/it] {'loss': 0.1265, 'grad_norm': 0.7392170295309389, 'learning_rate': 6.280742461868159e-06, 'epoch': 0.44}
44%|████▎ | 14913/34278 [16:25:37<19:52:24, 3.69s/it] {'loss': 0.1386, 'grad_norm': 0.7661820989637711, 'learning_rate': 6.280285783605885e-06, 'epoch': 0.44}
44%|████▎ | 14914/34278 [16:25:40<19:31:44, 3.63s/it] {'loss': 0.1496, 'grad_norm': 0.815470895969181, 'learning_rate': 6.279829093913525e-06, 'epoch': 0.44}
44%|████▎ | 14915/34278 [16:25:44<20:14:48, 3.76s/it] {'loss': 0.1439, 'grad_norm': 0.9107718400772307, 'learning_rate': 6.2793723927951575e-06, 'epoch': 0.44}
44%|████▎ | 14916/34278 [16:25:48<20:16:04, 3.77s/it] {'loss': 0.1523, 'grad_norm': 0.9852545037025268, 'learning_rate': 6.278915680254858e-06, 'epoch': 0.44}
44%|████▎ | 14917/34278 [16:25:54<23:59:42, 4.46s/it] {'loss': 0.1344, 'grad_norm': 0.844753075866623, 'learning_rate': 6.2784589562967045e-06, 'epoch': 0.44}
44%|████▎ | 14918/34278 [16:25:57<21:27:36, 3.99s/it] {'loss': 0.1148, 'grad_norm': 0.7164684216736642, 'learning_rate': 6.278002220924776e-06, 'epoch': 0.44}
44%|████▎ | 14919/34278 [16:26:03<24:53:06, 4.63s/it] {'loss': 0.1262, 'grad_norm': 0.8886839598601174, 'learning_rate': 6.277545474143146e-06, 'epoch': 0.44}
44%|████▎ | 14920/34278 [16:26:09<27:09:54, 5.05s/it] {'loss': 0.1341, 'grad_norm': 0.9947075280271497, 'learning_rate': 6.277088715955898e-06, 'epoch': 0.44}
44%|████▎ | 14921/34278 [16:26:13<24:27:38, 4.55s/it] {'loss': 0.1653, 'grad_norm': 1.2448171324132684, 'learning_rate': 6.276631946367106e-06, 'epoch': 0.44}
44%|████▎ | 14922/34278 [16:26:16<21:48:07, 4.05s/it] {'loss': 0.1308, 'grad_norm': 0.9888601040175079, 'learning_rate': 6.276175165380847e-06, 'epoch': 0.44}
44%|████▎ | 14923/34278 [16:26:18<19:56:15, 3.71s/it] {'loss': 0.1321, 'grad_norm': 0.9215285039816433, 'learning_rate': 6.275718373001203e-06, 'epoch': 0.44}
44%|████▎ | 14924/34278 [16:26:21<18:49:51, 3.50s/it] {'loss': 0.1367, 'grad_norm': 0.7205768417382196, 'learning_rate': 6.2752615692322485e-06, 'epoch': 0.44}
44%|████▎ | 14925/34278 [16:26:25<18:15:48, 3.40s/it] {'loss': 0.1131, 'grad_norm': 0.7417452238793568, 'learning_rate': 6.274804754078063e-06, 'epoch': 0.44}
44%|████▎ | 14926/34278 [16:26:31<23:20:15, 4.34s/it] {'loss': 0.1343, 'grad_norm': 1.022941663769502, 'learning_rate': 6.2743479275427255e-06, 'epoch': 0.44}
44%|████▎ | 14927/34278 [16:26:35<22:27:03, 4.18s/it] {'loss': 0.1701, 'grad_norm': 1.143967356093502, 'learning_rate': 6.273891089630313e-06, 'epoch': 0.44}
44%|████▎ | 14928/34278 [16:26:38<20:50:14, 3.88s/it] {'loss': 0.1265, 'grad_norm': 0.7368982907513386, 'learning_rate': 6.273434240344906e-06, 'epoch': 0.44}
44%|████▎ | 14929/34278 [16:26:42<20:43:34, 3.86s/it] {'loss': 0.128, 'grad_norm': 0.9170885634221387, 'learning_rate': 6.272977379690583e-06, 'epoch': 0.44}
44%|████▎ | 14930/34278 [16:26:45<19:47:17, 3.68s/it] {'loss': 0.1514, 'grad_norm': 1.0144873266327827, 'learning_rate': 6.2725205076714215e-06, 'epoch': 0.44}
44%|████▎ | 14931/34278 [16:26:49<20:07:29, 3.74s/it] {'loss': 0.1571, 'grad_norm': 0.8171141688562251, 'learning_rate': 6.272063624291498e-06, 'epoch': 0.44}
44%|████▎ | 14932/34278 [16:26:52<18:59:11, 3.53s/it] {'loss': 0.1408, 'grad_norm': 1.1620975943386103, 'learning_rate': 6.271606729554897e-06, 'epoch': 0.44}
44%|████▎ | 14933/34278 [16:26:56<18:55:40, 3.52s/it] {'loss': 0.1349, 'grad_norm': 1.1871283517654905, 'learning_rate': 6.271149823465693e-06, 'epoch': 0.44}
44%|████▎ | 14934/34278 [16:27:02<22:56:24, 4.27s/it] {'loss': 0.1352, 'grad_norm': 0.847428614997877, 'learning_rate': 6.270692906027968e-06, 'epoch': 0.44}
44%|████▎ | 14935/34278 [16:27:05<22:15:08, 4.14s/it] {'loss': 0.1473, 'grad_norm': 1.0326812778059362, 'learning_rate': 6.2702359772458e-06, 'epoch': 0.44}
44%|████▎ | 14936/34278 [16:27:09<21:46:42, 4.05s/it] {'loss': 0.1451, 'grad_norm': 0.9176456457317715, 'learning_rate': 6.269779037123267e-06, 'epoch': 0.44}
44%|████▎ | 14937/34278 [16:27:13<21:38:20, 4.03s/it] {'loss': 0.1432, 'grad_norm': 0.7759409629178708, 'learning_rate': 6.269322085664452e-06, 'epoch': 0.44}
44%|████▎ | 14938/34278 [16:27:18<22:17:35, 4.15s/it] {'loss': 0.171, 'grad_norm': 1.2262564725803873, 'learning_rate': 6.268865122873431e-06, 'epoch': 0.44}
44%|████▎ | 14939/34278 [16:27:22<21:43:49, 4.05s/it] {'loss': 0.1361, 'grad_norm': 0.829829756502998, 'learning_rate': 6.268408148754285e-06, 'epoch': 0.44}
44%|████▎ | 14940/34278 [16:27:25<20:05:55, 3.74s/it] {'loss': 0.1466, 'grad_norm': 0.8094543078533318, 'learning_rate': 6.267951163311095e-06, 'epoch': 0.44}
44%|████▎ | 14941/34278 [16:27:29<20:56:09, 3.90s/it] {'loss': 0.1413, 'grad_norm': 0.7789330195323289, 'learning_rate': 6.267494166547938e-06, 'epoch': 0.44}
44%|████▎ | 14942/34278 [16:27:34<22:27:51, 4.18s/it] {'loss': 0.1695, 'grad_norm': 0.8812221258860311, 'learning_rate': 6.267037158468897e-06, 'epoch': 0.44}
44%|████▎ | 14943/34278 [16:27:38<22:16:03, 4.15s/it] {'loss': 0.1426, 'grad_norm': 0.9116048139269639, 'learning_rate': 6.266580139078051e-06, 'epoch': 0.44}
44%|████▎ | 14944/34278 [16:27:41<20:27:36, 3.81s/it] {'loss': 0.1274, 'grad_norm': 0.7102903023901466, 'learning_rate': 6.266123108379478e-06, 'epoch': 0.44}
44%|████▎ | 14945/34278 [16:27:44<19:45:43, 3.68s/it] {'loss': 0.1406, 'grad_norm': 0.8306328790161557, 'learning_rate': 6.265666066377262e-06, 'epoch': 0.44}
44%|████▎ | 14946/34278 [16:27:47<19:05:09, 3.55s/it] {'loss': 0.1409, 'grad_norm': 1.0817500740617516, 'learning_rate': 6.265209013075481e-06, 'epoch': 0.44}
44%|████▎ | 14947/34278 [16:27:50<18:09:01, 3.38s/it] {'loss': 0.141, 'grad_norm': 0.8373492525705621, 'learning_rate': 6.264751948478216e-06, 'epoch': 0.44}
44%|████▎ | 14948/34278 [16:27:54<18:01:05, 3.36s/it] {'loss': 0.1501, 'grad_norm': 1.006839374873911, 'learning_rate': 6.264294872589547e-06, 'epoch': 0.44}
44%|████▎ | 14949/34278 [16:27:58<20:13:32, 3.77s/it] {'loss': 0.1353, 'grad_norm': 1.2613639955868652, 'learning_rate': 6.263837785413556e-06, 'epoch': 0.44}
44%|████▎ | 14950/34278 [16:28:01<18:42:13, 3.48s/it] {'loss': 0.1376, 'grad_norm': 0.844220750536089, 'learning_rate': 6.263380686954324e-06, 'epoch': 0.44}
44%|████▎ | 14951/34278 [16:28:04<18:14:54, 3.40s/it] {'loss': 0.137, 'grad_norm': 0.8256806551009317, 'learning_rate': 6.2629235772159266e-06, 'epoch': 0.44}
44%|████▎ | 14952/34278 [16:28:08<18:10:48,
3.39s/it] {'loss': 0.1512, 'grad_norm': 0.8015607269478686, 'learning_rate': 6.262466456202453e-06, 'epoch': 0.44} 44%|████▎ | 14952/34278 [16:28:08<18:10:48, 3.39s/it] 44%|████▎ | 14953/34278 [16:28:14<22:07:26, 4.12s/it] {'loss': 0.1394, 'grad_norm': 0.8865343343345692, 'learning_rate': 6.262009323917979e-06, 'epoch': 0.44} 44%|████▎ | 14953/34278 [16:28:14<22:07:26, 4.12s/it] 44%|████▎ | 14954/34278 [16:28:20<25:20:41, 4.72s/it] {'loss': 0.1637, 'grad_norm': 0.7404050238691539, 'learning_rate': 6.261552180366586e-06, 'epoch': 0.44} 44%|████▎ | 14954/34278 [16:28:20<25:20:41, 4.72s/it] 44%|████▎ | 14955/34278 [16:28:23<23:08:42, 4.31s/it] {'loss': 0.1636, 'grad_norm': 0.8766041674094689, 'learning_rate': 6.261095025552359e-06, 'epoch': 0.44} 44%|████▎ | 14955/34278 [16:28:23<23:08:42, 4.31s/it] 44%|████▎ | 14956/34278 [16:28:27<21:48:27, 4.06s/it] {'loss': 0.1458, 'grad_norm': 0.9401005495181551, 'learning_rate': 6.260637859479374e-06, 'epoch': 0.44} 44%|████▎ | 14956/34278 [16:28:27<21:48:27, 4.06s/it] 44%|████▎ | 14957/34278 [16:28:32<23:31:21, 4.38s/it] {'loss': 0.1086, 'grad_norm': 0.7294483064104669, 'learning_rate': 6.260180682151716e-06, 'epoch': 0.44} 44%|████▎ | 14957/34278 [16:28:32<23:31:21, 4.38s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None
  warnings.warn(
44%|████▎ | 14958/34278 [16:28:36<23:47:49, 4.43s/it] {'loss': 0.1393, 'grad_norm': 0.8767594795626354, 'learning_rate': 6.259723493573467e-06, 'epoch': 0.44} 44%|████▎ | 14958/34278 [16:28:36<23:47:49, 4.43s/it] 44%|████▎ | 14959/34278 [16:28:39<21:38:27, 4.03s/it] {'loss': 0.1594, 'grad_norm': 0.8694135397666469, 'learning_rate': 6.259266293748705e-06, 'epoch': 0.44} 44%|████▎ | 14959/34278 [16:28:39<21:38:27, 4.03s/it] 44%|████▎ | 14960/34278 [16:28:46<25:04:52, 4.67s/it] {'loss': 0.1464, 'grad_norm': 1.0462151480352384, 'learning_rate': 6.258809082681515e-06, 'epoch': 0.44} 44%|████▎ | 14960/34278 [16:28:46<25:04:52, 4.67s/it] 44%|████▎ | 14961/34278 [16:28:50<24:36:37, 4.59s/it] {'loss': 0.109, 'grad_norm': 0.9366276317687989, 'learning_rate': 6.258351860375979e-06, 'epoch': 0.44} 44%|████▎ | 14961/34278 [16:28:50<24:36:37, 4.59s/it] 44%|████▎ | 14962/34278 [16:28:54<23:21:50, 4.35s/it] {'loss': 0.1393, 'grad_norm': 0.9656065547459961, 'learning_rate': 6.257894626836176e-06, 'epoch': 0.44} 44%|████▎ | 14962/34278 [16:28:54<23:21:50, 4.35s/it] 44%|████▎ | 14963/34278 [16:28:58<22:57:32, 4.28s/it] {'loss': 0.1341, 'grad_norm': 0.9458351499044808, 'learning_rate': 6.257437382066191e-06, 'epoch': 0.44} 44%|████▎ | 14963/34278 [16:28:58<22:57:32, 4.28s/it] 44%|████▎ | 14964/34278 [16:29:01<21:38:21, 4.03s/it] {'loss': 0.1462, 'grad_norm': 0.7728687151152849, 'learning_rate': 6.256980126070107e-06, 'epoch': 0.44} 44%|████▎ | 14964/34278 [16:29:01<21:38:21, 4.03s/it] 44%|████▎ | 14965/34278 [16:29:05<20:27:44, 3.81s/it] {'loss': 0.1398, 'grad_norm': 0.7792620701017975, 'learning_rate': 6.256522858852003e-06, 'epoch': 0.44} 44%|████▎ | 14965/34278 [16:29:05<20:27:44, 3.81s/it] 44%|████▎ | 14966/34278 [16:29:08<19:17:36, 3.60s/it] {'loss': 0.164, 'grad_norm': 0.795409314591477, 'learning_rate': 6.256065580415962e-06, 'epoch': 0.44} 44%|████▎ | 14966/34278 [16:29:08<19:17:36, 3.60s/it] 44%|████▎ | 14967/34278 [16:29:11<18:31:32,
3.45s/it] {'loss': 0.1268, 'grad_norm': 0.9598838014143705, 'learning_rate': 6.2556082907660685e-06, 'epoch': 0.44} 44%|████▎ | 14967/34278 [16:29:11<18:31:32, 3.45s/it] 44%|████▎ | 14968/34278 [16:29:17<22:39:38, 4.22s/it] {'loss': 0.1431, 'grad_norm': 0.8031767049728477, 'learning_rate': 6.255150989906405e-06, 'epoch': 0.44} 44%|████▎ | 14968/34278 [16:29:17<22:39:38, 4.22s/it] 44%|████▎ | 14969/34278 [16:29:21<22:12:56, 4.14s/it] {'loss': 0.145, 'grad_norm': 1.1100118664682679, 'learning_rate': 6.254693677841051e-06, 'epoch': 0.44} 44%|████▎ | 14969/34278 [16:29:21<22:12:56, 4.14s/it] 44%|████▎ | 14970/34278 [16:29:27<25:12:52, 4.70s/it] {'loss': 0.1401, 'grad_norm': 0.7209447122471424, 'learning_rate': 6.254236354574092e-06, 'epoch': 0.44} 44%|████▎ | 14970/34278 [16:29:27<25:12:52, 4.70s/it] 44%|████▎ | 14971/34278 [16:29:30<22:29:43, 4.19s/it] {'loss': 0.1217, 'grad_norm': 0.7933848661987428, 'learning_rate': 6.25377902010961e-06, 'epoch': 0.44} 44%|████▎ | 14971/34278 [16:29:30<22:29:43, 4.19s/it] 44%|████▎ | 14972/34278 [16:29:33<21:16:14, 3.97s/it] {'loss': 0.1393, 'grad_norm': 0.8697372108140659, 'learning_rate': 6.253321674451689e-06, 'epoch': 0.44} 44%|████▎ | 14972/34278 [16:29:33<21:16:14, 3.97s/it] 44%|████▎ | 14973/34278 [16:29:36<19:52:23, 3.71s/it] {'loss': 0.1579, 'grad_norm': 0.8834304271180028, 'learning_rate': 6.252864317604411e-06, 'epoch': 0.44} 44%|████▎ | 14973/34278 [16:29:36<19:52:23, 3.71s/it] 44%|████▎ | 14974/34278 [16:29:39<18:37:06, 3.47s/it] {'loss': 0.1436, 'grad_norm': 0.8534089619224059, 'learning_rate': 6.252406949571858e-06, 'epoch': 0.44} 44%|████▎ | 14974/34278 [16:29:39<18:37:06, 3.47s/it] 44%|████▎ | 14975/34278 [16:29:42<17:26:32, 3.25s/it] {'loss': 0.1305, 'grad_norm': 0.8595474014569607, 'learning_rate': 6.2519495703581165e-06, 'epoch': 0.44} 44%|████▎ | 14975/34278 [16:29:42<17:26:32, 3.25s/it] 44%|████▎ | 14976/34278 [16:29:48<21:50:01, 4.07s/it] {'loss': 0.1445, 'grad_norm': 1.116247587712783, 'learning_rate': 
6.2514921799672675e-06, 'epoch': 0.44} 44%|████▎ | 14976/34278 [16:29:48<21:50:01, 4.07s/it] 44%|████▎ | 14977/34278 [16:29:51<20:18:24, 3.79s/it] {'loss': 0.1363, 'grad_norm': 1.06846417250696, 'learning_rate': 6.251034778403396e-06, 'epoch': 0.44} 44%|████▎ | 14977/34278 [16:29:51<20:18:24, 3.79s/it] 44%|████▎ | 14978/34278 [16:29:54<19:38:55, 3.67s/it] {'loss': 0.1361, 'grad_norm': 0.9021355009698474, 'learning_rate': 6.250577365670584e-06, 'epoch': 0.44} 44%|████▎ | 14978/34278 [16:29:54<19:38:55, 3.67s/it] 44%|████▎ | 14979/34278 [16:30:00<23:25:31, 4.37s/it] {'loss': 0.1362, 'grad_norm': 0.878252703424149, 'learning_rate': 6.250119941772915e-06, 'epoch': 0.44} 44%|████▎ | 14979/34278 [16:30:00<23:25:31, 4.37s/it] 44%|████▎ | 14980/34278 [16:30:04<22:22:10, 4.17s/it] {'loss': 0.1505, 'grad_norm': 0.961434453542702, 'learning_rate': 6.2496625067144755e-06, 'epoch': 0.44} 44%|████▎ | 14980/34278 [16:30:04<22:22:10, 4.17s/it] 44%|████▎ | 14981/34278 [16:30:10<24:42:32, 4.61s/it] {'loss': 0.1371, 'grad_norm': 0.6862118257027884, 'learning_rate': 6.249205060499345e-06, 'epoch': 0.44} 44%|████▎ | 14981/34278 [16:30:10<24:42:32, 4.61s/it] 44%|████▎ | 14982/34278 [16:30:14<23:11:29, 4.33s/it] {'loss': 0.1137, 'grad_norm': 0.9255086249462283, 'learning_rate': 6.248747603131612e-06, 'epoch': 0.44} 44%|████▎ | 14982/34278 [16:30:14<23:11:29, 4.33s/it] 44%|████▎ | 14983/34278 [16:30:17<21:46:31, 4.06s/it] {'loss': 0.1457, 'grad_norm': 0.9150855053642007, 'learning_rate': 6.2482901346153575e-06, 'epoch': 0.44} 44%|████▎ | 14983/34278 [16:30:17<21:46:31, 4.06s/it] 44%|████▎ | 14984/34278 [16:30:21<21:23:31, 3.99s/it] {'loss': 0.1751, 'grad_norm': 0.8467982806994376, 'learning_rate': 6.247832654954666e-06, 'epoch': 0.44} 44%|████▎ | 14984/34278 [16:30:21<21:23:31, 3.99s/it] 44%|████▎ | 14985/34278 [16:30:27<24:32:31, 4.58s/it] {'loss': 0.1263, 'grad_norm': 0.8952665039433834, 'learning_rate': 6.247375164153624e-06, 'epoch': 0.44} 44%|████▎ | 14985/34278 [16:30:27<24:32:31, 
4.58s/it] 44%|████▎ | 14986/34278 [16:30:30<22:44:53, 4.24s/it] {'loss': 0.1576, 'grad_norm': 0.698666395867167, 'learning_rate': 6.246917662216314e-06, 'epoch': 0.44} 44%|████▎ | 14986/34278 [16:30:30<22:44:53, 4.24s/it] 44%|████▎ | 14987/34278 [16:30:33<20:38:07, 3.85s/it] {'loss': 0.1541, 'grad_norm': 0.9601414635502714, 'learning_rate': 6.24646014914682e-06, 'epoch': 0.44} 44%|████▎ | 14987/34278 [16:30:33<20:38:07, 3.85s/it] 44%|████▎ | 14988/34278 [16:30:36<19:42:22, 3.68s/it] {'loss': 0.1543, 'grad_norm': 0.8327388951772458, 'learning_rate': 6.246002624949228e-06, 'epoch': 0.44} 44%|████▎ | 14988/34278 [16:30:36<19:42:22, 3.68s/it] 44%|████▎ | 14989/34278 [16:30:39<18:21:12, 3.43s/it] {'loss': 0.1337, 'grad_norm': 0.7883870737055048, 'learning_rate': 6.245545089627622e-06, 'epoch': 0.44} 44%|████▎ | 14989/34278 [16:30:39<18:21:12, 3.43s/it] 44%|████▎ | 14990/34278 [16:30:43<18:40:25, 3.49s/it] {'loss': 0.1703, 'grad_norm': 0.7564155832898626, 'learning_rate': 6.2450875431860855e-06, 'epoch': 0.44} 44%|████▎ | 14990/34278 [16:30:43<18:40:25, 3.49s/it] 44%|████▎ | 14991/34278 [16:30:47<19:40:51, 3.67s/it] {'loss': 0.1471, 'grad_norm': 0.8815126459544893, 'learning_rate': 6.244629985628706e-06, 'epoch': 0.44} 44%|████▎ | 14991/34278 [16:30:47<19:40:51, 3.67s/it] 44%|████▎ | 14992/34278 [16:30:50<18:22:41, 3.43s/it] {'loss': 0.1385, 'grad_norm': 0.8876949536195939, 'learning_rate': 6.2441724169595665e-06, 'epoch': 0.44} 44%|████▎ | 14992/34278 [16:30:50<18:22:41, 3.43s/it] 44%|████▎ | 14993/34278 [16:30:54<19:14:08, 3.59s/it] {'loss': 0.148, 'grad_norm': 0.7380826998864822, 'learning_rate': 6.243714837182753e-06, 'epoch': 0.44} 44%|████▎ | 14993/34278 [16:30:54<19:14:08, 3.59s/it] 44%|████▎ | 14994/34278 [16:30:57<18:44:31, 3.50s/it] {'loss': 0.1449, 'grad_norm': 0.8595716256583968, 'learning_rate': 6.24325724630235e-06, 'epoch': 0.44} 44%|████▎ | 14994/34278 [16:30:57<18:44:31, 3.50s/it] 44%|████▎ | 14995/34278 [16:31:01<18:52:38, 3.52s/it] {'loss': 0.1439, 
'grad_norm': 0.8102512183599935, 'learning_rate': 6.242799644322445e-06, 'epoch': 0.44} 44%|████▎ | 14995/34278 [16:31:01<18:52:38, 3.52s/it] 44%|████▎ | 14996/34278 [16:31:07<22:43:52, 4.24s/it] {'loss': 0.1231, 'grad_norm': 0.7982252801663071, 'learning_rate': 6.2423420312471185e-06, 'epoch': 0.44} 44%|████▎ | 14996/34278 [16:31:07<22:43:52, 4.24s/it] 44%|████▍ | 14997/34278 [16:31:10<20:47:26, 3.88s/it] {'loss': 0.1628, 'grad_norm': 0.7994109582589674, 'learning_rate': 6.241884407080461e-06, 'epoch': 0.44} 44%|████▍ | 14997/34278 [16:31:10<20:47:26, 3.88s/it] 44%|████▍ | 14998/34278 [16:31:13<20:17:15, 3.79s/it] {'loss': 0.1458, 'grad_norm': 0.7772701160558678, 'learning_rate': 6.241426771826555e-06, 'epoch': 0.44} 44%|████▍ | 14998/34278 [16:31:13<20:17:15, 3.79s/it] 44%|████▍ | 14999/34278 [16:31:16<19:14:45, 3.59s/it] {'loss': 0.1534, 'grad_norm': 0.8368242603943941, 'learning_rate': 6.240969125489486e-06, 'epoch': 0.44} 44%|████▍ | 14999/34278 [16:31:16<19:14:45, 3.59s/it] 44%|████▍ | 15000/34278 [16:31:20<19:24:39, 3.62s/it] {'loss': 0.1478, 'grad_norm': 0.8074603876020473, 'learning_rate': 6.240511468073343e-06, 'epoch': 0.44} 44%|████▍ | 15000/34278 [16:31:20<19:24:39, 3.62s/it] 44%|████▍ | 15001/34278 [16:31:25<21:48:54, 4.07s/it] {'loss': 0.1383, 'grad_norm': 0.7153335338167677, 'learning_rate': 6.2400537995822085e-06, 'epoch': 0.44} 44%|████▍ | 15001/34278 [16:31:25<21:48:54, 4.07s/it] 44%|████▍ | 15002/34278 [16:31:28<20:09:00, 3.76s/it] {'loss': 0.1434, 'grad_norm': 0.7662948335841253, 'learning_rate': 6.23959612002017e-06, 'epoch': 0.44} 44%|████▍ | 15002/34278 [16:31:28<20:09:00, 3.76s/it] 44%|████▍ | 15003/34278 [16:31:32<19:47:44, 3.70s/it] {'loss': 0.1436, 'grad_norm': 0.9111369640201166, 'learning_rate': 6.239138429391314e-06, 'epoch': 0.44} 44%|████▍ | 15003/34278 [16:31:32<19:47:44, 3.70s/it] 44%|████▍ | 15004/34278 [16:31:35<19:01:41, 3.55s/it] {'loss': 0.1317, 'grad_norm': 0.8663095010290788, 'learning_rate': 6.238680727699726e-06, 'epoch': 
0.44} 44%|████▍ | 15004/34278 [16:31:35<19:01:41, 3.55s/it] 44%|████▍ | 15005/34278 [16:31:38<18:04:39, 3.38s/it] {'loss': 0.1717, 'grad_norm': 1.058155303477025, 'learning_rate': 6.2382230149494906e-06, 'epoch': 0.44} 44%|████▍ | 15005/34278 [16:31:38<18:04:39, 3.38s/it] 44%|████▍ | 15006/34278 [16:31:41<18:06:34, 3.38s/it] {'loss': 0.1501, 'grad_norm': 0.8210054234280479, 'learning_rate': 6.237765291144696e-06, 'epoch': 0.44} 44%|████▍ | 15006/34278 [16:31:41<18:06:34, 3.38s/it] 44%|████▍ | 15007/34278 [16:31:44<17:10:25, 3.21s/it] {'loss': 0.1314, 'grad_norm': 0.8199647565725456, 'learning_rate': 6.237307556289429e-06, 'epoch': 0.44} 44%|████▍ | 15007/34278 [16:31:44<17:10:25, 3.21s/it] 44%|████▍ | 15008/34278 [16:31:48<18:30:33, 3.46s/it] {'loss': 0.1625, 'grad_norm': 1.223085026269827, 'learning_rate': 6.236849810387776e-06, 'epoch': 0.44} 44%|████▍ | 15008/34278 [16:31:48<18:30:33, 3.46s/it] 44%|████▍ | 15009/34278 [16:31:51<18:14:29, 3.41s/it] {'loss': 0.1234, 'grad_norm': 1.1614711613469009, 'learning_rate': 6.236392053443822e-06, 'epoch': 0.44} 44%|████▍ | 15009/34278 [16:31:51<18:14:29, 3.41s/it] 44%|████▍ | 15010/34278 [16:31:55<19:07:57, 3.57s/it] {'loss': 0.1158, 'grad_norm': 0.9369550405747571, 'learning_rate': 6.235934285461656e-06, 'epoch': 0.44} 44%|████▍ | 15010/34278 [16:31:55<19:07:57, 3.57s/it] 44%|████▍ | 15011/34278 [16:32:00<20:35:57, 3.85s/it] {'loss': 0.1246, 'grad_norm': 0.8493194095409426, 'learning_rate': 6.235476506445362e-06, 'epoch': 0.44} 44%|████▍ | 15011/34278 [16:32:00<20:35:57, 3.85s/it] 44%|████▍ | 15012/34278 [16:32:03<19:35:15, 3.66s/it] {'loss': 0.1556, 'grad_norm': 1.3030959463256946, 'learning_rate': 6.2350187163990314e-06, 'epoch': 0.44} 44%|████▍ | 15012/34278 [16:32:03<19:35:15, 3.66s/it] 44%|████▍ | 15013/34278 [16:32:06<18:36:11, 3.48s/it] {'loss': 0.1468, 'grad_norm': 0.9238553779865656, 'learning_rate': 6.234560915326747e-06, 'epoch': 0.44} 44%|████▍ | 15013/34278 [16:32:06<18:36:11, 3.48s/it] 44%|████▍ | 
15014/34278 [16:32:09<18:13:42, 3.41s/it] {'loss': 0.1363, 'grad_norm': 0.9696903616097787, 'learning_rate': 6.234103103232597e-06, 'epoch': 0.44} 44%|████▍ | 15014/34278 [16:32:09<18:13:42, 3.41s/it] 44%|████▍ | 15015/34278 [16:32:15<21:09:22, 3.95s/it] {'loss': 0.1583, 'grad_norm': 1.2887541279913717, 'learning_rate': 6.233645280120671e-06, 'epoch': 0.44} 44%|████▍ | 15015/34278 [16:32:15<21:09:22, 3.95s/it] 44%|████▍ | 15016/34278 [16:32:18<19:24:02, 3.63s/it] {'loss': 0.1368, 'grad_norm': 1.113477237477694, 'learning_rate': 6.233187445995053e-06, 'epoch': 0.44} 44%|████▍ | 15016/34278 [16:32:18<19:24:02, 3.63s/it] 44%|████▍ | 15017/34278 [16:32:22<21:25:33, 4.00s/it] {'loss': 0.1444, 'grad_norm': 0.9901409573743811, 'learning_rate': 6.232729600859832e-06, 'epoch': 0.44} 44%|████▍ | 15017/34278 [16:32:22<21:25:33, 4.00s/it] 44%|████▍ | 15018/34278 [16:32:28<24:47:50, 4.64s/it] {'loss': 0.1431, 'grad_norm': 0.9425912909861479, 'learning_rate': 6.232271744719094e-06, 'epoch': 0.44} 44%|████▍ | 15018/34278 [16:32:29<24:47:50, 4.64s/it] 44%|████▍ | 15019/34278 [16:32:32<22:19:57, 4.17s/it] {'loss': 0.1307, 'grad_norm': 0.8408452784656811, 'learning_rate': 6.23181387757693e-06, 'epoch': 0.44} 44%|████▍ | 15019/34278 [16:32:32<22:19:57, 4.17s/it] 44%|████▍ | 15020/34278 [16:32:35<21:28:13, 4.01s/it] {'loss': 0.1542, 'grad_norm': 0.8291973248767233, 'learning_rate': 6.231355999437425e-06, 'epoch': 0.44} 44%|████▍ | 15020/34278 [16:32:35<21:28:13, 4.01s/it] 44%|████▍ | 15021/34278 [16:32:39<20:22:38, 3.81s/it] {'loss': 0.1206, 'grad_norm': 0.887432534102003, 'learning_rate': 6.230898110304668e-06, 'epoch': 0.44} 44%|████▍ | 15021/34278 [16:32:39<20:22:38, 3.81s/it] 44%|████▍ | 15022/34278 [16:32:43<21:48:39, 4.08s/it] {'loss': 0.1462, 'grad_norm': 0.8618449996123551, 'learning_rate': 6.230440210182745e-06, 'epoch': 0.44} 44%|████▍ | 15022/34278 [16:32:43<21:48:39, 4.08s/it] 44%|████▍ | 15023/34278 [16:32:46<20:22:17, 3.81s/it] {'loss': 0.1338, 'grad_norm': 
0.8350368418007432, 'learning_rate': 6.2299822990757475e-06, 'epoch': 0.44} 44%|████▍ | 15023/34278 [16:32:46<20:22:17, 3.81s/it] 44%|████▍ | 15024/34278 [16:32:52<23:45:49, 4.44s/it] {'loss': 0.1292, 'grad_norm': 0.7501682509218951, 'learning_rate': 6.22952437698776e-06, 'epoch': 0.44} 44%|████▍ | 15024/34278 [16:32:52<23:45:49, 4.44s/it] 44%|████▍ | 15025/34278 [16:32:55<21:17:12, 3.98s/it] {'loss': 0.1189, 'grad_norm': 0.7891398829649721, 'learning_rate': 6.229066443922874e-06, 'epoch': 0.44} 44%|████▍ | 15025/34278 [16:32:55<21:17:12, 3.98s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fafef8b6660>
Failed to fetch sample 3291480.
Exception: cannot identify image file <_io.BytesIO object at 0x7fafef8b6660> 44%|████▍ | 15026/34278 [16:32:58<20:00:42, 3.74s/it] {'loss': 0.148, 'grad_norm': 0.8053139780836996, 'learning_rate': 6.228608499885174e-06, 'epoch': 0.44} 44%|████▍ | 15026/34278 [16:32:58<20:00:42, 3.74s/it] 44%|████▍ | 15027/34278 [16:33:02<19:14:40, 3.60s/it] {'loss': 0.1347, 'grad_norm': 0.7746703807058757, 'learning_rate': 6.228150544878754e-06, 'epoch': 0.44} 44%|████▍ | 15027/34278 [16:33:02<19:14:40, 3.60s/it] 44%|████▍ | 15028/34278 [16:33:05<18:31:41, 3.46s/it] {'loss': 0.1438, 'grad_norm': 0.7696282133756824, 'learning_rate': 6.227692578907697e-06, 'epoch': 0.44} 44%|████▍ | 15028/34278 [16:33:05<18:31:41, 3.46s/it] 44%|████▍ | 15029/34278 [16:33:09<20:18:42, 3.80s/it] {'loss': 0.149, 'grad_norm': 0.7889312279455797, 'learning_rate': 6.2272346019760936e-06, 'epoch': 0.44} 44%|████▍ | 15029/34278 [16:33:09<20:18:42, 3.80s/it] 44%|████▍ | 15030/34278 [16:33:13<19:23:19, 3.63s/it] {'loss': 0.1316, 'grad_norm': 0.8572841859126313, 'learning_rate': 6.2267766140880325e-06, 'epoch': 0.44} 44%|████▍ | 15030/34278 [16:33:13<19:23:19, 3.63s/it] 44%|████▍ | 15031/34278 [16:33:15<17:51:57, 3.34s/it] {'loss': 0.1376, 'grad_norm': 0.9308072763704206, 'learning_rate': 6.226318615247604e-06, 'epoch': 0.44} 44%|████▍ | 15031/34278 [16:33:15<17:51:57, 3.34s/it] 44%|████▍ | 15032/34278 [16:33:18<17:16:20, 3.23s/it] {'loss': 0.1418, 'grad_norm': 0.9821851734638208, 'learning_rate': 6.225860605458895e-06, 'epoch': 0.44} 44%|████▍ | 15032/34278 [16:33:18<17:16:20, 3.23s/it] 44%|████▍ | 15033/34278 [16:33:23<19:35:12, 3.66s/it] {'loss': 0.1475, 'grad_norm': 0.6586206659882248, 'learning_rate': 6.225402584725993e-06, 'epoch': 0.44} 44%|████▍ | 15033/34278 [16:33:23<19:35:12, 3.66s/it] 44%|████▍ | 15034/34278 [16:33:27<20:40:41, 3.87s/it] {'loss': 0.1403, 'grad_norm': 0.8719938041849924, 'learning_rate': 6.224944553052992e-06, 'epoch': 0.44} 44%|████▍ | 15034/34278 [16:33:27<20:40:41, 3.87s/it] 
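The `PIL.UnidentifiedImageError` traceback above means `pil_loader` hands `Image.open` a byte stream that is not a decodable image (e.g. a truncated or corrupted object fetched from Ceph); the dataset then logs `Failed to fetch sample …` and training continues. A minimal sketch of such a guard is below. This is hypothetical: the real `dataset.py` is not shown here, and only the names in the traceback (`pil_loader`, `Image.open(buff)`) are taken from it.

```python
import io

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes):
    """Decode image bytes with PIL; return None instead of raising on bad data.

    PIL raises UnidentifiedImageError when the byte stream is not a valid
    image, and OSError when the file is truncated mid-decode.
    """
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # force full decode so truncated files fail here, not later
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None
```

A `__getitem__` wrapper can then treat a `None` result as "failed to fetch sample" and retry with a different index, which matches the behavior visible in this log.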
44%|████▍ | 15035/34278 [16:33:31<19:37:11, 3.67s/it] {'loss': 0.1354, 'grad_norm': 0.7686687697514415, 'learning_rate': 6.224486510443978e-06, 'epoch': 0.44} 44%|████▍ | 15035/34278 [16:33:31<19:37:11, 3.67s/it] 44%|████▍ | 15036/34278 [16:33:34<18:42:03, 3.50s/it] {'loss': 0.1254, 'grad_norm': 0.7872009238631145, 'learning_rate': 6.2240284569030395e-06, 'epoch': 0.44} 44%|████▍ | 15036/34278 [16:33:34<18:42:03, 3.50s/it] 44%|████▍ | 15037/34278 [16:33:37<18:53:28, 3.53s/it] {'loss': 0.1316, 'grad_norm': 0.6871719476135176, 'learning_rate': 6.223570392434268e-06, 'epoch': 0.44} 44%|████▍ | 15037/34278 [16:33:37<18:53:28, 3.53s/it] 44%|████▍ | 15038/34278 [16:33:41<19:12:14, 3.59s/it] {'loss': 0.1475, 'grad_norm': 1.059686281698016, 'learning_rate': 6.223112317041751e-06, 'epoch': 0.44} 44%|████▍ | 15038/34278 [16:33:41<19:12:14, 3.59s/it] 44%|████▍ | 15039/34278 [16:33:44<17:32:06, 3.28s/it] {'loss': 0.1137, 'grad_norm': 0.7529910432543049, 'learning_rate': 6.222654230729582e-06, 'epoch': 0.44} 44%|████▍ | 15039/34278 [16:33:44<17:32:06, 3.28s/it] 44%|████▍ | 15040/34278 [16:33:47<17:20:35, 3.25s/it] {'loss': 0.1392, 'grad_norm': 1.4046174304789645, 'learning_rate': 6.2221961335018464e-06, 'epoch': 0.44} 44%|████▍ | 15040/34278 [16:33:47<17:20:35, 3.25s/it] 44%|████▍ | 15041/34278 [16:33:50<16:47:07, 3.14s/it] {'loss': 0.1434, 'grad_norm': 0.7354673971840441, 'learning_rate': 6.2217380253626346e-06, 'epoch': 0.44} 44%|████▍ | 15041/34278 [16:33:50<16:47:07, 3.14s/it] 44%|████▍ | 15042/34278 [16:33:53<17:52:52, 3.35s/it] {'loss': 0.129, 'grad_norm': 0.7909726582936715, 'learning_rate': 6.221279906316039e-06, 'epoch': 0.44} 44%|████▍ | 15042/34278 [16:33:53<17:52:52, 3.35s/it] 44%|████▍ | 15043/34278 [16:33:59<21:45:22, 4.07s/it] {'loss': 0.1347, 'grad_norm': 0.7945483450078731, 'learning_rate': 6.220821776366146e-06, 'epoch': 0.44} 44%|████▍ | 15043/34278 [16:33:59<21:45:22, 4.07s/it] 44%|████▍ | 15044/34278 [16:34:02<20:26:56, 3.83s/it] {'loss': 0.1185, 
'grad_norm': 0.7010766125587274, 'learning_rate': 6.2203636355170485e-06, 'epoch': 0.44} 44%|████▍ | 15044/34278 [16:34:02<20:26:56, 3.83s/it] 44%|████▍ | 15045/34278 [16:34:06<19:23:55, 3.63s/it] {'loss': 0.1544, 'grad_norm': 0.77055252530375, 'learning_rate': 6.219905483772837e-06, 'epoch': 0.44} 44%|████▍ | 15045/34278 [16:34:06<19:23:55, 3.63s/it] 44%|████▍ | 15046/34278 [16:34:09<18:45:59, 3.51s/it] {'loss': 0.1216, 'grad_norm': 0.6051215517554468, 'learning_rate': 6.2194473211376e-06, 'epoch': 0.44} 44%|████▍ | 15046/34278 [16:34:09<18:45:59, 3.51s/it] 44%|████▍ | 15047/34278 [16:34:15<22:45:01, 4.26s/it] {'loss': 0.1486, 'grad_norm': 0.7614440524907657, 'learning_rate': 6.218989147615426e-06, 'epoch': 0.44} 44%|████▍ | 15047/34278 [16:34:15<22:45:01, 4.26s/it] 44%|████▍ | 15048/34278 [16:34:18<20:50:09, 3.90s/it] {'loss': 0.1438, 'grad_norm': 0.7310503216162159, 'learning_rate': 6.218530963210411e-06, 'epoch': 0.44} 44%|████▍ | 15048/34278 [16:34:18<20:50:09, 3.90s/it] 44%|████▍ | 15049/34278 [16:34:21<19:37:22, 3.67s/it] {'loss': 0.15, 'grad_norm': 0.8824424699031387, 'learning_rate': 6.21807276792664e-06, 'epoch': 0.44} 44%|████▍ | 15049/34278 [16:34:21<19:37:22, 3.67s/it] 44%|████▍ | 15050/34278 [16:34:24<18:02:25, 3.38s/it] {'loss': 0.1269, 'grad_norm': 1.1171730097733568, 'learning_rate': 6.217614561768208e-06, 'epoch': 0.44} 44%|████▍ | 15050/34278 [16:34:24<18:02:25, 3.38s/it] 44%|████▍ | 15051/34278 [16:34:28<19:42:31, 3.69s/it] {'loss': 0.1411, 'grad_norm': 0.6888003281689934, 'learning_rate': 6.217156344739203e-06, 'epoch': 0.44} 44%|████▍ | 15051/34278 [16:34:28<19:42:31, 3.69s/it] 44%|████▍ | 15052/34278 [16:34:31<18:57:58, 3.55s/it] {'loss': 0.1229, 'grad_norm': 0.6657787649455748, 'learning_rate': 6.2166981168437165e-06, 'epoch': 0.44} 44%|████▍ | 15052/34278 [16:34:31<18:57:58, 3.55s/it] 44%|████▍ | 15053/34278 [16:34:35<18:49:43, 3.53s/it] {'loss': 0.1442, 'grad_norm': 0.7578786141444442, 'learning_rate': 6.21623987808584e-06, 'epoch': 0.44} 
44%|████▍ | 15053/34278 [16:34:35<18:49:43, 3.53s/it] 44%|████▍ | 15054/34278 [16:34:38<18:05:47, 3.39s/it] {'loss': 0.1484, 'grad_norm': 0.8051741284438174, 'learning_rate': 6.215781628469663e-06, 'epoch': 0.44} 44%|████▍ | 15054/34278 [16:34:38<18:05:47, 3.39s/it] 44%|████▍ | 15055/34278 [16:34:41<17:27:58, 3.27s/it] {'loss': 0.1431, 'grad_norm': 0.8568323410142273, 'learning_rate': 6.2153233679992805e-06, 'epoch': 0.44} 44%|████▍ | 15055/34278 [16:34:41<17:27:58, 3.27s/it] 44%|████▍ | 15056/34278 [16:34:44<16:44:18, 3.13s/it] {'loss': 0.1355, 'grad_norm': 0.7701121083595855, 'learning_rate': 6.214865096678779e-06, 'epoch': 0.44} 44%|████▍ | 15056/34278 [16:34:44<16:44:18, 3.13s/it] 44%|████▍ | 15057/34278 [16:34:47<17:30:09, 3.28s/it] {'loss': 0.1419, 'grad_norm': 0.9624893288209205, 'learning_rate': 6.214406814512254e-06, 'epoch': 0.44} 44%|████▍ | 15057/34278 [16:34:47<17:30:09, 3.28s/it] 44%|████▍ | 15058/34278 [16:34:53<20:36:07, 3.86s/it] {'loss': 0.1653, 'grad_norm': 0.7941603210473188, 'learning_rate': 6.213948521503793e-06, 'epoch': 0.44} 44%|████▍ | 15058/34278 [16:34:53<20:36:07, 3.86s/it] 44%|████▍ | 15059/34278 [16:34:56<20:19:43, 3.81s/it] {'loss': 0.1404, 'grad_norm': 0.7443878015701055, 'learning_rate': 6.2134902176574884e-06, 'epoch': 0.44} 44%|████▍ | 15059/34278 [16:34:56<20:19:43, 3.81s/it] 44%|████▍ | 15060/34278 [16:35:00<19:40:13, 3.68s/it] {'loss': 0.1166, 'grad_norm': 0.8533354162846383, 'learning_rate': 6.213031902977436e-06, 'epoch': 0.44} 44%|████▍ | 15060/34278 [16:35:00<19:40:13, 3.68s/it] 44%|████▍ | 15061/34278 [16:35:06<23:22:09, 4.38s/it] {'loss': 0.1437, 'grad_norm': 0.9405274831200605, 'learning_rate': 6.212573577467722e-06, 'epoch': 0.44} 44%|████▍ | 15061/34278 [16:35:06<23:22:09, 4.38s/it] 44%|████▍ | 15062/34278 [16:35:10<22:32:49, 4.22s/it] {'loss': 0.1453, 'grad_norm': 0.7169491241073812, 'learning_rate': 6.212115241132441e-06, 'epoch': 0.44} 44%|████▍ | 15062/34278 [16:35:10<22:32:49, 4.22s/it] 44%|████▍ | 15063/34278 
[16:35:12<20:26:09, 3.83s/it] {'loss': 0.1327, 'grad_norm': 1.0823337820424184, 'learning_rate': 6.211656893975685e-06, 'epoch': 0.44} 44%|████▍ | 15063/34278 [16:35:12<20:26:09, 3.83s/it] 44%|████▍ | 15064/34278 [16:35:16<20:27:41, 3.83s/it] {'loss': 0.1303, 'grad_norm': 0.8252949874601155, 'learning_rate': 6.211198536001545e-06, 'epoch': 0.44} 44%|████▍ | 15064/34278 [16:35:16<20:27:41, 3.83s/it] 44%|████▍ | 15065/34278 [16:35:19<18:47:59, 3.52s/it] {'loss': 0.1398, 'grad_norm': 1.031682793059374, 'learning_rate': 6.210740167214114e-06, 'epoch': 0.44} 44%|████▍ | 15065/34278 [16:35:19<18:47:59, 3.52s/it] 44%|████▍ | 15066/34278 [16:35:22<17:46:27, 3.33s/it] {'loss': 0.15, 'grad_norm': 0.7706924572131718, 'learning_rate': 6.210281787617483e-06, 'epoch': 0.44} 44%|████▍ | 15066/34278 [16:35:22<17:46:27, 3.33s/it] 44%|████▍ | 15067/34278 [16:35:26<18:33:04, 3.48s/it] {'loss': 0.1258, 'grad_norm': 0.832935142749437, 'learning_rate': 6.209823397215746e-06, 'epoch': 0.44} 44%|████▍ | 15067/34278 [16:35:26<18:33:04, 3.48s/it] 44%|████▍ | 15068/34278 [16:35:30<19:37:24, 3.68s/it] {'loss': 0.1274, 'grad_norm': 0.9778245354120296, 'learning_rate': 6.209364996012994e-06, 'epoch': 0.44} 44%|████▍ | 15068/34278 [16:35:30<19:37:24, 3.68s/it] 44%|████▍ | 15069/34278 [16:35:33<18:45:53, 3.52s/it] {'loss': 0.1301, 'grad_norm': 0.89526369817899, 'learning_rate': 6.20890658401332e-06, 'epoch': 0.44} 44%|████▍ | 15069/34278 [16:35:33<18:45:53, 3.52s/it] 44%|████▍ | 15070/34278 [16:35:36<18:04:45, 3.39s/it] {'loss': 0.1338, 'grad_norm': 0.6959084421482379, 'learning_rate': 6.208448161220818e-06, 'epoch': 0.44} 44%|████▍ | 15070/34278 [16:35:36<18:04:45, 3.39s/it] 44%|████▍ | 15071/34278 [16:35:40<19:20:43, 3.63s/it] {'loss': 0.1257, 'grad_norm': 1.5306102217470179, 'learning_rate': 6.207989727639577e-06, 'epoch': 0.44} 44%|████▍ | 15071/34278 [16:35:40<19:20:43, 3.63s/it] 44%|████▍ | 15072/34278 [16:35:44<18:53:34, 3.54s/it] {'loss': 0.157, 'grad_norm': 0.9210846705347121, 
'learning_rate': 6.2075312832736945e-06, 'epoch': 0.44} 44%|████▍ | 15072/34278 [16:35:44<18:53:34, 3.54s/it] 44%|████▍ | 15073/34278 [16:35:47<18:29:32, 3.47s/it] {'loss': 0.1411, 'grad_norm': 0.8404251767950718, 'learning_rate': 6.2070728281272594e-06, 'epoch': 0.44} 44%|████▍ | 15073/34278 [16:35:47<18:29:32, 3.47s/it] 44%|████▍ | 15074/34278 [16:35:50<17:45:55, 3.33s/it] {'loss': 0.1296, 'grad_norm': 0.763211868240203, 'learning_rate': 6.206614362204366e-06, 'epoch': 0.44} 44%|████▍ | 15074/34278 [16:35:50<17:45:55, 3.33s/it] 44%|████▍ | 15075/34278 [16:35:53<17:19:29, 3.25s/it] {'loss': 0.1219, 'grad_norm': 0.883705407748208, 'learning_rate': 6.206155885509108e-06, 'epoch': 0.44} 44%|████▍ | 15075/34278 [16:35:53<17:19:29, 3.25s/it] 44%|████▍ | 15076/34278 [16:35:58<20:45:02, 3.89s/it] {'loss': 0.1237, 'grad_norm': 0.9266157241004488, 'learning_rate': 6.2056973980455795e-06, 'epoch': 0.44} 44%|████▍ | 15076/34278 [16:35:58<20:45:02, 3.89s/it] 44%|████▍ | 15077/34278 [16:36:02<20:30:19, 3.84s/it] {'loss': 0.1742, 'grad_norm': 0.9700898960791888, 'learning_rate': 6.2052388998178705e-06, 'epoch': 0.44} 44%|████▍ | 15077/34278 [16:36:02<20:30:19, 3.84s/it] 44%|████▍ | 15078/34278 [16:36:05<19:27:55, 3.65s/it] {'loss': 0.1336, 'grad_norm': 0.9618272210553835, 'learning_rate': 6.2047803908300776e-06, 'epoch': 0.44} 44%|████▍ | 15078/34278 [16:36:05<19:27:55, 3.65s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f06c7eebc40>
Failed to fetch sample 2654738. Exception: cannot identify image file <_io.BytesIO object at 0x7f06c7eebc40>
44%|████▍ | 15079/34278 [16:36:09<18:48:16, 3.53s/it] {'loss': 0.133, 'grad_norm': 0.863116055024015, 'learning_rate': 6.204321871086292e-06, 'epoch': 0.44} 44%|████▍ | 15079/34278 [16:36:09<18:48:16, 3.53s/it] 44%|████▍ | 15080/34278 [16:36:13<20:34:16, 3.86s/it] {'loss': 0.1354, 'grad_norm': 1.0521571374995566, 'learning_rate': 6.203863340590609e-06, 'epoch': 0.44} 44%|████▍ | 15080/34278 [16:36:13<20:34:16, 3.86s/it] 44%|████▍ | 15081/34278 [16:36:16<19:18:40, 3.62s/it] {'loss': 0.1264, 'grad_norm': 0.9063771646807343, 'learning_rate': 6.203404799347122e-06, 'epoch': 0.44} 44%|████▍ | 15081/34278 [16:36:16<19:18:40, 3.62s/it] 44%|████▍ | 15082/34278 [16:36:20<18:55:00, 3.55s/it] {'loss': 0.1477, 'grad_norm': 0.9149711879709472, 'learning_rate': 6.202946247359922e-06, 'epoch': 0.44} 44%|████▍ | 15082/34278 [16:36:20<18:55:00, 3.55s/it] 44%|████▍ | 15083/34278 [16:36:23<19:18:57, 3.62s/it] {'loss': 0.141, 'grad_norm': 0.979774442979104, 'learning_rate': 6.202487684633107e-06, 'epoch': 0.44} 44%|████▍ | 15083/34278 [16:36:24<19:18:57, 3.62s/it] 44%|████▍ | 15084/34278 [16:36:26<18:11:45, 3.41s/it] {'loss': 0.1393, 'grad_norm': 0.9123487969392395,
'learning_rate': 6.202029111170769e-06, 'epoch': 0.44} 44%|████▍ | 15084/34278 [16:36:26<18:11:45, 3.41s/it] 44%|████▍ | 15085/34278 [16:36:29<17:30:21, 3.28s/it] {'loss': 0.1287, 'grad_norm': 0.8849756322792716, 'learning_rate': 6.201570526977001e-06, 'epoch': 0.44} 44%|████▍ | 15085/34278 [16:36:29<17:30:21, 3.28s/it] 44%|████▍ | 15086/34278 [16:36:34<19:29:48, 3.66s/it] {'loss': 0.1462, 'grad_norm': 0.8109589019185897, 'learning_rate': 6.2011119320558986e-06, 'epoch': 0.44} 44%|████▍ | 15086/34278 [16:36:34<19:29:48, 3.66s/it] 44%|████▍ | 15087/34278 [16:36:37<18:14:11, 3.42s/it] {'loss': 0.1255, 'grad_norm': 0.8302390639730577, 'learning_rate': 6.2006533264115564e-06, 'epoch': 0.44} 44%|████▍ | 15087/34278 [16:36:37<18:14:11, 3.42s/it] 44%|████▍ | 15088/34278 [16:36:43<22:36:34, 4.24s/it] {'loss': 0.1481, 'grad_norm': 0.7300103239122315, 'learning_rate': 6.2001947100480675e-06, 'epoch': 0.44} 44%|████▍ | 15088/34278 [16:36:43<22:36:34, 4.24s/it] 44%|████▍ | 15089/34278 [16:36:46<20:58:05, 3.93s/it] {'loss': 0.1314, 'grad_norm': 0.9699959274058704, 'learning_rate': 6.199736082969525e-06, 'epoch': 0.44} 44%|████▍ | 15089/34278 [16:36:46<20:58:05, 3.93s/it] 44%|████▍ | 15090/34278 [16:36:52<24:42:51, 4.64s/it] {'loss': 0.1622, 'grad_norm': 0.7641958932511366, 'learning_rate': 6.199277445180028e-06, 'epoch': 0.44} 44%|████▍ | 15090/34278 [16:36:52<24:42:51, 4.64s/it] 44%|████▍ | 15091/34278 [16:36:58<26:49:43, 5.03s/it] {'loss': 0.1461, 'grad_norm': 0.7577420220487728, 'learning_rate': 6.198818796683666e-06, 'epoch': 0.44} 44%|████▍ | 15091/34278 [16:36:58<26:49:43, 5.03s/it] 44%|████▍ | 15092/34278 [16:37:01<23:39:04, 4.44s/it] {'loss': 0.1396, 'grad_norm': 0.7869450527540598, 'learning_rate': 6.198360137484537e-06, 'epoch': 0.44} 44%|████▍ | 15092/34278 [16:37:01<23:39:04, 4.44s/it] 44%|████▍ | 15093/34278 [16:37:05<22:55:41, 4.30s/it] {'loss': 0.1396, 'grad_norm': 0.7217383889627569, 'learning_rate': 6.1979014675867345e-06, 'epoch': 0.44} 44%|████▍ | 15093/34278 
[16:37:05<22:55:41, 4.30s/it] 44%|████▍ | 15094/34278 [16:37:09<21:25:41, 4.02s/it] {'loss': 0.1183, 'grad_norm': 0.7495436122585886, 'learning_rate': 6.197442786994354e-06, 'epoch': 0.44} 44%|████▍ | 15094/34278 [16:37:09<21:25:41, 4.02s/it] 44%|████▍ | 15095/34278 [16:37:12<20:12:32, 3.79s/it] {'loss': 0.157, 'grad_norm': 0.8044067727692116, 'learning_rate': 6.1969840957114904e-06, 'epoch': 0.44} 44%|████▍ | 15095/34278 [16:37:12<20:12:32, 3.79s/it] 44%|████▍ | 15096/34278 [16:37:16<20:08:22, 3.78s/it] {'loss': 0.1701, 'grad_norm': 0.7255909826283716, 'learning_rate': 6.196525393742238e-06, 'epoch': 0.44} 44%|████▍ | 15096/34278 [16:37:16<20:08:22, 3.78s/it] 44%|████▍ | 15097/34278 [16:37:20<20:08:58, 3.78s/it] {'loss': 0.1358, 'grad_norm': 0.8239930130991369, 'learning_rate': 6.196066681090692e-06, 'epoch': 0.44} 44%|████▍ | 15097/34278 [16:37:20<20:08:58, 3.78s/it] 44%|████▍ | 15098/34278 [16:37:25<23:27:38, 4.40s/it] {'loss': 0.143, 'grad_norm': 0.8686482362650536, 'learning_rate': 6.1956079577609485e-06, 'epoch': 0.44} 44%|████▍ | 15098/34278 [16:37:25<23:27:38, 4.40s/it] 44%|████▍ | 15099/34278 [16:37:31<25:59:47, 4.88s/it] {'loss': 0.1392, 'grad_norm': 0.8572453656024628, 'learning_rate': 6.195149223757103e-06, 'epoch': 0.44} 44%|████▍ | 15099/34278 [16:37:31<25:59:47, 4.88s/it] 44%|████▍ | 15100/34278 [16:37:35<23:19:48, 4.38s/it] {'loss': 0.1489, 'grad_norm': 0.7974897270986314, 'learning_rate': 6.194690479083251e-06, 'epoch': 0.44} 44%|████▍ | 15100/34278 [16:37:35<23:19:48, 4.38s/it] 44%|████▍ | 15101/34278 [16:37:38<22:23:20, 4.20s/it] {'loss': 0.1479, 'grad_norm': 1.0348902345146171, 'learning_rate': 6.194231723743486e-06, 'epoch': 0.44} 44%|████▍ | 15101/34278 [16:37:38<22:23:20, 4.20s/it] 44%|████▍ | 15102/34278 [16:37:44<25:15:05, 4.74s/it] {'loss': 0.1756, 'grad_norm': 0.9820780283957675, 'learning_rate': 6.193772957741907e-06, 'epoch': 0.44} 44%|████▍ | 15102/34278 [16:37:44<25:15:05, 4.74s/it] 44%|████▍ | 15103/34278 [16:37:50<27:09:08, 
5.10s/it] {'loss': 0.14, 'grad_norm': 0.8434465511246421, 'learning_rate': 6.193314181082607e-06, 'epoch': 0.44} 44%|████▍ | 15103/34278 [16:37:50<27:09:08, 5.10s/it] 44%|████▍ | 15104/34278 [16:37:56<28:37:16, 5.37s/it] {'loss': 0.1337, 'grad_norm': 0.6533601682398729, 'learning_rate': 6.192855393769683e-06, 'epoch': 0.44} 44%|████▍ | 15104/34278 [16:37:56<28:37:16, 5.37s/it] 44%|████▍ | 15105/34278 [16:38:00<25:06:17, 4.71s/it] {'loss': 0.1576, 'grad_norm': 0.8235956027060124, 'learning_rate': 6.192396595807231e-06, 'epoch': 0.44} 44%|████▍ | 15105/34278 [16:38:00<25:06:17, 4.71s/it] 44%|████▍ | 15106/34278 [16:38:03<22:21:22, 4.20s/it] {'loss': 0.1479, 'grad_norm': 1.0340137000191478, 'learning_rate': 6.191937787199347e-06, 'epoch': 0.44} 44%|████▍ | 15106/34278 [16:38:03<22:21:22, 4.20s/it] 44%|████▍ | 15107/34278 [16:38:06<20:39:15, 3.88s/it] {'loss': 0.1302, 'grad_norm': 0.9728982450315318, 'learning_rate': 6.1914789679501266e-06, 'epoch': 0.44} 44%|████▍ | 15107/34278 [16:38:06<20:39:15, 3.88s/it] 44%|████▍ | 15108/34278 [16:38:12<23:59:35, 4.51s/it] {'loss': 0.1589, 'grad_norm': 0.8074227064702215, 'learning_rate': 6.191020138063666e-06, 'epoch': 0.44} 44%|████▍ | 15108/34278 [16:38:12<23:59:35, 4.51s/it] 44%|████▍ | 15109/34278 [16:38:15<21:58:10, 4.13s/it] {'loss': 0.1296, 'grad_norm': 0.9898810044190635, 'learning_rate': 6.190561297544063e-06, 'epoch': 0.44} 44%|████▍ | 15109/34278 [16:38:15<21:58:10, 4.13s/it] 44%|████▍ | 15110/34278 [16:38:19<21:43:46, 4.08s/it] {'loss': 0.1513, 'grad_norm': 0.7838370141602546, 'learning_rate': 6.190102446395412e-06, 'epoch': 0.44} 44%|████▍ | 15110/34278 [16:38:19<21:43:46, 4.08s/it] 44%|████▍ | 15111/34278 [16:38:23<21:26:49, 4.03s/it] {'loss': 0.1408, 'grad_norm': 0.8351440670730054, 'learning_rate': 6.189643584621811e-06, 'epoch': 0.44} 44%|████▍ | 15111/34278 [16:38:23<21:26:49, 4.03s/it] 44%|████▍ | 15112/34278 [16:38:26<20:43:50, 3.89s/it] {'loss': 0.1598, 'grad_norm': 0.9429585268779143, 'learning_rate': 
6.189184712227356e-06, 'epoch': 0.44} 44%|████▍ | 15112/34278 [16:38:26<20:43:50, 3.89s/it] 44%|████▍ | 15113/34278 [16:38:31<21:54:54, 4.12s/it] {'loss': 0.1627, 'grad_norm': 0.8872677081948187, 'learning_rate': 6.1887258292161435e-06, 'epoch': 0.44} 44%|████▍ | 15113/34278 [16:38:31<21:54:54, 4.12s/it] 44%|████▍ | 15114/34278 [16:38:34<20:51:31, 3.92s/it] {'loss': 0.1216, 'grad_norm': 0.7317606862709696, 'learning_rate': 6.1882669355922706e-06, 'epoch': 0.44} 44%|████▍ | 15114/34278 [16:38:34<20:51:31, 3.92s/it] 44%|████▍ | 15115/34278 [16:38:39<21:34:27, 4.05s/it] {'loss': 0.1377, 'grad_norm': 0.8599470645369262, 'learning_rate': 6.187808031359835e-06, 'epoch': 0.44} 44%|████▍ | 15115/34278 [16:38:39<21:34:27, 4.05s/it] 44%|████▍ | 15116/34278 [16:38:45<24:42:15, 4.64s/it] {'loss': 0.1308, 'grad_norm': 0.9091108097425813, 'learning_rate': 6.187349116522932e-06, 'epoch': 0.44} 44%|████▍ | 15116/34278 [16:38:45<24:42:15, 4.64s/it] 44%|████▍ | 15117/34278 [16:38:48<22:43:55, 4.27s/it] {'loss': 0.1224, 'grad_norm': 0.6868528400456547, 'learning_rate': 6.186890191085659e-06, 'epoch': 0.44} 44%|████▍ | 15117/34278 [16:38:48<22:43:55, 4.27s/it] 44%|████▍ | 15118/34278 [16:38:52<22:18:12, 4.19s/it] {'loss': 0.1134, 'grad_norm': 0.7895111323825614, 'learning_rate': 6.1864312550521156e-06, 'epoch': 0.44} 44%|████▍ | 15118/34278 [16:38:52<22:18:12, 4.19s/it] 44%|████▍ | 15119/34278 [16:38:55<20:26:00, 3.84s/it] {'loss': 0.1425, 'grad_norm': 0.9170317658962038, 'learning_rate': 6.185972308426394e-06, 'epoch': 0.44} 44%|████▍ | 15119/34278 [16:38:55<20:26:00, 3.84s/it] 44%|████▍ | 15120/34278 [16:38:58<19:16:13, 3.62s/it] {'loss': 0.1339, 'grad_norm': 0.8898762215953443, 'learning_rate': 6.185513351212599e-06, 'epoch': 0.44} 44%|████▍ | 15120/34278 [16:38:58<19:16:13, 3.62s/it] 44%|████▍ | 15121/34278 [16:39:02<19:43:09, 3.71s/it] {'loss': 0.128, 'grad_norm': 0.7120990645692094, 'learning_rate': 6.185054383414821e-06, 'epoch': 0.44} 44%|████▍ | 15121/34278 
[16:39:02<19:43:09, 3.71s/it] 44%|████▍ | 15122/34278 [16:39:06<19:43:49, 3.71s/it] {'loss': 0.1396, 'grad_norm': 0.9098797685642345, 'learning_rate': 6.18459540503716e-06, 'epoch': 0.44} 44%|████▍ | 15122/34278 [16:39:06<19:43:49, 3.71s/it] 44%|████▍ | 15123/34278 [16:39:12<23:14:56, 4.37s/it] {'loss': 0.1754, 'grad_norm': 0.9099002629450361, 'learning_rate': 6.184136416083716e-06, 'epoch': 0.44} 44%|████▍ | 15123/34278 [16:39:12<23:14:56, 4.37s/it] 44%|████▍ | 15124/34278 [16:39:16<22:25:59, 4.22s/it] {'loss': 0.1255, 'grad_norm': 0.8491899787753415, 'learning_rate': 6.1836774165585835e-06, 'epoch': 0.44} 44%|████▍ | 15124/34278 [16:39:16<22:25:59, 4.22s/it] 44%|████▍ | 15125/34278 [16:39:19<20:08:29, 3.79s/it] {'loss': 0.1295, 'grad_norm': 1.0133237976787148, 'learning_rate': 6.183218406465861e-06, 'epoch': 0.44} 44%|████▍ | 15125/34278 [16:39:19<20:08:29, 3.79s/it] 44%|████▍ | 15126/34278 [16:39:22<19:07:56, 3.60s/it] {'loss': 0.1598, 'grad_norm': 0.937482166158476, 'learning_rate': 6.182759385809648e-06, 'epoch': 0.44} 44%|████▍ | 15126/34278 [16:39:22<19:07:56, 3.60s/it] 44%|████▍ | 15127/34278 [16:39:25<18:24:00, 3.46s/it] {'loss': 0.1587, 'grad_norm': 0.8495937070330164, 'learning_rate': 6.182300354594041e-06, 'epoch': 0.44} 44%|████▍ | 15127/34278 [16:39:25<18:24:00, 3.46s/it] 44%|████▍ | 15128/34278 [16:39:28<17:30:38, 3.29s/it] {'loss': 0.1495, 'grad_norm': 0.9365884837859367, 'learning_rate': 6.181841312823139e-06, 'epoch': 0.44} 44%|████▍ | 15128/34278 [16:39:28<17:30:38, 3.29s/it] 44%|████▍ | 15129/34278 [16:39:31<17:33:48, 3.30s/it] {'loss': 0.1354, 'grad_norm': 0.828917931816457, 'learning_rate': 6.18138226050104e-06, 'epoch': 0.44} 44%|████▍ | 15129/34278 [16:39:31<17:33:48, 3.30s/it] 44%|████▍ | 15130/34278 [16:39:34<16:55:59, 3.18s/it] {'loss': 0.1461, 'grad_norm': 1.2121680427116477, 'learning_rate': 6.1809231976318414e-06, 'epoch': 0.44} 44%|████▍ | 15130/34278 [16:39:34<16:55:59, 3.18s/it] 44%|████▍ | 15131/34278 [16:39:37<16:34:45, 3.12s/it] 
{'loss': 0.1297, 'grad_norm': 0.8267442464136356, 'learning_rate': 6.1804641242196435e-06, 'epoch': 0.44} 44%|████▍ | 15131/34278 [16:39:37<16:34:45, 3.12s/it] 44%|████▍ | 15132/34278 [16:39:40<16:04:39, 3.02s/it] {'loss': 0.1639, 'grad_norm': 0.8669047273309362, 'learning_rate': 6.180005040268544e-06, 'epoch': 0.44} 44%|████▍ | 15132/34278 [16:39:40<16:04:39, 3.02s/it] 44%|████▍ | 15133/34278 [16:39:43<16:05:53, 3.03s/it] {'loss': 0.1352, 'grad_norm': 0.9900346816148035, 'learning_rate': 6.179545945782639e-06, 'epoch': 0.44} 44%|████▍ | 15133/34278 [16:39:43<16:05:53, 3.03s/it] 44%|████▍ | 15134/34278 [16:39:46<16:05:12, 3.03s/it] {'loss': 0.1545, 'grad_norm': 0.8302777317742639, 'learning_rate': 6.179086840766031e-06, 'epoch': 0.44} 44%|████▍ | 15134/34278 [16:39:46<16:05:12, 3.03s/it] 44%|████▍ | 15135/34278 [16:39:49<15:54:03, 2.99s/it] {'loss': 0.1528, 'grad_norm': 0.8540477899928897, 'learning_rate': 6.178627725222819e-06, 'epoch': 0.44} 44%|████▍ | 15135/34278 [16:39:49<15:54:03, 2.99s/it] 44%|████▍ | 15136/34278 [16:39:52<16:25:52, 3.09s/it] {'loss': 0.1047, 'grad_norm': 0.7911456172678053, 'learning_rate': 6.178168599157096e-06, 'epoch': 0.44} 44%|████▍ | 15136/34278 [16:39:52<16:25:52, 3.09s/it] 44%|████▍ | 15137/34278 [16:39:56<17:05:34, 3.21s/it] {'loss': 0.1471, 'grad_norm': 0.9700265681787416, 'learning_rate': 6.177709462572969e-06, 'epoch': 0.44} 44%|████▍ | 15137/34278 [16:39:56<17:05:34, 3.21s/it] 44%|████▍ | 15138/34278 [16:39:59<17:35:41, 3.31s/it] {'loss': 0.157, 'grad_norm': 0.7729885089326664, 'learning_rate': 6.17725031547453e-06, 'epoch': 0.44} 44%|████▍ | 15138/34278 [16:39:59<17:35:41, 3.31s/it] 44%|████▍ | 15139/34278 [16:40:03<19:09:23, 3.60s/it] {'loss': 0.1337, 'grad_norm': 1.0104582272958682, 'learning_rate': 6.176791157865881e-06, 'epoch': 0.44} 44%|████▍ | 15139/34278 [16:40:03<19:09:23, 3.60s/it] 44%|████▍ | 15140/34278 [16:40:06<18:15:28, 3.43s/it] {'loss': 0.1253, 'grad_norm': 0.775961171060146, 'learning_rate': 
6.176331989751125e-06, 'epoch': 0.44} 44%|████▍ | 15140/34278 [16:40:06<18:15:28, 3.43s/it] 44%|████▍ | 15141/34278 [16:40:10<18:19:02, 3.45s/it] {'loss': 0.1335, 'grad_norm': 1.0101315435786071, 'learning_rate': 6.175872811134355e-06, 'epoch': 0.44} 44%|████▍ | 15141/34278 [16:40:10<18:19:02, 3.45s/it] 44%|████▍ | 15142/34278 [16:40:14<19:14:02, 3.62s/it] {'loss': 0.1401, 'grad_norm': 0.756935795113428, 'learning_rate': 6.175413622019674e-06, 'epoch': 0.44} 44%|████▍ | 15142/34278 [16:40:14<19:14:02, 3.62s/it] 44%|████▍ | 15143/34278 [16:40:17<18:42:17, 3.52s/it] {'loss': 0.1261, 'grad_norm': 0.8103766556796341, 'learning_rate': 6.1749544224111805e-06, 'epoch': 0.44} 44%|████▍ | 15143/34278 [16:40:17<18:42:17, 3.52s/it] 44%|████▍ | 15144/34278 [16:40:22<20:14:37, 3.81s/it] {'loss': 0.1561, 'grad_norm': 0.7480765217196009, 'learning_rate': 6.174495212312974e-06, 'epoch': 0.44} 44%|████▍ | 15144/34278 [16:40:22<20:14:37, 3.81s/it]
Token indices sequence length is longer than the specified maximum sequence length for this model (13576 > 8192). Running this sequence through the model will result in indexing errors
44%|████▍ | 15145/34278 [16:40:25<19:21:27, 3.64s/it] {'loss': 0.1752, 'grad_norm': 0.7150165037411563, 'learning_rate': 6.174035991729155e-06, 'epoch': 0.44} 44%|████▍ | 15145/34278 [16:40:25<19:21:27, 3.64s/it] 44%|████▍ | 15146/34278 [16:40:29<20:05:51, 3.78s/it] {'loss': 0.15, 'grad_norm': 0.9834870920906477, 'learning_rate': 6.173576760663823e-06, 'epoch': 0.44} 44%|████▍ | 15146/34278 [16:40:29<20:05:51, 3.78s/it] 44%|████▍ | 15147/34278 [16:40:35<23:43:53, 4.47s/it] {'loss': 0.1472, 'grad_norm': 0.9306883206375008, 'learning_rate': 6.173117519121079e-06, 'epoch': 0.44} 44%|████▍ | 15147/34278 [16:40:35<23:43:53, 4.47s/it] 44%|████▍ | 15148/34278 [16:40:38<21:59:07, 4.14s/it] {'loss': 0.1182, 'grad_norm': 0.6708625275650458, 'learning_rate': 6.172658267105019e-06, 'epoch': 0.44} 44%|████▍ | 15148/34278 [16:40:38<21:59:07, 4.14s/it] 44%|████▍ | 15149/34278 [16:40:41<20:05:49, 3.78s/it] {'loss': 0.1478, 'grad_norm': 0.6782386383396968, 'learning_rate': 6.172199004619748e-06, 'epoch': 0.44} 44%|████▍ | 15149/34278 [16:40:41<20:05:49, 3.78s/it] 44%|████▍ | 15150/34278 [16:40:44<18:39:45, 3.51s/it] {'loss': 0.1354, 'grad_norm': 0.7840701456064665, 'learning_rate': 6.171739731669365e-06, 'epoch': 0.44} 44%|████▍ | 15150/34278 [16:40:44<18:39:45, 3.51s/it] 44%|████▍ | 15151/34278 [16:40:48<19:24:38, 3.65s/it] {'loss': 0.1115, 'grad_norm': 0.8921163707375518, 'learning_rate': 6.171280448257967e-06, 'epoch': 0.44} 44%|████▍ | 15151/34278 [16:40:48<19:24:38, 3.65s/it] 44%|████▍ | 15152/34278 [16:40:54<22:46:38, 4.29s/it] {'loss': 0.1397, 'grad_norm': 0.7350397465064393, 'learning_rate': 6.170821154389659e-06, 'epoch': 0.44} 44%|████▍ | 15152/34278 [16:40:54<22:46:38, 4.29s/it] 44%|████▍ | 15153/34278 [16:40:58<21:37:31, 4.07s/it] {'loss': 0.1366, 'grad_norm': 0.8706135341382112, 'learning_rate': 6.170361850068538e-06, 'epoch': 0.44} 44%|████▍ | 15153/34278 [16:40:58<21:37:31, 4.07s/it] 44%|████▍ | 
15154/34278 [16:41:01<20:19:38, 3.83s/it] {'loss': 0.1148, 'grad_norm': 0.8786752834643791, 'learning_rate': 6.169902535298704e-06, 'epoch': 0.44} 44%|████▍ | 15154/34278 [16:41:01<20:19:38, 3.83s/it] 44%|████▍ | 15155/34278 [16:41:04<19:02:54, 3.59s/it] {'loss': 0.1354, 'grad_norm': 0.9376217221559763, 'learning_rate': 6.169443210084262e-06, 'epoch': 0.44} 44%|████▍ | 15155/34278 [16:41:04<19:02:54, 3.59s/it] 44%|████▍ | 15156/34278 [16:41:07<18:21:20, 3.46s/it] {'loss': 0.1375, 'grad_norm': 0.7440517010432058, 'learning_rate': 6.1689838744293105e-06, 'epoch': 0.44} 44%|████▍ | 15156/34278 [16:41:07<18:21:20, 3.46s/it] 44%|████▍ | 15157/34278 [16:41:10<18:00:39, 3.39s/it] {'loss': 0.1439, 'grad_norm': 0.7252953866774031, 'learning_rate': 6.168524528337949e-06, 'epoch': 0.44} 44%|████▍ | 15157/34278 [16:41:10<18:00:39, 3.39s/it] 44%|████▍ | 15158/34278 [16:41:14<18:46:39, 3.54s/it] {'loss': 0.1416, 'grad_norm': 0.8294662727384108, 'learning_rate': 6.168065171814279e-06, 'epoch': 0.44} 44%|████▍ | 15158/34278 [16:41:14<18:46:39, 3.54s/it] 44%|████▍ | 15159/34278 [16:41:20<22:30:56, 4.24s/it] {'loss': 0.1264, 'grad_norm': 1.006490711539657, 'learning_rate': 6.1676058048624035e-06, 'epoch': 0.44} 44%|████▍ | 15159/34278 [16:41:20<22:30:56, 4.24s/it] 44%|████▍ | 15160/34278 [16:41:23<20:25:38, 3.85s/it] {'loss': 0.1325, 'grad_norm': 0.8004298319702237, 'learning_rate': 6.167146427486421e-06, 'epoch': 0.44} 44%|████▍ | 15160/34278 [16:41:23<20:25:38, 3.85s/it] 44%|████▍ | 15161/34278 [16:41:26<19:56:29, 3.76s/it] {'loss': 0.1251, 'grad_norm': 0.6268073336564143, 'learning_rate': 6.166687039690433e-06, 'epoch': 0.44} 44%|████▍ | 15161/34278 [16:41:27<19:56:29, 3.76s/it] 44%|████▍ | 15162/34278 [16:41:30<18:57:13, 3.57s/it] {'loss': 0.1372, 'grad_norm': 0.8880811293595925, 'learning_rate': 6.166227641478544e-06, 'epoch': 0.44} 44%|████▍ | 15162/34278 [16:41:30<18:57:13, 3.57s/it] 44%|████▍ | 15163/34278 [16:41:34<19:27:40, 3.67s/it] {'loss': 0.1435, 'grad_norm': 
0.9086839478739694, 'learning_rate': 6.1657682328548505e-06, 'epoch': 0.44} 44%|████▍ | 15163/34278 [16:41:34<19:27:40, 3.67s/it] 44%|████▍ | 15164/34278 [16:41:37<19:28:32, 3.67s/it] {'loss': 0.1293, 'grad_norm': 0.873657569457763, 'learning_rate': 6.165308813823457e-06, 'epoch': 0.44} 44%|████▍ | 15164/34278 [16:41:37<19:28:32, 3.67s/it] 44%|████▍ | 15165/34278 [16:41:41<20:15:01, 3.81s/it] {'loss': 0.141, 'grad_norm': 0.833198728873711, 'learning_rate': 6.164849384388467e-06, 'epoch': 0.44} 44%|████▍ | 15165/34278 [16:41:41<20:15:01, 3.81s/it] 44%|████▍ | 15166/34278 [16:41:47<23:44:29, 4.47s/it] {'loss': 0.1584, 'grad_norm': 0.7669766276065918, 'learning_rate': 6.164389944553977e-06, 'epoch': 0.44} 44%|████▍ | 15166/34278 [16:41:47<23:44:29, 4.47s/it] 44%|████▍ | 15167/34278 [16:41:51<21:45:21, 4.10s/it] {'loss': 0.1478, 'grad_norm': 1.2068899831680615, 'learning_rate': 6.163930494324093e-06, 'epoch': 0.44} 44%|████▍ | 15167/34278 [16:41:51<21:45:21, 4.10s/it] 44%|████▍ | 15168/34278 [16:41:54<20:11:35, 3.80s/it] {'loss': 0.1153, 'grad_norm': 0.7778010498215501, 'learning_rate': 6.163471033702914e-06, 'epoch': 0.44} 44%|████▍ | 15168/34278 [16:41:54<20:11:35, 3.80s/it] 44%|████▍ | 15169/34278 [16:41:57<19:55:05, 3.75s/it] {'loss': 0.1552, 'grad_norm': 0.770797837270307, 'learning_rate': 6.1630115626945445e-06, 'epoch': 0.44} 44%|████▍ | 15169/34278 [16:41:57<19:55:05, 3.75s/it] 44%|████▍ | 15170/34278 [16:42:00<18:37:40, 3.51s/it] {'loss': 0.1304, 'grad_norm': 1.055067195964919, 'learning_rate': 6.1625520813030855e-06, 'epoch': 0.44} 44%|████▍ | 15170/34278 [16:42:00<18:37:40, 3.51s/it] 44%|████▍ | 15171/34278 [16:42:06<22:32:13, 4.25s/it] {'loss': 0.1389, 'grad_norm': 0.790305789887117, 'learning_rate': 6.162092589532639e-06, 'epoch': 0.44} 44%|████▍ | 15171/34278 [16:42:06<22:32:13, 4.25s/it] 44%|████▍ | 15172/34278 [16:42:11<23:30:18, 4.43s/it] {'loss': 0.1296, 'grad_norm': 0.7926830874783952, 'learning_rate': 6.1616330873873065e-06, 'epoch': 0.44} 44%|████▍ 
| 15172/34278 [16:42:11<23:30:18, 4.43s/it] 44%|████▍ | 15173/34278 [16:42:15<22:18:46, 4.20s/it] {'loss': 0.1313, 'grad_norm': 0.8228868513259496, 'learning_rate': 6.161173574871192e-06, 'epoch': 0.44} 44%|████▍ | 15173/34278 [16:42:15<22:18:46, 4.20s/it] 44%|████▍ | 15174/34278 [16:42:18<19:59:52, 3.77s/it] {'loss': 0.1407, 'grad_norm': 0.7739986899820893, 'learning_rate': 6.160714051988396e-06, 'epoch': 0.44} 44%|████▍ | 15174/34278 [16:42:18<19:59:52, 3.77s/it] 44%|████▍ | 15175/34278 [16:42:21<19:02:27, 3.59s/it] {'loss': 0.1022, 'grad_norm': 0.7870682635859886, 'learning_rate': 6.160254518743023e-06, 'epoch': 0.44} 44%|████▍ | 15175/34278 [16:42:21<19:02:27, 3.59s/it] 44%|████▍ | 15176/34278 [16:42:24<18:49:26, 3.55s/it] {'loss': 0.1296, 'grad_norm': 0.8602714911088228, 'learning_rate': 6.159794975139174e-06, 'epoch': 0.44} 44%|████▍ | 15176/34278 [16:42:24<18:49:26, 3.55s/it] 44%|████▍ | 15177/34278 [16:42:28<18:53:28, 3.56s/it] {'loss': 0.1355, 'grad_norm': 0.8641677714563666, 'learning_rate': 6.159335421180954e-06, 'epoch': 0.44} 44%|████▍ | 15177/34278 [16:42:28<18:53:28, 3.56s/it] 44%|████▍ | 15178/34278 [16:42:31<18:58:04, 3.58s/it] {'loss': 0.1198, 'grad_norm': 0.8789323711322151, 'learning_rate': 6.158875856872462e-06, 'epoch': 0.44} 44%|████▍ | 15178/34278 [16:42:31<18:58:04, 3.58s/it] 44%|████▍ | 15179/34278 [16:42:35<18:26:37, 3.48s/it] {'loss': 0.1767, 'grad_norm': 0.948326178954136, 'learning_rate': 6.158416282217803e-06, 'epoch': 0.44} 44%|████▍ | 15179/34278 [16:42:35<18:26:37, 3.48s/it] 44%|████▍ | 15180/34278 [16:42:37<17:28:27, 3.29s/it] {'loss': 0.1376, 'grad_norm': 1.11965728484572, 'learning_rate': 6.157956697221082e-06, 'epoch': 0.44} 44%|████▍ | 15180/34278 [16:42:37<17:28:27, 3.29s/it] 44%|████▍ | 15181/34278 [16:42:40<16:58:16, 3.20s/it] {'loss': 0.1307, 'grad_norm': 0.8302308190233277, 'learning_rate': 6.157497101886397e-06, 'epoch': 0.44} 44%|████▍ | 15181/34278 [16:42:40<16:58:16, 3.20s/it] 44%|████▍ | 15182/34278 
[16:42:44<17:34:01, 3.31s/it] {'loss': 0.1194, 'grad_norm': 1.045664337871547, 'learning_rate': 6.157037496217857e-06, 'epoch': 0.44} 44%|████▍ | 15182/34278 [16:42:44<17:34:01, 3.31s/it] 44%|████▍ | 15183/34278 [16:42:47<17:28:05, 3.29s/it] {'loss': 0.1474, 'grad_norm': 0.7982028729623004, 'learning_rate': 6.156577880219561e-06, 'epoch': 0.44} 44%|████▍ | 15183/34278 [16:42:47<17:28:05, 3.29s/it] 44%|████▍ | 15184/34278 [16:42:50<16:52:49, 3.18s/it] {'loss': 0.1386, 'grad_norm': 0.9772078210105002, 'learning_rate': 6.156118253895613e-06, 'epoch': 0.44} 44%|████▍ | 15184/34278 [16:42:50<16:52:49, 3.18s/it] 44%|████▍ | 15185/34278 [16:42:53<16:55:29, 3.19s/it] {'loss': 0.1312, 'grad_norm': 0.8324946152115834, 'learning_rate': 6.15565861725012e-06, 'epoch': 0.44} 44%|████▍ | 15185/34278 [16:42:53<16:55:29, 3.19s/it] 44%|████▍ | 15186/34278 [16:42:57<17:02:19, 3.21s/it] {'loss': 0.1317, 'grad_norm': 0.8117971938902022, 'learning_rate': 6.155198970287181e-06, 'epoch': 0.44} 44%|████▍ | 15186/34278 [16:42:57<17:02:19, 3.21s/it] 44%|████▍ | 15187/34278 [16:43:00<17:11:17, 3.24s/it] {'loss': 0.1308, 'grad_norm': 0.9119895557336309, 'learning_rate': 6.154739313010901e-06, 'epoch': 0.44} 44%|████▍ | 15187/34278 [16:43:00<17:11:17, 3.24s/it] 44%|████▍ | 15188/34278 [16:43:06<21:07:20, 3.98s/it] {'loss': 0.1339, 'grad_norm': 0.8644022003527182, 'learning_rate': 6.154279645425385e-06, 'epoch': 0.44} 44%|████▍ | 15188/34278 [16:43:06<21:07:20, 3.98s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f29f39e99e0>
Failed to fetch sample 3423927. Exception: cannot identify image file <_io.BytesIO object at 0x7f29f39e99e0>
44%|████▍ | 15189/34278 [16:43:09<19:58:24, 3.77s/it] {'loss': 0.1451, 'grad_norm': 0.6803353772697368, 'learning_rate': 6.153819967534734e-06, 'epoch': 0.44} 44%|████▍ | 15189/34278 [16:43:09<19:58:24, 3.77s/it] 44%|████▍ | 15190/34278 [16:43:14<21:26:08, 4.04s/it] {'loss': 0.1267, 'grad_norm': 0.879070882971226, 'learning_rate': 6.153360279343056e-06, 'epoch': 0.44} 44%|████▍ | 15190/34278 [16:43:14<21:26:08, 4.04s/it] 44%|████▍ | 15191/34278 [16:43:18<21:13:55, 4.00s/it] {'loss': 0.1361, 'grad_norm': 0.766632869208845, 'learning_rate': 6.152900580854452e-06, 'epoch': 0.44} 44%|████▍ | 15191/34278 [16:43:18<21:13:55, 4.00s/it] 44%|████▍ | 15192/34278 [16:43:21<19:49:18, 3.74s/it] {'loss': 0.1592, 'grad_norm': 0.9171909654778313, 'learning_rate': 6.1524408720730276e-06, 'epoch': 0.44} 44%|████▍ | 15192/34278 [16:43:21<19:49:18, 3.74s/it] 44%|████▍ | 15193/34278 [16:43:25<20:10:10, 3.80s/it] {'loss': 0.1347, 'grad_norm': 1.0116900483386435, 'learning_rate': 6.1519811530028836e-06, 'epoch': 0.44} 44%|████▍ | 15193/34278 [16:43:25<20:10:10, 3.80s/it] 44%|████▍ | 15194/34278 [16:43:28<19:09:44, 3.61s/it] {'loss': 0.1317, 'grad_norm': 0.9236343921387395, 'learning_rate': 6.151521423648129e-06, 'epoch': 0.44} 44%|████▍ | 15194/34278 [16:43:28<19:09:44, 3.61s/it] 44%|████▍ | 15195/34278 [16:43:31<17:59:19, 3.39s/it] {'loss': 0.1476, 'grad_norm': 0.8200229888697481, 'learning_rate': 6.151061684012867e-06, 'epoch': 0.44} 44%|████▍ | 15195/34278 [16:43:31<17:59:19, 3.39s/it] 44%|████▍ | 15196/34278 [16:43:34<17:09:18, 3.24s/it] {'loss': 0.1367, 'grad_norm': 1.0263558504613906, 'learning_rate': 6.150601934101198e-06, 'epoch': 0.44} 44%|████▍ | 15196/34278 [16:43:34<17:09:18, 3.24s/it] 44%|████▍ | 15197/34278 [16:43:38<18:18:15, 3.45s/it] {'loss': 0.1562, 'grad_norm': 1.0179915279890752, 'learning_rate': 6.150142173917233e-06, 'epoch': 0.44} 44%|████▍ | 15197/34278 [16:43:38<18:18:15, 3.45s/it] 
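# NOTE: the "Failed to fetch sample N. Exception: ..." lines above, followed by training
# continuing uninterrupted, indicate that dataset.py catches the decode error in
# __getitem__ and substitutes another sample. A minimal sketch of that guard pattern,
# assuming a retry-with-random-index design; `loader` stands in for the
# tcs_loader/pil_loader chain in the traceback and is hypothetical here:

```python
import random


class RobustDataset:
    """Sketch (assumed design): retry a different index when a sample
    cannot be decoded, matching the "Failed to fetch sample" log lines."""

    def __init__(self, samples, loader, max_retries=50):
        self.samples = samples        # raw records, e.g. image byte strings
        self.loader = loader          # stand-in for tcs_loader / pil_loader
        self.max_retries = max_retries

    def _get_item(self, i):
        # In the real dataset this is where preprocess_qwen2vl* would run.
        return self.loader(self.samples[i])

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                # Emits the same pattern seen in the log above.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randint(0, len(self.samples) - 1)
        raise RuntimeError("too many undecodable samples in a row")
```

# Swallowing the exception keeps one corrupt image (or truncated Ceph read) from
# killing a multi-day run, at the cost of silently resampling; the log lines are the
# only trace that sample 2654738 / 3423927 were skipped.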
44%|████▍ | 15198/34278 [16:43:41<18:38:11, 3.52s/it] {'loss': 0.1457, 'grad_norm': 0.8193260872849141, 'learning_rate': 6.1496824034650715e-06, 'epoch': 0.44} 44%|████▍ | 15198/34278 [16:43:41<18:38:11, 3.52s/it] 44%|████▍ | 15199/34278 [16:43:45<19:28:05, 3.67s/it] {'loss': 0.1672, 'grad_norm': 0.8125875373992769, 'learning_rate': 6.149222622748818e-06, 'epoch': 0.44} 44%|████▍ | 15199/34278 [16:43:45<19:28:05, 3.67s/it] 44%|████▍ | 15200/34278 [16:43:48<17:52:15, 3.37s/it] {'loss': 0.1409, 'grad_norm': 0.9401759401314863, 'learning_rate': 6.148762831772582e-06, 'epoch': 0.44} 44%|████▍ | 15200/34278 [16:43:48<17:52:15, 3.37s/it] 44%|████▍ | 15201/34278 [16:43:51<17:22:30, 3.28s/it] {'loss': 0.1317, 'grad_norm': 1.061541647071786, 'learning_rate': 6.148303030540466e-06, 'epoch': 0.44} 44%|████▍ | 15201/34278 [16:43:51<17:22:30, 3.28s/it] 44%|████▍ | 15202/34278 [16:43:54<16:41:29, 3.15s/it] {'loss': 0.1472, 'grad_norm': 0.9208910349135103, 'learning_rate': 6.1478432190565725e-06, 'epoch': 0.44} 44%|████▍ | 15202/34278 [16:43:54<16:41:29, 3.15s/it] 44%|████▍ | 15203/34278 [16:43:57<16:28:02, 3.11s/it] {'loss': 0.1403, 'grad_norm': 0.7130989810499107, 'learning_rate': 6.14738339732501e-06, 'epoch': 0.44} 44%|████▍ | 15203/34278 [16:43:57<16:28:02, 3.11s/it] 44%|████▍ | 15204/34278 [16:44:00<16:02:09, 3.03s/it] {'loss': 0.163, 'grad_norm': 1.1571797061398832, 'learning_rate': 6.146923565349882e-06, 'epoch': 0.44} 44%|████▍ | 15204/34278 [16:44:00<16:02:09, 3.03s/it] 44%|████▍ | 15205/34278 [16:44:03<15:54:38, 3.00s/it] {'loss': 0.1467, 'grad_norm': 0.8491187005356661, 'learning_rate': 6.146463723135295e-06, 'epoch': 0.44} 44%|████▍ | 15205/34278 [16:44:03<15:54:38, 3.00s/it] 44%|████▍ | 15206/34278 [16:44:06<16:03:37, 3.03s/it] {'loss': 0.1215, 'grad_norm': 0.7907530667174092, 'learning_rate': 6.146003870685353e-06, 'epoch': 0.44} 44%|████▍ | 15206/34278 [16:44:06<16:03:37, 3.03s/it] 44%|████▍ | 15207/34278 [16:44:10<17:29:27, 3.30s/it] {'loss': 0.1469, 'grad_norm': 
0.8576470547447197, 'learning_rate': 6.145544008004163e-06, 'epoch': 0.44} 44%|████▍ | 15207/34278 [16:44:10<17:29:27, 3.30s/it] 44%|████▍ | 15208/34278 [16:44:13<17:44:27, 3.35s/it] {'loss': 0.13, 'grad_norm': 0.6210393668457042, 'learning_rate': 6.145084135095827e-06, 'epoch': 0.44} 44%|████▍ | 15208/34278 [16:44:13<17:44:27, 3.35s/it] 44%|████▍ | 15209/34278 [16:44:17<18:12:25, 3.44s/it] {'loss': 0.1293, 'grad_norm': 0.9519110825112453, 'learning_rate': 6.144624251964455e-06, 'epoch': 0.44} 44%|████▍ | 15209/34278 [16:44:17<18:12:25, 3.44s/it] 44%|████▍ | 15210/34278 [16:44:20<17:46:54, 3.36s/it] {'loss': 0.1432, 'grad_norm': 0.8792011471834411, 'learning_rate': 6.144164358614152e-06, 'epoch': 0.44} 44%|████▍ | 15210/34278 [16:44:20<17:46:54, 3.36s/it] 44%|████▍ | 15211/34278 [16:44:25<21:00:26, 3.97s/it] {'loss': 0.1162, 'grad_norm': 0.7657075243259581, 'learning_rate': 6.14370445504902e-06, 'epoch': 0.44} 44%|████▍ | 15211/34278 [16:44:25<21:00:26, 3.97s/it] 44%|████▍ | 15212/34278 [16:44:28<19:20:36, 3.65s/it] {'loss': 0.144, 'grad_norm': 0.8340099443319311, 'learning_rate': 6.14324454127317e-06, 'epoch': 0.44} 44%|████▍ | 15212/34278 [16:44:28<19:20:36, 3.65s/it] 44%|████▍ | 15213/34278 [16:44:31<18:41:45, 3.53s/it] {'loss': 0.1488, 'grad_norm': 0.8752137407083753, 'learning_rate': 6.1427846172907045e-06, 'epoch': 0.44} 44%|████▍ | 15213/34278 [16:44:31<18:41:45, 3.53s/it] 44%|████▍ | 15214/34278 [16:44:35<17:57:15, 3.39s/it] {'loss': 0.1284, 'grad_norm': 0.9318068773304626, 'learning_rate': 6.14232468310573e-06, 'epoch': 0.44} 44%|████▍ | 15214/34278 [16:44:35<17:57:15, 3.39s/it] 44%|████▍ | 15215/34278 [16:44:38<18:26:49, 3.48s/it] {'loss': 0.1607, 'grad_norm': 0.8736516655055431, 'learning_rate': 6.141864738722356e-06, 'epoch': 0.44} 44%|████▍ | 15215/34278 [16:44:38<18:26:49, 3.48s/it] 44%|████▍ | 15216/34278 [16:44:41<17:46:04, 3.36s/it] {'loss': 0.1443, 'grad_norm': 0.8076426652809794, 'learning_rate': 6.141404784144685e-06, 'epoch': 0.44} 44%|████▍ | 
15216/34278 [16:44:41<17:46:04, 3.36s/it] 44%|████▍ | 15217/34278 [16:44:45<17:44:10, 3.35s/it] {'loss': 0.1482, 'grad_norm': 0.9168196815559632, 'learning_rate': 6.140944819376824e-06, 'epoch': 0.44} 44%|████▍ | 15217/34278 [16:44:45<17:44:10, 3.35s/it] 44%|████▍ | 15218/34278 [16:44:49<19:01:52, 3.59s/it] {'loss': 0.1244, 'grad_norm': 0.7397749370047907, 'learning_rate': 6.140484844422879e-06, 'epoch': 0.44} 44%|████▍ | 15218/34278 [16:44:49<19:01:52, 3.59s/it] 44%|████▍ | 15219/34278 [16:44:52<18:14:17, 3.44s/it] {'loss': 0.1526, 'grad_norm': 0.7476313378941652, 'learning_rate': 6.14002485928696e-06, 'epoch': 0.44} 44%|████▍ | 15219/34278 [16:44:52<18:14:17, 3.44s/it] 44%|████▍ | 15220/34278 [16:44:58<22:36:13, 4.27s/it] {'loss': 0.1326, 'grad_norm': 0.6667299942179602, 'learning_rate': 6.139564863973169e-06, 'epoch': 0.44} 44%|████▍ | 15220/34278 [16:44:58<22:36:13, 4.27s/it] 44%|████▍ | 15221/34278 [16:45:01<20:49:57, 3.94s/it] {'loss': 0.1348, 'grad_norm': 0.9066358944844822, 'learning_rate': 6.139104858485616e-06, 'epoch': 0.44} 44%|████▍ | 15221/34278 [16:45:01<20:49:57, 3.94s/it] 44%|████▍ | 15222/34278 [16:45:07<24:20:45, 4.60s/it] {'loss': 0.138, 'grad_norm': 0.7812964435545219, 'learning_rate': 6.138644842828407e-06, 'epoch': 0.44} 44%|████▍ | 15222/34278 [16:45:07<24:20:45, 4.60s/it] 44%|████▍ | 15223/34278 [16:45:10<21:55:21, 4.14s/it] {'loss': 0.1524, 'grad_norm': 0.8349967115734126, 'learning_rate': 6.138184817005648e-06, 'epoch': 0.44} 44%|████▍ | 15223/34278 [16:45:10<21:55:21, 4.14s/it] 44%|████▍ | 15224/34278 [16:45:13<20:09:46, 3.81s/it] {'loss': 0.1452, 'grad_norm': 1.1972323678106125, 'learning_rate': 6.1377247810214466e-06, 'epoch': 0.44} 44%|████▍ | 15224/34278 [16:45:13<20:09:46, 3.81s/it] 44%|████▍ | 15225/34278 [16:45:17<20:21:02, 3.85s/it] {'loss': 0.132, 'grad_norm': 0.9094728032049704, 'learning_rate': 6.137264734879912e-06, 'epoch': 0.44} 44%|████▍ | 15225/34278 [16:45:17<20:21:02, 3.85s/it] 44%|████▍ | 15226/34278 
[16:45:21<19:52:39, 3.76s/it] {'loss': 0.1341, 'grad_norm': 0.8020759584277014, 'learning_rate': 6.136804678585146e-06, 'epoch': 0.44} 44%|████▍ | 15226/34278 [16:45:21<19:52:39, 3.76s/it] 44%|████▍ | 15227/34278 [16:45:27<23:35:50, 4.46s/it] {'loss': 0.1435, 'grad_norm': 1.1431151200320264, 'learning_rate': 6.136344612141262e-06, 'epoch': 0.44} 44%|████▍ | 15227/34278 [16:45:27<23:35:50, 4.46s/it] 44%|████▍ | 15228/34278 [16:45:30<21:26:55, 4.05s/it] {'loss': 0.1479, 'grad_norm': 0.9889349789349048, 'learning_rate': 6.135884535552363e-06, 'epoch': 0.44} 44%|████▍ | 15228/34278 [16:45:30<21:26:55, 4.05s/it] 44%|████▍ | 15229/34278 [16:45:33<19:33:52, 3.70s/it] {'loss': 0.1271, 'grad_norm': 0.7478560976135962, 'learning_rate': 6.135424448822559e-06, 'epoch': 0.44} 44%|████▍ | 15229/34278 [16:45:33<19:33:52, 3.70s/it] 44%|████▍ | 15230/34278 [16:45:39<23:14:42, 4.39s/it] {'loss': 0.1181, 'grad_norm': 0.7186266917342702, 'learning_rate': 6.134964351955955e-06, 'epoch': 0.44} 44%|████▍ | 15230/34278 [16:45:39<23:14:42, 4.39s/it] 44%|████▍ | 15231/34278 [16:45:42<21:02:42, 3.98s/it] {'loss': 0.1172, 'grad_norm': 0.8726330738217674, 'learning_rate': 6.134504244956662e-06, 'epoch': 0.44} 44%|████▍ | 15231/34278 [16:45:42<21:02:42, 3.98s/it] 44%|████▍ | 15232/34278 [16:45:47<23:14:22, 4.39s/it] {'loss': 0.124, 'grad_norm': 0.9163240605568302, 'learning_rate': 6.134044127828785e-06, 'epoch': 0.44} 44%|████▍ | 15232/34278 [16:45:47<23:14:22, 4.39s/it] 44%|████▍ | 15233/34278 [16:45:51<21:13:42, 4.01s/it] {'loss': 0.1463, 'grad_norm': 0.96480251333729, 'learning_rate': 6.133584000576433e-06, 'epoch': 0.44} 44%|████▍ | 15233/34278 [16:45:51<21:13:42, 4.01s/it] 44%|████▍ | 15234/34278 [16:45:54<20:06:39, 3.80s/it] {'loss': 0.13, 'grad_norm': 0.7747088266092722, 'learning_rate': 6.133123863203714e-06, 'epoch': 0.44} 44%|████▍ | 15234/34278 [16:45:54<20:06:39, 3.80s/it] 44%|████▍ | 15235/34278 [16:45:58<20:01:22, 3.79s/it] {'loss': 0.131, 'grad_norm': 1.0229218733197905, 
'learning_rate': 6.132663715714735e-06, 'epoch': 0.44} 44%|████▍ | 15235/34278 [16:45:58<20:01:22, 3.79s/it] 44%|████▍ | 15236/34278 [16:46:01<18:51:58, 3.57s/it] {'loss': 0.1064, 'grad_norm': 0.8384492580311871, 'learning_rate': 6.132203558113604e-06, 'epoch': 0.44} 44%|████▍ | 15236/34278 [16:46:01<18:51:58, 3.57s/it] 44%|████▍ | 15237/34278 [16:46:06<22:01:45, 4.16s/it] {'loss': 0.1413, 'grad_norm': 0.7906416335983961, 'learning_rate': 6.131743390404432e-06, 'epoch': 0.44} 44%|████▍ | 15237/34278 [16:46:06<22:01:45, 4.16s/it] 44%|████▍ | 15238/34278 [16:46:09<20:02:54, 3.79s/it] {'loss': 0.1527, 'grad_norm': 0.7826541865543741, 'learning_rate': 6.131283212591324e-06, 'epoch': 0.44} 44%|████▍ | 15238/34278 [16:46:09<20:02:54, 3.79s/it] 44%|████▍ | 15239/34278 [16:46:12<18:50:40, 3.56s/it] {'loss': 0.1454, 'grad_norm': 0.9590447953863227, 'learning_rate': 6.130823024678388e-06, 'epoch': 0.44} 44%|████▍ | 15239/34278 [16:46:12<18:50:40, 3.56s/it] 44%|████▍ | 15240/34278 [16:46:15<18:02:16, 3.41s/it] {'loss': 0.1467, 'grad_norm': 0.9056538663663815, 'learning_rate': 6.1303628266697365e-06, 'epoch': 0.44} 44%|████▍ | 15240/34278 [16:46:15<18:02:16, 3.41s/it] 44%|████▍ | 15241/34278 [16:46:18<17:16:07, 3.27s/it] {'loss': 0.1477, 'grad_norm': 0.8059288224470611, 'learning_rate': 6.129902618569474e-06, 'epoch': 0.44} 44%|████▍ | 15241/34278 [16:46:18<17:16:07, 3.27s/it] 44%|████▍ | 15242/34278 [16:46:21<16:22:29, 3.10s/it] {'loss': 0.1207, 'grad_norm': 0.8979235488551794, 'learning_rate': 6.129442400381712e-06, 'epoch': 0.44} 44%|████▍ | 15242/34278 [16:46:21<16:22:29, 3.10s/it] 44%|████▍ | 15243/34278 [16:46:25<17:24:52, 3.29s/it] {'loss': 0.1386, 'grad_norm': 0.8475314209268942, 'learning_rate': 6.128982172110558e-06, 'epoch': 0.44} 44%|████▍ | 15243/34278 [16:46:25<17:24:52, 3.29s/it] 44%|████▍ | 15244/34278 [16:46:28<16:53:45, 3.20s/it] {'loss': 0.1282, 'grad_norm': 0.5986386792810503, 'learning_rate': 6.128521933760119e-06, 'epoch': 0.44} 44%|████▍ | 15244/34278 
[16:46:28<16:53:45, 3.20s/it] 44%|████▍ | 15245/34278 [16:46:31<17:26:36, 3.30s/it] {'loss': 0.1489, 'grad_norm': 0.8971110629639041, 'learning_rate': 6.1280616853345065e-06, 'epoch': 0.44} 44%|████▍ | 15245/34278 [16:46:31<17:26:36, 3.30s/it] 44%|████▍ | 15246/34278 [16:46:37<21:19:21, 4.03s/it] {'loss': 0.1624, 'grad_norm': 0.9085603557417934, 'learning_rate': 6.127601426837828e-06, 'epoch': 0.44} 44%|████▍ | 15246/34278 [16:46:37<21:19:21, 4.03s/it] 44%|████▍ | 15247/34278 [16:46:40<19:43:27, 3.73s/it] {'loss': 0.1638, 'grad_norm': 0.8271277986371625, 'learning_rate': 6.127141158274194e-06, 'epoch': 0.44} 44%|████▍ | 15247/34278 [16:46:40<19:43:27, 3.73s/it] 44%|████▍ | 15248/34278 [16:46:44<19:57:43, 3.78s/it] {'loss': 0.1424, 'grad_norm': 1.1073249084395338, 'learning_rate': 6.126680879647712e-06, 'epoch': 0.44} 44%|████▍ | 15248/34278 [16:46:44<19:57:43, 3.78s/it] 44%|████▍ | 15249/34278 [16:46:47<18:47:14, 3.55s/it] {'loss': 0.1486, 'grad_norm': 0.9205586686240007, 'learning_rate': 6.126220590962493e-06, 'epoch': 0.44} 44%|████▍ | 15249/34278 [16:46:47<18:47:14, 3.55s/it] 44%|████▍ | 15250/34278 [16:46:53<22:42:16, 4.30s/it] {'loss': 0.1311, 'grad_norm': 0.7290816364217954, 'learning_rate': 6.1257602922226445e-06, 'epoch': 0.44} 44%|████▍ | 15250/34278 [16:46:53<22:42:16, 4.30s/it] 44%|████▍ | 15251/34278 [16:46:56<21:17:43, 4.03s/it] {'loss': 0.16, 'grad_norm': 1.2122255502811066, 'learning_rate': 6.1252999834322766e-06, 'epoch': 0.44} 44%|████▍ | 15251/34278 [16:46:56<21:17:43, 4.03s/it] 44%|████▍ | 15252/34278 [16:47:01<22:00:15, 4.16s/it] {'loss': 0.1589, 'grad_norm': 1.0996832928390419, 'learning_rate': 6.124839664595501e-06, 'epoch': 0.44} 44%|████▍ | 15252/34278 [16:47:01<22:00:15, 4.16s/it] 44%|████▍ | 15253/34278 [16:47:04<20:56:10, 3.96s/it] {'loss': 0.1348, 'grad_norm': 0.9662578582152325, 'learning_rate': 6.1243793357164224e-06, 'epoch': 0.44} 44%|████▍ | 15253/34278 [16:47:04<20:56:10, 3.96s/it] 45%|████▍ | 15254/34278 [16:47:08<20:29:54, 
3.88s/it] {'loss': 0.1675, 'grad_norm': 0.8472291855870548, 'learning_rate': 6.123918996799155e-06, 'epoch': 0.45} 45%|████▍ | 15254/34278 [16:47:08<20:29:54, 3.88s/it] 45%|████▍ | 15255/34278 [16:47:11<19:36:05, 3.71s/it] {'loss': 0.1689, 'grad_norm': 1.1205126844670723, 'learning_rate': 6.123458647847808e-06, 'epoch': 0.45} 45%|████▍ | 15255/34278 [16:47:11<19:36:05, 3.71s/it] 45%|████▍ | 15256/34278 [16:47:16<21:01:51, 3.98s/it] {'loss': 0.161, 'grad_norm': 0.9057560283475344, 'learning_rate': 6.1229982888664895e-06, 'epoch': 0.45} 45%|████▍ | 15256/34278 [16:47:16<21:01:51, 3.98s/it] 45%|████▍ | 15257/34278 [16:47:19<20:13:40, 3.83s/it] {'loss': 0.1204, 'grad_norm': 0.8111224915309679, 'learning_rate': 6.122537919859312e-06, 'epoch': 0.45} 45%|████▍ | 15257/34278 [16:47:19<20:13:40, 3.83s/it] 45%|████▍ | 15258/34278 [16:47:23<19:33:38, 3.70s/it] {'loss': 0.1355, 'grad_norm': 0.752264863469283, 'learning_rate': 6.1220775408303825e-06, 'epoch': 0.45} 45%|████▍ | 15258/34278 [16:47:23<19:33:38, 3.70s/it] 45%|████▍ | 15259/34278 [16:47:29<23:25:02, 4.43s/it] {'loss': 0.1696, 'grad_norm': 0.9918192620855522, 'learning_rate': 6.121617151783812e-06, 'epoch': 0.45} 45%|████▍ | 15259/34278 [16:47:29<23:25:02, 4.43s/it] 45%|████▍ | 15260/34278 [16:47:35<26:24:16, 5.00s/it] {'loss': 0.1471, 'grad_norm': 0.7428392567570353, 'learning_rate': 6.1211567527237115e-06, 'epoch': 0.45} 45%|████▍ | 15260/34278 [16:47:35<26:24:16, 5.00s/it] 45%|████▍ | 15261/34278 [16:47:38<23:41:34, 4.49s/it] {'loss': 0.1305, 'grad_norm': 0.8730197701897332, 'learning_rate': 6.120696343654191e-06, 'epoch': 0.45} 45%|████▍ | 15261/34278 [16:47:38<23:41:34, 4.49s/it] 45%|████▍ | 15262/34278 [16:47:42<22:06:16, 4.18s/it] {'loss': 0.1259, 'grad_norm': 0.7914013776401143, 'learning_rate': 6.120235924579362e-06, 'epoch': 0.45} 45%|████▍ | 15262/34278 [16:47:42<22:06:16, 4.18s/it] 45%|████▍ | 15263/34278 [16:47:45<20:54:49, 3.96s/it] {'loss': 0.142, 'grad_norm': 0.7101655196438709, 'learning_rate': 
6.119775495503334e-06, 'epoch': 0.45} 45%|████▍ | 15263/34278 [16:47:45<20:54:49, 3.96s/it] 45%|████▍ | 15264/34278 [16:47:49<20:00:39, 3.79s/it] {'loss': 0.1323, 'grad_norm': 0.8127695382551665, 'learning_rate': 6.119315056430217e-06, 'epoch': 0.45} 45%|████▍ | 15264/34278 [16:47:49<20:00:39, 3.79s/it] 45%|████▍ | 15265/34278 [16:47:53<20:01:21, 3.79s/it] {'loss': 0.1405, 'grad_norm': 0.8752157719510197, 'learning_rate': 6.118854607364122e-06, 'epoch': 0.45} 45%|████▍ | 15265/34278 [16:47:53<20:01:21, 3.79s/it] 45%|████▍ | 15266/34278 [16:47:55<18:37:40, 3.53s/it] {'loss': 0.1511, 'grad_norm': 0.8859459423157212, 'learning_rate': 6.118394148309161e-06, 'epoch': 0.45} 45%|████▍ | 15266/34278 [16:47:55<18:37:40, 3.53s/it] 45%|████▍ | 15267/34278 [16:47:59<18:26:55, 3.49s/it] {'loss': 0.1602, 'grad_norm': 0.9430807259674541, 'learning_rate': 6.117933679269446e-06, 'epoch': 0.45} 45%|████▍ | 15267/34278 [16:47:59<18:26:55, 3.49s/it] 45%|████▍ | 15268/34278 [16:48:03<19:30:19, 3.69s/it] {'loss': 0.1251, 'grad_norm': 0.8101264308156917, 'learning_rate': 6.117473200249082e-06, 'epoch': 0.45} 45%|████▍ | 15268/34278 [16:48:03<19:30:19, 3.69s/it] 45%|████▍ | 15269/34278 [16:48:06<18:28:59, 3.50s/it] {'loss': 0.1311, 'grad_norm': 0.6454273830947004, 'learning_rate': 6.117012711252186e-06, 'epoch': 0.45} 45%|████▍ | 15269/34278 [16:48:06<18:28:59, 3.50s/it] 45%|████▍ | 15270/34278 [16:48:09<18:03:46, 3.42s/it] {'loss': 0.1252, 'grad_norm': 1.0266229742097204, 'learning_rate': 6.116552212282868e-06, 'epoch': 0.45} 45%|████▍ | 15270/34278 [16:48:09<18:03:46, 3.42s/it] 45%|████▍ | 15271/34278 [16:48:12<17:13:00, 3.26s/it] {'loss': 0.1326, 'grad_norm': 0.8443968042739745, 'learning_rate': 6.116091703345236e-06, 'epoch': 0.45} 45%|████▍ | 15271/34278 [16:48:12<17:13:00, 3.26s/it] 45%|████▍ | 15272/34278 [16:48:16<17:28:24, 3.31s/it] {'loss': 0.1311, 'grad_norm': 0.7291060015392791, 'learning_rate': 6.1156311844434065e-06, 'epoch': 0.45} 45%|████▍ | 15272/34278 [16:48:16<17:28:24, 
3.31s/it] 45%|████▍ | 15273/34278 [16:48:21<21:18:15, 4.04s/it] {'loss': 0.1279, 'grad_norm': 0.8763559814127485, 'learning_rate': 6.115170655581486e-06, 'epoch': 0.45} 45%|████▍ | 15273/34278 [16:48:21<21:18:15, 4.04s/it] 45%|████▍ | 15274/34278 [16:48:28<24:47:06, 4.70s/it] {'loss': 0.1284, 'grad_norm': 0.7304809459449753, 'learning_rate': 6.114710116763589e-06, 'epoch': 0.45} 45%|████▍ | 15274/34278 [16:48:28<24:47:06, 4.70s/it] 45%|████▍ | 15275/34278 [16:48:30<21:44:54, 4.12s/it] {'loss': 0.1394, 'grad_norm': 0.788802497349413, 'learning_rate': 6.114249567993826e-06, 'epoch': 0.45} 45%|████▍ | 15275/34278 [16:48:30<21:44:54, 4.12s/it] 45%|████▍ | 15276/34278 [16:48:36<24:26:10, 4.63s/it] {'loss': 0.1526, 'grad_norm': 1.0010644044930095, 'learning_rate': 6.11378900927631e-06, 'epoch': 0.45} 45%|████▍ | 15276/34278 [16:48:36<24:26:10, 4.63s/it] 45%|████▍ | 15277/34278 [16:48:40<22:36:35, 4.28s/it] {'loss': 0.1468, 'grad_norm': 0.9800887845963783, 'learning_rate': 6.1133284406151494e-06, 'epoch': 0.45} 45%|████▍ | 15277/34278 [16:48:40<22:36:35, 4.28s/it] 45%|████▍ | 15278/34278 [16:48:43<20:44:03, 3.93s/it] {'loss': 0.1246, 'grad_norm': 0.8837972629123051, 'learning_rate': 6.11286786201446e-06, 'epoch': 0.45} 45%|████▍ | 15278/34278 [16:48:43<20:44:03, 3.93s/it] 45%|████▍ | 15279/34278 [16:48:49<24:17:23, 4.60s/it] {'loss': 0.1459, 'grad_norm': 0.756446040358204, 'learning_rate': 6.112407273478351e-06, 'epoch': 0.45} 45%|████▍ | 15279/34278 [16:48:49<24:17:23, 4.60s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 45%|████▍ | 15280/34278 [16:48:55<26:32:30, 5.03s/it] {'loss': 0.1448, 'grad_norm': 0.8495668524065141, 'learning_rate': 6.111946675010936e-06, 'epoch': 0.45} 45%|████▍ | 15280/34278 [16:48:55<26:32:30, 5.03s/it] 45%|████▍ | 15281/34278 [16:48:58<23:37:02, 4.48s/it] {'loss': 0.1664, 'grad_norm': 1.01471428766146, 'learning_rate': 6.111486066616326e-06, 'epoch': 0.45} 45%|████▍ | 15281/34278 [16:48:58<23:37:02, 4.48s/it] 45%|████▍ | 15282/34278 [16:49:01<21:43:06, 4.12s/it] {'loss': 0.1574, 'grad_norm': 0.916121638870755, 'learning_rate': 6.1110254482986354e-06, 'epoch': 0.45} 45%|████▍ | 15282/34278 [16:49:01<21:43:06, 4.12s/it] 45%|████▍ | 15283/34278 [16:49:05<20:10:47, 3.82s/it] {'loss': 0.1525, 'grad_norm': 0.7760158580921451, 'learning_rate': 6.110564820061972e-06, 'epoch': 0.45} 45%|████▍ | 15283/34278 [16:49:05<20:10:47, 3.82s/it] 45%|████▍ | 15284/34278 [16:49:08<18:54:16, 3.58s/it] {'loss': 0.1295, 'grad_norm': 0.853142412517895, 'learning_rate': 6.110104181910452e-06, 'epoch': 0.45} 45%|████▍ | 15284/34278 [16:49:08<18:54:16, 3.58s/it] 45%|████▍ | 15285/34278 [16:49:10<17:50:08, 3.38s/it] {'loss': 0.1505, 'grad_norm': 0.9426743939597242, 'learning_rate': 6.1096435338481885e-06, 'epoch': 0.45} 45%|████▍ | 15285/34278 [16:49:10<17:50:08, 3.38s/it] 45%|████▍ | 15286/34278 [16:49:14<17:50:26, 3.38s/it] {'loss': 0.1502, 'grad_norm': 0.7478221107434188, 'learning_rate': 6.10918287587929e-06, 'epoch': 0.45} 45%|████▍ | 15286/34278 [16:49:14<17:50:26, 3.38s/it] 45%|████▍ | 15287/34278 [16:49:17<17:43:46, 3.36s/it] {'loss': 0.1607, 'grad_norm': 0.7782289003441114, 'learning_rate': 6.108722208007875e-06, 'epoch': 0.45} 45%|████▍ | 15287/34278 [16:49:17<17:43:46, 3.36s/it] 45%|████▍ | 15288/34278 [16:49:21<18:42:46, 3.55s/it] {'loss': 0.1362, 'grad_norm': 0.904992190235612, 'learning_rate': 6.10826153023805e-06, 'epoch': 0.45} 45%|████▍ | 15288/34278 [16:49:21<18:42:46, 3.55s/it] 45%|████▍ | 15289/34278 [16:49:25<19:52:56, 
3.77s/it] {'loss': 0.1436, 'grad_norm': 0.8852852530556897, 'learning_rate': 6.107800842573931e-06, 'epoch': 0.45} 45%|████▍ | 15289/34278 [16:49:25<19:52:56, 3.77s/it] 45%|████▍ | 15290/34278 [16:49:32<24:10:50, 4.58s/it] {'loss': 0.1426, 'grad_norm': 1.429744324549573, 'learning_rate': 6.10734014501963e-06, 'epoch': 0.45} 45%|████▍ | 15290/34278 [16:49:32<24:10:50, 4.58s/it] 45%|████▍ | 15291/34278 [16:49:36<23:10:59, 4.40s/it] {'loss': 0.1535, 'grad_norm': 0.9704718485114926, 'learning_rate': 6.106879437579262e-06, 'epoch': 0.45} 45%|████▍ | 15291/34278 [16:49:36<23:10:59, 4.40s/it] 45%|████▍ | 15292/34278 [16:49:41<23:47:35, 4.51s/it] {'loss': 0.1261, 'grad_norm': 0.866394347707187, 'learning_rate': 6.106418720256938e-06, 'epoch': 0.45} 45%|████▍ | 15292/34278 [16:49:41<23:47:35, 4.51s/it] 45%|████▍ | 15293/34278 [16:49:44<21:48:38, 4.14s/it] {'loss': 0.1243, 'grad_norm': 0.8273542824415847, 'learning_rate': 6.105957993056772e-06, 'epoch': 0.45} 45%|████▍ | 15293/34278 [16:49:44<21:48:38, 4.14s/it] 45%|████▍ | 15294/34278 [16:49:47<19:43:30, 3.74s/it] {'loss': 0.1381, 'grad_norm': 1.5233395354131498, 'learning_rate': 6.105497255982876e-06, 'epoch': 0.45} 45%|████▍ | 15294/34278 [16:49:47<19:43:30, 3.74s/it] 45%|████▍ | 15295/34278 [16:49:50<19:40:05, 3.73s/it] {'loss': 0.1579, 'grad_norm': 1.0572255648397941, 'learning_rate': 6.105036509039365e-06, 'epoch': 0.45} 45%|████▍ | 15295/34278 [16:49:50<19:40:05, 3.73s/it] 45%|████▍ | 15296/34278 [16:49:53<18:09:40, 3.44s/it] {'loss': 0.154, 'grad_norm': 0.7290005488993977, 'learning_rate': 6.1045757522303516e-06, 'epoch': 0.45} 45%|████▍ | 15296/34278 [16:49:53<18:09:40, 3.44s/it] 45%|████▍ | 15297/34278 [16:49:59<21:44:26, 4.12s/it] {'loss': 0.1504, 'grad_norm': 0.7555598067273647, 'learning_rate': 6.104114985559952e-06, 'epoch': 0.45} 45%|████▍ | 15297/34278 [16:49:59<21:44:26, 4.12s/it] 45%|████▍ | 15298/34278 [16:50:02<20:14:40, 3.84s/it] {'loss': 0.1504, 'grad_norm': 1.0388390205532618, 'learning_rate': 
6.1036542090322736e-06, 'epoch': 0.45} 45%|████▍ | 15298/34278 [16:50:02<20:14:40, 3.84s/it] 45%|████▍ | 15299/34278 [16:50:05<19:05:55, 3.62s/it] {'loss': 0.144, 'grad_norm': 0.890318762127317, 'learning_rate': 6.103193422651436e-06, 'epoch': 0.45} 45%|████▍ | 15299/34278 [16:50:05<19:05:55, 3.62s/it] 45%|████▍ | 15300/34278 [16:50:08<18:09:27, 3.44s/it] {'loss': 0.129, 'grad_norm': 0.6257069102607349, 'learning_rate': 6.102732626421552e-06, 'epoch': 0.45} 45%|████▍ | 15300/34278 [16:50:08<18:09:27, 3.44s/it] 45%|████▍ | 15301/34278 [16:50:12<18:39:01, 3.54s/it] {'loss': 0.1485, 'grad_norm': 0.8375940792515587, 'learning_rate': 6.102271820346731e-06, 'epoch': 0.45} 45%|████▍ | 15301/34278 [16:50:12<18:39:01, 3.54s/it] 45%|████▍ | 15302/34278 [16:50:16<19:16:55, 3.66s/it] {'loss': 0.1412, 'grad_norm': 1.1803825090593871, 'learning_rate': 6.101811004431093e-06, 'epoch': 0.45} 45%|████▍ | 15302/34278 [16:50:16<19:16:55, 3.66s/it] 45%|████▍ | 15303/34278 [16:50:19<18:12:37, 3.45s/it] {'loss': 0.1196, 'grad_norm': 1.0694821347337167, 'learning_rate': 6.101350178678749e-06, 'epoch': 0.45} 45%|████▍ | 15303/34278 [16:50:19<18:12:37, 3.45s/it] 45%|████▍ | 15304/34278 [16:50:25<22:23:59, 4.25s/it] {'loss': 0.1313, 'grad_norm': 0.7030966777568456, 'learning_rate': 6.100889343093812e-06, 'epoch': 0.45} 45%|████▍ | 15304/34278 [16:50:25<22:23:59, 4.25s/it] 45%|████▍ | 15305/34278 [16:50:28<21:02:33, 3.99s/it] {'loss': 0.1426, 'grad_norm': 0.8758459983348651, 'learning_rate': 6.1004284976804e-06, 'epoch': 0.45} 45%|████▍ | 15305/34278 [16:50:28<21:02:33, 3.99s/it] 45%|████▍ | 15306/34278 [16:50:35<24:24:05, 4.63s/it] {'loss': 0.1436, 'grad_norm': 0.8054310547956597, 'learning_rate': 6.099967642442623e-06, 'epoch': 0.45} 45%|████▍ | 15306/34278 [16:50:35<24:24:05, 4.63s/it] 45%|████▍ | 15307/34278 [16:50:37<21:34:06, 4.09s/it] {'loss': 0.1381, 'grad_norm': 0.9332237157358919, 'learning_rate': 6.099506777384598e-06, 'epoch': 0.45} 45%|████▍ | 15307/34278 [16:50:37<21:34:06, 
4.09s/it] 45%|████▍ | 15308/34278 [16:50:43<23:23:56, 4.44s/it] {'loss': 0.1403, 'grad_norm': 0.8506058449099817, 'learning_rate': 6.09904590251044e-06, 'epoch': 0.45} 45%|████▍ | 15308/34278 [16:50:43<23:23:56, 4.44s/it] 45%|████▍ | 15309/34278 [16:50:48<24:51:15, 4.72s/it] {'loss': 0.1346, 'grad_norm': 0.7043375782536109, 'learning_rate': 6.098585017824261e-06, 'epoch': 0.45} 45%|████▍ | 15309/34278 [16:50:48<24:51:15, 4.72s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 45%|████▍ | 15310/34278 [16:50:51<21:49:16, 4.14s/it] {'loss': 0.1271, 'grad_norm': 0.9915756333596731, 'learning_rate': 6.098124123330178e-06, 'epoch': 0.45} 45%|████▍ | 15310/34278 [16:50:51<21:49:16, 4.14s/it] 45%|████▍ | 15311/34278 [16:50:55<21:33:01, 4.09s/it] {'loss': 0.1277, 'grad_norm': 0.7555900798427305, 'learning_rate': 6.097663219032306e-06, 'epoch': 0.45} 45%|████▍ | 15311/34278 [16:50:55<21:33:01, 4.09s/it] 45%|████▍ | 15312/34278 [16:51:00<23:03:56, 4.38s/it] {'loss': 0.1369, 'grad_norm': 0.7811465571733777, 'learning_rate': 6.097202304934758e-06, 'epoch': 0.45} 45%|████▍ | 15312/34278 [16:51:00<23:03:56, 4.38s/it] 45%|████▍ | 15313/34278 [16:51:03<21:03:11, 4.00s/it] {'loss': 0.1393, 'grad_norm': 0.7760424626545825, 'learning_rate': 6.096741381041649e-06, 'epoch': 0.45} 45%|████▍ | 15313/34278 [16:51:03<21:03:11, 4.00s/it] 45%|████▍ | 15314/34278 [16:51:06<20:15:12, 3.84s/it] {'loss': 0.1242, 'grad_norm': 0.8137295964514707, 'learning_rate': 6.096280447357095e-06, 'epoch': 0.45} 45%|████▍ | 15314/34278 [16:51:06<20:15:12, 3.84s/it] 45%|████▍ | 15315/34278 [16:51:10<20:01:29, 3.80s/it] {'loss': 0.1736, 'grad_norm': 0.8190671346470227, 'learning_rate': 6.0958195038852115e-06, 'epoch': 0.45} 45%|████▍ | 15315/34278 [16:51:10<20:01:29, 3.80s/it]Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8923fb7a10>
Failed to fetch sample 2667591.
Exception: cannot identify image file <_io.BytesIO object at 0x7f8923fb7a10> 45%|████▍ | 15316/34278 [16:51:13<18:47:30, 3.57s/it] {'loss': 0.1358, 'grad_norm': 0.7415349402083039, 'learning_rate': 6.095358550630113e-06, 'epoch': 0.45} 45%|████▍ | 15316/34278 [16:51:13<18:47:30, 3.57s/it] 45%|████▍ | 15317/34278 [16:51:16<18:03:24, 3.43s/it] {'loss': 0.1279, 'grad_norm': 0.7279555472391306, 'learning_rate': 6.0948975875959145e-06, 'epoch': 0.45} 45%|████▍ | 15317/34278 [16:51:16<18:03:24, 3.43s/it] 45%|████▍ | 15318/34278 [16:51:22<22:19:06, 4.24s/it] {'loss': 0.1519, 'grad_norm': 0.8911639321268853, 'learning_rate': 6.094436614786733e-06, 'epoch': 0.45} 45%|████▍ | 15318/34278 [16:51:22<22:19:06, 4.24s/it] 45%|████▍ | 15319/34278 [16:51:26<22:02:00, 4.18s/it] {'loss': 0.1252, 'grad_norm': 0.9726188669800133, 'learning_rate': 6.093975632206681e-06, 'epoch': 0.45} 45%|████▍ | 15319/34278 [16:51:26<22:02:00, 4.18s/it] 45%|████▍ | 15320/34278 [16:51:30<20:38:55, 3.92s/it] {'loss': 0.1671, 'grad_norm': 0.8049216253395035, 'learning_rate': 6.093514639859877e-06, 'epoch': 0.45} 45%|████▍ | 15320/34278 [16:51:30<20:38:55, 3.92s/it] 45%|████▍ | 15321/34278 [16:51:33<19:28:02, 3.70s/it] {'loss': 0.1335, 'grad_norm': 0.8248019570433789, 'learning_rate': 6.093053637750433e-06, 'epoch': 0.45} 45%|████▍ | 15321/34278 [16:51:33<19:28:02, 3.70s/it] 45%|████▍ | 15322/34278 [16:51:36<18:24:37, 3.50s/it] {'loss': 0.1481, 'grad_norm': 0.9190140038585972, 'learning_rate': 6.09259262588247e-06, 'epoch': 0.45} 45%|████▍ | 15322/34278 [16:51:36<18:24:37, 3.50s/it] 45%|████▍ | 15323/34278 [16:51:40<19:39:23, 3.73s/it] {'loss': 0.1237, 'grad_norm': 1.0802199683662481, 'learning_rate': 6.092131604260099e-06, 'epoch': 0.45} 45%|████▍ | 15323/34278 [16:51:40<19:39:23, 3.73s/it] 45%|████▍ | 15324/34278 [16:51:43<18:42:31, 3.55s/it] {'loss': 0.1477, 'grad_norm': 1.9020566396685108, 'learning_rate': 6.091670572887438e-06, 'epoch': 0.45} 45%|████▍ | 15324/34278 [16:51:43<18:42:31, 3.55s/it] 
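The traceback above shows the dataloader hitting `PIL.UnidentifiedImageError` on an unreadable image buffer, logging "Failed to fetch sample ...", and training continuing at the next step, which suggests the dataset retries with a different sample rather than crashing the run. The actual `aguvis/dataset.py` code is not shown in this log, so the following is only a minimal sketch of that fallback pattern; `RobustDataset`, its `records` field, and `max_retries` are all hypothetical names, not the real implementation.

```python
import io
import random

from PIL import Image, UnidentifiedImageError


def pil_loader(img_bytes: bytes) -> Image.Image:
    # Decode raw bytes into a PIL image; Image.open raises
    # UnidentifiedImageError when the buffer is not a valid image
    # (e.g. a truncated or corrupted download from object storage).
    buff = io.BytesIO(img_bytes)
    img = Image.open(buff)
    return img.convert("RGB")


class RobustDataset:
    """Sketch of a dataset that skips unreadable samples instead of crashing."""

    def __init__(self, records, max_retries=10):
        self.records = records          # hypothetical: raw image byte strings
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return pil_loader(self.records[i])
            except (UnidentifiedImageError, OSError) as e:
                # Mirrors the log's "Failed to fetch sample N. Exception: ..."
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randint(0, len(self.records) - 1)  # retry another index
        raise RuntimeError("too many unreadable samples in a row")
```

The key design choice is that a bad sample costs one retry instead of aborting a multi-day run; the trade-off is that silent skips can mask widespread data corruption, so the exception is still logged each time.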
45%|████▍ | 15325/34278 [16:51:46<17:49:09, 3.38s/it] {'loss': 0.1403, 'grad_norm': 0.7611700628300063, 'learning_rate': 6.091209531768603e-06, 'epoch': 0.45} 45%|████▍ | 15325/34278 [16:51:46<17:49:09, 3.38s/it] 45%|████▍ | 15326/34278 [16:51:50<18:56:39, 3.60s/it] {'loss': 0.13, 'grad_norm': 0.8753853301393958, 'learning_rate': 6.09074848090771e-06, 'epoch': 0.45} 45%|████▍ | 15326/34278 [16:51:50<18:56:39, 3.60s/it] 45%|████▍ | 15327/34278 [16:51:54<19:06:54, 3.63s/it] {'loss': 0.1358, 'grad_norm': 0.9155815837711926, 'learning_rate': 6.0902874203088744e-06, 'epoch': 0.45} 45%|████▍ | 15327/34278 [16:51:54<19:06:54, 3.63s/it] 45%|████▍ | 15328/34278 [16:51:58<19:38:05, 3.73s/it] {'loss': 0.1342, 'grad_norm': 1.1288010289736823, 'learning_rate': 6.089826349976213e-06, 'epoch': 0.45} 45%|████▍ | 15328/34278 [16:51:58<19:38:05, 3.73s/it] 45%|████▍ | 15329/34278 [16:52:04<23:32:58, 4.47s/it] {'loss': 0.135, 'grad_norm': 0.7041431192759933, 'learning_rate': 6.0893652699138425e-06, 'epoch': 0.45} 45%|████▍ | 15329/34278 [16:52:04<23:32:58, 4.47s/it] 45%|████▍ | 15330/34278 [16:52:07<21:15:50, 4.04s/it] {'loss': 0.1359, 'grad_norm': 0.916718058298528, 'learning_rate': 6.088904180125878e-06, 'epoch': 0.45} 45%|████▍ | 15330/34278 [16:52:07<21:15:50, 4.04s/it] 45%|████▍ | 15331/34278 [16:52:13<24:18:01, 4.62s/it] {'loss': 0.1593, 'grad_norm': 0.8518468216049682, 'learning_rate': 6.088443080616439e-06, 'epoch': 0.45} 45%|████▍ | 15331/34278 [16:52:13<24:18:01, 4.62s/it] 45%|████▍ | 15332/34278 [16:52:16<21:28:56, 4.08s/it] {'loss': 0.1362, 'grad_norm': 1.1255361767521337, 'learning_rate': 6.087981971389639e-06, 'epoch': 0.45} 45%|████▍ | 15332/34278 [16:52:16<21:28:56, 4.08s/it] 45%|████▍ | 15333/34278 [16:52:19<19:38:04, 3.73s/it] {'loss': 0.1262, 'grad_norm': 0.6877519479258729, 'learning_rate': 6.0875208524495945e-06, 'epoch': 0.45} 45%|████▍ | 15333/34278 [16:52:19<19:38:04, 3.73s/it] 45%|████▍ | 15334/34278 [16:52:22<18:30:49, 3.52s/it] {'loss': 0.1368, 'grad_norm': 
0.8847023564377745, 'learning_rate': 6.087059723800426e-06, 'epoch': 0.45} 45%|████▍ | 15334/34278 [16:52:22<18:30:49, 3.52s/it] 45%|████▍ | 15335/34278 [16:52:26<18:30:13, 3.52s/it] {'loss': 0.1436, 'grad_norm': 0.7625876272799125, 'learning_rate': 6.086598585446245e-06, 'epoch': 0.45} 45%|████▍ | 15335/34278 [16:52:26<18:30:13, 3.52s/it] 45%|████▍ | 15336/34278 [16:52:29<17:43:59, 3.37s/it] {'loss': 0.1253, 'grad_norm': 0.78945406976093, 'learning_rate': 6.086137437391172e-06, 'epoch': 0.45} 45%|████▍ | 15336/34278 [16:52:29<17:43:59, 3.37s/it] 45%|████▍ | 15337/34278 [16:52:32<17:05:49, 3.25s/it] {'loss': 0.149, 'grad_norm': 0.7537447445197407, 'learning_rate': 6.0856762796393244e-06, 'epoch': 0.45} 45%|████▍ | 15337/34278 [16:52:32<17:05:49, 3.25s/it] 45%|████▍ | 15338/34278 [16:52:35<16:42:20, 3.18s/it] {'loss': 0.1312, 'grad_norm': 0.7916157298811274, 'learning_rate': 6.085215112194818e-06, 'epoch': 0.45} 45%|████▍ | 15338/34278 [16:52:35<16:42:20, 3.18s/it] 45%|████▍ | 15339/34278 [16:52:38<16:41:13, 3.17s/it] {'loss': 0.1543, 'grad_norm': 0.9057345129733425, 'learning_rate': 6.084753935061769e-06, 'epoch': 0.45} 45%|████▍ | 15339/34278 [16:52:38<16:41:13, 3.17s/it] 45%|████▍ | 15340/34278 [16:52:43<19:25:51, 3.69s/it] {'loss': 0.1302, 'grad_norm': 0.6811350085174357, 'learning_rate': 6.084292748244296e-06, 'epoch': 0.45} 45%|████▍ | 15340/34278 [16:52:43<19:25:51, 3.69s/it] 45%|████▍ | 15341/34278 [16:52:46<19:18:47, 3.67s/it] {'loss': 0.1418, 'grad_norm': 0.7061628425165356, 'learning_rate': 6.083831551746516e-06, 'epoch': 0.45} 45%|████▍ | 15341/34278 [16:52:46<19:18:47, 3.67s/it] 45%|████▍ | 15342/34278 [16:52:49<18:03:40, 3.43s/it] {'loss': 0.1333, 'grad_norm': 0.9942950514244173, 'learning_rate': 6.083370345572548e-06, 'epoch': 0.45} 45%|████▍ | 15342/34278 [16:52:49<18:03:40, 3.43s/it] 45%|████▍ | 15343/34278 [16:52:52<17:49:24, 3.39s/it] {'loss': 0.1311, 'grad_norm': 0.8480920884999015, 'learning_rate': 6.082909129726506e-06, 'epoch': 0.45} 45%|████▍ 
| 15343/34278 [16:52:52<17:49:24, 3.39s/it] 45%|████▍ | 15344/34278 [16:52:58<21:54:06, 4.16s/it] {'loss': 0.1371, 'grad_norm': 0.7649499289265241, 'learning_rate': 6.082447904212512e-06, 'epoch': 0.45} 45%|████▍ | 15344/34278 [16:52:58<21:54:06, 4.16s/it] 45%|████▍ | 15345/34278 [16:53:04<24:05:22, 4.58s/it] {'loss': 0.14, 'grad_norm': 0.7993715485804653, 'learning_rate': 6.081986669034681e-06, 'epoch': 0.45} 45%|████▍ | 15345/34278 [16:53:04<24:05:22, 4.58s/it] 45%|████▍ | 15346/34278 [16:53:07<21:14:27, 4.04s/it] {'loss': 0.1245, 'grad_norm': 0.8205791858500433, 'learning_rate': 6.08152542419713e-06, 'epoch': 0.45} 45%|████▍ | 15346/34278 [16:53:07<21:14:27, 4.04s/it] 45%|████▍ | 15347/34278 [16:53:11<21:59:58, 4.18s/it] {'loss': 0.1305, 'grad_norm': 0.7218935117331445, 'learning_rate': 6.081064169703981e-06, 'epoch': 0.45} 45%|████▍ | 15347/34278 [16:53:11<21:59:58, 4.18s/it] 45%|████▍ | 15348/34278 [16:53:14<20:22:57, 3.88s/it] {'loss': 0.1597, 'grad_norm': 0.8263799578193723, 'learning_rate': 6.080602905559346e-06, 'epoch': 0.45} 45%|████▍ | 15348/34278 [16:53:14<20:22:57, 3.88s/it] 45%|████▍ | 15349/34278 [16:53:18<19:42:34, 3.75s/it] {'loss': 0.1604, 'grad_norm': 0.8179836136294776, 'learning_rate': 6.080141631767349e-06, 'epoch': 0.45} 45%|████▍ | 15349/34278 [16:53:18<19:42:34, 3.75s/it] 45%|████▍ | 15350/34278 [16:53:21<18:33:24, 3.53s/it] {'loss': 0.1421, 'grad_norm': 0.9099487724906116, 'learning_rate': 6.079680348332103e-06, 'epoch': 0.45} 45%|████▍ | 15350/34278 [16:53:21<18:33:24, 3.53s/it] 45%|████▍ | 15351/34278 [16:53:26<20:56:26, 3.98s/it] {'loss': 0.1333, 'grad_norm': 1.0850840366906596, 'learning_rate': 6.079219055257729e-06, 'epoch': 0.45} 45%|████▍ | 15351/34278 [16:53:26<20:56:26, 3.98s/it] 45%|████▍ | 15352/34278 [16:53:29<19:10:57, 3.65s/it] {'loss': 0.132, 'grad_norm': 0.789915599834623, 'learning_rate': 6.078757752548346e-06, 'epoch': 0.45} 45%|████▍ | 15352/34278 [16:53:29<19:10:57, 3.65s/it]Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f231b8098a0>
Failed to fetch sample 3058471.
Exception: cannot identify image file <_io.BytesIO object at 0x7f231b8098a0> 45%|████▍ | 15353/34278 [16:53:32<18:29:02, 3.52s/it] {'loss': 0.1281, 'grad_norm': 0.8338508408773504, 'learning_rate': 6.07829644020807e-06, 'epoch': 0.45} 45%|████▍ | 15353/34278 [16:53:32<18:29:02, 3.52s/it] 45%|████▍ | 15354/34278 [16:53:35<17:28:25, 3.32s/it] {'loss': 0.109, 'grad_norm': 0.7019238584494384, 'learning_rate': 6.0778351182410226e-06, 'epoch': 0.45} 45%|████▍ | 15354/34278 [16:53:35<17:28:25, 3.32s/it] 45%|████▍ | 15355/34278 [16:53:38<17:47:10, 3.38s/it] {'loss': 0.1535, 'grad_norm': 0.8044855984225002, 'learning_rate': 6.077373786651319e-06, 'epoch': 0.45} 45%|████▍ | 15355/34278 [16:53:38<17:47:10, 3.38s/it] 45%|████▍ | 15356/34278 [16:53:44<21:09:50, 4.03s/it] {'loss': 0.1124, 'grad_norm': 0.7373770750377117, 'learning_rate': 6.076912445443079e-06, 'epoch': 0.45} 45%|████▍ | 15356/34278 [16:53:44<21:09:50, 4.03s/it] 45%|████▍ | 15357/34278 [16:53:48<21:21:55, 4.07s/it] {'loss': 0.1466, 'grad_norm': 0.7168772588691699, 'learning_rate': 6.076451094620424e-06, 'epoch': 0.45} 45%|████▍ | 15357/34278 [16:53:48<21:21:55, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 45%|████▍ | 15358/34278 [16:53:53<22:37:40, 4.31s/it] {'loss': 0.1563, 'grad_norm': 0.95791986581113, 'learning_rate': 6.075989734187469e-06, 'epoch': 0.45} 45%|████▍ | 15358/34278 [16:53:53<22:37:40, 4.31s/it] 45%|████▍ | 15359/34278 [16:53:57<22:23:06, 4.26s/it] {'loss': 0.1061, 'grad_norm': 0.7106563143561379, 'learning_rate': 6.075528364148335e-06, 'epoch': 0.45} 45%|████▍ | 15359/34278 [16:53:57<22:23:06, 4.26s/it] 45%|████▍ | 15360/34278 [16:54:00<20:30:06, 3.90s/it] {'loss': 0.1729, 'grad_norm': 0.8339195536507266, 'learning_rate': 6.07506698450714e-06, 'epoch': 0.45} 45%|████▍ | 15360/34278 [16:54:00<20:30:06, 3.90s/it] 45%|████▍ | 15361/34278 [16:54:04<20:15:11, 3.85s/it] {'loss': 0.1284, 'grad_norm': 0.8239540782182683, 'learning_rate': 6.074605595268002e-06, 'epoch': 0.45} 45%|████▍ | 15361/34278 [16:54:04<20:15:11, 3.85s/it] 45%|████▍ | 15362/34278 [16:54:07<18:32:07, 3.53s/it] {'loss': 0.1367, 'grad_norm': 0.7366551213719733, 'learning_rate': 6.074144196435045e-06, 'epoch': 0.45} 45%|████▍ | 15362/34278 [16:54:07<18:32:07, 3.53s/it] 45%|████▍ | 15363/34278 [16:54:12<21:27:35, 4.08s/it] {'loss': 0.1103, 'grad_norm': 0.7667253677875338, 'learning_rate': 6.073682788012384e-06, 'epoch': 0.45} 45%|████▍ | 15363/34278 [16:54:12<21:27:35, 4.08s/it] 45%|████▍ | 15364/34278 [16:54:15<19:25:35, 3.70s/it] {'loss': 0.1373, 'grad_norm': 0.7472561908775801, 'learning_rate': 6.073221370004139e-06, 'epoch': 0.45} 45%|████▍ | 15364/34278 [16:54:15<19:25:35, 3.70s/it] 45%|████▍ | 15365/34278 [16:54:21<23:37:38, 4.50s/it] {'loss': 0.1223, 'grad_norm': 0.9541954510364338, 'learning_rate': 6.07275994241443e-06, 'epoch': 0.45} 45%|████▍ | 15365/34278 [16:54:21<23:37:38, 4.50s/it] 45%|████▍ | 15366/34278 [16:54:25<21:52:55, 4.17s/it] {'loss': 0.1183, 'grad_norm': 0.8713491107684442, 'learning_rate': 6.072298505247376e-06, 'epoch': 0.45} 45%|████▍ | 15366/34278 [16:54:25<21:52:55, 4.17s/it] 45%|████▍ | 15367/34278 [16:54:28<20:47:05, 
3.96s/it] {'loss': 0.1942, 'grad_norm': 1.3063701456978865, 'learning_rate': 6.071837058507097e-06, 'epoch': 0.45} 45%|████▍ | 15367/34278 [16:54:28<20:47:05, 3.96s/it] 45%|████▍ | 15368/34278 [16:54:32<19:57:57, 3.80s/it] {'loss': 0.1376, 'grad_norm': 0.8573021583863663, 'learning_rate': 6.071375602197713e-06, 'epoch': 0.45} 45%|████▍ | 15368/34278 [16:54:32<19:57:57, 3.80s/it] 45%|████▍ | 15369/34278 [16:54:38<23:47:10, 4.53s/it] {'loss': 0.1429, 'grad_norm': 0.9923513666614028, 'learning_rate': 6.070914136323342e-06, 'epoch': 0.45} 45%|████▍ | 15369/34278 [16:54:38<23:47:10, 4.53s/it] 45%|████▍ | 15370/34278 [16:54:41<21:42:20, 4.13s/it] {'loss': 0.1554, 'grad_norm': 0.9075965650298964, 'learning_rate': 6.070452660888108e-06, 'epoch': 0.45} 45%|████▍ | 15370/34278 [16:54:41<21:42:20, 4.13s/it] 45%|████▍ | 15371/34278 [16:54:44<20:01:40, 3.81s/it] {'loss': 0.1473, 'grad_norm': 0.7643302118184612, 'learning_rate': 6.069991175896126e-06, 'epoch': 0.45} 45%|████▍ | 15371/34278 [16:54:44<20:01:40, 3.81s/it] 45%|████▍ | 15372/34278 [16:54:47<19:02:55, 3.63s/it] {'loss': 0.16, 'grad_norm': 1.0499260942359514, 'learning_rate': 6.069529681351518e-06, 'epoch': 0.45} 45%|████▍ | 15372/34278 [16:54:47<19:02:55, 3.63s/it] 45%|████▍ | 15373/34278 [16:54:50<18:16:55, 3.48s/it] {'loss': 0.1542, 'grad_norm': 0.8892435689751551, 'learning_rate': 6.069068177258406e-06, 'epoch': 0.45} 45%|████▍ | 15373/34278 [16:54:50<18:16:55, 3.48s/it] 45%|████▍ | 15374/34278 [16:54:53<17:17:52, 3.29s/it] {'loss': 0.1316, 'grad_norm': 0.8745325863367627, 'learning_rate': 6.068606663620907e-06, 'epoch': 0.45} 45%|████▍ | 15374/34278 [16:54:53<17:17:52, 3.29s/it] 45%|████▍ | 15375/34278 [16:54:56<16:55:43, 3.22s/it] {'loss': 0.1457, 'grad_norm': 1.2117473936439547, 'learning_rate': 6.068145140443143e-06, 'epoch': 0.45} 45%|████▍ | 15375/34278 [16:54:56<16:55:43, 3.22s/it] 45%|████▍ | 15376/34278 [16:54:59<16:38:17, 3.17s/it] {'loss': 0.1585, 'grad_norm': 0.9842029551190973, 'learning_rate': 
6.067683607729234e-06, 'epoch': 0.45} 45%|████▍ | 15376/34278 [16:54:59<16:38:17, 3.17s/it] 45%|████▍ | 15377/34278 [16:55:03<17:06:18, 3.26s/it] {'loss': 0.1406, 'grad_norm': 0.942666837965468, 'learning_rate': 6.067222065483303e-06, 'epoch': 0.45} 45%|████▍ | 15377/34278 [16:55:03<17:06:18, 3.26s/it] 45%|████▍ | 15378/34278 [16:55:06<16:38:13, 3.17s/it] {'loss': 0.1281, 'grad_norm': 0.8178630689461366, 'learning_rate': 6.066760513709466e-06, 'epoch': 0.45} 45%|████▍ | 15378/34278 [16:55:06<16:38:13, 3.17s/it] 45%|████▍ | 15379/34278 [16:55:12<20:55:57, 3.99s/it] {'loss': 0.14, 'grad_norm': 0.7872626468120314, 'learning_rate': 6.066298952411846e-06, 'epoch': 0.45} 45%|████▍ | 15379/34278 [16:55:12<20:55:57, 3.99s/it] 45%|████▍ | 15380/34278 [16:55:16<21:06:19, 4.02s/it] {'loss': 0.1416, 'grad_norm': 0.9733597246995307, 'learning_rate': 6.065837381594563e-06, 'epoch': 0.45} 45%|████▍ | 15380/34278 [16:55:16<21:06:19, 4.02s/it] 45%|████▍ | 15381/34278 [16:55:20<21:29:21, 4.09s/it] {'loss': 0.153, 'grad_norm': 0.8273228500542338, 'learning_rate': 6.065375801261739e-06, 'epoch': 0.45} 45%|████▍ | 15381/34278 [16:55:20<21:29:21, 4.09s/it] 45%|████▍ | 15382/34278 [16:55:23<19:24:07, 3.70s/it] {'loss': 0.1465, 'grad_norm': 1.01081477485419, 'learning_rate': 6.064914211417495e-06, 'epoch': 0.45} 45%|████▍ | 15382/34278 [16:55:23<19:24:07, 3.70s/it] 45%|████▍ | 15383/34278 [16:55:26<19:22:41, 3.69s/it] {'loss': 0.1337, 'grad_norm': 0.7620129358509248, 'learning_rate': 6.06445261206595e-06, 'epoch': 0.45} 45%|████▍ | 15383/34278 [16:55:27<19:22:41, 3.69s/it] 45%|████▍ | 15384/34278 [16:55:29<18:11:29, 3.47s/it] {'loss': 0.1402, 'grad_norm': 0.875167098626487, 'learning_rate': 6.063991003211227e-06, 'epoch': 0.45} 45%|████▍ | 15384/34278 [16:55:29<18:11:29, 3.47s/it] 45%|████▍ | 15385/34278 [16:55:34<19:54:52, 3.79s/it] {'loss': 0.1427, 'grad_norm': 0.8485254629721719, 'learning_rate': 6.063529384857445e-06, 'epoch': 0.45} 45%|████▍ | 15385/34278 [16:55:34<19:54:52, 
3.79s/it] 45%|████▍ | 15386/34278 [16:55:39<21:25:22, 4.08s/it] {'loss': 0.1343, 'grad_norm': 0.874625989164001, 'learning_rate': 6.063067757008727e-06, 'epoch': 0.45} 45%|████▍ | 15386/34278 [16:55:39<21:25:22, 4.08s/it] 45%|████▍ | 15387/34278 [16:55:42<19:41:12, 3.75s/it] {'loss': 0.1499, 'grad_norm': 0.8749592803137143, 'learning_rate': 6.062606119669194e-06, 'epoch': 0.45} 45%|████▍ | 15387/34278 [16:55:42<19:41:12, 3.75s/it] 45%|████▍ | 15388/34278 [16:55:48<23:14:44, 4.43s/it] {'loss': 0.1678, 'grad_norm': 0.9409782433805018, 'learning_rate': 6.0621444728429675e-06, 'epoch': 0.45} 45%|████▍ | 15388/34278 [16:55:48<23:14:44, 4.43s/it] 45%|████▍ | 15389/34278 [16:55:51<21:27:59, 4.09s/it] {'loss': 0.1501, 'grad_norm': 0.9233812981784898, 'learning_rate': 6.061682816534169e-06, 'epoch': 0.45} 45%|████▍ | 15389/34278 [16:55:51<21:27:59, 4.09s/it] 45%|████▍ | 15390/34278 [16:55:54<20:09:28, 3.84s/it] {'loss': 0.1468, 'grad_norm': 0.9928736377413834, 'learning_rate': 6.061221150746919e-06, 'epoch': 0.45} 45%|████▍ | 15390/34278 [16:55:54<20:09:28, 3.84s/it] 45%|████▍ | 15391/34278 [16:55:57<18:27:43, 3.52s/it] {'loss': 0.1581, 'grad_norm': 0.8676091112081428, 'learning_rate': 6.060759475485341e-06, 'epoch': 0.45} 45%|████▍ | 15391/34278 [16:55:57<18:27:43, 3.52s/it] 45%|████▍ | 15392/34278 [16:56:00<17:36:40, 3.36s/it] {'loss': 0.1564, 'grad_norm': 0.8149235286502625, 'learning_rate': 6.060297790753555e-06, 'epoch': 0.45} 45%|████▍ | 15392/34278 [16:56:00<17:36:40, 3.36s/it] 45%|████▍ | 15393/34278 [16:56:03<17:10:15, 3.27s/it] {'loss': 0.1651, 'grad_norm': 0.7952708965899153, 'learning_rate': 6.059836096555682e-06, 'epoch': 0.45} 45%|████▍ | 15393/34278 [16:56:03<17:10:15, 3.27s/it] 45%|████▍ | 15394/34278 [16:56:06<16:44:51, 3.19s/it] {'loss': 0.1432, 'grad_norm': 0.774273371444286, 'learning_rate': 6.059374392895847e-06, 'epoch': 0.45} 45%|████▍ | 15394/34278 [16:56:06<16:44:51, 3.19s/it] 45%|████▍ | 15395/34278 [16:56:10<17:10:24, 3.27s/it] {'loss': 0.166, 
'grad_norm': 1.064555635610097, 'learning_rate': 6.0589126797781705e-06, 'epoch': 0.45} 45%|████▍ | 15395/34278 [16:56:10<17:10:24, 3.27s/it] 45%|████▍ | 15396/34278 [16:56:13<17:08:18, 3.27s/it] {'loss': 0.1344, 'grad_norm': 0.8378755092136079, 'learning_rate': 6.058450957206773e-06, 'epoch': 0.45} 45%|████▍ | 15396/34278 [16:56:13<17:08:18, 3.27s/it] 45%|████▍ | 15397/34278 [16:56:19<20:57:20, 4.00s/it] {'loss': 0.1393, 'grad_norm': 0.8085456034294426, 'learning_rate': 6.057989225185779e-06, 'epoch': 0.45} 45%|████▍ | 15397/34278 [16:56:19<20:57:20, 4.00s/it] 45%|████▍ | 15398/34278 [16:56:25<24:06:04, 4.60s/it] {'loss': 0.1318, 'grad_norm': 0.7754635374313775, 'learning_rate': 6.0575274837193096e-06, 'epoch': 0.45} 45%|████▍ | 15398/34278 [16:56:25<24:06:04, 4.60s/it] 45%|████▍ | 15399/34278 [16:56:28<22:17:51, 4.25s/it] {'loss': 0.1442, 'grad_norm': 0.8527890777382867, 'learning_rate': 6.057065732811488e-06, 'epoch': 0.45} 45%|████▍ | 15399/34278 [16:56:28<22:17:51, 4.25s/it] 45%|████▍ | 15400/34278 [16:56:31<19:54:57, 3.80s/it] {'loss': 0.1585, 'grad_norm': 0.9897797469581112, 'learning_rate': 6.056603972466435e-06, 'epoch': 0.45} 45%|████▍ | 15400/34278 [16:56:31<19:54:57, 3.80s/it] 45%|████▍ | 15401/34278 [16:56:35<20:48:59, 3.97s/it] {'loss': 0.169, 'grad_norm': 0.7479714463271162, 'learning_rate': 6.0561422026882735e-06, 'epoch': 0.45} 45%|████▍ | 15401/34278 [16:56:35<20:48:59, 3.97s/it] 45%|████▍ | 15402/34278 [16:56:38<19:38:52, 3.75s/it] {'loss': 0.1232, 'grad_norm': 0.8982032075909928, 'learning_rate': 6.0556804234811276e-06, 'epoch': 0.45} 45%|████▍ | 15402/34278 [16:56:38<19:38:52, 3.75s/it] 45%|████▍ | 15403/34278 [16:56:41<17:59:35, 3.43s/it] {'loss': 0.1442, 'grad_norm': 0.7703823612855346, 'learning_rate': 6.055218634849118e-06, 'epoch': 0.45} 45%|████▍ | 15403/34278 [16:56:41<17:59:35, 3.43s/it] 45%|████▍ | 15404/34278 [16:56:45<19:27:35, 3.71s/it] {'loss': 0.1059, 'grad_norm': 0.9600556990766124, 'learning_rate': 6.054756836796369e-06, 
'epoch': 0.45} 45%|████▍ | 15404/34278 [16:56:45<19:27:35, 3.71s/it] 45%|████▍ | 15405/34278 [16:56:50<21:30:03, 4.10s/it] {'loss': 0.1536, 'grad_norm': 1.0048974647856885, 'learning_rate': 6.054295029327002e-06, 'epoch': 0.45} 45%|████▍ | 15405/34278 [16:56:50<21:30:03, 4.10s/it] 45%|████▍ | 15406/34278 [16:56:54<21:00:15, 4.01s/it] {'loss': 0.1471, 'grad_norm': 0.7310083617419373, 'learning_rate': 6.053833212445141e-06, 'epoch': 0.45} 45%|████▍ | 15406/34278 [16:56:54<21:00:15, 4.01s/it] 45%|████▍ | 15407/34278 [16:56:58<21:13:25, 4.05s/it] {'loss': 0.1298, 'grad_norm': 0.7867274235152153, 'learning_rate': 6.05337138615491e-06, 'epoch': 0.45} 45%|████▍ | 15407/34278 [16:56:58<21:13:25, 4.05s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 45%|████▍ | 15408/34278 [16:57:03<22:15:05, 4.25s/it] {'loss': 0.1391, 'grad_norm': 0.7945308302893132, 'learning_rate': 6.052909550460429e-06, 'epoch': 0.45} 45%|████▍ | 15408/34278 [16:57:03<22:15:05, 4.25s/it] 45%|████▍ | 15409/34278 [16:57:09<24:29:01, 4.67s/it] {'loss': 0.1677, 'grad_norm': 0.8331005019096529, 'learning_rate': 6.052447705365824e-06, 'epoch': 0.45} 45%|████▍ | 15409/34278 [16:57:09<24:29:01, 4.67s/it] 45%|████▍ | 15410/34278 [16:57:15<26:55:14, 5.14s/it] {'loss': 0.1639, 'grad_norm': 0.763065824096036, 'learning_rate': 6.051985850875216e-06, 'epoch': 0.45} 45%|████▍ | 15410/34278 [16:57:15<26:55:14, 5.14s/it] 45%|████▍ | 15411/34278 [16:57:18<23:46:50, 4.54s/it] {'loss': 0.1118, 'grad_norm': 0.6980665358403694, 'learning_rate': 6.0515239869927285e-06, 'epoch': 0.45} 45%|████▍ | 15411/34278 [16:57:18<23:46:50, 4.54s/it] 45%|████▍ | 15412/34278 [16:57:22<22:10:22, 4.23s/it] {'loss': 0.1447, 'grad_norm': 0.8724358091090383, 'learning_rate': 6.051062113722489e-06, 'epoch': 0.45} 45%|████▍ | 15412/34278 [16:57:22<22:10:22, 4.23s/it] 
45%|████▍ | 15413/34278 [16:57:25<20:32:11, 3.92s/it] {'loss': 0.1302, 'grad_norm': 0.6836028132907334, 'learning_rate': 6.050600231068616e-06, 'epoch': 0.45} 45%|████▍ | 15413/34278 [16:57:25<20:32:11, 3.92s/it] 45%|████▍ | 15414/34278 [16:57:28<19:09:05, 3.65s/it] {'loss': 0.1333, 'grad_norm': 0.7380583610704688, 'learning_rate': 6.050138339035235e-06, 'epoch': 0.45} 45%|████▍ | 15414/34278 [16:57:28<19:09:05, 3.65s/it] 45%|████▍ | 15415/34278 [16:57:31<18:53:02, 3.60s/it] {'loss': 0.1405, 'grad_norm': 0.7250387437012618, 'learning_rate': 6.0496764376264705e-06, 'epoch': 0.45} 45%|████▍ | 15415/34278 [16:57:31<18:53:02, 3.60s/it] 45%|████▍ | 15416/34278 [16:57:35<18:25:10, 3.52s/it] {'loss': 0.147, 'grad_norm': 0.7031926994293658, 'learning_rate': 6.049214526846444e-06, 'epoch': 0.45} 45%|████▍ | 15416/34278 [16:57:35<18:25:10, 3.52s/it] 45%|████▍ | 15417/34278 [16:57:38<18:14:39, 3.48s/it] {'loss': 0.135, 'grad_norm': 0.8165137002136926, 'learning_rate': 6.048752606699282e-06, 'epoch': 0.45} 45%|████▍ | 15417/34278 [16:57:38<18:14:39, 3.48s/it] 45%|████▍ | 15418/34278 [16:57:41<17:26:23, 3.33s/it] {'loss': 0.1364, 'grad_norm': 0.7543408656527488, 'learning_rate': 6.048290677189106e-06, 'epoch': 0.45} 45%|████▍ | 15418/34278 [16:57:41<17:26:23, 3.33s/it] 45%|████▍ | 15419/34278 [16:57:44<17:01:41, 3.25s/it] {'loss': 0.1324, 'grad_norm': 0.6675726524186767, 'learning_rate': 6.047828738320041e-06, 'epoch': 0.45} 45%|████▍ | 15419/34278 [16:57:44<17:01:41, 3.25s/it] 45%|████▍ | 15420/34278 [16:57:48<18:40:56, 3.57s/it] {'loss': 0.169, 'grad_norm': 0.9602264692579462, 'learning_rate': 6.047366790096212e-06, 'epoch': 0.45} 45%|████▍ | 15420/34278 [16:57:48<18:40:56, 3.57s/it] 45%|████▍ | 15421/34278 [16:57:54<22:33:16, 4.31s/it] {'loss': 0.1487, 'grad_norm': 0.8502253876920634, 'learning_rate': 6.046904832521742e-06, 'epoch': 0.45} 45%|████▍ | 15421/34278 [16:57:54<22:33:16, 4.31s/it] 45%|████▍ | 15422/34278 [16:57:58<21:20:36, 4.07s/it] {'loss': 0.1191, 'grad_norm': 
0.7192279901324837, 'learning_rate': 6.046442865600756e-06, 'epoch': 0.45} 45%|████▍ | 15422/34278 [16:57:58<21:20:36, 4.07s/it] 45%|████▍ | 15423/34278 [16:58:01<20:07:37, 3.84s/it] {'loss': 0.1439, 'grad_norm': 0.9532467202734655, 'learning_rate': 6.0459808893373764e-06, 'epoch': 0.45} 45%|████▍ | 15423/34278 [16:58:01<20:07:37, 3.84s/it] 45%|████▍ | 15424/34278 [16:58:04<18:40:29, 3.57s/it] {'loss': 0.1374, 'grad_norm': 0.7877536771062995, 'learning_rate': 6.045518903735731e-06, 'epoch': 0.45} 45%|████▍ | 15424/34278 [16:58:04<18:40:29, 3.57s/it] 45%|████▍ | 15425/34278 [16:58:07<17:41:50, 3.38s/it] {'loss': 0.1274, 'grad_norm': 0.763803614463936, 'learning_rate': 6.045056908799941e-06, 'epoch': 0.45} 45%|████▍ | 15425/34278 [16:58:07<17:41:50, 3.38s/it] 45%|████▌ | 15426/34278 [16:58:13<21:44:20, 4.15s/it] {'loss': 0.1227, 'grad_norm': 0.684652609615203, 'learning_rate': 6.044594904534132e-06, 'epoch': 0.45} 45%|████▌ | 15426/34278 [16:58:13<21:44:20, 4.15s/it] 45%|████▌ | 15427/34278 [16:58:16<20:05:17, 3.84s/it] {'loss': 0.1402, 'grad_norm': 0.9331517680282322, 'learning_rate': 6.044132890942432e-06, 'epoch': 0.45} 45%|████▌ | 15427/34278 [16:58:16<20:05:17, 3.84s/it] 45%|████▌ | 15428/34278 [16:58:20<20:02:52, 3.83s/it] {'loss': 0.1065, 'grad_norm': 0.8575296634774847, 'learning_rate': 6.04367086802896e-06, 'epoch': 0.45} 45%|████▌ | 15428/34278 [16:58:20<20:02:52, 3.83s/it] 45%|████▌ | 15429/34278 [16:58:24<20:39:48, 3.95s/it] {'loss': 0.1389, 'grad_norm': 0.8340571847404953, 'learning_rate': 6.043208835797845e-06, 'epoch': 0.45} 45%|████▌ | 15429/34278 [16:58:24<20:39:48, 3.95s/it] 45%|████▌ | 15430/34278 [16:58:28<19:50:44, 3.79s/it] {'loss': 0.1324, 'grad_norm': 0.805375479888196, 'learning_rate': 6.042746794253209e-06, 'epoch': 0.45} 45%|████▌ | 15430/34278 [16:58:28<19:50:44, 3.79s/it] 45%|████▌ | 15431/34278 [16:58:31<19:46:33, 3.78s/it] {'loss': 0.1489, 'grad_norm': 0.7207822573273216, 'learning_rate': 6.0422847433991795e-06, 'epoch': 0.45} 45%|████▌ 
| 15431/34278 [16:58:31<19:46:33, 3.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 45%|████▌ | 15432/34278 [16:58:34<18:38:28, 3.56s/it] {'loss': 0.1296, 'grad_norm': 0.911818155425748, 'learning_rate': 6.041822683239881e-06, 'epoch': 0.45} 45%|████▌ | 15432/34278 [16:58:34<18:38:28, 3.56s/it] 45%|████▌ | 15433/34278 [16:58:38<19:26:12, 3.71s/it] {'loss': 0.1305, 'grad_norm': 0.674039306822005, 'learning_rate': 6.041360613779438e-06, 'epoch': 0.45} 45%|████▌ | 15433/34278 [16:58:38<19:26:12, 3.71s/it] 45%|████▌ | 15434/34278 [16:58:42<19:19:03, 3.69s/it] {'loss': 0.1427, 'grad_norm': 0.8040425280619382, 'learning_rate': 6.040898535021975e-06, 'epoch': 0.45} 45%|████▌ | 15434/34278 [16:58:42<19:19:03, 3.69s/it] 45%|████▌ | 15435/34278 [16:58:46<19:33:06, 3.74s/it] {'loss': 0.1344, 'grad_norm': 0.8042811515528271, 'learning_rate': 6.040436446971619e-06, 'epoch': 0.45} 45%|████▌ | 15435/34278 [16:58:46<19:33:06, 3.74s/it] 45%|████▌ | 15436/34278 [16:58:51<22:21:20, 4.27s/it] {'loss': 0.1272, 'grad_norm': 0.8620850881804475, 'learning_rate': 6.039974349632496e-06, 'epoch': 0.45} 45%|████▌ | 15436/34278 [16:58:51<22:21:20, 4.27s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 45%|████▌ | 15437/34278 [16:58:55<21:58:09, 4.20s/it] {'loss': 0.1622, 'grad_norm': 0.842513368101869, 'learning_rate': 6.03951224300873e-06, 'epoch': 0.45} 45%|████▌ | 15437/34278 [16:58:55<21:58:09, 4.20s/it] 45%|████▌ | 15438/34278 [16:58:59<20:47:39, 3.97s/it] {'loss': 0.1204, 'grad_norm': 0.7177630781376846, 'learning_rate': 6.0390501271044455e-06, 'epoch': 0.45} 45%|████▌ | 15438/34278 [16:58:59<20:47:39, 3.97s/it] 45%|████▌ | 15439/34278 [16:59:02<19:56:22, 3.81s/it] {'loss': 0.1323, 'grad_norm': 0.7727972733442917, 'learning_rate': 6.038588001923771e-06, 'epoch': 0.45} 45%|████▌ | 15439/34278 [16:59:02<19:56:22, 3.81s/it] 45%|████▌ | 15440/34278 [16:59:08<23:20:06, 4.46s/it] {'loss': 0.1363, 'grad_norm': 1.0554661299495012, 'learning_rate': 6.03812586747083e-06, 'epoch': 0.45} 45%|████▌ | 15440/34278 [16:59:08<23:20:06, 4.46s/it] 45%|████▌ | 15441/34278 [16:59:12<21:19:37, 4.08s/it] {'loss': 0.1341, 'grad_norm': 0.7942799378197438, 'learning_rate': 6.0376637237497474e-06, 'epoch': 0.45} 45%|████▌ | 15441/34278 [16:59:12<21:19:37, 4.08s/it] 45%|████▌ | 15442/34278 [16:59:15<19:49:00, 3.79s/it] {'loss': 0.1325, 'grad_norm': 0.8396266384549794, 'learning_rate': 6.037201570764654e-06, 'epoch': 0.45} 45%|████▌ | 15442/34278 [16:59:15<19:49:00, 3.79s/it] 45%|████▌ | 15443/34278 [16:59:19<20:02:23, 3.83s/it] {'loss': 0.1034, 'grad_norm': 0.805519465081484, 'learning_rate': 6.036739408519671e-06, 'epoch': 0.45} 45%|████▌ | 15443/34278 [16:59:19<20:02:23, 3.83s/it] 45%|████▌ | 15444/34278 [16:59:22<19:04:34, 3.65s/it] {'loss': 0.119, 'grad_norm': 0.8864674384806074, 'learning_rate': 6.036277237018926e-06, 'epoch': 0.45} 45%|████▌ | 15444/34278 [16:59:22<19:04:34, 3.65s/it] 45%|████▌ | 15445/34278 [16:59:25<17:55:27, 3.43s/it] {'loss': 0.1416, 'grad_norm': 0.9880335570734815, 'learning_rate': 6.0358150562665455e-06, 'epoch': 0.45} 45%|████▌ | 15445/34278 [16:59:25<17:55:27, 3.43s/it] 45%|████▌ | 15446/34278 [16:59:29<18:51:01, 
3.60s/it] {'loss': 0.1648, 'grad_norm': 1.2902869860420179, 'learning_rate': 6.035352866266655e-06, 'epoch': 0.45} 45%|████▌ | 15446/34278 [16:59:29<18:51:01, 3.60s/it] 45%|████▌ | 15447/34278 [16:59:35<22:36:49, 4.32s/it] {'loss': 0.1499, 'grad_norm': 1.052955135958026, 'learning_rate': 6.034890667023381e-06, 'epoch': 0.45} 45%|████▌ | 15447/34278 [16:59:35<22:36:49, 4.32s/it] 45%|████▌ | 15448/34278 [16:59:38<20:20:51, 3.89s/it] {'loss': 0.1356, 'grad_norm': 0.9574731079353955, 'learning_rate': 6.034428458540851e-06, 'epoch': 0.45} 45%|████▌ | 15448/34278 [16:59:38<20:20:51, 3.89s/it] 45%|████▌ | 15449/34278 [16:59:40<18:37:24, 3.56s/it] {'loss': 0.1404, 'grad_norm': 1.2152176449831125, 'learning_rate': 6.03396624082319e-06, 'epoch': 0.45} 45%|████▌ | 15449/34278 [16:59:40<18:37:24, 3.56s/it] 45%|████▌ | 15450/34278 [16:59:44<18:03:38, 3.45s/it] {'loss': 0.1334, 'grad_norm': 1.2169092351149746, 'learning_rate': 6.033504013874525e-06, 'epoch': 0.45} 45%|████▌ | 15450/34278 [16:59:44<18:03:38, 3.45s/it] 45%|████▌ | 15451/34278 [16:59:50<22:06:07, 4.23s/it] {'loss': 0.1448, 'grad_norm': 0.7316672657289102, 'learning_rate': 6.033041777698983e-06, 'epoch': 0.45} 45%|████▌ | 15451/34278 [16:59:50<22:06:07, 4.23s/it] 45%|████▌ | 15452/34278 [16:59:53<20:03:00, 3.83s/it] {'loss': 0.1356, 'grad_norm': 0.8576953274679225, 'learning_rate': 6.032579532300693e-06, 'epoch': 0.45} 45%|████▌ | 15452/34278 [16:59:53<20:03:00, 3.83s/it] 45%|████▌ | 15453/34278 [16:59:55<18:32:36, 3.55s/it] {'loss': 0.1185, 'grad_norm': 1.192208792342409, 'learning_rate': 6.032117277683776e-06, 'epoch': 0.45} 45%|████▌ | 15453/34278 [16:59:55<18:32:36, 3.55s/it] 45%|████▌ | 15454/34278 [16:59:59<18:29:02, 3.53s/it] {'loss': 0.1477, 'grad_norm': 0.6967262956605862, 'learning_rate': 6.0316550138523646e-06, 'epoch': 0.45} 45%|████▌ | 15454/34278 [16:59:59<18:29:02, 3.53s/it] 45%|████▌ | 15455/34278 [17:00:04<20:58:04, 4.01s/it] {'loss': 0.1276, 'grad_norm': 0.6481728147418251, 'learning_rate': 
6.031192740810583e-06, 'epoch': 0.45} 45%|████▌ | 15455/34278 [17:00:04<20:58:04, 4.01s/it] 45%|████▌ | 15456/34278 [17:00:07<19:20:22, 3.70s/it] {'loss': 0.117, 'grad_norm': 0.8024521820210737, 'learning_rate': 6.030730458562557e-06, 'epoch': 0.45} 45%|████▌ | 15456/34278 [17:00:07<19:20:22, 3.70s/it] 45%|████▌ | 15457/34278 [17:00:13<22:51:45, 4.37s/it] {'loss': 0.1476, 'grad_norm': 0.9684911976346853, 'learning_rate': 6.030268167112419e-06, 'epoch': 0.45} 45%|████▌ | 15457/34278 [17:00:13<22:51:45, 4.37s/it] 45%|████▌ | 15458/34278 [17:00:17<22:10:04, 4.24s/it] {'loss': 0.1215, 'grad_norm': 0.7871994175419291, 'learning_rate': 6.02980586646429e-06, 'epoch': 0.45} 45%|████▌ | 15458/34278 [17:00:17<22:10:04, 4.24s/it] 45%|████▌ | 15459/34278 [17:00:23<25:31:55, 4.88s/it] {'loss': 0.134, 'grad_norm': 0.8654542181243517, 'learning_rate': 6.0293435566223e-06, 'epoch': 0.45} 45%|████▌ | 15459/34278 [17:00:23<25:31:55, 4.88s/it] 45%|████▌ | 15460/34278 [17:00:27<23:03:34, 4.41s/it] {'loss': 0.1403, 'grad_norm': 1.2935168187353963, 'learning_rate': 6.028881237590578e-06, 'epoch': 0.45} 45%|████▌ | 15460/34278 [17:00:27<23:03:34, 4.41s/it] 45%|████▌ | 15461/34278 [17:00:30<20:54:35, 4.00s/it] {'loss': 0.135, 'grad_norm': 0.9120213480978883, 'learning_rate': 6.028418909373249e-06, 'epoch': 0.45} 45%|████▌ | 15461/34278 [17:00:30<20:54:35, 4.00s/it] 45%|████▌ | 15462/34278 [17:00:33<19:24:17, 3.71s/it] {'loss': 0.1437, 'grad_norm': 0.8344562021187523, 'learning_rate': 6.027956571974442e-06, 'epoch': 0.45} 45%|████▌ | 15462/34278 [17:00:33<19:24:17, 3.71s/it] 45%|████▌ | 15463/34278 [17:00:36<18:49:43, 3.60s/it] {'loss': 0.1398, 'grad_norm': 0.8154161122264343, 'learning_rate': 6.0274942253982825e-06, 'epoch': 0.45} 45%|████▌ | 15463/34278 [17:00:36<18:49:43, 3.60s/it] 45%|████▌ | 15464/34278 [17:00:41<20:18:29, 3.89s/it] {'loss': 0.1248, 'grad_norm': 0.9741081672251926, 'learning_rate': 6.027031869648901e-06, 'epoch': 0.45} 45%|████▌ | 15464/34278 [17:00:41<20:18:29, 
3.89s/it] 45%|████▌ | 15465/34278 [17:00:44<19:10:01, 3.67s/it] {'loss': 0.1573, 'grad_norm': 0.8620278803679651, 'learning_rate': 6.026569504730425e-06, 'epoch': 0.45} 45%|████▌ | 15465/34278 [17:00:44<19:10:01, 3.67s/it] 45%|████▌ | 15466/34278 [17:00:47<18:28:48, 3.54s/it] {'loss': 0.1241, 'grad_norm': 0.9282741921915446, 'learning_rate': 6.026107130646981e-06, 'epoch': 0.45} 45%|████▌ | 15466/34278 [17:00:47<18:28:48, 3.54s/it] 45%|████▌ | 15467/34278 [17:00:51<18:43:23, 3.58s/it] {'loss': 0.1425, 'grad_norm': 1.0156400767179663, 'learning_rate': 6.025644747402698e-06, 'epoch': 0.45} 45%|████▌ | 15467/34278 [17:00:51<18:43:23, 3.58s/it] 45%|████▌ | 15468/34278 [17:00:54<18:31:22, 3.55s/it] {'loss': 0.1329, 'grad_norm': 0.9510747137714097, 'learning_rate': 6.025182355001702e-06, 'epoch': 0.45} 45%|████▌ | 15468/34278 [17:00:54<18:31:22, 3.55s/it] 45%|████▌ | 15469/34278 [17:00:57<18:16:55, 3.50s/it] {'loss': 0.1367, 'grad_norm': 0.7098683350119727, 'learning_rate': 6.024719953448124e-06, 'epoch': 0.45} 45%|████▌ | 15469/34278 [17:00:57<18:16:55, 3.50s/it] 45%|████▌ | 15470/34278 [17:01:00<17:27:52, 3.34s/it] {'loss': 0.1586, 'grad_norm': 0.9384021723867909, 'learning_rate': 6.02425754274609e-06, 'epoch': 0.45} 45%|████▌ | 15470/34278 [17:01:00<17:27:52, 3.34s/it] 45%|████▌ | 15471/34278 [17:01:03<16:38:05, 3.18s/it] {'loss': 0.1176, 'grad_norm': 1.0093393459392084, 'learning_rate': 6.023795122899729e-06, 'epoch': 0.45} 45%|████▌ | 15471/34278 [17:01:03<16:38:05, 3.18s/it] 45%|████▌ | 15472/34278 [17:01:08<18:36:20, 3.56s/it] {'loss': 0.1392, 'grad_norm': 0.736763983910399, 'learning_rate': 6.023332693913171e-06, 'epoch': 0.45} 45%|████▌ | 15472/34278 [17:01:08<18:36:20, 3.56s/it] 45%|████▌ | 15473/34278 [17:01:11<18:01:21, 3.45s/it] {'loss': 0.1444, 'grad_norm': 1.0968694045293141, 'learning_rate': 6.0228702557905415e-06, 'epoch': 0.45} 45%|████▌ | 15473/34278 [17:01:11<18:01:21, 3.45s/it] 45%|████▌ | 15474/34278 [17:01:14<17:54:41, 3.43s/it] {'loss': 0.142, 
'grad_norm': 1.007510007736189, 'learning_rate': 6.022407808535972e-06, 'epoch': 0.45} 45%|████▌ | 15474/34278 [17:01:14<17:54:41, 3.43s/it] 45%|████▌ | 15475/34278 [17:01:17<17:18:49, 3.31s/it] {'loss': 0.1291, 'grad_norm': 0.7977697991769906, 'learning_rate': 6.0219453521535875e-06, 'epoch': 0.45} 45%|████▌ | 15475/34278 [17:01:17<17:18:49, 3.31s/it] 45%|████▌ | 15476/34278 [17:01:20<16:59:51, 3.25s/it] {'loss': 0.1555, 'grad_norm': 0.9444930378608256, 'learning_rate': 6.021482886647521e-06, 'epoch': 0.45} 45%|████▌ | 15476/34278 [17:01:20<16:59:51, 3.25s/it] 45%|████▌ | 15477/34278 [17:01:24<17:10:35, 3.29s/it] {'loss': 0.1339, 'grad_norm': 0.7666298926815919, 'learning_rate': 6.021020412021897e-06, 'epoch': 0.45} 45%|████▌ | 15477/34278 [17:01:24<17:10:35, 3.29s/it] 45%|████▌ | 15478/34278 [17:01:27<16:56:47, 3.25s/it] {'loss': 0.1174, 'grad_norm': 0.9371666251757395, 'learning_rate': 6.020557928280848e-06, 'epoch': 0.45} 45%|████▌ | 15478/34278 [17:01:27<16:56:47, 3.25s/it] 45%|████▌ | 15479/34278 [17:01:30<17:05:32, 3.27s/it] {'loss': 0.1194, 'grad_norm': 0.7594950748927057, 'learning_rate': 6.020095435428501e-06, 'epoch': 0.45} 45%|████▌ | 15479/34278 [17:01:30<17:05:32, 3.27s/it] 45%|████▌ | 15480/34278 [17:01:34<17:15:58, 3.31s/it] {'loss': 0.1396, 'grad_norm': 0.8172497911967067, 'learning_rate': 6.019632933468986e-06, 'epoch': 0.45} 45%|████▌ | 15480/34278 [17:01:34<17:15:58, 3.31s/it] 45%|████▌ | 15481/34278 [17:01:40<21:47:01, 4.17s/it] {'loss': 0.1405, 'grad_norm': 0.7670550947442615, 'learning_rate': 6.0191704224064305e-06, 'epoch': 0.45} 45%|████▌ | 15481/34278 [17:01:40<21:47:01, 4.17s/it] 45%|████▌ | 15482/34278 [17:01:43<19:55:50, 3.82s/it] {'loss': 0.1279, 'grad_norm': 0.7604196341090894, 'learning_rate': 6.018707902244967e-06, 'epoch': 0.45} 45%|████▌ | 15482/34278 [17:01:43<19:55:50, 3.82s/it] 45%|████▌ | 15483/34278 [17:01:46<19:29:54, 3.73s/it] {'loss': 0.1349, 'grad_norm': 0.8268719505091074, 'learning_rate': 6.0182453729887205e-06, 
'epoch': 0.45} 45%|████▌ | 15483/34278 [17:01:46<19:29:54, 3.73s/it] 45%|████▌ | 15484/34278 [17:01:51<20:27:45, 3.92s/it] {'loss': 0.1178, 'grad_norm': 0.7436341402224729, 'learning_rate': 6.0177828346418235e-06, 'epoch': 0.45} 45%|████▌ | 15484/34278 [17:01:51<20:27:45, 3.92s/it] 45%|████▌ | 15485/34278 [17:01:54<20:02:53, 3.84s/it] {'loss': 0.1266, 'grad_norm': 0.8496856208825444, 'learning_rate': 6.0173202872084035e-06, 'epoch': 0.45} 45%|████▌ | 15485/34278 [17:01:54<20:02:53, 3.84s/it] 45%|████▌ | 15486/34278 [17:01:58<18:58:12, 3.63s/it] {'loss': 0.1241, 'grad_norm': 0.7986128202583314, 'learning_rate': 6.01685773069259e-06, 'epoch': 0.45} 45%|████▌ | 15486/34278 [17:01:58<18:58:12, 3.63s/it] 45%|████▌ | 15487/34278 [17:02:01<18:21:35, 3.52s/it] {'loss': 0.1323, 'grad_norm': 0.7725657820741384, 'learning_rate': 6.016395165098516e-06, 'epoch': 0.45} 45%|████▌ | 15487/34278 [17:02:01<18:21:35, 3.52s/it] 45%|████▌ | 15488/34278 [17:02:04<18:11:16, 3.48s/it] {'loss': 0.1361, 'grad_norm': 0.7673420715874265, 'learning_rate': 6.0159325904303064e-06, 'epoch': 0.45} 45%|████▌ | 15488/34278 [17:02:04<18:11:16, 3.48s/it] 45%|████▌ | 15489/34278 [17:02:09<20:27:39, 3.92s/it] {'loss': 0.1518, 'grad_norm': 0.8167424355679127, 'learning_rate': 6.015470006692095e-06, 'epoch': 0.45} 45%|████▌ | 15489/34278 [17:02:09<20:27:39, 3.92s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 45%|████▌ | 15490/34278 [17:02:13<19:34:20, 3.75s/it] {'loss': 0.1391, 'grad_norm': 0.9149063800333057, 'learning_rate': 6.015007413888008e-06, 'epoch': 0.45} 45%|████▌ | 15490/34278 [17:02:13<19:34:20, 3.75s/it] 45%|████▌ | 15491/34278 [17:02:16<18:57:12, 3.63s/it] {'loss': 0.1487, 'grad_norm': 0.7660604220663534, 'learning_rate': 6.014544812022177e-06, 'epoch': 0.45} 45%|████▌ | 15491/34278 [17:02:16<18:57:12, 3.63s/it] 45%|████▌ | 15492/34278 [17:02:19<18:23:16, 3.52s/it] {'loss': 0.13, 'grad_norm': 0.7932425885698281, 'learning_rate': 6.014082201098733e-06, 'epoch': 0.45} 45%|████▌ | 15492/34278 [17:02:19<18:23:16, 3.52s/it] 45%|████▌ | 15493/34278 [17:02:23<19:16:44, 3.69s/it] {'loss': 0.1417, 'grad_norm': 0.8012751187484937, 'learning_rate': 6.013619581121806e-06, 'epoch': 0.45} 45%|████▌ | 15493/34278 [17:02:23<19:16:44, 3.69s/it] 45%|████▌ | 15494/34278 [17:02:26<18:08:12, 3.48s/it] {'loss': 0.1362, 'grad_norm': 0.8094050479653967, 'learning_rate': 6.013156952095523e-06, 'epoch': 0.45} 45%|████▌ | 15494/34278 [17:02:26<18:08:12, 3.48s/it] 45%|████▌ | 15495/34278 [17:02:29<17:33:27, 3.37s/it] {'loss': 0.1355, 'grad_norm': 0.8730183129639572, 'learning_rate': 6.012694314024018e-06, 'epoch': 0.45} 45%|████▌ | 15495/34278 [17:02:29<17:33:27, 3.37s/it] 45%|████▌ | 15496/34278 [17:02:32<16:41:43, 3.20s/it] {'loss': 0.1276, 'grad_norm': 0.7518413162940071, 'learning_rate': 6.01223166691142e-06, 'epoch': 0.45} 45%|████▌ | 15496/34278 [17:02:32<16:41:43, 3.20s/it] 45%|████▌ | 15497/34278 [17:02:38<21:13:27, 4.07s/it] {'loss': 0.13, 'grad_norm': 0.855937178667111, 'learning_rate': 6.011769010761861e-06, 'epoch': 0.45} 45%|████▌ | 15497/34278 [17:02:38<21:13:27, 4.07s/it] 45%|████▌ | 15498/34278 [17:02:42<21:27:10, 4.11s/it] {'loss': 0.1356, 'grad_norm': 0.7668697609911654, 'learning_rate': 6.011306345579466e-06, 'epoch': 0.45} 45%|████▌ | 15498/34278 [17:02:42<21:27:10, 4.11s/it] 45%|████▌ | 15499/34278 [17:02:47<21:27:11, 
4.11s/it] {'loss': 0.1378, 'grad_norm': 0.8047188374888111, 'learning_rate': 6.010843671368373e-06, 'epoch': 0.45} 45%|████▌ | 15499/34278 [17:02:47<21:27:11, 4.11s/it] 45%|████▌ | 15500/34278 [17:02:50<19:41:27, 3.78s/it] {'loss': 0.1479, 'grad_norm': 0.8493618000262269, 'learning_rate': 6.0103809881327065e-06, 'epoch': 0.45} 45%|████▌ | 15500/34278 [17:02:50<19:41:27, 3.78s/it] 45%|████▌ | 15501/34278 [17:02:53<18:28:40, 3.54s/it] {'loss': 0.1301, 'grad_norm': 0.6987696427403324, 'learning_rate': 6.0099182958766e-06, 'epoch': 0.45} 45%|████▌ | 15501/34278 [17:02:53<18:28:40, 3.54s/it] 45%|████▌ | 15502/34278 [17:02:56<18:19:54, 3.51s/it] {'loss': 0.1226, 'grad_norm': 0.6967339943179401, 'learning_rate': 6.0094555946041855e-06, 'epoch': 0.45} 45%|████▌ | 15502/34278 [17:02:56<18:19:54, 3.51s/it] 45%|████▌ | 15503/34278 [17:03:02<22:27:11, 4.31s/it] {'loss': 0.154, 'grad_norm': 0.7669631552971107, 'learning_rate': 6.008992884319591e-06, 'epoch': 0.45} 45%|████▌ | 15503/34278 [17:03:02<22:27:11, 4.31s/it] 45%|████▌ | 15504/34278 [17:03:07<22:39:53, 4.35s/it] {'loss': 0.1545, 'grad_norm': 0.6731993615570382, 'learning_rate': 6.00853016502695e-06, 'epoch': 0.45} 45%|████▌ | 15504/34278 [17:03:07<22:39:53, 4.35s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn(

[steps 15505–15549/34278 (45%), 17:03:10–17:06:01, ~3.3–4.7 s/it]
  loss ≈ 0.117–0.171 | grad_norm ≈ 0.68–1.29 (spike 7.22 at step 15505) | lr 6.0081e-06 → 5.9877e-06 | epoch 0.45

Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd62f6f6ac0>
Failed to fetch sample 3655707. Exception: cannot identify image file <_io.BytesIO object at 0x7fd62f6f6ac0>

[steps 15550–15558/34278 (45%), 17:06:04–17:06:37, ~3.7–4.9 s/it]
  loss ≈ 0.108–0.155 | grad_norm ≈ 0.63–1.15 | lr 5.9872e-06 → 5.9835e-06 | epoch 0.45
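The "Failed to fetch sample … Exception: …" lines after the traceback indicate the dataset catches the decode error and falls back to another sample rather than crashing the run. A minimal sketch of that pattern, under stated assumptions: `SafeImageDataset`, `records`, and the `image_bytes`/`label` fields are hypothetical stand-ins (the real class subclasses `torch.utils.data.Dataset` and pulls bytes from Ceph via `tcs_loader`):

```python
import io
import random

from PIL import Image, UnidentifiedImageError


class SafeImageDataset:
    """Map-style dataset (hypothetical sketch) that skips samples whose
    image bytes cannot be decoded, retrying a random replacement sample.
    In the real trainer this would subclass torch.utils.data.Dataset."""

    def __init__(self, records, max_retries=10):
        # records: list of dicts with raw image bytes under "image_bytes"
        self.records = records
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def _get_item(self, i):
        record = self.records[i]
        # PIL raises UnidentifiedImageError when the bytes are corrupt/truncated
        img = Image.open(io.BytesIO(record["image_bytes"]))
        img.load()  # force full decode so truncation surfaces here, not mid-batch
        return {"image": img.convert("RGB"), "label": record["label"]}

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except (UnidentifiedImageError, OSError) as e:
                # Mirror the log's behavior: report and retry a different sample
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randrange(len(self.records))
        raise RuntimeError(f"no decodable sample found after {self.max_retries} retries")
```

Retrying a random index (rather than `i + 1`) keeps the effective sampling distribution close to uniform even when corrupt samples cluster together in storage.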
[steps 15559–15649/34278 (45–46%), 17:06:41–17:12:15, ~3.2–5.3 s/it]
  loss ≈ 0.107–0.179 | grad_norm ≈ 0.60–1.32 | lr 5.9831e-06 → 5.9413e-06 | epoch 0.45 → 0.46 (from step 15597)

Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f16a36b60c0>
Failed to fetch sample 3291476. Exception: cannot identify image file <_io.BytesIO object at 0x7f16a36b60c0>

[steps 15650–15664/34278 (46%), 17:12:19–17:13:13, ~3.4–4.4 s/it]
  loss ≈ 0.103–0.164 | lr 5.9409e-06 → 5.9344e-06 | epoch 0.46 (log truncated mid-entry at step 15664)
'epoch': 0.46} 46%|████▌ | 15664/34278 [17:13:13<18:49:40, 3.64s/it] 46%|████▌ | 15665/34278 [17:13:16<17:46:20, 3.44s/it] {'loss': 0.1508, 'grad_norm': 0.9438652048879773, 'learning_rate': 5.933917814306958e-06, 'epoch': 0.46} 46%|████▌ | 15665/34278 [17:13:16<17:46:20, 3.44s/it] 46%|████▌ | 15666/34278 [17:13:19<17:20:15, 3.35s/it] {'loss': 0.1412, 'grad_norm': 0.9742379780115008, 'learning_rate': 5.933453690308734e-06, 'epoch': 0.46} 46%|████▌ | 15666/34278 [17:13:19<17:20:15, 3.35s/it] 46%|████▌ | 15667/34278 [17:13:25<21:26:58, 4.15s/it] {'loss': 0.1339, 'grad_norm': 0.8079002900725311, 'learning_rate': 5.93298955797686e-06, 'epoch': 0.46} 46%|████▌ | 15667/34278 [17:13:25<21:26:58, 4.15s/it] 46%|████▌ | 15668/34278 [17:13:30<22:30:35, 4.35s/it] {'loss': 0.1351, 'grad_norm': 1.1181298450483161, 'learning_rate': 5.9325254173154754e-06, 'epoch': 0.46} 46%|████▌ | 15668/34278 [17:13:30<22:30:35, 4.35s/it] 46%|████▌ | 15669/34278 [17:13:35<24:06:53, 4.67s/it] {'loss': 0.126, 'grad_norm': 1.2047272065936931, 'learning_rate': 5.932061268328729e-06, 'epoch': 0.46} 46%|████▌ | 15669/34278 [17:13:35<24:06:53, 4.67s/it] 46%|████▌ | 15670/34278 [17:13:40<24:19:43, 4.71s/it] {'loss': 0.1295, 'grad_norm': 0.9596218433201393, 'learning_rate': 5.931597111020762e-06, 'epoch': 0.46} 46%|████▌ | 15670/34278 [17:13:40<24:19:43, 4.71s/it] 46%|████▌ | 15671/34278 [17:13:44<22:52:14, 4.42s/it] {'loss': 0.1467, 'grad_norm': 1.0413839676500838, 'learning_rate': 5.931132945395717e-06, 'epoch': 0.46} 46%|████▌ | 15671/34278 [17:13:44<22:52:14, 4.42s/it] 46%|████▌ | 15672/34278 [17:13:47<21:32:53, 4.17s/it] {'loss': 0.1244, 'grad_norm': 1.3019729448171455, 'learning_rate': 5.930668771457739e-06, 'epoch': 0.46} 46%|████▌ | 15672/34278 [17:13:47<21:32:53, 4.17s/it] 46%|████▌ | 15673/34278 [17:13:52<22:40:07, 4.39s/it] {'loss': 0.1481, 'grad_norm': 0.8658591039474676, 'learning_rate': 5.930204589210974e-06, 'epoch': 0.46} 46%|████▌ | 15673/34278 [17:13:52<22:40:07, 4.39s/it] 46%|████▌ | 
15674/34278 [17:13:56<21:29:43, 4.16s/it] {'loss': 0.1408, 'grad_norm': 0.7759193174158013, 'learning_rate': 5.929740398659563e-06, 'epoch': 0.46} 46%|████▌ | 15674/34278 [17:13:56<21:29:43, 4.16s/it] 46%|████▌ | 15675/34278 [17:14:00<20:55:48, 4.05s/it] {'loss': 0.151, 'grad_norm': 0.8641358135923433, 'learning_rate': 5.929276199807652e-06, 'epoch': 0.46} 46%|████▌ | 15675/34278 [17:14:00<20:55:48, 4.05s/it] 46%|████▌ | 15676/34278 [17:14:03<19:59:41, 3.87s/it] {'loss': 0.1453, 'grad_norm': 0.7434838348542243, 'learning_rate': 5.928811992659386e-06, 'epoch': 0.46} 46%|████▌ | 15676/34278 [17:14:03<19:59:41, 3.87s/it] 46%|████▌ | 15677/34278 [17:14:06<18:59:17, 3.67s/it] {'loss': 0.1489, 'grad_norm': 0.9680447752688965, 'learning_rate': 5.928347777218907e-06, 'epoch': 0.46} 46%|████▌ | 15677/34278 [17:14:06<18:59:17, 3.67s/it] 46%|████▌ | 15678/34278 [17:14:12<22:39:44, 4.39s/it] {'loss': 0.1265, 'grad_norm': 0.7862911694994094, 'learning_rate': 5.927883553490361e-06, 'epoch': 0.46} 46%|████▌ | 15678/34278 [17:14:12<22:39:44, 4.39s/it] 46%|████▌ | 15679/34278 [17:14:17<23:21:53, 4.52s/it] {'loss': 0.1559, 'grad_norm': 1.045671477489225, 'learning_rate': 5.927419321477893e-06, 'epoch': 0.46} 46%|████▌ | 15679/34278 [17:14:17<23:21:53, 4.52s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
46%|████▌ | 15680/34278 [17:14:21<21:46:51, 4.22s/it] {'loss': 0.1695, 'grad_norm': 0.8921272670391884, 'learning_rate': 5.926955081185646e-06, 'epoch': 0.46} 46%|████▌ | 15680/34278 [17:14:21<21:46:51, 4.22s/it] 46%|████▌ | 15681/34278 [17:14:24<19:52:37, 3.85s/it] {'loss': 0.1307, 'grad_norm': 0.7146774925535057, 'learning_rate': 5.926490832617764e-06, 'epoch': 0.46} 46%|████▌ | 15681/34278 [17:14:24<19:52:37, 3.85s/it] 46%|████▌ | 15682/34278 [17:14:28<20:16:07, 3.92s/it] {'loss': 0.1105, 'grad_norm': 0.9267503736800597, 'learning_rate': 5.926026575778396e-06, 'epoch': 0.46} 46%|████▌ | 15682/34278 [17:14:28<20:16:07, 3.92s/it] 46%|████▌ | 15683/34278 [17:14:31<19:13:36, 3.72s/it] {'loss': 0.1403, 'grad_norm': 0.7048558503763448, 'learning_rate': 5.9255623106716805e-06, 'epoch': 0.46} 46%|████▌ | 15683/34278 [17:14:31<19:13:36, 3.72s/it] 46%|████▌ | 15684/34278 [17:14:34<18:07:26, 3.51s/it] {'loss': 0.1727, 'grad_norm': 0.9110532148569493, 'learning_rate': 5.925098037301769e-06, 'epoch': 0.46} 46%|████▌ | 15684/34278 [17:14:34<18:07:26, 3.51s/it] 46%|████▌ | 15685/34278 [17:14:37<17:53:48, 3.47s/it] {'loss': 0.1407, 'grad_norm': 1.1119534675833322, 'learning_rate': 5.9246337556728005e-06, 'epoch': 0.46} 46%|████▌ | 15685/34278 [17:14:37<17:53:48, 3.47s/it] 46%|████▌ | 15686/34278 [17:14:40<17:13:55, 3.34s/it] {'loss': 0.1429, 'grad_norm': 0.6629367221488919, 'learning_rate': 5.9241694657889236e-06, 'epoch': 0.46} 46%|████▌ | 15686/34278 [17:14:40<17:13:55, 3.34s/it] 46%|████▌ | 15687/34278 [17:14:43<16:39:04, 3.22s/it] {'loss': 0.1166, 'grad_norm': 0.6451570080377812, 'learning_rate': 5.9237051676542825e-06, 'epoch': 0.46} 46%|████▌ | 15687/34278 [17:14:43<16:39:04, 3.22s/it] 46%|████▌ | 15688/34278 [17:14:47<16:44:09, 3.24s/it] {'loss': 0.1632, 'grad_norm': 0.8167856611009412, 'learning_rate': 5.923240861273021e-06, 'epoch': 0.46} 46%|████▌ | 15688/34278 [17:14:47<16:44:09, 3.24s/it] 46%|████▌ | 15689/34278
[17:14:52<20:47:15, 4.03s/it] {'loss': 0.1269, 'grad_norm': 1.0186215312827995, 'learning_rate': 5.922776546649287e-06, 'epoch': 0.46} 46%|████▌ | 15689/34278 [17:14:52<20:47:15, 4.03s/it] 46%|████▌ | 15690/34278 [17:14:56<20:06:16, 3.89s/it] {'loss': 0.1395, 'grad_norm': 0.7110533075572291, 'learning_rate': 5.922312223787223e-06, 'epoch': 0.46} 46%|████▌ | 15690/34278 [17:14:56<20:06:16, 3.89s/it] 46%|████▌ | 15691/34278 [17:15:01<21:25:33, 4.15s/it] {'loss': 0.1408, 'grad_norm': 0.7325905083465877, 'learning_rate': 5.921847892690976e-06, 'epoch': 0.46} 46%|████▌ | 15691/34278 [17:15:01<21:25:33, 4.15s/it] 46%|████▌ | 15692/34278 [17:15:04<19:32:21, 3.78s/it] {'loss': 0.1358, 'grad_norm': 1.1060383046376248, 'learning_rate': 5.9213835533646914e-06, 'epoch': 0.46} 46%|████▌ | 15692/34278 [17:15:04<19:32:21, 3.78s/it] 46%|████▌ | 15693/34278 [17:15:10<23:17:56, 4.51s/it] {'loss': 0.1249, 'grad_norm': 0.8946239487513469, 'learning_rate': 5.920919205812514e-06, 'epoch': 0.46} 46%|████▌ | 15693/34278 [17:15:10<23:17:56, 4.51s/it] 46%|████▌ | 15694/34278 [17:15:15<23:43:31, 4.60s/it] {'loss': 0.1302, 'grad_norm': 0.8759308957921003, 'learning_rate': 5.920454850038591e-06, 'epoch': 0.46} 46%|████▌ | 15694/34278 [17:15:15<23:43:31, 4.60s/it] 46%|████▌ | 15695/34278 [17:15:19<23:37:02, 4.58s/it] {'loss': 0.1378, 'grad_norm': 1.1367852553850144, 'learning_rate': 5.919990486047065e-06, 'epoch': 0.46} 46%|████▌ | 15695/34278 [17:15:19<23:37:02, 4.58s/it] 46%|████▌ | 15696/34278 [17:15:22<20:57:43, 4.06s/it] {'loss': 0.1208, 'grad_norm': 0.7844432128423668, 'learning_rate': 5.919526113842085e-06, 'epoch': 0.46} 46%|████▌ | 15696/34278 [17:15:22<20:57:43, 4.06s/it] 46%|████▌ | 15697/34278 [17:15:25<19:40:54, 3.81s/it] {'loss': 0.1366, 'grad_norm': 0.8387595450575478, 'learning_rate': 5.9190617334277955e-06, 'epoch': 0.46} 46%|████▌ | 15697/34278 [17:15:25<19:40:54, 3.81s/it] 46%|████▌ | 15698/34278 [17:15:28<18:39:04, 3.61s/it] {'loss': 0.1487, 'grad_norm': 0.9436747595291688, 
'learning_rate': 5.91859734480834e-06, 'epoch': 0.46} 46%|████▌ | 15698/34278 [17:15:28<18:39:04, 3.61s/it] 46%|████▌ | 15699/34278 [17:15:33<20:06:04, 3.89s/it] {'loss': 0.1326, 'grad_norm': 0.8176453567039192, 'learning_rate': 5.9181329479878694e-06, 'epoch': 0.46} 46%|████▌ | 15699/34278 [17:15:33<20:06:04, 3.89s/it] 46%|████▌ | 15700/34278 [17:15:36<18:47:18, 3.64s/it] {'loss': 0.1346, 'grad_norm': 0.8249520224108696, 'learning_rate': 5.917668542970525e-06, 'epoch': 0.46} 46%|████▌ | 15700/34278 [17:15:36<18:47:18, 3.64s/it] 46%|████▌ | 15701/34278 [17:15:39<17:53:42, 3.47s/it] {'loss': 0.1258, 'grad_norm': 1.1771261961324997, 'learning_rate': 5.917204129760457e-06, 'epoch': 0.46} 46%|████▌ | 15701/34278 [17:15:39<17:53:42, 3.47s/it] 46%|████▌ | 15702/34278 [17:15:45<21:51:08, 4.23s/it] {'loss': 0.1468, 'grad_norm': 0.922435303808003, 'learning_rate': 5.916739708361807e-06, 'epoch': 0.46} 46%|████▌ | 15702/34278 [17:15:45<21:51:08, 4.23s/it] 46%|████▌ | 15703/34278 [17:15:48<20:22:26, 3.95s/it] {'loss': 0.1308, 'grad_norm': 0.7783907730184846, 'learning_rate': 5.916275278778725e-06, 'epoch': 0.46} 46%|████▌ | 15703/34278 [17:15:48<20:22:26, 3.95s/it] 46%|████▌ | 15704/34278 [17:15:54<23:14:59, 4.51s/it] {'loss': 0.1288, 'grad_norm': 0.8477210818925518, 'learning_rate': 5.915810841015356e-06, 'epoch': 0.46} 46%|████▌ | 15704/34278 [17:15:54<23:14:59, 4.51s/it] 46%|████▌ | 15705/34278 [17:15:57<20:56:12, 4.06s/it] {'loss': 0.1392, 'grad_norm': 0.9263544934521192, 'learning_rate': 5.9153463950758465e-06, 'epoch': 0.46} 46%|████▌ | 15705/34278 [17:15:57<20:56:12, 4.06s/it] 46%|████▌ | 15706/34278 [17:16:03<23:47:22, 4.61s/it] {'loss': 0.1556, 'grad_norm': 1.1589657547471184, 'learning_rate': 5.914881940964343e-06, 'epoch': 0.46} 46%|████▌ | 15706/34278 [17:16:03<23:47:22, 4.61s/it] 46%|████▌ | 15707/34278 [17:16:06<20:55:25, 4.06s/it] {'loss': 0.1244, 'grad_norm': 0.7878431455735818, 'learning_rate': 5.914417478684992e-06, 'epoch': 0.46} 46%|████▌ | 15707/34278 
[17:16:06<20:55:25, 4.06s/it] 46%|████▌ | 15708/34278 [17:16:09<19:44:01, 3.83s/it] {'loss': 0.1429, 'grad_norm': 1.048494819227112, 'learning_rate': 5.913953008241939e-06, 'epoch': 0.46} 46%|████▌ | 15708/34278 [17:16:09<19:44:01, 3.83s/it] 46%|████▌ | 15709/34278 [17:16:12<18:20:08, 3.55s/it] {'loss': 0.1603, 'grad_norm': 1.0324088380933156, 'learning_rate': 5.913488529639334e-06, 'epoch': 0.46} 46%|████▌ | 15709/34278 [17:16:12<18:20:08, 3.55s/it] 46%|████▌ | 15710/34278 [17:16:17<19:38:33, 3.81s/it] {'loss': 0.1267, 'grad_norm': 0.7533032521400135, 'learning_rate': 5.913024042881319e-06, 'epoch': 0.46} 46%|████▌ | 15710/34278 [17:16:17<19:38:33, 3.81s/it] 46%|████▌ | 15711/34278 [17:16:21<20:54:32, 4.05s/it] {'loss': 0.1301, 'grad_norm': 0.7113361607782436, 'learning_rate': 5.912559547972043e-06, 'epoch': 0.46} 46%|████▌ | 15711/34278 [17:16:21<20:54:32, 4.05s/it] 46%|████▌ | 15712/34278 [17:16:24<19:36:26, 3.80s/it] {'loss': 0.1429, 'grad_norm': 0.9468588685317267, 'learning_rate': 5.912095044915655e-06, 'epoch': 0.46} 46%|████▌ | 15712/34278 [17:16:24<19:36:26, 3.80s/it] 46%|████▌ | 15713/34278 [17:16:27<18:04:30, 3.50s/it] {'loss': 0.1347, 'grad_norm': 1.0343857302662343, 'learning_rate': 5.911630533716299e-06, 'epoch': 0.46} 46%|████▌ | 15713/34278 [17:16:27<18:04:30, 3.50s/it] 46%|████▌ | 15714/34278 [17:16:30<16:56:02, 3.28s/it] {'loss': 0.134, 'grad_norm': 0.7442153617635787, 'learning_rate': 5.911166014378126e-06, 'epoch': 0.46} 46%|████▌ | 15714/34278 [17:16:30<16:56:02, 3.28s/it] 46%|████▌ | 15715/34278 [17:16:33<16:58:44, 3.29s/it] {'loss': 0.1327, 'grad_norm': 1.0059846109109405, 'learning_rate': 5.910701486905277e-06, 'epoch': 0.46} 46%|████▌ | 15715/34278 [17:16:33<16:58:44, 3.29s/it] 46%|████▌ | 15716/34278 [17:16:39<20:11:07, 3.91s/it] {'loss': 0.1669, 'grad_norm': 0.8944237191137988, 'learning_rate': 5.910236951301904e-06, 'epoch': 0.46} 46%|████▌ | 15716/34278 [17:16:39<20:11:07, 3.91s/it] 46%|████▌ | 15717/34278 [17:16:42<18:46:55, 3.64s/it] 
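The `PIL.UnidentifiedImageError` tracebacks in this log are each followed by a "Failed to fetch sample … Exception: …" message, and the step counter keeps advancing, so the dataset's `__getitem__` evidently catches decode failures and falls back to another sample instead of killing the run. A minimal sketch of that pattern, assuming a plain in-memory list of `(sample_id, raw_bytes)` records and a deterministic next-index fallback (the actual `aguvis/dataset.py` loads bytes from Ceph via `tcs_loader`, and its fallback policy is not visible in the log):

```python
import io

from PIL import Image, UnidentifiedImageError


class FaultTolerantImageDataset:
    """Sketch: skip samples whose image bytes cannot be decoded."""

    def __init__(self, records, max_retries=10):
        self.records = records          # list of (sample_id, raw_bytes) pairs
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def _get_item(self, i):
        sample_id, raw = self.records[i]
        # Image.open raises UnidentifiedImageError when the bytes are not a
        # recognizable image (wrong object, garbage download, etc.).
        img = Image.open(io.BytesIO(raw))
        img.load()  # force a full decode; truncated files raise OSError here
        return sample_id, img

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except (UnidentifiedImageError, OSError) as e:
                # Mirror the log message, then fall back to the next sample.
                print(f"Failed to fetch sample {self.records[i][0]}. Exception: {e}")
                i = (i + 1) % len(self.records)
        raise RuntimeError("too many consecutive undecodable samples")
```

Wrapping the decode this way is why the run above logs the exception for samples 3291476 and 3353703 and then proceeds straight to the next training step rather than aborting the epoch.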
{'loss': 0.136, 'grad_norm': 0.7624367383167346, 'learning_rate': 5.909772407572153e-06, 'epoch': 0.46} 46%|████▌ | 15717/34278 [17:16:42<18:46:55, 3.64s/it] 46%|████▌ | 15718/34278 [17:16:47<22:07:11, 4.29s/it] {'loss': 0.1112, 'grad_norm': 0.84192067433435, 'learning_rate': 5.90930785572017e-06, 'epoch': 0.46} 46%|████▌ | 15718/34278 [17:16:47<22:07:11, 4.29s/it] 46%|████▌ | 15719/34278 [17:16:50<20:02:44, 3.89s/it] {'loss': 0.1616, 'grad_norm': 0.7057893376643394, 'learning_rate': 5.908843295750104e-06, 'epoch': 0.46} 46%|████▌ | 15719/34278 [17:16:50<20:02:44, 3.89s/it] 46%|████▌ | 15720/34278 [17:16:53<18:26:24, 3.58s/it] {'loss': 0.1485, 'grad_norm': 0.8740791571990643, 'learning_rate': 5.908378727666103e-06, 'epoch': 0.46} 46%|████▌ | 15720/34278 [17:16:53<18:26:24, 3.58s/it] 46%|████▌ | 15721/34278 [17:16:56<17:52:11, 3.47s/it] {'loss': 0.1325, 'grad_norm': 0.8135321402168436, 'learning_rate': 5.907914151472312e-06, 'epoch': 0.46} 46%|████▌ | 15721/34278 [17:16:56<17:52:11, 3.47s/it] 46%|████▌ | 15722/34278 [17:17:02<20:46:54, 4.03s/it] {'loss': 0.1317, 'grad_norm': 0.848164326095268, 'learning_rate': 5.9074495671728814e-06, 'epoch': 0.46} 46%|████▌ | 15722/34278 [17:17:02<20:46:54, 4.03s/it] 46%|████▌ | 15723/34278 [17:17:05<20:11:34, 3.92s/it] {'loss': 0.1221, 'grad_norm': 0.704790213205819, 'learning_rate': 5.9069849747719565e-06, 'epoch': 0.46} 46%|████▌ | 15723/34278 [17:17:05<20:11:34, 3.92s/it] 46%|████▌ | 15724/34278 [17:17:08<18:35:58, 3.61s/it] {'loss': 0.1364, 'grad_norm': 0.7766729929604375, 'learning_rate': 5.906520374273688e-06, 'epoch': 0.46} 46%|████▌ | 15724/34278 [17:17:08<18:35:58, 3.61s/it] 46%|████▌ | 15725/34278 [17:17:11<17:44:54, 3.44s/it] {'loss': 0.1366, 'grad_norm': 0.7079896371676835, 'learning_rate': 5.90605576568222e-06, 'epoch': 0.46} 46%|████▌ | 15725/34278 [17:17:11<17:44:54, 3.44s/it] 46%|████▌ | 15726/34278 [17:17:15<18:21:24, 3.56s/it] {'loss': 0.1276, 'grad_norm': 0.7778515559501419, 'learning_rate': 
5.905591149001704e-06, 'epoch': 0.46} 46%|████▌ | 15726/34278 [17:17:15<18:21:24, 3.56s/it] 46%|████▌ | 15727/34278 [17:17:19<18:38:30, 3.62s/it] {'loss': 0.1346, 'grad_norm': 0.9684472097472696, 'learning_rate': 5.9051265242362854e-06, 'epoch': 0.46} 46%|████▌ | 15727/34278 [17:17:19<18:38:30, 3.62s/it] 46%|████▌ | 15728/34278 [17:17:24<20:46:07, 4.03s/it] {'loss': 0.1197, 'grad_norm': 0.676124760181125, 'learning_rate': 5.904661891390114e-06, 'epoch': 0.46} 46%|████▌ | 15728/34278 [17:17:24<20:46:07, 4.03s/it] 46%|████▌ | 15729/34278 [17:17:27<19:50:29, 3.85s/it] {'loss': 0.1425, 'grad_norm': 0.6906927877819188, 'learning_rate': 5.904197250467339e-06, 'epoch': 0.46} 46%|████▌ | 15729/34278 [17:17:27<19:50:29, 3.85s/it] 46%|████▌ | 15730/34278 [17:17:30<18:38:47, 3.62s/it] {'loss': 0.1395, 'grad_norm': 1.2108509483220204, 'learning_rate': 5.903732601472102e-06, 'epoch': 0.46} 46%|████▌ | 15730/34278 [17:17:30<18:38:47, 3.62s/it] 46%|████▌ | 15731/34278 [17:17:33<17:22:34, 3.37s/it] {'loss': 0.1449, 'grad_norm': 1.0535901221585215, 'learning_rate': 5.903267944408561e-06, 'epoch': 0.46} 46%|████▌ | 15731/34278 [17:17:33<17:22:34, 3.37s/it] 46%|████▌ | 15732/34278 [17:17:36<16:12:55, 3.15s/it] {'loss': 0.106, 'grad_norm': 0.730594325168327, 'learning_rate': 5.902803279280857e-06, 'epoch': 0.46} 46%|████▌ | 15732/34278 [17:17:36<16:12:55, 3.15s/it] 46%|████▌ | 15733/34278 [17:17:39<16:08:54, 3.13s/it] {'loss': 0.1285, 'grad_norm': 0.9053460795820518, 'learning_rate': 5.902338606093139e-06, 'epoch': 0.46} 46%|████▌ | 15733/34278 [17:17:39<16:08:54, 3.13s/it] 46%|████▌ | 15734/34278 [17:17:45<20:28:04, 3.97s/it] {'loss': 0.1299, 'grad_norm': 0.7754003592857962, 'learning_rate': 5.9018739248495605e-06, 'epoch': 0.46} 46%|████▌ | 15734/34278 [17:17:45<20:28:04, 3.97s/it] 46%|████▌ | 15735/34278 [17:17:49<20:01:23, 3.89s/it] {'loss': 0.1078, 'grad_norm': 0.7470854924206519, 'learning_rate': 5.901409235554265e-06, 'epoch': 0.46} 46%|████▌ | 15735/34278 [17:17:49<20:01:23, 
3.89s/it] 46%|████▌ | 15736/34278 [17:17:52<18:59:41, 3.69s/it] {'loss': 0.1367, 'grad_norm': 0.8360333458535162, 'learning_rate': 5.900944538211404e-06, 'epoch': 0.46} 46%|████▌ | 15736/34278 [17:17:52<18:59:41, 3.69s/it] 46%|████▌ | 15737/34278 [17:17:58<23:04:53, 4.48s/it] {'loss': 0.1485, 'grad_norm': 0.8861071433531529, 'learning_rate': 5.9004798328251255e-06, 'epoch': 0.46} 46%|████▌ | 15737/34278 [17:17:58<23:04:53, 4.48s/it] 46%|████▌ | 15738/34278 [17:18:01<20:41:13, 4.02s/it] {'loss': 0.1466, 'grad_norm': 0.9182952557822152, 'learning_rate': 5.900015119399577e-06, 'epoch': 0.46} 46%|████▌ | 15738/34278 [17:18:01<20:41:13, 4.02s/it] 46%|████▌ | 15739/34278 [17:18:07<22:49:56, 4.43s/it] {'loss': 0.148, 'grad_norm': 1.1191553678586899, 'learning_rate': 5.899550397938909e-06, 'epoch': 0.46} 46%|████▌ | 15739/34278 [17:18:07<22:49:56, 4.43s/it] 46%|████▌ | 15740/34278 [17:18:11<22:27:55, 4.36s/it] {'loss': 0.1262, 'grad_norm': 0.816610906007468, 'learning_rate': 5.89908566844727e-06, 'epoch': 0.46} 46%|████▌ | 15740/34278 [17:18:11<22:27:55, 4.36s/it] 46%|████▌ | 15741/34278 [17:18:14<20:41:06, 4.02s/it] {'loss': 0.1432, 'grad_norm': 1.1109664468265683, 'learning_rate': 5.898620930928808e-06, 'epoch': 0.46} 46%|████▌ | 15741/34278 [17:18:14<20:41:06, 4.02s/it] 46%|████▌ | 15742/34278 [17:18:20<23:41:07, 4.60s/it] {'loss': 0.156, 'grad_norm': 0.8793888636473848, 'learning_rate': 5.898156185387674e-06, 'epoch': 0.46} 46%|████▌ | 15742/34278 [17:18:20<23:41:07, 4.60s/it] 46%|████▌ | 15743/34278 [17:18:23<22:04:17, 4.29s/it] {'loss': 0.135, 'grad_norm': 0.7624539990976663, 'learning_rate': 5.897691431828014e-06, 'epoch': 0.46} 46%|████▌ | 15743/34278 [17:18:23<22:04:17, 4.29s/it] 46%|████▌ | 15744/34278 [17:18:28<21:43:58, 4.22s/it] {'loss': 0.1476, 'grad_norm': 0.9667460169531502, 'learning_rate': 5.897226670253982e-06, 'epoch': 0.46} 46%|████▌ | 15744/34278 [17:18:28<21:43:58, 4.22s/it] 46%|████▌ | 15745/34278 [17:18:31<20:20:33, 3.95s/it] {'loss': 0.1422, 
'grad_norm': 0.8293488957500059, 'learning_rate': 5.896761900669722e-06, 'epoch': 0.46} 46%|████▌ | 15745/34278 [17:18:31<20:20:33, 3.95s/it] 46%|████▌ | 15746/34278 [17:18:34<19:49:17, 3.85s/it] {'loss': 0.1448, 'grad_norm': 0.8085583621733674, 'learning_rate': 5.896297123079388e-06, 'epoch': 0.46} 46%|████▌ | 15746/34278 [17:18:34<19:49:17, 3.85s/it] 46%|████▌ | 15747/34278 [17:18:39<20:14:07, 3.93s/it] {'loss': 0.1357, 'grad_norm': 0.9251861458750373, 'learning_rate': 5.895832337487126e-06, 'epoch': 0.46} 46%|████▌ | 15747/34278 [17:18:39<20:14:07, 3.93s/it] 46%|████▌ | 15748/34278 [17:18:42<19:23:52, 3.77s/it] {'loss': 0.1397, 'grad_norm': 0.773105611322147, 'learning_rate': 5.895367543897086e-06, 'epoch': 0.46} 46%|████▌ | 15748/34278 [17:18:42<19:23:52, 3.77s/it] 46%|████▌ | 15749/34278 [17:18:45<18:20:28, 3.56s/it] {'loss': 0.1257, 'grad_norm': 0.9358682107421241, 'learning_rate': 5.89490274231342e-06, 'epoch': 0.46} 46%|████▌ | 15749/34278 [17:18:45<18:20:28, 3.56s/it] 46%|████▌ | 15750/34278 [17:18:48<17:44:06, 3.45s/it] {'loss': 0.127, 'grad_norm': 0.7510929156606305, 'learning_rate': 5.894437932740274e-06, 'epoch': 0.46} 46%|████▌ | 15750/34278 [17:18:48<17:44:06, 3.45s/it] 46%|████▌ | 15751/34278 [17:18:53<20:29:54, 3.98s/it] {'loss': 0.1441, 'grad_norm': 0.8084065674036928, 'learning_rate': 5.893973115181801e-06, 'epoch': 0.46} 46%|████▌ | 15751/34278 [17:18:53<20:29:54, 3.98s/it] 46%|████▌ | 15752/34278 [17:18:57<19:09:06, 3.72s/it] {'loss': 0.1429, 'grad_norm': 0.822699837033059, 'learning_rate': 5.8935082896421495e-06, 'epoch': 0.46} 46%|████▌ | 15752/34278 [17:18:57<19:09:06, 3.72s/it] 46%|████▌ | 15753/34278 [17:19:00<18:28:51, 3.59s/it] {'loss': 0.1112, 'grad_norm': 0.8620453745010047, 'learning_rate': 5.893043456125469e-06, 'epoch': 0.46} 46%|████▌ | 15753/34278 [17:19:00<18:28:51, 3.59s/it] 46%|████▌ | 15754/34278 [17:19:04<19:54:32, 3.87s/it] {'loss': 0.1393, 'grad_norm': 1.0328251337959462, 'learning_rate': 5.892578614635909e-06, 'epoch': 
0.46} 46%|████▌ | 15754/34278 [17:19:04<19:54:32, 3.87s/it] 46%|████▌ | 15755/34278 [17:19:08<19:16:15, 3.75s/it] {'loss': 0.1269, 'grad_norm': 0.8911042434095463, 'learning_rate': 5.892113765177621e-06, 'epoch': 0.46} 46%|████▌ | 15755/34278 [17:19:08<19:16:15, 3.75s/it] 46%|████▌ | 15756/34278 [17:19:14<22:43:12, 4.42s/it] {'loss': 0.136, 'grad_norm': 0.7718290472463595, 'learning_rate': 5.891648907754753e-06, 'epoch': 0.46} 46%|████▌ | 15756/34278 [17:19:14<22:43:12, 4.42s/it] 46%|████▌ | 15757/34278 [17:19:16<19:55:05, 3.87s/it] {'loss': 0.1252, 'grad_norm': 0.9180483729049965, 'learning_rate': 5.891184042371459e-06, 'epoch': 0.46} 46%|████▌ | 15757/34278 [17:19:16<19:55:05, 3.87s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4a81f33060>
Failed to fetch sample 3353703. Exception: cannot identify image file <_io.BytesIO object at 0x7f4a81f33060>
46%|████▌ | 15758/34278 [17:19:22<23:13:15, 4.51s/it] {'loss': 0.1557, 'grad_norm': 0.9859209708795096, 'learning_rate': 5.890719169031885e-06, 'epoch': 0.46} 46%|████▌ | 15758/34278 [17:19:22<23:13:15, 4.51s/it] 46%|████▌ | 15759/34278 [17:19:26<21:41:31, 4.22s/it] {'loss': 0.1461, 'grad_norm': 0.8682703837180158, 'learning_rate': 5.890254287740183e-06, 'epoch': 0.46} 46%|████▌ | 15759/34278 [17:19:26<21:41:31, 4.22s/it] 46%|████▌ | 15760/34278 [17:19:31<23:18:14, 4.53s/it] {'loss': 0.1371, 'grad_norm': 0.9547434965349666, 'learning_rate': 5.889789398500503e-06, 'epoch': 0.46} 46%|████▌ | 15760/34278 [17:19:31<23:18:14, 4.53s/it] 46%|████▌ | 15761/34278 [17:19:37<24:31:25, 4.77s/it] {'loss': 0.1411, 'grad_norm': 1.1981820087067303, 'learning_rate': 5.8893245013169965e-06, 'epoch': 0.46} 46%|████▌ | 15761/34278 [17:19:37<24:31:25, 4.77s/it] 46%|████▌ | 15762/34278 [17:19:40<23:04:42, 4.49s/it] {'loss': 0.1496, 'grad_norm': 1.1017722517492872, 'learning_rate': 5.888859596193812e-06, 'epoch': 0.46} 46%|████▌ | 15762/34278 [17:19:40<23:04:42, 4.49s/it] 46%|████▌ | 15763/34278 [17:19:43<20:48:02, 4.04s/it] {'loss': 0.1224, 'grad_norm': 0.7807452502563147, 'learning_rate': 5.8883946831351014e-06, 'epoch': 0.46} 46%|████▌ | 15763/34278 [17:19:43<20:48:02, 4.04s/it] 46%|████▌ | 15764/34278 [17:19:47<19:58:23, 3.88s/it] {'loss': 0.1343, 'grad_norm': 0.8802133211082914, 'learning_rate': 5.887929762145016e-06, 'epoch': 0.46} 46%|████▌ | 15764/34278 [17:19:47<19:58:23, 3.88s/it] 46%|████▌ | 15765/34278 [17:19:50<18:59:19, 3.69s/it] {'loss': 0.1534, 'grad_norm': 1.0183444332824991, 'learning_rate': 5.887464833227705e-06, 'epoch': 0.46} 46%|████▌ | 15765/34278 [17:19:50<18:59:19, 3.69s/it] 46%|████▌ | 15766/34278 [17:19:53<18:23:00, 3.57s/it] {'loss': 0.1323, 'grad_norm': 1.0514384318453445, 'learning_rate': 5.8869998963873195e-06, 'epoch':
0.46} 46%|████▌ | 15766/34278 [17:19:53<18:23:00, 3.57s/it] 46%|████▌ | 15767/34278 [17:19:56<17:19:29, 3.37s/it] {'loss': 0.1539, 'grad_norm': 0.8659569271433172, 'learning_rate': 5.886534951628011e-06, 'epoch': 0.46} 46%|████▌ | 15767/34278 [17:19:56<17:19:29, 3.37s/it] 46%|████▌ | 15768/34278 [17:20:02<21:34:10, 4.20s/it] {'loss': 0.1455, 'grad_norm': 0.969923767980383, 'learning_rate': 5.88606999895393e-06, 'epoch': 0.46} 46%|████▌ | 15768/34278 [17:20:02<21:34:10, 4.20s/it] 46%|████▌ | 15769/34278 [17:20:06<20:52:20, 4.06s/it] {'loss': 0.1345, 'grad_norm': 2.026311164005376, 'learning_rate': 5.885605038369228e-06, 'epoch': 0.46} 46%|████▌ | 15769/34278 [17:20:06<20:52:20, 4.06s/it] 46%|████▌ | 15770/34278 [17:20:09<19:11:48, 3.73s/it] {'loss': 0.1313, 'grad_norm': 1.0163049865030611, 'learning_rate': 5.885140069878056e-06, 'epoch': 0.46} 46%|████▌ | 15770/34278 [17:20:09<19:11:48, 3.73s/it] 46%|████▌ | 15771/34278 [17:20:12<17:57:16, 3.49s/it] {'loss': 0.1119, 'grad_norm': 0.7852708379640772, 'learning_rate': 5.884675093484565e-06, 'epoch': 0.46} 46%|████▌ | 15771/34278 [17:20:12<17:57:16, 3.49s/it] 46%|████▌ | 15772/34278 [17:20:15<17:43:19, 3.45s/it] {'loss': 0.1698, 'grad_norm': 0.8553733755474103, 'learning_rate': 5.884210109192904e-06, 'epoch': 0.46} 46%|████▌ | 15772/34278 [17:20:15<17:43:19, 3.45s/it] 46%|████▌ | 15773/34278 [17:20:19<17:29:23, 3.40s/it] {'loss': 0.1414, 'grad_norm': 0.9825994654228479, 'learning_rate': 5.883745117007227e-06, 'epoch': 0.46} 46%|████▌ | 15773/34278 [17:20:19<17:29:23, 3.40s/it] 46%|████▌ | 15774/34278 [17:20:23<18:26:51, 3.59s/it] {'loss': 0.1795, 'grad_norm': 1.1105769379837276, 'learning_rate': 5.883280116931687e-06, 'epoch': 0.46} 46%|████▌ | 15774/34278 [17:20:23<18:26:51, 3.59s/it] 46%|████▌ | 15775/34278 [17:20:25<16:53:40, 3.29s/it] {'loss': 0.1167, 'grad_norm': 0.7570744073486113, 'learning_rate': 5.882815108970429e-06, 'epoch': 0.46} 46%|████▌ | 15775/34278 [17:20:25<16:53:40, 3.29s/it] 46%|████▌ | 15776/34278 
[17:20:29<17:39:04, 3.43s/it] {'loss': 0.1551, 'grad_norm': 0.9461345232593066, 'learning_rate': 5.882350093127611e-06, 'epoch': 0.46} 46%|████▌ | 15776/34278 [17:20:29<17:39:04, 3.43s/it] 46%|████▌ | 15777/34278 [17:20:32<17:24:25, 3.39s/it] {'loss': 0.138, 'grad_norm': 0.9622667746723423, 'learning_rate': 5.881885069407382e-06, 'epoch': 0.46} 46%|████▌ | 15777/34278 [17:20:32<17:24:25, 3.39s/it] 46%|████▌ | 15778/34278 [17:20:35<16:43:24, 3.25s/it] {'loss': 0.1303, 'grad_norm': 0.7238455267781404, 'learning_rate': 5.881420037813892e-06, 'epoch': 0.46} 46%|████▌ | 15778/34278 [17:20:35<16:43:24, 3.25s/it] 46%|████▌ | 15779/34278 [17:20:40<19:27:34, 3.79s/it] {'loss': 0.1333, 'grad_norm': 0.7767999021164391, 'learning_rate': 5.880954998351296e-06, 'epoch': 0.46} 46%|████▌ | 15779/34278 [17:20:40<19:27:34, 3.79s/it] 46%|████▌ | 15780/34278 [17:20:44<18:47:58, 3.66s/it] {'loss': 0.1248, 'grad_norm': 0.7106336440672024, 'learning_rate': 5.8804899510237435e-06, 'epoch': 0.46} 46%|████▌ | 15780/34278 [17:20:44<18:47:58, 3.66s/it] 46%|████▌ | 15781/34278 [17:20:49<21:25:58, 4.17s/it] {'loss': 0.161, 'grad_norm': 0.7356084204781619, 'learning_rate': 5.880024895835387e-06, 'epoch': 0.46} 46%|████▌ | 15781/34278 [17:20:49<21:25:58, 4.17s/it] 46%|████▌ | 15782/34278 [17:20:52<19:37:07, 3.82s/it] {'loss': 0.1349, 'grad_norm': 0.7885342508501595, 'learning_rate': 5.879559832790378e-06, 'epoch': 0.46} 46%|████▌ | 15782/34278 [17:20:52<19:37:07, 3.82s/it] 46%|████▌ | 15783/34278 [17:20:57<21:05:45, 4.11s/it] {'loss': 0.1549, 'grad_norm': 0.8843470982483292, 'learning_rate': 5.8790947618928686e-06, 'epoch': 0.46} 46%|████▌ | 15783/34278 [17:20:57<21:05:45, 4.11s/it] 46%|████▌ | 15784/34278 [17:21:01<20:41:31, 4.03s/it] {'loss': 0.1552, 'grad_norm': 0.7950865713698176, 'learning_rate': 5.878629683147011e-06, 'epoch': 0.46} 46%|████▌ | 15784/34278 [17:21:01<20:41:31, 4.03s/it] 46%|████▌ | 15785/34278 [17:21:03<18:31:05, 3.60s/it] {'loss': 0.1415, 'grad_norm': 0.76882081982006, 
15785/34278 [17:21:03<18:31:05, 3.60s/it] {..., 'learning_rate': 5.878164596556958e-06, 'epoch': 0.46}
15786/34278 [17:21:10<23:00:49, 4.48s/it] {'loss': 0.1221, 'grad_norm': 0.7126205467770146, 'learning_rate': 5.87769950212686e-06, 'epoch': 0.46}
15787/34278 [17:21:13<21:02:53, 4.10s/it] {'loss': 0.1513, 'grad_norm': 0.9999807887006671, 'learning_rate': 5.877234399860872e-06, 'epoch': 0.46}
15788/34278 [17:21:16<19:41:41, 3.83s/it] {'loss': 0.1587, 'grad_norm': 0.8897769518034591, 'learning_rate': 5.876769289763144e-06, 'epoch': 0.46}
15789/34278 [17:21:22<21:56:47, 4.27s/it] {'loss': 0.1574, 'grad_norm': 0.8672343051128001, 'learning_rate': 5.876304171837829e-06, 'epoch': 0.46}
15790/34278 [17:21:25<20:45:59, 4.04s/it] {'loss': 0.1332, 'grad_norm': 0.9092971735426206, 'learning_rate': 5.875839046089078e-06, 'epoch': 0.46}
15791/34278 [17:21:28<18:39:42, 3.63s/it] {'loss': 0.1321, 'grad_norm': 0.8162992516912753, 'learning_rate': 5.875373912521047e-06, 'epoch': 0.46}
15792/34278 [17:21:33<21:51:08, 4.26s/it] {'loss': 0.1341, 'grad_norm': 0.9867222396381891, 'learning_rate': 5.874908771137887e-06, 'epoch': 0.46}
15793/34278 [17:21:37<20:54:19, 4.07s/it] {'loss': 0.1305, 'grad_norm': 0.7682594536658461, 'learning_rate': 5.874443621943749e-06, 'epoch': 0.46}
15794/34278 [17:21:40<19:36:18, 3.82s/it] {'loss': 0.1393, 'grad_norm': 0.7447781676970837, 'learning_rate': 5.873978464942788e-06, 'epoch': 0.46}
15795/34278 [17:21:43<18:20:14, 3.57s/it] {'loss': 0.1376, 'grad_norm': 0.8220847614500845, 'learning_rate': 5.873513300139155e-06, 'epoch': 0.46}
15796/34278 [17:21:46<17:36:55, 3.43s/it] {'loss': 0.1283, 'grad_norm': 0.9003115680649544, 'learning_rate': 5.873048127537005e-06, 'epoch': 0.46}
15797/34278 [17:21:50<17:05:11, 3.33s/it] {'loss': 0.1268, 'grad_norm': 0.6516482520777607, 'learning_rate': 5.8725829471404884e-06, 'epoch': 0.46}
15798/34278 [17:21:55<20:51:24, 4.06s/it] {'loss': 0.1359, 'grad_norm': 0.8948338886054641, 'learning_rate': 5.87211775895376e-06, 'epoch': 0.46}
15799/34278 [17:21:59<20:46:37, 4.05s/it] {'loss': 0.1421, 'grad_norm': 0.7926542584453049, 'learning_rate': 5.871652562980973e-06, 'epoch': 0.46}
15800/34278 [17:22:05<23:48:51, 4.64s/it] {'loss': 0.1334, 'grad_norm': 1.140870125403056, 'learning_rate': 5.871187359226279e-06, 'epoch': 0.46}
15801/34278 [17:22:09<21:51:24, 4.26s/it] {'loss': 0.1395, 'grad_norm': 0.7507135948161748, 'learning_rate': 5.870722147693832e-06, 'epoch': 0.46}
15802/34278 [17:22:11<19:36:07, 3.82s/it] {'loss': 0.1351, 'grad_norm': 0.8021253975166392, 'learning_rate': 5.870256928387788e-06, 'epoch': 0.46}
15803/34278 [17:22:15<18:38:40, 3.63s/it] {'loss': 0.1427, 'grad_norm': 0.7941479253720344, 'learning_rate': 5.8697917013122955e-06, 'epoch': 0.46}
15804/34278 [17:22:18<17:29:58, 3.41s/it] {'loss': 0.1563, 'grad_norm': 0.8310011848931662, 'learning_rate': 5.869326466471512e-06, 'epoch': 0.46}
15805/34278 [17:22:21<16:47:52, 3.27s/it] {'loss': 0.1321, 'grad_norm': 0.7490702873320659, 'learning_rate': 5.868861223869587e-06, 'epoch': 0.46}
15806/34278 [17:22:24<17:03:56, 3.33s/it] {'loss': 0.1423, 'grad_norm': 0.7299876823540122, 'learning_rate': 5.868395973510679e-06, 'epoch': 0.46}
15807/34278 [17:22:28<18:03:30, 3.52s/it] {'loss': 0.132, 'grad_norm': 0.7715733570698626, 'learning_rate': 5.867930715398938e-06, 'epoch': 0.46}
15808/34278 [17:22:32<18:23:54, 3.59s/it] {'loss': 0.1356, 'grad_norm': 0.7824107871791222, 'learning_rate': 5.867465449538518e-06, 'epoch': 0.46}
15809/34278 [17:22:35<17:24:15, 3.39s/it] {'loss': 0.1374, 'grad_norm': 0.7929137437316226, 'learning_rate': 5.8670001759335745e-06, 'epoch': 0.46}
15810/34278 [17:22:41<21:28:47, 4.19s/it] {'loss': 0.1274, 'grad_norm': 0.787361562331058, 'learning_rate': 5.86653489458826e-06, 'epoch': 0.46}
15811/34278 [17:22:46<23:56:10, 4.67s/it] {'loss': 0.1245, 'grad_norm': 0.6777553009520738, 'learning_rate': 5.866069605506729e-06, 'epoch': 0.46}
15812/34278 [17:22:50<22:02:46, 4.30s/it] {'loss': 0.1516, 'grad_norm': 0.7695877637104159, 'learning_rate': 5.865604308693136e-06, 'epoch': 0.46}
15813/34278 [17:22:53<20:12:13, 3.94s/it] {'loss': 0.1448, 'grad_norm': 1.0323711240579274, 'learning_rate': 5.865139004151633e-06, 'epoch': 0.46}
15814/34278 [17:22:57<20:17:54, 3.96s/it] {'loss': 0.1519, 'grad_norm': 0.799632101319446, 'learning_rate': 5.864673691886375e-06, 'epoch': 0.46}
15815/34278 [17:23:01<20:45:18, 4.05s/it] {'loss': 0.1462, 'grad_norm': 0.7439077035334605, 'learning_rate': 5.864208371901519e-06, 'epoch': 0.46}
15816/34278 [17:23:04<19:03:08, 3.72s/it] {'loss': 0.1285, 'grad_norm': 0.9327709860995128, 'learning_rate': 5.863743044201215e-06, 'epoch': 0.46}
15817/34278 [17:23:08<18:36:48, 3.63s/it] {'loss': 0.1448, 'grad_norm': 0.998590255210444, 'learning_rate': 5.8632777087896205e-06, 'epoch': 0.46}
15818/34278 [17:23:11<17:40:55, 3.45s/it] {'loss': 0.1331, 'grad_norm': 0.8592407937248154, 'learning_rate': 5.862812365670888e-06, 'epoch': 0.46}
15819/34278 [17:23:17<21:28:48, 4.19s/it] {'loss': 0.1505, 'grad_norm': 0.9755159497312703, 'learning_rate': 5.862347014849174e-06, 'epoch': 0.46}
15820/34278 [17:23:22<23:10:34, 4.52s/it] {'loss': 0.135, 'grad_norm': 0.9038505585913441, 'learning_rate': 5.861881656328629e-06, 'epoch': 0.46}
15821/34278 [17:23:26<23:06:51, 4.51s/it] {'loss': 0.1406, 'grad_norm': 0.7637412982771709, 'learning_rate': 5.861416290113413e-06, 'epoch': 0.46}
15822/34278 [17:23:30<22:16:18, 4.34s/it] {'loss': 0.1316, 'grad_norm': 0.86363035493597, 'learning_rate': 5.860950916207677e-06, 'epoch': 0.46}
15823/34278 [17:23:34<20:32:07, 4.01s/it] {'loss': 0.1151, 'grad_norm': 0.9496228424623425, 'learning_rate': 5.8604855346155756e-06, 'epoch': 0.46}
15824/34278 [17:23:37<19:37:31, 3.83s/it] {'loss': 0.1306, 'grad_norm': 0.9982435813688407, 'learning_rate': 5.860020145341267e-06, 'epoch': 0.46}
15825/34278 [17:23:51<35:03:19, 6.84s/it] {'loss': 0.1191, 'grad_norm': 1.0682495955333833, 'learning_rate': 5.859554748388903e-06, 'epoch': 0.46}
15826/34278 [17:23:54<29:59:05, 5.85s/it] {'loss': 0.1472, 'grad_norm': 0.9238371193535599, 'learning_rate': 5.859089343762638e-06, 'epoch': 0.46}
15827/34278 [17:23:57<25:28:18, 4.97s/it] {'loss': 0.1351, 'grad_norm': 0.8926023227567703, 'learning_rate': 5.85862393146663e-06, 'epoch': 0.46}
15828/34278 [17:24:00<22:27:42, 4.38s/it] {'loss': 0.1103, 'grad_norm': 0.7104826765634306, 'learning_rate': 5.858158511505032e-06, 'epoch': 0.46}
15829/34278 [17:24:03<20:26:38, 3.99s/it] {'loss': 0.1561, 'grad_norm': 1.5810407758998741, 'learning_rate': 5.857693083881999e-06, 'epoch': 0.46}
15830/34278 [17:24:15<32:05:19, 6.26s/it] {'loss': 0.1765, 'grad_norm': 0.9074428050151642, 'learning_rate': 5.857227648601688e-06, 'epoch': 0.46}
15831/34278 [17:24:18<27:59:51, 5.46s/it] {'loss': 0.1403, 'grad_norm': 0.8134479443437905, 'learning_rate': 5.856762205668253e-06, 'epoch': 0.46}
15832/34278 [17:24:34<42:47:51, 8.35s/it] {'loss': 0.1617, 'grad_norm': 1.0476413221189405, 'learning_rate': 5.856296755085849e-06, 'epoch': 0.46}
15833/34278 [17:24:37<35:55:37, 7.01s/it] {'loss': 0.1565, 'grad_norm': 0.7207947215252731, 'learning_rate': 5.855831296858631e-06, 'epoch': 0.46}
15834/34278 [17:24:57<54:31:50, 10.64s/it] {'loss': 0.1342, 'grad_norm': 0.8456209194298886, 'learning_rate': 5.855365830990759e-06, 'epoch': 0.46}
15835/34278 [17:25:00<42:43:28, 8.34s/it] {'loss': 0.1646, 'grad_norm': 1.099463485103391, 'learning_rate': 5.8549003574863815e-06, 'epoch': 0.46}
15836/34278 [17:25:02<34:19:15, 6.70s/it] {'loss': 0.1418, 'grad_norm': 0.7539161030504511, 'learning_rate': 5.85443487634966e-06, 'epoch': 0.46}
15837/34278 [17:25:27<61:14:20, 11.95s/it] {'loss': 0.1406, 'grad_norm': 1.2446236246928624, 'learning_rate': 5.853969387584747e-06, 'epoch': 0.46}
15838/34278 [17:25:30<48:18:04, 9.43s/it] {'loss': 0.1418, 'grad_norm': 1.01262732191191, 'learning_rate': 5.853503891195797e-06, 'epoch': 0.46}
15839/34278 [17:25:33<38:42:59, 7.56s/it] {'loss': 0.1275, 'grad_norm': 1.0295086226658106, 'learning_rate': 5.8530383871869725e-06, 'epoch': 0.46}
15840/34278 [17:25:44<43:48:37, 8.55s/it] {'loss': 0.1301, 'grad_norm': 0.8157928167959317, 'learning_rate': 5.852572875562422e-06, 'epoch': 0.46}
15841/34278 [17:25:59<53:56:54, 10.53s/it] {'loss': 0.1339, 'grad_norm': 1.2160861930921467, 'learning_rate': 5.852107356326305e-06, 'epoch': 0.46}
15842/34278 [17:26:15<62:19:21, 12.17s/it] {'loss': 0.1403, 'grad_norm': 1.0446637605268678, 'learning_rate': 5.851641829482777e-06, 'epoch': 0.46}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
15843/34278 [17:26:45<88:28:03, 17.28s/it] {'loss': 0.1238, 'grad_norm': 0.9811265382880493, 'learning_rate': 5.851176295035994e-06, 'epoch': 0.46}
15844/34278 [17:27:01<86:37:16, 16.92s/it] {'loss': 0.1242, 'grad_norm': 0.9901597261186631, 'learning_rate': 5.850710752990112e-06, 'epoch': 0.46}
15845/34278 [17:27:12<77:45:47, 15.19s/it] {'loss': 0.1528, 'grad_norm': 0.7981800484178719, 'learning_rate': 5.850245203349288e-06, 'epoch': 0.46}
15846/34278 [17:27:31<83:46:16, 16.36s/it] {'loss': 0.1567, 'grad_norm': 0.7895154486982267, 'learning_rate': 5.849779646117677e-06, 'epoch': 0.46}
15847/34278 [17:27:54<93:27:40, 18.26s/it] {'loss': 0.1386, 'grad_norm': 0.745081575095745, 'learning_rate': 5.849314081299436e-06, 'epoch': 0.46}
15848/34278 [17:28:23<111:05:09, 21.70s/it] {'loss': 0.1277, 'grad_norm': 0.8394289508340748, 'learning_rate': 5.848848508898722e-06, 'epoch': 0.46}
15849/34278 [17:28:47<113:53:52, 22.25s/it] {'loss': 0.1425, 'grad_norm': 0.8546716452311085, 'learning_rate': 5.848382928919693e-06, 'epoch': 0.46}
15850/34278 [17:28:53<88:37:50, 17.31s/it] {'loss': 0.1315, 'grad_norm': 0.8334257354863847, 'learning_rate': 5.847917341366501e-06, 'epoch': 0.46}
15851/34278 [17:28:56<67:47:31, 13.24s/it] {'loss': 0.1282, 'grad_norm': 0.9946812920536485, 'learning_rate': 5.847451746243306e-06, 'epoch': 0.46}
15852/34278 [17:29:10<67:44:44, 13.24s/it] {'loss': 0.1525, 'grad_norm': 1.0914693586102404, 'learning_rate': 5.846986143554265e-06, 'epoch': 0.46}
15853/34278 [17:29:25<70:38:38, 13.80s/it] {'loss': 0.1485, 'grad_norm': 1.0420059449555357, 'learning_rate': 5.846520533303532e-06, 'epoch': 0.46}
15854/34278 [17:29:36<66:30:47, 13.00s/it] {'loss': 0.1142, 'grad_norm': 0.7094138955921574, 'learning_rate': 5.846054915495269e-06, 'epoch': 0.46}
15855/34278 [17:29:57<79:16:55, 15.49s/it] {'loss': 0.1377, 'grad_norm': 1.051887299073645, 'learning_rate': 5.845589290133627e-06, 'epoch': 0.46}
15856/34278 [17:30:08<72:16:11, 14.12s/it] {'loss': 0.1548, 'grad_norm': 1.0753670585455644, 'learning_rate': 5.845123657222768e-06, 'epoch': 0.46}
15857/34278 [17:30:33<88:06:26, 17.22s/it] {'loss': 0.1188, 'grad_norm': 0.8494958539932758, 'learning_rate': 5.844658016766845e-06, 'epoch': 0.46}
15858/34278 [17:30:46<82:48:29, 16.18s/it] {'loss': 0.1272, 'grad_norm': 0.6343566290332142, 'learning_rate': 5.844192368770017e-06, 'epoch': 0.46}
15859/34278 [17:30:49<62:33:14, 12.23s/it] {'loss': 0.1326, 'grad_norm': 0.8215137434270036, 'learning_rate': 5.843726713236442e-06, 'epoch': 0.46}
15860/34278 [17:31:13<79:36:35, 15.56s/it] {'loss': 0.1321, 'grad_norm': 1.092692961779195, 'learning_rate': 5.843261050170274e-06, 'epoch': 0.46}
15861/34278 [17:31:42<100:38:05, 19.67s/it] {'loss': 0.1259, 'grad_norm': 0.7652329509303649, 'learning_rate': 5.842795379575675e-06, 'epoch': 0.46}
15862/34278 [17:31:45<75:35:57, 14.78s/it] {'loss': 0.1304, 'grad_norm': 0.6855271583271109, 'learning_rate': 5.842329701456799e-06, 'epoch': 0.46}
15863/34278 [17:32:05<83:27:21, 16.32s/it] {'loss': 0.1292, 'grad_norm': 0.7693759880466009, 'learning_rate': 5.841864015817804e-06, 'epoch': 0.46}
15864/34278 [17:32:08<62:43:30, 12.26s/it] {'loss': 0.1294, 'grad_norm': 1.107998901306329, 'learning_rate': 5.84139832266285e-06, 'epoch': 0.46}
15865/34278 [17:32:40<92:44:21, 18.13s/it] {'loss': 0.125, 'grad_norm': 0.7082320486190519, 'learning_rate': 5.84093262199609e-06, 'epoch': 0.46}
15866/34278 [17:32:54<86:30:19, 16.91s/it] {'loss': 0.139, 'grad_norm': 0.6618581406917674, 'learning_rate': 5.840466913821687e-06, 'epoch': 0.46}
15867/34278 [17:33:10<85:57:01, 16.81s/it] {'loss': 0.1392, 'grad_norm': 0.9890547647006038, 'learning_rate': 5.840001198143795e-06, 'epoch': 0.46}
15868/34278 [17:33:23<79:32:01, 15.55s/it] {'loss': 0.1412, 'grad_norm': 0.8354134840361568, 'learning_rate': 5.8395354749665725e-06, 'epoch': 0.46}
15869/34278 [17:33:27<61:39:52, 12.06s/it] {'loss': 0.1183, 'grad_norm': 0.6994827796111657, 'learning_rate': 5.839069744294178e-06, 'epoch': 0.46}
15870/34278 [17:33:43<67:26:08, 13.19s/it] {'loss': 0.1315, 'grad_norm': 0.7988385418865582, 'learning_rate': 5.838604006130769e-06, 'epoch': 0.46}
15871/34278 [17:33:57<69:32:44, 13.60s/it] {'loss': 0.1253, 'grad_norm': 0.8536960901789566, 'learning_rate': 5.8381382604805035e-06, 'epoch': 0.46}
15872/34278 [17:34:01<54:53:53, 10.74s/it] {'loss': 0.1369, 'grad_norm': 0.8348528895425051, 'learning_rate': 5.83767250734754e-06, 'epoch': 0.46}
15873/34278 [17:34:05<43:33:49, 8.52s/it] {'loss': 0.1319, 'grad_norm': 0.703700913853645, 'learning_rate': 5.837206746736036e-06, 'epoch': 0.46}
15874/34278 [17:34:08<35:02:20, 6.85s/it] {'loss': 0.1438, 'grad_norm': 0.721868231071233, 'learning_rate': 5.836740978650149e-06, 'epoch': 0.46}
15875/34278 [17:34:12<30:54:59, 6.05s/it] {'loss': 0.1312, 'grad_norm': 0.9297929393124746, 'learning_rate': 5.83627520309404e-06, 'epoch': 0.46}
15876/34278 [17:34:27<44:47:55, 8.76s/it] {'loss': 0.1378, 'grad_norm': 0.7933117896441769, 'learning_rate': 5.835809420071865e-06, 'epoch': 0.46}
15877/34278 [17:34:30<36:43:13, 7.18s/it] {'loss': 0.1574, 'grad_norm': 0.8277036736973242, 'learning_rate': 5.835343629587783e-06, 'epoch': 0.46}
15878/34278 [17:34:33<29:53:09, 5.85s/it] {'loss': 0.1508, 'grad_norm': 0.8011762285766284, 'learning_rate': 5.834877831645952e-06, 'epoch': 0.46}
15879/34278 [17:34:37<26:22:05, 5.16s/it] {'loss': 0.1466, 'grad_norm': 0.8365521808430085, 'learning_rate': 5.8344120262505335e-06, 'epoch': 0.46}
15880/34278 [17:34:52<42:24:00, 8.30s/it] {'loss': 0.1448, 'grad_norm': 0.8445417585011479, 'learning_rate': 5.8339462134056805e-06, 'epoch': 0.46}
15881/34278 [17:35:04<46:54:19, 9.18s/it] {'loss': 0.1441, 'grad_norm': 0.8004943882549624, 'learning_rate': 5.833480393115556e-06, 'epoch': 0.46}
15882/34278 [17:35:08<40:10:31, 7.86s/it] {'loss': 0.1138, 'grad_norm': 0.7248928591422803, 'learning_rate': 5.833014565384318e-06, 'epoch': 0.46}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
15883/34278 [17:35:20<45:49:29, 8.97s/it] {'loss': 0.1358, 'grad_norm': 1.2993545420646264, 'learning_rate': 5.832548730216123e-06, 'epoch': 0.46}
15884/34278 [17:35:26<40:53:54, 8.00s/it] {'loss': 0.128, 'grad_norm': 0.888070530529126, 'learning_rate': 5.832082887615134e-06, 'epoch': 0.46}
15885/34278 [17:35:29<33:36:10, 6.58s/it] {'loss': 0.1354, 'grad_norm': 0.8489878461255642, 'learning_rate': 5.8316170375855065e-06, 'epoch': 0.46}
15886/34278 [17:35:32<28:33:37, 5.59s/it] {'loss': 0.1479, 'grad_norm': 0.7668677989178522, 'learning_rate': 5.8311511801314e-06, 'epoch': 0.46}
15887/34278 [17:35:36<25:46:04, 5.04s/it] {'loss': 0.1384, 'grad_norm': 0.8060949317504428, 'learning_rate': 5.8306853152569755e-06, 'epoch': 0.46}
15888/34278 [17:35:40<23:37:06, 4.62s/it] {'loss': 0.1749, 'grad_norm': 0.7966606036787572, 'learning_rate': 5.83021944296639e-06, 'epoch': 0.46}
15889/34278 [17:35:43<21:32:55, 4.22s/it] {'loss': 0.1172, 'grad_norm': 0.7777750236694937, 'learning_rate': 5.829753563263803e-06, 'epoch': 0.46}
15890/34278 [17:35:46<19:56:02, 3.90s/it] {'loss': 0.1285, 'grad_norm': 0.830723371055366, 'learning_rate': 5.829287676153375e-06, 'epoch': 0.46}
15891/34278 [17:35:52<22:23:49, 4.39s/it] {'loss': 0.1498, 'grad_norm': 0.8333860237145468, 'learning_rate': 5.828821781639264e-06, 'epoch': 0.46}
15892/34278 [17:35:55<21:28:27, 4.20s/it] {'loss': 0.1189, 'grad_norm': 0.8372865646905908, 'learning_rate': 5.828355879725632e-06, 'epoch': 0.46}
15893/34278 [17:35:59<20:02:26, 3.92s/it] {'loss': 0.1346, 'grad_norm': 0.8435699699315544, 'learning_rate': 5.827889970416634e-06, 'epoch': 0.46}
15894/34278 [17:36:02<19:21:56, 3.79s/it] {'loss': 0.1518, 'grad_norm': 0.7458663290715156, 'learning_rate': 5.827424053716434e-06, 'epoch': 0.46}
15895/34278 [17:36:08<22:49:14, 4.47s/it] {'loss': 0.1417, 'grad_norm': 0.7838148068917841, 'learning_rate': 5.826958129629187e-06, 'epoch': 0.46}
15896/34278 [17:36:12<21:21:35, 4.18s/it] {'loss': 0.126, 'grad_norm': 0.888477035667017, 'learning_rate': 5.826492198159058e-06, 'epoch': 0.46}
15897/34278 [17:36:15<19:41:54, 3.86s/it] {'loss': 0.1457, 'grad_norm': 0.8729074064593524, 'learning_rate': 5.826026259310202e-06, 'epoch': 0.46}
15898/34278 [17:36:18<19:00:40, 3.72s/it] {'loss': 0.1425, 'grad_norm': 0.9131639863595894, 'learning_rate': 5.825560313086781e-06, 'epoch': 0.46}
15899/34278 [17:36:22<18:24:50, 3.61s/it] {'loss': 0.1552, 'grad_norm': 0.9122556029639202, 'learning_rate': 5.825094359492955e-06, 'epoch': 0.46}
15900/34278 [17:36:25<17:50:33, 3.50s/it] {'loss': 0.1385, 'grad_norm': 0.8976113489352935, 'learning_rate': 5.8246283985328845e-06, 'epoch': 0.46}
15901/34278 [17:36:29<19:03:17, 3.73s/it] {'loss': 0.1447, 'grad_norm': 0.8362597188678967, 'learning_rate': 5.824162430210727e-06, 'epoch': 0.46}
15902/34278 [17:36:32<18:13:08, 3.57s/it] {'loss': 0.1453, 'grad_norm': 0.9641924428098306, 'learning_rate': 5.823696454530645e-06, 'epoch': 0.46}
15903/34278 [17:36:35<17:26:57, 3.42s/it] {'loss': 0.1274, 'grad_norm': 0.975611475874682, 'learning_rate': 5.823230471496797e-06, 'epoch': 0.46}
15904/34278 [17:36:41<21:26:01, 4.20s/it] {'loss': 0.122, 'grad_norm': 0.7284168076885339, 'learning_rate': 5.822764481113345e-06, 'epoch': 0.46}
15905/34278 [17:36:45<20:28:41, 4.01s/it] {'loss': 0.1305, 'grad_norm': 0.8004637172072524, 'learning_rate': 5.822298483384446e-06, 'epoch': 0.46}
15906/34278 [17:36:49<21:02:52, 4.12s/it] {'loss': 0.1376, 'grad_norm': 1.3252533327640181, 'learning_rate': 5.821832478314265e-06, 'epoch': 0.46}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
15907/34278 [17:36:52<19:26:34, 3.81s/it] {'loss': 0.1339, 'grad_norm': 0.9785839268351734, 'learning_rate': 5.821366465906958e-06, 'epoch': 0.46}
15908/34278 [17:36:55<18:10:37, 3.56s/it] {'loss': 0.1448, 'grad_norm': 0.8941719513667673, 'learning_rate': 5.820900446166687e-06, 'epoch': 0.46}
15909/34278 [17:36:59<18:03:29, 3.54s/it] {'loss': 0.1568, 'grad_norm': 0.9483706041460341, 'learning_rate': 5.820434419097614e-06, 'epoch': 0.46}
15910/34278 [17:37:02<17:16:53, 3.39s/it] {'loss': 0.115, 'grad_norm': 1.0202428273220814, 'learning_rate': 5.819968384703898e-06, 'epoch': 0.46}
15911/34278 [17:37:07<19:14:51, 3.77s/it] {'loss': 0.1394, 'grad_norm': 0.7549637289971923, 'learning_rate': 5.819502342989701e-06, 'epoch': 0.46}
15912/34278 [17:37:10<18:54:37, 3.71s/it] {'loss': 0.1595, 'grad_norm': 0.8714901820124208, 'learning_rate': 5.81903629395918e-06, 'epoch': 0.46}
15913/34278 [17:37:16<21:35:15, 4.23s/it] {'loss': 0.152, 'grad_norm': 0.9732469182318602, 'learning_rate': 5.818570237616501e-06, 'epoch': 0.46}
15914/34278 [17:37:18<19:33:17, 3.83s/it] {'loss': 0.1264, 'grad_norm': 0.842016257223515, 'learning_rate': 5.818104173965822e-06, 'epoch': 0.46}
15915/34278 [17:37:23<19:56:29, 3.91s/it] {'loss': 0.1548, 'grad_norm': 0.8294549762141147, 'learning_rate': 5.817638103011303e-06, 'epoch': 0.46}
15916/34278 [17:37:26<18:40:32, 3.66s/it] {'loss': 0.1276, 'grad_norm': 1.1116113134622982, 'learning_rate': 5.817172024757107e-06, 'epoch': 0.46}
15917/34278 [17:37:30<19:06:42, 3.75s/it] {'loss': 0.153, 'grad_norm': 0.8217614550077912, 'learning_rate': 5.8167059392073945e-06, 'epoch': 0.46}
15918/34278 [17:37:33<17:54:51, 3.51s/it] {'loss': 0.1522, 'grad_norm': 1.0098753202766408, 'learning_rate': 5.816239846366325e-06, 'epoch': 0.46}
15919/34278 [17:37:36<17:39:36, 3.46s/it] {'loss': 0.138, 'grad_norm': 0.9738587566490842, 'learning_rate': 5.815773746238063e-06, 'epoch': 0.46}
15920/34278 [17:37:39<17:08:07, 3.36s/it] {'loss': 0.1279, 'grad_norm': 0.7493340709827934, 'learning_rate': 5.815307638826767e-06, 'epoch': 0.46}
15921/34278 [17:37:43<18:44:54, 3.68s/it] {'loss': 0.1448, 'grad_norm': 0.8826322638455888, 'learning_rate': 5.8148415241365985e-06, 'epoch': 0.46}
15922/34278 [17:37:47<18:24:42, 3.61s/it] {'loss': 0.1314, 'grad_norm': 0.7779789604266115, 'learning_rate': 5.81437540217172e-06, 'epoch': 0.46}
15923/34278 [17:37:53<21:54:42, 4.30s/it] {'loss': 0.1548, 'grad_norm': 0.9722819435858137, 'learning_rate': 5.8139092729362925e-06, 'epoch': 0.46}
15924/34278 [17:37:56<20:35:36, 4.04s/it] {'loss': 0.1209, 'grad_norm': 0.8050178260881279, 'learning_rate': 5.813443136434475e-06, 'epoch': 0.46}
15925/34278 [17:37:59<19:16:13, 3.78s/it] {'loss': 0.1433, 'grad_norm': 0.9171370105747304, 'learning_rate': 5.812976992670434e-06, 'epoch': 0.46}
15926/34278 [17:38:04<20:11:49, 3.96s/it] {'loss': 0.1394, 'grad_norm': 0.8076803104860456, 'learning_rate': 5.812510841648329e-06, 'epoch': 0.46}
15927/34278 [17:38:08<20:34:12, 4.04s/it] {'loss': 0.1444, 'grad_norm': 0.9973335139882092, 'learning_rate': 5.812044683372318e-06, 'epoch': 0.46}
15928/34278 [17:38:11<19:16:48, 3.78s/it] {'loss': 0.1407, 'grad_norm': 0.7753992461313428, 'learning_rate': 5.811578517846567e-06, 'epoch': 0.46}
15929/34278 [17:38:15<19:41:33, 3.86s/it] {'loss': 0.1246, 'grad_norm': 0.9103740934767576, 'learning_rate': 5.81111234507524e-06, 'epoch': 0.46}
15930/34278 [17:38:19<18:51:29, 3.70s/it] {'loss': 0.1285, 'grad_norm': 0.9864108966598794, 'learning_rate': 5.810646165062491e-06, 'epoch': 0.46}
15931/34278 [17:38:22<17:51:07, 3.50s/it] {'loss': 0.1346, 'grad_norm': 0.9102732017805926, 'learning_rate': 5.8101799778124905e-06, 'epoch': 0.46}
15932/34278 [17:38:25<18:21:23, 3.60s/it] {'loss': 0.1317, 'grad_norm': 0.9376487685418402, 'learning_rate': 5.809713783329395e-06, 'epoch': 0.46}
15933/34278 [17:38:31<21:55:54, 4.30s/it] {'loss': 0.1367, 'grad_norm': 1.0161822282457287, 'learning_rate': 5.809247581617366e-06, 'epoch': 0.46}
15934/34278 [17:38:35<20:29:49, 4.02s/it] {'loss': 0.1301, 'grad_norm': 0.7920617594439303, 'learning_rate': 5.808781372680571e-06, 'epoch': 0.46}
15935/34278 [17:38:39<20:17:24, 3.98s/it] {'loss': 0.1397, 'grad_norm': 1.4013070796833542, 'learning_rate': 5.808315156523168e-06, 'epoch': 0.46}
15936/34278 [17:38:43<20:17:15, 3.98s/it] {'loss': 0.1309, 'grad_norm': 0.9982127411340717, 'learning_rate': 5.807848933149319e-06, 'epoch': 0.46}
15937/34278 [17:38:46<18:49:45, 3.70s/it] {'loss': 0.1414, 'grad_norm': 0.8397415543048476, 'learning_rate': 5.807382702563188e-06, 'epoch': 0.46}
15938/34278 [17:38:49<18:02:03, 3.54s/it] {'loss': 0.1287, 'grad_norm': 1.347567736516576, 'learning_rate': 5.806916464768938e-06, 'epoch': 0.46}
15939/34278 [17:38:52<17:08:40, 3.37s/it] {'loss': 0.128, 'grad_norm': 1.4014554496035387, 'learning_rate': 5.80645021977073e-06, 'epoch': 0.46}
15940/34278 [17:38:55<17:01:59, 3.34s/it] {'loss': 0.1406, 'grad_norm': 0.8452629014948132, 'learning_rate': 5.8059839675727255e-06, 'epoch': 0.47}
15941/34278 [17:38:58<16:34:08, 3.25s/it] {'loss': 0.1409, 'grad_norm': 0.8116111608551361, 'learning_rate': 5.8055177081790916e-06, 'epoch': 0.47}
15942/34278 [17:39:02<17:05:06, 3.35s/it] {'loss': 0.1388, 'grad_norm': 1.0389772590121729, 'learning_rate': 5.805051441593985e-06, 'epoch': 0.47}
15943/34278 [17:39:05<17:36:23, 3.46s/it] {'loss': 0.118, 'grad_norm': 0.8691310836913704, 'learning_rate': 5.804585167821572e-06, 'epoch': 0.47}
15944/34278 [17:39:08<16:55:58, 3.32s/it] {'loss': 0.1363, 'grad_norm': 0.7957873510033092, 'learning_rate': 5.804118886866016e-06, 'epoch': 0.47}
15945/34278 [17:39:12<16:55:37, 3.32s/it] {'loss': 0.1336, 'grad_norm': 0.9212943282096638, 'learning_rate': 5.803652598731476e-06, 'epoch': 0.47}
15946/34278 [17:39:15<16:11:47, 3.18s/it] {'loss': 0.1247, 'grad_norm': 0.7731884868494172, 'learning_rate': 5.80318630342212e-06, 'epoch': 0.47}
15947/34278 [17:39:18<16:58:49, 3.33s/it] {'loss': 0.1445, 'grad_norm': 0.7845451088379048, 'learning_rate': 5.802720000942108e-06, 'epoch': 0.47}
15948/34278 [17:39:21<16:36:35, 3.26s/it] {'loss': 0.145, 'grad_norm': 0.8924213228390518, 'learning_rate': 5.802253691295602e-06, 'epoch': 0.47}
15949/34278 [17:39:27<20:31:04, 4.03s/it] {'loss': 0.1665, 'grad_norm': 0.8146316307558824, 'learning_rate': 5.801787374486768e-06, 'epoch': 0.47}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
15950/34278 [17:39:31<20:16:09, 3.98s/it] {'loss': 0.1299, 'grad_norm': 0.7357533913377574, 'learning_rate': 5.801321050519768e-06, 'epoch': 0.47}
15951/34278 [17:39:35<20:09:27, 3.96s/it] {'loss': 0.1381, 'grad_norm': 1.0523115059686823, 'learning_rate': 5.800854719398764e-06, 'epoch': 0.47}
15952/34278 [17:39:38<19:01:51, 3.74s/it] {'loss': 0.1508, 'grad_norm': 1.0395106948844963, 'learning_rate': 5.80038838112792e-06, 'epoch': 0.47}
15953/34278 [17:39:41<18:09:06, 3.57s/it] {'loss': 0.1473, 'grad_norm': 0.7198719787956999, 'learning_rate': 5.799922035711401e-06, 'epoch': 0.47}
15954/34278 [17:39:47<20:57:46, 4.12s/it] {'loss': 0.1229, 'grad_norm': 0.7866665469816593, 'learning_rate': 5.799455683153367e-06, 'epoch': 0.47}
15955/34278 [17:39:51<20:34:36, 4.04s/it] {'loss': 0.1405, 'grad_norm': 0.822350925080411, 'learning_rate': 5.798989323457984e-06, 'epoch': 0.47}
15956/34278 [17:39:54<19:26:27, 3.82s/it] {'loss': 0.1279, 'grad_norm': 0.6539063365948917, 'learning_rate': 5.798522956629418e-06, 'epoch': 0.47}
15957/34278 [17:39:57<18:02:49, 3.55s/it] {'loss': 0.1308, 'grad_norm': 0.9022568533322931, 'learning_rate': 5.798056582671825e-06, 'epoch': 0.47}
15958/34278 [17:40:00<17:58:49, 3.53s/it] {'loss': 0.1523, 'grad_norm': 0.8149328748712105, 'learning_rate': 5.797590201589376e-06, 'epoch': 0.47}
15959/34278 [17:40:03<17:17:50, 3.40s/it] {'loss': 0.1322, 'grad_norm': 0.8771860608393286, 'learning_rate': 5.7971238133862324e-06, 'epoch': 0.47}
15960/34278 [17:40:07<17:27:47, 3.43s/it] {'loss': 0.1511, 'grad_norm': 0.9190749951306041, 'learning_rate': 5.796657418066556e-06, 'epoch': 0.47}
15961/34278 [17:40:13<21:35:17, 4.24s/it] {'loss': 0.1421, 'grad_norm': 0.8041082549146434, 'learning_rate': 5.796191015634515e-06, 'epoch': 0.47}
15962/34278 [17:40:16<19:27:16, 3.82s/it] {'loss': 0.1412, 'grad_norm': 0.8021631778731418, 'learning_rate': 5.795724606094269e-06, 'epoch': 0.47}
15963/34278 [17:40:19<18:46:27, 3.69s/it] {'loss': 0.1599, 'grad_norm': 0.972627726391673, 'learning_rate': 5.795258189449983e-06, 'epoch': 0.47}
15964/34278 [17:40:25<22:13:06, 4.37s/it] {'loss': 0.1253, 'grad_norm': 0.911331091011921, 'learning_rate': 5.794791765705823e-06, 'epoch': 0.47}
15965/34278 [17:40:31<23:39:23, 4.65s/it] {'loss': 0.1567, 'grad_norm': 0.7905016224581348, 'learning_rate': 5.79432533486595e-06, 'epoch': 0.47}
15966/34278 [17:40:34<22:17:49, 4.38s/it] {'loss': 0.1376, 'grad_norm': 1.025032660651501, 'learning_rate': 5.793858896934532e-06, 'epoch': 0.47}
15967/34278 [17:40:37<20:05:23, 3.95s/it] {'loss': 0.1552, 'grad_norm': 0.9245182905301442, 'learning_rate': 5.79339245191573e-06, 'epoch': 0.47}
15968/34278 [17:40:43<23:15:19, 4.57s/it] {'loss': 0.1534, 'grad_norm': 1.1764974665898509, 'learning_rate': 5.79292599981371e-06, 'epoch': 0.47}
15969/34278 [17:40:47<21:40:28, 4.26s/it] {'loss': 0.1402, 'grad_norm': 0.8612890491522468, 'learning_rate': 5.792459540632636e-06, 'epoch': 0.47}
15970/34278 [17:40:51<21:04:27, 4.14s/it] {'loss': 0.1549, 'grad_norm': 0.7767768190759483, 'learning_rate': 5.791993074376673e-06, 'epoch': 0.47}
15971/34278 [17:40:54<20:07:11, 3.96s/it] {'loss': 0.1436, 'grad_norm': 0.7311189609185839, 'learning_rate': 5.791526601049985e-06, 'epoch': 0.47}
15972/34278 [17:41:00<23:05:47, 4.54s/it] {'loss': 0.1271, 'grad_norm': 0.7177933757195226, 'learning_rate': 5.791060120656735e-06, 'epoch': 0.47}
15973/34278 [17:41:05<24:10:54, 4.76s/it] {'loss': 0.1503, 'grad_norm': 0.7610377232259685, 'learning_rate': 5.790593633201089e-06, 'epoch': 0.47}
15974/34278 [17:41:08<21:10:35, 4.16s/it] {'loss': 0.1269, 'grad_norm': 0.8474659506844789, 'learning_rate': 5.790127138687215e-06, 'epoch': 0.47}
15975/34278 [17:41:13<21:56:00, 4.31s/it] {'loss': 0.122, 'grad_norm': 0.7254915854900958, 'learning_rate': 5.789660637119271e-06, 'epoch': 0.47}
15976/34278 [17:41:16<20:53:33, 4.11s/it] {'loss': 0.147, 'grad_norm': 0.7989884598295096, 'learning_rate': 5.789194128501428e-06, 'epoch': 0.47}
15977/34278 [17:41:20<20:40:27, ...] {'loss': 0.1355, 'grad_norm': 0.8491314139759364, 'learning_rate': 5.788727612837846e-06, 'epoch': 0.47}
4.07s/it] 47%|████▋ | 15978/34278 [17:41:26<23:41:02, 4.66s/it] {'loss': 0.1218, 'grad_norm': 0.761607299632363, 'learning_rate': 5.788261090132693e-06, 'epoch': 0.47} 47%|████▋ | 15978/34278 [17:41:26<23:41:02, 4.66s/it] 47%|████▋ | 15979/34278 [17:41:30<21:44:51, 4.28s/it] {'loss': 0.1347, 'grad_norm': 0.6986230239896588, 'learning_rate': 5.787794560390133e-06, 'epoch': 0.47} 47%|████▋ | 15979/34278 [17:41:30<21:44:51, 4.28s/it] 47%|████▋ | 15980/34278 [17:41:33<19:58:37, 3.93s/it] {'loss': 0.144, 'grad_norm': 1.054838554348645, 'learning_rate': 5.787328023614331e-06, 'epoch': 0.47} 47%|████▋ | 15980/34278 [17:41:33<19:58:37, 3.93s/it] 47%|████▋ | 15981/34278 [17:41:36<18:38:26, 3.67s/it] {'loss': 0.1497, 'grad_norm': 0.8017458873753452, 'learning_rate': 5.786861479809453e-06, 'epoch': 0.47} 47%|████▋ | 15981/34278 [17:41:36<18:38:26, 3.67s/it] 47%|████▋ | 15982/34278 [17:41:39<17:39:48, 3.48s/it] {'loss': 0.1215, 'grad_norm': 0.7515159455250477, 'learning_rate': 5.786394928979663e-06, 'epoch': 0.47} 47%|████▋ | 15982/34278 [17:41:39<17:39:48, 3.48s/it] 47%|████▋ | 15983/34278 [17:41:42<17:32:21, 3.45s/it] {'loss': 0.131, 'grad_norm': 0.7126228538215411, 'learning_rate': 5.785928371129127e-06, 'epoch': 0.47} 47%|████▋ | 15983/34278 [17:41:42<17:32:21, 3.45s/it] 47%|████▋ | 15984/34278 [17:41:46<17:30:21, 3.44s/it] {'loss': 0.1183, 'grad_norm': 0.7843490446497038, 'learning_rate': 5.785461806262011e-06, 'epoch': 0.47} 47%|████▋ | 15984/34278 [17:41:46<17:30:21, 3.44s/it] 47%|████▋ | 15985/34278 [17:41:50<18:27:47, 3.63s/it] {'loss': 0.1194, 'grad_norm': 0.6464048953221874, 'learning_rate': 5.784995234382478e-06, 'epoch': 0.47} 47%|████▋ | 15985/34278 [17:41:50<18:27:47, 3.63s/it] 47%|████▋ | 15986/34278 [17:41:56<22:31:44, 4.43s/it] {'loss': 0.1371, 'grad_norm': 0.6520631001183158, 'learning_rate': 5.784528655494697e-06, 'epoch': 0.47} 47%|████▋ | 15986/34278 [17:41:56<22:31:44, 4.43s/it] 47%|████▋ | 15987/34278 [17:42:02<23:58:03, 4.72s/it] {'loss': 0.1751, 
'grad_norm': 0.9026076849639647, 'learning_rate': 5.784062069602828e-06, 'epoch': 0.47} 47%|████▋ | 15987/34278 [17:42:02<23:58:03, 4.72s/it] 47%|████▋ | 15988/34278 [17:42:06<23:13:27, 4.57s/it] {'loss': 0.135, 'grad_norm': 0.8416307734789469, 'learning_rate': 5.783595476711043e-06, 'epoch': 0.47} 47%|████▋ | 15988/34278 [17:42:06<23:13:27, 4.57s/it] 47%|████▋ | 15989/34278 [17:42:09<20:50:19, 4.10s/it] {'loss': 0.118, 'grad_norm': 0.6961788531818385, 'learning_rate': 5.783128876823504e-06, 'epoch': 0.47} 47%|████▋ | 15989/34278 [17:42:09<20:50:19, 4.10s/it] 47%|████▋ | 15990/34278 [17:42:12<19:30:00, 3.84s/it] {'loss': 0.1398, 'grad_norm': 0.8461906207423968, 'learning_rate': 5.782662269944376e-06, 'epoch': 0.47} 47%|████▋ | 15990/34278 [17:42:12<19:30:00, 3.84s/it] 47%|████▋ | 15991/34278 [17:42:18<22:25:23, 4.41s/it] {'loss': 0.1219, 'grad_norm': 0.8957956304247597, 'learning_rate': 5.782195656077828e-06, 'epoch': 0.47} 47%|████▋ | 15991/34278 [17:42:18<22:25:23, 4.41s/it] 47%|████▋ | 15992/34278 [17:42:21<20:22:16, 4.01s/it] {'loss': 0.1477, 'grad_norm': 0.8668098247405077, 'learning_rate': 5.781729035228023e-06, 'epoch': 0.47} 47%|████▋ | 15992/34278 [17:42:21<20:22:16, 4.01s/it] 47%|████▋ | 15993/34278 [17:42:24<19:08:18, 3.77s/it] {'loss': 0.1321, 'grad_norm': 0.7105224761898491, 'learning_rate': 5.7812624073991276e-06, 'epoch': 0.47} 47%|████▋ | 15993/34278 [17:42:24<19:08:18, 3.77s/it] 47%|████▋ | 15994/34278 [17:42:27<18:10:23, 3.58s/it] {'loss': 0.1468, 'grad_norm': 1.012134838403854, 'learning_rate': 5.7807957725953076e-06, 'epoch': 0.47} 47%|████▋ | 15994/34278 [17:42:27<18:10:23, 3.58s/it] 47%|████▋ | 15995/34278 [17:42:30<17:09:39, 3.38s/it] {'loss': 0.1372, 'grad_norm': 0.8740211865235363, 'learning_rate': 5.78032913082073e-06, 'epoch': 0.47} 47%|████▋ | 15995/34278 [17:42:30<17:09:39, 3.38s/it] 47%|████▋ | 15996/34278 [17:42:33<16:57:32, 3.34s/it] {'loss': 0.1398, 'grad_norm': 0.8097992390360006, 'learning_rate': 5.7798624820795605e-06, 'epoch': 
0.47} 47%|████▋ | 15996/34278 [17:42:33<16:57:32, 3.34s/it] 47%|████▋ | 15997/34278 [17:42:37<16:45:37, 3.30s/it] {'loss': 0.148, 'grad_norm': 0.8794296185515897, 'learning_rate': 5.779395826375964e-06, 'epoch': 0.47} 47%|████▋ | 15997/34278 [17:42:37<16:45:37, 3.30s/it] 47%|████▋ | 15998/34278 [17:42:40<16:27:58, 3.24s/it] {'loss': 0.1562, 'grad_norm': 0.9570177270508721, 'learning_rate': 5.778929163714109e-06, 'epoch': 0.47} 47%|████▋ | 15998/34278 [17:42:40<16:27:58, 3.24s/it] 47%|████▋ | 15999/34278 [17:42:44<17:19:00, 3.41s/it] {'loss': 0.1143, 'grad_norm': 0.8375361115858077, 'learning_rate': 5.77846249409816e-06, 'epoch': 0.47} 47%|████▋ | 15999/34278 [17:42:44<17:19:00, 3.41s/it] 47%|████▋ | 16000/34278 [17:42:47<16:38:52, 3.28s/it] {'loss': 0.1292, 'grad_norm': 0.9187618810046481, 'learning_rate': 5.777995817532282e-06, 'epoch': 0.47} 47%|████▋ | 16000/34278 [17:42:47<16:38:52, 3.28s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 47%|████▋ | 16001/34278 [17:43:23<66:36:11, 13.12s/it] {'loss': 0.1346, 'grad_norm': 0.6914638111304797, 'learning_rate': 5.777529134020645e-06, 'epoch': 0.47} 47%|████▋ | 16001/34278 [17:43:23<66:36:11, 13.12s/it] 47%|████▋ | 16002/34278 [17:43:26<52:12:00, 10.28s/it] {'loss': 0.1461, 'grad_norm': 0.7145671042787071, 'learning_rate': 5.777062443567412e-06, 'epoch': 0.47} 47%|████▋ | 16002/34278 [17:43:26<52:12:00, 10.28s/it] 47%|████▋ | 16003/34278 [17:43:29<41:14:26, 8.12s/it] {'loss': 0.1321, 'grad_norm': 0.8079655492792002, 'learning_rate': 5.7765957461767515e-06, 'epoch': 0.47} 47%|████▋ | 16003/34278 [17:43:29<41:14:26, 8.12s/it] 47%|████▋ | 16004/34278 [17:43:34<36:01:03, 7.10s/it] {'loss': 0.1348, 'grad_norm': 0.8192383471776307, 'learning_rate': 5.776129041852831e-06, 'epoch': 0.47} 47%|████▋ | 16004/34278 [17:43:34<36:01:03, 7.10s/it] 47%|████▋ | 16005/34278 [17:43:40<34:22:53, 6.77s/it] {'loss': 0.1405, 'grad_norm': 0.8083178474820759, 'learning_rate': 5.775662330599814e-06, 'epoch': 0.47} 47%|████▋ | 16005/34278 [17:43:40<34:22:53, 6.77s/it] 47%|████▋ | 16006/34278 [17:43:44<29:31:18, 5.82s/it] {'loss': 0.157, 'grad_norm': 0.8795955703771724, 'learning_rate': 5.77519561242187e-06, 'epoch': 0.47} 47%|████▋ | 16006/34278 [17:43:44<29:31:18, 5.82s/it] 47%|████▋ | 16007/34278 [17:43:48<26:56:20, 5.31s/it] {'loss': 0.1226, 'grad_norm': 0.7997922708958513, 'learning_rate': 5.7747288873231645e-06, 'epoch': 0.47} 47%|████▋ | 16007/34278 [17:43:48<26:56:20, 5.31s/it] 47%|████▋ | 16008/34278 [17:43:54<27:55:26, 5.50s/it] {'loss': 0.1233, 'grad_norm': 0.776315606649705, 'learning_rate': 5.774262155307863e-06, 'epoch': 0.47} 47%|████▋ | 16008/34278 [17:43:54<27:55:26, 5.50s/it] 47%|████▋ | 16009/34278 [17:43:57<25:13:43, 4.97s/it] {'loss': 0.1416, 'grad_norm': 0.8044015518766559, 'learning_rate': 5.773795416380135e-06, 'epoch': 0.47} 47%|████▋ | 16009/34278 [17:43:57<25:13:43, 4.97s/it] 47%|████▋ | 16010/34278 
[17:44:03<26:50:24, 5.29s/it] {'loss': 0.1358, 'grad_norm': 0.8374210098741632, 'learning_rate': 5.773328670544146e-06, 'epoch': 0.47} 47%|████▋ | 16010/34278 [17:44:04<26:50:24, 5.29s/it] 47%|████▋ | 16011/34278 [17:44:08<25:03:00, 4.94s/it] {'loss': 0.1274, 'grad_norm': 0.8489569119467483, 'learning_rate': 5.772861917804064e-06, 'epoch': 0.47} 47%|████▋ | 16011/34278 [17:44:08<25:03:00, 4.94s/it] 47%|████▋ | 16012/34278 [17:44:10<21:52:10, 4.31s/it] {'loss': 0.1254, 'grad_norm': 0.6620216492087456, 'learning_rate': 5.772395158164054e-06, 'epoch': 0.47} 47%|████▋ | 16012/34278 [17:44:10<21:52:10, 4.31s/it] 47%|████▋ | 16013/34278 [17:44:14<20:35:15, 4.06s/it] {'loss': 0.1233, 'grad_norm': 1.1379218990602393, 'learning_rate': 5.771928391628284e-06, 'epoch': 0.47} 47%|████▋ | 16013/34278 [17:44:14<20:35:15, 4.06s/it] 47%|████▋ | 16014/34278 [17:44:17<19:42:45, 3.89s/it] {'loss': 0.1327, 'grad_norm': 0.9407417004271419, 'learning_rate': 5.771461618200923e-06, 'epoch': 0.47} 47%|████▋ | 16014/34278 [17:44:17<19:42:45, 3.89s/it] 47%|████▋ | 16015/34278 [17:44:21<19:11:12, 3.78s/it] {'loss': 0.1374, 'grad_norm': 0.8260911501985518, 'learning_rate': 5.770994837886137e-06, 'epoch': 0.47} 47%|████▋ | 16015/34278 [17:44:21<19:11:12, 3.78s/it] 47%|████▋ | 16016/34278 [17:44:24<18:44:42, 3.70s/it] {'loss': 0.1131, 'grad_norm': 0.8446010490366636, 'learning_rate': 5.770528050688093e-06, 'epoch': 0.47} 47%|████▋ | 16016/34278 [17:44:24<18:44:42, 3.70s/it] 47%|████▋ | 16017/34278 [17:44:28<18:16:25, 3.60s/it] {'loss': 0.1366, 'grad_norm': 0.7865372069291242, 'learning_rate': 5.770061256610957e-06, 'epoch': 0.47} 47%|████▋ | 16017/34278 [17:44:28<18:16:25, 3.60s/it] 47%|████▋ | 16018/34278 [17:44:31<17:31:14, 3.45s/it] {'loss': 0.0975, 'grad_norm': 0.8856937467203092, 'learning_rate': 5.769594455658899e-06, 'epoch': 0.47} 47%|████▋ | 16018/34278 [17:44:31<17:31:14, 3.45s/it] 47%|████▋ | 16019/34278 [17:44:34<17:03:04, 3.36s/it] {'loss': 0.1515, 'grad_norm': 1.0385318123528824, 
'learning_rate': 5.7691276478360854e-06, 'epoch': 0.47} 47%|████▋ | 16019/34278 [17:44:34<17:03:04, 3.36s/it] 47%|████▋ | 16020/34278 [17:44:37<16:33:35, 3.27s/it] {'loss': 0.1296, 'grad_norm': 0.8525272348093069, 'learning_rate': 5.768660833146683e-06, 'epoch': 0.47} 47%|████▋ | 16020/34278 [17:44:37<16:33:35, 3.27s/it] 47%|████▋ | 16021/34278 [17:44:40<16:09:17, 3.19s/it] {'loss': 0.1005, 'grad_norm': 0.7564119644539686, 'learning_rate': 5.7681940115948624e-06, 'epoch': 0.47} 47%|████▋ | 16021/34278 [17:44:40<16:09:17, 3.19s/it] 47%|████▋ | 16022/34278 [17:44:44<16:35:27, 3.27s/it] {'loss': 0.1272, 'grad_norm': 1.032340142288582, 'learning_rate': 5.767727183184787e-06, 'epoch': 0.47} 47%|████▋ | 16022/34278 [17:44:44<16:35:27, 3.27s/it] 47%|████▋ | 16023/34278 [17:44:50<21:02:05, 4.15s/it] {'loss': 0.1337, 'grad_norm': 0.9099153002462543, 'learning_rate': 5.767260347920627e-06, 'epoch': 0.47} 47%|████▋ | 16023/34278 [17:44:50<21:02:05, 4.15s/it] 47%|████▋ | 16024/34278 [17:44:53<19:45:42, 3.90s/it] {'loss': 0.1474, 'grad_norm': 0.846177825244513, 'learning_rate': 5.766793505806551e-06, 'epoch': 0.47} 47%|████▋ | 16024/34278 [17:44:53<19:45:42, 3.90s/it] 47%|████▋ | 16025/34278 [17:44:56<18:40:06, 3.68s/it] {'loss': 0.1163, 'grad_norm': 0.9650979193854139, 'learning_rate': 5.766326656846723e-06, 'epoch': 0.47} 47%|████▋ | 16025/34278 [17:44:56<18:40:06, 3.68s/it] 47%|████▋ | 16026/34278 [17:45:00<18:47:46, 3.71s/it] {'loss': 0.1446, 'grad_norm': 1.1737999601585896, 'learning_rate': 5.765859801045316e-06, 'epoch': 0.47} 47%|████▋ | 16026/34278 [17:45:00<18:47:46, 3.71s/it] 47%|████▋ | 16027/34278 [17:45:03<18:03:38, 3.56s/it] {'loss': 0.144, 'grad_norm': 0.7507372097227764, 'learning_rate': 5.765392938406494e-06, 'epoch': 0.47} 47%|████▋ | 16027/34278 [17:45:03<18:03:38, 3.56s/it] 47%|████▋ | 16028/34278 [17:45:07<17:36:49, 3.47s/it] {'loss': 0.1553, 'grad_norm': 1.0289732575570227, 'learning_rate': 5.764926068934428e-06, 'epoch': 0.47} 47%|████▋ | 16028/34278 
[17:45:07<17:36:49, 3.47s/it] 47%|████▋ | 16029/34278 [17:45:10<18:09:05, 3.58s/it] {'loss': 0.1426, 'grad_norm': 0.7891455298453132, 'learning_rate': 5.764459192633282e-06, 'epoch': 0.47} 47%|████▋ | 16029/34278 [17:45:10<18:09:05, 3.58s/it] 47%|████▋ | 16030/34278 [17:45:14<17:50:01, 3.52s/it] {'loss': 0.1417, 'grad_norm': 0.8292319534656433, 'learning_rate': 5.763992309507229e-06, 'epoch': 0.47} 47%|████▋ | 16030/34278 [17:45:14<17:50:01, 3.52s/it] 47%|████▋ | 16031/34278 [17:45:17<16:57:40, 3.35s/it] {'loss': 0.1526, 'grad_norm': 0.9021045559202837, 'learning_rate': 5.763525419560436e-06, 'epoch': 0.47} 47%|████▋ | 16031/34278 [17:45:17<16:57:40, 3.35s/it] 47%|████▋ | 16032/34278 [17:45:20<16:15:02, 3.21s/it] {'loss': 0.1202, 'grad_norm': 0.8264634081061879, 'learning_rate': 5.763058522797068e-06, 'epoch': 0.47} 47%|████▋ | 16032/34278 [17:45:20<16:15:02, 3.21s/it] 47%|████▋ | 16033/34278 [17:45:23<16:15:21, 3.21s/it] {'loss': 0.1367, 'grad_norm': 0.652515074592792, 'learning_rate': 5.762591619221297e-06, 'epoch': 0.47} 47%|████▋ | 16033/34278 [17:45:23<16:15:21, 3.21s/it] 47%|████▋ | 16034/34278 [17:45:26<16:15:28, 3.21s/it] {'loss': 0.1712, 'grad_norm': 0.7340814911279222, 'learning_rate': 5.762124708837291e-06, 'epoch': 0.47} 47%|████▋ | 16034/34278 [17:45:26<16:15:28, 3.21s/it] 47%|████▋ | 16035/34278 [17:45:31<18:27:44, 3.64s/it] {'loss': 0.124, 'grad_norm': 0.8506201084744504, 'learning_rate': 5.7616577916492145e-06, 'epoch': 0.47} 47%|████▋ | 16035/34278 [17:45:31<18:27:44, 3.64s/it] 47%|████▋ | 16036/34278 [17:45:34<18:00:53, 3.56s/it] {'loss': 0.1523, 'grad_norm': 0.7813046536314805, 'learning_rate': 5.761190867661243e-06, 'epoch': 0.47} 47%|████▋ | 16036/34278 [17:45:34<18:00:53, 3.56s/it] 47%|████▋ | 16037/34278 [17:45:37<17:19:14, 3.42s/it] {'loss': 0.1585, 'grad_norm': 0.7246186837938973, 'learning_rate': 5.760723936877538e-06, 'epoch': 0.47} 47%|████▋ | 16037/34278 [17:45:37<17:19:14, 3.42s/it] 47%|████▋ | 16038/34278 [17:45:40<16:57:49, 3.35s/it] 
{'loss': 0.1685, 'grad_norm': 0.8265533346747397, 'learning_rate': 5.760256999302273e-06, 'epoch': 0.47} 47%|████▋ | 16038/34278 [17:45:40<16:57:49, 3.35s/it] 47%|████▋ | 16039/34278 [17:45:43<16:17:42, 3.22s/it] {'loss': 0.132, 'grad_norm': 0.7512918950003025, 'learning_rate': 5.759790054939614e-06, 'epoch': 0.47} 47%|████▋ | 16039/34278 [17:45:43<16:17:42, 3.22s/it] 47%|████▋ | 16040/34278 [17:45:47<17:27:48, 3.45s/it] {'loss': 0.1321, 'grad_norm': 0.6419272741135773, 'learning_rate': 5.7593231037937306e-06, 'epoch': 0.47} 47%|████▋ | 16040/34278 [17:45:47<17:27:48, 3.45s/it] 47%|████▋ | 16041/34278 [17:45:51<18:13:58, 3.60s/it] {'loss': 0.135, 'grad_norm': 1.0469292057265425, 'learning_rate': 5.758856145868792e-06, 'epoch': 0.47} 47%|████▋ | 16041/34278 [17:45:51<18:13:58, 3.60s/it] 47%|████▋ | 16042/34278 [17:45:55<18:16:34, 3.61s/it] {'loss': 0.1423, 'grad_norm': 0.844045201941579, 'learning_rate': 5.758389181168967e-06, 'epoch': 0.47} 47%|████▋ | 16042/34278 [17:45:55<18:16:34, 3.61s/it] 47%|████▋ | 16043/34278 [17:46:01<22:03:56, 4.36s/it] {'loss': 0.1267, 'grad_norm': 0.7167638581889434, 'learning_rate': 5.757922209698424e-06, 'epoch': 0.47} 47%|████▋ | 16043/34278 [17:46:01<22:03:56, 4.36s/it] 47%|████▋ | 16044/34278 [17:46:07<24:43:56, 4.88s/it] {'loss': 0.146, 'grad_norm': 0.7103636203539796, 'learning_rate': 5.757455231461334e-06, 'epoch': 0.47} 47%|████▋ | 16044/34278 [17:46:07<24:43:56, 4.88s/it] 47%|████▋ | 16045/34278 [17:46:13<26:29:45, 5.23s/it] {'loss': 0.1391, 'grad_norm': 0.6696736563358334, 'learning_rate': 5.756988246461863e-06, 'epoch': 0.47} 47%|████▋ | 16045/34278 [17:46:13<26:29:45, 5.23s/it] 47%|████▋ | 16046/34278 [17:46:16<23:44:22, 4.69s/it] {'loss': 0.1278, 'grad_norm': 0.7317317777456038, 'learning_rate': 5.7565212547041835e-06, 'epoch': 0.47} 47%|████▋ | 16046/34278 [17:46:16<23:44:22, 4.69s/it] 47%|████▋ | 16047/34278 [17:46:20<21:28:43, 4.24s/it] {'loss': 0.1503, 'grad_norm': 0.637411729797091, 'learning_rate': 
5.75605425619246e-06, 'epoch': 0.47} 47%|████▋ | 16047/34278 [17:46:20<21:28:43, 4.24s/it] 47%|████▋ | 16048/34278 [17:46:23<20:09:07, 3.98s/it] {'loss': 0.1418, 'grad_norm': 1.0490510121574332, 'learning_rate': 5.755587250930866e-06, 'epoch': 0.47} 47%|████▋ | 16048/34278 [17:46:23<20:09:07, 3.98s/it] 47%|████▋ | 16049/34278 [17:46:26<18:27:34, 3.65s/it] {'loss': 0.1409, 'grad_norm': 0.8138231282346315, 'learning_rate': 5.75512023892357e-06, 'epoch': 0.47} 47%|████▋ | 16049/34278 [17:46:26<18:27:34, 3.65s/it] 47%|████▋ | 16050/34278 [17:46:32<22:12:00, 4.38s/it] {'loss': 0.124, 'grad_norm': 0.7919611992138462, 'learning_rate': 5.75465322017474e-06, 'epoch': 0.47} 47%|████▋ | 16050/34278 [17:46:32<22:12:00, 4.38s/it] 47%|████▋ | 16051/34278 [17:46:37<22:57:04, 4.53s/it] {'loss': 0.1767, 'grad_norm': 1.0982884992699131, 'learning_rate': 5.754186194688547e-06, 'epoch': 0.47} 47%|████▋ | 16051/34278 [17:46:37<22:57:04, 4.53s/it] 47%|████▋ | 16052/34278 [17:46:40<20:39:27, 4.08s/it] {'loss': 0.1439, 'grad_norm': 1.0299722455438751, 'learning_rate': 5.753719162469159e-06, 'epoch': 0.47} 47%|████▋ | 16052/34278 [17:46:40<20:39:27, 4.08s/it] 47%|████▋ | 16053/34278 [17:46:43<18:37:37, 3.68s/it] {'loss': 0.1132, 'grad_norm': 0.6987058633317436, 'learning_rate': 5.753252123520746e-06, 'epoch': 0.47} 47%|████▋ | 16053/34278 [17:46:43<18:37:37, 3.68s/it] 47%|████▋ | 16054/34278 [17:46:49<22:13:18, 4.39s/it] {'loss': 0.143, 'grad_norm': 1.0383339761552182, 'learning_rate': 5.7527850778474795e-06, 'epoch': 0.47} 47%|████▋ | 16054/34278 [17:46:49<22:13:18, 4.39s/it] 47%|████▋ | 16055/34278 [17:46:52<20:01:33, 3.96s/it] {'loss': 0.1486, 'grad_norm': 1.1578639748397883, 'learning_rate': 5.752318025453525e-06, 'epoch': 0.47} 47%|████▋ | 16055/34278 [17:46:52<20:01:33, 3.96s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8ae11bb2e0>
Failed to fetch sample 2654731.
Exception: cannot identify image file <_io.BytesIO object at 0x7f8ae11bb2e0> 47%|████▋ | 16056/34278 [17:46:55<18:29:04, 3.65s/it] {'loss': 0.1488, 'grad_norm': 0.7351432849730697, 'learning_rate': 5.751850966343057e-06, 'epoch': 0.47} 47%|████▋ | 16056/34278 [17:46:55<18:29:04, 3.65s/it] 47%|████▋ | 16057/34278 [17:47:00<21:55:37, 4.33s/it] {'loss': 0.1113, 'grad_norm': 0.843477890384064, 'learning_rate': 5.751383900520241e-06, 'epoch': 0.47} 47%|████▋ | 16057/34278 [17:47:01<21:55:37, 4.33s/it] 47%|████▋ | 16058/34278 [17:47:04<20:19:43, 4.02s/it] {'loss': 0.1327, 'grad_norm': 0.8823683745562853, 'learning_rate': 5.75091682798925e-06, 'epoch': 0.47} 47%|████▋ | 16058/34278 [17:47:04<20:19:43, 4.02s/it] 47%|████▋ | 16059/34278 [17:47:10<23:21:25, 4.62s/it] {'loss': 0.1338, 'grad_norm': 0.8005412105676123, 'learning_rate': 5.750449748754253e-06, 'epoch': 0.47} 47%|████▋ | 16059/34278 [17:47:10<23:21:25, 4.62s/it] 47%|████▋ | 16060/34278 [17:47:13<21:08:03, 4.18s/it] {'loss': 0.1686, 'grad_norm': 0.968143710825528, 'learning_rate': 5.74998266281942e-06, 'epoch': 0.47} 47%|████▋ | 16060/34278 [17:47:13<21:08:03, 4.18s/it] 47%|████▋ | 16061/34278 [17:47:17<21:19:50, 4.22s/it] {'loss': 0.1174, 'grad_norm': 0.8627773213283976, 'learning_rate': 5.7495155701889215e-06, 'epoch': 0.47} 47%|████▋ | 16061/34278 [17:47:17<21:19:50, 4.22s/it] 47%|████▋ | 16062/34278 [17:47:20<19:39:49, 3.89s/it] {'loss': 0.1568, 'grad_norm': 1.2187515398357416, 'learning_rate': 5.749048470866925e-06, 'epoch': 0.47} 47%|████▋ | 16062/34278 [17:47:20<19:39:49, 3.89s/it] 47%|████▋ | 16063/34278 [17:47:26<22:55:32, 4.53s/it] {'loss': 0.1558, 'grad_norm': 1.0308792401904288, 'learning_rate': 5.748581364857603e-06, 'epoch': 0.47} 47%|████▋ | 16063/34278 [17:47:26<22:55:32, 4.53s/it] 47%|████▋ | 16064/34278 [17:47:29<20:23:09, 4.03s/it] {'loss': 0.1393, 'grad_norm': 0.9357220981721602, 'learning_rate': 5.748114252165127e-06, 'epoch': 0.47} 47%|████▋ | 16064/34278 [17:47:29<20:23:09, 4.03s/it] 
47%|████▋ | 16065/34278 [17:47:35<22:56:27, 4.53s/it] {'loss': 0.1467, 'grad_norm': 0.8992948611466673, 'learning_rate': 5.747647132793662e-06, 'epoch': 0.47} 47%|████▋ | 16065/34278 [17:47:35<22:56:27, 4.53s/it] 47%|████▋ | 16066/34278 [17:47:38<21:05:41, 4.17s/it] {'loss': 0.1695, 'grad_norm': 0.9082733345381605, 'learning_rate': 5.747180006747386e-06, 'epoch': 0.47} 47%|████▋ | 16066/34278 [17:47:38<21:05:41, 4.17s/it] 47%|████▋ | 16067/34278 [17:47:41<19:34:45, 3.87s/it] {'loss': 0.1391, 'grad_norm': 0.8600373402962813, 'learning_rate': 5.746712874030462e-06, 'epoch': 0.47} 47%|████▋ | 16067/34278 [17:47:41<19:34:45, 3.87s/it] 47%|████▋ | 16068/34278 [17:47:45<18:42:24, 3.70s/it] {'loss': 0.1266, 'grad_norm': 0.8737621041818534, 'learning_rate': 5.746245734647066e-06, 'epoch': 0.47} 47%|████▋ | 16068/34278 [17:47:45<18:42:24, 3.70s/it] 47%|████▋ | 16069/34278 [17:47:49<18:55:22, 3.74s/it] {'loss': 0.1628, 'grad_norm': 1.2121580779759222, 'learning_rate': 5.745778588601365e-06, 'epoch': 0.47} 47%|████▋ | 16069/34278 [17:47:49<18:55:22, 3.74s/it] 47%|████▋ | 16070/34278 [17:47:52<19:04:34, 3.77s/it] {'loss': 0.1463, 'grad_norm': 0.7783008039871504, 'learning_rate': 5.745311435897531e-06, 'epoch': 0.47} 47%|████▋ | 16070/34278 [17:47:52<19:04:34, 3.77s/it] 47%|████▋ | 16071/34278 [17:47:55<17:44:47, 3.51s/it] {'loss': 0.132, 'grad_norm': 0.8315272123788562, 'learning_rate': 5.744844276539734e-06, 'epoch': 0.47} 47%|████▋ | 16071/34278 [17:47:55<17:44:47, 3.51s/it] 47%|████▋ | 16072/34278 [17:47:58<17:03:43, 3.37s/it] {'loss': 0.1237, 'grad_norm': 0.8048650563939992, 'learning_rate': 5.744377110532146e-06, 'epoch': 0.47} 47%|████▋ | 16072/34278 [17:47:58<17:03:43, 3.37s/it] 47%|████▋ | 16073/34278 [17:48:04<20:58:47, 4.15s/it] {'loss': 0.1364, 'grad_norm': 1.011428222566658, 'learning_rate': 5.7439099378789366e-06, 'epoch': 0.47} 47%|████▋ | 16073/34278 [17:48:04<20:58:47, 4.15s/it] 47%|████▋ | 16074/34278 [17:48:07<19:20:47, 3.83s/it] {'loss': 0.1274, 'grad_norm': 
0.8505536900192274, 'learning_rate': 5.743442758584277e-06, 'epoch': 0.47} 47%|████▋ | 16074/34278 [17:48:07<19:20:47, 3.83s/it] 47%|████▋ | 16075/34278 [17:48:13<22:25:06, 4.43s/it] {'loss': 0.123, 'grad_norm': 0.9729993148155971, 'learning_rate': 5.742975572652337e-06, 'epoch': 0.47} 47%|████▋ | 16075/34278 [17:48:13<22:25:06, 4.43s/it] 47%|████▋ | 16076/34278 [17:48:16<20:32:51, 4.06s/it] {'loss': 0.1469, 'grad_norm': 1.0459297712761293, 'learning_rate': 5.74250838008729e-06, 'epoch': 0.47} 47%|████▋ | 16076/34278 [17:48:16<20:32:51, 4.06s/it] 47%|████▋ | 16077/34278 [17:48:20<19:51:48, 3.93s/it] {'loss': 0.1392, 'grad_norm': 0.8132158729836365, 'learning_rate': 5.742041180893303e-06, 'epoch': 0.47} 47%|████▋ | 16077/34278 [17:48:20<19:51:48, 3.93s/it] 47%|████▋ | 16078/34278 [17:48:24<19:08:58, 3.79s/it] {'loss': 0.1159, 'grad_norm': 0.8584280553812293, 'learning_rate': 5.741573975074551e-06, 'epoch': 0.47} 47%|████▋ | 16078/34278 [17:48:24<19:08:58, 3.79s/it] 47%|████▋ | 16079/34278 [17:48:28<20:05:05, 3.97s/it] {'loss': 0.1219, 'grad_norm': 0.6767765047043773, 'learning_rate': 5.741106762635205e-06, 'epoch': 0.47} 47%|████▋ | 16079/34278 [17:48:28<20:05:05, 3.97s/it] 47%|████▋ | 16080/34278 [17:48:31<19:01:33, 3.76s/it] {'loss': 0.1369, 'grad_norm': 1.1232119840550205, 'learning_rate': 5.740639543579433e-06, 'epoch': 0.47} 47%|████▋ | 16080/34278 [17:48:31<19:01:33, 3.76s/it] 47%|████▋ | 16081/34278 [17:48:38<22:54:45, 4.53s/it] {'loss': 0.1359, 'grad_norm': 0.796951447742278, 'learning_rate': 5.740172317911409e-06, 'epoch': 0.47} 47%|████▋ | 16081/34278 [17:48:38<22:54:45, 4.53s/it] 47%|████▋ | 16082/34278 [17:48:43<24:46:56, 4.90s/it] {'loss': 0.1431, 'grad_norm': 0.8761079062131024, 'learning_rate': 5.739705085635302e-06, 'epoch': 0.47} 47%|████▋ | 16082/34278 [17:48:43<24:46:56, 4.90s/it] 47%|████▋ | 16083/34278 [17:48:48<23:57:42, 4.74s/it] {'loss': 0.13, 'grad_norm': 0.9884386708917005, 'learning_rate': 5.739237846755285e-06, 'epoch': 0.47} 47%|████▋ | 
16083/34278 [17:48:48<23:57:42, 4.74s/it] 47%|████▋ | 16084/34278 [17:48:51<21:54:42, 4.34s/it] {'loss': 0.1246, 'grad_norm': 1.1919464567864113, 'learning_rate': 5.738770601275529e-06, 'epoch': 0.47} 47%|████▋ | 16084/34278 [17:48:51<21:54:42, 4.34s/it] 47%|████▋ | 16085/34278 [17:48:55<20:52:18, 4.13s/it] {'loss': 0.1257, 'grad_norm': 1.045991070029729, 'learning_rate': 5.738303349200206e-06, 'epoch': 0.47} 47%|████▋ | 16085/34278 [17:48:55<20:52:18, 4.13s/it] 47%|████▋ | 16086/34278 [17:49:01<23:39:02, 4.68s/it] {'loss': 0.1396, 'grad_norm': 0.7959565312596193, 'learning_rate': 5.7378360905334865e-06, 'epoch': 0.47} 47%|████▋ | 16086/34278 [17:49:01<23:39:02, 4.68s/it] 47%|████▋ | 16087/34278 [17:49:04<21:46:43, 4.31s/it] {'loss': 0.1476, 'grad_norm': 1.5820160593183699, 'learning_rate': 5.737368825279542e-06, 'epoch': 0.47} 47%|████▋ | 16087/34278 [17:49:04<21:46:43, 4.31s/it] 47%|████▋ | 16088/34278 [17:49:07<19:48:10, 3.92s/it] {'loss': 0.1402, 'grad_norm': 1.1815826628493877, 'learning_rate': 5.736901553442545e-06, 'epoch': 0.47} 47%|████▋ | 16088/34278 [17:49:07<19:48:10, 3.92s/it] 47%|████▋ | 16089/34278 [17:49:10<18:10:13, 3.60s/it] {'loss': 0.1334, 'grad_norm': 0.7513949019477715, 'learning_rate': 5.736434275026667e-06, 'epoch': 0.47} 47%|████▋ | 16089/34278 [17:49:10<18:10:13, 3.60s/it] 47%|████▋ | 16090/34278 [17:49:15<19:38:47, 3.89s/it] {'loss': 0.1392, 'grad_norm': 1.2522696477108841, 'learning_rate': 5.735966990036079e-06, 'epoch': 0.47} 47%|████▋ | 16090/34278 [17:49:15<19:38:47, 3.89s/it] 47%|████▋ | 16091/34278 [17:49:19<20:46:54, 4.11s/it] {'loss': 0.139, 'grad_norm': 1.0568337995409316, 'learning_rate': 5.735499698474956e-06, 'epoch': 0.47} 47%|████▋ | 16091/34278 [17:49:19<20:46:54, 4.11s/it] 47%|████▋ | 16092/34278 [17:49:22<19:28:12, 3.85s/it] {'loss': 0.1388, 'grad_norm': 1.0403577071745036, 'learning_rate': 5.735032400347463e-06, 'epoch': 0.47} 47%|████▋ | 16092/34278 [17:49:22<19:28:12, 3.85s/it] 47%|████▋ | 16093/34278 
[17:49:25<18:15:54, 3.62s/it] {'loss': 0.1243, 'grad_norm': 0.7900333409971576, 'learning_rate': 5.734565095657779e-06, 'epoch': 0.47} 47%|████▋ | 16093/34278 [17:49:25<18:15:54, 3.62s/it] 47%|████▋ | 16094/34278 [17:49:29<18:30:42, 3.66s/it] {'loss': 0.1369, 'grad_norm': 0.9720347804398541, 'learning_rate': 5.7340977844100735e-06, 'epoch': 0.47} 47%|████▋ | 16094/34278 [17:49:29<18:30:42, 3.66s/it] 47%|████▋ | 16095/34278 [17:49:32<17:04:58, 3.38s/it] {'loss': 0.1358, 'grad_norm': 0.9672105474943895, 'learning_rate': 5.733630466608516e-06, 'epoch': 0.47} 47%|████▋ | 16095/34278 [17:49:32<17:04:58, 3.38s/it] 47%|████▋ | 16096/34278 [17:49:35<16:26:12, 3.25s/it] {'loss': 0.1509, 'grad_norm': 1.0027947901199838, 'learning_rate': 5.733163142257283e-06, 'epoch': 0.47} 47%|████▋ | 16096/34278 [17:49:35<16:26:12, 3.25s/it] 47%|████▋ | 16097/34278 [17:49:38<16:28:51, 3.26s/it] {'loss': 0.1257, 'grad_norm': 0.6224211116672625, 'learning_rate': 5.732695811360543e-06, 'epoch': 0.47} 47%|████▋ | 16097/34278 [17:49:38<16:28:51, 3.26s/it] 47%|████▋ | 16098/34278 [17:49:42<16:56:22, 3.35s/it] {'loss': 0.176, 'grad_norm': 0.7699030096852254, 'learning_rate': 5.732228473922471e-06, 'epoch': 0.47} 47%|████▋ | 16098/34278 [17:49:42<16:56:22, 3.35s/it] 47%|████▋ | 16099/34278 [17:49:46<17:31:45, 3.47s/it] {'loss': 0.1186, 'grad_norm': 0.8856354430030329, 'learning_rate': 5.731761129947238e-06, 'epoch': 0.47} 47%|████▋ | 16099/34278 [17:49:46<17:31:45, 3.47s/it] 47%|████▋ | 16100/34278 [17:49:52<21:17:57, 4.22s/it] {'loss': 0.1285, 'grad_norm': 0.8167555368557391, 'learning_rate': 5.731293779439015e-06, 'epoch': 0.47} 47%|████▋ | 16100/34278 [17:49:52<21:17:57, 4.22s/it] 47%|████▋ | 16101/34278 [17:49:55<19:38:49, 3.89s/it] {'loss': 0.1589, 'grad_norm': 0.9193967476538044, 'learning_rate': 5.730826422401976e-06, 'epoch': 0.47} 47%|████▋ | 16101/34278 [17:49:55<19:38:49, 3.89s/it] 47%|████▋ | 16102/34278 [17:49:58<19:07:08, 3.79s/it] {'loss': 0.1417, 'grad_norm': 0.6393872877340689, 
'learning_rate': 5.730359058840294e-06, 'epoch': 0.47} 47%|████▋ | 16102/34278 [17:49:58<19:07:08, 3.79s/it] 47%|████▋ | 16103/34278 [17:50:01<18:04:22, 3.58s/it] {'loss': 0.1457, 'grad_norm': 1.110856759356158, 'learning_rate': 5.7298916887581405e-06, 'epoch': 0.47} 47%|████▋ | 16103/34278 [17:50:01<18:04:22, 3.58s/it] 47%|████▋ | 16104/34278 [17:50:05<18:28:48, 3.66s/it] {'loss': 0.1497, 'grad_norm': 0.9144741724024261, 'learning_rate': 5.729424312159687e-06, 'epoch': 0.47} 47%|████▋ | 16104/34278 [17:50:05<18:28:48, 3.66s/it] 47%|████▋ | 16105/34278 [17:50:09<19:20:22, 3.83s/it] {'loss': 0.1298, 'grad_norm': 0.648704093586818, 'learning_rate': 5.728956929049109e-06, 'epoch': 0.47} 47%|████▋ | 16105/34278 [17:50:09<19:20:22, 3.83s/it] 47%|████▋ | 16106/34278 [17:50:12<18:02:26, 3.57s/it] {'loss': 0.1331, 'grad_norm': 0.8048886852999525, 'learning_rate': 5.728489539430576e-06, 'epoch': 0.47} 47%|████▋ | 16106/34278 [17:50:12<18:02:26, 3.57s/it] 47%|████▋ | 16107/34278 [17:50:15<17:17:54, 3.43s/it] {'loss': 0.1339, 'grad_norm': 0.7897477806332242, 'learning_rate': 5.728022143308264e-06, 'epoch': 0.47} 47%|████▋ | 16107/34278 [17:50:15<17:17:54, 3.43s/it] 47%|████▋ | 16108/34278 [17:50:18<16:38:30, 3.30s/it] {'loss': 0.1367, 'grad_norm': 0.8601492069847314, 'learning_rate': 5.727554740686343e-06, 'epoch': 0.47} 47%|████▋ | 16108/34278 [17:50:18<16:38:30, 3.30s/it] 47%|████▋ | 16109/34278 [17:50:24<20:51:45, 4.13s/it] {'loss': 0.1553, 'grad_norm': 0.9207029387636473, 'learning_rate': 5.727087331568986e-06, 'epoch': 0.47} 47%|████▋ | 16109/34278 [17:50:24<20:51:45, 4.13s/it] 47%|████▋ | 16110/34278 [17:50:27<18:55:30, 3.75s/it] {'loss': 0.1069, 'grad_norm': 0.7737759715592784, 'learning_rate': 5.726619915960368e-06, 'epoch': 0.47} 47%|████▋ | 16110/34278 [17:50:27<18:55:30, 3.75s/it] 47%|████▋ | 16111/34278 [17:50:33<21:54:55, 4.34s/it] {'loss': 0.1598, 'grad_norm': 0.7515370629130431, 'learning_rate': 5.726152493864663e-06, 'epoch': 0.47} 47%|████▋ | 16111/34278 
[17:50:33<21:54:55, 4.34s/it] 47%|████▋ | 16112/34278 [17:50:38<22:27:41, 4.45s/it] {'loss': 0.1471, 'grad_norm': 0.8571022277462108, 'learning_rate': 5.725685065286038e-06, 'epoch': 0.47} 47%|████▋ | 16112/34278 [17:50:38<22:27:41, 4.45s/it] 47%|████▋ | 16113/34278 [17:50:41<20:07:16, 3.99s/it] {'loss': 0.1596, 'grad_norm': 0.8753528201700052, 'learning_rate': 5.725217630228673e-06, 'epoch': 0.47} 47%|████▋ | 16113/34278 [17:50:41<20:07:16, 3.99s/it] 47%|████▋ | 16114/34278 [17:50:44<18:27:16, 3.66s/it] {'loss': 0.1489, 'grad_norm': 4.859814509656574, 'learning_rate': 5.724750188696737e-06, 'epoch': 0.47} 47%|████▋ | 16114/34278 [17:50:44<18:27:16, 3.66s/it] 47%|████▋ | 16115/34278 [17:50:47<17:45:05, 3.52s/it] {'loss': 0.1154, 'grad_norm': 1.4828493024288039, 'learning_rate': 5.724282740694404e-06, 'epoch': 0.47} 47%|████▋ | 16115/34278 [17:50:47<17:45:05, 3.52s/it] 47%|████▋ | 16116/34278 [17:50:50<17:40:01, 3.50s/it] {'loss': 0.1447, 'grad_norm': 0.8638338229309651, 'learning_rate': 5.723815286225848e-06, 'epoch': 0.47} 47%|████▋ | 16116/34278 [17:50:50<17:40:01, 3.50s/it] 47%|████▋ | 16117/34278 [17:50:57<22:03:19, 4.37s/it] {'loss': 0.1237, 'grad_norm': 0.7688649662024418, 'learning_rate': 5.723347825295243e-06, 'epoch': 0.47} 47%|████▋ | 16117/34278 [17:50:57<22:03:19, 4.37s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 47%|████▋ | 16118/34278 [17:51:02<23:58:59, 4.75s/it] {'loss': 0.1152, 'grad_norm': 0.7522221028055127, 'learning_rate': 5.7228803579067594e-06, 'epoch': 0.47} 47%|████▋ | 16118/34278 [17:51:02<23:58:59, 4.75s/it] 47%|████▋ | 16119/34278 [17:51:06<21:43:05, 4.31s/it] {'loss': 0.1298, 'grad_norm': 0.8944578795617836, 'learning_rate': 5.722412884064572e-06, 'epoch': 0.47} 47%|████▋ | 16119/34278 [17:51:06<21:43:05, 4.31s/it] 47%|████▋ | 16120/34278 [17:51:09<20:15:56, 4.02s/it] {'loss': 0.1508, 'grad_norm': 1.084083705789618, 'learning_rate': 5.7219454037728564e-06, 'epoch': 0.47} 47%|████▋ | 16120/34278 [17:51:09<20:15:56, 4.02s/it] 47%|████▋ | 16121/34278 [17:51:12<18:59:54, 3.77s/it] {'loss': 0.1362, 'grad_norm': 0.9507613558325935, 'learning_rate': 5.721477917035785e-06, 'epoch': 0.47} 47%|████▋ | 16121/34278 [17:51:12<18:59:54, 3.77s/it] 47%|████▋ | 16122/34278 [17:51:18<21:36:07, 4.28s/it] {'loss': 0.1201, 'grad_norm': 0.7930644797736451, 'learning_rate': 5.7210104238575295e-06, 'epoch': 0.47} 47%|████▋ | 16122/34278 [17:51:18<21:36:07, 4.28s/it] 47%|████▋ | 16123/34278 [17:51:21<20:18:20, 4.03s/it] {'loss': 0.1296, 'grad_norm': 0.6314872642891975, 'learning_rate': 5.720542924242265e-06, 'epoch': 0.47} 47%|████▋ | 16123/34278 [17:51:21<20:18:20, 4.03s/it] 47%|████▋ | 16124/34278 [17:51:25<21:00:37, 4.17s/it] {'loss': 0.1368, 'grad_norm': 0.8494723624941237, 'learning_rate': 5.720075418194166e-06, 'epoch': 0.47} 47%|████▋ | 16124/34278 [17:51:25<21:00:37, 4.17s/it] 47%|████▋ | 16125/34278 [17:51:30<22:09:22, 4.39s/it] {'loss': 0.1355, 'grad_norm': 0.9204880438144835, 'learning_rate': 5.719607905717406e-06, 'epoch': 0.47} 47%|████▋ | 16125/34278 [17:51:30<22:09:22, 4.39s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 47%|████▋ | 16126/34278 [17:51:35<22:32:09, 4.47s/it] {'loss': 0.1282, 'grad_norm': 0.7134989970552862, 'learning_rate': 5.719140386816159e-06, 'epoch': 0.47} 47%|████▋ | 16126/34278 [17:51:35<22:32:09, 4.47s/it] 47%|████▋ | 16127/34278 [17:51:38<20:42:30, 4.11s/it] {'loss': 0.1516, 'grad_norm': 0.8339145112729058, 'learning_rate': 5.718672861494597e-06, 'epoch': 0.47} 47%|████▋ | 16127/34278 [17:51:38<20:42:30, 4.11s/it] 47%|████▋ | 16128/34278 [17:51:42<19:34:46, 3.88s/it] {'loss': 0.1565, 'grad_norm': 0.9566071881084267, 'learning_rate': 5.718205329756895e-06, 'epoch': 0.47} 47%|████▋ | 16128/34278 [17:51:42<19:34:46, 3.88s/it] 47%|████▋ | 16129/34278 [17:51:47<22:20:56, 4.43s/it] {'loss': 0.1645, 'grad_norm': 0.854543237254776, 'learning_rate': 5.7177377916072285e-06, 'epoch': 0.47} 47%|████▋ | 16129/34278 [17:51:47<22:20:56, 4.43s/it] 47%|████▋ | 16130/34278 [17:51:50<19:51:18, 3.94s/it] {'loss': 0.1407, 'grad_norm': 0.9789235876882216, 'learning_rate': 5.717270247049769e-06, 'epoch': 0.47} 47%|████▋ | 16130/34278 [17:51:50<19:51:18, 3.94s/it] 47%|████▋ | 16131/34278 [17:51:56<22:07:53, 4.39s/it] {'loss': 0.1357, 'grad_norm': 0.7974384768832588, 'learning_rate': 5.7168026960886925e-06, 'epoch': 0.47} 47%|████▋ | 16131/34278 [17:51:56<22:07:53, 4.39s/it] 47%|████▋ | 16132/34278 [17:52:00<21:38:29, 4.29s/it] {'loss': 0.1263, 'grad_norm': 0.8275988235439948, 'learning_rate': 5.716335138728173e-06, 'epoch': 0.47} 47%|████▋ | 16132/34278 [17:52:00<21:38:29, 4.29s/it] 47%|████▋ | 16133/34278 [17:52:03<20:10:25, 4.00s/it] {'loss': 0.1621, 'grad_norm': 0.9523676958991666, 'learning_rate': 5.715867574972384e-06, 'epoch': 0.47} 47%|████▋ | 16133/34278 [17:52:03<20:10:25, 4.00s/it] 47%|████▋ | 16134/34278 [17:52:06<18:57:13, 3.76s/it] {'loss': 0.1603, 'grad_norm': 0.7635523007543593, 'learning_rate': 5.7154000048255e-06, 'epoch': 0.47} 47%|████▋ | 16134/34278 [17:52:06<18:57:13, 3.76s/it] 47%|████▋ | 16135/34278 [17:52:09<17:27:14, 
3.46s/it] {'loss': 0.1281, 'grad_norm': 0.8623497340330429, 'learning_rate': 5.7149324282916966e-06, 'epoch': 0.47} 47%|████▋ | 16135/34278 [17:52:09<17:27:14, 3.46s/it] 47%|████▋ | 16136/34278 [17:52:14<19:27:57, 3.86s/it] {'loss': 0.1225, 'grad_norm': 0.8510871688185185, 'learning_rate': 5.714464845375146e-06, 'epoch': 0.47} 47%|████▋ | 16136/34278 [17:52:14<19:27:57, 3.86s/it] 47%|████▋ | 16137/34278 [17:52:17<17:59:40, 3.57s/it] {'loss': 0.1646, 'grad_norm': 0.7639056998088636, 'learning_rate': 5.7139972560800235e-06, 'epoch': 0.47} 47%|████▋ | 16137/34278 [17:52:17<17:59:40, 3.57s/it] 47%|████▋ | 16138/34278 [17:52:20<17:54:29, 3.55s/it] {'loss': 0.1517, 'grad_norm': 0.9565157521139924, 'learning_rate': 5.713529660410505e-06, 'epoch': 0.47} 47%|████▋ | 16138/34278 [17:52:20<17:54:29, 3.55s/it] 47%|████▋ | 16139/34278 [17:52:23<16:44:35, 3.32s/it] {'loss': 0.1245, 'grad_norm': 0.7635071253582182, 'learning_rate': 5.713062058370763e-06, 'epoch': 0.47} 47%|████▋ | 16139/34278 [17:52:23<16:44:35, 3.32s/it] 47%|████▋ | 16140/34278 [17:52:27<18:06:35, 3.59s/it] {'loss': 0.1233, 'grad_norm': 0.7291809115107964, 'learning_rate': 5.7125944499649745e-06, 'epoch': 0.47} 47%|████▋ | 16140/34278 [17:52:27<18:06:35, 3.59s/it] 47%|████▋ | 16141/34278 [17:52:31<17:44:43, 3.52s/it] {'loss': 0.1439, 'grad_norm': 0.8774965193916223, 'learning_rate': 5.712126835197313e-06, 'epoch': 0.47} 47%|████▋ | 16141/34278 [17:52:31<17:44:43, 3.52s/it] 47%|████▋ | 16142/34278 [17:52:34<17:09:36, 3.41s/it] {'loss': 0.13, 'grad_norm': 0.8285968761142548, 'learning_rate': 5.711659214071951e-06, 'epoch': 0.47} 47%|████▋ | 16142/34278 [17:52:34<17:09:36, 3.41s/it] 47%|████▋ | 16143/34278 [17:52:38<18:00:08, 3.57s/it] {'loss': 0.1202, 'grad_norm': 0.6676578682951105, 'learning_rate': 5.711191586593068e-06, 'epoch': 0.47} 47%|████▋ | 16143/34278 [17:52:38<18:00:08, 3.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: 
UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 47%|████▋ | 16144/34278 [17:52:41<17:02:10, 3.38s/it] {'loss': 0.1268, 'grad_norm': 0.666544706641162, 'learning_rate': 5.710723952764835e-06, 'epoch': 0.47} 47%|████▋ | 16144/34278 [17:52:41<17:02:10, 3.38s/it] 47%|████▋ | 16145/34278 [17:52:44<17:02:13, 3.38s/it] {'loss': 0.1416, 'grad_norm': 0.7635204443806743, 'learning_rate': 5.7102563125914265e-06, 'epoch': 0.47} 47%|████▋ | 16145/34278 [17:52:44<17:02:13, 3.38s/it] 47%|████▋ | 16146/34278 [17:52:47<16:39:15, 3.31s/it] {'loss': 0.1194, 'grad_norm': 0.7802522015872385, 'learning_rate': 5.709788666077022e-06, 'epoch': 0.47} 47%|████▋ | 16146/34278 [17:52:47<16:39:15, 3.31s/it] 47%|████▋ | 16147/34278 [17:52:50<16:43:30, 3.32s/it] {'loss': 0.1362, 'grad_norm': 0.7881972123320081, 'learning_rate': 5.709321013225792e-06, 'epoch': 0.47} 47%|████▋ | 16147/34278 [17:52:50<16:43:30, 3.32s/it] 47%|████▋ | 16148/34278 [17:52:53<16:13:52, 3.22s/it] {'loss': 0.1373, 'grad_norm': 0.9339135012352625, 'learning_rate': 5.708853354041914e-06, 'epoch': 0.47} 47%|████▋ | 16148/34278 [17:52:53<16:13:52, 3.22s/it] 47%|████▋ | 16149/34278 [17:52:57<16:09:33, 3.21s/it] {'loss': 0.1447, 'grad_norm': 0.7428028039373517, 'learning_rate': 5.708385688529563e-06, 'epoch': 0.47} 47%|████▋ | 16149/34278 [17:52:57<16:09:33, 3.21s/it] 47%|████▋ | 16150/34278 [17:52:59<15:35:46, 3.10s/it] {'loss': 0.1136, 'grad_norm': 0.7557091850447257, 'learning_rate': 5.707918016692913e-06, 'epoch': 0.47} 47%|████▋ | 16150/34278 [17:52:59<15:35:46, 3.10s/it] 47%|████▋ | 16151/34278 [17:53:05<19:54:14, 3.95s/it] {'loss': 0.1528, 'grad_norm': 0.930051679838062, 'learning_rate': 5.7074503385361406e-06, 'epoch': 0.47} 47%|████▋ | 16151/34278 [17:53:05<19:54:14, 3.95s/it] 47%|████▋ | 16152/34278 [17:53:08<18:30:32, 3.68s/it] {'loss': 0.1414, 'grad_norm': 0.914309014482323, 'learning_rate': 5.70698265406342e-06, 'epoch': 0.47} 47%|████▋ | 16152/34278 [17:53:08<18:30:32, 
3.68s/it] 47%|████▋ | 16153/34278 [17:53:11<17:29:46, 3.48s/it] {'loss': 0.1458, 'grad_norm': 1.0921308307739288, 'learning_rate': 5.706514963278926e-06, 'epoch': 0.47} 47%|████▋ | 16153/34278 [17:53:11<17:29:46, 3.48s/it] 47%|████▋ | 16154/34278 [17:53:15<17:30:54, 3.48s/it] {'loss': 0.1254, 'grad_norm': 0.8584718839188415, 'learning_rate': 5.706047266186836e-06, 'epoch': 0.47} 47%|████▋ | 16154/34278 [17:53:15<17:30:54, 3.48s/it] 47%|████▋ | 16155/34278 [17:53:18<16:40:47, 3.31s/it] {'loss': 0.1506, 'grad_norm': 0.7586975375020832, 'learning_rate': 5.705579562791325e-06, 'epoch': 0.47} 47%|████▋ | 16155/34278 [17:53:18<16:40:47, 3.31s/it] 47%|████▋ | 16156/34278 [17:53:21<16:25:02, 3.26s/it] {'loss': 0.1453, 'grad_norm': 0.9601549045653952, 'learning_rate': 5.705111853096569e-06, 'epoch': 0.47} 47%|████▋ | 16156/34278 [17:53:21<16:25:02, 3.26s/it] 47%|████▋ | 16157/34278 [17:53:26<18:20:48, 3.64s/it] {'loss': 0.1028, 'grad_norm': 0.9988405830758633, 'learning_rate': 5.70464413710674e-06, 'epoch': 0.47} 47%|████▋ | 16157/34278 [17:53:26<18:20:48, 3.64s/it] 47%|████▋ | 16158/34278 [17:53:30<19:33:20, 3.89s/it] {'loss': 0.1393, 'grad_norm': 0.7686197023796981, 'learning_rate': 5.704176414826018e-06, 'epoch': 0.47} 47%|████▋ | 16158/34278 [17:53:30<19:33:20, 3.89s/it] 47%|████▋ | 16159/34278 [17:53:33<18:45:35, 3.73s/it] {'loss': 0.1318, 'grad_norm': 0.9426717499516286, 'learning_rate': 5.703708686258577e-06, 'epoch': 0.47} 47%|████▋ | 16159/34278 [17:53:33<18:45:35, 3.73s/it] 47%|████▋ | 16160/34278 [17:53:37<18:25:24, 3.66s/it] {'loss': 0.1424, 'grad_norm': 0.9483265331401411, 'learning_rate': 5.703240951408592e-06, 'epoch': 0.47} 47%|████▋ | 16160/34278 [17:53:37<18:25:24, 3.66s/it] 47%|████▋ | 16161/34278 [17:53:40<17:54:16, 3.56s/it] {'loss': 0.118, 'grad_norm': 0.9296762430677868, 'learning_rate': 5.7027732102802416e-06, 'epoch': 0.47} 47%|████▋ | 16161/34278 [17:53:40<17:54:16, 3.56s/it] 47%|████▋ | 16162/34278 [17:53:46<21:48:19, 4.33s/it] {'loss': 0.1589, 
'grad_norm': 0.6927683269348481, 'learning_rate': 5.702305462877697e-06, 'epoch': 0.47} 47%|████▋ | 16162/34278 [17:53:46<21:48:19, 4.33s/it] 47%|████▋ | 16163/34278 [17:53:49<19:24:50, 3.86s/it] {'loss': 0.1109, 'grad_norm': 0.6899457308718389, 'learning_rate': 5.701837709205139e-06, 'epoch': 0.47} 47%|████▋ | 16163/34278 [17:53:49<19:24:50, 3.86s/it] 47%|████▋ | 16164/34278 [17:53:52<18:03:16, 3.59s/it] {'loss': 0.1595, 'grad_norm': 0.9561555591782459, 'learning_rate': 5.70136994926674e-06, 'epoch': 0.47} 47%|████▋ | 16164/34278 [17:53:52<18:03:16, 3.59s/it] 47%|████▋ | 16165/34278 [17:53:55<17:44:32, 3.53s/it] {'loss': 0.1158, 'grad_norm': 0.9913448171536647, 'learning_rate': 5.700902183066679e-06, 'epoch': 0.47} 47%|████▋ | 16165/34278 [17:53:55<17:44:32, 3.53s/it] 47%|████▋ | 16166/34278 [17:53:59<18:23:43, 3.66s/it] {'loss': 0.1354, 'grad_norm': 0.7727073830246254, 'learning_rate': 5.70043441060913e-06, 'epoch': 0.47} 47%|████▋ | 16166/34278 [17:53:59<18:23:43, 3.66s/it] 47%|████▋ | 16167/34278 [17:54:02<17:31:38, 3.48s/it] {'loss': 0.1127, 'grad_norm': 0.96473705052807, 'learning_rate': 5.699966631898269e-06, 'epoch': 0.47} 47%|████▋ | 16167/34278 [17:54:02<17:31:38, 3.48s/it] 47%|████▋ | 16168/34278 [17:54:06<18:11:22, 3.62s/it] {'loss': 0.1299, 'grad_norm': 0.7495226406181917, 'learning_rate': 5.699498846938274e-06, 'epoch': 0.47} 47%|████▋ | 16168/34278 [17:54:06<18:11:22, 3.62s/it] 47%|████▋ | 16169/34278 [17:54:10<17:34:38, 3.49s/it] {'loss': 0.1492, 'grad_norm': 0.8292004078232724, 'learning_rate': 5.699031055733319e-06, 'epoch': 0.47} 47%|████▋ | 16169/34278 [17:54:10<17:34:38, 3.49s/it] 47%|████▋ | 16170/34278 [17:54:12<16:39:57, 3.31s/it] {'loss': 0.165, 'grad_norm': 0.7441635103053541, 'learning_rate': 5.698563258287584e-06, 'epoch': 0.47} 47%|████▋ | 16170/34278 [17:54:12<16:39:57, 3.31s/it] 47%|████▋ | 16171/34278 [17:54:15<16:15:42, 3.23s/it] {'loss': 0.1299, 'grad_norm': 0.6816748903949795, 'learning_rate': 5.698095454605243e-06, 'epoch': 0.47} 
47%|████▋ | 16171/34278 [17:54:16<16:15:42, 3.23s/it] 47%|████▋ | 16172/34278 [17:54:20<18:53:02, 3.75s/it] {'loss': 0.1514, 'grad_norm': 0.7927442000531812, 'learning_rate': 5.6976276446904684e-06, 'epoch': 0.47} 47%|████▋ | 16172/34278 [17:54:20<18:53:02, 3.75s/it] 47%|████▋ | 16173/34278 [17:54:23<17:26:18, 3.47s/it] {'loss': 0.1612, 'grad_norm': 0.8611246894664979, 'learning_rate': 5.697159828547445e-06, 'epoch': 0.47} 47%|████▋ | 16173/34278 [17:54:23<17:26:18, 3.47s/it] 47%|████▋ | 16174/34278 [17:54:28<19:36:12, 3.90s/it] {'loss': 0.1287, 'grad_norm': 0.7813186770413587, 'learning_rate': 5.6966920061803435e-06, 'epoch': 0.47} 47%|████▋ | 16174/34278 [17:54:28<19:36:12, 3.90s/it] 47%|████▋ | 16175/34278 [17:54:31<18:05:12, 3.60s/it] {'loss': 0.1269, 'grad_norm': 0.8555039145706819, 'learning_rate': 5.696224177593341e-06, 'epoch': 0.47} 47%|████▋ | 16175/34278 [17:54:31<18:05:12, 3.60s/it] 47%|████▋ | 16176/34278 [17:54:34<17:47:21, 3.54s/it] {'loss': 0.1456, 'grad_norm': 0.8187749906027676, 'learning_rate': 5.695756342790617e-06, 'epoch': 0.47} 47%|████▋ | 16176/34278 [17:54:34<17:47:21, 3.54s/it] 47%|████▋ | 16177/34278 [17:54:39<19:31:47, 3.88s/it] {'loss': 0.1397, 'grad_norm': 0.8189764933933033, 'learning_rate': 5.6952885017763455e-06, 'epoch': 0.47} 47%|████▋ | 16177/34278 [17:54:39<19:31:47, 3.88s/it] 47%|████▋ | 16178/34278 [17:54:42<18:18:32, 3.64s/it] {'loss': 0.1487, 'grad_norm': 0.9037366018345423, 'learning_rate': 5.694820654554705e-06, 'epoch': 0.47} 47%|████▋ | 16178/34278 [17:54:42<18:18:32, 3.64s/it] 47%|████▋ | 16179/34278 [17:54:45<17:20:48, 3.45s/it] {'loss': 0.1202, 'grad_norm': 0.830932021339775, 'learning_rate': 5.694352801129871e-06, 'epoch': 0.47} 47%|████▋ | 16179/34278 [17:54:45<17:20:48, 3.45s/it] 47%|████▋ | 16180/34278 [17:54:48<16:48:00, 3.34s/it] {'loss': 0.1494, 'grad_norm': 0.9435808114088704, 'learning_rate': 5.69388494150602e-06, 'epoch': 0.47} 47%|████▋ | 16180/34278 [17:54:48<16:48:00, 3.34s/it] 47%|████▋ | 16181/34278 
[17:54:55<21:18:20, 4.24s/it] {'loss': 0.1384, 'grad_norm': 0.7360990872898313, 'learning_rate': 5.693417075687332e-06, 'epoch': 0.47} 47%|████▋ | 16181/34278 [17:54:55<21:18:20, 4.24s/it] 47%|████▋ | 16182/34278 [17:54:59<20:43:52, 4.12s/it] {'loss': 0.1291, 'grad_norm': 0.946534332693603, 'learning_rate': 5.69294920367798e-06, 'epoch': 0.47} 47%|████▋ | 16182/34278 [17:54:59<20:43:52, 4.12s/it] 47%|████▋ | 16183/34278 [17:55:03<21:56:44, 4.37s/it] {'loss': 0.1362, 'grad_norm': 0.8751863756735684, 'learning_rate': 5.692481325482144e-06, 'epoch': 0.47} 47%|████▋ | 16183/34278 [17:55:03<21:56:44, 4.37s/it] 47%|████▋ | 16184/34278 [17:55:06<19:57:33, 3.97s/it] {'loss': 0.1364, 'grad_norm': 0.8350416032285239, 'learning_rate': 5.692013441103999e-06, 'epoch': 0.47} 47%|████▋ | 16184/34278 [17:55:06<19:57:33, 3.97s/it] 47%|████▋ | 16185/34278 [17:55:11<20:09:29, 4.01s/it] {'loss': 0.1396, 'grad_norm': 0.7840014775393542, 'learning_rate': 5.6915455505477244e-06, 'epoch': 0.47} 47%|████▋ | 16185/34278 [17:55:11<20:09:29, 4.01s/it] 47%|████▋ | 16186/34278 [17:55:14<18:44:11, 3.73s/it] {'loss': 0.143, 'grad_norm': 0.9949906733657168, 'learning_rate': 5.691077653817496e-06, 'epoch': 0.47} 47%|████▋ | 16186/34278 [17:55:14<18:44:11, 3.73s/it] 47%|████▋ | 16187/34278 [17:55:17<18:04:38, 3.60s/it] {'loss': 0.1335, 'grad_norm': 1.1018000811796227, 'learning_rate': 5.690609750917491e-06, 'epoch': 0.47} 47%|████▋ | 16187/34278 [17:55:17<18:04:38, 3.60s/it] 47%|████▋ | 16188/34278 [17:55:20<17:06:11, 3.40s/it] {'loss': 0.1289, 'grad_norm': 0.839745800730884, 'learning_rate': 5.690141841851887e-06, 'epoch': 0.47} 47%|████▋ | 16188/34278 [17:55:20<17:06:11, 3.40s/it] 47%|████▋ | 16189/34278 [17:55:23<16:01:52, 3.19s/it] {'loss': 0.1242, 'grad_norm': 0.8672016742917711, 'learning_rate': 5.689673926624862e-06, 'epoch': 0.47} 47%|████▋ | 16189/34278 [17:55:23<16:01:52, 3.19s/it] 47%|████▋ | 16190/34278 [17:55:28<19:46:29, 3.94s/it] {'loss': 0.1287, 'grad_norm': 0.9283706838298952, 
'learning_rate': 5.6892060052405906e-06, 'epoch': 0.47} 47%|████▋ | 16190/34278 [17:55:28<19:46:29, 3.94s/it] 47%|████▋ | 16191/34278 [17:55:31<18:00:57, 3.59s/it] {'loss': 0.1532, 'grad_norm': 0.8957466119878499, 'learning_rate': 5.688738077703255e-06, 'epoch': 0.47} 47%|████▋ | 16191/34278 [17:55:31<18:00:57, 3.59s/it] 47%|████▋ | 16192/34278 [17:55:37<21:46:00, 4.33s/it] {'loss': 0.1438, 'grad_norm': 0.8065300531139225, 'learning_rate': 5.68827014401703e-06, 'epoch': 0.47} 47%|████▋ | 16192/34278 [17:55:37<21:46:00, 4.33s/it] 47%|████▋ | 16193/34278 [17:55:42<22:51:27, 4.55s/it] {'loss': 0.1209, 'grad_norm': 0.7639327856073047, 'learning_rate': 5.687802204186092e-06, 'epoch': 0.47} 47%|████▋ | 16193/34278 [17:55:42<22:51:27, 4.55s/it] 47%|████▋ | 16194/34278 [17:55:47<24:00:22, 4.78s/it] {'loss': 0.1269, 'grad_norm': 0.9828183122126705, 'learning_rate': 5.687334258214622e-06, 'epoch': 0.47} 47%|████▋ | 16194/34278 [17:55:47<24:00:22, 4.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 47%|████▋ | 16195/34278 [17:55:50<21:15:20, 4.23s/it] {'loss': 0.1412, 'grad_norm': 0.8431529340952344, 'learning_rate': 5.686866306106794e-06, 'epoch': 0.47} 47%|████▋ | 16195/34278 [17:55:50<21:15:20, 4.23s/it] 47%|████▋ | 16196/34278 [17:55:54<19:46:22, 3.94s/it] {'loss': 0.1343, 'grad_norm': 1.1173127652406816, 'learning_rate': 5.686398347866789e-06, 'epoch': 0.47} 47%|████▋ | 16196/34278 [17:55:54<19:46:22, 3.94s/it] 47%|████▋ | 16197/34278 [17:56:00<23:03:39, 4.59s/it] {'loss': 0.1304, 'grad_norm': 0.7631756116195886, 'learning_rate': 5.685930383498782e-06, 'epoch': 0.47} 47%|████▋ | 16197/34278 [17:56:00<23:03:39, 4.59s/it] 47%|████▋ | 16198/34278 [17:56:04<21:54:11, 4.36s/it] {'loss': 0.1409, 'grad_norm': 0.8798489410905271, 'learning_rate': 5.685462413006953e-06, 'epoch': 0.47} 47%|████▋ | 16198/34278 [17:56:04<21:54:11, 4.36s/it] 47%|████▋ | 16199/34278 [17:56:07<20:38:31, 4.11s/it] {'loss': 0.1207, 'grad_norm': 0.9584388683992391, 'learning_rate': 5.684994436395479e-06, 'epoch': 0.47} 47%|████▋ | 16199/34278 [17:56:07<20:38:31, 4.11s/it] 47%|████▋ | 16200/34278 [17:56:10<18:52:44, 3.76s/it] {'loss': 0.1153, 'grad_norm': 0.7511771133137782, 'learning_rate': 5.684526453668538e-06, 'epoch': 0.47} 47%|████▋ | 16200/34278 [17:56:10<18:52:44, 3.76s/it] 47%|████▋ | 16201/34278 [17:56:14<18:38:24, 3.71s/it] {'loss': 0.1309, 'grad_norm': 1.0283700102265485, 'learning_rate': 5.684058464830311e-06, 'epoch': 0.47} 47%|████▋ | 16201/34278 [17:56:14<18:38:24, 3.71s/it] 47%|████▋ | 16202/34278 [17:56:18<18:47:44, 3.74s/it] {'loss': 0.1206, 'grad_norm': 0.815990377068461, 'learning_rate': 5.68359046988497e-06, 'epoch': 0.47} 47%|████▋ | 16202/34278 [17:56:18<18:47:44, 3.74s/it] 47%|████▋ | 16203/34278 [17:56:21<18:32:56, 3.69s/it] {'loss': 0.1179, 'grad_norm': 0.7414053418331662, 'learning_rate': 5.683122468836698e-06, 'epoch': 0.47} 47%|████▋ | 16203/34278 [17:56:21<18:32:56, 3.69s/it]Token indices sequence length is longer than 
the specified maximum sequence length for this model (13172 > 8192). Running this sequence through the model will result in indexing errors 47%|████▋ | 16204/34278 [17:56:25<18:28:44, 3.68s/it] {'loss': 0.1435, 'grad_norm': 0.7558004797743774, 'learning_rate': 5.682654461689671e-06, 'epoch': 0.47} 47%|████▋ | 16204/34278 [17:56:25<18:28:44, 3.68s/it] 47%|████▋ | 16205/34278 [17:56:28<17:14:59, 3.44s/it] {'loss': 0.1437, 'grad_norm': 0.763002566802126, 'learning_rate': 5.682186448448067e-06, 'epoch': 0.47} 47%|████▋ | 16205/34278 [17:56:28<17:14:59, 3.44s/it] 47%|████▋ | 16206/34278 [17:56:31<17:20:14, 3.45s/it] {'loss': 0.1312, 'grad_norm': 0.7492393525328191, 'learning_rate': 5.681718429116067e-06, 'epoch': 0.47} 47%|████▋ | 16206/34278 [17:56:31<17:20:14, 3.45s/it] 47%|████▋ | 16207/34278 [17:56:35<18:41:27, 3.72s/it] {'loss': 0.1036, 'grad_norm': 0.7150913299742103, 'learning_rate': 5.681250403697847e-06, 'epoch': 0.47} 47%|████▋ | 16207/34278 [17:56:35<18:41:27, 3.72s/it] 47%|████▋ | 16208/34278 [17:56:41<21:42:52, 4.33s/it] {'loss': 0.1189, 'grad_norm': 0.6375610874861878, 'learning_rate': 5.680782372197586e-06, 'epoch': 0.47} 47%|████▋ | 16208/34278 [17:56:41<21:42:52, 4.33s/it] 47%|████▋ | 16209/34278 [17:56:44<19:38:17, 3.91s/it] {'loss': 0.1275, 'grad_norm': 0.9498688512170802, 'learning_rate': 5.6803143346194625e-06, 'epoch': 0.47} 47%|████▋ | 16209/34278 [17:56:44<19:38:17, 3.91s/it] 47%|████▋ | 16210/34278 [17:56:50<21:57:14, 4.37s/it] {'loss': 0.1257, 'grad_norm': 0.782084929024932, 'learning_rate': 5.679846290967654e-06, 'epoch': 0.47} 47%|████▋ | 16210/34278 [17:56:50<21:57:14, 4.37s/it] 47%|████▋ | 16211/34278 [17:56:54<21:49:50, 4.35s/it] {'loss': 0.1326, 'grad_norm': 0.8024384483136685, 'learning_rate': 5.679378241246341e-06, 'epoch': 0.47} 47%|████▋ | 16211/34278 [17:56:54<21:49:50, 4.35s/it] 47%|████▋ | 16212/34278 [17:57:00<23:57:56, 4.78s/it] {'loss': 0.1391, 'grad_norm': 0.9090946722184887, 'learning_rate': 5.678910185459702e-06, 'epoch': 
0.47} 47%|████▋ | 16212/34278 [17:57:00<23:57:56, 4.78s/it] 47%|████▋ | 16213/34278 [17:57:03<22:22:59, 4.46s/it] {'loss': 0.1356, 'grad_norm': 0.8432305552418801, 'learning_rate': 5.678442123611914e-06, 'epoch': 0.47} 47%|████▋ | 16213/34278 [17:57:03<22:22:59, 4.46s/it] 47%|████▋ | 16214/34278 [17:57:09<23:38:41, 4.71s/it] {'loss': 0.1394, 'grad_norm': 0.8891832226368575, 'learning_rate': 5.6779740557071574e-06, 'epoch': 0.47} 47%|████▋ | 16214/34278 [17:57:09<23:38:41, 4.71s/it] 47%|████▋ | 16215/34278 [17:57:13<22:21:22, 4.46s/it] {'loss': 0.1627, 'grad_norm': 0.8660342683361745, 'learning_rate': 5.67750598174961e-06, 'epoch': 0.47} 47%|████▋ | 16215/34278 [17:57:13<22:21:22, 4.46s/it] 47%|████▋ | 16216/34278 [17:57:16<20:26:10, 4.07s/it] {'loss': 0.1432, 'grad_norm': 0.9064999845036191, 'learning_rate': 5.67703790174345e-06, 'epoch': 0.47} 47%|████▋ | 16216/34278 [17:57:16<20:26:10, 4.07s/it] 47%|████▋ | 16217/34278 [17:57:19<18:45:11, 3.74s/it] {'loss': 0.1315, 'grad_norm': 0.9992737392003487, 'learning_rate': 5.676569815692858e-06, 'epoch': 0.47} 47%|████▋ | 16217/34278 [17:57:19<18:45:11, 3.74s/it] 47%|████▋ | 16218/34278 [17:57:22<18:47:36, 3.75s/it] {'loss': 0.1384, 'grad_norm': 0.841757401922245, 'learning_rate': 5.676101723602014e-06, 'epoch': 0.47} 47%|████▋ | 16218/34278 [17:57:22<18:47:36, 3.75s/it] 47%|████▋ | 16219/34278 [17:57:26<18:00:51, 3.59s/it] {'loss': 0.1288, 'grad_norm': 0.677806080845052, 'learning_rate': 5.675633625475092e-06, 'epoch': 0.47} 47%|████▋ | 16219/34278 [17:57:26<18:00:51, 3.59s/it] 47%|████▋ | 16220/34278 [17:57:29<17:31:52, 3.50s/it] {'loss': 0.1473, 'grad_norm': 0.9660709917443289, 'learning_rate': 5.6751655213162746e-06, 'epoch': 0.47} 47%|████▋ | 16220/34278 [17:57:29<17:31:52, 3.50s/it] 47%|████▋ | 16221/34278 [17:57:32<16:44:52, 3.34s/it] {'loss': 0.1382, 'grad_norm': 0.9806908705855765, 'learning_rate': 5.674697411129743e-06, 'epoch': 0.47} 47%|████▋ | 16221/34278 [17:57:32<16:44:52, 3.34s/it] 47%|████▋ | 16222/34278 
[17:57:37<19:51:32, 3.96s/it] {'loss': 0.1435, 'grad_norm': 0.8439538122796523, 'learning_rate': 5.674229294919672e-06, 'epoch': 0.47} 47%|████▋ | 16222/34278 [17:57:37<19:51:32, 3.96s/it] 47%|████▋ | 16223/34278 [17:57:41<18:55:42, 3.77s/it] {'loss': 0.1531, 'grad_norm': 0.7437934665996728, 'learning_rate': 5.6737611726902446e-06, 'epoch': 0.47} 47%|████▋ | 16223/34278 [17:57:41<18:55:42, 3.77s/it] 47%|████▋ | 16224/34278 [17:57:44<17:58:18, 3.58s/it] {'loss': 0.1551, 'grad_norm': 0.9858281049159205, 'learning_rate': 5.673293044445636e-06, 'epoch': 0.47} 47%|████▋ | 16224/34278 [17:57:44<17:58:18, 3.58s/it] 47%|████▋ | 16225/34278 [17:57:48<19:19:20, 3.85s/it] {'loss': 0.1487, 'grad_norm': 0.7692244938518777, 'learning_rate': 5.672824910190029e-06, 'epoch': 0.47} 47%|████▋ | 16225/34278 [17:57:48<19:19:20, 3.85s/it] 47%|████▋ | 16226/34278 [17:57:51<18:08:11, 3.62s/it] {'loss': 0.1394, 'grad_norm': 0.775186689285311, 'learning_rate': 5.672356769927601e-06, 'epoch': 0.47} 47%|████▋ | 16226/34278 [17:57:51<18:08:11, 3.62s/it] 47%|████▋ | 16227/34278 [17:57:55<18:11:47, 3.63s/it] {'loss': 0.16, 'grad_norm': 0.8328297749803291, 'learning_rate': 5.671888623662534e-06, 'epoch': 0.47} 47%|████▋ | 16227/34278 [17:57:55<18:11:47, 3.63s/it] 47%|████▋ | 16228/34278 [17:58:00<20:55:53, 4.17s/it] {'loss': 0.1497, 'grad_norm': 1.738440038387054, 'learning_rate': 5.671420471399005e-06, 'epoch': 0.47} 47%|████▋ | 16228/34278 [17:58:00<20:55:53, 4.17s/it] 47%|████▋ | 16229/34278 [17:58:05<21:22:53, 4.26s/it] {'loss': 0.1133, 'grad_norm': 0.8687263397133216, 'learning_rate': 5.670952313141193e-06, 'epoch': 0.47} 47%|████▋ | 16229/34278 [17:58:05<21:22:53, 4.26s/it] 47%|████▋ | 16230/34278 [17:58:08<20:03:35, 4.00s/it] {'loss': 0.1306, 'grad_norm': 0.9985943263605154, 'learning_rate': 5.670484148893281e-06, 'epoch': 0.47} 47%|████▋ | 16230/34278 [17:58:08<20:03:35, 4.00s/it] 47%|████▋ | 16231/34278 [17:58:14<22:10:20, 4.42s/it] {'loss': 0.1267, 'grad_norm': 0.8357561037795771, 
'learning_rate': 5.6700159786594466e-06, 'epoch': 0.47} 47%|████▋ | 16231/34278 [17:58:14<22:10:20, 4.42s/it] 47%|████▋ | 16232/34278 [17:58:17<20:21:57, 4.06s/it] {'loss': 0.1395, 'grad_norm': 1.0401668740599634, 'learning_rate': 5.6695478024438665e-06, 'epoch': 0.47} 47%|████▋ | 16232/34278 [17:58:17<20:21:57, 4.06s/it] 47%|████▋ | 16233/34278 [17:58:20<19:12:24, 3.83s/it] {'loss': 0.1358, 'grad_norm': 0.7872097597891584, 'learning_rate': 5.669079620250727e-06, 'epoch': 0.47} 47%|████▋ | 16233/34278 [17:58:20<19:12:24, 3.83s/it] 47%|████▋ | 16234/34278 [17:58:24<19:07:59, 3.82s/it] {'loss': 0.1398, 'grad_norm': 0.6619763784436152, 'learning_rate': 5.668611432084202e-06, 'epoch': 0.47} 47%|████▋ | 16234/34278 [17:58:24<19:07:59, 3.82s/it] 47%|████▋ | 16235/34278 [17:58:29<20:52:20, 4.16s/it] {'loss': 0.1384, 'grad_norm': 0.8538666418095292, 'learning_rate': 5.668143237948474e-06, 'epoch': 0.47} 47%|████▋ | 16235/34278 [17:58:29<20:52:20, 4.16s/it] 47%|████▋ | 16236/34278 [17:58:32<19:17:25, 3.85s/it] {'loss': 0.1286, 'grad_norm': 0.7445876302715672, 'learning_rate': 5.667675037847724e-06, 'epoch': 0.47} 47%|████▋ | 16236/34278 [17:58:32<19:17:25, 3.85s/it] 47%|████▋ | 16237/34278 [17:58:36<19:46:14, 3.95s/it] {'loss': 0.1535, 'grad_norm': 0.9563616285404325, 'learning_rate': 5.667206831786131e-06, 'epoch': 0.47} 47%|████▋ | 16237/34278 [17:58:36<19:46:14, 3.95s/it] 47%|████▋ | 16238/34278 [17:58:40<19:31:48, 3.90s/it] {'loss': 0.1305, 'grad_norm': 0.7220433397475235, 'learning_rate': 5.666738619767873e-06, 'epoch': 0.47} 47%|████▋ | 16238/34278 [17:58:40<19:31:48, 3.90s/it] 47%|████▋ | 16239/34278 [17:58:43<18:30:11, 3.69s/it] {'loss': 0.1213, 'grad_norm': 0.9418118334499467, 'learning_rate': 5.666270401797132e-06, 'epoch': 0.47} 47%|████▋ | 16239/34278 [17:58:43<18:30:11, 3.69s/it] 47%|████▋ | 16240/34278 [17:58:49<22:01:58, 4.40s/it] {'loss': 0.1428, 'grad_norm': 0.7768616027863579, 'learning_rate': 5.665802177878088e-06, 'epoch': 0.47} 47%|████▋ | 16240/34278 
[17:58:49<22:01:58, 4.40s/it] 47%|████▋ | 16241/34278 [17:58:55<23:29:21, 4.69s/it] {'loss': 0.1403, 'grad_norm': 0.9101999476468361, 'learning_rate': 5.665333948014922e-06, 'epoch': 0.47} 47%|████▋ | 16241/34278 [17:58:55<23:29:21, 4.69s/it] 47%|████▋ | 16242/34278 [17:58:58<21:23:24, 4.27s/it] {'loss': 0.1304, 'grad_norm': 0.8402877876782288, 'learning_rate': 5.664865712211812e-06, 'epoch': 0.47} 47%|████▋ | 16242/34278 [17:58:58<21:23:24, 4.27s/it] 47%|████▋ | 16243/34278 [17:59:01<19:32:39, 3.90s/it] {'loss': 0.1397, 'grad_norm': 0.9650119347957364, 'learning_rate': 5.66439747047294e-06, 'epoch': 0.47} 47%|████▋ | 16243/34278 [17:59:01<19:32:39, 3.90s/it] 47%|████▋ | 16244/34278 [17:59:07<22:47:15, 4.55s/it] {'loss': 0.1174, 'grad_norm': 0.7516018342228209, 'learning_rate': 5.663929222802487e-06, 'epoch': 0.47} 47%|████▋ | 16244/34278 [17:59:07<22:47:15, 4.55s/it] 47%|████▋ | 16245/34278 [17:59:11<21:19:37, 4.26s/it] {'loss': 0.1438, 'grad_norm': 0.9969496985948444, 'learning_rate': 5.663460969204631e-06, 'epoch': 0.47} 47%|████▋ | 16245/34278 [17:59:11<21:19:37, 4.26s/it] 47%|████▋ | 16246/34278 [17:59:14<19:47:27, 3.95s/it] {'loss': 0.1189, 'grad_norm': 0.9829370754519864, 'learning_rate': 5.662992709683556e-06, 'epoch': 0.47} 47%|████▋ | 16246/34278 [17:59:14<19:47:27, 3.95s/it]
Token indices sequence length is longer than the specified maximum sequence length for this model (10916 > 8192). Running this sequence through the model will result in indexing errors
47%|████▋ | 16247/34278 [17:59:17<18:38:05, 3.72s/it] {'loss': 0.1211, 'grad_norm': 0.7936624946802271, 'learning_rate': 5.662524444243437e-06, 'epoch': 0.47} 47%|████▋ | 16247/34278 [17:59:17<18:38:05, 3.72s/it] 47%|████▋ | 16248/34278 [17:59:21<18:46:30, 3.75s/it] {'loss': 0.1323, 'grad_norm': 1.014677559492022, 'learning_rate': 5.6620561728884616e-06, 'epoch': 0.47} 47%|████▋ | 16248/34278 [17:59:21<18:46:30, 3.75s/it] 47%|████▋ | 16249/34278 [17:59:24<17:35:36, 3.51s/it] {'loss': 0.1218, 'grad_norm': 0.8610488994778456, 'learning_rate': 5.661587895622805e-06, 'epoch': 0.47} 47%|████▋ | 16249/34278 [17:59:24<17:35:36, 3.51s/it] 47%|████▋ | 16250/34278 [17:59:27<17:30:06, 3.49s/it] {'loss': 0.129, 'grad_norm': 0.8314360485172108, 'learning_rate': 5.661119612450647e-06, 'epoch': 0.47} 47%|████▋ | 16250/34278 [17:59:27<17:30:06, 3.49s/it] 47%|████▋ | 16251/34278 [17:59:31<17:53:35, 3.57s/it] {'loss': 0.1336, 'grad_norm': 0.9404321259942992, 'learning_rate': 5.660651323376175e-06, 'epoch': 0.47} 47%|████▋ | 16251/34278 [17:59:31<17:53:35, 3.57s/it] 47%|████▋ | 16252/34278 [17:59:34<16:59:28, 3.39s/it] {'loss': 0.1381, 'grad_norm': 0.7948735444190058, 'learning_rate': 5.660183028403564e-06, 'epoch': 0.47} 47%|████▋ | 16252/34278 [17:59:34<16:59:28, 3.39s/it] 47%|████▋ | 16253/34278 [17:59:38<17:21:15, 3.47s/it] {'loss': 0.1744, 'grad_norm': 0.9186003866517694, 'learning_rate': 5.659714727536997e-06, 'epoch': 0.47} 47%|████▋ | 16253/34278 [17:59:38<17:21:15, 3.47s/it] 47%|████▋ | 16254/34278 [17:59:42<18:24:04, 3.68s/it] {'loss': 0.14, 'grad_norm': 0.6924003566379509, 'learning_rate': 5.659246420780654e-06, 'epoch': 0.47} 47%|████▋ | 16254/34278 [17:59:42<18:24:04, 3.68s/it] 47%|████▋ | 16255/34278 [17:59:47<20:08:23, 4.02s/it] {'loss': 0.1474, 'grad_norm': 0.8309259555606926, 'learning_rate': 5.658778108138716e-06, 'epoch': 0.47} 47%|████▋ | 16255/34278 [17:59:47<20:08:23, 4.02s/it] 47%|████▋ |
16256/34278 [17:59:50<18:26:45, 3.68s/it] {'loss': 0.1487, 'grad_norm': 0.8182532797642579, 'learning_rate': 5.658309789615365e-06, 'epoch': 0.47} 47%|████▋ | 16256/34278 [17:59:50<18:26:45, 3.68s/it] 47%|████▋ | 16257/34278 [17:59:53<17:21:34, 3.47s/it] {'loss': 0.1357, 'grad_norm': 0.8406096018410196, 'learning_rate': 5.657841465214781e-06, 'epoch': 0.47} 47%|████▋ | 16257/34278 [17:59:53<17:21:34, 3.47s/it] 47%|████▋ | 16258/34278 [17:59:56<17:08:17, 3.42s/it] {'loss': 0.1489, 'grad_norm': 0.877345040652179, 'learning_rate': 5.6573731349411455e-06, 'epoch': 0.47} 47%|████▋ | 16258/34278 [17:59:56<17:08:17, 3.42s/it] 47%|████▋ | 16259/34278 [17:59:59<16:12:43, 3.24s/it] {'loss': 0.1375, 'grad_norm': 1.108890956422376, 'learning_rate': 5.656904798798639e-06, 'epoch': 0.47} 47%|████▋ | 16259/34278 [17:59:59<16:12:43, 3.24s/it] 47%|████▋ | 16260/34278 [18:00:02<15:41:26, 3.13s/it] {'loss': 0.1337, 'grad_norm': 1.018419444436362, 'learning_rate': 5.6564364567914446e-06, 'epoch': 0.47} 47%|████▋ | 16260/34278 [18:00:02<15:41:26, 3.13s/it] 47%|████▋ | 16261/34278 [18:00:05<15:28:39, 3.09s/it] {'loss': 0.1333, 'grad_norm': 0.8214578524218132, 'learning_rate': 5.655968108923742e-06, 'epoch': 0.47} 47%|████▋ | 16261/34278 [18:00:05<15:28:39, 3.09s/it] 47%|████▋ | 16262/34278 [18:00:09<17:29:08, 3.49s/it] {'loss': 0.1476, 'grad_norm': 0.9240534830592845, 'learning_rate': 5.655499755199711e-06, 'epoch': 0.47} 47%|████▋ | 16262/34278 [18:00:09<17:29:08, 3.49s/it] 47%|████▋ | 16263/34278 [18:00:12<16:28:27, 3.29s/it] {'loss': 0.1314, 'grad_norm': 0.8709571035293724, 'learning_rate': 5.655031395623537e-06, 'epoch': 0.47} 47%|████▋ | 16263/34278 [18:00:12<16:28:27, 3.29s/it] 47%|████▋ | 16264/34278 [18:00:15<16:35:37, 3.32s/it] {'loss': 0.1443, 'grad_norm': 0.8551789118468752, 'learning_rate': 5.654563030199398e-06, 'epoch': 0.47} 47%|████▋ | 16264/34278 [18:00:15<16:35:37, 3.32s/it] 47%|████▋ | 16265/34278 [18:00:21<20:39:23, 4.13s/it] {'loss': 0.1162, 'grad_norm': 
0.9940196886603396, 'learning_rate': 5.654094658931475e-06, 'epoch': 0.47} 47%|████▋ | 16265/34278 [18:00:21<20:39:23, 4.13s/it] 47%|████▋ | 16266/34278 [18:00:25<19:31:24, 3.90s/it] {'loss': 0.1561, 'grad_norm': 0.8371940155546695, 'learning_rate': 5.653626281823954e-06, 'epoch': 0.47} 47%|████▋ | 16266/34278 [18:00:25<19:31:24, 3.90s/it] 47%|████▋ | 16267/34278 [18:00:28<18:05:09, 3.62s/it] {'loss': 0.1439, 'grad_norm': 0.8418990081313251, 'learning_rate': 5.653157898881012e-06, 'epoch': 0.47} 47%|████▋ | 16267/34278 [18:00:28<18:05:09, 3.62s/it] 47%|████▋ | 16268/34278 [18:00:31<17:44:37, 3.55s/it] {'loss': 0.149, 'grad_norm': 0.8449860784566638, 'learning_rate': 5.652689510106832e-06, 'epoch': 0.47} 47%|████▋ | 16268/34278 [18:00:31<17:44:37, 3.55s/it] 47%|████▋ | 16269/34278 [18:00:37<21:11:32, 4.24s/it] {'loss': 0.1212, 'grad_norm': 0.9916595714534104, 'learning_rate': 5.652221115505596e-06, 'epoch': 0.47} 47%|████▋ | 16269/34278 [18:00:37<21:11:32, 4.24s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
47%|████▋ | 16270/34278 [18:00:40<19:03:30, 3.81s/it] {'loss': 0.154, 'grad_norm': 1.1035250167472606, 'learning_rate': 5.651752715081486e-06, 'epoch': 0.47} 47%|████▋ | 16270/34278 [18:00:40<19:03:30, 3.81s/it] 47%|████▋ | 16271/34278 [18:00:43<18:38:42, 3.73s/it] {'loss': 0.1522, 'grad_norm': 0.9671174009877531, 'learning_rate': 5.651284308838683e-06, 'epoch': 0.47} 47%|████▋ | 16271/34278 [18:00:43<18:38:42, 3.73s/it] 47%|████▋ | 16272/34278 [18:00:49<22:05:49, 4.42s/it] {'loss': 0.1532, 'grad_norm': 0.945547690334323, 'learning_rate': 5.650815896781369e-06, 'epoch': 0.47} 47%|████▋ | 16272/34278 [18:00:49<22:05:49, 4.42s/it] 47%|████▋ | 16273/34278 [18:00:52<20:21:55, 4.07s/it] {'loss': 0.1293, 'grad_norm': 0.9820598763822408, 'learning_rate': 5.650347478913726e-06, 'epoch': 0.47} 47%|████▋ | 16273/34278 [18:00:52<20:21:55, 4.07s/it] 47%|████▋ | 16274/34278 [18:00:56<20:23:22, 4.08s/it] {'loss': 0.142, 'grad_norm': 0.9646350181918945, 'learning_rate': 5.649879055239936e-06, 'epoch': 0.47} 47%|████▋ | 16274/34278 [18:00:56<20:23:22, 4.08s/it] 47%|████▋ | 16275/34278 [18:00:59<18:28:33, 3.69s/it] {'loss': 0.1274, 'grad_norm': 1.0998332083928983, 'learning_rate': 5.649410625764181e-06, 'epoch': 0.47} 47%|████▋ | 16275/34278 [18:00:59<18:28:33, 3.69s/it] 47%|████▋ | 16276/34278 [18:01:03<17:57:12, 3.59s/it] {'loss': 0.1454, 'grad_norm': 1.0358855667448654, 'learning_rate': 5.648942190490645e-06, 'epoch': 0.47} 47%|████▋ | 16276/34278 [18:01:03<17:57:12, 3.59s/it] 47%|████▋ | 16277/34278 [18:01:09<21:33:58, 4.31s/it] {'loss': 0.1263, 'grad_norm': 0.73096872678173, 'learning_rate': 5.648473749423504e-06, 'epoch': 0.47} 47%|████▋ | 16277/34278 [18:01:09<21:33:58, 4.31s/it] 47%|████▋ | 16278/34278 [18:01:12<20:26:27, 4.09s/it] {'loss': 0.095, 'grad_norm': 0.7662822299619612, 'learning_rate': 5.648005302566948e-06, 'epoch': 0.47} 47%|████▋ | 16278/34278 [18:01:12<20:26:27, 4.09s/it] 47%|████▋ | 16279/34278 [18:01:15<19:00:43,
3.80s/it] {'loss': 0.1174, 'grad_norm': 0.8184027461602206, 'learning_rate': 5.647536849925154e-06, 'epoch': 0.47} 47%|████▋ | 16279/34278 [18:01:15<19:00:43, 3.80s/it] 47%|████▋ | 16280/34278 [18:01:22<23:00:27, 4.60s/it] {'loss': 0.1202, 'grad_norm': 0.7924127382126381, 'learning_rate': 5.647068391502304e-06, 'epoch': 0.47} 47%|████▋ | 16280/34278 [18:01:22<23:00:27, 4.60s/it] 47%|████▋ | 16281/34278 [18:01:25<21:16:51, 4.26s/it] {'loss': 0.1581, 'grad_norm': 1.10565237542086, 'learning_rate': 5.646599927302584e-06, 'epoch': 0.47} 47%|████▋ | 16281/34278 [18:01:25<21:16:51, 4.26s/it] 47%|████▋ | 16282/34278 [18:01:28<19:39:27, 3.93s/it] {'loss': 0.1151, 'grad_norm': 0.7452630278413567, 'learning_rate': 5.646131457330173e-06, 'epoch': 0.47} 47%|████▋ | 16282/34278 [18:01:28<19:39:27, 3.93s/it] 48%|████▊ | 16283/34278 [18:01:34<22:47:09, 4.56s/it] {'loss': 0.1377, 'grad_norm': 0.7729927488370573, 'learning_rate': 5.645662981589255e-06, 'epoch': 0.48} 48%|████▊ | 16283/34278 [18:01:34<22:47:09, 4.56s/it] 48%|████▊ | 16284/34278 [18:01:37<19:51:38, 3.97s/it] {'loss': 0.139, 'grad_norm': 0.7449926836030744, 'learning_rate': 5.645194500084011e-06, 'epoch': 0.48} 48%|████▊ | 16284/34278 [18:01:37<19:51:38, 3.97s/it] 48%|████▊ | 16285/34278 [18:01:40<18:24:42, 3.68s/it] {'loss': 0.1303, 'grad_norm': 0.7884663737594525, 'learning_rate': 5.644726012818626e-06, 'epoch': 0.48} 48%|████▊ | 16285/34278 [18:01:40<18:24:42, 3.68s/it] 48%|████▊ | 16286/34278 [18:01:43<17:54:10, 3.58s/it] {'loss': 0.1309, 'grad_norm': 1.008820998919924, 'learning_rate': 5.644257519797281e-06, 'epoch': 0.48} 48%|████▊ | 16286/34278 [18:01:43<17:54:10, 3.58s/it] 48%|████▊ | 16287/34278 [18:01:47<18:18:14, 3.66s/it] {'loss': 0.1518, 'grad_norm': 0.7658301592147836, 'learning_rate': 5.643789021024157e-06, 'epoch': 0.48} 48%|████▊ | 16287/34278 [18:01:47<18:18:14, 3.66s/it] 48%|████▊ | 16288/34278 [18:01:50<17:40:34, 3.54s/it] {'loss': 0.1285, 'grad_norm': 0.880928281849524, 'learning_rate': 
5.64332051650344e-06, 'epoch': 0.48} 48%|████▊ | 16288/34278 [18:01:51<17:40:34, 3.54s/it] 48%|████▊ | 16289/34278 [18:01:54<18:11:16, 3.64s/it] {'loss': 0.1259, 'grad_norm': 0.7554296987147964, 'learning_rate': 5.642852006239311e-06, 'epoch': 0.48} 48%|████▊ | 16289/34278 [18:01:54<18:11:16, 3.64s/it] 48%|████▊ | 16290/34278 [18:01:58<17:39:52, 3.54s/it] {'loss': 0.1228, 'grad_norm': 1.024612839630492, 'learning_rate': 5.642383490235952e-06, 'epoch': 0.48} 48%|████▊ | 16290/34278 [18:01:58<17:39:52, 3.54s/it] 48%|████▊ | 16291/34278 [18:02:01<17:50:41, 3.57s/it] {'loss': 0.1465, 'grad_norm': 0.9496401102042054, 'learning_rate': 5.641914968497547e-06, 'epoch': 0.48} 48%|████▊ | 16291/34278 [18:02:01<17:50:41, 3.57s/it] 48%|████▊ | 16292/34278 [18:02:04<16:50:38, 3.37s/it] {'loss': 0.1388, 'grad_norm': 0.7016204864134581, 'learning_rate': 5.6414464410282775e-06, 'epoch': 0.48} 48%|████▊ | 16292/34278 [18:02:04<16:50:38, 3.37s/it] 48%|████▊ | 16293/34278 [18:02:07<16:28:15, 3.30s/it] {'loss': 0.127, 'grad_norm': 0.9001482427453192, 'learning_rate': 5.640977907832329e-06, 'epoch': 0.48} 48%|████▊ | 16293/34278 [18:02:07<16:28:15, 3.30s/it] 48%|████▊ | 16294/34278 [18:02:10<16:11:19, 3.24s/it] {'loss': 0.1445, 'grad_norm': 0.7739858159602111, 'learning_rate': 5.640509368913881e-06, 'epoch': 0.48} 48%|████▊ | 16294/34278 [18:02:10<16:11:19, 3.24s/it] 48%|████▊ | 16295/34278 [18:02:14<15:58:14, 3.20s/it] {'loss': 0.144, 'grad_norm': 0.8609336916889685, 'learning_rate': 5.640040824277119e-06, 'epoch': 0.48} 48%|████▊ | 16295/34278 [18:02:14<15:58:14, 3.20s/it] 48%|████▊ | 16296/34278 [18:02:17<16:06:38, 3.23s/it] {'loss': 0.115, 'grad_norm': 0.7231928968906722, 'learning_rate': 5.639572273926226e-06, 'epoch': 0.48} 48%|████▊ | 16296/34278 [18:02:17<16:06:38, 3.23s/it] 48%|████▊ | 16297/34278 [18:02:20<16:07:28, 3.23s/it] {'loss': 0.127, 'grad_norm': 0.8155024116357175, 'learning_rate': 5.639103717865383e-06, 'epoch': 0.48} 48%|████▊ | 16297/34278 [18:02:20<16:07:28, 
3.23s/it] 48%|████▊ | 16298/34278 [18:02:23<15:38:14, 3.13s/it] {'loss': 0.1327, 'grad_norm': 0.8359432403720437, 'learning_rate': 5.6386351560987765e-06, 'epoch': 0.48} 48%|████▊ | 16298/34278 [18:02:23<15:38:14, 3.13s/it] 48%|████▊ | 16299/34278 [18:02:26<16:06:25, 3.23s/it] {'loss': 0.1372, 'grad_norm': 0.9868203154406928, 'learning_rate': 5.6381665886305855e-06, 'epoch': 0.48} 48%|████▊ | 16299/34278 [18:02:26<16:06:25, 3.23s/it] 48%|████▊ | 16300/34278 [18:02:30<16:23:24, 3.28s/it] {'loss': 0.124, 'grad_norm': 0.7039813079919041, 'learning_rate': 5.637698015464996e-06, 'epoch': 0.48} 48%|████▊ | 16300/34278 [18:02:30<16:23:24, 3.28s/it] 48%|████▊ | 16301/34278 [18:02:33<16:27:48, 3.30s/it] {'loss': 0.1366, 'grad_norm': 0.9433427371998303, 'learning_rate': 5.637229436606193e-06, 'epoch': 0.48} 48%|████▊ | 16301/34278 [18:02:33<16:27:48, 3.30s/it] 48%|████▊ | 16302/34278 [18:02:36<15:30:39, 3.11s/it] {'loss': 0.1304, 'grad_norm': 1.104607429078894, 'learning_rate': 5.636760852058356e-06, 'epoch': 0.48} 48%|████▊ | 16302/34278 [18:02:36<15:30:39, 3.11s/it] 48%|████▊ | 16303/34278 [18:02:42<19:47:34, 3.96s/it] {'loss': 0.1131, 'grad_norm': 0.9894809272089414, 'learning_rate': 5.63629226182567e-06, 'epoch': 0.48} 48%|████▊ | 16303/34278 [18:02:42<19:47:34, 3.96s/it] 48%|████▊ | 16304/34278 [18:02:45<18:30:47, 3.71s/it] {'loss': 0.1276, 'grad_norm': 0.8373168273815884, 'learning_rate': 5.635823665912319e-06, 'epoch': 0.48} 48%|████▊ | 16304/34278 [18:02:45<18:30:47, 3.71s/it] 48%|████▊ | 16305/34278 [18:02:48<18:09:07, 3.64s/it] {'loss': 0.1538, 'grad_norm': 1.1356005883576916, 'learning_rate': 5.635355064322485e-06, 'epoch': 0.48} 48%|████▊ | 16305/34278 [18:02:48<18:09:07, 3.64s/it] 48%|████▊ | 16306/34278 [18:02:51<16:57:07, 3.40s/it] {'loss': 0.1354, 'grad_norm': 0.9302560494816402, 'learning_rate': 5.634886457060355e-06, 'epoch': 0.48} 48%|████▊ | 16306/34278 [18:02:51<16:57:07, 3.40s/it] 48%|████▊ | 16307/34278 [18:02:54<16:44:19, 3.35s/it] {'loss': 0.1238, 
'grad_norm': 0.9249712503956626, 'learning_rate': 5.634417844130108e-06, 'epoch': 0.48} 48%|████▊ | 16307/34278 [18:02:54<16:44:19, 3.35s/it] 48%|████▊ | 16308/34278 [18:03:00<20:02:36, 4.02s/it] {'loss': 0.1312, 'grad_norm': 0.9696940757980068, 'learning_rate': 5.633949225535932e-06, 'epoch': 0.48} 48%|████▊ | 16308/34278 [18:03:00<20:02:36, 4.02s/it] 48%|████▊ | 16309/34278 [18:03:03<18:23:27, 3.68s/it] {'loss': 0.1279, 'grad_norm': 1.0416341297635099, 'learning_rate': 5.633480601282007e-06, 'epoch': 0.48} 48%|████▊ | 16309/34278 [18:03:03<18:23:27, 3.68s/it] 48%|████▊ | 16310/34278 [18:03:06<17:54:39, 3.59s/it] {'loss': 0.1486, 'grad_norm': 1.023882818797613, 'learning_rate': 5.633011971372519e-06, 'epoch': 0.48} 48%|████▊ | 16310/34278 [18:03:06<17:54:39, 3.59s/it] 48%|████▊ | 16311/34278 [18:03:11<19:12:51, 3.85s/it] {'loss': 0.1116, 'grad_norm': 0.8128353283753157, 'learning_rate': 5.632543335811651e-06, 'epoch': 0.48} 48%|████▊ | 16311/34278 [18:03:11<19:12:51, 3.85s/it] 48%|████▊ | 16312/34278 [18:03:14<18:21:16, 3.68s/it] {'loss': 0.1379, 'grad_norm': 0.984506636394337, 'learning_rate': 5.632074694603586e-06, 'epoch': 0.48} 48%|████▊ | 16312/34278 [18:03:14<18:21:16, 3.68s/it] 48%|████▊ | 16313/34278 [18:03:18<18:30:42, 3.71s/it] {'loss': 0.1328, 'grad_norm': 1.084500215608927, 'learning_rate': 5.631606047752512e-06, 'epoch': 0.48} 48%|████▊ | 16313/34278 [18:03:18<18:30:42, 3.71s/it] 48%|████▊ | 16314/34278 [18:03:21<17:12:49, 3.45s/it] {'loss': 0.1127, 'grad_norm': 0.7600621811441662, 'learning_rate': 5.631137395262608e-06, 'epoch': 0.48} 48%|████▊ | 16314/34278 [18:03:21<17:12:49, 3.45s/it] 48%|████▊ | 16315/34278 [18:03:25<19:09:42, 3.84s/it] {'loss': 0.1189, 'grad_norm': 0.9186156422568045, 'learning_rate': 5.6306687371380585e-06, 'epoch': 0.48} 48%|████▊ | 16315/34278 [18:03:25<19:09:42, 3.84s/it] 48%|████▊ | 16316/34278 [18:03:29<18:48:47, 3.77s/it] {'loss': 0.1336, 'grad_norm': 0.7012573468766916, 'learning_rate': 5.630200073383052e-06, 'epoch': 
0.48} 48%|████▊ | 16316/34278 [18:03:29<18:48:47, 3.77s/it] 48%|████▊ | 16317/34278 [18:03:33<18:25:38, 3.69s/it] {'loss': 0.1464, 'grad_norm': 0.9912072267814284, 'learning_rate': 5.629731404001769e-06, 'epoch': 0.48} 48%|████▊ | 16317/34278 [18:03:33<18:25:38, 3.69s/it] 48%|████▊ | 16318/34278 [18:03:38<21:42:56, 4.35s/it] {'loss': 0.1154, 'grad_norm': 0.766488550025548, 'learning_rate': 5.6292627289983934e-06, 'epoch': 0.48} 48%|████▊ | 16318/34278 [18:03:39<21:42:56, 4.35s/it] 48%|████▊ | 16319/34278 [18:03:42<20:17:31, 4.07s/it] {'loss': 0.123, 'grad_norm': 0.9167470478466365, 'learning_rate': 5.628794048377111e-06, 'epoch': 0.48} 48%|████▊ | 16319/34278 [18:03:42<20:17:31, 4.07s/it] 48%|████▊ | 16320/34278 [18:03:45<19:25:49, 3.90s/it] {'loss': 0.1332, 'grad_norm': 0.8191229535731257, 'learning_rate': 5.628325362142105e-06, 'epoch': 0.48} 48%|████▊ | 16320/34278 [18:03:45<19:25:49, 3.90s/it] 48%|████▊ | 16321/34278 [18:03:52<23:05:02, 4.63s/it] {'loss': 0.1343, 'grad_norm': 0.7887257907050483, 'learning_rate': 5.62785667029756e-06, 'epoch': 0.48} 48%|████▊ | 16321/34278 [18:03:52<23:05:02, 4.63s/it] 48%|████▊ | 16322/34278 [18:03:56<23:19:10, 4.68s/it] {'loss': 0.1503, 'grad_norm': 0.7854894873088004, 'learning_rate': 5.627387972847661e-06, 'epoch': 0.48} 48%|████▊ | 16322/34278 [18:03:56<23:19:10, 4.68s/it] 48%|████▊ | 16323/34278 [18:03:59<20:47:43, 4.17s/it] {'loss': 0.1497, 'grad_norm': 0.9401852237844465, 'learning_rate': 5.626919269796594e-06, 'epoch': 0.48} 48%|████▊ | 16323/34278 [18:03:59<20:47:43, 4.17s/it] 48%|████▊ | 16324/34278 [18:04:03<20:11:25, 4.05s/it] {'loss': 0.1437, 'grad_norm': 0.9153490341263396, 'learning_rate': 5.626450561148537e-06, 'epoch': 0.48} 48%|████▊ | 16324/34278 [18:04:03<20:11:25, 4.05s/it] 48%|████▊ | 16325/34278 [18:04:06<18:24:17, 3.69s/it] {'loss': 0.1354, 'grad_norm': 0.7760296164459083, 'learning_rate': 5.625981846907682e-06, 'epoch': 0.48} 48%|████▊ | 16325/34278 [18:04:06<18:24:17, 3.69s/it] 48%|████▊ | 16326/34278 
[18:04:12<21:53:09, 4.39s/it] {'loss': 0.1492, 'grad_norm': 0.9832838538956926, 'learning_rate': 5.62551312707821e-06, 'epoch': 0.48} 48%|████▊ | 16326/34278 [18:04:12<21:53:09, 4.39s/it] 48%|████▊ | 16327/34278 [18:04:15<19:22:52, 3.89s/it] {'loss': 0.1447, 'grad_norm': 0.770392387891613, 'learning_rate': 5.625044401664306e-06, 'epoch': 0.48} 48%|████▊ | 16327/34278 [18:04:15<19:22:52, 3.89s/it] 48%|████▊ | 16328/34278 [18:04:18<17:55:52, 3.60s/it] {'loss': 0.1423, 'grad_norm': 0.8861269753905067, 'learning_rate': 5.624575670670155e-06, 'epoch': 0.48} 48%|████▊ | 16328/34278 [18:04:18<17:55:52, 3.60s/it] 48%|████▊ | 16329/34278 [18:04:24<21:31:11, 4.32s/it] {'loss': 0.1479, 'grad_norm': 0.772477885446521, 'learning_rate': 5.624106934099941e-06, 'epoch': 0.48} 48%|████▊ | 16329/34278 [18:04:24<21:31:11, 4.32s/it] 48%|████▊ | 16330/34278 [18:04:27<19:29:37, 3.91s/it] {'loss': 0.1366, 'grad_norm': 0.8666486284004198, 'learning_rate': 5.623638191957849e-06, 'epoch': 0.48} 48%|████▊ | 16330/34278 [18:04:27<19:29:37, 3.91s/it] 48%|████▊ | 16331/34278 [18:04:30<18:03:22, 3.62s/it] {'loss': 0.1234, 'grad_norm': 1.0405527675674633, 'learning_rate': 5.623169444248064e-06, 'epoch': 0.48} 48%|████▊ | 16331/34278 [18:04:30<18:03:22, 3.62s/it] 48%|████▊ | 16332/34278 [18:04:34<18:39:38, 3.74s/it] {'loss': 0.1242, 'grad_norm': 0.9482855984339433, 'learning_rate': 5.6227006909747724e-06, 'epoch': 0.48} 48%|████▊ | 16332/34278 [18:04:34<18:39:38, 3.74s/it] 48%|████▊ | 16333/34278 [18:04:39<21:48:20, 4.37s/it] {'loss': 0.1278, 'grad_norm': 0.8474810524686269, 'learning_rate': 5.622231932142157e-06, 'epoch': 0.48} 48%|████▊ | 16333/34278 [18:04:39<21:48:20, 4.37s/it] 48%|████▊ | 16334/34278 [18:04:43<20:48:49, 4.18s/it] {'loss': 0.1239, 'grad_norm': 0.9077890273505308, 'learning_rate': 5.621763167754402e-06, 'epoch': 0.48} 48%|████▊ | 16334/34278 [18:04:43<20:48:49, 4.18s/it] 48%|████▊ | 16335/34278 [18:04:47<19:35:43, 3.93s/it] {'loss': 0.1233, 'grad_norm': 0.742815865182335, 
'learning_rate': 5.621294397815697e-06, 'epoch': 0.48} 48%|████▊ | 16335/34278 [18:04:47<19:35:43, 3.93s/it] 48%|████▊ | 16336/34278 [18:04:50<18:58:15, 3.81s/it] {'loss': 0.1294, 'grad_norm': 1.087039266341935, 'learning_rate': 5.620825622330221e-06, 'epoch': 0.48} 48%|████▊ | 16336/34278 [18:04:50<18:58:15, 3.81s/it] 48%|████▊ | 16337/34278 [18:04:55<20:50:49, 4.18s/it] {'loss': 0.1392, 'grad_norm': 0.932618488392475, 'learning_rate': 5.620356841302162e-06, 'epoch': 0.48} 48%|████▊ | 16337/34278 [18:04:55<20:50:49, 4.18s/it] 48%|████▊ | 16338/34278 [18:04:58<19:00:49, 3.82s/it] {'loss': 0.1507, 'grad_norm': 0.9678135850146654, 'learning_rate': 5.6198880547357085e-06, 'epoch': 0.48} 48%|████▊ | 16338/34278 [18:04:58<19:00:49, 3.82s/it] 48%|████▊ | 16339/34278 [18:05:01<17:44:44, 3.56s/it] {'loss': 0.1141, 'grad_norm': 0.7887670456865438, 'learning_rate': 5.619419262635039e-06, 'epoch': 0.48} 48%|████▊ | 16339/34278 [18:05:01<17:44:44, 3.56s/it] 48%|████▊ | 16340/34278 [18:05:05<17:57:43, 3.60s/it] {'loss': 0.1223, 'grad_norm': 0.9818752688943305, 'learning_rate': 5.618950465004344e-06, 'epoch': 0.48} 48%|████▊ | 16340/34278 [18:05:05<17:57:43, 3.60s/it] 48%|████▊ | 16341/34278 [18:05:08<17:39:01, 3.54s/it] {'loss': 0.1435, 'grad_norm': 1.0586796820123847, 'learning_rate': 5.618481661847806e-06, 'epoch': 0.48} 48%|████▊ | 16341/34278 [18:05:08<17:39:01, 3.54s/it] 48%|████▊ | 16342/34278 [18:05:14<21:12:37, 4.26s/it] {'loss': 0.1323, 'grad_norm': 1.1787417750038875, 'learning_rate': 5.618012853169611e-06, 'epoch': 0.48} 48%|████▊ | 16342/34278 [18:05:14<21:12:37, 4.26s/it] 48%|████▊ | 16343/34278 [18:05:17<19:43:22, 3.96s/it] {'loss': 0.129, 'grad_norm': 0.9320450187711383, 'learning_rate': 5.617544038973946e-06, 'epoch': 0.48} 48%|████▊ | 16343/34278 [18:05:17<19:43:22, 3.96s/it] 48%|████▊ | 16344/34278 [18:05:21<19:13:50, 3.86s/it] {'loss': 0.1617, 'grad_norm': 1.2869324861698728, 'learning_rate': 5.617075219264996e-06, 'epoch': 0.48} 48%|████▊ | 16344/34278 
[18:05:21<19:13:50, 3.86s/it] 48%|████▊ | 16345/34278 [18:05:24<17:51:13, 3.58s/it] {'loss': 0.1311, 'grad_norm': 1.0556070212776434, 'learning_rate': 5.616606394046944e-06, 'epoch': 0.48} 48%|████▊ | 16345/34278 [18:05:24<17:51:13, 3.58s/it] 48%|████▊ | 16346/34278 [18:05:27<16:50:39, 3.38s/it] {'loss': 0.1204, 'grad_norm': 0.9284384602442743, 'learning_rate': 5.616137563323978e-06, 'epoch': 0.48} 48%|████▊ | 16346/34278 [18:05:27<16:50:39, 3.38s/it] 48%|████▊ | 16347/34278 [18:05:30<16:24:08, 3.29s/it] {'loss': 0.1267, 'grad_norm': 0.8314594194575753, 'learning_rate': 5.615668727100283e-06, 'epoch': 0.48} 48%|████▊ | 16347/34278 [18:05:30<16:24:08, 3.29s/it] 48%|████▊ | 16348/34278 [18:05:33<16:08:42, 3.24s/it] {'loss': 0.1313, 'grad_norm': 0.8195256567909605, 'learning_rate': 5.615199885380044e-06, 'epoch': 0.48} 48%|████▊ | 16348/34278 [18:05:33<16:08:42, 3.24s/it] 48%|████▊ | 16349/34278 [18:05:36<16:03:00, 3.22s/it] {'loss': 0.1578, 'grad_norm': 0.9979888435768413, 'learning_rate': 5.614731038167448e-06, 'epoch': 0.48} 48%|████▊ | 16349/34278 [18:05:36<16:03:00, 3.22s/it] 48%|████▊ | 16350/34278 [18:05:40<17:16:48, 3.47s/it] {'loss': 0.1347, 'grad_norm': 0.9328922645018125, 'learning_rate': 5.614262185466679e-06, 'epoch': 0.48} 48%|████▊ | 16350/34278 [18:05:40<17:16:48, 3.47s/it] 48%|████▊ | 16351/34278 [18:05:43<16:25:21, 3.30s/it] {'loss': 0.136, 'grad_norm': 0.8920237393330316, 'learning_rate': 5.613793327281924e-06, 'epoch': 0.48} 48%|████▊ | 16351/34278 [18:05:43<16:25:21, 3.30s/it] 48%|████▊ | 16352/34278 [18:05:47<16:34:40, 3.33s/it] {'loss': 0.1161, 'grad_norm': 0.7439757715781523, 'learning_rate': 5.61332446361737e-06, 'epoch': 0.48} 48%|████▊ | 16352/34278 [18:05:47<16:34:40, 3.33s/it] 48%|████▊ | 16353/34278 [18:05:50<16:51:39, 3.39s/it] {'loss': 0.1422, 'grad_norm': 0.9532320816988058, 'learning_rate': 5.612855594477202e-06, 'epoch': 0.48} 48%|████▊ | 16353/34278 [18:05:50<16:51:39, 3.39s/it] 48%|████▊ | 16354/34278 [18:05:53<16:25:29, 3.30s/it] 
{'loss': 0.1233, 'grad_norm': 0.773370658411886, 'learning_rate': 5.612386719865604e-06, 'epoch': 0.48} 48%|████▊ | 16354/34278 [18:05:53<16:25:29, 3.30s/it] 48%|████▊ | 16355/34278 [18:05:57<17:20:29, 3.48s/it] {'loss': 0.1331, 'grad_norm': 0.9073938318172595, 'learning_rate': 5.611917839786763e-06, 'epoch': 0.48} 48%|████▊ | 16355/34278 [18:05:57<17:20:29, 3.48s/it] 48%|████▊ | 16356/34278 [18:06:00<17:04:54, 3.43s/it] {'loss': 0.1323, 'grad_norm': 0.8527106710249348, 'learning_rate': 5.6114489542448684e-06, 'epoch': 0.48} 48%|████▊ | 16356/34278 [18:06:00<17:04:54, 3.43s/it] 48%|████▊ | 16357/34278 [18:06:04<17:32:19, 3.52s/it] {'loss': 0.1331, 'grad_norm': 0.787239349102488, 'learning_rate': 5.610980063244099e-06, 'epoch': 0.48} 48%|████▊ | 16357/34278 [18:06:04<17:32:19, 3.52s/it] 48%|████▊ | 16358/34278 [18:06:07<17:02:06, 3.42s/it] {'loss': 0.1485, 'grad_norm': 0.750076972832993, 'learning_rate': 5.61051116678865e-06, 'epoch': 0.48} 48%|████▊ | 16358/34278 [18:06:07<17:02:06, 3.42s/it] 48%|████▊ | 16359/34278 [18:06:11<17:35:34, 3.53s/it] {'loss': 0.1588, 'grad_norm': 0.9619912671955636, 'learning_rate': 5.610042264882701e-06, 'epoch': 0.48} 48%|████▊ | 16359/34278 [18:06:11<17:35:34, 3.53s/it] 48%|████▊ | 16360/34278 [18:06:14<16:51:04, 3.39s/it] {'loss': 0.1387, 'grad_norm': 1.0436512459267488, 'learning_rate': 5.60957335753044e-06, 'epoch': 0.48} 48%|████▊ | 16360/34278 [18:06:14<16:51:04, 3.39s/it] 48%|████▊ | 16361/34278 [18:06:17<16:32:10, 3.32s/it] {'loss': 0.1349, 'grad_norm': 0.9279762888706286, 'learning_rate': 5.6091044447360545e-06, 'epoch': 0.48} 48%|████▊ | 16361/34278 [18:06:17<16:32:10, 3.32s/it] 48%|████▊ | 16362/34278 [18:06:22<18:05:48, 3.64s/it] {'loss': 0.1497, 'grad_norm': 0.840565962671998, 'learning_rate': 5.60863552650373e-06, 'epoch': 0.48} 48%|████▊ | 16362/34278 [18:06:22<18:05:48, 3.64s/it] 48%|████▊ | 16363/34278 [18:06:24<16:44:38, 3.36s/it] {'loss': 0.1352, 'grad_norm': 1.2783894863802543, 'learning_rate': 
5.608166602837652e-06, 'epoch': 0.48} 48%|████▊ | 16363/34278 [18:06:24<16:44:38, 3.36s/it] 48%|████▊ | 16364/34278 [18:06:27<15:55:38, 3.20s/it] {'loss': 0.1318, 'grad_norm': 0.8850599410811496, 'learning_rate': 5.607697673742008e-06, 'epoch': 0.48} 48%|████▊ | 16364/34278 [18:06:27<15:55:38, 3.20s/it] 48%|████▊ | 16365/34278 [18:06:30<15:52:07, 3.19s/it] {'loss': 0.1255, 'grad_norm': 0.8114331658584741, 'learning_rate': 5.607228739220984e-06, 'epoch': 0.48} 48%|████▊ | 16365/34278 [18:06:30<15:52:07, 3.19s/it] 48%|████▊ | 16366/34278 [18:06:34<15:44:08, 3.16s/it] {'loss': 0.1662, 'grad_norm': 1.5518561725449465, 'learning_rate': 5.606759799278766e-06, 'epoch': 0.48} 48%|████▊ | 16366/34278 [18:06:34<15:44:08, 3.16s/it] 48%|████▊ | 16367/34278 [18:06:37<15:27:33, 3.11s/it] {'loss': 0.1221, 'grad_norm': 0.9667529991787196, 'learning_rate': 5.606290853919543e-06, 'epoch': 0.48} 48%|████▊ | 16367/34278 [18:06:37<15:27:33, 3.11s/it] 48%|████▊ | 16368/34278 [18:06:41<17:17:14, 3.47s/it] {'loss': 0.1235, 'grad_norm': 0.8184996547673199, 'learning_rate': 5.6058219031475e-06, 'epoch': 0.48} 48%|████▊ | 16368/34278 [18:06:41<17:17:14, 3.47s/it] 48%|████▊ | 16369/34278 [18:06:44<16:39:31, 3.35s/it] {'loss': 0.1296, 'grad_norm': 1.2228537771081796, 'learning_rate': 5.605352946966822e-06, 'epoch': 0.48} 48%|████▊ | 16369/34278 [18:06:44<16:39:31, 3.35s/it] 48%|████▊ | 16370/34278 [18:06:50<20:51:42, 4.19s/it] {'loss': 0.1564, 'grad_norm': 1.0041666589820943, 'learning_rate': 5.604883985381699e-06, 'epoch': 0.48} 48%|████▊ | 16370/34278 [18:06:50<20:51:42, 4.19s/it] 48%|████▊ | 16371/34278 [18:06:54<20:15:45, 4.07s/it] {'loss': 0.1294, 'grad_norm': 0.9556537717756693, 'learning_rate': 5.604415018396315e-06, 'epoch': 0.48} 48%|████▊ | 16371/34278 [18:06:54<20:15:45, 4.07s/it] 48%|████▊ | 16372/34278 [18:06:57<19:16:26, 3.88s/it] {'loss': 0.149, 'grad_norm': 0.9860609207112173, 'learning_rate': 5.603946046014859e-06, 'epoch': 0.48} 48%|████▊ | 16372/34278 [18:06:57<19:16:26, 
3.88s/it] 48%|████▊ | 16373/34278 [18:07:01<18:36:25, 3.74s/it] {'loss': 0.1252, 'grad_norm': 0.9007862306763328, 'learning_rate': 5.603477068241516e-06, 'epoch': 0.48} 48%|████▊ | 16373/34278 [18:07:01<18:36:25, 3.74s/it] 48%|████▊ | 16374/34278 [18:07:04<17:39:25, 3.55s/it] {'loss': 0.1262, 'grad_norm': 0.9248634586943631, 'learning_rate': 5.603008085080475e-06, 'epoch': 0.48} 48%|████▊ | 16374/34278 [18:07:04<17:39:25, 3.55s/it] 48%|████▊ | 16375/34278 [18:07:07<16:34:42, 3.33s/it] {'loss': 0.1188, 'grad_norm': 0.8512401550436068, 'learning_rate': 5.602539096535921e-06, 'epoch': 0.48} 48%|████▊ | 16375/34278 [18:07:07<16:34:42, 3.33s/it] 48%|████▊ | 16376/34278 [18:07:13<20:23:36, 4.10s/it] {'loss': 0.1491, 'grad_norm': 0.9262570479023378, 'learning_rate': 5.602070102612042e-06, 'epoch': 0.48} 48%|████▊ | 16376/34278 [18:07:13<20:23:36, 4.10s/it] 48%|████▊ | 16377/34278 [18:07:17<20:24:15, 4.10s/it] {'loss': 0.1637, 'grad_norm': 0.8646970332991294, 'learning_rate': 5.6016011033130246e-06, 'epoch': 0.48} 48%|████▊ | 16377/34278 [18:07:17<20:24:15, 4.10s/it] 48%|████▊ | 16378/34278 [18:07:21<20:32:42, 4.13s/it] {'loss': 0.1435, 'grad_norm': 1.1847361888051147, 'learning_rate': 5.601132098643056e-06, 'epoch': 0.48} 48%|████▊ | 16378/34278 [18:07:21<20:32:42, 4.13s/it] 48%|████▊ | 16379/34278 [18:07:25<21:05:48, 4.24s/it] {'loss': 0.1164, 'grad_norm': 0.764824753947898, 'learning_rate': 5.600663088606324e-06, 'epoch': 0.48} 48%|████▊ | 16379/34278 [18:07:25<21:05:48, 4.24s/it] 48%|████▊ | 16380/34278 [18:07:28<18:53:20, 3.80s/it] {'loss': 0.1268, 'grad_norm': 1.0289881956391431, 'learning_rate': 5.600194073207015e-06, 'epoch': 0.48} 48%|████▊ | 16380/34278 [18:07:28<18:53:20, 3.80s/it] 48%|████▊ | 16381/34278 [18:07:32<19:13:26, 3.87s/it] {'loss': 0.1469, 'grad_norm': 1.0154543053608711, 'learning_rate': 5.599725052449316e-06, 'epoch': 0.48} 48%|████▊ | 16381/34278 [18:07:32<19:13:26, 3.87s/it] 48%|████▊ | 16382/34278 [18:07:35<17:59:58, 3.62s/it] {'loss': 0.1182, 
'grad_norm': 0.7685139442929392, 'learning_rate': 5.599256026337417e-06, 'epoch': 0.48} 48%|████▊ | 16382/34278 [18:07:35<17:59:58, 3.62s/it] 48%|████▊ | 16383/34278 [18:07:41<20:56:21, 4.21s/it] {'loss': 0.1249, 'grad_norm': 0.8005495512040665, 'learning_rate': 5.5987869948755014e-06, 'epoch': 0.48} 48%|████▊ | 16383/34278 [18:07:41<20:56:21, 4.21s/it] 48%|████▊ | 16384/34278 [18:07:45<20:17:58, 4.08s/it] {'loss': 0.1413, 'grad_norm': 0.7659027252764419, 'learning_rate': 5.598317958067758e-06, 'epoch': 0.48} 48%|████▊ | 16384/34278 [18:07:45<20:17:58, 4.08s/it] 48%|████▊ | 16385/34278 [18:07:48<19:43:28, 3.97s/it] {'loss': 0.1359, 'grad_norm': 0.7497897555580734, 'learning_rate': 5.597848915918376e-06, 'epoch': 0.48} 48%|████▊ | 16385/34278 [18:07:48<19:43:28, 3.97s/it] 48%|████▊ | 16386/34278 [18:07:52<19:16:18, 3.88s/it] {'loss': 0.135, 'grad_norm': 0.9880357635674445, 'learning_rate': 5.5973798684315415e-06, 'epoch': 0.48} 48%|████▊ | 16386/34278 [18:07:52<19:16:18, 3.88s/it] 48%|████▊ | 16387/34278 [18:07:55<18:18:59, 3.69s/it] {'loss': 0.1312, 'grad_norm': 0.8929744030550361, 'learning_rate': 5.5969108156114406e-06, 'epoch': 0.48} 48%|████▊ | 16387/34278 [18:07:55<18:18:59, 3.69s/it] 48%|████▊ | 16388/34278 [18:07:59<18:02:41, 3.63s/it] {'loss': 0.1336, 'grad_norm': 0.8720010906201271, 'learning_rate': 5.596441757462266e-06, 'epoch': 0.48} 48%|████▊ | 16388/34278 [18:07:59<18:02:41, 3.63s/it] 48%|████▊ | 16389/34278 [18:08:02<18:14:09, 3.67s/it] {'loss': 0.1095, 'grad_norm': 0.7842769572080821, 'learning_rate': 5.595972693988199e-06, 'epoch': 0.48} 48%|████▊ | 16389/34278 [18:08:02<18:14:09, 3.67s/it] 48%|████▊ | 16390/34278 [18:08:06<17:58:26, 3.62s/it] {'loss': 0.1336, 'grad_norm': 0.7877639372626694, 'learning_rate': 5.595503625193429e-06, 'epoch': 0.48} 48%|████▊ | 16390/34278 [18:08:06<17:58:26, 3.62s/it] 48%|████▊ | 16391/34278 [18:08:10<18:18:43, 3.69s/it] {'loss': 0.1541, 'grad_norm': 1.0346505626710718, 'learning_rate': 5.595034551082147e-06, 
'epoch': 0.48} 48%|████▊ | 16391/34278 [18:08:10<18:18:43, 3.69s/it] 48%|████▊ | 16392/34278 [18:08:13<17:13:58, 3.47s/it] {'loss': 0.1532, 'grad_norm': 1.1464245879999029, 'learning_rate': 5.594565471658537e-06, 'epoch': 0.48} 48%|████▊ | 16392/34278 [18:08:13<17:13:58, 3.47s/it] 48%|████▊ | 16393/34278 [18:08:16<16:31:01, 3.32s/it] {'loss': 0.1286, 'grad_norm': 0.7323674847221384, 'learning_rate': 5.594096386926789e-06, 'epoch': 0.48} 48%|████▊ | 16393/34278 [18:08:16<16:31:01, 3.32s/it] 48%|████▊ | 16394/34278 [18:08:19<17:05:50, 3.44s/it] {'loss': 0.149, 'grad_norm': 0.7177221973266082, 'learning_rate': 5.5936272968910905e-06, 'epoch': 0.48} 48%|████▊ | 16394/34278 [18:08:19<17:05:50, 3.44s/it] 48%|████▊ | 16395/34278 [18:08:23<17:11:35, 3.46s/it] {'loss': 0.1489, 'grad_norm': 0.939104521542515, 'learning_rate': 5.5931582015556294e-06, 'epoch': 0.48} 48%|████▊ | 16395/34278 [18:08:23<17:11:35, 3.46s/it] 48%|████▊ | 16396/34278 [18:08:28<19:53:49, 4.01s/it] {'loss': 0.1266, 'grad_norm': 0.7656754892557142, 'learning_rate': 5.592689100924595e-06, 'epoch': 0.48} 48%|████▊ | 16396/34278 [18:08:28<19:53:49, 4.01s/it] 48%|████▊ | 16397/34278 [18:08:34<22:58:08, 4.62s/it] {'loss': 0.1538, 'grad_norm': 0.71370623999873, 'learning_rate': 5.59221999500217e-06, 'epoch': 0.48} 48%|████▊ | 16397/34278 [18:08:34<22:58:08, 4.62s/it] 48%|████▊ | 16398/34278 [18:08:38<21:53:13, 4.41s/it] {'loss': 0.1246, 'grad_norm': 0.885135228127414, 'learning_rate': 5.59175088379255e-06, 'epoch': 0.48} 48%|████▊ | 16398/34278 [18:08:38<21:53:13, 4.41s/it] 48%|████▊ | 16399/34278 [18:08:41<19:46:21, 3.98s/it] {'loss': 0.1483, 'grad_norm': 1.030477827237176, 'learning_rate': 5.591281767299916e-06, 'epoch': 0.48} 48%|████▊ | 16399/34278 [18:08:41<19:46:21, 3.98s/it] 48%|████▊ | 16400/34278 [18:08:45<18:51:16, 3.80s/it] {'loss': 0.1169, 'grad_norm': 0.817095829340118, 'learning_rate': 5.590812645528462e-06, 'epoch': 0.48} 48%|████▊ | 16400/34278 [18:08:45<18:51:16, 3.80s/it] 48%|████▊ | 
16401/34278 [18:08:49<19:26:02, 3.91s/it] {'loss': 0.1335, 'grad_norm': 0.7114241373232695, 'learning_rate': 5.590343518482374e-06, 'epoch': 0.48} 48%|████▊ | 16401/34278 [18:08:49<19:26:02, 3.91s/it] 48%|████▊ | 16402/34278 [18:08:52<18:36:46, 3.75s/it] {'loss': 0.1445, 'grad_norm': 0.8398076516534582, 'learning_rate': 5.589874386165838e-06, 'epoch': 0.48} 48%|████▊ | 16402/34278 [18:08:52<18:36:46, 3.75s/it] 48%|████▊ | 16403/34278 [18:08:56<18:13:09, 3.67s/it] {'loss': 0.1207, 'grad_norm': 0.8405885559418892, 'learning_rate': 5.5894052485830464e-06, 'epoch': 0.48} 48%|████▊ | 16403/34278 [18:08:56<18:13:09, 3.67s/it] 48%|████▊ | 16404/34278 [18:09:01<20:09:29, 4.06s/it] {'loss': 0.1113, 'grad_norm': 0.6021177127044054, 'learning_rate': 5.588936105738184e-06, 'epoch': 0.48} 48%|████▊ | 16404/34278 [18:09:01<20:09:29, 4.06s/it] 48%|████▊ | 16405/34278 [18:09:05<21:18:49, 4.29s/it] {'loss': 0.1423, 'grad_norm': 1.0032348959711763, 'learning_rate': 5.588466957635441e-06, 'epoch': 0.48} 48%|████▊ | 16405/34278 [18:09:05<21:18:49, 4.29s/it] 48%|████▊ | 16406/34278 [18:09:08<19:30:52, 3.93s/it] {'loss': 0.1522, 'grad_norm': 0.7924935714187326, 'learning_rate': 5.587997804279005e-06, 'epoch': 0.48} 48%|████▊ | 16406/34278 [18:09:08<19:30:52, 3.93s/it] 48%|████▊ | 16407/34278 [18:09:12<19:06:47, 3.85s/it] {'loss': 0.1208, 'grad_norm': 0.7680357104654308, 'learning_rate': 5.587528645673066e-06, 'epoch': 0.48} 48%|████▊ | 16407/34278 [18:09:12<19:06:47, 3.85s/it] 48%|████▊ | 16408/34278 [18:09:15<18:18:52, 3.69s/it] {'loss': 0.1406, 'grad_norm': 0.8609661601631099, 'learning_rate': 5.58705948182181e-06, 'epoch': 0.48} 48%|████▊ | 16408/34278 [18:09:15<18:18:52, 3.69s/it] 48%|████▊ | 16409/34278 [18:09:19<17:45:42, 3.58s/it] {'loss': 0.1342, 'grad_norm': 0.9511141456151238, 'learning_rate': 5.586590312729429e-06, 'epoch': 0.48} 48%|████▊ | 16409/34278 [18:09:19<17:45:42, 3.58s/it] 48%|████▊ | 16410/34278 [18:09:22<17:08:15, 3.45s/it] {'loss': 0.1095, 'grad_norm': 
0.7120753550924881, 'learning_rate': 5.586121138400108e-06, 'epoch': 0.48} 48%|████▊ | 16410/34278 [18:09:22<17:08:15, 3.45s/it] 48%|████▊ | 16411/34278 [18:09:25<16:43:27, 3.37s/it] {'loss': 0.14, 'grad_norm': 1.0006084562915092, 'learning_rate': 5.5856519588380385e-06, 'epoch': 0.48} 48%|████▊ | 16411/34278 [18:09:25<16:43:27, 3.37s/it] 48%|████▊ | 16412/34278 [18:09:28<16:20:08, 3.29s/it] {'loss': 0.1616, 'grad_norm': 0.9435542241082506, 'learning_rate': 5.5851827740474075e-06, 'epoch': 0.48} 48%|████▊ | 16412/34278 [18:09:28<16:20:08, 3.29s/it] 48%|████▊ | 16413/34278 [18:09:31<16:00:32, 3.23s/it] {'loss': 0.121, 'grad_norm': 0.930465655253894, 'learning_rate': 5.584713584032406e-06, 'epoch': 0.48} 48%|████▊ | 16413/34278 [18:09:31<16:00:32, 3.23s/it] 48%|████▊ | 16414/34278 [18:09:34<15:24:44, 3.11s/it] {'loss': 0.1268, 'grad_norm': 0.7637096447809273, 'learning_rate': 5.5842443887972184e-06, 'epoch': 0.48} 48%|████▊ | 16414/34278 [18:09:34<15:24:44, 3.11s/it] 48%|████▊ | 16415/34278 [18:09:40<19:16:29, 3.88s/it] {'loss': 0.1387, 'grad_norm': 0.8205669053114473, 'learning_rate': 5.5837751883460375e-06, 'epoch': 0.48} 48%|████▊ | 16415/34278 [18:09:40<19:16:29, 3.88s/it] 48%|████▊ | 16416/34278 [18:09:43<18:24:06, 3.71s/it] {'loss': 0.1417, 'grad_norm': 0.8744346633681963, 'learning_rate': 5.583305982683053e-06, 'epoch': 0.48} 48%|████▊ | 16416/34278 [18:09:43<18:24:06, 3.71s/it] 48%|████▊ | 16417/34278 [18:09:47<18:25:16, 3.71s/it] {'loss': 0.1191, 'grad_norm': 0.7638189563637072, 'learning_rate': 5.582836771812448e-06, 'epoch': 0.48} 48%|████▊ | 16417/34278 [18:09:47<18:25:16, 3.71s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 48%|████▊ | 16418/34278 [18:09:50<18:14:25, 3.68s/it] {'loss': 0.1313, 'grad_norm': 1.0753041099163074, 'learning_rate': 5.582367555738419e-06, 'epoch': 0.48} 48%|████▊ | 16418/34278 [18:09:50<18:14:25, 3.68s/it] 48%|████▊ | 16419/34278 [18:09:56<21:51:05, 4.40s/it] {'loss': 0.1149, 'grad_norm': 0.8909782947913392, 'learning_rate': 5.5818983344651515e-06, 'epoch': 0.48} 48%|████▊ | 16419/34278 [18:09:57<21:51:05, 4.40s/it] 48%|████▊ | 16420/34278 [18:10:00<20:23:36, 4.11s/it] {'loss': 0.1323, 'grad_norm': 0.8579148418636114, 'learning_rate': 5.581429107996833e-06, 'epoch': 0.48} 48%|████▊ | 16420/34278 [18:10:00<20:23:36, 4.11s/it] 48%|████▊ | 16421/34278 [18:10:03<18:43:34, 3.78s/it] {'loss': 0.1384, 'grad_norm': 1.112360093645641, 'learning_rate': 5.580959876337654e-06, 'epoch': 0.48} 48%|████▊ | 16421/34278 [18:10:03<18:43:34, 3.78s/it] 48%|████▊ | 16422/34278 [18:10:09<22:34:51, 4.55s/it] {'loss': 0.1551, 'grad_norm': 0.8913188629078489, 'learning_rate': 5.580490639491805e-06, 'epoch': 0.48} 48%|████▊ | 16422/34278 [18:10:09<22:34:51, 4.55s/it] 48%|████▊ | 16423/34278 [18:10:13<21:16:43, 4.29s/it] {'loss': 0.1642, 'grad_norm': 0.7864805654378544, 'learning_rate': 5.580021397463473e-06, 'epoch': 0.48} 48%|████▊ | 16423/34278 [18:10:13<21:16:43, 4.29s/it] 48%|████▊ | 16424/34278 [18:10:17<20:36:24, 4.16s/it] {'loss': 0.1035, 'grad_norm': 1.0481762492684195, 'learning_rate': 5.579552150256849e-06, 'epoch': 0.48} 48%|████▊ | 16424/34278 [18:10:17<20:36:24, 4.16s/it] 48%|████▊ | 16425/34278 [18:10:21<20:02:36, 4.04s/it] {'loss': 0.1503, 'grad_norm': 0.8525269377785037, 'learning_rate': 5.5790828978761215e-06, 'epoch': 0.48} 48%|████▊ | 16425/34278 [18:10:21<20:02:36, 4.04s/it] 48%|████▊ | 16426/34278 [18:10:24<18:36:27, 3.75s/it] {'loss': 0.1297, 'grad_norm': 0.7712905240418593, 'learning_rate': 5.578613640325481e-06, 'epoch': 0.48} 48%|████▊ | 16426/34278 [18:10:24<18:36:27, 3.75s/it] 48%|████▊ | 16427/34278 [18:10:27<17:42:55, 
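The `torch/utils/checkpoint.py:87` UserWarning interleaved above (and repeated at several later steps) is emitted when activation checkpointing wraps a segment none of whose tensor inputs have `requires_grad=True`; the reentrant checkpoint path then cannot route gradients through that segment. A minimal standalone repro of the warning, separate from the training code itself:

```python
# Reproduce the "None of the inputs have requires_grad=True" warning from the
# log: the reentrant torch.utils.checkpoint warns when no tensor input to the
# checkpointed segment requires grad, and the segment's output is then
# detached from autograd (its gradients will be None).
import warnings

import torch
from torch.utils.checkpoint import checkpoint


def segment(x):
    # Stand-in for a checkpointed transformer block.
    return x * 2


x = torch.randn(4, 4)  # requires_grad defaults to False

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    out = checkpoint(segment, x, use_reentrant=True)
```

In a training run this typically means the checkpointed block sits before any trainable parameter (for instance a frozen vision tower whose embeddings were never marked with `requires_grad_()`), which is harmless if that part is meant to stay frozen.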
3.57s/it] {'loss': 0.1501, 'grad_norm': 1.1679634596940096, 'learning_rate': 5.5781443776091145e-06, 'epoch': 0.48} 48%|████▊ | 16427/34278 [18:10:27<17:42:55, 3.57s/it] 48%|████▊ | 16428/34278 [18:10:30<16:48:49, 3.39s/it] {'loss': 0.1392, 'grad_norm': 1.0837233145526382, 'learning_rate': 5.577675109731216e-06, 'epoch': 0.48} 48%|████▊ | 16428/34278 [18:10:30<16:48:49, 3.39s/it] 48%|████▊ | 16429/34278 [18:10:36<20:44:39, 4.18s/it] {'loss': 0.1263, 'grad_norm': 0.8112739922629897, 'learning_rate': 5.577205836695968e-06, 'epoch': 0.48} 48%|████▊ | 16429/34278 [18:10:36<20:44:39, 4.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 48%|████▊ | 16430/34278 [18:10:39<18:53:08, 3.81s/it] {'loss': 0.1345, 'grad_norm': 1.4041024594045965, 'learning_rate': 5.576736558507566e-06, 'epoch': 0.48} 48%|████▊ | 16430/34278 [18:10:39<18:53:08, 3.81s/it] 48%|████▊ | 16431/34278 [18:10:42<17:58:25, 3.63s/it] {'loss': 0.1325, 'grad_norm': 1.137613393891698, 'learning_rate': 5.5762672751702e-06, 'epoch': 0.48} 48%|████▊ | 16431/34278 [18:10:42<17:58:25, 3.63s/it] 48%|████▊ | 16432/34278 [18:10:45<17:48:56, 3.59s/it] {'loss': 0.1214, 'grad_norm': 0.8934552907259496, 'learning_rate': 5.575797986688053e-06, 'epoch': 0.48} 48%|████▊ | 16432/34278 [18:10:45<17:48:56, 3.59s/it] 48%|████▊ | 16433/34278 [18:10:49<17:06:22, 3.45s/it] {'loss': 0.1091, 'grad_norm': 0.6215980394763082, 'learning_rate': 5.575328693065322e-06, 'epoch': 0.48} 48%|████▊ | 16433/34278 [18:10:49<17:06:22, 3.45s/it] 48%|████▊ | 16434/34278 [18:10:53<18:09:58, 3.67s/it] {'loss': 0.1246, 'grad_norm': 1.1148917131769525, 'learning_rate': 5.574859394306194e-06, 'epoch': 0.48} 48%|████▊ | 16434/34278 [18:10:53<18:09:58, 3.67s/it] 48%|████▊ | 16435/34278 [18:10:57<18:28:07, 3.73s/it] {'loss': 0.1283, 'grad_norm': 1.1038719635113177, 'learning_rate': 
5.574390090414856e-06, 'epoch': 0.48} 48%|████▊ | 16435/34278 [18:10:57<18:28:07, 3.73s/it] 48%|████▊ | 16436/34278 [18:11:03<21:54:28, 4.42s/it] {'loss': 0.149, 'grad_norm': 0.8680988789897821, 'learning_rate': 5.573920781395502e-06, 'epoch': 0.48} 48%|████▊ | 16436/34278 [18:11:03<21:54:28, 4.42s/it] 48%|████▊ | 16437/34278 [18:11:06<19:49:28, 4.00s/it] {'loss': 0.1458, 'grad_norm': 1.0342378646567074, 'learning_rate': 5.57345146725232e-06, 'epoch': 0.48} 48%|████▊ | 16437/34278 [18:11:06<19:49:28, 4.00s/it] 48%|████▊ | 16438/34278 [18:11:09<18:19:02, 3.70s/it] {'loss': 0.1341, 'grad_norm': 0.9369272256059544, 'learning_rate': 5.572982147989501e-06, 'epoch': 0.48} 48%|████▊ | 16438/34278 [18:11:09<18:19:02, 3.70s/it] 48%|████▊ | 16439/34278 [18:11:14<20:16:00, 4.09s/it] {'loss': 0.1266, 'grad_norm': 0.709060020956352, 'learning_rate': 5.5725128236112326e-06, 'epoch': 0.48} 48%|████▊ | 16439/34278 [18:11:14<20:16:00, 4.09s/it] 48%|████▊ | 16440/34278 [18:11:20<23:08:53, 4.67s/it] {'loss': 0.1579, 'grad_norm': 0.7898214892689679, 'learning_rate': 5.572043494121707e-06, 'epoch': 0.48} 48%|████▊ | 16440/34278 [18:11:20<23:08:53, 4.67s/it] 48%|████▊ | 16441/34278 [18:11:23<21:06:21, 4.26s/it] {'loss': 0.1524, 'grad_norm': 1.0129801073910862, 'learning_rate': 5.571574159525114e-06, 'epoch': 0.48} 48%|████▊ | 16441/34278 [18:11:23<21:06:21, 4.26s/it] 48%|████▊ | 16442/34278 [18:11:27<20:05:24, 4.05s/it] {'loss': 0.1195, 'grad_norm': 1.0811600843143345, 'learning_rate': 5.571104819825643e-06, 'epoch': 0.48} 48%|████▊ | 16442/34278 [18:11:27<20:05:24, 4.05s/it] 48%|████▊ | 16443/34278 [18:11:30<18:44:07, 3.78s/it] {'loss': 0.1459, 'grad_norm': 0.8784083018169584, 'learning_rate': 5.570635475027486e-06, 'epoch': 0.48} 48%|████▊ | 16443/34278 [18:11:30<18:44:07, 3.78s/it] 48%|████▊ | 16444/34278 [18:11:33<18:02:13, 3.64s/it] {'loss': 0.1357, 'grad_norm': 0.8980149655159101, 'learning_rate': 5.570166125134829e-06, 'epoch': 0.48} 48%|████▊ | 16444/34278 [18:11:33<18:02:13, 
3.64s/it] 48%|████▊ | 16445/34278 [18:11:36<16:41:42, 3.37s/it] {'loss': 0.1377, 'grad_norm': 0.8902907706004111, 'learning_rate': 5.569696770151866e-06, 'epoch': 0.48} 48%|████▊ | 16445/34278 [18:11:36<16:41:42, 3.37s/it] 48%|████▊ | 16446/34278 [18:11:39<16:54:43, 3.41s/it] {'loss': 0.1116, 'grad_norm': 0.8007189101995574, 'learning_rate': 5.569227410082788e-06, 'epoch': 0.48} 48%|████▊ | 16446/34278 [18:11:39<16:54:43, 3.41s/it] 48%|████▊ | 16447/34278 [18:11:43<17:50:22, 3.60s/it] {'loss': 0.1442, 'grad_norm': 1.013187782512625, 'learning_rate': 5.568758044931781e-06, 'epoch': 0.48} 48%|████▊ | 16447/34278 [18:11:43<17:50:22, 3.60s/it] 48%|████▊ | 16448/34278 [18:11:47<17:39:55, 3.57s/it] {'loss': 0.1273, 'grad_norm': 0.8005773321204657, 'learning_rate': 5.568288674703041e-06, 'epoch': 0.48} 48%|████▊ | 16448/34278 [18:11:47<17:39:55, 3.57s/it] 48%|████▊ | 16449/34278 [18:11:50<16:40:31, 3.37s/it] {'loss': 0.1429, 'grad_norm': 0.7571208636228627, 'learning_rate': 5.5678192994007526e-06, 'epoch': 0.48} 48%|████▊ | 16449/34278 [18:11:50<16:40:31, 3.37s/it] 48%|████▊ | 16450/34278 [18:11:54<17:48:31, 3.60s/it] {'loss': 0.1236, 'grad_norm': 0.6262548695044964, 'learning_rate': 5.56734991902911e-06, 'epoch': 0.48} 48%|████▊ | 16450/34278 [18:11:54<17:48:31, 3.60s/it] 48%|████▊ | 16451/34278 [18:12:00<20:53:43, 4.22s/it] {'loss': 0.1397, 'grad_norm': 1.0030999671621827, 'learning_rate': 5.566880533592303e-06, 'epoch': 0.48} 48%|████▊ | 16451/34278 [18:12:00<20:53:43, 4.22s/it] 48%|████▊ | 16452/34278 [18:12:02<18:52:33, 3.81s/it] {'loss': 0.1256, 'grad_norm': 0.825659076886547, 'learning_rate': 5.566411143094521e-06, 'epoch': 0.48} 48%|████▊ | 16452/34278 [18:12:02<18:52:33, 3.81s/it] 48%|████▊ | 16453/34278 [18:12:05<17:42:59, 3.58s/it] {'loss': 0.1444, 'grad_norm': 0.7132176911079248, 'learning_rate': 5.565941747539957e-06, 'epoch': 0.48} 48%|████▊ | 16453/34278 [18:12:05<17:42:59, 3.58s/it] 48%|████▊ | 16454/34278 [18:12:08<16:50:08, 3.40s/it] {'loss': 0.1528, 
'grad_norm': 0.7043674861429906, 'learning_rate': 5.565472346932799e-06, 'epoch': 0.48} 48%|████▊ | 16454/34278 [18:12:08<16:50:08, 3.40s/it] 48%|████▊ | 16455/34278 [18:12:14<20:49:44, 4.21s/it] {'loss': 0.1267, 'grad_norm': 0.9347895133101263, 'learning_rate': 5.565002941277239e-06, 'epoch': 0.48} 48%|████▊ | 16455/34278 [18:12:14<20:49:44, 4.21s/it] 48%|████▊ | 16456/34278 [18:12:18<19:53:40, 4.02s/it] {'loss': 0.1314, 'grad_norm': 0.7726630271489435, 'learning_rate': 5.564533530577467e-06, 'epoch': 0.48} 48%|████▊ | 16456/34278 [18:12:18<19:53:40, 4.02s/it] 48%|████▊ | 16457/34278 [18:12:23<20:43:38, 4.19s/it] {'loss': 0.1319, 'grad_norm': 0.6726280534384327, 'learning_rate': 5.5640641148376765e-06, 'epoch': 0.48} 48%|████▊ | 16457/34278 [18:12:23<20:43:38, 4.19s/it] 48%|████▊ | 16458/34278 [18:12:27<21:36:11, 4.36s/it] {'loss': 0.1297, 'grad_norm': 0.8693280533083172, 'learning_rate': 5.563594694062055e-06, 'epoch': 0.48} 48%|████▊ | 16458/34278 [18:12:27<21:36:11, 4.36s/it] 48%|████▊ | 16459/34278 [18:12:31<20:12:03, 4.08s/it] {'loss': 0.1107, 'grad_norm': 1.0234800984648962, 'learning_rate': 5.563125268254794e-06, 'epoch': 0.48} 48%|████▊ | 16459/34278 [18:12:31<20:12:03, 4.08s/it] 48%|████▊ | 16460/34278 [18:12:37<23:25:52, 4.73s/it] {'loss': 0.1648, 'grad_norm': 0.9369029388200885, 'learning_rate': 5.562655837420086e-06, 'epoch': 0.48} 48%|████▊ | 16460/34278 [18:12:37<23:25:52, 4.73s/it] 48%|████▊ | 16461/34278 [18:12:43<24:53:35, 5.03s/it] {'loss': 0.1238, 'grad_norm': 0.9688258507488033, 'learning_rate': 5.562186401562121e-06, 'epoch': 0.48} 48%|████▊ | 16461/34278 [18:12:43<24:53:35, 5.03s/it] 48%|████▊ | 16462/34278 [18:12:46<22:42:43, 4.59s/it] {'loss': 0.1447, 'grad_norm': 0.9769512448249669, 'learning_rate': 5.561716960685089e-06, 'epoch': 0.48} 48%|████▊ | 16462/34278 [18:12:46<22:42:43, 4.59s/it] 48%|████▊ | 16463/34278 [18:12:49<20:16:09, 4.10s/it] {'loss': 0.1481, 'grad_norm': 1.0054589122449409, 'learning_rate': 5.561247514793183e-06, 'epoch': 
0.48} 48%|████▊ | 16463/34278 [18:12:49<20:16:09, 4.10s/it] 48%|████▊ | 16464/34278 [18:12:53<19:15:55, 3.89s/it] {'loss': 0.1446, 'grad_norm': 0.8938732851180062, 'learning_rate': 5.560778063890593e-06, 'epoch': 0.48} 48%|████▊ | 16464/34278 [18:12:53<19:15:55, 3.89s/it] 48%|████▊ | 16465/34278 [18:12:56<18:19:55, 3.70s/it] {'loss': 0.1388, 'grad_norm': 0.9926766665942596, 'learning_rate': 5.560308607981511e-06, 'epoch': 0.48} 48%|████▊ | 16465/34278 [18:12:56<18:19:55, 3.70s/it] 48%|████▊ | 16466/34278 [18:13:02<21:00:47, 4.25s/it] {'loss': 0.1388, 'grad_norm': 0.6755509937775684, 'learning_rate': 5.559839147070125e-06, 'epoch': 0.48} 48%|████▊ | 16466/34278 [18:13:02<21:00:47, 4.25s/it] 48%|████▊ | 16467/34278 [18:13:05<19:27:32, 3.93s/it] {'loss': 0.126, 'grad_norm': 1.1091210266959968, 'learning_rate': 5.5593696811606314e-06, 'epoch': 0.48} 48%|████▊ | 16467/34278 [18:13:05<19:27:32, 3.93s/it] 48%|████▊ | 16468/34278 [18:13:08<18:36:51, 3.76s/it] {'loss': 0.1566, 'grad_norm': 0.8662699026525464, 'learning_rate': 5.558900210257218e-06, 'epoch': 0.48} 48%|████▊ | 16468/34278 [18:13:08<18:36:51, 3.76s/it] 48%|████▊ | 16469/34278 [18:13:12<18:28:32, 3.73s/it] {'loss': 0.1342, 'grad_norm': 0.7126832076605465, 'learning_rate': 5.558430734364077e-06, 'epoch': 0.48} 48%|████▊ | 16469/34278 [18:13:12<18:28:32, 3.73s/it] 48%|████▊ | 16470/34278 [18:13:15<18:15:34, 3.69s/it] {'loss': 0.1429, 'grad_norm': 0.8529551201597447, 'learning_rate': 5.557961253485399e-06, 'epoch': 0.48} 48%|████▊ | 16470/34278 [18:13:15<18:15:34, 3.69s/it] 48%|████▊ | 16471/34278 [18:13:19<17:30:14, 3.54s/it] {'loss': 0.1195, 'grad_norm': 0.7867620828402069, 'learning_rate': 5.5574917676253755e-06, 'epoch': 0.48} 48%|████▊ | 16471/34278 [18:13:19<17:30:14, 3.54s/it] 48%|████▊ | 16472/34278 [18:13:22<17:44:15, 3.59s/it] {'loss': 0.1182, 'grad_norm': 0.7318710212935304, 'learning_rate': 5.5570222767882e-06, 'epoch': 0.48} 48%|████▊ | 16472/34278 [18:13:22<17:44:15, 3.59s/it] 48%|████▊ | 16473/34278 
[18:13:26<18:45:00, 3.79s/it] {'loss': 0.1299, 'grad_norm': 0.7632637272100438, 'learning_rate': 5.5565527809780635e-06, 'epoch': 0.48} 48%|████▊ | 16473/34278 [18:13:27<18:45:00, 3.79s/it] 48%|████▊ | 16474/34278 [18:13:29<17:01:51, 3.44s/it] {'loss': 0.1523, 'grad_norm': 0.8563688559181248, 'learning_rate': 5.556083280199154e-06, 'epoch': 0.48} 48%|████▊ | 16474/34278 [18:13:29<17:01:51, 3.44s/it] 48%|████▊ | 16475/34278 [18:13:32<16:19:14, 3.30s/it] {'loss': 0.1288, 'grad_norm': 0.8808119500229724, 'learning_rate': 5.555613774455667e-06, 'epoch': 0.48} 48%|████▊ | 16475/34278 [18:13:32<16:19:14, 3.30s/it] 48%|████▊ | 16476/34278 [18:13:38<20:11:45, 4.08s/it] {'loss': 0.1326, 'grad_norm': 0.6808399437320495, 'learning_rate': 5.555144263751795e-06, 'epoch': 0.48} 48%|████▊ | 16476/34278 [18:13:38<20:11:45, 4.08s/it] 48%|████▊ | 16477/34278 [18:13:41<18:52:01, 3.82s/it] {'loss': 0.1225, 'grad_norm': 0.808250933327027, 'learning_rate': 5.554674748091724e-06, 'epoch': 0.48} 48%|████▊ | 16477/34278 [18:13:41<18:52:01, 3.82s/it] 48%|████▊ | 16478/34278 [18:13:46<20:16:00, 4.10s/it] {'loss': 0.1491, 'grad_norm': 0.9609093191262928, 'learning_rate': 5.5542052274796524e-06, 'epoch': 0.48} 48%|████▊ | 16478/34278 [18:13:46<20:16:00, 4.10s/it] 48%|████▊ | 16479/34278 [18:13:52<22:56:29, 4.64s/it] {'loss': 0.1298, 'grad_norm': 0.8257760206838456, 'learning_rate': 5.5537357019197665e-06, 'epoch': 0.48} 48%|████▊ | 16479/34278 [18:13:52<22:56:29, 4.64s/it] 48%|████▊ | 16480/34278 [18:13:55<20:41:55, 4.19s/it] {'loss': 0.1389, 'grad_norm': 0.7090715152658524, 'learning_rate': 5.553266171416261e-06, 'epoch': 0.48} 48%|████▊ | 16480/34278 [18:13:55<20:41:55, 4.19s/it] 48%|████▊ | 16481/34278 [18:13:58<19:04:27, 3.86s/it] {'loss': 0.1422, 'grad_norm': 1.0532035124340533, 'learning_rate': 5.5527966359733274e-06, 'epoch': 0.48} 48%|████▊ | 16481/34278 [18:13:58<19:04:27, 3.86s/it] 48%|████▊ | 16482/34278 [18:14:03<20:24:52, 4.13s/it] {'loss': 0.1246, 'grad_norm': 1.0824765111729429, 
'learning_rate': 5.552327095595157e-06, 'epoch': 0.48} 48%|████▊ | 16482/34278 [18:14:03<20:24:52, 4.13s/it] 48%|████▊ | 16483/34278 [18:14:08<21:13:31, 4.29s/it] {'loss': 0.1449, 'grad_norm': 0.7181584806931197, 'learning_rate': 5.551857550285943e-06, 'epoch': 0.48} 48%|████▊ | 16483/34278 [18:14:08<21:13:31, 4.29s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 48%|████▊ | 16484/34278 [18:14:11<20:06:08, 4.07s/it] {'loss': 0.1285, 'grad_norm': 0.7609579922837822, 'learning_rate': 5.551388000049875e-06, 'epoch': 0.48} 48%|████▊ | 16484/34278 [18:14:11<20:06:08, 4.07s/it] 48%|████▊ | 16485/34278 [18:14:14<19:06:58, 3.87s/it] {'loss': 0.1275, 'grad_norm': 0.9222075582398839, 'learning_rate': 5.550918444891148e-06, 'epoch': 0.48} 48%|████▊ | 16485/34278 [18:14:14<19:06:58, 3.87s/it] 48%|████▊ | 16486/34278 [18:14:21<22:35:54, 4.57s/it] {'loss': 0.1431, 'grad_norm': 0.9596579643201208, 'learning_rate': 5.550448884813952e-06, 'epoch': 0.48} 48%|████▊ | 16486/34278 [18:14:21<22:35:54, 4.57s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 48%|████▊ | 16487/34278 [18:14:24<20:40:23, 4.18s/it] {'loss': 0.1381, 'grad_norm': 0.7015797071947066, 'learning_rate': 5.54997931982248e-06, 'epoch': 0.48} 48%|████▊ | 16487/34278 [18:14:24<20:40:23, 4.18s/it] 48%|████▊ | 16488/34278 [18:14:28<19:49:30, 4.01s/it] {'loss': 0.158, 'grad_norm': 0.925869743484793, 'learning_rate': 5.5495097499209235e-06, 'epoch': 0.48} 48%|████▊ | 16488/34278 [18:14:28<19:49:30, 4.01s/it] 48%|████▊ | 16489/34278 [18:14:31<19:19:34, 3.91s/it] {'loss': 0.1067, 'grad_norm': 0.8716894146482536, 'learning_rate': 5.549040175113476e-06, 'epoch': 0.48} 48%|████▊ | 16489/34278 [18:14:31<19:19:34, 3.91s/it] 48%|████▊ | 16490/34278 [18:14:35<19:08:50, 3.88s/it] {'loss': 0.1431, 'grad_norm': 0.7537976244261037, 'learning_rate': 5.548570595404328e-06, 'epoch': 0.48} 48%|████▊ | 16490/34278 [18:14:35<19:08:50, 3.88s/it] 48%|████▊ | 16491/34278 [18:14:38<17:30:11, 3.54s/it] {'loss': 0.1516, 'grad_norm': 0.9190646876419771, 'learning_rate': 5.548101010797673e-06, 'epoch': 0.48} 48%|████▊ | 16491/34278 [18:14:38<17:30:11, 3.54s/it] 48%|████▊ | 16492/34278 [18:14:41<17:19:25, 3.51s/it] {'loss': 0.1157, 'grad_norm': 0.852624910553048, 'learning_rate': 5.547631421297704e-06, 'epoch': 0.48} 48%|████▊ | 16492/34278 [18:14:41<17:19:25, 3.51s/it] 48%|████▊ | 16493/34278 [18:14:47<21:11:51, 4.29s/it] {'loss': 0.1214, 'grad_norm': 0.718147437544926, 'learning_rate': 5.5471618269086125e-06, 'epoch': 0.48} 48%|████▊ | 16493/34278 [18:14:47<21:11:51, 4.29s/it] 48%|████▊ | 16494/34278 [18:14:51<20:22:19, 4.12s/it] {'loss': 0.1487, 'grad_norm': 0.8552282376593306, 'learning_rate': 5.546692227634588e-06, 'epoch': 0.48} 48%|████▊ | 16494/34278 [18:14:51<20:22:19, 4.12s/it] 48%|████▊ | 16495/34278 [18:14:55<19:24:08, 3.93s/it] {'loss': 0.144, 'grad_norm': 0.9300653222118747, 'learning_rate': 5.546222623479829e-06, 'epoch': 0.48} 48%|████▊ | 16495/34278 [18:14:55<19:24:08, 3.93s/it] 48%|████▊ | 16496/34278 [18:14:58<18:45:21, 
3.80s/it] {'loss': 0.1705, 'grad_norm': 0.856634662299573, 'learning_rate': 5.545753014448523e-06, 'epoch': 0.48} 48%|████▊ | 16496/34278 [18:14:58<18:45:21, 3.80s/it] 48%|████▊ | 16497/34278 [18:15:03<19:47:46, 4.01s/it] {'loss': 0.1387, 'grad_norm': 0.653131023475819, 'learning_rate': 5.545283400544864e-06, 'epoch': 0.48} 48%|████▊ | 16497/34278 [18:15:03<19:47:46, 4.01s/it] 48%|████▊ | 16498/34278 [18:15:08<21:24:29, 4.33s/it] {'loss': 0.1388, 'grad_norm': 0.7570686133259324, 'learning_rate': 5.544813781773046e-06, 'epoch': 0.48} 48%|████▊ | 16498/34278 [18:15:08<21:24:29, 4.33s/it] 48%|████▊ | 16499/34278 [18:15:11<19:50:49, 4.02s/it] {'loss': 0.1181, 'grad_norm': 0.9521446827960226, 'learning_rate': 5.544344158137262e-06, 'epoch': 0.48} 48%|████▊ | 16499/34278 [18:15:11<19:50:49, 4.02s/it] 48%|████▊ | 16500/34278 [18:15:14<17:47:39, 3.60s/it] {'loss': 0.1173, 'grad_norm': 1.8571987751703076, 'learning_rate': 5.543874529641701e-06, 'epoch': 0.48} 48%|████▊ | 16500/34278 [18:15:14<17:47:39, 3.60s/it] 48%|████▊ | 16501/34278 [18:15:17<17:47:11, 3.60s/it] {'loss': 0.1118, 'grad_norm': 0.6692509290438796, 'learning_rate': 5.543404896290559e-06, 'epoch': 0.48} 48%|████▊ | 16501/34278 [18:15:17<17:47:11, 3.60s/it] 48%|████▊ | 16502/34278 [18:15:20<16:36:15, 3.36s/it] {'loss': 0.1481, 'grad_norm': 0.8703717468671059, 'learning_rate': 5.542935258088027e-06, 'epoch': 0.48} 48%|████▊ | 16502/34278 [18:15:20<16:36:15, 3.36s/it] 48%|████▊ | 16503/34278 [18:15:23<16:21:00, 3.31s/it] {'loss': 0.1519, 'grad_norm': 0.95446062042861, 'learning_rate': 5.5424656150383e-06, 'epoch': 0.48} 48%|████▊ | 16503/34278 [18:15:23<16:21:00, 3.31s/it] 48%|████▊ | 16504/34278 [18:15:27<16:57:32, 3.43s/it] {'loss': 0.1345, 'grad_norm': 0.599736865610296, 'learning_rate': 5.5419959671455685e-06, 'epoch': 0.48} 48%|████▊ | 16504/34278 [18:15:27<16:57:32, 3.43s/it] 48%|████▊ | 16505/34278 [18:15:30<16:05:39, 3.26s/it] {'loss': 0.1264, 'grad_norm': 0.9137106562508852, 'learning_rate': 
5.541526314414025e-06, 'epoch': 0.48} 48%|████▊ | 16505/34278 [18:15:30<16:05:39, 3.26s/it] 48%|████▊ | 16506/34278 [18:15:33<15:38:33, 3.17s/it] {'loss': 0.118, 'grad_norm': 0.8354033497039266, 'learning_rate': 5.541056656847866e-06, 'epoch': 0.48} 48%|████▊ | 16506/34278 [18:15:33<15:38:33, 3.17s/it] 48%|████▊ | 16507/34278 [18:15:36<15:26:24, 3.13s/it] {'loss': 0.1303, 'grad_norm': 0.9765969251457027, 'learning_rate': 5.540586994451281e-06, 'epoch': 0.48} 48%|████▊ | 16507/34278 [18:15:36<15:26:24, 3.13s/it] 48%|████▊ | 16508/34278 [18:15:41<18:29:19, 3.75s/it] {'loss': 0.1842, 'grad_norm': 0.8333721286224998, 'learning_rate': 5.540117327228467e-06, 'epoch': 0.48} 48%|████▊ | 16508/34278 [18:15:41<18:29:19, 3.75s/it] 48%|████▊ | 16509/34278 [18:15:45<18:20:25, 3.72s/it] {'loss': 0.1287, 'grad_norm': 0.8190719756353236, 'learning_rate': 5.5396476551836105e-06, 'epoch': 0.48} 48%|████▊ | 16509/34278 [18:15:45<18:20:25, 3.72s/it] 48%|████▊ | 16510/34278 [18:15:48<17:17:35, 3.50s/it] {'loss': 0.131, 'grad_norm': 0.8662930610716588, 'learning_rate': 5.539177978320912e-06, 'epoch': 0.48} 48%|████▊ | 16510/34278 [18:15:48<17:17:35, 3.50s/it] 48%|████▊ | 16511/34278 [18:15:51<17:42:40, 3.59s/it] {'loss': 0.122, 'grad_norm': 0.778730400033978, 'learning_rate': 5.53870829664456e-06, 'epoch': 0.48} 48%|████▊ | 16511/34278 [18:15:51<17:42:40, 3.59s/it] 48%|████▊ | 16512/34278 [18:15:55<17:22:26, 3.52s/it] {'loss': 0.1513, 'grad_norm': 0.7945330640920626, 'learning_rate': 5.538238610158747e-06, 'epoch': 0.48} 48%|████▊ | 16512/34278 [18:15:55<17:22:26, 3.52s/it] 48%|████▊ | 16513/34278 [18:15:59<17:48:09, 3.61s/it] {'loss': 0.125, 'grad_norm': 0.8547319196600227, 'learning_rate': 5.537768918867672e-06, 'epoch': 0.48} 48%|████▊ | 16513/34278 [18:15:59<17:48:09, 3.61s/it] 48%|████▊ | 16514/34278 [18:16:01<16:50:07, 3.41s/it] {'loss': 0.1372, 'grad_norm': 0.9733174618029062, 'learning_rate': 5.537299222775522e-06, 'epoch': 0.48} 48%|████▊ | 16514/34278 [18:16:01<16:50:07, 
3.41s/it] 48%|████▊ | 16515/34278 [18:16:05<17:17:44, 3.51s/it] {'loss': 0.1038, 'grad_norm': 0.5822618893424835, 'learning_rate': 5.536829521886493e-06, 'epoch': 0.48} 48%|████▊ | 16515/34278 [18:16:05<17:17:44, 3.51s/it] 48%|████▊ | 16516/34278 [18:16:10<18:53:49, 3.83s/it] {'loss': 0.1315, 'grad_norm': 0.8427107813694704, 'learning_rate': 5.536359816204779e-06, 'epoch': 0.48} 48%|████▊ | 16516/34278 [18:16:10<18:53:49, 3.83s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
48%|████▊ | 16517/34278 [18:16:13<18:03:43, 3.66s/it] {'loss': 0.1547, 'grad_norm': 0.995674395919126, 'learning_rate': 5.535890105734571e-06, 'epoch': 0.48} 48%|████▊ | 16517/34278 [18:16:13<18:03:43, 3.66s/it] 48%|████▊ | 16518/34278 [18:16:18<19:30:06, 3.95s/it] {'loss': 0.1496, 'grad_norm': 0.8359358884480976, 'learning_rate': 5.535420390480065e-06, 'epoch': 0.48} 48%|████▊ | 16518/34278 [18:16:18<19:30:06, 3.95s/it] 48%|████▊ | 16519/34278 [18:16:21<18:32:01, 3.76s/it] {'loss': 0.14, 'grad_norm': 0.794006514004653, 'learning_rate': 5.534950670445453e-06, 'epoch': 0.48} 48%|████▊ | 16519/34278 [18:16:21<18:32:01, 3.76s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8dcedff7e0>
Failed to fetch sample 3588838. Exception: cannot identify image file <_io.BytesIO object at 0x7f8dcedff7e0>
48%|████▊ | 16520/34278 [18:16:26<19:49:40, 4.02s/it] {'loss': 0.13, 'grad_norm': 0.9871956366899922, 'learning_rate': 5.53448094563493e-06, 'epoch': 0.48} 48%|████▊ | 16520/34278 [18:16:26<19:49:40, 4.02s/it] 48%|████▊ | 16521/34278 [18:16:29<18:30:51, 3.75s/it] {'loss': 0.1238, 'grad_norm': 0.9647479521573002, 'learning_rate': 5.534011216052688e-06, 'epoch': 0.48} 48%|████▊ | 16521/34278 [18:16:29<18:30:51, 3.75s/it] 48%|████▊ | 16522/34278 [18:16:35<21:58:49, 4.46s/it] {'loss': 0.1419, 'grad_norm': 0.7474789922402135, 'learning_rate': 5.533541481702922e-06, 'epoch': 0.48} 48%|████▊ | 16522/34278 [18:16:35<21:58:49, 4.46s/it] 48%|████▊ | 16523/34278 [18:16:41<24:34:11, 4.98s/it] {'loss': 0.1217, 'grad_norm': 0.890849777723087, 'learning_rate': 5.533071742589826e-06, 'epoch': 0.48} 48%|████▊ | 16523/34278 [18:16:41<24:34:11, 4.98s/it] 48%|████▊ | 16524/34278 [18:16:44<21:50:45, 4.43s/it] {'loss': 0.1435, 'grad_norm': 0.9112210921801089, 'learning_rate': 5.53260199871759e-06, 'epoch': 0.48} 48%|████▊ | 16524/34278 [18:16:44<21:50:45, 4.43s/it] 48%|████▊ | 16525/34278 [18:16:47<19:43:12, 4.00s/it] {'loss': 0.1357, 'grad_norm': 0.8242540047625232, 'learning_rate': 5.532132250090414e-06, 'epoch':
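The traceback above, followed by "Failed to fetch sample 3588838. Exception: ..." with training continuing at the next step, indicates that the dataset's `__getitem__` catches the `PIL.UnidentifiedImageError` raised on a corrupt image and falls back to another sample. A minimal sketch of that skip-and-resample pattern, with hypothetical names (the actual aguvis `dataset.py` is not reproduced here):

```python
# Sketch of a dataset __getitem__ that survives corrupt images by logging the
# failure and resampling, matching the "Failed to fetch sample ..." behaviour
# seen in the log. All names here are illustrative assumptions.
import io
import random

from PIL import Image, UnidentifiedImageError


class RobustImageDataset:
    def __init__(self, records):
        self.records = records  # each record holds raw image bytes + a label

    def _get_item(self, i):
        # Image.open raises UnidentifiedImageError on corrupt/truncated bytes.
        img = Image.open(io.BytesIO(self.records[i]["image_bytes"]))
        img.load()  # force decode now rather than lazily inside the collator
        return {"image": img, "label": self.records[i]["label"]}

    def __getitem__(self, i):
        for _ in range(10):  # bounded retries
            try:
                return self._get_item(i)
            except (UnidentifiedImageError, OSError) as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randrange(len(self.records))  # resample another index
        raise RuntimeError("too many consecutive corrupt samples")

    def __len__(self):
        return len(self.records)
```

Bounding the retries matters: without a cap, a shard of mostly-corrupt records would make the loader spin forever instead of failing loudly.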
0.48} 48%|████▊ | 16525/34278 [18:16:47<19:43:12, 4.00s/it] 48%|████▊ | 16526/34278 [18:16:52<21:10:09, 4.29s/it] {'loss': 0.148, 'grad_norm': 1.0408837781474771, 'learning_rate': 5.531662496712485e-06, 'epoch': 0.48} 48%|████▊ | 16526/34278 [18:16:52<21:10:09, 4.29s/it] 48%|████▊ | 16527/34278 [18:16:56<20:05:41, 4.08s/it] {'loss': 0.142, 'grad_norm': 0.808382138381165, 'learning_rate': 5.531192738588e-06, 'epoch': 0.48} 48%|████▊ | 16527/34278 [18:16:56<20:05:41, 4.08s/it] 48%|████▊ | 16528/34278 [18:17:00<20:22:46, 4.13s/it] {'loss': 0.1392, 'grad_norm': 0.985210178693923, 'learning_rate': 5.5307229757211565e-06, 'epoch': 0.48} 48%|████▊ | 16528/34278 [18:17:00<20:22:46, 4.13s/it] 48%|████▊ | 16529/34278 [18:17:04<19:40:41, 3.99s/it] {'loss': 0.125, 'grad_norm': 0.733025222941078, 'learning_rate': 5.530253208116143e-06, 'epoch': 0.48} 48%|████▊ | 16529/34278 [18:17:04<19:40:41, 3.99s/it] 48%|████▊ | 16530/34278 [18:17:07<18:44:30, 3.80s/it] {'loss': 0.1289, 'grad_norm': 1.0001245540675354, 'learning_rate': 5.529783435777155e-06, 'epoch': 0.48} 48%|████▊ | 16530/34278 [18:17:07<18:44:30, 3.80s/it] 48%|████▊ | 16531/34278 [18:17:10<17:40:15, 3.58s/it] {'loss': 0.1572, 'grad_norm': 0.7763369947500759, 'learning_rate': 5.529313658708387e-06, 'epoch': 0.48} 48%|████▊ | 16531/34278 [18:17:10<17:40:15, 3.58s/it] 48%|████▊ | 16532/34278 [18:17:13<16:20:25, 3.31s/it] {'loss': 0.1258, 'grad_norm': 0.7732278672191945, 'learning_rate': 5.528843876914034e-06, 'epoch': 0.48} 48%|████▊ | 16532/34278 [18:17:13<16:20:25, 3.31s/it] 48%|████▊ | 16533/34278 [18:17:17<17:03:32, 3.46s/it] {'loss': 0.121, 'grad_norm': 0.7179032294173053, 'learning_rate': 5.5283740903982886e-06, 'epoch': 0.48} 48%|████▊ | 16533/34278 [18:17:17<17:03:32, 3.46s/it] 48%|████▊ | 16534/34278 [18:17:22<19:39:04, 3.99s/it] {'loss': 0.141, 'grad_norm': 0.8505302722884381, 'learning_rate': 5.5279042991653456e-06, 'epoch': 0.48} 48%|████▊ | 16534/34278 [18:17:22<19:39:04, 3.99s/it] 48%|████▊ | 16535/34278 
[18:17:25<18:46:14, 3.81s/it] {'loss': 0.1379, 'grad_norm': 0.8750178937222148, 'learning_rate': 5.527434503219398e-06, 'epoch': 0.48} 48%|████▊ | 16535/34278 [18:17:25<18:46:14, 3.81s/it] 48%|████▊ | 16536/34278 [18:17:28<17:35:02, 3.57s/it] {'loss': 0.1319, 'grad_norm': 0.7382532797087719, 'learning_rate': 5.526964702564642e-06, 'epoch': 0.48} 48%|████▊ | 16536/34278 [18:17:28<17:35:02, 3.57s/it] 48%|████▊ | 16537/34278 [18:17:31<16:50:46, 3.42s/it] {'loss': 0.1567, 'grad_norm': 0.8259270765480796, 'learning_rate': 5.52649489720527e-06, 'epoch': 0.48} 48%|████▊ | 16537/34278 [18:17:31<16:50:46, 3.42s/it] 48%|████▊ | 16538/34278 [18:17:35<16:45:34, 3.40s/it] {'loss': 0.1617, 'grad_norm': 0.7733304721672678, 'learning_rate': 5.526025087145479e-06, 'epoch': 0.48} 48%|████▊ | 16538/34278 [18:17:35<16:45:34, 3.40s/it] 48%|████▊ | 16539/34278 [18:17:38<17:03:01, 3.46s/it] {'loss': 0.1351, 'grad_norm': 0.7787072372182898, 'learning_rate': 5.52555527238946e-06, 'epoch': 0.48} 48%|████▊ | 16539/34278 [18:17:38<17:03:01, 3.46s/it] 48%|████▊ | 16540/34278 [18:17:44<20:46:15, 4.22s/it] {'loss': 0.1699, 'grad_norm': 0.7745084992915291, 'learning_rate': 5.525085452941411e-06, 'epoch': 0.48} 48%|████▊ | 16540/34278 [18:17:44<20:46:15, 4.22s/it] 48%|████▊ | 16541/34278 [18:17:48<19:31:13, 3.96s/it] {'loss': 0.1418, 'grad_norm': 0.8384407509541995, 'learning_rate': 5.524615628805523e-06, 'epoch': 0.48} 48%|████▊ | 16541/34278 [18:17:48<19:31:13, 3.96s/it] 48%|████▊ | 16542/34278 [18:17:52<20:28:28, 4.16s/it] {'loss': 0.1348, 'grad_norm': 0.8909645378113125, 'learning_rate': 5.52414579998599e-06, 'epoch': 0.48} 48%|████▊ | 16542/34278 [18:17:52<20:28:28, 4.16s/it] 48%|████▊ | 16543/34278 [18:17:59<23:51:29, 4.84s/it] {'loss': 0.1429, 'grad_norm': 0.7095590406783882, 'learning_rate': 5.523675966487012e-06, 'epoch': 0.48} 48%|████▊ | 16543/34278 [18:17:59<23:51:29, 4.84s/it] 48%|████▊ | 16544/34278 [18:18:02<21:57:41, 4.46s/it] {'loss': 0.1515, 'grad_norm': 0.9085360330182193, 
'learning_rate': 5.523206128312778e-06, 'epoch': 0.48} 48%|████▊ | 16544/34278 [18:18:02<21:57:41, 4.46s/it] 48%|████▊ | 16545/34278 [18:18:05<20:01:37, 4.07s/it] {'loss': 0.1489, 'grad_norm': 1.1421048007952443, 'learning_rate': 5.522736285467485e-06, 'epoch': 0.48} 48%|████▊ | 16545/34278 [18:18:05<20:01:37, 4.07s/it] 48%|████▊ | 16546/34278 [18:18:08<18:31:14, 3.76s/it] {'loss': 0.1619, 'grad_norm': 1.1052497560282095, 'learning_rate': 5.522266437955327e-06, 'epoch': 0.48} 48%|████▊ | 16546/34278 [18:18:08<18:31:14, 3.76s/it] 48%|████▊ | 16547/34278 [18:18:13<19:15:41, 3.91s/it] {'loss': 0.1381, 'grad_norm': 0.9155191091801037, 'learning_rate': 5.5217965857804985e-06, 'epoch': 0.48} 48%|████▊ | 16547/34278 [18:18:13<19:15:41, 3.91s/it] 48%|████▊ | 16548/34278 [18:18:16<17:54:52, 3.64s/it] {'loss': 0.1277, 'grad_norm': 0.8365915686320591, 'learning_rate': 5.521326728947195e-06, 'epoch': 0.48} 48%|████▊ | 16548/34278 [18:18:16<17:54:52, 3.64s/it] 48%|████▊ | 16549/34278 [18:18:19<16:57:55, 3.44s/it] {'loss': 0.1422, 'grad_norm': 0.7858846985914822, 'learning_rate': 5.520856867459612e-06, 'epoch': 0.48} 48%|████▊ | 16549/34278 [18:18:19<16:57:55, 3.44s/it] 48%|████▊ | 16550/34278 [18:18:21<15:51:46, 3.22s/it] {'loss': 0.1592, 'grad_norm': 1.3055734495893565, 'learning_rate': 5.520387001321941e-06, 'epoch': 0.48} 48%|████▊ | 16550/34278 [18:18:21<15:51:46, 3.22s/it] 48%|████▊ | 16551/34278 [18:18:25<16:45:33, 3.40s/it] {'loss': 0.1275, 'grad_norm': 1.008156375466698, 'learning_rate': 5.519917130538381e-06, 'epoch': 0.48} 48%|████▊ | 16551/34278 [18:18:25<16:45:33, 3.40s/it] 48%|████▊ | 16552/34278 [18:18:28<15:55:58, 3.24s/it] {'loss': 0.1171, 'grad_norm': 0.77627103000323, 'learning_rate': 5.519447255113124e-06, 'epoch': 0.48} 48%|████▊ | 16552/34278 [18:18:28<15:55:58, 3.24s/it] 48%|████▊ | 16553/34278 [18:18:33<18:48:48, 3.82s/it] {'loss': 0.143, 'grad_norm': 1.060123575100618, 'learning_rate': 5.518977375050369e-06, 'epoch': 0.48} 48%|████▊ | 16553/34278 
[18:18:33<18:48:48, 3.82s/it] 48%|████▊ | 16554/34278 [18:18:37<18:58:11, 3.85s/it] {'loss': 0.125, 'grad_norm': 1.021153263148013, 'learning_rate': 5.518507490354303e-06, 'epoch': 0.48} 48%|████▊ | 16554/34278 [18:18:37<18:58:11, 3.85s/it] 48%|████▊ | 16555/34278 [18:18:41<18:36:00, 3.78s/it] {'loss': 0.1284, 'grad_norm': 0.882098373933168, 'learning_rate': 5.518037601029129e-06, 'epoch': 0.48} 48%|████▊ | 16555/34278 [18:18:41<18:36:00, 3.78s/it] 48%|████▊ | 16556/34278 [18:18:44<17:32:17, 3.56s/it] {'loss': 0.119, 'grad_norm': 0.7901226421111385, 'learning_rate': 5.517567707079038e-06, 'epoch': 0.48} 48%|████▊ | 16556/34278 [18:18:44<17:32:17, 3.56s/it] 48%|████▊ | 16557/34278 [18:18:47<16:37:14, 3.38s/it] {'loss': 0.1302, 'grad_norm': 0.920247587865245, 'learning_rate': 5.517097808508225e-06, 'epoch': 0.48} 48%|████▊ | 16557/34278 [18:18:47<16:37:14, 3.38s/it] 48%|████▊ | 16558/34278 [18:18:53<20:19:11, 4.13s/it] {'loss': 0.1553, 'grad_norm': 1.1793749720336413, 'learning_rate': 5.516627905320888e-06, 'epoch': 0.48} 48%|████▊ | 16558/34278 [18:18:53<20:19:11, 4.13s/it] 48%|████▊ | 16559/34278 [18:18:56<18:34:54, 3.78s/it] {'loss': 0.1401, 'grad_norm': 1.0796224648889627, 'learning_rate': 5.51615799752122e-06, 'epoch': 0.48} 48%|████▊ | 16559/34278 [18:18:56<18:34:54, 3.78s/it] 48%|████▊ | 16560/34278 [18:18:58<16:52:33, 3.43s/it] {'loss': 0.1337, 'grad_norm': 1.5646483378237794, 'learning_rate': 5.515688085113416e-06, 'epoch': 0.48} 48%|████▊ | 16560/34278 [18:18:58<16:52:33, 3.43s/it] 48%|████▊ | 16561/34278 [18:19:01<16:16:20, 3.31s/it] {'loss': 0.1381, 'grad_norm': 0.9313085361047224, 'learning_rate': 5.515218168101673e-06, 'epoch': 0.48} 48%|████▊ | 16561/34278 [18:19:01<16:16:20, 3.31s/it] 48%|████▊ | 16562/34278 [18:19:05<17:12:52, 3.50s/it] {'loss': 0.134, 'grad_norm': 0.9453004475932669, 'learning_rate': 5.514748246490184e-06, 'epoch': 0.48} 48%|████▊ | 16562/34278 [18:19:05<17:12:52, 3.50s/it] 48%|████▊ | 16563/34278 [18:19:08<16:35:50, 3.37s/it] 
{'loss': 0.1097, 'grad_norm': 0.9656647349989894, 'learning_rate': 5.514278320283145e-06, 'epoch': 0.48} 48%|████▊ | 16563/34278 [18:19:08<16:35:50, 3.37s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9720 > 8192). Running this sequence through the model will result in indexing errors 48%|████▊ | 16564/34278 [18:19:11<15:55:43, 3.24s/it] {'loss': 0.1214, 'grad_norm': 0.8275094301706116, 'learning_rate': 5.513808389484754e-06, 'epoch': 0.48} 48%|████▊ | 16564/34278 [18:19:11<15:55:43, 3.24s/it] 48%|████▊ | 16565/34278 [18:19:15<16:06:18, 3.27s/it] {'loss': 0.1673, 'grad_norm': 0.9536694338675364, 'learning_rate': 5.513338454099203e-06, 'epoch': 0.48} 48%|████▊ | 16565/34278 [18:19:15<16:06:18, 3.27s/it] 48%|████▊ | 16566/34278 [18:19:17<15:32:19, 3.16s/it] {'loss': 0.1681, 'grad_norm': 1.42692864189217, 'learning_rate': 5.512868514130688e-06, 'epoch': 0.48} 48%|████▊ | 16566/34278 [18:19:17<15:32:19, 3.16s/it] 48%|████▊ | 16567/34278 [18:19:21<15:48:13, 3.21s/it] {'loss': 0.1436, 'grad_norm': 0.9921845063117846, 'learning_rate': 5.512398569583407e-06, 'epoch': 0.48} 48%|████▊ | 16567/34278 [18:19:21<15:48:13, 3.21s/it] 48%|████▊ | 16568/34278 [18:19:26<18:44:51, 3.81s/it] {'loss': 0.1104, 'grad_norm': 1.0185852627053067, 'learning_rate': 5.511928620461554e-06, 'epoch': 0.48} 48%|████▊ | 16568/34278 [18:19:26<18:44:51, 3.81s/it] 48%|████▊ | 16569/34278 [18:19:29<17:46:47, 3.61s/it] {'loss': 0.1224, 'grad_norm': 0.5818103833703272, 'learning_rate': 5.511458666769323e-06, 'epoch': 0.48} 48%|████▊ | 16569/34278 [18:19:29<17:46:47, 3.61s/it] 48%|████▊ | 16570/34278 [18:19:32<16:48:19, 3.42s/it] {'loss': 0.1225, 'grad_norm': 0.9644499244791919, 'learning_rate': 5.510988708510913e-06, 'epoch': 0.48} 48%|████▊ | 16570/34278 [18:19:32<16:48:19, 3.42s/it] 48%|████▊ | 16571/34278 [18:19:37<19:11:56, 3.90s/it] {'loss': 0.1303, 'grad_norm': 1.0213132183185323, 'learning_rate': 5.510518745690516e-06, 'epoch': 0.48} 48%|████▊ 
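The tokenizer warning above (`9720 > 8192`) means one tokenized conversation exceeded the model's maximum sequence length; as the message says, feeding such a sequence through the model can produce indexing errors unless it is truncated or dropped first. A minimal truncation sketch, assuming plain `input_ids`/`labels` lists (this helper is illustrative, not the actual preprocessing code):

```python
def truncate_to_max_len(input_ids, labels, max_len=8192):
    """Clip a tokenized sample to the model's maximum sequence length.

    Keeps the leading tokens and drops the tail, so input_ids and labels
    stay aligned position by position. A real pipeline might instead drop
    oversized samples entirely; this just illustrates the length check.
    """
    if len(input_ids) <= max_len:
        return input_ids, labels
    return input_ids[:max_len], labels[:max_len]
```

Applying this (or an equivalent filter) before collation would silence the warning and avoid out-of-range position indices.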
| 16571/34278 [18:19:37<19:11:56, 3.90s/it] 48%|████▊ | 16572/34278 [18:19:40<18:08:46, 3.69s/it] {'loss': 0.1279, 'grad_norm': 0.9379995454353172, 'learning_rate': 5.510048778312329e-06, 'epoch': 0.48} 48%|████▊ | 16572/34278 [18:19:40<18:08:46, 3.69s/it] 48%|████▊ | 16573/34278 [18:19:43<17:03:21, 3.47s/it] {'loss': 0.1366, 'grad_norm': 0.790624273438235, 'learning_rate': 5.509578806380551e-06, 'epoch': 0.48} 48%|████▊ | 16573/34278 [18:19:43<17:03:21, 3.47s/it] 48%|████▊ | 16574/34278 [18:19:46<16:24:50, 3.34s/it] {'loss': 0.149, 'grad_norm': 1.0198715632730508, 'learning_rate': 5.509108829899374e-06, 'epoch': 0.48} 48%|████▊ | 16574/34278 [18:19:46<16:24:50, 3.34s/it] 48%|████▊ | 16575/34278 [18:19:49<15:57:26, 3.24s/it] {'loss': 0.1399, 'grad_norm': 0.8444673125265063, 'learning_rate': 5.508638848872993e-06, 'epoch': 0.48} 48%|████▊ | 16575/34278 [18:19:49<15:57:26, 3.24s/it] 48%|████▊ | 16576/34278 [18:19:53<17:13:51, 3.50s/it] {'loss': 0.157, 'grad_norm': 0.9265335583361651, 'learning_rate': 5.508168863305607e-06, 'epoch': 0.48} 48%|████▊ | 16576/34278 [18:19:53<17:13:51, 3.50s/it] 48%|████▊ | 16577/34278 [18:19:57<17:18:52, 3.52s/it] {'loss': 0.143, 'grad_norm': 0.8774750551063015, 'learning_rate': 5.507698873201411e-06, 'epoch': 0.48} 48%|████▊ | 16577/34278 [18:19:57<17:18:52, 3.52s/it] 48%|████▊ | 16578/34278 [18:20:00<16:54:53, 3.44s/it] {'loss': 0.1346, 'grad_norm': 0.756139460175104, 'learning_rate': 5.507228878564601e-06, 'epoch': 0.48} 48%|████▊ | 16578/34278 [18:20:00<16:54:53, 3.44s/it] 48%|████▊ | 16579/34278 [18:20:05<18:32:50, 3.77s/it] {'loss': 0.1398, 'grad_norm': 0.7341509530257865, 'learning_rate': 5.506758879399372e-06, 'epoch': 0.48} 48%|████▊ | 16579/34278 [18:20:05<18:32:50, 3.77s/it] 48%|████▊ | 16580/34278 [18:20:09<19:30:49, 3.97s/it] {'loss': 0.1311, 'grad_norm': 0.7510997357780191, 'learning_rate': 5.506288875709921e-06, 'epoch': 0.48} 48%|████▊ | 16580/34278 [18:20:09<19:30:49, 3.97s/it] 48%|████▊ | 16581/34278 [18:20:12<18:08:05, 
3.69s/it] {'loss': 0.1388, 'grad_norm': 0.888161077874037, 'learning_rate': 5.505818867500443e-06, 'epoch': 0.48} 48%|████▊ | 16581/34278 [18:20:12<18:08:05, 3.69s/it] 48%|████▊ | 16582/34278 [18:20:15<16:43:26, 3.40s/it] {'loss': 0.1351, 'grad_norm': 0.8020689619674765, 'learning_rate': 5.505348854775135e-06, 'epoch': 0.48} 48%|████▊ | 16582/34278 [18:20:15<16:43:26, 3.40s/it] 48%|████▊ | 16583/34278 [18:20:18<16:40:26, 3.39s/it] {'loss': 0.1375, 'grad_norm': 0.7977449131759667, 'learning_rate': 5.504878837538195e-06, 'epoch': 0.48} 48%|████▊ | 16583/34278 [18:20:18<16:40:26, 3.39s/it] 48%|████▊ | 16584/34278 [18:20:21<16:12:24, 3.30s/it] {'loss': 0.1439, 'grad_norm': 0.9988269198867377, 'learning_rate': 5.504408815793816e-06, 'epoch': 0.48} 48%|████▊ | 16584/34278 [18:20:21<16:12:24, 3.30s/it] 48%|████▊ | 16585/34278 [18:20:24<15:33:59, 3.17s/it] {'loss': 0.1632, 'grad_norm': 1.086336562718429, 'learning_rate': 5.5039387895461956e-06, 'epoch': 0.48} 48%|████▊ | 16585/34278 [18:20:24<15:33:59, 3.17s/it] 48%|████▊ | 16586/34278 [18:20:27<15:36:26, 3.18s/it] {'loss': 0.1324, 'grad_norm': 0.8145764289106557, 'learning_rate': 5.503468758799529e-06, 'epoch': 0.48} 48%|████▊ | 16586/34278 [18:20:27<15:36:26, 3.18s/it] 48%|████▊ | 16587/34278 [18:20:31<15:43:18, 3.20s/it] {'loss': 0.1448, 'grad_norm': 0.7863050471615618, 'learning_rate': 5.502998723558014e-06, 'epoch': 0.48} 48%|████▊ | 16587/34278 [18:20:31<15:43:18, 3.20s/it] 48%|████▊ | 16588/34278 [18:20:34<15:52:07, 3.23s/it] {'loss': 0.1493, 'grad_norm': 0.7478906050284334, 'learning_rate': 5.502528683825847e-06, 'epoch': 0.48} 48%|████▊ | 16588/34278 [18:20:34<15:52:07, 3.23s/it] 48%|████▊ | 16589/34278 [18:20:40<19:27:35, 3.96s/it] {'loss': 0.1205, 'grad_norm': 0.7873330482565611, 'learning_rate': 5.502058639607224e-06, 'epoch': 0.48} 48%|████▊ | 16589/34278 [18:20:40<19:27:35, 3.96s/it] 48%|████▊ | 16590/34278 [18:20:44<19:26:01, 3.96s/it] {'loss': 0.1182, 'grad_norm': 0.8620669116617288, 'learning_rate': 
5.501588590906342e-06, 'epoch': 0.48} 48%|████▊ | 16590/34278 [18:20:44<19:26:01, 3.96s/it] 48%|████▊ | 16591/34278 [18:20:47<18:14:39, 3.71s/it] {'loss': 0.1367, 'grad_norm': 0.6864588399225512, 'learning_rate': 5.501118537727394e-06, 'epoch': 0.48} 48%|████▊ | 16591/34278 [18:20:47<18:14:39, 3.71s/it] 48%|████▊ | 16592/34278 [18:20:50<16:53:16, 3.44s/it] {'loss': 0.135, 'grad_norm': 0.8999941015068548, 'learning_rate': 5.500648480074582e-06, 'epoch': 0.48} 48%|████▊ | 16592/34278 [18:20:50<16:53:16, 3.44s/it] 48%|████▊ | 16593/34278 [18:20:53<16:48:17, 3.42s/it] {'loss': 0.1207, 'grad_norm': 0.7647912021712191, 'learning_rate': 5.500178417952099e-06, 'epoch': 0.48} 48%|████▊ | 16593/34278 [18:20:53<16:48:17, 3.42s/it] 48%|████▊ | 16594/34278 [18:20:57<17:00:39, 3.46s/it] {'loss': 0.1476, 'grad_norm': 0.7569057693438344, 'learning_rate': 5.499708351364142e-06, 'epoch': 0.48} 48%|████▊ | 16594/34278 [18:20:57<17:00:39, 3.46s/it] 48%|████▊ | 16595/34278 [18:21:00<16:58:20, 3.46s/it] {'loss': 0.1325, 'grad_norm': 0.7492787375397334, 'learning_rate': 5.499238280314909e-06, 'epoch': 0.48} 48%|████▊ | 16595/34278 [18:21:00<16:58:20, 3.46s/it] 48%|████▊ | 16596/34278 [18:21:04<17:09:18, 3.49s/it] {'loss': 0.1384, 'grad_norm': 0.8361052466336646, 'learning_rate': 5.4987682048085955e-06, 'epoch': 0.48} 48%|████▊ | 16596/34278 [18:21:04<17:09:18, 3.49s/it] 48%|████▊ | 16597/34278 [18:21:09<20:04:44, 4.09s/it] {'loss': 0.1612, 'grad_norm': 0.8739257125006146, 'learning_rate': 5.498298124849399e-06, 'epoch': 0.48} 48%|████▊ | 16597/34278 [18:21:09<20:04:44, 4.09s/it] 48%|████▊ | 16598/34278 [18:21:12<18:47:52, 3.83s/it] {'loss': 0.1257, 'grad_norm': 0.7515913562564204, 'learning_rate': 5.497828040441515e-06, 'epoch': 0.48} 48%|████▊ | 16598/34278 [18:21:12<18:47:52, 3.83s/it] 48%|████▊ | 16599/34278 [18:21:16<19:01:16, 3.87s/it] {'loss': 0.1335, 'grad_norm': 0.7851826210252971, 'learning_rate': 5.497357951589141e-06, 'epoch': 0.48} 48%|████▊ | 16599/34278 [18:21:16<19:01:16, 
3.87s/it] 48%|████▊ | 16600/34278 [18:21:19<17:51:30, 3.64s/it] {'loss': 0.1279, 'grad_norm': 0.7789635299586463, 'learning_rate': 5.496887858296475e-06, 'epoch': 0.48} 48%|████▊ | 16600/34278 [18:21:19<17:51:30, 3.64s/it] 48%|████▊ | 16601/34278 [18:21:25<21:30:32, 4.38s/it] {'loss': 0.135, 'grad_norm': 0.9577817455507989, 'learning_rate': 5.496417760567712e-06, 'epoch': 0.48} 48%|████▊ | 16601/34278 [18:21:25<21:30:32, 4.38s/it] 48%|████▊ | 16602/34278 [18:21:28<19:07:18, 3.89s/it] {'loss': 0.1557, 'grad_norm': 0.8623397308921672, 'learning_rate': 5.4959476584070485e-06, 'epoch': 0.48} 48%|████▊ | 16602/34278 [18:21:28<19:07:18, 3.89s/it] 48%|████▊ | 16603/34278 [18:21:31<18:03:31, 3.68s/it] {'loss': 0.155, 'grad_norm': 0.7733125935298913, 'learning_rate': 5.495477551818685e-06, 'epoch': 0.48} 48%|████▊ | 16603/34278 [18:21:31<18:03:31, 3.68s/it] 48%|████▊ | 16604/34278 [18:21:35<17:23:46, 3.54s/it] {'loss': 0.1325, 'grad_norm': 0.8175778594508675, 'learning_rate': 5.495007440806816e-06, 'epoch': 0.48} 48%|████▊ | 16604/34278 [18:21:35<17:23:46, 3.54s/it] 48%|████▊ | 16605/34278 [18:21:38<17:04:33, 3.48s/it] {'loss': 0.1307, 'grad_norm': 0.6646761772274842, 'learning_rate': 5.494537325375637e-06, 'epoch': 0.48} 48%|████▊ | 16605/34278 [18:21:38<17:04:33, 3.48s/it] 48%|████▊ | 16606/34278 [18:21:42<17:21:18, 3.54s/it] {'loss': 0.1322, 'grad_norm': 1.02001488818844, 'learning_rate': 5.494067205529347e-06, 'epoch': 0.48} 48%|████▊ | 16606/34278 [18:21:42<17:21:18, 3.54s/it] 48%|████▊ | 16607/34278 [18:21:46<18:31:30, 3.77s/it] {'loss': 0.1298, 'grad_norm': 0.7459854556711544, 'learning_rate': 5.493597081272144e-06, 'epoch': 0.48} 48%|████▊ | 16607/34278 [18:21:46<18:31:30, 3.77s/it] 48%|████▊ | 16608/34278 [18:21:49<17:38:29, 3.59s/it] {'loss': 0.1271, 'grad_norm': 0.8182527768869449, 'learning_rate': 5.493126952608224e-06, 'epoch': 0.48} 48%|████▊ | 16608/34278 [18:21:49<17:38:29, 3.59s/it] 48%|████▊ | 16609/34278 [18:21:52<17:09:58, 3.50s/it] {'loss': 0.1205, 
'grad_norm': 0.8523103671767203, 'learning_rate': 5.4926568195417836e-06, 'epoch': 0.48} 48%|████▊ | 16609/34278 [18:21:52<17:09:58, 3.50s/it] 48%|████▊ | 16610/34278 [18:21:55<16:22:24, 3.34s/it] {'loss': 0.1747, 'grad_norm': 0.9129672467024895, 'learning_rate': 5.492186682077021e-06, 'epoch': 0.48} 48%|████▊ | 16610/34278 [18:21:55<16:22:24, 3.34s/it] 48%|████▊ | 16611/34278 [18:21:59<16:29:14, 3.36s/it] {'loss': 0.1465, 'grad_norm': 1.0324779786710794, 'learning_rate': 5.491716540218134e-06, 'epoch': 0.48} 48%|████▊ | 16611/34278 [18:21:59<16:29:14, 3.36s/it] 48%|████▊ | 16612/34278 [18:22:02<16:49:22, 3.43s/it] {'loss': 0.1309, 'grad_norm': 0.8173512900407819, 'learning_rate': 5.491246393969318e-06, 'epoch': 0.48} 48%|████▊ | 16612/34278 [18:22:02<16:49:22, 3.43s/it] 48%|████▊ | 16613/34278 [18:22:08<20:40:30, 4.21s/it] {'loss': 0.1291, 'grad_norm': 0.9675424431363238, 'learning_rate': 5.490776243334773e-06, 'epoch': 0.48} 48%|████▊ | 16613/34278 [18:22:08<20:40:30, 4.21s/it] 48%|████▊ | 16614/34278 [18:22:11<18:48:01, 3.83s/it] {'loss': 0.1301, 'grad_norm': 0.7566896229893554, 'learning_rate': 5.4903060883186934e-06, 'epoch': 0.48} 48%|████▊ | 16614/34278 [18:22:11<18:48:01, 3.83s/it] 48%|████▊ | 16615/34278 [18:22:16<19:48:29, 4.04s/it] {'loss': 0.1369, 'grad_norm': 0.7624964269774012, 'learning_rate': 5.489835928925279e-06, 'epoch': 0.48} 48%|████▊ | 16615/34278 [18:22:16<19:48:29, 4.04s/it] 48%|████▊ | 16616/34278 [18:22:19<18:12:19, 3.71s/it] {'loss': 0.1303, 'grad_norm': 0.7919410215877583, 'learning_rate': 5.489365765158726e-06, 'epoch': 0.48} 48%|████▊ | 16616/34278 [18:22:19<18:12:19, 3.71s/it] 48%|████▊ | 16617/34278 [18:22:22<17:39:18, 3.60s/it] {'loss': 0.142, 'grad_norm': 0.8996018718186514, 'learning_rate': 5.488895597023231e-06, 'epoch': 0.48} 48%|████▊ | 16617/34278 [18:22:22<17:39:18, 3.60s/it] 48%|████▊ | 16618/34278 [18:22:28<21:20:25, 4.35s/it] {'loss': 0.1381, 'grad_norm': 0.9470661118490609, 'learning_rate': 5.488425424522995e-06, 'epoch': 
0.48} 48%|████▊ | 16618/34278 [18:22:28<21:20:25, 4.35s/it] 48%|████▊ | 16619/34278 [18:22:34<23:45:40, 4.84s/it] {'loss': 0.1495, 'grad_norm': 0.8278474374546028, 'learning_rate': 5.487955247662212e-06, 'epoch': 0.48} 48%|████▊ | 16619/34278 [18:22:34<23:45:40, 4.84s/it] 48%|████▊ | 16620/34278 [18:22:37<20:41:02, 4.22s/it] {'loss': 0.1394, 'grad_norm': 0.8078013253587139, 'learning_rate': 5.487485066445082e-06, 'epoch': 0.48} 48%|████▊ | 16620/34278 [18:22:37<20:41:02, 4.22s/it] 48%|████▊ | 16621/34278 [18:22:40<18:52:47, 3.85s/it] {'loss': 0.1268, 'grad_norm': 0.8290149226693893, 'learning_rate': 5.487014880875801e-06, 'epoch': 0.48} 48%|████▊ | 16621/34278 [18:22:40<18:52:47, 3.85s/it] 48%|████▊ | 16622/34278 [18:22:44<18:43:45, 3.82s/it] {'loss': 0.1264, 'grad_norm': 0.8751408919109652, 'learning_rate': 5.486544690958566e-06, 'epoch': 0.48} 48%|████▊ | 16622/34278 [18:22:44<18:43:45, 3.82s/it] 48%|████▊ | 16623/34278 [18:22:50<21:57:37, 4.48s/it] {'loss': 0.1343, 'grad_norm': 0.6911520271026074, 'learning_rate': 5.486074496697579e-06, 'epoch': 0.48} 48%|████▊ | 16623/34278 [18:22:50<21:57:37, 4.48s/it] 48%|████▊ | 16624/34278 [18:22:53<20:11:44, 4.12s/it] {'loss': 0.1537, 'grad_norm': 1.1207690834165749, 'learning_rate': 5.4856042980970325e-06, 'epoch': 0.48} 48%|████▊ | 16624/34278 [18:22:53<20:11:44, 4.12s/it] 49%|████▊ | 16625/34278 [18:22:57<19:59:26, 4.08s/it] {'loss': 0.1228, 'grad_norm': 0.7882957005018483, 'learning_rate': 5.485134095161128e-06, 'epoch': 0.49} 49%|████▊ | 16625/34278 [18:22:57<19:59:26, 4.08s/it] 49%|████▊ | 16626/34278 [18:23:01<20:05:19, 4.10s/it] {'loss': 0.1512, 'grad_norm': 0.7985098025453309, 'learning_rate': 5.484663887894062e-06, 'epoch': 0.49} 49%|████▊ | 16626/34278 [18:23:01<20:05:19, 4.10s/it] 49%|████▊ | 16627/34278 [18:23:04<18:16:42, 3.73s/it] {'loss': 0.1443, 'grad_norm': 0.9279442523753277, 'learning_rate': 5.484193676300033e-06, 'epoch': 0.49} 49%|████▊ | 16627/34278 [18:23:04<18:16:42, 3.73s/it] 49%|████▊ | 
16628/34278 [18:23:07<16:59:24, 3.47s/it] {'loss': 0.1182, 'grad_norm': 0.9967141948785709, 'learning_rate': 5.483723460383238e-06, 'epoch': 0.49} 49%|████▊ | 16628/34278 [18:23:07<16:59:24, 3.47s/it] 49%|████▊ | 16629/34278 [18:23:10<16:49:05, 3.43s/it] {'loss': 0.1435, 'grad_norm': 0.882580770098697, 'learning_rate': 5.4832532401478745e-06, 'epoch': 0.49} 49%|████▊ | 16629/34278 [18:23:10<16:49:05, 3.43s/it] 49%|████▊ | 16630/34278 [18:23:13<16:13:41, 3.31s/it] {'loss': 0.1263, 'grad_norm': 1.3703099992895844, 'learning_rate': 5.4827830155981435e-06, 'epoch': 0.49} 49%|████▊ | 16630/34278 [18:23:13<16:13:41, 3.31s/it] 49%|████▊ | 16631/34278 [18:23:18<17:53:26, 3.65s/it] {'loss': 0.1361, 'grad_norm': 0.8743530719152944, 'learning_rate': 5.48231278673824e-06, 'epoch': 0.49} 49%|████▊ | 16631/34278 [18:23:18<17:53:26, 3.65s/it] 49%|████▊ | 16632/34278 [18:23:21<17:03:50, 3.48s/it] {'loss': 0.1203, 'grad_norm': 1.0134325947634437, 'learning_rate': 5.481842553572361e-06, 'epoch': 0.49} 49%|████▊ | 16632/34278 [18:23:21<17:03:50, 3.48s/it] 49%|████▊ | 16633/34278 [18:23:25<18:50:56, 3.85s/it] {'loss': 0.135, 'grad_norm': 1.3346784215743988, 'learning_rate': 5.481372316104709e-06, 'epoch': 0.49} 49%|████▊ | 16633/34278 [18:23:25<18:50:56, 3.85s/it] 49%|████▊ | 16634/34278 [18:23:31<20:54:02, 4.26s/it] {'loss': 0.1536, 'grad_norm': 0.8632620273812637, 'learning_rate': 5.480902074339481e-06, 'epoch': 0.49} 49%|████▊ | 16634/34278 [18:23:31<20:54:02, 4.26s/it] 49%|████▊ | 16635/34278 [18:23:34<19:40:22, 4.01s/it] {'loss': 0.1513, 'grad_norm': 0.7689613898443438, 'learning_rate': 5.480431828280871e-06, 'epoch': 0.49} 49%|████▊ | 16635/34278 [18:23:34<19:40:22, 4.01s/it] 49%|████▊ | 16636/34278 [18:23:37<18:41:11, 3.81s/it] {'loss': 0.1447, 'grad_norm': 1.019997277179937, 'learning_rate': 5.479961577933082e-06, 'epoch': 0.49} 49%|████▊ | 16636/34278 [18:23:37<18:41:11, 3.81s/it] 49%|████▊ | 16637/34278 [18:23:42<19:40:26, 4.01s/it] {'loss': 0.1328, 'grad_norm': 
0.7371436515523148, 'learning_rate': 5.47949132330031e-06, 'epoch': 0.49} 49%|████▊ | 16637/34278 [18:23:42<19:40:26, 4.01s/it] 49%|████▊ | 16638/34278 [18:23:45<18:43:03, 3.82s/it] {'loss': 0.1351, 'grad_norm': 0.7720598248913605, 'learning_rate': 5.479021064386755e-06, 'epoch': 0.49} 49%|████▊ | 16638/34278 [18:23:45<18:43:03, 3.82s/it] 49%|████▊ | 16639/34278 [18:23:51<22:00:34, 4.49s/it] {'loss': 0.1451, 'grad_norm': 0.9798615052228705, 'learning_rate': 5.4785508011966125e-06, 'epoch': 0.49} 49%|████▊ | 16639/34278 [18:23:51<22:00:34, 4.49s/it] 49%|████▊ | 16640/34278 [18:23:57<23:33:29, 4.81s/it] {'loss': 0.1551, 'grad_norm': 0.9712722685659123, 'learning_rate': 5.478080533734085e-06, 'epoch': 0.49} 49%|████▊ | 16640/34278 [18:23:57<23:33:29, 4.81s/it] 49%|████▊ | 16641/34278 [18:24:00<21:31:52, 4.39s/it] {'loss': 0.1502, 'grad_norm': 0.8415761999773763, 'learning_rate': 5.477610262003367e-06, 'epoch': 0.49} 49%|████▊ | 16641/34278 [18:24:00<21:31:52, 4.39s/it] 49%|████▊ | 16642/34278 [18:24:06<24:04:33, 4.91s/it] {'loss': 0.1296, 'grad_norm': 1.0311450776940383, 'learning_rate': 5.477139986008658e-06, 'epoch': 0.49} 49%|████▊ | 16642/34278 [18:24:06<24:04:33, 4.91s/it] 49%|████▊ | 16643/34278 [18:24:10<22:30:15, 4.59s/it] {'loss': 0.1248, 'grad_norm': 0.8779969680054039, 'learning_rate': 5.476669705754159e-06, 'epoch': 0.49} 49%|████▊ | 16643/34278 [18:24:10<22:30:15, 4.59s/it] 49%|████▊ | 16644/34278 [18:24:14<20:28:38, 4.18s/it] {'loss': 0.1212, 'grad_norm': 0.7249655699170686, 'learning_rate': 5.476199421244065e-06, 'epoch': 0.49} 49%|████▊ | 16644/34278 [18:24:14<20:28:38, 4.18s/it] 49%|████▊ | 16645/34278 [18:24:17<19:20:16, 3.95s/it] {'loss': 0.1283, 'grad_norm': 0.9359321465945644, 'learning_rate': 5.475729132482578e-06, 'epoch': 0.49} 49%|████▊ | 16645/34278 [18:24:17<19:20:16, 3.95s/it] 49%|████▊ | 16646/34278 [18:24:20<17:28:05, 3.57s/it] {'loss': 0.1279, 'grad_norm': 0.8862819070652191, 'learning_rate': 5.475258839473894e-06, 'epoch': 0.49} 
49%|████▊ | 16646/34278 [18:24:20<17:28:05, 3.57s/it] 49%|████▊ | 16647/34278 [18:24:23<16:43:42, 3.42s/it] {'loss': 0.1379, 'grad_norm': 0.7651430301032168, 'learning_rate': 5.474788542222211e-06, 'epoch': 0.49} 49%|████▊ | 16647/34278 [18:24:23<16:43:42, 3.42s/it] 49%|████▊ | 16648/34278 [18:24:29<21:05:57, 4.31s/it] {'loss': 0.1548, 'grad_norm': 0.7489622617004734, 'learning_rate': 5.474318240731732e-06, 'epoch': 0.49} 49%|████▊ | 16648/34278 [18:24:29<21:05:57, 4.31s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa3bda71990>
Failed to fetch sample 3291472. Exception: cannot identify image file <_io.BytesIO object at 0x7fa3bda71990>
49%|████▊ | 16649/34278 [18:24:34<22:12:29, 4.54s/it] {'loss': 0.1384, 'grad_norm': 0.7813106362728675, 'learning_rate': 5.473847935006652e-06, 'epoch': 0.49} 49%|████▊ | 16649/34278 [18:24:34<22:12:29, 4.54s/it] 49%|████▊ | 16650/34278 [18:24:40<23:34:34, 4.81s/it] {'loss': 0.1235, 'grad_norm': 0.6767682363045044, 'learning_rate': 5.4733776250511706e-06, 'epoch': 0.49} 49%|████▊ | 16650/34278 [18:24:40<23:34:34, 4.81s/it] 49%|████▊ | 16651/34278 [18:24:43<21:24:01, 4.37s/it] {'loss': 0.1212, 'grad_norm': 0.7700170536576478, 'learning_rate': 5.472907310869486e-06, 'epoch': 0.49} 49%|████▊ | 16651/34278 [18:24:43<21:24:01, 4.37s/it] 49%|████▊ | 16652/34278 [18:24:48<23:00:34, 4.70s/it] {'loss': 0.1288, 'grad_norm': 0.9489483314365391, 'learning_rate': 5.4724369924657985e-06, 'epoch': 0.49} 49%|████▊ | 16652/34278 [18:24:48<23:00:34, 4.70s/it] 49%|████▊ | 16653/34278 [18:24:53<22:57:56, 4.69s/it] {'loss': 0.1196, 'grad_norm': 0.7578751245786504, 'learning_rate': 5.471966669844307e-06, 'epoch': 0.49} 49%|████▊ | 16653/34278 [18:24:53<22:57:56, 4.69s/it] 49%|████▊ | 16654/34278 [18:24:56<20:29:29, 4.19s/it] {'loss': 0.1391, 'grad_norm': 0.974441354637069, 'learning_rate': 5.471496343009208e-06, 'epoch': 0.49} 49%|████▊ | 16654/34278 [18:24:56<20:29:29, 4.19s/it] 49%|████▊ | 16655/34278 [18:24:59<19:16:21, 3.94s/it] {'loss': 0.1233, 'grad_norm': 0.7060072836914836, 'learning_rate': 5.471026011964703e-06, 'epoch': 0.49} 49%|████▊ | 16655/34278 [18:24:59<19:16:21, 3.94s/it] 49%|████▊ | 16656/34278 [18:25:03<18:20:53, 3.75s/it] {'loss': 0.1241, 'grad_norm': 0.8272008883339874, 'learning_rate': 5.47055567671499e-06, 'epoch': 0.49} 49%|████▊ | 16656/34278 [18:25:03<18:20:53, 3.75s/it] 49%|████▊ | 16657/34278 [18:25:06<17:22:05, 3.55s/it] {'loss': 0.145, 'grad_norm': 0.8842349354146034, 'learning_rate': 5.470085337264268e-06, 'epoch': 0.49} 49%|████▊ | 16657/34278 [18:25:06<17:22:05, 3.55s/it] 
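The `UnidentifiedImageError` tracebacks in this run come from corrupt image bytes reaching `PIL.Image.open`; each time, the dataset prints `Failed to fetch sample … Exception: …` and training continues, so `__getitem__` evidently swallows the error and substitutes another sample. A minimal sketch of that skip-and-retry pattern; the wrapper class and its names are hypothetical, not taken from `dataset.py`:

```python
import random


class RetryDataset:
    """Wrap an indexable dataset so that a sample which fails to load
    (e.g. corrupt image bytes raising UnidentifiedImageError) is logged
    and replaced by a randomly drawn substitute, matching the
    'Failed to fetch sample' messages seen in the log.
    Illustrative sketch only."""

    def __init__(self, base, max_retries=10):
        self.base = base              # any object with __len__/__getitem__
        self.max_retries = max_retries

    def __len__(self):
        return len(self.base)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.base[i]
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randint(0, len(self.base) - 1)  # pick a substitute
        raise RuntimeError("too many consecutive unreadable samples")
```

Substituting a random index keeps the batch shape intact at the cost of a slight sampling bias; the alternative is to scrub unreadable images from the index before training starts.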
49%|████▊ | 16658/34278 [18:25:12<21:28:54, 4.39s/it] {'loss': 0.1552, 'grad_norm': 0.8066742307684761, 'learning_rate': 5.469614993616739e-06, 'epoch': 0.49}
49%|████▊ | 16659/34278 [18:25:16<21:18:56, 4.36s/it] {'loss': 0.1519, 'grad_norm': 0.886208217483384, 'learning_rate': 5.469144645776596e-06, 'epoch': 0.49}
49%|████▊ | 16660/34278 [18:25:20<20:26:51, 4.18s/it] {'loss': 0.1398, 'grad_norm': 0.8917265733195617, 'learning_rate': 5.468674293748044e-06, 'epoch': 0.49}
49%|████▊ | 16661/34278 [18:25:27<23:43:21, 4.85s/it] {'loss': 0.1188, 'grad_norm': 0.741515680860913, 'learning_rate': 5.468203937535278e-06, 'epoch': 0.49}
49%|████▊ | 16662/34278 [18:25:30<21:15:14, 4.34s/it] {'loss': 0.1556, 'grad_norm': 0.8224294073227981, 'learning_rate': 5.467733577142499e-06, 'epoch': 0.49}
49%|████▊ | 16663/34278 [18:25:33<19:02:56, 3.89s/it] {'loss': 0.1336, 'grad_norm': 0.7176844646669913, 'learning_rate': 5.467263212573908e-06, 'epoch': 0.49}
49%|████▊ | 16664/34278 [18:25:36<18:06:31, 3.70s/it] {'loss': 0.1443, 'grad_norm': 0.70235432623764, 'learning_rate': 5.466792843833702e-06, 'epoch': 0.49}
49%|████▊ | 16665/34278 [18:25:39<17:17:00, 3.53s/it] {'loss': 0.1226, 'grad_norm': 0.8076954041882298, 'learning_rate': 5.46632247092608e-06, 'epoch': 0.49}
49%|████▊ | 16666/34278 [18:25:42<16:14:19, 3.32s/it] {'loss': 0.1519, 'grad_norm': 1.0319426461567602, 'learning_rate': 5.465852093855243e-06, 'epoch': 0.49}
49%|████▊ | 16667/34278 [18:25:45<15:24:25, 3.15s/it] {'loss': 0.1281, 'grad_norm': 0.8531808357671388, 'learning_rate': 5.46538171262539e-06, 'epoch': 0.49}
49%|████▊ | 16668/34278 [18:25:48<15:32:46, 3.18s/it] {'loss': 0.173, 'grad_norm': 1.0352049857879229, 'learning_rate': 5.464911327240719e-06, 'epoch': 0.49}
49%|████▊ | 16669/34278 [18:25:51<15:26:19, 3.16s/it] {'loss': 0.1315, 'grad_norm': 0.8800928140830611, 'learning_rate': 5.4644409377054305e-06, 'epoch': 0.49}
49%|████▊ | 16670/34278 [18:25:54<15:12:23, 3.11s/it] {'loss': 0.1626, 'grad_norm': 1.0911421983003886, 'learning_rate': 5.463970544023726e-06, 'epoch': 0.49}
49%|████▊ | 16671/34278 [18:26:00<19:32:23, 4.00s/it] {'loss': 0.1445, 'grad_norm': 0.8169729470278007, 'learning_rate': 5.463500146199801e-06, 'epoch': 0.49}
49%|████▊ | 16672/34278 [18:26:03<18:21:00, 3.75s/it] {'loss': 0.1237, 'grad_norm': 0.8859138574134487, 'learning_rate': 5.46302974423786e-06, 'epoch': 0.49}
49%|████▊ | 16673/34278 [18:26:07<17:45:26, 3.63s/it] {'loss': 0.1349, 'grad_norm': 1.0588198479976483, 'learning_rate': 5.4625593381421e-06, 'epoch': 0.49}
49%|████▊ | 16674/34278 [18:26:10<17:27:34, 3.57s/it] {'loss': 0.1444, 'grad_norm': 0.9419584188868422, 'learning_rate': 5.4620889279167174e-06, 'epoch': 0.49}
49%|████▊ | 16675/34278 [18:26:16<20:58:40, 4.29s/it] {'loss': 0.116, 'grad_norm': 0.8217783306314604, 'learning_rate': 5.461618513565918e-06, 'epoch': 0.49}
49%|████▊ | 16676/34278 [18:26:19<19:08:14, 3.91s/it] {'loss': 0.167, 'grad_norm': 1.488590067859592, 'learning_rate': 5.461148095093898e-06, 'epoch': 0.49}
49%|████▊ | 16677/34278 [18:26:22<18:08:23, 3.71s/it] {'loss': 0.1475, 'grad_norm': 0.9474291803655491, 'learning_rate': 5.460677672504856e-06, 'epoch': 0.49}
49%|████▊ | 16678/34278 [18:26:29<21:58:39, 4.50s/it] {'loss': 0.1379, 'grad_norm': 0.860261484254281, 'learning_rate': 5.460207245802996e-06, 'epoch': 0.49}
49%|████▊ | 16679/34278 [18:26:32<20:02:20, 4.10s/it] {'loss': 0.1154, 'grad_norm': 0.8841663562973775, 'learning_rate': 5.4597368149925154e-06, 'epoch': 0.49}
49%|████▊ | 16680/34278 [18:26:35<18:57:23, 3.88s/it] {'loss': 0.1674, 'grad_norm': 0.8119690881667214, 'learning_rate': 5.459266380077614e-06, 'epoch': 0.49}
49%|████▊ | 16681/34278 [18:26:38<17:37:31, 3.61s/it] {'loss': 0.1271, 'grad_norm': 0.8433755193533623, 'learning_rate': 5.458795941062491e-06, 'epoch': 0.49}
49%|████▊ | 16682/34278 [18:26:41<16:53:00, 3.45s/it] {'loss': 0.111, 'grad_norm': 0.8857321157738206, 'learning_rate': 5.458325497951348e-06, 'epoch': 0.49}
49%|████▊ | 16683/34278 [18:26:47<20:49:10, 4.26s/it] {'loss': 0.1291, 'grad_norm': 0.7623113028603865, 'learning_rate': 5.457855050748385e-06, 'epoch': 0.49}
49%|████▊ | 16684/34278 [18:26:50<19:15:17, 3.94s/it] {'loss': 0.1273, 'grad_norm': 0.8593488336640536, 'learning_rate': 5.457384599457801e-06, 'epoch': 0.49}
49%|████▊ | 16685/34278 [18:26:53<17:41:37, 3.62s/it] {'loss': 0.1312, 'grad_norm': 0.7462675008781164, 'learning_rate': 5.456914144083796e-06, 'epoch': 0.49}
49%|████▊ | 16686/34278 [18:26:57<17:15:29, 3.53s/it] {'loss': 0.1053, 'grad_norm': 0.8076048988420254, 'learning_rate': 5.456443684630572e-06, 'epoch': 0.49}
49%|████▊ | 16687/34278 [18:27:00<17:31:45, 3.59s/it] {'loss': 0.1236, 'grad_norm': 0.7176795308410698, 'learning_rate': 5.455973221102325e-06, 'epoch': 0.49}
49%|████▊ | 16688/34278 [18:27:03<16:34:40, 3.39s/it] {'loss': 0.1235, 'grad_norm': 0.7454224420745489, 'learning_rate': 5.45550275350326e-06, 'epoch': 0.49}
49%|████▊ | 16689/34278 [18:27:07<17:31:05, 3.59s/it] {'loss': 0.1262, 'grad_norm': 0.8958860901901691, 'learning_rate': 5.455032281837576e-06, 'epoch': 0.49}
49%|████▊ | 16690/34278 [18:27:12<18:58:59, 3.89s/it] {'loss': 0.1413, 'grad_norm': 0.8392124770640628, 'learning_rate': 5.454561806109472e-06, 'epoch': 0.49}
49%|████▊ | 16691/34278 [18:27:15<17:50:58, 3.65s/it] {'loss': 0.1362, 'grad_norm': 0.9805932324169192, 'learning_rate': 5.4540913263231466e-06, 'epoch': 0.49}
49%|████▊ | 16692/34278 [18:27:18<17:21:49, 3.55s/it] {'loss': 0.1059, 'grad_norm': 0.8005952637989759, 'learning_rate': 5.453620842482803e-06, 'epoch': 0.49}
49%|████▊ | 16693/34278 [18:27:24<20:59:48, 4.30s/it] {'loss': 0.1655, 'grad_norm': 1.0117586096771631, 'learning_rate': 5.4531503545926425e-06, 'epoch': 0.49}
49%|████▊ | 16694/34278 [18:27:29<21:46:57, 4.46s/it] {'loss': 0.1377, 'grad_norm': 0.7899426762886181, 'learning_rate': 5.452679862656861e-06, 'epoch': 0.49}
49%|████▊ | 16695/34278 [18:27:32<19:33:47, 4.01s/it] {'loss': 0.1287, 'grad_norm': 0.8782222208997967, 'learning_rate': 5.452209366679665e-06, 'epoch': 0.49}
49%|████▊ | 16696/34278 [18:27:35<18:02:40, 3.69s/it] {'loss': 0.1163, 'grad_norm': 0.798744209565529, 'learning_rate': 5.45173886666525e-06, 'epoch': 0.49}
49%|████▊ | 16697/34278 [18:27:39<18:45:29, 3.84s/it] {'loss': 0.1184, 'grad_norm': 0.9060525066292064, 'learning_rate': 5.451268362617819e-06, 'epoch': 0.49}
49%|████▊ | 16698/34278 [18:27:42<17:24:50, 3.57s/it] {'loss': 0.1369, 'grad_norm': 1.039120813857294, 'learning_rate': 5.4507978545415704e-06, 'epoch': 0.49}
49%|████▊ | 16699/34278 [18:27:45<16:36:02, 3.40s/it] {'loss': 0.1311, 'grad_norm': 0.8210188290349758, 'learning_rate': 5.450327342440707e-06, 'epoch': 0.49}
49%|████▊ | 16700/34278 [18:27:51<19:36:32, 4.02s/it] {'loss': 0.1185, 'grad_norm': 0.7169642763400298, 'learning_rate': 5.449856826319429e-06, 'epoch': 0.49}
49%|████▊ | 16701/34278 [18:27:54<18:32:51, 3.80s/it] {'loss': 0.1417, 'grad_norm': 1.2238526355218988, 'learning_rate': 5.449386306181935e-06, 'epoch': 0.49}
49%|████▊ | 16702/34278 [18:27:57<17:40:14, 3.62s/it] {'loss': 0.144, 'grad_norm': 1.0169156857525088, 'learning_rate': 5.448915782032429e-06, 'epoch': 0.49}
49%|████▊ | 16703/34278 [18:28:00<16:56:00, 3.47s/it] {'loss': 0.1022, 'grad_norm': 0.8190136502687897, 'learning_rate': 5.4484452538751095e-06, 'epoch': 0.49}
49%|████▊ | 16704/34278 [18:28:04<17:46:35, 3.64s/it] {'loss': 0.1177, 'grad_norm': 0.9112090248792706, 'learning_rate': 5.447974721714178e-06, 'epoch': 0.49}
49%|████▊ | 16705/34278 [18:28:07<16:55:35, 3.47s/it] {'loss': 0.1348, 'grad_norm': 0.8967721578079546, 'learning_rate': 5.447504185553836e-06, 'epoch': 0.49}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f42b37f9da0>
Failed to fetch sample 2751686.
Exception: cannot identify image file <_io.BytesIO object at 0x7f42b37f9da0>
49%|████▊ | 16706/34278 [18:28:10<16:11:02, 3.32s/it] {'loss': 0.1351, 'grad_norm': 0.6705471140235855, 'learning_rate': 5.4470336453982805e-06, 'epoch': 0.49}
49%|████▊ | 16707/34278 [18:28:13<15:46:19, 3.23s/it] {'loss': 0.1338, 'grad_norm': 0.7990269514479517, 'learning_rate': 5.446563101251718e-06, 'epoch': 0.49}
49%|████▊ | 16708/34278 [18:28:18<17:19:38, 3.55s/it] {'loss': 0.1428, 'grad_norm': 0.8389563162533513, 'learning_rate': 5.446092553118347e-06, 'epoch': 0.49}
49%|████▊ | 16709/34278 [18:28:21<16:32:35, 3.39s/it] {'loss': 0.1309, 'grad_norm': 0.8288790745157593, 'learning_rate': 5.445622001002366e-06, 'epoch': 0.49}
49%|████▊ | 16710/34278 [18:28:26<18:40:11, 3.83s/it] {'loss': 0.1758, 'grad_norm': 0.8000472215523416, 'learning_rate': 5.445151444907981e-06, 'epoch': 0.49}
49%|████▉ | 16711/34278 [18:28:32<21:56:50, 4.50s/it] {'loss': 0.1213, 'grad_norm': 0.6687827671769992, 'learning_rate': 5.444680884839389e-06, 'epoch': 0.49}
49%|████▉ | 16712/34278 [18:28:35<20:22:33, 4.18s/it] {'loss': 0.144, 'grad_norm': 0.8007923645474778, 'learning_rate': 5.444210320800791e-06, 'epoch': 0.49}
49%|████▉ | 16713/34278 [18:28:39<20:08:50, 4.13s/it] {'loss': 0.1232, 'grad_norm': 1.0185180313992426, 'learning_rate': 5.44373975279639e-06, 'epoch': 0.49}
49%|████▉ | 16714/34278 [18:28:42<18:32:09, 3.80s/it] {'loss': 0.1238, 'grad_norm': 0.7897839234618698, 'learning_rate': 5.443269180830386e-06, 'epoch': 0.49}
49%|████▉ | 16715/34278 [18:28:45<17:23:08, 3.56s/it] {'loss': 0.1367, 'grad_norm': 0.7037191563042358, 'learning_rate': 5.442798604906981e-06, 'epoch': 0.49}
49%|████▉ | 16716/34278 [18:28:48<16:39:07, 3.41s/it] {'loss': 0.1373, 'grad_norm': 0.7153632602948264, 'learning_rate': 5.442328025030375e-06, 'epoch': 0.49}
49%|████▉ | 16717/34278 [18:28:52<16:31:29, 3.39s/it] {'loss': 0.1297, 'grad_norm': 0.7414456386811991, 'learning_rate': 5.441857441204772e-06, 'epoch': 0.49}
49%|████▉ | 16718/34278 [18:28:55<16:47:35, 3.44s/it] {'loss': 0.1238, 'grad_norm': 0.9137008098492148, 'learning_rate': 5.441386853434369e-06, 'epoch': 0.49}
49%|████▉ | 16719/34278 [18:29:01<19:40:02, 4.03s/it] {'loss': 0.1621, 'grad_norm': 0.8122164406109992, 'learning_rate': 5.4409162617233715e-06, 'epoch': 0.49}
49%|████▉ | 16720/34278 [18:29:03<17:54:00, 3.67s/it] {'loss': 0.1396, 'grad_norm': 0.8667435227430735, 'learning_rate': 5.440445666075979e-06, 'epoch': 0.49}
49%|████▉ | 16721/34278 [18:29:07<17:40:39, 3.62s/it] {'loss': 0.143, 'grad_norm': 0.9612628334759031, 'learning_rate': 5.4399750664963905e-06, 'epoch': 0.49}
49%|████▉ | 16722/34278 [18:29:11<17:55:00, 3.67s/it] {'loss': 0.1193, 'grad_norm': 0.7942635529285477, 'learning_rate': 5.439504462988811e-06, 'epoch': 0.49}
49%|████▉ | 16723/34278 [18:29:14<17:02:49, 3.50s/it] {'loss': 0.148, 'grad_norm': 0.7533546031416218, 'learning_rate': 5.4390338555574405e-06, 'epoch': 0.49}
49%|████▉ | 16724/34278 [18:29:17<16:19:14, 3.35s/it] {'loss': 0.1386, 'grad_norm': 0.8896298213154132, 'learning_rate': 5.4385632442064795e-06, 'epoch': 0.49}
49%|████▉ | 16725/34278 [18:29:21<17:38:21, 3.62s/it] {'loss': 0.1152, 'grad_norm': 1.0280146935703676, 'learning_rate': 5.4380926289401325e-06, 'epoch': 0.49}
49%|████▉ | 16726/34278 [18:29:25<17:38:33, 3.62s/it] {'loss': 0.1202, 'grad_norm': 0.7729068892464108, 'learning_rate': 5.437622009762599e-06, 'epoch': 0.49}
49%|████▉ | 16727/34278 [18:29:28<16:33:04, 3.39s/it] {'loss': 0.1416, 'grad_norm': 0.7668135338946674, 'learning_rate': 5.437151386678079e-06, 'epoch': 0.49}
49%|████▉ | 16728/34278 [18:29:31<15:59:48, 3.28s/it] {'loss': 0.1401, 'grad_norm': 0.8540119289562168, 'learning_rate': 5.436680759690777e-06, 'epoch': 0.49}
49%|████▉ | 16729/34278 [18:29:34<16:14:54, 3.33s/it] {'loss': 0.1412, 'grad_norm': 0.7166846145096721, 'learning_rate': 5.436210128804893e-06, 'epoch': 0.49}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fafef8dce00>
Failed to fetch sample 2667592.
Exception: cannot identify image file <_io.BytesIO object at 0x7fafef8dce00>
49%|████▉ | 16730/34278 [18:29:38<17:07:49, 3.51s/it] {'loss': 0.1233, 'grad_norm': 1.276487160810915, 'learning_rate': 5.435739494024629e-06, 'epoch': 0.49}
49%|████▉ | 16731/34278 [18:29:41<16:50:31, 3.46s/it] {'loss': 0.129, 'grad_norm': 0.9385052299502991, 'learning_rate': 5.4352688553541865e-06, 'epoch': 0.49}
49%|████▉ | 16732/34278 [18:29:44<15:57:31, 3.27s/it] {'loss': 0.1092, 'grad_norm': 0.763211669110271, 'learning_rate': 5.434798212797767e-06, 'epoch': 0.49}
49%|████▉ | 16733/34278 [18:29:48<16:19:17, 3.35s/it] {'loss': 0.1212, 'grad_norm': 0.799938314499092, 'learning_rate': 5.434327566359574e-06, 'epoch': 0.49}
49%|████▉ | 16734/34278 [18:29:51<15:59:17, 3.28s/it] {'loss': 0.1716, 'grad_norm': 0.7336706928372488, 'learning_rate': 5.433856916043808e-06, 'epoch': 0.49}
49%|████▉ | 16735/34278 [18:29:54<15:30:13, 3.18s/it] {'loss': 0.1375, 'grad_norm': 1.068392530537651, 'learning_rate': 5.433386261854672e-06, 'epoch': 0.49}
49%|████▉ | 16736/34278 [18:29:57<15:22:49, 3.16s/it] {'loss': 0.1372, 'grad_norm': 0.8966242026079092, 'learning_rate': 5.432915603796365e-06, 'epoch': 0.49}
49%|████▉ | 16737/34278 [18:30:01<16:58:40, 3.48s/it] {'loss': 0.136, 'grad_norm': 0.7255530041341725, 'learning_rate': 5.432444941873092e-06, 'epoch': 0.49}
49%|████▉ | 16738/34278 [18:30:07<20:39:43, 4.24s/it] {'loss': 0.16, 'grad_norm': 0.9633534175498903, 'learning_rate': 5.431974276089054e-06, 'epoch': 0.49}
49%|████▉ | 16739/34278 [18:30:10<18:56:58, 3.89s/it] {'loss': 0.1302, 'grad_norm': 0.7816706975664015, 'learning_rate': 5.431503606448452e-06, 'epoch': 0.49}
49%|████▉ | 16740/34278 [18:30:14<19:24:46, 3.98s/it] {'loss': 0.1185, 'grad_norm': 0.7555487024819179, 'learning_rate': 5.4310329329554885e-06, 'epoch': 0.49}
49%|████▉ | 16741/34278 [18:30:17<18:09:14, 3.73s/it] {'loss': 0.1382, 'grad_norm': 1.1433135594249912, 'learning_rate': 5.4305622556143675e-06, 'epoch': 0.49}
49%|████▉ | 16742/34278 [18:30:21<17:27:08, 3.58s/it] {'loss': 0.1307, 'grad_norm': 0.7882269824023626, 'learning_rate': 5.430091574429288e-06, 'epoch': 0.49}
49%|████▉ | 16743/34278 [18:30:24<17:37:31, 3.62s/it] {'loss': 0.1144, 'grad_norm': 0.7279517395647666, 'learning_rate': 5.429620889404454e-06, 'epoch': 0.49}
49%|████▉ | 16744/34278 [18:30:28<17:20:25, 3.56s/it] {'loss': 0.141, 'grad_norm': 0.7461429640054047, 'learning_rate': 5.429150200544068e-06, 'epoch': 0.49}
49%|████▉ | 16745/34278 [18:30:31<16:14:01, 3.33s/it] {'loss': 0.1369, 'grad_norm': 0.8471060884395589, 'learning_rate': 5.42867950785233e-06, 'epoch': 0.49}
49%|████▉ | 16746/34278 [18:30:34<15:43:39, 3.23s/it] {'loss': 0.1459, 'grad_norm': 0.7419462183963568, 'learning_rate': 5.4282088113334445e-06, 'epoch': 0.49}
49%|████▉ | 16747/34278 [18:30:40<19:52:16, 4.08s/it] {'loss': 0.1427, 'grad_norm': 0.7109712546871579, 'learning_rate': 5.427738110991613e-06, 'epoch': 0.49}
49%|████▉ | 16748/34278 [18:30:43<19:13:41, 3.95s/it] {'loss': 0.1205, 'grad_norm': 0.7921504567432183, 'learning_rate': 5.427267406831037e-06, 'epoch': 0.49}
49%|████▉ | 16749/34278 [18:30:47<18:09:12, 3.73s/it] {'loss': 0.1268, 'grad_norm': 0.6369093230324466, 'learning_rate': 5.426796698855921e-06, 'epoch': 0.49}
49%|████▉ | 16750/34278 [18:30:51<19:05:44, 3.92s/it] {'loss': 0.1249, 'grad_norm': 0.7619024885991046, 'learning_rate': 5.426325987070465e-06, 'epoch': 0.49}
49%|████▉ | 16751/34278 [18:30:54<18:25:34, 3.78s/it] {'loss': 0.1642, 'grad_norm': 1.1428791818286819, 'learning_rate': 5.425855271478873e-06, 'epoch': 0.49}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
49%|████▉ | 16752/34278 [18:30:58<17:39:00, 3.63s/it] {'loss': 0.1353, 'grad_norm': 0.7777704283770847, 'learning_rate': 5.425384552085346e-06, 'epoch': 0.49}
49%|████▉ | 16753/34278 [18:31:01<16:42:41, 3.43s/it] {'loss': 0.1344, 'grad_norm': 1.350268335225953, 'learning_rate': 5.424913828894088e-06, 'epoch': 0.49}
49%|████▉ | 16754/34278 [18:31:05<17:43:06, 3.64s/it] {'loss': 0.1344, 'grad_norm': 0.7177870442582034, 'learning_rate': 5.424443101909299e-06, 'epoch': 0.49}
49%|████▉ | 16755/34278 [18:31:08<17:45:56, 3.65s/it] {'loss': 0.1412, 'grad_norm': 0.902093696664178, 'learning_rate': 5.423972371135186e-06, 'epoch': 0.49}
49%|████▉ | 16756/34278 [18:31:14<19:54:07, 4.09s/it] {'loss': 0.1297, 'grad_norm': 0.8169954794117326, 'learning_rate': 5.423501636575947e-06, 'epoch': 0.49}
49%|████▉ | 16757/34278 [18:31:17<18:42:34, 3.84s/it] {'loss': 0.1183, 'grad_norm': 1.1969661614115779, 'learning_rate': 5.423030898235788e-06, 'epoch': 0.49}
49%|████▉ | 16758/34278 [18:31:22<21:23:03, 4.39s/it] {'loss': 0.1331, 'grad_norm': 0.9097145304324763, 'learning_rate': 5.422560156118909e-06, 'epoch': 0.49}
49%|████▉ | 16759/34278 [18:31:26<20:09:38, 4.14s/it] {'loss': 0.1325, 'grad_norm': 0.8576091171382718, 'learning_rate': 5.422089410229514e-06, 'epoch': 0.49}
49%|████▉ | 16760/34278 [18:31:30<19:17:03, 3.96s/it] {'loss': 0.132, 'grad_norm': 1.542136337139382, 'learning_rate': 5.421618660571804e-06, 'epoch': 0.49}
49%|████▉ | 16761/34278 [18:31:34<19:22:45, 3.98s/it] {'loss': 0.1554, 'grad_norm': 0.9677542293305892, 'learning_rate': 5.4211479071499866e-06, 'epoch': 0.49}
49%|████▉ | 16762/34278 [18:31:37<18:21:56, 3.77s/it] {'loss': 0.1335, 'grad_norm': 0.802046556672027, 'learning_rate': 5.420677149968259e-06, 'epoch': 0.49}
49%|████▉ | 16763/34278 [18:31:40<17:15:08, 3.55s/it] {'loss': 0.1268, 'grad_norm': 1.016011147504178, 'learning_rate': 5.4202063890308265e-06, 'epoch': 0.49}
49%|████▉ | 16764/34278 [18:31:43<16:34:47, 3.41s/it] {'loss': 0.1018, 'grad_norm': 0.6628011873979082, 'learning_rate': 5.419735624341891e-06, 'epoch': 0.49}
49%|████▉ | 16765/34278 [18:31:47<18:12:09, 3.74s/it] {'loss': 0.1494, 'grad_norm': 0.8839723900097081, 'learning_rate': 5.419264855905658e-06, 'epoch': 0.49}
49%|████▉ | 16766/34278 [18:31:54<21:46:38, 4.48s/it] {'loss': 0.1227, 'grad_norm': 0.7841528464484294, 'learning_rate': 5.418794083726326e-06, 'epoch': 0.49}
49%|████▉ | 16767/34278 [18:31:57<19:30:43, 4.01s/it] {'loss': 0.1175, 'grad_norm': 0.8447786992580212, 'learning_rate': 5.418323307808102e-06, 'epoch': 0.49}
49%|████▉ | 16768/34278 [18:32:00<18:33:16, 3.81s/it] {'loss': 0.1429, 'grad_norm': 0.8742525640783552, 'learning_rate': 5.4178525281551874e-06, 'epoch': 0.49}
49%|████▉ | 16769/34278 [18:32:03<17:43:33, 3.64s/it] {'loss': 0.1237, 'grad_norm': 0.7398236346392426, 'learning_rate': 5.417381744771783e-06, 'epoch': 0.49}
49%|████▉ | 16770/34278 [18:32:06<16:54:44, 3.48s/it] {'loss': 0.1156, 'grad_norm': 0.8276263046738359, 'learning_rate': 5.416910957662098e-06, 'epoch': 0.49}
49%|████▉ | 16771/34278 [18:32:12<20:34:59, 4.23s/it] {'loss': 0.1169, 'grad_norm': 0.7211954023857353, 'learning_rate': 5.416440166830329e-06, 'epoch': 0.49}
49%|████▉ | 16772/34278 [18:32:15<18:56:39, 3.90s/it] {'loss': 0.1609, 'grad_norm': 0.7849124847552079, 'learning_rate': 5.415969372280682e-06, 'epoch': 0.49}
49%|████▉ | 16773/34278 [18:32:19<18:48:50, 3.87s/it] {'loss': 0.1323, 'grad_norm': 0.8787052180196727, 'learning_rate': 5.415498574017359e-06, 'epoch': 0.49}
49%|████▉ | 16774/34278 [18:32:23<18:24:36, 3.79s/it] {'loss': 0.1255, 'grad_norm': 0.907574241717095, 'learning_rate': 5.415027772044565e-06, 'epoch': 0.49}
49%|████▉ | 16775/34278 [18:32:28<20:05:06, 4.13s/it] {'loss': 0.1382, 'grad_norm': 0.7477803076224181, 'learning_rate': 5.4145569663665024e-06, 'epoch': 0.49}
49%|████▉ | 16776/34278 [18:32:31<19:02:15, 3.92s/it] {'loss': 0.1402, 'grad_norm': 1.0226656916821553, 'learning_rate': 5.4140861569873725e-06, 'epoch': 0.49}
49%|████▉ | 16777/34278 [18:32:35<19:00:24, 3.91s/it] {'loss': 0.1155, 'grad_norm': 0.7094494368558831, 'learning_rate': 5.413615343911382e-06, 'epoch': 0.49}
49%|████▉ | 16778/34278 [18:32:39<19:03:52, 3.92s/it] {'loss': 0.1295, 'grad_norm': 0.8783132299893559, 'learning_rate': 5.413144527142731e-06, 'epoch': 0.49}
49%|████▉ | 16779/34278 [18:32:42<17:55:31, 3.69s/it] {'loss': 0.1384, 'grad_norm': 0.8450164614100708, 'learning_rate': 5.412673706685625e-06, 'epoch': 0.49}
49%|████▉ | 16780/34278 [18:32:45<17:00:42, 3.50s/it] {'loss': 0.1248, 'grad_norm': 0.6510655045156076, 'learning_rate': 5.4122028825442675e-06, 'epoch': 0.49}
49%|████▉ | 16781/34278 [18:32:49<17:31:36, 3.61s/it] {'loss': 0.1369, 'grad_norm': 0.8934417540642914, 'learning_rate': 5.411732054722859e-06, 'epoch': 0.49}
49%|████▉ | 16782/34278 [18:32:52<17:12:06, 3.54s/it] {'loss': 0.1254, 'grad_norm': 0.7863068068754623, 'learning_rate': 5.411261223225605e-06, 'epoch': 0.49}
49%|████▉ | 16783/34278 [18:32:56<16:55:10, 3.48s/it] {'loss': 0.1287, 'grad_norm': 0.8649214694677202, 'learning_rate': 5.4107903880567125e-06, 'epoch': 0.49}
49%|████▉ | 16784/34278 [18:32:59<17:04:55, 3.52s/it] {'loss': 0.137, 'grad_norm': 0.8852695558120606, 'learning_rate': 5.410319549220378e-06, 'epoch': 0.49}
49%|████▉ | 16785/34278 [18:33:02<16:16:08, 3.35s/it] {'loss': 0.1249, 'grad_norm': 0.9302665968237415, 'learning_rate': 5.40984870672081e-06, 'epoch': 0.49}
49%|████▉ | 16786/34278 [18:33:06<16:34:59, 3.41s/it] {'loss': 0.1267, 'grad_norm': 0.8130711936782737, 'learning_rate': 5.4093778605622105e-06, 'epoch': 0.49}
49%|████▉ | 16787/34278 [18:33:12<20:22:21, 4.19s/it] {'loss': 0.1199, 'grad_norm': 0.8807032602975655, 'learning_rate': 5.408907010748783e-06, 'epoch': 0.49}
49%|████▉ | 16788/34278 [18:33:15<18:44:27, 3.86s/it] {'loss': 0.15, 'grad_norm': 1.013407550964662, 'learning_rate': 5.408436157284731e-06, 'epoch': 0.49}
49%|████▉ | 16789/34278 [18:33:21<21:52:31, 4.50s/it] {'loss': 0.1332, 'grad_norm': 0.6923897675745911, 'learning_rate': 5.40796530017426e-06, 'epoch': 0.49}
49%|████▉ | 16790/34278 [18:33:25<21:43:54, 4.47s/it] {'loss': 0.1068, 'grad_norm': 0.858974260629212, 'learning_rate': 5.40749443942157e-06, 'epoch': 0.49}
49%|████▉ | 16791/34278 [18:33:30<21:19:28, 4.39s/it] {'loss': 0.1548, 'grad_norm': 0.9930197684295528, 'learning_rate': 5.407023575030867e-06, 'epoch': 0.49}
49%|████▉ | 16792/34278 [18:33:33<19:48:45, 4.08s/it] {'loss': 0.1384, 'grad_norm': 0.8859854839653981, 'learning_rate': 5.406552707006356e-06, 'epoch': 0.49}
49%|████▉ | 16793/34278 [18:33:37<20:20:01, 4.19s/it] {'loss': 0.1491, 'grad_norm': 0.793575497542596, 'learning_rate': 5.4060818353522396e-06, 'epoch': 0.49}
49%|████▉ | 16794/34278 [18:33:41<19:23:49, 3.99s/it] {'loss': 0.1375, 'grad_norm': 0.888016965330607, 'learning_rate': 5.405610960072721e-06, 'epoch': 0.49}
49%|████▉ | 16795/34278 [18:33:44<17:34:48, 3.62s/it] {'loss': 0.132, 'grad_norm': 0.8028485926525644, 'learning_rate': 5.405140081172005e-06, 'epoch': 0.49}
49%|████▉ | 16796/34278 [18:33:47<17:04:16, 3.52s/it] {'loss': 0.1294, 'grad_norm': 1.093747280561161, 'learning_rate': 5.4046691986542935e-06, 'epoch': 0.49}
49%|████▉ | 16797/34278 [18:33:51<17:34:10, 3.62s/it] {'loss': 0.1267, 'grad_norm': 0.6918535963370159, 'learning_rate': 5.404198312523793e-06, 'epoch': 0.49}
49%|████▉ | 16798/34278 [18:33:55<18:58:39, 3.91s/it] {'loss': 0.1267, 'grad_norm': 0.855602394232518, 'learning_rate': 5.403727422784707e-06, 'epoch': 0.49}
49%|████▉ | 16799/34278 [18:34:00<19:31:08, 4.02s/it] {'loss': 0.1568, 'grad_norm': 0.9963409012648484, 'learning_rate': 5.403256529441238e-06, 'epoch': 0.49}
49%|████▉ | 16800/34278 [18:34:03<18:48:44, 3.87s/it] {'loss': 0.1421, 'grad_norm': 0.7977064881755879, 'learning_rate': 5.402785632497593e-06, 'epoch': 0.49}
49%|████▉ | 16801/34278 [18:34:06<17:41:30, 3.64s/it] {'loss': 0.1345, 'grad_norm': 0.829545169257994, 'learning_rate': 5.4023147319579715e-06, 'epoch': 0.49}
49%|████▉ | 16802/34278 [18:34:09<16:52:21, 3.48s/it] {'loss': 0.1177, 'grad_norm': 0.74539276370791, 'learning_rate': 5.401843827826581e-06, 'epoch': 0.49}
49%|████▉ | 16803/34278 [18:34:13<16:56:36, 3.49s/it] {'loss': 0.1329, 'grad_norm': 0.8563638907714458, 'learning_rate': 5.4013729201076245e-06, 'epoch': 0.49}
49%|████▉ | 16804/34278 [18:34:17<17:12:27, 3.55s/it] {'loss': 0.1201, 'grad_norm': 0.7020997028155187, 'learning_rate': 5.400902008805306e-06, 'epoch': 0.49}
49%|████▉ | 16805/34278 [18:34:19<16:12:37, 3.34s/it] {'loss': 0.1277, 'grad_norm': 0.891919959854982, 'learning_rate': 5.400431093923832e-06, 'epoch': 0.49}
49%|████▉ | 16806/34278 [18:34:22<15:38:27, 3.22s/it] {'loss': 0.14, 'grad_norm': 0.9311534081126035, 'learning_rate': 5.399960175467404e-06, 'epoch': 0.49}
49%|████▉ | 16807/34278 [18:34:26<16:04:46, 3.31s/it] {'loss': 0.1306, 'grad_norm': 0.708145217147487, 'learning_rate': 5.3994892534402255e-06, 'epoch': 0.49}
49%|████▉ | 16808/34278 [18:34:29<16:06:35, 3.32s/it] {'loss': 0.1312, 'grad_norm': 0.6924660526146618, 'learning_rate': 5.399018327846504e-06, 'epoch': 0.49}
49%|████▉ | 16809/34278 [18:34:32<15:49:55, 3.26s/it] {'loss': 0.1447, 'grad_norm': 0.7908199978659743, 'learning_rate': 5.398547398690441e-06, 'epoch': 0.49}
49%|████▉ | 16810/34278 [18:34:36<15:54:16, 3.28s/it] {'loss': 0.1392, 'grad_norm': 0.835693832828301, 'learning_rate': 5.398076465976243e-06, 'epoch': 0.49}
49%|████▉ | 16811/34278 [18:34:41<19:28:15, 4.01s/it] {'loss': 0.1497, 'grad_norm': 1.3199714941969403, 'learning_rate': 5.397605529708112e-06, 'epoch': 0.49}
49%|████▉ | 16812/34278 [18:34:45<19:27:16, 4.01s/it] {'loss': 0.1098, 'grad_norm': 0.7411579942711041, 'learning_rate': 5.397134589890255e-06, 'epoch': 0.49}
49%|████▉ | 16813/34278 [18:34:49<18:05:41, 3.73s/it] {'loss': 0.1402, 'grad_norm': 0.7547862470630657, 'learning_rate': 5.396663646526875e-06, 'epoch': 0.49}
49%|████▉ | 16814/34278 [18:34:51<16:42:13, 3.44s/it] {'loss': 0.1241, 'grad_norm': 1.0065925346132234, 'learning_rate': 5.396192699622176e-06, 'epoch': 0.49}
49%|████▉ | 16815/34278 [18:34:55<17:16:25, 3.56s/it] {'loss': 0.1293, 'grad_norm': 0.9593389319689464, 'learning_rate': 5.3957217491803645e-06, 'epoch': 0.49}
49%|████▉ | 16816/34278 [18:34:58<16:24:55, 3.38s/it] {'loss': 0.1391, 'grad_norm': 0.7320499446501436, 'learning_rate': 5.395250795205642e-06, 'epoch': 0.49}
49%|████▉ | 16817/34278 [18:35:03<18:32:42, 3.82s/it] {'loss': 0.1267, 'grad_norm': 0.7895198341381162, 'learning_rate': 5.394779837702216e-06, 'epoch': 0.49}
49%|████▉ | 16818/34278 [18:35:07<18:09:49, 3.75s/it] {'loss': 0.147, 'grad_norm': 0.974056337435937, 'learning_rate': 5.394308876674289e-06, 'epoch': 0.49}
49%|████▉ | 16819/34278 [18:35:10<17:47:57, 3.67s/it] {'loss': 0.1582, 'grad_norm': 1.1127042826556375, 'learning_rate': 5.3938379121260675e-06, 'epoch': 0.49}
49%|████▉ | 16820/34278 [18:35:14<17:42:21, 3.65s/it] {'loss': 0.1413, 'grad_norm': 1.1167645802770252, 'learning_rate': 5.393366944061754e-06, 'epoch': 0.49}
49%|████▉ | 16821/34278 [18:35:17<17:31:43, 3.61s/it] {'loss': 0.1402, 'grad_norm': 0.7937400312126629, 'learning_rate': 5.392895972485555e-06, 'epoch': 0.49}
49%|████▉ | 16822/34278 [18:35:23<21:23:52, 4.41s/it] {'loss': 0.1468, 'grad_norm': 0.9375515907959356, 'learning_rate': 5.392424997401674e-06, 'epoch': 0.49}
49%|████▉ | 16823/34278 [18:35:26<18:54:38, 3.90s/it] {'loss': 0.1282, 'grad_norm': 1.277586920616488, 'learning_rate': 5.391954018814316e-06, 'epoch': 0.49}
49%|████▉ | 16824/34278 [18:35:29<17:26:18, 3.60s/it] {'loss': 0.1292, 'grad_norm': 1.0791530362056487, 'learning_rate': 5.3914830367276875e-06, 'epoch': 0.49}
49%|████▉ | 16825/34278 [18:35:32<17:02:55, 3.52s/it] {'loss': 0.1317, 'grad_norm': 0.7994404509745789, 'learning_rate': 5.3910120511459915e-06, 'epoch': 0.49}
49%|████▉ | 16826/34278 [18:35:36<16:40:44, 3.44s/it] {'loss': 0.1355, 'grad_norm': 1.107509311312047, 'learning_rate': 5.390541062073432e-06, 'epoch': 0.49}
49%|████▉ | 16827/34278 [18:35:39<17:07:22, 3.53s/it] {'loss': 0.1478, 'grad_norm': 0.8922484935540654, 'learning_rate': 5.390070069514216e-06, 'epoch': 0.49}
49%|████▉ | 16828/34278 [18:35:43<16:49:53, 3.47s/it] {'loss': 0.1216, 'grad_norm': 1.2708040797349531, 'learning_rate': 5.389599073472549e-06, 'epoch': 0.49}
49%|████▉ | 16829/34278 [18:35:49<20:21:41, 4.20s/it] {'loss': 0.1372, 'grad_norm': 1.0097909657546378, 'learning_rate': 5.389128073952632e-06, 'epoch': 0.49}
49%|████▉ | 16830/34278 [18:35:54<21:36:09, 4.46s/it] {'loss': 0.1562, 'grad_norm': 0.7479600103341646, 'learning_rate': 5.388657070958674e-06, 'epoch': 0.49}
49%|████▉ | 16831/34278 [18:35:57<19:46:16, 4.08s/it] {'loss': 0.1152, 'grad_norm': 0.7886261775704383, 'learning_rate': 5.388186064494878e-06, 'epoch': 0.49}
49%|████▉ | 16832/34278 [18:36:00<18:19:57, 3.78s/it] {'loss': 0.1627, 'grad_norm': 1.0528727344581106, 'learning_rate': 5.3877150545654486e-06, 'epoch': 0.49}
49%|████▉ | 16833/34278 [18:36:05<20:51:42, 4.31s/it] {'loss': 0.1424, 'grad_norm': 0.7441770880074093, 'learning_rate': 5.387244041174593e-06, 'epoch': 0.49}
49%|████▉ | 16834/34278 [18:36:10<21:37:04, 4.46s/it] {'loss': 0.1319, 'grad_norm': 0.8021533562511127, 'learning_rate': 5.3867730243265145e-06, 'epoch': 0.49}
49%|████▉ | 16835/34278 [18:36:13<18:59:14, 3.92s/it] {'loss': 0.1234, 'grad_norm': 0.8206389697842621, 'learning_rate': 5.386302004025419e-06, 'epoch': 0.49}
49%|████▉ | 16836/34278 [18:36:19<21:25:28,
4.42s/it] {'loss': 0.1413, 'grad_norm': 0.9698046102401209, 'learning_rate': 5.385830980275511e-06, 'epoch': 0.49} 49%|████▉ | 16836/34278 [18:36:19<21:25:28, 4.42s/it] 49%|████▉ | 16837/34278 [18:36:22<20:39:02, 4.26s/it] {'loss': 0.1467, 'grad_norm': 0.7334987243422961, 'learning_rate': 5.385359953080997e-06, 'epoch': 0.49} 49%|████▉ | 16837/34278 [18:36:22<20:39:02, 4.26s/it] 49%|████▉ | 16838/34278 [18:36:27<21:18:18, 4.40s/it] {'loss': 0.1278, 'grad_norm': 0.7536300243009635, 'learning_rate': 5.384888922446081e-06, 'epoch': 0.49} 49%|████▉ | 16838/34278 [18:36:27<21:18:18, 4.40s/it] 49%|████▉ | 16839/34278 [18:36:30<19:41:42, 4.07s/it] {'loss': 0.1054, 'grad_norm': 0.7082229165762484, 'learning_rate': 5.384417888374967e-06, 'epoch': 0.49} 49%|████▉ | 16839/34278 [18:36:30<19:41:42, 4.07s/it] 49%|████▉ | 16840/34278 [18:36:37<22:37:10, 4.67s/it] {'loss': 0.1342, 'grad_norm': 0.756425998044131, 'learning_rate': 5.383946850871865e-06, 'epoch': 0.49} 49%|████▉ | 16840/34278 [18:36:37<22:37:10, 4.67s/it] 49%|████▉ | 16841/34278 [18:36:40<20:36:26, 4.25s/it] {'loss': 0.1389, 'grad_norm': 0.8357922972489411, 'learning_rate': 5.383475809940975e-06, 'epoch': 0.49} 49%|████▉ | 16841/34278 [18:36:40<20:36:26, 4.25s/it] 49%|████▉ | 16842/34278 [18:36:43<19:15:59, 3.98s/it] {'loss': 0.1217, 'grad_norm': 0.7504759075158891, 'learning_rate': 5.383004765586504e-06, 'epoch': 0.49} 49%|████▉ | 16842/34278 [18:36:43<19:15:59, 3.98s/it] 49%|████▉ | 16843/34278 [18:36:49<22:07:41, 4.57s/it] {'loss': 0.1251, 'grad_norm': 0.6340031231950363, 'learning_rate': 5.38253371781266e-06, 'epoch': 0.49} 49%|████▉ | 16843/34278 [18:36:49<22:07:41, 4.57s/it] 49%|████▉ | 16844/34278 [18:36:53<20:59:43, 4.34s/it] {'loss': 0.1392, 'grad_norm': 0.8961351187184075, 'learning_rate': 5.3820626666236445e-06, 'epoch': 0.49} 49%|████▉ | 16844/34278 [18:36:53<20:59:43, 4.34s/it] 49%|████▉ | 16845/34278 [18:36:58<21:27:07, 4.43s/it] {'loss': 0.1491, 'grad_norm': 1.1382162980969783, 'learning_rate': 
5.381591612023665e-06, 'epoch': 0.49} 49%|████▉ | 16845/34278 [18:36:58<21:27:07, 4.43s/it]
49%|████▉ | 16846/34278 [18:37:00<19:13:37, 3.97s/it] {'loss': 0.1151, 'grad_norm': 1.0942466131232769, 'learning_rate': 5.381120554016928e-06, 'epoch': 0.49}
49%|████▉ | 16847/34278 [18:37:04<18:23:53, 3.80s/it] {'loss': 0.1285, 'grad_norm': 1.127863183941797, 'learning_rate': 5.380649492607636e-06, 'epoch': 0.49}
49%|████▉ | 16848/34278 [18:37:09<20:25:54, 4.22s/it] {'loss': 0.1481, 'grad_norm': 0.9259252089644728, 'learning_rate': 5.380178427799997e-06, 'epoch': 0.49}
49%|████▉ | 16849/34278 [18:37:12<18:41:36, 3.86s/it] {'loss': 0.1654, 'grad_norm': 0.8838430115790725, 'learning_rate': 5.379707359598215e-06, 'epoch': 0.49}
49%|████▉ | 16850/34278 [18:37:15<16:54:49, 3.49s/it] {'loss': 0.1268, 'grad_norm': 0.9487736874910573, 'learning_rate': 5.379236288006497e-06, 'epoch': 0.49}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f96a23c5b70>
Failed to fetch sample 3722599. Exception: cannot identify image file <_io.BytesIO object at 0x7f96a23c5b70>
49%|████▉ | 16851/34278 [18:37:21<20:34:45, 4.25s/it] {'loss': 0.1246, 'grad_norm': 0.983283064243009, 'learning_rate': 5.378765213029048e-06, 'epoch': 0.49}
49%|████▉ | 16852/34278 [18:37:24<18:56:24, 3.91s/it] {'loss': 0.1375, 'grad_norm': 0.7664008709616098, 'learning_rate': 5.378294134670073e-06, 'epoch': 0.49}
49%|████▉ | 16853/34278 [18:37:27<17:51:15, 3.69s/it] {'loss': 0.1057, 'grad_norm': 0.716134218745774, 'learning_rate': 5.377823052933779e-06, 'epoch': 0.49}
49%|████▉ | 16854/34278 [18:37:33<21:16:05, 4.39s/it] {'loss': 0.1333, 'grad_norm': 0.7962058399740612, 'learning_rate': 5.37735196782437e-06, 'epoch': 0.49}
49%|████▉ | 16855/34278 [18:37:37<20:11:41, 4.17s/it] {'loss': 0.1196, 'grad_norm': 0.6437605306220976, 'learning_rate': 5.376880879346054e-06, 'epoch': 0.49}
49%|████▉ | 16856/34278 [18:37:40<18:59:09, 3.92s/it] {'loss': 0.1416, 'grad_norm': 1.111434136573803, 'learning_rate': 5.376409787503034e-06, 'epoch': 0.49}
49%|████▉ | 16857/34278 [18:37:46<21:45:33, 4.50s/it] {'loss': 0.1116, 'grad_norm':
0.7347356291363897, 'learning_rate': 5.375938692299518e-06, 'epoch': 0.49} 49%|████▉ | 16857/34278 [18:37:46<21:45:33, 4.50s/it] 49%|████▉ | 16858/34278 [18:37:49<20:03:54, 4.15s/it] {'loss': 0.1431, 'grad_norm': 0.7997691791500734, 'learning_rate': 5.375467593739713e-06, 'epoch': 0.49} 49%|████▉ | 16858/34278 [18:37:49<20:03:54, 4.15s/it] 49%|████▉ | 16859/34278 [18:37:52<18:40:41, 3.86s/it] {'loss': 0.1247, 'grad_norm': 0.6407723668348148, 'learning_rate': 5.37499649182782e-06, 'epoch': 0.49} 49%|████▉ | 16859/34278 [18:37:52<18:40:41, 3.86s/it] 49%|████▉ | 16860/34278 [18:37:56<18:44:53, 3.87s/it] {'loss': 0.1565, 'grad_norm': 0.9153042964347438, 'learning_rate': 5.37452538656805e-06, 'epoch': 0.49} 49%|████▉ | 16860/34278 [18:37:56<18:44:53, 3.87s/it] 49%|████▉ | 16861/34278 [18:37:59<17:35:23, 3.64s/it] {'loss': 0.1345, 'grad_norm': 0.7070742348713276, 'learning_rate': 5.374054277964605e-06, 'epoch': 0.49} 49%|████▉ | 16861/34278 [18:37:59<17:35:23, 3.64s/it] 49%|████▉ | 16862/34278 [18:38:02<16:21:29, 3.38s/it] {'loss': 0.1345, 'grad_norm': 0.7474405370120811, 'learning_rate': 5.373583166021694e-06, 'epoch': 0.49} 49%|████▉ | 16862/34278 [18:38:02<16:21:29, 3.38s/it] 49%|████▉ | 16863/34278 [18:38:08<20:09:39, 4.17s/it] {'loss': 0.1201, 'grad_norm': 0.975596061924209, 'learning_rate': 5.373112050743522e-06, 'epoch': 0.49} 49%|████▉ | 16863/34278 [18:38:08<20:09:39, 4.17s/it] 49%|████▉ | 16864/34278 [18:38:11<18:57:07, 3.92s/it] {'loss': 0.1245, 'grad_norm': 0.8097861084213097, 'learning_rate': 5.3726409321342935e-06, 'epoch': 0.49} 49%|████▉ | 16864/34278 [18:38:11<18:57:07, 3.92s/it] 49%|████▉ | 16865/34278 [18:38:15<18:05:55, 3.74s/it] {'loss': 0.1503, 'grad_norm': 0.792038897417954, 'learning_rate': 5.372169810198215e-06, 'epoch': 0.49} 49%|████▉ | 16865/34278 [18:38:15<18:05:55, 3.74s/it] 49%|████▉ | 16866/34278 [18:38:18<17:08:09, 3.54s/it] {'loss': 0.1412, 'grad_norm': 0.9405911575915272, 'learning_rate': 5.371698684939495e-06, 'epoch': 0.49} 49%|████▉ 
| 16866/34278 [18:38:18<17:08:09, 3.54s/it] 49%|████▉ | 16867/34278 [18:38:21<16:17:36, 3.37s/it] {'loss': 0.129, 'grad_norm': 0.7992514729447099, 'learning_rate': 5.371227556362337e-06, 'epoch': 0.49} 49%|████▉ | 16867/34278 [18:38:21<16:17:36, 3.37s/it] 49%|████▉ | 16868/34278 [18:38:25<17:44:55, 3.67s/it] {'loss': 0.114, 'grad_norm': 0.6542417261430244, 'learning_rate': 5.370756424470948e-06, 'epoch': 0.49} 49%|████▉ | 16868/34278 [18:38:25<17:44:55, 3.67s/it] 49%|████▉ | 16869/34278 [18:38:31<21:19:07, 4.41s/it] {'loss': 0.1287, 'grad_norm': 1.163943728663297, 'learning_rate': 5.370285289269535e-06, 'epoch': 0.49} 49%|████▉ | 16869/34278 [18:38:31<21:19:07, 4.41s/it] 49%|████▉ | 16870/34278 [18:38:35<19:45:09, 4.08s/it] {'loss': 0.1268, 'grad_norm': 0.9083702708069268, 'learning_rate': 5.369814150762304e-06, 'epoch': 0.49} 49%|████▉ | 16870/34278 [18:38:35<19:45:09, 4.08s/it] 49%|████▉ | 16871/34278 [18:38:39<19:28:44, 4.03s/it] {'loss': 0.1042, 'grad_norm': 0.6994081948116999, 'learning_rate': 5.369343008953458e-06, 'epoch': 0.49} 49%|████▉ | 16871/34278 [18:38:39<19:28:44, 4.03s/it] 49%|████▉ | 16872/34278 [18:38:44<21:42:27, 4.49s/it] {'loss': 0.1171, 'grad_norm': 0.8741433725163471, 'learning_rate': 5.368871863847207e-06, 'epoch': 0.49} 49%|████▉ | 16872/34278 [18:38:44<21:42:27, 4.49s/it] 49%|████▉ | 16873/34278 [18:38:50<23:33:16, 4.87s/it] {'loss': 0.127, 'grad_norm': 1.2603996912826851, 'learning_rate': 5.368400715447757e-06, 'epoch': 0.49} 49%|████▉ | 16873/34278 [18:38:50<23:33:16, 4.87s/it] 49%|████▉ | 16874/34278 [18:38:53<21:15:49, 4.40s/it] {'loss': 0.1147, 'grad_norm': 0.7652508192028512, 'learning_rate': 5.367929563759311e-06, 'epoch': 0.49} 49%|████▉ | 16874/34278 [18:38:53<21:15:49, 4.40s/it] 49%|████▉ | 16875/34278 [18:38:57<21:03:48, 4.36s/it] {'loss': 0.131, 'grad_norm': 1.1648667864675992, 'learning_rate': 5.36745840878608e-06, 'epoch': 0.49} 49%|████▉ | 16875/34278 [18:38:57<21:03:48, 4.36s/it] 49%|████▉ | 16876/34278 [18:39:03<23:26:10, 
4.85s/it] {'loss': 0.1326, 'grad_norm': 1.0416485002240208, 'learning_rate': 5.366987250532266e-06, 'epoch': 0.49} 49%|████▉ | 16876/34278 [18:39:03<23:26:10, 4.85s/it] 49%|████▉ | 16877/34278 [18:39:08<22:46:32, 4.71s/it] {'loss': 0.1522, 'grad_norm': 0.8748062798784032, 'learning_rate': 5.36651608900208e-06, 'epoch': 0.49} 49%|████▉ | 16877/34278 [18:39:08<22:46:32, 4.71s/it] 49%|████▉ | 16878/34278 [18:39:11<20:33:58, 4.26s/it] {'loss': 0.1286, 'grad_norm': 0.8007945767833401, 'learning_rate': 5.366044924199725e-06, 'epoch': 0.49} 49%|████▉ | 16878/34278 [18:39:11<20:33:58, 4.26s/it] 49%|████▉ | 16879/34278 [18:39:17<23:07:03, 4.78s/it] {'loss': 0.1408, 'grad_norm': 0.8482147388626303, 'learning_rate': 5.365573756129406e-06, 'epoch': 0.49} 49%|████▉ | 16879/34278 [18:39:17<23:07:03, 4.78s/it] 49%|████▉ | 16880/34278 [18:39:20<20:50:44, 4.31s/it] {'loss': 0.1285, 'grad_norm': 0.967644841804688, 'learning_rate': 5.365102584795334e-06, 'epoch': 0.49} 49%|████▉ | 16880/34278 [18:39:20<20:50:44, 4.31s/it] 49%|████▉ | 16881/34278 [18:39:23<19:11:43, 3.97s/it] {'loss': 0.1251, 'grad_norm': 0.8995162241486194, 'learning_rate': 5.364631410201713e-06, 'epoch': 0.49} 49%|████▉ | 16881/34278 [18:39:23<19:11:43, 3.97s/it] 49%|████▉ | 16882/34278 [18:39:28<19:47:16, 4.10s/it] {'loss': 0.1156, 'grad_norm': 0.8415421441677097, 'learning_rate': 5.364160232352749e-06, 'epoch': 0.49} 49%|████▉ | 16882/34278 [18:39:28<19:47:16, 4.10s/it] 49%|████▉ | 16883/34278 [18:39:32<19:23:32, 4.01s/it] {'loss': 0.1447, 'grad_norm': 1.323591753648415, 'learning_rate': 5.363689051252651e-06, 'epoch': 0.49} 49%|████▉ | 16883/34278 [18:39:32<19:23:32, 4.01s/it] 49%|████▉ | 16884/34278 [18:39:35<17:56:23, 3.71s/it] {'loss': 0.15, 'grad_norm': 0.9582912145627567, 'learning_rate': 5.363217866905622e-06, 'epoch': 0.49} 49%|████▉ | 16884/34278 [18:39:35<17:56:23, 3.71s/it] 49%|████▉ | 16885/34278 [18:39:38<17:20:10, 3.59s/it] {'loss': 0.1415, 'grad_norm': 1.0225435535678218, 'learning_rate': 
5.362746679315872e-06, 'epoch': 0.49} 49%|████▉ | 16885/34278 [18:39:38<17:20:10, 3.59s/it] 49%|████▉ | 16886/34278 [18:39:43<18:56:01, 3.92s/it] {'loss': 0.1691, 'grad_norm': 0.9494187747788593, 'learning_rate': 5.362275488487606e-06, 'epoch': 0.49} 49%|████▉ | 16886/34278 [18:39:43<18:56:01, 3.92s/it] 49%|████▉ | 16887/34278 [18:39:46<17:57:22, 3.72s/it] {'loss': 0.1336, 'grad_norm': 0.8646619701138887, 'learning_rate': 5.361804294425031e-06, 'epoch': 0.49} 49%|████▉ | 16887/34278 [18:39:46<17:57:22, 3.72s/it] 49%|████▉ | 16888/34278 [18:39:49<17:03:07, 3.53s/it] {'loss': 0.1152, 'grad_norm': 1.014770621302196, 'learning_rate': 5.361333097132353e-06, 'epoch': 0.49} 49%|████▉ | 16888/34278 [18:39:49<17:03:07, 3.53s/it] 49%|████▉ | 16889/34278 [18:39:52<15:52:24, 3.29s/it] {'loss': 0.1257, 'grad_norm': 0.8472959034349283, 'learning_rate': 5.360861896613779e-06, 'epoch': 0.49} 49%|████▉ | 16889/34278 [18:39:52<15:52:24, 3.29s/it] 49%|████▉ | 16890/34278 [18:39:55<16:19:19, 3.38s/it] {'loss': 0.1325, 'grad_norm': 0.9647487125299989, 'learning_rate': 5.360390692873518e-06, 'epoch': 0.49} 49%|████▉ | 16890/34278 [18:39:55<16:19:19, 3.38s/it] 49%|████▉ | 16891/34278 [18:39:59<16:23:00, 3.39s/it] {'loss': 0.1395, 'grad_norm': 0.9419671715760097, 'learning_rate': 5.3599194859157735e-06, 'epoch': 0.49} 49%|████▉ | 16891/34278 [18:39:59<16:23:00, 3.39s/it] 49%|████▉ | 16892/34278 [18:40:02<16:26:52, 3.41s/it] {'loss': 0.1413, 'grad_norm': 0.7140778222154549, 'learning_rate': 5.359448275744755e-06, 'epoch': 0.49} 49%|████▉ | 16892/34278 [18:40:02<16:26:52, 3.41s/it] 49%|████▉ | 16893/34278 [18:40:05<16:12:02, 3.35s/it] {'loss': 0.1369, 'grad_norm': 0.7519770736931981, 'learning_rate': 5.358977062364666e-06, 'epoch': 0.49} 49%|████▉ | 16893/34278 [18:40:05<16:12:02, 3.35s/it] 49%|████▉ | 16894/34278 [18:40:08<15:43:10, 3.26s/it] {'loss': 0.1402, 'grad_norm': 0.8270270530223526, 'learning_rate': 5.358505845779717e-06, 'epoch': 0.49} 49%|████▉ | 16894/34278 [18:40:08<15:43:10, 
3.26s/it] 49%|████▉ | 16895/34278 [18:40:11<15:12:49, 3.15s/it] {'loss': 0.1514, 'grad_norm': 0.7663369407857998, 'learning_rate': 5.358034625994113e-06, 'epoch': 0.49} 49%|████▉ | 16895/34278 [18:40:11<15:12:49, 3.15s/it] 49%|████▉ | 16896/34278 [18:40:15<15:22:48, 3.19s/it] {'loss': 0.1077, 'grad_norm': 0.5966561564754324, 'learning_rate': 5.357563403012061e-06, 'epoch': 0.49} 49%|████▉ | 16896/34278 [18:40:15<15:22:48, 3.19s/it] 49%|████▉ | 16897/34278 [18:40:21<19:19:23, 4.00s/it] {'loss': 0.1258, 'grad_norm': 0.7748324464328683, 'learning_rate': 5.357092176837769e-06, 'epoch': 0.49} 49%|████▉ | 16897/34278 [18:40:21<19:19:23, 4.00s/it] 49%|████▉ | 16898/34278 [18:40:24<18:23:01, 3.81s/it] {'loss': 0.1422, 'grad_norm': 0.8450404382189073, 'learning_rate': 5.3566209474754425e-06, 'epoch': 0.49} 49%|████▉ | 16898/34278 [18:40:24<18:23:01, 3.81s/it] 49%|████▉ | 16899/34278 [18:40:29<20:35:31, 4.27s/it] {'loss': 0.1176, 'grad_norm': 0.8087937305077653, 'learning_rate': 5.356149714929291e-06, 'epoch': 0.49} 49%|████▉ | 16899/34278 [18:40:29<20:35:31, 4.27s/it] 49%|████▉ | 16900/34278 [18:40:33<19:19:24, 4.00s/it] {'loss': 0.1003, 'grad_norm': 0.7299956387668687, 'learning_rate': 5.355678479203518e-06, 'epoch': 0.49} 49%|████▉ | 16900/34278 [18:40:33<19:19:24, 4.00s/it] 49%|████▉ | 16901/34278 [18:40:37<19:32:04, 4.05s/it] {'loss': 0.1476, 'grad_norm': 0.817078900020243, 'learning_rate': 5.355207240302332e-06, 'epoch': 0.49} 49%|████▉ | 16901/34278 [18:40:37<19:32:04, 4.05s/it] 49%|████▉ | 16902/34278 [18:40:40<18:00:59, 3.73s/it] {'loss': 0.1209, 'grad_norm': 0.7672277827493014, 'learning_rate': 5.354735998229943e-06, 'epoch': 0.49} 49%|████▉ | 16902/34278 [18:40:40<18:00:59, 3.73s/it] 49%|████▉ | 16903/34278 [18:40:44<18:18:55, 3.79s/it] {'loss': 0.1388, 'grad_norm': 0.7747143025480175, 'learning_rate': 5.354264752990553e-06, 'epoch': 0.49} 49%|████▉ | 16903/34278 [18:40:44<18:18:55, 3.79s/it] 49%|████▉ | 16904/34278 [18:40:47<17:46:14, 3.68s/it] {'loss': 0.1287, 
'grad_norm': 0.8244094445952639, 'learning_rate': 5.353793504588374e-06, 'epoch': 0.49} 49%|████▉ | 16904/34278 [18:40:47<17:46:14, 3.68s/it] 49%|████▉ | 16905/34278 [18:40:50<17:09:07, 3.55s/it] {'loss': 0.1216, 'grad_norm': 0.8267489229302534, 'learning_rate': 5.353322253027611e-06, 'epoch': 0.49} 49%|████▉ | 16905/34278 [18:40:50<17:09:07, 3.55s/it] 49%|████▉ | 16906/34278 [18:40:54<16:38:26, 3.45s/it] {'loss': 0.1241, 'grad_norm': 1.0800355112043782, 'learning_rate': 5.352850998312469e-06, 'epoch': 0.49} 49%|████▉ | 16906/34278 [18:40:54<16:38:26, 3.45s/it] 49%|████▉ | 16907/34278 [18:40:57<15:59:13, 3.31s/it] {'loss': 0.1577, 'grad_norm': 0.8004539660085748, 'learning_rate': 5.35237974044716e-06, 'epoch': 0.49} 49%|████▉ | 16907/34278 [18:40:57<15:59:13, 3.31s/it] 49%|████▉ | 16908/34278 [18:41:00<16:44:56, 3.47s/it] {'loss': 0.1247, 'grad_norm': 0.9291822275348421, 'learning_rate': 5.351908479435888e-06, 'epoch': 0.49} 49%|████▉ | 16908/34278 [18:41:00<16:44:56, 3.47s/it] 49%|████▉ | 16909/34278 [18:41:06<20:22:59, 4.22s/it] {'loss': 0.1318, 'grad_norm': 0.5909709718039137, 'learning_rate': 5.35143721528286e-06, 'epoch': 0.49} 49%|████▉ | 16909/34278 [18:41:06<20:22:59, 4.22s/it] 49%|████▉ | 16910/34278 [18:41:09<18:16:29, 3.79s/it] {'loss': 0.1248, 'grad_norm': 0.8182605410853571, 'learning_rate': 5.350965947992286e-06, 'epoch': 0.49} 49%|████▉ | 16910/34278 [18:41:09<18:16:29, 3.79s/it] 49%|████▉ | 16911/34278 [18:41:13<17:41:58, 3.67s/it] {'loss': 0.1479, 'grad_norm': 1.060945591380358, 'learning_rate': 5.350494677568371e-06, 'epoch': 0.49} 49%|████▉ | 16911/34278 [18:41:13<17:41:58, 3.67s/it] 49%|████▉ | 16912/34278 [18:41:16<17:34:16, 3.64s/it] {'loss': 0.1208, 'grad_norm': 0.7781852485598358, 'learning_rate': 5.350023404015323e-06, 'epoch': 0.49} 49%|████▉ | 16912/34278 [18:41:16<17:34:16, 3.64s/it] 49%|████▉ | 16913/34278 [18:41:21<19:06:30, 3.96s/it] {'loss': 0.1429, 'grad_norm': 0.897173862130397, 'learning_rate': 5.3495521273373504e-06, 'epoch': 
0.49} 49%|████▉ | 16913/34278 [18:41:21<19:06:30, 3.96s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
49%|████▉ | 16914/34278 [18:41:24<18:23:01, 3.81s/it] {'loss': 0.1225, 'grad_norm': 0.902893401242971, 'learning_rate': 5.349080847538659e-06, 'epoch': 0.49}
49%|████▉ | 16915/34278 [18:41:29<19:15:08, 3.99s/it] {'loss': 0.136, 'grad_norm': 1.0301174153869221, 'learning_rate': 5.348609564623458e-06, 'epoch': 0.49}
49%|████▉ | 16916/34278 [18:41:32<17:33:30, 3.64s/it] {'loss': 0.1451, 'grad_norm': 0.8247224773588024, 'learning_rate': 5.3481382785959536e-06, 'epoch': 0.49}
49%|████▉ | 16917/34278 [18:41:35<17:02:51, 3.54s/it] {'loss': 0.1282, 'grad_norm': 0.7293546646236825, 'learning_rate': 5.347666989460353e-06, 'epoch': 0.49}
49%|████▉ | 16918/34278 [18:41:38<16:38:34, 3.45s/it] {'loss': 0.13, 'grad_norm': 0.8207541196798567, 'learning_rate': 5.347195697220865e-06, 'epoch': 0.49}
49%|████▉ | 16919/34278 [18:41:43<18:15:01, 3.78s/it] {'loss': 0.1341, 'grad_norm': 0.8288687812887132, 'learning_rate': 5.346724401881697e-06, 'epoch': 0.49}
49%|████▉ | 16920/34278 [18:41:45<16:40:05, 3.46s/it] {'loss': 0.124, 'grad_norm': 0.8174263177651826, 'learning_rate': 5.346253103447058e-06, 'epoch': 0.49}
49%|████▉ | 16921/34278 [18:41:49<16:37:15, 3.45s/it] {'loss': 0.1357, 'grad_norm': 0.7612552679145954, 'learning_rate': 5.34578180192115e-06, 'epoch': 0.49}
49%|████▉ | 
16922/34278 [18:41:54<19:30:43, 4.05s/it] {'loss': 0.1326, 'grad_norm': 0.7953009668694182, 'learning_rate': 5.3453104973081884e-06, 'epoch': 0.49} 49%|████▉ | 16922/34278 [18:41:54<19:30:43, 4.05s/it] 49%|████▉ | 16923/34278 [18:41:58<18:49:30, 3.90s/it] {'loss': 0.1283, 'grad_norm': 0.6520145509816958, 'learning_rate': 5.344839189612375e-06, 'epoch': 0.49} 49%|████▉ | 16923/34278 [18:41:58<18:49:30, 3.90s/it] 49%|████▉ | 16924/34278 [18:42:02<18:42:25, 3.88s/it] {'loss': 0.1388, 'grad_norm': 0.7295993419099278, 'learning_rate': 5.3443678788379195e-06, 'epoch': 0.49} 49%|████▉ | 16924/34278 [18:42:02<18:42:25, 3.88s/it] 49%|████▉ | 16925/34278 [18:42:04<17:13:09, 3.57s/it] {'loss': 0.1231, 'grad_norm': 0.7165709618403117, 'learning_rate': 5.343896564989031e-06, 'epoch': 0.49} 49%|████▉ | 16925/34278 [18:42:04<17:13:09, 3.57s/it] 49%|████▉ | 16926/34278 [18:42:08<16:34:52, 3.44s/it] {'loss': 0.1229, 'grad_norm': 0.7751106265930863, 'learning_rate': 5.3434252480699154e-06, 'epoch': 0.49} 49%|████▉ | 16926/34278 [18:42:08<16:34:52, 3.44s/it] 49%|████▉ | 16927/34278 [18:42:10<15:47:24, 3.28s/it] {'loss': 0.1312, 'grad_norm': 0.872227451919471, 'learning_rate': 5.3429539280847805e-06, 'epoch': 0.49} 49%|████▉ | 16927/34278 [18:42:10<15:47:24, 3.28s/it] 49%|████▉ | 16928/34278 [18:42:14<16:13:58, 3.37s/it] {'loss': 0.1433, 'grad_norm': 0.8025408400035391, 'learning_rate': 5.3424826050378365e-06, 'epoch': 0.49} 49%|████▉ | 16928/34278 [18:42:14<16:13:58, 3.37s/it] 49%|████▉ | 16929/34278 [18:42:17<15:48:28, 3.28s/it] {'loss': 0.1432, 'grad_norm': 0.7289180481653799, 'learning_rate': 5.3420112789332875e-06, 'epoch': 0.49} 49%|████▉ | 16929/34278 [18:42:17<15:48:28, 3.28s/it] 49%|████▉ | 16930/34278 [18:42:20<15:39:00, 3.25s/it] {'loss': 0.1296, 'grad_norm': 0.9134284659715837, 'learning_rate': 5.341539949775345e-06, 'epoch': 0.49} 49%|████▉ | 16930/34278 [18:42:20<15:39:00, 3.25s/it] 49%|████▉ | 16931/34278 [18:42:24<16:05:01, 3.34s/it] {'loss': 0.1486, 'grad_norm': 
0.8583498510757264, 'learning_rate': 5.341068617568215e-06, 'epoch': 0.49} 49%|████▉ | 16931/34278 [18:42:24<16:05:01, 3.34s/it] 49%|████▉ | 16932/34278 [18:42:26<15:05:41, 3.13s/it] {'loss': 0.1292, 'grad_norm': 0.8446957679303193, 'learning_rate': 5.340597282316105e-06, 'epoch': 0.49} 49%|████▉ | 16932/34278 [18:42:27<15:05:41, 3.13s/it] 49%|████▉ | 16933/34278 [18:42:30<16:00:57, 3.32s/it] {'loss': 0.1138, 'grad_norm': 0.9423209681327167, 'learning_rate': 5.340125944023226e-06, 'epoch': 0.49} 49%|████▉ | 16933/34278 [18:42:30<16:00:57, 3.32s/it] 49%|████▉ | 16934/34278 [18:42:33<15:44:27, 3.27s/it] {'loss': 0.1379, 'grad_norm': 1.06153794678825, 'learning_rate': 5.339654602693781e-06, 'epoch': 0.49} 49%|████▉ | 16934/34278 [18:42:33<15:44:27, 3.27s/it] 49%|████▉ | 16935/34278 [18:42:36<15:28:05, 3.21s/it] {'loss': 0.118, 'grad_norm': 1.039660574620419, 'learning_rate': 5.339183258331983e-06, 'epoch': 0.49} 49%|████▉ | 16935/34278 [18:42:36<15:28:05, 3.21s/it] 49%|████▉ | 16936/34278 [18:42:40<15:51:51, 3.29s/it] {'loss': 0.1238, 'grad_norm': 0.9709315235538598, 'learning_rate': 5.338711910942036e-06, 'epoch': 0.49} 49%|████▉ | 16936/34278 [18:42:40<15:51:51, 3.29s/it] 49%|████▉ | 16937/34278 [18:42:43<15:32:08, 3.23s/it] {'loss': 0.1545, 'grad_norm': 1.068794208306307, 'learning_rate': 5.338240560528152e-06, 'epoch': 0.49} 49%|████▉ | 16937/34278 [18:42:43<15:32:08, 3.23s/it] 49%|████▉ | 16938/34278 [18:42:46<15:20:35, 3.19s/it] {'loss': 0.1354, 'grad_norm': 0.9457659930370551, 'learning_rate': 5.337769207094535e-06, 'epoch': 0.49} 49%|████▉ | 16938/34278 [18:42:46<15:20:35, 3.19s/it] 49%|████▉ | 16939/34278 [18:42:50<16:09:24, 3.35s/it] {'loss': 0.1494, 'grad_norm': 1.0797531203555015, 'learning_rate': 5.337297850645395e-06, 'epoch': 0.49} 49%|████▉ | 16939/34278 [18:42:50<16:09:24, 3.35s/it] 49%|████▉ | 16940/34278 [18:42:53<16:25:54, 3.41s/it] {'loss': 0.1304, 'grad_norm': 0.8296869908372474, 'learning_rate': 5.336826491184943e-06, 'epoch': 0.49} 49%|████▉ | 
16940/34278 [18:42:53<16:25:54, 3.41s/it] 49%|████▉ | 16941/34278 [18:42:59<20:02:53, 4.16s/it] {'loss': 0.1407, 'grad_norm': 0.9105037236438693, 'learning_rate': 5.336355128717382e-06, 'epoch': 0.49} 49%|████▉ | 16941/34278 [18:42:59<20:02:53, 4.16s/it] 49%|████▉ | 16942/34278 [18:43:02<18:22:58, 3.82s/it] {'loss': 0.1433, 'grad_norm': 1.0522028001869357, 'learning_rate': 5.335883763246924e-06, 'epoch': 0.49} 49%|████▉ | 16942/34278 [18:43:02<18:22:58, 3.82s/it] 49%|████▉ | 16943/34278 [18:43:09<21:53:46, 4.55s/it] {'loss': 0.158, 'grad_norm': 0.9423373371445346, 'learning_rate': 5.335412394777775e-06, 'epoch': 0.49} 49%|████▉ | 16943/34278 [18:43:09<21:53:46, 4.55s/it] 49%|████▉ | 16944/34278 [18:43:12<20:09:25, 4.19s/it] {'loss': 0.1376, 'grad_norm': 0.9413246687873199, 'learning_rate': 5.334941023314145e-06, 'epoch': 0.49} 49%|████▉ | 16944/34278 [18:43:12<20:09:25, 4.19s/it] 49%|████▉ | 16945/34278 [18:43:15<18:36:03, 3.86s/it] {'loss': 0.1611, 'grad_norm': 0.8574886498954905, 'learning_rate': 5.334469648860241e-06, 'epoch': 0.49} 49%|████▉ | 16945/34278 [18:43:15<18:36:03, 3.86s/it] 49%|████▉ | 16946/34278 [18:43:19<19:08:20, 3.98s/it] {'loss': 0.1194, 'grad_norm': 1.4212075923177745, 'learning_rate': 5.333998271420272e-06, 'epoch': 0.49} 49%|████▉ | 16946/34278 [18:43:19<19:08:20, 3.98s/it] 49%|████▉ | 16947/34278 [18:43:23<19:23:19, 4.03s/it] {'loss': 0.1482, 'grad_norm': 0.9849675768205657, 'learning_rate': 5.333526890998446e-06, 'epoch': 0.49} 49%|████▉ | 16947/34278 [18:43:24<19:23:19, 4.03s/it] 49%|████▉ | 16948/34278 [18:43:28<19:26:38, 4.04s/it] {'loss': 0.1518, 'grad_norm': 1.124274564036261, 'learning_rate': 5.333055507598971e-06, 'epoch': 0.49} 49%|████▉ | 16948/34278 [18:43:28<19:26:38, 4.04s/it] 49%|████▉ | 16949/34278 [18:43:31<19:07:40, 3.97s/it] {'loss': 0.1295, 'grad_norm': 0.8075400744977255, 'learning_rate': 5.332584121226057e-06, 'epoch': 0.49} 49%|████▉ | 16949/34278 [18:43:31<19:07:40, 3.97s/it] 49%|████▉ | 16950/34278 
[18:43:34<17:33:34, 3.65s/it] {'loss': 0.148, 'grad_norm': 1.019394901701943, 'learning_rate': 5.332112731883912e-06, 'epoch': 0.49} 49%|████▉ | 16950/34278 [18:43:34<17:33:34, 3.65s/it] 49%|████▉ | 16951/34278 [18:43:38<17:52:09, 3.71s/it] {'loss': 0.135, 'grad_norm': 0.9657542465799582, 'learning_rate': 5.3316413395767405e-06, 'epoch': 0.49} 49%|████▉ | 16951/34278 [18:43:38<17:52:09, 3.71s/it] 49%|████▉ | 16952/34278 [18:43:41<17:18:15, 3.60s/it] {'loss': 0.1252, 'grad_norm': 0.7827734704996198, 'learning_rate': 5.331169944308758e-06, 'epoch': 0.49} 49%|████▉ | 16952/34278 [18:43:41<17:18:15, 3.60s/it] 49%|████▉ | 16953/34278 [18:43:44<16:21:49, 3.40s/it] {'loss': 0.1169, 'grad_norm': 0.8204546421277574, 'learning_rate': 5.330698546084167e-06, 'epoch': 0.49} 49%|████▉ | 16953/34278 [18:43:44<16:21:49, 3.40s/it] 49%|████▉ | 16954/34278 [18:43:48<17:27:06, 3.63s/it] {'loss': 0.1158, 'grad_norm': 0.7606126873012299, 'learning_rate': 5.330227144907179e-06, 'epoch': 0.49} 49%|████▉ | 16954/34278 [18:43:49<17:27:06, 3.63s/it] 49%|████▉ | 16955/34278 [18:43:53<18:55:08, 3.93s/it] {'loss': 0.1444, 'grad_norm': 0.8045375654519857, 'learning_rate': 5.329755740782003e-06, 'epoch': 0.49} 49%|████▉ | 16955/34278 [18:43:53<18:55:08, 3.93s/it] 49%|████▉ | 16956/34278 [18:43:57<18:58:47, 3.94s/it] {'loss': 0.1402, 'grad_norm': 1.0550458230531574, 'learning_rate': 5.329284333712845e-06, 'epoch': 0.49} 49%|████▉ | 16956/34278 [18:43:57<18:58:47, 3.94s/it] 49%|████▉ | 16957/34278 [18:44:01<18:36:59, 3.87s/it] {'loss': 0.1318, 'grad_norm': 1.0603512795656371, 'learning_rate': 5.328812923703917e-06, 'epoch': 0.49} 49%|████▉ | 16957/34278 [18:44:01<18:36:59, 3.87s/it] 49%|████▉ | 16958/34278 [18:44:04<17:34:28, 3.65s/it] {'loss': 0.1174, 'grad_norm': 0.5955430122008735, 'learning_rate': 5.328341510759423e-06, 'epoch': 0.49} 49%|████▉ | 16958/34278 [18:44:04<17:34:28, 3.65s/it] 49%|████▉ | 16959/34278 [18:44:07<16:44:44, 3.48s/it] {'loss': 0.1311, 'grad_norm': 0.8398390339422245, 
'learning_rate': 5.327870094883576e-06, 'epoch': 0.49} 49%|████▉ | 16959/34278 [18:44:07<16:44:44, 3.48s/it] 49%|████▉ | 16960/34278 [18:44:10<15:53:33, 3.30s/it] {'loss': 0.1309, 'grad_norm': 1.1291956634852869, 'learning_rate': 5.327398676080583e-06, 'epoch': 0.49} 49%|████▉ | 16960/34278 [18:44:10<15:53:33, 3.30s/it] 49%|████▉ | 16961/34278 [18:44:13<15:58:05, 3.32s/it] {'loss': 0.162, 'grad_norm': 1.0516302051600936, 'learning_rate': 5.3269272543546524e-06, 'epoch': 0.49} 49%|████▉ | 16961/34278 [18:44:13<15:58:05, 3.32s/it] 49%|████▉ | 16962/34278 [18:44:17<15:50:36, 3.29s/it] {'loss': 0.1362, 'grad_norm': 0.7976172681207964, 'learning_rate': 5.3264558297099935e-06, 'epoch': 0.49} 49%|████▉ | 16962/34278 [18:44:17<15:50:36, 3.29s/it] 49%|████▉ | 16963/34278 [18:44:20<16:32:10, 3.44s/it] {'loss': 0.1407, 'grad_norm': 0.8877281464621941, 'learning_rate': 5.3259844021508145e-06, 'epoch': 0.49} 49%|████▉ | 16963/34278 [18:44:20<16:32:10, 3.44s/it] 49%|████▉ | 16964/34278 [18:44:23<16:06:36, 3.35s/it] {'loss': 0.1184, 'grad_norm': 0.7885792964523457, 'learning_rate': 5.325512971681325e-06, 'epoch': 0.49} 49%|████▉ | 16964/34278 [18:44:23<16:06:36, 3.35s/it] 49%|████▉ | 16965/34278 [18:44:26<15:39:01, 3.25s/it] {'loss': 0.1411, 'grad_norm': 0.8746196809306633, 'learning_rate': 5.325041538305734e-06, 'epoch': 0.49} 49%|████▉ | 16965/34278 [18:44:26<15:39:01, 3.25s/it] 49%|████▉ | 16966/34278 [18:44:30<16:27:37, 3.42s/it] {'loss': 0.161, 'grad_norm': 0.8485752718426207, 'learning_rate': 5.324570102028248e-06, 'epoch': 0.49} 49%|████▉ | 16966/34278 [18:44:30<16:27:37, 3.42s/it] 49%|████▉ | 16967/34278 [18:44:33<15:50:29, 3.29s/it] {'loss': 0.1362, 'grad_norm': 0.9043789611546547, 'learning_rate': 5.324098662853079e-06, 'epoch': 0.49} 49%|████▉ | 16967/34278 [18:44:33<15:50:29, 3.29s/it] 50%|████▉ | 16968/34278 [18:44:39<19:26:27, 4.04s/it] {'loss': 0.1237, 'grad_norm': 0.8806146882536067, 'learning_rate': 5.323627220784434e-06, 'epoch': 0.5} 50%|████▉ | 16968/34278 
[18:44:39<19:26:27, 4.04s/it]
50%|████▉ | 16969/34278 [18:44:42<18:21:40, 3.82s/it] {'loss': 0.1273, 'grad_norm': 0.8910099585858573, 'learning_rate': 5.3231557758265215e-06, 'epoch': 0.5}
50%|████▉ | 16970/34278 [18:44:46<17:47:34, 3.70s/it] {'loss': 0.1314, 'grad_norm': 0.691561728123018, 'learning_rate': 5.322684327983554e-06, 'epoch': 0.5}
50%|████▉ | 16971/34278 [18:44:49<17:46:21, 3.70s/it] {'loss': 0.1522, 'grad_norm': 0.7960444633372479, 'learning_rate': 5.3222128772597355e-06, 'epoch': 0.5}
50%|████▉ | 16972/34278 [18:44:55<21:03:11, 4.38s/it] {'loss': 0.1416, 'grad_norm': 0.9629556327673812, 'learning_rate': 5.321741423659279e-06, 'epoch': 0.5}
50%|████▉ | 16973/34278 [18:44:58<18:56:01, 3.94s/it] {'loss': 0.1217, 'grad_norm': 0.7045362984911328, 'learning_rate': 5.321269967186391e-06, 'epoch': 0.5}
50%|████▉ | 16974/34278 [18:45:02<18:15:06, 3.80s/it] {'loss': 0.1239, 'grad_norm': 0.742526438025149, 'learning_rate': 5.320798507845281e-06, 'epoch': 0.5}
50%|████▉ | 16975/34278 [18:45:07<20:17:52, 4.22s/it] {'loss': 0.1191, 'grad_norm': 0.7586203050264122, 'learning_rate': 5.320327045640159e-06, 'epoch': 0.5}
50%|████▉ | 16976/34278 [18:45:13<22:26:46, 4.67s/it] {'loss': 0.1335, 'grad_norm': 1.1978140322051605, 'learning_rate': 5.319855580575233e-06, 'epoch': 0.5}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
50%|████▉ | 16977/34278 [18:45:16<20:10:59, 4.20s/it] {'loss': 0.1341, 'grad_norm': 0.8545638705041876, 'learning_rate': 5.319384112654713e-06, 'epoch': 0.5}
50%|████▉ | 16978/34278 [18:45:19<18:27:57, 3.84s/it] {'loss': 0.1414, 'grad_norm': 0.8353652565520191, 'learning_rate': 5.318912641882809e-06, 'epoch': 0.5}
50%|████▉ | 16979/34278 [18:45:25<21:18:32, 4.43s/it] {'loss': 0.1399, 'grad_norm': 0.7460690829875166, 'learning_rate': 5.318441168263727e-06, 'epoch': 0.5}
50%|████▉ | 16980/34278 [18:45:28<19:48:02, 4.12s/it] {'loss': 0.1239, 'grad_norm': 0.8707347033873954, 'learning_rate': 5.317969691801681e-06, 'epoch': 0.5}
50%|████▉ | 16981/34278 [18:45:31<18:08:28, 3.78s/it] {'loss': 0.1288, 'grad_norm': 0.8130012177841824, 'learning_rate': 5.3174982125008745e-06, 'epoch': 0.5}
50%|████▉ | 16982/34278 [18:45:35<18:51:17, 3.92s/it] {'loss': 0.1311, 'grad_norm': 0.8792166690507628, 'learning_rate': 5.317026730365523e-06, 'epoch': 0.5}
50%|████▉ | 16983/34278 [18:45:39<18:18:45, 3.81s/it] {'loss': 0.1121, 'grad_norm': 0.8265517968595797, 'learning_rate': 5.31655524539983e-06, 'epoch': 0.5}
50%|████▉ | 16984/34278 [18:45:42<17:12:38, 3.58s/it] {'loss': 0.1495, 'grad_norm': 0.6604723081097706, 'learning_rate': 5.316083757608007e-06, 'epoch': 0.5}
50%|████▉ | 16985/34278 [18:45:45<16:57:23, 3.53s/it] {'loss': 0.1392, 'grad_norm': 0.947929565131271, 'learning_rate': 5.3156122669942665e-06, 'epoch': 0.5}
50%|████▉ | 16986/34278 [18:45:49<17:13:28, 3.59s/it]
{'loss': 0.1251, 'grad_norm': 0.8239737182678122, 'learning_rate': 5.3151407735628125e-06, 'epoch': 0.5}
50%|████▉ | 16987/34278 [18:45:52<16:33:40, 3.45s/it] {'loss': 0.1379, 'grad_norm': 1.0205966496438035, 'learning_rate': 5.314669277317858e-06, 'epoch': 0.5}
50%|████▉ | 16988/34278 [18:45:56<17:02:33, 3.55s/it] {'loss': 0.1248, 'grad_norm': 0.7194479213053977, 'learning_rate': 5.314197778263611e-06, 'epoch': 0.5}
50%|████▉ | 16989/34278 [18:45:59<15:48:10, 3.29s/it] {'loss': 0.1354, 'grad_norm': 1.1724484690623163, 'learning_rate': 5.313726276404281e-06, 'epoch': 0.5}
50%|████▉ | 16990/34278 [18:46:02<16:15:13, 3.38s/it] {'loss': 0.1389, 'grad_norm': 1.2049310165373772, 'learning_rate': 5.313254771744079e-06, 'epoch': 0.5}
50%|████▉ | 16991/34278 [18:46:06<16:44:50, 3.49s/it] {'loss': 0.1218, 'grad_norm': 0.700208096745436, 'learning_rate': 5.3127832642872116e-06, 'epoch': 0.5}
50%|████▉ | 16992/34278 [18:46:09<16:13:52, 3.38s/it] {'loss': 0.1475, 'grad_norm': 0.8755550331867326, 'learning_rate': 5.3123117540378895e-06, 'epoch': 0.5}
50%|████▉ | 16993/34278 [18:46:13<16:52:24, 3.51s/it] {'loss': 0.1163, 'grad_norm': 0.9273784916004748, 'learning_rate': 5.311840241000323e-06, 'epoch': 0.5}
50%|████▉ | 16994/34278 [18:46:16<16:21:03, 3.41s/it] {'loss': 0.1165, 'grad_norm': 0.9852128356665886, 'learning_rate': 5.311368725178723e-06, 'epoch': 0.5}
50%|████▉ | 16995/34278 [18:46:21<18:00:01, 3.75s/it] {'loss': 0.1326, 'grad_norm': 0.8172719672545941, 'learning_rate': 5.310897206577297e-06, 'epoch': 0.5}
50%|████▉ | 16996/34278 [18:46:24<17:50:52, 3.72s/it] {'loss': 0.1381, 'grad_norm': 0.8514774838543385, 'learning_rate': 5.310425685200252e-06, 'epoch': 0.5}
50%|████▉ | 16997/34278 [18:46:30<20:23:52, 4.25s/it] {'loss': 0.1313, 'grad_norm': 0.8554652494690688, 'learning_rate': 5.3099541610518046e-06, 'epoch': 0.5}
50%|████▉ | 16998/34278 [18:46:34<20:12:23, 4.21s/it] {'loss': 0.1306, 'grad_norm': 0.9276791529462564, 'learning_rate': 5.309482634136158e-06, 'epoch': 0.5}
50%|████▉ | 16999/34278 [18:46:37<19:20:59, 4.03s/it] {'loss': 0.1463, 'grad_norm': 1.096328000628962, 'learning_rate': 5.309011104457524e-06, 'epoch': 0.5}
50%|████▉ | 17000/34278 [18:46:41<18:20:14, 3.82s/it] {'loss': 0.1124, 'grad_norm': 0.8096521912923862, 'learning_rate': 5.3085395720201145e-06, 'epoch': 0.5}
50%|████▉ | 17001/34278 [18:46:47<21:40:40, 4.52s/it] {'loss': 0.1177, 'grad_norm': 0.8665191497573511, 'learning_rate': 5.308068036828137e-06, 'epoch': 0.5}
50%|████▉ | 17002/34278 [18:46:50<20:00:02, 4.17s/it] {'loss': 0.1255, 'grad_norm': 1.0612210196688785, 'learning_rate': 5.3075964988857995e-06, 'epoch': 0.5}
50%|████▉ | 17003/34278 [18:46:54<18:59:15, 3.96s/it] {'loss': 0.1372, 'grad_norm': 1.0490245799527906, 'learning_rate': 5.307124958197316e-06, 'epoch': 0.5}
50%|████▉ | 17004/34278 [18:46:57<17:55:18, 3.74s/it] {'loss': 0.138, 'grad_norm': 0.7495009661428474, 'learning_rate': 5.306653414766894e-06, 'epoch': 0.5}
50%|████▉ |
17005/34278 [18:47:01<17:58:48, 3.75s/it] {'loss': 0.1332, 'grad_norm': 0.7803730397205744, 'learning_rate': 5.306181868598742e-06, 'epoch': 0.5}
50%|████▉ | 17006/34278 [18:47:04<17:09:01, 3.57s/it] {'loss': 0.1249, 'grad_norm': 1.0449426882750648, 'learning_rate': 5.305710319697073e-06, 'epoch': 0.5}
50%|████▉ | 17007/34278 [18:47:07<16:41:04, 3.48s/it] {'loss': 0.1164, 'grad_norm': 0.8090543934474392, 'learning_rate': 5.3052387680660945e-06, 'epoch': 0.5}
50%|████▉ | 17008/34278 [18:47:11<16:28:26, 3.43s/it] {'loss': 0.1237, 'grad_norm': 0.8323158832759161, 'learning_rate': 5.304767213710017e-06, 'epoch': 0.5}
50%|████▉ | 17009/34278 [18:47:13<15:40:20, 3.27s/it] {'loss': 0.1461, 'grad_norm': 1.0577348037359635, 'learning_rate': 5.304295656633051e-06, 'epoch': 0.5}
50%|████▉ | 17010/34278 [18:47:17<15:36:07, 3.25s/it] {'loss': 0.1394, 'grad_norm': 0.9646279194872153, 'learning_rate': 5.303824096839407e-06, 'epoch': 0.5}
50%|████▉ | 17011/34278 [18:47:23<19:28:38, 4.06s/it] {'loss': 0.1401, 'grad_norm': 1.0613858884412792, 'learning_rate': 5.303352534333291e-06, 'epoch': 0.5}
50%|████▉ | 17012/34278 [18:47:26<17:50:50, 3.72s/it] {'loss': 0.1209, 'grad_norm': 1.0609249588291445, 'learning_rate': 5.30288096911892e-06, 'epoch': 0.5}
50%|████▉ | 17013/34278 [18:47:28<16:36:48, 3.46s/it] {'loss': 0.1429, 'grad_norm': 6.964887388903303, 'learning_rate': 5.302409401200497e-06, 'epoch': 0.5}
50%|████▉ | 17014/34278 [18:47:32<16:18:50, 3.40s/it] {'loss': 0.1268, 'grad_norm': 0.9194167504080002, 'learning_rate': 5.301937830582235e-06, 'epoch': 0.5}
50%|████▉ | 17015/34278 [18:47:35<16:01:15, 3.34s/it] {'loss': 0.1389, 'grad_norm': 0.8043827704686469, 'learning_rate': 5.301466257268346e-06, 'epoch': 0.5}
50%|████▉ | 17016/34278 [18:47:38<16:14:14, 3.39s/it] {'loss': 0.1403, 'grad_norm': 0.9819150658530034, 'learning_rate': 5.300994681263038e-06, 'epoch': 0.5}
50%|████▉ | 17017/34278 [18:47:43<18:26:37, 3.85s/it] {'loss': 0.1651, 'grad_norm': 0.7903323502090318, 'learning_rate': 5.3005231025705195e-06, 'epoch': 0.5}
50%|████▉ | 17018/34278 [18:47:46<17:27:07, 3.64s/it] {'loss': 0.1167, 'grad_norm': 0.8828242039006373, 'learning_rate': 5.300051521195004e-06, 'epoch': 0.5}
50%|████▉ | 17019/34278 [18:47:49<16:37:35, 3.47s/it] {'loss': 0.1254, 'grad_norm': 0.9808924137735425, 'learning_rate': 5.299579937140699e-06, 'epoch': 0.5}
50%|████▉ | 17020/34278 [18:47:53<16:34:16, 3.46s/it] {'loss': 0.1466, 'grad_norm': 0.8858867618245164, 'learning_rate': 5.299108350411817e-06, 'epoch': 0.5}
50%|████▉ | 17021/34278 [18:47:56<15:54:32, 3.32s/it] {'loss': 0.1251, 'grad_norm': 0.7861078381162786, 'learning_rate': 5.298636761012567e-06, 'epoch': 0.5}
50%|████▉ | 17022/34278 [18:48:00<16:40:12, 3.48s/it] {'loss': 0.1346, 'grad_norm': 0.9829584641559423, 'learning_rate': 5.298165168947158e-06, 'epoch': 0.5}
50%|████▉ | 17023/34278 [18:48:03<15:49:02, 3.30s/it] {'loss': 0.1492, 'grad_norm': 1.0145337046355396, 'learning_rate': 5.297693574219803e-06, 'epoch': 0.5}
50%|████▉ | 17023/34278
[18:48:03<15:49:02, 3.30s/it]
50%|████▉ | 17024/34278 [18:48:06<15:30:39, 3.24s/it] {'loss': 0.1304, 'grad_norm': 0.9425629232566466, 'learning_rate': 5.29722197683471e-06, 'epoch': 0.5}
50%|████▉ | 17025/34278 [18:48:09<16:03:34, 3.35s/it] {'loss': 0.1197, 'grad_norm': 0.9778512536435043, 'learning_rate': 5.296750376796092e-06, 'epoch': 0.5}
50%|████▉ | 17026/34278 [18:48:14<18:16:27, 3.81s/it] {'loss': 0.142, 'grad_norm': 0.8531332874595413, 'learning_rate': 5.296278774108154e-06, 'epoch': 0.5}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
50%|████▉ | 17027/34278 [18:48:17<17:25:10, 3.64s/it] {'loss': 0.1341, 'grad_norm': 0.9996692545825431, 'learning_rate': 5.295807168775113e-06, 'epoch': 0.5}
50%|████▉ | 17028/34278 [18:48:20<16:33:54, 3.46s/it] {'loss': 0.1363, 'grad_norm': 0.8120564563782053, 'learning_rate': 5.295335560801175e-06, 'epoch': 0.5}
50%|████▉ | 17029/34278 [18:48:26<19:39:51, 4.10s/it] {'loss': 0.1373, 'grad_norm': 0.7709973211616562, 'learning_rate': 5.294863950190551e-06, 'epoch': 0.5}
50%|████▉ | 17030/34278 [18:48:32<21:51:57, 4.56s/it] {'loss': 0.1236, 'grad_norm': 0.9389352398385432, 'learning_rate': 5.294392336947454e-06, 'epoch': 0.5}
50%|████▉ | 17031/34278 [18:48:35<19:41:56, 4.11s/it] {'loss': 0.1339, 'grad_norm': 0.8049604165370342, 'learning_rate': 5.29392072107609e-06, 'epoch': 0.5}
50%|████▉ | 17032/34278 [18:48:39<19:44:44, 4.12s/it] {'loss': 0.1257, 'grad_norm': 0.885934958894433, 'learning_rate': 5.293449102580674e-06, 'epoch': 0.5}
50%|████▉ | 17033/34278 [18:48:43<19:39:04, 4.10s/it] {'loss': 0.1383, 'grad_norm': 0.7998677185861385, 'learning_rate': 5.292977481465413e-06, 'epoch': 0.5}
50%|████▉ | 17034/34278 [18:48:47<18:50:52, 3.93s/it] {'loss': 0.1202, 'grad_norm': 0.9177388784485104, 'learning_rate': 5.292505857734519e-06, 'epoch': 0.5}
50%|████▉ | 17035/34278 [18:48:50<17:58:43, 3.75s/it] {'loss': 0.1467, 'grad_norm': 0.900434529486035, 'learning_rate': 5.292034231392204e-06, 'epoch': 0.5}
50%|████▉ | 17036/34278 [18:48:53<17:16:00, 3.61s/it] {'loss': 0.1487, 'grad_norm': 0.9666660790595324, 'learning_rate': 5.2915626024426755e-06, 'epoch': 0.5}
50%|████▉ | 17037/34278 [18:48:57<17:08:37, 3.58s/it] {'loss': 0.1446, 'grad_norm': 0.9791511838331073, 'learning_rate': 5.291090970890146e-06, 'epoch': 0.5}
50%|████▉ | 17038/34278 [18:49:02<20:20:02, 4.25s/it] {'loss': 0.1504, 'grad_norm': 0.852126728562316, 'learning_rate': 5.290619336738826e-06, 'epoch': 0.5}
50%|████▉ | 17039/34278 [18:49:06<19:41:14, 4.11s/it] {'loss': 0.1421, 'grad_norm': 0.9246946420062684, 'learning_rate': 5.290147699992926e-06, 'epoch': 0.5}
50%|████▉ | 17040/34278 [18:49:09<18:24:35, 3.84s/it] {'loss': 0.1361, 'grad_norm': 0.8825312853080106, 'learning_rate': 5.2896760606566576e-06, 'epoch': 0.5}
50%|████▉ | 17041/34278 [18:49:14<19:28:09, 4.07s/it] {'loss': 0.1192, 'grad_norm': 0.7683017994188074, 'learning_rate': 5.289204418734228e-06,
'epoch': 0.5}
50%|████▉ | 17042/34278 [18:49:17<17:29:51, 3.65s/it] {'loss': 0.1235, 'grad_norm': 0.745517946160222, 'learning_rate': 5.288732774229853e-06, 'epoch': 0.5}
50%|████▉ | 17043/34278 [18:49:21<17:53:14, 3.74s/it] {'loss': 0.1141, 'grad_norm': 0.7803483486077735, 'learning_rate': 5.28826112714774e-06, 'epoch': 0.5}
50%|████▉ | 17044/34278 [18:49:24<16:38:37, 3.48s/it] {'loss': 0.1528, 'grad_norm': 0.9824857045201814, 'learning_rate': 5.287789477492099e-06, 'epoch': 0.5}
50%|████▉ | 17045/34278 [18:49:27<15:56:25, 3.33s/it] {'loss': 0.1353, 'grad_norm': 0.7936125583236037, 'learning_rate': 5.287317825267146e-06, 'epoch': 0.5}
50%|████▉ | 17046/34278 [18:49:30<15:39:41, 3.27s/it] {'loss': 0.115, 'grad_norm': 0.8966151860414091, 'learning_rate': 5.286846170477085e-06, 'epoch': 0.5}
50%|████▉ | 17047/34278 [18:49:33<16:11:22, 3.38s/it] {'loss': 0.1411, 'grad_norm': 0.8581127729591428, 'learning_rate': 5.286374513126129e-06, 'epoch': 0.5}
50%|████▉ | 17048/34278 [18:49:37<16:22:55, 3.42s/it] {'loss': 0.1385, 'grad_norm': 0.6876296940895243, 'learning_rate': 5.285902853218492e-06, 'epoch': 0.5}
50%|████▉ | 17049/34278 [18:49:41<16:55:55, 3.54s/it] {'loss': 0.1414, 'grad_norm': 0.9251394566503488, 'learning_rate': 5.285431190758381e-06, 'epoch': 0.5}
50%|████▉ | 17050/34278 [18:49:44<16:31:27, 3.45s/it] {'loss': 0.12, 'grad_norm': 0.7486646186866653, 'learning_rate': 5.2849595257500085e-06, 'epoch': 0.5}
50%|████▉ | 17051/34278 [18:49:50<20:14:54, 4.23s/it] {'loss': 0.1267, 'grad_norm': 0.7712107945516644, 'learning_rate': 5.284487858197586e-06, 'epoch': 0.5}
50%|████▉ | 17052/34278 [18:49:53<18:31:10, 3.87s/it] {'loss': 0.1144, 'grad_norm': 0.8771844825510239, 'learning_rate': 5.284016188105324e-06, 'epoch': 0.5}
50%|████▉ | 17053/34278 [18:49:56<17:08:05, 3.58s/it] {'loss': 0.1428, 'grad_norm': 0.8092455419223078, 'learning_rate': 5.283544515477434e-06, 'epoch': 0.5}
50%|████▉ | 17054/34278 [18:50:00<17:36:57, 3.68s/it] {'loss': 0.155, 'grad_norm': 0.8932150374580071, 'learning_rate': 5.283072840318124e-06, 'epoch': 0.5}
50%|████▉ | 17055/34278 [18:50:03<16:55:29, 3.54s/it] {'loss': 0.1164, 'grad_norm': 0.6802444985614569, 'learning_rate': 5.282601162631609e-06, 'epoch': 0.5}
50%|████▉ | 17056/34278 [18:50:06<16:00:21, 3.35s/it] {'loss': 0.129, 'grad_norm': 0.6382936620776493, 'learning_rate': 5.282129482422097e-06, 'epoch': 0.5}
50%|████▉ | 17057/34278 [18:50:09<15:15:41, 3.19s/it] {'loss': 0.1318, 'grad_norm': 0.9247116257299203, 'learning_rate': 5.281657799693803e-06, 'epoch': 0.5}
50%|████▉ | 17058/34278 [18:50:14<18:53:56, 3.95s/it] {'loss': 0.1292, 'grad_norm': 0.728335394824729, 'learning_rate': 5.281186114450934e-06, 'epoch': 0.5}
50%|████▉ | 17059/34278 [18:50:20<21:16:29, 4.45s/it] {'loss': 0.1404, 'grad_norm': 0.713713454618646, 'learning_rate': 5.2807144266977e-06, 'epoch': 0.5}
50%|████▉ | 17060/34278 [18:50:23<18:51:28, 3.94s/it] {'loss': 0.1315, 'grad_norm': 0.8888424569177863, 'learning_rate':
5.280242736438318e-06, 'epoch': 0.5}
50%|████▉ | 17061/34278 [18:50:26<17:30:19, 3.66s/it] {'loss': 0.1312, 'grad_norm': 0.7007810331234429, 'learning_rate': 5.279771043676994e-06, 'epoch': 0.5}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f06ec804630>
Failed to fetch sample 3655704. Exception: cannot identify image file <_io.BytesIO object at 0x7f06ec804630>
50%|████▉ | 17062/34278 [18:50:29<16:34:20, 3.47s/it] {'loss': 0.1473, 'grad_norm': 0.7150189101992875, 'learning_rate': 5.2792993484179415e-06, 'epoch': 0.5}
50%|████▉ | 17063/34278 [18:50:35<20:11:40, 4.22s/it] {'loss': 0.1469, 'grad_norm': 0.8462435442992592, 'learning_rate': 5.27882765066537e-06, 'epoch': 0.5}
50%|████▉ | 17064/34278 [18:50:38<18:55:19, 3.96s/it] {'loss': 0.1433, 'grad_norm': 0.6706241051842962, 'learning_rate': 5.2783559504234926e-06, 'epoch': 0.5}
50%|████▉ | 17065/34278 [18:50:43<20:08:55, 4.21s/it] {'loss': 0.1388, 'grad_norm': 0.8418848267602387, 'learning_rate': 5.277884247696521e-06, 'epoch': 0.5}
50%|████▉ | 17066/34278 [18:50:49<22:53:53, 4.79s/it] {'loss': 0.1389, 'grad_norm': 0.9550010976560557, 'learning_rate': 5.277412542488664e-06, 'epoch': 0.5}
50%|████▉ | 17067/34278 [18:50:55<24:37:00, 5.15s/it] {'loss': 0.1343, 'grad_norm': 0.8098212287485531, 'learning_rate': 5.276940834804133e-06, 'epoch': 0.5}
50%|████▉ | 17068/34278 [18:50:59<22:17:50, 4.66s/it] {'loss': 0.1363, 'grad_norm': 0.9511043289148973, 'learning_rate': 5.276469124647141e-06, 'epoch': 0.5}
50%|████▉ | 17069/34278 [18:51:02<20:07:59, 4.21s/it] {'loss': 0.1337, 'grad_norm': 0.9632947494313432, 'learning_rate': 5.2759974120218995e-06, 'epoch': 0.5}
50%|████▉ | 17070/34278 [18:51:06<19:33:19, 4.09s/it] {'loss': 0.1213, 'grad_norm': 0.7655909900395728, 'learning_rate': 5.2755256969326195e-06, 'epoch': 0.5}
50%|████▉ |
17071/34278 [18:51:09<18:18:30, 3.83s/it] {'loss': 0.1251, 'grad_norm': 0.6625999224547507, 'learning_rate': 5.27505397938351e-06, 'epoch': 0.5}
50%|████▉ | 17072/34278 [18:51:12<17:53:28, 3.74s/it] {'loss': 0.1443, 'grad_norm': 0.9600395652719803, 'learning_rate': 5.274582259378785e-06, 'epoch': 0.5}
50%|████▉ | 17073/34278 [18:51:16<17:29:09, 3.66s/it] {'loss': 0.1217, 'grad_norm': 0.7291946285816626, 'learning_rate': 5.274110536922655e-06, 'epoch': 0.5}
50%|████▉ | 17074/34278 [18:51:21<19:41:46, 4.12s/it] {'loss': 0.1324, 'grad_norm': 0.7580019253967736, 'learning_rate': 5.273638812019331e-06, 'epoch': 0.5}
50%|████▉ | 17075/34278 [18:51:24<18:03:12, 3.78s/it] {'loss': 0.1233, 'grad_norm': 1.0467376295916913, 'learning_rate': 5.273167084673028e-06, 'epoch': 0.5}
50%|████▉ | 17076/34278 [18:51:30<20:35:59, 4.31s/it] {'loss': 0.1601, 'grad_norm': 0.7543399720664358, 'learning_rate': 5.272695354887951e-06, 'epoch': 0.5}
50%|████▉ | 17077/34278 [18:51:32<18:35:06, 3.89s/it] {'loss': 0.1113, 'grad_norm': 0.8197716089735835, 'learning_rate': 5.272223622668316e-06, 'epoch': 0.5}
50%|████▉ | 17078/34278 [18:51:38<21:04:11, 4.41s/it] {'loss': 0.1312, 'grad_norm': 0.7635338222701459, 'learning_rate': 5.271751888018335e-06, 'epoch': 0.5}
50%|████▉ | 17079/34278 [18:51:43<21:06:24, 4.42s/it] {'loss': 0.1398, 'grad_norm': 0.8843088300732025, 'learning_rate': 5.271280150942217e-06, 'epoch': 0.5}
50%|████▉ | 17080/34278 [18:51:45<18:59:53, 3.98s/it] {'loss': 0.1568, 'grad_norm': 0.8341103549713862, 'learning_rate': 5.270808411444174e-06, 'epoch': 0.5}
50%|████▉ | 17081/34278 [18:51:51<21:58:01, 4.60s/it] {'loss': 0.1426, 'grad_norm': 0.8561214965639796, 'learning_rate': 5.270336669528417e-06, 'epoch': 0.5}
50%|████▉ | 17082/34278 [18:51:54<19:13:11, 4.02s/it] {'loss': 0.1328, 'grad_norm': 0.8894287839903012, 'learning_rate': 5.269864925199161e-06, 'epoch': 0.5}
50%|████▉ | 17083/34278 [18:51:58<18:32:49, 3.88s/it] {'loss': 0.1541, 'grad_norm': 0.8532956943212627, 'learning_rate': 5.269393178460614e-06, 'epoch': 0.5}
50%|████▉ | 17084/34278 [18:52:03<20:28:47, 4.29s/it] {'loss': 0.1393, 'grad_norm': 0.6599364125191604, 'learning_rate': 5.2689214293169896e-06, 'epoch': 0.5}
50%|████▉ | 17085/34278 [18:52:06<19:17:41, 4.04s/it] {'loss': 0.1133, 'grad_norm': 0.8397851924690483, 'learning_rate': 5.268449677772499e-06, 'epoch': 0.5}
50%|████▉ | 17086/34278 [18:52:10<17:58:08, 3.76s/it] {'loss': 0.1353, 'grad_norm': 0.9352705543099153, 'learning_rate': 5.267977923831354e-06, 'epoch': 0.5}
50%|████▉ | 17087/34278 [18:52:13<17:12:56, 3.61s/it] {'loss': 0.1202, 'grad_norm': 0.8282603629619283, 'learning_rate': 5.2675061674977665e-06, 'epoch': 0.5}
50%|████▉ | 17088/34278 [18:52:16<17:21:36, 3.64s/it] {'loss': 0.1569, 'grad_norm': 0.9538341436559521, 'learning_rate': 5.2670344087759466e-06, 'epoch': 0.5}
50%|████▉ | 17089/34278 [18:52:21<18:34:52, 3.89s/it] {'loss': 0.1534, 'grad_norm': 0.9729144870371288, 'learning_rate': 5.266562647670107e-06, 'epoch': 0.5}
50%|████▉ | 17089/34278
[18:52:21<18:34:52, 3.89s/it]
50%|████▉ | 17090/34278 [18:52:27<21:47:42, 4.56s/it] {'loss': 0.137, 'grad_norm': 0.798530296346233, 'learning_rate': 5.266090884184462e-06, 'epoch': 0.5}
50%|████▉ | 17091/34278 [18:52:30<19:33:34, 4.10s/it] {'loss': 0.1275, 'grad_norm': 0.6972318621149772, 'learning_rate': 5.265619118323218e-06, 'epoch': 0.5}
50%|████▉ | 17092/34278 [18:52:33<17:50:36, 3.74s/it] {'loss': 0.1149, 'grad_norm': 0.9491328908344574, 'learning_rate': 5.2651473500905925e-06, 'epoch': 0.5}
50%|████▉ | 17093/34278 [18:52:38<18:59:36, 3.98s/it] {'loss': 0.1386, 'grad_norm': 0.9503833956624851, 'learning_rate': 5.264675579490793e-06, 'epoch': 0.5}
50%|████▉ | 17094/34278 [18:52:42<19:13:52, 4.03s/it] {'loss': 0.1171, 'grad_norm': 0.5601770519055654, 'learning_rate': 5.264203806528034e-06, 'epoch': 0.5}
50%|████▉ | 17095/34278 [18:52:45<18:12:01, 3.81s/it] {'loss': 0.1451, 'grad_norm': 1.0063804905034068, 'learning_rate': 5.263732031206527e-06, 'epoch': 0.5}
50%|████▉ | 17096/34278 [18:52:51<21:39:33, 4.54s/it] {'loss': 0.1304, 'grad_norm': 0.9061470755468986, 'learning_rate': 5.263260253530482e-06, 'epoch': 0.5}
50%|████▉ | 17097/34278 [18:52:55<20:31:08, 4.30s/it] {'loss': 0.1301, 'grad_norm': 0.7608954569541991, 'learning_rate': 5.262788473504112e-06, 'epoch': 0.5}
50%|████▉ | 17098/34278 [18:52:58<19:17:53, 4.04s/it] {'loss': 0.1284, 'grad_norm': 0.6842391012230838, 'learning_rate': 5.262316691131631e-06, 'epoch': 0.5}
50%|████▉ | 17099/34278 [18:53:01<17:45:07, 3.72s/it] {'loss': 0.124, 'grad_norm': 0.8127145270545167, 'learning_rate': 5.261844906417249e-06, 'epoch': 0.5}
50%|████▉ | 17100/34278 [18:53:04<16:47:55, 3.52s/it] {'loss': 0.1242, 'grad_norm': 0.9134184916856534, 'learning_rate': 5.261373119365176e-06, 'epoch': 0.5}
50%|████▉ | 17101/34278 [18:53:10<20:11:36, 4.23s/it] {'loss': 0.1367, 'grad_norm': 0.809696198233572, 'learning_rate': 5.260901329979628e-06, 'epoch': 0.5}
50%|████▉ | 17102/34278 [18:53:13<18:31:15, 3.88s/it] {'loss': 0.1291, 'grad_norm': 0.7523160407219962, 'learning_rate': 5.260429538264816e-06, 'epoch': 0.5}
50%|████▉ | 17103/34278 [18:53:16<17:01:44, 3.57s/it] {'loss': 0.1219, 'grad_norm': 0.7269267040884179, 'learning_rate': 5.2599577442249496e-06, 'epoch': 0.5}
50%|████▉ | 17104/34278 [18:53:22<20:35:02, 4.31s/it] {'loss': 0.1026, 'grad_norm': 0.7285468407026217, 'learning_rate': 5.259485947864242e-06, 'epoch': 0.5}
50%|████▉ | 17105/34278 [18:53:27<21:18:26, 4.47s/it] {'loss': 0.1585, 'grad_norm': 0.9600386351774044, 'learning_rate': 5.259014149186908e-06, 'epoch': 0.5}
50%|████▉ | 17106/34278 [18:53:30<19:09:29, 4.02s/it] {'loss': 0.1416, 'grad_norm': 0.7941039515535414, 'learning_rate': 5.258542348197157e-06, 'epoch': 0.5}
50%|████▉ | 17107/34278 [18:53:34<18:17:41, 3.84s/it] {'loss': 0.1206, 'grad_norm': 0.8670383025657609, 'learning_rate': 5.258070544899201e-06, 'epoch': 0.5}
50%|████▉ | 17108/34278 [18:53:37<17:43:19, 3.72s/it] {'loss': 0.1621, 'grad_norm': 0.8872511624243985, 'learning_rate': 5.257598739297253e-06, 'epoch':
0.5}
50%|████▉ | 17109/34278 [18:53:40<16:57:20, 3.56s/it] {'loss': 0.1462, 'grad_norm': 1.6453368524010874, 'learning_rate': 5.257126931395524e-06, 'epoch': 0.5}
50%|████▉ | 17110/34278 [18:53:43<16:08:49, 3.39s/it] {'loss': 0.1182, 'grad_norm': 0.9544847583473068, 'learning_rate': 5.256655121198229e-06, 'epoch': 0.5}
50%|████▉ | 17111/34278 [18:53:46<15:44:05, 3.30s/it] {'loss': 0.1488, 'grad_norm': 0.9484962867566957, 'learning_rate': 5.256183308709577e-06, 'epoch': 0.5}
50%|████▉ | 17112/34278 [18:53:49<15:28:20, 3.24s/it] {'loss': 0.1282, 'grad_norm': 0.8161254533832683, 'learning_rate': 5.255711493933781e-06, 'epoch': 0.5}
50%|████▉ | 17113/34278 [18:53:52<14:37:45, 3.07s/it] {'loss': 0.1635, 'grad_norm': 1.0173781251841534, 'learning_rate': 5.255239676875055e-06, 'epoch': 0.5}
50%|████▉ | 17114/34278 [18:53:56<15:31:11, 3.26s/it] {'loss': 0.1376, 'grad_norm': 0.9580842003699206, 'learning_rate': 5.254767857537611e-06, 'epoch': 0.5}
50%|████▉ | 17115/34278 [18:53:59<15:31:09, 3.26s/it] {'loss': 0.1143, 'grad_norm': 0.7094881150961431, 'learning_rate': 5.254296035925658e-06, 'epoch': 0.5}
50%|████▉ | 17116/34278 [18:54:02<15:15:03, 3.20s/it] {'loss': 0.1381, 'grad_norm': 1.0444373873103063, 'learning_rate': 5.253824212043411e-06, 'epoch': 0.5}
50%|████▉ | 17117/34278 [18:54:05<15:00:00, 3.15s/it] {'loss': 0.1466, 'grad_norm': 1.245752921310101, 'learning_rate': 5.253352385895085e-06, 'epoch': 0.5}
50%|████▉ | 17118/34278 [18:54:08<15:16:32, 3.20s/it] {'loss': 0.1134, 'grad_norm': 0.8277694308556317, 'learning_rate': 5.252880557484886e-06, 'epoch': 0.5}
50%|████▉ | 17119/34278 [18:54:12<15:19:10, 3.21s/it] {'loss': 0.1406, 'grad_norm': 0.6658102339743692, 'learning_rate': 5.252408726817031e-06, 'epoch': 0.5}
50%|████▉ | 17120/34278 [18:54:15<15:45:33, 3.31s/it] {'loss': 0.1202, 'grad_norm': 2.0027386387552615, 'learning_rate': 5.251936893895732e-06, 'epoch': 0.5}
50%|████▉ | 17121/34278 [18:54:19<15:52:57, 3.33s/it] {'loss': 0.1448, 'grad_norm': 1.0980822142946722, 'learning_rate': 5.251465058725198e-06, 'epoch': 0.5}
50%|████▉ | 17122/34278 [18:54:24<18:31:44, 3.89s/it] {'loss': 0.1389, 'grad_norm': 1.2828284325790056, 'learning_rate': 5.250993221309647e-06, 'epoch': 0.5}
50%|████▉ | 17123/34278 [18:54:27<17:18:02, 3.63s/it] {'loss': 0.1369, 'grad_norm': 0.7465328188197644, 'learning_rate': 5.250521381653287e-06, 'epoch': 0.5}
50%|████▉ | 17124/34278 [18:54:33<20:45:02, 4.35s/it] {'loss': 0.1179, 'grad_norm': 0.6626564926755844, 'learning_rate': 5.250049539760332e-06, 'epoch': 0.5}
50%|████▉ | 17125/34278 [18:54:36<19:35:07, 4.11s/it] {'loss': 0.143, 'grad_norm': 0.7576224810787092, 'learning_rate': 5.249577695634994e-06, 'epoch': 0.5}
50%|████▉ | 17126/34278 [18:54:42<21:45:50, 4.57s/it] {'loss': 0.119, 'grad_norm': 0.7086279516859726, 'learning_rate': 5.2491058492814875e-06, 'epoch': 0.5}
50%|████▉ | 17127/34278 [18:54:45<19:29:21, 4.09s/it] {'loss': 0.1602, 'grad_norm': 0.9251884333562526,
'learning_rate': 5.248634000704021e-06, 'epoch': 0.5} 50%|████▉ | 17127/34278 [18:54:45<19:29:21, 4.09s/it] 50%|████▉ | 17128/34278 [18:54:48<17:52:15, 3.75s/it] {'loss': 0.1381, 'grad_norm': 0.7102153188699789, 'learning_rate': 5.248162149906811e-06, 'epoch': 0.5} 50%|████▉ | 17128/34278 [18:54:48<17:52:15, 3.75s/it] 50%|████▉ | 17129/34278 [18:54:52<18:18:41, 3.84s/it] {'loss': 0.1293, 'grad_norm': 0.6747754949169598, 'learning_rate': 5.247690296894069e-06, 'epoch': 0.5} 50%|████▉ | 17129/34278 [18:54:52<18:18:41, 3.84s/it] 50%|████▉ | 17130/34278 [18:54:55<17:28:32, 3.67s/it] {'loss': 0.1292, 'grad_norm': 0.8786216402254168, 'learning_rate': 5.247218441670005e-06, 'epoch': 0.5} 50%|████▉ | 17130/34278 [18:54:55<17:28:32, 3.67s/it] 50%|████▉ | 17131/34278 [18:54:59<17:28:37, 3.67s/it] {'loss': 0.1353, 'grad_norm': 1.1623128843874437, 'learning_rate': 5.246746584238837e-06, 'epoch': 0.5} 50%|████▉ | 17131/34278 [18:54:59<17:28:37, 3.67s/it] 50%|████▉ | 17132/34278 [18:55:02<16:35:10, 3.48s/it] {'loss': 0.1321, 'grad_norm': 0.7744472521828143, 'learning_rate': 5.246274724604773e-06, 'epoch': 0.5} 50%|████▉ | 17132/34278 [18:55:02<16:35:10, 3.48s/it] 50%|████▉ | 17133/34278 [18:55:07<18:31:13, 3.89s/it] {'loss': 0.1452, 'grad_norm': 0.7848661400042279, 'learning_rate': 5.245802862772026e-06, 'epoch': 0.5} 50%|████▉ | 17133/34278 [18:55:07<18:31:13, 3.89s/it] 50%|████▉ | 17134/34278 [18:55:12<20:15:07, 4.25s/it] {'loss': 0.1612, 'grad_norm': 1.0082561941495158, 'learning_rate': 5.24533099874481e-06, 'epoch': 0.5} 50%|████▉ | 17134/34278 [18:55:12<20:15:07, 4.25s/it] 50%|████▉ | 17135/34278 [18:55:15<18:28:58, 3.88s/it] {'loss': 0.1464, 'grad_norm': 0.9243029079990752, 'learning_rate': 5.244859132527339e-06, 'epoch': 0.5} 50%|████▉ | 17135/34278 [18:55:15<18:28:58, 3.88s/it] 50%|████▉ | 17136/34278 [18:55:18<17:06:36, 3.59s/it] {'loss': 0.1561, 'grad_norm': 0.8644881692321109, 'learning_rate': 5.2443872641238215e-06, 'epoch': 0.5} 50%|████▉ | 17136/34278 
[18:55:18<17:06:36, 3.59s/it] 50%|████▉ | 17137/34278 [18:55:21<16:25:09, 3.45s/it] {'loss': 0.1301, 'grad_norm': 1.1887181921751124, 'learning_rate': 5.243915393538476e-06, 'epoch': 0.5} 50%|████▉ | 17137/34278 [18:55:21<16:25:09, 3.45s/it] 50%|████▉ | 17138/34278 [18:55:24<16:12:08, 3.40s/it] {'loss': 0.1285, 'grad_norm': 1.118070939766289, 'learning_rate': 5.2434435207755094e-06, 'epoch': 0.5} 50%|████▉ | 17138/34278 [18:55:24<16:12:08, 3.40s/it] 50%|█████ | 17139/34278 [18:55:30<19:50:38, 4.17s/it] {'loss': 0.1201, 'grad_norm': 0.9731561634333157, 'learning_rate': 5.242971645839139e-06, 'epoch': 0.5} 50%|█████ | 17139/34278 [18:55:30<19:50:38, 4.17s/it] 50%|█████ | 17140/34278 [18:55:33<18:19:07, 3.85s/it] {'loss': 0.1311, 'grad_norm': 1.3684619331189036, 'learning_rate': 5.242499768733574e-06, 'epoch': 0.5} 50%|█████ | 17140/34278 [18:55:33<18:19:07, 3.85s/it] 50%|█████ | 17141/34278 [18:55:36<16:36:19, 3.49s/it] {'loss': 0.1416, 'grad_norm': 1.0099767079129824, 'learning_rate': 5.24202788946303e-06, 'epoch': 0.5} 50%|█████ | 17141/34278 [18:55:36<16:36:19, 3.49s/it] 50%|█████ | 17142/34278 [18:55:40<17:51:38, 3.75s/it] {'loss': 0.1365, 'grad_norm': 1.035754821415772, 'learning_rate': 5.2415560080317184e-06, 'epoch': 0.5} 50%|█████ | 17142/34278 [18:55:40<17:51:38, 3.75s/it] 50%|█████ | 17143/34278 [18:55:43<16:24:50, 3.45s/it] {'loss': 0.1017, 'grad_norm': 0.7112097320370867, 'learning_rate': 5.241084124443854e-06, 'epoch': 0.5} 50%|█████ | 17143/34278 [18:55:43<16:24:50, 3.45s/it] 50%|█████ | 17144/34278 [18:55:46<16:24:26, 3.45s/it] {'loss': 0.1241, 'grad_norm': 0.8004353611466325, 'learning_rate': 5.240612238703646e-06, 'epoch': 0.5} 50%|█████ | 17144/34278 [18:55:46<16:24:26, 3.45s/it] 50%|█████ | 17145/34278 [18:55:50<15:54:35, 3.34s/it] {'loss': 0.1161, 'grad_norm': 1.120319803724126, 'learning_rate': 5.24014035081531e-06, 'epoch': 0.5} 50%|█████ | 17145/34278 [18:55:50<15:54:35, 3.34s/it] 50%|█████ | 17146/34278 [18:55:53<16:43:41, 3.52s/it] {'loss': 
0.1249, 'grad_norm': 0.8980134409796721, 'learning_rate': 5.239668460783059e-06, 'epoch': 0.5} 50%|█████ | 17146/34278 [18:55:54<16:43:41, 3.52s/it] 50%|█████ | 17147/34278 [18:55:59<19:09:25, 4.03s/it] {'loss': 0.1249, 'grad_norm': 0.7903064197778377, 'learning_rate': 5.239196568611105e-06, 'epoch': 0.5} 50%|█████ | 17147/34278 [18:55:59<19:09:25, 4.03s/it] 50%|█████ | 17148/34278 [18:56:02<18:01:19, 3.79s/it] {'loss': 0.1364, 'grad_norm': 0.8356499527549497, 'learning_rate': 5.2387246743036595e-06, 'epoch': 0.5} 50%|█████ | 17148/34278 [18:56:02<18:01:19, 3.79s/it] 50%|█████ | 17149/34278 [18:56:05<17:12:22, 3.62s/it] {'loss': 0.1397, 'grad_norm': 0.886421625560439, 'learning_rate': 5.238252777864938e-06, 'epoch': 0.5} 50%|█████ | 17149/34278 [18:56:05<17:12:22, 3.62s/it] 50%|█████ | 17150/34278 [18:56:08<16:20:08, 3.43s/it] {'loss': 0.1148, 'grad_norm': 0.6963085457001312, 'learning_rate': 5.237780879299155e-06, 'epoch': 0.5} 50%|█████ | 17150/34278 [18:56:08<16:20:08, 3.43s/it] 50%|█████ | 17151/34278 [18:56:14<19:34:46, 4.12s/it] {'loss': 0.139, 'grad_norm': 1.025842471472968, 'learning_rate': 5.237308978610517e-06, 'epoch': 0.5} 50%|█████ | 17151/34278 [18:56:14<19:34:46, 4.12s/it] 50%|█████ | 17152/34278 [18:56:17<18:14:06, 3.83s/it] {'loss': 0.1493, 'grad_norm': 0.794770099589364, 'learning_rate': 5.236837075803244e-06, 'epoch': 0.5} 50%|█████ | 17152/34278 [18:56:17<18:14:06, 3.83s/it] 50%|█████ | 17153/34278 [18:56:21<18:12:34, 3.83s/it] {'loss': 0.1188, 'grad_norm': 0.780344725681366, 'learning_rate': 5.236365170881545e-06, 'epoch': 0.5} 50%|█████ | 17153/34278 [18:56:21<18:12:34, 3.83s/it] 50%|█████ | 17154/34278 [18:56:24<17:32:17, 3.69s/it] {'loss': 0.127, 'grad_norm': 0.8627918229814081, 'learning_rate': 5.235893263849635e-06, 'epoch': 0.5} 50%|█████ | 17154/34278 [18:56:24<17:32:17, 3.69s/it] 50%|█████ | 17155/34278 [18:56:28<17:13:40, 3.62s/it] {'loss': 0.1147, 'grad_norm': 0.8956561634544297, 'learning_rate': 5.2354213547117246e-06, 'epoch': 0.5} 
50%|█████ | 17155/34278 [18:56:28<17:13:40, 3.62s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2295513e70>
Failed to fetch sample 2701719.
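The `UnidentifiedImageError` above does not stop the run: the dataset's `__getitem__` catches it, logs "Failed to fetch sample …", and training continues at the next step. A minimal sketch of that fetch-with-fallback pattern is below; the class and attribute names are illustrative, not the actual `aguvis/dataset.py` API.

```python
import random


class FallbackDataset:
    """Illustrative sketch (not the real aguvis dataset code): when
    decoding a sample raises, log the failure and retry with a random
    replacement index so one corrupt image cannot stop training."""

    def __init__(self, loaders, max_retries=10):
        self.loaders = loaders          # index -> callable returning a sample
        self.max_retries = max_retries  # bound on consecutive bad samples

    def _get_item(self, i):
        # May raise, e.g. PIL.UnidentifiedImageError on a corrupt image.
        return self.loaders[i]()

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}.")
                print(f"Exception: {e}")
                i = random.randrange(len(self.loaders))  # pick a substitute
        raise RuntimeError("too many consecutive corrupt samples")

    def __len__(self):
        return len(self.loaders)
```

The random substitute keeps batch shapes intact at the cost of a slight sampling bias toward clean data, which is usually acceptable when only a handful of samples out of millions are corrupt.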
Exception: cannot identify image file <_io.BytesIO object at 0x7f2295513e70> 50%|█████ | 17156/34278 [18:56:31<16:13:54, 3.41s/it] {'loss': 0.1633, 'grad_norm': 0.8951160083460781, 'learning_rate': 5.234949443472031e-06, 'epoch': 0.5} 50%|█████ | 17156/34278 [18:56:31<16:13:54, 3.41s/it] 50%|█████ | 17157/34278 [18:56:37<19:47:18, 4.16s/it] {'loss': 0.1147, 'grad_norm': 0.6589209539871892, 'learning_rate': 5.234477530134763e-06, 'epoch': 0.5} 50%|█████ | 17157/34278 [18:56:37<19:47:18, 4.16s/it] 50%|█████ | 17158/34278 [18:56:43<22:29:04, 4.73s/it] {'loss': 0.1418, 'grad_norm': 0.8054137191463541, 'learning_rate': 5.2340056147041356e-06, 'epoch': 0.5} 50%|█████ | 17158/34278 [18:56:43<22:29:04, 4.73s/it] 50%|█████ | 17159/34278 [18:56:46<20:10:58, 4.24s/it] {'loss': 0.113, 'grad_norm': 0.9100534464272424, 'learning_rate': 5.233533697184362e-06, 'epoch': 0.5} 50%|█████ | 17159/34278 [18:56:46<20:10:58, 4.24s/it] 50%|█████ | 17160/34278 [18:56:49<18:45:18, 3.94s/it] {'loss': 0.1306, 'grad_norm': 0.8610173686244597, 'learning_rate': 5.233061777579656e-06, 'epoch': 0.5} 50%|█████ | 17160/34278 [18:56:49<18:45:18, 3.94s/it] 50%|█████ | 17161/34278 [18:56:52<18:04:20, 3.80s/it] {'loss': 0.1144, 'grad_norm': 0.6738213603824865, 'learning_rate': 5.23258985589423e-06, 'epoch': 0.5} 50%|█████ | 17161/34278 [18:56:52<18:04:20, 3.80s/it] 50%|█████ | 17162/34278 [18:56:57<19:40:23, 4.14s/it] {'loss': 0.1318, 'grad_norm': 0.81525623459888, 'learning_rate': 5.232117932132298e-06, 'epoch': 0.5} 50%|█████ | 17162/34278 [18:56:57<19:40:23, 4.14s/it] 50%|█████ | 17163/34278 [18:57:00<18:12:15, 3.83s/it] {'loss': 0.1357, 'grad_norm': 0.7610579385189001, 'learning_rate': 5.23164600629807e-06, 'epoch': 0.5} 50%|█████ | 17163/34278 [18:57:00<18:12:15, 3.83s/it] 50%|█████ | 17164/34278 [18:57:04<17:38:29, 3.71s/it] {'loss': 0.1253, 'grad_norm': 0.7452739226198208, 'learning_rate': 5.231174078395763e-06, 'epoch': 0.5} 50%|█████ | 17164/34278 [18:57:04<17:38:29, 3.71s/it] 50%|█████ | 
17165/34278 [18:57:10<20:46:52, 4.37s/it] {'loss': 0.1226, 'grad_norm': 0.7060718897728941, 'learning_rate': 5.230702148429591e-06, 'epoch': 0.5} 50%|█████ | 17165/34278 [18:57:10<20:46:52, 4.37s/it] 50%|█████ | 17166/34278 [18:57:16<23:03:58, 4.85s/it] {'loss': 0.1507, 'grad_norm': 0.622535066895069, 'learning_rate': 5.230230216403762e-06, 'epoch': 0.5} 50%|█████ | 17166/34278 [18:57:16<23:03:58, 4.85s/it] 50%|█████ | 17167/34278 [18:57:19<21:28:12, 4.52s/it] {'loss': 0.1125, 'grad_norm': 0.8489303336404533, 'learning_rate': 5.2297582823224955e-06, 'epoch': 0.5} 50%|█████ | 17167/34278 [18:57:19<21:28:12, 4.52s/it] 50%|█████ | 17168/34278 [18:57:25<22:36:13, 4.76s/it] {'loss': 0.1231, 'grad_norm': 0.8558647579546741, 'learning_rate': 5.22928634619e-06, 'epoch': 0.5} 50%|█████ | 17168/34278 [18:57:25<22:36:13, 4.76s/it] 50%|█████ | 17169/34278 [18:57:28<20:58:40, 4.41s/it] {'loss': 0.1494, 'grad_norm': 0.6552461441578921, 'learning_rate': 5.228814408010492e-06, 'epoch': 0.5} 50%|█████ | 17169/34278 [18:57:28<20:58:40, 4.41s/it] 50%|█████ | 17170/34278 [18:57:33<20:39:29, 4.35s/it] {'loss': 0.1229, 'grad_norm': 0.7765841525541597, 'learning_rate': 5.228342467788182e-06, 'epoch': 0.5} 50%|█████ | 17170/34278 [18:57:33<20:39:29, 4.35s/it] 50%|█████ | 17171/34278 [18:57:36<18:36:23, 3.92s/it] {'loss': 0.1409, 'grad_norm': 0.9703386582263274, 'learning_rate': 5.2278705255272866e-06, 'epoch': 0.5} 50%|█████ | 17171/34278 [18:57:36<18:36:23, 3.92s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f216b9e5620>
Failed to fetch sample 3720966.
Exception: cannot identify image file <_io.BytesIO object at 0x7f216b9e5620>
50%|█████ | 17172/34278 [18:57:42<21:47:43, 4.59s/it] {'loss': 0.1373, 'grad_norm': 0.782969804611631, 'learning_rate': 5.227398581232016e-06, 'epoch': 0.5} 50%|█████ | 17172/34278 [18:57:42<21:47:43, 4.59s/it] 50%|█████ | 17173/34278 [18:57:44<19:07:20, 4.02s/it] {'loss': 0.1392, 'grad_norm': 0.7574228508574635, 'learning_rate': 5.226926634906586e-06, 'epoch': 0.5} 50%|█████ | 17173/34278 [18:57:44<19:07:20, 4.02s/it] 50%|█████ | 17174/34278 [18:57:48<18:03:06, 3.80s/it] {'loss': 0.1391, 'grad_norm': 0.7353146706328247, 'learning_rate': 5.226454686555209e-06, 'epoch': 0.5} 50%|█████ | 17174/34278 [18:57:48<18:03:06, 3.80s/it] 50%|█████ | 17175/34278 [18:57:51<16:58:37, 3.57s/it] {'loss': 0.1193, 'grad_norm': 0.8493344005529188, 'learning_rate': 5.225982736182099e-06, 'epoch': 0.5} 50%|█████ | 17175/34278 [18:57:51<16:58:37, 3.57s/it] 50%|█████ | 17176/34278 [18:57:54<16:58:23, 3.57s/it] {'loss': 0.1357, 'grad_norm': 0.7026209480147556, 'learning_rate': 5.2255107837914685e-06, 'epoch': 0.5} 50%|█████ | 17176/34278
[18:57:54<16:58:23, 3.57s/it] 50%|█████ | 17177/34278 [18:57:57<15:52:09, 3.34s/it] {'loss': 0.1389, 'grad_norm': 2.137751931106153, 'learning_rate': 5.225038829387533e-06, 'epoch': 0.5} 50%|█████ | 17177/34278 [18:57:57<15:52:09, 3.34s/it] 50%|█████ | 17178/34278 [18:58:01<16:01:47, 3.37s/it] {'loss': 0.1473, 'grad_norm': 0.7991677006618545, 'learning_rate': 5.224566872974502e-06, 'epoch': 0.5} 50%|█████ | 17178/34278 [18:58:01<16:01:47, 3.37s/it] 50%|█████ | 17179/34278 [18:58:04<16:44:22, 3.52s/it] {'loss': 0.1323, 'grad_norm': 0.674882459748988, 'learning_rate': 5.2240949145565935e-06, 'epoch': 0.5} 50%|█████ | 17179/34278 [18:58:04<16:44:22, 3.52s/it] 50%|█████ | 17180/34278 [18:58:07<15:47:02, 3.32s/it] {'loss': 0.1507, 'grad_norm': 0.8200767887070802, 'learning_rate': 5.22362295413802e-06, 'epoch': 0.5} 50%|█████ | 17180/34278 [18:58:07<15:47:02, 3.32s/it] 50%|█████ | 17181/34278 [18:58:11<16:09:33, 3.40s/it] {'loss': 0.1223, 'grad_norm': 0.9659816451627661, 'learning_rate': 5.223150991722992e-06, 'epoch': 0.5} 50%|█████ | 17181/34278 [18:58:11<16:09:33, 3.40s/it] 50%|█████ | 17182/34278 [18:58:14<15:34:02, 3.28s/it] {'loss': 0.156, 'grad_norm': 0.9508972043286074, 'learning_rate': 5.222679027315727e-06, 'epoch': 0.5} 50%|█████ | 17182/34278 [18:58:14<15:34:02, 3.28s/it] 50%|█████ | 17183/34278 [18:58:17<15:16:59, 3.22s/it] {'loss': 0.1317, 'grad_norm': 1.0595972555282738, 'learning_rate': 5.2222070609204355e-06, 'epoch': 0.5} 50%|█████ | 17183/34278 [18:58:17<15:16:59, 3.22s/it] 50%|█████ | 17184/34278 [18:58:20<14:56:05, 3.15s/it] {'loss': 0.1532, 'grad_norm': 0.8614280346225054, 'learning_rate': 5.221735092541332e-06, 'epoch': 0.5} 50%|█████ | 17184/34278 [18:58:20<14:56:05, 3.15s/it] 50%|█████ | 17185/34278 [18:58:24<16:56:50, 3.57s/it] {'loss': 0.117, 'grad_norm': 0.884064970889249, 'learning_rate': 5.2212631221826315e-06, 'epoch': 0.5} 50%|█████ | 17185/34278 [18:58:24<16:56:50, 3.57s/it] 50%|█████ | 17186/34278 [18:58:30<20:29:10, 4.31s/it] {'loss': 
0.128, 'grad_norm': 0.7387140232644068, 'learning_rate': 5.220791149848547e-06, 'epoch': 0.5} 50%|█████ | 17186/34278 [18:58:30<20:29:10, 4.31s/it] 50%|█████ | 17187/34278 [18:58:33<18:23:28, 3.87s/it] {'loss': 0.1153, 'grad_norm': 0.883379906728551, 'learning_rate': 5.22031917554329e-06, 'epoch': 0.5} 50%|█████ | 17187/34278 [18:58:33<18:23:28, 3.87s/it] 50%|█████ | 17188/34278 [18:58:37<17:46:15, 3.74s/it] {'loss': 0.1654, 'grad_norm': 0.8607105100624723, 'learning_rate': 5.219847199271078e-06, 'epoch': 0.5} 50%|█████ | 17188/34278 [18:58:37<17:46:15, 3.74s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (12637 > 8192). Running this sequence through the model will result in indexing errors 50%|█████ | 17189/34278 [18:58:41<18:54:04, 3.98s/it] {'loss': 0.1217, 'grad_norm': 0.8471029065213588, 'learning_rate': 5.219375221036122e-06, 'epoch': 0.5} 50%|█████ | 17189/34278 [18:58:41<18:54:04, 3.98s/it] 50%|█████ | 17190/34278 [18:58:46<20:03:32, 4.23s/it] {'loss': 0.1559, 'grad_norm': 0.9251528919789282, 'learning_rate': 5.218903240842635e-06, 'epoch': 0.5} 50%|█████ | 17190/34278 [18:58:46<20:03:32, 4.23s/it] 50%|█████ | 17191/34278 [18:58:49<18:29:30, 3.90s/it] {'loss': 0.1461, 'grad_norm': 0.7348551840984193, 'learning_rate': 5.218431258694833e-06, 'epoch': 0.5} 50%|█████ | 17191/34278 [18:58:49<18:29:30, 3.90s/it] 50%|█████ | 17192/34278 [18:58:52<16:44:22, 3.53s/it] {'loss': 0.1364, 'grad_norm': 0.8236715980687773, 'learning_rate': 5.217959274596931e-06, 'epoch': 0.5} 50%|█████ | 17192/34278 [18:58:52<16:44:22, 3.53s/it] 50%|█████ | 17193/34278 [18:58:55<15:55:34, 3.36s/it] {'loss': 0.1323, 'grad_norm': 1.0239782261428787, 'learning_rate': 5.217487288553138e-06, 'epoch': 0.5} 50%|█████ | 17193/34278 [18:58:55<15:55:34, 3.36s/it] 50%|█████ | 17194/34278 [18:59:01<19:37:22, 4.14s/it] {'loss': 0.1445, 'grad_norm': 1.0428771227201818, 'learning_rate': 5.2170153005676715e-06, 'epoch': 0.5} 50%|█████ | 17194/34278 
[18:59:01<19:37:22, 4.14s/it] 50%|█████ | 17195/34278 [18:59:07<21:51:42, 4.61s/it] {'loss': 0.1448, 'grad_norm': 0.8244842854108386, 'learning_rate': 5.216543310644745e-06, 'epoch': 0.5} 50%|█████ | 17195/34278 [18:59:07<21:51:42, 4.61s/it] 50%|█████ | 17196/34278 [18:59:12<23:23:50, 4.93s/it] {'loss': 0.1586, 'grad_norm': 1.141564332369589, 'learning_rate': 5.216071318788569e-06, 'epoch': 0.5} 50%|█████ | 17196/34278 [18:59:12<23:23:50, 4.93s/it] 50%|█████ | 17197/34278 [18:59:16<21:45:11, 4.58s/it] {'loss': 0.1437, 'grad_norm': 1.0340795327004673, 'learning_rate': 5.215599325003362e-06, 'epoch': 0.5} 50%|█████ | 17197/34278 [18:59:16<21:45:11, 4.58s/it] 50%|█████ | 17198/34278 [18:59:19<19:09:00, 4.04s/it] {'loss': 0.1219, 'grad_norm': 0.7727040580340079, 'learning_rate': 5.215127329293336e-06, 'epoch': 0.5} 50%|█████ | 17198/34278 [18:59:19<19:09:00, 4.04s/it] 50%|█████ | 17199/34278 [18:59:22<18:05:04, 3.81s/it] {'loss': 0.1405, 'grad_norm': 0.8878697654869425, 'learning_rate': 5.214655331662703e-06, 'epoch': 0.5} 50%|█████ | 17199/34278 [18:59:22<18:05:04, 3.81s/it] 50%|█████ | 17200/34278 [18:59:28<21:01:54, 4.43s/it] {'loss': 0.1363, 'grad_norm': 0.8413827455853727, 'learning_rate': 5.2141833321156785e-06, 'epoch': 0.5} 50%|█████ | 17200/34278 [18:59:28<21:01:54, 4.43s/it] 50%|█████ | 17201/34278 [18:59:34<23:19:24, 4.92s/it] {'loss': 0.133, 'grad_norm': 0.8404351792994844, 'learning_rate': 5.213711330656478e-06, 'epoch': 0.5} 50%|█████ | 17201/34278 [18:59:34<23:19:24, 4.92s/it] 50%|█████ | 17202/34278 [18:59:37<20:24:45, 4.30s/it] {'loss': 0.1466, 'grad_norm': 1.5971743469107753, 'learning_rate': 5.213239327289312e-06, 'epoch': 0.5} 50%|█████ | 17202/34278 [18:59:37<20:24:45, 4.30s/it] 50%|█████ | 17203/34278 [18:59:40<18:26:57, 3.89s/it] {'loss': 0.151, 'grad_norm': 0.8603876780371584, 'learning_rate': 5.212767322018397e-06, 'epoch': 0.5} 50%|█████ | 17203/34278 [18:59:40<18:26:57, 3.89s/it] 50%|█████ | 17204/34278 [18:59:43<17:33:20, 3.70s/it] {'loss': 
0.1923, 'grad_norm': 0.9672789957744976, 'learning_rate': 5.212295314847946e-06, 'epoch': 0.5} 50%|█████ | 17204/34278 [18:59:43<17:33:20, 3.70s/it] 50%|█████ | 17205/34278 [18:59:47<17:39:48, 3.72s/it] {'loss': 0.1469, 'grad_norm': 0.7749333811009574, 'learning_rate': 5.211823305782173e-06, 'epoch': 0.5} 50%|█████ | 17205/34278 [18:59:47<17:39:48, 3.72s/it] 50%|█████ | 17206/34278 [18:59:50<16:28:47, 3.48s/it] {'loss': 0.1061, 'grad_norm': 0.7871876121019177, 'learning_rate': 5.211351294825292e-06, 'epoch': 0.5} 50%|█████ | 17206/34278 [18:59:50<16:28:47, 3.48s/it] 50%|█████ | 17207/34278 [18:59:53<16:06:36, 3.40s/it] {'loss': 0.1079, 'grad_norm': 0.7937711212665466, 'learning_rate': 5.210879281981518e-06, 'epoch': 0.5} 50%|█████ | 17207/34278 [18:59:53<16:06:36, 3.40s/it] 50%|█████ | 17208/34278 [18:59:56<15:46:21, 3.33s/it] {'loss': 0.1367, 'grad_norm': 0.7134331140405762, 'learning_rate': 5.210407267255062e-06, 'epoch': 0.5} 50%|█████ | 17208/34278 [18:59:56<15:46:21, 3.33s/it] 50%|█████ | 17209/34278 [18:59:59<15:54:25, 3.35s/it] {'loss': 0.1272, 'grad_norm': 0.7591675522412211, 'learning_rate': 5.209935250650142e-06, 'epoch': 0.5} 50%|█████ | 17209/34278 [18:59:59<15:54:25, 3.35s/it] 50%|█████ | 17210/34278 [19:00:03<15:28:05, 3.26s/it] {'loss': 0.1512, 'grad_norm': 0.8322558863772237, 'learning_rate': 5.2094632321709705e-06, 'epoch': 0.5} 50%|█████ | 17210/34278 [19:00:03<15:28:05, 3.26s/it] 50%|█████ | 17211/34278 [19:00:05<14:48:35, 3.12s/it] {'loss': 0.1352, 'grad_norm': 0.754612848043372, 'learning_rate': 5.20899121182176e-06, 'epoch': 0.5} 50%|█████ | 17211/34278 [19:00:05<14:48:35, 3.12s/it] 50%|█████ | 17212/34278 [19:00:09<15:13:02, 3.21s/it] {'loss': 0.1252, 'grad_norm': 0.8021606134855779, 'learning_rate': 5.2085191896067265e-06, 'epoch': 0.5} 50%|█████ | 17212/34278 [19:00:09<15:13:02, 3.21s/it] 50%|█████ | 17213/34278 [19:00:12<15:30:53, 3.27s/it] {'loss': 0.1273, 'grad_norm': 0.950624928019119, 'learning_rate': 5.208047165530083e-06, 'epoch': 
0.5} 50%|█████ | 17213/34278 [19:00:12<15:30:53, 3.27s/it] 50%|█████ | 17214/34278 [19:00:15<15:22:33, 3.24s/it] {'loss': 0.1417, 'grad_norm': 0.768628499878225, 'learning_rate': 5.207575139596045e-06, 'epoch': 0.5} 50%|█████ | 17214/34278 [19:00:15<15:22:33, 3.24s/it] 50%|█████ | 17215/34278 [19:00:21<19:05:51, 4.03s/it] {'loss': 0.1371, 'grad_norm': 0.8059146562851579, 'learning_rate': 5.2071031118088255e-06, 'epoch': 0.5} 50%|█████ | 17215/34278 [19:00:21<19:05:51, 4.03s/it] 50%|█████ | 17216/34278 [19:00:26<20:08:56, 4.25s/it] {'loss': 0.143, 'grad_norm': 0.8437210934009688, 'learning_rate': 5.206631082172638e-06, 'epoch': 0.5} 50%|█████ | 17216/34278 [19:00:26<20:08:56, 4.25s/it] 50%|█████ | 17217/34278 [19:00:31<20:39:03, 4.36s/it] {'loss': 0.1444, 'grad_norm': 0.7986468902053767, 'learning_rate': 5.206159050691698e-06, 'epoch': 0.5} 50%|█████ | 17217/34278 [19:00:31<20:39:03, 4.36s/it] 50%|█████ | 17218/34278 [19:00:35<20:55:02, 4.41s/it] {'loss': 0.1155, 'grad_norm': 0.8032047586295259, 'learning_rate': 5.205687017370219e-06, 'epoch': 0.5} 50%|█████ | 17218/34278 [19:00:35<20:55:02, 4.41s/it] 50%|█████ | 17219/34278 [19:00:39<20:49:24, 4.39s/it] {'loss': 0.1363, 'grad_norm': 0.7235984459577574, 'learning_rate': 5.205214982212416e-06, 'epoch': 0.5} 50%|█████ | 17219/34278 [19:00:39<20:49:24, 4.39s/it] 50%|█████ | 17220/34278 [19:00:44<21:05:36, 4.45s/it] {'loss': 0.1315, 'grad_norm': 0.82779791602683, 'learning_rate': 5.204742945222502e-06, 'epoch': 0.5} 50%|█████ | 17220/34278 [19:00:44<21:05:36, 4.45s/it] 50%|█████ | 17221/34278 [19:00:48<20:47:38, 4.39s/it] {'loss': 0.1261, 'grad_norm': 0.9646787226862298, 'learning_rate': 5.204270906404692e-06, 'epoch': 0.5} 50%|█████ | 17221/34278 [19:00:48<20:47:38, 4.39s/it] 50%|█████ | 17222/34278 [19:00:51<19:02:55, 4.02s/it] {'loss': 0.1122, 'grad_norm': 0.7659089965180165, 'learning_rate': 5.203798865763201e-06, 'epoch': 0.5} 50%|█████ | 17222/34278 [19:00:51<19:02:55, 4.02s/it] 50%|█████ | 17223/34278 
[19:00:54<17:28:22, 3.69s/it] {'loss': 0.1357, 'grad_norm': 0.9048215158293285, 'learning_rate': 5.20332682330224e-06, 'epoch': 0.5} 50%|█████ | 17223/34278 [19:00:54<17:28:22, 3.69s/it] 50%|█████ | 17224/34278 [19:00:57<16:25:23, 3.47s/it] {'loss': 0.128, 'grad_norm': 1.0139816087897788, 'learning_rate': 5.202854779026028e-06, 'epoch': 0.5} 50%|█████ | 17224/34278 [19:00:57<16:25:23, 3.47s/it] 50%|█████ | 17225/34278 [19:01:01<16:26:58, 3.47s/it] {'loss': 0.1157, 'grad_norm': 0.8499543411608957, 'learning_rate': 5.202382732938777e-06, 'epoch': 0.5} 50%|█████ | 17225/34278 [19:01:01<16:26:58, 3.47s/it] 50%|█████ | 17226/34278 [19:01:04<15:47:50, 3.34s/it] {'loss': 0.1214, 'grad_norm': 0.8338528415467421, 'learning_rate': 5.201910685044699e-06, 'epoch': 0.5} 50%|█████ | 17226/34278 [19:01:04<15:47:50, 3.34s/it] 50%|█████ | 17227/34278 [19:01:07<15:07:29, 3.19s/it] {'loss': 0.1572, 'grad_norm': 0.8279495772859653, 'learning_rate': 5.201438635348013e-06, 'epoch': 0.5} 50%|█████ | 17227/34278 [19:01:07<15:07:29, 3.19s/it] 50%|█████ | 17228/34278 [19:01:10<14:54:50, 3.15s/it] {'loss': 0.1218, 'grad_norm': 1.0496619765438127, 'learning_rate': 5.20096658385293e-06, 'epoch': 0.5} 50%|█████ | 17228/34278 [19:01:10<14:54:50, 3.15s/it] 50%|█████ | 17229/34278 [19:01:14<16:42:06, 3.53s/it] {'loss': 0.1158, 'grad_norm': 0.7130666128350417, 'learning_rate': 5.2004945305636656e-06, 'epoch': 0.5} 50%|█████ | 17229/34278 [19:01:14<16:42:06, 3.53s/it] 50%|█████ | 17230/34278 [19:01:17<15:57:00, 3.37s/it] {'loss': 0.1284, 'grad_norm': 0.7951027918329903, 'learning_rate': 5.200022475484433e-06, 'epoch': 0.5} 50%|█████ | 17230/34278 [19:01:17<15:57:00, 3.37s/it] 50%|█████ | 17231/34278 [19:01:20<15:41:42, 3.31s/it] {'loss': 0.1598, 'grad_norm': 1.0339598951541478, 'learning_rate': 5.1995504186194476e-06, 'epoch': 0.5} 50%|█████ | 17231/34278 [19:01:20<15:41:42, 3.31s/it] 50%|█████ | 17232/34278 [19:01:23<15:14:54, 3.22s/it] {'loss': 0.1205, 'grad_norm': 0.8752915020425636, 
'learning_rate': 5.199078359972925e-06, 'epoch': 0.5} 50%|█████ | 17232/34278 [19:01:23<15:14:54, 3.22s/it] 50%|█████ | 17233/34278 [19:01:26<14:52:08, 3.14s/it] {'loss': 0.1383, 'grad_norm': 1.0506697388721178, 'learning_rate': 5.198606299549077e-06, 'epoch': 0.5} 50%|█████ | 17233/34278 [19:01:26<14:52:08, 3.14s/it] 50%|█████ | 17234/34278 [19:01:31<17:29:00, 3.69s/it] {'loss': 0.148, 'grad_norm': 1.0142581175250243, 'learning_rate': 5.198134237352121e-06, 'epoch': 0.5} 50%|█████ | 17234/34278 [19:01:31<17:29:00, 3.69s/it] 50%|█████ | 17235/34278 [19:01:34<16:24:16, 3.47s/it] {'loss': 0.1228, 'grad_norm': 0.8689653063950162, 'learning_rate': 5.1976621733862675e-06, 'epoch': 0.5} 50%|█████ | 17235/34278 [19:01:34<16:24:16, 3.47s/it] 50%|█████ | 17236/34278 [19:01:39<18:07:48, 3.83s/it] {'loss': 0.1437, 'grad_norm': 0.9116054998973984, 'learning_rate': 5.197190107655735e-06, 'epoch': 0.5} 50%|█████ | 17236/34278 [19:01:39<18:07:48, 3.83s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fbf2733fa10>
Failed to fetch sample 2762611.
Exception: cannot identify image file <_io.BytesIO object at 0x7fbf2733fa10> 50%|█████ | 17237/34278 [19:01:42<17:04:23, 3.61s/it] {'loss': 0.134, 'grad_norm': 1.1149824829170243, 'learning_rate': 5.196718040164737e-06, 'epoch': 0.5} 50%|█████ | 17237/34278 [19:01:42<17:04:23, 3.61s/it] 50%|█████ | 17238/34278 [19:01:45<16:55:53, 3.58s/it] {'loss': 0.1562, 'grad_norm': 1.1332287799316638, 'learning_rate': 5.196245970917485e-06, 'epoch': 0.5} 50%|█████ | 17238/34278 [19:01:45<16:55:53, 3.58s/it] 50%|█████ | 17239/34278 [19:01:51<20:23:21, 4.31s/it] {'loss': 0.1352, 'grad_norm': 0.8639495800020006, 'learning_rate': 5.195773899918196e-06, 'epoch': 0.5} 50%|█████ | 17239/34278 [19:01:51<20:23:21, 4.31s/it] 50%|█████ | 17240/34278 [19:01:55<19:10:27, 4.05s/it] {'loss': 0.1466, 'grad_norm': 0.8801817825528243, 'learning_rate': 5.195301827171086e-06, 'epoch': 0.5} 50%|█████ | 17240/34278 [19:01:55<19:10:27, 4.05s/it] 50%|█████ | 17241/34278 [19:01:58<18:07:30, 3.83s/it] {'loss': 0.1381, 'grad_norm': 0.9851435143146585, 'learning_rate': 5.194829752680367e-06, 'epoch': 0.5} 50%|█████ | 17241/34278 [19:01:58<18:07:30, 3.83s/it] 50%|█████ | 17242/34278 [19:02:04<21:02:38, 4.45s/it] {'loss': 0.1477, 'grad_norm': 0.9734840545233086, 'learning_rate': 5.194357676450256e-06, 'epoch': 0.5} 50%|█████ | 17242/34278 [19:02:04<21:02:38, 4.45s/it] 50%|█████ | 17243/34278 [19:02:08<19:47:50, 4.18s/it] {'loss': 0.1174, 'grad_norm': 0.7232070671923675, 'learning_rate': 5.1938855984849645e-06, 'epoch': 0.5} 50%|█████ | 17243/34278 [19:02:08<19:47:50, 4.18s/it] 50%|█████ | 17244/34278 [19:02:11<18:05:38, 3.82s/it] {'loss': 0.1448, 'grad_norm': 0.9599050197452415, 'learning_rate': 5.193413518788709e-06, 'epoch': 0.5} 50%|█████ | 17244/34278 [19:02:11<18:05:38, 3.82s/it] 50%|█████ | 17245/34278 [19:02:17<21:22:34, 4.52s/it] {'loss': 0.1379, 'grad_norm': 0.8497767215530986, 'learning_rate': 5.192941437365704e-06, 'epoch': 0.5} 50%|█████ | 17245/34278 [19:02:17<21:22:34, 4.52s/it] 50%|█████ | 
[... steps 17246-17253: loss 0.12-0.14, learning_rate 5.1925e-06 -> 5.1892e-06; duplicate tqdm progress lines removed ...]
Token indices sequence length is longer than the specified maximum sequence length for this model (8383 > 8192). Running this sequence through the model will result in indexing errors
[... steps 17254-17262: loss 0.12-0.15, learning_rate 5.1887e-06 -> 5.1849e-06 ...]
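The 8383 > 8192 warning above comes from a tokenizer emitting more tokens than the model's context window; positions past 8192 would index out of range in the position embeddings. A minimal sketch of the usual guard, clipping at tokenization time (`fit_to_context` and `MAX_LEN` are illustrative names, not from the source):

```python
# Assumed context window, matching the 8192 limit reported in the warning.
MAX_LEN = 8192


def fit_to_context(token_ids: list[int], max_len: int = MAX_LEN) -> list[int]:
    """Truncate a token-id sequence to the model's context window.

    With a Hugging Face tokenizer the same effect comes from
    tokenizer(text, truncation=True, max_length=max_len); this helper
    shows the guard for pre-tokenized samples.
    """
    if len(token_ids) <= max_len:
        return token_ids
    return token_ids[:max_len]
```

For training data, dropping or splitting over-length samples is often preferable to truncation, since truncation can cut the target span out of the sequence.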
[... steps 17263-17272: loss 0.11-0.16, learning_rate 5.1844e-06 -> 5.1802e-06; duplicate tqdm progress lines removed ...]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[... steps 17273-17280: loss 0.11-0.15, learning_rate 5.1797e-06 -> 5.1764e-06 ...]
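The checkpoint.py UserWarning above fires when every tensor entering `torch.utils.checkpoint` has `requires_grad=False` (a common symptom of frozen embeddings feeding a gradient-checkpointed block); the recomputed forward then produces no gradients at all. A minimal sketch of the fix, marking the block's input as differentiable (with a `transformers` model, `model.enable_input_require_grads()` is the usual one-line equivalent):

```python
import torch
from torch.utils.checkpoint import checkpoint

# A stand-in for one checkpointed transformer block.
layer = torch.nn.Linear(8, 8)

frozen_input = torch.randn(2, 8)  # requires_grad=False: triggers the warning
grad_input = frozen_input.clone().requires_grad_(True)  # the fix

out = checkpoint(layer, grad_input, use_reentrant=True)
out.sum().backward()

# Gradients now flow through the checkpointed region to the input.
print(grad_input.grad is not None)
```

The input tensor itself need not be trainable for this to matter; it only has to require grad so the reentrant checkpoint knows to build a graph during recomputation.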
[... steps 17281-17331: loss 0.10-0.17, learning_rate 5.1759e-06 -> 5.1523e-06, epoch advances 0.50 -> 0.51 at step 17311; duplicate tqdm progress lines removed ...]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[... steps 17332-17341: loss 0.11-0.17, learning_rate 5.1519e-06 -> 5.1476e-06 ...]
[... steps 17342-17413: loss 0.11-0.17, learning_rate 5.1471e-06 -> 5.1136e-06; duplicate tqdm progress lines removed ...]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[... steps 17414-17422: loss 0.11-0.16, learning_rate 5.1131e-06 -> 5.1094e-06 ...]
 51%|█████ | 17423/34278 [19:13:37<20:07:56,
4.30s/it] {'loss': 0.1547, 'grad_norm': 1.0041683647406259, 'learning_rate': 5.108887464411802e-06, 'epoch': 0.51} 51%|█████ | 17423/34278 [19:13:37<20:07:56, 4.30s/it] 51%|█████ | 17424/34278 [19:13:40<19:10:41, 4.10s/it] {'loss': 0.117, 'grad_norm': 1.0042134265473222, 'learning_rate': 5.108415141811398e-06, 'epoch': 0.51} 51%|█████ | 17424/34278 [19:13:40<19:10:41, 4.10s/it] 51%|█████ | 17425/34278 [19:13:45<19:53:21, 4.25s/it] {'loss': 0.1151, 'grad_norm': 0.6871673440834162, 'learning_rate': 5.107942818243088e-06, 'epoch': 0.51} 51%|█████ | 17425/34278 [19:13:45<19:53:21, 4.25s/it] 51%|█████ | 17426/34278 [19:13:48<19:00:57, 4.06s/it] {'loss': 0.1337, 'grad_norm': 0.8310470443975275, 'learning_rate': 5.1074704937110895e-06, 'epoch': 0.51} 51%|█████ | 17426/34278 [19:13:48<19:00:57, 4.06s/it] 51%|█████ | 17427/34278 [19:13:52<18:59:33, 4.06s/it] {'loss': 0.123, 'grad_norm': 0.9398306613967943, 'learning_rate': 5.1069981682196235e-06, 'epoch': 0.51} 51%|█████ | 17427/34278 [19:13:52<18:59:33, 4.06s/it] 51%|█████ | 17428/34278 [19:13:56<18:02:39, 3.86s/it] {'loss': 0.1067, 'grad_norm': 0.8089940837278848, 'learning_rate': 5.106525841772902e-06, 'epoch': 0.51} 51%|█████ | 17428/34278 [19:13:56<18:02:39, 3.86s/it] 51%|█████ | 17429/34278 [19:13:59<16:46:20, 3.58s/it] {'loss': 0.1653, 'grad_norm': 0.6566858907538361, 'learning_rate': 5.106053514375142e-06, 'epoch': 0.51} 51%|█████ | 17429/34278 [19:13:59<16:46:20, 3.58s/it] 51%|█████ | 17430/34278 [19:14:02<15:52:17, 3.39s/it] {'loss': 0.1445, 'grad_norm': 0.8595110387011408, 'learning_rate': 5.105581186030563e-06, 'epoch': 0.51} 51%|█████ | 17430/34278 [19:14:02<15:52:17, 3.39s/it] 51%|█████ | 17431/34278 [19:14:05<15:22:49, 3.29s/it] {'loss': 0.1308, 'grad_norm': 0.7523261269920449, 'learning_rate': 5.1051088567433785e-06, 'epoch': 0.51} 51%|█████ | 17431/34278 [19:14:05<15:22:49, 3.29s/it] 51%|█████ | 17432/34278 [19:14:11<18:59:29, 4.06s/it] {'loss': 0.1104, 'grad_norm': 0.829148605483477, 'learning_rate': 
5.104636526517809e-06, 'epoch': 0.51} 51%|█████ | 17432/34278 [19:14:11<18:59:29, 4.06s/it] 51%|█████ | 17433/34278 [19:14:17<21:37:23, 4.62s/it] {'loss': 0.1569, 'grad_norm': 0.7246840243450478, 'learning_rate': 5.104164195358068e-06, 'epoch': 0.51} 51%|█████ | 17433/34278 [19:14:17<21:37:23, 4.62s/it] 51%|█████ | 17434/34278 [19:14:20<19:50:07, 4.24s/it] {'loss': 0.137, 'grad_norm': 0.9536137671145061, 'learning_rate': 5.103691863268375e-06, 'epoch': 0.51} 51%|█████ | 17434/34278 [19:14:20<19:50:07, 4.24s/it] 51%|█████ | 17435/34278 [19:14:23<17:43:32, 3.79s/it] {'loss': 0.1312, 'grad_norm': 1.0648240016753865, 'learning_rate': 5.103219530252945e-06, 'epoch': 0.51} 51%|█████ | 17435/34278 [19:14:23<17:43:32, 3.79s/it] 51%|█████ | 17436/34278 [19:14:26<16:30:19, 3.53s/it] {'loss': 0.1155, 'grad_norm': 0.7450880858814997, 'learning_rate': 5.102747196315997e-06, 'epoch': 0.51} 51%|█████ | 17436/34278 [19:14:26<16:30:19, 3.53s/it] 51%|█████ | 17437/34278 [19:14:29<16:42:21, 3.57s/it] {'loss': 0.1493, 'grad_norm': 0.8871366210252537, 'learning_rate': 5.102274861461747e-06, 'epoch': 0.51} 51%|█████ | 17437/34278 [19:14:29<16:42:21, 3.57s/it] 51%|█████ | 17438/34278 [19:14:32<15:57:56, 3.41s/it] {'loss': 0.1353, 'grad_norm': 0.8154720472206844, 'learning_rate': 5.101802525694409e-06, 'epoch': 0.51} 51%|█████ | 17438/34278 [19:14:32<15:57:56, 3.41s/it] 51%|█████ | 17439/34278 [19:14:36<17:04:00, 3.65s/it] {'loss': 0.1093, 'grad_norm': 0.9148185585487233, 'learning_rate': 5.101330189018205e-06, 'epoch': 0.51} 51%|█████ | 17439/34278 [19:14:36<17:04:00, 3.65s/it] 51%|█████ | 17440/34278 [19:14:41<18:22:18, 3.93s/it] {'loss': 0.1264, 'grad_norm': 0.8445701927656427, 'learning_rate': 5.100857851437347e-06, 'epoch': 0.51} 51%|█████ | 17440/34278 [19:14:41<18:22:18, 3.93s/it] 51%|█████ | 17441/34278 [19:14:44<17:22:00, 3.71s/it] {'loss': 0.1374, 'grad_norm': 0.7670467790226362, 'learning_rate': 5.100385512956054e-06, 'epoch': 0.51} 51%|█████ | 17441/34278 [19:14:44<17:22:00, 
3.71s/it] 51%|█████ | 17442/34278 [19:14:47<16:20:49, 3.50s/it] {'loss': 0.1626, 'grad_norm': 0.8350280756894248, 'learning_rate': 5.099913173578546e-06, 'epoch': 0.51} 51%|█████ | 17442/34278 [19:14:47<16:20:49, 3.50s/it] 51%|█████ | 17443/34278 [19:14:50<15:31:57, 3.32s/it] {'loss': 0.1231, 'grad_norm': 1.124872795905888, 'learning_rate': 5.099440833309035e-06, 'epoch': 0.51} 51%|█████ | 17443/34278 [19:14:50<15:31:57, 3.32s/it] 51%|█████ | 17444/34278 [19:14:53<15:21:00, 3.28s/it] {'loss': 0.1748, 'grad_norm': 1.0196721305331584, 'learning_rate': 5.09896849215174e-06, 'epoch': 0.51} 51%|█████ | 17444/34278 [19:14:53<15:21:00, 3.28s/it] 51%|█████ | 17445/34278 [19:14:56<14:59:15, 3.21s/it] {'loss': 0.1374, 'grad_norm': 0.7511860367937689, 'learning_rate': 5.0984961501108785e-06, 'epoch': 0.51} 51%|█████ | 17445/34278 [19:14:56<14:59:15, 3.21s/it] 51%|█████ | 17446/34278 [19:14:59<14:38:13, 3.13s/it] {'loss': 0.1036, 'grad_norm': 0.829116893736274, 'learning_rate': 5.098023807190666e-06, 'epoch': 0.51} 51%|█████ | 17446/34278 [19:14:59<14:38:13, 3.13s/it] 51%|█████ | 17447/34278 [19:15:03<14:46:06, 3.16s/it] {'loss': 0.1075, 'grad_norm': 0.8072556863341455, 'learning_rate': 5.097551463395321e-06, 'epoch': 0.51} 51%|█████ | 17447/34278 [19:15:03<14:46:06, 3.16s/it] 51%|█████ | 17448/34278 [19:15:08<18:12:08, 3.89s/it] {'loss': 0.1241, 'grad_norm': 0.9118758133732958, 'learning_rate': 5.0970791187290605e-06, 'epoch': 0.51} 51%|█████ | 17448/34278 [19:15:08<18:12:08, 3.89s/it] 51%|█████ | 17449/34278 [19:15:14<20:55:56, 4.48s/it] {'loss': 0.1224, 'grad_norm': 0.699691390826038, 'learning_rate': 5.0966067731961e-06, 'epoch': 0.51} 51%|█████ | 17449/34278 [19:15:14<20:55:56, 4.48s/it] 51%|█████ | 17450/34278 [19:15:18<19:35:51, 4.19s/it] {'loss': 0.1512, 'grad_norm': 0.8384194162642024, 'learning_rate': 5.096134426800657e-06, 'epoch': 0.51} 51%|█████ | 17450/34278 [19:15:18<19:35:51, 4.19s/it] 51%|█████ | 17451/34278 [19:15:21<17:59:00, 3.85s/it] {'loss': 0.1318, 
'grad_norm': 0.786729089874759, 'learning_rate': 5.095662079546949e-06, 'epoch': 0.51} 51%|█████ | 17451/34278 [19:15:21<17:59:00, 3.85s/it] 51%|█████ | 17452/34278 [19:15:26<20:37:36, 4.41s/it] {'loss': 0.1526, 'grad_norm': 0.8723570546988967, 'learning_rate': 5.095189731439194e-06, 'epoch': 0.51} 51%|█████ | 17452/34278 [19:15:26<20:37:36, 4.41s/it] 51%|█████ | 17453/34278 [19:15:31<20:28:25, 4.38s/it] {'loss': 0.1367, 'grad_norm': 0.8062033254186469, 'learning_rate': 5.094717382481605e-06, 'epoch': 0.51} 51%|█████ | 17453/34278 [19:15:31<20:28:25, 4.38s/it] 51%|█████ | 17454/34278 [19:15:34<18:50:46, 4.03s/it] {'loss': 0.1245, 'grad_norm': 0.7979303462656121, 'learning_rate': 5.094245032678406e-06, 'epoch': 0.51} 51%|█████ | 17454/34278 [19:15:34<18:50:46, 4.03s/it] 51%|█████ | 17455/34278 [19:15:37<17:29:39, 3.74s/it] {'loss': 0.1407, 'grad_norm': 0.7828414313310849, 'learning_rate': 5.093772682033806e-06, 'epoch': 0.51} 51%|█████ | 17455/34278 [19:15:37<17:29:39, 3.74s/it] 51%|█████ | 17456/34278 [19:15:40<16:25:28, 3.51s/it] {'loss': 0.1241, 'grad_norm': 0.7756363833803713, 'learning_rate': 5.093300330552027e-06, 'epoch': 0.51} 51%|█████ | 17456/34278 [19:15:40<16:25:28, 3.51s/it] 51%|█████ | 17457/34278 [19:15:43<16:05:33, 3.44s/it] {'loss': 0.1347, 'grad_norm': 0.793736937840087, 'learning_rate': 5.0928279782372855e-06, 'epoch': 0.51} 51%|█████ | 17457/34278 [19:15:43<16:05:33, 3.44s/it] 51%|█████ | 17458/34278 [19:15:47<16:49:49, 3.60s/it] {'loss': 0.1446, 'grad_norm': 0.7788558158774311, 'learning_rate': 5.092355625093798e-06, 'epoch': 0.51} 51%|█████ | 17458/34278 [19:15:47<16:49:49, 3.60s/it] 51%|█████ | 17459/34278 [19:15:50<16:27:03, 3.52s/it] {'loss': 0.1237, 'grad_norm': 0.673358783316202, 'learning_rate': 5.0918832711257805e-06, 'epoch': 0.51} 51%|█████ | 17459/34278 [19:15:50<16:27:03, 3.52s/it] 51%|█████ | 17460/34278 [19:15:53<15:40:29, 3.36s/it] {'loss': 0.1274, 'grad_norm': 0.8420016865602312, 'learning_rate': 5.091410916337452e-06, 'epoch': 
0.51} 51%|█████ | 17460/34278 [19:15:53<15:40:29, 3.36s/it] 51%|█████ | 17461/34278 [19:15:57<16:40:21, 3.57s/it] {'loss': 0.1307, 'grad_norm': 0.7502259822245504, 'learning_rate': 5.090938560733029e-06, 'epoch': 0.51} 51%|█████ | 17461/34278 [19:15:57<16:40:21, 3.57s/it] 51%|█████ | 17462/34278 [19:16:01<15:58:11, 3.42s/it] {'loss': 0.1256, 'grad_norm': 0.7980311105265848, 'learning_rate': 5.090466204316727e-06, 'epoch': 0.51} 51%|█████ | 17462/34278 [19:16:01<15:58:11, 3.42s/it] 51%|█████ | 17463/34278 [19:16:06<18:34:41, 3.98s/it] {'loss': 0.1449, 'grad_norm': 0.7784150212040712, 'learning_rate': 5.089993847092764e-06, 'epoch': 0.51} 51%|█████ | 17463/34278 [19:16:06<18:34:41, 3.98s/it] 51%|█████ | 17464/34278 [19:16:09<17:56:28, 3.84s/it] {'loss': 0.1404, 'grad_norm': 0.7988999313518343, 'learning_rate': 5.089521489065358e-06, 'epoch': 0.51} 51%|█████ | 17464/34278 [19:16:09<17:56:28, 3.84s/it] 51%|█████ | 17465/34278 [19:16:15<20:54:05, 4.48s/it] {'loss': 0.1422, 'grad_norm': 0.8574086188145461, 'learning_rate': 5.089049130238727e-06, 'epoch': 0.51} 51%|█████ | 17465/34278 [19:16:15<20:54:05, 4.48s/it] 51%|█████ | 17466/34278 [19:16:21<23:09:37, 4.96s/it] {'loss': 0.134, 'grad_norm': 0.713583071387903, 'learning_rate': 5.088576770617086e-06, 'epoch': 0.51} 51%|█████ | 17466/34278 [19:16:21<23:09:37, 4.96s/it] 51%|█████ | 17467/34278 [19:16:27<23:22:52, 5.01s/it] {'loss': 0.1192, 'grad_norm': 0.7387580018763268, 'learning_rate': 5.088104410204652e-06, 'epoch': 0.51} 51%|█████ | 17467/34278 [19:16:27<23:22:52, 5.01s/it] 51%|█████ | 17468/34278 [19:16:30<21:21:56, 4.58s/it] {'loss': 0.122, 'grad_norm': 0.8620868978061649, 'learning_rate': 5.087632049005643e-06, 'epoch': 0.51} 51%|█████ | 17468/34278 [19:16:30<21:21:56, 4.58s/it] 51%|█████ | 17469/34278 [19:16:34<20:27:08, 4.38s/it] {'loss': 0.1358, 'grad_norm': 0.7908676300118478, 'learning_rate': 5.087159687024277e-06, 'epoch': 0.51} 51%|█████ | 17469/34278 [19:16:34<20:27:08, 4.38s/it] 51%|█████ | 17470/34278 
[19:16:37<18:56:06, 4.06s/it] {'loss': 0.1409, 'grad_norm': 0.8784872923271049, 'learning_rate': 5.086687324264768e-06, 'epoch': 0.51} 51%|█████ | 17470/34278 [19:16:37<18:56:06, 4.06s/it] 51%|█████ | 17471/34278 [19:16:40<17:38:44, 3.78s/it] {'loss': 0.1493, 'grad_norm': 1.0483879927732656, 'learning_rate': 5.086214960731337e-06, 'epoch': 0.51} 51%|█████ | 17471/34278 [19:16:40<17:38:44, 3.78s/it] 51%|█████ | 17472/34278 [19:16:44<16:57:58, 3.63s/it] {'loss': 0.1311, 'grad_norm': 0.7200407098885422, 'learning_rate': 5.085742596428199e-06, 'epoch': 0.51} 51%|█████ | 17472/34278 [19:16:44<16:57:58, 3.63s/it] 51%|█████ | 17473/34278 [19:16:47<16:11:13, 3.47s/it] {'loss': 0.1261, 'grad_norm': 0.6268472174358896, 'learning_rate': 5.085270231359572e-06, 'epoch': 0.51} 51%|█████ | 17473/34278 [19:16:47<16:11:13, 3.47s/it] 51%|█████ | 17474/34278 [19:16:51<16:39:45, 3.57s/it] {'loss': 0.1338, 'grad_norm': 0.9703086724801832, 'learning_rate': 5.084797865529673e-06, 'epoch': 0.51} 51%|█████ | 17474/34278 [19:16:51<16:39:45, 3.57s/it] 51%|█████ | 17475/34278 [19:16:54<16:16:42, 3.49s/it] {'loss': 0.1165, 'grad_norm': 0.8499811925044137, 'learning_rate': 5.084325498942717e-06, 'epoch': 0.51} 51%|█████ | 17475/34278 [19:16:54<16:16:42, 3.49s/it] 51%|█████ | 17476/34278 [19:17:00<19:47:36, 4.24s/it] {'loss': 0.1412, 'grad_norm': 0.8616719357043495, 'learning_rate': 5.083853131602924e-06, 'epoch': 0.51} 51%|█████ | 17476/34278 [19:17:00<19:47:36, 4.24s/it] 51%|█████ | 17477/34278 [19:17:05<21:28:14, 4.60s/it] {'loss': 0.1298, 'grad_norm': 0.8336430000759242, 'learning_rate': 5.083380763514511e-06, 'epoch': 0.51} 51%|█████ | 17477/34278 [19:17:05<21:28:14, 4.60s/it] 51%|█████ | 17478/34278 [19:17:11<22:25:14, 4.80s/it] {'loss': 0.1189, 'grad_norm': 0.7461456777905648, 'learning_rate': 5.082908394681694e-06, 'epoch': 0.51} 51%|█████ | 17478/34278 [19:17:11<22:25:14, 4.80s/it] 51%|█████ | 17479/34278 [19:17:14<20:44:06, 4.44s/it] {'loss': 0.1506, 'grad_norm': 0.9183240458443164, 
'learning_rate': 5.08243602510869e-06, 'epoch': 0.51} 51%|█████ | 17479/34278 [19:17:14<20:44:06, 4.44s/it] 51%|█████ | 17480/34278 [19:17:18<19:14:52, 4.13s/it] {'loss': 0.1306, 'grad_norm': 0.8203233671850735, 'learning_rate': 5.081963654799717e-06, 'epoch': 0.51} 51%|█████ | 17480/34278 [19:17:18<19:14:52, 4.13s/it] 51%|█████ | 17481/34278 [19:17:21<18:03:18, 3.87s/it] {'loss': 0.141, 'grad_norm': 1.0360600413094372, 'learning_rate': 5.0814912837589926e-06, 'epoch': 0.51} 51%|█████ | 17481/34278 [19:17:21<18:03:18, 3.87s/it] 51%|█████ | 17482/34278 [19:17:24<16:47:52, 3.60s/it] {'loss': 0.1279, 'grad_norm': 1.1149290780890595, 'learning_rate': 5.081018911990734e-06, 'epoch': 0.51} 51%|█████ | 17482/34278 [19:17:24<16:47:52, 3.60s/it] 51%|█████ | 17483/34278 [19:17:28<17:09:36, 3.68s/it] {'loss': 0.1554, 'grad_norm': 1.0399195394392293, 'learning_rate': 5.080546539499156e-06, 'epoch': 0.51} 51%|█████ | 17483/34278 [19:17:28<17:09:36, 3.68s/it] 51%|█████ | 17484/34278 [19:17:31<16:22:51, 3.51s/it] {'loss': 0.1164, 'grad_norm': 0.7729377457812602, 'learning_rate': 5.08007416628848e-06, 'epoch': 0.51} 51%|█████ | 17484/34278 [19:17:31<16:22:51, 3.51s/it] 51%|█████ | 17485/34278 [19:17:34<15:44:54, 3.38s/it] {'loss': 0.1182, 'grad_norm': 0.6820680789318545, 'learning_rate': 5.079601792362919e-06, 'epoch': 0.51} 51%|█████ | 17485/34278 [19:17:34<15:44:54, 3.38s/it] 51%|█████ | 17486/34278 [19:17:38<16:14:16, 3.48s/it] {'loss': 0.1135, 'grad_norm': 0.9411728310634727, 'learning_rate': 5.079129417726694e-06, 'epoch': 0.51} 51%|█████ | 17486/34278 [19:17:38<16:14:16, 3.48s/it] 51%|█████ | 17487/34278 [19:17:41<16:45:44, 3.59s/it] {'loss': 0.12, 'grad_norm': 0.954943445540097, 'learning_rate': 5.07865704238402e-06, 'epoch': 0.51} 51%|█████ | 17487/34278 [19:17:42<16:45:44, 3.59s/it] 51%|█████ | 17488/34278 [19:17:47<20:05:08, 4.31s/it] {'loss': 0.1333, 'grad_norm': 0.7679639014582577, 'learning_rate': 5.078184666339113e-06, 'epoch': 0.51} 51%|█████ | 17488/34278 
[19:17:47<20:05:08, 4.31s/it] 51%|█████ | 17489/34278 [19:17:51<19:07:22, 4.10s/it] {'loss': 0.1335, 'grad_norm': 0.8363182413744034, 'learning_rate': 5.077712289596194e-06, 'epoch': 0.51} 51%|█████ | 17489/34278 [19:17:51<19:07:22, 4.10s/it] 51%|█████ | 17490/34278 [19:17:55<18:23:05, 3.94s/it] {'loss': 0.1288, 'grad_norm': 0.7267867294204925, 'learning_rate': 5.077239912159477e-06, 'epoch': 0.51} 51%|█████ | 17490/34278 [19:17:55<18:23:05, 3.94s/it] 51%|█████ | 17491/34278 [19:17:58<17:04:21, 3.66s/it] {'loss': 0.1359, 'grad_norm': 1.1942479269315884, 'learning_rate': 5.076767534033181e-06, 'epoch': 0.51} 51%|█████ | 17491/34278 [19:17:58<17:04:21, 3.66s/it] 51%|█████ | 17492/34278 [19:18:00<15:41:33, 3.37s/it] {'loss': 0.1269, 'grad_norm': 0.9139303107840787, 'learning_rate': 5.076295155221523e-06, 'epoch': 0.51} 51%|█████ | 17492/34278 [19:18:00<15:41:33, 3.37s/it] 51%|█████ | 17493/34278 [19:18:05<17:48:24, 3.82s/it] {'loss': 0.1416, 'grad_norm': 0.811733228885649, 'learning_rate': 5.07582277572872e-06, 'epoch': 0.51} 51%|█████ | 17493/34278 [19:18:05<17:48:24, 3.82s/it] 51%|█████ | 17494/34278 [19:18:12<21:27:59, 4.60s/it] {'loss': 0.1239, 'grad_norm': 0.7993082769531046, 'learning_rate': 5.075350395558989e-06, 'epoch': 0.51} 51%|█████ | 17494/34278 [19:18:12<21:27:59, 4.60s/it] 51%|█████ | 17495/34278 [19:18:15<19:06:01, 4.10s/it] {'loss': 0.1319, 'grad_norm': 1.0356845721725232, 'learning_rate': 5.074878014716548e-06, 'epoch': 0.51} 51%|█████ | 17495/34278 [19:18:15<19:06:01, 4.10s/it] 51%|█████ | 17496/34278 [19:18:17<17:23:49, 3.73s/it] {'loss': 0.1312, 'grad_norm': 0.9208334162372714, 'learning_rate': 5.0744056332056135e-06, 'epoch': 0.51} 51%|█████ | 17496/34278 [19:18:17<17:23:49, 3.73s/it] 51%|█████ | 17497/34278 [19:18:23<19:44:29, 4.24s/it] {'loss': 0.1519, 'grad_norm': 1.1003553933247492, 'learning_rate': 5.073933251030403e-06, 'epoch': 0.51} 51%|█████ | 17497/34278 [19:18:23<19:44:29, 4.24s/it] 51%|█████ | 17498/34278 [19:18:27<19:07:52, 4.10s/it] 
{'loss': 0.1394, 'grad_norm': 0.7936843075235583, 'learning_rate': 5.073460868195135e-06, 'epoch': 0.51} 51%|█████ | 17498/34278 [19:18:27<19:07:52, 4.10s/it] 51%|█████ | 17499/34278 [19:18:30<17:40:11, 3.79s/it] {'loss': 0.1207, 'grad_norm': 1.0421305551238624, 'learning_rate': 5.072988484704026e-06, 'epoch': 0.51} 51%|█████ | 17499/34278 [19:18:30<17:40:11, 3.79s/it] 51%|█████ | 17500/34278 [19:18:33<16:55:45, 3.63s/it] {'loss': 0.1125, 'grad_norm': 1.2136703774253128, 'learning_rate': 5.072516100561292e-06, 'epoch': 0.51} 51%|█████ | 17500/34278 [19:18:33<16:55:45, 3.63s/it] 51%|█████ | 17501/34278 [19:18:37<16:49:25, 3.61s/it] {'loss': 0.148, 'grad_norm': 1.0813212636620535, 'learning_rate': 5.0720437157711525e-06, 'epoch': 0.51} 51%|█████ | 17501/34278 [19:18:37<16:49:25, 3.61s/it] 51%|█████ | 17502/34278 [19:18:40<16:48:50, 3.61s/it] {'loss': 0.1378, 'grad_norm': 1.0195924965369898, 'learning_rate': 5.0715713303378245e-06, 'epoch': 0.51} 51%|█████ | 17502/34278 [19:18:40<16:48:50, 3.61s/it] 51%|█████ | 17503/34278 [19:18:44<16:52:19, 3.62s/it] {'loss': 0.135, 'grad_norm': 0.9360398833281904, 'learning_rate': 5.071098944265524e-06, 'epoch': 0.51} 51%|█████ | 17503/34278 [19:18:44<16:52:19, 3.62s/it] 51%|█████ | 17504/34278 [19:18:47<16:14:19, 3.49s/it] {'loss': 0.1481, 'grad_norm': 1.0452041218987642, 'learning_rate': 5.070626557558469e-06, 'epoch': 0.51} 51%|█████ | 17504/34278 [19:18:47<16:14:19, 3.49s/it] 51%|█████ | 17505/34278 [19:18:50<15:39:51, 3.36s/it] {'loss': 0.1709, 'grad_norm': 1.0237875216021775, 'learning_rate': 5.070154170220877e-06, 'epoch': 0.51} 51%|█████ | 17505/34278 [19:18:50<15:39:51, 3.36s/it] 51%|█████ | 17506/34278 [19:18:53<14:56:21, 3.21s/it] {'loss': 0.1202, 'grad_norm': 0.856728350090205, 'learning_rate': 5.069681782256965e-06, 'epoch': 0.51} 51%|█████ | 17506/34278 [19:18:53<14:56:21, 3.21s/it] 51%|█████ | 17507/34278 [19:18:56<14:21:33, 3.08s/it] {'loss': 0.1446, 'grad_norm': 0.9616693050458184, 'learning_rate': 
5.069209393670951e-06, 'epoch': 0.51} 51%|█████ | 17507/34278 [19:18:56<14:21:33, 3.08s/it] 51%|█████ | 17508/34278 [19:18:59<14:27:30, 3.10s/it] {'loss': 0.1379, 'grad_norm': 0.8628997542542997, 'learning_rate': 5.0687370044670525e-06, 'epoch': 0.51} 51%|█████ | 17508/34278 [19:18:59<14:27:30, 3.10s/it] 51%|█████ | 17509/34278 [19:19:02<14:26:41, 3.10s/it] {'loss': 0.1336, 'grad_norm': 0.8683347621934591, 'learning_rate': 5.068264614649485e-06, 'epoch': 0.51} 51%|█████ | 17509/34278 [19:19:02<14:26:41, 3.10s/it] 51%|█████ | 17510/34278 [19:19:06<15:18:00, 3.28s/it] {'loss': 0.1357, 'grad_norm': 0.7882985045782255, 'learning_rate': 5.067792224222469e-06, 'epoch': 0.51} 51%|█████ | 17510/34278 [19:19:06<15:18:00, 3.28s/it] 51%|█████ | 17511/34278 [19:19:12<19:17:57, 4.14s/it] {'loss': 0.1137, 'grad_norm': 0.802689756041416, 'learning_rate': 5.06731983319022e-06, 'epoch': 0.51} 51%|█████ | 17511/34278 [19:19:12<19:17:57, 4.14s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 51%|█████ | 17512/34278 [19:19:17<21:16:39, 4.57s/it] {'loss': 0.1501, 'grad_norm': 0.8961327508057595, 'learning_rate': 5.066847441556955e-06, 'epoch': 0.51} 51%|█████ | 17512/34278 [19:19:17<21:16:39, 4.57s/it] 51%|█████ | 17513/34278 [19:19:21<19:26:46, 4.18s/it] {'loss': 0.1558, 'grad_norm': 1.072911160374994, 'learning_rate': 5.066375049326891e-06, 'epoch': 0.51} 51%|█████ | 17513/34278 [19:19:21<19:26:46, 4.18s/it] 51%|█████ | 17514/34278 [19:19:24<18:29:15, 3.97s/it] {'loss': 0.1367, 'grad_norm': 0.7563739992307312, 'learning_rate': 5.065902656504249e-06, 'epoch': 0.51} 51%|█████ | 17514/34278 [19:19:24<18:29:15, 3.97s/it] 51%|█████ | 17515/34278 [19:19:27<17:20:27, 3.72s/it] {'loss': 0.1302, 'grad_norm': 0.9185384509595925, 'learning_rate': 5.065430263093241e-06, 'epoch': 0.51} 51%|█████ | 17515/34278 [19:19:27<17:20:27, 3.72s/it] 51%|█████ | 17516/34278 [19:19:30<16:17:39, 3.50s/it] {'loss': 0.1258, 'grad_norm': 0.7980854257951039, 'learning_rate': 5.064957869098089e-06, 'epoch': 0.51} 51%|█████ | 17516/34278 [19:19:30<16:17:39, 3.50s/it] 51%|█████ | 17517/34278 [19:19:33<15:34:59, 3.35s/it] {'loss': 0.1474, 'grad_norm': 0.8508350465342617, 'learning_rate': 5.064485474523009e-06, 'epoch': 0.51} 51%|█████ | 17517/34278 [19:19:33<15:34:59, 3.35s/it] 51%|█████ | 17518/34278 [19:19:37<15:54:45, 3.42s/it] {'loss': 0.1448, 'grad_norm': 0.8594741215635663, 'learning_rate': 5.064013079372217e-06, 'epoch': 0.51} 51%|█████ | 17518/34278 [19:19:37<15:54:45, 3.42s/it] 51%|█████ | 17519/34278 [19:19:40<15:26:34, 3.32s/it] {'loss': 0.1599, 'grad_norm': 0.9353567233323511, 'learning_rate': 5.063540683649932e-06, 'epoch': 0.51} 51%|█████ | 17519/34278 [19:19:40<15:26:34, 3.32s/it] 51%|█████ | 17520/34278 [19:19:43<15:19:16, 3.29s/it] {'loss': 0.1218, 'grad_norm': 0.7570092298781683, 'learning_rate': 5.063068287360371e-06, 'epoch': 0.51} 51%|█████ | 17520/34278 [19:19:43<15:19:16, 3.29s/it] 51%|█████ | 17521/34278 [19:19:46<14:56:52, 
3.21s/it] {'loss': 0.1459, 'grad_norm': 0.9619765117042295, 'learning_rate': 5.062595890507751e-06, 'epoch': 0.51} 51%|█████ | 17521/34278 [19:19:46<14:56:52, 3.21s/it] 51%|█████ | 17522/34278 [19:19:50<15:14:31, 3.27s/it] {'loss': 0.1289, 'grad_norm': 0.9031932490312972, 'learning_rate': 5.0621234930962905e-06, 'epoch': 0.51} 51%|█████ | 17522/34278 [19:19:50<15:14:31, 3.27s/it] 51%|█████ | 17523/34278 [19:19:53<15:13:10, 3.27s/it] {'loss': 0.1411, 'grad_norm': 0.913318391035932, 'learning_rate': 5.061651095130205e-06, 'epoch': 0.51} 51%|█████ | 17523/34278 [19:19:53<15:13:10, 3.27s/it] 51%|█████ | 17524/34278 [19:19:59<19:05:12, 4.10s/it] {'loss': 0.131, 'grad_norm': 0.7774639919480909, 'learning_rate': 5.061178696613714e-06, 'epoch': 0.51} 51%|█████ | 17524/34278 [19:19:59<19:05:12, 4.10s/it] 51%|█████ | 17525/34278 [19:20:03<18:48:21, 4.04s/it] {'loss': 0.1281, 'grad_norm': 0.9805357211513239, 'learning_rate': 5.060706297551035e-06, 'epoch': 0.51} 51%|█████ | 17525/34278 [19:20:03<18:48:21, 4.04s/it] 51%|█████ | 17526/34278 [19:20:07<18:25:21, 3.96s/it] {'loss': 0.1324, 'grad_norm': 0.7394969456489723, 'learning_rate': 5.060233897946383e-06, 'epoch': 0.51} 51%|█████ | 17526/34278 [19:20:07<18:25:21, 3.96s/it] 51%|█████ | 17527/34278 [19:20:10<17:47:51, 3.82s/it] {'loss': 0.1324, 'grad_norm': 0.9975029201407476, 'learning_rate': 5.059761497803978e-06, 'epoch': 0.51} 51%|█████ | 17527/34278 [19:20:10<17:47:51, 3.82s/it] 51%|█████ | 17528/34278 [19:20:13<16:29:01, 3.54s/it] {'loss': 0.1304, 'grad_norm': 0.7696731450396544, 'learning_rate': 5.059289097128036e-06, 'epoch': 0.51} 51%|█████ | 17528/34278 [19:20:13<16:29:01, 3.54s/it] 51%|█████ | 17529/34278 [19:20:19<19:57:47, 4.29s/it] {'loss': 0.1347, 'grad_norm': 0.8056673509710328, 'learning_rate': 5.058816695922777e-06, 'epoch': 0.51} 51%|█████ | 17529/34278 [19:20:19<19:57:47, 4.29s/it] 51%|█████ | 17530/34278 [19:20:22<18:49:37, 4.05s/it] {'loss': 0.1321, 'grad_norm': 0.9691332752521078, 'learning_rate': 
5.058344294192414e-06, 'epoch': 0.51} 51%|█████ | 17530/34278 [19:20:22<18:49:37, 4.05s/it] 51%|█████ | 17531/34278 [19:20:26<17:38:43, 3.79s/it] {'loss': 0.1207, 'grad_norm': 0.7686200758317691, 'learning_rate': 5.057871891941168e-06, 'epoch': 0.51} 51%|█████ | 17531/34278 [19:20:26<17:38:43, 3.79s/it] 51%|█████ | 17532/34278 [19:20:29<16:43:25, 3.60s/it] {'loss': 0.1299, 'grad_norm': 0.8759365359133702, 'learning_rate': 5.057399489173258e-06, 'epoch': 0.51} 51%|█████ | 17532/34278 [19:20:29<16:43:25, 3.60s/it] 51%|█████ | 17533/34278 [19:20:35<20:00:08, 4.30s/it] {'loss': 0.1235, 'grad_norm': 0.8709034331203749, 'learning_rate': 5.056927085892895e-06, 'epoch': 0.51} 51%|█████ | 17533/34278 [19:20:35<20:00:08, 4.30s/it] 51%|█████ | 17534/34278 [19:20:38<18:46:12, 4.04s/it] {'loss': 0.1473, 'grad_norm': 0.9124801522464158, 'learning_rate': 5.056454682104304e-06, 'epoch': 0.51} 51%|█████ | 17534/34278 [19:20:38<18:46:12, 4.04s/it] 51%|█████ | 17535/34278 [19:20:41<17:12:03, 3.70s/it] {'loss': 0.1331, 'grad_norm': 1.1281914280370087, 'learning_rate': 5.055982277811698e-06, 'epoch': 0.51} 51%|█████ | 17535/34278 [19:20:41<17:12:03, 3.70s/it] 51%|█████ | 17536/34278 [19:20:46<18:40:23, 4.02s/it] {'loss': 0.1517, 'grad_norm': 0.797872254490622, 'learning_rate': 5.055509873019295e-06, 'epoch': 0.51} 51%|█████ | 17536/34278 [19:20:46<18:40:23, 4.02s/it] 51%|█████ | 17537/34278 [19:20:49<17:15:32, 3.71s/it] {'loss': 0.1542, 'grad_norm': 0.7042960811354347, 'learning_rate': 5.055037467731313e-06, 'epoch': 0.51} 51%|█████ | 17537/34278 [19:20:49<17:15:32, 3.71s/it] 51%|█████ | 17538/34278 [19:20:52<16:23:36, 3.53s/it] {'loss': 0.1301, 'grad_norm': 0.8982350288571852, 'learning_rate': 5.05456506195197e-06, 'epoch': 0.51} 51%|█████ | 17538/34278 [19:20:52<16:23:36, 3.53s/it] 51%|█████ | 17539/34278 [19:20:55<15:35:53, 3.35s/it] {'loss': 0.1189, 'grad_norm': 0.9645314326560633, 'learning_rate': 5.054092655685483e-06, 'epoch': 0.51} 51%|█████ | 17539/34278 [19:20:55<15:35:53, 
3.35s/it] 51%|█████ | 17540/34278 [19:20:58<14:47:25, 3.18s/it] {'loss': 0.1207, 'grad_norm': 0.6831060075712297, 'learning_rate': 5.05362024893607e-06, 'epoch': 0.51} 51%|█████ | 17540/34278 [19:20:58<14:47:25, 3.18s/it] 51%|█████ | 17541/34278 [19:21:01<15:07:22, 3.25s/it] {'loss': 0.1289, 'grad_norm': 0.8136685565395272, 'learning_rate': 5.053147841707949e-06, 'epoch': 0.51} 51%|█████ | 17541/34278 [19:21:01<15:07:22, 3.25s/it] 51%|█████ | 17542/34278 [19:21:04<15:05:14, 3.25s/it] {'loss': 0.148, 'grad_norm': 0.9101201701656183, 'learning_rate': 5.052675434005334e-06, 'epoch': 0.51} 51%|█████ | 17542/34278 [19:21:04<15:05:14, 3.25s/it] 51%|█████ | 17543/34278 [19:21:07<14:37:00, 3.14s/it] {'loss': 0.1241, 'grad_norm': 0.9390603386217481, 'learning_rate': 5.052203025832447e-06, 'epoch': 0.51} 51%|█████ | 17543/34278 [19:21:07<14:37:00, 3.14s/it] 51%|█████ | 17544/34278 [19:21:10<14:36:23, 3.14s/it] {'loss': 0.118, 'grad_norm': 0.7610631210478856, 'learning_rate': 5.051730617193505e-06, 'epoch': 0.51} 51%|█████ | 17544/34278 [19:21:10<14:36:23, 3.14s/it] 51%|█████ | 17545/34278 [19:21:13<14:35:30, 3.14s/it] {'loss': 0.1219, 'grad_norm': 0.7814541424739787, 'learning_rate': 5.051258208092723e-06, 'epoch': 0.51} 51%|█████ | 17545/34278 [19:21:13<14:35:30, 3.14s/it] 51%|█████ | 17546/34278 [19:21:18<15:59:56, 3.44s/it] {'loss': 0.1385, 'grad_norm': 0.7967455639705543, 'learning_rate': 5.05078579853432e-06, 'epoch': 0.51} 51%|█████ | 17546/34278 [19:21:18<15:59:56, 3.44s/it] 51%|█████ | 17547/34278 [19:21:21<15:48:52, 3.40s/it] {'loss': 0.1315, 'grad_norm': 0.6755519418654807, 'learning_rate': 5.050313388522514e-06, 'epoch': 0.51} 51%|█████ | 17547/34278 [19:21:21<15:48:52, 3.40s/it] 51%|█████ | 17548/34278 [19:21:25<16:24:02, 3.53s/it] {'loss': 0.1306, 'grad_norm': 0.8998580896578239, 'learning_rate': 5.0498409780615205e-06, 'epoch': 0.51} 51%|█████ | 17548/34278 [19:21:25<16:24:02, 3.53s/it] 51%|█████ | 17549/34278 [19:21:28<16:25:10, 3.53s/it] {'loss': 0.1323, 
'grad_norm': 0.92405078889, 'learning_rate': 5.049368567155561e-06, 'epoch': 0.51} 51%|█████ | 17549/34278 [19:21:28<16:25:10, 3.53s/it] 51%|█████ | 17550/34278 [19:21:32<16:23:14, 3.53s/it] {'loss': 0.1383, 'grad_norm': 0.785876774642788, 'learning_rate': 5.04889615580885e-06, 'epoch': 0.51} 51%|█████ | 17550/34278 [19:21:32<16:23:14, 3.53s/it] 51%|█████ | 17551/34278 [19:21:36<16:41:38, 3.59s/it] {'loss': 0.1329, 'grad_norm': 0.7165323386571989, 'learning_rate': 5.048423744025605e-06, 'epoch': 0.51} 51%|█████ | 17551/34278 [19:21:36<16:41:38, 3.59s/it] 51%|█████ | 17552/34278 [19:21:39<16:08:18, 3.47s/it] {'loss': 0.1339, 'grad_norm': 0.7620758413797071, 'learning_rate': 5.047951331810046e-06, 'epoch': 0.51} 51%|█████ | 17552/34278 [19:21:39<16:08:18, 3.47s/it] 51%|█████ | 17553/34278 [19:21:42<15:50:09, 3.41s/it] {'loss': 0.1296, 'grad_norm': 0.8656110648443649, 'learning_rate': 5.047478919166388e-06, 'epoch': 0.51} 51%|█████ | 17553/34278 [19:21:42<15:50:09, 3.41s/it] 51%|█████ | 17554/34278 [19:21:45<15:28:47, 3.33s/it] {'loss': 0.1347, 'grad_norm': 0.9855420724478027, 'learning_rate': 5.047006506098849e-06, 'epoch': 0.51} 51%|█████ | 17554/34278 [19:21:45<15:28:47, 3.33s/it] 51%|█████ | 17555/34278 [19:21:48<14:51:52, 3.20s/it] {'loss': 0.1383, 'grad_norm': 0.8203545053568296, 'learning_rate': 5.046534092611648e-06, 'epoch': 0.51} 51%|█████ | 17555/34278 [19:21:48<14:51:52, 3.20s/it] 51%|█████ | 17556/34278 [19:21:51<14:55:48, 3.21s/it] {'loss': 0.1253, 'grad_norm': 0.8336024513130412, 'learning_rate': 5.046061678709001e-06, 'epoch': 0.51} 51%|█████ | 17556/34278 [19:21:51<14:55:48, 3.21s/it] 51%|█████ | 17557/34278 [19:21:54<14:40:18, 3.16s/it] {'loss': 0.1279, 'grad_norm': 0.8251741447645392, 'learning_rate': 5.045589264395127e-06, 'epoch': 0.51} 51%|█████ | 17557/34278 [19:21:54<14:40:18, 3.16s/it] 51%|█████ | 17558/34278 [19:22:01<19:21:27, 4.17s/it] {'loss': 0.1158, 'grad_norm': 0.7249019702836738, 'learning_rate': 5.045116849674242e-06, 'epoch': 0.51} 
51%|█████ | 17558/34278 [19:22:01<19:21:27, 4.17s/it] 51%|█████ | 17559/34278 [19:22:04<18:03:44, 3.89s/it] {'loss': 0.1363, 'grad_norm': 0.8212257179288065, 'learning_rate': 5.0446444345505655e-06, 'epoch': 0.51} 51%|█████ | 17559/34278 [19:22:04<18:03:44, 3.89s/it] 51%|█████ | 17560/34278 [19:22:08<17:28:43, 3.76s/it] {'loss': 0.127, 'grad_norm': 0.7925447040494781, 'learning_rate': 5.044172019028313e-06, 'epoch': 0.51} 51%|█████ | 17560/34278 [19:22:08<17:28:43, 3.76s/it] 51%|█████ | 17561/34278 [19:22:12<18:26:38, 3.97s/it] {'loss': 0.12, 'grad_norm': 0.8123743961452655, 'learning_rate': 5.043699603111703e-06, 'epoch': 0.51} 51%|█████ | 17561/34278 [19:22:12<18:26:38, 3.97s/it] 51%|█████ | 17562/34278 [19:22:15<17:05:21, 3.68s/it] {'loss': 0.1229, 'grad_norm': 0.8630978133301008, 'learning_rate': 5.043227186804956e-06, 'epoch': 0.51} 51%|█████ | 17562/34278 [19:22:15<17:05:21, 3.68s/it] 51%|█████ | 17563/34278 [19:22:18<16:10:40, 3.48s/it] {'loss': 0.1292, 'grad_norm': 0.8541533524559927, 'learning_rate': 5.042754770112284e-06, 'epoch': 0.51} 51%|█████ | 17563/34278 [19:22:18<16:10:40, 3.48s/it] 51%|█████ | 17564/34278 [19:22:22<16:40:17, 3.59s/it] {'loss': 0.1472, 'grad_norm': 0.842949923027598, 'learning_rate': 5.0422823530379105e-06, 'epoch': 0.51} 51%|█████ | 17564/34278 [19:22:22<16:40:17, 3.59s/it] 51%|█████ | 17565/34278 [19:22:27<19:11:07, 4.13s/it] {'loss': 0.1346, 'grad_norm': 0.8409518885481113, 'learning_rate': 5.0418099355860484e-06, 'epoch': 0.51} 51%|█████ | 17565/34278 [19:22:27<19:11:07, 4.13s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 51%|█████ | 17566/34278 [19:22:30<17:31:48, 3.78s/it] {'loss': 0.1303, 'grad_norm': 0.7374106530409561, 'learning_rate': 5.041337517760917e-06, 'epoch': 0.51} 51%|█████ | 17566/34278 [19:22:30<17:31:48, 3.78s/it] 51%|█████ | 17567/34278 [19:22:34<17:17:44, 3.73s/it] {'loss': 0.1367, 'grad_norm': 0.8742229032087201, 'learning_rate': 5.040865099566735e-06, 'epoch': 0.51} 51%|█████ | 17567/34278 [19:22:34<17:17:44, 3.73s/it] 51%|█████▏ | 17568/34278 [19:22:37<16:29:36, 3.55s/it] {'loss': 0.1245, 'grad_norm': 1.016779321728823, 'learning_rate': 5.040392681007718e-06, 'epoch': 0.51} 51%|█████▏ | 17568/34278 [19:22:37<16:29:36, 3.55s/it] 51%|█████▏ | 17569/34278 [19:22:41<16:37:16, 3.58s/it] {'loss': 0.1505, 'grad_norm': 0.7807388059490147, 'learning_rate': 5.039920262088086e-06, 'epoch': 0.51} 51%|█████▏ | 17569/34278 [19:22:41<16:37:16, 3.58s/it] 51%|█████▏ | 17570/34278 [19:22:44<15:52:34, 3.42s/it] {'loss': 0.1195, 'grad_norm': 0.7747413133406592, 'learning_rate': 5.039447842812055e-06, 'epoch': 0.51} 51%|█████▏ | 17570/34278 [19:22:44<15:52:34, 3.42s/it] 51%|█████▏ | 17571/34278 [19:22:47<15:04:06, 3.25s/it] {'loss': 0.1298, 'grad_norm': 0.8271880281283326, 'learning_rate': 5.038975423183842e-06, 'epoch': 0.51} 51%|█████▏ | 17571/34278 [19:22:47<15:04:06, 3.25s/it] 51%|█████▏ | 17572/34278 [19:22:50<15:09:31, 3.27s/it] {'loss': 0.1273, 'grad_norm': 0.7830936799746516, 'learning_rate': 5.038503003207668e-06, 'epoch': 0.51} 51%|█████▏ | 17572/34278 [19:22:50<15:09:31, 3.27s/it] 51%|█████▏ | 17573/34278 [19:22:53<14:50:34, 3.20s/it] {'loss': 0.1221, 'grad_norm': 0.8771580466338059, 'learning_rate': 5.0380305828877465e-06, 'epoch': 0.51} 51%|█████▏ | 17573/34278 [19:22:53<14:50:34, 3.20s/it] 51%|█████▏ | 17574/34278 [19:22:56<15:00:28, 3.23s/it] {'loss': 0.1324, 'grad_norm': 0.8718309850456868, 'learning_rate': 5.037558162228299e-06, 'epoch': 0.51} 51%|█████▏ | 17574/34278 [19:22:56<15:00:28, 3.23s/it] 51%|█████▏ | 17575/34278 
[19:23:00<15:39:32, 3.37s/it] {'loss': 0.1425, 'grad_norm': 0.9590825845942171, 'learning_rate': 5.037085741233538e-06, 'epoch': 0.51} 51%|█████▏ | 17575/34278 [19:23:00<15:39:32, 3.37s/it] 51%|█████▏ | 17576/34278 [19:23:03<15:18:35, 3.30s/it] {'loss': 0.1509, 'grad_norm': 0.8733374508286359, 'learning_rate': 5.036613319907686e-06, 'epoch': 0.51} 51%|█████▏ | 17576/34278 [19:23:03<15:18:35, 3.30s/it] 51%|█████▏ | 17577/34278 [19:23:06<14:37:33, 3.15s/it] {'loss': 0.1281, 'grad_norm': 0.8010297410646896, 'learning_rate': 5.036140898254961e-06, 'epoch': 0.51} 51%|█████▏ | 17577/34278 [19:23:06<14:37:33, 3.15s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0cabad2610>
Failed to fetch sample 3255713.
Exception: cannot identify image file <_io.BytesIO object at 0x7f0cabad2610> 51%|█████▏ | 17578/34278 [19:23:09<14:30:43, 3.13s/it] {'loss': 0.1186, 'grad_norm': 0.818046871337447, 'learning_rate': 5.035668476279576e-06, 'epoch': 0.51} 51%|█████▏ | 17578/34278 [19:23:09<14:30:43, 3.13s/it] 51%|█████▏ | 17579/34278 [19:23:12<15:03:30, 3.25s/it] {'loss': 0.1373, 'grad_norm': 1.2083012101166253, 'learning_rate': 5.035196053985753e-06, 'epoch': 0.51} 51%|█████▏ | 17579/34278 [19:23:12<15:03:30, 3.25s/it] 51%|█████▏ | 17580/34278 [19:23:16<15:15:52, 3.29s/it] {'loss': 0.1431, 'grad_norm': 0.9142451664612667, 'learning_rate': 5.034723631377707e-06, 'epoch': 0.51} 51%|█████▏ | 17580/34278 [19:23:16<15:15:52, 3.29s/it] 51%|█████▏ | 17581/34278 [19:23:19<15:08:22, 3.26s/it] {'loss': 0.1472, 'grad_norm': 0.8908800856985881, 'learning_rate': 5.034251208459657e-06, 'epoch': 0.51} 51%|█████▏ | 17581/34278 [19:23:19<15:08:22, 3.26s/it] 51%|█████▏ | 17582/34278 [19:23:22<14:57:41, 3.23s/it] {'loss': 0.1572, 'grad_norm': 1.0317160445217477, 'learning_rate': 5.03377878523582e-06, 'epoch': 0.51} 51%|█████▏ | 17582/34278 [19:23:22<14:57:41, 3.23s/it] 51%|█████▏ | 17583/34278 [19:23:27<17:10:50, 3.70s/it] {'loss': 0.1299, 'grad_norm': 0.8756008051929964, 'learning_rate': 5.033306361710415e-06, 'epoch': 0.51} 51%|█████▏ | 17583/34278 [19:23:27<17:10:50, 3.70s/it] 51%|█████▏ | 17584/34278 [19:23:31<17:25:46, 3.76s/it] {'loss': 0.1269, 'grad_norm': 0.7728398782043059, 'learning_rate': 5.032833937887658e-06, 'epoch': 0.51} 51%|█████▏ | 17584/34278 [19:23:31<17:25:46, 3.76s/it] 51%|█████▏ | 17585/34278 [19:23:34<16:19:50, 3.52s/it] {'loss': 0.1318, 'grad_norm': 0.7538051765640672, 'learning_rate': 5.032361513771767e-06, 'epoch': 0.51} 51%|█████▏ | 17585/34278 [19:23:34<16:19:50, 3.52s/it] 51%|█████▏ | 17586/34278 [19:23:39<18:22:45, 3.96s/it] {'loss': 0.1359, 'grad_norm': 0.8545320646510951, 'learning_rate': 5.0318890893669615e-06, 'epoch': 0.51} 51%|█████▏ | 17586/34278 
[19:23:39<18:22:45, 3.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 51%|█████▏ | 17587/34278 [19:23:42<17:30:34, 3.78s/it] {'loss': 0.1292, 'grad_norm': 0.9585410008967832, 'learning_rate': 5.031416664677456e-06, 'epoch': 0.51} 51%|█████▏ | 17587/34278 [19:23:42<17:30:34, 3.78s/it] 51%|█████▏ | 17588/34278 [19:23:45<16:37:48, 3.59s/it] {'loss': 0.1143, 'grad_norm': 1.0299142078365062, 'learning_rate': 5.030944239707471e-06, 'epoch': 0.51} 51%|█████▏ | 17588/34278 [19:23:45<16:37:48, 3.59s/it] 51%|█████▏ | 17589/34278 [19:23:48<16:03:56, 3.47s/it] {'loss': 0.1362, 'grad_norm': 0.9619283975036986, 'learning_rate': 5.0304718144612255e-06, 'epoch': 0.51} 51%|█████▏ | 17589/34278 [19:23:48<16:03:56, 3.47s/it] 51%|█████▏ | 17590/34278 [19:23:52<16:17:11, 3.51s/it] {'loss': 0.1368, 'grad_norm': 0.7429583931989896, 'learning_rate': 5.029999388942931e-06, 'epoch': 0.51} 51%|█████▏ | 17590/34278 [19:23:52<16:17:11, 3.51s/it] 51%|█████▏ | 17591/34278 [19:23:55<15:46:02, 3.40s/it] {'loss': 0.1452, 'grad_norm': 0.8229938502388782, 'learning_rate': 5.029526963156811e-06, 'epoch': 0.51} 51%|█████▏ | 17591/34278 [19:23:55<15:46:02, 3.40s/it] 51%|█████▏ | 17592/34278 [19:24:00<17:45:41, 3.83s/it] {'loss': 0.1206, 'grad_norm': 1.0138785203681089, 'learning_rate': 5.029054537107082e-06, 'epoch': 0.51} 51%|█████▏ | 17592/34278 [19:24:00<17:45:41, 3.83s/it] 51%|█████▏ | 17593/34278 [19:24:03<16:39:04, 3.59s/it] {'loss': 0.1138, 'grad_norm': 0.8362051566001064, 'learning_rate': 5.028582110797959e-06, 'epoch': 0.51} 51%|█████▏ | 17593/34278 [19:24:03<16:39:04, 3.59s/it] 51%|█████▏ | 17594/34278 [19:24:06<16:04:15, 3.47s/it] {'loss': 0.1448, 'grad_norm': 1.0142553667413128, 'learning_rate': 5.028109684233664e-06, 'epoch': 0.51} 51%|█████▏ | 17594/34278 [19:24:06<16:04:15, 3.47s/it] 51%|█████▏ | 17595/34278 
[19:24:10<16:01:00, 3.46s/it] {'loss': 0.1438, 'grad_norm': 0.9349747283301558, 'learning_rate': 5.027637257418412e-06, 'epoch': 0.51} 51%|█████▏ | 17595/34278 [19:24:10<16:01:00, 3.46s/it] 51%|█████▏ | 17596/34278 [19:24:13<16:12:10, 3.50s/it] {'loss': 0.1513, 'grad_norm': 0.8878888874083781, 'learning_rate': 5.02716483035642e-06, 'epoch': 0.51} 51%|█████▏ | 17596/34278 [19:24:13<16:12:10, 3.50s/it] 51%|█████▏ | 17597/34278 [19:24:17<16:13:23, 3.50s/it] {'loss': 0.1119, 'grad_norm': 0.7865816982391449, 'learning_rate': 5.026692403051908e-06, 'epoch': 0.51} 51%|█████▏ | 17597/34278 [19:24:17<16:13:23, 3.50s/it] 51%|█████▏ | 17598/34278 [19:24:21<16:38:37, 3.59s/it] {'loss': 0.1223, 'grad_norm': 0.7100548874746431, 'learning_rate': 5.026219975509091e-06, 'epoch': 0.51} 51%|█████▏ | 17598/34278 [19:24:21<16:38:37, 3.59s/it] 51%|█████▏ | 17599/34278 [19:24:24<16:35:45, 3.58s/it] {'loss': 0.1398, 'grad_norm': 0.9445340145825436, 'learning_rate': 5.02574754773219e-06, 'epoch': 0.51} 51%|█████▏ | 17599/34278 [19:24:24<16:35:45, 3.58s/it] 51%|█████▏ | 17600/34278 [19:24:29<18:10:05, 3.92s/it] {'loss': 0.1184, 'grad_norm': 1.002108731341329, 'learning_rate': 5.02527511972542e-06, 'epoch': 0.51} 51%|█████▏ | 17600/34278 [19:24:29<18:10:05, 3.92s/it] 51%|█████▏ | 17601/34278 [19:24:35<21:13:38, 4.58s/it] {'loss': 0.1557, 'grad_norm': 0.6824222856757094, 'learning_rate': 5.0248026914930006e-06, 'epoch': 0.51} 51%|█████▏ | 17601/34278 [19:24:35<21:13:38, 4.58s/it] 51%|█████▏ | 17602/34278 [19:24:38<19:27:11, 4.20s/it] {'loss': 0.1191, 'grad_norm': 0.9177669362364731, 'learning_rate': 5.024330263039148e-06, 'epoch': 0.51} 51%|█████▏ | 17602/34278 [19:24:38<19:27:11, 4.20s/it] 51%|█████▏ | 17603/34278 [19:24:41<17:56:11, 3.87s/it] {'loss': 0.1572, 'grad_norm': 0.9441676301186944, 'learning_rate': 5.023857834368081e-06, 'epoch': 0.51} 51%|█████▏ | 17603/34278 [19:24:41<17:56:11, 3.87s/it] 51%|█████▏ | 17604/34278 [19:24:45<17:13:54, 3.72s/it] {'loss': 0.1245, 'grad_norm': 
0.7064923808484347, 'learning_rate': 5.023385405484018e-06, 'epoch': 0.51} 51%|█████▏ | 17604/34278 [19:24:45<17:13:54, 3.72s/it] 51%|█████▏ | 17605/34278 [19:24:48<16:47:23, 3.63s/it] {'loss': 0.1275, 'grad_norm': 0.7447612330377841, 'learning_rate': 5.022912976391174e-06, 'epoch': 0.51} 51%|█████▏ | 17605/34278 [19:24:48<16:47:23, 3.63s/it] 51%|█████▏ | 17606/34278 [19:24:52<16:53:48, 3.65s/it] {'loss': 0.1269, 'grad_norm': 0.9186222233356722, 'learning_rate': 5.022440547093768e-06, 'epoch': 0.51} 51%|█████▏ | 17606/34278 [19:24:52<16:53:48, 3.65s/it] 51%|█████▏ | 17607/34278 [19:24:55<16:11:21, 3.50s/it] {'loss': 0.1119, 'grad_norm': 0.674380939403684, 'learning_rate': 5.02196811759602e-06, 'epoch': 0.51} 51%|█████▏ | 17607/34278 [19:24:55<16:11:21, 3.50s/it] 51%|█████▏ | 17608/34278 [19:24:58<15:59:56, 3.46s/it] {'loss': 0.122, 'grad_norm': 0.6018218128908944, 'learning_rate': 5.021495687902144e-06, 'epoch': 0.51} 51%|█████▏ | 17608/34278 [19:24:58<15:59:56, 3.46s/it] 51%|█████▏ | 17609/34278 [19:25:02<15:32:32, 3.36s/it] {'loss': 0.137, 'grad_norm': 0.8038044934493266, 'learning_rate': 5.021023258016362e-06, 'epoch': 0.51} 51%|█████▏ | 17609/34278 [19:25:02<15:32:32, 3.36s/it] 51%|█████▏ | 17610/34278 [19:25:04<14:33:05, 3.14s/it] {'loss': 0.1238, 'grad_norm': 0.7717034856489727, 'learning_rate': 5.020550827942887e-06, 'epoch': 0.51} 51%|█████▏ | 17610/34278 [19:25:04<14:33:05, 3.14s/it] 51%|█████▏ | 17611/34278 [19:25:08<15:46:26, 3.41s/it] {'loss': 0.1161, 'grad_norm': 0.7571141400189079, 'learning_rate': 5.02007839768594e-06, 'epoch': 0.51} 51%|█████▏ | 17611/34278 [19:25:08<15:46:26, 3.41s/it] 51%|█████▏ | 17612/34278 [19:25:11<15:28:54, 3.34s/it] {'loss': 0.1792, 'grad_norm': 0.7604141439047912, 'learning_rate': 5.019605967249739e-06, 'epoch': 0.51} 51%|█████▏ | 17612/34278 [19:25:11<15:28:54, 3.34s/it] 51%|█████▏ | 17613/34278 [19:25:15<15:30:51, 3.35s/it] {'loss': 0.1323, 'grad_norm': 0.9124473784167703, 'learning_rate': 5.019133536638499e-06, 'epoch': 
0.51} 51%|█████▏ | 17613/34278 [19:25:15<15:30:51, 3.35s/it] 51%|█████▏ | 17614/34278 [19:25:19<16:38:01, 3.59s/it] {'loss': 0.1159, 'grad_norm': 0.9377508153489198, 'learning_rate': 5.018661105856439e-06, 'epoch': 0.51} 51%|█████▏ | 17614/34278 [19:25:19<16:38:01, 3.59s/it] 51%|█████▏ | 17615/34278 [19:25:23<16:36:58, 3.59s/it] {'loss': 0.1254, 'grad_norm': 0.9459563426893842, 'learning_rate': 5.0181886749077795e-06, 'epoch': 0.51} 51%|█████▏ | 17615/34278 [19:25:23<16:36:58, 3.59s/it] 51%|█████▏ | 17616/34278 [19:25:26<16:12:28, 3.50s/it] {'loss': 0.1417, 'grad_norm': 0.8730453821464376, 'learning_rate': 5.017716243796733e-06, 'epoch': 0.51} 51%|█████▏ | 17616/34278 [19:25:26<16:12:28, 3.50s/it] 51%|█████▏ | 17617/34278 [19:25:29<16:04:04, 3.47s/it] {'loss': 0.1297, 'grad_norm': 0.9597291800094656, 'learning_rate': 5.017243812527522e-06, 'epoch': 0.51} 51%|█████▏ | 17617/34278 [19:25:29<16:04:04, 3.47s/it] 51%|█████▏ | 17618/34278 [19:25:33<16:26:17, 3.55s/it] {'loss': 0.1524, 'grad_norm': 1.116313804130204, 'learning_rate': 5.0167713811043615e-06, 'epoch': 0.51} 51%|█████▏ | 17618/34278 [19:25:33<16:26:17, 3.55s/it] 51%|█████▏ | 17619/34278 [19:25:36<15:51:32, 3.43s/it] {'loss': 0.1129, 'grad_norm': 0.6809247080188088, 'learning_rate': 5.016298949531472e-06, 'epoch': 0.51} 51%|█████▏ | 17619/34278 [19:25:36<15:51:32, 3.43s/it] 51%|█████▏ | 17620/34278 [19:25:39<15:00:26, 3.24s/it] {'loss': 0.1142, 'grad_norm': 0.9446785784678333, 'learning_rate': 5.015826517813066e-06, 'epoch': 0.51} 51%|█████▏ | 17620/34278 [19:25:39<15:00:26, 3.24s/it] 51%|█████▏ | 17621/34278 [19:25:43<15:33:09, 3.36s/it] {'loss': 0.1257, 'grad_norm': 0.8700839566767229, 'learning_rate': 5.0153540859533666e-06, 'epoch': 0.51} 51%|█████▏ | 17621/34278 [19:25:43<15:33:09, 3.36s/it] 51%|█████▏ | 17622/34278 [19:25:49<19:21:02, 4.18s/it] {'loss': 0.1361, 'grad_norm': 0.6687352248983223, 'learning_rate': 5.01488165395659e-06, 'epoch': 0.51} 51%|█████▏ | 17622/34278 [19:25:49<19:21:02, 4.18s/it] 
51%|█████▏ | 17623/34278 [19:25:52<17:53:00, 3.87s/it] {'loss': 0.1298, 'grad_norm': 0.8536488013155896, 'learning_rate': 5.0144092218269524e-06, 'epoch': 0.51} 51%|█████▏ | 17623/34278 [19:25:52<17:53:00, 3.87s/it] 51%|█████▏ | 17624/34278 [19:25:55<16:55:35, 3.66s/it] {'loss': 0.1075, 'grad_norm': 0.8045445463211859, 'learning_rate': 5.013936789568674e-06, 'epoch': 0.51} 51%|█████▏ | 17624/34278 [19:25:55<16:55:35, 3.66s/it] 51%|█████▏ | 17625/34278 [19:25:58<16:34:56, 3.58s/it] {'loss': 0.1283, 'grad_norm': 0.886544578939302, 'learning_rate': 5.013464357185971e-06, 'epoch': 0.51} 51%|█████▏ | 17625/34278 [19:25:58<16:34:56, 3.58s/it] 51%|█████▏ | 17626/34278 [19:26:02<16:07:27, 3.49s/it] {'loss': 0.1465, 'grad_norm': 0.8834498824643948, 'learning_rate': 5.01299192468306e-06, 'epoch': 0.51} 51%|█████▏ | 17626/34278 [19:26:02<16:07:27, 3.49s/it] 51%|█████▏ | 17627/34278 [19:26:05<15:50:24, 3.42s/it] {'loss': 0.1357, 'grad_norm': 0.8235137054250127, 'learning_rate': 5.012519492064162e-06, 'epoch': 0.51} 51%|█████▏ | 17627/34278 [19:26:05<15:50:24, 3.42s/it] 51%|█████▏ | 17628/34278 [19:26:09<16:36:27, 3.59s/it] {'loss': 0.1136, 'grad_norm': 0.7779755421483731, 'learning_rate': 5.012047059333492e-06, 'epoch': 0.51} 51%|█████▏ | 17628/34278 [19:26:09<16:36:27, 3.59s/it] 51%|█████▏ | 17629/34278 [19:26:14<18:52:00, 4.08s/it] {'loss': 0.1601, 'grad_norm': 0.7521641273170295, 'learning_rate': 5.011574626495269e-06, 'epoch': 0.51} 51%|█████▏ | 17629/34278 [19:26:14<18:52:00, 4.08s/it] 51%|█████▏ | 17630/34278 [19:26:18<18:40:18, 4.04s/it] {'loss': 0.133, 'grad_norm': 0.7202731880658887, 'learning_rate': 5.01110219355371e-06, 'epoch': 0.51} 51%|█████▏ | 17630/34278 [19:26:18<18:40:18, 4.04s/it] 51%|█████▏ | 17631/34278 [19:26:23<20:39:34, 4.47s/it] {'loss': 0.1204, 'grad_norm': 0.771261148817068, 'learning_rate': 5.010629760513034e-06, 'epoch': 0.51} 51%|█████▏ | 17631/34278 [19:26:23<20:39:34, 4.47s/it] 51%|█████▏ | 17632/34278 [19:26:28<20:17:15, 4.39s/it] {'loss': 
0.1112, 'grad_norm': 0.7949068248008649, 'learning_rate': 5.010157327377457e-06, 'epoch': 0.51} 51%|█████▏ | 17632/34278 [19:26:28<20:17:15, 4.39s/it] 51%|█████▏ | 17633/34278 [19:26:32<19:42:08, 4.26s/it] {'loss': 0.1575, 'grad_norm': 0.7917686820114463, 'learning_rate': 5.009684894151199e-06, 'epoch': 0.51} 51%|█████▏ | 17633/34278 [19:26:32<19:42:08, 4.26s/it] 51%|█████▏ | 17634/34278 [19:26:35<18:04:08, 3.91s/it] {'loss': 0.123, 'grad_norm': 0.7125942396034246, 'learning_rate': 5.009212460838477e-06, 'epoch': 0.51} 51%|█████▏ | 17634/34278 [19:26:35<18:04:08, 3.91s/it] 51%|█████▏ | 17635/34278 [19:26:38<16:43:39, 3.62s/it] {'loss': 0.1482, 'grad_norm': 0.8952945642164067, 'learning_rate': 5.008740027443506e-06, 'epoch': 0.51} 51%|█████▏ | 17635/34278 [19:26:38<16:43:39, 3.62s/it] 51%|█████▏ | 17636/34278 [19:26:44<20:49:04, 4.50s/it] {'loss': 0.1398, 'grad_norm': 0.8271549553230554, 'learning_rate': 5.008267593970507e-06, 'epoch': 0.51} 51%|█████▏ | 17636/34278 [19:26:44<20:49:04, 4.50s/it] 51%|█████▏ | 17637/34278 [19:26:48<19:18:46, 4.18s/it] {'loss': 0.144, 'grad_norm': 0.7547581475037284, 'learning_rate': 5.0077951604236985e-06, 'epoch': 0.51} 51%|█████▏ | 17637/34278 [19:26:48<19:18:46, 4.18s/it] 51%|█████▏ | 17638/34278 [19:26:50<17:20:47, 3.75s/it] {'loss': 0.1212, 'grad_norm': 0.8611805200564947, 'learning_rate': 5.007322726807294e-06, 'epoch': 0.51} 51%|█████▏ | 17638/34278 [19:26:50<17:20:47, 3.75s/it] 51%|█████▏ | 17639/34278 [19:26:54<16:44:20, 3.62s/it] {'loss': 0.1372, 'grad_norm': 0.8804799122613016, 'learning_rate': 5.006850293125517e-06, 'epoch': 0.51} 51%|█████▏ | 17639/34278 [19:26:54<16:44:20, 3.62s/it] 51%|█████▏ | 17640/34278 [19:26:58<16:59:28, 3.68s/it] {'loss': 0.123, 'grad_norm': 0.7698369940580078, 'learning_rate': 5.0063778593825805e-06, 'epoch': 0.51} 51%|█████▏ | 17640/34278 [19:26:58<16:59:28, 3.68s/it] 51%|█████▏ | 17641/34278 [19:27:01<16:04:57, 3.48s/it] {'loss': 0.1588, 'grad_norm': 0.916682018196249, 'learning_rate': 
5.005905425582705e-06, 'epoch': 0.51} 51%|█████▏ | 17641/34278 [19:27:01<16:04:57, 3.48s/it] 51%|█████▏ | 17642/34278 [19:27:04<15:29:29, 3.35s/it] {'loss': 0.1291, 'grad_norm': 0.7255126727410842, 'learning_rate': 5.005432991730106e-06, 'epoch': 0.51} 51%|█████▏ | 17642/34278 [19:27:04<15:29:29, 3.35s/it] 51%|█████▏ | 17643/34278 [19:27:07<16:01:09, 3.47s/it] {'loss': 0.1393, 'grad_norm': 0.8011659620481545, 'learning_rate': 5.0049605578290025e-06, 'epoch': 0.51} 51%|█████▏ | 17643/34278 [19:27:07<16:01:09, 3.47s/it] 51%|█████▏ | 17644/34278 [19:27:13<19:24:03, 4.20s/it] {'loss': 0.1165, 'grad_norm': 0.770103932765385, 'learning_rate': 5.004488123883614e-06, 'epoch': 0.51} 51%|█████▏ | 17644/34278 [19:27:13<19:24:03, 4.20s/it] 51%|█████▏ | 17645/34278 [19:27:16<17:30:16, 3.79s/it] {'loss': 0.1191, 'grad_norm': 0.6142928258533513, 'learning_rate': 5.004015689898155e-06, 'epoch': 0.51} 51%|█████▏ | 17645/34278 [19:27:16<17:30:16, 3.79s/it] 51%|█████▏ | 17646/34278 [19:27:19<16:14:46, 3.52s/it] {'loss': 0.157, 'grad_norm': 1.0560506694150225, 'learning_rate': 5.003543255876845e-06, 'epoch': 0.51} 51%|█████▏ | 17646/34278 [19:27:19<16:14:46, 3.52s/it] 51%|█████▏ | 17647/34278 [19:27:22<15:24:32, 3.34s/it] {'loss': 0.1436, 'grad_norm': 0.8755699419388113, 'learning_rate': 5.0030708218239025e-06, 'epoch': 0.51} 51%|█████▏ | 17647/34278 [19:27:22<15:24:32, 3.34s/it] 51%|█████▏ | 17648/34278 [19:27:25<15:10:38, 3.29s/it] {'loss': 0.1349, 'grad_norm': 0.794533891524701, 'learning_rate': 5.002598387743544e-06, 'epoch': 0.51} 51%|█████▏ | 17648/34278 [19:27:25<15:10:38, 3.29s/it] 51%|█████▏ | 17649/34278 [19:27:28<14:37:14, 3.17s/it] {'loss': 0.1253, 'grad_norm': 0.8392470518062469, 'learning_rate': 5.002125953639988e-06, 'epoch': 0.51} 51%|█████▏ | 17649/34278 [19:27:28<14:37:14, 3.17s/it] 51%|█████▏ | 17650/34278 [19:27:31<14:33:05, 3.15s/it] {'loss': 0.1304, 'grad_norm': 0.9559536658700135, 'learning_rate': 5.001653519517451e-06, 'epoch': 0.51} 51%|█████▏ | 17650/34278 
[19:27:31<14:33:05, 3.15s/it] 51%|█████▏ | 17651/34278 [19:27:35<16:06:45, 3.49s/it] {'loss': 0.1282, 'grad_norm': 0.7364703795814111, 'learning_rate': 5.001181085380152e-06, 'epoch': 0.51} 51%|█████▏ | 17651/34278 [19:27:35<16:06:45, 3.49s/it] 51%|█████▏ | 17652/34278 [19:27:38<15:21:10, 3.32s/it] {'loss': 0.1213, 'grad_norm': 0.8225490988635491, 'learning_rate': 5.00070865123231e-06, 'epoch': 0.51} 51%|█████▏ | 17652/34278 [19:27:38<15:21:10, 3.32s/it] 51%|█████▏ | 17653/34278 [19:27:41<15:06:27, 3.27s/it] {'loss': 0.1303, 'grad_norm': 1.0874898380750317, 'learning_rate': 5.000236217078139e-06, 'epoch': 0.51} 51%|█████▏ | 17653/34278 [19:27:41<15:06:27, 3.27s/it] 52%|█████▏ | 17654/34278 [19:27:46<16:14:50, 3.52s/it] {'loss': 0.1224, 'grad_norm': 0.6688242311374385, 'learning_rate': 4.999763782921862e-06, 'epoch': 0.52} 52%|█████▏ | 17654/34278 [19:27:46<16:14:50, 3.52s/it] 52%|█████▏ | 17655/34278 [19:27:50<17:09:24, 3.72s/it] {'loss': 0.1468, 'grad_norm': 0.8607544504149954, 'learning_rate': 4.999291348767692e-06, 'epoch': 0.52} 52%|█████▏ | 17655/34278 [19:27:50<17:09:24, 3.72s/it] 52%|█████▏ | 17656/34278 [19:27:53<16:17:07, 3.53s/it] {'loss': 0.1319, 'grad_norm': 1.0217087991651117, 'learning_rate': 4.998818914619849e-06, 'epoch': 0.52} 52%|█████▏ | 17656/34278 [19:27:53<16:17:07, 3.53s/it] 52%|█████▏ | 17657/34278 [19:27:56<16:17:35, 3.53s/it] {'loss': 0.1451, 'grad_norm': 0.8437567190693104, 'learning_rate': 4.99834648048255e-06, 'epoch': 0.52} 52%|█████▏ | 17657/34278 [19:27:56<16:17:35, 3.53s/it] 52%|█████▏ | 17658/34278 [19:28:02<19:35:59, 4.25s/it] {'loss': 0.1269, 'grad_norm': 0.7201711567972063, 'learning_rate': 4.997874046360013e-06, 'epoch': 0.52} 52%|█████▏ | 17658/34278 [19:28:02<19:35:59, 4.25s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 52%|█████▏ | 17659/34278 [19:28:08<21:06:58, 4.57s/it] {'loss': 0.1332, 'grad_norm': 0.8352761030653358, 'learning_rate': 4.997401612256458e-06, 'epoch': 0.52} 52%|█████▏ | 17659/34278 [19:28:08<21:06:58, 4.57s/it] 52%|█████▏ | 17660/34278 [19:28:11<18:54:01, 4.09s/it] {'loss': 0.1416, 'grad_norm': 1.004129729343521, 'learning_rate': 4.996929178176099e-06, 'epoch': 0.52} 52%|█████▏ | 17660/34278 [19:28:11<18:54:01, 4.09s/it] 52%|█████▏ | 17661/34278 [19:28:15<18:55:58, 4.10s/it] {'loss': 0.117, 'grad_norm': 1.2824937203244915, 'learning_rate': 4.996456744123156e-06, 'epoch': 0.52} 52%|█████▏ | 17661/34278 [19:28:15<18:55:58, 4.10s/it] 52%|█████▏ | 17662/34278 [19:28:18<17:41:54, 3.83s/it] {'loss': 0.1129, 'grad_norm': 1.1857359085225379, 'learning_rate': 4.995984310101847e-06, 'epoch': 0.52} 52%|█████▏ | 17662/34278 [19:28:18<17:41:54, 3.83s/it] 52%|█████▏ | 17663/34278 [19:28:21<16:51:15, 3.65s/it] {'loss': 0.121, 'grad_norm': 1.1371941299943034, 'learning_rate': 4.995511876116387e-06, 'epoch': 0.52} 52%|█████▏ | 17663/34278 [19:28:21<16:51:15, 3.65s/it] 52%|█████▏ | 17664/34278 [19:28:25<17:12:47, 3.73s/it] {'loss': 0.1331, 'grad_norm': 1.196261158483275, 'learning_rate': 4.995039442170998e-06, 'epoch': 0.52} 52%|█████▏ | 17664/34278 [19:28:25<17:12:47, 3.73s/it] 52%|█████▏ | 17665/34278 [19:28:31<20:04:34, 4.35s/it] {'loss': 0.1541, 'grad_norm': 1.1620631468864877, 'learning_rate': 4.9945670082698945e-06, 'epoch': 0.52} 52%|█████▏ | 17665/34278 [19:28:31<20:04:34, 4.35s/it] 52%|█████▏ | 17666/34278 [19:28:34<18:05:53, 3.92s/it] {'loss': 0.1258, 'grad_norm': 1.0097065937778051, 'learning_rate': 4.994094574417296e-06, 'epoch': 0.52} 52%|█████▏ | 17666/34278 [19:28:34<18:05:53, 3.92s/it] 52%|█████▏ | 17667/34278 [19:28:37<16:58:37, 3.68s/it] {'loss': 0.144, 'grad_norm': 0.821252905649037, 'learning_rate': 4.993622140617421e-06, 'epoch': 0.52} 52%|█████▏ | 17667/34278 [19:28:37<16:58:37, 3.68s/it] 52%|█████▏ | 17668/34278 
[19:28:40<16:13:46, 3.52s/it] {'loss': 0.1354, 'grad_norm': 0.9205304381811602, 'learning_rate': 4.993149706874485e-06, 'epoch': 0.52} 52%|█████▏ | 17668/34278 [19:28:40<16:13:46, 3.52s/it] 52%|█████▏ | 17669/34278 [19:28:43<16:07:57, 3.50s/it] {'loss': 0.1268, 'grad_norm': 1.2005603618970164, 'learning_rate': 4.992677273192706e-06, 'epoch': 0.52} 52%|█████▏ | 17669/34278 [19:28:43<16:07:57, 3.50s/it] 52%|█████▏ | 17670/34278 [19:28:46<15:01:01, 3.26s/it] {'loss': 0.138, 'grad_norm': 0.9359330023349163, 'learning_rate': 4.992204839576302e-06, 'epoch': 0.52} 52%|█████▏ | 17670/34278 [19:28:46<15:01:01, 3.26s/it] 52%|█████▏ | 17671/34278 [19:28:50<15:24:10, 3.34s/it] {'loss': 0.1119, 'grad_norm': 0.8701714764215726, 'learning_rate': 4.9917324060294946e-06, 'epoch': 0.52} 52%|█████▏ | 17671/34278 [19:28:50<15:24:10, 3.34s/it] 52%|█████▏ | 17672/34278 [19:28:53<15:55:44, 3.45s/it] {'loss': 0.1162, 'grad_norm': 0.8238086979703602, 'learning_rate': 4.991259972556496e-06, 'epoch': 0.52} 52%|█████▏ | 17672/34278 [19:28:53<15:55:44, 3.45s/it] 52%|█████▏ | 17673/34278 [19:28:56<15:24:54, 3.34s/it] {'loss': 0.1222, 'grad_norm': 0.770604622471963, 'learning_rate': 4.990787539161525e-06, 'epoch': 0.52} 52%|█████▏ | 17673/34278 [19:28:56<15:24:54, 3.34s/it] 52%|█████▏ | 17674/34278 [19:28:59<14:52:53, 3.23s/it] {'loss': 0.1263, 'grad_norm': 0.6771629181307338, 'learning_rate': 4.990315105848804e-06, 'epoch': 0.52} 52%|█████▏ | 17674/34278 [19:28:59<14:52:53, 3.23s/it] 52%|█████▏ | 17675/34278 [19:29:02<14:00:05, 3.04s/it] {'loss': 0.1166, 'grad_norm': 0.8926126858578739, 'learning_rate': 4.989842672622543e-06, 'epoch': 0.52} 52%|█████▏ | 17675/34278 [19:29:02<14:00:05, 3.04s/it] 52%|█████▏ | 17676/34278 [19:29:08<18:03:24, 3.92s/it] {'loss': 0.1322, 'grad_norm': 0.8297622269993699, 'learning_rate': 4.989370239486968e-06, 'epoch': 0.52} 52%|█████▏ | 17676/34278 [19:29:08<18:03:24, 3.92s/it] 52%|█████▏ | 17677/34278 [19:29:12<17:37:10, 3.82s/it] {'loss': 0.1313, 'grad_norm': 
0.7446893830893277, 'learning_rate': 4.988897806446291e-06, 'epoch': 0.52} 52%|█████▏ | 17677/34278 [19:29:12<17:37:10, 3.82s/it] 52%|█████▏ | 17678/34278 [19:29:14<16:08:26, 3.50s/it] {'loss': 0.1541, 'grad_norm': 0.9243250876529159, 'learning_rate': 4.9884253735047325e-06, 'epoch': 0.52} 52%|█████▏ | 17678/34278 [19:29:14<16:08:26, 3.50s/it] 52%|█████▏ | 17679/34278 [19:29:18<16:19:57, 3.54s/it] {'loss': 0.1197, 'grad_norm': 0.8860627931972566, 'learning_rate': 4.98795294066651e-06, 'epoch': 0.52} 52%|█████▏ | 17679/34278 [19:29:18<16:19:57, 3.54s/it] 52%|█████▏ | 17680/34278 [19:29:21<15:15:21, 3.31s/it] {'loss': 0.1245, 'grad_norm': 1.0050192417510047, 'learning_rate': 4.987480507935841e-06, 'epoch': 0.52} 52%|█████▏ | 17680/34278 [19:29:21<15:15:21, 3.31s/it] 52%|█████▏ | 17681/34278 [19:29:24<14:42:21, 3.19s/it] {'loss': 0.1387, 'grad_norm': 0.7079725561051315, 'learning_rate': 4.987008075316941e-06, 'epoch': 0.52} 52%|█████▏ | 17681/34278 [19:29:24<14:42:21, 3.19s/it] 52%|█████▏ | 17682/34278 [19:29:28<15:51:39, 3.44s/it] {'loss': 0.1236, 'grad_norm': 0.7984669677892621, 'learning_rate': 4.986535642814031e-06, 'epoch': 0.52} 52%|█████▏ | 17682/34278 [19:29:28<15:51:39, 3.44s/it] 52%|█████▏ | 17683/34278 [19:29:34<19:33:57, 4.24s/it] {'loss': 0.1452, 'grad_norm': 0.9139776999032819, 'learning_rate': 4.9860632104313276e-06, 'epoch': 0.52} 52%|█████▏ | 17683/34278 [19:29:34<19:33:57, 4.24s/it] 52%|█████▏ | 17684/34278 [19:29:37<17:55:03, 3.89s/it] {'loss': 0.1086, 'grad_norm': 0.7610917318336733, 'learning_rate': 4.985590778173049e-06, 'epoch': 0.52} 52%|█████▏ | 17684/34278 [19:29:37<17:55:03, 3.89s/it] 52%|█████▏ | 17685/34278 [19:29:40<17:17:55, 3.75s/it] {'loss': 0.1461, 'grad_norm': 0.7632751121205954, 'learning_rate': 4.9851183460434115e-06, 'epoch': 0.52} 52%|█████▏ | 17685/34278 [19:29:40<17:17:55, 3.75s/it] 52%|█████▏ | 17686/34278 [19:29:46<19:32:55, 4.24s/it] {'loss': 0.1279, 'grad_norm': 0.7501627366479159, 'learning_rate': 4.984645914046635e-06, 
'epoch': 0.52} 52%|█████▏ | 17686/34278 [19:29:46<19:32:55, 4.24s/it]
52%|█████▏ | 17687/34278 [19:29:49<17:36:33, 3.82s/it] {'loss': 0.1554, 'grad_norm': 1.019683836730631, 'learning_rate': 4.984173482186934e-06, 'epoch': 0.52}
52%|█████▏ | 17688/34278 [19:29:54<19:59:31, 4.34s/it] {'loss': 0.168, 'grad_norm': 0.8740614422971671, 'learning_rate': 4.98370105046853e-06, 'epoch': 0.52}
52%|█████▏ | 17689/34278 [19:29:57<18:09:25, 3.94s/it] {'loss': 0.1221, 'grad_norm': 0.7718973254598634, 'learning_rate': 4.983228618895639e-06, 'epoch': 0.52}
52%|█████▏ | 17690/34278 [19:30:01<17:36:00, 3.82s/it] {'loss': 0.1459, 'grad_norm': 0.7935483578242476, 'learning_rate': 4.98275618747248e-06, 'epoch': 0.52}
52%|█████▏ | 17691/34278 [19:30:05<18:44:46, 4.07s/it] {'loss': 0.1521, 'grad_norm': 1.0596746784375382, 'learning_rate': 4.982283756203268e-06, 'epoch': 0.52}
52%|█████▏ | 17692/34278 [19:30:08<17:22:02, 3.77s/it] {'loss': 0.1212, 'grad_norm': 0.7472360856697575, 'learning_rate': 4.981811325092224e-06, 'epoch': 0.52}
52%|█████▏ | 17693/34278 [19:30:13<17:55:46, 3.89s/it] {'loss': 0.1408, 'grad_norm': 0.7540684429918678, 'learning_rate': 4.98133889414356e-06, 'epoch': 0.52}
52%|█████▏ | 17694/34278 [19:30:18<19:37:59, 4.26s/it] {'loss': 0.1413, 'grad_norm': 0.8577320979432324, 'learning_rate': 4.980866463361502e-06, 'epoch': 0.52}
52%|█████▏ | 17695/34278 [19:30:21<17:50:15, 3.87s/it] {'loss': 0.1542, 'grad_norm': 0.9033068190112308, 'learning_rate': 4.980394032750263e-06, 'epoch': 0.52}
52%|█████▏ | 17696/34278 [19:30:25<18:16:56, 3.97s/it] {'loss': 0.1314, 'grad_norm': 0.7692487755193975, 'learning_rate': 4.979921602314061e-06, 'epoch': 0.52}
52%|█████▏ | 17697/34278 [19:30:28<16:57:53, 3.68s/it] {'loss': 0.1336, 'grad_norm': 0.9400376900876605, 'learning_rate': 4.979449172057115e-06, 'epoch': 0.52}
52%|█████▏ | 17698/34278 [19:30:31<16:18:55, 3.54s/it] {'loss': 0.1241, 'grad_norm': 0.7299198992071241, 'learning_rate': 4.978976741983641e-06, 'epoch': 0.52}
52%|█████▏ | 17699/34278 [19:30:36<18:13:16, 3.96s/it] {'loss': 0.1343, 'grad_norm': 0.7482717083200953, 'learning_rate': 4.978504312097856e-06, 'epoch': 0.52}
52%|█████▏ | 17700/34278 [19:30:40<18:55:27, 4.11s/it] {'loss': 0.1507, 'grad_norm': 0.8087436392121131, 'learning_rate': 4.978031882403981e-06, 'epoch': 0.52}
52%|█████▏ | 17701/34278 [19:30:43<17:11:16, 3.73s/it] {'loss': 0.1319, 'grad_norm': 0.868521306445695, 'learning_rate': 4.977559452906233e-06, 'epoch': 0.52}
52%|█████▏ | 17702/34278 [19:30:46<16:21:05, 3.55s/it] {'loss': 0.1178, 'grad_norm': 0.936283371969216, 'learning_rate': 4.977087023608828e-06, 'epoch': 0.52}
52%|█████▏ | 17703/34278 [19:30:49<15:42:13, 3.41s/it] {'loss': 0.1491, 'grad_norm': 1.175195682655177, 'learning_rate': 4.976614594515985e-06, 'epoch': 0.52}
52%|█████▏ | 17704/34278 [19:30:53<16:08:14, 3.51s/it] {'loss': 0.1439, 'grad_norm': 0.9890558252635985, 'learning_rate': 4.976142165631921e-06, 'epoch': 0.52}
52%|█████▏ | 17705/34278 [19:30:56<15:42:27, 3.41s/it] {'loss': 0.1493, 'grad_norm': 0.9898106675366942, 'learning_rate': 4.975669736960852e-06, 'epoch': 0.52}
52%|█████▏ | 17706/34278 [19:31:00<15:18:27, 3.33s/it] {'loss': 0.1283, 'grad_norm': 1.0099130958080584, 'learning_rate': 4.975197308507001e-06, 'epoch': 0.52}
52%|█████▏ | 17707/34278 [19:31:02<14:48:13, 3.22s/it] {'loss': 0.114, 'grad_norm': 0.6482686432632625, 'learning_rate': 4.9747248802745814e-06, 'epoch': 0.52}
52%|█████▏ | 17708/34278 [19:31:07<16:30:04, 3.59s/it] {'loss': 0.1499, 'grad_norm': 0.9074951647529433, 'learning_rate': 4.974252452267811e-06, 'epoch': 0.52}
52%|█████▏ | 17709/34278 [19:31:10<16:15:55, 3.53s/it] {'loss': 0.1286, 'grad_norm': 0.8029354281442845, 'learning_rate': 4.973780024490911e-06, 'epoch': 0.52}
52%|█████▏ | 17710/34278 [19:31:14<16:19:57, 3.55s/it] {'loss': 0.1491, 'grad_norm': 0.9127033778546586, 'learning_rate': 4.9733075969480945e-06, 'epoch': 0.52}
52%|█████▏ | 17711/34278 [19:31:18<17:35:36, 3.82s/it] {'loss': 0.125, 'grad_norm': 0.7613957159556328, 'learning_rate': 4.972835169643581e-06, 'epoch': 0.52}
52%|█████▏ | 17712/34278 [19:31:22<17:46:10, 3.86s/it] {'loss': 0.1548, 'grad_norm': 0.6994314010900585, 'learning_rate': 4.9723627425815895e-06, 'epoch': 0.52}
52%|█████▏ | 17713/34278 [19:31:26<17:27:32, 3.79s/it] {'loss': 0.1374, 'grad_norm': 0.7249464277301007, 'learning_rate': 4.9718903157663364e-06, 'epoch': 0.52}
52%|█████▏ | 17714/34278 [19:31:29<16:31:53, 3.59s/it] {'loss': 0.1414, 'grad_norm': 0.8740554154516622, 'learning_rate': 4.971417889202042e-06, 'epoch': 0.52}
52%|█████▏ | 17715/34278 [19:31:32<15:34:25, 3.38s/it] {'loss': 0.1279, 'grad_norm': 0.7367444142907535, 'learning_rate': 4.97094546289292e-06, 'epoch': 0.52}
52%|█████▏ | 17716/34278 [19:31:35<15:10:42, 3.30s/it] {'loss': 0.1249, 'grad_norm': 0.8791510432754738, 'learning_rate': 4.97047303684319e-06, 'epoch': 0.52}
52%|█████▏ | 17717/34278 [19:31:38<14:37:55, 3.18s/it] {'loss': 0.1386, 'grad_norm': 0.8161126947784589, 'learning_rate': 4.970000611057069e-06, 'epoch': 0.52}
52%|█████▏ | 17718/34278 [19:31:42<15:09:35, 3.30s/it] {'loss': 0.1372, 'grad_norm': 0.8634750377528411, 'learning_rate': 4.969528185538776e-06, 'epoch': 0.52}
52%|█████▏ | 17719/34278 [19:31:45<15:08:06, 3.29s/it] {'loss': 0.1436, 'grad_norm': 0.8407895590347889, 'learning_rate': 4.96905576029253e-06, 'epoch': 0.52}
52%|█████▏ | 17720/34278 [19:31:48<14:42:51, 3.20s/it] {'loss': 0.1389, 'grad_norm': 1.0012309494361669, 'learning_rate': 4.968583335322545e-06, 'epoch': 0.52}
52%|█████▏ | 17721/34278 [19:31:51<14:15:41, 3.10s/it] {'loss': 0.1409, 'grad_norm': 1.0487690076129972, 'learning_rate': 4.96811091063304e-06, 'epoch': 0.52}
52%|█████▏ | 17722/34278 [19:31:54<14:43:05, 3.20s/it] {'loss': 0.1461, 'grad_norm': 0.6886249733579866, 'learning_rate': 4.967638486228235e-06, 'epoch': 0.52}
52%|█████▏ | 17723/34278 [19:31:57<14:18:28, 3.11s/it] {'loss': 0.1489, 'grad_norm': 1.7070397978566778, 'learning_rate': 4.967166062112342e-06, 'epoch': 0.52}
52%|█████▏ | 17724/34278 [19:32:01<15:10:56, 3.30s/it] {'loss': 0.1172, 'grad_norm': 0.8597711687918821, 'learning_rate': 4.966693638289587e-06, 'epoch': 0.52}
52%|█████▏ | 17725/34278 [19:32:04<14:31:51, 3.16s/it] {'loss': 0.1234, 'grad_norm': 0.853591338688919, 'learning_rate': 4.9662212147641805e-06, 'epoch': 0.52}
52%|█████▏ | 17726/34278 [19:32:09<17:29:19, 3.80s/it] {'loss': 0.1324, 'grad_norm': 0.719466915487513, 'learning_rate': 4.9657487915403446e-06, 'epoch': 0.52}
52%|█████▏ | 17727/34278 [19:32:12<16:39:11, 3.62s/it] {'loss': 0.1291, 'grad_norm': 0.8899182333294233, 'learning_rate': 4.965276368622295e-06, 'epoch': 0.52}
52%|█████▏ | 17728/34278 [19:32:15<15:48:44, 3.44s/it] {'loss': 0.1344, 'grad_norm': 0.9061730129709415, 'learning_rate': 4.96480394601425e-06, 'epoch': 0.52}
52%|█████▏ | 17729/34278 [19:32:21<19:17:53, 4.20s/it] {'loss': 0.155, 'grad_norm': 0.9292343509728695, 'learning_rate': 4.9643315237204246e-06, 'epoch': 0.52}
52%|█████▏ | 17730/34278 [19:32:24<17:45:26, 3.86s/it] {'loss': 0.1327, 'grad_norm': 0.9195084573904982, 'learning_rate': 4.963859101745041e-06, 'epoch': 0.52}
52%|█████▏ | 17731/34278 [19:32:27<16:33:46, 3.60s/it] {'loss': 0.1512, 'grad_norm': 1.9916532290169138, 'learning_rate': 4.9633866800923145e-06, 'epoch': 0.52}
52%|█████▏ | 17732/34278 [19:32:30<15:58:57, 3.48s/it] {'loss': 0.1268, 'grad_norm': 0.8279354201987689, 'learning_rate': 4.962914258766463e-06, 'epoch': 0.52}
52%|█████▏ | 17733/34278 [19:32:34<15:35:55, 3.39s/it] {'loss': 0.126, 'grad_norm': 0.7465001089712736, 'learning_rate': 4.962441837771704e-06, 'epoch': 0.52}
52%|█████▏ | 17734/34278 [19:32:38<17:31:42, 3.81s/it] {'loss': 0.1317, 'grad_norm': 1.2480594259412, 'learning_rate': 4.961969417112256e-06, 'epoch': 0.52}
52%|█████▏ | 17735/34278 [19:32:42<16:52:08, 3.67s/it] {'loss': 0.1374, 'grad_norm': 1.1180437220550352, 'learning_rate': 4.961496996792333e-06, 'epoch': 0.52}
52%|█████▏ | 17736/34278 [19:32:47<18:33:18, 4.04s/it] {'loss': 0.1264, 'grad_norm': 0.7644397191168941, 'learning_rate': 4.961024576816158e-06, 'epoch': 0.52}
52%|█████▏ | 17737/34278 [19:32:50<17:01:09, 3.70s/it] {'loss': 0.1287, 'grad_norm': 1.125579767318, 'learning_rate': 4.960552157187947e-06, 'epoch': 0.52}
52%|█████▏ | 17738/34278 [19:32:53<16:49:38, 3.66s/it] {'loss': 0.154, 'grad_norm': 1.7882388009657824, 'learning_rate': 4.9600797379119155e-06, 'epoch': 0.52}
52%|█████▏ | 17739/34278 [19:32:56<15:59:08, 3.48s/it] {'loss': 0.1181, 'grad_norm': 0.800660711242164, 'learning_rate': 4.959607318992284e-06, 'epoch': 0.52}
52%|█████▏ | 17740/34278 [19:33:01<17:32:32, 3.82s/it] {'loss': 0.1502, 'grad_norm': 1.1732319497813264, 'learning_rate': 4.959134900433268e-06, 'epoch': 0.52}
52%|█████▏ | 17741/34278 [19:33:04<16:43:54, 3.64s/it] {'loss': 0.1526, 'grad_norm': 1.1002167995407082, 'learning_rate': 4.958662482239084e-06, 'epoch': 0.52}
52%|█████▏ | 17742/34278 [19:33:08<16:56:25, 3.69s/it] {'loss': 0.1357, 'grad_norm': 0.839185950225196, 'learning_rate': 4.958190064413953e-06, 'epoch': 0.52}
52%|█████▏ | 17743/34278 [19:33:14<19:54:05, 4.33s/it] {'loss': 0.147, 'grad_norm': 0.8207595591182254, 'learning_rate': 4.957717646962091e-06, 'epoch': 0.52}
52%|█████▏ | 17744/34278 [19:33:17<17:55:17, 3.90s/it] {'loss': 0.1193, 'grad_norm': 1.0114933892310523, 'learning_rate': 4.957245229887717e-06, 'epoch': 0.52}
52%|█████▏ | 17745/34278 [19:33:23<20:53:14, 4.55s/it] {'loss': 0.135, 'grad_norm': 1.0847668761992957, 'learning_rate': 4.956772813195046e-06, 'epoch': 0.52}
52%|█████▏ | 17746/34278 [19:33:27<20:23:32, 4.44s/it] {'loss': 0.1245, 'grad_norm': 0.7678586858077632, 'learning_rate': 4.9563003968882975e-06, 'epoch': 0.52}
52%|█████▏ | 17747/34278 [19:33:30<18:17:15, 3.98s/it] {'loss': 0.147, 'grad_norm': 1.1090211114566713, 'learning_rate': 4.955827980971688e-06, 'epoch': 0.52}
52%|█████▏ | 17748/34278 [19:33:33<17:08:14, 3.73s/it] {'loss': 0.1256, 'grad_norm': 0.8378878431401968, 'learning_rate': 4.955355565449435e-06, 'epoch': 0.52}
52%|█████▏ | 17749/34278 [19:33:39<20:36:55, 4.49s/it] {'loss': 0.1365, 'grad_norm': 0.748962072543815, 'learning_rate': 4.95488315032576e-06, 'epoch': 0.52}
52%|█████▏ | 17750/34278 [19:33:42<18:26:17, 4.02s/it] {'loss': 0.1205, 'grad_norm': 0.8700736053153962, 'learning_rate': 4.9544107356048756e-06, 'epoch': 0.52}
52%|█████▏ | 17751/34278 [19:33:48<21:04:55, 4.59s/it] {'loss': 0.1291, 'grad_norm': 0.9766888630828362, 'learning_rate': 4.953938321291001e-06, 'epoch': 0.52}
52%|█████▏ | 17752/34278 [19:33:51<19:14:52, 4.19s/it] {'loss': 0.1241, 'grad_norm': 0.8786663876319774, 'learning_rate': 4.953465907388353e-06, 'epoch': 0.52}
52%|█████▏ | 17753/34278 [19:33:56<19:34:16, 4.26s/it] {'loss': 0.1288, 'grad_norm': 0.7814683983961322, 'learning_rate': 4.9529934939011514e-06, 'epoch': 0.52}
52%|█████▏ | 17754/34278 [19:33:59<18:10:16, 3.96s/it] {'loss': 0.1172, 'grad_norm': 0.7162963342767883, 'learning_rate': 4.952521080833614e-06, 'epoch': 0.52}
52%|█████▏ | 17755/34278 [19:34:03<18:33:51, 4.04s/it] {'loss': 0.1167, 'grad_norm': 0.7422130006335729, 'learning_rate': 4.952048668189956e-06, 'epoch': 0.52}
52%|█████▏ | 17756/34278 [19:34:06<17:01:51, 3.71s/it] {'loss': 0.1192, 'grad_norm': 0.9636738077209109, 'learning_rate': 4.9515762559743955e-06, 'epoch': 0.52}
52%|█████▏ | 17757/34278 [19:34:11<18:52:54, 4.11s/it] {'loss': 0.1201, 'grad_norm': 0.8079579914363468, 'learning_rate': 4.9511038441911515e-06, 'epoch': 0.52}
52%|█████▏ | 17758/34278 [19:34:17<21:41:11, 4.73s/it] {'loss': 0.1288, 'grad_norm': 0.7498702957223226, 'learning_rate': 4.9506314328444395e-06, 'epoch': 0.52}
52%|█████▏ | 17759/34278 [19:34:20<19:06:56, 4.17s/it] {'loss': 0.137, 'grad_norm': 0.754107044012069, 'learning_rate': 4.950159021938479e-06, 'epoch': 0.52}
52%|█████▏ | 17760/34278 [19:34:23<17:39:21, 3.85s/it] {'loss': 0.1127, 'grad_norm': 0.7427974198618872, 'learning_rate': 4.949686611477487e-06, 'epoch': 0.52}
52%|█████▏ | 17761/34278 [19:34:27<17:11:45, 3.75s/it] {'loss': 0.1163, 'grad_norm': 0.7629154342469427, 'learning_rate': 4.949214201465682e-06, 'epoch': 0.52}
52%|█████▏ | 17762/34278 [19:34:30<15:57:39, 3.48s/it] {'loss': 0.111, 'grad_norm': 0.7295565415995703, 'learning_rate': 4.948741791907279e-06, 'epoch': 0.52}
52%|█████▏ | 17763/34278 [19:34:32<15:11:04, 3.31s/it] {'loss': 0.1416, 'grad_norm': 0.9266575335231287, 'learning_rate': 4.948269382806497e-06, 'epoch': 0.52}
52%|█████▏ | 17764/34278 [19:34:36<15:05:27, 3.29s/it] {'loss': 0.1222, 'grad_norm': 0.8798685542925329, 'learning_rate': 4.947796974167553e-06, 'epoch': 0.52}
52%|█████▏ | 17765/34278 [19:34:39<14:35:56, 3.18s/it] {'loss': 0.1371, 'grad_norm': 1.2083189150380842, 'learning_rate': 4.947324565994666e-06, 'epoch': 0.52}
52%|█████▏ | 17766/34278 [19:34:43<15:58:11, 3.48s/it] {'loss': 0.1448, 'grad_norm': 1.0981785801723138, 'learning_rate': 4.946852158292054e-06, 'epoch': 0.52}
52%|█████▏ | 17767/34278 [19:34:46<15:16:02, 3.33s/it] {'loss': 0.1293, 'grad_norm': 0.8165686750111091, 'learning_rate': 4.946379751063932e-06, 'epoch': 0.52}
52%|█████▏ | 17768/34278 [19:34:50<17:05:10, 3.73s/it] {'loss': 0.1216, 'grad_norm': 1.1957180571188588, 'learning_rate': 4.9459073443145185e-06, 'epoch': 0.52}
52%|█████▏ | 17769/34278 [19:34:54<17:01:05, 3.71s/it] {'loss': 0.139, 'grad_norm': 0.7525766178725002, 'learning_rate': 4.945434938048032e-06, 'epoch': 0.52}
52%|█████▏ | 17770/34278 [19:34:58<17:13:22, 3.76s/it] {'loss': 0.1322, 'grad_norm': 0.8103711165343084, 'learning_rate': 4.9449625322686874e-06, 'epoch': 0.52}
52%|█████▏ | 17771/34278 [19:35:01<16:14:06, 3.54s/it] {'loss': 0.1148, 'grad_norm': 1.1193703453006678, 'learning_rate': 4.944490126980706e-06, 'epoch': 0.52}
52%|█████▏ | 17772/34278 [19:35:04<15:51:07, 3.46s/it] {'loss': 0.1308, 'grad_norm': 0.8859469023702427, 'learning_rate': 4.944017722188303e-06, 'epoch': 0.52}
52%|█████▏ | 17773/34278 [19:35:07<15:29:09, 3.38s/it] {'loss': 0.1268, 'grad_norm': 0.7090720779509818, 'learning_rate': 4.943545317895697e-06, 'epoch': 0.52}
52%|█████▏ | 17774/34278 [19:35:10<14:52:11, 3.24s/it] {'loss': 0.1225, 'grad_norm': 0.9905126126943034, 'learning_rate': 4.9430729141071056e-06, 'epoch': 0.52}
52%|█████▏ | 17775/34278 [19:35:13<14:35:32, 3.18s/it] {'loss': 0.1123, 'grad_norm': 0.9087945970173723, 'learning_rate': 4.942600510826745e-06, 'epoch': 0.52}
52%|█████▏ | 17776/34278 [19:35:19<18:12:44, 3.97s/it] {'loss': 0.1122, 'grad_norm': 0.753379908493967, 'learning_rate': 4.942128108058832e-06, 'epoch': 0.52}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
52%|█████▏ | 17777/34278 [19:35:22<17:00:29, 3.71s/it] {'loss': 0.1476, 'grad_norm': 0.9766768732241771, 'learning_rate': 4.941655705807586e-06, 'epoch': 0.52}
52%|█████▏ | 17778/34278 [19:35:28<20:03:30, 4.38s/it] {'loss': 0.1202, 'grad_norm': 0.8597888733321941, 'learning_rate': 4.941183304077224e-06, 'epoch': 0.52}
52%|█████▏ | 17779/34278 [19:35:32<18:53:25, 4.12s/it] {'loss': 0.1273, 'grad_norm': 0.7241541105926349, 'learning_rate': 4.9407109028719644e-06, 'epoch': 0.52}
52%|█████▏ | 17780/34278 [19:35:35<18:14:33, 3.98s/it] {'loss': 0.1425, 'grad_norm': 0.9018762007631137, 'learning_rate': 4.940238502196024e-06, 'epoch': 0.52}
52%|█████▏ | 17781/34278 [19:35:39<17:57:13, 3.92s/it] {'loss': 0.1359, 'grad_norm': 1.113150947027481, 'learning_rate': 4.939766102053619e-06, 'epoch': 0.52}
52%|█████▏ | 17782/34278 [19:35:42<16:34:43, 3.62s/it] {'loss': 0.1366, 'grad_norm': 0.9067067536712301, 'learning_rate': 4.939293702448966e-06, 'epoch': 0.52}
52%|█████▏ | 17783/34278 [19:35:46<16:59:54, 3.71s/it] {'loss': 0.1348, 'grad_norm': 0.869308613776223, 'learning_rate': 4.938821303386287e-06, 'epoch': 0.52}
52%|█████▏ | 17784/34278 [19:35:49<16:19:17, 3.56s/it] {'loss': 0.1516, 'grad_norm': 1.030779845618955, 'learning_rate': 4.938348904869796e-06, 'epoch': 0.52}
52%|█████▏ | 17785/34278 [19:35:55<19:49:23, 4.33s/it] {'loss': 0.1155, 'grad_norm': 0.7756506386233125, 'learning_rate': 4.937876506903711e-06, 'epoch': 0.52}
52%|█████▏ | 17786/34278 [19:35:59<18:24:08, 4.02s/it] {'loss': 0.1316, 'grad_norm': 0.8730181617833321, 'learning_rate': 4.9374041094922506e-06, 'epoch': 0.52}
52%|█████▏ | 17787/34278 [19:36:02<16:55:35, 3.70s/it] {'loss': 0.1389, 'grad_norm': 1.0480714878501245, 'learning_rate': 4.936931712639632e-06, 'epoch': 0.52}
52%|█████▏ | 17788/34278 [19:36:05<16:13:40, 3.54s/it] {'loss': 0.1568, 'grad_norm': 0.918854444863424, 'learning_rate': 4.936459316350069e-06, 'epoch': 0.52}
52%|█████▏ | 17789/34278 [19:36:08<16:15:55, 3.55s/it] {'loss': 0.121, 'grad_norm': 0.7993345800592149, 'learning_rate': 4.935986920627784e-06, 'epoch': 0.52}
52%|█████▏ | 17790/34278 [19:36:12<16:30:29, 3.60s/it] {'loss': 0.1463, 'grad_norm': 1.1084824142156962, 'learning_rate': 4.935514525476992e-06, 'epoch': 0.52}
52%|█████▏ | 17791/34278 [19:36:16<16:30:19, 3.60s/it] {'loss': 0.1402, 'grad_norm': 0.8425773712132383, 'learning_rate': 4.9350421309019125e-06, 'epoch': 0.52}
52%|█████▏ | 17792/34278 [19:36:19<15:25:58, 3.37s/it] {'loss': 0.1229, 'grad_norm': 0.8271009729135738, 'learning_rate': 4.93456973690676e-06, 'epoch': 0.52}
52%|█████▏ | 17793/34278 [19:36:22<15:14:19, 3.33s/it] {'loss': 0.112, 'grad_norm': 0.7775744912527479, 'learning_rate': 4.934097343495753e-06, 'epoch': 0.52}
52%|█████▏ | 17794/34278 [19:36:25<15:14:46, 3.33s/it] {'loss': 0.1205, 'grad_norm': 0.8043375251488273, 'learning_rate': 4.933624950673109e-06, 'epoch': 0.52}
52%|█████▏ | 17795/34278 [19:36:29<15:35:12, 3.40s/it] {'loss': 0.1102, 'grad_norm': 0.7250849815657042, 'learning_rate': 4.933152558443045e-06, 'epoch': 0.52}
52%|█████▏ | 17796/34278 [19:36:33<16:20:02, 3.57s/it] {'loss': 0.1209, 'grad_norm': 0.6975336060490384, 'learning_rate': 4.932680166809782e-06, 'epoch': 0.52}
52%|█████▏ | 17797/34278 [19:36:36<15:54:32, 3.48s/it] {'loss': 0.1128, 'grad_norm': 0.744123554043262, 'learning_rate': 4.932207775777532e-06, 'epoch': 0.52}
52%|█████▏ | 17798/34278 [19:36:42<19:29:16, 4.26s/it] {'loss': 0.1554, 'grad_norm': 0.7732292829867687, 'learning_rate': 4.9317353853505154e-06, 'epoch': 0.52}
52%|█████▏ | 17799/34278 [19:36:45<17:52:34, 3.91s/it] {'loss': 0.1365, 'grad_norm': 1.0536207532554802, 'learning_rate': 4.931262995532951e-06, 'epoch': 0.52}
52%|█████▏ | 17800/34278 [19:36:48<16:25:47, 3.59s/it] {'loss': 0.1342, 'grad_norm': 0.7685445534392636, 'learning_rate': 4.930790606329049e-06, 'epoch': 0.52}
52%|█████▏ | 17801/34278 [19:36:51<16:08:26, 3.53s/it] {'loss': 0.1524, 'grad_norm': 1.2868047870908867, 'learning_rate': 4.9303182177430355e-06, 'epoch': 0.52}
52%|█████▏ | 17802/34278 [19:36:54<15:06:08, 3.30s/it] {'loss': 0.1396, 'grad_norm': 0.9592999503782187, 'learning_rate': 4.9298458297791245e-06, 'epoch': 0.52}
52%|█████▏ | 17803/34278 [19:36:57<14:53:04, 3.25s/it] {'loss': 0.1252, 'grad_norm': 0.8482988127290358, 'learning_rate': 4.929373442441533e-06, 'epoch': 0.52}
52%|█████▏ | 17804/34278 [19:37:00<14:16:32, 3.12s/it] {'loss': 0.1088, 'grad_norm': 0.8686835971258066, 'learning_rate': 4.928901055734479e-06, 'epoch': 0.52}
52%|█████▏ | 17805/34278 [19:37:03<14:12:25, 3.10s/it] {'loss': 0.1414, 'grad_norm': 1.549980148624875, 'learning_rate': 4.928428669662178e-06, 'epoch': 0.52}
52%|█████▏ | 17806/34278 [19:37:06<13:57:44, 3.05s/it] {'loss': 0.1187, 'grad_norm': 0.954973356348734, 'learning_rate': 4.927956284228848e-06, 'epoch': 0.52}
52%|█████▏ | 17807/34278 [19:37:09<14:11:31, 3.10s/it] {'loss': 0.1476, 'grad_norm': 1.031618422626665, 'learning_rate': 4.927483899438708e-06, 'epoch': 0.52}
52%|█████▏ | 17808/34278 [19:37:13<15:01:46, 3.29s/it] {'loss': 0.1192, 'grad_norm': 1.2986421010957385, 'learning_rate': 4.9270115152959744e-06, 'epoch': 0.52}
52%|█████▏ | 17809/34278 [19:37:16<14:35:41, 3.19s/it] {'loss': 0.1427, 'grad_norm': 1.1509890595581167, 'learning_rate': 4.926539131804867e-06, 'epoch': 0.52}
52%|█████▏ | 17810/34278 [19:37:22<18:45:29, 4.10s/it] {'loss': 0.142, 'grad_norm': 0.9717509027750101, 'learning_rate': 4.926066748969598e-06, 'epoch': 0.52}
52%|█████▏ | 17811/34278 [19:37:25<17:36:53, 3.85s/it] {'loss': 0.1319, 'grad_norm': 0.720133769036651, 'learning_rate': 4.925594366794388e-06, 'epoch': 0.52}
52%|█████▏ | 17812/34278 [19:37:28<16:28:34, 3.60s/it] {'loss': 0.1183, 'grad_norm': 0.7808693343070151, 'learning_rate': 4.925121985283453e-06, 'epoch': 0.52}
52%|█████▏ | 17813/34278 [19:37:35<19:49:51, 4.34s/it] {'loss': 0.1172, 'grad_norm': 0.7395202818105009, 'learning_rate': 4.924649604441012e-06, 'epoch': 0.52}
52%|█████▏ | 17814/34278 [19:37:40<22:00:41, 4.81s/it] {'loss': 0.1289, 'grad_norm': 0.7640524827593631, 'learning_rate': 4.9241772242712815e-06, 'epoch': 0.52}
52%|█████▏ | 17815/34278 [19:37:45<21:29:54, 4.70s/it] {'loss': 0.137, 'grad_norm': 0.7583174019718506, 'learning_rate': 4.9237048447784785e-06, 'epoch': 0.52}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
52%|█████▏ | 17816/34278 [19:37:48<18:53:33, 4.13s/it] {'loss': 0.1201, 'grad_norm': 0.7979366143303395, 'learning_rate': 4.92323246596682e-06, 'epoch': 0.52}
52%|█████▏ | 17817/34278 [19:37:51<17:36:36, 3.85s/it] {'loss': 0.1341, 'grad_norm': 0.6877777126262582, 'learning_rate': 4.9227600878405255e-06, 'epoch': 0.52}
52%|█████▏ | 17818/34278 [19:37:54<16:57:17, 3.71s/it] {'loss': 0.1194, 'grad_norm': 0.8602122677761864, 'learning_rate': 4.922287710403807e-06, 'epoch': 0.52}
52%|█████▏ | 17819/34278 [19:37:57<16:02:44, 3.51s/it] {'loss': 0.137, 'grad_norm': 0.7375428944619327, 'learning_rate': 4.921815333660888e-06, 'epoch': 0.52}
52%|█████▏ | 17820/34278 [19:38:00<15:09:49, 3.32s/it] {'loss': 0.1333, 'grad_norm': 0.9867675926808651, 'learning_rate': 4.9213429576159815e-06, 'epoch': 0.52}
52%|█████▏ | 17821/34278 [19:38:04<16:05:35, 3.52s/it] {'loss': 0.1384, 'grad_norm': 0.7379810370770487, 'learning_rate': 4.920870582273308e-06, 'epoch': 0.52}
52%|█████▏ | 17822/34278 [19:38:08<16:34:10, 3.62s/it] {'loss': 0.1087, 'grad_norm': 0.848173272023187, 'learning_rate': 4.920398207637082e-06, 'epoch': 0.52}
52%|█████▏ | 17823/34278 [19:38:11<15:36:33, 3.41s/it] {'loss': 0.1423, 'grad_norm': 0.9426890226773692, 'learning_rate': 4.919925833711522e-06, 'epoch': 0.52}
52%|█████▏ | 17824/34278 [19:38:17<19:41:18, 4.31s/it] {'loss': 0.1172, 'grad_norm': 0.7548892339920918, 'learning_rate': 4.919453460500844e-06, 'epoch': 0.52}
52%|█████▏ | 17825/34278 [19:38:20<18:05:05, 3.96s/it] {'loss': 0.1259, 'grad_norm': 0.771375472373932, 'learning_rate': 4.918981088009267e-06, 'epoch': 0.52}
52%|█████▏ | 17826/34278 [19:38:24<18:07:19, 3.97s/it] {'loss': 0.1294, 'grad_norm': 0.806062664754491, 'learning_rate': 4.918508716241009e-06, 'epoch': 0.52}
52%|█████▏ | 17827/34278 [19:38:28<17:41:15, 3.87s/it] {'loss': 0.1431, 'grad_norm': 0.918711591193355, 'learning_rate': 4.918036345200284e-06, 'epoch': 0.52}
52%|█████▏ | 17828/34278 [19:38:31<16:25:21, 3.59s/it] {'loss': 0.1093, 'grad_norm': 0.8793300357363661, 'learning_rate': 4.917563974891311e-06, 'epoch': 0.52}
52%|█████▏ | 17829/34278 [19:38:34<15:52:09, 3.47s/it] {'loss': 0.1057, 'grad_norm': 0.6963931385633928, 'learning_rate': 4.917091605318309e-06, 'epoch': 0.52}
52%|█████▏ | 17830/34278 [19:38:38<16:28:56, 3.61s/it] {'loss': 0.1095, 'grad_norm': 0.7431056834038489, 'learning_rate': 4.91661923648549e-06, 'epoch': 0.52}
52%|█████▏ | 17831/34278 [19:38:42<16:28:21, 3.61s/it] {'loss': 0.1289, 'grad_norm': 0.8950737012664087, 'learning_rate': 4.916146868397077e-06, 'epoch': 0.52}
52%|█████▏ | 17832/34278 [19:38:45<15:25:55, 3.38s/it] {'loss': 0.1234, 'grad_norm': 0.8267027874287781, 'learning_rate': 4.915674501057284e-06, 'epoch': 0.52}
52%|█████▏ | 17833/34278 [19:38:48<14:57:48, 3.28s/it] {'loss': 0.118, 'grad_norm': 0.9156664218792455, 'learning_rate': 4.91520213447033e-06, 'epoch': 0.52}
52%|█████▏ | 17834/34278 [19:38:51<15:11:29, 3.33s/it] {'loss': 0.1257, 'grad_norm': 0.8229915052361602, 'learning_rate': 4.914729768640431e-06, 'epoch': 0.52}
52%|█████▏ | 17835/34278 [19:38:54<14:31:47, 3.18s/it] {'loss': 0.1486, 'grad_norm': 0.8387745144811057, 'learning_rate': 4.914257403571803e-06, 'epoch': 0.52}
52%|█████▏ | 17836/34278 [19:38:57<14:26:16, 3.16s/it] {'loss': 0.1214, 'grad_norm': 0.8746183836734868, 'learning_rate': 4.9137850392686635e-06, 'epoch': 0.52}
52%|█████▏ | 17837/34278 [19:39:03<18:27:56, 4.04s/it] {'loss': 0.1357, 'grad_norm': 0.8686617024439045, 'learning_rate': 4.913312675735233e-06, 'epoch': 0.52}
52%|█████▏ | 17838/34278 [19:39:07<17:29:09, 3.83s/it] {'loss': 0.1516, 'grad_norm': 0.804809133997618, 'learning_rate': 4.912840312975725e-06, 'epoch': 0.52}
52%|█████▏ | 17839/34278 [19:39:10<16:48:14, 3.68s/it] {'loss': 0.1175, 'grad_norm': 0.8651921029484754, 'learning_rate': 4.912367950994358e-06, 'epoch': 0.52}
52%|█████▏ | 17840/34278 [19:39:14<17:05:05, 3.74s/it] {'loss': 0.1501, 'grad_norm': 0.858855350432069, 'learning_rate': 4.91189558979535e-06, 'epoch': 0.52}
52%|█████▏ | 17841/34278 [19:39:16<15:45:01, 3.45s/it] {'loss': 0.1339, 'grad_norm': 0.7022536918286209, 'learning_rate': 4.911423229382915e-06, 'epoch': 0.52}
52%|█████▏ | 17842/34278 [19:39:20<15:15:13, 3.34s/it] {'loss': 0.1502, 'grad_norm': 0.929108569504628, 'learning_rate': 4.910950869761273e-06, 'epoch': 0.52}
52%|█████▏ | 17843/34278 [19:39:25<17:41:22, 3.87s/it] {'loss': 0.109, 'grad_norm': 0.810850832292827, 'learning_rate': 4.910478510934642e-06, 'epoch': 0.52}
52%|█████▏ | 17844/34278 [19:39:28<16:41:59, 3.66s/it] {'loss': 0.1498, 'grad_norm': 0.7207408769937799, 'learning_rate': 4.9100061529072365e-06, 'epoch': 0.52}
52%|█████▏ | 17845/34278 [19:39:34<19:28:49, 4.27s/it] {'loss': 0.1288, 'grad_norm': 0.875596935326705, 'learning_rate': 4.9095337956832744e-06, 'epoch': 0.52}
52%|█████▏ | 17846/34278 [19:39:37<17:56:08, 3.93s/it] {'loss': 0.1186, 'grad_norm': 0.7938756474073557, 'learning_rate': 4.9090614392669735e-06, 'epoch': 0.52}
52%|█████▏ | 17847/34278 [19:39:40<17:07:45, 3.75s/it] {'loss': 0.1051, 'grad_norm': 1.020857028179819, 'learning_rate': 4.90858908366255e-06, 'epoch': 0.52}
52%|█████▏ | 17848/34278 [19:39:43<16:18:47, 3.57s/it] {'loss': 0.1255, 'grad_norm': 1.0545725879355818, 'learning_rate': 4.90811672887422e-06, 'epoch': 0.52}
52%|█████▏ | 17849/34278 [19:39:46<15:18:26, 3.35s/it] {'loss': 0.1341, 'grad_norm': 0.7909117978497822, 'learning_rate': 4.907644374906204e-06, 'epoch': 0.52}
52%|█████▏ | 17850/34278 [19:39:50<15:32:13, 3.40s/it] {'loss': 0.1305, 'grad_norm': 0.7248427416592674, 'learning_rate': 4.907172021762715e-06, 'epoch': 0.52}
52%|█████▏ | 17851/34278 [19:39:53<15:45:01, 3.45s/it] {'loss': 0.1369, 'grad_norm': 0.7969163509737802, 'learning_rate': 4.906699669447975e-06, 'epoch': 0.52}
52%|█████▏ | 17852/34278 [19:39:57<15:42:50, 3.44s/it] {'loss': 0.1368, 'grad_norm': 0.7503094517373456, 'learning_rate': 4.9062273179661965e-06, 'epoch': 0.52}
52%|█████▏ | 17853/34278 [19:40:00<15:12:53, 3.33s/it] {'loss': 0.139, 'grad_norm': 0.764510925958115, 'learning_rate': 4.9057549673215976e-06, 'epoch': 0.52}
52%|█████▏ | 17854/34278 [19:40:03<15:13:14, 3.34s/it] {'loss': 0.1379, 'grad_norm': 0.9238158315036814, 'learning_rate': 4.9052826175183946e-06, 'epoch': 0.52}
52%|█████▏ | 17855/34278 [19:40:06<14:54:38, 3.27s/it] {'loss': 0.1513, 'grad_norm': 0.8985348784809547, 'learning_rate': 4.904810268560807e-06, 'epoch': 0.52}
52%|█████▏ | 17856/34278 [19:40:09<14:50:04, 3.25s/it] {'loss': 0.1471, 'grad_norm': 0.9198487150224192, 'learning_rate': 4.904337920453053e-06, 'epoch': 0.52}
52%|█████▏ | 17857/34278 [19:40:13<15:04:39, 3.31s/it] {'loss': 0.1306, 'grad_norm': 1.3279149130796433, 'learning_rate': 4.903865573199344e-06, 'epoch': 0.52}
52%|█████▏ | 17858/34278 [19:40:16<15:39:21, 3.43s/it] {'loss': 0.1389, 'grad_norm': 0.9732633830223836, 'learning_rate': 4.903393226803902e-06, 'epoch': 0.52}
52%|█████▏ | 17859/34278 [19:40:20<15:12:38, 3.34s/it] {'loss': 0.146, 'grad_norm': 0.9224297104087886, 'learning_rate': 4.902920881270942e-06, 'epoch': 0.52}
52%|█████▏ | 17860/34278 [19:40:22<14:18:04, 3.14s/it] {'loss': 0.1304, 'grad_norm': 0.9539881656581738, 'learning_rate': 4.902448536604679e-06, 'epoch': 0.52}
52%|█████▏ | 17861/34278 [19:40:26<15:30:53, 3.40s/it] {'loss': 0.123, 'grad_norm': 0.7559097551927099, 'learning_rate': 4.901976192809335e-06, 'epoch': 0.52}
52%|█████▏ | 17862/34278 [19:40:29<14:47:33, 3.24s/it] {'loss': 0.1328, 'grad_norm': 0.9404253632952876, 'learning_rate': 4.901503849889122e-06, 'epoch': 0.52}
52%|█████▏ | 17863/34278 [19:40:33<15:37:21, 3.43s/it] {'loss': 0.1389, 'grad_norm': 0.8255707159001553, 'learning_rate': 4.901031507848261e-06, 'epoch': 0.52}
52%|█████▏ | 17864/34278 [19:40:36<15:23:33, 3.38s/it] {'loss': 0.1184, 'grad_norm': 0.8598240984880731, 'learning_rate': 4.900559166690968e-06, 'epoch': 0.52}
52%|█████▏ | 17865/34278 [19:40:40<16:00:08, 3.51s/it] {'loss': 0.1361, 'grad_norm': 0.7687898219097739, 'learning_rate': 4.900086826421457e-06, 'epoch': 0.52}
52%|█████▏ | 17866/34278 [19:40:43<15:13:28, 3.34s/it] {'loss': 0.136, 'grad_norm': 1.2607156104616033, 'learning_rate': 4.899614487043945e-06, 'epoch': 0.52}
52%|█████▏ | 17867/34278 [19:40:46<14:25:18, 3.16s/it] {'loss': 0.1024, 'grad_norm': 1.1577965033863988, 'learning_rate': 4.899142148562654e-06, 'epoch': 0.52}
52%|█████▏ | 17868/34278 [19:40:50<15:24:57, 3.38s/it] {'loss': 0.1366, 'grad_norm': 0.8993489940128342, 'learning_rate': 4.8986698109817965e-06, 'epoch': 0.52}
52%|█████▏ | 17869/34278 [19:40:53<15:17:11, 3.35s/it] {'loss': 0.1407, 'grad_norm': 0.8256751835174652, 'learning_rate': 4.8981974743055924e-06, 'epoch': 0.52}
52%|█████▏ | 17870/34278 [19:40:59<18:47:23, 4.12s/it] {'loss': 0.1468, 'grad_norm': 1.8909717940180342, 'learning_rate': 4.897725138538256e-06, 'epoch': 0.52}
52%|█████▏ | 17871/34278 [19:41:02<17:26:54, 3.83s/it] {'loss': 0.1455, 'grad_norm': 0.7629539101206839, 'learning_rate': 4.897252803684004e-06, 'epoch': 0.52}
52%|█████▏ | 17872/34278 [19:41:06<17:02:42, 3.74s/it] {'loss': 0.1389, 'grad_norm': 0.844166739001318, 'learning_rate': 4.896780469747055e-06, 'epoch': 0.52}
52%|█████▏ | 17873/34278 [19:41:09<16:53:32, 3.71s/it] {'loss': 0.1184, 'grad_norm': 0.8643156654137838, 'learning_rate': 4.896308136731626e-06, 'epoch': 0.52}
52%|█████▏ | 17874/34278 [19:41:13<16:28:33, 3.62s/it] {'loss': 0.1166, 'grad_norm': 0.6863796990266603, 'learning_rate': 4.895835804641933e-06, 'epoch': 0.52}
52%|█████▏ | 17875/34278 [19:41:15<15:20:35, 3.37s/it] {'loss': 0.1326, 'grad_norm': 0.6952077020349884, 'learning_rate': 4.895363473482193e-06, 'epoch': 0.52}
52%|█████▏ | 17876/34278 [19:41:18<14:43:28, 3.23s/it] {'loss': 0.125, 'grad_norm': 0.8471047800515966, 'learning_rate': 4.894891143256622e-06, 'epoch': 0.52}
52%|█████▏ | 17877/34278 [19:41:22<15:10:57, 3.33s/it] {'loss': 0.1225, 'grad_norm': 0.7736863298443879, 'learning_rate': 4.894418813969441e-06, 'epoch': 0.52}
3.33s/it] 52%|█████▏ | 17878/34278 [19:41:26<15:49:56, 3.48s/it] {'loss': 0.1219, 'grad_norm': 0.7933700083617907, 'learning_rate': 4.893946485624859e-06, 'epoch': 0.52} 52%|█████▏ | 17878/34278 [19:41:26<15:49:56, 3.48s/it] 52%|█████▏ | 17879/34278 [19:41:29<16:16:20, 3.57s/it] {'loss': 0.1132, 'grad_norm': 0.6815658508332655, 'learning_rate': 4.8934741582271e-06, 'epoch': 0.52} 52%|█████▏ | 17879/34278 [19:41:29<16:16:20, 3.57s/it] 52%|█████▏ | 17880/34278 [19:41:32<15:35:04, 3.42s/it] {'loss': 0.1395, 'grad_norm': 0.7561530162098145, 'learning_rate': 4.893001831780378e-06, 'epoch': 0.52} 52%|█████▏ | 17880/34278 [19:41:32<15:35:04, 3.42s/it] 52%|█████▏ | 17881/34278 [19:41:35<14:45:21, 3.24s/it] {'loss': 0.1068, 'grad_norm': 0.8749908551044697, 'learning_rate': 4.892529506288911e-06, 'epoch': 0.52} 52%|█████▏ | 17881/34278 [19:41:35<14:45:21, 3.24s/it] 52%|█████▏ | 17882/34278 [19:41:38<14:26:26, 3.17s/it] {'loss': 0.1061, 'grad_norm': 1.3182560129097096, 'learning_rate': 4.892057181756914e-06, 'epoch': 0.52} 52%|█████▏ | 17882/34278 [19:41:38<14:26:26, 3.17s/it] 52%|█████▏ | 17883/34278 [19:41:42<15:13:23, 3.34s/it] {'loss': 0.1268, 'grad_norm': 0.6291798547307672, 'learning_rate': 4.891584858188605e-06, 'epoch': 0.52} 52%|█████▏ | 17883/34278 [19:41:42<15:13:23, 3.34s/it] 52%|█████▏ | 17884/34278 [19:41:48<18:27:31, 4.05s/it] {'loss': 0.1286, 'grad_norm': 0.7589851009998014, 'learning_rate': 4.891112535588199e-06, 'epoch': 0.52} 52%|█████▏ | 17884/34278 [19:41:48<18:27:31, 4.05s/it] 52%|█████▏ | 17885/34278 [19:41:51<16:47:37, 3.69s/it] {'loss': 0.1268, 'grad_norm': 0.9022577316131594, 'learning_rate': 4.890640213959915e-06, 'epoch': 0.52} 52%|█████▏ | 17885/34278 [19:41:51<16:47:37, 3.69s/it] 52%|█████▏ | 17886/34278 [19:41:54<15:55:35, 3.50s/it] {'loss': 0.1518, 'grad_norm': 0.9731030876978904, 'learning_rate': 4.890167893307971e-06, 'epoch': 0.52} 52%|█████▏ | 17886/34278 [19:41:54<15:55:35, 3.50s/it] 52%|█████▏ | 17887/34278 [19:42:00<19:55:14, 4.38s/it] 
{'loss': 0.1295, 'grad_norm': 0.7098412431093563, 'learning_rate': 4.889695573636581e-06, 'epoch': 0.52} 52%|█████▏ | 17887/34278 [19:42:00<19:55:14, 4.38s/it] 52%|█████▏ | 17888/34278 [19:42:05<20:19:38, 4.46s/it] {'loss': 0.1299, 'grad_norm': 0.9869573351912039, 'learning_rate': 4.889223254949961e-06, 'epoch': 0.52} 52%|█████▏ | 17888/34278 [19:42:05<20:19:38, 4.46s/it] 52%|█████▏ | 17889/34278 [19:42:09<20:02:43, 4.40s/it] {'loss': 0.1259, 'grad_norm': 0.9072045110410829, 'learning_rate': 4.888750937252332e-06, 'epoch': 0.52} 52%|█████▏ | 17889/34278 [19:42:09<20:02:43, 4.40s/it] 52%|█████▏ | 17890/34278 [19:42:12<18:13:15, 4.00s/it] {'loss': 0.1377, 'grad_norm': 0.8978386946958383, 'learning_rate': 4.8882786205479035e-06, 'epoch': 0.52} 52%|█████▏ | 17890/34278 [19:42:12<18:13:15, 4.00s/it] 52%|█████▏ | 17891/34278 [19:42:16<17:58:01, 3.95s/it] {'loss': 0.1401, 'grad_norm': 0.8882988365300463, 'learning_rate': 4.887806304840901e-06, 'epoch': 0.52} 52%|█████▏ | 17891/34278 [19:42:16<17:58:01, 3.95s/it] 52%|█████▏ | 17892/34278 [19:42:19<16:48:37, 3.69s/it] {'loss': 0.1407, 'grad_norm': 0.9094948331594452, 'learning_rate': 4.887333990135536e-06, 'epoch': 0.52} 52%|█████▏ | 17892/34278 [19:42:19<16:48:37, 3.69s/it] 52%|█████▏ | 17893/34278 [19:42:23<17:36:51, 3.87s/it] {'loss': 0.1167, 'grad_norm': 0.6853262552371552, 'learning_rate': 4.886861676436026e-06, 'epoch': 0.52} 52%|█████▏ | 17893/34278 [19:42:23<17:36:51, 3.87s/it] 52%|█████▏ | 17894/34278 [19:42:26<16:41:02, 3.67s/it] {'loss': 0.1317, 'grad_norm': 0.7485944270381532, 'learning_rate': 4.886389363746588e-06, 'epoch': 0.52} 52%|█████▏ | 17894/34278 [19:42:26<16:41:02, 3.67s/it] 52%|█████▏ | 17895/34278 [19:42:30<15:48:52, 3.48s/it] {'loss': 0.1138, 'grad_norm': 0.8338742387687241, 'learning_rate': 4.885917052071439e-06, 'epoch': 0.52} 52%|█████▏ | 17895/34278 [19:42:30<15:48:52, 3.48s/it] 52%|█████▏ | 17896/34278 [19:42:35<18:56:05, 4.16s/it] {'loss': 0.1173, 'grad_norm': 0.7409709048756515, 
'learning_rate': 4.885444741414794e-06, 'epoch': 0.52} 52%|█████▏ | 17896/34278 [19:42:35<18:56:05, 4.16s/it] 52%|█████▏ | 17897/34278 [19:42:41<21:07:57, 4.64s/it] {'loss': 0.1255, 'grad_norm': 0.7580988775971637, 'learning_rate': 4.884972431780872e-06, 'epoch': 0.52} 52%|█████▏ | 17897/34278 [19:42:41<21:07:57, 4.64s/it] 52%|█████▏ | 17898/34278 [19:42:47<23:35:06, 5.18s/it] {'loss': 0.158, 'grad_norm': 0.8213209898400915, 'learning_rate': 4.884500123173888e-06, 'epoch': 0.52} 52%|█████▏ | 17898/34278 [19:42:47<23:35:06, 5.18s/it] 52%|█████▏ | 17899/34278 [19:42:50<20:37:28, 4.53s/it] {'loss': 0.1214, 'grad_norm': 0.8303623412264455, 'learning_rate': 4.884027815598061e-06, 'epoch': 0.52} 52%|█████▏ | 17899/34278 [19:42:51<20:37:28, 4.53s/it] 52%|█████▏ | 17900/34278 [19:42:54<18:38:58, 4.10s/it] {'loss': 0.108, 'grad_norm': 0.7614129240235559, 'learning_rate': 4.8835555090576054e-06, 'epoch': 0.52} 52%|█████▏ | 17900/34278 [19:42:54<18:38:58, 4.10s/it] 52%|█████▏ | 17901/34278 [19:42:57<17:49:07, 3.92s/it] {'loss': 0.1396, 'grad_norm': 0.860717154105392, 'learning_rate': 4.883083203556738e-06, 'epoch': 0.52} 52%|█████▏ | 17901/34278 [19:42:57<17:49:07, 3.92s/it] 52%|█████▏ | 17902/34278 [19:43:01<17:17:51, 3.80s/it] {'loss': 0.1281, 'grad_norm': 0.757320014920277, 'learning_rate': 4.882610899099674e-06, 'epoch': 0.52} 52%|█████▏ | 17902/34278 [19:43:01<17:17:51, 3.80s/it] 52%|█████▏ | 17903/34278 [19:43:04<16:05:28, 3.54s/it] {'loss': 0.1353, 'grad_norm': 0.8023616902442609, 'learning_rate': 4.882138595690635e-06, 'epoch': 0.52} 52%|█████▏ | 17903/34278 [19:43:04<16:05:28, 3.54s/it] 52%|█████▏ | 17904/34278 [19:43:07<16:08:34, 3.55s/it] {'loss': 0.1366, 'grad_norm': 0.7335446218700364, 'learning_rate': 4.881666293333832e-06, 'epoch': 0.52} 52%|█████▏ | 17904/34278 [19:43:07<16:08:34, 3.55s/it] 52%|█████▏ | 17905/34278 [19:43:10<15:48:56, 3.48s/it] {'loss': 0.1359, 'grad_norm': 0.6739565189851321, 'learning_rate': 4.881193992033486e-06, 'epoch': 0.52} 52%|█████▏ | 
17905/34278 [19:43:10<15:48:56, 3.48s/it] 52%|█████▏ | 17906/34278 [19:43:15<17:15:59, 3.80s/it] {'loss': 0.1478, 'grad_norm': 0.911529365925771, 'learning_rate': 4.880721691793812e-06, 'epoch': 0.52} 52%|█████▏ | 17906/34278 [19:43:15<17:15:59, 3.80s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 52%|█████▏ | 17907/34278 [19:43:18<16:38:49, 3.66s/it] {'loss': 0.1425, 'grad_norm': 0.8483509395103137, 'learning_rate': 4.880249392619025e-06, 'epoch': 0.52} 52%|█████▏ | 17907/34278 [19:43:18<16:38:49, 3.66s/it] 52%|█████▏ | 17908/34278 [19:43:22<16:47:33, 3.69s/it] {'loss': 0.1368, 'grad_norm': 0.8178655393409022, 'learning_rate': 4.879777094513341e-06, 'epoch': 0.52} 52%|█████▏ | 17908/34278 [19:43:22<16:47:33, 3.69s/it] 52%|█████▏ | 17909/34278 [19:43:26<17:25:35, 3.83s/it] {'loss': 0.1404, 'grad_norm': 0.8577307170717238, 'learning_rate': 4.879304797480981e-06, 'epoch': 0.52} 52%|█████▏ | 17909/34278 [19:43:26<17:25:35, 3.83s/it] 52%|█████▏ | 17910/34278 [19:43:31<18:39:07, 4.10s/it] {'loss': 0.1429, 'grad_norm': 0.9602509403066533, 'learning_rate': 4.878832501526158e-06, 'epoch': 0.52} 52%|█████▏ | 17910/34278 [19:43:31<18:39:07, 4.10s/it] 52%|█████▏ | 17911/34278 [19:43:35<18:02:11, 3.97s/it] {'loss': 0.15, 'grad_norm': 0.8733446378812333, 'learning_rate': 4.878360206653091e-06, 'epoch': 0.52} 52%|█████▏ | 17911/34278 [19:43:35<18:02:11, 3.97s/it] 52%|█████▏ | 17912/34278 [19:43:38<17:22:11, 3.82s/it] {'loss': 0.1472, 'grad_norm': 0.9655338759400075, 'learning_rate': 4.877887912865994e-06, 'epoch': 0.52} 52%|█████▏ | 17912/34278 [19:43:38<17:22:11, 3.82s/it] 52%|█████▏ | 17913/34278 [19:43:42<17:08:55, 3.77s/it] {'loss': 0.1356, 'grad_norm': 0.8931499260447278, 'learning_rate': 4.877415620169084e-06, 'epoch': 0.52} 52%|█████▏ | 17913/34278 [19:43:42<17:08:55, 3.77s/it] 52%|█████▏ | 
17914/34278 [19:43:46<17:56:48, 3.95s/it] {'loss': 0.1235, 'grad_norm': 0.9231256882193069, 'learning_rate': 4.876943328566578e-06, 'epoch': 0.52} 52%|█████▏ | 17914/34278 [19:43:46<17:56:48, 3.95s/it] 52%|█████▏ | 17915/34278 [19:43:49<16:41:49, 3.67s/it] {'loss': 0.1531, 'grad_norm': 1.1325598380436084, 'learning_rate': 4.876471038062693e-06, 'epoch': 0.52} 52%|█████▏ | 17915/34278 [19:43:49<16:41:49, 3.67s/it] 52%|█████▏ | 17916/34278 [19:43:53<17:31:03, 3.85s/it] {'loss': 0.1905, 'grad_norm': 1.186704811758057, 'learning_rate': 4.875998748661646e-06, 'epoch': 0.52} 52%|█████▏ | 17916/34278 [19:43:53<17:31:03, 3.85s/it] 52%|█████▏ | 17917/34278 [19:43:57<16:41:25, 3.67s/it] {'loss': 0.1346, 'grad_norm': 0.8180383593609283, 'learning_rate': 4.875526460367651e-06, 'epoch': 0.52} 52%|█████▏ | 17917/34278 [19:43:57<16:41:25, 3.67s/it] 52%|█████▏ | 17918/34278 [19:44:00<16:12:41, 3.57s/it] {'loss': 0.1134, 'grad_norm': 0.9811199477158028, 'learning_rate': 4.8750541731849274e-06, 'epoch': 0.52} 52%|█████▏ | 17918/34278 [19:44:00<16:12:41, 3.57s/it] 52%|█████▏ | 17919/34278 [19:44:03<15:42:50, 3.46s/it] {'loss': 0.1319, 'grad_norm': 0.9616250190910933, 'learning_rate': 4.874581887117691e-06, 'epoch': 0.52} 52%|█████▏ | 17919/34278 [19:44:03<15:42:50, 3.46s/it] 52%|█████▏ | 17920/34278 [19:44:06<14:59:37, 3.30s/it] {'loss': 0.1222, 'grad_norm': 0.8495857545743318, 'learning_rate': 4.874109602170154e-06, 'epoch': 0.52} 52%|█████▏ | 17920/34278 [19:44:06<14:59:37, 3.30s/it] 52%|█████▏ | 17921/34278 [19:44:09<14:20:19, 3.16s/it] {'loss': 0.1475, 'grad_norm': 0.9981163796100289, 'learning_rate': 4.873637318346539e-06, 'epoch': 0.52} 52%|█████▏ | 17921/34278 [19:44:09<14:20:19, 3.16s/it] 52%|█████▏ | 17922/34278 [19:44:13<14:59:56, 3.30s/it] {'loss': 0.1239, 'grad_norm': 0.9542983875370893, 'learning_rate': 4.8731650356510605e-06, 'epoch': 0.52} 52%|█████▏ | 17922/34278 [19:44:13<14:59:56, 3.30s/it] 52%|█████▏ | 17923/34278 [19:44:16<15:33:28, 3.42s/it] {'loss': 0.1433, 
'grad_norm': 0.8052732719910499, 'learning_rate': 4.872692754087933e-06, 'epoch': 0.52} 52%|█████▏ | 17923/34278 [19:44:16<15:33:28, 3.42s/it] 52%|█████▏ | 17924/34278 [19:44:19<14:59:46, 3.30s/it] {'loss': 0.1296, 'grad_norm': 0.7535801357383433, 'learning_rate': 4.872220473661376e-06, 'epoch': 0.52} 52%|█████▏ | 17924/34278 [19:44:19<14:59:46, 3.30s/it] 52%|█████▏ | 17925/34278 [19:44:24<16:28:39, 3.63s/it] {'loss': 0.1479, 'grad_norm': 0.7714976778803393, 'learning_rate': 4.871748194375602e-06, 'epoch': 0.52} 52%|█████▏ | 17925/34278 [19:44:24<16:28:39, 3.63s/it] 52%|█████▏ | 17926/34278 [19:44:28<16:59:35, 3.74s/it] {'loss': 0.1138, 'grad_norm': 0.8562842980877414, 'learning_rate': 4.871275916234829e-06, 'epoch': 0.52} 52%|█████▏ | 17926/34278 [19:44:28<16:59:35, 3.74s/it] 52%|█████▏ | 17927/34278 [19:44:31<16:47:12, 3.70s/it] {'loss': 0.1324, 'grad_norm': 0.7413114455314824, 'learning_rate': 4.870803639243275e-06, 'epoch': 0.52} 52%|█████▏ | 17927/34278 [19:44:31<16:47:12, 3.70s/it] 52%|█████▏ | 17928/34278 [19:44:34<15:29:17, 3.41s/it] {'loss': 0.1557, 'grad_norm': 0.7978088224484438, 'learning_rate': 4.8703313634051555e-06, 'epoch': 0.52} 52%|█████▏ | 17928/34278 [19:44:34<15:29:17, 3.41s/it] 52%|█████▏ | 17929/34278 [19:44:38<15:48:42, 3.48s/it] {'loss': 0.1324, 'grad_norm': 0.7027258225234448, 'learning_rate': 4.869859088724687e-06, 'epoch': 0.52} 52%|█████▏ | 17929/34278 [19:44:38<15:48:42, 3.48s/it] 52%|█████▏ | 17930/34278 [19:44:44<19:11:50, 4.23s/it] {'loss': 0.1338, 'grad_norm': 0.7510819893563819, 'learning_rate': 4.8693868152060844e-06, 'epoch': 0.52} 52%|█████▏ | 17930/34278 [19:44:44<19:11:50, 4.23s/it] 52%|█████▏ | 17931/34278 [19:44:47<17:31:02, 3.86s/it] {'loss': 0.1259, 'grad_norm': 0.8155236962523871, 'learning_rate': 4.868914542853566e-06, 'epoch': 0.52} 52%|█████▏ | 17931/34278 [19:44:47<17:31:02, 3.86s/it] 52%|█████▏ | 17932/34278 [19:44:51<18:27:29, 4.07s/it] {'loss': 0.1276, 'grad_norm': 0.7732616948535456, 'learning_rate': 
4.868442271671346e-06, 'epoch': 0.52} 52%|█████▏ | 17932/34278 [19:44:51<18:27:29, 4.07s/it] 52%|█████▏ | 17933/34278 [19:44:58<21:36:04, 4.76s/it] {'loss': 0.1335, 'grad_norm': 0.833652499312342, 'learning_rate': 4.867970001663644e-06, 'epoch': 0.52} 52%|█████▏ | 17933/34278 [19:44:58<21:36:04, 4.76s/it] 52%|█████▏ | 17934/34278 [19:45:01<19:40:29, 4.33s/it] {'loss': 0.1387, 'grad_norm': 0.8366874000975631, 'learning_rate': 4.867497732834673e-06, 'epoch': 0.52} 52%|█████▏ | 17934/34278 [19:45:01<19:40:29, 4.33s/it] 52%|█████▏ | 17935/34278 [19:45:05<19:09:35, 4.22s/it] {'loss': 0.1398, 'grad_norm': 0.7508403598543987, 'learning_rate': 4.867025465188651e-06, 'epoch': 0.52} 52%|█████▏ | 17935/34278 [19:45:05<19:09:35, 4.22s/it] 52%|█████▏ | 17936/34278 [19:45:09<18:24:39, 4.06s/it] {'loss': 0.1144, 'grad_norm': 0.8776091395129162, 'learning_rate': 4.866553198729795e-06, 'epoch': 0.52} 52%|█████▏ | 17936/34278 [19:45:09<18:24:39, 4.06s/it] 52%|█████▏ | 17937/34278 [19:45:12<17:15:40, 3.80s/it] {'loss': 0.1259, 'grad_norm': 0.7561549927851885, 'learning_rate': 4.866080933462318e-06, 'epoch': 0.52} 52%|█████▏ | 17937/34278 [19:45:12<17:15:40, 3.80s/it] 52%|█████▏ | 17938/34278 [19:45:16<18:17:41, 4.03s/it] {'loss': 0.1182, 'grad_norm': 0.8702700009110349, 'learning_rate': 4.865608669390439e-06, 'epoch': 0.52} 52%|█████▏ | 17938/34278 [19:45:16<18:17:41, 4.03s/it] 52%|█████▏ | 17939/34278 [19:45:20<17:08:35, 3.78s/it] {'loss': 0.123, 'grad_norm': 0.8289225573785339, 'learning_rate': 4.8651364065183735e-06, 'epoch': 0.52} 52%|█████▏ | 17939/34278 [19:45:20<17:08:35, 3.78s/it] 52%|█████▏ | 17940/34278 [19:45:22<15:55:24, 3.51s/it] {'loss': 0.1335, 'grad_norm': 2.2467421701393797, 'learning_rate': 4.864664144850339e-06, 'epoch': 0.52} 52%|█████▏ | 17940/34278 [19:45:22<15:55:24, 3.51s/it] 52%|█████▏ | 17941/34278 [19:45:26<15:44:29, 3.47s/it] {'loss': 0.1268, 'grad_norm': 0.9348183107750612, 'learning_rate': 4.864191884390551e-06, 'epoch': 0.52} 52%|█████▏ | 17941/34278 
[19:45:26<15:44:29, 3.47s/it] 52%|█████▏ | 17942/34278 [19:45:29<14:59:55, 3.31s/it] {'loss': 0.1202, 'grad_norm': 0.8360624824019696, 'learning_rate': 4.863719625143225e-06, 'epoch': 0.52} 52%|█████▏ | 17942/34278 [19:45:29<14:59:55, 3.31s/it] 52%|█████▏ | 17943/34278 [19:45:32<14:48:17, 3.26s/it] {'loss': 0.1066, 'grad_norm': 0.8640078876565908, 'learning_rate': 4.8632473671125765e-06, 'epoch': 0.52} 52%|█████▏ | 17943/34278 [19:45:32<14:48:17, 3.26s/it] 52%|█████▏ | 17944/34278 [19:45:35<14:36:37, 3.22s/it] {'loss': 0.1333, 'grad_norm': 0.9376836752967381, 'learning_rate': 4.862775110302823e-06, 'epoch': 0.52} 52%|█████▏ | 17944/34278 [19:45:35<14:36:37, 3.22s/it] 52%|█████▏ | 17945/34278 [19:45:38<14:32:58, 3.21s/it] {'loss': 0.1325, 'grad_norm': 0.819108078445002, 'learning_rate': 4.862302854718181e-06, 'epoch': 0.52} 52%|█████▏ | 17945/34278 [19:45:38<14:32:58, 3.21s/it] 52%|█████▏ | 17946/34278 [19:45:41<14:01:53, 3.09s/it] {'loss': 0.142, 'grad_norm': 0.8755362805588103, 'learning_rate': 4.861830600362868e-06, 'epoch': 0.52} 52%|█████▏ | 17946/34278 [19:45:41<14:01:53, 3.09s/it] 52%|█████▏ | 17947/34278 [19:45:45<14:40:18, 3.23s/it] {'loss': 0.1347, 'grad_norm': 0.7584084125841402, 'learning_rate': 4.861358347241097e-06, 'epoch': 0.52} 52%|█████▏ | 17947/34278 [19:45:45<14:40:18, 3.23s/it] 52%|█████▏ | 17948/34278 [19:45:48<15:12:54, 3.35s/it] {'loss': 0.1266, 'grad_norm': 0.9762185605530063, 'learning_rate': 4.860886095357085e-06, 'epoch': 0.52} 52%|█████▏ | 17948/34278 [19:45:48<15:12:54, 3.35s/it] 52%|█████▏ | 17949/34278 [19:45:51<15:03:35, 3.32s/it] {'loss': 0.1518, 'grad_norm': 0.850976016020747, 'learning_rate': 4.860413844715048e-06, 'epoch': 0.52} 52%|█████▏ | 17949/34278 [19:45:51<15:03:35, 3.32s/it] 52%|█████▏ | 17950/34278 [19:45:55<15:07:50, 3.34s/it] {'loss': 0.138, 'grad_norm': 1.0727609637353392, 'learning_rate': 4.859941595319204e-06, 'epoch': 0.52} 52%|█████▏ | 17950/34278 [19:45:55<15:07:50, 3.34s/it] 52%|█████▏ | 17951/34278 
[19:45:59<16:13:17, 3.58s/it] {'loss': 0.1341, 'grad_norm': 0.7412763744903017, 'learning_rate': 4.859469347173769e-06, 'epoch': 0.52} 52%|█████▏ | 17951/34278 [19:45:59<16:13:17, 3.58s/it] 52%|█████▏ | 17952/34278 [19:46:02<15:31:25, 3.42s/it] {'loss': 0.1229, 'grad_norm': 0.6602935382793448, 'learning_rate': 4.858997100282958e-06, 'epoch': 0.52} 52%|█████▏ | 17952/34278 [19:46:02<15:31:25, 3.42s/it] 52%|█████▏ | 17953/34278 [19:46:05<14:49:22, 3.27s/it] {'loss': 0.0988, 'grad_norm': 0.8813426744458847, 'learning_rate': 4.858524854650986e-06, 'epoch': 0.52} 52%|█████▏ | 17953/34278 [19:46:05<14:49:22, 3.27s/it] 52%|█████▏ | 17954/34278 [19:46:08<14:33:24, 3.21s/it] {'loss': 0.1381, 'grad_norm': 0.7512036713866219, 'learning_rate': 4.858052610282072e-06, 'epoch': 0.52} 52%|█████▏ | 17954/34278 [19:46:08<14:33:24, 3.21s/it] 52%|█████▏ | 17955/34278 [19:46:11<14:30:34, 3.20s/it] {'loss': 0.1346, 'grad_norm': 0.9621052485718656, 'learning_rate': 4.857580367180427e-06, 'epoch': 0.52} 52%|█████▏ | 17955/34278 [19:46:11<14:30:34, 3.20s/it] 52%|█████▏ | 17956/34278 [19:46:14<14:06:40, 3.11s/it] {'loss': 0.1134, 'grad_norm': 0.7738331930747737, 'learning_rate': 4.857108125350274e-06, 'epoch': 0.52} 52%|█████▏ | 17956/34278 [19:46:14<14:06:40, 3.11s/it] 52%|█████▏ | 17957/34278 [19:46:19<16:34:22, 3.66s/it] {'loss': 0.1252, 'grad_norm': 0.8259746644171821, 'learning_rate': 4.856635884795824e-06, 'epoch': 0.52} 52%|█████▏ | 17957/34278 [19:46:19<16:34:22, 3.66s/it] 52%|█████▏ | 17958/34278 [19:46:23<17:14:24, 3.80s/it] {'loss': 0.1423, 'grad_norm': 0.731470398207601, 'learning_rate': 4.856163645521295e-06, 'epoch': 0.52} 52%|█████▏ | 17958/34278 [19:46:23<17:14:24, 3.80s/it] 52%|█████▏ | 17959/34278 [19:46:26<16:22:19, 3.61s/it] {'loss': 0.1247, 'grad_norm': 0.8188305469047616, 'learning_rate': 4.855691407530903e-06, 'epoch': 0.52} 52%|█████▏ | 17959/34278 [19:46:26<16:22:19, 3.61s/it] 52%|█████▏ | 17960/34278 [19:46:30<16:20:42, 3.61s/it] {'loss': 0.136, 'grad_norm': 
0.8393247378678323, 'learning_rate': 4.855219170828863e-06, 'epoch': 0.52} 52%|█████▏ | 17960/34278 [19:46:30<16:20:42, 3.61s/it] 52%|█████▏ | 17961/34278 [19:46:34<17:05:59, 3.77s/it] {'loss': 0.1373, 'grad_norm': 0.7694881940520427, 'learning_rate': 4.854746935419391e-06, 'epoch': 0.52} 52%|█████▏ | 17961/34278 [19:46:34<17:05:59, 3.77s/it] 52%|█████▏ | 17962/34278 [19:46:37<16:03:10, 3.54s/it] {'loss': 0.1216, 'grad_norm': 0.6646887651826038, 'learning_rate': 4.8542747013067046e-06, 'epoch': 0.52} 52%|█████▏ | 17962/34278 [19:46:37<16:03:10, 3.54s/it] 52%|█████▏ | 17963/34278 [19:46:40<15:34:48, 3.44s/it] {'loss': 0.1143, 'grad_norm': 0.7698194675114775, 'learning_rate': 4.85380246849502e-06, 'epoch': 0.52} 52%|█████▏ | 17963/34278 [19:46:40<15:34:48, 3.44s/it] 52%|█████▏ | 17964/34278 [19:46:43<15:12:58, 3.36s/it] {'loss': 0.1528, 'grad_norm': 0.9663948066989679, 'learning_rate': 4.853330236988551e-06, 'epoch': 0.52} 52%|█████▏ | 17964/34278 [19:46:43<15:12:58, 3.36s/it] 52%|█████▏ | 17965/34278 [19:46:48<16:38:16, 3.67s/it] {'loss': 0.1301, 'grad_norm': 0.8500157732306555, 'learning_rate': 4.852858006791513e-06, 'epoch': 0.52} 52%|█████▏ | 17965/34278 [19:46:48<16:38:16, 3.67s/it] 52%|█████▏ | 17966/34278 [19:46:51<15:31:58, 3.43s/it] {'loss': 0.1154, 'grad_norm': 0.6561816000060233, 'learning_rate': 4.852385777908127e-06, 'epoch': 0.52} 52%|█████▏ | 17966/34278 [19:46:51<15:31:58, 3.43s/it] 52%|█████▏ | 17967/34278 [19:46:55<16:27:08, 3.63s/it] {'loss': 0.1201, 'grad_norm': 1.2660261846960206, 'learning_rate': 4.8519135503426014e-06, 'epoch': 0.52} 52%|█████▏ | 17967/34278 [19:46:55<16:27:08, 3.63s/it] 52%|█████▏ | 17968/34278 [19:47:01<19:44:57, 4.36s/it] {'loss': 0.1088, 'grad_norm': 0.9443999314956311, 'learning_rate': 4.851441324099159e-06, 'epoch': 0.52} 52%|█████▏ | 17968/34278 [19:47:01<19:44:57, 4.36s/it] 52%|█████▏ | 17969/34278 [19:47:04<18:04:28, 3.99s/it] {'loss': 0.1508, 'grad_norm': 0.8122989850323468, 'learning_rate': 4.850969099182013e-06, 
'epoch': 0.52} 52%|█████▏ | 17969/34278 [19:47:04<18:04:28, 3.99s/it] 52%|█████▏ | 17970/34278 [19:47:07<16:54:04, 3.73s/it] {'loss': 0.1233, 'grad_norm': 0.9833396853502512, 'learning_rate': 4.850496875595379e-06, 'epoch': 0.52} 52%|█████▏ | 17970/34278 [19:47:07<16:54:04, 3.73s/it] 52%|█████▏ | 17971/34278 [19:47:12<18:50:31, 4.16s/it] {'loss': 0.1344, 'grad_norm': 0.986616223271879, 'learning_rate': 4.850024653343473e-06, 'epoch': 0.52} 52%|█████▏ | 17971/34278 [19:47:12<18:50:31, 4.16s/it] 52%|█████▏ | 17972/34278 [19:47:19<21:44:37, 4.80s/it] {'loss': 0.1372, 'grad_norm': 0.8007862832192146, 'learning_rate': 4.849552432430512e-06, 'epoch': 0.52} 52%|█████▏ | 17972/34278 [19:47:19<21:44:37, 4.80s/it] 52%|█████▏ | 17973/34278 [19:47:22<19:47:26, 4.37s/it] {'loss': 0.1324, 'grad_norm': 1.0666085204198423, 'learning_rate': 4.849080212860709e-06, 'epoch': 0.52} 52%|█████▏ | 17973/34278 [19:47:22<19:47:26, 4.37s/it] 52%|█████▏ | 17974/34278 [19:47:26<18:51:20, 4.16s/it] {'loss': 0.1593, 'grad_norm': 0.9058754526315972, 'learning_rate': 4.848607994638282e-06, 'epoch': 0.52} 52%|█████▏ | 17974/34278 [19:47:26<18:51:20, 4.16s/it] 52%|█████▏ | 17975/34278 [19:47:29<18:02:08, 3.98s/it] {'loss': 0.1104, 'grad_norm': 0.8441704512336876, 'learning_rate': 4.848135777767447e-06, 'epoch': 0.52} 52%|█████▏ | 17975/34278 [19:47:29<18:02:08, 3.98s/it] 52%|█████▏ | 17976/34278 [19:47:35<20:17:39, 4.48s/it] {'loss': 0.1223, 'grad_norm': 0.8940368108007474, 'learning_rate': 4.847663562252422e-06, 'epoch': 0.52} 52%|█████▏ | 17976/34278 [19:47:35<20:17:39, 4.48s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 52%|█████▏ | 17977/34278 [19:47:38<17:53:21, 3.95s/it] {'loss': 0.1291, 'grad_norm': 0.800211833876078, 'learning_rate': 4.8471913480974184e-06, 'epoch': 0.52} 52%|█████▏ | 17977/34278 [19:47:38<17:53:21, 3.95s/it] 52%|█████▏ | 17978/34278 [19:47:41<17:33:50, 3.88s/it] {'loss': 0.1247, 'grad_norm': 0.7713669762026707, 'learning_rate': 4.846719135306654e-06, 'epoch': 0.52} 52%|█████▏ | 17978/34278 [19:47:41<17:33:50, 3.88s/it] 52%|█████▏ | 17979/34278 [19:47:45<17:38:05, 3.90s/it] {'loss': 0.136, 'grad_norm': 0.9492886397041358, 'learning_rate': 4.846246923884343e-06, 'epoch': 0.52} 52%|█████▏ | 17979/34278 [19:47:45<17:38:05, 3.90s/it] 52%|█████▏ | 17980/34278 [19:47:48<16:05:43, 3.56s/it] {'loss': 0.1535, 'grad_norm': 1.0445991721570165, 'learning_rate': 4.845774713834705e-06, 'epoch': 0.52} 52%|█████▏ | 17980/34278 [19:47:48<16:05:43, 3.56s/it] 52%|█████▏ | 17981/34278 [19:47:51<15:10:35, 3.35s/it] {'loss': 0.1349, 'grad_norm': 0.8926232799278784, 'learning_rate': 4.845302505161954e-06, 'epoch': 0.52} 52%|█████▏ | 17981/34278 [19:47:51<15:10:35, 3.35s/it] 52%|█████▏ | 17982/34278 [19:47:54<14:48:16, 3.27s/it] {'loss': 0.1393, 'grad_norm': 0.869689043109833, 'learning_rate': 4.844830297870303e-06, 'epoch': 0.52} 52%|█████▏ | 17982/34278 [19:47:54<14:48:16, 3.27s/it] 52%|█████▏ | 17983/34278 [19:48:00<18:30:58, 4.09s/it] {'loss': 0.1361, 'grad_norm': 0.9578446700207232, 'learning_rate': 4.844358091963971e-06, 'epoch': 0.52} 52%|█████▏ | 17983/34278 [19:48:00<18:30:58, 4.09s/it] 52%|█████▏ | 17984/34278 [19:48:03<17:44:33, 3.92s/it] {'loss': 0.1239, 'grad_norm': 1.1267656458186959, 'learning_rate': 4.8438858874471754e-06, 'epoch': 0.52} 52%|█████▏ | 17984/34278 [19:48:03<17:44:33, 3.92s/it] 52%|█████▏ | 17985/34278 [19:48:06<16:22:58, 3.62s/it] {'loss': 0.1585, 'grad_norm': 0.9362500665418243, 'learning_rate': 4.843413684324124e-06, 'epoch': 0.52} 52%|█████▏ | 17985/34278 [19:48:06<16:22:58, 3.62s/it] 52%|█████▏ | 17986/34278 
[19:48:10<16:06:43, 3.56s/it] {'loss': 0.1336, 'grad_norm': 0.8570036739285297, 'learning_rate': 4.842941482599041e-06, 'epoch': 0.52} 52%|█████▏ | 17986/34278 [19:48:10<16:06:43, 3.56s/it] 52%|█████▏ | 17987/34278 [19:48:13<15:44:13, 3.48s/it] {'loss': 0.1415, 'grad_norm': 1.1604667286988664, 'learning_rate': 4.8424692822761395e-06, 'epoch': 0.52} 52%|█████▏ | 17987/34278 [19:48:13<15:44:13, 3.48s/it] 52%|█████▏ | 17988/34278 [19:48:18<18:04:15, 3.99s/it] {'loss': 0.1177, 'grad_norm': 1.0697962974820818, 'learning_rate': 4.841997083359634e-06, 'epoch': 0.52} 52%|█████▏ | 17988/34278 [19:48:18<18:04:15, 3.99s/it] 52%|█████▏ | 17989/34278 [19:48:23<19:24:28, 4.29s/it] {'loss': 0.1328, 'grad_norm': 0.7622841840977652, 'learning_rate': 4.841524885853742e-06, 'epoch': 0.52} 52%|█████▏ | 17989/34278 [19:48:23<19:24:28, 4.29s/it] 52%|█████▏ | 17990/34278 [19:48:27<18:56:36, 4.19s/it] {'loss': 0.1262, 'grad_norm': 0.9267699854793973, 'learning_rate': 4.841052689762676e-06, 'epoch': 0.52} 52%|█████▏ | 17990/34278 [19:48:27<18:56:36, 4.19s/it] 52%|█████▏ | 17991/34278 [19:48:31<18:20:31, 4.05s/it] {'loss': 0.1209, 'grad_norm': 1.0435908215995608, 'learning_rate': 4.840580495090654e-06, 'epoch': 0.52} 52%|█████▏ | 17991/34278 [19:48:31<18:20:31, 4.05s/it] 52%|█████▏ | 17992/34278 [19:48:35<18:02:04, 3.99s/it] {'loss': 0.1447, 'grad_norm': 0.9682931262440511, 'learning_rate': 4.840108301841891e-06, 'epoch': 0.52} 52%|█████▏ | 17992/34278 [19:48:35<18:02:04, 3.99s/it] 52%|█████▏ | 17993/34278 [19:48:39<17:59:50, 3.98s/it] {'loss': 0.1083, 'grad_norm': 0.5997827566998676, 'learning_rate': 4.839636110020605e-06, 'epoch': 0.52} 52%|█████▏ | 17993/34278 [19:48:39<17:59:50, 3.98s/it] 52%|█████▏ | 17994/34278 [19:48:42<17:08:39, 3.79s/it] {'loss': 0.1394, 'grad_norm': 0.8357244523095261, 'learning_rate': 4.839163919631008e-06, 'epoch': 0.52} 52%|█████▏ | 17994/34278 [19:48:42<17:08:39, 3.79s/it] 52%|█████▏ | 17995/34278 [19:48:45<16:03:06, 3.55s/it] {'loss': 0.1316, 'grad_norm': 
0.8528270064538509, 'learning_rate': 4.8386917306773166e-06, 'epoch': 0.52} 52%|█████▏ | 17995/34278 [19:48:45<16:03:06, 3.55s/it] 53%|█████▎ | 17996/34278 [19:48:49<16:13:20, 3.59s/it] {'loss': 0.1245, 'grad_norm': 0.7857553200242777, 'learning_rate': 4.838219543163749e-06, 'epoch': 0.53} 53%|█████▎ | 17996/34278 [19:48:49<16:13:20, 3.59s/it] 53%|█████▎ | 17997/34278 [19:48:54<18:27:52, 4.08s/it] {'loss': 0.1301, 'grad_norm': 0.760674877072806, 'learning_rate': 4.837747357094515e-06, 'epoch': 0.53} 53%|█████▎ | 17997/34278 [19:48:54<18:27:52, 4.08s/it] 53%|█████▎ | 17998/34278 [19:48:57<17:09:13, 3.79s/it] {'loss': 0.177, 'grad_norm': 0.8523709086749641, 'learning_rate': 4.837275172473837e-06, 'epoch': 0.53} 53%|█████▎ | 17998/34278 [19:48:57<17:09:13, 3.79s/it] 53%|█████▎ | 17999/34278 [19:49:01<16:50:08, 3.72s/it] {'loss': 0.1324, 'grad_norm': 1.2116779135550535, 'learning_rate': 4.836802989305927e-06, 'epoch': 0.53} 53%|█████▎ | 17999/34278 [19:49:01<16:50:08, 3.72s/it] 53%|█████▎ | 18000/34278 [19:49:04<15:51:03, 3.51s/it] {'loss': 0.1252, 'grad_norm': 0.8755427782452537, 'learning_rate': 4.836330807595e-06, 'epoch': 0.53} 53%|█████▎ | 18000/34278 [19:49:04<15:51:03, 3.51s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 53%|█████▎ | 18001/34278 [19:49:40<59:59:51, 13.27s/it] {'loss': 0.1389, 'grad_norm': 0.8016962301962923, 'learning_rate': 4.835858627345273e-06, 'epoch': 0.53} 53%|█████▎ | 18001/34278 [19:49:40<59:59:51, 13.27s/it] 53%|█████▎ | 18002/34278 [19:49:43<46:27:10, 10.27s/it] {'loss': 0.1498, 'grad_norm': 0.8869979893036224, 'learning_rate': 4.835386448560961e-06, 'epoch': 0.53} 53%|█████▎ | 18002/34278 [19:49:43<46:27:10, 10.27s/it] 53%|█████▎ | 18003/34278 [19:49:46<36:44:10, 8.13s/it] {'loss': 0.1411, 'grad_norm': 0.9275440747445214, 'learning_rate': 4.834914271246279e-06, 'epoch': 0.53} 53%|█████▎ | 18003/34278 [19:49:46<36:44:10, 8.13s/it] 53%|█████▎ | 18004/34278 [19:49:52<33:53:10, 7.50s/it] {'loss': 0.1371, 'grad_norm': 0.8299382513115088, 'learning_rate': 4.834442095405443e-06, 'epoch': 0.53} 53%|█████▎ | 18004/34278 [19:49:52<33:53:10, 7.50s/it] 53%|█████▎ | 18005/34278 [19:49:56<28:46:49, 6.37s/it] {'loss': 0.1354, 'grad_norm': 0.7344429097988446, 'learning_rate': 4.833969921042669e-06, 'epoch': 0.53} 53%|█████▎ | 18005/34278 [19:49:56<28:46:49, 6.37s/it] 53%|█████▎ | 18006/34278 [19:49:59<24:39:05, 5.45s/it] {'loss': 0.1174, 'grad_norm': 0.8020292187680751, 'learning_rate': 4.833497748162172e-06, 'epoch': 0.53} 53%|█████▎ | 18006/34278 [19:49:59<24:39:05, 5.45s/it] 53%|█████▎ | 18007/34278 [19:50:03<22:01:28, 4.87s/it] {'loss': 0.1168, 'grad_norm': 0.7589677544688201, 'learning_rate': 4.833025576768168e-06, 'epoch': 0.53} 53%|█████▎ | 18007/34278 [19:50:03<22:01:28, 4.87s/it] 53%|█████▎ | 18008/34278 [19:50:06<19:30:00, 4.31s/it] {'loss': 0.134, 'grad_norm': 1.0809993126779969, 'learning_rate': 4.8325534068648705e-06, 'epoch': 0.53} 53%|█████▎ | 18008/34278 [19:50:06<19:30:00, 4.31s/it] 53%|█████▎ | 18009/34278 [19:50:09<18:05:20, 4.00s/it] {'loss': 0.1252, 'grad_norm': 0.8306161921408216, 'learning_rate': 4.8320812384564955e-06, 'epoch': 0.53} 53%|█████▎ | 18009/34278 [19:50:09<18:05:20, 4.00s/it] 53%|█████▎ | 
18010/34278 [19:50:13<17:30:16, 3.87s/it] {'loss': 0.1113, 'grad_norm': 0.7180312168582362, 'learning_rate': 4.83160907154726e-06, 'epoch': 0.53} 53%|█████▎ | 18010/34278 [19:50:13<17:30:16, 3.87s/it] 53%|█████▎ | 18011/34278 [19:50:17<18:22:41, 4.07s/it] {'loss': 0.1431, 'grad_norm': 0.7818417928655756, 'learning_rate': 4.83113690614138e-06, 'epoch': 0.53} 53%|█████▎ | 18011/34278 [19:50:17<18:22:41, 4.07s/it] 53%|█████▎ | 18012/34278 [19:50:21<17:54:18, 3.96s/it] {'loss': 0.1287, 'grad_norm': 0.9556346596527349, 'learning_rate': 4.830664742243068e-06, 'epoch': 0.53} 53%|█████▎ | 18012/34278 [19:50:21<17:54:18, 3.96s/it] 53%|█████▎ | 18013/34278 [19:50:25<17:52:20, 3.96s/it] {'loss': 0.135, 'grad_norm': 1.2172309169139588, 'learning_rate': 4.830192579856541e-06, 'epoch': 0.53} 53%|█████▎ | 18013/34278 [19:50:25<17:52:20, 3.96s/it] 53%|█████▎ | 18014/34278 [19:50:28<16:54:00, 3.74s/it] {'loss': 0.1319, 'grad_norm': 0.9589657292617719, 'learning_rate': 4.829720418986015e-06, 'epoch': 0.53} 53%|█████▎ | 18014/34278 [19:50:28<16:54:00, 3.74s/it] 53%|█████▎ | 18015/34278 [19:50:34<19:53:39, 4.40s/it] {'loss': 0.1404, 'grad_norm': 1.05892071662572, 'learning_rate': 4.829248259635701e-06, 'epoch': 0.53} 53%|█████▎ | 18015/34278 [19:50:34<19:53:39, 4.40s/it] 53%|█████▎ | 18016/34278 [19:50:37<18:07:14, 4.01s/it] {'loss': 0.1275, 'grad_norm': 0.9285604008415789, 'learning_rate': 4.828776101809821e-06, 'epoch': 0.53} 53%|█████▎ | 18016/34278 [19:50:37<18:07:14, 4.01s/it] 53%|█████▎ | 18017/34278 [19:50:40<17:21:42, 3.84s/it] {'loss': 0.1325, 'grad_norm': 0.975494494513817, 'learning_rate': 4.8283039455125865e-06, 'epoch': 0.53} 53%|█████▎ | 18017/34278 [19:50:40<17:21:42, 3.84s/it] 53%|█████▎ | 18018/34278 [19:50:46<20:18:57, 4.50s/it] {'loss': 0.1181, 'grad_norm': 0.7933315168665201, 'learning_rate': 4.827831790748213e-06, 'epoch': 0.53} 53%|█████▎ | 18018/34278 [19:50:46<20:18:57, 4.50s/it] 53%|█████▎ | 18019/34278 [19:50:50<19:14:27, 4.26s/it] {'loss': 0.1438, 
'grad_norm': 1.0561985706953883, 'learning_rate': 4.827359637520917e-06, 'epoch': 0.53} 53%|█████▎ | 18019/34278 [19:50:50<19:14:27, 4.26s/it] 53%|█████▎ | 18020/34278 [19:50:55<20:32:03, 4.55s/it] {'loss': 0.1425, 'grad_norm': 0.8767416437329977, 'learning_rate': 4.826887485834913e-06, 'epoch': 0.53} 53%|█████▎ | 18020/34278 [19:50:55<20:32:03, 4.55s/it] 53%|█████▎ | 18021/34278 [19:50:58<18:27:18, 4.09s/it] {'loss': 0.1291, 'grad_norm': 1.030326838227654, 'learning_rate': 4.826415335694414e-06, 'epoch': 0.53} 53%|█████▎ | 18021/34278 [19:50:58<18:27:18, 4.09s/it] 53%|█████▎ | 18022/34278 [19:51:04<20:56:06, 4.64s/it] {'loss': 0.1278, 'grad_norm': 0.6721898444369182, 'learning_rate': 4.8259431871036395e-06, 'epoch': 0.53} 53%|█████▎ | 18022/34278 [19:51:04<20:56:06, 4.64s/it] 53%|█████▎ | 18023/34278 [19:51:08<19:26:18, 4.31s/it] {'loss': 0.1208, 'grad_norm': 0.709761200362223, 'learning_rate': 4.825471040066803e-06, 'epoch': 0.53} 53%|█████▎ | 18023/34278 [19:51:08<19:26:18, 4.31s/it] 53%|█████▎ | 18024/34278 [19:51:13<21:01:49, 4.66s/it] {'loss': 0.1473, 'grad_norm': 0.8058778338832103, 'learning_rate': 4.824998894588118e-06, 'epoch': 0.53} 53%|█████▎ | 18024/34278 [19:51:13<21:01:49, 4.66s/it] 53%|█████▎ | 18025/34278 [19:51:19<22:53:18, 5.07s/it] {'loss': 0.1096, 'grad_norm': 0.7934397989321833, 'learning_rate': 4.824526750671802e-06, 'epoch': 0.53} 53%|█████▎ | 18025/34278 [19:51:19<22:53:18, 5.07s/it] 53%|█████▎ | 18026/34278 [19:51:23<20:25:12, 4.52s/it] {'loss': 0.1274, 'grad_norm': 0.7762464528155616, 'learning_rate': 4.8240546083220705e-06, 'epoch': 0.53} 53%|█████▎ | 18026/34278 [19:51:23<20:25:12, 4.52s/it] 53%|█████▎ | 18027/34278 [19:51:26<18:14:52, 4.04s/it] {'loss': 0.1287, 'grad_norm': 0.8353728222100885, 'learning_rate': 4.823582467543133e-06, 'epoch': 0.53} 53%|█████▎ | 18027/34278 [19:51:26<18:14:52, 4.04s/it] 53%|█████▎ | 18028/34278 [19:51:29<17:08:03, 3.80s/it] {'loss': 0.1248, 'grad_norm': 1.001478652873019, 'learning_rate': 
4.823110328339213e-06, 'epoch': 0.53} 53%|█████▎ | 18028/34278 [19:51:29<17:08:03, 3.80s/it] 53%|█████▎ | 18029/34278 [19:51:33<17:49:36, 3.95s/it] {'loss': 0.1423, 'grad_norm': 0.7875069717632527, 'learning_rate': 4.822638190714521e-06, 'epoch': 0.53} 53%|█████▎ | 18029/34278 [19:51:33<17:49:36, 3.95s/it] 53%|█████▎ | 18030/34278 [19:51:39<20:26:57, 4.53s/it] {'loss': 0.1286, 'grad_norm': 1.136917047056234, 'learning_rate': 4.822166054673273e-06, 'epoch': 0.53} 53%|█████▎ | 18030/34278 [19:51:39<20:26:57, 4.53s/it] 53%|█████▎ | 18031/34278 [19:51:43<19:16:57, 4.27s/it] {'loss': 0.1506, 'grad_norm': 0.8338412718231685, 'learning_rate': 4.821693920219684e-06, 'epoch': 0.53} 53%|█████▎ | 18031/34278 [19:51:43<19:16:57, 4.27s/it] 53%|█████▎ | 18032/34278 [19:51:46<17:35:02, 3.90s/it] {'loss': 0.1283, 'grad_norm': 0.8810387903852749, 'learning_rate': 4.821221787357969e-06, 'epoch': 0.53} 53%|█████▎ | 18032/34278 [19:51:46<17:35:02, 3.90s/it] 53%|█████▎ | 18033/34278 [19:51:50<17:35:22, 3.90s/it] {'loss': 0.1445, 'grad_norm': 0.8387559980864229, 'learning_rate': 4.820749656092342e-06, 'epoch': 0.53} 53%|█████▎ | 18033/34278 [19:51:50<17:35:22, 3.90s/it] 53%|█████▎ | 18034/34278 [19:51:52<16:18:13, 3.61s/it] {'loss': 0.1138, 'grad_norm': 0.6970389163980308, 'learning_rate': 4.820277526427019e-06, 'epoch': 0.53} 53%|█████▎ | 18034/34278 [19:51:52<16:18:13, 3.61s/it] 53%|█████▎ | 18035/34278 [19:51:57<17:21:33, 3.85s/it] {'loss': 0.1412, 'grad_norm': 1.1730141652813515, 'learning_rate': 4.8198053983662175e-06, 'epoch': 0.53} 53%|█████▎ | 18035/34278 [19:51:57<17:21:33, 3.85s/it] 53%|█████▎ | 18036/34278 [19:52:01<17:26:25, 3.87s/it] {'loss': 0.1441, 'grad_norm': 0.8330160665392379, 'learning_rate': 4.81933327191415e-06, 'epoch': 0.53} 53%|█████▎ | 18036/34278 [19:52:01<17:26:25, 3.87s/it] 53%|█████▎ | 18037/34278 [19:52:04<16:21:12, 3.62s/it] {'loss': 0.1326, 'grad_norm': 1.1819803911284248, 'learning_rate': 4.818861147075031e-06, 'epoch': 0.53} 53%|█████▎ | 18037/34278 
[19:52:04<16:21:12, 3.62s/it] 53%|█████▎ | 18038/34278 [19:52:07<15:44:54, 3.49s/it] {'loss': 0.1344, 'grad_norm': 0.7637892124219897, 'learning_rate': 4.818389023853077e-06, 'epoch': 0.53} 53%|█████▎ | 18038/34278 [19:52:07<15:44:54, 3.49s/it] 53%|█████▎ | 18039/34278 [19:52:12<17:53:26, 3.97s/it] {'loss': 0.1468, 'grad_norm': 0.8886704083543827, 'learning_rate': 4.817916902252501e-06, 'epoch': 0.53} 53%|█████▎ | 18039/34278 [19:52:12<17:53:26, 3.97s/it] 53%|█████▎ | 18040/34278 [19:52:17<19:15:39, 4.27s/it] {'loss': 0.129, 'grad_norm': 0.738187964168913, 'learning_rate': 4.817444782277521e-06, 'epoch': 0.53} 53%|█████▎ | 18040/34278 [19:52:17<19:15:39, 4.27s/it] 53%|█████▎ | 18041/34278 [19:52:20<17:14:45, 3.82s/it] {'loss': 0.1182, 'grad_norm': 0.739917219648246, 'learning_rate': 4.8169726639323514e-06, 'epoch': 0.53} 53%|█████▎ | 18041/34278 [19:52:20<17:14:45, 3.82s/it] 53%|█████▎ | 18042/34278 [19:52:23<16:39:17, 3.69s/it] {'loss': 0.1242, 'grad_norm': 0.703088669527814, 'learning_rate': 4.816500547221204e-06, 'epoch': 0.53} 53%|█████▎ | 18042/34278 [19:52:23<16:39:17, 3.69s/it] 53%|█████▎ | 18043/34278 [19:52:26<15:43:00, 3.49s/it] {'loss': 0.1125, 'grad_norm': 0.9440728665811604, 'learning_rate': 4.816028432148298e-06, 'epoch': 0.53} 53%|█████▎ | 18043/34278 [19:52:26<15:43:00, 3.49s/it] 53%|█████▎ | 18044/34278 [19:52:30<15:29:31, 3.44s/it] {'loss': 0.1252, 'grad_norm': 0.6979161404789853, 'learning_rate': 4.8155563187178454e-06, 'epoch': 0.53} 53%|█████▎ | 18044/34278 [19:52:30<15:29:31, 3.44s/it] 53%|█████▎ | 18045/34278 [19:52:33<15:23:55, 3.41s/it] {'loss': 0.1681, 'grad_norm': 0.9156033733058908, 'learning_rate': 4.815084206934059e-06, 'epoch': 0.53} 53%|█████▎ | 18045/34278 [19:52:33<15:23:55, 3.41s/it] 53%|█████▎ | 18046/34278 [19:52:36<15:15:27, 3.38s/it] {'loss': 0.137, 'grad_norm': 0.8652140379149614, 'learning_rate': 4.8146120968011605e-06, 'epoch': 0.53} 53%|█████▎ | 18046/34278 [19:52:36<15:15:27, 3.38s/it] 53%|█████▎ | 18047/34278 
[19:52:40<15:33:45, 3.45s/it] {'loss': 0.1134, 'grad_norm': 1.2206878237891405, 'learning_rate': 4.81413998832336e-06, 'epoch': 0.53} 53%|█████▎ | 18047/34278 [19:52:40<15:33:45, 3.45s/it] 53%|█████▎ | 18048/34278 [19:52:43<14:46:50, 3.28s/it] {'loss': 0.1251, 'grad_norm': 0.8484945222426621, 'learning_rate': 4.813667881504872e-06, 'epoch': 0.53} 53%|█████▎ | 18048/34278 [19:52:43<14:46:50, 3.28s/it] 53%|█████▎ | 18049/34278 [19:52:46<15:05:28, 3.35s/it] {'loss': 0.1281, 'grad_norm': 0.8924153261661256, 'learning_rate': 4.813195776349915e-06, 'epoch': 0.53} 53%|█████▎ | 18049/34278 [19:52:46<15:05:28, 3.35s/it] 53%|█████▎ | 18050/34278 [19:52:49<14:28:18, 3.21s/it] {'loss': 0.1218, 'grad_norm': 0.8985907140185593, 'learning_rate': 4.8127236728627005e-06, 'epoch': 0.53} 53%|█████▎ | 18050/34278 [19:52:49<14:28:18, 3.21s/it] 53%|█████▎ | 18051/34278 [19:52:52<14:20:06, 3.18s/it] {'loss': 0.1316, 'grad_norm': 0.7852840943978696, 'learning_rate': 4.8122515710474426e-06, 'epoch': 0.53} 53%|█████▎ | 18051/34278 [19:52:52<14:20:06, 3.18s/it] 53%|█████▎ | 18052/34278 [19:52:55<14:01:40, 3.11s/it] {'loss': 0.1353, 'grad_norm': 0.850122060780698, 'learning_rate': 4.8117794709083595e-06, 'epoch': 0.53} 53%|█████▎ | 18052/34278 [19:52:55<14:01:40, 3.11s/it] 53%|█████▎ | 18053/34278 [19:52:58<14:08:40, 3.14s/it] {'loss': 0.1529, 'grad_norm': 0.7281238097267843, 'learning_rate': 4.811307372449665e-06, 'epoch': 0.53} 53%|█████▎ | 18053/34278 [19:52:58<14:08:40, 3.14s/it] 53%|█████▎ | 18054/34278 [19:53:04<17:52:38, 3.97s/it] {'loss': 0.1123, 'grad_norm': 0.8030551206498359, 'learning_rate': 4.810835275675572e-06, 'epoch': 0.53} 53%|█████▎ | 18054/34278 [19:53:04<17:52:38, 3.97s/it] 53%|█████▎ | 18055/34278 [19:53:07<16:16:21, 3.61s/it] {'loss': 0.1318, 'grad_norm': 0.8632120116245481, 'learning_rate': 4.810363180590298e-06, 'epoch': 0.53} 53%|█████▎ | 18055/34278 [19:53:07<16:16:21, 3.61s/it] 53%|█████▎ | 18056/34278 [19:53:10<15:22:07, 3.41s/it] {'loss': 0.1319, 'grad_norm': 
0.7320787543014441, 'learning_rate': 4.809891087198056e-06, 'epoch': 0.53} 53%|█████▎ | 18056/34278 [19:53:10<15:22:07, 3.41s/it] 53%|█████▎ | 18057/34278 [19:53:13<14:43:03, 3.27s/it] {'loss': 0.1503, 'grad_norm': 0.8081987981181039, 'learning_rate': 4.8094189955030576e-06, 'epoch': 0.53} 53%|█████▎ | 18057/34278 [19:53:13<14:43:03, 3.27s/it] 53%|█████▎ | 18058/34278 [19:53:16<14:47:36, 3.28s/it] {'loss': 0.1378, 'grad_norm': 0.9697511601745913, 'learning_rate': 4.808946905509524e-06, 'epoch': 0.53} 53%|█████▎ | 18058/34278 [19:53:16<14:47:36, 3.28s/it] 53%|█████▎ | 18059/34278 [19:53:19<14:32:04, 3.23s/it] {'loss': 0.1357, 'grad_norm': 0.7832968130363371, 'learning_rate': 4.808474817221666e-06, 'epoch': 0.53} 53%|█████▎ | 18059/34278 [19:53:19<14:32:04, 3.23s/it] 53%|█████▎ | 18060/34278 [19:53:22<14:09:41, 3.14s/it] {'loss': 0.1348, 'grad_norm': 0.743714129006087, 'learning_rate': 4.808002730643699e-06, 'epoch': 0.53} 53%|█████▎ | 18060/34278 [19:53:22<14:09:41, 3.14s/it] 53%|█████▎ | 18061/34278 [19:53:28<17:05:11, 3.79s/it] {'loss': 0.1377, 'grad_norm': 0.8473482931279858, 'learning_rate': 4.80753064577984e-06, 'epoch': 0.53} 53%|█████▎ | 18061/34278 [19:53:28<17:05:11, 3.79s/it] 53%|█████▎ | 18062/34278 [19:53:31<16:10:17, 3.59s/it] {'loss': 0.1463, 'grad_norm': 0.9650357162729263, 'learning_rate': 4.807058562634299e-06, 'epoch': 0.53} 53%|█████▎ | 18062/34278 [19:53:31<16:10:17, 3.59s/it] 53%|█████▎ | 18063/34278 [19:53:34<15:36:03, 3.46s/it] {'loss': 0.1289, 'grad_norm': 0.9278197670917236, 'learning_rate': 4.806586481211293e-06, 'epoch': 0.53} 53%|█████▎ | 18063/34278 [19:53:34<15:36:03, 3.46s/it] 53%|█████▎ | 18064/34278 [19:53:38<15:52:27, 3.52s/it] {'loss': 0.1137, 'grad_norm': 1.0563501491813705, 'learning_rate': 4.806114401515037e-06, 'epoch': 0.53} 53%|█████▎ | 18064/34278 [19:53:38<15:52:27, 3.52s/it] 53%|█████▎ | 18065/34278 [19:53:44<19:14:06, 4.27s/it] {'loss': 0.1269, 'grad_norm': 0.8898702218224827, 'learning_rate': 4.805642323549746e-06, 
'epoch': 0.53} 53%|█████▎ | 18065/34278 [19:53:44<19:14:06, 4.27s/it] 53%|█████▎ | 18066/34278 [19:53:46<17:20:30, 3.85s/it] {'loss': 0.1188, 'grad_norm': 1.1734522270005805, 'learning_rate': 4.805170247319634e-06, 'epoch': 0.53} 53%|█████▎ | 18066/34278 [19:53:46<17:20:30, 3.85s/it] 53%|█████▎ | 18067/34278 [19:53:50<17:16:21, 3.84s/it] {'loss': 0.1203, 'grad_norm': 0.9957179615002449, 'learning_rate': 4.804698172828915e-06, 'epoch': 0.53} 53%|█████▎ | 18067/34278 [19:53:50<17:16:21, 3.84s/it] 53%|█████▎ | 18068/34278 [19:53:55<18:04:45, 4.02s/it] {'loss': 0.1241, 'grad_norm': 0.9750430119631452, 'learning_rate': 4.804226100081805e-06, 'epoch': 0.53} 53%|█████▎ | 18068/34278 [19:53:55<18:04:45, 4.02s/it] 53%|█████▎ | 18069/34278 [19:53:58<16:56:38, 3.76s/it] {'loss': 0.1403, 'grad_norm': 0.8965316300644635, 'learning_rate': 4.803754029082516e-06, 'epoch': 0.53} 53%|█████▎ | 18069/34278 [19:53:58<16:56:38, 3.76s/it] 53%|█████▎ | 18070/34278 [19:54:01<16:41:46, 3.71s/it] {'loss': 0.1589, 'grad_norm': 0.9609371465783336, 'learning_rate': 4.803281959835265e-06, 'epoch': 0.53} 53%|█████▎ | 18070/34278 [19:54:01<16:41:46, 3.71s/it] 53%|█████▎ | 18071/34278 [19:54:07<19:41:18, 4.37s/it] {'loss': 0.1542, 'grad_norm': 1.0046840178298868, 'learning_rate': 4.802809892344267e-06, 'epoch': 0.53} 53%|█████▎ | 18071/34278 [19:54:07<19:41:18, 4.37s/it] 53%|█████▎ | 18072/34278 [19:54:12<20:03:46, 4.46s/it] {'loss': 0.1275, 'grad_norm': 0.8883119399217058, 'learning_rate': 4.802337826613733e-06, 'epoch': 0.53} 53%|█████▎ | 18072/34278 [19:54:12<20:03:46, 4.46s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (11976 > 8192). 
Running this sequence through the model will result in indexing errors 53%|█████▎ | 18073/34278 [19:54:15<18:14:15, 4.05s/it] {'loss': 0.1218, 'grad_norm': 1.129326124691106, 'learning_rate': 4.801865762647881e-06, 'epoch': 0.53} 53%|█████▎ | 18073/34278 [19:54:15<18:14:15, 4.05s/it] 53%|█████▎ | 18074/34278 [19:54:18<17:12:11, 3.82s/it] {'loss': 0.1454, 'grad_norm': 0.8033499427478014, 'learning_rate': 4.8013937004509255e-06, 'epoch': 0.53} 53%|█████▎ | 18074/34278 [19:54:18<17:12:11, 3.82s/it] 53%|█████▎ | 18075/34278 [19:54:22<16:24:44, 3.65s/it] {'loss': 0.1143, 'grad_norm': 0.838794416406774, 'learning_rate': 4.800921640027075e-06, 'epoch': 0.53} 53%|█████▎ | 18075/34278 [19:54:22<16:24:44, 3.65s/it] 53%|█████▎ | 18076/34278 [19:54:27<18:57:41, 4.21s/it] {'loss': 0.1467, 'grad_norm': 0.7796150481679842, 'learning_rate': 4.800449581380553e-06, 'epoch': 0.53} 53%|█████▎ | 18076/34278 [19:54:27<18:57:41, 4.21s/it] 53%|█████▎ | 18077/34278 [19:54:30<17:33:41, 3.90s/it] {'loss': 0.1448, 'grad_norm': 0.7990660761113042, 'learning_rate': 4.799977524515569e-06, 'epoch': 0.53} 53%|█████▎ | 18077/34278 [19:54:30<17:33:41, 3.90s/it] 53%|█████▎ | 18078/34278 [19:54:34<16:47:11, 3.73s/it] {'loss': 0.1143, 'grad_norm': 0.7281474478754107, 'learning_rate': 4.799505469436336e-06, 'epoch': 0.53} 53%|█████▎ | 18078/34278 [19:54:34<16:47:11, 3.73s/it] 53%|█████▎ | 18079/34278 [19:54:39<18:39:39, 4.15s/it] {'loss': 0.1214, 'grad_norm': 0.6220072426216634, 'learning_rate': 4.799033416147072e-06, 'epoch': 0.53} 53%|█████▎ | 18079/34278 [19:54:39<18:39:39, 4.15s/it] 53%|█████▎ | 18080/34278 [19:54:44<19:52:03, 4.42s/it] {'loss': 0.129, 'grad_norm': 0.8078280407580145, 'learning_rate': 4.798561364651989e-06, 'epoch': 0.53} 53%|█████▎ | 18080/34278 [19:54:44<19:52:03, 4.42s/it] 53%|█████▎ | 18081/34278 [19:54:47<18:35:25, 4.13s/it] {'loss': 0.1186, 'grad_norm': 0.9613274756810768, 'learning_rate': 4.798089314955301e-06, 'epoch': 0.53} 53%|█████▎ | 18081/34278 [19:54:47<18:35:25, 
4.13s/it] 53%|█████▎ | 18082/34278 [19:54:50<17:01:14, 3.78s/it] {'loss': 0.1264, 'grad_norm': 0.9082737903113405, 'learning_rate': 4.797617267061225e-06, 'epoch': 0.53} 53%|█████▎ | 18082/34278 [19:54:50<17:01:14, 3.78s/it] 53%|█████▎ | 18083/34278 [19:54:54<17:25:38, 3.87s/it] {'loss': 0.1178, 'grad_norm': 0.8825640198193014, 'learning_rate': 4.797145220973974e-06, 'epoch': 0.53} 53%|█████▎ | 18083/34278 [19:54:54<17:25:38, 3.87s/it] 53%|█████▎ | 18084/34278 [19:55:00<20:08:39, 4.48s/it] {'loss': 0.1431, 'grad_norm': 1.0322207175883822, 'learning_rate': 4.796673176697761e-06, 'epoch': 0.53} 53%|█████▎ | 18084/34278 [19:55:00<20:08:39, 4.48s/it] 53%|█████▎ | 18085/34278 [19:55:03<18:13:48, 4.05s/it] {'loss': 0.1313, 'grad_norm': 0.8572898625618728, 'learning_rate': 4.796201134236802e-06, 'epoch': 0.53} 53%|█████▎ | 18085/34278 [19:55:03<18:13:48, 4.05s/it] 53%|█████▎ | 18086/34278 [19:55:06<16:50:59, 3.75s/it] {'loss': 0.1033, 'grad_norm': 0.6383544910445332, 'learning_rate': 4.795729093595311e-06, 'epoch': 0.53} 53%|█████▎ | 18086/34278 [19:55:06<16:50:59, 3.75s/it] 53%|█████▎ | 18087/34278 [19:55:10<16:29:33, 3.67s/it] {'loss': 0.1352, 'grad_norm': 1.1487167821455868, 'learning_rate': 4.795257054777498e-06, 'epoch': 0.53} 53%|█████▎ | 18087/34278 [19:55:10<16:29:33, 3.67s/it] 53%|█████▎ | 18088/34278 [19:55:13<15:24:02, 3.42s/it] {'loss': 0.1281, 'grad_norm': 1.227802510553886, 'learning_rate': 4.794785017787586e-06, 'epoch': 0.53} 53%|█████▎ | 18088/34278 [19:55:13<15:24:02, 3.42s/it] 53%|█████▎ | 18089/34278 [19:55:17<16:07:31, 3.59s/it] {'loss': 0.1271, 'grad_norm': 1.0471796945750702, 'learning_rate': 4.794312982629782e-06, 'epoch': 0.53} 53%|█████▎ | 18089/34278 [19:55:17<16:07:31, 3.59s/it] 53%|█████▎ | 18090/34278 [19:55:20<15:46:44, 3.51s/it] {'loss': 0.1337, 'grad_norm': 1.015561482373048, 'learning_rate': 4.793840949308303e-06, 'epoch': 0.53} 53%|█████▎ | 18090/34278 [19:55:20<15:46:44, 3.51s/it] 53%|█████▎ | 18091/34278 [19:55:24<16:49:44, 3.74s/it] 
{'loss': 0.1393, 'grad_norm': 1.2683844195254879, 'learning_rate': 4.793368917827364e-06, 'epoch': 0.53} 53%|█████▎ | 18091/34278 [19:55:24<16:49:44, 3.74s/it] 53%|█████▎ | 18092/34278 [19:55:27<15:53:39, 3.54s/it] {'loss': 0.1199, 'grad_norm': 1.0412588226822705, 'learning_rate': 4.792896888191178e-06, 'epoch': 0.53} 53%|█████▎ | 18092/34278 [19:55:27<15:53:39, 3.54s/it] 53%|█████▎ | 18093/34278 [19:55:30<14:59:35, 3.33s/it] {'loss': 0.1458, 'grad_norm': 0.8197454736929621, 'learning_rate': 4.792424860403956e-06, 'epoch': 0.53} 53%|█████▎ | 18093/34278 [19:55:30<14:59:35, 3.33s/it] 53%|█████▎ | 18094/34278 [19:55:33<14:46:08, 3.29s/it] {'loss': 0.1415, 'grad_norm': 0.937736034001576, 'learning_rate': 4.791952834469918e-06, 'epoch': 0.53} 53%|█████▎ | 18094/34278 [19:55:33<14:46:08, 3.29s/it] 53%|█████▎ | 18095/34278 [19:55:37<15:47:02, 3.51s/it] {'loss': 0.1434, 'grad_norm': 0.7921275622064367, 'learning_rate': 4.791480810393274e-06, 'epoch': 0.53} 53%|█████▎ | 18095/34278 [19:55:37<15:47:02, 3.51s/it] 53%|█████▎ | 18096/34278 [19:55:40<14:50:47, 3.30s/it] {'loss': 0.1139, 'grad_norm': 0.981332889317743, 'learning_rate': 4.791008788178242e-06, 'epoch': 0.53} 53%|█████▎ | 18096/34278 [19:55:40<14:50:47, 3.30s/it] 53%|█████▎ | 18097/34278 [19:55:45<17:20:21, 3.86s/it] {'loss': 0.1229, 'grad_norm': 0.7827055333623226, 'learning_rate': 4.790536767829031e-06, 'epoch': 0.53} 53%|█████▎ | 18097/34278 [19:55:45<17:20:21, 3.86s/it] 53%|█████▎ | 18098/34278 [19:55:49<17:02:28, 3.79s/it] {'loss': 0.1229, 'grad_norm': 0.7327383006507882, 'learning_rate': 4.790064749349859e-06, 'epoch': 0.53} 53%|█████▎ | 18098/34278 [19:55:49<17:02:28, 3.79s/it] 53%|█████▎ | 18099/34278 [19:55:52<15:36:32, 3.47s/it] {'loss': 0.145, 'grad_norm': 0.9917459690118268, 'learning_rate': 4.789592732744938e-06, 'epoch': 0.53} 53%|█████▎ | 18099/34278 [19:55:52<15:36:32, 3.47s/it] 53%|█████▎ | 18100/34278 [19:55:55<14:53:27, 3.31s/it] {'loss': 0.1365, 'grad_norm': 0.8834634369964915, 'learning_rate': 
4.789120718018483e-06, 'epoch': 0.53} 53%|█████▎ | 18100/34278 [19:55:55<14:53:27, 3.31s/it] 53%|█████▎ | 18101/34278 [19:55:58<14:27:03, 3.22s/it] {'loss': 0.1509, 'grad_norm': 0.8989532761807614, 'learning_rate': 4.788648705174709e-06, 'epoch': 0.53} 53%|█████▎ | 18101/34278 [19:55:58<14:27:03, 3.22s/it] 53%|█████▎ | 18102/34278 [19:56:01<15:12:18, 3.38s/it] {'loss': 0.1635, 'grad_norm': 0.9894850757227963, 'learning_rate': 4.788176694217829e-06, 'epoch': 0.53} 53%|█████▎ | 18102/34278 [19:56:01<15:12:18, 3.38s/it] 53%|█████▎ | 18103/34278 [19:56:05<15:57:23, 3.55s/it] {'loss': 0.1191, 'grad_norm': 0.7255062625172629, 'learning_rate': 4.787704685152056e-06, 'epoch': 0.53} 53%|█████▎ | 18103/34278 [19:56:05<15:57:23, 3.55s/it] 53%|█████▎ | 18104/34278 [19:56:11<18:42:59, 4.17s/it] {'loss': 0.1336, 'grad_norm': 0.9273245366090763, 'learning_rate': 4.787232677981606e-06, 'epoch': 0.53} 53%|█████▎ | 18104/34278 [19:56:11<18:42:59, 4.17s/it] 53%|█████▎ | 18105/34278 [19:56:14<16:52:41, 3.76s/it] {'loss': 0.1449, 'grad_norm': 1.0659834874039709, 'learning_rate': 4.786760672710688e-06, 'epoch': 0.53} 53%|█████▎ | 18105/34278 [19:56:14<16:52:41, 3.76s/it] 53%|█████▎ | 18106/34278 [19:56:19<18:41:28, 4.16s/it] {'loss': 0.1389, 'grad_norm': 0.9509628085125836, 'learning_rate': 4.786288669343524e-06, 'epoch': 0.53} 53%|█████▎ | 18106/34278 [19:56:19<18:41:28, 4.16s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( Token indices sequence length is longer than the specified maximum sequence length for this model (12092 > 8192). 
Running this sequence through the model will result in indexing errors 53%|█████▎ | 18107/34278 [19:56:23<18:15:17, 4.06s/it] {'loss': 0.1167, 'grad_norm': 0.8294774142889741, 'learning_rate': 4.785816667884322e-06, 'epoch': 0.53} 53%|█████▎ | 18107/34278 [19:56:23<18:15:17, 4.06s/it] 53%|█████▎ | 18108/34278 [19:56:27<18:06:32, 4.03s/it] {'loss': 0.1248, 'grad_norm': 0.7845219601055182, 'learning_rate': 4.785344668337298e-06, 'epoch': 0.53} 53%|█████▎ | 18108/34278 [19:56:27<18:06:32, 4.03s/it] 53%|█████▎ | 18109/34278 [19:56:29<16:25:48, 3.66s/it] {'loss': 0.1316, 'grad_norm': 1.0530402775112828, 'learning_rate': 4.784872670706667e-06, 'epoch': 0.53} 53%|█████▎ | 18109/34278 [19:56:30<16:25:48, 3.66s/it] 53%|█████▎ | 18110/34278 [19:56:33<15:46:40, 3.51s/it] {'loss': 0.109, 'grad_norm': 0.7586720307979221, 'learning_rate': 4.78440067499664e-06, 'epoch': 0.53} 53%|█████▎ | 18110/34278 [19:56:33<15:46:40, 3.51s/it] 53%|█████▎ | 18111/34278 [19:56:37<17:08:14, 3.82s/it] {'loss': 0.1457, 'grad_norm': 0.7611260963669129, 'learning_rate': 4.783928681211431e-06, 'epoch': 0.53} 53%|█████▎ | 18111/34278 [19:56:37<17:08:14, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 53%|█████▎ | 18112/34278 [19:56:40<15:56:51, 3.55s/it] {'loss': 0.1175, 'grad_norm': 0.7210615560156368, 'learning_rate': 4.7834566893552566e-06, 'epoch': 0.53} 53%|█████▎ | 18112/34278 [19:56:40<15:56:51, 3.55s/it] 53%|█████▎ | 18113/34278 [19:56:44<16:33:38, 3.69s/it] {'loss': 0.1223, 'grad_norm': 0.7575481673555559, 'learning_rate': 4.78298469943233e-06, 'epoch': 0.53} 53%|█████▎ | 18113/34278 [19:56:44<16:33:38, 3.69s/it] 53%|█████▎ | 18114/34278 [19:56:48<16:16:40, 3.63s/it] {'loss': 0.1534, 'grad_norm': 0.9555994621082579, 'learning_rate': 4.782512711446864e-06, 'epoch': 0.53} 53%|█████▎ | 18114/34278 [19:56:48<16:16:40, 3.63s/it] 53%|█████▎ | 18115/34278 [19:56:51<15:54:08, 3.54s/it] {'loss': 0.1299, 'grad_norm': 0.604928116169317, 'learning_rate': 4.782040725403071e-06, 'epoch': 0.53} 53%|█████▎ | 18115/34278 [19:56:51<15:54:08, 3.54s/it] 53%|█████▎ | 18116/34278 [19:56:58<20:04:50, 4.47s/it] {'loss': 0.1377, 'grad_norm': 0.7659277118805252, 'learning_rate': 4.781568741305168e-06, 'epoch': 0.53} 53%|█████▎ | 18116/34278 [19:56:58<20:04:50, 4.47s/it] 53%|█████▎ | 18117/34278 [19:57:01<18:04:26, 4.03s/it] {'loss': 0.1313, 'grad_norm': 0.8654886039910025, 'learning_rate': 4.781096759157365e-06, 'epoch': 0.53} 53%|█████▎ | 18117/34278 [19:57:01<18:04:26, 4.03s/it] 53%|█████▎ | 18118/34278 [19:57:04<17:32:43, 3.91s/it] {'loss': 0.1355, 'grad_norm': 0.6664524862480887, 'learning_rate': 4.78062477896388e-06, 'epoch': 0.53} 53%|█████▎ | 18118/34278 [19:57:04<17:32:43, 3.91s/it] 53%|█████▎ | 18119/34278 [19:57:08<17:10:50, 3.83s/it] {'loss': 0.113, 'grad_norm': 0.5936013882066392, 'learning_rate': 4.780152800728924e-06, 'epoch': 0.53} 53%|█████▎ | 18119/34278 [19:57:08<17:10:50, 3.83s/it] 53%|█████▎ | 18120/34278 [19:57:14<20:38:27, 4.60s/it] {'loss': 0.1262, 'grad_norm': 0.7703810986902896, 'learning_rate': 4.779680824456711e-06, 'epoch': 0.53} 53%|█████▎ | 18120/34278 [19:57:14<20:38:27, 4.60s/it] 53%|█████▎ | 18121/34278 
[training progress condensed: steps 18121–18206 of 34278, 53%, epoch 0.53 — loss 0.11–0.17, grad_norm 0.63–1.29, learning_rate 4.779e-06 → 4.739e-06, 3.1–5.2 s/it; each step's tqdm progress line was originally rendered twice]

Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff4f3faa5c0>
Failed to fetch sample 3641701.
Exception: cannot identify image file <_io.BytesIO object at 0x7ff4f3faa5c0>

[training progress condensed: steps 18207–18308 of 34278, 53%, epoch 0.53 — loss 0.10–0.18, grad_norm 0.57–2.85, learning_rate 4.739e-06 → 4.691e-06, 3.1–4.8 s/it; training continued normally after the failed sample]
0.53} 53%|█████▎ | 18308/34278 [20:09:14<15:33:32, 3.51s/it] 53%|█████▎ | 18309/34278 [20:09:17<15:14:44, 3.44s/it] {'loss': 0.1769, 'grad_norm': 1.2382735820396864, 'learning_rate': 4.690517365863843e-06, 'epoch': 0.53} 53%|█████▎ | 18309/34278 [20:09:17<15:14:44, 3.44s/it] 53%|█████▎ | 18310/34278 [20:09:23<18:09:55, 4.10s/it] {'loss': 0.1269, 'grad_norm': 1.2838804628353189, 'learning_rate': 4.690045838948197e-06, 'epoch': 0.53} 53%|█████▎ | 18310/34278 [20:09:23<18:09:55, 4.10s/it] 53%|█████▎ | 18311/34278 [20:09:27<17:32:58, 3.96s/it] {'loss': 0.1213, 'grad_norm': 0.948425471216006, 'learning_rate': 4.689574314799749e-06, 'epoch': 0.53} 53%|█████▎ | 18311/34278 [20:09:27<17:32:58, 3.96s/it] 53%|█████▎ | 18312/34278 [20:09:30<16:39:06, 3.75s/it] {'loss': 0.1259, 'grad_norm': 0.8091162461170046, 'learning_rate': 4.689102793422706e-06, 'epoch': 0.53} 53%|█████▎ | 18312/34278 [20:09:30<16:39:06, 3.75s/it] 53%|█████▎ | 18313/34278 [20:09:35<17:49:58, 4.02s/it] {'loss': 0.1353, 'grad_norm': 1.2537966712209563, 'learning_rate': 4.688631274821279e-06, 'epoch': 0.53} 53%|█████▎ | 18313/34278 [20:09:35<17:49:58, 4.02s/it] 53%|█████▎ | 18314/34278 [20:09:38<16:34:27, 3.74s/it] {'loss': 0.1308, 'grad_norm': 0.937753720979893, 'learning_rate': 4.688159758999676e-06, 'epoch': 0.53} 53%|█████▎ | 18314/34278 [20:09:38<16:34:27, 3.74s/it] 53%|█████▎ | 18315/34278 [20:09:44<19:27:24, 4.39s/it] {'loss': 0.1301, 'grad_norm': 0.8199593811915793, 'learning_rate': 4.687688245962111e-06, 'epoch': 0.53} 53%|█████▎ | 18315/34278 [20:09:44<19:27:24, 4.39s/it] 53%|█████▎ | 18316/34278 [20:09:47<17:29:05, 3.94s/it] {'loss': 0.1357, 'grad_norm': 1.0522108926739333, 'learning_rate': 4.68721673571279e-06, 'epoch': 0.53} 53%|█████▎ | 18316/34278 [20:09:47<17:29:05, 3.94s/it] 53%|█████▎ | 18317/34278 [20:09:50<17:08:36, 3.87s/it] {'loss': 0.1278, 'grad_norm': 0.9877584992858726, 'learning_rate': 4.686745228255923e-06, 'epoch': 0.53} 53%|█████▎ | 18317/34278 [20:09:50<17:08:36, 3.87s/it] 
53%|█████▎ | 18318/34278 [20:09:54<17:00:43, 3.84s/it] {'loss': 0.1138, 'grad_norm': 0.6561644672374607, 'learning_rate': 4.686273723595721e-06, 'epoch': 0.53} 53%|█████▎ | 18318/34278 [20:09:54<17:00:43, 3.84s/it] 53%|█████▎ | 18319/34278 [20:09:57<16:31:18, 3.73s/it] {'loss': 0.1439, 'grad_norm': 0.674583807057486, 'learning_rate': 4.685802221736391e-06, 'epoch': 0.53} 53%|█████▎ | 18319/34278 [20:09:57<16:31:18, 3.73s/it] 53%|█████▎ | 18320/34278 [20:10:02<17:08:54, 3.87s/it] {'loss': 0.1348, 'grad_norm': 0.8126408168353374, 'learning_rate': 4.685330722682143e-06, 'epoch': 0.53} 53%|█████▎ | 18320/34278 [20:10:02<17:08:54, 3.87s/it] 53%|█████▎ | 18321/34278 [20:10:05<16:05:55, 3.63s/it] {'loss': 0.123, 'grad_norm': 0.9298237473943516, 'learning_rate': 4.684859226437188e-06, 'epoch': 0.53} 53%|█████▎ | 18321/34278 [20:10:05<16:05:55, 3.63s/it] 53%|█████▎ | 18322/34278 [20:10:08<16:08:10, 3.64s/it] {'loss': 0.1285, 'grad_norm': 0.6441497151952471, 'learning_rate': 4.684387733005735e-06, 'epoch': 0.53} 53%|█████▎ | 18322/34278 [20:10:08<16:08:10, 3.64s/it] 53%|█████▎ | 18323/34278 [20:10:12<16:17:13, 3.67s/it] {'loss': 0.1261, 'grad_norm': 0.6988583308964291, 'learning_rate': 4.6839162423919946e-06, 'epoch': 0.53} 53%|█████▎ | 18323/34278 [20:10:12<16:17:13, 3.67s/it] 53%|█████▎ | 18324/34278 [20:10:15<15:11:31, 3.43s/it] {'loss': 0.1171, 'grad_norm': 0.729748063937843, 'learning_rate': 4.683444754600172e-06, 'epoch': 0.53} 53%|█████▎ | 18324/34278 [20:10:15<15:11:31, 3.43s/it] 53%|█████▎ | 18325/34278 [20:10:18<14:41:12, 3.31s/it] {'loss': 0.1534, 'grad_norm': 0.7203507248612669, 'learning_rate': 4.6829732696344796e-06, 'epoch': 0.53} 53%|█████▎ | 18325/34278 [20:10:18<14:41:12, 3.31s/it] 53%|█████▎ | 18326/34278 [20:10:24<17:51:29, 4.03s/it] {'loss': 0.1368, 'grad_norm': 0.8065783589834921, 'learning_rate': 4.6825017874991255e-06, 'epoch': 0.53} 53%|█████▎ | 18326/34278 [20:10:24<17:51:29, 4.03s/it] 53%|█████▎ | 18327/34278 [20:10:27<17:14:49, 3.89s/it] {'loss': 
0.1433, 'grad_norm': 0.9653649908196572, 'learning_rate': 4.6820303081983205e-06, 'epoch': 0.53} 53%|█████▎ | 18327/34278 [20:10:27<17:14:49, 3.89s/it] 53%|█████▎ | 18328/34278 [20:10:30<16:04:44, 3.63s/it] {'loss': 0.136, 'grad_norm': 0.9393678065840173, 'learning_rate': 4.681558831736274e-06, 'epoch': 0.53} 53%|█████▎ | 18328/34278 [20:10:30<16:04:44, 3.63s/it] 53%|█████▎ | 18329/34278 [20:10:34<16:06:33, 3.64s/it] {'loss': 0.1583, 'grad_norm': 0.808057939594074, 'learning_rate': 4.681087358117193e-06, 'epoch': 0.53} 53%|█████▎ | 18329/34278 [20:10:34<16:06:33, 3.64s/it] 53%|█████▎ | 18330/34278 [20:10:38<16:37:51, 3.75s/it] {'loss': 0.1311, 'grad_norm': 0.7662604504264521, 'learning_rate': 4.680615887345288e-06, 'epoch': 0.53} 53%|█████▎ | 18330/34278 [20:10:38<16:37:51, 3.75s/it] 53%|█████▎ | 18331/34278 [20:10:41<16:06:04, 3.63s/it] {'loss': 0.1489, 'grad_norm': 0.9298896011013003, 'learning_rate': 4.680144419424769e-06, 'epoch': 0.53} 53%|█████▎ | 18331/34278 [20:10:41<16:06:04, 3.63s/it] 53%|█████▎ | 18332/34278 [20:10:47<19:21:02, 4.37s/it] {'loss': 0.1452, 'grad_norm': 0.7472354714700918, 'learning_rate': 4.679672954359842e-06, 'epoch': 0.53} 53%|█████▎ | 18332/34278 [20:10:47<19:21:02, 4.37s/it] 53%|█████▎ | 18333/34278 [20:10:51<18:05:58, 4.09s/it] {'loss': 0.1393, 'grad_norm': 0.8432475776533255, 'learning_rate': 4.679201492154721e-06, 'epoch': 0.53} 53%|█████▎ | 18333/34278 [20:10:51<18:05:58, 4.09s/it] 53%|█████▎ | 18334/34278 [20:10:54<16:29:17, 3.72s/it] {'loss': 0.1253, 'grad_norm': 0.8881170093770653, 'learning_rate': 4.678730032813611e-06, 'epoch': 0.53} 53%|█████▎ | 18334/34278 [20:10:54<16:29:17, 3.72s/it] 53%|█████▎ | 18335/34278 [20:10:57<15:37:10, 3.53s/it] {'loss': 0.117, 'grad_norm': 0.9069534303549787, 'learning_rate': 4.678258576340723e-06, 'epoch': 0.53} 53%|█████▎ | 18335/34278 [20:10:57<15:37:10, 3.53s/it] 53%|█████▎ | 18336/34278 [20:11:00<15:39:48, 3.54s/it] {'loss': 0.1175, 'grad_norm': 0.7528468968079108, 'learning_rate': 
4.677787122740267e-06, 'epoch': 0.53} 53%|█████▎ | 18336/34278 [20:11:00<15:39:48, 3.54s/it] 53%|█████▎ | 18337/34278 [20:11:04<15:09:32, 3.42s/it] {'loss': 0.1154, 'grad_norm': 0.9679569993903948, 'learning_rate': 4.677315672016446e-06, 'epoch': 0.53} 53%|█████▎ | 18337/34278 [20:11:04<15:09:32, 3.42s/it] 53%|█████▎ | 18338/34278 [20:11:07<15:04:12, 3.40s/it] {'loss': 0.1319, 'grad_norm': 1.2178409028124713, 'learning_rate': 4.6768442241734785e-06, 'epoch': 0.53} 53%|█████▎ | 18338/34278 [20:11:07<15:04:12, 3.40s/it] 54%|█████▎ | 18339/34278 [20:11:11<15:33:21, 3.51s/it] {'loss': 0.1321, 'grad_norm': 1.094091458256245, 'learning_rate': 4.676372779215568e-06, 'epoch': 0.54} 54%|█████▎ | 18339/34278 [20:11:11<15:33:21, 3.51s/it] 54%|█████▎ | 18340/34278 [20:11:14<14:59:08, 3.38s/it] {'loss': 0.1224, 'grad_norm': 0.82715505011348, 'learning_rate': 4.675901337146922e-06, 'epoch': 0.54} 54%|█████▎ | 18340/34278 [20:11:14<14:59:08, 3.38s/it] 54%|█████▎ | 18341/34278 [20:11:17<14:56:00, 3.37s/it] {'loss': 0.1355, 'grad_norm': 1.0231909253261822, 'learning_rate': 4.675429897971754e-06, 'epoch': 0.54} 54%|█████▎ | 18341/34278 [20:11:17<14:56:00, 3.37s/it] 54%|█████▎ | 18342/34278 [20:11:21<15:23:12, 3.48s/it] {'loss': 0.147, 'grad_norm': 0.9893810869493288, 'learning_rate': 4.674958461694269e-06, 'epoch': 0.54} 54%|█████▎ | 18342/34278 [20:11:21<15:23:12, 3.48s/it] 54%|█████▎ | 18343/34278 [20:11:25<16:31:52, 3.73s/it] {'loss': 0.117, 'grad_norm': 0.7506321777001245, 'learning_rate': 4.674487028318676e-06, 'epoch': 0.54} 54%|█████▎ | 18343/34278 [20:11:25<16:31:52, 3.73s/it] 54%|█████▎ | 18344/34278 [20:11:28<15:42:34, 3.55s/it] {'loss': 0.1283, 'grad_norm': 1.0344483586868616, 'learning_rate': 4.674015597849186e-06, 'epoch': 0.54} 54%|█████▎ | 18344/34278 [20:11:28<15:42:34, 3.55s/it] 54%|█████▎ | 18345/34278 [20:11:32<16:03:23, 3.63s/it] {'loss': 0.1486, 'grad_norm': 0.9586845928124975, 'learning_rate': 4.673544170290009e-06, 'epoch': 0.54} 54%|█████▎ | 18345/34278 
[20:11:32<16:03:23, 3.63s/it] 54%|█████▎ | 18346/34278 [20:11:35<15:20:30, 3.47s/it] {'loss': 0.1306, 'grad_norm': 0.7940525509400088, 'learning_rate': 4.673072745645349e-06, 'epoch': 0.54} 54%|█████▎ | 18346/34278 [20:11:35<15:20:30, 3.47s/it] 54%|█████▎ | 18347/34278 [20:11:38<14:44:02, 3.33s/it] {'loss': 0.1287, 'grad_norm': 1.3334246405174768, 'learning_rate': 4.672601323919419e-06, 'epoch': 0.54} 54%|█████▎ | 18347/34278 [20:11:38<14:44:02, 3.33s/it] 54%|█████▎ | 18348/34278 [20:11:41<14:27:23, 3.27s/it] {'loss': 0.1197, 'grad_norm': 1.078371798384496, 'learning_rate': 4.6721299051164265e-06, 'epoch': 0.54} 54%|█████▎ | 18348/34278 [20:11:41<14:27:23, 3.27s/it] 54%|█████▎ | 18349/34278 [20:11:44<14:10:07, 3.20s/it] {'loss': 0.1278, 'grad_norm': 0.9785701120685548, 'learning_rate': 4.671658489240577e-06, 'epoch': 0.54} 54%|█████▎ | 18349/34278 [20:11:44<14:10:07, 3.20s/it] 54%|█████▎ | 18350/34278 [20:11:47<13:36:20, 3.08s/it] {'loss': 0.109, 'grad_norm': 0.7759886837248762, 'learning_rate': 4.671187076296085e-06, 'epoch': 0.54} 54%|█████▎ | 18350/34278 [20:11:47<13:36:20, 3.08s/it] 54%|█████▎ | 18351/34278 [20:11:51<14:15:08, 3.22s/it] {'loss': 0.1415, 'grad_norm': 1.1109812964350543, 'learning_rate': 4.670715666287156e-06, 'epoch': 0.54} 54%|█████▎ | 18351/34278 [20:11:51<14:15:08, 3.22s/it] 54%|█████▎ | 18352/34278 [20:11:54<14:19:38, 3.24s/it] {'loss': 0.1513, 'grad_norm': 1.2929363377866827, 'learning_rate': 4.670244259217998e-06, 'epoch': 0.54} 54%|█████▎ | 18352/34278 [20:11:54<14:19:38, 3.24s/it] 54%|█████▎ | 18353/34278 [20:11:57<13:57:49, 3.16s/it] {'loss': 0.1476, 'grad_norm': 0.9022509312476248, 'learning_rate': 4.669772855092822e-06, 'epoch': 0.54} 54%|█████▎ | 18353/34278 [20:11:57<13:57:49, 3.16s/it] 54%|█████▎ | 18354/34278 [20:12:00<13:53:50, 3.14s/it] {'loss': 0.1361, 'grad_norm': 0.988257571648059, 'learning_rate': 4.6693014539158345e-06, 'epoch': 0.54} 54%|█████▎ | 18354/34278 [20:12:00<13:53:50, 3.14s/it] 54%|█████▎ | 18355/34278 
[20:12:06<17:51:36, 4.04s/it] {'loss': 0.1454, 'grad_norm': 1.0434758164432263, 'learning_rate': 4.668830055691243e-06, 'epoch': 0.54} 54%|█████▎ | 18355/34278 [20:12:06<17:51:36, 4.04s/it] 54%|█████▎ | 18356/34278 [20:12:09<16:18:54, 3.69s/it] {'loss': 0.1285, 'grad_norm': 0.9014227433768871, 'learning_rate': 4.668358660423259e-06, 'epoch': 0.54} 54%|█████▎ | 18356/34278 [20:12:09<16:18:54, 3.69s/it] 54%|█████▎ | 18357/34278 [20:12:12<15:29:35, 3.50s/it] {'loss': 0.1295, 'grad_norm': 0.8525085880182496, 'learning_rate': 4.66788726811609e-06, 'epoch': 0.54} 54%|█████▎ | 18357/34278 [20:12:12<15:29:35, 3.50s/it] 54%|█████▎ | 18358/34278 [20:12:17<16:48:35, 3.80s/it] {'loss': 0.1453, 'grad_norm': 0.9098629368849164, 'learning_rate': 4.667415878773945e-06, 'epoch': 0.54} 54%|█████▎ | 18358/34278 [20:12:17<16:48:35, 3.80s/it] 54%|█████▎ | 18359/34278 [20:12:20<16:36:36, 3.76s/it] {'loss': 0.1233, 'grad_norm': 0.8745548385921665, 'learning_rate': 4.6669444924010305e-06, 'epoch': 0.54} 54%|█████▎ | 18359/34278 [20:12:20<16:36:36, 3.76s/it] 54%|█████▎ | 18360/34278 [20:12:23<15:25:48, 3.49s/it] {'loss': 0.1677, 'grad_norm': 0.8206904657070538, 'learning_rate': 4.666473109001556e-06, 'epoch': 0.54} 54%|█████▎ | 18360/34278 [20:12:23<15:25:48, 3.49s/it] 54%|█████▎ | 18361/34278 [20:12:27<15:20:39, 3.47s/it] {'loss': 0.1466, 'grad_norm': 0.8218439636259399, 'learning_rate': 4.666001728579729e-06, 'epoch': 0.54} 54%|█████▎ | 18361/34278 [20:12:27<15:20:39, 3.47s/it] 54%|█████▎ | 18362/34278 [20:12:30<14:40:37, 3.32s/it] {'loss': 0.1272, 'grad_norm': 0.9085453548345437, 'learning_rate': 4.66553035113976e-06, 'epoch': 0.54} 54%|█████▎ | 18362/34278 [20:12:30<14:40:37, 3.32s/it] 54%|█████▎ | 18363/34278 [20:12:33<14:49:10, 3.35s/it] {'loss': 0.1234, 'grad_norm': 0.8833952600137825, 'learning_rate': 4.665058976685857e-06, 'epoch': 0.54} 54%|█████▎ | 18363/34278 [20:12:33<14:49:10, 3.35s/it] 54%|█████▎ | 18364/34278 [20:12:38<17:25:42, 3.94s/it] {'loss': 0.1292, 'grad_norm': 
1.0060277647472204, 'learning_rate': 4.664587605222226e-06, 'epoch': 0.54} 54%|█████▎ | 18364/34278 [20:12:38<17:25:42, 3.94s/it] 54%|█████▎ | 18365/34278 [20:12:41<16:23:27, 3.71s/it] {'loss': 0.1033, 'grad_norm': 1.0899336140879554, 'learning_rate': 4.6641162367530775e-06, 'epoch': 0.54} 54%|█████▎ | 18365/34278 [20:12:41<16:23:27, 3.71s/it] 54%|█████▎ | 18366/34278 [20:12:44<15:17:50, 3.46s/it] {'loss': 0.1169, 'grad_norm': 1.2235143866002742, 'learning_rate': 4.66364487128262e-06, 'epoch': 0.54} 54%|█████▎ | 18366/34278 [20:12:44<15:17:50, 3.46s/it] 54%|█████▎ | 18367/34278 [20:12:51<19:05:59, 4.32s/it] {'loss': 0.1535, 'grad_norm': 0.8596889715983336, 'learning_rate': 4.663173508815058e-06, 'epoch': 0.54} 54%|█████▎ | 18367/34278 [20:12:51<19:05:59, 4.32s/it] 54%|█████▎ | 18368/34278 [20:12:54<18:20:14, 4.15s/it] {'loss': 0.1192, 'grad_norm': 0.9054611873990683, 'learning_rate': 4.662702149354605e-06, 'epoch': 0.54} 54%|█████▎ | 18368/34278 [20:12:54<18:20:14, 4.15s/it] 54%|█████▎ | 18369/34278 [20:13:01<21:04:14, 4.77s/it] {'loss': 0.118, 'grad_norm': 0.9634229690797912, 'learning_rate': 4.662230792905465e-06, 'epoch': 0.54} 54%|█████▎ | 18369/34278 [20:13:01<21:04:14, 4.77s/it] 54%|█████▎ | 18370/34278 [20:13:04<19:37:47, 4.44s/it] {'loss': 0.1328, 'grad_norm': 1.0595127571269867, 'learning_rate': 4.66175943947185e-06, 'epoch': 0.54} 54%|█████▎ | 18370/34278 [20:13:04<19:37:47, 4.44s/it] 54%|█████▎ | 18371/34278 [20:13:07<17:50:24, 4.04s/it] {'loss': 0.1497, 'grad_norm': 1.0649972455942749, 'learning_rate': 4.661288089057965e-06, 'epoch': 0.54} 54%|█████▎ | 18371/34278 [20:13:07<17:50:24, 4.04s/it] 54%|█████▎ | 18372/34278 [20:13:10<16:28:02, 3.73s/it] {'loss': 0.1228, 'grad_norm': 0.8622703140674189, 'learning_rate': 4.660816741668019e-06, 'epoch': 0.54} 54%|█████▎ | 18372/34278 [20:13:10<16:28:02, 3.73s/it] 54%|█████▎ | 18373/34278 [20:13:13<15:26:39, 3.50s/it] {'loss': 0.1315, 'grad_norm': 0.9624169763035911, 'learning_rate': 4.660345397306219e-06, 
'epoch': 0.54} 54%|█████▎ | 18373/34278 [20:13:13<15:26:39, 3.50s/it] 54%|█████▎ | 18374/34278 [20:13:17<15:22:10, 3.48s/it] {'loss': 0.1505, 'grad_norm': 0.8373174658170262, 'learning_rate': 4.659874055976775e-06, 'epoch': 0.54} 54%|█████▎ | 18374/34278 [20:13:17<15:22:10, 3.48s/it] 54%|█████▎ | 18375/34278 [20:13:20<14:39:07, 3.32s/it] {'loss': 0.1159, 'grad_norm': 0.7389060289683349, 'learning_rate': 4.6594027176838955e-06, 'epoch': 0.54} 54%|█████▎ | 18375/34278 [20:13:20<14:39:07, 3.32s/it] 54%|█████▎ | 18376/34278 [20:13:25<17:46:27, 4.02s/it] {'loss': 0.1153, 'grad_norm': 0.6855628089539305, 'learning_rate': 4.658931382431786e-06, 'epoch': 0.54} 54%|█████▎ | 18376/34278 [20:13:25<17:46:27, 4.02s/it] 54%|█████▎ | 18377/34278 [20:13:31<20:21:16, 4.61s/it] {'loss': 0.1362, 'grad_norm': 0.9392507510743571, 'learning_rate': 4.658460050224656e-06, 'epoch': 0.54} 54%|█████▎ | 18377/34278 [20:13:31<20:21:16, 4.61s/it] 54%|█████▎ | 18378/34278 [20:13:35<18:41:30, 4.23s/it] {'loss': 0.1269, 'grad_norm': 0.9069703368660548, 'learning_rate': 4.657988721066714e-06, 'epoch': 0.54} 54%|█████▎ | 18378/34278 [20:13:35<18:41:30, 4.23s/it] 54%|█████▎ | 18379/34278 [20:13:39<18:53:39, 4.28s/it] {'loss': 0.1242, 'grad_norm': 0.8773801036238599, 'learning_rate': 4.657517394962164e-06, 'epoch': 0.54} 54%|█████▎ | 18379/34278 [20:13:39<18:53:39, 4.28s/it] 54%|█████▎ | 18380/34278 [20:13:42<17:18:35, 3.92s/it] {'loss': 0.1503, 'grad_norm': 0.7399085306556702, 'learning_rate': 4.65704607191522e-06, 'epoch': 0.54} 54%|█████▎ | 18380/34278 [20:13:42<17:18:35, 3.92s/it] 54%|█████▎ | 18381/34278 [20:13:46<17:25:44, 3.95s/it] {'loss': 0.118, 'grad_norm': 1.0119142077092582, 'learning_rate': 4.656574751930085e-06, 'epoch': 0.54} 54%|█████▎ | 18381/34278 [20:13:46<17:25:44, 3.95s/it] 54%|█████▎ | 18382/34278 [20:13:49<16:23:32, 3.71s/it] {'loss': 0.152, 'grad_norm': 1.5654223764628867, 'learning_rate': 4.65610343501097e-06, 'epoch': 0.54} 54%|█████▎ | 18382/34278 [20:13:49<16:23:32, 
3.71s/it] 54%|█████▎ | 18383/34278 [20:13:52<15:28:27, 3.50s/it] {'loss': 0.1457, 'grad_norm': 0.8274538592274344, 'learning_rate': 4.655632121162082e-06, 'epoch': 0.54} 54%|█████▎ | 18383/34278 [20:13:52<15:28:27, 3.50s/it] 54%|█████▎ | 18384/34278 [20:13:56<15:46:01, 3.57s/it] {'loss': 0.1415, 'grad_norm': 1.2318373194834533, 'learning_rate': 4.6551608103876275e-06, 'epoch': 0.54} 54%|█████▎ | 18384/34278 [20:13:56<15:46:01, 3.57s/it] 54%|█████▎ | 18385/34278 [20:14:00<15:34:49, 3.53s/it] {'loss': 0.1259, 'grad_norm': 0.9486937306815246, 'learning_rate': 4.654689502691813e-06, 'epoch': 0.54} 54%|█████▎ | 18385/34278 [20:14:00<15:34:49, 3.53s/it] 54%|█████▎ | 18386/34278 [20:14:04<16:07:27, 3.65s/it] {'loss': 0.1167, 'grad_norm': 0.7057054803619004, 'learning_rate': 4.65421819807885e-06, 'epoch': 0.54} 54%|█████▎ | 18386/34278 [20:14:04<16:07:27, 3.65s/it] 54%|█████▎ | 18387/34278 [20:14:06<14:50:14, 3.36s/it] {'loss': 0.1405, 'grad_norm': 1.0210682161050608, 'learning_rate': 4.653746896552944e-06, 'epoch': 0.54} 54%|█████▎ | 18387/34278 [20:14:06<14:50:14, 3.36s/it] 54%|█████▎ | 18388/34278 [20:14:10<15:37:05, 3.54s/it] {'loss': 0.1375, 'grad_norm': 0.8450355942890404, 'learning_rate': 4.653275598118304e-06, 'epoch': 0.54} 54%|█████▎ | 18388/34278 [20:14:10<15:37:05, 3.54s/it] 54%|█████▎ | 18389/34278 [20:14:14<15:55:30, 3.61s/it] {'loss': 0.1071, 'grad_norm': 0.743774645346137, 'learning_rate': 4.652804302779136e-06, 'epoch': 0.54} 54%|█████▎ | 18389/34278 [20:14:14<15:55:30, 3.61s/it] 54%|█████▎ | 18390/34278 [20:14:19<17:13:30, 3.90s/it] {'loss': 0.1317, 'grad_norm': 0.7219983884839402, 'learning_rate': 4.652333010539648e-06, 'epoch': 0.54} 54%|█████▎ | 18390/34278 [20:14:19<17:13:30, 3.90s/it] 54%|█████▎ | 18391/34278 [20:14:22<16:07:15, 3.65s/it] {'loss': 0.1561, 'grad_norm': 1.6790037943494809, 'learning_rate': 4.651861721404047e-06, 'epoch': 0.54} 54%|█████▎ | 18391/34278 [20:14:22<16:07:15, 3.65s/it] 54%|█████▎ | 18392/34278 [20:14:28<19:11:17, 4.35s/it] 
{'loss': 0.1159, 'grad_norm': 0.8807400125325477, 'learning_rate': 4.651390435376543e-06, 'epoch': 0.54} 54%|█████▎ | 18392/34278 [20:14:28<19:11:17, 4.35s/it] 54%|█████▎ | 18393/34278 [20:14:31<17:27:05, 3.95s/it] {'loss': 0.1004, 'grad_norm': 0.9634830345396149, 'learning_rate': 4.650919152461342e-06, 'epoch': 0.54} 54%|█████▎ | 18393/34278 [20:14:31<17:27:05, 3.95s/it] 54%|█████▎ | 18394/34278 [20:14:34<16:28:34, 3.73s/it] {'loss': 0.1382, 'grad_norm': 0.8857535844913024, 'learning_rate': 4.650447872662651e-06, 'epoch': 0.54} 54%|█████▎ | 18394/34278 [20:14:34<16:28:34, 3.73s/it] 54%|█████▎ | 18395/34278 [20:14:37<16:06:50, 3.65s/it] {'loss': 0.1289, 'grad_norm': 0.8063250823663929, 'learning_rate': 4.649976595984678e-06, 'epoch': 0.54} 54%|█████▎ | 18395/34278 [20:14:37<16:06:50, 3.65s/it] 54%|█████▎ | 18396/34278 [20:14:41<15:55:17, 3.61s/it] {'loss': 0.1182, 'grad_norm': 0.8622685643994183, 'learning_rate': 4.649505322431631e-06, 'epoch': 0.54} 54%|█████▎ | 18396/34278 [20:14:41<15:55:17, 3.61s/it] 54%|█████▎ | 18397/34278 [20:14:45<16:49:12, 3.81s/it] {'loss': 0.1478, 'grad_norm': 1.0892713923536639, 'learning_rate': 4.649034052007714e-06, 'epoch': 0.54} 54%|█████▎ | 18397/34278 [20:14:45<16:49:12, 3.81s/it] 54%|█████▎ | 18398/34278 [20:14:51<19:38:57, 4.45s/it] {'loss': 0.1265, 'grad_norm': 0.8673382068332008, 'learning_rate': 4.648562784717141e-06, 'epoch': 0.54} 54%|█████▎ | 18398/34278 [20:14:51<19:38:57, 4.45s/it] 54%|█████▎ | 18399/34278 [20:14:57<21:17:29, 4.83s/it] {'loss': 0.1288, 'grad_norm': 0.8608091835670153, 'learning_rate': 4.648091520564114e-06, 'epoch': 0.54} 54%|█████▎ | 18399/34278 [20:14:57<21:17:29, 4.83s/it] 54%|█████▎ | 18400/34278 [20:15:00<19:17:39, 4.37s/it] {'loss': 0.134, 'grad_norm': 0.856921635053501, 'learning_rate': 4.647620259552841e-06, 'epoch': 0.54} 54%|█████▎ | 18400/34278 [20:15:00<19:17:39, 4.37s/it] 54%|█████▎ | 18401/34278 [20:15:05<20:07:14, 4.56s/it] {'loss': 0.1229, 'grad_norm': 0.8035283308925599, 'learning_rate': 
4.647149001687532e-06, 'epoch': 0.54} 54%|█████▎ | 18401/34278 [20:15:05<20:07:14, 4.56s/it] 54%|█████▎ | 18402/34278 [20:15:11<22:10:29, 5.03s/it] {'loss': 0.134, 'grad_norm': 0.7661735637383169, 'learning_rate': 4.6466777469723916e-06, 'epoch': 0.54} 54%|█████▎ | 18402/34278 [20:15:11<22:10:29, 5.03s/it] 54%|█████▎ | 18403/34278 [20:15:14<19:44:23, 4.48s/it] {'loss': 0.1318, 'grad_norm': 1.0260182917262406, 'learning_rate': 4.646206495411627e-06, 'epoch': 0.54} 54%|█████▎ | 18403/34278 [20:15:14<19:44:23, 4.48s/it] 54%|█████▎ | 18404/34278 [20:15:20<21:36:51, 4.90s/it] {'loss': 0.1312, 'grad_norm': 0.7281531016042366, 'learning_rate': 4.645735247009447e-06, 'epoch': 0.54} 54%|█████▎ | 18404/34278 [20:15:20<21:36:51, 4.90s/it] 54%|█████▎ | 18405/34278 [20:15:23<19:07:23, 4.34s/it] {'loss': 0.1369, 'grad_norm': 0.9359715608461058, 'learning_rate': 4.645264001770059e-06, 'epoch': 0.54} 54%|█████▎ | 18405/34278 [20:15:23<19:07:23, 4.34s/it] 54%|█████▎ | 18406/34278 [20:15:28<19:56:03, 4.52s/it] {'loss': 0.1356, 'grad_norm': 0.7474998995425077, 'learning_rate': 4.6447927596976685e-06, 'epoch': 0.54} 54%|█████▎ | 18406/34278 [20:15:28<19:56:03, 4.52s/it] 54%|█████▎ | 18407/34278 [20:15:31<17:27:02, 3.96s/it] {'loss': 0.1235, 'grad_norm': 0.9396636845455977, 'learning_rate': 4.644321520796484e-06, 'epoch': 0.54} 54%|█████▎ | 18407/34278 [20:15:31<17:27:02, 3.96s/it] 54%|█████▎ | 18408/34278 [20:15:37<20:14:26, 4.59s/it] {'loss': 0.1396, 'grad_norm': 0.900428593419211, 'learning_rate': 4.6438502850707125e-06, 'epoch': 0.54} 54%|█████▎ | 18408/34278 [20:15:37<20:14:26, 4.59s/it] 54%|█████▎ | 18409/34278 [20:15:40<18:34:10, 4.21s/it] {'loss': 0.1275, 'grad_norm': 0.7501024115615466, 'learning_rate': 4.643379052524557e-06, 'epoch': 0.54} 54%|█████▎ | 18409/34278 [20:15:40<18:34:10, 4.21s/it] 54%|█████▎ | 18410/34278 [20:15:46<20:46:35, 4.71s/it] {'loss': 0.1263, 'grad_norm': 0.8911013787200006, 'learning_rate': 4.642907823162232e-06, 'epoch': 0.54} 54%|█████▎ | 18410/34278 
[20:15:46<20:46:35, 4.71s/it] 54%|█████▎ | 18411/34278 [20:15:49<18:24:34, 4.18s/it] {'loss': 0.1241, 'grad_norm': 1.267359403901331, 'learning_rate': 4.642436596987939e-06, 'epoch': 0.54} 54%|█████▎ | 18411/34278 [20:15:49<18:24:34, 4.18s/it] 54%|█████▎ | 18412/34278 [20:15:52<17:14:12, 3.91s/it] {'loss': 0.1349, 'grad_norm': 1.2193543182524031, 'learning_rate': 4.6419653740058875e-06, 'epoch': 0.54} 54%|█████▎ | 18412/34278 [20:15:52<17:14:12, 3.91s/it] 54%|█████▎ | 18413/34278 [20:15:56<16:19:11, 3.70s/it] {'loss': 0.1479, 'grad_norm': 0.869884365322348, 'learning_rate': 4.6414941542202854e-06, 'epoch': 0.54} 54%|█████▎ | 18413/34278 [20:15:56<16:19:11, 3.70s/it] 54%|█████▎ | 18414/34278 [20:15:59<15:39:51, 3.55s/it] {'loss': 0.1255, 'grad_norm': 0.9270581440670198, 'learning_rate': 4.6410229376353355e-06, 'epoch': 0.54} 54%|█████▎ | 18414/34278 [20:15:59<15:39:51, 3.55s/it] 54%|█████▎ | 18415/34278 [20:16:02<14:48:55, 3.36s/it] {'loss': 0.1413, 'grad_norm': 0.8335069150306736, 'learning_rate': 4.6405517242552465e-06, 'epoch': 0.54} 54%|█████▎ | 18415/34278 [20:16:02<14:48:55, 3.36s/it] 54%|█████▎ | 18416/34278 [20:16:05<14:53:20, 3.38s/it] {'loss': 0.1376, 'grad_norm': 0.9004884093575155, 'learning_rate': 4.640080514084227e-06, 'epoch': 0.54} 54%|█████▎ | 18416/34278 [20:16:05<14:53:20, 3.38s/it] 54%|█████▎ | 18417/34278 [20:16:08<14:18:27, 3.25s/it] {'loss': 0.1207, 'grad_norm': 0.9495672066905021, 'learning_rate': 4.639609307126483e-06, 'epoch': 0.54} 54%|█████▎ | 18417/34278 [20:16:08<14:18:27, 3.25s/it] 54%|█████▎ | 18418/34278 [20:16:11<14:30:58, 3.29s/it] {'loss': 0.1182, 'grad_norm': 0.8584298778242642, 'learning_rate': 4.639138103386222e-06, 'epoch': 0.54} 54%|█████▎ | 18418/34278 [20:16:11<14:30:58, 3.29s/it] 54%|█████▎ | 18419/34278 [20:16:17<16:52:54, 3.83s/it] {'loss': 0.1518, 'grad_norm': 0.9689798923975219, 'learning_rate': 4.638666902867649e-06, 'epoch': 0.54} 54%|█████▎ | 18419/34278 [20:16:17<16:52:54, 3.83s/it] 54%|█████▎ | 18420/34278 
[20:16:20<16:03:12, 3.64s/it] {'loss': 0.1625, 'grad_norm': 0.8713494961194964, 'learning_rate': 4.63819570557497e-06, 'epoch': 0.54} 54%|█████▎ | 18420/34278 [20:16:20<16:03:12, 3.64s/it] 54%|█████▎ | 18421/34278 [20:16:23<15:02:43, 3.42s/it] {'loss': 0.1019, 'grad_norm': 0.8097312640427291, 'learning_rate': 4.637724511512394e-06, 'epoch': 0.54} 54%|█████▎ | 18421/34278 [20:16:23<15:02:43, 3.42s/it] 54%|█████▎ | 18422/34278 [20:16:27<15:43:54, 3.57s/it] {'loss': 0.1411, 'grad_norm': 0.7818637297060073, 'learning_rate': 4.637253320684128e-06, 'epoch': 0.54} 54%|█████▎ | 18422/34278 [20:16:27<15:43:54, 3.57s/it] 54%|█████▎ | 18423/34278 [20:16:30<15:53:34, 3.61s/it] {'loss': 0.1358, 'grad_norm': 0.7725537560170543, 'learning_rate': 4.636782133094379e-06, 'epoch': 0.54} 54%|█████▎ | 18423/34278 [20:16:30<15:53:34, 3.61s/it] 54%|█████▎ | 18424/34278 [20:16:34<15:56:19, 3.62s/it] {'loss': 0.1161, 'grad_norm': 0.9685417795608715, 'learning_rate': 4.636310948747351e-06, 'epoch': 0.54} 54%|█████▎ | 18424/34278 [20:16:34<15:56:19, 3.62s/it] 54%|█████▍ | 18425/34278 [20:16:37<15:13:13, 3.46s/it] {'loss': 0.1179, 'grad_norm': 0.8661703798486129, 'learning_rate': 4.6358397676472514e-06, 'epoch': 0.54} 54%|█████▍ | 18425/34278 [20:16:37<15:13:13, 3.46s/it] 54%|█████▍ | 18426/34278 [20:16:43<18:36:23, 4.23s/it] {'loss': 0.1328, 'grad_norm': 0.9748207706114557, 'learning_rate': 4.63536858979829e-06, 'epoch': 0.54} 54%|█████▍ | 18426/34278 [20:16:43<18:36:23, 4.23s/it] 54%|█████▍ | 18427/34278 [20:16:46<17:21:07, 3.94s/it] {'loss': 0.1255, 'grad_norm': 0.8068977563207552, 'learning_rate': 4.634897415204665e-06, 'epoch': 0.54} 54%|█████▍ | 18427/34278 [20:16:46<17:21:07, 3.94s/it] 54%|█████▍ | 18428/34278 [20:16:50<17:37:09, 4.00s/it] {'loss': 0.1238, 'grad_norm': 0.9814118356836369, 'learning_rate': 4.6344262438705945e-06, 'epoch': 0.54} 54%|█████▍ | 18428/34278 [20:16:50<17:37:09, 4.00s/it] 54%|█████▍ | 18429/34278 [20:16:55<18:31:08, 4.21s/it] {'loss': 0.1236, 'grad_norm': 
0.8697629592549748, 'learning_rate': 4.633955075800277e-06, 'epoch': 0.54} 54%|█████▍ | 18429/34278 [20:16:55<18:31:08, 4.21s/it] 54%|█████▍ | 18430/34278 [20:16:58<17:04:34, 3.88s/it] {'loss': 0.1373, 'grad_norm': 0.818823172399622, 'learning_rate': 4.633483910997921e-06, 'epoch': 0.54} 54%|█████▍ | 18430/34278 [20:16:58<17:04:34, 3.88s/it] 54%|█████▍ | 18431/34278 [20:17:04<19:51:31, 4.51s/it] {'loss': 0.1327, 'grad_norm': 1.1620154208251865, 'learning_rate': 4.633012749467735e-06, 'epoch': 0.54} 54%|█████▍ | 18431/34278 [20:17:04<19:51:31, 4.51s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 54%|█████▍ | 18432/34278 [20:17:07<17:53:21, 4.06s/it] {'loss': 0.1381, 'grad_norm': 0.9945830533717772, 'learning_rate': 4.632541591213922e-06, 'epoch': 0.54} 54%|█████▍ | 18432/34278 [20:17:07<17:53:21, 4.06s/it] 54%|█████▍ | 18433/34278 [20:17:12<19:18:17, 4.39s/it] {'loss': 0.1394, 'grad_norm': 1.0759595745860047, 'learning_rate': 4.6320704362406895e-06, 'epoch': 0.54} 54%|█████▍ | 18433/34278 [20:17:12<19:18:17, 4.39s/it] 54%|█████▍ | 18434/34278 [20:17:16<17:49:54, 4.05s/it] {'loss': 0.1262, 'grad_norm': 1.0332282875284455, 'learning_rate': 4.6315992845522445e-06, 'epoch': 0.54} 54%|█████▍ | 18434/34278 [20:17:16<17:49:54, 4.05s/it] 54%|█████▍ | 18435/34278 [20:17:19<17:26:58, 3.97s/it] {'loss': 0.1424, 'grad_norm': 1.0009817731707173, 'learning_rate': 4.631128136152795e-06, 'epoch': 0.54} 54%|█████▍ | 18435/34278 [20:17:19<17:26:58, 3.97s/it] 54%|█████▍ | 18436/34278 [20:17:22<15:56:30, 3.62s/it] {'loss': 0.1438, 'grad_norm': 0.829985548869482, 'learning_rate': 4.6306569910465435e-06, 'epoch': 0.54} 54%|█████▍ | 18436/34278 [20:17:22<15:56:30, 3.62s/it] 54%|█████▍ | 18437/34278 [20:17:26<15:54:34, 3.62s/it] {'loss': 0.1326, 'grad_norm': 0.9543572266061056, 'learning_rate': 
4.630185849237699e-06, 'epoch': 0.54}
[training progress, steps 18437–18483 / 34278, epoch 0.54: loss 0.10–0.17, grad_norm 0.65–1.15, learning_rate 4.6302e-06 → 4.6085e-06, 3.1–4.7 s/it]
[training progress, steps 18484–18486 / 34278, epoch 0.54: loss 0.1236–0.1328, learning_rate 4.6080e-06 → 4.6071e-06, 3.6–4.4 s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[training progress, steps 18487–18491 / 34278, epoch 0.54: loss 0.1205–0.1576, learning_rate 4.6066e-06 → 4.6047e-06, 3.3–4.1 s/it]
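The UserWarning above comes from `torch.utils.checkpoint`: a checkpointed segment received only tensors with `requires_grad=False` (typical when gradient checkpointing wraps a module whose inputs are raw pixel/embedding tensors), so no gradient flows back through that segment. A minimal sketch of the usual mitigation — marking the tensor entering the checkpointed region as requiring grad — is shown below; `segment` and `linear` are illustrative stand-ins, not the training script's actual modules.

```python
import torch
from torch.utils.checkpoint import checkpoint

# Illustrative module and input; in the real run the input would be e.g.
# pixel values entering a checkpointed vision block with requires_grad=False,
# which triggers "None of the inputs have requires_grad=True".
linear = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # requires_grad is False by default


def segment(t):
    # The region whose activations are recomputed in backward.
    return linear(t).sum()


# Mitigation: make the tensor entering the checkpointed segment require grad,
# so autograd records the segment and parameter gradients are produced.
x_fixed = x.detach().requires_grad_(True)
out = checkpoint(segment, x_fixed, use_reentrant=False)
out.backward()
assert linear.weight.grad is not None  # gradients now flow
```

Frameworks often apply the same idea via a forward hook on the embedding layer (e.g. Hugging Face's `enable_input_require_grads`), so the warning is usually harmless for intentionally frozen inputs but worth fixing when the checkpointed parameters are supposed to train.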
[training progress, steps 18492–18519 / 34278, epoch 0.54: loss 0.0971–0.1588, grad_norm 0.70–1.42, learning_rate 4.6043e-06 → 4.5916e-06, 3.3–4.1 s/it]
[training progress, steps 18520–18521 / 34278, epoch 0.54: loss 0.1317–0.1321, learning_rate 4.5911e-06 → 4.5906e-06, 3.5–3.7 s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f216b8b6340>
Failed to fetch sample 3291479. Exception: cannot identify image file <_io.BytesIO object at 0x7f216b8b6340>
[training progress, steps 18522–18530 / 34278, epoch 0.54: loss 0.1143–0.1455, learning_rate 4.5902e-06 → 4.5864e-06, 3.4–3.9 s/it]
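Training continues past the traceback, and the dataset logs "Failed to fetch sample 3291479." — i.e. `__getitem__` catches the decode failure and falls back to another sample rather than crashing the run. A minimal sketch of that fallback pattern is below, assuming an in-memory list of raw image bytes; `RobustDataset` and its deterministic neighbour fallback are illustrative stand-ins, not the actual `dataset.py` code.

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader(raw: bytes) -> Image.Image:
    # Decode raw bytes; raises UnidentifiedImageError on corrupt/truncated data.
    return Image.open(io.BytesIO(raw)).convert("RGB")


class RobustDataset:
    """Skip undecodable samples instead of aborting the training run."""

    def __init__(self, samples):
        self.samples = samples  # hypothetical: list of raw image byte strings

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, i, _retries=10):
        for _ in range(_retries):
            try:
                return pil_loader(self.samples[i])
            except (UnidentifiedImageError, OSError) as e:
                # Mirror the log line, then fall back to a neighbouring sample.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)
        raise RuntimeError("too many undecodable samples in a row")
```

A random fallback index is also common; the neighbour fallback shown here just keeps the behaviour deterministic. Either way, the failure should be logged (as it is above) so corrupt shards can be found and repaired offline.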
[training progress, steps 18531–18613 / 34278, epoch 0.54: loss 0.0971–0.1663, grad_norm 0.63–1.91, learning_rate 4.5859e-06 → 4.5473e-06, 3.0–5.0 s/it]
54%|█████▍ | 18614/34278 [20:28:11<15:02:19, 3.46s/it] {'loss': 0.1306, 'grad_norm':
0.8970330089790087, 'learning_rate': 4.546849645407359e-06, 'epoch': 0.54} 54%|█████▍ | 18614/34278 [20:28:11<15:02:19, 3.46s/it] 54%|█████▍ | 18615/34278 [20:28:14<14:43:42, 3.39s/it] {'loss': 0.1082, 'grad_norm': 1.2082125477194066, 'learning_rate': 4.546379157517198e-06, 'epoch': 0.54} 54%|█████▍ | 18615/34278 [20:28:14<14:43:42, 3.39s/it] 54%|█████▍ | 18616/34278 [20:28:17<14:11:25, 3.26s/it] {'loss': 0.1427, 'grad_norm': 0.8747763423161303, 'learning_rate': 4.545908673676855e-06, 'epoch': 0.54} 54%|█████▍ | 18616/34278 [20:28:17<14:11:25, 3.26s/it] 54%|█████▍ | 18617/34278 [20:28:20<14:02:59, 3.23s/it] {'loss': 0.1235, 'grad_norm': 0.9298206824513096, 'learning_rate': 4.545438193890531e-06, 'epoch': 0.54} 54%|█████▍ | 18617/34278 [20:28:20<14:02:59, 3.23s/it] 54%|█████▍ | 18618/34278 [20:28:24<14:01:10, 3.22s/it] {'loss': 0.1105, 'grad_norm': 0.897925886878897, 'learning_rate': 4.544967718162425e-06, 'epoch': 0.54} 54%|█████▍ | 18618/34278 [20:28:24<14:01:10, 3.22s/it] 54%|█████▍ | 18619/34278 [20:28:28<14:56:01, 3.43s/it] {'loss': 0.1265, 'grad_norm': 0.7862787817646996, 'learning_rate': 4.544497246496741e-06, 'epoch': 0.54} 54%|█████▍ | 18619/34278 [20:28:28<14:56:01, 3.43s/it] 54%|█████▍ | 18620/34278 [20:28:31<14:44:39, 3.39s/it] {'loss': 0.1247, 'grad_norm': 0.7829494278367473, 'learning_rate': 4.544026778897676e-06, 'epoch': 0.54} 54%|█████▍ | 18620/34278 [20:28:31<14:44:39, 3.39s/it] 54%|█████▍ | 18621/34278 [20:28:34<14:44:58, 3.39s/it] {'loss': 0.1229, 'grad_norm': 0.8739761744992527, 'learning_rate': 4.54355631536943e-06, 'epoch': 0.54} 54%|█████▍ | 18621/34278 [20:28:34<14:44:58, 3.39s/it] 54%|█████▍ | 18622/34278 [20:28:39<15:57:10, 3.67s/it] {'loss': 0.1522, 'grad_norm': 0.7817666258065223, 'learning_rate': 4.543085855916205e-06, 'epoch': 0.54} 54%|█████▍ | 18622/34278 [20:28:39<15:57:10, 3.67s/it] 54%|█████▍ | 18623/34278 [20:28:42<15:30:41, 3.57s/it] {'loss': 0.1376, 'grad_norm': 0.8743724793823562, 'learning_rate': 4.542615400542202e-06, 
'epoch': 0.54} 54%|█████▍ | 18623/34278 [20:28:42<15:30:41, 3.57s/it] 54%|█████▍ | 18624/34278 [20:28:45<14:48:59, 3.41s/it] {'loss': 0.1304, 'grad_norm': 0.8417347293161281, 'learning_rate': 4.542144949251615e-06, 'epoch': 0.54} 54%|█████▍ | 18624/34278 [20:28:45<14:48:59, 3.41s/it] 54%|█████▍ | 18625/34278 [20:28:50<16:28:03, 3.79s/it] {'loss': 0.1122, 'grad_norm': 0.8349549582346112, 'learning_rate': 4.541674502048653e-06, 'epoch': 0.54} 54%|█████▍ | 18625/34278 [20:28:50<16:28:03, 3.79s/it] 54%|█████▍ | 18626/34278 [20:28:55<19:13:31, 4.42s/it] {'loss': 0.1391, 'grad_norm': 0.9844219872835687, 'learning_rate': 4.54120405893751e-06, 'epoch': 0.54} 54%|█████▍ | 18626/34278 [20:28:55<19:13:31, 4.42s/it] 54%|█████▍ | 18627/34278 [20:29:00<19:56:40, 4.59s/it] {'loss': 0.1398, 'grad_norm': 0.9521398611024925, 'learning_rate': 4.540733619922388e-06, 'epoch': 0.54} 54%|█████▍ | 18627/34278 [20:29:00<19:56:40, 4.59s/it] 54%|█████▍ | 18628/34278 [20:29:04<18:07:30, 4.17s/it] {'loss': 0.1415, 'grad_norm': 1.403520958876466, 'learning_rate': 4.540263185007487e-06, 'epoch': 0.54} 54%|█████▍ | 18628/34278 [20:29:04<18:07:30, 4.17s/it] 54%|█████▍ | 18629/34278 [20:29:07<17:19:40, 3.99s/it] {'loss': 0.1218, 'grad_norm': 1.0320447509130322, 'learning_rate': 4.539792754197006e-06, 'epoch': 0.54} 54%|█████▍ | 18629/34278 [20:29:07<17:19:40, 3.99s/it] 54%|█████▍ | 18630/34278 [20:29:10<16:23:42, 3.77s/it] {'loss': 0.1466, 'grad_norm': 1.2343273669027426, 'learning_rate': 4.539322327495144e-06, 'epoch': 0.54} 54%|█████▍ | 18630/34278 [20:29:11<16:23:42, 3.77s/it] 54%|█████▍ | 18631/34278 [20:29:14<15:55:24, 3.66s/it] {'loss': 0.1337, 'grad_norm': 0.8574991551732334, 'learning_rate': 4.538851904906103e-06, 'epoch': 0.54} 54%|█████▍ | 18631/34278 [20:29:14<15:55:24, 3.66s/it] 54%|█████▍ | 18632/34278 [20:29:17<15:04:05, 3.47s/it] {'loss': 0.116, 'grad_norm': 0.8387819903807323, 'learning_rate': 4.538381486434083e-06, 'epoch': 0.54} 54%|█████▍ | 18632/34278 [20:29:17<15:04:05, 
3.47s/it] 54%|█████▍ | 18633/34278 [20:29:20<14:23:34, 3.31s/it] {'loss': 0.1494, 'grad_norm': 1.2056660015842662, 'learning_rate': 4.537911072083284e-06, 'epoch': 0.54} 54%|█████▍ | 18633/34278 [20:29:20<14:23:34, 3.31s/it] 54%|█████▍ | 18634/34278 [20:29:23<14:20:54, 3.30s/it] {'loss': 0.1473, 'grad_norm': 0.9794726032666625, 'learning_rate': 4.537440661857903e-06, 'epoch': 0.54} 54%|█████▍ | 18634/34278 [20:29:23<14:20:54, 3.30s/it] 54%|█████▍ | 18635/34278 [20:29:27<14:45:22, 3.40s/it] {'loss': 0.1292, 'grad_norm': 1.100841704205615, 'learning_rate': 4.536970255762142e-06, 'epoch': 0.54} 54%|█████▍ | 18635/34278 [20:29:27<14:45:22, 3.40s/it] 54%|█████▍ | 18636/34278 [20:29:32<16:38:09, 3.83s/it] {'loss': 0.137, 'grad_norm': 0.8550148477909255, 'learning_rate': 4.536499853800198e-06, 'epoch': 0.54} 54%|█████▍ | 18636/34278 [20:29:32<16:38:09, 3.83s/it] 54%|█████▍ | 18637/34278 [20:29:35<15:41:00, 3.61s/it] {'loss': 0.1206, 'grad_norm': 0.8420445117887098, 'learning_rate': 4.536029455976276e-06, 'epoch': 0.54} 54%|█████▍ | 18637/34278 [20:29:35<15:41:00, 3.61s/it] 54%|█████▍ | 18638/34278 [20:29:38<15:33:21, 3.58s/it] {'loss': 0.1314, 'grad_norm': 1.0208099243470854, 'learning_rate': 4.53555906229457e-06, 'epoch': 0.54} 54%|█████▍ | 18638/34278 [20:29:38<15:33:21, 3.58s/it] 54%|█████▍ | 18639/34278 [20:29:41<15:06:05, 3.48s/it] {'loss': 0.119, 'grad_norm': 0.9248521454814603, 'learning_rate': 4.5350886727592824e-06, 'epoch': 0.54} 54%|█████▍ | 18639/34278 [20:29:41<15:06:05, 3.48s/it] 54%|█████▍ | 18640/34278 [20:29:44<14:29:25, 3.34s/it] {'loss': 0.1101, 'grad_norm': 0.8622934723177336, 'learning_rate': 4.534618287374613e-06, 'epoch': 0.54} 54%|█████▍ | 18640/34278 [20:29:44<14:29:25, 3.34s/it] 54%|█████▍ | 18641/34278 [20:29:47<14:00:17, 3.22s/it] {'loss': 0.1512, 'grad_norm': 0.930709123492009, 'learning_rate': 4.53414790614476e-06, 'epoch': 0.54} 54%|█████▍ | 18641/34278 [20:29:47<14:00:17, 3.22s/it] 54%|█████▍ | 18642/34278 [20:29:50<13:33:30, 3.12s/it] 
{'loss': 0.1463, 'grad_norm': 1.0691718997838953, 'learning_rate': 4.533677529073921e-06, 'epoch': 0.54} 54%|█████▍ | 18642/34278 [20:29:50<13:33:30, 3.12s/it] 54%|█████▍ | 18643/34278 [20:29:54<13:49:28, 3.18s/it] {'loss': 0.1404, 'grad_norm': 0.8207725725808066, 'learning_rate': 4.5332071561663e-06, 'epoch': 0.54} 54%|█████▍ | 18643/34278 [20:29:54<13:49:28, 3.18s/it] 54%|█████▍ | 18644/34278 [20:29:57<14:00:44, 3.23s/it] {'loss': 0.1499, 'grad_norm': 0.8619108900355196, 'learning_rate': 4.532736787426093e-06, 'epoch': 0.54} 54%|█████▍ | 18644/34278 [20:29:57<14:00:44, 3.23s/it] 54%|█████▍ | 18645/34278 [20:30:02<16:53:43, 3.89s/it] {'loss': 0.1343, 'grad_norm': 0.7963081018313473, 'learning_rate': 4.5322664228575024e-06, 'epoch': 0.54} 54%|█████▍ | 18645/34278 [20:30:02<16:53:43, 3.89s/it] 54%|█████▍ | 18646/34278 [20:30:06<16:11:49, 3.73s/it] {'loss': 0.1287, 'grad_norm': 1.052160880951865, 'learning_rate': 4.531796062464724e-06, 'epoch': 0.54} 54%|█████▍ | 18646/34278 [20:30:06<16:11:49, 3.73s/it] 54%|█████▍ | 18647/34278 [20:30:12<19:04:48, 4.39s/it] {'loss': 0.1266, 'grad_norm': 0.8674671496535811, 'learning_rate': 4.531325706251959e-06, 'epoch': 0.54} 54%|█████▍ | 18647/34278 [20:30:12<19:04:48, 4.39s/it] 54%|█████▍ | 18648/34278 [20:30:18<21:09:45, 4.87s/it] {'loss': 0.1191, 'grad_norm': 0.7351219368697155, 'learning_rate': 4.530855354223405e-06, 'epoch': 0.54} 54%|█████▍ | 18648/34278 [20:30:18<21:09:45, 4.87s/it] 54%|█████▍ | 18649/34278 [20:30:24<22:42:36, 5.23s/it] {'loss': 0.1256, 'grad_norm': 0.8441911575644822, 'learning_rate': 4.530385006383263e-06, 'epoch': 0.54} 54%|█████▍ | 18649/34278 [20:30:24<22:42:36, 5.23s/it] 54%|█████▍ | 18650/34278 [20:30:28<21:15:55, 4.90s/it] {'loss': 0.1166, 'grad_norm': 0.8942356063346709, 'learning_rate': 4.5299146627357325e-06, 'epoch': 0.54} 54%|█████▍ | 18650/34278 [20:30:28<21:15:55, 4.90s/it] 54%|█████▍ | 18651/34278 [20:30:31<18:52:38, 4.35s/it] {'loss': 0.1316, 'grad_norm': 0.8453180381197173, 
'learning_rate': 4.5294443232850115e-06, 'epoch': 0.54} 54%|█████▍ | 18651/34278 [20:30:31<18:52:38, 4.35s/it] 54%|█████▍ | 18652/34278 [20:30:34<17:00:31, 3.92s/it] {'loss': 0.118, 'grad_norm': 0.9194448328534545, 'learning_rate': 4.528973988035299e-06, 'epoch': 0.54} 54%|█████▍ | 18652/34278 [20:30:34<17:00:31, 3.92s/it] 54%|█████▍ | 18653/34278 [20:30:37<15:39:18, 3.61s/it] {'loss': 0.1229, 'grad_norm': 0.8963468507354588, 'learning_rate': 4.528503656990794e-06, 'epoch': 0.54} 54%|█████▍ | 18653/34278 [20:30:37<15:39:18, 3.61s/it] 54%|█████▍ | 18654/34278 [20:30:40<15:23:22, 3.55s/it] {'loss': 0.1295, 'grad_norm': 0.9618334380729433, 'learning_rate': 4.528033330155694e-06, 'epoch': 0.54} 54%|█████▍ | 18654/34278 [20:30:40<15:23:22, 3.55s/it] 54%|█████▍ | 18655/34278 [20:30:45<16:39:25, 3.84s/it] {'loss': 0.1303, 'grad_norm': 0.8519382505129798, 'learning_rate': 4.527563007534203e-06, 'epoch': 0.54} 54%|█████▍ | 18655/34278 [20:30:45<16:39:25, 3.84s/it] 54%|█████▍ | 18656/34278 [20:30:48<15:38:07, 3.60s/it] {'loss': 0.1404, 'grad_norm': 0.9808986787414548, 'learning_rate': 4.527092689130515e-06, 'epoch': 0.54} 54%|█████▍ | 18656/34278 [20:30:48<15:38:07, 3.60s/it] 54%|█████▍ | 18657/34278 [20:30:52<15:59:48, 3.69s/it] {'loss': 0.1258, 'grad_norm': 0.8217557534225198, 'learning_rate': 4.526622374948831e-06, 'epoch': 0.54} 54%|█████▍ | 18657/34278 [20:30:52<15:59:48, 3.69s/it] 54%|█████▍ | 18658/34278 [20:30:58<19:09:54, 4.42s/it] {'loss': 0.1387, 'grad_norm': 0.8524935676331612, 'learning_rate': 4.526152064993351e-06, 'epoch': 0.54} 54%|█████▍ | 18658/34278 [20:30:58<19:09:54, 4.42s/it] 54%|█████▍ | 18659/34278 [20:31:01<17:56:05, 4.13s/it] {'loss': 0.1263, 'grad_norm': 0.7730121827305676, 'learning_rate': 4.525681759268271e-06, 'epoch': 0.54} 54%|█████▍ | 18659/34278 [20:31:01<17:56:05, 4.13s/it] 54%|█████▍ | 18660/34278 [20:31:04<16:46:47, 3.87s/it] {'loss': 0.1393, 'grad_norm': 0.9901844344913672, 'learning_rate': 4.525211457777789e-06, 'epoch': 0.54} 
54%|█████▍ | 18660/34278 [20:31:04<16:46:47, 3.87s/it] 54%|█████▍ | 18661/34278 [20:31:08<16:06:43, 3.71s/it] {'loss': 0.1363, 'grad_norm': 0.8078128701754148, 'learning_rate': 4.524741160526107e-06, 'epoch': 0.54} 54%|█████▍ | 18661/34278 [20:31:08<16:06:43, 3.71s/it] 54%|█████▍ | 18662/34278 [20:31:13<17:27:20, 4.02s/it] {'loss': 0.13, 'grad_norm': 0.8081108537745513, 'learning_rate': 4.524270867517423e-06, 'epoch': 0.54} 54%|█████▍ | 18662/34278 [20:31:13<17:27:20, 4.02s/it] 54%|█████▍ | 18663/34278 [20:31:16<16:22:21, 3.77s/it] {'loss': 0.1267, 'grad_norm': 0.7951821217886085, 'learning_rate': 4.523800578755936e-06, 'epoch': 0.54} 54%|█████▍ | 18663/34278 [20:31:16<16:22:21, 3.77s/it] 54%|█████▍ | 18664/34278 [20:31:19<15:16:31, 3.52s/it] {'loss': 0.1246, 'grad_norm': 0.7916764994858416, 'learning_rate': 4.523330294245843e-06, 'epoch': 0.54} 54%|█████▍ | 18664/34278 [20:31:19<15:16:31, 3.52s/it] 54%|█████▍ | 18665/34278 [20:31:22<15:21:33, 3.54s/it] {'loss': 0.115, 'grad_norm': 0.6801620602796086, 'learning_rate': 4.522860013991343e-06, 'epoch': 0.54} 54%|█████▍ | 18665/34278 [20:31:22<15:21:33, 3.54s/it] 54%|█████▍ | 18666/34278 [20:31:26<15:19:04, 3.53s/it] {'loss': 0.1383, 'grad_norm': 0.8694378580903237, 'learning_rate': 4.522389737996634e-06, 'epoch': 0.54} 54%|█████▍ | 18666/34278 [20:31:26<15:19:04, 3.53s/it] 54%|█████▍ | 18667/34278 [20:31:29<15:14:22, 3.51s/it] {'loss': 0.1304, 'grad_norm': 0.7734560245462737, 'learning_rate': 4.5219194662659175e-06, 'epoch': 0.54} 54%|█████▍ | 18667/34278 [20:31:29<15:14:22, 3.51s/it] 54%|█████▍ | 18668/34278 [20:31:35<18:30:10, 4.27s/it] {'loss': 0.1278, 'grad_norm': 0.6987330885457839, 'learning_rate': 4.521449198803388e-06, 'epoch': 0.54} 54%|█████▍ | 18668/34278 [20:31:35<18:30:10, 4.27s/it] 54%|█████▍ | 18669/34278 [20:31:38<17:00:38, 3.92s/it] {'loss': 0.114, 'grad_norm': 0.7725151046455366, 'learning_rate': 4.5209789356132475e-06, 'epoch': 0.54} 54%|█████▍ | 18669/34278 [20:31:38<17:00:38, 3.92s/it] 54%|█████▍ 
[training progress condensed: step 18670/34278 (54%, epoch 0.54), loss 0.1464, grad_norm 0.9217, learning_rate 4.5205e-06, 3.68 s/it]

Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f433b881580>
Failed to fetch sample 3353699.
Exception: cannot identify image file <_io.BytesIO object at 0x7f433b881580>

[training progress condensed: steps 18671-18679 resume normally (54%, epoch 0.54), loss 0.1141-0.1551, learning_rate 4.5200e-06 -> 4.5163e-06]
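The run survives the corrupted sample: the log prints "Failed to fetch sample 3353699." plus the exception message, and step 18671 proceeds normally, so the dataset's __getitem__ evidently swallows the error and substitutes another sample. A minimal sketch of that retry pattern (the class, loader, and names here are hypothetical stand-ins, not the actual aguvis code):

```python
import random


class RetryDataset:
    """Wrap a per-index loader; on failure, log the sample and retry another index."""

    def __init__(self, loader, size, max_retries=10):
        self.loader = loader          # callable: index -> sample; may raise
        self.size = size
        self.max_retries = max_retries

    def __len__(self):
        return self.size

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.loader(i)
            except Exception as e:
                # Mirrors the log line: "Failed to fetch sample <i>. Exception: <msg>"
                print(f"Failed to fetch sample {i}.")
                print(f"Exception: {e}")
                i = random.randint(0, self.size - 1)  # fall back to a random index
        raise RuntimeError("too many consecutive corrupted samples")


# Demo: index 3 stands in for the unreadable image above.
def demo_loader(i):
    if i == 3:
        raise ValueError("cannot identify image file")
    return {"index": i}


ds = RetryDataset(demo_loader, size=10)
good = ds[0]       # loads normally
substitute = ds[3]  # logs the failure, then returns some other sample
```

Silently resampling keeps long runs alive, but the substituted index skews the effective data distribution slightly, so counting these failures (as this log does via the printed message) is worth keeping.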
[training progress condensed: steps 18680-18762 of 34278, epoch counter ticks 0.54 -> 0.55 at step 18682 (55%), loss 0.0975-0.1559, grad_norm 0.56-1.78, learning_rate 4.5158e-06 -> 4.4773e-06, ~3.2-4.9 s/it; log ends truncated mid-entry at step 18763]
0.9308008446272982, 'learning_rate': 4.476793871687224e-06, 'epoch': 0.55} 55%|█████▍ | 18763/34278 [20:37:39<15:20:29, 3.56s/it] 55%|█████▍ | 18764/34278 [20:37:42<14:38:17, 3.40s/it] {'loss': 0.1063, 'grad_norm': 0.9927858917002728, 'learning_rate': 4.4763240335129895e-06, 'epoch': 0.55} 55%|█████▍ | 18764/34278 [20:37:42<14:38:17, 3.40s/it] 55%|█████▍ | 18765/34278 [20:37:48<18:07:16, 4.21s/it] {'loss': 0.1179, 'grad_norm': 0.7792231841900343, 'learning_rate': 4.475854200014011e-06, 'epoch': 0.55} 55%|█████▍ | 18765/34278 [20:37:48<18:07:16, 4.21s/it] 55%|█████▍ | 18766/34278 [20:37:51<16:18:47, 3.79s/it] {'loss': 0.1396, 'grad_norm': 0.9923921341333932, 'learning_rate': 4.47538437119448e-06, 'epoch': 0.55} 55%|█████▍ | 18766/34278 [20:37:51<16:18:47, 3.79s/it] 55%|█████▍ | 18767/34278 [20:37:54<15:49:31, 3.67s/it] {'loss': 0.154, 'grad_norm': 1.1559385992876254, 'learning_rate': 4.474914547058591e-06, 'epoch': 0.55} 55%|█████▍ | 18767/34278 [20:37:54<15:49:31, 3.67s/it] 55%|█████▍ | 18768/34278 [20:38:00<18:36:11, 4.32s/it] {'loss': 0.1185, 'grad_norm': 0.6660916757162116, 'learning_rate': 4.4744447276105405e-06, 'epoch': 0.55} 55%|█████▍ | 18768/34278 [20:38:00<18:36:11, 4.32s/it] 55%|█████▍ | 18769/34278 [20:38:06<20:53:26, 4.85s/it] {'loss': 0.1483, 'grad_norm': 1.0079416046977872, 'learning_rate': 4.473974912854522e-06, 'epoch': 0.55} 55%|█████▍ | 18769/34278 [20:38:06<20:53:26, 4.85s/it] 55%|█████▍ | 18770/34278 [20:38:10<19:13:10, 4.46s/it] {'loss': 0.1447, 'grad_norm': 0.9979158077808288, 'learning_rate': 4.473505102794731e-06, 'epoch': 0.55} 55%|█████▍ | 18770/34278 [20:38:10<19:13:10, 4.46s/it] 55%|█████▍ | 18771/34278 [20:38:14<19:20:36, 4.49s/it] {'loss': 0.1156, 'grad_norm': 0.6353070059279827, 'learning_rate': 4.4730352974353595e-06, 'epoch': 0.55} 55%|█████▍ | 18771/34278 [20:38:14<19:20:36, 4.49s/it] 55%|█████▍ | 18772/34278 [20:38:18<17:41:02, 4.11s/it] {'loss': 0.1376, 'grad_norm': 0.8250928276288193, 'learning_rate': 4.472565496780603e-06, 
'epoch': 0.55} 55%|█████▍ | 18772/34278 [20:38:18<17:41:02, 4.11s/it] 55%|█████▍ | 18773/34278 [20:38:21<16:47:24, 3.90s/it] {'loss': 0.1239, 'grad_norm': 0.8017297030902534, 'learning_rate': 4.472095700834655e-06, 'epoch': 0.55} 55%|█████▍ | 18773/34278 [20:38:21<16:47:24, 3.90s/it] 55%|█████▍ | 18774/34278 [20:38:25<16:33:53, 3.85s/it] {'loss': 0.1395, 'grad_norm': 0.827653124740946, 'learning_rate': 4.471625909601712e-06, 'epoch': 0.55} 55%|█████▍ | 18774/34278 [20:38:25<16:33:53, 3.85s/it] 55%|█████▍ | 18775/34278 [20:38:29<16:27:47, 3.82s/it] {'loss': 0.1034, 'grad_norm': 0.714812532505758, 'learning_rate': 4.471156123085968e-06, 'epoch': 0.55} 55%|█████▍ | 18775/34278 [20:38:29<16:27:47, 3.82s/it] 55%|█████▍ | 18776/34278 [20:38:32<16:34:08, 3.85s/it] {'loss': 0.1239, 'grad_norm': 0.94154036922442, 'learning_rate': 4.470686341291614e-06, 'epoch': 0.55} 55%|█████▍ | 18776/34278 [20:38:32<16:34:08, 3.85s/it] 55%|█████▍ | 18777/34278 [20:38:38<18:37:24, 4.33s/it] {'loss': 0.1094, 'grad_norm': 0.8996131738315101, 'learning_rate': 4.470216564222846e-06, 'epoch': 0.55} 55%|█████▍ | 18777/34278 [20:38:38<18:37:24, 4.33s/it] 55%|█████▍ | 18778/34278 [20:38:41<17:29:16, 4.06s/it] {'loss': 0.1262, 'grad_norm': 0.8582004083861352, 'learning_rate': 4.469746791883859e-06, 'epoch': 0.55} 55%|█████▍ | 18778/34278 [20:38:41<17:29:16, 4.06s/it] 55%|█████▍ | 18779/34278 [20:38:45<16:52:53, 3.92s/it] {'loss': 0.1077, 'grad_norm': 0.9166084305535637, 'learning_rate': 4.469277024278844e-06, 'epoch': 0.55} 55%|█████▍ | 18779/34278 [20:38:45<16:52:53, 3.92s/it] 55%|█████▍ | 18780/34278 [20:38:48<15:42:06, 3.65s/it] {'loss': 0.1574, 'grad_norm': 1.2600268115693165, 'learning_rate': 4.468807261412e-06, 'epoch': 0.55} 55%|█████▍ | 18780/34278 [20:38:48<15:42:06, 3.65s/it] 55%|█████▍ | 18781/34278 [20:38:51<14:52:57, 3.46s/it] {'loss': 0.1545, 'grad_norm': 0.8629881857398644, 'learning_rate': 4.468337503287516e-06, 'epoch': 0.55} 55%|█████▍ | 18781/34278 [20:38:51<14:52:57, 3.46s/it] 
55%|█████▍ | 18782/34278 [20:38:54<14:21:48, 3.34s/it] {'loss': 0.104, 'grad_norm': 0.9043997551778598, 'learning_rate': 4.467867749909588e-06, 'epoch': 0.55} 55%|█████▍ | 18782/34278 [20:38:54<14:21:48, 3.34s/it] 55%|█████▍ | 18783/34278 [20:39:00<17:55:44, 4.17s/it] {'loss': 0.1412, 'grad_norm': 1.0151971542149922, 'learning_rate': 4.4673980012824106e-06, 'epoch': 0.55} 55%|█████▍ | 18783/34278 [20:39:00<17:55:44, 4.17s/it] 55%|█████▍ | 18784/34278 [20:39:04<17:20:42, 4.03s/it] {'loss': 0.178, 'grad_norm': 1.1540785596688323, 'learning_rate': 4.466928257410176e-06, 'epoch': 0.55} 55%|█████▍ | 18784/34278 [20:39:04<17:20:42, 4.03s/it] 55%|█████▍ | 18785/34278 [20:39:08<17:12:07, 4.00s/it] {'loss': 0.1141, 'grad_norm': 1.4450616556343472, 'learning_rate': 4.466458518297078e-06, 'epoch': 0.55} 55%|█████▍ | 18785/34278 [20:39:08<17:12:07, 4.00s/it] 55%|█████▍ | 18786/34278 [20:39:11<16:34:50, 3.85s/it] {'loss': 0.1152, 'grad_norm': 0.8877658422972836, 'learning_rate': 4.465988783947311e-06, 'epoch': 0.55} 55%|█████▍ | 18786/34278 [20:39:11<16:34:50, 3.85s/it] 55%|█████▍ | 18787/34278 [20:39:14<15:22:36, 3.57s/it] {'loss': 0.1195, 'grad_norm': 1.0677561065489143, 'learning_rate': 4.465519054365071e-06, 'epoch': 0.55} 55%|█████▍ | 18787/34278 [20:39:14<15:22:36, 3.57s/it] 55%|█████▍ | 18788/34278 [20:39:17<14:43:28, 3.42s/it] {'loss': 0.1147, 'grad_norm': 1.0366187645591158, 'learning_rate': 4.4650493295545475e-06, 'epoch': 0.55} 55%|█████▍ | 18788/34278 [20:39:17<14:43:28, 3.42s/it] 55%|█████▍ | 18789/34278 [20:39:21<15:06:54, 3.51s/it] {'loss': 0.1167, 'grad_norm': 0.9566943292061366, 'learning_rate': 4.464579609519936e-06, 'epoch': 0.55} 55%|█████▍ | 18789/34278 [20:39:21<15:06:54, 3.51s/it] 55%|█████▍ | 18790/34278 [20:39:26<16:59:22, 3.95s/it] {'loss': 0.1141, 'grad_norm': 1.015505908262457, 'learning_rate': 4.464109894265431e-06, 'epoch': 0.55} 55%|█████▍ | 18790/34278 [20:39:26<16:59:22, 3.95s/it] 55%|█████▍ | 18791/34278 [20:39:29<15:36:24, 3.63s/it] {'loss': 
0.1225, 'grad_norm': 1.0564093438562037, 'learning_rate': 4.463640183795222e-06, 'epoch': 0.55} 55%|█████▍ | 18791/34278 [20:39:29<15:36:24, 3.63s/it] 55%|█████▍ | 18792/34278 [20:39:32<15:22:11, 3.57s/it] {'loss': 0.1517, 'grad_norm': 1.5484185495907816, 'learning_rate': 4.463170478113509e-06, 'epoch': 0.55} 55%|█████▍ | 18792/34278 [20:39:32<15:22:11, 3.57s/it] 55%|█████▍ | 18793/34278 [20:39:37<16:15:35, 3.78s/it] {'loss': 0.1299, 'grad_norm': 0.9621266986658162, 'learning_rate': 4.462700777224479e-06, 'epoch': 0.55} 55%|█████▍ | 18793/34278 [20:39:37<16:15:35, 3.78s/it] 55%|█████▍ | 18794/34278 [20:39:40<15:37:33, 3.63s/it] {'loss': 0.1438, 'grad_norm': 1.0427061543192153, 'learning_rate': 4.46223108113233e-06, 'epoch': 0.55} 55%|█████▍ | 18794/34278 [20:39:40<15:37:33, 3.63s/it] 55%|█████▍ | 18795/34278 [20:39:43<15:16:43, 3.55s/it] {'loss': 0.1329, 'grad_norm': 0.7867770643621365, 'learning_rate': 4.4617613898412534e-06, 'epoch': 0.55} 55%|█████▍ | 18795/34278 [20:39:43<15:16:43, 3.55s/it] 55%|█████▍ | 18796/34278 [20:39:46<14:46:14, 3.43s/it] {'loss': 0.1289, 'grad_norm': 0.6628114665732285, 'learning_rate': 4.461291703355443e-06, 'epoch': 0.55} 55%|█████▍ | 18796/34278 [20:39:46<14:46:14, 3.43s/it] 55%|█████▍ | 18797/34278 [20:39:53<18:35:49, 4.32s/it] {'loss': 0.1222, 'grad_norm': 0.8931836886938811, 'learning_rate': 4.460822021679089e-06, 'epoch': 0.55} 55%|█████▍ | 18797/34278 [20:39:53<18:35:49, 4.32s/it] 55%|█████▍ | 18798/34278 [20:39:56<16:43:26, 3.89s/it] {'loss': 0.1166, 'grad_norm': 0.9416935724452078, 'learning_rate': 4.4603523448163894e-06, 'epoch': 0.55} 55%|█████▍ | 18798/34278 [20:39:56<16:43:26, 3.89s/it] 55%|█████▍ | 18799/34278 [20:40:00<16:44:48, 3.89s/it] {'loss': 0.1337, 'grad_norm': 0.660218947610981, 'learning_rate': 4.459882672771535e-06, 'epoch': 0.55} 55%|█████▍ | 18799/34278 [20:40:00<16:44:48, 3.89s/it] 55%|█████▍ | 18800/34278 [20:40:04<17:53:19, 4.16s/it] {'loss': 0.1134, 'grad_norm': 0.83965462166203, 'learning_rate': 
4.45941300554872e-06, 'epoch': 0.55} 55%|█████▍ | 18800/34278 [20:40:04<17:53:19, 4.16s/it] 55%|█████▍ | 18801/34278 [20:40:09<18:34:53, 4.32s/it] {'loss': 0.1134, 'grad_norm': 1.0496612007027168, 'learning_rate': 4.4589433431521356e-06, 'epoch': 0.55} 55%|█████▍ | 18801/34278 [20:40:09<18:34:53, 4.32s/it] 55%|█████▍ | 18802/34278 [20:40:12<17:20:38, 4.03s/it] {'loss': 0.1342, 'grad_norm': 0.8463714576445494, 'learning_rate': 4.458473685585976e-06, 'epoch': 0.55} 55%|█████▍ | 18802/34278 [20:40:12<17:20:38, 4.03s/it] 55%|█████▍ | 18803/34278 [20:40:15<16:04:32, 3.74s/it] {'loss': 0.1328, 'grad_norm': 0.8421325209491768, 'learning_rate': 4.458004032854432e-06, 'epoch': 0.55} 55%|█████▍ | 18803/34278 [20:40:15<16:04:32, 3.74s/it] 55%|█████▍ | 18804/34278 [20:40:18<15:11:10, 3.53s/it] {'loss': 0.1343, 'grad_norm': 0.9108583905917115, 'learning_rate': 4.457534384961701e-06, 'epoch': 0.55} 55%|█████▍ | 18804/34278 [20:40:18<15:11:10, 3.53s/it] 55%|█████▍ | 18805/34278 [20:40:21<14:29:10, 3.37s/it] {'loss': 0.1279, 'grad_norm': 0.9689609962829522, 'learning_rate': 4.457064741911974e-06, 'epoch': 0.55} 55%|█████▍ | 18805/34278 [20:40:21<14:29:10, 3.37s/it] 55%|█████▍ | 18806/34278 [20:40:25<14:17:19, 3.32s/it] {'loss': 0.1352, 'grad_norm': 0.7041795736720375, 'learning_rate': 4.456595103709443e-06, 'epoch': 0.55} 55%|█████▍ | 18806/34278 [20:40:25<14:17:19, 3.32s/it] 55%|█████▍ | 18807/34278 [20:40:30<16:27:56, 3.83s/it] {'loss': 0.1292, 'grad_norm': 0.8467823049388483, 'learning_rate': 4.456125470358301e-06, 'epoch': 0.55} 55%|█████▍ | 18807/34278 [20:40:30<16:27:56, 3.83s/it] 55%|█████▍ | 18808/34278 [20:40:33<16:05:08, 3.74s/it] {'loss': 0.1455, 'grad_norm': 0.7456565169942629, 'learning_rate': 4.455655841862742e-06, 'epoch': 0.55} 55%|█████▍ | 18808/34278 [20:40:33<16:05:08, 3.74s/it] 55%|█████▍ | 18809/34278 [20:40:37<16:39:20, 3.88s/it] {'loss': 0.1343, 'grad_norm': 0.8075679281034006, 'learning_rate': 4.455186218226953e-06, 'epoch': 0.55} 55%|█████▍ | 18809/34278 
[20:40:37<16:39:20, 3.88s/it] 55%|█████▍ | 18810/34278 [20:40:42<17:03:58, 3.97s/it] {'loss': 0.1273, 'grad_norm': 0.7829420297078291, 'learning_rate': 4.454716599455137e-06, 'epoch': 0.55} 55%|█████▍ | 18810/34278 [20:40:42<17:03:58, 3.97s/it] 55%|█████▍ | 18811/34278 [20:40:45<16:37:12, 3.87s/it] {'loss': 0.1107, 'grad_norm': 0.8074066550865667, 'learning_rate': 4.454246985551478e-06, 'epoch': 0.55} 55%|█████▍ | 18811/34278 [20:40:45<16:37:12, 3.87s/it] 55%|█████▍ | 18812/34278 [20:40:49<16:59:43, 3.96s/it] {'loss': 0.1243, 'grad_norm': 0.877346506703811, 'learning_rate': 4.453777376520173e-06, 'epoch': 0.55} 55%|█████▍ | 18812/34278 [20:40:49<16:59:43, 3.96s/it] 55%|█████▍ | 18813/34278 [20:40:53<15:57:06, 3.71s/it] {'loss': 0.1441, 'grad_norm': 0.8439324148329742, 'learning_rate': 4.4533077723654134e-06, 'epoch': 0.55} 55%|█████▍ | 18813/34278 [20:40:53<15:57:06, 3.71s/it] 55%|█████▍ | 18814/34278 [20:40:56<15:34:45, 3.63s/it] {'loss': 0.1096, 'grad_norm': 0.9160281794188151, 'learning_rate': 4.452838173091391e-06, 'epoch': 0.55} 55%|█████▍ | 18814/34278 [20:40:56<15:34:45, 3.63s/it] 55%|█████▍ | 18815/34278 [20:41:02<18:46:55, 4.37s/it] {'loss': 0.1427, 'grad_norm': 1.2818435066609408, 'learning_rate': 4.452368578702297e-06, 'epoch': 0.55} 55%|█████▍ | 18815/34278 [20:41:02<18:46:55, 4.37s/it] 55%|█████▍ | 18816/34278 [20:41:05<17:18:12, 4.03s/it] {'loss': 0.1206, 'grad_norm': 0.8903131888603528, 'learning_rate': 4.451898989202327e-06, 'epoch': 0.55} 55%|█████▍ | 18816/34278 [20:41:05<17:18:12, 4.03s/it] 55%|█████▍ | 18817/34278 [20:41:08<16:05:19, 3.75s/it] {'loss': 0.1292, 'grad_norm': 1.0707974982028392, 'learning_rate': 4.451429404595673e-06, 'epoch': 0.55} 55%|█████▍ | 18817/34278 [20:41:08<16:05:19, 3.75s/it] 55%|█████▍ | 18818/34278 [20:41:11<14:59:58, 3.49s/it] {'loss': 0.1226, 'grad_norm': 1.068831289893814, 'learning_rate': 4.450959824886525e-06, 'epoch': 0.55} 55%|█████▍ | 18818/34278 [20:41:11<14:59:58, 3.49s/it] 55%|█████▍ | 18819/34278 
[20:41:14<14:08:00, 3.29s/it] {'loss': 0.1305, 'grad_norm': 0.9618543277054882, 'learning_rate': 4.450490250079077e-06, 'epoch': 0.55} 55%|█████▍ | 18819/34278 [20:41:14<14:08:00, 3.29s/it] 55%|█████▍ | 18820/34278 [20:41:20<17:29:21, 4.07s/it] {'loss': 0.1379, 'grad_norm': 0.7889767428814896, 'learning_rate': 4.450020680177522e-06, 'epoch': 0.55} 55%|█████▍ | 18820/34278 [20:41:20<17:29:21, 4.07s/it] 55%|█████▍ | 18821/34278 [20:41:23<16:04:36, 3.74s/it] {'loss': 0.1456, 'grad_norm': 0.9420117986656027, 'learning_rate': 4.449551115186049e-06, 'epoch': 0.55} 55%|█████▍ | 18821/34278 [20:41:23<16:04:36, 3.74s/it] 55%|█████▍ | 18822/34278 [20:41:27<16:29:30, 3.84s/it] {'loss': 0.1174, 'grad_norm': 0.9483556160076674, 'learning_rate': 4.4490815551088535e-06, 'epoch': 0.55} 55%|█████▍ | 18822/34278 [20:41:27<16:29:30, 3.84s/it] 55%|█████▍ | 18823/34278 [20:41:30<15:20:11, 3.57s/it] {'loss': 0.12, 'grad_norm': 0.8420100584383265, 'learning_rate': 4.448611999950126e-06, 'epoch': 0.55} 55%|█████▍ | 18823/34278 [20:41:30<15:20:11, 3.57s/it] 55%|█████▍ | 18824/34278 [20:41:33<14:15:46, 3.32s/it] {'loss': 0.1085, 'grad_norm': 0.753450656429226, 'learning_rate': 4.448142449714059e-06, 'epoch': 0.55} 55%|█████▍ | 18824/34278 [20:41:33<14:15:46, 3.32s/it] 55%|█████▍ | 18825/34278 [20:41:37<15:18:37, 3.57s/it] {'loss': 0.1207, 'grad_norm': 0.7376249513628443, 'learning_rate': 4.447672904404846e-06, 'epoch': 0.55} 55%|█████▍ | 18825/34278 [20:41:37<15:18:37, 3.57s/it] 55%|█████▍ | 18826/34278 [20:41:40<14:35:43, 3.40s/it] {'loss': 0.1288, 'grad_norm': 0.6532449538371288, 'learning_rate': 4.447203364026675e-06, 'epoch': 0.55} 55%|█████▍ | 18826/34278 [20:41:40<14:35:43, 3.40s/it] 55%|█████▍ | 18827/34278 [20:41:43<14:10:01, 3.30s/it] {'loss': 0.1357, 'grad_norm': 0.7441042072903009, 'learning_rate': 4.44673382858374e-06, 'epoch': 0.55} 55%|█████▍ | 18827/34278 [20:41:43<14:10:01, 3.30s/it] 55%|█████▍ | 18828/34278 [20:41:48<16:51:28, 3.93s/it] {'loss': 0.1356, 'grad_norm': 
0.7667964860350982, 'learning_rate': 4.446264298080235e-06, 'epoch': 0.55} 55%|█████▍ | 18828/34278 [20:41:48<16:51:28, 3.93s/it] 55%|█████▍ | 18829/34278 [20:41:51<15:46:18, 3.68s/it] {'loss': 0.1316, 'grad_norm': 0.7521707498130297, 'learning_rate': 4.44579477252035e-06, 'epoch': 0.55} 55%|█████▍ | 18829/34278 [20:41:51<15:46:18, 3.68s/it] 55%|█████▍ | 18830/34278 [20:41:57<17:54:49, 4.17s/it] {'loss': 0.1362, 'grad_norm': 0.799486513772945, 'learning_rate': 4.4453252519082775e-06, 'epoch': 0.55} 55%|█████▍ | 18830/34278 [20:41:57<17:54:49, 4.17s/it] 55%|█████▍ | 18831/34278 [20:42:03<20:11:45, 4.71s/it] {'loss': 0.1221, 'grad_norm': 0.7546343097194946, 'learning_rate': 4.444855736248208e-06, 'epoch': 0.55} 55%|█████▍ | 18831/34278 [20:42:03<20:11:45, 4.71s/it] 55%|█████▍ | 18832/34278 [20:42:06<18:13:03, 4.25s/it] {'loss': 0.1556, 'grad_norm': 0.7752573259250796, 'learning_rate': 4.444386225544334e-06, 'epoch': 0.55} 55%|█████▍ | 18832/34278 [20:42:06<18:13:03, 4.25s/it] 55%|█████▍ | 18833/34278 [20:42:12<21:03:10, 4.91s/it] {'loss': 0.1318, 'grad_norm': 0.685234704736772, 'learning_rate': 4.443916719800846e-06, 'epoch': 0.55} 55%|█████▍ | 18833/34278 [20:42:12<21:03:10, 4.91s/it] 55%|█████▍ | 18834/34278 [20:42:16<19:13:11, 4.48s/it] {'loss': 0.1556, 'grad_norm': 0.8198630358754666, 'learning_rate': 4.443447219021938e-06, 'epoch': 0.55} 55%|█████▍ | 18834/34278 [20:42:16<19:13:11, 4.48s/it] 55%|█████▍ | 18835/34278 [20:42:20<18:45:54, 4.37s/it] {'loss': 0.1151, 'grad_norm': 0.7886386679426491, 'learning_rate': 4.442977723211801e-06, 'epoch': 0.55} 55%|█████▍ | 18835/34278 [20:42:20<18:45:54, 4.37s/it] 55%|█████▍ | 18836/34278 [20:42:24<17:44:14, 4.14s/it] {'loss': 0.1099, 'grad_norm': 0.9014790405505484, 'learning_rate': 4.442508232374625e-06, 'epoch': 0.55} 55%|█████▍ | 18836/34278 [20:42:24<17:44:14, 4.14s/it] 55%|█████▍ | 18837/34278 [20:42:27<16:33:37, 3.86s/it] {'loss': 0.1483, 'grad_norm': 0.8411352161674372, 'learning_rate': 4.442038746514603e-06, 
'epoch': 0.55} 55%|█████▍ | 18837/34278 [20:42:27<16:33:37, 3.86s/it] 55%|█████▍ | 18838/34278 [20:42:30<16:06:06, 3.75s/it] {'loss': 0.1189, 'grad_norm': 0.6923289850293326, 'learning_rate': 4.441569265635927e-06, 'epoch': 0.55} 55%|█████▍ | 18838/34278 [20:42:30<16:06:06, 3.75s/it] 55%|█████▍ | 18839/34278 [20:42:33<14:51:29, 3.46s/it] {'loss': 0.103, 'grad_norm': 0.78649112074162, 'learning_rate': 4.441099789742783e-06, 'epoch': 0.55} 55%|█████▍ | 18839/34278 [20:42:33<14:51:29, 3.46s/it] 55%|█████▍ | 18840/34278 [20:42:36<14:46:16, 3.44s/it] {'loss': 0.1345, 'grad_norm': 0.7811436076507703, 'learning_rate': 4.440630318839371e-06, 'epoch': 0.55} 55%|█████▍ | 18840/34278 [20:42:36<14:46:16, 3.44s/it] 55%|█████▍ | 18841/34278 [20:42:39<14:09:35, 3.30s/it] {'loss': 0.1333, 'grad_norm': 0.7639822127543663, 'learning_rate': 4.4401608529298755e-06, 'epoch': 0.55} 55%|█████▍ | 18841/34278 [20:42:39<14:09:35, 3.30s/it] 55%|█████▍ | 18842/34278 [20:42:44<15:11:25, 3.54s/it] {'loss': 0.1568, 'grad_norm': 0.9188100326365559, 'learning_rate': 4.439691392018492e-06, 'epoch': 0.55} 55%|█████▍ | 18842/34278 [20:42:44<15:11:25, 3.54s/it] 55%|█████▍ | 18843/34278 [20:42:49<18:17:23, 4.27s/it] {'loss': 0.1204, 'grad_norm': 0.7253342840516087, 'learning_rate': 4.439221936109409e-06, 'epoch': 0.55} 55%|█████▍ | 18843/34278 [20:42:49<18:17:23, 4.27s/it] 55%|█████▍ | 18844/34278 [20:42:53<17:15:56, 4.03s/it] {'loss': 0.1289, 'grad_norm': 0.9615458196395555, 'learning_rate': 4.438752485206819e-06, 'epoch': 0.55} 55%|█████▍ | 18844/34278 [20:42:53<17:15:56, 4.03s/it] 55%|█████▍ | 18845/34278 [20:42:56<16:08:31, 3.77s/it] {'loss': 0.1226, 'grad_norm': 0.8532075732843499, 'learning_rate': 4.438283039314912e-06, 'epoch': 0.55} 55%|█████▍ | 18845/34278 [20:42:56<16:08:31, 3.77s/it] 55%|█████▍ | 18846/34278 [20:43:02<19:02:58, 4.44s/it] {'loss': 0.1205, 'grad_norm': 0.8816727263528346, 'learning_rate': 4.437813598437881e-06, 'epoch': 0.55} 55%|█████▍ | 18846/34278 [20:43:02<19:02:58, 
4.44s/it] 55%|█████▍ | 18847/34278 [20:43:06<17:53:57, 4.18s/it] {'loss': 0.1403, 'grad_norm': 0.9298808247796456, 'learning_rate': 4.437344162579917e-06, 'epoch': 0.55} 55%|█████▍ | 18847/34278 [20:43:06<17:53:57, 4.18s/it] 55%|█████▍ | 18848/34278 [20:43:10<17:32:23, 4.09s/it] {'loss': 0.1482, 'grad_norm': 0.7830926199444167, 'learning_rate': 4.4368747317452075e-06, 'epoch': 0.55} 55%|█████▍ | 18848/34278 [20:43:10<17:32:23, 4.09s/it] 55%|█████▍ | 18849/34278 [20:43:12<15:54:46, 3.71s/it] {'loss': 0.1236, 'grad_norm': 0.7605544103110462, 'learning_rate': 4.436405305937947e-06, 'epoch': 0.55} 55%|█████▍ | 18849/34278 [20:43:12<15:54:46, 3.71s/it] 55%|█████▍ | 18850/34278 [20:43:15<14:56:38, 3.49s/it] {'loss': 0.1159, 'grad_norm': 0.8261668690686542, 'learning_rate': 4.435935885162327e-06, 'epoch': 0.55} 55%|█████▍ | 18850/34278 [20:43:15<14:56:38, 3.49s/it] 55%|█████▍ | 18851/34278 [20:43:21<18:04:44, 4.22s/it] {'loss': 0.1247, 'grad_norm': 1.0097377594927257, 'learning_rate': 4.435466469422533e-06, 'epoch': 0.55} 55%|█████▍ | 18851/34278 [20:43:21<18:04:44, 4.22s/it] 55%|█████▍ | 18852/34278 [20:43:26<19:16:29, 4.50s/it] {'loss': 0.1353, 'grad_norm': 0.9728248677311301, 'learning_rate': 4.434997058722762e-06, 'epoch': 0.55} 55%|█████▍ | 18852/34278 [20:43:26<19:16:29, 4.50s/it] 55%|█████▌ | 18853/34278 [20:43:30<17:37:05, 4.11s/it] {'loss': 0.113, 'grad_norm': 0.7422217850705474, 'learning_rate': 4.434527653067203e-06, 'epoch': 0.55} 55%|█████▌ | 18853/34278 [20:43:30<17:37:05, 4.11s/it] 55%|█████▌ | 18854/34278 [20:43:33<17:05:19, 3.99s/it] {'loss': 0.1263, 'grad_norm': 0.9094787481848825, 'learning_rate': 4.434058252460045e-06, 'epoch': 0.55} 55%|█████▌ | 18854/34278 [20:43:33<17:05:19, 3.99s/it] 55%|█████▌ | 18855/34278 [20:43:37<16:56:46, 3.96s/it] {'loss': 0.1326, 'grad_norm': 0.6841078949401904, 'learning_rate': 4.433588856905481e-06, 'epoch': 0.55} 55%|█████▌ | 18855/34278 [20:43:37<16:56:46, 3.96s/it] 55%|█████▌ | 18856/34278 [20:43:41<16:29:41, 3.85s/it] 
{'loss': 0.1367, 'grad_norm': 0.8828866013868408, 'learning_rate': 4.4331194664077e-06, 'epoch': 0.55} 55%|█████▌ | 18856/34278 [20:43:41<16:29:41, 3.85s/it] 55%|█████▌ | 18857/34278 [20:43:44<15:32:30, 3.63s/it] {'loss': 0.1462, 'grad_norm': 0.7830079930268402, 'learning_rate': 4.432650080970891e-06, 'epoch': 0.55} 55%|█████▌ | 18857/34278 [20:43:44<15:32:30, 3.63s/it] 55%|█████▌ | 18858/34278 [20:43:49<17:57:42, 4.19s/it] {'loss': 0.1238, 'grad_norm': 0.8156306303494512, 'learning_rate': 4.432180700599248e-06, 'epoch': 0.55} 55%|█████▌ | 18858/34278 [20:43:49<17:57:42, 4.19s/it] 55%|█████▌ | 18859/34278 [20:43:53<16:44:46, 3.91s/it] {'loss': 0.1343, 'grad_norm': 0.7823258402850047, 'learning_rate': 4.431711325296961e-06, 'epoch': 0.55} 55%|█████▌ | 18859/34278 [20:43:53<16:44:46, 3.91s/it] 55%|█████▌ | 18860/34278 [20:43:56<15:46:01, 3.68s/it] {'loss': 0.1164, 'grad_norm': 0.8022561333768476, 'learning_rate': 4.43124195506822e-06, 'epoch': 0.55} 55%|█████▌ | 18860/34278 [20:43:56<15:46:01, 3.68s/it] 55%|█████▌ | 18861/34278 [20:44:00<15:53:03, 3.71s/it] {'loss': 0.1213, 'grad_norm': 0.8583481960707582, 'learning_rate': 4.430772589917214e-06, 'epoch': 0.55} 55%|█████▌ | 18861/34278 [20:44:00<15:53:03, 3.71s/it] 55%|█████▌ | 18862/34278 [20:44:03<15:03:10, 3.52s/it] {'loss': 0.1365, 'grad_norm': 0.954864727716638, 'learning_rate': 4.4303032298481344e-06, 'epoch': 0.55} 55%|█████▌ | 18862/34278 [20:44:03<15:03:10, 3.52s/it] 55%|█████▌ | 18863/34278 [20:44:06<14:21:45, 3.35s/it] {'loss': 0.1229, 'grad_norm': 0.8819247378086011, 'learning_rate': 4.429833874865171e-06, 'epoch': 0.55} 55%|█████▌ | 18863/34278 [20:44:06<14:21:45, 3.35s/it] 55%|█████▌ | 18864/34278 [20:44:09<13:49:59, 3.23s/it] {'loss': 0.1375, 'grad_norm': 0.9175827146331453, 'learning_rate': 4.429364524972516e-06, 'epoch': 0.55} 55%|█████▌ | 18864/34278 [20:44:09<13:49:59, 3.23s/it] 55%|█████▌ | 18865/34278 [20:44:12<13:49:10, 3.23s/it] {'loss': 0.1235, 'grad_norm': 0.7254745234518502, 'learning_rate': 
4.428895180174358e-06, 'epoch': 0.55} 55%|█████▌ | 18865/34278 [20:44:12<13:49:10, 3.23s/it] 55%|█████▌ | 18866/34278 [20:44:16<15:37:04, 3.65s/it] {'loss': 0.1351, 'grad_norm': 0.9335599154474736, 'learning_rate': 4.428425840474888e-06, 'epoch': 0.55} 55%|█████▌ | 18866/34278 [20:44:16<15:37:04, 3.65s/it] 55%|█████▌ | 18867/34278 [20:44:20<15:55:15, 3.72s/it] {'loss': 0.148, 'grad_norm': 1.048875958370967, 'learning_rate': 4.427956505878294e-06, 'epoch': 0.55} 55%|█████▌ | 18867/34278 [20:44:20<15:55:15, 3.72s/it] 55%|█████▌ | 18868/34278 [20:44:25<16:58:44, 3.97s/it] {'loss': 0.1099, 'grad_norm': 0.6447775964429916, 'learning_rate': 4.42748717638877e-06, 'epoch': 0.55} 55%|█████▌ | 18868/34278 [20:44:25<16:58:44, 3.97s/it] 55%|█████▌ | 18869/34278 [20:44:28<15:48:31, 3.69s/it] {'loss': 0.1192, 'grad_norm': 0.9248577677453754, 'learning_rate': 4.4270178520105e-06, 'epoch': 0.55} 55%|█████▌ | 18869/34278 [20:44:28<15:48:31, 3.69s/it] 55%|█████▌ | 18870/34278 [20:44:31<15:20:01, 3.58s/it] {'loss': 0.1286, 'grad_norm': 0.8270878868959397, 'learning_rate': 4.426548532747681e-06, 'epoch': 0.55} 55%|█████▌ | 18870/34278 [20:44:31<15:20:01, 3.58s/it] 55%|█████▌ | 18871/34278 [20:44:34<14:30:03, 3.39s/it] {'loss': 0.1607, 'grad_norm': 0.7425708562237897, 'learning_rate': 4.426079218604499e-06, 'epoch': 0.55} 55%|█████▌ | 18871/34278 [20:44:34<14:30:03, 3.39s/it] 55%|█████▌ | 18872/34278 [20:44:37<14:16:17, 3.33s/it] {'loss': 0.1403, 'grad_norm': 1.4173539162091753, 'learning_rate': 4.4256099095851455e-06, 'epoch': 0.55} 55%|█████▌ | 18872/34278 [20:44:37<14:16:17, 3.33s/it] 55%|█████▌ | 18873/34278 [20:44:41<14:13:16, 3.32s/it] {'loss': 0.1249, 'grad_norm': 0.9591979772681544, 'learning_rate': 4.42514060569381e-06, 'epoch': 0.55} 55%|█████▌ | 18873/34278 [20:44:41<14:13:16, 3.32s/it] 55%|█████▌ | 18874/34278 [20:44:44<14:29:49, 3.39s/it] {'loss': 0.1343, 'grad_norm': 0.9629079639441862, 'learning_rate': 4.424671306934681e-06, 'epoch': 0.55} 55%|█████▌ | 18874/34278 
[20:44:44<14:29:49, 3.39s/it] 55%|█████▌ | 18875/34278 [20:44:48<14:59:00, 3.50s/it] {'loss': 0.1111, 'grad_norm': 0.6543564246871194, 'learning_rate': 4.424202013311947e-06, 'epoch': 0.55} 55%|█████▌ | 18875/34278 [20:44:48<14:59:00, 3.50s/it] 55%|█████▌ | 18876/34278 [20:44:53<16:50:40, 3.94s/it] {'loss': 0.1119, 'grad_norm': 0.9133388080681468, 'learning_rate': 4.423732724829802e-06, 'epoch': 0.55} 55%|█████▌ | 18876/34278 [20:44:53<16:50:40, 3.94s/it] 55%|█████▌ | 18877/34278 [20:44:56<16:16:25, 3.80s/it] {'loss': 0.1339, 'grad_norm': 0.8416084456937348, 'learning_rate': 4.423263441492436e-06, 'epoch': 0.55} 55%|█████▌ | 18877/34278 [20:44:56<16:16:25, 3.80s/it] 55%|█████▌ | 18878/34278 [20:45:02<18:58:52, 4.44s/it] {'loss': 0.1335, 'grad_norm': 0.7324154434440577, 'learning_rate': 4.4227941633040335e-06, 'epoch': 0.55} 55%|█████▌ | 18878/34278 [20:45:02<18:58:52, 4.44s/it] 55%|█████▌ | 18879/34278 [20:45:07<19:12:23, 4.49s/it] {'loss': 0.1416, 'grad_norm': 0.8143888661673182, 'learning_rate': 4.422324890268787e-06, 'epoch': 0.55} 55%|█████▌ | 18879/34278 [20:45:07<19:12:23, 4.49s/it] 55%|█████▌ | 18880/34278 [20:45:10<17:38:56, 4.13s/it] {'loss': 0.1256, 'grad_norm': 0.8755486013829306, 'learning_rate': 4.421855622390887e-06, 'epoch': 0.55} 55%|█████▌ | 18880/34278 [20:45:10<17:38:56, 4.13s/it] 55%|█████▌ | 18881/34278 [20:45:16<20:01:54, 4.68s/it] {'loss': 0.1195, 'grad_norm': 0.9647428164434501, 'learning_rate': 4.42138635967452e-06, 'epoch': 0.55} 55%|█████▌ | 18881/34278 [20:45:16<20:01:54, 4.68s/it] 55%|█████▌ | 18882/34278 [20:45:19<18:04:57, 4.23s/it] {'loss': 0.1073, 'grad_norm': 0.6968924654914538, 'learning_rate': 4.420917102123879e-06, 'epoch': 0.55} 55%|█████▌ | 18882/34278 [20:45:19<18:04:57, 4.23s/it] 55%|█████▌ | 18883/34278 [20:45:23<17:29:02, 4.09s/it] {'loss': 0.1218, 'grad_norm': 0.6776839132508738, 'learning_rate': 4.420447849743152e-06, 'epoch': 0.55} 55%|█████▌ | 18883/34278 [20:45:23<17:29:02, 4.09s/it] 55%|█████▌ | 18884/34278 
[20:45:26<16:07:21, 3.77s/it] {'loss': 0.1301, 'grad_norm': 0.9780392719740577, 'learning_rate': 4.419978602536529e-06, 'epoch': 0.55} 55%|█████▌ | 18884/34278 [20:45:26<16:07:21, 3.77s/it] 55%|█████▌ | 18885/34278 [20:45:31<17:31:40, 4.10s/it] {'loss': 0.1227, 'grad_norm': 1.269366287466131, 'learning_rate': 4.419509360508198e-06, 'epoch': 0.55} 55%|█████▌ | 18885/34278 [20:45:31<17:31:40, 4.10s/it] 55%|█████▌ | 18886/34278 [20:45:35<16:57:40, 3.97s/it] {'loss': 0.1373, 'grad_norm': 0.988261489618055, 'learning_rate': 4.419040123662348e-06, 'epoch': 0.55} 55%|█████▌ | 18886/34278 [20:45:35<16:57:40, 3.97s/it] 55%|█████▌ | 18887/34278 [20:45:38<15:33:51, 3.64s/it] {'loss': 0.1255, 'grad_norm': 0.998681512237093, 'learning_rate': 4.418570892003169e-06, 'epoch': 0.55} 55%|█████▌ | 18887/34278 [20:45:38<15:33:51, 3.64s/it] 55%|█████▌ | 18888/34278 [20:45:42<16:11:14, 3.79s/it] {'loss': 0.1302, 'grad_norm': 0.9859257190037307, 'learning_rate': 4.418101665534851e-06, 'epoch': 0.55} 55%|█████▌ | 18888/34278 [20:45:42<16:11:14, 3.79s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 55%|█████▌ | 18889/34278 [20:45:46<16:22:11, 3.83s/it] {'loss': 0.1201, 'grad_norm': 0.7435861505082985, 'learning_rate': 4.417632444261582e-06, 'epoch': 0.55} 55%|█████▌ | 18889/34278 [20:45:46<16:22:11, 3.83s/it] 55%|█████▌ | 18890/34278 [20:45:49<15:36:35, 3.65s/it] {'loss': 0.1296, 'grad_norm': 0.9044645211984617, 'learning_rate': 4.417163228187552e-06, 'epoch': 0.55} 55%|█████▌ | 18890/34278 [20:45:49<15:36:35, 3.65s/it] 55%|█████▌ | 18891/34278 [20:45:52<15:08:33, 3.54s/it] {'loss': 0.1198, 'grad_norm': 0.6673412854286197, 'learning_rate': 4.41669401731695e-06, 'epoch': 0.55} 55%|█████▌ | 18891/34278 [20:45:52<15:08:33, 3.54s/it] 55%|█████▌ | 18892/34278 [20:45:55<14:47:15, 3.46s/it] {'loss': 0.1262, 'grad_norm': 1.015387950005781, 'learning_rate': 4.416224811653963e-06, 'epoch': 0.55} 55%|█████▌ | 18892/34278 [20:45:55<14:47:15, 3.46s/it] 55%|█████▌ | 18893/34278 [20:45:59<15:30:56, 3.63s/it] {'loss': 0.1135, 'grad_norm': 0.7409449697376059, 'learning_rate': 4.415755611202782e-06, 'epoch': 0.55} 55%|█████▌ | 18893/34278 [20:45:59<15:30:56, 3.63s/it] 55%|█████▌ | 18894/34278 [20:46:03<14:47:21, 3.46s/it] {'loss': 0.1353, 'grad_norm': 0.8683402773180618, 'learning_rate': 4.415286415967596e-06, 'epoch': 0.55} 55%|█████▌ | 18894/34278 [20:46:03<14:47:21, 3.46s/it] 55%|█████▌ | 18895/34278 [20:46:06<15:09:20, 3.55s/it] {'loss': 0.1204, 'grad_norm': 0.8628280057573453, 'learning_rate': 4.414817225952594e-06, 'epoch': 0.55} 55%|█████▌ | 18895/34278 [20:46:06<15:09:20, 3.55s/it] 55%|█████▌ | 18896/34278 [20:46:09<14:18:30, 3.35s/it] {'loss': 0.1202, 'grad_norm': 0.6847709045708442, 'learning_rate': 4.414348041161963e-06, 'epoch': 0.55} 55%|█████▌ | 18896/34278 [20:46:09<14:18:30, 3.35s/it] 55%|█████▌ | 18897/34278 [20:46:15<17:45:51, 4.16s/it] {'loss': 0.1239, 'grad_norm': 0.8020461498396281, 'learning_rate': 4.413878861599893e-06, 'epoch': 0.55} 55%|█████▌ | 18897/34278 [20:46:15<17:45:51, 4.16s/it] 55%|█████▌ | 18898/34278 
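The UserWarning above ("None of the inputs have requires_grad=True. Gradients will be None") is emitted by `torch.utils.checkpoint` when the legacy reentrant checkpointing path is given only inputs that do not require grad, e.g. hidden states downstream of frozen embeddings. A minimal reproduction sketch, assuming a recent PyTorch (the two-linear setup here is illustrative, not the project's model):

```python
# Reproduce the checkpoint warning from the log, then silence it by making
# the checkpointed input require grad (HF Transformers exposes the same fix
# as model.enable_input_require_grads()).
import warnings

import torch
from torch.utils.checkpoint import checkpoint

layer = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # requires_grad=False, like a detached/frozen input

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    _ = checkpoint(layer, x, use_reentrant=True)  # warns: no input needs grad
assert any("requires_grad" in str(w.message) for w in caught)

x.requires_grad_(True)  # rebuild the graph through the input
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    _ = checkpoint(layer, x, use_reentrant=True)  # no warning now
assert not any("requires_grad" in str(w.message) for w in caught)
```

The warning is usually harmless when it fires on modules whose parameters are intentionally frozen, but during full fine-tuning it can indicate that gradient checkpointing is silently cutting the backward graph.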
[20:46:20<18:56:17, 4.43s/it] {'loss': 0.1431, 'grad_norm': 1.0200723866121209, 'learning_rate': 4.413409687270574e-06, 'epoch': 0.55} 55%|█████▌ | 18898/34278 [20:46:20<18:56:17, 4.43s/it] 55%|█████▌ | 18899/34278 [20:46:23<16:50:21, 3.94s/it] {'loss': 0.1361, 'grad_norm': 1.0289337290145673, 'learning_rate': 4.412940518178191e-06, 'epoch': 0.55} 55%|█████▌ | 18899/34278 [20:46:23<16:50:21, 3.94s/it] 55%|█████▌ | 18900/34278 [20:46:26<15:40:12, 3.67s/it] {'loss': 0.1303, 'grad_norm': 1.095416605994027, 'learning_rate': 4.412471354326936e-06, 'epoch': 0.55} 55%|█████▌ | 18900/34278 [20:46:26<15:40:12, 3.67s/it] 55%|█████▌ | 18901/34278 [20:46:29<14:40:49, 3.44s/it] {'loss': 0.1484, 'grad_norm': 1.0664787186450981, 'learning_rate': 4.412002195720996e-06, 'epoch': 0.55} 55%|█████▌ | 18901/34278 [20:46:29<14:40:49, 3.44s/it] 55%|█████▌ | 18902/34278 [20:46:32<13:59:46, 3.28s/it] {'loss': 0.1453, 'grad_norm': 1.1820260823233966, 'learning_rate': 4.41153304236456e-06, 'epoch': 0.55} 55%|█████▌ | 18902/34278 [20:46:32<13:59:46, 3.28s/it] 55%|█████▌ | 18903/34278 [20:46:35<14:16:44, 3.34s/it] {'loss': 0.1195, 'grad_norm': 1.0926917410889156, 'learning_rate': 4.411063894261818e-06, 'epoch': 0.55} 55%|█████▌ | 18903/34278 [20:46:35<14:16:44, 3.34s/it] 55%|█████▌ | 18904/34278 [20:46:40<15:49:50, 3.71s/it] {'loss': 0.1208, 'grad_norm': 0.8201506636812573, 'learning_rate': 4.410594751416956e-06, 'epoch': 0.55} 55%|█████▌ | 18904/34278 [20:46:40<15:49:50, 3.71s/it] 55%|█████▌ | 18905/34278 [20:46:43<15:19:04, 3.59s/it] {'loss': 0.1236, 'grad_norm': 0.9268618846094715, 'learning_rate': 4.410125613834162e-06, 'epoch': 0.55} 55%|█████▌ | 18905/34278 [20:46:43<15:19:04, 3.59s/it] 55%|█████▌ | 18906/34278 [20:46:46<14:43:01, 3.45s/it] {'loss': 0.1423, 'grad_norm': 0.8709982156274102, 'learning_rate': 4.409656481517627e-06, 'epoch': 0.55} 55%|█████▌ | 18906/34278 [20:46:46<14:43:01, 3.45s/it] 55%|█████▌ | 18907/34278 [20:46:50<14:17:34, 3.35s/it] {'loss': 0.151, 'grad_norm': 
1.0799782334146752, 'learning_rate': 4.409187354471539e-06, 'epoch': 0.55} 55%|█████▌ | 18907/34278 [20:46:50<14:17:34, 3.35s/it] 55%|█████▌ | 18908/34278 [20:46:53<13:53:10, 3.25s/it] {'loss': 0.1437, 'grad_norm': 0.8726156079233057, 'learning_rate': 4.4087182327000845e-06, 'epoch': 0.55} 55%|█████▌ | 18908/34278 [20:46:53<13:53:10, 3.25s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f30ff654590>
Failed to fetch sample 3187193.
Exception: cannot identify image file <_io.BytesIO object at 0x7f30ff654590> 55%|█████▌ | 18909/34278 [20:46:56<13:34:55, 3.18s/it] {'loss': 0.133, 'grad_norm': 0.880339981058879, 'learning_rate': 4.408249116207452e-06, 'epoch': 0.55} 55%|█████▌ | 18909/34278 [20:46:56<13:34:55, 3.18s/it] 55%|█████▌ | 18910/34278 [20:46:59<13:33:42, 3.18s/it] {'loss': 0.1209, 'grad_norm': 1.0609898160043045, 'learning_rate': 4.407780004997831e-06, 'epoch': 0.55} 55%|█████▌ | 18910/34278 [20:46:59<13:33:42, 3.18s/it] 55%|█████▌ | 18911/34278 [20:47:02<14:08:44, 3.31s/it] {'loss': 0.1126, 'grad_norm': 0.8067179625470575, 'learning_rate': 4.407310899075406e-06, 'epoch': 0.55} 55%|█████▌ | 18911/34278 [20:47:02<14:08:44, 3.31s/it] 55%|█████▌ | 18912/34278 [20:47:06<14:02:53, 3.29s/it] {'loss': 0.1423, 'grad_norm': 0.8214607414672347, 'learning_rate': 4.406841798444371e-06, 'epoch': 0.55} 55%|█████▌ | 18912/34278 [20:47:06<14:02:53, 3.29s/it] 55%|█████▌ | 18913/34278 [20:47:09<14:29:42, 3.40s/it] {'loss': 0.1279, 'grad_norm': 0.9296961843547017, 'learning_rate': 4.40637270310891e-06, 'epoch': 0.55} 55%|█████▌ | 18913/34278 [20:47:09<14:29:42, 3.40s/it] 55%|█████▌ | 18914/34278 [20:47:12<14:01:21, 3.29s/it] {'loss': 0.1354, 'grad_norm': 1.0150414366375669, 'learning_rate': 4.4059036130732115e-06, 'epoch': 0.55} 55%|█████▌ | 18914/34278 [20:47:12<14:01:21, 3.29s/it] 55%|█████▌ | 18915/34278 [20:47:18<17:26:08, 4.09s/it] {'loss': 0.1278, 'grad_norm': 0.8882836837030985, 'learning_rate': 4.4054345283414645e-06, 'epoch': 0.55} 55%|█████▌ | 18915/34278 [20:47:18<17:26:08, 4.09s/it] 55%|█████▌ | 18916/34278 [20:47:21<16:12:04, 3.80s/it] {'loss': 0.1229, 'grad_norm': 0.8148474926750665, 'learning_rate': 4.404965448917855e-06, 'epoch': 0.55} 55%|█████▌ | 18916/34278 [20:47:21<16:12:04, 3.80s/it] 55%|█████▌ | 18917/34278 [20:47:25<15:48:25, 3.70s/it] {'loss': 0.111, 'grad_norm': 0.9414039923913972, 'learning_rate': 4.4044963748065716e-06, 'epoch': 0.55} 55%|█████▌ | 18917/34278 
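The traceback above shows a payload fetched for sample 3187193 that PIL cannot decode, after which the run logs the failure and keeps training, which suggests `__getitem__` catches the exception and retries with a different sample. A minimal sketch of that retry pattern follows; the method name `_get_item` mirrors the traceback, but the random-resample fallback policy and the toy PNG check are assumptions, not the repository's actual code.

```python
import io
import random

class RobustDataset:
    """Sketch of a dataset that survives undecodable samples by
    retrying with a different random index (assumed policy)."""

    def __init__(self, records, max_retries=10):
        self.records = records          # list of raw byte payloads
        self.max_retries = max_retries

    def _get_item(self, i):
        buff = io.BytesIO(self.records[i])
        # Stand-in for PIL.Image.open: reject payloads without a PNG header.
        if buff.read(8) != b"\x89PNG\r\n\x1a\n":
            raise ValueError(f"cannot identify image file {buff!r}")
        return {"index": i}

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randrange(len(self.records))  # resample and retry
        raise RuntimeError("too many corrupt samples in a row")

good = b"\x89PNG\r\n\x1a\n" + b"\x00" * 8
ds = RobustDataset([good, b"not an image", good])
sample = ds[2]  # valid payload decodes directly, no retry needed
```

The upside of this pattern is that one corrupt object in remote storage costs a log line instead of the whole run; the downside is that silent resampling can mask a systematically broken shard, so the failure messages above are worth auditing.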
[20:47:25<15:48:25, 3.70s/it] 55%|█████▌ | 18918/34278 [20:47:29<15:46:06, 3.70s/it] {'loss': 0.1262, 'grad_norm': 0.8041724061132699, 'learning_rate': 4.404027306011804e-06, 'epoch': 0.55} 55%|█████▌ | 18918/34278 [20:47:29<15:46:06, 3.70s/it] 55%|█████▌ | 18919/34278 [20:47:32<15:42:55, 3.68s/it] {'loss': 0.1236, 'grad_norm': 0.9318898626723681, 'learning_rate': 4.403558242537737e-06, 'epoch': 0.55} 55%|█████▌ | 18919/34278 [20:47:32<15:42:55, 3.68s/it] 55%|█████▌ | 18920/34278 [20:47:35<14:58:00, 3.51s/it] {'loss': 0.1519, 'grad_norm': 1.1528299941047244, 'learning_rate': 4.40308918438856e-06, 'epoch': 0.55} 55%|█████▌ | 18920/34278 [20:47:35<14:58:00, 3.51s/it] 55%|█████▌ | 18921/34278 [20:47:38<14:31:24, 3.40s/it] {'loss': 0.1046, 'grad_norm': 0.8343848843723077, 'learning_rate': 4.402620131568461e-06, 'epoch': 0.55} 55%|█████▌ | 18921/34278 [20:47:38<14:31:24, 3.40s/it] 55%|█████▌ | 18922/34278 [20:47:42<15:05:58, 3.54s/it] {'loss': 0.1454, 'grad_norm': 0.8668240869909689, 'learning_rate': 4.402151084081625e-06, 'epoch': 0.55} 55%|█████▌ | 18922/34278 [20:47:42<15:05:58, 3.54s/it] 55%|█████▌ | 18923/34278 [20:47:46<15:14:11, 3.57s/it] {'loss': 0.1235, 'grad_norm': 0.6814790320732358, 'learning_rate': 4.401682041932243e-06, 'epoch': 0.55} 55%|█████▌ | 18923/34278 [20:47:46<15:14:11, 3.57s/it] 55%|█████▌ | 18924/34278 [20:47:49<14:57:50, 3.51s/it] {'loss': 0.1225, 'grad_norm': 0.8920669265290863, 'learning_rate': 4.4012130051245e-06, 'epoch': 0.55} 55%|█████▌ | 18924/34278 [20:47:49<14:57:50, 3.51s/it] 55%|█████▌ | 18925/34278 [20:47:55<18:07:45, 4.25s/it] {'loss': 0.1241, 'grad_norm': 0.7313361740763353, 'learning_rate': 4.400743973662586e-06, 'epoch': 0.55} 55%|█████▌ | 18925/34278 [20:47:55<18:07:45, 4.25s/it] 55%|█████▌ | 18926/34278 [20:48:01<20:10:31, 4.73s/it] {'loss': 0.1375, 'grad_norm': 0.7913429969906784, 'learning_rate': 4.400274947550685e-06, 'epoch': 0.55} 55%|█████▌ | 18926/34278 [20:48:01<20:10:31, 4.73s/it] 55%|█████▌ | 18927/34278 
[20:48:05<18:51:27, 4.42s/it] {'loss': 0.1427, 'grad_norm': 0.8398671674310144, 'learning_rate': 4.3998059267929875e-06, 'epoch': 0.55} 55%|█████▌ | 18927/34278 [20:48:05<18:51:27, 4.42s/it] 55%|█████▌ | 18928/34278 [20:48:08<17:46:20, 4.17s/it] {'loss': 0.1091, 'grad_norm': 0.9810835566100664, 'learning_rate': 4.3993369113936765e-06, 'epoch': 0.55} 55%|█████▌ | 18928/34278 [20:48:08<17:46:20, 4.17s/it] 55%|█████▌ | 18929/34278 [20:48:12<16:56:03, 3.97s/it] {'loss': 0.1066, 'grad_norm': 0.6773255354104609, 'learning_rate': 4.3988679013569455e-06, 'epoch': 0.55} 55%|█████▌ | 18929/34278 [20:48:12<16:56:03, 3.97s/it] 55%|█████▌ | 18930/34278 [20:48:15<16:02:35, 3.76s/it] {'loss': 0.1345, 'grad_norm': 0.7687159863426298, 'learning_rate': 4.398398896686977e-06, 'epoch': 0.55} 55%|█████▌ | 18930/34278 [20:48:15<16:02:35, 3.76s/it] 55%|█████▌ | 18931/34278 [20:48:20<17:13:37, 4.04s/it] {'loss': 0.1345, 'grad_norm': 0.854676951390436, 'learning_rate': 4.39792989738796e-06, 'epoch': 0.55} 55%|█████▌ | 18931/34278 [20:48:20<17:13:37, 4.04s/it] 55%|█████▌ | 18932/34278 [20:48:23<16:22:02, 3.84s/it] {'loss': 0.0965, 'grad_norm': 0.6842095574049585, 'learning_rate': 4.39746090346408e-06, 'epoch': 0.55} 55%|█████▌ | 18932/34278 [20:48:23<16:22:02, 3.84s/it] 55%|█████▌ | 18933/34278 [20:48:28<17:23:20, 4.08s/it] {'loss': 0.1126, 'grad_norm': 0.7438169389739349, 'learning_rate': 4.396991914919528e-06, 'epoch': 0.55} 55%|█████▌ | 18933/34278 [20:48:28<17:23:20, 4.08s/it] 55%|█████▌ | 18934/34278 [20:48:31<16:36:04, 3.89s/it] {'loss': 0.1277, 'grad_norm': 0.7650484211780397, 'learning_rate': 4.3965229317584846e-06, 'epoch': 0.55} 55%|█████▌ | 18934/34278 [20:48:31<16:36:04, 3.89s/it] 55%|█████▌ | 18935/34278 [20:48:34<15:37:30, 3.67s/it] {'loss': 0.1393, 'grad_norm': 0.9242249311460413, 'learning_rate': 4.396053953985142e-06, 'epoch': 0.55} 55%|█████▌ | 18935/34278 [20:48:35<15:37:30, 3.67s/it] 55%|█████▌ | 18936/34278 [20:48:38<15:53:53, 3.73s/it] {'loss': 0.1209, 'grad_norm': 
0.8117106767305521, 'learning_rate': 4.395584981603686e-06, 'epoch': 0.55} 55%|█████▌ | 18936/34278 [20:48:38<15:53:53, 3.73s/it] 55%|█████▌ | 18937/34278 [20:48:44<18:48:46, 4.41s/it] {'loss': 0.1255, 'grad_norm': 0.6625364024823419, 'learning_rate': 4.395116014618303e-06, 'epoch': 0.55} 55%|█████▌ | 18937/34278 [20:48:44<18:48:46, 4.41s/it] 55%|█████▌ | 18938/34278 [20:48:49<19:25:17, 4.56s/it] {'loss': 0.1352, 'grad_norm': 1.0487966281576853, 'learning_rate': 4.39464705303318e-06, 'epoch': 0.55} 55%|█████▌ | 18938/34278 [20:48:49<19:25:17, 4.56s/it] 55%|█████▌ | 18939/34278 [20:48:52<17:27:19, 4.10s/it] {'loss': 0.1269, 'grad_norm': 0.8082036924009987, 'learning_rate': 4.394178096852503e-06, 'epoch': 0.55} 55%|█████▌ | 18939/34278 [20:48:52<17:27:19, 4.10s/it] 55%|█████▌ | 18940/34278 [20:48:56<16:27:43, 3.86s/it] {'loss': 0.1131, 'grad_norm': 0.822389565897582, 'learning_rate': 4.393709146080458e-06, 'epoch': 0.55} 55%|█████▌ | 18940/34278 [20:48:56<16:27:43, 3.86s/it] 55%|█████▌ | 18941/34278 [20:49:01<18:32:08, 4.35s/it] {'loss': 0.1193, 'grad_norm': 1.035314430128407, 'learning_rate': 4.393240200721234e-06, 'epoch': 0.55} 55%|█████▌ | 18941/34278 [20:49:01<18:32:08, 4.35s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 55%|█████▌ | 18942/34278 [20:49:04<16:39:08, 3.91s/it] {'loss': 0.1564, 'grad_norm': 0.9707725354811764, 'learning_rate': 4.392771260779018e-06, 'epoch': 0.55} 55%|█████▌ | 18942/34278 [20:49:04<16:39:08, 3.91s/it] 55%|█████▌ | 18943/34278 [20:49:07<15:12:33, 3.57s/it] {'loss': 0.1372, 'grad_norm': 0.9636631736749641, 'learning_rate': 4.392302326257995e-06, 'epoch': 0.55} 55%|█████▌ | 18943/34278 [20:49:07<15:12:33, 3.57s/it] 55%|█████▌ | 18944/34278 [20:49:11<15:59:32, 3.75s/it] {'loss': 0.1284, 'grad_norm': 1.0086431948465115, 'learning_rate': 4.39183339716235e-06, 'epoch': 0.55} 55%|█████▌ | 18944/34278 [20:49:11<15:59:32, 3.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 55%|█████▌ | 18945/34278 [20:49:14<15:23:10, 3.61s/it] {'loss': 0.1226, 'grad_norm': 0.8029281205526639, 'learning_rate': 4.391364473496273e-06, 'epoch': 0.55} 55%|█████▌ | 18945/34278 [20:49:14<15:23:10, 3.61s/it] 55%|█████▌ | 18946/34278 [20:49:18<15:36:39, 3.67s/it] {'loss': 0.1369, 'grad_norm': 0.8376520467839277, 'learning_rate': 4.390895555263946e-06, 'epoch': 0.55} 55%|█████▌ | 18946/34278 [20:49:18<15:36:39, 3.67s/it] 55%|█████▌ | 18947/34278 [20:49:21<15:11:06, 3.57s/it] {'loss': 0.1348, 'grad_norm': 0.8621087899813299, 'learning_rate': 4.390426642469561e-06, 'epoch': 0.55} 55%|█████▌ | 18947/34278 [20:49:21<15:11:06, 3.57s/it] 55%|█████▌ | 18948/34278 [20:49:25<14:58:18, 3.52s/it] {'loss': 0.1398, 'grad_norm': 1.2560347738458584, 'learning_rate': 4.3899577351173005e-06, 'epoch': 0.55} 55%|█████▌ | 18948/34278 [20:49:25<14:58:18, 3.52s/it] 55%|█████▌ | 18949/34278 [20:49:31<18:09:58, 4.27s/it] {'loss': 0.1362, 'grad_norm': 0.8478268100757564, 'learning_rate': 4.389488833211351e-06, 'epoch': 0.55} 55%|█████▌ | 18949/34278 [20:49:31<18:09:58, 4.27s/it] 55%|█████▌ | 
18950/34278 [20:49:34<17:25:03, 4.09s/it] {'loss': 0.1502, 'grad_norm': 0.9655586142279446, 'learning_rate': 4.389019936755902e-06, 'epoch': 0.55} 55%|█████▌ | 18950/34278 [20:49:34<17:25:03, 4.09s/it] 55%|█████▌ | 18951/34278 [20:49:38<16:51:44, 3.96s/it] {'loss': 0.1205, 'grad_norm': 0.7419266281202673, 'learning_rate': 4.388551045755135e-06, 'epoch': 0.55} 55%|█████▌ | 18951/34278 [20:49:38<16:51:44, 3.96s/it] 55%|█████▌ | 18952/34278 [20:49:41<15:42:29, 3.69s/it] {'loss': 0.1182, 'grad_norm': 0.8711008103723668, 'learning_rate': 4.388082160213237e-06, 'epoch': 0.55} 55%|█████▌ | 18952/34278 [20:49:41<15:42:29, 3.69s/it] 55%|█████▌ | 18953/34278 [20:49:44<15:15:04, 3.58s/it] {'loss': 0.1286, 'grad_norm': 0.7856337236933707, 'learning_rate': 4.387613280134397e-06, 'epoch': 0.55} 55%|█████▌ | 18953/34278 [20:49:45<15:15:04, 3.58s/it] 55%|█████▌ | 18954/34278 [20:49:49<16:05:16, 3.78s/it] {'loss': 0.1136, 'grad_norm': 1.0391416184434505, 'learning_rate': 4.3871444055228e-06, 'epoch': 0.55} 55%|█████▌ | 18954/34278 [20:49:49<16:05:16, 3.78s/it] 55%|█████▌ | 18955/34278 [20:49:52<15:15:37, 3.59s/it] {'loss': 0.1304, 'grad_norm': 0.750986333806838, 'learning_rate': 4.386675536382631e-06, 'epoch': 0.55} 55%|█████▌ | 18955/34278 [20:49:52<15:15:37, 3.59s/it] 55%|█████▌ | 18956/34278 [20:49:55<14:38:53, 3.44s/it] {'loss': 0.1202, 'grad_norm': 1.249896122674722, 'learning_rate': 4.3862066727180765e-06, 'epoch': 0.55} 55%|█████▌ | 18956/34278 [20:49:55<14:38:53, 3.44s/it] 55%|█████▌ | 18957/34278 [20:49:58<14:23:45, 3.38s/it] {'loss': 0.1326, 'grad_norm': 0.8660143694255064, 'learning_rate': 4.385737814533322e-06, 'epoch': 0.55} 55%|█████▌ | 18957/34278 [20:49:58<14:23:45, 3.38s/it] 55%|█████▌ | 18958/34278 [20:50:04<17:53:30, 4.20s/it] {'loss': 0.124, 'grad_norm': 0.7776825646497837, 'learning_rate': 4.385268961832553e-06, 'epoch': 0.55} 55%|█████▌ | 18958/34278 [20:50:04<17:53:30, 4.20s/it] 55%|█████▌ | 18959/34278 [20:50:08<16:50:52, 3.96s/it] {'loss': 0.1178, 
'grad_norm': 0.7515619600990452, 'learning_rate': 4.384800114619957e-06, 'epoch': 0.55} 55%|█████▌ | 18959/34278 [20:50:08<16:50:52, 3.96s/it] 55%|█████▌ | 18960/34278 [20:50:11<16:34:07, 3.89s/it] {'loss': 0.1156, 'grad_norm': 0.8631802248534036, 'learning_rate': 4.384331272899718e-06, 'epoch': 0.55} 55%|█████▌ | 18960/34278 [20:50:12<16:34:07, 3.89s/it] 55%|█████▌ | 18961/34278 [20:50:17<19:15:48, 4.53s/it] {'loss': 0.1208, 'grad_norm': 0.7442139612714272, 'learning_rate': 4.383862436676023e-06, 'epoch': 0.55} 55%|█████▌ | 18961/34278 [20:50:17<19:15:48, 4.53s/it] 55%|█████▌ | 18962/34278 [20:50:20<17:09:49, 4.03s/it] {'loss': 0.1499, 'grad_norm': 1.070272932987149, 'learning_rate': 4.383393605953057e-06, 'epoch': 0.55} 55%|█████▌ | 18962/34278 [20:50:20<17:09:49, 4.03s/it] 55%|█████▌ | 18963/34278 [20:50:23<15:58:57, 3.76s/it] {'loss': 0.1342, 'grad_norm': 0.869308725151239, 'learning_rate': 4.382924780735007e-06, 'epoch': 0.55} 55%|█████▌ | 18963/34278 [20:50:23<15:58:57, 3.76s/it] 55%|█████▌ | 18964/34278 [20:50:28<16:58:31, 3.99s/it] {'loss': 0.1653, 'grad_norm': 0.9966991000253045, 'learning_rate': 4.3824559610260545e-06, 'epoch': 0.55} 55%|█████▌ | 18964/34278 [20:50:28<16:58:31, 3.99s/it] 55%|█████▌ | 18965/34278 [20:50:31<16:16:40, 3.83s/it] {'loss': 0.1305, 'grad_norm': 0.708567997809416, 'learning_rate': 4.381987146830389e-06, 'epoch': 0.55} 55%|█████▌ | 18965/34278 [20:50:31<16:16:40, 3.83s/it] 55%|█████▌ | 18966/34278 [20:50:37<19:00:27, 4.47s/it] {'loss': 0.1593, 'grad_norm': 1.0003084806399476, 'learning_rate': 4.381518338152195e-06, 'epoch': 0.55} 55%|█████▌ | 18966/34278 [20:50:37<19:00:27, 4.47s/it] 55%|█████▌ | 18967/34278 [20:50:40<17:03:14, 4.01s/it] {'loss': 0.1015, 'grad_norm': 0.8213124015520988, 'learning_rate': 4.381049534995658e-06, 'epoch': 0.55} 55%|█████▌ | 18967/34278 [20:50:40<17:03:14, 4.01s/it] 55%|█████▌ | 18968/34278 [20:50:44<16:58:34, 3.99s/it] {'loss': 0.1271, 'grad_norm': 0.8604842656028665, 'learning_rate': 
4.380580737364962e-06, 'epoch': 0.55} 55%|█████▌ | 18968/34278 [20:50:44<16:58:34, 3.99s/it] 55%|█████▌ | 18969/34278 [20:50:47<15:25:41, 3.63s/it] {'loss': 0.1241, 'grad_norm': 0.7736718020630616, 'learning_rate': 4.380111945264294e-06, 'epoch': 0.55} 55%|█████▌ | 18969/34278 [20:50:47<15:25:41, 3.63s/it] 55%|█████▌ | 18970/34278 [20:50:50<14:56:33, 3.51s/it] {'loss': 0.1601, 'grad_norm': 0.8683832202446158, 'learning_rate': 4.379643158697837e-06, 'epoch': 0.55} 55%|█████▌ | 18970/34278 [20:50:50<14:56:33, 3.51s/it] 55%|█████▌ | 18971/34278 [20:50:54<14:59:06, 3.52s/it] {'loss': 0.1154, 'grad_norm': 0.6773294747814594, 'learning_rate': 4.3791743776697795e-06, 'epoch': 0.55} 55%|█████▌ | 18971/34278 [20:50:54<14:59:06, 3.52s/it] 55%|█████▌ | 18972/34278 [20:50:58<16:22:13, 3.85s/it] {'loss': 0.14, 'grad_norm': 0.9849618752089566, 'learning_rate': 4.378705602184306e-06, 'epoch': 0.55} 55%|█████▌ | 18972/34278 [20:50:59<16:22:13, 3.85s/it] 55%|█████▌ | 18973/34278 [20:51:01<15:00:21, 3.53s/it] {'loss': 0.1167, 'grad_norm': 0.7972406615254845, 'learning_rate': 4.3782368322455985e-06, 'epoch': 0.55} 55%|█████▌ | 18973/34278 [20:51:01<15:00:21, 3.53s/it] 55%|█████▌ | 18974/34278 [20:51:05<14:48:30, 3.48s/it] {'loss': 0.166, 'grad_norm': 0.8624819543776043, 'learning_rate': 4.377768067857845e-06, 'epoch': 0.55} 55%|█████▌ | 18974/34278 [20:51:05<14:48:30, 3.48s/it] 55%|█████▌ | 18975/34278 [20:51:08<14:07:57, 3.32s/it] {'loss': 0.1241, 'grad_norm': 0.7968597633134373, 'learning_rate': 4.37729930902523e-06, 'epoch': 0.55} 55%|█████▌ | 18975/34278 [20:51:08<14:07:57, 3.32s/it] 55%|█████▌ | 18976/34278 [20:51:13<17:16:20, 4.06s/it] {'loss': 0.1237, 'grad_norm': 0.8832363884273774, 'learning_rate': 4.376830555751935e-06, 'epoch': 0.55} 55%|█████▌ | 18976/34278 [20:51:13<17:16:20, 4.06s/it] 55%|█████▌ | 18977/34278 [20:51:17<16:25:08, 3.86s/it] {'loss': 0.1342, 'grad_norm': 0.8994375649384976, 'learning_rate': 4.376361808042152e-06, 'epoch': 0.55} 55%|█████▌ | 18977/34278 
[20:51:17<16:25:08, 3.86s/it] 55%|█████▌ | 18978/34278 [20:51:20<15:37:37, 3.68s/it] {'loss': 0.1263, 'grad_norm': 0.8993620114859945, 'learning_rate': 4.37589306590006e-06, 'epoch': 0.55} 55%|█████▌ | 18978/34278 [20:51:20<15:37:37, 3.68s/it] 55%|█████▌ | 18979/34278 [20:51:25<17:02:22, 4.01s/it] {'loss': 0.1319, 'grad_norm': 0.8789984648277301, 'learning_rate': 4.375424329329847e-06, 'epoch': 0.55} 55%|█████▌ | 18979/34278 [20:51:25<17:02:22, 4.01s/it] 55%|█████▌ | 18980/34278 [20:51:29<16:47:10, 3.95s/it] {'loss': 0.1208, 'grad_norm': 0.700312367315295, 'learning_rate': 4.374955598335696e-06, 'epoch': 0.55} 55%|█████▌ | 18980/34278 [20:51:29<16:47:10, 3.95s/it] 55%|█████▌ | 18981/34278 [20:51:32<15:43:34, 3.70s/it] {'loss': 0.1459, 'grad_norm': 0.9916636045107491, 'learning_rate': 4.374486872921792e-06, 'epoch': 0.55} 55%|█████▌ | 18981/34278 [20:51:32<15:43:34, 3.70s/it] 55%|█████▌ | 18982/34278 [20:51:35<14:51:46, 3.50s/it] {'loss': 0.1222, 'grad_norm': 0.7780606346331178, 'learning_rate': 4.374018153092319e-06, 'epoch': 0.55} 55%|█████▌ | 18982/34278 [20:51:35<14:51:46, 3.50s/it] 55%|█████▌ | 18983/34278 [20:51:39<15:36:38, 3.67s/it] {'loss': 0.1285, 'grad_norm': 0.8056706125102487, 'learning_rate': 4.373549438851463e-06, 'epoch': 0.55} 55%|█████▌ | 18983/34278 [20:51:39<15:36:38, 3.67s/it] 55%|█████▌ | 18984/34278 [20:51:45<18:33:17, 4.37s/it] {'loss': 0.1195, 'grad_norm': 0.842929800563625, 'learning_rate': 4.373080730203408e-06, 'epoch': 0.55} 55%|█████▌ | 18984/34278 [20:51:45<18:33:17, 4.37s/it] 55%|█████▌ | 18985/34278 [20:51:48<17:10:26, 4.04s/it] {'loss': 0.1091, 'grad_norm': 0.6976156825232113, 'learning_rate': 4.37261202715234e-06, 'epoch': 0.55} 55%|█████▌ | 18985/34278 [20:51:48<17:10:26, 4.04s/it] 55%|█████▌ | 18986/34278 [20:51:53<18:34:50, 4.37s/it] {'loss': 0.1117, 'grad_norm': 0.7394111334881569, 'learning_rate': 4.372143329702441e-06, 'epoch': 0.55} 55%|█████▌ | 18986/34278 [20:51:53<18:34:50, 
4.37s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 55%|█████▌ | 18987/34278 [20:51:57<17:17:38, 4.07s/it] {'loss': 0.126, 'grad_norm': 0.7721938028571832, 'learning_rate': 4.371674637857896e-06, 'epoch': 0.55} 55%|█████▌ | 18987/34278 [20:51:57<17:17:38, 4.07s/it] 55%|█████▌ | 18988/34278 [20:52:01<17:20:18, 4.08s/it] {'loss': 0.1457, 'grad_norm': 0.8670435034129449, 'learning_rate': 4.371205951622889e-06, 'epoch': 0.55} 55%|█████▌ | 18988/34278 [20:52:01<17:20:18, 4.08s/it] 55%|█████▌ | 18989/34278 [20:52:05<17:15:14, 4.06s/it] {'loss': 0.125, 'grad_norm': 0.7389260962426617, 'learning_rate': 4.370737271001607e-06, 'epoch': 0.55} 55%|█████▌ | 18989/34278 [20:52:05<17:15:14, 4.06s/it] 55%|█████▌ | 18990/34278 [20:52:08<16:46:59, 3.95s/it] {'loss': 0.1394, 'grad_norm': 0.8865940446498207, 'learning_rate': 4.3702685959982326e-06, 'epoch': 0.55} 55%|█████▌ | 18990/34278 [20:52:08<16:46:59, 3.95s/it] 55%|█████▌ | 18991/34278 [20:52:11<15:33:15, 3.66s/it] {'loss': 0.1293, 'grad_norm': 0.9553162359285411, 'learning_rate': 4.369799926616949e-06, 'epoch': 0.55} 55%|█████▌ | 18991/34278 [20:52:11<15:33:15, 3.66s/it] 55%|█████▌ | 18992/34278 [20:52:15<14:57:03, 3.52s/it] {'loss': 0.1063, 'grad_norm': 0.8915793746933419, 'learning_rate': 4.369331262861942e-06, 'epoch': 0.55} 55%|█████▌ | 18992/34278 [20:52:15<14:57:03, 3.52s/it] 55%|█████▌ | 18993/34278 [20:52:19<16:19:07, 3.84s/it] {'loss': 0.1222, 'grad_norm': 0.871306471435351, 'learning_rate': 4.368862604737395e-06, 'epoch': 0.55} 55%|█████▌ | 18993/34278 [20:52:19<16:19:07, 3.84s/it] 55%|█████▌ | 18994/34278 [20:52:22<15:27:02, 3.64s/it] {'loss': 0.1238, 'grad_norm': 0.8436957957319473, 'learning_rate': 4.368393952247489e-06, 'epoch': 0.55} 55%|█████▌ | 18994/34278 [20:52:22<15:27:02, 3.64s/it] 55%|█████▌ | 18995/34278 [20:52:26<15:14:46, 
3.59s/it] {'loss': 0.1301, 'grad_norm': 0.8121602703143961, 'learning_rate': 4.367925305396414e-06, 'epoch': 0.55} 55%|█████▌ | 18995/34278 [20:52:26<15:14:46, 3.59s/it] 55%|█████▌ | 18996/34278 [20:52:29<14:17:11, 3.37s/it] {'loss': 0.1354, 'grad_norm': 0.736501040908026, 'learning_rate': 4.36745666418835e-06, 'epoch': 0.55} 55%|█████▌ | 18996/34278 [20:52:29<14:17:11, 3.37s/it] 55%|█████▌ | 18997/34278 [20:52:32<13:48:51, 3.25s/it] {'loss': 0.1357, 'grad_norm': 0.9146198369061643, 'learning_rate': 4.366988028627484e-06, 'epoch': 0.55} 55%|█████▌ | 18997/34278 [20:52:32<13:48:51, 3.25s/it] 55%|█████▌ | 18998/34278 [20:52:35<13:22:43, 3.15s/it] {'loss': 0.1233, 'grad_norm': 1.0272053538843235, 'learning_rate': 4.366519398717995e-06, 'epoch': 0.55} 55%|█████▌ | 18998/34278 [20:52:35<13:22:43, 3.15s/it] 55%|█████▌ | 18999/34278 [20:52:37<12:58:38, 3.06s/it] {'loss': 0.1599, 'grad_norm': 3.134792651135183, 'learning_rate': 4.366050774464071e-06, 'epoch': 0.55} 55%|█████▌ | 18999/34278 [20:52:37<12:58:38, 3.06s/it] 55%|█████▌ | 19000/34278 [20:52:40<12:54:34, 3.04s/it] {'loss': 0.1047, 'grad_norm': 0.6636392571345271, 'learning_rate': 4.365582155869892e-06, 'epoch': 0.55} 55%|█████▌ | 19000/34278 [20:52:40<12:54:34, 3.04s/it] 55%|█████▌ | 19001/34278 [20:52:43<12:48:40, 3.02s/it] {'loss': 0.122, 'grad_norm': 0.8338426930577785, 'learning_rate': 4.365113542939646e-06, 'epoch': 0.55} 55%|█████▌ | 19001/34278 [20:52:43<12:48:40, 3.02s/it] 55%|█████▌ | 19002/34278 [20:52:47<13:48:47, 3.26s/it] {'loss': 0.1524, 'grad_norm': 1.0225088274138192, 'learning_rate': 4.364644935677516e-06, 'epoch': 0.55} 55%|█████▌ | 19002/34278 [20:52:47<13:48:47, 3.26s/it] 55%|█████▌ | 19003/34278 [20:52:51<14:21:29, 3.38s/it] {'loss': 0.1135, 'grad_norm': 0.8356938730103732, 'learning_rate': 4.364176334087683e-06, 'epoch': 0.55} 55%|█████▌ | 19003/34278 [20:52:51<14:21:29, 3.38s/it] 55%|█████▌ | 19004/34278 [20:52:54<14:27:56, 3.41s/it] {'loss': 0.1275, 'grad_norm': 0.827849731435746, 
'learning_rate': 4.363707738174331e-06, 'epoch': 0.55} 55%|█████▌ | 19004/34278 [20:52:54<14:27:56, 3.41s/it] 55%|█████▌ | 19005/34278 [20:52:59<15:56:45, 3.76s/it] {'loss': 0.1048, 'grad_norm': 0.758940524020363, 'learning_rate': 4.363239147941647e-06, 'epoch': 0.55} 55%|█████▌ | 19005/34278 [20:52:59<15:56:45, 3.76s/it] 55%|█████▌ | 19006/34278 [20:53:03<15:54:27, 3.75s/it] {'loss': 0.1345, 'grad_norm': 0.9455837093178442, 'learning_rate': 4.362770563393808e-06, 'epoch': 0.55} 55%|█████▌ | 19006/34278 [20:53:03<15:54:27, 3.75s/it] 55%|█████▌ | 19007/34278 [20:53:06<14:49:16, 3.49s/it] {'loss': 0.1163, 'grad_norm': 1.010327736798248, 'learning_rate': 4.362301984535005e-06, 'epoch': 0.55} 55%|█████▌ | 19007/34278 [20:53:06<14:49:16, 3.49s/it] 55%|█████▌ | 19008/34278 [20:53:09<14:29:52, 3.42s/it] {'loss': 0.1444, 'grad_norm': 1.5037815722126615, 'learning_rate': 4.361833411369415e-06, 'epoch': 0.55} 55%|█████▌ | 19008/34278 [20:53:09<14:29:52, 3.42s/it] 55%|█████▌ | 19009/34278 [20:53:12<14:37:50, 3.45s/it] {'loss': 0.1411, 'grad_norm': 1.0660801195113891, 'learning_rate': 4.361364843901226e-06, 'epoch': 0.55} 55%|█████▌ | 19009/34278 [20:53:12<14:37:50, 3.45s/it] 55%|█████▌ | 19010/34278 [20:53:16<14:14:51, 3.36s/it] {'loss': 0.1292, 'grad_norm': 0.8827088248975826, 'learning_rate': 4.360896282134619e-06, 'epoch': 0.55} 55%|█████▌ | 19010/34278 [20:53:16<14:14:51, 3.36s/it] 55%|█████▌ | 19011/34278 [20:53:19<14:08:37, 3.34s/it] {'loss': 0.1429, 'grad_norm': 0.8612085820432255, 'learning_rate': 4.360427726073776e-06, 'epoch': 0.55} 55%|█████▌ | 19011/34278 [20:53:19<14:08:37, 3.34s/it] 55%|█████▌ | 19012/34278 [20:53:22<13:27:47, 3.17s/it] {'loss': 0.1332, 'grad_norm': 0.9299876808958056, 'learning_rate': 4.359959175722881e-06, 'epoch': 0.55} 55%|█████▌ | 19012/34278 [20:53:22<13:27:47, 3.17s/it] 55%|█████▌ | 19013/34278 [20:53:26<15:31:50, 3.66s/it] {'loss': 0.1662, 'grad_norm': 1.0010153111031155, 'learning_rate': 4.3594906310861195e-06, 'epoch': 0.55} 55%|█████▌ 
| 19013/34278 [20:53:26<15:31:50, 3.66s/it] 55%|█████▌ | 19014/34278 [20:53:31<16:10:33, 3.82s/it] {'loss': 0.1316, 'grad_norm': 0.8387242150310642, 'learning_rate': 4.359022092167672e-06, 'epoch': 0.55} 55%|█████▌ | 19014/34278 [20:53:31<16:10:33, 3.82s/it] 55%|█████▌ | 19015/34278 [20:53:37<19:08:56, 4.52s/it] {'loss': 0.1184, 'grad_norm': 0.8570485680581119, 'learning_rate': 4.358553558971723e-06, 'epoch': 0.55} 55%|█████▌ | 19015/34278 [20:53:37<19:08:56, 4.52s/it] 55%|█████▌ | 19016/34278 [20:53:41<19:13:41, 4.54s/it] {'loss': 0.1257, 'grad_norm': 0.8463602905838344, 'learning_rate': 4.358085031502455e-06, 'epoch': 0.55} 55%|█████▌ | 19016/34278 [20:53:41<19:13:41, 4.54s/it] 55%|█████▌ | 19017/34278 [20:53:45<17:35:20, 4.15s/it] {'loss': 0.1378, 'grad_norm': 0.9155141260660282, 'learning_rate': 4.35761650976405e-06, 'epoch': 0.55} 55%|█████▌ | 19017/34278 [20:53:45<17:35:20, 4.15s/it] 55%|█████▌ | 19018/34278 [20:53:48<16:30:38, 3.90s/it] {'loss': 0.1275, 'grad_norm': 0.8735324406153037, 'learning_rate': 4.35714799376069e-06, 'epoch': 0.55} 55%|█████▌ | 19018/34278 [20:53:48<16:30:38, 3.90s/it] 55%|█████▌ | 19019/34278 [20:53:51<15:30:11, 3.66s/it] {'loss': 0.1326, 'grad_norm': 0.6755818049046284, 'learning_rate': 4.3566794834965616e-06, 'epoch': 0.55} 55%|█████▌ | 19019/34278 [20:53:51<15:30:11, 3.66s/it] 55%|█████▌ | 19020/34278 [20:53:55<15:24:30, 3.64s/it] {'loss': 0.1461, 'grad_norm': 0.9367120460741722, 'learning_rate': 4.3562109789758435e-06, 'epoch': 0.55} 55%|█████▌ | 19020/34278 [20:53:55<15:24:30, 3.64s/it] 55%|█████▌ | 19021/34278 [20:53:58<15:29:10, 3.65s/it] {'loss': 0.1312, 'grad_norm': 0.8767740890279596, 'learning_rate': 4.355742480202721e-06, 'epoch': 0.55} 55%|█████▌ | 19021/34278 [20:53:58<15:29:10, 3.65s/it] 55%|█████▌ | 19022/34278 [20:54:02<15:46:51, 3.72s/it] {'loss': 0.1407, 'grad_norm': 0.8035646964027161, 'learning_rate': 4.355273987181376e-06, 'epoch': 0.55} 55%|█████▌ | 19022/34278 [20:54:02<15:46:51, 3.72s/it] 55%|█████▌ | 
19023/34278 [20:54:08<18:35:54, 4.39s/it] {'loss': 0.1398, 'grad_norm': 0.807426950519346, 'learning_rate': 4.354805499915991e-06, 'epoch': 0.55} 55%|█████▌ | 19023/34278 [20:54:08<18:35:54, 4.39s/it] 55%|█████▌ | 19024/34278 [20:54:11<17:11:46, 4.06s/it] {'loss': 0.1381, 'grad_norm': 0.9118753614808602, 'learning_rate': 4.354337018410747e-06, 'epoch': 0.55} 55%|█████▌ | 19024/34278 [20:54:11<17:11:46, 4.06s/it] 56%|█████▌ | 19025/34278 [20:54:14<15:35:06, 3.68s/it] {'loss': 0.0961, 'grad_norm': 0.576043163429003, 'learning_rate': 4.353868542669828e-06, 'epoch': 0.56} 56%|█████▌ | 19025/34278 [20:54:14<15:35:06, 3.68s/it] 56%|█████▌ | 19026/34278 [20:54:19<16:39:50, 3.93s/it] {'loss': 0.1216, 'grad_norm': 0.6594052129528744, 'learning_rate': 4.353400072697418e-06, 'epoch': 0.56} 56%|█████▌ | 19026/34278 [20:54:19<16:39:50, 3.93s/it] 56%|█████▌ | 19027/34278 [20:54:23<16:36:16, 3.92s/it] {'loss': 0.1382, 'grad_norm': 0.7585226301717503, 'learning_rate': 4.352931608497698e-06, 'epoch': 0.56} 56%|█████▌ | 19027/34278 [20:54:23<16:36:16, 3.92s/it] 56%|█████▌ | 19028/34278 [20:54:26<15:52:28, 3.75s/it] {'loss': 0.1293, 'grad_norm': 0.6750017083918501, 'learning_rate': 4.3524631500748495e-06, 'epoch': 0.56} 56%|█████▌ | 19028/34278 [20:54:26<15:52:28, 3.75s/it] 56%|█████▌ | 19029/34278 [20:54:30<16:17:12, 3.84s/it] {'loss': 0.1114, 'grad_norm': 0.6961170966180397, 'learning_rate': 4.351994697433055e-06, 'epoch': 0.56} 56%|█████▌ | 19029/34278 [20:54:30<16:17:12, 3.84s/it] 56%|█████▌ | 19030/34278 [20:54:33<15:47:08, 3.73s/it] {'loss': 0.1234, 'grad_norm': 0.7360047297017995, 'learning_rate': 4.351526250576496e-06, 'epoch': 0.56} 56%|█████▌ | 19030/34278 [20:54:33<15:47:08, 3.73s/it] 56%|█████▌ | 19031/34278 [20:54:36<14:41:08, 3.47s/it] {'loss': 0.1342, 'grad_norm': 0.74104320568258, 'learning_rate': 4.351057809509357e-06, 'epoch': 0.56} 56%|█████▌ | 19031/34278 [20:54:36<14:41:08, 3.47s/it] 56%|█████▌ | 19032/34278 [20:54:39<14:15:12, 3.37s/it] {'loss': 0.1241, 
'grad_norm': 1.1616859823897157, 'learning_rate': 4.35058937423582e-06, 'epoch': 0.56} 56%|█████▌ | 19032/34278 [20:54:39<14:15:12, 3.37s/it] 56%|█████▌ | 19033/34278 [20:54:43<15:07:42, 3.57s/it] {'loss': 0.1146, 'grad_norm': 0.7534353068898046, 'learning_rate': 4.350120944760065e-06, 'epoch': 0.56} 56%|█████▌ | 19033/34278 [20:54:43<15:07:42, 3.57s/it] 56%|█████▌ | 19034/34278 [20:54:47<15:04:07, 3.56s/it] {'loss': 0.1346, 'grad_norm': 0.7359607953267799, 'learning_rate': 4.349652521086275e-06, 'epoch': 0.56} 56%|█████▌ | 19034/34278 [20:54:47<15:04:07, 3.56s/it] 56%|█████▌ | 19035/34278 [20:54:50<14:29:29, 3.42s/it] {'loss': 0.1226, 'grad_norm': 0.7887782773626593, 'learning_rate': 4.349184103218633e-06, 'epoch': 0.56} 56%|█████▌ | 19035/34278 [20:54:50<14:29:29, 3.42s/it] 56%|█████▌ | 19036/34278 [20:54:53<14:11:16, 3.35s/it] {'loss': 0.1244, 'grad_norm': 0.7908862976458872, 'learning_rate': 4.348715691161317e-06, 'epoch': 0.56} 56%|█████▌ | 19036/34278 [20:54:53<14:11:16, 3.35s/it] 56%|█████▌ | 19037/34278 [20:54:57<14:30:01, 3.43s/it] {'loss': 0.1386, 'grad_norm': 0.7437358248556468, 'learning_rate': 4.348247284918515e-06, 'epoch': 0.56} 56%|█████▌ | 19037/34278 [20:54:57<14:30:01, 3.43s/it] 56%|█████▌ | 19038/34278 [20:55:00<14:34:04, 3.44s/it] {'loss': 0.1319, 'grad_norm': 1.1328661443383496, 'learning_rate': 4.347778884494405e-06, 'epoch': 0.56} 56%|█████▌ | 19038/34278 [20:55:00<14:34:04, 3.44s/it] 56%|█████▌ | 19039/34278 [20:55:04<14:18:31, 3.38s/it] {'loss': 0.1029, 'grad_norm': 0.712735036495089, 'learning_rate': 4.347310489893169e-06, 'epoch': 0.56} 56%|█████▌ | 19039/34278 [20:55:04<14:18:31, 3.38s/it] 56%|█████▌ | 19040/34278 [20:55:07<14:04:02, 3.32s/it] {'loss': 0.1421, 'grad_norm': 0.7744272683956761, 'learning_rate': 4.346842101118991e-06, 'epoch': 0.56} 56%|█████▌ | 19040/34278 [20:55:07<14:04:02, 3.32s/it] 56%|█████▌ | 19041/34278 [20:55:09<13:17:07, 3.14s/it] {'loss': 0.1211, 'grad_norm': 0.778418849903219, 'learning_rate': 
4.346373718176049e-06, 'epoch': 0.56} 56%|█████▌ | 19041/34278 [20:55:09<13:17:07, 3.14s/it]
56%|█████▌ | 19042/34278 [20:55:13<13:11:54, 3.12s/it] {'loss': 0.1376, 'grad_norm': 0.9379960028819022, 'learning_rate': 4.345905341068525e-06, 'epoch': 0.56}
56%|█████▌ | 19043/34278 [20:55:16<12:58:40, 3.07s/it] {'loss': 0.1185, 'grad_norm': 0.7194867018599489, 'learning_rate': 4.345436969800603e-06, 'epoch': 0.56}
56%|█████▌ | 19044/34278 [20:55:19<13:52:38, 3.28s/it] {'loss': 0.1294, 'grad_norm': 0.7764692817658038, 'learning_rate': 4.344968604376465e-06, 'epoch': 0.56}
56%|█████▌ | 19045/34278 [20:55:23<14:44:49, 3.49s/it] {'loss': 0.1275, 'grad_norm': 1.0001105838790252, 'learning_rate': 4.34450024480029e-06, 'epoch': 0.56}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
56%|█████▌ | 19046/34278 [20:55:27<14:51:52, 3.51s/it] {'loss': 0.1137, 'grad_norm': 0.724988975142468, 'learning_rate': 4.34403189107626e-06, 'epoch': 0.56}
56%|█████▌ | 19047/34278 [20:55:31<15:12:02, 3.59s/it] {'loss': 0.1538, 'grad_norm': 0.7300662716982012, 'learning_rate': 4.343563543208557e-06, 'epoch': 0.56}
56%|█████▌ | 19048/34278 [20:55:35<15:52:08, 3.75s/it] {'loss': 0.1399, 'grad_norm': 0.8290228078879128, 'learning_rate': 4.343095201201361e-06, 'epoch': 0.56}
56%|█████▌ | 19049/34278 [20:55:38<14:40:54, 3.47s/it] {'loss': 0.1302, 'grad_norm': 0.8955325333341915, 'learning_rate': 4.342626865058856e-06, 'epoch': 0.56}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f433af3f600>
Failed to fetch sample 3187196. Exception: cannot identify image file <_io.BytesIO object at 0x7f433af3f600>
56%|█████▌ | 19050/34278 [20:55:42<16:06:14, 3.81s/it] {'loss': 0.1293, 'grad_norm': 0.7533257227519368, 'learning_rate': 4.34215853478522e-06, 'epoch': 0.56}
56%|█████▌ | 19051/34278 [20:55:46<16:44:51, 3.96s/it] {'loss': 0.1293, 'grad_norm': 0.8504209059512653, 'learning_rate': 4.341690210384636e-06, 'epoch': 0.56}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
56%|█████▌ | 19052/34278 [20:55:50<15:45:57, 3.73s/it] {'loss': 0.1199, 'grad_norm': 0.8174216954485575, 'learning_rate': 4.341221891861286e-06, 'epoch': 0.56}
56%|█████▌ | 19053/34278 [20:55:53<15:23:19, 3.64s/it] {'loss': 0.1335, 'grad_norm': 0.9584904670466751, 'learning_rate': 4.340753579219349e-06, 'epoch': 0.56}
56%|█████▌ | 19054/34278 [20:55:57<15:14:10, 3.60s/it] {'loss': 0.132, 'grad_norm': 0.7971548088341041, 'learning_rate': 4.340285272463005e-06, 'epoch': 0.56}
56%|█████▌ | 19055/34278 [20:56:00<14:23:55, 3.41s/it] {'loss': 0.1306, 'grad_norm': 1.372057586506304, 'learning_rate': 4.339816971596438e-06, 'epoch': 0.56}
56%|█████▌ | 19056/34278 [20:56:03<14:12:24, 3.36s/it] {'loss': 0.1601, 'grad_norm': 1.035451505593037, 'learning_rate':
4.339348676623826e-06, 'epoch': 0.56} 56%|█████▌ | 19056/34278 [20:56:03<14:12:24, 3.36s/it] 56%|█████▌ | 19057/34278 [20:56:08<16:06:10, 3.81s/it] {'loss': 0.1075, 'grad_norm': 0.6946171225751947, 'learning_rate': 4.3388803875493536e-06, 'epoch': 0.56} 56%|█████▌ | 19057/34278 [20:56:08<16:06:10, 3.81s/it] 56%|█████▌ | 19058/34278 [20:56:11<15:44:55, 3.73s/it] {'loss': 0.1052, 'grad_norm': 0.748041898500976, 'learning_rate': 4.338412104377198e-06, 'epoch': 0.56} 56%|█████▌ | 19058/34278 [20:56:11<15:44:55, 3.73s/it] 56%|█████▌ | 19059/34278 [20:56:14<14:56:29, 3.53s/it] {'loss': 0.1302, 'grad_norm': 0.8847162211447822, 'learning_rate': 4.337943827111542e-06, 'epoch': 0.56} 56%|█████▌ | 19059/34278 [20:56:14<14:56:29, 3.53s/it] 56%|█████▌ | 19060/34278 [20:56:18<14:46:42, 3.50s/it] {'loss': 0.1221, 'grad_norm': 0.9788611723862909, 'learning_rate': 4.337475555756563e-06, 'epoch': 0.56} 56%|█████▌ | 19060/34278 [20:56:18<14:46:42, 3.50s/it] 56%|█████▌ | 19061/34278 [20:56:21<14:38:34, 3.46s/it] {'loss': 0.1163, 'grad_norm': 0.7934920805269242, 'learning_rate': 4.3370072903164466e-06, 'epoch': 0.56} 56%|█████▌ | 19061/34278 [20:56:21<14:38:34, 3.46s/it] 56%|█████▌ | 19062/34278 [20:56:24<14:05:56, 3.34s/it] {'loss': 0.1303, 'grad_norm': 0.8060432545753894, 'learning_rate': 4.33653903079537e-06, 'epoch': 0.56} 56%|█████▌ | 19062/34278 [20:56:24<14:05:56, 3.34s/it] 56%|█████▌ | 19063/34278 [20:56:28<14:15:30, 3.37s/it] {'loss': 0.1045, 'grad_norm': 0.8694569583121312, 'learning_rate': 4.3360707771975154e-06, 'epoch': 0.56} 56%|█████▌ | 19063/34278 [20:56:28<14:15:30, 3.37s/it] 56%|█████▌ | 19064/34278 [20:56:31<14:35:06, 3.45s/it] {'loss': 0.1185, 'grad_norm': 0.9722308929764426, 'learning_rate': 4.335602529527061e-06, 'epoch': 0.56} 56%|█████▌ | 19064/34278 [20:56:31<14:35:06, 3.45s/it] 56%|█████▌ | 19065/34278 [20:56:35<14:55:07, 3.53s/it] {'loss': 0.1224, 'grad_norm': 0.7641553324482699, 'learning_rate': 4.335134287788191e-06, 'epoch': 0.56} 56%|█████▌ | 19065/34278 
[20:56:35<14:55:07, 3.53s/it] 56%|█████▌ | 19066/34278 [20:56:39<15:21:47, 3.64s/it] {'loss': 0.1439, 'grad_norm': 1.0635532818329378, 'learning_rate': 4.334666051985079e-06, 'epoch': 0.56} 56%|█████▌ | 19066/34278 [20:56:39<15:21:47, 3.64s/it] 56%|█████▌ | 19067/34278 [20:56:43<15:33:32, 3.68s/it] {'loss': 0.1094, 'grad_norm': 0.6870946879967289, 'learning_rate': 4.334197822121913e-06, 'epoch': 0.56} 56%|█████▌ | 19067/34278 [20:56:43<15:33:32, 3.68s/it] 56%|█████▌ | 19068/34278 [20:56:48<17:56:04, 4.24s/it] {'loss': 0.1229, 'grad_norm': 0.8234267835962014, 'learning_rate': 4.333729598202869e-06, 'epoch': 0.56} 56%|█████▌ | 19068/34278 [20:56:48<17:56:04, 4.24s/it] 56%|█████▌ | 19069/34278 [20:56:51<16:42:03, 3.95s/it] {'loss': 0.1426, 'grad_norm': 0.9770794180998047, 'learning_rate': 4.333261380232129e-06, 'epoch': 0.56} 56%|█████▌ | 19069/34278 [20:56:51<16:42:03, 3.95s/it] 56%|█████▌ | 19070/34278 [20:56:56<17:09:33, 4.06s/it] {'loss': 0.1423, 'grad_norm': 0.9482585300221368, 'learning_rate': 4.3327931682138725e-06, 'epoch': 0.56} 56%|█████▌ | 19070/34278 [20:56:56<17:09:33, 4.06s/it] 56%|█████▌ | 19071/34278 [20:56:59<16:06:02, 3.81s/it] {'loss': 0.1296, 'grad_norm': 0.9467254690847229, 'learning_rate': 4.3323249621522785e-06, 'epoch': 0.56} 56%|█████▌ | 19071/34278 [20:56:59<16:06:02, 3.81s/it] 56%|█████▌ | 19072/34278 [20:57:04<17:54:49, 4.24s/it] {'loss': 0.1211, 'grad_norm': 0.8977881860793262, 'learning_rate': 4.331856762051526e-06, 'epoch': 0.56} 56%|█████▌ | 19072/34278 [20:57:04<17:54:49, 4.24s/it] 56%|█████▌ | 19073/34278 [20:57:07<16:42:51, 3.96s/it] {'loss': 0.1155, 'grad_norm': 0.9323535480483929, 'learning_rate': 4.331388567915799e-06, 'epoch': 0.56} 56%|█████▌ | 19073/34278 [20:57:07<16:42:51, 3.96s/it] 56%|█████▌ | 19074/34278 [20:57:13<19:12:54, 4.55s/it] {'loss': 0.1324, 'grad_norm': 0.8592387935246127, 'learning_rate': 4.330920379749274e-06, 'epoch': 0.56} 56%|█████▌ | 19074/34278 [20:57:13<19:12:54, 4.55s/it] 56%|█████▌ | 19075/34278 
[20:57:17<18:09:53, 4.30s/it] {'loss': 0.1341, 'grad_norm': 1.0401763103605797, 'learning_rate': 4.330452197556134e-06, 'epoch': 0.56} 56%|█████▌ | 19075/34278 [20:57:17<18:09:53, 4.30s/it] 56%|█████▌ | 19076/34278 [20:57:20<16:11:15, 3.83s/it] {'loss': 0.1264, 'grad_norm': 0.8327513069803256, 'learning_rate': 4.329984021340557e-06, 'epoch': 0.56} 56%|█████▌ | 19076/34278 [20:57:20<16:11:15, 3.83s/it] 56%|█████▌ | 19077/34278 [20:57:23<14:57:19, 3.54s/it] {'loss': 0.1326, 'grad_norm': 0.7398323747336847, 'learning_rate': 4.329515851106721e-06, 'epoch': 0.56} 56%|█████▌ | 19077/34278 [20:57:23<14:57:19, 3.54s/it] 56%|█████▌ | 19078/34278 [20:57:26<14:09:51, 3.35s/it] {'loss': 0.1504, 'grad_norm': 0.8631278579939401, 'learning_rate': 4.329047686858807e-06, 'epoch': 0.56} 56%|█████▌ | 19078/34278 [20:57:26<14:09:51, 3.35s/it] 56%|█████▌ | 19079/34278 [20:57:29<14:28:41, 3.43s/it] {'loss': 0.1046, 'grad_norm': 0.9173061038077363, 'learning_rate': 4.328579528600997e-06, 'epoch': 0.56} 56%|█████▌ | 19079/34278 [20:57:29<14:28:41, 3.43s/it] 56%|█████▌ | 19080/34278 [20:57:35<17:48:08, 4.22s/it] {'loss': 0.1299, 'grad_norm': 0.8053387977027799, 'learning_rate': 4.328111376337468e-06, 'epoch': 0.56} 56%|█████▌ | 19080/34278 [20:57:35<17:48:08, 4.22s/it] 56%|█████▌ | 19081/34278 [20:57:38<15:53:32, 3.76s/it] {'loss': 0.1152, 'grad_norm': 0.7943544478077899, 'learning_rate': 4.3276432300723995e-06, 'epoch': 0.56} 56%|█████▌ | 19081/34278 [20:57:38<15:53:32, 3.76s/it] 56%|█████▌ | 19082/34278 [20:57:41<15:20:25, 3.63s/it] {'loss': 0.1134, 'grad_norm': 0.684900982092863, 'learning_rate': 4.327175089809973e-06, 'epoch': 0.56} 56%|█████▌ | 19082/34278 [20:57:41<15:20:25, 3.63s/it] 56%|█████▌ | 19083/34278 [20:57:45<15:37:05, 3.70s/it] {'loss': 0.1289, 'grad_norm': 1.0556320969726258, 'learning_rate': 4.3267069555543665e-06, 'epoch': 0.56} 56%|█████▌ | 19083/34278 [20:57:45<15:37:05, 3.70s/it] 56%|█████▌ | 19084/34278 [20:57:49<15:48:30, 3.75s/it] {'loss': 0.1095, 'grad_norm': 
0.8466057433921833, 'learning_rate': 4.326238827309758e-06, 'epoch': 0.56} 56%|█████▌ | 19084/34278 [20:57:49<15:48:30, 3.75s/it] 56%|█████▌ | 19085/34278 [20:57:52<14:49:17, 3.51s/it] {'loss': 0.1267, 'grad_norm': 0.7338247525685471, 'learning_rate': 4.3257707050803285e-06, 'epoch': 0.56} 56%|█████▌ | 19085/34278 [20:57:52<14:49:17, 3.51s/it] 56%|█████▌ | 19086/34278 [20:57:58<18:00:11, 4.27s/it] {'loss': 0.11, 'grad_norm': 0.7557776443026418, 'learning_rate': 4.325302588870258e-06, 'epoch': 0.56} 56%|█████▌ | 19086/34278 [20:57:58<18:00:11, 4.27s/it] 56%|█████▌ | 19087/34278 [20:58:01<16:19:26, 3.87s/it] {'loss': 0.1316, 'grad_norm': 0.8162613754395548, 'learning_rate': 4.324834478683726e-06, 'epoch': 0.56} 56%|█████▌ | 19087/34278 [20:58:01<16:19:26, 3.87s/it] 56%|█████▌ | 19088/34278 [20:58:07<19:06:17, 4.53s/it] {'loss': 0.1252, 'grad_norm': 0.7112984568497187, 'learning_rate': 4.32436637452491e-06, 'epoch': 0.56} 56%|█████▌ | 19088/34278 [20:58:07<19:06:17, 4.53s/it] 56%|█████▌ | 19089/34278 [20:58:10<16:53:16, 4.00s/it] {'loss': 0.1239, 'grad_norm': 0.945271187261099, 'learning_rate': 4.32389827639799e-06, 'epoch': 0.56} 56%|█████▌ | 19089/34278 [20:58:10<16:53:16, 4.00s/it] 56%|█████▌ | 19090/34278 [20:58:14<16:38:16, 3.94s/it] {'loss': 0.1198, 'grad_norm': 0.8873115463856694, 'learning_rate': 4.323430184307143e-06, 'epoch': 0.56} 56%|█████▌ | 19090/34278 [20:58:14<16:38:16, 3.94s/it] 56%|█████▌ | 19091/34278 [20:58:19<18:40:08, 4.43s/it] {'loss': 0.1345, 'grad_norm': 0.8925186333145753, 'learning_rate': 4.3229620982565505e-06, 'epoch': 0.56} 56%|█████▌ | 19091/34278 [20:58:19<18:40:08, 4.43s/it] 56%|█████▌ | 19092/34278 [20:58:25<20:03:06, 4.75s/it] {'loss': 0.1319, 'grad_norm': 1.267998504717158, 'learning_rate': 4.322494018250392e-06, 'epoch': 0.56} 56%|█████▌ | 19092/34278 [20:58:25<20:03:06, 4.75s/it] 56%|█████▌ | 19093/34278 [20:58:28<17:35:49, 4.17s/it] {'loss': 0.119, 'grad_norm': 0.913261745842966, 'learning_rate': 4.322025944292845e-06, 'epoch': 
0.56} 56%|█████▌ | 19093/34278 [20:58:28<17:35:49, 4.17s/it] 56%|█████▌ | 19094/34278 [20:58:30<15:48:48, 3.75s/it] {'loss': 0.1337, 'grad_norm': 0.7564078256122583, 'learning_rate': 4.321557876388087e-06, 'epoch': 0.56} 56%|█████▌ | 19094/34278 [20:58:30<15:48:48, 3.75s/it] 56%|█████▌ | 19095/34278 [20:58:33<14:59:33, 3.55s/it] {'loss': 0.1181, 'grad_norm': 0.9747580617501804, 'learning_rate': 4.321089814540301e-06, 'epoch': 0.56} 56%|█████▌ | 19095/34278 [20:58:33<14:59:33, 3.55s/it] 56%|█████▌ | 19096/34278 [20:58:37<14:34:19, 3.46s/it] {'loss': 0.1315, 'grad_norm': 0.9551944684281286, 'learning_rate': 4.320621758753659e-06, 'epoch': 0.56} 56%|█████▌ | 19096/34278 [20:58:37<14:34:19, 3.46s/it] 56%|█████▌ | 19097/34278 [20:58:43<18:07:41, 4.30s/it] {'loss': 0.1283, 'grad_norm': 0.7604880208963859, 'learning_rate': 4.320153709032347e-06, 'epoch': 0.56} 56%|█████▌ | 19097/34278 [20:58:43<18:07:41, 4.30s/it] 56%|█████▌ | 19098/34278 [20:58:46<16:03:10, 3.81s/it] {'loss': 0.1295, 'grad_norm': 0.8382857647702464, 'learning_rate': 4.319685665380539e-06, 'epoch': 0.56} 56%|█████▌ | 19098/34278 [20:58:46<16:03:10, 3.81s/it] 56%|█████▌ | 19099/34278 [20:58:52<18:55:48, 4.49s/it] {'loss': 0.1235, 'grad_norm': 1.203360923454032, 'learning_rate': 4.319217627802415e-06, 'epoch': 0.56} 56%|█████▌ | 19099/34278 [20:58:52<18:55:48, 4.49s/it] 56%|█████▌ | 19100/34278 [20:58:55<17:48:17, 4.22s/it] {'loss': 0.1329, 'grad_norm': 0.7630826778051315, 'learning_rate': 4.318749596302155e-06, 'epoch': 0.56} 56%|█████▌ | 19100/34278 [20:58:55<17:48:17, 4.22s/it] 56%|█████▌ | 19101/34278 [20:58:58<16:17:22, 3.86s/it] {'loss': 0.1294, 'grad_norm': 0.8794901446033415, 'learning_rate': 4.318281570883935e-06, 'epoch': 0.56} 56%|█████▌ | 19101/34278 [20:58:58<16:17:22, 3.86s/it] 56%|█████▌ | 19102/34278 [20:59:01<15:07:32, 3.59s/it] {'loss': 0.1419, 'grad_norm': 0.9443425549623128, 'learning_rate': 4.3178135515519336e-06, 'epoch': 0.56} 56%|█████▌ | 19102/34278 [20:59:01<15:07:32, 3.59s/it] 
56%|█████▌ | 19103/34278 [20:59:04<14:30:19, 3.44s/it] {'loss': 0.1355, 'grad_norm': 0.8137379571082658, 'learning_rate': 4.317345538310331e-06, 'epoch': 0.56} 56%|█████▌ | 19103/34278 [20:59:04<14:30:19, 3.44s/it] 56%|█████▌ | 19104/34278 [20:59:07<13:56:31, 3.31s/it] {'loss': 0.1333, 'grad_norm': 0.8853168441729572, 'learning_rate': 4.316877531163304e-06, 'epoch': 0.56} 56%|█████▌ | 19104/34278 [20:59:07<13:56:31, 3.31s/it] 56%|█████▌ | 19105/34278 [20:59:10<13:38:01, 3.23s/it] {'loss': 0.1289, 'grad_norm': 0.8321714517997488, 'learning_rate': 4.3164095301150325e-06, 'epoch': 0.56} 56%|█████▌ | 19105/34278 [20:59:10<13:38:01, 3.23s/it] 56%|█████▌ | 19106/34278 [20:59:13<13:06:37, 3.11s/it] {'loss': 0.1392, 'grad_norm': 0.9104922555381546, 'learning_rate': 4.315941535169692e-06, 'epoch': 0.56} 56%|█████▌ | 19106/34278 [20:59:13<13:06:37, 3.11s/it] 56%|█████▌ | 19107/34278 [20:59:16<13:20:08, 3.16s/it] {'loss': 0.1003, 'grad_norm': 0.7053054006086941, 'learning_rate': 4.315473546331463e-06, 'epoch': 0.56} 56%|█████▌ | 19107/34278 [20:59:16<13:20:08, 3.16s/it] 56%|█████▌ | 19108/34278 [20:59:20<14:03:02, 3.33s/it] {'loss': 0.1179, 'grad_norm': 0.7700656288780094, 'learning_rate': 4.315005563604521e-06, 'epoch': 0.56} 56%|█████▌ | 19108/34278 [20:59:20<14:03:02, 3.33s/it] 56%|█████▌ | 19109/34278 [20:59:23<13:53:58, 3.30s/it] {'loss': 0.1391, 'grad_norm': 0.7747077764319444, 'learning_rate': 4.314537586993048e-06, 'epoch': 0.56} 56%|█████▌ | 19109/34278 [20:59:23<13:53:58, 3.30s/it] 56%|█████▌ | 19110/34278 [20:59:27<14:13:02, 3.37s/it] {'loss': 0.116, 'grad_norm': 0.8512534505870097, 'learning_rate': 4.314069616501219e-06, 'epoch': 0.56} 56%|█████▌ | 19110/34278 [20:59:27<14:13:02, 3.37s/it] 56%|█████▌ | 19111/34278 [20:59:31<14:34:49, 3.46s/it] {'loss': 0.1532, 'grad_norm': 1.1915308090824996, 'learning_rate': 4.313601652133213e-06, 'epoch': 0.56} 56%|█████▌ | 19111/34278 [20:59:31<14:34:49, 3.46s/it] 56%|█████▌ | 19112/34278 [20:59:34<14:19:58, 3.40s/it] {'loss': 
0.1264, 'grad_norm': 0.7630239928530291, 'learning_rate': 4.3131336938932085e-06, 'epoch': 0.56} 56%|█████▌ | 19112/34278 [20:59:34<14:19:58, 3.40s/it] 56%|█████▌ | 19113/34278 [20:59:37<13:37:39, 3.24s/it] {'loss': 0.1414, 'grad_norm': 0.9180348198619798, 'learning_rate': 4.312665741785379e-06, 'epoch': 0.56} 56%|█████▌ | 19113/34278 [20:59:37<13:37:39, 3.24s/it] 56%|█████▌ | 19114/34278 [20:59:40<13:46:40, 3.27s/it] {'loss': 0.1146, 'grad_norm': 0.9850176224492503, 'learning_rate': 4.312197795813909e-06, 'epoch': 0.56} 56%|█████▌ | 19114/34278 [20:59:40<13:46:40, 3.27s/it] 56%|█████▌ | 19115/34278 [20:59:46<16:39:28, 3.95s/it] {'loss': 0.1441, 'grad_norm': 0.7933498422687582, 'learning_rate': 4.311729855982972e-06, 'epoch': 0.56} 56%|█████▌ | 19115/34278 [20:59:46<16:39:28, 3.95s/it] 56%|█████▌ | 19116/34278 [20:59:49<15:30:42, 3.68s/it] {'loss': 0.1252, 'grad_norm': 0.9139270458622918, 'learning_rate': 4.311261922296746e-06, 'epoch': 0.56} 56%|█████▌ | 19116/34278 [20:59:49<15:30:42, 3.68s/it] 56%|█████▌ | 19117/34278 [20:59:52<15:19:15, 3.64s/it] {'loss': 0.1584, 'grad_norm': 1.2188145977938503, 'learning_rate': 4.310793994759411e-06, 'epoch': 0.56} 56%|█████▌ | 19117/34278 [20:59:52<15:19:15, 3.64s/it] 56%|█████▌ | 19118/34278 [20:59:55<14:32:29, 3.45s/it] {'loss': 0.1238, 'grad_norm': 0.9690350110217198, 'learning_rate': 4.310326073375141e-06, 'epoch': 0.56} 56%|█████▌ | 19118/34278 [20:59:55<14:32:29, 3.45s/it] 56%|█████▌ | 19119/34278 [21:00:00<16:19:48, 3.88s/it] {'loss': 0.1283, 'grad_norm': 1.1264086158332027, 'learning_rate': 4.309858158148114e-06, 'epoch': 0.56} 56%|█████▌ | 19119/34278 [21:00:00<16:19:48, 3.88s/it] 56%|█████▌ | 19120/34278 [21:00:06<18:36:17, 4.42s/it] {'loss': 0.1215, 'grad_norm': 0.8654237947195278, 'learning_rate': 4.30939024908251e-06, 'epoch': 0.56} 56%|█████▌ | 19120/34278 [21:00:06<18:36:17, 4.42s/it] 56%|█████▌ | 19121/34278 [21:00:09<17:06:35, 4.06s/it] {'loss': 0.1098, 'grad_norm': 0.7326600570244524, 'learning_rate': 
4.308922346182505e-06, 'epoch': 0.56} 56%|█████▌ | 19121/34278 [21:00:09<17:06:35, 4.06s/it] 56%|█████▌ | 19122/34278 [21:00:15<19:34:51, 4.65s/it] {'loss': 0.1186, 'grad_norm': 0.7202669208213778, 'learning_rate': 4.308454449452277e-06, 'epoch': 0.56} 56%|█████▌ | 19122/34278 [21:00:15<19:34:51, 4.65s/it] 56%|█████▌ | 19123/34278 [21:00:19<18:20:01, 4.36s/it] {'loss': 0.1271, 'grad_norm': 1.0690084370132689, 'learning_rate': 4.3079865588960014e-06, 'epoch': 0.56} 56%|█████▌ | 19123/34278 [21:00:19<18:20:01, 4.36s/it] 56%|█████▌ | 19124/34278 [21:00:22<17:26:27, 4.14s/it] {'loss': 0.1361, 'grad_norm': 1.1007812742797274, 'learning_rate': 4.307518674517858e-06, 'epoch': 0.56} 56%|█████▌ | 19124/34278 [21:00:22<17:26:27, 4.14s/it] 56%|█████▌ | 19125/34278 [21:00:26<17:13:07, 4.09s/it] {'loss': 0.1235, 'grad_norm': 0.8787765921807975, 'learning_rate': 4.3070507963220195e-06, 'epoch': 0.56} 56%|█████▌ | 19125/34278 [21:00:26<17:13:07, 4.09s/it] 56%|█████▌ | 19126/34278 [21:00:32<19:18:11, 4.59s/it] {'loss': 0.1509, 'grad_norm': 0.8996433276085365, 'learning_rate': 4.3065829243126685e-06, 'epoch': 0.56} 56%|█████▌ | 19126/34278 [21:00:32<19:18:11, 4.59s/it] 56%|█████▌ | 19127/34278 [21:00:36<17:57:47, 4.27s/it] {'loss': 0.1583, 'grad_norm': 0.8878376716646226, 'learning_rate': 4.306115058493981e-06, 'epoch': 0.56} 56%|█████▌ | 19127/34278 [21:00:36<17:57:47, 4.27s/it] 56%|█████▌ | 19128/34278 [21:00:38<16:11:30, 3.85s/it] {'loss': 0.1302, 'grad_norm': 1.0487540761799035, 'learning_rate': 4.305647198870131e-06, 'epoch': 0.56} 56%|█████▌ | 19128/34278 [21:00:38<16:11:30, 3.85s/it] 56%|█████▌ | 19129/34278 [21:00:43<17:09:53, 4.08s/it] {'loss': 0.1521, 'grad_norm': 0.8405901697246108, 'learning_rate': 4.305179345445297e-06, 'epoch': 0.56} 56%|█████▌ | 19129/34278 [21:00:43<17:09:53, 4.08s/it] 56%|█████▌ | 19130/34278 [21:00:46<16:11:50, 3.85s/it] {'loss': 0.1411, 'grad_norm': 0.888072930235096, 'learning_rate': 4.304711498223656e-06, 'epoch': 0.56} 56%|█████▌ | 19130/34278 
[21:00:46<16:11:50, 3.85s/it] 56%|█████▌ | 19131/34278 [21:00:52<18:02:57, 4.29s/it] {'loss': 0.1314, 'grad_norm': 0.9305898421135123, 'learning_rate': 4.304243657209383e-06, 'epoch': 0.56} 56%|█████▌ | 19131/34278 [21:00:52<18:02:57, 4.29s/it] 56%|█████▌ | 19132/34278 [21:00:55<16:50:43, 4.00s/it] {'loss': 0.1215, 'grad_norm': 0.8085351979341941, 'learning_rate': 4.30377582240666e-06, 'epoch': 0.56} 56%|█████▌ | 19132/34278 [21:00:55<16:50:43, 4.00s/it] 56%|█████▌ | 19133/34278 [21:00:59<17:06:58, 4.07s/it] {'loss': 0.1419, 'grad_norm': 0.8572758553739566, 'learning_rate': 4.303307993819657e-06, 'epoch': 0.56} 56%|█████▌ | 19133/34278 [21:00:59<17:06:58, 4.07s/it] 56%|█████▌ | 19134/34278 [21:01:02<15:32:23, 3.69s/it] {'loss': 0.1329, 'grad_norm': 0.922083938679305, 'learning_rate': 4.302840171452556e-06, 'epoch': 0.56} 56%|█████▌ | 19134/34278 [21:01:02<15:32:23, 3.69s/it] 56%|█████▌ | 19135/34278 [21:01:05<14:59:19, 3.56s/it] {'loss': 0.1179, 'grad_norm': 0.8911884182439094, 'learning_rate': 4.302372355309532e-06, 'epoch': 0.56} 56%|█████▌ | 19135/34278 [21:01:05<14:59:19, 3.56s/it] 56%|█████▌ | 19136/34278 [21:01:08<14:25:36, 3.43s/it] {'loss': 0.141, 'grad_norm': 0.7111043411041209, 'learning_rate': 4.301904545394761e-06, 'epoch': 0.56} 56%|█████▌ | 19136/34278 [21:01:08<14:25:36, 3.43s/it] 56%|█████▌ | 19137/34278 [21:01:11<13:36:36, 3.24s/it] {'loss': 0.1394, 'grad_norm': 1.0692574535871096, 'learning_rate': 4.301436741712417e-06, 'epoch': 0.56} 56%|█████▌ | 19137/34278 [21:01:11<13:36:36, 3.24s/it] 56%|█████▌ | 19138/34278 [21:01:16<14:58:02, 3.56s/it] {'loss': 0.1555, 'grad_norm': 1.1114995483858565, 'learning_rate': 4.30096894426668e-06, 'epoch': 0.56} 56%|█████▌ | 19138/34278 [21:01:16<14:58:02, 3.56s/it] 56%|█████▌ | 19139/34278 [21:01:21<17:00:29, 4.04s/it] {'loss': 0.1215, 'grad_norm': 0.9175644226687056, 'learning_rate': 4.3005011530617275e-06, 'epoch': 0.56} 56%|█████▌ | 19139/34278 [21:01:21<17:00:29, 4.04s/it] 56%|█████▌ | 19140/34278 
[21:01:27<19:23:56, 4.61s/it] {'loss': 0.1245, 'grad_norm': 0.7594943854632864, 'learning_rate': 4.300033368101732e-06, 'epoch': 0.56} 56%|█████▌ | 19140/34278 [21:01:27<19:23:56, 4.61s/it] 56%|█████▌ | 19141/34278 [21:01:30<17:36:50, 4.19s/it] {'loss': 0.1371, 'grad_norm': 0.980480244557107, 'learning_rate': 4.299565589390872e-06, 'epoch': 0.56} 56%|█████▌ | 19141/34278 [21:01:30<17:36:50, 4.19s/it] 56%|█████▌ | 19142/34278 [21:01:33<16:01:04, 3.81s/it] {'loss': 0.1414, 'grad_norm': 0.9394728822043553, 'learning_rate': 4.299097816933323e-06, 'epoch': 0.56} 56%|█████▌ | 19142/34278 [21:01:33<16:01:04, 3.81s/it] 56%|█████▌ | 19143/34278 [21:01:36<15:05:24, 3.59s/it] {'loss': 0.17, 'grad_norm': 0.8717233172501493, 'learning_rate': 4.29863005073326e-06, 'epoch': 0.56} 56%|█████▌ | 19143/34278 [21:01:36<15:05:24, 3.59s/it] 56%|█████▌ | 19144/34278 [21:01:41<17:04:32, 4.06s/it] {'loss': 0.1597, 'grad_norm': 0.825546529492615, 'learning_rate': 4.2981622907948625e-06, 'epoch': 0.56} 56%|█████▌ | 19144/34278 [21:01:41<17:04:32, 4.06s/it] 56%|█████▌ | 19145/34278 [21:01:45<16:26:07, 3.91s/it] {'loss': 0.121, 'grad_norm': 1.3135694938829492, 'learning_rate': 4.297694537122304e-06, 'epoch': 0.56} 56%|█████▌ | 19145/34278 [21:01:45<16:26:07, 3.91s/it] 56%|█████▌ | 19146/34278 [21:01:47<15:05:50, 3.59s/it] {'loss': 0.1132, 'grad_norm': 0.808713343832058, 'learning_rate': 4.297226789719761e-06, 'epoch': 0.56} 56%|█████▌ | 19146/34278 [21:01:47<15:05:50, 3.59s/it] 56%|█████▌ | 19147/34278 [21:01:51<14:35:17, 3.47s/it] {'loss': 0.1202, 'grad_norm': 0.7237285483951936, 'learning_rate': 4.29675904859141e-06, 'epoch': 0.56} 56%|█████▌ | 19147/34278 [21:01:51<14:35:17, 3.47s/it] 56%|█████▌ | 19148/34278 [21:01:54<13:53:31, 3.31s/it] {'loss': 0.1301, 'grad_norm': 0.943673714823329, 'learning_rate': 4.296291313741425e-06, 'epoch': 0.56} 56%|█████▌ | 19148/34278 [21:01:54<13:53:31, 3.31s/it] 56%|█████▌ | 19149/34278 [21:01:57<14:23:00, 3.42s/it] {'loss': 0.1457, 'grad_norm': 
0.8394084451252932, 'learning_rate': 4.295823585173983e-06, 'epoch': 0.56} 56%|█████▌ | 19149/34278 [21:01:57<14:23:00, 3.42s/it] 56%|█████▌ | 19150/34278 [21:02:02<15:35:53, 3.71s/it] {'loss': 0.1091, 'grad_norm': 0.7437816288736493, 'learning_rate': 4.29535586289326e-06, 'epoch': 0.56} 56%|█████▌ | 19150/34278 [21:02:02<15:35:53, 3.71s/it] 56%|█████▌ | 19151/34278 [21:02:05<14:40:41, 3.49s/it] {'loss': 0.1285, 'grad_norm': 0.7471769505431773, 'learning_rate': 4.294888146903433e-06, 'epoch': 0.56} 56%|█████▌ | 19151/34278 [21:02:05<14:40:41, 3.49s/it] 56%|█████▌ | 19152/34278 [21:02:08<13:58:40, 3.33s/it] {'loss': 0.1166, 'grad_norm': 0.9376425501717874, 'learning_rate': 4.294420437208677e-06, 'epoch': 0.56} 56%|█████▌ | 19152/34278 [21:02:08<13:58:40, 3.33s/it] 56%|█████▌ | 19153/34278 [21:02:11<14:33:17, 3.46s/it] {'loss': 0.1237, 'grad_norm': 0.9560575967084503, 'learning_rate': 4.2939527338131654e-06, 'epoch': 0.56} 56%|█████▌ | 19153/34278 [21:02:11<14:33:17, 3.46s/it] 56%|█████▌ | 19154/34278 [21:02:14<13:51:23, 3.30s/it] {'loss': 0.1278, 'grad_norm': 1.322551694300141, 'learning_rate': 4.293485036721075e-06, 'epoch': 0.56} 56%|█████▌ | 19154/34278 [21:02:14<13:51:23, 3.30s/it] 56%|█████▌ | 19155/34278 [21:02:17<13:45:23, 3.27s/it] {'loss': 0.1293, 'grad_norm': 1.3297083397872205, 'learning_rate': 4.293017345936581e-06, 'epoch': 0.56} 56%|█████▌ | 19155/34278 [21:02:17<13:45:23, 3.27s/it] 56%|█████▌ | 19156/34278 [21:02:21<14:16:15, 3.40s/it] {'loss': 0.1148, 'grad_norm': 0.8807202298756728, 'learning_rate': 4.29254966146386e-06, 'epoch': 0.56} 56%|█████▌ | 19156/34278 [21:02:21<14:16:15, 3.40s/it] 56%|█████▌ | 19157/34278 [21:02:25<14:17:44, 3.40s/it] {'loss': 0.1433, 'grad_norm': 0.884337355599638, 'learning_rate': 4.292081983307088e-06, 'epoch': 0.56} 56%|█████▌ | 19157/34278 [21:02:25<14:17:44, 3.40s/it] 56%|█████▌ | 19158/34278 [21:02:28<14:01:47, 3.34s/it] {'loss': 0.1265, 'grad_norm': 0.8870220382296279, 'learning_rate': 4.291614311470438e-06, 
'epoch': 0.56} 56%|█████▌ | 19158/34278 [21:02:28<14:01:47, 3.34s/it]
56%|█████▌ | 19159/34278 [21:02:32<14:54:12, 3.55s/it] {'loss': 0.1561, 'grad_norm': 1.1241527600915935, 'learning_rate': 4.291146645958087e-06, 'epoch': 0.56}
56%|█████▌ | 19160/34278 [21:02:36<15:05:20, 3.59s/it] {'loss': 0.139, 'grad_norm': 1.0388864135881575, 'learning_rate': 4.29067898677421e-06, 'epoch': 0.56}
56%|█████▌ | 19161/34278 [21:02:39<14:37:54, 3.48s/it] {'loss': 0.1153, 'grad_norm': 0.7738380209356532, 'learning_rate': 4.2902113339229774e-06, 'epoch': 0.56}
56%|█████▌ | 19162/34278 [21:02:42<14:06:52, 3.36s/it] {'loss': 0.1336, 'grad_norm': 0.819588683616226, 'learning_rate': 4.2897436874085735e-06, 'epoch': 0.56}
Token indices sequence length is longer than the specified maximum sequence length for this model (11171 > 8192). Running this sequence through the model will result in indexing errors
56%|█████▌ | 19163/34278 [21:02:45<13:51:06, 3.30s/it] {'loss': 0.1464, 'grad_norm': 1.1735502412721954, 'learning_rate': 4.289276047235167e-06, 'epoch': 0.56}
56%|█████▌ | 19164/34278 [21:02:48<13:24:36, 3.19s/it] {'loss': 0.1264, 'grad_norm': 0.8005612962686929, 'learning_rate': 4.2888084134069335e-06, 'epoch': 0.56}
56%|█████▌ | 19165/34278 [21:02:52<14:13:52, 3.39s/it] {'loss': 0.1118, 'grad_norm': 0.8830956224221792, 'learning_rate': 4.28834078592805e-06, 'epoch': 0.56}
56%|█████▌ | 19166/34278 [21:02:55<13:35:33, 3.24s/it] {'loss': 0.133, 'grad_norm': 0.9955461519220135, 'learning_rate': 4.28787316480269e-06, 'epoch': 0.56}
56%|█████▌ | 19167/34278 [21:02:58<13:48:16, 3.29s/it] {'loss': 0.1243, 'grad_norm': 0.7082792680425669, 'learning_rate': 4.287405550035026e-06, 'epoch': 0.56}
56%|█████▌ | 19168/34278 [21:03:02<14:25:40, 3.44s/it] {'loss': 0.1194, 'grad_norm': 0.7681017037077732, 'learning_rate': 4.286937941629237e-06, 'epoch': 0.56}
56%|█████▌ | 19169/34278 [21:03:05<14:26:11, 3.44s/it] {'loss': 0.1609, 'grad_norm': 0.8187444008121079, 'learning_rate': 4.286470339589497e-06, 'epoch': 0.56}
56%|█████▌ | 19170/34278 [21:03:09<14:11:10, 3.38s/it] {'loss': 0.1358, 'grad_norm': 0.7842607479180885, 'learning_rate': 4.286002743919977e-06, 'epoch': 0.56}
56%|█████▌ | 19171/34278 [21:03:14<17:23:14, 4.14s/it] {'loss': 0.1118, 'grad_norm': 0.8434677490486692, 'learning_rate': 4.2855351546248555e-06, 'epoch': 0.56}
4.14s/it] 56%|█████▌ | 19172/34278 [21:03:20<18:55:44, 4.51s/it] {'loss': 0.1221, 'grad_norm': 0.6987501642828052, 'learning_rate': 4.285067571708307e-06, 'epoch': 0.56} 56%|█████▌ | 19172/34278 [21:03:20<18:55:44, 4.51s/it] 56%|█████▌ | 19173/34278 [21:03:23<16:48:46, 4.01s/it] {'loss': 0.1206, 'grad_norm': 0.9471406513575809, 'learning_rate': 4.2845999951744995e-06, 'epoch': 0.56} 56%|█████▌ | 19173/34278 [21:03:23<16:48:46, 4.01s/it] 56%|█████▌ | 19174/34278 [21:03:27<16:46:52, 4.00s/it] {'loss': 0.1422, 'grad_norm': 1.0593603487317498, 'learning_rate': 4.284132425027617e-06, 'epoch': 0.56} 56%|█████▌ | 19174/34278 [21:03:27<16:46:52, 4.00s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7facc1f17e70>
Failed to fetch sample 2898122. Exception: cannot identify image file <_io.BytesIO object at 0x7facc1f17e70>
56%|█████▌ | 19175/34278 [21:03:30<15:37:41, 3.73s/it] {'loss': 0.1103, 'grad_norm': 0.7583069093625302, 'learning_rate': 4.283664861271829e-06, 'epoch': 0.56} 56%|█████▌ | 19175/34278 [21:03:30<15:37:41, 3.73s/it] 56%|█████▌ | 19176/34278 [21:03:36<18:16:52, 4.36s/it] {'loss': 0.1276, 'grad_norm': 0.8101424809285023, 'learning_rate': 4.283197303911308e-06, 'epoch': 0.56} 56%|█████▌ | 19176/34278 [21:03:36<18:16:52, 4.36s/it] 56%|█████▌ | 19177/34278 [21:03:40<18:36:38, 4.44s/it] {'loss': 0.1434, 'grad_norm': 0.943264193812396, 'learning_rate': 4.282729752950233e-06, 'epoch': 0.56} 56%|█████▌ | 19177/34278 [21:03:40<18:36:38, 4.44s/it] 56%|█████▌ | 19178/34278 [21:03:44<17:28:44, 4.17s/it] {'loss': 0.1203, 'grad_norm': 0.824926061015487, 'learning_rate': 4.282262208392775e-06, 'epoch': 0.56} 56%|█████▌ | 19178/34278 [21:03:44<17:28:44, 4.17s/it] 56%|█████▌ | 19179/34278 [21:03:48<17:03:23, 4.07s/it] {'loss': 0.1026, 'grad_norm': 0.7954025563054928, 'learning_rate': 4.281794670243106e-06, 'epoch': 0.56} 56%|█████▌ | 19179/34278 [21:03:48<17:03:23, 4.07s/it] 56%|█████▌ | 19180/34278 [21:03:51<16:03:57, 3.83s/it] {'loss': 0.1344, 'grad_norm': 0.8435891204851553, 'learning_rate': 4.281327138505404e-06, 'epoch': 0.56} 56%|█████▌ | 19180/34278 [21:03:51<16:03:57, 3.83s/it] 56%|█████▌ | 19181/34278 [21:03:54<14:55:57, 3.56s/it] {'loss': 0.1279, 'grad_norm': 0.8847702424844895, 'learning_rate': 4.2808596131838425e-06, 'epoch': 0.56} 56%|█████▌ | 19181/34278 [21:03:54<14:55:57, 3.56s/it] 56%|█████▌ | 19182/34278 [21:03:57<14:20:19, 3.42s/it] {'loss': 0.1199, 'grad_norm': 1.370102585499779, 'learning_rate': 4.280392094282596e-06, 'epoch': 0.56} 56%|█████▌ | 19182/34278 [21:03:57<14:20:19, 3.42s/it] 56%|█████▌ | 19183/34278 [21:04:01<14:48:44, 3.53s/it] {'loss': 0.1456, 'grad_norm': 0.6293253879548771, 'learning_rate': 4.2799245818058345e-06, 'epoch': 0.56} 56%|█████▌ | 19183/34278
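The UnidentifiedImageError traceback above is caught by the dataset, which logs "Failed to fetch sample …" and keeps training. A minimal sketch of that recovery pattern, with hypothetical names (`pil_load`, `get_item_with_fallback` are illustrative, not the actual `dataset.py` helpers): catch the decode error and fall back to another sample instead of crashing the run.

```python
import io
import random
from PIL import Image, UnidentifiedImageError


def pil_load(raw_bytes: bytes) -> Image.Image:
    # Image.open raises UnidentifiedImageError when the bytes are not a
    # decodable image -- the same failure the log reports for sample 2898122.
    return Image.open(io.BytesIO(raw_bytes)).convert("RGB")


def get_item_with_fallback(dataset, index, max_retries=5):
    """Fetch dataset[index]; on a corrupt image, log and retry a random sample."""
    for _ in range(max_retries):
        try:
            return dataset[index]
        except UnidentifiedImageError as exc:
            print(f"Failed to fetch sample {index}. Exception: {exc}")
            index = random.randrange(len(dataset))
    raise RuntimeError(f"gave up after {max_retries} corrupt samples")
```

Swallowing the error keeps a multi-day run alive at the cost of silently skipping bad samples, so logging the failing index (as the log does) is important for cleaning the data later.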
[21:04:01<14:48:44, 3.53s/it] 56%|█████▌ | 19184/34278 [21:04:03<13:55:14, 3.32s/it] {'loss': 0.1244, 'grad_norm': 1.211199340368652, 'learning_rate': 4.279457075757736e-06, 'epoch': 0.56} 56%|█████▌ | 19184/34278 [21:04:03<13:55:14, 3.32s/it] 56%|█████▌ | 19185/34278 [21:04:07<13:38:56, 3.26s/it] {'loss': 0.1265, 'grad_norm': 0.8115015571584155, 'learning_rate': 4.278989576142471e-06, 'epoch': 0.56} 56%|█████▌ | 19185/34278 [21:04:07<13:38:56, 3.26s/it] 56%|█████▌ | 19186/34278 [21:04:10<14:28:50, 3.45s/it] {'loss': 0.1106, 'grad_norm': 0.7559107067971785, 'learning_rate': 4.278522082964216e-06, 'epoch': 0.56} 56%|█████▌ | 19186/34278 [21:04:11<14:28:50, 3.45s/it] 56%|█████▌ | 19187/34278 [21:04:14<14:11:36, 3.39s/it] {'loss': 0.1351, 'grad_norm': 0.7125935775902695, 'learning_rate': 4.278054596227144e-06, 'epoch': 0.56} 56%|█████▌ | 19187/34278 [21:04:14<14:11:36, 3.39s/it] 56%|█████▌ | 19188/34278 [21:04:20<17:28:19, 4.17s/it] {'loss': 0.12, 'grad_norm': 0.8288329012682432, 'learning_rate': 4.277587115935429e-06, 'epoch': 0.56} 56%|█████▌ | 19188/34278 [21:04:20<17:28:19, 4.17s/it] 56%|█████▌ | 19189/34278 [21:04:23<16:03:56, 3.83s/it] {'loss': 0.1122, 'grad_norm': 0.7560790380864659, 'learning_rate': 4.277119642093242e-06, 'epoch': 0.56} 56%|█████▌ | 19189/34278 [21:04:23<16:03:56, 3.83s/it] 56%|█████▌ | 19190/34278 [21:04:29<18:38:44, 4.45s/it] {'loss': 0.1538, 'grad_norm': 0.8941174187321133, 'learning_rate': 4.276652174704761e-06, 'epoch': 0.56} 56%|█████▌ | 19190/34278 [21:04:29<18:38:44, 4.45s/it] 56%|█████▌ | 19191/34278 [21:04:32<16:50:25, 4.02s/it] {'loss': 0.1106, 'grad_norm': 0.8135442525526468, 'learning_rate': 4.276184713774152e-06, 'epoch': 0.56} 56%|█████▌ | 19191/34278 [21:04:32<16:50:25, 4.02s/it] 56%|█████▌ | 19192/34278 [21:04:34<15:17:56, 3.65s/it] {'loss': 0.1329, 'grad_norm': 0.9668779603652046, 'learning_rate': 4.275717259305596e-06, 'epoch': 0.56} 56%|█████▌ | 19192/34278 [21:04:34<15:17:56, 3.65s/it] 56%|█████▌ | 19193/34278 
[21:04:39<16:04:15, 3.84s/it] {'loss': 0.1079, 'grad_norm': 0.9930314287723587, 'learning_rate': 4.275249811303265e-06, 'epoch': 0.56} 56%|█████▌ | 19193/34278 [21:04:39<16:04:15, 3.84s/it] 56%|█████▌ | 19194/34278 [21:04:42<15:27:09, 3.69s/it] {'loss': 0.1409, 'grad_norm': 0.7501561529837268, 'learning_rate': 4.274782369771328e-06, 'epoch': 0.56} 56%|█████▌ | 19194/34278 [21:04:42<15:27:09, 3.69s/it] 56%|█████▌ | 19195/34278 [21:04:45<14:37:25, 3.49s/it] {'loss': 0.1446, 'grad_norm': 1.0002464672194624, 'learning_rate': 4.2743149347139624e-06, 'epoch': 0.56} 56%|█████▌ | 19195/34278 [21:04:45<14:37:25, 3.49s/it] 56%|█████▌ | 19196/34278 [21:04:48<14:01:18, 3.35s/it] {'loss': 0.1245, 'grad_norm': 0.8574637381494113, 'learning_rate': 4.27384750613534e-06, 'epoch': 0.56} 56%|█████▌ | 19196/34278 [21:04:48<14:01:18, 3.35s/it] 56%|█████▌ | 19197/34278 [21:04:52<14:37:20, 3.49s/it] {'loss': 0.1222, 'grad_norm': 0.7935332889830304, 'learning_rate': 4.273380084039631e-06, 'epoch': 0.56} 56%|█████▌ | 19197/34278 [21:04:52<14:37:20, 3.49s/it] 56%|█████▌ | 19198/34278 [21:04:55<14:32:46, 3.47s/it] {'loss': 0.1355, 'grad_norm': 0.8225165613848285, 'learning_rate': 4.2729126684310136e-06, 'epoch': 0.56} 56%|█████▌ | 19198/34278 [21:04:55<14:32:46, 3.47s/it] 56%|█████▌ | 19199/34278 [21:04:59<14:17:18, 3.41s/it] {'loss': 0.1423, 'grad_norm': 1.0342259246357455, 'learning_rate': 4.272445259313659e-06, 'epoch': 0.56} 56%|█████▌ | 19199/34278 [21:04:59<14:17:18, 3.41s/it] 56%|█████▌ | 19200/34278 [21:05:02<14:12:05, 3.39s/it] {'loss': 0.1444, 'grad_norm': 0.9432765790313263, 'learning_rate': 4.271977856691738e-06, 'epoch': 0.56} 56%|█████▌ | 19200/34278 [21:05:02<14:12:05, 3.39s/it] 56%|█████▌ | 19201/34278 [21:05:05<13:48:30, 3.30s/it] {'loss': 0.1287, 'grad_norm': 0.803940543852876, 'learning_rate': 4.271510460569425e-06, 'epoch': 0.56} 56%|█████▌ | 19201/34278 [21:05:05<13:48:30, 3.30s/it] 56%|█████▌ | 19202/34278 [21:05:11<17:11:07, 4.10s/it] {'loss': 0.1278, 'grad_norm': 
0.9773705130442183, 'learning_rate': 4.271043070950894e-06, 'epoch': 0.56} 56%|█████▌ | 19202/34278 [21:05:11<17:11:07, 4.10s/it] 56%|█████▌ | 19203/34278 [21:05:17<19:31:52, 4.66s/it] {'loss': 0.1094, 'grad_norm': 0.7344465941221047, 'learning_rate': 4.270575687840312e-06, 'epoch': 0.56} 56%|█████▌ | 19203/34278 [21:05:17<19:31:52, 4.66s/it] 56%|█████▌ | 19204/34278 [21:05:20<17:30:28, 4.18s/it] {'loss': 0.1309, 'grad_norm': 0.8838434826550736, 'learning_rate': 4.270108311241861e-06, 'epoch': 0.56} 56%|█████▌ | 19204/34278 [21:05:20<17:30:28, 4.18s/it] 56%|█████▌ | 19205/34278 [21:05:23<15:58:33, 3.82s/it] {'loss': 0.141, 'grad_norm': 0.8994255625391175, 'learning_rate': 4.269640941159707e-06, 'epoch': 0.56} 56%|█████▌ | 19205/34278 [21:05:23<15:58:33, 3.82s/it] 56%|█████▌ | 19206/34278 [21:05:29<18:43:22, 4.47s/it] {'loss': 0.1323, 'grad_norm': 0.9732707594536684, 'learning_rate': 4.269173577598025e-06, 'epoch': 0.56} 56%|█████▌ | 19206/34278 [21:05:29<18:43:22, 4.47s/it] 56%|█████▌ | 19207/34278 [21:05:35<20:28:25, 4.89s/it] {'loss': 0.1206, 'grad_norm': 0.7352430477521541, 'learning_rate': 4.268706220560988e-06, 'epoch': 0.56} 56%|█████▌ | 19207/34278 [21:05:35<20:28:25, 4.89s/it] 56%|█████▌ | 19208/34278 [21:05:39<19:10:35, 4.58s/it] {'loss': 0.1341, 'grad_norm': 1.1259196559999454, 'learning_rate': 4.268238870052765e-06, 'epoch': 0.56} 56%|█████▌ | 19208/34278 [21:05:39<19:10:35, 4.58s/it] 56%|█████▌ | 19209/34278 [21:05:42<17:45:47, 4.24s/it] {'loss': 0.1619, 'grad_norm': 1.0097374425626127, 'learning_rate': 4.26777152607753e-06, 'epoch': 0.56} 56%|█████▌ | 19209/34278 [21:05:42<17:45:47, 4.24s/it] 56%|█████▌ | 19210/34278 [21:05:46<16:39:30, 3.98s/it] {'loss': 0.1451, 'grad_norm': 0.9265512576160485, 'learning_rate': 4.2673041886394575e-06, 'epoch': 0.56} 56%|█████▌ | 19210/34278 [21:05:46<16:39:30, 3.98s/it] 56%|█████▌ | 19211/34278 [21:05:49<16:01:19, 3.83s/it] {'loss': 0.1206, 'grad_norm': 0.9950666042745812, 'learning_rate': 4.266836857742718e-06, 
'epoch': 0.56} 56%|█████▌ | 19211/34278 [21:05:49<16:01:19, 3.83s/it] 56%|█████▌ | 19212/34278 [21:05:52<15:26:02, 3.69s/it] {'loss': 0.1251, 'grad_norm': 1.1128922622827953, 'learning_rate': 4.266369533391485e-06, 'epoch': 0.56} 56%|█████▌ | 19212/34278 [21:05:52<15:26:02, 3.69s/it] 56%|█████▌ | 19213/34278 [21:05:56<14:44:52, 3.52s/it] {'loss': 0.1301, 'grad_norm': 0.9989686876612985, 'learning_rate': 4.265902215589929e-06, 'epoch': 0.56} 56%|█████▌ | 19213/34278 [21:05:56<14:44:52, 3.52s/it] 56%|█████▌ | 19214/34278 [21:05:58<13:53:52, 3.32s/it] {'loss': 0.1113, 'grad_norm': 0.7475467204018343, 'learning_rate': 4.265434904342223e-06, 'epoch': 0.56} 56%|█████▌ | 19214/34278 [21:05:58<13:53:52, 3.32s/it] 56%|█████▌ | 19215/34278 [21:06:02<14:08:30, 3.38s/it] {'loss': 0.1591, 'grad_norm': 0.9076958632300759, 'learning_rate': 4.264967599652537e-06, 'epoch': 0.56} 56%|█████▌ | 19215/34278 [21:06:02<14:08:30, 3.38s/it] 56%|█████▌ | 19216/34278 [21:06:05<14:00:19, 3.35s/it] {'loss': 0.1467, 'grad_norm': 1.1850395940739917, 'learning_rate': 4.264500301525047e-06, 'epoch': 0.56} 56%|█████▌ | 19216/34278 [21:06:05<14:00:19, 3.35s/it] 56%|█████▌ | 19217/34278 [21:06:11<16:40:22, 3.99s/it] {'loss': 0.1459, 'grad_norm': 0.8431486816914163, 'learning_rate': 4.264033009963922e-06, 'epoch': 0.56} 56%|█████▌ | 19217/34278 [21:06:11<16:40:22, 3.99s/it] 56%|█████▌ | 19218/34278 [21:06:17<19:32:21, 4.67s/it] {'loss': 0.1377, 'grad_norm': 0.8584084368418102, 'learning_rate': 4.263565724973335e-06, 'epoch': 0.56} 56%|█████▌ | 19218/34278 [21:06:17<19:32:21, 4.67s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
56%|█████▌ | 19219/34278 [21:06:20<17:26:15, 4.17s/it] {'loss': 0.1095, 'grad_norm': 0.9314764561785516, 'learning_rate': 4.2630984465574565e-06, 'epoch': 0.56} 56%|█████▌ | 19219/34278 [21:06:20<17:26:15, 4.17s/it] 56%|█████▌ | 19220/34278 [21:06:23<15:34:13, 3.72s/it] {'loss': 0.1238, 'grad_norm': 0.8418468200486386, 'learning_rate': 4.262631174720461e-06, 'epoch': 0.56} 56%|█████▌ | 19220/34278 [21:06:23<15:34:13, 3.72s/it] 56%|█████▌ | 19221/34278 [21:06:27<16:39:17, 3.98s/it] {'loss': 0.1225, 'grad_norm': 0.8403701956729118, 'learning_rate': 4.262163909466514e-06, 'epoch': 0.56} 56%|█████▌ | 19221/34278 [21:06:27<16:39:17, 3.98s/it] 56%|█████▌ | 19222/34278 [21:06:34<19:37:29, 4.69s/it] {'loss': 0.1478, 'grad_norm': 1.029647917924995, 'learning_rate': 4.261696650799796e-06, 'epoch': 0.56} 56%|█████▌ | 19222/34278 [21:06:34<19:37:29, 4.69s/it] 56%|█████▌ | 19223/34278 [21:06:37<18:08:44, 4.34s/it] {'loss': 0.1488, 'grad_norm': 0.6898737406749272, 'learning_rate': 4.2612293987244724e-06, 'epoch': 0.56} 56%|█████▌ | 19223/34278 [21:06:37<18:08:44, 4.34s/it] 56%|█████▌ | 19224/34278 [21:06:40<16:22:23, 3.92s/it] {'loss': 0.1485, 'grad_norm': 0.899877041059888, 'learning_rate': 4.2607621532447165e-06, 'epoch': 0.56} 56%|█████▌ | 19224/34278 [21:06:40<16:22:23, 3.92s/it] 56%|█████▌ | 19225/34278 [21:06:43<15:12:39, 3.64s/it] {'loss': 0.1338, 'grad_norm': 0.7953760056174247, 'learning_rate': 4.260294914364701e-06, 'epoch': 0.56} 56%|█████▌ | 19225/34278 [21:06:43<15:12:39, 3.64s/it] 56%|█████▌ | 19226/34278 [21:06:47<15:24:25, 3.68s/it] {'loss': 0.123, 'grad_norm': 0.5909386877772858, 'learning_rate': 4.259827682088594e-06, 'epoch': 0.56} 56%|█████▌ | 19226/34278 [21:06:47<15:24:25, 3.68s/it] 56%|█████▌ | 19227/34278 [21:06:50<15:01:05, 3.59s/it] {'loss': 0.1089, 'grad_norm': 0.6730042108219717, 'learning_rate': 4.259360456420568e-06, 'epoch': 0.56} 56%|█████▌ | 19227/34278 [21:06:50<15:01:05, 3.59s/it] 56%|█████▌ | 19228/34278
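The UserWarning above from torch/utils/checkpoint.py fires when a checkpointed segment receives only tensors with `requires_grad=False`, a common symptom when gradient checkpointing is enabled while the module feeding the checkpointed block (e.g. frozen embeddings) produces detached activations. A minimal sketch of the mechanics, independent of this training script: `checkpoint()` recomputes the wrapped module's activations during backward instead of storing them, and it only has something to differentiate when at least one input carries `requires_grad=True`.

```python
import torch
from torch.utils.checkpoint import checkpoint

# A tiny checkpointed block: activations inside `layer` are recomputed in the
# backward pass rather than kept in memory during forward.
layer = torch.nn.Linear(4, 4)

# Input that requires grad -> no "None of the inputs have requires_grad=True"
# warning, and gradients flow to both the input and the layer parameters.
x = torch.randn(2, 4, requires_grad=True)
y = checkpoint(layer, x, use_reentrant=False)
y.sum().backward()
```

With a plain `torch.randn(2, 4)` input (no grad), the non-checkpointed parameters would still train, but the checkpointed segment emits exactly the warning seen in the log.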
[21:06:53<14:14:01, 3.40s/it] {'loss': 0.1294, 'grad_norm': 0.8464745304806932, 'learning_rate': 4.258893237364796e-06, 'epoch': 0.56} 56%|█████▌ | 19228/34278 [21:06:53<14:14:01, 3.40s/it] 56%|█████▌ | 19229/34278 [21:06:56<13:50:24, 3.31s/it] {'loss': 0.1055, 'grad_norm': 0.6476066777922179, 'learning_rate': 4.25842602492545e-06, 'epoch': 0.56} 56%|█████▌ | 19229/34278 [21:06:56<13:50:24, 3.31s/it] 56%|█████▌ | 19230/34278 [21:07:00<13:56:03, 3.33s/it] {'loss': 0.141, 'grad_norm': 1.07844491737013, 'learning_rate': 4.257958819106698e-06, 'epoch': 0.56} 56%|█████▌ | 19230/34278 [21:07:00<13:56:03, 3.33s/it] 56%|█████▌ | 19231/34278 [21:07:03<14:38:23, 3.50s/it] {'loss': 0.1366, 'grad_norm': 0.7563219447049566, 'learning_rate': 4.257491619912712e-06, 'epoch': 0.56} 56%|█████▌ | 19231/34278 [21:07:03<14:38:23, 3.50s/it] 56%|█████▌ | 19232/34278 [21:07:07<14:54:37, 3.57s/it] {'loss': 0.1219, 'grad_norm': 0.7512824230749573, 'learning_rate': 4.257024427347665e-06, 'epoch': 0.56} 56%|█████▌ | 19232/34278 [21:07:07<14:54:37, 3.57s/it] 56%|█████▌ | 19233/34278 [21:07:12<16:46:28, 4.01s/it] {'loss': 0.1316, 'grad_norm': 0.7878563773671461, 'learning_rate': 4.256557241415724e-06, 'epoch': 0.56} 56%|█████▌ | 19233/34278 [21:07:12<16:46:28, 4.01s/it] 56%|█████▌ | 19234/34278 [21:07:15<15:44:31, 3.77s/it] {'loss': 0.1207, 'grad_norm': 0.7928603201442495, 'learning_rate': 4.256090062121065e-06, 'epoch': 0.56} 56%|█████▌ | 19234/34278 [21:07:15<15:44:31, 3.77s/it] 56%|█████▌ | 19235/34278 [21:07:19<15:24:41, 3.69s/it] {'loss': 0.1445, 'grad_norm': 0.8322238051894852, 'learning_rate': 4.255622889467855e-06, 'epoch': 0.56} 56%|█████▌ | 19235/34278 [21:07:19<15:24:41, 3.69s/it] 56%|█████▌ | 19236/34278 [21:07:24<16:42:24, 4.00s/it] {'loss': 0.1194, 'grad_norm': 0.8572578634645005, 'learning_rate': 4.255155723460267e-06, 'epoch': 0.56} 56%|█████▌ | 19236/34278 [21:07:24<16:42:24, 4.00s/it] 56%|█████▌ | 19237/34278 [21:07:27<15:50:18, 3.79s/it] {'loss': 0.1489, 'grad_norm': 
1.0490382995322687, 'learning_rate': 4.254688564102471e-06, 'epoch': 0.56} 56%|█████▌ | 19237/34278 [21:07:27<15:50:18, 3.79s/it] 56%|█████▌ | 19238/34278 [21:07:30<14:41:38, 3.52s/it] {'loss': 0.1455, 'grad_norm': 0.7490484906454387, 'learning_rate': 4.254221411398637e-06, 'epoch': 0.56} 56%|█████▌ | 19238/34278 [21:07:30<14:41:38, 3.52s/it] 56%|█████▌ | 19239/34278 [21:07:34<15:12:33, 3.64s/it] {'loss': 0.1337, 'grad_norm': 0.7888836109931429, 'learning_rate': 4.253754265352936e-06, 'epoch': 0.56} 56%|█████▌ | 19239/34278 [21:07:34<15:12:33, 3.64s/it] 56%|█████▌ | 19240/34278 [21:07:39<17:36:42, 4.22s/it] {'loss': 0.1223, 'grad_norm': 0.7320539147918943, 'learning_rate': 4.253287125969539e-06, 'epoch': 0.56} 56%|█████▌ | 19240/34278 [21:07:39<17:36:42, 4.22s/it] 56%|█████▌ | 19241/34278 [21:07:43<17:19:01, 4.15s/it] {'loss': 0.1507, 'grad_norm': 0.9379249120352995, 'learning_rate': 4.252819993252616e-06, 'epoch': 0.56} 56%|█████▌ | 19241/34278 [21:07:43<17:19:01, 4.15s/it] 56%|█████▌ | 19242/34278 [21:07:46<16:01:02, 3.83s/it] {'loss': 0.1166, 'grad_norm': 0.6936519850642795, 'learning_rate': 4.252352867206339e-06, 'epoch': 0.56} 56%|█████▌ | 19242/34278 [21:07:46<16:01:02, 3.83s/it] 56%|█████▌ | 19243/34278 [21:07:51<17:17:37, 4.14s/it] {'loss': 0.1036, 'grad_norm': 0.7324403816158332, 'learning_rate': 4.251885747834876e-06, 'epoch': 0.56} 56%|█████▌ | 19243/34278 [21:07:51<17:17:37, 4.14s/it] 56%|█████▌ | 19244/34278 [21:07:54<15:56:19, 3.82s/it] {'loss': 0.1256, 'grad_norm': 1.0492171542944357, 'learning_rate': 4.251418635142399e-06, 'epoch': 0.56} 56%|█████▌ | 19244/34278 [21:07:54<15:56:19, 3.82s/it] 56%|█████▌ | 19245/34278 [21:07:58<15:51:28, 3.80s/it] {'loss': 0.1481, 'grad_norm': 0.6966023763959978, 'learning_rate': 4.250951529133076e-06, 'epoch': 0.56} 56%|█████▌ | 19245/34278 [21:07:58<15:51:28, 3.80s/it] 56%|█████▌ | 19246/34278 [21:08:01<14:39:47, 3.51s/it] {'loss': 0.128, 'grad_norm': 0.8778409784254735, 'learning_rate': 4.25048442981108e-06, 
'epoch': 0.56} 56%|█████▌ | 19246/34278 [21:08:01<14:39:47, 3.51s/it] 56%|█████▌ | 19247/34278 [21:08:07<17:57:18, 4.30s/it] {'loss': 0.1057, 'grad_norm': 0.8473416623500546, 'learning_rate': 4.250017337180582e-06, 'epoch': 0.56} 56%|█████▌ | 19247/34278 [21:08:07<17:57:18, 4.30s/it] 56%|█████▌ | 19248/34278 [21:08:10<16:19:10, 3.91s/it] {'loss': 0.1228, 'grad_norm': 0.6725343787609986, 'learning_rate': 4.249550251245748e-06, 'epoch': 0.56} 56%|█████▌ | 19248/34278 [21:08:10<16:19:10, 3.91s/it] 56%|█████▌ | 19249/34278 [21:08:14<15:48:52, 3.79s/it] {'loss': 0.1308, 'grad_norm': 0.7639821480113129, 'learning_rate': 4.2490831720107514e-06, 'epoch': 0.56} 56%|█████▌ | 19249/34278 [21:08:14<15:48:52, 3.79s/it] 56%|█████▌ | 19250/34278 [21:08:17<14:58:21, 3.59s/it] {'loss': 0.1282, 'grad_norm': 0.8546762631832446, 'learning_rate': 4.248616099479761e-06, 'epoch': 0.56} 56%|█████▌ | 19250/34278 [21:08:17<14:58:21, 3.59s/it] 56%|█████▌ | 19251/34278 [21:08:20<15:11:14, 3.64s/it] {'loss': 0.1381, 'grad_norm': 0.7126137860181593, 'learning_rate': 4.248149033656944e-06, 'epoch': 0.56} 56%|█████▌ | 19251/34278 [21:08:20<15:11:14, 3.64s/it] 56%|█████▌ | 19252/34278 [21:08:24<15:00:31, 3.60s/it] {'loss': 0.1258, 'grad_norm': 0.8070827937807697, 'learning_rate': 4.247681974546476e-06, 'epoch': 0.56} 56%|█████▌ | 19252/34278 [21:08:24<15:00:31, 3.60s/it] 56%|█████▌ | 19253/34278 [21:08:29<17:09:49, 4.11s/it] {'loss': 0.1285, 'grad_norm': 0.8765076472300789, 'learning_rate': 4.247214922152523e-06, 'epoch': 0.56} 56%|█████▌ | 19253/34278 [21:08:29<17:09:49, 4.11s/it] 56%|█████▌ | 19254/34278 [21:08:34<17:29:37, 4.19s/it] {'loss': 0.1351, 'grad_norm': 0.7976205097523037, 'learning_rate': 4.246747876479255e-06, 'epoch': 0.56} 56%|█████▌ | 19254/34278 [21:08:34<17:29:37, 4.19s/it] 56%|█████▌ | 19255/34278 [21:08:37<16:28:09, 3.95s/it] {'loss': 0.1523, 'grad_norm': 0.7308835986813395, 'learning_rate': 4.246280837530843e-06, 'epoch': 0.56} 56%|█████▌ | 19255/34278 [21:08:37<16:28:09, 
3.95s/it] 56%|█████▌ | 19256/34278 [21:08:41<16:39:16, 3.99s/it] {'loss': 0.1277, 'grad_norm': 0.8392817980715473, 'learning_rate': 4.245813805311455e-06, 'epoch': 0.56} 56%|█████▌ | 19256/34278 [21:08:41<16:39:16, 3.99s/it] 56%|█████▌ | 19257/34278 [21:08:45<15:55:59, 3.82s/it] {'loss': 0.1427, 'grad_norm': 0.6953263947730987, 'learning_rate': 4.245346779825261e-06, 'epoch': 0.56} 56%|█████▌ | 19257/34278 [21:08:45<15:55:59, 3.82s/it] 56%|█████▌ | 19258/34278 [21:08:48<15:07:07, 3.62s/it] {'loss': 0.1386, 'grad_norm': 0.7919335657714905, 'learning_rate': 4.244879761076431e-06, 'epoch': 0.56} 56%|█████▌ | 19258/34278 [21:08:48<15:07:07, 3.62s/it] 56%|█████▌ | 19259/34278 [21:08:52<15:48:06, 3.79s/it] {'loss': 0.141, 'grad_norm': 0.7850194959804362, 'learning_rate': 4.244412749069136e-06, 'epoch': 0.56} 56%|█████▌ | 19259/34278 [21:08:52<15:48:06, 3.79s/it] 56%|█████▌ | 19260/34278 [21:08:55<14:43:59, 3.53s/it] {'loss': 0.1231, 'grad_norm': 0.8078149370581779, 'learning_rate': 4.2439457438075415e-06, 'epoch': 0.56} 56%|█████▌ | 19260/34278 [21:08:55<14:43:59, 3.53s/it] 56%|█████▌ | 19261/34278 [21:08:58<14:46:39, 3.54s/it] {'loss': 0.1432, 'grad_norm': 0.8211010678317224, 'learning_rate': 4.243478745295819e-06, 'epoch': 0.56} 56%|█████▌ | 19261/34278 [21:08:58<14:46:39, 3.54s/it] 56%|█████▌ | 19262/34278 [21:09:01<13:55:34, 3.34s/it] {'loss': 0.1204, 'grad_norm': 0.7373873299260599, 'learning_rate': 4.243011753538139e-06, 'epoch': 0.56} 56%|█████▌ | 19262/34278 [21:09:01<13:55:34, 3.34s/it] 56%|█████▌ | 19263/34278 [21:09:05<13:58:22, 3.35s/it] {'loss': 0.1326, 'grad_norm': 0.9526800720565995, 'learning_rate': 4.242544768538667e-06, 'epoch': 0.56} 56%|█████▌ | 19263/34278 [21:09:05<13:58:22, 3.35s/it] 56%|█████▌ | 19264/34278 [21:09:08<13:32:52, 3.25s/it] {'loss': 0.1221, 'grad_norm': 0.8399968829161019, 'learning_rate': 4.2420777903015765e-06, 'epoch': 0.56} 56%|█████▌ | 19264/34278 [21:09:08<13:32:52, 3.25s/it] 56%|█████▌ | 19265/34278 [21:09:14<16:57:37, 
4.07s/it] {'loss': 0.1368, 'grad_norm': 0.8356319202526975, 'learning_rate': 4.241610818831034e-06, 'epoch': 0.56} 56%|█████▌ | 19265/34278 [21:09:14<16:57:37, 4.07s/it] 56%|█████▌ | 19266/34278 [21:09:17<15:38:43, 3.75s/it] {'loss': 0.1593, 'grad_norm': 0.8500670034443689, 'learning_rate': 4.241143854131209e-06, 'epoch': 0.56} 56%|█████▌ | 19266/34278 [21:09:17<15:38:43, 3.75s/it] 56%|█████▌ | 19267/34278 [21:09:20<14:37:36, 3.51s/it] {'loss': 0.1282, 'grad_norm': 0.7891021294403209, 'learning_rate': 4.240676896206272e-06, 'epoch': 0.56} 56%|█████▌ | 19267/34278 [21:09:20<14:37:36, 3.51s/it] 56%|█████▌ | 19268/34278 [21:09:23<14:58:30, 3.59s/it] {'loss': 0.1212, 'grad_norm': 0.7078533875865706, 'learning_rate': 4.240209945060389e-06, 'epoch': 0.56} 56%|█████▌ | 19268/34278 [21:09:23<14:58:30, 3.59s/it] 56%|█████▌ | 19269/34278 [21:09:27<14:55:50, 3.58s/it] {'loss': 0.1327, 'grad_norm': 1.0192459342862867, 'learning_rate': 4.239743000697729e-06, 'epoch': 0.56} 56%|█████▌ | 19269/34278 [21:09:27<14:55:50, 3.58s/it] 56%|█████▌ | 19270/34278 [21:09:30<14:24:40, 3.46s/it] {'loss': 0.1189, 'grad_norm': 1.266716519773206, 'learning_rate': 4.2392760631224635e-06, 'epoch': 0.56} 56%|█████▌ | 19270/34278 [21:09:30<14:24:40, 3.46s/it] 56%|█████▌ | 19271/34278 [21:09:33<13:28:09, 3.23s/it] {'loss': 0.1145, 'grad_norm': 0.9068333846726923, 'learning_rate': 4.2388091323387595e-06, 'epoch': 0.56} 56%|█████▌ | 19271/34278 [21:09:33<13:28:09, 3.23s/it] 56%|█████▌ | 19272/34278 [21:09:36<13:10:08, 3.16s/it] {'loss': 0.1341, 'grad_norm': 0.7834339806082172, 'learning_rate': 4.238342208350786e-06, 'epoch': 0.56} 56%|█████▌ | 19272/34278 [21:09:36<13:10:08, 3.16s/it] 56%|█████▌ | 19273/34278 [21:09:39<12:45:51, 3.06s/it] {'loss': 0.1173, 'grad_norm': 1.0955118180741146, 'learning_rate': 4.237875291162712e-06, 'epoch': 0.56} 56%|█████▌ | 19273/34278 [21:09:39<12:45:51, 3.06s/it] 56%|█████▌ | 19274/34278 [21:09:42<13:36:32, 3.27s/it] {'loss': 0.1154, 'grad_norm': 1.1646390911415243, 
'learning_rate': 4.237408380778705e-06, 'epoch': 0.56} 56%|█████▌ | 19274/34278 [21:09:42<13:36:32, 3.27s/it] 56%|█████▌ | 19275/34278 [21:09:46<13:46:52, 3.31s/it] {'loss': 0.124, 'grad_norm': 0.6779539983296421, 'learning_rate': 4.236941477202932e-06, 'epoch': 0.56} 56%|█████▌ | 19275/34278 [21:09:46<13:46:52, 3.31s/it] 56%|█████▌ | 19276/34278 [21:09:49<13:29:57, 3.24s/it] {'loss': 0.1137, 'grad_norm': 1.0207774300517027, 'learning_rate': 4.236474580439565e-06, 'epoch': 0.56} 56%|█████▌ | 19276/34278 [21:09:49<13:29:57, 3.24s/it] 56%|█████▌ | 19277/34278 [21:09:52<13:30:43, 3.24s/it] {'loss': 0.1435, 'grad_norm': 0.871406900651337, 'learning_rate': 4.236007690492772e-06, 'epoch': 0.56} 56%|█████▌ | 19277/34278 [21:09:52<13:30:43, 3.24s/it] 56%|█████▌ | 19278/34278 [21:09:55<13:19:42, 3.20s/it] {'loss': 0.1303, 'grad_norm': 0.9679563557813425, 'learning_rate': 4.2355408073667185e-06, 'epoch': 0.56} 56%|█████▌ | 19278/34278 [21:09:55<13:19:42, 3.20s/it] 56%|█████▌ | 19279/34278 [21:10:01<16:56:05, 4.06s/it] {'loss': 0.1423, 'grad_norm': 0.9602230166493325, 'learning_rate': 4.235073931065574e-06, 'epoch': 0.56} 56%|█████▌ | 19279/34278 [21:10:01<16:56:05, 4.06s/it] 56%|█████▌ | 19280/34278 [21:10:04<15:41:22, 3.77s/it] {'loss': 0.1333, 'grad_norm': 0.7242996991793469, 'learning_rate': 4.234607061593508e-06, 'epoch': 0.56} 56%|█████▌ | 19280/34278 [21:10:04<15:41:22, 3.77s/it] 56%|█████▌ | 19281/34278 [21:10:08<15:37:47, 3.75s/it] {'loss': 0.1326, 'grad_norm': 0.9829676891679778, 'learning_rate': 4.234140198954686e-06, 'epoch': 0.56} 56%|█████▌ | 19281/34278 [21:10:08<15:37:47, 3.75s/it] 56%|█████▋ | 19282/34278 [21:10:11<14:49:34, 3.56s/it] {'loss': 0.1111, 'grad_norm': 0.8301560375608132, 'learning_rate': 4.233673343153278e-06, 'epoch': 0.56} 56%|█████▋ | 19282/34278 [21:10:11<14:49:34, 3.56s/it] 56%|█████▋ | 19283/34278 [21:10:14<14:31:26, 3.49s/it] {'loss': 0.147, 'grad_norm': 0.7619115022470225, 'learning_rate': 4.233206494193452e-06, 'epoch': 0.56} 56%|█████▋ 
| 19283/34278 [21:10:15<14:31:26, 3.49s/it] 56%|█████▋ | 19284/34278 [21:10:21<17:41:25, 4.25s/it] {'loss': 0.1466, 'grad_norm': 0.7479633971938229, 'learning_rate': 4.232739652079374e-06, 'epoch': 0.56} 56%|█████▋ | 19284/34278 [21:10:21<17:41:25, 4.25s/it] 56%|█████▋ | 19285/34278 [21:10:24<17:09:31, 4.12s/it] {'loss': 0.1273, 'grad_norm': 0.8556559103942638, 'learning_rate': 4.232272816815215e-06, 'epoch': 0.56} 56%|█████▋ | 19285/34278 [21:10:24<17:09:31, 4.12s/it] 56%|█████▋ | 19286/34278 [21:10:29<17:21:05, 4.17s/it] {'loss': 0.1483, 'grad_norm': 0.7495067380402984, 'learning_rate': 4.23180598840514e-06, 'epoch': 0.56} 56%|█████▋ | 19286/34278 [21:10:29<17:21:05, 4.17s/it] 56%|█████▋ | 19287/34278 [21:10:35<19:46:26, 4.75s/it] {'loss': 0.1179, 'grad_norm': 0.7616907057061205, 'learning_rate': 4.2313391668533175e-06, 'epoch': 0.56} 56%|█████▋ | 19287/34278 [21:10:35<19:46:26, 4.75s/it] 56%|█████▋ | 19288/34278 [21:10:38<17:31:12, 4.21s/it] {'loss': 0.1388, 'grad_norm': 1.0785160648338838, 'learning_rate': 4.230872352163915e-06, 'epoch': 0.56} 56%|█████▋ | 19288/34278 [21:10:38<17:31:12, 4.21s/it] 56%|█████▋ | 19289/34278 [21:10:41<15:56:39, 3.83s/it] {'loss': 0.1333, 'grad_norm': 0.9011776848206084, 'learning_rate': 4.230405544341103e-06, 'epoch': 0.56} 56%|█████▋ | 19289/34278 [21:10:41<15:56:39, 3.83s/it] 56%|█████▋ | 19290/34278 [21:10:44<15:45:46, 3.79s/it] {'loss': 0.1311, 'grad_norm': 0.859255093890849, 'learning_rate': 4.229938743389045e-06, 'epoch': 0.56} 56%|█████▋ | 19290/34278 [21:10:44<15:45:46, 3.79s/it] 56%|█████▋ | 19291/34278 [21:10:47<14:39:54, 3.52s/it] {'loss': 0.1317, 'grad_norm': 0.8478954436826731, 'learning_rate': 4.229471949311909e-06, 'epoch': 0.56} 56%|█████▋ | 19291/34278 [21:10:47<14:39:54, 3.52s/it] 56%|█████▋ | 19292/34278 [21:10:53<17:55:43, 4.31s/it] {'loss': 0.1081, 'grad_norm': 0.6805753181191991, 'learning_rate': 4.229005162113866e-06, 'epoch': 0.56} 56%|█████▋ | 19292/34278 [21:10:53<17:55:43, 4.31s/it] 56%|█████▋ | 
19293/34278 [21:10:57<16:49:16, 4.04s/it] {'loss': 0.1148, 'grad_norm': 1.1931656618607633, 'learning_rate': 4.228538381799077e-06, 'epoch': 0.56} 56%|█████▋ | 19293/34278 [21:10:57<16:49:16, 4.04s/it] 56%|█████▋ | 19294/34278 [21:11:03<19:23:08, 4.66s/it] {'loss': 0.1324, 'grad_norm': 0.7888856696280375, 'learning_rate': 4.228071608371717e-06, 'epoch': 0.56} 56%|█████▋ | 19294/34278 [21:11:03<19:23:08, 4.66s/it] 56%|█████▋ | 19295/34278 [21:11:09<20:57:18, 5.03s/it] {'loss': 0.1335, 'grad_norm': 0.8647763043827412, 'learning_rate': 4.227604841835948e-06, 'epoch': 0.56} 56%|█████▋ | 19295/34278 [21:11:09<20:57:18, 5.03s/it] 56%|█████▋ | 19296/34278 [21:11:12<18:16:00, 4.39s/it] {'loss': 0.1331, 'grad_norm': 1.0025047658229442, 'learning_rate': 4.227138082195939e-06, 'epoch': 0.56} 56%|█████▋ | 19296/34278 [21:11:12<18:16:00, 4.39s/it] 56%|█████▋ | 19297/34278 [21:11:15<16:50:33, 4.05s/it] {'loss': 0.1216, 'grad_norm': 0.8419858458809792, 'learning_rate': 4.226671329455856e-06, 'epoch': 0.56} 56%|█████▋ | 19297/34278 [21:11:15<16:50:33, 4.05s/it] 56%|█████▋ | 19298/34278 [21:11:18<15:53:23, 3.82s/it] {'loss': 0.1449, 'grad_norm': 0.9431387653014118, 'learning_rate': 4.226204583619868e-06, 'epoch': 0.56} 56%|█████▋ | 19298/34278 [21:11:18<15:53:23, 3.82s/it] 56%|█████▋ | 19299/34278 [21:11:21<14:45:57, 3.55s/it] {'loss': 0.1242, 'grad_norm': 1.0837020257541334, 'learning_rate': 4.225737844692138e-06, 'epoch': 0.56} 56%|█████▋ | 19299/34278 [21:11:21<14:45:57, 3.55s/it] 56%|█████▋ | 19300/34278 [21:11:24<14:03:56, 3.38s/it] {'loss': 0.1235, 'grad_norm': 0.8552785217938188, 'learning_rate': 4.225271112676837e-06, 'epoch': 0.56} 56%|█████▋ | 19300/34278 [21:11:24<14:03:56, 3.38s/it] 56%|█████▋ | 19301/34278 [21:11:30<17:10:08, 4.13s/it] {'loss': 0.1374, 'grad_norm': 0.7848543573552564, 'learning_rate': 4.224804387578131e-06, 'epoch': 0.56} 56%|█████▋ | 19301/34278 [21:11:30<17:10:08, 4.13s/it] 56%|█████▋ | 19302/34278 [21:11:34<16:59:31, 4.08s/it] {'loss': 0.1422, 
'grad_norm': 0.8830283325085005, 'learning_rate': 4.224337669400188e-06, 'epoch': 0.56} 56%|█████▋ | 19302/34278 [21:11:34<16:59:31, 4.08s/it] 56%|█████▋ | 19303/34278 [21:11:38<16:29:01, 3.96s/it] {'loss': 0.1308, 'grad_norm': 0.9473238107477748, 'learning_rate': 4.223870958147171e-06, 'epoch': 0.56} 56%|█████▋ | 19303/34278 [21:11:38<16:29:01, 3.96s/it] 56%|█████▋ | 19304/34278 [21:11:43<18:16:06, 4.39s/it] {'loss': 0.1341, 'grad_norm': 0.8144301208569833, 'learning_rate': 4.22340425382325e-06, 'epoch': 0.56} 56%|█████▋ | 19304/34278 [21:11:43<18:16:06, 4.39s/it] 56%|█████▋ | 19305/34278 [21:11:46<16:28:30, 3.96s/it] {'loss': 0.1387, 'grad_norm': 0.912128089231077, 'learning_rate': 4.222937556432588e-06, 'epoch': 0.56} 56%|█████▋ | 19305/34278 [21:11:46<16:28:30, 3.96s/it] 56%|█████▋ | 19306/34278 [21:11:50<16:08:20, 3.88s/it] {'loss': 0.1506, 'grad_norm': 1.0681510926519333, 'learning_rate': 4.222470865979356e-06, 'epoch': 0.56} 56%|█████▋ | 19306/34278 [21:11:50<16:08:20, 3.88s/it] 56%|█████▋ | 19307/34278 [21:11:53<16:00:12, 3.85s/it] {'loss': 0.1311, 'grad_norm': 1.0162766848870537, 'learning_rate': 4.2220041824677194e-06, 'epoch': 0.56} 56%|█████▋ | 19307/34278 [21:11:53<16:00:12, 3.85s/it] 56%|█████▋ | 19308/34278 [21:11:57<15:39:08, 3.76s/it] {'loss': 0.1523, 'grad_norm': 0.8632845629407573, 'learning_rate': 4.221537505901843e-06, 'epoch': 0.56} 56%|█████▋ | 19308/34278 [21:11:57<15:39:08, 3.76s/it] 56%|█████▋ | 19309/34278 [21:12:01<15:24:06, 3.70s/it] {'loss': 0.117, 'grad_norm': 0.9065762068080127, 'learning_rate': 4.221070836285893e-06, 'epoch': 0.56} 56%|█████▋ | 19309/34278 [21:12:01<15:24:06, 3.70s/it] 56%|█████▋ | 19310/34278 [21:12:04<14:56:06, 3.59s/it] {'loss': 0.1302, 'grad_norm': 0.9711847233830024, 'learning_rate': 4.220604173624036e-06, 'epoch': 0.56} 56%|█████▋ | 19310/34278 [21:12:04<14:56:06, 3.59s/it] 56%|█████▋ | 19311/34278 [21:12:08<15:01:00, 3.61s/it] {'loss': 0.139, 'grad_norm': 0.9597026554698108, 'learning_rate': 
4.22013751792044e-06, 'epoch': 0.56} 56%|█████▋ | 19311/34278 [21:12:08<15:01:00, 3.61s/it] 56%|█████▋ | 19312/34278 [21:12:11<14:56:52, 3.60s/it] {'loss': 0.1472, 'grad_norm': 0.9967824934046925, 'learning_rate': 4.219670869179271e-06, 'epoch': 0.56} 56%|█████▋ | 19312/34278 [21:12:11<14:56:52, 3.60s/it] 56%|█████▋ | 19313/34278 [21:12:15<15:15:35, 3.67s/it] {'loss': 0.129, 'grad_norm': 0.920229970603909, 'learning_rate': 4.219204227404693e-06, 'epoch': 0.56} 56%|█████▋ | 19313/34278 [21:12:15<15:15:35, 3.67s/it] 56%|█████▋ | 19314/34278 [21:12:18<14:28:48, 3.48s/it] {'loss': 0.1218, 'grad_norm': 0.8940024553661092, 'learning_rate': 4.218737592600873e-06, 'epoch': 0.56} 56%|█████▋ | 19314/34278 [21:12:18<14:28:48, 3.48s/it] 56%|█████▋ | 19315/34278 [21:12:21<14:07:39, 3.40s/it] {'loss': 0.1385, 'grad_norm': 0.9780462343756766, 'learning_rate': 4.218270964771979e-06, 'epoch': 0.56} 56%|█████▋ | 19315/34278 [21:12:21<14:07:39, 3.40s/it] 56%|█████▋ | 19316/34278 [21:12:25<14:26:23, 3.47s/it] {'loss': 0.1597, 'grad_norm': 0.7950557499939994, 'learning_rate': 4.217804343922173e-06, 'epoch': 0.56} 56%|█████▋ | 19316/34278 [21:12:25<14:26:23, 3.47s/it] 56%|█████▋ | 19317/34278 [21:12:28<13:47:28, 3.32s/it] {'loss': 0.1535, 'grad_norm': 1.1356372175957845, 'learning_rate': 4.217337730055624e-06, 'epoch': 0.56} 56%|█████▋ | 19317/34278 [21:12:28<13:47:28, 3.32s/it] 56%|█████▋ | 19318/34278 [21:12:33<16:12:54, 3.90s/it] {'loss': 0.106, 'grad_norm': 0.926451669767488, 'learning_rate': 4.216871123176498e-06, 'epoch': 0.56} 56%|█████▋ | 19318/34278 [21:12:33<16:12:54, 3.90s/it] 56%|█████▋ | 19319/34278 [21:12:36<15:25:47, 3.71s/it] {'loss': 0.1296, 'grad_norm': 0.9967468346324517, 'learning_rate': 4.21640452328896e-06, 'epoch': 0.56} 56%|█████▋ | 19319/34278 [21:12:36<15:25:47, 3.71s/it] 56%|█████▋ | 19320/34278 [21:12:40<15:02:39, 3.62s/it] {'loss': 0.1277, 'grad_norm': 1.1076081985968935, 'learning_rate': 4.215937930397173e-06, 'epoch': 0.56} 56%|█████▋ | 19320/34278 
[21:12:40<15:02:39, 3.62s/it] 56%|█████▋ | 19321/34278 [21:12:43<14:26:19, 3.48s/it] {'loss': 0.1316, 'grad_norm': 0.8887507150342753, 'learning_rate': 4.215471344505307e-06, 'epoch': 0.56} 56%|█████▋ | 19321/34278 [21:12:43<14:26:19, 3.48s/it] 56%|█████▋ | 19322/34278 [21:12:47<14:55:43, 3.59s/it] {'loss': 0.117, 'grad_norm': 0.8451591389386952, 'learning_rate': 4.215004765617522e-06, 'epoch': 0.56} 56%|█████▋ | 19322/34278 [21:12:47<14:55:43, 3.59s/it] 56%|█████▋ | 19323/34278 [21:12:51<15:07:26, 3.64s/it] {'loss': 0.1291, 'grad_norm': 1.0267875290904167, 'learning_rate': 4.21453819373799e-06, 'epoch': 0.56} 56%|█████▋ | 19323/34278 [21:12:51<15:07:26, 3.64s/it] 56%|█████▋ | 19324/34278 [21:12:53<14:06:56, 3.40s/it] {'loss': 0.1288, 'grad_norm': 1.0539471781510779, 'learning_rate': 4.214071628870874e-06, 'epoch': 0.56} 56%|█████▋ | 19324/34278 [21:12:53<14:06:56, 3.40s/it] 56%|█████▋ | 19325/34278 [21:12:56<13:25:49, 3.23s/it] {'loss': 0.1271, 'grad_norm': 0.8185991522525249, 'learning_rate': 4.213605071020338e-06, 'epoch': 0.56} 56%|█████▋ | 19325/34278 [21:12:56<13:25:49, 3.23s/it] 56%|█████▋ | 19326/34278 [21:13:02<16:26:52, 3.96s/it] {'loss': 0.1244, 'grad_norm': 1.2374200149474295, 'learning_rate': 4.213138520190548e-06, 'epoch': 0.56} 56%|█████▋ | 19326/34278 [21:13:02<16:26:52, 3.96s/it] 56%|█████▋ | 19327/34278 [21:13:05<15:09:44, 3.65s/it] {'loss': 0.1441, 'grad_norm': 1.0267053419097691, 'learning_rate': 4.212671976385671e-06, 'epoch': 0.56} 56%|█████▋ | 19327/34278 [21:13:05<15:09:44, 3.65s/it] 56%|█████▋ | 19328/34278 [21:13:08<14:17:35, 3.44s/it] {'loss': 0.1242, 'grad_norm': 0.7783242735958762, 'learning_rate': 4.212205439609868e-06, 'epoch': 0.56} 56%|█████▋ | 19328/34278 [21:13:08<14:17:35, 3.44s/it] 56%|█████▋ | 19329/34278 [21:13:11<14:10:24, 3.41s/it] {'loss': 0.1207, 'grad_norm': 1.0591920770891756, 'learning_rate': 4.211738909867309e-06, 'epoch': 0.56} 56%|█████▋ | 19329/34278 [21:13:11<14:10:24, 3.41s/it] 56%|█████▋ | 19330/34278 
[21:13:14<13:34:21, 3.27s/it] {'loss': 0.1296, 'grad_norm': 0.9302342414014498, 'learning_rate': 4.211272387162155e-06, 'epoch': 0.56} 56%|█████▋ | 19330/34278 [21:13:14<13:34:21, 3.27s/it] 56%|█████▋ | 19331/34278 [21:13:17<13:13:39, 3.19s/it] {'loss': 0.1134, 'grad_norm': 0.8021722036322011, 'learning_rate': 4.210805871498575e-06, 'epoch': 0.56} 56%|█████▋ | 19331/34278 [21:13:17<13:13:39, 3.19s/it] 56%|█████▋ | 19332/34278 [21:13:20<12:35:52, 3.03s/it] {'loss': 0.1402, 'grad_norm': 0.8071094129094918, 'learning_rate': 4.210339362880731e-06, 'epoch': 0.56} 56%|█████▋ | 19332/34278 [21:13:20<12:35:52, 3.03s/it] 56%|█████▋ | 19333/34278 [21:13:24<13:36:55, 3.28s/it] {'loss': 0.1511, 'grad_norm': 1.104945245826796, 'learning_rate': 4.209872861312788e-06, 'epoch': 0.56} 56%|█████▋ | 19333/34278 [21:13:24<13:36:55, 3.28s/it] 56%|█████▋ | 19334/34278 [21:13:28<15:14:08, 3.67s/it] {'loss': 0.1208, 'grad_norm': 1.158047827388993, 'learning_rate': 4.209406366798911e-06, 'epoch': 0.56} 56%|█████▋ | 19334/34278 [21:13:28<15:14:08, 3.67s/it] 56%|█████▋ | 19335/34278 [21:13:33<16:28:24, 3.97s/it] {'loss': 0.1337, 'grad_norm': 0.7866838173179109, 'learning_rate': 4.208939879343266e-06, 'epoch': 0.56} 56%|█████▋ | 19335/34278 [21:13:33<16:28:24, 3.97s/it] 56%|█████▋ | 19336/34278 [21:13:39<19:16:57, 4.65s/it] {'loss': 0.139, 'grad_norm': 0.9365522579795171, 'learning_rate': 4.208473398950016e-06, 'epoch': 0.56} 56%|█████▋ | 19336/34278 [21:13:39<19:16:57, 4.65s/it] 56%|█████▋ | 19337/34278 [21:13:42<17:24:36, 4.19s/it] {'loss': 0.1389, 'grad_norm': 1.0414850520402614, 'learning_rate': 4.208006925623329e-06, 'epoch': 0.56} 56%|█████▋ | 19337/34278 [21:13:42<17:24:36, 4.19s/it] 56%|█████▋ | 19338/34278 [21:13:48<19:09:52, 4.62s/it] {'loss': 0.1438, 'grad_norm': 0.8206167461272429, 'learning_rate': 4.207540459367365e-06, 'epoch': 0.56} 56%|█████▋ | 19338/34278 [21:13:48<19:09:52, 4.62s/it] 56%|█████▋ | 19339/34278 [21:13:51<17:05:45, 4.12s/it] {'loss': 0.1191, 'grad_norm': 
1.0190539656046538, 'learning_rate': 4.207074000186291e-06, 'epoch': 0.56} 56%|█████▋ | 19339/34278 [21:13:51<17:05:45, 4.12s/it] 56%|█████▋ | 19340/34278 [21:13:54<15:53:58, 3.83s/it] {'loss': 0.1266, 'grad_norm': 1.101606784957744, 'learning_rate': 4.20660754808427e-06, 'epoch': 0.56} 56%|█████▋ | 19340/34278 [21:13:54<15:53:58, 3.83s/it] 56%|█████▋ | 19341/34278 [21:13:57<14:40:29, 3.54s/it] {'loss': 0.1157, 'grad_norm': 0.8097156329379198, 'learning_rate': 4.20614110306547e-06, 'epoch': 0.56} 56%|█████▋ | 19341/34278 [21:13:57<14:40:29, 3.54s/it] 56%|█████▋ | 19342/34278 [21:14:00<14:54:05, 3.59s/it] {'loss': 0.111, 'grad_norm': 1.0365985115715617, 'learning_rate': 4.205674665134051e-06, 'epoch': 0.56} 56%|█████▋ | 19342/34278 [21:14:00<14:54:05, 3.59s/it] 56%|█████▋ | 19343/34278 [21:14:06<16:50:35, 4.06s/it] {'loss': 0.1238, 'grad_norm': 0.9027453339980929, 'learning_rate': 4.205208234294179e-06, 'epoch': 0.56} 56%|█████▋ | 19343/34278 [21:14:06<16:50:35, 4.06s/it] 56%|█████▋ | 19344/34278 [21:14:09<15:31:02, 3.74s/it] {'loss': 0.1302, 'grad_norm': 0.9278063332020796, 'learning_rate': 4.204741810550018e-06, 'epoch': 0.56} 56%|█████▋ | 19344/34278 [21:14:09<15:31:02, 3.74s/it] 56%|█████▋ | 19345/34278 [21:14:12<14:56:46, 3.60s/it] {'loss': 0.1413, 'grad_norm': 0.9972173466492001, 'learning_rate': 4.204275393905734e-06, 'epoch': 0.56} 56%|█████▋ | 19345/34278 [21:14:12<14:56:46, 3.60s/it] 56%|█████▋ | 19346/34278 [21:14:15<14:02:20, 3.38s/it] {'loss': 0.1205, 'grad_norm': 0.8389099482289466, 'learning_rate': 4.203808984365487e-06, 'epoch': 0.56} 56%|█████▋ | 19346/34278 [21:14:15<14:02:20, 3.38s/it] 56%|█████▋ | 19347/34278 [21:14:18<14:22:47, 3.47s/it] {'loss': 0.1502, 'grad_norm': 0.8919946807713786, 'learning_rate': 4.203342581933444e-06, 'epoch': 0.56} 56%|█████▋ | 19347/34278 [21:14:18<14:22:47, 3.47s/it] 56%|█████▋ | 19348/34278 [21:14:22<14:08:19, 3.41s/it] {'loss': 0.1357, 'grad_norm': 0.9556445667187019, 'learning_rate': 4.202876186613769e-06, 'epoch': 
0.56} 56%|█████▋ | 19348/34278 [21:14:22<14:08:19, 3.41s/it] 56%|█████▋ | 19349/34278 [21:14:25<14:24:58, 3.48s/it] {'loss': 0.13, 'grad_norm': 0.9808822001374411, 'learning_rate': 4.2024097984106254e-06, 'epoch': 0.56} 56%|█████▋ | 19349/34278 [21:14:25<14:24:58, 3.48s/it] 56%|█████▋ | 19350/34278 [21:14:29<14:11:04, 3.42s/it] {'loss': 0.1101, 'grad_norm': 0.692672238631294, 'learning_rate': 4.201943417328176e-06, 'epoch': 0.56} 56%|█████▋ | 19350/34278 [21:14:29<14:11:04, 3.42s/it] 56%|█████▋ | 19351/34278 [21:14:33<15:16:31, 3.68s/it] {'loss': 0.1102, 'grad_norm': 1.051160649074656, 'learning_rate': 4.2014770433705856e-06, 'epoch': 0.56} 56%|█████▋ | 19351/34278 [21:14:33<15:16:31, 3.68s/it] 56%|█████▋ | 19352/34278 [21:14:36<14:56:13, 3.60s/it] {'loss': 0.1401, 'grad_norm': 0.7781561212765957, 'learning_rate': 4.201010676542016e-06, 'epoch': 0.56} 56%|█████▋ | 19352/34278 [21:14:36<14:56:13, 3.60s/it] 56%|█████▋ | 19353/34278 [21:14:40<14:29:54, 3.50s/it] {'loss': 0.143, 'grad_norm': 0.9035506015524689, 'learning_rate': 4.200544316846633e-06, 'epoch': 0.56} 56%|█████▋ | 19353/34278 [21:14:40<14:29:54, 3.50s/it] 56%|█████▋ | 19354/34278 [21:14:44<15:18:51, 3.69s/it] {'loss': 0.1137, 'grad_norm': 0.9088489089560662, 'learning_rate': 4.200077964288601e-06, 'epoch': 0.56} 56%|█████▋ | 19354/34278 [21:14:44<15:18:51, 3.69s/it] 56%|█████▋ | 19355/34278 [21:14:48<15:48:06, 3.81s/it] {'loss': 0.1347, 'grad_norm': 0.7259117047558888, 'learning_rate': 4.199611618872081e-06, 'epoch': 0.56} 56%|█████▋ | 19355/34278 [21:14:48<15:48:06, 3.81s/it] 56%|█████▋ | 19356/34278 [21:14:53<17:42:37, 4.27s/it] {'loss': 0.1262, 'grad_norm': 0.9965548007664105, 'learning_rate': 4.199145280601238e-06, 'epoch': 0.56} 56%|█████▋ | 19356/34278 [21:14:53<17:42:37, 4.27s/it] 56%|█████▋ | 19357/34278 [21:14:56<16:27:15, 3.97s/it] {'loss': 0.1371, 'grad_norm': 0.8341149270966275, 'learning_rate': 4.1986789494802345e-06, 'epoch': 0.56} 56%|█████▋ | 19357/34278 [21:14:56<16:27:15, 3.97s/it] 
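[Aside: each step in this log carries a Python-dict payload of metrics ({'loss': ..., 'grad_norm': ..., 'learning_rate': ..., 'epoch': ...}). A minimal sketch of scraping those records out of the raw text, assuming each dict payload is uninterrupted by line wrapping; parse_steps and the sample string below are illustrative, not part of the training code:]

```python
import ast
import re

# Match a metrics payload such as:
#   {'loss': 0.1472, 'grad_norm': 0.9968, 'learning_rate': 4.2196e-06, 'epoch': 0.56}
# The dicts contain no nested braces, so a non-greedy match up to the
# first '}' is enough; ast.literal_eval converts it safely to a dict.
ENTRY = re.compile(r"\{'loss':[^}]*\}")

def parse_steps(log_text: str) -> list[dict]:
    """Return one metrics dict per training step found in the raw log text."""
    return [ast.literal_eval(m.group(0)) for m in ENTRY.finditer(log_text)]

sample = ("56%| | 19312/34278 [21:12:11<14:56:52, 3.60s/it] "
          "{'loss': 0.1472, 'grad_norm': 0.9968, "
          "'learning_rate': 4.2196e-06, 'epoch': 0.56}")
steps = parse_steps(sample)
# steps[0]['loss'] == 0.1472
```

[The resulting list of dicts can be fed directly into a DataFrame or plotted to recover the loss/learning-rate curves from a console dump like this one.]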
56%|█████▋ | 19358/34278 [21:14:59<15:19:01, 3.70s/it] {'loss': 0.1264, 'grad_norm': 0.7118071097143382, 'learning_rate': 4.198212625513232e-06, 'epoch': 0.56} 56%|█████▋ | 19358/34278 [21:14:59<15:19:01, 3.70s/it] 56%|█████▋ | 19359/34278 [21:15:03<15:10:03, 3.66s/it] {'loss': 0.1302, 'grad_norm': 0.9634689955515569, 'learning_rate': 4.197746308704399e-06, 'epoch': 0.56} 56%|█████▋ | 19359/34278 [21:15:03<15:10:03, 3.66s/it] 56%|█████▋ | 19360/34278 [21:15:09<18:15:16, 4.41s/it] {'loss': 0.1461, 'grad_norm': 0.9066515208854374, 'learning_rate': 4.1972799990578934e-06, 'epoch': 0.56} 56%|█████▋ | 19360/34278 [21:15:09<18:15:16, 4.41s/it] 56%|█████▋ | 19361/34278 [21:15:14<18:14:54, 4.40s/it] {'loss': 0.1336, 'grad_norm': 0.9874836309580389, 'learning_rate': 4.1968136965778805e-06, 'epoch': 0.56} 56%|█████▋ | 19361/34278 [21:15:14<18:14:54, 4.40s/it] 56%|█████▋ | 19362/34278 [21:15:17<16:30:42, 3.99s/it] {'loss': 0.1101, 'grad_norm': 0.9584657928992774, 'learning_rate': 4.196347401268525e-06, 'epoch': 0.56} 56%|█████▋ | 19362/34278 [21:15:17<16:30:42, 3.99s/it] 56%|█████▋ | 19363/34278 [21:15:20<16:17:15, 3.93s/it] {'loss': 0.1432, 'grad_norm': 0.8677620847531943, 'learning_rate': 4.195881113133986e-06, 'epoch': 0.56} 56%|█████▋ | 19363/34278 [21:15:20<16:17:15, 3.93s/it] 56%|█████▋ | 19364/34278 [21:15:24<15:38:39, 3.78s/it] {'loss': 0.1142, 'grad_norm': 0.8374568733142175, 'learning_rate': 4.1954148321784285e-06, 'epoch': 0.56} 56%|█████▋ | 19364/34278 [21:15:24<15:38:39, 3.78s/it] 56%|█████▋ | 19365/34278 [21:15:28<15:42:27, 3.79s/it] {'loss': 0.1411, 'grad_norm': 0.6635602068403426, 'learning_rate': 4.1949485584060155e-06, 'epoch': 0.56} 56%|█████▋ | 19365/34278 [21:15:28<15:42:27, 3.79s/it] 56%|█████▋ | 19366/34278 [21:15:31<14:37:58, 3.53s/it] {'loss': 0.1189, 'grad_norm': 0.8396410835996723, 'learning_rate': 4.19448229182091e-06, 'epoch': 0.56} 56%|█████▋ | 19366/34278 [21:15:31<14:37:58, 3.53s/it] 56%|█████▋ | 19367/34278 [21:15:33<13:49:45, 3.34s/it] 
{'loss': 0.13, 'grad_norm': 0.8711314366463256, 'learning_rate': 4.194016032427275e-06, 'epoch': 0.56} 56%|█████▋ | 19367/34278 [21:15:33<13:49:45, 3.34s/it] 57%|█████▋ | 19368/34278 [21:15:37<13:40:18, 3.30s/it] {'loss': 0.0982, 'grad_norm': 0.7495674259968295, 'learning_rate': 4.193549780229273e-06, 'epoch': 0.57} 57%|█████▋ | 19368/34278 [21:15:37<13:40:18, 3.30s/it] 57%|█████▋ | 19369/34278 [21:15:40<13:13:59, 3.20s/it] {'loss': 0.1193, 'grad_norm': 0.994828317206556, 'learning_rate': 4.193083535231064e-06, 'epoch': 0.57} 57%|█████▋ | 19369/34278 [21:15:40<13:13:59, 3.20s/it] 57%|█████▋ | 19370/34278 [21:15:46<16:35:56, 4.01s/it] {'loss': 0.1428, 'grad_norm': 0.8031371144023249, 'learning_rate': 4.192617297436812e-06, 'epoch': 0.57} 57%|█████▋ | 19370/34278 [21:15:46<16:35:56, 4.01s/it] 57%|█████▋ | 19371/34278 [21:15:52<19:15:50, 4.65s/it] {'loss': 0.1676, 'grad_norm': 1.078601565853731, 'learning_rate': 4.192151066850682e-06, 'epoch': 0.57} 57%|█████▋ | 19371/34278 [21:15:52<19:15:50, 4.65s/it] 57%|█████▋ | 19372/34278 [21:15:55<17:36:22, 4.25s/it] {'loss': 0.1523, 'grad_norm': 0.9624650306696038, 'learning_rate': 4.191684843476834e-06, 'epoch': 0.57} 57%|█████▋ | 19372/34278 [21:15:55<17:36:22, 4.25s/it] 57%|█████▋ | 19373/34278 [21:15:58<15:46:19, 3.81s/it] {'loss': 0.1431, 'grad_norm': 0.757665184886242, 'learning_rate': 4.191218627319431e-06, 'epoch': 0.57} 57%|█████▋ | 19373/34278 [21:15:58<15:46:19, 3.81s/it] 57%|█████▋ | 19374/34278 [21:16:01<14:48:03, 3.58s/it] {'loss': 0.1185, 'grad_norm': 0.694972450711904, 'learning_rate': 4.190752418382635e-06, 'epoch': 0.57} 57%|█████▋ | 19374/34278 [21:16:01<14:48:03, 3.58s/it] 57%|█████▋ | 19375/34278 [21:16:04<14:30:11, 3.50s/it] {'loss': 0.1151, 'grad_norm': 0.8081352924546684, 'learning_rate': 4.190286216670608e-06, 'epoch': 0.57} 57%|█████▋ | 19375/34278 [21:16:04<14:30:11, 3.50s/it] 57%|█████▋ | 19376/34278 [21:16:08<14:28:05, 3.50s/it] {'loss': 0.1361, 'grad_norm': 0.903095796203024, 'learning_rate': 
4.189820022187511e-06, 'epoch': 0.57} 57%|█████▋ | 19376/34278 [21:16:08<14:28:05, 3.50s/it] 57%|█████▋ | 19377/34278 [21:16:11<14:30:47, 3.51s/it] {'loss': 0.1439, 'grad_norm': 1.3446850188910406, 'learning_rate': 4.189353834937509e-06, 'epoch': 0.57} 57%|█████▋ | 19377/34278 [21:16:11<14:30:47, 3.51s/it] 57%|█████▋ | 19378/34278 [21:16:15<15:14:53, 3.68s/it] {'loss': 0.1177, 'grad_norm': 0.8425675405138412, 'learning_rate': 4.188887654924761e-06, 'epoch': 0.57} 57%|█████▋ | 19378/34278 [21:16:15<15:14:53, 3.68s/it] 57%|█████▋ | 19379/34278 [21:16:19<15:01:23, 3.63s/it] {'loss': 0.1101, 'grad_norm': 0.7948812385773764, 'learning_rate': 4.1884214821534334e-06, 'epoch': 0.57} 57%|█████▋ | 19379/34278 [21:16:19<15:01:23, 3.63s/it] 57%|█████▋ | 19380/34278 [21:16:22<15:03:20, 3.64s/it] {'loss': 0.1312, 'grad_norm': 0.7755867743772799, 'learning_rate': 4.187955316627683e-06, 'epoch': 0.57} 57%|█████▋ | 19380/34278 [21:16:22<15:03:20, 3.64s/it] 57%|█████▋ | 19381/34278 [21:16:25<14:06:27, 3.41s/it] {'loss': 0.1558, 'grad_norm': 1.140046919943364, 'learning_rate': 4.187489158351674e-06, 'epoch': 0.57} 57%|█████▋ | 19381/34278 [21:16:25<14:06:27, 3.41s/it] 57%|█████▋ | 19382/34278 [21:16:31<17:19:02, 4.19s/it] {'loss': 0.1373, 'grad_norm': 0.8086468288477293, 'learning_rate': 4.187023007329566e-06, 'epoch': 0.57} 57%|█████▋ | 19382/34278 [21:16:31<17:19:02, 4.19s/it] 57%|█████▋ | 19383/34278 [21:16:35<17:06:08, 4.13s/it] {'loss': 0.1286, 'grad_norm': 0.7529757380862866, 'learning_rate': 4.186556863565524e-06, 'epoch': 0.57} 57%|█████▋ | 19383/34278 [21:16:35<17:06:08, 4.13s/it] 57%|█████▋ | 19384/34278 [21:16:38<15:38:55, 3.78s/it] {'loss': 0.1224, 'grad_norm': 1.3346562066612684, 'learning_rate': 4.18609072706371e-06, 'epoch': 0.57} 57%|█████▋ | 19384/34278 [21:16:38<15:38:55, 3.78s/it] 57%|█████▋ | 19385/34278 [21:16:42<15:59:17, 3.86s/it] {'loss': 0.1085, 'grad_norm': 0.7560453375313204, 'learning_rate': 4.185624597828282e-06, 'epoch': 0.57} 57%|█████▋ | 19385/34278 
[21:16:42<15:59:17, 3.86s/it] 57%|█████▋ | 19386/34278 [21:16:46<15:14:38, 3.69s/it] {'loss': 0.1275, 'grad_norm': 0.7212282462561252, 'learning_rate': 4.185158475863403e-06, 'epoch': 0.57} 57%|█████▋ | 19386/34278 [21:16:46<15:14:38, 3.69s/it] 57%|█████▋ | 19387/34278 [21:16:49<15:09:40, 3.67s/it] {'loss': 0.1114, 'grad_norm': 0.7374292874245314, 'learning_rate': 4.184692361173236e-06, 'epoch': 0.57} 57%|█████▋ | 19387/34278 [21:16:49<15:09:40, 3.67s/it] 57%|█████▋ | 19388/34278 [21:16:52<14:18:28, 3.46s/it] {'loss': 0.1291, 'grad_norm': 0.9391013064283164, 'learning_rate': 4.184226253761937e-06, 'epoch': 0.57} 57%|█████▋ | 19388/34278 [21:16:52<14:18:28, 3.46s/it] 57%|█████▋ | 19389/34278 [21:16:58<17:04:09, 4.13s/it] {'loss': 0.1228, 'grad_norm': 0.7647587171548582, 'learning_rate': 4.183760153633675e-06, 'epoch': 0.57} 57%|█████▋ | 19389/34278 [21:16:58<17:04:09, 4.13s/it] 57%|█████▋ | 19390/34278 [21:17:01<16:08:55, 3.90s/it] {'loss': 0.1368, 'grad_norm': 0.9987100527782217, 'learning_rate': 4.183294060792606e-06, 'epoch': 0.57} 57%|█████▋ | 19390/34278 [21:17:01<16:08:55, 3.90s/it] 57%|█████▋ | 19391/34278 [21:17:04<14:51:53, 3.59s/it] {'loss': 0.134, 'grad_norm': 0.8860530588359792, 'learning_rate': 4.182827975242894e-06, 'epoch': 0.57} 57%|█████▋ | 19391/34278 [21:17:04<14:51:53, 3.59s/it] 57%|█████▋ | 19392/34278 [21:17:08<14:40:42, 3.55s/it] {'loss': 0.1132, 'grad_norm': 1.0873575091132723, 'learning_rate': 4.182361896988699e-06, 'epoch': 0.57} 57%|█████▋ | 19392/34278 [21:17:08<14:40:42, 3.55s/it] 57%|█████▋ | 19393/34278 [21:17:13<17:21:41, 4.20s/it] {'loss': 0.14, 'grad_norm': 0.842081833139939, 'learning_rate': 4.18189582603418e-06, 'epoch': 0.57} 57%|█████▋ | 19393/34278 [21:17:13<17:21:41, 4.20s/it] 57%|█████▋ | 19394/34278 [21:17:16<15:37:35, 3.78s/it] {'loss': 0.1287, 'grad_norm': 0.8752384891404035, 'learning_rate': 4.1814297623835e-06, 'epoch': 0.57} 57%|█████▋ | 19394/34278 [21:17:16<15:37:35, 3.78s/it] 57%|█████▋ | 19395/34278 
[21:17:19<14:52:47, 3.60s/it] {'loss': 0.1291, 'grad_norm': 0.8753769206838534, 'learning_rate': 4.18096370604082e-06, 'epoch': 0.57} 57%|█████▋ | 19395/34278 [21:17:19<14:52:47, 3.60s/it] 57%|█████▋ | 19396/34278 [21:17:22<14:11:38, 3.43s/it] {'loss': 0.1511, 'grad_norm': 1.5124093889687904, 'learning_rate': 4.1804976570103e-06, 'epoch': 0.57} 57%|█████▋ | 19396/34278 [21:17:22<14:11:38, 3.43s/it] 57%|█████▋ | 19397/34278 [21:17:27<15:22:51, 3.72s/it] {'loss': 0.1413, 'grad_norm': 0.8637210905796692, 'learning_rate': 4.180031615296103e-06, 'epoch': 0.57} 57%|█████▋ | 19397/34278 [21:17:27<15:22:51, 3.72s/it] 57%|█████▋ | 19398/34278 [21:17:30<14:43:30, 3.56s/it] {'loss': 0.1313, 'grad_norm': 0.9549525126186816, 'learning_rate': 4.179565580902387e-06, 'epoch': 0.57} 57%|█████▋ | 19398/34278 [21:17:30<14:43:30, 3.56s/it] 57%|█████▋ | 19399/34278 [21:17:36<17:34:16, 4.25s/it] {'loss': 0.1209, 'grad_norm': 0.7949894880690509, 'learning_rate': 4.179099553833314e-06, 'epoch': 0.57} 57%|█████▋ | 19399/34278 [21:17:36<17:34:16, 4.25s/it] 57%|█████▋ | 19400/34278 [21:17:39<16:01:54, 3.88s/it] {'loss': 0.1121, 'grad_norm': 0.8830952411828071, 'learning_rate': 4.178633534093043e-06, 'epoch': 0.57} 57%|█████▋ | 19400/34278 [21:17:39<16:01:54, 3.88s/it] 57%|█████▋ | 19401/34278 [21:17:43<17:03:43, 4.13s/it] {'loss': 0.1196, 'grad_norm': 0.8183061514539053, 'learning_rate': 4.178167521685737e-06, 'epoch': 0.57} 57%|█████▋ | 19401/34278 [21:17:43<17:03:43, 4.13s/it] 57%|█████▋ | 19402/34278 [21:17:47<16:05:58, 3.90s/it] {'loss': 0.1609, 'grad_norm': 0.8907210383788373, 'learning_rate': 4.177701516615555e-06, 'epoch': 0.57} 57%|█████▋ | 19402/34278 [21:17:47<16:05:58, 3.90s/it] 57%|█████▋ | 19403/34278 [21:17:50<14:41:49, 3.56s/it] {'loss': 0.1333, 'grad_norm': 0.7892265729820176, 'learning_rate': 4.177235518886657e-06, 'epoch': 0.57} 57%|█████▋ | 19403/34278 [21:17:50<14:41:49, 3.56s/it] 57%|█████▋ | 19404/34278 [21:17:54<15:09:11, 3.67s/it] {'loss': 0.1162, 'grad_norm': 
0.7893912092104693, 'learning_rate': 4.176769528503205e-06, 'epoch': 0.57} 57%|█████▋ | 19404/34278 [21:17:54<15:09:11, 3.67s/it] 57%|█████▋ | 19405/34278 [21:17:57<14:41:19, 3.56s/it] {'loss': 0.1257, 'grad_norm': 0.8847707860248508, 'learning_rate': 4.176303545469358e-06, 'epoch': 0.57} 57%|█████▋ | 19405/34278 [21:17:57<14:41:19, 3.56s/it] 57%|█████▋ | 19406/34278 [21:18:02<16:37:24, 4.02s/it] {'loss': 0.1441, 'grad_norm': 1.0207630139430863, 'learning_rate': 4.175837569789274e-06, 'epoch': 0.57} 57%|█████▋ | 19406/34278 [21:18:02<16:37:24, 4.02s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 57%|█████▋ | 19407/34278 [21:18:07<17:47:31, 4.31s/it] {'loss': 0.1155, 'grad_norm': 0.714577325556803, 'learning_rate': 4.175371601467117e-06, 'epoch': 0.57} 57%|█████▋ | 19407/34278 [21:18:07<17:47:31, 4.31s/it] 57%|█████▋ | 19408/34278 [21:18:10<16:21:50, 3.96s/it] {'loss': 0.1312, 'grad_norm': 0.9138541251039776, 'learning_rate': 4.1749056405070455e-06, 'epoch': 0.57} 57%|█████▋ | 19408/34278 [21:18:10<16:21:50, 3.96s/it] 57%|█████▋ | 19409/34278 [21:18:13<15:21:02, 3.72s/it] {'loss': 0.1313, 'grad_norm': 0.8673457643088875, 'learning_rate': 4.1744396869132205e-06, 'epoch': 0.57} 57%|█████▋ | 19409/34278 [21:18:13<15:21:02, 3.72s/it] 57%|█████▋ | 19410/34278 [21:18:16<14:13:18, 3.44s/it] {'loss': 0.1313, 'grad_norm': 1.0043121989356136, 'learning_rate': 4.1739737406898e-06, 'epoch': 0.57} 57%|█████▋ | 19410/34278 [21:18:16<14:13:18, 3.44s/it] 57%|█████▋ | 19411/34278 [21:18:19<13:59:21, 3.39s/it] {'loss': 0.1325, 'grad_norm': 0.8286278643874674, 'learning_rate': 4.173507801840945e-06, 'epoch': 0.57} 57%|█████▋ | 19411/34278 [21:18:19<13:59:21, 3.39s/it] 57%|█████▋ | 19412/34278 [21:18:23<13:48:06, 3.34s/it] {'loss': 0.1376, 'grad_norm': 1.0226878988018635, 'learning_rate': 4.173041870370813e-06, 
'epoch': 0.57} 57%|█████▋ | 19412/34278 [21:18:23<13:48:06, 3.34s/it] 57%|█████▋ | 19413/34278 [21:18:27<15:42:24, 3.80s/it] {'loss': 0.1266, 'grad_norm': 0.9393169483809087, 'learning_rate': 4.1725759462835674e-06, 'epoch': 0.57} 57%|█████▋ | 19413/34278 [21:18:27<15:42:24, 3.80s/it] 57%|█████▋ | 19414/34278 [21:18:31<15:40:28, 3.80s/it] {'loss': 0.1118, 'grad_norm': 0.8350351829037099, 'learning_rate': 4.172110029583368e-06, 'epoch': 0.57} 57%|█████▋ | 19414/34278 [21:18:31<15:40:28, 3.80s/it] 57%|█████▋ | 19415/34278 [21:18:34<15:03:25, 3.65s/it] {'loss': 0.1327, 'grad_norm': 1.0055518985598915, 'learning_rate': 4.171644120274371e-06, 'epoch': 0.57} 57%|█████▋ | 19415/34278 [21:18:34<15:03:25, 3.65s/it] 57%|█████▋ | 19416/34278 [21:18:38<15:22:48, 3.73s/it] {'loss': 0.1214, 'grad_norm': 0.8402808204010609, 'learning_rate': 4.171178218360737e-06, 'epoch': 0.57} 57%|█████▋ | 19416/34278 [21:18:38<15:22:48, 3.73s/it] 57%|█████▋ | 19417/34278 [21:18:41<14:26:48, 3.50s/it] {'loss': 0.1282, 'grad_norm': 0.9098556148388088, 'learning_rate': 4.170712323846628e-06, 'epoch': 0.57} 57%|█████▋ | 19417/34278 [21:18:41<14:26:48, 3.50s/it] 57%|█████▋ | 19418/34278 [21:18:47<17:14:12, 4.18s/it] {'loss': 0.1071, 'grad_norm': 0.9596209886171775, 'learning_rate': 4.170246436736198e-06, 'epoch': 0.57} 57%|█████▋ | 19418/34278 [21:18:47<17:14:12, 4.18s/it] 57%|█████▋ | 19419/34278 [21:18:50<15:49:18, 3.83s/it] {'loss': 0.1278, 'grad_norm': 0.8622785281876078, 'learning_rate': 4.169780557033612e-06, 'epoch': 0.57} 57%|█████▋ | 19419/34278 [21:18:50<15:49:18, 3.83s/it] 57%|█████▋ | 19420/34278 [21:18:53<14:57:50, 3.63s/it] {'loss': 0.1252, 'grad_norm': 0.7480267302996062, 'learning_rate': 4.169314684743027e-06, 'epoch': 0.57} 57%|█████▋ | 19420/34278 [21:18:53<14:57:50, 3.63s/it] 57%|█████▋ | 19421/34278 [21:18:57<14:32:23, 3.52s/it] {'loss': 0.1316, 'grad_norm': 0.8038013696081622, 'learning_rate': 4.168848819868601e-06, 'epoch': 0.57} 57%|█████▋ | 19421/34278 [21:18:57<14:32:23, 
3.52s/it] 57%|█████▋ | 19422/34278 [21:19:00<14:41:30, 3.56s/it] {'loss': 0.127, 'grad_norm': 1.1371605648032899, 'learning_rate': 4.168382962414496e-06, 'epoch': 0.57} 57%|█████▋ | 19422/34278 [21:19:00<14:41:30, 3.56s/it] 57%|█████▋ | 19423/34278 [21:19:03<14:11:41, 3.44s/it] {'loss': 0.1498, 'grad_norm': 1.2734743904923689, 'learning_rate': 4.167917112384869e-06, 'epoch': 0.57} 57%|█████▋ | 19423/34278 [21:19:03<14:11:41, 3.44s/it] 57%|█████▋ | 19424/34278 [21:19:07<13:52:12, 3.36s/it] {'loss': 0.1565, 'grad_norm': 1.0318764991978846, 'learning_rate': 4.167451269783878e-06, 'epoch': 0.57} 57%|█████▋ | 19424/34278 [21:19:07<13:52:12, 3.36s/it] 57%|█████▋ | 19425/34278 [21:19:10<14:17:12, 3.46s/it] {'loss': 0.1351, 'grad_norm': 0.9522348872724322, 'learning_rate': 4.166985434615683e-06, 'epoch': 0.57} 57%|█████▋ | 19425/34278 [21:19:10<14:17:12, 3.46s/it] 57%|█████▋ | 19426/34278 [21:19:15<15:21:03, 3.72s/it] {'loss': 0.1187, 'grad_norm': 0.9402296360804302, 'learning_rate': 4.166519606884445e-06, 'epoch': 0.57} 57%|█████▋ | 19426/34278 [21:19:15<15:21:03, 3.72s/it] 57%|█████▋ | 19427/34278 [21:19:20<17:46:48, 4.31s/it] {'loss': 0.1351, 'grad_norm': 0.9181844994976865, 'learning_rate': 4.166053786594322e-06, 'epoch': 0.57} 57%|█████▋ | 19427/34278 [21:19:20<17:46:48, 4.31s/it] 57%|█████▋ | 19428/34278 [21:19:24<16:28:42, 3.99s/it] {'loss': 0.1388, 'grad_norm': 0.8209798989749533, 'learning_rate': 4.16558797374947e-06, 'epoch': 0.57} 57%|█████▋ | 19428/34278 [21:19:24<16:28:42, 3.99s/it] 57%|█████▋ | 19429/34278 [21:19:30<19:28:44, 4.72s/it] {'loss': 0.1321, 'grad_norm': 0.898912837960602, 'learning_rate': 4.165122168354049e-06, 'epoch': 0.57} 57%|█████▋ | 19429/34278 [21:19:30<19:28:44, 4.72s/it] 57%|█████▋ | 19430/34278 [21:19:36<21:05:19, 5.11s/it] {'loss': 0.1385, 'grad_norm': 0.9807008826655649, 'learning_rate': 4.164656370412218e-06, 'epoch': 0.57} 57%|█████▋ | 19430/34278 [21:19:36<21:05:19, 5.11s/it] 57%|█████▋ | 19431/34278 [21:19:39<18:22:34, 4.46s/it] 
{'loss': 0.1286, 'grad_norm': 0.8622015768296055, 'learning_rate': 4.164190579928137e-06, 'epoch': 0.57} 57%|█████▋ | 19431/34278 [21:19:39<18:22:34, 4.46s/it] 57%|█████▋ | 19432/34278 [21:19:43<17:49:15, 4.32s/it] {'loss': 0.1061, 'grad_norm': 0.7912732824512433, 'learning_rate': 4.163724796905961e-06, 'epoch': 0.57} 57%|█████▋ | 19432/34278 [21:19:43<17:49:15, 4.32s/it] 57%|█████▋ | 19433/34278 [21:19:46<16:55:25, 4.10s/it] {'loss': 0.1215, 'grad_norm': 0.833736347636523, 'learning_rate': 4.163259021349852e-06, 'epoch': 0.57} 57%|█████▋ | 19433/34278 [21:19:46<16:55:25, 4.10s/it] 57%|█████▋ | 19434/34278 [21:19:50<16:00:41, 3.88s/it] {'loss': 0.1131, 'grad_norm': 0.7638380756488233, 'learning_rate': 4.162793253263967e-06, 'epoch': 0.57} 57%|█████▋ | 19434/34278 [21:19:50<16:00:41, 3.88s/it] 57%|█████▋ | 19435/34278 [21:19:53<15:32:29, 3.77s/it] {'loss': 0.1177, 'grad_norm': 0.7640489158978119, 'learning_rate': 4.162327492652463e-06, 'epoch': 0.57} 57%|█████▋ | 19435/34278 [21:19:53<15:32:29, 3.77s/it] 57%|█████▋ | 19436/34278 [21:19:59<17:31:20, 4.25s/it] {'loss': 0.139, 'grad_norm': 1.0578159091295236, 'learning_rate': 4.161861739519498e-06, 'epoch': 0.57} 57%|█████▋ | 19436/34278 [21:19:59<17:31:20, 4.25s/it] 57%|█████▋ | 19437/34278 [21:20:02<16:45:37, 4.07s/it] {'loss': 0.1262, 'grad_norm': 0.810072989231744, 'learning_rate': 4.161395993869232e-06, 'epoch': 0.57} 57%|█████▋ | 19437/34278 [21:20:02<16:45:37, 4.07s/it] 57%|█████▋ | 19438/34278 [21:20:05<15:34:07, 3.78s/it] {'loss': 0.1678, 'grad_norm': 0.9400030183406941, 'learning_rate': 4.160930255705824e-06, 'epoch': 0.57} 57%|█████▋ | 19438/34278 [21:20:05<15:34:07, 3.78s/it] 57%|█████▋ | 19439/34278 [21:20:09<15:30:08, 3.76s/it] {'loss': 0.1263, 'grad_norm': 0.6954126633205726, 'learning_rate': 4.16046452503343e-06, 'epoch': 0.57} 57%|█████▋ | 19439/34278 [21:20:09<15:30:08, 3.76s/it] 57%|█████▋ | 19440/34278 [21:20:13<14:58:12, 3.63s/it] {'loss': 0.1477, 'grad_norm': 0.7525121649948221, 'learning_rate': 
4.159998801856207e-06, 'epoch': 0.57} 57%|█████▋ | 19440/34278 [21:20:13<14:58:12, 3.63s/it] 57%|█████▋ | 19441/34278 [21:20:15<14:07:54, 3.43s/it] {'loss': 0.1292, 'grad_norm': 0.9595718520300346, 'learning_rate': 4.1595330861783145e-06, 'epoch': 0.57} 57%|█████▋ | 19441/34278 [21:20:15<14:07:54, 3.43s/it] 57%|█████▋ | 19442/34278 [21:20:21<17:02:04, 4.13s/it] {'loss': 0.1202, 'grad_norm': 0.7302068471788169, 'learning_rate': 4.15906737800391e-06, 'epoch': 0.57} 57%|█████▋ | 19442/34278 [21:20:21<17:02:04, 4.13s/it] 57%|█████▋ | 19443/34278 [21:20:24<15:42:45, 3.81s/it] {'loss': 0.1483, 'grad_norm': 0.7817238203916237, 'learning_rate': 4.158601677337151e-06, 'epoch': 0.57} 57%|█████▋ | 19443/34278 [21:20:24<15:42:45, 3.81s/it] 57%|█████▋ | 19444/34278 [21:20:28<15:44:17, 3.82s/it] {'loss': 0.1125, 'grad_norm': 0.6014948314245426, 'learning_rate': 4.158135984182197e-06, 'epoch': 0.57} 57%|█████▋ | 19444/34278 [21:20:28<15:44:17, 3.82s/it] 57%|█████▋ | 19445/34278 [21:20:31<14:28:16, 3.51s/it] {'loss': 0.1526, 'grad_norm': 0.9995087962528116, 'learning_rate': 4.157670298543203e-06, 'epoch': 0.57} 57%|█████▋ | 19445/34278 [21:20:31<14:28:16, 3.51s/it] 57%|█████▋ | 19446/34278 [21:20:35<14:31:59, 3.53s/it] {'loss': 0.1301, 'grad_norm': 0.8340204683298592, 'learning_rate': 4.157204620424326e-06, 'epoch': 0.57} 57%|█████▋ | 19446/34278 [21:20:35<14:31:59, 3.53s/it] 57%|█████▋ | 19447/34278 [21:20:40<17:27:08, 4.24s/it] {'loss': 0.1205, 'grad_norm': 1.2041059476698694, 'learning_rate': 4.156738949829728e-06, 'epoch': 0.57} 57%|█████▋ | 19447/34278 [21:20:40<17:27:08, 4.24s/it] 57%|█████▋ | 19448/34278 [21:20:44<16:27:10, 3.99s/it] {'loss': 0.1312, 'grad_norm': 0.7920740428638681, 'learning_rate': 4.156273286763559e-06, 'epoch': 0.57} 57%|█████▋ | 19448/34278 [21:20:44<16:27:10, 3.99s/it] 57%|█████▋ | 19449/34278 [21:20:47<15:29:59, 3.76s/it] {'loss': 0.1199, 'grad_norm': 0.7831400996347803, 'learning_rate': 4.155807631229984e-06, 'epoch': 0.57} 57%|█████▋ | 19449/34278 
[21:20:47<15:29:59, 3.76s/it] 57%|█████▋ | 19450/34278 [21:20:53<18:14:04, 4.43s/it] {'loss': 0.1336, 'grad_norm': 0.8910101049727717, 'learning_rate': 4.155341983233156e-06, 'epoch': 0.57} 57%|█████▋ | 19450/34278 [21:20:53<18:14:04, 4.43s/it] 57%|█████▋ | 19451/34278 [21:20:56<16:46:58, 4.07s/it] {'loss': 0.1135, 'grad_norm': 0.730621387194202, 'learning_rate': 4.154876342777234e-06, 'epoch': 0.57} 57%|█████▋ | 19451/34278 [21:20:56<16:46:58, 4.07s/it] 57%|█████▋ | 19452/34278 [21:20:59<15:09:30, 3.68s/it] {'loss': 0.1188, 'grad_norm': 0.7208364369338753, 'learning_rate': 4.154410709866374e-06, 'epoch': 0.57} 57%|█████▋ | 19452/34278 [21:20:59<15:09:30, 3.68s/it] 57%|█████▋ | 19453/34278 [21:21:02<14:33:04, 3.53s/it] {'loss': 0.1418, 'grad_norm': 0.8334785944597244, 'learning_rate': 4.153945084504733e-06, 'epoch': 0.57} 57%|█████▋ | 19453/34278 [21:21:02<14:33:04, 3.53s/it] 57%|█████▋ | 19454/34278 [21:21:07<16:14:02, 3.94s/it] {'loss': 0.1191, 'grad_norm': 0.805382737106822, 'learning_rate': 4.153479466696467e-06, 'epoch': 0.57} 57%|█████▋ | 19454/34278 [21:21:07<16:14:02, 3.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 57%|█████▋ | 19455/34278 [21:21:10<15:11:51, 3.69s/it] {'loss': 0.1263, 'grad_norm': 1.1381496668325919, 'learning_rate': 4.153013856445736e-06, 'epoch': 0.57} 57%|█████▋ | 19455/34278 [21:21:10<15:11:51, 3.69s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0cab9c0090>
Failed to fetch sample 3353700.
Exception: cannot identify image file <_io.BytesIO object at 0x7f0cab9c0090> 57%|█████▋ | 19456/34278 [21:21:14<15:21:01, 3.73s/it] {'loss': 0.1167, 'grad_norm': 0.892138197681285, 'learning_rate': 4.152548253756694e-06, 'epoch': 0.57} 57%|█████▋ | 19456/34278 [21:21:14<15:21:01, 3.73s/it] 57%|█████▋ | 19457/34278 [21:21:17<14:21:11, 3.49s/it] {'loss': 0.1322, 'grad_norm': 0.8309719989376817, 'learning_rate': 4.152082658633501e-06, 'epoch': 0.57} 57%|█████▋ | 19457/34278 [21:21:17<14:21:11, 3.49s/it] 57%|█████▋ | 19458/34278 [21:21:21<14:54:37, 3.62s/it] {'loss': 0.129, 'grad_norm': 0.856713290710731, 'learning_rate': 4.15161707108031e-06, 'epoch': 0.57} 57%|█████▋ | 19458/34278 [21:21:21<14:54:37, 3.62s/it] 57%|█████▋ | 19459/34278 [21:21:24<14:29:47, 3.52s/it] {'loss': 0.123, 'grad_norm': 0.9598761879946143, 'learning_rate': 4.151151491101279e-06, 'epoch': 0.57} 57%|█████▋ | 19459/34278 [21:21:24<14:29:47, 3.52s/it] 57%|█████▋ | 19460/34278 [21:21:30<17:33:15, 4.26s/it] {'loss': 0.1304, 'grad_norm': 1.011005379721408, 'learning_rate': 4.150685918700565e-06, 'epoch': 0.57} 57%|█████▋ | 19460/34278 [21:21:30<17:33:15, 4.26s/it] 57%|█████▋ | 19461/34278 [21:21:36<19:03:21, 4.63s/it] {'loss': 0.146, 'grad_norm': 0.8848972179178022, 'learning_rate': 4.150220353882325e-06, 'epoch': 0.57} 57%|█████▋ | 19461/34278 [21:21:36<19:03:21, 4.63s/it] 57%|█████▋ | 19462/34278 [21:21:42<21:01:14, 5.11s/it] {'loss': 0.1146, 'grad_norm': 0.7194590108526638, 'learning_rate': 4.149754796650714e-06, 'epoch': 0.57} 57%|█████▋ | 19462/34278 [21:21:42<21:01:14, 5.11s/it] 57%|█████▋ | 19463/34278 [21:21:45<18:38:55, 4.53s/it] {'loss': 0.1268, 'grad_norm': 0.9134909870263048, 'learning_rate': 4.14928924700989e-06, 'epoch': 0.57} 57%|█████▋ | 19463/34278 [21:21:45<18:38:55, 4.53s/it] 57%|█████▋ | 19464/34278 [21:21:49<17:59:50, 4.37s/it] {'loss': 0.1002, 'grad_norm': 0.6926311306615194, 'learning_rate': 4.148823704964009e-06, 'epoch': 0.57} 57%|█████▋ | 19464/34278 [21:21:49<17:59:50, 
57%|█████▋ | 19465/34278 [21:21:54<18:39:31, 4.53s/it] {'loss': 0.1044, 'grad_norm': 0.8663371143277986, 'learning_rate': 4.148358170517226e-06, 'epoch': 0.57}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[... per-step training log, steps 19466-19648 of 34278, epoch 0.57, 21:21:57-21:33:11 elapsed: loss 0.10-0.16, grad_norm mostly 0.6-1.5 (one spike to 4.58 at step 19612), learning_rate decaying 4.148e-06 -> 4.063e-06, ~3.1-5.1 s/it; the checkpoint.py:87 UserWarning above recurs at steps 19468 and 19537 ...]
57%|█████▋ | 19649/34278 [21:33:14<18:30:43, 4.56s/it] {'loss':
0.1216, 'grad_norm': 0.7824878711572439, 'learning_rate': 4.062833551395781e-06, 'epoch': 0.57} 57%|█████▋ | 19649/34278 [21:33:14<18:30:43, 4.56s/it] 57%|█████▋ | 19650/34278 [21:33:17<16:30:30, 4.06s/it] {'loss': 0.1305, 'grad_norm': 0.9747094430619917, 'learning_rate': 4.062369494215935e-06, 'epoch': 0.57} 57%|█████▋ | 19650/34278 [21:33:17<16:30:30, 4.06s/it] 57%|█████▋ | 19651/34278 [21:33:20<15:27:47, 3.81s/it] {'loss': 0.1564, 'grad_norm': 0.7065736982856796, 'learning_rate': 4.061905445407028e-06, 'epoch': 0.57} 57%|█████▋ | 19651/34278 [21:33:20<15:27:47, 3.81s/it] 57%|█████▋ | 19652/34278 [21:33:25<16:30:17, 4.06s/it] {'loss': 0.1394, 'grad_norm': 0.9489454903107976, 'learning_rate': 4.061441404973207e-06, 'epoch': 0.57} 57%|█████▋ | 19652/34278 [21:33:25<16:30:17, 4.06s/it] 57%|█████▋ | 19653/34278 [21:33:31<19:00:20, 4.68s/it] {'loss': 0.1237, 'grad_norm': 0.9276366532392195, 'learning_rate': 4.0609773729186126e-06, 'epoch': 0.57} 57%|█████▋ | 19653/34278 [21:33:31<19:00:20, 4.68s/it] 57%|█████▋ | 19654/34278 [21:33:34<17:02:50, 4.20s/it] {'loss': 0.1281, 'grad_norm': 0.8067987872644518, 'learning_rate': 4.060513349247389e-06, 'epoch': 0.57} 57%|█████▋ | 19654/34278 [21:33:34<17:02:50, 4.20s/it] 57%|█████▋ | 19655/34278 [21:33:38<16:35:05, 4.08s/it] {'loss': 0.1068, 'grad_norm': 0.7668286174008708, 'learning_rate': 4.060049333963677e-06, 'epoch': 0.57} 57%|█████▋ | 19655/34278 [21:33:38<16:35:05, 4.08s/it] 57%|█████▋ | 19656/34278 [21:33:42<16:29:38, 4.06s/it] {'loss': 0.1418, 'grad_norm': 1.0409118486005096, 'learning_rate': 4.059585327071622e-06, 'epoch': 0.57} 57%|█████▋ | 19656/34278 [21:33:42<16:29:38, 4.06s/it] 57%|█████▋ | 19657/34278 [21:33:45<14:59:25, 3.69s/it] {'loss': 0.1323, 'grad_norm': 0.7098214044664539, 'learning_rate': 4.059121328575361e-06, 'epoch': 0.57} 57%|█████▋ | 19657/34278 [21:33:45<14:59:25, 3.69s/it] 57%|█████▋ | 19658/34278 [21:33:51<17:47:42, 4.38s/it] {'loss': 0.1486, 'grad_norm': 0.8026493491423402, 'learning_rate': 
4.058657338479043e-06, 'epoch': 0.57} 57%|█████▋ | 19658/34278 [21:33:51<17:47:42, 4.38s/it] 57%|█████▋ | 19659/34278 [21:33:54<15:59:04, 3.94s/it] {'loss': 0.1272, 'grad_norm': 0.9952965179770774, 'learning_rate': 4.058193356786808e-06, 'epoch': 0.57} 57%|█████▋ | 19659/34278 [21:33:54<15:59:04, 3.94s/it] 57%|█████▋ | 19660/34278 [21:33:57<14:53:26, 3.67s/it] {'loss': 0.1364, 'grad_norm': 1.004014463737942, 'learning_rate': 4.057729383502797e-06, 'epoch': 0.57} 57%|█████▋ | 19660/34278 [21:33:57<14:53:26, 3.67s/it] 57%|█████▋ | 19661/34278 [21:34:02<16:05:22, 3.96s/it] {'loss': 0.1146, 'grad_norm': 0.9945880244164974, 'learning_rate': 4.057265418631152e-06, 'epoch': 0.57} 57%|█████▋ | 19661/34278 [21:34:02<16:05:22, 3.96s/it] 57%|█████▋ | 19662/34278 [21:34:05<15:46:22, 3.88s/it] {'loss': 0.1241, 'grad_norm': 0.7428496129006911, 'learning_rate': 4.056801462176018e-06, 'epoch': 0.57} 57%|█████▋ | 19662/34278 [21:34:05<15:46:22, 3.88s/it] 57%|█████▋ | 19663/34278 [21:34:09<15:36:14, 3.84s/it] {'loss': 0.1445, 'grad_norm': 0.9291305056916853, 'learning_rate': 4.056337514141534e-06, 'epoch': 0.57} 57%|█████▋ | 19663/34278 [21:34:09<15:36:14, 3.84s/it] 57%|█████▋ | 19664/34278 [21:34:12<14:31:00, 3.58s/it] {'loss': 0.1317, 'grad_norm': 0.9821001367157365, 'learning_rate': 4.055873574531844e-06, 'epoch': 0.57} 57%|█████▋ | 19664/34278 [21:34:12<14:31:00, 3.58s/it] 57%|█████▋ | 19665/34278 [21:34:16<14:52:50, 3.67s/it] {'loss': 0.1415, 'grad_norm': 0.7703487540931252, 'learning_rate': 4.055409643351089e-06, 'epoch': 0.57} 57%|█████▋ | 19665/34278 [21:34:16<14:52:50, 3.67s/it] 57%|█████▋ | 19666/34278 [21:34:19<14:34:49, 3.59s/it] {'loss': 0.1429, 'grad_norm': 1.0669377839748413, 'learning_rate': 4.054945720603412e-06, 'epoch': 0.57} 57%|█████▋ | 19666/34278 [21:34:19<14:34:49, 3.59s/it] 57%|█████▋ | 19667/34278 [21:34:23<14:44:40, 3.63s/it] {'loss': 0.116, 'grad_norm': 0.7835812262511627, 'learning_rate': 4.054481806292954e-06, 'epoch': 0.57} 57%|█████▋ | 19667/34278 
[21:34:23<14:44:40, 3.63s/it] 57%|█████▋ | 19668/34278 [21:34:29<17:50:09, 4.39s/it] {'loss': 0.1337, 'grad_norm': 1.0901860140491364, 'learning_rate': 4.054017900423857e-06, 'epoch': 0.57} 57%|█████▋ | 19668/34278 [21:34:29<17:50:09, 4.39s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 57%|█████▋ | 19669/34278 [21:34:32<16:13:36, 4.00s/it] {'loss': 0.1477, 'grad_norm': 0.7255252750423824, 'learning_rate': 4.05355400300026e-06, 'epoch': 0.57} 57%|█████▋ | 19669/34278 [21:34:32<16:13:36, 4.00s/it] 57%|█████▋ | 19670/34278 [21:34:38<17:48:43, 4.39s/it] {'loss': 0.1453, 'grad_norm': 0.8568672255002934, 'learning_rate': 4.0530901140263086e-06, 'epoch': 0.57} 57%|█████▋ | 19670/34278 [21:34:38<17:48:43, 4.39s/it] 57%|█████▋ | 19671/34278 [21:34:40<15:51:48, 3.91s/it] {'loss': 0.1052, 'grad_norm': 0.8489136617872457, 'learning_rate': 4.052626233506144e-06, 'epoch': 0.57} 57%|█████▋ | 19671/34278 [21:34:40<15:51:48, 3.91s/it] 57%|█████▋ | 19672/34278 [21:34:46<18:29:06, 4.56s/it] {'loss': 0.1258, 'grad_norm': 0.8732673269822866, 'learning_rate': 4.052162361443905e-06, 'epoch': 0.57} 57%|█████▋ | 19672/34278 [21:34:46<18:29:06, 4.56s/it] 57%|█████▋ | 19673/34278 [21:34:50<16:52:51, 4.16s/it] {'loss': 0.1313, 'grad_norm': 0.7741536434094117, 'learning_rate': 4.051698497843733e-06, 'epoch': 0.57} 57%|█████▋ | 19673/34278 [21:34:50<16:52:51, 4.16s/it] 57%|█████▋ | 19674/34278 [21:34:53<16:04:45, 3.96s/it] {'loss': 0.1226, 'grad_norm': 0.906186888405836, 'learning_rate': 4.0512346427097725e-06, 'epoch': 0.57} 57%|█████▋ | 19674/34278 [21:34:53<16:04:45, 3.96s/it] 57%|█████▋ | 19675/34278 [21:34:56<14:47:03, 3.64s/it] {'loss': 0.1276, 'grad_norm': 0.8287054882482325, 'learning_rate': 4.05077079604616e-06, 'epoch': 0.57} 57%|█████▋ | 19675/34278 [21:34:56<14:47:03, 3.64s/it] 57%|█████▋ | 19676/34278 
[21:34:59<14:23:47, 3.55s/it] {'loss': 0.1492, 'grad_norm': 0.7243895749605813, 'learning_rate': 4.050306957857041e-06, 'epoch': 0.57} 57%|█████▋ | 19676/34278 [21:34:59<14:23:47, 3.55s/it] 57%|█████▋ | 19677/34278 [21:35:02<13:30:13, 3.33s/it] {'loss': 0.1217, 'grad_norm': 0.7510583587606688, 'learning_rate': 4.049843128146555e-06, 'epoch': 0.57} 57%|█████▋ | 19677/34278 [21:35:02<13:30:13, 3.33s/it] 57%|█████▋ | 19678/34278 [21:35:05<13:30:12, 3.33s/it] {'loss': 0.137, 'grad_norm': 0.9605537640545965, 'learning_rate': 4.0493793069188425e-06, 'epoch': 0.57} 57%|█████▋ | 19678/34278 [21:35:06<13:30:12, 3.33s/it] 57%|█████▋ | 19679/34278 [21:35:10<15:00:56, 3.70s/it] {'loss': 0.1043, 'grad_norm': 0.6980832287768168, 'learning_rate': 4.0489154941780455e-06, 'epoch': 0.57} 57%|█████▋ | 19679/34278 [21:35:10<15:00:56, 3.70s/it] 57%|█████▋ | 19680/34278 [21:35:13<14:09:35, 3.49s/it] {'loss': 0.1273, 'grad_norm': 0.9968242679545044, 'learning_rate': 4.048451689928302e-06, 'epoch': 0.57} 57%|█████▋ | 19680/34278 [21:35:13<14:09:35, 3.49s/it] 57%|█████▋ | 19681/34278 [21:35:17<14:06:20, 3.48s/it] {'loss': 0.107, 'grad_norm': 0.9241971501512688, 'learning_rate': 4.047987894173755e-06, 'epoch': 0.57} 57%|█████▋ | 19681/34278 [21:35:17<14:06:20, 3.48s/it] 57%|█████▋ | 19682/34278 [21:35:21<15:23:58, 3.80s/it] {'loss': 0.1272, 'grad_norm': 0.7666690507839959, 'learning_rate': 4.047524106918545e-06, 'epoch': 0.57} 57%|█████▋ | 19682/34278 [21:35:21<15:23:58, 3.80s/it] 57%|█████▋ | 19683/34278 [21:35:25<15:25:48, 3.81s/it] {'loss': 0.121, 'grad_norm': 0.8163723685752113, 'learning_rate': 4.047060328166813e-06, 'epoch': 0.57} 57%|█████▋ | 19683/34278 [21:35:25<15:25:48, 3.81s/it] 57%|█████▋ | 19684/34278 [21:35:28<15:01:23, 3.71s/it] {'loss': 0.1152, 'grad_norm': 0.7700777990182223, 'learning_rate': 4.0465965579227e-06, 'epoch': 0.57} 57%|█████▋ | 19684/34278 [21:35:28<15:01:23, 3.71s/it] 57%|█████▋ | 19685/34278 [21:35:32<15:02:32, 3.71s/it] {'loss': 0.1389, 'grad_norm': 
0.855520884905867, 'learning_rate': 4.046132796190344e-06, 'epoch': 0.57} 57%|█████▋ | 19685/34278 [21:35:32<15:02:32, 3.71s/it] 57%|█████▋ | 19686/34278 [21:35:35<14:25:48, 3.56s/it] {'loss': 0.1194, 'grad_norm': 0.7351823772848916, 'learning_rate': 4.045669042973886e-06, 'epoch': 0.57} 57%|█████▋ | 19686/34278 [21:35:35<14:25:48, 3.56s/it] 57%|█████▋ | 19687/34278 [21:35:39<14:18:28, 3.53s/it] {'loss': 0.1168, 'grad_norm': 0.7907259457199423, 'learning_rate': 4.045205298277466e-06, 'epoch': 0.57} 57%|█████▋ | 19687/34278 [21:35:39<14:18:28, 3.53s/it] 57%|█████▋ | 19688/34278 [21:35:42<14:27:36, 3.57s/it] {'loss': 0.1407, 'grad_norm': 0.8827805223309084, 'learning_rate': 4.044741562105227e-06, 'epoch': 0.57} 57%|█████▋ | 19688/34278 [21:35:42<14:27:36, 3.57s/it] 57%|█████▋ | 19689/34278 [21:35:46<14:28:30, 3.57s/it] {'loss': 0.1307, 'grad_norm': 0.9773334181207157, 'learning_rate': 4.044277834461308e-06, 'epoch': 0.57} 57%|█████▋ | 19689/34278 [21:35:46<14:28:30, 3.57s/it] 57%|█████▋ | 19690/34278 [21:35:49<14:11:23, 3.50s/it] {'loss': 0.1304, 'grad_norm': 0.812223302005619, 'learning_rate': 4.043814115349848e-06, 'epoch': 0.57} 57%|█████▋ | 19690/34278 [21:35:49<14:11:23, 3.50s/it] 57%|█████▋ | 19691/34278 [21:35:54<15:14:55, 3.76s/it] {'loss': 0.1404, 'grad_norm': 1.0275950849839577, 'learning_rate': 4.043350404774986e-06, 'epoch': 0.57} 57%|█████▋ | 19691/34278 [21:35:54<15:14:55, 3.76s/it] 57%|█████▋ | 19692/34278 [21:35:57<14:33:22, 3.59s/it] {'loss': 0.1111, 'grad_norm': 0.8937972180652621, 'learning_rate': 4.042886702740865e-06, 'epoch': 0.57} 57%|█████▋ | 19692/34278 [21:35:57<14:33:22, 3.59s/it] 57%|█████▋ | 19693/34278 [21:36:01<14:39:08, 3.62s/it] {'loss': 0.1158, 'grad_norm': 0.7054348300623637, 'learning_rate': 4.042423009251622e-06, 'epoch': 0.57} 57%|█████▋ | 19693/34278 [21:36:01<14:39:08, 3.62s/it] 57%|█████▋ | 19694/34278 [21:36:04<14:17:46, 3.53s/it] {'loss': 0.1319, 'grad_norm': 0.9100222587938743, 'learning_rate': 4.041959324311397e-06, 
'epoch': 0.57} 57%|█████▋ | 19694/34278 [21:36:04<14:17:46, 3.53s/it] 57%|█████▋ | 19695/34278 [21:36:07<13:16:42, 3.28s/it] {'loss': 0.1177, 'grad_norm': 0.9097507057876928, 'learning_rate': 4.041495647924331e-06, 'epoch': 0.57} 57%|█████▋ | 19695/34278 [21:36:07<13:16:42, 3.28s/it] 57%|█████▋ | 19696/34278 [21:36:10<13:32:11, 3.34s/it] {'loss': 0.1385, 'grad_norm': 0.7082270876472646, 'learning_rate': 4.041031980094563e-06, 'epoch': 0.57} 57%|█████▋ | 19696/34278 [21:36:10<13:32:11, 3.34s/it] 57%|█████▋ | 19697/34278 [21:36:13<13:15:51, 3.27s/it] {'loss': 0.1304, 'grad_norm': 0.9263768877658136, 'learning_rate': 4.040568320826234e-06, 'epoch': 0.57} 57%|█████▋ | 19697/34278 [21:36:13<13:15:51, 3.27s/it] 57%|█████▋ | 19698/34278 [21:36:16<13:06:57, 3.24s/it] {'loss': 0.1345, 'grad_norm': 1.0005896366809788, 'learning_rate': 4.0401046701234795e-06, 'epoch': 0.57} 57%|█████▋ | 19698/34278 [21:36:16<13:06:57, 3.24s/it] 57%|█████▋ | 19699/34278 [21:36:20<13:54:58, 3.44s/it] {'loss': 0.1477, 'grad_norm': 0.9350469304411964, 'learning_rate': 4.039641027990443e-06, 'epoch': 0.57} 57%|█████▋ | 19699/34278 [21:36:20<13:54:58, 3.44s/it] 57%|█████▋ | 19700/34278 [21:36:25<15:24:02, 3.80s/it] {'loss': 0.1346, 'grad_norm': 0.9217337418915221, 'learning_rate': 4.039177394431262e-06, 'epoch': 0.57} 57%|█████▋ | 19700/34278 [21:36:25<15:24:02, 3.80s/it] 57%|█████▋ | 19701/34278 [21:36:30<16:43:30, 4.13s/it] {'loss': 0.1296, 'grad_norm': 0.7962343875316714, 'learning_rate': 4.038713769450076e-06, 'epoch': 0.57} 57%|█████▋ | 19701/34278 [21:36:30<16:43:30, 4.13s/it] 57%|█████▋ | 19702/34278 [21:36:36<18:42:03, 4.62s/it] {'loss': 0.1381, 'grad_norm': 0.8335662634199607, 'learning_rate': 4.038250153051024e-06, 'epoch': 0.57} 57%|█████▋ | 19702/34278 [21:36:36<18:42:03, 4.62s/it] 57%|█████▋ | 19703/34278 [21:36:39<16:44:30, 4.14s/it] {'loss': 0.1231, 'grad_norm': 1.0278937183408707, 'learning_rate': 4.0377865452382444e-06, 'epoch': 0.57} 57%|█████▋ | 19703/34278 [21:36:39<16:44:30, 
4.14s/it] 57%|█████▋ | 19704/34278 [21:36:42<15:42:13, 3.88s/it] {'loss': 0.1235, 'grad_norm': 0.7782265105030279, 'learning_rate': 4.037322946015876e-06, 'epoch': 0.57} 57%|█████▋ | 19704/34278 [21:36:42<15:42:13, 3.88s/it] 57%|█████▋ | 19705/34278 [21:36:47<17:30:46, 4.33s/it] {'loss': 0.1368, 'grad_norm': 0.8048237422203832, 'learning_rate': 4.03685935538806e-06, 'epoch': 0.57} 57%|█████▋ | 19705/34278 [21:36:47<17:30:46, 4.33s/it] 57%|█████▋ | 19706/34278 [21:36:50<16:07:20, 3.98s/it] {'loss': 0.1317, 'grad_norm': 0.9199416165871753, 'learning_rate': 4.036395773358934e-06, 'epoch': 0.57} 57%|█████▋ | 19706/34278 [21:36:50<16:07:20, 3.98s/it] 57%|█████▋ | 19707/34278 [21:36:54<15:18:51, 3.78s/it] {'loss': 0.1103, 'grad_norm': 0.799419880980573, 'learning_rate': 4.035932199932636e-06, 'epoch': 0.57} 57%|█████▋ | 19707/34278 [21:36:54<15:18:51, 3.78s/it] 57%|█████▋ | 19708/34278 [21:36:57<15:04:53, 3.73s/it] {'loss': 0.1421, 'grad_norm': 1.0907092432324073, 'learning_rate': 4.0354686351133055e-06, 'epoch': 0.57} 57%|█████▋ | 19708/34278 [21:36:57<15:04:53, 3.73s/it] 57%|█████▋ | 19709/34278 [21:37:02<16:40:43, 4.12s/it] {'loss': 0.121, 'grad_norm': 1.0981836394400115, 'learning_rate': 4.035005078905081e-06, 'epoch': 0.57} 57%|█████▋ | 19709/34278 [21:37:02<16:40:43, 4.12s/it] 58%|█████▊ | 19710/34278 [21:37:06<15:51:39, 3.92s/it] {'loss': 0.1446, 'grad_norm': 0.8183032746130807, 'learning_rate': 4.034541531312099e-06, 'epoch': 0.58} 58%|█████▊ | 19710/34278 [21:37:06<15:51:39, 3.92s/it] 58%|█████▊ | 19711/34278 [21:37:09<15:15:41, 3.77s/it] {'loss': 0.1396, 'grad_norm': 0.8314492661481785, 'learning_rate': 4.034077992338501e-06, 'epoch': 0.58} 58%|█████▊ | 19711/34278 [21:37:09<15:15:41, 3.77s/it] 58%|█████▊ | 19712/34278 [21:37:13<15:45:02, 3.89s/it] {'loss': 0.1409, 'grad_norm': 0.9401287982859894, 'learning_rate': 4.0336144619884236e-06, 'epoch': 0.58} 58%|█████▊ | 19712/34278 [21:37:13<15:45:02, 3.89s/it] 58%|█████▊ | 19713/34278 [21:37:16<14:37:52, 3.62s/it] 
{'loss': 0.1176, 'grad_norm': 0.8138107409108649, 'learning_rate': 4.0331509402660066e-06, 'epoch': 0.58} 58%|█████▊ | 19713/34278 [21:37:16<14:37:52, 3.62s/it] 58%|█████▊ | 19714/34278 [21:37:20<14:08:14, 3.49s/it] {'loss': 0.1194, 'grad_norm': 0.7815657198400805, 'learning_rate': 4.032687427175387e-06, 'epoch': 0.58} 58%|█████▊ | 19714/34278 [21:37:20<14:08:14, 3.49s/it] 58%|█████▊ | 19715/34278 [21:37:23<14:31:07, 3.59s/it] {'loss': 0.155, 'grad_norm': 0.9751414630695675, 'learning_rate': 4.0322239227207025e-06, 'epoch': 0.58} 58%|█████▊ | 19715/34278 [21:37:23<14:31:07, 3.59s/it] 58%|█████▊ | 19716/34278 [21:37:27<14:38:49, 3.62s/it] {'loss': 0.1291, 'grad_norm': 0.752295778092462, 'learning_rate': 4.031760426906091e-06, 'epoch': 0.58} 58%|█████▊ | 19716/34278 [21:37:27<14:38:49, 3.62s/it] 58%|█████▊ | 19717/34278 [21:37:30<14:01:36, 3.47s/it] {'loss': 0.1291, 'grad_norm': 0.9306219494970849, 'learning_rate': 4.031296939735693e-06, 'epoch': 0.58} 58%|█████▊ | 19717/34278 [21:37:30<14:01:36, 3.47s/it] 58%|█████▊ | 19718/34278 [21:37:34<14:31:04, 3.59s/it] {'loss': 0.1179, 'grad_norm': 0.7988325175986996, 'learning_rate': 4.0308334612136435e-06, 'epoch': 0.58} 58%|█████▊ | 19718/34278 [21:37:34<14:31:04, 3.59s/it] 58%|█████▊ | 19719/34278 [21:37:40<17:29:34, 4.33s/it] {'loss': 0.1448, 'grad_norm': 1.0465718626053495, 'learning_rate': 4.030369991344083e-06, 'epoch': 0.58} 58%|█████▊ | 19719/34278 [21:37:40<17:29:34, 4.33s/it] 58%|█████▊ | 19720/34278 [21:37:43<15:39:45, 3.87s/it] {'loss': 0.1443, 'grad_norm': 0.8277484942582721, 'learning_rate': 4.029906530131147e-06, 'epoch': 0.58} 58%|█████▊ | 19720/34278 [21:37:43<15:39:45, 3.87s/it] 58%|█████▊ | 19721/34278 [21:37:46<14:21:16, 3.55s/it] {'loss': 0.1201, 'grad_norm': 0.8403907840133241, 'learning_rate': 4.0294430775789735e-06, 'epoch': 0.58} 58%|█████▊ | 19721/34278 [21:37:46<14:21:16, 3.55s/it] 58%|█████▊ | 19722/34278 [21:37:49<13:55:35, 3.44s/it] {'loss': 0.1275, 'grad_norm': 0.7523851084725544, 
'learning_rate': 4.028979633691699e-06, 'epoch': 0.58} 58%|█████▊ | 19722/34278 [21:37:49<13:55:35, 3.44s/it] 58%|█████▊ | 19723/34278 [21:37:52<13:02:35, 3.23s/it] {'loss': 0.1091, 'grad_norm': 0.742304283264885, 'learning_rate': 4.028516198473465e-06, 'epoch': 0.58} 58%|█████▊ | 19723/34278 [21:37:52<13:02:35, 3.23s/it] 58%|█████▊ | 19724/34278 [21:37:55<12:53:43, 3.19s/it] {'loss': 0.1223, 'grad_norm': 0.8433274774611195, 'learning_rate': 4.028052771928406e-06, 'epoch': 0.58} 58%|█████▊ | 19724/34278 [21:37:55<12:53:43, 3.19s/it] 58%|█████▊ | 19725/34278 [21:38:00<15:41:02, 3.88s/it] {'loss': 0.1155, 'grad_norm': 0.8413010352489224, 'learning_rate': 4.027589354060659e-06, 'epoch': 0.58} 58%|█████▊ | 19725/34278 [21:38:00<15:41:02, 3.88s/it] 58%|█████▊ | 19726/34278 [21:38:06<17:35:59, 4.35s/it] {'loss': 0.1365, 'grad_norm': 0.854479311239559, 'learning_rate': 4.027125944874364e-06, 'epoch': 0.58} 58%|█████▊ | 19726/34278 [21:38:06<17:35:59, 4.35s/it] 58%|█████▊ | 19727/34278 [21:38:09<15:51:04, 3.92s/it] {'loss': 0.1311, 'grad_norm': 0.8284366003951291, 'learning_rate': 4.0266625443736555e-06, 'epoch': 0.58} 58%|█████▊ | 19727/34278 [21:38:09<15:51:04, 3.92s/it] 58%|█████▊ | 19728/34278 [21:38:12<15:04:06, 3.73s/it] {'loss': 0.127, 'grad_norm': 0.8834802623640456, 'learning_rate': 4.0261991525626696e-06, 'epoch': 0.58} 58%|█████▊ | 19728/34278 [21:38:12<15:04:06, 3.73s/it] 58%|█████▊ | 19729/34278 [21:38:16<15:22:03, 3.80s/it] {'loss': 0.141, 'grad_norm': 0.6915509347636353, 'learning_rate': 4.025735769445546e-06, 'epoch': 0.58} 58%|█████▊ | 19729/34278 [21:38:16<15:22:03, 3.80s/it] 58%|█████▊ | 19730/34278 [21:38:19<14:17:40, 3.54s/it] {'loss': 0.1306, 'grad_norm': 0.96567692670703, 'learning_rate': 4.025272395026421e-06, 'epoch': 0.58} 58%|█████▊ | 19730/34278 [21:38:19<14:17:40, 3.54s/it] 58%|█████▊ | 19731/34278 [21:38:22<14:00:52, 3.47s/it] {'loss': 0.1569, 'grad_norm': 1.0906606888868646, 'learning_rate': 4.024809029309433e-06, 'epoch': 0.58} 58%|█████▊ | 
19731/34278 [21:38:22<14:00:52, 3.47s/it] 58%|█████▊ | 19732/34278 [21:38:26<14:53:25, 3.69s/it] {'loss': 0.1121, 'grad_norm': 0.6957580102975305, 'learning_rate': 4.024345672298716e-06, 'epoch': 0.58} 58%|█████▊ | 19732/34278 [21:38:26<14:53:25, 3.69s/it] 58%|█████▊ | 19733/34278 [21:38:30<14:23:04, 3.56s/it] {'loss': 0.1072, 'grad_norm': 0.7463641590409569, 'learning_rate': 4.023882323998408e-06, 'epoch': 0.58} 58%|█████▊ | 19733/34278 [21:38:30<14:23:04, 3.56s/it] 58%|█████▊ | 19734/34278 [21:38:36<17:23:25, 4.30s/it] {'loss': 0.1393, 'grad_norm': 0.8857683952020571, 'learning_rate': 4.023418984412644e-06, 'epoch': 0.58} 58%|█████▊ | 19734/34278 [21:38:36<17:23:25, 4.30s/it] 58%|█████▊ | 19735/34278 [21:38:39<16:28:37, 4.08s/it] {'loss': 0.1102, 'grad_norm': 0.8309607274026838, 'learning_rate': 4.022955653545563e-06, 'epoch': 0.58} 58%|█████▊ | 19735/34278 [21:38:39<16:28:37, 4.08s/it] 58%|█████▊ | 19736/34278 [21:38:43<16:13:21, 4.02s/it] {'loss': 0.1309, 'grad_norm': 1.0748862367884913, 'learning_rate': 4.0224923314013025e-06, 'epoch': 0.58} 58%|█████▊ | 19736/34278 [21:38:43<16:13:21, 4.02s/it] 58%|█████▊ | 19737/34278 [21:38:47<16:08:20, 4.00s/it] {'loss': 0.116, 'grad_norm': 0.8220081336407107, 'learning_rate': 4.022029017983996e-06, 'epoch': 0.58} 58%|█████▊ | 19737/34278 [21:38:47<16:08:20, 4.00s/it] 58%|█████▊ | 19738/34278 [21:38:53<18:35:45, 4.60s/it] {'loss': 0.135, 'grad_norm': 0.7963877708342001, 'learning_rate': 4.0215657132977806e-06, 'epoch': 0.58} 58%|█████▊ | 19738/34278 [21:38:53<18:35:45, 4.60s/it] 58%|█████▊ | 19739/34278 [21:38:57<17:55:46, 4.44s/it] {'loss': 0.1059, 'grad_norm': 0.9767353315039462, 'learning_rate': 4.021102417346794e-06, 'epoch': 0.58} 58%|█████▊ | 19739/34278 [21:38:57<17:55:46, 4.44s/it] 58%|█████▊ | 19740/34278 [21:39:00<16:29:03, 4.08s/it] {'loss': 0.1413, 'grad_norm': 1.1487649578395094, 'learning_rate': 4.020639130135169e-06, 'epoch': 0.58} 58%|█████▊ | 19740/34278 [21:39:00<16:29:03, 4.08s/it] 58%|█████▊ | 
19741/34278 [21:39:04<16:18:11, 4.04s/it] {'loss': 0.1264, 'grad_norm': 1.0001431320504899, 'learning_rate': 4.020175851667047e-06, 'epoch': 0.58} 58%|█████▊ | 19741/34278 [21:39:04<16:18:11, 4.04s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 58%|█████▊ | 19742/34278 [21:39:07<15:22:43, 3.81s/it] {'loss': 0.1492, 'grad_norm': 0.8337775920306253, 'learning_rate': 4.019712581946559e-06, 'epoch': 0.58} 58%|█████▊ | 19742/34278 [21:39:07<15:22:43, 3.81s/it] 58%|█████▊ | 19743/34278 [21:39:11<14:27:45, 3.58s/it] {'loss': 0.1351, 'grad_norm': 0.9615227523095687, 'learning_rate': 4.019249320977844e-06, 'epoch': 0.58} 58%|█████▊ | 19743/34278 [21:39:11<14:27:45, 3.58s/it] 58%|█████▊ | 19744/34278 [21:39:15<14:59:31, 3.71s/it] {'loss': 0.122, 'grad_norm': 0.8406770901161117, 'learning_rate': 4.018786068765037e-06, 'epoch': 0.58} 58%|█████▊ | 19744/34278 [21:39:15<14:59:31, 3.71s/it] 58%|█████▊ | 19745/34278 [21:39:19<15:41:20, 3.89s/it] {'loss': 0.1402, 'grad_norm': 1.0603003567825613, 'learning_rate': 4.018322825312273e-06, 'epoch': 0.58} 58%|█████▊ | 19745/34278 [21:39:19<15:41:20, 3.89s/it] 58%|█████▊ | 19746/34278 [21:39:22<14:15:07, 3.53s/it] {'loss': 0.1243, 'grad_norm': 0.834167142924995, 'learning_rate': 4.017859590623688e-06, 'epoch': 0.58} 58%|█████▊ | 19746/34278 [21:39:22<14:15:07, 3.53s/it] 58%|█████▊ | 19747/34278 [21:39:25<14:17:38, 3.54s/it] {'loss': 0.1172, 'grad_norm': 0.8171894066121539, 'learning_rate': 4.01739636470342e-06, 'epoch': 0.58} 58%|█████▊ | 19747/34278 [21:39:25<14:17:38, 3.54s/it] 58%|█████▊ | 19748/34278 [21:39:28<13:45:04, 3.41s/it] {'loss': 0.1216, 'grad_norm': 1.0158128756718992, 'learning_rate': 4.016933147555601e-06, 'epoch': 0.58} 58%|█████▊ | 19748/34278 [21:39:28<13:45:04, 3.41s/it] 58%|█████▊ | 19749/34278 [21:39:31<13:25:56, 3.33s/it] {'loss': 0.1103, 
'grad_norm': 0.7361794378816956, 'learning_rate': 4.01646993918437e-06, 'epoch': 0.58} 58%|█████▊ | 19749/34278 [21:39:31<13:25:56, 3.33s/it] 58%|█████▊ | 19750/34278 [21:39:35<13:24:19, 3.32s/it] {'loss': 0.1125, 'grad_norm': 1.0196663055710706, 'learning_rate': 4.016006739593859e-06, 'epoch': 0.58} 58%|█████▊ | 19750/34278 [21:39:35<13:24:19, 3.32s/it] 58%|█████▊ | 19751/34278 [21:39:38<13:31:21, 3.35s/it] {'loss': 0.1121, 'grad_norm': 0.8065011504438061, 'learning_rate': 4.015543548788206e-06, 'epoch': 0.58} 58%|█████▊ | 19751/34278 [21:39:38<13:31:21, 3.35s/it] 58%|█████▊ | 19752/34278 [21:39:41<13:11:15, 3.27s/it] {'loss': 0.121, 'grad_norm': 1.0641140286394402, 'learning_rate': 4.015080366771543e-06, 'epoch': 0.58} 58%|█████▊ | 19752/34278 [21:39:41<13:11:15, 3.27s/it] 58%|█████▊ | 19753/34278 [21:39:44<12:53:25, 3.19s/it] {'loss': 0.1077, 'grad_norm': 0.82255712865244, 'learning_rate': 4.0146171935480105e-06, 'epoch': 0.58} 58%|█████▊ | 19753/34278 [21:39:44<12:53:25, 3.19s/it] 58%|█████▊ | 19754/34278 [21:39:47<12:57:16, 3.21s/it] {'loss': 0.1296, 'grad_norm': 0.7435633035051162, 'learning_rate': 4.014154029121739e-06, 'epoch': 0.58} 58%|█████▊ | 19754/34278 [21:39:47<12:57:16, 3.21s/it] 58%|█████▊ | 19755/34278 [21:39:51<13:14:33, 3.28s/it] {'loss': 0.1329, 'grad_norm': 0.943976291171618, 'learning_rate': 4.013690873496864e-06, 'epoch': 0.58} 58%|█████▊ | 19755/34278 [21:39:51<13:14:33, 3.28s/it] 58%|█████▊ | 19756/34278 [21:39:56<16:03:10, 3.98s/it] {'loss': 0.1241, 'grad_norm': 0.8523598476593407, 'learning_rate': 4.013227726677524e-06, 'epoch': 0.58} 58%|█████▊ | 19756/34278 [21:39:56<16:03:10, 3.98s/it] 58%|█████▊ | 19757/34278 [21:40:00<15:36:29, 3.87s/it] {'loss': 0.101, 'grad_norm': 0.8075599513321965, 'learning_rate': 4.01276458866785e-06, 'epoch': 0.58} 58%|█████▊ | 19757/34278 [21:40:00<15:36:29, 3.87s/it] 58%|█████▊ | 19758/34278 [21:40:03<14:53:54, 3.69s/it] {'loss': 0.127, 'grad_norm': 1.1751377581481242, 'learning_rate': 
4.012301459471976e-06, 'epoch': 0.58} 58%|█████▊ | 19758/34278 [21:40:03<14:53:54, 3.69s/it] 58%|█████▊ | 19759/34278 [21:40:06<13:50:25, 3.43s/it] {'loss': 0.1468, 'grad_norm': 1.0134788032131126, 'learning_rate': 4.011838339094041e-06, 'epoch': 0.58} 58%|█████▊ | 19759/34278 [21:40:06<13:50:25, 3.43s/it] 58%|█████▊ | 19760/34278 [21:40:10<13:44:23, 3.41s/it] {'loss': 0.1254, 'grad_norm': 0.7576605016473313, 'learning_rate': 4.011375227538176e-06, 'epoch': 0.58} 58%|█████▊ | 19760/34278 [21:40:10<13:44:23, 3.41s/it] 58%|█████▊ | 19761/34278 [21:40:16<16:52:06, 4.18s/it] {'loss': 0.1349, 'grad_norm': 0.8019530711713517, 'learning_rate': 4.0109121248085196e-06, 'epoch': 0.58} 58%|█████▊ | 19761/34278 [21:40:16<16:52:06, 4.18s/it] 58%|█████▊ | 19762/34278 [21:40:21<18:56:39, 4.70s/it] {'loss': 0.1369, 'grad_norm': 0.8733164061042527, 'learning_rate': 4.010449030909202e-06, 'epoch': 0.58} 58%|█████▊ | 19762/34278 [21:40:21<18:56:39, 4.70s/it] 58%|█████▊ | 19763/34278 [21:40:25<17:23:34, 4.31s/it] {'loss': 0.14, 'grad_norm': 1.1058875053330164, 'learning_rate': 4.009985945844359e-06, 'epoch': 0.58} 58%|█████▊ | 19763/34278 [21:40:25<17:23:34, 4.31s/it] 58%|█████▊ | 19764/34278 [21:40:28<15:35:05, 3.87s/it] {'loss': 0.1101, 'grad_norm': 0.7559257276682203, 'learning_rate': 4.009522869618124e-06, 'epoch': 0.58} 58%|█████▊ | 19764/34278 [21:40:28<15:35:05, 3.87s/it] 58%|█████▊ | 19765/34278 [21:40:34<18:02:23, 4.47s/it] {'loss': 0.1282, 'grad_norm': 0.7963224403194995, 'learning_rate': 4.009059802234633e-06, 'epoch': 0.58} 58%|█████▊ | 19765/34278 [21:40:34<18:02:23, 4.47s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 58%|█████▊ | 19766/34278 [21:40:37<16:35:08, 4.11s/it] {'loss': 0.1228, 'grad_norm': 1.0287969571924067, 'learning_rate': 4.008596743698022e-06, 'epoch': 0.58} 58%|█████▊ | 19766/34278 [21:40:37<16:35:08, 4.11s/it] 58%|█████▊ | 19767/34278 [21:40:40<15:18:21, 3.80s/it] {'loss': 0.1485, 'grad_norm': 1.0658700001395847, 'learning_rate': 4.00813369401242e-06, 'epoch': 0.58} 58%|█████▊ | 19767/34278 [21:40:40<15:18:21, 3.80s/it] 58%|█████▊ | 19768/34278 [21:40:44<15:18:19, 3.80s/it] {'loss': 0.121, 'grad_norm': 0.9506619467707396, 'learning_rate': 4.007670653181965e-06, 'epoch': 0.58} 58%|█████▊ | 19768/34278 [21:40:44<15:18:19, 3.80s/it] 58%|█████▊ | 19769/34278 [21:40:47<14:15:16, 3.54s/it] {'loss': 0.1381, 'grad_norm': 1.6668110885406686, 'learning_rate': 4.00720762121079e-06, 'epoch': 0.58} 58%|█████▊ | 19769/34278 [21:40:47<14:15:16, 3.54s/it] 58%|█████▊ | 19770/34278 [21:40:49<13:10:58, 3.27s/it] {'loss': 0.119, 'grad_norm': 1.105236162752622, 'learning_rate': 4.006744598103025e-06, 'epoch': 0.58} 58%|█████▊ | 19770/34278 [21:40:49<13:10:58, 3.27s/it] 58%|█████▊ | 19771/34278 [21:40:52<13:00:42, 3.23s/it] {'loss': 0.1288, 'grad_norm': 1.0975762739219408, 'learning_rate': 4.00628158386281e-06, 'epoch': 0.58} 58%|█████▊ | 19771/34278 [21:40:52<13:00:42, 3.23s/it] 58%|█████▊ | 19772/34278 [21:40:56<12:53:39, 3.20s/it] {'loss': 0.1288, 'grad_norm': 0.9598289749935085, 'learning_rate': 4.005818578494275e-06, 'epoch': 0.58} 58%|█████▊ | 19772/34278 [21:40:56<12:53:39, 3.20s/it] 58%|█████▊ | 19773/34278 [21:40:59<12:42:35, 3.15s/it] {'loss': 0.135, 'grad_norm': 1.2119157824993114, 'learning_rate': 4.005355582001555e-06, 'epoch': 0.58} 58%|█████▊ | 19773/34278 [21:40:59<12:42:35, 3.15s/it] 58%|█████▊ | 19774/34278 [21:41:02<12:42:12, 3.15s/it] {'loss': 0.1347, 'grad_norm': 1.4749088699914519, 'learning_rate': 4.0048925943887835e-06, 'epoch': 0.58} 58%|█████▊ | 19774/34278 [21:41:02<12:42:12, 3.15s/it] 58%|█████▊ | 19775/34278 
[21:41:05<13:08:14, 3.26s/it] {'loss': 0.138, 'grad_norm': 1.125350444889824, 'learning_rate': 4.004429615660092e-06, 'epoch': 0.58} 58%|█████▊ | 19775/34278 [21:41:05<13:08:14, 3.26s/it] 58%|█████▊ | 19776/34278 [21:41:11<16:23:40, 4.07s/it] {'loss': 0.1323, 'grad_norm': 0.8212775273239675, 'learning_rate': 4.003966645819615e-06, 'epoch': 0.58} 58%|█████▊ | 19776/34278 [21:41:11<16:23:40, 4.07s/it] 58%|█████▊ | 19777/34278 [21:41:14<14:55:18, 3.70s/it] {'loss': 0.1376, 'grad_norm': 1.1248482628024037, 'learning_rate': 4.003503684871486e-06, 'epoch': 0.58} 58%|█████▊ | 19777/34278 [21:41:14<14:55:18, 3.70s/it] 58%|█████▊ | 19778/34278 [21:41:18<14:41:31, 3.65s/it] {'loss': 0.1257, 'grad_norm': 1.460473657981357, 'learning_rate': 4.003040732819839e-06, 'epoch': 0.58} 58%|█████▊ | 19778/34278 [21:41:18<14:41:31, 3.65s/it] 58%|█████▊ | 19779/34278 [21:41:21<14:44:10, 3.66s/it] {'loss': 0.1224, 'grad_norm': 0.97374372816712, 'learning_rate': 4.002577789668807e-06, 'epoch': 0.58} 58%|█████▊ | 19779/34278 [21:41:21<14:44:10, 3.66s/it] 58%|█████▊ | 19780/34278 [21:41:26<15:32:38, 3.86s/it] {'loss': 0.1182, 'grad_norm': 0.728353761558451, 'learning_rate': 4.002114855422522e-06, 'epoch': 0.58} 58%|█████▊ | 19780/34278 [21:41:26<15:32:38, 3.86s/it] 58%|█████▊ | 19781/34278 [21:41:29<15:22:03, 3.82s/it] {'loss': 0.1288, 'grad_norm': 0.8199051259301026, 'learning_rate': 4.001651930085117e-06, 'epoch': 0.58} 58%|█████▊ | 19781/34278 [21:41:29<15:22:03, 3.82s/it] 58%|█████▊ | 19782/34278 [21:41:33<14:39:10, 3.64s/it] {'loss': 0.1167, 'grad_norm': 0.9040974601117048, 'learning_rate': 4.0011890136607236e-06, 'epoch': 0.58} 58%|█████▊ | 19782/34278 [21:41:33<14:39:10, 3.64s/it] 58%|█████▊ | 19783/34278 [21:41:36<14:25:10, 3.58s/it] {'loss': 0.1563, 'grad_norm': 0.9500231380617581, 'learning_rate': 4.000726106153479e-06, 'epoch': 0.58} 58%|█████▊ | 19783/34278 [21:41:36<14:25:10, 3.58s/it] 58%|█████▊ | 19784/34278 [21:41:39<13:31:56, 3.36s/it] {'loss': 0.1391, 'grad_norm': 
0.9962617092536757, 'learning_rate': 4.000263207567512e-06, 'epoch': 0.58} 58%|█████▊ | 19784/34278 [21:41:39<13:31:56, 3.36s/it] 58%|█████▊ | 19785/34278 [21:41:42<13:03:12, 3.24s/it] {'loss': 0.1415, 'grad_norm': 0.8647576649946964, 'learning_rate': 3.999800317906956e-06, 'epoch': 0.58} 58%|█████▊ | 19785/34278 [21:41:42<13:03:12, 3.24s/it] 58%|█████▊ | 19786/34278 [21:41:47<15:14:08, 3.78s/it] {'loss': 0.1261, 'grad_norm': 0.9644638617361798, 'learning_rate': 3.999337437175946e-06, 'epoch': 0.58} 58%|█████▊ | 19786/34278 [21:41:47<15:14:08, 3.78s/it] 58%|█████▊ | 19787/34278 [21:41:50<14:48:42, 3.68s/it] {'loss': 0.1204, 'grad_norm': 1.0077523994658744, 'learning_rate': 3.998874565378611e-06, 'epoch': 0.58} 58%|█████▊ | 19787/34278 [21:41:50<14:48:42, 3.68s/it] 58%|█████▊ | 19788/34278 [21:41:55<15:41:56, 3.90s/it] {'loss': 0.1432, 'grad_norm': 0.9483469434896953, 'learning_rate': 3.998411702519083e-06, 'epoch': 0.58} 58%|█████▊ | 19788/34278 [21:41:55<15:41:56, 3.90s/it] 58%|█████▊ | 19789/34278 [21:41:58<14:30:09, 3.60s/it] {'loss': 0.1091, 'grad_norm': 0.6702345503281127, 'learning_rate': 3.997948848601498e-06, 'epoch': 0.58} 58%|█████▊ | 19789/34278 [21:41:58<14:30:09, 3.60s/it] 58%|█████▊ | 19790/34278 [21:42:01<14:00:28, 3.48s/it] {'loss': 0.1255, 'grad_norm': 0.868699841119976, 'learning_rate': 3.997486003629987e-06, 'epoch': 0.58} 58%|█████▊ | 19790/34278 [21:42:01<14:00:28, 3.48s/it] 58%|█████▊ | 19791/34278 [21:42:04<13:33:05, 3.37s/it] {'loss': 0.1401, 'grad_norm': 0.9848451210062822, 'learning_rate': 3.997023167608682e-06, 'epoch': 0.58} 58%|█████▊ | 19791/34278 [21:42:04<13:33:05, 3.37s/it] 58%|█████▊ | 19792/34278 [21:42:08<14:20:17, 3.56s/it] {'loss': 0.1117, 'grad_norm': 0.8777929448891568, 'learning_rate': 3.996560340541714e-06, 'epoch': 0.58} 58%|█████▊ | 19792/34278 [21:42:08<14:20:17, 3.56s/it] 58%|█████▊ | 19793/34278 [21:42:14<17:06:40, 4.25s/it] {'loss': 0.1235, 'grad_norm': 0.7907585752637132, 'learning_rate': 3.996097522433216e-06, 
'epoch': 0.58} 58%|█████▊ | 19793/34278 [21:42:14<17:06:40, 4.25s/it] 58%|█████▊ | 19794/34278 [21:42:17<15:54:55, 3.96s/it] {'loss': 0.1403, 'grad_norm': 0.9023159792247771, 'learning_rate': 3.995634713287317e-06, 'epoch': 0.58} 58%|█████▊ | 19794/34278 [21:42:17<15:54:55, 3.96s/it] 58%|█████▊ | 19795/34278 [21:42:20<15:17:02, 3.80s/it] {'loss': 0.1048, 'grad_norm': 0.8529545228579224, 'learning_rate': 3.995171913108154e-06, 'epoch': 0.58} 58%|█████▊ | 19795/34278 [21:42:21<15:17:02, 3.80s/it] 58%|█████▊ | 19796/34278 [21:42:23<13:58:40, 3.47s/it] {'loss': 0.1578, 'grad_norm': 0.858297678135177, 'learning_rate': 3.994709121899858e-06, 'epoch': 0.58} 58%|█████▊ | 19796/34278 [21:42:23<13:58:40, 3.47s/it] 58%|█████▊ | 19797/34278 [21:42:28<15:06:14, 3.75s/it] {'loss': 0.14, 'grad_norm': 1.002419466038017, 'learning_rate': 3.994246339666557e-06, 'epoch': 0.58} 58%|█████▊ | 19797/34278 [21:42:28<15:06:14, 3.75s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 58%|█████▊ | 19798/34278 [21:42:34<17:52:59, 4.45s/it] {'loss': 0.1296, 'grad_norm': 1.0851157227082862, 'learning_rate': 3.993783566412384e-06, 'epoch': 0.58} 58%|█████▊ | 19798/34278 [21:42:34<17:52:59, 4.45s/it] 58%|█████▊ | 19799/34278 [21:42:37<16:52:03, 4.19s/it] {'loss': 0.1259, 'grad_norm': 0.9576563622048736, 'learning_rate': 3.9933208021414725e-06, 'epoch': 0.58} 58%|█████▊ | 19799/34278 [21:42:37<16:52:03, 4.19s/it] 58%|█████▊ | 19800/34278 [21:42:42<17:57:05, 4.46s/it] {'loss': 0.142, 'grad_norm': 1.0849308098215573, 'learning_rate': 3.9928580468579495e-06, 'epoch': 0.58} 58%|█████▊ | 19800/34278 [21:42:42<17:57:05, 4.46s/it] 58%|█████▊ | 19801/34278 [21:42:48<19:49:29, 4.93s/it] {'loss': 0.1231, 'grad_norm': 0.9446703660505869, 'learning_rate': 3.992395300565953e-06, 'epoch': 0.58} 58%|█████▊ | 19801/34278 [21:42:48<19:49:29, 4.93s/it] 58%|█████▊ | 19802/34278 [21:42:52<18:34:18, 4.62s/it] {'loss': 0.1571, 'grad_norm': 1.1123231866657721, 'learning_rate': 3.991932563269609e-06, 'epoch': 0.58} 58%|█████▊ | 19802/34278 [21:42:52<18:34:18, 4.62s/it] 58%|█████▊ | 19803/34278 [21:42:55<16:45:16, 4.17s/it] {'loss': 0.0973, 'grad_norm': 0.7499829305532338, 'learning_rate': 3.991469834973051e-06, 'epoch': 0.58} 58%|█████▊ | 19803/34278 [21:42:55<16:45:16, 4.17s/it] 58%|█████▊ | 19804/34278 [21:42:59<16:35:48, 4.13s/it] {'loss': 0.1288, 'grad_norm': 0.7121285357367448, 'learning_rate': 3.991007115680411e-06, 'epoch': 0.58} 58%|█████▊ | 19804/34278 [21:42:59<16:35:48, 4.13s/it] 58%|█████▊ | 19805/34278 [21:43:03<15:54:42, 3.96s/it] {'loss': 0.1182, 'grad_norm': 1.0362350184472247, 'learning_rate': 3.990544405395817e-06, 'epoch': 0.58} 58%|█████▊ | 19805/34278 [21:43:03<15:54:42, 3.96s/it] 58%|█████▊ | 19806/34278 [21:43:07<16:21:46, 4.07s/it] {'loss': 0.1465, 'grad_norm': 2.8101228898403883, 'learning_rate': 3.9900817041234e-06, 'epoch': 0.58} 58%|█████▊ | 19806/34278 [21:43:07<16:21:46, 4.07s/it] 58%|█████▊ | 19807/34278 
[21:43:10<15:16:02, 3.80s/it] {'loss': 0.1161, 'grad_norm': 0.9996293048392291, 'learning_rate': 3.989619011867294e-06, 'epoch': 0.58} 58%|█████▊ | 19807/34278 [21:43:10<15:16:02, 3.80s/it] 58%|█████▊ | 19808/34278 [21:43:15<15:31:49, 3.86s/it] {'loss': 0.1117, 'grad_norm': 0.688654226170998, 'learning_rate': 3.989156328631629e-06, 'epoch': 0.58} 58%|█████▊ | 19808/34278 [21:43:15<15:31:49, 3.86s/it] 58%|█████▊ | 19809/34278 [21:43:20<17:03:34, 4.24s/it] {'loss': 0.1198, 'grad_norm': 1.0035116856754207, 'learning_rate': 3.9886936544205354e-06, 'epoch': 0.58} 58%|█████▊ | 19809/34278 [21:43:20<17:03:34, 4.24s/it] 58%|█████▊ | 19810/34278 [21:43:22<15:19:03, 3.81s/it] {'loss': 0.1564, 'grad_norm': 1.1308080666264633, 'learning_rate': 3.988230989238142e-06, 'epoch': 0.58} 58%|█████▊ | 19810/34278 [21:43:23<15:19:03, 3.81s/it] 58%|█████▊ | 19811/34278 [21:43:28<17:35:20, 4.38s/it] {'loss': 0.1276, 'grad_norm': 0.8816919204467307, 'learning_rate': 3.987768333088581e-06, 'epoch': 0.58} 58%|█████▊ | 19811/34278 [21:43:28<17:35:20, 4.38s/it] 58%|█████▊ | 19812/34278 [21:43:32<16:38:25, 4.14s/it] {'loss': 0.1115, 'grad_norm': 0.7839434874727919, 'learning_rate': 3.987305685975982e-06, 'epoch': 0.58} 58%|█████▊ | 19812/34278 [21:43:32<16:38:25, 4.14s/it] 58%|█████▊ | 19813/34278 [21:43:35<15:23:25, 3.83s/it] {'loss': 0.1545, 'grad_norm': 1.1028254583540187, 'learning_rate': 3.9868430479044775e-06, 'epoch': 0.58} 58%|█████▊ | 19813/34278 [21:43:35<15:23:25, 3.83s/it] 58%|█████▊ | 19814/34278 [21:43:38<14:50:31, 3.69s/it] {'loss': 0.1058, 'grad_norm': 0.8066229557752148, 'learning_rate': 3.9863804188781965e-06, 'epoch': 0.58} 58%|█████▊ | 19814/34278 [21:43:38<14:50:31, 3.69s/it] 58%|█████▊ | 19815/34278 [21:43:45<18:12:39, 4.53s/it] {'loss': 0.1703, 'grad_norm': 1.069983565154359, 'learning_rate': 3.985917798901268e-06, 'epoch': 0.58} 58%|█████▊ | 19815/34278 [21:43:45<18:12:39, 4.53s/it] 58%|█████▊ | 19816/34278 [21:43:48<16:14:07, 4.04s/it] {'loss': 0.1419, 'grad_norm': 
0.9194799344167955, 'learning_rate': 3.985455187977825e-06, 'epoch': 0.58} 58%|█████▊ | 19816/34278 [21:43:48<16:14:07, 4.04s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb8a01c78d0>
Failed to fetch sample 2654732.
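The traceback above is raised inside the dataset's `__getitem__` when the bytes fetched for a sample cannot be decoded as an image; the run logs "Failed to fetch sample ..." and keeps training instead of crashing. A minimal sketch of that skip-and-retry pattern, assuming in-memory byte payloads (the class and field names here are hypothetical, not the actual `aguvis/dataset.py` code, which pulls bytes from Ceph via `tcs_loader`):

```python
from io import BytesIO

from PIL import Image, UnidentifiedImageError


class RobustImageRecords:
    """Decodes raw image bytes; a corrupt record falls through to another sample."""

    def __init__(self, records, max_retries=10):
        self.records = records          # list of raw encoded-image bytes (hypothetical layout)
        self.max_retries = max_retries  # give up after this many consecutive failures

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                # Image.open raises UnidentifiedImageError on undecodable bytes
                return Image.open(BytesIO(self.records[i])).convert("RGB")
            except (UnidentifiedImageError, OSError) as exc:
                print(f"Failed to fetch sample {i}.\nException: {exc}")
                i = (i + 1) % len(self.records)  # deterministic fallback: try the next sample
        raise RuntimeError("too many consecutive corrupt samples")
```

Falling back to a neighbouring (or random) index keeps the batch size intact, so one bad record costs only a warning line rather than a crashed worker.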
Exception: cannot identify image file <_io.BytesIO object at 0x7fb8a01c78d0> 58%|█████▊ | 19817/34278 [21:43:51<15:00:51, 3.74s/it] {'loss': 0.0971, 'grad_norm': 0.6317555119245367, 'learning_rate': 3.984992586111995e-06, 'epoch': 0.58} 58%|█████▊ | 19817/34278 [21:43:51<15:00:51, 3.74s/it] 58%|█████▊ | 19818/34278 [21:43:53<13:57:21, 3.47s/it] {'loss': 0.1067, 'grad_norm': 0.7721207344944646, 'learning_rate': 3.984529993307907e-06, 'epoch': 0.58} 58%|█████▊ | 19818/34278 [21:43:54<13:57:21, 3.47s/it] 58%|█████▊ | 19819/34278 [21:43:57<13:47:24, 3.43s/it] {'loss': 0.1371, 'grad_norm': 2.79177121614945, 'learning_rate': 3.984067409569694e-06, 'epoch': 0.58} 58%|█████▊ | 19819/34278 [21:43:57<13:47:24, 3.43s/it] 58%|█████▊ | 19820/34278 [21:44:00<13:06:50, 3.27s/it] {'loss': 0.1339, 'grad_norm': 1.0237265107365645, 'learning_rate': 3.983604834901485e-06, 'epoch': 0.58} 58%|█████▊ | 19820/34278 [21:44:00<13:06:50, 3.27s/it] 58%|█████▊ | 19821/34278 [21:44:03<12:58:50, 3.23s/it] {'loss': 0.1154, 'grad_norm': 0.8966166671272845, 'learning_rate': 3.983142269307411e-06, 'epoch': 0.58} 58%|█████▊ | 19821/34278 [21:44:03<12:58:50, 3.23s/it] 58%|█████▊ | 19822/34278 [21:44:07<13:41:14, 3.41s/it] {'loss': 0.1393, 'grad_norm': 0.8076250089700812, 'learning_rate': 3.982679712791599e-06, 'epoch': 0.58} 58%|█████▊ | 19822/34278 [21:44:07<13:41:14, 3.41s/it] 58%|█████▊ | 19823/34278 [21:44:10<13:26:06, 3.35s/it] {'loss': 0.1224, 'grad_norm': 0.8761378344490751, 'learning_rate': 3.982217165358179e-06, 'epoch': 0.58} 58%|█████▊ | 19823/34278 [21:44:10<13:26:06, 3.35s/it] 58%|█████▊ | 19824/34278 [21:44:13<12:45:06, 3.18s/it] {'loss': 0.105, 'grad_norm': 1.0008965283306264, 'learning_rate': 3.98175462701128e-06, 'epoch': 0.58} 58%|█████▊ | 19824/34278 [21:44:13<12:45:06, 3.18s/it] 58%|█████▊ | 19825/34278 [21:44:16<12:35:59, 3.14s/it] {'loss': 0.1454, 'grad_norm': 0.7722542211462903, 'learning_rate': 3.981292097755034e-06, 'epoch': 0.58} 58%|█████▊ | 19825/34278 [21:44:16<12:35:59, 
3.14s/it] 58%|█████▊ | 19826/34278 [21:44:19<12:27:24, 3.10s/it] {'loss': 0.1124, 'grad_norm': 0.653741040135133, 'learning_rate': 3.98082957759357e-06, 'epoch': 0.58} 58%|█████▊ | 19826/34278 [21:44:19<12:27:24, 3.10s/it] 58%|█████▊ | 19827/34278 [21:44:23<13:42:59, 3.42s/it] {'loss': 0.1221, 'grad_norm': 0.8022381275982864, 'learning_rate': 3.980367066531015e-06, 'epoch': 0.58} 58%|█████▊ | 19827/34278 [21:44:23<13:42:59, 3.42s/it] 58%|█████▊ | 19828/34278 [21:44:26<13:11:03, 3.28s/it] {'loss': 0.1011, 'grad_norm': 0.7649152233724401, 'learning_rate': 3.9799045645715e-06, 'epoch': 0.58} 58%|█████▊ | 19828/34278 [21:44:26<13:11:03, 3.28s/it] 58%|█████▊ | 19829/34278 [21:44:29<12:32:24, 3.12s/it] {'loss': 0.126, 'grad_norm': 0.7485276808645908, 'learning_rate': 3.979442071719154e-06, 'epoch': 0.58} 58%|█████▊ | 19829/34278 [21:44:29<12:32:24, 3.12s/it] 58%|█████▊ | 19830/34278 [21:44:32<12:40:00, 3.16s/it] {'loss': 0.1185, 'grad_norm': 1.19238550015347, 'learning_rate': 3.978979587978102e-06, 'epoch': 0.58} 58%|█████▊ | 19830/34278 [21:44:32<12:40:00, 3.16s/it] 58%|█████▊ | 19831/34278 [21:44:36<14:00:12, 3.49s/it] {'loss': 0.1173, 'grad_norm': 0.8311169837843856, 'learning_rate': 3.978517113352481e-06, 'epoch': 0.58} 58%|█████▊ | 19831/34278 [21:44:36<14:00:12, 3.49s/it] 58%|█████▊ | 19832/34278 [21:44:39<13:50:45, 3.45s/it] {'loss': 0.1265, 'grad_norm': 0.9373734781385317, 'learning_rate': 3.978054647846413e-06, 'epoch': 0.58} 58%|█████▊ | 19832/34278 [21:44:39<13:50:45, 3.45s/it] 58%|█████▊ | 19833/34278 [21:44:43<13:40:48, 3.41s/it] {'loss': 0.1215, 'grad_norm': 0.7231410845994454, 'learning_rate': 3.97759219146403e-06, 'epoch': 0.58} 58%|█████▊ | 19833/34278 [21:44:43<13:40:48, 3.41s/it] 58%|█████▊ | 19834/34278 [21:44:48<15:41:49, 3.91s/it] {'loss': 0.1323, 'grad_norm': 0.720753605115125, 'learning_rate': 3.977129744209461e-06, 'epoch': 0.58} 58%|█████▊ | 19834/34278 [21:44:48<15:41:49, 3.91s/it] 58%|█████▊ | 19835/34278 [21:44:51<15:00:56, 3.74s/it] {'loss': 
0.123, 'grad_norm': 0.7621961911100231, 'learning_rate': 3.976667306086831e-06, 'epoch': 0.58} 58%|█████▊ | 19835/34278 [21:44:51<15:00:56, 3.74s/it] 58%|█████▊ | 19836/34278 [21:44:55<14:38:54, 3.65s/it] {'loss': 0.1346, 'grad_norm': 0.940698061770366, 'learning_rate': 3.976204877100272e-06, 'epoch': 0.58} 58%|█████▊ | 19836/34278 [21:44:55<14:38:54, 3.65s/it] 58%|█████▊ | 19837/34278 [21:44:58<14:01:23, 3.50s/it] {'loss': 0.1208, 'grad_norm': 0.693390098201076, 'learning_rate': 3.975742457253911e-06, 'epoch': 0.58} 58%|█████▊ | 19837/34278 [21:44:58<14:01:23, 3.50s/it] 58%|█████▊ | 19838/34278 [21:45:01<13:13:17, 3.30s/it] {'loss': 0.1317, 'grad_norm': 0.7374938419856453, 'learning_rate': 3.975280046551877e-06, 'epoch': 0.58} 58%|█████▊ | 19838/34278 [21:45:01<13:13:17, 3.30s/it] 58%|█████▊ | 19839/34278 [21:45:04<13:26:06, 3.35s/it] {'loss': 0.1503, 'grad_norm': 0.8774612806914714, 'learning_rate': 3.9748176449983e-06, 'epoch': 0.58} 58%|█████▊ | 19839/34278 [21:45:04<13:26:06, 3.35s/it] 58%|█████▊ | 19840/34278 [21:45:07<13:27:53, 3.36s/it] {'loss': 0.1126, 'grad_norm': 0.734475246427361, 'learning_rate': 3.974355252597304e-06, 'epoch': 0.58} 58%|█████▊ | 19840/34278 [21:45:07<13:27:53, 3.36s/it] 58%|█████▊ | 19841/34278 [21:45:11<13:50:37, 3.45s/it] {'loss': 0.1195, 'grad_norm': 0.886860910766383, 'learning_rate': 3.973892869353021e-06, 'epoch': 0.58} 58%|█████▊ | 19841/34278 [21:45:11<13:50:37, 3.45s/it] 58%|█████▊ | 19842/34278 [21:45:14<13:10:39, 3.29s/it] {'loss': 0.131, 'grad_norm': 1.041479930540893, 'learning_rate': 3.973430495269576e-06, 'epoch': 0.58} 58%|█████▊ | 19842/34278 [21:45:14<13:10:39, 3.29s/it] 58%|█████▊ | 19843/34278 [21:45:20<16:42:30, 4.17s/it] {'loss': 0.1039, 'grad_norm': 0.7043167579534947, 'learning_rate': 3.9729681303510995e-06, 'epoch': 0.58} 58%|█████▊ | 19843/34278 [21:45:20<16:42:30, 4.17s/it] 58%|█████▊ | 19844/34278 [21:45:23<15:15:37, 3.81s/it] {'loss': 0.1043, 'grad_norm': 0.7283009365961199, 'learning_rate': 
3.972505774601718e-06, 'epoch': 0.58} 58%|█████▊ | 19844/34278 [21:45:23<15:15:37, 3.81s/it] 58%|█████▊ | 19845/34278 [21:45:26<14:26:55, 3.60s/it] {'loss': 0.1422, 'grad_norm': 1.1139644686340868, 'learning_rate': 3.97204342802556e-06, 'epoch': 0.58} 58%|█████▊ | 19845/34278 [21:45:26<14:26:55, 3.60s/it] 58%|█████▊ | 19846/34278 [21:45:29<13:37:01, 3.40s/it] {'loss': 0.1109, 'grad_norm': 0.8158675132918699, 'learning_rate': 3.971581090626754e-06, 'epoch': 0.58} 58%|█████▊ | 19846/34278 [21:45:29<13:37:01, 3.40s/it] 58%|█████▊ | 19847/34278 [21:45:33<14:31:39, 3.62s/it] {'loss': 0.1268, 'grad_norm': 0.939665905198642, 'learning_rate': 3.971118762409425e-06, 'epoch': 0.58} 58%|█████▊ | 19847/34278 [21:45:33<14:31:39, 3.62s/it] 58%|█████▊ | 19848/34278 [21:45:36<13:34:25, 3.39s/it] {'loss': 0.1298, 'grad_norm': 0.8274508180598745, 'learning_rate': 3.970656443377701e-06, 'epoch': 0.58} 58%|█████▊ | 19848/34278 [21:45:36<13:34:25, 3.39s/it] 58%|█████▊ | 19849/34278 [21:45:40<13:32:13, 3.38s/it] {'loss': 0.1423, 'grad_norm': 1.0217875282158446, 'learning_rate': 3.970194133535712e-06, 'epoch': 0.58} 58%|█████▊ | 19849/34278 [21:45:40<13:32:13, 3.38s/it] 58%|█████▊ | 19850/34278 [21:45:42<12:55:44, 3.23s/it] {'loss': 0.1324, 'grad_norm': 0.8823670079489304, 'learning_rate': 3.9697318328875835e-06, 'epoch': 0.58} 58%|█████▊ | 19850/34278 [21:45:42<12:55:44, 3.23s/it] 58%|█████▊ | 19851/34278 [21:45:46<12:56:35, 3.23s/it] {'loss': 0.1325, 'grad_norm': 0.808542118756704, 'learning_rate': 3.969269541437444e-06, 'epoch': 0.58} 58%|█████▊ | 19851/34278 [21:45:46<12:56:35, 3.23s/it] 58%|█████▊ | 19852/34278 [21:45:49<12:59:49, 3.24s/it] {'loss': 0.1634, 'grad_norm': 0.8857303744791257, 'learning_rate': 3.96880725918942e-06, 'epoch': 0.58} 58%|█████▊ | 19852/34278 [21:45:49<12:59:49, 3.24s/it] 58%|█████▊ | 19853/34278 [21:45:53<13:50:25, 3.45s/it] {'loss': 0.1265, 'grad_norm': 0.6887971717287216, 'learning_rate': 3.968344986147637e-06, 'epoch': 0.58} 58%|█████▊ | 19853/34278 
[21:45:53<13:50:25, 3.45s/it] 58%|█████▊ | 19854/34278 [21:45:57<14:17:15, 3.57s/it] {'loss': 0.1317, 'grad_norm': 0.9153616050952632, 'learning_rate': 3.967882722316224e-06, 'epoch': 0.58} 58%|█████▊ | 19854/34278 [21:45:57<14:17:15, 3.57s/it] 58%|█████▊ | 19855/34278 [21:46:08<23:36:37, 5.89s/it] {'loss': 0.1263, 'grad_norm': 0.884300541454304, 'learning_rate': 3.967420467699309e-06, 'epoch': 0.58} 58%|█████▊ | 19855/34278 [21:46:08<23:36:37, 5.89s/it] 58%|█████▊ | 19856/34278 [21:46:11<20:02:39, 5.00s/it] {'loss': 0.1207, 'grad_norm': 0.7977231072198944, 'learning_rate': 3.9669582223010175e-06, 'epoch': 0.58} 58%|█████▊ | 19856/34278 [21:46:11<20:02:39, 5.00s/it] 58%|█████▊ | 19857/34278 [21:46:14<17:47:25, 4.44s/it] {'loss': 0.1203, 'grad_norm': 0.7568274865395707, 'learning_rate': 3.966495986125476e-06, 'epoch': 0.58} 58%|█████▊ | 19857/34278 [21:46:14<17:47:25, 4.44s/it] 58%|█████▊ | 19858/34278 [21:46:25<25:44:21, 6.43s/it] {'loss': 0.1226, 'grad_norm': 0.7097346775430343, 'learning_rate': 3.966033759176811e-06, 'epoch': 0.58} 58%|█████▊ | 19858/34278 [21:46:25<25:44:21, 6.43s/it] 58%|█████▊ | 19859/34278 [21:46:28<21:48:24, 5.44s/it] {'loss': 0.1163, 'grad_norm': 0.9743664825150998, 'learning_rate': 3.965571541459153e-06, 'epoch': 0.58} 58%|█████▊ | 19859/34278 [21:46:28<21:48:24, 5.44s/it] 58%|█████▊ | 19860/34278 [21:46:34<21:49:49, 5.45s/it] {'loss': 0.124, 'grad_norm': 0.7941952176074787, 'learning_rate': 3.96510933297662e-06, 'epoch': 0.58} 58%|█████▊ | 19860/34278 [21:46:34<21:49:49, 5.45s/it] 58%|█████▊ | 19861/34278 [21:46:38<20:42:57, 5.17s/it] {'loss': 0.1318, 'grad_norm': 1.0758974343281034, 'learning_rate': 3.964647133733347e-06, 'epoch': 0.58} 58%|█████▊ | 19861/34278 [21:46:38<20:42:57, 5.17s/it] 58%|█████▊ | 19862/34278 [21:47:03<43:49:51, 10.95s/it] {'loss': 0.1609, 'grad_norm': 0.9089754466127158, 'learning_rate': 3.964184943733457e-06, 'epoch': 0.58} 58%|█████▊ | 19862/34278 [21:47:03<43:49:51, 10.95s/it] 58%|█████▊ | 19863/34278 
[21:47:06<34:23:54, 8.59s/it] {'loss': 0.1447, 'grad_norm': 0.8361578672702864, 'learning_rate': 3.963722762981076e-06, 'epoch': 0.58} 58%|█████▊ | 19863/34278 [21:47:06<34:23:54, 8.59s/it] 58%|█████▊ | 19864/34278 [21:47:17<38:01:57, 9.50s/it] {'loss': 0.1047, 'grad_norm': 0.7861331954177537, 'learning_rate': 3.963260591480332e-06, 'epoch': 0.58} 58%|█████▊ | 19864/34278 [21:47:17<38:01:57, 9.50s/it] 58%|█████▊ | 19865/34278 [21:47:32<43:49:47, 10.95s/it] {'loss': 0.1518, 'grad_norm': 0.8176470971516183, 'learning_rate': 3.962798429235349e-06, 'epoch': 0.58} 58%|█████▊ | 19865/34278 [21:47:32<43:49:47, 10.95s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f69138514e0>
Failed to fetch sample 3291475.
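Recurring corrupt records like this one can also be weeded out before training by scanning the dataset once with PIL's `verify()`, which does a cheap structural check without a full decode. A sketch under the same assumption of raw in-memory byte payloads (the function name is hypothetical):

```python
from io import BytesIO

from PIL import Image, UnidentifiedImageError


def find_corrupt_images(payloads):
    """Return indices of payloads whose bytes cannot be decoded as an image.

    `payloads` is any sequence of raw encoded-image bytes; for an object store
    you would stream each blob here instead.
    """
    bad = []
    for idx, raw in enumerate(payloads):
        try:
            with Image.open(BytesIO(raw)) as img:
                img.verify()  # structural check only; does not decode pixel data
        except (UnidentifiedImageError, OSError, SyntaxError):
            bad.append(idx)
    return bad
```

Running such a scan offline and dropping the flagged indices from the sample list avoids paying the per-step retry cost (and the noisy tracebacks) during the actual training run.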
Exception: cannot identify image file <_io.BytesIO object at 0x7f69138514e0> 58%|█████▊ | 19866/34278 [21:47:35<34:47:38, 8.69s/it] {'loss': 0.1481, 'grad_norm': 0.7700184030086775, 'learning_rate': 3.9623362762502525e-06, 'epoch': 0.58} 58%|█████▊ | 19866/34278 [21:47:35<34:47:38, 8.69s/it] 58%|█████▊ | 19867/34278 [21:47:59<53:26:04, 13.35s/it] {'loss': 0.1272, 'grad_norm': 0.8709540450684102, 'learning_rate': 3.961874132529172e-06, 'epoch': 0.58} 58%|█████▊ | 19867/34278 [21:47:59<53:26:04, 13.35s/it] 58%|█████▊ | 19868/34278 [21:48:06<44:43:44, 11.17s/it] {'loss': 0.1287, 'grad_norm': 0.8419206805683189, 'learning_rate': 3.961411998076231e-06, 'epoch': 0.58} 58%|█████▊ | 19868/34278 [21:48:06<44:43:44, 11.17s/it] 58%|█████▊ | 19869/34278 [21:48:17<44:55:33, 11.22s/it] {'loss': 0.1256, 'grad_norm': 0.7653976739221687, 'learning_rate': 3.960949872895556e-06, 'epoch': 0.58} 58%|█████▊ | 19869/34278 [21:48:17<44:55:33, 11.22s/it] 58%|█████▊ | 19870/34278 [21:48:28<45:05:12, 11.27s/it] {'loss': 0.12, 'grad_norm': 0.8034744332979992, 'learning_rate': 3.960487756991272e-06, 'epoch': 0.58} 58%|█████▊ | 19870/34278 [21:48:28<45:05:12, 11.27s/it] 58%|█████▊ | 19871/34278 [21:48:31<34:38:42, 8.66s/it] {'loss': 0.147, 'grad_norm': 0.8200857262003038, 'learning_rate': 3.9600256503675054e-06, 'epoch': 0.58} 58%|█████▊ | 19871/34278 [21:48:31<34:38:42, 8.66s/it] 58%|█████▊ | 19872/34278 [21:48:45<41:11:08, 10.29s/it] {'loss': 0.1353, 'grad_norm': 0.7411616313533512, 'learning_rate': 3.95956355302838e-06, 'epoch': 0.58} 58%|█████▊ | 19872/34278 [21:48:45<41:11:08, 10.29s/it] 58%|█████▊ | 19873/34278 [21:48:56<42:26:59, 10.61s/it] {'loss': 0.1358, 'grad_norm': 0.6780823267716775, 'learning_rate': 3.959101464978026e-06, 'epoch': 0.58} 58%|█████▊ | 19873/34278 [21:48:56<42:26:59, 10.61s/it] 58%|█████▊ | 19874/34278 [21:49:24<63:14:22, 15.81s/it] {'loss': 0.1378, 'grad_norm': 0.9806931616582027, 'learning_rate': 3.958639386220564e-06, 'epoch': 0.58} 58%|█████▊ | 19874/34278 
[21:49:24<63:14:22, 15.81s/it] 58%|█████▊ | 19875/34278 [21:49:37<59:09:17, 14.79s/it] {'loss': 0.1326, 'grad_norm': 0.8598138366783364, 'learning_rate': 3.9581773167601205e-06, 'epoch': 0.58} 58%|█████▊ | 19875/34278 [21:49:37<59:09:17, 14.79s/it] 58%|█████▊ | 19876/34278 [21:49:51<58:30:13, 14.62s/it] {'loss': 0.142, 'grad_norm': 0.7139813517210786, 'learning_rate': 3.957715256600822e-06, 'epoch': 0.58} 58%|█████▊ | 19876/34278 [21:49:51<58:30:13, 14.62s/it] 58%|█████▊ | 19877/34278 [21:50:11<64:44:31, 16.18s/it] {'loss': 0.143, 'grad_norm': 0.7566984482264986, 'learning_rate': 3.957253205746793e-06, 'epoch': 0.58} 58%|█████▊ | 19877/34278 [21:50:11<64:44:31, 16.18s/it] 58%|█████▊ | 19878/34278 [21:50:29<66:58:20, 16.74s/it] {'loss': 0.116, 'grad_norm': 0.7622060336518932, 'learning_rate': 3.956791164202158e-06, 'epoch': 0.58} 58%|█████▊ | 19878/34278 [21:50:29<66:58:20, 16.74s/it] 58%|█████▊ | 19879/34278 [21:51:01<85:39:55, 21.42s/it] {'loss': 0.1187, 'grad_norm': 0.7702675669435379, 'learning_rate': 3.9563291319710416e-06, 'epoch': 0.58} 58%|█████▊ | 19879/34278 [21:51:01<85:39:55, 21.42s/it] 58%|█████▊ | 19880/34278 [21:51:04<63:47:43, 15.95s/it] {'loss': 0.1215, 'grad_norm': 0.7526465653630454, 'learning_rate': 3.95586710905757e-06, 'epoch': 0.58} 58%|█████▊ | 19880/34278 [21:51:04<63:47:43, 15.95s/it] 58%|█████▊ | 19881/34278 [21:51:07<48:26:42, 12.11s/it] {'loss': 0.1308, 'grad_norm': 0.7270363122337156, 'learning_rate': 3.955405095465869e-06, 'epoch': 0.58} 58%|█████▊ | 19881/34278 [21:51:07<48:26:42, 12.11s/it] 58%|█████▊ | 19882/34278 [21:51:19<47:27:21, 11.87s/it] {'loss': 0.1318, 'grad_norm': 0.7641247848398072, 'learning_rate': 3.9549430912000605e-06, 'epoch': 0.58} 58%|█████▊ | 19882/34278 [21:51:19<47:27:21, 11.87s/it] 58%|█████▊ | 19883/34278 [21:51:31<47:58:47, 12.00s/it] {'loss': 0.1183, 'grad_norm': 0.9114894132623551, 'learning_rate': 3.954481096264272e-06, 'epoch': 0.58} 58%|█████▊ | 19883/34278 [21:51:31<47:58:47, 
12.00s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 58%|█████▊ | 19884/34278 [21:51:45<50:21:09, 12.59s/it] {'loss': 0.1168, 'grad_norm': 0.9135341903234376, 'learning_rate': 3.954019110662624e-06, 'epoch': 0.58} 58%|█████▊ | 19884/34278 [21:51:45<50:21:09, 12.59s/it] 58%|█████▊ | 19885/34278 [21:52:07<61:34:23, 15.40s/it] {'loss': 0.108, 'grad_norm': 0.7507602960813876, 'learning_rate': 3.953557134399245e-06, 'epoch': 0.58} 58%|█████▊ | 19885/34278 [21:52:07<61:34:23, 15.40s/it] 58%|█████▊ | 19886/34278 [21:52:18<56:15:23, 14.07s/it] {'loss': 0.1369, 'grad_norm': 0.755358982570575, 'learning_rate': 3.95309516747826e-06, 'epoch': 0.58} 58%|█████▊ | 19886/34278 [21:52:18<56:15:23, 14.07s/it] 58%|█████▊ | 19887/34278 [21:52:29<52:38:10, 13.17s/it] {'loss': 0.1288, 'grad_norm': 0.7412791489904513, 'learning_rate': 3.95263320990379e-06, 'epoch': 0.58} 58%|█████▊ | 19887/34278 [21:52:29<52:38:10, 13.17s/it] 58%|█████▊ | 19888/34278 [21:52:54<67:16:11, 16.83s/it] {'loss': 0.1324, 'grad_norm': 0.9177539700374737, 'learning_rate': 3.95217126167996e-06, 'epoch': 0.58} 58%|█████▊ | 19888/34278 [21:52:54<67:16:11, 16.83s/it] 58%|█████▊ | 19889/34278 [21:53:06<60:40:15, 15.18s/it] {'loss': 0.108, 'grad_norm': 0.721199355245318, 'learning_rate': 3.951709322810896e-06, 'epoch': 0.58} 58%|█████▊ | 19889/34278 [21:53:06<60:40:15, 15.18s/it] 58%|█████▊ | 19890/34278 [21:53:29<70:23:00, 17.61s/it] {'loss': 0.1269, 'grad_norm': 0.9920530278610725, 'learning_rate': 3.9512473933007185e-06, 'epoch': 0.58} 58%|█████▊ | 19890/34278 [21:53:29<70:23:00, 17.61s/it] 58%|█████▊ | 19891/34278 [21:53:40<62:20:39, 15.60s/it] {'loss': 0.1365, 'grad_norm': 0.9430818114541443, 'learning_rate': 3.950785473153557e-06, 'epoch': 0.58} 58%|█████▊ | 19891/34278 [21:53:40<62:20:39, 15.60s/it] 58%|█████▊ | 19892/34278 
[21:54:02<69:59:19, 17.51s/it] {'loss': 0.1185, 'grad_norm': 0.8576843877655337, 'learning_rate': 3.950323562373531e-06, 'epoch': 0.58} 58%|█████▊ | 19892/34278 [21:54:02<69:59:19, 17.51s/it] 58%|█████▊ | 19893/34278 [21:54:13<62:48:52, 15.72s/it] {'loss': 0.1158, 'grad_norm': 0.7132914824219657, 'learning_rate': 3.949861660964766e-06, 'epoch': 0.58} 58%|█████▊ | 19893/34278 [21:54:13<62:48:52, 15.72s/it] 58%|█████▊ | 19894/34278 [21:54:29<62:56:06, 15.75s/it] {'loss': 0.1146, 'grad_norm': 0.9576143100258052, 'learning_rate': 3.949399768931386e-06, 'epoch': 0.58} 58%|█████▊ | 19894/34278 [21:54:29<62:56:06, 15.75s/it] 58%|█████▊ | 19895/34278 [21:54:50<69:24:48, 17.37s/it] {'loss': 0.1212, 'grad_norm': 0.7590347442726993, 'learning_rate': 3.948937886277511e-06, 'epoch': 0.58} 58%|█████▊ | 19895/34278 [21:54:50<69:24:48, 17.37s/it] 58%|█████▊ | 19896/34278 [21:55:01<61:36:16, 15.42s/it] {'loss': 0.1384, 'grad_norm': 0.7955627615021786, 'learning_rate': 3.948476013007271e-06, 'epoch': 0.58} 58%|█████▊ | 19896/34278 [21:55:01<61:36:16, 15.42s/it] 58%|█████▊ | 19897/34278 [21:55:17<61:47:02, 15.47s/it] {'loss': 0.1275, 'grad_norm': 0.794198650050654, 'learning_rate': 3.948014149124785e-06, 'epoch': 0.58} 58%|█████▊ | 19897/34278 [21:55:17<61:47:02, 15.47s/it] 58%|█████▊ | 19898/34278 [21:55:28<56:07:03, 14.05s/it] {'loss': 0.1427, 'grad_norm': 0.8803440218187136, 'learning_rate': 3.947552294634177e-06, 'epoch': 0.58} 58%|█████▊ | 19898/34278 [21:55:28<56:07:03, 14.05s/it] 58%|█████▊ | 19899/34278 [21:55:31<42:58:38, 10.76s/it] {'loss': 0.1209, 'grad_norm': 1.1473174799434247, 'learning_rate': 3.947090449539573e-06, 'epoch': 0.58} 58%|█████▊ | 19899/34278 [21:55:31<42:58:38, 10.76s/it] 58%|█████▊ | 19900/34278 [21:55:54<58:38:04, 14.68s/it] {'loss': 0.1184, 'grad_norm': 0.7649924331445636, 'learning_rate': 3.946628613845092e-06, 'epoch': 0.58} 58%|█████▊ | 19900/34278 [21:55:55<58:38:04, 14.68s/it] 58%|█████▊ | 19901/34278 [21:55:58<45:50:52, 11.48s/it] {'loss': 0.1421, 
'grad_norm': 0.8568457914856835, 'learning_rate': 3.9461667875548594e-06, 'epoch': 0.58} 58%|█████▊ | 19901/34278 [21:55:58<45:50:52, 11.48s/it] 58%|█████▊ | 19902/34278 [21:56:02<36:55:48, 9.25s/it] {'loss': 0.1465, 'grad_norm': 0.9987021030237287, 'learning_rate': 3.945704970672998e-06, 'epoch': 0.58} 58%|█████▊ | 19902/34278 [21:56:03<36:55:48, 9.25s/it] 58%|█████▊ | 19903/34278 [21:56:16<42:00:16, 10.52s/it] {'loss': 0.1352, 'grad_norm': 0.7834915474880942, 'learning_rate': 3.9452431632036326e-06, 'epoch': 0.58} 58%|█████▊ | 19903/34278 [21:56:16<42:00:16, 10.52s/it] 58%|█████▊ | 19904/34278 [21:56:30<46:28:53, 11.64s/it] {'loss': 0.1137, 'grad_norm': 0.9804178230128983, 'learning_rate': 3.944781365150883e-06, 'epoch': 0.58} 58%|█████▊ | 19904/34278 [21:56:30<46:28:53, 11.64s/it] 58%|█████▊ | 19905/34278 [21:56:46<51:54:46, 13.00s/it] {'loss': 0.1191, 'grad_norm': 0.7419456224991688, 'learning_rate': 3.944319576518874e-06, 'epoch': 0.58} 58%|█████▊ | 19905/34278 [21:56:46<51:54:46, 13.00s/it] 58%|█████▊ | 19906/34278 [21:57:03<55:48:57, 13.98s/it] {'loss': 0.149, 'grad_norm': 0.8857004541264865, 'learning_rate': 3.943857797311729e-06, 'epoch': 0.58} 58%|█████▊ | 19906/34278 [21:57:03<55:48:57, 13.98s/it] 58%|█████▊ | 19907/34278 [21:57:13<51:51:01, 12.99s/it] {'loss': 0.1426, 'grad_norm': 0.8853359666038874, 'learning_rate': 3.943396027533566e-06, 'epoch': 0.58} 58%|█████▊ | 19907/34278 [21:57:13<51:51:01, 12.99s/it] 58%|█████▊ | 19908/34278 [21:57:20<43:51:50, 10.99s/it] {'loss': 0.1339, 'grad_norm': 0.7817563314982279, 'learning_rate': 3.942934267188514e-06, 'epoch': 0.58} 58%|█████▊ | 19908/34278 [21:57:20<43:51:50, 10.99s/it] 58%|█████▊ | 19909/34278 [21:57:26<37:49:41, 9.48s/it] {'loss': 0.1477, 'grad_norm': 1.0440011384952868, 'learning_rate': 3.942472516280691e-06, 'epoch': 0.58} 58%|█████▊ | 19909/34278 [21:57:26<37:49:41, 9.48s/it] 58%|█████▊ | 19910/34278 [21:57:29<30:17:27, 7.59s/it] {'loss': 0.1496, 'grad_norm': 0.7198382432564843, 'learning_rate': 
3.942010774814222e-06, 'epoch': 0.58} 58%|█████▊ | 19910/34278 [21:57:29<30:17:27, 7.59s/it] 58%|█████▊ | 19911/34278 [21:57:32<25:01:14, 6.27s/it] {'loss': 0.1397, 'grad_norm': 0.9719886487899805, 'learning_rate': 3.941549042793229e-06, 'epoch': 0.58} 58%|█████▊ | 19911/34278 [21:57:32<25:01:14, 6.27s/it] 58%|█████▊ | 19912/34278 [21:57:37<23:47:30, 5.96s/it] {'loss': 0.1228, 'grad_norm': 1.0668822065325398, 'learning_rate': 3.941087320221832e-06, 'epoch': 0.58} 58%|█████▊ | 19912/34278 [21:57:37<23:47:30, 5.96s/it] 58%|█████▊ | 19913/34278 [21:57:49<30:21:34, 7.61s/it] {'loss': 0.1179, 'grad_norm': 0.8799220268546974, 'learning_rate': 3.940625607104154e-06, 'epoch': 0.58} 58%|█████▊ | 19913/34278 [21:57:49<30:21:34, 7.61s/it] 58%|█████▊ | 19914/34278 [21:57:52<25:34:37, 6.41s/it] {'loss': 0.1092, 'grad_norm': 1.131297759725469, 'learning_rate': 3.940163903444319e-06, 'epoch': 0.58} 58%|█████▊ | 19914/34278 [21:57:52<25:34:37, 6.41s/it] 58%|█████▊ | 19915/34278 [21:58:08<36:26:00, 9.13s/it] {'loss': 0.1262, 'grad_norm': 0.9415100809001666, 'learning_rate': 3.939702209246446e-06, 'epoch': 0.58} 58%|█████▊ | 19915/34278 [21:58:08<36:26:00, 9.13s/it] 58%|█████▊ | 19916/34278 [21:58:11<29:07:24, 7.30s/it] {'loss': 0.1227, 'grad_norm': 1.1233275530287412, 'learning_rate': 3.939240524514662e-06, 'epoch': 0.58} 58%|█████▊ | 19916/34278 [21:58:11<29:07:24, 7.30s/it] 58%|█████▊ | 19917/34278 [21:58:14<24:16:06, 6.08s/it] {'loss': 0.1117, 'grad_norm': 0.7489122847542726, 'learning_rate': 3.9387788492530826e-06, 'epoch': 0.58} 58%|█████▊ | 19917/34278 [21:58:14<24:16:06, 6.08s/it] 58%|█████▊ | 19918/34278 [21:58:27<32:34:13, 8.17s/it] {'loss': 0.088, 'grad_norm': 0.6935638746989583, 'learning_rate': 3.938317183465833e-06, 'epoch': 0.58} 58%|█████▊ | 19918/34278 [21:58:27<32:34:13, 8.17s/it] 58%|█████▊ | 19919/34278 [21:58:30<26:49:42, 6.73s/it] {'loss': 0.1526, 'grad_norm': 1.0277083187623963, 'learning_rate': 3.937855527157033e-06, 'epoch': 0.58} 58%|█████▊ | 19919/34278 
[21:58:30<26:49:42, 6.73s/it] 58%|█████▊ | 19920/34278 [21:58:33<22:19:54, 5.60s/it] {'loss': 0.1373, 'grad_norm': 1.0545003707528808, 'learning_rate': 3.937393880330806e-06, 'epoch': 0.58} 58%|█████▊ | 19920/34278 [21:58:33<22:19:54, 5.60s/it] 58%|█████▊ | 19921/34278 [21:58:38<20:59:08, 5.26s/it] {'loss': 0.1257, 'grad_norm': 0.7525256930274601, 'learning_rate': 3.9369322429912736e-06, 'epoch': 0.58} 58%|█████▊ | 19921/34278 [21:58:38<20:59:08, 5.26s/it] 58%|█████▊ | 19922/34278 [21:58:44<21:25:11, 5.37s/it] {'loss': 0.1226, 'grad_norm': 0.7158008025312903, 'learning_rate': 3.936470615142557e-06, 'epoch': 0.58} 58%|█████▊ | 19922/34278 [21:58:44<21:25:11, 5.37s/it] 58%|█████▊ | 19923/34278 [21:58:47<18:47:49, 4.71s/it] {'loss': 0.1379, 'grad_norm': 0.8203859749882335, 'learning_rate': 3.936008996788775e-06, 'epoch': 0.58} 58%|█████▊ | 19923/34278 [21:58:47<18:47:49, 4.71s/it] 58%|█████▊ | 19924/34278 [21:58:51<17:49:05, 4.47s/it] {'loss': 0.1094, 'grad_norm': 1.0118663819334237, 'learning_rate': 3.935547387934052e-06, 'epoch': 0.58} 58%|█████▊ | 19924/34278 [21:58:51<17:49:05, 4.47s/it] 58%|█████▊ | 19925/34278 [21:58:53<15:43:01, 3.94s/it] {'loss': 0.1192, 'grad_norm': 0.8037094045417538, 'learning_rate': 3.935085788582506e-06, 'epoch': 0.58} 58%|█████▊ | 19925/34278 [21:58:53<15:43:01, 3.94s/it] 58%|█████▊ | 19926/34278 [21:58:57<15:07:13, 3.79s/it] {'loss': 0.1217, 'grad_norm': 0.7028004467414166, 'learning_rate': 3.9346241987382615e-06, 'epoch': 0.58} 58%|█████▊ | 19926/34278 [21:58:57<15:07:13, 3.79s/it] 58%|█████▊ | 19927/34278 [21:59:01<15:09:48, 3.80s/it] {'loss': 0.1378, 'grad_norm': 0.7599614576893323, 'learning_rate': 3.9341626184054375e-06, 'epoch': 0.58} 58%|█████▊ | 19927/34278 [21:59:01<15:09:48, 3.80s/it] 58%|█████▊ | 19928/34278 [21:59:04<14:27:40, 3.63s/it] {'loss': 0.1408, 'grad_norm': 1.0005815611395625, 'learning_rate': 3.9337010475881545e-06, 'epoch': 0.58} 58%|█████▊ | 19928/34278 [21:59:04<14:27:40, 3.63s/it] 58%|█████▊ | 19929/34278 
[21:59:08<14:56:32, 3.75s/it] {'loss': 0.1369, 'grad_norm': 0.7408872546255835, 'learning_rate': 3.933239486290536e-06, 'epoch': 0.58} 58%|█████▊ | 19929/34278 [21:59:08<14:56:32, 3.75s/it] 58%|█████▊ | 19930/34278 [21:59:11<13:43:25, 3.44s/it] {'loss': 0.1431, 'grad_norm': 0.8496162025289706, 'learning_rate': 3.932777934516699e-06, 'epoch': 0.58} 58%|█████▊ | 19930/34278 [21:59:11<13:43:25, 3.44s/it] 58%|█████▊ | 19931/34278 [21:59:15<14:21:40, 3.60s/it] {'loss': 0.1363, 'grad_norm': 0.7909946780810766, 'learning_rate': 3.932316392270765e-06, 'epoch': 0.58} 58%|█████▊ | 19931/34278 [21:59:15<14:21:40, 3.60s/it] 58%|█████▊ | 19932/34278 [21:59:20<16:59:27, 4.26s/it] {'loss': 0.1303, 'grad_norm': 0.8744586560589046, 'learning_rate': 3.931854859556857e-06, 'epoch': 0.58} 58%|█████▊ | 19932/34278 [21:59:20<16:59:27, 4.26s/it] 58%|█████▊ | 19933/34278 [21:59:23<15:36:32, 3.92s/it] {'loss': 0.1408, 'grad_norm': 0.7510692926728375, 'learning_rate': 3.931393336379094e-06, 'epoch': 0.58} 58%|█████▊ | 19933/34278 [21:59:23<15:36:32, 3.92s/it] 58%|█████▊ | 19934/34278 [21:59:28<16:37:31, 4.17s/it] {'loss': 0.1371, 'grad_norm': 0.6833111703741482, 'learning_rate': 3.930931822741596e-06, 'epoch': 0.58} 58%|█████▊ | 19934/34278 [21:59:28<16:37:31, 4.17s/it] 58%|█████▊ | 19935/34278 [21:59:31<14:55:31, 3.75s/it] {'loss': 0.1399, 'grad_norm': 0.9106610941026719, 'learning_rate': 3.9304703186484825e-06, 'epoch': 0.58} 58%|█████▊ | 19935/34278 [21:59:31<14:55:31, 3.75s/it] 58%|█████▊ | 19936/34278 [21:59:35<14:47:22, 3.71s/it] {'loss': 0.1341, 'grad_norm': 0.7618252838441384, 'learning_rate': 3.930008824103876e-06, 'epoch': 0.58} 58%|█████▊ | 19936/34278 [21:59:35<14:47:22, 3.71s/it] 58%|█████▊ | 19937/34278 [21:59:40<17:13:16, 4.32s/it] {'loss': 0.1183, 'grad_norm': 0.893523887261781, 'learning_rate': 3.929547339111892e-06, 'epoch': 0.58} 58%|█████▊ | 19937/34278 [21:59:40<17:13:16, 4.32s/it] 58%|█████▊ | 19938/34278 [21:59:44<15:53:10, 3.99s/it] {'loss': 0.1307, 'grad_norm': 
0.8366860430902492, 'learning_rate': 3.9290858636766585e-06, 'epoch': 0.58} 58%|█████▊ | 19938/34278 [21:59:44<15:53:10, 3.99s/it] 58%|█████▊ | 19939/34278 [21:59:49<17:42:09, 4.44s/it] {'loss': 0.1271, 'grad_norm': 0.7336512921920025, 'learning_rate': 3.928624397802288e-06, 'epoch': 0.58} 58%|█████▊ | 19939/34278 [21:59:49<17:42:09, 4.44s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 58%|█████▊ | 19940/34278 [21:59:52<16:07:35, 4.05s/it] {'loss': 0.1225, 'grad_norm': 0.7648675122049943, 'learning_rate': 3.928162941492904e-06, 'epoch': 0.58} 58%|█████▊ | 19940/34278 [21:59:52<16:07:35, 4.05s/it] 58%|█████▊ | 19941/34278 [21:59:55<14:36:07, 3.67s/it] {'loss': 0.1223, 'grad_norm': 0.7354919159711172, 'learning_rate': 3.927701494752626e-06, 'epoch': 0.58} 58%|█████▊ | 19941/34278 [21:59:55<14:36:07, 3.67s/it] 58%|█████▊ | 19942/34278 [21:59:59<14:30:03, 3.64s/it] {'loss': 0.1085, 'grad_norm': 0.6273991668830573, 'learning_rate': 3.927240057585573e-06, 'epoch': 0.58} 58%|█████▊ | 19942/34278 [21:59:59<14:30:03, 3.64s/it] 58%|█████▊ | 19943/34278 [22:00:02<13:59:55, 3.52s/it] {'loss': 0.1314, 'grad_norm': 0.6459800492029086, 'learning_rate': 3.926778629995862e-06, 'epoch': 0.58} 58%|█████▊ | 19943/34278 [22:00:02<13:59:55, 3.52s/it] 58%|█████▊ | 19944/34278 [22:00:06<15:09:19, 3.81s/it] {'loss': 0.1223, 'grad_norm': 0.7482739992861783, 'learning_rate': 3.9263172119876166e-06, 'epoch': 0.58} 58%|█████▊ | 19944/34278 [22:00:06<15:09:19, 3.81s/it] 58%|█████▊ | 19945/34278 [22:00:10<14:32:25, 3.65s/it] {'loss': 0.1402, 'grad_norm': 0.8085054685461269, 'learning_rate': 3.9258558035649556e-06, 'epoch': 0.58} 58%|█████▊ | 19945/34278 [22:00:10<14:32:25, 3.65s/it] 58%|█████▊ | 19946/34278 [22:00:13<13:45:25, 3.46s/it] {'loss': 0.122, 'grad_norm': 0.6800301775320966, 'learning_rate': 
3.925394404731998e-06, 'epoch': 0.58} 58%|█████▊ | 19946/34278 [22:00:13<13:45:25, 3.46s/it] 58%|█████▊ | 19947/34278 [22:00:16<13:23:56, 3.37s/it] {'loss': 0.1174, 'grad_norm': 0.7043598093254362, 'learning_rate': 3.9249330154928625e-06, 'epoch': 0.58} 58%|█████▊ | 19947/34278 [22:00:16<13:23:56, 3.37s/it] 58%|█████▊ | 19948/34278 [22:00:19<13:41:18, 3.44s/it] {'loss': 0.1071, 'grad_norm': 0.7787487097948912, 'learning_rate': 3.924471635851667e-06, 'epoch': 0.58} 58%|█████▊ | 19948/34278 [22:00:19<13:41:18, 3.44s/it] 58%|█████▊ | 19949/34278 [22:00:22<13:21:10, 3.35s/it] {'loss': 0.1036, 'grad_norm': 0.7955993349512235, 'learning_rate': 3.924010265812532e-06, 'epoch': 0.58} 58%|█████▊ | 19949/34278 [22:00:23<13:21:10, 3.35s/it] 58%|█████▊ | 19950/34278 [22:00:26<13:09:23, 3.31s/it] {'loss': 0.1464, 'grad_norm': 0.8019184844104764, 'learning_rate': 3.923548905379577e-06, 'epoch': 0.58} 58%|█████▊ | 19950/34278 [22:00:26<13:09:23, 3.31s/it] 58%|█████▊ | 19951/34278 [22:00:29<13:28:04, 3.38s/it] {'loss': 0.1165, 'grad_norm': 0.9414282098573326, 'learning_rate': 3.923087554556922e-06, 'epoch': 0.58} 58%|█████▊ | 19951/34278 [22:00:29<13:28:04, 3.38s/it] 58%|█████▊ | 19952/34278 [22:00:33<13:21:19, 3.36s/it] {'loss': 0.107, 'grad_norm': 0.7954506286146843, 'learning_rate': 3.9226262133486824e-06, 'epoch': 0.58} 58%|█████▊ | 19952/34278 [22:00:33<13:21:19, 3.36s/it] 58%|█████▊ | 19953/34278 [22:00:38<15:21:18, 3.86s/it] {'loss': 0.1326, 'grad_norm': 0.7379893339804852, 'learning_rate': 3.922164881758979e-06, 'epoch': 0.58} 58%|█████▊ | 19953/34278 [22:00:38<15:21:18, 3.86s/it] 58%|█████▊ | 19954/34278 [22:00:44<17:58:49, 4.52s/it] {'loss': 0.1403, 'grad_norm': 0.929430262482613, 'learning_rate': 3.921703559791932e-06, 'epoch': 0.58} 58%|█████▊ | 19954/34278 [22:00:44<17:58:49, 4.52s/it] 58%|█████▊ | 19955/34278 [22:00:47<16:09:36, 4.06s/it] {'loss': 0.1331, 'grad_norm': 0.943504225127908, 'learning_rate': 3.921242247451654e-06, 'epoch': 0.58} 58%|█████▊ | 19955/34278 
[22:00:47<16:09:36, 4.06s/it] 58%|█████▊ | 19956/34278 [22:00:51<16:03:05, 4.03s/it] {'loss': 0.1208, 'grad_norm': 0.7250821337711371, 'learning_rate': 3.920780944742272e-06, 'epoch': 0.58} 58%|█████▊ | 19956/34278 [22:00:51<16:03:05, 4.03s/it] 58%|█████▊ | 19957/34278 [22:00:54<14:52:40, 3.74s/it] {'loss': 0.1127, 'grad_norm': 0.8756263308111629, 'learning_rate': 3.920319651667898e-06, 'epoch': 0.58} 58%|█████▊ | 19957/34278 [22:00:54<14:52:40, 3.74s/it] 58%|█████▊ | 19958/34278 [22:00:57<14:40:30, 3.69s/it] {'loss': 0.1482, 'grad_norm': 0.7321426688668755, 'learning_rate': 3.919858368232653e-06, 'epoch': 0.58} 58%|█████▊ | 19958/34278 [22:00:57<14:40:30, 3.69s/it] 58%|█████▊ | 19959/34278 [22:01:02<16:07:03, 4.05s/it] {'loss': 0.1094, 'grad_norm': 0.7992597580466819, 'learning_rate': 3.919397094440655e-06, 'epoch': 0.58} 58%|█████▊ | 19959/34278 [22:01:02<16:07:03, 4.05s/it] 58%|█████▊ | 19960/34278 [22:01:05<14:46:50, 3.72s/it] {'loss': 0.1248, 'grad_norm': 0.9162254415320166, 'learning_rate': 3.9189358302960215e-06, 'epoch': 0.58} 58%|█████▊ | 19960/34278 [22:01:05<14:46:50, 3.72s/it] 58%|█████▊ | 19961/34278 [22:01:08<14:14:06, 3.58s/it] {'loss': 0.1121, 'grad_norm': 0.7029670268063505, 'learning_rate': 3.91847457580287e-06, 'epoch': 0.58} 58%|█████▊ | 19961/34278 [22:01:08<14:14:06, 3.58s/it] 58%|█████▊ | 19962/34278 [22:01:12<14:40:24, 3.69s/it] {'loss': 0.136, 'grad_norm': 0.8335174589105804, 'learning_rate': 3.91801333096532e-06, 'epoch': 0.58} 58%|█████▊ | 19962/34278 [22:01:12<14:40:24, 3.69s/it] 58%|█████▊ | 19963/34278 [22:01:18<16:32:45, 4.16s/it] {'loss': 0.1252, 'grad_norm': 0.9310187708467518, 'learning_rate': 3.917552095787489e-06, 'epoch': 0.58} 58%|█████▊ | 19963/34278 [22:01:18<16:32:45, 4.16s/it] 58%|█████▊ | 19964/34278 [22:01:20<14:50:26, 3.73s/it] {'loss': 0.1256, 'grad_norm': 0.7710969883012689, 'learning_rate': 3.9170908702734945e-06, 'epoch': 0.58} 58%|█████▊ | 19964/34278 [22:01:20<14:50:26, 3.73s/it] 58%|█████▊ | 19965/34278 
[22:01:24<14:39:20, 3.69s/it] {'loss': 0.1324, 'grad_norm': 0.9074498699170812, 'learning_rate': 3.916629654427454e-06, 'epoch': 0.58} 58%|█████▊ | 19965/34278 [22:01:24<14:39:20, 3.69s/it] 58%|█████▊ | 19966/34278 [22:01:27<14:18:39, 3.60s/it] {'loss': 0.1456, 'grad_norm': 1.0343579747381952, 'learning_rate': 3.916168448253485e-06, 'epoch': 0.58} 58%|█████▊ | 19966/34278 [22:01:27<14:18:39, 3.60s/it] 58%|█████▊ | 19967/34278 [22:01:30<13:17:07, 3.34s/it] {'loss': 0.129, 'grad_norm': 0.9765773529326656, 'learning_rate': 3.915707251755704e-06, 'epoch': 0.58} 58%|█████▊ | 19967/34278 [22:01:30<13:17:07, 3.34s/it] 58%|█████▊ | 19968/34278 [22:01:33<13:02:18, 3.28s/it] {'loss': 0.1153, 'grad_norm': 0.8459793182395423, 'learning_rate': 3.915246064938233e-06, 'epoch': 0.58} 58%|█████▊ | 19968/34278 [22:01:33<13:02:18, 3.28s/it] 58%|█████▊ | 19969/34278 [22:01:36<12:54:53, 3.25s/it] {'loss': 0.1139, 'grad_norm': 0.7714183250959891, 'learning_rate': 3.9147848878051845e-06, 'epoch': 0.58} 58%|█████▊ | 19969/34278 [22:01:36<12:54:53, 3.25s/it] 58%|█████▊ | 19970/34278 [22:01:42<16:02:37, 4.04s/it] {'loss': 0.1081, 'grad_norm': 0.970932464693246, 'learning_rate': 3.914323720360677e-06, 'epoch': 0.58} 58%|█████▊ | 19970/34278 [22:01:42<16:02:37, 4.04s/it] 58%|█████▊ | 19971/34278 [22:01:45<14:43:26, 3.70s/it] {'loss': 0.1436, 'grad_norm': 1.3982589816857298, 'learning_rate': 3.91386256260883e-06, 'epoch': 0.58} 58%|█████▊ | 19971/34278 [22:01:45<14:43:26, 3.70s/it] 58%|█████▊ | 19972/34278 [22:01:48<14:12:56, 3.58s/it] {'loss': 0.1153, 'grad_norm': 0.8520444859948432, 'learning_rate': 3.913401414553757e-06, 'epoch': 0.58} 58%|█████▊ | 19972/34278 [22:01:48<14:12:56, 3.58s/it] 58%|█████▊ | 19973/34278 [22:01:52<14:23:05, 3.62s/it] {'loss': 0.1334, 'grad_norm': 0.8259940838057555, 'learning_rate': 3.9129402761995765e-06, 'epoch': 0.58} 58%|█████▊ | 19973/34278 [22:01:52<14:23:05, 3.62s/it] 58%|█████▊ | 19974/34278 [22:01:55<13:51:28, 3.49s/it] {'loss': 0.1415, 'grad_norm': 
0.9727286286622902, 'learning_rate': 3.912479147550406e-06, 'epoch': 0.58} 58%|█████▊ | 19974/34278 [22:01:55<13:51:28, 3.49s/it] 58%|█████▊ | 19975/34278 [22:01:59<13:55:21, 3.50s/it] {'loss': 0.12, 'grad_norm': 0.998387208620422, 'learning_rate': 3.912018028610362e-06, 'epoch': 0.58} 58%|█████▊ | 19975/34278 [22:01:59<13:55:21, 3.50s/it] 58%|█████▊ | 19976/34278 [22:02:04<16:11:18, 4.07s/it] {'loss': 0.1429, 'grad_norm': 0.8039708771766947, 'learning_rate': 3.911556919383563e-06, 'epoch': 0.58} 58%|█████▊ | 19976/34278 [22:02:04<16:11:18, 4.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 58%|█████▊ | 19977/34278 [22:02:08<15:39:10, 3.94s/it] {'loss': 0.146, 'grad_norm': 0.7643786701800359, 'learning_rate': 3.911095819874123e-06, 'epoch': 0.58} 58%|█████▊ | 19977/34278 [22:02:08<15:39:10, 3.94s/it] 58%|█████▊ | 19978/34278 [22:02:13<16:58:19, 4.27s/it] {'loss': 0.1229, 'grad_norm': 1.0885862466447291, 'learning_rate': 3.910634730086159e-06, 'epoch': 0.58} 58%|█████▊ | 19978/34278 [22:02:13<16:58:19, 4.27s/it] 58%|█████▊ | 19979/34278 [22:02:18<17:52:22, 4.50s/it] {'loss': 0.1146, 'grad_norm': 0.8779849257664843, 'learning_rate': 3.910173650023787e-06, 'epoch': 0.58} 58%|█████▊ | 19979/34278 [22:02:18<17:52:22, 4.50s/it] 58%|█████▊ | 19980/34278 [22:02:21<16:00:10, 4.03s/it] {'loss': 0.1373, 'grad_norm': 0.9798843494074476, 'learning_rate': 3.909712579691126e-06, 'epoch': 0.58} 58%|█████▊ | 19980/34278 [22:02:21<16:00:10, 4.03s/it] 58%|█████▊ | 19981/34278 [22:02:25<16:26:41, 4.14s/it] {'loss': 0.1625, 'grad_norm': 0.9199995043051882, 'learning_rate': 3.909251519092292e-06, 'epoch': 0.58} 58%|█████▊ | 19981/34278 [22:02:25<16:26:41, 4.14s/it] 58%|█████▊ | 19982/34278 [22:02:31<18:45:12, 4.72s/it] {'loss': 0.1235, 'grad_norm': 0.9244682248536849, 'learning_rate': 3.908790468231398e-06, 
'epoch': 0.58} 58%|█████▊ | 19982/34278 [22:02:31<18:45:12, 4.72s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f624543d580>
Failed to fetch sample 3221372.
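The error report above shows a worker hitting an unreadable image and recovering ("Failed to fetch sample 3221372." followed by the exception echo, after which training continues). Below is a minimal, hypothetical sketch of that fetch-and-retry pattern in generic form; it is not the repository's actual `dataset.py` code, and the names `robust_get_item` and `FlakyDataset` are invented for illustration:

```python
import random

def robust_get_item(dataset, index, max_retries=10):
    """Fetch dataset[index]; on failure, log the sample and retry a random one.

    Generic sketch of the behaviour seen in the log: a decode error
    (e.g. PIL.UnidentifiedImageError) is caught, the failing sample id is
    printed, and a different random index is tried instead of crashing.
    """
    for _ in range(max_retries):
        try:
            return dataset[index]
        except Exception as exc:
            print(f"Failed to fetch sample {index}.")
            print(f"Exception: {exc}")
            # Fall back to a randomly chosen replacement sample.
            index = random.randrange(len(dataset))
    raise RuntimeError(f"Gave up after {max_retries} retries")

class FlakyDataset:
    """Toy dataset whose first access always fails, to exercise the retry path."""
    def __init__(self):
        self.calls = 0
    def __len__(self):
        return 4
    def __getitem__(self, i):
        self.calls += 1
        if self.calls == 1:
            raise ValueError("cannot identify image file")
        return i
```

Keeping the retry inside `__getitem__` (as the real code apparently does) means one corrupt image costs a log line rather than a crashed 22-hour run.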
Exception: cannot identify image file <_io.BytesIO object at 0x7f624543d580> 58%|█████▊ | 19983/34278 [22:02:35<17:28:10, 4.40s/it] {'loss': 0.127, 'grad_norm': 0.6955839158370702, 'learning_rate': 3.9083294271125635e-06, 'epoch': 0.58} 58%|█████▊ | 19983/34278 [22:02:35<17:28:10, 4.40s/it] 58%|█████▊ | 19984/34278 [22:02:41<19:33:49, 4.93s/it] {'loss': 0.1421, 'grad_norm': 0.8609939111882231, 'learning_rate': 3.907868395739904e-06, 'epoch': 0.58} 58%|█████▊ | 19984/34278 [22:02:41<19:33:49, 4.93s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 58%|█████▊ | 19985/34278 [22:02:44<17:23:41, 4.38s/it] {'loss': 0.1224, 'grad_norm': 1.751689791296209, 'learning_rate': 3.907407374117531e-06, 'epoch': 0.58} 58%|█████▊ | 19985/34278 [22:02:44<17:23:41, 4.38s/it] 58%|█████▊ | 19986/34278 [22:02:48<16:38:46, 4.19s/it] {'loss': 0.1072, 'grad_norm': 0.7213866183718433, 'learning_rate': 3.906946362249567e-06, 'epoch': 0.58} 58%|█████▊ | 19986/34278 [22:02:48<16:38:46, 4.19s/it] 58%|█████▊ | 19987/34278 [22:02:52<16:09:16, 4.07s/it] {'loss': 0.125, 'grad_norm': 0.6711112849067568, 'learning_rate': 3.9064853601401255e-06, 'epoch': 0.58} 58%|█████▊ | 19987/34278 [22:02:52<16:09:16, 4.07s/it] 58%|█████▊ | 19988/34278 [22:02:55<15:13:03, 3.83s/it] {'loss': 0.1147, 'grad_norm': 0.8294815817653309, 'learning_rate': 3.90602436779332e-06, 'epoch': 0.58} 58%|█████▊ | 19988/34278 [22:02:55<15:13:03, 3.83s/it] 58%|█████▊ | 19989/34278 [22:02:58<14:19:45, 3.61s/it] {'loss': 0.0988, 'grad_norm': 1.0764632780837011, 'learning_rate': 3.90556338521327e-06, 'epoch': 0.58} 58%|█████▊ | 19989/34278 [22:02:58<14:19:45, 3.61s/it] 58%|█████▊ | 19990/34278 [22:03:02<14:31:08, 3.66s/it] {'loss': 0.1312, 'grad_norm': 0.7551939346681211, 'learning_rate': 3.905102412404087e-06, 'epoch': 0.58} 58%|█████▊ | 19990/34278 
[22:03:02<14:31:08, 3.66s/it] 58%|█████▊ | 19991/34278 [22:03:05<14:05:28, 3.55s/it] {'loss': 0.1206, 'grad_norm': 0.6482365702037921, 'learning_rate': 3.904641449369887e-06, 'epoch': 0.58} 58%|█████▊ | 19991/34278 [22:03:05<14:05:28, 3.55s/it] 58%|█████▊ | 19992/34278 [22:03:08<13:07:41, 3.31s/it] {'loss': 0.1345, 'grad_norm': 0.8071205374283903, 'learning_rate': 3.904180496114789e-06, 'epoch': 0.58} 58%|█████▊ | 19992/34278 [22:03:08<13:07:41, 3.31s/it] 58%|█████▊ | 19993/34278 [22:03:12<14:17:39, 3.60s/it] {'loss': 0.1, 'grad_norm': 0.7723152671545592, 'learning_rate': 3.903719552642906e-06, 'epoch': 0.58} 58%|█████▊ | 19993/34278 [22:03:12<14:17:39, 3.60s/it] 58%|█████▊ | 19994/34278 [22:03:15<13:45:02, 3.47s/it] {'loss': 0.1317, 'grad_norm': 0.9115210649858577, 'learning_rate': 3.9032586189583525e-06, 'epoch': 0.58} 58%|█████▊ | 19994/34278 [22:03:15<13:45:02, 3.47s/it] 58%|█████▊ | 19995/34278 [22:03:18<13:05:18, 3.30s/it] {'loss': 0.1253, 'grad_norm': 0.7362211880106496, 'learning_rate': 3.902797695065244e-06, 'epoch': 0.58} 58%|█████▊ | 19995/34278 [22:03:18<13:05:18, 3.30s/it] 58%|█████▊ | 19996/34278 [22:03:21<12:47:04, 3.22s/it] {'loss': 0.1338, 'grad_norm': 0.8636060424256389, 'learning_rate': 3.902336780967697e-06, 'epoch': 0.58} 58%|█████▊ | 19996/34278 [22:03:21<12:47:04, 3.22s/it] 58%|█████▊ | 19997/34278 [22:03:26<14:00:43, 3.53s/it] {'loss': 0.1127, 'grad_norm': 0.7829876879955818, 'learning_rate': 3.901875876669822e-06, 'epoch': 0.58} 58%|█████▊ | 19997/34278 [22:03:26<14:00:43, 3.53s/it] 58%|█████▊ | 19998/34278 [22:03:29<14:07:25, 3.56s/it] {'loss': 0.1361, 'grad_norm': 0.7388444336577626, 'learning_rate': 3.90141498217574e-06, 'epoch': 0.58} 58%|█████▊ | 19998/34278 [22:03:29<14:07:25, 3.56s/it] 58%|█████▊ | 19999/34278 [22:03:32<13:37:34, 3.44s/it] {'loss': 0.1207, 'grad_norm': 0.8067589149653539, 'learning_rate': 3.900954097489562e-06, 'epoch': 0.58} 58%|█████▊ | 19999/34278 [22:03:32<13:37:34, 3.44s/it] 58%|█████▊ | 20000/34278 
[22:03:36<13:16:47, 3.35s/it] {'loss': 0.1104, 'grad_norm': 0.7230169779803474, 'learning_rate': 3.900493222615403e-06, 'epoch': 0.58} 58%|█████▊ | 20000/34278 [22:03:36<13:16:47, 3.35s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of
the inputs have requires_grad=True. Gradients will be None warnings.warn( 58%|█████▊ | 20001/34278 [22:04:12<52:46:06, 13.31s/it] {'loss': 0.1243, 'grad_norm': 0.877173890972481, 'learning_rate': 3.900032357557379e-06, 'epoch': 0.58} 58%|█████▊ | 20001/34278 [22:04:12<52:46:06, 13.31s/it] 58%|█████▊ | 20002/34278 [22:04:16<41:34:04, 10.48s/it] {'loss': 0.1247, 'grad_norm': 0.9922213644489261, 'learning_rate': 3.899571502319603e-06, 'epoch': 0.58} 58%|█████▊ | 20002/34278 [22:04:16<41:34:04, 10.48s/it] 58%|█████▊ | 20003/34278 [22:04:19<33:11:08, 8.37s/it] {'loss': 0.1386, 'grad_norm': 0.9334965449708817, 'learning_rate': 3.899110656906189e-06, 'epoch': 0.58} 58%|█████▊ | 20003/34278 [22:04:19<33:11:08, 8.37s/it] 58%|█████▊ | 20004/34278 [22:04:25<30:18:51, 7.65s/it] {'loss': 0.126, 'grad_norm': 1.0450135546821684, 'learning_rate': 3.898649821321253e-06, 'epoch': 0.58} 58%|█████▊ | 20004/34278 [22:04:25<30:18:51, 7.65s/it] 58%|█████▊ | 20005/34278 [22:04:29<25:11:21, 6.35s/it] {'loss': 0.1247, 'grad_norm': 1.0209668161040246, 'learning_rate': 3.898188995568908e-06, 'epoch': 0.58} 58%|█████▊ | 20005/34278 [22:04:29<25:11:21, 6.35s/it] 58%|█████▊ | 20006/34278 [22:04:32<21:38:04, 5.46s/it] {'loss': 0.1202, 'grad_norm': 1.007674244420397, 'learning_rate': 3.8977281796532706e-06, 'epoch': 0.58} 58%|█████▊ | 20006/34278 [22:04:32<21:38:04, 5.46s/it] 58%|█████▊ | 20007/34278 [22:04:37<20:51:55, 5.26s/it] {'loss': 0.12, 'grad_norm': 0.9367676131305241, 'learning_rate': 3.8972673735784516e-06, 'epoch': 0.58} 58%|█████▊ | 20007/34278 [22:04:37<20:51:55, 5.26s/it] 58%|█████▊ | 20008/34278 [22:04:40<18:42:30, 4.72s/it] {'loss': 0.1151, 'grad_norm': 1.0124107800044146, 'learning_rate': 3.896806577348566e-06, 'epoch': 0.58} 58%|█████▊ | 20008/34278 [22:04:40<18:42:30, 4.72s/it] 58%|█████▊ | 20009/34278 [22:04:43<16:46:52, 4.23s/it] {'loss': 0.1217, 'grad_norm': 0.9649067586170705, 'learning_rate': 3.896345790967726e-06, 'epoch': 0.58} 58%|█████▊ | 20009/34278 [22:04:43<16:46:52, 
4.23s/it] 58%|█████▊ | 20010/34278 [22:04:46<15:14:16, 3.84s/it] {'loss': 0.1095, 'grad_norm': 0.7424227106621085, 'learning_rate': 3.89588501444005e-06, 'epoch': 0.58} 58%|█████▊ | 20010/34278 [22:04:46<15:14:16, 3.84s/it] 58%|█████▊ | 20011/34278 [22:04:49<14:20:47, 3.62s/it] {'loss': 0.1238, 'grad_norm': 0.8772521777826231, 'learning_rate': 3.895424247769649e-06, 'epoch': 0.58} 58%|█████▊ | 20011/34278 [22:04:49<14:20:47, 3.62s/it] 58%|█████▊ | 20012/34278 [22:04:54<15:09:12, 3.82s/it] {'loss': 0.1365, 'grad_norm': 0.8659507469262135, 'learning_rate': 3.8949634909606365e-06, 'epoch': 0.58} 58%|█████▊ | 20012/34278 [22:04:54<15:09:12, 3.82s/it] 58%|█████▊ | 20013/34278 [22:04:57<13:55:08, 3.51s/it] {'loss': 0.1318, 'grad_norm': 1.0415473076502995, 'learning_rate': 3.894502744017126e-06, 'epoch': 0.58} 58%|█████▊ | 20013/34278 [22:04:57<13:55:08, 3.51s/it] 58%|█████▊ | 20014/34278 [22:04:59<13:05:15, 3.30s/it] {'loss': 0.1189, 'grad_norm': 0.7313353263417107, 'learning_rate': 3.894042006943231e-06, 'epoch': 0.58} 58%|█████▊ | 20014/34278 [22:04:59<13:05:15, 3.30s/it] 58%|█████▊ | 20015/34278 [22:05:03<13:11:02, 3.33s/it] {'loss': 0.1349, 'grad_norm': 0.9722540423042687, 'learning_rate': 3.893581279743064e-06, 'epoch': 0.58} 58%|█████▊ | 20015/34278 [22:05:03<13:11:02, 3.33s/it] 58%|█████▊ | 20016/34278 [22:05:06<13:18:59, 3.36s/it] {'loss': 0.1171, 'grad_norm': 0.850733822255877, 'learning_rate': 3.89312056242074e-06, 'epoch': 0.58} 58%|█████▊ | 20016/34278 [22:05:06<13:18:59, 3.36s/it] 58%|█████▊ | 20017/34278 [22:05:09<13:07:15, 3.31s/it] {'loss': 0.1322, 'grad_norm': 1.5434614652881848, 'learning_rate': 3.892659854980371e-06, 'epoch': 0.58} 58%|█████▊ | 20017/34278 [22:05:09<13:07:15, 3.31s/it] 58%|█████▊ | 20018/34278 [22:05:13<13:06:31, 3.31s/it] {'loss': 0.1516, 'grad_norm': 1.5228709237166698, 'learning_rate': 3.892199157426071e-06, 'epoch': 0.58} 58%|█████▊ | 20018/34278 [22:05:13<13:06:31, 3.31s/it] 58%|█████▊ | 20019/34278 [22:05:16<12:32:06, 3.16s/it] 
{'loss': 0.1332, 'grad_norm': 1.2995402203326651, 'learning_rate': 3.891738469761953e-06, 'epoch': 0.58} 58%|█████▊ | 20019/34278 [22:05:16<12:32:06, 3.16s/it] 58%|█████▊ | 20020/34278 [22:05:19<13:08:12, 3.32s/it] {'loss': 0.1363, 'grad_norm': 1.4594074075054007, 'learning_rate': 3.891277791992129e-06, 'epoch': 0.58} 58%|█████▊ | 20020/34278 [22:05:19<13:08:12, 3.32s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (10632 > 8192). Running this sequence through the model will result in indexing errors 58%|█████▊ | 20021/34278 [22:05:22<12:40:20, 3.20s/it] {'loss': 0.1178, 'grad_norm': 0.6573844072446806, 'learning_rate': 3.890817124120711e-06, 'epoch': 0.58} 58%|█████▊ | 20021/34278 [22:05:22<12:40:20, 3.20s/it] 58%|█████▊ | 20022/34278 [22:05:26<13:11:05, 3.33s/it] {'loss': 0.1315, 'grad_norm': 0.8124837233294876, 'learning_rate': 3.890356466151813e-06, 'epoch': 0.58} 58%|█████▊ | 20022/34278 [22:05:26<13:11:05, 3.33s/it] 58%|█████▊ | 20023/34278 [22:05:29<13:33:03, 3.42s/it] {'loss': 0.1227, 'grad_norm': 0.9847557728491084, 'learning_rate': 3.889895818089549e-06, 'epoch': 0.58} 58%|█████▊ | 20023/34278 [22:05:29<13:33:03, 3.42s/it] 58%|█████▊ | 20024/34278 [22:05:33<13:38:32, 3.45s/it] {'loss': 0.1255, 'grad_norm': 1.2453805876828943, 'learning_rate': 3.889435179938029e-06, 'epoch': 0.58} 58%|█████▊ | 20024/34278 [22:05:33<13:38:32, 3.45s/it] 58%|█████▊ | 20025/34278 [22:05:36<13:21:31, 3.37s/it] {'loss': 0.1197, 'grad_norm': 0.7276105532760212, 'learning_rate': 3.888974551701368e-06, 'epoch': 0.58} 58%|█████▊ | 20025/34278 [22:05:36<13:21:31, 3.37s/it] 58%|█████▊ | 20026/34278 [22:05:40<13:40:02, 3.45s/it] {'loss': 0.1254, 'grad_norm': 0.6886842346855506, 'learning_rate': 3.888513933383676e-06, 'epoch': 0.58} 58%|█████▊ | 20026/34278 [22:05:40<13:40:02, 3.45s/it] 58%|█████▊ | 20027/34278 [22:05:43<13:17:33, 3.36s/it] {'loss': 0.1412, 'grad_norm': 1.0024426718891248, 'learning_rate': 3.888053324989065e-06, 
'epoch': 0.58} 58%|█████▊ | 20027/34278 [22:05:43<13:17:33, 3.36s/it] 58%|█████▊ | 20028/34278 [22:05:48<15:05:59, 3.81s/it] {'loss': 0.1127, 'grad_norm': 0.8664652350890629, 'learning_rate': 3.88759272652165e-06, 'epoch': 0.58} 58%|█████▊ | 20028/34278 [22:05:48<15:05:59, 3.81s/it] 58%|█████▊ | 20029/34278 [22:05:52<15:21:09, 3.88s/it] {'loss': 0.1213, 'grad_norm': 0.6224970984708146, 'learning_rate': 3.887132137985542e-06, 'epoch': 0.58} 58%|█████▊ | 20029/34278 [22:05:52<15:21:09, 3.88s/it] 58%|█████▊ | 20030/34278 [22:05:58<17:55:43, 4.53s/it] {'loss': 0.1414, 'grad_norm': 0.6913691028308873, 'learning_rate': 3.886671559384851e-06, 'epoch': 0.58} 58%|█████▊ | 20030/34278 [22:05:58<17:55:43, 4.53s/it] 58%|█████▊ | 20031/34278 [22:06:01<15:56:11, 4.03s/it] {'loss': 0.1183, 'grad_norm': 0.8798904964005513, 'learning_rate': 3.8862109907236935e-06, 'epoch': 0.58} 58%|█████▊ | 20031/34278 [22:06:01<15:56:11, 4.03s/it] 58%|█████▊ | 20032/34278 [22:06:04<14:31:51, 3.67s/it] {'loss': 0.1095, 'grad_norm': 0.9806387333938295, 'learning_rate': 3.8857504320061765e-06, 'epoch': 0.58} 58%|█████▊ | 20032/34278 [22:06:04<14:31:51, 3.67s/it] 58%|█████▊ | 20033/34278 [22:06:07<14:45:54, 3.73s/it] {'loss': 0.135, 'grad_norm': 0.8494375148451374, 'learning_rate': 3.8852898832364125e-06, 'epoch': 0.58} 58%|█████▊ | 20033/34278 [22:06:07<14:45:54, 3.73s/it] 58%|█████▊ | 20034/34278 [22:06:10<13:48:51, 3.49s/it] {'loss': 0.127, 'grad_norm': 0.7349521172006768, 'learning_rate': 3.884829344418515e-06, 'epoch': 0.58} 58%|█████▊ | 20034/34278 [22:06:10<13:48:51, 3.49s/it] 58%|█████▊ | 20035/34278 [22:06:14<13:57:32, 3.53s/it] {'loss': 0.1218, 'grad_norm': 1.0602459365016692, 'learning_rate': 3.884368815556595e-06, 'epoch': 0.58} 58%|█████▊ | 20035/34278 [22:06:14<13:57:32, 3.53s/it] 58%|█████▊ | 20036/34278 [22:06:18<14:54:10, 3.77s/it] {'loss': 0.1204, 'grad_norm': 1.1043666239761065, 'learning_rate': 3.883908296654766e-06, 'epoch': 0.58} 58%|█████▊ | 20036/34278 [22:06:18<14:54:10, 
3.77s/it] 58%|█████▊ | 20037/34278 [22:06:22<14:55:25, 3.77s/it] {'loss': 0.1221, 'grad_norm': 0.7194028575697635, 'learning_rate': 3.883447787717134e-06, 'epoch': 0.58} 58%|█████▊ | 20037/34278 [22:06:22<14:55:25, 3.77s/it] 58%|█████▊ | 20038/34278 [22:06:28<17:36:46, 4.45s/it] {'loss': 0.1359, 'grad_norm': 1.0477697364098046, 'learning_rate': 3.882987288747816e-06, 'epoch': 0.58} 58%|█████▊ | 20038/34278 [22:06:28<17:36:46, 4.45s/it] 58%|█████▊ | 20039/34278 [22:06:31<16:12:53, 4.10s/it] {'loss': 0.1384, 'grad_norm': 1.2162683390570899, 'learning_rate': 3.8825267997509184e-06, 'epoch': 0.58} 58%|█████▊ | 20039/34278 [22:06:31<16:12:53, 4.10s/it] 58%|█████▊ | 20040/34278 [22:06:37<18:10:41, 4.60s/it] {'loss': 0.1326, 'grad_norm': 0.9315426408156291, 'learning_rate': 3.882066320730556e-06, 'epoch': 0.58} 58%|█████▊ | 20040/34278 [22:06:37<18:10:41, 4.60s/it] 58%|█████▊ | 20041/34278 [22:06:40<16:24:14, 4.15s/it] {'loss': 0.1297, 'grad_norm': 0.7159074593237017, 'learning_rate': 3.88160585169084e-06, 'epoch': 0.58} 58%|█████▊ | 20041/34278 [22:06:40<16:24:14, 4.15s/it] 58%|█████▊ | 20042/34278 [22:06:43<14:57:23, 3.78s/it] {'loss': 0.1199, 'grad_norm': 1.2336832947245413, 'learning_rate': 3.881145392635879e-06, 'epoch': 0.58} 58%|█████▊ | 20042/34278 [22:06:43<14:57:23, 3.78s/it] 58%|█████▊ | 20043/34278 [22:06:49<17:40:35, 4.47s/it] {'loss': 0.1216, 'grad_norm': 1.0889179166800855, 'learning_rate': 3.880684943569785e-06, 'epoch': 0.58} 58%|█████▊ | 20043/34278 [22:06:49<17:40:35, 4.47s/it] 58%|█████▊ | 20044/34278 [22:06:54<18:19:38, 4.64s/it] {'loss': 0.1291, 'grad_norm': 0.8037398815818094, 'learning_rate': 3.880224504496669e-06, 'epoch': 0.58} 58%|█████▊ | 20044/34278 [22:06:54<18:19:38, 4.64s/it] 58%|█████▊ | 20045/34278 [22:06:57<16:04:49, 4.07s/it] {'loss': 0.1423, 'grad_norm': 0.8383273262411762, 'learning_rate': 3.87976407542064e-06, 'epoch': 0.58} 58%|█████▊ | 20045/34278 [22:06:57<16:04:49, 4.07s/it] 58%|█████▊ | 20046/34278 [22:07:00<14:37:56, 3.70s/it] 
{'loss': 0.122, 'grad_norm': 0.9656367754583141, 'learning_rate': 3.87930365634581e-06, 'epoch': 0.58} 58%|█████▊ | 20046/34278 [22:07:00<14:37:56, 3.70s/it] 58%|█████▊ | 20047/34278 [22:07:03<14:10:29, 3.59s/it] {'loss': 0.1492, 'grad_norm': 0.7820117746304088, 'learning_rate': 3.87884324727629e-06, 'epoch': 0.58} 58%|█████▊ | 20047/34278 [22:07:03<14:10:29, 3.59s/it] 58%|█████▊ | 20048/34278 [22:07:06<13:44:55, 3.48s/it] {'loss': 0.1233, 'grad_norm': 0.9874843732922406, 'learning_rate': 3.87838284821619e-06, 'epoch': 0.58} 58%|█████▊ | 20048/34278 [22:07:06<13:44:55, 3.48s/it] 58%|█████▊ | 20049/34278 [22:07:10<13:53:33, 3.51s/it] {'loss': 0.1324, 'grad_norm': 0.7807569775875209, 'learning_rate': 3.877922459169621e-06, 'epoch': 0.58} 58%|█████▊ | 20049/34278 [22:07:10<13:53:33, 3.51s/it] 58%|█████▊ | 20050/34278 [22:07:13<13:28:40, 3.41s/it] {'loss': 0.138, 'grad_norm': 0.908533675249824, 'learning_rate': 3.877462080140691e-06, 'epoch': 0.58} 58%|█████▊ | 20050/34278 [22:07:13<13:28:40, 3.41s/it] 58%|█████▊ | 20051/34278 [22:07:19<16:00:05, 4.05s/it] {'loss': 0.1443, 'grad_norm': 0.9268027110355939, 'learning_rate': 3.877001711133511e-06, 'epoch': 0.58} 58%|█████▊ | 20051/34278 [22:07:19<16:00:05, 4.05s/it] 58%|█████▊ | 20052/34278 [22:07:21<14:29:11, 3.67s/it] {'loss': 0.1353, 'grad_norm': 1.1052806011889693, 'learning_rate': 3.8765413521521925e-06, 'epoch': 0.58} 58%|█████▊ | 20052/34278 [22:07:21<14:29:11, 3.67s/it] 59%|█████▊ | 20053/34278 [22:07:26<15:26:37, 3.91s/it] {'loss': 0.1137, 'grad_norm': 0.7436349984137494, 'learning_rate': 3.876081003200846e-06, 'epoch': 0.59} 59%|█████▊ | 20053/34278 [22:07:26<15:26:37, 3.91s/it] 59%|█████▊ | 20054/34278 [22:07:32<17:50:20, 4.51s/it] {'loss': 0.1256, 'grad_norm': 0.7905624130436862, 'learning_rate': 3.875620664283578e-06, 'epoch': 0.59} 59%|█████▊ | 20054/34278 [22:07:32<17:50:20, 4.51s/it] 59%|█████▊ | 20055/34278 [22:07:35<16:15:46, 4.12s/it] {'loss': 0.1109, 'grad_norm': 0.7727231524808063, 'learning_rate': 
3.875160335404502e-06, 'epoch': 0.59} 59%|█████▊ | 20055/34278 [22:07:35<16:15:46, 4.12s/it] 59%|█████▊ | 20056/34278 [22:07:39<16:05:12, 4.07s/it] {'loss': 0.1388, 'grad_norm': 1.1785316990071035, 'learning_rate': 3.874700016567726e-06, 'epoch': 0.59} 59%|█████▊ | 20056/34278 [22:07:39<16:05:12, 4.07s/it] 59%|█████▊ | 20057/34278 [22:07:42<14:29:52, 3.67s/it] {'loss': 0.1067, 'grad_norm': 0.8946554632140041, 'learning_rate': 3.874239707777356e-06, 'epoch': 0.59} 59%|█████▊ | 20057/34278 [22:07:42<14:29:52, 3.67s/it] 59%|█████▊ | 20058/34278 [22:07:47<16:25:20, 4.16s/it] {'loss': 0.1297, 'grad_norm': 0.8262004130191329, 'learning_rate': 3.873779409037509e-06, 'epoch': 0.59} 59%|█████▊ | 20058/34278 [22:07:47<16:25:20, 4.16s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 59%|█████▊ | 20059/34278 [22:07:51<15:39:36, 3.96s/it] {'loss': 0.1197, 'grad_norm': 0.7765629391008582, 'learning_rate': 3.873319120352289e-06, 'epoch': 0.59} 59%|█████▊ | 20059/34278 [22:07:51<15:39:36, 3.96s/it] 59%|█████▊ | 20060/34278 [22:07:54<14:41:35, 3.72s/it] {'loss': 0.1304, 'grad_norm': 0.8418836592297371, 'learning_rate': 3.872858841725808e-06, 'epoch': 0.59} 59%|█████▊ | 20060/34278 [22:07:54<14:41:35, 3.72s/it] 59%|█████▊ | 20061/34278 [22:07:58<14:56:38, 3.78s/it] {'loss': 0.0986, 'grad_norm': 0.7772474308397072, 'learning_rate': 3.872398573162174e-06, 'epoch': 0.59} 59%|█████▊ | 20061/34278 [22:07:58<14:56:38, 3.78s/it] 59%|█████▊ | 20062/34278 [22:08:01<14:24:59, 3.65s/it] {'loss': 0.1454, 'grad_norm': 0.7997763202848225, 'learning_rate': 3.871938314665496e-06, 'epoch': 0.59} 59%|█████▊ | 20062/34278 [22:08:01<14:24:59, 3.65s/it] 59%|█████▊ | 20063/34278 [22:08:04<14:11:54, 3.60s/it] {'loss': 0.1159, 'grad_norm': 0.7620179197154986, 'learning_rate': 3.871478066239882e-06, 'epoch': 0.59} 59%|█████▊ | 
20063/34278 [22:08:04<14:11:54, 3.60s/it] 59%|█████▊ | 20064/34278 [22:08:11<17:07:54, 4.34s/it] {'loss': 0.1409, 'grad_norm': 0.7601146672815218, 'learning_rate': 3.871017827889444e-06, 'epoch': 0.59} 59%|█████▊ | 20064/34278 [22:08:11<17:07:54, 4.34s/it] 59%|█████▊ | 20065/34278 [22:08:14<16:18:43, 4.13s/it] {'loss': 0.1192, 'grad_norm': 0.9971389348944414, 'learning_rate': 3.870557599618289e-06, 'epoch': 0.59} 59%|█████▊ | 20065/34278 [22:08:14<16:18:43, 4.13s/it] 59%|█████▊ | 20066/34278 [22:08:17<15:00:49, 3.80s/it] {'loss': 0.1396, 'grad_norm': 0.7898776749792827, 'learning_rate': 3.8700973814305275e-06, 'epoch': 0.59} 59%|█████▊ | 20066/34278 [22:08:17<15:00:49, 3.80s/it] 59%|█████▊ | 20067/34278 [22:08:21<15:17:52, 3.88s/it] {'loss': 0.1278, 'grad_norm': 0.7245572167616001, 'learning_rate': 3.869637173330265e-06, 'epoch': 0.59} 59%|█████▊ | 20067/34278 [22:08:21<15:17:52, 3.88s/it] 59%|█████▊ | 20068/34278 [22:08:25<15:21:48, 3.89s/it] {'loss': 0.1119, 'grad_norm': 0.81545613368151, 'learning_rate': 3.869176975321613e-06, 'epoch': 0.59} 59%|█████▊ | 20068/34278 [22:08:25<15:21:48, 3.89s/it] 59%|█████▊ | 20069/34278 [22:08:28<14:31:38, 3.68s/it] {'loss': 0.1344, 'grad_norm': 0.8334207358753285, 'learning_rate': 3.868716787408677e-06, 'epoch': 0.59} 59%|█████▊ | 20069/34278 [22:08:28<14:31:38, 3.68s/it] 59%|█████▊ | 20070/34278 [22:08:31<13:26:32, 3.41s/it] {'loss': 0.129, 'grad_norm': 0.7354282710283488, 'learning_rate': 3.8682566095955695e-06, 'epoch': 0.59} 59%|█████▊ | 20070/34278 [22:08:31<13:26:32, 3.41s/it] 59%|█████▊ | 20071/34278 [22:08:34<12:59:49, 3.29s/it] {'loss': 0.1007, 'grad_norm': 0.7748654416289245, 'learning_rate': 3.867796441886397e-06, 'epoch': 0.59} 59%|█████▊ | 20071/34278 [22:08:34<12:59:49, 3.29s/it] 59%|█████▊ | 20072/34278 [22:08:38<13:25:26, 3.40s/it] {'loss': 0.1262, 'grad_norm': 0.9861834514467656, 'learning_rate': 3.867336284285267e-06, 'epoch': 0.59} 59%|█████▊ | 20072/34278 [22:08:38<13:25:26, 3.40s/it] 59%|█████▊ | 
20073/34278 [22:08:41<13:11:42, 3.34s/it] {'loss': 0.1329, 'grad_norm': 0.8149178327824725, 'learning_rate': 3.866876136796288e-06, 'epoch': 0.59} 59%|█████▊ | 20073/34278 [22:08:41<13:11:42, 3.34s/it] 59%|█████▊ | 20074/34278 [22:08:46<14:35:37, 3.70s/it] {'loss': 0.1189, 'grad_norm': 0.8383302629967043, 'learning_rate': 3.86641599942357e-06, 'epoch': 0.59} 59%|█████▊ | 20074/34278 [22:08:46<14:35:37, 3.70s/it] 59%|█████▊ | 20075/34278 [22:08:48<13:35:44, 3.45s/it] {'loss': 0.1247, 'grad_norm': 1.0343052504712773, 'learning_rate': 3.865955872171217e-06, 'epoch': 0.59} 59%|█████▊ | 20075/34278 [22:08:48<13:35:44, 3.45s/it] 59%|█████▊ | 20076/34278 [22:08:52<13:28:20, 3.42s/it] {'loss': 0.1204, 'grad_norm': 0.7977024950722141, 'learning_rate': 3.865495755043339e-06, 'epoch': 0.59} 59%|█████▊ | 20076/34278 [22:08:52<13:28:20, 3.42s/it] 59%|█████▊ | 20077/34278 [22:08:55<12:49:20, 3.25s/it] {'loss': 0.1416, 'grad_norm': 1.2251561082309965, 'learning_rate': 3.865035648044046e-06, 'epoch': 0.59} 59%|█████▊ | 20077/34278 [22:08:55<12:49:20, 3.25s/it] 59%|█████▊ | 20078/34278 [22:08:57<12:15:06, 3.11s/it] {'loss': 0.1272, 'grad_norm': 0.8435546083933906, 'learning_rate': 3.864575551177443e-06, 'epoch': 0.59} 59%|█████▊ | 20078/34278 [22:08:57<12:15:06, 3.11s/it] 59%|█████▊ | 20079/34278 [22:09:01<13:04:48, 3.32s/it] {'loss': 0.1247, 'grad_norm': 0.7883942391601387, 'learning_rate': 3.864115464447639e-06, 'epoch': 0.59} 59%|█████▊ | 20079/34278 [22:09:01<13:04:48, 3.32s/it] 59%|█████▊ | 20080/34278 [22:09:04<12:59:02, 3.29s/it] {'loss': 0.1291, 'grad_norm': 0.8557367358849178, 'learning_rate': 3.86365538785874e-06, 'epoch': 0.59} 59%|█████▊ | 20080/34278 [22:09:04<12:59:02, 3.29s/it] 59%|█████▊ | 20081/34278 [22:09:08<12:44:52, 3.23s/it] {'loss': 0.1155, 'grad_norm': 0.9048262984258508, 'learning_rate': 3.863195321414855e-06, 'epoch': 0.59} 59%|█████▊ | 20081/34278 [22:09:08<12:44:52, 3.23s/it] 59%|█████▊ | 20082/34278 [22:09:11<12:40:00, 3.21s/it] {'loss': 0.1472, 
'grad_norm': 0.7213752903120728, 'learning_rate': 3.86273526512009e-06, 'epoch': 0.59} 59%|█████▊ | 20082/34278 [22:09:11<12:40:00, 3.21s/it] 59%|█████▊ | 20083/34278 [22:09:16<15:15:47, 3.87s/it] {'loss': 0.1478, 'grad_norm': 0.7683928580871993, 'learning_rate': 3.862275218978554e-06, 'epoch': 0.59} 59%|█████▊ | 20083/34278 [22:09:16<15:15:47, 3.87s/it] 59%|█████▊ | 20084/34278 [22:09:20<15:04:52, 3.83s/it] {'loss': 0.1472, 'grad_norm': 0.9852625062479899, 'learning_rate': 3.861815182994353e-06, 'epoch': 0.59} 59%|█████▊ | 20084/34278 [22:09:20<15:04:52, 3.83s/it] 59%|█████▊ | 20085/34278 [22:09:23<13:51:12, 3.51s/it] {'loss': 0.1159, 'grad_norm': 0.9549895964272472, 'learning_rate': 3.861355157171594e-06, 'epoch': 0.59} 59%|█████▊ | 20085/34278 [22:09:23<13:51:12, 3.51s/it] 59%|█████▊ | 20086/34278 [22:09:26<13:54:52, 3.53s/it] {'loss': 0.1007, 'grad_norm': 0.7935749292525924, 'learning_rate': 3.860895141514384e-06, 'epoch': 0.59} 59%|█████▊ | 20086/34278 [22:09:26<13:54:52, 3.53s/it] 59%|█████▊ | 20087/34278 [22:09:30<13:48:07, 3.50s/it] {'loss': 0.1254, 'grad_norm': 0.8734585237837397, 'learning_rate': 3.860435136026831e-06, 'epoch': 0.59} 59%|█████▊ | 20087/34278 [22:09:30<13:48:07, 3.50s/it] 59%|█████▊ | 20088/34278 [22:09:33<13:08:24, 3.33s/it] {'loss': 0.1262, 'grad_norm': 0.9321607175072781, 'learning_rate': 3.859975140713042e-06, 'epoch': 0.59} 59%|█████▊ | 20088/34278 [22:09:33<13:08:24, 3.33s/it] 59%|█████▊ | 20089/34278 [22:09:38<15:24:15, 3.91s/it] {'loss': 0.1186, 'grad_norm': 0.8740670292774305, 'learning_rate': 3.859515155577122e-06, 'epoch': 0.59} 59%|█████▊ | 20089/34278 [22:09:38<15:24:15, 3.91s/it] 59%|█████▊ | 20090/34278 [22:09:44<17:46:50, 4.51s/it] {'loss': 0.1444, 'grad_norm': 0.885447387603564, 'learning_rate': 3.859055180623178e-06, 'epoch': 0.59} 59%|█████▊ | 20090/34278 [22:09:44<17:46:50, 4.51s/it] 59%|█████▊ | 20091/34278 [22:09:48<17:23:13, 4.41s/it] {'loss': 0.1217, 'grad_norm': 1.1201388672989014, 'learning_rate': 
3.858595215855318e-06, 'epoch': 0.59} 59%|█████▊ | 20091/34278 [22:09:48<17:23:13, 4.41s/it] 59%|█████▊ | 20092/34278 [22:09:51<15:54:46, 4.04s/it] {'loss': 0.1169, 'grad_norm': 0.8479948512544705, 'learning_rate': 3.858135261277645e-06, 'epoch': 0.59} 59%|█████▊ | 20092/34278 [22:09:51<15:54:46, 4.04s/it] 59%|█████▊ | 20093/34278 [22:09:56<17:13:23, 4.37s/it] {'loss': 0.1064, 'grad_norm': 0.8185512498549288, 'learning_rate': 3.85767531689427e-06, 'epoch': 0.59} 59%|█████▊ | 20093/34278 [22:09:56<17:13:23, 4.37s/it] 59%|█████▊ | 20094/34278 [22:10:02<18:28:33, 4.69s/it] {'loss': 0.1252, 'grad_norm': 0.9016533709889478, 'learning_rate': 3.857215382709296e-06, 'epoch': 0.59} 59%|█████▊ | 20094/34278 [22:10:02<18:28:33, 4.69s/it] 59%|█████▊ | 20095/34278 [22:10:05<16:29:41, 4.19s/it] {'loss': 0.1119, 'grad_norm': 0.9268200388301004, 'learning_rate': 3.856755458726831e-06, 'epoch': 0.59} 59%|█████▊ | 20095/34278 [22:10:05<16:29:41, 4.19s/it] 59%|█████▊ | 20096/34278 [22:10:08<15:24:13, 3.91s/it] {'loss': 0.1365, 'grad_norm': 0.8720738752910677, 'learning_rate': 3.8562955449509814e-06, 'epoch': 0.59} 59%|█████▊ | 20096/34278 [22:10:08<15:24:13, 3.91s/it] 59%|█████▊ | 20097/34278 [22:10:14<17:50:34, 4.53s/it] {'loss': 0.1183, 'grad_norm': 0.9482182903925894, 'learning_rate': 3.85583564138585e-06, 'epoch': 0.59} 59%|█████▊ | 20097/34278 [22:10:14<17:50:34, 4.53s/it] 59%|█████▊ | 20098/34278 [22:10:17<16:18:25, 4.14s/it] {'loss': 0.1304, 'grad_norm': 0.8620664503063633, 'learning_rate': 3.855375748035545e-06, 'epoch': 0.59} 59%|█████▊ | 20098/34278 [22:10:17<16:18:25, 4.14s/it] 59%|█████▊ | 20099/34278 [22:10:20<15:21:40, 3.90s/it] {'loss': 0.15, 'grad_norm': 0.9452053302308003, 'learning_rate': 3.854915864904173e-06, 'epoch': 0.59} 59%|█████▊ | 20099/34278 [22:10:20<15:21:40, 3.90s/it] 59%|█████▊ | 20100/34278 [22:10:24<14:59:23, 3.81s/it] {'loss': 0.1411, 'grad_norm': 1.0589915214326235, 'learning_rate': 3.854455991995838e-06, 'epoch': 0.59} 59%|█████▊ | 20100/34278 
[22:10:24<14:59:23, 3.81s/it] 59%|█████▊ | 20101/34278 [22:10:28<15:39:42, 3.98s/it] {'loss': 0.1176, 'grad_norm': 0.857230424040825, 'learning_rate': 3.853996129314649e-06, 'epoch': 0.59} 59%|█████▊ | 20101/34278 [22:10:28<15:39:42, 3.98s/it] 59%|█████▊ | 20102/34278 [22:10:32<15:28:27, 3.93s/it] {'loss': 0.1379, 'grad_norm': 0.9205323384255356, 'learning_rate': 3.853536276864707e-06, 'epoch': 0.59} 59%|█████▊ | 20102/34278 [22:10:32<15:28:27, 3.93s/it] 59%|█████▊ | 20103/34278 [22:10:35<14:24:49, 3.66s/it] {'loss': 0.133, 'grad_norm': 1.048169501315199, 'learning_rate': 3.853076434650119e-06, 'epoch': 0.59} 59%|█████▊ | 20103/34278 [22:10:35<14:24:49, 3.66s/it] 59%|█████▊ | 20104/34278 [22:10:40<16:11:07, 4.11s/it] {'loss': 0.1203, 'grad_norm': 1.0178603010415677, 'learning_rate': 3.8526166026749904e-06, 'epoch': 0.59} 59%|█████▊ | 20104/34278 [22:10:40<16:11:07, 4.11s/it] 59%|█████▊ | 20105/34278 [22:10:43<14:48:35, 3.76s/it] {'loss': 0.132, 'grad_norm': 1.096479765021856, 'learning_rate': 3.852156780943428e-06, 'epoch': 0.59} 59%|█████▊ | 20105/34278 [22:10:43<14:48:35, 3.76s/it] 59%|█████▊ | 20106/34278 [22:10:46<13:59:33, 3.55s/it] {'loss': 0.1407, 'grad_norm': 1.0845604859818005, 'learning_rate': 3.851696969459536e-06, 'epoch': 0.59} 59%|█████▊ | 20106/34278 [22:10:46<13:59:33, 3.55s/it] 59%|█████▊ | 20107/34278 [22:10:50<13:55:23, 3.54s/it] {'loss': 0.1172, 'grad_norm': 0.872584285654691, 'learning_rate': 3.851237168227419e-06, 'epoch': 0.59} 59%|█████▊ | 20107/34278 [22:10:50<13:55:23, 3.54s/it] 59%|█████▊ | 20108/34278 [22:10:53<13:32:00, 3.44s/it] {'loss': 0.1551, 'grad_norm': 1.036295087112203, 'learning_rate': 3.850777377251183e-06, 'epoch': 0.59} 59%|█████▊ | 20108/34278 [22:10:53<13:32:00, 3.44s/it] 59%|█████▊ | 20109/34278 [22:10:57<14:26:02, 3.67s/it] {'loss': 0.1396, 'grad_norm': 1.1641155943704615, 'learning_rate': 3.850317596534932e-06, 'epoch': 0.59} 59%|█████▊ | 20109/34278 [22:10:57<14:26:02, 3.67s/it] 59%|█████▊ | 20110/34278 
[22:11:01<14:01:14, 3.56s/it] {'loss': 0.1247, 'grad_norm': 1.0923985167320642, 'learning_rate': 3.849857826082769e-06, 'epoch': 0.59} 59%|█████▊ | 20110/34278 [22:11:01<14:01:14, 3.56s/it] 59%|█████▊ | 20111/34278 [22:11:04<13:31:38, 3.44s/it] {'loss': 0.131, 'grad_norm': 0.987160959863191, 'learning_rate': 3.849398065898802e-06, 'epoch': 0.59} 59%|█████▊ | 20111/34278 [22:11:04<13:31:38, 3.44s/it] 59%|█████▊ | 20112/34278 [22:11:07<13:39:27, 3.47s/it] {'loss': 0.129, 'grad_norm': 1.253080785508952, 'learning_rate': 3.848938315987135e-06, 'epoch': 0.59} 59%|█████▊ | 20112/34278 [22:11:07<13:39:27, 3.47s/it] 59%|█████▊ | 20113/34278 [22:11:11<13:28:25, 3.42s/it] {'loss': 0.1245, 'grad_norm': 1.1538753717837746, 'learning_rate': 3.848478576351873e-06, 'epoch': 0.59} 59%|█████▊ | 20113/34278 [22:11:11<13:28:25, 3.42s/it] 59%|█████▊ | 20114/34278 [22:11:14<13:26:03, 3.41s/it] {'loss': 0.1422, 'grad_norm': 1.0333923564407248, 'learning_rate': 3.848018846997117e-06, 'epoch': 0.59} 59%|█████▊ | 20114/34278 [22:11:14<13:26:03, 3.41s/it] 59%|█████▊ | 20115/34278 [22:11:19<14:54:17, 3.79s/it] {'loss': 0.1214, 'grad_norm': 0.6461112413466147, 'learning_rate': 3.847559127926975e-06, 'epoch': 0.59} 59%|█████▊ | 20115/34278 [22:11:19<14:54:17, 3.79s/it] 59%|█████▊ | 20116/34278 [22:11:21<13:39:18, 3.47s/it] {'loss': 0.1309, 'grad_norm': 0.8272750820515089, 'learning_rate': 3.847099419145549e-06, 'epoch': 0.59} 59%|█████▊ | 20116/34278 [22:11:21<13:39:18, 3.47s/it] 59%|█████▊ | 20117/34278 [22:11:25<13:18:35, 3.38s/it] {'loss': 0.1581, 'grad_norm': 0.9896185719377043, 'learning_rate': 3.846639720656944e-06, 'epoch': 0.59} 59%|█████▊ | 20117/34278 [22:11:25<13:18:35, 3.38s/it] 59%|█████▊ | 20118/34278 [22:11:29<14:39:11, 3.73s/it] {'loss': 0.1131, 'grad_norm': 0.7564900045154057, 'learning_rate': 3.846180032465267e-06, 'epoch': 0.59} 59%|█████▊ | 20118/34278 [22:11:29<14:39:11, 3.73s/it] 59%|█████▊ | 20119/34278 [22:11:32<13:55:42, 3.54s/it] {'loss': 0.1087, 'grad_norm': 
0.8645903721445637, 'learning_rate': 3.845720354574617e-06, 'epoch': 0.59} 59%|█████▊ | 20119/34278 [22:11:32<13:55:42, 3.54s/it] 59%|█████▊ | 20120/34278 [22:11:35<12:58:47, 3.30s/it] {'loss': 0.1454, 'grad_norm': 0.8617417665144481, 'learning_rate': 3.845260686989101e-06, 'epoch': 0.59} 59%|█████▊ | 20120/34278 [22:11:35<12:58:47, 3.30s/it] 59%|█████▊ | 20121/34278 [22:11:38<12:44:20, 3.24s/it] {'loss': 0.1068, 'grad_norm': 0.6203208205541115, 'learning_rate': 3.844801029712822e-06, 'epoch': 0.59} 59%|█████▊ | 20121/34278 [22:11:38<12:44:20, 3.24s/it] 59%|█████▊ | 20122/34278 [22:11:44<15:55:11, 4.05s/it] {'loss': 0.1434, 'grad_norm': 0.7631944690684509, 'learning_rate': 3.844341382749881e-06, 'epoch': 0.59} 59%|█████▊ | 20122/34278 [22:11:44<15:55:11, 4.05s/it] 59%|█████▊ | 20123/34278 [22:11:47<14:30:16, 3.69s/it] {'loss': 0.1086, 'grad_norm': 0.8825347793520131, 'learning_rate': 3.843881746104387e-06, 'epoch': 0.59} 59%|█████▊ | 20123/34278 [22:11:47<14:30:16, 3.69s/it] 59%|█████▊ | 20124/34278 [22:11:50<13:49:06, 3.51s/it] {'loss': 0.1271, 'grad_norm': 1.6912087255054722, 'learning_rate': 3.84342211978044e-06, 'epoch': 0.59} 59%|█████▊ | 20124/34278 [22:11:50<13:49:06, 3.51s/it] 59%|█████▊ | 20125/34278 [22:11:53<13:15:00, 3.37s/it] {'loss': 0.123, 'grad_norm': 0.8616899578604179, 'learning_rate': 3.842962503782145e-06, 'epoch': 0.59} 59%|█████▊ | 20125/34278 [22:11:53<13:15:00, 3.37s/it] 59%|█████▊ | 20126/34278 [22:11:57<13:29:44, 3.43s/it] {'loss': 0.1384, 'grad_norm': 0.6871094352328266, 'learning_rate': 3.842502898113604e-06, 'epoch': 0.59} 59%|█████▊ | 20126/34278 [22:11:57<13:29:44, 3.43s/it] 59%|█████▊ | 20127/34278 [22:12:00<13:03:01, 3.32s/it] {'loss': 0.1303, 'grad_norm': 0.9791209284560194, 'learning_rate': 3.842043302778921e-06, 'epoch': 0.59} 59%|█████▊ | 20127/34278 [22:12:00<13:03:01, 3.32s/it] 59%|█████▊ | 20128/34278 [22:12:03<12:51:04, 3.27s/it] {'loss': 0.1083, 'grad_norm': 0.6875342564927575, 'learning_rate': 3.8415837177821976e-06, 
'epoch': 0.59} 59%|█████▊ | 20128/34278 [22:12:03<12:51:04, 3.27s/it] 59%|█████▊ | 20129/34278 [22:12:06<12:34:26, 3.20s/it] {'loss': 0.1147, 'grad_norm': 0.8012390508740657, 'learning_rate': 3.841124143127539e-06, 'epoch': 0.59} 59%|█████▊ | 20129/34278 [22:12:06<12:34:26, 3.20s/it] 59%|█████▊ | 20130/34278 [22:12:10<13:31:24, 3.44s/it] {'loss': 0.1489, 'grad_norm': 0.8787926292352068, 'learning_rate': 3.840664578819047e-06, 'epoch': 0.59} 59%|█████▊ | 20130/34278 [22:12:10<13:31:24, 3.44s/it] 59%|█████▊ | 20131/34278 [22:12:13<12:49:47, 3.26s/it] {'loss': 0.1322, 'grad_norm': 0.8967703054374653, 'learning_rate': 3.8402050248608266e-06, 'epoch': 0.59} 59%|█████▊ | 20131/34278 [22:12:13<12:49:47, 3.26s/it] 59%|█████▊ | 20132/34278 [22:12:16<12:34:20, 3.20s/it] {'loss': 0.1346, 'grad_norm': 0.8685471794163647, 'learning_rate': 3.839745481256979e-06, 'epoch': 0.59} 59%|█████▊ | 20132/34278 [22:12:16<12:34:20, 3.20s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faf789ddee0>
Failed to fetch sample 3268268. Exception: cannot identify image file <_io.BytesIO object at 0x7faf789ddee0>
 59%|█████▊ | 20133/34278 [22:12:22<15:51:29, 4.04s/it] {'loss': 0.126, 'grad_norm': 0.8787694749575461, 'learning_rate': 3.839285948011605e-06, 'epoch': 0.59} 59%|█████▊ | 20133/34278 [22:12:22<15:51:29, 4.04s/it] 59%|█████▊ | 20134/34278 [22:12:25<15:03:37, 3.83s/it] {'loss': 0.1205, 'grad_norm': 0.8927126506899811, 'learning_rate': 3.838826425128809e-06, 'epoch': 0.59} 59%|█████▊ | 20134/34278 [22:12:25<15:03:37, 3.83s/it] 59%|█████▊ | 20135/34278 [22:12:29<14:49:39, 3.77s/it] {'loss': 0.1326, 'grad_norm': 0.8486994366293353, 'learning_rate': 3.838366912612694e-06, 'epoch': 0.59} 59%|█████▊ | 20135/34278 [22:12:29<14:49:39, 3.77s/it] 59%|█████▊ | 20136/34278 [22:12:33<14:54:53, 3.80s/it] {'loss': 0.1266, 'grad_norm': 0.7935926414817781, 'learning_rate': 3.837907410467363e-06, 'epoch': 0.59} 59%|█████▊ | 20136/34278 [22:12:33<14:54:53, 3.80s/it] 59%|█████▊ | 20137/34278 [22:12:36<14:06:13, 3.59s/it] {'loss': 0.1424, 'grad_norm': 0.7376191128550542, 'learning_rate': 3.837447918696915e-06, 'epoch': 0.59} 59%|█████▊ | 20137/34278 [22:12:36<14:06:13, 3.59s/it] 59%|█████▊ | 20138/34278 [22:12:39<13:21:25, 3.40s/it] {'loss': 0.1177, 'grad_norm': 1.0005943673066164, 'learning_rate': 3.836988437305457e-06, 'epoch': 0.59} 59%|█████▊ | 20138/34278 [22:12:39<13:21:25, 3.40s/it] 59%|█████▉ | 20139/34278 [22:12:42<13:13:40, 3.37s/it] {'loss': 0.1331, 'grad_norm': 0.9030270167580045, 'learning_rate': 3.836528966297087e-06, 'epoch': 0.59} 59%|█████▉ | 20139/34278 [22:12:42<13:13:40, 3.37s/it] 59%|█████▉ | 20140/34278 [22:12:45<13:10:01, 3.35s/it] {'loss': 0.1193, 'grad_norm': 0.6526625704100362, 'learning_rate': 
3.836069505675909e-06, 'epoch': 0.59} 59%|█████▉ | 20140/34278 [22:12:45<13:10:01, 3.35s/it] 59%|█████▉ | 20141/34278 [22:12:49<13:16:19, 3.38s/it] {'loss': 0.1357, 'grad_norm': 1.019786180214768, 'learning_rate': 3.835610055446024e-06, 'epoch': 0.59} 59%|█████▉ | 20141/34278 [22:12:49<13:16:19, 3.38s/it] 59%|█████▉ | 20142/34278 [22:12:52<13:42:25, 3.49s/it] {'loss': 0.1444, 'grad_norm': 1.0235832849047515, 'learning_rate': 3.835150615611535e-06, 'epoch': 0.59} 59%|█████▉ | 20142/34278 [22:12:53<13:42:25, 3.49s/it] 59%|█████▉ | 20143/34278 [22:12:56<13:55:37, 3.55s/it] {'loss': 0.114, 'grad_norm': 0.6907795557456458, 'learning_rate': 3.8346911861765444e-06, 'epoch': 0.59} 59%|█████▉ | 20143/34278 [22:12:56<13:55:37, 3.55s/it] 59%|█████▉ | 20144/34278 [22:12:59<13:21:32, 3.40s/it] {'loss': 0.1151, 'grad_norm': 0.6595963428949111, 'learning_rate': 3.83423176714515e-06, 'epoch': 0.59} 59%|█████▉ | 20144/34278 [22:12:59<13:21:32, 3.40s/it] 59%|█████▉ | 20145/34278 [22:13:02<12:42:33, 3.24s/it] {'loss': 0.1326, 'grad_norm': 0.9794914247080527, 'learning_rate': 3.833772358521458e-06, 'epoch': 0.59} 59%|█████▉ | 20145/34278 [22:13:02<12:42:33, 3.24s/it] 59%|█████▉ | 20146/34278 [22:13:05<12:40:05, 3.23s/it] {'loss': 0.1344, 'grad_norm': 0.7917853262638629, 'learning_rate': 3.833312960309567e-06, 'epoch': 0.59} 59%|█████▉ | 20146/34278 [22:13:05<12:40:05, 3.23s/it] 59%|█████▉ | 20147/34278 [22:13:08<12:35:53, 3.21s/it] {'loss': 0.122, 'grad_norm': 0.7057080667836103, 'learning_rate': 3.83285357251358e-06, 'epoch': 0.59} 59%|█████▉ | 20147/34278 [22:13:08<12:35:53, 3.21s/it] 59%|█████▉ | 20148/34278 [22:13:12<13:01:46, 3.32s/it] {'loss': 0.1672, 'grad_norm': 0.927868335098306, 'learning_rate': 3.832394195137599e-06, 'epoch': 0.59} 59%|█████▉ | 20148/34278 [22:13:12<13:01:46, 3.32s/it] 59%|█████▉ | 20149/34278 [22:13:16<14:10:31, 3.61s/it] {'loss': 0.1141, 'grad_norm': 0.7746056396313836, 'learning_rate': 3.8319348281857215e-06, 'epoch': 0.59} 59%|█████▉ | 20149/34278 
[22:13:16<14:10:31, 3.61s/it] 59%|█████▉ | 20150/34278 [22:13:22<16:52:54, 4.30s/it] {'loss': 0.0981, 'grad_norm': 0.827359989904886, 'learning_rate': 3.831475471662052e-06, 'epoch': 0.59} 59%|█████▉ | 20150/34278 [22:13:22<16:52:54, 4.30s/it] 59%|█████▉ | 20151/34278 [22:13:27<16:58:22, 4.33s/it] {'loss': 0.1292, 'grad_norm': 0.8608432342331177, 'learning_rate': 3.831016125570692e-06, 'epoch': 0.59} 59%|█████▉ | 20151/34278 [22:13:27<16:58:22, 4.33s/it] 59%|█████▉ | 20152/34278 [22:13:30<15:26:06, 3.93s/it] {'loss': 0.1526, 'grad_norm': 0.7438997762371768, 'learning_rate': 3.830556789915737e-06, 'epoch': 0.59} 59%|█████▉ | 20152/34278 [22:13:30<15:26:06, 3.93s/it] 59%|█████▉ | 20153/34278 [22:13:35<16:46:07, 4.27s/it] {'loss': 0.1371, 'grad_norm': 0.9969349527665727, 'learning_rate': 3.830097464701296e-06, 'epoch': 0.59} 59%|█████▉ | 20153/34278 [22:13:35<16:46:07, 4.27s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 59%|█████▉ | 20154/34278 [22:13:39<16:14:01, 4.14s/it] {'loss': 0.1135, 'grad_norm': 0.7159327366780542, 'learning_rate': 3.829638149931464e-06, 'epoch': 0.59} 59%|█████▉ | 20154/34278 [22:13:39<16:14:01, 4.14s/it] 59%|█████▉ | 20155/34278 [22:13:45<18:38:34, 4.75s/it] {'loss': 0.1361, 'grad_norm': 1.1501666243945665, 'learning_rate': 3.829178845610343e-06, 'epoch': 0.59} 59%|█████▉ | 20155/34278 [22:13:45<18:38:34, 4.75s/it] 59%|█████▉ | 20156/34278 [22:13:48<17:15:22, 4.40s/it] {'loss': 0.1157, 'grad_norm': 1.1771798621100007, 'learning_rate': 3.8287195517420345e-06, 'epoch': 0.59} 59%|█████▉ | 20156/34278 [22:13:48<17:15:22, 4.40s/it] 59%|█████▉ | 20157/34278 [22:13:52<15:57:11, 4.07s/it] {'loss': 0.1118, 'grad_norm': 0.7381653491865249, 'learning_rate': 3.828260268330638e-06, 'epoch': 0.59} 59%|█████▉ | 20157/34278 [22:13:52<15:57:11, 4.07s/it] 59%|█████▉ | 20158/34278 [22:13:56<16:36:48, 4.24s/it] {'loss': 0.1444, 'grad_norm': 0.763061630725222, 'learning_rate': 3.827800995380252e-06, 'epoch': 0.59} 59%|█████▉ | 20158/34278 [22:13:56<16:36:48, 4.24s/it] 59%|█████▉ | 20159/34278 [22:14:01<16:52:41, 4.30s/it] {'loss': 0.1232, 'grad_norm': 1.3101333418024994, 'learning_rate': 3.827341732894981e-06, 'epoch': 0.59} 59%|█████▉ | 20159/34278 [22:14:01<16:52:41, 4.30s/it] 59%|█████▉ | 20160/34278 [22:14:04<15:09:59, 3.87s/it] {'loss': 0.0984, 'grad_norm': 0.93697661865873, 'learning_rate': 3.826882480878923e-06, 'epoch': 0.59} 59%|█████▉ | 20160/34278 [22:14:04<15:09:59, 3.87s/it] 59%|█████▉ | 20161/34278 [22:14:07<14:19:30, 3.65s/it] {'loss': 0.1178, 'grad_norm': 0.8497757827537434, 'learning_rate': 3.8264232393361785e-06, 'epoch': 0.59} 59%|█████▉ | 20161/34278 [22:14:07<14:19:30, 3.65s/it] 59%|█████▉ | 20162/34278 [22:14:11<14:33:10, 3.71s/it] {'loss': 0.1347, 'grad_norm': 1.1924786070328814, 'learning_rate': 3.825964008270847e-06, 'epoch': 0.59} 59%|█████▉ | 20162/34278 [22:14:11<14:33:10, 3.71s/it] 59%|█████▉ | 20163/34278 
[22:14:14<13:49:53, 3.53s/it] {'loss': 0.0987, 'grad_norm': 1.0225547594735644, 'learning_rate': 3.825504787687027e-06, 'epoch': 0.59} 59%|█████▉ | 20163/34278 [22:14:14<13:49:53, 3.53s/it] 59%|█████▉ | 20164/34278 [22:14:17<13:54:28, 3.55s/it] {'loss': 0.1282, 'grad_norm': 1.0026292756298758, 'learning_rate': 3.82504557758882e-06, 'epoch': 0.59} 59%|█████▉ | 20164/34278 [22:14:17<13:54:28, 3.55s/it] 59%|█████▉ | 20165/34278 [22:14:20<13:30:30, 3.45s/it] {'loss': 0.1291, 'grad_norm': 0.8335342189841055, 'learning_rate': 3.824586377980328e-06, 'epoch': 0.59} 59%|█████▉ | 20165/34278 [22:14:20<13:30:30, 3.45s/it] 59%|█████▉ | 20166/34278 [22:14:27<16:54:10, 4.31s/it] {'loss': 0.1572, 'grad_norm': 1.302804691078596, 'learning_rate': 3.824127188865647e-06, 'epoch': 0.59} 59%|█████▉ | 20166/34278 [22:14:27<16:54:10, 4.31s/it] 59%|█████▉ | 20167/34278 [22:14:30<15:45:00, 4.02s/it] {'loss': 0.1274, 'grad_norm': 0.985718268969789, 'learning_rate': 3.823668010248877e-06, 'epoch': 0.59} 59%|█████▉ | 20167/34278 [22:14:30<15:45:00, 4.02s/it] 59%|█████▉ | 20168/34278 [22:14:35<17:17:36, 4.41s/it] {'loss': 0.1257, 'grad_norm': 1.597761621760707, 'learning_rate': 3.82320884213412e-06, 'epoch': 0.59} 59%|█████▉ | 20168/34278 [22:14:35<17:17:36, 4.41s/it] 59%|█████▉ | 20169/34278 [22:14:39<16:41:02, 4.26s/it] {'loss': 0.1345, 'grad_norm': 0.6882805232081349, 'learning_rate': 3.822749684525472e-06, 'epoch': 0.59} 59%|█████▉ | 20169/34278 [22:14:39<16:41:02, 4.26s/it] 59%|█████▉ | 20170/34278 [22:14:42<15:02:55, 3.84s/it] {'loss': 0.1423, 'grad_norm': 0.8339342206647082, 'learning_rate': 3.822290537427033e-06, 'epoch': 0.59} 59%|█████▉ | 20170/34278 [22:14:42<15:02:55, 3.84s/it] 59%|█████▉ | 20171/34278 [22:14:48<17:36:42, 4.49s/it] {'loss': 0.1219, 'grad_norm': 1.2302623407076574, 'learning_rate': 3.8218314008429045e-06, 'epoch': 0.59} 59%|█████▉ | 20171/34278 [22:14:48<17:36:42, 4.49s/it] 59%|█████▉ | 20172/34278 [22:14:51<16:00:23, 4.09s/it] {'loss': 0.1431, 'grad_norm': 
1.0556585914346928, 'learning_rate': 3.821372274777183e-06, 'epoch': 0.59} 59%|█████▉ | 20172/34278 [22:14:51<16:00:23, 4.09s/it] 59%|█████▉ | 20173/34278 [22:14:56<16:36:45, 4.24s/it] {'loss': 0.1443, 'grad_norm': 0.8827344637910692, 'learning_rate': 3.82091315923397e-06, 'epoch': 0.59} 59%|█████▉ | 20173/34278 [22:14:56<16:36:45, 4.24s/it] 59%|█████▉ | 20174/34278 [22:15:02<18:58:18, 4.84s/it] {'loss': 0.1351, 'grad_norm': 0.674349674596774, 'learning_rate': 3.820454054217362e-06, 'epoch': 0.59} 59%|█████▉ | 20174/34278 [22:15:02<18:58:18, 4.84s/it] 59%|█████▉ | 20175/34278 [22:15:05<16:46:36, 4.28s/it] {'loss': 0.1523, 'grad_norm': 0.8529289725548811, 'learning_rate': 3.8199949597314586e-06, 'epoch': 0.59} 59%|█████▉ | 20175/34278 [22:15:05<16:46:36, 4.28s/it] 59%|█████▉ | 20176/34278 [22:15:08<15:36:00, 3.98s/it] {'loss': 0.1385, 'grad_norm': 0.9926984196468841, 'learning_rate': 3.819535875780357e-06, 'epoch': 0.59} 59%|█████▉ | 20176/34278 [22:15:08<15:36:00, 3.98s/it] 59%|█████▉ | 20177/34278 [22:15:11<14:19:13, 3.66s/it] {'loss': 0.1275, 'grad_norm': 0.6518237954321938, 'learning_rate': 3.8190768023681585e-06, 'epoch': 0.59} 59%|█████▉ | 20177/34278 [22:15:11<14:19:13, 3.66s/it] 59%|█████▉ | 20178/34278 [22:15:14<13:35:34, 3.47s/it] {'loss': 0.1051, 'grad_norm': 0.6130187369350059, 'learning_rate': 3.818617739498962e-06, 'epoch': 0.59} 59%|█████▉ | 20178/34278 [22:15:14<13:35:34, 3.47s/it] 59%|█████▉ | 20179/34278 [22:15:18<13:54:54, 3.55s/it] {'loss': 0.1431, 'grad_norm': 0.8051082249111425, 'learning_rate': 3.818158687176862e-06, 'epoch': 0.59} 59%|█████▉ | 20179/34278 [22:15:18<13:54:54, 3.55s/it] 59%|█████▉ | 20180/34278 [22:15:22<14:37:12, 3.73s/it] {'loss': 0.1361, 'grad_norm': 0.6799712927962633, 'learning_rate': 3.81769964540596e-06, 'epoch': 0.59} 59%|█████▉ | 20180/34278 [22:15:22<14:37:12, 3.73s/it] 59%|█████▉ | 20181/34278 [22:15:25<14:00:14, 3.58s/it] {'loss': 0.1174, 'grad_norm': 0.6967539028836852, 'learning_rate': 3.817240614190354e-06, 
'epoch': 0.59} 59%|█████▉ | 20181/34278 [22:15:25<14:00:14, 3.58s/it] 59%|█████▉ | 20182/34278 [22:15:29<13:50:11, 3.53s/it] {'loss': 0.1125, 'grad_norm': 0.9381637487702315, 'learning_rate': 3.816781593534139e-06, 'epoch': 0.59} 59%|█████▉ | 20182/34278 [22:15:29<13:50:11, 3.53s/it] 59%|█████▉ | 20183/34278 [22:15:32<13:12:54, 3.38s/it] {'loss': 0.1104, 'grad_norm': 0.6395438147681456, 'learning_rate': 3.816322583441419e-06, 'epoch': 0.59} 59%|█████▉ | 20183/34278 [22:15:32<13:12:54, 3.38s/it] 59%|█████▉ | 20184/34278 [22:15:38<16:18:37, 4.17s/it] {'loss': 0.1201, 'grad_norm': 0.6855004873407058, 'learning_rate': 3.815863583916286e-06, 'epoch': 0.59} 59%|█████▉ | 20184/34278 [22:15:38<16:18:37, 4.17s/it] 59%|█████▉ | 20185/34278 [22:15:41<14:34:55, 3.72s/it] {'loss': 0.1391, 'grad_norm': 0.9913520040827561, 'learning_rate': 3.815404594962841e-06, 'epoch': 0.59} 59%|█████▉ | 20185/34278 [22:15:41<14:34:55, 3.72s/it] 59%|█████▉ | 20186/34278 [22:15:47<17:32:03, 4.48s/it] {'loss': 0.1347, 'grad_norm': 0.9338284679029946, 'learning_rate': 3.814945616585182e-06, 'epoch': 0.59} 59%|█████▉ | 20186/34278 [22:15:47<17:32:03, 4.48s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
59%|█████▉ | 20187/34278 [22:15:50<15:51:28, 4.05s/it] {'loss': 0.1076, 'grad_norm': 0.8037720846213434, 'learning_rate': 3.8144866487874043e-06, 'epoch': 0.59} 59%|█████▉ | 20187/34278 [22:15:50<15:51:28, 4.05s/it] 59%|█████▉ | 20188/34278 [22:15:53<14:42:24, 3.76s/it] {'loss': 0.1379, 'grad_norm': 0.9015038126636395, 'learning_rate': 3.8140276915736056e-06, 'epoch': 0.59} 59%|█████▉ | 20188/34278 [22:15:53<14:42:24, 3.76s/it] 59%|█████▉ | 20189/34278 [22:15:58<16:13:48, 4.15s/it] {'loss': 0.1487, 'grad_norm': 1.079556128494006, 'learning_rate': 3.8135687449478865e-06, 'epoch': 0.59} 59%|█████▉ | 20189/34278 [22:15:58<16:13:48, 4.15s/it] 59%|█████▉ | 20190/34278 [22:16:02<16:08:47, 4.13s/it] {'loss': 0.14, 'grad_norm': 0.9209749360880155, 'learning_rate': 3.8131098089143415e-06, 'epoch': 0.59} 59%|█████▉ | 20190/34278 [22:16:02<16:08:47, 4.13s/it] 59%|█████▉ | 20191/34278 [22:16:05<14:57:34, 3.82s/it] {'loss': 0.1298, 'grad_norm': 0.8223976212634202, 'learning_rate': 3.8126508834770703e-06, 'epoch': 0.59} 59%|█████▉ | 20191/34278 [22:16:05<14:57:34, 3.82s/it] 59%|█████▉ | 20192/34278 [22:16:10<15:46:33, 4.03s/it] {'loss': 0.1412, 'grad_norm': 0.8623040227317921, 'learning_rate': 3.812191968640167e-06, 'epoch': 0.59} 59%|█████▉ | 20192/34278 [22:16:10<15:46:33, 4.03s/it] 59%|█████▉ | 20193/34278 [22:16:13<15:18:47, 3.91s/it] {'loss': 0.1179, 'grad_norm': 0.8455280296843426, 'learning_rate': 3.811733064407731e-06, 'epoch': 0.59} 59%|█████▉ | 20193/34278 [22:16:13<15:18:47, 3.91s/it] 59%|█████▉ | 20194/34278 [22:16:17<15:06:01, 3.86s/it] {'loss': 0.1312, 'grad_norm': 1.0134298386784872, 'learning_rate': 3.811274170783857e-06, 'epoch': 0.59} 59%|█████▉ | 20194/34278 [22:16:17<15:06:01, 3.86s/it] 59%|█████▉ | 20195/34278 [22:16:20<14:02:06, 3.59s/it] {'loss': 0.13, 'grad_norm': 0.8541302059261543, 'learning_rate': 3.8108152877726457e-06, 'epoch': 0.59} 59%|█████▉ | 20195/34278 [22:16:20<14:02:06, 3.59s/it] 59%|█████▉ | 20196/34278
[22:16:24<13:51:26, 3.54s/it] {'loss': 0.105, 'grad_norm': 0.7675668199312625, 'learning_rate': 3.8103564153781904e-06, 'epoch': 0.59} 59%|█████▉ | 20196/34278 [22:16:24<13:51:26, 3.54s/it] 59%|█████▉ | 20197/34278 [22:16:27<13:33:16, 3.47s/it] {'loss': 0.1329, 'grad_norm': 1.0805143242330255, 'learning_rate': 3.809897553604589e-06, 'epoch': 0.59} 59%|█████▉ | 20197/34278 [22:16:27<13:33:16, 3.47s/it] 59%|█████▉ | 20198/34278 [22:16:31<14:34:07, 3.72s/it] {'loss': 0.1416, 'grad_norm': 0.9027691343169723, 'learning_rate': 3.80943870245594e-06, 'epoch': 0.59} 59%|█████▉ | 20198/34278 [22:16:31<14:34:07, 3.72s/it] 59%|█████▉ | 20199/34278 [22:16:34<13:37:06, 3.48s/it] {'loss': 0.1142, 'grad_norm': 1.0002746479320108, 'learning_rate': 3.808979861936336e-06, 'epoch': 0.59} 59%|█████▉ | 20199/34278 [22:16:34<13:37:06, 3.48s/it] 59%|█████▉ | 20200/34278 [22:16:37<12:49:53, 3.28s/it] {'loss': 0.1113, 'grad_norm': 0.8930213473446856, 'learning_rate': 3.808521032049875e-06, 'epoch': 0.59} 59%|█████▉ | 20200/34278 [22:16:37<12:49:53, 3.28s/it] 59%|█████▉ | 20201/34278 [22:16:43<16:28:42, 4.21s/it] {'loss': 0.1224, 'grad_norm': 0.9342242631978366, 'learning_rate': 3.8080622128006547e-06, 'epoch': 0.59} 59%|█████▉ | 20201/34278 [22:16:43<16:28:42, 4.21s/it] 59%|█████▉ | 20202/34278 [22:16:46<15:20:25, 3.92s/it] {'loss': 0.1172, 'grad_norm': 1.0747903739228073, 'learning_rate': 3.80760340419277e-06, 'epoch': 0.59} 59%|█████▉ | 20202/34278 [22:16:47<15:20:25, 3.92s/it] 59%|█████▉ | 20203/34278 [22:16:50<14:52:53, 3.81s/it] {'loss': 0.1368, 'grad_norm': 1.1361561748983318, 'learning_rate': 3.807144606230319e-06, 'epoch': 0.59} 59%|█████▉ | 20203/34278 [22:16:50<14:52:53, 3.81s/it] 59%|█████▉ | 20204/34278 [22:16:53<14:12:39, 3.64s/it] {'loss': 0.1186, 'grad_norm': 0.8532972712206477, 'learning_rate': 3.806685818917395e-06, 'epoch': 0.59} 59%|█████▉ | 20204/34278 [22:16:53<14:12:39, 3.64s/it] 59%|█████▉ | 20205/34278 [22:16:56<13:29:56, 3.45s/it] {'loss': 0.1144, 'grad_norm': 
1.0131116558195778, 'learning_rate': 3.8062270422580953e-06, 'epoch': 0.59} 59%|█████▉ | 20205/34278 [22:16:56<13:29:56, 3.45s/it] 59%|█████▉ | 20206/34278 [22:17:00<13:43:39, 3.51s/it] {'loss': 0.1386, 'grad_norm': 0.7055321851541815, 'learning_rate': 3.805768276256514e-06, 'epoch': 0.59} 59%|█████▉ | 20206/34278 [22:17:00<13:43:39, 3.51s/it] 59%|█████▉ | 20207/34278 [22:17:03<13:02:59, 3.34s/it] {'loss': 0.1033, 'grad_norm': 1.0382676058333964, 'learning_rate': 3.80530952091675e-06, 'epoch': 0.59} 59%|█████▉ | 20207/34278 [22:17:03<13:02:59, 3.34s/it] 59%|█████▉ | 20208/34278 [22:17:06<13:00:15, 3.33s/it] {'loss': 0.1192, 'grad_norm': 0.7754527877530077, 'learning_rate': 3.804850776242899e-06, 'epoch': 0.59} 59%|█████▉ | 20208/34278 [22:17:06<13:00:15, 3.33s/it] 59%|█████▉ | 20209/34278 [22:17:10<13:28:45, 3.45s/it] {'loss': 0.1183, 'grad_norm': 0.8098385858799838, 'learning_rate': 3.8043920422390527e-06, 'epoch': 0.59} 59%|█████▉ | 20209/34278 [22:17:10<13:28:45, 3.45s/it] 59%|█████▉ | 20210/34278 [22:17:13<12:44:30, 3.26s/it] {'loss': 0.1437, 'grad_norm': 1.0213690853970159, 'learning_rate': 3.80393331890931e-06, 'epoch': 0.59} 59%|█████▉ | 20210/34278 [22:17:13<12:44:30, 3.26s/it] 59%|█████▉ | 20211/34278 [22:17:17<14:30:47, 3.71s/it] {'loss': 0.1345, 'grad_norm': 0.8024946265682966, 'learning_rate': 3.8034746062577653e-06, 'epoch': 0.59} 59%|█████▉ | 20211/34278 [22:17:17<14:30:47, 3.71s/it] 59%|█████▉ | 20212/34278 [22:17:21<13:50:44, 3.54s/it] {'loss': 0.108, 'grad_norm': 0.7137791715372281, 'learning_rate': 3.803015904288511e-06, 'epoch': 0.59} 59%|█████▉ | 20212/34278 [22:17:21<13:50:44, 3.54s/it] 59%|█████▉ | 20213/34278 [22:17:24<13:12:54, 3.38s/it] {'loss': 0.1161, 'grad_norm': 0.9698080163434466, 'learning_rate': 3.8025572130056475e-06, 'epoch': 0.59} 59%|█████▉ | 20213/34278 [22:17:24<13:12:54, 3.38s/it] 59%|█████▉ | 20214/34278 [22:17:27<13:03:08, 3.34s/it] {'loss': 0.1239, 'grad_norm': 0.7361753419335797, 'learning_rate': 3.8020985324132663e-06, 
'epoch': 0.59} 59%|█████▉ | 20214/34278 [22:17:27<13:03:08, 3.34s/it] 59%|█████▉ | 20215/34278 [22:17:30<12:33:14, 3.21s/it] {'loss': 0.1362, 'grad_norm': 0.9412898368112532, 'learning_rate': 3.801639862515464e-06, 'epoch': 0.59} 59%|█████▉ | 20215/34278 [22:17:30<12:33:14, 3.21s/it] 59%|█████▉ | 20216/34278 [22:17:33<12:19:35, 3.16s/it] {'loss': 0.1246, 'grad_norm': 0.6458254491982539, 'learning_rate': 3.8011812033163365e-06, 'epoch': 0.59} 59%|█████▉ | 20216/34278 [22:17:33<12:19:35, 3.16s/it] 59%|█████▉ | 20217/34278 [22:17:38<14:52:38, 3.81s/it] {'loss': 0.1179, 'grad_norm': 0.8289219559663683, 'learning_rate': 3.800722554819975e-06, 'epoch': 0.59} 59%|█████▉ | 20217/34278 [22:17:38<14:52:38, 3.81s/it] 59%|█████▉ | 20218/34278 [22:17:42<14:32:45, 3.72s/it] {'loss': 0.1178, 'grad_norm': 1.0031912845215238, 'learning_rate': 3.8002639170304755e-06, 'epoch': 0.59} 59%|█████▉ | 20218/34278 [22:17:42<14:32:45, 3.72s/it] 59%|█████▉ | 20219/34278 [22:17:45<13:47:42, 3.53s/it] {'loss': 0.1119, 'grad_norm': 0.6587167737029508, 'learning_rate': 3.7998052899519346e-06, 'epoch': 0.59} 59%|█████▉ | 20219/34278 [22:17:45<13:47:42, 3.53s/it] 59%|█████▉ | 20220/34278 [22:17:51<16:49:35, 4.31s/it] {'loss': 0.124, 'grad_norm': 0.7147696398102091, 'learning_rate': 3.7993466735884456e-06, 'epoch': 0.59} 59%|█████▉ | 20220/34278 [22:17:51<16:49:35, 4.31s/it] 59%|█████▉ | 20221/34278 [22:17:57<18:47:59, 4.81s/it] {'loss': 0.1329, 'grad_norm': 0.8562683931200171, 'learning_rate': 3.798888067944103e-06, 'epoch': 0.59} 59%|█████▉ | 20221/34278 [22:17:57<18:47:59, 4.81s/it] 59%|█████▉ | 20222/34278 [22:18:00<16:56:00, 4.34s/it] {'loss': 0.133, 'grad_norm': 0.8384721329675718, 'learning_rate': 3.7984294730230008e-06, 'epoch': 0.59} 59%|█████▉ | 20222/34278 [22:18:00<16:56:00, 4.34s/it] 59%|█████▉ | 20223/34278 [22:18:04<16:00:56, 4.10s/it] {'loss': 0.1045, 'grad_norm': 0.7354471706626688, 'learning_rate': 3.797970888829233e-06, 'epoch': 0.59} 59%|█████▉ | 20223/34278 [22:18:04<16:00:56, 
4.10s/it] 59%|█████▉ | 20224/34278 [22:18:06<14:22:35, 3.68s/it] {'loss': 0.1199, 'grad_norm': 0.7309064317536617, 'learning_rate': 3.7975123153668935e-06, 'epoch': 0.59} 59%|█████▉ | 20224/34278 [22:18:06<14:22:35, 3.68s/it] 59%|█████▉ | 20225/34278 [22:18:10<13:46:48, 3.53s/it] {'loss': 0.117, 'grad_norm': 0.9422277741528954, 'learning_rate': 3.797053752640079e-06, 'epoch': 0.59} 59%|█████▉ | 20225/34278 [22:18:10<13:46:48, 3.53s/it] 59%|█████▉ | 20226/34278 [22:18:13<13:45:51, 3.53s/it] {'loss': 0.1219, 'grad_norm': 1.1053133172270788, 'learning_rate': 3.7965952006528805e-06, 'epoch': 0.59} 59%|█████▉ | 20226/34278 [22:18:13<13:45:51, 3.53s/it] 59%|█████▉ | 20227/34278 [22:18:16<13:23:55, 3.43s/it] {'loss': 0.1234, 'grad_norm': 0.8682861779230187, 'learning_rate': 3.796136659409393e-06, 'epoch': 0.59} 59%|█████▉ | 20227/34278 [22:18:16<13:23:55, 3.43s/it] 59%|█████▉ | 20228/34278 [22:18:22<16:00:05, 4.10s/it] {'loss': 0.1213, 'grad_norm': 0.8578679187059338, 'learning_rate': 3.7956781289137103e-06, 'epoch': 0.59} 59%|█████▉ | 20228/34278 [22:18:22<16:00:05, 4.10s/it] 59%|█████▉ | 20229/34278 [22:18:26<15:35:40, 4.00s/it] {'loss': 0.1202, 'grad_norm': 0.6627094752494258, 'learning_rate': 3.795219609169925e-06, 'epoch': 0.59} 59%|█████▉ | 20229/34278 [22:18:26<15:35:40, 4.00s/it] 59%|█████▉ | 20230/34278 [22:18:29<15:00:10, 3.84s/it] {'loss': 0.126, 'grad_norm': 1.0334651298328659, 'learning_rate': 3.7947611001821307e-06, 'epoch': 0.59} 59%|█████▉ | 20230/34278 [22:18:29<15:00:10, 3.84s/it] 59%|█████▉ | 20231/34278 [22:18:32<14:05:16, 3.61s/it] {'loss': 0.1172, 'grad_norm': 1.0303606584877127, 'learning_rate': 3.7943026019544226e-06, 'epoch': 0.59} 59%|█████▉ | 20231/34278 [22:18:32<14:05:16, 3.61s/it] 59%|█████▉ | 20232/34278 [22:18:36<14:14:29, 3.65s/it] {'loss': 0.1068, 'grad_norm': 1.1767317178537413, 'learning_rate': 3.7938441144908926e-06, 'epoch': 0.59} 59%|█████▉ | 20232/34278 [22:18:36<14:14:29, 3.65s/it] 59%|█████▉ | 20233/34278 [22:18:40<14:40:07, 
3.76s/it] {'loss': 0.1325, 'grad_norm': 0.9603041475692659, 'learning_rate': 3.7933856377956357e-06, 'epoch': 0.59} 59%|█████▉ | 20233/34278 [22:18:40<14:40:07, 3.76s/it] 59%|█████▉ | 20234/34278 [22:18:43<13:32:32, 3.47s/it] {'loss': 0.1272, 'grad_norm': 0.9127724785613851, 'learning_rate': 3.7929271718727426e-06, 'epoch': 0.59} 59%|█████▉ | 20234/34278 [22:18:43<13:32:32, 3.47s/it] 59%|█████▉ | 20235/34278 [22:18:46<13:08:30, 3.37s/it] {'loss': 0.1143, 'grad_norm': 0.8904540092492622, 'learning_rate': 3.792468716726308e-06, 'epoch': 0.59} 59%|█████▉ | 20235/34278 [22:18:46<13:08:30, 3.37s/it] 59%|█████▉ | 20236/34278 [22:18:49<13:17:44, 3.41s/it] {'loss': 0.1251, 'grad_norm': 1.1342747403471634, 'learning_rate': 3.792010272360423e-06, 'epoch': 0.59} 59%|█████▉ | 20236/34278 [22:18:49<13:17:44, 3.41s/it] 59%|█████▉ | 20237/34278 [22:18:53<12:59:39, 3.33s/it] {'loss': 0.1208, 'grad_norm': 0.9276026984423652, 'learning_rate': 3.7915518387791833e-06, 'epoch': 0.59} 59%|█████▉ | 20237/34278 [22:18:53<12:59:39, 3.33s/it] 59%|█████▉ | 20238/34278 [22:18:56<12:55:55, 3.32s/it] {'loss': 0.1347, 'grad_norm': 0.8160098687053883, 'learning_rate': 3.7910934159866807e-06, 'epoch': 0.59} 59%|█████▉ | 20238/34278 [22:18:56<12:55:55, 3.32s/it] 59%|█████▉ | 20239/34278 [22:19:02<15:49:07, 4.06s/it] {'loss': 0.1322, 'grad_norm': 1.1614221942977292, 'learning_rate': 3.790635003987007e-06, 'epoch': 0.59} 59%|█████▉ | 20239/34278 [22:19:02<15:49:07, 4.06s/it] 59%|█████▉ | 20240/34278 [22:19:06<15:36:41, 4.00s/it] {'loss': 0.1377, 'grad_norm': 1.071437658293513, 'learning_rate': 3.7901766027842553e-06, 'epoch': 0.59} 59%|█████▉ | 20240/34278 [22:19:06<15:36:41, 4.00s/it] 59%|█████▉ | 20241/34278 [22:19:09<14:35:21, 3.74s/it] {'loss': 0.1337, 'grad_norm': 0.8739567106959935, 'learning_rate': 3.7897182123825196e-06, 'epoch': 0.59} 59%|█████▉ | 20241/34278 [22:19:09<14:35:21, 3.74s/it] 59%|█████▉ | 20242/34278 [22:19:14<16:57:56, 4.35s/it] {'loss': 0.1408, 'grad_norm': 0.9832205175421125, 
'learning_rate': 3.7892598327858863e-06, 'epoch': 0.59} 59%|█████▉ | 20242/34278 [22:19:14<16:57:56, 4.35s/it] 59%|█████▉ | 20243/34278 [22:19:19<17:04:47, 4.38s/it] {'loss': 0.1191, 'grad_norm': 1.1821731010630248, 'learning_rate': 3.788801463998456e-06, 'epoch': 0.59} 59%|█████▉ | 20243/34278 [22:19:19<17:04:47, 4.38s/it] 59%|█████▉ | 20244/34278 [22:19:22<15:47:03, 4.05s/it] {'loss': 0.1076, 'grad_norm': 0.896523533953455, 'learning_rate': 3.7883431060243163e-06, 'epoch': 0.59} 59%|█████▉ | 20244/34278 [22:19:22<15:47:03, 4.05s/it] 59%|█████▉ | 20245/34278 [22:19:25<14:31:46, 3.73s/it] {'loss': 0.1536, 'grad_norm': 0.7510175788776267, 'learning_rate': 3.78788475886756e-06, 'epoch': 0.59} 59%|█████▉ | 20245/34278 [22:19:25<14:31:46, 3.73s/it] 59%|█████▉ | 20246/34278 [22:19:28<13:41:30, 3.51s/it] {'loss': 0.1198, 'grad_norm': 1.1033535775488357, 'learning_rate': 3.78742642253228e-06, 'epoch': 0.59} 59%|█████▉ | 20246/34278 [22:19:28<13:41:30, 3.51s/it] 59%|█████▉ | 20247/34278 [22:19:33<15:45:39, 4.04s/it] {'loss': 0.1434, 'grad_norm': 0.8632607001897823, 'learning_rate': 3.7869680970225663e-06, 'epoch': 0.59} 59%|█████▉ | 20247/34278 [22:19:33<15:45:39, 4.04s/it] 59%|█████▉ | 20248/34278 [22:19:37<15:16:29, 3.92s/it] {'loss': 0.1354, 'grad_norm': 0.7615522216793309, 'learning_rate': 3.786509782342511e-06, 'epoch': 0.59} 59%|█████▉ | 20248/34278 [22:19:37<15:16:29, 3.92s/it] 59%|█████▉ | 20249/34278 [22:19:40<14:15:24, 3.66s/it] {'loss': 0.1346, 'grad_norm': 0.8663061346334333, 'learning_rate': 3.7860514784962084e-06, 'epoch': 0.59} 59%|█████▉ | 20249/34278 [22:19:40<14:15:24, 3.66s/it] 59%|█████▉ | 20250/34278 [22:19:43<13:32:11, 3.47s/it] {'loss': 0.1108, 'grad_norm': 0.7234468548634531, 'learning_rate': 3.7855931854877474e-06, 'epoch': 0.59} 59%|█████▉ | 20250/34278 [22:19:43<13:32:11, 3.47s/it] 59%|█████▉ | 20251/34278 [22:19:49<16:08:26, 4.14s/it] {'loss': 0.1193, 'grad_norm': 1.1542971298041533, 'learning_rate': 3.785134903321222e-06, 'epoch': 0.59} 
59%|█████▉ | 20251/34278 [22:19:49<16:08:26, 4.14s/it] 59%|█████▉ | 20252/34278 [22:19:52<15:15:11, 3.91s/it] {'loss': 0.127, 'grad_norm': 0.720546737061622, 'learning_rate': 3.784676632000721e-06, 'epoch': 0.59} 59%|█████▉ | 20252/34278 [22:19:52<15:15:11, 3.91s/it] 59%|█████▉ | 20253/34278 [22:19:55<14:18:51, 3.67s/it] {'loss': 0.1263, 'grad_norm': 1.0360990966121526, 'learning_rate': 3.784218371530337e-06, 'epoch': 0.59} 59%|█████▉ | 20253/34278 [22:19:55<14:18:51, 3.67s/it] 59%|█████▉ | 20254/34278 [22:19:59<14:02:19, 3.60s/it] {'loss': 0.1075, 'grad_norm': 0.7628493944623506, 'learning_rate': 3.7837601219141605e-06, 'epoch': 0.59} 59%|█████▉ | 20254/34278 [22:19:59<14:02:19, 3.60s/it] 59%|█████▉ | 20255/34278 [22:20:02<13:18:42, 3.42s/it] {'loss': 0.1101, 'grad_norm': 0.7982711688955447, 'learning_rate': 3.783301883156285e-06, 'epoch': 0.59} 59%|█████▉ | 20255/34278 [22:20:02<13:18:42, 3.42s/it] 59%|█████▉ | 20256/34278 [22:20:05<13:13:44, 3.40s/it] {'loss': 0.128, 'grad_norm': 0.7200743169359279, 'learning_rate': 3.782843655260799e-06, 'epoch': 0.59} 59%|█████▉ | 20256/34278 [22:20:05<13:13:44, 3.40s/it] 59%|█████▉ | 20257/34278 [22:20:09<13:53:58, 3.57s/it] {'loss': 0.1177, 'grad_norm': 0.825527860327397, 'learning_rate': 3.782385438231794e-06, 'epoch': 0.59} 59%|█████▉ | 20257/34278 [22:20:09<13:53:58, 3.57s/it] 59%|█████▉ | 20258/34278 [22:20:15<16:59:00, 4.36s/it] {'loss': 0.134, 'grad_norm': 1.1544435995553002, 'learning_rate': 3.7819272320733626e-06, 'epoch': 0.59} 59%|█████▉ | 20258/34278 [22:20:15<16:59:00, 4.36s/it] 59%|█████▉ | 20259/34278 [22:20:20<16:51:19, 4.33s/it] {'loss': 0.1252, 'grad_norm': 0.797619386105925, 'learning_rate': 3.7814690367895923e-06, 'epoch': 0.59} 59%|█████▉ | 20259/34278 [22:20:20<16:51:19, 4.33s/it] 59%|█████▉ | 20260/34278 [22:20:23<15:21:58, 3.95s/it] {'loss': 0.1334, 'grad_norm': 1.0506363430902557, 'learning_rate': 3.7810108523845744e-06, 'epoch': 0.59} 59%|█████▉ | 20260/34278 [22:20:23<15:21:58, 3.95s/it] 59%|█████▉ 
| 20261/34278 [22:20:27<15:47:11, 4.05s/it] {'loss': 0.1578, 'grad_norm': 0.8632340134026724, 'learning_rate': 3.7805526788624027e-06, 'epoch': 0.59} 59%|█████▉ | 20261/34278 [22:20:27<15:47:11, 4.05s/it] 59%|█████▉ | 20262/34278 [22:20:33<18:34:19, 4.77s/it] {'loss': 0.1328, 'grad_norm': 1.0704684836764167, 'learning_rate': 3.780094516227165e-06, 'epoch': 0.59} 59%|█████▉ | 20262/34278 [22:20:33<18:34:19, 4.77s/it] 59%|█████▉ | 20263/34278 [22:20:36<16:37:11, 4.27s/it] {'loss': 0.1267, 'grad_norm': 0.92312155686962, 'learning_rate': 3.779636364482953e-06, 'epoch': 0.59} 59%|█████▉ | 20263/34278 [22:20:36<16:37:11, 4.27s/it] 59%|█████▉ | 20264/34278 [22:20:42<18:31:59, 4.76s/it] {'loss': 0.111, 'grad_norm': 0.7941396087589555, 'learning_rate': 3.779178223633856e-06, 'epoch': 0.59} 59%|█████▉ | 20264/34278 [22:20:42<18:31:59, 4.76s/it] 59%|█████▉ | 20265/34278 [22:20:47<17:59:15, 4.62s/it] {'loss': 0.1471, 'grad_norm': 1.0645081561050787, 'learning_rate': 3.778720093683964e-06, 'epoch': 0.59} 59%|█████▉ | 20265/34278 [22:20:47<17:59:15, 4.62s/it] 59%|█████▉ | 20266/34278 [22:20:53<19:41:22, 5.06s/it] {'loss': 0.1235, 'grad_norm': 0.9816676630504501, 'learning_rate': 3.7782619746373663e-06, 'epoch': 0.59} 59%|█████▉ | 20266/34278 [22:20:53<19:41:22, 5.06s/it] 59%|█████▉ | 20267/34278 [22:20:56<17:11:00, 4.42s/it] {'loss': 0.1236, 'grad_norm': 0.777847199885689, 'learning_rate': 3.777803866498155e-06, 'epoch': 0.59} 59%|█████▉ | 20267/34278 [22:20:56<17:11:00, 4.42s/it] 59%|█████▉ | 20268/34278 [22:21:00<17:06:21, 4.40s/it] {'loss': 0.1009, 'grad_norm': 0.9568779142795496, 'learning_rate': 3.77734576927042e-06, 'epoch': 0.59} 59%|█████▉ | 20268/34278 [22:21:00<17:06:21, 4.40s/it] 59%|█████▉ | 20269/34278 [22:21:03<15:15:46, 3.92s/it] {'loss': 0.1359, 'grad_norm': 0.775840409936639, 'learning_rate': 3.776887682958249e-06, 'epoch': 0.59} 59%|█████▉ | 20269/34278 [22:21:03<15:15:46, 3.92s/it] 59%|█████▉ | 20270/34278 [22:21:06<14:33:31, 3.74s/it] {'loss': 0.1215, 
'grad_norm': 0.9228157617525361, 'learning_rate': 3.776429607565733e-06, 'epoch': 0.59} 59%|█████▉ | 20270/34278 [22:21:06<14:33:31, 3.74s/it] 59%|█████▉ | 20271/34278 [22:21:09<13:35:33, 3.49s/it] {'loss': 0.097, 'grad_norm': 0.8301160033415184, 'learning_rate': 3.775971543096963e-06, 'epoch': 0.59} 59%|█████▉ | 20271/34278 [22:21:09<13:35:33, 3.49s/it] 59%|█████▉ | 20272/34278 [22:21:12<13:18:24, 3.42s/it] {'loss': 0.1402, 'grad_norm': 1.0905985737349984, 'learning_rate': 3.775513489556023e-06, 'epoch': 0.59} 59%|█████▉ | 20272/34278 [22:21:12<13:18:24, 3.42s/it] 59%|█████▉ | 20273/34278 [22:21:15<12:56:20, 3.33s/it] {'loss': 0.1274, 'grad_norm': 0.9260101498628446, 'learning_rate': 3.775055446947009e-06, 'epoch': 0.59} 59%|█████▉ | 20273/34278 [22:21:15<12:56:20, 3.33s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7faf5a36f4c0>
Failed to fetch sample 2654741. Exception: cannot identify image file <_io.BytesIO object at 0x7faf5a36f4c0>
59%|█████▉ | 20274/34278 [22:21:18<12:36:15, 3.24s/it] {'loss': 0.1186, 'grad_norm': 0.8318914638435035, 'learning_rate': 3.7745974152740074e-06, 'epoch': 0.59} 59%|█████▉ | 20274/34278 [22:21:18<12:36:15, 3.24s/it] 59%|█████▉ | 20275/34278 [22:21:21<12:11:42, 3.14s/it] {'loss': 0.1046, 'grad_norm': 1.0085084473662826, 'learning_rate': 3.7741393945411075e-06, 'epoch': 0.59} 59%|█████▉ | 20275/34278 [22:21:21<12:11:42, 3.14s/it] 59%|█████▉ | 20276/34278 [22:21:25<13:20:40, 3.43s/it] {'loss': 0.1372, 'grad_norm': 0.9674246822945007, 'learning_rate': 3.773681384752399e-06, 'epoch': 0.59} 59%|█████▉ | 20276/34278 [22:21:25<13:20:40, 3.43s/it] 59%|█████▉ | 20277/34278 [22:21:32<16:32:02, 4.25s/it] {'loss': 0.1512, 'grad_norm': 0.8177735756142149, 'learning_rate': 3.773223385911969e-06, 'epoch': 0.59} 59%|█████▉ | 20277/34278 [22:21:32<16:32:02, 4.25s/it] 59%|█████▉ | 20278/34278 [22:21:35<15:12:49, 3.91s/it] {'loss': 0.1027, 'grad_norm': 0.8951644185692315, 'learning_rate': 3.7727653980239077e-06, 'epoch': 0.59} 59%|█████▉ | 20278/34278 [22:21:35<15:12:49, 3.91s/it] 59%|█████▉ | 20279/34278 [22:21:38<14:28:08, 3.72s/it] {'loss': 0.11, 'grad_norm': 0.7730806463621648, 'learning_rate': 3.7723074210923046e-06, 'epoch': 0.59} 59%|█████▉ | 20279/34278 [22:21:38<14:28:08, 3.72s/it] 59%|█████▉ | 20280/34278 [22:21:41<14:06:25, 3.63s/it] {'loss': 0.1219, 'grad_norm': 1.4916102140596397, 'learning_rate': 3.7718494551212477e-06, 'epoch': 0.59} 59%|█████▉ | 20280/34278 [22:21:41<14:06:25, 3.63s/it] 59%|█████▉ | 20281/34278 [22:21:46<15:11:44, 3.91s/it] {'loss': 0.1394, 'grad_norm': 1.2629613847233272, 'learning_rate': 3.7713915001148264e-06, 'epoch': 0.59} 59%|█████▉ | 20281/34278 [22:21:46<15:11:44, 3.91s/it] 59%|█████▉ | 20282/34278 [22:21:49<14:09:20,
3.64s/it] {'loss': 0.131, 'grad_norm': 0.8404674415570916, 'learning_rate': 3.770933556077128e-06, 'epoch': 0.59} 59%|█████▉ | 20282/34278 [22:21:49<14:09:20, 3.64s/it] 59%|█████▉ | 20283/34278 [22:21:53<14:55:55, 3.84s/it] {'loss': 0.1198, 'grad_norm': 0.8222340325811418, 'learning_rate': 3.7704756230122404e-06, 'epoch': 0.59} 59%|█████▉ | 20283/34278 [22:21:53<14:55:55, 3.84s/it] 59%|█████▉ | 20284/34278 [22:21:56<14:00:14, 3.60s/it] {'loss': 0.1464, 'grad_norm': 1.2953859987497278, 'learning_rate': 3.7700177009242533e-06, 'epoch': 0.59} 59%|█████▉ | 20284/34278 [22:21:56<14:00:14, 3.60s/it] 59%|█████▉ | 20285/34278 [22:21:59<13:02:51, 3.36s/it] {'loss': 0.1227, 'grad_norm': 0.7463448766649863, 'learning_rate': 3.769559789817256e-06, 'epoch': 0.59} 59%|█████▉ | 20285/34278 [22:21:59<13:02:51, 3.36s/it] 59%|█████▉ | 20286/34278 [22:22:03<13:16:16, 3.41s/it] {'loss': 0.1088, 'grad_norm': 0.9141728419939459, 'learning_rate': 3.769101889695334e-06, 'epoch': 0.59} 59%|█████▉ | 20286/34278 [22:22:03<13:16:16, 3.41s/it] 59%|█████▉ | 20287/34278 [22:22:06<13:00:02, 3.35s/it] {'loss': 0.1378, 'grad_norm': 0.7300132060866088, 'learning_rate': 3.768644000562577e-06, 'epoch': 0.59} 59%|█████▉ | 20287/34278 [22:22:06<13:00:02, 3.35s/it] 59%|█████▉ | 20288/34278 [22:22:09<13:18:57, 3.43s/it] {'loss': 0.1101, 'grad_norm': 0.9292285150265088, 'learning_rate': 3.768186122423073e-06, 'epoch': 0.59} 59%|█████▉ | 20288/34278 [22:22:10<13:18:57, 3.43s/it] 59%|█████▉ | 20289/34278 [22:22:13<13:41:10, 3.52s/it] {'loss': 0.1297, 'grad_norm': 0.8716206320290674, 'learning_rate': 3.767728255280906e-06, 'epoch': 0.59} 59%|█████▉ | 20289/34278 [22:22:13<13:41:10, 3.52s/it] 59%|█████▉ | 20290/34278 [22:22:16<13:14:10, 3.41s/it] {'loss': 0.1149, 'grad_norm': 0.8383145328399962, 'learning_rate': 3.7672703991401706e-06, 'epoch': 0.59} 59%|█████▉ | 20290/34278 [22:22:16<13:14:10, 3.41s/it] 59%|█████▉ | 20291/34278 [22:22:20<13:23:01, 3.44s/it] {'loss': 0.1439, 'grad_norm': 0.7973474333886468, 
'learning_rate': 3.7668125540049493e-06, 'epoch': 0.59} 59%|█████▉ | 20291/34278 [22:22:20<13:23:01, 3.44s/it] 59%|█████▉ | 20292/34278 [22:22:23<12:52:53, 3.32s/it] {'loss': 0.124, 'grad_norm': 0.7862309807261225, 'learning_rate': 3.766354719879331e-06, 'epoch': 0.59} 59%|█████▉ | 20292/34278 [22:22:23<12:52:53, 3.32s/it] 59%|█████▉ | 20293/34278 [22:22:29<16:00:04, 4.12s/it] {'loss': 0.1385, 'grad_norm': 0.8835775439221308, 'learning_rate': 3.7658968967674046e-06, 'epoch': 0.59} 59%|█████▉ | 20293/34278 [22:22:29<16:00:04, 4.12s/it] 59%|█████▉ | 20294/34278 [22:22:32<14:44:00, 3.79s/it] {'loss': 0.1302, 'grad_norm': 0.8011348997365383, 'learning_rate': 3.765439084673255e-06, 'epoch': 0.59} 59%|█████▉ | 20294/34278 [22:22:32<14:44:00, 3.79s/it] 59%|█████▉ | 20295/34278 [22:22:36<14:41:44, 3.78s/it] {'loss': 0.1491, 'grad_norm': 1.027977766405424, 'learning_rate': 3.76498128360097e-06, 'epoch': 0.59} 59%|█████▉ | 20295/34278 [22:22:36<14:41:44, 3.78s/it] 59%|█████▉ | 20296/34278 [22:22:39<13:34:14, 3.49s/it] {'loss': 0.1159, 'grad_norm': 0.8544333249902301, 'learning_rate': 3.7645234935546377e-06, 'epoch': 0.59} 59%|█████▉ | 20296/34278 [22:22:39<13:34:14, 3.49s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0fd20caa70>
Failed to fetch sample 3699515. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fd20caa70>
59%|█████▉ | 20297/34278 [22:22:42<13:32:16, 3.49s/it] {'loss': 0.1137, 'grad_norm': 0.8805906780075904, 'learning_rate': 3.7640657145383445e-06, 'epoch': 0.59} 59%|█████▉ | 20297/34278 [22:22:42<13:32:16, 3.49s/it] 59%|█████▉ | 20298/34278 [22:22:46<13:43:46, 3.54s/it] {'loss': 0.1275, 'grad_norm': 0.7795037875714507, 'learning_rate': 3.7636079465561793e-06, 'epoch': 0.59} 59%|█████▉ | 20298/34278 [22:22:46<13:43:46, 3.54s/it] 59%|█████▉ | 20299/34278 [22:22:49<13:34:27, 3.50s/it] {'loss': 0.1294, 'grad_norm': 0.8167336207242685, 'learning_rate': 3.763150189612226e-06, 'epoch': 0.59} 59%|█████▉ | 20299/34278 [22:22:49<13:34:27, 3.50s/it] 59%|█████▉ | 20300/34278 [22:22:52<13:05:40, 3.37s/it] {'loss': 0.1266, 'grad_norm': 0.6971774326622011, 'learning_rate': 3.7626924437105723e-06, 'epoch': 0.59} 59%|█████▉ | 20300/34278 [22:22:52<13:05:40, 3.37s/it] 59%|█████▉ | 20301/34278 [22:22:55<12:47:42, 3.30s/it] {'loss': 0.1172, 'grad_norm': 0.6563811155516861, 'learning_rate': 3.762234708855304e-06, 'epoch': 0.59} 59%|█████▉ | 20301/34278 [22:22:55<12:47:42, 3.30s/it] 59%|█████▉ | 20302/34278 [22:22:59<13:36:27, 3.51s/it] {'loss': 0.1161, 'grad_norm': 0.7529173622208494, 'learning_rate': 3.76177698505051e-06, 'epoch': 0.59} 59%|█████▉ | 20302/34278 [22:22:59<13:36:27, 3.51s/it] 59%|█████▉ | 20303/34278 [22:23:05<16:43:51, 4.31s/it] {'loss':
0.1219, 'grad_norm': 0.8713120732052216, 'learning_rate': 3.761319272300276e-06, 'epoch': 0.59} 59%|█████▉ | 20303/34278 [22:23:05<16:43:51, 4.31s/it] 59%|█████▉ | 20304/34278 [22:23:09<15:17:31, 3.94s/it] {'loss': 0.1213, 'grad_norm': 0.7023739943710798, 'learning_rate': 3.7608615706086876e-06, 'epoch': 0.59} 59%|█████▉ | 20304/34278 [22:23:09<15:17:31, 3.94s/it] 59%|█████▉ | 20305/34278 [22:23:12<14:35:00, 3.76s/it] {'loss': 0.1267, 'grad_norm': 1.068413761264487, 'learning_rate': 3.760403879979831e-06, 'epoch': 0.59} 59%|█████▉ | 20305/34278 [22:23:12<14:35:00, 3.76s/it] 59%|█████▉ | 20306/34278 [22:23:15<13:45:08, 3.54s/it] {'loss': 0.1373, 'grad_norm': 1.0622620883514793, 'learning_rate': 3.759946200417793e-06, 'epoch': 0.59} 59%|█████▉ | 20306/34278 [22:23:15<13:45:08, 3.54s/it] 59%|█████▉ | 20307/34278 [22:23:18<13:02:45, 3.36s/it] {'loss': 0.1254, 'grad_norm': 0.8729555445934811, 'learning_rate': 3.759488531926657e-06, 'epoch': 0.59} 59%|█████▉ | 20307/34278 [22:23:18<13:02:45, 3.36s/it] 59%|█████▉ | 20308/34278 [22:23:21<12:44:58, 3.29s/it] {'loss': 0.1369, 'grad_norm': 0.8807063874024929, 'learning_rate': 3.7590308745105143e-06, 'epoch': 0.59} 59%|█████▉ | 20308/34278 [22:23:21<12:44:58, 3.29s/it] 59%|█████▉ | 20309/34278 [22:23:27<15:39:39, 4.04s/it] {'loss': 0.1226, 'grad_norm': 0.7730701181240264, 'learning_rate': 3.7585732281734467e-06, 'epoch': 0.59} 59%|█████▉ | 20309/34278 [22:23:27<15:39:39, 4.04s/it] 59%|█████▉ | 20310/34278 [22:23:30<15:09:16, 3.91s/it] {'loss': 0.1325, 'grad_norm': 0.6801049083834891, 'learning_rate': 3.7581155929195405e-06, 'epoch': 0.59} 59%|█████▉ | 20310/34278 [22:23:30<15:09:16, 3.91s/it] 59%|█████▉ | 20311/34278 [22:23:36<17:13:10, 4.44s/it] {'loss': 0.1416, 'grad_norm': 0.9171360209832026, 'learning_rate': 3.7576579687528836e-06, 'epoch': 0.59} 59%|█████▉ | 20311/34278 [22:23:36<17:13:10, 4.44s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff00c128720>
Failed to fetch sample 2762608. Exception: cannot identify image file <_io.BytesIO object at 0x7ff00c128720>
59%|█████▉ | 20312/34278 [22:23:40<16:57:56, 4.37s/it] {'loss': 0.1365, 'grad_norm': 0.9015598603877577, 'learning_rate': 3.757200355677558e-06, 'epoch': 0.59} 59%|█████▉ | 20312/34278 [22:23:40<16:57:56, 4.37s/it] 59%|█████▉ | 20313/34278 [22:23:44<16:15:03, 4.19s/it] {'loss': 0.1193, 'grad_norm': 0.9509602034197917, 'learning_rate': 3.75674275369765e-06, 'epoch': 0.59} 59%|█████▉ | 20313/34278 [22:23:44<16:15:03, 4.19s/it] 59%|█████▉ | 20314/34278 [22:23:47<15:03:22, 3.88s/it] {'loss': 0.109, 'grad_norm': 0.8274027554620362, 'learning_rate': 3.7562851628172476e-06, 'epoch': 0.59} 59%|█████▉ | 20314/34278 [22:23:47<15:03:22, 3.88s/it] 59%|█████▉ | 20315/34278 [22:23:50<14:06:29, 3.64s/it] {'loss': 0.1096, 'grad_norm': 2.385593261944973, 'learning_rate': 3.755827583040435e-06, 'epoch': 0.59} 59%|█████▉ | 20315/34278 [22:23:50<14:06:29, 3.64s/it] 59%|█████▉ | 20316/34278 [22:23:54<14:05:21, 3.63s/it] {'loss': 0.1445, 'grad_norm': 0.8304798709331231, 'learning_rate': 3.7553700143712956e-06, 'epoch': 0.59} 59%|█████▉ | 20316/34278 [22:23:54<14:05:21, 3.63s/it] 59%|█████▉ | 20317/34278 [22:23:57<13:47:39, 3.56s/it] {'loss': 0.1224, 'grad_norm': 0.9432083259294172, 'learning_rate': 3.7549124568139158e-06, 'epoch': 0.59} 59%|█████▉ | 20317/34278 [22:23:57<13:47:39, 3.56s/it] 59%|█████▉ | 20318/34278 [22:24:03<16:45:19, 4.32s/it] {'loss': 0.1417, 'grad_norm': 0.9228415072757776, 'learning_rate': 3.754454910372381e-06, 'epoch': 0.59} 59%|█████▉ | 20318/34278 [22:24:03<16:45:19, 4.32s/it] 59%|█████▉ | 20319/34278 [22:24:07<15:36:51, 4.03s/it] {'loss': 0.1356, 'grad_norm': 0.8394270709799782, 'learning_rate': 3.7539973750507723e-06, 'epoch': 0.59} 59%|█████▉ | 20319/34278 [22:24:07<15:36:51, 4.03s/it] 59%|█████▉ | 20320/34278 [22:24:11<15:28:18, 3.99s/it] {'loss': 0.1182, 'grad_norm': 1.0490922762574508, 'learning_rate': 3.753539850853181e-06, 'epoch': 0.59} 59%|█████▉ | 20320/34278
[22:24:11<15:28:18, 3.99s/it] 59%|█████▉ | 20321/34278 [22:24:13<14:08:32, 3.65s/it] {'loss': 0.0953, 'grad_norm': 0.9272445567929332, 'learning_rate': 3.753082337783688e-06, 'epoch': 0.59} 59%|█████▉ | 20321/34278 [22:24:13<14:08:32, 3.65s/it] 59%|█████▉ | 20322/34278 [22:24:18<14:57:01, 3.86s/it] {'loss': 0.1458, 'grad_norm': 0.7105217279422499, 'learning_rate': 3.7526248358463768e-06, 'epoch': 0.59} 59%|█████▉ | 20322/34278 [22:24:18<14:57:01, 3.86s/it] 59%|█████▉ | 20323/34278 [22:24:21<14:24:28, 3.72s/it] {'loss': 0.1424, 'grad_norm': 0.9120023752836961, 'learning_rate': 3.7521673450453356e-06, 'epoch': 0.59} 59%|█████▉ | 20323/34278 [22:24:21<14:24:28, 3.72s/it] 59%|█████▉ | 20324/34278 [22:24:24<13:30:14, 3.48s/it] {'loss': 0.1058, 'grad_norm': 1.257870197846429, 'learning_rate': 3.7517098653846446e-06, 'epoch': 0.59} 59%|█████▉ | 20324/34278 [22:24:24<13:30:14, 3.48s/it] 59%|█████▉ | 20325/34278 [22:24:27<13:16:22, 3.42s/it] {'loss': 0.118, 'grad_norm': 0.8575419280848129, 'learning_rate': 3.751252396868389e-06, 'epoch': 0.59} 59%|█████▉ | 20325/34278 [22:24:27<13:16:22, 3.42s/it] 59%|█████▉ | 20326/34278 [22:24:31<12:58:52, 3.35s/it] {'loss': 0.1314, 'grad_norm': 0.7128499351579052, 'learning_rate': 3.750794939500655e-06, 'epoch': 0.59} 59%|█████▉ | 20326/34278 [22:24:31<12:58:52, 3.35s/it] 59%|█████▉ | 20327/34278 [22:24:34<12:54:18, 3.33s/it] {'loss': 0.1234, 'grad_norm': 0.8097750720264704, 'learning_rate': 3.7503374932855258e-06, 'epoch': 0.59} 59%|█████▉ | 20327/34278 [22:24:34<12:54:18, 3.33s/it] 59%|█████▉ | 20328/34278 [22:24:37<12:40:21, 3.27s/it] {'loss': 0.1073, 'grad_norm': 0.9747084859732462, 'learning_rate': 3.7498800582270863e-06, 'epoch': 0.59} 59%|█████▉ | 20328/34278 [22:24:37<12:40:21, 3.27s/it] 59%|█████▉ | 20329/34278 [22:24:43<15:34:03, 4.02s/it] {'loss': 0.1, 'grad_norm': 0.7648578301821678, 'learning_rate': 3.7494226343294177e-06, 'epoch': 0.59} 59%|█████▉ | 20329/34278 [22:24:43<15:34:03, 4.02s/it] 59%|█████▉ | 20330/34278 
[22:24:46<14:14:30, 3.68s/it] {'loss': 0.1393, 'grad_norm': 0.7565341007173764, 'learning_rate': 3.7489652215966055e-06, 'epoch': 0.59} 59%|█████▉ | 20330/34278 [22:24:46<14:14:30, 3.68s/it] 59%|█████▉ | 20331/34278 [22:24:51<16:32:05, 4.27s/it] {'loss': 0.1316, 'grad_norm': 0.7887501873350983, 'learning_rate': 3.7485078200327317e-06, 'epoch': 0.59} 59%|█████▉ | 20331/34278 [22:24:51<16:32:05, 4.27s/it] 59%|█████▉ | 20332/34278 [22:24:54<15:00:22, 3.87s/it] {'loss': 0.1152, 'grad_norm': 0.8692053319094313, 'learning_rate': 3.748050429641883e-06, 'epoch': 0.59} 59%|█████▉ | 20332/34278 [22:24:54<15:00:22, 3.87s/it] 59%|█████▉ | 20333/34278 [22:25:00<17:06:47, 4.42s/it] {'loss': 0.1413, 'grad_norm': 0.7751738676907003, 'learning_rate': 3.747593050428142e-06, 'epoch': 0.59} 59%|█████▉ | 20333/34278 [22:25:00<17:06:47, 4.42s/it] 59%|█████▉ | 20334/34278 [22:25:03<15:30:12, 4.00s/it] {'loss': 0.1356, 'grad_norm': 0.7288789974986442, 'learning_rate': 3.7471356823955908e-06, 'epoch': 0.59} 59%|█████▉ | 20334/34278 [22:25:03<15:30:12, 4.00s/it] 59%|█████▉ | 20335/34278 [22:25:09<17:48:46, 4.60s/it] {'loss': 0.1503, 'grad_norm': 0.8715625909154425, 'learning_rate': 3.7466783255483125e-06, 'epoch': 0.59} 59%|█████▉ | 20335/34278 [22:25:09<17:48:46, 4.60s/it] 59%|█████▉ | 20336/34278 [22:25:15<19:43:37, 5.09s/it] {'loss': 0.1303, 'grad_norm': 0.8948276633084514, 'learning_rate': 3.746220979890392e-06, 'epoch': 0.59} 59%|█████▉ | 20336/34278 [22:25:15<19:43:37, 5.09s/it] 59%|█████▉ | 20337/34278 [22:25:18<17:22:34, 4.49s/it] {'loss': 0.1221, 'grad_norm': 0.7820814106919346, 'learning_rate': 3.7457636454259084e-06, 'epoch': 0.59} 59%|█████▉ | 20337/34278 [22:25:18<17:22:34, 4.49s/it] 59%|█████▉ | 20338/34278 [22:25:21<15:42:01, 4.05s/it] {'loss': 0.1262, 'grad_norm': 0.9603909685091399, 'learning_rate': 3.74530632215895e-06, 'epoch': 0.59} 59%|█████▉ | 20338/34278 [22:25:21<15:42:01, 4.05s/it] 59%|█████▉ | 20339/34278 [22:25:27<17:46:17, 4.59s/it] {'loss': 0.1262, 'grad_norm': 
0.9199513407352496, 'learning_rate': 3.744849010093597e-06, 'epoch': 0.59} 59%|█████▉ | 20339/34278 [22:25:27<17:46:17, 4.59s/it] 59%|█████▉ | 20340/34278 [22:25:30<16:10:38, 4.18s/it] {'loss': 0.1241, 'grad_norm': 0.9135260477282455, 'learning_rate': 3.7443917092339323e-06, 'epoch': 0.59} 59%|█████▉ | 20340/34278 [22:25:30<16:10:38, 4.18s/it] 59%|█████▉ | 20341/34278 [22:25:34<16:02:37, 4.14s/it] {'loss': 0.1205, 'grad_norm': 0.9318084696620351, 'learning_rate': 3.7439344195840393e-06, 'epoch': 0.59} 59%|█████▉ | 20341/34278 [22:25:34<16:02:37, 4.14s/it] 59%|█████▉ | 20342/34278 [22:25:38<15:43:35, 4.06s/it] {'loss': 0.1069, 'grad_norm': 0.8841853727203788, 'learning_rate': 3.7434771411479993e-06, 'epoch': 0.59} 59%|█████▉ | 20342/34278 [22:25:38<15:43:35, 4.06s/it] 59%|█████▉ | 20343/34278 [22:25:44<17:32:14, 4.53s/it] {'loss': 0.138, 'grad_norm': 1.23356435704127, 'learning_rate': 3.743019873929894e-06, 'epoch': 0.59} 59%|█████▉ | 20343/34278 [22:25:44<17:32:14, 4.53s/it] 59%|█████▉ | 20344/34278 [22:25:50<19:17:41, 4.99s/it] {'loss': 0.1369, 'grad_norm': 1.300083953889085, 'learning_rate': 3.7425626179338087e-06, 'epoch': 0.59} 59%|█████▉ | 20344/34278 [22:25:50<19:17:41, 4.99s/it] 59%|█████▉ | 20345/34278 [22:25:55<19:13:03, 4.97s/it] {'loss': 0.1724, 'grad_norm': 0.9076827024539956, 'learning_rate': 3.7421053731638247e-06, 'epoch': 0.59} 59%|█████▉ | 20345/34278 [22:25:55<19:13:03, 4.97s/it] 59%|█████▉ | 20346/34278 [22:25:58<17:25:33, 4.50s/it] {'loss': 0.1179, 'grad_norm': 0.9152453874334994, 'learning_rate': 3.7416481396240233e-06, 'epoch': 0.59} 59%|█████▉ | 20346/34278 [22:25:58<17:25:33, 4.50s/it] 59%|█████▉ | 20347/34278 [22:26:04<19:20:55, 5.00s/it] {'loss': 0.1167, 'grad_norm': 0.7506870356234908, 'learning_rate': 3.7411909173184863e-06, 'epoch': 0.59} 59%|█████▉ | 20347/34278 [22:26:04<19:20:55, 5.00s/it] 59%|█████▉ | 20348/34278 [22:26:10<20:33:09, 5.31s/it] {'loss': 0.1177, 'grad_norm': 0.8966013397136234, 'learning_rate': 3.740733706251298e-06, 
'epoch': 0.59} 59%|█████▉ | 20348/34278 [22:26:11<20:33:09, 5.31s/it] 59%|█████▉ | 20349/34278 [22:26:17<21:21:59, 5.52s/it] {'loss': 0.1291, 'grad_norm': 0.8199959588363075, 'learning_rate': 3.7402765064265346e-06, 'epoch': 0.59} 59%|█████▉ | 20349/34278 [22:26:17<21:21:59, 5.52s/it] 59%|█████▉ | 20350/34278 [22:26:20<18:48:19, 4.86s/it] {'loss': 0.1545, 'grad_norm': 0.9856671762568906, 'learning_rate': 3.7398193178482855e-06, 'epoch': 0.59} 59%|█████▉ | 20350/34278 [22:26:20<18:48:19, 4.86s/it] 59%|█████▉ | 20351/34278 [22:26:24<17:59:43, 4.65s/it] {'loss': 0.1406, 'grad_norm': 0.7912230412079806, 'learning_rate': 3.739362140520627e-06, 'epoch': 0.59} 59%|█████▉ | 20351/34278 [22:26:24<17:59:43, 4.65s/it] 59%|█████▉ | 20352/34278 [22:26:27<16:24:37, 4.24s/it] {'loss': 0.1257, 'grad_norm': 0.8040817049542729, 'learning_rate': 3.7389049744476437e-06, 'epoch': 0.59} 59%|█████▉ | 20352/34278 [22:26:27<16:24:37, 4.24s/it] 59%|█████▉ | 20353/34278 [22:26:30<15:03:06, 3.89s/it] {'loss': 0.1199, 'grad_norm': 0.7683089581911615, 'learning_rate': 3.738447819633415e-06, 'epoch': 0.59} 59%|█████▉ | 20353/34278 [22:26:30<15:03:06, 3.89s/it] 59%|█████▉ | 20354/34278 [22:26:36<16:33:42, 4.28s/it] {'loss': 0.107, 'grad_norm': 0.709091046344398, 'learning_rate': 3.7379906760820234e-06, 'epoch': 0.59} 59%|█████▉ | 20354/34278 [22:26:36<16:33:42, 4.28s/it] 59%|█████▉ | 20355/34278 [22:26:39<15:23:24, 3.98s/it] {'loss': 0.132, 'grad_norm': 0.7930700175595372, 'learning_rate': 3.737533543797548e-06, 'epoch': 0.59} 59%|█████▉ | 20355/34278 [22:26:39<15:23:24, 3.98s/it] 59%|█████▉ | 20356/34278 [22:26:42<14:45:07, 3.81s/it] {'loss': 0.1611, 'grad_norm': 0.829812465390992, 'learning_rate': 3.7370764227840734e-06, 'epoch': 0.59} 59%|█████▉ | 20356/34278 [22:26:42<14:45:07, 3.81s/it] 59%|█████▉ | 20357/34278 [22:26:45<13:41:26, 3.54s/it] {'loss': 0.1019, 'grad_norm': 0.7824138393830351, 'learning_rate': 3.7366193130456784e-06, 'epoch': 0.59} 59%|█████▉ | 20357/34278 [22:26:45<13:41:26, 
3.54s/it] 59%|█████▉ | 20358/34278 [22:26:50<15:27:39, 4.00s/it] {'loss': 0.1364, 'grad_norm': 0.8153222821715773, 'learning_rate': 3.736162214586446e-06, 'epoch': 0.59} 59%|█████▉ | 20358/34278 [22:26:50<15:27:39, 4.00s/it] 59%|█████▉ | 20359/34278 [22:26:54<15:14:52, 3.94s/it] {'loss': 0.1107, 'grad_norm': 0.8966206591182314, 'learning_rate': 3.7357051274104545e-06, 'epoch': 0.59} 59%|█████▉ | 20359/34278 [22:26:54<15:14:52, 3.94s/it] 59%|█████▉ | 20360/34278 [22:26:57<14:02:00, 3.63s/it] {'loss': 0.1104, 'grad_norm': 0.8034161455187806, 'learning_rate': 3.735248051521786e-06, 'epoch': 0.59} 59%|█████▉ | 20360/34278 [22:26:57<14:02:00, 3.63s/it] 59%|█████▉ | 20361/34278 [22:27:00<13:27:29, 3.48s/it] {'loss': 0.1009, 'grad_norm': 0.7245629777622508, 'learning_rate': 3.734790986924519e-06, 'epoch': 0.59} 59%|█████▉ | 20361/34278 [22:27:00<13:27:29, 3.48s/it] 59%|█████▉ | 20362/34278 [22:27:04<14:30:39, 3.75s/it] {'loss': 0.1052, 'grad_norm': 0.8347839313674297, 'learning_rate': 3.734333933622738e-06, 'epoch': 0.59} 59%|█████▉ | 20362/34278 [22:27:04<14:30:39, 3.75s/it] 59%|█████▉ | 20363/34278 [22:27:10<16:27:50, 4.26s/it] {'loss': 0.1295, 'grad_norm': 0.8366940053896308, 'learning_rate': 3.7338768916205224e-06, 'epoch': 0.59} 59%|█████▉ | 20363/34278 [22:27:10<16:27:50, 4.26s/it] 59%|█████▉ | 20364/34278 [22:27:14<15:58:33, 4.13s/it] {'loss': 0.1364, 'grad_norm': 0.7912055817415566, 'learning_rate': 3.7334198609219506e-06, 'epoch': 0.59} 59%|█████▉ | 20364/34278 [22:27:14<15:58:33, 4.13s/it] 59%|█████▉ | 20365/34278 [22:27:19<17:38:51, 4.57s/it] {'loss': 0.1215, 'grad_norm': 1.3096844057058885, 'learning_rate': 3.7329628415311043e-06, 'epoch': 0.59} 59%|█████▉ | 20365/34278 [22:27:19<17:38:51, 4.57s/it] 59%|█████▉ | 20366/34278 [22:27:22<15:43:37, 4.07s/it] {'loss': 0.1393, 'grad_norm': 0.8284737979793699, 'learning_rate': 3.7325058334520637e-06, 'epoch': 0.59} 59%|█████▉ | 20366/34278 [22:27:22<15:43:37, 4.07s/it] 59%|█████▉ | 20367/34278 [22:27:27<16:05:46, 
4.17s/it] {'loss': 0.113, 'grad_norm': 1.3668484993926355, 'learning_rate': 3.7320488366889064e-06, 'epoch': 0.59} 59%|█████▉ | 20367/34278 [22:27:27<16:05:46, 4.17s/it] 59%|█████▉ | 20368/34278 [22:27:32<17:13:31, 4.46s/it] {'loss': 0.1456, 'grad_norm': 0.8733517250331496, 'learning_rate': 3.731591851245716e-06, 'epoch': 0.59} 59%|█████▉ | 20368/34278 [22:27:32<17:13:31, 4.46s/it] 59%|█████▉ | 20369/34278 [22:27:38<19:13:14, 4.97s/it] {'loss': 0.1246, 'grad_norm': 0.7996803319961718, 'learning_rate': 3.73113487712657e-06, 'epoch': 0.59} 59%|█████▉ | 20369/34278 [22:27:38<19:13:14, 4.97s/it] 59%|█████▉ | 20370/34278 [22:27:44<20:40:33, 5.35s/it] {'loss': 0.135, 'grad_norm': 0.7079043012081149, 'learning_rate': 3.73067791433555e-06, 'epoch': 0.59} 59%|█████▉ | 20370/34278 [22:27:44<20:40:33, 5.35s/it] 59%|█████▉ | 20371/34278 [22:27:47<17:53:50, 4.63s/it] {'loss': 0.133, 'grad_norm': 0.7480051648978058, 'learning_rate': 3.7302209628767345e-06, 'epoch': 0.59} 59%|█████▉ | 20371/34278 [22:27:47<17:53:50, 4.63s/it] 59%|█████▉ | 20372/34278 [22:27:50<16:15:43, 4.21s/it] {'loss': 0.1169, 'grad_norm': 0.7366026184962297, 'learning_rate': 3.7297640227542024e-06, 'epoch': 0.59} 59%|█████▉ | 20372/34278 [22:27:50<16:15:43, 4.21s/it] 59%|█████▉ | 20373/34278 [22:27:55<16:55:23, 4.38s/it] {'loss': 0.1251, 'grad_norm': 2.1814451346277037, 'learning_rate': 3.7293070939720332e-06, 'epoch': 0.59} 59%|█████▉ | 20373/34278 [22:27:55<16:55:23, 4.38s/it] 59%|█████▉ | 20374/34278 [22:27:59<15:56:40, 4.13s/it] {'loss': 0.1413, 'grad_norm': 0.959928786799977, 'learning_rate': 3.7288501765343076e-06, 'epoch': 0.59} 59%|█████▉ | 20374/34278 [22:27:59<15:56:40, 4.13s/it] 59%|█████▉ | 20375/34278 [22:28:02<14:33:09, 3.77s/it] {'loss': 0.1348, 'grad_norm': 0.8054432663471339, 'learning_rate': 3.7283932704451053e-06, 'epoch': 0.59} 59%|█████▉ | 20375/34278 [22:28:02<14:33:09, 3.77s/it] 59%|█████▉ | 20376/34278 [22:28:07<16:28:24, 4.27s/it] {'loss': 0.1143, 'grad_norm': 0.6711341243653186, 
'learning_rate': 3.727936375708503e-06, 'epoch': 0.59} 59%|█████▉ | 20376/34278 [22:28:07<16:28:24, 4.27s/it] 59%|█████▉ | 20377/34278 [22:28:10<14:45:58, 3.82s/it] {'loss': 0.1355, 'grad_norm': 0.8048865207066975, 'learning_rate': 3.727479492328582e-06, 'epoch': 0.59} 59%|█████▉ | 20377/34278 [22:28:10<14:45:58, 3.82s/it] 59%|█████▉ | 20378/34278 [22:28:13<13:59:01, 3.62s/it] {'loss': 0.1065, 'grad_norm': 0.8756069044306218, 'learning_rate': 3.7270226203094207e-06, 'epoch': 0.59} 59%|█████▉ | 20378/34278 [22:28:13<13:59:01, 3.62s/it] 59%|█████▉ | 20379/34278 [22:28:17<14:05:26, 3.65s/it] {'loss': 0.1356, 'grad_norm': 0.7645233073994955, 'learning_rate': 3.726565759655094e-06, 'epoch': 0.59} 59%|█████▉ | 20379/34278 [22:28:17<14:05:26, 3.65s/it] 59%|█████▉ | 20380/34278 [22:28:20<13:31:04, 3.50s/it] {'loss': 0.1275, 'grad_norm': 0.8286671678293144, 'learning_rate': 3.726108910369688e-06, 'epoch': 0.59} 59%|█████▉ | 20380/34278 [22:28:20<13:31:04, 3.50s/it] 59%|█████▉ | 20381/34278 [22:28:23<13:41:49, 3.55s/it] {'loss': 0.1474, 'grad_norm': 0.7099918736793572, 'learning_rate': 3.7256520724572766e-06, 'epoch': 0.59} 59%|█████▉ | 20381/34278 [22:28:23<13:41:49, 3.55s/it] 59%|█████▉ | 20382/34278 [22:28:27<13:52:54, 3.60s/it] {'loss': 0.0914, 'grad_norm': 0.8368279277975266, 'learning_rate': 3.7251952459219385e-06, 'epoch': 0.59} 59%|█████▉ | 20382/34278 [22:28:27<13:52:54, 3.60s/it] 59%|█████▉ | 20383/34278 [22:28:31<13:50:20, 3.59s/it] {'loss': 0.1382, 'grad_norm': 0.6257342084471068, 'learning_rate': 3.724738430767755e-06, 'epoch': 0.59} 59%|█████▉ | 20383/34278 [22:28:31<13:50:20, 3.59s/it] 59%|█████▉ | 20384/34278 [22:28:36<16:14:56, 4.21s/it] {'loss': 0.1343, 'grad_norm': 0.7840624078300055, 'learning_rate': 3.7242816269988e-06, 'epoch': 0.59} 59%|█████▉ | 20384/34278 [22:28:36<16:14:56, 4.21s/it] 59%|█████▉ | 20385/34278 [22:28:40<15:10:06, 3.93s/it] {'loss': 0.1164, 'grad_norm': 0.8047747085774777, 'learning_rate': 3.7238248346191543e-06, 'epoch': 0.59} 
59%|█████▉ | 20385/34278 [22:28:40<15:10:06, 3.93s/it] 59%|█████▉ | 20386/34278 [22:28:44<15:06:03, 3.91s/it] {'loss': 0.1133, 'grad_norm': 1.0750330604736558, 'learning_rate': 3.7233680536328965e-06, 'epoch': 0.59} 59%|█████▉ | 20386/34278 [22:28:44<15:06:03, 3.91s/it] 59%|█████▉ | 20387/34278 [22:28:48<15:14:03, 3.95s/it] {'loss': 0.1426, 'grad_norm': 0.8052074439890694, 'learning_rate': 3.7229112840441036e-06, 'epoch': 0.59} 59%|█████▉ | 20387/34278 [22:28:48<15:14:03, 3.95s/it] 59%|█████▉ | 20388/34278 [22:28:50<13:55:36, 3.61s/it] {'loss': 0.1271, 'grad_norm': 1.0046242506305367, 'learning_rate': 3.722454525856855e-06, 'epoch': 0.59} 59%|█████▉ | 20388/34278 [22:28:50<13:55:36, 3.61s/it] 59%|█████▉ | 20389/34278 [22:28:53<13:13:01, 3.43s/it] {'loss': 0.1186, 'grad_norm': 0.920190064571862, 'learning_rate': 3.7219977790752265e-06, 'epoch': 0.59} 59%|█████▉ | 20389/34278 [22:28:53<13:13:01, 3.43s/it] 59%|█████▉ | 20390/34278 [22:28:56<12:46:53, 3.31s/it] {'loss': 0.131, 'grad_norm': 0.9141840259196852, 'learning_rate': 3.721541043703297e-06, 'epoch': 0.59} 59%|█████▉ | 20390/34278 [22:28:56<12:46:53, 3.31s/it] 59%|█████▉ | 20391/34278 [22:29:00<12:41:46, 3.29s/it] {'loss': 0.1568, 'grad_norm': 0.8071523413765483, 'learning_rate': 3.7210843197451423e-06, 'epoch': 0.59} 59%|█████▉ | 20391/34278 [22:29:00<12:41:46, 3.29s/it] 59%|█████▉ | 20392/34278 [22:29:03<12:49:15, 3.32s/it] {'loss': 0.1442, 'grad_norm': 0.9220774603287075, 'learning_rate': 3.720627607204843e-06, 'epoch': 0.59} 59%|█████▉ | 20392/34278 [22:29:03<12:49:15, 3.32s/it] 59%|█████▉ | 20393/34278 [22:29:06<12:45:06, 3.31s/it] {'loss': 0.0972, 'grad_norm': 0.9029026910952078, 'learning_rate': 3.720170906086476e-06, 'epoch': 0.59} 59%|█████▉ | 20393/34278 [22:29:06<12:45:06, 3.31s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d237b3bf0>
Failed to fetch sample 3255716.
Exception: cannot identify image file <_io.BytesIO object at 0x7f6d237b3bf0> 59%|█████▉ | 20394/34278 [22:29:10<13:04:54, 3.39s/it] {'loss': 0.1127, 'grad_norm': 0.8324401036230402, 'learning_rate': 3.719714216394117e-06, 'epoch': 0.59} 59%|█████▉ | 20394/34278 [22:29:10<13:04:54, 3.39s/it] 59%|█████▉ | 20395/34278 [22:29:13<12:48:15, 3.32s/it] {'loss': 0.1183, 'grad_norm': 0.9037690580022587, 'learning_rate': 3.719257538131843e-06, 'epoch': 0.59} 59%|█████▉ | 20395/34278 [22:29:13<12:48:15, 3.32s/it] 60%|█████▉ | 20396/34278 [22:29:16<12:23:56, 3.22s/it] {'loss': 0.1096, 'grad_norm': 0.8559684972197111, 'learning_rate': 3.718800871303733e-06, 'epoch': 0.6} 60%|█████▉ | 20396/34278 [22:29:16<12:23:56, 3.22s/it] 60%|█████▉ | 20397/34278 [22:29:20<12:57:46, 3.36s/it] {'loss': 0.1319, 'grad_norm': 1.1706300170802248, 'learning_rate': 3.7183442159138618e-06, 'epoch': 0.6} 60%|█████▉ | 20397/34278 [22:29:20<12:57:46, 3.36s/it] 60%|█████▉ | 20398/34278 [22:29:23<12:34:29, 3.26s/it] {'loss': 0.1399, 'grad_norm': 0.8423446476329747, 'learning_rate': 3.717887571966308e-06, 'epoch': 0.6} 60%|█████▉ | 20398/34278 [22:29:23<12:34:29, 3.26s/it] 60%|█████▉ | 20399/34278 [22:29:26<12:00:12, 3.11s/it] {'loss': 0.1199, 'grad_norm': 0.957491801776679, 'learning_rate': 3.7174309394651476e-06, 'epoch': 0.6} 60%|█████▉ | 20399/34278 [22:29:26<12:00:12, 3.11s/it] 60%|█████▉ | 20400/34278 [22:29:29<12:12:10, 3.17s/it] {'loss': 0.1058, 'grad_norm': 0.9615955937963768, 'learning_rate': 3.716974318414458e-06, 'epoch': 0.6} 60%|█████▉ | 20400/34278 [22:29:29<12:12:10, 3.17s/it] 60%|█████▉ | 20401/34278 [22:29:34<14:43:29, 3.82s/it] {'loss': 0.1325, 'grad_norm': 0.7055027976814059, 'learning_rate': 3.7165177088183158e-06, 'epoch': 0.6} 60%|█████▉ | 20401/34278 [22:29:34<14:43:29, 3.82s/it] 60%|█████▉ | 20402/34278 [22:29:37<13:27:26, 3.49s/it] {'loss': 0.1214, 'grad_norm': 0.8306744452301431, 'learning_rate': 3.716061110680797e-06, 'epoch': 0.6} 60%|█████▉ | 20402/34278 [22:29:37<13:27:26, 
3.49s/it] 60%|█████▉ | 20403/34278 [22:29:41<13:46:05, 3.57s/it] {'loss': 0.1371, 'grad_norm': 0.8514220181183535, 'learning_rate': 3.7156045240059766e-06, 'epoch': 0.6} 60%|█████▉ | 20403/34278 [22:29:41<13:46:05, 3.57s/it] 60%|█████▉ | 20404/34278 [22:29:44<13:10:30, 3.42s/it] {'loss': 0.1287, 'grad_norm': 1.0421853182694367, 'learning_rate': 3.7151479487979335e-06, 'epoch': 0.6} 60%|█████▉ | 20404/34278 [22:29:44<13:10:30, 3.42s/it] 60%|█████▉ | 20405/34278 [22:29:47<13:00:54, 3.38s/it] {'loss': 0.1174, 'grad_norm': 0.7951703747004137, 'learning_rate': 3.7146913850607435e-06, 'epoch': 0.6} 60%|█████▉ | 20405/34278 [22:29:47<13:00:54, 3.38s/it] 60%|█████▉ | 20406/34278 [22:29:50<12:38:52, 3.28s/it] {'loss': 0.1211, 'grad_norm': 0.9022842613203131, 'learning_rate': 3.714234832798481e-06, 'epoch': 0.6} 60%|█████▉ | 20406/34278 [22:29:50<12:38:52, 3.28s/it] 60%|█████▉ | 20407/34278 [22:29:55<14:19:00, 3.72s/it] {'loss': 0.125, 'grad_norm': 0.8525782163889262, 'learning_rate': 3.7137782920152237e-06, 'epoch': 0.6} 60%|█████▉ | 20407/34278 [22:29:55<14:19:00, 3.72s/it] 60%|█████▉ | 20408/34278 [22:29:59<14:29:28, 3.76s/it] {'loss': 0.1264, 'grad_norm': 0.8668321566287852, 'learning_rate': 3.7133217627150475e-06, 'epoch': 0.6} 60%|█████▉ | 20408/34278 [22:29:59<14:29:28, 3.76s/it] 60%|█████▉ | 20409/34278 [22:30:02<14:11:08, 3.68s/it] {'loss': 0.1459, 'grad_norm': 0.8136735881886328, 'learning_rate': 3.712865244902024e-06, 'epoch': 0.6} 60%|█████▉ | 20409/34278 [22:30:02<14:11:08, 3.68s/it] 60%|█████▉ | 20410/34278 [22:30:05<13:26:54, 3.49s/it] {'loss': 0.1179, 'grad_norm': 0.832175816541259, 'learning_rate': 3.7124087385802353e-06, 'epoch': 0.6} 60%|█████▉ | 20410/34278 [22:30:05<13:26:54, 3.49s/it] 60%|█████▉ | 20411/34278 [22:30:09<13:48:39, 3.59s/it] {'loss': 0.1288, 'grad_norm': 0.8724492284315477, 'learning_rate': 3.7119522437537537e-06, 'epoch': 0.6} 60%|█████▉ | 20411/34278 [22:30:09<13:48:39, 3.59s/it] 60%|█████▉ | 20412/34278 [22:30:12<13:20:11, 3.46s/it] 
{'loss': 0.133, 'grad_norm': 1.0214453828923034, 'learning_rate': 3.7114957604266546e-06, 'epoch': 0.6} 60%|█████▉ | 20412/34278 [22:30:12<13:20:11, 3.46s/it] 60%|█████▉ | 20413/34278 [22:30:15<12:33:49, 3.26s/it] {'loss': 0.1303, 'grad_norm': 0.8232700476169499, 'learning_rate': 3.7110392886030145e-06, 'epoch': 0.6} 60%|█████▉ | 20413/34278 [22:30:15<12:33:49, 3.26s/it] 60%|█████▉ | 20414/34278 [22:30:18<12:30:25, 3.25s/it] {'loss': 0.1184, 'grad_norm': 0.799274066246224, 'learning_rate': 3.710582828286907e-06, 'epoch': 0.6} 60%|█████▉ | 20414/34278 [22:30:18<12:30:25, 3.25s/it] 60%|█████▉ | 20415/34278 [22:30:21<12:26:49, 3.23s/it] {'loss': 0.1084, 'grad_norm': 0.8350031645601355, 'learning_rate': 3.7101263794824072e-06, 'epoch': 0.6} 60%|█████▉ | 20415/34278 [22:30:21<12:26:49, 3.23s/it] 60%|█████▉ | 20416/34278 [22:30:24<12:10:15, 3.16s/it] {'loss': 0.1508, 'grad_norm': 0.8646161730243356, 'learning_rate': 3.7096699421935926e-06, 'epoch': 0.6} 60%|█████▉ | 20416/34278 [22:30:24<12:10:15, 3.16s/it] 60%|█████▉ | 20417/34278 [22:30:28<12:11:51, 3.17s/it] {'loss': 0.1399, 'grad_norm': 0.8859231506998803, 'learning_rate': 3.709213516424537e-06, 'epoch': 0.6} 60%|█████▉ | 20417/34278 [22:30:28<12:11:51, 3.17s/it] 60%|█████▉ | 20418/34278 [22:30:31<12:22:25, 3.21s/it] {'loss': 0.1257, 'grad_norm': 0.8511175372845571, 'learning_rate': 3.708757102179315e-06, 'epoch': 0.6} 60%|█████▉ | 20418/34278 [22:30:31<12:22:25, 3.21s/it] 60%|█████▉ | 20419/34278 [22:30:34<12:00:14, 3.12s/it] {'loss': 0.1426, 'grad_norm': 0.9045200428975965, 'learning_rate': 3.708300699462001e-06, 'epoch': 0.6} 60%|█████▉ | 20419/34278 [22:30:34<12:00:14, 3.12s/it] 60%|█████▉ | 20420/34278 [22:30:38<13:14:22, 3.44s/it] {'loss': 0.1264, 'grad_norm': 0.8359579178889355, 'learning_rate': 3.7078443082766694e-06, 'epoch': 0.6} 60%|█████▉ | 20420/34278 [22:30:38<13:14:22, 3.44s/it] 60%|█████▉ | 20421/34278 [22:30:41<13:01:57, 3.39s/it] {'loss': 0.1243, 'grad_norm': 0.8293232457778126, 'learning_rate': 
3.707387928627395e-06, 'epoch': 0.6} 60%|█████▉ | 20421/34278 [22:30:41<13:01:57, 3.39s/it] 60%|█████▉ | 20422/34278 [22:30:47<15:27:35, 4.02s/it] {'loss': 0.1369, 'grad_norm': 0.8543332688610659, 'learning_rate': 3.706931560518253e-06, 'epoch': 0.6} 60%|█████▉ | 20422/34278 [22:30:47<15:27:35, 4.02s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f45fb8c6e80>
Failed to fetch sample 2654733.
Exception: cannot identify image file <_io.BytesIO object at 0x7f45fb8c6e80> 60%|█████▉ | 20423/34278 [22:30:50<14:26:54, 3.75s/it] {'loss': 0.1125, 'grad_norm': 0.8685431899547782, 'learning_rate': 3.706475203953319e-06, 'epoch': 0.6} 60%|█████▉ | 20423/34278 [22:30:50<14:26:54, 3.75s/it] 60%|█████▉ | 20424/34278 [22:30:56<17:00:14, 4.42s/it] {'loss': 0.1213, 'grad_norm': 0.8912930805152341, 'learning_rate': 3.706018858936664e-06, 'epoch': 0.6} 60%|█████▉ | 20424/34278 [22:30:56<17:00:14, 4.42s/it] 60%|█████▉ | 20425/34278 [22:30:59<15:26:47, 4.01s/it] {'loss': 0.1108, 'grad_norm': 0.9540886112928165, 'learning_rate': 3.7055625254723645e-06, 'epoch': 0.6} 60%|█████▉ | 20425/34278 [22:30:59<15:26:47, 4.01s/it] 60%|█████▉ | 20426/34278 [22:31:05<17:24:53, 4.53s/it] {'loss': 0.1246, 'grad_norm': 0.6292913574709981, 'learning_rate': 3.705106203564494e-06, 'epoch': 0.6} 60%|█████▉ | 20426/34278 [22:31:05<17:24:53, 4.53s/it] 60%|█████▉ | 20427/34278 [22:31:08<15:48:10, 4.11s/it] {'loss': 0.1202, 'grad_norm': 0.866460610219473, 'learning_rate': 3.7046498932171247e-06, 'epoch': 0.6} 60%|█████▉ | 20427/34278 [22:31:08<15:48:10, 4.11s/it] 60%|█████▉ | 20428/34278 [22:31:11<14:26:19, 3.75s/it] {'loss': 0.1225, 'grad_norm': 0.9358369037365718, 'learning_rate': 3.7041935944343325e-06, 'epoch': 0.6} 60%|█████▉ | 20428/34278 [22:31:11<14:26:19, 3.75s/it] 60%|█████▉ | 20429/34278 [22:31:14<14:09:01, 3.68s/it] {'loss': 0.1124, 'grad_norm': 0.8857018143086642, 'learning_rate': 3.703737307220191e-06, 'epoch': 0.6} 60%|█████▉ | 20429/34278 [22:31:14<14:09:01, 3.68s/it] 60%|█████▉ | 20430/34278 [22:31:18<14:36:32, 3.80s/it] {'loss': 0.1405, 'grad_norm': 0.8307551087365714, 'learning_rate': 3.7032810315787726e-06, 'epoch': 0.6} 60%|█████▉ | 20430/34278 [22:31:18<14:36:32, 3.80s/it] 60%|█████▉ | 20431/34278 [22:31:22<14:50:13, 3.86s/it] {'loss': 0.1342, 'grad_norm': 0.8077905351829049, 'learning_rate': 3.7028247675141538e-06, 'epoch': 0.6} 60%|█████▉ | 20431/34278 [22:31:22<14:50:13, 
3.86s/it] 60%|█████▉ | 20432/34278 [22:31:26<14:23:36, 3.74s/it] {'loss': 0.0898, 'grad_norm': 0.768252473178892, 'learning_rate': 3.702368515030404e-06, 'epoch': 0.6} 60%|█████▉ | 20432/34278 [22:31:26<14:23:36, 3.74s/it] 60%|█████▉ | 20433/34278 [22:31:29<13:54:34, 3.62s/it] {'loss': 0.1441, 'grad_norm': 1.1228798295974916, 'learning_rate': 3.701912274131597e-06, 'epoch': 0.6} 60%|█████▉ | 20433/34278 [22:31:29<13:54:34, 3.62s/it] 60%|█████▉ | 20434/34278 [22:31:32<13:34:12, 3.53s/it] {'loss': 0.1148, 'grad_norm': 0.653153642098277, 'learning_rate': 3.7014560448218094e-06, 'epoch': 0.6} 60%|█████▉ | 20434/34278 [22:31:32<13:34:12, 3.53s/it] 60%|█████▉ | 20435/34278 [22:31:36<14:09:06, 3.68s/it] {'loss': 0.1184, 'grad_norm': 0.7372247471430156, 'learning_rate': 3.7009998271051127e-06, 'epoch': 0.6} 60%|█████▉ | 20435/34278 [22:31:36<14:09:06, 3.68s/it] 60%|█████▉ | 20436/34278 [22:31:40<13:51:58, 3.61s/it] {'loss': 0.114, 'grad_norm': 0.8465129941024829, 'learning_rate': 3.700543620985578e-06, 'epoch': 0.6} 60%|█████▉ | 20436/34278 [22:31:40<13:51:58, 3.61s/it] 60%|█████▉ | 20437/34278 [22:31:44<14:50:09, 3.86s/it] {'loss': 0.1313, 'grad_norm': 0.8628217830651215, 'learning_rate': 3.7000874264672804e-06, 'epoch': 0.6} 60%|█████▉ | 20437/34278 [22:31:44<14:50:09, 3.86s/it] 60%|█████▉ | 20438/34278 [22:31:48<14:05:14, 3.66s/it] {'loss': 0.127, 'grad_norm': 0.738228192358053, 'learning_rate': 3.6996312435542925e-06, 'epoch': 0.6} 60%|█████▉ | 20438/34278 [22:31:48<14:05:14, 3.66s/it] 60%|█████▉ | 20439/34278 [22:31:51<13:27:26, 3.50s/it] {'loss': 0.1202, 'grad_norm': 1.0597878518786803, 'learning_rate': 3.6991750722506835e-06, 'epoch': 0.6} 60%|█████▉ | 20439/34278 [22:31:51<13:27:26, 3.50s/it] 60%|█████▉ | 20440/34278 [22:31:57<16:28:12, 4.28s/it] {'loss': 0.1302, 'grad_norm': 0.810686714995001, 'learning_rate': 3.6987189125605315e-06, 'epoch': 0.6} 60%|█████▉ | 20440/34278 [22:31:57<16:28:12, 4.28s/it] 60%|█████▉ | 20441/34278 [22:32:00<14:47:06, 3.85s/it] {'loss': 
0.1522, 'grad_norm': 0.8079389368908894, 'learning_rate': 3.6982627644879065e-06, 'epoch': 0.6} 60%|█████▉ | 20441/34278 [22:32:00<14:47:06, 3.85s/it] 60%|█████▉ | 20442/34278 [22:32:03<14:43:45, 3.83s/it] {'loss': 0.1086, 'grad_norm': 0.8620911802258356, 'learning_rate': 3.6978066280368797e-06, 'epoch': 0.6} 60%|█████▉ | 20442/34278 [22:32:03<14:43:45, 3.83s/it] 60%|█████▉ | 20443/34278 [22:32:06<13:33:51, 3.53s/it] {'loss': 0.135, 'grad_norm': 0.8940846739395645, 'learning_rate': 3.6973505032115262e-06, 'epoch': 0.6} 60%|█████▉ | 20443/34278 [22:32:06<13:33:51, 3.53s/it] 60%|█████▉ | 20444/34278 [22:32:10<13:27:04, 3.50s/it] {'loss': 0.1066, 'grad_norm': 0.7280652859703205, 'learning_rate': 3.696894390015915e-06, 'epoch': 0.6} 60%|█████▉ | 20444/34278 [22:32:10<13:27:04, 3.50s/it] 60%|█████▉ | 20445/34278 [22:32:13<13:31:14, 3.52s/it] {'loss': 0.1127, 'grad_norm': 0.8332426661569426, 'learning_rate': 3.6964382884541188e-06, 'epoch': 0.6} 60%|█████▉ | 20445/34278 [22:32:13<13:31:14, 3.52s/it] 60%|█████▉ | 20446/34278 [22:32:18<15:22:36, 4.00s/it] {'loss': 0.132, 'grad_norm': 0.8961981703523535, 'learning_rate': 3.695982198530211e-06, 'epoch': 0.6} 60%|█████▉ | 20446/34278 [22:32:18<15:22:36, 4.00s/it] 60%|█████▉ | 20447/34278 [22:32:23<16:23:09, 4.27s/it] {'loss': 0.118, 'grad_norm': 0.6293797067487202, 'learning_rate': 3.695526120248264e-06, 'epoch': 0.6} 60%|█████▉ | 20447/34278 [22:32:23<16:23:09, 4.27s/it] 60%|█████▉ | 20448/34278 [22:32:26<14:57:27, 3.89s/it] {'loss': 0.1149, 'grad_norm': 0.9985226237685431, 'learning_rate': 3.6950700536123486e-06, 'epoch': 0.6} 60%|█████▉ | 20448/34278 [22:32:26<14:57:27, 3.89s/it] 60%|█████▉ | 20449/34278 [22:32:29<13:58:49, 3.64s/it] {'loss': 0.1281, 'grad_norm': 0.8470849790946957, 'learning_rate': 3.694613998626535e-06, 'epoch': 0.6} 60%|█████▉ | 20449/34278 [22:32:29<13:58:49, 3.64s/it] 60%|█████▉ | 20450/34278 [22:32:32<13:20:41, 3.47s/it] {'loss': 0.1316, 'grad_norm': 0.7890509438732369, 'learning_rate': 
3.694157955294896e-06, 'epoch': 0.6} 60%|█████▉ | 20450/34278 [22:32:32<13:20:41, 3.47s/it] 60%|█████▉ | 20451/34278 [22:32:36<13:12:36, 3.44s/it] {'loss': 0.1322, 'grad_norm': 0.9085179037130472, 'learning_rate': 3.693701923621502e-06, 'epoch': 0.6} 60%|█████▉ | 20451/34278 [22:32:36<13:12:36, 3.44s/it] 60%|█████▉ | 20452/34278 [22:32:39<13:07:33, 3.42s/it] {'loss': 0.1048, 'grad_norm': 0.8370162621157942, 'learning_rate': 3.6932459036104272e-06, 'epoch': 0.6} 60%|█████▉ | 20452/34278 [22:32:39<13:07:33, 3.42s/it] 60%|█████▉ | 20453/34278 [22:32:42<12:38:47, 3.29s/it] {'loss': 0.1245, 'grad_norm': 1.0677576295377924, 'learning_rate': 3.6927898952657417e-06, 'epoch': 0.6} 60%|█████▉ | 20453/34278 [22:32:42<12:38:47, 3.29s/it] 60%|█████▉ | 20454/34278 [22:32:46<13:40:56, 3.56s/it] {'loss': 0.1382, 'grad_norm': 0.8045087334684989, 'learning_rate': 3.6923338985915146e-06, 'epoch': 0.6} 60%|█████▉ | 20454/34278 [22:32:46<13:40:56, 3.56s/it] 60%|█████▉ | 20455/34278 [22:32:52<15:48:54, 4.12s/it] {'loss': 0.1314, 'grad_norm': 0.9289775206330065, 'learning_rate': 3.691877913591818e-06, 'epoch': 0.6} 60%|█████▉ | 20455/34278 [22:32:52<15:48:54, 4.12s/it] 60%|█████▉ | 20456/34278 [22:32:55<14:23:03, 3.75s/it] {'loss': 0.1128, 'grad_norm': 0.7931801109386174, 'learning_rate': 3.691421940270725e-06, 'epoch': 0.6} 60%|█████▉ | 20456/34278 [22:32:55<14:23:03, 3.75s/it] 60%|█████▉ | 20457/34278 [22:32:58<14:04:57, 3.67s/it] {'loss': 0.1203, 'grad_norm': 0.7868190286090775, 'learning_rate': 3.6909659786323016e-06, 'epoch': 0.6} 60%|█████▉ | 20457/34278 [22:32:58<14:04:57, 3.67s/it] 60%|█████▉ | 20458/34278 [22:33:01<13:28:19, 3.51s/it] {'loss': 0.1307, 'grad_norm': 0.9195037410372153, 'learning_rate': 3.6905100286806228e-06, 'epoch': 0.6} 60%|█████▉ | 20458/34278 [22:33:01<13:28:19, 3.51s/it] 60%|█████▉ | 20459/34278 [22:33:04<12:32:46, 3.27s/it] {'loss': 0.1211, 'grad_norm': 0.9351941867488012, 'learning_rate': 3.6900540904197583e-06, 'epoch': 0.6} 60%|█████▉ | 20459/34278 
[22:33:04<12:32:46, 3.27s/it] 60%|█████▉ | 20460/34278 [22:33:07<12:42:18, 3.31s/it] {'loss': 0.1374, 'grad_norm': 0.7102082820615029, 'learning_rate': 3.689598163853779e-06, 'epoch': 0.6} 60%|█████▉ | 20460/34278 [22:33:07<12:42:18, 3.31s/it] 60%|█████▉ | 20461/34278 [22:33:11<13:25:08, 3.50s/it] {'loss': 0.1262, 'grad_norm': 1.232348357686735, 'learning_rate': 3.6891422489867535e-06, 'epoch': 0.6} 60%|█████▉ | 20461/34278 [22:33:11<13:25:08, 3.50s/it] 60%|█████▉ | 20462/34278 [22:33:14<12:31:54, 3.27s/it] {'loss': 0.1398, 'grad_norm': 1.6203371497122976, 'learning_rate': 3.688686345822753e-06, 'epoch': 0.6} 60%|█████▉ | 20462/34278 [22:33:14<12:31:54, 3.27s/it] 60%|█████▉ | 20463/34278 [22:33:17<12:25:55, 3.24s/it] {'loss': 0.1697, 'grad_norm': 1.0860417899502368, 'learning_rate': 3.6882304543658465e-06, 'epoch': 0.6} 60%|█████▉ | 20463/34278 [22:33:17<12:25:55, 3.24s/it] 60%|█████▉ | 20464/34278 [22:33:20<12:30:36, 3.26s/it] {'loss': 0.1386, 'grad_norm': 0.8871256628348626, 'learning_rate': 3.6877745746201064e-06, 'epoch': 0.6} 60%|█████▉ | 20464/34278 [22:33:20<12:30:36, 3.26s/it] 60%|█████▉ | 20465/34278 [22:33:27<16:05:19, 4.19s/it] {'loss': 0.1205, 'grad_norm': 0.7836428040188675, 'learning_rate': 3.6873187065896033e-06, 'epoch': 0.6} 60%|█████▉ | 20465/34278 [22:33:27<16:05:19, 4.19s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
60%|█████▉ | 20466/34278 [22:33:30<15:16:11, 3.98s/it] {'loss': 0.1118, 'grad_norm': 0.9965053353182319, 'learning_rate': 3.686862850278403e-06, 'epoch': 0.6} 60%|█████▉ | 20466/34278 [22:33:30<15:16:11, 3.98s/it] 60%|█████▉ | 20467/34278 [22:33:33<14:19:30, 3.73s/it] {'loss': 0.1331, 'grad_norm': 0.9808097879999046, 'learning_rate': 3.6864070056905786e-06, 'epoch': 0.6} 60%|█████▉ | 20467/34278 [22:33:34<14:19:30, 3.73s/it] 60%|█████▉ | 20468/34278 [22:33:40<17:04:04, 4.45s/it] {'loss': 0.1262, 'grad_norm': 0.909625164618019, 'learning_rate': 3.6859511728302006e-06, 'epoch': 0.6} 60%|█████▉ | 20468/34278 [22:33:40<17:04:04, 4.45s/it] 60%|█████▉ | 20469/34278 [22:33:43<16:08:52, 4.21s/it] {'loss': 0.1348, 'grad_norm': 0.7804522237800623, 'learning_rate': 3.6854953517013326e-06, 'epoch': 0.6} 60%|█████▉ | 20469/34278 [22:33:43<16:08:52, 4.21s/it] 60%|█████▉ | 20470/34278 [22:33:46<14:53:29, 3.88s/it] {'loss': 0.1477, 'grad_norm': 0.9893818890694739, 'learning_rate': 3.685039542308052e-06, 'epoch': 0.6} 60%|█████▉ | 20470/34278 [22:33:46<14:53:29, 3.88s/it] 60%|█████▉ | 20471/34278 [22:33:49<13:39:43, 3.56s/it] {'loss': 0.1088, 'grad_norm': 0.8875647411801023, 'learning_rate': 3.684583744654423e-06, 'epoch': 0.6} 60%|█████▉ | 20471/34278 [22:33:49<13:39:43, 3.56s/it] 60%|█████▉ | 20472/34278 [22:33:53<14:26:39, 3.77s/it] {'loss': 0.1369, 'grad_norm': 0.7961423523856973, 'learning_rate': 3.6841279587445165e-06, 'epoch': 0.6} 60%|█████▉ | 20472/34278 [22:33:53<14:26:39, 3.77s/it] 60%|█████▉ | 20473/34278 [22:33:57<14:41:10, 3.83s/it] {'loss': 0.1452, 'grad_norm': 1.0075475240222815, 'learning_rate': 3.6836721845824032e-06, 'epoch': 0.6} 60%|█████▉ | 20473/34278 [22:33:57<14:41:10, 3.83s/it] 60%|█████▉ | 20474/34278 [22:34:02<15:00:53, 3.92s/it] {'loss': 0.1286, 'grad_norm': 0.9706868154006772, 'learning_rate': 3.6832164221721465e-06, 'epoch': 0.6} 60%|█████▉ | 20474/34278 [22:34:02<15:00:53, 3.92s/it] 60%|█████▉ | 20475/34278
[22:34:04<13:49:23, 3.61s/it] {'loss': 0.1046, 'grad_norm': 0.6751922044837719, 'learning_rate': 3.682760671517823e-06, 'epoch': 0.6} 60%|█████▉ | 20475/34278 [22:34:04<13:49:23, 3.61s/it] 60%|█████▉ | 20476/34278 [22:34:08<14:18:53, 3.73s/it] {'loss': 0.1278, 'grad_norm': 0.9000148407740457, 'learning_rate': 3.6823049326234963e-06, 'epoch': 0.6} 60%|█████▉ | 20476/34278 [22:34:08<14:18:53, 3.73s/it] 60%|█████▉ | 20477/34278 [22:34:12<13:50:15, 3.61s/it] {'loss': 0.1329, 'grad_norm': 1.144890886810307, 'learning_rate': 3.6818492054932363e-06, 'epoch': 0.6} 60%|█████▉ | 20477/34278 [22:34:12<13:50:15, 3.61s/it] 60%|█████▉ | 20478/34278 [22:34:15<13:47:57, 3.60s/it] {'loss': 0.1077, 'grad_norm': 1.039175940067144, 'learning_rate': 3.6813934901311134e-06, 'epoch': 0.6} 60%|█████▉ | 20478/34278 [22:34:15<13:47:57, 3.60s/it] 60%|█████▉ | 20479/34278 [22:34:21<16:28:25, 4.30s/it] {'loss': 0.1235, 'grad_norm': 0.7314679239854645, 'learning_rate': 3.6809377865411933e-06, 'epoch': 0.6} 60%|█████▉ | 20479/34278 [22:34:21<16:28:25, 4.30s/it] 60%|█████▉ | 20480/34278 [22:34:25<15:27:22, 4.03s/it] {'loss': 0.1319, 'grad_norm': 0.8729613152525328, 'learning_rate': 3.6804820947275444e-06, 'epoch': 0.6} 60%|█████▉ | 20480/34278 [22:34:25<15:27:22, 4.03s/it] 60%|█████▉ | 20481/34278 [22:34:28<14:22:36, 3.75s/it] {'loss': 0.1206, 'grad_norm': 0.785173835529341, 'learning_rate': 3.680026414694238e-06, 'epoch': 0.6} 60%|█████▉ | 20481/34278 [22:34:28<14:22:36, 3.75s/it] 60%|█████▉ | 20482/34278 [22:34:32<14:43:11, 3.84s/it] {'loss': 0.122, 'grad_norm': 1.184177483031644, 'learning_rate': 3.679570746445341e-06, 'epoch': 0.6} 60%|█████▉ | 20482/34278 [22:34:32<14:43:11, 3.84s/it] 60%|█████▉ | 20483/34278 [22:34:35<13:58:23, 3.65s/it] {'loss': 0.1345, 'grad_norm': 1.1185632815041635, 'learning_rate': 3.6791150899849215e-06, 'epoch': 0.6} 60%|█████▉ | 20483/34278 [22:34:35<13:58:23, 3.65s/it] 60%|█████▉ | 20484/34278 [22:34:38<13:06:27, 3.42s/it] {'loss': 0.1472, 'grad_norm': 
0.9808900238407227, 'learning_rate': 3.6786594453170467e-06, 'epoch': 0.6} 60%|█████▉ | 20484/34278 [22:34:38<13:06:27, 3.42s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7facc27c37e0>
Failed to fetch sample 3646659. Exception: cannot identify image file <_io.BytesIO object at 0x7facc27c37e0>
60%|█████▉ | 20485/34278 [22:34:43<15:27:48, 4.04s/it] {'loss': 0.1449, 'grad_norm': 0.8160310435880171, 'learning_rate': 3.678203812445784e-06, 'epoch': 0.6} 60%|█████▉ | 20485/34278 [22:34:43<15:27:48, 4.04s/it] 60%|█████▉ | 20486/34278 [22:34:47<14:44:07, 3.85s/it] {'loss': 0.1224, 'grad_norm': 0.8488171059555756, 'learning_rate': 3.677748191375202e-06, 'epoch': 0.6} 60%|█████▉ | 20486/34278 [22:34:47<14:44:07, 3.85s/it] 60%|█████▉ | 20487/34278 [22:34:51<14:42:23, 3.84s/it] {'loss': 0.1262, 'grad_norm': 0.7615093566664713, 'learning_rate': 3.67729258210937e-06, 'epoch': 0.6} 60%|█████▉ | 20487/34278 [22:34:51<14:42:23, 3.84s/it] 60%|█████▉ | 20488/34278 [22:34:53<13:23:04, 3.49s/it] {'loss': 0.1148, 'grad_norm': 1.0336629755885731, 'learning_rate': 3.6768369846523534e-06, 'epoch': 0.6} 60%|█████▉ | 20488/34278 [22:34:53<13:23:04, 3.49s/it] 60%|█████▉ | 20489/34278 [22:34:56<12:45:03, 3.33s/it] {'loss': 0.1562, 'grad_norm': 0.9502916717067837, 'learning_rate': 3.6763813990082205e-06, 'epoch': 0.6} 60%|█████▉ | 20489/34278 [22:34:56<12:45:03, 3.33s/it] 60%|█████▉ | 20490/34278 [22:34:59<12:18:46, 3.21s/it] {'loss': 0.1532, 'grad_norm': 0.9356259514588079, 'learning_rate': 3.675925825181039e-06, 'epoch': 0.6} 60%|█████▉ | 20490/34278 [22:34:59<12:18:46, 3.21s/it] 60%|█████▉ | 20491/34278 [22:35:02<12:13:06, 3.19s/it] {'loss': 0.1394, 'grad_norm': 1.0665564011160675, 'learning_rate': 3.675470263174875e-06, 'epoch': 0.6} 60%|█████▉ | 20491/34278 [22:35:02<12:13:06, 3.19s/it] 60%|█████▉ | 20492/34278 [22:35:07<14:09:31, 3.70s/it] {'loss': 0.1385, 'grad_norm': 0.8358045200038936, 'learning_rate': 3.6750147129937954e-06, 'epoch': 0.6} 60%|█████▉ | 20492/34278 [22:35:07<14:09:31, 3.70s/it] 60%|█████▉ | 20493/34278 [22:35:12<15:11:39, 3.97s/it] {'loss': 0.1317, 'grad_norm': 0.6542279822197059, 'learning_rate': 3.6745591746418687e-06, 'epoch': 0.6} 60%|█████▉ | 20493/34278 [22:35:12<15:11:39,
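[Editor's note] The traceback above shows the run surviving a corrupt record: after PIL raises `UnidentifiedImageError`, the dataset prints "Failed to fetch sample 3646659. Exception: ..." and training continues at the next step. A minimal sketch of that skip-and-retry pattern is below; the helper name `getitem_with_fallback` and the random-resample policy are assumptions for illustration, not the repo's actual `dataset.py` code.

```python
import random

def getitem_with_fallback(get_item, index, num_samples, max_retries=10):
    """Call get_item(index); on any exception (e.g. PIL's
    UnidentifiedImageError for undecodable image bytes), log the failure
    and retry with a randomly chosen replacement sample index."""
    for _ in range(max_retries):
        try:
            return get_item(index)
        except Exception as exc:
            # Mirrors the log format: "Failed to fetch sample <i>. Exception: ..."
            print(f"Failed to fetch sample {index}.")
            print(f"Exception: {exc}")
            index = random.randrange(num_samples)
    raise RuntimeError("no valid sample found after retries")
```

This is why a single bad image (sample 3646659 here) costs only a log line instead of killing a multi-day job: the loader eventually returns some successfully decoded sample for that batch slot.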
3.97s/it] 60%|█████▉ | 20494/34278 [22:35:15<14:04:36, 3.68s/it] {'loss': 0.1396, 'grad_norm': 0.7857079650083013, 'learning_rate': 3.6741036481231618e-06, 'epoch': 0.6} 60%|█████▉ | 20494/34278 [22:35:15<14:04:36, 3.68s/it] 60%|█████▉ | 20495/34278 [22:35:18<13:31:41, 3.53s/it] {'loss': 0.1661, 'grad_norm': 0.9921570766625235, 'learning_rate': 3.673648133441742e-06, 'epoch': 0.6} 60%|█████▉ | 20495/34278 [22:35:18<13:31:41, 3.53s/it] 60%|█████▉ | 20496/34278 [22:35:22<13:44:20, 3.59s/it] {'loss': 0.1136, 'grad_norm': 0.7417332212474258, 'learning_rate': 3.673192630601673e-06, 'epoch': 0.6} 60%|█████▉ | 20496/34278 [22:35:22<13:44:20, 3.59s/it] 60%|█████▉ | 20497/34278 [22:35:27<15:12:14, 3.97s/it] {'loss': 0.1314, 'grad_norm': 0.8751843957608968, 'learning_rate': 3.672737139607024e-06, 'epoch': 0.6} 60%|█████▉ | 20497/34278 [22:35:27<15:12:14, 3.97s/it] 60%|█████▉ | 20498/34278 [22:35:30<14:15:28, 3.72s/it] {'loss': 0.1469, 'grad_norm': 0.9085993574172754, 'learning_rate': 3.6722816604618603e-06, 'epoch': 0.6} 60%|█████▉ | 20498/34278 [22:35:30<14:15:28, 3.72s/it] 60%|█████▉ | 20499/34278 [22:35:35<16:09:58, 4.22s/it] {'loss': 0.1243, 'grad_norm': 1.0359512202969643, 'learning_rate': 3.6718261931702504e-06, 'epoch': 0.6} 60%|█████▉ | 20499/34278 [22:35:35<16:09:58, 4.22s/it] 60%|█████▉ | 20500/34278 [22:35:41<18:18:46, 4.78s/it] {'loss': 0.1032, 'grad_norm': 0.95895574487634, 'learning_rate': 3.6713707377362594e-06, 'epoch': 0.6} 60%|█████▉ | 20500/34278 [22:35:41<18:18:46, 4.78s/it] 60%|█████▉ | 20501/34278 [22:35:45<16:40:20, 4.36s/it] {'loss': 0.1241, 'grad_norm': 1.2815863745803195, 'learning_rate': 3.6709152941639526e-06, 'epoch': 0.6} 60%|█████▉ | 20501/34278 [22:35:45<16:40:20, 4.36s/it] 60%|█████▉ | 20502/34278 [22:35:47<14:47:17, 3.86s/it] {'loss': 0.1327, 'grad_norm': 0.8772051140004502, 'learning_rate': 3.6704598624573967e-06, 'epoch': 0.6} 60%|█████▉ | 20502/34278 [22:35:47<14:47:17, 3.86s/it] 60%|█████▉ | 20503/34278 [22:35:52<15:37:54, 4.09s/it] 
{'loss': 0.1216, 'grad_norm': 0.8344933951998865, 'learning_rate': 3.670004442620659e-06, 'epoch': 0.6} 60%|█████▉ | 20503/34278 [22:35:52<15:37:54, 4.09s/it] 60%|█████▉ | 20504/34278 [22:35:57<16:49:08, 4.40s/it] {'loss': 0.1233, 'grad_norm': 0.9100034234769037, 'learning_rate': 3.6695490346578007e-06, 'epoch': 0.6} 60%|█████▉ | 20504/34278 [22:35:57<16:49:08, 4.40s/it] 60%|█████▉ | 20505/34278 [22:36:00<15:20:02, 4.01s/it] {'loss': 0.102, 'grad_norm': 0.7762673031814327, 'learning_rate': 3.6690936385728943e-06, 'epoch': 0.6} 60%|█████▉ | 20505/34278 [22:36:00<15:20:02, 4.01s/it] 60%|█████▉ | 20506/34278 [22:36:03<14:25:10, 3.77s/it] {'loss': 0.1178, 'grad_norm': 0.7129530037666477, 'learning_rate': 3.668638254370001e-06, 'epoch': 0.6} 60%|█████▉ | 20506/34278 [22:36:03<14:25:10, 3.77s/it] 60%|█████▉ | 20507/34278 [22:36:07<13:45:29, 3.60s/it] {'loss': 0.1162, 'grad_norm': 1.052894058234126, 'learning_rate': 3.668182882053188e-06, 'epoch': 0.6} 60%|█████▉ | 20507/34278 [22:36:07<13:45:29, 3.60s/it] 60%|█████▉ | 20508/34278 [22:36:11<14:56:39, 3.91s/it] {'loss': 0.1546, 'grad_norm': 0.7718272057497901, 'learning_rate': 3.667727521626521e-06, 'epoch': 0.6} 60%|█████▉ | 20508/34278 [22:36:11<14:56:39, 3.91s/it] 60%|█████▉ | 20509/34278 [22:36:16<15:35:40, 4.08s/it] {'loss': 0.1322, 'grad_norm': 0.821713021135635, 'learning_rate': 3.667272173094063e-06, 'epoch': 0.6} 60%|█████▉ | 20509/34278 [22:36:16<15:35:40, 4.08s/it] 60%|█████▉ | 20510/34278 [22:36:19<14:35:04, 3.81s/it] {'loss': 0.1062, 'grad_norm': 0.8207207203971407, 'learning_rate': 3.666816836459881e-06, 'epoch': 0.6} 60%|█████▉ | 20510/34278 [22:36:19<14:35:04, 3.81s/it] 60%|█████▉ | 20511/34278 [22:36:24<15:58:58, 4.18s/it] {'loss': 0.1352, 'grad_norm': 0.8450759428460285, 'learning_rate': 3.6663615117280405e-06, 'epoch': 0.6} 60%|█████▉ | 20511/34278 [22:36:24<15:58:58, 4.18s/it] 60%|█████▉ | 20512/34278 [22:36:28<15:22:09, 4.02s/it] {'loss': 0.1138, 'grad_norm': 0.7236166763696078, 'learning_rate': 
3.6659061989026057e-06, 'epoch': 0.6} 60%|█████▉ | 20512/34278 [22:36:28<15:22:09, 4.02s/it] 60%|█████▉ | 20513/34278 [22:36:30<14:04:50, 3.68s/it] {'loss': 0.1234, 'grad_norm': 0.708566754879243, 'learning_rate': 3.6654508979876433e-06, 'epoch': 0.6} 60%|█████▉ | 20513/34278 [22:36:30<14:04:50, 3.68s/it] 60%|█████▉ | 20514/34278 [22:36:33<13:16:04, 3.47s/it] {'loss': 0.1172, 'grad_norm': 0.8334590853831855, 'learning_rate': 3.6649956089872163e-06, 'epoch': 0.6} 60%|█████▉ | 20514/34278 [22:36:33<13:16:04, 3.47s/it] 60%|█████▉ | 20515/34278 [22:36:39<16:07:43, 4.22s/it] {'loss': 0.1459, 'grad_norm': 0.9573498502570469, 'learning_rate': 3.6645403319053885e-06, 'epoch': 0.6} 60%|█████▉ | 20515/34278 [22:36:39<16:07:43, 4.22s/it] 60%|█████▉ | 20516/34278 [22:36:42<14:39:10, 3.83s/it] {'loss': 0.1428, 'grad_norm': 0.9034634695812083, 'learning_rate': 3.664085066746226e-06, 'epoch': 0.6} 60%|█████▉ | 20516/34278 [22:36:42<14:39:10, 3.83s/it] 60%|█████▉ | 20517/34278 [22:36:46<14:53:06, 3.89s/it] {'loss': 0.1189, 'grad_norm': 1.005677962838466, 'learning_rate': 3.6636298135137945e-06, 'epoch': 0.6} 60%|█████▉ | 20517/34278 [22:36:46<14:53:06, 3.89s/it] 60%|█████▉ | 20518/34278 [22:36:50<14:13:39, 3.72s/it] {'loss': 0.1008, 'grad_norm': 0.8234564040175449, 'learning_rate': 3.663174572212156e-06, 'epoch': 0.6} 60%|█████▉ | 20518/34278 [22:36:50<14:13:39, 3.72s/it] 60%|█████▉ | 20519/34278 [22:36:54<14:59:35, 3.92s/it] {'loss': 0.1269, 'grad_norm': 0.8627707595548206, 'learning_rate': 3.6627193428453755e-06, 'epoch': 0.6} 60%|█████▉ | 20519/34278 [22:36:54<14:59:35, 3.92s/it] 60%|█████▉ | 20520/34278 [22:36:57<13:59:17, 3.66s/it] {'loss': 0.1239, 'grad_norm': 0.9130325309868798, 'learning_rate': 3.6622641254175193e-06, 'epoch': 0.6} 60%|█████▉ | 20520/34278 [22:36:57<13:59:17, 3.66s/it] 60%|█████▉ | 20521/34278 [22:37:01<13:46:12, 3.60s/it] {'loss': 0.1027, 'grad_norm': 1.0175108021003532, 'learning_rate': 3.6618089199326477e-06, 'epoch': 0.6} 60%|█████▉ | 20521/34278 
[22:37:01<13:46:12, 3.60s/it] 60%|█████▉ | 20522/34278 [22:37:04<13:53:41, 3.64s/it] {'loss': 0.122, 'grad_norm': 1.0057138131731775, 'learning_rate': 3.661353726394826e-06, 'epoch': 0.6} 60%|█████▉ | 20522/34278 [22:37:04<13:53:41, 3.64s/it] 60%|█████▉ | 20523/34278 [22:37:10<16:33:07, 4.33s/it] {'loss': 0.1498, 'grad_norm': 0.9933031439251812, 'learning_rate': 3.6608985448081204e-06, 'epoch': 0.6} 60%|█████▉ | 20523/34278 [22:37:10<16:33:07, 4.33s/it] 60%|█████▉ | 20524/34278 [22:37:14<15:34:39, 4.08s/it] {'loss': 0.112, 'grad_norm': 0.8647899552852462, 'learning_rate': 3.660443375176592e-06, 'epoch': 0.6} 60%|█████▉ | 20524/34278 [22:37:14<15:34:39, 4.08s/it] 60%|█████▉ | 20525/34278 [22:37:18<15:18:01, 4.01s/it] {'loss': 0.1216, 'grad_norm': 0.7840596306832815, 'learning_rate': 3.6599882175043074e-06, 'epoch': 0.6} 60%|█████▉ | 20525/34278 [22:37:18<15:18:01, 4.01s/it] 60%|█████▉ | 20526/34278 [22:37:21<14:45:54, 3.87s/it] {'loss': 0.1341, 'grad_norm': 0.8525807806613944, 'learning_rate': 3.659533071795326e-06, 'epoch': 0.6} 60%|█████▉ | 20526/34278 [22:37:21<14:45:54, 3.87s/it] 60%|█████▉ | 20527/34278 [22:37:25<14:39:38, 3.84s/it] {'loss': 0.1418, 'grad_norm': 0.8781698906984328, 'learning_rate': 3.659077938053714e-06, 'epoch': 0.6} 60%|█████▉ | 20527/34278 [22:37:25<14:39:38, 3.84s/it] 60%|█████▉ | 20528/34278 [22:37:31<17:11:10, 4.50s/it] {'loss': 0.1437, 'grad_norm': 0.9022492130214876, 'learning_rate': 3.6586228162835326e-06, 'epoch': 0.6} 60%|█████▉ | 20528/34278 [22:37:31<17:11:10, 4.50s/it] 60%|█████▉ | 20529/34278 [22:37:35<16:24:17, 4.30s/it] {'loss': 0.1105, 'grad_norm': 0.6684622309029933, 'learning_rate': 3.6581677064888476e-06, 'epoch': 0.6} 60%|█████▉ | 20529/34278 [22:37:35<16:24:17, 4.30s/it] 60%|█████▉ | 20530/34278 [22:37:41<18:09:40, 4.76s/it] {'loss': 0.1027, 'grad_norm': 0.8036002649106786, 'learning_rate': 3.6577126086737225e-06, 'epoch': 0.6} 60%|█████▉ | 20530/34278 [22:37:41<18:09:40, 
4.76s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
60%|█████▉ | 20531/34278 [22:37:46<18:58:55, 4.97s/it] {'loss': 0.1421, 'grad_norm': 1.061772593087311, 'learning_rate': 3.657257522842217e-06, 'epoch': 0.6} 60%|█████▉ | 20531/34278 [22:37:46<18:58:55, 4.97s/it] 60%|█████▉ | 20532/34278 [22:37:50<17:43:05, 4.64s/it] {'loss': 0.1146, 'grad_norm': 0.9983704893430387, 'learning_rate': 3.6568024489983967e-06, 'epoch': 0.6} 60%|█████▉ | 20532/34278 [22:37:50<17:43:05, 4.64s/it] 60%|█████▉ | 20533/34278 [22:37:54<17:24:05, 4.56s/it] {'loss': 0.113, 'grad_norm': 0.9157293895801865, 'learning_rate': 3.6563473871463238e-06, 'epoch': 0.6} 60%|█████▉ | 20533/34278 [22:37:54<17:24:05, 4.56s/it] 60%|█████▉ | 20534/34278 [22:37:58<15:55:13, 4.17s/it] {'loss': 0.1291, 'grad_norm': 0.8104014835120488, 'learning_rate': 3.655892337290058e-06, 'epoch': 0.6} 60%|█████▉ | 20534/34278 [22:37:58<15:55:13, 4.17s/it] 60%|█████▉ | 20535/34278 [22:38:01<15:06:58, 3.96s/it] {'loss': 0.117, 'grad_norm': 0.7866470717924987, 'learning_rate': 3.6554372994336674e-06, 'epoch': 0.6} 60%|█████▉ | 20535/34278 [22:38:01<15:06:58, 3.96s/it] 60%|█████▉ | 20536/34278 [22:38:04<13:58:31, 3.66s/it] {'loss': 0.0957, 'grad_norm': 0.7352142384835508, 'learning_rate': 3.65498227358121e-06, 'epoch': 0.6} 60%|█████▉ | 20536/34278 [22:38:04<13:58:31, 3.66s/it] 60%|█████▉ | 20537/34278 [22:38:10<17:13:03, 4.51s/it] {'loss': 0.1209, 'grad_norm': 0.8191234449972371, 'learning_rate': 3.6545272597367507e-06, 'epoch': 0.6} 60%|█████▉ | 20537/34278 [22:38:10<17:13:03, 4.51s/it] 60%|█████▉ | 20538/34278 [22:38:15<17:05:44, 4.48s/it] {'loss': 0.146, 'grad_norm': 0.9268517631101401, 'learning_rate': 3.654072257904352e-06, 'epoch': 0.6} 60%|█████▉ | 20538/34278 [22:38:15<17:05:44, 4.48s/it] 60%|█████▉ | 20539/34278 [22:38:18<15:15:41, 4.00s/it]
{'loss': 0.1335, 'grad_norm': 0.8001956900343967, 'learning_rate': 3.6536172680880732e-06, 'epoch': 0.6} 60%|█████▉ | 20539/34278 [22:38:18<15:15:41, 4.00s/it] 60%|█████▉ | 20540/34278 [22:38:21<14:16:20, 3.74s/it] {'loss': 0.1082, 'grad_norm': 0.7775463995139918, 'learning_rate': 3.653162290291977e-06, 'epoch': 0.6} 60%|█████▉ | 20540/34278 [22:38:21<14:16:20, 3.74s/it] 60%|█████▉ | 20541/34278 [22:38:27<16:58:43, 4.45s/it] {'loss': 0.0965, 'grad_norm': 0.7418387145631792, 'learning_rate': 3.652707324520127e-06, 'epoch': 0.6} 60%|█████▉ | 20541/34278 [22:38:27<16:58:43, 4.45s/it] 60%|█████▉ | 20542/34278 [22:38:30<15:24:16, 4.04s/it] {'loss': 0.1274, 'grad_norm': 0.8001930971560922, 'learning_rate': 3.6522523707765856e-06, 'epoch': 0.6} 60%|█████▉ | 20542/34278 [22:38:30<15:24:16, 4.04s/it] 60%|█████▉ | 20543/34278 [22:38:36<18:00:45, 4.72s/it] {'loss': 0.122, 'grad_norm': 0.7296196533427095, 'learning_rate': 3.6517974290654136e-06, 'epoch': 0.6} 60%|█████▉ | 20543/34278 [22:38:36<18:00:45, 4.72s/it] 60%|█████▉ | 20544/34278 [22:38:40<16:34:39, 4.35s/it] {'loss': 0.1318, 'grad_norm': 0.8544467595532104, 'learning_rate': 3.6513424993906717e-06, 'epoch': 0.6} 60%|█████▉ | 20544/34278 [22:38:40<16:34:39, 4.35s/it] 60%|█████▉ | 20545/34278 [22:38:43<14:51:17, 3.89s/it] {'loss': 0.146, 'grad_norm': 0.9430329215346632, 'learning_rate': 3.6508875817564214e-06, 'epoch': 0.6} 60%|█████▉ | 20545/34278 [22:38:43<14:51:17, 3.89s/it] 60%|█████▉ | 20546/34278 [22:38:46<14:19:22, 3.75s/it] {'loss': 0.1126, 'grad_norm': 0.6930544020153473, 'learning_rate': 3.6504326761667242e-06, 'epoch': 0.6} 60%|█████▉ | 20546/34278 [22:38:46<14:19:22, 3.75s/it] 60%|█████▉ | 20547/34278 [22:38:51<15:14:40, 4.00s/it] {'loss': 0.1334, 'grad_norm': 0.881379971953356, 'learning_rate': 3.6499777826256434e-06, 'epoch': 0.6} 60%|█████▉ | 20547/34278 [22:38:51<15:14:40, 4.00s/it] 60%|█████▉ | 20548/34278 [22:38:56<16:14:55, 4.26s/it] {'loss': 0.1444, 'grad_norm': 1.0946395253591987, 'learning_rate': 
3.649522901137238e-06, 'epoch': 0.6} 60%|█████▉ | 20548/34278 [22:38:56<16:14:55, 4.26s/it] 60%|█████▉ | 20549/34278 [22:38:59<14:59:36, 3.93s/it] {'loss': 0.1341, 'grad_norm': 0.9284954408650453, 'learning_rate': 3.64906803170557e-06, 'epoch': 0.6} 60%|█████▉ | 20549/34278 [22:38:59<14:59:36, 3.93s/it] 60%|█████▉ | 20550/34278 [22:39:02<14:03:45, 3.69s/it] {'loss': 0.114, 'grad_norm': 0.6966842165615734, 'learning_rate': 3.6486131743347007e-06, 'epoch': 0.6} 60%|█████▉ | 20550/34278 [22:39:02<14:03:45, 3.69s/it] 60%|█████▉ | 20551/34278 [22:39:05<13:28:05, 3.53s/it] {'loss': 0.1153, 'grad_norm': 0.8204866663581304, 'learning_rate': 3.6481583290286894e-06, 'epoch': 0.6} 60%|█████▉ | 20551/34278 [22:39:05<13:28:05, 3.53s/it] 60%|█████▉ | 20552/34278 [22:39:10<14:43:29, 3.86s/it] {'loss': 0.1387, 'grad_norm': 1.21413720074367, 'learning_rate': 3.647703495791597e-06, 'epoch': 0.6} 60%|█████▉ | 20552/34278 [22:39:10<14:43:29, 3.86s/it] 60%|█████▉ | 20553/34278 [22:39:13<14:04:34, 3.69s/it] {'loss': 0.1109, 'grad_norm': 0.8178389200924865, 'learning_rate': 3.647248674627486e-06, 'epoch': 0.6} 60%|█████▉ | 20553/34278 [22:39:13<14:04:34, 3.69s/it] 60%|█████▉ | 20554/34278 [22:39:17<13:59:26, 3.67s/it] {'loss': 0.1168, 'grad_norm': 0.7348694448942786, 'learning_rate': 3.6467938655404155e-06, 'epoch': 0.6} 60%|█████▉ | 20554/34278 [22:39:17<13:59:26, 3.67s/it] 60%|█████▉ | 20555/34278 [22:39:20<13:24:40, 3.52s/it] {'loss': 0.1152, 'grad_norm': 0.7679002210842868, 'learning_rate': 3.646339068534448e-06, 'epoch': 0.6} 60%|█████▉ | 20555/34278 [22:39:20<13:24:40, 3.52s/it] 60%|█████▉ | 20556/34278 [22:39:22<12:33:47, 3.30s/it] {'loss': 0.122, 'grad_norm': 1.3187101764265343, 'learning_rate': 3.645884283613641e-06, 'epoch': 0.6} 60%|█████▉ | 20556/34278 [22:39:22<12:33:47, 3.30s/it] 60%|█████▉ | 20557/34278 [22:39:26<12:24:36, 3.26s/it] {'loss': 0.1006, 'grad_norm': 0.9132659570266782, 'learning_rate': 3.6454295107820557e-06, 'epoch': 0.6} 60%|█████▉ | 20557/34278 
[22:39:26<12:24:36, 3.26s/it] 60%|█████▉ | 20558/34278 [22:39:29<12:53:04, 3.38s/it] {'loss': 0.1206, 'grad_norm': 0.7024967893104602, 'learning_rate': 3.6449747500437517e-06, 'epoch': 0.6} 60%|█████▉ | 20558/34278 [22:39:29<12:53:04, 3.38s/it] 60%|█████▉ | 20559/34278 [22:39:32<12:22:29, 3.25s/it] {'loss': 0.1084, 'grad_norm': 0.7835023542497151, 'learning_rate': 3.64452000140279e-06, 'epoch': 0.6} 60%|█████▉ | 20559/34278 [22:39:32<12:22:29, 3.25s/it] 60%|█████▉ | 20560/34278 [22:39:36<12:33:30, 3.30s/it] {'loss': 0.1001, 'grad_norm': 0.891525483631285, 'learning_rate': 3.6440652648632314e-06, 'epoch': 0.6} 60%|█████▉ | 20560/34278 [22:39:36<12:33:30, 3.30s/it] 60%|█████▉ | 20561/34278 [22:39:39<12:27:44, 3.27s/it] {'loss': 0.1267, 'grad_norm': 1.001478933931435, 'learning_rate': 3.6436105404291334e-06, 'epoch': 0.6} 60%|█████▉ | 20561/34278 [22:39:39<12:27:44, 3.27s/it] 60%|█████▉ | 20562/34278 [22:39:42<12:34:20, 3.30s/it] {'loss': 0.1179, 'grad_norm': 0.8148258392322149, 'learning_rate': 3.643155828104557e-06, 'epoch': 0.6} 60%|█████▉ | 20562/34278 [22:39:42<12:34:20, 3.30s/it] 60%|█████▉ | 20563/34278 [22:39:46<12:40:27, 3.33s/it] {'loss': 0.1093, 'grad_norm': 0.9157434882122585, 'learning_rate': 3.642701127893562e-06, 'epoch': 0.6} 60%|█████▉ | 20563/34278 [22:39:46<12:40:27, 3.33s/it] 60%|█████▉ | 20564/34278 [22:39:49<13:01:43, 3.42s/it] {'loss': 0.1163, 'grad_norm': 0.8773445146079626, 'learning_rate': 3.6422464398002044e-06, 'epoch': 0.6} 60%|█████▉ | 20564/34278 [22:39:49<13:01:43, 3.42s/it] 60%|█████▉ | 20565/34278 [22:39:53<13:52:20, 3.64s/it] {'loss': 0.1216, 'grad_norm': 0.8994101692166602, 'learning_rate': 3.6417917638285497e-06, 'epoch': 0.6} 60%|█████▉ | 20565/34278 [22:39:53<13:52:20, 3.64s/it] 60%|█████▉ | 20566/34278 [22:39:57<13:25:22, 3.52s/it] {'loss': 0.1395, 'grad_norm': 0.816099564080936, 'learning_rate': 3.641337099982653e-06, 'epoch': 0.6} 60%|█████▉ | 20566/34278 [22:39:57<13:25:22, 3.52s/it] 60%|██████ | 20567/34278 
[22:40:00<12:41:34, 3.33s/it] {'loss': 0.1194, 'grad_norm': 0.8183404442908826, 'learning_rate': 3.6408824482665744e-06, 'epoch': 0.6} 60%|██████ | 20567/34278 [22:40:00<12:41:34, 3.33s/it] 60%|██████ | 20568/34278 [22:40:02<12:08:26, 3.19s/it] {'loss': 0.1339, 'grad_norm': 1.1391397282549864, 'learning_rate': 3.640427808684374e-06, 'epoch': 0.6} 60%|██████ | 20568/34278 [22:40:02<12:08:26, 3.19s/it] 60%|██████ | 20569/34278 [22:40:06<12:04:31, 3.17s/it] {'loss': 0.1085, 'grad_norm': 0.829170338175808, 'learning_rate': 3.639973181240108e-06, 'epoch': 0.6} 60%|██████ | 20569/34278 [22:40:06<12:04:31, 3.17s/it] 60%|██████ | 20570/34278 [22:40:11<15:07:04, 3.97s/it] {'loss': 0.1108, 'grad_norm': 0.7001107993042314, 'learning_rate': 3.6395185659378357e-06, 'epoch': 0.6} 60%|██████ | 20570/34278 [22:40:11<15:07:04, 3.97s/it] 60%|██████ | 20571/34278 [22:40:15<14:57:16, 3.93s/it] {'loss': 0.1252, 'grad_norm': 0.9894333483873435, 'learning_rate': 3.6390639627816182e-06, 'epoch': 0.6} 60%|██████ | 20571/34278 [22:40:15<14:57:16, 3.93s/it] 60%|██████ | 20572/34278 [22:40:21<17:21:24, 4.56s/it] {'loss': 0.1192, 'grad_norm': 0.6801080708384538, 'learning_rate': 3.638609371775512e-06, 'epoch': 0.6} 60%|██████ | 20572/34278 [22:40:21<17:21:24, 4.56s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
60%|██████ | 20573/34278 [22:40:25<16:12:17, 4.26s/it] {'loss': 0.1328, 'grad_norm': 0.8665406353231355, 'learning_rate': 3.638154792923578e-06, 'epoch': 0.6} 60%|██████ | 20573/34278 [22:40:25<16:12:17, 4.26s/it] 60%|██████ | 20574/34278 [22:40:31<18:12:16, 4.78s/it] {'loss': 0.1121, 'grad_norm': 0.7497171878377388, 'learning_rate': 3.6377002262298726e-06, 'epoch': 0.6} 60%|██████ | 20574/34278 [22:40:31<18:12:16, 4.78s/it] 60%|██████ | 20575/34278 [22:40:34<16:21:41, 4.30s/it] {'loss': 0.1441, 'grad_norm': 0.8700510828837275, 'learning_rate': 3.637245671698454e-06, 'epoch': 0.6} 60%|██████ | 20575/34278 [22:40:34<16:21:41, 4.30s/it] 60%|██████ | 20576/34278 [22:40:37<14:47:08, 3.88s/it] {'loss': 0.135, 'grad_norm': 0.8163394810681037, 'learning_rate': 3.636791129333379e-06, 'epoch': 0.6} 60%|██████ | 20576/34278 [22:40:37<14:47:08, 3.88s/it] 60%|██████ | 20577/34278 [22:40:41<15:10:31, 3.99s/it] {'loss': 0.1371, 'grad_norm': 0.8931486550626689, 'learning_rate': 3.6363365991387102e-06, 'epoch': 0.6} 60%|██████ | 20577/34278 [22:40:41<15:10:31, 3.99s/it] 60%|██████ | 20578/34278 [22:40:47<17:10:21, 4.51s/it] {'loss': 0.1207, 'grad_norm': 0.8836632685717212, 'learning_rate': 3.6358820811185015e-06, 'epoch': 0.6} 60%|██████ | 20578/34278 [22:40:47<17:10:21, 4.51s/it] 60%|██████ | 20579/34278 [22:40:50<15:41:16, 4.12s/it] {'loss': 0.1073, 'grad_norm': 0.6778682316577336, 'learning_rate': 3.6354275752768114e-06, 'epoch': 0.6} 60%|██████ | 20579/34278 [22:40:50<15:41:16, 4.12s/it] 60%|██████ | 20580/34278 [22:40:54<14:55:55, 3.92s/it] {'loss': 0.1326, 'grad_norm': 0.7449842815807373, 'learning_rate': 3.6349730816176996e-06, 'epoch': 0.6} 60%|██████ | 20580/34278 [22:40:54<14:55:55, 3.92s/it] 60%|██████ | 20581/34278 [22:40:59<16:23:32, 4.31s/it] {'loss': 0.1359, 'grad_norm': 0.9601133893833005, 'learning_rate': 3.6345186001452215e-06, 'epoch': 0.6} 60%|██████ | 20581/34278 [22:40:59<16:23:32, 4.31s/it] 60%|██████ | 20582/34278
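[Editor's note] The UserWarning repeated throughout this log (`torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None`) fires when gradient checkpointing recomputes a segment whose input tensors are all detached from autograd. The sketch below shows the warning-free setup; it assumes the input tensor is made to require grad and uses the non-reentrant checkpoint path, and it is an illustration, not the repo's trainer code.

```python
import torch
from torch.utils.checkpoint import checkpoint

# Stand-in for one checkpointed transformer block.
layer = torch.nn.Linear(4, 4)

# With requires_grad=False here, the reentrant checkpoint would emit the
# warning seen in this log and the segment's input gradients would be None.
x = torch.randn(2, 4, requires_grad=True)

out = checkpoint(layer, x, use_reentrant=False)  # non-reentrant path
out.sum().backward()  # parameter and input gradients both populated
```

With Hugging Face models the usual way to get this effect when embeddings are frozen is `model.enable_input_require_grads()`, which hooks the embedding output to require grad; whether that applies here depends on the training script, which this log does not show.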
[22:41:02<14:44:35, 3.88s/it] {'loss': 0.1064, 'grad_norm': 0.7274108088868823, 'learning_rate': 3.634064130863434e-06, 'epoch': 0.6} 60%|██████ | 20582/34278 [22:41:02<14:44:35, 3.88s/it] 60%|██████ | 20583/34278 [22:41:05<14:12:57, 3.74s/it] {'loss': 0.1441, 'grad_norm': 0.7576625059967385, 'learning_rate': 3.6336096737763964e-06, 'epoch': 0.6} 60%|██████ | 20583/34278 [22:41:05<14:12:57, 3.74s/it] 60%|██████ | 20584/34278 [22:41:08<13:25:11, 3.53s/it] {'loss': 0.1605, 'grad_norm': 0.9506553708409862, 'learning_rate': 3.633155228888166e-06, 'epoch': 0.6} 60%|██████ | 20584/34278 [22:41:08<13:25:11, 3.53s/it] 60%|██████ | 20585/34278 [22:41:12<13:53:15, 3.65s/it] {'loss': 0.1532, 'grad_norm': 1.045133038241358, 'learning_rate': 3.6327007962028003e-06, 'epoch': 0.6} 60%|██████ | 20585/34278 [22:41:12<13:53:15, 3.65s/it] 60%|██████ | 20586/34278 [22:41:15<13:11:29, 3.47s/it] {'loss': 0.1353, 'grad_norm': 0.8065056763539316, 'learning_rate': 3.6322463757243554e-06, 'epoch': 0.6} 60%|██████ | 20586/34278 [22:41:15<13:11:29, 3.47s/it] 60%|██████ | 20587/34278 [22:41:21<16:07:03, 4.24s/it] {'loss': 0.1425, 'grad_norm': 0.7090430509265934, 'learning_rate': 3.631791967456887e-06, 'epoch': 0.6} 60%|██████ | 20587/34278 [22:41:21<16:07:03, 4.24s/it] 60%|██████ | 20588/34278 [22:41:25<15:30:52, 4.08s/it] {'loss': 0.1194, 'grad_norm': 0.9132580952968826, 'learning_rate': 3.631337571404453e-06, 'epoch': 0.6} 60%|██████ | 20588/34278 [22:41:25<15:30:52, 4.08s/it] 60%|██████ | 20589/34278 [22:41:29<15:48:28, 4.16s/it] {'loss': 0.1243, 'grad_norm': 0.9152231027887684, 'learning_rate': 3.6308831875711115e-06, 'epoch': 0.6} 60%|██████ | 20589/34278 [22:41:29<15:48:28, 4.16s/it] 60%|██████ | 20590/34278 [22:41:35<17:34:25, 4.62s/it] {'loss': 0.1263, 'grad_norm': 1.219974233729985, 'learning_rate': 3.6304288159609187e-06, 'epoch': 0.6} 60%|██████ | 20590/34278 [22:41:35<17:34:25, 4.62s/it] 60%|██████ | 20591/34278 [22:41:39<16:51:05, 4.43s/it] {'loss': 0.1174, 'grad_norm': 
0.7423868034816976, 'learning_rate': 3.6299744565779294e-06, 'epoch': 0.6} 60%|██████ | 20591/34278 [22:41:39<16:51:05, 4.43s/it] 60%|██████ | 20592/34278 [22:41:43<16:26:22, 4.32s/it] {'loss': 0.1253, 'grad_norm': 0.9292507963557238, 'learning_rate': 3.6295201094262013e-06, 'epoch': 0.6} 60%|██████ | 20592/34278 [22:41:43<16:26:22, 4.32s/it] 60%|██████ | 20593/34278 [22:41:47<16:30:20, 4.34s/it] {'loss': 0.1371, 'grad_norm': 1.0896664292755547, 'learning_rate': 3.6290657745097917e-06, 'epoch': 0.6} 60%|██████ | 20593/34278 [22:41:47<16:30:20, 4.34s/it] 60%|██████ | 20594/34278 [22:41:51<15:17:17, 4.02s/it] {'loss': 0.1339, 'grad_norm': 0.808847616529553, 'learning_rate': 3.628611451832752e-06, 'epoch': 0.6} 60%|██████ | 20594/34278 [22:41:51<15:17:17, 4.02s/it] 60%|██████ | 20595/34278 [22:41:54<14:07:40, 3.72s/it] {'loss': 0.1236, 'grad_norm': 0.7392848472867324, 'learning_rate': 3.6281571413991458e-06, 'epoch': 0.6} 60%|██████ | 20595/34278 [22:41:54<14:07:40, 3.72s/it] 60%|██████ | 20596/34278 [22:41:57<13:25:45, 3.53s/it] {'loss': 0.111, 'grad_norm': 0.9715254412626044, 'learning_rate': 3.6277028432130235e-06, 'epoch': 0.6} 60%|██████ | 20596/34278 [22:41:57<13:25:45, 3.53s/it] 60%|██████ | 20597/34278 [22:42:01<14:11:56, 3.74s/it] {'loss': 0.1095, 'grad_norm': 0.9161313724467532, 'learning_rate': 3.6272485572784426e-06, 'epoch': 0.6} 60%|██████ | 20597/34278 [22:42:01<14:11:56, 3.74s/it] 60%|██████ | 20598/34278 [22:42:04<13:32:23, 3.56s/it] {'loss': 0.1086, 'grad_norm': 1.1380091667837064, 'learning_rate': 3.6267942835994607e-06, 'epoch': 0.6} 60%|██████ | 20598/34278 [22:42:04<13:32:23, 3.56s/it] 60%|██████ | 20599/34278 [22:42:09<14:59:59, 3.95s/it] {'loss': 0.1217, 'grad_norm': 0.7336744295507781, 'learning_rate': 3.6263400221801292e-06, 'epoch': 0.6} 60%|██████ | 20599/34278 [22:42:09<14:59:59, 3.95s/it] 60%|██████ | 20600/34278 [22:42:12<13:46:18, 3.62s/it] {'loss': 0.1268, 'grad_norm': 0.7615605502921807, 'learning_rate': 3.625885773024506e-06, 
'epoch': 0.6} 60%|██████ | 20600/34278 [22:42:12<13:46:18, 3.62s/it] 60%|██████ | 20601/34278 [22:42:15<13:04:58, 3.44s/it] {'loss': 0.1173, 'grad_norm': 0.7294107060867674, 'learning_rate': 3.6254315361366477e-06, 'epoch': 0.6} 60%|██████ | 20601/34278 [22:42:15<13:04:58, 3.44s/it] 60%|██████ | 20602/34278 [22:42:18<13:05:17, 3.45s/it] {'loss': 0.1162, 'grad_norm': 0.6880247447368646, 'learning_rate': 3.6249773115206085e-06, 'epoch': 0.6} 60%|██████ | 20602/34278 [22:42:18<13:05:17, 3.45s/it] 60%|██████ | 20603/34278 [22:42:21<12:40:13, 3.34s/it] {'loss': 0.11, 'grad_norm': 0.8659064709018958, 'learning_rate': 3.624523099180444e-06, 'epoch': 0.6} 60%|██████ | 20603/34278 [22:42:21<12:40:13, 3.34s/it] 60%|██████ | 20604/34278 [22:42:24<12:18:21, 3.24s/it] {'loss': 0.1214, 'grad_norm': 0.9030016504160433, 'learning_rate': 3.6240688991202085e-06, 'epoch': 0.6} 60%|██████ | 20604/34278 [22:42:24<12:18:21, 3.24s/it] 60%|██████ | 20605/34278 [22:42:27<12:16:04, 3.23s/it] {'loss': 0.1261, 'grad_norm': 0.6965923949101781, 'learning_rate': 3.623614711343957e-06, 'epoch': 0.6} 60%|██████ | 20605/34278 [22:42:28<12:16:04, 3.23s/it] 60%|██████ | 20606/34278 [22:42:30<11:55:42, 3.14s/it] {'loss': 0.1231, 'grad_norm': 0.8626239518223667, 'learning_rate': 3.6231605358557442e-06, 'epoch': 0.6} 60%|██████ | 20606/34278 [22:42:30<11:55:42, 3.14s/it] 60%|██████ | 20607/34278 [22:42:33<11:26:03, 3.01s/it] {'loss': 0.1196, 'grad_norm': 1.1416879445210433, 'learning_rate': 3.622706372659627e-06, 'epoch': 0.6} 60%|██████ | 20607/34278 [22:42:33<11:26:03, 3.01s/it] 60%|██████ | 20608/34278 [22:42:36<11:47:06, 3.10s/it] {'loss': 0.1346, 'grad_norm': 0.7884215630810347, 'learning_rate': 3.622252221759658e-06, 'epoch': 0.6} 60%|██████ | 20608/34278 [22:42:36<11:47:06, 3.10s/it] 60%|██████ | 20609/34278 [22:42:39<11:35:40, 3.05s/it] {'loss': 0.1181, 'grad_norm': 0.794585350162656, 'learning_rate': 3.621798083159892e-06, 'epoch': 0.6} 60%|██████ | 20609/34278 [22:42:39<11:35:40, 3.05s/it] 
60%|██████ | 20610/34278 [22:42:43<11:52:15, 3.13s/it] {'loss': 0.1111, 'grad_norm': 0.9790855397077731, 'learning_rate': 3.621343956864385e-06, 'epoch': 0.6} 60%|██████ | 20610/34278 [22:42:43<11:52:15, 3.13s/it] 60%|██████ | 20611/34278 [22:42:46<11:49:04, 3.11s/it] {'loss': 0.1342, 'grad_norm': 0.7805984597896618, 'learning_rate': 3.6208898428771887e-06, 'epoch': 0.6} 60%|██████ | 20611/34278 [22:42:46<11:49:04, 3.11s/it] 60%|██████ | 20612/34278 [22:42:52<15:03:59, 3.97s/it] {'loss': 0.1415, 'grad_norm': 1.0142421179477181, 'learning_rate': 3.620435741202357e-06, 'epoch': 0.6} 60%|██████ | 20612/34278 [22:42:52<15:03:59, 3.97s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 60%|██████ | 20613/34278 [22:42:55<13:49:26, 3.64s/it] {'loss': 0.1566, 'grad_norm': 1.05410386310876, 'learning_rate': 3.6199816518439477e-06, 'epoch': 0.6} 60%|██████ | 20613/34278 [22:42:55<13:49:26, 3.64s/it] 60%|██████ | 20614/34278 [22:43:00<15:59:43, 4.21s/it] {'loss': 0.1252, 'grad_norm': 1.1189031920338837, 'learning_rate': 3.6195275748060125e-06, 'epoch': 0.6} 60%|██████ | 20614/34278 [22:43:00<15:59:43, 4.21s/it] 60%|██████ | 20615/34278 [22:43:04<15:41:45, 4.14s/it] {'loss': 0.1434, 'grad_norm': 1.0135170322610298, 'learning_rate': 3.6190735100926066e-06, 'epoch': 0.6} 60%|██████ | 20615/34278 [22:43:04<15:41:45, 4.14s/it] 60%|██████ | 20616/34278 [22:43:10<17:57:01, 4.73s/it] {'loss': 0.1402, 'grad_norm': 1.0868049012424301, 'learning_rate': 3.6186194577077817e-06, 'epoch': 0.6} 60%|██████ | 20616/34278 [22:43:10<17:57:01, 4.73s/it] 60%|██████ | 20617/34278 [22:43:14<16:21:03, 4.31s/it] {'loss': 0.1119, 'grad_norm': 0.8598214788839132, 'learning_rate': 3.6181654176555927e-06, 'epoch': 0.6} 60%|██████ | 20617/34278 [22:43:14<16:21:03, 4.31s/it] 60%|██████ | 20618/34278 [22:43:17<15:25:16, 4.06s/it] 
{'loss': 0.1075, 'grad_norm': 0.7153551213611583, 'learning_rate': 3.6177113899400916e-06, 'epoch': 0.6} 60%|██████ | 20618/34278 [22:43:17<15:25:16, 4.06s/it] 60%|██████ | 20619/34278 [22:43:24<18:23:35, 4.85s/it] {'loss': 0.1181, 'grad_norm': 0.7031827095581438, 'learning_rate': 3.617257374565335e-06, 'epoch': 0.6} 60%|██████ | 20619/34278 [22:43:24<18:23:35, 4.85s/it] 60%|██████ | 20620/34278 [22:43:29<18:34:50, 4.90s/it] {'loss': 0.1282, 'grad_norm': 1.2676887347710046, 'learning_rate': 3.6168033715353747e-06, 'epoch': 0.6} 60%|██████ | 20620/34278 [22:43:29<18:34:50, 4.90s/it] 60%|██████ | 20621/34278 [22:43:32<17:13:04, 4.54s/it] {'loss': 0.1189, 'grad_norm': 0.9495079511916384, 'learning_rate': 3.6163493808542628e-06, 'epoch': 0.6} 60%|██████ | 20621/34278 [22:43:32<17:13:04, 4.54s/it] 60%|██████ | 20622/34278 [22:43:37<16:47:39, 4.43s/it] {'loss': 0.1209, 'grad_norm': 0.7168617426368551, 'learning_rate': 3.6158954025260532e-06, 'epoch': 0.6} 60%|██████ | 20622/34278 [22:43:37<16:47:39, 4.43s/it] 60%|██████ | 20623/34278 [22:43:40<15:34:44, 4.11s/it] {'loss': 0.1355, 'grad_norm': 0.6744299998705775, 'learning_rate': 3.6154414365548008e-06, 'epoch': 0.6} 60%|██████ | 20623/34278 [22:43:40<15:34:44, 4.11s/it] 60%|██████ | 20624/34278 [22:43:43<14:15:34, 3.76s/it] {'loss': 0.1359, 'grad_norm': 1.3458883116510916, 'learning_rate': 3.614987482944553e-06, 'epoch': 0.6} 60%|██████ | 20624/34278 [22:43:43<14:15:34, 3.76s/it] 60%|██████ | 20625/34278 [22:43:46<13:28:49, 3.55s/it] {'loss': 0.13, 'grad_norm': 1.1403406960405245, 'learning_rate': 3.61453354169937e-06, 'epoch': 0.6} 60%|██████ | 20625/34278 [22:43:46<13:28:49, 3.55s/it] 60%|██████ | 20626/34278 [22:43:50<13:30:57, 3.56s/it] {'loss': 0.1182, 'grad_norm': 0.8954812012347784, 'learning_rate': 3.614079612823299e-06, 'epoch': 0.6} 60%|██████ | 20626/34278 [22:43:50<13:30:57, 3.56s/it] 60%|██████ | 20627/34278 [22:43:54<14:33:07, 3.84s/it] {'loss': 0.14, 'grad_norm': 0.755592231970031, 'learning_rate': 
3.613625696320394e-06, 'epoch': 0.6} 60%|██████ | 20627/34278 [22:43:54<14:33:07, 3.84s/it] 60%|██████ | 20628/34278 [22:43:58<14:19:51, 3.78s/it] {'loss': 0.1177, 'grad_norm': 0.6368978214706259, 'learning_rate': 3.61317179219471e-06, 'epoch': 0.6} 60%|██████ | 20628/34278 [22:43:58<14:19:51, 3.78s/it] 60%|██████ | 20629/34278 [22:44:02<15:03:45, 3.97s/it] {'loss': 0.161, 'grad_norm': 0.8581482083463947, 'learning_rate': 3.6127179004502953e-06, 'epoch': 0.6} 60%|██████ | 20629/34278 [22:44:02<15:03:45, 3.97s/it] 60%|██████ | 20630/34278 [22:44:05<14:13:44, 3.75s/it] {'loss': 0.141, 'grad_norm': 1.0527574505587558, 'learning_rate': 3.6122640210912042e-06, 'epoch': 0.6} 60%|██████ | 20630/34278 [22:44:05<14:13:44, 3.75s/it] 60%|██████ | 20631/34278 [22:44:09<13:46:13, 3.63s/it] {'loss': 0.1406, 'grad_norm': 0.9176333557272351, 'learning_rate': 3.6118101541214887e-06, 'epoch': 0.6} 60%|██████ | 20631/34278 [22:44:09<13:46:13, 3.63s/it] 60%|██████ | 20632/34278 [22:44:12<13:25:41, 3.54s/it] {'loss': 0.109, 'grad_norm': 0.8515930027778424, 'learning_rate': 3.611356299545201e-06, 'epoch': 0.6} 60%|██████ | 20632/34278 [22:44:12<13:25:41, 3.54s/it] 60%|██████ | 20633/34278 [22:44:18<16:06:22, 4.25s/it] {'loss': 0.1287, 'grad_norm': 0.7663183498739945, 'learning_rate': 3.6109024573663938e-06, 'epoch': 0.6} 60%|██████ | 20633/34278 [22:44:18<16:06:22, 4.25s/it] 60%|██████ | 20634/34278 [22:44:22<15:43:59, 4.15s/it] {'loss': 0.1398, 'grad_norm': 0.7971329176308535, 'learning_rate': 3.6104486275891166e-06, 'epoch': 0.6} 60%|██████ | 20634/34278 [22:44:22<15:43:59, 4.15s/it] 60%|██████ | 20635/34278 [22:44:25<14:44:27, 3.89s/it] {'loss': 0.127, 'grad_norm': 0.847177107999471, 'learning_rate': 3.609994810217422e-06, 'epoch': 0.6} 60%|██████ | 20635/34278 [22:44:25<14:44:27, 3.89s/it] 60%|██████ | 20636/34278 [22:44:31<17:02:49, 4.50s/it] {'loss': 0.1334, 'grad_norm': 0.8145235229946574, 'learning_rate': 3.6095410052553613e-06, 'epoch': 0.6} 60%|██████ | 20636/34278 
[22:44:31<17:02:49, 4.50s/it] 60%|██████ | 20637/34278 [22:44:34<15:18:46, 4.04s/it] {'loss': 0.142, 'grad_norm': 0.9353061638939351, 'learning_rate': 3.609087212706989e-06, 'epoch': 0.6} 60%|██████ | 20637/34278 [22:44:34<15:18:46, 4.04s/it] 60%|██████ | 20638/34278 [22:44:37<13:51:09, 3.66s/it] {'loss': 0.1181, 'grad_norm': 0.8241346742955289, 'learning_rate': 3.6086334325763528e-06, 'epoch': 0.6} 60%|██████ | 20638/34278 [22:44:37<13:51:09, 3.66s/it] 60%|██████ | 20639/34278 [22:44:40<13:42:51, 3.62s/it] {'loss': 0.1112, 'grad_norm': 1.517520638230361, 'learning_rate': 3.608179664867505e-06, 'epoch': 0.6} 60%|██████ | 20639/34278 [22:44:40<13:42:51, 3.62s/it] 60%|██████ | 20640/34278 [22:44:45<14:39:00, 3.87s/it] {'loss': 0.1191, 'grad_norm': 0.8251323222826907, 'learning_rate': 3.607725909584498e-06, 'epoch': 0.6} 60%|██████ | 20640/34278 [22:44:45<14:39:00, 3.87s/it] 60%|██████ | 20641/34278 [22:44:48<14:12:37, 3.75s/it] {'loss': 0.1348, 'grad_norm': 0.8794529914077927, 'learning_rate': 3.6072721667313806e-06, 'epoch': 0.6} 60%|██████ | 20641/34278 [22:44:48<14:12:37, 3.75s/it] 60%|██████ | 20642/34278 [22:44:52<14:01:19, 3.70s/it] {'loss': 0.1301, 'grad_norm': 0.8467028458750899, 'learning_rate': 3.606818436312204e-06, 'epoch': 0.6} 60%|██████ | 20642/34278 [22:44:52<14:01:19, 3.70s/it] 60%|██████ | 20643/34278 [22:44:55<13:18:04, 3.51s/it] {'loss': 0.1255, 'grad_norm': 0.8023789917132966, 'learning_rate': 3.606364718331021e-06, 'epoch': 0.6} 60%|██████ | 20643/34278 [22:44:55<13:18:04, 3.51s/it] 60%|██████ | 20644/34278 [22:45:01<16:03:26, 4.24s/it] {'loss': 0.1549, 'grad_norm': 0.9670951006226213, 'learning_rate': 3.6059110127918807e-06, 'epoch': 0.6} 60%|██████ | 20644/34278 [22:45:01<16:03:26, 4.24s/it] 60%|██████ | 20645/34278 [22:45:04<14:47:08, 3.90s/it] {'loss': 0.1594, 'grad_norm': 0.8424118195840142, 'learning_rate': 3.605457319698835e-06, 'epoch': 0.6} 60%|██████ | 20645/34278 [22:45:04<14:47:08, 3.90s/it] 60%|██████ | 20646/34278 
[22:45:08<15:26:22, 4.08s/it] {'loss': 0.1099, 'grad_norm': 0.830922376239889, 'learning_rate': 3.605003639055933e-06, 'epoch': 0.6} 60%|██████ | 20646/34278 [22:45:08<15:26:22, 4.08s/it] 60%|██████ | 20647/34278 [22:45:12<14:45:30, 3.90s/it] {'loss': 0.1034, 'grad_norm': 0.7870205949639671, 'learning_rate': 3.604549970867225e-06, 'epoch': 0.6} 60%|██████ | 20647/34278 [22:45:12<14:45:30, 3.90s/it] 60%|██████ | 20648/34278 [22:45:15<13:48:18, 3.65s/it] {'loss': 0.1329, 'grad_norm': 0.9793539706991589, 'learning_rate': 3.604096315136761e-06, 'epoch': 0.6} 60%|██████ | 20648/34278 [22:45:15<13:48:18, 3.65s/it] 60%|██████ | 20649/34278 [22:45:18<13:26:17, 3.55s/it] {'loss': 0.1444, 'grad_norm': 0.7572672185471363, 'learning_rate': 3.6036426718685925e-06, 'epoch': 0.6} 60%|██████ | 20649/34278 [22:45:18<13:26:17, 3.55s/it] 60%|██████ | 20650/34278 [22:45:24<16:10:19, 4.27s/it] {'loss': 0.116, 'grad_norm': 0.7233964453635535, 'learning_rate': 3.6031890410667704e-06, 'epoch': 0.6} 60%|██████ | 20650/34278 [22:45:24<16:10:19, 4.27s/it] 60%|██████ | 20651/34278 [22:45:27<14:51:10, 3.92s/it] {'loss': 0.1433, 'grad_norm': 1.1149264391767628, 'learning_rate': 3.6027354227353417e-06, 'epoch': 0.6} 60%|██████ | 20651/34278 [22:45:27<14:51:10, 3.92s/it] 60%|██████ | 20652/34278 [22:45:32<15:28:21, 4.09s/it] {'loss': 0.1209, 'grad_norm': 1.0212343659979026, 'learning_rate': 3.602281816878358e-06, 'epoch': 0.6} 60%|██████ | 20652/34278 [22:45:32<15:28:21, 4.09s/it] 60%|██████ | 20653/34278 [22:45:35<13:50:36, 3.66s/it] {'loss': 0.1236, 'grad_norm': 1.2270577799485092, 'learning_rate': 3.6018282234998693e-06, 'epoch': 0.6} 60%|██████ | 20653/34278 [22:45:35<13:50:36, 3.66s/it] 60%|██████ | 20654/34278 [22:45:38<13:58:14, 3.69s/it] {'loss': 0.1274, 'grad_norm': 0.8697683972920066, 'learning_rate': 3.601374642603921e-06, 'epoch': 0.6} 60%|██████ | 20654/34278 [22:45:38<13:58:14, 3.69s/it] 60%|██████ | 20655/34278 [22:45:42<13:33:53, 3.58s/it] {'loss': 0.1233, 'grad_norm': 
0.932449916295231, 'learning_rate': 3.60092107419457e-06, 'epoch': 0.6} 60%|██████ | 20655/34278 [22:45:42<13:33:53, 3.58s/it] 60%|██████ | 20656/34278 [22:45:45<13:13:04, 3.49s/it] {'loss': 0.1262, 'grad_norm': 1.1722804030568308, 'learning_rate': 3.6004675182758598e-06, 'epoch': 0.6} 60%|██████ | 20656/34278 [22:45:45<13:13:04, 3.49s/it] 60%|██████ | 20657/34278 [22:45:50<15:09:26, 4.01s/it] {'loss': 0.109, 'grad_norm': 1.1566340777759807, 'learning_rate': 3.600013974851842e-06, 'epoch': 0.6} 60%|██████ | 20657/34278 [22:45:50<15:09:26, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 60%|██████ | 20658/34278 [22:45:53<13:57:00, 3.69s/it] {'loss': 0.1371, 'grad_norm': 0.8125755871923773, 'learning_rate': 3.5995604439265664e-06, 'epoch': 0.6} 60%|██████ | 20658/34278 [22:45:53<13:57:00, 3.69s/it] 60%|██████ | 20659/34278 [22:45:56<13:36:05, 3.60s/it] {'loss': 0.1079, 'grad_norm': 1.351873782568918, 'learning_rate': 3.599106925504079e-06, 'epoch': 0.6} 60%|██████ | 20659/34278 [22:45:56<13:36:05, 3.60s/it] 60%|██████ | 20660/34278 [22:45:59<12:35:56, 3.33s/it] {'loss': 0.1273, 'grad_norm': 1.0619854616517461, 'learning_rate': 3.5986534195884305e-06, 'epoch': 0.6} 60%|██████ | 20660/34278 [22:45:59<12:35:56, 3.33s/it] 60%|██████ | 20661/34278 [22:46:02<12:21:47, 3.27s/it] {'loss': 0.1256, 'grad_norm': 1.3283689024383327, 'learning_rate': 3.59819992618367e-06, 'epoch': 0.6} 60%|██████ | 20661/34278 [22:46:02<12:21:47, 3.27s/it] 60%|██████ | 20662/34278 [22:46:06<12:35:42, 3.33s/it] {'loss': 0.1245, 'grad_norm': 0.8898156437098708, 'learning_rate': 3.597746445293846e-06, 'epoch': 0.6} 60%|██████ | 20662/34278 [22:46:06<12:35:42, 3.33s/it] 60%|██████ | 20663/34278 [22:46:09<12:13:39, 3.23s/it] {'loss': 0.1288, 'grad_norm': 1.1695808649292871, 'learning_rate': 3.597292976923008e-06, 'epoch': 
0.6} 60%|██████ | 20663/34278 [22:46:09<12:13:39, 3.23s/it] 60%|██████ | 20664/34278 [22:46:12<11:47:54, 3.12s/it] {'loss': 0.1078, 'grad_norm': 0.8282942467785808, 'learning_rate': 3.5968395210752027e-06, 'epoch': 0.6} 60%|██████ | 20664/34278 [22:46:12<11:47:54, 3.12s/it] 60%|██████ | 20665/34278 [22:46:15<12:22:50, 3.27s/it] {'loss': 0.1576, 'grad_norm': 0.9183036143669286, 'learning_rate': 3.5963860777544796e-06, 'epoch': 0.6} 60%|██████ | 20665/34278 [22:46:15<12:22:50, 3.27s/it] 60%|██████ | 20666/34278 [22:46:18<12:15:03, 3.24s/it] {'loss': 0.1096, 'grad_norm': 1.0451332960441726, 'learning_rate': 3.5959326469648847e-06, 'epoch': 0.6} 60%|██████ | 20666/34278 [22:46:18<12:15:03, 3.24s/it] 60%|██████ | 20667/34278 [22:46:21<11:36:43, 3.07s/it] {'loss': 0.1352, 'grad_norm': 0.7834724585353079, 'learning_rate': 3.5954792287104707e-06, 'epoch': 0.6} 60%|██████ | 20667/34278 [22:46:21<11:36:43, 3.07s/it] 60%|██████ | 20668/34278 [22:46:25<12:43:45, 3.37s/it] {'loss': 0.121, 'grad_norm': 0.7969454085436715, 'learning_rate': 3.5950258229952817e-06, 'epoch': 0.6} 60%|██████ | 20668/34278 [22:46:25<12:43:45, 3.37s/it] 60%|██████ | 20669/34278 [22:46:28<12:16:39, 3.25s/it] {'loss': 0.1425, 'grad_norm': 1.2529372948763293, 'learning_rate': 3.5945724298233665e-06, 'epoch': 0.6} 60%|██████ | 20669/34278 [22:46:28<12:16:39, 3.25s/it] 60%|██████ | 20670/34278 [22:46:31<12:20:12, 3.26s/it] {'loss': 0.1222, 'grad_norm': 1.192656878933679, 'learning_rate': 3.5941190491987745e-06, 'epoch': 0.6} 60%|██████ | 20670/34278 [22:46:31<12:20:12, 3.26s/it] 60%|██████ | 20671/34278 [22:46:36<14:03:23, 3.72s/it] {'loss': 0.0992, 'grad_norm': 0.6172278253839465, 'learning_rate': 3.5936656811255484e-06, 'epoch': 0.6} 60%|██████ | 20671/34278 [22:46:36<14:03:23, 3.72s/it] 60%|██████ | 20672/34278 [22:46:42<16:40:58, 4.41s/it] {'loss': 0.1312, 'grad_norm': 0.9249651754572804, 'learning_rate': 3.593212325607742e-06, 'epoch': 0.6} 60%|██████ | 20672/34278 [22:46:42<16:40:58, 4.41s/it] 
60%|██████ | 20673/34278 [22:46:45<15:12:49, 4.03s/it] {'loss': 0.1326, 'grad_norm': 1.2429406023273408, 'learning_rate': 3.5927589826494005e-06, 'epoch': 0.6} 60%|██████ | 20673/34278 [22:46:45<15:12:49, 4.03s/it] 60%|██████ | 20674/34278 [22:46:49<14:15:30, 3.77s/it] {'loss': 0.1209, 'grad_norm': 1.107180675293284, 'learning_rate': 3.5923056522545703e-06, 'epoch': 0.6} 60%|██████ | 20674/34278 [22:46:49<14:15:30, 3.77s/it] 60%|██████ | 20675/34278 [22:46:53<15:33:13, 4.12s/it] {'loss': 0.1687, 'grad_norm': 0.9657972443333933, 'learning_rate': 3.5918523344272997e-06, 'epoch': 0.6} 60%|██████ | 20675/34278 [22:46:53<15:33:13, 4.12s/it] 60%|██████ | 20676/34278 [22:46:59<17:32:19, 4.64s/it] {'loss': 0.1263, 'grad_norm': 1.3548453125253543, 'learning_rate': 3.591399029171635e-06, 'epoch': 0.6} 60%|██████ | 20676/34278 [22:46:59<17:32:19, 4.64s/it] 60%|██████ | 20677/34278 [22:47:03<16:41:28, 4.42s/it] {'loss': 0.1313, 'grad_norm': 1.0409323225286617, 'learning_rate': 3.5909457364916223e-06, 'epoch': 0.6} 60%|██████ | 20677/34278 [22:47:03<16:41:28, 4.42s/it] 60%|██████ | 20678/34278 [22:47:06<15:05:00, 3.99s/it] {'loss': 0.1442, 'grad_norm': 0.8592680051229966, 'learning_rate': 3.59049245639131e-06, 'epoch': 0.6} 60%|██████ | 20678/34278 [22:47:06<15:05:00, 3.99s/it] 60%|██████ | 20679/34278 [22:47:09<13:46:02, 3.64s/it] {'loss': 0.1239, 'grad_norm': 1.1466818044190623, 'learning_rate': 3.5900391888747455e-06, 'epoch': 0.6} 60%|██████ | 20679/34278 [22:47:09<13:46:02, 3.64s/it] 60%|██████ | 20680/34278 [22:47:12<12:53:04, 3.41s/it] {'loss': 0.1162, 'grad_norm': 1.1531244828518774, 'learning_rate': 3.5895859339459753e-06, 'epoch': 0.6} 60%|██████ | 20680/34278 [22:47:12<12:53:04, 3.41s/it] 60%|██████ | 20681/34278 [22:47:18<15:47:19, 4.18s/it] {'loss': 0.1061, 'grad_norm': 0.9372895602273493, 'learning_rate': 3.589132691609044e-06, 'epoch': 0.6} 60%|██████ | 20681/34278 [22:47:18<15:47:19, 4.18s/it] 60%|██████ | 20682/34278 [22:47:22<15:12:32, 4.03s/it] {'loss': 
0.1228, 'grad_norm': 0.8773942916931872, 'learning_rate': 3.588679461868e-06, 'epoch': 0.6} 60%|██████ | 20682/34278 [22:47:22<15:12:32, 4.03s/it] 60%|██████ | 20683/34278 [22:47:25<14:15:42, 3.78s/it] {'loss': 0.1364, 'grad_norm': 0.8371494965905516, 'learning_rate': 3.5882262447268865e-06, 'epoch': 0.6} 60%|██████ | 20683/34278 [22:47:25<14:15:42, 3.78s/it] 60%|██████ | 20684/34278 [22:47:28<14:04:43, 3.73s/it] {'loss': 0.1158, 'grad_norm': 1.1130096353059873, 'learning_rate': 3.587773040189754e-06, 'epoch': 0.6} 60%|██████ | 20684/34278 [22:47:28<14:04:43, 3.73s/it] 60%|██████ | 20685/34278 [22:47:34<16:43:43, 4.43s/it] {'loss': 0.1057, 'grad_norm': 0.8665302231987151, 'learning_rate': 3.5873198482606477e-06, 'epoch': 0.6} 60%|██████ | 20685/34278 [22:47:34<16:43:43, 4.43s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 60%|██████ | 20686/34278 [22:47:38<15:12:54, 4.03s/it] {'loss': 0.1166, 'grad_norm': 0.7844004690780885, 'learning_rate': 3.586866668943611e-06, 'epoch': 0.6} 60%|██████ | 20686/34278 [22:47:38<15:12:54, 4.03s/it] 60%|██████ | 20687/34278 [22:47:41<14:32:56, 3.85s/it] {'loss': 0.1347, 'grad_norm': 1.2263462758872938, 'learning_rate': 3.5864135022426916e-06, 'epoch': 0.6} 60%|██████ | 20687/34278 [22:47:41<14:32:56, 3.85s/it] 60%|██████ | 20688/34278 [22:47:44<13:35:40, 3.60s/it] {'loss': 0.1182, 'grad_norm': 1.0266241126485882, 'learning_rate': 3.585960348161936e-06, 'epoch': 0.6} 60%|██████ | 20688/34278 [22:47:44<13:35:40, 3.60s/it] 60%|██████ | 20689/34278 [22:47:47<13:02:06, 3.45s/it] {'loss': 0.1274, 'grad_norm': 0.8003543521389126, 'learning_rate': 3.585507206705386e-06, 'epoch': 0.6} 60%|██████ | 20689/34278 [22:47:47<13:02:06, 3.45s/it] 60%|██████ | 20690/34278 [22:47:50<12:41:26, 3.36s/it] {'loss': 0.1154, 'grad_norm': 0.948863701856177, 'learning_rate': 
3.5850540778770924e-06, 'epoch': 0.6} 60%|██████ | 20690/34278 [22:47:50<12:41:26, 3.36s/it] 60%|██████ | 20691/34278 [22:47:53<12:21:10, 3.27s/it] {'loss': 0.131, 'grad_norm': 0.7486068958601275, 'learning_rate': 3.5846009616810983e-06, 'epoch': 0.6} 60%|██████ | 20691/34278 [22:47:53<12:21:10, 3.27s/it] 60%|██████ | 20692/34278 [22:47:56<11:55:11, 3.16s/it] {'loss': 0.1153, 'grad_norm': 0.7833005345713987, 'learning_rate': 3.5841478581214483e-06, 'epoch': 0.6} 60%|██████ | 20692/34278 [22:47:56<11:55:11, 3.16s/it] 60%|██████ | 20693/34278 [22:48:02<14:36:02, 3.87s/it] {'loss': 0.097, 'grad_norm': 0.6709724665651245, 'learning_rate': 3.583694767202189e-06, 'epoch': 0.6} 60%|██████ | 20693/34278 [22:48:02<14:36:02, 3.87s/it] 60%|██████ | 20694/34278 [22:48:05<13:38:21, 3.61s/it] {'loss': 0.1284, 'grad_norm': 0.8574087692321917, 'learning_rate': 3.583241688927364e-06, 'epoch': 0.6} 60%|██████ | 20694/34278 [22:48:05<13:38:21, 3.61s/it] 60%|██████ | 20695/34278 [22:48:08<13:19:32, 3.53s/it] {'loss': 0.132, 'grad_norm': 1.0161858743358603, 'learning_rate': 3.582788623301018e-06, 'epoch': 0.6} 60%|██████ | 20695/34278 [22:48:08<13:19:32, 3.53s/it] 60%|██████ | 20696/34278 [22:48:11<12:51:14, 3.41s/it] {'loss': 0.1242, 'grad_norm': 1.0026121346936068, 'learning_rate': 3.582335570327198e-06, 'epoch': 0.6} 60%|██████ | 20696/34278 [22:48:11<12:51:14, 3.41s/it] 60%|██████ | 20697/34278 [22:48:15<13:05:37, 3.47s/it] {'loss': 0.14, 'grad_norm': 0.7721155276758169, 'learning_rate': 3.581882530009948e-06, 'epoch': 0.6} 60%|██████ | 20697/34278 [22:48:15<13:05:37, 3.47s/it] 60%|██████ | 20698/34278 [22:48:19<13:48:18, 3.66s/it] {'loss': 0.1484, 'grad_norm': 0.7625042643757383, 'learning_rate': 3.581429502353312e-06, 'epoch': 0.6} 60%|██████ | 20698/34278 [22:48:19<13:48:18, 3.66s/it] 60%|██████ | 20699/34278 [22:48:22<13:02:46, 3.46s/it] {'loss': 0.1267, 'grad_norm': 0.8184795323443229, 'learning_rate': 3.580976487361334e-06, 'epoch': 0.6} 60%|██████ | 20699/34278 
[22:48:22<13:02:46, 3.46s/it] 60%|██████ | 20700/34278 [22:48:26<14:05:22, 3.74s/it] {'loss': 0.1052, 'grad_norm': 0.7152635396360302, 'learning_rate': 3.580523485038061e-06, 'epoch': 0.6} 60%|██████ | 20700/34278 [22:48:26<14:05:22, 3.74s/it] 60%|██████ | 20701/34278 [22:48:31<15:03:18, 3.99s/it] {'loss': 0.1208, 'grad_norm': 0.7071465680886245, 'learning_rate': 3.580070495387532e-06, 'epoch': 0.6} 60%|██████ | 20701/34278 [22:48:31<15:03:18, 3.99s/it] 60%|██████ | 20702/34278 [22:48:35<15:35:39, 4.14s/it] {'loss': 0.1493, 'grad_norm': 1.0458616600763395, 'learning_rate': 3.579617518413798e-06, 'epoch': 0.6} 60%|██████ | 20702/34278 [22:48:35<15:35:39, 4.14s/it] 60%|██████ | 20703/34278 [22:48:39<15:15:25, 4.05s/it] {'loss': 0.1176, 'grad_norm': 0.7926599590148983, 'learning_rate': 3.579164554120898e-06, 'epoch': 0.6} 60%|██████ | 20703/34278 [22:48:39<15:15:25, 4.05s/it] 60%|██████ | 20704/34278 [22:48:43<15:05:23, 4.00s/it] {'loss': 0.1346, 'grad_norm': 0.7180877823910206, 'learning_rate': 3.578711602512878e-06, 'epoch': 0.6} 60%|██████ | 20704/34278 [22:48:43<15:05:23, 4.00s/it] 60%|██████ | 20705/34278 [22:48:46<14:16:47, 3.79s/it] {'loss': 0.142, 'grad_norm': 0.731005048302328, 'learning_rate': 3.5782586635937834e-06, 'epoch': 0.6} 60%|██████ | 20705/34278 [22:48:46<14:16:47, 3.79s/it] 60%|██████ | 20706/34278 [22:48:49<13:31:24, 3.59s/it] {'loss': 0.1396, 'grad_norm': 0.9452837656329977, 'learning_rate': 3.577805737367654e-06, 'epoch': 0.6} 60%|██████ | 20706/34278 [22:48:49<13:31:24, 3.59s/it] 60%|██████ | 20707/34278 [22:48:56<16:20:57, 4.34s/it] {'loss': 0.102, 'grad_norm': 0.6820508516041044, 'learning_rate': 3.5773528238385346e-06, 'epoch': 0.6} 60%|██████ | 20707/34278 [22:48:56<16:20:57, 4.34s/it] 60%|██████ | 20708/34278 [22:48:59<15:14:59, 4.05s/it] {'loss': 0.1087, 'grad_norm': 0.80347364286241, 'learning_rate': 3.5768999230104704e-06, 'epoch': 0.6} 60%|██████ | 20708/34278 [22:48:59<15:14:59, 4.05s/it] 60%|██████ | 20709/34278 [22:49:02<14:39:20, 
3.89s/it] {'loss': 0.1391, 'grad_norm': 0.9430438813368315, 'learning_rate': 3.5764470348875045e-06, 'epoch': 0.6} 60%|██████ | 20709/34278 [22:49:02<14:39:20, 3.89s/it] 60%|██████ | 20710/34278 [22:49:06<13:51:40, 3.68s/it] {'loss': 0.1539, 'grad_norm': 0.926409377536277, 'learning_rate': 3.57599415947368e-06, 'epoch': 0.6} 60%|██████ | 20710/34278 [22:49:06<13:51:40, 3.68s/it] 60%|██████ | 20711/34278 [22:49:09<13:53:08, 3.68s/it] {'loss': 0.1193, 'grad_norm': 0.7496252817836285, 'learning_rate': 3.5755412967730397e-06, 'epoch': 0.6} 60%|██████ | 20711/34278 [22:49:09<13:53:08, 3.68s/it] 60%|██████ | 20712/34278 [22:49:14<14:49:47, 3.94s/it] {'loss': 0.1699, 'grad_norm': 1.0412874650331714, 'learning_rate': 3.5750884467896262e-06, 'epoch': 0.6} 60%|██████ | 20712/34278 [22:49:14<14:49:47, 3.94s/it] 60%|██████ | 20713/34278 [22:49:17<13:35:37, 3.61s/it] {'loss': 0.1343, 'grad_norm': 1.0497410722901828, 'learning_rate': 3.5746356095274817e-06, 'epoch': 0.6} 60%|██████ | 20713/34278 [22:49:17<13:35:37, 3.61s/it] 60%|██████ | 20714/34278 [22:49:20<13:04:57, 3.47s/it] {'loss': 0.11, 'grad_norm': 0.9605813323763668, 'learning_rate': 3.5741827849906514e-06, 'epoch': 0.6} 60%|██████ | 20714/34278 [22:49:20<13:04:57, 3.47s/it] 60%|██████ | 20715/34278 [22:49:24<13:16:54, 3.53s/it] {'loss': 0.1755, 'grad_norm': 1.075380391109725, 'learning_rate': 3.5737299731831776e-06, 'epoch': 0.6} 60%|██████ | 20715/34278 [22:49:24<13:16:54, 3.53s/it] 60%|██████ | 20716/34278 [22:49:29<16:00:19, 4.25s/it] {'loss': 0.1267, 'grad_norm': 0.815662823665873, 'learning_rate': 3.5732771741091014e-06, 'epoch': 0.6} 60%|██████ | 20716/34278 [22:49:29<16:00:19, 4.25s/it] 60%|██████ | 20717/34278 [22:49:32<14:19:39, 3.80s/it] {'loss': 0.1316, 'grad_norm': 1.0608945222955084, 'learning_rate': 3.572824387772466e-06, 'epoch': 0.6} 60%|██████ | 20717/34278 [22:49:32<14:19:39, 3.80s/it] 60%|██████ | 20718/34278 [22:49:36<14:09:52, 3.76s/it] {'loss': 0.1109, 'grad_norm': 1.3495815301820906, 
'learning_rate': 3.5723716141773145e-06, 'epoch': 0.6} 60%|██████ | 20718/34278 [22:49:36<14:09:52, 3.76s/it] 60%|██████ | 20719/34278 [22:49:42<16:16:13, 4.32s/it] {'loss': 0.1258, 'grad_norm': 0.7443500837028708, 'learning_rate': 3.5719188533276854e-06, 'epoch': 0.6} 60%|██████ | 20719/34278 [22:49:42<16:16:13, 4.32s/it] 60%|██████ | 20720/34278 [22:49:45<15:17:51, 4.06s/it] {'loss': 0.1204, 'grad_norm': 1.032691015903389, 'learning_rate': 3.571466105227627e-06, 'epoch': 0.6} 60%|██████ | 20720/34278 [22:49:45<15:17:51, 4.06s/it] 60%|██████ | 20721/34278 [22:49:48<13:50:22, 3.68s/it] {'loss': 0.1076, 'grad_norm': 0.6986654255922213, 'learning_rate': 3.5710133698811776e-06, 'epoch': 0.6} 60%|██████ | 20721/34278 [22:49:48<13:50:22, 3.68s/it] 60%|██████ | 20722/34278 [22:49:52<14:09:39, 3.76s/it] {'loss': 0.1329, 'grad_norm': 0.8930889899932719, 'learning_rate': 3.570560647292379e-06, 'epoch': 0.6} 60%|██████ | 20722/34278 [22:49:52<14:09:39, 3.76s/it] 60%|██████ | 20723/34278 [22:49:55<13:19:29, 3.54s/it] {'loss': 0.1199, 'grad_norm': 0.8177811896482285, 'learning_rate': 3.570107937465276e-06, 'epoch': 0.6} 60%|██████ | 20723/34278 [22:49:55<13:19:29, 3.54s/it] 60%|██████ | 20724/34278 [22:49:58<13:11:07, 3.50s/it] {'loss': 0.1166, 'grad_norm': 0.9547362573044593, 'learning_rate': 3.5696552404039053e-06, 'epoch': 0.6} 60%|██████ | 20724/34278 [22:49:58<13:11:07, 3.50s/it] 60%|██████ | 20725/34278 [22:50:01<12:51:01, 3.41s/it] {'loss': 0.1024, 'grad_norm': 0.6963679729761292, 'learning_rate': 3.569202556112311e-06, 'epoch': 0.6} 60%|██████ | 20725/34278 [22:50:01<12:51:01, 3.41s/it] 60%|██████ | 20726/34278 [22:50:06<13:42:53, 3.64s/it] {'loss': 0.178, 'grad_norm': 1.3464318442173022, 'learning_rate': 3.5687498845945357e-06, 'epoch': 0.6} 60%|██████ | 20726/34278 [22:50:06<13:42:53, 3.64s/it] 60%|██████ | 20727/34278 [22:50:08<12:40:54, 3.37s/it] {'loss': 0.1164, 'grad_norm': 0.740158816183777, 'learning_rate': 3.5682972258546213e-06, 'epoch': 0.6} 60%|██████ | 
[Training-progress log, steps 20727–20917 of 34278 (60–61%, epoch 0.60–0.61). Per-step loss fluctuates roughly between 0.085 and 0.16, grad_norm between about 0.58 and 1.62, the learning rate decays from 3.5678e-06 to 3.4825e-06, and step time varies between about 3.0 and 5.3 s/it. Representative entry:]
 61%|██████    | 20800/34278 [22:54:39<15:58:28, 4.27s/it] {'loss': 0.1013, 'grad_norm': 0.9666030322325735, 'learning_rate': 3.535287923583576e-06, 'epoch': 0.61}
[Recurring warning, emitted at steps 20808, 20817, and 20880:]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
61%|██████ | 20917/34278 [23:02:21<15:20:45, 4.13s/it] 61%|██████ | 20918/34278 [23:02:27<17:19:15, 4.67s/it] {'loss': 0.1042, 'grad_norm': 0.9483160722802674, 'learning_rate': 3.482078461988213e-06, 'epoch': 0.61} 61%|██████ | 20918/34278 [23:02:27<17:19:15, 4.67s/it] 61%|██████ | 20919/34278 [23:02:30<16:04:01, 4.33s/it] {'loss': 0.1246, 'grad_norm': 0.7376460552091912, 'learning_rate': 3.481628331336559e-06, 'epoch': 0.61} 61%|██████ | 20919/34278 [23:02:30<16:04:01, 4.33s/it] 61%|██████ | 20920/34278 [23:02:35<16:19:27, 4.40s/it] {'loss': 0.1087, 'grad_norm': 0.8470370483299596, 'learning_rate': 3.481178214240566e-06, 'epoch': 0.61} 61%|██████ | 20920/34278 [23:02:35<16:19:27, 4.40s/it] 61%|██████ | 20921/34278 [23:02:38<14:41:20, 3.96s/it] {'loss': 0.1286, 'grad_norm': 0.9497577318465791, 'learning_rate': 3.48072811070425e-06, 'epoch': 0.61} 61%|██████ | 20921/34278 [23:02:38<14:41:20, 3.96s/it] 61%|██████ | 20922/34278 [23:02:41<13:50:35, 3.73s/it] {'loss': 0.1097, 'grad_norm': 0.8930144418107888, 'learning_rate': 3.48027802073163e-06, 'epoch': 0.61} 61%|██████ | 20922/34278 [23:02:41<13:50:35, 3.73s/it] 61%|██████ | 20923/34278 [23:02:44<12:55:39, 3.48s/it] {'loss': 0.1272, 'grad_norm': 0.842364621518752, 'learning_rate': 3.479827944326726e-06, 'epoch': 0.61} 61%|██████ | 20923/34278 [23:02:44<12:55:39, 3.48s/it] 61%|██████ | 20924/34278 [23:02:50<15:47:24, 4.26s/it] {'loss': 0.1231, 'grad_norm': 0.747239643573268, 'learning_rate': 3.4793778814935553e-06, 'epoch': 0.61} 61%|██████ | 20924/34278 [23:02:50<15:47:24, 4.26s/it] 61%|██████ | 20925/34278 [23:02:53<14:43:23, 3.97s/it] {'loss': 0.1132, 'grad_norm': 0.8841884267214734, 'learning_rate': 3.478927832236137e-06, 'epoch': 0.61} 61%|██████ | 20925/34278 [23:02:53<14:43:23, 3.97s/it] 61%|██████ | 20926/34278 [23:02:57<14:02:57, 3.79s/it] {'loss': 0.1351, 'grad_norm': 0.8602181308889852, 'learning_rate': 3.478477796558487e-06, 'epoch': 0.61} 61%|██████ | 20926/34278 [23:02:57<14:02:57, 3.79s/it] 61%|██████ | 
20927/34278 [23:03:00<13:09:17, 3.55s/it] {'loss': 0.1272, 'grad_norm': 0.9963921043130824, 'learning_rate': 3.4780277744646236e-06, 'epoch': 0.61} 61%|██████ | 20927/34278 [23:03:00<13:09:17, 3.55s/it] 61%|██████ | 20928/34278 [23:03:02<12:21:49, 3.33s/it] {'loss': 0.1381, 'grad_norm': 0.8268281964396148, 'learning_rate': 3.477577765958564e-06, 'epoch': 0.61} 61%|██████ | 20928/34278 [23:03:02<12:21:49, 3.33s/it] 61%|██████ | 20929/34278 [23:03:08<14:49:32, 4.00s/it] {'loss': 0.1109, 'grad_norm': 0.7129514428666248, 'learning_rate': 3.4771277710443284e-06, 'epoch': 0.61} 61%|██████ | 20929/34278 [23:03:08<14:49:32, 4.00s/it] 61%|██████ | 20930/34278 [23:03:11<13:59:11, 3.77s/it] {'loss': 0.1395, 'grad_norm': 0.8057698220128542, 'learning_rate': 3.4766777897259317e-06, 'epoch': 0.61} 61%|██████ | 20930/34278 [23:03:11<13:59:11, 3.77s/it] 61%|██████ | 20931/34278 [23:03:14<13:00:40, 3.51s/it] {'loss': 0.1416, 'grad_norm': 1.0915883936241786, 'learning_rate': 3.4762278220073927e-06, 'epoch': 0.61} 61%|██████ | 20931/34278 [23:03:14<13:00:40, 3.51s/it] 61%|██████ | 20932/34278 [23:03:18<13:10:32, 3.55s/it] {'loss': 0.1237, 'grad_norm': 0.8465931890384214, 'learning_rate': 3.475777867892728e-06, 'epoch': 0.61} 61%|██████ | 20932/34278 [23:03:18<13:10:32, 3.55s/it] 61%|██████ | 20933/34278 [23:03:22<14:14:35, 3.84s/it] {'loss': 0.1195, 'grad_norm': 0.9360312890669172, 'learning_rate': 3.475327927385954e-06, 'epoch': 0.61} 61%|██████ | 20933/34278 [23:03:22<14:14:35, 3.84s/it] 61%|██████ | 20934/34278 [23:03:26<14:21:40, 3.87s/it] {'loss': 0.1175, 'grad_norm': 0.7891785178727896, 'learning_rate': 3.4748780004910875e-06, 'epoch': 0.61} 61%|██████ | 20934/34278 [23:03:26<14:21:40, 3.87s/it] 61%|██████ | 20935/34278 [23:03:29<13:03:21, 3.52s/it] {'loss': 0.1024, 'grad_norm': 0.7896108617273909, 'learning_rate': 3.474428087212147e-06, 'epoch': 0.61} 61%|██████ | 20935/34278 [23:03:29<13:03:21, 3.52s/it] 61%|██████ | 20936/34278 [23:03:32<13:09:49, 3.55s/it] {'loss': 0.1134, 
'grad_norm': 0.9645337464520171, 'learning_rate': 3.473978187553149e-06, 'epoch': 0.61} 61%|██████ | 20936/34278 [23:03:32<13:09:49, 3.55s/it] 61%|██████ | 20937/34278 [23:03:36<13:15:25, 3.58s/it] {'loss': 0.148, 'grad_norm': 1.036323117317088, 'learning_rate': 3.4735283015181092e-06, 'epoch': 0.61} 61%|██████ | 20937/34278 [23:03:36<13:15:25, 3.58s/it] 61%|██████ | 20938/34278 [23:03:40<13:25:27, 3.62s/it] {'loss': 0.142, 'grad_norm': 1.020705629172368, 'learning_rate': 3.473078429111044e-06, 'epoch': 0.61} 61%|██████ | 20938/34278 [23:03:40<13:25:27, 3.62s/it] 61%|██████ | 20939/34278 [23:03:43<12:55:31, 3.49s/it] {'loss': 0.1102, 'grad_norm': 0.9300278752439212, 'learning_rate': 3.4726285703359698e-06, 'epoch': 0.61} 61%|██████ | 20939/34278 [23:03:43<12:55:31, 3.49s/it] 61%|██████ | 20940/34278 [23:03:46<12:51:02, 3.47s/it] {'loss': 0.1332, 'grad_norm': 0.7062035862868438, 'learning_rate': 3.4721787251969023e-06, 'epoch': 0.61} 61%|██████ | 20940/34278 [23:03:46<12:51:02, 3.47s/it] 61%|██████ | 20941/34278 [23:03:50<13:02:01, 3.52s/it] {'loss': 0.1292, 'grad_norm': 0.9170001169958084, 'learning_rate': 3.47172889369786e-06, 'epoch': 0.61} 61%|██████ | 20941/34278 [23:03:50<13:02:01, 3.52s/it] 61%|██████ | 20942/34278 [23:03:53<12:37:16, 3.41s/it] {'loss': 0.1292, 'grad_norm': 1.1138343930381036, 'learning_rate': 3.471279075842857e-06, 'epoch': 0.61} 61%|██████ | 20942/34278 [23:03:53<12:37:16, 3.41s/it] 61%|██████ | 20943/34278 [23:03:57<13:24:13, 3.62s/it] {'loss': 0.1507, 'grad_norm': 0.8864499926165303, 'learning_rate': 3.4708292716359094e-06, 'epoch': 0.61} 61%|██████ | 20943/34278 [23:03:57<13:24:13, 3.62s/it] 61%|██████ | 20944/34278 [23:04:01<12:53:50, 3.48s/it] {'loss': 0.1303, 'grad_norm': 0.80922507010581, 'learning_rate': 3.4703794810810334e-06, 'epoch': 0.61} 61%|██████ | 20944/34278 [23:04:01<12:53:50, 3.48s/it] 61%|██████ | 20945/34278 [23:04:04<12:32:03, 3.38s/it] {'loss': 0.1246, 'grad_norm': 0.7618441864480844, 'learning_rate': 
3.4699297041822444e-06, 'epoch': 0.61} 61%|██████ | 20945/34278 [23:04:04<12:32:03, 3.38s/it] 61%|██████ | 20946/34278 [23:04:06<11:52:49, 3.21s/it] {'loss': 0.1192, 'grad_norm': 1.1477619314815288, 'learning_rate': 3.469479940943555e-06, 'epoch': 0.61} 61%|██████ | 20946/34278 [23:04:06<11:52:49, 3.21s/it] 61%|██████ | 20947/34278 [23:04:10<12:17:35, 3.32s/it] {'loss': 0.106, 'grad_norm': 0.8051245272157639, 'learning_rate': 3.4690301913689863e-06, 'epoch': 0.61} 61%|██████ | 20947/34278 [23:04:10<12:17:35, 3.32s/it] 61%|██████ | 20948/34278 [23:04:14<12:30:58, 3.38s/it] {'loss': 0.1174, 'grad_norm': 0.9034847528658548, 'learning_rate': 3.4685804554625495e-06, 'epoch': 0.61} 61%|██████ | 20948/34278 [23:04:14<12:30:58, 3.38s/it] 61%|██████ | 20949/34278 [23:04:18<13:16:02, 3.58s/it] {'loss': 0.134, 'grad_norm': 0.921643093258146, 'learning_rate': 3.468130733228261e-06, 'epoch': 0.61} 61%|██████ | 20949/34278 [23:04:18<13:16:02, 3.58s/it] 61%|██████ | 20950/34278 [23:04:22<13:37:02, 3.68s/it] {'loss': 0.1194, 'grad_norm': 1.070957243678224, 'learning_rate': 3.4676810246701365e-06, 'epoch': 0.61} 61%|██████ | 20950/34278 [23:04:22<13:37:02, 3.68s/it] 61%|██████ | 20951/34278 [23:04:27<15:29:39, 4.19s/it] {'loss': 0.156, 'grad_norm': 1.166313172374092, 'learning_rate': 3.467231329792189e-06, 'epoch': 0.61} 61%|██████ | 20951/34278 [23:04:27<15:29:39, 4.19s/it] 61%|██████ | 20952/34278 [23:04:30<14:17:58, 3.86s/it] {'loss': 0.135, 'grad_norm': 0.7777529780148577, 'learning_rate': 3.4667816485984334e-06, 'epoch': 0.61} 61%|██████ | 20952/34278 [23:04:30<14:17:58, 3.86s/it] 61%|██████ | 20953/34278 [23:04:33<13:38:24, 3.69s/it] {'loss': 0.1207, 'grad_norm': 0.7953839416045292, 'learning_rate': 3.4663319810928865e-06, 'epoch': 0.61} 61%|██████ | 20953/34278 [23:04:33<13:38:24, 3.69s/it] 61%|██████ | 20954/34278 [23:04:37<13:14:53, 3.58s/it] {'loss': 0.1361, 'grad_norm': 0.9927925875520035, 'learning_rate': 3.465882327279561e-06, 'epoch': 0.61} 61%|██████ | 20954/34278 
[23:04:37<13:14:53, 3.58s/it] 61%|██████ | 20955/34278 [23:04:43<16:02:12, 4.33s/it] {'loss': 0.1155, 'grad_norm': 0.9540477595798074, 'learning_rate': 3.465432687162473e-06, 'epoch': 0.61} 61%|██████ | 20955/34278 [23:04:43<16:02:12, 4.33s/it] 61%|██████ | 20956/34278 [23:04:46<14:46:48, 3.99s/it] {'loss': 0.1434, 'grad_norm': 1.0290227431209418, 'learning_rate': 3.464983060745635e-06, 'epoch': 0.61} 61%|██████ | 20956/34278 [23:04:46<14:46:48, 3.99s/it] 61%|██████ | 20957/34278 [23:04:49<13:27:50, 3.64s/it] {'loss': 0.1339, 'grad_norm': 0.8991033544133925, 'learning_rate': 3.4645334480330616e-06, 'epoch': 0.61} 61%|██████ | 20957/34278 [23:04:49<13:27:50, 3.64s/it] 61%|██████ | 20958/34278 [23:04:52<12:56:22, 3.50s/it] {'loss': 0.1575, 'grad_norm': 0.7888454291008832, 'learning_rate': 3.464083849028766e-06, 'epoch': 0.61} 61%|██████ | 20958/34278 [23:04:52<12:56:22, 3.50s/it] 61%|██████ | 20959/34278 [23:04:56<13:34:21, 3.67s/it] {'loss': 0.1171, 'grad_norm': 0.9488993324605315, 'learning_rate': 3.463634263736765e-06, 'epoch': 0.61} 61%|██████ | 20959/34278 [23:04:56<13:34:21, 3.67s/it] 61%|██████ | 20960/34278 [23:04:59<12:58:55, 3.51s/it] {'loss': 0.1466, 'grad_norm': 0.8432693116187181, 'learning_rate': 3.46318469216107e-06, 'epoch': 0.61} 61%|██████ | 20960/34278 [23:04:59<12:58:55, 3.51s/it] 61%|██████ | 20961/34278 [23:05:03<13:07:11, 3.55s/it] {'loss': 0.1115, 'grad_norm': 0.6769327721821427, 'learning_rate': 3.4627351343056947e-06, 'epoch': 0.61} 61%|██████ | 20961/34278 [23:05:03<13:07:11, 3.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 61%|██████ | 20962/34278 [23:05:06<12:45:46, 3.45s/it] {'loss': 0.1363, 'grad_norm': 0.8731764480185785, 'learning_rate': 3.4622855901746543e-06, 'epoch': 0.61} 61%|██████ | 20962/34278 [23:05:06<12:45:46, 3.45s/it] 61%|██████ | 20963/34278 [23:05:09<12:29:31, 3.38s/it] {'loss': 0.1498, 'grad_norm': 0.953362235882418, 'learning_rate': 3.46183605977196e-06, 'epoch': 0.61} 61%|██████ | 20963/34278 [23:05:09<12:29:31, 3.38s/it] 61%|██████ | 20964/34278 [23:05:12<12:07:12, 3.28s/it] {'loss': 0.1363, 'grad_norm': 1.0649145980754482, 'learning_rate': 3.4613865431016253e-06, 'epoch': 0.61} 61%|██████ | 20964/34278 [23:05:12<12:07:12, 3.28s/it] 61%|██████ | 20965/34278 [23:05:15<12:07:11, 3.28s/it] {'loss': 0.125, 'grad_norm': 0.9672501895694245, 'learning_rate': 3.460937040167665e-06, 'epoch': 0.61} 61%|██████ | 20965/34278 [23:05:15<12:07:11, 3.28s/it] 61%|██████ | 20966/34278 [23:05:19<12:22:27, 3.35s/it] {'loss': 0.1217, 'grad_norm': 0.9080402169163565, 'learning_rate': 3.4604875509740922e-06, 'epoch': 0.61} 61%|██████ | 20966/34278 [23:05:19<12:22:27, 3.35s/it] 61%|██████ | 20967/34278 [23:05:24<14:33:06, 3.94s/it] {'loss': 0.1251, 'grad_norm': 1.1121807349407231, 'learning_rate': 3.460038075524919e-06, 'epoch': 0.61} 61%|██████ | 20967/34278 [23:05:24<14:33:06, 3.94s/it] 61%|██████ | 20968/34278 [23:05:29<15:53:01, 4.30s/it] {'loss': 0.1473, 'grad_norm': 1.0001426277027219, 'learning_rate': 3.4595886138241575e-06, 'epoch': 0.61} 61%|██████ | 20968/34278 [23:05:29<15:53:01, 4.30s/it] 61%|██████ | 20969/34278 [23:05:32<14:28:41, 3.92s/it] {'loss': 0.1309, 'grad_norm': 0.8792572688182623, 'learning_rate': 3.459139165875821e-06, 'epoch': 0.61} 61%|██████ | 20969/34278 [23:05:32<14:28:41, 3.92s/it] 61%|██████ | 20970/34278 [23:05:36<13:36:13, 3.68s/it] {'loss': 0.1387, 'grad_norm': 0.7173610154422235, 'learning_rate': 3.4586897316839217e-06, 'epoch': 0.61} 61%|██████ | 20970/34278 [23:05:36<13:36:13, 3.68s/it] 61%|██████ | 20971/34278 
[23:05:38<12:40:40, 3.43s/it] {'loss': 0.1293, 'grad_norm': 1.098478615007473, 'learning_rate': 3.458240311252473e-06, 'epoch': 0.61} 61%|██████ | 20971/34278 [23:05:38<12:40:40, 3.43s/it] 61%|██████ | 20972/34278 [23:05:42<12:32:21, 3.39s/it] {'loss': 0.1167, 'grad_norm': 0.7366822015674209, 'learning_rate': 3.4577909045854884e-06, 'epoch': 0.61} 61%|██████ | 20972/34278 [23:05:42<12:32:21, 3.39s/it] 61%|██████ | 20973/34278 [23:05:45<12:04:43, 3.27s/it] {'loss': 0.1205, 'grad_norm': 0.9817394618978629, 'learning_rate': 3.4573415116869774e-06, 'epoch': 0.61} 61%|██████ | 20973/34278 [23:05:45<12:04:43, 3.27s/it] 61%|██████ | 20974/34278 [23:05:51<15:12:11, 4.11s/it] {'loss': 0.1575, 'grad_norm': 0.8977344426497099, 'learning_rate': 3.456892132560953e-06, 'epoch': 0.61} 61%|██████ | 20974/34278 [23:05:51<15:12:11, 4.11s/it] 61%|██████ | 20975/34278 [23:05:54<13:44:12, 3.72s/it] {'loss': 0.1278, 'grad_norm': 0.8035672913976277, 'learning_rate': 3.456442767211428e-06, 'epoch': 0.61} 61%|██████ | 20975/34278 [23:05:54<13:44:12, 3.72s/it] 61%|██████ | 20976/34278 [23:05:57<13:07:45, 3.55s/it] {'loss': 0.1244, 'grad_norm': 0.6974203347194811, 'learning_rate': 3.45599341564241e-06, 'epoch': 0.61} 61%|██████ | 20976/34278 [23:05:57<13:07:45, 3.55s/it] 61%|██████ | 20977/34278 [23:06:01<14:02:17, 3.80s/it] {'loss': 0.1056, 'grad_norm': 0.8611962303220687, 'learning_rate': 3.4555440778579185e-06, 'epoch': 0.61} 61%|██████ | 20977/34278 [23:06:01<14:02:17, 3.80s/it] 61%|██████ | 20978/34278 [23:06:04<13:15:06, 3.59s/it] {'loss': 0.119, 'grad_norm': 0.7342171979455759, 'learning_rate': 3.455094753861959e-06, 'epoch': 0.61} 61%|██████ | 20978/34278 [23:06:04<13:15:06, 3.59s/it] 61%|██████ | 20979/34278 [23:06:08<13:06:46, 3.55s/it] {'loss': 0.1088, 'grad_norm': 0.6687892399548173, 'learning_rate': 3.4546454436585454e-06, 'epoch': 0.61} 61%|██████ | 20979/34278 [23:06:08<13:06:46, 3.55s/it] 61%|██████ | 20980/34278 [23:06:11<12:24:46, 3.36s/it] {'loss': 0.1277, 'grad_norm': 
0.9301980438692424, 'learning_rate': 3.4541961472516882e-06, 'epoch': 0.61} 61%|██████ | 20980/34278 [23:06:11<12:24:46, 3.36s/it] 61%|██████ | 20981/34278 [23:06:14<12:27:00, 3.37s/it] {'loss': 0.1122, 'grad_norm': 0.7632162987039959, 'learning_rate': 3.4537468646453987e-06, 'epoch': 0.61} 61%|██████ | 20981/34278 [23:06:14<12:27:00, 3.37s/it] 61%|██████ | 20982/34278 [23:06:17<12:04:58, 3.27s/it] {'loss': 0.1149, 'grad_norm': 0.7659636098045769, 'learning_rate': 3.4532975958436866e-06, 'epoch': 0.61} 61%|██████ | 20982/34278 [23:06:17<12:04:58, 3.27s/it] 61%|██████ | 20983/34278 [23:06:20<11:46:27, 3.19s/it] {'loss': 0.1089, 'grad_norm': 0.8543931345825652, 'learning_rate': 3.4528483408505653e-06, 'epoch': 0.61} 61%|██████ | 20983/34278 [23:06:20<11:46:27, 3.19s/it] 61%|██████ | 20984/34278 [23:06:24<12:31:52, 3.39s/it] {'loss': 0.1227, 'grad_norm': 1.0042344270139514, 'learning_rate': 3.452399099670045e-06, 'epoch': 0.61} 61%|██████ | 20984/34278 [23:06:24<12:31:52, 3.39s/it] 61%|██████ | 20985/34278 [23:06:27<12:14:33, 3.32s/it] {'loss': 0.1161, 'grad_norm': 0.8658228461930776, 'learning_rate': 3.451949872306137e-06, 'epoch': 0.61} 61%|██████ | 20985/34278 [23:06:27<12:14:33, 3.32s/it] 61%|██████ | 20986/34278 [23:06:30<12:04:21, 3.27s/it] {'loss': 0.1236, 'grad_norm': 0.8556424588367897, 'learning_rate': 3.4515006587628497e-06, 'epoch': 0.61} 61%|██████ | 20986/34278 [23:06:30<12:04:21, 3.27s/it] 61%|██████ | 20987/34278 [23:06:34<12:30:04, 3.39s/it] {'loss': 0.1323, 'grad_norm': 0.8755333393449263, 'learning_rate': 3.4510514590441957e-06, 'epoch': 0.61} 61%|██████ | 20987/34278 [23:06:34<12:30:04, 3.39s/it] 61%|██████ | 20988/34278 [23:06:38<12:59:46, 3.52s/it] {'loss': 0.1398, 'grad_norm': 1.079093149162543, 'learning_rate': 3.4506022731541826e-06, 'epoch': 0.61} 61%|██████ | 20988/34278 [23:06:38<12:59:46, 3.52s/it] 61%|██████ | 20989/34278 [23:06:40<12:09:44, 3.29s/it] {'loss': 0.1099, 'grad_norm': 1.16111636361247, 'learning_rate': 3.450153101096825e-06, 
'epoch': 0.61} 61%|██████ | 20989/34278 [23:06:40<12:09:44, 3.29s/it] 61%|██████ | 20990/34278 [23:06:44<12:07:51, 3.29s/it] {'loss': 0.1142, 'grad_norm': 0.779681191738122, 'learning_rate': 3.4497039428761293e-06, 'epoch': 0.61} 61%|██████ | 20990/34278 [23:06:44<12:07:51, 3.29s/it] 61%|██████ | 20991/34278 [23:06:47<11:46:42, 3.19s/it] {'loss': 0.1303, 'grad_norm': 0.9628872504343245, 'learning_rate': 3.4492547984961067e-06, 'epoch': 0.61} 61%|██████ | 20991/34278 [23:06:47<11:46:42, 3.19s/it] 61%|██████ | 20992/34278 [23:06:52<13:59:49, 3.79s/it] {'loss': 0.1168, 'grad_norm': 1.051397096908327, 'learning_rate': 3.4488056679607685e-06, 'epoch': 0.61} 61%|██████ | 20992/34278 [23:06:52<13:59:49, 3.79s/it] 61%|██████ | 20993/34278 [23:06:56<13:47:16, 3.74s/it] {'loss': 0.1236, 'grad_norm': 0.9578684732032886, 'learning_rate': 3.4483565512741214e-06, 'epoch': 0.61} 61%|██████ | 20993/34278 [23:06:56<13:47:16, 3.74s/it] 61%|██████ | 20994/34278 [23:07:02<16:26:33, 4.46s/it] {'loss': 0.1325, 'grad_norm': 0.869847829921037, 'learning_rate': 3.4479074484401763e-06, 'epoch': 0.61} 61%|██████ | 20994/34278 [23:07:02<16:26:33, 4.46s/it] 61%|██████ | 20995/34278 [23:07:05<15:44:44, 4.27s/it] {'loss': 0.1406, 'grad_norm': 1.402606131990117, 'learning_rate': 3.4474583594629436e-06, 'epoch': 0.61} 61%|██████ | 20995/34278 [23:07:05<15:44:44, 4.27s/it] 61%|██████▏ | 20996/34278 [23:07:09<14:29:39, 3.93s/it] {'loss': 0.1255, 'grad_norm': 1.0505145326623786, 'learning_rate': 3.447009284346432e-06, 'epoch': 0.61} 61%|██████▏ | 20996/34278 [23:07:09<14:29:39, 3.93s/it] 61%|██████▏ | 20997/34278 [23:07:15<16:47:40, 4.55s/it] {'loss': 0.1312, 'grad_norm': 0.8443454167133138, 'learning_rate': 3.4465602230946517e-06, 'epoch': 0.61} 61%|██████▏ | 20997/34278 [23:07:15<16:47:40, 4.55s/it] 61%|██████▏ | 20998/34278 [23:07:18<15:16:44, 4.14s/it] {'loss': 0.1327, 'grad_norm': 1.530233319482489, 'learning_rate': 3.44611117571161e-06, 'epoch': 0.61} 61%|██████▏ | 20998/34278 
[23:07:18<15:16:44, 4.14s/it] 61%|██████▏ | 20999/34278 [23:07:21<14:23:42, 3.90s/it] {'loss': 0.1275, 'grad_norm': 1.204036597985255, 'learning_rate': 3.445662142201317e-06, 'epoch': 0.61} 61%|██████▏ | 20999/34278 [23:07:21<14:23:42, 3.90s/it] 61%|██████▏ | 21000/34278 [23:07:25<13:55:19, 3.77s/it] {'loss': 0.1215, 'grad_norm': 0.8904733020868539, 'learning_rate': 3.4452131225677798e-06, 'epoch': 0.61} 61%|██████▏ | 21000/34278 [23:07:25<13:55:19, 3.77s/it] 61%|██████▏ | 21001/34278 [23:07:28<13:11:30, 3.58s/it] {'loss': 0.1449, 'grad_norm': 1.2092441567946983, 'learning_rate': 3.4447641168150103e-06, 'epoch': 0.61} 61%|██████▏ | 21001/34278 [23:07:28<13:11:30, 3.58s/it] 61%|██████▏ | 21002/34278 [23:07:31<13:04:51, 3.55s/it] {'loss': 0.1352, 'grad_norm': 1.298882785905577, 'learning_rate': 3.4443151249470163e-06, 'epoch': 0.61} 61%|██████▏ | 21002/34278 [23:07:31<13:04:51, 3.55s/it] 61%|██████▏ | 21003/34278 [23:07:34<12:27:43, 3.38s/it] {'loss': 0.1317, 'grad_norm': 1.1317444753596781, 'learning_rate': 3.443866146967804e-06, 'epoch': 0.61} 61%|██████▏ | 21003/34278 [23:07:34<12:27:43, 3.38s/it] 61%|██████▏ | 21004/34278 [23:07:37<12:20:51, 3.35s/it] {'loss': 0.1353, 'grad_norm': 0.770033170431705, 'learning_rate': 3.4434171828813833e-06, 'epoch': 0.61} 61%|██████▏ | 21004/34278 [23:07:37<12:20:51, 3.35s/it] 61%|██████▏ | 21005/34278 [23:07:41<12:57:45, 3.52s/it] {'loss': 0.1136, 'grad_norm': 0.7785240795060671, 'learning_rate': 3.4429682326917645e-06, 'epoch': 0.61} 61%|██████▏ | 21005/34278 [23:07:41<12:57:45, 3.52s/it] 61%|██████▏ | 21006/34278 [23:07:44<12:27:35, 3.38s/it] {'loss': 0.1378, 'grad_norm': 1.0235229812936215, 'learning_rate': 3.44251929640295e-06, 'epoch': 0.61} 61%|██████▏ | 21006/34278 [23:07:44<12:27:35, 3.38s/it] 61%|██████▏ | 21007/34278 [23:07:48<12:20:56, 3.35s/it] {'loss': 0.1312, 'grad_norm': 0.9164818800568367, 'learning_rate': 3.4420703740189544e-06, 'epoch': 0.61} 61%|██████▏ | 21007/34278 [23:07:48<12:20:56, 3.35s/it] 61%|██████▏ | 
21008/34278 [23:07:51<12:08:25, 3.29s/it] {'loss': 0.1621, 'grad_norm': 0.9996494893264714, 'learning_rate': 3.441621465543781e-06, 'epoch': 0.61} 61%|██████▏ | 21008/34278 [23:07:51<12:08:25, 3.29s/it] 61%|██████▏ | 21009/34278 [23:07:55<13:11:27, 3.58s/it] {'loss': 0.1247, 'grad_norm': 0.7748913711001848, 'learning_rate': 3.4411725709814397e-06, 'epoch': 0.61} 61%|██████▏ | 21009/34278 [23:07:55<13:11:27, 3.58s/it] 61%|██████▏ | 21010/34278 [23:07:59<13:13:53, 3.59s/it] {'loss': 0.1258, 'grad_norm': 0.9984979162441634, 'learning_rate': 3.4407236903359385e-06, 'epoch': 0.61} 61%|██████▏ | 21010/34278 [23:07:59<13:13:53, 3.59s/it] 61%|██████▏ | 21011/34278 [23:08:04<15:06:31, 4.10s/it] {'loss': 0.1379, 'grad_norm': 0.9676661351595874, 'learning_rate': 3.4402748236112827e-06, 'epoch': 0.61} 61%|██████▏ | 21011/34278 [23:08:04<15:06:31, 4.10s/it] 61%|██████▏ | 21012/34278 [23:08:07<13:50:09, 3.75s/it] {'loss': 0.1277, 'grad_norm': 0.8025287907469718, 'learning_rate': 3.43982597081148e-06, 'epoch': 0.61} 61%|██████▏ | 21012/34278 [23:08:07<13:50:09, 3.75s/it] 61%|██████▏ | 21013/34278 [23:08:14<16:56:47, 4.60s/it] {'loss': 0.1508, 'grad_norm': 0.9375310565927569, 'learning_rate': 3.43937713194054e-06, 'epoch': 0.61} 61%|██████▏ | 21013/34278 [23:08:14<16:56:47, 4.60s/it] 61%|██████▏ | 21014/34278 [23:08:17<15:12:13, 4.13s/it] {'loss': 0.1002, 'grad_norm': 0.7262399909554743, 'learning_rate': 3.4389283070024684e-06, 'epoch': 0.61} 61%|██████▏ | 21014/34278 [23:08:17<15:12:13, 4.13s/it] 61%|██████▏ | 21015/34278 [23:08:20<14:09:16, 3.84s/it] {'loss': 0.1143, 'grad_norm': 0.7272098888733515, 'learning_rate': 3.4384794960012734e-06, 'epoch': 0.61} 61%|██████▏ | 21015/34278 [23:08:20<14:09:16, 3.84s/it] 61%|██████▏ | 21016/34278 [23:08:26<16:59:31, 4.61s/it] {'loss': 0.1176, 'grad_norm': 0.8919304754962747, 'learning_rate': 3.438030698940959e-06, 'epoch': 0.61} 61%|██████▏ | 21016/34278 [23:08:26<16:59:31, 4.61s/it] 61%|██████▏ | 21017/34278 [23:08:32<18:24:01, 5.00s/it] 
{'loss': 0.1408, 'grad_norm': 0.813250615691482, 'learning_rate': 3.437581915825534e-06, 'epoch': 0.61} 61%|██████▏ | 21017/34278 [23:08:32<18:24:01, 5.00s/it] 61%|██████▏ | 21018/34278 [23:08:35<16:33:58, 4.50s/it] {'loss': 0.1049, 'grad_norm': 0.7351002698829738, 'learning_rate': 3.4371331466590038e-06, 'epoch': 0.61} 61%|██████▏ | 21018/34278 [23:08:36<16:33:58, 4.50s/it] 61%|██████▏ | 21019/34278 [23:08:39<15:59:14, 4.34s/it] {'loss': 0.1352, 'grad_norm': 0.876096414795378, 'learning_rate': 3.4366843914453774e-06, 'epoch': 0.61} 61%|██████▏ | 21019/34278 [23:08:39<15:59:14, 4.34s/it] 61%|██████▏ | 21020/34278 [23:08:45<17:28:41, 4.75s/it] {'loss': 0.1604, 'grad_norm': 0.8369573559671274, 'learning_rate': 3.436235650188659e-06, 'epoch': 0.61} 61%|██████▏ | 21020/34278 [23:08:45<17:28:41, 4.75s/it] 61%|██████▏ | 21021/34278 [23:08:51<18:45:46, 5.10s/it] {'loss': 0.105, 'grad_norm': 0.710377594669661, 'learning_rate': 3.4357869228928553e-06, 'epoch': 0.61} 61%|██████▏ | 21021/34278 [23:08:51<18:45:46, 5.10s/it] 61%|██████▏ | 21022/34278 [23:08:55<17:47:09, 4.83s/it] {'loss': 0.1229, 'grad_norm': 0.7724533961278669, 'learning_rate': 3.4353382095619737e-06, 'epoch': 0.61} 61%|██████▏ | 21022/34278 [23:08:55<17:47:09, 4.83s/it] 61%|██████▏ | 21023/34278 [23:08:59<16:10:13, 4.39s/it] {'loss': 0.1367, 'grad_norm': 0.9649071735315539, 'learning_rate': 3.4348895102000173e-06, 'epoch': 0.61} 61%|██████▏ | 21023/34278 [23:08:59<16:10:13, 4.39s/it] 61%|██████▏ | 21024/34278 [23:09:02<14:44:08, 4.00s/it] {'loss': 0.12, 'grad_norm': 0.9883881679299172, 'learning_rate': 3.4344408248109933e-06, 'epoch': 0.61} 61%|██████▏ | 21024/34278 [23:09:02<14:44:08, 4.00s/it] 61%|██████▏ | 21025/34278 [23:09:05<14:06:32, 3.83s/it] {'loss': 0.1059, 'grad_norm': 0.7613585357562115, 'learning_rate': 3.4339921533989083e-06, 'epoch': 0.61} 61%|██████▏ | 21025/34278 [23:09:05<14:06:32, 3.83s/it] 61%|██████▏ | 21026/34278 [23:09:08<13:26:58, 3.65s/it] {'loss': 0.1124, 'grad_norm': 
0.6464953118660361, 'learning_rate': 3.4335434959677683e-06, 'epoch': 0.61} 61%|██████▏ | 21026/34278 [23:09:08<13:26:58, 3.65s/it] 61%|██████▏ | 21027/34278 [23:09:12<13:13:46, 3.59s/it] {'loss': 0.1146, 'grad_norm': 0.767443712319199, 'learning_rate': 3.433094852521579e-06, 'epoch': 0.61} 61%|██████▏ | 21027/34278 [23:09:12<13:13:46, 3.59s/it] 61%|██████▏ | 21028/34278 [23:09:15<12:46:23, 3.47s/it] {'loss': 0.1302, 'grad_norm': 1.2296710589919633, 'learning_rate': 3.4326462230643436e-06, 'epoch': 0.61} 61%|██████▏ | 21028/34278 [23:09:15<12:46:23, 3.47s/it] 61%|██████▏ | 21029/34278 [23:09:19<12:51:47, 3.50s/it] {'loss': 0.1252, 'grad_norm': 0.7219022973250488, 'learning_rate': 3.4321976076000685e-06, 'epoch': 0.61} 61%|██████▏ | 21029/34278 [23:09:19<12:51:47, 3.50s/it] 61%|██████▏ | 21030/34278 [23:09:24<15:34:19, 4.23s/it] {'loss': 0.111, 'grad_norm': 0.837349692783926, 'learning_rate': 3.431749006132758e-06, 'epoch': 0.61} 61%|██████▏ | 21030/34278 [23:09:24<15:34:19, 4.23s/it] 61%|██████▏ | 21031/34278 [23:09:29<15:54:02, 4.32s/it] {'loss': 0.1386, 'grad_norm': 0.7466805912168603, 'learning_rate': 3.431300418666419e-06, 'epoch': 0.61} 61%|██████▏ | 21031/34278 [23:09:29<15:54:02, 4.32s/it] 61%|██████▏ | 21032/34278 [23:09:32<14:52:53, 4.04s/it] {'loss': 0.1016, 'grad_norm': 0.8062874915350087, 'learning_rate': 3.4308518452050567e-06, 'epoch': 0.61} 61%|██████▏ | 21032/34278 [23:09:32<14:52:53, 4.04s/it] 61%|██████▏ | 21033/34278 [23:09:38<16:34:19, 4.50s/it] {'loss': 0.1344, 'grad_norm': 0.8428077027826459, 'learning_rate': 3.4304032857526724e-06, 'epoch': 0.61} 61%|██████▏ | 21033/34278 [23:09:38<16:34:19, 4.50s/it] 61%|██████▏ | 21034/34278 [23:09:41<14:43:43, 4.00s/it] {'loss': 0.137, 'grad_norm': 1.1384267038198153, 'learning_rate': 3.4299547403132738e-06, 'epoch': 0.61} 61%|██████▏ | 21034/34278 [23:09:41<14:43:43, 4.00s/it] 61%|██████▏ | 21035/34278 [23:09:47<16:58:50, 4.62s/it] {'loss': 0.1286, 'grad_norm': 0.7902375493699365, 'learning_rate': 
[Condensed training-progress log for this window: steps 21035–21219 of 34278 (epoch 0.61 → 0.62, crossing at step 21081), wall clock 23:09:47–23:21:34 elapsed, ~3.2–5.2 s/it. Duplicate tqdm redraw lines removed; per-step records retained in summary form. The final record (step 21219) is truncated mid-entry in the source, so ranges below cover the complete entries (21036–21218).]
loss: fluctuating between 0.0903 (step 21157) and 0.1575 (step 21201), no clear trend within this window.
grad_norm: mostly in the 0.56–1.39 range; a single spike of 2.5583 at step 21130.
learning_rate: decaying smoothly from 3.4290576914894473e-06 (step 21036) to 3.347665100076086e-06 (step 21218); step 21035's record ends with learning_rate 3.4295062088908652e-06.
Non-fatal warning emitted twice in this window (after steps 21122 and 21182):
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
3.3472192158026296e-06, 'epoch': 0.62} 62%|██████▏ | 21219/34278 [23:21:34<13:50:06, 3.81s/it] 62%|██████▏ | 21220/34278 [23:21:37<13:04:47, 3.61s/it] {'loss': 0.1188, 'grad_norm': 0.7447196134231505, 'learning_rate': 3.3467733462848063e-06, 'epoch': 0.62} 62%|██████▏ | 21220/34278 [23:21:37<13:04:47, 3.61s/it] 62%|██████▏ | 21221/34278 [23:21:42<14:08:57, 3.90s/it] {'loss': 0.1132, 'grad_norm': 0.7591857929875219, 'learning_rate': 3.3463274915265935e-06, 'epoch': 0.62} 62%|██████▏ | 21221/34278 [23:21:42<14:08:57, 3.90s/it] 62%|██████▏ | 21222/34278 [23:21:45<13:34:50, 3.74s/it] {'loss': 0.124, 'grad_norm': 0.6873927769309279, 'learning_rate': 3.3458816515319753e-06, 'epoch': 0.62} 62%|██████▏ | 21222/34278 [23:21:45<13:34:50, 3.74s/it] 62%|██████▏ | 21223/34278 [23:21:48<12:36:26, 3.48s/it] {'loss': 0.1254, 'grad_norm': 0.9018772086631418, 'learning_rate': 3.345435826304931e-06, 'epoch': 0.62} 62%|██████▏ | 21223/34278 [23:21:48<12:36:26, 3.48s/it] 62%|██████▏ | 21224/34278 [23:21:51<11:49:38, 3.26s/it] {'loss': 0.1257, 'grad_norm': 0.8982515765955048, 'learning_rate': 3.3449900158494407e-06, 'epoch': 0.62} 62%|██████▏ | 21224/34278 [23:21:51<11:49:38, 3.26s/it] 62%|██████▏ | 21225/34278 [23:21:54<11:27:33, 3.16s/it] {'loss': 0.1203, 'grad_norm': 0.8706524419812917, 'learning_rate': 3.3445442201694843e-06, 'epoch': 0.62} 62%|██████▏ | 21225/34278 [23:21:54<11:27:33, 3.16s/it] 62%|██████▏ | 21226/34278 [23:22:00<14:31:22, 4.01s/it] {'loss': 0.1138, 'grad_norm': 0.7358970491922977, 'learning_rate': 3.3440984392690425e-06, 'epoch': 0.62} 62%|██████▏ | 21226/34278 [23:22:00<14:31:22, 4.01s/it] 62%|██████▏ | 21227/34278 [23:22:03<13:18:48, 3.67s/it] {'loss': 0.1222, 'grad_norm': 1.0222237823949332, 'learning_rate': 3.3436526731520924e-06, 'epoch': 0.62} 62%|██████▏ | 21227/34278 [23:22:03<13:18:48, 3.67s/it] 62%|██████▏ | 21228/34278 [23:22:05<12:17:02, 3.39s/it] {'loss': 0.1106, 'grad_norm': 0.8041844508813587, 'learning_rate': 3.3432069218226173e-06, 'epoch': 0.62} 
62%|██████▏ | 21229/34278 [23:22:09<12:27:50, 3.44s/it] {'loss': 0.1641, 'grad_norm': 0.9842227525447131, 'learning_rate': 3.3427611852845964e-06, 'epoch': 0.62}
62%|██████▏ | 21230/34278 [23:22:12<11:52:38, 3.28s/it] {'loss': 0.1113, 'grad_norm': 0.8740283958603253, 'learning_rate': 3.3423154635420075e-06, 'epoch': 0.62}
62%|██████▏ | 21231/34278 [23:22:15<11:53:28, 3.28s/it] {'loss': 0.1402, 'grad_norm': 0.9697856100686789, 'learning_rate': 3.341869756598829e-06, 'epoch': 0.62}
62%|██████▏ | 21232/34278 [23:22:18<11:19:17, 3.12s/it] {'loss': 0.1236, 'grad_norm': 0.9156001557062314, 'learning_rate': 3.3414240644590435e-06, 'epoch': 0.62}
62%|██████▏ | 21233/34278 [23:22:22<12:56:06, 3.57s/it] {'loss': 0.1204, 'grad_norm': 0.8168858686619196, 'learning_rate': 3.340978387126625e-06, 'epoch': 0.62}
62%|██████▏ | 21234/34278 [23:22:26<12:47:26, 3.53s/it] {'loss': 0.1083, 'grad_norm': 0.8154387843613505, 'learning_rate': 3.3405327246055584e-06, 'epoch': 0.62}
62%|██████▏ | 21235/34278 [23:22:28<11:50:29, 3.27s/it] {'loss': 0.135, 'grad_norm': 0.9627968238583299, 'learning_rate': 3.3400870768998185e-06, 'epoch': 0.62}
62%|██████▏ | 21236/34278 [23:22:32<11:40:20, 3.22s/it] {'loss': 0.1201, 'grad_norm': 0.8056921129051398, 'learning_rate': 3.3396414440133846e-06, 'epoch': 0.62}
62%|██████▏ | 21237/34278 [23:22:38<14:46:31, 4.08s/it] {'loss': 0.1071, 'grad_norm': 0.7033696684907796, 'learning_rate': 3.3391958259502364e-06, 'epoch': 0.62}
62%|██████▏ | 21238/34278 [23:22:44<16:57:21, 4.68s/it] {'loss': 0.131, 'grad_norm': 1.1705868548999783, 'learning_rate': 3.338750222714351e-06, 'epoch': 0.62}
62%|██████▏ | 21239/34278 [23:22:48<16:40:37, 4.60s/it] {'loss': 0.1141, 'grad_norm': 0.8836863283166284, 'learning_rate': 3.3383046343097057e-06, 'epoch': 0.62}
62%|██████▏ | 21240/34278 [23:22:51<14:58:01, 4.13s/it] {'loss': 0.122, 'grad_norm': 0.866811109363582, 'learning_rate': 3.3378590607402805e-06, 'epoch': 0.62}
62%|██████▏ | 21241/34278 [23:22:55<14:19:24, 3.96s/it] {'loss': 0.1204, 'grad_norm': 1.1313283525712725, 'learning_rate': 3.337413502010054e-06, 'epoch': 0.62}
62%|██████▏ | 21242/34278 [23:23:01<16:32:27, 4.57s/it] {'loss': 0.1095, 'grad_norm': 1.2772787645472965, 'learning_rate': 3.336967958123003e-06, 'epoch': 0.62}
62%|██████▏ | 21243/34278 [23:23:04<15:02:58, 4.16s/it] {'loss': 0.1003, 'grad_norm': 0.8394017627605113, 'learning_rate': 3.3365224290831046e-06, 'epoch': 0.62}
62%|██████▏ | 21244/34278 [23:23:08<14:27:10, 3.99s/it] {'loss': 0.1161, 'grad_norm': 0.8194341602790984, 'learning_rate': 3.336076914894336e-06, 'epoch': 0.62}
62%|██████▏ | 21245/34278 [23:23:12<14:39:28, 4.05s/it] {'loss': 0.1518, 'grad_norm': 0.9484147882504607, 'learning_rate': 3.335631415560675e-06, 'epoch': 0.62}
62%|██████▏ | 21246/34278 [23:23:15<13:34:37, 3.75s/it] {'loss': 0.1106, 'grad_norm': 0.6960903804747017, 'learning_rate': 3.3351859310861002e-06, 'epoch': 0.62}
62%|██████▏ | 21247/34278 [23:23:19<13:33:52, 3.75s/it] {'loss': 0.1057, 'grad_norm': 0.7394339730995685, 'learning_rate': 3.3347404614745893e-06, 'epoch': 0.62}
62%|██████▏ | 21248/34278 [23:23:21<12:36:57, 3.49s/it] {'loss': 0.1243, 'grad_norm': 0.9504018079113686, 'learning_rate': 3.3342950067301173e-06, 'epoch': 0.62}
62%|██████▏ | 21249/34278 [23:23:26<13:30:37, 3.73s/it] {'loss': 0.1422, 'grad_norm': 0.9583044532792427, 'learning_rate': 3.3338495668566614e-06, 'epoch': 0.62}
62%|██████▏ | 21250/34278 [23:23:29<12:59:05, 3.59s/it] {'loss': 0.1055, 'grad_norm': 0.8205100638906588, 'learning_rate': 3.3334041418581996e-06, 'epoch': 0.62}
62%|██████▏ | 21251/34278 [23:23:35<15:35:17, 4.31s/it] {'loss': 0.1338, 'grad_norm': 0.9523776220684347, 'learning_rate': 3.332958731738706e-06, 'epoch': 0.62}
62%|██████▏ | 21252/34278 [23:23:38<14:38:57, 4.05s/it] {'loss': 0.1141, 'grad_norm': 0.7827706205399767, 'learning_rate': 3.33251333650216e-06, 'epoch': 0.62}
62%|██████▏ | 21253/34278 [23:23:44<16:08:57, 4.46s/it] {'loss': 0.1217, 'grad_norm': 1.0623234776400148, 'learning_rate': 3.332067956152537e-06, 'epoch': 0.62}
62%|██████▏ | 21254/34278 [23:23:47<14:34:40, 4.03s/it] {'loss': 0.1231, 'grad_norm': 0.8327085585583657, 'learning_rate': 3.3316225906938136e-06, 'epoch': 0.62}
62%|██████▏ | 21255/34278 [23:23:50<13:49:58, 3.82s/it] {'loss': 0.1515, 'grad_norm': 1.0833418036181328, 'learning_rate': 3.3311772401299645e-06, 'epoch': 0.62}
62%|██████▏ | 21256/34278 [23:23:53<13:00:28, 3.60s/it] {'loss': 0.1241, 'grad_norm': 0.9818978491360845, 'learning_rate': 3.3307319044649663e-06, 'epoch': 0.62}
62%|██████▏ | 21257/34278 [23:23:59<15:38:37, 4.33s/it] {'loss': 0.1163, 'grad_norm': 1.0891961690935306, 'learning_rate': 3.3302865837027954e-06, 'epoch': 0.62}
62%|██████▏ | 21258/34278 [23:24:03<14:59:50, 4.15s/it] {'loss': 0.1102, 'grad_norm': 0.8342598441064556, 'learning_rate': 3.3298412778474277e-06, 'epoch': 0.62}
62%|██████▏ | 21259/34278 [23:24:07<14:23:18, 3.98s/it] {'loss': 0.1105, 'grad_norm': 0.9643015121715223, 'learning_rate': 3.329395986902839e-06, 'epoch': 0.62}
62%|██████▏ | 21260/34278 [23:24:13<16:42:03, 4.62s/it] {'loss': 0.1168, 'grad_norm': 1.1013596414539768, 'learning_rate': 3.3289507108730033e-06, 'epoch': 0.62}
62%|██████▏ | 21261/34278 [23:24:19<18:09:34, 5.02s/it] {'loss': 0.139, 'grad_norm': 0.9029074386464252, 'learning_rate': 3.3285054497618974e-06, 'epoch': 0.62}
62%|██████▏ | 21262/34278 [23:24:22<16:11:25, 4.48s/it] {'loss': 0.106, 'grad_norm': 0.7658185436620013, 'learning_rate': 3.3280602035734944e-06, 'epoch': 0.62}
62%|██████▏ | 21263/34278 [23:24:25<14:27:47, 4.00s/it] {'loss': 0.1107, 'grad_norm': 0.9713398445467478, 'learning_rate': 3.327614972311771e-06, 'epoch': 0.62}
62%|██████▏ | 21264/34278 [23:24:28<14:02:34, 3.88s/it] {'loss': 0.1202, 'grad_norm': 0.8215817782423971, 'learning_rate': 3.3271697559807042e-06, 'epoch': 0.62}
62%|██████▏ | 21265/34278 [23:24:32<13:38:21, 3.77s/it] {'loss': 0.1597, 'grad_norm': 0.8677790075313323, 'learning_rate': 3.3267245545842653e-06, 'epoch': 0.62}
62%|██████▏ | 21266/34278 [23:24:35<12:41:21, 3.51s/it] {'loss': 0.1151, 'grad_norm': 0.8700063377829749, 'learning_rate': 3.3262793681264293e-06, 'epoch': 0.62}
62%|██████▏ | 21267/34278 [23:24:38<12:19:45, 3.41s/it] {'loss': 0.1317, 'grad_norm': 0.7790501776389437, 'learning_rate': 3.3258341966111728e-06, 'epoch': 0.62}
62%|██████▏ | 21268/34278 [23:24:42<12:30:15, 3.46s/it] {'loss': 0.1339, 'grad_norm': 0.94202255518554, 'learning_rate': 3.325389040042466e-06, 'epoch': 0.62}
62%|██████▏ | 21269/34278 [23:24:46<13:02:02, 3.61s/it] {'loss': 0.1148, 'grad_norm': 0.7582691685828977, 'learning_rate': 3.3249438984242893e-06, 'epoch': 0.62}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn(
62%|██████▏ | 21270/34278 [23:24:49<12:29:50, 3.46s/it] {'loss': 0.1152, 'grad_norm': 0.8749013701005308, 'learning_rate': 3.3244987717606127e-06, 'epoch': 0.62}
62%|██████▏ | 21271/34278 [23:24:52<12:02:27, 3.33s/it] {'loss': 0.1208, 'grad_norm': 0.7694512091733933, 'learning_rate': 3.324053660055411e-06, 'epoch': 0.62}
62%|██████▏ | 21272/34278 [23:24:55<11:51:41, 3.28s/it] {'loss': 0.1371, 'grad_norm': 0.8421991445483307, 'learning_rate': 3.3236085633126586e-06, 'epoch': 0.62}
62%|██████▏ | 21273/34278 [23:24:58<11:40:37, 3.23s/it] {'loss': 0.1303, 'grad_norm': 0.7464571484475591, 'learning_rate': 3.323163481536328e-06, 'epoch': 0.62}
62%|██████▏ | 21274/34278 [23:25:01<11:14:59, 3.11s/it] {'loss': 0.1206, 'grad_norm': 0.8140424366889636, 'learning_rate': 3.3227184147303928e-06, 'epoch': 0.62}
62%|██████▏ | 21275/34278 [23:25:04<11:02:14, 3.06s/it] {'loss': 0.1178, 'grad_norm': 0.9722149153286191, 'learning_rate': 3.322273362898828e-06, 'epoch': 0.62}
62%|██████▏ | 21276/34278 [23:25:07<11:45:48, 3.26s/it] {'loss': 0.1167, 'grad_norm': 0.7490105420493265, 'learning_rate': 3.3218283260456065e-06, 'epoch': 0.62}
62%|██████▏ | 21277/34278 [23:25:11<11:56:37, 3.31s/it] {'loss': 0.1364, 'grad_norm': 0.907949776745723, 'learning_rate': 3.321383304174702e-06, 'epoch': 0.62}
62%|██████▏ | 21278/34278 [23:25:14<11:40:52, 3.23s/it] {'loss': 0.1187, 'grad_norm': 0.9494358160912708, 'learning_rate': 3.320938297290085e-06, 'epoch': 0.62}
62%|██████▏ | 21279/34278 [23:25:17<11:45:27, 3.26s/it] {'loss': 0.1292, 'grad_norm': 1.0470637100780809, 'learning_rate': 3.3204933053957312e-06, 'epoch': 0.62}
62%|██████▏ | 21280/34278 [23:25:20<11:42:29, 3.24s/it] {'loss': 0.1249, 'grad_norm': 0.8368251737513249, 'learning_rate': 3.32004832849561e-06, 'epoch': 0.62}
62%|██████▏ | 21281/34278 [23:25:24<11:36:35, 3.22s/it] {'loss': 0.1316, 'grad_norm': 1.0894381658191175, 'learning_rate': 3.319603366593699e-06, 'epoch': 0.62}
62%|██████▏ | 21282/34278 [23:25:27<11:16:41, 3.12s/it] {'loss': 0.149, 'grad_norm': 1.053502910510657, 'learning_rate': 3.3191584196939664e-06, 'epoch': 0.62}
62%|██████▏ | 21283/34278 [23:25:31<12:21:39, 3.42s/it] {'loss': 0.139, 'grad_norm': 0.7907878805400254, 'learning_rate': 3.318713487800387e-06, 'epoch': 0.62}
62%|██████▏ | 21284/34278 [23:25:34<12:44:28, 3.53s/it] {'loss': 0.1359, 'grad_norm': 0.806438614256576, 'learning_rate': 3.318268570916933e-06, 'epoch': 0.62}
62%|██████▏ | 21285/34278 [23:25:38<12:37:08, 3.50s/it] {'loss': 0.114, 'grad_norm': 0.8779660143369934, 'learning_rate': 3.317823669047574e-06, 'epoch': 0.62}
62%|██████▏ | 21286/34278 [23:25:41<12:21:00, 3.42s/it] {'loss': 0.1013, 'grad_norm': 0.780236913047683, 'learning_rate': 3.3173787821962835e-06, 'epoch': 0.62}
62%|██████▏ | 21287/34278 [23:25:45<12:21:38, 3.43s/it] {'loss': 0.1441, 'grad_norm': 0.7286366102868524, 'learning_rate': 3.3169339103670346e-06, 'epoch': 0.62}
62%|██████▏ | 21288/34278 [23:25:51<15:20:59, 4.25s/it] {'loss': 0.1417, 'grad_norm': 0.8642237540419939, 'learning_rate': 3.3164890535637973e-06, 'epoch': 0.62}
62%|██████▏ | 21289/34278 [23:25:54<14:00:40, 3.88s/it] {'loss': 0.1071, 'grad_norm': 1.0161236822646509, 'learning_rate': 3.3160442117905457e-06, 'epoch': 0.62}
62%|██████▏ | 21290/34278 [23:25:57<13:03:58, 3.62s/it] {'loss': 0.1363, 'grad_norm': 0.7800651069117587, 'learning_rate': 3.315599385051248e-06, 'epoch': 0.62}
62%|██████▏ | 21291/34278 [23:26:00<13:01:02, 3.61s/it] {'loss': 0.1266, 'grad_norm': 0.789159023649561, 'learning_rate': 3.315154573349877e-06, 'epoch': 0.62}
62%|██████▏ | 21292/34278 [23:26:04<13:27:32, 3.73s/it] {'loss': 0.1096, 'grad_norm': 0.7680218715126719, 'learning_rate': 3.3147097766904023e-06, 'epoch': 0.62}
62%|██████▏ | 21293/34278 [23:26:08<13:01:28, 3.61s/it] {'loss': 0.1409, 'grad_norm': 0.8466641849015817, 'learning_rate': 3.314264995076798e-06, 'epoch': 0.62}
62%|██████▏ | 21294/34278 [23:26:11<12:42:06, 3.52s/it] {'loss': 0.1142, 'grad_norm': 0.6803673970957486, 'learning_rate': 3.313820228513034e-06, 'epoch': 0.62}
62%|██████▏ | 21295/34278 [23:26:17<15:00:57, 4.16s/it] {'loss': 0.1339, 'grad_norm': 1.0595812562162283, 'learning_rate': 3.313375477003079e-06, 'epoch': 0.62}
62%|██████▏ | 21296/34278 [23:26:20<13:45:51, 3.82s/it] {'loss': 0.1323, 'grad_norm': 0.8386605136996219, 'learning_rate': 3.3129307405509058e-06, 'epoch': 0.62}
62%|██████▏ | 21297/34278 [23:26:25<15:06:08, 4.19s/it] {'loss': 0.1453, 'grad_norm': 0.8007409591229493, 'learning_rate': 3.312486019160486e-06, 'epoch': 0.62}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn(
62%|██████▏ | 21298/34278 [23:26:28<13:46:58, 3.82s/it] {'loss': 0.1407, 'grad_norm': 0.8800377461827983, 'learning_rate': 3.3120413128357837e-06, 'epoch': 0.62}
62%|██████▏ | 21299/34278 [23:26:31<13:31:40, 3.75s/it] {'loss': 0.1032, 'grad_norm': 0.7508978002471144, 'learning_rate': 3.311596621580777e-06, 'epoch': 0.62}
62%|██████▏ | 21300/34278 [23:26:34<12:40:00, 3.51s/it] {'loss': 0.1106, 'grad_norm': 0.6572251173652115, 'learning_rate': 3.311151945399432e-06, 'epoch': 0.62}
62%|██████▏ | 21301/34278 [23:26:38<12:31:40, 3.48s/it] {'loss': 0.129, 'grad_norm': 0.8964986549543154, 'learning_rate': 3.3107072842957188e-06, 'epoch': 0.62}
62%|██████▏ | 21302/34278 [23:26:41<12:06:37, 3.36s/it] {'loss': 0.1428, 'grad_norm': 1.0203596500690153, 'learning_rate': 3.310262638273609e-06, 'epoch': 0.62}
62%|██████▏ | 21303/34278 [23:26:43<11:26:42, 3.18s/it] {'loss': 0.1242, 'grad_norm': 0.9062871790989347, 'learning_rate': 3.3098180073370702e-06, 'epoch': 0.62}
62%|██████▏ | 21304/34278 [23:26:49<14:14:25, 3.95s/it] {'loss': 0.1277, 'grad_norm': 0.892411289294688, 'learning_rate': 3.309373391490072e-06, 'epoch': 0.62}
62%|██████▏ | 21305/34278 [23:26:53<14:00:49, 3.89s/it] {'loss': 0.1387, 'grad_norm': 0.8217260832629453, 'learning_rate': 3.3089287907365848e-06, 'epoch': 0.62}
62%|██████▏ | 21306/34278 [23:26:56<13:11:49, 3.66s/it] {'loss': 0.1206, 'grad_norm': 0.9979695852189724, 'learning_rate': 3.3084842050805778e-06, 'epoch': 0.62}
62%|██████▏ | 21307/34278 [23:27:02<15:30:26, 4.30s/it] {'loss': 0.0993, 'grad_norm': 0.9104752937405064, 'learning_rate': 3.3080396345260213e-06, 'epoch': 0.62}
62%|██████▏ | 21308/34278 [23:27:07<16:14:16, 4.51s/it] {'loss': 0.1326, 'grad_norm': 0.7880754227526585, 'learning_rate': 3.3075950790768817e-06, 'epoch': 0.62}
62%|██████▏ | 21309/34278 [23:27:13<18:21:49, 5.10s/it] {'loss': 0.1217, 'grad_norm': 0.8896441449057885, 'learning_rate': 3.3071505387371294e-06, 'epoch': 0.62}
62%|██████▏ | 21310/34278 [23:27:17<16:57:01, 4.71s/it] {'loss': 0.1301, 'grad_norm': 0.9020034823921245, 'learning_rate': 3.306706013510732e-06, 'epoch': 0.62}
62%|██████▏ | 21311/34278 [23:27:21<15:43:10, 4.36s/it] {'loss': 0.1202, 'grad_norm': 0.7829725279898587, 'learning_rate': 3.306261503401661e-06, 'epoch': 0.62}
62%|██████▏ | 21312/34278 [23:27:24<14:38:19, 4.06s/it] {'loss': 0.1325, 'grad_norm': 1.0006266652654363, 'learning_rate': 3.3058170084138824e-06, 'epoch': 0.62}
62%|██████▏ | 21313/34278 [23:27:27<13:38:53, 3.79s/it] {'loss': 0.1383, 'grad_norm': 0.8483983545262735, 'learning_rate': 3.305372528551365e-06, 'epoch': 0.62}
62%|██████▏ | 21314/34278 [23:27:30<12:42:28, 3.53s/it] {'loss': 0.1056, 'grad_norm': 0.7990302783664022, 'learning_rate': 3.304928063818078e-06, 'epoch': 0.62}
62%|██████▏ | 21315/34278 [23:27:33<12:16:44, 3.41s/it] {'loss': 0.1365, 'grad_norm': 1.3404300346733664, 'learning_rate': 3.304483614217987e-06, 'epoch': 0.62}
62%|██████▏ | 21316/34278 [23:27:37<12:59:17, 3.61s/it] {'loss': 0.1272, 'grad_norm': 0.8902200862550608, 'learning_rate': 3.304039179755061e-06, 'epoch': 0.62}
62%|██████▏ | 21317/34278 [23:27:40<12:31:52, 3.48s/it] {'loss': 0.1177, 'grad_norm': 0.9414809876283852, 'learning_rate': 3.3035947604332697e-06, 'epoch': 0.62}
62%|██████▏ | 21318/34278 [23:27:44<12:55:44, 3.59s/it] {'loss': 0.121, 'grad_norm': 0.8641232348744955, 'learning_rate': 3.3031503562565793e-06, 'epoch': 0.62}
62%|██████▏ | 21319/34278 [23:27:48<12:30:33, 3.48s/it] {'loss': 0.1243, 'grad_norm': 0.8492757587837996, 'learning_rate': 3.302705967228958e-06, 'epoch': 0.62}
62%|██████▏ | 21320/34278 [23:27:50<11:54:36, 3.31s/it] {'loss': 0.1139, 'grad_norm': 0.675696365679319, 'learning_rate': 3.3022615933543724e-06, 'epoch': 0.62}
62%|██████▏ | 21321/34278 [23:27:54<11:38:44, 3.24s/it] {'loss': 0.1551, 'grad_norm': 0.9470120382955522, 'learning_rate': 3.3018172346367896e-06, 'epoch': 0.62}
62%|██████▏ | 21322/34278 [23:27:57<11:42:13, 3.25s/it] {'loss': 0.1141, 'grad_norm': 1.0350572461555476, 'learning_rate': 3.3013728910801758e-06, 'epoch': 0.62}
62%|██████▏ | 21323/34278 [23:28:00<11:50:11, 3.29s/it] {'loss': 0.1136, 'grad_norm': 0.686469596550152, 'learning_rate': 3.3009285626885002e-06, 'epoch': 0.62}
62%|██████▏ | 21324/34278 [23:28:03<11:23:16, 3.16s/it] {'loss': 0.1491, 'grad_norm': 0.7361842123082664, 'learning_rate': 3.3004842494657304e-06, 'epoch': 0.62}
62%|██████▏ | 21325/34278 [23:28:06<10:59:57, 3.06s/it] {'loss': 0.122, 'grad_norm': 0.9547261361657784, 'learning_rate': 3.30003995141583e-06, 'epoch': 0.62}
62%|██████▏ | 21326/34278 [23:28:10<11:37:28, 3.23s/it] {'loss': 0.1406, 'grad_norm': 0.7418964773658194, 'learning_rate': 3.299595668542768e-06, 'epoch': 0.62}
62%|██████▏ | 21327/34278 [23:28:12<11:16:25, 3.13s/it] {'loss': 0.1337, 'grad_norm': 0.7739919750428765, 'learning_rate': 3.29915140085051e-06, 'epoch': 0.62}
62%|██████▏ | 21328/34278 [23:28:16<11:35:46, 3.22s/it] {'loss': 0.1083, 'grad_norm': 0.7858944115116288, 'learning_rate': 3.2987071483430195e-06, 'epoch': 0.62}
62%|██████▏ | 21329/34278 [23:28:19<11:44:59, 3.27s/it] {'loss': 0.1297, 'grad_norm': 1.0061417646637059, 'learning_rate': 3.298262911024269e-06, 'epoch': 0.62}
62%|██████▏ | 21330/34278 [23:28:22<11:39:42, 3.24s/it] {'loss': 0.1171, 'grad_norm': 0.6176665245333957, 'learning_rate': 3.2978186888982188e-06, 'epoch': 0.62}
62%|██████▏ | 21331/34278 [23:28:27<12:56:07, 3.60s/it] {'loss': 0.1376, 'grad_norm': 0.7673620279695847, 'learning_rate': 3.297374481968838e-06, 'epoch': 0.62}
62%|██████▏ | 21332/34278 [23:28:30<12:56:45, 3.60s/it] {'loss': 0.1146, 'grad_norm': 1.0690542554056024, 'learning_rate': 3.2969302902400925e-06, 'epoch': 0.62}
62%|██████▏ | 21333/34278 [23:28:34<12:22:20, 3.44s/it] {'loss': 0.1087, 'grad_norm': 0.7159491735092279, 'learning_rate': 3.2964861137159453e-06, 'epoch': 0.62}
62%|██████▏ | 21334/34278 [23:28:37<12:49:27, 3.57s/it] {'loss': 0.112, 'grad_norm': 0.8175363148488521, 'learning_rate': 3.296041952400363e-06, 'epoch': 0.62}
62%|██████▏ | 21335/34278 [23:28:41<12:39:25, 3.52s/it] {'loss': 0.1317, 'grad_norm': 0.8295698694515653, 'learning_rate': 3.2955978062973117e-06, 'epoch': 0.62}
62%|██████▏ | 21336/34278 [23:28:44<12:25:21, 3.46s/it] {'loss': 0.1245, 'grad_norm': 0.9168800836975299, 'learning_rate': 3.295153675410756e-06, 'epoch': 0.62}
62%|██████▏ | 21337/34278 [23:28:47<11:57:55, 3.33s/it] {'loss': 0.1397, 'grad_norm': 0.9645563310931667, 'learning_rate': 3.294709559744663e-06, 'epoch': 0.62}
62%|██████▏ | 21338/34278 [23:28:50<11:46:13, 3.27s/it] {'loss': 0.1266, 'grad_norm': 0.8367384801046545, 'learning_rate': 3.2942654593029957e-06, 'epoch': 0.62}
62%|██████▏ | 21339/34278 [23:28:54<11:45:48, 3.27s/it] {'loss': 0.1085, 'grad_norm': 0.7889492779134213, 'learning_rate': 3.2938213740897173e-06, 'epoch': 0.62}
62%|██████▏ | 21340/34278 [23:29:00<14:43:16, 4.10s/it] {'loss': 0.1594, 'grad_norm': 0.8859241154679921, 'learning_rate': 3.2933773041087945e-06, 'epoch': 0.62}
62%|██████▏ | 21341/34278 [23:29:03<13:32:03, 3.77s/it] {'loss': 0.1378, 'grad_norm': 0.8246678630982585, 'learning_rate': 3.292933249364194e-06, 'epoch': 0.62}
62%|██████▏ | 21342/34278 [23:29:05<12:37:40, 3.51s/it] {'loss': 0.1151, 'grad_norm': 0.9921870326117046, 'learning_rate': 3.2924892098598765e-06, 'epoch': 0.62}
62%|██████▏ | 21343/34278 [23:29:09<12:16:11, 3.41s/it] {'loss': 0.1186, 'grad_norm': 0.8903747923174103, 'learning_rate': 3.292045185599808e-06, 'epoch': 0.62}
62%|██████▏ | 21344/34278 [23:29:12<11:46:05, 3.28s/it] {'loss': 0.1235, 'grad_norm': 1.0347288106601755, 'learning_rate': 3.291601176587953e-06, 'epoch': 0.62}
62%|██████▏ | 21345/34278 [23:29:15<11:36:02, 3.23s/it] {'loss': 0.1479, 'grad_norm': 0.9944202953824731, 'learning_rate': 3.291157182828274e-06, 'epoch': 0.62}
62%|██████▏ | 21346/34278 [23:29:19<12:12:56, 3.40s/it] {'loss': 0.1128, 'grad_norm': 0.8423356634400826, 'learning_rate': 3.290713204324735e-06, 'epoch': 0.62}
62%|██████▏ | 21347/34278 [23:29:22<11:55:49, 3.32s/it] {'loss': 0.1097, 'grad_norm': 0.7917008469805683, 'learning_rate': 3.290269241081301e-06, 'epoch': 0.62}
62%|██████▏ | 21348/34278 [23:29:24<11:21:48, 3.16s/it] {'loss': 0.1379, 'grad_norm': 0.7328504510975807, 'learning_rate': 3.2898252931019353e-06, 'epoch': 0.62}
62%|██████▏ | 21349/34278 [23:29:28<11:17:21, 3.14s/it] {'loss': 0.1268, 'grad_norm': 1.1886418462196109, 'learning_rate': 3.289381360390602e-06, 'epoch': 0.62}
62%|██████▏ | 21350/34278 [23:29:32<12:42:02, 3.54s/it] {'loss': 0.1191, 'grad_norm': 0.9171882169636048, 'learning_rate': 3.2889374429512625e-06, 'epoch': 0.62}
62%|██████▏ | 21351/34278 [23:29:35<12:14:10, 3.41s/it] {'loss': 0.128, 'grad_norm': 0.8192435164398073, 'learning_rate': 3.2884935407878815e-06, 'epoch': 0.62}
62%|██████▏ | 21352/34278 [23:29:38<11:50:00, 3.30s/it] {'loss': 0.1404, 'grad_norm': 0.9002291269705648, 'learning_rate': 3.2880496539044204e-06, 'epoch': 0.62}
62%|██████▏ | 21353/34278 [23:29:42<12:00:03, 3.34s/it] {'loss': 0.1265, 'grad_norm': 1.4768307983113431, 'learning_rate': 3.287605782304844e-06, 'epoch': 0.62}
62%|██████▏ | 21354/34278 [23:29:46<12:41:47, 3.54s/it] {'loss': 0.1306, 'grad_norm': 0.9693148724283404, 'learning_rate': 3.2871619259931155e-06, 'epoch': 0.62}
62%|██████▏ | 21355/34278 [23:29:51<14:10:43, 3.95s/it] {'loss': 0.1393, 'grad_norm': 1.0424101658228286, 'learning_rate': 3.286718084973196e-06, 'epoch': 0.62}
62%|██████▏ | 21356/34278 [23:29:54<13:10:40, 3.67s/it] {'loss': 0.1319, 'grad_norm': 1.0124755595158514, 'learning_rate': 3.286274259249048e-06, 'epoch': 0.62}
62%|██████▏ | 21357/34278 [23:29:57<12:34:22, 3.50s/it] {'loss': 0.1424, 'grad_norm': 0.9645705464887633, 'learning_rate': 3.285830448824635e-06, 'epoch': 0.62}
62%|██████▏ | 21358/34278 [23:30:01<13:24:06, 3.73s/it] {'loss': 0.1367, 'grad_norm': 0.7622431640789379, 'learning_rate': 3.285386653703916e-06, 'epoch': 0.62}
62%|██████▏ | 21359/34278 [23:30:04<13:09:36, 3.67s/it] {'loss': 0.1443, 'grad_norm': 0.92898231544446, 'learning_rate': 3.2849428738908585e-06, 'epoch': 0.62}
62%|██████▏ | 21360/34278 [23:30:08<13:00:50, 3.63s/it] {'loss': 0.1361, 'grad_norm': 1.1094987025385317, 'learning_rate': 3.2844991093894205e-06, 'epoch': 0.62}
62%|██████▏ | 21361/34278 [23:30:12<13:21:48, 3.72s/it] {'loss': 0.1164, 'grad_norm': 0.9244821197402684, 'learning_rate': 3.284055360203565e-06, 'epoch': 0.62}
62%|██████▏ | 21362/34278 [23:30:15<12:31:15, 3.49s/it] {'loss': 0.1237, 'grad_norm': 1.5233771887704943, 'learning_rate': 3.2836116263372553e-06, 'epoch': 0.62}
62%|██████▏ | 21363/34278 [23:30:18<11:55:59, 3.33s/it] {'loss': 0.1436, 'grad_norm': 1.3060899002913862, 'learning_rate': 3.283167907794449e-06, 'epoch': 0.62}
62%|██████▏ | 21364/34278 [23:30:21<11:24:44, 3.18s/it] {'loss': 0.1126, 'grad_norm': 0.8891857246693189, 'learning_rate': 3.2827242045791097e-06, 'epoch': 0.62}
62%|██████▏ | 21365/34278 [23:30:25<12:45:48, 3.56s/it] {'loss': 0.1269, 'grad_norm': 0.7134619818092386, 'learning_rate': 3.2822805166951993e-06, 'epoch': 0.62}
62%|██████▏ | 21366/34278 [23:30:28<12:04:17, 3.37s/it] {'loss': 0.1174, 'grad_norm': 0.7131956097189778, 'learning_rate': 3.2818368441466785e-06, 'epoch': 0.62}
62%|██████▏ | 21367/34278 [23:30:31<11:42:32, 3.26s/it] {'loss': 0.1017, 'grad_norm': 0.9251043737066043, 'learning_rate': 3.2813931869375093e-06, 'epoch': 0.62}
62%|██████▏ | 21368/34278 [23:30:37<14:49:08, 4.13s/it] {'loss': 0.1194, 'grad_norm': 0.9510310397422824, 'learning_rate': 3.2809495450716504e-06, 'epoch': 0.62}
62%|██████▏ | 21369/34278 [23:30:41<14:12:04, 3.96s/it] {'loss': 0.1269, 'grad_norm': 0.8411471375005438, 'learning_rate': 3.280505918553064e-06, 'epoch': 0.62}
62%|██████▏ | 21369/34278 [23:30:41<14:12:04, 3.96s/it] 62%|██████▏ | 21370/34278 [23:30:44<13:29:51, 3.76s/it] {'loss': 0.1124, 'grad_norm': 0.7882271282585804, 'learning_rate': 3.2800623073857086e-06, 'epoch': 0.62} 62%|██████▏ | 21370/34278 [23:30:44<13:29:51, 3.76s/it] 62%|██████▏ | 21371/34278 [23:30:49<14:45:24, 4.12s/it] {'loss': 0.1071, 'grad_norm': 0.9041004644139871, 'learning_rate': 3.279618711573549e-06, 'epoch': 0.62} 62%|██████▏ | 21371/34278 [23:30:49<14:45:24, 4.12s/it] 62%|██████▏ | 21372/34278 [23:30:53<14:16:31, 3.98s/it] {'loss': 0.117, 'grad_norm': 1.0976159418705274, 'learning_rate': 3.2791751311205412e-06, 'epoch': 0.62} 62%|██████▏ | 21372/34278 [23:30:53<14:16:31, 3.98s/it] 62%|██████▏ | 21373/34278 [23:30:56<13:39:24, 3.81s/it] {'loss': 0.1277, 'grad_norm': 0.7036863548417275, 'learning_rate': 3.2787315660306473e-06, 'epoch': 0.62} 62%|██████▏ | 21373/34278 [23:30:56<13:39:24, 3.81s/it] 62%|██████▏ | 21374/34278 [23:31:00<13:54:13, 3.88s/it] {'loss': 0.1192, 'grad_norm': 0.8272929067017146, 'learning_rate': 3.278288016307828e-06, 'epoch': 0.62} 62%|██████▏ | 21374/34278 [23:31:00<13:54:13, 3.88s/it] 62%|██████▏ | 21375/34278 [23:31:03<12:52:42, 3.59s/it] {'loss': 0.1257, 'grad_norm': 0.9351901240229592, 'learning_rate': 3.277844481956042e-06, 'epoch': 0.62} 62%|██████▏ | 21375/34278 [23:31:03<12:52:42, 3.59s/it] 62%|██████▏ | 21376/34278 [23:31:08<14:06:16, 3.94s/it] {'loss': 0.134, 'grad_norm': 0.9171908978621737, 'learning_rate': 3.277400962979247e-06, 'epoch': 0.62} 62%|██████▏ | 21376/34278 [23:31:08<14:06:16, 3.94s/it] 62%|██████▏ | 21377/34278 [23:31:13<15:47:31, 4.41s/it] {'loss': 0.134, 'grad_norm': 0.7627911834806919, 'learning_rate': 3.2769574593814067e-06, 'epoch': 0.62} 62%|██████▏ | 21377/34278 [23:31:13<15:47:31, 4.41s/it] 62%|██████▏ | 21378/34278 [23:31:19<17:23:35, 4.85s/it] {'loss': 0.1599, 'grad_norm': 0.8982023656165522, 'learning_rate': 3.2765139711664795e-06, 'epoch': 0.62} 62%|██████▏ | 21378/34278 
[23:31:19<17:23:35, 4.85s/it] 62%|██████▏ | 21379/34278 [23:31:23<15:55:58, 4.45s/it] {'loss': 0.1149, 'grad_norm': 0.7722640945898418, 'learning_rate': 3.2760704983384237e-06, 'epoch': 0.62} 62%|██████▏ | 21379/34278 [23:31:23<15:55:58, 4.45s/it] 62%|██████▏ | 21380/34278 [23:31:27<15:24:09, 4.30s/it] {'loss': 0.1295, 'grad_norm': 1.1665461068577712, 'learning_rate': 3.2756270409011993e-06, 'epoch': 0.62} 62%|██████▏ | 21380/34278 [23:31:27<15:24:09, 4.30s/it] 62%|██████▏ | 21381/34278 [23:31:31<14:57:45, 4.18s/it] {'loss': 0.14, 'grad_norm': 0.9510608179510497, 'learning_rate': 3.2751835988587644e-06, 'epoch': 0.62} 62%|██████▏ | 21381/34278 [23:31:31<14:57:45, 4.18s/it] 62%|██████▏ | 21382/34278 [23:31:34<13:41:37, 3.82s/it] {'loss': 0.1097, 'grad_norm': 0.8044782996137715, 'learning_rate': 3.274740172215078e-06, 'epoch': 0.62} 62%|██████▏ | 21382/34278 [23:31:34<13:41:37, 3.82s/it] 62%|██████▏ | 21383/34278 [23:31:37<13:26:47, 3.75s/it] {'loss': 0.123, 'grad_norm': 0.8821719331786081, 'learning_rate': 3.2742967609741e-06, 'epoch': 0.62} 62%|██████▏ | 21383/34278 [23:31:37<13:26:47, 3.75s/it] 62%|██████▏ | 21384/34278 [23:31:40<12:37:50, 3.53s/it] {'loss': 0.1282, 'grad_norm': 0.8969669929447068, 'learning_rate': 3.2738533651397895e-06, 'epoch': 0.62} 62%|██████▏ | 21384/34278 [23:31:40<12:37:50, 3.53s/it] 62%|██████▏ | 21385/34278 [23:31:43<12:13:45, 3.41s/it] {'loss': 0.1016, 'grad_norm': 0.8431979072420812, 'learning_rate': 3.2734099847161038e-06, 'epoch': 0.62} 62%|██████▏ | 21385/34278 [23:31:43<12:13:45, 3.41s/it] 62%|██████▏ | 21386/34278 [23:31:46<11:45:42, 3.28s/it] {'loss': 0.1379, 'grad_norm': 1.2478708829528349, 'learning_rate': 3.272966619707001e-06, 'epoch': 0.62} 62%|██████▏ | 21386/34278 [23:31:46<11:45:42, 3.28s/it] 62%|██████▏ | 21387/34278 [23:31:50<12:01:55, 3.36s/it] {'loss': 0.1407, 'grad_norm': 0.8291457332907493, 'learning_rate': 3.272523270116441e-06, 'epoch': 0.62} 62%|██████▏ | 21387/34278 [23:31:50<12:01:55, 3.36s/it] 62%|██████▏ | 
21388/34278 [23:31:53<12:17:19, 3.43s/it] {'loss': 0.1219, 'grad_norm': 0.9483068769052317, 'learning_rate': 3.272079935948378e-06, 'epoch': 0.62} 62%|██████▏ | 21388/34278 [23:31:53<12:17:19, 3.43s/it] 62%|██████▏ | 21389/34278 [23:31:56<11:44:19, 3.28s/it] {'loss': 0.1205, 'grad_norm': 0.7799120919355934, 'learning_rate': 3.271636617206776e-06, 'epoch': 0.62} 62%|██████▏ | 21389/34278 [23:31:56<11:44:19, 3.28s/it] 62%|██████▏ | 21390/34278 [23:32:00<12:28:49, 3.49s/it] {'loss': 0.1285, 'grad_norm': 0.8938662660427801, 'learning_rate': 3.271193313895588e-06, 'epoch': 0.62} 62%|██████▏ | 21390/34278 [23:32:00<12:28:49, 3.49s/it] 62%|██████▏ | 21391/34278 [23:32:03<12:06:03, 3.38s/it] {'loss': 0.1091, 'grad_norm': 0.7676948460797017, 'learning_rate': 3.270750026018774e-06, 'epoch': 0.62} 62%|██████▏ | 21391/34278 [23:32:03<12:06:03, 3.38s/it] 62%|██████▏ | 21392/34278 [23:32:07<12:10:41, 3.40s/it] {'loss': 0.1397, 'grad_norm': 0.6694981629040268, 'learning_rate': 3.270306753580292e-06, 'epoch': 0.62} 62%|██████▏ | 21392/34278 [23:32:07<12:10:41, 3.40s/it] 62%|██████▏ | 21393/34278 [23:32:11<13:23:25, 3.74s/it] {'loss': 0.111, 'grad_norm': 0.8658009481411322, 'learning_rate': 3.269863496584097e-06, 'epoch': 0.62} 62%|██████▏ | 21393/34278 [23:32:11<13:23:25, 3.74s/it] 62%|██████▏ | 21394/34278 [23:32:17<15:54:06, 4.44s/it] {'loss': 0.1224, 'grad_norm': 1.0818055981683743, 'learning_rate': 3.2694202550341467e-06, 'epoch': 0.62} 62%|██████▏ | 21394/34278 [23:32:17<15:54:06, 4.44s/it] 62%|██████▏ | 21395/34278 [23:32:21<14:35:47, 4.08s/it] {'loss': 0.1412, 'grad_norm': 0.8759576032731533, 'learning_rate': 3.2689770289344006e-06, 'epoch': 0.62} 62%|██████▏ | 21395/34278 [23:32:21<14:35:47, 4.08s/it] 62%|██████▏ | 21396/34278 [23:32:26<16:05:13, 4.50s/it] {'loss': 0.1089, 'grad_norm': 0.9859698274761876, 'learning_rate': 3.2685338182888143e-06, 'epoch': 0.62} 62%|██████▏ | 21396/34278 [23:32:26<16:05:13, 4.50s/it] 62%|██████▏ | 21397/34278 [23:32:30<15:13:39, 4.26s/it] 
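tqdm's remaining-time figure in these renders (e.g. `<12:22:20` at step 21333) is simply remaining steps times the smoothed seconds-per-iteration. A sketch of that arithmetic — note tqdm uses an exponentially smoothed rate internally, so recomputing from the rounded `3.44s/it` shown in the log lands a few seconds off the displayed value:

```python
def eta_hms(current_step, total_steps, sec_per_it):
    """Remaining wall-clock time formatted H:MM:SS, tqdm-style."""
    remaining = (total_steps - current_step) * sec_per_it
    s = int(remaining)
    return f"{s // 3600}:{(s % 3600) // 60:02d}:{s % 60:02d}"

# (34278 - 21333) steps * 3.44 s/it ~= 44530 s, i.e. about 12h22m,
# consistent with the <12:22:20 estimate printed above.
```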
{'loss': 0.112, 'grad_norm': 0.7141952478940872, 'learning_rate': 3.268090623101346e-06, 'epoch': 0.62} 62%|██████▏ | 21397/34278 [23:32:30<15:13:39, 4.26s/it] 62%|██████▏ | 21398/34278 [23:32:33<14:05:30, 3.94s/it] {'loss': 0.1081, 'grad_norm': 0.9453485591901521, 'learning_rate': 3.2676474433759498e-06, 'epoch': 0.62} 62%|██████▏ | 21398/34278 [23:32:33<14:05:30, 3.94s/it] 62%|██████▏ | 21399/34278 [23:32:36<13:01:17, 3.64s/it] {'loss': 0.1242, 'grad_norm': 0.921193296818913, 'learning_rate': 3.2672042791165837e-06, 'epoch': 0.62} 62%|██████▏ | 21399/34278 [23:32:36<13:01:17, 3.64s/it] 62%|██████▏ | 21400/34278 [23:32:39<12:07:48, 3.39s/it] {'loss': 0.1157, 'grad_norm': 0.8739465438971922, 'learning_rate': 3.266761130327203e-06, 'epoch': 0.62} 62%|██████▏ | 21400/34278 [23:32:39<12:07:48, 3.39s/it] 62%|██████▏ | 21401/34278 [23:32:43<12:33:45, 3.51s/it] {'loss': 0.1399, 'grad_norm': 0.9048152619749588, 'learning_rate': 3.2663179970117678e-06, 'epoch': 0.62} 62%|██████▏ | 21401/34278 [23:32:43<12:33:45, 3.51s/it] 62%|██████▏ | 21402/34278 [23:32:45<11:49:04, 3.30s/it] {'loss': 0.1455, 'grad_norm': 1.023687456115142, 'learning_rate': 3.26587487917423e-06, 'epoch': 0.62} 62%|██████▏ | 21402/34278 [23:32:45<11:49:04, 3.30s/it] 62%|██████▏ | 21403/34278 [23:32:48<11:27:27, 3.20s/it] {'loss': 0.1206, 'grad_norm': 0.9609077409680912, 'learning_rate': 3.2654317768185474e-06, 'epoch': 0.62} 62%|██████▏ | 21403/34278 [23:32:48<11:27:27, 3.20s/it] 62%|██████▏ | 21404/34278 [23:32:52<11:23:19, 3.18s/it] {'loss': 0.1148, 'grad_norm': 0.7574570783561895, 'learning_rate': 3.264988689948677e-06, 'epoch': 0.62} 62%|██████▏ | 21404/34278 [23:32:52<11:23:19, 3.18s/it] 62%|██████▏ | 21405/34278 [23:32:55<11:14:26, 3.14s/it] {'loss': 0.1156, 'grad_norm': 0.8387826309541686, 'learning_rate': 3.264545618568572e-06, 'epoch': 0.62} 62%|██████▏ | 21405/34278 [23:32:55<11:14:26, 3.14s/it] 62%|██████▏ | 21406/34278 [23:33:01<14:29:29, 4.05s/it] {'loss': 0.1127, 'grad_norm': 
0.8790705048861258, 'learning_rate': 3.264102562682189e-06, 'epoch': 0.62} 62%|██████▏ | 21406/34278 [23:33:01<14:29:29, 4.05s/it] 62%|██████▏ | 21407/34278 [23:33:04<13:18:22, 3.72s/it] {'loss': 0.1122, 'grad_norm': 0.8097963171134626, 'learning_rate': 3.2636595222934843e-06, 'epoch': 0.62} 62%|██████▏ | 21407/34278 [23:33:04<13:18:22, 3.72s/it] 62%|██████▏ | 21408/34278 [23:33:09<15:03:55, 4.21s/it] {'loss': 0.1315, 'grad_norm': 0.7712599598560937, 'learning_rate': 3.2632164974064136e-06, 'epoch': 0.62} 62%|██████▏ | 21408/34278 [23:33:09<15:03:55, 4.21s/it] 62%|██████▏ | 21409/34278 [23:33:12<13:45:53, 3.85s/it] {'loss': 0.1347, 'grad_norm': 0.7878562866613266, 'learning_rate': 3.262773488024932e-06, 'epoch': 0.62} 62%|██████▏ | 21409/34278 [23:33:12<13:45:53, 3.85s/it] 62%|██████▏ | 21410/34278 [23:33:16<13:41:47, 3.83s/it] {'loss': 0.1227, 'grad_norm': 0.7682216592352683, 'learning_rate': 3.262330494152993e-06, 'epoch': 0.62} 62%|██████▏ | 21410/34278 [23:33:16<13:41:47, 3.83s/it] 62%|██████▏ | 21411/34278 [23:33:22<16:00:48, 4.48s/it] {'loss': 0.1085, 'grad_norm': 0.8909676263064586, 'learning_rate': 3.2618875157945527e-06, 'epoch': 0.62} 62%|██████▏ | 21411/34278 [23:33:22<16:00:48, 4.48s/it] 62%|██████▏ | 21412/34278 [23:33:26<15:17:36, 4.28s/it] {'loss': 0.1297, 'grad_norm': 0.893901912628838, 'learning_rate': 3.2614445529535643e-06, 'epoch': 0.62} 62%|██████▏ | 21412/34278 [23:33:26<15:17:36, 4.28s/it] 62%|██████▏ | 21413/34278 [23:33:29<13:48:24, 3.86s/it] {'loss': 0.1211, 'grad_norm': 0.8418011611485549, 'learning_rate': 3.2610016056339855e-06, 'epoch': 0.62} 62%|██████▏ | 21413/34278 [23:33:29<13:48:24, 3.86s/it] 62%|██████▏ | 21414/34278 [23:33:31<12:40:54, 3.55s/it] {'loss': 0.1148, 'grad_norm': 0.8337426922324811, 'learning_rate': 3.2605586738397697e-06, 'epoch': 0.62} 62%|██████▏ | 21414/34278 [23:33:31<12:40:54, 3.55s/it] 62%|██████▏ | 21415/34278 [23:33:36<13:20:59, 3.74s/it] {'loss': 0.1356, 'grad_norm': 0.8394602552448956, 'learning_rate': 
3.26011575757487e-06, 'epoch': 0.62} 62%|██████▏ | 21415/34278 [23:33:36<13:20:59, 3.74s/it] 62%|██████▏ | 21416/34278 [23:33:42<16:17:01, 4.56s/it] {'loss': 0.1688, 'grad_norm': 0.7867755359073282, 'learning_rate': 3.2596728568432417e-06, 'epoch': 0.62} 62%|██████▏ | 21416/34278 [23:33:42<16:17:01, 4.56s/it] 62%|██████▏ | 21417/34278 [23:33:45<14:27:06, 4.05s/it] {'loss': 0.1353, 'grad_norm': 0.7488016430772473, 'learning_rate': 3.2592299716488396e-06, 'epoch': 0.62} 62%|██████▏ | 21417/34278 [23:33:45<14:27:06, 4.05s/it] 62%|██████▏ | 21418/34278 [23:33:50<16:08:22, 4.52s/it] {'loss': 0.1291, 'grad_norm': 0.8252871185907982, 'learning_rate': 3.2587871019956137e-06, 'epoch': 0.62} 62%|██████▏ | 21418/34278 [23:33:50<16:08:22, 4.52s/it] 62%|██████▏ | 21419/34278 [23:33:56<16:40:46, 4.67s/it] {'loss': 0.1394, 'grad_norm': 0.6719926057896809, 'learning_rate': 3.258344247887524e-06, 'epoch': 0.62} 62%|██████▏ | 21419/34278 [23:33:56<16:40:46, 4.67s/it] 62%|██████▏ | 21420/34278 [23:33:59<15:00:17, 4.20s/it] {'loss': 0.1238, 'grad_norm': 0.6376576206577068, 'learning_rate': 3.25790140932852e-06, 'epoch': 0.62} 62%|██████▏ | 21420/34278 [23:33:59<15:00:17, 4.20s/it] 62%|██████▏ | 21421/34278 [23:34:02<13:41:53, 3.84s/it] {'loss': 0.1254, 'grad_norm': 0.8584609128994657, 'learning_rate': 3.257458586322556e-06, 'epoch': 0.62} 62%|██████▏ | 21421/34278 [23:34:02<13:41:53, 3.84s/it] 62%|██████▏ | 21422/34278 [23:34:05<13:44:33, 3.85s/it] {'loss': 0.1244, 'grad_norm': 0.6927924290628801, 'learning_rate': 3.257015778873587e-06, 'epoch': 0.62} 62%|██████▏ | 21422/34278 [23:34:05<13:44:33, 3.85s/it] 62%|██████▏ | 21423/34278 [23:34:08<12:45:06, 3.57s/it] {'loss': 0.1217, 'grad_norm': 0.6575024255520201, 'learning_rate': 3.2565729869855643e-06, 'epoch': 0.62} 62%|██████▏ | 21423/34278 [23:34:08<12:45:06, 3.57s/it] 63%|██████▎ | 21424/34278 [23:34:13<14:19:37, 4.01s/it] {'loss': 0.1145, 'grad_norm': 0.7516460135581234, 'learning_rate': 3.2561302106624405e-06, 'epoch': 0.63} 
63%|██████▎ | 21424/34278 [23:34:13<14:19:37, 4.01s/it] 63%|██████▎ | 21425/34278 [23:34:19<16:20:01, 4.57s/it] {'loss': 0.1167, 'grad_norm': 0.9149866479396442, 'learning_rate': 3.2556874499081715e-06, 'epoch': 0.63} 63%|██████▎ | 21425/34278 [23:34:19<16:20:01, 4.57s/it] 63%|██████▎ | 21426/34278 [23:34:24<16:14:45, 4.55s/it] {'loss': 0.1362, 'grad_norm': 0.6900964557226459, 'learning_rate': 3.255244704726708e-06, 'epoch': 0.63} 63%|██████▎ | 21426/34278 [23:34:24<16:14:45, 4.55s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 63%|██████▎ | 21427/34278 [23:34:27<14:28:17, 4.05s/it] {'loss': 0.1245, 'grad_norm': 0.7534044221475009, 'learning_rate': 3.254801975122004e-06, 'epoch': 0.63} 63%|██████▎ | 21427/34278 [23:34:27<14:28:17, 4.05s/it] 63%|██████▎ | 21428/34278 [23:34:31<14:18:10, 4.01s/it] {'loss': 0.1232, 'grad_norm': 1.0243240109542218, 'learning_rate': 3.2543592610980107e-06, 'epoch': 0.63} 63%|██████▎ | 21428/34278 [23:34:31<14:18:10, 4.01s/it] 63%|██████▎ | 21429/34278 [23:34:34<13:12:21, 3.70s/it] {'loss': 0.1122, 'grad_norm': 0.8209915087198103, 'learning_rate': 3.2539165626586812e-06, 'epoch': 0.63} 63%|██████▎ | 21429/34278 [23:34:34<13:12:21, 3.70s/it] 63%|██████▎ | 21430/34278 [23:34:38<13:59:10, 3.92s/it] {'loss': 0.1144, 'grad_norm': 0.7725251185715415, 'learning_rate': 3.253473879807967e-06, 'epoch': 0.63} 63%|██████▎ | 21430/34278 [23:34:38<13:59:10, 3.92s/it] 63%|██████▎ | 21431/34278 [23:34:41<13:03:14, 3.66s/it] {'loss': 0.1143, 'grad_norm': 0.787825801021517, 'learning_rate': 3.2530312125498224e-06, 'epoch': 0.63} 63%|██████▎ | 21431/34278 [23:34:41<13:03:14, 3.66s/it] 63%|██████▎ | 21432/34278 [23:34:44<11:58:21, 3.36s/it] {'loss': 0.1308, 'grad_norm': 0.9066304345049514, 'learning_rate': 3.252588560888198e-06, 'epoch': 0.63} 63%|██████▎ | 21432/34278 
[23:34:44<11:58:21, 3.36s/it] 63%|██████▎ | 21433/34278 [23:34:48<12:40:24, 3.55s/it] {'loss': 0.13, 'grad_norm': 0.8198773791558235, 'learning_rate': 3.252145924827045e-06, 'epoch': 0.63} 63%|██████▎ | 21433/34278 [23:34:48<12:40:24, 3.55s/it] 63%|██████▎ | 21434/34278 [23:34:54<15:30:21, 4.35s/it] {'loss': 0.1235, 'grad_norm': 1.1783804499968054, 'learning_rate': 3.251703304370317e-06, 'epoch': 0.63} 63%|██████▎ | 21434/34278 [23:34:54<15:30:21, 4.35s/it] 63%|██████▎ | 21435/34278 [23:34:58<15:39:47, 4.39s/it] {'loss': 0.1258, 'grad_norm': 0.8029233077293314, 'learning_rate': 3.251260699521964e-06, 'epoch': 0.63} 63%|██████▎ | 21435/34278 [23:34:58<15:39:47, 4.39s/it] 63%|██████▎ | 21436/34278 [23:35:02<14:35:03, 4.09s/it] {'loss': 0.1132, 'grad_norm': 0.847215641910633, 'learning_rate': 3.2508181102859373e-06, 'epoch': 0.63} 63%|██████▎ | 21436/34278 [23:35:02<14:35:03, 4.09s/it] 63%|██████▎ | 21437/34278 [23:35:08<16:48:38, 4.71s/it] {'loss': 0.144, 'grad_norm': 1.0726819381486028, 'learning_rate': 3.2503755366661893e-06, 'epoch': 0.63} 63%|██████▎ | 21437/34278 [23:35:08<16:48:38, 4.71s/it] 63%|██████▎ | 21438/34278 [23:35:11<14:54:55, 4.18s/it] {'loss': 0.1252, 'grad_norm': 0.9165456177087047, 'learning_rate': 3.2499329786666704e-06, 'epoch': 0.63} 63%|██████▎ | 21438/34278 [23:35:11<14:54:55, 4.18s/it] 63%|██████▎ | 21439/34278 [23:35:17<16:48:30, 4.71s/it] {'loss': 0.1244, 'grad_norm': 1.0593579415371146, 'learning_rate': 3.2494904362913336e-06, 'epoch': 0.63} 63%|██████▎ | 21439/34278 [23:35:17<16:48:30, 4.71s/it] 63%|██████▎ | 21440/34278 [23:35:23<18:03:05, 5.06s/it] {'loss': 0.1164, 'grad_norm': 0.7798101061649765, 'learning_rate': 3.2490479095441274e-06, 'epoch': 0.63} 63%|██████▎ | 21440/34278 [23:35:23<18:03:05, 5.06s/it] 63%|██████▎ | 21441/34278 [23:35:27<17:20:57, 4.87s/it] {'loss': 0.1128, 'grad_norm': 1.165482104967654, 'learning_rate': 3.248605398429004e-06, 'epoch': 0.63} 63%|██████▎ | 21441/34278 [23:35:27<17:20:57, 4.87s/it] 63%|██████▎ | 
21442/34278 [23:35:30<15:38:11, 4.39s/it] {'loss': 0.1132, 'grad_norm': 1.002490336377689, 'learning_rate': 3.248162902949912e-06, 'epoch': 0.63} 63%|██████▎ | 21442/34278 [23:35:30<15:38:11, 4.39s/it] 63%|██████▎ | 21443/34278 [23:35:33<14:03:53, 3.94s/it] {'loss': 0.1191, 'grad_norm': 0.6985548173851968, 'learning_rate': 3.247720423110804e-06, 'epoch': 0.63} 63%|██████▎ | 21443/34278 [23:35:33<14:03:53, 3.94s/it] 63%|██████▎ | 21444/34278 [23:35:36<12:58:44, 3.64s/it] {'loss': 0.1432, 'grad_norm': 0.9088624263014558, 'learning_rate': 3.2472779589156313e-06, 'epoch': 0.63} 63%|██████▎ | 21444/34278 [23:35:36<12:58:44, 3.64s/it] 63%|██████▎ | 21445/34278 [23:35:39<12:21:24, 3.47s/it] {'loss': 0.1445, 'grad_norm': 0.91907924242179, 'learning_rate': 3.2468355103683414e-06, 'epoch': 0.63} 63%|██████▎ | 21445/34278 [23:35:39<12:21:24, 3.47s/it] 63%|██████▎ | 21446/34278 [23:35:42<11:54:44, 3.34s/it] {'loss': 0.1299, 'grad_norm': 0.8977712600118576, 'learning_rate': 3.246393077472886e-06, 'epoch': 0.63} 63%|██████▎ | 21446/34278 [23:35:42<11:54:44, 3.34s/it] 63%|██████▎ | 21447/34278 [23:35:46<11:51:00, 3.32s/it] {'loss': 0.1424, 'grad_norm': 0.8134447544544907, 'learning_rate': 3.2459506602332124e-06, 'epoch': 0.63} 63%|██████▎ | 21447/34278 [23:35:46<11:51:00, 3.32s/it] 63%|██████▎ | 21448/34278 [23:35:49<11:38:45, 3.27s/it] {'loss': 0.1214, 'grad_norm': 0.8258625851606094, 'learning_rate': 3.2455082586532748e-06, 'epoch': 0.63} 63%|██████▎ | 21448/34278 [23:35:49<11:38:45, 3.27s/it] 63%|██████▎ | 21449/34278 [23:35:52<11:10:01, 3.13s/it] {'loss': 0.1138, 'grad_norm': 0.8219977428003091, 'learning_rate': 3.245065872737021e-06, 'epoch': 0.63} 63%|██████▎ | 21449/34278 [23:35:52<11:10:01, 3.13s/it] 63%|██████▎ | 21450/34278 [23:35:54<10:46:11, 3.02s/it] {'loss': 0.1259, 'grad_norm': 0.8148850511239529, 'learning_rate': 3.2446235024883998e-06, 'epoch': 0.63} 63%|██████▎ | 21450/34278 [23:35:54<10:46:11, 3.02s/it] 63%|██████▎ | 21451/34278 [23:35:57<10:34:43, 2.97s/it] 
{'loss': 0.1206, 'grad_norm': 0.9931304727284846, 'learning_rate': 3.2441811479113606e-06, 'epoch': 0.63} 63%|██████▎ | 21451/34278 [23:35:57<10:34:43, 2.97s/it] 63%|██████▎ | 21452/34278 [23:36:03<13:29:07, 3.79s/it] {'loss': 0.1212, 'grad_norm': 1.130776706632609, 'learning_rate': 3.243738809009853e-06, 'epoch': 0.63} 63%|██████▎ | 21452/34278 [23:36:03<13:29:07, 3.79s/it] 63%|██████▎ | 21453/34278 [23:36:06<12:24:45, 3.48s/it] {'loss': 0.111, 'grad_norm': 0.7070222500696697, 'learning_rate': 3.2432964857878255e-06, 'epoch': 0.63} 63%|██████▎ | 21453/34278 [23:36:06<12:24:45, 3.48s/it] 63%|██████▎ | 21454/34278 [23:36:09<11:59:25, 3.37s/it] {'loss': 0.1343, 'grad_norm': 1.3226701988216725, 'learning_rate': 3.242854178249228e-06, 'epoch': 0.63} 63%|██████▎ | 21454/34278 [23:36:09<11:59:25, 3.37s/it] 63%|██████▎ | 21455/34278 [23:36:12<11:57:45, 3.36s/it] {'loss': 0.1107, 'grad_norm': 1.1425727052245944, 'learning_rate': 3.242411886398009e-06, 'epoch': 0.63} 63%|██████▎ | 21455/34278 [23:36:12<11:57:45, 3.36s/it] 63%|██████▎ | 21456/34278 [23:36:18<14:37:13, 4.10s/it] {'loss': 0.1204, 'grad_norm': 1.0726275174312958, 'learning_rate': 3.241969610238117e-06, 'epoch': 0.63} 63%|██████▎ | 21456/34278 [23:36:18<14:37:13, 4.10s/it] 63%|██████▎ | 21457/34278 [23:36:23<15:38:53, 4.39s/it] {'loss': 0.0913, 'grad_norm': 0.8533176144163238, 'learning_rate': 3.241527349773501e-06, 'epoch': 0.63} 63%|██████▎ | 21457/34278 [23:36:23<15:38:53, 4.39s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 63%|██████▎ | 21458/34278 [23:36:29<17:14:01, 4.84s/it] {'loss': 0.1581, 'grad_norm': 0.8312751156994354, 'learning_rate': 3.2410851050081093e-06, 'epoch': 0.63} 63%|██████▎ | 21458/34278 [23:36:29<17:14:01, 4.84s/it] 63%|██████▎ | 21459/34278 [23:36:32<15:17:55, 4.30s/it] {'loss': 0.1461, 'grad_norm': 1.3826534479492336, 'learning_rate': 3.2406428759458886e-06, 'epoch': 0.63} 63%|██████▎ | 21459/34278 [23:36:32<15:17:55, 4.30s/it] 63%|██████▎ | 21460/34278 [23:36:37<16:11:03, 4.55s/it] {'loss': 0.119, 'grad_norm': 0.8075980524239934, 'learning_rate': 3.240200662590789e-06, 'epoch': 0.63} 63%|██████▎ | 21460/34278 [23:36:37<16:11:03, 4.55s/it] 63%|██████▎ | 21461/34278 [23:36:41<15:39:20, 4.40s/it] {'loss': 0.1475, 'grad_norm': 1.0854000076389174, 'learning_rate': 3.2397584649467584e-06, 'epoch': 0.63} 63%|██████▎ | 21461/34278 [23:36:41<15:39:20, 4.40s/it] 63%|██████▎ | 21462/34278 [23:36:44<14:21:36, 4.03s/it] {'loss': 0.108, 'grad_norm': 1.2238030639072361, 'learning_rate': 3.239316283017744e-06, 'epoch': 0.63} 63%|██████▎ | 21462/34278 [23:36:44<14:21:36, 4.03s/it] 63%|██████▎ | 21463/34278 [23:36:50<16:34:04, 4.65s/it] {'loss': 0.1129, 'grad_norm': 1.0729826646664034, 'learning_rate': 3.2388741168076927e-06, 'epoch': 0.63} 63%|██████▎ | 21463/34278 [23:36:50<16:34:04, 4.65s/it] 63%|██████▎ | 21464/34278 [23:36:54<14:53:13, 4.18s/it] {'loss': 0.1131, 'grad_norm': 0.8640682667547323, 'learning_rate': 3.2384319663205544e-06, 'epoch': 0.63} 63%|██████▎ | 21464/34278 [23:36:54<14:53:13, 4.18s/it] 63%|██████▎ | 21465/34278 [23:36:59<16:44:36, 4.70s/it] {'loss': 0.1325, 'grad_norm': 0.7957328888807539, 'learning_rate': 3.237989831560271e-06, 'epoch': 0.63} 63%|██████▎ | 21465/34278 [23:36:59<16:44:36, 4.70s/it] 63%|██████▎ | 21466/34278 [23:37:03<15:27:23, 4.34s/it] {'loss': 0.1359, 'grad_norm': 0.9437612541507222, 'learning_rate': 3.2375477125307976e-06, 'epoch': 0.63} 63%|██████▎ | 21466/34278 [23:37:03<15:27:23, 4.34s/it] 
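The recurring `torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True` above is the classic symptom of gradient checkpointing wrapping a block whose inputs carry no gradient (typically because the upstream embeddings or vision tower are frozen); whether that is intended here depends on which modules this run freezes. A minimal pure-PyTorch repro of the setup and the usual fix — forcing the checkpoint input to require grad:

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

# A frozen embedding feeding a checkpointed block: with reentrant
# checkpointing this is exactly what triggers
# "None of the inputs have requires_grad=True. Gradients will be None".
emb = nn.Embedding(10, 4)
emb.weight.requires_grad_(False)
block = nn.Linear(4, 4)

tokens = torch.tensor([[1, 2, 3]])
hidden = emb(tokens)
hidden.requires_grad_(True)  # the fix: let grads flow through the checkpoint input
out = checkpoint(block, hidden, use_reentrant=True)
out.sum().backward()
assert block.weight.grad is not None  # block still trains
assert emb.weight.grad is None        # embedding stays frozen
```

In Hugging Face Trainer setups the equivalent knobs are `model.enable_input_require_grads()` (a forward hook doing the `requires_grad_(True)` above) or `gradient_checkpointing_kwargs={"use_reentrant": False}`, which avoids the warning entirely.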
63%|██████▎ | 21467/34278 [23:37:08<16:01:44, 4.50s/it] {'loss': 0.1126, 'grad_norm': 0.690983930350747, 'learning_rate': 3.2371056092360764e-06, 'epoch': 0.63} 63%|██████▎ | 21467/34278 [23:37:08<16:01:44, 4.50s/it] 63%|██████▎ | 21468/34278 [23:37:11<14:35:52, 4.10s/it] {'loss': 0.1162, 'grad_norm': 0.8852358085702343, 'learning_rate': 3.2366635216800556e-06, 'epoch': 0.63} 63%|██████▎ | 21468/34278 [23:37:11<14:35:52, 4.10s/it] 63%|██████▎ | 21469/34278 [23:37:17<16:29:56, 4.64s/it] {'loss': 0.1279, 'grad_norm': 0.6476452883828339, 'learning_rate': 3.2362214498666826e-06, 'epoch': 0.63} 63%|██████▎ | 21469/34278 [23:37:17<16:29:56, 4.64s/it] 63%|██████▎ | 21470/34278 [23:37:23<18:01:03, 5.06s/it] {'loss': 0.1059, 'grad_norm': 0.7760825147597292, 'learning_rate': 3.235779393799903e-06, 'epoch': 0.63} 63%|██████▎ | 21470/34278 [23:37:23<18:01:03, 5.06s/it] 63%|██████▎ | 21471/34278 [23:37:27<16:35:41, 4.66s/it] {'loss': 0.112, 'grad_norm': 1.120546129680719, 'learning_rate': 3.2353373534836618e-06, 'epoch': 0.63} 63%|██████▎ | 21471/34278 [23:37:27<16:35:41, 4.66s/it] 63%|██████▎ | 21472/34278 [23:37:32<17:40:25, 4.97s/it] {'loss': 0.1296, 'grad_norm': 0.6623240211488992, 'learning_rate': 3.23489532892191e-06, 'epoch': 0.63} 63%|██████▎ | 21472/34278 [23:37:32<17:40:25, 4.97s/it] 63%|██████▎ | 21473/34278 [23:37:36<16:46:47, 4.72s/it] {'loss': 0.1355, 'grad_norm': 0.9737138537624721, 'learning_rate': 3.2344533201185903e-06, 'epoch': 0.63} 63%|██████▎ | 21473/34278 [23:37:36<16:46:47, 4.72s/it] 63%|██████▎ | 21474/34278 [23:37:40<15:17:01, 4.30s/it] {'loss': 0.1303, 'grad_norm': 1.0758797997196936, 'learning_rate': 3.234011327077651e-06, 'epoch': 0.63} 63%|██████▎ | 21474/34278 [23:37:40<15:17:01, 4.30s/it] 63%|██████▎ | 21475/34278 [23:37:44<15:26:07, 4.34s/it] {'loss': 0.0937, 'grad_norm': 0.7697573308968146, 'learning_rate': 3.233569349803036e-06, 'epoch': 0.63} 63%|██████▎ | 21475/34278 [23:37:44<15:26:07, 4.34s/it] 63%|██████▎ | 21476/34278 [23:37:49<15:49:34, 
4.45s/it] {'loss': 0.1574, 'grad_norm': 0.7873452600584238, 'learning_rate': 3.233127388298692e-06, 'epoch': 0.63} 63%|██████▎ | 21476/34278 [23:37:49<15:49:34, 4.45s/it] 63%|██████▎ | 21477/34278 [23:37:55<17:28:51, 4.92s/it] {'loss': 0.1401, 'grad_norm': 1.3188187568593865, 'learning_rate': 3.232685442568564e-06, 'epoch': 0.63} 63%|██████▎ | 21477/34278 [23:37:55<17:28:51, 4.92s/it] 63%|██████▎ | 21478/34278 [23:37:58<15:50:44, 4.46s/it] {'loss': 0.1251, 'grad_norm': 0.8299559483035459, 'learning_rate': 3.2322435126165998e-06, 'epoch': 0.63} 63%|██████▎ | 21478/34278 [23:37:58<15:50:44, 4.46s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 63%|██████▎ | 21479/34278 [23:38:01<14:15:50, 4.01s/it] {'loss': 0.1093, 'grad_norm': 0.7907233450056685, 'learning_rate': 3.2318015984467444e-06, 'epoch': 0.63} 63%|██████▎ | 21479/34278 [23:38:01<14:15:50, 4.01s/it] 63%|██████▎ | 21480/34278 [23:38:04<13:21:56, 3.76s/it] {'loss': 0.1078, 'grad_norm': 0.7562805504276985, 'learning_rate': 3.2313597000629405e-06, 'epoch': 0.63} 63%|██████▎ | 21480/34278 [23:38:04<13:21:56, 3.76s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 
157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f2acff3d170> Failed to fetch sample 3058470. Exception: cannot identify image file <_io.BytesIO object at 0x7f2acff3d170> 63%|██████▎ | 21481/34278 [23:38:10<15:26:07, 4.34s/it] {'loss': 0.1096, 'grad_norm': 0.7682667137237226, 'learning_rate': 3.230917817469136e-06, 'epoch': 0.63} 63%|██████▎ | 21481/34278 [23:38:10<15:26:07, 4.34s/it] 63%|██████▎ | 21482/34278 [23:38:13<14:20:27, 4.03s/it] {'loss': 0.0987, 'grad_norm': 0.778745490208327, 'learning_rate': 3.230475950669275e-06, 'epoch': 0.63} 63%|██████▎ | 21482/34278 [23:38:13<14:20:27, 4.03s/it] 63%|██████▎ | 21483/34278 [23:38:17<13:23:19, 3.77s/it] {'loss': 0.123, 'grad_norm': 0.7573155065175218, 'learning_rate': 3.2300340996673007e-06, 'epoch': 0.63} 63%|██████▎ | 21483/34278 [23:38:17<13:23:19, 3.77s/it] 63%|██████▎ | 21484/34278 [23:38:20<12:37:19, 3.55s/it] {'loss': 0.1363, 'grad_norm': 0.6423302620766055, 'learning_rate': 3.2295922644671605e-06, 'epoch': 0.63} 63%|██████▎ | 21484/34278 [23:38:20<12:37:19, 3.55s/it] 63%|██████▎ | 21485/34278 [23:38:26<15:36:02, 4.39s/it] {'loss': 0.1193, 'grad_norm': 0.9531062125109175, 'learning_rate': 3.2291504450727983e-06, 'epoch': 0.63} 63%|██████▎ | 21485/34278 [23:38:26<15:36:02, 4.39s/it] 63%|██████▎ | 21486/34278 [23:38:29<14:02:53, 3.95s/it] {'loss': 0.1228, 'grad_norm': 0.7858304417946237, 'learning_rate': 3.228708641488158e-06, 'epoch': 0.63} 63%|██████▎ | 21486/34278 [23:38:29<14:02:53, 
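The `PIL.UnidentifiedImageError` traceback above does not kill the run: the dataset logs `Failed to fetch sample 3058470` and training continues at step 21481, which implies `__getitem__` catches decode failures and falls back to another index. A hypothetical sketch of that pattern — `RobustDataset` and its deterministic next-index fallback are stand-ins, since the real `aguvis/dataset.py` logic is only visible through the traceback:

```python
class RobustDataset:
    """Retry-on-bad-sample pattern: if a sample's image bytes cannot be
    decoded (e.g. PIL.UnidentifiedImageError on corrupt Ceph objects),
    log the failure and serve a different sample instead of crashing."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples  # callables standing in for (decode image, build tensors)
        self.max_retries = max_retries

    def _get_item(self, i):
        # In the real code this opens the image with PIL and may raise.
        return self.samples[i]()

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)  # real code may pick randomly
        raise RuntimeError("too many consecutive bad samples")

    def __len__(self):
        return len(self.samples)
```

Skipping corrupt samples keeps long multi-day runs like this one alive, at the cost of silently changing the effective data distribution if many samples are bad — worth monitoring the failure count.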
3.95s/it] 63%|██████▎ | 21487/34278 [23:38:32<13:12:02, 3.72s/it] {'loss': 0.1123, 'grad_norm': 0.8371163322809207, 'learning_rate': 3.2282668537171845e-06, 'epoch': 0.63} 63%|██████▎ | 21487/34278 [23:38:32<13:12:02, 3.72s/it] 63%|██████▎ | 21488/34278 [23:38:35<12:35:15, 3.54s/it] {'loss': 0.1247, 'grad_norm': 0.8470091732039831, 'learning_rate': 3.2278250817638213e-06, 'epoch': 0.63} 63%|██████▎ | 21488/34278 [23:38:35<12:35:15, 3.54s/it] 63%|██████▎ | 21489/34278 [23:38:39<12:50:44, 3.62s/it] {'loss': 0.0973, 'grad_norm': 0.6490071039160903, 'learning_rate': 3.227383325632012e-06, 'epoch': 0.63} 63%|██████▎ | 21489/34278 [23:38:39<12:50:44, 3.62s/it] 63%|██████▎ | 21490/34278 [23:38:43<13:01:08, 3.67s/it] {'loss': 0.1603, 'grad_norm': 0.7524538481288671, 'learning_rate': 3.2269415853257015e-06, 'epoch': 0.63} 63%|██████▎ | 21490/34278 [23:38:43<13:01:08, 3.67s/it] 63%|██████▎ | 21491/34278 [23:38:46<12:39:26, 3.56s/it] {'loss': 0.1109, 'grad_norm': 0.8569837417623171, 'learning_rate': 3.226499860848834e-06, 'epoch': 0.63} 63%|██████▎ | 21491/34278 [23:38:46<12:39:26, 3.56s/it] 63%|██████▎ | 21492/34278 [23:38:50<12:43:38, 3.58s/it] {'loss': 0.1319, 'grad_norm': 0.8558427233602092, 'learning_rate': 3.226058152205352e-06, 'epoch': 0.63} 63%|██████▎ | 21492/34278 [23:38:50<12:43:38, 3.58s/it] 63%|██████▎ | 21493/34278 [23:38:54<13:27:01, 3.79s/it] {'loss': 0.1222, 'grad_norm': 0.7372753385136062, 'learning_rate': 3.225616459399199e-06, 'epoch': 0.63} 63%|██████▎ | 21493/34278 [23:38:54<13:27:01, 3.79s/it] 63%|██████▎ | 21494/34278 [23:38:58<13:34:10, 3.82s/it] {'loss': 0.1116, 'grad_norm': 0.6613528140723616, 'learning_rate': 3.22517478243432e-06, 'epoch': 0.63} 63%|██████▎ | 21494/34278 [23:38:58<13:34:10, 3.82s/it] 63%|██████▎ | 21495/34278 [23:39:02<13:19:34, 3.75s/it] {'loss': 0.1212, 'grad_norm': 0.8100790230582315, 'learning_rate': 3.2247331213146537e-06, 'epoch': 0.63} 63%|██████▎ | 21495/34278 [23:39:02<13:19:34, 3.75s/it] 63%|██████▎ | 21496/34278 
[23:39:05<12:53:27, 3.63s/it] {'loss': 0.0958, 'grad_norm': 1.0019922446123124, 'learning_rate': 3.2242914760441492e-06, 'epoch': 0.63} 63%|██████▎ | 21496/34278 [23:39:05<12:53:27, 3.63s/it] 63%|██████▎ | 21497/34278 [23:39:08<12:33:50, 3.54s/it] {'loss': 0.1361, 'grad_norm': 0.8196795105674665, 'learning_rate': 3.2238498466267452e-06, 'epoch': 0.63} 63%|██████▎ | 21497/34278 [23:39:08<12:33:50, 3.54s/it] 63%|██████▎ | 21498/34278 [23:39:12<13:04:36, 3.68s/it] {'loss': 0.1417, 'grad_norm': 1.0121264732408293, 'learning_rate': 3.2234082330663862e-06, 'epoch': 0.63} 63%|██████▎ | 21498/34278 [23:39:12<13:04:36, 3.68s/it] 63%|██████▎ | 21499/34278 [23:39:15<12:24:33, 3.50s/it] {'loss': 0.1257, 'grad_norm': 0.8623796532033311, 'learning_rate': 3.2229666353670157e-06, 'epoch': 0.63} 63%|██████▎ | 21499/34278 [23:39:15<12:24:33, 3.50s/it] 63%|██████▎ | 21500/34278 [23:39:20<13:50:17, 3.90s/it] {'loss': 0.0941, 'grad_norm': 0.706500786192673, 'learning_rate': 3.2225250535325734e-06, 'epoch': 0.63} 63%|██████▎ | 21500/34278 [23:39:20<13:50:17, 3.90s/it] 63%|██████▎ | 21501/34278 [23:39:26<15:51:12, 4.47s/it] {'loss': 0.1343, 'grad_norm': 1.0776435973948162, 'learning_rate': 3.2220834875670025e-06, 'epoch': 0.63} 63%|██████▎ | 21501/34278 [23:39:26<15:51:12, 4.47s/it] 63%|██████▎ | 21502/34278 [23:39:30<15:09:07, 4.27s/it] {'loss': 0.1196, 'grad_norm': 0.9433531895740077, 'learning_rate': 3.2216419374742463e-06, 'epoch': 0.63} 63%|██████▎ | 21502/34278 [23:39:30<15:09:07, 4.27s/it] 63%|██████▎ | 21503/34278 [23:39:33<14:21:41, 4.05s/it] {'loss': 0.138, 'grad_norm': 0.8541372782299788, 'learning_rate': 3.221200403258247e-06, 'epoch': 0.63} 63%|██████▎ | 21503/34278 [23:39:33<14:21:41, 4.05s/it] 63%|██████▎ | 21504/34278 [23:39:37<13:34:47, 3.83s/it] {'loss': 0.1404, 'grad_norm': 1.0613124678525838, 'learning_rate': 3.220758884922946e-06, 'epoch': 0.63} 63%|██████▎ | 21504/34278 [23:39:37<13:34:47, 3.83s/it] 63%|██████▎ | 21505/34278 [23:39:39<12:35:18, 3.55s/it] {'loss': 
0.1287, 'grad_norm': 0.7149691283911175, 'learning_rate': 3.2203173824722845e-06, 'epoch': 0.63} 63%|██████▎ | 21505/34278 [23:39:39<12:35:18, 3.55s/it] 63%|██████▎ | 21506/34278 [23:39:43<12:45:46, 3.60s/it] {'loss': 0.1309, 'grad_norm': 0.8569712554103952, 'learning_rate': 3.2198758959102044e-06, 'epoch': 0.63} 63%|██████▎ | 21506/34278 [23:39:43<12:45:46, 3.60s/it] 63%|██████▎ | 21507/34278 [23:39:47<12:32:11, 3.53s/it] {'loss': 0.1076, 'grad_norm': 0.8563520174119542, 'learning_rate': 3.219434425240646e-06, 'epoch': 0.63} 63%|██████▎ | 21507/34278 [23:39:47<12:32:11, 3.53s/it] 63%|██████▎ | 21508/34278 [23:39:50<12:33:21, 3.54s/it] {'loss': 0.1434, 'grad_norm': 0.8683837521634081, 'learning_rate': 3.218992970467554e-06, 'epoch': 0.63} 63%|██████▎ | 21508/34278 [23:39:50<12:33:21, 3.54s/it] 63%|██████▎ | 21509/34278 [23:39:54<13:07:02, 3.70s/it] {'loss': 0.1175, 'grad_norm': 0.7393628992403547, 'learning_rate': 3.218551531594868e-06, 'epoch': 0.63} 63%|██████▎ | 21509/34278 [23:39:54<13:07:02, 3.70s/it] 63%|██████▎ | 21510/34278 [23:39:57<12:15:38, 3.46s/it] {'loss': 0.1128, 'grad_norm': 0.7744820819302393, 'learning_rate': 3.218110108626528e-06, 'epoch': 0.63} 63%|██████▎ | 21510/34278 [23:39:57<12:15:38, 3.46s/it] 63%|██████▎ | 21511/34278 [23:40:03<14:56:23, 4.21s/it] {'loss': 0.1169, 'grad_norm': 0.764739802815339, 'learning_rate': 3.2176687015664744e-06, 'epoch': 0.63} 63%|██████▎ | 21511/34278 [23:40:03<14:56:23, 4.21s/it] 63%|██████▎ | 21512/34278 [23:40:07<14:26:24, 4.07s/it] {'loss': 0.125, 'grad_norm': 0.8077778593371385, 'learning_rate': 3.217227310418651e-06, 'epoch': 0.63} 63%|██████▎ | 21512/34278 [23:40:07<14:26:24, 4.07s/it] 63%|██████▎ | 21513/34278 [23:40:10<14:02:12, 3.96s/it] {'loss': 0.1407, 'grad_norm': 0.7929958180323915, 'learning_rate': 3.2167859351869946e-06, 'epoch': 0.63} 63%|██████▎ | 21513/34278 [23:40:10<14:02:12, 3.96s/it] 63%|██████▎ | 21514/34278 [23:40:14<13:04:24, 3.69s/it] {'loss': 0.1162, 'grad_norm': 0.9001240930573178, 
'learning_rate': 3.2163445758754484e-06, 'epoch': 0.63} 63%|██████▎ | 21514/34278 [23:40:14<13:04:24, 3.69s/it] 63%|██████▎ | 21515/34278 [23:40:17<12:25:35, 3.51s/it] {'loss': 0.1074, 'grad_norm': 0.9234262503649429, 'learning_rate': 3.2159032324879522e-06, 'epoch': 0.63} 63%|██████▎ | 21515/34278 [23:40:17<12:25:35, 3.51s/it] 63%|██████▎ | 21516/34278 [23:40:20<11:46:30, 3.32s/it] {'loss': 0.1425, 'grad_norm': 0.6901976147452621, 'learning_rate': 3.2154619050284465e-06, 'epoch': 0.63} 63%|██████▎ | 21516/34278 [23:40:20<11:46:30, 3.32s/it] 63%|██████▎ | 21517/34278 [23:40:23<11:39:35, 3.29s/it] {'loss': 0.1164, 'grad_norm': 0.9370112500263487, 'learning_rate': 3.2150205935008715e-06, 'epoch': 0.63} 63%|██████▎ | 21517/34278 [23:40:23<11:39:35, 3.29s/it] 63%|██████▎ | 21518/34278 [23:40:26<11:59:29, 3.38s/it] {'loss': 0.1213, 'grad_norm': 0.8671280063844882, 'learning_rate': 3.2145792979091656e-06, 'epoch': 0.63} 63%|██████▎ | 21518/34278 [23:40:26<11:59:29, 3.38s/it] 63%|██████▎ | 21519/34278 [23:40:30<11:49:23, 3.34s/it] {'loss': 0.1236, 'grad_norm': 0.7151301844910141, 'learning_rate': 3.2141380182572684e-06, 'epoch': 0.63} 63%|██████▎ | 21519/34278 [23:40:30<11:49:23, 3.34s/it] 63%|██████▎ | 21520/34278 [23:40:34<13:10:13, 3.72s/it] {'loss': 0.1054, 'grad_norm': 0.7907461814608361, 'learning_rate': 3.2136967545491214e-06, 'epoch': 0.63} 63%|██████▎ | 21520/34278 [23:40:34<13:10:13, 3.72s/it] 63%|██████▎ | 21521/34278 [23:40:37<12:33:21, 3.54s/it] {'loss': 0.13, 'grad_norm': 1.0311974990272104, 'learning_rate': 3.213255506788665e-06, 'epoch': 0.63} 63%|██████▎ | 21521/34278 [23:40:37<12:33:21, 3.54s/it] 63%|██████▎ | 21522/34278 [23:40:41<12:43:06, 3.59s/it] {'loss': 0.1073, 'grad_norm': 0.654783786397787, 'learning_rate': 3.2128142749798357e-06, 'epoch': 0.63} 63%|██████▎ | 21522/34278 [23:40:41<12:43:06, 3.59s/it] 63%|██████▎ | 21523/34278 [23:40:45<12:44:55, 3.60s/it] {'loss': 0.1276, 'grad_norm': 0.7715701914786791, 'learning_rate': 3.212373059126574e-06, 
'epoch': 0.63} 63%|██████▎ | 21523/34278 [23:40:45<12:44:55, 3.60s/it] 63%|██████▎ | 21524/34278 [23:40:47<11:46:05, 3.32s/it] {'loss': 0.123, 'grad_norm': 0.8551968957402468, 'learning_rate': 3.21193185923282e-06, 'epoch': 0.63} 63%|██████▎ | 21524/34278 [23:40:47<11:46:05, 3.32s/it] 63%|██████▎ | 21525/34278 [23:40:53<14:33:11, 4.11s/it] {'loss': 0.1269, 'grad_norm': 0.9263490150613771, 'learning_rate': 3.211490675302508e-06, 'epoch': 0.63} 63%|██████▎ | 21525/34278 [23:40:53<14:33:11, 4.11s/it] 63%|██████▎ | 21526/34278 [23:41:00<16:53:35, 4.77s/it] {'loss': 0.1149, 'grad_norm': 0.7033271582772593, 'learning_rate': 3.211049507339583e-06, 'epoch': 0.63} 63%|██████▎ | 21526/34278 [23:41:00<16:53:35, 4.77s/it] 63%|██████▎ | 21527/34278 [23:41:03<15:36:40, 4.41s/it] {'loss': 0.1284, 'grad_norm': 0.7079676583610439, 'learning_rate': 3.2106083553479803e-06, 'epoch': 0.63} 63%|██████▎ | 21527/34278 [23:41:03<15:36:40, 4.41s/it] 63%|██████▎ | 21528/34278 [23:41:09<17:32:10, 4.95s/it] {'loss': 0.1159, 'grad_norm': 0.7680249952309003, 'learning_rate': 3.2101672193316396e-06, 'epoch': 0.63} 63%|██████▎ | 21528/34278 [23:41:09<17:32:10, 4.95s/it] 63%|██████▎ | 21529/34278 [23:41:13<16:06:54, 4.55s/it] {'loss': 0.1269, 'grad_norm': 0.7992028237089969, 'learning_rate': 3.209726099294499e-06, 'epoch': 0.63} 63%|██████▎ | 21529/34278 [23:41:13<16:06:54, 4.55s/it] 63%|██████▎ | 21530/34278 [23:41:16<14:26:06, 4.08s/it] {'loss': 0.1264, 'grad_norm': 0.7976631278886198, 'learning_rate': 3.2092849952404958e-06, 'epoch': 0.63} 63%|██████▎ | 21530/34278 [23:41:16<14:26:06, 4.08s/it] 63%|██████▎ | 21531/34278 [23:41:19<13:03:47, 3.69s/it] {'loss': 0.1292, 'grad_norm': 0.8019705601305389, 'learning_rate': 3.208843907173568e-06, 'epoch': 0.63} 63%|██████▎ | 21531/34278 [23:41:19<13:03:47, 3.69s/it] 63%|██████▎ | 21532/34278 [23:41:22<12:49:10, 3.62s/it] {'loss': 0.1165, 'grad_norm': 0.8028068650962843, 'learning_rate': 3.2084028350976547e-06, 'epoch': 0.63} 63%|██████▎ | 21532/34278 
[23:41:22<12:49:10, 3.62s/it] 63%|██████▎ | 21533/34278 [23:41:26<12:45:04, 3.60s/it] {'loss': 0.1262, 'grad_norm': 0.8200340992995329, 'learning_rate': 3.207961779016693e-06, 'epoch': 0.63} 63%|██████▎ | 21533/34278 [23:41:26<12:45:04, 3.60s/it] 63%|██████▎ | 21534/34278 [23:41:29<12:30:32, 3.53s/it] {'loss': 0.1202, 'grad_norm': 0.9319842655998817, 'learning_rate': 3.207520738934622e-06, 'epoch': 0.63} 63%|██████▎ | 21534/34278 [23:41:29<12:30:32, 3.53s/it] 63%|██████▎ | 21535/34278 [23:41:33<12:24:13, 3.50s/it] {'loss': 0.1269, 'grad_norm': 0.7868073065749692, 'learning_rate': 3.207079714855377e-06, 'epoch': 0.63} 63%|██████▎ | 21535/34278 [23:41:33<12:24:13, 3.50s/it] 63%|██████▎ | 21536/34278 [23:41:38<13:59:57, 3.96s/it] {'loss': 0.1134, 'grad_norm': 0.7929034376400665, 'learning_rate': 3.2066387067828964e-06, 'epoch': 0.63} 63%|██████▎ | 21536/34278 [23:41:38<13:59:57, 3.96s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 63%|██████▎ | 21537/34278 [23:41:41<13:12:21, 3.73s/it] {'loss': 0.132, 'grad_norm': 0.9960031040117716, 'learning_rate': 3.2061977147211167e-06, 'epoch': 0.63} 63%|██████▎ | 21537/34278 [23:41:41<13:12:21, 3.73s/it] 63%|██████▎ | 21538/34278 [23:41:46<14:37:35, 4.13s/it] {'loss': 0.1156, 'grad_norm': 0.8034204287393533, 'learning_rate': 3.205756738673976e-06, 'epoch': 0.63} 63%|██████▎ | 21538/34278 [23:41:46<14:37:35, 4.13s/it] 63%|██████▎ | 21539/34278 [23:41:50<14:10:27, 4.01s/it] {'loss': 0.1093, 'grad_norm': 0.8642989459802942, 'learning_rate': 3.2053157786454115e-06, 'epoch': 0.63} 63%|██████▎ | 21539/34278 [23:41:50<14:10:27, 4.01s/it] 63%|██████▎ | 21540/34278 [23:41:53<13:43:34, 3.88s/it] {'loss': 0.116, 'grad_norm': 0.8764213622877319, 'learning_rate': 3.2048748346393587e-06, 'epoch': 0.63} 63%|██████▎ | 21540/34278 [23:41:53<13:43:34, 3.88s/it] 63%|██████▎ | 21541/34278 [23:41:57<13:17:08, 3.76s/it] {'loss': 0.1141, 'grad_norm': 0.9479980537473418, 'learning_rate': 3.2044339066597554e-06, 'epoch': 0.63} 63%|██████▎ | 21541/34278 [23:41:57<13:17:08, 3.76s/it] 63%|██████▎ | 21542/34278 [23:42:00<12:38:58, 3.58s/it] {'loss': 0.12, 'grad_norm': 0.8517522390729108, 'learning_rate': 3.2039929947105373e-06, 'epoch': 0.63} 63%|██████▎ | 21542/34278 [23:42:00<12:38:58, 3.58s/it] 63%|██████▎ | 21543/34278 [23:42:05<13:54:57, 3.93s/it] {'loss': 0.1127, 'grad_norm': 0.9185116737345116, 'learning_rate': 3.2035520987956403e-06, 'epoch': 0.63} 63%|██████▎ | 21543/34278 [23:42:05<13:54:57, 3.93s/it] 63%|██████▎ | 21544/34278 [23:42:08<13:05:03, 3.70s/it] {'loss': 0.1275, 'grad_norm': 1.076201627421619, 'learning_rate': 3.2031112189190016e-06, 'epoch': 0.63} 63%|██████▎ | 21544/34278 [23:42:08<13:05:03, 3.70s/it] 63%|██████▎ | 21545/34278 [23:42:11<12:38:26, 3.57s/it] {'loss': 0.1495, 'grad_norm': 0.9351418794293074, 'learning_rate': 3.202670355084557e-06, 'epoch': 0.63} 63%|██████▎ | 21545/34278 [23:42:11<12:38:26, 3.57s/it] 
63%|██████▎ | 21546/34278 [23:42:14<11:55:21, 3.37s/it] {'loss': 0.1061, 'grad_norm': 0.7919129578114034, 'learning_rate': 3.202229507296242e-06, 'epoch': 0.63} 63%|██████▎ | 21546/34278 [23:42:14<11:55:21, 3.37s/it] 63%|██████▎ | 21547/34278 [23:42:17<12:07:32, 3.43s/it] {'loss': 0.1157, 'grad_norm': 0.9598799706381894, 'learning_rate': 3.2017886755579945e-06, 'epoch': 0.63} 63%|██████▎ | 21547/34278 [23:42:17<12:07:32, 3.43s/it] 63%|██████▎ | 21548/34278 [23:42:22<13:50:21, 3.91s/it] {'loss': 0.1213, 'grad_norm': 1.256863121251991, 'learning_rate': 3.2013478598737473e-06, 'epoch': 0.63} 63%|██████▎ | 21548/34278 [23:42:22<13:50:21, 3.91s/it] 63%|██████▎ | 21549/34278 [23:42:26<13:05:37, 3.70s/it] {'loss': 0.1055, 'grad_norm': 1.0244973188794877, 'learning_rate': 3.2009070602474364e-06, 'epoch': 0.63} 63%|██████▎ | 21549/34278 [23:42:26<13:05:37, 3.70s/it] 63%|██████▎ | 21550/34278 [23:42:29<12:29:14, 3.53s/it] {'loss': 0.1197, 'grad_norm': 0.8329326346392835, 'learning_rate': 3.200466276682998e-06, 'epoch': 0.63} 63%|██████▎ | 21550/34278 [23:42:29<12:29:14, 3.53s/it] 63%|██████▎ | 21551/34278 [23:42:32<12:18:47, 3.48s/it] {'loss': 0.144, 'grad_norm': 1.07833819645262, 'learning_rate': 3.2000255091843685e-06, 'epoch': 0.63} 63%|██████▎ | 21551/34278 [23:42:32<12:18:47, 3.48s/it] 63%|██████▎ | 21552/34278 [23:42:36<12:30:50, 3.54s/it] {'loss': 0.1278, 'grad_norm': 1.4706122642924964, 'learning_rate': 3.1995847577554805e-06, 'epoch': 0.63} 63%|██████▎ | 21552/34278 [23:42:36<12:30:50, 3.54s/it] 63%|██████▎ | 21553/34278 [23:42:42<15:00:23, 4.25s/it] {'loss': 0.1046, 'grad_norm': 0.7771298614194058, 'learning_rate': 3.1991440224002703e-06, 'epoch': 0.63} 63%|██████▎ | 21553/34278 [23:42:42<15:00:23, 4.25s/it] 63%|██████▎ | 21554/34278 [23:42:45<13:44:55, 3.89s/it] {'loss': 0.1175, 'grad_norm': 0.934078883731427, 'learning_rate': 3.1987033031226734e-06, 'epoch': 0.63} 63%|██████▎ | 21554/34278 [23:42:45<13:44:55, 3.89s/it] 63%|██████▎ | 21555/34278 
[23:42:49<14:06:22, 3.99s/it] {'loss': 0.1314, 'grad_norm': 0.8632368368291131, 'learning_rate': 3.1982625999266192e-06, 'epoch': 0.63} 63%|██████▎ | 21555/34278 [23:42:49<14:06:22, 3.99s/it] 63%|██████▎ | 21556/34278 [23:42:52<13:18:34, 3.77s/it] {'loss': 0.14, 'grad_norm': 0.9722694887346638, 'learning_rate': 3.1978219128160506e-06, 'epoch': 0.63} 63%|██████▎ | 21556/34278 [23:42:52<13:18:34, 3.77s/it] 63%|██████▎ | 21557/34278 [23:42:55<12:43:57, 3.60s/it] {'loss': 0.1179, 'grad_norm': 0.7143577061373547, 'learning_rate': 3.197381241794897e-06, 'epoch': 0.63} 63%|██████▎ | 21557/34278 [23:42:55<12:43:57, 3.60s/it] 63%|██████▎ | 21558/34278 [23:42:59<12:42:21, 3.60s/it] {'loss': 0.1145, 'grad_norm': 0.7677500897266829, 'learning_rate': 3.1969405868670923e-06, 'epoch': 0.63} 63%|██████▎ | 21558/34278 [23:42:59<12:42:21, 3.60s/it] 63%|██████▎ | 21559/34278 [23:43:04<14:25:17, 4.08s/it] {'loss': 0.1412, 'grad_norm': 1.192550759965728, 'learning_rate': 3.1964999480365732e-06, 'epoch': 0.63} 63%|██████▎ | 21559/34278 [23:43:04<14:25:17, 4.08s/it] 63%|██████▎ | 21560/34278 [23:43:10<16:29:13, 4.67s/it] {'loss': 0.1232, 'grad_norm': 1.1865159356900845, 'learning_rate': 3.1960593253072713e-06, 'epoch': 0.63} 63%|██████▎ | 21560/34278 [23:43:10<16:29:13, 4.67s/it] 63%|██████▎ | 21561/34278 [23:43:14<15:02:28, 4.26s/it] {'loss': 0.1118, 'grad_norm': 0.7064843681198242, 'learning_rate': 3.1956187186831197e-06, 'epoch': 0.63} 63%|██████▎ | 21561/34278 [23:43:14<15:02:28, 4.26s/it] 63%|██████▎ | 21562/34278 [23:43:17<13:53:31, 3.93s/it] {'loss': 0.1262, 'grad_norm': 0.8028464328453727, 'learning_rate': 3.1951781281680537e-06, 'epoch': 0.63} 63%|██████▎ | 21562/34278 [23:43:17<13:53:31, 3.93s/it] 63%|██████▎ | 21563/34278 [23:43:21<14:04:41, 3.99s/it] {'loss': 0.1282, 'grad_norm': 1.0914709799610633, 'learning_rate': 3.1947375537660073e-06, 'epoch': 0.63} 63%|██████▎ | 21563/34278 [23:43:21<14:04:41, 3.99s/it] 63%|██████▎ | 21564/34278 [23:43:25<13:43:04, 3.88s/it] {'loss': 
0.1255, 'grad_norm': 0.828817605881031, 'learning_rate': 3.1942969954809142e-06, 'epoch': 0.63} 63%|██████▎ | 21564/34278 [23:43:25<13:43:04, 3.88s/it] 63%|██████▎ | 21565/34278 [23:43:31<16:03:13, 4.55s/it] {'loss': 0.1267, 'grad_norm': 0.7945320374821138, 'learning_rate': 3.193856453316706e-06, 'epoch': 0.63} 63%|██████▎ | 21565/34278 [23:43:31<16:03:13, 4.55s/it] 63%|██████▎ | 21566/34278 [23:43:34<14:36:43, 4.14s/it] {'loss': 0.1261, 'grad_norm': 0.8926668259099348, 'learning_rate': 3.1934159272773153e-06, 'epoch': 0.63} 63%|██████▎ | 21566/34278 [23:43:34<14:36:43, 4.14s/it] 63%|██████▎ | 21567/34278 [23:43:37<13:48:34, 3.91s/it] {'loss': 0.1262, 'grad_norm': 1.019131136454555, 'learning_rate': 3.192975417366675e-06, 'epoch': 0.63} 63%|██████▎ | 21567/34278 [23:43:37<13:48:34, 3.91s/it] 63%|██████▎ | 21568/34278 [23:43:40<12:59:52, 3.68s/it] {'loss': 0.1222, 'grad_norm': 1.4016807723883395, 'learning_rate': 3.1925349235887206e-06, 'epoch': 0.63} 63%|██████▎ | 21568/34278 [23:43:40<12:59:52, 3.68s/it] 63%|██████▎ | 21569/34278 [23:43:45<14:10:08, 4.01s/it] {'loss': 0.1351, 'grad_norm': 0.7037588645852593, 'learning_rate': 3.192094445947383e-06, 'epoch': 0.63} 63%|██████▎ | 21569/34278 [23:43:45<14:10:08, 4.01s/it] 63%|██████▎ | 21570/34278 [23:43:48<12:56:58, 3.67s/it] {'loss': 0.1325, 'grad_norm': 1.1536549875530948, 'learning_rate': 3.1916539844465945e-06, 'epoch': 0.63} 63%|██████▎ | 21570/34278 [23:43:48<12:56:58, 3.67s/it] 63%|██████▎ | 21571/34278 [23:43:52<12:46:58, 3.62s/it] {'loss': 0.1162, 'grad_norm': 1.0094463591261138, 'learning_rate': 3.1912135390902866e-06, 'epoch': 0.63} 63%|██████▎ | 21571/34278 [23:43:52<12:46:58, 3.62s/it] 63%|██████▎ | 21572/34278 [23:43:55<12:49:23, 3.63s/it] {'loss': 0.1085, 'grad_norm': 0.7365809998674072, 'learning_rate': 3.1907731098823934e-06, 'epoch': 0.63} 63%|██████▎ | 21572/34278 [23:43:55<12:49:23, 3.63s/it] 63%|██████▎ | 21573/34278 [23:43:58<12:19:18, 3.49s/it] {'loss': 0.129, 'grad_norm': 1.0474714461770684, 
'learning_rate': 3.1903326968268445e-06, 'epoch': 0.63} 63%|██████▎ | 21573/34278 [23:43:58<12:19:18, 3.49s/it] 63%|██████▎ | 21574/34278 [23:44:01<11:55:07, 3.38s/it] {'loss': 0.1489, 'grad_norm': 1.105126739367677, 'learning_rate': 3.1898922999275746e-06, 'epoch': 0.63} 63%|██████▎ | 21574/34278 [23:44:01<11:55:07, 3.38s/it] 63%|██████▎ | 21575/34278 [23:44:04<11:34:14, 3.28s/it] {'loss': 0.1301, 'grad_norm': 1.2915971982128405, 'learning_rate': 3.189451919188513e-06, 'epoch': 0.63} 63%|██████▎ | 21575/34278 [23:44:04<11:34:14, 3.28s/it] 63%|██████▎ | 21576/34278 [23:44:08<11:36:31, 3.29s/it] {'loss': 0.1258, 'grad_norm': 0.8876481332728403, 'learning_rate': 3.1890115546135946e-06, 'epoch': 0.63} 63%|██████▎ | 21576/34278 [23:44:08<11:36:31, 3.29s/it] 63%|██████▎ | 21577/34278 [23:44:11<11:39:42, 3.31s/it] {'loss': 0.1245, 'grad_norm': 0.874967281714496, 'learning_rate': 3.1885712062067474e-06, 'epoch': 0.63} 63%|██████▎ | 21577/34278 [23:44:11<11:39:42, 3.31s/it] 63%|██████▎ | 21578/34278 [23:44:17<14:22:02, 4.07s/it] {'loss': 0.1414, 'grad_norm': 1.031186192412197, 'learning_rate': 3.1881308739719043e-06, 'epoch': 0.63} 63%|██████▎ | 21578/34278 [23:44:17<14:22:02, 4.07s/it] 63%|██████▎ | 21579/34278 [23:44:23<16:15:23, 4.61s/it] {'loss': 0.1136, 'grad_norm': 0.7004382066330048, 'learning_rate': 3.1876905579129947e-06, 'epoch': 0.63} 63%|██████▎ | 21579/34278 [23:44:23<16:15:23, 4.61s/it] 63%|██████▎ | 21580/34278 [23:44:26<14:36:29, 4.14s/it] {'loss': 0.1358, 'grad_norm': 0.7906080386510678, 'learning_rate': 3.187250258033952e-06, 'epoch': 0.63} 63%|██████▎ | 21580/34278 [23:44:26<14:36:29, 4.14s/it] 63%|██████▎ | 21581/34278 [23:44:29<13:47:01, 3.91s/it] {'loss': 0.1321, 'grad_norm': 1.0132762957323436, 'learning_rate': 3.186809974338708e-06, 'epoch': 0.63} 63%|██████▎ | 21581/34278 [23:44:29<13:47:01, 3.91s/it] 63%|██████▎ | 21582/34278 [23:44:33<13:21:41, 3.79s/it] {'loss': 0.1096, 'grad_norm': 0.8342937522349811, 'learning_rate': 3.18636970683119e-06, 
'epoch': 0.63} 63%|██████▎ | 21582/34278 [23:44:33<13:21:41, 3.79s/it] 63%|██████▎ | 21583/34278 [23:44:36<12:35:10, 3.57s/it] {'loss': 0.1147, 'grad_norm': 0.7805721851565051, 'learning_rate': 3.1859294555153307e-06, 'epoch': 0.63} 63%|██████▎ | 21583/34278 [23:44:36<12:35:10, 3.57s/it] 63%|██████▎ | 21584/34278 [23:44:42<15:10:17, 4.30s/it] {'loss': 0.1242, 'grad_norm': 0.8848720620201574, 'learning_rate': 3.185489220395061e-06, 'epoch': 0.63} 63%|██████▎ | 21584/34278 [23:44:42<15:10:17, 4.30s/it] 63%|██████▎ | 21585/34278 [23:44:45<13:36:18, 3.86s/it] {'loss': 0.1267, 'grad_norm': 0.9549712490551664, 'learning_rate': 3.1850490014743073e-06, 'epoch': 0.63} 63%|██████▎ | 21585/34278 [23:44:45<13:36:18, 3.86s/it] 63%|██████▎ | 21586/34278 [23:44:48<12:39:09, 3.59s/it] {'loss': 0.1316, 'grad_norm': 0.6993995909889085, 'learning_rate': 3.1846087987570064e-06, 'epoch': 0.63} 63%|██████▎ | 21586/34278 [23:44:48<12:39:09, 3.59s/it] 63%|██████▎ | 21587/34278 [23:44:53<14:40:04, 4.16s/it] {'loss': 0.1434, 'grad_norm': 0.7656935525511819, 'learning_rate': 3.184168612247083e-06, 'epoch': 0.63} 63%|██████▎ | 21587/34278 [23:44:53<14:40:04, 4.16s/it] 63%|██████▎ | 21588/34278 [23:44:56<13:12:12, 3.75s/it] {'loss': 0.1163, 'grad_norm': 0.6832487924100321, 'learning_rate': 3.1837284419484692e-06, 'epoch': 0.63} 63%|██████▎ | 21588/34278 [23:44:56<13:12:12, 3.75s/it] 63%|██████▎ | 21589/34278 [23:44:59<12:39:08, 3.59s/it] {'loss': 0.1234, 'grad_norm': 0.6859772217800502, 'learning_rate': 3.183288287865095e-06, 'epoch': 0.63} 63%|██████▎ | 21589/34278 [23:44:59<12:39:08, 3.59s/it] 63%|██████▎ | 21590/34278 [23:45:02<12:17:53, 3.49s/it] {'loss': 0.1293, 'grad_norm': 0.7820156328027904, 'learning_rate': 3.182848150000889e-06, 'epoch': 0.63} 63%|██████▎ | 21590/34278 [23:45:02<12:17:53, 3.49s/it] 63%|██████▎ | 21591/34278 [23:45:05<11:39:37, 3.31s/it] {'loss': 0.1476, 'grad_norm': 1.0464432483476764, 'learning_rate': 3.182408028359779e-06, 'epoch': 0.63} 63%|██████▎ | 21591/34278 
[23:45:05<11:39:37, 3.31s/it] 63%|██████▎ | 21592/34278 [23:45:08<11:10:04, 3.17s/it] {'loss': 0.1092, 'grad_norm': 0.9398890972408397, 'learning_rate': 3.181967922945698e-06, 'epoch': 0.63} 63%|██████▎ | 21592/34278 [23:45:08<11:10:04, 3.17s/it] 63%|██████▎ | 21593/34278 [23:45:12<11:52:17, 3.37s/it] {'loss': 0.1228, 'grad_norm': 0.6819936556417671, 'learning_rate': 3.181527833762573e-06, 'epoch': 0.63} 63%|██████▎ | 21593/34278 [23:45:12<11:52:17, 3.37s/it] 63%|██████▎ | 21594/34278 [23:45:16<12:13:22, 3.47s/it] {'loss': 0.1217, 'grad_norm': 0.7882889215378267, 'learning_rate': 3.181087760814334e-06, 'epoch': 0.63} 63%|██████▎ | 21594/34278 [23:45:16<12:13:22, 3.47s/it] 63%|██████▎ | 21595/34278 [23:45:19<12:16:11, 3.48s/it] {'loss': 0.1269, 'grad_norm': 0.8198202330165746, 'learning_rate': 3.1806477041049088e-06, 'epoch': 0.63} 63%|██████▎ | 21595/34278 [23:45:19<12:16:11, 3.48s/it] 63%|██████▎ | 21596/34278 [23:45:23<12:38:04, 3.59s/it] {'loss': 0.1294, 'grad_norm': 0.8829989907763003, 'learning_rate': 3.1802076636382266e-06, 'epoch': 0.63} 63%|██████▎ | 21596/34278 [23:45:23<12:38:04, 3.59s/it] 63%|██████▎ | 21597/34278 [23:45:27<12:54:51, 3.67s/it] {'loss': 0.146, 'grad_norm': 0.8119415114941211, 'learning_rate': 3.1797676394182154e-06, 'epoch': 0.63} 63%|██████▎ | 21597/34278 [23:45:27<12:54:51, 3.67s/it] 63%|██████▎ | 21598/34278 [23:45:31<13:49:23, 3.92s/it] {'loss': 0.1354, 'grad_norm': 1.0180215103513393, 'learning_rate': 3.1793276314488044e-06, 'epoch': 0.63} 63%|██████▎ | 21598/34278 [23:45:31<13:49:23, 3.92s/it] 63%|██████▎ | 21599/34278 [23:45:35<13:50:27, 3.93s/it] {'loss': 0.113, 'grad_norm': 0.8614817739939169, 'learning_rate': 3.178887639733923e-06, 'epoch': 0.63} 63%|██████▎ | 21599/34278 [23:45:35<13:50:27, 3.93s/it] 63%|██████▎ | 21600/34278 [23:45:38<12:35:28, 3.58s/it] {'loss': 0.1062, 'grad_norm': 0.9822827151036375, 'learning_rate': 3.1784476642774965e-06, 'epoch': 0.63} 63%|██████▎ | 21600/34278 [23:45:38<12:35:28, 3.58s/it] 63%|██████▎ | 
21601/34278 [23:45:42<12:39:07, 3.59s/it] {'loss': 0.1336, 'grad_norm': 0.883722596679789, 'learning_rate': 3.178007705083455e-06, 'epoch': 0.63} 63%|██████▎ | 21601/34278 [23:45:42<12:39:07, 3.59s/it] 63%|██████▎ | 21602/34278 [23:45:45<12:00:19, 3.41s/it] {'loss': 0.1601, 'grad_norm': 0.8868749676557872, 'learning_rate': 3.1775677621557266e-06, 'epoch': 0.63} 63%|██████▎ | 21602/34278 [23:45:45<12:00:19, 3.41s/it] 63%|██████▎ | 21603/34278 [23:45:48<11:58:03, 3.40s/it] {'loss': 0.126, 'grad_norm': 0.8256364246893216, 'learning_rate': 3.177127835498236e-06, 'epoch': 0.63} 63%|██████▎ | 21603/34278 [23:45:48<11:58:03, 3.40s/it] 63%|██████▎ | 21604/34278 [23:45:54<14:48:21, 4.21s/it] {'loss': 0.0992, 'grad_norm': 0.7191852828996356, 'learning_rate': 3.176687925114914e-06, 'epoch': 0.63} 63%|██████▎ | 21604/34278 [23:45:54<14:48:21, 4.21s/it] 63%|██████▎ | 21605/34278 [23:45:58<14:53:18, 4.23s/it] {'loss': 0.1197, 'grad_norm': 0.9261084337875681, 'learning_rate': 3.1762480310096875e-06, 'epoch': 0.63} 63%|██████▎ | 21605/34278 [23:45:58<14:53:18, 4.23s/it] 63%|██████▎ | 21606/34278 [23:46:03<15:17:34, 4.34s/it] {'loss': 0.1343, 'grad_norm': 0.7821855119917328, 'learning_rate': 3.1758081531864836e-06, 'epoch': 0.63} 63%|██████▎ | 21606/34278 [23:46:03<15:17:34, 4.34s/it] 63%|██████▎ | 21607/34278 [23:46:06<13:41:04, 3.89s/it] {'loss': 0.1321, 'grad_norm': 1.2769804075875613, 'learning_rate': 3.1753682916492283e-06, 'epoch': 0.63} 63%|██████▎ | 21607/34278 [23:46:06<13:41:04, 3.89s/it] 63%|██████▎ | 21608/34278 [23:46:11<15:30:06, 4.40s/it] {'loss': 0.1102, 'grad_norm': 0.6846081119655252, 'learning_rate': 3.1749284464018493e-06, 'epoch': 0.63} 63%|██████▎ | 21608/34278 [23:46:12<15:30:06, 4.40s/it] 63%|██████▎ | 21609/34278 [23:46:15<14:10:16, 4.03s/it] {'loss': 0.1315, 'grad_norm': 0.803259659986723, 'learning_rate': 3.1744886174482727e-06, 'epoch': 0.63} 63%|██████▎ | 21609/34278 [23:46:15<14:10:16, 4.03s/it] 63%|██████▎ | 21610/34278 [23:46:19<14:55:38, 4.24s/it] 
{'loss': 0.1175, 'grad_norm': 1.0148603553708146, 'learning_rate': 3.174048804792426e-06, 'epoch': 0.63} 63%|██████▎ | 21610/34278 [23:46:19<14:55:38, 4.24s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 63%|██████▎ | 21611/34278 [23:46:22<13:44:08, 3.90s/it] {'loss': 0.1352, 'grad_norm': 0.8276113582559743, 'learning_rate': 3.1736090084382375e-06, 'epoch': 0.63} 63%|██████▎ | 21611/34278 [23:46:22<13:44:08, 3.90s/it] 63%|██████▎ | 21612/34278 [23:46:26<12:56:12, 3.68s/it] {'loss': 0.1189, 'grad_norm': 0.674217183269698, 'learning_rate': 3.173169228389631e-06, 'epoch': 0.63} 63%|██████▎ | 21612/34278 [23:46:26<12:56:12, 3.68s/it] 63%|██████▎ | 21613/34278 [23:46:28<11:52:04, 3.37s/it] {'loss': 0.111, 'grad_norm': 0.8854670487468641, 'learning_rate': 3.1727294646505326e-06, 'epoch': 0.63} 63%|██████▎ | 21613/34278 [23:46:28<11:52:04, 3.37s/it] 63%|██████▎ | 21614/34278 [23:46:31<11:38:18, 3.31s/it] {'loss': 0.1251, 'grad_norm': 0.7736500206964132, 'learning_rate': 3.172289717224871e-06, 'epoch': 0.63} 63%|██████▎ | 21614/34278 [23:46:31<11:38:18, 3.31s/it] 63%|██████▎ | 21615/34278 [23:46:34<11:15:17, 3.20s/it] {'loss': 0.1306, 'grad_norm': 0.8464978375916811, 'learning_rate': 3.1718499861165675e-06, 'epoch': 0.63} 63%|██████▎ | 21615/34278 [23:46:34<11:15:17, 3.20s/it] 63%|██████▎ | 21616/34278 [23:46:38<11:39:51, 3.32s/it] {'loss': 0.1349, 'grad_norm': 0.8395150571330763, 'learning_rate': 3.1714102713295538e-06, 'epoch': 0.63} 63%|██████▎ | 21616/34278 [23:46:38<11:39:51, 3.32s/it] 63%|██████▎ | 21617/34278 [23:46:41<11:15:54, 3.20s/it] {'loss': 0.119, 'grad_norm': 0.8225095010010273, 'learning_rate': 3.1709705728677516e-06, 'epoch': 0.63} 63%|██████▎ | 21617/34278 [23:46:41<11:15:54, 3.20s/it] 63%|██████▎ | 21618/34278 [23:46:46<13:19:02, 3.79s/it] {'loss': 0.1363, 'grad_norm': 
0.7485651100930605, 'learning_rate': 3.1705308907350874e-06, 'epoch': 0.63} 63%|██████▎ | 21618/34278 [23:46:46<13:19:02, 3.79s/it] 63%|██████▎ | 21619/34278 [23:46:49<12:37:05, 3.59s/it] {'loss': 0.1408, 'grad_norm': 0.8285357155703458, 'learning_rate': 3.1700912249354876e-06, 'epoch': 0.63} 63%|██████▎ | 21619/34278 [23:46:49<12:37:05, 3.59s/it] 63%|██████▎ | 21620/34278 [23:46:53<12:49:51, 3.65s/it] {'loss': 0.1147, 'grad_norm': 0.725291400812458, 'learning_rate': 3.169651575472876e-06, 'epoch': 0.63} 63%|██████▎ | 21620/34278 [23:46:53<12:49:51, 3.65s/it] 63%|██████▎ | 21621/34278 [23:46:56<12:28:32, 3.55s/it] {'loss': 0.1304, 'grad_norm': 0.8214455353634862, 'learning_rate': 3.169211942351177e-06, 'epoch': 0.63} 63%|██████▎ | 21621/34278 [23:46:56<12:28:32, 3.55s/it] 63%|██████▎ | 21622/34278 [23:47:01<13:18:18, 3.78s/it] {'loss': 0.1285, 'grad_norm': 0.8523354910578501, 'learning_rate': 3.1687723255743175e-06, 'epoch': 0.63} 63%|██████▎ | 21622/34278 [23:47:01<13:18:18, 3.78s/it] 63%|██████▎ | 21623/34278 [23:47:04<12:33:49, 3.57s/it] {'loss': 0.1254, 'grad_norm': 0.8011027639649022, 'learning_rate': 3.1683327251462214e-06, 'epoch': 0.63} 63%|██████▎ | 21623/34278 [23:47:04<12:33:49, 3.57s/it] 63%|██████▎ | 21624/34278 [23:47:07<11:46:46, 3.35s/it] {'loss': 0.14, 'grad_norm': 0.8352920934123474, 'learning_rate': 3.1678931410708147e-06, 'epoch': 0.63} 63%|██████▎ | 21624/34278 [23:47:07<11:46:46, 3.35s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 
512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0fd20d4e00> Failed to fetch sample 3679499. Exception: cannot identify image file <_io.BytesIO object at 0x7f0fd20d4e00> 63%|██████▎ | 21625/34278 [23:47:10<12:17:55, 3.50s/it] {'loss': 0.1106, 'grad_norm': 0.7782148491210552, 'learning_rate': 3.16745357335202e-06, 'epoch': 0.63} 63%|██████▎ | 21625/34278 [23:47:10<12:17:55, 3.50s/it] 63%|██████▎ | 21626/34278 [23:47:14<12:45:05, 3.63s/it] {'loss': 0.1091, 'grad_norm': 0.7553737254850275, 'learning_rate': 3.1670140219937618e-06, 'epoch': 0.63} 63%|██████▎ | 21626/34278 [23:47:14<12:45:05, 3.63s/it] 63%|██████▎ | 21627/34278 [23:47:19<13:28:57, 3.84s/it] {'loss': 0.1399, 'grad_norm': 0.844354724598551, 'learning_rate': 3.166574486999964e-06, 'epoch': 0.63} 63%|██████▎ | 21627/34278 [23:47:19<13:28:57, 3.84s/it] 63%|██████▎ | 21628/34278 [23:47:23<14:26:40, 4.11s/it] {'loss': 0.1194, 'grad_norm': 0.84286281446668, 'learning_rate': 3.1661349683745527e-06, 'epoch': 0.63} 63%|██████▎ | 21628/34278 [23:47:23<14:26:40, 4.11s/it] 63%|██████▎ | 21629/34278 [23:47:27<13:25:05, 3.82s/it] {'loss': 0.1265, 'grad_norm': 0.9342965999046021, 'learning_rate': 3.1656954661214517e-06, 'epoch': 0.63} 63%|██████▎ | 21629/34278 [23:47:27<13:25:05, 3.82s/it] 63%|██████▎ | 21630/34278 
[23:47:29<12:12:37, 3.48s/it] {'loss': 0.1242, 'grad_norm': 0.8805631626157451, 'learning_rate': 3.1652559802445824e-06, 'epoch': 0.63} 63%|██████▎ | 21630/34278 [23:47:29<12:12:37, 3.48s/it] 63%|██████▎ | 21631/34278 [23:47:32<11:40:33, 3.32s/it] {'loss': 0.1262, 'grad_norm': 0.826035691178042, 'learning_rate': 3.16481651074787e-06, 'epoch': 0.63} 63%|██████▎ | 21631/34278 [23:47:32<11:40:33, 3.32s/it] 63%|██████▎ | 21632/34278 [23:47:36<11:41:19, 3.33s/it] {'loss': 0.15, 'grad_norm': 1.0506807534626028, 'learning_rate': 3.1643770576352385e-06, 'epoch': 0.63} 63%|██████▎ | 21632/34278 [23:47:36<11:41:19, 3.33s/it] 63%|██████▎ | 21633/34278 [23:47:39<11:28:42, 3.27s/it] {'loss': 0.132, 'grad_norm': 0.9071116991791308, 'learning_rate': 3.1639376209106087e-06, 'epoch': 0.63} 63%|██████▎ | 21633/34278 [23:47:39<11:28:42, 3.27s/it] 63%|██████▎ | 21634/34278 [23:47:42<11:45:11, 3.35s/it] {'loss': 0.1183, 'grad_norm': 0.810589991856113, 'learning_rate': 3.1634982005779057e-06, 'epoch': 0.63} 63%|██████▎ | 21634/34278 [23:47:42<11:45:11, 3.35s/it] 63%|██████▎ | 21635/34278 [23:47:46<12:29:10, 3.56s/it] {'loss': 0.1338, 'grad_norm': 1.1061353599375823, 'learning_rate': 3.163058796641053e-06, 'epoch': 0.63} 63%|██████▎ | 21635/34278 [23:47:46<12:29:10, 3.56s/it] 63%|██████▎ | 21636/34278 [23:47:50<12:44:46, 3.63s/it] {'loss': 0.1386, 'grad_norm': 1.2644968792372722, 'learning_rate': 3.162619409103974e-06, 'epoch': 0.63} 63%|██████▎ | 21636/34278 [23:47:50<12:44:46, 3.63s/it] 63%|██████▎ | 21637/34278 [23:47:53<12:25:44, 3.54s/it] {'loss': 0.1226, 'grad_norm': 1.2277733713954433, 'learning_rate': 3.162180037970589e-06, 'epoch': 0.63} 63%|██████▎ | 21637/34278 [23:47:53<12:25:44, 3.54s/it] 63%|██████▎ | 21638/34278 [23:47:56<11:58:57, 3.41s/it] {'loss': 0.1272, 'grad_norm': 0.9519430141792353, 'learning_rate': 3.1617406832448226e-06, 'epoch': 0.63} 63%|██████▎ | 21638/34278 [23:47:56<11:58:57, 3.41s/it] 63%|██████▎ | 21639/34278 [23:48:00<11:42:36, 3.34s/it] {'loss': 0.1311, 
'grad_norm': 1.2693948166689224, 'learning_rate': 3.1613013449305948e-06, 'epoch': 0.63} 63%|██████▎ | 21639/34278 [23:48:00<11:42:36, 3.34s/it] 63%|██████▎ | 21640/34278 [23:48:03<11:37:04, 3.31s/it] {'loss': 0.1352, 'grad_norm': 0.9420657904999812, 'learning_rate': 3.160862023031831e-06, 'epoch': 0.63} 63%|██████▎ | 21640/34278 [23:48:03<11:37:04, 3.31s/it] 63%|██████▎ | 21641/34278 [23:48:06<11:40:26, 3.33s/it] {'loss': 0.1177, 'grad_norm': 1.052262365752245, 'learning_rate': 3.1604227175524527e-06, 'epoch': 0.63} 63%|██████▎ | 21641/34278 [23:48:06<11:40:26, 3.33s/it] 63%|██████▎ | 21642/34278 [23:48:11<12:40:50, 3.61s/it] {'loss': 0.1403, 'grad_norm': 1.098416247575714, 'learning_rate': 3.15998342849638e-06, 'epoch': 0.63} 63%|██████▎ | 21642/34278 [23:48:11<12:40:50, 3.61s/it] 63%|██████▎ | 21643/34278 [23:48:16<14:31:03, 4.14s/it] {'loss': 0.1178, 'grad_norm': 0.7520614242649607, 'learning_rate': 3.1595441558675364e-06, 'epoch': 0.63} 63%|██████▎ | 21643/34278 [23:48:16<14:31:03, 4.14s/it] 63%|██████▎ | 21644/34278 [23:48:19<13:41:00, 3.90s/it] {'loss': 0.1332, 'grad_norm': 1.3690315430696907, 'learning_rate': 3.1591048996698426e-06, 'epoch': 0.63} 63%|██████▎ | 21644/34278 [23:48:19<13:41:00, 3.90s/it] 63%|██████▎ | 21645/34278 [23:48:22<12:51:18, 3.66s/it] {'loss': 0.1458, 'grad_norm': 0.8677210659976563, 'learning_rate': 3.1586656599072205e-06, 'epoch': 0.63} 63%|██████▎ | 21645/34278 [23:48:23<12:51:18, 3.66s/it] 63%|██████▎ | 21646/34278 [23:48:27<13:48:38, 3.94s/it] {'loss': 0.1169, 'grad_norm': 0.8448804808239114, 'learning_rate': 3.1582264365835946e-06, 'epoch': 0.63} 63%|██████▎ | 21646/34278 [23:48:27<13:48:38, 3.94s/it] 63%|██████▎ | 21647/34278 [23:48:31<13:30:37, 3.85s/it] {'loss': 0.1143, 'grad_norm': 0.7226770601195681, 'learning_rate': 3.1577872297028813e-06, 'epoch': 0.63} 63%|██████▎ | 21647/34278 [23:48:31<13:30:37, 3.85s/it] 63%|██████▎ | 21648/34278 [23:48:35<13:44:52, 3.92s/it] {'loss': 0.0992, 'grad_norm': 0.7707979322774487, 
'learning_rate': 3.157348039269004e-06, 'epoch': 0.63} 63%|██████▎ | 21648/34278 [23:48:35<13:44:52, 3.92s/it] 63%|██████▎ | 21649/34278 [23:48:38<13:16:45, 3.79s/it] {'loss': 0.1156, 'grad_norm': 1.046312267048633, 'learning_rate': 3.1569088652858847e-06, 'epoch': 0.63} 63%|██████▎ | 21649/34278 [23:48:38<13:16:45, 3.79s/it] 63%|██████▎ | 21650/34278 [23:48:42<13:26:08, 3.83s/it] {'loss': 0.1208, 'grad_norm': 0.9281426295578425, 'learning_rate': 3.1564697077574403e-06, 'epoch': 0.63} 63%|██████▎ | 21650/34278 [23:48:42<13:26:08, 3.83s/it] 63%|██████▎ | 21651/34278 [23:48:47<14:54:59, 4.25s/it] {'loss': 0.1546, 'grad_norm': 0.9278221382126396, 'learning_rate': 3.156030566687597e-06, 'epoch': 0.63} 63%|██████▎ | 21651/34278 [23:48:47<14:54:59, 4.25s/it] 63%|██████▎ | 21652/34278 [23:48:51<13:57:05, 3.98s/it] {'loss': 0.1143, 'grad_norm': 0.7366072250886886, 'learning_rate': 3.155591442080271e-06, 'epoch': 0.63} 63%|██████▎ | 21652/34278 [23:48:51<13:57:05, 3.98s/it] 63%|██████▎ | 21653/34278 [23:48:54<12:53:30, 3.68s/it] {'loss': 0.1464, 'grad_norm': 1.1135366008466816, 'learning_rate': 3.1551523339393855e-06, 'epoch': 0.63} 63%|██████▎ | 21653/34278 [23:48:54<12:53:30, 3.68s/it] 63%|██████▎ | 21654/34278 [23:48:57<12:47:45, 3.65s/it] {'loss': 0.1416, 'grad_norm': 0.7537104048847073, 'learning_rate': 3.1547132422688593e-06, 'epoch': 0.63} 63%|██████▎ | 21654/34278 [23:48:57<12:47:45, 3.65s/it] 63%|██████▎ | 21655/34278 [23:49:01<12:28:46, 3.56s/it] {'loss': 0.1212, 'grad_norm': 0.8406120523779976, 'learning_rate': 3.1542741670726123e-06, 'epoch': 0.63} 63%|██████▎ | 21655/34278 [23:49:01<12:28:46, 3.56s/it] 63%|██████▎ | 21656/34278 [23:49:05<12:56:10, 3.69s/it] {'loss': 0.1427, 'grad_norm': 0.9879725891146476, 'learning_rate': 3.153835108354564e-06, 'epoch': 0.63} 63%|██████▎ | 21656/34278 [23:49:05<12:56:10, 3.69s/it] 63%|██████▎ | 21657/34278 [23:49:08<12:17:25, 3.51s/it] {'loss': 0.1104, 'grad_norm': 0.9870917487487902, 'learning_rate': 3.153396066118636e-06, 
'epoch': 0.63} 63%|██████▎ | 21657/34278 [23:49:08<12:17:25, 3.51s/it] 63%|██████▎ | 21658/34278 [23:49:11<12:30:03, 3.57s/it] {'loss': 0.1127, 'grad_norm': 0.57449428922917, 'learning_rate': 3.152957040368747e-06, 'epoch': 0.63} 63%|██████▎ | 21658/34278 [23:49:11<12:30:03, 3.57s/it] 63%|██████▎ | 21659/34278 [23:49:14<12:01:06, 3.43s/it] {'loss': 0.1076, 'grad_norm': 0.6881469921397672, 'learning_rate': 3.152518031108818e-06, 'epoch': 0.63} 63%|██████▎ | 21659/34278 [23:49:14<12:01:06, 3.43s/it] 63%|██████▎ | 21660/34278 [23:49:18<12:39:48, 3.61s/it] {'loss': 0.0951, 'grad_norm': 0.7112782570654465, 'learning_rate': 3.1520790383427657e-06, 'epoch': 0.63} 63%|██████▎ | 21660/34278 [23:49:18<12:39:48, 3.61s/it] 63%|██████▎ | 21661/34278 [23:49:21<11:57:19, 3.41s/it] {'loss': 0.117, 'grad_norm': 0.6982666678987339, 'learning_rate': 3.1516400620745112e-06, 'epoch': 0.63} 63%|██████▎ | 21661/34278 [23:49:21<11:57:19, 3.41s/it] 63%|██████▎ | 21662/34278 [23:49:27<14:45:51, 4.21s/it] {'loss': 0.1204, 'grad_norm': 0.6983628511843952, 'learning_rate': 3.1512011023079714e-06, 'epoch': 0.63} 63%|██████▎ | 21662/34278 [23:49:28<14:45:51, 4.21s/it] 63%|██████▎ | 21663/34278 [23:49:31<13:40:34, 3.90s/it] {'loss': 0.149, 'grad_norm': 0.8568639745493805, 'learning_rate': 3.1507621590470692e-06, 'epoch': 0.63} 63%|██████▎ | 21663/34278 [23:49:31<13:40:34, 3.90s/it] 63%|██████▎ | 21664/34278 [23:49:35<13:57:56, 3.99s/it] {'loss': 0.1431, 'grad_norm': 0.8962051912890376, 'learning_rate': 3.15032323229572e-06, 'epoch': 0.63} 63%|██████▎ | 21664/34278 [23:49:35<13:57:56, 3.99s/it] 63%|██████▎ | 21665/34278 [23:49:38<13:06:36, 3.74s/it] {'loss': 0.1375, 'grad_norm': 1.0384683097379803, 'learning_rate': 3.149884322057843e-06, 'epoch': 0.63} 63%|██████▎ | 21665/34278 [23:49:38<13:06:36, 3.74s/it] 63%|██████▎ | 21666/34278 [23:49:42<12:55:05, 3.69s/it] {'loss': 0.1366, 'grad_norm': 0.7291252888006237, 'learning_rate': 3.1494454283373583e-06, 'epoch': 0.63} 63%|██████▎ | 21666/34278 
[23:49:42<12:55:05, 3.69s/it] 63%|██████▎ | 21667/34278 [23:49:45<12:10:48, 3.48s/it] {'loss': 0.1428, 'grad_norm': 0.9349662375087829, 'learning_rate': 3.1490065511381816e-06, 'epoch': 0.63} 63%|██████▎ | 21667/34278 [23:49:45<12:10:48, 3.48s/it] 63%|██████▎ | 21668/34278 [23:49:49<13:16:37, 3.79s/it] {'loss': 0.1158, 'grad_norm': 0.9394634596186122, 'learning_rate': 3.1485676904642326e-06, 'epoch': 0.63} 63%|██████▎ | 21668/34278 [23:49:49<13:16:37, 3.79s/it] 63%|██████▎ | 21669/34278 [23:49:52<12:44:17, 3.64s/it] {'loss': 0.1306, 'grad_norm': 0.812259875545716, 'learning_rate': 3.1481288463194295e-06, 'epoch': 0.63} 63%|██████▎ | 21669/34278 [23:49:52<12:44:17, 3.64s/it] 63%|██████▎ | 21670/34278 [23:49:56<12:17:33, 3.51s/it] {'loss': 0.1288, 'grad_norm': 0.7069932550567836, 'learning_rate': 3.1476900187076896e-06, 'epoch': 0.63} 63%|██████▎ | 21670/34278 [23:49:56<12:17:33, 3.51s/it] 63%|██████▎ | 21671/34278 [23:49:59<11:40:41, 3.33s/it] {'loss': 0.1164, 'grad_norm': 0.9154211675415918, 'learning_rate': 3.147251207632933e-06, 'epoch': 0.63} 63%|██████▎ | 21671/34278 [23:49:59<11:40:41, 3.33s/it] 63%|██████▎ | 21672/34278 [23:50:01<11:16:03, 3.22s/it] {'loss': 0.1401, 'grad_norm': 1.0751263803716586, 'learning_rate': 3.146812413099074e-06, 'epoch': 0.63} 63%|██████▎ | 21672/34278 [23:50:01<11:16:03, 3.22s/it] 63%|██████▎ | 21673/34278 [23:50:05<11:40:36, 3.33s/it] {'loss': 0.1309, 'grad_norm': 0.8306769690576609, 'learning_rate': 3.1463736351100315e-06, 'epoch': 0.63} 63%|██████▎ | 21673/34278 [23:50:05<11:40:36, 3.33s/it] 63%|██████▎ | 21674/34278 [23:50:08<11:23:36, 3.25s/it] {'loss': 0.1242, 'grad_norm': 0.8637797866408252, 'learning_rate': 3.1459348736697214e-06, 'epoch': 0.63} 63%|██████▎ | 21674/34278 [23:50:08<11:23:36, 3.25s/it] 63%|██████▎ | 21675/34278 [23:50:14<14:11:58, 4.06s/it] {'loss': 0.1073, 'grad_norm': 0.899775583907776, 'learning_rate': 3.1454961287820627e-06, 'epoch': 0.63} 63%|██████▎ | 21675/34278 [23:50:14<14:11:58, 4.06s/it] 63%|██████▎ 
| 21676/34278 [23:50:19<15:18:06, 4.37s/it] {'loss': 0.1477, 'grad_norm': 0.9252264441433307, 'learning_rate': 3.1450574004509737e-06, 'epoch': 0.63} 63%|██████▎ | 21676/34278 [23:50:19<15:18:06, 4.37s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0cab1c1a30>
Failed to fetch sample 3268266.
Exception: cannot identify image file <_io.BytesIO object at 0x7f0cab1c1a30>
63%|██████▎ | 21677/34278 [23:50:23<14:25:24, 4.12s/it] {'loss': 0.1186, 'grad_norm': 0.8818618303142279, 'learning_rate': 3.144618688680368e-06, 'epoch': 0.63} 63%|██████▎ | 21677/34278 [23:50:23<14:25:24, 4.12s/it] 63%|██████▎ | 21678/34278 [23:50:27<14:22:29, 4.11s/it] {'loss': 0.1442, 'grad_norm': 1.1302293896714315, 'learning_rate': 3.144179993474164e-06, 'epoch': 0.63} 63%|██████▎ | 21678/34278 [23:50:27<14:22:29, 4.11s/it] 63%|██████▎ | 21679/34278 [23:50:30<13:32:25, 3.87s/it] {'loss': 0.133, 'grad_norm': 0.9210576401465935, 'learning_rate': 3.143741314836279e-06, 'epoch': 0.63} 63%|██████▎ | 21679/34278 [23:50:30<13:32:25, 3.87s/it] 63%|██████▎ | 21680/34278 [23:50:36<16:00:38, 4.58s/it] {'loss': 0.1169, 'grad_norm': 0.6783099486101454, 'learning_rate': 3.143302652770625e-06, 'epoch': 0.63} 63%|██████▎ | 21680/34278 [23:50:36<16:00:38, 4.58s/it] 63%|██████▎ | 21681/34278 [23:50:40<14:39:11, 4.19s/it] {'loss': 0.1219, 'grad_norm': 0.9228908753654181, 'learning_rate': 3.142864007281125e-06, 'epoch': 0.63} 63%|██████▎ | 21681/34278 [23:50:40<14:39:11, 4.19s/it] 63%|██████▎ | 21682/34278 [23:50:43<13:44:28, 3.93s/it] {'loss': 0.1161, 'grad_norm': 0.748219865458826, 'learning_rate': 3.142425378371691e-06, 'epoch': 0.63} 63%|██████▎ | 21682/34278 [23:50:43<13:44:28, 3.93s/it] 63%|██████▎ | 21683/34278 [23:50:48<14:31:38, 4.15s/it] {'loss': 0.1249, 'grad_norm': 0.7959972009416684, 'learning_rate': 3.1419867660462393e-06, 'epoch': 0.63} 63%|██████▎ | 21683/34278 [23:50:48<14:31:38, 4.15s/it] 63%|██████▎ | 21684/34278 [23:50:51<13:24:38, 3.83s/it] {'loss': 0.1207, 'grad_norm': 0.766086134365, 'learning_rate': 3.1415481703086875e-06, 'epoch': 0.63} 63%|██████▎ | 21684/34278 [23:50:51<13:24:38, 3.83s/it] 63%|██████▎ | 21685/34278 [23:50:55<13:53:45, 3.97s/it] {'loss': 0.1139, 'grad_norm': 0.7530772631663772, 'learning_rate': 3.1411095911629493e-06, 'epoch': 0.63} 63%|██████▎ | 21685/34278
[23:50:55<13:53:45, 3.97s/it] 63%|██████▎ | 21686/34278 [23:50:58<13:18:52, 3.81s/it] {'loss': 0.1403, 'grad_norm': 0.9548755474435264, 'learning_rate': 3.1406710286129395e-06, 'epoch': 0.63} 63%|██████▎ | 21686/34278 [23:50:58<13:18:52, 3.81s/it] 63%|██████▎ | 21687/34278 [23:51:02<12:37:06, 3.61s/it] {'loss': 0.1414, 'grad_norm': 0.8235797774988136, 'learning_rate': 3.1402324826625758e-06, 'epoch': 0.63} 63%|██████▎ | 21687/34278 [23:51:02<12:37:06, 3.61s/it] 63%|██████▎ | 21688/34278 [23:51:05<12:03:06, 3.45s/it] {'loss': 0.1318, 'grad_norm': 0.749668243905599, 'learning_rate': 3.139793953315773e-06, 'epoch': 0.63} 63%|██████▎ | 21688/34278 [23:51:05<12:03:06, 3.45s/it] 63%|██████▎ | 21689/34278 [23:51:08<12:06:59, 3.46s/it] {'loss': 0.1006, 'grad_norm': 0.9450579388649207, 'learning_rate': 3.139355440576446e-06, 'epoch': 0.63} 63%|██████▎ | 21689/34278 [23:51:08<12:06:59, 3.46s/it] 63%|██████▎ | 21690/34278 [23:51:14<14:55:25, 4.27s/it] {'loss': 0.1392, 'grad_norm': 0.802373093106417, 'learning_rate': 3.1389169444485092e-06, 'epoch': 0.63} 63%|██████▎ | 21690/34278 [23:51:14<14:55:25, 4.27s/it] 63%|██████▎ | 21691/34278 [23:51:17<13:32:23, 3.87s/it] {'loss': 0.1293, 'grad_norm': 0.9469888424479527, 'learning_rate': 3.138478464935877e-06, 'epoch': 0.63} 63%|██████▎ | 21691/34278 [23:51:17<13:32:23, 3.87s/it] 63%|██████▎ | 21692/34278 [23:51:23<15:41:21, 4.49s/it] {'loss': 0.1415, 'grad_norm': 0.922208982657413, 'learning_rate': 3.1380400020424638e-06, 'epoch': 0.63} 63%|██████▎ | 21692/34278 [23:51:23<15:41:21, 4.49s/it] 63%|██████▎ | 21693/34278 [23:51:26<14:00:29, 4.01s/it] {'loss': 0.1035, 'grad_norm': 0.9266849914243759, 'learning_rate': 3.1376015557721875e-06, 'epoch': 0.63} 63%|██████▎ | 21693/34278 [23:51:26<14:00:29, 4.01s/it] 63%|██████▎ | 21694/34278 [23:51:29<12:58:31, 3.71s/it] {'loss': 0.1177, 'grad_norm': 0.9004134124589869, 'learning_rate': 3.1371631261289583e-06, 'epoch': 0.63} 63%|██████▎ | 21694/34278 [23:51:29<12:58:31, 3.71s/it] 63%|██████▎ | 
21695/34278 [23:51:32<12:05:54, 3.46s/it] {'loss': 0.1164, 'grad_norm': 0.8702502407614358, 'learning_rate': 3.136724713116692e-06, 'epoch': 0.63} 63%|██████▎ | 21695/34278 [23:51:32<12:05:54, 3.46s/it] 63%|██████▎ | 21696/34278 [23:51:35<11:35:15, 3.32s/it] {'loss': 0.0976, 'grad_norm': 0.7611662273728962, 'learning_rate': 3.136286316739304e-06, 'epoch': 0.63} 63%|██████▎ | 21696/34278 [23:51:35<11:35:15, 3.32s/it] 63%|██████▎ | 21697/34278 [23:51:40<13:55:41, 3.99s/it] {'loss': 0.1348, 'grad_norm': 1.0615112949234136, 'learning_rate': 3.1358479370007067e-06, 'epoch': 0.63} 63%|██████▎ | 21697/34278 [23:51:40<13:55:41, 3.99s/it] 63%|██████▎ | 21698/34278 [23:51:44<13:44:20, 3.93s/it] {'loss': 0.1151, 'grad_norm': 0.8625541826782752, 'learning_rate': 3.135409573904812e-06, 'epoch': 0.63} 63%|██████▎ | 21698/34278 [23:51:44<13:44:20, 3.93s/it] 63%|██████▎ | 21699/34278 [23:51:48<13:11:57, 3.78s/it] {'loss': 0.123, 'grad_norm': 0.9753872935193014, 'learning_rate': 3.1349712274555364e-06, 'epoch': 0.63} 63%|██████▎ | 21699/34278 [23:51:48<13:11:57, 3.78s/it] 63%|██████▎ | 21700/34278 [23:51:51<12:40:24, 3.63s/it] {'loss': 0.1111, 'grad_norm': 0.7421668580821029, 'learning_rate': 3.1345328976567923e-06, 'epoch': 0.63} 63%|██████▎ | 21700/34278 [23:51:51<12:40:24, 3.63s/it] 63%|██████▎ | 21701/34278 [23:51:56<14:25:43, 4.13s/it] {'loss': 0.1144, 'grad_norm': 0.666921239172021, 'learning_rate': 3.1340945845124948e-06, 'epoch': 0.63} 63%|██████▎ | 21701/34278 [23:51:56<14:25:43, 4.13s/it] 63%|██████▎ | 21702/34278 [23:52:00<14:15:06, 4.08s/it] {'loss': 0.1502, 'grad_norm': 0.8423616308448548, 'learning_rate': 3.133656288026554e-06, 'epoch': 0.63} 63%|██████▎ | 21702/34278 [23:52:00<14:15:06, 4.08s/it] 63%|██████▎ | 21703/34278 [23:52:03<13:23:56, 3.84s/it] {'loss': 0.096, 'grad_norm': 0.9447961324947602, 'learning_rate': 3.133218008202885e-06, 'epoch': 0.63} 63%|██████▎ | 21703/34278 [23:52:03<13:23:56, 3.84s/it] 63%|██████▎ | 21704/34278 [23:52:07<12:36:59, 3.61s/it] 
{'loss': 0.1131, 'grad_norm': 0.8084249031859451, 'learning_rate': 3.1327797450453984e-06, 'epoch': 0.63} 63%|██████▎ | 21704/34278 [23:52:07<12:36:59, 3.61s/it] 63%|██████▎ | 21705/34278 [23:52:10<12:22:21, 3.54s/it] {'loss': 0.1187, 'grad_norm': 0.7578063947746068, 'learning_rate': 3.1323414985580092e-06, 'epoch': 0.63} 63%|██████▎ | 21705/34278 [23:52:10<12:22:21, 3.54s/it] 63%|██████▎ | 21706/34278 [23:52:13<11:53:24, 3.40s/it] {'loss': 0.1348, 'grad_norm': 0.9348425084795907, 'learning_rate': 3.131903268744631e-06, 'epoch': 0.63} 63%|██████▎ | 21706/34278 [23:52:13<11:53:24, 3.40s/it] 63%|██████▎ | 21707/34278 [23:52:19<14:26:25, 4.14s/it] {'loss': 0.12, 'grad_norm': 1.0434833243171249, 'learning_rate': 3.131465055609173e-06, 'epoch': 0.63} 63%|██████▎ | 21707/34278 [23:52:19<14:26:25, 4.14s/it] 63%|██████▎ | 21708/34278 [23:52:25<16:25:18, 4.70s/it] {'loss': 0.1126, 'grad_norm': 0.6079233290908971, 'learning_rate': 3.1310268591555494e-06, 'epoch': 0.63} 63%|██████▎ | 21708/34278 [23:52:25<16:25:18, 4.70s/it] 63%|██████▎ | 21709/34278 [23:52:28<15:12:46, 4.36s/it] {'loss': 0.1135, 'grad_norm': 0.7797421497396152, 'learning_rate': 3.130588679387672e-06, 'epoch': 0.63} 63%|██████▎ | 21709/34278 [23:52:28<15:12:46, 4.36s/it] 63%|██████▎ | 21710/34278 [23:52:32<14:05:41, 4.04s/it] {'loss': 0.1189, 'grad_norm': 0.8559849384990991, 'learning_rate': 3.13015051630945e-06, 'epoch': 0.63} 63%|██████▎ | 21710/34278 [23:52:32<14:05:41, 4.04s/it] 63%|██████▎ | 21711/34278 [23:52:35<13:01:36, 3.73s/it] {'loss': 0.1053, 'grad_norm': 0.7625466746703312, 'learning_rate': 3.129712369924801e-06, 'epoch': 0.63} 63%|██████▎ | 21711/34278 [23:52:35<13:01:36, 3.73s/it] 63%|██████▎ | 21712/34278 [23:52:38<13:01:49, 3.73s/it] {'loss': 0.1402, 'grad_norm': 0.870000530641801, 'learning_rate': 3.129274240237633e-06, 'epoch': 0.63} 63%|██████▎ | 21712/34278 [23:52:38<13:01:49, 3.73s/it] 63%|██████▎ | 21713/34278 [23:52:42<12:20:00, 3.53s/it] {'loss': 0.1295, 'grad_norm': 
1.1223567464906534, 'learning_rate': 3.1288361272518575e-06, 'epoch': 0.63} 63%|██████▎ | 21713/34278 [23:52:42<12:20:00, 3.53s/it] 63%|██████▎ | 21714/34278 [23:52:46<13:17:36, 3.81s/it] {'loss': 0.1374, 'grad_norm': 0.9488875518791894, 'learning_rate': 3.128398030971387e-06, 'epoch': 0.63} 63%|██████▎ | 21714/34278 [23:52:46<13:17:36, 3.81s/it] 63%|██████▎ | 21715/34278 [23:52:49<12:40:57, 3.63s/it] {'loss': 0.1418, 'grad_norm': 0.7958988894901167, 'learning_rate': 3.127959951400131e-06, 'epoch': 0.63} 63%|██████▎ | 21715/34278 [23:52:49<12:40:57, 3.63s/it] 63%|██████▎ | 21716/34278 [23:52:55<15:16:47, 4.38s/it] {'loss': 0.1388, 'grad_norm': 1.2673980555432538, 'learning_rate': 3.127521888542001e-06, 'epoch': 0.63} 63%|██████▎ | 21716/34278 [23:52:55<15:16:47, 4.38s/it] 63%|██████▎ | 21717/34278 [23:52:59<13:59:57, 4.01s/it] {'loss': 0.1238, 'grad_norm': 0.996179579160665, 'learning_rate': 3.1270838424009097e-06, 'epoch': 0.63} 63%|██████▎ | 21717/34278 [23:52:59<13:59:57, 4.01s/it] 63%|██████▎ | 21718/34278 [23:53:03<14:35:18, 4.18s/it] {'loss': 0.1467, 'grad_norm': 0.8176115520522832, 'learning_rate': 3.126645812980767e-06, 'epoch': 0.63} 63%|██████▎ | 21718/34278 [23:53:03<14:35:18, 4.18s/it] 63%|██████▎ | 21719/34278 [23:53:06<13:33:53, 3.89s/it] {'loss': 0.1244, 'grad_norm': 0.7650193098981782, 'learning_rate': 3.126207800285484e-06, 'epoch': 0.63} 63%|██████▎ | 21719/34278 [23:53:06<13:33:53, 3.89s/it] 63%|██████▎ | 21720/34278 [23:53:11<14:31:12, 4.16s/it] {'loss': 0.1235, 'grad_norm': 0.9278519089992957, 'learning_rate': 3.1257698043189693e-06, 'epoch': 0.63} 63%|██████▎ | 21720/34278 [23:53:11<14:31:12, 4.16s/it] 63%|██████▎ | 21721/34278 [23:53:14<13:32:59, 3.88s/it] {'loss': 0.1126, 'grad_norm': 0.7736597741982416, 'learning_rate': 3.1253318250851345e-06, 'epoch': 0.63} 63%|██████▎ | 21721/34278 [23:53:14<13:32:59, 3.88s/it] 63%|██████▎ | 21722/34278 [23:53:18<13:01:40, 3.74s/it] {'loss': 0.114, 'grad_norm': 0.6927540006471979, 'learning_rate': 
3.124893862587889e-06, 'epoch': 0.63} 63%|██████▎ | 21722/34278 [23:53:18<13:01:40, 3.74s/it] 63%|██████▎ | 21723/34278 [23:53:24<15:20:03, 4.40s/it] {'loss': 0.1257, 'grad_norm': 0.9132711973936171, 'learning_rate': 3.1244559168311452e-06, 'epoch': 0.63} 63%|██████▎ | 21723/34278 [23:53:24<15:20:03, 4.40s/it] 63%|██████▎ | 21724/34278 [23:53:29<15:49:05, 4.54s/it] {'loss': 0.1343, 'grad_norm': 0.8161342566372455, 'learning_rate': 3.124017987818809e-06, 'epoch': 0.63} 63%|██████▎ | 21724/34278 [23:53:29<15:49:05, 4.54s/it] 63%|██████▎ | 21725/34278 [23:53:31<13:53:59, 3.99s/it] {'loss': 0.1135, 'grad_norm': 0.8222380140798606, 'learning_rate': 3.123580075554794e-06, 'epoch': 0.63} 63%|██████▎ | 21725/34278 [23:53:31<13:53:59, 3.99s/it] 63%|██████▎ | 21726/34278 [23:53:35<13:28:38, 3.87s/it] {'loss': 0.1088, 'grad_norm': 0.9491268803419923, 'learning_rate': 3.1231421800430084e-06, 'epoch': 0.63} 63%|██████▎ | 21726/34278 [23:53:35<13:28:38, 3.87s/it] 63%|██████▎ | 21727/34278 [23:53:41<15:49:27, 4.54s/it] {'loss': 0.146, 'grad_norm': 0.9556709148887018, 'learning_rate': 3.122704301287361e-06, 'epoch': 0.63} 63%|██████▎ | 21727/34278 [23:53:41<15:49:27, 4.54s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
63%|██████▎ | 21728/34278 [23:53:44<14:34:14, 4.18s/it] {'loss': 0.1389, 'grad_norm': 0.9430692757125659, 'learning_rate': 3.12226643929176e-06, 'epoch': 0.63} 63%|██████▎ | 21728/34278 [23:53:44<14:34:14, 4.18s/it] 63%|██████▎ | 21729/34278 [23:53:48<14:02:34, 4.03s/it] {'loss': 0.1235, 'grad_norm': 0.7764755199732704, 'learning_rate': 3.1218285940601166e-06, 'epoch': 0.63} 63%|██████▎ | 21729/34278 [23:53:48<14:02:34, 4.03s/it] 63%|██████▎ | 21730/34278 [23:53:52<13:53:23, 3.98s/it] {'loss': 0.1428, 'grad_norm': 0.7909181206676502, 'learning_rate': 3.1213907655963406e-06, 'epoch': 0.63} 63%|██████▎ | 21730/34278 [23:53:52<13:53:23, 3.98s/it] 63%|██████▎ | 21731/34278 [23:53:55<13:10:21, 3.78s/it] {'loss': 0.1104, 'grad_norm': 0.7880675764901902, 'learning_rate': 3.120952953904339e-06, 'epoch': 0.63} 63%|██████▎ | 21731/34278 [23:53:55<13:10:21, 3.78s/it] 63%|██████▎ | 21732/34278 [23:53:59<13:12:07, 3.79s/it] {'loss': 0.1136, 'grad_norm': 0.739293609115454, 'learning_rate': 3.12051515898802e-06, 'epoch': 0.63} 63%|██████▎ | 21732/34278 [23:53:59<13:12:07, 3.79s/it] 63%|██████▎ | 21733/34278 [23:54:02<12:48:34, 3.68s/it] {'loss': 0.1182, 'grad_norm': 0.8153702795905208, 'learning_rate': 3.1200773808512936e-06, 'epoch': 0.63} 63%|██████▎ | 21733/34278 [23:54:02<12:48:34, 3.68s/it] 63%|██████▎ | 21734/34278 [23:54:07<13:53:11, 3.99s/it] {'loss': 0.131, 'grad_norm': 0.7242207422869644, 'learning_rate': 3.119639619498066e-06, 'epoch': 0.63} 63%|██████▎ | 21734/34278 [23:54:07<13:53:11, 3.99s/it] 63%|██████▎ | 21735/34278 [23:54:10<13:07:04, 3.77s/it] {'loss': 0.1218, 'grad_norm': 0.8791651001527339, 'learning_rate': 3.1192018749322482e-06, 'epoch': 0.63} 63%|██████▎ | 21735/34278 [23:54:10<13:07:04, 3.77s/it] 63%|██████▎ | 21736/34278 [23:54:14<13:06:20, 3.76s/it] {'loss': 0.1277, 'grad_norm': 0.7925481732877339, 'learning_rate': 3.1187641471577478e-06, 'epoch': 0.63} 63%|██████▎ | 21736/34278 [23:54:14<13:06:20, 3.76s/it]
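The PIL.UnidentifiedImageError traceback earlier in this log shows a single corrupt image aborting a sample fetch, after which the run recovers ("Failed to fetch sample 3268266.") and training continues. A minimal sketch of that skip-and-resample pattern, assuming a map-style dataset; the class and exception names here are hypothetical and not taken from aguvis/dataset.py:

```python
class BrokenImageError(Exception):
    """Stands in for PIL.UnidentifiedImageError in this sketch."""


class RobustDataset:
    """Map-style dataset that falls back to a neighbouring index on fetch errors."""

    def __init__(self, records):
        # Each record is either usable data or an Exception instance,
        # which simulates a corrupt image that PIL cannot open.
        self.records = records

    def __len__(self):
        return len(self.records)

    def _get_item(self, i):
        record = self.records[i]
        if isinstance(record, Exception):
            raise record
        return record

    def __getitem__(self, i):
        # Try the requested index first, then fall back to neighbours so a
        # single corrupt file cannot kill a multi-day training run.
        for attempt in range(len(self.records)):
            j = (i + attempt) % len(self.records)
            try:
                return self._get_item(j)
            except Exception as exc:
                print(f"Failed to fetch sample {j}. Exception: {exc}")
        raise RuntimeError("no readable sample found")


ds = RobustDataset(["a", BrokenImageError("cannot identify image file"), "c"])
print(ds[1])  # index 1 is corrupt, so the next readable sample is returned
```

Substituting the next index is one policy; picking a random replacement index avoids systematically over-sampling the neighbour of every corrupt file.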
63%|██████▎ | 21737/34278 [23:54:17<12:41:03, 3.64s/it] {'loss': 0.127, 'grad_norm': 1.0229415245648514, 'learning_rate': 3.1183264361784716e-06, 'epoch': 0.63} 63%|██████▎ | 21737/34278 [23:54:17<12:41:03, 3.64s/it] 63%|██████▎ | 21738/34278 [23:54:21<12:39:58, 3.64s/it] {'loss': 0.1278, 'grad_norm': 1.0213441175537599, 'learning_rate': 3.117888741998328e-06, 'epoch': 0.63} 63%|██████▎ | 21738/34278 [23:54:21<12:39:58, 3.64s/it] 63%|██████▎ | 21739/34278 [23:54:26<14:28:37, 4.16s/it] {'loss': 0.1324, 'grad_norm': 0.8416204239503033, 'learning_rate': 3.1174510646212247e-06, 'epoch': 0.63} 63%|██████▎ | 21739/34278 [23:54:26<14:28:37, 4.16s/it] 63%|██████▎ | 21740/34278 [23:54:30<13:48:21, 3.96s/it] {'loss': 0.1569, 'grad_norm': 0.9804147924450002, 'learning_rate': 3.117013404051066e-06, 'epoch': 0.63} 63%|██████▎ | 21740/34278 [23:54:30<13:48:21, 3.96s/it] 63%|██████▎ | 21741/34278 [23:54:35<15:02:36, 4.32s/it] {'loss': 0.1202, 'grad_norm': 1.604150237655962, 'learning_rate': 3.1165757602917653e-06, 'epoch': 0.63} 63%|██████▎ | 21741/34278 [23:54:35<15:02:36, 4.32s/it] 63%|██████▎ | 21742/34278 [23:54:38<13:42:30, 3.94s/it] {'loss': 0.1116, 'grad_norm': 0.8373751070357379, 'learning_rate': 3.1161381333472253e-06, 'epoch': 0.63} 63%|██████▎ | 21742/34278 [23:54:38<13:42:30, 3.94s/it] 63%|██████▎ | 21743/34278 [23:54:44<15:47:28, 4.54s/it] {'loss': 0.1354, 'grad_norm': 0.7782899909471713, 'learning_rate': 3.1157005232213542e-06, 'epoch': 0.63} 63%|██████▎ | 21743/34278 [23:54:44<15:47:28, 4.54s/it] 63%|██████▎ | 21744/34278 [23:54:48<15:15:40, 4.38s/it] {'loss': 0.1233, 'grad_norm': 0.9136252138127734, 'learning_rate': 3.115262929918061e-06, 'epoch': 0.63} 63%|██████▎ | 21744/34278 [23:54:48<15:15:40, 4.38s/it] 63%|██████▎ | 21745/34278 [23:54:51<14:07:45, 4.06s/it] {'loss': 0.1379, 'grad_norm': 0.8607119451216523, 'learning_rate': 3.114825353441249e-06, 'epoch': 0.63} 63%|██████▎ | 21745/34278 [23:54:51<14:07:45, 4.06s/it] 63%|██████▎ | 21746/34278 
[23:54:56<14:20:36, 4.12s/it] {'loss': 0.1122, 'grad_norm': 0.911061730616713, 'learning_rate': 3.1143877937948247e-06, 'epoch': 0.63} 63%|██████▎ | 21746/34278 [23:54:56<14:20:36, 4.12s/it] 63%|██████▎ | 21747/34278 [23:54:59<13:18:59, 3.83s/it] {'loss': 0.1215, 'grad_norm': 0.8118839289481564, 'learning_rate': 3.1139502509826975e-06, 'epoch': 0.63} 63%|██████▎ | 21747/34278 [23:54:59<13:18:59, 3.83s/it] 63%|██████▎ | 21748/34278 [23:55:05<15:30:36, 4.46s/it] {'loss': 0.1115, 'grad_norm': 0.6488405045595612, 'learning_rate': 3.113512725008772e-06, 'epoch': 0.63} 63%|██████▎ | 21748/34278 [23:55:05<15:30:36, 4.46s/it] 63%|██████▎ | 21749/34278 [23:55:08<13:59:15, 4.02s/it] {'loss': 0.1318, 'grad_norm': 0.7618996093964154, 'learning_rate': 3.1130752158769555e-06, 'epoch': 0.63} 63%|██████▎ | 21749/34278 [23:55:08<13:59:15, 4.02s/it] 63%|██████▎ | 21750/34278 [23:55:14<15:56:35, 4.58s/it] {'loss': 0.1144, 'grad_norm': 0.9880605615753427, 'learning_rate': 3.112637723591152e-06, 'epoch': 0.63} 63%|██████▎ | 21750/34278 [23:55:14<15:56:35, 4.58s/it] 63%|██████▎ | 21751/34278 [23:55:17<14:58:07, 4.30s/it] {'loss': 0.1663, 'grad_norm': 0.7804695926800223, 'learning_rate': 3.112200248155269e-06, 'epoch': 0.63} 63%|██████▎ | 21751/34278 [23:55:17<14:58:07, 4.30s/it] 63%|██████▎ | 21752/34278 [23:55:20<13:33:46, 3.90s/it] {'loss': 0.112, 'grad_norm': 0.9445166281720175, 'learning_rate': 3.11176278957321e-06, 'epoch': 0.63} 63%|██████▎ | 21752/34278 [23:55:20<13:33:46, 3.90s/it] 63%|██████▎ | 21753/34278 [23:55:23<12:50:05, 3.69s/it] {'loss': 0.1182, 'grad_norm': 0.7725115968509061, 'learning_rate': 3.111325347848884e-06, 'epoch': 0.63} 63%|██████▎ | 21753/34278 [23:55:23<12:50:05, 3.69s/it] 63%|██████▎ | 21754/34278 [23:55:30<15:28:00, 4.45s/it] {'loss': 0.119, 'grad_norm': 0.655354166208351, 'learning_rate': 3.1108879229861934e-06, 'epoch': 0.63} 63%|██████▎ | 21754/34278 [23:55:30<15:28:00, 4.45s/it] 63%|██████▎ | 21755/34278 [23:55:34<15:11:52, 4.37s/it] {'loss': 0.1402, 
'grad_norm': 0.7494193134613015, 'learning_rate': 3.110450514989045e-06, 'epoch': 0.63} 63%|██████▎ | 21755/34278 [23:55:34<15:11:52, 4.37s/it] 63%|██████▎ | 21756/34278 [23:55:37<13:47:34, 3.97s/it] {'loss': 0.1192, 'grad_norm': 0.9474238399449878, 'learning_rate': 3.110013123861344e-06, 'epoch': 0.63} 63%|██████▎ | 21756/34278 [23:55:37<13:47:34, 3.97s/it] 63%|██████▎ | 21757/34278 [23:55:41<14:11:16, 4.08s/it] {'loss': 0.1343, 'grad_norm': 0.8790234496562304, 'learning_rate': 3.1095757496069934e-06, 'epoch': 0.63} 63%|██████▎ | 21757/34278 [23:55:41<14:11:16, 4.08s/it] 63%|██████▎ | 21758/34278 [23:55:45<13:32:51, 3.90s/it] {'loss': 0.1185, 'grad_norm': 0.7351632736785756, 'learning_rate': 3.1091383922298982e-06, 'epoch': 0.63} 63%|██████▎ | 21758/34278 [23:55:45<13:32:51, 3.90s/it] 63%|██████▎ | 21759/34278 [23:55:47<12:19:06, 3.54s/it] {'loss': 0.1278, 'grad_norm': 0.8324877787027928, 'learning_rate': 3.1087010517339656e-06, 'epoch': 0.63} 63%|██████▎ | 21759/34278 [23:55:47<12:19:06, 3.54s/it] 63%|██████▎ | 21760/34278 [23:55:52<13:29:36, 3.88s/it] {'loss': 0.1122, 'grad_norm': 0.8039823791184235, 'learning_rate': 3.1082637281230977e-06, 'epoch': 0.63} 63%|██████▎ | 21760/34278 [23:55:52<13:29:36, 3.88s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
63%|██████▎ | 21761/34278 [23:55:56<13:31:16, 3.89s/it] {'loss': 0.107, 'grad_norm': 0.8083964921544764, 'learning_rate': 3.107826421401201e-06, 'epoch': 0.63} 63%|██████▎ | 21761/34278 [23:55:56<13:31:16, 3.89s/it] 63%|██████▎ | 21762/34278 [23:56:02<15:48:04, 4.54s/it] {'loss': 0.109, 'grad_norm': 0.7078659341223058, 'learning_rate': 3.107389131572178e-06, 'epoch': 0.63} 63%|██████▎ | 21762/34278 [23:56:02<15:48:04, 4.54s/it] 63%|██████▎ | 21763/34278 [23:56:06<15:16:32, 4.39s/it] {'loss': 0.1635, 'grad_norm': 1.2444803594802591, 'learning_rate': 3.1069518586399323e-06, 'epoch': 0.63} 63%|██████▎ | 21763/34278 [23:56:06<15:16:32, 4.39s/it] 63%|██████▎ | 21764/34278 [23:56:11<16:05:01, 4.63s/it] {'loss': 0.1134, 'grad_norm': 0.9175885724502149, 'learning_rate': 3.1065146026083675e-06, 'epoch': 0.63} 63%|██████▎ | 21764/34278 [23:56:11<16:05:01, 4.63s/it] 63%|██████▎ | 21765/34278 [23:56:16<15:57:19, 4.59s/it] {'loss': 0.1361, 'grad_norm': 0.8921175961086458, 'learning_rate': 3.1060773634813895e-06, 'epoch': 0.63} 63%|██████▎ | 21765/34278 [23:56:16<15:57:19, 4.59s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
63%|██████▎ | 21766/34278 [23:56:19<14:36:35, 4.20s/it] {'loss': 0.1332, 'grad_norm': 0.8195935800664395, 'learning_rate': 3.1056401412629023e-06, 'epoch': 0.63} 63%|██████▎ | 21766/34278 [23:56:19<14:36:35, 4.20s/it] 64%|██████▎ | 21767/34278 [23:56:23<13:52:13, 3.99s/it] {'loss': 0.1453, 'grad_norm': 0.8485398500224487, 'learning_rate': 3.105202935956806e-06, 'epoch': 0.64} 64%|██████▎ | 21767/34278 [23:56:23<13:52:13, 3.99s/it] 64%|██████▎ | 21768/34278 [23:56:26<13:04:34, 3.76s/it] {'loss': 0.0851, 'grad_norm': 0.7762656619937783, 'learning_rate': 3.104765747567005e-06, 'epoch': 0.64} 64%|██████▎ | 21768/34278 [23:56:26<13:04:34, 3.76s/it] 64%|██████▎ | 21769/34278 [23:56:30<13:57:54, 4.02s/it] {'loss': 0.121, 'grad_norm': 0.7682391691572218, 'learning_rate': 3.104328576097405e-06, 'epoch': 0.64} 64%|██████▎ | 21769/34278 [23:56:30<13:57:54, 4.02s/it] 64%|██████▎ | 21770/34278 [23:56:34<13:08:27, 3.78s/it] {'loss': 0.1106, 'grad_norm': 0.7209647293901071, 'learning_rate': 3.1038914215519035e-06, 'epoch': 0.64} 64%|██████▎ | 21770/34278 [23:56:34<13:08:27, 3.78s/it] 64%|██████▎ | 21771/34278 [23:56:37<12:19:38, 3.55s/it] {'loss': 0.0989, 'grad_norm': 0.7839635461636388, 'learning_rate': 3.1034542839344094e-06, 'epoch': 0.64} 64%|██████▎ | 21771/34278 [23:56:37<12:19:38, 3.55s/it] 64%|██████▎ | 21772/34278 [23:56:40<12:12:20, 3.51s/it] {'loss': 0.1179, 'grad_norm': 0.9488156725088999, 'learning_rate': 3.1030171632488226e-06, 'epoch': 0.64} 64%|██████▎ | 21772/34278 [23:56:40<12:12:20, 3.51s/it] 64%|██████▎ | 21773/34278 [23:56:43<11:24:00, 3.28s/it] {'loss': 0.1145, 'grad_norm': 0.8923147396641986, 'learning_rate': 3.102580059499045e-06, 'epoch': 0.64} 64%|██████▎ | 21773/34278 [23:56:43<11:24:00, 3.28s/it] 64%|██████▎ | 21774/34278 [23:56:46<11:00:31, 3.17s/it] {'loss': 0.1352, 'grad_norm': 0.8070798670150509, 'learning_rate': 3.1021429726889808e-06, 'epoch': 0.64} 64%|██████▎ | 21774/34278 [23:56:46<11:00:31, 3.17s/it]
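For reference, the bracketed timing in each progress entry, e.g. `21630/34278 [23:47:29<12:12:37, 3.48s/it]`, reads as completed/total steps, elapsed time, estimated time remaining, and mean seconds per iteration. A sketch of the ETA arithmetic (tqdm computes the estimate from the unrounded rate, so recomputing from the rounded 3.48 s/it lands about a minute off the logged 12:12:37):

```python
from datetime import timedelta


def eta(current_step: int, total_steps: int, sec_per_it: float) -> str:
    """Estimated time remaining: remaining steps times average seconds per step."""
    remaining = total_steps - current_step
    return str(timedelta(seconds=int(remaining * sec_per_it)))


# 12648 steps left at ~3.48 s/it, as in the first progress entry above
print(eta(21630, 34278, 3.48))
```

At roughly 3.5 s/it, each additional 0.1 s/it of dataloader or checkpointing overhead adds about 20 minutes to the remaining 12.6k steps, which is why the occasional 4.5 s/it spikes in this log matter.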
64%|██████▎ | 21775/34278 [23:56:49<10:40:19, 3.07s/it] {'loss': 0.1068, 'grad_norm': 1.0570815815348948, 'learning_rate': 3.1017059028225303e-06, 'epoch': 0.64} 64%|██████▎ | 21775/34278 [23:56:49<10:40:19, 3.07s/it] 64%|██████▎ | 21776/34278 [23:56:52<11:25:59, 3.29s/it] {'loss': 0.111, 'grad_norm': 0.8505928246591681, 'learning_rate': 3.1012688499035955e-06, 'epoch': 0.64} 64%|██████▎ | 21776/34278 [23:56:52<11:25:59, 3.29s/it] 64%|██████▎ | 21777/34278 [23:56:56<11:52:12, 3.42s/it] {'loss': 0.1308, 'grad_norm': 0.9204157772627148, 'learning_rate': 3.1008318139360795e-06, 'epoch': 0.64} 64%|██████▎ | 21777/34278 [23:56:56<11:52:12, 3.42s/it] 64%|██████▎ | 21778/34278 [23:56:59<11:41:03, 3.37s/it] {'loss': 0.0989, 'grad_norm': 0.933776237861263, 'learning_rate': 3.100394794923884e-06, 'epoch': 0.64} 64%|██████▎ | 21778/34278 [23:56:59<11:41:03, 3.37s/it] 64%|██████▎ | 21779/34278 [23:57:03<11:40:10, 3.36s/it] {'loss': 0.1189, 'grad_norm': 1.0928731214810563, 'learning_rate': 3.0999577928709114e-06, 'epoch': 0.64} 64%|██████▎ | 21779/34278 [23:57:03<11:40:10, 3.36s/it] 64%|██████▎ | 21780/34278 [23:57:07<12:21:48, 3.56s/it] {'loss': 0.1253, 'grad_norm': 1.0584957960818002, 'learning_rate': 3.0995208077810613e-06, 'epoch': 0.64} 64%|██████▎ | 21780/34278 [23:57:07<12:21:48, 3.56s/it] 64%|██████▎ | 21781/34278 [23:57:10<11:38:16, 3.35s/it] {'loss': 0.1262, 'grad_norm': 1.1712067444457757, 'learning_rate': 3.0990838396582357e-06, 'epoch': 0.64} 64%|██████▎ | 21781/34278 [23:57:10<11:38:16, 3.35s/it] 64%|██████▎ | 21782/34278 [23:57:13<11:25:55, 3.29s/it] {'loss': 0.0927, 'grad_norm': 0.6716273021083589, 'learning_rate': 3.0986468885063344e-06, 'epoch': 0.64} 64%|██████▎ | 21782/34278 [23:57:13<11:25:55, 3.29s/it] 64%|██████▎ | 21783/34278 [23:57:16<11:21:07, 3.27s/it] {'loss': 0.1352, 'grad_norm': 1.0672201278160154, 'learning_rate': 3.098209954329262e-06, 'epoch': 0.64} 64%|██████▎ | 21783/34278 [23:57:16<11:21:07, 3.27s/it] 64%|██████▎ | 21784/34278 
[23:57:19<11:14:24, 3.24s/it] {'loss': 0.1437, 'grad_norm': 0.9193766952718748, 'learning_rate': 3.0977730371309154e-06, 'epoch': 0.64} 64%|██████▎ | 21784/34278 [23:57:19<11:14:24, 3.24s/it] 64%|██████▎ | 21785/34278 [23:57:25<13:45:17, 3.96s/it] {'loss': 0.1117, 'grad_norm': 0.8749013625383029, 'learning_rate': 3.0973361369151977e-06, 'epoch': 0.64} 64%|██████▎ | 21785/34278 [23:57:25<13:45:17, 3.96s/it] 64%|██████▎ | 21786/34278 [23:57:31<15:58:00, 4.60s/it] {'loss': 0.122, 'grad_norm': 1.0707215369033931, 'learning_rate': 3.09689925368601e-06, 'epoch': 0.64} 64%|██████▎ | 21786/34278 [23:57:31<15:58:00, 4.60s/it] 64%|██████▎ | 21787/34278 [23:57:34<14:18:55, 4.13s/it] {'loss': 0.1101, 'grad_norm': 0.6587241875285823, 'learning_rate': 3.0964623874472503e-06, 'epoch': 0.64} 64%|██████▎ | 21787/34278 [23:57:34<14:18:55, 4.13s/it] 64%|██████▎ | 21788/34278 [23:57:40<16:30:27, 4.76s/it] {'loss': 0.114, 'grad_norm': 0.9705001267168245, 'learning_rate': 3.0960255382028193e-06, 'epoch': 0.64} 64%|██████▎ | 21788/34278 [23:57:40<16:30:27, 4.76s/it] 64%|██████▎ | 21789/34278 [23:57:43<14:46:22, 4.26s/it] {'loss': 0.1406, 'grad_norm': 1.2154443540220972, 'learning_rate': 3.095588705956618e-06, 'epoch': 0.64} 64%|██████▎ | 21789/34278 [23:57:43<14:46:22, 4.26s/it] 64%|██████▎ | 21790/34278 [23:57:46<13:26:20, 3.87s/it] {'loss': 0.1327, 'grad_norm': 0.8496619936358655, 'learning_rate': 3.0951518907125468e-06, 'epoch': 0.64} 64%|██████▎ | 21790/34278 [23:57:46<13:26:20, 3.87s/it] 64%|██████▎ | 21791/34278 [23:57:49<12:28:52, 3.60s/it] {'loss': 0.1229, 'grad_norm': 0.741635813731282, 'learning_rate': 3.094715092474505e-06, 'epoch': 0.64} 64%|██████▎ | 21791/34278 [23:57:49<12:28:52, 3.60s/it] 64%|██████▎ | 21792/34278 [23:57:55<14:54:43, 4.30s/it] {'loss': 0.1253, 'grad_norm': 0.8686250528099158, 'learning_rate': 3.094278311246392e-06, 'epoch': 0.64} 64%|██████▎ | 21792/34278 [23:57:55<14:54:43, 4.30s/it] 64%|██████▎ | 21793/34278 [23:57:58<13:42:34, 3.95s/it] {'loss': 
0.1382, 'grad_norm': 0.7785649933729758, 'learning_rate': 3.093841547032107e-06, 'epoch': 0.64} 64%|██████▎ | 21793/34278 [23:57:58<13:42:34, 3.95s/it] 64%|██████▎ | 21794/34278 [23:58:02<13:22:47, 3.86s/it] {'loss': 0.1135, 'grad_norm': 0.9243323103759059, 'learning_rate': 3.093404799835548e-06, 'epoch': 0.64} 64%|██████▎ | 21794/34278 [23:58:02<13:22:47, 3.86s/it] 64%|██████▎ | 21795/34278 [23:58:05<12:19:15, 3.55s/it] {'loss': 0.1303, 'grad_norm': 0.9318614880552503, 'learning_rate': 3.092968069660618e-06, 'epoch': 0.64} 64%|██████▎ | 21795/34278 [23:58:05<12:19:15, 3.55s/it] 64%|██████▎ | 21796/34278 [23:58:08<12:14:16, 3.53s/it] {'loss': 0.11, 'grad_norm': 0.8252247957126727, 'learning_rate': 3.0925313565112135e-06, 'epoch': 0.64} 64%|██████▎ | 21796/34278 [23:58:08<12:14:16, 3.53s/it] 64%|██████▎ | 21797/34278 [23:58:14<14:53:43, 4.30s/it] {'loss': 0.1203, 'grad_norm': 0.7766490834908291, 'learning_rate': 3.092094660391234e-06, 'epoch': 0.64} 64%|██████▎ | 21797/34278 [23:58:14<14:53:43, 4.30s/it] 64%|██████▎ | 21798/34278 [23:58:18<14:20:42, 4.14s/it] {'loss': 0.1222, 'grad_norm': 0.8685011647973678, 'learning_rate': 3.0916579813045764e-06, 'epoch': 0.64} 64%|██████▎ | 21798/34278 [23:58:18<14:20:42, 4.14s/it] 64%|██████▎ | 21799/34278 [23:58:21<13:23:39, 3.86s/it] {'loss': 0.0998, 'grad_norm': 0.8992556388119634, 'learning_rate': 3.0912213192551434e-06, 'epoch': 0.64} 64%|██████▎ | 21799/34278 [23:58:21<13:23:39, 3.86s/it] 64%|██████▎ | 21800/34278 [23:58:25<13:16:43, 3.83s/it] {'loss': 0.1154, 'grad_norm': 0.798910637193303, 'learning_rate': 3.090784674246826e-06, 'epoch': 0.64} 64%|██████▎ | 21800/34278 [23:58:25<13:16:43, 3.83s/it] 64%|██████▎ | 21801/34278 [23:58:29<13:14:51, 3.82s/it] {'loss': 0.1308, 'grad_norm': 0.97939781384001, 'learning_rate': 3.0903480462835323e-06, 'epoch': 0.64} 64%|██████▎ | 21801/34278 [23:58:29<13:14:51, 3.82s/it] 64%|██████▎ | 21802/34278 [23:58:32<12:40:16, 3.66s/it] {'loss': 0.1358, 'grad_norm': 1.0612139597400803, 
'learning_rate': 3.089911435369153e-06, 'epoch': 0.64} 64%|██████▎ | 21802/34278 [23:58:32<12:40:16, 3.66s/it] 64%|██████▎ | 21803/34278 [23:58:35<11:50:13, 3.42s/it] {'loss': 0.1333, 'grad_norm': 0.9971849600337275, 'learning_rate': 3.0894748415075887e-06, 'epoch': 0.64} 64%|██████▎ | 21803/34278 [23:58:35<11:50:13, 3.42s/it] 64%|██████▎ | 21804/34278 [23:58:40<13:36:47, 3.93s/it] {'loss': 0.121, 'grad_norm': 0.7554802910512478, 'learning_rate': 3.0890382647027382e-06, 'epoch': 0.64} 64%|██████▎ | 21804/34278 [23:58:40<13:36:47, 3.93s/it] 64%|██████▎ | 21805/34278 [23:58:46<15:29:11, 4.47s/it] {'loss': 0.1412, 'grad_norm': 1.1523452522450974, 'learning_rate': 3.0886017049584963e-06, 'epoch': 0.64} 64%|██████▎ | 21805/34278 [23:58:46<15:29:11, 4.47s/it] 64%|██████▎ | 21806/34278 [23:58:49<14:22:14, 4.15s/it] {'loss': 0.1324, 'grad_norm': 1.0359600180640038, 'learning_rate': 3.088165162278762e-06, 'epoch': 0.64} 64%|██████▎ | 21806/34278 [23:58:49<14:22:14, 4.15s/it] 64%|██████▎ | 21807/34278 [23:58:52<13:17:23, 3.84s/it] {'loss': 0.1022, 'grad_norm': 1.3569448829642885, 'learning_rate': 3.087728636667433e-06, 'epoch': 0.64} 64%|██████▎ | 21807/34278 [23:58:52<13:17:23, 3.84s/it] 64%|██████▎ | 21808/34278 [23:58:56<13:11:58, 3.81s/it] {'loss': 0.1332, 'grad_norm': 1.2866344165629882, 'learning_rate': 3.0872921281284063e-06, 'epoch': 0.64} 64%|██████▎ | 21808/34278 [23:58:56<13:11:58, 3.81s/it] 64%|██████▎ | 21809/34278 [23:59:01<14:33:40, 4.20s/it] {'loss': 0.1463, 'grad_norm': 1.2706445821451313, 'learning_rate': 3.08685563666558e-06, 'epoch': 0.64} 64%|██████▎ | 21809/34278 [23:59:01<14:33:40, 4.20s/it] 64%|██████▎ | 21810/34278 [23:59:07<16:40:34, 4.82s/it] {'loss': 0.1174, 'grad_norm': 1.354870983290892, 'learning_rate': 3.086419162282849e-06, 'epoch': 0.64} 64%|██████▎ | 21810/34278 [23:59:07<16:40:34, 4.82s/it] 64%|██████▎ | 21811/34278 [23:59:10<14:49:22, 4.28s/it] {'loss': 0.1216, 'grad_norm': 1.383365738194712, 'learning_rate': 3.0859827049841105e-06, 
'epoch': 0.64} 64%|██████▎ | 21811/34278 [23:59:10<14:49:22, 4.28s/it] 64%|██████▎ | 21812/34278 [23:59:13<13:31:50, 3.91s/it] {'loss': 0.1301, 'grad_norm': 0.6837405199969548, 'learning_rate': 3.0855462647732615e-06, 'epoch': 0.64} 64%|██████▎ | 21812/34278 [23:59:13<13:31:50, 3.91s/it] 64%|██████▎ | 21813/34278 [23:59:16<12:27:20, 3.60s/it] {'loss': 0.1313, 'grad_norm': 1.1635242670466868, 'learning_rate': 3.085109841654199e-06, 'epoch': 0.64} 64%|██████▎ | 21813/34278 [23:59:16<12:27:20, 3.60s/it] 64%|██████▎ | 21814/34278 [23:59:20<12:09:10, 3.51s/it] {'loss': 0.1001, 'grad_norm': 1.405525454712235, 'learning_rate': 3.084673435630819e-06, 'epoch': 0.64} 64%|██████▎ | 21814/34278 [23:59:20<12:09:10, 3.51s/it] 64%|██████▎ | 21815/34278 [23:59:23<11:52:56, 3.43s/it] {'loss': 0.1139, 'grad_norm': 1.165711951011714, 'learning_rate': 3.084237046707017e-06, 'epoch': 0.64} 64%|██████▎ | 21815/34278 [23:59:23<11:52:56, 3.43s/it] 64%|██████▎ | 21816/34278 [23:59:26<12:00:05, 3.47s/it] {'loss': 0.1213, 'grad_norm': 0.880725538768999, 'learning_rate': 3.08380067488669e-06, 'epoch': 0.64} 64%|██████▎ | 21816/34278 [23:59:26<12:00:05, 3.47s/it] 64%|██████▎ | 21817/34278 [23:59:33<14:59:43, 4.33s/it] {'loss': 0.1301, 'grad_norm': 2.381796792180809, 'learning_rate': 3.083364320173732e-06, 'epoch': 0.64} 64%|██████▎ | 21817/34278 [23:59:33<14:59:43, 4.33s/it] 64%|██████▎ | 21818/34278 [23:59:36<13:55:05, 4.02s/it] {'loss': 0.1074, 'grad_norm': 0.9165355178052541, 'learning_rate': 3.0829279825720393e-06, 'epoch': 0.64} 64%|██████▎ | 21818/34278 [23:59:36<13:55:05, 4.02s/it] 64%|██████▎ | 21819/34278 [23:59:42<15:30:44, 4.48s/it] {'loss': 0.1172, 'grad_norm': 1.0651285245372524, 'learning_rate': 3.082491662085508e-06, 'epoch': 0.64} 64%|██████▎ | 21819/34278 [23:59:42<15:30:44, 4.48s/it] 64%|██████▎ | 21820/34278 [23:59:47<16:32:25, 4.78s/it] {'loss': 0.1116, 'grad_norm': 0.8388221543037918, 'learning_rate': 3.0820553587180346e-06, 'epoch': 0.64} 64%|██████▎ | 21820/34278 
[23:59:47<16:32:25, 4.78s/it] 64%|██████▎ | 21821/34278 [23:59:50<14:54:42, 4.31s/it] {'loss': 0.1284, 'grad_norm': 0.7066232406302658, 'learning_rate': 3.081619072473514e-06, 'epoch': 0.64} 64%|██████▎ | 21821/34278 [23:59:50<14:54:42, 4.31s/it] 64%|██████▎ | 21822/34278 [23:59:54<14:13:49, 4.11s/it] {'loss': 0.1267, 'grad_norm': 0.9950246610089574, 'learning_rate': 3.0811828033558388e-06, 'epoch': 0.64} 64%|██████▎ | 21822/34278 [23:59:54<14:13:49, 4.11s/it] 64%|██████▎ | 21823/34278 [23:59:57<13:19:44, 3.85s/it] {'loss': 0.1262, 'grad_norm': 1.1898016694790707, 'learning_rate': 3.080746551368906e-06, 'epoch': 0.64} 64%|██████▎ | 21823/34278 [23:59:57<13:19:44, 3.85s/it] 64%|██████▎ | 21824/34278 [24:00:00<12:18:29, 3.56s/it] {'loss': 0.1154, 'grad_norm': 0.8613786808996372, 'learning_rate': 3.080310316516608e-06, 'epoch': 0.64} 64%|██████▎ | 21824/34278 [24:00:00<12:18:29, 3.56s/it] 64%|██████▎ | 21825/34278 [24:00:03<11:44:43, 3.40s/it] {'loss': 0.1239, 'grad_norm': 0.8929237471297329, 'learning_rate': 3.079874098802843e-06, 'epoch': 0.64} 64%|██████▎ | 21825/34278 [24:00:03<11:44:43, 3.40s/it] 64%|██████▎ | 21826/34278 [24:00:07<12:19:02, 3.56s/it] {'loss': 0.1454, 'grad_norm': 1.5433127283925474, 'learning_rate': 3.0794378982315044e-06, 'epoch': 0.64} 64%|██████▎ | 21826/34278 [24:00:07<12:19:02, 3.56s/it] 64%|██████▎ | 21827/34278 [24:00:10<11:50:32, 3.42s/it] {'loss': 0.1218, 'grad_norm': 1.2118272981373643, 'learning_rate': 3.0790017148064844e-06, 'epoch': 0.64} 64%|██████▎ | 21827/34278 [24:00:10<11:50:32, 3.42s/it] 64%|██████▎ | 21828/34278 [24:00:13<11:31:10, 3.33s/it] {'loss': 0.1148, 'grad_norm': 0.9964554054028634, 'learning_rate': 3.0785655485316788e-06, 'epoch': 0.64} 64%|██████▎ | 21828/34278 [24:00:13<11:31:10, 3.33s/it] 64%|██████▎ | 21829/34278 [24:00:17<11:51:47, 3.43s/it] {'loss': 0.1307, 'grad_norm': 0.6779271565125494, 'learning_rate': 3.0781293994109828e-06, 'epoch': 0.64} 64%|██████▎ | 21829/34278 [24:00:17<11:51:47, 3.43s/it] 64%|██████▎ 
| 21830/34278 [24:00:20<11:42:31, 3.39s/it] {'loss': 0.1347, 'grad_norm': 1.0674790488557546, 'learning_rate': 3.077693267448285e-06, 'epoch': 0.64} 64%|██████▎ | 21830/34278 [24:00:20<11:42:31, 3.39s/it] 64%|██████▎ | 21831/34278 [24:00:23<11:18:14, 3.27s/it] {'loss': 0.13, 'grad_norm': 1.316009182413062, 'learning_rate': 3.077257152647486e-06, 'epoch': 0.64} 64%|██████▎ | 21831/34278 [24:00:23<11:18:14, 3.27s/it] 64%|██████▎ | 21832/34278 [24:00:27<11:38:15, 3.37s/it] {'loss': 0.0948, 'grad_norm': 0.9490288674966368, 'learning_rate': 3.0768210550124757e-06, 'epoch': 0.64} 64%|██████▎ | 21832/34278 [24:00:27<11:38:15, 3.37s/it] 64%|██████▎ | 21833/34278 [24:00:33<14:12:53, 4.11s/it] {'loss': 0.1195, 'grad_norm': 0.6827399227485108, 'learning_rate': 3.0763849745471475e-06, 'epoch': 0.64} 64%|██████▎ | 21833/34278 [24:00:33<14:12:53, 4.11s/it] 64%|██████▎ | 21834/34278 [24:00:36<13:53:04, 4.02s/it] {'loss': 0.1251, 'grad_norm': 1.268438017932574, 'learning_rate': 3.075948911255396e-06, 'epoch': 0.64} 64%|██████▎ | 21834/34278 [24:00:36<13:53:04, 4.02s/it] 64%|██████▎ | 21835/34278 [24:00:40<13:36:57, 3.94s/it] {'loss': 0.1025, 'grad_norm': 1.5663374850285545, 'learning_rate': 3.0755128651411115e-06, 'epoch': 0.64} 64%|██████▎ | 21835/34278 [24:00:40<13:36:57, 3.94s/it] 64%|██████▎ | 21836/34278 [24:00:44<13:38:28, 3.95s/it] {'loss': 0.1092, 'grad_norm': 1.061104852661182, 'learning_rate': 3.0750768362081895e-06, 'epoch': 0.64} 64%|██████▎ | 21836/34278 [24:00:44<13:38:28, 3.95s/it] 64%|██████▎ | 21837/34278 [24:00:47<12:54:23, 3.73s/it] {'loss': 0.1347, 'grad_norm': 0.7752633644749488, 'learning_rate': 3.074640824460522e-06, 'epoch': 0.64} 64%|██████▎ | 21837/34278 [24:00:47<12:54:23, 3.73s/it] 64%|██████▎ | 21838/34278 [24:00:51<12:25:32, 3.60s/it] {'loss': 0.1365, 'grad_norm': 1.2016071339398173, 'learning_rate': 3.074204829902001e-06, 'epoch': 0.64} 64%|██████▎ | 21838/34278 [24:00:51<12:25:32, 3.60s/it] 64%|██████▎ | 21839/34278 [24:00:55<13:06:35, 3.79s/it] 
{'loss': 0.1328, 'grad_norm': 1.3821317503312849, 'learning_rate': 3.073768852536522e-06, 'epoch': 0.64} 64%|██████▎ | 21839/34278 [24:00:55<13:06:35, 3.79s/it] 64%|██████▎ | 21840/34278 [24:01:01<15:25:56, 4.47s/it] {'loss': 0.1144, 'grad_norm': 0.9094382064689873, 'learning_rate': 3.073332892367973e-06, 'epoch': 0.64} 64%|██████▎ | 21840/34278 [24:01:01<15:25:56, 4.47s/it] 64%|██████▎ | 21841/34278 [24:01:04<13:46:49, 3.99s/it] {'loss': 0.1253, 'grad_norm': 0.66381617402986, 'learning_rate': 3.072896949400247e-06, 'epoch': 0.64} 64%|██████▎ | 21841/34278 [24:01:04<13:46:49, 3.99s/it] 64%|██████▎ | 21842/34278 [24:01:07<12:55:41, 3.74s/it] {'loss': 0.1125, 'grad_norm': 0.9821706677181078, 'learning_rate': 3.0724610236372377e-06, 'epoch': 0.64} 64%|██████▎ | 21842/34278 [24:01:07<12:55:41, 3.74s/it] 64%|██████▎ | 21843/34278 [24:01:13<15:14:58, 4.41s/it] {'loss': 0.0997, 'grad_norm': 1.0959836495574657, 'learning_rate': 3.072025115082838e-06, 'epoch': 0.64} 64%|██████▎ | 21843/34278 [24:01:13<15:14:58, 4.41s/it] 64%|██████▎ | 21844/34278 [24:01:16<14:18:38, 4.14s/it] {'loss': 0.1119, 'grad_norm': 1.1337204186278373, 'learning_rate': 3.071589223740936e-06, 'epoch': 0.64} 64%|██████▎ | 21844/34278 [24:01:17<14:18:38, 4.14s/it] 64%|██████▎ | 21845/34278 [24:01:19<13:06:01, 3.79s/it] {'loss': 0.1149, 'grad_norm': 0.6897983155072406, 'learning_rate': 3.0711533496154258e-06, 'epoch': 0.64} 64%|██████▎ | 21845/34278 [24:01:19<13:06:01, 3.79s/it] 64%|██████▎ | 21846/34278 [24:01:23<12:24:24, 3.59s/it] {'loss': 0.1232, 'grad_norm': 0.9365897347237178, 'learning_rate': 3.070717492710199e-06, 'epoch': 0.64} 64%|██████▎ | 21846/34278 [24:01:23<12:24:24, 3.59s/it] 64%|██████▎ | 21847/34278 [24:01:28<14:47:03, 4.28s/it] {'loss': 0.1391, 'grad_norm': 0.9274447875877025, 'learning_rate': 3.0702816530291425e-06, 'epoch': 0.64} 64%|██████▎ | 21847/34278 [24:01:28<14:47:03, 4.28s/it] 64%|██████▎ | 21848/34278 [24:01:32<13:51:49, 4.02s/it] {'loss': 0.1493, 'grad_norm': 
1.0215428332432723, 'learning_rate': 3.0698458305761538e-06, 'epoch': 0.64} 64%|██████▎ | 21848/34278 [24:01:32<13:51:49, 4.02s/it] 64%|██████▎ | 21849/34278 [24:01:35<13:18:39, 3.86s/it] {'loss': 0.1112, 'grad_norm': 0.9081846317617027, 'learning_rate': 3.0694100253551195e-06, 'epoch': 0.64} 64%|██████▎ | 21849/34278 [24:01:35<13:18:39, 3.86s/it] 64%|██████▎ | 21850/34278 [24:01:38<12:32:46, 3.63s/it] {'loss': 0.1116, 'grad_norm': 0.7751254122582037, 'learning_rate': 3.068974237369932e-06, 'epoch': 0.64} 64%|██████▎ | 21850/34278 [24:01:38<12:32:46, 3.63s/it] 64%|██████▎ | 21851/34278 [24:01:43<13:36:38, 3.94s/it] {'loss': 0.1376, 'grad_norm': 0.8434729375194193, 'learning_rate': 3.068538466624482e-06, 'epoch': 0.64} 64%|██████▎ | 21851/34278 [24:01:43<13:36:38, 3.94s/it] 64%|██████▎ | 21852/34278 [24:01:46<12:14:12, 3.55s/it] {'loss': 0.1707, 'grad_norm': 0.9442999885080988, 'learning_rate': 3.068102713122659e-06, 'epoch': 0.64} 64%|██████▎ | 21852/34278 [24:01:46<12:14:12, 3.55s/it] 64%|██████▍ | 21853/34278 [24:01:49<11:37:05, 3.37s/it] {'loss': 0.133, 'grad_norm': 0.9222775526189096, 'learning_rate': 3.067666976868353e-06, 'epoch': 0.64} 64%|██████▍ | 21853/34278 [24:01:49<11:37:05, 3.37s/it] 64%|██████▍ | 21854/34278 [24:01:55<14:15:58, 4.13s/it] {'loss': 0.1364, 'grad_norm': 0.7792483602620585, 'learning_rate': 3.067231257865456e-06, 'epoch': 0.64} 64%|██████▍ | 21854/34278 [24:01:55<14:15:58, 4.13s/it] 64%|██████▍ | 21855/34278 [24:01:58<13:10:38, 3.82s/it] {'loss': 0.1257, 'grad_norm': 0.8573625377828165, 'learning_rate': 3.0667955561178566e-06, 'epoch': 0.64} 64%|██████▍ | 21855/34278 [24:01:58<13:10:38, 3.82s/it] 64%|██████▍ | 21856/34278 [24:02:01<13:02:58, 3.78s/it] {'loss': 0.1099, 'grad_norm': 0.7886138594932548, 'learning_rate': 3.066359871629446e-06, 'epoch': 0.64} 64%|██████▍ | 21856/34278 [24:02:01<13:02:58, 3.78s/it] 64%|██████▍ | 21857/34278 [24:02:05<12:38:26, 3.66s/it] {'loss': 0.1359, 'grad_norm': 0.9117843252800424, 'learning_rate': 
3.0659242044041117e-06, 'epoch': 0.64} 64%|██████▍ | 21857/34278 [24:02:05<12:38:26, 3.66s/it] 64%|██████▍ | 21858/34278 [24:02:08<12:32:13, 3.63s/it] {'loss': 0.1193, 'grad_norm': 0.7730494799951317, 'learning_rate': 3.0654885544457446e-06, 'epoch': 0.64} 64%|██████▍ | 21858/34278 [24:02:08<12:32:13, 3.63s/it] 64%|██████▍ | 21859/34278 [24:02:12<12:45:50, 3.70s/it] {'loss': 0.1319, 'grad_norm': 0.859282447127082, 'learning_rate': 3.0650529217582333e-06, 'epoch': 0.64} 64%|██████▍ | 21859/34278 [24:02:12<12:45:50, 3.70s/it] 64%|██████▍ | 21860/34278 [24:02:16<12:25:36, 3.60s/it] {'loss': 0.1279, 'grad_norm': 1.0796400353288, 'learning_rate': 3.0646173063454676e-06, 'epoch': 0.64} 64%|██████▍ | 21860/34278 [24:02:16<12:25:36, 3.60s/it] 64%|██████▍ | 21861/34278 [24:02:19<11:59:11, 3.48s/it] {'loss': 0.119, 'grad_norm': 0.9053880327450825, 'learning_rate': 3.0641817082113385e-06, 'epoch': 0.64} 64%|██████▍ | 21861/34278 [24:02:19<11:59:11, 3.48s/it] 64%|██████▍ | 21862/34278 [24:02:22<11:16:02, 3.27s/it] {'loss': 0.1181, 'grad_norm': 0.727253148668691, 'learning_rate': 3.0637461273597312e-06, 'epoch': 0.64} 64%|██████▍ | 21862/34278 [24:02:22<11:16:02, 3.27s/it] 64%|██████▍ | 21863/34278 [24:02:25<11:19:20, 3.28s/it] {'loss': 0.1216, 'grad_norm': 1.0231326515798984, 'learning_rate': 3.063310563794537e-06, 'epoch': 0.64} 64%|██████▍ | 21863/34278 [24:02:25<11:19:20, 3.28s/it] 64%|██████▍ | 21864/34278 [24:02:28<11:35:40, 3.36s/it] {'loss': 0.1049, 'grad_norm': 0.8431087631210024, 'learning_rate': 3.062875017519645e-06, 'epoch': 0.64} 64%|██████▍ | 21864/34278 [24:02:28<11:35:40, 3.36s/it] 64%|██████▍ | 21865/34278 [24:02:32<11:25:08, 3.31s/it] {'loss': 0.1382, 'grad_norm': 0.7231253542040589, 'learning_rate': 3.0624394885389397e-06, 'epoch': 0.64} 64%|██████▍ | 21865/34278 [24:02:32<11:25:08, 3.31s/it] 64%|██████▍ | 21866/34278 [24:02:35<11:12:44, 3.25s/it] {'loss': 0.1327, 'grad_norm': 0.8593676211645107, 'learning_rate': 3.062003976856313e-06, 'epoch': 0.64} 
64%|██████▍ | 21866/34278 [24:02:35<11:12:44, 3.25s/it]
64%|██████▍ | 21867/34278 [24:02:40<13:16:08, 3.85s/it] {'loss': 0.1246, 'grad_norm': 0.7948219579548143, 'learning_rate': 3.0615684824756525e-06, 'epoch': 0.64}
64%|██████▍ | 21868/34278 [24:02:46<15:35:10, 4.52s/it] {'loss': 0.1372, 'grad_norm': 0.7230630063087355, 'learning_rate': 3.061133005400846e-06, 'epoch': 0.64}
64%|██████▍ | 21869/34278 [24:02:52<17:04:26, 4.95s/it] {'loss': 0.1446, 'grad_norm': 0.8481834099556024, 'learning_rate': 3.0606975456357817e-06, 'epoch': 0.64}
64%|██████▍ | 21870/34278 [24:02:58<18:02:04, 5.23s/it] {'loss': 0.1242, 'grad_norm': 0.8388905248271551, 'learning_rate': 3.060262103184346e-06, 'epoch': 0.64}
64%|██████▍ | 21871/34278 [24:03:04<18:57:35, 5.50s/it] {'loss': 0.129, 'grad_norm': 0.8039098227458443, 'learning_rate': 3.0598266780504267e-06, 'epoch': 0.64}
64%|██████▍ | 21872/34278 [24:03:08<17:33:44, 5.10s/it] {'loss': 0.1023, 'grad_norm': 0.6854274554351321, 'learning_rate': 3.059391270237912e-06, 'epoch': 0.64}
64%|██████▍ | 21873/34278 [24:03:14<18:29:15, 5.37s/it] {'loss': 0.1288, 'grad_norm': 0.8483208695899778, 'learning_rate': 3.05895587975069e-06, 'epoch': 0.64}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff307ed8f40>
Failed to fetch sample 3412265. Exception: cannot identify image file <_io.BytesIO object at 0x7ff307ed8f40>
64%|██████▍ | 21874/34278 [24:03:18<16:32:02, 4.80s/it] {'loss': 0.1131, 'grad_norm': 0.8825059011319244, 'learning_rate': 3.0585205065926453e-06, 'epoch': 0.64}
64%|██████▍ | 21875/34278 [24:03:24<18:12:39, 5.29s/it] {'loss': 0.101, 'grad_norm': 0.704871957973163, 'learning_rate': 3.058085150767667e-06, 'epoch': 0.64}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
64%|██████▍ | 21876/34278 [24:03:28<16:44:37, 4.86s/it] {'loss': 0.1156, 'grad_norm': 0.7805779065666156, 'learning_rate': 3.0576498122796403e-06, 'epoch': 0.64}
64%|██████▍ | 21877/34278 [24:03:33<16:48:43, 4.88s/it] {'loss': 0.1326, 'grad_norm': 0.9444300393589251, 'learning_rate': 3.057214491132451e-06, 'epoch': 0.64}
64%|██████▍ | 21878/34278 [24:03:36<15:08:50, 4.40s/it] {'loss': 0.1374, 'grad_norm': 0.9479615537361952, 'learning_rate': 3.056779187329989e-06, 'epoch': 0.64}
64%|██████▍ | 21879/34278 [24:03:40<14:16:13, 4.14s/it] {'loss': 0.1115, 'grad_norm': 0.8509617843592863, 'learning_rate': 3.0563439008761377e-06, 'epoch': 0.64}
64%|██████▍ | 21880/34278 [24:03:44<13:58:57, 4.06s/it] {'loss': 0.1287, 'grad_norm': 0.7367877561446459, 'learning_rate': 3.055908631774784e-06, 'epoch': 0.64}
64%|██████▍ | 21881/34278 [24:03:48<13:53:07, 4.03s/it] {'loss': 0.1279, 'grad_norm': 0.8504320472884936, 'learning_rate': 3.0554733800298154e-06, 'epoch': 0.64}
64%|██████▍ | 21882/34278 [24:03:53<15:48:59, 4.59s/it] {'loss': 0.1256, 'grad_norm': 0.9429986359842464, 'learning_rate': 3.0550381456451144e-06, 'epoch': 0.64}
64%|██████▍ | 21883/34278 [24:03:57<14:20:29, 4.17s/it] {'loss': 0.1217, 'grad_norm': 1.0032892736514707, 'learning_rate': 3.054602928624568e-06, 'epoch': 0.64}
64%|██████▍ | 21884/34278 [24:04:00<13:21:15, 3.88s/it] {'loss': 0.1269, 'grad_norm': 0.9133680212190357, 'learning_rate': 3.0541677289720632e-06, 'epoch': 0.64}
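The `PIL.UnidentifiedImageError` tracebacks in this log show `pil_loader` handing `Image.open` a byte stream that is not a decodable image; training survives because `__getitem__` logs `Failed to fetch sample …` and substitutes another sample. A minimal sketch of that skip-and-resample pattern; the names `safe_pil_loader` / `SkipCorruptDataset` and the random-resample policy are illustrative assumptions, not the actual `dataset.py` code:

```python
import io
import random

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(raw_bytes: bytes) -> Image.Image:
    """Decode image bytes; raises UnidentifiedImageError on corrupt data."""
    buff = io.BytesIO(raw_bytes)
    img = Image.open(buff)
    img.load()  # force full decode now, so truncated files fail here
    return img.convert("RGB")


class SkipCorruptDataset:
    """Wraps a list of raw image byte strings; on a corrupt sample,
    logs the failure and retries with a randomly chosen replacement."""

    def __init__(self, samples, max_retries: int = 50):
        self.samples = samples
        self.max_retries = max_retries

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return safe_pil_loader(self.samples[i])
            except (UnidentifiedImageError, OSError) as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randrange(len(self.samples))  # resample an index
        raise RuntimeError("too many corrupt samples in a row")
```

Forcing the decode with `img.load()` inside the loader matters: `Image.open` is lazy, so without it a truncated file can pass the loader and crash later inside the collate step, where it is much harder to skip.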
64%|██████▍ | 21885/34278 [24:04:05<14:15:07, 4.14s/it] {'loss': 0.1046, 'grad_norm': 0.9797065792975945, 'learning_rate': 3.053732546691485e-06, 'epoch': 0.64} 64%|██████▍ | 21885/34278 [24:04:05<14:15:07, 4.14s/it] 64%|██████▍ | 21886/34278 [24:04:09<14:50:42, 4.31s/it] {'loss': 0.1342, 'grad_norm': 0.9860794712482452, 'learning_rate': 3.0532973817867185e-06, 'epoch': 0.64} 64%|██████▍ | 21886/34278 [24:04:09<14:50:42, 4.31s/it] 64%|██████▍ | 21887/34278 [24:04:13<13:49:24, 4.02s/it] {'loss': 0.1214, 'grad_norm': 1.0183469029431744, 'learning_rate': 3.0528622342616472e-06, 'epoch': 0.64} 64%|██████▍ | 21887/34278 [24:04:13<13:49:24, 4.02s/it] 64%|██████▍ | 21888/34278 [24:04:17<14:34:48, 4.24s/it] {'loss': 0.1191, 'grad_norm': 0.9259410328115509, 'learning_rate': 3.052427104120157e-06, 'epoch': 0.64} 64%|██████▍ | 21888/34278 [24:04:17<14:34:48, 4.24s/it] 64%|██████▍ | 21889/34278 [24:04:22<14:34:15, 4.23s/it] {'loss': 0.0981, 'grad_norm': 1.2581005181482376, 'learning_rate': 3.0519919913661317e-06, 'epoch': 0.64} 64%|██████▍ | 21889/34278 [24:04:22<14:34:15, 4.23s/it] 64%|██████▍ | 21890/34278 [24:04:27<16:14:42, 4.72s/it] {'loss': 0.1356, 'grad_norm': 1.561430521600238, 'learning_rate': 3.051556896003458e-06, 'epoch': 0.64} 64%|██████▍ | 21890/34278 [24:04:27<16:14:42, 4.72s/it] 64%|██████▍ | 21891/34278 [24:04:30<14:21:57, 4.18s/it] {'loss': 0.1138, 'grad_norm': 0.8987681531126557, 'learning_rate': 3.05112181803602e-06, 'epoch': 0.64} 64%|██████▍ | 21891/34278 [24:04:30<14:21:57, 4.18s/it] 64%|██████▍ | 21892/34278 [24:04:34<13:32:23, 3.94s/it] {'loss': 0.1122, 'grad_norm': 0.9575434341929723, 'learning_rate': 3.0506867574677007e-06, 'epoch': 0.64} 64%|██████▍ | 21892/34278 [24:04:34<13:32:23, 3.94s/it] 64%|██████▍ | 21893/34278 [24:04:37<13:08:15, 3.82s/it] {'loss': 0.1244, 'grad_norm': 0.9803084522916741, 'learning_rate': 3.0502517143023846e-06, 'epoch': 0.64} 64%|██████▍ | 21893/34278 [24:04:37<13:08:15, 3.82s/it] 64%|██████▍ | 21894/34278 
[24:04:42<14:07:35, 4.11s/it] {'loss': 0.1319, 'grad_norm': 1.1634107455634812, 'learning_rate': 3.049816688543956e-06, 'epoch': 0.64} 64%|██████▍ | 21894/34278 [24:04:42<14:07:35, 4.11s/it] 64%|██████▍ | 21895/34278 [24:04:45<12:48:42, 3.72s/it] {'loss': 0.1404, 'grad_norm': 1.0048455952281852, 'learning_rate': 3.0493816801962974e-06, 'epoch': 0.64} 64%|██████▍ | 21895/34278 [24:04:45<12:48:42, 3.72s/it] 64%|██████▍ | 21896/34278 [24:04:48<12:25:47, 3.61s/it] {'loss': 0.1034, 'grad_norm': 0.7597447083970655, 'learning_rate': 3.0489466892632934e-06, 'epoch': 0.64} 64%|██████▍ | 21896/34278 [24:04:48<12:25:47, 3.61s/it] 64%|██████▍ | 21897/34278 [24:04:54<14:47:52, 4.30s/it] {'loss': 0.1387, 'grad_norm': 0.9691091562542609, 'learning_rate': 3.0485117157488287e-06, 'epoch': 0.64} 64%|██████▍ | 21897/34278 [24:04:54<14:47:52, 4.30s/it] 64%|██████▍ | 21898/34278 [24:04:57<13:44:53, 4.00s/it] {'loss': 0.1211, 'grad_norm': 0.8071088564651172, 'learning_rate': 3.048076759656785e-06, 'epoch': 0.64} 64%|██████▍ | 21898/34278 [24:04:57<13:44:53, 4.00s/it] 64%|██████▍ | 21899/34278 [24:05:01<12:51:19, 3.74s/it] {'loss': 0.12, 'grad_norm': 0.6358678297759078, 'learning_rate': 3.0476418209910475e-06, 'epoch': 0.64} 64%|██████▍ | 21899/34278 [24:05:01<12:51:19, 3.74s/it] 64%|██████▍ | 21900/34278 [24:05:04<12:05:31, 3.52s/it] {'loss': 0.126, 'grad_norm': 0.7951613576571183, 'learning_rate': 3.047206899755496e-06, 'epoch': 0.64} 64%|██████▍ | 21900/34278 [24:05:04<12:05:31, 3.52s/it] 64%|██████▍ | 21901/34278 [24:05:07<11:52:47, 3.46s/it] {'loss': 0.1086, 'grad_norm': 1.0271889369572134, 'learning_rate': 3.046771995954015e-06, 'epoch': 0.64} 64%|██████▍ | 21901/34278 [24:05:07<11:52:47, 3.46s/it] 64%|██████▍ | 21902/34278 [24:05:10<11:49:07, 3.44s/it] {'loss': 0.1276, 'grad_norm': 0.9344279582335326, 'learning_rate': 3.046337109590488e-06, 'epoch': 0.64} 64%|██████▍ | 21902/34278 [24:05:10<11:49:07, 3.44s/it] 64%|██████▍ | 21903/34278 [24:05:16<14:21:33, 4.18s/it] {'loss': 
0.1029, 'grad_norm': 0.7668315638447574, 'learning_rate': 3.0459022406687977e-06, 'epoch': 0.64} 64%|██████▍ | 21903/34278 [24:05:16<14:21:33, 4.18s/it] 64%|██████▍ | 21904/34278 [24:05:22<16:07:19, 4.69s/it] {'loss': 0.1381, 'grad_norm': 0.8563832055480559, 'learning_rate': 3.045467389192824e-06, 'epoch': 0.64} 64%|██████▍ | 21904/34278 [24:05:22<16:07:19, 4.69s/it] 64%|██████▍ | 21905/34278 [24:05:25<14:34:09, 4.24s/it] {'loss': 0.1243, 'grad_norm': 1.3599042135465556, 'learning_rate': 3.0450325551664522e-06, 'epoch': 0.64} 64%|██████▍ | 21905/34278 [24:05:25<14:34:09, 4.24s/it] 64%|██████▍ | 21906/34278 [24:05:28<13:27:00, 3.91s/it] {'loss': 0.1308, 'grad_norm': 0.7885305964212832, 'learning_rate': 3.044597738593564e-06, 'epoch': 0.64} 64%|██████▍ | 21906/34278 [24:05:28<13:27:00, 3.91s/it] 64%|██████▍ | 21907/34278 [24:05:31<12:31:58, 3.65s/it] {'loss': 0.1, 'grad_norm': 0.6982586480666967, 'learning_rate': 3.044162939478037e-06, 'epoch': 0.64} 64%|██████▍ | 21907/34278 [24:05:31<12:31:58, 3.65s/it] 64%|██████▍ | 21908/34278 [24:05:35<12:03:51, 3.51s/it] {'loss': 0.1265, 'grad_norm': 0.8285675420662989, 'learning_rate': 3.0437281578237587e-06, 'epoch': 0.64} 64%|██████▍ | 21908/34278 [24:05:35<12:03:51, 3.51s/it] 64%|██████▍ | 21909/34278 [24:05:41<14:49:19, 4.31s/it] {'loss': 0.1257, 'grad_norm': 0.8283163405615398, 'learning_rate': 3.0432933936346083e-06, 'epoch': 0.64} 64%|██████▍ | 21909/34278 [24:05:41<14:49:19, 4.31s/it] 64%|██████▍ | 21910/34278 [24:05:44<13:45:13, 4.00s/it] {'loss': 0.1107, 'grad_norm': 0.9649752723675649, 'learning_rate': 3.042858646914467e-06, 'epoch': 0.64} 64%|██████▍ | 21910/34278 [24:05:44<13:45:13, 4.00s/it] 64%|██████▍ | 21911/34278 [24:05:48<13:42:25, 3.99s/it] {'loss': 0.0975, 'grad_norm': 0.9493165956215821, 'learning_rate': 3.0424239176672177e-06, 'epoch': 0.64} 64%|██████▍ | 21911/34278 [24:05:48<13:42:25, 3.99s/it] 64%|██████▍ | 21912/34278 [24:05:53<14:20:04, 4.17s/it] {'loss': 0.1373, 'grad_norm': 1.0680619133120668, 
'learning_rate': 3.0419892058967393e-06, 'epoch': 0.64} 64%|██████▍ | 21912/34278 [24:05:53<14:20:04, 4.17s/it] 64%|██████▍ | 21913/34278 [24:05:58<15:06:56, 4.40s/it] {'loss': 0.109, 'grad_norm': 1.3516981134621209, 'learning_rate': 3.0415545116069127e-06, 'epoch': 0.64} 64%|██████▍ | 21913/34278 [24:05:58<15:06:56, 4.40s/it] 64%|██████▍ | 21914/34278 [24:06:01<13:51:12, 4.03s/it] {'loss': 0.1146, 'grad_norm': 1.0398911799811374, 'learning_rate': 3.041119834801621e-06, 'epoch': 0.64} 64%|██████▍ | 21914/34278 [24:06:01<13:51:12, 4.03s/it] 64%|██████▍ | 21915/34278 [24:06:04<13:15:17, 3.86s/it] {'loss': 0.1078, 'grad_norm': 0.7618778724016294, 'learning_rate': 3.040685175484744e-06, 'epoch': 0.64} 64%|██████▍ | 21915/34278 [24:06:04<13:15:17, 3.86s/it] 64%|██████▍ | 21916/34278 [24:06:08<13:13:54, 3.85s/it] {'loss': 0.14, 'grad_norm': 0.8823225671438082, 'learning_rate': 3.040250533660163e-06, 'epoch': 0.64} 64%|██████▍ | 21916/34278 [24:06:08<13:13:54, 3.85s/it] 64%|██████▍ | 21917/34278 [24:06:12<13:38:06, 3.97s/it] {'loss': 0.1263, 'grad_norm': 0.9549327538169706, 'learning_rate': 3.039815909331756e-06, 'epoch': 0.64} 64%|██████▍ | 21917/34278 [24:06:12<13:38:06, 3.97s/it] 64%|██████▍ | 21918/34278 [24:06:15<12:25:58, 3.62s/it] {'loss': 0.1138, 'grad_norm': 0.7160000367478118, 'learning_rate': 3.0393813025034046e-06, 'epoch': 0.64} 64%|██████▍ | 21918/34278 [24:06:15<12:25:58, 3.62s/it] 64%|██████▍ | 21919/34278 [24:06:19<12:29:03, 3.64s/it] {'loss': 0.1198, 'grad_norm': 0.8889924544531044, 'learning_rate': 3.0389467131789884e-06, 'epoch': 0.64} 64%|██████▍ | 21919/34278 [24:06:19<12:29:03, 3.64s/it] 64%|██████▍ | 21920/34278 [24:06:22<12:07:40, 3.53s/it] {'loss': 0.1263, 'grad_norm': 0.7604025105980199, 'learning_rate': 3.0385121413623888e-06, 'epoch': 0.64} 64%|██████▍ | 21920/34278 [24:06:22<12:07:40, 3.53s/it] 64%|██████▍ | 21921/34278 [24:06:25<11:37:51, 3.39s/it] {'loss': 0.1126, 'grad_norm': 0.8818718817076348, 'learning_rate': 3.038077587057485e-06, 
'epoch': 0.64} 64%|██████▍ | 21921/34278 [24:06:25<11:37:51, 3.39s/it]
64%|██████▍ | 21922/34278 [24:06:28<11:35:01, 3.38s/it] {'loss': 0.1251, 'grad_norm': 0.8709218495331252, 'learning_rate': 3.0376430502681554e-06, 'epoch': 0.64}
64%|██████▍ | 21923/34278 [24:06:31<10:57:52, 3.19s/it] {'loss': 0.1317, 'grad_norm': 0.7977261760500468, 'learning_rate': 3.03720853099828e-06, 'epoch': 0.64}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f229378d1c0>
Failed to fetch sample 3353705. Exception: cannot identify image file <_io.BytesIO object at 0x7f229378d1c0>
64%|██████▍ | 21924/34278 [24:06:35<11:06:21, 3.24s/it] {'loss': 0.1247, 'grad_norm': 0.9860642992008797, 'learning_rate': 3.03677402925174e-06, 'epoch': 0.64}
64%|██████▍ | 21925/34278 [24:06:38<11:36:59, 3.39s/it] {'loss': 0.1148, 'grad_norm': 0.7759127192684441, 'learning_rate': 3.0363395450324103e-06, 'epoch': 0.64}
64%|██████▍ | 21926/34278 [24:06:43<13:02:59, 3.80s/it] {'loss': 0.1183, 'grad_norm': 0.634868363926555, 'learning_rate': 3.0359050783441736e-06, 'epoch': 0.64}
64%|██████▍ | 21927/34278 [24:06:47<13:07:56, 3.83s/it] {'loss': 0.124, 'grad_norm': 0.7773985173968639, 'learning_rate': 3.035470629190907e-06, 'epoch': 0.64}
64%|██████▍ | 21928/34278 [24:06:50<12:28:18, 3.64s/it] {'loss': 0.127, 'grad_norm': 0.8659888848175732, 'learning_rate': 3.0350361975764907e-06, 'epoch': 0.64}
64%|██████▍ | 21929/34278 [24:06:56<14:58:22, 4.36s/it] {'loss': 0.1269, 'grad_norm': 0.9017771768607322, 'learning_rate': 3.0346017835048015e-06, 'epoch': 0.64}
64%|██████▍ | 21930/34278 [24:07:00<14:18:10, 4.17s/it] {'loss': 0.111, 'grad_norm': 0.7096728122358629, 'learning_rate': 3.0341673869797183e-06, 'epoch': 0.64}
64%|██████▍ | 21931/34278 [24:07:04<13:48:11, 4.02s/it] {'loss': 0.1141, 'grad_norm': 0.8243088859971814, 'learning_rate': 3.0337330080051188e-06, 'epoch': 0.64}
64%|██████▍ | 21932/34278 [24:07:07<13:07:55, 3.83s/it] {'loss': 0.1346, 'grad_norm': 1.0075431142357507, 'learning_rate': 3.0332986465848824e-06, 'epoch': 0.64}
64%|██████▍ |
21932/34278 [24:07:07<13:07:55, 3.83s/it] 64%|██████▍ | 21933/34278 [24:07:10<12:24:11, 3.62s/it] {'loss': 0.1114, 'grad_norm': 0.9271585951905548, 'learning_rate': 3.0328643027228864e-06, 'epoch': 0.64} 64%|██████▍ | 21933/34278 [24:07:10<12:24:11, 3.62s/it] 64%|██████▍ | 21934/34278 [24:07:13<12:03:33, 3.52s/it] {'loss': 0.1403, 'grad_norm': 0.8825953290479391, 'learning_rate': 3.032429976423008e-06, 'epoch': 0.64} 64%|██████▍ | 21934/34278 [24:07:13<12:03:33, 3.52s/it] 64%|██████▍ | 21935/34278 [24:07:17<12:29:47, 3.64s/it] {'loss': 0.1308, 'grad_norm': 0.6456542010582347, 'learning_rate': 3.0319956676891253e-06, 'epoch': 0.64} 64%|██████▍ | 21935/34278 [24:07:17<12:29:47, 3.64s/it] 64%|██████▍ | 21936/34278 [24:07:21<12:13:19, 3.57s/it] {'loss': 0.1298, 'grad_norm': 0.9258455589709558, 'learning_rate': 3.0315613765251164e-06, 'epoch': 0.64} 64%|██████▍ | 21936/34278 [24:07:21<12:13:19, 3.57s/it] 64%|██████▍ | 21937/34278 [24:07:24<11:42:54, 3.42s/it] {'loss': 0.1211, 'grad_norm': 1.014659021103718, 'learning_rate': 3.0311271029348545e-06, 'epoch': 0.64} 64%|██████▍ | 21937/34278 [24:07:24<11:42:54, 3.42s/it] 64%|██████▍ | 21938/34278 [24:07:27<11:35:46, 3.38s/it] {'loss': 0.1402, 'grad_norm': 1.040044223233855, 'learning_rate': 3.0306928469222225e-06, 'epoch': 0.64} 64%|██████▍ | 21938/34278 [24:07:27<11:35:46, 3.38s/it] 64%|██████▍ | 21939/34278 [24:07:30<11:27:50, 3.34s/it] {'loss': 0.1388, 'grad_norm': 1.1177879221621851, 'learning_rate': 3.0302586084910934e-06, 'epoch': 0.64} 64%|██████▍ | 21939/34278 [24:07:30<11:27:50, 3.34s/it] 64%|██████▍ | 21940/34278 [24:07:35<12:40:44, 3.70s/it] {'loss': 0.1178, 'grad_norm': 0.9252474808014111, 'learning_rate': 3.0298243876453458e-06, 'epoch': 0.64} 64%|██████▍ | 21940/34278 [24:07:35<12:40:44, 3.70s/it] 64%|██████▍ | 21941/34278 [24:07:38<11:51:05, 3.46s/it] {'loss': 0.1315, 'grad_norm': 1.1589570442812709, 'learning_rate': 3.0293901843888573e-06, 'epoch': 0.64} 64%|██████▍ | 21941/34278 [24:07:38<11:51:05, 
3.46s/it] 64%|██████▍ | 21942/34278 [24:07:41<11:53:38, 3.47s/it] {'loss': 0.1342, 'grad_norm': 1.0330798901038698, 'learning_rate': 3.0289559987255015e-06, 'epoch': 0.64} 64%|██████▍ | 21942/34278 [24:07:41<11:53:38, 3.47s/it] 64%|██████▍ | 21943/34278 [24:07:45<12:13:38, 3.57s/it] {'loss': 0.1269, 'grad_norm': 0.83232699843636, 'learning_rate': 3.028521830659154e-06, 'epoch': 0.64} 64%|██████▍ | 21943/34278 [24:07:45<12:13:38, 3.57s/it] 64%|██████▍ | 21944/34278 [24:07:49<12:55:09, 3.77s/it] {'loss': 0.1193, 'grad_norm': 1.0162619330003844, 'learning_rate': 3.028087680193695e-06, 'epoch': 0.64} 64%|██████▍ | 21944/34278 [24:07:49<12:55:09, 3.77s/it] 64%|██████▍ | 21945/34278 [24:07:52<12:13:46, 3.57s/it] {'loss': 0.1513, 'grad_norm': 0.9821735402285827, 'learning_rate': 3.0276535473329983e-06, 'epoch': 0.64} 64%|██████▍ | 21945/34278 [24:07:52<12:13:46, 3.57s/it] 64%|██████▍ | 21946/34278 [24:07:55<11:34:58, 3.38s/it] {'loss': 0.1204, 'grad_norm': 0.7782382406666812, 'learning_rate': 3.02721943208094e-06, 'epoch': 0.64} 64%|██████▍ | 21946/34278 [24:07:55<11:34:58, 3.38s/it] 64%|██████▍ | 21947/34278 [24:08:00<12:51:38, 3.75s/it] {'loss': 0.1221, 'grad_norm': 0.942174926867864, 'learning_rate': 3.0267853344413956e-06, 'epoch': 0.64} 64%|██████▍ | 21947/34278 [24:08:00<12:51:38, 3.75s/it] 64%|██████▍ | 21948/34278 [24:08:04<13:21:57, 3.90s/it] {'loss': 0.121, 'grad_norm': 1.2193659694124632, 'learning_rate': 3.0263512544182407e-06, 'epoch': 0.64} 64%|██████▍ | 21948/34278 [24:08:04<13:21:57, 3.90s/it] 64%|██████▍ | 21949/34278 [24:08:07<12:22:37, 3.61s/it] {'loss': 0.1257, 'grad_norm': 1.1613143866439641, 'learning_rate': 3.025917192015349e-06, 'epoch': 0.64} 64%|██████▍ | 21949/34278 [24:08:07<12:22:37, 3.61s/it] 64%|██████▍ | 21950/34278 [24:08:11<12:25:29, 3.63s/it] {'loss': 0.1395, 'grad_norm': 1.198356382509822, 'learning_rate': 3.025483147236599e-06, 'epoch': 0.64} 64%|██████▍ | 21950/34278 [24:08:11<12:25:29, 3.63s/it] 64%|██████▍ | 21951/34278 
[24:08:14<12:22:12, 3.61s/it] {'loss': 0.1222, 'grad_norm': 0.8952116743310828, 'learning_rate': 3.0250491200858643e-06, 'epoch': 0.64} 64%|██████▍ | 21951/34278 [24:08:14<12:22:12, 3.61s/it] 64%|██████▍ | 21952/34278 [24:08:18<12:09:11, 3.55s/it] {'loss': 0.136, 'grad_norm': 1.2709399242472916, 'learning_rate': 3.0246151105670197e-06, 'epoch': 0.64} 64%|██████▍ | 21952/34278 [24:08:18<12:09:11, 3.55s/it] 64%|██████▍ | 21953/34278 [24:08:22<12:32:35, 3.66s/it] {'loss': 0.1505, 'grad_norm': 0.9343221816026714, 'learning_rate': 3.0241811186839394e-06, 'epoch': 0.64} 64%|██████▍ | 21953/34278 [24:08:22<12:32:35, 3.66s/it] 64%|██████▍ | 21954/34278 [24:08:26<12:45:08, 3.73s/it] {'loss': 0.1052, 'grad_norm': 0.7738212314549767, 'learning_rate': 3.0237471444404993e-06, 'epoch': 0.64} 64%|██████▍ | 21954/34278 [24:08:26<12:45:08, 3.73s/it] 64%|██████▍ | 21955/34278 [24:08:29<11:57:52, 3.50s/it] {'loss': 0.1326, 'grad_norm': 1.096858823539028, 'learning_rate': 3.023313187840571e-06, 'epoch': 0.64} 64%|██████▍ | 21955/34278 [24:08:29<11:57:52, 3.50s/it] 64%|██████▍ | 21956/34278 [24:08:35<14:37:29, 4.27s/it] {'loss': 0.1104, 'grad_norm': 1.0254494717705362, 'learning_rate': 3.0228792488880315e-06, 'epoch': 0.64} 64%|██████▍ | 21956/34278 [24:08:35<14:37:29, 4.27s/it] 64%|██████▍ | 21957/34278 [24:08:40<15:13:43, 4.45s/it] {'loss': 0.1379, 'grad_norm': 0.7765213523737811, 'learning_rate': 3.0224453275867544e-06, 'epoch': 0.64} 64%|██████▍ | 21957/34278 [24:08:40<15:13:43, 4.45s/it] 64%|██████▍ | 21958/34278 [24:08:44<15:26:49, 4.51s/it] {'loss': 0.1173, 'grad_norm': 1.0059879994998377, 'learning_rate': 3.022011423940614e-06, 'epoch': 0.64} 64%|██████▍ | 21958/34278 [24:08:44<15:26:49, 4.51s/it] 64%|██████▍ | 21959/34278 [24:08:49<15:49:55, 4.63s/it] {'loss': 0.1253, 'grad_norm': 0.914641722131277, 'learning_rate': 3.0215775379534827e-06, 'epoch': 0.64} 64%|██████▍ | 21959/34278 [24:08:49<15:49:55, 4.63s/it] 64%|██████▍ | 21960/34278 [24:08:52<14:24:02, 4.21s/it] {'loss': 
0.1441, 'grad_norm': 0.9022035160712973, 'learning_rate': 3.0211436696292346e-06, 'epoch': 0.64} 64%|██████▍ | 21960/34278 [24:08:52<14:24:02, 4.21s/it] 64%|██████▍ | 21961/34278 [24:08:56<13:26:45, 3.93s/it] {'loss': 0.1271, 'grad_norm': 0.8967770234631273, 'learning_rate': 3.020709818971743e-06, 'epoch': 0.64} 64%|██████▍ | 21961/34278 [24:08:56<13:26:45, 3.93s/it] 64%|██████▍ | 21962/34278 [24:08:59<13:19:23, 3.89s/it] {'loss': 0.1061, 'grad_norm': 0.9269267973477245, 'learning_rate': 3.0202759859848818e-06, 'epoch': 0.64} 64%|██████▍ | 21962/34278 [24:08:59<13:19:23, 3.89s/it] 64%|██████▍ | 21963/34278 [24:09:04<14:11:21, 4.15s/it] {'loss': 0.1451, 'grad_norm': 0.8430561957964302, 'learning_rate': 3.0198421706725257e-06, 'epoch': 0.64} 64%|██████▍ | 21963/34278 [24:09:04<14:11:21, 4.15s/it] 64%|██████▍ | 21964/34278 [24:09:07<13:00:51, 3.80s/it] {'loss': 0.1137, 'grad_norm': 0.8483089362372715, 'learning_rate': 3.0194083730385443e-06, 'epoch': 0.64} 64%|██████▍ | 21964/34278 [24:09:07<13:00:51, 3.80s/it] 64%|██████▍ | 21965/34278 [24:09:11<13:21:07, 3.90s/it] {'loss': 0.1284, 'grad_norm': 0.7416063276970117, 'learning_rate': 3.0189745930868127e-06, 'epoch': 0.64} 64%|██████▍ | 21965/34278 [24:09:11<13:21:07, 3.90s/it] 64%|██████▍ | 21966/34278 [24:09:15<13:27:52, 3.94s/it] {'loss': 0.1348, 'grad_norm': 0.6940232346523145, 'learning_rate': 3.018540830821204e-06, 'epoch': 0.64} 64%|██████▍ | 21966/34278 [24:09:15<13:27:52, 3.94s/it] 64%|██████▍ | 21967/34278 [24:09:18<12:36:53, 3.69s/it] {'loss': 0.1409, 'grad_norm': 0.9563816919176291, 'learning_rate': 3.0181070862455862e-06, 'epoch': 0.64} 64%|██████▍ | 21967/34278 [24:09:18<12:36:53, 3.69s/it] 64%|██████▍ | 21968/34278 [24:09:22<12:37:39, 3.69s/it] {'loss': 0.1225, 'grad_norm': 0.8849799514833532, 'learning_rate': 3.0176733593638387e-06, 'epoch': 0.64} 64%|██████▍ | 21968/34278 [24:09:22<12:37:39, 3.69s/it] 64%|██████▍ | 21969/34278 [24:09:26<12:24:46, 3.63s/it] {'loss': 0.143, 'grad_norm': 0.7482586046657074, 
'learning_rate': 3.0172396501798295e-06, 'epoch': 0.64} 64%|██████▍ | 21969/34278 [24:09:26<12:24:46, 3.63s/it] 64%|██████▍ | 21970/34278 [24:09:28<11:35:46, 3.39s/it] {'loss': 0.1223, 'grad_norm': 0.7317275594846018, 'learning_rate': 3.0168059586974307e-06, 'epoch': 0.64} 64%|██████▍ | 21970/34278 [24:09:28<11:35:46, 3.39s/it] 64%|██████▍ | 21971/34278 [24:09:32<12:07:31, 3.55s/it] {'loss': 0.1132, 'grad_norm': 0.8917560050040593, 'learning_rate': 3.0163722849205163e-06, 'epoch': 0.64} 64%|██████▍ | 21971/34278 [24:09:32<12:07:31, 3.55s/it] 64%|██████▍ | 21972/34278 [24:09:36<12:14:41, 3.58s/it] {'loss': 0.1044, 'grad_norm': 0.6154021358023682, 'learning_rate': 3.0159386288529556e-06, 'epoch': 0.64} 64%|██████▍ | 21972/34278 [24:09:36<12:14:41, 3.58s/it] 64%|██████▍ | 21973/34278 [24:09:42<14:36:18, 4.27s/it] {'loss': 0.1223, 'grad_norm': 0.8310882793958907, 'learning_rate': 3.01550499049862e-06, 'epoch': 0.64} 64%|██████▍ | 21973/34278 [24:09:42<14:36:18, 4.27s/it] 64%|██████▍ | 21974/34278 [24:09:45<13:22:57, 3.92s/it] {'loss': 0.1274, 'grad_norm': 0.8564255076719426, 'learning_rate': 3.0150713698613833e-06, 'epoch': 0.64} 64%|██████▍ | 21974/34278 [24:09:45<13:22:57, 3.92s/it] 64%|██████▍ | 21975/34278 [24:09:48<12:21:13, 3.61s/it] {'loss': 0.112, 'grad_norm': 0.8576802317833004, 'learning_rate': 3.0146377669451154e-06, 'epoch': 0.64} 64%|██████▍ | 21975/34278 [24:09:48<12:21:13, 3.61s/it] 64%|██████▍ | 21976/34278 [24:09:54<14:52:39, 4.35s/it] {'loss': 0.124, 'grad_norm': 0.8924822883242373, 'learning_rate': 3.0142041817536883e-06, 'epoch': 0.64} 64%|██████▍ | 21976/34278 [24:09:54<14:52:39, 4.35s/it] 64%|██████▍ | 21977/34278 [24:09:57<13:31:31, 3.96s/it] {'loss': 0.1289, 'grad_norm': 0.8209841294782584, 'learning_rate': 3.0137706142909717e-06, 'epoch': 0.64} 64%|██████▍ | 21977/34278 [24:09:57<13:31:31, 3.96s/it] 64%|██████▍ | 21978/34278 [24:10:00<12:49:56, 3.76s/it] {'loss': 0.1327, 'grad_norm': 0.832261433301835, 'learning_rate': 3.0133370645608372e-06, 
'epoch': 0.64} 64%|██████▍ | 21978/34278 [24:10:00<12:49:56, 3.76s/it] 64%|██████▍ | 21979/34278 [24:10:06<15:10:28, 4.44s/it] {'loss': 0.1273, 'grad_norm': 0.9162160733691365, 'learning_rate': 3.0129035325671534e-06, 'epoch': 0.64} 64%|██████▍ | 21979/34278 [24:10:06<15:10:28, 4.44s/it] 64%|██████▍ | 21980/34278 [24:10:10<14:15:30, 4.17s/it] {'loss': 0.1108, 'grad_norm': 1.0494374263771356, 'learning_rate': 3.0124700183137938e-06, 'epoch': 0.64} 64%|██████▍ | 21980/34278 [24:10:10<14:15:30, 4.17s/it] 64%|██████▍ | 21981/34278 [24:10:15<15:39:43, 4.59s/it] {'loss': 0.1239, 'grad_norm': 0.8841927875348656, 'learning_rate': 3.0120365218046287e-06, 'epoch': 0.64} 64%|██████▍ | 21981/34278 [24:10:15<15:39:43, 4.59s/it] 64%|██████▍ | 21982/34278 [24:10:18<14:03:42, 4.12s/it] {'loss': 0.113, 'grad_norm': 0.8271655609304868, 'learning_rate': 3.0116030430435254e-06, 'epoch': 0.64} 64%|██████▍ | 21982/34278 [24:10:18<14:03:42, 4.12s/it] 64%|██████▍ | 21983/34278 [24:10:23<14:34:53, 4.27s/it] {'loss': 0.1194, 'grad_norm': 0.8725414580239986, 'learning_rate': 3.0111695820343557e-06, 'epoch': 0.64} 64%|██████▍ | 21983/34278 [24:10:23<14:34:53, 4.27s/it] 64%|██████▍ | 21984/34278 [24:10:26<13:32:34, 3.97s/it] {'loss': 0.1217, 'grad_norm': 0.878717120453251, 'learning_rate': 3.010736138780991e-06, 'epoch': 0.64} 64%|██████▍ | 21984/34278 [24:10:26<13:32:34, 3.97s/it] 64%|██████▍ | 21985/34278 [24:10:31<14:10:00, 4.15s/it] {'loss': 0.1266, 'grad_norm': 0.8967277763892505, 'learning_rate': 3.010302713287296e-06, 'epoch': 0.64} 64%|██████▍ | 21985/34278 [24:10:31<14:10:00, 4.15s/it] 64%|██████▍ | 21986/34278 [24:10:35<14:07:57, 4.14s/it] {'loss': 0.1211, 'grad_norm': 0.9566597113111973, 'learning_rate': 3.009869305557145e-06, 'epoch': 0.64} 64%|██████▍ | 21986/34278 [24:10:35<14:07:57, 4.14s/it] 64%|██████▍ | 21987/34278 [24:10:40<14:50:49, 4.35s/it] {'loss': 0.1171, 'grad_norm': 0.6656255080399474, 'learning_rate': 3.0094359155944053e-06, 'epoch': 0.64} 64%|██████▍ | 21987/34278 
[24:10:40<14:50:49, 4.35s/it] 64%|██████▍ | 21988/34278 [24:10:43<13:46:53, 4.04s/it] {'loss': 0.1176, 'grad_norm': 0.926675776362127, 'learning_rate': 3.009002543402948e-06, 'epoch': 0.64} 64%|██████▍ | 21988/34278 [24:10:43<13:46:53, 4.04s/it] 64%|██████▍ | 21989/34278 [24:10:46<13:00:04, 3.81s/it] {'loss': 0.1191, 'grad_norm': 0.9455297630585128, 'learning_rate': 3.0085691889866396e-06, 'epoch': 0.64} 64%|██████▍ | 21989/34278 [24:10:46<13:00:04, 3.81s/it] 64%|██████▍ | 21990/34278 [24:10:51<13:34:12, 3.98s/it] {'loss': 0.1352, 'grad_norm': 0.8037505377903367, 'learning_rate': 3.00813585234935e-06, 'epoch': 0.64} 64%|██████▍ | 21990/34278 [24:10:51<13:34:12, 3.98s/it] 64%|██████▍ | 21991/34278 [24:10:57<15:50:27, 4.64s/it] {'loss': 0.1368, 'grad_norm': 0.8411793116770895, 'learning_rate': 3.0077025334949465e-06, 'epoch': 0.64} 64%|██████▍ | 21991/34278 [24:10:57<15:50:27, 4.64s/it] 64%|██████▍ | 21992/34278 [24:11:00<14:23:00, 4.21s/it] {'loss': 0.1198, 'grad_norm': 0.9225695404193294, 'learning_rate': 3.007269232427301e-06, 'epoch': 0.64} 64%|██████▍ | 21992/34278 [24:11:00<14:23:00, 4.21s/it] 64%|██████▍ | 21993/34278 [24:11:04<13:29:07, 3.95s/it] {'loss': 0.1257, 'grad_norm': 0.9568972889835847, 'learning_rate': 3.0068359491502806e-06, 'epoch': 0.64} 64%|██████▍ | 21993/34278 [24:11:04<13:29:07, 3.95s/it] 64%|██████▍ | 21994/34278 [24:11:07<12:36:22, 3.69s/it] {'loss': 0.1071, 'grad_norm': 0.8408048220332507, 'learning_rate': 3.0064026836677527e-06, 'epoch': 0.64} 64%|██████▍ | 21994/34278 [24:11:07<12:36:22, 3.69s/it] 64%|██████▍ | 21995/34278 [24:11:10<11:50:48, 3.47s/it] {'loss': 0.1565, 'grad_norm': 1.0444506090823575, 'learning_rate': 3.005969435983585e-06, 'epoch': 0.64} 64%|██████▍ | 21995/34278 [24:11:10<11:50:48, 3.47s/it] 64%|██████▍ | 21996/34278 [24:11:13<11:17:27, 3.31s/it] {'loss': 0.1205, 'grad_norm': 0.8235011669755461, 'learning_rate': 3.005536206101648e-06, 'epoch': 0.64} 64%|██████▍ | 21996/34278 [24:11:13<11:17:27, 3.31s/it] 64%|██████▍ | 
21997/34278 [24:11:18<13:11:39, 3.87s/it] {'loss': 0.1314, 'grad_norm': 1.0100537331249817, 'learning_rate': 3.0051029940258035e-06, 'epoch': 0.64} 64%|██████▍ | 21997/34278 [24:11:18<13:11:39, 3.87s/it] 64%|██████▍ | 21998/34278 [24:11:21<12:55:11, 3.79s/it] {'loss': 0.1185, 'grad_norm': 0.7175819177898534, 'learning_rate': 3.004669799759927e-06, 'epoch': 0.64} 64%|██████▍ | 21998/34278 [24:11:21<12:55:11, 3.79s/it] 64%|██████▍ | 21999/34278 [24:11:24<11:52:21, 3.48s/it] {'loss': 0.1176, 'grad_norm': 0.6697634596920586, 'learning_rate': 3.004236623307881e-06, 'epoch': 0.64} 64%|██████▍ | 21999/34278 [24:11:24<11:52:21, 3.48s/it] 64%|██████▍ | 22000/34278 [24:11:27<11:26:19, 3.35s/it] {'loss': 0.1134, 'grad_norm': 0.7314890809794784, 'learning_rate': 3.003803464673534e-06, 'epoch': 0.64} 64%|██████▍ | 22000/34278 [24:11:27<11:26:19, 3.35s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
64%|██████▍ | 22001/34278 [24:12:00<41:29:31, 12.17s/it] {'loss': 0.1142, 'grad_norm': 0.8877970432339143, 'learning_rate': 3.0033703238607544e-06, 'epoch': 0.64} 64%|██████▍ | 22001/34278 [24:12:00<41:29:31, 12.17s/it] 64%|██████▍ | 22002/34278 [24:12:05<34:29:29, 10.11s/it] {'loss': 0.1278, 'grad_norm': 1.0484756317865096, 'learning_rate': 3.0029372008734065e-06, 'epoch': 0.64} 64%|██████▍ | 22002/34278 [24:12:05<34:29:29, 10.11s/it] 64%|██████▍ | 22003/34278 [24:12:08<27:23:50, 8.04s/it] {'loss': 0.1305, 'grad_norm': 0.784807631667673, 'learning_rate': 3.0025040957153576e-06, 'epoch': 0.64} 64%|██████▍ | 22003/34278 [24:12:08<27:23:50, 8.04s/it] 64%|██████▍ | 22004/34278 [24:12:14<24:54:19, 7.30s/it] {'loss': 0.1022, 'grad_norm': 0.7995695426138639, 'learning_rate': 3.002071008390477e-06, 'epoch': 0.64} 64%|██████▍ | 22004/34278 [24:12:14<24:54:19, 7.30s/it] 64%|██████▍ | 22005/34278 [24:12:17<21:00:20, 6.16s/it] {'loss': 0.1178, 'grad_norm': 1.0498805837174303, 'learning_rate': 3.0016379389026283e-06, 'epoch': 0.64} 64%|██████▍ | 22005/34278 [24:12:17<21:00:20, 6.16s/it] 64%|██████▍ | 22006/34278 [24:12:20<17:47:38,
5.22s/it] {'loss': 0.132, 'grad_norm': 0.8907687355688043, 'learning_rate': 3.001204887255681e-06, 'epoch': 0.64} 64%|██████▍ | 22006/34278 [24:12:20<17:47:38, 5.22s/it] 64%|██████▍ | 22007/34278 [24:12:25<17:36:22, 5.17s/it] {'loss': 0.1146, 'grad_norm': 1.2388437603140898, 'learning_rate': 3.000771853453498e-06, 'epoch': 0.64} 64%|██████▍ | 22007/34278 [24:12:26<17:36:22, 5.17s/it] 64%|██████▍ | 22008/34278 [24:12:29<15:34:06, 4.57s/it] {'loss': 0.1552, 'grad_norm': 0.7294235891008237, 'learning_rate': 3.0003388374999464e-06, 'epoch': 0.64} 64%|██████▍ | 22008/34278 [24:12:29<15:34:06, 4.57s/it] 64%|██████▍ | 22009/34278 [24:12:34<16:26:02, 4.82s/it] {'loss': 0.1453, 'grad_norm': 0.9111170835689067, 'learning_rate': 2.999905839398891e-06, 'epoch': 0.64} 64%|██████▍ | 22009/34278 [24:12:34<16:26:02, 4.82s/it] 64%|██████▍ | 22010/34278 [24:12:37<14:38:12, 4.30s/it] {'loss': 0.1386, 'grad_norm': 1.0006834918474947, 'learning_rate': 2.9994728591542012e-06, 'epoch': 0.64} 64%|██████▍ | 22010/34278 [24:12:37<14:38:12, 4.30s/it] 64%|██████▍ | 22011/34278 [24:12:43<15:59:08, 4.69s/it] {'loss': 0.1287, 'grad_norm': 0.7411976676137246, 'learning_rate': 2.99903989676974e-06, 'epoch': 0.64} 64%|██████▍ | 22011/34278 [24:12:43<15:59:08, 4.69s/it] 64%|██████▍ | 22012/34278 [24:12:48<16:34:44, 4.87s/it] {'loss': 0.1239, 'grad_norm': 0.8374476498672586, 'learning_rate': 2.998606952249372e-06, 'epoch': 0.64} 64%|██████▍ | 22012/34278 [24:12:48<16:34:44, 4.87s/it] 64%|██████▍ | 22013/34278 [24:12:51<14:57:13, 4.39s/it] {'loss': 0.1205, 'grad_norm': 1.1581749601311697, 'learning_rate': 2.998174025596964e-06, 'epoch': 0.64} 64%|██████▍ | 22013/34278 [24:12:51<14:57:13, 4.39s/it] 64%|██████▍ | 22014/34278 [24:12:54<13:28:01, 3.95s/it] {'loss': 0.1247, 'grad_norm': 0.8413115633375319, 'learning_rate': 2.9977411168163807e-06, 'epoch': 0.64} 64%|██████▍ | 22014/34278 [24:12:54<13:28:01, 3.95s/it] 64%|██████▍ | 22015/34278 [24:12:58<13:01:31, 3.82s/it] {'loss': 0.1048, 'grad_norm': 
0.7172542444129659, 'learning_rate': 2.997308225911485e-06, 'epoch': 0.64} 64%|██████▍ | 22015/34278 [24:12:58<13:01:31, 3.82s/it] 64%|██████▍ | 22016/34278 [24:13:01<12:23:24, 3.64s/it] {'loss': 0.143, 'grad_norm': 0.7459797529607547, 'learning_rate': 2.9968753528861443e-06, 'epoch': 0.64} 64%|██████▍ | 22016/34278 [24:13:01<12:23:24, 3.64s/it] 64%|██████▍ | 22017/34278 [24:13:04<12:00:45, 3.53s/it] {'loss': 0.1219, 'grad_norm': 0.8838932432440447, 'learning_rate': 2.9964424977442223e-06, 'epoch': 0.64} 64%|██████▍ | 22017/34278 [24:13:04<12:00:45, 3.53s/it] 64%|██████▍ | 22018/34278 [24:13:08<12:41:06, 3.72s/it] {'loss': 0.1356, 'grad_norm': 0.7892937388748272, 'learning_rate': 2.9960096604895843e-06, 'epoch': 0.64} 64%|██████▍ | 22018/34278 [24:13:08<12:41:06, 3.72s/it] 64%|██████▍ | 22019/34278 [24:13:11<11:47:27, 3.46s/it] {'loss': 0.1041, 'grad_norm': 0.723426527644377, 'learning_rate': 2.9955768411260935e-06, 'epoch': 0.64} 64%|██████▍ | 22019/34278 [24:13:11<11:47:27, 3.46s/it] 64%|██████▍ | 22020/34278 [24:13:15<11:40:22, 3.43s/it] {'loss': 0.1075, 'grad_norm': 0.817449795560833, 'learning_rate': 2.9951440396576128e-06, 'epoch': 0.64} 64%|██████▍ | 22020/34278 [24:13:15<11:40:22, 3.43s/it] 64%|██████▍ | 22021/34278 [24:13:20<13:37:18, 4.00s/it] {'loss': 0.1152, 'grad_norm': 0.7913939349359839, 'learning_rate': 2.9947112560880076e-06, 'epoch': 0.64} 64%|██████▍ | 22021/34278 [24:13:20<13:37:18, 4.00s/it] 64%|██████▍ | 22022/34278 [24:13:24<13:48:53, 4.06s/it] {'loss': 0.1364, 'grad_norm': 0.766103026246493, 'learning_rate': 2.9942784904211418e-06, 'epoch': 0.64} 64%|██████▍ | 22022/34278 [24:13:24<13:48:53, 4.06s/it] 64%|██████▍ | 22023/34278 [24:13:28<13:20:30, 3.92s/it] {'loss': 0.1159, 'grad_norm': 0.87075082492271, 'learning_rate': 2.9938457426608802e-06, 'epoch': 0.64} 64%|██████▍ | 22023/34278 [24:13:28<13:20:30, 3.92s/it] 64%|██████▍ | 22024/34278 [24:13:31<12:53:36, 3.79s/it] {'loss': 0.1258, 'grad_norm': 0.9102059288066796, 'learning_rate': 
2.993413012811084e-06, 'epoch': 0.64} 64%|██████▍ | 22024/34278 [24:13:31<12:53:36, 3.79s/it] 64%|██████▍ | 22025/34278 [24:13:34<12:16:12, 3.61s/it] {'loss': 0.1038, 'grad_norm': 0.8912854960681408, 'learning_rate': 2.9929803008756174e-06, 'epoch': 0.64} 64%|██████▍ | 22025/34278 [24:13:34<12:16:12, 3.61s/it] 64%|██████▍ | 22026/34278 [24:13:38<12:13:54, 3.59s/it] {'loss': 0.1203, 'grad_norm': 0.7777757062121337, 'learning_rate': 2.992547606858345e-06, 'epoch': 0.64} 64%|██████▍ | 22026/34278 [24:13:38<12:13:54, 3.59s/it] 64%|██████▍ | 22027/34278 [24:13:42<12:20:52, 3.63s/it] {'loss': 0.1407, 'grad_norm': 0.9571044758716966, 'learning_rate': 2.992114930763125e-06, 'epoch': 0.64} 64%|██████▍ | 22027/34278 [24:13:42<12:20:52, 3.63s/it] 64%|██████▍ | 22028/34278 [24:13:45<11:42:59, 3.44s/it] {'loss': 0.1132, 'grad_norm': 1.0391279665164588, 'learning_rate': 2.9916822725938253e-06, 'epoch': 0.64} 64%|██████▍ | 22028/34278 [24:13:45<11:42:59, 3.44s/it] 64%|██████▍ | 22029/34278 [24:13:48<11:42:56, 3.44s/it] {'loss': 0.0956, 'grad_norm': 0.7927344573522009, 'learning_rate': 2.9912496323543074e-06, 'epoch': 0.64} 64%|██████▍ | 22029/34278 [24:13:48<11:42:56, 3.44s/it] 64%|██████▍ | 22030/34278 [24:13:54<14:24:14, 4.23s/it] {'loss': 0.1274, 'grad_norm': 0.6825566380151522, 'learning_rate': 2.990817010048433e-06, 'epoch': 0.64} 64%|██████▍ | 22030/34278 [24:13:54<14:24:14, 4.23s/it] 64%|██████▍ | 22031/34278 [24:13:58<13:46:49, 4.05s/it] {'loss': 0.1324, 'grad_norm': 0.8293530431236341, 'learning_rate': 2.9903844056800657e-06, 'epoch': 0.64} 64%|██████▍ | 22031/34278 [24:13:58<13:46:49, 4.05s/it] 64%|██████▍ | 22032/34278 [24:14:01<13:13:53, 3.89s/it] {'loss': 0.1078, 'grad_norm': 0.902414512926688, 'learning_rate': 2.989951819253063e-06, 'epoch': 0.64} 64%|██████▍ | 22032/34278 [24:14:01<13:13:53, 3.89s/it] 64%|██████▍ | 22033/34278 [24:14:05<13:20:02, 3.92s/it] {'loss': 0.1289, 'grad_norm': 0.9077883812298531, 'learning_rate': 2.9895192507712943e-06, 'epoch': 0.64} 
64%|██████▍ | 22033/34278 [24:14:05<13:20:02, 3.92s/it] 64%|██████▍ | 22034/34278 [24:14:10<14:20:06, 4.21s/it] {'loss': 0.1355, 'grad_norm': 1.0576657118963588, 'learning_rate': 2.989086700238617e-06, 'epoch': 0.64} 64%|██████▍ | 22034/34278 [24:14:10<14:20:06, 4.21s/it] 64%|██████▍ | 22035/34278 [24:14:16<16:07:26, 4.74s/it] {'loss': 0.117, 'grad_norm': 0.7742967496213374, 'learning_rate': 2.988654167658893e-06, 'epoch': 0.64} 64%|██████▍ | 22035/34278 [24:14:16<16:07:26, 4.74s/it] 64%|██████▍ | 22036/34278 [24:14:19<14:18:01, 4.21s/it] {'loss': 0.1129, 'grad_norm': 0.9320383661908841, 'learning_rate': 2.9882216530359855e-06, 'epoch': 0.64} 64%|██████▍ | 22036/34278 [24:14:19<14:18:01, 4.21s/it] 64%|██████▍ | 22037/34278 [24:14:22<13:01:01, 3.83s/it] {'loss': 0.1171, 'grad_norm': 0.8551385599946922, 'learning_rate': 2.9877891563737538e-06, 'epoch': 0.64} 64%|██████▍ | 22037/34278 [24:14:22<13:01:01, 3.83s/it] 64%|██████▍ | 22038/34278 [24:14:26<13:13:10, 3.89s/it] {'loss': 0.1221, 'grad_norm': 1.1244173168804574, 'learning_rate': 2.98735667767606e-06, 'epoch': 0.64} 64%|██████▍ | 22038/34278 [24:14:26<13:13:10, 3.89s/it] 64%|██████▍ | 22039/34278 [24:14:29<12:27:33, 3.66s/it] {'loss': 0.1115, 'grad_norm': 0.8901108181760674, 'learning_rate': 2.986924216946765e-06, 'epoch': 0.64} 64%|██████▍ | 22039/34278 [24:14:29<12:27:33, 3.66s/it] 64%|██████▍ | 22040/34278 [24:14:33<12:05:00, 3.55s/it] {'loss': 0.1458, 'grad_norm': 0.8029337076108818, 'learning_rate': 2.986491774189731e-06, 'epoch': 0.64} 64%|██████▍ | 22040/34278 [24:14:33<12:05:00, 3.55s/it] 64%|██████▍ | 22041/34278 [24:14:36<11:31:20, 3.39s/it] {'loss': 0.12, 'grad_norm': 0.8066120730382462, 'learning_rate': 2.9860593494088187e-06, 'epoch': 0.64} 64%|██████▍ | 22041/34278 [24:14:36<11:31:20, 3.39s/it] 64%|██████▍ | 22042/34278 [24:14:39<11:44:31, 3.45s/it] {'loss': 0.1217, 'grad_norm': 0.864314297791898, 'learning_rate': 2.9856269426078867e-06, 'epoch': 0.64} 64%|██████▍ | 22042/34278 [24:14:39<11:44:31, 
3.45s/it] 64%|██████▍ | 22043/34278 [24:14:42<11:23:39, 3.35s/it] {'loss': 0.1192, 'grad_norm': 0.7196071010194764, 'learning_rate': 2.985194553790796e-06, 'epoch': 0.64} 64%|██████▍ | 22043/34278 [24:14:42<11:23:39, 3.35s/it] 64%|██████▍ | 22044/34278 [24:14:48<13:21:01, 3.93s/it] {'loss': 0.1164, 'grad_norm': 0.8675494861147474, 'learning_rate': 2.984762182961407e-06, 'epoch': 0.64} 64%|██████▍ | 22044/34278 [24:14:48<13:21:01, 3.93s/it] 64%|██████▍ | 22045/34278 [24:14:51<12:30:21, 3.68s/it] {'loss': 0.1364, 'grad_norm': 0.8661346968630518, 'learning_rate': 2.9843298301235812e-06, 'epoch': 0.64} 64%|██████▍ | 22045/34278 [24:14:51<12:30:21, 3.68s/it] 64%|██████▍ | 22046/34278 [24:14:56<14:05:24, 4.15s/it] {'loss': 0.1115, 'grad_norm': 0.8267836367556943, 'learning_rate': 2.983897495281177e-06, 'epoch': 0.64} 64%|██████▍ | 22046/34278 [24:14:56<14:05:24, 4.15s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6c9bf175b0>
Failed to fetch sample 3353707. Exception: cannot identify image file <_io.BytesIO object at 0x7f6c9bf175b0>
64%|██████▍ | 22047/34278 [24:15:00<13:54:33, 4.09s/it] {'loss': 0.1035, 'grad_norm': 0.8494980209202818, 'learning_rate': 2.9834651784380554e-06, 'epoch': 0.64} 64%|██████▍ | 22047/34278 [24:15:00<13:54:33, 4.09s/it] 64%|██████▍ | 22048/34278 [24:15:03<12:53:13, 3.79s/it] {'loss': 0.1218, 'grad_norm': 0.8089928985302235, 'learning_rate': 2.9830328795980756e-06, 'epoch': 0.64} 64%|██████▍ | 22048/34278 [24:15:03<12:53:13, 3.79s/it] 64%|██████▍ | 22049/34278 [24:15:06<11:46:53, 3.47s/it] {'loss': 0.1248, 'grad_norm': 0.9807952922080896, 'learning_rate': 2.9826005987650964e-06, 'epoch': 0.64} 64%|██████▍ | 22049/34278 [24:15:06<11:46:53, 3.47s/it] 64%|██████▍ | 22050/34278 [24:15:09<11:15:16, 3.31s/it] {'loss': 0.1278, 'grad_norm': 0.8681631326657169, 'learning_rate': 2.9821683359429755e-06, 'epoch': 0.64} 64%|██████▍ | 22050/34278 [24:15:09<11:15:16, 3.31s/it] 64%|██████▍ | 22051/34278 [24:15:14<13:31:33, 3.98s/it] {'loss': 0.1268, 'grad_norm': 2.583564011172082, 'learning_rate': 2.981736091135575e-06, 'epoch': 0.64} 64%|██████▍ | 22051/34278 [24:15:14<13:31:33, 3.98s/it] 64%|██████▍ | 22052/34278 [24:15:17<12:35:35, 3.71s/it] {'loss': 0.1167, 'grad_norm': 1.0576873002634501, 'learning_rate': 2.981303864346754e-06, 'epoch': 0.64} 64%|██████▍ | 22052/34278 [24:15:17<12:35:35, 3.71s/it] 64%|██████▍ | 22053/34278 [24:15:21<12:10:58, 3.59s/it] {'loss': 0.1242, 'grad_norm': 0.9968347441174502, 'learning_rate': 2.9808716555803704e-06, 'epoch': 0.64} 64%|██████▍ | 22053/34278 [24:15:21<12:10:58, 3.59s/it] 64%|██████▍ | 22054/34278 [24:15:27<14:41:30, 4.33s/it] {'loss': 0.1306, 'grad_norm': 0.8397921202388221,
'learning_rate': 2.980439464840282e-06, 'epoch': 0.64} 64%|██████▍ | 22054/34278 [24:15:27<14:41:30, 4.33s/it] 64%|██████▍ | 22055/34278 [24:15:30<13:27:03, 3.96s/it] {'loss': 0.1297, 'grad_norm': 0.775762972254896, 'learning_rate': 2.9800072921303474e-06, 'epoch': 0.64} 64%|██████▍ | 22055/34278 [24:15:30<13:27:03, 3.96s/it] 64%|██████▍ | 22056/34278 [24:15:33<12:36:12, 3.71s/it] {'loss': 0.1066, 'grad_norm': 0.8310130153168136, 'learning_rate': 2.9795751374544244e-06, 'epoch': 0.64} 64%|██████▍ | 22056/34278 [24:15:33<12:36:12, 3.71s/it] 64%|██████▍ | 22057/34278 [24:15:36<11:59:27, 3.53s/it] {'loss': 0.1146, 'grad_norm': 0.7542953577879931, 'learning_rate': 2.9791430008163743e-06, 'epoch': 0.64} 64%|██████▍ | 22057/34278 [24:15:36<11:59:27, 3.53s/it] 64%|██████▍ | 22058/34278 [24:15:39<11:44:56, 3.46s/it] {'loss': 0.1128, 'grad_norm': 0.7323427741304133, 'learning_rate': 2.9787108822200535e-06, 'epoch': 0.64} 64%|██████▍ | 22058/34278 [24:15:39<11:44:56, 3.46s/it] 64%|██████▍ | 22059/34278 [24:15:46<14:55:11, 4.40s/it] {'loss': 0.1282, 'grad_norm': 0.907603976780851, 'learning_rate': 2.978278781669318e-06, 'epoch': 0.64} 64%|██████▍ | 22059/34278 [24:15:46<14:55:11, 4.40s/it] 64%|██████▍ | 22060/34278 [24:15:49<13:45:13, 4.05s/it] {'loss': 0.1365, 'grad_norm': 0.84944528132297, 'learning_rate': 2.977846699168028e-06, 'epoch': 0.64} 64%|██████▍ | 22060/34278 [24:15:49<13:45:13, 4.05s/it] 64%|██████▍ | 22061/34278 [24:15:53<13:09:49, 3.88s/it] {'loss': 0.1582, 'grad_norm': 0.7779429623159441, 'learning_rate': 2.9774146347200394e-06, 'epoch': 0.64} 64%|██████▍ | 22061/34278 [24:15:53<13:09:49, 3.88s/it] 64%|██████▍ | 22062/34278 [24:15:55<12:09:50, 3.58s/it] {'loss': 0.154, 'grad_norm': 0.7783991942694073, 'learning_rate': 2.9769825883292082e-06, 'epoch': 0.64} 64%|██████▍ | 22062/34278 [24:15:55<12:09:50, 3.58s/it] 64%|██████▍ | 22063/34278 [24:15:59<11:43:59, 3.46s/it] {'loss': 0.1439, 'grad_norm': 1.0587142119934603, 'learning_rate': 2.976550559999396e-06, 
'epoch': 0.64} 64%|██████▍ | 22063/34278 [24:15:59<11:43:59, 3.46s/it] 64%|██████▍ | 22064/34278 [24:16:04<13:26:25, 3.96s/it] {'loss': 0.1222, 'grad_norm': 0.8804615046154666, 'learning_rate': 2.976118549734457e-06, 'epoch': 0.64} 64%|██████▍ | 22064/34278 [24:16:04<13:26:25, 3.96s/it] 64%|██████▍ | 22065/34278 [24:16:07<12:22:53, 3.65s/it] {'loss': 0.1248, 'grad_norm': 0.8756130173308975, 'learning_rate': 2.9756865575382475e-06, 'epoch': 0.64} 64%|██████▍ | 22065/34278 [24:16:07<12:22:53, 3.65s/it] 64%|██████▍ | 22066/34278 [24:16:10<11:39:46, 3.44s/it] {'loss': 0.1159, 'grad_norm': 0.7862666622013987, 'learning_rate': 2.9752545834146275e-06, 'epoch': 0.64} 64%|██████▍ | 22066/34278 [24:16:10<11:39:46, 3.44s/it] 64%|██████▍ | 22067/34278 [24:16:13<11:30:50, 3.39s/it] {'loss': 0.1288, 'grad_norm': 0.9494680000328745, 'learning_rate': 2.974822627367449e-06, 'epoch': 0.64} 64%|██████▍ | 22067/34278 [24:16:13<11:30:50, 3.39s/it] 64%|██████▍ | 22068/34278 [24:16:19<14:17:34, 4.21s/it] {'loss': 0.1132, 'grad_norm': 0.798034359516251, 'learning_rate': 2.97439068940057e-06, 'epoch': 0.64} 64%|██████▍ | 22068/34278 [24:16:19<14:17:34, 4.21s/it] 64%|██████▍ | 22069/34278 [24:16:22<13:28:16, 3.97s/it] {'loss': 0.1537, 'grad_norm': 1.1078607766031694, 'learning_rate': 2.9739587695178485e-06, 'epoch': 0.64} 64%|██████▍ | 22069/34278 [24:16:22<13:28:16, 3.97s/it] 64%|██████▍ | 22070/34278 [24:16:26<12:53:42, 3.80s/it] {'loss': 0.1083, 'grad_norm': 1.2731222312197774, 'learning_rate': 2.97352686772314e-06, 'epoch': 0.64} 64%|██████▍ | 22070/34278 [24:16:26<12:53:42, 3.80s/it] 64%|██████▍ | 22071/34278 [24:16:32<15:01:04, 4.43s/it] {'loss': 0.1162, 'grad_norm': 0.9063292670247907, 'learning_rate': 2.9730949840203e-06, 'epoch': 0.64} 64%|██████▍ | 22071/34278 [24:16:32<15:01:04, 4.43s/it] 64%|██████▍ | 22072/34278 [24:16:38<16:46:55, 4.95s/it] {'loss': 0.1585, 'grad_norm': 0.939624863022762, 'learning_rate': 2.9726631184131833e-06, 'epoch': 0.64} 64%|██████▍ | 22072/34278 
[24:16:38<16:46:55, 4.95s/it] 64%|██████▍ | 22073/34278 [24:16:41<15:00:37, 4.43s/it] {'loss': 0.1278, 'grad_norm': 0.9894101783704818, 'learning_rate': 2.9722312709056466e-06, 'epoch': 0.64} 64%|██████▍ | 22073/34278 [24:16:41<15:00:37, 4.43s/it] 64%|██████▍ | 22074/34278 [24:16:47<16:35:01, 4.89s/it] {'loss': 0.1211, 'grad_norm': 1.1466272745788815, 'learning_rate': 2.971799441501544e-06, 'epoch': 0.64} 64%|██████▍ | 22074/34278 [24:16:47<16:35:01, 4.89s/it] 64%|██████▍ | 22075/34278 [24:16:50<15:02:41, 4.44s/it] {'loss': 0.0974, 'grad_norm': 0.7980771666472901, 'learning_rate': 2.9713676302047335e-06, 'epoch': 0.64} 64%|██████▍ | 22075/34278 [24:16:51<15:02:41, 4.44s/it] 64%|██████▍ | 22076/34278 [24:16:54<13:38:35, 4.03s/it] {'loss': 0.1329, 'grad_norm': 0.9849768463337183, 'learning_rate': 2.9709358370190677e-06, 'epoch': 0.64} 64%|██████▍ | 22076/34278 [24:16:54<13:38:35, 4.03s/it] 64%|██████▍ | 22077/34278 [24:16:57<13:02:55, 3.85s/it] {'loss': 0.1245, 'grad_norm': 1.0127582034249096, 'learning_rate': 2.970504061948403e-06, 'epoch': 0.64} 64%|██████▍ | 22077/34278 [24:16:57<13:02:55, 3.85s/it] 64%|██████▍ | 22078/34278 [24:17:00<11:54:40, 3.51s/it] {'loss': 0.1236, 'grad_norm': 1.1530073418603166, 'learning_rate': 2.9700723049965928e-06, 'epoch': 0.64} 64%|██████▍ | 22078/34278 [24:17:00<11:54:40, 3.51s/it] 64%|██████▍ | 22079/34278 [24:17:03<11:24:55, 3.37s/it] {'loss': 0.122, 'grad_norm': 0.9272447726282984, 'learning_rate': 2.969640566167493e-06, 'epoch': 0.64} 64%|██████▍ | 22079/34278 [24:17:03<11:24:55, 3.37s/it] 64%|██████▍ | 22080/34278 [24:17:06<10:59:35, 3.24s/it] {'loss': 0.126, 'grad_norm': 0.917588325968273, 'learning_rate': 2.969208845464956e-06, 'epoch': 0.64} 64%|██████▍ | 22080/34278 [24:17:06<10:59:35, 3.24s/it] 64%|██████▍ | 22081/34278 [24:17:09<10:40:37, 3.15s/it] {'loss': 0.1437, 'grad_norm': 1.0576892000786284, 'learning_rate': 2.968777142892839e-06, 'epoch': 0.64} 64%|██████▍ | 22081/34278 [24:17:09<10:40:37, 3.15s/it] 64%|██████▍ | 
22082/34278 [24:17:12<11:16:01, 3.33s/it] {'loss': 0.1172, 'grad_norm': 0.6943798440503597, 'learning_rate': 2.9683454584549943e-06, 'epoch': 0.64} 64%|██████▍ | 22082/34278 [24:17:12<11:16:01, 3.33s/it] 64%|██████▍ | 22083/34278 [24:17:16<11:30:50, 3.40s/it] {'loss': 0.1285, 'grad_norm': 0.985432734009756, 'learning_rate': 2.967913792155278e-06, 'epoch': 0.64} 64%|██████▍ | 22083/34278 [24:17:16<11:30:50, 3.40s/it] 64%|██████▍ | 22084/34278 [24:17:19<11:09:23, 3.29s/it] {'loss': 0.1253, 'grad_norm': 0.9401943586052562, 'learning_rate': 2.967482143997541e-06, 'epoch': 0.64} 64%|██████▍ | 22084/34278 [24:17:19<11:09:23, 3.29s/it] 64%|██████▍ | 22085/34278 [24:17:23<11:30:23, 3.40s/it] {'loss': 0.1139, 'grad_norm': 0.9388888997823897, 'learning_rate': 2.9670505139856375e-06, 'epoch': 0.64} 64%|██████▍ | 22085/34278 [24:17:23<11:30:23, 3.40s/it] 64%|██████▍ | 22086/34278 [24:17:26<11:19:18, 3.34s/it] {'loss': 0.1053, 'grad_norm': 0.9569262205041986, 'learning_rate': 2.9666189021234214e-06, 'epoch': 0.64} 64%|██████▍ | 22086/34278 [24:17:26<11:19:18, 3.34s/it] 64%|██████▍ | 22087/34278 [24:17:29<10:50:03, 3.20s/it] {'loss': 0.1084, 'grad_norm': 0.8517320319007174, 'learning_rate': 2.9661873084147473e-06, 'epoch': 0.64} 64%|██████▍ | 22087/34278 [24:17:29<10:50:03, 3.20s/it] 64%|██████▍ | 22088/34278 [24:17:35<13:39:34, 4.03s/it] {'loss': 0.165, 'grad_norm': 1.0762450535352732, 'learning_rate': 2.9657557328634688e-06, 'epoch': 0.64} 64%|██████▍ | 22088/34278 [24:17:35<13:39:34, 4.03s/it] 64%|██████▍ | 22089/34278 [24:17:37<12:20:29, 3.65s/it] {'loss': 0.1348, 'grad_norm': 0.9851269425213238, 'learning_rate': 2.9653241754734363e-06, 'epoch': 0.64} 64%|██████▍ | 22089/34278 [24:17:37<12:20:29, 3.65s/it] 64%|██████▍ | 22090/34278 [24:17:44<14:50:58, 4.39s/it] {'loss': 0.1143, 'grad_norm': 0.8501366656248518, 'learning_rate': 2.964892636248503e-06, 'epoch': 0.64} 64%|██████▍ | 22090/34278 [24:17:44<14:50:58, 4.39s/it] 64%|██████▍ | 22091/34278 [24:17:48<14:24:18, 4.26s/it] 
{'loss': 0.1307, 'grad_norm': 1.1994242588566304, 'learning_rate': 2.964461115192524e-06, 'epoch': 0.64} 64%|██████▍ | 22091/34278 [24:17:48<14:24:18, 4.26s/it] 64%|██████▍ | 22092/34278 [24:17:52<15:03:34, 4.45s/it] {'loss': 0.1183, 'grad_norm': 0.8736033208421453, 'learning_rate': 2.9640296123093476e-06, 'epoch': 0.64} 64%|██████▍ | 22092/34278 [24:17:52<15:03:34, 4.45s/it] 64%|██████▍ | 22093/34278 [24:17:57<15:31:59, 4.59s/it] {'loss': 0.1307, 'grad_norm': 0.8446397914758896, 'learning_rate': 2.963598127602831e-06, 'epoch': 0.64} 64%|██████▍ | 22093/34278 [24:17:57<15:31:59, 4.59s/it] 64%|██████▍ | 22094/34278 [24:18:01<14:12:55, 4.20s/it] {'loss': 0.1301, 'grad_norm': 0.7794112893337023, 'learning_rate': 2.963166661076824e-06, 'epoch': 0.64} 64%|██████▍ | 22094/34278 [24:18:01<14:12:55, 4.20s/it] 64%|██████▍ | 22095/34278 [24:18:04<13:03:32, 3.86s/it] {'loss': 0.1395, 'grad_norm': 0.8520922326501634, 'learning_rate': 2.9627352127351783e-06, 'epoch': 0.64} 64%|██████▍ | 22095/34278 [24:18:04<13:03:32, 3.86s/it] 64%|██████▍ | 22096/34278 [24:18:07<12:19:30, 3.64s/it] {'loss': 0.1116, 'grad_norm': 0.7382525198148784, 'learning_rate': 2.962303782581748e-06, 'epoch': 0.64} 64%|██████▍ | 22096/34278 [24:18:07<12:19:30, 3.64s/it] 64%|██████▍ | 22097/34278 [24:18:10<12:06:23, 3.58s/it] {'loss': 0.1236, 'grad_norm': 1.0613390162714242, 'learning_rate': 2.9618723706203812e-06, 'epoch': 0.64} 64%|██████▍ | 22097/34278 [24:18:10<12:06:23, 3.58s/it] 64%|██████▍ | 22098/34278 [24:18:15<13:13:16, 3.91s/it] {'loss': 0.1264, 'grad_norm': 0.9408842892032847, 'learning_rate': 2.961440976854931e-06, 'epoch': 0.64} 64%|██████▍ | 22098/34278 [24:18:15<13:13:16, 3.91s/it] 64%|██████▍ | 22099/34278 [24:18:20<14:16:45, 4.22s/it] {'loss': 0.1297, 'grad_norm': 0.72467979429876, 'learning_rate': 2.9610096012892496e-06, 'epoch': 0.64} 64%|██████▍ | 22099/34278 [24:18:20<14:16:45, 4.22s/it] 64%|██████▍ | 22100/34278 [24:18:23<13:12:06, 3.90s/it] {'loss': 0.1149, 'grad_norm': 
0.737689601338275, 'learning_rate': 2.960578243927188e-06, 'epoch': 0.64} 64%|██████▍ | 22100/34278 [24:18:23<13:12:06, 3.90s/it] 64%|██████▍ | 22101/34278 [24:18:29<15:12:20, 4.50s/it] {'loss': 0.1228, 'grad_norm': 0.9520034245445577, 'learning_rate': 2.960146904772598e-06, 'epoch': 0.64} 64%|██████▍ | 22101/34278 [24:18:29<15:12:20, 4.50s/it] 64%|██████▍ | 22102/34278 [24:18:32<13:40:56, 4.05s/it] {'loss': 0.1399, 'grad_norm': 1.136912996718269, 'learning_rate': 2.959715583829328e-06, 'epoch': 0.64} 64%|██████▍ | 22102/34278 [24:18:32<13:40:56, 4.05s/it] 64%|██████▍ | 22103/34278 [24:18:36<13:20:34, 3.95s/it] {'loss': 0.1399, 'grad_norm': 0.8071727351381353, 'learning_rate': 2.959284281101231e-06, 'epoch': 0.64} 64%|██████▍ | 22103/34278 [24:18:36<13:20:34, 3.95s/it] 64%|██████▍ | 22104/34278 [24:18:39<13:15:10, 3.92s/it] {'loss': 0.1224, 'grad_norm': 0.8089437736268954, 'learning_rate': 2.958852996592155e-06, 'epoch': 0.64} 64%|██████▍ | 22104/34278 [24:18:39<13:15:10, 3.92s/it] 64%|██████▍ | 22105/34278 [24:18:45<15:18:38, 4.53s/it] {'loss': 0.1156, 'grad_norm': 0.947368275285795, 'learning_rate': 2.958421730305955e-06, 'epoch': 0.64} 64%|██████▍ | 22105/34278 [24:18:45<15:18:38, 4.53s/it] 64%|██████▍ | 22106/34278 [24:18:48<13:45:36, 4.07s/it] {'loss': 0.1408, 'grad_norm': 1.1211907837726405, 'learning_rate': 2.9579904822464767e-06, 'epoch': 0.64} 64%|██████▍ | 22106/34278 [24:18:48<13:45:36, 4.07s/it] 64%|██████▍ | 22107/34278 [24:18:54<15:13:43, 4.50s/it] {'loss': 0.1132, 'grad_norm': 0.792799923898483, 'learning_rate': 2.9575592524175723e-06, 'epoch': 0.64} 64%|██████▍ | 22107/34278 [24:18:54<15:13:43, 4.50s/it] 64%|██████▍ | 22108/34278 [24:18:58<14:31:32, 4.30s/it] {'loss': 0.1305, 'grad_norm': 0.7318146421671569, 'learning_rate': 2.9571280408230917e-06, 'epoch': 0.64} 64%|██████▍ | 22108/34278 [24:18:58<14:31:32, 4.30s/it] 64%|██████▍ | 22109/34278 [24:19:01<13:14:50, 3.92s/it] {'loss': 0.1203, 'grad_norm': 1.0511070170228909, 'learning_rate': 
2.9566968474668847e-06, 'epoch': 0.64} 64%|██████▍ | 22109/34278 [24:19:01<13:14:50, 3.92s/it] 65%|██████▍ | 22110/34278 [24:19:04<12:25:42, 3.68s/it] {'loss': 0.1153, 'grad_norm': 0.8679755600181671, 'learning_rate': 2.956265672352798e-06, 'epoch': 0.65} 65%|██████▍ | 22110/34278 [24:19:04<12:25:42, 3.68s/it] 65%|██████▍ | 22111/34278 [24:19:08<12:52:45, 3.81s/it] {'loss': 0.1296, 'grad_norm': 0.9430004558181917, 'learning_rate': 2.955834515484685e-06, 'epoch': 0.65} 65%|██████▍ | 22111/34278 [24:19:08<12:52:45, 3.81s/it] 65%|██████▍ | 22112/34278 [24:19:11<12:26:50, 3.68s/it] {'loss': 0.1121, 'grad_norm': 0.9449628681644011, 'learning_rate': 2.9554033768663937e-06, 'epoch': 0.65} 65%|██████▍ | 22112/34278 [24:19:11<12:26:50, 3.68s/it] 65%|██████▍ | 22113/34278 [24:19:15<12:45:17, 3.77s/it] {'loss': 0.1392, 'grad_norm': 0.7746785251412969, 'learning_rate': 2.9549722565017737e-06, 'epoch': 0.65} 65%|██████▍ | 22113/34278 [24:19:15<12:45:17, 3.77s/it] 65%|██████▍ | 22114/34278 [24:19:19<12:29:49, 3.70s/it] {'loss': 0.1305, 'grad_norm': 0.8924165420964965, 'learning_rate': 2.9545411543946723e-06, 'epoch': 0.65} 65%|██████▍ | 22114/34278 [24:19:19<12:29:49, 3.70s/it] 65%|██████▍ | 22115/34278 [24:19:23<12:51:56, 3.81s/it] {'loss': 0.1518, 'grad_norm': 0.7748486079407143, 'learning_rate': 2.9541100705489393e-06, 'epoch': 0.65} 65%|██████▍ | 22115/34278 [24:19:23<12:51:56, 3.81s/it] 65%|██████▍ | 22116/34278 [24:19:26<12:14:18, 3.62s/it] {'loss': 0.1081, 'grad_norm': 0.6335176502798734, 'learning_rate': 2.9536790049684224e-06, 'epoch': 0.65} 65%|██████▍ | 22116/34278 [24:19:26<12:14:18, 3.62s/it] 65%|██████▍ | 22117/34278 [24:19:32<14:34:12, 4.31s/it] {'loss': 0.1275, 'grad_norm': 1.0450723518143625, 'learning_rate': 2.9532479576569716e-06, 'epoch': 0.65} 65%|██████▍ | 22117/34278 [24:19:32<14:34:12, 4.31s/it] 65%|██████▍ | 22118/34278 [24:19:35<13:12:54, 3.91s/it] {'loss': 0.1052, 'grad_norm': 0.7215618244914422, 'learning_rate': 2.9528169286184348e-06, 'epoch': 0.65} 
65%|██████▍ | 22118/34278 [24:19:35<13:12:54, 3.91s/it] 65%|██████▍ | 22119/34278 [24:19:41<15:26:20, 4.57s/it] {'loss': 0.133, 'grad_norm': 0.9781881820207516, 'learning_rate': 2.9523859178566594e-06, 'epoch': 0.65} 65%|██████▍ | 22119/34278 [24:19:41<15:26:20, 4.57s/it] 65%|██████▍ | 22120/34278 [24:19:45<14:11:30, 4.20s/it] {'loss': 0.1145, 'grad_norm': 0.707457863237948, 'learning_rate': 2.951954925375494e-06, 'epoch': 0.65} 65%|██████▍ | 22120/34278 [24:19:45<14:11:30, 4.20s/it] 65%|██████▍ | 22121/34278 [24:19:47<12:51:21, 3.81s/it] {'loss': 0.1044, 'grad_norm': 0.7934944591494858, 'learning_rate': 2.951523951178787e-06, 'epoch': 0.65} 65%|██████▍ | 22121/34278 [24:19:47<12:51:21, 3.81s/it] 65%|██████▍ | 22122/34278 [24:19:54<15:10:58, 4.50s/it] {'loss': 0.1289, 'grad_norm': 0.7912181486480262, 'learning_rate': 2.9510929952703815e-06, 'epoch': 0.65} 65%|██████▍ | 22122/34278 [24:19:54<15:10:58, 4.50s/it] 65%|██████▍ | 22123/34278 [24:19:56<13:31:57, 4.01s/it] {'loss': 0.1309, 'grad_norm': 0.7793378273608388, 'learning_rate': 2.950662057654132e-06, 'epoch': 0.65} 65%|██████▍ | 22123/34278 [24:19:56<13:31:57, 4.01s/it] 65%|██████▍ | 22124/34278 [24:19:59<12:17:53, 3.64s/it] {'loss': 0.1038, 'grad_norm': 0.8729551155646463, 'learning_rate': 2.950231138333882e-06, 'epoch': 0.65} 65%|██████▍ | 22124/34278 [24:19:59<12:17:53, 3.64s/it] 65%|██████▍ | 22125/34278 [24:20:03<12:16:08, 3.63s/it] {'loss': 0.126, 'grad_norm': 0.813778001061841, 'learning_rate': 2.949800237313478e-06, 'epoch': 0.65} 65%|██████▍ | 22125/34278 [24:20:03<12:16:08, 3.63s/it] 65%|██████▍ | 22126/34278 [24:20:09<14:41:15, 4.35s/it] {'loss': 0.1372, 'grad_norm': 0.894484055265241, 'learning_rate': 2.94936935459677e-06, 'epoch': 0.65} 65%|██████▍ | 22126/34278 [24:20:09<14:41:15, 4.35s/it] 65%|██████▍ | 22127/34278 [24:20:12<13:15:01, 3.93s/it] {'loss': 0.1383, 'grad_norm': 0.895704340897552, 'learning_rate': 2.9489384901876016e-06, 'epoch': 0.65} 65%|██████▍ | 22127/34278 [24:20:12<13:15:01, 
3.93s/it] 65%|██████▍ | 22128/34278 [24:20:18<15:18:47, 4.54s/it] {'loss': 0.1336, 'grad_norm': 0.8094463295701287, 'learning_rate': 2.94850764408982e-06, 'epoch': 0.65} 65%|██████▍ | 22128/34278 [24:20:18<15:18:47, 4.54s/it] 65%|██████▍ | 22129/34278 [24:20:22<14:56:17, 4.43s/it] {'loss': 0.117, 'grad_norm': 0.8192602504632791, 'learning_rate': 2.9480768163072726e-06, 'epoch': 0.65} 65%|██████▍ | 22129/34278 [24:20:22<14:56:17, 4.43s/it] 65%|██████▍ | 22130/34278 [24:20:25<13:47:35, 4.09s/it] {'loss': 0.1049, 'grad_norm': 0.6793074005980579, 'learning_rate': 2.9476460068438064e-06, 'epoch': 0.65} 65%|██████▍ | 22130/34278 [24:20:25<13:47:35, 4.09s/it] 65%|██████▍ | 22131/34278 [24:20:28<12:33:00, 3.72s/it] {'loss': 0.1251, 'grad_norm': 0.7619057899262515, 'learning_rate': 2.947215215703267e-06, 'epoch': 0.65} 65%|██████▍ | 22131/34278 [24:20:28<12:33:00, 3.72s/it] 65%|██████▍ | 22132/34278 [24:20:31<12:17:03, 3.64s/it] {'loss': 0.1229, 'grad_norm': 0.8574812210018996, 'learning_rate': 2.9467844428894998e-06, 'epoch': 0.65} 65%|██████▍ | 22132/34278 [24:20:32<12:17:03, 3.64s/it] 65%|██████▍ | 22133/34278 [24:20:38<14:50:28, 4.40s/it] {'loss': 0.1324, 'grad_norm': 0.7172180666131697, 'learning_rate': 2.9463536884063505e-06, 'epoch': 0.65} 65%|██████▍ | 22133/34278 [24:20:38<14:50:28, 4.40s/it] 65%|██████▍ | 22134/34278 [24:20:40<13:14:04, 3.92s/it] {'loss': 0.1339, 'grad_norm': 0.7592250951735323, 'learning_rate': 2.945922952257664e-06, 'epoch': 0.65} 65%|██████▍ | 22134/34278 [24:20:40<13:14:04, 3.92s/it] 65%|██████▍ | 22135/34278 [24:20:44<13:14:58, 3.93s/it] {'loss': 0.1114, 'grad_norm': 1.1939515370283684, 'learning_rate': 2.9454922344472893e-06, 'epoch': 0.65} 65%|██████▍ | 22135/34278 [24:20:44<13:14:58, 3.93s/it] 65%|██████▍ | 22136/34278 [24:20:47<12:15:39, 3.64s/it] {'loss': 0.0953, 'grad_norm': 0.6845693570592758, 'learning_rate': 2.945061534979069e-06, 'epoch': 0.65} 65%|██████▍ | 22136/34278 [24:20:47<12:15:39, 3.64s/it] 65%|██████▍ | 22137/34278 
[24:20:53<14:43:20, 4.37s/it] {'loss': 0.1186, 'grad_norm': 0.7923648608066438, 'learning_rate': 2.944630853856848e-06, 'epoch': 0.65} 65%|██████▍ | 22137/34278 [24:20:53<14:43:20, 4.37s/it] 65%|██████▍ | 22138/34278 [24:20:56<13:22:03, 3.96s/it] {'loss': 0.1334, 'grad_norm': 1.3492082174648148, 'learning_rate': 2.944200191084473e-06, 'epoch': 0.65} 65%|██████▍ | 22138/34278 [24:20:56<13:22:03, 3.96s/it] 65%|██████▍ | 22139/34278 [24:21:00<13:01:35, 3.86s/it] {'loss': 0.1165, 'grad_norm': 1.0539879261958156, 'learning_rate': 2.9437695466657877e-06, 'epoch': 0.65} 65%|██████▍ | 22139/34278 [24:21:00<13:01:35, 3.86s/it] 65%|██████▍ | 22140/34278 [24:21:03<11:52:28, 3.52s/it] {'loss': 0.1012, 'grad_norm': 0.6507382814474879, 'learning_rate': 2.943338920604636e-06, 'epoch': 0.65} 65%|██████▍ | 22140/34278 [24:21:03<11:52:28, 3.52s/it] 65%|██████▍ | 22141/34278 [24:21:06<11:26:06, 3.39s/it] {'loss': 0.1133, 'grad_norm': 0.9651391579973144, 'learning_rate': 2.9429083129048636e-06, 'epoch': 0.65} 65%|██████▍ | 22141/34278 [24:21:06<11:26:06, 3.39s/it] 65%|██████▍ | 22142/34278 [24:21:09<11:19:46, 3.36s/it] {'loss': 0.1211, 'grad_norm': 1.1172918869556225, 'learning_rate': 2.942477723570315e-06, 'epoch': 0.65} 65%|██████▍ | 22142/34278 [24:21:09<11:19:46, 3.36s/it] 65%|██████▍ | 22143/34278 [24:21:13<11:25:05, 3.39s/it] {'loss': 0.1299, 'grad_norm': 0.914783104135238, 'learning_rate': 2.9420471526048356e-06, 'epoch': 0.65} 65%|██████▍ | 22143/34278 [24:21:13<11:25:05, 3.39s/it] 65%|██████▍ | 22144/34278 [24:21:16<10:56:41, 3.25s/it] {'loss': 0.1403, 'grad_norm': 1.1613740299798396, 'learning_rate': 2.941616600012267e-06, 'epoch': 0.65} 65%|██████▍ | 22144/34278 [24:21:16<10:56:41, 3.25s/it] 65%|██████▍ | 22145/34278 [24:21:21<12:47:04, 3.79s/it] {'loss': 0.139, 'grad_norm': 0.7755959362290784, 'learning_rate': 2.941186065796453e-06, 'epoch': 0.65} 65%|██████▍ | 22145/34278 [24:21:21<12:47:04, 3.79s/it] 65%|██████▍ | 22146/34278 [24:21:24<12:14:13, 3.63s/it] {'loss': 
0.0885, 'grad_norm': 0.7695792323176357, 'learning_rate': 2.9407555499612383e-06, 'epoch': 0.65} 65%|██████▍ | 22146/34278 [24:21:24<12:14:13, 3.63s/it] 65%|██████▍ | 22147/34278 [24:21:29<14:02:43, 4.17s/it] {'loss': 0.1054, 'grad_norm': 0.7572841970919096, 'learning_rate': 2.9403250525104672e-06, 'epoch': 0.65} 65%|██████▍ | 22147/34278 [24:21:29<14:02:43, 4.17s/it] 65%|██████▍ | 22148/34278 [24:21:32<12:44:56, 3.78s/it] {'loss': 0.1171, 'grad_norm': 0.8386233756755662, 'learning_rate': 2.939894573447983e-06, 'epoch': 0.65} 65%|██████▍ | 22148/34278 [24:21:32<12:44:56, 3.78s/it] 65%|██████▍ | 22149/34278 [24:21:35<12:00:27, 3.56s/it] {'loss': 0.1308, 'grad_norm': 1.1214203074447373, 'learning_rate': 2.939464112777628e-06, 'epoch': 0.65} 65%|██████▍ | 22149/34278 [24:21:35<12:00:27, 3.56s/it] 65%|██████▍ | 22150/34278 [24:21:38<11:32:37, 3.43s/it] {'loss': 0.1036, 'grad_norm': 0.7332803395155896, 'learning_rate': 2.9390336705032452e-06, 'epoch': 0.65} 65%|██████▍ | 22150/34278 [24:21:38<11:32:37, 3.43s/it] 65%|██████▍ | 22151/34278 [24:21:41<11:12:40, 3.33s/it] {'loss': 0.1378, 'grad_norm': 0.9084925279839042, 'learning_rate': 2.9386032466286783e-06, 'epoch': 0.65} 65%|██████▍ | 22151/34278 [24:21:41<11:12:40, 3.33s/it] 65%|██████▍ | 22152/34278 [24:21:45<11:22:35, 3.38s/it] {'loss': 0.1477, 'grad_norm': 1.191419971198691, 'learning_rate': 2.938172841157767e-06, 'epoch': 0.65} 65%|██████▍ | 22152/34278 [24:21:45<11:22:35, 3.38s/it] 65%|██████▍ | 22153/34278 [24:21:48<11:05:05, 3.29s/it] {'loss': 0.1216, 'grad_norm': 0.8901477574469311, 'learning_rate': 2.9377424540943594e-06, 'epoch': 0.65} 65%|██████▍ | 22153/34278 [24:21:48<11:05:05, 3.29s/it] 65%|██████▍ | 22154/34278 [24:21:52<12:10:39, 3.62s/it] {'loss': 0.1337, 'grad_norm': 0.8752770002895012, 'learning_rate': 2.937312085442294e-06, 'epoch': 0.65} 65%|██████▍ | 22154/34278 [24:21:52<12:10:39, 3.62s/it] 65%|██████▍ | 22155/34278 [24:21:56<11:54:27, 3.54s/it] {'loss': 0.1083, 'grad_norm': 0.7788200669650033, 
'learning_rate': 2.9368817352054137e-06, 'epoch': 0.65} 65%|██████▍ | 22155/34278 [24:21:56<11:54:27, 3.54s/it] 65%|██████▍ | 22156/34278 [24:21:59<11:14:08, 3.34s/it] {'loss': 0.1024, 'grad_norm': 0.9631934396815398, 'learning_rate': 2.9364514033875614e-06, 'epoch': 0.65} 65%|██████▍ | 22156/34278 [24:21:59<11:14:08, 3.34s/it] 65%|██████▍ | 22157/34278 [24:22:03<11:51:54, 3.52s/it] {'loss': 0.0926, 'grad_norm': 0.7865690791547649, 'learning_rate': 2.936021089992578e-06, 'epoch': 0.65} 65%|██████▍ | 22157/34278 [24:22:03<11:51:54, 3.52s/it] 65%|██████▍ | 22158/34278 [24:22:06<12:04:06, 3.58s/it] {'loss': 0.1306, 'grad_norm': 0.8685141462870842, 'learning_rate': 2.935590795024304e-06, 'epoch': 0.65} 65%|██████▍ | 22158/34278 [24:22:06<12:04:06, 3.58s/it] 65%|██████▍ | 22159/34278 [24:22:10<11:48:23, 3.51s/it] {'loss': 0.1125, 'grad_norm': 0.8094680277486507, 'learning_rate': 2.935160518486584e-06, 'epoch': 0.65} 65%|██████▍ | 22159/34278 [24:22:10<11:48:23, 3.51s/it] 65%|██████▍ | 22160/34278 [24:22:13<11:12:22, 3.33s/it] {'loss': 0.1466, 'grad_norm': 0.9249522662181008, 'learning_rate': 2.934730260383258e-06, 'epoch': 0.65} 65%|██████▍ | 22160/34278 [24:22:13<11:12:22, 3.33s/it] 65%|██████▍ | 22161/34278 [24:22:19<13:56:50, 4.14s/it] {'loss': 0.1269, 'grad_norm': 0.8095583761245791, 'learning_rate': 2.9343000207181676e-06, 'epoch': 0.65} 65%|██████▍ | 22161/34278 [24:22:19<13:56:50, 4.14s/it] 65%|██████▍ | 22162/34278 [24:22:21<12:41:21, 3.77s/it] {'loss': 0.1555, 'grad_norm': 1.096617108135167, 'learning_rate': 2.9338697994951532e-06, 'epoch': 0.65} 65%|██████▍ | 22162/34278 [24:22:22<12:41:21, 3.77s/it] 65%|██████▍ | 22163/34278 [24:22:25<12:43:06, 3.78s/it] {'loss': 0.1289, 'grad_norm': 0.8291858025448493, 'learning_rate': 2.933439596718056e-06, 'epoch': 0.65} 65%|██████▍ | 22163/34278 [24:22:25<12:43:06, 3.78s/it] 65%|██████▍ | 22164/34278 [24:22:31<15:07:59, 4.50s/it] {'loss': 0.1132, 'grad_norm': 0.7494580678500257, 'learning_rate': 2.933009412390715e-06, 
'epoch': 0.65} 65%|██████▍ | 22164/34278 [24:22:31<15:07:59, 4.50s/it] 65%|██████▍ | 22165/34278 [24:22:35<14:21:00, 4.26s/it] {'loss': 0.1325, 'grad_norm': 0.7315831730628344, 'learning_rate': 2.9325792465169755e-06, 'epoch': 0.65} 65%|██████▍ | 22165/34278 [24:22:35<14:21:00, 4.26s/it] 65%|██████▍ | 22166/34278 [24:22:38<13:22:23, 3.97s/it] {'loss': 0.1274, 'grad_norm': 0.7560886606993555, 'learning_rate': 2.932149099100673e-06, 'epoch': 0.65} 65%|██████▍ | 22166/34278 [24:22:39<13:22:23, 3.97s/it] 65%|██████▍ | 22167/34278 [24:22:41<12:12:31, 3.63s/it] {'loss': 0.0989, 'grad_norm': 0.8554363865744427, 'learning_rate': 2.9317189701456505e-06, 'epoch': 0.65} 65%|██████▍ | 22167/34278 [24:22:41<12:12:31, 3.63s/it] 65%|██████▍ | 22168/34278 [24:22:46<13:16:11, 3.94s/it] {'loss': 0.1083, 'grad_norm': 0.8235054489768502, 'learning_rate': 2.9312888596557476e-06, 'epoch': 0.65} 65%|██████▍ | 22168/34278 [24:22:46<13:16:11, 3.94s/it] 65%|██████▍ | 22169/34278 [24:22:50<13:03:17, 3.88s/it] {'loss': 0.0995, 'grad_norm': 0.7127127157064905, 'learning_rate': 2.930858767634803e-06, 'epoch': 0.65} 65%|██████▍ | 22169/34278 [24:22:50<13:03:17, 3.88s/it] 65%|██████▍ | 22170/34278 [24:22:56<15:13:05, 4.52s/it] {'loss': 0.1464, 'grad_norm': 0.8658708281194661, 'learning_rate': 2.930428694086657e-06, 'epoch': 0.65} 65%|██████▍ | 22170/34278 [24:22:56<15:13:05, 4.52s/it] 65%|██████▍ | 22171/34278 [24:22:59<14:04:21, 4.18s/it] {'loss': 0.119, 'grad_norm': 0.8377800537767242, 'learning_rate': 2.92999863901515e-06, 'epoch': 0.65} 65%|██████▍ | 22171/34278 [24:22:59<14:04:21, 4.18s/it] 65%|██████▍ | 22172/34278 [24:23:05<15:47:50, 4.70s/it] {'loss': 0.1198, 'grad_norm': 0.6426346720352603, 'learning_rate': 2.9295686024241222e-06, 'epoch': 0.65} 65%|██████▍ | 22172/34278 [24:23:05<15:47:50, 4.70s/it] 65%|██████▍ | 22173/34278 [24:23:11<16:59:36, 5.05s/it] {'loss': 0.121, 'grad_norm': 0.6362227121973375, 'learning_rate': 2.9291385843174114e-06, 'epoch': 0.65} 65%|██████▍ | 22173/34278 
[24:23:11<16:59:36, 5.05s/it] 65%|██████▍ | 22174/34278 [24:23:14<15:22:04, 4.57s/it] {'loss': 0.147, 'grad_norm': 0.8023373488833564, 'learning_rate': 2.928708584698856e-06, 'epoch': 0.65} 65%|██████▍ | 22174/34278 [24:23:14<15:22:04, 4.57s/it] 65%|██████▍ | 22175/34278 [24:23:17<13:52:07, 4.13s/it] {'loss': 0.1238, 'grad_norm': 0.9216522972434915, 'learning_rate': 2.9282786035722965e-06, 'epoch': 0.65} 65%|██████▍ | 22175/34278 [24:23:17<13:52:07, 4.13s/it] 65%|██████▍ | 22176/34278 [24:23:22<14:27:32, 4.30s/it] {'loss': 0.1292, 'grad_norm': 0.6761602563199479, 'learning_rate': 2.9278486409415694e-06, 'epoch': 0.65} 65%|██████▍ | 22176/34278 [24:23:22<14:27:32, 4.30s/it] 65%|██████▍ | 22177/34278 [24:23:28<15:38:26, 4.65s/it] {'loss': 0.1237, 'grad_norm': 0.998378761133215, 'learning_rate': 2.9274186968105167e-06, 'epoch': 0.65} 65%|██████▍ | 22177/34278 [24:23:28<15:38:26, 4.65s/it] 65%|██████▍ | 22178/34278 [24:23:33<15:55:54, 4.74s/it] {'loss': 0.0997, 'grad_norm': 0.7092643212091149, 'learning_rate': 2.9269887711829758e-06, 'epoch': 0.65} 65%|██████▍ | 22178/34278 [24:23:33<15:55:54, 4.74s/it] 65%|██████▍ | 22179/34278 [24:23:39<17:08:42, 5.10s/it] {'loss': 0.1045, 'grad_norm': 1.027844812565083, 'learning_rate': 2.926558864062783e-06, 'epoch': 0.65} 65%|██████▍ | 22179/34278 [24:23:39<17:08:42, 5.10s/it] 65%|██████▍ | 22180/34278 [24:23:42<15:39:24, 4.66s/it] {'loss': 0.1292, 'grad_norm': 0.7285614319596585, 'learning_rate': 2.926128975453778e-06, 'epoch': 0.65} 65%|██████▍ | 22180/34278 [24:23:42<15:39:24, 4.66s/it] 65%|██████▍ | 22181/34278 [24:23:45<14:01:41, 4.17s/it] {'loss': 0.125, 'grad_norm': 0.8914201346505192, 'learning_rate': 2.9256991053597995e-06, 'epoch': 0.65} 65%|██████▍ | 22181/34278 [24:23:45<14:01:41, 4.17s/it] 65%|██████▍ | 22182/34278 [24:23:51<15:47:11, 4.70s/it] {'loss': 0.1307, 'grad_norm': 0.9331847810236362, 'learning_rate': 2.9252692537846807e-06, 'epoch': 0.65} 65%|██████▍ | 22182/34278 [24:23:51<15:47:11, 4.70s/it] 65%|██████▍ | 
22183/34278 [24:23:54<14:11:02, 4.22s/it] {'loss': 0.1094, 'grad_norm': 0.7859581084706962, 'learning_rate': 2.924839420732266e-06, 'epoch': 0.65} 65%|██████▍ | 22183/34278 [24:23:54<14:11:02, 4.22s/it] 65%|██████▍ | 22184/34278 [24:23:57<13:13:21, 3.94s/it] {'loss': 0.1319, 'grad_norm': 1.0851130472313715, 'learning_rate': 2.9244096062063887e-06, 'epoch': 0.65} 65%|██████▍ | 22184/34278 [24:23:57<13:13:21, 3.94s/it] 65%|██████▍ | 22185/34278 [24:24:01<12:22:49, 3.69s/it] {'loss': 0.1102, 'grad_norm': 0.843565091800913, 'learning_rate': 2.9239798102108876e-06, 'epoch': 0.65} 65%|██████▍ | 22185/34278 [24:24:01<12:22:49, 3.69s/it] 65%|██████▍ | 22186/34278 [24:24:04<11:58:42, 3.57s/it] {'loss': 0.1146, 'grad_norm': 0.6561390125954926, 'learning_rate': 2.923550032749599e-06, 'epoch': 0.65} 65%|██████▍ | 22186/34278 [24:24:04<11:58:42, 3.57s/it] 65%|██████▍ | 22187/34278 [24:24:07<11:31:17, 3.43s/it] {'loss': 0.1154, 'grad_norm': 0.9459923822715989, 'learning_rate': 2.9231202738263596e-06, 'epoch': 0.65} 65%|██████▍ | 22187/34278 [24:24:07<11:31:17, 3.43s/it] 65%|██████▍ | 22188/34278 [24:24:11<11:44:40, 3.50s/it] {'loss': 0.1258, 'grad_norm': 0.7443980986626273, 'learning_rate': 2.922690533445005e-06, 'epoch': 0.65} 65%|██████▍ | 22188/34278 [24:24:11<11:44:40, 3.50s/it] 65%|██████▍ | 22189/34278 [24:24:13<11:04:43, 3.30s/it] {'loss': 0.1093, 'grad_norm': 0.9890264743048002, 'learning_rate': 2.922260811609375e-06, 'epoch': 0.65} 65%|██████▍ | 22189/34278 [24:24:14<11:04:43, 3.30s/it] 65%|██████▍ | 22190/34278 [24:24:17<11:20:13, 3.38s/it] {'loss': 0.1207, 'grad_norm': 0.7692884800769954, 'learning_rate': 2.9218311083233043e-06, 'epoch': 0.65} 65%|██████▍ | 22190/34278 [24:24:17<11:20:13, 3.38s/it] 65%|██████▍ | 22191/34278 [24:24:23<13:29:37, 4.02s/it] {'loss': 0.1304, 'grad_norm': 0.8194871660919252, 'learning_rate': 2.921401423590631e-06, 'epoch': 0.65} 65%|██████▍ | 22191/34278 [24:24:23<13:29:37, 4.02s/it] 65%|██████▍ | 22192/34278 [24:24:25<12:12:43, 3.64s/it] 
{'loss': 0.1135, 'grad_norm': 0.9914336772632879, 'learning_rate': 2.9209717574151876e-06, 'epoch': 0.65} 65%|██████▍ | 22192/34278 [24:24:25<12:12:43, 3.64s/it] 65%|██████▍ | 22193/34278 [24:24:31<14:41:18, 4.38s/it] {'loss': 0.1224, 'grad_norm': 0.8285517709083561, 'learning_rate': 2.9205421098008125e-06, 'epoch': 0.65} 65%|██████▍ | 22193/34278 [24:24:31<14:41:18, 4.38s/it] 65%|██████▍ | 22194/34278 [24:24:35<14:13:41, 4.24s/it] {'loss': 0.1307, 'grad_norm': 0.7174786091560991, 'learning_rate': 2.9201124807513404e-06, 'epoch': 0.65} 65%|██████▍ | 22194/34278 [24:24:35<14:13:41, 4.24s/it] 65%|██████▍ | 22195/34278 [24:24:39<13:51:45, 4.13s/it] {'loss': 0.1233, 'grad_norm': 0.8336481348727043, 'learning_rate': 2.9196828702706093e-06, 'epoch': 0.65} 65%|██████▍ | 22195/34278 [24:24:39<13:51:45, 4.13s/it] 65%|██████▍ | 22196/34278 [24:24:43<13:29:02, 4.02s/it] {'loss': 0.1339, 'grad_norm': 0.7776439857809159, 'learning_rate': 2.9192532783624503e-06, 'epoch': 0.65} 65%|██████▍ | 22196/34278 [24:24:43<13:29:02, 4.02s/it] 65%|██████▍ | 22197/34278 [24:24:47<13:27:29, 4.01s/it] {'loss': 0.1128, 'grad_norm': 0.8855835433072633, 'learning_rate': 2.9188237050307043e-06, 'epoch': 0.65} 65%|██████▍ | 22197/34278 [24:24:47<13:27:29, 4.01s/it] 65%|██████▍ | 22198/34278 [24:24:51<13:34:06, 4.04s/it] {'loss': 0.1206, 'grad_norm': 0.8987721106669736, 'learning_rate': 2.9183941502792024e-06, 'epoch': 0.65} 65%|██████▍ | 22198/34278 [24:24:51<13:34:06, 4.04s/it] 65%|██████▍ | 22199/34278 [24:24:54<12:29:36, 3.72s/it] {'loss': 0.1129, 'grad_norm': 0.8379566254560279, 'learning_rate': 2.9179646141117796e-06, 'epoch': 0.65} 65%|██████▍ | 22199/34278 [24:24:54<12:29:36, 3.72s/it] 65%|██████▍ | 22200/34278 [24:24:58<12:14:30, 3.65s/it] {'loss': 0.1265, 'grad_norm': 0.9057008007230078, 'learning_rate': 2.917535096532271e-06, 'epoch': 0.65} 65%|██████▍ | 22200/34278 [24:24:58<12:14:30, 3.65s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f40e11938d0>
Failed to fetch sample 3074344. Exception: cannot identify image file <_io.BytesIO object at 0x7f40e11938d0>
65%|██████▍ | 22201/34278 [24:25:01<12:12:59, 3.64s/it] {'loss': 0.1172, 'grad_norm': 0.8331903270729928, 'learning_rate': 2.9171055975445146e-06, 'epoch': 0.65} 65%|██████▍ | 22201/34278 [24:25:01<12:12:59, 3.64s/it] 65%|██████▍ | 22202/34278 [24:25:06<13:00:29, 3.88s/it] {'loss': 0.1283, 'grad_norm': 0.8653612302633052, 'learning_rate': 2.916676117152342e-06, 'epoch': 0.65} 65%|██████▍ | 22202/34278 [24:25:06<13:00:29, 3.88s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
65%|██████▍ | 22203/34278 [24:25:12<15:06:20, 4.50s/it] {'loss': 0.1117, 'grad_norm': 0.7786705846466023, 'learning_rate': 2.9162466553595855e-06, 'epoch': 0.65} 65%|██████▍ | 22203/34278 [24:25:12<15:06:20, 4.50s/it] 65%|██████▍ | 22204/34278 [24:25:15<13:44:06, 4.10s/it] {'loss': 0.1131, 'grad_norm': 0.8807180720103558, 'learning_rate': 2.9158172121700832e-06, 'epoch': 0.65} 65%|██████▍ | 22204/34278 [24:25:15<13:44:06, 4.10s/it] 65%|██████▍ | 22205/34278 [24:25:18<12:51:59, 3.84s/it] {'loss': 0.1184, 'grad_norm': 0.8487521008138452, 'learning_rate': 2.9153877875876676e-06, 'epoch': 0.65} 65%|██████▍ | 22205/34278 [24:25:18<12:51:59, 3.84s/it] 65%|██████▍ | 22206/34278 [24:25:21<11:50:50, 3.53s/it] {'loss': 0.1237, 'grad_norm': 1.059747550967466, 'learning_rate': 2.9149583816161696e-06, 'epoch': 0.65} 65%|██████▍ | 22206/34278 [24:25:21<11:50:50, 3.53s/it] 65%|██████▍ | 22207/34278 [24:25:24<12:01:36, 3.59s/it] {'loss': 0.1162, 'grad_norm': 0.883438067013105, 'learning_rate': 2.9145289942594264e-06, 'epoch': 0.65} 65%|██████▍ | 22207/34278 [24:25:24<12:01:36, 3.59s/it] 65%|██████▍ | 22208/34278 [24:25:28<12:20:28, 3.68s/it] {'loss': 0.1392, 'grad_norm': 0.8808293893054353, 'learning_rate': 2.9140996255212717e-06, 'epoch': 0.65} 65%|██████▍ |
22208/34278 [24:25:28<12:20:28, 3.68s/it] 65%|██████▍ | 22209/34278 [24:25:31<11:44:29, 3.50s/it] {'loss': 0.1249, 'grad_norm': 0.8223902277286501, 'learning_rate': 2.9136702754055378e-06, 'epoch': 0.65} 65%|██████▍ | 22209/34278 [24:25:31<11:44:29, 3.50s/it] 65%|██████▍ | 22210/34278 [24:25:35<11:42:33, 3.49s/it] {'loss': 0.1365, 'grad_norm': 0.7806718802515316, 'learning_rate': 2.9132409439160563e-06, 'epoch': 0.65} 65%|██████▍ | 22210/34278 [24:25:35<11:42:33, 3.49s/it] 65%|██████▍ | 22211/34278 [24:25:38<11:17:39, 3.37s/it] {'loss': 0.1052, 'grad_norm': 0.8111719380116086, 'learning_rate': 2.912811631056663e-06, 'epoch': 0.65} 65%|██████▍ | 22211/34278 [24:25:38<11:17:39, 3.37s/it] 65%|██████▍ | 22212/34278 [24:25:41<11:07:53, 3.32s/it] {'loss': 0.1345, 'grad_norm': 0.8420319153966669, 'learning_rate': 2.9123823368311872e-06, 'epoch': 0.65} 65%|██████▍ | 22212/34278 [24:25:41<11:07:53, 3.32s/it] 65%|██████▍ | 22213/34278 [24:25:47<13:52:40, 4.14s/it] {'loss': 0.1652, 'grad_norm': 0.8771429884939764, 'learning_rate': 2.9119530612434632e-06, 'epoch': 0.65} 65%|██████▍ | 22213/34278 [24:25:47<13:52:40, 4.14s/it] 65%|██████▍ | 22214/34278 [24:25:51<13:15:39, 3.96s/it] {'loss': 0.1281, 'grad_norm': 0.8131677053655297, 'learning_rate': 2.9115238042973263e-06, 'epoch': 0.65} 65%|██████▍ | 22214/34278 [24:25:51<13:15:39, 3.96s/it] 65%|██████▍ | 22215/34278 [24:25:54<12:44:52, 3.80s/it] {'loss': 0.1211, 'grad_norm': 0.7371308360770922, 'learning_rate': 2.9110945659966063e-06, 'epoch': 0.65} 65%|██████▍ | 22215/34278 [24:25:54<12:44:52, 3.80s/it] 65%|██████▍ | 22216/34278 [24:25:58<12:27:42, 3.72s/it] {'loss': 0.1186, 'grad_norm': 0.9234163657485489, 'learning_rate': 2.9106653463451327e-06, 'epoch': 0.65} 65%|██████▍ | 22216/34278 [24:25:58<12:27:42, 3.72s/it] 65%|██████▍ | 22217/34278 [24:26:01<11:50:38, 3.54s/it] {'loss': 0.143, 'grad_norm': 1.0125152645211053, 'learning_rate': 2.9102361453467434e-06, 'epoch': 0.65} 65%|██████▍ | 22217/34278 [24:26:01<11:50:38, 
3.54s/it] 65%|██████▍ | 22218/34278 [24:26:06<13:40:32, 4.08s/it] {'loss': 0.1068, 'grad_norm': 0.8084206160272337, 'learning_rate': 2.909806963005264e-06, 'epoch': 0.65} 65%|██████▍ | 22218/34278 [24:26:06<13:40:32, 4.08s/it] 65%|██████▍ | 22219/34278 [24:26:10<13:34:10, 4.05s/it] {'loss': 0.1231, 'grad_norm': 0.6277330122588356, 'learning_rate': 2.909377799324531e-06, 'epoch': 0.65} 65%|██████▍ | 22219/34278 [24:26:10<13:34:10, 4.05s/it] 65%|██████▍ | 22220/34278 [24:26:16<15:32:05, 4.64s/it] {'loss': 0.1126, 'grad_norm': 0.6551189963208536, 'learning_rate': 2.9089486543083724e-06, 'epoch': 0.65} 65%|██████▍ | 22220/34278 [24:26:16<15:32:05, 4.64s/it] 65%|██████▍ | 22221/34278 [24:26:21<15:35:27, 4.66s/it] {'loss': 0.1136, 'grad_norm': 0.8168921311399865, 'learning_rate': 2.9085195279606226e-06, 'epoch': 0.65} 65%|██████▍ | 22221/34278 [24:26:21<15:35:27, 4.66s/it] 65%|██████▍ | 22222/34278 [24:26:25<14:49:11, 4.43s/it] {'loss': 0.127, 'grad_norm': 0.6861027445830455, 'learning_rate': 2.908090420285112e-06, 'epoch': 0.65} 65%|██████▍ | 22222/34278 [24:26:25<14:49:11, 4.43s/it] 65%|██████▍ | 22223/34278 [24:26:28<13:52:53, 4.15s/it] {'loss': 0.1196, 'grad_norm': 0.7218587338935585, 'learning_rate': 2.9076613312856662e-06, 'epoch': 0.65} 65%|██████▍ | 22223/34278 [24:26:28<13:52:53, 4.15s/it] 65%|██████▍ | 22224/34278 [24:26:31<12:45:59, 3.81s/it] {'loss': 0.1199, 'grad_norm': 0.7896318321198683, 'learning_rate': 2.907232260966124e-06, 'epoch': 0.65} 65%|██████▍ | 22224/34278 [24:26:31<12:45:59, 3.81s/it] 65%|██████▍ | 22225/34278 [24:26:35<12:26:56, 3.72s/it] {'loss': 0.1209, 'grad_norm': 0.837601289541288, 'learning_rate': 2.906803209330313e-06, 'epoch': 0.65} 65%|██████▍ | 22225/34278 [24:26:35<12:26:56, 3.72s/it] 65%|██████▍ | 22226/34278 [24:26:39<13:15:34, 3.96s/it] {'loss': 0.1191, 'grad_norm': 0.7454560796760552, 'learning_rate': 2.906374176382062e-06, 'epoch': 0.65} 65%|██████▍ | 22226/34278 [24:26:39<13:15:34, 3.96s/it] 65%|██████▍ | 22227/34278 
[24:26:42<12:07:36, 3.62s/it] {'loss': 0.1383, 'grad_norm': 0.85999224330815, 'learning_rate': 2.9059451621252035e-06, 'epoch': 0.65} 65%|██████▍ | 22227/34278 [24:26:42<12:07:36, 3.62s/it]
{'loss': 0.1326, 'grad_norm': 0.9845153676921726, 'learning_rate': 2.9055161665635665e-06, 'epoch': 0.65} 65%|██████▍ | 22228/34278 [24:26:45<11:40:18, 3.49s/it]
{'loss': 0.1037, 'grad_norm': 0.8082920375886358, 'learning_rate': 2.9050871897009803e-06, 'epoch': 0.65} 65%|██████▍ | 22229/34278 [24:26:49<11:45:28, 3.51s/it]
{'loss': 0.1245, 'grad_norm': 0.8762478996095815, 'learning_rate': 2.9046582315412753e-06, 'epoch': 0.65} 65%|██████▍ | 22230/34278 [24:26:52<11:06:56, 3.32s/it]
{'loss': 0.1234, 'grad_norm': 0.8291544298446543, 'learning_rate': 2.904229292088283e-06, 'epoch': 0.65} 65%|██████▍ | 22231/34278 [24:26:55<11:02:44, 3.30s/it]
{'loss': 0.1023, 'grad_norm': 0.65625097007985, 'learning_rate': 2.903800371345832e-06, 'epoch': 0.65} 65%|██████▍ | 22232/34278 [24:26:58<10:48:55, 3.23s/it]
{'loss': 0.0977, 'grad_norm': 0.9834768611648416, 'learning_rate': 2.9033714693177476e-06, 'epoch': 0.65} 65%|██████▍ | 22233/34278 [24:27:01<10:37:04, 3.17s/it]
{'loss': 0.1226, 'grad_norm': 0.8343521171507311, 'learning_rate': 2.9029425860078654e-06, 'epoch': 0.65} 65%|██████▍ | 22234/34278 [24:27:04<10:33:03, 3.15s/it]
{'loss': 0.1363, 'grad_norm': 1.1112147369985677, 'learning_rate': 2.9025137214200083e-06, 'epoch': 0.65} 65%|██████▍ | 22235/34278 [24:27:07<10:32:17, 3.15s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd577246480>
Failed to fetch sample 3087947.
Exception: cannot identify image file <_io.BytesIO object at 0x7fd577246480>
{'loss': 0.1277, 'grad_norm': 0.8137001270267556, 'learning_rate': 2.9020848755580105e-06, 'epoch': 0.65} 65%|██████▍ | 22236/34278 [24:27:11<11:27:27, 3.43s/it]
{'loss': 0.1174, 'grad_norm': 0.9032656744694125, 'learning_rate': 2.9016560484256962e-06, 'epoch': 0.65} 65%|██████▍ | 22237/34278 [24:27:15<11:57:42, 3.58s/it]
{'loss': 0.1263, 'grad_norm': 1.0388719305762362, 'learning_rate': 2.9012272400268975e-06, 'epoch': 0.65} 65%|██████▍ | 22238/34278 [24:27:19<11:46:18, 3.52s/it]
{'loss': 0.1392, 'grad_norm': 0.9447980528238217, 'learning_rate': 2.9007984503654413e-06, 'epoch': 0.65} 65%|██████▍ | 22239/34278 [24:27:22<11:19:33, 3.39s/it]
{'loss': 0.1165, 'grad_norm': 0.8465938837144383, 'learning_rate': 2.900369679445153e-06, 'epoch': 0.65} 65%|██████▍ | 22240/34278 [24:27:25<11:13:52, 3.36s/it]
{'loss': 0.1288, 'grad_norm': 1.0450775181172034, 'learning_rate': 2.899940927269863e-06, 'epoch': 0.65} 65%|██████▍ | 22241/34278 [24:27:29<11:30:21, 3.44s/it]
{'loss': 0.134, 'grad_norm': 0.9843836485440016, 'learning_rate': 2.8995121938434013e-06, 'epoch': 0.65} 65%|██████▍ | 22242/34278 [24:27:32<11:01:52, 3.30s/it]
{'loss': 0.1363, 'grad_norm': 0.8441332050784572, 'learning_rate': 2.8990834791695915e-06, 'epoch': 0.65} 65%|██████▍ | 22243/34278 [24:27:35<10:47:11, 3.23s/it]
{'loss': 0.1223, 'grad_norm': 0.8157436532775706, 'learning_rate': 2.898654783252265e-06, 'epoch': 0.65} 65%|██████▍ | 
22244/34278 [24:27:38<10:29:20, 3.14s/it]
{'loss': 0.1162, 'grad_norm': 1.0417248356619506, 'learning_rate': 2.8982261060952464e-06, 'epoch': 0.65} 65%|██████▍ | 22245/34278 [24:27:41<10:19:58, 3.09s/it]
{'loss': 0.1068, 'grad_norm': 0.8010471495555376, 'learning_rate': 2.897797447702362e-06, 'epoch': 0.65} 65%|██████▍ | 22246/34278 [24:27:44<10:55:05, 3.27s/it]
{'loss': 0.1405, 'grad_norm': 0.8060256648823838, 'learning_rate': 2.897368808077439e-06, 'epoch': 0.65} 65%|██████▍ | 22247/34278 [24:27:48<10:55:46, 3.27s/it]
{'loss': 0.1295, 'grad_norm': 0.7801619476066202, 'learning_rate': 2.8969401872243087e-06, 'epoch': 0.65} 65%|██████▍ | 22248/34278 [24:27:51<11:06:55, 3.33s/it]
{'loss': 0.1606, 'grad_norm': 0.9191791421847849, 'learning_rate': 2.8965115851467935e-06, 'epoch': 0.65} 65%|██████▍ | 22249/34278 [24:27:55<12:02:05, 3.60s/it]
{'loss': 0.161, 'grad_norm': 1.120677276714752, 'learning_rate': 2.8960830018487183e-06, 'epoch': 0.65} 65%|██████▍ | 22250/34278 [24:27:59<11:47:40, 3.53s/it]
{'loss': 0.1237, 'grad_norm': 1.481076613089939, 'learning_rate': 2.895654437333915e-06, 'epoch': 0.65} 65%|██████▍ | 22251/34278 [24:28:04<13:50:52, 4.15s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
{'loss': 0.1246, 'grad_norm': 0.7821748784121355, 'learning_rate': 2.895225891606206e-06, 'epoch': 0.65} 65%|██████▍ | 22252/34278 [24:28:10<15:49:00, 4.73s/it]
{'loss': 0.087, 'grad_norm': 0.7540817950701243, 'learning_rate': 2.894797364669414e-06, 'epoch': 0.65} 65%|██████▍ | 22253/34278 [24:28:13<13:54:42, 4.16s/it]
{'loss': 0.11, 'grad_norm': 0.7995361905697157, 'learning_rate': 2.894368856527372e-06, 'epoch': 0.65} 65%|██████▍ | 22254/34278 [24:28:17<13:21:38, 4.00s/it]
{'loss': 0.1057, 'grad_norm': 0.9061402087324139, 'learning_rate': 2.8939403671839027e-06, 'epoch': 0.65} 65%|██████▍ | 22255/34278 [24:28:20<12:18:40, 3.69s/it]
{'loss': 0.122, 'grad_norm': 0.9513215363869687, 'learning_rate': 2.893511896642829e-06, 'epoch': 0.65} 65%|██████▍ | 22256/34278 [24:28:24<12:54:44, 3.87s/it]
{'loss': 0.0895, 'grad_norm': 0.9415533149661969, 'learning_rate': 2.8930834449079803e-06, 'epoch': 0.65} 65%|██████▍ | 22257/34278 [24:28:28<12:29:09, 3.74s/it]
{'loss': 0.1393, 'grad_norm': 0.999627909313702, 'learning_rate': 2.8926550119831798e-06, 'epoch': 0.65} 65%|██████▍ | 22258/34278 [24:28:33<13:56:11, 4.17s/it]
{'loss': 0.1368, 'grad_norm': 0.8792576086743242, 'learning_rate': 2.89222659787225e-06, 'epoch': 0.65} 65%|██████▍ | 22259/34278 [24:28:36<12:31:09, 3.75s/it]
{'loss': 0.1321, 'grad_norm': 1.1220494267093675, 'learning_rate': 2.891798202579018e-06, 'epoch': 0.65} 65%|██████▍ | 22260/34278 [24:28:40<13:03:13, 3.91s/it]
65%|██████▍ 
| 22261/34278 [24:28:43<12:12:32, 3.66s/it] {'loss': 0.1006, 'grad_norm': 1.2925017141032562, 'learning_rate': 2.8913698261073097e-06, 'epoch': 0.65} 65%|██████▍ | 22261/34278 [24:28:43<12:12:32, 3.66s/it] 65%|██████▍ | 22262/34278 [24:28:46<11:20:51, 3.40s/it] {'loss': 0.1339, 'grad_norm': 1.1950285563697882, 'learning_rate': 2.890941468460949e-06, 'epoch': 0.65} 65%|██████▍ | 22262/34278 [24:28:46<11:20:51, 3.40s/it] 65%|██████▍ | 22263/34278 [24:28:50<12:13:15, 3.66s/it] {'loss': 0.1421, 'grad_norm': 0.9762538718653299, 'learning_rate': 2.890513129643757e-06, 'epoch': 0.65} 65%|██████▍ | 22263/34278 [24:28:50<12:13:15, 3.66s/it] 65%|██████▍ | 22264/34278 [24:28:53<11:36:30, 3.48s/it] {'loss': 0.1297, 'grad_norm': 0.818201095500948, 'learning_rate': 2.890084809659563e-06, 'epoch': 0.65} 65%|██████▍ | 22264/34278 [24:28:53<11:36:30, 3.48s/it] 65%|██████▍ | 22265/34278 [24:28:59<14:04:58, 4.22s/it] {'loss': 0.1294, 'grad_norm': 0.8716190450685624, 'learning_rate': 2.8896565085121854e-06, 'epoch': 0.65} 65%|██████▍ | 22265/34278 [24:28:59<14:04:58, 4.22s/it] 65%|██████▍ | 22266/34278 [24:29:02<13:04:39, 3.92s/it] {'loss': 0.1141, 'grad_norm': 0.8326342416327545, 'learning_rate': 2.8892282262054533e-06, 'epoch': 0.65} 65%|██████▍ | 22266/34278 [24:29:02<13:04:39, 3.92s/it] 65%|██████▍ | 22267/34278 [24:29:05<12:00:40, 3.60s/it] {'loss': 0.1527, 'grad_norm': 1.1768202664076068, 'learning_rate': 2.8887999627431853e-06, 'epoch': 0.65} 65%|██████▍ | 22267/34278 [24:29:05<12:00:40, 3.60s/it] 65%|██████▍ | 22268/34278 [24:29:08<11:24:22, 3.42s/it] {'loss': 0.1076, 'grad_norm': 0.7728175670162283, 'learning_rate': 2.8883717181292092e-06, 'epoch': 0.65} 65%|██████▍ | 22268/34278 [24:29:08<11:24:22, 3.42s/it] 65%|██████▍ | 22269/34278 [24:29:11<11:22:02, 3.41s/it] {'loss': 0.1268, 'grad_norm': 0.8384587145072449, 'learning_rate': 2.8879434923673465e-06, 'epoch': 0.65} 65%|██████▍ | 22269/34278 [24:29:11<11:22:02, 3.41s/it] 65%|██████▍ | 22270/34278 [24:29:14<10:56:28, 
3.28s/it] {'loss': 0.1324, 'grad_norm': 0.9827271946730417, 'learning_rate': 2.887515285461418e-06, 'epoch': 0.65} 65%|██████▍ | 22270/34278 [24:29:14<10:56:28, 3.28s/it] 65%|██████▍ | 22271/34278 [24:29:19<11:55:01, 3.57s/it] {'loss': 0.1143, 'grad_norm': 0.7941963284896321, 'learning_rate': 2.8870870974152485e-06, 'epoch': 0.65} 65%|██████▍ | 22271/34278 [24:29:19<11:55:01, 3.57s/it] 65%|██████▍ | 22272/34278 [24:29:22<12:07:00, 3.63s/it] {'loss': 0.1311, 'grad_norm': 0.8922433027313872, 'learning_rate': 2.8866589282326633e-06, 'epoch': 0.65} 65%|██████▍ | 22272/34278 [24:29:22<12:07:00, 3.63s/it] 65%|██████▍ | 22273/34278 [24:29:26<11:52:07, 3.56s/it] {'loss': 0.1082, 'grad_norm': 0.8916129357864748, 'learning_rate': 2.886230777917481e-06, 'epoch': 0.65} 65%|██████▍ | 22273/34278 [24:29:26<11:52:07, 3.56s/it] 65%|██████▍ | 22274/34278 [24:29:29<11:23:27, 3.42s/it] {'loss': 0.1031, 'grad_norm': 0.8046660369678841, 'learning_rate': 2.8858026464735275e-06, 'epoch': 0.65} 65%|██████▍ | 22274/34278 [24:29:29<11:23:27, 3.42s/it] 65%|██████▍ | 22275/34278 [24:29:32<10:55:49, 3.28s/it] {'loss': 0.1054, 'grad_norm': 0.7749585521185539, 'learning_rate': 2.885374533904623e-06, 'epoch': 0.65} 65%|██████▍ | 22275/34278 [24:29:32<10:55:49, 3.28s/it] 65%|██████▍ | 22276/34278 [24:29:38<13:46:33, 4.13s/it] {'loss': 0.1207, 'grad_norm': 0.7019637526795982, 'learning_rate': 2.8849464402145878e-06, 'epoch': 0.65} 65%|██████▍ | 22276/34278 [24:29:38<13:46:33, 4.13s/it] 65%|██████▍ | 22277/34278 [24:29:41<12:38:19, 3.79s/it] {'loss': 0.1156, 'grad_norm': 0.8911156397946876, 'learning_rate': 2.8845183654072463e-06, 'epoch': 0.65} 65%|██████▍ | 22277/34278 [24:29:41<12:38:19, 3.79s/it] 65%|██████▍ | 22278/34278 [24:29:47<14:43:00, 4.42s/it] {'loss': 0.1494, 'grad_norm': 0.8485200099235773, 'learning_rate': 2.8840903094864213e-06, 'epoch': 0.65} 65%|██████▍ | 22278/34278 [24:29:47<14:43:00, 4.42s/it] 65%|██████▍ | 22279/34278 [24:29:50<13:19:42, 4.00s/it] {'loss': 0.1099, 'grad_norm': 
0.7812087982819907, 'learning_rate': 2.8836622724559332e-06, 'epoch': 0.65} 65%|██████▍ | 22279/34278 [24:29:50<13:19:42, 4.00s/it]
{'loss': 0.1353, 'grad_norm': 0.9901185834870755, 'learning_rate': 2.8832342543196013e-06, 'epoch': 0.65} 65%|██████▍ | 22280/34278 [24:29:53<12:12:23, 3.66s/it]
{'loss': 0.1255, 'grad_norm': 0.8845551968744959, 'learning_rate': 2.882806255081251e-06, 'epoch': 0.65} 65%|██████▌ | 22281/34278 [24:29:57<12:56:25, 3.88s/it]
{'loss': 0.0956, 'grad_norm': 0.7359573079446841, 'learning_rate': 2.8823782747447002e-06, 'epoch': 0.65} 65%|██████▌ | 22282/34278 [24:30:00<11:56:22, 3.58s/it]
{'loss': 0.141, 'grad_norm': 1.1051803047028599, 'learning_rate': 2.881950313313767e-06, 'epoch': 0.65} 65%|██████▌ | 22283/34278 [24:30:05<13:26:52, 4.04s/it]
{'loss': 0.109, 'grad_norm': 1.2176424935196708, 'learning_rate': 2.88152237079228e-06, 'epoch': 0.65} 65%|██████▌ | 22284/34278 [24:30:08<12:33:51, 3.77s/it]
{'loss': 0.1254, 'grad_norm': 0.8490213342693431, 'learning_rate': 2.8810944471840553e-06, 'epoch': 0.65} 65%|██████▌ | 22285/34278 [24:30:12<12:14:54, 3.68s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
{'loss': 0.1262, 'grad_norm': 0.7544935484334007, 'learning_rate': 2.8806665424929115e-06, 'epoch': 0.65} 65%|██████▌ | 22286/34278 [24:30:18<14:43:09, 4.42s/it]
{'loss': 0.1321, 'grad_norm': 1.1875890504706237, 'learning_rate': 2.8802386567226724e-06, 'epoch': 0.65} 65%|██████▌ | 22287/34278 [24:30:21<13:32:24, 4.07s/it]
{'loss': 0.1198, 'grad_norm': 0.8672871483778205, 'learning_rate': 2.8798107898771577e-06, 'epoch': 0.65} 65%|██████▌ | 22288/34278 [24:30:24<12:31:11, 3.76s/it]
{'loss': 0.1107, 'grad_norm': 0.6797266463043273, 'learning_rate': 2.879382941960183e-06, 'epoch': 0.65} 65%|██████▌ | 22289/34278 [24:30:28<12:13:38, 3.67s/it]
{'loss': 0.1037, 'grad_norm': 0.7605994544537991, 'learning_rate': 2.878955112975572e-06, 'epoch': 0.65} 65%|██████▌ | 22290/34278 [24:30:32<13:17:53, 3.99s/it]
{'loss': 0.1225, 'grad_norm': 0.8885139352006814, 'learning_rate': 2.8785273029271447e-06, 'epoch': 0.65} 65%|██████▌ | 22291/34278 [24:30:36<12:36:05, 3.78s/it]
{'loss': 0.1145, 'grad_norm': 0.8725811308371497, 'learning_rate': 2.87809951181872e-06, 'epoch': 0.65} 65%|██████▌ | 22292/34278 [24:30:39<12:13:23, 3.67s/it]
{'loss': 0.1336, 'grad_norm': 0.946947772713514, 'learning_rate': 2.8776717396541145e-06, 'epoch': 0.65} 65%|██████▌ | 22293/34278 [24:30:42<11:48:15, 3.55s/it]
{'loss': 0.1463, 'grad_norm': 0.9082505042413088, 'learning_rate': 2.8772439864371497e-06, 'epoch': 0.65} 65%|██████▌ | 22294/34278 [24:30:46<11:42:29, 3.52s/it]
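The recurring `torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True` is emitted by reentrant activation checkpointing when no checkpointed input carries gradients, which is common when an embedding layer or vision tower is frozen. A minimal sketch of the two usual remedies, assuming a recent PyTorch with the `use_reentrant` keyword (not code from this repository):

```python
import torch
import torch.utils.checkpoint as cp

layer = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # stand-in for frozen features: requires_grad=False

# Remedy A: non-reentrant checkpointing tracks parameter gradients even
# when no input requires grad, so the warning does not apply.
y = cp.checkpoint(layer, x, use_reentrant=False)
y.sum().backward()
assert layer.weight.grad is not None

# Remedy B: force the checkpointed input to require grad so the
# reentrant path records a backward graph (this is what HF Transformers'
# model.enable_input_require_grads() does under the hood).
layer.zero_grad()
x = x.detach().requires_grad_(True)
y = cp.checkpoint(layer, x, use_reentrant=True)
y.sum().backward()
assert layer.weight.grad is not None
```

Since the warning only says the checkpointed segment's *input* gradients will be `None`, training can still be correct if that segment feeds from frozen modules; the remedies above matter when gradients must flow through the checkpointed boundary.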
65%|██████▌ | 22295/34278 [24:30:49<11:19:15, 3.40s/it] {'loss': 0.1227, 'grad_norm': 0.8276712450899966, 'learning_rate': 2.8768162521716426e-06, 'epoch': 0.65} 65%|██████▌ | 22295/34278 [24:30:49<11:19:15, 3.40s/it] 65%|██████▌ | 22296/34278 [24:30:55<13:54:36, 4.18s/it] {'loss': 0.0924, 'grad_norm': 0.8256386614093127, 'learning_rate': 2.876388536861415e-06, 'epoch': 0.65} 65%|██████▌ | 22296/34278 [24:30:55<13:54:36, 4.18s/it] 65%|██████▌ | 22297/34278 [24:31:00<14:22:04, 4.32s/it] {'loss': 0.1032, 'grad_norm': 0.8900106813547162, 'learning_rate': 2.875960840510282e-06, 'epoch': 0.65} 65%|██████▌ | 22297/34278 [24:31:00<14:22:04, 4.32s/it] 65%|██████▌ | 22298/34278 [24:31:03<13:03:59, 3.93s/it] {'loss': 0.1373, 'grad_norm': 1.0424582409899887, 'learning_rate': 2.8755331631220654e-06, 'epoch': 0.65} 65%|██████▌ | 22298/34278 [24:31:03<13:03:59, 3.93s/it] 65%|██████▌ | 22299/34278 [24:31:09<15:08:46, 4.55s/it] {'loss': 0.1181, 'grad_norm': 0.9105610222122973, 'learning_rate': 2.8751055047005817e-06, 'epoch': 0.65} 65%|██████▌ | 22299/34278 [24:31:09<15:08:46, 4.55s/it] 65%|██████▌ | 22300/34278 [24:31:12<13:58:46, 4.20s/it] {'loss': 0.1198, 'grad_norm': 0.8846220500941803, 'learning_rate': 2.8746778652496467e-06, 'epoch': 0.65} 65%|██████▌ | 22300/34278 [24:31:12<13:58:46, 4.20s/it] 65%|██████▌ | 22301/34278 [24:31:15<12:53:01, 3.87s/it] {'loss': 0.1324, 'grad_norm': 1.130978155378496, 'learning_rate': 2.8742502447730803e-06, 'epoch': 0.65} 65%|██████▌ | 22301/34278 [24:31:15<12:53:01, 3.87s/it] 65%|██████▌ | 22302/34278 [24:31:21<14:59:26, 4.51s/it] {'loss': 0.0988, 'grad_norm': 0.8745998816816701, 'learning_rate': 2.8738226432747025e-06, 'epoch': 0.65} 65%|██████▌ | 22302/34278 [24:31:21<14:59:26, 4.51s/it] 65%|██████▌ | 22303/34278 [24:31:24<13:34:07, 4.08s/it] {'loss': 0.1363, 'grad_norm': 0.9903151874375687, 'learning_rate': 2.873395060758326e-06, 'epoch': 0.65} 65%|██████▌ | 22303/34278 [24:31:24<13:34:07, 4.08s/it] 65%|██████▌ | 22304/34278 
[24:31:30<15:07:18, 4.55s/it] {'loss': 0.1336, 'grad_norm': 1.0525509613535189, 'learning_rate': 2.872967497227773e-06, 'epoch': 0.65} 65%|██████▌ | 22304/34278 [24:31:30<15:07:18, 4.55s/it] 65%|██████▌ | 22305/34278 [24:31:33<14:15:31, 4.29s/it] {'loss': 0.1295, 'grad_norm': 0.932936869396191, 'learning_rate': 2.872539952686859e-06, 'epoch': 0.65} 65%|██████▌ | 22305/34278 [24:31:33<14:15:31, 4.29s/it] 65%|██████▌ | 22306/34278 [24:31:40<16:07:24, 4.85s/it] {'loss': 0.103, 'grad_norm': 0.9742411972903376, 'learning_rate': 2.8721124271393973e-06, 'epoch': 0.65} 65%|██████▌ | 22306/34278 [24:31:40<16:07:24, 4.85s/it] 65%|██████▌ | 22307/34278 [24:31:43<14:18:19, 4.30s/it] {'loss': 0.1104, 'grad_norm': 0.7703058739276191, 'learning_rate': 2.8716849205892087e-06, 'epoch': 0.65} 65%|██████▌ | 22307/34278 [24:31:43<14:18:19, 4.30s/it] 65%|██████▌ | 22308/34278 [24:31:48<15:40:50, 4.72s/it] {'loss': 0.1156, 'grad_norm': 0.9885490723901301, 'learning_rate': 2.8712574330401112e-06, 'epoch': 0.65} 65%|██████▌ | 22308/34278 [24:31:48<15:40:50, 4.72s/it] 65%|██████▌ | 22309/34278 [24:31:52<14:37:37, 4.40s/it] {'loss': 0.1092, 'grad_norm': 1.0284538432567505, 'learning_rate': 2.8708299644959187e-06, 'epoch': 0.65} 65%|██████▌ | 22309/34278 [24:31:52<14:37:37, 4.40s/it] 65%|██████▌ | 22310/34278 [24:31:55<13:10:25, 3.96s/it] {'loss': 0.0943, 'grad_norm': 0.8335080222930304, 'learning_rate': 2.8704025149604465e-06, 'epoch': 0.65} 65%|██████▌ | 22310/34278 [24:31:55<13:10:25, 3.96s/it] 65%|██████▌ | 22311/34278 [24:31:59<12:56:57, 3.90s/it] {'loss': 0.1068, 'grad_norm': 0.7431695653097307, 'learning_rate': 2.8699750844375136e-06, 'epoch': 0.65} 65%|██████▌ | 22311/34278 [24:31:59<12:56:57, 3.90s/it] 65%|██████▌ | 22312/34278 [24:32:01<11:54:07, 3.58s/it] {'loss': 0.1296, 'grad_norm': 0.9368502487206658, 'learning_rate': 2.8695476729309345e-06, 'epoch': 0.65} 65%|██████▌ | 22312/34278 [24:32:01<11:54:07, 3.58s/it] 65%|██████▌ | 22313/34278 [24:32:07<14:11:58, 4.27s/it] {'loss': 
0.1136, 'grad_norm': 0.668798187950919, 'learning_rate': 2.869120280444522e-06, 'epoch': 0.65} 65%|██████▌ | 22313/34278 [24:32:07<14:11:58, 4.27s/it] 65%|██████▌ | 22314/34278 [24:32:13<15:03:42, 4.53s/it] {'loss': 0.1578, 'grad_norm': 1.0603467585020838, 'learning_rate': 2.868692906982099e-06, 'epoch': 0.65} 65%|██████▌ | 22314/34278 [24:32:13<15:03:42, 4.53s/it] 65%|██████▌ | 22315/34278 [24:32:17<14:59:25, 4.51s/it] {'loss': 0.1064, 'grad_norm': 0.9109501761270273, 'learning_rate': 2.868265552547477e-06, 'epoch': 0.65} 65%|██████▌ | 22315/34278 [24:32:17<14:59:25, 4.51s/it] 65%|██████▌ | 22316/34278 [24:32:20<13:39:03, 4.11s/it] {'loss': 0.1272, 'grad_norm': 0.8593863084108118, 'learning_rate': 2.8678382171444686e-06, 'epoch': 0.65} 65%|██████▌ | 22316/34278 [24:32:20<13:39:03, 4.11s/it] 65%|██████▌ | 22317/34278 [24:32:23<12:21:28, 3.72s/it] {'loss': 0.1226, 'grad_norm': 1.026785609717716, 'learning_rate': 2.8674109007768935e-06, 'epoch': 0.65} 65%|██████▌ | 22317/34278 [24:32:23<12:21:28, 3.72s/it] 65%|██████▌ | 22318/34278 [24:32:26<11:41:59, 3.52s/it] {'loss': 0.1079, 'grad_norm': 0.945703504374666, 'learning_rate': 2.8669836034485655e-06, 'epoch': 0.65} 65%|██████▌ | 22318/34278 [24:32:26<11:41:59, 3.52s/it] 65%|██████▌ | 22319/34278 [24:32:29<11:23:57, 3.43s/it] {'loss': 0.1226, 'grad_norm': 0.7709154267247766, 'learning_rate': 2.866556325163296e-06, 'epoch': 0.65} 65%|██████▌ | 22319/34278 [24:32:29<11:23:57, 3.43s/it] 65%|██████▌ | 22320/34278 [24:32:33<11:21:39, 3.42s/it] {'loss': 0.1164, 'grad_norm': 0.7851121998357765, 'learning_rate': 2.866129065924903e-06, 'epoch': 0.65} 65%|██████▌ | 22320/34278 [24:32:33<11:21:39, 3.42s/it] 65%|██████▌ | 22321/34278 [24:32:36<11:00:11, 3.31s/it] {'loss': 0.1214, 'grad_norm': 0.7446738456615879, 'learning_rate': 2.8657018257372017e-06, 'epoch': 0.65} 65%|██████▌ | 22321/34278 [24:32:36<11:00:11, 3.31s/it] 65%|██████▌ | 22322/34278 [24:32:39<10:56:07, 3.29s/it] {'loss': 0.1531, 'grad_norm': 1.1456257114106072, 
'learning_rate': 2.8652746046040053e-06, 'epoch': 0.65} 65%|██████▌ | 22322/34278 [24:32:39<10:56:07, 3.29s/it]
{'loss': 0.1209, 'grad_norm': 0.8452625852629456, 'learning_rate': 2.8648474025291257e-06, 'epoch': 0.65} 65%|██████▌ | 22323/34278 [24:32:42<10:36:40, 3.20s/it]
{'loss': 0.1125, 'grad_norm': 0.7001595159123495, 'learning_rate': 2.8644202195163807e-06, 'epoch': 0.65} 65%|██████▌ | 22324/34278 [24:32:45<10:42:13, 3.22s/it]
{'loss': 0.123, 'grad_norm': 0.9826576706862223, 'learning_rate': 2.86399305556958e-06, 'epoch': 0.65} 65%|██████▌ | 22325/34278 [24:32:49<10:57:32, 3.30s/it]
{'loss': 0.1181, 'grad_norm': 0.9083825273559737, 'learning_rate': 2.8635659106925415e-06, 'epoch': 0.65} 65%|██████▌ | 22326/34278 [24:32:55<13:36:31, 4.10s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
{'loss': 0.1312, 'grad_norm': 0.8339662508236567, 'learning_rate': 2.8631387848890744e-06, 'epoch': 0.65} 65%|██████▌ | 22327/34278 [24:32:58<12:23:15, 3.73s/it]
{'loss': 0.1097, 'grad_norm': 1.1447321900342426, 'learning_rate': 2.8627116781629966e-06, 'epoch': 0.65} 65%|██████▌ | 22328/34278 [24:33:01<11:48:00, 3.55s/it]
{'loss': 0.1135, 'grad_norm': 0.7953141510279437, 'learning_rate': 2.8622845905181185e-06, 'epoch': 0.65} 65%|██████▌ | 22329/34278 [24:33:06<13:51:49, 4.18s/it]
{'loss': 0.1194, 'grad_norm': 0.8991002693315734, 'learning_rate': 2.8618575219582514e-06, 'epoch': 0.65} 65%|██████▌ | 22330/34278 [24:33:10<13:16:06, 4.00s/it]
{'loss': 0.1043, 'grad_norm': 0.7034638652643109, 'learning_rate': 2.8614304724872094e-06, 'epoch': 0.65} 65%|██████▌ | 22331/34278 [24:33:15<14:29:01, 4.36s/it]
{'loss': 0.105, 'grad_norm': 1.0133834699574427, 'learning_rate': 2.8610034421088084e-06, 'epoch': 0.65} 65%|██████▌ | 22332/34278 [24:33:18<13:04:59, 3.94s/it]
{'loss': 0.1279, 'grad_norm': 0.815816599781631, 'learning_rate': 2.8605764308268554e-06, 'epoch': 0.65} 65%|██████▌ | 22333/34278 [24:33:21<12:11:53, 3.68s/it]
{'loss': 0.1249, 'grad_norm': 0.9157008933986431, 'learning_rate': 2.860149438645168e-06, 'epoch': 0.65} 65%|██████▌ | 22334/34278 [24:33:26<13:14:32, 3.99s/it]
{'loss': 0.1009, 'grad_norm': 0.8890247586150002, 'learning_rate': 2.859722465567555e-06, 'epoch': 0.65} 65%|██████▌ | 22335/34278 [24:33:29<12:48:59, 3.86s/it]
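The `UnidentifiedImageError` tracebacks earlier in this log are each followed by `Failed to fetch sample N.` / `Exception: ...`, and the step counter keeps advancing, which indicates the dataset swallows corrupt-image samples and substitutes another index rather than aborting the run. A self-contained sketch of that skip-and-retry pattern; the names `RetryOnBadSample` and `fetch_fn` are illustrative, not taken from `dataset.py`:

```python
import random

class RetryOnBadSample:
    """Skip-and-retry wrapper: when fetching or decoding a sample fails
    (e.g. PIL cannot identify the image bytes), log the failure and draw
    a different random index instead of crashing the training job."""

    def __init__(self, fetch_fn, num_samples, max_retries=10):
        self.fetch_fn = fetch_fn        # underlying per-index fetch, may raise
        self.num_samples = num_samples
        self.max_retries = max_retries

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.fetch_fn(i)
            except Exception as e:
                # Mirrors the log's "Failed to fetch sample N. / Exception: ..."
                print(f"Failed to fetch sample {i}.")
                print(f"Exception: {e}")
                i = random.randrange(self.num_samples)
        raise RuntimeError(f"giving up after {self.max_retries} bad samples")
```

A bounded retry count is important: silently resampling forever would mask a systematically broken shard, whereas raising after a handful of consecutive failures surfaces storage-side corruption (here, unreadable bytes coming back from the Ceph loader).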
65%|██████▌ | 22336/34278 [24:33:33<12:06:39, 3.65s/it] {'loss': 0.1254, 'grad_norm': 0.9735215141512003, 'learning_rate': 2.8592955115978268e-06, 'epoch': 0.65} 65%|██████▌ | 22336/34278 [24:33:33<12:06:39, 3.65s/it] 65%|██████▌ | 22337/34278 [24:33:36<12:05:03, 3.64s/it] {'loss': 0.1311, 'grad_norm': 0.877890120340445, 'learning_rate': 2.858868576739797e-06, 'epoch': 0.65} 65%|██████▌ | 22337/34278 [24:33:36<12:05:03, 3.64s/it] 65%|██████▌ | 22338/34278 [24:33:40<12:10:21, 3.67s/it] {'loss': 0.1164, 'grad_norm': 0.7703131794966079, 'learning_rate': 2.85844166099728e-06, 'epoch': 0.65} 65%|██████▌ | 22338/34278 [24:33:40<12:10:21, 3.67s/it] 65%|██████▌ | 22339/34278 [24:33:44<12:09:04, 3.66s/it] {'loss': 0.146, 'grad_norm': 1.033143415958569, 'learning_rate': 2.8580147643740847e-06, 'epoch': 0.65} 65%|██████▌ | 22339/34278 [24:33:44<12:09:04, 3.66s/it] 65%|██████▌ | 22340/34278 [24:33:47<11:39:00, 3.51s/it] {'loss': 0.1347, 'grad_norm': 0.8663714746089348, 'learning_rate': 2.8575878868740197e-06, 'epoch': 0.65} 65%|██████▌ | 22340/34278 [24:33:47<11:39:00, 3.51s/it] 65%|██████▌ | 22341/34278 [24:33:50<11:12:01, 3.38s/it] {'loss': 0.1191, 'grad_norm': 0.9090693280832018, 'learning_rate': 2.857161028500901e-06, 'epoch': 0.65} 65%|██████▌ | 22341/34278 [24:33:50<11:12:01, 3.38s/it] 65%|██████▌ | 22342/34278 [24:33:56<13:47:37, 4.16s/it] {'loss': 0.1088, 'grad_norm': 0.6568736409347923, 'learning_rate': 2.8567341892585373e-06, 'epoch': 0.65} 65%|██████▌ | 22342/34278 [24:33:56<13:47:37, 4.16s/it] 65%|██████▌ | 22343/34278 [24:33:59<12:56:13, 3.90s/it] {'loss': 0.1181, 'grad_norm': 0.9747536446220381, 'learning_rate': 2.8563073691507346e-06, 'epoch': 0.65} 65%|██████▌ | 22343/34278 [24:33:59<12:56:13, 3.90s/it] 65%|██████▌ | 22344/34278 [24:34:02<12:25:41, 3.75s/it] {'loss': 0.1284, 'grad_norm': 0.8898675146297077, 'learning_rate': 2.8558805681813123e-06, 'epoch': 0.65} 65%|██████▌ | 22344/34278 [24:34:02<12:25:41, 3.75s/it] 65%|██████▌ | 22345/34278 
[24:34:06<11:59:52, 3.62s/it] {'loss': 0.1247, 'grad_norm': 0.7639550982067947, 'learning_rate': 2.8554537863540766e-06, 'epoch': 0.65} 65%|██████▌ | 22345/34278 [24:34:06<11:59:52, 3.62s/it] 65%|██████▌ | 22346/34278 [24:34:11<13:58:24, 4.22s/it] {'loss': 0.1101, 'grad_norm': 0.9532939328691233, 'learning_rate': 2.855027023672835e-06, 'epoch': 0.65} 65%|██████▌ | 22346/34278 [24:34:11<13:58:24, 4.22s/it] 65%|██████▌ | 22347/34278 [24:34:15<13:06:52, 3.96s/it] {'loss': 0.1437, 'grad_norm': 0.8518130061041179, 'learning_rate': 2.854600280141403e-06, 'epoch': 0.65} 65%|██████▌ | 22347/34278 [24:34:15<13:06:52, 3.96s/it] 65%|██████▌ | 22348/34278 [24:34:21<15:05:34, 4.55s/it] {'loss': 0.1003, 'grad_norm': 0.7728245534145937, 'learning_rate': 2.8541735557635863e-06, 'epoch': 0.65} 65%|██████▌ | 22348/34278 [24:34:21<15:05:34, 4.55s/it] 65%|██████▌ | 22349/34278 [24:34:24<14:07:55, 4.26s/it] {'loss': 0.1256, 'grad_norm': 0.9433016687814527, 'learning_rate': 2.853746850543195e-06, 'epoch': 0.65} 65%|██████▌ | 22349/34278 [24:34:24<14:07:55, 4.26s/it] 65%|██████▌ | 22350/34278 [24:34:28<13:50:08, 4.18s/it] {'loss': 0.0998, 'grad_norm': 0.8137539866414318, 'learning_rate': 2.8533201644840392e-06, 'epoch': 0.65} 65%|██████▌ | 22350/34278 [24:34:28<13:50:08, 4.18s/it] 65%|██████▌ | 22351/34278 [24:34:31<12:51:07, 3.88s/it] {'loss': 0.1085, 'grad_norm': 0.5994125075179363, 'learning_rate': 2.8528934975899303e-06, 'epoch': 0.65} 65%|██████▌ | 22351/34278 [24:34:31<12:51:07, 3.88s/it] 65%|██████▌ | 22352/34278 [24:34:37<14:09:26, 4.27s/it] {'loss': 0.141, 'grad_norm': 0.8790534188690425, 'learning_rate': 2.8524668498646755e-06, 'epoch': 0.65} 65%|██████▌ | 22352/34278 [24:34:37<14:09:26, 4.27s/it] 65%|██████▌ | 22353/34278 [24:34:40<12:56:56, 3.91s/it] {'loss': 0.1224, 'grad_norm': 1.018077477680946, 'learning_rate': 2.852040221312082e-06, 'epoch': 0.65} 65%|██████▌ | 22353/34278 [24:34:40<12:56:56, 3.91s/it] 65%|██████▌ | 22354/34278 [24:34:43<12:07:16, 3.66s/it] {'loss': 
0.1205, 'grad_norm': 0.8866096984268983, 'learning_rate': 2.851613611935963e-06, 'epoch': 0.65} 65%|██████▌ | 22354/34278 [24:34:43<12:07:16, 3.66s/it] 65%|██████▌ | 22355/34278 [24:34:46<11:23:50, 3.44s/it] {'loss': 0.1191, 'grad_norm': 0.697688642917243, 'learning_rate': 2.8511870217401227e-06, 'epoch': 0.65} 65%|██████▌ | 22355/34278 [24:34:46<11:23:50, 3.44s/it] 65%|██████▌ | 22356/34278 [24:34:49<11:15:48, 3.40s/it] {'loss': 0.1152, 'grad_norm': 0.919552829781995, 'learning_rate': 2.8507604507283736e-06, 'epoch': 0.65} 65%|██████▌ | 22356/34278 [24:34:49<11:15:48, 3.40s/it] 65%|██████▌ | 22357/34278 [24:34:52<10:57:40, 3.31s/it] {'loss': 0.092, 'grad_norm': 0.9177034296130838, 'learning_rate': 2.8503338989045202e-06, 'epoch': 0.65} 65%|██████▌ | 22357/34278 [24:34:52<10:57:40, 3.31s/it] 65%|██████▌ | 22358/34278 [24:34:56<11:27:19, 3.46s/it] {'loss': 0.1375, 'grad_norm': 1.0123413429136259, 'learning_rate': 2.8499073662723743e-06, 'epoch': 0.65} 65%|██████▌ | 22358/34278 [24:34:56<11:27:19, 3.46s/it] 65%|██████▌ | 22359/34278 [24:35:00<11:56:05, 3.60s/it] {'loss': 0.1055, 'grad_norm': 0.8239520774706534, 'learning_rate': 2.8494808528357424e-06, 'epoch': 0.65} 65%|██████▌ | 22359/34278 [24:35:00<11:56:05, 3.60s/it] 65%|██████▌ | 22360/34278 [24:35:03<11:39:16, 3.52s/it] {'loss': 0.114, 'grad_norm': 0.8823536883204104, 'learning_rate': 2.8490543585984303e-06, 'epoch': 0.65} 65%|██████▌ | 22360/34278 [24:35:03<11:39:16, 3.52s/it] 65%|██████▌ | 22361/34278 [24:35:07<12:06:32, 3.66s/it] {'loss': 0.1116, 'grad_norm': 0.8723999640850891, 'learning_rate': 2.8486278835642474e-06, 'epoch': 0.65} 65%|██████▌ | 22361/34278 [24:35:07<12:06:32, 3.66s/it] 65%|██████▌ | 22362/34278 [24:35:11<12:35:26, 3.80s/it] {'loss': 0.1235, 'grad_norm': 0.8112371910259645, 'learning_rate': 2.848201427737003e-06, 'epoch': 0.65} 65%|██████▌ | 22362/34278 [24:35:11<12:35:26, 3.80s/it] 65%|██████▌ | 22363/34278 [24:35:15<12:20:30, 3.73s/it] {'loss': 0.1195, 'grad_norm': 0.8758281179562669, 
'learning_rate': 2.8477749911205007e-06, 'epoch': 0.65} 65%|██████▌ | 22363/34278 [24:35:15<12:20:30, 3.73s/it] 65%|██████▌ | 22364/34278 [24:35:19<12:26:02, 3.76s/it] {'loss': 0.118, 'grad_norm': 0.7635111746276324, 'learning_rate': 2.8473485737185513e-06, 'epoch': 0.65} 65%|██████▌ | 22364/34278 [24:35:19<12:26:02, 3.76s/it] 65%|██████▌ | 22365/34278 [24:35:25<14:58:57, 4.53s/it] {'loss': 0.1204, 'grad_norm': 0.6954318845365827, 'learning_rate': 2.8469221755349596e-06, 'epoch': 0.65} 65%|██████▌ | 22365/34278 [24:35:25<14:58:57, 4.53s/it] 65%|██████▌ | 22366/34278 [24:35:31<16:37:37, 5.02s/it] {'loss': 0.1341, 'grad_norm': 1.0995960484615341, 'learning_rate': 2.8464957965735317e-06, 'epoch': 0.65} 65%|██████▌ | 22366/34278 [24:35:31<16:37:37, 5.02s/it] 65%|██████▌ | 22367/34278 [24:35:35<15:37:06, 4.72s/it] {'loss': 0.1214, 'grad_norm': 0.8960672492372462, 'learning_rate': 2.846069436838075e-06, 'epoch': 0.65} 65%|██████▌ | 22367/34278 [24:35:35<15:37:06, 4.72s/it] 65%|██████▌ | 22368/34278 [24:35:41<16:49:49, 5.09s/it] {'loss': 0.1361, 'grad_norm': 0.9486649441543449, 'learning_rate': 2.8456430963323977e-06, 'epoch': 0.65} 65%|██████▌ | 22368/34278 [24:35:41<16:49:49, 5.09s/it] 65%|██████▌ | 22369/34278 [24:35:44<14:41:16, 4.44s/it] {'loss': 0.1102, 'grad_norm': 0.7028028926129187, 'learning_rate': 2.8452167750603044e-06, 'epoch': 0.65} 65%|██████▌ | 22369/34278 [24:35:44<14:41:16, 4.44s/it] 65%|██████▌ | 22370/34278 [24:35:47<13:05:52, 3.96s/it] {'loss': 0.1386, 'grad_norm': 1.4042321279524002, 'learning_rate': 2.8447904730256e-06, 'epoch': 0.65} 65%|██████▌ | 22370/34278 [24:35:47<13:05:52, 3.96s/it] 65%|██████▌ | 22371/34278 [24:35:50<12:09:25, 3.68s/it] {'loss': 0.1075, 'grad_norm': 1.2118493118809086, 'learning_rate': 2.8443641902320935e-06, 'epoch': 0.65} 65%|██████▌ | 22371/34278 [24:35:50<12:09:25, 3.68s/it] 65%|██████▌ | 22372/34278 [24:35:53<11:42:58, 3.54s/it] {'loss': 0.1152, 'grad_norm': 0.8708631911218182, 'learning_rate': 2.8439379266835888e-06, 
'epoch': 0.65} 65%|██████▌ | 22372/34278 [24:35:53<11:42:58, 3.54s/it] 65%|██████▌ | 22373/34278 [24:35:57<11:41:16, 3.53s/it] {'loss': 0.1383, 'grad_norm': 0.8939611154119125, 'learning_rate': 2.843511682383888e-06, 'epoch': 0.65} 65%|██████▌ | 22373/34278 [24:35:57<11:41:16, 3.53s/it] 65%|██████▌ | 22374/34278 [24:36:00<11:04:08, 3.35s/it] {'loss': 0.1162, 'grad_norm': 1.197413725856025, 'learning_rate': 2.843085457336804e-06, 'epoch': 0.65} 65%|██████▌ | 22374/34278 [24:36:00<11:04:08, 3.35s/it] 65%|██████▌ | 22375/34278 [24:36:03<11:23:39, 3.45s/it] {'loss': 0.1101, 'grad_norm': 1.2039479656733154, 'learning_rate': 2.842659251546137e-06, 'epoch': 0.65} 65%|██████▌ | 22375/34278 [24:36:03<11:23:39, 3.45s/it] 65%|██████▌ | 22376/34278 [24:36:06<11:02:14, 3.34s/it] {'loss': 0.1082, 'grad_norm': 0.7770492126751283, 'learning_rate': 2.8422330650156926e-06, 'epoch': 0.65} 65%|██████▌ | 22376/34278 [24:36:06<11:02:14, 3.34s/it] 65%|██████▌ | 22377/34278 [24:36:10<11:13:23, 3.39s/it] {'loss': 0.1278, 'grad_norm': 0.9534063290817145, 'learning_rate': 2.8418068977492773e-06, 'epoch': 0.65} 65%|██████▌ | 22377/34278 [24:36:10<11:13:23, 3.39s/it] 65%|██████▌ | 22378/34278 [24:36:13<10:59:46, 3.33s/it] {'loss': 0.118, 'grad_norm': 1.0784775794238626, 'learning_rate': 2.841380749750696e-06, 'epoch': 0.65} 65%|██████▌ | 22378/34278 [24:36:13<10:59:46, 3.33s/it] 65%|██████▌ | 22379/34278 [24:36:16<10:43:50, 3.25s/it] {'loss': 0.1132, 'grad_norm': 1.0676492238026483, 'learning_rate': 2.840954621023749e-06, 'epoch': 0.65} 65%|██████▌ | 22379/34278 [24:36:16<10:43:50, 3.25s/it] 65%|██████▌ | 22380/34278 [24:36:20<11:36:08, 3.51s/it] {'loss': 0.139, 'grad_norm': 0.8335513307074479, 'learning_rate': 2.840528511572245e-06, 'epoch': 0.65} 65%|██████▌ | 22380/34278 [24:36:20<11:36:08, 3.51s/it] 65%|██████▌ | 22381/34278 [24:36:24<12:00:43, 3.63s/it] {'loss': 0.1319, 'grad_norm': 0.9354201849273656, 'learning_rate': 2.840102421399987e-06, 'epoch': 0.65} 65%|██████▌ | 22381/34278 
[24:36:24<12:00:43, 3.63s/it] 65%|██████▌ | 22382/34278 [24:36:27<11:19:29, 3.43s/it] {'loss': 0.1268, 'grad_norm': 1.068194468994753, 'learning_rate': 2.8396763505107804e-06, 'epoch': 0.65} 65%|██████▌ | 22382/34278 [24:36:27<11:19:29, 3.43s/it] 65%|██████▌ | 22383/34278 [24:36:30<11:17:05, 3.42s/it] {'loss': 0.1394, 'grad_norm': 0.8156923346856937, 'learning_rate': 2.8392502989084255e-06, 'epoch': 0.65} 65%|██████▌ | 22383/34278 [24:36:30<11:17:05, 3.42s/it] 65%|██████▌ | 22384/34278 [24:36:34<11:29:26, 3.48s/it] {'loss': 0.1086, 'grad_norm': 1.0070871330619462, 'learning_rate': 2.8388242665967296e-06, 'epoch': 0.65} 65%|██████▌ | 22384/34278 [24:36:34<11:29:26, 3.48s/it] 65%|██████▌ | 22385/34278 [24:36:38<12:03:20, 3.65s/it] {'loss': 0.1111, 'grad_norm': 1.074138388207415, 'learning_rate': 2.838398253579493e-06, 'epoch': 0.65} 65%|██████▌ | 22385/34278 [24:36:38<12:03:20, 3.65s/it] 65%|██████▌ | 22386/34278 [24:36:42<12:03:44, 3.65s/it] {'loss': 0.106, 'grad_norm': 0.756491479175004, 'learning_rate': 2.8379722598605233e-06, 'epoch': 0.65} 65%|██████▌ | 22386/34278 [24:36:42<12:03:44, 3.65s/it] 65%|██████▌ | 22387/34278 [24:36:45<11:37:24, 3.52s/it] {'loss': 0.1163, 'grad_norm': 0.7841927221775205, 'learning_rate': 2.8375462854436187e-06, 'epoch': 0.65} 65%|██████▌ | 22387/34278 [24:36:45<11:37:24, 3.52s/it] 65%|██████▌ | 22388/34278 [24:36:50<12:42:06, 3.85s/it] {'loss': 0.1191, 'grad_norm': 0.78904380879566, 'learning_rate': 2.837120330332587e-06, 'epoch': 0.65} 65%|██████▌ | 22388/34278 [24:36:50<12:42:06, 3.85s/it] 65%|██████▌ | 22389/34278 [24:36:52<11:42:41, 3.55s/it] {'loss': 0.1252, 'grad_norm': 1.0917631612426648, 'learning_rate': 2.8366943945312274e-06, 'epoch': 0.65} 65%|██████▌ | 22389/34278 [24:36:52<11:42:41, 3.55s/it] 65%|██████▌ | 22390/34278 [24:36:56<11:31:35, 3.49s/it] {'loss': 0.1294, 'grad_norm': 0.7741510810631309, 'learning_rate': 2.836268478043343e-06, 'epoch': 0.65} 65%|██████▌ | 22390/34278 [24:36:56<11:31:35, 3.49s/it] 65%|██████▌ | 
22391/34278 [24:36:59<10:58:53, 3.33s/it] {'loss': 0.1197, 'grad_norm': 0.7935141619877567, 'learning_rate': 2.835842580872737e-06, 'epoch': 0.65} 65%|██████▌ | 22391/34278 [24:36:59<10:58:53, 3.33s/it] 65%|██████▌ | 22392/34278 [24:37:02<10:48:11, 3.27s/it] {'loss': 0.1092, 'grad_norm': 0.9354382676729552, 'learning_rate': 2.835416703023214e-06, 'epoch': 0.65} 65%|██████▌ | 22392/34278 [24:37:02<10:48:11, 3.27s/it] 65%|██████▌ | 22393/34278 [24:37:05<10:29:50, 3.18s/it] {'loss': 0.1147, 'grad_norm': 1.1299759732247512, 'learning_rate': 2.8349908444985706e-06, 'epoch': 0.65} 65%|██████▌ | 22393/34278 [24:37:05<10:29:50, 3.18s/it] 65%|██████▌ | 22394/34278 [24:37:08<10:37:01, 3.22s/it] {'loss': 0.1234, 'grad_norm': 0.8956681349786876, 'learning_rate': 2.834565005302615e-06, 'epoch': 0.65} 65%|██████▌ | 22394/34278 [24:37:08<10:37:01, 3.22s/it] 65%|██████▌ | 22395/34278 [24:37:14<13:35:03, 4.12s/it] {'loss': 0.1204, 'grad_norm': 0.8416045818463777, 'learning_rate': 2.8341391854391466e-06, 'epoch': 0.65} 65%|██████▌ | 22395/34278 [24:37:14<13:35:03, 4.12s/it] 65%|██████▌ | 22396/34278 [24:37:17<12:24:31, 3.76s/it] {'loss': 0.125, 'grad_norm': 0.873502656204317, 'learning_rate': 2.8337133849119643e-06, 'epoch': 0.65} 65%|██████▌ | 22396/34278 [24:37:17<12:24:31, 3.76s/it] 65%|██████▌ | 22397/34278 [24:37:20<11:23:02, 3.45s/it] {'loss': 0.1262, 'grad_norm': 0.8876938485626714, 'learning_rate': 2.8332876037248714e-06, 'epoch': 0.65} 65%|██████▌ | 22397/34278 [24:37:20<11:23:02, 3.45s/it] 65%|██████▌ | 22398/34278 [24:37:23<10:51:34, 3.29s/it] {'loss': 0.1073, 'grad_norm': 0.8923825238580209, 'learning_rate': 2.8328618418816715e-06, 'epoch': 0.65} 65%|██████▌ | 22398/34278 [24:37:23<10:51:34, 3.29s/it] 65%|██████▌ | 22399/34278 [24:37:26<10:27:54, 3.17s/it] {'loss': 0.1056, 'grad_norm': 0.7873329537573146, 'learning_rate': 2.8324360993861644e-06, 'epoch': 0.65} 65%|██████▌ | 22399/34278 [24:37:26<10:27:54, 3.17s/it] 65%|██████▌ | 22400/34278 [24:37:29<10:34:17, 3.20s/it] 
{'loss': 0.1298, 'grad_norm': 0.7710899821948057, 'learning_rate': 2.832010376242148e-06, 'epoch': 0.65} 65%|██████▌ | 22400/34278 [24:37:29<10:34:17, 3.20s/it] 65%|██████▌ | 22401/34278 [24:37:32<10:25:49, 3.16s/it] {'loss': 0.108, 'grad_norm': 1.0411339849268668, 'learning_rate': 2.831584672453427e-06, 'epoch': 0.65} 65%|██████▌ | 22401/34278 [24:37:32<10:25:49, 3.16s/it] 65%|██████▌ | 22402/34278 [24:37:35<10:23:30, 3.15s/it] {'loss': 0.1155, 'grad_norm': 0.9291247455436145, 'learning_rate': 2.831158988023801e-06, 'epoch': 0.65} 65%|██████▌ | 22402/34278 [24:37:35<10:23:30, 3.15s/it] 65%|██████▌ | 22403/34278 [24:37:39<10:31:26, 3.19s/it] {'loss': 0.1177, 'grad_norm': 0.8797433662321947, 'learning_rate': 2.8307333229570653e-06, 'epoch': 0.65} 65%|██████▌ | 22403/34278 [24:37:39<10:31:26, 3.19s/it] 65%|██████▌ | 22404/34278 [24:37:43<11:48:55, 3.58s/it] {'loss': 0.1317, 'grad_norm': 0.8237735555776691, 'learning_rate': 2.8303076772570292e-06, 'epoch': 0.65} 65%|██████▌ | 22404/34278 [24:37:43<11:48:55, 3.58s/it] 65%|██████▌ | 22405/34278 [24:37:47<11:45:12, 3.56s/it] {'loss': 0.1307, 'grad_norm': 0.8579782609770397, 'learning_rate': 2.8298820509274876e-06, 'epoch': 0.65} 65%|██████▌ | 22405/34278 [24:37:47<11:45:12, 3.56s/it] 65%|██████▌ | 22406/34278 [24:37:49<11:00:20, 3.34s/it] {'loss': 0.107, 'grad_norm': 0.8855548023531, 'learning_rate': 2.8294564439722395e-06, 'epoch': 0.65} 65%|██████▌ | 22406/34278 [24:37:49<11:00:20, 3.34s/it] 65%|██████▌ | 22407/34278 [24:37:53<11:30:58, 3.49s/it] {'loss': 0.1073, 'grad_norm': 1.0135499412180398, 'learning_rate': 2.8290308563950876e-06, 'epoch': 0.65} 65%|██████▌ | 22407/34278 [24:37:53<11:30:58, 3.49s/it] 65%|██████▌ | 22408/34278 [24:37:57<11:24:08, 3.46s/it] {'loss': 0.1173, 'grad_norm': 0.8882936866312848, 'learning_rate': 2.8286052881998303e-06, 'epoch': 0.65} 65%|██████▌ | 22408/34278 [24:37:57<11:24:08, 3.46s/it] 65%|██████▌ | 22409/34278 [24:38:03<13:53:49, 4.22s/it] {'loss': 0.1424, 'grad_norm': 
0.8248935598199801, 'learning_rate': 2.8281797393902643e-06, 'epoch': 0.65} 65%|██████▌ | 22409/34278 [24:38:03<13:53:49, 4.22s/it] 65%|██████▌ | 22410/34278 [24:38:06<12:33:57, 3.81s/it] {'loss': 0.1118, 'grad_norm': 0.9835032623361226, 'learning_rate': 2.8277542099701916e-06, 'epoch': 0.65} 65%|██████▌ | 22410/34278 [24:38:06<12:33:57, 3.81s/it] 65%|██████▌ | 22411/34278 [24:38:09<12:36:32, 3.83s/it] {'loss': 0.1374, 'grad_norm': 1.080433309396559, 'learning_rate': 2.827328699943413e-06, 'epoch': 0.65} 65%|██████▌ | 22411/34278 [24:38:09<12:36:32, 3.83s/it] 65%|██████▌ | 22412/34278 [24:38:12<11:32:58, 3.50s/it] {'loss': 0.1389, 'grad_norm': 1.0401450762281055, 'learning_rate': 2.826903209313725e-06, 'epoch': 0.65} 65%|██████▌ | 22412/34278 [24:38:12<11:32:58, 3.50s/it] 65%|██████▌ | 22413/34278 [24:38:15<11:12:50, 3.40s/it] {'loss': 0.131, 'grad_norm': 0.8951298171915354, 'learning_rate': 2.826477738084924e-06, 'epoch': 0.65} 65%|██████▌ | 22413/34278 [24:38:15<11:12:50, 3.40s/it] 65%|██████▌ | 22414/34278 [24:38:19<11:37:46, 3.53s/it] {'loss': 0.1204, 'grad_norm': 0.7924965449107844, 'learning_rate': 2.8260522862608123e-06, 'epoch': 0.65} 65%|██████▌ | 22414/34278 [24:38:19<11:37:46, 3.53s/it] 65%|██████▌ | 22415/34278 [24:38:22<10:58:39, 3.33s/it] {'loss': 0.1343, 'grad_norm': 0.8684814252637882, 'learning_rate': 2.825626853845186e-06, 'epoch': 0.65} 65%|██████▌ | 22415/34278 [24:38:22<10:58:39, 3.33s/it] 65%|██████▌ | 22416/34278 [24:38:26<11:39:25, 3.54s/it] {'loss': 0.132, 'grad_norm': 1.14344951259655, 'learning_rate': 2.8252014408418455e-06, 'epoch': 0.65} 65%|██████▌ | 22416/34278 [24:38:26<11:39:25, 3.54s/it] 65%|██████▌ | 22417/34278 [24:38:29<11:14:55, 3.41s/it] {'loss': 0.1187, 'grad_norm': 1.650113471243881, 'learning_rate': 2.8247760472545856e-06, 'epoch': 0.65} 65%|██████▌ | 22417/34278 [24:38:29<11:14:55, 3.41s/it] 65%|██████▌ | 22418/34278 [24:38:34<12:30:49, 3.80s/it] {'loss': 0.109, 'grad_norm': 0.9882441329202631, 'learning_rate': 
2.8243506730872072e-06, 'epoch': 0.65} 65%|██████▌ | 22418/34278 [24:38:34<12:30:49, 3.80s/it] 65%|██████▌ | 22419/34278 [24:38:40<15:08:26, 4.60s/it] {'loss': 0.1143, 'grad_norm': 0.806671554677657, 'learning_rate': 2.8239253183435078e-06, 'epoch': 0.65} 65%|██████▌ | 22419/34278 [24:38:40<15:08:26, 4.60s/it] 65%|██████▌ | 22420/34278 [24:38:46<16:38:06, 5.05s/it] {'loss': 0.1258, 'grad_norm': 0.8737769216924756, 'learning_rate': 2.8234999830272793e-06, 'epoch': 0.65} 65%|██████▌ | 22420/34278 [24:38:46<16:38:06, 5.05s/it] 65%|██████▌ | 22421/34278 [24:38:50<15:06:18, 4.59s/it] {'loss': 0.1008, 'grad_norm': 1.027183654696904, 'learning_rate': 2.823074667142327e-06, 'epoch': 0.65} 65%|██████▌ | 22421/34278 [24:38:50<15:06:18, 4.59s/it] 65%|██████▌ | 22422/34278 [24:38:54<14:17:40, 4.34s/it] {'loss': 0.1229, 'grad_norm': 1.0784696427280314, 'learning_rate': 2.822649370692444e-06, 'epoch': 0.65} 65%|██████▌ | 22422/34278 [24:38:54<14:17:40, 4.34s/it] 65%|██████▌ | 22423/34278 [24:38:57<12:54:53, 3.92s/it] {'loss': 0.1082, 'grad_norm': 1.0854113484420793, 'learning_rate': 2.822224093681426e-06, 'epoch': 0.65} 65%|██████▌ | 22423/34278 [24:38:57<12:54:53, 3.92s/it] 65%|██████▌ | 22424/34278 [24:39:00<12:43:51, 3.87s/it] {'loss': 0.1538, 'grad_norm': 0.8773620291506504, 'learning_rate': 2.8217988361130745e-06, 'epoch': 0.65} 65%|██████▌ | 22424/34278 [24:39:00<12:43:51, 3.87s/it] 65%|██████▌ | 22425/34278 [24:39:03<11:49:16, 3.59s/it] {'loss': 0.1084, 'grad_norm': 1.052376493376747, 'learning_rate': 2.8213735979911815e-06, 'epoch': 0.65} 65%|██████▌ | 22425/34278 [24:39:03<11:49:16, 3.59s/it] 65%|██████▌ | 22426/34278 [24:39:07<12:22:10, 3.76s/it] {'loss': 0.1258, 'grad_norm': 1.00853733576621, 'learning_rate': 2.8209483793195434e-06, 'epoch': 0.65} 65%|██████▌ | 22426/34278 [24:39:07<12:22:10, 3.76s/it] 65%|██████▌ | 22427/34278 [24:39:11<12:32:51, 3.81s/it] {'loss': 0.116, 'grad_norm': 0.9335007307215261, 'learning_rate': 2.8205231801019584e-06, 'epoch': 0.65} 
65%|██████▌ | 22427/34278 [24:39:11<12:32:51, 3.81s/it] 65%|██████▌ | 22428/34278 [24:39:14<11:45:27, 3.57s/it] {'loss': 0.1072, 'grad_norm': 0.7882432973844449, 'learning_rate': 2.820098000342224e-06, 'epoch': 0.65} 65%|██████▌ | 22428/34278 [24:39:14<11:45:27, 3.57s/it] 65%|██████▌ | 22429/34278 [24:39:18<11:52:30, 3.61s/it] {'loss': 0.1013, 'grad_norm': 0.9348378620044736, 'learning_rate': 2.8196728400441343e-06, 'epoch': 0.65} 65%|██████▌ | 22429/34278 [24:39:18<11:52:30, 3.61s/it] 65%|██████▌ | 22430/34278 [24:39:22<11:50:40, 3.60s/it] {'loss': 0.1209, 'grad_norm': 1.0358406809412704, 'learning_rate': 2.8192476992114825e-06, 'epoch': 0.65} 65%|██████▌ | 22430/34278 [24:39:22<11:50:40, 3.60s/it] 65%|██████▌ | 22431/34278 [24:39:25<11:36:30, 3.53s/it] {'loss': 0.1123, 'grad_norm': 0.9515981882527773, 'learning_rate': 2.8188225778480694e-06, 'epoch': 0.65} 65%|██████▌ | 22431/34278 [24:39:25<11:36:30, 3.53s/it] 65%|██████▌ | 22432/34278 [24:39:29<11:43:33, 3.56s/it] {'loss': 0.1463, 'grad_norm': 0.854083675080113, 'learning_rate': 2.818397475957685e-06, 'epoch': 0.65} 65%|██████▌ | 22432/34278 [24:39:29<11:43:33, 3.56s/it] 65%|██████▌ | 22433/34278 [24:39:35<14:01:07, 4.26s/it] {'loss': 0.1338, 'grad_norm': 1.0071188848555603, 'learning_rate': 2.8179723935441273e-06, 'epoch': 0.65} 65%|██████▌ | 22433/34278 [24:39:35<14:01:07, 4.26s/it] 65%|██████▌ | 22434/34278 [24:39:38<13:37:51, 4.14s/it] {'loss': 0.1131, 'grad_norm': 0.9228795996135992, 'learning_rate': 2.8175473306111932e-06, 'epoch': 0.65} 65%|██████▌ | 22434/34278 [24:39:38<13:37:51, 4.14s/it] 65%|██████▌ | 22435/34278 [24:39:42<13:22:47, 4.07s/it] {'loss': 0.147, 'grad_norm': 1.0131591896620302, 'learning_rate': 2.817122287162676e-06, 'epoch': 0.65} 65%|██████▌ | 22435/34278 [24:39:42<13:22:47, 4.07s/it] 65%|██████▌ | 22436/34278 [24:39:46<12:56:43, 3.94s/it] {'loss': 0.1174, 'grad_norm': 0.9948899904967381, 'learning_rate': 2.816697263202367e-06, 'epoch': 0.65} 65%|██████▌ | 22436/34278 
[24:39:46<12:56:43, 3.94s/it] 65%|██████▌ | 22437/34278 [24:39:49<12:21:41, 3.76s/it] {'loss': 0.1318, 'grad_norm': 0.8279068293792077, 'learning_rate': 2.8162722587340663e-06, 'epoch': 0.65} 65%|██████▌ | 22437/34278 [24:39:49<12:21:41, 3.76s/it] 65%|██████▌ | 22438/34278 [24:39:52<11:23:23, 3.46s/it] {'loss': 0.1519, 'grad_norm': 0.9452393538791887, 'learning_rate': 2.815847273761564e-06, 'epoch': 0.65} 65%|██████▌ | 22438/34278 [24:39:52<11:23:23, 3.46s/it] 65%|██████▌ | 22439/34278 [24:39:55<10:57:40, 3.33s/it] {'loss': 0.1204, 'grad_norm': 0.8785567153294748, 'learning_rate': 2.8154223082886568e-06, 'epoch': 0.65} 65%|██████▌ | 22439/34278 [24:39:55<10:57:40, 3.33s/it] 65%|██████▌ | 22440/34278 [24:39:59<11:36:06, 3.53s/it] {'loss': 0.1223, 'grad_norm': 0.8682213748099582, 'learning_rate': 2.8149973623191363e-06, 'epoch': 0.65} 65%|██████▌ | 22440/34278 [24:39:59<11:36:06, 3.53s/it] 65%|██████▌ | 22441/34278 [24:40:02<11:10:02, 3.40s/it] {'loss': 0.1071, 'grad_norm': 0.794535610763149, 'learning_rate': 2.8145724358567994e-06, 'epoch': 0.65} 65%|██████▌ | 22441/34278 [24:40:02<11:10:02, 3.40s/it] 65%|██████▌ | 22442/34278 [24:40:06<11:05:55, 3.38s/it] {'loss': 0.1357, 'grad_norm': 0.8324289979702626, 'learning_rate': 2.8141475289054387e-06, 'epoch': 0.65} 65%|██████▌ | 22442/34278 [24:40:06<11:05:55, 3.38s/it] 65%|██████▌ | 22443/34278 [24:40:11<13:38:29, 4.15s/it] {'loss': 0.1198, 'grad_norm': 0.946816262503394, 'learning_rate': 2.8137226414688447e-06, 'epoch': 0.65} 65%|██████▌ | 22443/34278 [24:40:11<13:38:29, 4.15s/it] 65%|██████▌ | 22444/34278 [24:40:15<12:34:20, 3.82s/it] {'loss': 0.1156, 'grad_norm': 0.9347654787220115, 'learning_rate': 2.8132977735508125e-06, 'epoch': 0.65} 65%|██████▌ | 22444/34278 [24:40:15<12:34:20, 3.82s/it] 65%|██████▌ | 22445/34278 [24:40:18<11:45:36, 3.58s/it] {'loss': 0.1205, 'grad_norm': 1.054201416734486, 'learning_rate': 2.812872925155139e-06, 'epoch': 0.65} 65%|██████▌ | 22445/34278 [24:40:18<11:45:36, 3.58s/it] 65%|██████▌ 
| 22446/34278 [24:40:20<11:09:23, 3.39s/it] {'loss': 0.1164, 'grad_norm': 0.8743155275048713, 'learning_rate': 2.812448096285613e-06, 'epoch': 0.65} 65%|██████▌ | 22446/34278 [24:40:20<11:09:23, 3.39s/it] 65%|██████▌ | 22447/34278 [24:40:27<13:46:45, 4.19s/it] {'loss': 0.1224, 'grad_norm': 0.989058469367441, 'learning_rate': 2.812023286946026e-06, 'epoch': 0.65} 65%|██████▌ | 22447/34278 [24:40:27<13:46:45, 4.19s/it] 65%|██████▌ | 22448/34278 [24:40:30<13:15:03, 4.03s/it] {'loss': 0.1297, 'grad_norm': 0.8370971723206385, 'learning_rate': 2.8115984971401753e-06, 'epoch': 0.65} 65%|██████▌ | 22448/34278 [24:40:30<13:15:03, 4.03s/it] 65%|██████▌ | 22449/34278 [24:40:36<15:11:02, 4.62s/it] {'loss': 0.1396, 'grad_norm': 1.2293559844157345, 'learning_rate': 2.8111737268718507e-06, 'epoch': 0.65} 65%|██████▌ | 22449/34278 [24:40:36<15:11:02, 4.62s/it] 65%|██████▌ | 22450/34278 [24:40:40<13:59:39, 4.26s/it] {'loss': 0.1094, 'grad_norm': 0.7159436549623293, 'learning_rate': 2.8107489761448416e-06, 'epoch': 0.65} 65%|██████▌ | 22450/34278 [24:40:40<13:59:39, 4.26s/it] 65%|██████▌ | 22451/34278 [24:40:43<12:43:34, 3.87s/it] {'loss': 0.1245, 'grad_norm': 1.059710812761776, 'learning_rate': 2.8103242449629455e-06, 'epoch': 0.65} 65%|██████▌ | 22451/34278 [24:40:43<12:43:34, 3.87s/it] 65%|██████▌ | 22452/34278 [24:40:49<14:57:01, 4.55s/it] {'loss': 0.132, 'grad_norm': 1.0762803144274677, 'learning_rate': 2.8098995333299522e-06, 'epoch': 0.65} 65%|██████▌ | 22452/34278 [24:40:49<14:57:01, 4.55s/it] 66%|██████▌ | 22453/34278 [24:40:52<13:26:13, 4.09s/it] {'loss': 0.1087, 'grad_norm': 0.9713609543462169, 'learning_rate': 2.8094748412496507e-06, 'epoch': 0.66} 66%|██████▌ | 22453/34278 [24:40:52<13:26:13, 4.09s/it] 66%|██████▌ | 22454/34278 [24:40:55<12:35:28, 3.83s/it] {'loss': 0.135, 'grad_norm': 0.9352136934925658, 'learning_rate': 2.8090501687258378e-06, 'epoch': 0.66} 66%|██████▌ | 22454/34278 [24:40:55<12:35:28, 3.83s/it] 66%|██████▌ | 22455/34278 [24:40:59<12:30:33, 3.81s/it] 
{'loss': 0.1451, 'grad_norm': 1.009512518613979, 'learning_rate': 2.8086255157623017e-06, 'epoch': 0.66} 66%|██████▌ | 22455/34278 [24:40:59<12:30:33, 3.81s/it] 66%|██████▌ | 22456/34278 [24:41:02<11:54:16, 3.63s/it] {'loss': 0.1209, 'grad_norm': 1.3412461828688422, 'learning_rate': 2.8082008823628313e-06, 'epoch': 0.66} 66%|██████▌ | 22456/34278 [24:41:02<11:54:16, 3.63s/it] 66%|██████▌ | 22457/34278 [24:41:07<13:37:42, 4.15s/it] {'loss': 0.1165, 'grad_norm': 1.0134594124748277, 'learning_rate': 2.807776268531221e-06, 'epoch': 0.66} 66%|██████▌ | 22457/34278 [24:41:07<13:37:42, 4.15s/it] 66%|██████▌ | 22458/34278 [24:41:10<12:30:05, 3.81s/it] {'loss': 0.1167, 'grad_norm': 0.9105265471124363, 'learning_rate': 2.8073516742712626e-06, 'epoch': 0.66} 66%|██████▌ | 22458/34278 [24:41:10<12:30:05, 3.81s/it] 66%|██████▌ | 22459/34278 [24:41:13<11:52:50, 3.62s/it] {'loss': 0.1277, 'grad_norm': 0.9796945647382419, 'learning_rate': 2.8069270995867447e-06, 'epoch': 0.66} 66%|██████▌ | 22459/34278 [24:41:13<11:52:50, 3.62s/it] 66%|██████▌ | 22460/34278 [24:41:16<11:05:16, 3.38s/it] {'loss': 0.1201, 'grad_norm': 1.2798380530827744, 'learning_rate': 2.8065025444814566e-06, 'epoch': 0.66} 66%|██████▌ | 22460/34278 [24:41:16<11:05:16, 3.38s/it] 66%|██████▌ | 22461/34278 [24:41:22<13:49:59, 4.21s/it] {'loss': 0.1418, 'grad_norm': 0.8342012696107435, 'learning_rate': 2.8060780089591915e-06, 'epoch': 0.66} 66%|██████▌ | 22461/34278 [24:41:22<13:49:59, 4.21s/it] 66%|██████▌ | 22462/34278 [24:41:29<15:48:24, 4.82s/it] {'loss': 0.1372, 'grad_norm': 0.77476631915906, 'learning_rate': 2.8056534930237367e-06, 'epoch': 0.66} 66%|██████▌ | 22462/34278 [24:41:29<15:48:24, 4.82s/it] 66%|██████▌ | 22463/34278 [24:41:35<17:04:57, 5.21s/it] {'loss': 0.1076, 'grad_norm': 0.7909487902491348, 'learning_rate': 2.8052289966788838e-06, 'epoch': 0.66} 66%|██████▌ | 22463/34278 [24:41:35<17:04:57, 5.21s/it] 66%|██████▌ | 22464/34278 [24:41:38<15:26:46, 4.71s/it] {'loss': 0.0971, 'grad_norm': 
0.6720644960359576, 'learning_rate': 2.804804519928424e-06, 'epoch': 0.66} 66%|██████▌ | 22464/34278 [24:41:38<15:26:46, 4.71s/it] 66%|██████▌ | 22465/34278 [24:41:43<15:49:58, 4.83s/it] {'loss': 0.111, 'grad_norm': 0.793948908039111, 'learning_rate': 2.8043800627761453e-06, 'epoch': 0.66} 66%|██████▌ | 22465/34278 [24:41:43<15:49:58, 4.83s/it] 66%|██████▌ | 22466/34278 [24:41:47<14:33:44, 4.44s/it] {'loss': 0.1469, 'grad_norm': 1.0834419645421325, 'learning_rate': 2.803955625225836e-06, 'epoch': 0.66} 66%|██████▌ | 22466/34278 [24:41:47<14:33:44, 4.44s/it] 66%|██████▌ | 22467/34278 [24:41:51<13:47:05, 4.20s/it] {'loss': 0.1049, 'grad_norm': 0.8105128864027233, 'learning_rate': 2.803531207281288e-06, 'epoch': 0.66} 66%|██████▌ | 22467/34278 [24:41:51<13:47:05, 4.20s/it] 66%|██████▌ | 22468/34278 [24:41:56<14:54:26, 4.54s/it] {'loss': 0.1274, 'grad_norm': 0.8490429674101937, 'learning_rate': 2.8031068089462874e-06, 'epoch': 0.66} 66%|██████▌ | 22468/34278 [24:41:56<14:54:26, 4.54s/it] 66%|██████▌ | 22469/34278 [24:42:00<14:48:38, 4.52s/it] {'loss': 0.1364, 'grad_norm': 0.7610165214629974, 'learning_rate': 2.802682430224627e-06, 'epoch': 0.66} 66%|██████▌ | 22469/34278 [24:42:00<14:48:38, 4.52s/it] 66%|██████▌ | 22470/34278 [24:42:04<13:38:05, 4.16s/it] {'loss': 0.1397, 'grad_norm': 1.0817781727835747, 'learning_rate': 2.802258071120091e-06, 'epoch': 0.66} 66%|██████▌ | 22470/34278 [24:42:04<13:38:05, 4.16s/it] 66%|██████▌ | 22471/34278 [24:42:07<12:15:48, 3.74s/it] {'loss': 0.1193, 'grad_norm': 0.8705608230901531, 'learning_rate': 2.801833731636472e-06, 'epoch': 0.66} 66%|██████▌ | 22471/34278 [24:42:07<12:15:48, 3.74s/it] 66%|██████▌ | 22472/34278 [24:42:10<11:34:41, 3.53s/it] {'loss': 0.1137, 'grad_norm': 0.808086343932562, 'learning_rate': 2.801409411777557e-06, 'epoch': 0.66} 66%|██████▌ | 22472/34278 [24:42:10<11:34:41, 3.53s/it] 66%|██████▌ | 22473/34278 [24:42:13<11:03:21, 3.37s/it] {'loss': 0.1204, 'grad_norm': 0.8992175199035769, 'learning_rate': 
2.800985111547132e-06, 'epoch': 0.66} 66%|██████▌ | 22473/34278 [24:42:13<11:03:21, 3.37s/it] 66%|██████▌ | 22474/34278 [24:42:16<11:07:10, 3.39s/it] {'loss': 0.11, 'grad_norm': 0.7156651496369794, 'learning_rate': 2.800560830948987e-06, 'epoch': 0.66} 66%|██████▌ | 22474/34278 [24:42:16<11:07:10, 3.39s/it] 66%|██████▌ | 22475/34278 [24:42:19<10:40:58, 3.26s/it] {'loss': 0.1284, 'grad_norm': 0.7464135277631306, 'learning_rate': 2.8001365699869108e-06, 'epoch': 0.66} 66%|██████▌ | 22475/34278 [24:42:19<10:40:58, 3.26s/it] 66%|██████▌ | 22476/34278 [24:42:22<10:22:37, 3.17s/it] {'loss': 0.1497, 'grad_norm': 1.0967097120862654, 'learning_rate': 2.7997123286646916e-06, 'epoch': 0.66} 66%|██████▌ | 22476/34278 [24:42:22<10:22:37, 3.17s/it] 66%|██████▌ | 22477/34278 [24:42:25<10:09:42, 3.10s/it] {'loss': 0.1275, 'grad_norm': 0.9632295410992674, 'learning_rate': 2.7992881069861135e-06, 'epoch': 0.66} 66%|██████▌ | 22477/34278 [24:42:25<10:09:42, 3.10s/it] 66%|██████▌ | 22478/34278 [24:42:29<10:55:48, 3.33s/it] {'loss': 0.1543, 'grad_norm': 0.8300706982170465, 'learning_rate': 2.798863904954967e-06, 'epoch': 0.66} 66%|██████▌ | 22478/34278 [24:42:29<10:55:48, 3.33s/it] 66%|██████▌ | 22479/34278 [24:42:32<10:32:42, 3.22s/it] {'loss': 0.1329, 'grad_norm': 1.1260893811778192, 'learning_rate': 2.798439722575038e-06, 'epoch': 0.66} 66%|██████▌ | 22479/34278 [24:42:32<10:32:42, 3.22s/it] 66%|██████▌ | 22480/34278 [24:42:38<13:22:55, 4.08s/it] {'loss': 0.1233, 'grad_norm': 0.8562290905610475, 'learning_rate': 2.79801555985011e-06, 'epoch': 0.66} 66%|██████▌ | 22480/34278 [24:42:38<13:22:55, 4.08s/it] 66%|██████▌ | 22481/34278 [24:42:41<12:16:05, 3.74s/it] {'loss': 0.1196, 'grad_norm': 0.8569677028540854, 'learning_rate': 2.797591416783978e-06, 'epoch': 0.66} 66%|██████▌ | 22481/34278 [24:42:41<12:16:05, 3.74s/it] 66%|██████▌ | 22482/34278 [24:42:47<14:21:58, 4.38s/it] {'loss': 0.133, 'grad_norm': 0.7703821745089594, 'learning_rate': 2.7971672933804227e-06, 'epoch': 0.66} 
66%|██████▌ | 22482/34278 [24:42:47<14:21:58, 4.38s/it] 66%|██████▌ | 22483/34278 [24:42:50<13:37:18, 4.16s/it] {'loss': 0.1125, 'grad_norm': 0.7191233824796355, 'learning_rate': 2.796743189643231e-06, 'epoch': 0.66} 66%|██████▌ | 22483/34278 [24:42:50<13:37:18, 4.16s/it] 66%|██████▌ | 22484/34278 [24:42:53<12:37:25, 3.85s/it] {'loss': 0.1316, 'grad_norm': 0.9049161664824638, 'learning_rate': 2.7963191055761916e-06, 'epoch': 0.66} 66%|██████▌ | 22484/34278 [24:42:53<12:37:25, 3.85s/it] 66%|██████▌ | 22485/34278 [24:42:56<11:49:34, 3.61s/it] {'loss': 0.098, 'grad_norm': 0.7079508217902415, 'learning_rate': 2.79589504118309e-06, 'epoch': 0.66} 66%|██████▌ | 22485/34278 [24:42:56<11:49:34, 3.61s/it] 66%|██████▌ | 22486/34278 [24:43:01<12:49:57, 3.92s/it] {'loss': 0.129, 'grad_norm': 0.8930720409388776, 'learning_rate': 2.7954709964677083e-06, 'epoch': 0.66} 66%|██████▌ | 22486/34278 [24:43:01<12:49:57, 3.92s/it] 66%|██████▌ | 22487/34278 [24:43:04<11:51:03, 3.62s/it] {'loss': 0.1348, 'grad_norm': 0.802453894170231, 'learning_rate': 2.7950469714338356e-06, 'epoch': 0.66} 66%|██████▌ | 22487/34278 [24:43:04<11:51:03, 3.62s/it] 66%|██████▌ | 22488/34278 [24:43:07<11:06:36, 3.39s/it] {'loss': 0.1159, 'grad_norm': 0.9234038288708448, 'learning_rate': 2.7946229660852598e-06, 'epoch': 0.66} 66%|██████▌ | 22488/34278 [24:43:07<11:06:36, 3.39s/it] 66%|██████▌ | 22489/34278 [24:43:12<12:31:31, 3.82s/it] {'loss': 0.1225, 'grad_norm': 0.7178968242083876, 'learning_rate': 2.7941989804257628e-06, 'epoch': 0.66} 66%|██████▌ | 22489/34278 [24:43:12<12:31:31, 3.82s/it] 66%|██████▌ | 22490/34278 [24:43:15<11:37:20, 3.55s/it] {'loss': 0.1288, 'grad_norm': 0.8047241178207227, 'learning_rate': 2.793775014459129e-06, 'epoch': 0.66} 66%|██████▌ | 22490/34278 [24:43:15<11:37:20, 3.55s/it] 66%|██████▌ | 22491/34278 [24:43:18<11:05:17, 3.39s/it] {'loss': 0.1203, 'grad_norm': 0.8627289103067527, 'learning_rate': 2.7933510681891477e-06, 'epoch': 0.66} 66%|██████▌ | 22491/34278 
[24:43:18<11:05:17, 3.39s/it] 66%|██████▌ | 22492/34278 [24:43:22<12:15:15, 3.74s/it] {'loss': 0.121, 'grad_norm': 0.8776410857739505, 'learning_rate': 2.792927141619599e-06, 'epoch': 0.66} 66%|██████▌ | 22492/34278 [24:43:22<12:15:15, 3.74s/it] 66%|██████▌ | 22493/34278 [24:43:26<12:47:08, 3.91s/it] {'loss': 0.1078, 'grad_norm': 0.9019677742950784, 'learning_rate': 2.79250323475427e-06, 'epoch': 0.66} 66%|██████▌ | 22493/34278 [24:43:26<12:47:08, 3.91s/it] 66%|██████▌ | 22494/34278 [24:43:29<11:54:49, 3.64s/it] {'loss': 0.1132, 'grad_norm': 0.6099422452034862, 'learning_rate': 2.7920793475969465e-06, 'epoch': 0.66} 66%|██████▌ | 22494/34278 [24:43:29<11:54:49, 3.64s/it] 66%|██████▌ | 22495/34278 [24:43:32<11:18:21, 3.45s/it] {'loss': 0.1269, 'grad_norm': 1.3063584220997537, 'learning_rate': 2.7916554801514124e-06, 'epoch': 0.66} 66%|██████▌ | 22495/34278 [24:43:32<11:18:21, 3.45s/it] 66%|██████▌ | 22496/34278 [24:43:39<13:56:54, 4.26s/it] {'loss': 0.1181, 'grad_norm': 0.7329641995200221, 'learning_rate': 2.7912316324214485e-06, 'epoch': 0.66} 66%|██████▌ | 22496/34278 [24:43:39<13:56:54, 4.26s/it] 66%|██████▌ | 22497/34278 [24:43:44<14:54:55, 4.56s/it] {'loss': 0.1166, 'grad_norm': 0.7523293310272482, 'learning_rate': 2.790807804410843e-06, 'epoch': 0.66} 66%|██████▌ | 22497/34278 [24:43:44<14:54:55, 4.56s/it] 66%|██████▌ | 22498/34278 [24:43:48<14:07:15, 4.32s/it] {'loss': 0.1153, 'grad_norm': 0.8334991656815292, 'learning_rate': 2.790383996123377e-06, 'epoch': 0.66} 66%|██████▌ | 22498/34278 [24:43:48<14:07:15, 4.32s/it] 66%|██████▌ | 22499/34278 [24:43:51<13:04:40, 4.00s/it] {'loss': 0.1288, 'grad_norm': 0.8202842930485915, 'learning_rate': 2.7899602075628366e-06, 'epoch': 0.66} 66%|██████▌ | 22499/34278 [24:43:51<13:04:40, 4.00s/it] 66%|██████▌ | 22500/34278 [24:43:54<12:30:26, 3.82s/it] {'loss': 0.0935, 'grad_norm': 1.188380710720048, 'learning_rate': 2.789536438733002e-06, 'epoch': 0.66} 66%|██████▌ | 22500/34278 [24:43:54<12:30:26, 3.82s/it] 66%|██████▌ | 
22501/34278 [24:43:58<12:09:10, 3.71s/it] {'loss': 0.1157, 'grad_norm': 0.810922022589193, 'learning_rate': 2.7891126896376603e-06, 'epoch': 0.66} 66%|██████▌ | 22501/34278 [24:43:58<12:09:10, 3.71s/it] 66%|██████▌ | 22502/34278 [24:44:04<14:22:10, 4.39s/it] {'loss': 0.1291, 'grad_norm': 0.7876301060910118, 'learning_rate': 2.7886889602805926e-06, 'epoch': 0.66} 66%|██████▌ | 22502/34278 [24:44:04<14:22:10, 4.39s/it] 66%|██████▌ | 22503/34278 [24:44:09<15:00:19, 4.59s/it] {'loss': 0.1145, 'grad_norm': 0.7881536119215248, 'learning_rate': 2.7882652506655807e-06, 'epoch': 0.66} 66%|██████▌ | 22503/34278 [24:44:09<15:00:19, 4.59s/it] 66%|██████▌ | 22504/34278 [24:44:13<14:58:35, 4.58s/it] {'loss': 0.1267, 'grad_norm': 0.8918753539973705, 'learning_rate': 2.787841560796408e-06, 'epoch': 0.66} 66%|██████▌ | 22504/34278 [24:44:13<14:58:35, 4.58s/it] 66%|██████▌ | 22505/34278 [24:44:17<13:38:28, 4.17s/it] {'loss': 0.1108, 'grad_norm': 1.1072156778997373, 'learning_rate': 2.78741789067686e-06, 'epoch': 0.66} 66%|██████▌ | 22505/34278 [24:44:17<13:38:28, 4.17s/it] 66%|██████▌ | 22506/34278 [24:44:22<15:18:09, 4.68s/it] {'loss': 0.1276, 'grad_norm': 0.8877621015890508, 'learning_rate': 2.7869942403107163e-06, 'epoch': 0.66} 66%|██████▌ | 22506/34278 [24:44:22<15:18:09, 4.68s/it] 66%|██████▌ | 22507/34278 [24:44:26<13:52:21, 4.24s/it] {'loss': 0.1032, 'grad_norm': 0.8072881168843168, 'learning_rate': 2.7865706097017585e-06, 'epoch': 0.66} 66%|██████▌ | 22507/34278 [24:44:26<13:52:21, 4.24s/it] 66%|██████▌ | 22508/34278 [24:44:29<12:56:24, 3.96s/it] {'loss': 0.1358, 'grad_norm': 0.9372479949765035, 'learning_rate': 2.7861469988537714e-06, 'epoch': 0.66} 66%|██████▌ | 22508/34278 [24:44:29<12:56:24, 3.96s/it] 66%|██████▌ | 22509/34278 [24:44:35<14:49:33, 4.54s/it] {'loss': 0.131, 'grad_norm': 0.8541840180347505, 'learning_rate': 2.7857234077705355e-06, 'epoch': 0.66} 66%|██████▌ | 22509/34278 [24:44:35<14:49:33, 4.54s/it] 66%|██████▌ | 22510/34278 [24:44:38<13:32:50, 4.14s/it] 
{'loss': 0.1091, 'grad_norm': 0.7326392057265572, 'learning_rate': 2.7852998364558287e-06, 'epoch': 0.66} 66%|██████▌ | 22510/34278 [24:44:38<13:32:50, 4.14s/it] 66%|██████▌ | 22511/34278 [24:44:44<15:20:21, 4.69s/it] {'loss': 0.1423, 'grad_norm': 1.046935756928821, 'learning_rate': 2.7848762849134405e-06, 'epoch': 0.66} 66%|██████▌ | 22511/34278 [24:44:44<15:20:21, 4.69s/it] 66%|██████▌ | 22512/34278 [24:44:49<15:24:05, 4.71s/it] {'loss': 0.14, 'grad_norm': 0.9414506643276448, 'learning_rate': 2.784452753147147e-06, 'epoch': 0.66} 66%|██████▌ | 22512/34278 [24:44:49<15:24:05, 4.71s/it] 66%|██████▌ | 22513/34278 [24:44:52<14:18:53, 4.38s/it] {'loss': 0.1243, 'grad_norm': 0.7318518189924511, 'learning_rate': 2.7840292411607296e-06, 'epoch': 0.66} 66%|██████▌ | 22513/34278 [24:44:52<14:18:53, 4.38s/it] 66%|██████▌ | 22514/34278 [24:44:55<13:00:51, 3.98s/it] {'loss': 0.1148, 'grad_norm': 0.818140835759817, 'learning_rate': 2.7836057489579714e-06, 'epoch': 0.66} 66%|██████▌ | 22514/34278 [24:44:55<13:00:51, 3.98s/it] 66%|██████▌ | 22515/34278 [24:45:00<13:52:17, 4.25s/it] {'loss': 0.1245, 'grad_norm': 0.9527922379739329, 'learning_rate': 2.783182276542652e-06, 'epoch': 0.66} 66%|██████▌ | 22515/34278 [24:45:00<13:52:17, 4.25s/it] 66%|██████▌ | 22516/34278 [24:45:03<12:30:00, 3.83s/it] {'loss': 0.123, 'grad_norm': 0.8002737478246307, 'learning_rate': 2.7827588239185497e-06, 'epoch': 0.66} 66%|██████▌ | 22516/34278 [24:45:03<12:30:00, 3.83s/it] 66%|██████▌ | 22517/34278 [24:45:06<11:55:24, 3.65s/it] {'loss': 0.1199, 'grad_norm': 0.8946948698519541, 'learning_rate': 2.7823353910894486e-06, 'epoch': 0.66} 66%|██████▌ | 22517/34278 [24:45:06<11:55:24, 3.65s/it] 66%|██████▌ | 22518/34278 [24:45:09<11:23:54, 3.49s/it] {'loss': 0.0966, 'grad_norm': 0.7873289502771038, 'learning_rate': 2.7819119780591284e-06, 'epoch': 0.66} 66%|██████▌ | 22518/34278 [24:45:10<11:23:54, 3.49s/it] 66%|██████▌ | 22519/34278 [24:45:14<12:28:48, 3.82s/it] {'loss': 0.1136, 'grad_norm': 
0.8008015819465731, 'learning_rate': 2.7814885848313692e-06, 'epoch': 0.66} 66%|██████▌ | 22519/34278 [24:45:14<12:28:48, 3.82s/it] 66%|██████▌ | 22520/34278 [24:45:18<12:06:23, 3.71s/it] {'loss': 0.1178, 'grad_norm': 1.103038646793943, 'learning_rate': 2.7810652114099483e-06, 'epoch': 0.66} 66%|██████▌ | 22520/34278 [24:45:18<12:06:23, 3.71s/it] 66%|██████▌ | 22521/34278 [24:45:22<12:28:06, 3.82s/it] {'loss': 0.1279, 'grad_norm': 0.9026383226586517, 'learning_rate': 2.7806418577986494e-06, 'epoch': 0.66} 66%|██████▌ | 22521/34278 [24:45:22<12:28:06, 3.82s/it] 66%|██████▌ | 22522/34278 [24:45:26<13:12:32, 4.04s/it] {'loss': 0.1179, 'grad_norm': 0.8547687783967982, 'learning_rate': 2.7802185240012485e-06, 'epoch': 0.66} 66%|██████▌ | 22522/34278 [24:45:26<13:12:32, 4.04s/it] 66%|██████▌ | 22523/34278 [24:45:29<12:08:45, 3.72s/it] {'loss': 0.1589, 'grad_norm': 1.0004540488073015, 'learning_rate': 2.7797952100215263e-06, 'epoch': 0.66} 66%|██████▌ | 22523/34278 [24:45:29<12:08:45, 3.72s/it] 66%|██████▌ | 22524/34278 [24:45:32<11:26:51, 3.51s/it] {'loss': 0.1183, 'grad_norm': 0.8353649868482187, 'learning_rate': 2.779371915863265e-06, 'epoch': 0.66} 66%|██████▌ | 22524/34278 [24:45:32<11:26:51, 3.51s/it] 66%|██████▌ | 22525/34278 [24:45:38<13:22:15, 4.10s/it] {'loss': 0.1019, 'grad_norm': 0.7423876304035788, 'learning_rate': 2.7789486415302404e-06, 'epoch': 0.66} 66%|██████▌ | 22525/34278 [24:45:38<13:22:15, 4.10s/it] 66%|██████▌ | 22526/34278 [24:45:41<12:49:49, 3.93s/it] {'loss': 0.111, 'grad_norm': 0.8084602701288659, 'learning_rate': 2.778525387026231e-06, 'epoch': 0.66} 66%|██████▌ | 22526/34278 [24:45:41<12:49:49, 3.93s/it] 66%|██████▌ | 22527/34278 [24:45:44<12:00:01, 3.68s/it] {'loss': 0.1175, 'grad_norm': 0.823783401402058, 'learning_rate': 2.7781021523550177e-06, 'epoch': 0.66} 66%|██████▌ | 22527/34278 [24:45:44<12:00:01, 3.68s/it] 66%|██████▌ | 22528/34278 [24:45:48<11:53:49, 3.65s/it] {'loss': 0.1321, 'grad_norm': 0.8878706143548062, 'learning_rate': 
2.777678937520376e-06, 'epoch': 0.66} 66%|██████▌ | 22528/34278 [24:45:48<11:53:49, 3.65s/it] 66%|██████▌ | 22529/34278 [24:45:51<11:22:18, 3.48s/it] {'loss': 0.114, 'grad_norm': 0.9299055328002069, 'learning_rate': 2.7772557425260886e-06, 'epoch': 0.66} 66%|██████▌ | 22529/34278 [24:45:51<11:22:18, 3.48s/it] 66%|██████▌ | 22530/34278 [24:45:55<12:06:16, 3.71s/it] {'loss': 0.1359, 'grad_norm': 0.8653418937087174, 'learning_rate': 2.7768325673759296e-06, 'epoch': 0.66} 66%|██████▌ | 22530/34278 [24:45:55<12:06:16, 3.71s/it] 66%|██████▌ | 22531/34278 [24:45:59<11:51:33, 3.63s/it] {'loss': 0.1325, 'grad_norm': 0.7045029678143745, 'learning_rate': 2.7764094120736805e-06, 'epoch': 0.66} 66%|██████▌ | 22531/34278 [24:45:59<11:51:33, 3.63s/it] 66%|██████▌ | 22532/34278 [24:46:03<12:07:17, 3.72s/it] {'loss': 0.1112, 'grad_norm': 2.7500356859860577, 'learning_rate': 2.775986276623117e-06, 'epoch': 0.66} 66%|██████▌ | 22532/34278 [24:46:03<12:07:17, 3.72s/it] 66%|██████▌ | 22533/34278 [24:46:06<11:43:15, 3.59s/it] {'loss': 0.1313, 'grad_norm': 0.7649597187975548, 'learning_rate': 2.7755631610280154e-06, 'epoch': 0.66} 66%|██████▌ | 22533/34278 [24:46:06<11:43:15, 3.59s/it] 66%|██████▌ | 22534/34278 [24:46:09<11:27:56, 3.51s/it] {'loss': 0.1211, 'grad_norm': 0.7479922180573809, 'learning_rate': 2.775140065292155e-06, 'epoch': 0.66} 66%|██████▌ | 22534/34278 [24:46:09<11:27:56, 3.51s/it] 66%|██████▌ | 22535/34278 [24:46:15<13:25:07, 4.11s/it] {'loss': 0.1061, 'grad_norm': 0.7180944342727001, 'learning_rate': 2.7747169894193148e-06, 'epoch': 0.66} 66%|██████▌ | 22535/34278 [24:46:15<13:25:07, 4.11s/it] 66%|██████▌ | 22536/34278 [24:46:18<12:34:26, 3.86s/it] {'loss': 0.0963, 'grad_norm': 0.6721148205570775, 'learning_rate': 2.77429393341327e-06, 'epoch': 0.66} 66%|██████▌ | 22536/34278 [24:46:18<12:34:26, 3.86s/it] 66%|██████▌ | 22537/34278 [24:46:22<12:35:47, 3.86s/it] {'loss': 0.1313, 'grad_norm': 0.7612768261335503, 'learning_rate': 2.7738708972777963e-06, 'epoch': 0.66} 
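Each step in this log embeds a plain Python dict of metrics between tqdm fragments; for post-hoc analysis (loss curves, the learning-rate schedule) those dicts can be recovered with a short parser. A sketch, assuming only that every record keeps the {'loss': ...} shape seen here; the tqdm bar text is simply skipped:

```python
import ast
import re

# Matches the per-step metric dicts embedded between tqdm fragments, e.g.
# {'loss': 0.1125, 'grad_norm': 0.71..., 'learning_rate': 2.79e-06, 'epoch': 0.66}
METRIC_RE = re.compile(r"\{'loss':[^}]*\}")

def parse_metrics(log_text):
    """Extract every metric dict from raw training-log text."""
    return [ast.literal_eval(match) for match in METRIC_RE.findall(log_text)]

sample = ("66%|... | 22483/34278 [24:42:50<13:37:18, 4.16s/it] "
          "{'loss': 0.1125, 'grad_norm': 0.7191, "
          "'learning_rate': 2.796743189643231e-06, 'epoch': 0.66}")
print(parse_metrics(sample)[0]["loss"])  # prints 0.1125
```

Note that tqdm prints each bar twice per step (before and after the metrics), so deduplicate by step index if you parse the bars themselves rather than the dicts.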
66%|██████▌ | 22537/34278 [24:46:22<12:35:47, 3.86s/it] 66%|██████▌ | 22538/34278 [24:46:25<11:46:50, 3.61s/it] {'loss': 0.1453, 'grad_norm': 0.9427584560829441, 'learning_rate': 2.7734478810166734e-06, 'epoch': 0.66} 66%|██████▌ | 22538/34278 [24:46:25<11:46:50, 3.61s/it] 66%|██████▌ | 22539/34278 [24:46:28<11:49:01, 3.62s/it] {'loss': 0.1043, 'grad_norm': 0.7583288904935364, 'learning_rate': 2.773024884633676e-06, 'epoch': 0.66} 66%|██████▌ | 22539/34278 [24:46:29<11:49:01, 3.62s/it] 66%|██████▌ | 22540/34278 [24:46:32<11:39:20, 3.57s/it] {'loss': 0.1476, 'grad_norm': 0.9393309194213926, 'learning_rate': 2.772601908132577e-06, 'epoch': 0.66} 66%|██████▌ | 22540/34278 [24:46:32<11:39:20, 3.57s/it] 66%|██████▌ | 22541/34278 [24:46:35<11:30:32, 3.53s/it] {'loss': 0.1364, 'grad_norm': 0.9863844353901349, 'learning_rate': 2.7721789515171605e-06, 'epoch': 0.66} 66%|██████▌ | 22541/34278 [24:46:35<11:30:32, 3.53s/it] 66%|██████▌ | 22542/34278 [24:46:38<10:50:40, 3.33s/it] {'loss': 0.1439, 'grad_norm': 1.2494265439675063, 'learning_rate': 2.771756014791198e-06, 'epoch': 0.66} 66%|██████▌ | 22542/34278 [24:46:38<10:50:40, 3.33s/it] 66%|██████▌ | 22543/34278 [24:46:44<13:05:12, 4.01s/it] {'loss': 0.0968, 'grad_norm': 0.5955301698206025, 'learning_rate': 2.7713330979584645e-06, 'epoch': 0.66} 66%|██████▌ | 22543/34278 [24:46:44<13:05:12, 4.01s/it] 66%|██████▌ | 22544/34278 [24:46:47<12:22:26, 3.80s/it] {'loss': 0.1314, 'grad_norm': 1.136010740585745, 'learning_rate': 2.770910201022739e-06, 'epoch': 0.66} 66%|██████▌ | 22544/34278 [24:46:47<12:22:26, 3.80s/it] 66%|██████▌ | 22545/34278 [24:46:50<11:50:35, 3.63s/it] {'loss': 0.1283, 'grad_norm': 1.0458930875849077, 'learning_rate': 2.770487323987795e-06, 'epoch': 0.66} 66%|██████▌ | 22545/34278 [24:46:50<11:50:35, 3.63s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fc776b84310>
Failed to fetch sample 3074342. Exception: cannot identify image file <_io.BytesIO object at 0x7fc776b84310>
66%|██████▌ | 22546/34278 [24:46:55<12:27:43, 3.82s/it] {'loss': 0.1136, 'grad_norm': 1.3174283796579231, 'learning_rate': 2.770064466857406e-06, 'epoch': 0.66} 66%|██████▌ | 22546/34278 [24:46:55<12:27:43, 3.82s/it] 66%|██████▌ | 22547/34278 [24:46:59<12:45:17, 3.91s/it] {'loss': 0.1291, 'grad_norm': 0.8495506250413206, 'learning_rate': 2.769641629635349e-06, 'epoch': 0.66} 66%|██████▌ | 22547/34278 [24:46:59<12:45:17, 3.91s/it] 66%|██████▌ | 22548/34278 [24:47:02<12:21:01, 3.79s/it] {'loss': 0.1226, 'grad_norm': 0.8198223648100067, 'learning_rate': 2.769218812325401e-06, 'epoch': 0.66} 66%|██████▌ | 22548/34278 [24:47:02<12:21:01, 3.79s/it] 66%|██████▌ | 22549/34278 [24:47:05<11:26:19, 3.51s/it] {'loss': 0.128, 'grad_norm': 0.7818460254895462, 'learning_rate': 2.7687960149313354e-06, 'epoch': 0.66} 66%|██████▌ | 22549/34278 [24:47:05<11:26:19, 3.51s/it] 66%|██████▌ | 22550/34278 [24:47:10<12:43:38, 3.91s/it] {'loss': 0.1172, 'grad_norm': 1.0414744926522435, 'learning_rate': 2.7683732374569237e-06, 'epoch': 0.66} 66%|██████▌ | 22550/34278 [24:47:10<12:43:38, 3.91s/it] 66%|██████▌ | 22551/34278 [24:47:13<11:56:13, 3.66s/it] {'loss': 0.1012, 'grad_norm': 0.8192019799411547, 'learning_rate': 2.7679504799059454e-06, 'epoch': 0.66} 66%|██████▌ | 22551/34278 [24:47:13<11:56:13, 3.66s/it] 66%|██████▌ | 22552/34278 [24:47:16<11:25:51, 3.51s/it] {'loss': 0.1066, 'grad_norm': 0.662398181258038, 'learning_rate': 2.76752774228217e-06, 'epoch': 0.66} 66%|██████▌ | 22552/34278 [24:47:16<11:25:51, 3.51s/it] 66%|██████▌ | 22553/34278 [24:47:19<10:59:36, 3.38s/it] {'loss': 0.1206, 'grad_norm': 1.0746020643569365, 'learning_rate': 2.767105024589375e-06, 'epoch': 0.66} 66%|██████▌ | 22553/34278 [24:47:19<10:59:36, 3.38s/it] 66%|██████▌ | 22554/34278 [24:47:22<10:33:21, 3.24s/it] {'loss': 0.1069, 'grad_norm': 0.9684590729721944, 'learning_rate': 2.7666823268313342e-06, 'epoch': 0.66} 66%|██████▌ |
22554/34278 [24:47:22<10:33:21, 3.24s/it] 66%|██████▌ | 22555/34278 [24:47:26<10:52:12, 3.34s/it] {'loss': 0.134, 'grad_norm': 1.0419397473849004, 'learning_rate': 2.766259649011821e-06, 'epoch': 0.66} 66%|██████▌ | 22555/34278 [24:47:26<10:52:12, 3.34s/it] 66%|██████▌ | 22556/34278 [24:47:30<11:28:34, 3.52s/it] {'loss': 0.1424, 'grad_norm': 0.8649779098765733, 'learning_rate': 2.765836991134606e-06, 'epoch': 0.66} 66%|██████▌ | 22556/34278 [24:47:30<11:28:34, 3.52s/it] 66%|██████▌ | 22557/34278 [24:47:33<11:15:00, 3.46s/it] {'loss': 0.086, 'grad_norm': 0.7707057213352337, 'learning_rate': 2.765414353203467e-06, 'epoch': 0.66} 66%|██████▌ | 22557/34278 [24:47:33<11:15:00, 3.46s/it] 66%|██████▌ | 22558/34278 [24:47:37<11:27:52, 3.52s/it] {'loss': 0.1119, 'grad_norm': 0.7544210102811724, 'learning_rate': 2.7649917352221738e-06, 'epoch': 0.66} 66%|██████▌ | 22558/34278 [24:47:37<11:27:52, 3.52s/it] 66%|██████▌ | 22559/34278 [24:47:40<11:06:39, 3.41s/it] {'loss': 0.1225, 'grad_norm': 1.0636440454375342, 'learning_rate': 2.764569137194503e-06, 'epoch': 0.66} 66%|██████▌ | 22559/34278 [24:47:40<11:06:39, 3.41s/it] 66%|██████▌ | 22560/34278 [24:47:43<11:09:01, 3.43s/it] {'loss': 0.0913, 'grad_norm': 0.7504238018266771, 'learning_rate': 2.7641465591242224e-06, 'epoch': 0.66} 66%|██████▌ | 22560/34278 [24:47:43<11:09:01, 3.43s/it] 66%|██████▌ | 22561/34278 [24:47:48<12:20:48, 3.79s/it] {'loss': 0.1096, 'grad_norm': 0.8251551102768702, 'learning_rate': 2.7637240010151103e-06, 'epoch': 0.66} 66%|██████▌ | 22561/34278 [24:47:48<12:20:48, 3.79s/it] 66%|██████▌ | 22562/34278 [24:47:51<11:40:06, 3.59s/it] {'loss': 0.1281, 'grad_norm': 0.8602708857579717, 'learning_rate': 2.763301462870936e-06, 'epoch': 0.66} 66%|██████▌ | 22562/34278 [24:47:51<11:40:06, 3.59s/it] 66%|██████▌ | 22563/34278 [24:47:54<11:20:57, 3.49s/it] {'loss': 0.1217, 'grad_norm': 0.7887120908935958, 'learning_rate': 2.7628789446954705e-06, 'epoch': 0.66} 66%|██████▌ | 22563/34278 [24:47:54<11:20:57, 3.49s/it] 
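The "Failed to fetch sample ... Exception: ..." messages above, with training continuing uninterrupted afterwards, indicate that the dataset's __getitem__ traps loader errors and falls back to a different sample. A minimal sketch of that catch-and-resample pattern; all class and variable names here are illustrative, not the actual aguvis/dataset.py implementation:

```python
import random

class ResilientDataset:
    """Catch-and-resample wrapper: when a sample fails to decode, log it
    and fetch a different one instead of crashing the training loop.
    Illustrative only; names do not come from aguvis/dataset.py."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples          # None stands in for a corrupt image
        self.max_retries = max_retries

    def _get_item(self, i):
        value = self.samples[i]
        if value is None:
            # Stand-in for the PIL.UnidentifiedImageError raised by Image.open
            raise ValueError(f"cannot identify image file at index {i}")
        return value

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randrange(len(self.samples))  # resample a new index
        raise RuntimeError("too many consecutive corrupt samples")

    def __len__(self):
        return len(self.samples)
```

With ds = ResilientDataset(["a", None, "c"]), ds[1] logs the failure and returns one of the healthy samples, which matches the log: each traceback is printed, a "Failed to fetch sample N" line follows, and the step timings continue without a gap.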
66%|██████▌ | 22564/34278 [24:47:58<11:16:36, 3.47s/it] {'loss': 0.1034, 'grad_norm': 0.9049183673836019, 'learning_rate': 2.7624564464924874e-06, 'epoch': 0.66} 66%|██████▌ | 22564/34278 [24:47:58<11:16:36, 3.47s/it] 66%|██████▌ | 22565/34278 [24:48:01<10:43:45, 3.30s/it] {'loss': 0.1307, 'grad_norm': 0.889621781164339, 'learning_rate': 2.7620339682657616e-06, 'epoch': 0.66} 66%|██████▌ | 22565/34278 [24:48:01<10:43:45, 3.30s/it] 66%|██████▌ | 22566/34278 [24:48:04<10:39:00, 3.27s/it] {'loss': 0.1121, 'grad_norm': 0.7334398805214061, 'learning_rate': 2.761611510019062e-06, 'epoch': 0.66} 66%|██████▌ | 22566/34278 [24:48:04<10:39:00, 3.27s/it] 66%|██████▌ | 22567/34278 [24:48:07<10:44:19, 3.30s/it] {'loss': 0.142, 'grad_norm': 0.7767238014675791, 'learning_rate': 2.7611890717561584e-06, 'epoch': 0.66} 66%|██████▌ | 22567/34278 [24:48:07<10:44:19, 3.30s/it] 66%|██████▌ | 22568/34278 [24:48:12<12:01:48, 3.70s/it] {'loss': 0.1152, 'grad_norm': 0.8985569388201397, 'learning_rate': 2.7607666534808262e-06, 'epoch': 0.66} 66%|██████▌ | 22568/34278 [24:48:12<12:01:48, 3.70s/it] 66%|██████▌ | 22569/34278 [24:48:16<12:53:31, 3.96s/it] {'loss': 0.1212, 'grad_norm': 0.9604261215054245, 'learning_rate': 2.760344255196835e-06, 'epoch': 0.66} 66%|██████▌ | 22569/34278 [24:48:16<12:53:31, 3.96s/it] 66%|██████▌ | 22570/34278 [24:48:19<11:55:33, 3.67s/it] {'loss': 0.1533, 'grad_norm': 1.0756881300950472, 'learning_rate': 2.7599218769079518e-06, 'epoch': 0.66} 66%|██████▌ | 22570/34278 [24:48:19<11:55:33, 3.67s/it] 66%|██████▌ | 22571/34278 [24:48:23<11:46:30, 3.62s/it] {'loss': 0.1068, 'grad_norm': 0.6519123585154943, 'learning_rate': 2.759499518617955e-06, 'epoch': 0.66} 66%|██████▌ | 22571/34278 [24:48:23<11:46:30, 3.62s/it] 66%|██████▌ | 22572/34278 [24:48:27<12:00:34, 3.69s/it] {'loss': 0.1266, 'grad_norm': 1.0883905142442654, 'learning_rate': 2.759077180330612e-06, 'epoch': 0.66} 66%|██████▌ | 22572/34278 [24:48:27<12:00:34, 3.69s/it] 66%|██████▌ | 22573/34278 
[24:48:30<11:18:43, 3.48s/it] {'loss': 0.1146, 'grad_norm': 1.0998632526864929, 'learning_rate': 2.758654862049691e-06, 'epoch': 0.66} 66%|██████▌ | 22573/34278 [24:48:30<11:18:43, 3.48s/it] 66%|██████▌ | 22574/34278 [24:48:33<11:06:12, 3.42s/it] {'loss': 0.1441, 'grad_norm': 0.9522597459448454, 'learning_rate': 2.758232563778966e-06, 'epoch': 0.66} 66%|██████▌ | 22574/34278 [24:48:33<11:06:12, 3.42s/it] 66%|██████▌ | 22575/34278 [24:48:37<11:16:33, 3.47s/it] {'loss': 0.1267, 'grad_norm': 0.8295697964866922, 'learning_rate': 2.7578102855222056e-06, 'epoch': 0.66} 66%|██████▌ | 22575/34278 [24:48:37<11:16:33, 3.47s/it] 66%|██████▌ | 22576/34278 [24:48:40<11:31:55, 3.55s/it] {'loss': 0.1113, 'grad_norm': 0.8743768041029046, 'learning_rate': 2.757388027283178e-06, 'epoch': 0.66} 66%|██████▌ | 22576/34278 [24:48:40<11:31:55, 3.55s/it] 66%|██████▌ | 22577/34278 [24:48:45<12:29:45, 3.84s/it] {'loss': 0.131, 'grad_norm': 1.2465440196922168, 'learning_rate': 2.7569657890656543e-06, 'epoch': 0.66} 66%|██████▌ | 22577/34278 [24:48:45<12:29:45, 3.84s/it] 66%|██████▌ | 22578/34278 [24:48:48<11:38:33, 3.58s/it] {'loss': 0.087, 'grad_norm': 0.8671282291886849, 'learning_rate': 2.7565435708734067e-06, 'epoch': 0.66} 66%|██████▌ | 22578/34278 [24:48:48<11:38:33, 3.58s/it] 66%|██████▌ | 22579/34278 [24:48:52<11:43:49, 3.61s/it] {'loss': 0.114, 'grad_norm': 1.0194614465372704, 'learning_rate': 2.7561213727102026e-06, 'epoch': 0.66} 66%|██████▌ | 22579/34278 [24:48:52<11:43:49, 3.61s/it] 66%|██████▌ | 22580/34278 [24:48:55<11:10:39, 3.44s/it] {'loss': 0.1359, 'grad_norm': 1.0730631218781865, 'learning_rate': 2.7556991945798097e-06, 'epoch': 0.66} 66%|██████▌ | 22580/34278 [24:48:55<11:10:39, 3.44s/it] 66%|██████▌ | 22581/34278 [24:48:58<10:53:13, 3.35s/it] {'loss': 0.1217, 'grad_norm': 1.0049835996455878, 'learning_rate': 2.755277036486e-06, 'epoch': 0.66} 66%|██████▌ | 22581/34278 [24:48:58<10:53:13, 3.35s/it] 66%|██████▌ | 22582/34278 [24:49:04<13:22:46, 4.12s/it] {'loss': 0.1341, 
'grad_norm': 0.9253939123222444, 'learning_rate': 2.7548548984325392e-06, 'epoch': 0.66} 66%|██████▌ | 22582/34278 [24:49:04<13:22:46, 4.12s/it] 66%|██████▌ | 22583/34278 [24:49:10<15:34:49, 4.80s/it] {'loss': 0.1269, 'grad_norm': 0.8551010319335021, 'learning_rate': 2.754432780423198e-06, 'epoch': 0.66} 66%|██████▌ | 22583/34278 [24:49:10<15:34:49, 4.80s/it] 66%|██████▌ | 22584/34278 [24:49:13<14:06:04, 4.34s/it] {'loss': 0.1574, 'grad_norm': 1.2062317792160597, 'learning_rate': 2.7540106824617467e-06, 'epoch': 0.66} 66%|██████▌ | 22584/34278 [24:49:13<14:06:04, 4.34s/it] 66%|██████▌ | 22585/34278 [24:49:17<13:09:51, 4.05s/it] {'loss': 0.1338, 'grad_norm': 1.0425488044374756, 'learning_rate': 2.753588604551952e-06, 'epoch': 0.66} 66%|██████▌ | 22585/34278 [24:49:17<13:09:51, 4.05s/it] 66%|██████▌ | 22586/34278 [24:49:21<13:46:37, 4.24s/it] {'loss': 0.1133, 'grad_norm': 0.6753586540234627, 'learning_rate': 2.75316654669758e-06, 'epoch': 0.66} 66%|██████▌ | 22586/34278 [24:49:21<13:46:37, 4.24s/it] 66%|██████▌ | 22587/34278 [24:49:24<12:33:52, 3.87s/it] {'loss': 0.1196, 'grad_norm': 0.8399699469968134, 'learning_rate': 2.752744508902403e-06, 'epoch': 0.66} 66%|██████▌ | 22587/34278 [24:49:24<12:33:52, 3.87s/it] 66%|██████▌ | 22588/34278 [24:49:28<12:01:29, 3.70s/it] {'loss': 0.1364, 'grad_norm': 1.1790994215607702, 'learning_rate': 2.752322491170184e-06, 'epoch': 0.66} 66%|██████▌ | 22588/34278 [24:49:28<12:01:29, 3.70s/it] 66%|██████▌ | 22589/34278 [24:49:34<14:13:30, 4.38s/it] {'loss': 0.1149, 'grad_norm': 0.8149822434798201, 'learning_rate': 2.7519004935046955e-06, 'epoch': 0.66} 66%|██████▌ | 22589/34278 [24:49:34<14:13:30, 4.38s/it] 66%|██████▌ | 22590/34278 [24:49:37<12:51:48, 3.96s/it] {'loss': 0.1208, 'grad_norm': 1.0824127306644329, 'learning_rate': 2.7514785159097006e-06, 'epoch': 0.66} 66%|██████▌ | 22590/34278 [24:49:37<12:51:48, 3.96s/it] 66%|██████▌ | 22591/34278 [24:49:41<13:02:09, 4.02s/it] {'loss': 0.1272, 'grad_norm': 0.8803409266018206, 
'learning_rate': 2.751056558388971e-06, 'epoch': 0.66} 66%|██████▌ | 22591/34278 [24:49:41<13:02:09, 4.02s/it] 66%|██████▌ | 22592/34278 [24:49:44<12:01:47, 3.71s/it] {'loss': 0.1227, 'grad_norm': 0.8494313396501025, 'learning_rate': 2.7506346209462715e-06, 'epoch': 0.66} 66%|██████▌ | 22592/34278 [24:49:44<12:01:47, 3.71s/it] 66%|██████▌ | 22593/34278 [24:49:47<11:19:24, 3.49s/it] {'loss': 0.1403, 'grad_norm': 0.7793815655046239, 'learning_rate': 2.7502127035853666e-06, 'epoch': 0.66} 66%|██████▌ | 22593/34278 [24:49:47<11:19:24, 3.49s/it] 66%|██████▌ | 22594/34278 [24:49:50<11:03:11, 3.41s/it] {'loss': 0.1105, 'grad_norm': 0.7266458890049697, 'learning_rate': 2.7497908063100266e-06, 'epoch': 0.66} 66%|██████▌ | 22594/34278 [24:49:50<11:03:11, 3.41s/it] 66%|██████▌ | 22595/34278 [24:49:53<10:49:03, 3.33s/it] {'loss': 0.1348, 'grad_norm': 0.7358696537864176, 'learning_rate': 2.7493689291240185e-06, 'epoch': 0.66} 66%|██████▌ | 22595/34278 [24:49:53<10:49:03, 3.33s/it] 66%|██████▌ | 22596/34278 [24:49:56<10:34:18, 3.26s/it] {'loss': 0.117, 'grad_norm': 0.9577455005381101, 'learning_rate': 2.7489470720311074e-06, 'epoch': 0.66} 66%|██████▌ | 22596/34278 [24:49:56<10:34:18, 3.26s/it] 66%|██████▌ | 22597/34278 [24:50:00<10:38:40, 3.28s/it] {'loss': 0.1227, 'grad_norm': 0.8449180607899556, 'learning_rate': 2.7485252350350576e-06, 'epoch': 0.66} 66%|██████▌ | 22597/34278 [24:50:00<10:38:40, 3.28s/it] 66%|██████▌ | 22598/34278 [24:50:05<12:51:23, 3.96s/it] {'loss': 0.1136, 'grad_norm': 0.8158164345241355, 'learning_rate': 2.748103418139639e-06, 'epoch': 0.66} 66%|██████▌ | 22598/34278 [24:50:05<12:51:23, 3.96s/it] 66%|██████▌ | 22599/34278 [24:50:09<12:24:44, 3.83s/it] {'loss': 0.1282, 'grad_norm': 0.8134861595351345, 'learning_rate': 2.747681621348615e-06, 'epoch': 0.66} 66%|██████▌ | 22599/34278 [24:50:09<12:24:44, 3.83s/it] 66%|██████▌ | 22600/34278 [24:50:12<12:04:05, 3.72s/it] {'loss': 0.1115, 'grad_norm': 1.0964996014100363, 'learning_rate': 2.7472598446657484e-06, 
'epoch': 0.66} 66%|██████▌ | 22600/34278 [24:50:12<12:04:05, 3.72s/it] 66%|██████▌ | 22601/34278 [24:50:18<14:13:05, 4.38s/it] {'loss': 0.116, 'grad_norm': 0.828303490228993, 'learning_rate': 2.746838088094812e-06, 'epoch': 0.66} 66%|██████▌ | 22601/34278 [24:50:18<14:13:05, 4.38s/it] 66%|██████▌ | 22602/34278 [24:50:21<12:41:12, 3.91s/it] {'loss': 0.1378, 'grad_norm': 0.7322896981011644, 'learning_rate': 2.746416351639567e-06, 'epoch': 0.66} 66%|██████▌ | 22602/34278 [24:50:21<12:41:12, 3.91s/it] 66%|██████▌ | 22603/34278 [24:50:24<11:47:15, 3.63s/it] {'loss': 0.1154, 'grad_norm': 0.7766145382743532, 'learning_rate': 2.7459946353037775e-06, 'epoch': 0.66} 66%|██████▌ | 22603/34278 [24:50:24<11:47:15, 3.63s/it] 66%|██████▌ | 22604/34278 [24:50:30<14:29:44, 4.47s/it] {'loss': 0.1155, 'grad_norm': 0.751708619639476, 'learning_rate': 2.7455729390912113e-06, 'epoch': 0.66} 66%|██████▌ | 22604/34278 [24:50:30<14:29:44, 4.47s/it] 66%|██████▌ | 22605/34278 [24:50:35<14:29:16, 4.47s/it] {'loss': 0.1212, 'grad_norm': 0.6654296957356924, 'learning_rate': 2.7451512630056323e-06, 'epoch': 0.66} 66%|██████▌ | 22605/34278 [24:50:35<14:29:16, 4.47s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa75b7e1940>
Failed to fetch sample 2654736. Exception: cannot identify image file <_io.BytesIO object at 0x7fa75b7e1940>
66%|██████▌ | 22606/34278 [24:50:38<13:28:15, 4.15s/it] {'loss': 0.1178, 'grad_norm': 0.7198116486010239, 'learning_rate': 2.7447296070508017e-06, 'epoch': 0.66} 66%|██████▌ | 22606/34278 [24:50:38<13:28:15, 4.15s/it] 66%|██████▌ | 22607/34278 [24:50:41<12:34:51, 3.88s/it] {'loss': 0.1119, 'grad_norm': 1.5396030220829788, 'learning_rate': 2.744307971230487e-06, 'epoch': 0.66} 66%|██████▌ | 22607/34278 [24:50:41<12:34:51, 3.88s/it] 66%|██████▌ | 22608/34278 [24:50:45<12:36:35, 3.89s/it] {'loss': 0.1219, 'grad_norm': 0.7797775196398998, 'learning_rate': 2.7438863555484545e-06, 'epoch': 0.66} 66%|██████▌ | 22608/34278 [24:50:45<12:36:35, 3.89s/it] 66%|██████▌ | 22609/34278 [24:50:50<13:21:52, 4.12s/it] {'loss': 0.1189, 'grad_norm': 0.7060434245789803, 'learning_rate': 2.7434647600084662e-06, 'epoch': 0.66} 66%|██████▌ | 22609/34278 [24:50:50<13:21:52, 4.12s/it] 66%|██████▌ | 22610/34278 [24:50:53<12:15:42, 3.78s/it] {'loss': 0.1101, 'grad_norm': 0.7733708686479651, 'learning_rate': 2.7430431846142837e-06, 'epoch': 0.66} 66%|██████▌ | 22610/34278 [24:50:53<12:15:42, 3.78s/it] 66%|██████▌ | 22611/34278 [24:50:57<12:25:54, 3.84s/it] {'loss': 0.1187, 'grad_norm': 0.6482630170163397, 'learning_rate': 2.742621629369675e-06, 'epoch': 0.66} 66%|██████▌ | 22611/34278 [24:50:57<12:25:54, 3.84s/it] 66%|██████▌ | 22612/34278 [24:51:00<11:48:17, 3.64s/it]
{'loss': 0.125, 'grad_norm': 0.9201612115909791, 'learning_rate': 2.742200094278399e-06, 'epoch': 0.66} 66%|██████▌ | 22612/34278 [24:51:00<11:48:17, 3.64s/it] 66%|██████▌ | 22613/34278 [24:51:04<12:15:38, 3.78s/it] {'loss': 0.1421, 'grad_norm': 0.8470097144589287, 'learning_rate': 2.741778579344222e-06, 'epoch': 0.66} 66%|██████▌ | 22613/34278 [24:51:04<12:15:38, 3.78s/it] 66%|██████▌ | 22614/34278 [24:51:08<11:51:17, 3.66s/it] {'loss': 0.1166, 'grad_norm': 0.7590184952460127, 'learning_rate': 2.7413570845709086e-06, 'epoch': 0.66} 66%|██████▌ | 22614/34278 [24:51:08<11:51:17, 3.66s/it] 66%|██████▌ | 22615/34278 [24:51:11<11:49:16, 3.65s/it] {'loss': 0.1323, 'grad_norm': 0.8141803945777077, 'learning_rate': 2.74093560996222e-06, 'epoch': 0.66} 66%|██████▌ | 22615/34278 [24:51:11<11:49:16, 3.65s/it] 66%|██████▌ | 22616/34278 [24:51:14<11:15:05, 3.47s/it] {'loss': 0.1274, 'grad_norm': 0.8725507264487203, 'learning_rate': 2.740514155521917e-06, 'epoch': 0.66} 66%|██████▌ | 22616/34278 [24:51:14<11:15:05, 3.47s/it] 66%|██████▌ | 22617/34278 [24:51:19<12:57:16, 4.00s/it] {'loss': 0.139, 'grad_norm': 0.9987485743600544, 'learning_rate': 2.7400927212537643e-06, 'epoch': 0.66} 66%|██████▌ | 22617/34278 [24:51:19<12:57:16, 4.00s/it] 66%|██████▌ | 22618/34278 [24:51:23<12:14:07, 3.78s/it] {'loss': 0.1146, 'grad_norm': 0.751603426942623, 'learning_rate': 2.7396713071615262e-06, 'epoch': 0.66} 66%|██████▌ | 22618/34278 [24:51:23<12:14:07, 3.78s/it] 66%|██████▌ | 22619/34278 [24:51:27<12:24:18, 3.83s/it] {'loss': 0.1342, 'grad_norm': 0.7431142504141429, 'learning_rate': 2.739249913248963e-06, 'epoch': 0.66} 66%|██████▌ | 22619/34278 [24:51:27<12:24:18, 3.83s/it] 66%|██████▌ | 22620/34278 [24:51:30<11:49:37, 3.65s/it] {'loss': 0.1384, 'grad_norm': 0.9901582677371695, 'learning_rate': 2.7388285395198354e-06, 'epoch': 0.66} 66%|██████▌ | 22620/34278 [24:51:30<11:49:37, 3.65s/it] 66%|██████▌ | 22621/34278 [24:51:34<12:40:45, 3.92s/it] {'loss': 0.1129, 'grad_norm': 
0.6486016174974297, 'learning_rate': 2.738407185977908e-06, 'epoch': 0.66} 66%|██████▌ | 22621/34278 [24:51:34<12:40:45, 3.92s/it] 66%|██████▌ | 22622/34278 [24:51:40<13:56:09, 4.30s/it] {'loss': 0.105, 'grad_norm': 0.7825818162792993, 'learning_rate': 2.7379858526269422e-06, 'epoch': 0.66} 66%|██████▌ | 22622/34278 [24:51:40<13:56:09, 4.30s/it] 66%|██████▌ | 22623/34278 [24:51:43<13:12:26, 4.08s/it] {'loss': 0.119, 'grad_norm': 0.7434131882275584, 'learning_rate': 2.7375645394706963e-06, 'epoch': 0.66} 66%|██████▌ | 22623/34278 [24:51:43<13:12:26, 4.08s/it] 66%|██████▌ | 22624/34278 [24:51:46<12:22:24, 3.82s/it] {'loss': 0.134, 'grad_norm': 0.8849720606460794, 'learning_rate': 2.7371432465129343e-06, 'epoch': 0.66} 66%|██████▌ | 22624/34278 [24:51:46<12:22:24, 3.82s/it] 66%|██████▌ | 22625/34278 [24:51:50<11:44:43, 3.63s/it] {'loss': 0.0968, 'grad_norm': 0.749407026158925, 'learning_rate': 2.736721973757419e-06, 'epoch': 0.66} 66%|██████▌ | 22625/34278 [24:51:50<11:44:43, 3.63s/it] 66%|██████▌ | 22626/34278 [24:51:53<11:09:58, 3.45s/it] {'loss': 0.1402, 'grad_norm': 0.7722278356763757, 'learning_rate': 2.7363007212079097e-06, 'epoch': 0.66} 66%|██████▌ | 22626/34278 [24:51:53<11:09:58, 3.45s/it] 66%|██████▌ | 22627/34278 [24:51:56<10:44:56, 3.32s/it] {'loss': 0.1364, 'grad_norm': 0.9857979943000369, 'learning_rate': 2.735879488868165e-06, 'epoch': 0.66} 66%|██████▌ | 22627/34278 [24:51:56<10:44:56, 3.32s/it] 66%|██████▌ | 22628/34278 [24:51:59<10:22:53, 3.21s/it] {'loss': 0.1221, 'grad_norm': 0.9618799872613633, 'learning_rate': 2.7354582767419498e-06, 'epoch': 0.66} 66%|██████▌ | 22628/34278 [24:51:59<10:22:53, 3.21s/it] 66%|██████▌ | 22629/34278 [24:52:03<11:25:08, 3.53s/it] {'loss': 0.1237, 'grad_norm': 0.7715563866110873, 'learning_rate': 2.7350370848330204e-06, 'epoch': 0.66} 66%|██████▌ | 22629/34278 [24:52:03<11:25:08, 3.53s/it] 66%|██████▌ | 22630/34278 [24:52:06<11:05:10, 3.43s/it] {'loss': 0.1066, 'grad_norm': 0.9087764788045213, 'learning_rate': 
2.7346159131451396e-06, 'epoch': 0.66} 66%|██████▌ | 22630/34278 [24:52:06<11:05:10, 3.43s/it] 66%|██████▌ | 22631/34278 [24:52:09<10:35:39, 3.27s/it] {'loss': 0.1238, 'grad_norm': 1.2365590407456843, 'learning_rate': 2.7341947616820686e-06, 'epoch': 0.66} 66%|██████▌ | 22631/34278 [24:52:09<10:35:39, 3.27s/it] 66%|██████▌ | 22632/34278 [24:52:12<10:37:35, 3.28s/it] {'loss': 0.1078, 'grad_norm': 0.8406230466167327, 'learning_rate': 2.7337736304475665e-06, 'epoch': 0.66} 66%|██████▌ | 22632/34278 [24:52:12<10:37:35, 3.28s/it] 66%|██████▌ | 22633/34278 [24:52:18<12:57:44, 4.01s/it] {'loss': 0.1041, 'grad_norm': 0.6884332220566729, 'learning_rate': 2.7333525194453904e-06, 'epoch': 0.66} 66%|██████▌ | 22633/34278 [24:52:18<12:57:44, 4.01s/it] 66%|██████▌ | 22634/34278 [24:52:23<13:35:17, 4.20s/it] {'loss': 0.1381, 'grad_norm': 0.8460422483031798, 'learning_rate': 2.732931428679303e-06, 'epoch': 0.66} 66%|██████▌ | 22634/34278 [24:52:23<13:35:17, 4.20s/it] 66%|██████▌ | 22635/34278 [24:52:26<12:33:58, 3.89s/it] {'loss': 0.1057, 'grad_norm': 0.7864247637944488, 'learning_rate': 2.7325103581530616e-06, 'epoch': 0.66} 66%|██████▌ | 22635/34278 [24:52:26<12:33:58, 3.89s/it] 66%|██████▌ | 22636/34278 [24:52:31<13:31:01, 4.18s/it] {'loss': 0.1331, 'grad_norm': 0.9469238268633753, 'learning_rate': 2.732089307870428e-06, 'epoch': 0.66} 66%|██████▌ | 22636/34278 [24:52:31<13:31:01, 4.18s/it] 66%|██████▌ | 22637/34278 [24:52:34<12:49:30, 3.97s/it] {'loss': 0.1067, 'grad_norm': 0.8255544281769466, 'learning_rate': 2.7316682778351576e-06, 'epoch': 0.66} 66%|██████▌ | 22637/34278 [24:52:34<12:49:30, 3.97s/it] 66%|██████▌ | 22638/34278 [24:52:40<14:52:07, 4.60s/it] {'loss': 0.1138, 'grad_norm': 0.7599272483817766, 'learning_rate': 2.731247268051014e-06, 'epoch': 0.66} 66%|██████▌ | 22638/34278 [24:52:40<14:52:07, 4.60s/it] 66%|██████▌ | 22639/34278 [24:52:44<13:38:49, 4.22s/it] {'loss': 0.1035, 'grad_norm': 1.4022651437557578, 'learning_rate': 2.730826278521753e-06, 'epoch': 0.66} 
66%|██████▌ | 22639/34278 [24:52:44<13:38:49, 4.22s/it] 66%|██████▌ | 22640/34278 [24:52:49<15:20:02, 4.74s/it] {'loss': 0.098, 'grad_norm': 0.8874944794084177, 'learning_rate': 2.7304053092511307e-06, 'epoch': 0.66} 66%|██████▌ | 22640/34278 [24:52:50<15:20:02, 4.74s/it] 66%|██████▌ | 22641/34278 [24:52:53<13:57:15, 4.32s/it] {'loss': 0.1167, 'grad_norm': 0.7912420615086282, 'learning_rate': 2.7299843602429076e-06, 'epoch': 0.66} 66%|██████▌ | 22641/34278 [24:52:53<13:57:15, 4.32s/it] 66%|██████▌ | 22642/34278 [24:52:59<16:04:43, 4.97s/it] {'loss': 0.1204, 'grad_norm': 0.899065096134132, 'learning_rate': 2.7295634315008456e-06, 'epoch': 0.66} 66%|██████▌ | 22642/34278 [24:52:59<16:04:43, 4.97s/it] 66%|██████▌ | 22643/34278 [24:53:04<15:35:54, 4.83s/it] {'loss': 0.1114, 'grad_norm': 1.2234695633380441, 'learning_rate': 2.7291425230286962e-06, 'epoch': 0.66} 66%|██████▌ | 22643/34278 [24:53:04<15:35:54, 4.83s/it] 66%|██████▌ | 22644/34278 [24:53:07<14:02:00, 4.34s/it] {'loss': 0.1191, 'grad_norm': 0.7589885649474262, 'learning_rate': 2.7287216348302225e-06, 'epoch': 0.66} 66%|██████▌ | 22644/34278 [24:53:07<14:02:00, 4.34s/it] 66%|██████▌ | 22645/34278 [24:53:11<13:27:27, 4.16s/it] {'loss': 0.123, 'grad_norm': 1.024785208722725, 'learning_rate': 2.7283007669091804e-06, 'epoch': 0.66} 66%|██████▌ | 22645/34278 [24:53:11<13:27:27, 4.16s/it] 66%|██████▌ | 22646/34278 [24:53:17<15:16:24, 4.73s/it] {'loss': 0.1158, 'grad_norm': 0.7557866587988469, 'learning_rate': 2.727879919269324e-06, 'epoch': 0.66} 66%|██████▌ | 22646/34278 [24:53:17<15:16:24, 4.73s/it] 66%|██████▌ | 22647/34278 [24:53:22<15:59:12, 4.95s/it] {'loss': 0.1318, 'grad_norm': 0.8675824733934782, 'learning_rate': 2.727459091914414e-06, 'epoch': 0.66} 66%|██████▌ | 22647/34278 [24:53:22<15:59:12, 4.95s/it] 66%|██████▌ | 22648/34278 [24:53:27<15:18:21, 4.74s/it] {'loss': 0.1035, 'grad_norm': 0.9281154785377084, 'learning_rate': 2.727038284848208e-06, 'epoch': 0.66} 66%|██████▌ | 22648/34278 
[24:53:27<15:18:21, 4.74s/it] 66%|██████▌ | 22649/34278 [24:53:30<14:27:23, 4.48s/it] {'loss': 0.1208, 'grad_norm': 0.973165458035548, 'learning_rate': 2.726617498074462e-06, 'epoch': 0.66} 66%|██████▌ | 22649/34278 [24:53:30<14:27:23, 4.48s/it] 66%|██████▌ | 22650/34278 [24:53:33<13:01:25, 4.03s/it] {'loss': 0.1154, 'grad_norm': 0.7990752633850764, 'learning_rate': 2.7261967315969307e-06, 'epoch': 0.66} 66%|██████▌ | 22650/34278 [24:53:33<13:01:25, 4.03s/it] 66%|██████▌ | 22651/34278 [24:53:38<13:09:51, 4.08s/it] {'loss': 0.1187, 'grad_norm': 0.9689587981984759, 'learning_rate': 2.7257759854193735e-06, 'epoch': 0.66} 66%|██████▌ | 22651/34278 [24:53:38<13:09:51, 4.08s/it] 66%|██████▌ | 22652/34278 [24:53:44<15:36:44, 4.83s/it] {'loss': 0.1379, 'grad_norm': 1.0824105336467058, 'learning_rate': 2.7253552595455458e-06, 'epoch': 0.66} 66%|██████▌ | 22652/34278 [24:53:44<15:36:44, 4.83s/it] 66%|██████▌ | 22653/34278 [24:53:48<14:11:51, 4.40s/it] {'loss': 0.1237, 'grad_norm': 0.976149906546021, 'learning_rate': 2.724934553979201e-06, 'epoch': 0.66} 66%|██████▌ | 22653/34278 [24:53:48<14:11:51, 4.40s/it] 66%|██████▌ | 22654/34278 [24:53:53<15:28:52, 4.79s/it] {'loss': 0.1063, 'grad_norm': 0.8239511261071738, 'learning_rate': 2.724513868724098e-06, 'epoch': 0.66} 66%|██████▌ | 22654/34278 [24:53:53<15:28:52, 4.79s/it] 66%|██████▌ | 22655/34278 [24:53:56<13:44:40, 4.26s/it] {'loss': 0.1142, 'grad_norm': 0.7308597033836158, 'learning_rate': 2.724093203783993e-06, 'epoch': 0.66} 66%|██████▌ | 22655/34278 [24:53:56<13:44:40, 4.26s/it] 66%|██████▌ | 22656/34278 [24:54:00<12:52:05, 3.99s/it] {'loss': 0.0989, 'grad_norm': 1.0008807120680314, 'learning_rate': 2.7236725591626413e-06, 'epoch': 0.66} 66%|██████▌ | 22656/34278 [24:54:00<12:52:05, 3.99s/it] 66%|██████▌ | 22657/34278 [24:54:05<14:16:00, 4.42s/it] {'loss': 0.1116, 'grad_norm': 0.8664122835198478, 'learning_rate': 2.7232519348637955e-06, 'epoch': 0.66} 66%|██████▌ | 22657/34278 [24:54:05<14:16:00, 4.42s/it] 66%|██████▌ | 
22658/34278 [24:54:09<13:34:27, 4.21s/it] {'loss': 0.1204, 'grad_norm': 0.7248420383427195, 'learning_rate': 2.7228313308912145e-06, 'epoch': 0.66} 66%|██████▌ | 22658/34278 [24:54:09<13:34:27, 4.21s/it] 66%|██████▌ | 22659/34278 [24:54:12<12:41:37, 3.93s/it] {'loss': 0.133, 'grad_norm': 0.7085192623477988, 'learning_rate': 2.7224107472486504e-06, 'epoch': 0.66} 66%|██████▌ | 22659/34278 [24:54:12<12:41:37, 3.93s/it] 66%|██████▌ | 22660/34278 [24:54:16<12:14:39, 3.79s/it] {'loss': 0.1119, 'grad_norm': 0.9273281691664607, 'learning_rate': 2.721990183939859e-06, 'epoch': 0.66} 66%|██████▌ | 22660/34278 [24:54:16<12:14:39, 3.79s/it] 66%|██████▌ | 22661/34278 [24:54:19<11:59:35, 3.72s/it] {'loss': 0.1035, 'grad_norm': 0.7399050138849338, 'learning_rate': 2.7215696409685977e-06, 'epoch': 0.66} 66%|██████▌ | 22661/34278 [24:54:19<11:59:35, 3.72s/it] 66%|██████▌ | 22662/34278 [24:54:22<11:25:28, 3.54s/it] {'loss': 0.1336, 'grad_norm': 0.7656988761837674, 'learning_rate': 2.7211491183386185e-06, 'epoch': 0.66} 66%|██████▌ | 22662/34278 [24:54:22<11:25:28, 3.54s/it] 66%|██████▌ | 22663/34278 [24:54:26<11:26:32, 3.55s/it] {'loss': 0.1288, 'grad_norm': 1.0817766966059668, 'learning_rate': 2.720728616053674e-06, 'epoch': 0.66} 66%|██████▌ | 22663/34278 [24:54:26<11:26:32, 3.55s/it] 66%|██████▌ | 22664/34278 [24:54:29<10:48:14, 3.35s/it] {'loss': 0.1007, 'grad_norm': 0.9326493487858707, 'learning_rate': 2.7203081341175225e-06, 'epoch': 0.66} 66%|██████▌ | 22664/34278 [24:54:29<10:48:14, 3.35s/it] 66%|██████▌ | 22665/34278 [24:54:32<10:34:19, 3.28s/it] {'loss': 0.1072, 'grad_norm': 0.8270822126515731, 'learning_rate': 2.7198876725339143e-06, 'epoch': 0.66} 66%|██████▌ | 22665/34278 [24:54:32<10:34:19, 3.28s/it] 66%|██████▌ | 22666/34278 [24:54:38<13:08:21, 4.07s/it] {'loss': 0.143, 'grad_norm': 0.9552120809115704, 'learning_rate': 2.719467231306605e-06, 'epoch': 0.66} 66%|██████▌ | 22666/34278 [24:54:38<13:08:21, 4.07s/it] 66%|██████▌ | 22667/34278 [24:54:40<11:54:43, 3.69s/it] 
{'loss': 0.1209, 'grad_norm': 1.0792521373628325, 'learning_rate': 2.7190468104393474e-06, 'epoch': 0.66} 66%|██████▌ | 22667/34278 [24:54:40<11:54:43, 3.69s/it] 66%|██████▌ | 22668/34278 [24:54:47<14:15:15, 4.42s/it] {'loss': 0.0945, 'grad_norm': 0.8389385598404163, 'learning_rate': 2.7186264099358965e-06, 'epoch': 0.66} 66%|██████▌ | 22668/34278 [24:54:47<14:15:15, 4.42s/it] 66%|██████▌ | 22669/34278 [24:54:50<13:04:17, 4.05s/it] {'loss': 0.1457, 'grad_norm': 0.8101273566871517, 'learning_rate': 2.7182060298000047e-06, 'epoch': 0.66} 66%|██████▌ | 22669/34278 [24:54:50<13:04:17, 4.05s/it] 66%|██████▌ | 22670/34278 [24:54:53<12:21:39, 3.83s/it] {'loss': 0.1285, 'grad_norm': 0.8277979793603644, 'learning_rate': 2.7177856700354233e-06, 'epoch': 0.66} 66%|██████▌ | 22670/34278 [24:54:53<12:21:39, 3.83s/it] 66%|██████▌ | 22671/34278 [24:54:57<12:00:46, 3.73s/it] {'loss': 0.1152, 'grad_norm': 1.0234917416377036, 'learning_rate': 2.7173653306459056e-06, 'epoch': 0.66} 66%|██████▌ | 22671/34278 [24:54:57<12:00:46, 3.73s/it] 66%|██████▌ | 22672/34278 [24:55:00<11:46:31, 3.65s/it] {'loss': 0.0961, 'grad_norm': 0.6416059728778798, 'learning_rate': 2.716945011635208e-06, 'epoch': 0.66} 66%|██████▌ | 22672/34278 [24:55:00<11:46:31, 3.65s/it] 66%|██████▌ | 22673/34278 [24:55:04<11:34:20, 3.59s/it] {'loss': 0.1093, 'grad_norm': 0.7594503786367398, 'learning_rate': 2.716524713007078e-06, 'epoch': 0.66} 66%|██████▌ | 22673/34278 [24:55:04<11:34:20, 3.59s/it] 66%|██████▌ | 22674/34278 [24:55:07<11:54:31, 3.69s/it] {'loss': 0.1218, 'grad_norm': 0.7648313826222802, 'learning_rate': 2.716104434765273e-06, 'epoch': 0.66} 66%|██████▌ | 22674/34278 [24:55:07<11:54:31, 3.69s/it] 66%|██████▌ | 22675/34278 [24:55:10<11:03:02, 3.43s/it] {'loss': 0.1183, 'grad_norm': 0.6918303444233517, 'learning_rate': 2.715684176913542e-06, 'epoch': 0.66} 66%|██████▌ | 22675/34278 [24:55:10<11:03:02, 3.43s/it] 66%|██████▌ | 22676/34278 [24:55:16<13:15:35, 4.11s/it] {'loss': 0.1297, 'grad_norm': 
0.8202650141884076, 'learning_rate': 2.7152639394556345e-06, 'epoch': 0.66} 66%|██████▌ | 22676/34278 [24:55:16<13:15:35, 4.11s/it] 66%|██████▌ | 22677/34278 [24:55:19<12:05:47, 3.75s/it] {'loss': 0.1175, 'grad_norm': 0.8170915936510572, 'learning_rate': 2.7148437223953063e-06, 'epoch': 0.66} 66%|██████▌ | 22677/34278 [24:55:19<12:05:47, 3.75s/it] 66%|██████▌ | 22678/34278 [24:55:22<11:11:37, 3.47s/it] {'loss': 0.1346, 'grad_norm': 1.0014751947777412, 'learning_rate': 2.7144235257363095e-06, 'epoch': 0.66} 66%|██████▌ | 22678/34278 [24:55:22<11:11:37, 3.47s/it] 66%|██████▌ | 22679/34278 [24:55:27<13:06:05, 4.07s/it] {'loss': 0.1249, 'grad_norm': 1.1731135235685226, 'learning_rate': 2.7140033494823937e-06, 'epoch': 0.66} 66%|██████▌ | 22679/34278 [24:55:27<13:06:05, 4.07s/it] 66%|██████▌ | 22680/34278 [24:55:32<13:35:47, 4.22s/it] {'loss': 0.1167, 'grad_norm': 0.813509199094576, 'learning_rate': 2.713583193637308e-06, 'epoch': 0.66} 66%|██████▌ | 22680/34278 [24:55:32<13:35:47, 4.22s/it] 66%|██████▌ | 22681/34278 [24:55:35<12:47:27, 3.97s/it] {'loss': 0.118, 'grad_norm': 0.9376931570758963, 'learning_rate': 2.713163058204808e-06, 'epoch': 0.66} 66%|██████▌ | 22681/34278 [24:55:35<12:47:27, 3.97s/it] 66%|██████▌ | 22682/34278 [24:55:39<12:54:58, 4.01s/it] {'loss': 0.1092, 'grad_norm': 0.9350101979027772, 'learning_rate': 2.712742943188642e-06, 'epoch': 0.66} 66%|██████▌ | 22682/34278 [24:55:39<12:54:58, 4.01s/it] 66%|██████▌ | 22683/34278 [24:55:43<12:14:32, 3.80s/it] {'loss': 0.1516, 'grad_norm': 0.7997215927470465, 'learning_rate': 2.7123228485925603e-06, 'epoch': 0.66} 66%|██████▌ | 22683/34278 [24:55:43<12:14:32, 3.80s/it] 66%|██████▌ | 22684/34278 [24:55:46<11:54:31, 3.70s/it] {'loss': 0.1301, 'grad_norm': 1.004204197024303, 'learning_rate': 2.7119027744203125e-06, 'epoch': 0.66} 66%|██████▌ | 22684/34278 [24:55:46<11:54:31, 3.70s/it] 66%|██████▌ | 22685/34278 [24:55:50<12:29:25, 3.88s/it] {'loss': 0.1384, 'grad_norm': 0.8437709970552765, 'learning_rate': 
2.7114827206756534e-06, 'epoch': 0.66} 66%|██████▌ | 22685/34278 [24:55:50<12:29:25, 3.88s/it] 66%|██████▌ | 22686/34278 [24:55:54<12:43:34, 3.95s/it] {'loss': 0.1193, 'grad_norm': 0.7818249507810892, 'learning_rate': 2.71106268736233e-06, 'epoch': 0.66} 66%|██████▌ | 22686/34278 [24:55:54<12:43:34, 3.95s/it] 66%|██████▌ | 22687/34278 [24:55:58<12:09:30, 3.78s/it] {'loss': 0.1061, 'grad_norm': 0.8459262924633701, 'learning_rate': 2.7106426744840903e-06, 'epoch': 0.66} 66%|██████▌ | 22687/34278 [24:55:58<12:09:30, 3.78s/it] 66%|██████▌ | 22688/34278 [24:56:02<12:33:22, 3.90s/it] {'loss': 0.1364, 'grad_norm': 0.7941108554831526, 'learning_rate': 2.710222682044689e-06, 'epoch': 0.66} 66%|██████▌ | 22688/34278 [24:56:02<12:33:22, 3.90s/it] 66%|██████▌ | 22689/34278 [24:56:05<11:59:47, 3.73s/it] {'loss': 0.1131, 'grad_norm': 0.8446789782799237, 'learning_rate': 2.70980271004787e-06, 'epoch': 0.66} 66%|██████▌ | 22689/34278 [24:56:05<11:59:47, 3.73s/it] 66%|██████▌ | 22690/34278 [24:56:11<14:16:51, 4.44s/it] {'loss': 0.1033, 'grad_norm': 0.7177839383770583, 'learning_rate': 2.7093827584973864e-06, 'epoch': 0.66} 66%|██████▌ | 22690/34278 [24:56:11<14:16:51, 4.44s/it] 66%|██████▌ | 22691/34278 [24:56:15<13:53:25, 4.32s/it] {'loss': 0.1453, 'grad_norm': 0.9387945384857022, 'learning_rate': 2.708962827396988e-06, 'epoch': 0.66} 66%|██████▌ | 22691/34278 [24:56:15<13:53:25, 4.32s/it] 66%|██████▌ | 22692/34278 [24:56:18<12:40:31, 3.94s/it] {'loss': 0.1007, 'grad_norm': 0.8395679812808373, 'learning_rate': 2.7085429167504227e-06, 'epoch': 0.66} 66%|██████▌ | 22692/34278 [24:56:19<12:40:31, 3.94s/it] 66%|██████▌ | 22693/34278 [24:56:22<11:53:41, 3.70s/it] {'loss': 0.1278, 'grad_norm': 0.7686783234017451, 'learning_rate': 2.708123026561438e-06, 'epoch': 0.66} 66%|██████▌ | 22693/34278 [24:56:22<11:53:41, 3.70s/it] 66%|██████▌ | 22694/34278 [24:56:25<11:42:08, 3.64s/it] {'loss': 0.1374, 'grad_norm': 0.8732786275286675, 'learning_rate': 2.7077031568337853e-06, 'epoch': 0.66} 
66%|██████▌ | 22694/34278 [24:56:25<11:42:08, 3.64s/it] 66%|██████▌ | 22695/34278 [24:56:28<11:09:37, 3.47s/it] {'loss': 0.1275, 'grad_norm': 0.9427103113246084, 'learning_rate': 2.7072833075712102e-06, 'epoch': 0.66} 66%|██████▌ | 22695/34278 [24:56:28<11:09:37, 3.47s/it] 66%|██████▌ | 22696/34278 [24:56:32<11:44:37, 3.65s/it] {'loss': 0.1084, 'grad_norm': 0.7449211040189512, 'learning_rate': 2.7068634787774637e-06, 'epoch': 0.66} 66%|██████▌ | 22696/34278 [24:56:32<11:44:37, 3.65s/it] 66%|██████▌ | 22697/34278 [24:56:35<11:15:04, 3.50s/it] {'loss': 0.1332, 'grad_norm': 1.0419258901937143, 'learning_rate': 2.7064436704562906e-06, 'epoch': 0.66} 66%|██████▌ | 22697/34278 [24:56:35<11:15:04, 3.50s/it] 66%|██████▌ | 22698/34278 [24:56:38<10:35:30, 3.29s/it] {'loss': 0.1061, 'grad_norm': 0.6996777628115296, 'learning_rate': 2.706023882611443e-06, 'epoch': 0.66} 66%|██████▌ | 22698/34278 [24:56:38<10:35:30, 3.29s/it] 66%|██████▌ | 22699/34278 [24:56:41<10:05:02, 3.14s/it] {'loss': 0.1055, 'grad_norm': 1.0122235070797678, 'learning_rate': 2.705604115246667e-06, 'epoch': 0.66} 66%|██████▌ | 22699/34278 [24:56:41<10:05:02, 3.14s/it] 66%|██████▌ | 22700/34278 [24:56:45<10:29:07, 3.26s/it] {'loss': 0.1146, 'grad_norm': 0.7493860508219115, 'learning_rate': 2.7051843683657073e-06, 'epoch': 0.66} 66%|██████▌ | 22700/34278 [24:56:45<10:29:07, 3.26s/it] 66%|██████▌ | 22701/34278 [24:56:48<10:42:46, 3.33s/it] {'loss': 0.1298, 'grad_norm': 0.9118083070252305, 'learning_rate': 2.704764641972314e-06, 'epoch': 0.66} 66%|██████▌ | 22701/34278 [24:56:48<10:42:46, 3.33s/it] 66%|██████▌ | 22702/34278 [24:56:51<10:28:04, 3.26s/it] {'loss': 0.0997, 'grad_norm': 0.7962610988844777, 'learning_rate': 2.7043449360702356e-06, 'epoch': 0.66} 66%|██████▌ | 22702/34278 [24:56:51<10:28:04, 3.26s/it] 66%|██████▌ | 22703/34278 [24:56:54<10:16:53, 3.20s/it] {'loss': 0.1139, 'grad_norm': 0.8788015974195829, 'learning_rate': 2.703925250663216e-06, 'epoch': 0.66} 66%|██████▌ | 22703/34278 
[24:56:54<10:16:53, 3.20s/it] 66%|██████▌ | 22704/34278 [24:56:57<10:07:13, 3.15s/it] {'loss': 0.1232, 'grad_norm': 0.8997812162323178, 'learning_rate': 2.7035055857550056e-06, 'epoch': 0.66} 66%|██████▌ | 22704/34278 [24:56:57<10:07:13, 3.15s/it] 66%|██████▌ | 22705/34278 [24:57:01<11:01:36, 3.43s/it] {'loss': 0.106, 'grad_norm': 0.7622724133070561, 'learning_rate': 2.703085941349349e-06, 'epoch': 0.66} 66%|██████▌ | 22705/34278 [24:57:01<11:01:36, 3.43s/it] 66%|██████▌ | 22706/34278 [24:57:05<11:27:18, 3.56s/it] {'loss': 0.1294, 'grad_norm': 0.715834222217085, 'learning_rate': 2.702666317449991e-06, 'epoch': 0.66} 66%|██████▌ | 22706/34278 [24:57:05<11:27:18, 3.56s/it] 66%|██████▌ | 22707/34278 [24:57:09<11:57:25, 3.72s/it] {'loss': 0.1324, 'grad_norm': 0.863459359701019, 'learning_rate': 2.70224671406068e-06, 'epoch': 0.66} 66%|██████▌ | 22707/34278 [24:57:09<11:57:25, 3.72s/it] 66%|██████▌ | 22708/34278 [24:57:12<11:18:32, 3.52s/it] {'loss': 0.1229, 'grad_norm': 1.007169623792825, 'learning_rate': 2.701827131185163e-06, 'epoch': 0.66} 66%|██████▌ | 22708/34278 [24:57:12<11:18:32, 3.52s/it] 66%|██████▌ | 22709/34278 [24:57:15<10:51:25, 3.38s/it] {'loss': 0.1311, 'grad_norm': 0.9935501999166708, 'learning_rate': 2.7014075688271857e-06, 'epoch': 0.66} 66%|██████▌ | 22709/34278 [24:57:15<10:51:25, 3.38s/it] 66%|██████▋ | 22710/34278 [24:57:18<10:20:57, 3.22s/it] {'loss': 0.1334, 'grad_norm': 0.8176213043713191, 'learning_rate': 2.70098802699049e-06, 'epoch': 0.66} 66%|██████▋ | 22710/34278 [24:57:18<10:20:57, 3.22s/it] 66%|██████▋ | 22711/34278 [24:57:22<10:33:24, 3.29s/it] {'loss': 0.1098, 'grad_norm': 0.8131923698562828, 'learning_rate': 2.7005685056788266e-06, 'epoch': 0.66} 66%|██████▋ | 22711/34278 [24:57:22<10:33:24, 3.29s/it] 66%|██████▋ | 22712/34278 [24:57:26<11:25:11, 3.55s/it] {'loss': 0.1169, 'grad_norm': 0.6752129161337445, 'learning_rate': 2.700149004895939e-06, 'epoch': 0.66} 66%|██████▋ | 22712/34278 [24:57:26<11:25:11, 3.55s/it] 66%|██████▋ | 
22713/34278 [24:57:30<11:39:39, 3.63s/it] {'loss': 0.1615, 'grad_norm': 1.3347273643967854, 'learning_rate': 2.69972952464557e-06, 'epoch': 0.66} 66%|██████▋ | 22713/34278 [24:57:30<11:39:39, 3.63s/it] 66%|██████▋ | 22714/34278 [24:57:36<13:59:52, 4.36s/it] {'loss': 0.1235, 'grad_norm': 0.9448109125956424, 'learning_rate': 2.6993100649314663e-06, 'epoch': 0.66} 66%|██████▋ | 22714/34278 [24:57:36<13:59:52, 4.36s/it] 66%|██████▋ | 22715/34278 [24:57:41<15:22:03, 4.78s/it] {'loss': 0.1326, 'grad_norm': 0.7656346529520892, 'learning_rate': 2.6988906257573757e-06, 'epoch': 0.66} 66%|██████▋ | 22715/34278 [24:57:41<15:22:03, 4.78s/it] 66%|██████▋ | 22716/34278 [24:57:45<14:04:17, 4.38s/it] {'loss': 0.1242, 'grad_norm': 0.9833753913203368, 'learning_rate': 2.6984712071270396e-06, 'epoch': 0.66} 66%|██████▋ | 22716/34278 [24:57:45<14:04:17, 4.38s/it] 66%|██████▋ | 22717/34278 [24:57:48<13:15:00, 4.13s/it] {'loss': 0.1582, 'grad_norm': 1.089224248732522, 'learning_rate': 2.6980518090442016e-06, 'epoch': 0.66} 66%|██████▋ | 22717/34278 [24:57:48<13:15:00, 4.13s/it] 66%|██████▋ | 22718/34278 [24:57:54<14:44:47, 4.59s/it] {'loss': 0.119, 'grad_norm': 0.7869744316955992, 'learning_rate': 2.697632431512609e-06, 'epoch': 0.66} 66%|██████▋ | 22718/34278 [24:57:54<14:44:47, 4.59s/it] 66%|██████▋ | 22719/34278 [24:57:57<13:26:25, 4.19s/it] {'loss': 0.1059, 'grad_norm': 0.8584606434500435, 'learning_rate': 2.6972130745360033e-06, 'epoch': 0.66} 66%|██████▋ | 22719/34278 [24:57:57<13:26:25, 4.19s/it] 66%|██████▋ | 22720/34278 [24:58:02<13:51:00, 4.31s/it] {'loss': 0.1352, 'grad_norm': 0.8352241109709454, 'learning_rate': 2.696793738118129e-06, 'epoch': 0.66} 66%|██████▋ | 22720/34278 [24:58:02<13:51:00, 4.31s/it] 66%|██████▋ | 22721/34278 [24:58:04<12:06:18, 3.77s/it] {'loss': 0.1433, 'grad_norm': 0.9355569458030719, 'learning_rate': 2.6963744222627326e-06, 'epoch': 0.66} 66%|██████▋ | 22721/34278 [24:58:04<12:06:18, 3.77s/it] 66%|██████▋ | 22722/34278 [24:58:08<12:10:58, 3.80s/it] 
{'loss': 0.112, 'grad_norm': 0.7610063806664835, 'learning_rate': 2.6959551269735553e-06, 'epoch': 0.66} 66%|██████▋ | 22722/34278 [24:58:08<12:10:58, 3.80s/it] 66%|██████▋ | 22723/34278 [24:58:12<12:17:18, 3.83s/it] {'loss': 0.1512, 'grad_norm': 0.8125572547701, 'learning_rate': 2.6955358522543385e-06, 'epoch': 0.66} 66%|██████▋ | 22723/34278 [24:58:12<12:17:18, 3.83s/it] 66%|██████▋ | 22724/34278 [24:58:19<14:42:17, 4.58s/it] {'loss': 0.1122, 'grad_norm': 0.8369608260762295, 'learning_rate': 2.6951165981088303e-06, 'epoch': 0.66} 66%|██████▋ | 22724/34278 [24:58:19<14:42:17, 4.58s/it] 66%|██████▋ | 22725/34278 [24:58:22<13:09:49, 4.10s/it] {'loss': 0.1174, 'grad_norm': 0.7809482849014473, 'learning_rate': 2.6946973645407674e-06, 'epoch': 0.66} 66%|██████▋ | 22725/34278 [24:58:22<13:09:49, 4.10s/it] 66%|██████▋ | 22726/34278 [24:58:25<12:59:01, 4.05s/it] {'loss': 0.118, 'grad_norm': 0.8024793983428733, 'learning_rate': 2.6942781515538996e-06, 'epoch': 0.66} 66%|██████▋ | 22726/34278 [24:58:25<12:59:01, 4.05s/it] 66%|██████▋ | 22727/34278 [24:58:29<12:36:31, 3.93s/it] {'loss': 0.109, 'grad_norm': 0.7561825368964618, 'learning_rate': 2.6938589591519624e-06, 'epoch': 0.66} 66%|██████▋ | 22727/34278 [24:58:29<12:36:31, 3.93s/it] 66%|██████▋ | 22728/34278 [24:58:33<12:28:14, 3.89s/it] {'loss': 0.1175, 'grad_norm': 0.8774031918796692, 'learning_rate': 2.693439787338705e-06, 'epoch': 0.66} 66%|██████▋ | 22728/34278 [24:58:33<12:28:14, 3.89s/it] 66%|██████▋ | 22729/34278 [24:58:37<12:15:37, 3.82s/it] {'loss': 0.1382, 'grad_norm': 0.7858477310656956, 'learning_rate': 2.693020636117867e-06, 'epoch': 0.66} 66%|██████▋ | 22729/34278 [24:58:37<12:15:37, 3.82s/it] 66%|██████▋ | 22730/34278 [24:58:40<11:32:24, 3.60s/it] {'loss': 0.1137, 'grad_norm': 0.8101829820640726, 'learning_rate': 2.6926015054931876e-06, 'epoch': 0.66} 66%|██████▋ | 22730/34278 [24:58:40<11:32:24, 3.60s/it] 66%|██████▋ | 22731/34278 [24:58:46<13:47:44, 4.30s/it] {'loss': 0.1251, 'grad_norm': 
0.949393329970762, 'learning_rate': 2.6921823954684105e-06, 'epoch': 0.66} 66%|██████▋ | 22731/34278 [24:58:46<13:47:44, 4.30s/it] 66%|██████▋ | 22732/34278 [24:58:52<15:26:15, 4.81s/it] {'loss': 0.0978, 'grad_norm': 0.7565575831085826, 'learning_rate': 2.691763306047281e-06, 'epoch': 0.66} 66%|██████▋ | 22732/34278 [24:58:52<15:26:15, 4.81s/it] 66%|██████▋ | 22733/34278 [24:58:55<13:46:28, 4.30s/it] {'loss': 0.1201, 'grad_norm': 0.8482425123804851, 'learning_rate': 2.6913442372335353e-06, 'epoch': 0.66} 66%|██████▋ | 22733/34278 [24:58:55<13:46:28, 4.30s/it] 66%|██████▋ | 22734/34278 [24:58:58<12:25:42, 3.88s/it] {'loss': 0.1484, 'grad_norm': 1.0875602580466324, 'learning_rate': 2.6909251890309185e-06, 'epoch': 0.66} 66%|██████▋ | 22734/34278 [24:58:58<12:25:42, 3.88s/it] 66%|██████▋ | 22735/34278 [24:59:03<13:58:30, 4.36s/it] {'loss': 0.116, 'grad_norm': 0.8051571625003154, 'learning_rate': 2.6905061614431716e-06, 'epoch': 0.66} 66%|██████▋ | 22735/34278 [24:59:03<13:58:30, 4.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 66%|██████▋ | 22736/34278 [24:59:06<12:59:07, 4.05s/it] {'loss': 0.1321, 'grad_norm': 0.897759965070597, 'learning_rate': 2.6900871544740315e-06, 'epoch': 0.66} 66%|██████▋ | 22736/34278 [24:59:06<12:59:07, 4.05s/it] 66%|██████▋ | 22737/34278 [24:59:10<12:17:34, 3.83s/it] {'loss': 0.1246, 'grad_norm': 0.9536557279590591, 'learning_rate': 2.6896681681272417e-06, 'epoch': 0.66} 66%|██████▋ | 22737/34278 [24:59:10<12:17:34, 3.83s/it] 66%|██████▋ | 22738/34278 [24:59:13<11:45:39, 3.67s/it] {'loss': 0.1268, 'grad_norm': 0.7645782420215563, 'learning_rate': 2.6892492024065453e-06, 'epoch': 0.66} 66%|██████▋ | 22738/34278 [24:59:13<11:45:39, 3.67s/it] 66%|██████▋ | 22739/34278 [24:59:16<10:54:49, 3.40s/it] {'loss': 0.1105, 'grad_norm': 0.7491515181046098, 'learning_rate': 2.688830257315681e-06, 'epoch': 0.66} 66%|██████▋ | 22739/34278 [24:59:16<10:54:49, 3.40s/it] 66%|██████▋ | 22740/34278 [24:59:19<10:36:42, 3.31s/it] {'loss': 0.1299, 'grad_norm': 0.921236417838504, 'learning_rate': 2.688411332858386e-06, 'epoch': 0.66} 66%|██████▋ | 22740/34278 [24:59:19<10:36:42, 3.31s/it] 66%|██████▋ | 22741/34278 [24:59:25<13:19:25, 4.16s/it] {'loss': 0.1477, 'grad_norm': 0.9820780602449843, 'learning_rate': 2.687992429038404e-06, 'epoch': 0.66} 66%|██████▋ | 22741/34278 [24:59:25<13:19:25, 4.16s/it] 66%|██████▋ | 22742/34278 [24:59:29<13:14:33, 4.13s/it] {'loss': 0.1301, 'grad_norm': 0.9562363784714465, 'learning_rate': 2.687573545859475e-06, 'epoch': 0.66} 66%|██████▋ | 22742/34278 [24:59:29<13:14:33, 4.13s/it] 66%|██████▋ | 22743/34278 [24:59:32<11:56:46, 3.73s/it] {'loss': 0.1111, 'grad_norm': 0.7733876220060898, 'learning_rate': 2.6871546833253347e-06, 'epoch': 0.66} 66%|██████▋ | 22743/34278 [24:59:32<11:56:46, 3.73s/it] 66%|██████▋ | 22744/34278 [24:59:38<13:54:04, 4.34s/it] {'loss': 0.12, 'grad_norm': 0.7413071223379776, 'learning_rate': 2.686735841439725e-06, 'epoch': 0.66} 66%|██████▋ | 22744/34278 [24:59:38<13:54:04, 
4.34s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 66%|██████▋ | 22745/34278 [24:59:43<14:44:31, 4.60s/it] {'loss': 0.115, 'grad_norm': 1.0183380551511207, 'learning_rate': 2.6863170202063884e-06, 'epoch': 0.66} 66%|██████▋ | 22745/34278 [24:59:43<14:44:31, 4.60s/it] 66%|██████▋ | 22746/34278 [24:59:46<13:06:54, 4.09s/it] {'loss': 0.098, 'grad_norm': 0.786501491655702, 'learning_rate': 2.68589821962906e-06, 'epoch': 0.66} 66%|██████▋ | 22746/34278 [24:59:46<13:06:54, 4.09s/it] 66%|██████▋ | 22747/34278 [24:59:49<12:07:48, 3.79s/it] {'loss': 0.1419, 'grad_norm': 0.8241351984514618, 'learning_rate': 2.6854794397114785e-06, 'epoch': 0.66} 66%|██████▋ | 22747/34278 [24:59:49<12:07:48, 3.79s/it] 66%|██████▋ | 22748/34278 [24:59:52<11:43:03, 3.66s/it] {'loss': 0.1212, 'grad_norm': 0.7353921423165194, 'learning_rate': 2.685060680457386e-06, 'epoch': 0.66} 66%|██████▋ | 22748/34278 [24:59:52<11:43:03, 3.66s/it] 66%|██████▋ | 22749/34278 [24:59:56<11:21:29, 3.55s/it] {'loss': 0.1134, 'grad_norm': 1.0368551549822878, 'learning_rate': 2.684641941870517e-06, 'epoch': 0.66} 66%|██████▋ | 22749/34278 [24:59:56<11:21:29, 3.55s/it] 66%|██████▋ | 22750/34278 [24:59:58<10:38:31, 3.32s/it] {'loss': 0.1149, 'grad_norm': 0.767425619343563, 'learning_rate': 2.6842232239546118e-06, 'epoch': 0.66} 66%|██████▋ | 22750/34278 [24:59:58<10:38:31, 3.32s/it] 66%|██████▋ | 22751/34278 [25:00:05<13:35:16, 4.24s/it] {'loss': 0.1067, 'grad_norm': 0.6842819277001925, 'learning_rate': 2.6838045267134115e-06, 'epoch': 0.66} 66%|██████▋ | 22751/34278 [25:00:05<13:35:16, 4.24s/it] 66%|██████▋ | 22752/34278 [25:00:08<12:13:47, 3.82s/it] {'loss': 0.1198, 'grad_norm': 0.7447247423431499, 'learning_rate': 2.683385850150651e-06, 'epoch': 0.66} 66%|██████▋ | 22752/34278 [25:00:08<12:13:47, 3.82s/it] 66%|██████▋ | 22753/34278 
[25:00:11<11:27:59, 3.58s/it] {'loss': 0.1349, 'grad_norm': 1.1057954706132223, 'learning_rate': 2.6829671942700665e-06, 'epoch': 0.66} 66%|██████▋ | 22753/34278 [25:00:11<11:27:59, 3.58s/it] 66%|██████▋ | 22754/34278 [25:00:15<12:16:46, 3.84s/it] {'loss': 0.1342, 'grad_norm': 0.7679389187018296, 'learning_rate': 2.6825485590754e-06, 'epoch': 0.66} 66%|██████▋ | 22754/34278 [25:00:15<12:16:46, 3.84s/it] 66%|██████▋ | 22755/34278 [25:00:18<11:47:09, 3.68s/it] {'loss': 0.1242, 'grad_norm': 0.6422875569214838, 'learning_rate': 2.682129944570385e-06, 'epoch': 0.66} 66%|██████▋ | 22755/34278 [25:00:18<11:47:09, 3.68s/it] 66%|██████▋ | 22756/34278 [25:00:22<11:49:22, 3.69s/it] {'loss': 0.1137, 'grad_norm': 0.787111958851915, 'learning_rate': 2.6817113507587623e-06, 'epoch': 0.66} 66%|██████▋ | 22756/34278 [25:00:22<11:49:22, 3.69s/it] 66%|██████▋ | 22757/34278 [25:00:25<11:29:39, 3.59s/it] {'loss': 0.144, 'grad_norm': 0.8947541133808778, 'learning_rate': 2.6812927776442647e-06, 'epoch': 0.66} 66%|██████▋ | 22757/34278 [25:00:25<11:29:39, 3.59s/it] 66%|██████▋ | 22758/34278 [25:00:29<11:59:07, 3.75s/it] {'loss': 0.1208, 'grad_norm': 0.7308854387359449, 'learning_rate': 2.680874225230634e-06, 'epoch': 0.66} 66%|██████▋ | 22758/34278 [25:00:30<11:59:07, 3.75s/it] 66%|██████▋ | 22759/34278 [25:00:32<11:13:24, 3.51s/it] {'loss': 0.1278, 'grad_norm': 0.8570165983911302, 'learning_rate': 2.680455693521605e-06, 'epoch': 0.66} 66%|██████▋ | 22759/34278 [25:00:32<11:13:24, 3.51s/it] 66%|██████▋ | 22760/34278 [25:00:36<10:55:38, 3.42s/it] {'loss': 0.1177, 'grad_norm': 0.6814366635582152, 'learning_rate': 2.6800371825209114e-06, 'epoch': 0.66} 66%|██████▋ | 22760/34278 [25:00:36<10:55:38, 3.42s/it] 66%|██████▋ | 22761/34278 [25:00:40<11:51:14, 3.71s/it] {'loss': 0.112, 'grad_norm': 0.757997692071325, 'learning_rate': 2.6796186922322926e-06, 'epoch': 0.66} 66%|██████▋ | 22761/34278 [25:00:40<11:51:14, 3.71s/it] 66%|██████▋ | 22762/34278 [25:00:43<11:18:33, 3.54s/it] {'loss': 0.1344, 
'grad_norm': 0.9121638917545138, 'learning_rate': 2.679200222659486e-06, 'epoch': 0.66} 66%|██████▋ | 22762/34278 [25:00:43<11:18:33, 3.54s/it] 66%|██████▋ | 22763/34278 [25:00:47<11:09:47, 3.49s/it] {'loss': 0.1396, 'grad_norm': 0.7417214753064422, 'learning_rate': 2.6787817738062233e-06, 'epoch': 0.66} 66%|██████▋ | 22763/34278 [25:00:47<11:09:47, 3.49s/it] 66%|██████▋ | 22764/34278 [25:00:50<11:09:18, 3.49s/it] {'loss': 0.1063, 'grad_norm': 0.8072939739497127, 'learning_rate': 2.678363345676245e-06, 'epoch': 0.66} 66%|██████▋ | 22764/34278 [25:00:50<11:09:18, 3.49s/it] 66%|██████▋ | 22765/34278 [25:00:56<13:04:17, 4.09s/it] {'loss': 0.0934, 'grad_norm': 0.6786321051846522, 'learning_rate': 2.6779449382732846e-06, 'epoch': 0.66} 66%|██████▋ | 22765/34278 [25:00:56<13:04:17, 4.09s/it] 66%|██████▋ | 22766/34278 [25:00:59<12:11:03, 3.81s/it] {'loss': 0.1086, 'grad_norm': 0.7770781535074133, 'learning_rate': 2.677526551601076e-06, 'epoch': 0.66} 66%|██████▋ | 22766/34278 [25:00:59<12:11:03, 3.81s/it] 66%|██████▋ | 22767/34278 [25:01:02<11:39:01, 3.64s/it] {'loss': 0.1175, 'grad_norm': 0.6871878643258786, 'learning_rate': 2.6771081856633552e-06, 'epoch': 0.66} 66%|██████▋ | 22767/34278 [25:01:02<11:39:01, 3.64s/it] 66%|██████▋ | 22768/34278 [25:01:06<12:28:15, 3.90s/it] {'loss': 0.1386, 'grad_norm': 0.7890643089896222, 'learning_rate': 2.6766898404638604e-06, 'epoch': 0.66} 66%|██████▋ | 22768/34278 [25:01:06<12:28:15, 3.90s/it] 66%|██████▋ | 22769/34278 [25:01:09<11:32:42, 3.61s/it] {'loss': 0.1233, 'grad_norm': 0.7658669956462429, 'learning_rate': 2.6762715160063236e-06, 'epoch': 0.66} 66%|██████▋ | 22769/34278 [25:01:09<11:32:42, 3.61s/it] 66%|██████▋ | 22770/34278 [25:01:13<11:12:17, 3.51s/it] {'loss': 0.1158, 'grad_norm': 0.9075248415891648, 'learning_rate': 2.675853212294478e-06, 'epoch': 0.66} 66%|██████▋ | 22770/34278 [25:01:13<11:12:17, 3.51s/it] 66%|██████▋ | 22771/34278 [25:01:16<11:03:59, 3.46s/it] {'loss': 0.105, 'grad_norm': 0.6410609193505, 
[training progress log, deduplicated: steps 22771-22953 of 34278 (66%-67%), epoch 0.66 -> 0.67]
loss: fluctuating ~0.08-0.17 (typical ~0.12); grad_norm: ~0.64-1.35; learning_rate: decaying 2.6754349e-06 -> 2.5996578e-06; throughput: ~3.1-5.1 s/it; elapsed: 25:01:16 -> 25:12:47, ETA ~10-16 h
[at step 22938] /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
4.03s/it] 67%|██████▋ | 22954/34278 [25:12:50<11:27:56, 3.65s/it] {'loss': 0.1056, 'grad_norm': 0.6532650285431103, 'learning_rate': 2.5992423528218404e-06, 'epoch': 0.67} 67%|██████▋ | 22954/34278 [25:12:50<11:27:56, 3.65s/it] 67%|██████▋ | 22955/34278 [25:12:53<10:47:07, 3.43s/it] {'loss': 0.1158, 'grad_norm': 1.0279845866111847, 'learning_rate': 2.5988279511184934e-06, 'epoch': 0.67} 67%|██████▋ | 22955/34278 [25:12:53<10:47:07, 3.43s/it] 67%|██████▋ | 22956/34278 [25:12:56<10:42:10, 3.40s/it] {'loss': 0.0986, 'grad_norm': 0.8731483651185638, 'learning_rate': 2.598413570852237e-06, 'epoch': 0.67} 67%|██████▋ | 22956/34278 [25:12:56<10:42:10, 3.40s/it] 67%|██████▋ | 22957/34278 [25:13:00<10:44:50, 3.42s/it] {'loss': 0.1212, 'grad_norm': 0.9696134873233425, 'learning_rate': 2.597999212026769e-06, 'epoch': 0.67} 67%|██████▋ | 22957/34278 [25:13:00<10:44:50, 3.42s/it] 67%|██████▋ | 22958/34278 [25:13:06<13:09:01, 4.18s/it] {'loss': 0.1624, 'grad_norm': 1.1347937170465638, 'learning_rate': 2.597584874645791e-06, 'epoch': 0.67} 67%|██████▋ | 22958/34278 [25:13:06<13:09:01, 4.18s/it] 67%|██████▋ | 22959/34278 [25:13:09<12:07:05, 3.85s/it] {'loss': 0.1261, 'grad_norm': 1.0999434347085677, 'learning_rate': 2.5971705587130036e-06, 'epoch': 0.67} 67%|██████▋ | 22959/34278 [25:13:09<12:07:05, 3.85s/it] 67%|██████▋ | 22960/34278 [25:13:12<11:13:32, 3.57s/it] {'loss': 0.1308, 'grad_norm': 0.9410433692833908, 'learning_rate': 2.5967562642321014e-06, 'epoch': 0.67} 67%|██████▋ | 22960/34278 [25:13:12<11:13:32, 3.57s/it] 67%|██████▋ | 22961/34278 [25:13:15<11:04:45, 3.52s/it] {'loss': 0.1168, 'grad_norm': 1.110225390909667, 'learning_rate': 2.596341991206788e-06, 'epoch': 0.67} 67%|██████▋ | 22961/34278 [25:13:15<11:04:45, 3.52s/it] 67%|██████▋ | 22962/34278 [25:13:19<10:51:35, 3.45s/it] {'loss': 0.1421, 'grad_norm': 1.1251422889019116, 'learning_rate': 2.5959277396407588e-06, 'epoch': 0.67} 67%|██████▋ | 22962/34278 [25:13:19<10:51:35, 3.45s/it] 67%|██████▋ | 22963/34278 
[25:13:22<10:51:23, 3.45s/it] {'loss': 0.1162, 'grad_norm': 0.9735252353434014, 'learning_rate': 2.595513509537712e-06, 'epoch': 0.67} 67%|██████▋ | 22963/34278 [25:13:22<10:51:23, 3.45s/it] 67%|██████▋ | 22964/34278 [25:13:26<11:44:04, 3.73s/it] {'loss': 0.0935, 'grad_norm': 0.7471502557532341, 'learning_rate': 2.595099300901346e-06, 'epoch': 0.67} 67%|██████▋ | 22964/34278 [25:13:26<11:44:04, 3.73s/it] 67%|██████▋ | 22965/34278 [25:13:32<13:51:56, 4.41s/it] {'loss': 0.1179, 'grad_norm': 0.7418263137633314, 'learning_rate': 2.5946851137353614e-06, 'epoch': 0.67} 67%|██████▋ | 22965/34278 [25:13:32<13:51:56, 4.41s/it] 67%|██████▋ | 22966/34278 [25:13:37<14:02:33, 4.47s/it] {'loss': 0.1106, 'grad_norm': 1.192634862037079, 'learning_rate': 2.594270948043454e-06, 'epoch': 0.67} 67%|██████▋ | 22966/34278 [25:13:37<14:02:33, 4.47s/it] 67%|██████▋ | 22967/34278 [25:13:40<13:09:33, 4.19s/it] {'loss': 0.1288, 'grad_norm': 0.7710972242554188, 'learning_rate': 2.5938568038293193e-06, 'epoch': 0.67} 67%|██████▋ | 22967/34278 [25:13:41<13:09:33, 4.19s/it] 67%|██████▋ | 22968/34278 [25:13:45<13:17:43, 4.23s/it] {'loss': 0.117, 'grad_norm': 0.8604964494284084, 'learning_rate': 2.5934426810966585e-06, 'epoch': 0.67} 67%|██████▋ | 22968/34278 [25:13:45<13:17:43, 4.23s/it] 67%|██████▋ | 22969/34278 [25:13:50<13:45:30, 4.38s/it] {'loss': 0.1361, 'grad_norm': 0.9719138250761021, 'learning_rate': 2.593028579849167e-06, 'epoch': 0.67} 67%|██████▋ | 22969/34278 [25:13:50<13:45:30, 4.38s/it] 67%|██████▋ | 22970/34278 [25:13:53<12:24:45, 3.95s/it] {'loss': 0.1196, 'grad_norm': 0.9418603652946018, 'learning_rate': 2.5926145000905402e-06, 'epoch': 0.67} 67%|██████▋ | 22970/34278 [25:13:53<12:24:45, 3.95s/it] 67%|██████▋ | 22971/34278 [25:13:55<11:24:25, 3.63s/it] {'loss': 0.1009, 'grad_norm': 0.8049131281734525, 'learning_rate': 2.5922004418244758e-06, 'epoch': 0.67} 67%|██████▋ | 22971/34278 [25:13:55<11:24:25, 3.63s/it] 67%|██████▋ | 22972/34278 [25:13:59<11:07:34, 3.54s/it] {'loss': 
0.1411, 'grad_norm': 0.9613558638504005, 'learning_rate': 2.591786405054673e-06, 'epoch': 0.67} 67%|██████▋ | 22972/34278 [25:13:59<11:07:34, 3.54s/it] 67%|██████▋ | 22973/34278 [25:14:04<12:56:14, 4.12s/it] {'loss': 0.117, 'grad_norm': 1.01183092493497, 'learning_rate': 2.5913723897848264e-06, 'epoch': 0.67} 67%|██████▋ | 22973/34278 [25:14:04<12:56:14, 4.12s/it] 67%|██████▋ | 22974/34278 [25:14:07<11:54:40, 3.79s/it] {'loss': 0.0919, 'grad_norm': 0.9235575685507096, 'learning_rate': 2.5909583960186306e-06, 'epoch': 0.67} 67%|██████▋ | 22974/34278 [25:14:07<11:54:40, 3.79s/it] 67%|██████▋ | 22975/34278 [25:14:11<11:27:54, 3.65s/it] {'loss': 0.1164, 'grad_norm': 0.8172537684550482, 'learning_rate': 2.590544423759785e-06, 'epoch': 0.67} 67%|██████▋ | 22975/34278 [25:14:11<11:27:54, 3.65s/it] 67%|██████▋ | 22976/34278 [25:14:14<11:33:47, 3.68s/it] {'loss': 0.121, 'grad_norm': 0.7142541635523699, 'learning_rate': 2.5901304730119816e-06, 'epoch': 0.67} 67%|██████▋ | 22976/34278 [25:14:14<11:33:47, 3.68s/it] 67%|██████▋ | 22977/34278 [25:14:17<10:51:49, 3.46s/it] {'loss': 0.1393, 'grad_norm': 0.905901228042321, 'learning_rate': 2.5897165437789175e-06, 'epoch': 0.67} 67%|██████▋ | 22977/34278 [25:14:17<10:51:49, 3.46s/it] 67%|██████▋ | 22978/34278 [25:14:20<10:11:05, 3.24s/it] {'loss': 0.1244, 'grad_norm': 0.8181608904803088, 'learning_rate': 2.5893026360642912e-06, 'epoch': 0.67} 67%|██████▋ | 22978/34278 [25:14:20<10:11:05, 3.24s/it] 67%|██████▋ | 22979/34278 [25:14:24<10:41:59, 3.41s/it] {'loss': 0.1453, 'grad_norm': 0.9139770807479383, 'learning_rate': 2.588888749871795e-06, 'epoch': 0.67} 67%|██████▋ | 22979/34278 [25:14:24<10:41:59, 3.41s/it] 67%|██████▋ | 22980/34278 [25:14:27<10:14:51, 3.27s/it] {'loss': 0.1265, 'grad_norm': 0.782312618193292, 'learning_rate': 2.5884748852051236e-06, 'epoch': 0.67} 67%|██████▋ | 22980/34278 [25:14:27<10:14:51, 3.27s/it] 67%|██████▋ | 22981/34278 [25:14:30<9:56:35, 3.17s/it] {'loss': 0.1319, 'grad_norm': 0.6666164999326576, 
'learning_rate': 2.588061042067974e-06, 'epoch': 0.67} 67%|██████▋ | 22981/34278 [25:14:30<9:56:35, 3.17s/it] 67%|██████▋ | 22982/34278 [25:14:33<9:40:37, 3.08s/it] {'loss': 0.1064, 'grad_norm': 0.7876602526271521, 'learning_rate': 2.5876472204640375e-06, 'epoch': 0.67} 67%|██████▋ | 22982/34278 [25:14:33<9:40:37, 3.08s/it] 67%|██████▋ | 22983/34278 [25:14:38<11:40:40, 3.72s/it] {'loss': 0.1264, 'grad_norm': 0.8579907867622241, 'learning_rate': 2.587233420397013e-06, 'epoch': 0.67} 67%|██████▋ | 22983/34278 [25:14:38<11:40:40, 3.72s/it] 67%|██████▋ | 22984/34278 [25:14:44<13:53:13, 4.43s/it] {'loss': 0.1386, 'grad_norm': 0.9408215174558893, 'learning_rate': 2.5868196418705906e-06, 'epoch': 0.67} 67%|██████▋ | 22984/34278 [25:14:44<13:53:13, 4.43s/it] 67%|██████▋ | 22985/34278 [25:14:47<12:25:58, 3.96s/it] {'loss': 0.1271, 'grad_norm': 0.8042278480049216, 'learning_rate': 2.5864058848884678e-06, 'epoch': 0.67} 67%|██████▋ | 22985/34278 [25:14:47<12:25:58, 3.96s/it] 67%|██████▋ | 22986/34278 [25:14:52<13:56:37, 4.45s/it] {'loss': 0.1257, 'grad_norm': 0.8255218498327359, 'learning_rate': 2.585992149454337e-06, 'epoch': 0.67} 67%|██████▋ | 22986/34278 [25:14:52<13:56:37, 4.45s/it] 67%|██████▋ | 22987/34278 [25:14:56<13:26:46, 4.29s/it] {'loss': 0.1128, 'grad_norm': 0.8001131882840679, 'learning_rate': 2.585578435571891e-06, 'epoch': 0.67} 67%|██████▋ | 22987/34278 [25:14:56<13:26:46, 4.29s/it] 67%|██████▋ | 22988/34278 [25:14:59<12:24:13, 3.96s/it] {'loss': 0.1418, 'grad_norm': 0.7617123776116494, 'learning_rate': 2.5851647432448242e-06, 'epoch': 0.67} 67%|██████▋ | 22988/34278 [25:14:59<12:24:13, 3.96s/it] 67%|██████▋ | 22989/34278 [25:15:03<11:39:00, 3.72s/it] {'loss': 0.1189, 'grad_norm': 0.9987195121392268, 'learning_rate': 2.5847510724768315e-06, 'epoch': 0.67} 67%|██████▋ | 22989/34278 [25:15:03<11:39:00, 3.72s/it] 67%|██████▋ | 22990/34278 [25:15:07<12:03:54, 3.85s/it] {'loss': 0.1251, 'grad_norm': 0.8461177176633972, 'learning_rate': 2.5843374232716035e-06, 
'epoch': 0.67} 67%|██████▋ | 22990/34278 [25:15:07<12:03:54, 3.85s/it] 67%|██████▋ | 22991/34278 [25:15:10<11:22:45, 3.63s/it] {'loss': 0.1353, 'grad_norm': 0.798303148708433, 'learning_rate': 2.5839237956328356e-06, 'epoch': 0.67} 67%|██████▋ | 22991/34278 [25:15:10<11:22:45, 3.63s/it] 67%|██████▋ | 22992/34278 [25:15:13<11:10:23, 3.56s/it] {'loss': 0.1175, 'grad_norm': 0.8383877968105283, 'learning_rate': 2.583510189564219e-06, 'epoch': 0.67} 67%|██████▋ | 22992/34278 [25:15:13<11:10:23, 3.56s/it] 67%|██████▋ | 22993/34278 [25:15:19<13:14:16, 4.22s/it] {'loss': 0.1246, 'grad_norm': 0.9793682524815928, 'learning_rate': 2.583096605069445e-06, 'epoch': 0.67} 67%|██████▋ | 22993/34278 [25:15:19<13:14:16, 4.22s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd62f527d30>
Failed to fetch sample 3423929. Exception: cannot identify image file <_io.BytesIO object at 0x7fd62f527d30>
67%|██████▋ | 22994/34278 [25:15:22<12:04:49, 3.85s/it] {'loss': 0.1317, 'grad_norm': 0.9327028157121722, 'learning_rate': 2.5826830421522075e-06, 'epoch': 0.67} 67%|██████▋ | 22994/34278 [25:15:22<12:04:49, 3.85s/it] 67%|██████▋ | 22995/34278 [25:15:25<11:16:14, 3.60s/it] {'loss': 0.1238, 'grad_norm': 0.8195649016502793, 'learning_rate': 2.5822695008162015e-06, 'epoch': 0.67} 67%|██████▋ | 22995/34278 [25:15:25<11:16:14, 3.60s/it] 67%|██████▋ | 22996/34278 [25:15:28<10:46:55, 3.44s/it] {'loss': 0.1332, 'grad_norm': 1.2466977300984077, 'learning_rate': 2.581855981065115e-06, 'epoch': 0.67} 67%|██████▋ | 22996/34278 [25:15:28<10:46:55, 3.44s/it] 67%|██████▋ | 22997/34278 [25:15:32<11:20:57, 3.62s/it] {'loss': 0.1276, 'grad_norm': 0.9484199779616992, 'learning_rate': 2.5814424829026395e-06, 'epoch': 0.67} 67%|██████▋ | 22997/34278 [25:15:32<11:20:57, 3.62s/it] 67%|██████▋ | 22998/34278 [25:15:37<12:37:43, 4.03s/it] {'loss': 0.1365, 'grad_norm': 0.9945373859264935, 'learning_rate': 2.5810290063324705e-06, 'epoch': 0.67} 67%|██████▋ | 22998/34278 [25:15:37<12:37:43, 4.03s/it] 67%|██████▋ | 22999/34278 [25:15:40<11:22:20, 3.63s/it] {'loss': 0.1144, 'grad_norm': 0.8312290435145775, 'learning_rate': 2.5806155513582963e-06, 'epoch': 0.67} 67%|██████▋ | 22999/34278 [25:15:40<11:22:20, 3.63s/it] 67%|██████▋ | 23000/34278 [25:15:43<10:46:01, 3.44s/it] {'loss': 0.1289, 'grad_norm': 0.912826024619701, 'learning_rate': 2.580202117983808e-06, 'epoch': 0.67} 67%|██████▋ | 23000/34278 [25:15:43<10:46:01, 3.44s/it] 67%|██████▋ | 23001/34278 [25:15:46<10:17:08, 3.28s/it] {'loss': 0.1068, 'grad_norm': 0.8686912167702016, 'learning_rate': 2.579788706212697e-06, 'epoch': 0.67} 67%|██████▋ | 23001/34278 [25:15:46<10:17:08, 3.28s/it] 67%|██████▋ | 23002/34278 [25:15:49<9:57:39, 3.18s/it] {'loss': 0.1316, 'grad_norm':
1.3427708204630477, 'learning_rate': 2.5793753160486566e-06, 'epoch': 0.67} 67%|██████▋ | 23002/34278 [25:15:49<9:57:39, 3.18s/it] 67%|██████▋ | 23003/34278 [25:15:51<9:40:41, 3.09s/it] {'loss': 0.117, 'grad_norm': 0.8592042705285448, 'learning_rate': 2.5789619474953753e-06, 'epoch': 0.67} 67%|██████▋ | 23003/34278 [25:15:52<9:40:41, 3.09s/it] 67%|██████▋ | 23004/34278 [25:15:55<10:00:06, 3.19s/it] {'loss': 0.1173, 'grad_norm': 1.082975401321015, 'learning_rate': 2.578548600556542e-06, 'epoch': 0.67} 67%|██████▋ | 23004/34278 [25:15:55<10:00:06, 3.19s/it] 67%|██████▋ | 23005/34278 [25:15:58<10:19:32, 3.30s/it] {'loss': 0.1339, 'grad_norm': 1.2965524748252881, 'learning_rate': 2.5781352752358492e-06, 'epoch': 0.67} 67%|██████▋ | 23005/34278 [25:15:59<10:19:32, 3.30s/it] 67%|██████▋ | 23006/34278 [25:16:03<11:49:19, 3.78s/it] {'loss': 0.1469, 'grad_norm': 1.1511374378045174, 'learning_rate': 2.5777219715369876e-06, 'epoch': 0.67} 67%|██████▋ | 23006/34278 [25:16:03<11:49:19, 3.78s/it] 67%|██████▋ | 23007/34278 [25:16:09<13:56:43, 4.45s/it] {'loss': 0.1263, 'grad_norm': 1.0427632821341766, 'learning_rate': 2.5773086894636446e-06, 'epoch': 0.67} 67%|██████▋ | 23007/34278 [25:16:09<13:56:43, 4.45s/it] 67%|██████▋ | 23008/34278 [25:16:12<12:36:58, 4.03s/it] {'loss': 0.1361, 'grad_norm': 0.929663944711284, 'learning_rate': 2.5768954290195136e-06, 'epoch': 0.67} 67%|██████▋ | 23008/34278 [25:16:12<12:36:58, 4.03s/it] 67%|██████▋ | 23009/34278 [25:16:16<11:57:13, 3.82s/it] {'loss': 0.1264, 'grad_norm': 1.1165350096418751, 'learning_rate': 2.5764821902082814e-06, 'epoch': 0.67} 67%|██████▋ | 23009/34278 [25:16:16<11:57:13, 3.82s/it] 67%|██████▋ | 23010/34278 [25:16:19<11:14:25, 3.59s/it] {'loss': 0.1267, 'grad_norm': 0.9749342362429257, 'learning_rate': 2.576068973033635e-06, 'epoch': 0.67} 67%|██████▋ | 23010/34278 [25:16:19<11:14:25, 3.59s/it] 67%|██████▋ | 23011/34278 [25:16:23<11:43:12, 3.74s/it] {'loss': 0.1284, 'grad_norm': 0.8730546817614292, 'learning_rate': 
2.5756557774992676e-06, 'epoch': 0.67} 67%|██████▋ | 23011/34278 [25:16:23<11:43:12, 3.74s/it] 67%|██████▋ | 23012/34278 [25:16:26<11:28:58, 3.67s/it] {'loss': 0.1297, 'grad_norm': 0.9547620352732499, 'learning_rate': 2.575242603608867e-06, 'epoch': 0.67} 67%|██████▋ | 23012/34278 [25:16:26<11:28:58, 3.67s/it] 67%|██████▋ | 23013/34278 [25:16:30<11:10:20, 3.57s/it] {'loss': 0.1372, 'grad_norm': 1.1379441054263437, 'learning_rate': 2.5748294513661233e-06, 'epoch': 0.67} 67%|██████▋ | 23013/34278 [25:16:30<11:10:20, 3.57s/it] 67%|██████▋ | 23014/34278 [25:16:33<10:38:33, 3.40s/it] {'loss': 0.1269, 'grad_norm': 0.9335225321098604, 'learning_rate': 2.5744163207747202e-06, 'epoch': 0.67} 67%|██████▋ | 23014/34278 [25:16:33<10:38:33, 3.40s/it] 67%|██████▋ | 23015/34278 [25:16:37<11:34:27, 3.70s/it] {'loss': 0.1327, 'grad_norm': 1.6534952123573796, 'learning_rate': 2.574003211838352e-06, 'epoch': 0.67} 67%|██████▋ | 23015/34278 [25:16:37<11:34:27, 3.70s/it] 67%|██████▋ | 23016/34278 [25:16:41<11:28:55, 3.67s/it] {'loss': 0.1117, 'grad_norm': 1.0425208606698102, 'learning_rate': 2.573590124560703e-06, 'epoch': 0.67} 67%|██████▋ | 23016/34278 [25:16:41<11:28:55, 3.67s/it] 67%|██████▋ | 23017/34278 [25:16:44<11:18:31, 3.62s/it] {'loss': 0.0986, 'grad_norm': 0.8787349795954643, 'learning_rate': 2.5731770589454584e-06, 'epoch': 0.67} 67%|██████▋ | 23017/34278 [25:16:44<11:18:31, 3.62s/it] 67%|██████▋ | 23018/34278 [25:16:48<11:41:39, 3.74s/it] {'loss': 0.116, 'grad_norm': 0.9389100598612916, 'learning_rate': 2.572764014996314e-06, 'epoch': 0.67} 67%|██████▋ | 23018/34278 [25:16:48<11:41:39, 3.74s/it] 67%|██████▋ | 23019/34278 [25:16:51<10:56:40, 3.50s/it] {'loss': 0.1278, 'grad_norm': 0.8437284003147788, 'learning_rate': 2.5723509927169526e-06, 'epoch': 0.67} 67%|██████▋ | 23019/34278 [25:16:51<10:56:40, 3.50s/it] 67%|██████▋ | 23020/34278 [25:16:54<10:36:26, 3.39s/it] {'loss': 0.1513, 'grad_norm': 1.1954962997715544, 'learning_rate': 2.5719379921110605e-06, 'epoch': 0.67} 
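The PIL.UnidentifiedImageError traceback above does not abort the run: the "Failed to fetch sample 3423929" line shows the dataset catches decode failures and falls back to another sample. A minimal, self-contained sketch of that skip-and-retry pattern follows; all names are hypothetical (not the actual `aguvis/dataset.py` implementation), and a `None` payload stands in for image bytes that `PIL.Image.open` cannot identify.

```python
class RetryingDataset:
    """Skip-and-retry sketch: on a decode failure, log it and try another sample."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples
        self.max_retries = max_retries

    def __len__(self):
        return len(self.samples)

    def _get_item(self, i):
        value = self.samples[i]
        if value is None:  # stands in for PIL.UnidentifiedImageError on corrupt bytes
            raise ValueError(f"cannot identify image file at index {i}")
        return value

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                # Mirror the log's "Failed to fetch sample ..." message
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self)  # deterministic fallback to the next sample
        raise RuntimeError("too many consecutive corrupt samples")

ds = RetryingDataset(["a", None, "c"])
```

The retry cap matters: without it, a long run of corrupt samples (e.g. a broken Ceph shard) would loop silently instead of surfacing an error.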
67%|██████▋ | 23020/34278 [25:16:54<10:36:26, 3.39s/it] 67%|██████▋ | 23021/34278 [25:16:58<10:44:25, 3.43s/it] {'loss': 0.1262, 'grad_norm': 1.0029319571789357, 'learning_rate': 2.5715250131823284e-06, 'epoch': 0.67} 67%|██████▋ | 23021/34278 [25:16:58<10:44:25, 3.43s/it] 67%|██████▋ | 23022/34278 [25:17:02<11:13:31, 3.59s/it] {'loss': 0.1005, 'grad_norm': 0.6272414702708596, 'learning_rate': 2.5711120559344404e-06, 'epoch': 0.67} 67%|██████▋ | 23022/34278 [25:17:02<11:13:31, 3.59s/it] 67%|██████▋ | 23023/34278 [25:17:05<10:42:07, 3.42s/it] {'loss': 0.1135, 'grad_norm': 1.0685323592073221, 'learning_rate': 2.570699120371083e-06, 'epoch': 0.67} 67%|██████▋ | 23023/34278 [25:17:05<10:42:07, 3.42s/it] 67%|██████▋ | 23024/34278 [25:17:08<10:28:18, 3.35s/it] {'loss': 0.1227, 'grad_norm': 0.848090512308395, 'learning_rate': 2.5702862064959445e-06, 'epoch': 0.67} 67%|██████▋ | 23024/34278 [25:17:08<10:28:18, 3.35s/it] 67%|██████▋ | 23025/34278 [25:17:14<12:45:38, 4.08s/it] {'loss': 0.1136, 'grad_norm': 0.8136112669022519, 'learning_rate': 2.569873314312712e-06, 'epoch': 0.67} 67%|██████▋ | 23025/34278 [25:17:14<12:45:38, 4.08s/it] 67%|██████▋ | 23026/34278 [25:17:20<14:30:32, 4.64s/it] {'loss': 0.1245, 'grad_norm': 0.7260969938467134, 'learning_rate': 2.5694604438250697e-06, 'epoch': 0.67} 67%|██████▋ | 23026/34278 [25:17:20<14:30:32, 4.64s/it] 67%|██████▋ | 23027/34278 [25:17:23<12:48:41, 4.10s/it] {'loss': 0.1424, 'grad_norm': 0.9505974670743264, 'learning_rate': 2.5690475950367035e-06, 'epoch': 0.67} 67%|██████▋ | 23027/34278 [25:17:23<12:48:41, 4.10s/it] 67%|██████▋ | 23028/34278 [25:17:26<11:51:04, 3.79s/it] {'loss': 0.1247, 'grad_norm': 0.8703509019340886, 'learning_rate': 2.5686347679513013e-06, 'epoch': 0.67} 67%|██████▋ | 23028/34278 [25:17:26<11:51:04, 3.79s/it] 67%|██████▋ | 23029/34278 [25:17:30<12:19:15, 3.94s/it] {'loss': 0.1345, 'grad_norm': 0.9355258205340619, 'learning_rate': 2.5682219625725456e-06, 'epoch': 0.67} 67%|██████▋ | 23029/34278 
[25:17:30<12:19:15, 3.94s/it] 67%|██████▋ | 23030/34278 [25:17:34<11:58:45, 3.83s/it] {'loss': 0.1325, 'grad_norm': 1.1087803126689815, 'learning_rate': 2.5678091789041258e-06, 'epoch': 0.67} 67%|██████▋ | 23030/34278 [25:17:34<11:58:45, 3.83s/it] 67%|██████▋ | 23031/34278 [25:17:38<12:07:04, 3.88s/it] {'loss': 0.1189, 'grad_norm': 0.7716792466995319, 'learning_rate': 2.5673964169497233e-06, 'epoch': 0.67} 67%|██████▋ | 23031/34278 [25:17:38<12:07:04, 3.88s/it] 67%|██████▋ | 23032/34278 [25:17:40<11:09:19, 3.57s/it] {'loss': 0.0846, 'grad_norm': 0.9071085343732047, 'learning_rate': 2.5669836767130266e-06, 'epoch': 0.67} 67%|██████▋ | 23032/34278 [25:17:40<11:09:19, 3.57s/it] 67%|██████▋ | 23033/34278 [25:17:44<11:32:25, 3.69s/it] {'loss': 0.1092, 'grad_norm': 0.7954196170911573, 'learning_rate': 2.5665709581977195e-06, 'epoch': 0.67} 67%|██████▋ | 23033/34278 [25:17:44<11:32:25, 3.69s/it] 67%|██████▋ | 23034/34278 [25:17:48<11:19:03, 3.62s/it] {'loss': 0.122, 'grad_norm': 0.7621613749477639, 'learning_rate': 2.566158261407483e-06, 'epoch': 0.67} 67%|██████▋ | 23034/34278 [25:17:48<11:19:03, 3.62s/it] 67%|██████▋ | 23035/34278 [25:17:51<10:28:28, 3.35s/it] {'loss': 0.105, 'grad_norm': 0.7136243545915051, 'learning_rate': 2.565745586346005e-06, 'epoch': 0.67} 67%|██████▋ | 23035/34278 [25:17:51<10:28:28, 3.35s/it] 67%|██████▋ | 23036/34278 [25:17:54<10:30:57, 3.37s/it] {'loss': 0.1144, 'grad_norm': 0.8073084554454861, 'learning_rate': 2.5653329330169713e-06, 'epoch': 0.67} 67%|██████▋ | 23036/34278 [25:17:54<10:30:57, 3.37s/it] 67%|██████▋ | 23037/34278 [25:17:59<12:13:17, 3.91s/it] {'loss': 0.1342, 'grad_norm': 1.0988537169193622, 'learning_rate': 2.564920301424062e-06, 'epoch': 0.67} 67%|██████▋ | 23037/34278 [25:17:59<12:13:17, 3.91s/it] 67%|██████▋ | 23038/34278 [25:18:02<11:35:57, 3.72s/it] {'loss': 0.1218, 'grad_norm': 1.8690672441944067, 'learning_rate': 2.5645076915709644e-06, 'epoch': 0.67} 67%|██████▋ | 23038/34278 [25:18:02<11:35:57, 3.72s/it] 67%|██████▋ 
| 23039/34278 [25:18:06<11:55:48, 3.82s/it] {'loss': 0.1322, 'grad_norm': 0.9031188759708206, 'learning_rate': 2.5640951034613613e-06, 'epoch': 0.67} 67%|██████▋ | 23039/34278 [25:18:07<11:55:48, 3.82s/it] 67%|██████▋ | 23040/34278 [25:18:09<11:09:04, 3.57s/it] {'loss': 0.1274, 'grad_norm': 0.913525689768466, 'learning_rate': 2.5636825370989336e-06, 'epoch': 0.67} 67%|██████▋ | 23040/34278 [25:18:09<11:09:04, 3.57s/it] 67%|██████▋ | 23041/34278 [25:18:13<10:55:41, 3.50s/it] {'loss': 0.1275, 'grad_norm': 0.8978105971050229, 'learning_rate': 2.5632699924873667e-06, 'epoch': 0.67} 67%|██████▋ | 23041/34278 [25:18:13<10:55:41, 3.50s/it] 67%|██████▋ | 23042/34278 [25:18:16<10:48:38, 3.46s/it] {'loss': 0.1061, 'grad_norm': 0.8385083518811937, 'learning_rate': 2.5628574696303452e-06, 'epoch': 0.67} 67%|██████▋ | 23042/34278 [25:18:16<10:48:38, 3.46s/it] 67%|██████▋ | 23043/34278 [25:18:22<12:49:36, 4.11s/it] {'loss': 0.1372, 'grad_norm': 0.7681185201675401, 'learning_rate': 2.562444968531551e-06, 'epoch': 0.67} 67%|██████▋ | 23043/34278 [25:18:22<12:49:36, 4.11s/it] 67%|██████▋ | 23044/34278 [25:18:26<12:30:43, 4.01s/it] {'loss': 0.1218, 'grad_norm': 0.9136671236931833, 'learning_rate': 2.5620324891946636e-06, 'epoch': 0.67} 67%|██████▋ | 23044/34278 [25:18:26<12:30:43, 4.01s/it] 67%|██████▋ | 23045/34278 [25:18:29<11:53:12, 3.81s/it] {'loss': 0.1517, 'grad_norm': 0.9791057087025754, 'learning_rate': 2.5616200316233706e-06, 'epoch': 0.67} 67%|██████▋ | 23045/34278 [25:18:29<11:53:12, 3.81s/it] 67%|██████▋ | 23046/34278 [25:18:34<13:14:25, 4.24s/it] {'loss': 0.1194, 'grad_norm': 0.9491180681649855, 'learning_rate': 2.5612075958213516e-06, 'epoch': 0.67} 67%|██████▋ | 23046/34278 [25:18:34<13:14:25, 4.24s/it] 67%|██████▋ | 23047/34278 [25:18:40<14:58:30, 4.80s/it] {'loss': 0.1548, 'grad_norm': 0.8517097631766665, 'learning_rate': 2.560795181792285e-06, 'epoch': 0.67} 67%|██████▋ | 23047/34278 [25:18:40<14:58:30, 4.80s/it] 67%|██████▋ | 23048/34278 [25:18:44<14:03:24, 
4.51s/it] {'loss': 0.1373, 'grad_norm': 0.816692508907859, 'learning_rate': 2.5603827895398613e-06, 'epoch': 0.67} 67%|██████▋ | 23048/34278 [25:18:44<14:03:24, 4.51s/it] 67%|██████▋ | 23049/34278 [25:18:47<12:41:07, 4.07s/it] {'loss': 0.1269, 'grad_norm': 1.037973862130898, 'learning_rate': 2.5599704190677567e-06, 'epoch': 0.67} 67%|██████▋ | 23049/34278 [25:18:47<12:41:07, 4.07s/it] 67%|██████▋ | 23050/34278 [25:18:51<12:15:59, 3.93s/it] {'loss': 0.1118, 'grad_norm': 0.8269344422895314, 'learning_rate': 2.5595580703796526e-06, 'epoch': 0.67} 67%|██████▋ | 23050/34278 [25:18:51<12:15:59, 3.93s/it] 67%|██████▋ | 23051/34278 [25:18:55<12:06:06, 3.88s/it] {'loss': 0.1201, 'grad_norm': 0.7313766193411093, 'learning_rate': 2.5591457434792332e-06, 'epoch': 0.67} 67%|██████▋ | 23051/34278 [25:18:55<12:06:06, 3.88s/it] 67%|██████▋ | 23052/34278 [25:18:57<11:09:42, 3.58s/it] {'loss': 0.133, 'grad_norm': 1.0086293225702918, 'learning_rate': 2.5587334383701777e-06, 'epoch': 0.67} 67%|██████▋ | 23052/34278 [25:18:57<11:09:42, 3.58s/it] 67%|██████▋ | 23053/34278 [25:19:01<10:50:42, 3.48s/it] {'loss': 0.1119, 'grad_norm': 0.8973611634538674, 'learning_rate': 2.5583211550561654e-06, 'epoch': 0.67} 67%|██████▋ | 23053/34278 [25:19:01<10:50:42, 3.48s/it] 67%|██████▋ | 23054/34278 [25:19:05<11:47:32, 3.78s/it] {'loss': 0.1111, 'grad_norm': 1.1141740133744749, 'learning_rate': 2.5579088935408793e-06, 'epoch': 0.67} 67%|██████▋ | 23054/34278 [25:19:05<11:47:32, 3.78s/it] 67%|██████▋ | 23055/34278 [25:19:08<10:58:00, 3.52s/it] {'loss': 0.117, 'grad_norm': 0.8410195825188868, 'learning_rate': 2.557496653828001e-06, 'epoch': 0.67} 67%|██████▋ | 23055/34278 [25:19:08<10:58:00, 3.52s/it] 67%|██████▋ | 23056/34278 [25:19:11<10:50:27, 3.48s/it] {'loss': 0.1195, 'grad_norm': 0.915494174963484, 'learning_rate': 2.5570844359212098e-06, 'epoch': 0.67} 67%|██████▋ | 23056/34278 [25:19:11<10:50:27, 3.48s/it] 67%|██████▋ | 23057/34278 [25:19:14<10:26:07, 3.35s/it] {'loss': 0.1282, 'grad_norm': 
0.7680153816881505, 'learning_rate': 2.556672239824183e-06, 'epoch': 0.67} 67%|██████▋ | 23057/34278 [25:19:14<10:26:07, 3.35s/it] 67%|██████▋ | 23058/34278 [25:19:18<10:49:02, 3.47s/it] {'loss': 0.1017, 'grad_norm': 0.8294072737514747, 'learning_rate': 2.556260065540606e-06, 'epoch': 0.67} 67%|██████▋ | 23058/34278 [25:19:18<10:49:02, 3.47s/it] 67%|██████▋ | 23059/34278 [25:19:22<10:54:23, 3.50s/it] {'loss': 0.1243, 'grad_norm': 1.1684656397570399, 'learning_rate': 2.5558479130741537e-06, 'epoch': 0.67} 67%|██████▋ | 23059/34278 [25:19:22<10:54:23, 3.50s/it] 67%|██████▋ | 23060/34278 [25:19:25<10:19:12, 3.31s/it] {'loss': 0.1146, 'grad_norm': 0.7347464654050591, 'learning_rate': 2.555435782428509e-06, 'epoch': 0.67} 67%|██████▋ | 23060/34278 [25:19:25<10:19:12, 3.31s/it] 67%|██████▋ | 23061/34278 [25:19:31<12:51:25, 4.13s/it] {'loss': 0.1244, 'grad_norm': 0.8680076540846722, 'learning_rate': 2.555023673607349e-06, 'epoch': 0.67} 67%|██████▋ | 23061/34278 [25:19:31<12:51:25, 4.13s/it] 67%|██████▋ | 23062/34278 [25:19:34<12:11:48, 3.91s/it] {'loss': 0.1412, 'grad_norm': 0.8891559959126526, 'learning_rate': 2.5546115866143555e-06, 'epoch': 0.67} 67%|██████▋ | 23062/34278 [25:19:34<12:11:48, 3.91s/it] 67%|██████▋ | 23063/34278 [25:19:38<12:30:54, 4.02s/it] {'loss': 0.11, 'grad_norm': 0.9379800251188479, 'learning_rate': 2.5541995214532066e-06, 'epoch': 0.67} 67%|██████▋ | 23063/34278 [25:19:38<12:30:54, 4.02s/it] 67%|██████▋ | 23064/34278 [25:19:41<11:27:45, 3.68s/it] {'loss': 0.1059, 'grad_norm': 0.9374131690397216, 'learning_rate': 2.5537874781275777e-06, 'epoch': 0.67} 67%|██████▋ | 23064/34278 [25:19:41<11:27:45, 3.68s/it] 67%|██████▋ | 23065/34278 [25:19:46<12:35:51, 4.04s/it] {'loss': 0.1339, 'grad_norm': 0.9932545360671732, 'learning_rate': 2.5533754566411505e-06, 'epoch': 0.67} 67%|██████▋ | 23065/34278 [25:19:46<12:35:51, 4.04s/it] 67%|██████▋ | 23066/34278 [25:19:49<11:39:52, 3.75s/it] {'loss': 0.1207, 'grad_norm': 1.0481182700216478, 'learning_rate': 
2.5529634569976053e-06, 'epoch': 0.67} 67%|██████▋ | 23066/34278 [25:19:49<11:39:52, 3.75s/it] 67%|██████▋ | 23067/34278 [25:19:53<11:25:10, 3.67s/it] {'loss': 0.1338, 'grad_norm': 0.8243975726970814, 'learning_rate': 2.552551479200616e-06, 'epoch': 0.67} 67%|██████▋ | 23067/34278 [25:19:53<11:25:10, 3.67s/it] 67%|██████▋ | 23068/34278 [25:19:58<12:56:53, 4.16s/it] {'loss': 0.0994, 'grad_norm': 0.7311252626071658, 'learning_rate': 2.5521395232538647e-06, 'epoch': 0.67} 67%|██████▋ | 23068/34278 [25:19:58<12:56:53, 4.16s/it] 67%|██████▋ | 23069/34278 [25:20:02<12:28:17, 4.01s/it] {'loss': 0.1192, 'grad_norm': 0.8570503512909122, 'learning_rate': 2.5517275891610283e-06, 'epoch': 0.67} 67%|██████▋ | 23069/34278 [25:20:02<12:28:17, 4.01s/it] 67%|██████▋ | 23070/34278 [25:20:05<11:29:24, 3.69s/it] {'loss': 0.122, 'grad_norm': 0.9790333521513247, 'learning_rate': 2.551315676925781e-06, 'epoch': 0.67} 67%|██████▋ | 23070/34278 [25:20:05<11:29:24, 3.69s/it] 67%|██████▋ | 23071/34278 [25:20:08<10:46:59, 3.46s/it] {'loss': 0.1191, 'grad_norm': 1.162734332338707, 'learning_rate': 2.5509037865518026e-06, 'epoch': 0.67} 67%|██████▋ | 23071/34278 [25:20:08<10:46:59, 3.46s/it] 67%|██████▋ | 23072/34278 [25:20:14<13:08:54, 4.22s/it] {'loss': 0.1099, 'grad_norm': 0.8244726491109521, 'learning_rate': 2.5504919180427723e-06, 'epoch': 0.67} 67%|██████▋ | 23072/34278 [25:20:14<13:08:54, 4.22s/it] 67%|██████▋ | 23073/34278 [25:20:17<11:59:02, 3.85s/it] {'loss': 0.1252, 'grad_norm': 0.906523145815668, 'learning_rate': 2.5500800714023654e-06, 'epoch': 0.67} 67%|██████▋ | 23073/34278 [25:20:17<11:59:02, 3.85s/it] 67%|██████▋ | 23074/34278 [25:20:20<11:17:45, 3.63s/it] {'loss': 0.1236, 'grad_norm': 1.21337257274489, 'learning_rate': 2.5496682466342576e-06, 'epoch': 0.67} 67%|██████▋ | 23074/34278 [25:20:20<11:17:45, 3.63s/it] 67%|██████▋ | 23075/34278 [25:20:23<11:12:23, 3.60s/it] {'loss': 0.11, 'grad_norm': 1.0699147048227882, 'learning_rate': 2.5492564437421287e-06, 'epoch': 0.67} 
67%|██████▋ | 23075/34278 [25:20:23<11:12:23, 3.60s/it] 67%|██████▋ | 23076/34278 [25:20:26<10:44:41, 3.45s/it] {'loss': 0.1431, 'grad_norm': 0.7315953389906471, 'learning_rate': 2.5488446627296525e-06, 'epoch': 0.67} 67%|██████▋ | 23076/34278 [25:20:26<10:44:41, 3.45s/it] 67%|██████▋ | 23077/34278 [25:20:32<12:43:59, 4.09s/it] {'loss': 0.1323, 'grad_norm': 1.1274509195859086, 'learning_rate': 2.5484329036005024e-06, 'epoch': 0.67} 67%|██████▋ | 23077/34278 [25:20:32<12:43:59, 4.09s/it] 67%|██████▋ | 23078/34278 [25:20:38<14:44:35, 4.74s/it] {'loss': 0.1141, 'grad_norm': 1.2727683190101309, 'learning_rate': 2.548021166358362e-06, 'epoch': 0.67} 67%|██████▋ | 23078/34278 [25:20:38<14:44:35, 4.74s/it] 67%|██████▋ | 23079/34278 [25:20:41<13:16:49, 4.27s/it] {'loss': 0.1226, 'grad_norm': 0.8429130965203371, 'learning_rate': 2.5476094510069025e-06, 'epoch': 0.67} 67%|██████▋ | 23079/34278 [25:20:41<13:16:49, 4.27s/it] 67%|██████▋ | 23080/34278 [25:20:47<15:00:32, 4.83s/it] {'loss': 0.1034, 'grad_norm': 0.7867535761335129, 'learning_rate': 2.5471977575497995e-06, 'epoch': 0.67} 67%|██████▋ | 23080/34278 [25:20:47<15:00:32, 4.83s/it] 67%|██████▋ | 23081/34278 [25:20:50<13:10:08, 4.23s/it] {'loss': 0.1424, 'grad_norm': 0.7939388142724333, 'learning_rate': 2.5467860859907314e-06, 'epoch': 0.67} 67%|██████▋ | 23081/34278 [25:20:50<13:10:08, 4.23s/it] 67%|██████▋ | 23082/34278 [25:20:53<11:54:39, 3.83s/it] {'loss': 0.1102, 'grad_norm': 0.9056405970328766, 'learning_rate': 2.546374436333371e-06, 'epoch': 0.67} 67%|██████▋ | 23082/34278 [25:20:53<11:54:39, 3.83s/it] 67%|██████▋ | 23083/34278 [25:20:57<11:46:10, 3.78s/it] {'loss': 0.1173, 'grad_norm': 0.9434431303885463, 'learning_rate': 2.5459628085813924e-06, 'epoch': 0.67} 67%|██████▋ | 23083/34278 [25:20:57<11:46:10, 3.78s/it] 67%|██████▋ | 23084/34278 [25:21:00<11:00:09, 3.54s/it] {'loss': 0.1192, 'grad_norm': 0.802263753656373, 'learning_rate': 2.5455512027384717e-06, 'epoch': 0.67} 67%|██████▋ | 23084/34278 
67%|██████▋ | 23085/34278 [25:21:03<10:58:02, 3.53s/it] {'loss': 0.1215, 'grad_norm': 1.0040014999455318, 'learning_rate': 2.5451396188082853e-06, 'epoch': 0.67}
[steps 23086-23154: one progress-bar line plus one metrics dict per step, each printed twice by the tqdm redraw; loss fluctuates in ~0.10-0.15, grad_norm in ~0.72-2.02, learning_rate decays from 2.545e-06 toward 2.517e-06, and the epoch counter ticks from 0.67 to 0.68 at step 23138]
68%|██████▊ | 23155/34278 [25:25:28<14:47:30, 4.79s/it] {'loss': 0.1261, 'grad_norm': 0.8356682916380015, 'learning_rate': 2.516383415783367e-06, 'epoch': 0.68}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[steps 23156-23164 continue in the same pattern: loss ~0.10-0.14, learning_rate 2.516e-06 -> 2.513e-06, epoch 0.68]
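The UserWarning above ("None of the inputs have requires_grad=True. Gradients will be None") typically fires when a gradient-checkpointed block receives only tensors that do not require grad, e.g. frozen input embeddings. A minimal sketch of the usual workaround, forcing the block's input to require grad before checkpointing (this is, to our understanding, what helpers like Hugging Face's `enable_input_require_grads()` effectively do; the toy `layer`/`x` names here are illustrative, not from the training code):

```python
import torch
import torch.nn as nn
from torch.utils.checkpoint import checkpoint

layer = nn.Linear(4, 4)

# An activation entering the first checkpointed block with
# requires_grad=False triggers the warning and yields no gradients.
x = torch.randn(2, 4)

# Workaround: make the block input participate in autograd so the
# checkpointed segment is re-run with a backward graph.
x.requires_grad_(True)
out = checkpoint(layer, x, use_reentrant=False)
out.sum().backward()

# Parameter gradients now flow through the checkpointed segment.
assert layer.weight.grad is not None
```

Whether the warning is benign depends on the model: if every checkpointed block sits behind at least one grad-requiring tensor elsewhere in the graph, training still proceeds; otherwise those blocks silently contribute no gradients.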
68%|██████▊ | 23165/34278 [25:26:03<10:53:49, 3.53s/it] {'loss': 0.0999, 'grad_norm': 0.8938954356306851, 'learning_rate': 2.5122842267207092e-06, 'epoch': 0.68} 68%|██████▊ | 23165/34278 [25:26:03<10:53:49, 3.53s/it] 68%|██████▊ | 23166/34278 [25:26:06<10:23:01, 3.36s/it] {'loss': 0.1107, 'grad_norm': 0.8132571370356921, 'learning_rate': 2.511874429907597e-06, 'epoch': 0.68} 68%|██████▊ | 23166/34278 [25:26:06<10:23:01, 3.36s/it] 68%|██████▊ | 23167/34278 [25:26:10<11:24:52, 3.70s/it] {'loss': 0.1412, 'grad_norm': 1.1041847456088503, 'learning_rate': 2.5114646553078726e-06, 'epoch': 0.68} 68%|██████▊ | 23167/34278 [25:26:10<11:24:52, 3.70s/it] 68%|██████▊ | 23168/34278 [25:26:13<10:43:14, 3.47s/it] {'loss': 0.117, 'grad_norm': 0.9164250587438241, 'learning_rate': 2.5110549029252006e-06, 'epoch': 0.68} 68%|██████▊ | 23168/34278 [25:26:13<10:43:14, 3.47s/it] 68%|██████▊ | 23169/34278 [25:26:16<10:18:14, 3.34s/it] {'loss': 0.1219, 'grad_norm': 0.7557817034171268, 'learning_rate': 2.5106451727632374e-06, 'epoch': 0.68} 68%|██████▊ | 23169/34278 [25:26:16<10:18:14, 3.34s/it] 68%|██████▊ | 23170/34278 [25:26:19<10:02:27, 3.25s/it] {'loss': 0.1018, 'grad_norm': 0.8457012666065661, 'learning_rate': 2.5102354648256373e-06, 'epoch': 0.68} 68%|██████▊ | 23170/34278 [25:26:19<10:02:27, 3.25s/it] 68%|██████▊ | 23171/34278 [25:26:22<9:43:23, 3.15s/it] {'loss': 0.1123, 'grad_norm': 1.0266255610670945, 'learning_rate': 2.5098257791160623e-06, 'epoch': 0.68} 68%|██████▊ | 23171/34278 [25:26:22<9:43:23, 3.15s/it] 68%|██████▊ | 23172/34278 [25:26:25<9:55:06, 3.22s/it] {'loss': 0.1038, 'grad_norm': 0.7980441916870344, 'learning_rate': 2.509416115638169e-06, 'epoch': 0.68} 68%|██████▊ | 23172/34278 [25:26:26<9:55:06, 3.22s/it] 68%|██████▊ | 23173/34278 [25:26:29<10:28:18, 3.39s/it] {'loss': 0.1331, 'grad_norm': 0.8657842501474882, 'learning_rate': 2.509006474395612e-06, 'epoch': 0.68} 68%|██████▊ | 23173/34278 [25:26:29<10:28:18, 3.39s/it] 68%|██████▊ | 23174/34278 [25:26:34<11:33:42, 
3.75s/it] {'loss': 0.1236, 'grad_norm': 0.8752634771948503, 'learning_rate': 2.5085968553920498e-06, 'epoch': 0.68} 68%|██████▊ | 23174/34278 [25:26:34<11:33:42, 3.75s/it] 68%|██████▊ | 23175/34278 [25:26:40<13:32:46, 4.39s/it] {'loss': 0.1306, 'grad_norm': 1.3284849313228773, 'learning_rate': 2.508187258631143e-06, 'epoch': 0.68} 68%|██████▊ | 23175/34278 [25:26:40<13:32:46, 4.39s/it] 68%|██████▊ | 23176/34278 [25:26:43<12:21:20, 4.01s/it] {'loss': 0.118, 'grad_norm': 0.7788279697705117, 'learning_rate': 2.507777684116545e-06, 'epoch': 0.68} 68%|██████▊ | 23176/34278 [25:26:43<12:21:20, 4.01s/it] 68%|██████▊ | 23177/34278 [25:26:46<11:34:07, 3.75s/it] {'loss': 0.1276, 'grad_norm': 0.8955701524584327, 'learning_rate': 2.5073681318519106e-06, 'epoch': 0.68} 68%|██████▊ | 23177/34278 [25:26:46<11:34:07, 3.75s/it] 68%|██████▊ | 23178/34278 [25:26:50<11:41:41, 3.79s/it] {'loss': 0.1032, 'grad_norm': 0.8405903324101602, 'learning_rate': 2.506958601840901e-06, 'epoch': 0.68} 68%|██████▊ | 23178/34278 [25:26:50<11:41:41, 3.79s/it] 68%|██████▊ | 23179/34278 [25:26:53<10:58:45, 3.56s/it] {'loss': 0.109, 'grad_norm': 0.8614963663971591, 'learning_rate': 2.5065490940871674e-06, 'epoch': 0.68} 68%|██████▊ | 23179/34278 [25:26:53<10:58:45, 3.56s/it] 68%|██████▊ | 23180/34278 [25:26:56<10:22:00, 3.36s/it] {'loss': 0.1139, 'grad_norm': 0.7247493205722442, 'learning_rate': 2.50613960859437e-06, 'epoch': 0.68} 68%|██████▊ | 23180/34278 [25:26:56<10:22:00, 3.36s/it] 68%|██████▊ | 23181/34278 [25:26:58<9:42:01, 3.15s/it] {'loss': 0.1227, 'grad_norm': 0.94560884041923, 'learning_rate': 2.505730145366162e-06, 'epoch': 0.68} 68%|██████▊ | 23181/34278 [25:26:59<9:42:01, 3.15s/it] 68%|██████▊ | 23182/34278 [25:27:01<9:32:54, 3.10s/it] {'loss': 0.0908, 'grad_norm': 0.6935816823347132, 'learning_rate': 2.505320704406201e-06, 'epoch': 0.68} 68%|██████▊ | 23182/34278 [25:27:01<9:32:54, 3.10s/it] 68%|██████▊ | 23183/34278 [25:27:06<10:26:11, 3.39s/it] {'loss': 0.1393, 'grad_norm': 
0.992465071574614, 'learning_rate': 2.5049112857181413e-06, 'epoch': 0.68} 68%|██████▊ | 23183/34278 [25:27:06<10:26:11, 3.39s/it] 68%|██████▊ | 23184/34278 [25:27:12<12:54:46, 4.19s/it] {'loss': 0.1205, 'grad_norm': 0.7836895910781367, 'learning_rate': 2.504501889305636e-06, 'epoch': 0.68} 68%|██████▊ | 23184/34278 [25:27:12<12:54:46, 4.19s/it] 68%|██████▊ | 23185/34278 [25:27:16<12:50:19, 4.17s/it] {'loss': 0.1215, 'grad_norm': 0.8185898009347492, 'learning_rate': 2.5040925151723428e-06, 'epoch': 0.68} 68%|██████▊ | 23185/34278 [25:27:16<12:50:19, 4.17s/it] 68%|██████▊ | 23186/34278 [25:27:19<11:47:47, 3.83s/it] {'loss': 0.1138, 'grad_norm': 0.8721580626868752, 'learning_rate': 2.5036831633219173e-06, 'epoch': 0.68} 68%|██████▊ | 23186/34278 [25:27:19<11:47:47, 3.83s/it] 68%|██████▊ | 23187/34278 [25:27:22<11:09:03, 3.62s/it] {'loss': 0.1358, 'grad_norm': 0.8459028226000596, 'learning_rate': 2.5032738337580107e-06, 'epoch': 0.68} 68%|██████▊ | 23187/34278 [25:27:22<11:09:03, 3.62s/it] 68%|██████▊ | 23188/34278 [25:27:25<10:48:01, 3.51s/it] {'loss': 0.1182, 'grad_norm': 0.8127755401220963, 'learning_rate': 2.502864526484281e-06, 'epoch': 0.68} 68%|██████▊ | 23188/34278 [25:27:25<10:48:01, 3.51s/it] 68%|██████▊ | 23189/34278 [25:27:31<13:21:11, 4.34s/it] {'loss': 0.1226, 'grad_norm': 0.6487232810150235, 'learning_rate': 2.5024552415043805e-06, 'epoch': 0.68} 68%|██████▊ | 23189/34278 [25:27:31<13:21:11, 4.34s/it] 68%|██████▊ | 23190/34278 [25:27:35<12:34:45, 4.08s/it] {'loss': 0.1428, 'grad_norm': 0.7757210954520243, 'learning_rate': 2.502045978821962e-06, 'epoch': 0.68} 68%|██████▊ | 23190/34278 [25:27:35<12:34:45, 4.08s/it] 68%|██████▊ | 23191/34278 [25:27:38<11:28:41, 3.73s/it] {'loss': 0.1327, 'grad_norm': 1.1467945440571865, 'learning_rate': 2.5016367384406803e-06, 'epoch': 0.68} 68%|██████▊ | 23191/34278 [25:27:38<11:28:41, 3.73s/it] 68%|██████▊ | 23192/34278 [25:27:41<10:49:32, 3.52s/it] {'loss': 0.1091, 'grad_norm': 1.2988519775489853, 'learning_rate': 
2.5012275203641917e-06, 'epoch': 0.68} 68%|██████▊ | 23192/34278 [25:27:41<10:49:32, 3.52s/it] 68%|██████▊ | 23193/34278 [25:27:46<11:57:10, 3.88s/it] {'loss': 0.1483, 'grad_norm': 0.8683144536525187, 'learning_rate': 2.500818324596147e-06, 'epoch': 0.68} 68%|██████▊ | 23193/34278 [25:27:46<11:57:10, 3.88s/it] 68%|██████▊ | 23194/34278 [25:27:49<11:16:51, 3.66s/it] {'loss': 0.129, 'grad_norm': 0.7776973811963013, 'learning_rate': 2.500409151140198e-06, 'epoch': 0.68} 68%|██████▊ | 23194/34278 [25:27:49<11:16:51, 3.66s/it] 68%|██████▊ | 23195/34278 [25:27:52<10:40:02, 3.46s/it] {'loss': 0.116, 'grad_norm': 1.0010403591909511, 'learning_rate': 2.5000000000000015e-06, 'epoch': 0.68} 68%|██████▊ | 23195/34278 [25:27:52<10:40:02, 3.46s/it] 68%|██████▊ | 23196/34278 [25:27:56<11:27:03, 3.72s/it] {'loss': 0.1117, 'grad_norm': 0.7641616983467394, 'learning_rate': 2.4995908711792057e-06, 'epoch': 0.68} 68%|██████▊ | 23196/34278 [25:27:56<11:27:03, 3.72s/it] 68%|██████▊ | 23197/34278 [25:27:59<11:04:38, 3.60s/it] {'loss': 0.1278, 'grad_norm': 0.8650166714865422, 'learning_rate': 2.499181764681466e-06, 'epoch': 0.68} 68%|██████▊ | 23197/34278 [25:27:59<11:04:38, 3.60s/it] 68%|██████▊ | 23198/34278 [25:28:03<10:49:12, 3.52s/it] {'loss': 0.1183, 'grad_norm': 0.7524820034499831, 'learning_rate': 2.498772680510436e-06, 'epoch': 0.68} 68%|██████▊ | 23198/34278 [25:28:03<10:49:12, 3.52s/it] 68%|██████▊ | 23199/34278 [25:28:06<10:28:59, 3.41s/it] {'loss': 0.1315, 'grad_norm': 0.9294925244917861, 'learning_rate': 2.498363618669767e-06, 'epoch': 0.68} 68%|██████▊ | 23199/34278 [25:28:06<10:28:59, 3.41s/it] 68%|██████▊ | 23200/34278 [25:28:09<9:59:52, 3.25s/it] {'loss': 0.1318, 'grad_norm': 0.8821962518874622, 'learning_rate': 2.497954579163108e-06, 'epoch': 0.68} 68%|██████▊ | 23200/34278 [25:28:09<9:59:52, 3.25s/it] 68%|██████▊ | 23201/34278 [25:28:12<9:59:12, 3.25s/it] {'loss': 0.1093, 'grad_norm': 0.8173712125778281, 'learning_rate': 2.4975455619941158e-06, 'epoch': 0.68} 
68%|██████▊ | 23201/34278 [25:28:12<9:59:12, 3.25s/it] 68%|██████▊ | 23202/34278 [25:28:17<11:29:55, 3.74s/it] {'loss': 0.1112, 'grad_norm': 0.8654358545193849, 'learning_rate': 2.4971365671664373e-06, 'epoch': 0.68} 68%|██████▊ | 23202/34278 [25:28:17<11:29:55, 3.74s/it] 68%|██████▊ | 23203/34278 [25:28:22<12:28:45, 4.06s/it] {'loss': 0.1514, 'grad_norm': 1.0899199397860233, 'learning_rate': 2.4967275946837276e-06, 'epoch': 0.68} 68%|██████▊ | 23203/34278 [25:28:22<12:28:45, 4.06s/it] 68%|██████▊ | 23204/34278 [25:28:25<12:17:34, 4.00s/it] {'loss': 0.1279, 'grad_norm': 0.901102202281327, 'learning_rate': 2.496318644549635e-06, 'epoch': 0.68} 68%|██████▊ | 23204/34278 [25:28:25<12:17:34, 4.00s/it] 68%|██████▊ | 23205/34278 [25:28:29<12:15:58, 3.99s/it] {'loss': 0.1291, 'grad_norm': 0.8632830006164858, 'learning_rate': 2.4959097167678135e-06, 'epoch': 0.68} 68%|██████▊ | 23205/34278 [25:28:29<12:15:58, 3.99s/it] 68%|██████▊ | 23206/34278 [25:28:32<11:13:18, 3.65s/it] {'loss': 0.1289, 'grad_norm': 1.075004156894414, 'learning_rate': 2.495500811341912e-06, 'epoch': 0.68} 68%|██████▊ | 23206/34278 [25:28:32<11:13:18, 3.65s/it] 68%|██████▊ | 23207/34278 [25:28:36<10:51:05, 3.53s/it] {'loss': 0.1234, 'grad_norm': 1.0066407992973547, 'learning_rate': 2.4950919282755796e-06, 'epoch': 0.68} 68%|██████▊ | 23207/34278 [25:28:36<10:51:05, 3.53s/it] 68%|██████▊ | 23208/34278 [25:28:40<11:30:32, 3.74s/it] {'loss': 0.1149, 'grad_norm': 1.9631922248190352, 'learning_rate': 2.4946830675724694e-06, 'epoch': 0.68} 68%|██████▊ | 23208/34278 [25:28:40<11:30:32, 3.74s/it] 68%|██████▊ | 23209/34278 [25:28:43<10:54:39, 3.55s/it] {'loss': 0.1295, 'grad_norm': 1.09266457228653, 'learning_rate': 2.4942742292362316e-06, 'epoch': 0.68} 68%|██████▊ | 23209/34278 [25:28:43<10:54:39, 3.55s/it] 68%|██████▊ | 23210/34278 [25:28:46<10:04:26, 3.28s/it] {'loss': 0.1139, 'grad_norm': 1.2044147096693798, 'learning_rate': 2.4938654132705154e-06, 'epoch': 0.68} 68%|██████▊ | 23210/34278 
[25:28:46<10:04:26, 3.28s/it] 68%|██████▊ | 23211/34278 [25:28:49<10:34:06, 3.44s/it] {'loss': 0.1176, 'grad_norm': 0.6787450334175764, 'learning_rate': 2.4934566196789687e-06, 'epoch': 0.68} 68%|██████▊ | 23211/34278 [25:28:49<10:34:06, 3.44s/it] 68%|██████▊ | 23212/34278 [25:28:53<10:57:45, 3.57s/it] {'loss': 0.1224, 'grad_norm': 1.0005079052068275, 'learning_rate': 2.4930478484652447e-06, 'epoch': 0.68} 68%|██████▊ | 23212/34278 [25:28:53<10:57:45, 3.57s/it] 68%|██████▊ | 23213/34278 [25:28:56<10:20:25, 3.36s/it] {'loss': 0.1349, 'grad_norm': 0.9564861776070009, 'learning_rate': 2.4926390996329912e-06, 'epoch': 0.68} 68%|██████▊ | 23213/34278 [25:28:56<10:20:25, 3.36s/it] 68%|██████▊ | 23214/34278 [25:28:59<9:57:46, 3.24s/it] {'loss': 0.115, 'grad_norm': 0.9929649457165906, 'learning_rate': 2.492230373185854e-06, 'epoch': 0.68} 68%|██████▊ | 23214/34278 [25:28:59<9:57:46, 3.24s/it] 68%|██████▊ | 23215/34278 [25:29:05<12:35:01, 4.09s/it] {'loss': 0.1324, 'grad_norm': 0.9268754442125966, 'learning_rate': 2.4918216691274888e-06, 'epoch': 0.68} 68%|██████▊ | 23215/34278 [25:29:05<12:35:01, 4.09s/it] 68%|██████▊ | 23216/34278 [25:29:11<14:35:07, 4.75s/it] {'loss': 0.1494, 'grad_norm': 1.1756847446121403, 'learning_rate': 2.4914129874615404e-06, 'epoch': 0.68} 68%|██████▊ | 23216/34278 [25:29:11<14:35:07, 4.75s/it] 68%|██████▊ | 23217/34278 [25:29:14<12:43:51, 4.14s/it] {'loss': 0.1198, 'grad_norm': 1.383475103297312, 'learning_rate': 2.491004328191657e-06, 'epoch': 0.68} 68%|██████▊ | 23217/34278 [25:29:14<12:43:51, 4.14s/it] 68%|██████▊ | 23218/34278 [25:29:20<14:31:01, 4.73s/it] {'loss': 0.1442, 'grad_norm': 1.0030078487186729, 'learning_rate': 2.4905956913214897e-06, 'epoch': 0.68} 68%|██████▊ | 23218/34278 [25:29:20<14:31:01, 4.73s/it] 68%|██████▊ | 23219/34278 [25:29:24<13:28:03, 4.38s/it] {'loss': 0.117, 'grad_norm': 0.7002481196687629, 'learning_rate': 2.4901870768546842e-06, 'epoch': 0.68} 68%|██████▊ | 23219/34278 [25:29:24<13:28:03, 4.38s/it] 68%|██████▊ | 
23220/34278 [25:29:27<12:05:08, 3.93s/it] {'loss': 0.1112, 'grad_norm': 0.97035648522706, 'learning_rate': 2.4897784847948885e-06, 'epoch': 0.68} 68%|██████▊ | 23220/34278 [25:29:27<12:05:08, 3.93s/it] 68%|██████▊ | 23221/34278 [25:29:30<11:11:32, 3.64s/it] {'loss': 0.1136, 'grad_norm': 0.9056461538362122, 'learning_rate': 2.4893699151457507e-06, 'epoch': 0.68} 68%|██████▊ | 23221/34278 [25:29:30<11:11:32, 3.64s/it] 68%|██████▊ | 23222/34278 [25:29:34<12:01:22, 3.91s/it] {'loss': 0.1217, 'grad_norm': 0.9779488105303187, 'learning_rate': 2.4889613679109208e-06, 'epoch': 0.68} 68%|██████▊ | 23222/34278 [25:29:34<12:01:22, 3.91s/it] 68%|██████▊ | 23223/34278 [25:29:37<11:17:15, 3.68s/it] {'loss': 0.1272, 'grad_norm': 0.7770129971333952, 'learning_rate': 2.4885528430940447e-06, 'epoch': 0.68} 68%|██████▊ | 23223/34278 [25:29:37<11:17:15, 3.68s/it] 68%|██████▊ | 23224/34278 [25:29:43<13:19:19, 4.34s/it] {'loss': 0.1407, 'grad_norm': 0.9252881197770451, 'learning_rate': 2.488144340698767e-06, 'epoch': 0.68} 68%|██████▊ | 23224/34278 [25:29:43<13:19:19, 4.34s/it] 68%|██████▊ | 23225/34278 [25:29:46<11:59:20, 3.90s/it] {'loss': 0.1341, 'grad_norm': 0.9574349860481227, 'learning_rate': 2.4877358607287393e-06, 'epoch': 0.68} 68%|██████▊ | 23225/34278 [25:29:46<11:59:20, 3.90s/it] 68%|██████▊ | 23226/34278 [25:29:49<11:27:24, 3.73s/it] {'loss': 0.1203, 'grad_norm': 1.138847368745896, 'learning_rate': 2.4873274031876045e-06, 'epoch': 0.68} 68%|██████▊ | 23226/34278 [25:29:49<11:27:24, 3.73s/it] 68%|██████▊ | 23227/34278 [25:29:52<10:45:52, 3.51s/it] {'loss': 0.1262, 'grad_norm': 1.0141871485961593, 'learning_rate': 2.48691896807901e-06, 'epoch': 0.68} 68%|██████▊ | 23227/34278 [25:29:52<10:45:52, 3.51s/it] 68%|██████▊ | 23228/34278 [25:29:56<10:44:44, 3.50s/it] {'loss': 0.1182, 'grad_norm': 1.0605770219692743, 'learning_rate': 2.4865105554066056e-06, 'epoch': 0.68} 68%|██████▊ | 23228/34278 [25:29:56<10:44:44, 3.50s/it] 68%|██████▊ | 23229/34278 [25:29:59<10:29:20, 3.42s/it] 
{'loss': 0.1202, 'grad_norm': 0.7354830297829584, 'learning_rate': 2.4861021651740343e-06, 'epoch': 0.68} 68%|██████▊ | 23229/34278 [25:29:59<10:29:20, 3.42s/it] 68%|██████▊ | 23230/34278 [25:30:02<10:05:56, 3.29s/it] {'loss': 0.1203, 'grad_norm': 0.7369375301099033, 'learning_rate': 2.485693797384941e-06, 'epoch': 0.68} 68%|██████▊ | 23230/34278 [25:30:02<10:05:56, 3.29s/it] 68%|██████▊ | 23231/34278 [25:30:05<9:43:39, 3.17s/it] {'loss': 0.1199, 'grad_norm': 1.1670457478783296, 'learning_rate': 2.4852854520429754e-06, 'epoch': 0.68} 68%|██████▊ | 23231/34278 [25:30:05<9:43:39, 3.17s/it] 68%|██████▊ | 23232/34278 [25:30:08<9:32:34, 3.11s/it] {'loss': 0.1025, 'grad_norm': 0.850800968871462, 'learning_rate': 2.484877129151779e-06, 'epoch': 0.68} 68%|██████▊ | 23232/34278 [25:30:08<9:32:34, 3.11s/it] 68%|██████▊ | 23233/34278 [25:30:12<10:11:55, 3.32s/it] {'loss': 0.1115, 'grad_norm': 0.7398335044777398, 'learning_rate': 2.4844688287150014e-06, 'epoch': 0.68} 68%|██████▊ | 23233/34278 [25:30:12<10:11:55, 3.32s/it] 68%|██████▊ | 23234/34278 [25:30:15<10:10:14, 3.32s/it] {'loss': 0.1315, 'grad_norm': 0.7725895973816064, 'learning_rate': 2.484060550736283e-06, 'epoch': 0.68} 68%|██████▊ | 23234/34278 [25:30:15<10:10:14, 3.32s/it] 68%|██████▊ | 23235/34278 [25:30:18<10:06:06, 3.29s/it] {'loss': 0.1013, 'grad_norm': 1.172249753279124, 'learning_rate': 2.4836522952192743e-06, 'epoch': 0.68} 68%|██████▊ | 23235/34278 [25:30:18<10:06:06, 3.29s/it] 68%|██████▊ | 23236/34278 [25:30:22<10:21:41, 3.38s/it] {'loss': 0.1296, 'grad_norm': 0.8574820241241642, 'learning_rate': 2.483244062167616e-06, 'epoch': 0.68} 68%|██████▊ | 23236/34278 [25:30:22<10:21:41, 3.38s/it] 68%|██████▊ | 23237/34278 [25:30:27<11:32:54, 3.77s/it] {'loss': 0.1089, 'grad_norm': 0.7456131632183346, 'learning_rate': 2.4828358515849532e-06, 'epoch': 0.68} 68%|██████▊ | 23237/34278 [25:30:27<11:32:54, 3.77s/it] 68%|██████▊ | 23238/34278 [25:30:30<10:49:28, 3.53s/it] {'loss': 0.1022, 'grad_norm': 
1.4838246913619229, 'learning_rate': 2.48242766347493e-06, 'epoch': 0.68} 68%|██████▊ | 23238/34278 [25:30:30<10:49:28, 3.53s/it] 68%|██████▊ | 23239/34278 [25:30:33<10:21:23, 3.38s/it] {'loss': 0.1247, 'grad_norm': 0.9926797533852754, 'learning_rate': 2.4820194978411944e-06, 'epoch': 0.68} 68%|██████▊ | 23239/34278 [25:30:33<10:21:23, 3.38s/it] 68%|██████▊ | 23240/34278 [25:30:38<11:54:14, 3.88s/it] {'loss': 0.1317, 'grad_norm': 0.8703493259399564, 'learning_rate': 2.481611354687387e-06, 'epoch': 0.68} 68%|██████▊ | 23240/34278 [25:30:38<11:54:14, 3.88s/it] 68%|██████▊ | 23241/34278 [25:30:41<11:24:28, 3.72s/it] {'loss': 0.1159, 'grad_norm': 0.7748205997640173, 'learning_rate': 2.4812032340171504e-06, 'epoch': 0.68} 68%|██████▊ | 23241/34278 [25:30:41<11:24:28, 3.72s/it] 68%|██████▊ | 23242/34278 [25:30:46<12:40:51, 4.14s/it] {'loss': 0.12, 'grad_norm': 0.8303628657101826, 'learning_rate': 2.480795135834132e-06, 'epoch': 0.68} 68%|██████▊ | 23242/34278 [25:30:46<12:40:51, 4.14s/it] 68%|██████▊ | 23243/34278 [25:30:50<12:04:15, 3.94s/it] {'loss': 0.1258, 'grad_norm': 1.0797060426992753, 'learning_rate': 2.480387060141974e-06, 'epoch': 0.68} 68%|██████▊ | 23243/34278 [25:30:50<12:04:15, 3.94s/it] 68%|██████▊ | 23244/34278 [25:30:54<12:33:58, 4.10s/it] {'loss': 0.1139, 'grad_norm': 0.7337524852911939, 'learning_rate': 2.479979006944314e-06, 'epoch': 0.68} 68%|██████▊ | 23244/34278 [25:30:54<12:33:58, 4.10s/it] 68%|██████▊ | 23245/34278 [25:30:57<11:34:24, 3.78s/it] {'loss': 0.1213, 'grad_norm': 0.9822969250196009, 'learning_rate': 2.479570976244804e-06, 'epoch': 0.68} 68%|██████▊ | 23245/34278 [25:30:57<11:34:24, 3.78s/it] 68%|██████▊ | 23246/34278 [25:31:00<10:53:42, 3.56s/it] {'loss': 0.1207, 'grad_norm': 0.9337499917483004, 'learning_rate': 2.4791629680470826e-06, 'epoch': 0.68} 68%|██████▊ | 23246/34278 [25:31:00<10:53:42, 3.56s/it] 68%|██████▊ | 23247/34278 [25:31:04<10:55:45, 3.57s/it] {'loss': 0.108, 'grad_norm': 0.7617695585390399, 'learning_rate': 
2.4787549823547906e-06, 'epoch': 0.68} 68%|██████▊ | 23247/34278 [25:31:04<10:55:45, 3.57s/it] 68%|██████▊ | 23248/34278 [25:31:07<10:38:01, 3.47s/it] {'loss': 0.1329, 'grad_norm': 0.8697977727272415, 'learning_rate': 2.478347019171574e-06, 'epoch': 0.68} 68%|██████▊ | 23248/34278 [25:31:07<10:38:01, 3.47s/it] 68%|██████▊ | 23249/34278 [25:31:10<10:23:31, 3.39s/it] {'loss': 0.1184, 'grad_norm': 1.1181899706718783, 'learning_rate': 2.477939078501074e-06, 'epoch': 0.68} 68%|██████▊ | 23249/34278 [25:31:10<10:23:31, 3.39s/it] 68%|██████▊ | 23250/34278 [25:31:13<10:02:52, 3.28s/it] {'loss': 0.1056, 'grad_norm': 0.9719237304218606, 'learning_rate': 2.4775311603469294e-06, 'epoch': 0.68} 68%|██████▊ | 23250/34278 [25:31:13<10:02:52, 3.28s/it] 68%|██████▊ | 23251/34278 [25:31:17<10:41:43, 3.49s/it] {'loss': 0.1435, 'grad_norm': 0.74337342749288, 'learning_rate': 2.4771232647127842e-06, 'epoch': 0.68} 68%|██████▊ | 23251/34278 [25:31:17<10:41:43, 3.49s/it] 68%|██████▊ | 23252/34278 [25:31:21<10:59:59, 3.59s/it] {'loss': 0.1187, 'grad_norm': 0.7109216846044918, 'learning_rate': 2.4767153916022823e-06, 'epoch': 0.68} 68%|██████▊ | 23252/34278 [25:31:21<10:59:59, 3.59s/it] 68%|██████▊ | 23253/34278 [25:31:24<10:32:20, 3.44s/it] {'loss': 0.1273, 'grad_norm': 1.506068787002971, 'learning_rate': 2.476307541019063e-06, 'epoch': 0.68} 68%|██████▊ | 23253/34278 [25:31:24<10:32:20, 3.44s/it] 68%|██████▊ | 23254/34278 [25:31:27<10:06:36, 3.30s/it] {'loss': 0.1036, 'grad_norm': 0.8728675770934057, 'learning_rate': 2.4758997129667654e-06, 'epoch': 0.68} 68%|██████▊ | 23254/34278 [25:31:27<10:06:36, 3.30s/it] 68%|██████▊ | 23255/34278 [25:31:33<12:35:43, 4.11s/it] {'loss': 0.1018, 'grad_norm': 0.9665068777593661, 'learning_rate': 2.4754919074490353e-06, 'epoch': 0.68} 68%|██████▊ | 23255/34278 [25:31:33<12:35:43, 4.11s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = 
self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7facc27a6c00>
Failed to fetch sample 2654737.
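The traceback above shows the dataset's `__getitem__` hitting `PIL.UnidentifiedImageError` on corrupt image bytes, after which training continues past the bad sample ("Failed to fetch sample N"). A minimal sketch of that skip-and-retry pattern, as a hypothetical wrapper (`RobustDataset` and `loader` are illustrative names, not the actual `aguvis/dataset.py` code):

```python
import random

class RobustDataset:
    """Retry-on-failure __getitem__: on a bad sample, log the error and
    fall back to another index instead of crashing the whole run."""

    def __init__(self, loader, size, max_retries=10, seed=0):
        self.loader = loader            # callable: index -> sample; may raise
        self.size = size
        self.max_retries = max_retries
        self.rng = random.Random(seed)  # seeded for reproducible fallbacks

    def __len__(self):
        return self.size

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.loader(i)
            except Exception as e:
                # Mirrors the log's "Failed to fetch sample N. Exception: ..."
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = self.rng.randrange(self.size)  # try a different sample
        raise RuntimeError(f"gave up after {self.max_retries} bad samples")
```

With this pattern one corrupt file costs a logged message and a substituted sample rather than a dead worker, which matches how the run above keeps stepping after each error.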
Exception: cannot identify image file <_io.BytesIO object at 0x7facc27a6c00> 68%|██████▊ | 23256/34278 [25:31:36<11:33:33, 3.78s/it] {'loss': 0.1378, 'grad_norm': 0.9857849818076866, 'learning_rate': 2.4750841244695076e-06, 'epoch': 0.68} 68%|██████▊ | 23256/34278 [25:31:36<11:33:33, 3.78s/it] 68%|██████▊ | 23257/34278 [25:31:39<10:53:12, 3.56s/it] {'loss': 0.1201, 'grad_norm': 1.0120718578592216, 'learning_rate': 2.4746763640318273e-06, 'epoch': 0.68} 68%|██████▊ | 23257/34278 [25:31:39<10:53:12, 3.56s/it] 68%|██████▊ | 23258/34278 [25:31:44<12:08:56, 3.97s/it] {'loss': 0.1418, 'grad_norm': 1.1232237256642754, 'learning_rate': 2.474268626139635e-06, 'epoch': 0.68} 68%|██████▊ | 23258/34278 [25:31:44<12:08:56, 3.97s/it] 68%|██████▊ | 23259/34278 [25:31:47<11:18:33, 3.69s/it] {'loss': 0.1141, 'grad_norm': 1.0372873118648187, 'learning_rate': 2.47386091079657e-06, 'epoch': 0.68} 68%|██████▊ | 23259/34278 [25:31:47<11:18:33, 3.69s/it] 68%|██████▊ | 23260/34278 [25:31:53<12:58:45, 4.24s/it] {'loss': 0.1058, 'grad_norm': 0.8610873900766065, 'learning_rate': 2.4734532180062694e-06, 'epoch': 0.68} 68%|██████▊ | 23260/34278 [25:31:53<12:58:45, 4.24s/it] 68%|██████▊ | 23261/34278 [25:31:56<12:37:28, 4.13s/it] {'loss': 0.094, 'grad_norm': 0.7117429116053213, 'learning_rate': 2.4730455477723768e-06, 'epoch': 0.68} 68%|██████▊ | 23261/34278 [25:31:56<12:37:28, 4.13s/it] 68%|██████▊ | 23262/34278 [25:32:00<11:55:46, 3.90s/it] {'loss': 0.1319, 'grad_norm': 0.7874848766282998, 'learning_rate': 2.472637900098529e-06, 'epoch': 0.68} 68%|██████▊ | 23262/34278 [25:32:00<11:55:46, 3.90s/it] 68%|██████▊ | 23263/34278 [25:32:03<11:09:45, 3.65s/it] {'loss': 0.1305, 'grad_norm': 1.1463798161372984, 'learning_rate': 2.472230274988368e-06, 'epoch': 0.68} 68%|██████▊ | 23263/34278 [25:32:03<11:09:45, 3.65s/it] 68%|██████▊ | 23264/34278 [25:32:09<13:20:04, 4.36s/it] {'loss': 0.1164, 'grad_norm': 0.9051252263064438, 'learning_rate': 2.4718226724455307e-06, 'epoch': 0.68} 68%|██████▊ | 
23264/34278 [25:32:09<13:20:04, 4.36s/it] 68%|██████▊ | 23265/34278 [25:32:15<14:48:19, 4.84s/it] {'loss': 0.1357, 'grad_norm': 0.8160666109047723, 'learning_rate': 2.4714150924736586e-06, 'epoch': 0.68} 68%|██████▊ | 23265/34278 [25:32:15<14:48:19, 4.84s/it] 68%|██████▊ | 23266/34278 [25:32:18<13:39:51, 4.47s/it] {'loss': 0.1135, 'grad_norm': 1.2440188524743452, 'learning_rate': 2.4710075350763884e-06, 'epoch': 0.68} 68%|██████▊ | 23266/34278 [25:32:18<13:39:51, 4.47s/it] 68%|██████▊ | 23267/34278 [25:32:22<12:33:20, 4.11s/it] {'loss': 0.1193, 'grad_norm': 1.0329275208505135, 'learning_rate': 2.4706000002573575e-06, 'epoch': 0.68} 68%|██████▊ | 23267/34278 [25:32:22<12:33:20, 4.11s/it] 68%|██████▊ | 23268/34278 [25:32:27<13:23:11, 4.38s/it] {'loss': 0.1219, 'grad_norm': 0.8850802153830637, 'learning_rate': 2.4701924880202068e-06, 'epoch': 0.68} 68%|██████▊ | 23268/34278 [25:32:27<13:23:11, 4.38s/it] 68%|██████▊ | 23269/34278 [25:32:30<12:19:31, 4.03s/it] {'loss': 0.1263, 'grad_norm': 1.5421189082104674, 'learning_rate': 2.4697849983685746e-06, 'epoch': 0.68} 68%|██████▊ | 23269/34278 [25:32:30<12:19:31, 4.03s/it] 68%|██████▊ | 23270/34278 [25:32:33<11:21:51, 3.72s/it] {'loss': 0.1098, 'grad_norm': 0.8565440088863334, 'learning_rate': 2.469377531306098e-06, 'epoch': 0.68} 68%|██████▊ | 23270/34278 [25:32:33<11:21:51, 3.72s/it] 68%|██████▊ | 23271/34278 [25:32:36<10:50:00, 3.54s/it] {'loss': 0.1082, 'grad_norm': 0.8226132125700635, 'learning_rate': 2.4689700868364134e-06, 'epoch': 0.68} 68%|██████▊ | 23271/34278 [25:32:36<10:50:00, 3.54s/it] 68%|██████▊ | 23272/34278 [25:32:41<12:31:55, 4.10s/it] {'loss': 0.1036, 'grad_norm': 0.7672631597649043, 'learning_rate': 2.4685626649631612e-06, 'epoch': 0.68} 68%|██████▊ | 23272/34278 [25:32:41<12:31:55, 4.10s/it] 68%|██████▊ | 23273/34278 [25:32:45<11:37:23, 3.80s/it] {'loss': 0.1105, 'grad_norm': 0.8840749501854094, 'learning_rate': 2.468155265689977e-06, 'epoch': 0.68} 68%|██████▊ | 23273/34278 [25:32:45<11:37:23, 
3.80s/it] 68%|██████▊ | 23274/34278 [25:32:48<11:24:29, 3.73s/it] {'loss': 0.1145, 'grad_norm': 1.033047072091153, 'learning_rate': 2.467747889020495e-06, 'epoch': 0.68} 68%|██████▊ | 23274/34278 [25:32:48<11:24:29, 3.73s/it] 68%|██████▊ | 23275/34278 [25:32:51<10:25:08, 3.41s/it] {'loss': 0.1327, 'grad_norm': 1.0287247916203313, 'learning_rate': 2.4673405349583584e-06, 'epoch': 0.68} 68%|██████▊ | 23275/34278 [25:32:51<10:25:08, 3.41s/it] 68%|██████▊ | 23276/34278 [25:32:57<12:44:41, 4.17s/it] {'loss': 0.1206, 'grad_norm': 0.7547137343987765, 'learning_rate': 2.4669332035072015e-06, 'epoch': 0.68} 68%|██████▊ | 23276/34278 [25:32:57<12:44:41, 4.17s/it] 68%|██████▊ | 23277/34278 [25:33:00<11:55:49, 3.90s/it] {'loss': 0.1152, 'grad_norm': 0.8254254707409097, 'learning_rate': 2.4665258946706584e-06, 'epoch': 0.68} 68%|██████▊ | 23277/34278 [25:33:00<11:55:49, 3.90s/it] 68%|██████▊ | 23278/34278 [25:33:03<11:10:23, 3.66s/it] {'loss': 0.1147, 'grad_norm': 1.1018030493050908, 'learning_rate': 2.4661186084523687e-06, 'epoch': 0.68} 68%|██████▊ | 23278/34278 [25:33:03<11:10:23, 3.66s/it] 68%|██████▊ | 23279/34278 [25:33:06<10:41:41, 3.50s/it] {'loss': 0.1059, 'grad_norm': 0.9821404654476183, 'learning_rate': 2.465711344855967e-06, 'epoch': 0.68} 68%|██████▊ | 23279/34278 [25:33:06<10:41:41, 3.50s/it] 68%|██████▊ | 23280/34278 [25:33:10<10:58:52, 3.59s/it] {'loss': 0.1076, 'grad_norm': 0.7834246814515112, 'learning_rate': 2.4653041038850885e-06, 'epoch': 0.68} 68%|██████▊ | 23280/34278 [25:33:10<10:58:52, 3.59s/it] 68%|██████▊ | 23281/34278 [25:33:13<10:28:25, 3.43s/it] {'loss': 0.1245, 'grad_norm': 0.9638182775170588, 'learning_rate': 2.464896885543369e-06, 'epoch': 0.68} 68%|██████▊ | 23281/34278 [25:33:13<10:28:25, 3.43s/it] 68%|██████▊ | 23282/34278 [25:33:16<10:05:42, 3.31s/it] {'loss': 0.1158, 'grad_norm': 0.9872090026624819, 'learning_rate': 2.4644896898344474e-06, 'epoch': 0.68} 68%|██████▊ | 23282/34278 [25:33:16<10:05:42, 3.31s/it] 68%|██████▊ | 23283/34278 
[25:33:20<10:50:14, 3.55s/it] {'loss': 0.1208, 'grad_norm': 1.0236346161098253, 'learning_rate': 2.4640825167619565e-06, 'epoch': 0.68} 68%|██████▊ | 23283/34278 [25:33:20<10:50:14, 3.55s/it] 68%|██████▊ | 23284/34278 [25:33:24<10:58:38, 3.59s/it] {'loss': 0.1361, 'grad_norm': 0.8528396822851665, 'learning_rate': 2.4636753663295293e-06, 'epoch': 0.68} 68%|██████▊ | 23284/34278 [25:33:24<10:58:38, 3.59s/it] 68%|██████▊ | 23285/34278 [25:33:28<11:07:47, 3.64s/it] {'loss': 0.135, 'grad_norm': 1.0815056572565798, 'learning_rate': 2.463268238540805e-06, 'epoch': 0.68} 68%|██████▊ | 23285/34278 [25:33:28<11:07:47, 3.64s/it] 68%|██████▊ | 23286/34278 [25:33:31<11:11:04, 3.66s/it] {'loss': 0.1169, 'grad_norm': 0.8551548620165316, 'learning_rate': 2.4628611333994147e-06, 'epoch': 0.68} 68%|██████▊ | 23286/34278 [25:33:31<11:11:04, 3.66s/it] 68%|██████▊ | 23287/34278 [25:33:35<10:42:19, 3.51s/it] {'loss': 0.1496, 'grad_norm': 1.1690525290461162, 'learning_rate': 2.462454050908994e-06, 'epoch': 0.68} 68%|██████▊ | 23287/34278 [25:33:35<10:42:19, 3.51s/it] 68%|██████▊ | 23288/34278 [25:33:40<12:53:08, 4.22s/it] {'loss': 0.1404, 'grad_norm': 1.056309387325001, 'learning_rate': 2.4620469910731805e-06, 'epoch': 0.68} 68%|██████▊ | 23288/34278 [25:33:40<12:53:08, 4.22s/it] 68%|██████▊ | 23289/34278 [25:33:47<14:38:18, 4.80s/it] {'loss': 0.0967, 'grad_norm': 0.8149182654688826, 'learning_rate': 2.461639953895605e-06, 'epoch': 0.68} 68%|██████▊ | 23289/34278 [25:33:47<14:38:18, 4.80s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f216b9ed620>
Failed to fetch sample 2667586.
Exception: cannot identify image file <_io.BytesIO object at 0x7f216b9ed620>
{'loss': 0.1298, 'grad_norm': 0.7322182970506431, 'learning_rate': 2.4612329393799e-06, 'epoch': 0.68} 68%|██████▊ | 23290/34278 [25:33:52<15:04:07, 4.94s/it]
{'loss': 0.142, 'grad_norm': 0.9476505209053838, 'learning_rate': 2.460825947529703e-06, 'epoch': 0.68} 68%|██████▊ | 23291/34278 [25:33:55<13:36:22, 4.46s/it]
{'loss': 0.1106, 'grad_norm': 0.9886733701256886, 'learning_rate': 2.4604189783486445e-06, 'epoch': 0.68} 68%|██████▊ | 23292/34278 [25:33:58<12:13:51, 4.01s/it]
{'loss': 0.1278, 'grad_norm': 0.8564438238412084, 'learning_rate': 2.4600120318403607e-06, 'epoch': 0.68} 68%|██████▊ | 23293/34278 [25:34:04<13:34:41, 4.45s/it]
{'loss': 0.1189, 'grad_norm': 0.8189582019618686, 'learning_rate': 2.4596051080084814e-06, 'epoch': 0.68} 68%|██████▊ 
| 23294/34278 [25:34:07<12:52:08, 4.22s/it] 68%|██████▊ | 23295/34278 [25:34:13<14:29:50, 4.75s/it] {'loss': 0.111, 'grad_norm': 1.2801099694260838, 'learning_rate': 2.4591982068566427e-06, 'epoch': 0.68} 68%|██████▊ | 23295/34278 [25:34:13<14:29:50, 4.75s/it] 68%|██████▊ | 23296/34278 [25:34:16<12:37:06, 4.14s/it] {'loss': 0.1099, 'grad_norm': 1.0073932176205675, 'learning_rate': 2.458791328388477e-06, 'epoch': 0.68} 68%|██████▊ | 23296/34278 [25:34:16<12:37:06, 4.14s/it] 68%|██████▊ | 23297/34278 [25:34:21<13:18:20, 4.36s/it] {'loss': 0.1475, 'grad_norm': 1.0406894851352735, 'learning_rate': 2.4583844726076124e-06, 'epoch': 0.68} 68%|██████▊ | 23297/34278 [25:34:21<13:18:20, 4.36s/it] 68%|██████▊ | 23298/34278 [25:34:27<14:45:33, 4.84s/it] {'loss': 0.1234, 'grad_norm': 0.8681268505669584, 'learning_rate': 2.4579776395176853e-06, 'epoch': 0.68} 68%|██████▊ | 23298/34278 [25:34:27<14:45:33, 4.84s/it] 68%|██████▊ | 23299/34278 [25:34:30<13:17:59, 4.36s/it] {'loss': 0.1345, 'grad_norm': 0.8673681118810458, 'learning_rate': 2.457570829122329e-06, 'epoch': 0.68} 68%|██████▊ | 23299/34278 [25:34:30<13:17:59, 4.36s/it] 68%|██████▊ | 23300/34278 [25:34:36<14:44:42, 4.84s/it] {'loss': 0.1229, 'grad_norm': 1.1524444999473435, 'learning_rate': 2.457164041425173e-06, 'epoch': 0.68} 68%|██████▊ | 23300/34278 [25:34:36<14:44:42, 4.84s/it] 68%|██████▊ | 23301/34278 [25:34:40<13:59:47, 4.59s/it] {'loss': 0.1173, 'grad_norm': 0.7966303620764652, 'learning_rate': 2.4567572764298476e-06, 'epoch': 0.68} 68%|██████▊ | 23301/34278 [25:34:40<13:59:47, 4.59s/it] 68%|██████▊ | 23302/34278 [25:34:44<13:07:09, 4.30s/it] {'loss': 0.1459, 'grad_norm': 1.0292047862244138, 'learning_rate': 2.456350534139988e-06, 'epoch': 0.68} 68%|██████▊ | 23302/34278 [25:34:44<13:07:09, 4.30s/it] 68%|██████▊ | 23303/34278 [25:34:47<12:01:27, 3.94s/it] {'loss': 0.1225, 'grad_norm': 0.7958008433308223, 'learning_rate': 2.4559438145592234e-06, 'epoch': 0.68} 68%|██████▊ | 23303/34278 [25:34:47<12:01:27, 
3.94s/it] 68%|██████▊ | 23304/34278 [25:34:50<11:16:17, 3.70s/it] {'loss': 0.1108, 'grad_norm': 1.1017273521197222, 'learning_rate': 2.4555371176911817e-06, 'epoch': 0.68} 68%|██████▊ | 23304/34278 [25:34:50<11:16:17, 3.70s/it] 68%|██████▊ | 23305/34278 [25:34:53<10:45:17, 3.53s/it] {'loss': 0.0946, 'grad_norm': 0.7683430539172763, 'learning_rate': 2.4551304435395007e-06, 'epoch': 0.68} 68%|██████▊ | 23305/34278 [25:34:53<10:45:17, 3.53s/it] 68%|██████▊ | 23306/34278 [25:34:56<10:19:46, 3.39s/it] {'loss': 0.1348, 'grad_norm': 0.9074795400999534, 'learning_rate': 2.4547237921078077e-06, 'epoch': 0.68} 68%|██████▊ | 23306/34278 [25:34:56<10:19:46, 3.39s/it] 68%|██████▊ | 23307/34278 [25:34:59<10:19:00, 3.39s/it] {'loss': 0.1034, 'grad_norm': 0.7755105568608713, 'learning_rate': 2.4543171633997314e-06, 'epoch': 0.68} 68%|██████▊ | 23307/34278 [25:34:59<10:19:00, 3.39s/it] 68%|██████▊ | 23308/34278 [25:35:03<10:37:14, 3.49s/it] {'loss': 0.1347, 'grad_norm': 0.9758079951548948, 'learning_rate': 2.4539105574189052e-06, 'epoch': 0.68} 68%|██████▊ | 23308/34278 [25:35:03<10:37:14, 3.49s/it] 68%|██████▊ | 23309/34278 [25:35:06<10:21:48, 3.40s/it] {'loss': 0.1168, 'grad_norm': 0.9060779365644929, 'learning_rate': 2.453503974168958e-06, 'epoch': 0.68} 68%|██████▊ | 23309/34278 [25:35:06<10:21:48, 3.40s/it] 68%|██████▊ | 23310/34278 [25:35:10<10:12:39, 3.35s/it] {'loss': 0.1333, 'grad_norm': 0.9526106057205321, 'learning_rate': 2.453097413653518e-06, 'epoch': 0.68} 68%|██████▊ | 23310/34278 [25:35:10<10:12:39, 3.35s/it] 68%|██████▊ | 23311/34278 [25:35:13<10:35:41, 3.48s/it] {'loss': 0.1195, 'grad_norm': 0.7123504495795431, 'learning_rate': 2.4526908758762156e-06, 'epoch': 0.68} 68%|██████▊ | 23311/34278 [25:35:13<10:35:41, 3.48s/it] 68%|██████▊ | 23312/34278 [25:35:17<10:51:46, 3.57s/it] {'loss': 0.1203, 'grad_norm': 0.9783550740959671, 'learning_rate': 2.4522843608406834e-06, 'epoch': 0.68} 68%|██████▊ | 23312/34278 [25:35:17<10:51:46, 3.57s/it] 68%|██████▊ | 23313/34278 
[25:35:20<10:18:23, 3.38s/it] {'loss': 0.1185, 'grad_norm': 0.7570682363192021, 'learning_rate': 2.451877868550548e-06, 'epoch': 0.68} 68%|██████▊ | 23313/34278 [25:35:20<10:18:23, 3.38s/it] 68%|██████▊ | 23314/34278 [25:35:23<9:52:20, 3.24s/it] {'loss': 0.1077, 'grad_norm': 0.7957456151917327, 'learning_rate': 2.451471399009437e-06, 'epoch': 0.68} 68%|██████▊ | 23314/34278 [25:35:23<9:52:20, 3.24s/it] 68%|██████▊ | 23315/34278 [25:35:26<9:58:04, 3.27s/it] {'loss': 0.1129, 'grad_norm': 0.7098774031726814, 'learning_rate': 2.4510649522209825e-06, 'epoch': 0.68} 68%|██████▊ | 23315/34278 [25:35:26<9:58:04, 3.27s/it] 68%|██████▊ | 23316/34278 [25:35:30<10:23:04, 3.41s/it] {'loss': 0.1103, 'grad_norm': 0.7862130562280645, 'learning_rate': 2.4506585281888096e-06, 'epoch': 0.68} 68%|██████▊ | 23316/34278 [25:35:30<10:23:04, 3.41s/it] 68%|██████▊ | 23317/34278 [25:35:33<10:12:36, 3.35s/it] {'loss': 0.1062, 'grad_norm': 0.7523670724296232, 'learning_rate': 2.450252126916549e-06, 'epoch': 0.68} 68%|██████▊ | 23317/34278 [25:35:33<10:12:36, 3.35s/it] 68%|██████▊ | 23318/34278 [25:35:39<12:21:57, 4.06s/it] {'loss': 0.1055, 'grad_norm': 0.8176813521525674, 'learning_rate': 2.449845748407831e-06, 'epoch': 0.68} 68%|██████▊ | 23318/34278 [25:35:39<12:21:57, 4.06s/it] 68%|██████▊ | 23319/34278 [25:35:42<11:30:17, 3.78s/it] {'loss': 0.1234, 'grad_norm': 0.8821584704883535, 'learning_rate': 2.4494393926662807e-06, 'epoch': 0.68} 68%|██████▊ | 23319/34278 [25:35:42<11:30:17, 3.78s/it] 68%|██████▊ | 23320/34278 [25:35:45<10:40:28, 3.51s/it] {'loss': 0.1178, 'grad_norm': 0.7341871562175372, 'learning_rate': 2.4490330596955254e-06, 'epoch': 0.68} 68%|██████▊ | 23320/34278 [25:35:45<10:40:28, 3.51s/it] 68%|██████▊ | 23321/34278 [25:35:48<10:05:51, 3.32s/it] {'loss': 0.1069, 'grad_norm': 0.7906863096959323, 'learning_rate': 2.4486267494991956e-06, 'epoch': 0.68} 68%|██████▊ | 23321/34278 [25:35:48<10:05:51, 3.32s/it] 68%|██████▊ | 23322/34278 [25:35:53<11:17:21, 3.71s/it] {'loss': 
0.1163, 'grad_norm': 0.8430909846833469, 'learning_rate': 2.4482204620809154e-06, 'epoch': 0.68} 68%|██████▊ | 23322/34278 [25:35:53<11:17:21, 3.71s/it]
{'loss': 0.1032, 'grad_norm': 0.9695821294279957, 'learning_rate': 2.4478141974443148e-06, 'epoch': 0.68} 68%|██████▊ | 23323/34278 [25:35:59<13:30:51, 4.44s/it]
{'loss': 0.126, 'grad_norm': 0.8744977225367986, 'learning_rate': 2.4474079555930186e-06, 'epoch': 0.68} 68%|██████▊ | 23324/34278 [25:36:05<15:07:09, 4.97s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
{'loss': 0.0942, 'grad_norm': 0.789867314354223, 'learning_rate': 2.447001736530657e-06, 'epoch': 0.68} 68%|██████▊ | 23325/34278 [25:36:08<13:18:34, 4.37s/it]
{'loss': 0.1289, 'grad_norm': 0.8290402506456502, 'learning_rate': 2.446595540260854e-06, 'epoch': 0.68} 68%|██████▊ | 23326/34278 [25:36:12<12:51:12, 4.23s/it]
{'loss': 0.1369, 'grad_norm': 0.8054387956901801, 'learning_rate': 2.446189366787235e-06, 'epoch': 0.68} 68%|██████▊ | 23327/34278 [25:36:15<11:45:11, 3.86s/it]
{'loss': 0.1257, 'grad_norm': 0.902868084152141, 'learning_rate': 2.445783216113427e-06, 'epoch': 0.68} 68%|██████▊ | 23328/34278 [25:36:18<11:13:52, 3.69s/it]
{'loss': 0.1069, 'grad_norm': 0.837937168979911, 'learning_rate': 2.445377088243059e-06, 'epoch': 0.68} 68%|██████▊ | 23329/34278 [25:36:22<11:05:43, 3.65s/it]
68%|██████▊ | 23330/34278 [25:36:27<12:58:55, 4.27s/it] {'loss': 0.1297, 'grad_norm': 0.7170138498137587, 
'learning_rate': 2.4449709831797546e-06, 'epoch': 0.68} 68%|██████▊ | 23330/34278 [25:36:27<12:58:55, 4.27s/it] 68%|██████▊ | 23331/34278 [25:36:31<12:14:56, 4.03s/it] {'loss': 0.1198, 'grad_norm': 0.7305979998787503, 'learning_rate': 2.4445649009271373e-06, 'epoch': 0.68} 68%|██████▊ | 23331/34278 [25:36:31<12:14:56, 4.03s/it] 68%|██████▊ | 23332/34278 [25:36:34<11:07:09, 3.66s/it] {'loss': 0.1161, 'grad_norm': 0.8470847097306455, 'learning_rate': 2.444158841488836e-06, 'epoch': 0.68} 68%|██████▊ | 23332/34278 [25:36:34<11:07:09, 3.66s/it] 68%|██████▊ | 23333/34278 [25:36:39<13:09:32, 4.33s/it] {'loss': 0.1345, 'grad_norm': 0.8936664292927026, 'learning_rate': 2.4437528048684757e-06, 'epoch': 0.68} 68%|██████▊ | 23333/34278 [25:36:40<13:09:32, 4.33s/it] 68%|██████▊ | 23334/34278 [25:36:43<12:05:31, 3.98s/it] {'loss': 0.143, 'grad_norm': 0.8879470216622817, 'learning_rate': 2.4433467910696752e-06, 'epoch': 0.68} 68%|██████▊ | 23334/34278 [25:36:43<12:05:31, 3.98s/it] 68%|██████▊ | 23335/34278 [25:36:46<11:17:32, 3.71s/it] {'loss': 0.1298, 'grad_norm': 0.8787725909890597, 'learning_rate': 2.442940800096068e-06, 'epoch': 0.68} 68%|██████▊ | 23335/34278 [25:36:46<11:17:32, 3.71s/it] 68%|██████▊ | 23336/34278 [25:36:49<11:08:36, 3.67s/it] {'loss': 0.1059, 'grad_norm': 0.8491824379892106, 'learning_rate': 2.4425348319512753e-06, 'epoch': 0.68} 68%|██████▊ | 23336/34278 [25:36:49<11:08:36, 3.67s/it] 68%|██████▊ | 23337/34278 [25:36:52<10:22:51, 3.42s/it] {'loss': 0.1427, 'grad_norm': 0.9070472465815405, 'learning_rate': 2.4421288866389193e-06, 'epoch': 0.68} 68%|██████▊ | 23337/34278 [25:36:52<10:22:51, 3.42s/it] 68%|██████▊ | 23338/34278 [25:36:57<11:23:08, 3.75s/it] {'loss': 0.1233, 'grad_norm': 0.8201645344592756, 'learning_rate': 2.441722964162628e-06, 'epoch': 0.68} 68%|██████▊ | 23338/34278 [25:36:57<11:23:08, 3.75s/it] 68%|██████▊ | 23339/34278 [25:36:59<10:27:32, 3.44s/it] {'loss': 0.0976, 'grad_norm': 0.673680317420867, 'learning_rate': 2.441317064526023e-06, 
'epoch': 0.68} 68%|██████▊ | 23339/34278 [25:36:59<10:27:32, 3.44s/it] 68%|██████▊ | 23340/34278 [25:37:02<10:01:41, 3.30s/it] {'loss': 0.109, 'grad_norm': 0.7401891538857096, 'learning_rate': 2.440911187732727e-06, 'epoch': 0.68} 68%|██████▊ | 23340/34278 [25:37:02<10:01:41, 3.30s/it] 68%|██████▊ | 23341/34278 [25:37:06<10:25:29, 3.43s/it] {'loss': 0.121, 'grad_norm': 0.8609853245740076, 'learning_rate': 2.440505333786364e-06, 'epoch': 0.68} 68%|██████▊ | 23341/34278 [25:37:06<10:25:29, 3.43s/it] 68%|██████▊ | 23342/34278 [25:37:09<10:15:17, 3.38s/it] {'loss': 0.1045, 'grad_norm': 0.852046250461729, 'learning_rate': 2.4400995026905612e-06, 'epoch': 0.68} 68%|██████▊ | 23342/34278 [25:37:09<10:15:17, 3.38s/it] 68%|██████▊ | 23343/34278 [25:37:13<10:06:33, 3.33s/it] {'loss': 0.1028, 'grad_norm': 0.9457424621288142, 'learning_rate': 2.4396936944489384e-06, 'epoch': 0.68} 68%|██████▊ | 23343/34278 [25:37:13<10:06:33, 3.33s/it] 68%|██████▊ | 23344/34278 [25:37:16<9:59:41, 3.29s/it] {'loss': 0.1303, 'grad_norm': 0.8938133279893553, 'learning_rate': 2.439287909065118e-06, 'epoch': 0.68} 68%|██████▊ | 23344/34278 [25:37:16<9:59:41, 3.29s/it] 68%|██████▊ | 23345/34278 [25:37:19<10:20:27, 3.41s/it] {'loss': 0.136, 'grad_norm': 0.9570157452775915, 'learning_rate': 2.4388821465427252e-06, 'epoch': 0.68} 68%|██████▊ | 23345/34278 [25:37:19<10:20:27, 3.41s/it] 68%|██████▊ | 23346/34278 [25:37:23<10:10:55, 3.35s/it] {'loss': 0.1152, 'grad_norm': 0.791793052052301, 'learning_rate': 2.4384764068853796e-06, 'epoch': 0.68} 68%|██████▊ | 23346/34278 [25:37:23<10:10:55, 3.35s/it] 68%|██████▊ | 23347/34278 [25:37:26<9:54:42, 3.26s/it] {'loss': 0.1149, 'grad_norm': 1.0417459373225932, 'learning_rate': 2.4380706900967043e-06, 'epoch': 0.68} 68%|██████▊ | 23347/34278 [25:37:26<9:54:42, 3.26s/it] 68%|██████▊ | 23348/34278 [25:37:31<11:39:26, 3.84s/it] {'loss': 0.1345, 'grad_norm': 1.1372400470501136, 'learning_rate': 2.437664996180325e-06, 'epoch': 0.68} 68%|██████▊ | 23348/34278 
[25:37:31<11:39:26, 3.84s/it] 68%|██████▊ | 23349/34278 [25:37:34<10:41:30, 3.52s/it] {'loss': 0.1228, 'grad_norm': 1.0135776593769665, 'learning_rate': 2.437259325139861e-06, 'epoch': 0.68} 68%|██████▊ | 23349/34278 [25:37:34<10:41:30, 3.52s/it] 68%|██████▊ | 23350/34278 [25:37:37<10:20:26, 3.41s/it] {'loss': 0.1064, 'grad_norm': 1.053747311183959, 'learning_rate': 2.436853676978932e-06, 'epoch': 0.68} 68%|██████▊ | 23350/34278 [25:37:37<10:20:26, 3.41s/it] 68%|██████▊ | 23351/34278 [25:37:40<10:00:02, 3.29s/it] {'loss': 0.1208, 'grad_norm': 1.011768761569344, 'learning_rate': 2.436448051701163e-06, 'epoch': 0.68} 68%|██████▊ | 23351/34278 [25:37:40<10:00:02, 3.29s/it] 68%|██████▊ | 23352/34278 [25:37:44<10:22:51, 3.42s/it] {'loss': 0.1082, 'grad_norm': 1.1087044230681251, 'learning_rate': 2.436042449310172e-06, 'epoch': 0.68} 68%|██████▊ | 23352/34278 [25:37:44<10:22:51, 3.42s/it] 68%|██████▊ | 23353/34278 [25:37:47<9:58:25, 3.29s/it] {'loss': 0.1215, 'grad_norm': 1.2907939712860053, 'learning_rate': 2.4356368698095838e-06, 'epoch': 0.68} 68%|██████▊ | 23353/34278 [25:37:47<9:58:25, 3.29s/it] 68%|██████▊ | 23354/34278 [25:37:51<10:43:48, 3.54s/it] {'loss': 0.1348, 'grad_norm': 0.9932325754634937, 'learning_rate': 2.435231313203016e-06, 'epoch': 0.68} 68%|██████▊ | 23354/34278 [25:37:51<10:43:48, 3.54s/it] 68%|██████▊ | 23355/34278 [25:37:54<10:29:03, 3.46s/it] {'loss': 0.1049, 'grad_norm': 1.09646618133375, 'learning_rate': 2.4348257794940925e-06, 'epoch': 0.68} 68%|██████▊ | 23355/34278 [25:37:54<10:29:03, 3.46s/it] 68%|██████▊ | 23356/34278 [25:37:58<10:39:13, 3.51s/it] {'loss': 0.1439, 'grad_norm': 0.9649665998018226, 'learning_rate': 2.4344202686864323e-06, 'epoch': 0.68} 68%|██████▊ | 23356/34278 [25:37:58<10:39:13, 3.51s/it] 68%|██████▊ | 23357/34278 [25:38:01<10:15:35, 3.38s/it] {'loss': 0.1191, 'grad_norm': 0.7197520872665651, 'learning_rate': 2.434014780783653e-06, 'epoch': 0.68} 68%|██████▊ | 23357/34278 [25:38:01<10:15:35, 3.38s/it] 68%|██████▊ | 
23358/34278 [25:38:04<10:24:21, 3.43s/it]
{'loss': 0.1028, 'grad_norm': 1.0539640534846044, 'learning_rate': 2.4336093157893774e-06, 'epoch': 0.68} 68%|██████▊ | 23358/34278 [25:38:04<10:24:21, 3.43s/it]
{'loss': 0.1293, 'grad_norm': 0.8796372851471373, 'learning_rate': 2.433203873707227e-06, 'epoch': 0.68} 68%|██████▊ | 23359/34278 [25:38:07<10:08:58, 3.35s/it]
{'loss': 0.1033, 'grad_norm': 0.6840906635886067, 'learning_rate': 2.4327984545408203e-06, 'epoch': 0.68} 68%|██████▊ | 23360/34278 [25:38:11<10:10:18, 3.35s/it]
{'loss': 0.1202, 'grad_norm': 0.6951351396440043, 'learning_rate': 2.4323930582937737e-06, 'epoch': 0.68} 68%|██████▊ | 23361/34278 [25:38:17<12:57:23, 4.27s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
{'loss': 0.1125, 'grad_norm': 0.8606714595639731, 'learning_rate': 2.4319876849697112e-06, 'epoch': 0.68} 68%|██████▊ | 23362/34278 [25:38:20<11:52:28, 3.92s/it]
{'loss': 0.1281, 'grad_norm': 0.9876280544161635, 'learning_rate': 2.431582334572249e-06, 'epoch': 0.68} 68%|██████▊ | 23363/34278 [25:38:24<11:44:46, 3.87s/it]
{'loss': 0.1073, 'grad_norm': 0.8772466950283561, 'learning_rate': 2.4311770071050035e-06, 'epoch': 0.68} 68%|██████▊ | 23364/34278 [25:38:27<11:00:42, 3.63s/it]
{'loss': 0.1002, 'grad_norm': 0.6179190694491767, 'learning_rate': 2.430771702571599e-06, 'epoch': 0.68} 68%|██████▊ | 23365/34278 [25:38:30<10:23:49, 3.43s/it]
{'loss': 0.1166, 'grad_norm': 0.7588800959108316, 'learning_rate': 2.4303664209756526e-06, 'epoch': 0.68} 68%|██████▊ | 23366/34278 [25:38:33<10:08:43, 3.35s/it]
{'loss': 0.1236, 'grad_norm': 1.0027587154496904, 'learning_rate': 2.42996116232078e-06, 'epoch': 0.68} 68%|██████▊ | 23367/34278 [25:38:37<10:36:26, 3.50s/it]
{'loss': 0.1244, 'grad_norm': 0.8885632551773563, 'learning_rate': 2.429555926610601e-06, 'epoch': 0.68} 68%|██████▊ | 23368/34278 [25:38:43<13:14:55, 4.37s/it]
{'loss': 0.1096, 'grad_norm': 0.6295678571108029, 'learning_rate': 2.429150713848734e-06, 'epoch': 0.68} 68%|██████▊ | 23369/34278 [25:38:48<13:01:27, 4.30s/it]
{'loss': 0.1164, 'grad_norm': 0.7654281934622027, 'learning_rate': 2.428745524038794e-06, 'epoch': 0.68} 68%|██████▊ | 23370/34278 [25:38:51<12:32:06, 4.14s/it] 
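The recurring `torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True` typically means gradient checkpointing wrapped a segment whose inputs carry no grad (e.g., features from a frozen module), so that segment returns None gradients. A minimal sketch of the two usual PyTorch remedies, marking the checkpoint-boundary input as requiring grad and using the non-reentrant implementation; this is general advice inferred from the warning, not taken from this training code (in HF Transformers the analogous switch is `model.enable_input_require_grads()`):

```python
import torch
from torch.utils.checkpoint import checkpoint

layer = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # requires_grad=False, like output of a frozen tower

# Remedy 1: make the input at the checkpoint boundary require grad so the
# recomputed segment stays attached to the autograd graph.
x.requires_grad_(True)

# Remedy 2: prefer the non-reentrant checkpoint implementation, which
# handles grad-less inputs more gracefully than the reentrant one.
y = checkpoint(layer, x, use_reentrant=False)
y.sum().backward()
assert layer.weight.grad is not None  # gradients flow; no warning emitted
```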
68%|██████▊ | 23371/34278 [25:38:55<12:02:38, 3.98s/it] {'loss': 0.1033, 'grad_norm': 0.7794154015509821, 'learning_rate': 2.4283403571843994e-06, 'epoch': 0.68} 68%|██████▊ | 23371/34278 [25:38:55<12:02:38, 3.98s/it] 68%|██████▊ | 23372/34278 [25:38:58<11:11:19, 3.69s/it] {'loss': 0.1308, 'grad_norm': 0.8584749842108826, 'learning_rate': 2.4279352132891705e-06, 'epoch': 0.68} 68%|██████▊ | 23372/34278 [25:38:58<11:11:19, 3.69s/it] 68%|██████▊ | 23373/34278 [25:39:04<13:16:45, 4.38s/it] {'loss': 0.1203, 'grad_norm': 0.7726199129739065, 'learning_rate': 2.427530092356722e-06, 'epoch': 0.68} 68%|██████▊ | 23373/34278 [25:39:04<13:16:45, 4.38s/it] 68%|██████▊ | 23374/34278 [25:39:09<14:10:48, 4.68s/it] {'loss': 0.1148, 'grad_norm': 0.8598346354331359, 'learning_rate': 2.427124994390669e-06, 'epoch': 0.68} 68%|██████▊ | 23374/34278 [25:39:09<14:10:48, 4.68s/it] 68%|██████▊ | 23375/34278 [25:39:12<12:36:30, 4.16s/it] {'loss': 0.1044, 'grad_norm': 0.7132961658887523, 'learning_rate': 2.4267199193946313e-06, 'epoch': 0.68} 68%|██████▊ | 23375/34278 [25:39:12<12:36:30, 4.16s/it] 68%|██████▊ | 23376/34278 [25:39:16<12:12:11, 4.03s/it] {'loss': 0.1567, 'grad_norm': 0.8904430262186038, 'learning_rate': 2.426314867372222e-06, 'epoch': 0.68} 68%|██████▊ | 23376/34278 [25:39:16<12:12:11, 4.03s/it] 68%|██████▊ | 23377/34278 [25:39:19<11:07:27, 3.67s/it] {'loss': 0.1299, 'grad_norm': 0.816336462451866, 'learning_rate': 2.4259098383270596e-06, 'epoch': 0.68} 68%|██████▊ | 23377/34278 [25:39:19<11:07:27, 3.67s/it] 68%|██████▊ | 23378/34278 [25:39:22<10:37:06, 3.51s/it] {'loss': 0.0963, 'grad_norm': 0.7746041696509879, 'learning_rate': 2.425504832262761e-06, 'epoch': 0.68} 68%|██████▊ | 23378/34278 [25:39:22<10:37:06, 3.51s/it] 68%|██████▊ | 23379/34278 [25:39:25<10:28:43, 3.46s/it] {'loss': 0.1102, 'grad_norm': 0.8023654822258212, 'learning_rate': 2.4250998491829414e-06, 'epoch': 0.68} 68%|██████▊ | 23379/34278 [25:39:25<10:28:43, 3.46s/it] 68%|██████▊ | 23380/34278 
[25:39:29<10:25:51, 3.45s/it] {'loss': 0.1295, 'grad_norm': 1.3227585921290468, 'learning_rate': 2.424694889091213e-06, 'epoch': 0.68} 68%|██████▊ | 23380/34278 [25:39:29<10:25:51, 3.45s/it] 68%|██████▊ | 23381/34278 [25:39:32<10:28:42, 3.46s/it] {'loss': 0.1253, 'grad_norm': 0.9538132455914216, 'learning_rate': 2.4242899519911966e-06, 'epoch': 0.68} 68%|██████▊ | 23381/34278 [25:39:32<10:28:42, 3.46s/it] 68%|██████▊ | 23382/34278 [25:39:38<12:49:16, 4.24s/it] {'loss': 0.108, 'grad_norm': 0.7747601742899379, 'learning_rate': 2.423885037886502e-06, 'epoch': 0.68} 68%|██████▊ | 23382/34278 [25:39:38<12:49:16, 4.24s/it] 68%|██████▊ | 23383/34278 [25:39:43<13:23:42, 4.43s/it] {'loss': 0.1091, 'grad_norm': 0.7359015483255598, 'learning_rate': 2.4234801467807487e-06, 'epoch': 0.68} 68%|██████▊ | 23383/34278 [25:39:43<13:23:42, 4.43s/it] 68%|██████▊ | 23384/34278 [25:39:46<12:22:37, 4.09s/it] {'loss': 0.1227, 'grad_norm': 0.8063697329646766, 'learning_rate': 2.4230752786775485e-06, 'epoch': 0.68} 68%|██████▊ | 23384/34278 [25:39:46<12:22:37, 4.09s/it] 68%|██████▊ | 23385/34278 [25:39:50<11:37:11, 3.84s/it] {'loss': 0.1132, 'grad_norm': 1.1138567153813508, 'learning_rate': 2.4226704335805186e-06, 'epoch': 0.68} 68%|██████▊ | 23385/34278 [25:39:50<11:37:11, 3.84s/it] 68%|██████▊ | 23386/34278 [25:39:53<10:59:11, 3.63s/it] {'loss': 0.1253, 'grad_norm': 0.7221672197416532, 'learning_rate': 2.4222656114932713e-06, 'epoch': 0.68} 68%|██████▊ | 23386/34278 [25:39:53<10:59:11, 3.63s/it] 68%|██████▊ | 23387/34278 [25:39:56<10:28:00, 3.46s/it] {'loss': 0.125, 'grad_norm': 0.6745835099362463, 'learning_rate': 2.42186081241942e-06, 'epoch': 0.68} 68%|██████▊ | 23387/34278 [25:39:56<10:28:00, 3.46s/it] 68%|██████▊ | 23388/34278 [25:39:59<10:18:27, 3.41s/it] {'loss': 0.1053, 'grad_norm': 0.7813051552278909, 'learning_rate': 2.4214560363625794e-06, 'epoch': 0.68} 68%|██████▊ | 23388/34278 [25:39:59<10:18:27, 3.41s/it] 68%|██████▊ | 23389/34278 [25:40:03<10:17:32, 3.40s/it] {'loss': 
0.1277, 'grad_norm': 0.8889806545809844, 'learning_rate': 2.421051283326366e-06, 'epoch': 0.68} 68%|██████▊ | 23389/34278 [25:40:03<10:17:32, 3.40s/it] 68%|██████▊ | 23390/34278 [25:40:06<10:00:28, 3.31s/it] {'loss': 0.1298, 'grad_norm': 0.7596732140411573, 'learning_rate': 2.4206465533143906e-06, 'epoch': 0.68} 68%|██████▊ | 23390/34278 [25:40:06<10:00:28, 3.31s/it] 68%|██████▊ | 23391/34278 [25:40:09<9:52:20, 3.26s/it] {'loss': 0.1117, 'grad_norm': 0.7427801645031388, 'learning_rate': 2.420241846330266e-06, 'epoch': 0.68} 68%|██████▊ | 23391/34278 [25:40:09<9:52:20, 3.26s/it] 68%|██████▊ | 23392/34278 [25:40:12<10:07:04, 3.35s/it] {'loss': 0.1102, 'grad_norm': 0.7627876825591198, 'learning_rate': 2.4198371623776077e-06, 'epoch': 0.68} 68%|██████▊ | 23392/34278 [25:40:12<10:07:04, 3.35s/it] 68%|██████▊ | 23393/34278 [25:40:16<10:43:36, 3.55s/it] {'loss': 0.1354, 'grad_norm': 0.9159339260042754, 'learning_rate': 2.4194325014600254e-06, 'epoch': 0.68} 68%|██████▊ | 23393/34278 [25:40:16<10:43:36, 3.55s/it] 68%|██████▊ | 23394/34278 [25:40:20<10:34:53, 3.50s/it] {'loss': 0.1204, 'grad_norm': 0.8578802680763596, 'learning_rate': 2.4190278635811336e-06, 'epoch': 0.68} 68%|██████▊ | 23394/34278 [25:40:20<10:34:53, 3.50s/it] 68%|██████▊ | 23395/34278 [25:40:23<10:36:32, 3.51s/it] {'loss': 0.1217, 'grad_norm': 1.0237255170517903, 'learning_rate': 2.418623248744547e-06, 'epoch': 0.68} 68%|██████▊ | 23395/34278 [25:40:23<10:36:32, 3.51s/it] 68%|██████▊ | 23396/34278 [25:40:26<10:09:01, 3.36s/it] {'loss': 0.1173, 'grad_norm': 0.7705445245732929, 'learning_rate': 2.4182186569538763e-06, 'epoch': 0.68} 68%|██████▊ | 23396/34278 [25:40:26<10:09:01, 3.36s/it] 68%|██████▊ | 23397/34278 [25:40:31<11:05:04, 3.67s/it] {'loss': 0.1085, 'grad_norm': 1.022543322282196, 'learning_rate': 2.4178140882127304e-06, 'epoch': 0.68} 68%|██████▊ | 23397/34278 [25:40:31<11:05:04, 3.67s/it] 68%|██████▊ | 23398/34278 [25:40:34<10:25:46, 3.45s/it] {'loss': 0.105, 'grad_norm': 0.8291564370473499, 
'learning_rate': 2.4174095425247263e-06, 'epoch': 0.68} 68%|██████▊ | 23398/34278 [25:40:34<10:25:46, 3.45s/it] 68%|██████▊ | 23399/34278 [25:40:37<10:10:20, 3.37s/it] {'loss': 0.1126, 'grad_norm': 0.8308205991615312, 'learning_rate': 2.4170050198934707e-06, 'epoch': 0.68} 68%|██████▊ | 23399/34278 [25:40:37<10:10:20, 3.37s/it] 68%|██████▊ | 23400/34278 [25:40:40<9:34:34, 3.17s/it] {'loss': 0.1048, 'grad_norm': 1.0059662857379013, 'learning_rate': 2.4166005203225803e-06, 'epoch': 0.68} 68%|██████▊ | 23400/34278 [25:40:40<9:34:34, 3.17s/it] 68%|██████▊ | 23401/34278 [25:40:46<12:15:16, 4.06s/it] {'loss': 0.1321, 'grad_norm': 0.7635590561700802, 'learning_rate': 2.416196043815662e-06, 'epoch': 0.68} 68%|██████▊ | 23401/34278 [25:40:46<12:15:16, 4.06s/it] 68%|██████▊ | 23402/34278 [25:40:48<11:05:11, 3.67s/it] {'loss': 0.1234, 'grad_norm': 0.91562126553842, 'learning_rate': 2.4157915903763295e-06, 'epoch': 0.68} 68%|██████▊ | 23402/34278 [25:40:48<11:05:11, 3.67s/it] 68%|██████▊ | 23403/34278 [25:40:54<12:34:19, 4.16s/it] {'loss': 0.0964, 'grad_norm': 0.8778772330689085, 'learning_rate': 2.4153871600081936e-06, 'epoch': 0.68} 68%|██████▊ | 23403/34278 [25:40:54<12:34:19, 4.16s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
68%|██████▊ | 23404/34278 [25:40:57<11:39:47, 3.86s/it] {'loss': 0.1201, 'grad_norm': 0.7183699973220936, 'learning_rate': 2.414982752714862e-06, 'epoch': 0.68} 68%|██████▊ | 23404/34278 [25:40:57<11:39:47, 3.86s/it] 68%|██████▊ | 23405/34278 [25:41:03<13:30:10, 4.47s/it] {'loss': 0.1278, 'grad_norm': 1.0354123592567526, 'learning_rate': 2.4145783684999472e-06, 'epoch': 0.68} 68%|██████▊ | 23405/34278 [25:41:03<13:30:10, 4.47s/it] 68%|██████▊ | 23406/34278 [25:41:06<12:31:02, 4.14s/it] {'loss': 0.141, 'grad_norm': 0.9630908241330794, 'learning_rate': 2.4141740073670617e-06, 'epoch': 0.68} 68%|██████▊ | 23406/34278 [25:41:06<12:31:02, 4.14s/it] 68%|██████▊ | 23407/34278 [25:41:09<11:38:45, 3.86s/it] {'loss': 0.1249, 'grad_norm': 0.7248657130182223, 'learning_rate': 2.4137696693198113e-06, 'epoch': 0.68} 68%|██████▊ | 23407/34278 [25:41:09<11:38:45, 3.86s/it] 68%|██████▊ | 23408/34278 [25:41:15<12:57:22, 4.29s/it] {'loss': 0.112, 'grad_norm': 0.7290492842727565, 'learning_rate': 2.41336535436181e-06, 'epoch': 0.68} 68%|██████▊ | 23408/34278 [25:41:15<12:57:22, 4.29s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
68%|██████▊ | 23409/34278 [25:41:18<12:13:23, 4.05s/it] {'loss': 0.1118, 'grad_norm': 0.6970817164852366, 'learning_rate': 2.4129610624966654e-06, 'epoch': 0.68} 68%|██████▊ | 23409/34278 [25:41:18<12:13:23, 4.05s/it] 68%|██████▊ | 23410/34278 [25:41:21<11:24:18, 3.78s/it] {'loss': 0.1289, 'grad_norm': 0.8694633343773415, 'learning_rate': 2.412556793727985e-06, 'epoch': 0.68} 68%|██████▊ | 23410/34278 [25:41:21<11:24:18, 3.78s/it] 68%|██████▊ | 23411/34278 [25:41:25<11:06:04, 3.68s/it] {'loss': 0.1214, 'grad_norm': 0.810799855740774, 'learning_rate': 2.4121525480593793e-06, 'epoch': 0.68} 68%|██████▊ | 23411/34278 [25:41:25<11:06:04, 3.68s/it] 68%|██████▊ | 23412/34278 [25:41:28<10:38:03, 3.52s/it] {'loss': 0.1128, 'grad_norm': 0.7721906771436191, 'learning_rate': 2.41174832549446e-06, 'epoch': 0.68} 68%|██████▊ | 23412/34278 [25:41:28<10:38:03, 3.52s/it] 68%|██████▊ | 23413/34278 [25:41:31<10:01:25, 3.32s/it] {'loss': 0.1086, 'grad_norm': 0.8932376975416071, 'learning_rate': 2.4113441260368335e-06, 'epoch': 0.68} 68%|██████▊ | 23413/34278 [25:41:31<10:01:25, 3.32s/it] 68%|██████▊ | 23414/34278 [25:41:34<9:44:04, 3.23s/it] {'loss': 0.1142, 'grad_norm': 0.8040576828677514, 'learning_rate': 2.4109399496901074e-06, 'epoch': 0.68} 68%|██████▊ | 23414/34278 [25:41:34<9:44:04, 3.23s/it] 68%|██████▊ | 23415/34278 [25:41:37<9:49:27, 3.26s/it] {'loss': 0.1019, 'grad_norm': 1.060179750476534, 'learning_rate': 2.4105357964578928e-06, 'epoch': 0.68} 68%|██████▊ | 23415/34278 [25:41:37<9:49:27, 3.26s/it] 68%|██████▊ | 23416/34278 [25:41:41<10:06:31, 3.35s/it] {'loss': 0.1313, 'grad_norm': 0.974518952914883, 'learning_rate': 2.4101316663437966e-06, 'epoch': 0.68} 68%|██████▊ | 23416/34278 [25:41:41<10:06:31, 3.35s/it] 68%|██████▊ | 23417/34278 [25:41:44<10:04:17, 3.34s/it] {'loss': 0.1114, 'grad_norm': 0.9598616199303697, 'learning_rate': 2.409727559351425e-06, 'epoch': 0.68} 68%|██████▊ | 23417/34278 [25:41:44<10:04:17, 3.34s/it]
68%|██████▊ | 23418/34278 [25:41:47<10:00:01, 3.32s/it] {'loss': 0.1005, 'grad_norm': 0.9963924525348814, 'learning_rate': 2.4093234754843873e-06, 'epoch': 0.68} 68%|██████▊ | 23418/34278 [25:41:47<10:00:01, 3.32s/it] 68%|██████▊ | 23419/34278 [25:41:51<10:39:13, 3.53s/it] {'loss': 0.1324, 'grad_norm': 1.0680388047755742, 'learning_rate': 2.408919414746293e-06, 'epoch': 0.68} 68%|██████▊ | 23419/34278 [25:41:51<10:39:13, 3.53s/it] 68%|██████▊ | 23420/34278 [25:41:55<10:42:33, 3.55s/it] {'loss': 0.1359, 'grad_norm': 0.9501564668487673, 'learning_rate': 2.4085153771407477e-06, 'epoch': 0.68} 68%|██████▊ | 23420/34278 [25:41:55<10:42:33, 3.55s/it] 68%|██████▊ | 23421/34278 [25:42:00<12:01:39, 3.99s/it] {'loss': 0.1265, 'grad_norm': 1.0012804483359667, 'learning_rate': 2.4081113626713564e-06, 'epoch': 0.68} 68%|██████▊ | 23421/34278 [25:42:00<12:01:39, 3.99s/it] 68%|██████▊ | 23422/34278 [25:42:04<11:50:20, 3.93s/it] {'loss': 0.1282, 'grad_norm': 0.9585963243980063, 'learning_rate': 2.4077073713417304e-06, 'epoch': 0.68} 68%|██████▊ | 23422/34278 [25:42:04<11:50:20, 3.93s/it] 68%|██████▊ | 23423/34278 [25:42:08<12:25:59, 4.12s/it] {'loss': 0.14, 'grad_norm': 0.8267488349081108, 'learning_rate': 2.407303403155472e-06, 'epoch': 0.68} 68%|██████▊ | 23423/34278 [25:42:08<12:25:59, 4.12s/it] 68%|██████▊ | 23424/34278 [25:42:14<13:29:13, 4.47s/it] {'loss': 0.1294, 'grad_norm': 0.886117093413851, 'learning_rate': 2.4068994581161898e-06, 'epoch': 0.68} 68%|██████▊ | 23424/34278 [25:42:14<13:29:13, 4.47s/it] 68%|██████▊ | 23425/34278 [25:42:17<12:44:55, 4.23s/it] {'loss': 0.1081, 'grad_norm': 0.8729574453971385, 'learning_rate': 2.4064955362274924e-06, 'epoch': 0.68} 68%|██████▊ | 23425/34278 [25:42:17<12:44:55, 4.23s/it] 68%|██████▊ | 23426/34278 [25:42:21<12:08:06, 4.03s/it] {'loss': 0.12, 'grad_norm': 0.8968191632170365, 'learning_rate': 2.406091637492983e-06, 'epoch': 0.68} 68%|██████▊ | 23426/34278 [25:42:21<12:08:06, 4.03s/it] 68%|██████▊ | 23427/34278 [25:42:24<11:17:53, 
3.75s/it] {'loss': 0.1321, 'grad_norm': 0.8399957092302629, 'learning_rate': 2.4056877619162674e-06, 'epoch': 0.68} 68%|██████▊ | 23427/34278 [25:42:24<11:17:53, 3.75s/it] 68%|██████▊ | 23428/34278 [25:42:27<11:01:33, 3.66s/it] {'loss': 0.1234, 'grad_norm': 0.9490961480081912, 'learning_rate': 2.4052839095009535e-06, 'epoch': 0.68} 68%|██████▊ | 23428/34278 [25:42:27<11:01:33, 3.66s/it] 68%|██████▊ | 23429/34278 [25:42:30<10:36:21, 3.52s/it] {'loss': 0.1459, 'grad_norm': 1.1315340538929324, 'learning_rate': 2.404880080250643e-06, 'epoch': 0.68} 68%|██████▊ | 23429/34278 [25:42:30<10:36:21, 3.52s/it] 68%|██████▊ | 23430/34278 [25:42:34<10:17:46, 3.42s/it] {'loss': 0.1302, 'grad_norm': 0.8190062475940487, 'learning_rate': 2.4044762741689464e-06, 'epoch': 0.68} 68%|██████▊ | 23430/34278 [25:42:34<10:17:46, 3.42s/it] 68%|██████▊ | 23431/34278 [25:42:38<11:00:25, 3.65s/it] {'loss': 0.135, 'grad_norm': 1.027433233966887, 'learning_rate': 2.404072491259464e-06, 'epoch': 0.68} 68%|██████▊ | 23431/34278 [25:42:38<11:00:25, 3.65s/it] 68%|██████▊ | 23432/34278 [25:42:41<10:40:50, 3.55s/it] {'loss': 0.0995, 'grad_norm': 1.0991760340368277, 'learning_rate': 2.403668731525804e-06, 'epoch': 0.68} 68%|██████▊ | 23432/34278 [25:42:41<10:40:50, 3.55s/it] 68%|██████▊ | 23433/34278 [25:42:44<10:14:11, 3.40s/it] {'loss': 0.1236, 'grad_norm': 1.0180612157341427, 'learning_rate': 2.4032649949715703e-06, 'epoch': 0.68} 68%|██████▊ | 23433/34278 [25:42:44<10:14:11, 3.40s/it] 68%|██████▊ | 23434/34278 [25:42:48<10:39:44, 3.54s/it] {'loss': 0.1304, 'grad_norm': 0.7992345966043113, 'learning_rate': 2.402861281600365e-06, 'epoch': 0.68} 68%|██████▊ | 23434/34278 [25:42:48<10:39:44, 3.54s/it] 68%|██████▊ | 23435/34278 [25:42:51<10:15:29, 3.41s/it] {'loss': 0.1264, 'grad_norm': 1.0817479691060017, 'learning_rate': 2.402457591415794e-06, 'epoch': 0.68} 68%|██████▊ | 23435/34278 [25:42:51<10:15:29, 3.41s/it] 68%|██████▊ | 23436/34278 [25:42:55<10:53:37, 3.62s/it] {'loss': 0.1282, 'grad_norm': 
0.7583473275234861, 'learning_rate': 2.402053924421463e-06, 'epoch': 0.68} 68%|██████▊ | 23436/34278 [25:42:55<10:53:37, 3.62s/it] 68%|██████▊ | 23437/34278 [25:42:58<10:14:15, 3.40s/it] {'loss': 0.1194, 'grad_norm': 0.8215215240016145, 'learning_rate': 2.401650280620973e-06, 'epoch': 0.68} 68%|██████▊ | 23437/34278 [25:42:58<10:14:15, 3.40s/it] 68%|██████▊ | 23438/34278 [25:43:02<10:24:42, 3.46s/it] {'loss': 0.1548, 'grad_norm': 0.9652272594761065, 'learning_rate': 2.401246660017931e-06, 'epoch': 0.68} 68%|██████▊ | 23438/34278 [25:43:02<10:24:42, 3.46s/it] 68%|██████▊ | 23439/34278 [25:43:05<10:04:00, 3.34s/it] {'loss': 0.1234, 'grad_norm': 0.9839096272142991, 'learning_rate': 2.4008430626159383e-06, 'epoch': 0.68} 68%|██████▊ | 23439/34278 [25:43:05<10:04:00, 3.34s/it] 68%|██████▊ | 23440/34278 [25:43:08<9:49:27, 3.26s/it] {'loss': 0.1224, 'grad_norm': 0.8477160412619479, 'learning_rate': 2.4004394884185965e-06, 'epoch': 0.68} 68%|██████▊ | 23440/34278 [25:43:08<9:49:27, 3.26s/it] 68%|██████▊ | 23441/34278 [25:43:12<10:24:31, 3.46s/it] {'loss': 0.1489, 'grad_norm': 0.9105366763503833, 'learning_rate': 2.40003593742951e-06, 'epoch': 0.68} 68%|██████▊ | 23441/34278 [25:43:12<10:24:31, 3.46s/it] 68%|██████▊ | 23442/34278 [25:43:15<10:05:27, 3.35s/it] {'loss': 0.1178, 'grad_norm': 0.8676494595180941, 'learning_rate': 2.3996324096522844e-06, 'epoch': 0.68} 68%|██████▊ | 23442/34278 [25:43:15<10:05:27, 3.35s/it] 68%|██████▊ | 23443/34278 [25:43:18<9:47:49, 3.26s/it] {'loss': 0.1004, 'grad_norm': 0.8251007198965699, 'learning_rate': 2.3992289050905194e-06, 'epoch': 0.68} 68%|██████▊ | 23443/34278 [25:43:18<9:47:49, 3.26s/it] 68%|██████▊ | 23444/34278 [25:43:24<12:28:35, 4.15s/it] {'loss': 0.1245, 'grad_norm': 0.8984564437788387, 'learning_rate': 2.3988254237478164e-06, 'epoch': 0.68} 68%|██████▊ | 23444/34278 [25:43:24<12:28:35, 4.15s/it] 68%|██████▊ | 23445/34278 [25:43:30<13:47:21, 4.58s/it] {'loss': 0.1111, 'grad_norm': 0.7892369663173008, 'learning_rate': 
2.3984219656277807e-06, 'epoch': 0.68} 68%|██████▊ | 23445/34278 [25:43:30<13:47:21, 4.58s/it] 68%|██████▊ | 23446/34278 [25:43:34<13:37:02, 4.53s/it] {'loss': 0.1026, 'grad_norm': 1.003280951784191, 'learning_rate': 2.3980185307340127e-06, 'epoch': 0.68} 68%|██████▊ | 23446/34278 [25:43:34<13:37:02, 4.53s/it] 68%|██████▊ | 23447/34278 [25:43:38<12:43:44, 4.23s/it] {'loss': 0.1122, 'grad_norm': 0.7889691182335306, 'learning_rate': 2.3976151190701123e-06, 'epoch': 0.68} 68%|██████▊ | 23447/34278 [25:43:38<12:43:44, 4.23s/it] 68%|██████▊ | 23448/34278 [25:43:44<14:51:10, 4.94s/it] {'loss': 0.121, 'grad_norm': 0.6360571341951934, 'learning_rate': 2.3972117306396823e-06, 'epoch': 0.68} 68%|██████▊ | 23448/34278 [25:43:44<14:51:10, 4.94s/it] 68%|██████▊ | 23449/34278 [25:43:47<13:12:10, 4.39s/it] {'loss': 0.1325, 'grad_norm': 1.0484686636854017, 'learning_rate': 2.3968083654463277e-06, 'epoch': 0.68} 68%|██████▊ | 23449/34278 [25:43:47<13:12:10, 4.39s/it] 68%|██████▊ | 23450/34278 [25:43:50<12:01:10, 4.00s/it] {'loss': 0.1189, 'grad_norm': 1.1633400083088463, 'learning_rate': 2.396405023493646e-06, 'epoch': 0.68} 68%|██████▊ | 23450/34278 [25:43:50<12:01:10, 4.00s/it] 68%|██████▊ | 23451/34278 [25:43:57<13:50:45, 4.60s/it] {'loss': 0.14, 'grad_norm': 0.7852866233186016, 'learning_rate': 2.3960017047852362e-06, 'epoch': 0.68} 68%|██████▊ | 23451/34278 [25:43:57<13:50:45, 4.60s/it] 68%|██████▊ | 23452/34278 [25:44:00<13:15:42, 4.41s/it] {'loss': 0.1323, 'grad_norm': 0.9356886034288822, 'learning_rate': 2.395598409324704e-06, 'epoch': 0.68} 68%|██████▊ | 23452/34278 [25:44:00<13:15:42, 4.41s/it] 68%|██████▊ | 23453/34278 [25:44:04<12:08:29, 4.04s/it] {'loss': 0.1307, 'grad_norm': 0.9572955367971694, 'learning_rate': 2.395195137115646e-06, 'epoch': 0.68} 68%|██████▊ | 23453/34278 [25:44:04<12:08:29, 4.04s/it] 68%|██████▊ | 23454/34278 [25:44:08<12:34:51, 4.18s/it] {'loss': 0.1137, 'grad_norm': 0.7671869608111163, 'learning_rate': 2.394791888161663e-06, 'epoch': 0.68} 
68%|██████▊ | 23454/34278 [25:44:08<12:34:51, 4.18s/it] 68%|██████▊ | 23455/34278 [25:44:11<11:38:44, 3.87s/it] {'loss': 0.1277, 'grad_norm': 0.9668753842441362, 'learning_rate': 2.3943886624663586e-06, 'epoch': 0.68} 68%|██████▊ | 23455/34278 [25:44:11<11:38:44, 3.87s/it] 68%|██████▊ | 23456/34278 [25:44:14<10:56:49, 3.64s/it] {'loss': 0.1213, 'grad_norm': 0.7377271997149412, 'learning_rate': 2.393985460033331e-06, 'epoch': 0.68} 68%|██████▊ | 23456/34278 [25:44:14<10:56:49, 3.64s/it] 68%|██████▊ | 23457/34278 [25:44:18<10:36:53, 3.53s/it] {'loss': 0.1252, 'grad_norm': 0.707175469957673, 'learning_rate': 2.393582280866176e-06, 'epoch': 0.68} 68%|██████▊ | 23457/34278 [25:44:18<10:36:53, 3.53s/it] 68%|██████▊ | 23458/34278 [25:44:21<10:22:00, 3.45s/it] {'loss': 0.1152, 'grad_norm': 0.8718198956609994, 'learning_rate': 2.393179124968498e-06, 'epoch': 0.68} 68%|██████▊ | 23458/34278 [25:44:21<10:22:00, 3.45s/it] 68%|██████▊ | 23459/34278 [25:44:25<10:34:30, 3.52s/it] {'loss': 0.1276, 'grad_norm': 0.8221659544288544, 'learning_rate': 2.3927759923438936e-06, 'epoch': 0.68} 68%|██████▊ | 23459/34278 [25:44:25<10:34:30, 3.52s/it] 68%|██████▊ | 23460/34278 [25:44:29<10:54:07, 3.63s/it] {'loss': 0.1136, 'grad_norm': 0.7789014250704809, 'learning_rate': 2.392372882995964e-06, 'epoch': 0.68} 68%|██████▊ | 23460/34278 [25:44:29<10:54:07, 3.63s/it] 68%|██████▊ | 23461/34278 [25:44:32<10:32:47, 3.51s/it] {'loss': 0.1068, 'grad_norm': 0.7690363735768622, 'learning_rate': 2.391969796928305e-06, 'epoch': 0.68} 68%|██████▊ | 23461/34278 [25:44:32<10:32:47, 3.51s/it] 68%|██████▊ | 23462/34278 [25:44:35<10:37:15, 3.54s/it] {'loss': 0.1178, 'grad_norm': 0.8627675179824307, 'learning_rate': 2.3915667341445194e-06, 'epoch': 0.68} 68%|██████▊ | 23462/34278 [25:44:35<10:37:15, 3.54s/it] 68%|██████▊ | 23463/34278 [25:44:39<10:23:03, 3.46s/it] {'loss': 0.1415, 'grad_norm': 0.9365312772700242, 'learning_rate': 2.3911636946482024e-06, 'epoch': 0.68} 68%|██████▊ | 23463/34278 
[25:44:39<10:23:03, 3.46s/it] 68%|██████▊ | 23464/34278 [25:44:43<11:33:43, 3.85s/it] {'loss': 0.1212, 'grad_norm': 1.0855533313368741, 'learning_rate': 2.390760678442952e-06, 'epoch': 0.68} 68%|██████▊ | 23464/34278 [25:44:43<11:33:43, 3.85s/it] 68%|██████▊ | 23465/34278 [25:44:46<10:51:09, 3.61s/it] {'loss': 0.1198, 'grad_norm': 0.8870097026272866, 'learning_rate': 2.3903576855323676e-06, 'epoch': 0.68} 68%|██████▊ | 23465/34278 [25:44:46<10:51:09, 3.61s/it] 68%|██████▊ | 23466/34278 [25:44:50<10:40:25, 3.55s/it] {'loss': 0.1033, 'grad_norm': 0.6458907704108451, 'learning_rate': 2.3899547159200478e-06, 'epoch': 0.68} 68%|██████▊ | 23466/34278 [25:44:50<10:40:25, 3.55s/it] 68%|██████▊ | 23467/34278 [25:44:56<13:00:55, 4.33s/it] {'loss': 0.1012, 'grad_norm': 0.7754345369737338, 'learning_rate': 2.389551769609588e-06, 'epoch': 0.68} 68%|██████▊ | 23467/34278 [25:44:56<13:00:55, 4.33s/it] 68%|██████▊ | 23468/34278 [25:44:59<12:07:08, 4.04s/it] {'loss': 0.1336, 'grad_norm': 0.8907205336921865, 'learning_rate': 2.389148846604588e-06, 'epoch': 0.68} 68%|██████▊ | 23468/34278 [25:44:59<12:07:08, 4.04s/it] 68%|██████▊ | 23469/34278 [25:45:02<11:15:02, 3.75s/it] {'loss': 0.1102, 'grad_norm': 0.9164401893153775, 'learning_rate': 2.388745946908645e-06, 'epoch': 0.68} 68%|██████▊ | 23469/34278 [25:45:02<11:15:02, 3.75s/it] 68%|██████▊ | 23470/34278 [25:45:07<11:43:38, 3.91s/it] {'loss': 0.1224, 'grad_norm': 0.9711317861214254, 'learning_rate': 2.3883430705253517e-06, 'epoch': 0.68} 68%|██████▊ | 23470/34278 [25:45:07<11:43:38, 3.91s/it] 68%|██████▊ | 23471/34278 [25:45:10<10:58:56, 3.66s/it] {'loss': 0.1204, 'grad_norm': 0.9728623130958626, 'learning_rate': 2.387940217458309e-06, 'epoch': 0.68} 68%|██████▊ | 23471/34278 [25:45:10<10:58:56, 3.66s/it] 68%|██████▊ | 23472/34278 [25:45:15<12:28:30, 4.16s/it] {'loss': 0.0982, 'grad_norm': 0.7746221587657227, 'learning_rate': 2.387537387711114e-06, 'epoch': 0.68} 68%|██████▊ | 23472/34278 [25:45:15<12:28:30, 4.16s/it] 68%|██████▊ | 
23473/34278 [25:45:20<13:03:08, 4.35s/it] {'loss': 0.1259, 'grad_norm': 1.0602203978021345, 'learning_rate': 2.3871345812873614e-06, 'epoch': 0.68} 68%|██████▊ | 23473/34278 [25:45:20<13:03:08, 4.35s/it] 68%|██████▊ | 23474/34278 [25:45:23<11:35:53, 3.86s/it] {'loss': 0.0918, 'grad_norm': 0.7457289504548994, 'learning_rate': 2.386731798190646e-06, 'epoch': 0.68} 68%|██████▊ | 23474/34278 [25:45:23<11:35:53, 3.86s/it] 68%|██████▊ | 23475/34278 [25:45:27<11:39:54, 3.89s/it] {'loss': 0.1355, 'grad_norm': 0.9372285937573585, 'learning_rate': 2.386329038424567e-06, 'epoch': 0.68} 68%|██████▊ | 23475/34278 [25:45:27<11:39:54, 3.89s/it] 68%|██████▊ | 23476/34278 [25:45:30<11:07:31, 3.71s/it] {'loss': 0.116, 'grad_norm': 0.8393391738534541, 'learning_rate': 2.3859263019927183e-06, 'epoch': 0.68} 68%|██████▊ | 23476/34278 [25:45:30<11:07:31, 3.71s/it] 68%|██████▊ | 23477/34278 [25:45:33<10:26:47, 3.48s/it] {'loss': 0.1154, 'grad_norm': 0.8456078402600375, 'learning_rate': 2.3855235888986934e-06, 'epoch': 0.68} 68%|██████▊ | 23477/34278 [25:45:33<10:26:47, 3.48s/it] 68%|██████▊ | 23478/34278 [25:45:36<9:57:22, 3.32s/it] {'loss': 0.1232, 'grad_norm': 0.9142331599646668, 'learning_rate': 2.38512089914609e-06, 'epoch': 0.68} 68%|██████▊ | 23478/34278 [25:45:36<9:57:22, 3.32s/it] 68%|██████▊ | 23479/34278 [25:45:41<12:08:24, 4.05s/it] {'loss': 0.1045, 'grad_norm': 0.8034376540527108, 'learning_rate': 2.384718232738505e-06, 'epoch': 0.68} 68%|██████▊ | 23479/34278 [25:45:42<12:08:24, 4.05s/it] 68%|██████▊ | 23480/34278 [25:45:46<12:26:26, 4.15s/it] {'loss': 0.1163, 'grad_norm': 0.9115998323203565, 'learning_rate': 2.3843155896795312e-06, 'epoch': 0.68} 68%|██████▊ | 23480/34278 [25:45:46<12:26:26, 4.15s/it] 69%|██████▊ | 23481/34278 [25:45:49<11:25:46, 3.81s/it] {'loss': 0.1149, 'grad_norm': 0.8322445656168338, 'learning_rate': 2.383912969972762e-06, 'epoch': 0.69} 69%|██████▊ | 23481/34278 [25:45:49<11:25:46, 3.81s/it] 69%|██████▊ | 23482/34278 [25:45:55<13:17:06, 4.43s/it] 
{'loss': 0.1094, 'grad_norm': 0.9240018036308494, 'learning_rate': 2.3835103736217946e-06, 'epoch': 0.69} 69%|██████▊ | 23482/34278 [25:45:55<13:17:06, 4.43s/it] 69%|██████▊ | 23483/34278 [25:45:58<12:29:02, 4.16s/it] {'loss': 0.1487, 'grad_norm': 0.936371784389214, 'learning_rate': 2.38310780063022e-06, 'epoch': 0.69} 69%|██████▊ | 23483/34278 [25:45:58<12:29:02, 4.16s/it] 69%|██████▊ | 23484/34278 [25:46:02<11:38:15, 3.88s/it] {'loss': 0.1371, 'grad_norm': 0.8993352478631058, 'learning_rate': 2.3827052510016345e-06, 'epoch': 0.69} 69%|██████▊ | 23484/34278 [25:46:02<11:38:15, 3.88s/it] 69%|██████▊ | 23485/34278 [25:46:05<10:49:01, 3.61s/it] {'loss': 0.0973, 'grad_norm': 0.866249042145432, 'learning_rate': 2.3823027247396336e-06, 'epoch': 0.69} 69%|██████▊ | 23485/34278 [25:46:05<10:49:01, 3.61s/it] 69%|██████▊ | 23486/34278 [25:46:08<10:34:06, 3.53s/it] {'loss': 0.1294, 'grad_norm': 0.8120086252175812, 'learning_rate': 2.3819002218478095e-06, 'epoch': 0.69} 69%|██████▊ | 23486/34278 [25:46:08<10:34:06, 3.53s/it] 69%|██████▊ | 23487/34278 [25:46:11<10:00:02, 3.34s/it] {'loss': 0.1142, 'grad_norm': 0.9075515094803396, 'learning_rate': 2.3814977423297525e-06, 'epoch': 0.69} 69%|██████▊ | 23487/34278 [25:46:11<10:00:02, 3.34s/it] 69%|██████▊ | 23488/34278 [25:46:14<9:40:21, 3.23s/it] {'loss': 0.0967, 'grad_norm': 0.8194432468452225, 'learning_rate': 2.381095286189061e-06, 'epoch': 0.69} 69%|██████▊ | 23488/34278 [25:46:14<9:40:21, 3.23s/it] 69%|██████▊ | 23489/34278 [25:46:17<9:23:11, 3.13s/it] {'loss': 0.1093, 'grad_norm': 0.911996593340731, 'learning_rate': 2.380692853429324e-06, 'epoch': 0.69} 69%|██████▊ | 23489/34278 [25:46:17<9:23:11, 3.13s/it] 69%|██████▊ | 23490/34278 [25:46:23<12:04:09, 4.03s/it] {'loss': 0.1153, 'grad_norm': 0.7433077733892137, 'learning_rate': 2.380290444054137e-06, 'epoch': 0.69} 69%|██████▊ | 23490/34278 [25:46:23<12:04:09, 4.03s/it] 69%|██████▊ | 23491/34278 [25:46:27<12:21:31, 4.12s/it] {'loss': 0.1125, 'grad_norm': 0.6776526815796657, 
'learning_rate': 2.37988805806709e-06, 'epoch': 0.69} 69%|██████▊ | 23491/34278 [25:46:27<12:21:31, 4.12s/it] 69%|██████▊ | 23492/34278 [25:46:30<11:41:50, 3.90s/it] {'loss': 0.1318, 'grad_norm': 0.9523202249609923, 'learning_rate': 2.379485695471779e-06, 'epoch': 0.69} 69%|██████▊ | 23492/34278 [25:46:30<11:41:50, 3.90s/it] 69%|██████▊ | 23493/34278 [25:46:36<13:29:34, 4.50s/it] {'loss': 0.1148, 'grad_norm': 0.7864939572064003, 'learning_rate': 2.3790833562717942e-06, 'epoch': 0.69} 69%|██████▊ | 23493/34278 [25:46:36<13:29:34, 4.50s/it] 69%|██████▊ | 23494/34278 [25:46:43<14:56:29, 4.99s/it] {'loss': 0.1162, 'grad_norm': 0.7627730295940247, 'learning_rate': 2.3786810404707255e-06, 'epoch': 0.69} 69%|██████▊ | 23494/34278 [25:46:43<14:56:29, 4.99s/it] 69%|██████▊ | 23495/34278 [25:46:48<15:41:44, 5.24s/it] {'loss': 0.1123, 'grad_norm': 0.7648168097564796, 'learning_rate': 2.3782787480721665e-06, 'epoch': 0.69} 69%|██████▊ | 23495/34278 [25:46:48<15:41:44, 5.24s/it] 69%|██████▊ | 23496/34278 [25:46:51<13:27:16, 4.49s/it] {'loss': 0.1036, 'grad_norm': 0.681673275124196, 'learning_rate': 2.377876479079711e-06, 'epoch': 0.69} 69%|██████▊ | 23496/34278 [25:46:51<13:27:16, 4.49s/it] 69%|██████▊ | 23497/34278 [25:46:57<15:00:50, 5.01s/it] {'loss': 0.1245, 'grad_norm': 0.7671201754749983, 'learning_rate': 2.3774742334969463e-06, 'epoch': 0.69} 69%|██████▊ | 23497/34278 [25:46:57<15:00:50, 5.01s/it] 69%|██████▊ | 23498/34278 [25:47:01<14:02:19, 4.69s/it] {'loss': 0.127, 'grad_norm': 0.8428541625357826, 'learning_rate': 2.3770720113274683e-06, 'epoch': 0.69} 69%|██████▊ | 23498/34278 [25:47:01<14:02:19, 4.69s/it] 69%|██████▊ | 23499/34278 [25:47:06<13:51:51, 4.63s/it] {'loss': 0.1248, 'grad_norm': 1.0171462256525663, 'learning_rate': 2.3766698125748646e-06, 'epoch': 0.69} 69%|██████▊ | 23499/34278 [25:47:06<13:51:51, 4.63s/it] 69%|██████▊ | 23500/34278 [25:47:09<12:29:02, 4.17s/it] {'loss': 0.1088, 'grad_norm': 0.9667422677895646, 'learning_rate': 2.3762676372427247e-06, 
'epoch': 0.69} 69%|██████▊ | 23500/34278 [25:47:09<12:29:02, 4.17s/it] 69%|██████▊ | 23501/34278 [25:47:13<12:41:02, 4.24s/it] {'loss': 0.1219, 'grad_norm': 0.6010521933297716, 'learning_rate': 2.3758654853346407e-06, 'epoch': 0.69} 69%|██████▊ | 23501/34278 [25:47:13<12:41:02, 4.24s/it] 69%|██████▊ | 23502/34278 [25:47:17<12:19:59, 4.12s/it] {'loss': 0.112, 'grad_norm': 0.93863480082487, 'learning_rate': 2.3754633568542056e-06, 'epoch': 0.69} 69%|██████▊ | 23502/34278 [25:47:17<12:19:59, 4.12s/it] 69%|██████▊ | 23503/34278 [25:47:20<11:37:48, 3.89s/it] {'loss': 0.1307, 'grad_norm': 1.0692147299668187, 'learning_rate': 2.375061251805007e-06, 'epoch': 0.69} 69%|██████▊ | 23503/34278 [25:47:20<11:37:48, 3.89s/it] 69%|██████▊ | 23504/34278 [25:47:24<11:06:33, 3.71s/it] {'loss': 0.1185, 'grad_norm': 0.7559153663930311, 'learning_rate': 2.374659170190633e-06, 'epoch': 0.69} 69%|██████▊ | 23504/34278 [25:47:24<11:06:33, 3.71s/it] 69%|██████▊ | 23505/34278 [25:47:27<10:26:57, 3.49s/it] {'loss': 0.1123, 'grad_norm': 0.6871234650276633, 'learning_rate': 2.3742571120146767e-06, 'epoch': 0.69} 69%|██████▊ | 23505/34278 [25:47:27<10:26:57, 3.49s/it] 69%|██████▊ | 23506/34278 [25:47:30<10:14:35, 3.42s/it] {'loss': 0.1149, 'grad_norm': 1.0453357658913915, 'learning_rate': 2.373855077280727e-06, 'epoch': 0.69} 69%|██████▊ | 23506/34278 [25:47:30<10:14:35, 3.42s/it] 69%|██████▊ | 23507/34278 [25:47:33<9:50:07, 3.29s/it] {'loss': 0.1181, 'grad_norm': 0.9279121847645624, 'learning_rate': 2.3734530659923695e-06, 'epoch': 0.69} 69%|██████▊ | 23507/34278 [25:47:33<9:50:07, 3.29s/it] 69%|██████▊ | 23508/34278 [25:47:36<9:29:05, 3.17s/it] {'loss': 0.1215, 'grad_norm': 0.7215969775900233, 'learning_rate': 2.373051078153196e-06, 'epoch': 0.69} 69%|██████▊ | 23508/34278 [25:47:36<9:29:05, 3.17s/it] 69%|██████▊ | 23509/34278 [25:47:39<9:09:45, 3.06s/it] {'loss': 0.1324, 'grad_norm': 0.9561340154637641, 'learning_rate': 2.372649113766798e-06, 'epoch': 0.69} 69%|██████▊ | 23509/34278 
[25:47:39<9:09:45, 3.06s/it] 69%|██████▊ | 23510/34278 [25:47:42<9:22:52, 3.14s/it] {'loss': 0.1242, 'grad_norm': 0.8386322298917255, 'learning_rate': 2.3722471728367613e-06, 'epoch': 0.69} 69%|██████▊ | 23510/34278 [25:47:42<9:22:52, 3.14s/it] 69%|██████▊ | 23511/34278 [25:47:46<10:24:21, 3.48s/it] {'loss': 0.1396, 'grad_norm': 0.9650038111161826, 'learning_rate': 2.371845255366672e-06, 'epoch': 0.69} 69%|██████▊ | 23511/34278 [25:47:46<10:24:21, 3.48s/it] 69%|██████▊ | 23512/34278 [25:47:49<10:07:28, 3.39s/it] {'loss': 0.1324, 'grad_norm': 1.0789013915669485, 'learning_rate': 2.3714433613601236e-06, 'epoch': 0.69} 69%|██████▊ | 23512/34278 [25:47:49<10:07:28, 3.39s/it] 69%|██████▊ | 23513/34278 [25:47:53<10:30:21, 3.51s/it] {'loss': 0.1165, 'grad_norm': 0.8364909352913613, 'learning_rate': 2.3710414908206993e-06, 'epoch': 0.69} 69%|██████▊ | 23513/34278 [25:47:53<10:30:21, 3.51s/it] 69%|██████▊ | 23514/34278 [25:47:56<9:54:13, 3.31s/it] {'loss': 0.1058, 'grad_norm': 1.068960133563225, 'learning_rate': 2.3706396437519884e-06, 'epoch': 0.69} 69%|██████▊ | 23514/34278 [25:47:56<9:54:13, 3.31s/it] 69%|██████▊ | 23515/34278 [25:47:59<9:49:55, 3.29s/it] {'loss': 0.1169, 'grad_norm': 1.4907302630864512, 'learning_rate': 2.3702378201575813e-06, 'epoch': 0.69} 69%|██████▊ | 23515/34278 [25:47:59<9:49:55, 3.29s/it] 69%|██████▊ | 23516/34278 [25:48:03<10:05:00, 3.37s/it] {'loss': 0.1369, 'grad_norm': 0.9768349753225332, 'learning_rate': 2.3698360200410637e-06, 'epoch': 0.69} 69%|██████▊ | 23516/34278 [25:48:03<10:05:00, 3.37s/it] 69%|██████▊ | 23517/34278 [25:48:06<9:44:01, 3.26s/it] {'loss': 0.1306, 'grad_norm': 1.0809606732262722, 'learning_rate': 2.3694342434060197e-06, 'epoch': 0.69} 69%|██████▊ | 23517/34278 [25:48:06<9:44:01, 3.26s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd62f6f7f10>
Failed to fetch sample 3646669. Exception: cannot identify image file <_io.BytesIO object at 0x7fd62f6f7f10>
69%|██████▊ | 23518/34278 [25:48:10<10:15:45, 3.43s/it] {'loss': 0.11, 'grad_norm': 1.178862064829292, 'learning_rate': 2.369032490256041e-06, 'epoch': 0.69} 69%|██████▊ | 23518/34278 [25:48:10<10:15:45, 3.43s/it] 69%|██████▊ | 23519/34278 [25:48:13<10:30:45, 3.52s/it] {'loss': 0.1141, 'grad_norm': 1.0757149116537466, 'learning_rate': 2.36863076059471e-06, 'epoch': 0.69} 69%|██████▊ | 23519/34278 [25:48:13<10:30:45, 3.52s/it] 69%|██████▊ | 23520/34278 [25:48:16<9:59:24, 3.34s/it] {'loss': 0.1072, 'grad_norm': 1.0754990610337491, 'learning_rate': 2.3682290544256177e-06, 'epoch': 0.69} 69%|██████▊ | 23520/34278 [25:48:16<9:59:24, 3.34s/it] 69%|██████▊ | 23521/34278 [25:48:19<9:41:23, 3.24s/it] {'loss': 0.1138, 'grad_norm': 0.9965731383422439, 'learning_rate': 2.367827371752346e-06, 'epoch': 0.69} 69%|██████▊ | 23521/34278 [25:48:19<9:41:23, 3.24s/it] 69%|██████▊ | 23522/34278 [25:48:22<9:19:18, 3.12s/it] {'loss': 0.0927, 'grad_norm': 1.0173008119921403, 'learning_rate': 2.367425712578485e-06, 'epoch': 0.69} 69%|██████▊ | 23522/34278 [25:48:22<9:19:18, 3.12s/it] 69%|██████▊ | 23523/34278 [25:48:28<11:48:22, 3.95s/it] {'loss': 0.1315, 'grad_norm': 1.0887430923450627, 'learning_rate': 2.367024076907619e-06, 'epoch': 0.69} 69%|██████▊ | 23523/34278 [25:48:28<11:48:22, 3.95s/it] 69%|██████▊ | 23524/34278 [25:48:33<12:27:04, 4.17s/it] {'loss': 0.1346, 'grad_norm': 0.9403419166037104, 'learning_rate': 2.3666224647433316e-06, 'epoch': 0.69} 69%|██████▊ | 23524/34278 [25:48:33<12:27:04, 4.17s/it] 69%|██████▊ | 23525/34278 [25:48:36<11:56:33, 4.00s/it] {'loss': 0.1456, 'grad_norm': 1.0754113609847809, 'learning_rate': 2.36622087608921e-06, 'epoch': 0.69} 69%|██████▊ | 23525/34278 [25:48:36<11:56:33, 4.00s/it] 69%|██████▊ | 23526/34278 [25:48:41<12:29:44, 4.18s/it] {'loss': 0.1239, 'grad_norm': 0.891411413120843, 'learning_rate': 2.365819310948842e-06, 'epoch': 0.69} 69%|██████▊ | 23526/34278
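The UnidentifiedImageError above is raised because `pil_loader` (dataset.py:40 in the traceback) hands the raw bytes fetched from ceph straight to `Image.open`, so a corrupt or non-image object blows up inside `__getitem__`. A minimal defensive sketch of such a loader, returning `None` for undecodable bytes instead of raising; `safe_pil_loader` is a hypothetical helper, not the actual aguvis code:

```python
import io

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes):
    """Decode raw image bytes; return None when the bytes are not a
    recognizable (or complete) image instead of raising."""
    buff = io.BytesIO(img_bytes)
    try:
        img = Image.open(buff)
        img.load()  # force a full decode so truncated files also fail here
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        # UnidentifiedImageError: header not recognized (as in the traceback);
        # OSError: recognized header but truncated/corrupt pixel data.
        return None
```

With a loader like this, the dataset can detect a bad ceph object up front and resample, rather than letting the exception propagate as in the traceback.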
[25:48:41<12:29:44, 4.18s/it] 69%|██████▊ | 23527/34278 [25:48:44<11:33:10, 3.87s/it] {'loss': 0.1306, 'grad_norm': 0.8652430958392724, 'learning_rate': 2.365417769325808e-06, 'epoch': 0.69} 69%|██████▊ | 23527/34278 [25:48:44<11:33:10, 3.87s/it] 69%|██████▊ | 23528/34278 [25:48:47<10:36:12, 3.55s/it] {'loss': 0.123, 'grad_norm': 0.8521861191231433, 'learning_rate': 2.3650162512236976e-06, 'epoch': 0.69} 69%|██████▊ | 23528/34278 [25:48:47<10:36:12, 3.55s/it] 69%|██████▊ | 23529/34278 [25:48:50<10:25:44, 3.49s/it] {'loss': 0.0889, 'grad_norm': 0.7609696299282188, 'learning_rate': 2.3646147566460925e-06, 'epoch': 0.69} 69%|██████▊ | 23529/34278 [25:48:50<10:25:44, 3.49s/it] 69%|██████▊ | 23530/34278 [25:48:53<10:01:00, 3.36s/it] {'loss': 0.1262, 'grad_norm': 0.8147779881198222, 'learning_rate': 2.364213285596576e-06, 'epoch': 0.69} 69%|██████▊ | 23530/34278 [25:48:53<10:01:00, 3.36s/it] 69%|██████▊ | 23531/34278 [25:48:57<10:39:24, 3.57s/it] {'loss': 0.1108, 'grad_norm': 0.6912119070395232, 'learning_rate': 2.3638118380787343e-06, 'epoch': 0.69} 69%|██████▊ | 23531/34278 [25:48:57<10:39:24, 3.57s/it] 69%|██████▊ | 23532/34278 [25:49:01<10:59:49, 3.68s/it] {'loss': 0.1118, 'grad_norm': 0.7823489429027166, 'learning_rate': 2.3634104140961526e-06, 'epoch': 0.69} 69%|██████▊ | 23532/34278 [25:49:01<10:59:49, 3.68s/it] 69%|██████▊ | 23533/34278 [25:49:05<11:17:20, 3.78s/it] {'loss': 0.1224, 'grad_norm': 0.8024496981683377, 'learning_rate': 2.363009013652414e-06, 'epoch': 0.69} 69%|██████▊ | 23533/34278 [25:49:05<11:17:20, 3.78s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return 
self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f09f09a5030> Failed to fetch sample 3588839. Exception: cannot identify image file <_io.BytesIO object at 0x7f09f09a5030> 69%|██████▊ | 23534/34278 [25:49:09<10:54:30, 3.66s/it] {'loss': 0.1344, 'grad_norm': 0.8262579075470796, 'learning_rate': 2.362607636751099e-06, 'epoch': 0.69} 69%|██████▊ | 23534/34278 [25:49:09<10:54:30, 3.66s/it] 69%|██████▊ | 23535/34278 [25:49:12<10:34:15, 3.54s/it] {'loss': 0.1151, 'grad_norm': 0.7887056906078238, 'learning_rate': 2.362206283395796e-06, 'epoch': 0.69} 69%|██████▊ | 23535/34278 [25:49:12<10:34:15, 3.54s/it] 69%|██████▊ | 23536/34278 [25:49:16<11:26:01, 3.83s/it] {'loss': 0.1355, 'grad_norm': 0.7806153772020534, 'learning_rate': 2.361804953590085e-06, 'epoch': 0.69} 69%|██████▊ | 23536/34278 [25:49:16<11:26:01, 3.83s/it] 69%|██████▊ | 23537/34278 [25:49:19<10:36:43, 3.56s/it] {'loss': 0.1285, 'grad_norm': 0.8653411093637556, 'learning_rate': 2.361403647337548e-06, 'epoch': 0.69} 69%|██████▊ | 23537/34278 [25:49:19<10:36:43, 3.56s/it] 69%|██████▊ | 23538/34278 [25:49:25<12:39:58, 4.25s/it] {'loss': 0.1322, 'grad_norm': 1.4017383155451708, 'learning_rate': 
2.361002364641769e-06, 'epoch': 0.69} 69%|██████▊ | 23538/34278 [25:49:25<12:39:58, 4.25s/it] 69%|██████▊ | 23539/34278 [25:49:30<13:24:09, 4.49s/it] {'loss': 0.1307, 'grad_norm': 1.144977948650432, 'learning_rate': 2.3606011055063334e-06, 'epoch': 0.69} 69%|██████▊ | 23539/34278 [25:49:30<13:24:09, 4.49s/it] 69%|██████▊ | 23540/34278 [25:49:34<12:31:51, 4.20s/it] {'loss': 0.1241, 'grad_norm': 0.7977310722793292, 'learning_rate': 2.3601998699348204e-06, 'epoch': 0.69} 69%|██████▊ | 23540/34278 [25:49:34<12:31:51, 4.20s/it] 69%|██████▊ | 23541/34278 [25:49:37<11:31:11, 3.86s/it] {'loss': 0.1252, 'grad_norm': 0.9778755835224564, 'learning_rate': 2.359798657930811e-06, 'epoch': 0.69} 69%|██████▊ | 23541/34278 [25:49:37<11:31:11, 3.86s/it] 69%|██████▊ | 23542/34278 [25:49:40<10:45:19, 3.61s/it] {'loss': 0.1346, 'grad_norm': 0.9654940653137173, 'learning_rate': 2.359397469497891e-06, 'epoch': 0.69} 69%|██████▊ | 23542/34278 [25:49:40<10:45:19, 3.61s/it] 69%|██████▊ | 23543/34278 [25:49:43<10:29:18, 3.52s/it] {'loss': 0.1188, 'grad_norm': 0.8260042062894757, 'learning_rate': 2.358996304639638e-06, 'epoch': 0.69} 69%|██████▊ | 23543/34278 [25:49:43<10:29:18, 3.52s/it] 69%|██████▊ | 23544/34278 [25:49:46<9:59:47, 3.35s/it] {'loss': 0.1116, 'grad_norm': 0.8274462611216946, 'learning_rate': 2.3585951633596355e-06, 'epoch': 0.69} 69%|██████▊ | 23544/34278 [25:49:46<9:59:47, 3.35s/it] 69%|██████▊ | 23545/34278 [25:49:49<9:50:19, 3.30s/it] {'loss': 0.1436, 'grad_norm': 1.1001433245277148, 'learning_rate': 2.358194045661467e-06, 'epoch': 0.69} 69%|██████▊ | 23545/34278 [25:49:49<9:50:19, 3.30s/it] 69%|██████▊ | 23546/34278 [25:49:52<9:41:00, 3.25s/it] {'loss': 0.1107, 'grad_norm': 0.9133574728664574, 'learning_rate': 2.3577929515487114e-06, 'epoch': 0.69} 69%|██████▊ | 23546/34278 [25:49:53<9:41:00, 3.25s/it] 69%|██████▊ | 23547/34278 [25:49:56<9:33:40, 3.21s/it] {'loss': 0.1138, 'grad_norm': 0.953493345282723, 'learning_rate': 2.3573918810249474e-06, 'epoch': 0.69} 69%|██████▊ 
| 23547/34278 [25:49:56<9:33:40, 3.21s/it] 69%|██████▊ | 23548/34278 [25:49:59<9:17:58, 3.12s/it] {'loss': 0.1361, 'grad_norm': 1.194121128930686, 'learning_rate': 2.35699083409376e-06, 'epoch': 0.69} 69%|██████▊ | 23548/34278 [25:49:59<9:17:58, 3.12s/it] 69%|██████▊ | 23549/34278 [25:50:03<10:42:15, 3.59s/it] {'loss': 0.1255, 'grad_norm': 0.8862759394539529, 'learning_rate': 2.3565898107587252e-06, 'epoch': 0.69} 69%|██████▊ | 23549/34278 [25:50:03<10:42:15, 3.59s/it] 69%|██████▊ | 23550/34278 [25:50:06<10:01:25, 3.36s/it] {'loss': 0.1089, 'grad_norm': 0.9633645413471328, 'learning_rate': 2.3561888110234282e-06, 'epoch': 0.69} 69%|██████▊ | 23550/34278 [25:50:06<10:01:25, 3.36s/it] 69%|██████▊ | 23551/34278 [25:50:12<11:55:07, 4.00s/it] {'loss': 0.1031, 'grad_norm': 0.7766659798818281, 'learning_rate': 2.355787834891444e-06, 'epoch': 0.69} 69%|██████▊ | 23551/34278 [25:50:12<11:55:07, 4.00s/it] 69%|██████▊ | 23552/34278 [25:50:15<11:04:53, 3.72s/it] {'loss': 0.1298, 'grad_norm': 0.9616354443413634, 'learning_rate': 2.3553868823663566e-06, 'epoch': 0.69} 69%|██████▊ | 23552/34278 [25:50:15<11:04:53, 3.72s/it] 69%|██████▊ | 23553/34278 [25:50:18<10:34:44, 3.55s/it] {'loss': 0.1279, 'grad_norm': 0.8658744516447879, 'learning_rate': 2.354985953451744e-06, 'epoch': 0.69} 69%|██████▊ | 23553/34278 [25:50:18<10:34:44, 3.55s/it] 69%|██████▊ | 23554/34278 [25:50:24<12:43:49, 4.27s/it] {'loss': 0.1247, 'grad_norm': 0.7759188916579862, 'learning_rate': 2.354585048151183e-06, 'epoch': 0.69} 69%|██████▊ | 23554/34278 [25:50:24<12:43:49, 4.27s/it] 69%|██████▊ | 23555/34278 [25:50:27<11:25:43, 3.84s/it] {'loss': 0.1286, 'grad_norm': 0.9923030343845581, 'learning_rate': 2.3541841664682557e-06, 'epoch': 0.69} 69%|██████▊ | 23555/34278 [25:50:27<11:25:43, 3.84s/it] 69%|██████▊ | 23556/34278 [25:50:32<12:28:31, 4.19s/it] {'loss': 0.1208, 'grad_norm': 0.9361035162978071, 'learning_rate': 2.353783308406542e-06, 'epoch': 0.69} 69%|██████▊ | 23556/34278 [25:50:32<12:28:31, 4.19s/it] 
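The recurring `PIL.UnidentifiedImageError` above means the bytes fetched for a sample are not a decodable image (a truncated or corrupt object in storage), and the dataset logs "Failed to fetch sample N." instead of crashing. A minimal sketch of that tolerance pattern, assuming nothing about the actual repo code (`SkippingLoader` and its parameters are invented names for illustration):

```python
class SkippingLoader:
    """Hypothetical wrapper around an image-decoding callable.

    Instead of letting a decode error (e.g. PIL.UnidentifiedImageError)
    propagate and kill the training loop, it logs and returns None so
    the caller can skip or resample.
    """

    def __init__(self, loader, exceptions=(Exception,)):
        self.loader = loader          # e.g. a pil_loader-style callable
        self.exceptions = exceptions  # which errors count as "bad sample"

    def __call__(self, payload):
        try:
            return self.loader(payload)
        except self.exceptions as exc:
            print(f"Exception: {exc}")
            return None               # caller treats None as a failed fetch
```

A caller would check for `None` and fall back to another sample, which matches the log's pattern of reporting the failure and continuing training.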
69%|██████▊ | 23557/34278 [25:50:35<11:55:45, 4.01s/it] {'loss': 0.1457, 'grad_norm': 0.842792622859444, 'learning_rate': 2.3533824739696177e-06, 'epoch': 0.69}
69%|██████▊ | 23558/34278 [25:50:38<11:17:32, 3.79s/it] {'loss': 0.1268, 'grad_norm': 0.9782272165786156, 'learning_rate': 2.352981663161065e-06, 'epoch': 0.69}
69%|██████▊ | 23559/34278 [25:50:45<13:22:01, 4.49s/it] {'loss': 0.1263, 'grad_norm': 0.8433063502457212, 'learning_rate': 2.3525808759844597e-06, 'epoch': 0.69}
69%|██████▊ | 23560/34278 [25:50:48<12:35:29, 4.23s/it] {'loss': 0.1188, 'grad_norm': 0.8262014928374279, 'learning_rate': 2.3521801124433785e-06, 'epoch': 0.69}
69%|██████▊ | 23561/34278 [25:50:54<13:48:33, 4.64s/it] {'loss': 0.104, 'grad_norm': 0.8564325546700844, 'learning_rate': 2.3517793725414012e-06, 'epoch': 0.69}
69%|██████▊ | 23562/34278 [25:50:57<12:19:51, 4.14s/it] {'loss': 0.12, 'grad_norm': 0.8299099115305489, 'learning_rate': 2.3513786562821074e-06, 'epoch': 0.69}
69%|██████▊ | 23563/34278 [25:51:00<11:13:36, 3.77s/it] {'loss': 0.136, 'grad_norm': 0.8302422510429294, 'learning_rate': 2.350977963669073e-06, 'epoch': 0.69}
69%|██████▊ | 23564/34278 [25:51:03<11:11:42, 3.76s/it] {'loss': 0.1236, 'grad_norm': 1.0926938136093967, 'learning_rate': 2.3505772947058724e-06, 'epoch': 0.69}
69%|██████▊ | 23565/34278 [25:51:07<11:08:32, 3.74s/it] {'loss': 0.1447, 'grad_norm': 0.8118820649428803, 'learning_rate': 2.3501766493960877e-06, 'epoch': 0.69}
69%|██████▊ | 23566/34278 [25:51:11<11:02:28, 3.71s/it] {'loss': 0.1215, 'grad_norm': 0.7483750510343007, 'learning_rate': 2.349776027743293e-06, 'epoch': 0.69}
69%|██████▉ | 23567/34278 [25:51:16<12:39:33, 4.25s/it] {'loss': 0.1183, 'grad_norm': 0.9237602169403718, 'learning_rate': 2.3493754297510633e-06, 'epoch': 0.69}
69%|██████▉ | 23568/34278 [25:51:21<12:41:12, 4.26s/it] {'loss': 0.0973, 'grad_norm': 0.9099250676978813, 'learning_rate': 2.3489748554229776e-06, 'epoch': 0.69}
69%|██████▉ | 23569/34278 [25:51:24<11:36:29, 3.90s/it] {'loss': 0.1191, 'grad_norm': 0.8952863050396787, 'learning_rate': 2.348574304762613e-06, 'epoch': 0.69}
69%|██████▉ | 23570/34278 [25:51:26<10:40:45, 3.59s/it] {'loss': 0.1178, 'grad_norm': 0.8276489452699551, 'learning_rate': 2.3481737777735442e-06, 'epoch': 0.69}
69%|██████▉ | 23571/34278 [25:51:30<10:21:40, 3.48s/it] {'loss': 0.1413, 'grad_norm': 1.82843189096159, 'learning_rate': 2.3477732744593447e-06, 'epoch': 0.69}
69%|██████▉ | 23572/34278 [25:51:34<11:20:43, 3.82s/it] {'loss': 0.1161, 'grad_norm': 0.906001687367933, 'learning_rate': 2.3473727948235942e-06, 'epoch': 0.69}
69%|██████▉ | 23573/34278 [25:51:39<11:43:47, 3.94s/it] {'loss': 0.1358, 'grad_norm': 0.7681955824426587, 'learning_rate': 2.3469723388698647e-06, 'epoch': 0.69}
69%|██████▉ | 23574/34278 [25:51:42<11:13:38, 3.78s/it] {'loss': 0.1245, 'grad_norm': 0.8080897467609642, 'learning_rate': 2.3465719066017323e-06, 'epoch': 0.69}
69%|██████▉ | 23575/34278 [25:51:45<10:48:30, 3.64s/it] {'loss': 0.1013, 'grad_norm': 0.8433169982377376, 'learning_rate': 2.3461714980227744e-06, 'epoch': 0.69}
69%|██████▉ | 23576/34278 [25:51:49<11:02:55, 3.72s/it] {'loss': 0.1329, 'grad_norm': 0.8831885607690425, 'learning_rate': 2.345771113136564e-06, 'epoch': 0.69}
69%|██████▉ | 23577/34278 [25:51:52<10:30:30, 3.54s/it] {'loss': 0.1281, 'grad_norm': 0.8231088923519766, 'learning_rate': 2.345370751946674e-06, 'epoch': 0.69}
69%|██████▉ | 23578/34278 [25:51:55<10:04:47, 3.39s/it] {'loss': 0.1291, 'grad_norm': 0.8871687192229474, 'learning_rate': 2.3449704144566817e-06, 'epoch': 0.69}
69%|██████▉ | 23579/34278 [25:51:58<9:46:23, 3.29s/it] {'loss': 0.1237, 'grad_norm': 0.8846479280956788, 'learning_rate': 2.3445701006701576e-06, 'epoch': 0.69}
69%|██████▉ | 23580/34278 [25:52:01<9:30:58, 3.20s/it] {'loss': 0.1027, 'grad_norm': 0.8714965548841259, 'learning_rate': 2.34416981059068e-06, 'epoch': 0.69}
69%|██████▉ | 23581/34278 [25:52:04<9:10:01, 3.09s/it] {'loss': 0.1043, 'grad_norm': 0.8758882087340438, 'learning_rate': 2.3437695442218184e-06, 'epoch': 0.69}
69%|██████▉ | 23582/34278 [25:52:07<9:15:32, 3.12s/it] {'loss': 0.105, 'grad_norm': 0.7658119347014151, 'learning_rate': 2.3433693015671498e-06, 'epoch': 0.69}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff041964680>
Failed to fetch sample 3641705.
Exception: cannot identify image file <_io.BytesIO object at 0x7ff041964680>
69%|██████▉ | 23583/34278 [25:52:11<9:38:56, 3.25s/it] {'loss': 0.1235, 'grad_norm': 0.9644642036937223, 'learning_rate': 2.3429690826302464e-06, 'epoch': 0.69}
69%|██████▉ | 23584/34278 [25:52:17<12:10:31, 4.10s/it] {'loss': 0.1224, 'grad_norm': 0.7584404917842424, 'learning_rate': 2.3425688874146787e-06, 'epoch': 0.69}
69%|██████▉ | 23585/34278 [25:52:20<11:20:42, 3.82s/it] {'loss': 0.1183, 'grad_norm': 1.0403847680416156, 'learning_rate': 2.3421687159240214e-06, 'epoch': 0.69}
69%|██████▉ | 23586/34278 [25:52:24<11:41:00, 3.93s/it] {'loss': 0.1051, 'grad_norm': 0.9765791641162657, 'learning_rate': 2.341768568161849e-06, 'epoch': 0.69}
69%|██████▉ | 23587/34278 [25:52:28<11:15:07, 3.79s/it] {'loss': 0.0974, 'grad_norm': 0.8023847088816684, 'learning_rate': 2.341368444131733e-06, 'epoch': 0.69}
69%|██████▉ | 23588/34278 [25:52:34<13:12:22, 4.45s/it] {'loss': 0.1291, 'grad_norm': 1.3936574373681199, 'learning_rate': 2.3409683438372427e-06, 'epoch': 0.69}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
69%|██████▉ | 23589/34278 [25:52:38<12:35:05, 4.24s/it] {'loss': 0.1296, 'grad_norm': 0.9726354913736999, 'learning_rate': 2.3405682672819534e-06, 'epoch': 0.69}
69%|██████▉ | 23590/34278 [25:52:42<12:26:43, 4.19s/it] {'loss': 0.1268, 'grad_norm': 1.2558747537163648, 'learning_rate': 2.3401682144694347e-06, 'epoch': 0.69}
69%|██████▉ | 23591/34278 [25:52:44<11:17:12, 3.80s/it] {'loss': 0.1142, 'grad_norm': 1.049934863308751, 'learning_rate': 2.339768185403259e-06, 'epoch': 0.69}
69%|██████▉ | 23592/34278 [25:52:48<10:38:49, 3.59s/it] {'loss': 0.1105, 'grad_norm': 0.9673425357026465, 'learning_rate': 2.339368180087e-06, 'epoch': 0.69}
69%|██████▉ | 23593/34278 [25:52:51<10:24:55, 3.51s/it] {'loss': 0.1207, 'grad_norm': 0.8431165119550438, 'learning_rate': 2.338968198524226e-06, 'epoch': 0.69}
69%|██████▉ | 23594/34278 [25:52:55<10:38:01, 3.58s/it] {'loss': 0.1051, 'grad_norm': 0.934694810059534, 'learning_rate': 2.338568240718508e-06, 'epoch': 0.69}
69%|██████▉ | 23595/34278 [25:52:59<10:59:17, 3.70s/it] {'loss': 0.1272, 'grad_norm': 1.0677053958956826, 'learning_rate': 2.3381683066734182e-06, 'epoch': 0.69}
69%|██████▉ | 23596/34278 [25:53:02<10:52:30, 3.67s/it] {'loss': 0.1262, 'grad_norm': 0.8241464009186282, 'learning_rate': 2.3377683963925252e-06, 'epoch': 0.69}
69%|██████▉ | 23597/34278 [25:53:05<10:15:09, 3.46s/it] {'loss': 0.11, 'grad_norm': 0.8876436793994138, 'learning_rate': 2.3373685098794017e-06, 'epoch': 0.69}
69%|██████▉ | 23598/34278 [25:53:11<12:24:37, 4.18s/it] {'loss': 0.119, 'grad_norm': 0.7850310122455091, 'learning_rate': 2.336968647137615e-06, 'epoch': 0.69}
69%|██████▉ | 23599/34278 [25:53:15<11:45:51, 3.97s/it] {'loss': 0.111, 'grad_norm': 0.762904058366715, 'learning_rate': 2.3365688081707383e-06, 'epoch': 0.69}
69%|██████▉ | 23600/34278 [25:53:17<10:52:14, 3.66s/it] {'loss': 0.1203, 'grad_norm': 0.8473425197337645, 'learning_rate': 2.3361689929823396e-06, 'epoch': 0.69}
69%|██████▉ | 23601/34278 [25:53:21<10:45:01, 3.62s/it] {'loss': 0.1177, 'grad_norm': 0.7210537937423882, 'learning_rate': 2.335769201575986e-06, 'epoch': 0.69}
69%|██████▉ | 23602/34278 [25:53:26<11:32:10, 3.89s/it] {'loss': 0.1084, 'grad_norm': 0.6633151120612071, 'learning_rate': 2.335369433955249e-06, 'epoch': 0.69}
69%|██████▉ | 23603/34278 [25:53:30<11:47:51, 3.98s/it] {'loss': 0.1313, 'grad_norm': 0.8600509904948564, 'learning_rate': 2.3349696901236995e-06, 'epoch': 0.69}
69%|██████▉ | 23604/34278 [25:53:34<12:01:51, 4.06s/it] {'loss': 0.1178, 'grad_norm': 0.6893909561461504, 'learning_rate': 2.334569970084903e-06, 'epoch': 0.69}
69%|██████▉ | 23605/34278 [25:53:38<12:01:41, 4.06s/it] {'loss': 0.1129, 'grad_norm': 0.698221538234911, 'learning_rate': 2.334170273842431e-06, 'epoch': 0.69}
69%|██████▉ | 23606/34278 [25:53:42<11:35:52, 3.91s/it] {'loss': 0.1436, 'grad_norm': 0.8706561084080148, 'learning_rate': 2.3337706013998508e-06, 'epoch': 0.69}
69%|██████▉ | 23607/34278 [25:53:47<12:35:23, 4.25s/it] {'loss': 0.1254, 'grad_norm': 0.7001568737571077, 'learning_rate': 2.333370952760728e-06, 'epoch': 0.69}
69%|██████▉ | 23608/34278 [25:53:52<13:27:03, 4.54s/it] {'loss': 0.1285, 'grad_norm': 0.7872645545012035, 'learning_rate': 2.3329713279286325e-06, 'epoch': 0.69}
69%|██████▉ | 23609/34278 [25:53:57<13:52:22, 4.68s/it] {'loss': 0.1594, 'grad_norm': 0.8688513455437255, 'learning_rate': 2.3325717269071346e-06, 'epoch': 0.69}
69%|██████▉ | 23610/34278 [25:54:00<12:05:19, 4.08s/it] {'loss': 0.1018, 'grad_norm': 0.9447196492204607, 'learning_rate': 2.332172149699799e-06, 'epoch': 0.69}
69%|██████▉ | 23611/34278 [25:54:04<12:29:32, 4.22s/it] {'loss': 0.1203, 'grad_norm': 3.555944155520488, 'learning_rate': 2.3317725963101923e-06, 'epoch': 0.69}
69%|██████▉ | 23612/34278 [25:54:10<14:04:05, 4.75s/it] {'loss': 0.1391, 'grad_norm': 1.043843622347159, 'learning_rate': 2.3313730667418846e-06, 'epoch': 0.69}
69%|██████▉ | 23613/34278 [25:54:14<13:16:09, 4.48s/it] {'loss': 0.1444, 'grad_norm': 1.1034160358026852, 'learning_rate': 2.3309735609984414e-06, 'epoch': 0.69}
69%|██████▉ | 23614/34278 [25:54:20<14:36:11, 4.93s/it] {'loss': 0.1004, 'grad_norm': 0.7520143710781712, 'learning_rate': 2.3305740790834263e-06, 'epoch': 0.69}
69%|██████▉ | 23615/34278 [25:54:23<13:04:52, 4.42s/it] {'loss': 0.1314, 'grad_norm': 0.770345667529433, 'learning_rate': 2.3301746210004094e-06, 'epoch': 0.69}
69%|██████▉ | 23616/34278 [25:54:27<12:25:58, 4.20s/it] {'loss': 0.1275, 'grad_norm': 0.7418194805544203, 'learning_rate': 2.3297751867529578e-06, 'epoch': 0.69}
69%|██████▉ | 23617/34278 [25:54:33<13:53:39, 4.69s/it] {'loss': 0.1157, 'grad_norm': 0.991478484394904, 'learning_rate': 2.329375776344636e-06, 'epoch': 0.69}
69%|██████▉ | 23618/34278 [25:54:36<12:29:25, 4.22s/it] {'loss': 0.1095, 'grad_norm': 0.722884649775238, 'learning_rate': 2.328976389779008e-06, 'epoch': 0.69}
69%|██████▉ | 23619/34278 [25:54:40<12:11:03, 4.12s/it] {'loss': 0.1126, 'grad_norm': 0.7579140561546834, 'learning_rate': 2.3285770270596424e-06, 'epoch': 0.69}
69%|██████▉ | 23620/34278 [25:54:43<11:57:18, 4.04s/it] {'loss': 0.1139, 'grad_norm': 0.7967802307443903, 'learning_rate': 2.328177688190102e-06, 'epoch': 0.69}
69%|██████▉ | 23621/34278 [25:54:47<11:53:20, 4.02s/it] {'loss': 0.1157, 'grad_norm': 0.8044319982174284, 'learning_rate': 2.3277783731739532e-06, 'epoch': 0.69}
69%|██████▉ | 23622/34278 [25:54:51<11:46:27, 3.98s/it] {'loss': 0.12, 'grad_norm': 0.593504956725694, 'learning_rate': 2.3273790820147634e-06, 'epoch': 0.69}
69%|██████▉ | 23623/34278 [25:54:57<13:38:27, 4.61s/it] {'loss': 0.1277, 'grad_norm': 0.934777912502349, 'learning_rate': 2.326979814716095e-06, 'epoch': 0.69}
69%|██████▉ | 23624/34278 [25:55:03<14:43:20, 4.97s/it] {'loss': 0.1304, 'grad_norm': 0.9523415607532826, 'learning_rate': 2.326580571281511e-06, 'epoch': 0.69}
69%|██████▉ | 23625/34278 [25:55:07<13:42:23, 4.63s/it] {'loss': 0.108, 'grad_norm': 0.7213247178173505, 'learning_rate': 2.3261813517145787e-06, 'epoch': 0.69}
69%|██████▉ | 23626/34278 [25:55:11<13:04:05, 4.42s/it] {'loss': 0.0956, 'grad_norm': 0.5879304315447973, 'learning_rate': 2.32578215601886e-06, 'epoch': 0.69}
69%|██████▉ | 23627/34278 [25:55:14<11:51:44, 4.01s/it] {'loss': 0.1355, 'grad_norm': 0.8829091017552038, 'learning_rate': 2.325382984197921e-06, 'epoch': 0.69}
69%|██████▉ | 23628/34278 [25:55:17<11:16:03, 3.81s/it] {'loss': 0.0971, 'grad_norm': 1.078545661877942, 'learning_rate': 2.3249838362553224e-06, 'epoch': 0.69}
69%|██████▉ | 23629/34278 [25:55:21<10:57:36, 3.71s/it] {'loss': 0.111, 'grad_norm': 0.8490332553845026, 'learning_rate': 2.3245847121946314e-06, 'epoch': 0.69}
69%|██████▉ | 23630/34278 [25:55:24<10:29:15, 3.55s/it] {'loss': 0.1227, 'grad_norm': 0.7672113373591892, 'learning_rate': 2.3241856120194094e-06, 'epoch': 0.69}
69%|██████▉ | 23631/34278 [25:55:30<12:35:18, 4.26s/it] {'loss': 0.1175, 'grad_norm': 1.0170675625170897, 'learning_rate': 2.3237865357332185e-06, 'epoch': 0.69}
69%|██████▉ | 23632/34278 [25:55:33<11:27:54, 3.88s/it] {'loss': 0.1017, 'grad_norm': 0.7222560693625352, 'learning_rate': 2.3233874833396213e-06, 'epoch': 0.69}
69%|██████▉ | 23633/34278 [25:55:36<11:09:50, 3.78s/it] {'loss': 0.1212, 'grad_norm': 0.9561382202915274, 'learning_rate': 2.3229884548421844e-06, 'epoch': 0.69}
69%|██████▉ | 23634/34278 [25:55:42<13:07:12, 4.44s/it] {'loss': 0.1221, 'grad_norm': 0.7687430226846184, 'learning_rate': 2.322589450244465e-06, 'epoch': 0.69}
69%|██████▉ | 23635/34278 [25:55:47<12:53:45, 4.36s/it] {'loss': 0.1118, 'grad_norm': 0.8756176070118019, 'learning_rate': 2.32219046955003e-06, 'epoch': 0.69}
69%|██████▉ | 23636/34278 [25:55:50<12:23:08, 4.19s/it] {'loss': 0.1428, 'grad_norm': 0.8219276207287353, 'learning_rate': 2.32179151276244e-06, 'epoch': 0.69}
69%|██████▉ | 23637/34278 [25:55:54<12:09:03, 4.11s/it] {'loss': 0.1393, 'grad_norm': 0.7935008080019541, 'learning_rate': 2.3213925798852534e-06, 'epoch': 0.69}
69%|██████▉ | 23638/34278 [25:55:58<11:18:10, 3.82s/it] {'loss': 0.1336, 'grad_norm': 1.013543929059074, 'learning_rate': 2.3209936709220343e-06, 'epoch': 0.69}
69%|██████▉ | 23639/34278 [25:56:00<10:19:15, 3.49s/it] {'loss': 0.1259, 'grad_norm': 0.8030855768031363, 'learning_rate': 2.320594785876346e-06, 'epoch': 0.69}
69%|██████▉ | 23640/34278 [25:56:04<10:12:44, 3.46s/it] {'loss': 0.1383, 'grad_norm': 0.9639021359278168, 'learning_rate': 2.320195924751748e-06, 'epoch': 0.69}
69%|██████▉ | 23641/34278 [25:56:07<10:24:33, 3.52s/it] {'loss': 0.1281, 'grad_norm': 0.8700975954012806, 'learning_rate': 2.3197970875517995e-06, 'epoch': 0.69}
69%|██████▉ | 23642/34278 [25:56:10<9:54:12, 3.35s/it] {'loss': 0.1443, 'grad_norm': 0.8136724596954176, 'learning_rate': 2.3193982742800647e-06, 'epoch': 0.69}
69%|██████▉ | 23643/34278 [25:56:14<10:11:07, 3.45s/it] {'loss': 0.1379, 'grad_norm': 1.0189556984596226, 'learning_rate': 2.3189994849401015e-06, 'epoch': 0.69}
69%|██████▉ | 23644/34278 [25:56:20<12:35:51, 4.26s/it] {'loss': 0.1059, 'grad_norm': 0.9742464893363215, 'learning_rate': 2.31860071953547e-06, 'epoch': 0.69}
69%|██████▉ | 23645/34278 [25:56:24<11:56:00, 4.04s/it] {'loss': 0.1269, 'grad_norm': 0.7963701868180952, 'learning_rate': 2.31820197806973e-06, 'epoch': 0.69}
69%|██████▉ | 23646/34278 [25:56:29<12:46:22, 4.32s/it] {'loss': 0.131, 'grad_norm': 0.8386964550617722, 'learning_rate': 2.317803260546445e-06, 'epoch': 0.69}
69%|██████▉ | 23647/34278 [25:56:32<11:36:12, 3.93s/it] {'loss': 0.1367, 'grad_norm': 0.9893009468384862, 'learning_rate': 2.3174045669691724e-06, 'epoch': 0.69}
69%|██████▉ | 23648/34278 [25:56:35<10:55:59, 3.70s/it] {'loss': 0.1219, 'grad_norm': 0.8806406178431144, 'learning_rate': 2.3170058973414696e-06, 'epoch': 0.69}
69%|██████▉ | 23649/34278 [25:56:38<10:18:06, 3.49s/it] {'loss': 0.1331, 'grad_norm': 0.9552274158140003, 'learning_rate': 2.3166072516668992e-06, 'epoch': 0.69}
69%|██████▉ | 23650/34278 [25:56:40<9:36:35, 3.26s/it] {'loss': 0.1157, 'grad_norm': 1.0930118529298434, 'learning_rate': 2.316208629949017e-06, 'epoch': 0.69}
69%|██████▉ | 23651/34278 [25:56:46<11:31:21, 3.90s/it] {'loss': 0.1108, 'grad_norm': 0.7272985772568562, 'learning_rate': 2.3158100321913836e-06, 'epoch': 0.69}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
69%|██████▉ | 23652/34278 [25:56:51<12:17:20, 4.16s/it] {'loss': 0.1249, 'grad_norm': 0.8064813106794204, 'learning_rate': 2.315411458397559e-06, 'epoch': 0.69}
69%|██████▉ | 23653/34278 [25:56:54<11:15:58, 3.82s/it] {'loss': 0.12, 'grad_norm': 0.9704866191529995, 'learning_rate': 2.3150129085710998e-06, 'epoch': 0.69}
69%|██████▉ | 23654/34278 [25:56:57<10:59:44, 3.73s/it] {'loss': 0.113, 'grad_norm': 0.9910425680357245, 'learning_rate': 2.314614382715563e-06, 'epoch': 0.69}
69%|██████▉ | 23655/34278 [25:57:00<10:24:26, 3.53s/it] {'loss': 0.123, 'grad_norm': 0.9334528401558666, 'learning_rate': 2.31421588083451e-06, 'epoch': 0.69}
69%|██████▉ | 23656/34278 [25:57:03<10:04:50, 3.42s/it] {'loss': 0.1528, 'grad_norm': 0.9126471488991307, 'learning_rate': 2.313817402931494e-06, 'epoch': 0.69}
69%|██████▉ | 23657/34278 [25:57:07<10:41:01, 3.62s/it] {'loss': 0.1204, 'grad_norm': 0.9908774099220438, 'learning_rate': 2.3134189490100773e-06, 'epoch': 0.69}
69%|██████▉ | 23658/34278 [25:57:12<11:38:45, 3.95s/it] {'loss': 0.1182, 'grad_norm': 0.819538434326238, 'learning_rate': 2.313020519073813e-06, 'epoch': 0.69}
69%|██████▉ | 23659/34278 [25:57:16<11:10:33, 3.79s/it] {'loss': 0.1181, 'grad_norm': 0.800339781816359, 'learning_rate': 2.3126221131262614e-06, 'epoch': 0.69}
69%|██████▉ | 23660/34278 [25:57:19<10:33:45, 3.58s/it] {'loss': 0.1153, 'grad_norm': 0.8457097695409919, 'learning_rate': 2.312223731170979e-06, 'epoch': 0.69}
69%|██████▉ | 23661/34278 [25:57:22<10:25:51, 3.54s/it] {'loss': 0.1182, 'grad_norm': 0.95150576897296, 'learning_rate': 2.3118253732115186e-06, 'epoch': 0.69}
69%|██████▉ | 23662/34278 [25:57:25<10:09:12, 3.44s/it] {'loss': 0.1168, 'grad_norm': 0.8905554901884798, 'learning_rate': 2.3114270392514404e-06, 'epoch': 0.69}
69%|██████▉ | 23663/34278 [25:57:29<10:44:17, 3.64s/it] {'loss': 0.1083, 'grad_norm': 1.042156377872879, 'learning_rate': 2.311028729294301e-06, 'epoch': 0.69}
69%|██████▉ | 23664/34278 [25:57:35<12:49:26, 4.35s/it] {'loss': 0.1129, 'grad_norm': 0.8702989898114386, 'learning_rate': 2.310630443343654e-06, 'epoch': 0.69}
69%|██████▉ | 23665/34278 [25:57:39<11:58:16, 4.06s/it] {'loss': 0.1381, 'grad_norm': 1.0316251151985667, 'learning_rate': 2.3102321814030577e-06, 'epoch': 0.69}
69%|██████▉ | 23666/34278 [25:57:42<11:01:25, 3.74s/it] {'loss': 0.1058, 'grad_norm': 1.038377239359597, 'learning_rate': 2.309833943476067e-06, 'epoch': 0.69}
69%|██████▉ | 23667/34278 [25:57:46<10:59:34, 3.73s/it] {'loss': 0.1254, 'grad_norm': 0.8366944025737572, 'learning_rate': 2.309435729566234e-06, 'epoch': 0.69}
69%|██████▉ | 23668/34278 [25:57:49<10:25:47, 3.54s/it] {'loss': 0.1089, 'grad_norm': 0.6714876382825048, 'learning_rate': 2.309037539677117e-06, 'epoch': 0.69}
69%|██████▉ | 23669/34278 [25:57:52<10:06:10, 3.43s/it] {'loss': 0.1167, 'grad_norm': 0.91272258081961, 'learning_rate': 2.3086393738122718e-06, 'epoch': 0.69}
69%|██████▉ | 23670/34278 [25:57:55<9:36:38, 3.26s/it] {'loss': 0.142, 'grad_norm': 1.1446665184893436, 'learning_rate': 2.3082412319752525e-06, 'epoch': 0.69}
69%|██████▉ | 23671/34278 [25:57:58<9:37:35, 3.27s/it] {'loss': 0.1242, 'grad_norm': 0.9077590475189545, 'learning_rate': 2.307843114169611e-06, 'epoch': 0.69}
69%|██████▉ | 23672/34278 [25:58:02<10:39:48, 3.62s/it] {'loss': 0.1024, 'grad_norm': 0.6449499241531631, 'learning_rate': 2.3074450203989046e-06, 'epoch': 0.69}
69%|██████▉ | 23673/34278 [25:58:05<10:05:21, 3.42s/it] {'loss': 0.1385, 'grad_norm': 1.0168014356209796, 'learning_rate': 2.307046950666687e-06, 'epoch': 0.69}
69%|██████▉ | 23674/34278 [25:58:09<10:34:27, 3.59s/it] {'loss': 0.1162, 'grad_norm': 0.9092898333155542, 'learning_rate': 2.3066489049765096e-06, 'epoch': 0.69}
69%|██████▉ | 23675/34278 [25:58:15<12:34:22, 4.27s/it] {'loss': 0.1428, 'grad_norm': 0.9193741929850859, 'learning_rate': 2.3062508833319273e-06, 'epoch': 0.69}
69%|██████▉ | 23676/34278 [25:58:19<11:58:46, 4.07s/it] {'loss': 0.1107, 'grad_norm': 0.8523432817325961, 'learning_rate': 2.3058528857364963e-06, 'epoch': 0.69}
69%|██████▉ | 23677/34278 [25:58:22<10:52:39, 3.69s/it] {'loss': 0.1242, 'grad_norm': 0.8534216923236597, 'learning_rate': 2.3054549121937674e-06, 'epoch': 0.69}
69%|██████▉ | 23678/34278 [25:58:25<10:48:26, 3.67s/it] {'loss': 0.1297, 'grad_norm': 0.9272245904195249, 'learning_rate': 2.305056962707292e-06, 'epoch': 0.69}
69%|██████▉ | 23679/34278 [25:58:29<10:39:30, 3.62s/it] {'loss': 0.1071, 'grad_norm':
1.1494664409395687, 'learning_rate': 2.3046590372806268e-06, 'epoch': 0.69} 69%|██████▉ | 23679/34278 [25:58:29<10:39:30, 3.62s/it] 69%|██████▉ | 23680/34278 [25:58:32<10:03:11, 3.41s/it] {'loss': 0.1232, 'grad_norm': 0.9187653162047094, 'learning_rate': 2.30426113591732e-06, 'epoch': 0.69} 69%|██████▉ | 23680/34278 [25:58:32<10:03:11, 3.41s/it] 69%|██████▉ | 23681/34278 [25:58:35<9:45:37, 3.32s/it] {'loss': 0.1436, 'grad_norm': 0.8784483484111315, 'learning_rate': 2.3038632586209264e-06, 'epoch': 0.69} 69%|██████▉ | 23681/34278 [25:58:35<9:45:37, 3.32s/it] 69%|██████▉ | 23682/34278 [25:58:39<10:24:20, 3.54s/it] {'loss': 0.1136, 'grad_norm': 0.8719874068627423, 'learning_rate': 2.303465405395e-06, 'epoch': 0.69} 69%|██████▉ | 23682/34278 [25:58:39<10:24:20, 3.54s/it] 69%|██████▉ | 23683/34278 [25:58:44<11:48:11, 4.01s/it] {'loss': 0.1352, 'grad_norm': 1.1241996492284672, 'learning_rate': 2.3030675762430906e-06, 'epoch': 0.69} 69%|██████▉ | 23683/34278 [25:58:44<11:48:11, 4.01s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 69%|██████▉ | 23684/34278 [25:58:47<10:56:03, 3.72s/it] {'loss': 0.1456, 'grad_norm': 0.9061017200117076, 'learning_rate': 2.3026697711687477e-06, 'epoch': 0.69} 69%|██████▉ | 23684/34278 [25:58:47<10:56:03, 3.72s/it] 69%|██████▉ | 23685/34278 [25:58:50<10:28:48, 3.56s/it] {'loss': 0.1281, 'grad_norm': 0.9601401034716334, 'learning_rate': 2.302271990175528e-06, 'epoch': 0.69} 69%|██████▉ | 23685/34278 [25:58:50<10:28:48, 3.56s/it] 69%|██████▉ | 23686/34278 [25:58:53<10:09:39, 3.45s/it] {'loss': 0.11, 'grad_norm': 0.9931266493491848, 'learning_rate': 2.3018742332669775e-06, 'epoch': 0.69} 69%|██████▉ | 23686/34278 [25:58:53<10:09:39, 3.45s/it] 69%|██████▉ | 23687/34278 [25:58:57<10:28:15, 3.56s/it] {'loss': 0.1063, 'grad_norm': 0.8268954279743886, 'learning_rate': 2.301476500446652e-06, 'epoch': 0.69} 69%|██████▉ | 23687/34278 [25:58:57<10:28:15, 3.56s/it] 69%|██████▉ | 23688/34278 [25:59:00<10:06:05, 3.43s/it] {'loss': 0.1133, 'grad_norm': 0.7578368801598571, 'learning_rate': 2.301078791718098e-06, 'epoch': 0.69} 69%|██████▉ | 23688/34278 [25:59:00<10:06:05, 3.43s/it] 69%|██████▉ | 23689/34278 [25:59:03<9:38:14, 3.28s/it] {'loss': 0.1094, 'grad_norm': 1.0595097715640953, 'learning_rate': 2.30068110708487e-06, 'epoch': 0.69} 69%|██████▉ | 23689/34278 [25:59:03<9:38:14, 3.28s/it] 69%|██████▉ | 23690/34278 [25:59:06<9:36:11, 3.27s/it] {'loss': 0.1308, 'grad_norm': 0.7898769517130635, 'learning_rate': 2.300283446550517e-06, 'epoch': 0.69} 69%|██████▉ | 23690/34278 [25:59:06<9:36:11, 3.27s/it] 69%|██████▉ | 23691/34278 [25:59:09<9:11:07, 3.12s/it] {'loss': 0.1293, 'grad_norm': 1.1145970578791549, 'learning_rate': 2.2998858101185873e-06, 'epoch': 0.69} 69%|██████▉ | 23691/34278 [25:59:09<9:11:07, 3.12s/it] 69%|██████▉ | 23692/34278 [25:59:13<9:58:51, 3.39s/it] {'loss': 0.1221, 'grad_norm': 0.836399848990084, 'learning_rate': 2.299488197792632e-06, 'epoch': 0.69} 69%|██████▉ | 23692/34278 [25:59:13<9:58:51, 3.39s/it] 69%|██████▉ | 
23693/34278 [25:59:16<9:48:36, 3.34s/it] {'loss': 0.125, 'grad_norm': 0.7747572602824823, 'learning_rate': 2.2990906095762033e-06, 'epoch': 0.69} 69%|██████▉ | 23693/34278 [25:59:16<9:48:36, 3.34s/it] 69%|██████▉ | 23694/34278 [25:59:20<9:59:32, 3.40s/it] {'loss': 0.1108, 'grad_norm': 0.9102249540121757, 'learning_rate': 2.2986930454728474e-06, 'epoch': 0.69} 69%|██████▉ | 23694/34278 [25:59:20<9:59:32, 3.40s/it] 69%|██████▉ | 23695/34278 [25:59:24<10:20:15, 3.52s/it] {'loss': 0.1251, 'grad_norm': 0.7265958771338673, 'learning_rate': 2.2982955054861166e-06, 'epoch': 0.69} 69%|██████▉ | 23695/34278 [25:59:24<10:20:15, 3.52s/it] 69%|██████▉ | 23696/34278 [25:59:28<10:59:27, 3.74s/it] {'loss': 0.1257, 'grad_norm': 0.9534936034246931, 'learning_rate': 2.2978979896195587e-06, 'epoch': 0.69} 69%|██████▉ | 23696/34278 [25:59:28<10:59:27, 3.74s/it] 69%|██████▉ | 23697/34278 [25:59:31<10:28:09, 3.56s/it] {'loss': 0.1051, 'grad_norm': 0.7784759412891694, 'learning_rate': 2.2975004978767206e-06, 'epoch': 0.69} 69%|██████▉ | 23697/34278 [25:59:31<10:28:09, 3.56s/it] 69%|██████▉ | 23698/34278 [25:59:34<9:56:01, 3.38s/it] {'loss': 0.1273, 'grad_norm': 1.0035190140289736, 'learning_rate': 2.297103030261153e-06, 'epoch': 0.69} 69%|██████▉ | 23698/34278 [25:59:34<9:56:01, 3.38s/it] 69%|██████▉ | 23699/34278 [25:59:37<9:51:56, 3.36s/it] {'loss': 0.1235, 'grad_norm': 0.8378814986653959, 'learning_rate': 2.296705586776406e-06, 'epoch': 0.69} 69%|██████▉ | 23699/34278 [25:59:38<9:51:56, 3.36s/it] 69%|██████▉ | 23700/34278 [25:59:41<9:42:45, 3.31s/it] {'loss': 0.1148, 'grad_norm': 0.7593074884696484, 'learning_rate': 2.2963081674260267e-06, 'epoch': 0.69} 69%|██████▉ | 23700/34278 [25:59:41<9:42:45, 3.31s/it] 69%|██████▉ | 23701/34278 [25:59:44<9:28:12, 3.22s/it] {'loss': 0.1188, 'grad_norm': 0.9424191104104974, 'learning_rate': 2.2959107722135603e-06, 'epoch': 0.69} 69%|██████▉ | 23701/34278 [25:59:44<9:28:12, 3.22s/it] 69%|██████▉ | 23702/34278 [25:59:47<9:29:15, 3.23s/it] {'loss': 
0.1372, 'grad_norm': 0.9697443066315308, 'learning_rate': 2.295513401142559e-06, 'epoch': 0.69} 69%|██████▉ | 23702/34278 [25:59:47<9:29:15, 3.23s/it] 69%|██████▉ | 23703/34278 [25:59:50<9:29:30, 3.23s/it] {'loss': 0.1226, 'grad_norm': 0.8505360642343872, 'learning_rate': 2.2951160542165684e-06, 'epoch': 0.69} 69%|██████▉ | 23703/34278 [25:59:50<9:29:30, 3.23s/it] 69%|██████▉ | 23704/34278 [25:59:54<9:43:10, 3.31s/it] {'loss': 0.0891, 'grad_norm': 0.6855969249332672, 'learning_rate': 2.2947187314391346e-06, 'epoch': 0.69} 69%|██████▉ | 23704/34278 [25:59:54<9:43:10, 3.31s/it] 69%|██████▉ | 23705/34278 [25:59:57<9:51:54, 3.36s/it] {'loss': 0.1031, 'grad_norm': 0.906423769640861, 'learning_rate': 2.294321432813805e-06, 'epoch': 0.69} 69%|██████▉ | 23705/34278 [25:59:57<9:51:54, 3.36s/it] 69%|██████▉ | 23706/34278 [26:00:01<9:59:00, 3.40s/it] {'loss': 0.1298, 'grad_norm': 0.7999588831953454, 'learning_rate': 2.2939241583441308e-06, 'epoch': 0.69} 69%|██████▉ | 23706/34278 [26:00:01<9:59:00, 3.40s/it] 69%|██████▉ | 23707/34278 [26:00:04<9:38:55, 3.29s/it] {'loss': 0.1134, 'grad_norm': 0.8355790758687139, 'learning_rate': 2.2935269080336555e-06, 'epoch': 0.69} 69%|██████▉ | 23707/34278 [26:00:04<9:38:55, 3.29s/it] 69%|██████▉ | 23708/34278 [26:00:09<11:36:33, 3.95s/it] {'loss': 0.1226, 'grad_norm': 0.7848295123297799, 'learning_rate': 2.2931296818859233e-06, 'epoch': 0.69} 69%|██████▉ | 23708/34278 [26:00:09<11:36:33, 3.95s/it] 69%|██████▉ | 23709/34278 [26:00:15<13:26:42, 4.58s/it] {'loss': 0.1296, 'grad_norm': 1.133574432844111, 'learning_rate': 2.2927324799044858e-06, 'epoch': 0.69} 69%|██████▉ | 23709/34278 [26:00:15<13:26:42, 4.58s/it] 69%|██████▉ | 23710/34278 [26:00:19<12:56:58, 4.41s/it] {'loss': 0.127, 'grad_norm': 0.8236149312082659, 'learning_rate': 2.292335302092884e-06, 'epoch': 0.69} 69%|██████▉ | 23710/34278 [26:00:19<12:56:58, 4.41s/it] 69%|██████▉ | 23711/34278 [26:00:22<11:45:11, 4.00s/it] {'loss': 0.1163, 'grad_norm': 0.8715150804512658, 
'learning_rate': 2.2919381484546665e-06, 'epoch': 0.69} 69%|██████▉ | 23711/34278 [26:00:22<11:45:11, 4.00s/it] 69%|██████▉ | 23712/34278 [26:00:25<10:52:40, 3.71s/it] {'loss': 0.1062, 'grad_norm': 0.9397516361988193, 'learning_rate': 2.2915410189933807e-06, 'epoch': 0.69} 69%|██████▉ | 23712/34278 [26:00:25<10:52:40, 3.71s/it] 69%|██████▉ | 23713/34278 [26:00:31<12:35:45, 4.29s/it] {'loss': 0.124, 'grad_norm': 0.8283909892003528, 'learning_rate': 2.29114391371257e-06, 'epoch': 0.69} 69%|██████▉ | 23713/34278 [26:00:31<12:35:45, 4.29s/it] 69%|██████▉ | 23714/34278 [26:00:34<11:40:22, 3.98s/it] {'loss': 0.1146, 'grad_norm': 0.912337308859186, 'learning_rate': 2.2907468326157777e-06, 'epoch': 0.69} 69%|██████▉ | 23714/34278 [26:00:34<11:40:22, 3.98s/it] 69%|██████▉ | 23715/34278 [26:00:40<12:58:41, 4.42s/it] {'loss': 0.1235, 'grad_norm': 0.8580879529726056, 'learning_rate': 2.290349775706553e-06, 'epoch': 0.69} 69%|██████▉ | 23715/34278 [26:00:40<12:58:41, 4.42s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 69%|██████▉ | 23716/34278 [26:00:45<13:34:48, 4.63s/it] {'loss': 0.1247, 'grad_norm': 0.9868941377315759, 'learning_rate': 2.289952742988437e-06, 'epoch': 0.69} 69%|██████▉ | 23716/34278 [26:00:45<13:34:48, 4.63s/it] 69%|██████▉ | 23717/34278 [26:00:48<11:57:25, 4.08s/it] {'loss': 0.124, 'grad_norm': 1.0810218537199563, 'learning_rate': 2.2895557344649777e-06, 'epoch': 0.69} 69%|██████▉ | 23717/34278 [26:00:48<11:57:25, 4.08s/it] 69%|██████▉ | 23718/34278 [26:00:51<11:37:28, 3.96s/it] {'loss': 0.119, 'grad_norm': 1.0714996206285003, 'learning_rate': 2.2891587501397157e-06, 'epoch': 0.69} 69%|██████▉ | 23718/34278 [26:00:51<11:37:28, 3.96s/it] 69%|██████▉ | 23719/34278 [26:00:55<11:05:09, 3.78s/it] {'loss': 0.1319, 'grad_norm': 1.0334090868066341, 'learning_rate': 2.2887617900161996e-06, 'epoch': 0.69} 69%|██████▉ | 23719/34278 [26:00:55<11:05:09, 3.78s/it] 69%|██████▉ | 23720/34278 [26:00:58<10:57:47, 3.74s/it] {'loss': 0.1351, 'grad_norm': 1.0229856177819148, 'learning_rate': 2.28836485409797e-06, 'epoch': 0.69} 69%|██████▉ | 23720/34278 [26:00:58<10:57:47, 3.74s/it] 69%|██████▉ | 23721/34278 [26:01:02<11:02:35, 3.77s/it] {'loss': 0.1147, 'grad_norm': 1.0630437252729676, 'learning_rate': 2.2879679423885708e-06, 'epoch': 0.69} 69%|██████▉ | 23721/34278 [26:01:02<11:02:35, 3.77s/it] 69%|██████▉ | 23722/34278 [26:01:05<10:38:57, 3.63s/it] {'loss': 0.1241, 'grad_norm': 0.8101866141088259, 'learning_rate': 2.2875710548915464e-06, 'epoch': 0.69} 69%|██████▉ | 23722/34278 [26:01:05<10:38:57, 3.63s/it] 69%|██████▉ | 23723/34278 [26:01:08<10:09:19, 3.46s/it] {'loss': 0.133, 'grad_norm': 1.1746830181502605, 'learning_rate': 2.2871741916104414e-06, 'epoch': 0.69} 69%|██████▉ | 23723/34278 [26:01:09<10:09:19, 3.46s/it] 69%|██████▉ | 23724/34278 [26:01:12<9:54:43, 3.38s/it] {'loss': 0.1341, 'grad_norm': 1.3231516665272993, 'learning_rate': 2.286777352548796e-06, 'epoch': 0.69} 69%|██████▉ | 23724/34278 [26:01:12<9:54:43, 3.38s/it] 
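The repeated `UserWarning` from `torch/utils/checkpoint.py` in the log above typically fires when gradient checkpointing wraps a module none of whose inputs require grad (for example, a frozen vision tower). A minimal sketch reproducing the warning and one common remedy, assuming that cause (for Hugging Face models the equivalent remedy is `model.enable_input_require_grads()`):

```python
import warnings

import torch
from torch.utils.checkpoint import checkpoint

layer = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # requires_grad=False, like a frozen module's input

# Reentrant checkpointing over a no-grad input emits the warning seen above.
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    checkpoint(layer, x, use_reentrant=True)
triggered = any("requires_grad=True" in str(w.message) for w in caught)

# Remedy: make the checkpointed input require grad so backward reaches
# the wrapped parameters.
x.requires_grad_(True)
out = checkpoint(layer, x, use_reentrant=True)
out.sum().backward()  # layer.weight.grad is now populated
```

This only silences the symptom if the frozen inputs are intentional; otherwise it restores gradient flow through the checkpointed block.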
69%|██████▉ | 23725/34278 [26:01:14<9:22:33, 3.20s/it] {'loss': 0.1034, 'grad_norm': 0.7157855577153792, 'learning_rate': 2.2863805377101565e-06, 'epoch': 0.69}
69%|██████▉ | 23726/34278 [26:01:19<10:08:19, 3.46s/it] {'loss': 0.1252, 'grad_norm': 0.6594134682756123, 'learning_rate': 2.2859837470980638e-06, 'epoch': 0.69}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f69138da2f0>
Failed to fetch sample 3657299. Exception: cannot identify image file <_io.BytesIO object at 0x7f69138da2f0>
69%|██████▉ | 23727/34278 [26:01:21<9:35:32, 3.27s/it] {'loss': 0.1151, 'grad_norm': 0.8131794627778848, 'learning_rate': 2.2855869807160588e-06, 'epoch': 0.69}
69%|██████▉ | 23728/34278 [26:01:25<9:39:38, 3.30s/it] {'loss': 0.1341, 'grad_norm': 1.2535305456139525, 'learning_rate': 2.285190238567685e-06, 'epoch': 0.69}
69%|██████▉ | 23729/34278 [26:01:28<10:05:14, 3.44s/it] {'loss': 0.1209, 'grad_norm': 0.778980524726283, 'learning_rate': 2.2847935206564865e-06, 'epoch': 0.69}
69%|██████▉ | 23730/34278 [26:01:32<9:50:02, 3.36s/it] {'loss': 0.1353, 'grad_norm': 0.8349112723321234, 'learning_rate': 2.284396826986003e-06, 'epoch': 0.69}
69%|██████▉ | 23731/34278 [26:01:35<9:45:08, 3.33s/it] {'loss': 0.1287, 'grad_norm': 0.8500357543524237, 'learning_rate': 2.284000157559775e-06, 'epoch': 0.69}
69%|██████▉ | 23732/34278 [26:01:39<10:34:07, 3.61s/it] {'loss': 0.1012, 'grad_norm': 0.7999652808402606, 'learning_rate': 2.2836035123813466e-06, 'epoch': 0.69}
69%|██████▉ | 23733/34278 [26:01:46<13:02:11, 4.45s/it] {'loss': 0.1263, 'grad_norm': 0.967336312167194, 'learning_rate': 2.2832068914542575e-06, 'epoch': 0.69}
69%|██████▉ | 23734/34278 [26:01:52<14:26:06, 4.93s/it] {'loss': 0.1406, 'grad_norm': 0.9032378828977392, 'learning_rate': 2.2828102947820476e-06, 'epoch': 0.69}
69%|██████▉ | 23735/34278 [26:01:55<13:01:43, 4.45s/it] {'loss': 0.1121, 'grad_norm': 0.9399886097246759, 'learning_rate': 2.282413722368258e-06, 'epoch': 0.69}
69%|██████▉ | 23736/34278 [26:02:00<13:08:23, 4.49s/it] {'loss': 0.12, 'grad_norm': 0.8896660325406226, 'learning_rate': 2.282017174216432e-06, 'epoch': 0.69}
69%|██████▉ | 23737/34278 [26:02:03<11:52:50, 4.06s/it] {'loss': 0.1147, 'grad_norm': 0.9735989195568626, 'learning_rate': 2.281620650330108e-06, 'epoch': 0.69}
69%|██████▉ | 23738/34278 [26:02:06<11:28:35, 3.92s/it] {'loss': 0.1269, 'grad_norm': 0.8502541808391574, 'learning_rate': 2.281224150712824e-06, 'epoch': 0.69}
69%|██████▉ | 23739/34278 [26:02:09<10:54:51, 3.73s/it] {'loss': 0.1381, 'grad_norm': 0.7925147321656687, 'learning_rate': 2.2808276753681243e-06, 'epoch': 0.69}
69%|██████▉ | 23740/34278 [26:02:13<10:28:31, 3.58s/it] {'loss': 0.1406, 'grad_norm': 0.7868908178188531, 'learning_rate': 2.280431224299543e-06, 'epoch': 0.69}
69%|██████▉ | 23741/34278 [26:02:16<10:07:46, 3.46s/it] {'loss': 0.1135, 'grad_norm': 1.0087179670265431, 'learning_rate': 2.280034797510623e-06, 'epoch': 0.69}
69%|██████▉ | 23742/34278 [26:02:19<9:49:44, 3.36s/it] {'loss': 0.1129, 'grad_norm': 0.7662286779625004, 'learning_rate': 2.279638395004905e-06, 'epoch': 0.69}
69%|██████▉ | 23743/34278 [26:02:23<10:21:52, 3.54s/it] {'loss': 0.1106, 'grad_norm': 0.8378206059463131, 'learning_rate': 2.279242016785926e-06, 'epoch': 0.69}
69%|██████▉ | 23744/34278 [26:02:26<9:44:37, 3.33s/it] {'loss': 0.1391, 'grad_norm': 0.752022430396605, 'learning_rate': 2.2788456628572227e-06, 'epoch': 0.69}
69%|██████▉ | 23745/34278 [26:02:30<10:22:29, 3.55s/it] {'loss': 0.1302, 'grad_norm': 1.0032264413742915, 'learning_rate': 2.2784493332223375e-06, 'epoch': 0.69}
69%|██████▉ | 23746/34278 [26:02:33<10:10:31, 3.48s/it] {'loss': 0.1223, 'grad_norm': 0.8548099192713413, 'learning_rate': 2.278053027884805e-06, 'epoch': 0.69}
69%|██████▉ | 23747/34278 [26:02:36<9:47:18, 3.35s/it] {'loss': 0.1323, 'grad_norm': 1.001924147167816, 'learning_rate': 2.2776567468481674e-06, 'epoch': 0.69}
69%|██████▉ | 23748/34278 [26:02:39<9:25:27, 3.22s/it] {'loss': 0.1388, 'grad_norm': 0.7231677851603999, 'learning_rate': 2.277260490115959e-06, 'epoch': 0.69}
69%|██████▉ | 23749/34278 [26:02:42<9:23:25, 3.21s/it] {'loss': 0.101, 'grad_norm': 0.8067489608143206, 'learning_rate': 2.2768642576917206e-06, 'epoch': 0.69}
69%|██████▉ | 23750/34278 [26:02:48<11:43:51, 4.01s/it] {'loss': 0.1103, 'grad_norm': 0.9504971563223352, 'learning_rate': 2.2764680495789874e-06, 'epoch': 0.69}
69%|██████▉ | 23751/34278 [26:02:53<12:11:52, 4.17s/it] {'loss': 0.1183, 'grad_norm': 0.8027638111751217, 'learning_rate': 2.2760718657812964e-06, 'epoch': 0.69}
69%|██████▉ | 23752/34278 [26:02:56<11:21:16, 3.88s/it] {'loss': 0.1304, 'grad_norm': 0.9116895444309365, 'learning_rate': 2.275675706302185e-06, 'epoch': 0.69}
69%|██████▉ | 23753/34278 [26:02:59<10:51:54, 3.72s/it] {'loss': 0.116, 'grad_norm': 0.8590398988302214, 'learning_rate': 2.2752795711451926e-06, 'epoch': 0.69}
69%|██████▉ | 23754/34278 [26:03:05<12:48:28, 4.38s/it] {'loss': 0.1227, 'grad_norm': 0.7807675862826012, 'learning_rate': 2.274883460313852e-06, 'epoch': 0.69}
69%|██████▉ | 23755/34278 [26:03:09<12:39:24, 4.33s/it] {'loss': 0.1205, 'grad_norm': 0.8985680026686662, 'learning_rate': 2.274487373811703e-06, 'epoch': 0.69}
69%|██████▉ | 23756/34278 [26:03:12<11:32:26, 3.95s/it] {'loss': 0.0981, 'grad_norm': 0.9048122272525334, 'learning_rate': 2.2740913116422796e-06, 'epoch': 0.69}
69%|██████▉ | 23757/34278 [26:03:15<10:34:21, 3.62s/it] {'loss': 0.088, 'grad_norm': 0.7144588342863268, 'learning_rate': 2.2736952738091173e-06, 'epoch': 0.69}
69%|██████▉ | 23758/34278 [26:03:20<11:21:33, 3.89s/it] {'loss': 0.135, 'grad_norm': 0.6768792673281033, 'learning_rate': 2.273299260315752e-06, 'epoch': 0.69}
69%|██████▉ | 23759/34278 [26:03:24<11:38:23, 3.98s/it] {'loss': 0.1014, 'grad_norm': 0.8819290215802724, 'learning_rate': 2.2729032711657224e-06, 'epoch': 0.69}
69%|██████▉ | 23760/34278 [26:03:28<11:19:12, 3.87s/it] {'loss': 0.09, 'grad_norm': 0.890492511431514, 'learning_rate': 2.272507306362561e-06, 'epoch': 0.69}
69%|██████▉ | 23761/34278 [26:03:31<10:36:08, 3.63s/it] {'loss': 0.1237, 'grad_norm': 0.793613907706273, 'learning_rate': 2.2721113659098013e-06, 'epoch': 0.69}
69%|██████▉ | 23762/34278 [26:03:34<10:09:59, 3.48s/it] {'loss': 0.1342, 'grad_norm': 0.8754751817467925, 'learning_rate': 2.271715449810982e-06, 'epoch': 0.69}
69%|██████▉ | 23763/34278 [26:03:38<10:31:01, 3.60s/it] {'loss': 0.1349, 'grad_norm': 0.8301535246481814, 'learning_rate': 2.271319558069637e-06, 'epoch': 0.69}
69%|██████▉ | 23764/34278 [26:03:43<11:47:41, 4.04s/it] {'loss': 0.1098, 'grad_norm': 0.8639729205611231, 'learning_rate': 2.2709236906892967e-06, 'epoch': 0.69}
69%|██████▉ | 23765/34278 [26:03:49<13:28:27, 4.61s/it] {'loss': 0.1211, 'grad_norm': 0.8410730677761642, 'learning_rate': 2.2705278476734984e-06, 'epoch': 0.69}
69%|██████▉ | 23766/34278 [26:03:52<12:22:12, 4.24s/it] {'loss': 0.1245, 'grad_norm': 0.84913739173186, 'learning_rate': 2.270132029025777e-06, 'epoch': 0.69}
69%|██████▉ | 23767/34278 [26:03:55<11:17:02, 3.86s/it] {'loss': 0.1298, 'grad_norm': 0.8301701775348659, 'learning_rate': 2.2697362347496665e-06, 'epoch': 0.69}
69%|██████▉ | 23768/34278 [26:04:00<12:20:31, 4.23s/it] {'loss': 0.11, 'grad_norm': 0.797361811400939, 'learning_rate': 2.269340464848697e-06, 'epoch': 0.69}
69%|██████▉ | 23769/34278 [26:04:07<14:10:13, 4.85s/it] {'loss': 0.1346, 'grad_norm': 0.7244305386077172, 'learning_rate': 2.268944719326405e-06, 'epoch': 0.69}
69%|██████▉ | 23770/34278 [26:04:10<12:52:49, 4.41s/it] {'loss': 0.1052, 'grad_norm': 0.9910751266872838, 'learning_rate': 2.268548998186321e-06, 'epoch': 0.69}
69%|██████▉ | 23771/34278 [26:04:13<11:26:08, 3.92s/it] {'loss': 0.1017, 'grad_norm': 0.6653589091318827, 'learning_rate': 2.26815330143198e-06, 'epoch': 0.69}
69%|██████▉ | 23772/34278 [26:04:16<11:04:48, 3.80s/it] {'loss': 0.1186, 'grad_norm': 0.8051877190686334, 'learning_rate': 2.2677576290669157e-06, 'epoch': 0.69}
69%|██████▉ | 23773/34278 [26:04:19<10:24:57, 3.57s/it] {'loss': 0.1434, 'grad_norm': 0.7642106374320058, 'learning_rate': 2.267361981094659e-06, 'epoch': 0.69}
69%|██████▉ | 23774/34278 [26:04:23<11:01:31, 3.78s/it] {'loss': 0.1052, 'grad_norm': 0.8472071730476597, 'learning_rate': 2.2669663575187407e-06, 'epoch': 0.69}
69%|██████▉ | 23775/34278 [26:04:26<10:16:02, 3.52s/it] {'loss': 0.1192, 'grad_norm': 0.8994965254843768, 'learning_rate': 2.266570758342696e-06, 'epoch': 0.69}
69%|██████▉ | 23776/34278 [26:04:32<12:28:04, 4.27s/it] {'loss': 0.1448, 'grad_norm': 0.9891177393959779, 'learning_rate': 2.266175183570053e-06, 'epoch': 0.69}
69%|██████▉ | 23777/34278 [26:04:35<11:07:59, 3.82s/it] {'loss': 0.1149, 'grad_norm': 0.8472114327687793, 'learning_rate': 2.2657796332043476e-06, 'epoch': 0.69}
69%|██████▉ | 23778/34278 [26:04:38<10:33:50, 3.62s/it] {'loss': 0.1123, 'grad_norm': 1.0004615352191524, 'learning_rate': 2.265384107249106e-06, 'epoch': 0.69}
69%|██████▉ | 23779/34278 [26:04:43<11:06:09, 3.81s/it] {'loss': 0.1008, 'grad_norm': 0.9177297504770803, 'learning_rate': 2.264988605707865e-06, 'epoch': 0.69}
69%|██████▉ | 23780/34278 [26:04:46<10:30:19, 3.60s/it] {'loss': 0.1154, 'grad_norm': 0.9711912489017491, 'learning_rate': 2.2645931285841533e-06, 'epoch': 0.69}
69%|██████▉ | 23781/34278 [26:04:49<9:57:32, 3.42s/it] {'loss': 0.1218, 'grad_norm': 0.8451203658486415, 'learning_rate': 2.2641976758814966e-06, 'epoch': 0.69}
69%|██████▉ | 23782/34278 [26:04:53<10:41:50, 3.67s/it] {'loss': 0.0981, 'grad_norm': 0.7791037540092294, 'learning_rate': 2.263802247603434e-06, 'epoch': 0.69}
69%|██████▉ | 23783/34278 [26:04:57<11:08:25, 3.82s/it] {'loss': 0.1196, 'grad_norm': 0.9122742883437229, 'learning_rate': 2.263406843753492e-06, 'epoch': 0.69}
69%|██████▉ | 23784/34278 [26:05:00<10:35:48, 3.64s/it] {'loss': 0.136, 'grad_norm': 0.9145510104429517, 'learning_rate': 2.263011464335198e-06, 'epoch': 0.69}
69%|██████▉ | 23785/34278 [26:05:05<11:05:10, 3.80s/it] {'loss': 0.1124, 'grad_norm': 0.8272283121849784, 'learning_rate': 2.2626161093520866e-06, 'epoch': 0.69}
69%|██████▉ | 23786/34278 [26:05:11<13:04:30, 4.49s/it] {'loss': 0.1052, 'grad_norm': 0.9133058887329731, 'learning_rate': 2.2622207788076848e-06, 'epoch': 0.69}
69%|██████▉ | 23787/34278 [26:05:14<12:15:55, 4.21s/it] {'loss': 0.1194, 'grad_norm': 0.7982430957193886, 'learning_rate': 2.2618254727055206e-06, 'epoch': 0.69}
69%|██████▉ | 23788/34278 [26:05:17<11:12:49, 3.85s/it] {'loss': 0.1213, 'grad_norm': 1.0187002693919793, 'learning_rate': 2.261430191049125e-06, 'epoch': 0.69}
69%|██████▉ | 23789/34278 [26:05:20<10:43:22, 3.68s/it] {'loss': 0.1149, 'grad_norm': 0.854908536529277, 'learning_rate': 2.2610349338420283e-06, 'epoch': 0.69}
69%|██████▉ | 23790/34278 [26:05:27<12:48:57, 4.40s/it] {'loss': 0.1414, 'grad_norm': 0.8299541363462521, 'learning_rate': 2.2606397010877585e-06, 'epoch': 0.69}
69%|██████▉ | 23791/34278 [26:05:30<11:40:31, 4.01s/it] {'loss': 0.1295, 'grad_norm': 1.0390679356647874, 'learning_rate': 2.2602444927898413e-06, 'epoch': 0.69}
69%|██████▉ | 23792/34278 [26:05:34<11:43:43, 4.03s/it] {'loss': 0.1317, 'grad_norm': 0.8770794811603092, 'learning_rate': 2.2598493089518093e-06, 'epoch': 0.69}
69%|██████▉ | 23793/34278 [26:05:40<13:26:00, 4.61s/it] {'loss': 0.1207, 'grad_norm': 0.818535052142983, 'learning_rate': 2.2594541495771866e-06, 'epoch': 0.69}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
69%|██████▉ | 23794/34278 [26:05:43<11:55:59, 4.10s/it] {'loss': 0.1445, 'grad_norm': 0.8774843127880833, 'learning_rate': 2.2590590146695053e-06, 'epoch': 0.69}
69%|██████▉ | 23795/34278 [26:05:48<13:22:30, 4.59s/it] {'loss': 0.1111, 'grad_norm': 0.8805688386968867, 'learning_rate': 2.258663904232288e-06, 'epoch': 0.69}
69%|██████▉ | 23796/34278 [26:05:51<11:43:51, 4.03s/it] {'loss': 0.1041, 'grad_norm': 0.8019469765437267, 'learning_rate': 2.2582688182690674e-06, 'epoch': 0.69}
69%|██████▉ | 23797/34278 [26:05:55<11:23:12, 3.91s/it] {'loss': 0.1313, 'grad_norm': 0.9178817772973342, 'learning_rate': 2.2578737567833688e-06, 'epoch': 0.69}
69%|██████▉ | 23798/34278 [26:06:01<13:27:05, 4.62s/it] {'loss': 0.1207, 'grad_norm': 0.7250859828107897, 'learning_rate': 2.2574787197787155e-06, 'epoch': 0.69}
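The `PIL.UnidentifiedImageError` traceback above ends in the dataset's own handler ("Failed to fetch sample ... Exception: ..."), suggesting the loader skips corrupt bytes and retries. A hypothetical sketch of such a fallback loader (the names `load_image` and `get_item_with_fallback`, and the deterministic next-index retry, are illustrative, not the repo's actual code):

```python
import io

from PIL import Image, UnidentifiedImageError


def load_image(buf: bytes) -> Image.Image:
    """Decode image bytes; raises UnidentifiedImageError on corrupt data."""
    img = Image.open(io.BytesIO(buf))
    img.load()  # force decoding so truncated data fails here, not later
    return img.convert("RGB")


def get_item_with_fallback(samples, i, max_retries=10):
    """On a bad sample, log the failure and try the next index instead."""
    for _ in range(max_retries):
        try:
            return load_image(samples[i])
        except (UnidentifiedImageError, OSError) as e:
            print(f"Failed to fetch sample {i}. Exception: {e}")
            i = (i + 1) % len(samples)
    raise RuntimeError("too many corrupt samples in a row")


# Demo: index 0 holds corrupt bytes; the loader falls through to the
# valid 1x1 PNG at index 1.
_buf = io.BytesIO()
Image.new("RGB", (1, 1)).save(_buf, format="PNG")
demo_img = get_item_with_fallback([b"not an image", _buf.getvalue()], 0)
```

Calling `img.load()` eagerly matters here: `Image.open` is lazy, so without it a truncated file can survive `__getitem__` and crash later inside the collate step.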
69%|██████▉ | 23799/34278 [26:06:04<12:16:09, 4.22s/it] {'loss': 0.1133, 'grad_norm': 0.8084459599289897, 'learning_rate': 2.257083707258639e-06, 'epoch': 0.69}
69%|██████▉ | 23800/34278 [26:06:07<11:19:35, 3.89s/it] {'loss': 0.1258, 'grad_norm': 0.9793508490564601, 'learning_rate': 2.256688719226665e-06, 'epoch': 0.69}
69%|██████▉ | 23801/34278 [26:06:12<11:39:03, 4.00s/it] {'loss': 0.1306, 'grad_norm': 0.6774895543187904, 'learning_rate': 2.256293755686318e-06, 'epoch': 0.69}
69%|██████▉ | 23802/34278 [26:06:15<10:51:33, 3.73s/it] {'loss': 0.1206, 'grad_norm': 0.7826264163921307, 'learning_rate': 2.255898816641127e-06, 'epoch': 0.69}
69%|██████▉ | 23803/34278 [26:06:18<10:07:45, 3.48s/it] {'loss': 0.1008, 'grad_norm': 1.5171655582646877, 'learning_rate': 2.2555039020946163e-06, 'epoch': 0.69}
69%|██████▉ | 23804/34278 [26:06:21<9:52:35, 3.39s/it] {'loss': 0.1161, 'grad_norm': 1.0481798594953093, 'learning_rate': 2.25510901205031e-06, 'epoch': 0.69}
69%|██████▉ | 23805/34278 [26:06:25<10:48:12, 3.71s/it] {'loss': 0.1198, 'grad_norm': 0.7933968532635306, 'learning_rate': 2.254714146511735e-06, 'epoch': 0.69}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
69%|██████▉ | 23806/34278 [26:06:29<10:27:27, 3.60s/it] {'loss': 0.1133, 'grad_norm': 0.9066459719823079, 'learning_rate': 2.2543193054824185e-06, 'epoch': 0.69}
69%|██████▉ | 23807/34278 [26:06:32<9:54:40, 3.41s/it] {'loss': 0.1051, 'grad_norm': 1.2308532338379525, 'learning_rate': 2.253924488965884e-06, 'epoch': 0.69}
69%|██████▉ | 23808/34278 [26:06:36<10:37:53, 3.66s/it] {'loss': 0.1212, 'grad_norm': 0.9219605085205348, 'learning_rate': 2.2535296969656547e-06, 'epoch': 0.69}
69%|██████▉ | 23809/34278 [26:06:39<9:56:52, 3.42s/it] {'loss': 0.118, 'grad_norm': 0.9368588291885176, 'learning_rate': 2.253134929485257e-06, 'epoch': 0.69}
69%|██████▉ | 23810/34278 [26:06:43<11:10:21, 3.84s/it] {'loss': 0.1143, 'grad_norm': 0.8434970865606336, 'learning_rate': 2.252740186528216e-06, 'epoch': 0.69}
69%|██████▉ | 23811/34278 [26:06:46<10:22:55, 3.57s/it] {'loss': 0.0986, 'grad_norm': 1.1473254781592623, 'learning_rate': 2.252345468098051e-06, 'epoch': 0.69}
69%|██████▉ | 23812/34278 [26:06:50<10:28:14, 3.60s/it] {'loss': 0.1225, 'grad_norm': 0.8269881720214735, 'learning_rate': 2.251950774198294e-06, 'epoch': 0.69}
69%|██████▉ | 23813/34278 [26:06:53<10:03:16, 3.46s/it] {'loss': 0.097, 'grad_norm': 0.9458941903347555, 'learning_rate': 2.2515561048324637e-06, 'epoch': 0.69}
69%|██████▉ | 23814/34278 [26:06:56<9:27:40, 3.26s/it] {'loss': 0.1228, 'grad_norm': 0.7442355627088012, 'learning_rate': 2.251161460004083e-06, 'epoch': 0.69}
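The `UserWarning: None of the inputs have requires_grad=True` from `torch/utils/checkpoint.py` recurs throughout this run. It typically appears when gradient checkpointing wraps a segment whose inputs are all detached (e.g. frozen embeddings feeding a trainable block). A common remedy with Hugging Face-style models is to make the embedding outputs require grad before training; this is a sketch of that pattern, not necessarily what the aguvis training script does, and `enable_checkpointing`/`model` here are hypothetical names:

```python
def enable_checkpointing(model):
    """Enable gradient checkpointing and ensure each checkpointed segment
    sees at least one input with requires_grad=True, which silences the
    'None of the inputs have requires_grad=True' UserWarning."""
    model.gradient_checkpointing_enable()
    if hasattr(model, "enable_input_require_grads"):
        # Hugging Face PreTrainedModel helper: marks embedding outputs
        # as requiring grad via a forward hook.
        model.enable_input_require_grads()
    else:
        # Fallback: register the equivalent hook manually.
        def make_inputs_require_grad(module, inputs, output):
            output.requires_grad_(True)
        model.get_input_embeddings().register_forward_hook(make_inputs_require_grad)
```

The warning is harmless when the affected segments are intentionally frozen, but it costs recomputation without producing gradients, so checkpointing frozen blocks is usually worth disabling.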
69%|██████▉ | 23815/34278 [26:07:02<11:48:19, 4.06s/it] {'loss': 0.1091, 'grad_norm': 0.9360742571652984, 'learning_rate': 2.2507668397166778e-06, 'epoch': 0.69}
69%|██████▉ | 23816/34278 [26:07:05<11:07:01, 3.83s/it] {'loss': 0.1336, 'grad_norm': 0.9932204254641761, 'learning_rate': 2.25037224397377e-06, 'epoch': 0.69}
69%|██████▉ | 23817/34278 [26:07:09<10:46:01, 3.71s/it] {'loss': 0.114, 'grad_norm': 1.0242302724736048, 'learning_rate': 2.2499776727788815e-06, 'epoch': 0.69}
69%|██████▉ | 23818/34278 [26:07:12<10:28:45, 3.61s/it] {'loss': 0.1075, 'grad_norm': 0.816982814096064, 'learning_rate': 2.249583126135535e-06, 'epoch': 0.69}
69%|██████▉ | 23819/34278 [26:07:19<13:03:34, 4.50s/it] {'loss': 0.1252, 'grad_norm': 1.0143184258574376, 'learning_rate': 2.249188604047256e-06, 'epoch': 0.69}
69%|██████▉ | 23820/34278 [26:07:22<11:44:57, 4.04s/it] {'loss': 0.1165, 'grad_norm': 0.9985708969606071, 'learning_rate': 2.2487941065175646e-06, 'epoch': 0.69}
69%|██████▉ | 23821/34278 [26:07:27<13:08:48, 4.53s/it] {'loss': 0.1235, 'grad_norm': 2.625312710630318, 'learning_rate': 2.2483996335499804e-06, 'epoch': 0.69}
69%|██████▉ | 23822/34278 [26:07:32<13:33:18, 4.67s/it] {'loss': 0.1151, 'grad_norm': 0.8656596326522185, 'learning_rate': 2.2480051851480296e-06, 'epoch': 0.69}
69%|██████▉ | 23823/34278 [26:07:36<12:38:26, 4.35s/it] {'loss': 0.1367, 'grad_norm': 1.0129732594232637, 'learning_rate': 2.247610761315229e-06, 'epoch': 0.69}
70%|██████▉ | 23824/34278 [26:07:42<13:50:51, 4.77s/it] {'loss': 0.1194, 'grad_norm': 0.968424923186662, 'learning_rate': 2.247216362055105e-06, 'epoch': 0.7}
70%|██████▉ | 23825/34278 [26:07:45<12:35:37, 4.34s/it] {'loss': 0.114, 'grad_norm': 1.0327727585847357, 'learning_rate': 2.2468219873711737e-06, 'epoch': 0.7}
70%|██████▉ | 23826/34278 [26:07:48<11:28:47, 3.95s/it] {'loss': 0.1176, 'grad_norm': 0.9663400679826721, 'learning_rate': 2.2464276372669615e-06, 'epoch': 0.7}
70%|██████▉ | 23827/34278 [26:07:54<13:32:16, 4.66s/it] {'loss': 0.1364, 'grad_norm': 1.0742687267998512, 'learning_rate': 2.246033311745985e-06, 'epoch': 0.7}
70%|██████▉ | 23828/34278 [26:07:57<12:06:18, 4.17s/it] {'loss': 0.1052, 'grad_norm': 0.8803108750502666, 'learning_rate': 2.245639010811764e-06, 'epoch': 0.7}
70%|██████▉ | 23829/34278 [26:08:00<11:13:35, 3.87s/it] {'loss': 0.1337, 'grad_norm': 0.8469585046262382, 'learning_rate': 2.245244734467821e-06, 'epoch': 0.7}
70%|██████▉ | 23830/34278 [26:08:04<10:39:34, 3.67s/it] {'loss': 0.1168, 'grad_norm': 0.8042952636604691, 'learning_rate': 2.2448504827176767e-06, 'epoch': 0.7}
70%|██████▉ | 23831/34278 [26:08:07<10:14:10, 3.53s/it] {'loss': 0.1357, 'grad_norm': 1.339756206521757, 'learning_rate': 2.2444562555648474e-06, 'epoch': 0.7}
70%|██████▉ | 23832/34278 [26:08:10<10:13:56, 3.53s/it] {'loss': 0.1198, 'grad_norm': 0.7626741984499985, 'learning_rate': 2.2440620530128572e-06, 'epoch': 0.7}
70%|██████▉ | 23833/34278 [26:08:17<12:51:39, 4.43s/it] {'loss': 0.1342, 'grad_norm': 0.623325959472245, 'learning_rate': 2.243667875065223e-06, 'epoch': 0.7}
70%|██████▉ | 23834/34278 [26:08:20<11:44:51, 4.05s/it] {'loss': 0.1132, 'grad_norm': 1.1762003072366538, 'learning_rate': 2.2432737217254617e-06, 'epoch': 0.7}
70%|██████▉ | 23835/34278 [26:08:24<11:34:39, 3.99s/it] {'loss': 0.1132, 'grad_norm': 0.8413067163168597, 'learning_rate': 2.2428795929970952e-06, 'epoch': 0.7}
70%|██████▉ | 23836/34278 [26:08:28<11:34:05, 3.99s/it] {'loss': 0.1185, 'grad_norm': 0.7803250474449454, 'learning_rate': 2.2424854888836434e-06, 'epoch': 0.7}
70%|██████▉ | 23837/34278 [26:08:31<10:52:53, 3.75s/it] {'loss': 0.1202, 'grad_norm': 0.7790511825822023, 'learning_rate': 2.2420914093886227e-06, 'epoch': 0.7}
70%|██████▉ | 23838/34278 [26:08:37<13:04:17, 4.51s/it] {'loss': 0.1339, 'grad_norm': 0.9583893332195279, 'learning_rate': 2.2416973545155496e-06, 'epoch': 0.7}
70%|██████▉ | 23839/34278 [26:08:41<12:39:01, 4.36s/it] {'loss': 0.1181, 'grad_norm': 0.8739164437819695, 'learning_rate': 2.2413033242679456e-06, 'epoch': 0.7}
70%|██████▉ | 23840/34278 [26:08:44<11:29:15, 3.96s/it] {'loss': 0.1506, 'grad_norm': 0.959295105755949, 'learning_rate': 2.2409093186493276e-06, 'epoch': 0.7}
70%|██████▉ | 23841/34278 [26:08:50<13:14:20, 4.57s/it] {'loss': 0.1186, 'grad_norm': 0.8142623408323446, 'learning_rate': 2.240515337663208e-06, 'epoch': 0.7}
70%|██████▉ | 23842/34278 [26:08:55<12:56:39, 4.47s/it] {'loss': 0.1152, 'grad_norm': 1.064582232367325, 'learning_rate': 2.2401213813131133e-06, 'epoch': 0.7}
70%|██████▉ | 23843/34278 [26:09:01<14:15:12, 4.92s/it] {'loss': 0.118, 'grad_norm': 1.2373366780234374, 'learning_rate': 2.239727449602556e-06, 'epoch': 0.7}
70%|██████▉ | 23844/34278 [26:09:04<12:59:01, 4.48s/it] {'loss': 0.1287, 'grad_norm': 0.7205967437679306, 'learning_rate': 2.239333542535051e-06, 'epoch': 0.7}
70%|██████▉ | 23845/34278 [26:09:08<12:04:33, 4.17s/it] {'loss': 0.1103, 'grad_norm': 0.8186552784652059, 'learning_rate': 2.2389396601141188e-06, 'epoch': 0.7}
70%|██████▉ | 23846/34278 [26:09:11<11:11:27, 3.86s/it] {'loss': 0.1075, 'grad_norm': 0.8412739325173756, 'learning_rate': 2.2385458023432742e-06, 'epoch': 0.7}
70%|██████▉ | 23847/34278 [26:09:14<10:31:38, 3.63s/it] {'loss': 0.105, 'grad_norm': 1.1737791197407967, 'learning_rate': 2.2381519692260318e-06, 'epoch': 0.7}
70%|██████▉ | 23848/34278 [26:09:19<11:47:56, 4.07s/it] {'loss': 0.1274, 'grad_norm': 0.7235996361954059, 'learning_rate': 2.2377581607659095e-06, 'epoch': 0.7}
70%|██████▉ | 23849/34278 [26:09:23<11:34:32, 4.00s/it] {'loss': 0.126, 'grad_norm': 0.7677692176770433, 'learning_rate': 2.2373643769664243e-06, 'epoch': 0.7}
70%|██████▉ | 23850/34278 [26:09:26<10:47:09, 3.72s/it] {'loss': 0.1165, 'grad_norm': 0.9732730500905161, 'learning_rate': 2.236970617831091e-06, 'epoch': 0.7}
70%|██████▉ | 23851/34278 [26:09:29<10:35:26, 3.66s/it] {'loss': 0.1369, 'grad_norm': 1.2348858461385546, 'learning_rate': 2.236576883363422e-06, 'epoch': 0.7}
70%|██████▉ | 23852/34278 [26:09:33<10:20:12, 3.57s/it] {'loss': 0.1144, 'grad_norm': 1.0381658883900549, 'learning_rate': 2.236183173566937e-06, 'epoch': 0.7}
70%|██████▉ | 23853/34278 [26:09:37<10:41:59, 3.69s/it] {'loss': 0.0984, 'grad_norm': 0.6237444371201205, 'learning_rate': 2.235789488445147e-06, 'epoch': 0.7}
70%|██████▉ | 23854/34278 [26:09:40<10:27:54, 3.61s/it] {'loss': 0.1085, 'grad_norm': 0.7176785407206546, 'learning_rate': 2.2353958280015703e-06, 'epoch': 0.7}
70%|██████▉ | 23855/34278 [26:09:43<10:03:09, 3.47s/it] {'loss': 0.1161, 'grad_norm': 0.9308137456355501, 'learning_rate': 2.235002192239718e-06, 'epoch': 0.7}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0e403f54e0>
Failed to fetch sample 3074346. Exception: cannot identify image file <_io.BytesIO object at 0x7f0e403f54e0>
70%|██████▉ | 23856/34278 [26:09:49<12:29:05, 4.31s/it] {'loss': 0.1126, 'grad_norm': 0.7946992493326904, 'learning_rate': 2.234608581163108e-06, 'epoch': 0.7}
70%|██████▉ | 23857/34278 [26:09:53<11:38:22, 4.02s/it] {'loss': 0.1274, 'grad_norm': 1.000559179116095, 'learning_rate': 2.234214994775252e-06, 'epoch': 0.7}
70%|██████▉ | 23858/34278 [26:09:59<13:13:40, 4.57s/it] {'loss': 0.1051, 'grad_norm': 0.6762911103499605, 'learning_rate': 2.2338214330796633e-06, 'epoch': 0.7}
70%|██████▉ | 23859/34278 [26:10:02<11:46:46, 4.07s/it] {'loss': 0.0999, 'grad_norm': 1.0782851143233587, 'learning_rate': 2.233427896079856e-06, 'epoch': 0.7}
70%|██████▉ | 23860/34278 [26:10:08<13:42:45, 4.74s/it] {'loss': 0.1377, 'grad_norm': 0.8465079972471626, 'learning_rate': 2.233034383779346e-06, 'epoch': 0.7}
70%|██████▉ | 23861/34278 [26:10:14<14:43:29, 5.09s/it] {'loss': 0.123, 'grad_norm': 1.0538568765976666, 'learning_rate': 2.2326408961816425e-06, 'epoch': 0.7}
70%|██████▉ | 23862/34278 [26:10:17<13:27:07, 4.65s/it] {'loss': 0.1282, 'grad_norm': 1.0404147483084252, 'learning_rate': 2.232247433290262e-06, 'epoch': 0.7}
70%|██████▉ | 23863/34278 [26:10:21<12:34:26, 4.35s/it] {'loss': 0.1027, 'grad_norm': 1.304273376189725, 'learning_rate': 2.231853995108716e-06, 'epoch': 0.7}
70%|██████▉ | 23864/34278 [26:10:24<11:26:46, 3.96s/it] {'loss': 0.1075, 'grad_norm': 1.1026164216958099, 'learning_rate': 2.231460581640515e-06, 'epoch': 0.7}
70%|██████▉ | 23865/34278 [26:10:28<11:01:24, 3.81s/it] {'loss': 0.136, 'grad_norm': 0.8837947441898395, 'learning_rate': 2.231067192889173e-06, 'epoch': 0.7}
70%|██████▉ | 23866/34278 [26:10:30<10:02:12, 3.47s/it] {'loss': 0.1273, 'grad_norm': 0.8644623049803704, 'learning_rate': 2.2306738288582036e-06, 'epoch': 0.7}
70%|██████▉ | 23867/34278 [26:10:35<11:25:02, 3.95s/it] {'loss': 0.0974, 'grad_norm': 0.7718012780034156, 'learning_rate': 2.2302804895511177e-06, 'epoch': 0.7}
70%|██████▉ | 23868/34278 [26:10:39<11:12:36, 3.88s/it] {'loss': 0.1084, 'grad_norm': 0.8881331523894219, 'learning_rate': 2.229887174971424e-06, 'epoch': 0.7}
70%|██████▉ | 23869/34278 [26:10:42<10:23:38, 3.59s/it] {'loss': 0.1181, 'grad_norm': 0.9709450734308829, 'learning_rate': 2.2294938851226387e-06, 'epoch': 0.7}
70%|██████▉ | 23870/34278 [26:10:45<10:15:10, 3.55s/it] {'loss': 0.1385, 'grad_norm': 1.0097910025405348, 'learning_rate': 2.2291006200082705e-06, 'epoch': 0.7}
70%|██████▉ | 23871/34278 [26:10:49<10:14:18, 3.54s/it] {'loss': 0.0969, 'grad_norm': 1.2827005423027638, 'learning_rate': 2.2287073796318266e-06, 'epoch': 0.7}
70%|██████▉ | 23872/34278 [26:10:55<12:25:38, 4.30s/it] {'loss': 0.099, 'grad_norm': 0.708415974217797, 'learning_rate': 2.2283141639968254e-06, 'epoch': 0.7}
70%|██████▉ | 23873/34278 [26:10:58<11:12:55, 3.88s/it] {'loss': 0.0995, 'grad_norm': 0.9949620544624855, 'learning_rate': 2.2279209731067736e-06, 'epoch': 0.7}
70%|██████▉ | 23874/34278 [26:11:01<10:33:57, 3.66s/it] {'loss': 0.136, 'grad_norm': 1.0254344330917917, 'learning_rate': 2.22752780696518e-06, 'epoch': 0.7}
70%|██████▉ | 23875/34278 [26:11:04<9:57:38, 3.45s/it] {'loss': 0.1169, 'grad_norm': 0.9702845823420303, 'learning_rate': 2.2271346655755577e-06, 'epoch': 0.7}
70%|██████▉ | 23876/34278 [26:11:08<10:23:30, 3.60s/it] {'loss': 0.1129, 'grad_norm': 0.7426787782128693, 'learning_rate': 2.226741548941416e-06, 'epoch': 0.7}
70%|██████▉ | 23877/34278 [26:11:12<10:23:39, 3.60s/it] {'loss': 0.1492, 'grad_norm': 0.8579837307191444, 'learning_rate': 2.226348457066261e-06, 'epoch': 0.7}
70%|██████▉ | 23878/34278 [26:11:15<9:52:55, 3.42s/it] {'loss': 0.1313, 'grad_norm': 0.9695186357645721, 'learning_rate': 2.225955389953605e-06, 'epoch': 0.7}
70%|██████▉ | 23879/34278 [26:11:18<9:32:54, 3.31s/it] {'loss': 0.1281, 'grad_norm': 1.0191855552721403, 'learning_rate': 2.2255623476069595e-06, 'epoch': 0.7}
70%|██████▉ | 23880/34278 [26:11:21<9:34:23, 3.31s/it] {'loss': 0.108, 'grad_norm': 0.9127613348863842, 'learning_rate': 2.2251693300298306e-06, 'epoch': 0.7}
70%|██████▉ | 23881/34278 [26:11:24<9:32:04, 3.30s/it] {'loss': 0.1284, 'grad_norm': 1.0601321119294365, 'learning_rate': 2.2247763372257253e-06, 'epoch': 0.7}
70%|██████▉ | 23882/34278 [26:11:27<9:22:05, 3.24s/it] {'loss': 0.1079, 'grad_norm': 0.7341699527352541, 'learning_rate': 2.224383369198157e-06, 'epoch': 0.7}
70%|██████▉ | 23883/34278 [26:11:31<9:53:23, 3.43s/it] {'loss': 0.1073, 'grad_norm': 0.904234228900846, 'learning_rate': 2.223990425950629e-06, 'epoch': 0.7}
70%|██████▉ | 23884/34278 [26:11:34<9:32:02, 3.30s/it] {'loss': 0.1243, 'grad_norm': 0.7317430894835597, 'learning_rate': 2.223597507486654e-06, 'epoch': 0.7}
70%|██████▉ | 23885/34278 [26:11:37<8:59:21, 3.11s/it] {'loss': 0.1077, 'grad_norm': 0.8553002245374118, 'learning_rate': 2.223204613809736e-06, 'epoch': 0.7}
70%|██████▉ | 23886/34278 [26:11:40<9:27:43, 3.28s/it] {'loss': 0.098, 'grad_norm': 0.8570750394948702, 'learning_rate': 2.2228117449233853e-06, 'epoch': 0.7}
70%|██████▉ | 23887/34278 [26:11:52<16:15:02, 5.63s/it] {'loss': 0.1422, 'grad_norm': 0.8867475923460965, 'learning_rate': 2.2224189008311088e-06, 'epoch': 0.7}
70%|██████▉ | 23888/34278 [26:11:55<14:25:43, 5.00s/it] {'loss': 0.1207, 'grad_norm': 0.9701536648985049, 'learning_rate': 2.2220260815364113e-06, 'epoch': 0.7}
70%|██████▉ | 23889/34278 [26:11:58<12:30:02, 4.33s/it] {'loss': 0.1097, 'grad_norm': 0.8527957005656143, 'learning_rate': 2.2216332870428025e-06, 'epoch': 0.7}
70%|██████▉ | 23890/34278 [26:12:10<18:56:15, 6.56s/it] {'loss': 0.1198, 'grad_norm': 0.8667439819336876, 'learning_rate': 2.22124051735379e-06, 'epoch': 0.7}
70%|██████▉ | 23891/34278 [26:12:26<27:19:35, 9.47s/it] {'loss': 0.1255, 'grad_norm': 0.6934904212398832, 'learning_rate': 2.2208477724728765e-06, 'epoch': 0.7}
70%|██████▉ | 23892/34278 [26:12:50<40:05:49, 13.90s/it] {'loss': 0.1344, 'grad_norm': 1.0531606372921636, 'learning_rate': 2.220455052403573e-06, 'epoch': 0.7}
70%|██████▉ | 23893/34278 [26:12:54<31:45:36, 11.01s/it] {'loss': 0.1121, 'grad_norm': 1.079285750979756, 'learning_rate': 2.220062357149383e-06, 'epoch': 0.7}
70%|██████▉ | 23894/34278 [26:13:10<35:40:24, 12.37s/it] {'loss': 0.1346, 'grad_norm': 0.6888322008395154, 'learning_rate': 2.219669686713811e-06, 'epoch': 0.7}
70%|██████▉ | 23895/34278 [26:13:21<34:54:07, 12.10s/it] {'loss': 0.1348, 'grad_norm': 0.9814590642120479, 'learning_rate': 2.2192770411003638e-06, 'epoch': 0.7}
70%|██████▉ | 23896/34278 [26:13:34<35:15:27, 12.23s/it] {'loss': 0.1356, 'grad_norm': 1.0259025493308334, 'learning_rate': 2.21888442031255e-06, 'epoch': 0.7}
70%|██████▉ | 23897/34278 [26:13:38<27:57:01, 9.69s/it] {'loss': 0.1181, 'grad_norm': 0.8542920359060812, 'learning_rate': 2.2184918243538717e-06, 'epoch': 0.7}
70%|██████▉ | 23898/34278 [26:13:49<29:34:35, 10.26s/it] {'loss': 0.1371, 'grad_norm': 0.8486031292820476, 'learning_rate': 2.218099253227832e-06, 'epoch': 0.7}
70%|██████▉ | 23899/34278 [26:13:54<25:00:43, 8.68s/it] {'loss': 0.1202, 'grad_norm': 0.9135604801082281, 'learning_rate': 2.217706706937941e-06, 'epoch': 0.7}
70%|██████▉ | 23900/34278 [26:13:57<20:03:15, 6.96s/it] {'loss': 0.1291, 'grad_norm': 1.0489641449001594, 'learning_rate': 2.2173141854877e-06, 'epoch': 0.7}
70%|██████▉ | 23901/34278 [26:14:10<24:57:19, 8.66s/it] {'loss': 0.1281, 'grad_norm': 0.9118790562128938, 'learning_rate': 2.21692168888061e-06, 'epoch': 0.7}
70%|██████▉ | 23902/34278 [26:14:31<36:01:55, 12.50s/it] {'loss': 0.1277, 'grad_norm': 0.8182972087568022, 'learning_rate': 2.216529217120182e-06, 'epoch': 0.7}
70%|██████▉ | 23903/34278 [26:14:45<37:01:54, 12.85s/it] {'loss': 0.1135, 'grad_norm': 1.023858829391313, 'learning_rate': 2.2161367702099172e-06, 'epoch': 0.7}
70%|██████▉ | 23904/34278 [26:14:51<31:09:40, 10.81s/it] {'loss': 0.1329, 'grad_norm': 0.8734826668400932, 'learning_rate': 2.2157443481533165e-06, 'epoch': 0.7}
70%|██████▉ | 23905/34278 [26:14:55<25:22:34, 8.81s/it] {'loss': 0.1125, 'grad_norm': 0.7654454023172019, 'learning_rate': 2.215351950953888e-06, 'epoch': 0.7}
70%|██████▉ | 23906/34278 [26:15:12<31:52:49, 11.07s/it] {'loss': 0.1042, 'grad_norm': 1.0366101945281585, 'learning_rate': 2.214959578615132e-06, 'epoch': 0.7}
70%|██████▉ | 23907/34278 [26:15:27<35:22:45, 12.28s/it] {'loss': 0.0937, 'grad_norm': 0.8992822757576193, 'learning_rate': 2.2145672311405505e-06, 'epoch': 0.7}
70%|██████▉ | 23908/34278 [26:15:44<39:32:56, 13.73s/it] {'loss': 0.1563, 'grad_norm': 0.8214520386037066, 'learning_rate': 2.2141749085336476e-06, 'epoch': 0.7}
70%|██████▉ | 23909/34278 [26:16:10<50:11:20, 17.43s/it] {'loss': 0.0978, 'grad_norm': 0.849531784424523, 'learning_rate': 2.213782610797928e-06, 'epoch': 0.7}
70%|██████▉ | 23910/34278 [26:16:14<38:58:20, 13.53s/it] {'loss': 0.1175, 'grad_norm': 0.9702522776466446, 'learning_rate': 2.213390337936892e-06, 'epoch': 0.7}
70%|██████▉ | 23911/34278 [26:16:27<37:56:43, 13.18s/it] {'loss': 0.1029, 'grad_norm': 0.6985722729712117, 'learning_rate': 2.2129980899540403e-06, 'epoch': 0.7}
70%|██████▉ | 23912/34278 [26:16:30<29:50:07, 10.36s/it] {'loss': 0.1161, 'grad_norm': 0.6521646867903083, 'learning_rate': 2.2126058668528784e-06, 'epoch': 0.7}
70%|██████▉ | 23913/34278 [26:16:49<36:56:10, 12.83s/it] {'loss': 0.1104, 'grad_norm': 0.8726392551150108, 'learning_rate': 2.2122136686369038e-06, 'epoch': 0.7}
70%|██████▉ | 23914/34278 [26:17:11<44:38:41, 15.51s/it] {'loss': 0.1098, 'grad_norm': 1.0279031197328325, 'learning_rate': 2.2118214953096218e-06, 'epoch': 0.7}
70%|██████▉ | 23915/34278 [26:17:31<48:21:01, 16.80s/it] {'loss': 0.1183, 'grad_norm': 0.724605864266765, 'learning_rate': 2.2114293468745302e-06, 'epoch': 0.7}
70%|██████▉ | 23916/34278 [26:17:33<36:16:52, 12.60s/it] {'loss': 0.1206, 'grad_norm': 0.8200723129100194, 'learning_rate': 2.2110372233351334e-06, 'epoch': 0.7}
70%|██████▉ | 23917/34278 [26:17:56<44:38:22, 15.51s/it] {'loss': 0.1191, 'grad_norm': 0.9265442451272636, 'learning_rate': 2.2106451246949307e-06, 'epoch': 0.7}
70%|██████▉ | 23918/34278 [26:18:20<52:03:59, 18.09s/it] {'loss': 0.1108, 'grad_norm': 0.6997308658185218, 'learning_rate': 2.2102530509574204e-06, 'epoch': 0.7}
70%|██████▉ | 23919/34278 [26:18:31<45:52:10, 15.94s/it] {'loss': 0.1191, 'grad_norm': 0.7151739698896734, 'learning_rate': 2.2098610021261046e-06, 'epoch': 0.7}
70%|██████▉ | 23920/34278 [26:18:34<34:31:55, 12.00s/it] {'loss': 0.1135, 'grad_norm': 0.9734372742959848, 'learning_rate': 2.2094689782044857e-06, 'epoch': 0.7}
70%|██████▉ | 23921/34278 [26:18:39<28:47:05, 10.01s/it] {'loss': 0.1104, 'grad_norm': 0.7242728998374421, 'learning_rate': 2.2090769791960604e-06, 'epoch': 0.7}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
70%|██████▉ | 23922/34278 [26:18:56<34:35:28, 12.02s/it] {'loss': 0.1116, 'grad_norm': 1.1515168965564313, 'learning_rate': 2.2086850051043314e-06, 'epoch': 0.7}
70%|██████▉ | 23923/34278 [26:19:19<44:14:45, 15.38s/it] {'loss': 0.1283, 'grad_norm': 0.7441902855450077, 'learning_rate': 2.2082930559327955e-06, 'epoch': 0.7}
70%|██████▉ | 23924/34278 [26:19:38<47:19:36, 16.46s/it] {'loss': 0.1259, 'grad_norm': 0.8467353741366159, 'learning_rate': 2.2079011316849515e-06, 'epoch': 0.7}
70%|██████▉ | 23925/34278 [26:19:52<45:00:28, 15.65s/it] {'loss': 0.1087, 'grad_norm': 0.7578443751350888, 'learning_rate': 2.207509232364299e-06, 'epoch': 0.7}
70%|██████▉ | 23926/34278 [26:20:03<41:32:52, 14.45s/it] {'loss': 0.1455, 'grad_norm': 0.7694819193593879, 'learning_rate': 2.2071173579743405e-06, 'epoch': 0.7}
70%|██████▉ | 23927/34278 [26:20:29<51:31:25, 17.92s/it] {'loss': 0.1081, 'grad_norm': 0.8157212116854585, 'learning_rate': 2.2067255085185707e-06, 'epoch': 0.7}
70%|██████▉ | 23928/34278 [26:20:34<39:55:19, 13.89s/it] {'loss': 0.124, 'grad_norm': 0.8149733385337679, 'learning_rate': 2.2063336840004868e-06, 'epoch': 0.7}
70%|██████▉ | 23929/34278 [26:20:38<32:04:43, 11.16s/it] {'loss': 0.1165, 'grad_norm': 0.7288876374731305, 'learning_rate': 2.2059418844235912e-06, 'epoch': 0.7}
70%|██████▉ | 23930/34278 [26:20:50<32:11:20, 11.20s/it] {'loss': 0.1052, 'grad_norm': 0.7677110662916525, 'learning_rate': 2.205550109791379e-06, 'epoch': 0.7}
70%|██████▉ | 23931/34278 [26:20:53<24:59:23, 8.69s/it] {'loss': 0.1192, 'grad_norm': 0.8223754948929111, 'learning_rate': 2.205158360107345e-06, 'epoch': 0.7}
70%|██████▉ | 23932/34278 [26:20:57<21:11:48, 7.38s/it] {'loss': 0.1118, 'grad_norm': 0.8390319313943528, 'learning_rate': 2.2047666353749936e-06, 'epoch': 0.7}
70%|██████▉ | 23933/34278 [26:21:22<36:33:24, 12.72s/it] {'loss': 0.1139, 'grad_norm': 0.7492390754786155, 'learning_rate': 2.2043749355978183e-06, 'epoch': 0.7}
70%|██████▉ | 23934/34278 [26:21:38<39:04:37, 13.60s/it] {'loss': 0.1162, 'grad_norm': 0.7380429326903747, 'learning_rate': 2.203983260779314e-06, 'epoch': 0.7}
70%|██████▉ | 23935/34278 [26:21:48<36:34:36, 12.73s/it] {'loss': 0.1161, 'grad_norm': 0.8982197852339344, 'learning_rate': 2.203591610922982e-06, 'epoch': 0.7}
70%|██████▉ | 23936/34278 [26:21:53<29:34:40, 10.30s/it] {'loss': 0.1392, 'grad_norm': 0.8736667339876081, 'learning_rate': 2.2031999860323165e-06, 'epoch': 0.7}
70%|██████▉ | 23937/34278 [26:22:23<46:19:57, 16.13s/it] {'loss': 0.126, 'grad_norm': 0.6820437877376293, 'learning_rate': 2.2028083861108123e-06, 'epoch': 0.7}
70%|██████▉ | 23938/34278 [26:22:37<44:32:08, 15.51s/it] {'loss': 0.1394, 'grad_norm': 1.1657274485988667, 'learning_rate': 2.2024168111619666e-06, 'epoch': 0.7}
70%|██████▉ | 23939/34278 [26:22:40<33:50:11, 11.78s/it] {'loss': 0.1027, 'grad_norm': 0.720350106438394, 'learning_rate': 2.202025261189278e-06, 'epoch': 0.7}
70%|██████▉ | 23940/34278 [26:22:44<27:03:33, 9.42s/it] {'loss': 0.1467, 'grad_norm': 0.842727327060568, 'learning_rate': 2.20163373619624e-06, 'epoch': 0.7}
70%|██████▉ | 23941/34278 [26:22:59<31:34:48, 11.00s/it] {'loss': 0.1152, 'grad_norm': 0.7339955273043258, 'learning_rate': 2.2012422361863457e-06, 'epoch': 0.7}
70%|██████▉ | 23942/34278 [26:23:14<35:13:54, 12.27s/it] {'loss': 0.1123, 'grad_norm': 0.9130713180378784, 'learning_rate': 2.200850761163095e-06, 'epoch': 0.7}
70%|██████▉ | 23943/34278 [26:23:18<28:17:08, 9.85s/it] {'loss': 0.1122, 'grad_norm': 0.8363496227683581, 'learning_rate': 2.200459311129978e-06, 'epoch': 0.7}
70%|██████▉ | 23944/34278 [26:23:24<24:33:12, 8.55s/it] {'loss': 0.1146, 'grad_norm': 0.9935560450590118, 'learning_rate': 2.200067886090494e-06, 'epoch': 0.7}
70%|██████▉ | 23945/34278 [26:23:30<22:42:17, 7.91s/it] {'loss': 0.1337, 'grad_norm': 0.7691839559194878, 'learning_rate': 2.1996764860481334e-06, 'epoch': 0.7}
70%|██████▉ | 23946/34278 [26:23:34<19:19:36, 6.73s/it] {'loss': 0.1089, 'grad_norm': 0.7693515060529519, 'learning_rate': 2.1992851110063953e-06, 'epoch': 0.7}
70%|██████▉ | 23947/34278 [26:23:40<18:56:11, 6.60s/it] {'loss': 0.1246, 'grad_norm': 0.9126551268711051, 'learning_rate': 2.1988937609687707e-06, 'epoch': 0.7}
70%|██████▉ | 23948/34278 [26:23:44<16:32:45, 5.77s/it] {'loss': 0.1282, 'grad_norm': 0.7305308384724504, 'learning_rate': 2.198502435938752e-06, 'epoch': 0.7}
70%|██████▉ | 23949/34278 [26:23:47<14:27:15, 5.04s/it] {'loss': 0.1286, 'grad_norm': 1.1475363486012335, 'learning_rate': 2.198111135919834e-06, 'epoch': 0.7}
70%|██████▉ | 23950/34278 [26:24:11<30:08:16, 10.51s/it] {'loss': 0.1108, 'grad_norm': 1.3417151758689219, 'learning_rate': 2.197719860915514e-06, 'epoch': 0.7}
70%|██████▉ | 23951/34278 [26:24:14<23:46:18, 8.29s/it] {'loss': 0.1268, 'grad_norm': 0.6706037887177457, 'learning_rate': 2.19732861092928e-06, 'epoch': 0.7}
70%|██████▉ | 23952/34278 [26:24:17<19:30:09, 6.80s/it] {'loss': 0.1082, 'grad_norm': 0.86667105947099, 'learning_rate': 2.1969373859646287e-06, 'epoch': 0.7}
70%|██████▉ | 23953/34278 [26:24:20<15:58:13, 5.57s/it] {'loss': 0.1122, 'grad_norm': 1.02242976326532, 'learning_rate': 2.1965461860250515e-06, 'epoch': 0.7}
70%|██████▉ | 23954/34278 [26:24:23<13:56:55, 4.86s/it] {'loss': 0.1131, 'grad_norm': 0.8762455769409735, 'learning_rate': 2.196155011114039e-06, 'epoch': 0.7}
70%|██████▉ | 23955/34278 [26:24:26<12:22:26, 4.32s/it] {'loss': 0.1103, 'grad_norm': 0.89106618415591, 'learning_rate': 2.1957638612350846e-06, 'epoch': 0.7}
70%|██████▉ | 23956/34278 [26:24:38<18:36:35, 6.49s/it] {'loss': 0.136, 'grad_norm': 0.999620249948512, 'learning_rate': 2.1953727363916833e-06, 'epoch': 0.7}
70%|██████▉ | 23957/34278 [26:24:41<15:55:08, 5.55s/it] {'loss': 0.1146, 'grad_norm': 0.9502451622796657, 'learning_rate': 2.194981636587325e-06, 'epoch': 0.7}
70%|██████▉ | 23958/34278 [26:24:45<14:18:46, 4.99s/it] {'loss': 0.098, 'grad_norm': 0.7329955936712534, 'learning_rate': 2.1945905618254985e-06, 'epoch': 0.7}
70%|██████▉ | 23959/34278 [26:24:48<12:46:16, 4.46s/it] {'loss': 0.1229, 'grad_norm': 0.771353050670828, 'learning_rate': 2.1941995121096997e-06, 'epoch': 0.7}
70%|██████▉ | 23960/34278 [26:24:51<11:35:44, 4.05s/it] {'loss': 0.1359, 'grad_norm': 0.9019231796641504, 'learning_rate': 2.1938084874434184e-06, 'epoch': 0.7}
70%|██████▉ | 23961/34278 [26:24:57<12:56:02, 4.51s/it] {'loss': 0.1132, 'grad_norm': 0.8461351662171521, 'learning_rate': 2.193417487830141e-06, 'epoch': 0.7}
70%|██████▉ | 23962/34278 [26:25:02<13:56:43, 4.87s/it] {'loss': 0.1577, 'grad_norm': 0.9079653025212666, 'learning_rate': 2.1930265132733663e-06, 'epoch': 0.7}
70%|██████▉ | 23963/34278 [26:25:06<13:08:54, 4.59s/it] {'loss': 0.1072, 'grad_norm': 0.9916480713194713, 'learning_rate':
2.1926355637765805e-06, 'epoch': 0.7} 70%|██████▉ | 23963/34278 [26:25:06<13:08:54, 4.59s/it] 70%|██████▉ | 23964/34278 [26:25:12<13:59:45, 4.89s/it] {'loss': 0.1155, 'grad_norm': 0.675691073335476, 'learning_rate': 2.192244639343272e-06, 'epoch': 0.7} 70%|██████▉ | 23964/34278 [26:25:12<13:59:45, 4.89s/it] 70%|██████▉ | 23965/34278 [26:25:15<12:51:31, 4.49s/it] {'loss': 0.1145, 'grad_norm': 0.74496196362572, 'learning_rate': 2.1918537399769358e-06, 'epoch': 0.7} 70%|██████▉ | 23965/34278 [26:25:15<12:51:31, 4.49s/it] 70%|██████▉ | 23966/34278 [26:25:19<12:23:50, 4.33s/it] {'loss': 0.1355, 'grad_norm': 0.8457333082543786, 'learning_rate': 2.191462865681058e-06, 'epoch': 0.7} 70%|██████▉ | 23966/34278 [26:25:19<12:23:50, 4.33s/it] 70%|██████▉ | 23967/34278 [26:25:23<11:29:05, 4.01s/it] {'loss': 0.1184, 'grad_norm': 0.9140121292847121, 'learning_rate': 2.191072016459129e-06, 'epoch': 0.7} 70%|██████▉ | 23967/34278 [26:25:23<11:29:05, 4.01s/it] 70%|██████▉ | 23968/34278 [26:25:27<11:33:33, 4.04s/it] {'loss': 0.1065, 'grad_norm': 0.9728469106059102, 'learning_rate': 2.190681192314637e-06, 'epoch': 0.7} 70%|██████▉ | 23968/34278 [26:25:27<11:33:33, 4.04s/it] 70%|██████▉ | 23969/34278 [26:25:30<10:47:53, 3.77s/it] {'loss': 0.1197, 'grad_norm': 0.8514995422077425, 'learning_rate': 2.1902903932510748e-06, 'epoch': 0.7} 70%|██████▉ | 23969/34278 [26:25:30<10:47:53, 3.77s/it] 70%|██████▉ | 23970/34278 [26:25:33<10:06:22, 3.53s/it] {'loss': 0.1294, 'grad_norm': 1.085130976010377, 'learning_rate': 2.1898996192719297e-06, 'epoch': 0.7} 70%|██████▉ | 23970/34278 [26:25:33<10:06:22, 3.53s/it] 70%|██████▉ | 23971/34278 [26:25:36<9:31:37, 3.33s/it] {'loss': 0.1123, 'grad_norm': 0.8148898121292455, 'learning_rate': 2.1895088703806877e-06, 'epoch': 0.7} 70%|██████▉ | 23971/34278 [26:25:36<9:31:37, 3.33s/it] 70%|██████▉ | 23972/34278 [26:25:40<10:23:41, 3.63s/it] {'loss': 0.1368, 'grad_norm': 0.8148589495926012, 'learning_rate': 2.189118146580842e-06, 'epoch': 0.7} 70%|██████▉ | 
23972/34278 [26:25:40<10:23:41, 3.63s/it] 70%|██████▉ | 23973/34278 [26:25:43<9:53:41, 3.46s/it] {'loss': 0.1269, 'grad_norm': 0.8087628675117582, 'learning_rate': 2.188727447875876e-06, 'epoch': 0.7} 70%|██████▉ | 23973/34278 [26:25:43<9:53:41, 3.46s/it] 70%|██████▉ | 23974/34278 [26:25:46<9:27:10, 3.30s/it] {'loss': 0.116, 'grad_norm': 0.8024056352754089, 'learning_rate': 2.1883367742692824e-06, 'epoch': 0.7} 70%|██████▉ | 23974/34278 [26:25:46<9:27:10, 3.30s/it] 70%|██████▉ | 23975/34278 [26:25:49<9:04:56, 3.17s/it] {'loss': 0.1368, 'grad_norm': 0.807474093646638, 'learning_rate': 2.1879461257645453e-06, 'epoch': 0.7} 70%|██████▉ | 23975/34278 [26:25:49<9:04:56, 3.17s/it] 70%|██████▉ | 23976/34278 [26:25:55<11:32:09, 4.03s/it] {'loss': 0.1174, 'grad_norm': 1.0470631070479548, 'learning_rate': 2.1875555023651552e-06, 'epoch': 0.7} 70%|██████▉ | 23976/34278 [26:25:55<11:32:09, 4.03s/it] 70%|██████▉ | 23977/34278 [26:25:59<11:41:29, 4.09s/it] {'loss': 0.1115, 'grad_norm': 0.8551891079223305, 'learning_rate': 2.1871649040745984e-06, 'epoch': 0.7} 70%|██████▉ | 23977/34278 [26:25:59<11:41:29, 4.09s/it] 70%|██████▉ | 23978/34278 [26:26:02<10:48:59, 3.78s/it] {'loss': 0.0985, 'grad_norm': 0.7515422966205455, 'learning_rate': 2.1867743308963585e-06, 'epoch': 0.7} 70%|██████▉ | 23978/34278 [26:26:02<10:48:59, 3.78s/it] 70%|██████▉ | 23979/34278 [26:26:08<12:48:21, 4.48s/it] {'loss': 0.1059, 'grad_norm': 0.8912524447396101, 'learning_rate': 2.186383782833929e-06, 'epoch': 0.7} 70%|██████▉ | 23979/34278 [26:26:08<12:48:21, 4.48s/it] 70%|██████▉ | 23980/34278 [26:26:14<14:10:49, 4.96s/it] {'loss': 0.1312, 'grad_norm': 0.9428279664495918, 'learning_rate': 2.1859932598907933e-06, 'epoch': 0.7} 70%|██████▉ | 23980/34278 [26:26:14<14:10:49, 4.96s/it] 70%|██████▉ | 23981/34278 [26:26:17<12:39:33, 4.43s/it] {'loss': 0.1031, 'grad_norm': 1.4812676271230822, 'learning_rate': 2.1856027620704367e-06, 'epoch': 0.7} 70%|██████▉ | 23981/34278 [26:26:18<12:39:33, 4.43s/it] 70%|██████▉ | 
23982/34278 [26:26:21<11:32:23, 4.03s/it] {'loss': 0.0992, 'grad_norm': 0.8477961693049424, 'learning_rate': 2.1852122893763484e-06, 'epoch': 0.7} 70%|██████▉ | 23982/34278 [26:26:21<11:32:23, 4.03s/it] 70%|██████▉ | 23983/34278 [26:26:25<11:57:14, 4.18s/it] {'loss': 0.126, 'grad_norm': 0.8264979109448708, 'learning_rate': 2.1848218418120134e-06, 'epoch': 0.7} 70%|██████▉ | 23983/34278 [26:26:25<11:57:14, 4.18s/it] 70%|██████▉ | 23984/34278 [26:26:30<12:12:29, 4.27s/it] {'loss': 0.1329, 'grad_norm': 1.1011081401884897, 'learning_rate': 2.184431419380914e-06, 'epoch': 0.7} 70%|██████▉ | 23984/34278 [26:26:30<12:12:29, 4.27s/it] 70%|██████▉ | 23985/34278 [26:26:33<11:18:26, 3.95s/it] {'loss': 0.1013, 'grad_norm': 1.0591944467203402, 'learning_rate': 2.1840410220865394e-06, 'epoch': 0.7} 70%|██████▉ | 23985/34278 [26:26:33<11:18:26, 3.95s/it] 70%|██████▉ | 23986/34278 [26:26:38<12:26:05, 4.35s/it] {'loss': 0.1472, 'grad_norm': 0.789704705507825, 'learning_rate': 2.183650649932376e-06, 'epoch': 0.7} 70%|██████▉ | 23986/34278 [26:26:38<12:26:05, 4.35s/it] 70%|██████▉ | 23987/34278 [26:26:41<11:15:29, 3.94s/it] {'loss': 0.1409, 'grad_norm': 0.9502062566605105, 'learning_rate': 2.1832603029219074e-06, 'epoch': 0.7} 70%|██████▉ | 23987/34278 [26:26:41<11:15:29, 3.94s/it] 70%|██████▉ | 23988/34278 [26:26:45<10:57:17, 3.83s/it] {'loss': 0.1229, 'grad_norm': 0.8025599434789594, 'learning_rate': 2.182869981058617e-06, 'epoch': 0.7} 70%|██████▉ | 23988/34278 [26:26:45<10:57:17, 3.83s/it] 70%|██████▉ | 23989/34278 [26:26:51<12:46:05, 4.47s/it] {'loss': 0.1063, 'grad_norm': 0.8018413521504824, 'learning_rate': 2.1824796843459916e-06, 'epoch': 0.7} 70%|██████▉ | 23989/34278 [26:26:51<12:46:05, 4.47s/it] 70%|██████▉ | 23990/34278 [26:26:54<11:27:02, 4.01s/it] {'loss': 0.1156, 'grad_norm': 0.8754198713761392, 'learning_rate': 2.182089412787514e-06, 'epoch': 0.7} 70%|██████▉ | 23990/34278 [26:26:54<11:27:02, 4.01s/it] 70%|██████▉ | 23991/34278 [26:26:57<11:14:12, 3.93s/it] {'loss': 
0.1253, 'grad_norm': 0.991015694336839, 'learning_rate': 2.1816991663866692e-06, 'epoch': 0.7} 70%|██████▉ | 23991/34278 [26:26:57<11:14:12, 3.93s/it] 70%|██████▉ | 23992/34278 [26:27:04<13:10:59, 4.61s/it] {'loss': 0.1432, 'grad_norm': 0.8127540572983639, 'learning_rate': 2.1813089451469436e-06, 'epoch': 0.7} 70%|██████▉ | 23992/34278 [26:27:04<13:10:59, 4.61s/it] 70%|██████▉ | 23993/34278 [26:27:06<11:38:26, 4.07s/it] {'loss': 0.1228, 'grad_norm': 0.9592306562067225, 'learning_rate': 2.1809187490718185e-06, 'epoch': 0.7} 70%|██████▉ | 23993/34278 [26:27:06<11:38:26, 4.07s/it] 70%|██████▉ | 23994/34278 [26:27:09<10:35:23, 3.71s/it] {'loss': 0.1225, 'grad_norm': 0.8551294010357323, 'learning_rate': 2.180528578164776e-06, 'epoch': 0.7} 70%|██████▉ | 23994/34278 [26:27:09<10:35:23, 3.71s/it] 70%|███████ | 23995/34278 [26:27:12<9:54:04, 3.47s/it] {'loss': 0.1251, 'grad_norm': 1.0370886387783538, 'learning_rate': 2.1801384324293036e-06, 'epoch': 0.7} 70%|███████ | 23995/34278 [26:27:12<9:54:04, 3.47s/it] 70%|███████ | 23996/34278 [26:27:15<9:31:50, 3.34s/it] {'loss': 0.1325, 'grad_norm': 1.1952787806638514, 'learning_rate': 2.17974831186888e-06, 'epoch': 0.7} 70%|███████ | 23996/34278 [26:27:15<9:31:50, 3.34s/it] 70%|███████ | 23997/34278 [26:27:18<9:19:03, 3.26s/it] {'loss': 0.133, 'grad_norm': 0.7795203523082717, 'learning_rate': 2.179358216486992e-06, 'epoch': 0.7} 70%|███████ | 23997/34278 [26:27:18<9:19:03, 3.26s/it] 70%|███████ | 23998/34278 [26:27:22<9:22:59, 3.29s/it] {'loss': 0.1296, 'grad_norm': 0.8025040785672175, 'learning_rate': 2.178968146287119e-06, 'epoch': 0.7} 70%|███████ | 23998/34278 [26:27:22<9:22:59, 3.29s/it] 70%|███████ | 23999/34278 [26:27:25<9:45:07, 3.42s/it] {'loss': 0.1214, 'grad_norm': 1.0730825866736446, 'learning_rate': 2.1785781012727457e-06, 'epoch': 0.7} 70%|███████ | 23999/34278 [26:27:25<9:45:07, 3.42s/it] 70%|███████ | 24000/34278 [26:27:28<9:34:54, 3.36s/it] {'loss': 0.1107, 'grad_norm': 0.6690442535288196, 'learning_rate': 
2.1781880814473545e-06, 'epoch': 0.7} 70%|███████ | 24000/34278 [26:27:28<9:34:54, 3.36s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 70%|███████ | 24001/34278 [26:28:05<37:50:28, 13.26s/it] {'loss': 0.1437, 'grad_norm': 0.8040909148802236, 'learning_rate': 2.1777980868144245e-06, 'epoch': 0.7} 70%|███████ | 24001/34278 [26:28:05<37:50:28, 13.26s/it] 70%|███████ | 24002/34278 [26:28:08<29:35:21, 10.37s/it] {'loss': 0.1134, 'grad_norm': 1.084027438536715, 'learning_rate': 2.17740811737744e-06, 'epoch': 0.7} 70%|███████ | 24002/34278 [26:28:08<29:35:21, 10.37s/it] 70%|███████ | 24003/34278 [26:28:11<23:10:37, 8.12s/it] {'loss': 0.133, 'grad_norm': 1.0520530415588623, 'learning_rate': 2.177018173139883e-06, 'epoch': 0.7} 70%|███████ | 24003/34278 [26:28:11<23:10:37, 8.12s/it] 70%|███████ | 24004/34278 [26:28:17<21:17:43, 7.46s/it] {'loss': 0.102, 'grad_norm': 0.8252624810035873, 'learning_rate': 2.176628254105234e-06, 'epoch': 0.7} 70%|███████ | 24004/34278 [26:28:17<21:17:43, 7.46s/it] 70%|███████ | 24005/34278 [26:28:20<17:33:39, 6.15s/it] {'loss': 0.1294, 'grad_norm': 0.9945593723678332, 'learning_rate': 2.176238360276972e-06,
'epoch': 0.7} 70%|███████ | 24005/34278 [26:28:20<17:33:39, 6.15s/it] 70%|███████ | 24006/34278 [26:28:25<16:07:10, 5.65s/it] {'loss': 0.0969, 'grad_norm': 1.7703537817533679, 'learning_rate': 2.1758484916585828e-06, 'epoch': 0.7} 70%|███████ | 24006/34278 [26:28:25<16:07:10, 5.65s/it] 70%|███████ | 24007/34278 [26:28:29<14:33:36, 5.10s/it] {'loss': 0.1137, 'grad_norm': 0.8443668737230785, 'learning_rate': 2.175458648253543e-06, 'epoch': 0.7} 70%|███████ | 24007/34278 [26:28:29<14:33:36, 5.10s/it] 70%|███████ | 24008/34278 [26:28:34<15:10:45, 5.32s/it] {'loss': 0.1452, 'grad_norm': 1.1310173847517508, 'learning_rate': 2.1750688300653307e-06, 'epoch': 0.7} 70%|███████ | 24008/34278 [26:28:34<15:10:45, 5.32s/it] 70%|███████ | 24009/34278 [26:28:38<13:47:33, 4.84s/it] {'loss': 0.1296, 'grad_norm': 0.9392790970680069, 'learning_rate': 2.174679037097433e-06, 'epoch': 0.7} 70%|███████ | 24009/34278 [26:28:38<13:47:33, 4.84s/it] 70%|███████ | 24010/34278 [26:28:41<12:23:52, 4.35s/it] {'loss': 0.1098, 'grad_norm': 1.1362254631985376, 'learning_rate': 2.1742892693533263e-06, 'epoch': 0.7} 70%|███████ | 24010/34278 [26:28:41<12:23:52, 4.35s/it] 70%|███████ | 24011/34278 [26:28:45<11:30:27, 4.03s/it] {'loss': 0.1163, 'grad_norm': 0.7340869338528412, 'learning_rate': 2.1738995268364893e-06, 'epoch': 0.7} 70%|███████ | 24011/34278 [26:28:45<11:30:27, 4.03s/it] 70%|███████ | 24012/34278 [26:28:48<10:37:17, 3.72s/it] {'loss': 0.1208, 'grad_norm': 0.7099221677250814, 'learning_rate': 2.1735098095504036e-06, 'epoch': 0.7} 70%|███████ | 24012/34278 [26:28:48<10:37:17, 3.72s/it] 70%|███████ | 24013/34278 [26:28:52<10:47:04, 3.78s/it] {'loss': 0.1133, 'grad_norm': 1.126775946889232, 'learning_rate': 2.1731201174985484e-06, 'epoch': 0.7} 70%|███████ | 24013/34278 [26:28:52<10:47:04, 3.78s/it] 70%|███████ | 24014/34278 [26:28:55<10:10:16, 3.57s/it] {'loss': 0.1221, 'grad_norm': 0.8955328508740298, 'learning_rate': 2.1727304506843998e-06, 'epoch': 0.7} 70%|███████ | 24014/34278 
[26:28:55<10:10:16, 3.57s/it] 70%|███████ | 24015/34278 [26:28:59<10:28:01, 3.67s/it] {'loss': 0.1302, 'grad_norm': 0.8803672228134498, 'learning_rate': 2.172340809111439e-06, 'epoch': 0.7} 70%|███████ | 24015/34278 [26:28:59<10:28:01, 3.67s/it] 70%|███████ | 24016/34278 [26:29:03<10:44:37, 3.77s/it] {'loss': 0.1226, 'grad_norm': 0.7970264155337651, 'learning_rate': 2.171951192783146e-06, 'epoch': 0.7} 70%|███████ | 24016/34278 [26:29:03<10:44:37, 3.77s/it] 70%|███████ | 24017/34278 [26:29:06<10:01:45, 3.52s/it] {'loss': 0.1264, 'grad_norm': 0.9698896300487313, 'learning_rate': 2.171561601702998e-06, 'epoch': 0.7} 70%|███████ | 24017/34278 [26:29:06<10:01:45, 3.52s/it] 70%|███████ | 24018/34278 [26:29:09<9:45:38, 3.42s/it] {'loss': 0.1267, 'grad_norm': 0.8072280167044456, 'learning_rate': 2.1711720358744704e-06, 'epoch': 0.7} 70%|███████ | 24018/34278 [26:29:09<9:45:38, 3.42s/it] 70%|███████ | 24019/34278 [26:29:15<12:12:57, 4.29s/it] {'loss': 0.1096, 'grad_norm': 0.7550862104546195, 'learning_rate': 2.170782495301046e-06, 'epoch': 0.7} 70%|███████ | 24019/34278 [26:29:15<12:12:57, 4.29s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 70%|███████ | 24020/34278 [26:29:19<11:38:32, 4.09s/it] {'loss': 0.1383, 'grad_norm': 1.1206125222080219, 'learning_rate': 2.170392979986198e-06, 'epoch': 0.7} 70%|███████ | 24020/34278 [26:29:19<11:38:32, 4.09s/it] 70%|███████ | 24021/34278 [26:29:22<10:37:47, 3.73s/it] {'loss': 0.11, 'grad_norm': 0.7952524095088658, 'learning_rate': 2.1700034899334056e-06, 'epoch': 0.7} 70%|███████ | 24021/34278 [26:29:22<10:37:47, 3.73s/it] 70%|███████ | 24022/34278 [26:29:27<12:15:09, 4.30s/it] {'loss': 0.1164, 'grad_norm': 0.6707995429890307, 'learning_rate': 2.169614025146149e-06, 'epoch': 0.7} 70%|███████ | 24022/34278 [26:29:27<12:15:09, 4.30s/it] 70%|███████ | 24023/34278 [26:29:32<12:23:42, 4.35s/it] {'loss': 0.122, 'grad_norm': 1.335294007389897, 'learning_rate': 2.169224585627902e-06, 'epoch': 0.7} 70%|███████ | 24023/34278 [26:29:32<12:23:42, 4.35s/it] 70%|███████ | 24024/34278 [26:29:35<11:21:52, 3.99s/it] {'loss': 0.1076, 'grad_norm': 0.8019056674990741, 'learning_rate': 2.168835171382141e-06, 'epoch': 0.7} 70%|███████ | 24024/34278 [26:29:35<11:21:52, 3.99s/it] 70%|███████ | 24025/34278 [26:29:38<10:25:33, 3.66s/it] {'loss': 0.0955, 'grad_norm': 0.9430484276438996, 'learning_rate': 2.168445782412345e-06, 'epoch': 0.7} 70%|███████ | 24025/34278 [26:29:38<10:25:33, 3.66s/it] 70%|███████ | 24026/34278 [26:29:44<12:41:46, 4.46s/it] {'loss': 0.1253, 'grad_norm': 0.9831013496598711, 'learning_rate': 2.1680564187219877e-06, 'epoch': 0.7} 70%|███████ | 24026/34278 [26:29:44<12:41:46, 4.46s/it] 70%|███████ | 24027/34278 [26:29:47<11:19:00, 3.97s/it] {'loss': 0.115, 'grad_norm': 0.9485212158111505, 'learning_rate': 2.1676670803145483e-06, 'epoch': 0.7} 70%|███████ | 24027/34278 [26:29:47<11:19:00, 3.97s/it] 70%|███████ | 24028/34278 [26:29:50<10:57:53, 3.85s/it] {'loss': 0.1115, 'grad_norm': 1.155568103595562, 'learning_rate': 2.167277767193499e-06, 'epoch': 0.7} 70%|███████ | 24028/34278 [26:29:50<10:57:53, 3.85s/it] 70%|███████ | 
24029/34278 [26:29:56<12:17:48, 4.32s/it] {'loss': 0.1152, 'grad_norm': 1.0984865711311678, 'learning_rate': 2.1668884793623202e-06, 'epoch': 0.7} 70%|███████ | 24029/34278 [26:29:56<12:17:48, 4.32s/it] 70%|███████ | 24030/34278 [26:29:59<11:16:16, 3.96s/it] {'loss': 0.1273, 'grad_norm': 0.7952488702239149, 'learning_rate': 2.166499216824484e-06, 'epoch': 0.7} 70%|███████ | 24030/34278 [26:29:59<11:16:16, 3.96s/it] 70%|███████ | 24031/34278 [26:30:03<10:57:53, 3.85s/it] {'loss': 0.1226, 'grad_norm': 0.8422218644867794, 'learning_rate': 2.166109979583465e-06, 'epoch': 0.7} 70%|███████ | 24031/34278 [26:30:03<10:57:53, 3.85s/it] 70%|███████ | 24032/34278 [26:30:05<9:51:15, 3.46s/it] {'loss': 0.1188, 'grad_norm': 0.8822999547685475, 'learning_rate': 2.1657207676427395e-06, 'epoch': 0.7} 70%|███████ | 24032/34278 [26:30:05<9:51:15, 3.46s/it] 70%|███████ | 24033/34278 [26:30:09<9:54:18, 3.48s/it] {'loss': 0.0944, 'grad_norm': 1.0246611848438882, 'learning_rate': 2.165331581005784e-06, 'epoch': 0.7} 70%|███████ | 24033/34278 [26:30:09<9:54:18, 3.48s/it] 70%|███████ | 24034/34278 [26:30:12<9:57:29, 3.50s/it] {'loss': 0.1456, 'grad_norm': 1.0312611627298989, 'learning_rate': 2.1649424196760717e-06, 'epoch': 0.7} 70%|███████ | 24034/34278 [26:30:12<9:57:29, 3.50s/it] 70%|███████ | 24035/34278 [26:30:18<11:55:11, 4.19s/it] {'loss': 0.1109, 'grad_norm': 0.7709151509435308, 'learning_rate': 2.1645532836570744e-06, 'epoch': 0.7} 70%|███████ | 24035/34278 [26:30:18<11:55:11, 4.19s/it] 70%|███████ | 24036/34278 [26:30:21<11:06:38, 3.91s/it] {'loss': 0.1331, 'grad_norm': 0.8516470804910463, 'learning_rate': 2.1641641729522705e-06, 'epoch': 0.7} 70%|███████ | 24036/34278 [26:30:21<11:06:38, 3.91s/it] 70%|███████ | 24037/34278 [26:30:24<10:19:34, 3.63s/it] {'loss': 0.1055, 'grad_norm': 1.006222626001592, 'learning_rate': 2.163775087565132e-06, 'epoch': 0.7} 70%|███████ | 24037/34278 [26:30:24<10:19:34, 3.63s/it] 70%|███████ | 24038/34278 [26:30:29<11:00:36, 3.87s/it] {'loss': 
0.1243, 'grad_norm': 2.064272706827919, 'learning_rate': 2.163386027499129e-06, 'epoch': 0.7} 70%|███████ | 24038/34278 [26:30:29<11:00:36, 3.87s/it] 70%|███████ | 24039/34278 [26:30:31<10:08:42, 3.57s/it] {'loss': 0.1146, 'grad_norm': 1.2481984411755531, 'learning_rate': 2.1629969927577417e-06, 'epoch': 0.7} 70%|███████ | 24039/34278 [26:30:32<10:08:42, 3.57s/it] 70%|███████ | 24040/34278 [26:30:35<10:12:09, 3.59s/it] {'loss': 0.0884, 'grad_norm': 0.6976811043586211, 'learning_rate': 2.16260798334444e-06, 'epoch': 0.7} 70%|███████ | 24040/34278 [26:30:35<10:12:09, 3.59s/it] 70%|███████ | 24041/34278 [26:30:39<10:47:48, 3.80s/it] {'loss': 0.1081, 'grad_norm': 0.7852034296196531, 'learning_rate': 2.1622189992626956e-06, 'epoch': 0.7} 70%|███████ | 24041/34278 [26:30:39<10:47:48, 3.80s/it] 70%|███████ | 24042/34278 [26:30:42<10:10:02, 3.58s/it] {'loss': 0.1288, 'grad_norm': 1.2743735386200303, 'learning_rate': 2.1618300405159844e-06, 'epoch': 0.7} 70%|███████ | 24042/34278 [26:30:42<10:10:02, 3.58s/it] 70%|███████ | 24043/34278 [26:30:47<11:19:39, 3.98s/it] {'loss': 0.1128, 'grad_norm': 1.1277866592727162, 'learning_rate': 2.1614411071077764e-06, 'epoch': 0.7} 70%|███████ | 24043/34278 [26:30:47<11:19:39, 3.98s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 70%|███████ | 24044/34278 [26:30:50<10:28:10, 3.68s/it] {'loss': 0.1128, 'grad_norm': 0.8391056870980765, 'learning_rate': 2.161052199041543e-06, 'epoch': 0.7} 70%|███████ | 24044/34278 [26:30:50<10:28:10, 3.68s/it] 70%|███████ | 24045/34278 [26:30:57<12:39:54, 4.46s/it] {'loss': 0.1439, 'grad_norm': 0.7315497197928056, 'learning_rate': 2.160663316320758e-06, 'epoch': 0.7} 70%|███████ | 24045/34278 [26:30:57<12:39:54, 4.46s/it] 70%|███████ | 24046/34278 [26:31:00<11:37:14, 4.09s/it] {'loss': 0.104, 'grad_norm': 0.7604841604689835, 'learning_rate': 2.1602744589488944e-06, 'epoch': 0.7} 70%|███████ | 24046/34278 [26:31:00<11:37:14, 4.09s/it] 70%|███████ | 24047/34278 [26:31:03<10:50:43, 3.82s/it] {'loss': 0.1232, 'grad_norm': 0.9601341189787869, 'learning_rate': 2.1598856269294234e-06, 'epoch': 0.7} 70%|███████ | 24047/34278 [26:31:03<10:50:43, 3.82s/it] 70%|███████ | 24048/34278 [26:31:06<9:58:48, 3.51s/it] {'loss': 0.1305, 'grad_norm': 0.9800180073239532, 'learning_rate': 2.159496820265813e-06, 'epoch': 0.7} 70%|███████ | 24048/34278 [26:31:06<9:58:48, 3.51s/it] 70%|███████ | 24049/34278 [26:31:11<11:12:05, 3.94s/it] {'loss': 0.1205, 'grad_norm': 0.9845668064896056, 'learning_rate': 2.1591080389615386e-06, 'epoch': 0.7} 70%|███████ | 24049/34278 [26:31:11<11:12:05, 3.94s/it] 70%|███████ | 24050/34278 [26:31:14<10:25:35, 3.67s/it] {'loss': 0.0993, 'grad_norm': 0.8112349456134588, 'learning_rate': 2.1587192830200683e-06, 'epoch': 0.7} 70%|███████ | 24050/34278 [26:31:14<10:25:35, 3.67s/it] 70%|███████ | 24051/34278 [26:31:18<10:31:02, 3.70s/it] {'loss': 0.1197, 'grad_norm': 1.1056914242147518, 'learning_rate': 2.158330552444874e-06, 'epoch': 0.7} 70%|███████ | 24051/34278 [26:31:18<10:31:02, 3.70s/it] 70%|███████ | 24052/34278 [26:31:20<9:48:17, 3.45s/it] {'loss': 0.1228, 'grad_norm': 1.0860628694299137, 'learning_rate': 2.1579418472394274e-06, 'epoch': 0.7} 70%|███████ | 24052/34278 [26:31:21<9:48:17, 3.45s/it] 70%|███████ | 
70%|███████ | 24053/34278 [26:31:25<10:16:39, 3.62s/it] {'loss': 0.1199, 'grad_norm': 0.9195609440088643, 'learning_rate': 2.157553167407198e-06, 'epoch': 0.7}
70%|███████ | 24054/34278 [26:31:28<9:46:34, 3.44s/it] {'loss': 0.1103, 'grad_norm': 0.8267432432912861, 'learning_rate': 2.1571645129516533e-06, 'epoch': 0.7}
70%|███████ | 24055/34278 [26:31:31<9:38:56, 3.40s/it] {'loss': 0.1102, 'grad_norm': 0.9119221876899662, 'learning_rate': 2.156775883876267e-06, 'epoch': 0.7}
70%|███████ | 24056/34278 [26:31:33<9:01:24, 3.18s/it] {'loss': 0.1258, 'grad_norm': 1.1060395320976149, 'learning_rate': 2.156387280184505e-06, 'epoch': 0.7}
70%|███████ | 24057/34278 [26:31:36<8:43:49, 3.07s/it] {'loss': 0.108, 'grad_norm': 1.2418193467656058, 'learning_rate': 2.1559987018798407e-06, 'epoch': 0.7}
70%|███████ | 24058/34278 [26:31:40<9:33:28, 3.37s/it] {'loss': 0.1126, 'grad_norm': 0.6941779523718972, 'learning_rate': 2.155610148965739e-06, 'epoch': 0.7}
70%|███████ | 24059/34278 [26:31:44<9:38:21, 3.40s/it] {'loss': 0.106, 'grad_norm': 0.7468745610187636, 'learning_rate': 2.155221621445673e-06, 'epoch': 0.7}
70%|███████ | 24060/34278 [26:31:47<9:11:16, 3.24s/it] {'loss': 0.1153, 'grad_norm': 1.3963220490013728, 'learning_rate': 2.154833119323109e-06, 'epoch': 0.7}
70%|███████ | 24061/34278 [26:31:50<9:02:45, 3.19s/it] {'loss': 0.119, 'grad_norm': 1.010510213635547, 'learning_rate': 2.1544446426015137e-06, 'epoch': 0.7}
70%|███████ | 24062/34278 [26:31:56<11:34:14, 4.08s/it] {'loss': 0.1247, 'grad_norm': 0.843969770784331, 'learning_rate': 2.1540561912843577e-06, 'epoch': 0.7}
70%|███████ | 24063/34278 [26:31:59<10:50:21, 3.82s/it] {'loss': 0.1054, 'grad_norm': 0.9308770573205402, 'learning_rate': 2.1536677653751103e-06, 'epoch': 0.7}
70%|███████ | 24064/34278 [26:32:05<12:11:01, 4.29s/it] {'loss': 0.1071, 'grad_norm': 1.1167229292695007, 'learning_rate': 2.1532793648772376e-06, 'epoch': 0.7}
70%|███████ | 24065/34278 [26:32:07<10:56:54, 3.86s/it] {'loss': 0.1162, 'grad_norm': 0.6777538264692602, 'learning_rate': 2.152890989794205e-06, 'epoch': 0.7}
70%|███████ | 24066/34278 [26:32:11<10:20:25, 3.65s/it] {'loss': 0.1106, 'grad_norm': 0.7712540637357135, 'learning_rate': 2.1525026401294846e-06, 'epoch': 0.7}
70%|███████ | 24067/34278 [26:32:14<9:49:16, 3.46s/it] {'loss': 0.1249, 'grad_norm': 1.0568101289978415, 'learning_rate': 2.1521143158865403e-06, 'epoch': 0.7}
70%|███████ | 24068/34278 [26:32:17<9:49:59, 3.47s/it] {'loss': 0.1057, 'grad_norm': 0.7967857620394879, 'learning_rate': 2.1517260170688357e-06, 'epoch': 0.7}
70%|███████ | 24069/34278 [26:32:20<9:34:54, 3.38s/it] {'loss': 0.1286, 'grad_norm': 0.6882672674262725, 'learning_rate': 2.1513377436798454e-06, 'epoch': 0.7}
70%|███████ | 24070/34278 [26:32:24<9:33:43, 3.37s/it] {'loss': 0.1169, 'grad_norm': 0.8283639053360913, 'learning_rate': 2.150949495723032e-06, 'epoch': 0.7}
70%|███████ | 24071/34278 [26:32:27<9:12:19, 3.25s/it] {'loss': 0.098, 'grad_norm': 0.9816232198395284, 'learning_rate': 2.1505612732018588e-06, 'epoch': 0.7}
70%|███████ | 24072/34278 [26:32:31<9:56:25, 3.51s/it] {'loss': 0.1189, 'grad_norm': 0.8261055672546759, 'learning_rate': 2.1501730761197962e-06, 'epoch': 0.7}
70%|███████ | 24073/34278 [26:32:33<9:12:58, 3.25s/it] {'loss': 0.1215, 'grad_norm': 0.6476406893872703, 'learning_rate': 2.1497849044803088e-06, 'epoch': 0.7}
70%|███████ | 24074/34278 [26:32:37<9:40:14, 3.41s/it] {'loss': 0.1185, 'grad_norm': 0.8205112972150803, 'learning_rate': 2.149396758286859e-06, 'epoch': 0.7}
70%|███████ | 24075/34278 [26:32:42<10:59:25, 3.88s/it] {'loss': 0.1242, 'grad_norm': 1.075725340492133, 'learning_rate': 2.1490086375429146e-06, 'epoch': 0.7}
70%|███████ | 24076/34278 [26:32:46<11:04:30, 3.91s/it] {'loss': 0.1072, 'grad_norm': 0.777168613703323, 'learning_rate': 2.1486205422519426e-06, 'epoch': 0.7}
70%|███████ | 24077/34278 [26:32:50<10:49:36, 3.82s/it] {'loss': 0.1484, 'grad_norm': 0.9361846554945391, 'learning_rate': 2.1482324724174052e-06, 'epoch': 0.7}
70%|███████ | 24078/34278 [26:32:53<10:15:08, 3.62s/it] {'loss': 0.1271, 'grad_norm': 0.8325886361286319, 'learning_rate': 2.1478444280427657e-06, 'epoch': 0.7}
70%|███████ | 24079/34278 [26:32:56<9:32:52, 3.37s/it] {'loss': 0.1249, 'grad_norm': 0.706949477634099, 'learning_rate': 2.1474564091314925e-06, 'epoch': 0.7}
70%|███████ | 24080/34278 [26:32:59<9:14:07, 3.26s/it] {'loss': 0.1086, 'grad_norm': 0.8312963660706355, 'learning_rate': 2.1470684156870454e-06, 'epoch': 0.7}
70%|███████ | 24081/34278 [26:33:02<9:27:54, 3.34s/it] {'loss': 0.1077, 'grad_norm': 1.151050570465602, 'learning_rate': 2.1466804477128905e-06, 'epoch': 0.7}
70%|███████ | 24082/34278 [26:33:06<9:53:21, 3.49s/it] {'loss': 0.1149, 'grad_norm': 0.8598811482320603, 'learning_rate': 2.1462925052124934e-06, 'epoch': 0.7}
70%|███████ | 24083/34278 [26:33:09<9:40:31, 3.42s/it] {'loss': 0.1103, 'grad_norm': 0.789679013115193, 'learning_rate': 2.1459045881893154e-06, 'epoch': 0.7}
70%|███████ | 24084/34278 [26:33:12<9:10:43, 3.24s/it] {'loss': 0.1204, 'grad_norm': 0.7495025644090153, 'learning_rate': 2.1455166966468177e-06, 'epoch': 0.7}
70%|███████ | 24085/34278 [26:33:15<9:13:49, 3.26s/it] {'loss': 0.1316, 'grad_norm': 1.095025131927912, 'learning_rate': 2.1451288305884683e-06, 'epoch': 0.7}
70%|███████ | 24086/34278 [26:33:21<11:37:02, 4.10s/it] {'loss': 0.1111, 'grad_norm': 0.7283491155742952, 'learning_rate': 2.144740990017725e-06, 'epoch': 0.7}
70%|███████ | 24087/34278 [26:33:25<11:35:24, 4.09s/it] {'loss': 0.1185, 'grad_norm': 0.8667311375326322, 'learning_rate': 2.1443531749380538e-06, 'epoch': 0.7}
70%|███████ | 24088/34278 [26:33:28<10:35:54, 3.74s/it] {'loss': 0.1259, 'grad_norm': 0.6838411245661873, 'learning_rate': 2.143965385352914e-06, 'epoch': 0.7}
70%|███████ | 24089/34278 [26:33:33<11:06:04, 3.92s/it] {'loss': 0.1414, 'grad_norm': 1.052758190353409, 'learning_rate': 2.1435776212657715e-06, 'epoch': 0.7}
70%|███████ | 24090/34278 [26:33:37<11:16:14, 3.98s/it] {'loss': 0.1294, 'grad_norm': 0.823716241046075, 'learning_rate': 2.1431898826800866e-06, 'epoch': 0.7}
70%|███████ | 24091/34278 [26:33:41<10:59:01, 3.88s/it] {'loss': 0.1433, 'grad_norm': 0.9334328281490369, 'learning_rate': 2.1428021695993184e-06, 'epoch': 0.7}
70%|███████ | 24092/34278 [26:33:47<12:47:52, 4.52s/it] {'loss': 0.122, 'grad_norm': 0.8547639693236345, 'learning_rate': 2.14241448202693e-06, 'epoch': 0.7}
70%|███████ | 24093/34278 [26:33:51<12:20:02, 4.36s/it] {'loss': 0.1006, 'grad_norm': 0.6344574823642141, 'learning_rate': 2.1420268199663854e-06, 'epoch': 0.7}
70%|███████ | 24094/34278 [26:33:53<11:04:10, 3.91s/it] {'loss': 0.1277, 'grad_norm': 0.8307319846047498, 'learning_rate': 2.141639183421142e-06, 'epoch': 0.7}
70%|███████ | 24095/34278 [26:33:56<10:03:11, 3.55s/it] {'loss': 0.1368, 'grad_norm': 0.792771071556801, 'learning_rate': 2.141251572394661e-06, 'epoch': 0.7}
70%|███████ | 24096/34278 [26:33:59<9:46:28, 3.46s/it] {'loss': 0.1212, 'grad_norm': 0.6623395859811816, 'learning_rate': 2.1408639868904046e-06, 'epoch': 0.7}
70%|███████ | 24097/34278 [26:34:05<12:03:58, 4.27s/it] {'loss': 0.1317, 'grad_norm': 0.8286602012946489, 'learning_rate': 2.140476426911832e-06, 'epoch': 0.7}
70%|███████ | 24098/34278 [26:34:09<11:14:38, 3.98s/it] {'loss': 0.1441, 'grad_norm': 0.8179453115130316, 'learning_rate': 2.1400888924623995e-06, 'epoch': 0.7}
70%|███████ | 24099/34278 [26:34:12<10:10:22, 3.60s/it] {'loss': 0.0981, 'grad_norm': 1.248903539581955, 'learning_rate': 2.139701383545575e-06, 'epoch': 0.7}
70%|███████ | 24100/34278 [26:34:16<10:40:02, 3.77s/it] {'loss': 0.1057, 'grad_norm': 0.8845381821470489, 'learning_rate': 2.139313900164813e-06, 'epoch': 0.7}
70%|███████ | 24101/34278 [26:34:19<10:16:36, 3.64s/it] {'loss': 0.1166, 'grad_norm': 0.8787044640074702, 'learning_rate': 2.1389264423235725e-06, 'epoch': 0.7}
70%|███████ | 24102/34278 [26:34:24<11:36:33, 4.11s/it] {'loss': 0.1373, 'grad_norm': 0.9290614703170608, 'learning_rate': 2.138539010025315e-06, 'epoch': 0.7}
70%|███████ | 24103/34278 [26:34:27<10:33:26, 3.74s/it] {'loss': 0.1069, 'grad_norm': 0.8282834718855829, 'learning_rate': 2.1381516032734985e-06, 'epoch': 0.7}
70%|███████ | 24104/34278 [26:34:34<12:50:36, 4.54s/it] {'loss': 0.1075, 'grad_norm': 0.8220651876508512, 'learning_rate': 2.137764222071579e-06, 'epoch': 0.7}
70%|███████ | 24105/34278 [26:34:37<12:06:19, 4.28s/it] {'loss': 0.1057, 'grad_norm': 0.9565262772610033, 'learning_rate': 2.137376866423018e-06, 'epoch': 0.7}
70%|███████ | 24106/34278 [26:34:40<11:07:06, 3.94s/it] {'loss': 0.1295, 'grad_norm': 1.1878622190301287, 'learning_rate': 2.1369895363312735e-06, 'epoch': 0.7}
70%|███████ | 24107/34278 [26:34:46<12:36:11, 4.46s/it] {'loss': 0.1148, 'grad_norm': 0.9622737987465088, 'learning_rate': 2.1366022317998042e-06, 'epoch': 0.7}
70%|███████ | 24108/34278 [26:34:49<11:38:01, 4.12s/it] {'loss': 0.1295, 'grad_norm': 0.9526874065691252, 'learning_rate': 2.1362149528320646e-06, 'epoch': 0.7}
70%|███████ | 24109/34278 [26:34:54<11:43:22, 4.15s/it] {'loss': 0.1227, 'grad_norm': 1.354666136293969, 'learning_rate': 2.135827699431516e-06, 'epoch': 0.7}
70%|███████ | 24110/34278 [26:34:57<10:51:35, 3.84s/it] {'loss': 0.1212, 'grad_norm': 1.2650871130812544, 'learning_rate': 2.135440471601612e-06, 'epoch': 0.7}
70%|███████ | 24111/34278 [26:35:01<10:53:07, 3.85s/it] {'loss': 0.1034, 'grad_norm': 0.847655465678781, 'learning_rate': 2.1350532693458117e-06, 'epoch': 0.7}
70%|███████ | 24112/34278 [26:35:07<12:46:14, 4.52s/it] {'loss': 0.135, 'grad_norm': 0.9253122018880892, 'learning_rate': 2.1346660926675732e-06, 'epoch': 0.7}
70%|███████ | 24113/34278 [26:35:10<11:24:47, 4.04s/it] {'loss': 0.111, 'grad_norm': 0.8972131725035827, 'learning_rate': 2.1342789415703524e-06, 'epoch': 0.7}
70%|███████ | 24114/34278 [26:35:13<10:30:15, 3.72s/it] {'loss': 0.1209, 'grad_norm': 0.9270774916417938, 'learning_rate': 2.1338918160576033e-06, 'epoch': 0.7}
70%|███████ | 24115/34278 [26:35:16<10:06:58, 3.58s/it] {'loss': 0.1249, 'grad_norm': 0.9024003077394835, 'learning_rate': 2.1335047161327853e-06, 'epoch': 0.7}
70%|███████ | 24116/34278 [26:35:21<11:53:04, 4.21s/it] {'loss': 0.1104, 'grad_norm': 0.9166456653557409, 'learning_rate': 2.1331176417993517e-06, 'epoch': 0.7}
70%|███████ | 24117/34278 [26:35:25<11:28:27, 4.07s/it] {'loss': 0.1047, 'grad_norm': 1.0659154021750539, 'learning_rate': 2.1327305930607605e-06, 'epoch': 0.7}
70%|███████ | 24118/34278 [26:35:28<10:47:53, 3.83s/it] {'loss': 0.1144, 'grad_norm': 1.8382653879684283, 'learning_rate': 2.1323435699204646e-06, 'epoch': 0.7}
70%|███████ | 24119/34278 [26:35:31<10:03:29, 3.56s/it] {'loss': 0.1426, 'grad_norm': 0.8872106360749761, 'learning_rate': 2.131956572381923e-06, 'epoch': 0.7}
70%|███████ | 24120/34278 [26:35:35<10:07:50, 3.59s/it] {'loss': 0.1437, 'grad_norm': 0.7547890595575304, 'learning_rate': 2.131569600448588e-06, 'epoch': 0.7}
70%|███████ | 24121/34278 [26:35:38<9:21:56, 3.32s/it] {'loss': 0.1078, 'grad_norm': 1.0143542045822773, 'learning_rate': 2.1311826541239133e-06, 'epoch': 0.7}
70%|███████ | 24122/34278 [26:35:44<11:37:23, 4.12s/it] {'loss': 0.1429, 'grad_norm': 0.7800617755717483, 'learning_rate': 2.130795733411355e-06, 'epoch': 0.7}
70%|███████ | 24123/34278 [26:35:47<10:55:01, 3.87s/it] {'loss': 0.1302, 'grad_norm': 0.8782027026296402, 'learning_rate': 2.130408838314369e-06, 'epoch': 0.7}
70%|███████ | 24124/34278 [26:35:51<10:41:35, 3.79s/it] {'loss': 0.1213, 'grad_norm': 0.9126599810947121, 'learning_rate': 2.1300219688364078e-06, 'epoch': 0.7}
70%|███████ | 24125/34278 [26:35:54<9:57:13, 3.53s/it] {'loss': 0.1027, 'grad_norm': 0.8159882221397627, 'learning_rate': 2.1296351249809237e-06, 'epoch': 0.7}
70%|███████ | 24126/34278 [26:35:57<10:09:34, 3.60s/it] {'loss': 0.1145, 'grad_norm': 0.8295197981214928, 'learning_rate': 2.129248306751374e-06, 'epoch': 0.7}
70%|███████ | 24127/34278 [26:36:01<10:20:15, 3.67s/it] {'loss': 0.1279, 'grad_norm': 1.2707174944966297, 'learning_rate': 2.1288615141512098e-06, 'epoch': 0.7}
70%|███████ | 24128/34278 [26:36:07<12:09:22, 4.31s/it] {'loss': 0.114, 'grad_norm': 0.9607515254219565, 'learning_rate': 2.128474747183881e-06, 'epoch': 0.7}
70%|███████ | 24129/34278 [26:36:10<11:05:33, 3.93s/it] {'loss': 0.143, 'grad_norm': 0.8996369639105667, 'learning_rate': 2.128088005852848e-06, 'epoch': 0.7}
70%|███████ | 24130/34278 [26:36:13<10:11:33, 3.62s/it] {'loss': 0.135, 'grad_norm': 0.9824472072603855, 'learning_rate': 2.1277012901615595e-06, 'epoch': 0.7}
70%|███████ | 24131/34278 [26:36:16<9:51:33, 3.50s/it] {'loss': 0.1507, 'grad_norm': 1.1287121871293229, 'learning_rate': 2.1273146001134672e-06, 'epoch': 0.7}
70%|███████ | 24132/34278 [26:36:19<9:39:49, 3.43s/it] {'loss': 0.1395, 'grad_norm': 0.7840416532663803, 'learning_rate': 2.126927935712025e-06, 'epoch': 0.7}
70%|███████ | 24133/34278 [26:36:25<11:47:31, 4.18s/it] {'loss': 0.1093, 'grad_norm': 1.1163970757759962, 'learning_rate': 2.1265412969606846e-06, 'epoch': 0.7}
70%|███████ | 24134/34278 [26:36:29<11:04:16, 3.93s/it] {'loss': 0.1193, 'grad_norm': 0.8538029088499534, 'learning_rate': 2.126154683862896e-06, 'epoch': 0.7}
70%|███████ | 24135/34278 [26:36:32<10:50:07, 3.85s/it] {'loss': 0.1015, 'grad_norm': 0.7897568904118584, 'learning_rate': 2.125768096422113e-06, 'epoch': 0.7}
70%|███████ | 24136/34278 [26:36:37<11:16:36, 4.00s/it] {'loss': 0.0895, 'grad_norm': 0.583966894031173, 'learning_rate': 2.1253815346417873e-06, 'epoch': 0.7}
70%|███████ | 24137/34278 [26:36:40<10:33:24, 3.75s/it] {'loss': 0.1098, 'grad_norm': 0.7785901180582534, 'learning_rate': 2.1249949985253686e-06, 'epoch': 0.7}
70%|███████ | 24138/34278 [26:36:44<10:33:48, 3.75s/it] {'loss': 0.1239, 'grad_norm': 0.7387064927061772, 'learning_rate': 2.1246084880763073e-06, 'epoch': 0.7}
70%|███████ | 24139/34278 [26:36:47<10:37:43, 3.77s/it] {'loss': 0.1164, 'grad_norm': 0.8556757756360723, 'learning_rate': 2.1242220032980563e-06, 'epoch': 0.7}
70%|███████ | 24140/34278 [26:36:51<10:14:22, 3.64s/it] {'loss': 0.1032, 'grad_norm': 0.899373524585836, 'learning_rate': 2.1238355441940634e-06, 'epoch': 0.7}
70%|███████ | 24141/34278 [26:36:54<10:03:58, 3.57s/it] {'loss': 0.123, 'grad_norm': 0.7693524759253813, 'learning_rate': 2.1234491107677802e-06, 'epoch': 0.7}
70%|███████ | 24142/34278 [26:36:57<9:44:09, 3.46s/it] {'loss': 0.098, 'grad_norm': 0.8646075646004397, 'learning_rate': 2.123062703022658e-06, 'epoch': 0.7}
70%|███████ | 24143/34278 [26:37:01<9:38:58, 3.43s/it] {'loss': 0.0983, 'grad_norm': 1.0283003112861857, 'learning_rate': 2.1226763209621452e-06, 'epoch': 0.7}
70%|███████ | 24144/34278 [26:37:04<9:07:57, 3.24s/it] {'loss': 0.1226, 'grad_norm': 0.7773822812720221, 'learning_rate': 2.12228996458969e-06, 'epoch': 0.7}
70%|███████ | 24145/34278 [26:37:06<8:51:22, 3.15s/it] {'loss': 0.1237, 'grad_norm': 1.1564431781030586, 'learning_rate': 2.1219036339087447e-06, 'epoch': 0.7}
70%|███████ | 24146/34278 [26:37:12<11:12:22, 3.98s/it] {'loss': 0.1174, 'grad_norm': 0.8573595090024431, 'learning_rate': 2.121517328922754e-06, 'epoch': 0.7}
70%|███████ | 24147/34278 [26:37:15<10:25:39, 3.71s/it] {'loss': 0.1275, 'grad_norm': 1.0830024680069363, 'learning_rate': 2.1211310496351724e-06, 'epoch': 0.7}
70%|███████ | 24148/34278 [26:37:19<10:18:30, 3.66s/it] {'loss': 0.0998, 'grad_norm': 0.8617612438167792, 'learning_rate': 2.120744796049443e-06, 'epoch': 0.7}
70%|███████ | 24149/34278 [26:37:22<9:23:01, 3.34s/it] {'loss': 0.1069, 'grad_norm': 1.0647450513089958, 'learning_rate': 2.120358568169019e-06, 'epoch': 0.7}
70%|███████ | 24150/34278 [26:37:27<11:32:56, 4.11s/it] {'loss': 0.1153, 'grad_norm': 0.891693853593121, 'learning_rate': 2.1199723659973466e-06, 'epoch': 0.7}
70%|███████ | 24151/34278 [26:37:31<10:42:29, 3.81s/it] {'loss': 0.1167, 'grad_norm': 0.7222184254171092, 'learning_rate': 2.1195861895378704e-06, 'epoch': 0.7}
70%|███████ | 24152/34278 [26:37:35<11:22:52, 4.05s/it] {'loss': 0.1014, 'grad_norm': 1.0003390452738457, 'learning_rate': 2.119200038794042e-06, 'epoch': 0.7}
70%|███████ | 24153/34278 [26:37:39<10:57:17, 3.90s/it] {'loss': 0.1533, 'grad_norm': 1.0467032394858526, 'learning_rate': 2.11881391376931e-06, 'epoch': 0.7}
70%|███████ | 24154/34278 [26:37:41<9:55:18, 3.53s/it] {'loss': 0.0998, 'grad_norm': 0.8797110334678755, 'learning_rate': 2.118427814467119e-06, 'epoch': 0.7}
70%|███████ | 24155/34278 [26:37:44<9:23:48, 3.34s/it] {'loss': 0.1305, 'grad_norm': 1.316442117975834, 'learning_rate': 2.118041740890915e-06, 'epoch': 0.7}
70%|███████ | 24156/34278 [26:37:48<9:43:49, 3.46s/it] {'loss': 0.1029, 'grad_norm': 0.8996957206478129, 'learning_rate': 2.117655693044148e-06, 'epoch': 0.7}
70%|███████ | 24157/34278 [26:37:51<9:12:29, 3.28s/it] {'loss': 0.1178, 'grad_norm': 0.9367241576178393, 'learning_rate': 2.117269670930263e-06, 'epoch': 0.7}
70%|███████ | 24158/34278 [26:37:57<11:36:53, 4.13s/it] {'loss': 0.1061, 'grad_norm': 0.7263129627173697, 'learning_rate': 2.116883674552703e-06, 'epoch': 0.7}
70%|███████ | 24159/34278 [26:38:01<11:36:03, 4.13s/it] {'loss': 0.1385, 'grad_norm': 1.4787850330764907, 'learning_rate': 2.1164977039149203e-06, 'epoch': 0.7}
70%|███████ | 24160/34278 [26:38:05<10:58:07, 3.90s/it] {'loss': 0.1044, 'grad_norm': 0.798915100234943, 'learning_rate': 2.116111759020358e-06, 'epoch': 0.7}
70%|███████ | 24161/34278 [26:38:10<12:23:35, 4.41s/it] {'loss': 0.1362, 'grad_norm': 1.0609816344790315, 'learning_rate': 2.1157258398724593e-06, 'epoch': 0.7}
70%|███████ | 24162/34278 [26:38:13<11:18:07, 4.02s/it] {'loss': 0.1334, 'grad_norm': 0.64276459998314, 'learning_rate': 2.1153399464746736e-06, 'epoch': 0.7}
70%|███████ | 24163/34278 [26:38:16<10:07:55, 3.61s/it] {'loss': 0.1103, 'grad_norm': 0.8832299790936908, 'learning_rate': 2.1149540788304452e-06, 'epoch': 0.7}
70%|███████ | 24164/34278 [26:38:19<10:08:26, 3.61s/it] {'loss': 0.1119, 'grad_norm': 1.2314604621688052, 'learning_rate': 2.1145682369432153e-06, 'epoch': 0.7}
70%|███████ | 24165/34278 [26:38:23<9:48:45, 3.49s/it] {'loss': 0.1112, 'grad_norm': 1.0568850293044327, 'learning_rate': 2.114182420816432e-06, 'epoch': 0.7}
71%|███████ | 24166/34278 [26:38:26<9:56:55, 3.54s/it] {'loss': 0.1089, 'grad_norm': 0.9384272229348917, 'learning_rate': 2.1137966304535407e-06, 'epoch': 0.71}
71%|███████ | 24167/34278 [26:38:32<12:00:29, 4.28s/it] {'loss': 0.1192, 'grad_norm': 0.7942333966452567, 'learning_rate': 2.1134108658579837e-06, 'epoch': 0.71}
71%|███████ | 24168/34278 [26:38:37<11:55:06, 4.24s/it] {'loss': 0.1113, 'grad_norm': 0.8440102570808579, 'learning_rate': 2.1130251270332042e-06, 'epoch': 0.71}
71%|███████ | 24169/34278 [26:38:40<10:55:58, 3.89s/it] {'loss': 0.1291, 'grad_norm': 1.235031544150926, 'learning_rate': 2.1126394139826468e-06, 'epoch': 0.71}
71%|███████ | 24170/34278 [26:38:46<12:39:02, 4.51s/it] {'loss': 0.1146, 'grad_norm': 1.1967882927456939, 'learning_rate': 2.112253726709757e-06, 'epoch': 0.71}
71%|███████ | 24171/34278 [26:38:49<11:30:52, 4.10s/it] {'loss': 0.1207, 'grad_norm': 0.9534341476626986, 'learning_rate': 2.111868065217975e-06, 'epoch': 0.71}
71%|███████ | 24172/34278 [26:38:51<10:25:03, 3.71s/it] {'loss': 0.1253, 'grad_norm': 0.8584359280085143, 'learning_rate': 2.111482429510748e-06, 'epoch': 0.71}
71%|███████ | 24173/34278 [26:38:55<10:02:36, 3.58s/it] {'loss': 0.1258, 'grad_norm': 0.8391558648829219, 'learning_rate': 2.1110968195915153e-06, 'epoch': 0.71}
71%|███████ | 24174/34278 [26:38:58<9:54:52, 3.53s/it] {'loss': 0.1256, 'grad_norm': 0.9438799524487655, 'learning_rate': 2.1107112354637194e-06, 'epoch': 0.71}
71%|███████ | 24175/34278 [26:39:01<9:22:32, 3.34s/it] {'loss': 0.1043, 'grad_norm': 0.7457031989603938, 'learning_rate': 2.1103256771308033e-06, 'epoch': 0.71}
71%|███████ | 24176/34278 [26:39:04<9:05:13, 3.24s/it] {'loss': 0.1295, 'grad_norm': 0.9067185036060394, 'learning_rate': 2.109940144596212e-06, 'epoch': 0.71}
71%|███████ | 24177/34278 [26:39:07<8:55:49, 3.18s/it] {'loss': 0.1361, 'grad_norm': 0.8066912346339389, 'learning_rate': 2.109554637863385e-06, 'epoch': 0.71}
71%|███████ | 24178/34278 [26:39:10<8:48:18, 3.14s/it] {'loss': 0.1051, 'grad_norm': 0.7969519618243077, 'learning_rate': 2.1091691569357626e-06, 'epoch': 0.71}
71%|███████ | 24179/34278 [26:39:14<9:03:22, 3.23s/it] {'loss': 0.1085, 'grad_norm': 0.8721099242465415, 'learning_rate': 2.1087837018167893e-06, 'epoch': 0.71}
71%|███████ | 24180/34278 [26:39:17<9:00:29, 3.21s/it] {'loss': 0.1113, 'grad_norm': 1.0506240532952547, 'learning_rate': 2.1083982725099055e-06, 'epoch': 0.71}
71%|███████ | 24181/34278 [26:39:20<9:04:08, 3.23s/it] {'loss': 0.1162, 'grad_norm': 1.008842102514274, 'learning_rate': 2.108012869018549e-06, 'epoch': 0.71}
71%|███████ | 24182/34278 [26:39:23<9:11:10, 3.28s/it] {'loss': 0.1041, 'grad_norm': 0.814216316192569, 'learning_rate': 2.107627491346164e-06, 'epoch': 0.71}
71%|███████ | 24183/34278 [26:39:28<10:40:37, 3.81s/it] {'loss': 0.1177, 'grad_norm': 1.0236797961912931, 'learning_rate': 2.107242139496192e-06, 'epoch': 0.71}
71%|███████ | 24184/34278 [26:39:32<10:12:44, 3.64s/it] {'loss': 0.1356, 'grad_norm': 1.13023933406375, 'learning_rate': 2.1068568134720714e-06, 'epoch': 0.71}
71%|███████ | 24185/34278 [26:39:35<9:58:32, 3.56s/it] {'loss': 0.1187, 'grad_norm': 0.838739405281954, 'learning_rate': 2.1064715132772406e-06, 'epoch': 0.71}
71%|███████ | 24186/34278 [26:39:38<9:23:55, 3.35s/it] {'loss': 0.105, 'grad_norm': 0.8367746237079655, 'learning_rate': 2.106086238915143e-06, 'epoch': 0.71}
71%|███████ | 24187/34278 [26:39:41<9:06:55, 3.25s/it] {'loss': 0.1326, 'grad_norm': 0.6874452186299084, 'learning_rate': 2.1057009903892155e-06, 'epoch': 0.71}
71%|███████ | 24188/34278 [26:39:44<8:46:32, 3.13s/it] {'loss': 0.1312, 'grad_norm': 0.8836810306274153, 'learning_rate': 2.1053157677028985e-06, 'epoch': 0.71}
71%|███████ | 24189/34278 [26:39:47<8:48:54, 3.15s/it] {'loss': 0.1117, 'grad_norm': 0.8753607875102475, 'learning_rate': 2.1049305708596322e-06, 'epoch': 0.71}
71%|███████ | 24190/34278 [26:39:51<9:23:07, 3.35s/it] {'loss': 0.1341, 'grad_norm': 0.7125699108263962, 'learning_rate': 2.1045453998628555e-06, 'epoch': 0.71}
71%|███████ | 24191/34278 [26:39:57<11:28:23, 4.09s/it] {'loss': 0.1089, 'grad_norm': 0.8221614538570634, 'learning_rate': 2.1041602547160043e-06, 'epoch': 0.71}
71%|███████ | 24192/34278 [26:40:01<11:31:30, 4.11s/it] {'loss': 0.1272, 'grad_norm': 1.0271439627238976, 'learning_rate': 2.103775135422521e-06, 'epoch': 0.71}
71%|███████ | 24193/34278 [26:40:07<12:51:32, 4.59s/it] {'loss': 0.1149, 'grad_norm': 0.7532382507475655, 'learning_rate': 2.10339004198584e-06, 'epoch': 0.71}
71%|███████ | 24194/34278 [26:40:10<11:40:50, 4.17s/it] {'loss': 0.1363, 'grad_norm': 0.7786412246151072, 'learning_rate': 2.1030049744094033e-06, 'epoch': 0.71}
71%|███████ | 24195/34278 [26:40:13<10:58:29, 3.92s/it] {'loss': 0.1155, 'grad_norm': 0.8096487627525432, 'learning_rate': 2.1026199326966447e-06, 'epoch': 0.71}
71%|███████ | 24196/34278 [26:40:17<11:12:05, 4.00s/it] {'loss': 0.1113, 'grad_norm': 0.802356014632334, 'learning_rate': 2.1022349168510047e-06, 'epoch': 0.71}
71%|███████ | 24197/34278 [26:40:22<11:47:37, 4.21s/it] {'loss': 0.1165, 'grad_norm': 0.7717028846688864, 'learning_rate': 2.10184992687592e-06, 'epoch': 0.71}
71%|███████ | 24198/34278 [26:40:25<11:01:59, 3.94s/it] {'loss': 0.1439, 'grad_norm': 0.8428359819795502, 'learning_rate': 2.1014649627748262e-06, 'epoch': 0.71}
71%|███████ | 24199/34278 [26:40:30<11:58:14, 4.28s/it] {'loss': 0.1208, 'grad_norm': 1.003031010013575, 'learning_rate': 2.101080024551161e-06, 'epoch': 0.71}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
71%|███████ | 24200/34278 [26:40:33<11:00:36, 3.93s/it] {'loss': 0.1353, 'grad_norm': 0.8247121288769128, 'learning_rate': 2.1006951122083626e-06, 'epoch': 0.71}
71%|███████ | 24201/34278 [26:40:37<10:17:58, 3.68s/it] {'loss': 0.1276, 'grad_norm': 1.0078679597901747, 'learning_rate': 2.100310225749865e-06, 'epoch': 0.71}
71%|███████ | 24202/34278 [26:40:41<10:38:29, 3.80s/it] {'loss': 0.1086, 'grad_norm': 1.0383652940890002, 'learning_rate': 2.099925365179107e-06, 'epoch': 0.71}
71%|███████ | 24203/34278 [26:40:44<10:06:44, 3.61s/it] {'loss': 0.1285, 'grad_norm': 1.040648646029909, 'learning_rate': 2.0995405304995227e-06, 'epoch': 0.71}
71%|███████ | 24204/34278 [26:40:48<10:38:05, 3.80s/it] {'loss': 0.1142, 'grad_norm': 0.9445478766082092, 'learning_rate': 2.0991557217145464e-06, 'epoch': 0.71}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
71%|███████ | 24205/34278 [26:40:51<10:15:44, 3.67s/it] {'loss': 0.1146, 'grad_norm': 0.7634335003745596, 'learning_rate': 2.0987709388276155e-06, 'epoch': 0.71}
71%|███████ | 24206/34278 [26:40:56<10:44:19, 3.84s/it] {'loss': 0.1126, 'grad_norm': 0.8196127515404932, 'learning_rate': 2.098386181842167e-06, 'epoch': 0.71}
71%|███████ | 24207/34278 [26:41:02<12:29:26, 4.46s/it] {'loss': 0.1299, 'grad_norm': 1.191064054478148, 'learning_rate': 2.0980014507616334e-06, 'epoch': 0.71}
71%|███████ | 24208/34278 [26:41:05<11:20:06, 4.05s/it] {'loss': 0.124, 'grad_norm': 0.7970035370632912, 'learning_rate': 2.097616745589449e-06, 'epoch': 0.71}
71%|███████ | 24209/34278 [26:41:08<10:59:33, 3.93s/it] {'loss': 0.1377, 'grad_norm': 0.8143835814356327, 'learning_rate': 2.097232066329051e-06, 'epoch': 0.71}
71%|███████ | 24210/34278 [26:41:12<10:32:16, 3.77s/it] {'loss': 0.1233, 'grad_norm': 0.8335081394825827, 'learning_rate': 2.0968474129838724e-06, 'epoch': 0.71}
71%|███████ | 24211/34278 [26:41:15<10:22:18, 3.71s/it] {'loss': 0.1177, 'grad_norm': 0.8513562958901598, 'learning_rate': 2.096462785557345e-06, 'epoch': 0.71}
71%|███████ | 24212/34278 [26:41:19<10:06:44, 3.62s/it] {'loss': 0.1476, 'grad_norm': 0.8739121203912802, 'learning_rate': 2.096078184052905e-06, 'epoch': 0.71}
71%|███████ | 24213/34278 [26:41:21<9:27:34, 3.38s/it] {'loss': 0.1411, 'grad_norm': 0.9402304193632139, 'learning_rate': 2.095693608473987e-06, 'epoch': 0.71}
71%|███████ | 24214/34278 [26:41:24<8:57:47, 3.21s/it] {'loss': 0.1398, 'grad_norm': 0.8652467146531886, 'learning_rate': 2.095309058824024e-06, 'epoch': 0.71}
71%|███████ | 24215/34278 [26:41:27<8:44:51, 3.13s/it] {'loss': 0.0986, 'grad_norm': 0.7144904842046801, 'learning_rate': 2.0949245351064456e-06, 'epoch': 0.71}
71%|███████ | 24216/34278 [26:41:31<9:28:48, 3.39s/it] {'loss': 0.1203, 'grad_norm': 0.8454451122292086, 'learning_rate': 2.09454003732469e-06, 'epoch': 0.71}
71%|███████ | 24217/34278 [26:41:34<9:16:49, 3.32s/it] {'loss': 0.1024, 'grad_norm': 0.8771193551816933, 'learning_rate': 2.094155565482185e-06, 'epoch': 0.71}
71%|███████ | 24218/34278 [26:41:37<9:05:34, 3.25s/it] {'loss': 0.1068, 'grad_norm': 0.963960932283189, 'learning_rate': 2.0937711195823658e-06, 'epoch': 0.71}
71%|███████ | 24219/34278 [26:41:41<9:05:11, 3.25s/it] {'loss': 0.1263, 'grad_norm': 0.8074460197791143, 'learning_rate': 2.0933866996286656e-06, 'epoch': 0.71}
71%|███████ | 24220/34278 [26:41:44<9:13:51, 3.30s/it] {'loss': 0.1471, 'grad_norm': 0.9488724299713767, 'learning_rate': 2.0930023056245156e-06, 'epoch': 0.71}
71%|███████ | 24221/34278 [26:41:48<9:27:15, 3.38s/it] {'loss': 0.1039, 'grad_norm': 0.8089768162625798, 'learning_rate': 2.092617937573345e-06, 'epoch': 0.71}
71%|███████ | 24222/34278 [26:41:53<11:19:27, 4.05s/it] {'loss': 0.1205, 'grad_norm': 0.8412545528659938, 'learning_rate': 2.0922335954785893e-06, 'epoch': 0.71}
71%|███████ | 24223/34278 [26:41:57<11:02:14, 3.95s/it] {'loss': 0.1301, 'grad_norm': 0.8330026310860443, 'learning_rate': 2.091849279343676e-06, 'epoch': 0.71}
71%|███████ | 24224/34278 [26:42:01<10:59:42, 3.94s/it] {'loss': 0.1276, 'grad_norm': 1.028899814171179, 'learning_rate': 2.09146498917204e-06, 'epoch': 0.71}
71%|███████ | 24225/34278 [26:42:04<10:28:14, 3.75s/it] {'loss': 0.1294, 'grad_norm': 0.9870883693822461, 'learning_rate': 2.0910807249671085e-06, 'epoch': 0.71}
71%|███████ | 24226/34278 [26:42:10<11:55:34, 4.27s/it] {'loss': 0.1016, 'grad_norm': 0.7054511588804655, 'learning_rate': 2.0906964867323154e-06, 'epoch': 0.71}
71%|███████ | 24227/34278 [26:42:13<10:48:16, 3.87s/it] {'loss': 0.1027, 'grad_norm': 0.8912403946712257, 'learning_rate': 2.0903122744710896e-06, 'epoch': 0.71}
71%|███████ | 24228/34278 [26:42:16<10:12:06, 3.65s/it] {'loss': 0.1303, 'grad_norm': 1.1862637949493537, 'learning_rate': 2.08992808818686e-06, 'epoch': 0.71}
71%|███████ | 24229/34278 [26:42:19<9:58:52, 3.58s/it] {'loss': 0.0978, 'grad_norm': 0.7571432834830695, 'learning_rate': 2.089543927883057e-06, 'epoch': 0.71}
71%|███████ | 24230/34278 [26:42:22<9:24:30, 3.37s/it] {'loss': 0.1307, 'grad_norm': 0.7935119833746682, 'learning_rate': 2.0891597935631134e-06, 'epoch': 0.71}
71%|███████ | 24231/34278 [26:42:25<9:06:46, 3.27s/it] {'loss': 0.1373, 'grad_norm': 0.9305514332896524, 'learning_rate': 2.088775685230454e-06, 'epoch': 0.71}
71%|███████ | 24232/34278 [26:42:28<8:49:38, 3.16s/it] {'loss': 0.1078, 'grad_norm': 1.0160859880353177, 'learning_rate': 2.0883916028885126e-06, 'epoch': 0.71}
71%|███████ | 24233/34278 [26:42:31<8:55:47, 3.20s/it] {'loss': 0.1099, 'grad_norm': 0.6911868767085836, 'learning_rate': 2.0880075465407156e-06, 'epoch': 0.71}
71%|███████ | 24234/34278 [26:42:37<10:59:12, 3.94s/it] {'loss': 0.106, 'grad_norm': 0.8930064859506339, 'learning_rate': 2.08762351619049e-06, 'epoch': 0.71}
71%|███████ | 24235/34278 [26:42:40<10:12:08, 3.66s/it] {'loss': 0.1099, 'grad_norm': 0.9108804388813418, 'learning_rate': 2.0872395118412667e-06, 'epoch': 0.71}
71%|███████ | 24236/34278 [26:42:43<10:00:00, 3.58s/it] {'loss': 0.1188, 'grad_norm': 0.8310533896476148, 'learning_rate': 2.086855533496476e-06, 'epoch': 0.71}
71%|███████ | 24237/34278 [26:42:46<9:24:56, 3.38s/it] {'loss': 0.109, 'grad_norm': 0.7936093721437792, 'learning_rate': 2.0864715811595433e-06, 'epoch': 0.71}
71%|███████ | 24238/34278 [26:42:50<9:31:26, 3.42s/it] {'loss': 0.1226, 'grad_norm': 0.8778617281934136, 'learning_rate': 2.0860876548338948e-06, 'epoch': 0.71}
71%|███████ | 24239/34278 [26:42:53<9:07:53, 3.27s/it] {'loss': 0.1257, 'grad_norm': 0.9491890116658876, 'learning_rate': 2.085703754522962e-06, 'epoch': 0.71}
71%|███████ | 24240/34278 [26:42:57<9:32:46, 3.42s/it] {'loss': 0.1164, 'grad_norm': 0.7687823104756915, 'learning_rate': 2.0853198802301705e-06, 'epoch': 0.71}
71%|███████ | 24241/34278 [26:43:00<9:16:06, 3.32s/it] {'loss': 0.0944, 'grad_norm': 0.7390914642422536, 'learning_rate': 2.0849360319589456e-06, 'epoch': 0.71}
71%|███████ | 24241/34278 [26:43:00<9:16:06, 3.32s/it] 71%|███████ | 24242/34278 [26:43:06<11:28:53, 4.12s/it] {'loss': 0.1069, 'grad_norm': 0.8037881174333962, 'learning_rate': 2.0845522097127156e-06, 'epoch': 0.71} 71%|███████ | 24242/34278 [26:43:06<11:28:53, 4.12s/it] 71%|███████ | 24243/34278 [26:43:11<12:37:03, 4.53s/it] {'loss': 0.1247, 'grad_norm': 0.8585303715610015, 'learning_rate': 2.08416841349491e-06, 'epoch': 0.71} 71%|███████ | 24243/34278 [26:43:11<12:37:03, 4.53s/it] 71%|███████ | 24244/34278 [26:43:14<11:28:01, 4.11s/it] {'loss': 0.118, 'grad_norm': 0.9026023504426673, 'learning_rate': 2.0837846433089516e-06, 'epoch': 0.71} 71%|███████ | 24244/34278 [26:43:14<11:28:01, 4.11s/it] 71%|███████ | 24245/34278 [26:43:17<10:39:18, 3.82s/it] {'loss': 0.1312, 'grad_norm': 0.9478598025329202, 'learning_rate': 2.0834008991582666e-06, 'epoch': 0.71} 71%|███████ | 24245/34278 [26:43:17<10:39:18, 3.82s/it] 71%|███████ | 24246/34278 [26:43:21<10:26:54, 3.75s/it] {'loss': 0.1089, 'grad_norm': 1.3842624142818118, 'learning_rate': 2.083017181046284e-06, 'epoch': 0.71} 71%|███████ | 24246/34278 [26:43:21<10:26:54, 3.75s/it] 71%|███████ | 24247/34278 [26:43:26<11:35:00, 4.16s/it] {'loss': 0.1181, 'grad_norm': 0.8416271506515415, 'learning_rate': 2.0826334889764254e-06, 'epoch': 0.71} 71%|███████ | 24247/34278 [26:43:26<11:35:00, 4.16s/it] 71%|███████ | 24248/34278 [26:43:29<10:32:55, 3.79s/it] {'loss': 0.1063, 'grad_norm': 0.9323315489613925, 'learning_rate': 2.0822498229521195e-06, 'epoch': 0.71} 71%|███████ | 24248/34278 [26:43:29<10:32:55, 3.79s/it] 71%|███████ | 24249/34278 [26:43:32<9:54:02, 3.55s/it] {'loss': 0.148, 'grad_norm': 1.0434978980243386, 'learning_rate': 2.0818661829767915e-06, 'epoch': 0.71} 71%|███████ | 24249/34278 [26:43:32<9:54:02, 3.55s/it] 71%|███████ | 24250/34278 [26:43:35<9:34:12, 3.44s/it] {'loss': 0.1129, 'grad_norm': 1.0057344394718042, 'learning_rate': 2.081482569053866e-06, 'epoch': 0.71} 71%|███████ | 24250/34278 [26:43:35<9:34:12, 
3.44s/it] 71%|███████ | 24251/34278 [26:43:39<10:14:32, 3.68s/it] {'loss': 0.136, 'grad_norm': 0.8645375539330916, 'learning_rate': 2.0810989811867656e-06, 'epoch': 0.71} 71%|███████ | 24251/34278 [26:43:39<10:14:32, 3.68s/it] 71%|███████ | 24252/34278 [26:43:46<12:15:47, 4.40s/it] {'loss': 0.1147, 'grad_norm': 1.0281069821712148, 'learning_rate': 2.0807154193789185e-06, 'epoch': 0.71} 71%|███████ | 24252/34278 [26:43:46<12:15:47, 4.40s/it] 71%|███████ | 24253/34278 [26:43:49<11:12:48, 4.03s/it] {'loss': 0.1102, 'grad_norm': 0.9376559980706509, 'learning_rate': 2.0803318836337453e-06, 'epoch': 0.71} 71%|███████ | 24253/34278 [26:43:49<11:12:48, 4.03s/it] 71%|███████ | 24254/34278 [26:43:52<10:29:22, 3.77s/it] {'loss': 0.1153, 'grad_norm': 0.9411723033745302, 'learning_rate': 2.0799483739546745e-06, 'epoch': 0.71} 71%|███████ | 24254/34278 [26:43:52<10:29:22, 3.77s/it] 71%|███████ | 24255/34278 [26:43:55<10:05:37, 3.63s/it] {'loss': 0.1083, 'grad_norm': 0.7318044208269741, 'learning_rate': 2.0795648903451247e-06, 'epoch': 0.71} 71%|███████ | 24255/34278 [26:43:55<10:05:37, 3.63s/it] 71%|███████ | 24256/34278 [26:43:58<9:40:10, 3.47s/it] {'loss': 0.0976, 'grad_norm': 0.8638369251224559, 'learning_rate': 2.079181432808525e-06, 'epoch': 0.71} 71%|███████ | 24256/34278 [26:43:58<9:40:10, 3.47s/it] 71%|███████ | 24257/34278 [26:44:01<9:27:50, 3.40s/it] {'loss': 0.1056, 'grad_norm': 1.1350800421229454, 'learning_rate': 2.0787980013482963e-06, 'epoch': 0.71} 71%|███████ | 24257/34278 [26:44:01<9:27:50, 3.40s/it] 71%|███████ | 24258/34278 [26:44:07<11:18:07, 4.06s/it] {'loss': 0.1165, 'grad_norm': 0.9195655840391993, 'learning_rate': 2.0784145959678592e-06, 'epoch': 0.71} 71%|███████ | 24258/34278 [26:44:07<11:18:07, 4.06s/it] 71%|███████ | 24259/34278 [26:44:10<10:22:35, 3.73s/it] {'loss': 0.1298, 'grad_norm': 0.8873180874749146, 'learning_rate': 2.0780312166706396e-06, 'epoch': 0.71} 71%|███████ | 24259/34278 [26:44:10<10:22:35, 3.73s/it] 71%|███████ | 24260/34278 
[26:44:13<9:38:40, 3.47s/it] {'loss': 0.1162, 'grad_norm': 0.8909275358850974, 'learning_rate': 2.0776478634600616e-06, 'epoch': 0.71} 71%|███████ | 24260/34278 [26:44:13<9:38:40, 3.47s/it] 71%|███████ | 24261/34278 [26:44:16<9:19:23, 3.35s/it] {'loss': 0.1368, 'grad_norm': 0.802875624164899, 'learning_rate': 2.077264536339544e-06, 'epoch': 0.71} 71%|███████ | 24261/34278 [26:44:16<9:19:23, 3.35s/it] 71%|███████ | 24262/34278 [26:44:20<9:32:16, 3.43s/it] {'loss': 0.1111, 'grad_norm': 0.9929377481657958, 'learning_rate': 2.076881235312512e-06, 'epoch': 0.71} 71%|███████ | 24262/34278 [26:44:20<9:32:16, 3.43s/it] 71%|███████ | 24263/34278 [26:44:24<10:02:41, 3.61s/it] {'loss': 0.1292, 'grad_norm': 1.263614547830916, 'learning_rate': 2.0764979603823877e-06, 'epoch': 0.71} 71%|███████ | 24263/34278 [26:44:24<10:02:41, 3.61s/it] 71%|███████ | 24264/34278 [26:44:27<9:40:24, 3.48s/it] {'loss': 0.1119, 'grad_norm': 0.8915764046134468, 'learning_rate': 2.076114711552589e-06, 'epoch': 0.71} 71%|███████ | 24264/34278 [26:44:27<9:40:24, 3.48s/it] 71%|███████ | 24265/34278 [26:44:30<9:26:13, 3.39s/it] {'loss': 0.1076, 'grad_norm': 0.9872741977126354, 'learning_rate': 2.0757314888265404e-06, 'epoch': 0.71} 71%|███████ | 24265/34278 [26:44:30<9:26:13, 3.39s/it] 71%|███████ | 24266/34278 [26:44:33<8:56:09, 3.21s/it] {'loss': 0.1314, 'grad_norm': 1.0237986457693453, 'learning_rate': 2.075348292207665e-06, 'epoch': 0.71} 71%|███████ | 24266/34278 [26:44:33<8:56:09, 3.21s/it] 71%|███████ | 24267/34278 [26:44:37<9:32:59, 3.43s/it] {'loss': 0.1433, 'grad_norm': 1.2074351477954173, 'learning_rate': 2.074965121699382e-06, 'epoch': 0.71} 71%|███████ | 24267/34278 [26:44:37<9:32:59, 3.43s/it] 71%|███████ | 24268/34278 [26:44:40<9:47:26, 3.52s/it] {'loss': 0.1077, 'grad_norm': 1.128488236197808, 'learning_rate': 2.0745819773051103e-06, 'epoch': 0.71} 71%|███████ | 24268/34278 [26:44:40<9:47:26, 3.52s/it] 71%|███████ | 24269/34278 [26:44:46<11:35:24, 4.17s/it] {'loss': 0.115, 'grad_norm': 
0.6981280622894253, 'learning_rate': 2.074198859028274e-06, 'epoch': 0.71} 71%|███████ | 24269/34278 [26:44:46<11:35:24, 4.17s/it] 71%|███████ | 24270/34278 [26:44:50<11:44:44, 4.23s/it] {'loss': 0.115, 'grad_norm': 1.0420269362369856, 'learning_rate': 2.073815766872292e-06, 'epoch': 0.71} 71%|███████ | 24270/34278 [26:44:50<11:44:44, 4.23s/it] 71%|███████ | 24271/34278 [26:44:54<10:50:39, 3.90s/it] {'loss': 0.1263, 'grad_norm': 0.8642225907002064, 'learning_rate': 2.073432700840582e-06, 'epoch': 0.71} 71%|███████ | 24271/34278 [26:44:54<10:50:39, 3.90s/it] 71%|███████ | 24272/34278 [26:44:58<11:01:47, 3.97s/it] {'loss': 0.1155, 'grad_norm': 1.2527782822964186, 'learning_rate': 2.073049660936567e-06, 'epoch': 0.71} 71%|███████ | 24272/34278 [26:44:58<11:01:47, 3.97s/it] 71%|███████ | 24273/34278 [26:45:02<11:06:14, 4.00s/it] {'loss': 0.1222, 'grad_norm': 1.03955845050689, 'learning_rate': 2.072666647163667e-06, 'epoch': 0.71} 71%|███████ | 24273/34278 [26:45:02<11:06:14, 4.00s/it] 71%|███████ | 24274/34278 [26:45:05<10:07:16, 3.64s/it] {'loss': 0.1242, 'grad_norm': 1.1281478865665038, 'learning_rate': 2.0722836595253004e-06, 'epoch': 0.71} 71%|███████ | 24274/34278 [26:45:05<10:07:16, 3.64s/it] 71%|███████ | 24275/34278 [26:45:08<9:53:22, 3.56s/it] {'loss': 0.1027, 'grad_norm': 0.638413992170037, 'learning_rate': 2.071900698024885e-06, 'epoch': 0.71} 71%|███████ | 24275/34278 [26:45:08<9:53:22, 3.56s/it] 71%|███████ | 24276/34278 [26:45:14<11:57:14, 4.30s/it] {'loss': 0.1112, 'grad_norm': 1.2789385595384781, 'learning_rate': 2.0715177626658427e-06, 'epoch': 0.71} 71%|███████ | 24276/34278 [26:45:14<11:57:14, 4.30s/it] 71%|███████ | 24277/34278 [26:45:18<11:55:34, 4.29s/it] {'loss': 0.1096, 'grad_norm': 0.9262113633692753, 'learning_rate': 2.071134853451589e-06, 'epoch': 0.71} 71%|███████ | 24277/34278 [26:45:18<11:55:34, 4.29s/it] 71%|███████ | 24278/34278 [26:45:21<10:34:59, 3.81s/it] {'loss': 0.1058, 'grad_norm': 0.9699446681662852, 'learning_rate': 
2.0707519703855446e-06, 'epoch': 0.71} 71%|███████ | 24278/34278 [26:45:21<10:34:59, 3.81s/it] 71%|███████ | 24279/34278 [26:45:25<10:25:24, 3.75s/it] {'loss': 0.1041, 'grad_norm': 0.8942764422388755, 'learning_rate': 2.0703691134711284e-06, 'epoch': 0.71} 71%|███████ | 24279/34278 [26:45:25<10:25:24, 3.75s/it] 71%|███████ | 24280/34278 [26:45:30<11:45:27, 4.23s/it] {'loss': 0.1106, 'grad_norm': 0.8257282047171776, 'learning_rate': 2.0699862827117576e-06, 'epoch': 0.71} 71%|███████ | 24280/34278 [26:45:30<11:45:27, 4.23s/it] 71%|███████ | 24281/34278 [26:45:33<10:47:16, 3.88s/it] {'loss': 0.1266, 'grad_norm': 0.8997744291216947, 'learning_rate': 2.069603478110848e-06, 'epoch': 0.71} 71%|███████ | 24281/34278 [26:45:33<10:47:16, 3.88s/it] 71%|███████ | 24282/34278 [26:45:36<10:23:33, 3.74s/it] {'loss': 0.1163, 'grad_norm': 0.8860122375314241, 'learning_rate': 2.069220699671821e-06, 'epoch': 0.71} 71%|███████ | 24282/34278 [26:45:36<10:23:33, 3.74s/it] 71%|███████ | 24283/34278 [26:45:40<10:29:07, 3.78s/it] {'loss': 0.1143, 'grad_norm': 0.7941591419252904, 'learning_rate': 2.0688379473980904e-06, 'epoch': 0.71} 71%|███████ | 24283/34278 [26:45:40<10:29:07, 3.78s/it] 71%|███████ | 24284/34278 [26:45:44<10:11:16, 3.67s/it] {'loss': 0.1213, 'grad_norm': 1.0087317633787842, 'learning_rate': 2.068455221293076e-06, 'epoch': 0.71} 71%|███████ | 24284/34278 [26:45:44<10:11:16, 3.67s/it] 71%|███████ | 24285/34278 [26:45:47<9:50:10, 3.54s/it] {'loss': 0.1065, 'grad_norm': 0.8425652259654369, 'learning_rate': 2.068072521360192e-06, 'epoch': 0.71} 71%|███████ | 24285/34278 [26:45:47<9:50:10, 3.54s/it] 71%|███████ | 24286/34278 [26:45:51<10:34:50, 3.81s/it] {'loss': 0.1361, 'grad_norm': 0.839806586690853, 'learning_rate': 2.067689847602859e-06, 'epoch': 0.71} 71%|███████ | 24286/34278 [26:45:51<10:34:50, 3.81s/it] 71%|███████ | 24287/34278 [26:45:54<9:39:13, 3.48s/it] {'loss': 0.122, 'grad_norm': 0.6598314202266496, 'learning_rate': 2.0673072000244902e-06, 'epoch': 0.71} 
71%|███████ | 24287/34278 [26:45:54<9:39:13, 3.48s/it] 71%|███████ | 24288/34278 [26:45:57<9:27:38, 3.41s/it] {'loss': 0.1239, 'grad_norm': 0.9482659430429057, 'learning_rate': 2.0669245786285015e-06, 'epoch': 0.71} 71%|███████ | 24288/34278 [26:45:57<9:27:38, 3.41s/it] 71%|███████ | 24289/34278 [26:46:03<11:29:00, 4.14s/it] {'loss': 0.1107, 'grad_norm': 0.8901788825063378, 'learning_rate': 2.0665419834183093e-06, 'epoch': 0.71} 71%|███████ | 24289/34278 [26:46:03<11:29:00, 4.14s/it] 71%|███████ | 24290/34278 [26:46:06<10:38:36, 3.84s/it] {'loss': 0.1134, 'grad_norm': 0.7752036240710392, 'learning_rate': 2.0661594143973323e-06, 'epoch': 0.71} 71%|███████ | 24290/34278 [26:46:06<10:38:36, 3.84s/it] 71%|███████ | 24291/34278 [26:46:09<9:59:32, 3.60s/it] {'loss': 0.1192, 'grad_norm': 0.9534933633836123, 'learning_rate': 2.065776871568982e-06, 'epoch': 0.71} 71%|███████ | 24291/34278 [26:46:09<9:59:32, 3.60s/it] 71%|███████ | 24292/34278 [26:46:13<9:43:58, 3.51s/it] {'loss': 0.1058, 'grad_norm': 0.7428279053234718, 'learning_rate': 2.0653943549366768e-06, 'epoch': 0.71} 71%|███████ | 24292/34278 [26:46:13<9:43:58, 3.51s/it] 71%|███████ | 24293/34278 [26:46:16<9:35:21, 3.46s/it] {'loss': 0.0999, 'grad_norm': 0.8873485982934974, 'learning_rate': 2.0650118645038304e-06, 'epoch': 0.71} 71%|███████ | 24293/34278 [26:46:16<9:35:21, 3.46s/it] 71%|███████ | 24294/34278 [26:46:21<11:01:25, 3.97s/it] {'loss': 0.1284, 'grad_norm': 0.6832893915531372, 'learning_rate': 2.0646294002738555e-06, 'epoch': 0.71} 71%|███████ | 24294/34278 [26:46:21<11:01:25, 3.97s/it] 71%|███████ | 24295/34278 [26:46:24<9:53:50, 3.57s/it] {'loss': 0.1205, 'grad_norm': 0.9197184713157679, 'learning_rate': 2.0642469622501686e-06, 'epoch': 0.71} 71%|███████ | 24295/34278 [26:46:24<9:53:50, 3.57s/it] 71%|███████ | 24296/34278 [26:46:27<9:34:55, 3.46s/it] {'loss': 0.1118, 'grad_norm': 0.7557684728704201, 'learning_rate': 2.0638645504361858e-06, 'epoch': 0.71} 71%|███████ | 24296/34278 [26:46:27<9:34:55, 
3.46s/it] 71%|███████ | 24297/34278 [26:46:31<9:38:38, 3.48s/it] {'loss': 0.1037, 'grad_norm': 0.6931123673153663, 'learning_rate': 2.0634821648353197e-06, 'epoch': 0.71} 71%|███████ | 24297/34278 [26:46:31<9:38:38, 3.48s/it] 71%|███████ | 24298/34278 [26:46:36<11:12:19, 4.04s/it] {'loss': 0.1334, 'grad_norm': 0.855582365801452, 'learning_rate': 2.063099805450982e-06, 'epoch': 0.71} 71%|███████ | 24298/34278 [26:46:36<11:12:19, 4.04s/it] 71%|███████ | 24299/34278 [26:46:39<10:14:32, 3.70s/it] {'loss': 0.1021, 'grad_norm': 0.8469217777266271, 'learning_rate': 2.0627174722865894e-06, 'epoch': 0.71} 71%|███████ | 24299/34278 [26:46:39<10:14:32, 3.70s/it] 71%|███████ | 24300/34278 [26:46:42<9:46:25, 3.53s/it] {'loss': 0.1028, 'grad_norm': 0.8223315291781041, 'learning_rate': 2.062335165345555e-06, 'epoch': 0.71} 71%|███████ | 24300/34278 [26:46:42<9:46:25, 3.53s/it] 71%|███████ | 24301/34278 [26:46:47<10:44:23, 3.88s/it] {'loss': 0.1283, 'grad_norm': 0.8780101299214722, 'learning_rate': 2.0619528846312882e-06, 'epoch': 0.71} 71%|███████ | 24301/34278 [26:46:47<10:44:23, 3.88s/it] 71%|███████ | 24302/34278 [26:46:50<10:17:59, 3.72s/it] {'loss': 0.1347, 'grad_norm': 0.6556728397295637, 'learning_rate': 2.061570630147205e-06, 'epoch': 0.71} 71%|███████ | 24302/34278 [26:46:50<10:17:59, 3.72s/it] 71%|███████ | 24303/34278 [26:46:53<9:39:41, 3.49s/it] {'loss': 0.1252, 'grad_norm': 0.9950550449399701, 'learning_rate': 2.0611884018967195e-06, 'epoch': 0.71} 71%|███████ | 24303/34278 [26:46:53<9:39:41, 3.49s/it] 71%|███████ | 24304/34278 [26:46:56<9:25:24, 3.40s/it] {'loss': 0.1046, 'grad_norm': 0.8847755573130276, 'learning_rate': 2.0608061998832423e-06, 'epoch': 0.71} 71%|███████ | 24304/34278 [26:46:56<9:25:24, 3.40s/it] 71%|███████ | 24305/34278 [26:46:59<9:21:04, 3.38s/it] {'loss': 0.1364, 'grad_norm': 0.7599713932287432, 'learning_rate': 2.0604240241101843e-06, 'epoch': 0.71} 71%|███████ | 24305/34278 [26:46:59<9:21:04, 3.38s/it] 71%|███████ | 24306/34278 
[26:47:03<9:23:54, 3.39s/it] {'loss': 0.1397, 'grad_norm': 1.1163898676023372, 'learning_rate': 2.0600418745809602e-06, 'epoch': 0.71} 71%|███████ | 24306/34278 [26:47:03<9:23:54, 3.39s/it] 71%|███████ | 24307/34278 [26:47:06<9:09:55, 3.31s/it] {'loss': 0.1252, 'grad_norm': 0.7909592846118836, 'learning_rate': 2.059659751298979e-06, 'epoch': 0.71} 71%|███████ | 24307/34278 [26:47:06<9:09:55, 3.31s/it] 71%|███████ | 24308/34278 [26:47:09<9:10:34, 3.31s/it] {'loss': 0.157, 'grad_norm': 0.9164340973327898, 'learning_rate': 2.0592776542676535e-06, 'epoch': 0.71} 71%|███████ | 24308/34278 [26:47:09<9:10:34, 3.31s/it] 71%|███████ | 24309/34278 [26:47:12<8:55:41, 3.22s/it] {'loss': 0.1241, 'grad_norm': 0.9238638354559386, 'learning_rate': 2.0588955834903966e-06, 'epoch': 0.71} 71%|███████ | 24309/34278 [26:47:12<8:55:41, 3.22s/it] 71%|███████ | 24310/34278 [26:47:15<8:47:23, 3.17s/it] {'loss': 0.1131, 'grad_norm': 0.8793393489014347, 'learning_rate': 2.0585135389706185e-06, 'epoch': 0.71} 71%|███████ | 24310/34278 [26:47:15<8:47:23, 3.17s/it] 71%|███████ | 24311/34278 [26:47:20<9:44:04, 3.52s/it] {'loss': 0.1114, 'grad_norm': 0.6373542181329356, 'learning_rate': 2.058131520711727e-06, 'epoch': 0.71} 71%|███████ | 24311/34278 [26:47:20<9:44:04, 3.52s/it] 71%|███████ | 24312/34278 [26:47:22<9:08:20, 3.30s/it] {'loss': 0.1167, 'grad_norm': 0.8941034902251721, 'learning_rate': 2.0577495287171374e-06, 'epoch': 0.71} 71%|███████ | 24312/34278 [26:47:22<9:08:20, 3.30s/it] 71%|███████ | 24313/34278 [26:47:26<9:12:13, 3.32s/it] {'loss': 0.1386, 'grad_norm': 0.7132029447358353, 'learning_rate': 2.057367562990255e-06, 'epoch': 0.71} 71%|███████ | 24313/34278 [26:47:26<9:12:13, 3.32s/it] 71%|███████ | 24314/34278 [26:47:29<8:55:37, 3.23s/it] {'loss': 0.1013, 'grad_norm': 1.4641892671757206, 'learning_rate': 2.0569856235344947e-06, 'epoch': 0.71} 71%|███████ | 24314/34278 [26:47:29<8:55:37, 3.23s/it] 71%|███████ | 24315/34278 [26:47:33<9:22:31, 3.39s/it] {'loss': 0.1235, 'grad_norm': 
0.9844328535665606, 'learning_rate': 2.0566037103532628e-06, 'epoch': 0.71} 71%|███████ | 24315/34278 [26:47:33<9:22:31, 3.39s/it] 71%|███████ | 24316/34278 [26:47:36<9:05:37, 3.29s/it] {'loss': 0.1089, 'grad_norm': 0.9058529002089648, 'learning_rate': 2.0562218234499714e-06, 'epoch': 0.71} 71%|███████ | 24316/34278 [26:47:36<9:05:37, 3.29s/it] 71%|███████ | 24317/34278 [26:47:39<9:04:50, 3.28s/it] {'loss': 0.1327, 'grad_norm': 0.828066655121191, 'learning_rate': 2.055839962828029e-06, 'epoch': 0.71} 71%|███████ | 24317/34278 [26:47:39<9:04:50, 3.28s/it] 71%|███████ | 24318/34278 [26:47:42<9:04:13, 3.28s/it] {'loss': 0.1158, 'grad_norm': 0.7649869333445348, 'learning_rate': 2.055458128490843e-06, 'epoch': 0.71} 71%|███████ | 24318/34278 [26:47:42<9:04:13, 3.28s/it] 71%|███████ | 24319/34278 [26:47:48<11:18:58, 4.09s/it] {'loss': 0.128, 'grad_norm': 0.8032000343573327, 'learning_rate': 2.055076320441824e-06, 'epoch': 0.71} 71%|███████ | 24319/34278 [26:47:48<11:18:58, 4.09s/it] 71%|███████ | 24320/34278 [26:47:51<10:38:50, 3.85s/it] {'loss': 0.1304, 'grad_norm': 0.9063916433840569, 'learning_rate': 2.0546945386843826e-06, 'epoch': 0.71} 71%|███████ | 24320/34278 [26:47:51<10:38:50, 3.85s/it] 71%|███████ | 24321/34278 [26:47:55<10:00:56, 3.62s/it] {'loss': 0.1068, 'grad_norm': 0.7374586396840892, 'learning_rate': 2.0543127832219246e-06, 'epoch': 0.71} 71%|███████ | 24321/34278 [26:47:55<10:00:56, 3.62s/it] 71%|███████ | 24322/34278 [26:47:57<9:24:19, 3.40s/it] {'loss': 0.122, 'grad_norm': 0.7476665841702215, 'learning_rate': 2.053931054057857e-06, 'epoch': 0.71} 71%|███████ | 24322/34278 [26:47:57<9:24:19, 3.40s/it] 71%|███████ | 24323/34278 [26:48:01<9:27:30, 3.42s/it] {'loss': 0.1347, 'grad_norm': 0.8550252027256865, 'learning_rate': 2.0535493511955925e-06, 'epoch': 0.71} 71%|███████ | 24323/34278 [26:48:01<9:27:30, 3.42s/it] 71%|███████ | 24324/34278 [26:48:04<8:52:18, 3.21s/it] {'loss': 0.1207, 'grad_norm': 0.8523804032655143, 'learning_rate': 
2.053167674638533e-06, 'epoch': 0.71} 71%|███████ | 24324/34278 [26:48:04<8:52:18, 3.21s/it] 71%|███████ | 24325/34278 [26:48:10<11:08:01, 4.03s/it] {'loss': 0.0878, 'grad_norm': 0.6336614375024203, 'learning_rate': 2.0527860243900898e-06, 'epoch': 0.71} 71%|███████ | 24325/34278 [26:48:10<11:08:01, 4.03s/it] 71%|███████ | 24326/34278 [26:48:16<12:45:17, 4.61s/it] {'loss': 0.1415, 'grad_norm': 0.7924964057217228, 'learning_rate': 2.0524044004536716e-06, 'epoch': 0.71} 71%|███████ | 24326/34278 [26:48:16<12:45:17, 4.61s/it] 71%|███████ | 24327/34278 [26:48:18<11:13:39, 4.06s/it] {'loss': 0.1052, 'grad_norm': 0.8762698703750451, 'learning_rate': 2.052022802832682e-06, 'epoch': 0.71} 71%|███████ | 24327/34278 [26:48:18<11:13:39, 4.06s/it] 71%|███████ | 24328/34278 [26:48:21<10:25:26, 3.77s/it] {'loss': 0.1204, 'grad_norm': 0.9481505755848734, 'learning_rate': 2.0516412315305282e-06, 'epoch': 0.71} 71%|███████ | 24328/34278 [26:48:21<10:25:26, 3.77s/it] 71%|███████ | 24329/34278 [26:48:26<11:00:05, 3.98s/it] {'loss': 0.1297, 'grad_norm': 0.9267284542420775, 'learning_rate': 2.0512596865506195e-06, 'epoch': 0.71} 71%|███████ | 24329/34278 [26:48:26<11:00:05, 3.98s/it] 71%|███████ | 24330/34278 [26:48:30<11:01:32, 3.99s/it] {'loss': 0.1129, 'grad_norm': 0.8690038560467187, 'learning_rate': 2.05087816789636e-06, 'epoch': 0.71} 71%|███████ | 24330/34278 [26:48:30<11:01:32, 3.99s/it] 71%|███████ | 24331/34278 [26:48:33<10:04:21, 3.65s/it] {'loss': 0.107, 'grad_norm': 0.9277653152803282, 'learning_rate': 2.0504966755711547e-06, 'epoch': 0.71} 71%|███████ | 24331/34278 [26:48:33<10:04:21, 3.65s/it] 71%|███████ | 24332/34278 [26:48:36<9:37:04, 3.48s/it] {'loss': 0.1085, 'grad_norm': 0.9933305147356886, 'learning_rate': 2.0501152095784105e-06, 'epoch': 0.71} 71%|███████ | 24332/34278 [26:48:36<9:37:04, 3.48s/it] 71%|███████ | 24333/34278 [26:48:42<11:37:35, 4.21s/it] {'loss': 0.1298, 'grad_norm': 0.9288477974599171, 'learning_rate': 2.049733769921536e-06, 'epoch': 0.71} 
71%|███████ | 24333/34278 [26:48:42<11:37:35, 4.21s/it] 71%|███████ | 24334/34278 [26:48:48<13:08:17, 4.76s/it] {'loss': 0.1251, 'grad_norm': 1.02337141711138, 'learning_rate': 2.0493523566039334e-06, 'epoch': 0.71} 71%|███████ | 24334/34278 [26:48:48<13:08:17, 4.76s/it] 71%|███████ | 24335/34278 [26:48:51<11:31:02, 4.17s/it] {'loss': 0.1173, 'grad_norm': 1.1745951293250398, 'learning_rate': 2.0489709696290073e-06, 'epoch': 0.71} 71%|███████ | 24335/34278 [26:48:51<11:31:02, 4.17s/it] 71%|███████ | 24336/34278 [26:48:57<13:03:24, 4.73s/it] {'loss': 0.1343, 'grad_norm': 0.8583992001256617, 'learning_rate': 2.0485896090001657e-06, 'epoch': 0.71} 71%|███████ | 24336/34278 [26:48:57<13:03:24, 4.73s/it] 71%|███████ | 24337/34278 [26:49:00<11:37:36, 4.21s/it] {'loss': 0.1042, 'grad_norm': 0.8578304036689427, 'learning_rate': 2.0482082747208092e-06, 'epoch': 0.71} 71%|███████ | 24337/34278 [26:49:00<11:37:36, 4.21s/it] 71%|███████ | 24338/34278 [26:49:03<10:43:47, 3.89s/it] {'loss': 0.1128, 'grad_norm': 0.942450149665433, 'learning_rate': 2.0478269667943453e-06, 'epoch': 0.71} 71%|███████ | 24338/34278 [26:49:03<10:43:47, 3.89s/it] 71%|███████ | 24339/34278 [26:49:06<10:11:38, 3.69s/it] {'loss': 0.1183, 'grad_norm': 0.9497845138690882, 'learning_rate': 2.047445685224179e-06, 'epoch': 0.71} 71%|███████ | 24339/34278 [26:49:06<10:11:38, 3.69s/it]
Token indices sequence length is longer than the specified maximum sequence length for this model (12324 > 8192). Running this sequence through the model will result in indexing errors
71%|███████ | 24340/34278 [26:49:09<9:45:43, 3.54s/it] {'loss': 0.1251, 'grad_norm': 0.7537585600108339, 'learning_rate': 2.047064430013713e-06, 'epoch': 0.71} 71%|███████ | 24340/34278 [26:49:09<9:45:43, 3.54s/it] 71%|███████ | 24341/34278 [26:49:16<12:08:39, 4.40s/it] {'loss': 0.1167, 'grad_norm': 0.9194084478200095, 'learning_rate': 2.0466832011663486e-06, 'epoch': 0.71} 71%|███████ | 24341/34278 [26:49:16<12:08:39, 4.40s/it] 71%|███████ | 24342/34278 [26:49:18<10:48:24, 3.92s/it] {'loss': 0.1152, 'grad_norm': 0.7379825674920547, 'learning_rate': 2.0463019986854932e-06, 'epoch': 0.71} 71%|███████ | 24342/34278 [26:49:18<10:48:24, 3.92s/it] 71%|███████ | 24343/34278 [26:49:25<12:50:15, 4.65s/it] {'loss': 0.1121, 'grad_norm': 0.8357722724183001, 'learning_rate': 2.045920822574547e-06, 'epoch': 0.71} 71%|███████ | 24343/34278 [26:49:25<12:50:15, 4.65s/it] 71%|███████ | 24344/34278 [26:49:28<11:32:04, 4.18s/it] {'loss': 0.1283, 'grad_norm': 0.9264669869312903, 'learning_rate': 2.0455396728369165e-06, 'epoch': 0.71} 71%|███████ | 24344/34278 [26:49:28<11:32:04, 4.18s/it] 71%|███████ | 24345/34278 [26:49:31<10:45:13, 3.90s/it] {'loss': 0.1166, 'grad_norm': 0.8289058552466958, 'learning_rate': 2.045158549476e-06, 'epoch': 0.71} 71%|███████ | 24345/34278 [26:49:31<10:45:13, 3.90s/it] 71%|███████ | 24346/34278 [26:49:36<11:19:17, 4.10s/it] {'loss': 0.1097, 'grad_norm': 0.8389492003646455, 'learning_rate': 2.0447774524952054e-06, 'epoch': 0.71} 71%|███████ | 24346/34278 [26:49:36<11:19:17, 4.10s/it] 71%|███████ | 24347/34278 [26:49:42<12:51:40, 4.66s/it] {'loss': 0.1299, 'grad_norm': 0.7369558637489452, 'learning_rate': 2.0443963818979318e-06, 'epoch': 0.71} 71%|███████ | 24347/34278 [26:49:42<12:51:40, 4.66s/it] 71%|███████ | 24348/34278 [26:49:45<12:04:28, 4.38s/it] {'loss': 0.1277, 'grad_norm': 1.345947586898109, 'learning_rate': 2.0440153376875797e-06, 'epoch': 0.71} 71%|███████ | 24348/34278 
[26:49:45<12:04:28, 4.38s/it] 71%|███████ | 24349/34278 [26:49:48<10:49:16, 3.92s/it] {'loss': 0.1122, 'grad_norm': 0.8749608023462779, 'learning_rate': 2.0436343198675535e-06, 'epoch': 0.71} 71%|███████ | 24349/34278 [26:49:48<10:49:16, 3.92s/it] 71%|███████ | 24350/34278 [26:49:51<10:02:08, 3.64s/it] {'loss': 0.1079, 'grad_norm': 0.9801531819234112, 'learning_rate': 2.0432533284412556e-06, 'epoch': 0.71} 71%|███████ | 24350/34278 [26:49:51<10:02:08, 3.64s/it] 71%|███████ | 24351/34278 [26:49:54<9:16:17, 3.36s/it] {'loss': 0.1288, 'grad_norm': 0.7246153212649536, 'learning_rate': 2.0428723634120864e-06, 'epoch': 0.71} 71%|███████ | 24351/34278 [26:49:54<9:16:17, 3.36s/it] 71%|███████ | 24352/34278 [26:49:57<9:00:22, 3.27s/it] {'loss': 0.0999, 'grad_norm': 0.8065126575482403, 'learning_rate': 2.042491424783445e-06, 'epoch': 0.71} 71%|███████ | 24352/34278 [26:49:57<9:00:22, 3.27s/it] 71%|███████ | 24353/34278 [26:50:02<10:45:10, 3.90s/it] {'loss': 0.1155, 'grad_norm': 0.8847770818441456, 'learning_rate': 2.042110512558736e-06, 'epoch': 0.71} 71%|███████ | 24353/34278 [26:50:02<10:45:10, 3.90s/it] 71%|███████ | 24354/34278 [26:50:07<11:41:43, 4.24s/it] {'loss': 0.1134, 'grad_norm': 0.86288922139475, 'learning_rate': 2.0417296267413562e-06, 'epoch': 0.71} 71%|███████ | 24354/34278 [26:50:07<11:41:43, 4.24s/it] 71%|███████ | 24355/34278 [26:50:10<10:49:14, 3.93s/it] {'loss': 0.1066, 'grad_norm': 1.185462512963098, 'learning_rate': 2.0413487673347083e-06, 'epoch': 0.71} 71%|███████ | 24355/34278 [26:50:10<10:49:14, 3.93s/it] 71%|███████ | 24356/34278 [26:50:14<10:21:07, 3.76s/it] {'loss': 0.1435, 'grad_norm': 1.3018141769592924, 'learning_rate': 2.040967934342194e-06, 'epoch': 0.71} 71%|███████ | 24356/34278 [26:50:14<10:21:07, 3.76s/it] 71%|███████ | 24357/34278 [26:50:18<10:22:03, 3.76s/it] {'loss': 0.1224, 'grad_norm': 0.8400628496943888, 'learning_rate': 2.040587127767212e-06, 'epoch': 0.71} 71%|███████ | 24357/34278 [26:50:18<10:22:03, 3.76s/it] 71%|███████ | 
24358/34278 [26:50:21<9:45:06, 3.54s/it] {'loss': 0.1306, 'grad_norm': 0.8008670181168767, 'learning_rate': 2.0402063476131593e-06, 'epoch': 0.71} 71%|███████ | 24358/34278 [26:50:21<9:45:06, 3.54s/it] 71%|███████ | 24359/34278 [26:50:24<9:14:19, 3.35s/it] {'loss': 0.1312, 'grad_norm': 0.8951122619476052, 'learning_rate': 2.03982559388344e-06, 'epoch': 0.71} 71%|███████ | 24359/34278 [26:50:24<9:14:19, 3.35s/it] 71%|███████ | 24360/34278 [26:50:26<8:51:05, 3.21s/it] {'loss': 0.1099, 'grad_norm': 1.170453836271604, 'learning_rate': 2.039444866581451e-06, 'epoch': 0.71} 71%|███████ | 24360/34278 [26:50:26<8:51:05, 3.21s/it] 71%|███████ | 24361/34278 [26:50:32<10:45:50, 3.91s/it] {'loss': 0.1231, 'grad_norm': 1.05398976968565, 'learning_rate': 2.03906416571059e-06, 'epoch': 0.71} 71%|███████ | 24361/34278 [26:50:32<10:45:50, 3.91s/it] 71%|███████ | 24362/34278 [26:50:35<10:20:38, 3.76s/it] {'loss': 0.1252, 'grad_norm': 4.280590207084984, 'learning_rate': 2.0386834912742566e-06, 'epoch': 0.71} 71%|███████ | 24362/34278 [26:50:35<10:20:38, 3.76s/it] 71%|███████ | 24363/34278 [26:50:38<9:28:53, 3.44s/it] {'loss': 0.1259, 'grad_norm': 0.8509599858629096, 'learning_rate': 2.0383028432758522e-06, 'epoch': 0.71} 71%|███████ | 24363/34278 [26:50:38<9:28:53, 3.44s/it] 71%|███████ | 24364/34278 [26:50:41<9:06:52, 3.31s/it] {'loss': 0.1291, 'grad_norm': 1.0586904553300105, 'learning_rate': 2.037922221718773e-06, 'epoch': 0.71} 71%|███████ | 24364/34278 [26:50:41<9:06:52, 3.31s/it] 71%|███████ | 24365/34278 [26:50:47<11:20:27, 4.12s/it] {'loss': 0.1159, 'grad_norm': 0.7578885581554284, 'learning_rate': 2.037541626606416e-06, 'epoch': 0.71} 71%|███████ | 24365/34278 [26:50:47<11:20:27, 4.12s/it] 71%|███████ | 24366/34278 [26:50:52<12:20:54, 4.48s/it] {'loss': 0.125, 'grad_norm': 0.9950357619484919, 'learning_rate': 2.037161057942179e-06, 'epoch': 0.71} 71%|███████ | 24366/34278 [26:50:52<12:20:54, 4.48s/it] 71%|███████ | 24367/34278 [26:50:56<11:21:02, 4.12s/it] {'loss': 0.1073, 
'grad_norm': 0.938517775451267, 'learning_rate': 2.036780515729463e-06, 'epoch': 0.71} 71%|███████ | 24367/34278 [26:50:56<11:21:02, 4.12s/it] 71%|███████ | 24368/34278 [26:50:59<10:29:08, 3.81s/it] {'loss': 0.128, 'grad_norm': 1.1184516395218789, 'learning_rate': 2.0363999999716618e-06, 'epoch': 0.71} 71%|███████ | 24368/34278 [26:50:59<10:29:08, 3.81s/it] 71%|███████ | 24369/34278 [26:51:02<9:59:37, 3.63s/it] {'loss': 0.1273, 'grad_norm': 0.9023110909870798, 'learning_rate': 2.036019510672175e-06, 'epoch': 0.71} 71%|███████ | 24369/34278 [26:51:02<9:59:37, 3.63s/it] 71%|███████ | 24370/34278 [26:51:06<10:28:05, 3.80s/it] {'loss': 0.1337, 'grad_norm': 1.4001399121659068, 'learning_rate': 2.035639047834399e-06, 'epoch': 0.71} 71%|███████ | 24370/34278 [26:51:06<10:28:05, 3.80s/it] 71%|███████ | 24371/34278 [26:51:10<10:09:15, 3.69s/it] {'loss': 0.1075, 'grad_norm': 2.0301668528336987, 'learning_rate': 2.035258611461728e-06, 'epoch': 0.71} 71%|███████ | 24371/34278 [26:51:10<10:09:15, 3.69s/it] 71%|███████ | 24372/34278 [26:51:14<10:29:24, 3.81s/it] {'loss': 0.1136, 'grad_norm': 1.1384594601914078, 'learning_rate': 2.03487820155756e-06, 'epoch': 0.71} 71%|███████ | 24372/34278 [26:51:14<10:29:24, 3.81s/it] 71%|███████ | 24373/34278 [26:51:17<10:07:54, 3.68s/it] {'loss': 0.1233, 'grad_norm': 1.107004423559062, 'learning_rate': 2.034497818125294e-06, 'epoch': 0.71} 71%|███████ | 24373/34278 [26:51:17<10:07:54, 3.68s/it] 71%|███████ | 24374/34278 [26:51:20<9:42:41, 3.53s/it] {'loss': 0.13, 'grad_norm': 1.0133925683935907, 'learning_rate': 2.0341174611683235e-06, 'epoch': 0.71} 71%|███████ | 24374/34278 [26:51:20<9:42:41, 3.53s/it] 71%|███████ | 24375/34278 [26:51:25<10:47:25, 3.92s/it] {'loss': 0.0905, 'grad_norm': 0.7104769885495591, 'learning_rate': 2.033737130690042e-06, 'epoch': 0.71} 71%|███████ | 24375/34278 [26:51:25<10:47:25, 3.92s/it] 71%|███████ | 24376/34278 [26:51:29<10:25:04, 3.79s/it] {'loss': 0.1442, 'grad_norm': 1.422098025882749, 'learning_rate': 
2.0333568266938498e-06, 'epoch': 0.71} 71%|███████ | 24376/34278 [26:51:29<10:25:04, 3.79s/it] 71%|███████ | 24377/34278 [26:51:32<10:07:42, 3.68s/it] {'loss': 0.116, 'grad_norm': 1.0058471267758462, 'learning_rate': 2.032976549183139e-06, 'epoch': 0.71} 71%|███████ | 24377/34278 [26:51:32<10:07:42, 3.68s/it] 71%|███████ | 24378/34278 [26:51:35<9:44:02, 3.54s/it] {'loss': 0.1098, 'grad_norm': 0.927592664479417, 'learning_rate': 2.0325962981613036e-06, 'epoch': 0.71} 71%|███████ | 24378/34278 [26:51:35<9:44:02, 3.54s/it] 71%|███████ | 24379/34278 [26:51:38<9:14:38, 3.36s/it] {'loss': 0.0929, 'grad_norm': 0.7931636033481047, 'learning_rate': 2.0322160736317404e-06, 'epoch': 0.71} 71%|███████ | 24379/34278 [26:51:38<9:14:38, 3.36s/it] 71%|███████ | 24380/34278 [26:51:41<9:01:16, 3.28s/it] {'loss': 0.1099, 'grad_norm': 0.9806693334105764, 'learning_rate': 2.031835875597845e-06, 'epoch': 0.71} 71%|███████ | 24380/34278 [26:51:41<9:01:16, 3.28s/it] 71%|███████ | 24381/34278 [26:51:44<8:50:32, 3.22s/it] {'loss': 0.1041, 'grad_norm': 1.0011884145644832, 'learning_rate': 2.0314557040630106e-06, 'epoch': 0.71} 71%|███████ | 24381/34278 [26:51:44<8:50:32, 3.22s/it] 71%|███████ | 24382/34278 [26:51:49<10:03:17, 3.66s/it] {'loss': 0.1219, 'grad_norm': 0.9600760848975869, 'learning_rate': 2.031075559030629e-06, 'epoch': 0.71} 71%|███████ | 24382/34278 [26:51:49<10:03:17, 3.66s/it] 71%|███████ | 24383/34278 [26:51:52<9:24:33, 3.42s/it] {'loss': 0.1084, 'grad_norm': 0.8918457468674876, 'learning_rate': 2.0306954405040984e-06, 'epoch': 0.71} 71%|███████ | 24383/34278 [26:51:52<9:24:33, 3.42s/it] 71%|███████ | 24384/34278 [26:51:55<9:24:15, 3.42s/it] {'loss': 0.1124, 'grad_norm': 0.9725798074427667, 'learning_rate': 2.0303153484868077e-06, 'epoch': 0.71} 71%|███████ | 24384/34278 [26:51:55<9:24:15, 3.42s/it] 71%|███████ | 24385/34278 [26:51:59<9:31:22, 3.47s/it] {'loss': 0.1243, 'grad_norm': 0.9163828993242817, 'learning_rate': 2.0299352829821535e-06, 'epoch': 0.71} 71%|███████ | 
24385/34278 [26:51:59<9:31:22, 3.47s/it] 71%|███████ | 24386/34278 [26:52:02<9:20:40, 3.40s/it] {'loss': 0.1184, 'grad_norm': 1.2597823766770704, 'learning_rate': 2.029555243993529e-06, 'epoch': 0.71} 71%|███████ | 24386/34278 [26:52:02<9:20:40, 3.40s/it] 71%|███████ | 24387/34278 [26:52:05<8:50:09, 3.22s/it] {'loss': 0.1266, 'grad_norm': 1.1218957050688485, 'learning_rate': 2.029175231524326e-06, 'epoch': 0.71} 71%|███████ | 24387/34278 [26:52:05<8:50:09, 3.22s/it] 71%|███████ | 24388/34278 [26:52:08<8:40:17, 3.16s/it] {'loss': 0.1238, 'grad_norm': 0.8709065580069354, 'learning_rate': 2.0287952455779365e-06, 'epoch': 0.71} 71%|███████ | 24388/34278 [26:52:08<8:40:17, 3.16s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7facc27c09a0>
Failed to fetch sample 2667590. Exception: cannot identify image file <_io.BytesIO object at 0x7facc27c09a0>
71%|███████ | 24389/34278 [26:52:11<8:26:10, 3.07s/it] {'loss': 0.0999, 'grad_norm': 0.8367507895732899, 'learning_rate': 2.028415286157755e-06, 'epoch': 0.71} 71%|███████ | 24389/34278 [26:52:11<8:26:10, 3.07s/it] 71%|███████ | 24390/34278 [26:52:15<9:26:44, 3.44s/it] {'loss': 0.1289, 'grad_norm': 0.9890103427532823, 'learning_rate': 2.0280353532671704e-06, 'epoch': 0.71} 71%|███████ | 24390/34278 [26:52:15<9:26:44, 3.44s/it] 71%|███████ | 24391/34278 [26:52:19<9:33:03, 3.48s/it] {'loss': 0.1257, 'grad_norm': 1.0479322528274433, 'learning_rate': 2.0276554469095787e-06, 'epoch': 0.71} 71%|███████ | 24391/34278 [26:52:19<9:33:03, 3.48s/it] 71%|███████ | 24392/34278 [26:52:21<8:50:44, 3.22s/it] {'loss': 0.0999, 'grad_norm': 0.9110123973017815, 'learning_rate': 2.027275567088368e-06, 'epoch': 0.71} 71%|███████ | 24392/34278 [26:52:21<8:50:44, 3.22s/it] 71%|███████ | 24393/34278 [26:52:24<8:45:07, 3.19s/it] {'loss': 0.1203, 'grad_norm': 0.8516766976858033, 'learning_rate': 2.0268957138069336e-06, 'epoch': 0.71} 71%|███████ | 24393/34278 [26:52:24<8:45:07, 3.19s/it] 71%|███████ | 24394/34278 [26:52:27<8:29:20, 3.09s/it] {'loss': 0.1271, 'grad_norm': 0.8787287052162154, 'learning_rate': 2.0265158870686636e-06, 'epoch': 0.71} 71%|███████ | 24394/34278 [26:52:27<8:29:20, 3.09s/it] 71%|███████ | 24395/34278 [26:52:30<8:26:08, 3.07s/it] {'loss': 0.1239, 'grad_norm': 1.320506184430548, 'learning_rate': 2.0261360868769487e-06, 'epoch': 0.71} 71%|███████ | 24395/34278 [26:52:30<8:26:08, 3.07s/it] 71%|███████ | 24396/34278 [26:52:34<9:13:12, 3.36s/it] {'loss': 0.1338, 'grad_norm': 0.9819746947924424, 'learning_rate': 2.0257563132351808e-06, 'epoch': 0.71} 71%|███████ | 24396/34278 [26:52:34<9:13:12, 3.36s/it] 71%|███████ | 24397/34278 [26:52:39<10:17:04, 3.75s/it] {'loss': 0.1228, 'grad_norm': 0.8328015881292921, 'learning_rate': 2.0253765661467523e-06,
'epoch': 0.71} 71%|███████ | 24397/34278 [26:52:39<10:17:04, 3.75s/it] 71%|███████ | 24398/34278 [26:52:45<12:30:48, 4.56s/it] {'loss': 0.1373, 'grad_norm': 0.9631663464689367, 'learning_rate': 2.0249968456150497e-06, 'epoch': 0.71} 71%|███████ | 24398/34278 [26:52:45<12:30:48, 4.56s/it] 71%|███████ | 24399/34278 [26:52:49<11:29:52, 4.19s/it] {'loss': 0.099, 'grad_norm': 1.0840115522703666, 'learning_rate': 2.024617151643467e-06, 'epoch': 0.71} 71%|███████ | 24399/34278 [26:52:49<11:29:52, 4.19s/it] 71%|███████ | 24400/34278 [26:52:52<10:30:32, 3.83s/it] {'loss': 0.0898, 'grad_norm': 0.7550602826241688, 'learning_rate': 2.024237484235392e-06, 'epoch': 0.71} 71%|███████ | 24400/34278 [26:52:52<10:30:32, 3.83s/it] 71%|███████ | 24401/34278 [26:52:56<11:13:51, 4.09s/it] {'loss': 0.1115, 'grad_norm': 0.8204516668833315, 'learning_rate': 2.023857843394213e-06, 'epoch': 0.71} 71%|███████ | 24401/34278 [26:52:57<11:13:51, 4.09s/it] 71%|███████ | 24402/34278 [26:53:00<10:31:58, 3.84s/it] {'loss': 0.1138, 'grad_norm': 0.9812200065073052, 'learning_rate': 2.0234782291233207e-06, 'epoch': 0.71} 71%|███████ | 24402/34278 [26:53:00<10:31:58, 3.84s/it] 71%|███████ | 24403/34278 [26:53:03<10:13:59, 3.73s/it] {'loss': 0.1037, 'grad_norm': 1.069486808913791, 'learning_rate': 2.0230986414261056e-06, 'epoch': 0.71} 71%|███████ | 24403/34278 [26:53:03<10:13:59, 3.73s/it] 71%|███████ | 24404/34278 [26:53:06<9:47:58, 3.57s/it] {'loss': 0.1047, 'grad_norm': 0.7081987164093422, 'learning_rate': 2.0227190803059554e-06, 'epoch': 0.71} 71%|███████ | 24404/34278 [26:53:06<9:47:58, 3.57s/it] 71%|███████ | 24405/34278 [26:53:13<11:57:19, 4.36s/it] {'loss': 0.1349, 'grad_norm': 0.6843058317568871, 'learning_rate': 2.0223395457662572e-06, 'epoch': 0.71} 71%|███████ | 24405/34278 [26:53:13<11:57:19, 4.36s/it] 71%|███████ | 24406/34278 [26:53:16<11:02:28, 4.03s/it] {'loss': 0.1478, 'grad_norm': 0.8905275066612361, 'learning_rate': 2.0219600378104014e-06, 'epoch': 0.71} 71%|███████ | 24406/34278 
[26:53:16<11:02:28, 4.03s/it] 71%|███████ | 24407/34278 [26:53:19<10:19:56, 3.77s/it] {'loss': 0.0965, 'grad_norm': 0.8933715644007169, 'learning_rate': 2.021580556441776e-06, 'epoch': 0.71} 71%|███████ | 24407/34278 [26:53:19<10:19:56, 3.77s/it] 71%|███████ | 24408/34278 [26:53:22<9:54:00, 3.61s/it] {'loss': 0.1127, 'grad_norm': 1.0223465465187809, 'learning_rate': 2.0212011016637667e-06, 'epoch': 0.71} 71%|███████ | 24408/34278 [26:53:22<9:54:00, 3.61s/it] 71%|███████ | 24409/34278 [26:53:28<11:35:04, 4.23s/it] {'loss': 0.1162, 'grad_norm': 0.7702065574061616, 'learning_rate': 2.0208216734797632e-06, 'epoch': 0.71} 71%|███████ | 24409/34278 [26:53:28<11:35:04, 4.23s/it] 71%|███████ | 24410/34278 [26:53:31<10:30:38, 3.83s/it] {'loss': 0.1361, 'grad_norm': 0.9735199034492721, 'learning_rate': 2.0204422718931538e-06, 'epoch': 0.71} 71%|███████ | 24410/34278 [26:53:31<10:30:38, 3.83s/it] 71%|███████ | 24411/34278 [26:53:34<10:12:15, 3.72s/it] {'loss': 0.1074, 'grad_norm': 0.8188490262918654, 'learning_rate': 2.0200628969073248e-06, 'epoch': 0.71} 71%|███████ | 24411/34278 [26:53:34<10:12:15, 3.72s/it] 71%|███████ | 24412/34278 [26:53:38<10:33:56, 3.86s/it] {'loss': 0.1255, 'grad_norm': 1.2201197741130494, 'learning_rate': 2.019683548525661e-06, 'epoch': 0.71} 71%|███████ | 24412/34278 [26:53:39<10:33:56, 3.86s/it] 71%|███████ | 24413/34278 [26:53:42<9:55:51, 3.62s/it] {'loss': 0.1107, 'grad_norm': 1.057590408778701, 'learning_rate': 2.0193042267515526e-06, 'epoch': 0.71} 71%|███████ | 24413/34278 [26:53:42<9:55:51, 3.62s/it] 71%|███████ | 24414/34278 [26:53:45<9:34:45, 3.50s/it] {'loss': 0.1212, 'grad_norm': 0.7202570838943055, 'learning_rate': 2.018924931588383e-06, 'epoch': 0.71} 71%|███████ | 24414/34278 [26:53:45<9:34:45, 3.50s/it] 71%|███████ | 24415/34278 [26:53:48<9:16:52, 3.39s/it] {'loss': 0.1191, 'grad_norm': 0.7679363626552997, 'learning_rate': 2.01854566303954e-06, 'epoch': 0.71} 71%|███████ | 24415/34278 [26:53:48<9:16:52, 3.39s/it] 71%|███████ | 
24416/34278 [26:53:51<9:10:16, 3.35s/it] {'loss': 0.1163, 'grad_norm': 1.1375598628888894, 'learning_rate': 2.0181664211084114e-06, 'epoch': 0.71} 71%|███████ | 24416/34278 [26:53:51<9:10:16, 3.35s/it] 71%|███████ | 24417/34278 [26:53:54<9:10:37, 3.35s/it] {'loss': 0.1203, 'grad_norm': 0.903098200513465, 'learning_rate': 2.017787205798381e-06, 'epoch': 0.71} 71%|███████ | 24417/34278 [26:53:55<9:10:37, 3.35s/it] 71%|███████ | 24418/34278 [26:53:58<8:59:22, 3.28s/it] {'loss': 0.1228, 'grad_norm': 0.7710524434318854, 'learning_rate': 2.017408017112833e-06, 'epoch': 0.71} 71%|███████ | 24418/34278 [26:53:58<8:59:22, 3.28s/it] 71%|███████ | 24419/34278 [26:54:01<8:52:15, 3.24s/it] {'loss': 0.1241, 'grad_norm': 0.8126255961949865, 'learning_rate': 2.017028855055156e-06, 'epoch': 0.71} 71%|███████ | 24419/34278 [26:54:01<8:52:15, 3.24s/it] 71%|███████ | 24420/34278 [26:54:04<8:32:26, 3.12s/it] {'loss': 0.1165, 'grad_norm': 0.8710504505193271, 'learning_rate': 2.016649719628731e-06, 'epoch': 0.71} 71%|███████ | 24420/34278 [26:54:04<8:32:26, 3.12s/it] 71%|███████ | 24421/34278 [26:54:08<9:37:04, 3.51s/it] {'loss': 0.1196, 'grad_norm': 0.7745623945296772, 'learning_rate': 2.0162706108369473e-06, 'epoch': 0.71} 71%|███████ | 24421/34278 [26:54:08<9:37:04, 3.51s/it] 71%|███████ | 24422/34278 [26:54:11<9:08:19, 3.34s/it] {'loss': 0.1055, 'grad_norm': 0.9979151301807482, 'learning_rate': 2.0158915286831852e-06, 'epoch': 0.71} 71%|███████ | 24422/34278 [26:54:11<9:08:19, 3.34s/it] 71%|███████ | 24423/34278 [26:54:14<8:39:28, 3.16s/it] {'loss': 0.1141, 'grad_norm': 0.7711457160494226, 'learning_rate': 2.0155124731708337e-06, 'epoch': 0.71} 71%|███████ | 24423/34278 [26:54:14<8:39:28, 3.16s/it] 71%|███████▏ | 24424/34278 [26:54:17<8:25:00, 3.07s/it] {'loss': 0.1099, 'grad_norm': 0.771775854309958, 'learning_rate': 2.015133444303274e-06, 'epoch': 0.71} 71%|███████▏ | 24424/34278 [26:54:17<8:25:00, 3.07s/it] 71%|███████▏ | 24425/34278 [26:54:23<10:48:17, 3.95s/it] {'loss': 0.1063, 
'grad_norm': 0.9915515612916951, 'learning_rate': 2.0147544420838883e-06, 'epoch': 0.71} 71%|███████▏ | 24425/34278 [26:54:23<10:48:17, 3.95s/it] 71%|███████▏ | 24426/34278 [26:54:29<12:32:03, 4.58s/it] {'loss': 0.1266, 'grad_norm': 0.7671812323739688, 'learning_rate': 2.014375466516062e-06, 'epoch': 0.71} 71%|███████▏ | 24426/34278 [26:54:29<12:32:03, 4.58s/it] 71%|███████▏ | 24427/34278 [26:54:32<11:29:54, 4.20s/it] {'loss': 0.1341, 'grad_norm': 0.9770055202806369, 'learning_rate': 2.013996517603181e-06, 'epoch': 0.71} 71%|███████▏ | 24427/34278 [26:54:32<11:29:54, 4.20s/it] 71%|███████▏ | 24428/34278 [26:54:37<12:27:26, 4.55s/it] {'loss': 0.1451, 'grad_norm': 0.8835667304460261, 'learning_rate': 2.013617595348625e-06, 'epoch': 0.71} 71%|███████▏ | 24428/34278 [26:54:37<12:27:26, 4.55s/it] 71%|███████▏ | 24429/34278 [26:54:41<12:01:28, 4.40s/it] {'loss': 0.1238, 'grad_norm': 0.7354664119934039, 'learning_rate': 2.0132386997557795e-06, 'epoch': 0.71} 71%|███████▏ | 24429/34278 [26:54:41<12:01:28, 4.40s/it] 71%|███████▏ | 24430/34278 [26:54:45<11:21:10, 4.15s/it] {'loss': 0.1088, 'grad_norm': 0.7179745594529715, 'learning_rate': 2.0128598308280255e-06, 'epoch': 0.71} 71%|███████▏ | 24430/34278 [26:54:45<11:21:10, 4.15s/it] 71%|███████▏ | 24431/34278 [26:54:48<10:30:42, 3.84s/it] {'loss': 0.1171, 'grad_norm': 0.866216764674149, 'learning_rate': 2.0124809885687448e-06, 'epoch': 0.71} 71%|███████▏ | 24431/34278 [26:54:48<10:30:42, 3.84s/it] 71%|███████▏ | 24432/34278 [26:54:54<12:15:41, 4.48s/it] {'loss': 0.133, 'grad_norm': 0.8510728509096759, 'learning_rate': 2.0121021729813207e-06, 'epoch': 0.71} 71%|███████▏ | 24432/34278 [26:54:54<12:15:41, 4.48s/it] 71%|███████▏ | 24433/34278 [26:54:59<12:17:35, 4.50s/it] {'loss': 0.1385, 'grad_norm': 0.7179885008171474, 'learning_rate': 2.0117233840691364e-06, 'epoch': 0.71} 71%|███████▏ | 24433/34278 [26:54:59<12:17:35, 4.50s/it] 71%|███████▏ | 24434/34278 [26:55:02<11:16:04, 4.12s/it] {'loss': 0.1302, 'grad_norm': 
0.7350504815158918, 'learning_rate': 2.0113446218355727e-06, 'epoch': 0.71} 71%|███████▏ | 24434/34278 [26:55:02<11:16:04, 4.12s/it] 71%|███████▏ | 24435/34278 [26:55:05<10:53:13, 3.98s/it] {'loss': 0.1232, 'grad_norm': 1.4719421673535458, 'learning_rate': 2.0109658862840085e-06, 'epoch': 0.71} 71%|███████▏ | 24435/34278 [26:55:05<10:53:13, 3.98s/it] 71%|███████▏ | 24436/34278 [26:55:10<11:33:08, 4.23s/it] {'loss': 0.1176, 'grad_norm': 0.8215421131912853, 'learning_rate': 2.0105871774178293e-06, 'epoch': 0.71} 71%|███████▏ | 24436/34278 [26:55:10<11:33:08, 4.23s/it] 71%|███████▏ | 24437/34278 [26:55:14<10:49:19, 3.96s/it] {'loss': 0.1281, 'grad_norm': 0.8400742032577411, 'learning_rate': 2.0102084952404145e-06, 'epoch': 0.71} 71%|███████▏ | 24437/34278 [26:55:14<10:49:19, 3.96s/it] 71%|███████▏ | 24438/34278 [26:55:17<10:20:07, 3.78s/it] {'loss': 0.1291, 'grad_norm': 0.9365317718177184, 'learning_rate': 2.0098298397551423e-06, 'epoch': 0.71} 71%|███████▏ | 24438/34278 [26:55:17<10:20:07, 3.78s/it] 71%|███████▏ | 24439/34278 [26:55:20<9:47:40, 3.58s/it] {'loss': 0.1121, 'grad_norm': 0.964290388631854, 'learning_rate': 2.009451210965396e-06, 'epoch': 0.71} 71%|███████▏ | 24439/34278 [26:55:20<9:47:40, 3.58s/it] 71%|███████▏ | 24440/34278 [26:55:25<10:47:09, 3.95s/it] {'loss': 0.1408, 'grad_norm': 1.1698157614872702, 'learning_rate': 2.0090726088745566e-06, 'epoch': 0.71} 71%|███████▏ | 24440/34278 [26:55:25<10:47:09, 3.95s/it] 71%|███████▏ | 24441/34278 [26:55:29<10:50:10, 3.97s/it] {'loss': 0.1213, 'grad_norm': 0.8204052766706701, 'learning_rate': 2.008694033486003e-06, 'epoch': 0.71} 71%|███████▏ | 24441/34278 [26:55:29<10:50:10, 3.97s/it] 71%|███████▏ | 24442/34278 [26:55:32<10:27:33, 3.83s/it] {'loss': 0.1138, 'grad_norm': 0.7812560378061456, 'learning_rate': 2.008315484803114e-06, 'epoch': 0.71} 71%|███████▏ | 24442/34278 [26:55:32<10:27:33, 3.83s/it] 71%|███████▏ | 24443/34278 [26:55:38<12:19:06, 4.51s/it] {'loss': 0.1454, 'grad_norm': 1.079505207040131, 
'learning_rate': 2.007936962829271e-06, 'epoch': 0.71} 71%|███████▏ | 24443/34278 [26:55:38<12:19:06, 4.51s/it] 71%|███████▏ | 24444/34278 [26:55:44<13:31:55, 4.95s/it] {'loss': 0.1129, 'grad_norm': 0.8291730172821995, 'learning_rate': 2.0075584675678516e-06, 'epoch': 0.71} 71%|███████▏ | 24444/34278 [26:55:44<13:31:55, 4.95s/it] 71%|███████▏ | 24445/34278 [26:55:47<11:45:04, 4.30s/it] {'loss': 0.1177, 'grad_norm': 0.7556905008613477, 'learning_rate': 2.007179999022235e-06, 'epoch': 0.71} 71%|███████▏ | 24445/34278 [26:55:47<11:45:04, 4.30s/it] 71%|███████▏ | 24446/34278 [26:55:51<11:29:22, 4.21s/it] {'loss': 0.1074, 'grad_norm': 1.0318833643097356, 'learning_rate': 2.006801557195803e-06, 'epoch': 0.71} 71%|███████▏ | 24446/34278 [26:55:51<11:29:22, 4.21s/it] 71%|███████▏ | 24447/34278 [26:55:54<10:38:20, 3.90s/it] {'loss': 0.1127, 'grad_norm': 0.9405996013697249, 'learning_rate': 2.006423142091933e-06, 'epoch': 0.71} 71%|███████▏ | 24447/34278 [26:55:54<10:38:20, 3.90s/it] 71%|███████▏ | 24448/34278 [26:55:57<9:57:17, 3.65s/it] {'loss': 0.1317, 'grad_norm': 0.9335895698959623, 'learning_rate': 2.006044753714e-06, 'epoch': 0.71} 71%|███████▏ | 24448/34278 [26:55:57<9:57:17, 3.65s/it] 71%|███████▏ | 24449/34278 [26:56:01<9:29:42, 3.48s/it] {'loss': 0.1236, 'grad_norm': 1.1095474684342954, 'learning_rate': 2.0056663920653865e-06, 'epoch': 0.71} 71%|███████▏ | 24449/34278 [26:56:01<9:29:42, 3.48s/it] 71%|███████▏ | 24450/34278 [26:56:05<10:08:20, 3.71s/it] {'loss': 0.1319, 'grad_norm': 0.7760666738870904, 'learning_rate': 2.0052880571494665e-06, 'epoch': 0.71} 71%|███████▏ | 24450/34278 [26:56:05<10:08:20, 3.71s/it] 71%|███████▏ | 24451/34278 [26:56:08<9:56:16, 3.64s/it] {'loss': 0.129, 'grad_norm': 0.7959525494247813, 'learning_rate': 2.004909748969622e-06, 'epoch': 0.71} 71%|███████▏ | 24451/34278 [26:56:08<9:56:16, 3.64s/it] 71%|███████▏ | 24452/34278 [26:56:11<9:22:19, 3.43s/it] {'loss': 0.119, 'grad_norm': 1.0227122749939923, 'learning_rate': 
2.0045314675292265e-06, 'epoch': 0.71} 71%|███████▏ | 24452/34278 [26:56:11<9:22:19, 3.43s/it] 71%|███████▏ | 24453/34278 [26:56:16<10:41:51, 3.92s/it] {'loss': 0.1148, 'grad_norm': 0.8146023359839762, 'learning_rate': 2.004153212831661e-06, 'epoch': 0.71} 71%|███████▏ | 24453/34278 [26:56:16<10:41:51, 3.92s/it] 71%|███████▏ | 24454/34278 [26:56:20<10:09:31, 3.72s/it] {'loss': 0.1268, 'grad_norm': 1.1375692905926735, 'learning_rate': 2.0037749848803002e-06, 'epoch': 0.71} 71%|███████▏ | 24454/34278 [26:56:20<10:09:31, 3.72s/it] 71%|███████▏ | 24455/34278 [26:56:25<11:44:35, 4.30s/it] {'loss': 0.1198, 'grad_norm': 0.8212560295520828, 'learning_rate': 2.0033967836785196e-06, 'epoch': 0.71} 71%|███████▏ | 24455/34278 [26:56:25<11:44:35, 4.30s/it] 71%|███████▏ | 24456/34278 [26:56:28<10:45:12, 3.94s/it] {'loss': 0.1333, 'grad_norm': 0.9104696570081723, 'learning_rate': 2.0030186092296965e-06, 'epoch': 0.71} 71%|███████▏ | 24456/34278 [26:56:28<10:45:12, 3.94s/it] 71%|███████▏ | 24457/34278 [26:56:31<9:58:17, 3.66s/it] {'loss': 0.1143, 'grad_norm': 0.9347454139472186, 'learning_rate': 2.00264046153721e-06, 'epoch': 0.71} 71%|███████▏ | 24457/34278 [26:56:31<9:58:17, 3.66s/it] 71%|███████▏ | 24458/34278 [26:56:37<11:55:48, 4.37s/it] {'loss': 0.1094, 'grad_norm': 0.6949505851193986, 'learning_rate': 2.002262340604432e-06, 'epoch': 0.71} 71%|███████▏ | 24458/34278 [26:56:37<11:55:48, 4.37s/it] 71%|███████▏ | 24459/34278 [26:56:40<10:43:26, 3.93s/it] {'loss': 0.1431, 'grad_norm': 0.7705963383300264, 'learning_rate': 2.0018842464347427e-06, 'epoch': 0.71} 71%|███████▏ | 24459/34278 [26:56:40<10:43:26, 3.93s/it] 71%|███████▏ | 24460/34278 [26:56:45<11:32:01, 4.23s/it] {'loss': 0.1031, 'grad_norm': 0.9104706678332688, 'learning_rate': 2.001506179031514e-06, 'epoch': 0.71} 71%|███████▏ | 24460/34278 [26:56:45<11:32:01, 4.23s/it] 71%|███████▏ | 24461/34278 [26:56:48<10:33:26, 3.87s/it] {'loss': 0.116, 'grad_norm': 0.7063606372601169, 'learning_rate': 2.001128138398121e-06, 
'epoch': 0.71} 71%|███████▏ | 24461/34278 [26:56:48<10:33:26, 3.87s/it] 71%|███████▏ | 24462/34278 [26:56:51<9:42:02, 3.56s/it] {'loss': 0.1087, 'grad_norm': 0.7953551910751111, 'learning_rate': 2.0007501245379408e-06, 'epoch': 0.71} 71%|███████▏ | 24462/34278 [26:56:51<9:42:02, 3.56s/it] 71%|███████▏ | 24463/34278 [26:56:57<11:22:32, 4.17s/it] {'loss': 0.138, 'grad_norm': 1.077359935229099, 'learning_rate': 2.000372137454349e-06, 'epoch': 0.71} 71%|███████▏ | 24463/34278 [26:56:57<11:22:32, 4.17s/it] 71%|███████▏ | 24464/34278 [26:56:59<10:14:48, 3.76s/it] {'loss': 0.1165, 'grad_norm': 0.9097983110699532, 'learning_rate': 1.999994177150718e-06, 'epoch': 0.71} 71%|███████▏ | 24464/34278 [26:56:59<10:14:48, 3.76s/it] 71%|███████▏ | 24465/34278 [26:57:03<9:50:26, 3.61s/it] {'loss': 0.1092, 'grad_norm': 0.7102176656054635, 'learning_rate': 1.9996162436304217e-06, 'epoch': 0.71} 71%|███████▏ | 24465/34278 [26:57:03<9:50:26, 3.61s/it] 71%|███████▏ | 24466/34278 [26:57:06<9:16:37, 3.40s/it] {'loss': 0.1377, 'grad_norm': 0.8545017317221432, 'learning_rate': 1.9992383368968364e-06, 'epoch': 0.71} 71%|███████▏ | 24466/34278 [26:57:06<9:16:37, 3.40s/it] 71%|███████▏ | 24467/34278 [26:57:10<10:12:39, 3.75s/it] {'loss': 0.1191, 'grad_norm': 0.9044676141278017, 'learning_rate': 1.9988604569533353e-06, 'epoch': 0.71} 71%|███████▏ | 24467/34278 [26:57:10<10:12:39, 3.75s/it] 71%|███████▏ | 24468/34278 [26:57:13<9:51:28, 3.62s/it] {'loss': 0.121, 'grad_norm': 0.7449057819921429, 'learning_rate': 1.99848260380329e-06, 'epoch': 0.71} 71%|███████▏ | 24468/34278 [26:57:14<9:51:28, 3.62s/it] 71%|███████▏ | 24469/34278 [26:57:20<11:53:20, 4.36s/it] {'loss': 0.1368, 'grad_norm': 0.8174339871375383, 'learning_rate': 1.9981047774500755e-06, 'epoch': 0.71} 71%|███████▏ | 24469/34278 [26:57:20<11:53:20, 4.36s/it] 71%|███████▏ | 24470/34278 [26:57:23<10:53:39, 4.00s/it] {'loss': 0.1111, 'grad_norm': 0.8317875323554248, 'learning_rate': 1.9977269778970666e-06, 'epoch': 0.71} 71%|███████▏ | 
24470/34278 [26:57:23<10:53:39, 4.00s/it] 71%|███████▏ | 24471/34278 [26:57:25<9:49:33, 3.61s/it] {'loss': 0.0925, 'grad_norm': 0.864772193549524, 'learning_rate': 1.9973492051476345e-06, 'epoch': 0.71} 71%|███████▏ | 24471/34278 [26:57:25<9:49:33, 3.61s/it] 71%|███████▏ | 24472/34278 [26:57:28<9:19:23, 3.42s/it] {'loss': 0.1008, 'grad_norm': 0.719659647315192, 'learning_rate': 1.9969714592051506e-06, 'epoch': 0.71} 71%|███████▏ | 24472/34278 [26:57:28<9:19:23, 3.42s/it] 71%|███████▏ | 24473/34278 [26:57:32<9:06:57, 3.35s/it] {'loss': 0.1215, 'grad_norm': 0.884086106888888, 'learning_rate': 1.9965937400729895e-06, 'epoch': 0.71} 71%|███████▏ | 24473/34278 [26:57:32<9:06:57, 3.35s/it] 71%|███████▏ | 24474/34278 [26:57:36<9:43:49, 3.57s/it] {'loss': 0.1207, 'grad_norm': 1.1009363270742796, 'learning_rate': 1.996216047754521e-06, 'epoch': 0.71} 71%|███████▏ | 24474/34278 [26:57:36<9:43:49, 3.57s/it] 71%|███████▏ | 24475/34278 [26:57:39<9:21:51, 3.44s/it] {'loss': 0.0985, 'grad_norm': 0.8531369406368245, 'learning_rate': 1.995838382253119e-06, 'epoch': 0.71} 71%|███████▏ | 24475/34278 [26:57:39<9:21:51, 3.44s/it] 71%|███████▏ | 24476/34278 [26:57:42<9:12:46, 3.38s/it] {'loss': 0.1192, 'grad_norm': 0.7103027521649443, 'learning_rate': 1.995460743572156e-06, 'epoch': 0.71} 71%|███████▏ | 24476/34278 [26:57:42<9:12:46, 3.38s/it] 71%|███████▏ | 24477/34278 [26:57:47<10:52:04, 3.99s/it] {'loss': 0.1364, 'grad_norm': 1.0738885162982494, 'learning_rate': 1.995083131715003e-06, 'epoch': 0.71} 71%|███████▏ | 24477/34278 [26:57:48<10:52:04, 3.99s/it] 71%|███████▏ | 24478/34278 [26:57:50<9:43:36, 3.57s/it] {'loss': 0.1333, 'grad_norm': 0.9695200049495182, 'learning_rate': 1.9947055466850283e-06, 'epoch': 0.71} 71%|███████▏ | 24478/34278 [26:57:50<9:43:36, 3.57s/it] 71%|███████▏ | 24479/34278 [26:57:53<9:34:57, 3.52s/it] {'loss': 0.1164, 'grad_norm': 1.0123868722289286, 'learning_rate': 1.9943279884856065e-06, 'epoch': 0.71} 71%|███████▏ | 24479/34278 [26:57:53<9:34:57, 3.52s/it] 
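[editor's note] The UnidentifiedImageError traceback earlier in this log is caught by the dataset, which prints "Failed to fetch sample 2667590. Exception: …" and lets training continue. A common way to implement that recovery is to decode eagerly in the fetch path and retry a neighboring index on failure. The sketch below is a minimal, hypothetical illustration of that pattern — the class name, record layout, and fallback policy are assumptions, not the actual `aguvis/dataset.py` code (which in a real setup would subclass `torch.utils.data.Dataset`):

```python
from io import BytesIO

from PIL import Image, UnidentifiedImageError


class RobustImageDataset:
    """Skips samples whose image bytes cannot be decoded (hypothetical sketch)."""

    def __init__(self, records, max_retries=10):
        self.records = records  # each record: {"id": ..., "image_bytes": ...}
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def _get_item(self, i):
        rec = self.records[i]
        # Image.open raises UnidentifiedImageError on corrupt or truncated
        # bytes -- the exact failure shown in the traceback above.
        img = Image.open(BytesIO(rec["image_bytes"]))
        img.load()  # force the full decode here, not lazily inside the training step
        return {"id": rec["id"], "image": img}

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except UnidentifiedImageError as e:
                print(f"Failed to fetch sample {self.records[i]['id']}. Exception: {e}")
                i = (i + 1) % len(self.records)  # deterministic fallback: try the next sample
        raise RuntimeError("too many consecutive undecodable samples")
```

The eager `img.load()` matters: PIL decodes lazily by default, so without it a corrupt file can pass `Image.open` and only blow up later, after the retry logic can no longer catch it.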
71%|███████▏ | 24480/34278 [26:57:56<8:59:16, 3.30s/it] {'loss': 0.1268, 'grad_norm': 1.0769255112161285, 'learning_rate': 1.9939504571201055e-06, 'epoch': 0.71} 71%|███████▏ | 24480/34278 [26:57:56<8:59:16, 3.30s/it] 71%|███████▏ | 24481/34278 [26:57:59<8:50:03, 3.25s/it] {'loss': 0.1328, 'grad_norm': 0.9194611986223226, 'learning_rate': 1.993572952591899e-06, 'epoch': 0.71} 71%|███████▏ | 24481/34278 [26:57:59<8:50:03, 3.25s/it] 71%|███████▏ | 24482/34278 [26:58:02<8:37:04, 3.17s/it] {'loss': 0.1041, 'grad_norm': 0.8962777665479411, 'learning_rate': 1.9931954749043535e-06, 'epoch': 0.71} 71%|███████▏ | 24482/34278 [26:58:02<8:37:04, 3.17s/it] 71%|███████▏ | 24483/34278 [26:58:06<8:52:53, 3.26s/it] {'loss': 0.1178, 'grad_norm': 0.7625310960114333, 'learning_rate': 1.992818024060843e-06, 'epoch': 0.71} 71%|███████▏ | 24483/34278 [26:58:06<8:52:53, 3.26s/it] 71%|███████▏ | 24484/34278 [26:58:09<8:47:46, 3.23s/it] {'loss': 0.125, 'grad_norm': 1.265964283956046, 'learning_rate': 1.9924406000647354e-06, 'epoch': 0.71} 71%|███████▏ | 24484/34278 [26:58:09<8:47:46, 3.23s/it] 71%|███████▏ | 24485/34278 [26:58:14<9:54:51, 3.64s/it] {'loss': 0.1163, 'grad_norm': 1.0880981977655597, 'learning_rate': 1.992063202919398e-06, 'epoch': 0.71} 71%|███████▏ | 24485/34278 [26:58:14<9:54:51, 3.64s/it] 71%|███████▏ | 24486/34278 [26:58:20<11:46:02, 4.33s/it] {'loss': 0.1187, 'grad_norm': 0.8285814688788601, 'learning_rate': 1.991685832628202e-06, 'epoch': 0.71} 71%|███████▏ | 24486/34278 [26:58:20<11:46:02, 4.33s/it] 71%|███████▏ | 24487/34278 [26:58:23<10:52:37, 4.00s/it] {'loss': 0.1306, 'grad_norm': 0.9923470221096321, 'learning_rate': 1.9913084891945195e-06, 'epoch': 0.71} 71%|███████▏ | 24487/34278 [26:58:23<10:52:37, 4.00s/it] 71%|███████▏ | 24488/34278 [26:58:27<10:53:15, 4.00s/it] {'loss': 0.1234, 'grad_norm': 0.997270190992129, 'learning_rate': 1.9909311726217144e-06, 'epoch': 0.71} 71%|███████▏ | 24488/34278 [26:58:27<10:53:15, 4.00s/it] 71%|███████▏ | 24489/34278 
[26:58:32<12:15:35, 4.51s/it] {'loss': 0.1136, 'grad_norm': 0.8188236531698319, 'learning_rate': 1.9905538829131594e-06, 'epoch': 0.71} 71%|███████▏ | 24489/34278 [26:58:33<12:15:35, 4.51s/it] 71%|███████▏ | 24490/34278 [26:58:37<11:52:46, 4.37s/it] {'loss': 0.1076, 'grad_norm': 0.6162367141874275, 'learning_rate': 1.9901766200722205e-06, 'epoch': 0.71} 71%|███████▏ | 24490/34278 [26:58:37<11:52:46, 4.37s/it] 71%|███████▏ | 24491/34278 [26:58:40<11:08:45, 4.10s/it] {'loss': 0.1173, 'grad_norm': 1.061775721054025, 'learning_rate': 1.9897993841022643e-06, 'epoch': 0.71} 71%|███████▏ | 24491/34278 [26:58:40<11:08:45, 4.10s/it] 71%|███████▏ | 24492/34278 [26:58:43<10:19:52, 3.80s/it] {'loss': 0.1603, 'grad_norm': 0.8772880924671139, 'learning_rate': 1.989422175006661e-06, 'epoch': 0.71} 71%|███████▏ | 24492/34278 [26:58:43<10:19:52, 3.80s/it] 71%|███████▏ | 24493/34278 [26:58:46<9:56:40, 3.66s/it] {'loss': 0.103, 'grad_norm': 0.9203970716723464, 'learning_rate': 1.9890449927887796e-06, 'epoch': 0.71} 71%|███████▏ | 24493/34278 [26:58:46<9:56:40, 3.66s/it] 71%|███████▏ | 24494/34278 [26:58:50<9:45:36, 3.59s/it] {'loss': 0.1119, 'grad_norm': 0.798478354769862, 'learning_rate': 1.988667837451986e-06, 'epoch': 0.71} 71%|███████▏ | 24494/34278 [26:58:50<9:45:36, 3.59s/it] 71%|███████▏ | 24495/34278 [26:58:53<9:21:33, 3.44s/it] {'loss': 0.1174, 'grad_norm': 0.9197447757365986, 'learning_rate': 1.9882907089996453e-06, 'epoch': 0.71} 71%|███████▏ | 24495/34278 [26:58:53<9:21:33, 3.44s/it] 71%|███████▏ | 24496/34278 [26:58:57<9:28:19, 3.49s/it] {'loss': 0.1149, 'grad_norm': 0.6913238297298403, 'learning_rate': 1.9879136074351276e-06, 'epoch': 0.71} 71%|███████▏ | 24496/34278 [26:58:57<9:28:19, 3.49s/it] 71%|███████▏ | 24497/34278 [26:58:59<9:00:07, 3.31s/it] {'loss': 0.126, 'grad_norm': 0.762813617857346, 'learning_rate': 1.987536532761798e-06, 'epoch': 0.71} 71%|███████▏ | 24497/34278 [26:58:59<9:00:07, 3.31s/it] 71%|███████▏ | 24498/34278 [26:59:05<11:07:38, 4.10s/it] 
{'loss': 0.104, 'grad_norm': 0.6810020343943598, 'learning_rate': 1.9871594849830213e-06, 'epoch': 0.71} 71%|███████▏ | 24498/34278 [26:59:05<11:07:38, 4.10s/it] 71%|███████▏ | 24499/34278 [26:59:08<10:07:13, 3.73s/it] {'loss': 0.1173, 'grad_norm': 0.6555628971500574, 'learning_rate': 1.986782464102166e-06, 'epoch': 0.71} 71%|███████▏ | 24499/34278 [26:59:08<10:07:13, 3.73s/it] 71%|███████▏ | 24500/34278 [26:59:11<9:28:11, 3.49s/it] {'loss': 0.1133, 'grad_norm': 1.1312181162398898, 'learning_rate': 1.9864054701225986e-06, 'epoch': 0.71} 71%|███████▏ | 24500/34278 [26:59:11<9:28:11, 3.49s/it] 71%|███████▏ | 24501/34278 [26:59:14<9:17:39, 3.42s/it] {'loss': 0.1326, 'grad_norm': 0.8451512766943347, 'learning_rate': 1.9860285030476844e-06, 'epoch': 0.71} 71%|███████▏ | 24501/34278 [26:59:14<9:17:39, 3.42s/it] 71%|███████▏ | 24502/34278 [26:59:18<9:11:54, 3.39s/it] {'loss': 0.1262, 'grad_norm': 0.9255673831474458, 'learning_rate': 1.9856515628807865e-06, 'epoch': 0.71} 71%|███████▏ | 24502/34278 [26:59:18<9:11:54, 3.39s/it] 71%|███████▏ | 24503/34278 [26:59:21<9:05:23, 3.35s/it] {'loss': 0.1007, 'grad_norm': 0.6826310324274596, 'learning_rate': 1.9852746496252735e-06, 'epoch': 0.71} 71%|███████▏ | 24503/34278 [26:59:21<9:05:23, 3.35s/it] 71%|███████▏ | 24504/34278 [26:59:27<11:20:48, 4.18s/it] {'loss': 0.1248, 'grad_norm': 0.7154138159018919, 'learning_rate': 1.984897763284507e-06, 'epoch': 0.71} 71%|███████▏ | 24504/34278 [26:59:27<11:20:48, 4.18s/it] 71%|███████▏ | 24505/34278 [26:59:30<10:04:59, 3.71s/it] {'loss': 0.1311, 'grad_norm': 0.856043848945111, 'learning_rate': 1.984520903861853e-06, 'epoch': 0.71} 71%|███████▏ | 24505/34278 [26:59:30<10:04:59, 3.71s/it] 71%|███████▏ | 24506/34278 [26:59:33<9:35:21, 3.53s/it] {'loss': 0.1547, 'grad_norm': 0.9118751512671261, 'learning_rate': 1.984144071360679e-06, 'epoch': 0.71} 71%|███████▏ | 24506/34278 [26:59:33<9:35:21, 3.53s/it] 71%|███████▏ | 24507/34278 [26:59:36<9:22:27, 3.45s/it] {'loss': 0.1119, 'grad_norm': 
0.8120519707589329, 'learning_rate': 1.9837672657843467e-06, 'epoch': 0.71} 71%|███████▏ | 24507/34278 [26:59:36<9:22:27, 3.45s/it] 71%|███████▏ | 24508/34278 [26:59:42<11:36:31, 4.28s/it] {'loss': 0.1214, 'grad_norm': 0.9660033861049498, 'learning_rate': 1.983390487136218e-06, 'epoch': 0.71} 71%|███████▏ | 24508/34278 [26:59:42<11:36:31, 4.28s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
72%|███████▏ | 24509/34278 [26:59:46<10:42:46, 3.95s/it] {'loss': 0.1246, 'grad_norm': 0.7301469264130491, 'learning_rate': 1.983013735419661e-06, 'epoch': 0.72} 72%|███████▏ | 24509/34278 [26:59:46<10:42:46, 3.95s/it] 72%|███████▏ | 24510/34278 [26:59:49<10:03:55, 3.71s/it] {'loss': 0.1148, 'grad_norm': 0.9168246822075329, 'learning_rate': 1.982637010638035e-06, 'epoch': 0.72} 72%|███████▏ | 24510/34278 [26:59:49<10:03:55, 3.71s/it] 72%|███████▏ | 24511/34278 [26:59:52<9:46:53, 3.61s/it] {'loss': 0.1095, 'grad_norm': 0.802518765472926, 'learning_rate': 1.9822603127947076e-06, 'epoch': 0.72} 72%|███████▏ | 24511/34278 [26:59:52<9:46:53, 3.61s/it] 72%|███████▏ | 24512/34278 [26:59:55<9:17:28, 3.43s/it] {'loss': 0.1368, 'grad_norm': 1.0253697786433456, 'learning_rate': 1.981883641893038e-06, 'epoch': 0.72} 72%|███████▏ | 24512/34278 [26:59:55<9:17:28, 3.43s/it] 72%|███████▏ | 24513/34278 [26:59:58<9:05:18, 3.35s/it] {'loss': 0.1242, 'grad_norm': 1.0496533173020024, 'learning_rate': 1.9815069979363927e-06, 'epoch': 0.72} 72%|███████▏ | 24513/34278 [26:59:58<9:05:18, 3.35s/it] 72%|███████▏ | 24514/34278 [27:00:02<9:22:23, 3.46s/it] {'loss': 0.1204, 'grad_norm': 0.9436918956143182, 'learning_rate': 1.9811303809281318e-06, 'epoch': 0.72} 72%|███████▏ | 24514/34278 [27:00:02<9:22:23, 3.46s/it] 72%|███████▏ | 24515/34278 [27:00:07<10:50:02, 3.99s/it] {'loss': 0.1138, 'grad_norm': 0.9034584311781312,
'learning_rate': 1.980753790871617e-06, 'epoch': 0.72} 72%|███████▏ | 24515/34278 [27:00:07<10:50:02, 3.99s/it] 72%|███████▏ | 24516/34278 [27:00:11<10:21:31, 3.82s/it] {'loss': 0.1262, 'grad_norm': 0.9629977658340991, 'learning_rate': 1.980377227770211e-06, 'epoch': 0.72} 72%|███████▏ | 24516/34278 [27:00:11<10:21:31, 3.82s/it] 72%|███████▏ | 24517/34278 [27:00:13<9:36:40, 3.54s/it] {'loss': 0.1073, 'grad_norm': 0.7637325551091692, 'learning_rate': 1.9800006916272785e-06, 'epoch': 0.72} 72%|███████▏ | 24517/34278 [27:00:13<9:36:40, 3.54s/it] 72%|███████▏ | 24518/34278 [27:00:19<11:35:12, 4.27s/it] {'loss': 0.1225, 'grad_norm': 0.9062947721927214, 'learning_rate': 1.979624182446177e-06, 'epoch': 0.72} 72%|███████▏ | 24518/34278 [27:00:19<11:35:12, 4.27s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
72%|███████▏ | 24519/34278 [27:00:22<10:33:25, 3.89s/it] {'loss': 0.1096, 'grad_norm': 0.9542482483793993, 'learning_rate': 1.9792477002302713e-06, 'epoch': 0.72} 72%|███████▏ | 24519/34278 [27:00:22<10:33:25, 3.89s/it] 72%|███████▏ | 24520/34278 [27:00:26<10:08:59, 3.74s/it] {'loss': 0.1276, 'grad_norm': 0.9360330193100261, 'learning_rate': 1.9788712449829213e-06, 'epoch': 0.72} 72%|███████▏ | 24520/34278 [27:00:26<10:08:59, 3.74s/it] 72%|███████▏ | 24521/34278 [27:00:29<9:36:47, 3.55s/it] {'loss': 0.1171, 'grad_norm': 1.2684726042405354, 'learning_rate': 1.9784948167074856e-06, 'epoch': 0.72} 72%|███████▏ | 24521/34278 [27:00:29<9:36:47, 3.55s/it] 72%|███████▏ | 24522/34278 [27:00:32<9:12:05, 3.40s/it] {'loss': 0.1395, 'grad_norm': 1.0361858156709687, 'learning_rate': 1.9781184154073273e-06, 'epoch': 0.72} 72%|███████▏ | 24522/34278 [27:00:32<9:12:05, 3.40s/it] 72%|███████▏ | 24523/34278 [27:00:35<9:05:53, 3.36s/it] {'loss': 0.1205, 'grad_norm': 0.8714331582704156, 'learning_rate':
1.977742041085808e-06, 'epoch': 0.72} 72%|███████▏ | 24523/34278 [27:00:35<9:05:53, 3.36s/it] 72%|███████▏ | 24524/34278 [27:00:39<9:47:22, 3.61s/it] {'loss': 0.1198, 'grad_norm': 1.257545422964162, 'learning_rate': 1.9773656937462867e-06, 'epoch': 0.72} 72%|███████▏ | 24524/34278 [27:00:39<9:47:22, 3.61s/it] 72%|███████▏ | 24525/34278 [27:00:42<9:05:51, 3.36s/it] {'loss': 0.0947, 'grad_norm': 0.8290693970217324, 'learning_rate': 1.97698937339212e-06, 'epoch': 0.72} 72%|███████▏ | 24525/34278 [27:00:42<9:05:51, 3.36s/it] 72%|███████▏ | 24526/34278 [27:00:47<10:18:45, 3.81s/it] {'loss': 0.1198, 'grad_norm': 0.7808933215197137, 'learning_rate': 1.976613080026673e-06, 'epoch': 0.72} 72%|███████▏ | 24526/34278 [27:00:47<10:18:45, 3.81s/it] 72%|███████▏ | 24527/34278 [27:00:50<9:59:38, 3.69s/it] {'loss': 0.1228, 'grad_norm': 0.9400499470621698, 'learning_rate': 1.976236813653303e-06, 'epoch': 0.72} 72%|███████▏ | 24527/34278 [27:00:51<9:59:38, 3.69s/it] 72%|███████▏ | 24528/34278 [27:00:54<9:35:55, 3.54s/it] {'loss': 0.0983, 'grad_norm': 0.8092859321774526, 'learning_rate': 1.9758605742753665e-06, 'epoch': 0.72} 72%|███████▏ | 24528/34278 [27:00:54<9:35:55, 3.54s/it] 72%|███████▏ | 24529/34278 [27:00:57<9:30:09, 3.51s/it] {'loss': 0.1192, 'grad_norm': 0.9741125357720664, 'learning_rate': 1.9754843618962255e-06, 'epoch': 0.72} 72%|███████▏ | 24529/34278 [27:00:57<9:30:09, 3.51s/it] 72%|███████▏ | 24530/34278 [27:01:00<9:08:46, 3.38s/it] {'loss': 0.1304, 'grad_norm': 0.9429728273169273, 'learning_rate': 1.975108176519239e-06, 'epoch': 0.72} 72%|███████▏ | 24530/34278 [27:01:00<9:08:46, 3.38s/it] 72%|███████▏ | 24531/34278 [27:01:03<9:04:33, 3.35s/it] {'loss': 0.1215, 'grad_norm': 0.8955526795595488, 'learning_rate': 1.974732018147766e-06, 'epoch': 0.72} 72%|███████▏ | 24531/34278 [27:01:04<9:04:33, 3.35s/it] 72%|███████▏ | 24532/34278 [27:01:07<9:05:34, 3.36s/it] {'loss': 0.0976, 'grad_norm': 0.8555186048254568, 'learning_rate': 1.9743558867851605e-06, 'epoch': 0.72} 
72%|███████▏ | 24532/34278 [27:01:07<9:05:34, 3.36s/it] 72%|███████▏ | 24533/34278 [27:01:10<9:08:48, 3.38s/it] {'loss': 0.0959, 'grad_norm': 0.9543932338643378, 'learning_rate': 1.973979782434785e-06, 'epoch': 0.72} 72%|███████▏ | 24533/34278 [27:01:10<9:08:48, 3.38s/it] 72%|███████▏ | 24534/34278 [27:01:14<9:12:48, 3.40s/it] {'loss': 0.1214, 'grad_norm': 0.9182165873944451, 'learning_rate': 1.9736037050999946e-06, 'epoch': 0.72} 72%|███████▏ | 24534/34278 [27:01:14<9:12:48, 3.40s/it] 72%|███████▏ | 24535/34278 [27:01:17<8:53:05, 3.28s/it] {'loss': 0.1114, 'grad_norm': 0.8839100903731244, 'learning_rate': 1.9732276547841473e-06, 'epoch': 0.72} 72%|███████▏ | 24535/34278 [27:01:17<8:53:05, 3.28s/it] 72%|███████▏ | 24536/34278 [27:01:20<9:11:23, 3.40s/it] {'loss': 0.1316, 'grad_norm': 0.783510492837181, 'learning_rate': 1.9728516314906034e-06, 'epoch': 0.72} 72%|███████▏ | 24536/34278 [27:01:20<9:11:23, 3.40s/it] 72%|███████▏ | 24537/34278 [27:01:23<8:42:28, 3.22s/it] {'loss': 0.1344, 'grad_norm': 0.9539275579482954, 'learning_rate': 1.9724756352227163e-06, 'epoch': 0.72} 72%|███████▏ | 24537/34278 [27:01:23<8:42:28, 3.22s/it] 72%|███████▏ | 24538/34278 [27:01:26<8:22:44, 3.10s/it] {'loss': 0.1017, 'grad_norm': 0.9102475346754038, 'learning_rate': 1.9720996659838433e-06, 'epoch': 0.72} 72%|███████▏ | 24538/34278 [27:01:26<8:22:44, 3.10s/it] 72%|███████▏ | 24539/34278 [27:01:29<8:31:49, 3.15s/it] {'loss': 0.1271, 'grad_norm': 1.0290131903244208, 'learning_rate': 1.9717237237773428e-06, 'epoch': 0.72} 72%|███████▏ | 24539/34278 [27:01:29<8:31:49, 3.15s/it] 72%|███████▏ | 24540/34278 [27:01:33<8:43:21, 3.22s/it] {'loss': 0.1263, 'grad_norm': 0.8750834969039144, 'learning_rate': 1.9713478086065686e-06, 'epoch': 0.72} 72%|███████▏ | 24540/34278 [27:01:33<8:43:21, 3.22s/it] 72%|███████▏ | 24541/34278 [27:01:38<10:18:34, 3.81s/it] {'loss': 0.1269, 'grad_norm': 0.7595404203221637, 'learning_rate': 1.97097192047488e-06, 'epoch': 0.72} 72%|███████▏ | 24541/34278 
[27:01:38<10:18:34, 3.81s/it] 72%|███████▏ | 24542/34278 [27:01:43<11:06:39, 4.11s/it] {'loss': 0.1346, 'grad_norm': 1.070364503939263, 'learning_rate': 1.9705960593856287e-06, 'epoch': 0.72} 72%|███████▏ | 24542/34278 [27:01:43<11:06:39, 4.11s/it] 72%|███████▏ | 24543/34278 [27:01:49<12:58:57, 4.80s/it] {'loss': 0.1214, 'grad_norm': 0.7964632149951953, 'learning_rate': 1.970220225342175e-06, 'epoch': 0.72} 72%|███████▏ | 24543/34278 [27:01:49<12:58:57, 4.80s/it] 72%|███████▏ | 24544/34278 [27:01:54<13:11:17, 4.88s/it] {'loss': 0.1262, 'grad_norm': 0.7370001324735711, 'learning_rate': 1.9698444183478715e-06, 'epoch': 0.72} 72%|███████▏ | 24544/34278 [27:01:54<13:11:17, 4.88s/it] 72%|███████▏ | 24545/34278 [27:01:58<11:58:12, 4.43s/it] {'loss': 0.1458, 'grad_norm': 0.9609390321102597, 'learning_rate': 1.9694686384060726e-06, 'epoch': 0.72} 72%|███████▏ | 24545/34278 [27:01:58<11:58:12, 4.43s/it] 72%|███████▏ | 24546/34278 [27:02:00<10:46:30, 3.99s/it] {'loss': 0.1063, 'grad_norm': 0.6317432156243418, 'learning_rate': 1.969092885520133e-06, 'epoch': 0.72} 72%|███████▏ | 24546/34278 [27:02:00<10:46:30, 3.99s/it] 72%|███████▏ | 24547/34278 [27:02:05<11:32:21, 4.27s/it] {'loss': 0.1112, 'grad_norm': 0.7703462693873755, 'learning_rate': 1.9687171596934112e-06, 'epoch': 0.72} 72%|███████▏ | 24547/34278 [27:02:05<11:32:21, 4.27s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f41602e0090> Failed to fetch sample 2762606. Exception: cannot identify image file <_io.BytesIO object at 0x7f41602e0090> 72%|███████▏ | 24548/34278 [27:02:09<11:05:04, 4.10s/it] {'loss': 0.1171, 'grad_norm': 0.7059559100543018, 'learning_rate': 1.9683414609292573e-06, 'epoch': 0.72} 72%|███████▏ | 24548/34278 [27:02:09<11:05:04, 4.10s/it] 72%|███████▏ | 24549/34278 [27:02:13<10:34:34, 3.91s/it] {'loss': 0.1234, 'grad_norm': 0.718228407163729, 'learning_rate': 1.967965789231028e-06, 'epoch': 0.72} 72%|███████▏ | 24549/34278 [27:02:13<10:34:34, 3.91s/it] 72%|███████▏ | 24550/34278 [27:02:16<9:48:48, 3.63s/it] {'loss': 0.0972, 'grad_norm': 0.7392834156275169, 'learning_rate': 1.967590144602077e-06, 'epoch': 0.72} 72%|███████▏ | 24550/34278 [27:02:16<9:48:48, 3.63s/it] 72%|███████▏ | 24551/34278 [27:02:19<9:40:05, 3.58s/it] {'loss': 0.1226, 'grad_norm': 0.966034325521619, 'learning_rate': 1.9672145270457553e-06, 'epoch': 0.72} 72%|███████▏ | 24551/34278 [27:02:19<9:40:05, 3.58s/it] 72%|███████▏ | 24552/34278 [27:02:23<10:04:31, 3.73s/it] {'loss': 0.1279, 'grad_norm': 1.193308368363112, 'learning_rate': 1.966838936565419e-06, 'epoch': 0.72} 72%|███████▏ | 24552/34278 [27:02:23<10:04:31, 3.73s/it] 72%|███████▏ | 24553/34278 [27:02:26<9:26:28, 3.50s/it] {'loss': 0.0993, 'grad_norm': 0.6950364685412184, 'learning_rate': 
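The traceback above shows a corrupt record surfacing as PIL.UnidentifiedImageError inside the dataset's pil_loader, after which the sample is logged and skipped. A minimal sketch of that recovery pattern (hypothetical class and field names, not the repo's actual aguvis/dataset.py code): a loader that lets PIL raise on undecodable bytes, and a __getitem__ that falls back to another index instead of crashing the run.

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader(raw_bytes: bytes) -> Image.Image:
    """Decode raw bytes; Image.open raises UnidentifiedImageError on corrupt data."""
    img = Image.open(io.BytesIO(raw_bytes))
    return img.convert("RGB")


class RetryingDataset:
    """Sketch of a dataset that skips unreadable samples instead of crashing."""

    def __init__(self, records, max_retries: int = 10):
        self.records = records          # raw image byte strings
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i: int):
        for _ in range(self.max_retries):
            try:
                return pil_loader(self.records[i])
            except (UnidentifiedImageError, OSError) as exc:
                # Mirrors the log line "Failed to fetch sample ... Exception: ..."
                print(f"Failed to fetch sample {i}. Exception: {exc}")
                i = (i + 1) % len(self.records)  # deterministic fallback sample
        raise RuntimeError("too many consecutive unreadable samples")
```

A deterministic fallback (next index) keeps the epoch reproducible; a random fallback, as some loaders use, avoids biasing toward neighbors of bad records.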
[Trainer progress, steps 24553-24687 of 34278 (72 %, epoch 0.72, 3.1-5.3 s/it). Per-step loss ~0.094-0.151, grad_norm ~0.68-2.29 (spike of 2.286 at step 24596), learning_rate decaying from 1.9665e-06 to 1.9164e-06. The torch.utils.checkpoint "None of the inputs have requires_grad=True. Gradients will be None" UserWarning recurred at steps 24581 and 24615. Duplicate tqdm progress-bar redraws removed.]
[27:11:07<10:44:48, 4.03s/it] 72%|███████▏ | 24688/34278 [27:11:11<10:28:09, 3.93s/it] {'loss': 0.1148, 'grad_norm': 0.9344755495602592, 'learning_rate': 1.91601230456449e-06, 'epoch': 0.72} 72%|███████▏ | 24688/34278 [27:11:11<10:28:09, 3.93s/it] 72%|███████▏ | 24689/34278 [27:11:17<12:08:48, 4.56s/it] {'loss': 0.119, 'grad_norm': 1.008159646040027, 'learning_rate': 1.9156404550876563e-06, 'epoch': 0.72} 72%|███████▏ | 24689/34278 [27:11:17<12:08:48, 4.56s/it] 72%|███████▏ | 24690/34278 [27:11:21<11:25:34, 4.29s/it] {'loss': 0.1168, 'grad_norm': 1.2464510090837626, 'learning_rate': 1.9152686331472505e-06, 'epoch': 0.72} 72%|███████▏ | 24690/34278 [27:11:21<11:25:34, 4.29s/it] 72%|███████▏ | 24691/34278 [27:11:24<10:16:33, 3.86s/it] {'loss': 0.1237, 'grad_norm': 1.0474997117594915, 'learning_rate': 1.9148968387465895e-06, 'epoch': 0.72} 72%|███████▏ | 24691/34278 [27:11:24<10:16:33, 3.86s/it] 72%|███████▏ | 24692/34278 [27:11:27<10:14:47, 3.85s/it] {'loss': 0.1018, 'grad_norm': 0.7978159082470053, 'learning_rate': 1.914525071888991e-06, 'epoch': 0.72} 72%|███████▏ | 24692/34278 [27:11:27<10:14:47, 3.85s/it] 72%|███████▏ | 24693/34278 [27:11:33<12:01:23, 4.52s/it] {'loss': 0.122, 'grad_norm': 0.77938685141896, 'learning_rate': 1.9141533325777785e-06, 'epoch': 0.72} 72%|███████▏ | 24693/34278 [27:11:34<12:01:23, 4.52s/it] 72%|███████▏ | 24694/34278 [27:11:36<10:43:34, 4.03s/it] {'loss': 0.1304, 'grad_norm': 0.7419460825517156, 'learning_rate': 1.913781620816268e-06, 'epoch': 0.72} 72%|███████▏ | 24694/34278 [27:11:36<10:43:34, 4.03s/it] 72%|███████▏ | 24695/34278 [27:11:43<12:26:13, 4.67s/it] {'loss': 0.1239, 'grad_norm': 1.1909270497883744, 'learning_rate': 1.913409936607775e-06, 'epoch': 0.72} 72%|███████▏ | 24695/34278 [27:11:43<12:26:13, 4.67s/it] 72%|███████▏ | 24696/34278 [27:11:46<11:13:20, 4.22s/it] {'loss': 0.1277, 'grad_norm': 0.7898081197878707, 'learning_rate': 1.9130382799556253e-06, 'epoch': 0.72} 72%|███████▏ | 24696/34278 [27:11:46<11:13:20, 4.22s/it] 
72%|███████▏ | 24697/34278 [27:11:52<12:39:41, 4.76s/it] {'loss': 0.115, 'grad_norm': 0.7746590602597376, 'learning_rate': 1.9126666508631324e-06, 'epoch': 0.72} 72%|███████▏ | 24697/34278 [27:11:52<12:39:41, 4.76s/it] 72%|███████▏ | 24698/34278 [27:11:55<11:31:51, 4.33s/it] {'loss': 0.1323, 'grad_norm': 0.9502854095795547, 'learning_rate': 1.912295049333613e-06, 'epoch': 0.72} 72%|███████▏ | 24698/34278 [27:11:55<11:31:51, 4.33s/it] 72%|███████▏ | 24699/34278 [27:11:58<10:34:57, 3.98s/it] {'loss': 0.128, 'grad_norm': 1.0002000160621727, 'learning_rate': 1.911923475370388e-06, 'epoch': 0.72} 72%|███████▏ | 24699/34278 [27:11:58<10:34:57, 3.98s/it] 72%|███████▏ | 24700/34278 [27:12:01<9:52:22, 3.71s/it] {'loss': 0.1122, 'grad_norm': 0.8516728307827025, 'learning_rate': 1.911551928976773e-06, 'epoch': 0.72} 72%|███████▏ | 24700/34278 [27:12:01<9:52:22, 3.71s/it] 72%|███████▏ | 24701/34278 [27:12:06<10:46:53, 4.05s/it] {'loss': 0.1325, 'grad_norm': 0.7974658657158188, 'learning_rate': 1.911180410156083e-06, 'epoch': 0.72} 72%|███████▏ | 24701/34278 [27:12:06<10:46:53, 4.05s/it] 72%|███████▏ | 24702/34278 [27:12:09<9:52:01, 3.71s/it] {'loss': 0.1095, 'grad_norm': 0.8218616731306487, 'learning_rate': 1.9108089189116374e-06, 'epoch': 0.72} 72%|███████▏ | 24702/34278 [27:12:09<9:52:01, 3.71s/it] 72%|███████▏ | 24703/34278 [27:12:13<10:16:02, 3.86s/it] {'loss': 0.1242, 'grad_norm': 0.771458787265972, 'learning_rate': 1.9104374552467542e-06, 'epoch': 0.72} 72%|███████▏ | 24703/34278 [27:12:13<10:16:02, 3.86s/it] 72%|███████▏ | 24704/34278 [27:12:17<9:53:13, 3.72s/it] {'loss': 0.1117, 'grad_norm': 0.9622223656294758, 'learning_rate': 1.910066019164748e-06, 'epoch': 0.72} 72%|███████▏ | 24704/34278 [27:12:17<9:53:13, 3.72s/it] 72%|███████▏ | 24705/34278 [27:12:20<9:13:37, 3.47s/it] {'loss': 0.1244, 'grad_norm': 0.8817943316492531, 'learning_rate': 1.9096946106689322e-06, 'epoch': 0.72} 72%|███████▏ | 24705/34278 [27:12:20<9:13:37, 3.47s/it] 72%|███████▏ | 24706/34278 
[27:12:22<8:45:56, 3.30s/it] {'loss': 0.14, 'grad_norm': 0.8791277836520458, 'learning_rate': 1.9093232297626278e-06, 'epoch': 0.72} 72%|███████▏ | 24706/34278 [27:12:22<8:45:56, 3.30s/it] 72%|███████▏ | 24707/34278 [27:12:25<8:25:52, 3.17s/it] {'loss': 0.0991, 'grad_norm': 0.868448183477267, 'learning_rate': 1.9089518764491453e-06, 'epoch': 0.72} 72%|███████▏ | 24707/34278 [27:12:25<8:25:52, 3.17s/it] 72%|███████▏ | 24708/34278 [27:12:29<8:54:42, 3.35s/it] {'loss': 0.1266, 'grad_norm': 0.821549394991177, 'learning_rate': 1.908580550731805e-06, 'epoch': 0.72} 72%|███████▏ | 24708/34278 [27:12:29<8:54:42, 3.35s/it] 72%|███████▏ | 24709/34278 [27:12:35<10:48:35, 4.07s/it] {'loss': 0.1109, 'grad_norm': 0.7550515482338176, 'learning_rate': 1.9082092526139175e-06, 'epoch': 0.72} 72%|███████▏ | 24709/34278 [27:12:35<10:48:35, 4.07s/it] 72%|███████▏ | 24710/34278 [27:12:38<9:55:29, 3.73s/it] {'loss': 0.0992, 'grad_norm': 0.8146217101684364, 'learning_rate': 1.9078379820988014e-06, 'epoch': 0.72} 72%|███████▏ | 24710/34278 [27:12:38<9:55:29, 3.73s/it] 72%|███████▏ | 24711/34278 [27:12:41<9:31:26, 3.58s/it] {'loss': 0.1129, 'grad_norm': 0.9098167480942209, 'learning_rate': 1.9074667391897694e-06, 'epoch': 0.72} 72%|███████▏ | 24711/34278 [27:12:41<9:31:26, 3.58s/it] 72%|███████▏ | 24712/34278 [27:12:46<10:58:27, 4.13s/it] {'loss': 0.1102, 'grad_norm': 0.8360465489386156, 'learning_rate': 1.9070955238901352e-06, 'epoch': 0.72} 72%|███████▏ | 24712/34278 [27:12:46<10:58:27, 4.13s/it] 72%|███████▏ | 24713/34278 [27:12:50<10:09:28, 3.82s/it] {'loss': 0.144, 'grad_norm': 1.012354929219495, 'learning_rate': 1.9067243362032128e-06, 'epoch': 0.72} 72%|███████▏ | 24713/34278 [27:12:50<10:09:28, 3.82s/it] 72%|███████▏ | 24714/34278 [27:12:52<9:25:35, 3.55s/it] {'loss': 0.1103, 'grad_norm': 0.6986973803491306, 'learning_rate': 1.9063531761323195e-06, 'epoch': 0.72} 72%|███████▏ | 24714/34278 [27:12:52<9:25:35, 3.55s/it] 72%|███████▏ | 24715/34278 [27:12:58<10:44:11, 4.04s/it] {'loss': 
0.127, 'grad_norm': 0.7190303083356872, 'learning_rate': 1.9059820436807646e-06, 'epoch': 0.72} 72%|███████▏ | 24715/34278 [27:12:58<10:44:11, 4.04s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
72%|███████▏ | 24716/34278 [27:13:04<12:16:27, 4.62s/it] {'loss': 0.1149, 'grad_norm': 0.7472901284349369, 'learning_rate': 1.9056109388518652e-06, 'epoch': 0.72} 72%|███████▏ | 24716/34278 [27:13:04<12:16:27, 4.62s/it] 72%|███████▏ | 24717/34278 [27:13:07<11:12:55, 4.22s/it] {'loss': 0.1285, 'grad_norm': 0.8931698248751406, 'learning_rate': 1.9052398616489325e-06, 'epoch': 0.72} 72%|███████▏ | 24717/34278 [27:13:07<11:12:55, 4.22s/it] 72%|███████▏ | 24718/34278 [27:13:12<11:36:55, 4.37s/it] {'loss': 0.0837, 'grad_norm': 0.6280847754916922, 'learning_rate': 1.9048688120752785e-06, 'epoch': 0.72} 72%|███████▏ | 24718/34278 [27:13:12<11:36:55, 4.37s/it] 72%|███████▏ | 24719/34278 [27:13:16<11:21:52, 4.28s/it] {'loss': 0.129, 'grad_norm': 0.664066802237107, 'learning_rate': 1.904497790134216e-06, 'epoch': 0.72} 72%|███████▏ | 24719/34278 [27:13:16<11:21:52, 4.28s/it] 72%|███████▏ | 24720/34278 [27:13:19<10:45:52, 4.05s/it] {'loss': 0.1249, 'grad_norm': 0.9312753488511102, 'learning_rate': 1.9041267958290604e-06, 'epoch': 0.72} 72%|███████▏ | 24720/34278 [27:13:19<10:45:52, 4.05s/it] 72%|███████▏ | 24721/34278 [27:13:23<10:21:56, 3.90s/it] {'loss': 0.098, 'grad_norm': 0.9114255991829319, 'learning_rate': 1.9037558291631215e-06, 'epoch': 0.72} 72%|███████▏ | 24721/34278 [27:13:23<10:21:56, 3.90s/it] 72%|███████▏ | 24722/34278 [27:13:30<12:52:47, 4.85s/it] {'loss': 0.1275, 'grad_norm': 0.8471499709460484, 'learning_rate': 1.9033848901397101e-06, 'epoch': 0.72} 72%|███████▏ | 24722/34278 [27:13:30<12:52:47, 4.85s/it] 72%|███████▏ | 24723/34278 [27:13:34<12:01:20, 4.53s/it] {'loss': 0.1179, 'grad_norm':
0.8323807539702123, 'learning_rate': 1.9030139787621405e-06, 'epoch': 0.72} 72%|███████▏ | 24723/34278 [27:13:34<12:01:20, 4.53s/it] 72%|███████▏ | 24724/34278 [27:13:37<11:08:56, 4.20s/it] {'loss': 0.1273, 'grad_norm': 0.9121685995045828, 'learning_rate': 1.9026430950337227e-06, 'epoch': 0.72} 72%|███████▏ | 24724/34278 [27:13:37<11:08:56, 4.20s/it] 72%|███████▏ | 24725/34278 [27:13:42<11:35:48, 4.37s/it] {'loss': 0.143, 'grad_norm': 0.8911803084946347, 'learning_rate': 1.9022722389577648e-06, 'epoch': 0.72} 72%|███████▏ | 24725/34278 [27:13:42<11:35:48, 4.37s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
72%|███████▏ | 24726/34278 [27:13:48<12:55:17, 4.87s/it] {'loss': 0.1317, 'grad_norm': 0.8695092308868242, 'learning_rate': 1.9019014105375843e-06, 'epoch': 0.72} 72%|███████▏ | 24726/34278 [27:13:48<12:55:17, 4.87s/it] 72%|███████▏ | 24727/34278 [27:13:51<11:51:42, 4.47s/it] {'loss': 0.1261, 'grad_norm': 1.005277111469028, 'learning_rate': 1.9015306097764885e-06, 'epoch': 0.72} 72%|███████▏ | 24727/34278 [27:13:51<11:51:42, 4.47s/it] 72%|███████▏ | 24728/34278 [27:13:55<11:14:21, 4.24s/it] {'loss': 0.1347, 'grad_norm': 0.9282153258894128, 'learning_rate': 1.9011598366777855e-06, 'epoch': 0.72} 72%|███████▏ | 24728/34278 [27:13:55<11:14:21, 4.24s/it] 72%|███████▏ | 24729/34278 [27:13:58<10:02:21, 3.78s/it] {'loss': 0.1142, 'grad_norm': 0.823270135603786, 'learning_rate': 1.9007890912447902e-06, 'epoch': 0.72} 72%|███████▏ | 24729/34278 [27:13:58<10:02:21, 3.78s/it] 72%|███████▏ | 24730/34278 [27:14:01<9:29:15, 3.58s/it] {'loss': 0.1229, 'grad_norm': 0.7616597909380265, 'learning_rate': 1.9004183734808097e-06, 'epoch': 0.72} 72%|███████▏ | 24730/34278 [27:14:01<9:29:15, 3.58s/it] 72%|███████▏ | 24731/34278 [27:14:05<9:32:28, 3.60s/it] {'loss': 0.1339, 'grad_norm': 1.1750734034252135,
'learning_rate': 1.9000476833891518e-06, 'epoch': 0.72} 72%|███████▏ | 24731/34278 [27:14:05<9:32:28, 3.60s/it] 72%|███████▏ | 24732/34278 [27:14:08<9:10:57, 3.46s/it] {'loss': 0.1049, 'grad_norm': 1.0496333424439988, 'learning_rate': 1.8996770209731291e-06, 'epoch': 0.72} 72%|███████▏ | 24732/34278 [27:14:08<9:10:57, 3.46s/it] 72%|███████▏ | 24733/34278 [27:14:11<9:13:02, 3.48s/it] {'loss': 0.1375, 'grad_norm': 0.8141086163389445, 'learning_rate': 1.8993063862360512e-06, 'epoch': 0.72} 72%|███████▏ | 24733/34278 [27:14:11<9:13:02, 3.48s/it] 72%|███████▏ | 24734/34278 [27:14:15<9:27:34, 3.57s/it] {'loss': 0.1074, 'grad_norm': 0.8109456699197971, 'learning_rate': 1.8989357791812253e-06, 'epoch': 0.72} 72%|███████▏ | 24734/34278 [27:14:15<9:27:34, 3.57s/it] 72%|███████▏ | 24735/34278 [27:14:18<8:57:05, 3.38s/it] {'loss': 0.114, 'grad_norm': 0.9727899938713124, 'learning_rate': 1.8985651998119592e-06, 'epoch': 0.72} 72%|███████▏ | 24735/34278 [27:14:18<8:57:05, 3.38s/it] 72%|███████▏ | 24736/34278 [27:14:21<8:54:32, 3.36s/it] {'loss': 0.1009, 'grad_norm': 0.8227569421172481, 'learning_rate': 1.8981946481315645e-06, 'epoch': 0.72} 72%|███████▏ | 24736/34278 [27:14:21<8:54:32, 3.36s/it] 72%|███████▏ | 24737/34278 [27:14:25<8:56:34, 3.37s/it] {'loss': 0.0992, 'grad_norm': 0.8011658588593519, 'learning_rate': 1.8978241241433454e-06, 'epoch': 0.72} 72%|███████▏ | 24737/34278 [27:14:25<8:56:34, 3.37s/it] 72%|███████▏ | 24738/34278 [27:14:29<9:50:27, 3.71s/it] {'loss': 0.1265, 'grad_norm': 0.7985781412771426, 'learning_rate': 1.8974536278506134e-06, 'epoch': 0.72} 72%|███████▏ | 24738/34278 [27:14:29<9:50:27, 3.71s/it] 72%|███████▏ | 24739/34278 [27:14:33<9:55:37, 3.75s/it] {'loss': 0.1188, 'grad_norm': 0.6669561995164679, 'learning_rate': 1.8970831592566734e-06, 'epoch': 0.72} 72%|███████▏ | 24739/34278 [27:14:33<9:55:37, 3.75s/it] 72%|███████▏ | 24740/34278 [27:14:37<9:55:10, 3.74s/it] {'loss': 0.1097, 'grad_norm': 0.894483025862555, 'learning_rate': 
1.8967127183648365e-06, 'epoch': 0.72} 72%|███████▏ | 24740/34278 [27:14:37<9:55:10, 3.74s/it] 72%|███████▏ | 24741/34278 [27:14:40<9:14:15, 3.49s/it] {'loss': 0.1145, 'grad_norm': 0.9363555923795046, 'learning_rate': 1.896342305178407e-06, 'epoch': 0.72} 72%|███████▏ | 24741/34278 [27:14:40<9:14:15, 3.49s/it] 72%|███████▏ | 24742/34278 [27:14:45<10:28:09, 3.95s/it] {'loss': 0.1218, 'grad_norm': 0.8364978455189405, 'learning_rate': 1.8959719197006909e-06, 'epoch': 0.72} 72%|███████▏ | 24742/34278 [27:14:45<10:28:09, 3.95s/it] 72%|███████▏ | 24743/34278 [27:14:48<10:05:34, 3.81s/it] {'loss': 0.1287, 'grad_norm': 0.8276715893798426, 'learning_rate': 1.8956015619349966e-06, 'epoch': 0.72} 72%|███████▏ | 24743/34278 [27:14:48<10:05:34, 3.81s/it] 72%|███████▏ | 24744/34278 [27:14:52<10:23:08, 3.92s/it] {'loss': 0.135, 'grad_norm': 0.8775399174639029, 'learning_rate': 1.8952312318846323e-06, 'epoch': 0.72} 72%|███████▏ | 24744/34278 [27:14:52<10:23:08, 3.92s/it] 72%|███████▏ | 24745/34278 [27:14:55<9:45:39, 3.69s/it] {'loss': 0.0928, 'grad_norm': 1.222685766362629, 'learning_rate': 1.8948609295529002e-06, 'epoch': 0.72} 72%|███████▏ | 24745/34278 [27:14:55<9:45:39, 3.69s/it] 72%|███████▏ | 24746/34278 [27:15:01<10:57:01, 4.14s/it] {'loss': 0.1305, 'grad_norm': 0.961631466462116, 'learning_rate': 1.8944906549431108e-06, 'epoch': 0.72} 72%|███████▏ | 24746/34278 [27:15:01<10:57:01, 4.14s/it] 72%|███████▏ | 24747/34278 [27:15:05<10:45:59, 4.07s/it] {'loss': 0.1436, 'grad_norm': 1.0119973346594153, 'learning_rate': 1.8941204080585667e-06, 'epoch': 0.72} 72%|███████▏ | 24747/34278 [27:15:05<10:45:59, 4.07s/it] 72%|███████▏ | 24748/34278 [27:15:08<10:05:37, 3.81s/it] {'loss': 0.1256, 'grad_norm': 1.1602630861195822, 'learning_rate': 1.8937501889025732e-06, 'epoch': 0.72} 72%|███████▏ | 24748/34278 [27:15:08<10:05:37, 3.81s/it] 72%|███████▏ | 24749/34278 [27:15:11<9:47:27, 3.70s/it] {'loss': 0.1248, 'grad_norm': 1.1980093307519621, 'learning_rate': 1.893379997478436e-06, 
'epoch': 0.72} 72%|███████▏ | 24749/34278 [27:15:11<9:47:27, 3.70s/it] 72%|███████▏ | 24750/34278 [27:15:14<9:14:51, 3.49s/it] {'loss': 0.1243, 'grad_norm': 1.202511537934108, 'learning_rate': 1.8930098337894626e-06, 'epoch': 0.72} 72%|███████▏ | 24750/34278 [27:15:14<9:14:51, 3.49s/it] 72%|███████▏ | 24751/34278 [27:15:18<9:22:38, 3.54s/it] {'loss': 0.1225, 'grad_norm': 0.7734793074211808, 'learning_rate': 1.8926396978389554e-06, 'epoch': 0.72} 72%|███████▏ | 24751/34278 [27:15:18<9:22:38, 3.54s/it] 72%|███████▏ | 24752/34278 [27:15:21<9:13:47, 3.49s/it] {'loss': 0.1315, 'grad_norm': 0.9730790432600112, 'learning_rate': 1.892269589630218e-06, 'epoch': 0.72} 72%|███████▏ | 24752/34278 [27:15:21<9:13:47, 3.49s/it] 72%|███████▏ | 24753/34278 [27:15:24<8:47:53, 3.33s/it] {'loss': 0.1545, 'grad_norm': 1.2765906400452278, 'learning_rate': 1.891899509166557e-06, 'epoch': 0.72} 72%|███████▏ | 24753/34278 [27:15:24<8:47:53, 3.33s/it] 72%|███████▏ | 24754/34278 [27:15:27<8:45:21, 3.31s/it] {'loss': 0.1215, 'grad_norm': 0.9315206832768864, 'learning_rate': 1.8915294564512737e-06, 'epoch': 0.72} 72%|███████▏ | 24754/34278 [27:15:27<8:45:21, 3.31s/it] 72%|███████▏ | 24755/34278 [27:15:30<8:22:33, 3.17s/it] {'loss': 0.1117, 'grad_norm': 0.745017946426986, 'learning_rate': 1.8911594314876736e-06, 'epoch': 0.72} 72%|███████▏ | 24755/34278 [27:15:30<8:22:33, 3.17s/it] 72%|███████▏ | 24756/34278 [27:15:33<8:11:14, 3.10s/it] {'loss': 0.1519, 'grad_norm': 0.8528260442145464, 'learning_rate': 1.8907894342790617e-06, 'epoch': 0.72} 72%|███████▏ | 24756/34278 [27:15:33<8:11:14, 3.10s/it] 72%|███████▏ | 24757/34278 [27:15:36<8:02:12, 3.04s/it] {'loss': 0.1131, 'grad_norm': 0.9805517312070635, 'learning_rate': 1.8904194648287394e-06, 'epoch': 0.72} 72%|███████▏ | 24757/34278 [27:15:36<8:02:12, 3.04s/it] 72%|███████▏ | 24758/34278 [27:15:42<10:22:52, 3.93s/it] {'loss': 0.1179, 'grad_norm': 0.7800230015632025, 'learning_rate': 1.8900495231400079e-06, 'epoch': 0.72} 72%|███████▏ | 
24758/34278 [27:15:42<10:22:52, 3.93s/it] 72%|███████▏ | 24759/34278 [27:15:45<9:49:22, 3.71s/it] {'loss': 0.1299, 'grad_norm': 0.9897083552934886, 'learning_rate': 1.8896796092161735e-06, 'epoch': 0.72} 72%|███████▏ | 24759/34278 [27:15:45<9:49:22, 3.71s/it] 72%|███████▏ | 24760/34278 [27:15:50<10:24:52, 3.94s/it] {'loss': 0.1204, 'grad_norm': 0.8757966254637451, 'learning_rate': 1.8893097230605356e-06, 'epoch': 0.72} 72%|███████▏ | 24760/34278 [27:15:50<10:24:52, 3.94s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
72%|███████▏ | 24761/34278 [27:15:56<12:06:25, 4.58s/it] {'loss': 0.1134, 'grad_norm': 1.1702030094005131, 'learning_rate': 1.888939864676399e-06, 'epoch': 0.72} 72%|███████▏ | 24761/34278 [27:15:56<12:06:25, 4.58s/it] 72%|███████▏ | 24762/34278 [27:15:59<11:18:47, 4.28s/it] {'loss': 0.12, 'grad_norm': 0.8368577500639368, 'learning_rate': 1.8885700340670638e-06, 'epoch': 0.72} 72%|███████▏ | 24762/34278 [27:15:59<11:18:47, 4.28s/it] 72%|███████▏ | 24763/34278 [27:16:03<10:53:04, 4.12s/it] {'loss': 0.1248, 'grad_norm': 0.8328544580460618, 'learning_rate': 1.8882002312358337e-06, 'epoch': 0.72} 72%|███████▏ | 24763/34278 [27:16:03<10:53:04, 4.12s/it] 72%|███████▏ | 24764/34278 [27:16:06<9:55:17, 3.75s/it] {'loss': 0.0925, 'grad_norm': 0.7411473191067168, 'learning_rate': 1.8878304561860094e-06, 'epoch': 0.72} 72%|███████▏ | 24764/34278 [27:16:06<9:55:17, 3.75s/it] 72%|███████▏ | 24765/34278 [27:16:12<11:24:05, 4.31s/it] {'loss': 0.131, 'grad_norm': 0.8998334911980462, 'learning_rate': 1.8874607089208901e-06, 'epoch': 0.72} 72%|███████▏ | 24765/34278 [27:16:12<11:24:05, 4.31s/it] 72%|███████▏ | 24766/34278 [27:16:15<10:40:43, 4.04s/it] {'loss': 0.1492, 'grad_norm': 0.8310207083478498, 'learning_rate': 1.8870909894437783e-06, 'epoch': 0.72} 72%|███████▏ | 24766/34278
[27:16:15<10:40:43, 4.04s/it] 72%|███████▏ | 24767/34278 [27:16:19<10:24:44, 3.94s/it] {'loss': 0.1091, 'grad_norm': 0.6911958778317953, 'learning_rate': 1.886721297757977e-06, 'epoch': 0.72} 72%|███████▏ | 24767/34278 [27:16:19<10:24:44, 3.94s/it] 72%|███████▏ | 24768/34278 [27:16:23<10:15:40, 3.88s/it] {'loss': 0.1067, 'grad_norm': 0.7323282250704054, 'learning_rate': 1.8863516338667847e-06, 'epoch': 0.72} 72%|███████▏ | 24768/34278 [27:16:23<10:15:40, 3.88s/it] 72%|███████▏ | 24769/34278 [27:16:26<9:33:09, 3.62s/it] {'loss': 0.1111, 'grad_norm': 0.7857957241548171, 'learning_rate': 1.8859819977735e-06, 'epoch': 0.72} 72%|███████▏ | 24769/34278 [27:16:26<9:33:09, 3.62s/it] 72%|███████▏ | 24770/34278 [27:16:29<9:03:30, 3.43s/it] {'loss': 0.1268, 'grad_norm': 0.758324899343193, 'learning_rate': 1.885612389481426e-06, 'epoch': 0.72} 72%|███████▏ | 24770/34278 [27:16:29<9:03:30, 3.43s/it] 72%|███████▏ | 24771/34278 [27:16:32<8:42:18, 3.30s/it] {'loss': 0.1303, 'grad_norm': 0.8682851454231227, 'learning_rate': 1.885242808993862e-06, 'epoch': 0.72} 72%|███████▏ | 24771/34278 [27:16:32<8:42:18, 3.30s/it] 72%|███████▏ | 24772/34278 [27:16:38<11:10:57, 4.24s/it] {'loss': 0.1328, 'grad_norm': 0.7170521968542871, 'learning_rate': 1.8848732563141026e-06, 'epoch': 0.72} 72%|███████▏ | 24772/34278 [27:16:38<11:10:57, 4.24s/it] 72%|███████▏ | 24773/34278 [27:16:41<10:18:16, 3.90s/it] {'loss': 0.1194, 'grad_norm': 0.945809200661344, 'learning_rate': 1.8845037314454544e-06, 'epoch': 0.72} 72%|███████▏ | 24773/34278 [27:16:41<10:18:16, 3.90s/it] 72%|███████▏ | 24774/34278 [27:16:44<9:47:11, 3.71s/it] {'loss': 0.1156, 'grad_norm': 0.8845968746110616, 'learning_rate': 1.8841342343912134e-06, 'epoch': 0.72} 72%|███████▏ | 24774/34278 [27:16:44<9:47:11, 3.71s/it] 72%|███████▏ | 24775/34278 [27:16:47<9:14:31, 3.50s/it] {'loss': 0.1114, 'grad_norm': 0.953148813227204, 'learning_rate': 1.8837647651546765e-06, 'epoch': 0.72} 72%|███████▏ | 24775/34278 [27:16:47<9:14:31, 3.50s/it] 
72%|███████▏ | 24776/34278 [27:16:51<9:17:58, 3.52s/it] {'loss': 0.1212, 'grad_norm': 0.8157125894220421, 'learning_rate': 1.8833953237391456e-06, 'epoch': 0.72} 72%|███████▏ | 24776/34278 [27:16:51<9:17:58, 3.52s/it] 72%|███████▏ | 24777/34278 [27:16:54<8:54:41, 3.38s/it] {'loss': 0.1063, 'grad_norm': 1.2872659325605247, 'learning_rate': 1.883025910147917e-06, 'epoch': 0.72} 72%|███████▏ | 24777/34278 [27:16:54<8:54:41, 3.38s/it] 72%|███████▏ | 24778/34278 [27:16:57<8:40:09, 3.29s/it] {'loss': 0.1342, 'grad_norm': 0.9941306782651606, 'learning_rate': 1.8826565243842877e-06, 'epoch': 0.72} 72%|███████▏ | 24778/34278 [27:16:57<8:40:09, 3.29s/it] 72%|███████▏ | 24779/34278 [27:17:00<8:43:59, 3.31s/it] {'loss': 0.1115, 'grad_norm': 0.8272135801291861, 'learning_rate': 1.8822871664515562e-06, 'epoch': 0.72} 72%|███████▏ | 24779/34278 [27:17:00<8:43:59, 3.31s/it] 72%|███████▏ | 24780/34278 [27:17:03<8:20:37, 3.16s/it] {'loss': 0.101, 'grad_norm': 0.8707805844517327, 'learning_rate': 1.8819178363530226e-06, 'epoch': 0.72} 72%|███████▏ | 24780/34278 [27:17:03<8:20:37, 3.16s/it] 72%|███████▏ | 24781/34278 [27:17:07<9:11:45, 3.49s/it] {'loss': 0.1169, 'grad_norm': 1.0051174270341248, 'learning_rate': 1.8815485340919825e-06, 'epoch': 0.72} 72%|███████▏ | 24781/34278 [27:17:07<9:11:45, 3.49s/it] 72%|███████▏ | 24782/34278 [27:17:11<8:54:18, 3.38s/it] {'loss': 0.0972, 'grad_norm': 1.081533538448045, 'learning_rate': 1.881179259671731e-06, 'epoch': 0.72} 72%|███████▏ | 24782/34278 [27:17:11<8:54:18, 3.38s/it] 72%|███████▏ | 24783/34278 [27:17:14<8:40:36, 3.29s/it] {'loss': 0.1131, 'grad_norm': 0.6500338650694056, 'learning_rate': 1.8808100130955676e-06, 'epoch': 0.72} 72%|███████▏ | 24783/34278 [27:17:14<8:40:36, 3.29s/it] 72%|███████▏ | 24784/34278 [27:17:20<10:46:06, 4.08s/it] {'loss': 0.1173, 'grad_norm': 0.8756280400598867, 'learning_rate': 1.8804407943667869e-06, 'epoch': 0.72} 72%|███████▏ | 24784/34278 [27:17:20<10:46:06, 4.08s/it] 72%|███████▏ | 24785/34278 
[27:17:23<9:55:52, 3.77s/it] {'loss': 0.0986, 'grad_norm': 0.861538997600829, 'learning_rate': 1.880071603488685e-06, 'epoch': 0.72} 72%|███████▏ | 24785/34278 [27:17:23<9:55:52, 3.77s/it] 72%|███████▏ | 24786/34278 [27:17:26<9:30:00, 3.60s/it] {'loss': 0.1009, 'grad_norm': 0.740512192965018, 'learning_rate': 1.879702440464562e-06, 'epoch': 0.72} 72%|███████▏ | 24786/34278 [27:17:26<9:30:00, 3.60s/it] 72%|███████▏ | 24787/34278 [27:17:29<9:12:55, 3.50s/it] {'loss': 0.1195, 'grad_norm': 0.9904485291951682, 'learning_rate': 1.8793333052977098e-06, 'epoch': 0.72} 72%|███████▏ | 24787/34278 [27:17:29<9:12:55, 3.50s/it] 72%|███████▏ | 24788/34278 [27:17:32<8:47:52, 3.34s/it] {'loss': 0.1142, 'grad_norm': 0.8104199356771068, 'learning_rate': 1.8789641979914237e-06, 'epoch': 0.72} 72%|███████▏ | 24788/34278 [27:17:32<8:47:52, 3.34s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8dbf7e4720>
Failed to fetch sample 3679500. Exception: cannot identify image file <_io.BytesIO object at 0x7f8dbf7e4720>
72%|███████▏ | 24789/34278 [27:17:36<9:06:48, 3.46s/it] {'loss': 0.1132, 'grad_norm': 0.7682817340857583, 'learning_rate': 1.8785951185490014e-06, 'epoch': 0.72} 72%|███████▏ | 24789/34278 [27:17:36<9:06:48, 3.46s/it] 72%|███████▏ | 24790/34278 [27:17:41<10:14:39, 3.89s/it] {'loss': 0.1306, 'grad_norm': 0.8446815571771478, 'learning_rate': 1.8782260669737357e-06, 'epoch': 0.72} 72%|███████▏ | 24790/34278 [27:17:41<10:14:39, 3.89s/it] 72%|███████▏ | 24791/34278 [27:17:44<9:36:33, 3.65s/it] {'loss': 0.1157, 'grad_norm': 0.7795864492754341, 'learning_rate': 1.8778570432689236e-06, 'epoch': 0.72} 72%|███████▏ | 24791/34278 [27:17:44<9:36:33, 3.65s/it] 72%|███████▏ | 24792/34278 [27:17:47<9:14:39, 3.51s/it] {'loss': 0.1042, 'grad_norm': 0.7912001186866039, 'learning_rate': 1.8774880474378571e-06, 'epoch': 0.72} 72%|███████▏ | 24792/34278 [27:17:47<9:14:39, 3.51s/it] 72%|███████▏ | 24793/34278 [27:17:50<8:56:58, 3.40s/it] {'loss': 0.1379, 'grad_norm': 1.1973824302252325, 'learning_rate': 1.8771190794838333e-06, 'epoch': 0.72} 72%|███████▏ | 24793/34278 [27:17:50<8:56:58, 3.40s/it] 72%|███████▏ | 24794/34278 [27:17:53<8:41:22, 3.30s/it] {'loss': 0.1326, 'grad_norm': 0.8313170598836324, 'learning_rate': 1.876750139410145e-06, 'epoch': 0.72} 72%|███████▏ | 24794/34278 [27:17:53<8:41:22, 3.30s/it] 72%|███████▏ | 24795/34278 [27:17:56<8:29:17, 3.22s/it] {'loss': 0.1114, 'grad_norm': 0.7775042501154492, 'learning_rate': 1.8763812272200843e-06, 'epoch': 0.72} 72%|███████▏ | 24795/34278 [27:17:56<8:29:17, 3.22s/it] 72%|███████▏ | 24796/34278 [27:17:59<8:20:49, 3.17s/it] {'loss': 0.119, 'grad_norm':
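The traceback ends with "Failed to fetch sample 3679500" and the run continues, so `__getitem__` evidently catches the `PIL.UnidentifiedImageError` and substitutes another sample rather than killing the dataloader worker. A minimal sketch of that skip-on-undecodable-image pattern (the names `safe_get_item`, the in-memory `samples` list, and the sequential fallback policy are hypothetical, not the actual `dataset.py` code, which pulls bytes from Ceph and likely retries a random index):

```python
import io
from PIL import Image, UnidentifiedImageError

def safe_get_item(samples, index, max_retries=10):
    """Decode the image bytes at `index`; on corrupt/truncated bytes that
    PIL cannot identify, log the failure and fall back to another sample
    instead of letting the exception crash the dataloader worker.
    Hypothetical sketch -- `samples` is a plain list of raw image bytes."""
    for _ in range(max_retries):
        try:
            return Image.open(io.BytesIO(samples[index])).convert("RGB")
        except UnidentifiedImageError as e:
            print(f"Failed to fetch sample {index}. Exception: {e}")
            index = (index + 1) % len(samples)  # deterministic fallback
    raise RuntimeError(f"no decodable sample within {max_retries} retries")
```

The point is where the exception is caught: above `Image.open` but inside `__getitem__`, so one corrupt object-store entry costs a warning line in the log, not the whole 27-hour job.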
0.9891694611844759, 'learning_rate': 1.8760123429169464e-06, 'epoch': 0.72} 72%|███████▏ | 24796/34278 [27:17:59<8:20:49, 3.17s/it] 72%|███████▏ | 24797/34278 [27:18:03<8:47:41, 3.34s/it] {'loss': 0.1205, 'grad_norm': 0.7944133919249249, 'learning_rate': 1.8756434865040262e-06, 'epoch': 0.72} 72%|███████▏ | 24797/34278 [27:18:03<8:47:41, 3.34s/it] 72%|███████▏ | 24798/34278 [27:18:06<8:34:04, 3.25s/it] {'loss': 0.1104, 'grad_norm': 0.7436630045707892, 'learning_rate': 1.8752746579846148e-06, 'epoch': 0.72} 72%|███████▏ | 24798/34278 [27:18:06<8:34:04, 3.25s/it] 72%|███████▏ | 24799/34278 [27:18:10<8:56:00, 3.39s/it] {'loss': 0.1255, 'grad_norm': 1.0096995984949833, 'learning_rate': 1.8749058573620039e-06, 'epoch': 0.72} 72%|███████▏ | 24799/34278 [27:18:10<8:56:00, 3.39s/it] 72%|███████▏ | 24800/34278 [27:18:13<8:26:26, 3.21s/it] {'loss': 0.0964, 'grad_norm': 0.8741876356871757, 'learning_rate': 1.8745370846394894e-06, 'epoch': 0.72} 72%|███████▏ | 24800/34278 [27:18:13<8:26:26, 3.21s/it] 72%|███████▏ | 24801/34278 [27:18:16<8:23:03, 3.18s/it] {'loss': 0.1074, 'grad_norm': 0.8314448827100319, 'learning_rate': 1.8741683398203614e-06, 'epoch': 0.72} 72%|███████▏ | 24801/34278 [27:18:16<8:23:03, 3.18s/it] 72%|███████▏ | 24802/34278 [27:18:22<10:32:57, 4.01s/it] {'loss': 0.1169, 'grad_norm': 0.966429973558122, 'learning_rate': 1.8737996229079086e-06, 'epoch': 0.72} 72%|███████▏ | 24802/34278 [27:18:22<10:32:57, 4.01s/it] 72%|███████▏ | 24803/34278 [27:18:25<10:25:32, 3.96s/it] {'loss': 0.1107, 'grad_norm': 0.8734627794263268, 'learning_rate': 1.8734309339054308e-06, 'epoch': 0.72} 72%|███████▏ | 24803/34278 [27:18:25<10:25:32, 3.96s/it] 72%|███████▏ | 24804/34278 [27:18:29<9:54:02, 3.76s/it] {'loss': 0.1326, 'grad_norm': 1.170058015271319, 'learning_rate': 1.8730622728162146e-06, 'epoch': 0.72} 72%|███████▏ | 24804/34278 [27:18:29<9:54:02, 3.76s/it] 72%|███████▏ | 24805/34278 [27:18:32<9:46:33, 3.72s/it] {'loss': 0.1201, 'grad_norm': 0.9210037414578498, 
'learning_rate': 1.8726936396435502e-06, 'epoch': 0.72} 72%|███████▏ | 24805/34278 [27:18:32<9:46:33, 3.72s/it] 72%|███████▏ | 24806/34278 [27:18:36<9:45:56, 3.71s/it] {'loss': 0.1044, 'grad_norm': 0.7295522152079452, 'learning_rate': 1.8723250343907323e-06, 'epoch': 0.72} 72%|███████▏ | 24806/34278 [27:18:36<9:45:56, 3.71s/it] 72%|███████▏ | 24807/34278 [27:18:39<9:16:12, 3.52s/it] {'loss': 0.1201, 'grad_norm': 0.9642615316670107, 'learning_rate': 1.8719564570610494e-06, 'epoch': 0.72} 72%|███████▏ | 24807/34278 [27:18:39<9:16:12, 3.52s/it] 72%|███████▏ | 24808/34278 [27:18:42<9:04:25, 3.45s/it] {'loss': 0.1115, 'grad_norm': 0.8333528157145845, 'learning_rate': 1.8715879076577915e-06, 'epoch': 0.72} 72%|███████▏ | 24808/34278 [27:18:42<9:04:25, 3.45s/it] 72%|███████▏ | 24809/34278 [27:18:45<8:34:44, 3.26s/it] {'loss': 0.1096, 'grad_norm': 0.8918676910593354, 'learning_rate': 1.8712193861842498e-06, 'epoch': 0.72} 72%|███████▏ | 24809/34278 [27:18:45<8:34:44, 3.26s/it] 72%|███████▏ | 24810/34278 [27:18:48<8:09:45, 3.10s/it] {'loss': 0.1103, 'grad_norm': 0.9016907346872983, 'learning_rate': 1.8708508926437157e-06, 'epoch': 0.72} 72%|███████▏ | 24810/34278 [27:18:48<8:09:45, 3.10s/it] 72%|███████▏ | 24811/34278 [27:18:51<8:23:44, 3.19s/it] {'loss': 0.1153, 'grad_norm': 1.0659294528626322, 'learning_rate': 1.8704824270394783e-06, 'epoch': 0.72} 72%|███████▏ | 24811/34278 [27:18:51<8:23:44, 3.19s/it] 72%|███████▏ | 24812/34278 [27:18:55<8:36:33, 3.27s/it] {'loss': 0.1364, 'grad_norm': 0.9068302087451171, 'learning_rate': 1.870113989374825e-06, 'epoch': 0.72} 72%|███████▏ | 24812/34278 [27:18:55<8:36:33, 3.27s/it] 72%|███████▏ | 24813/34278 [27:18:58<8:35:21, 3.27s/it] {'loss': 0.1066, 'grad_norm': 1.0306284501844045, 'learning_rate': 1.8697455796530483e-06, 'epoch': 0.72} 72%|███████▏ | 24813/34278 [27:18:58<8:35:21, 3.27s/it] 72%|███████▏ | 24814/34278 [27:19:02<8:45:47, 3.33s/it] {'loss': 0.1244, 'grad_norm': 1.1067347553094529, 'learning_rate': 
1.8693771978774345e-06, 'epoch': 0.72} 72%|███████▏ | 24814/34278 [27:19:02<8:45:47, 3.33s/it] 72%|███████▏ | 24815/34278 [27:19:06<9:40:55, 3.68s/it] {'loss': 0.1223, 'grad_norm': 0.9523719170841769, 'learning_rate': 1.8690088440512738e-06, 'epoch': 0.72} 72%|███████▏ | 24815/34278 [27:19:06<9:40:55, 3.68s/it] 72%|███████▏ | 24816/34278 [27:19:09<9:08:32, 3.48s/it] {'loss': 0.1037, 'grad_norm': 0.7094855379979579, 'learning_rate': 1.8686405181778562e-06, 'epoch': 0.72} 72%|███████▏ | 24816/34278 [27:19:09<9:08:32, 3.48s/it] 72%|███████▏ | 24817/34278 [27:19:12<9:02:38, 3.44s/it] {'loss': 0.1171, 'grad_norm': 1.1803413326087289, 'learning_rate': 1.8682722202604681e-06, 'epoch': 0.72} 72%|███████▏ | 24817/34278 [27:19:12<9:02:38, 3.44s/it] 72%|███████▏ | 24818/34278 [27:19:16<8:49:09, 3.36s/it] {'loss': 0.1283, 'grad_norm': 1.0737906151829388, 'learning_rate': 1.8679039503023972e-06, 'epoch': 0.72} 72%|███████▏ | 24818/34278 [27:19:16<8:49:09, 3.36s/it] 72%|███████▏ | 24819/34278 [27:19:19<8:29:05, 3.23s/it] {'loss': 0.1355, 'grad_norm': 0.8839486697134835, 'learning_rate': 1.8675357083069328e-06, 'epoch': 0.72} 72%|███████▏ | 24819/34278 [27:19:19<8:29:05, 3.23s/it] 72%|███████▏ | 24820/34278 [27:19:21<8:12:04, 3.12s/it] {'loss': 0.1232, 'grad_norm': 0.9648421191721859, 'learning_rate': 1.867167494277361e-06, 'epoch': 0.72} 72%|███████▏ | 24820/34278 [27:19:21<8:12:04, 3.12s/it] 72%|███████▏ | 24821/34278 [27:19:24<8:02:32, 3.06s/it] {'loss': 0.1416, 'grad_norm': 1.0422206619084138, 'learning_rate': 1.8667993082169712e-06, 'epoch': 0.72} 72%|███████▏ | 24821/34278 [27:19:24<8:02:32, 3.06s/it] 72%|███████▏ | 24822/34278 [27:19:28<8:10:22, 3.11s/it] {'loss': 0.1203, 'grad_norm': 1.1206186102912703, 'learning_rate': 1.8664311501290478e-06, 'epoch': 0.72} 72%|███████▏ | 24822/34278 [27:19:28<8:10:22, 3.11s/it] 72%|███████▏ | 24823/34278 [27:19:34<10:30:46, 4.00s/it] {'loss': 0.1068, 'grad_norm': 1.2156421710641572, 'learning_rate': 1.8660630200168806e-06, 'epoch': 
0.72} 72%|███████▏ | 24823/34278 [27:19:34<10:30:46, 4.00s/it] 72%|███████▏ | 24824/34278 [27:19:37<9:48:14, 3.73s/it] {'loss': 0.122, 'grad_norm': 0.7844188174103284, 'learning_rate': 1.8656949178837547e-06, 'epoch': 0.72} 72%|███████▏ | 24824/34278 [27:19:37<9:48:14, 3.73s/it] 72%|███████▏ | 24825/34278 [27:19:40<9:16:32, 3.53s/it] {'loss': 0.1119, 'grad_norm': 0.707850413303378, 'learning_rate': 1.8653268437329542e-06, 'epoch': 0.72} 72%|███████▏ | 24825/34278 [27:19:40<9:16:32, 3.53s/it] 72%|███████▏ | 24826/34278 [27:19:43<8:55:39, 3.40s/it] {'loss': 0.1229, 'grad_norm': 1.2279871966228981, 'learning_rate': 1.864958797567768e-06, 'epoch': 0.72} 72%|███████▏ | 24826/34278 [27:19:43<8:55:39, 3.40s/it] 72%|███████▏ | 24827/34278 [27:19:46<8:36:36, 3.28s/it] {'loss': 0.1132, 'grad_norm': 0.9589869772952597, 'learning_rate': 1.8645907793914826e-06, 'epoch': 0.72} 72%|███████▏ | 24827/34278 [27:19:46<8:36:36, 3.28s/it] 72%|███████▏ | 24828/34278 [27:19:49<8:34:42, 3.27s/it] {'loss': 0.1301, 'grad_norm': 0.753126405914351, 'learning_rate': 1.864222789207382e-06, 'epoch': 0.72} 72%|███████▏ | 24828/34278 [27:19:49<8:34:42, 3.27s/it] 72%|███████▏ | 24829/34278 [27:19:54<9:53:59, 3.77s/it] {'loss': 0.1166, 'grad_norm': 0.7513366561510253, 'learning_rate': 1.8638548270187505e-06, 'epoch': 0.72} 72%|███████▏ | 24829/34278 [27:19:54<9:53:59, 3.77s/it] 72%|███████▏ | 24830/34278 [27:19:58<9:50:35, 3.75s/it] {'loss': 0.115, 'grad_norm': 0.9634536537768936, 'learning_rate': 1.8634868928288757e-06, 'epoch': 0.72} 72%|███████▏ | 24830/34278 [27:19:58<9:50:35, 3.75s/it] 72%|███████▏ | 24831/34278 [27:20:02<10:22:36, 3.95s/it] {'loss': 0.1171, 'grad_norm': 0.8460884475997975, 'learning_rate': 1.863118986641042e-06, 'epoch': 0.72} 72%|███████▏ | 24831/34278 [27:20:02<10:22:36, 3.95s/it] 72%|███████▏ | 24832/34278 [27:20:05<9:34:50, 3.65s/it] {'loss': 0.1067, 'grad_norm': 0.8555442129602252, 'learning_rate': 1.8627511084585293e-06, 'epoch': 0.72} 72%|███████▏ | 24832/34278 
[27:20:05<9:34:50, 3.65s/it] 72%|███████▏ | 24833/34278 [27:20:10<10:07:34, 3.86s/it] {'loss': 0.1054, 'grad_norm': 0.7231277466632015, 'learning_rate': 1.8623832582846291e-06, 'epoch': 0.72} 72%|███████▏ | 24833/34278 [27:20:10<10:07:34, 3.86s/it] 72%|███████▏ | 24834/34278 [27:20:13<9:40:19, 3.69s/it] {'loss': 0.1427, 'grad_norm': 1.2338736174024925, 'learning_rate': 1.8620154361226218e-06, 'epoch': 0.72} 72%|███████▏ | 24834/34278 [27:20:13<9:40:19, 3.69s/it] 72%|███████▏ | 24835/34278 [27:20:16<9:23:57, 3.58s/it] {'loss': 0.0983, 'grad_norm': 0.8570689943214541, 'learning_rate': 1.8616476419757907e-06, 'epoch': 0.72} 72%|███████▏ | 24835/34278 [27:20:16<9:23:57, 3.58s/it] 72%|███████▏ | 24836/34278 [27:20:22<11:17:21, 4.30s/it] {'loss': 0.139, 'grad_norm': 0.7948866818502712, 'learning_rate': 1.861279875847421e-06, 'epoch': 0.72} 72%|███████▏ | 24836/34278 [27:20:22<11:17:21, 4.30s/it] 72%|███████▏ | 24837/34278 [27:20:25<10:06:54, 3.86s/it] {'loss': 0.1235, 'grad_norm': 0.8685926764892993, 'learning_rate': 1.8609121377407963e-06, 'epoch': 0.72} 72%|███████▏ | 24837/34278 [27:20:25<10:06:54, 3.86s/it] 72%|███████▏ | 24838/34278 [27:20:29<10:03:36, 3.84s/it] {'loss': 0.1263, 'grad_norm': 0.8490860723033917, 'learning_rate': 1.8605444276591961e-06, 'epoch': 0.72} 72%|███████▏ | 24838/34278 [27:20:29<10:03:36, 3.84s/it] 72%|███████▏ | 24839/34278 [27:20:35<11:41:49, 4.46s/it] {'loss': 0.1075, 'grad_norm': 0.9168573640964627, 'learning_rate': 1.8601767456059062e-06, 'epoch': 0.72} 72%|███████▏ | 24839/34278 [27:20:35<11:41:49, 4.46s/it] 72%|███████▏ | 24840/34278 [27:20:38<11:06:28, 4.24s/it] {'loss': 0.1204, 'grad_norm': 0.7589705049956947, 'learning_rate': 1.8598090915842105e-06, 'epoch': 0.72} 72%|███████▏ | 24840/34278 [27:20:38<11:06:28, 4.24s/it] 72%|███████▏ | 24841/34278 [27:20:44<12:32:44, 4.79s/it] {'loss': 0.1145, 'grad_norm': 0.7442359511864499, 'learning_rate': 1.8594414655973898e-06, 'epoch': 0.72} 72%|███████▏ | 24841/34278 [27:20:44<12:32:44, 
4.79s/it] 72%|███████▏ | 24842/34278 [27:20:51<13:37:17, 5.20s/it] {'loss': 0.1094, 'grad_norm': 0.8111783044886859, 'learning_rate': 1.8590738676487242e-06, 'epoch': 0.72} 72%|███████▏ | 24842/34278 [27:20:51<13:37:17, 5.20s/it] 72%|███████▏ | 24843/34278 [27:20:54<11:49:47, 4.51s/it] {'loss': 0.1195, 'grad_norm': 1.3516789518929335, 'learning_rate': 1.8587062977414987e-06, 'epoch': 0.72} 72%|███████▏ | 24843/34278 [27:20:54<11:49:47, 4.51s/it] 72%|███████▏ | 24844/34278 [27:20:56<10:37:07, 4.05s/it] {'loss': 0.1217, 'grad_norm': 0.7454359558463477, 'learning_rate': 1.8583387558789916e-06, 'epoch': 0.72} 72%|███████▏ | 24844/34278 [27:20:56<10:37:07, 4.05s/it] 72%|███████▏ | 24845/34278 [27:21:00<10:22:01, 3.96s/it] {'loss': 0.1352, 'grad_norm': 0.8358274388093241, 'learning_rate': 1.8579712420644869e-06, 'epoch': 0.72} 72%|███████▏ | 24845/34278 [27:21:00<10:22:01, 3.96s/it] 72%|███████▏ | 24846/34278 [27:21:06<11:58:21, 4.57s/it] {'loss': 0.0952, 'grad_norm': 0.6239342387495486, 'learning_rate': 1.8576037563012662e-06, 'epoch': 0.72} 72%|███████▏ | 24846/34278 [27:21:06<11:58:21, 4.57s/it] 72%|███████▏ | 24847/34278 [27:21:10<11:22:34, 4.34s/it] {'loss': 0.1604, 'grad_norm': 0.7656401194383812, 'learning_rate': 1.857236298592609e-06, 'epoch': 0.72} 72%|███████▏ | 24847/34278 [27:21:10<11:22:34, 4.34s/it] 72%|███████▏ | 24848/34278 [27:21:13<10:20:42, 3.95s/it] {'loss': 0.1326, 'grad_norm': 1.0440809395948463, 'learning_rate': 1.856868868941794e-06, 'epoch': 0.72} 72%|███████▏ | 24848/34278 [27:21:13<10:20:42, 3.95s/it] 72%|███████▏ | 24849/34278 [27:21:18<11:12:23, 4.28s/it] {'loss': 0.1331, 'grad_norm': 0.8446142292096692, 'learning_rate': 1.8565014673521053e-06, 'epoch': 0.72} 72%|███████▏ | 24849/34278 [27:21:18<11:12:23, 4.28s/it] 72%|███████▏ | 24850/34278 [27:21:21<10:19:55, 3.95s/it] {'loss': 0.1386, 'grad_norm': 0.814132853791011, 'learning_rate': 1.8561340938268196e-06, 'epoch': 0.72} 72%|███████▏ | 24850/34278 [27:21:21<10:19:55, 3.95s/it] 72%|███████▏ 
| 24851/34278 [27:21:24<9:44:34, 3.72s/it] {'loss': 0.1274, 'grad_norm': 0.7609068087208614, 'learning_rate': 1.8557667483692193e-06, 'epoch': 0.72} 72%|███████▏ | 24851/34278 [27:21:24<9:44:34, 3.72s/it] 73%|███████▎ | 24852/34278 [27:21:30<10:47:43, 4.12s/it] {'loss': 0.1122, 'grad_norm': 0.9634910209977473, 'learning_rate': 1.8553994309825818e-06, 'epoch': 0.73} 73%|███████▎ | 24852/34278 [27:21:30<10:47:43, 4.12s/it] 73%|███████▎ | 24853/34278 [27:21:35<11:40:26, 4.46s/it] {'loss': 0.114, 'grad_norm': 0.7403786149102356, 'learning_rate': 1.8550321416701888e-06, 'epoch': 0.73} 73%|███████▎ | 24853/34278 [27:21:35<11:40:26, 4.46s/it] 73%|███████▎ | 24854/34278 [27:21:38<10:39:30, 4.07s/it] {'loss': 0.1115, 'grad_norm': 0.8961740815132433, 'learning_rate': 1.8546648804353185e-06, 'epoch': 0.73} 73%|███████▎ | 24854/34278 [27:21:38<10:39:30, 4.07s/it] 73%|███████▎ | 24855/34278 [27:21:41<10:14:37, 3.91s/it] {'loss': 0.0985, 'grad_norm': 0.7385042438001875, 'learning_rate': 1.8542976472812474e-06, 'epoch': 0.73} 73%|███████▎ | 24855/34278 [27:21:42<10:14:37, 3.91s/it] 73%|███████▎ | 24856/34278 [27:21:45<9:35:44, 3.67s/it] {'loss': 0.1389, 'grad_norm': 1.104591922664808, 'learning_rate': 1.8539304422112558e-06, 'epoch': 0.73} 73%|███████▎ | 24856/34278 [27:21:45<9:35:44, 3.67s/it] 73%|███████▎ | 24857/34278 [27:21:48<9:14:23, 3.53s/it] {'loss': 0.1044, 'grad_norm': 0.6378269624568983, 'learning_rate': 1.853563265228624e-06, 'epoch': 0.73} 73%|███████▎ | 24857/34278 [27:21:48<9:14:23, 3.53s/it] 73%|███████▎ | 24858/34278 [27:21:54<11:15:36, 4.30s/it] {'loss': 0.1358, 'grad_norm': 0.8157868387104741, 'learning_rate': 1.853196116336628e-06, 'epoch': 0.73} 73%|███████▎ | 24858/34278 [27:21:54<11:15:36, 4.30s/it] 73%|███████▎ | 24859/34278 [27:21:57<10:10:13, 3.89s/it] {'loss': 0.1112, 'grad_norm': 0.7862279870745205, 'learning_rate': 1.8528289955385443e-06, 'epoch': 0.73} 73%|███████▎ | 24859/34278 [27:21:57<10:10:13, 3.89s/it] 73%|███████▎ | 24860/34278 
[27:22:01<10:38:20, 4.07s/it] {'loss': 0.11, 'grad_norm': 0.9385475456855338, 'learning_rate': 1.8524619028376539e-06, 'epoch': 0.73} 73%|███████▎ | 24860/34278 [27:22:01<10:38:20, 4.07s/it] 73%|███████▎ | 24861/34278 [27:22:07<12:03:42, 4.61s/it] {'loss': 0.1069, 'grad_norm': 0.7027755140495263, 'learning_rate': 1.8520948382372323e-06, 'epoch': 0.73} 73%|███████▎ | 24861/34278 [27:22:07<12:03:42, 4.61s/it] 73%|███████▎ | 24862/34278 [27:22:11<11:34:14, 4.42s/it] {'loss': 0.125, 'grad_norm': 0.8164494945298956, 'learning_rate': 1.8517278017405532e-06, 'epoch': 0.73} 73%|███████▎ | 24862/34278 [27:22:11<11:34:14, 4.42s/it] 73%|███████▎ | 24863/34278 [27:22:17<12:22:16, 4.73s/it] {'loss': 0.1097, 'grad_norm': 0.7517332609638138, 'learning_rate': 1.8513607933508999e-06, 'epoch': 0.73} 73%|███████▎ | 24863/34278 [27:22:17<12:22:16, 4.73s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
73%|███████▎ | 24864/34278 [27:22:20<11:05:21, 4.24s/it] {'loss': 0.0931, 'grad_norm': 0.7674813511763025, 'learning_rate': 1.8509938130715455e-06, 'epoch': 0.73} 73%|███████▎ | 24864/34278 [27:22:20<11:05:21, 4.24s/it] 73%|███████▎ | 24865/34278 [27:22:23<10:15:56, 3.93s/it] {'loss': 0.115, 'grad_norm': 0.8343587845088293, 'learning_rate': 1.8506268609057653e-06, 'epoch': 0.73} 73%|███████▎ | 24865/34278 [27:22:23<10:15:56, 3.93s/it] 73%|███████▎ | 24866/34278 [27:22:26<9:44:58, 3.73s/it] {'loss': 0.1136, 'grad_norm': 0.7310635549617341, 'learning_rate': 1.8502599368568387e-06, 'epoch': 0.73} 73%|███████▎ | 24866/34278 [27:22:26<9:44:58, 3.73s/it] 73%|███████▎ | 24867/34278 [27:22:30<9:36:42, 3.68s/it] {'loss': 0.1036, 'grad_norm': 0.6595938915712233, 'learning_rate': 1.8498930409280392e-06, 'epoch': 0.73} 73%|███████▎ | 24867/34278 [27:22:30<9:36:42, 3.68s/it] 73%|███████▎ | 24868/34278 [27:22:33<9:12:20, 3.52s/it] {'loss': 0.0987, 'grad_norm': 0.7563890394954799, 'learning_rate': 1.8495261731226404e-06, 'epoch': 0.73} 73%|███████▎ | 24868/34278 [27:22:33<9:12:20, 3.52s/it] 73%|███████▎ | 24869/34278 [27:22:36<9:01:26, 3.45s/it] {'loss': 0.1129, 'grad_norm': 0.8098649358955136, 'learning_rate': 1.8491593334439206e-06, 'epoch': 0.73} 73%|███████▎ | 24869/34278 [27:22:36<9:01:26, 3.45s/it] 73%|███████▎ | 24870/34278 [27:22:40<8:56:23, 3.42s/it] {'loss': 0.1011, 'grad_norm': 0.7967892081724012, 'learning_rate': 1.8487925218951553e-06, 'epoch': 0.73} 73%|███████▎ | 24870/34278 [27:22:40<8:56:23, 3.42s/it] 73%|███████▎ | 24871/34278 [27:22:43<8:52:10, 3.39s/it] {'loss': 0.1015, 'grad_norm': 0.9593705090427759, 'learning_rate': 1.8484257384796184e-06, 'epoch': 0.73} 73%|███████▎ | 24871/34278 [27:22:43<8:52:10, 3.39s/it] 73%|███████▎ | 24872/34278 [27:22:46<8:44:24, 3.35s/it] {'loss': 0.1047, 'grad_norm': 0.8414936894015588, 'learning_rate': 1.8480589832005824e-06, 'epoch': 0.73} 73%|███████▎ | 24872/34278 [27:22:46<8:44:24,
3.35s/it] 73%|███████▎ | 24873/34278 [27:22:49<8:45:21, 3.35s/it] {'loss': 0.1357, 'grad_norm': 0.8877625900422822, 'learning_rate': 1.8476922560613247e-06, 'epoch': 0.73} 73%|███████▎ | 24873/34278 [27:22:49<8:45:21, 3.35s/it] 73%|███████▎ | 24874/34278 [27:22:53<8:36:28, 3.30s/it] {'loss': 0.1372, 'grad_norm': 0.9182396538995153, 'learning_rate': 1.8473255570651167e-06, 'epoch': 0.73} 73%|███████▎ | 24874/34278 [27:22:53<8:36:28, 3.30s/it] 73%|███████▎ | 24875/34278 [27:22:56<8:17:44, 3.18s/it] {'loss': 0.1043, 'grad_norm': 0.7526035537212655, 'learning_rate': 1.8469588862152338e-06, 'epoch': 0.73} 73%|███████▎ | 24875/34278 [27:22:56<8:17:44, 3.18s/it] 73%|███████▎ | 24876/34278 [27:22:59<8:27:06, 3.24s/it] {'loss': 0.1259, 'grad_norm': 0.7054903788913072, 'learning_rate': 1.8465922435149502e-06, 'epoch': 0.73} 73%|███████▎ | 24876/34278 [27:22:59<8:27:06, 3.24s/it] 73%|███████▎ | 24877/34278 [27:23:05<10:31:09, 4.03s/it] {'loss': 0.1112, 'grad_norm': 0.8085560954450418, 'learning_rate': 1.846225628967539e-06, 'epoch': 0.73} 73%|███████▎ | 24877/34278 [27:23:05<10:31:09, 4.03s/it] 73%|███████▎ | 24878/34278 [27:23:09<10:34:21, 4.05s/it] {'loss': 0.1112, 'grad_norm': 0.8898010958408633, 'learning_rate': 1.8458590425762707e-06, 'epoch': 0.73} 73%|███████▎ | 24878/34278 [27:23:09<10:34:21, 4.05s/it] 73%|███████▎ | 24879/34278 [27:23:12<9:55:07, 3.80s/it] {'loss': 0.1043, 'grad_norm': 0.6961278063567853, 'learning_rate': 1.8454924843444216e-06, 'epoch': 0.73} 73%|███████▎ | 24879/34278 [27:23:12<9:55:07, 3.80s/it] 73%|███████▎ | 24880/34278 [27:23:18<11:32:12, 4.42s/it] {'loss': 0.1149, 'grad_norm': 0.9587722189224249, 'learning_rate': 1.8451259542752603e-06, 'epoch': 0.73} 73%|███████▎ | 24880/34278 [27:23:18<11:32:12, 4.42s/it] 73%|███████▎ | 24881/34278 [27:23:21<10:44:22, 4.11s/it] {'loss': 0.1154, 'grad_norm': 0.7762580689387995, 'learning_rate': 1.8447594523720636e-06, 'epoch': 0.73} 73%|███████▎ | 24881/34278 [27:23:21<10:44:22, 4.11s/it] 73%|███████▎ | 
24882/34278 [27:23:26<11:01:34, 4.22s/it] {'loss': 0.0861, 'grad_norm': 0.8285094579084039, 'learning_rate': 1.8443929786380994e-06, 'epoch': 0.73} 73%|███████▎ | 24882/34278 [27:23:26<11:01:34, 4.22s/it] 73%|███████▎ | 24883/34278 [27:23:29<10:12:33, 3.91s/it] {'loss': 0.1172, 'grad_norm': 0.7726023454221855, 'learning_rate': 1.8440265330766432e-06, 'epoch': 0.73} 73%|███████▎ | 24883/34278 [27:23:29<10:12:33, 3.91s/it] 73%|███████▎ | 24884/34278 [27:23:33<10:05:49, 3.87s/it] {'loss': 0.1268, 'grad_norm': 0.7417302916526094, 'learning_rate': 1.8436601156909645e-06, 'epoch': 0.73} 73%|███████▎ | 24884/34278 [27:23:33<10:05:49, 3.87s/it] 73%|███████▎ | 24885/34278 [27:23:36<9:21:36, 3.59s/it] {'loss': 0.1074, 'grad_norm': 0.8036061521060736, 'learning_rate': 1.8432937264843338e-06, 'epoch': 0.73} 73%|███████▎ | 24885/34278 [27:23:36<9:21:36, 3.59s/it] 73%|███████▎ | 24886/34278 [27:23:40<10:14:20, 3.92s/it] {'loss': 0.1046, 'grad_norm': 0.8508839085544653, 'learning_rate': 1.8429273654600221e-06, 'epoch': 0.73} 73%|███████▎ | 24886/34278 [27:23:40<10:14:20, 3.92s/it] 73%|███████▎ | 24887/34278 [27:23:43<9:27:11, 3.62s/it] {'loss': 0.1253, 'grad_norm': 0.8193173401699769, 'learning_rate': 1.8425610326213034e-06, 'epoch': 0.73} 73%|███████▎ | 24887/34278 [27:23:43<9:27:11, 3.62s/it] 73%|███████▎ | 24888/34278 [27:23:47<9:43:30, 3.73s/it] {'loss': 0.1248, 'grad_norm': 0.7690229197874942, 'learning_rate': 1.8421947279714464e-06, 'epoch': 0.73} 73%|███████▎ | 24888/34278 [27:23:47<9:43:30, 3.73s/it] 73%|███████▎ | 24889/34278 [27:23:52<10:48:51, 4.15s/it] {'loss': 0.0993, 'grad_norm': 0.8534059239823847, 'learning_rate': 1.8418284515137192e-06, 'epoch': 0.73} 73%|███████▎ | 24889/34278 [27:23:52<10:48:51, 4.15s/it] 73%|███████▎ | 24890/34278 [27:23:57<11:03:49, 4.24s/it] {'loss': 0.1436, 'grad_norm': 1.0321327097637345, 'learning_rate': 1.8414622032513952e-06, 'epoch': 0.73} 73%|███████▎ | 24890/34278 [27:23:57<11:03:49, 4.24s/it] 73%|███████▎ | 24891/34278 
[27:24:00<9:59:15, 3.83s/it] {'loss': 0.1093, 'grad_norm': 0.7811226031964913, 'learning_rate': 1.8410959831877423e-06, 'epoch': 0.73} 73%|███████▎ | 24891/34278 [27:24:00<9:59:15, 3.83s/it] 73%|███████▎ | 24892/34278 [27:24:03<9:39:57, 3.71s/it] {'loss': 0.1217, 'grad_norm': 0.8800829918309522, 'learning_rate': 1.8407297913260274e-06, 'epoch': 0.73} 73%|███████▎ | 24892/34278 [27:24:03<9:39:57, 3.71s/it] 73%|███████▎ | 24893/34278 [27:24:08<10:33:26, 4.05s/it] {'loss': 0.1079, 'grad_norm': 1.0248503224959675, 'learning_rate': 1.8403636276695263e-06, 'epoch': 0.73} 73%|███████▎ | 24893/34278 [27:24:08<10:33:26, 4.05s/it] 73%|███████▎ | 24894/34278 [27:24:11<9:30:24, 3.65s/it] {'loss': 0.107, 'grad_norm': 0.7391089423226811, 'learning_rate': 1.8399974922215042e-06, 'epoch': 0.73} 73%|███████▎ | 24894/34278 [27:24:11<9:30:24, 3.65s/it] 73%|███████▎ | 24895/34278 [27:24:14<9:07:22, 3.50s/it] {'loss': 0.1146, 'grad_norm': 1.0746904798867014, 'learning_rate': 1.8396313849852281e-06, 'epoch': 0.73} 73%|███████▎ | 24895/34278 [27:24:14<9:07:22, 3.50s/it] 73%|███████▎ | 24896/34278 [27:24:17<9:00:27, 3.46s/it] {'loss': 0.116, 'grad_norm': 0.8649387591485509, 'learning_rate': 1.8392653059639709e-06, 'epoch': 0.73} 73%|███████▎ | 24896/34278 [27:24:17<9:00:27, 3.46s/it] 73%|███████▎ | 24897/34278 [27:24:21<8:55:45, 3.43s/it] {'loss': 0.1134, 'grad_norm': 0.9121882526351213, 'learning_rate': 1.838899255160998e-06, 'epoch': 0.73} 73%|███████▎ | 24897/34278 [27:24:21<8:55:45, 3.43s/it] 73%|███████▎ | 24898/34278 [27:24:25<9:19:22, 3.58s/it] {'loss': 0.1362, 'grad_norm': 0.965498288637953, 'learning_rate': 1.838533232579577e-06, 'epoch': 0.73} 73%|███████▎ | 24898/34278 [27:24:25<9:19:22, 3.58s/it] 73%|███████▎ | 24899/34278 [27:24:27<8:48:08, 3.38s/it] {'loss': 0.1197, 'grad_norm': 0.836395609570536, 'learning_rate': 1.838167238222976e-06, 'epoch': 0.73} 73%|███████▎ | 24899/34278 [27:24:27<8:48:08, 3.38s/it] 73%|███████▎ | 24900/34278 [27:24:30<8:30:19, 3.27s/it] {'loss': 
0.1145, 'grad_norm': 0.7118901604503297, 'learning_rate': 1.8378012720944649e-06, 'epoch': 0.73} 73%|███████▎ | 24900/34278 [27:24:30<8:30:19, 3.27s/it] 73%|███████▎ | 24901/34278 [27:24:34<8:24:17, 3.23s/it] {'loss': 0.1266, 'grad_norm': 0.7555779815364517, 'learning_rate': 1.837435334197309e-06, 'epoch': 0.73} 73%|███████▎ | 24901/34278 [27:24:34<8:24:17, 3.23s/it] 73%|███████▎ | 24902/34278 [27:24:38<9:28:36, 3.64s/it] {'loss': 0.1153, 'grad_norm': 0.9109027979813684, 'learning_rate': 1.8370694245347736e-06, 'epoch': 0.73} 73%|███████▎ | 24902/34278 [27:24:38<9:28:36, 3.64s/it] 73%|███████▎ | 24903/34278 [27:24:42<9:55:37, 3.81s/it] {'loss': 0.1309, 'grad_norm': 1.0025817004775293, 'learning_rate': 1.8367035431101293e-06, 'epoch': 0.73} 73%|███████▎ | 24903/34278 [27:24:42<9:55:37, 3.81s/it] 73%|███████▎ | 24904/34278 [27:24:48<10:55:48, 4.20s/it] {'loss': 0.1137, 'grad_norm': 0.8049345831601331, 'learning_rate': 1.8363376899266394e-06, 'epoch': 0.73} 73%|███████▎ | 24904/34278 [27:24:48<10:55:48, 4.20s/it] 73%|███████▎ | 24905/34278 [27:24:51<10:05:05, 3.87s/it] {'loss': 0.1176, 'grad_norm': 0.8043646836958517, 'learning_rate': 1.8359718649875708e-06, 'epoch': 0.73} 73%|███████▎ | 24905/34278 [27:24:51<10:05:05, 3.87s/it] 73%|███████▎ | 24906/34278 [27:24:54<9:29:58, 3.65s/it] {'loss': 0.1536, 'grad_norm': 1.1209706309388336, 'learning_rate': 1.8356060682961918e-06, 'epoch': 0.73} 73%|███████▎ | 24906/34278 [27:24:54<9:29:58, 3.65s/it] 73%|███████▎ | 24907/34278 [27:24:58<9:40:41, 3.72s/it] {'loss': 0.1154, 'grad_norm': 0.7374726980267576, 'learning_rate': 1.8352402998557667e-06, 'epoch': 0.73} 73%|███████▎ | 24907/34278 [27:24:58<9:40:41, 3.72s/it] 73%|███████▎ | 24908/34278 [27:25:04<11:28:01, 4.41s/it] {'loss': 0.0989, 'grad_norm': 0.8924252687802675, 'learning_rate': 1.834874559669559e-06, 'epoch': 0.73} 73%|███████▎ | 24908/34278 [27:25:04<11:28:01, 4.41s/it] 73%|███████▎ | 24909/34278 [27:25:07<10:25:06, 4.00s/it] {'loss': 0.1395, 'grad_norm': 
0.830323596731376, 'learning_rate': 1.8345088477408368e-06, 'epoch': 0.73} 73%|███████▎ | 24909/34278 [27:25:07<10:25:06, 4.00s/it] 73%|███████▎ | 24910/34278 [27:25:10<9:35:55, 3.69s/it] {'loss': 0.1019, 'grad_norm': 1.0699065525042704, 'learning_rate': 1.834143164072863e-06, 'epoch': 0.73} 73%|███████▎ | 24910/34278 [27:25:10<9:35:55, 3.69s/it] 73%|███████▎ | 24911/34278 [27:25:14<9:57:53, 3.83s/it] {'loss': 0.1228, 'grad_norm': 1.0995388157216845, 'learning_rate': 1.8337775086689047e-06, 'epoch': 0.73} 73%|███████▎ | 24911/34278 [27:25:14<9:57:53, 3.83s/it] 73%|███████▎ | 24912/34278 [27:25:17<9:42:33, 3.73s/it] {'loss': 0.1294, 'grad_norm': 0.9635517309064618, 'learning_rate': 1.8334118815322233e-06, 'epoch': 0.73} 73%|███████▎ | 24912/34278 [27:25:17<9:42:33, 3.73s/it] 73%|███████▎ | 24913/34278 [27:25:21<9:17:38, 3.57s/it] {'loss': 0.1157, 'grad_norm': 1.1126481145869898, 'learning_rate': 1.833046282666086e-06, 'epoch': 0.73} 73%|███████▎ | 24913/34278 [27:25:21<9:17:38, 3.57s/it] 73%|███████▎ | 24914/34278 [27:25:23<8:44:40, 3.36s/it] {'loss': 0.1127, 'grad_norm': 1.0521044133004638, 'learning_rate': 1.832680712073756e-06, 'epoch': 0.73} 73%|███████▎ | 24914/34278 [27:25:23<8:44:40, 3.36s/it] 73%|███████▎ | 24915/34278 [27:25:27<9:05:51, 3.50s/it] {'loss': 0.1122, 'grad_norm': 0.8842440041359714, 'learning_rate': 1.8323151697584946e-06, 'epoch': 0.73} 73%|███████▎ | 24915/34278 [27:25:27<9:05:51, 3.50s/it] 73%|███████▎ | 24916/34278 [27:25:30<8:41:13, 3.34s/it] {'loss': 0.1134, 'grad_norm': 0.7412429925750073, 'learning_rate': 1.8319496557235667e-06, 'epoch': 0.73} 73%|███████▎ | 24916/34278 [27:25:30<8:41:13, 3.34s/it] 73%|███████▎ | 24917/34278 [27:25:34<9:06:54, 3.51s/it] {'loss': 0.1233, 'grad_norm': 0.8617024744473915, 'learning_rate': 1.8315841699722386e-06, 'epoch': 0.73} 73%|███████▎ | 24917/34278 [27:25:34<9:06:54, 3.51s/it] 73%|███████▎ | 24918/34278 [27:25:37<8:58:04, 3.45s/it] {'loss': 0.0984, 'grad_norm': 0.7391870650891543, 'learning_rate': 
1.8312187125077703e-06, 'epoch': 0.73} 73%|███████▎ | 24918/34278 [27:25:37<8:58:04, 3.45s/it] 73%|███████▎ | 24919/34278 [27:25:43<11:00:40, 4.24s/it] {'loss': 0.1067, 'grad_norm': 0.8149588234882889, 'learning_rate': 1.830853283333423e-06, 'epoch': 0.73} 73%|███████▎ | 24919/34278 [27:25:43<11:00:40, 4.24s/it] 73%|███████▎ | 24920/34278 [27:25:48<11:25:39, 4.40s/it] {'loss': 0.1382, 'grad_norm': 0.9990044124623102, 'learning_rate': 1.8304878824524625e-06, 'epoch': 0.73} 73%|███████▎ | 24920/34278 [27:25:48<11:25:39, 4.40s/it] 73%|███████▎ | 24921/34278 [27:25:54<12:50:51, 4.94s/it] {'loss': 0.091, 'grad_norm': 0.9153443954209698, 'learning_rate': 1.8301225098681502e-06, 'epoch': 0.73} 73%|███████▎ | 24921/34278 [27:25:54<12:50:51, 4.94s/it] 73%|███████▎ | 24922/34278 [27:25:58<11:24:30, 4.39s/it] {'loss': 0.1143, 'grad_norm': 0.6446411753985442, 'learning_rate': 1.8297571655837437e-06, 'epoch': 0.73} 73%|███████▎ | 24922/34278 [27:25:58<11:24:30, 4.39s/it] 73%|███████▎ | 24923/34278 [27:26:01<10:43:06, 4.12s/it] {'loss': 0.0873, 'grad_norm': 0.7362736806102024, 'learning_rate': 1.829391849602512e-06, 'epoch': 0.73} 73%|███████▎ | 24923/34278 [27:26:01<10:43:06, 4.12s/it] 73%|███████▎ | 24924/34278 [27:26:04<9:43:34, 3.74s/it] {'loss': 0.1237, 'grad_norm': 0.8609760765727609, 'learning_rate': 1.8290265619277125e-06, 'epoch': 0.73} 73%|███████▎ | 24924/34278 [27:26:04<9:43:34, 3.74s/it] 73%|███████▎ | 24925/34278 [27:26:07<9:19:52, 3.59s/it] {'loss': 0.112, 'grad_norm': 1.6472467366255574, 'learning_rate': 1.8286613025626054e-06, 'epoch': 0.73} 73%|███████▎ | 24925/34278 [27:26:07<9:19:52, 3.59s/it] 73%|███████▎ | 24926/34278 [27:26:11<9:46:50, 3.76s/it] {'loss': 0.1197, 'grad_norm': 0.8878191010116272, 'learning_rate': 1.8282960715104553e-06, 'epoch': 0.73} 73%|███████▎ | 24926/34278 [27:26:11<9:46:50, 3.76s/it] 73%|███████▎ | 24927/34278 [27:26:16<10:32:30, 4.06s/it] {'loss': 0.1128, 'grad_norm': 0.8555477765704576, 'learning_rate': 1.82793086877452e-06, 'epoch': 
0.73} 73%|███████▎ | 24927/34278 [27:26:16<10:32:30, 4.06s/it] 73%|███████▎ | 24928/34278 [27:26:19<9:40:51, 3.73s/it] {'loss': 0.1244, 'grad_norm': 0.848971547739859, 'learning_rate': 1.8275656943580594e-06, 'epoch': 0.73} 73%|███████▎ | 24928/34278 [27:26:19<9:40:51, 3.73s/it] 73%|███████▎ | 24929/34278 [27:26:22<9:26:57, 3.64s/it] {'loss': 0.1204, 'grad_norm': 0.7995870031921165, 'learning_rate': 1.8272005482643352e-06, 'epoch': 0.73} 73%|███████▎ | 24929/34278 [27:26:22<9:26:57, 3.64s/it] 73%|███████▎ | 24930/34278 [27:26:28<11:17:43, 4.35s/it] {'loss': 0.1249, 'grad_norm': 0.8450287379984195, 'learning_rate': 1.8268354304966084e-06, 'epoch': 0.73} 73%|███████▎ | 24930/34278 [27:26:29<11:17:43, 4.35s/it] 73%|███████▎ | 24931/34278 [27:26:31<10:03:42, 3.88s/it] {'loss': 0.1092, 'grad_norm': 0.8166409415337273, 'learning_rate': 1.8264703410581375e-06, 'epoch': 0.73} 73%|███████▎ | 24931/34278 [27:26:31<10:03:42, 3.88s/it] 73%|███████▎ | 24932/34278 [27:26:34<9:23:25, 3.62s/it] {'loss': 0.1089, 'grad_norm': 0.8586100750549512, 'learning_rate': 1.82610527995218e-06, 'epoch': 0.73} 73%|███████▎ | 24932/34278 [27:26:34<9:23:25, 3.62s/it] 73%|███████▎ | 24933/34278 [27:26:37<9:04:13, 3.49s/it] {'loss': 0.1217, 'grad_norm': 0.9073711672332692, 'learning_rate': 1.8257402471819991e-06, 'epoch': 0.73} 73%|███████▎ | 24933/34278 [27:26:37<9:04:13, 3.49s/it] 73%|███████▎ | 24934/34278 [27:26:41<9:07:32, 3.52s/it] {'loss': 0.1119, 'grad_norm': 0.8195584608933435, 'learning_rate': 1.8253752427508493e-06, 'epoch': 0.73} 73%|███████▎ | 24934/34278 [27:26:41<9:07:32, 3.52s/it] 73%|███████▎ | 24935/34278 [27:26:44<8:35:34, 3.31s/it] {'loss': 0.0917, 'grad_norm': 1.159609683019026, 'learning_rate': 1.8250102666619917e-06, 'epoch': 0.73} 73%|███████▎ | 24935/34278 [27:26:44<8:35:34, 3.31s/it] 73%|███████▎ | 24936/34278 [27:26:48<9:01:03, 3.47s/it] {'loss': 0.1053, 'grad_norm': 0.7340335281291133, 'learning_rate': 1.8246453189186857e-06, 'epoch': 0.73} 73%|███████▎ | 24936/34278 
[27:26:48<9:01:03, 3.47s/it] 73%|███████▎ | 24937/34278 [27:26:54<10:57:47, 4.23s/it] {'loss': 0.1448, 'grad_norm': 0.855828491858628, 'learning_rate': 1.8242803995241887e-06, 'epoch': 0.73} 73%|███████▎ | 24937/34278 [27:26:54<10:57:47, 4.23s/it] 73%|███████▎ | 24938/34278 [27:26:57<9:56:26, 3.83s/it] {'loss': 0.1476, 'grad_norm': 0.9658714695652126, 'learning_rate': 1.8239155084817567e-06, 'epoch': 0.73} 73%|███████▎ | 24938/34278 [27:26:57<9:56:26, 3.83s/it] 73%|███████▎ | 24939/34278 [27:27:00<9:36:56, 3.71s/it] {'loss': 0.1184, 'grad_norm': 1.1787234210589748, 'learning_rate': 1.8235506457946505e-06, 'epoch': 0.73} 73%|███████▎ | 24939/34278 [27:27:00<9:36:56, 3.71s/it] 73%|███████▎ | 24940/34278 [27:27:03<9:18:59, 3.59s/it] {'loss': 0.1262, 'grad_norm': 0.7264636736826157, 'learning_rate': 1.8231858114661238e-06, 'epoch': 0.73} 73%|███████▎ | 24940/34278 [27:27:03<9:18:59, 3.59s/it] 73%|███████▎ | 24941/34278 [27:27:09<11:18:05, 4.36s/it] {'loss': 0.109, 'grad_norm': 1.0263366576540314, 'learning_rate': 1.8228210054994377e-06, 'epoch': 0.73} 73%|███████▎ | 24941/34278 [27:27:10<11:18:05, 4.36s/it] 73%|███████▎ | 24942/34278 [27:27:13<10:24:23, 4.01s/it] {'loss': 0.096, 'grad_norm': 1.0271232640973176, 'learning_rate': 1.8224562278978452e-06, 'epoch': 0.73} 73%|███████▎ | 24942/34278 [27:27:13<10:24:23, 4.01s/it] 73%|███████▎ | 24943/34278 [27:27:17<10:57:51, 4.23s/it] {'loss': 0.0961, 'grad_norm': 0.8297968739124395, 'learning_rate': 1.8220914786646071e-06, 'epoch': 0.73} 73%|███████▎ | 24943/34278 [27:27:17<10:57:51, 4.23s/it] 73%|███████▎ | 24944/34278 [27:27:20<10:01:57, 3.87s/it] {'loss': 0.1192, 'grad_norm': 0.8402880182245117, 'learning_rate': 1.821726757802978e-06, 'epoch': 0.73} 73%|███████▎ | 24944/34278 [27:27:20<10:01:57, 3.87s/it] 73%|███████▎ | 24945/34278 [27:27:24<9:29:42, 3.66s/it] {'loss': 0.1172, 'grad_norm': 1.144508257056658, 'learning_rate': 1.8213620653162111e-06, 'epoch': 0.73} 73%|███████▎ | 24945/34278 [27:27:24<9:29:42, 3.66s/it] 
73%|███████▎ | 24946/34278 [27:27:27<8:55:23, 3.44s/it] {'loss': 0.1118, 'grad_norm': 1.160513505920864, 'learning_rate': 1.8209974012075654e-06, 'epoch': 0.73}
[... steps 24947-25009 elided, duplicate tqdm progress lines removed: loss 0.09-0.15, grad_norm 0.70-1.37, learning_rate decaying 1.821e-06 -> 1.798e-06, ~3.1-4.6 s/it, epoch 0.73 ...]
73%|███████▎ | 25010/34278 [27:31:21<8:54:03, 3.46s/it] {'loss': 0.1088, 'grad_norm': 0.7462093769197482, 'learning_rate': 1.7977180735402433e-06, 'epoch': 0.73}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f96ac1bafc0>
Failed to fetch sample 3291473.
Exception: cannot identify image file <_io.BytesIO object at 0x7f96ac1bafc0>
73%|███████▎ | 25011/34278 [27:31:25<9:05:20, 3.53s/it] {'loss': 0.1046, 'grad_norm': 0.9067292778305343, 'learning_rate': 1.797355260981161e-06, 'epoch': 0.73}
[... steps 25012-25122 elided, duplicate tqdm progress lines removed: loss 0.09-0.16, grad_norm 0.63-1.25, learning_rate decaying 1.797e-06 -> 1.757e-06, ~3.1-4.9 s/it, epoch 0.73 ...]
73%|███████▎ | 25123/34278 [27:38:31<10:00:15, 3.93s/it] {'loss': 0.113, 'grad_norm': 1.1511242222907994, 'learning_rate': 1.7569440563134e-06, 'epoch': 0.73}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File 
"/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff00c1d8360> Failed to fetch sample 3686875. Exception: cannot identify image file <_io.BytesIO object at 0x7ff00c1d8360> 73%|███████▎ | 25124/34278 [27:38:34<9:12:45, 3.62s/it] {'loss': 0.1043, 'grad_norm': 0.8570069548126423, 'learning_rate': 1.7565423828792971e-06, 'epoch': 0.73} 73%|███████▎ | 25124/34278 [27:38:34<9:12:45, 3.62s/it] 73%|███████▎ | 25125/34278 [27:38:38<9:40:05, 3.80s/it] {'loss': 0.1092, 'grad_norm': 0.8266584965000707, 'learning_rate': 1.7561828506590944e-06, 'epoch': 0.73} 73%|███████▎ | 25125/34278 [27:38:38<9:40:05, 3.80s/it] 73%|███████▎ | 25126/34278 [27:38:44<11:22:18, 4.47s/it] {'loss': 0.1058, 'grad_norm': 0.7430619098612392, 'learning_rate': 1.7558233473989172e-06, 'epoch': 0.73} 73%|███████▎ | 25126/34278 [27:38:44<11:22:18, 4.47s/it] 73%|███████▎ | 25127/34278 [27:38:48<10:49:19, 4.26s/it] {'loss': 0.1212, 'grad_norm': 1.0697237572037825, 'learning_rate': 1.7554638731019757e-06, 'epoch': 0.73} 73%|███████▎ | 25127/34278 [27:38:48<10:49:19, 4.26s/it] 73%|███████▎ | 25128/34278 [27:38:53<11:44:03, 4.62s/it] {'loss': 0.0987, 'grad_norm': 0.7909565495721145, 'learning_rate': 1.755104427771479e-06, 'epoch': 0.73} 73%|███████▎ | 25128/34278 [27:38:53<11:44:03, 4.62s/it] 73%|███████▎ | 25129/34278 [27:38:56<10:38:55, 4.19s/it] {'loss': 0.1182, 'grad_norm': 0.8600045561286982, 'learning_rate': 1.7547450114106335e-06, 'epoch': 0.73} 73%|███████▎ | 25129/34278 [27:38:56<10:38:55, 4.19s/it] 73%|███████▎ | 25130/34278 [27:38:59<9:46:45, 3.85s/it] {'loss': 0.1266, 'grad_norm': 0.7879225465652433, 'learning_rate': 1.754385624022651e-06, 'epoch': 0.73} 73%|███████▎ | 25130/34278 [27:38:59<9:46:45, 3.85s/it] 73%|███████▎ | 25131/34278 [27:39:03<9:51:05, 3.88s/it] {'loss': 0.1237, 'grad_norm': 
0.9075453655824288, 'learning_rate': 1.7540262656107376e-06, 'epoch': 0.73} 73%|███████▎ | 25131/34278 [27:39:03<9:51:05, 3.88s/it] 73%|███████▎ | 25132/34278 [27:39:06<8:59:28, 3.54s/it] {'loss': 0.1029, 'grad_norm': 0.8839351725820979, 'learning_rate': 1.7536669361781028e-06, 'epoch': 0.73} 73%|███████▎ | 25132/34278 [27:39:06<8:59:28, 3.54s/it] 73%|███████▎ | 25133/34278 [27:39:09<8:46:49, 3.46s/it] {'loss': 0.1175, 'grad_norm': 0.7682383820730142, 'learning_rate': 1.753307635727956e-06, 'epoch': 0.73} 73%|███████▎ | 25133/34278 [27:39:09<8:46:49, 3.46s/it] 73%|███████▎ | 25134/34278 [27:39:12<8:30:54, 3.35s/it] {'loss': 0.1293, 'grad_norm': 0.7750322859507258, 'learning_rate': 1.7529483642635042e-06, 'epoch': 0.73} 73%|███████▎ | 25134/34278 [27:39:12<8:30:54, 3.35s/it] 73%|███████▎ | 25135/34278 [27:39:15<8:15:44, 3.25s/it] {'loss': 0.1144, 'grad_norm': 1.0730597542982712, 'learning_rate': 1.752589121787952e-06, 'epoch': 0.73} 73%|███████▎ | 25135/34278 [27:39:15<8:15:44, 3.25s/it] 73%|███████▎ | 25136/34278 [27:39:21<10:16:41, 4.05s/it] {'loss': 0.1178, 'grad_norm': 1.0052931011938175, 'learning_rate': 1.7522299083045109e-06, 'epoch': 0.73} 73%|███████▎ | 25136/34278 [27:39:21<10:16:41, 4.05s/it] 73%|███████▎ | 25137/34278 [27:39:26<10:42:33, 4.22s/it] {'loss': 0.1164, 'grad_norm': 0.7550743786223016, 'learning_rate': 1.751870723816384e-06, 'epoch': 0.73} 73%|███████▎ | 25137/34278 [27:39:26<10:42:33, 4.22s/it] 73%|███████▎ | 25138/34278 [27:39:29<9:34:23, 3.77s/it] {'loss': 0.1176, 'grad_norm': 0.8236347357126722, 'learning_rate': 1.7515115683267818e-06, 'epoch': 0.73} 73%|███████▎ | 25138/34278 [27:39:29<9:34:23, 3.77s/it] 73%|███████▎ | 25139/34278 [27:39:32<8:51:00, 3.49s/it] {'loss': 0.1096, 'grad_norm': 0.9729512647434295, 'learning_rate': 1.751152441838907e-06, 'epoch': 0.73} 73%|███████▎ | 25139/34278 [27:39:32<8:51:00, 3.49s/it] 73%|███████▎ | 25140/34278 [27:39:35<9:10:56, 3.62s/it] {'loss': 0.1069, 'grad_norm': 0.8616297450036582, 'learning_rate': 
1.7507933443559694e-06, 'epoch': 0.73} 73%|███████▎ | 25140/34278 [27:39:35<9:10:56, 3.62s/it] 73%|███████▎ | 25141/34278 [27:39:39<9:15:05, 3.65s/it] {'loss': 0.1131, 'grad_norm': 0.8177603369689589, 'learning_rate': 1.7504342758811732e-06, 'epoch': 0.73} 73%|███████▎ | 25141/34278 [27:39:39<9:15:05, 3.65s/it] 73%|███████▎ | 25142/34278 [27:39:43<9:09:06, 3.61s/it] {'loss': 0.1174, 'grad_norm': 0.7055999696259391, 'learning_rate': 1.750075236417722e-06, 'epoch': 0.73} 73%|███████▎ | 25142/34278 [27:39:43<9:09:06, 3.61s/it] 73%|███████▎ | 25143/34278 [27:39:47<9:23:46, 3.70s/it] {'loss': 0.1383, 'grad_norm': 0.8013556770362623, 'learning_rate': 1.7497162259688238e-06, 'epoch': 0.73} 73%|███████▎ | 25143/34278 [27:39:47<9:23:46, 3.70s/it] 73%|███████▎ | 25144/34278 [27:39:53<11:05:03, 4.37s/it] {'loss': 0.1082, 'grad_norm': 0.7566002384298631, 'learning_rate': 1.7493572445376845e-06, 'epoch': 0.73} 73%|███████▎ | 25144/34278 [27:39:53<11:05:03, 4.37s/it] 73%|███████▎ | 25145/34278 [27:39:55<9:59:57, 3.94s/it] {'loss': 0.1188, 'grad_norm': 0.7498020039586966, 'learning_rate': 1.7489982921275077e-06, 'epoch': 0.73} 73%|███████▎ | 25145/34278 [27:39:55<9:59:57, 3.94s/it] 73%|███████▎ | 25146/34278 [27:39:59<9:34:49, 3.78s/it] {'loss': 0.1182, 'grad_norm': 0.7120447492160659, 'learning_rate': 1.748639368741497e-06, 'epoch': 0.73} 73%|███████▎ | 25146/34278 [27:39:59<9:34:49, 3.78s/it] 73%|███████▎ | 25147/34278 [27:40:02<9:27:28, 3.73s/it] {'loss': 0.1074, 'grad_norm': 0.7978790389234697, 'learning_rate': 1.748280474382859e-06, 'epoch': 0.73} 73%|███████▎ | 25147/34278 [27:40:03<9:27:28, 3.73s/it] 73%|███████▎ | 25148/34278 [27:40:06<8:57:53, 3.53s/it] {'loss': 0.1378, 'grad_norm': 0.8372600013761905, 'learning_rate': 1.7479216090547952e-06, 'epoch': 0.73} 73%|███████▎ | 25148/34278 [27:40:06<8:57:53, 3.53s/it] 73%|███████▎ | 25149/34278 [27:40:09<9:01:58, 3.56s/it] {'loss': 0.1008, 'grad_norm': 0.7960443512718854, 'learning_rate': 1.747562772760511e-06, 'epoch': 0.73} 
73%|███████▎ | 25149/34278 [27:40:09<9:01:58, 3.56s/it] 73%|███████▎ | 25150/34278 [27:40:14<9:38:10, 3.80s/it] {'loss': 0.123, 'grad_norm': 0.8518957134997182, 'learning_rate': 1.7472039655032113e-06, 'epoch': 0.73} 73%|███████▎ | 25150/34278 [27:40:14<9:38:10, 3.80s/it] 73%|███████▎ | 25151/34278 [27:40:19<10:32:46, 4.16s/it] {'loss': 0.1102, 'grad_norm': 0.835281475119563, 'learning_rate': 1.7468451872860986e-06, 'epoch': 0.73} 73%|███████▎ | 25151/34278 [27:40:19<10:32:46, 4.16s/it] 73%|███████▎ | 25152/34278 [27:40:23<10:38:27, 4.20s/it] {'loss': 0.1209, 'grad_norm': 0.8287815632191993, 'learning_rate': 1.746486438112373e-06, 'epoch': 0.73} 73%|███████▎ | 25152/34278 [27:40:23<10:38:27, 4.20s/it] 73%|███████▎ | 25153/34278 [27:40:26<10:08:00, 4.00s/it] {'loss': 0.0999, 'grad_norm': 0.8533193339576118, 'learning_rate': 1.746127717985242e-06, 'epoch': 0.73} 73%|███████▎ | 25153/34278 [27:40:26<10:08:00, 4.00s/it] 73%|███████▎ | 25154/34278 [27:40:31<10:14:18, 4.04s/it] {'loss': 0.0973, 'grad_norm': 0.6564936950655875, 'learning_rate': 1.7457690269079047e-06, 'epoch': 0.73} 73%|███████▎ | 25154/34278 [27:40:31<10:14:18, 4.04s/it] 73%|███████▎ | 25155/34278 [27:40:37<11:47:17, 4.65s/it] {'loss': 0.1268, 'grad_norm': 1.1398355247475862, 'learning_rate': 1.7454103648835656e-06, 'epoch': 0.73} 73%|███████▎ | 25155/34278 [27:40:37<11:47:17, 4.65s/it] 73%|███████▎ | 25156/34278 [27:40:39<10:27:08, 4.13s/it] {'loss': 0.117, 'grad_norm': 1.463932068213724, 'learning_rate': 1.7450517319154247e-06, 'epoch': 0.73} 73%|███████▎ | 25156/34278 [27:40:39<10:27:08, 4.13s/it] 73%|███████▎ | 25157/34278 [27:40:43<10:03:32, 3.97s/it] {'loss': 0.1307, 'grad_norm': 1.0145696826910793, 'learning_rate': 1.7446931280066865e-06, 'epoch': 0.73} 73%|███████▎ | 25157/34278 [27:40:43<10:03:32, 3.97s/it] 73%|███████▎ | 25158/34278 [27:40:47<10:17:30, 4.06s/it] {'loss': 0.1153, 'grad_norm': 1.158387489395097, 'learning_rate': 1.7443345531605505e-06, 'epoch': 0.73} 73%|███████▎ | 25158/34278 
[27:40:47<10:17:30, 4.06s/it] 73%|███████▎ | 25159/34278 [27:40:54<11:53:21, 4.69s/it] {'loss': 0.1181, 'grad_norm': 1.115998650436188, 'learning_rate': 1.743976007380217e-06, 'epoch': 0.73} 73%|███████▎ | 25159/34278 [27:40:54<11:53:21, 4.69s/it] 73%|███████▎ | 25160/34278 [27:40:57<10:38:51, 4.20s/it] {'loss': 0.1408, 'grad_norm': 0.8625697536330599, 'learning_rate': 1.7436174906688886e-06, 'epoch': 0.73} 73%|███████▎ | 25160/34278 [27:40:57<10:38:51, 4.20s/it] 73%|███████▎ | 25161/34278 [27:40:59<9:38:30, 3.81s/it] {'loss': 0.1307, 'grad_norm': 0.8435931153180534, 'learning_rate': 1.7432590030297674e-06, 'epoch': 0.73} 73%|███████▎ | 25161/34278 [27:40:59<9:38:30, 3.81s/it] 73%|███████▎ | 25162/34278 [27:41:03<9:36:51, 3.80s/it] {'loss': 0.1177, 'grad_norm': 1.1569062914021235, 'learning_rate': 1.7429005444660508e-06, 'epoch': 0.73} 73%|███████▎ | 25162/34278 [27:41:03<9:36:51, 3.80s/it] 73%|███████▎ | 25163/34278 [27:41:07<9:14:10, 3.65s/it] {'loss': 0.1002, 'grad_norm': 1.1099160223533906, 'learning_rate': 1.7425421149809424e-06, 'epoch': 0.73} 73%|███████▎ | 25163/34278 [27:41:07<9:14:10, 3.65s/it] 73%|███████▎ | 25164/34278 [27:41:09<8:34:27, 3.39s/it] {'loss': 0.1209, 'grad_norm': 1.2895727092868505, 'learning_rate': 1.7421837145776399e-06, 'epoch': 0.73} 73%|███████▎ | 25164/34278 [27:41:09<8:34:27, 3.39s/it] 73%|███████▎ | 25165/34278 [27:41:13<8:26:42, 3.34s/it] {'loss': 0.1126, 'grad_norm': 0.7167036218626416, 'learning_rate': 1.7418253432593423e-06, 'epoch': 0.73} 73%|███████▎ | 25165/34278 [27:41:13<8:26:42, 3.34s/it] 73%|███████▎ | 25166/34278 [27:41:16<8:26:37, 3.34s/it] {'loss': 0.1005, 'grad_norm': 0.7890719376869357, 'learning_rate': 1.74146700102925e-06, 'epoch': 0.73} 73%|███████▎ | 25166/34278 [27:41:16<8:26:37, 3.34s/it] 73%|███████▎ | 25167/34278 [27:41:19<7:57:56, 3.15s/it] {'loss': 0.1112, 'grad_norm': 0.8660126753722375, 'learning_rate': 1.741108687890564e-06, 'epoch': 0.73} 73%|███████▎ | 25167/34278 [27:41:19<7:57:56, 3.15s/it] 
73%|███████▎ | 25168/34278 [27:41:23<9:11:37, 3.63s/it] {'loss': 0.1098, 'grad_norm': 1.0932106365105552, 'learning_rate': 1.7407504038464818e-06, 'epoch': 0.73} 73%|███████▎ | 25168/34278 [27:41:23<9:11:37, 3.63s/it] 73%|███████▎ | 25169/34278 [27:41:26<8:45:40, 3.46s/it] {'loss': 0.128, 'grad_norm': 0.9895097506666662, 'learning_rate': 1.7403921489002008e-06, 'epoch': 0.73} 73%|███████▎ | 25169/34278 [27:41:26<8:45:40, 3.46s/it] 73%|███████▎ | 25170/34278 [27:41:32<10:09:48, 4.02s/it] {'loss': 0.1228, 'grad_norm': 0.7495357288382063, 'learning_rate': 1.7400339230549212e-06, 'epoch': 0.73} 73%|███████▎ | 25170/34278 [27:41:32<10:09:48, 4.02s/it] 73%|███████▎ | 25171/34278 [27:41:36<10:40:56, 4.22s/it] {'loss': 0.106, 'grad_norm': 0.7804076536488669, 'learning_rate': 1.7396757263138415e-06, 'epoch': 0.73} 73%|███████▎ | 25171/34278 [27:41:36<10:40:56, 4.22s/it] 73%|███████▎ | 25172/34278 [27:41:39<9:47:16, 3.87s/it] {'loss': 0.1157, 'grad_norm': 0.8424890119681546, 'learning_rate': 1.7393175586801564e-06, 'epoch': 0.73} 73%|███████▎ | 25172/34278 [27:41:39<9:47:16, 3.87s/it] 73%|███████▎ | 25173/34278 [27:41:44<10:34:28, 4.18s/it] {'loss': 0.1095, 'grad_norm': 0.9289119281859283, 'learning_rate': 1.738959420157066e-06, 'epoch': 0.73} 73%|███████▎ | 25173/34278 [27:41:44<10:34:28, 4.18s/it] 73%|███████▎ | 25174/34278 [27:41:48<10:11:56, 4.03s/it] {'loss': 0.1034, 'grad_norm': 0.9027837895824324, 'learning_rate': 1.738601310747769e-06, 'epoch': 0.73} 73%|███████▎ | 25174/34278 [27:41:48<10:11:56, 4.03s/it] 73%|███████▎ | 25175/34278 [27:41:54<11:41:03, 4.62s/it] {'loss': 0.1128, 'grad_norm': 0.8428337517467128, 'learning_rate': 1.7382432304554609e-06, 'epoch': 0.73} 73%|███████▎ | 25175/34278 [27:41:54<11:41:03, 4.62s/it] 73%|███████▎ | 25176/34278 [27:41:58<11:06:56, 4.40s/it] {'loss': 0.1119, 'grad_norm': 0.9047097875695138, 'learning_rate': 1.7378851792833368e-06, 'epoch': 0.73} 73%|███████▎ | 25176/34278 [27:41:58<11:06:56, 4.40s/it] 73%|███████▎ | 25177/34278 
[27:42:01<10:27:48, 4.14s/it] {'loss': 0.1248, 'grad_norm': 0.8795970607848003, 'learning_rate': 1.737527157234597e-06, 'epoch': 0.73} 73%|███████▎ | 25177/34278 [27:42:01<10:27:48, 4.14s/it] 73%|███████▎ | 25178/34278 [27:42:04<9:32:52, 3.78s/it] {'loss': 0.1036, 'grad_norm': 0.7990268085180928, 'learning_rate': 1.7371691643124338e-06, 'epoch': 0.73} 73%|███████▎ | 25178/34278 [27:42:04<9:32:52, 3.78s/it] 73%|███████▎ | 25179/34278 [27:42:08<9:04:56, 3.59s/it] {'loss': 0.1164, 'grad_norm': 0.793030213764276, 'learning_rate': 1.736811200520046e-06, 'epoch': 0.73} 73%|███████▎ | 25179/34278 [27:42:08<9:04:56, 3.59s/it] 73%|███████▎ | 25180/34278 [27:42:11<8:35:29, 3.40s/it] {'loss': 0.1253, 'grad_norm': 1.0068930119539534, 'learning_rate': 1.7364532658606304e-06, 'epoch': 0.73} 73%|███████▎ | 25180/34278 [27:42:11<8:35:29, 3.40s/it] 73%|███████▎ | 25181/34278 [27:42:14<8:25:40, 3.34s/it] {'loss': 0.1095, 'grad_norm': 0.800583865149948, 'learning_rate': 1.736095360337381e-06, 'epoch': 0.73} 73%|███████▎ | 25181/34278 [27:42:14<8:25:40, 3.34s/it] 73%|███████▎ | 25182/34278 [27:42:17<8:12:43, 3.25s/it] {'loss': 0.0936, 'grad_norm': 1.045965267800114, 'learning_rate': 1.7357374839534907e-06, 'epoch': 0.73} 73%|███████▎ | 25182/34278 [27:42:17<8:12:43, 3.25s/it] 73%|███████▎ | 25183/34278 [27:42:20<8:01:42, 3.18s/it] {'loss': 0.1062, 'grad_norm': 1.3238259145904048, 'learning_rate': 1.7353796367121594e-06, 'epoch': 0.73} 73%|███████▎ | 25183/34278 [27:42:20<8:01:42, 3.18s/it] 73%|███████▎ | 25184/34278 [27:42:26<10:03:34, 3.98s/it] {'loss': 0.129, 'grad_norm': 0.8951687075971192, 'learning_rate': 1.7350218186165774e-06, 'epoch': 0.73} 73%|███████▎ | 25184/34278 [27:42:26<10:03:34, 3.98s/it] 73%|███████▎ | 25185/34278 [27:42:29<9:20:56, 3.70s/it] {'loss': 0.1017, 'grad_norm': 0.9953421898794453, 'learning_rate': 1.7346640296699424e-06, 'epoch': 0.73} 73%|███████▎ | 25185/34278 [27:42:29<9:20:56, 3.70s/it] 73%|███████▎ | 25186/34278 [27:42:32<8:52:26, 3.51s/it] {'loss': 
0.1061, 'grad_norm': 2.017025881446626, 'learning_rate': 1.7343062698754465e-06, 'epoch': 0.73} 73%|███████▎ | 25186/34278 [27:42:32<8:52:26, 3.51s/it] 73%|███████▎ | 25187/34278 [27:42:35<8:20:14, 3.30s/it] {'loss': 0.1083, 'grad_norm': 0.6387944226869747, 'learning_rate': 1.733948539236286e-06, 'epoch': 0.73} 73%|███████▎ | 25187/34278 [27:42:35<8:20:14, 3.30s/it] 73%|███████▎ | 25188/34278 [27:42:38<8:31:47, 3.38s/it] {'loss': 0.1064, 'grad_norm': 1.7999285802911165, 'learning_rate': 1.7335908377556533e-06, 'epoch': 0.73} 73%|███████▎ | 25188/34278 [27:42:38<8:31:47, 3.38s/it] 73%|███████▎ | 25189/34278 [27:42:41<8:21:58, 3.31s/it] {'loss': 0.1074, 'grad_norm': 0.8051056740792877, 'learning_rate': 1.73323316543674e-06, 'epoch': 0.73} 73%|███████▎ | 25189/34278 [27:42:41<8:21:58, 3.31s/it] 73%|███████▎ | 25190/34278 [27:42:45<8:24:22, 3.33s/it] {'loss': 0.1384, 'grad_norm': 0.7310949681442523, 'learning_rate': 1.7328755222827414e-06, 'epoch': 0.73} 73%|███████▎ | 25190/34278 [27:42:45<8:24:22, 3.33s/it] 73%|███████▎ | 25191/34278 [27:42:48<8:05:56, 3.21s/it] {'loss': 0.1124, 'grad_norm': 0.836118745669304, 'learning_rate': 1.732517908296852e-06, 'epoch': 0.73} 73%|███████▎ | 25191/34278 [27:42:48<8:05:56, 3.21s/it] 73%|███████▎ | 25192/34278 [27:42:53<9:40:56, 3.84s/it] {'loss': 0.1232, 'grad_norm': 0.8581381736726805, 'learning_rate': 1.7321603234822608e-06, 'epoch': 0.73} 73%|███████▎ | 25192/34278 [27:42:53<9:40:56, 3.84s/it] 73%|███████▎ | 25193/34278 [27:42:56<9:27:19, 3.75s/it] {'loss': 0.0923, 'grad_norm': 0.7946901559174058, 'learning_rate': 1.7318027678421638e-06, 'epoch': 0.73} 73%|███████▎ | 25193/34278 [27:42:56<9:27:19, 3.75s/it] 73%|███████▎ | 25194/34278 [27:42:59<8:57:28, 3.55s/it] {'loss': 0.1269, 'grad_norm': 0.987180674950145, 'learning_rate': 1.7314452413797517e-06, 'epoch': 0.73} 73%|███████▎ | 25194/34278 [27:43:00<8:57:28, 3.55s/it] 74%|███████▎ | 25195/34278 [27:43:03<8:43:07, 3.46s/it] {'loss': 0.1069, 'grad_norm': 0.8264194715410034, 
'learning_rate': 1.7310877440982144e-06, 'epoch': 0.74} 74%|███████▎ | 25195/34278 [27:43:03<8:43:07, 3.46s/it] 74%|███████▎ | 25196/34278 [27:43:08<9:58:46, 3.96s/it] {'loss': 0.1106, 'grad_norm': 0.6918364701282613, 'learning_rate': 1.730730276000745e-06, 'epoch': 0.74} 74%|███████▎ | 25196/34278 [27:43:08<9:58:46, 3.96s/it] 74%|███████▎ | 25197/34278 [27:43:12<9:44:54, 3.86s/it] {'loss': 0.1043, 'grad_norm': 0.6190853723432631, 'learning_rate': 1.7303728370905377e-06, 'epoch': 0.74} 74%|███████▎ | 25197/34278 [27:43:12<9:44:54, 3.86s/it] 74%|███████▎ | 25198/34278 [27:43:15<9:10:23, 3.64s/it] {'loss': 0.117, 'grad_norm': 0.7734153098972022, 'learning_rate': 1.7300154273707803e-06, 'epoch': 0.74} 74%|███████▎ | 25198/34278 [27:43:15<9:10:23, 3.64s/it] 74%|███████▎ | 25199/34278 [27:43:18<8:41:22, 3.45s/it] {'loss': 0.1432, 'grad_norm': 1.5546059569207948, 'learning_rate': 1.7296580468446638e-06, 'epoch': 0.74} 74%|███████▎ | 25199/34278 [27:43:18<8:41:22, 3.45s/it] 74%|███████▎ | 25200/34278 [27:43:21<8:33:40, 3.40s/it] {'loss': 0.1054, 'grad_norm': 0.8826353310839861, 'learning_rate': 1.7293006955153808e-06, 'epoch': 0.74} 74%|███████▎ | 25200/34278 [27:43:21<8:33:40, 3.40s/it] 74%|███████▎ | 25201/34278 [27:43:26<9:54:28, 3.93s/it] {'loss': 0.1189, 'grad_norm': 0.8863678540996224, 'learning_rate': 1.7289433733861206e-06, 'epoch': 0.74} 74%|███████▎ | 25201/34278 [27:43:26<9:54:28, 3.93s/it] 74%|███████▎ | 25202/34278 [27:43:29<9:14:03, 3.66s/it] {'loss': 0.1327, 'grad_norm': 0.7047809478398082, 'learning_rate': 1.7285860804600708e-06, 'epoch': 0.74} 74%|███████▎ | 25202/34278 [27:43:29<9:14:03, 3.66s/it] 74%|███████▎ | 25203/34278 [27:43:32<8:45:50, 3.48s/it] {'loss': 0.118, 'grad_norm': 0.9617434252540195, 'learning_rate': 1.7282288167404243e-06, 'epoch': 0.74} 74%|███████▎ | 25203/34278 [27:43:32<8:45:50, 3.48s/it] 74%|███████▎ | 25204/34278 [27:43:37<9:30:46, 3.77s/it] {'loss': 0.1262, 'grad_norm': 0.6185126927262421, 'learning_rate': 1.727871582230371e-06, 
'epoch': 0.74} 74%|███████▎ | 25204/34278 [27:43:37<9:30:46, 3.77s/it] 74%|███████▎ | 25205/34278 [27:43:40<9:17:47, 3.69s/it] {'loss': 0.1045, 'grad_norm': 0.7054006349378011, 'learning_rate': 1.7275143769330994e-06, 'epoch': 0.74} 74%|███████▎ | 25205/34278 [27:43:40<9:17:47, 3.69s/it] 74%|███████▎ | 25206/34278 [27:43:43<8:50:45, 3.51s/it] {'loss': 0.1218, 'grad_norm': 0.7796769073497872, 'learning_rate': 1.7271572008517968e-06, 'epoch': 0.74} 74%|███████▎ | 25206/34278 [27:43:43<8:50:45, 3.51s/it] 74%|███████▎ | 25207/34278 [27:43:47<8:47:40, 3.49s/it] {'loss': 0.1114, 'grad_norm': 0.788207880874118, 'learning_rate': 1.7268000539896545e-06, 'epoch': 0.74} 74%|███████▎ | 25207/34278 [27:43:47<8:47:40, 3.49s/it] 74%|███████▎ | 25208/34278 [27:43:52<10:33:46, 4.19s/it] {'loss': 0.1444, 'grad_norm': 0.8660248515564732, 'learning_rate': 1.7264429363498587e-06, 'epoch': 0.74} 74%|███████▎ | 25208/34278 [27:43:52<10:33:46, 4.19s/it] 74%|███████▎ | 25209/34278 [27:43:56<9:56:32, 3.95s/it] {'loss': 0.1372, 'grad_norm': 0.7349303066376833, 'learning_rate': 1.7260858479355986e-06, 'epoch': 0.74} 74%|███████▎ | 25209/34278 [27:43:56<9:56:32, 3.95s/it] 74%|███████▎ | 25210/34278 [27:43:59<9:15:39, 3.68s/it] {'loss': 0.1188, 'grad_norm': 0.6860352387906503, 'learning_rate': 1.7257287887500645e-06, 'epoch': 0.74} 74%|███████▎ | 25210/34278 [27:43:59<9:15:39, 3.68s/it] 74%|███████▎ | 25211/34278 [27:44:02<8:56:53, 3.55s/it] {'loss': 0.0989, 'grad_norm': 0.8315110432364982, 'learning_rate': 1.7253717587964419e-06, 'epoch': 0.74} 74%|███████▎ | 25211/34278 [27:44:02<8:56:53, 3.55s/it] 74%|███████▎ | 25212/34278 [27:44:05<8:38:28, 3.43s/it] {'loss': 0.1108, 'grad_norm': 0.8596638787874854, 'learning_rate': 1.725014758077917e-06, 'epoch': 0.74} 74%|███████▎ | 25212/34278 [27:44:05<8:38:28, 3.43s/it] 74%|███████▎ | 25213/34278 [27:44:08<8:13:31, 3.27s/it] {'loss': 0.111, 'grad_norm': 0.752770762716311, 'learning_rate': 1.72465778659768e-06, 'epoch': 0.74} 74%|███████▎ | 25213/34278 
[27:44:08<8:13:31, 3.27s/it] 74%|███████▎ | 25214/34278 [27:44:11<8:07:32, 3.23s/it] {'loss': 0.1021, 'grad_norm': 0.6867116489178011, 'learning_rate': 1.7243008443589148e-06, 'epoch': 0.74} 74%|███████▎ | 25214/34278 [27:44:11<8:07:32, 3.23s/it] 74%|███████▎ | 25215/34278 [27:44:15<8:17:57, 3.30s/it] {'loss': 0.098, 'grad_norm': 0.7910658847762618, 'learning_rate': 1.7239439313648115e-06, 'epoch': 0.74} 74%|███████▎ | 25215/34278 [27:44:15<8:17:57, 3.30s/it] 74%|███████▎ | 25216/34278 [27:44:19<8:40:15, 3.44s/it] {'loss': 0.1053, 'grad_norm': 0.7865668459400588, 'learning_rate': 1.7235870476185528e-06, 'epoch': 0.74} 74%|███████▎ | 25216/34278 [27:44:19<8:40:15, 3.44s/it] 74%|███████▎ | 25217/34278 [27:44:22<9:01:06, 3.58s/it] {'loss': 0.113, 'grad_norm': 0.7660660298399999, 'learning_rate': 1.7232301931233287e-06, 'epoch': 0.74} 74%|███████▎ | 25217/34278 [27:44:23<9:01:06, 3.58s/it] 74%|███████▎ | 25218/34278 [27:44:26<8:52:57, 3.53s/it] {'loss': 0.1273, 'grad_norm': 0.774868231792155, 'learning_rate': 1.7228733678823234e-06, 'epoch': 0.74} 74%|███████▎ | 25218/34278 [27:44:26<8:52:57, 3.53s/it] 74%|███████▎ | 25219/34278 [27:44:29<8:40:08, 3.45s/it] {'loss': 0.1022, 'grad_norm': 0.6727820209920885, 'learning_rate': 1.7225165718987203e-06, 'epoch': 0.74} 74%|███████▎ | 25219/34278 [27:44:29<8:40:08, 3.45s/it] 74%|███████▎ | 25220/34278 [27:44:33<9:10:06, 3.64s/it] {'loss': 0.1196, 'grad_norm': 0.8086784982776921, 'learning_rate': 1.7221598051757066e-06, 'epoch': 0.74} 74%|███████▎ | 25220/34278 [27:44:33<9:10:06, 3.64s/it] 74%|███████▎ | 25221/34278 [27:44:37<8:53:46, 3.54s/it] {'loss': 0.1384, 'grad_norm': 0.9249791247127002, 'learning_rate': 1.7218030677164698e-06, 'epoch': 0.74} 74%|███████▎ | 25221/34278 [27:44:37<8:53:46, 3.54s/it] 74%|███████▎ | 25222/34278 [27:44:40<8:51:38, 3.52s/it] {'loss': 0.1221, 'grad_norm': 0.7257236343971472, 'learning_rate': 1.7214463595241909e-06, 'epoch': 0.74} 74%|███████▎ | 25222/34278 [27:44:40<8:51:38, 3.52s/it] 
74%|███████▎ | 25223/34278 [27:44:44<8:55:30, 3.55s/it] {'loss': 0.122, 'grad_norm': 0.7856001053922799, 'learning_rate': 1.7210896806020583e-06, 'epoch': 0.74} 74%|███████▎ | 25223/34278 [27:44:44<8:55:30, 3.55s/it] 74%|███████▎ | 25224/34278 [27:44:50<10:40:52, 4.25s/it] {'loss': 0.111, 'grad_norm': 1.2352753135807606, 'learning_rate': 1.720733030953254e-06, 'epoch': 0.74} 74%|███████▎ | 25224/34278 [27:44:50<10:40:52, 4.25s/it] 74%|███████▎ | 25225/34278 [27:44:52<9:31:24, 3.79s/it] {'loss': 0.1101, 'grad_norm': 0.8579676431534605, 'learning_rate': 1.72037641058096e-06, 'epoch': 0.74} 74%|███████▎ | 25225/34278 [27:44:52<9:31:24, 3.79s/it] 74%|███████▎ | 25226/34278 [27:44:56<9:25:37, 3.75s/it] {'loss': 0.1289, 'grad_norm': 0.8398343325198167, 'learning_rate': 1.7200198194883632e-06, 'epoch': 0.74} 74%|███████▎ | 25226/34278 [27:44:56<9:25:37, 3.75s/it] 74%|███████▎ | 25227/34278 [27:44:59<9:10:42, 3.65s/it] {'loss': 0.1173, 'grad_norm': 1.547312546179713, 'learning_rate': 1.7196632576786481e-06, 'epoch': 0.74} 74%|███████▎ | 25227/34278 [27:44:59<9:10:42, 3.65s/it] 74%|███████▎ | 25228/34278 [27:45:02<8:40:01, 3.45s/it] {'loss': 0.1165, 'grad_norm': 0.9003318672602055, 'learning_rate': 1.7193067251549966e-06, 'epoch': 0.74} 74%|███████▎ | 25228/34278 [27:45:02<8:40:01, 3.45s/it] 74%|███████▎ | 25229/34278 [27:45:05<8:29:00, 3.38s/it] {'loss': 0.1351, 'grad_norm': 0.8140376947371802, 'learning_rate': 1.7189502219205894e-06, 'epoch': 0.74} 74%|███████▎ | 25229/34278 [27:45:05<8:29:00, 3.38s/it] 74%|███████▎ | 25230/34278 [27:45:08<8:08:47, 3.24s/it] {'loss': 0.1243, 'grad_norm': 0.8741857773815951, 'learning_rate': 1.718593747978613e-06, 'epoch': 0.74} 74%|███████▎ | 25230/34278 [27:45:08<8:08:47, 3.24s/it] 74%|███████▎ | 25231/34278 [27:45:11<7:54:42, 3.15s/it] {'loss': 0.1204, 'grad_norm': 0.8652461061178748, 'learning_rate': 1.7182373033322485e-06, 'epoch': 0.74} 74%|███████▎ | 25231/34278 [27:45:11<7:54:42, 3.15s/it] 74%|███████▎ | 25232/34278 
[27:45:17<10:02:56, 4.00s/it] {'loss': 0.1228, 'grad_norm': 0.8595575707027153, 'learning_rate': 1.7178808879846763e-06, 'epoch': 0.74} 74%|███████▎ | 25232/34278 [27:45:17<10:02:56, 4.00s/it] 74%|███████▎ | 25233/34278 [27:45:21<9:52:04, 3.93s/it] {'loss': 0.1098, 'grad_norm': 0.9282324690575567, 'learning_rate': 1.7175245019390801e-06, 'epoch': 0.74} 74%|███████▎ | 25233/34278 [27:45:21<9:52:04, 3.93s/it] 74%|███████▎ | 25234/34278 [27:45:24<9:09:24, 3.64s/it] {'loss': 0.139, 'grad_norm': 0.8532936953628621, 'learning_rate': 1.7171681451986428e-06, 'epoch': 0.74} 74%|███████▎ | 25234/34278 [27:45:24<9:09:24, 3.64s/it] 74%|███████▎ | 25235/34278 [27:45:27<8:28:23, 3.37s/it] {'loss': 0.1135, 'grad_norm': 1.1356813078641073, 'learning_rate': 1.716811817766545e-06, 'epoch': 0.74} 74%|███████▎ | 25235/34278 [27:45:27<8:28:23, 3.37s/it] 74%|███████▎ | 25236/34278 [27:45:30<8:21:58, 3.33s/it] {'loss': 0.1417, 'grad_norm': 1.1125982455851906, 'learning_rate': 1.7164555196459659e-06, 'epoch': 0.74} 74%|███████▎ | 25236/34278 [27:45:30<8:21:58, 3.33s/it] 74%|███████▎ | 25237/34278 [27:45:34<8:40:11, 3.45s/it] {'loss': 0.1096, 'grad_norm': 0.8874185666031077, 'learning_rate': 1.7160992508400892e-06, 'epoch': 0.74} 74%|███████▎ | 25237/34278 [27:45:34<8:40:11, 3.45s/it] 74%|███████▎ | 25238/34278 [27:45:38<8:58:57, 3.58s/it] {'loss': 0.1236, 'grad_norm': 1.101421033845138, 'learning_rate': 1.7157430113520934e-06, 'epoch': 0.74} 74%|███████▎ | 25238/34278 [27:45:38<8:58:57, 3.58s/it] 74%|███████▎ | 25239/34278 [27:45:40<8:23:38, 3.34s/it] {'loss': 0.0967, 'grad_norm': 0.8578668617025079, 'learning_rate': 1.71538680118516e-06, 'epoch': 0.74} 74%|███████▎ | 25239/34278 [27:45:40<8:23:38, 3.34s/it] 74%|███████▎ | 25240/34278 [27:45:46<9:46:24, 3.89s/it] {'loss': 0.1108, 'grad_norm': 1.02838406227728, 'learning_rate': 1.7150306203424705e-06, 'epoch': 0.74} 74%|███████▎ | 25240/34278 [27:45:46<9:46:24, 3.89s/it] 74%|███████▎ | 25241/34278 [27:45:48<8:52:15, 3.53s/it] {'loss': 
[Training progress log, steps 25241–25432 of 34278 (74%), epoch 0.74. Each step's tqdm progress line was printed twice per refresh; duplicates condensed below.]

First entry:
74%|███████▎ | 25241/34278 [27:45:48<8:52:15, 3.53s/it] {'loss': 0.1232, 'grad_norm': 1.0423443265487888, 'learning_rate': 1.7146744688272033e-06, 'epoch': 0.74}

Last complete entry:
74%|███████▍ | 25432/34278 [27:57:43<8:45:25, 3.56s/it] {'loss': 0.1199, 'grad_norm': 0.8847006034413464, 'learning_rate': 1.6471910130457508e-06, 'epoch': 0.74}

Over this span the loss fluctuates between 0.0869 and 0.1608, grad_norm between 0.67 and 1.54, the learning rate decays monotonically from 1.7147e-06 to 1.6472e-06, and step time ranges from 3.06 to 4.94 s/it. The log is cut off mid-entry at step 25433.
[27:57:48<9:11:35, 3.74s/it] {'loss': 0.123, 'grad_norm': 0.9165676425464232, 'learning_rate': 1.646840550997406e-06, 'epoch': 0.74} 74%|███████▍ | 25433/34278 [27:57:48<9:11:35, 3.74s/it] 74%|███████▍ | 25434/34278 [27:57:51<9:01:28, 3.67s/it] {'loss': 0.1234, 'grad_norm': 1.2480362274106354, 'learning_rate': 1.6464901188852684e-06, 'epoch': 0.74} 74%|███████▍ | 25434/34278 [27:57:51<9:01:28, 3.67s/it] 74%|███████▍ | 25435/34278 [27:57:54<8:27:06, 3.44s/it] {'loss': 0.0972, 'grad_norm': 0.7311462670928583, 'learning_rate': 1.646139716712465e-06, 'epoch': 0.74} 74%|███████▍ | 25435/34278 [27:57:54<8:27:06, 3.44s/it] 74%|███████▍ | 25436/34278 [27:57:57<8:12:29, 3.34s/it] {'loss': 0.1195, 'grad_norm': 1.0648678979346429, 'learning_rate': 1.6457893444821255e-06, 'epoch': 0.74} 74%|███████▍ | 25436/34278 [27:57:57<8:12:29, 3.34s/it] 74%|███████▍ | 25437/34278 [27:58:01<8:55:39, 3.64s/it] {'loss': 0.1074, 'grad_norm': 0.9334327651337558, 'learning_rate': 1.6454390021973798e-06, 'epoch': 0.74} 74%|███████▍ | 25437/34278 [27:58:01<8:55:39, 3.64s/it] 74%|███████▍ | 25438/34278 [27:58:07<10:35:44, 4.32s/it] {'loss': 0.1266, 'grad_norm': 0.7848374413560427, 'learning_rate': 1.6450886898613538e-06, 'epoch': 0.74} 74%|███████▍ | 25438/34278 [27:58:07<10:35:44, 4.32s/it] 74%|███████▍ | 25439/34278 [27:58:10<9:40:12, 3.94s/it] {'loss': 0.1237, 'grad_norm': 0.9621393073657919, 'learning_rate': 1.6447384074771732e-06, 'epoch': 0.74} 74%|███████▍ | 25439/34278 [27:58:10<9:40:12, 3.94s/it] 74%|███████▍ | 25440/34278 [27:58:16<10:41:48, 4.36s/it] {'loss': 0.1102, 'grad_norm': 0.9061550032611203, 'learning_rate': 1.644388155047969e-06, 'epoch': 0.74} 74%|███████▍ | 25440/34278 [27:58:16<10:41:48, 4.36s/it] 74%|███████▍ | 25441/34278 [27:58:18<9:23:56, 3.83s/it] {'loss': 0.1064, 'grad_norm': 1.186165421768891, 'learning_rate': 1.6440379325768646e-06, 'epoch': 0.74} 74%|███████▍ | 25441/34278 [27:58:18<9:23:56, 3.83s/it] 74%|███████▍ | 25442/34278 [27:58:21<8:47:34, 3.58s/it] {'loss': 
0.1158, 'grad_norm': 1.1411524256563175, 'learning_rate': 1.6436877400669904e-06, 'epoch': 0.74} 74%|███████▍ | 25442/34278 [27:58:21<8:47:34, 3.58s/it] 74%|███████▍ | 25443/34278 [27:58:26<9:16:31, 3.78s/it] {'loss': 0.0974, 'grad_norm': 0.9063056563284617, 'learning_rate': 1.643337577521469e-06, 'epoch': 0.74} 74%|███████▍ | 25443/34278 [27:58:26<9:16:31, 3.78s/it] 74%|███████▍ | 25444/34278 [27:58:30<9:52:46, 4.03s/it] {'loss': 0.1212, 'grad_norm': 0.9133675605781577, 'learning_rate': 1.6429874449434297e-06, 'epoch': 0.74} 74%|███████▍ | 25444/34278 [27:58:30<9:52:46, 4.03s/it] 74%|███████▍ | 25445/34278 [27:58:33<9:08:16, 3.72s/it] {'loss': 0.1201, 'grad_norm': 1.1041187593131603, 'learning_rate': 1.6426373423359975e-06, 'epoch': 0.74} 74%|███████▍ | 25445/34278 [27:58:33<9:08:16, 3.72s/it] 74%|███████▍ | 25446/34278 [27:58:36<8:39:04, 3.53s/it] {'loss': 0.1025, 'grad_norm': 1.051387351379852, 'learning_rate': 1.6422872697022958e-06, 'epoch': 0.74} 74%|███████▍ | 25446/34278 [27:58:36<8:39:04, 3.53s/it] 74%|███████▍ | 25447/34278 [27:58:42<10:24:02, 4.24s/it] {'loss': 0.1158, 'grad_norm': 0.9285985697730978, 'learning_rate': 1.641937227045452e-06, 'epoch': 0.74} 74%|███████▍ | 25447/34278 [27:58:42<10:24:02, 4.24s/it] 74%|███████▍ | 25448/34278 [27:58:45<9:30:07, 3.87s/it] {'loss': 0.1423, 'grad_norm': 1.0951366644053775, 'learning_rate': 1.6415872143685924e-06, 'epoch': 0.74} 74%|███████▍ | 25448/34278 [27:58:45<9:30:07, 3.87s/it] 74%|███████▍ | 25449/34278 [27:58:51<10:58:49, 4.48s/it] {'loss': 0.1102, 'grad_norm': 1.001333933093759, 'learning_rate': 1.6412372316748387e-06, 'epoch': 0.74} 74%|███████▍ | 25449/34278 [27:58:51<10:58:49, 4.48s/it] 74%|███████▍ | 25450/34278 [27:58:56<10:58:01, 4.47s/it] {'loss': 0.1353, 'grad_norm': 0.8529908056140938, 'learning_rate': 1.640887278967319e-06, 'epoch': 0.74} 74%|███████▍ | 25450/34278 [27:58:56<10:58:01, 4.47s/it] 74%|███████▍ | 25451/34278 [27:59:02<12:06:04, 4.94s/it] {'loss': 0.1205, 'grad_norm': 
0.6451206073086674, 'learning_rate': 1.6405373562491562e-06, 'epoch': 0.74} 74%|███████▍ | 25451/34278 [27:59:02<12:06:04, 4.94s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 74%|███████▍ | 25452/34278 [27:59:05<10:51:00, 4.43s/it] {'loss': 0.1365, 'grad_norm': 0.7512592341763321, 'learning_rate': 1.6401874635234716e-06, 'epoch': 0.74} 74%|███████▍ | 25452/34278 [27:59:05<10:51:00, 4.43s/it] 74%|███████▍ | 25453/34278 [27:59:08<9:36:04, 3.92s/it] {'loss': 0.1057, 'grad_norm': 0.6750821858785784, 'learning_rate': 1.6398376007933914e-06, 'epoch': 0.74} 74%|███████▍ | 25453/34278 [27:59:08<9:36:04, 3.92s/it] 74%|███████▍ | 25454/34278 [27:59:11<9:19:33, 3.80s/it] {'loss': 0.1235, 'grad_norm': 0.8788521283989128, 'learning_rate': 1.6394877680620407e-06, 'epoch': 0.74} 74%|███████▍ | 25454/34278 [27:59:11<9:19:33, 3.80s/it] 74%|███████▍ | 25455/34278 [27:59:15<9:08:05, 3.73s/it] {'loss': 0.1023, 'grad_norm': 0.6389843363690554, 'learning_rate': 1.6391379653325412e-06, 'epoch': 0.74} 74%|███████▍ | 25455/34278 [27:59:15<9:08:05, 3.73s/it] 74%|███████▍ | 25456/34278 [27:59:21<10:49:52, 4.42s/it] {'loss': 0.1208, 'grad_norm': 0.7921985564656653, 'learning_rate': 1.638788192608014e-06, 'epoch': 0.74} 74%|███████▍ | 25456/34278 [27:59:21<10:49:52, 4.42s/it] 74%|███████▍ | 25457/34278 [27:59:24<9:42:03, 3.96s/it] {'loss': 0.1136, 'grad_norm': 0.8358770450717694, 'learning_rate': 1.6384384498915844e-06, 'epoch': 0.74} 74%|███████▍ | 25457/34278 [27:59:24<9:42:03, 3.96s/it] 74%|███████▍ | 25458/34278 [27:59:27<9:03:47, 3.70s/it] {'loss': 0.1143, 'grad_norm': 0.6269270825396777, 'learning_rate': 1.6380887371863747e-06, 'epoch': 0.74} 74%|███████▍ | 25458/34278 [27:59:27<9:03:47, 3.70s/it] 74%|███████▍ | 25459/34278 [27:59:30<8:45:58, 3.58s/it] {'loss': 0.1121, 'grad_norm': 0.7772857358708563, 
'learning_rate': 1.6377390544955024e-06, 'epoch': 0.74} 74%|███████▍ | 25459/34278 [27:59:30<8:45:58, 3.58s/it] 74%|███████▍ | 25460/34278 [27:59:34<8:54:01, 3.63s/it] {'loss': 0.1038, 'grad_norm': 0.8009850369783482, 'learning_rate': 1.6373894018220971e-06, 'epoch': 0.74} 74%|███████▍ | 25460/34278 [27:59:34<8:54:01, 3.63s/it] 74%|███████▍ | 25461/34278 [27:59:37<8:29:50, 3.47s/it] {'loss': 0.1236, 'grad_norm': 0.9487626611399563, 'learning_rate': 1.637039779169276e-06, 'epoch': 0.74} 74%|███████▍ | 25461/34278 [27:59:37<8:29:50, 3.47s/it] 74%|███████▍ | 25462/34278 [27:59:40<8:19:30, 3.40s/it] {'loss': 0.1101, 'grad_norm': 0.7575777363808675, 'learning_rate': 1.6366901865401592e-06, 'epoch': 0.74} 74%|███████▍ | 25462/34278 [27:59:40<8:19:30, 3.40s/it] 74%|███████▍ | 25463/34278 [27:59:43<8:03:28, 3.29s/it] {'loss': 0.1219, 'grad_norm': 0.8093174230273426, 'learning_rate': 1.6363406239378715e-06, 'epoch': 0.74} 74%|███████▍ | 25463/34278 [27:59:43<8:03:28, 3.29s/it] 74%|███████▍ | 25464/34278 [27:59:46<7:50:37, 3.20s/it] {'loss': 0.1119, 'grad_norm': 0.8523740167678296, 'learning_rate': 1.6359910913655314e-06, 'epoch': 0.74} 74%|███████▍ | 25464/34278 [27:59:46<7:50:37, 3.20s/it] 74%|███████▍ | 25465/34278 [27:59:49<7:40:51, 3.14s/it] {'loss': 0.1242, 'grad_norm': 0.8158904565755344, 'learning_rate': 1.6356415888262583e-06, 'epoch': 0.74} 74%|███████▍ | 25465/34278 [27:59:49<7:40:51, 3.14s/it] 74%|███████▍ | 25466/34278 [27:59:52<7:31:52, 3.08s/it] {'loss': 0.1212, 'grad_norm': 0.7557731574314885, 'learning_rate': 1.6352921163231738e-06, 'epoch': 0.74} 74%|███████▍ | 25466/34278 [27:59:52<7:31:52, 3.08s/it] 74%|███████▍ | 25467/34278 [27:59:58<9:25:16, 3.85s/it] {'loss': 0.1005, 'grad_norm': 1.5021850666971923, 'learning_rate': 1.6349426738594e-06, 'epoch': 0.74} 74%|███████▍ | 25467/34278 [27:59:58<9:25:16, 3.85s/it] 74%|███████▍ | 25468/34278 [28:00:01<8:55:10, 3.64s/it] {'loss': 0.1078, 'grad_norm': 0.8435779966046582, 'learning_rate': 1.634593261438055e-06, 
'epoch': 0.74} 74%|███████▍ | 25468/34278 [28:00:01<8:55:10, 3.64s/it] 74%|███████▍ | 25469/34278 [28:00:07<10:43:03, 4.38s/it] {'loss': 0.1117, 'grad_norm': 0.6486910102729132, 'learning_rate': 1.6342438790622556e-06, 'epoch': 0.74} 74%|███████▍ | 25469/34278 [28:00:07<10:43:03, 4.38s/it] 74%|███████▍ | 25470/34278 [28:00:11<10:45:24, 4.40s/it] {'loss': 0.1015, 'grad_norm': 0.7197599091949821, 'learning_rate': 1.6338945267351253e-06, 'epoch': 0.74} 74%|███████▍ | 25470/34278 [28:00:11<10:45:24, 4.40s/it] 74%|███████▍ | 25471/34278 [28:00:15<9:55:46, 4.06s/it] {'loss': 0.1023, 'grad_norm': 0.9144436898866133, 'learning_rate': 1.6335452044597794e-06, 'epoch': 0.74} 74%|███████▍ | 25471/34278 [28:00:15<9:55:46, 4.06s/it] 74%|███████▍ | 25472/34278 [28:00:21<11:19:11, 4.63s/it] {'loss': 0.1117, 'grad_norm': 0.6483596145456543, 'learning_rate': 1.6331959122393405e-06, 'epoch': 0.74} 74%|███████▍ | 25472/34278 [28:00:21<11:19:11, 4.63s/it] 74%|███████▍ | 25473/34278 [28:00:23<10:04:03, 4.12s/it] {'loss': 0.1115, 'grad_norm': 0.787331992253388, 'learning_rate': 1.6328466500769225e-06, 'epoch': 0.74} 74%|███████▍ | 25473/34278 [28:00:23<10:04:03, 4.12s/it] 74%|███████▍ | 25474/34278 [28:00:26<9:05:55, 3.72s/it] {'loss': 0.1122, 'grad_norm': 0.9774850073862337, 'learning_rate': 1.6324974179756476e-06, 'epoch': 0.74} 74%|███████▍ | 25474/34278 [28:00:26<9:05:55, 3.72s/it] 74%|███████▍ | 25475/34278 [28:00:29<8:42:14, 3.56s/it] {'loss': 0.1164, 'grad_norm': 0.7852460671945108, 'learning_rate': 1.6321482159386314e-06, 'epoch': 0.74} 74%|███████▍ | 25475/34278 [28:00:29<8:42:14, 3.56s/it] 74%|███████▍ | 25476/34278 [28:00:33<8:35:06, 3.51s/it] {'loss': 0.1035, 'grad_norm': 0.8141108891769548, 'learning_rate': 1.6317990439689913e-06, 'epoch': 0.74} 74%|███████▍ | 25476/34278 [28:00:33<8:35:06, 3.51s/it] 74%|███████▍ | 25477/34278 [28:00:36<8:03:10, 3.29s/it] {'loss': 0.1224, 'grad_norm': 0.8439670860992345, 'learning_rate': 1.6314499020698444e-06, 'epoch': 0.74} 74%|███████▍ | 
25477/34278 [28:00:36<8:03:10, 3.29s/it] 74%|███████▍ | 25478/34278 [28:00:39<7:51:06, 3.21s/it] {'loss': 0.109, 'grad_norm': 0.8902089647834072, 'learning_rate': 1.631100790244311e-06, 'epoch': 0.74} 74%|███████▍ | 25478/34278 [28:00:39<7:51:06, 3.21s/it] 74%|███████▍ | 25479/34278 [28:00:42<7:38:15, 3.12s/it] {'loss': 0.1022, 'grad_norm': 0.9622996545312251, 'learning_rate': 1.6307517084955033e-06, 'epoch': 0.74} 74%|███████▍ | 25479/34278 [28:00:42<7:38:15, 3.12s/it] 74%|███████▍ | 25480/34278 [28:00:45<7:32:24, 3.09s/it] {'loss': 0.1164, 'grad_norm': 0.8769423630598164, 'learning_rate': 1.630402656826542e-06, 'epoch': 0.74} 74%|███████▍ | 25480/34278 [28:00:45<7:32:24, 3.09s/it] 74%|███████▍ | 25481/34278 [28:00:47<7:25:46, 3.04s/it] {'loss': 0.1146, 'grad_norm': 1.1854201881829816, 'learning_rate': 1.630053635240541e-06, 'epoch': 0.74} 74%|███████▍ | 25481/34278 [28:00:48<7:25:46, 3.04s/it] 74%|███████▍ | 25482/34278 [28:00:50<7:22:07, 3.02s/it] {'loss': 0.1022, 'grad_norm': 0.8432987016520562, 'learning_rate': 1.6297046437406156e-06, 'epoch': 0.74} 74%|███████▍ | 25482/34278 [28:00:50<7:22:07, 3.02s/it] 74%|███████▍ | 25483/34278 [28:00:55<8:29:37, 3.48s/it] {'loss': 0.133, 'grad_norm': 0.7889299292369527, 'learning_rate': 1.6293556823298823e-06, 'epoch': 0.74} 74%|███████▍ | 25483/34278 [28:00:55<8:29:37, 3.48s/it] 74%|███████▍ | 25484/34278 [28:00:58<8:21:13, 3.42s/it] {'loss': 0.1201, 'grad_norm': 1.0117205907326565, 'learning_rate': 1.6290067510114583e-06, 'epoch': 0.74} 74%|███████▍ | 25484/34278 [28:00:58<8:21:13, 3.42s/it] 74%|███████▍ | 25485/34278 [28:01:01<7:57:05, 3.26s/it] {'loss': 0.098, 'grad_norm': 1.0412832235517318, 'learning_rate': 1.6286578497884575e-06, 'epoch': 0.74} 74%|███████▍ | 25485/34278 [28:01:01<7:57:05, 3.26s/it] 74%|███████▍ | 25486/34278 [28:01:04<7:48:19, 3.20s/it] {'loss': 0.1286, 'grad_norm': 0.8620017663603526, 'learning_rate': 1.6283089786639933e-06, 'epoch': 0.74} 74%|███████▍ | 25486/34278 [28:01:04<7:48:19, 3.20s/it] 
74%|███████▍ | 25487/34278 [28:01:09<9:06:52, 3.73s/it] {'loss': 0.1316, 'grad_norm': 1.133428864430782, 'learning_rate': 1.627960137641183e-06, 'epoch': 0.74} 74%|███████▍ | 25487/34278 [28:01:09<9:06:52, 3.73s/it] 74%|███████▍ | 25488/34278 [28:01:12<8:40:13, 3.55s/it] {'loss': 0.1254, 'grad_norm': 0.7748872761268861, 'learning_rate': 1.6276113267231392e-06, 'epoch': 0.74} 74%|███████▍ | 25488/34278 [28:01:12<8:40:13, 3.55s/it] 74%|███████▍ | 25489/34278 [28:01:16<8:27:33, 3.46s/it] {'loss': 0.1147, 'grad_norm': 0.8168004763587532, 'learning_rate': 1.6272625459129737e-06, 'epoch': 0.74} 74%|███████▍ | 25489/34278 [28:01:16<8:27:33, 3.46s/it] 74%|███████▍ | 25490/34278 [28:01:20<9:18:34, 3.81s/it] {'loss': 0.1153, 'grad_norm': 0.7334192510394804, 'learning_rate': 1.6269137952138064e-06, 'epoch': 0.74} 74%|███████▍ | 25490/34278 [28:01:20<9:18:34, 3.81s/it] 74%|███████▍ | 25491/34278 [28:01:24<9:20:16, 3.83s/it] {'loss': 0.134, 'grad_norm': 1.0478577222605305, 'learning_rate': 1.626565074628747e-06, 'epoch': 0.74} 74%|███████▍ | 25491/34278 [28:01:24<9:20:16, 3.83s/it] 74%|███████▍ | 25492/34278 [28:01:29<9:55:29, 4.07s/it] {'loss': 0.1189, 'grad_norm': 0.8121206573306156, 'learning_rate': 1.626216384160908e-06, 'epoch': 0.74} 74%|███████▍ | 25492/34278 [28:01:29<9:55:29, 4.07s/it] 74%|███████▍ | 25493/34278 [28:01:34<10:44:34, 4.40s/it] {'loss': 0.1017, 'grad_norm': 0.9527340094945739, 'learning_rate': 1.6258677238134052e-06, 'epoch': 0.74} 74%|███████▍ | 25493/34278 [28:01:34<10:44:34, 4.40s/it] 74%|███████▍ | 25494/34278 [28:01:37<9:49:43, 4.03s/it] {'loss': 0.1368, 'grad_norm': 0.8191193835618129, 'learning_rate': 1.62551909358935e-06, 'epoch': 0.74} 74%|███████▍ | 25494/34278 [28:01:37<9:49:43, 4.03s/it] 74%|███████▍ | 25495/34278 [28:01:40<9:05:05, 3.72s/it] {'loss': 0.0997, 'grad_norm': 0.8564678401155142, 'learning_rate': 1.6251704934918533e-06, 'epoch': 0.74} 74%|███████▍ | 25495/34278 [28:01:40<9:05:05, 3.72s/it] 74%|███████▍ | 25496/34278 
[28:01:46<10:44:51, 4.41s/it] {'loss': 0.1035, 'grad_norm': 0.7826307777950089, 'learning_rate': 1.6248219235240287e-06, 'epoch': 0.74} 74%|███████▍ | 25496/34278 [28:01:46<10:44:51, 4.41s/it] 74%|███████▍ | 25497/34278 [28:01:49<9:59:27, 4.10s/it] {'loss': 0.1015, 'grad_norm': 0.8216581390653788, 'learning_rate': 1.6244733836889897e-06, 'epoch': 0.74} 74%|███████▍ | 25497/34278 [28:01:49<9:59:27, 4.10s/it] 74%|███████▍ | 25498/34278 [28:01:54<10:11:58, 4.18s/it] {'loss': 0.1241, 'grad_norm': 0.7795015813151074, 'learning_rate': 1.6241248739898469e-06, 'epoch': 0.74} 74%|███████▍ | 25498/34278 [28:01:54<10:11:58, 4.18s/it] 74%|███████▍ | 25499/34278 [28:01:58<9:55:05, 4.07s/it] {'loss': 0.1162, 'grad_norm': 0.9220823397389473, 'learning_rate': 1.623776394429709e-06, 'epoch': 0.74} 74%|███████▍ | 25499/34278 [28:01:58<9:55:05, 4.07s/it] 74%|███████▍ | 25500/34278 [28:02:00<8:58:11, 3.68s/it] {'loss': 0.0977, 'grad_norm': 0.8041320740082133, 'learning_rate': 1.6234279450116918e-06, 'epoch': 0.74} 74%|███████▍ | 25500/34278 [28:02:00<8:58:11, 3.68s/it] 74%|███████▍ | 25501/34278 [28:02:05<9:35:32, 3.93s/it] {'loss': 0.1311, 'grad_norm': 0.7047822171205932, 'learning_rate': 1.6230795257389021e-06, 'epoch': 0.74} 74%|███████▍ | 25501/34278 [28:02:05<9:35:32, 3.93s/it] 74%|███████▍ | 25502/34278 [28:02:08<9:18:38, 3.82s/it] {'loss': 0.1121, 'grad_norm': 0.7572621591783532, 'learning_rate': 1.6227311366144538e-06, 'epoch': 0.74} 74%|███████▍ | 25502/34278 [28:02:08<9:18:38, 3.82s/it] 74%|███████▍ | 25503/34278 [28:02:11<8:30:14, 3.49s/it] {'loss': 0.1148, 'grad_norm': 1.0272799798139245, 'learning_rate': 1.622382777641454e-06, 'epoch': 0.74} 74%|███████▍ | 25503/34278 [28:02:11<8:30:14, 3.49s/it] 74%|███████▍ | 25504/34278 [28:02:15<8:48:32, 3.61s/it] {'loss': 0.1005, 'grad_norm': 0.8133294697702453, 'learning_rate': 1.622034448823016e-06, 'epoch': 0.74} 74%|███████▍ | 25504/34278 [28:02:15<8:48:32, 3.61s/it] 74%|███████▍ | 25505/34278 [28:02:21<10:34:56, 4.34s/it] 
{'loss': 0.1212, 'grad_norm': 0.8817510751531433, 'learning_rate': 1.6216861501622483e-06, 'epoch': 0.74} 74%|███████▍ | 25505/34278 [28:02:21<10:34:56, 4.34s/it] 74%|███████▍ | 25506/34278 [28:02:24<9:40:42, 3.97s/it] {'loss': 0.1438, 'grad_norm': 1.0316321980202487, 'learning_rate': 1.6213378816622583e-06, 'epoch': 0.74} 74%|███████▍ | 25506/34278 [28:02:24<9:40:42, 3.97s/it] 74%|███████▍ | 25507/34278 [28:02:27<9:05:33, 3.73s/it] {'loss': 0.1144, 'grad_norm': 0.7901698333667843, 'learning_rate': 1.6209896433261573e-06, 'epoch': 0.74} 74%|███████▍ | 25507/34278 [28:02:27<9:05:33, 3.73s/it] 74%|███████▍ | 25508/34278 [28:02:33<10:48:24, 4.44s/it] {'loss': 0.1242, 'grad_norm': 0.7283455912549498, 'learning_rate': 1.620641435157056e-06, 'epoch': 0.74} 74%|███████▍ | 25508/34278 [28:02:34<10:48:24, 4.44s/it] 74%|███████▍ | 25509/34278 [28:02:36<9:40:42, 3.97s/it] {'loss': 0.1234, 'grad_norm': 1.124404937433288, 'learning_rate': 1.6202932571580593e-06, 'epoch': 0.74} 74%|███████▍ | 25509/34278 [28:02:36<9:40:42, 3.97s/it] 74%|███████▍ | 25510/34278 [28:02:39<8:43:06, 3.58s/it] {'loss': 0.09, 'grad_norm': 0.741831345451023, 'learning_rate': 1.6199451093322794e-06, 'epoch': 0.74} 74%|███████▍ | 25510/34278 [28:02:39<8:43:06, 3.58s/it] 74%|███████▍ | 25511/34278 [28:02:42<8:20:06, 3.42s/it] {'loss': 0.0986, 'grad_norm': 0.8475788042988227, 'learning_rate': 1.6195969916828224e-06, 'epoch': 0.74} 74%|███████▍ | 25511/34278 [28:02:42<8:20:06, 3.42s/it] 74%|███████▍ | 25512/34278 [28:02:45<8:13:48, 3.38s/it] {'loss': 0.1177, 'grad_norm': 1.1727542967840356, 'learning_rate': 1.619248904212795e-06, 'epoch': 0.74} 74%|███████▍ | 25512/34278 [28:02:45<8:13:48, 3.38s/it] 74%|███████▍ | 25513/34278 [28:02:49<8:07:08, 3.33s/it] {'loss': 0.1264, 'grad_norm': 0.963286441570922, 'learning_rate': 1.6189008469253064e-06, 'epoch': 0.74} 74%|███████▍ | 25513/34278 [28:02:49<8:07:08, 3.33s/it] 74%|███████▍ | 25514/34278 [28:02:52<8:00:43, 3.29s/it] {'loss': 0.1297, 'grad_norm': 
1.0430121824220553, 'learning_rate': 1.6185528198234656e-06, 'epoch': 0.74} 74%|███████▍ | 25514/34278 [28:02:52<8:00:43, 3.29s/it] 74%|███████▍ | 25515/34278 [28:02:55<7:54:14, 3.25s/it] {'loss': 0.1172, 'grad_norm': 0.8351046293034777, 'learning_rate': 1.6182048229103774e-06, 'epoch': 0.74} 74%|███████▍ | 25515/34278 [28:02:55<7:54:14, 3.25s/it] 74%|███████▍ | 25516/34278 [28:03:01<9:42:17, 3.99s/it] {'loss': 0.1146, 'grad_norm': 0.8808831723939586, 'learning_rate': 1.6178568561891484e-06, 'epoch': 0.74} 74%|███████▍ | 25516/34278 [28:03:01<9:42:17, 3.99s/it] 74%|███████▍ | 25517/34278 [28:03:05<9:45:13, 4.01s/it] {'loss': 0.1053, 'grad_norm': 1.0297276449556383, 'learning_rate': 1.6175089196628874e-06, 'epoch': 0.74} 74%|███████▍ | 25517/34278 [28:03:05<9:45:13, 4.01s/it] 74%|███████▍ | 25518/34278 [28:03:10<11:02:10, 4.54s/it] {'loss': 0.1078, 'grad_norm': 0.7246997990728504, 'learning_rate': 1.6171610133346992e-06, 'epoch': 0.74} 74%|███████▍ | 25518/34278 [28:03:11<11:02:10, 4.54s/it] 74%|███████▍ | 25519/34278 [28:03:16<11:36:38, 4.77s/it] {'loss': 0.1335, 'grad_norm': 0.8487178867770659, 'learning_rate': 1.6168131372076868e-06, 'epoch': 0.74} 74%|███████▍ | 25519/34278 [28:03:16<11:36:38, 4.77s/it] 74%|███████▍ | 25520/34278 [28:03:22<12:29:48, 5.14s/it] {'loss': 0.1137, 'grad_norm': 1.2413324814821267, 'learning_rate': 1.616465291284962e-06, 'epoch': 0.74} 74%|███████▍ | 25520/34278 [28:03:22<12:29:48, 5.14s/it] 74%|███████▍ | 25521/34278 [28:03:26<11:55:19, 4.90s/it] {'loss': 0.1131, 'grad_norm': 0.8236342336085557, 'learning_rate': 1.616117475569628e-06, 'epoch': 0.74} 74%|███████▍ | 25521/34278 [28:03:26<11:55:19, 4.90s/it] 74%|███████▍ | 25522/34278 [28:03:32<12:40:55, 5.21s/it] {'loss': 0.1234, 'grad_norm': 0.903432158000206, 'learning_rate': 1.6157696900647874e-06, 'epoch': 0.74} 74%|███████▍ | 25522/34278 [28:03:32<12:40:55, 5.21s/it] 74%|███████▍ | 25523/34278 [28:03:36<11:49:04, 4.86s/it] {'loss': 0.1453, 'grad_norm': 1.0895261908075224, 
'learning_rate': 1.6154219347735484e-06, 'epoch': 0.74} 74%|███████▍ | 25523/34278 [28:03:36<11:49:04, 4.86s/it] 74%|███████▍ | 25524/34278 [28:03:40<10:46:36, 4.43s/it] {'loss': 0.127, 'grad_norm': 1.2394685697188887, 'learning_rate': 1.6150742096990151e-06, 'epoch': 0.74} 74%|███████▍ | 25524/34278 [28:03:40<10:46:36, 4.43s/it] 74%|███████▍ | 25525/34278 [28:03:43<9:53:53, 4.07s/it] {'loss': 0.1084, 'grad_norm': 0.7495244324670081, 'learning_rate': 1.6147265148442892e-06, 'epoch': 0.74} 74%|███████▍ | 25525/34278 [28:03:43<9:53:53, 4.07s/it] 74%|███████▍ | 25526/34278 [28:03:46<9:09:54, 3.77s/it] {'loss': 0.1036, 'grad_norm': 0.6213348733752562, 'learning_rate': 1.6143788502124768e-06, 'epoch': 0.74} 74%|███████▍ | 25526/34278 [28:03:46<9:09:54, 3.77s/it] 74%|███████▍ | 25527/34278 [28:03:49<8:29:34, 3.49s/it] {'loss': 0.1032, 'grad_norm': 0.8996843840594385, 'learning_rate': 1.6140312158066834e-06, 'epoch': 0.74} 74%|███████▍ | 25527/34278 [28:03:49<8:29:34, 3.49s/it] 74%|███████▍ | 25528/34278 [28:03:52<8:18:57, 3.42s/it] {'loss': 0.1166, 'grad_norm': 1.1597407651426115, 'learning_rate': 1.6136836116300109e-06, 'epoch': 0.74} 74%|███████▍ | 25528/34278 [28:03:52<8:18:57, 3.42s/it] 74%|███████▍ | 25529/34278 [28:03:55<8:08:04, 3.35s/it] {'loss': 0.1098, 'grad_norm': 0.6980797666475563, 'learning_rate': 1.6133360376855616e-06, 'epoch': 0.74} 74%|███████▍ | 25529/34278 [28:03:55<8:08:04, 3.35s/it] 74%|███████▍ | 25530/34278 [28:03:58<7:38:44, 3.15s/it] {'loss': 0.0947, 'grad_norm': 0.6870208505510624, 'learning_rate': 1.6129884939764396e-06, 'epoch': 0.74} 74%|███████▍ | 25530/34278 [28:03:58<7:38:44, 3.15s/it] 74%|███████▍ | 25531/34278 [28:04:01<7:57:37, 3.28s/it] {'loss': 0.1198, 'grad_norm': 0.7926473302234334, 'learning_rate': 1.6126409805057492e-06, 'epoch': 0.74} 74%|███████▍ | 25531/34278 [28:04:01<7:57:37, 3.28s/it] 74%|███████▍ | 25532/34278 [28:04:05<8:24:52, 3.46s/it] {'loss': 0.1071, 'grad_norm': 0.9145850364045315, 'learning_rate': 
1.6122934972765914e-06, 'epoch': 0.74} 74%|███████▍ | 25532/34278 [28:04:05<8:24:52, 3.46s/it] 74%|███████▍ | 25533/34278 [28:04:11<10:15:25, 4.22s/it] {'loss': 0.1108, 'grad_norm': 1.2818810250205608, 'learning_rate': 1.611946044292067e-06, 'epoch': 0.74} 74%|███████▍ | 25533/34278 [28:04:11<10:15:25, 4.22s/it] 74%|███████▍ | 25534/34278 [28:04:14<9:26:40, 3.89s/it] {'loss': 0.098, 'grad_norm': 0.8039091127731487, 'learning_rate': 1.6115986215552808e-06, 'epoch': 0.74} 74%|███████▍ | 25534/34278 [28:04:14<9:26:40, 3.89s/it] 74%|███████▍ | 25535/34278 [28:04:18<8:57:06, 3.69s/it] {'loss': 0.1416, 'grad_norm': 0.8527214430764477, 'learning_rate': 1.6112512290693338e-06, 'epoch': 0.74} 74%|███████▍ | 25535/34278 [28:04:18<8:57:06, 3.69s/it] 74%|███████▍ | 25536/34278 [28:04:21<8:49:48, 3.64s/it] {'loss': 0.0991, 'grad_norm': 0.970098336114009, 'learning_rate': 1.6109038668373234e-06, 'epoch': 0.74} 74%|███████▍ | 25536/34278 [28:04:21<8:49:48, 3.64s/it] 74%|███████▍ | 25537/34278 [28:04:24<8:09:30, 3.36s/it] {'loss': 0.1327, 'grad_norm': 1.1187396466934256, 'learning_rate': 1.6105565348623574e-06, 'epoch': 0.74} 74%|███████▍ | 25537/34278 [28:04:24<8:09:30, 3.36s/it] 75%|███████▍ | 25538/34278 [28:04:28<8:26:53, 3.48s/it] {'loss': 0.1037, 'grad_norm': 0.9311686004770586, 'learning_rate': 1.6102092331475339e-06, 'epoch': 0.75} 75%|███████▍ | 25538/34278 [28:04:28<8:26:53, 3.48s/it] 75%|███████▍ | 25539/34278 [28:04:31<8:32:45, 3.52s/it] {'loss': 0.0931, 'grad_norm': 0.8785398287443141, 'learning_rate': 1.609861961695951e-06, 'epoch': 0.75} 75%|███████▍ | 25539/34278 [28:04:31<8:32:45, 3.52s/it] 75%|███████▍ | 25540/34278 [28:04:35<8:29:09, 3.50s/it] {'loss': 0.106, 'grad_norm': 0.8494143320744586, 'learning_rate': 1.609514720510713e-06, 'epoch': 0.75} 75%|███████▍ | 25540/34278 [28:04:35<8:29:09, 3.50s/it] 75%|███████▍ | 25541/34278 [28:04:39<8:55:46, 3.68s/it] {'loss': 0.112, 'grad_norm': 0.9415105827703377, 'learning_rate': 1.6091675095949189e-06, 'epoch': 0.75} 
75%|███████▍ | 25541/34278 [28:04:39<8:55:46, 3.68s/it] 75%|███████▍ | 25542/34278 [28:04:42<8:19:26, 3.43s/it] {'loss': 0.1271, 'grad_norm': 0.8763081132105256, 'learning_rate': 1.6088203289516652e-06, 'epoch': 0.75} 75%|███████▍ | 25542/34278 [28:04:42<8:19:26, 3.43s/it] 75%|███████▍ | 25543/34278 [28:04:45<8:06:47, 3.34s/it] {'loss': 0.116, 'grad_norm': 0.8929431481682392, 'learning_rate': 1.6084731785840547e-06, 'epoch': 0.75} 75%|███████▍ | 25543/34278 [28:04:45<8:06:47, 3.34s/it] 75%|███████▍ | 25544/34278 [28:04:48<8:13:11, 3.39s/it] {'loss': 0.1381, 'grad_norm': 0.9028324909614509, 'learning_rate': 1.6081260584951875e-06, 'epoch': 0.75} 75%|███████▍ | 25544/34278 [28:04:48<8:13:11, 3.39s/it] 75%|███████▍ | 25545/34278 [28:04:51<8:05:22, 3.33s/it] {'loss': 0.114, 'grad_norm': 0.9750611600946603, 'learning_rate': 1.6077789686881611e-06, 'epoch': 0.75} 75%|███████▍ | 25545/34278 [28:04:51<8:05:22, 3.33s/it] 75%|███████▍ | 25546/34278 [28:04:57<9:35:09, 3.95s/it] {'loss': 0.1155, 'grad_norm': 0.8078006404430004, 'learning_rate': 1.6074319091660723e-06, 'epoch': 0.75} 75%|███████▍ | 25546/34278 [28:04:57<9:35:09, 3.95s/it] 75%|███████▍ | 25547/34278 [28:05:01<9:51:14, 4.06s/it] {'loss': 0.1269, 'grad_norm': 1.018055942403003, 'learning_rate': 1.6070848799320237e-06, 'epoch': 0.75} 75%|███████▍ | 25547/34278 [28:05:01<9:51:14, 4.06s/it] 75%|███████▍ | 25548/34278 [28:05:04<8:50:11, 3.64s/it] {'loss': 0.1179, 'grad_norm': 0.9232982422290381, 'learning_rate': 1.6067378809891094e-06, 'epoch': 0.75} 75%|███████▍ | 25548/34278 [28:05:04<8:50:11, 3.64s/it] 75%|███████▍ | 25549/34278 [28:05:08<9:26:59, 3.90s/it] {'loss': 0.131, 'grad_norm': 1.1525596094515442, 'learning_rate': 1.6063909123404298e-06, 'epoch': 0.75} 75%|███████▍ | 25549/34278 [28:05:08<9:26:59, 3.90s/it] 75%|███████▍ | 25550/34278 [28:05:12<9:14:21, 3.81s/it] {'loss': 0.1091, 'grad_norm': 1.0587325693696445, 'learning_rate': 1.6060439739890832e-06, 'epoch': 0.75} 75%|███████▍ | 25550/34278 
[28:05:12<9:14:21, 3.81s/it] 75%|███████▍ | 25551/34278 [28:05:15<8:58:58, 3.71s/it] {'loss': 0.1147, 'grad_norm': 0.8698500911476206, 'learning_rate': 1.6056970659381654e-06, 'epoch': 0.75} 75%|███████▍ | 25551/34278 [28:05:15<8:58:58, 3.71s/it] 75%|███████▍ | 25552/34278 [28:05:18<8:27:14, 3.49s/it] {'loss': 0.1243, 'grad_norm': 1.3338528907672536, 'learning_rate': 1.6053501881907728e-06, 'epoch': 0.75} 75%|███████▍ | 25552/34278 [28:05:18<8:27:14, 3.49s/it] 75%|███████▍ | 25553/34278 [28:05:21<8:10:11, 3.37s/it] {'loss': 0.1241, 'grad_norm': 0.9939976643404548, 'learning_rate': 1.6050033407500048e-06, 'epoch': 0.75} 75%|███████▍ | 25553/34278 [28:05:21<8:10:11, 3.37s/it] 75%|███████▍ | 25554/34278 [28:05:27<9:32:16, 3.94s/it] {'loss': 0.1121, 'grad_norm': 0.9490224930223939, 'learning_rate': 1.6046565236189554e-06, 'epoch': 0.75} 75%|███████▍ | 25554/34278 [28:05:27<9:32:16, 3.94s/it] 75%|███████▍ | 25555/34278 [28:05:33<11:08:35, 4.60s/it] {'loss': 0.1245, 'grad_norm': 0.7675865964998464, 'learning_rate': 1.6043097368007233e-06, 'epoch': 0.75} 75%|███████▍ | 25555/34278 [28:05:33<11:08:35, 4.60s/it] 75%|███████▍ | 25556/34278 [28:05:39<11:59:00, 4.95s/it] {'loss': 0.1104, 'grad_norm': 1.0211646104043877, 'learning_rate': 1.6039629802984014e-06, 'epoch': 0.75} 75%|███████▍ | 25556/34278 [28:05:39<11:59:00, 4.95s/it] 75%|███████▍ | 25557/34278 [28:05:42<10:35:40, 4.37s/it] {'loss': 0.1307, 'grad_norm': 0.8369056220121306, 'learning_rate': 1.603616254115089e-06, 'epoch': 0.75} 75%|███████▍ | 25557/34278 [28:05:42<10:35:40, 4.37s/it] 75%|███████▍ | 25558/34278 [28:05:48<11:54:30, 4.92s/it] {'loss': 0.1043, 'grad_norm': 0.9513548401552328, 'learning_rate': 1.6032695582538798e-06, 'epoch': 0.75} 75%|███████▍ | 25558/34278 [28:05:48<11:54:30, 4.92s/it] 75%|███████▍ | 25559/34278 [28:05:51<10:25:42, 4.31s/it] {'loss': 0.1342, 'grad_norm': 1.072861385359351, 'learning_rate': 1.602922892717868e-06, 'epoch': 0.75} 75%|███████▍ | 25559/34278 [28:05:51<10:25:42, 4.31s/it] 
75%|███████▍ | 25560/34278 [28:05:54<9:35:59, 3.96s/it] {'loss': 0.1468, 'grad_norm': 0.9539497317221474, 'learning_rate': 1.602576257510149e-06, 'epoch': 0.75} 75%|███████▍ | 25560/34278 [28:05:54<9:35:59, 3.96s/it] 75%|███████▍ | 25561/34278 [28:05:57<8:54:46, 3.68s/it] {'loss': 0.0915, 'grad_norm': 0.6985725803914287, 'learning_rate': 1.6022296526338204e-06, 'epoch': 0.75} 75%|███████▍ | 25561/34278 [28:05:57<8:54:46, 3.68s/it] 75%|███████▍ | 25562/34278 [28:06:00<8:23:37, 3.47s/it] {'loss': 0.1317, 'grad_norm': 1.3062122090407413, 'learning_rate': 1.6018830780919741e-06, 'epoch': 0.75} 75%|███████▍ | 25562/34278 [28:06:00<8:23:37, 3.47s/it] 75%|███████▍ | 25563/34278 [28:06:03<8:00:28, 3.31s/it] {'loss': 0.1248, 'grad_norm': 1.0300371415741765, 'learning_rate': 1.6015365338877025e-06, 'epoch': 0.75} 75%|███████▍ | 25563/34278 [28:06:03<8:00:28, 3.31s/it] 75%|███████▍ | 25564/34278 [28:06:07<8:54:44, 3.68s/it] {'loss': 0.0952, 'grad_norm': 0.8176422147947936, 'learning_rate': 1.6011900200241038e-06, 'epoch': 0.75} 75%|███████▍ | 25564/34278 [28:06:07<8:54:44, 3.68s/it] 75%|███████▍ | 25565/34278 [28:06:13<10:25:34, 4.31s/it] {'loss': 0.1243, 'grad_norm': 0.8474100415831087, 'learning_rate': 1.6008435365042685e-06, 'epoch': 0.75} 75%|███████▍ | 25565/34278 [28:06:13<10:25:34, 4.31s/it] 75%|███████▍ | 25566/34278 [28:06:16<9:17:08, 3.84s/it] {'loss': 0.1086, 'grad_norm': 5.006150667434528, 'learning_rate': 1.6004970833312878e-06, 'epoch': 0.75} 75%|███████▍ | 25566/34278 [28:06:16<9:17:08, 3.84s/it] 75%|███████▍ | 25567/34278 [28:06:19<8:43:42, 3.61s/it] {'loss': 0.1128, 'grad_norm': 0.8356199022025703, 'learning_rate': 1.6001506605082605e-06, 'epoch': 0.75} 75%|███████▍ | 25567/34278 [28:06:19<8:43:42, 3.61s/it] 75%|███████▍ | 25568/34278 [28:06:22<8:36:13, 3.56s/it] {'loss': 0.1076, 'grad_norm': 0.8197559960530519, 'learning_rate': 1.599804268038277e-06, 'epoch': 0.75} 75%|███████▍ | 25568/34278 [28:06:22<8:36:13, 3.56s/it] 75%|███████▍ | 25569/34278 
[28:06:28<10:21:35, 4.28s/it] {'loss': 0.1304, 'grad_norm': 0.9229738169506888, 'learning_rate': 1.5994579059244276e-06, 'epoch': 0.75} 75%|███████▍ | 25569/34278 [28:06:28<10:21:35, 4.28s/it] 75%|███████▍ | 25570/34278 [28:06:31<9:21:24, 3.87s/it] {'loss': 0.1314, 'grad_norm': 1.0226081550306922, 'learning_rate': 1.5991115741698076e-06, 'epoch': 0.75} 75%|███████▍ | 25570/34278 [28:06:31<9:21:24, 3.87s/it] 75%|███████▍ | 25571/34278 [28:06:34<8:48:28, 3.64s/it] {'loss': 0.1086, 'grad_norm': 1.2285554130538987, 'learning_rate': 1.5987652727775077e-06, 'epoch': 0.75} 75%|███████▍ | 25571/34278 [28:06:34<8:48:28, 3.64s/it] 75%|███████▍ | 25572/34278 [28:06:38<8:50:36, 3.66s/it] {'loss': 0.1275, 'grad_norm': 1.0954519130263571, 'learning_rate': 1.598419001750618e-06, 'epoch': 0.75} 75%|███████▍ | 25572/34278 [28:06:38<8:50:36, 3.66s/it] 75%|███████▍ | 25573/34278 [28:06:41<8:15:28, 3.42s/it] {'loss': 0.133, 'grad_norm': 0.8074544591816742, 'learning_rate': 1.5980727610922315e-06, 'epoch': 0.75} 75%|███████▍ | 25573/34278 [28:06:41<8:15:28, 3.42s/it] 75%|███████▍ | 25574/34278 [28:06:46<9:28:34, 3.92s/it] {'loss': 0.1346, 'grad_norm': 1.2046636037278546, 'learning_rate': 1.5977265508054408e-06, 'epoch': 0.75} 75%|███████▍ | 25574/34278 [28:06:46<9:28:34, 3.92s/it] 75%|███████▍ | 25575/34278 [28:06:52<11:00:06, 4.55s/it] {'loss': 0.1246, 'grad_norm': 0.8894811971962874, 'learning_rate': 1.5973803708933355e-06, 'epoch': 0.75} 75%|███████▍ | 25575/34278 [28:06:52<11:00:06, 4.55s/it] 75%|███████▍ | 25576/34278 [28:06:56<10:33:37, 4.37s/it] {'loss': 0.1141, 'grad_norm': 0.988885625330251, 'learning_rate': 1.597034221359004e-06, 'epoch': 0.75} 75%|███████▍ | 25576/34278 [28:06:56<10:33:37, 4.37s/it] 75%|███████▍ | 25577/34278 [28:06:59<9:22:13, 3.88s/it] {'loss': 0.1147, 'grad_norm': 0.9923304779624527, 'learning_rate': 1.5966881022055403e-06, 'epoch': 0.75} 75%|███████▍ | 25577/34278 [28:06:59<9:22:13, 3.88s/it] 75%|███████▍ | 25578/34278 [28:07:01<8:28:49, 3.51s/it] 
{'loss': 0.1013, 'grad_norm': 1.0562825232150244, 'learning_rate': 1.5963420134360313e-06, 'epoch': 0.75} 75%|███████▍ | 25578/34278 [28:07:01<8:28:49, 3.51s/it] 75%|███████▍ | 25579/34278 [28:07:06<9:02:50, 3.74s/it] {'loss': 0.119, 'grad_norm': 0.9596395482609146, 'learning_rate': 1.5959959550535682e-06, 'epoch': 0.75} 75%|███████▍ | 25579/34278 [28:07:06<9:02:50, 3.74s/it] 75%|███████▍ | 25580/34278 [28:07:09<8:33:56, 3.55s/it] {'loss': 0.1417, 'grad_norm': 0.7517602565914026, 'learning_rate': 1.595649927061242e-06, 'epoch': 0.75} 75%|███████▍ | 25580/34278 [28:07:09<8:33:56, 3.55s/it] 75%|███████▍ | 25581/34278 [28:07:12<8:21:33, 3.46s/it] {'loss': 0.1385, 'grad_norm': 1.026186386798158, 'learning_rate': 1.595303929462141e-06, 'epoch': 0.75} 75%|███████▍ | 25581/34278 [28:07:12<8:21:33, 3.46s/it] 75%|███████▍ | 25582/34278 [28:07:16<8:23:05, 3.47s/it] {'loss': 0.119, 'grad_norm': 0.862758481281522, 'learning_rate': 1.594957962259352e-06, 'epoch': 0.75} 75%|███████▍ | 25582/34278 [28:07:16<8:23:05, 3.47s/it] 75%|███████▍ | 25583/34278 [28:07:18<7:58:01, 3.30s/it] {'loss': 0.1496, 'grad_norm': 0.7700798815742126, 'learning_rate': 1.5946120254559666e-06, 'epoch': 0.75} 75%|███████▍ | 25583/34278 [28:07:18<7:58:01, 3.30s/it] 75%|███████▍ | 25584/34278 [28:07:22<8:05:46, 3.35s/it] {'loss': 0.1278, 'grad_norm': 0.8386103680514339, 'learning_rate': 1.5942661190550713e-06, 'epoch': 0.75} 75%|███████▍ | 25584/34278 [28:07:22<8:05:46, 3.35s/it] 75%|███████▍ | 25585/34278 [28:07:25<7:39:01, 3.17s/it] {'loss': 0.1225, 'grad_norm': 1.0830627699934756, 'learning_rate': 1.5939202430597562e-06, 'epoch': 0.75} 75%|███████▍ | 25585/34278 [28:07:25<7:39:01, 3.17s/it] 75%|███████▍ | 25586/34278 [28:07:28<7:32:36, 3.12s/it] {'loss': 0.125, 'grad_norm': 0.9011135878300403, 'learning_rate': 1.5935743974731065e-06, 'epoch': 0.75} 75%|███████▍ | 25586/34278 [28:07:28<7:32:36, 3.12s/it] 75%|███████▍ | 25587/34278 [28:07:31<7:52:16, 3.26s/it] {'loss': 0.1296, 'grad_norm': 
0.8177723284523348, 'learning_rate': 1.593228582298213e-06, 'epoch': 0.75} 75%|███████▍ | 25587/34278 [28:07:31<7:52:16, 3.26s/it] 75%|███████▍ | 25588/34278 [28:07:36<8:48:30, 3.65s/it] {'loss': 0.1113, 'grad_norm': 0.7812770830191983, 'learning_rate': 1.5928827975381617e-06, 'epoch': 0.75} 75%|███████▍ | 25588/34278 [28:07:36<8:48:30, 3.65s/it] 75%|███████▍ | 25589/34278 [28:07:39<8:20:31, 3.46s/it] {'loss': 0.085, 'grad_norm': 1.1012562085984332, 'learning_rate': 1.5925370431960373e-06, 'epoch': 0.75} 75%|███████▍ | 25589/34278 [28:07:39<8:20:31, 3.46s/it] 75%|███████▍ | 25590/34278 [28:07:45<10:11:09, 4.22s/it] {'loss': 0.1181, 'grad_norm': 0.7157208521702754, 'learning_rate': 1.5921913192749288e-06, 'epoch': 0.75} 75%|███████▍ | 25590/34278 [28:07:45<10:11:09, 4.22s/it] 75%|███████▍ | 25591/34278 [28:07:48<9:31:45, 3.95s/it] {'loss': 0.1038, 'grad_norm': 0.6344133581036481, 'learning_rate': 1.5918456257779248e-06, 'epoch': 0.75} 75%|███████▍ | 25591/34278 [28:07:48<9:31:45, 3.95s/it] 75%|███████▍ | 25592/34278 [28:07:52<9:35:22, 3.97s/it] {'loss': 0.1163, 'grad_norm': 0.8730673364442029, 'learning_rate': 1.5914999627081096e-06, 'epoch': 0.75} 75%|███████▍ | 25592/34278 [28:07:52<9:35:22, 3.97s/it] 75%|███████▍ | 25593/34278 [28:07:55<8:57:45, 3.72s/it] {'loss': 0.099, 'grad_norm': 1.073997424960767, 'learning_rate': 1.5911543300685667e-06, 'epoch': 0.75} 75%|███████▍ | 25593/34278 [28:07:55<8:57:45, 3.72s/it] 75%|███████▍ | 25594/34278 [28:07:59<8:40:24, 3.60s/it] {'loss': 0.1311, 'grad_norm': 0.8947140763807463, 'learning_rate': 1.5908087278623863e-06, 'epoch': 0.75} 75%|███████▍ | 25594/34278 [28:07:59<8:40:24, 3.60s/it] 75%|███████▍ | 25595/34278 [28:08:02<8:29:19, 3.52s/it] {'loss': 0.1223, 'grad_norm': 0.6886352924022845, 'learning_rate': 1.5904631560926515e-06, 'epoch': 0.75} 75%|███████▍ | 25595/34278 [28:08:02<8:29:19, 3.52s/it] 75%|███████▍ | 25596/34278 [28:08:05<8:01:34, 3.33s/it] {'loss': 0.0969, 'grad_norm': 0.8302032427528294, 'learning_rate': 
1.5901176147624448e-06, 'epoch': 0.75} 75%|███████▍ | 25596/34278 [28:08:05<8:01:34, 3.33s/it] 75%|███████▍ | 25597/34278 [28:08:08<7:38:01, 3.17s/it] {'loss': 0.1056, 'grad_norm': 0.7117401002308471, 'learning_rate': 1.589772103874857e-06, 'epoch': 0.75} 75%|███████▍ | 25597/34278 [28:08:08<7:38:01, 3.17s/it] 75%|███████▍ | 25598/34278 [28:08:11<7:58:27, 3.31s/it] {'loss': 0.1454, 'grad_norm': 0.9721643354323618, 'learning_rate': 1.5894266234329697e-06, 'epoch': 0.75} 75%|███████▍ | 25598/34278 [28:08:11<7:58:27, 3.31s/it] 75%|███████▍ | 25599/34278 [28:08:17<9:53:07, 4.10s/it] {'loss': 0.1261, 'grad_norm': 0.8414748072462078, 'learning_rate': 1.5890811734398659e-06, 'epoch': 0.75} 75%|███████▍ | 25599/34278 [28:08:17<9:53:07, 4.10s/it] 75%|███████▍ | 25600/34278 [28:08:21<9:23:08, 3.89s/it] {'loss': 0.1059, 'grad_norm': 0.8829117922438163, 'learning_rate': 1.588735753898633e-06, 'epoch': 0.75} 75%|███████▍ | 25600/34278 [28:08:21<9:23:08, 3.89s/it] 75%|███████▍ | 25601/34278 [28:08:24<9:18:47, 3.86s/it] {'loss': 0.1199, 'grad_norm': 1.0462696314783562, 'learning_rate': 1.5883903648123528e-06, 'epoch': 0.75} 75%|███████▍ | 25601/34278 [28:08:24<9:18:47, 3.86s/it] 75%|███████▍ | 25602/34278 [28:08:28<8:49:34, 3.66s/it] {'loss': 0.1214, 'grad_norm': 0.6695386273924907, 'learning_rate': 1.588045006184107e-06, 'epoch': 0.75} 75%|███████▍ | 25602/34278 [28:08:28<8:49:34, 3.66s/it] 75%|███████▍ | 25603/34278 [28:08:31<8:28:57, 3.52s/it] {'loss': 0.1095, 'grad_norm': 1.2591927558180418, 'learning_rate': 1.5876996780169813e-06, 'epoch': 0.75} 75%|███████▍ | 25603/34278 [28:08:31<8:28:57, 3.52s/it] 75%|███████▍ | 25604/34278 [28:08:34<8:30:58, 3.53s/it] {'loss': 0.1179, 'grad_norm': 0.8860847365222658, 'learning_rate': 1.5873543803140594e-06, 'epoch': 0.75} 75%|███████▍ | 25604/34278 [28:08:34<8:30:58, 3.53s/it] 75%|███████▍ | 25605/34278 [28:08:40<9:49:58, 4.08s/it] {'loss': 0.128, 'grad_norm': 0.9533108212734227, 'learning_rate': 1.5870091130784237e-06, 'epoch': 0.75} 
75%|███████▍ | 25605/34278 [28:08:40<9:49:58, 4.08s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 75%|███████▍ | 25606/34278 [28:08:42<8:51:19, 3.68s/it] {'loss': 0.1128, 'grad_norm': 0.9225178779363759, 'learning_rate': 1.5866638763131536e-06, 'epoch': 0.75} 75%|███████▍ | 25606/34278 [28:08:42<8:51:19, 3.68s/it] 75%|███████▍ | 25607/34278 [28:08:45<8:24:11, 3.49s/it] {'loss': 0.1409, 'grad_norm': 0.8778929331649101, 'learning_rate': 1.5863186700213356e-06, 'epoch': 0.75} 75%|███████▍ | 25607/34278 [28:08:45<8:24:11, 3.49s/it] 75%|███████▍ | 25608/34278 [28:08:49<8:21:54, 3.47s/it] {'loss': 0.1463, 'grad_norm': 0.8500698613801462, 'learning_rate': 1.5859734942060479e-06, 'epoch': 0.75} 75%|███████▍ | 25608/34278 [28:08:49<8:21:54, 3.47s/it] 75%|███████▍ | 25609/34278 [28:08:53<8:31:06, 3.54s/it] {'loss': 0.1208, 'grad_norm': 0.916773226968268, 'learning_rate': 1.5856283488703738e-06, 'epoch': 0.75} 75%|███████▍ | 25609/34278 [28:08:53<8:31:06, 3.54s/it] 75%|███████▍ | 25610/34278 [28:08:58<10:04:48, 4.19s/it] {'loss': 0.108, 'grad_norm': 1.3881675992267661, 'learning_rate': 1.5852832340173962e-06, 'epoch': 0.75} 75%|███████▍ | 25610/34278 [28:08:58<10:04:48, 4.19s/it] 75%|███████▍ | 25611/34278 [28:09:01<9:16:39, 3.85s/it] {'loss': 0.1109, 'grad_norm': 0.9720055053802819, 'learning_rate': 1.5849381496501948e-06, 'epoch': 0.75} 75%|███████▍ | 25611/34278 [28:09:01<9:16:39, 3.85s/it] 75%|███████▍ | 25612/34278 [28:09:05<8:51:31, 3.68s/it] {'loss': 0.1191, 'grad_norm': 0.7714399454441949, 'learning_rate': 1.5845930957718491e-06, 'epoch': 0.75} 75%|███████▍ | 25612/34278 [28:09:05<8:51:31, 3.68s/it] 75%|███████▍ | 25613/34278 [28:09:07<8:14:44, 3.43s/it] {'loss': 0.1281, 'grad_norm': 0.9274973907083488, 'learning_rate': 1.584248072385442e-06, 'epoch': 0.75} 75%|███████▍ | 25613/34278 
[28:09:07<8:14:44, 3.43s/it] 75%|███████▍ | 25614/34278 [28:09:11<8:05:09, 3.36s/it] {'loss': 0.1084, 'grad_norm': 0.853891497627841, 'learning_rate': 1.5839030794940513e-06, 'epoch': 0.75} 75%|███████▍ | 25614/34278 [28:09:11<8:05:09, 3.36s/it] 75%|███████▍ | 25615/34278 [28:09:14<8:09:37, 3.39s/it] {'loss': 0.1211, 'grad_norm': 0.7458355945066203, 'learning_rate': 1.5835581171007603e-06, 'epoch': 0.75} 75%|███████▍ | 25615/34278 [28:09:14<8:09:37, 3.39s/it] 75%|███████▍ | 25616/34278 [28:09:17<7:59:34, 3.32s/it] {'loss': 0.117, 'grad_norm': 0.8022379384011737, 'learning_rate': 1.5832131852086452e-06, 'epoch': 0.75} 75%|███████▍ | 25616/34278 [28:09:17<7:59:34, 3.32s/it] 75%|███████▍ | 25617/34278 [28:09:20<7:43:32, 3.21s/it] {'loss': 0.1104, 'grad_norm': 0.8887473459433449, 'learning_rate': 1.5828682838207882e-06, 'epoch': 0.75} 75%|███████▍ | 25617/34278 [28:09:20<7:43:32, 3.21s/it] 75%|███████▍ | 25618/34278 [28:09:24<8:13:30, 3.42s/it] {'loss': 0.1281, 'grad_norm': 0.7825986872831625, 'learning_rate': 1.5825234129402679e-06, 'epoch': 0.75} 75%|███████▍ | 25618/34278 [28:09:24<8:13:30, 3.42s/it] 75%|███████▍ | 25619/34278 [28:09:27<8:09:00, 3.39s/it] {'loss': 0.1173, 'grad_norm': 0.7345869986556904, 'learning_rate': 1.582178572570161e-06, 'epoch': 0.75} 75%|███████▍ | 25619/34278 [28:09:27<8:09:00, 3.39s/it] 75%|███████▍ | 25620/34278 [28:09:30<7:52:36, 3.28s/it] {'loss': 0.1277, 'grad_norm': 0.820942982376986, 'learning_rate': 1.5818337627135477e-06, 'epoch': 0.75} 75%|███████▍ | 25620/34278 [28:09:30<7:52:36, 3.28s/it] 75%|███████▍ | 25621/34278 [28:09:34<8:09:33, 3.39s/it] {'loss': 0.1124, 'grad_norm': 0.9539420149368407, 'learning_rate': 1.5814889833735087e-06, 'epoch': 0.75} 75%|███████▍ | 25621/34278 [28:09:34<8:09:33, 3.39s/it] 75%|███████▍ | 25622/34278 [28:09:38<8:10:14, 3.40s/it] {'loss': 0.1097, 'grad_norm': 0.8295040468938645, 'learning_rate': 1.5811442345531197e-06, 'epoch': 0.75} 75%|███████▍ | 25622/34278 [28:09:38<8:10:14, 
3.40s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 75%|███████▍ | 25623/34278 [28:09:41<7:50:41, 3.26s/it] {'loss': 0.1343, 'grad_norm': 0.7998331738142896, 'learning_rate': 1.5807995162554574e-06, 'epoch': 0.75} 75%|███████▍ | 25623/34278 [28:09:41<7:50:41, 3.26s/it] 75%|███████▍ | 25624/34278 [28:09:46<9:45:30, 4.06s/it] {'loss': 0.1434, 'grad_norm': 0.9058265781022713, 'learning_rate': 1.5804548284836018e-06, 'epoch': 0.75} 75%|███████▍ | 25624/34278 [28:09:46<9:45:30, 4.06s/it] 75%|███████▍ | 25625/34278 [28:09:50<9:09:03, 3.81s/it] {'loss': 0.1069, 'grad_norm': 0.77370724980713, 'learning_rate': 1.5801101712406296e-06, 'epoch': 0.75} 75%|███████▍ | 25625/34278 [28:09:50<9:09:03, 3.81s/it] 75%|███████▍ | 25626/34278 [28:09:54<9:31:37, 3.96s/it] {'loss': 0.1051, 'grad_norm': 0.8491998827047683, 'learning_rate': 1.5797655445296146e-06, 'epoch': 0.75} 75%|███████▍ | 25626/34278 [28:09:54<9:31:37, 3.96s/it] 75%|███████▍ | 25627/34278 [28:09:57<8:52:06, 3.69s/it] {'loss': 0.1179, 'grad_norm': 0.9163458204771122, 'learning_rate': 1.5794209483536388e-06, 'epoch': 0.75} 75%|███████▍ | 25627/34278 [28:09:57<8:52:06, 3.69s/it] 75%|███████▍ | 25628/34278 [28:10:00<8:22:10, 3.48s/it] {'loss': 0.1285, 'grad_norm': 0.9323202469232902, 'learning_rate': 1.5790763827157769e-06, 'epoch': 0.75} 75%|███████▍ | 25628/34278 [28:10:00<8:22:10, 3.48s/it] 75%|███████▍ | 25629/34278 [28:10:06<10:09:15, 4.23s/it] {'loss': 0.1163, 'grad_norm': 0.8657285290055365, 'learning_rate': 1.5787318476191021e-06, 'epoch': 0.75} 75%|███████▍ | 25629/34278 [28:10:06<10:09:15, 4.23s/it] 75%|███████▍ | 25630/34278 [28:10:09<9:15:16, 3.85s/it] {'loss': 0.1033, 'grad_norm': 0.874499824021466, 'learning_rate': 1.5783873430666947e-06, 'epoch': 0.75} 75%|███████▍ | 25630/34278 [28:10:09<9:15:16, 3.85s/it] 75%|███████▍ | 
25631/34278 [28:10:12<8:48:06, 3.66s/it] {'loss': 0.1395, 'grad_norm': 0.8748384166736571, 'learning_rate': 1.5780428690616284e-06, 'epoch': 0.75} 75%|███████▍ | 25631/34278 [28:10:12<8:48:06, 3.66s/it] 75%|███████▍ | 25632/34278 [28:10:15<8:21:09, 3.48s/it] {'loss': 0.1055, 'grad_norm': 0.7851887220845484, 'learning_rate': 1.5776984256069767e-06, 'epoch': 0.75} 75%|███████▍ | 25632/34278 [28:10:15<8:21:09, 3.48s/it] 75%|███████▍ | 25633/34278 [28:10:21<10:00:09, 4.17s/it] {'loss': 0.1001, 'grad_norm': 0.9042595817353926, 'learning_rate': 1.5773540127058162e-06, 'epoch': 0.75} 75%|███████▍ | 25633/34278 [28:10:21<10:00:09, 4.17s/it] 75%|███████▍ | 25634/34278 [28:10:25<9:45:40, 4.07s/it] {'loss': 0.1104, 'grad_norm': 0.8458277829155152, 'learning_rate': 1.5770096303612243e-06, 'epoch': 0.75} 75%|███████▍ | 25634/34278 [28:10:25<9:45:40, 4.07s/it] 75%|███████▍ | 25635/34278 [28:10:28<9:05:09, 3.78s/it] {'loss': 0.1269, 'grad_norm': 0.8950348281535193, 'learning_rate': 1.5766652785762726e-06, 'epoch': 0.75} 75%|███████▍ | 25635/34278 [28:10:28<9:05:09, 3.78s/it] 75%|███████▍ | 25636/34278 [28:10:32<8:58:22, 3.74s/it] {'loss': 0.1233, 'grad_norm': 0.7225497063346766, 'learning_rate': 1.576320957354035e-06, 'epoch': 0.75} 75%|███████▍ | 25636/34278 [28:10:32<8:58:22, 3.74s/it] 75%|███████▍ | 25637/34278 [28:10:35<8:27:49, 3.53s/it] {'loss': 0.1108, 'grad_norm': 0.8940911506486378, 'learning_rate': 1.5759766666975878e-06, 'epoch': 0.75} 75%|███████▍ | 25637/34278 [28:10:35<8:27:49, 3.53s/it] 75%|███████▍ | 25638/34278 [28:10:38<8:07:52, 3.39s/it] {'loss': 0.1129, 'grad_norm': 0.8380670318959027, 'learning_rate': 1.575632406610002e-06, 'epoch': 0.75} 75%|███████▍ | 25638/34278 [28:10:38<8:07:52, 3.39s/it] 75%|███████▍ | 25639/34278 [28:10:41<7:56:38, 3.31s/it] {'loss': 0.0956, 'grad_norm': 0.7181155540416648, 'learning_rate': 1.5752881770943529e-06, 'epoch': 0.75} 75%|███████▍ | 25639/34278 [28:10:41<7:56:38, 3.31s/it] 75%|███████▍ | 25640/34278 [28:10:44<8:00:30, 
3.34s/it] {'loss': 0.1446, 'grad_norm': 0.8029626984678675, 'learning_rate': 1.5749439781537145e-06, 'epoch': 0.75} 75%|███████▍ | 25640/34278 [28:10:44<8:00:30, 3.34s/it] 75%|███████▍ | 25641/34278 [28:10:47<7:45:18, 3.23s/it] {'loss': 0.1097, 'grad_norm': 1.0508857609416549, 'learning_rate': 1.574599809791159e-06, 'epoch': 0.75} 75%|███████▍ | 25641/34278 [28:10:47<7:45:18, 3.23s/it] 75%|███████▍ | 25642/34278 [28:10:50<7:30:44, 3.13s/it] {'loss': 0.0896, 'grad_norm': 0.8246555096083829, 'learning_rate': 1.5742556720097574e-06, 'epoch': 0.75} 75%|███████▍ | 25642/34278 [28:10:50<7:30:44, 3.13s/it] 75%|███████▍ | 25643/34278 [28:10:53<7:39:43, 3.19s/it] {'loss': 0.107, 'grad_norm': 0.8795579743206051, 'learning_rate': 1.5739115648125846e-06, 'epoch': 0.75} 75%|███████▍ | 25643/34278 [28:10:53<7:39:43, 3.19s/it] 75%|███████▍ | 25644/34278 [28:10:58<8:42:45, 3.63s/it] {'loss': 0.1179, 'grad_norm': 0.8127137789150932, 'learning_rate': 1.5735674882027097e-06, 'epoch': 0.75} 75%|███████▍ | 25644/34278 [28:10:58<8:42:45, 3.63s/it] 75%|███████▍ | 25645/34278 [28:11:01<8:18:19, 3.46s/it] {'loss': 0.1151, 'grad_norm': 0.8657455104022886, 'learning_rate': 1.5732234421832083e-06, 'epoch': 0.75} 75%|███████▍ | 25645/34278 [28:11:01<8:18:19, 3.46s/it] 75%|███████▍ | 25646/34278 [28:11:05<8:25:13, 3.51s/it] {'loss': 0.1207, 'grad_norm': 0.984091213798382, 'learning_rate': 1.5728794267571478e-06, 'epoch': 0.75} 75%|███████▍ | 25646/34278 [28:11:05<8:25:13, 3.51s/it] 75%|███████▍ | 25647/34278 [28:11:10<9:54:55, 4.14s/it] {'loss': 0.1071, 'grad_norm': 0.8555390115961721, 'learning_rate': 1.5725354419276039e-06, 'epoch': 0.75} 75%|███████▍ | 25647/34278 [28:11:10<9:54:55, 4.14s/it] 75%|███████▍ | 25648/34278 [28:11:14<9:46:22, 4.08s/it] {'loss': 0.1254, 'grad_norm': 1.1362405378720326, 'learning_rate': 1.5721914876976452e-06, 'epoch': 0.75} 75%|███████▍ | 25648/34278 [28:11:14<9:46:22, 4.08s/it] 75%|███████▍ | 25649/34278 [28:11:17<8:58:37, 3.75s/it] {'loss': 0.1286, 'grad_norm': 
0.9078234450730663, 'learning_rate': 1.5718475640703407e-06, 'epoch': 0.75} 75%|███████▍ | 25649/34278 [28:11:17<8:58:37, 3.75s/it] 75%|███████▍ | 25650/34278 [28:11:21<8:52:31, 3.70s/it] {'loss': 0.1131, 'grad_norm': 1.012314769564317, 'learning_rate': 1.571503671048763e-06, 'epoch': 0.75} 75%|███████▍ | 25650/34278 [28:11:21<8:52:31, 3.70s/it] 75%|███████▍ | 25651/34278 [28:11:27<10:48:07, 4.51s/it] {'loss': 0.1178, 'grad_norm': 0.8669448799297004, 'learning_rate': 1.5711598086359837e-06, 'epoch': 0.75} 75%|███████▍ | 25651/34278 [28:11:27<10:48:07, 4.51s/it] 75%|███████▍ | 25652/34278 [28:11:32<10:47:51, 4.51s/it] {'loss': 0.1068, 'grad_norm': 0.9417444921510744, 'learning_rate': 1.5708159768350711e-06, 'epoch': 0.75} 75%|███████▍ | 25652/34278 [28:11:32<10:47:51, 4.51s/it] 75%|███████▍ | 25653/34278 [28:11:35<9:31:39, 3.98s/it] {'loss': 0.1192, 'grad_norm': 0.9311394405412246, 'learning_rate': 1.5704721756490932e-06, 'epoch': 0.75} 75%|███████▍ | 25653/34278 [28:11:35<9:31:39, 3.98s/it] 75%|███████▍ | 25654/34278 [28:11:38<9:18:35, 3.89s/it] {'loss': 0.1141, 'grad_norm': 0.7286011907726347, 'learning_rate': 1.5701284050811227e-06, 'epoch': 0.75} 75%|███████▍ | 25654/34278 [28:11:38<9:18:35, 3.89s/it] 75%|███████▍ | 25655/34278 [28:11:41<8:38:47, 3.61s/it] {'loss': 0.1033, 'grad_norm': 0.988775257715098, 'learning_rate': 1.569784665134227e-06, 'epoch': 0.75} 75%|███████▍ | 25655/34278 [28:11:41<8:38:47, 3.61s/it] 75%|███████▍ | 25656/34278 [28:11:44<8:14:52, 3.44s/it] {'loss': 0.1128, 'grad_norm': 0.9231658020629819, 'learning_rate': 1.5694409558114715e-06, 'epoch': 0.75} 75%|███████▍ | 25656/34278 [28:11:44<8:14:52, 3.44s/it] 75%|███████▍ | 25657/34278 [28:11:49<8:58:26, 3.75s/it] {'loss': 0.1257, 'grad_norm': 0.8887093770242535, 'learning_rate': 1.5690972771159318e-06, 'epoch': 0.75} 75%|███████▍ | 25657/34278 [28:11:49<8:58:26, 3.75s/it] 75%|███████▍ | 25658/34278 [28:11:55<10:32:49, 4.40s/it] {'loss': 0.1231, 'grad_norm': 1.3122384141373964, 'learning_rate': 
1.5687536290506722e-06, 'epoch': 0.75} 75%|███████▍ | 25658/34278 [28:11:55<10:32:49, 4.40s/it] 75%|███████▍ | 25659/34278 [28:11:59<10:52:21, 4.54s/it] {'loss': 0.116, 'grad_norm': 1.0793584893273471, 'learning_rate': 1.56841001161876e-06, 'epoch': 0.75} 75%|███████▍ | 25659/34278 [28:11:59<10:52:21, 4.54s/it] 75%|███████▍ | 25660/34278 [28:12:02<9:39:00, 4.03s/it] {'loss': 0.1075, 'grad_norm': 0.9791206413428024, 'learning_rate': 1.5680664248232652e-06, 'epoch': 0.75} 75%|███████▍ | 25660/34278 [28:12:02<9:39:00, 4.03s/it] 75%|███████▍ | 25661/34278 [28:12:06<9:06:06, 3.80s/it] {'loss': 0.1076, 'grad_norm': 0.7584541340528008, 'learning_rate': 1.567722868667254e-06, 'epoch': 0.75} 75%|███████▍ | 25661/34278 [28:12:06<9:06:06, 3.80s/it] 75%|███████▍ | 25662/34278 [28:12:11<10:35:34, 4.43s/it] {'loss': 0.1331, 'grad_norm': 0.8887738686445517, 'learning_rate': 1.5673793431537925e-06, 'epoch': 0.75} 75%|███████▍ | 25662/34278 [28:12:11<10:35:34, 4.43s/it] 75%|███████▍ | 25663/34278 [28:12:15<9:39:51, 4.04s/it] {'loss': 0.1393, 'grad_norm': 1.8058045291080915, 'learning_rate': 1.5670358482859488e-06, 'epoch': 0.75} 75%|███████▍ | 25663/34278 [28:12:15<9:39:51, 4.04s/it] 75%|███████▍ | 25664/34278 [28:12:20<10:48:34, 4.52s/it] {'loss': 0.1128, 'grad_norm': 0.9396069158149576, 'learning_rate': 1.5666923840667907e-06, 'epoch': 0.75} 75%|███████▍ | 25664/34278 [28:12:20<10:48:34, 4.52s/it] 75%|███████▍ | 25665/34278 [28:12:24<9:56:13, 4.15s/it] {'loss': 0.0937, 'grad_norm': 0.852240701152601, 'learning_rate': 1.566348950499384e-06, 'epoch': 0.75} 75%|███████▍ | 25665/34278 [28:12:24<9:56:13, 4.15s/it] 75%|███████▍ | 25666/34278 [28:12:28<9:57:38, 4.16s/it] {'loss': 0.1355, 'grad_norm': 0.9996120030112183, 'learning_rate': 1.5660055475867918e-06, 'epoch': 0.75} 75%|███████▍ | 25666/34278 [28:12:28<9:57:38, 4.16s/it] 75%|███████▍ | 25667/34278 [28:12:31<9:07:37, 3.82s/it] {'loss': 0.1069, 'grad_norm': 0.8031682577824013, 'learning_rate': 1.5656621753320844e-06, 'epoch': 
0.75} 75%|███████▍ | 25667/34278 [28:12:31<9:07:37, 3.82s/it] 75%|███████▍ | 25668/34278 [28:12:34<8:44:01, 3.65s/it] {'loss': 0.1258, 'grad_norm': 1.2529655951038805, 'learning_rate': 1.5653188337383236e-06, 'epoch': 0.75} 75%|███████▍ | 25668/34278 [28:12:34<8:44:01, 3.65s/it] 75%|███████▍ | 25669/34278 [28:12:38<8:40:31, 3.63s/it] {'loss': 0.1299, 'grad_norm': 0.8544354484814337, 'learning_rate': 1.5649755228085766e-06, 'epoch': 0.75} 75%|███████▍ | 25669/34278 [28:12:38<8:40:31, 3.63s/it] 75%|███████▍ | 25670/34278 [28:12:41<8:28:40, 3.55s/it] {'loss': 0.1348, 'grad_norm': 0.9718943308170146, 'learning_rate': 1.5646322425459092e-06, 'epoch': 0.75} 75%|███████▍ | 25670/34278 [28:12:41<8:28:40, 3.55s/it] 75%|███████▍ | 25671/34278 [28:12:44<8:18:49, 3.48s/it] {'loss': 0.1291, 'grad_norm': 0.8862346063142209, 'learning_rate': 1.5642889929533856e-06, 'epoch': 0.75} 75%|███████▍ | 25671/34278 [28:12:44<8:18:49, 3.48s/it] 75%|███████▍ | 25672/34278 [28:12:49<9:05:50, 3.81s/it] {'loss': 0.1293, 'grad_norm': 0.9019518537476776, 'learning_rate': 1.5639457740340674e-06, 'epoch': 0.75} 75%|███████▍ | 25672/34278 [28:12:49<9:05:50, 3.81s/it] 75%|███████▍ | 25673/34278 [28:12:52<8:29:34, 3.55s/it] {'loss': 0.1244, 'grad_norm': 0.7627082783371806, 'learning_rate': 1.563602585791023e-06, 'epoch': 0.75} 75%|███████▍ | 25673/34278 [28:12:52<8:29:34, 3.55s/it] 75%|███████▍ | 25674/34278 [28:12:55<8:10:30, 3.42s/it] {'loss': 0.1335, 'grad_norm': 0.9860020227765464, 'learning_rate': 1.5632594282273129e-06, 'epoch': 0.75} 75%|███████▍ | 25674/34278 [28:12:55<8:10:30, 3.42s/it] 75%|███████▍ | 25675/34278 [28:12:58<7:49:46, 3.28s/it] {'loss': 0.1172, 'grad_norm': 1.3573526095780761, 'learning_rate': 1.5629163013460041e-06, 'epoch': 0.75} 75%|███████▍ | 25675/34278 [28:12:58<7:49:46, 3.28s/it] 75%|███████▍ | 25676/34278 [28:13:01<7:49:34, 3.28s/it] {'loss': 0.1169, 'grad_norm': 0.7911529622078647, 'learning_rate': 1.5625732051501558e-06, 'epoch': 0.75} 75%|███████▍ | 25676/34278 
[28:13:01<7:49:34, 3.28s/it] 75%|███████▍ | 25677/34278 [28:13:04<7:36:48, 3.19s/it] {'loss': 0.1208, 'grad_norm': 0.954631176425323, 'learning_rate': 1.5622301396428351e-06, 'epoch': 0.75} 75%|███████▍ | 25677/34278 [28:13:04<7:36:48, 3.19s/it] 75%|███████▍ | 25678/34278 [28:13:07<7:40:14, 3.21s/it] {'loss': 0.1021, 'grad_norm': 0.7905542458713434, 'learning_rate': 1.5618871048271034e-06, 'epoch': 0.75} 75%|███████▍ | 25678/34278 [28:13:07<7:40:14, 3.21s/it] 75%|███████▍ | 25679/34278 [28:13:11<8:11:34, 3.43s/it] {'loss': 0.1327, 'grad_norm': 0.9653716620721325, 'learning_rate': 1.5615441007060211e-06, 'epoch': 0.75} 75%|███████▍ | 25679/34278 [28:13:11<8:11:34, 3.43s/it] 75%|███████▍ | 25680/34278 [28:13:14<7:39:28, 3.21s/it] {'loss': 0.1096, 'grad_norm': 0.735935900526844, 'learning_rate': 1.561201127282652e-06, 'epoch': 0.75} 75%|███████▍ | 25680/34278 [28:13:14<7:39:28, 3.21s/it] 75%|███████▍ | 25681/34278 [28:13:19<8:52:44, 3.72s/it] {'loss': 0.1068, 'grad_norm': 0.7418271668979863, 'learning_rate': 1.5608581845600606e-06, 'epoch': 0.75} 75%|███████▍ | 25681/34278 [28:13:19<8:52:44, 3.72s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 75%|███████▍ | 25682/34278 [28:13:22<8:27:25, 3.54s/it] {'loss': 0.1155, 'grad_norm': 1.1632935520368335, 'learning_rate': 1.5605152725413058e-06, 'epoch': 0.75} 75%|███████▍ | 25682/34278 [28:13:22<8:27:25, 3.54s/it] 75%|███████▍ | 25683/34278 [28:13:28<10:15:38, 4.30s/it] {'loss': 0.1392, 'grad_norm': 0.8550446078428814, 'learning_rate': 1.5601723912294481e-06, 'epoch': 0.75} 75%|███████▍ | 25683/34278 [28:13:28<10:15:38, 4.30s/it] 75%|███████▍ | 25684/34278 [28:13:31<9:16:22, 3.88s/it] {'loss': 0.1305, 'grad_norm': 0.8621473842462445, 'learning_rate': 1.5598295406275516e-06, 'epoch': 0.75} 75%|███████▍ | 25684/34278 [28:13:31<9:16:22, 3.88s/it] 75%|███████▍ | 25685/34278 [28:13:34<8:45:33, 3.67s/it] {'loss': 0.1012, 'grad_norm': 0.937678709069431, 'learning_rate': 1.559486720738676e-06, 'epoch': 0.75} 75%|███████▍ | 25685/34278 [28:13:34<8:45:33, 3.67s/it] 75%|███████▍ | 25686/34278 [28:13:39<9:31:41, 3.99s/it] {'loss': 0.13, 'grad_norm': 0.7971498399200088, 'learning_rate': 1.5591439315658786e-06, 'epoch': 0.75} 75%|███████▍ | 25686/34278 [28:13:39<9:31:41, 3.99s/it] 75%|███████▍ | 25687/34278 [28:13:42<8:37:12, 3.61s/it] {'loss': 0.1231, 'grad_norm': 0.741866315596296, 'learning_rate': 1.5588011731122254e-06, 'epoch': 0.75} 75%|███████▍ | 25687/34278 [28:13:42<8:37:12, 3.61s/it] 75%|███████▍ | 25688/34278 [28:13:45<8:13:39, 3.45s/it] {'loss': 0.1401, 'grad_norm': 1.1522685624220508, 'learning_rate': 1.5584584453807738e-06, 'epoch': 0.75} 75%|███████▍ | 25688/34278 [28:13:45<8:13:39, 3.45s/it] 75%|███████▍ | 25689/34278 [28:13:48<8:16:58, 3.47s/it] {'loss': 0.1219, 'grad_norm': 0.938849643434187, 'learning_rate': 1.5581157483745824e-06, 'epoch': 0.75} 75%|███████▍ | 25689/34278 [28:13:48<8:16:58, 3.47s/it] 75%|███████▍ | 25690/34278 [28:13:52<8:19:06, 3.49s/it] {'loss': 0.1136, 'grad_norm': 0.9111289872918175, 'learning_rate': 1.5577730820967135e-06, 'epoch': 0.75} 75%|███████▍ | 25690/34278 [28:13:52<8:19:06, 3.49s/it] 
75%|███████▍ | 25691/34278 [28:13:55<8:10:47, 3.43s/it] {'loss': 0.1123, 'grad_norm': 0.8241711797103661, 'learning_rate': 1.557430446550225e-06, 'epoch': 0.75} 75%|███████▍ | 25691/34278 [28:13:55<8:10:47, 3.43s/it] 75%|███████▍ | 25692/34278 [28:14:00<9:14:47, 3.88s/it] {'loss': 0.1126, 'grad_norm': 0.7043010184822692, 'learning_rate': 1.557087841738174e-06, 'epoch': 0.75} 75%|███████▍ | 25692/34278 [28:14:00<9:14:47, 3.88s/it] 75%|███████▍ | 25693/34278 [28:14:04<9:04:00, 3.80s/it] {'loss': 0.1051, 'grad_norm': 0.8381720276354171, 'learning_rate': 1.5567452676636207e-06, 'epoch': 0.75} 75%|███████▍ | 25693/34278 [28:14:04<9:04:00, 3.80s/it] 75%|███████▍ | 25694/34278 [28:14:06<8:22:57, 3.52s/it] {'loss': 0.1097, 'grad_norm': 1.084801681471802, 'learning_rate': 1.5564027243296254e-06, 'epoch': 0.75} 75%|███████▍ | 25694/34278 [28:14:06<8:22:57, 3.52s/it] 75%|███████▍ | 25695/34278 [28:14:10<8:14:34, 3.46s/it] {'loss': 0.1121, 'grad_norm': 1.0063538727901777, 'learning_rate': 1.5560602117392442e-06, 'epoch': 0.75} 75%|███████▍ | 25695/34278 [28:14:10<8:14:34, 3.46s/it] 75%|███████▍ | 25696/34278 [28:14:13<7:51:52, 3.30s/it] {'loss': 0.1241, 'grad_norm': 0.8374868438912568, 'learning_rate': 1.5557177298955339e-06, 'epoch': 0.75} 75%|███████▍ | 25696/34278 [28:14:13<7:51:52, 3.30s/it] 75%|███████▍ | 25697/34278 [28:14:16<7:44:54, 3.25s/it] {'loss': 0.1159, 'grad_norm': 0.9112579423054664, 'learning_rate': 1.5553752788015552e-06, 'epoch': 0.75} 75%|███████▍ | 25697/34278 [28:14:16<7:44:54, 3.25s/it] 75%|███████▍ | 25698/34278 [28:14:22<9:35:18, 4.02s/it] {'loss': 0.1293, 'grad_norm': 0.8229293561206907, 'learning_rate': 1.5550328584603619e-06, 'epoch': 0.75} 75%|███████▍ | 25698/34278 [28:14:22<9:35:18, 4.02s/it] 75%|███████▍ | 25699/34278 [28:14:25<8:53:51, 3.73s/it] {'loss': 0.1296, 'grad_norm': 0.991546629013683, 'learning_rate': 1.554690468875013e-06, 'epoch': 0.75} 75%|███████▍ | 25699/34278 [28:14:25<8:53:51, 3.73s/it] 75%|███████▍ | 25700/34278 
[28:14:28<8:21:06, 3.51s/it] {'loss': 0.1025, 'grad_norm': 0.9612341527354766, 'learning_rate': 1.5543481100485669e-06, 'epoch': 0.75} 75%|███████▍ | 25700/34278 [28:14:28<8:21:06, 3.51s/it] 75%|███████▍ | 25701/34278 [28:14:31<8:01:40, 3.37s/it] {'loss': 0.1311, 'grad_norm': 0.8886960342689915, 'learning_rate': 1.5540057819840782e-06, 'epoch': 0.75} 75%|███████▍ | 25701/34278 [28:14:31<8:01:40, 3.37s/it] 75%|███████▍ | 25702/34278 [28:14:34<8:10:56, 3.43s/it] {'loss': 0.1189, 'grad_norm': 0.8294574979464563, 'learning_rate': 1.5536634846846016e-06, 'epoch': 0.75} 75%|███████▍ | 25702/34278 [28:14:34<8:10:56, 3.43s/it] 75%|███████▍ | 25703/34278 [28:14:37<7:58:04, 3.35s/it] {'loss': 0.1046, 'grad_norm': 0.9520830625817014, 'learning_rate': 1.553321218153196e-06, 'epoch': 0.75} 75%|███████▍ | 25703/34278 [28:14:37<7:58:04, 3.35s/it] 75%|███████▍ | 25704/34278 [28:14:42<8:28:54, 3.56s/it] {'loss': 0.1211, 'grad_norm': 0.775330468701103, 'learning_rate': 1.5529789823929149e-06, 'epoch': 0.75} 75%|███████▍ | 25704/34278 [28:14:42<8:28:54, 3.56s/it] 75%|███████▍ | 25705/34278 [28:14:45<8:19:19, 3.49s/it] {'loss': 0.1128, 'grad_norm': 0.6679047702139579, 'learning_rate': 1.5526367774068158e-06, 'epoch': 0.75} 75%|███████▍ | 25705/34278 [28:14:45<8:19:19, 3.49s/it] 75%|███████▍ | 25706/34278 [28:14:48<8:05:19, 3.40s/it] {'loss': 0.1336, 'grad_norm': 1.1116795057968818, 'learning_rate': 1.5522946031979507e-06, 'epoch': 0.75} 75%|███████▍ | 25706/34278 [28:14:48<8:05:19, 3.40s/it] 75%|███████▍ | 25707/34278 [28:14:51<7:53:15, 3.31s/it] {'loss': 0.1107, 'grad_norm': 0.878106243220114, 'learning_rate': 1.551952459769378e-06, 'epoch': 0.75} 75%|███████▍ | 25707/34278 [28:14:51<7:53:15, 3.31s/it] 75%|███████▍ | 25708/34278 [28:14:55<8:04:40, 3.39s/it] {'loss': 0.115, 'grad_norm': 0.7496572643796967, 'learning_rate': 1.5516103471241512e-06, 'epoch': 0.75} 75%|███████▍ | 25708/34278 [28:14:55<8:04:40, 3.39s/it] 75%|███████▌ | 25709/34278 [28:14:58<7:59:09, 3.36s/it] {'loss': 
0.1185, 'grad_norm': 0.8909588369443133, 'learning_rate': 1.5512682652653221e-06, 'epoch': 0.75} 75%|███████▌ | 25709/34278 [28:14:58<7:59:09, 3.36s/it] 75%|███████▌ | 25710/34278 [28:15:01<7:53:49, 3.32s/it] {'loss': 0.1342, 'grad_norm': 0.9451354533796887, 'learning_rate': 1.5509262141959463e-06, 'epoch': 0.75} 75%|███████▌ | 25710/34278 [28:15:01<7:53:49, 3.32s/it] 75%|███████▌ | 25711/34278 [28:15:06<8:46:17, 3.69s/it] {'loss': 0.1175, 'grad_norm': 0.9042477030205243, 'learning_rate': 1.5505841939190796e-06, 'epoch': 0.75} 75%|███████▌ | 25711/34278 [28:15:06<8:46:17, 3.69s/it] 75%|███████▌ | 25712/34278 [28:15:09<8:11:25, 3.44s/it] {'loss': 0.1111, 'grad_norm': 0.8046544980615903, 'learning_rate': 1.5502422044377741e-06, 'epoch': 0.75} 75%|███████▌ | 25712/34278 [28:15:09<8:11:25, 3.44s/it] 75%|███████▌ | 25713/34278 [28:15:12<8:28:24, 3.56s/it] {'loss': 0.1093, 'grad_norm': 0.6873544487419347, 'learning_rate': 1.549900245755081e-06, 'epoch': 0.75} 75%|███████▌ | 25713/34278 [28:15:13<8:28:24, 3.56s/it] 75%|███████▌ | 25714/34278 [28:15:15<7:49:15, 3.29s/it] {'loss': 0.1243, 'grad_norm': 0.9247468006234043, 'learning_rate': 1.5495583178740563e-06, 'epoch': 0.75} 75%|███████▌ | 25714/34278 [28:15:15<7:49:15, 3.29s/it] 75%|███████▌ | 25715/34278 [28:15:18<7:30:35, 3.16s/it] {'loss': 0.1234, 'grad_norm': 0.8559041359883738, 'learning_rate': 1.5492164207977517e-06, 'epoch': 0.75} 75%|███████▌ | 25715/34278 [28:15:18<7:30:35, 3.16s/it] 75%|███████▌ | 25716/34278 [28:15:24<9:33:13, 4.02s/it] {'loss': 0.1109, 'grad_norm': 0.9897421291896847, 'learning_rate': 1.5488745545292155e-06, 'epoch': 0.75} 75%|███████▌ | 25716/34278 [28:15:24<9:33:13, 4.02s/it] 75%|███████▌ | 25717/34278 [28:15:27<8:47:30, 3.70s/it] {'loss': 0.1304, 'grad_norm': 0.8266209911558882, 'learning_rate': 1.5485327190715066e-06, 'epoch': 0.75} 75%|███████▌ | 25717/34278 [28:15:27<8:47:30, 3.70s/it] 75%|███████▌ | 25718/34278 [28:15:33<10:15:45, 4.32s/it] {'loss': 0.1014, 'grad_norm': 
0.7300099003378026, 'learning_rate': 1.548190914427674e-06, 'epoch': 0.75} 75%|███████▌ | 25718/34278 [28:15:33<10:15:45, 4.32s/it] 75%|███████▌ | 25719/34278 [28:15:36<9:23:09, 3.95s/it] {'loss': 0.0927, 'grad_norm': 0.7944459331619189, 'learning_rate': 1.5478491406007672e-06, 'epoch': 0.75} 75%|███████▌ | 25719/34278 [28:15:36<9:23:09, 3.95s/it] 75%|███████▌ | 25720/34278 [28:15:39<9:07:20, 3.84s/it] {'loss': 0.1291, 'grad_norm': 1.4416304824765536, 'learning_rate': 1.5475073975938409e-06, 'epoch': 0.75} 75%|███████▌ | 25720/34278 [28:15:39<9:07:20, 3.84s/it] 75%|███████▌ | 25721/34278 [28:15:45<10:39:09, 4.48s/it] {'loss': 0.1003, 'grad_norm': 0.7870133310338016, 'learning_rate': 1.5471656854099437e-06, 'epoch': 0.75} 75%|███████▌ | 25721/34278 [28:15:45<10:39:09, 4.48s/it] 75%|███████▌ | 25722/34278 [28:15:49<9:46:24, 4.11s/it] {'loss': 0.122, 'grad_norm': 0.770019211641465, 'learning_rate': 1.546824004052126e-06, 'epoch': 0.75} 75%|███████▌ | 25722/34278 [28:15:49<9:46:24, 4.11s/it] 75%|███████▌ | 25723/34278 [28:15:52<9:11:52, 3.87s/it] {'loss': 0.1153, 'grad_norm': 0.8051129295730975, 'learning_rate': 1.546482353523439e-06, 'epoch': 0.75} 75%|███████▌ | 25723/34278 [28:15:52<9:11:52, 3.87s/it] 75%|███████▌ | 25724/34278 [28:15:55<8:33:00, 3.60s/it] {'loss': 0.1142, 'grad_norm': 0.7747533309883834, 'learning_rate': 1.5461407338269351e-06, 'epoch': 0.75} 75%|███████▌ | 25724/34278 [28:15:55<8:33:00, 3.60s/it] 75%|███████▌ | 25725/34278 [28:15:58<8:31:34, 3.59s/it] {'loss': 0.1003, 'grad_norm': 0.8472187954208377, 'learning_rate': 1.5457991449656618e-06, 'epoch': 0.75} 75%|███████▌ | 25725/34278 [28:15:58<8:31:34, 3.59s/it] 75%|███████▌ | 25726/34278 [28:16:01<8:00:54, 3.37s/it] {'loss': 0.1133, 'grad_norm': 1.1004646868941497, 'learning_rate': 1.545457586942668e-06, 'epoch': 0.75} 75%|███████▌ | 25726/34278 [28:16:01<8:00:54, 3.37s/it] 75%|███████▌ | 25727/34278 [28:16:05<8:09:26, 3.43s/it] {'loss': 0.1044, 'grad_norm': 0.8664687944865257, 'learning_rate': 
1.5451160597610038e-06, 'epoch': 0.75} 75%|███████▌ | 25727/34278 [28:16:05<8:09:26, 3.43s/it]
75%|███████▌ | 25728/34278 [28:16:11<10:08:23, 4.27s/it] {'loss': 0.125, 'grad_norm': 0.8657255289319268, 'learning_rate': 1.5447745634237204e-06, 'epoch': 0.75}
75%|███████▌ | 25729/34278 [28:16:15<9:55:10, 4.18s/it] {'loss': 0.1212, 'grad_norm': 0.7561478075010376, 'learning_rate': 1.5444330979338634e-06, 'epoch': 0.75}
75%|███████▌ | 25730/34278 [28:16:18<9:09:51, 3.86s/it] {'loss': 0.1494, 'grad_norm': 0.8906009988923281, 'learning_rate': 1.544091663294484e-06, 'epoch': 0.75}
75%|███████▌ | 25731/34278 [28:16:22<8:51:32, 3.73s/it] {'loss': 0.1094, 'grad_norm': 0.9576505702480632, 'learning_rate': 1.5437502595086295e-06, 'epoch': 0.75}
75%|███████▌ | 25732/34278 [28:16:28<10:24:48, 4.39s/it] {'loss': 0.1083, 'grad_norm': 0.9966472602789167, 'learning_rate': 1.5434088865793461e-06, 'epoch': 0.75}
75%|███████▌ | 25733/34278 [28:16:33<10:54:45, 4.60s/it] {'loss': 0.1268, 'grad_norm': 0.9673156502653935, 'learning_rate': 1.5430675445096827e-06, 'epoch': 0.75}
75%|███████▌ | 25734/34278 [28:16:36<10:14:55, 4.32s/it] {'loss': 0.1242, 'grad_norm': 0.6494572180029597, 'learning_rate': 1.5427262333026894e-06, 'epoch': 0.75}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0c22865760>
Failed to fetch sample 3646665.
Exception: cannot identify image file <_io.BytesIO object at 0x7f0c22865760>
75%|███████▌ | 25735/34278 [28:16:40<9:36:47, 4.05s/it] {'loss': 0.1096, 'grad_norm': 0.8430400269354585, 'learning_rate': 1.5423849529614098e-06, 'epoch': 0.75}
75%|███████▌ | 25736/34278 [28:16:46<10:58:23, 4.62s/it] {'loss': 0.1364, 'grad_norm': 0.8645013085778063, 'learning_rate': 1.5420437034888914e-06, 'epoch': 0.75}
75%|███████▌ | 25737/34278 [28:16:49<9:46:20, 4.12s/it] {'loss': 0.0997, 'grad_norm': 0.688901586949705, 'learning_rate': 1.5417024848881823e-06, 'epoch': 0.75}
75%|███████▌ | 25738/34278 [28:16:51<8:47:19, 3.70s/it] {'loss': 0.117, 'grad_norm': 0.7500581859235532, 'learning_rate': 1.5413612971623282e-06, 'epoch': 0.75}
75%|███████▌ | 25739/34278 [28:16:55<8:24:47, 3.55s/it] {'loss': 0.1068, 'grad_norm': 1.000258172728012, 'learning_rate': 1.5410201403143726e-06, 'epoch': 0.75}
75%|███████▌ | 25740/34278 [28:17:00<10:04:25, 4.25s/it] {'loss': 0.1053, 'grad_norm': 0.7286092268553375, 'learning_rate': 1.5406790143473644e-06, 'epoch': 0.75}
75%|███████▌ | 25741/34278 [28:17:04<9:43:33, 4.10s/it] {'loss': 0.1085, 'grad_norm': 0.744143623975935, 'learning_rate': 1.5403379192643491e-06, 'epoch': 0.75}
75%|███████▌ | 25742/34278 [28:17:10<11:02:53, 4.66s/it] {'loss': 0.1222, 'grad_norm': 0.9229720704294662, 'learning_rate': 1.5399968550683708e-06, 'epoch': 0.75}
75%|███████▌ | 25743/34278 [28:17:14<10:09:48, 4.29s/it] {'loss': 0.1154, 'grad_norm': 1.0041125224633975, 'learning_rate': 1.5396558217624734e-06, 'epoch': 0.75} 75%|███████▌
| 25743/34278 [28:17:14<10:09:48, 4.29s/it] 75%|███████▌ | 25744/34278 [28:17:18<10:14:40, 4.32s/it] {'loss': 0.1477, 'grad_norm': 0.8483922300350779, 'learning_rate': 1.5393148193497042e-06, 'epoch': 0.75} 75%|███████▌ | 25744/34278 [28:17:18<10:14:40, 4.32s/it] 75%|███████▌ | 25745/34278 [28:17:21<9:21:27, 3.95s/it] {'loss': 0.1295, 'grad_norm': 0.8912519006605253, 'learning_rate': 1.538973847833105e-06, 'epoch': 0.75} 75%|███████▌ | 25745/34278 [28:17:21<9:21:27, 3.95s/it] 75%|███████▌ | 25746/34278 [28:17:26<10:24:35, 4.39s/it] {'loss': 0.1028, 'grad_norm': 0.8236529472409622, 'learning_rate': 1.5386329072157209e-06, 'epoch': 0.75} 75%|███████▌ | 25746/34278 [28:17:27<10:24:35, 4.39s/it] 75%|███████▌ | 25747/34278 [28:17:31<10:33:06, 4.45s/it] {'loss': 0.1516, 'grad_norm': 1.0366472976575478, 'learning_rate': 1.5382919975005971e-06, 'epoch': 0.75} 75%|███████▌ | 25747/34278 [28:17:31<10:33:06, 4.45s/it] 75%|███████▌ | 25748/34278 [28:17:34<9:36:07, 4.05s/it] {'loss': 0.1157, 'grad_norm': 0.8515181377093403, 'learning_rate': 1.5379511186907764e-06, 'epoch': 0.75} 75%|███████▌ | 25748/34278 [28:17:34<9:36:07, 4.05s/it] 75%|███████▌ | 25749/34278 [28:17:37<8:49:13, 3.72s/it] {'loss': 0.1113, 'grad_norm': 2.0746256570564485, 'learning_rate': 1.5376102707893e-06, 'epoch': 0.75} 75%|███████▌ | 25749/34278 [28:17:37<8:49:13, 3.72s/it] 75%|███████▌ | 25750/34278 [28:17:40<8:08:26, 3.44s/it] {'loss': 0.1159, 'grad_norm': 0.9266128271380434, 'learning_rate': 1.5372694537992138e-06, 'epoch': 0.75} 75%|███████▌ | 25750/34278 [28:17:40<8:08:26, 3.44s/it] 75%|███████▌ | 25751/34278 [28:17:45<9:38:04, 4.07s/it] {'loss': 0.1202, 'grad_norm': 1.0543881817697456, 'learning_rate': 1.536928667723558e-06, 'epoch': 0.75} 75%|███████▌ | 25751/34278 [28:17:45<9:38:04, 4.07s/it] 75%|███████▌ | 25752/34278 [28:17:48<8:51:35, 3.74s/it] {'loss': 0.1232, 'grad_norm': 0.7582579627222545, 'learning_rate': 1.5365879125653776e-06, 'epoch': 0.75} 75%|███████▌ | 25752/34278 [28:17:48<8:51:35, 
3.74s/it] 75%|███████▌ | 25753/34278 [28:17:52<8:25:46, 3.56s/it] {'loss': 0.123, 'grad_norm': 0.9280920830413915, 'learning_rate': 1.5362471883277125e-06, 'epoch': 0.75} 75%|███████▌ | 25753/34278 [28:17:52<8:25:46, 3.56s/it] 75%|███████▌ | 25754/34278 [28:17:55<8:18:46, 3.51s/it] {'loss': 0.1095, 'grad_norm': 0.8118954715549233, 'learning_rate': 1.5359064950136065e-06, 'epoch': 0.75} 75%|███████▌ | 25754/34278 [28:17:55<8:18:46, 3.51s/it] 75%|███████▌ | 25755/34278 [28:17:58<7:52:39, 3.33s/it] {'loss': 0.1275, 'grad_norm': 0.8382301200004377, 'learning_rate': 1.5355658326261008e-06, 'epoch': 0.75} 75%|███████▌ | 25755/34278 [28:17:58<7:52:39, 3.33s/it] 75%|███████▌ | 25756/34278 [28:18:01<7:36:15, 3.21s/it] {'loss': 0.1191, 'grad_norm': 0.9030350085480212, 'learning_rate': 1.5352252011682351e-06, 'epoch': 0.75} 75%|███████▌ | 25756/34278 [28:18:01<7:36:15, 3.21s/it] 75%|███████▌ | 25757/34278 [28:18:04<7:35:08, 3.20s/it] {'loss': 0.1097, 'grad_norm': 0.8534362217001674, 'learning_rate': 1.5348846006430513e-06, 'epoch': 0.75} 75%|███████▌ | 25757/34278 [28:18:04<7:35:08, 3.20s/it] 75%|███████▌ | 25758/34278 [28:18:07<7:40:29, 3.24s/it] {'loss': 0.1158, 'grad_norm': 1.0283059911674362, 'learning_rate': 1.534544031053592e-06, 'epoch': 0.75} 75%|███████▌ | 25758/34278 [28:18:07<7:40:29, 3.24s/it] 75%|███████▌ | 25759/34278 [28:18:11<7:40:55, 3.25s/it] {'loss': 0.1497, 'grad_norm': 0.8814878203165918, 'learning_rate': 1.5342034924028948e-06, 'epoch': 0.75} 75%|███████▌ | 25759/34278 [28:18:11<7:40:55, 3.25s/it] 75%|███████▌ | 25760/34278 [28:18:17<9:36:09, 4.06s/it] {'loss': 0.1358, 'grad_norm': 0.9472910776939061, 'learning_rate': 1.5338629846940033e-06, 'epoch': 0.75} 75%|███████▌ | 25760/34278 [28:18:17<9:36:09, 4.06s/it] 75%|███████▌ | 25761/34278 [28:18:20<8:59:43, 3.80s/it] {'loss': 0.1311, 'grad_norm': 0.934545581378259, 'learning_rate': 1.533522507929956e-06, 'epoch': 0.75} 75%|███████▌ | 25761/34278 [28:18:20<8:59:43, 3.80s/it] 75%|███████▌ | 25762/34278 
[28:18:23<8:41:41, 3.68s/it] {'loss': 0.1388, 'grad_norm': 3.804019459227602, 'learning_rate': 1.53318206211379e-06, 'epoch': 0.75} 75%|███████▌ | 25762/34278 [28:18:23<8:41:41, 3.68s/it] 75%|███████▌ | 25763/34278 [28:18:27<8:33:01, 3.62s/it] {'loss': 0.114, 'grad_norm': 0.9187604451123866, 'learning_rate': 1.532841647248547e-06, 'epoch': 0.75} 75%|███████▌ | 25763/34278 [28:18:27<8:33:01, 3.62s/it] 75%|███████▌ | 25764/34278 [28:18:30<8:14:23, 3.48s/it] {'loss': 0.1076, 'grad_norm': 0.9454141071279012, 'learning_rate': 1.5325012633372677e-06, 'epoch': 0.75} 75%|███████▌ | 25764/34278 [28:18:30<8:14:23, 3.48s/it] 75%|███████▌ | 25765/34278 [28:18:33<8:15:59, 3.50s/it] {'loss': 0.1071, 'grad_norm': 1.022386653222536, 'learning_rate': 1.53216091038299e-06, 'epoch': 0.75} 75%|███████▌ | 25765/34278 [28:18:33<8:15:59, 3.50s/it] 75%|███████▌ | 25766/34278 [28:18:40<10:12:02, 4.31s/it] {'loss': 0.1164, 'grad_norm': 0.8616648625056953, 'learning_rate': 1.5318205883887494e-06, 'epoch': 0.75} 75%|███████▌ | 25766/34278 [28:18:40<10:12:02, 4.31s/it] 75%|███████▌ | 25767/34278 [28:18:43<9:20:10, 3.95s/it] {'loss': 0.1135, 'grad_norm': 0.9621778674013503, 'learning_rate': 1.5314802973575888e-06, 'epoch': 0.75} 75%|███████▌ | 25767/34278 [28:18:43<9:20:10, 3.95s/it] 75%|███████▌ | 25768/34278 [28:18:46<8:51:18, 3.75s/it] {'loss': 0.1051, 'grad_norm': 0.62209175496305, 'learning_rate': 1.531140037292544e-06, 'epoch': 0.75} 75%|███████▌ | 25768/34278 [28:18:46<8:51:18, 3.75s/it] 75%|███████▌ | 25769/34278 [28:18:49<8:22:33, 3.54s/it] {'loss': 0.1156, 'grad_norm': 0.8775747135041679, 'learning_rate': 1.5307998081966507e-06, 'epoch': 0.75} 75%|███████▌ | 25769/34278 [28:18:49<8:22:33, 3.54s/it] 75%|███████▌ | 25770/34278 [28:18:52<8:01:06, 3.39s/it] {'loss': 0.1232, 'grad_norm': 1.1942064510018175, 'learning_rate': 1.530459610072949e-06, 'epoch': 0.75} 75%|███████▌ | 25770/34278 [28:18:52<8:01:06, 3.39s/it] 75%|███████▌ | 25771/34278 [28:18:55<8:00:56, 3.39s/it] {'loss': 0.1081, 
'grad_norm': 0.8855733617127285, 'learning_rate': 1.5301194429244776e-06, 'epoch': 0.75} 75%|███████▌ | 25771/34278 [28:18:55<8:00:56, 3.39s/it] 75%|███████▌ | 25772/34278 [28:18:59<7:51:48, 3.33s/it] {'loss': 0.1179, 'grad_norm': 0.7607391829487901, 'learning_rate': 1.529779306754271e-06, 'epoch': 0.75} 75%|███████▌ | 25772/34278 [28:18:59<7:51:48, 3.33s/it] 75%|███████▌ | 25773/34278 [28:19:02<8:14:16, 3.49s/it] {'loss': 0.1143, 'grad_norm': 0.8636723525629825, 'learning_rate': 1.5294392015653648e-06, 'epoch': 0.75} 75%|███████▌ | 25773/34278 [28:19:02<8:14:16, 3.49s/it] 75%|███████▌ | 25774/34278 [28:19:06<8:13:47, 3.48s/it] {'loss': 0.1219, 'grad_norm': 0.8722640448327522, 'learning_rate': 1.5290991273607986e-06, 'epoch': 0.75} 75%|███████▌ | 25774/34278 [28:19:06<8:13:47, 3.48s/it] 75%|███████▌ | 25775/34278 [28:19:12<9:43:50, 4.12s/it] {'loss': 0.1253, 'grad_norm': 1.0854114211235604, 'learning_rate': 1.5287590841436056e-06, 'epoch': 0.75} 75%|███████▌ | 25775/34278 [28:19:12<9:43:50, 4.12s/it] 75%|███████▌ | 25776/34278 [28:19:15<9:33:53, 4.05s/it] {'loss': 0.1192, 'grad_norm': 0.793304124044723, 'learning_rate': 1.5284190719168224e-06, 'epoch': 0.75} 75%|███████▌ | 25776/34278 [28:19:15<9:33:53, 4.05s/it] 75%|███████▌ | 25777/34278 [28:19:19<9:25:54, 3.99s/it] {'loss': 0.1179, 'grad_norm': 0.7337154705818196, 'learning_rate': 1.5280790906834863e-06, 'epoch': 0.75} 75%|███████▌ | 25777/34278 [28:19:19<9:25:54, 3.99s/it] 75%|███████▌ | 25778/34278 [28:19:23<9:15:02, 3.92s/it] {'loss': 0.1141, 'grad_norm': 0.871914108863201, 'learning_rate': 1.527739140446632e-06, 'epoch': 0.75} 75%|███████▌ | 25778/34278 [28:19:23<9:15:02, 3.92s/it] 75%|███████▌ | 25779/34278 [28:19:28<9:40:14, 4.10s/it] {'loss': 0.1256, 'grad_norm': 0.8699708569000806, 'learning_rate': 1.527399221209292e-06, 'epoch': 0.75} 75%|███████▌ | 25779/34278 [28:19:28<9:40:14, 4.10s/it] 75%|███████▌ | 25780/34278 [28:19:31<9:26:47, 4.00s/it] {'loss': 0.1174, 'grad_norm': 0.8517358303980148, 
'learning_rate': 1.5270593329745036e-06, 'epoch': 0.75} 75%|███████▌ | 25780/34278 [28:19:31<9:26:47, 4.00s/it]
75%|███████▌ | 25781/34278 [28:19:36<9:55:06, 4.20s/it] {'loss': 0.1224, 'grad_norm': 1.2603983556341987, 'learning_rate': 1.5267194757452996e-06, 'epoch': 0.75}
75%|███████▌ | 25782/34278 [28:19:40<9:40:14, 4.10s/it] {'loss': 0.1174, 'grad_norm': 1.229955663058453, 'learning_rate': 1.5263796495247162e-06, 'epoch': 0.75}
75%|███████▌ | 25783/34278 [28:19:45<10:36:32, 4.50s/it] {'loss': 0.1135, 'grad_norm': 0.8220658004708723, 'learning_rate': 1.5260398543157851e-06, 'epoch': 0.75}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
75%|███████▌ | 25784/34278 [28:19:48<9:31:09, 4.03s/it] {'loss': 0.1361, 'grad_norm': 0.8753495724032763, 'learning_rate': 1.5257000901215418e-06, 'epoch': 0.75}
75%|███████▌ | 25785/34278 [28:19:52<9:02:17, 3.83s/it] {'loss': 0.1067, 'grad_norm': 1.0133299363899728, 'learning_rate': 1.5253603569450192e-06, 'epoch': 0.75}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47e9c1f150>
Failed to fetch sample 3268273.
Exception: cannot identify image file <_io.BytesIO object at 0x7f47e9c1f150>
75%|███████▌ | 25786/34278 [28:19:55<8:33:54, 3.63s/it] {'loss': 0.1084, 'grad_norm': 1.0725323846787793, 'learning_rate': 1.5250206547892477e-06, 'epoch': 0.75}
75%|███████▌ | 25787/34278 [28:20:01<10:27:08, 4.43s/it] {'loss': 0.1073, 'grad_norm': 0.7877331615597004, 'learning_rate': 1.524680983657263e-06, 'epoch': 0.75}
75%|███████▌ | 25788/34278 [28:20:07<11:19:47, 4.80s/it] {'loss': 0.1302, 'grad_norm': 0.8543230161109976, 'learning_rate': 1.5243413435520977e-06, 'epoch': 0.75}
75%|███████▌ | 25789/34278 [28:20:12<11:46:14, 4.99s/it] {'loss': 0.1082, 'grad_norm': 0.8261126604003302, 'learning_rate': 1.5240017344767837e-06, 'epoch': 0.75}
75%|███████▌ | 25790/34278 [28:20:16<10:54:38, 4.63s/it] {'loss': 0.1374, 'grad_norm': 1.011447016471589, 'learning_rate': 1.5236621564343507e-06, 'epoch': 0.75}
75%|███████▌ | 25791/34278
[28:20:19<10:00:23, 4.24s/it] {'loss': 0.1168, 'grad_norm': 1.0970751413172217, 'learning_rate': 1.5233226094278336e-06, 'epoch': 0.75} 75%|███████▌ | 25791/34278 [28:20:19<10:00:23, 4.24s/it] 75%|███████▌ | 25792/34278 [28:20:23<9:29:59, 4.03s/it] {'loss': 0.1066, 'grad_norm': 1.154937269989941, 'learning_rate': 1.5229830934602608e-06, 'epoch': 0.75} 75%|███████▌ | 25792/34278 [28:20:23<9:29:59, 4.03s/it] 75%|███████▌ | 25793/34278 [28:20:27<9:46:15, 4.15s/it] {'loss': 0.0936, 'grad_norm': 0.7369734811166199, 'learning_rate': 1.5226436085346646e-06, 'epoch': 0.75} 75%|███████▌ | 25793/34278 [28:20:27<9:46:15, 4.15s/it] 75%|███████▌ | 25794/34278 [28:20:30<8:55:54, 3.79s/it] {'loss': 0.1172, 'grad_norm': 0.8012708710065065, 'learning_rate': 1.5223041546540778e-06, 'epoch': 0.75} 75%|███████▌ | 25794/34278 [28:20:30<8:55:54, 3.79s/it] 75%|███████▌ | 25795/34278 [28:20:33<8:07:16, 3.45s/it] {'loss': 0.1302, 'grad_norm': 0.8504496220103561, 'learning_rate': 1.5219647318215297e-06, 'epoch': 0.75} 75%|███████▌ | 25795/34278 [28:20:33<8:07:16, 3.45s/it] 75%|███████▌ | 25796/34278 [28:20:36<8:04:51, 3.43s/it] {'loss': 0.102, 'grad_norm': 1.3388398262619507, 'learning_rate': 1.5216253400400483e-06, 'epoch': 0.75} 75%|███████▌ | 25796/34278 [28:20:36<8:04:51, 3.43s/it] 75%|███████▌ | 25797/34278 [28:20:42<9:58:12, 4.23s/it] {'loss': 0.1097, 'grad_norm': 0.8398517099374087, 'learning_rate': 1.5212859793126672e-06, 'epoch': 0.75} 75%|███████▌ | 25797/34278 [28:20:42<9:58:12, 4.23s/it] 75%|███████▌ | 25798/34278 [28:20:46<9:26:16, 4.01s/it] {'loss': 0.1333, 'grad_norm': 0.7939681099286744, 'learning_rate': 1.5209466496424146e-06, 'epoch': 0.75} 75%|███████▌ | 25798/34278 [28:20:46<9:26:16, 4.01s/it] 75%|███████▌ | 25799/34278 [28:20:52<10:48:19, 4.59s/it] {'loss': 0.1291, 'grad_norm': 0.7374726875389844, 'learning_rate': 1.5206073510323177e-06, 'epoch': 0.75} 75%|███████▌ | 25799/34278 [28:20:52<10:48:19, 4.59s/it] 75%|███████▌ | 25800/34278 [28:20:55<9:54:37, 4.21s/it] 
{'loss': 0.1097, 'grad_norm': 0.8541748450938101, 'learning_rate': 1.5202680834854084e-06, 'epoch': 0.75} 75%|███████▌ | 25800/34278 [28:20:55<9:54:37, 4.21s/it] 75%|███████▌ | 25801/34278 [28:21:00<10:22:37, 4.41s/it] {'loss': 0.1375, 'grad_norm': 0.7624552554381788, 'learning_rate': 1.5199288470047163e-06, 'epoch': 0.75} 75%|███████▌ | 25801/34278 [28:21:00<10:22:37, 4.41s/it] 75%|███████▌ | 25802/34278 [28:21:03<9:24:46, 4.00s/it] {'loss': 0.1114, 'grad_norm': 0.8605363225941227, 'learning_rate': 1.5195896415932687e-06, 'epoch': 0.75} 75%|███████▌ | 25802/34278 [28:21:03<9:24:46, 4.00s/it] 75%|███████▌ | 25803/34278 [28:21:06<8:35:26, 3.65s/it] {'loss': 0.0925, 'grad_norm': 0.7502923124699972, 'learning_rate': 1.5192504672540919e-06, 'epoch': 0.75} 75%|███████▌ | 25803/34278 [28:21:06<8:35:26, 3.65s/it] 75%|███████▌ | 25804/34278 [28:21:09<8:16:27, 3.52s/it] {'loss': 0.1117, 'grad_norm': 0.8045466862669897, 'learning_rate': 1.5189113239902182e-06, 'epoch': 0.75} 75%|███████▌ | 25804/34278 [28:21:09<8:16:27, 3.52s/it] 75%|███████▌ | 25805/34278 [28:21:14<9:33:15, 4.06s/it] {'loss': 0.1147, 'grad_norm': 1.120178268443537, 'learning_rate': 1.5185722118046714e-06, 'epoch': 0.75} 75%|███████▌ | 25805/34278 [28:21:14<9:33:15, 4.06s/it] 75%|███████▌ | 25806/34278 [28:21:20<10:51:18, 4.61s/it] {'loss': 0.1303, 'grad_norm': 0.9345635236915236, 'learning_rate': 1.518233130700481e-06, 'epoch': 0.75} 75%|███████▌ | 25806/34278 [28:21:20<10:51:18, 4.61s/it] 75%|███████▌ | 25807/34278 [28:21:23<9:47:34, 4.16s/it] {'loss': 0.1096, 'grad_norm': 0.7488181350558006, 'learning_rate': 1.5178940806806753e-06, 'epoch': 0.75} 75%|███████▌ | 25807/34278 [28:21:23<9:47:34, 4.16s/it] 75%|███████▌ | 25808/34278 [28:21:28<10:28:57, 4.46s/it] {'loss': 0.1092, 'grad_norm': 0.7496632544713565, 'learning_rate': 1.5175550617482804e-06, 'epoch': 0.75} 75%|███████▌ | 25808/34278 [28:21:29<10:28:57, 4.46s/it] 75%|███████▌ | 25809/34278 [28:21:31<9:14:35, 3.93s/it] {'loss': 0.1195, 'grad_norm': 
0.7321865320211671, 'learning_rate': 1.5172160739063208e-06, 'epoch': 0.75} 75%|███████▌ | 25809/34278 [28:21:31<9:14:35, 3.93s/it] 75%|███████▌ | 25810/34278 [28:21:34<8:34:26, 3.65s/it] {'loss': 0.1236, 'grad_norm': 0.9067910773689672, 'learning_rate': 1.516877117157826e-06, 'epoch': 0.75} 75%|███████▌ | 25810/34278 [28:21:34<8:34:26, 3.65s/it] 75%|███████▌ | 25811/34278 [28:21:40<10:18:40, 4.38s/it] {'loss': 0.1087, 'grad_norm': 0.8273544180677533, 'learning_rate': 1.5165381915058196e-06, 'epoch': 0.75} 75%|███████▌ | 25811/34278 [28:21:40<10:18:40, 4.38s/it] 75%|███████▌ | 25812/34278 [28:21:46<11:20:17, 4.82s/it] {'loss': 0.1203, 'grad_norm': 0.7558130969418564, 'learning_rate': 1.51619929695333e-06, 'epoch': 0.75} 75%|███████▌ | 25812/34278 [28:21:46<11:20:17, 4.82s/it] 75%|███████▌ | 25813/34278 [28:21:49<9:54:59, 4.22s/it] {'loss': 0.1267, 'grad_norm': 0.8595675943393801, 'learning_rate': 1.51586043350338e-06, 'epoch': 0.75} 75%|███████▌ | 25813/34278 [28:21:49<9:54:59, 4.22s/it] 75%|███████▌ | 25814/34278 [28:21:52<9:16:15, 3.94s/it] {'loss': 0.0972, 'grad_norm': 0.6977803243866763, 'learning_rate': 1.5155216011589979e-06, 'epoch': 0.75} 75%|███████▌ | 25814/34278 [28:21:52<9:16:15, 3.94s/it] 75%|███████▌ | 25815/34278 [28:21:57<9:57:54, 4.24s/it] {'loss': 0.1192, 'grad_norm': 0.8183157120963988, 'learning_rate': 1.5151827999232071e-06, 'epoch': 0.75} 75%|███████▌ | 25815/34278 [28:21:57<9:57:54, 4.24s/it] 75%|███████▌ | 25816/34278 [28:22:03<11:17:06, 4.80s/it] {'loss': 0.1492, 'grad_norm': 0.9025714857647811, 'learning_rate': 1.5148440297990308e-06, 'epoch': 0.75} 75%|███████▌ | 25816/34278 [28:22:03<11:17:06, 4.80s/it] 75%|███████▌ | 25817/34278 [28:22:07<10:11:01, 4.33s/it] {'loss': 0.1206, 'grad_norm': 2.137719978205146, 'learning_rate': 1.5145052907894946e-06, 'epoch': 0.75} 75%|███████▌ | 25817/34278 [28:22:07<10:11:01, 4.33s/it] 75%|███████▌ | 25818/34278 [28:22:10<9:23:24, 4.00s/it] {'loss': 0.1065, 'grad_norm': 0.7518584121211901, 
'learning_rate': 1.5141665828976253e-06, 'epoch': 0.75} 75%|███████▌ | 25818/34278 [28:22:10<9:23:24, 4.00s/it] 75%|███████▌ | 25819/34278 [28:22:13<8:33:31, 3.64s/it] {'loss': 0.11, 'grad_norm': 0.8352424914910912, 'learning_rate': 1.5138279061264445e-06, 'epoch': 0.75} 75%|███████▌ | 25819/34278 [28:22:13<8:33:31, 3.64s/it] 75%|███████▌ | 25820/34278 [28:22:16<8:18:19, 3.54s/it] {'loss': 0.108, 'grad_norm': 0.9322873060983801, 'learning_rate': 1.5134892604789743e-06, 'epoch': 0.75} 75%|███████▌ | 25820/34278 [28:22:16<8:18:19, 3.54s/it] 75%|███████▌ | 25821/34278 [28:22:22<9:50:10, 4.19s/it] {'loss': 0.1193, 'grad_norm': 1.1939548122290728, 'learning_rate': 1.5131506459582412e-06, 'epoch': 0.75} 75%|███████▌ | 25821/34278 [28:22:22<9:50:10, 4.19s/it] 75%|███████▌ | 25822/34278 [28:22:26<9:56:39, 4.23s/it] {'loss': 0.1042, 'grad_norm': 0.7629191767789621, 'learning_rate': 1.5128120625672648e-06, 'epoch': 0.75} 75%|███████▌ | 25822/34278 [28:22:26<9:56:39, 4.23s/it] 75%|███████▌ | 25823/34278 [28:22:32<11:11:54, 4.77s/it] {'loss': 0.1052, 'grad_norm': 0.8692873201149913, 'learning_rate': 1.5124735103090704e-06, 'epoch': 0.75} 75%|███████▌ | 25823/34278 [28:22:32<11:11:54, 4.77s/it] 75%|███████▌ | 25824/34278 [28:22:36<10:33:27, 4.50s/it] {'loss': 0.1018, 'grad_norm': 0.934424415248695, 'learning_rate': 1.5121349891866815e-06, 'epoch': 0.75} 75%|███████▌ | 25824/34278 [28:22:36<10:33:27, 4.50s/it] 75%|███████▌ | 25825/34278 [28:22:40<10:04:52, 4.29s/it] {'loss': 0.1441, 'grad_norm': 1.0896673051291843, 'learning_rate': 1.5117964992031187e-06, 'epoch': 0.75} 75%|███████▌ | 25825/34278 [28:22:40<10:04:52, 4.29s/it] 75%|███████▌ | 25826/34278 [28:22:43<9:24:24, 4.01s/it] {'loss': 0.1235, 'grad_norm': 1.206064428837, 'learning_rate': 1.5114580403614022e-06, 'epoch': 0.75} 75%|███████▌ | 25826/34278 [28:22:43<9:24:24, 4.01s/it] 75%|███████▌ | 25827/34278 [28:22:47<9:40:10, 4.12s/it] {'loss': 0.112, 'grad_norm': 0.8417440204406492, 'learning_rate': 1.5111196126645573e-06, 
'epoch': 0.75} 75%|███████▌ | 25827/34278 [28:22:47<9:40:10, 4.12s/it] 75%|███████▌ | 25828/34278 [28:22:51<9:12:39, 3.92s/it] {'loss': 0.1006, 'grad_norm': 0.8429514644846525, 'learning_rate': 1.5107812161156037e-06, 'epoch': 0.75} 75%|███████▌ | 25828/34278 [28:22:51<9:12:39, 3.92s/it] 75%|███████▌ | 25829/34278 [28:22:57<10:37:51, 4.53s/it] {'loss': 0.125, 'grad_norm': 0.6798431981677711, 'learning_rate': 1.5104428507175612e-06, 'epoch': 0.75} 75%|███████▌ | 25829/34278 [28:22:57<10:37:51, 4.53s/it] 75%|███████▌ | 25830/34278 [28:23:00<10:02:00, 4.28s/it] {'loss': 0.146, 'grad_norm': 0.786217281828536, 'learning_rate': 1.5101045164734512e-06, 'epoch': 0.75} 75%|███████▌ | 25830/34278 [28:23:00<10:02:00, 4.28s/it] 75%|███████▌ | 25831/34278 [28:23:04<9:29:42, 4.05s/it] {'loss': 0.132, 'grad_norm': 0.909765805922423, 'learning_rate': 1.5097662133862973e-06, 'epoch': 0.75} 75%|███████▌ | 25831/34278 [28:23:04<9:29:42, 4.05s/it] 75%|███████▌ | 25832/34278 [28:23:09<10:29:39, 4.47s/it] {'loss': 0.1107, 'grad_norm': 0.8993856071037685, 'learning_rate': 1.5094279414591168e-06, 'epoch': 0.75} 75%|███████▌ | 25832/34278 [28:23:09<10:29:39, 4.47s/it] 75%|███████▌ | 25833/34278 [28:23:13<10:01:50, 4.28s/it] {'loss': 0.1085, 'grad_norm': 0.8117467983951188, 'learning_rate': 1.509089700694929e-06, 'epoch': 0.75} 75%|███████▌ | 25833/34278 [28:23:13<10:01:50, 4.28s/it] 75%|███████▌ | 25834/34278 [28:23:16<9:19:19, 3.97s/it] {'loss': 0.1259, 'grad_norm': 0.8235728459135495, 'learning_rate': 1.5087514910967572e-06, 'epoch': 0.75} 75%|███████▌ | 25834/34278 [28:23:16<9:19:19, 3.97s/it] 75%|███████▌ | 25835/34278 [28:23:20<9:05:36, 3.88s/it] {'loss': 0.1109, 'grad_norm': 0.8557785771479315, 'learning_rate': 1.508413312667616e-06, 'epoch': 0.75} 75%|███████▌ | 25835/34278 [28:23:20<9:05:36, 3.88s/it] 75%|███████▌ | 25836/34278 [28:23:24<8:52:17, 3.78s/it] {'loss': 0.116, 'grad_norm': 0.8349318552744862, 'learning_rate': 1.508075165410528e-06, 'epoch': 0.75} 75%|███████▌ | 
25836/34278 [28:23:24<8:52:17, 3.78s/it] 75%|███████▌ | 25837/34278 [28:23:30<10:49:27, 4.62s/it] {'loss': 0.1194, 'grad_norm': 0.8397457214782929, 'learning_rate': 1.5077370493285126e-06, 'epoch': 0.75} 75%|███████▌ | 25837/34278 [28:23:30<10:49:27, 4.62s/it] 75%|███████▌ | 25838/34278 [28:23:34<10:15:23, 4.37s/it] {'loss': 0.1275, 'grad_norm': 0.7883389338674813, 'learning_rate': 1.5073989644245873e-06, 'epoch': 0.75} 75%|███████▌ | 25838/34278 [28:23:34<10:15:23, 4.37s/it] 75%|███████▌ | 25839/34278 [28:23:38<10:02:11, 4.28s/it] {'loss': 0.1026, 'grad_norm': 0.8064327679277885, 'learning_rate': 1.5070609107017687e-06, 'epoch': 0.75} 75%|███████▌ | 25839/34278 [28:23:38<10:02:11, 4.28s/it] 75%|███████▌ | 25840/34278 [28:23:42<9:36:33, 4.10s/it] {'loss': 0.1176, 'grad_norm': 0.8328954087145269, 'learning_rate': 1.506722888163078e-06, 'epoch': 0.75} 75%|███████▌ | 25840/34278 [28:23:42<9:36:33, 4.10s/it] 75%|███████▌ | 25841/34278 [28:23:48<10:57:03, 4.67s/it] {'loss': 0.1199, 'grad_norm': 0.918903002541291, 'learning_rate': 1.5063848968115297e-06, 'epoch': 0.75} 75%|███████▌ | 25841/34278 [28:23:48<10:57:03, 4.67s/it] 75%|███████▌ | 25842/34278 [28:23:51<10:07:36, 4.32s/it] {'loss': 0.116, 'grad_norm': 0.9218694076987992, 'learning_rate': 1.506046936650145e-06, 'epoch': 0.75} 75%|███████▌ | 25842/34278 [28:23:51<10:07:36, 4.32s/it] 75%|███████▌ | 25843/34278 [28:23:54<9:15:43, 3.95s/it] {'loss': 0.0949, 'grad_norm': 0.8553981890892228, 'learning_rate': 1.5057090076819375e-06, 'epoch': 0.75} 75%|███████▌ | 25843/34278 [28:23:54<9:15:43, 3.95s/it] 75%|███████▌ | 25844/34278 [28:23:59<9:45:14, 4.16s/it] {'loss': 0.1019, 'grad_norm': 0.8016816482886957, 'learning_rate': 1.5053711099099272e-06, 'epoch': 0.75} 75%|███████▌ | 25844/34278 [28:23:59<9:45:14, 4.16s/it] 75%|███████▌ | 25845/34278 [28:24:03<9:17:52, 3.97s/it] {'loss': 0.1027, 'grad_norm': 0.8313417626973173, 'learning_rate': 1.5050332433371295e-06, 'epoch': 0.75} 75%|███████▌ | 25845/34278 [28:24:03<9:17:52, 
3.97s/it] 75%|███████▌ | 25846/34278 [28:24:05<8:27:47, 3.61s/it] {'loss': 0.0942, 'grad_norm': 0.8996554314666295, 'learning_rate': 1.5046954079665588e-06, 'epoch': 0.75} 75%|███████▌ | 25846/34278 [28:24:05<8:27:47, 3.61s/it] 75%|███████▌ | 25847/34278 [28:24:10<9:02:48, 3.86s/it] {'loss': 0.1277, 'grad_norm': 0.8516424226755879, 'learning_rate': 1.5043576038012337e-06, 'epoch': 0.75} 75%|███████▌ | 25847/34278 [28:24:10<9:02:48, 3.86s/it] 75%|███████▌ | 25848/34278 [28:24:13<8:37:34, 3.68s/it] {'loss': 0.1181, 'grad_norm': 0.9941469091678935, 'learning_rate': 1.5040198308441707e-06, 'epoch': 0.75} 75%|███████▌ | 25848/34278 [28:24:13<8:37:34, 3.68s/it] 75%|███████▌ | 25849/34278 [28:24:16<8:26:58, 3.61s/it] {'loss': 0.1065, 'grad_norm': 0.8230225930256071, 'learning_rate': 1.503682089098384e-06, 'epoch': 0.75} 75%|███████▌ | 25849/34278 [28:24:17<8:26:58, 3.61s/it] 75%|███████▌ | 25850/34278 [28:24:19<7:54:19, 3.38s/it] {'loss': 0.1049, 'grad_norm': 0.9212499227895967, 'learning_rate': 1.5033443785668873e-06, 'epoch': 0.75} 75%|███████▌ | 25850/34278 [28:24:19<7:54:19, 3.38s/it] 75%|███████▌ | 25851/34278 [28:24:23<7:53:17, 3.37s/it] {'loss': 0.1368, 'grad_norm': 0.9520060923931164, 'learning_rate': 1.5030066992526993e-06, 'epoch': 0.75} 75%|███████▌ | 25851/34278 [28:24:23<7:53:17, 3.37s/it] 75%|███████▌ | 25852/34278 [28:24:26<7:33:51, 3.23s/it] {'loss': 0.1013, 'grad_norm': 0.8640732809284455, 'learning_rate': 1.502669051158831e-06, 'epoch': 0.75} 75%|███████▌ | 25852/34278 [28:24:26<7:33:51, 3.23s/it] 75%|███████▌ | 25853/34278 [28:24:29<7:48:26, 3.34s/it] {'loss': 0.1161, 'grad_norm': 0.9303947086138226, 'learning_rate': 1.5023314342882984e-06, 'epoch': 0.75} 75%|███████▌ | 25853/34278 [28:24:29<7:48:26, 3.34s/it] 75%|███████▌ | 25854/34278 [28:24:32<7:31:39, 3.22s/it] {'loss': 0.1266, 'grad_norm': 0.8011869651117773, 'learning_rate': 1.5019938486441172e-06, 'epoch': 0.75} 75%|███████▌ | 25854/34278 [28:24:32<7:31:39, 3.22s/it] 75%|███████▌ | 25855/34278 
[28:24:35<7:38:40, 3.27s/it] {'loss': 0.1017, 'grad_norm': 0.9567966624041573, 'learning_rate': 1.5016562942293e-06, 'epoch': 0.75} 75%|███████▌ | 25855/34278 [28:24:36<7:38:40, 3.27s/it] 75%|███████▌ | 25856/34278 [28:24:39<7:50:17, 3.35s/it] {'loss': 0.1069, 'grad_norm': 0.8704921474973745, 'learning_rate': 1.5013187710468584e-06, 'epoch': 0.75} 75%|███████▌ | 25856/34278 [28:24:39<7:50:17, 3.35s/it] 75%|███████▌ | 25857/34278 [28:24:42<7:28:39, 3.20s/it] {'loss': 0.0988, 'grad_norm': 0.7797878699950223, 'learning_rate': 1.5009812790998096e-06, 'epoch': 0.75} 75%|███████▌ | 25857/34278 [28:24:42<7:28:39, 3.20s/it] 75%|███████▌ | 25858/34278 [28:24:48<9:26:13, 4.03s/it] {'loss': 0.1409, 'grad_norm': 0.8203057622573032, 'learning_rate': 1.500643818391165e-06, 'epoch': 0.75} 75%|███████▌ | 25858/34278 [28:24:48<9:26:13, 4.03s/it] 75%|███████▌ | 25859/34278 [28:24:51<8:41:25, 3.72s/it] {'loss': 0.1073, 'grad_norm': 0.951587561368629, 'learning_rate': 1.500306388923935e-06, 'epoch': 0.75} 75%|███████▌ | 25859/34278 [28:24:51<8:41:25, 3.72s/it] 75%|███████▌ | 25860/34278 [28:24:54<8:10:28, 3.50s/it] {'loss': 0.1158, 'grad_norm': 1.061516934732331, 'learning_rate': 1.4999689907011338e-06, 'epoch': 0.75} 75%|███████▌ | 25860/34278 [28:24:54<8:10:28, 3.50s/it] 75%|███████▌ | 25861/34278 [28:24:59<9:10:58, 3.93s/it] {'loss': 0.11, 'grad_norm': 0.9483004185178417, 'learning_rate': 1.4996316237257758e-06, 'epoch': 0.75} 75%|███████▌ | 25861/34278 [28:24:59<9:10:58, 3.93s/it] 75%|███████▌ | 25862/34278 [28:25:02<8:57:21, 3.83s/it] {'loss': 0.1019, 'grad_norm': 0.7892898201827739, 'learning_rate': 1.4992942880008716e-06, 'epoch': 0.75} 75%|███████▌ | 25862/34278 [28:25:02<8:57:21, 3.83s/it] 75%|███████▌ | 25863/34278 [28:25:06<8:30:38, 3.64s/it] {'loss': 0.1114, 'grad_norm': 0.8486407833114757, 'learning_rate': 1.4989569835294298e-06, 'epoch': 0.75} 75%|███████▌ | 25863/34278 [28:25:06<8:30:38, 3.64s/it] 75%|███████▌ | 25864/34278 [28:25:09<8:09:26, 3.49s/it] {'loss': 0.111, 
'grad_norm': 0.8545793243946734, 'learning_rate': 1.4986197103144661e-06, 'epoch': 0.75} 75%|███████▌ | 25864/34278 [28:25:09<8:09:26, 3.49s/it] 75%|███████▌ | 25865/34278 [28:25:12<7:59:55, 3.42s/it] {'loss': 0.1285, 'grad_norm': 0.9917273192168473, 'learning_rate': 1.4982824683589887e-06, 'epoch': 0.75} 75%|███████▌ | 25865/34278 [28:25:12<7:59:55, 3.42s/it] 75%|███████▌ | 25866/34278 [28:25:15<7:42:45, 3.30s/it] {'loss': 0.1201, 'grad_norm': 1.0136428433058013, 'learning_rate': 1.4979452576660091e-06, 'epoch': 0.75} 75%|███████▌ | 25866/34278 [28:25:15<7:42:45, 3.30s/it] 75%|███████▌ | 25867/34278 [28:25:18<7:24:35, 3.17s/it] {'loss': 0.0963, 'grad_norm': 1.142683193062564, 'learning_rate': 1.4976080782385399e-06, 'epoch': 0.75} 75%|███████▌ | 25867/34278 [28:25:18<7:24:35, 3.17s/it] 75%|███████▌ | 25868/34278 [28:25:24<9:18:02, 3.98s/it] {'loss': 0.1179, 'grad_norm': 1.0490554723344994, 'learning_rate': 1.4972709300795896e-06, 'epoch': 0.75} 75%|███████▌ | 25868/34278 [28:25:24<9:18:02, 3.98s/it] 75%|███████▌ | 25869/34278 [28:25:27<8:43:19, 3.73s/it] {'loss': 0.1063, 'grad_norm': 1.0838232578951599, 'learning_rate': 1.4969338131921667e-06, 'epoch': 0.75} 75%|███████▌ | 25869/34278 [28:25:27<8:43:19, 3.73s/it] 75%|███████▌ | 25870/34278 [28:25:30<8:33:24, 3.66s/it] {'loss': 0.1102, 'grad_norm': 0.8307847314099764, 'learning_rate': 1.4965967275792842e-06, 'epoch': 0.75} 75%|███████▌ | 25870/34278 [28:25:30<8:33:24, 3.66s/it] 75%|███████▌ | 25871/34278 [28:25:33<7:58:12, 3.41s/it] {'loss': 0.115, 'grad_norm': 1.338595340756564, 'learning_rate': 1.4962596732439484e-06, 'epoch': 0.75} 75%|███████▌ | 25871/34278 [28:25:33<7:58:12, 3.41s/it] 75%|███████▌ | 25872/34278 [28:25:39<9:49:01, 4.20s/it] {'loss': 0.1166, 'grad_norm': 1.0822023601561124, 'learning_rate': 1.495922650189171e-06, 'epoch': 0.75} 75%|███████▌ | 25872/34278 [28:25:39<9:49:01, 4.20s/it] 75%|███████▌ | 25873/34278 [28:25:43<9:14:19, 3.96s/it] {'loss': 0.1385, 'grad_norm': 0.7975328518759391, 
'learning_rate': 1.4955856584179584e-06, 'epoch': 0.75} 75%|███████▌ | 25873/34278 [28:25:43<9:14:19, 3.96s/it] 75%|███████▌ | 25874/34278 [28:25:47<9:21:59, 4.01s/it] {'loss': 0.1126, 'grad_norm': 0.7821845278332362, 'learning_rate': 1.495248697933322e-06, 'epoch': 0.75} 75%|███████▌ | 25874/34278 [28:25:47<9:21:59, 4.01s/it] 75%|███████▌ | 25875/34278 [28:25:50<8:47:25, 3.77s/it] {'loss': 0.101, 'grad_norm': 0.7082873143847945, 'learning_rate': 1.4949117687382686e-06, 'epoch': 0.75} 75%|███████▌ | 25875/34278 [28:25:50<8:47:25, 3.77s/it] 75%|███████▌ | 25876/34278 [28:25:54<9:12:46, 3.95s/it] {'loss': 0.1053, 'grad_norm': 0.7938271501070359, 'learning_rate': 1.4945748708358044e-06, 'epoch': 0.75} 75%|███████▌ | 25876/34278 [28:25:54<9:12:46, 3.95s/it] 75%|███████▌ | 25877/34278 [28:26:01<10:49:39, 4.64s/it] {'loss': 0.1171, 'grad_norm': 0.7773730416861138, 'learning_rate': 1.4942380042289388e-06, 'epoch': 0.75} 75%|███████▌ | 25877/34278 [28:26:01<10:49:39, 4.64s/it] 75%|███████▌ | 25878/34278 [28:26:04<9:56:56, 4.26s/it] {'loss': 0.1002, 'grad_norm': 1.0313441772972234, 'learning_rate': 1.4939011689206812e-06, 'epoch': 0.75} 75%|███████▌ | 25878/34278 [28:26:04<9:56:56, 4.26s/it] 75%|███████▌ | 25879/34278 [28:26:08<9:28:23, 4.06s/it] {'loss': 0.1287, 'grad_norm': 0.818432201028089, 'learning_rate': 1.493564364914037e-06, 'epoch': 0.75} 75%|███████▌ | 25879/34278 [28:26:08<9:28:23, 4.06s/it] 76%|███████▌ | 25880/34278 [28:26:11<9:00:34, 3.86s/it] {'loss': 0.1269, 'grad_norm': 0.9182194321868798, 'learning_rate': 1.4932275922120116e-06, 'epoch': 0.76} 76%|███████▌ | 25880/34278 [28:26:11<9:00:34, 3.86s/it] 76%|███████▌ | 25881/34278 [28:26:14<8:27:53, 3.63s/it] {'loss': 0.1031, 'grad_norm': 0.8024847123486553, 'learning_rate': 1.4928908508176148e-06, 'epoch': 0.76} 76%|███████▌ | 25881/34278 [28:26:14<8:27:53, 3.63s/it] 76%|███████▌ | 25882/34278 [28:26:20<10:06:40, 4.34s/it] {'loss': 0.1176, 'grad_norm': 1.0148061125312728, 'learning_rate': 
1.4925541407338511e-06, 'epoch': 0.76} 76%|███████▌ | 25882/34278 [28:26:20<10:06:40, 4.34s/it] 76%|███████▌ | 25883/34278 [28:26:23<9:18:57, 3.99s/it] {'loss': 0.1023, 'grad_norm': 0.783349701504457, 'learning_rate': 1.4922174619637236e-06, 'epoch': 0.76} 76%|███████▌ | 25883/34278 [28:26:23<9:18:57, 3.99s/it] 76%|███████▌ | 25884/34278 [28:26:26<8:32:55, 3.67s/it] {'loss': 0.1452, 'grad_norm': 0.7475352753241996, 'learning_rate': 1.4918808145102443e-06, 'epoch': 0.76} 76%|███████▌ | 25884/34278 [28:26:26<8:32:55, 3.67s/it] 76%|███████▌ | 25885/34278 [28:26:31<9:27:53, 4.06s/it] {'loss': 0.1081, 'grad_norm': 0.8235592086340293, 'learning_rate': 1.4915441983764156e-06, 'epoch': 0.76} 76%|███████▌ | 25885/34278 [28:26:31<9:27:53, 4.06s/it] 76%|███████▌ | 25886/34278 [28:26:34<8:46:24, 3.76s/it] {'loss': 0.1188, 'grad_norm': 1.0793896475449292, 'learning_rate': 1.4912076135652414e-06, 'epoch': 0.76} 76%|███████▌ | 25886/34278 [28:26:34<8:46:24, 3.76s/it] 76%|███████▌ | 25887/34278 [28:26:37<8:16:21, 3.55s/it] {'loss': 0.1166, 'grad_norm': 0.8527877192086889, 'learning_rate': 1.4908710600797293e-06, 'epoch': 0.76} 76%|███████▌ | 25887/34278 [28:26:37<8:16:21, 3.55s/it] 76%|███████▌ | 25888/34278 [28:26:40<7:59:33, 3.43s/it] {'loss': 0.1279, 'grad_norm': 1.025990503369907, 'learning_rate': 1.490534537922883e-06, 'epoch': 0.76} 76%|███████▌ | 25888/34278 [28:26:40<7:59:33, 3.43s/it] 76%|███████▌ | 25889/34278 [28:26:43<7:44:09, 3.32s/it] {'loss': 0.1035, 'grad_norm': 0.7099068887493515, 'learning_rate': 1.4901980470977046e-06, 'epoch': 0.76} 76%|███████▌ | 25889/34278 [28:26:43<7:44:09, 3.32s/it] 76%|███████▌ | 25890/34278 [28:26:47<7:36:37, 3.27s/it] {'loss': 0.1263, 'grad_norm': 0.6294984277144369, 'learning_rate': 1.4898615876072002e-06, 'epoch': 0.76} 76%|███████▌ | 25890/34278 [28:26:47<7:36:37, 3.27s/it] 76%|███████▌ | 25891/34278 [28:26:52<9:19:33, 4.00s/it] {'loss': 0.1024, 'grad_norm': 1.0830088936709952, 'learning_rate': 1.4895251594543758e-06, 'epoch': 0.76} 
76%|███████▌ | 25891/34278 [28:26:52<9:19:33, 4.00s/it] 76%|███████▌ | 25892/34278 [28:26:56<8:57:18, 3.84s/it] {'loss': 0.108, 'grad_norm': 0.879881989210489, 'learning_rate': 1.4891887626422324e-06, 'epoch': 0.76} 76%|███████▌ | 25892/34278 [28:26:56<8:57:18, 3.84s/it] 76%|███████▌ | 25893/34278 [28:26:59<8:28:03, 3.64s/it] {'loss': 0.1211, 'grad_norm': 0.8406366078337004, 'learning_rate': 1.4888523971737716e-06, 'epoch': 0.76} 76%|███████▌ | 25893/34278 [28:26:59<8:28:03, 3.64s/it] 76%|███████▌ | 25894/34278 [28:27:02<8:23:34, 3.60s/it] {'loss': 0.1065, 'grad_norm': 0.7158975523182622, 'learning_rate': 1.4885160630520008e-06, 'epoch': 0.76} 76%|███████▌ | 25894/34278 [28:27:02<8:23:34, 3.60s/it] 76%|███████▌ | 25895/34278 [28:27:06<8:10:42, 3.51s/it] {'loss': 0.1197, 'grad_norm': 0.9213164725454261, 'learning_rate': 1.488179760279918e-06, 'epoch': 0.76} 76%|███████▌ | 25895/34278 [28:27:06<8:10:42, 3.51s/it] 76%|███████▌ | 25896/34278 [28:27:08<7:33:08, 3.24s/it] {'loss': 0.1069, 'grad_norm': 2.0983595262270427, 'learning_rate': 1.4878434888605287e-06, 'epoch': 0.76} 76%|███████▌ | 25896/34278 [28:27:08<7:33:08, 3.24s/it] 76%|███████▌ | 25897/34278 [28:27:12<7:28:06, 3.21s/it] {'loss': 0.1069, 'grad_norm': 0.9421713784874025, 'learning_rate': 1.4875072487968356e-06, 'epoch': 0.76} 76%|███████▌ | 25897/34278 [28:27:12<7:28:06, 3.21s/it] 76%|███████▌ | 25898/34278 [28:27:15<7:27:05, 3.20s/it] {'loss': 0.1099, 'grad_norm': 1.0693569782422778, 'learning_rate': 1.4871710400918388e-06, 'epoch': 0.76} 76%|███████▌ | 25898/34278 [28:27:15<7:27:05, 3.20s/it] 76%|███████▌ | 25899/34278 [28:27:18<7:43:40, 3.32s/it] {'loss': 0.1015, 'grad_norm': 0.8511918821329131, 'learning_rate': 1.4868348627485397e-06, 'epoch': 0.76} 76%|███████▌ | 25899/34278 [28:27:18<7:43:40, 3.32s/it] 76%|███████▌ | 25900/34278 [28:27:24<9:30:41, 4.09s/it] {'loss': 0.0939, 'grad_norm': 0.6910395200974965, 'learning_rate': 1.4864987167699414e-06, 'epoch': 0.76} 76%|███████▌ | 25900/34278 
[28:27:24<9:30:41, 4.09s/it] 76%|███████▌ | 25901/34278 [28:27:28<9:18:42, 4.00s/it] {'loss': 0.1056, 'grad_norm': 1.0408751846096094, 'learning_rate': 1.486162602159042e-06, 'epoch': 0.76} 76%|███████▌ | 25901/34278 [28:27:28<9:18:42, 4.00s/it] 76%|███████▌ | 25902/34278 [28:27:31<8:48:53, 3.79s/it] {'loss': 0.1044, 'grad_norm': 0.7004601690901944, 'learning_rate': 1.485826518918846e-06, 'epoch': 0.76} 76%|███████▌ | 25902/34278 [28:27:31<8:48:53, 3.79s/it] 76%|███████▌ | 25903/34278 [28:27:34<8:09:01, 3.50s/it] {'loss': 0.1255, 'grad_norm': 1.0082745657940422, 'learning_rate': 1.4854904670523496e-06, 'epoch': 0.76} 76%|███████▌ | 25903/34278 [28:27:34<8:09:01, 3.50s/it] 76%|███████▌ | 25904/34278 [28:27:40<9:52:20, 4.24s/it] {'loss': 0.1157, 'grad_norm': 0.8292697853817166, 'learning_rate': 1.485154446562558e-06, 'epoch': 0.76} 76%|███████▌ | 25904/34278 [28:27:40<9:52:20, 4.24s/it] 76%|███████▌ | 25905/34278 [28:27:43<9:04:10, 3.90s/it] {'loss': 0.1225, 'grad_norm': 0.8205664785554989, 'learning_rate': 1.4848184574524677e-06, 'epoch': 0.76} 76%|███████▌ | 25905/34278 [28:27:43<9:04:10, 3.90s/it] 76%|███████▌ | 25906/34278 [28:27:46<8:20:58, 3.59s/it] {'loss': 0.1262, 'grad_norm': 1.1649608083028422, 'learning_rate': 1.4844824997250779e-06, 'epoch': 0.76} 76%|███████▌ | 25906/34278 [28:27:46<8:20:58, 3.59s/it] 76%|███████▌ | 25907/34278 [28:27:49<8:09:11, 3.51s/it] {'loss': 0.1155, 'grad_norm': 1.0107865491889776, 'learning_rate': 1.4841465733833887e-06, 'epoch': 0.76} 76%|███████▌ | 25907/34278 [28:27:49<8:09:11, 3.51s/it] 76%|███████▌ | 25908/34278 [28:27:53<8:09:21, 3.51s/it] {'loss': 0.1002, 'grad_norm': 0.7543127553830585, 'learning_rate': 1.4838106784304012e-06, 'epoch': 0.76} 76%|███████▌ | 25908/34278 [28:27:53<8:09:21, 3.51s/it] 76%|███████▌ | 25909/34278 [28:27:56<7:53:10, 3.39s/it] {'loss': 0.1158, 'grad_norm': 0.9196363893053334, 'learning_rate': 1.483474814869113e-06, 'epoch': 0.76} 76%|███████▌ | 25909/34278 [28:27:56<7:53:10, 3.39s/it] 76%|███████▌ 
| 25910/34278 [28:28:00<8:01:25, 3.45s/it] {'loss': 0.1007, 'grad_norm': 5.341613293993254, 'learning_rate': 1.4831389827025206e-06, 'epoch': 0.76} 76%|███████▌ | 25910/34278 [28:28:00<8:01:25, 3.45s/it] 76%|███████▌ | 25911/34278 [28:28:03<8:08:00, 3.50s/it] {'loss': 0.12, 'grad_norm': 0.9742554329408882, 'learning_rate': 1.4828031819336254e-06, 'epoch': 0.76} 76%|███████▌ | 25911/34278 [28:28:03<8:08:00, 3.50s/it] 76%|███████▌ | 25912/34278 [28:28:07<8:15:26, 3.55s/it] {'loss': 0.1218, 'grad_norm': 0.7953343134436859, 'learning_rate': 1.4824674125654232e-06, 'epoch': 0.76} 76%|███████▌ | 25912/34278 [28:28:07<8:15:26, 3.55s/it] 76%|███████▌ | 25913/34278 [28:28:10<7:49:07, 3.36s/it] {'loss': 0.1056, 'grad_norm': 0.8716245200533763, 'learning_rate': 1.4821316746009096e-06, 'epoch': 0.76} 76%|███████▌ | 25913/34278 [28:28:10<7:49:07, 3.36s/it] 76%|███████▌ | 25914/34278 [28:28:13<7:26:19, 3.20s/it] {'loss': 0.088, 'grad_norm': 0.9618030116541052, 'learning_rate': 1.4817959680430876e-06, 'epoch': 0.76} 76%|███████▌ | 25914/34278 [28:28:13<7:26:19, 3.20s/it] 76%|███████▌ | 25915/34278 [28:28:16<7:49:08, 3.37s/it] {'loss': 0.1179, 'grad_norm': 0.8652819971298802, 'learning_rate': 1.4814602928949512e-06, 'epoch': 0.76} 76%|███████▌ | 25915/34278 [28:28:16<7:49:08, 3.37s/it] 76%|███████▌ | 25916/34278 [28:28:20<7:58:39, 3.43s/it] {'loss': 0.1302, 'grad_norm': 0.8854919138178449, 'learning_rate': 1.4811246491594961e-06, 'epoch': 0.76} 76%|███████▌ | 25916/34278 [28:28:20<7:58:39, 3.43s/it] 76%|███████▌ | 25917/34278 [28:28:23<7:42:35, 3.32s/it] {'loss': 0.0977, 'grad_norm': 0.8830455795101136, 'learning_rate': 1.4807890368397215e-06, 'epoch': 0.76} 76%|███████▌ | 25917/34278 [28:28:23<7:42:35, 3.32s/it] 76%|███████▌ | 25918/34278 [28:28:28<8:48:40, 3.79s/it] {'loss': 0.1076, 'grad_norm': 0.9216145671200741, 'learning_rate': 1.4804534559386208e-06, 'epoch': 0.76} 76%|███████▌ | 25918/34278 [28:28:28<8:48:40, 3.79s/it] 76%|███████▌ | 25919/34278 [28:28:31<8:37:13, 
3.71s/it] {'loss': 0.1256, 'grad_norm': 0.8000667135237715, 'learning_rate': 1.480117906459193e-06, 'epoch': 0.76} 76%|███████▌ | 25919/34278 [28:28:31<8:37:13, 3.71s/it] 76%|███████▌ | 25920/34278 [28:28:34<7:55:42, 3.42s/it] {'loss': 0.1058, 'grad_norm': 0.9024547885912507, 'learning_rate': 1.4797823884044303e-06, 'epoch': 0.76} 76%|███████▌ | 25920/34278 [28:28:34<7:55:42, 3.42s/it] 76%|███████▌ | 25921/34278 [28:28:37<7:38:06, 3.29s/it] {'loss': 0.1022, 'grad_norm': 0.7338654652081568, 'learning_rate': 1.4794469017773327e-06, 'epoch': 0.76} 76%|███████▌ | 25921/34278 [28:28:37<7:38:06, 3.29s/it] 76%|███████▌ | 25922/34278 [28:28:41<7:45:33, 3.34s/it] {'loss': 0.1317, 'grad_norm': 0.9438720342464741, 'learning_rate': 1.479111446580892e-06, 'epoch': 0.76} 76%|███████▌ | 25922/34278 [28:28:41<7:45:33, 3.34s/it] 76%|███████▌ | 25923/34278 [28:28:44<7:40:24, 3.31s/it] {'loss': 0.1132, 'grad_norm': 0.8413841232525172, 'learning_rate': 1.4787760228181019e-06, 'epoch': 0.76} 76%|███████▌ | 25923/34278 [28:28:44<7:40:24, 3.31s/it] 76%|███████▌ | 25924/34278 [28:28:47<7:26:40, 3.21s/it] {'loss': 0.1201, 'grad_norm': 0.9747519063979391, 'learning_rate': 1.4784406304919596e-06, 'epoch': 0.76} 76%|███████▌ | 25924/34278 [28:28:47<7:26:40, 3.21s/it] 76%|███████▌ | 25925/34278 [28:28:53<9:11:03, 3.96s/it] {'loss': 0.0901, 'grad_norm': 0.9040490081693991, 'learning_rate': 1.4781052696054598e-06, 'epoch': 0.76} 76%|███████▌ | 25925/34278 [28:28:53<9:11:03, 3.96s/it] 76%|███████▌ | 25926/34278 [28:28:55<8:27:15, 3.64s/it] {'loss': 0.1268, 'grad_norm': 0.8826366885308079, 'learning_rate': 1.477769940161594e-06, 'epoch': 0.76} 76%|███████▌ | 25926/34278 [28:28:55<8:27:15, 3.64s/it] 76%|███████▌ | 25927/34278 [28:28:59<8:23:33, 3.62s/it] {'loss': 0.1015, 'grad_norm': 0.9184658452872857, 'learning_rate': 1.477434642163359e-06, 'epoch': 0.76} 76%|███████▌ | 25927/34278 [28:28:59<8:23:33, 3.62s/it] 76%|███████▌ | 25928/34278 [28:29:02<8:04:15, 3.48s/it] {'loss': 0.1212, 'grad_norm': 
1.0181008044302222, 'learning_rate': 1.4770993756137465e-06, 'epoch': 0.76} 76%|███████▌ | 25928/34278 [28:29:02<8:04:15, 3.48s/it] 76%|███████▌ | 25929/34278 [28:29:07<8:56:38, 3.86s/it] {'loss': 0.1025, 'grad_norm': 0.9308072159314915, 'learning_rate': 1.4767641405157485e-06, 'epoch': 0.76} 76%|███████▌ | 25929/34278 [28:29:07<8:56:38, 3.86s/it] 76%|███████▌ | 25930/34278 [28:29:11<8:56:43, 3.86s/it] {'loss': 0.121, 'grad_norm': 0.8788407820851127, 'learning_rate': 1.476428936872359e-06, 'epoch': 0.76} 76%|███████▌ | 25930/34278 [28:29:11<8:56:43, 3.86s/it] 76%|███████▌ | 25931/34278 [28:29:14<8:52:04, 3.82s/it] {'loss': 0.1288, 'grad_norm': 1.0523315667493367, 'learning_rate': 1.4760937646865718e-06, 'epoch': 0.76} 76%|███████▌ | 25931/34278 [28:29:14<8:52:04, 3.82s/it] 76%|███████▌ | 25932/34278 [28:29:18<8:49:07, 3.80s/it] {'loss': 0.1102, 'grad_norm': 0.806332663983592, 'learning_rate': 1.475758623961379e-06, 'epoch': 0.76} 76%|███████▌ | 25932/34278 [28:29:18<8:49:07, 3.80s/it] 76%|███████▌ | 25933/34278 [28:29:24<10:10:17, 4.39s/it] {'loss': 0.1155, 'grad_norm': 0.7671064953869315, 'learning_rate': 1.4754235146997704e-06, 'epoch': 0.76} 76%|███████▌ | 25933/34278 [28:29:24<10:10:17, 4.39s/it] 76%|███████▌ | 25934/34278 [28:29:29<10:15:47, 4.43s/it] {'loss': 0.1172, 'grad_norm': 1.0927368440905616, 'learning_rate': 1.4750884369047403e-06, 'epoch': 0.76} 76%|███████▌ | 25934/34278 [28:29:29<10:15:47, 4.43s/it] 76%|███████▌ | 25935/34278 [28:29:31<9:12:47, 3.98s/it] {'loss': 0.1021, 'grad_norm': 0.8445207583002338, 'learning_rate': 1.4747533905792794e-06, 'epoch': 0.76} 76%|███████▌ | 25935/34278 [28:29:31<9:12:47, 3.98s/it] 76%|███████▌ | 25936/34278 [28:29:34<8:32:32, 3.69s/it] {'loss': 0.0965, 'grad_norm': 0.6645077128838399, 'learning_rate': 1.474418375726377e-06, 'epoch': 0.76} 76%|███████▌ | 25936/34278 [28:29:34<8:32:32, 3.69s/it] 76%|███████▌ | 25937/34278 [28:29:37<8:05:45, 3.49s/it] {'loss': 0.1181, 'grad_norm': 0.9671532615444046, 'learning_rate': 
1.4740833923490262e-06, 'epoch': 0.76} 76%|███████▌ | 25937/34278 [28:29:37<8:05:45, 3.49s/it] 76%|███████▌ | 25938/34278 [28:29:41<7:57:04, 3.43s/it] {'loss': 0.1432, 'grad_norm': 0.9902850061524677, 'learning_rate': 1.4737484404502178e-06, 'epoch': 0.76} 76%|███████▌ | 25938/34278 [28:29:41<7:57:04, 3.43s/it] 76%|███████▌ | 25939/34278 [28:29:44<7:42:54, 3.33s/it] {'loss': 0.0995, 'grad_norm': 1.0415156810652546, 'learning_rate': 1.4734135200329425e-06, 'epoch': 0.76} 76%|███████▌ | 25939/34278 [28:29:44<7:42:54, 3.33s/it] 76%|███████▌ | 25940/34278 [28:29:48<8:21:12, 3.61s/it] {'loss': 0.1335, 'grad_norm': 0.8475533043562927, 'learning_rate': 1.473078631100187e-06, 'epoch': 0.76} 76%|███████▌ | 25940/34278 [28:29:48<8:21:12, 3.61s/it] 76%|███████▌ | 25941/34278 [28:29:54<10:06:42, 4.37s/it] {'loss': 0.1102, 'grad_norm': 0.8573282963470593, 'learning_rate': 1.472743773654946e-06, 'epoch': 0.76} 76%|███████▌ | 25941/34278 [28:29:54<10:06:42, 4.37s/it] 76%|███████▌ | 25942/34278 [28:29:57<9:16:44, 4.01s/it] {'loss': 0.1218, 'grad_norm': 0.9241080034264542, 'learning_rate': 1.4724089477002047e-06, 'epoch': 0.76} 76%|███████▌ | 25942/34278 [28:29:57<9:16:44, 4.01s/it] 76%|███████▌ | 25943/34278 [28:30:03<10:20:53, 4.47s/it] {'loss': 0.105, 'grad_norm': 0.9145944415032274, 'learning_rate': 1.4720741532389537e-06, 'epoch': 0.76} 76%|███████▌ | 25943/34278 [28:30:03<10:20:53, 4.47s/it] 76%|███████▌ | 25944/34278 [28:30:06<9:26:42, 4.08s/it] {'loss': 0.1223, 'grad_norm': 0.8711946635114528, 'learning_rate': 1.4717393902741845e-06, 'epoch': 0.76} 76%|███████▌ | 25944/34278 [28:30:06<9:26:42, 4.08s/it] 76%|███████▌ | 25945/34278 [28:30:12<10:32:12, 4.55s/it] {'loss': 0.1271, 'grad_norm': 0.9409995913225305, 'learning_rate': 1.4714046588088838e-06, 'epoch': 0.76} 76%|███████▌ | 25945/34278 [28:30:12<10:32:12, 4.55s/it] 76%|███████▌ | 25946/34278 [28:30:18<11:41:35, 5.05s/it] {'loss': 0.115, 'grad_norm': 0.75000581011883, 'learning_rate': 1.4710699588460382e-06, 'epoch': 
0.76} 76%|███████▌ | 25946/34278 [28:30:18<11:41:35, 5.05s/it] 76%|███████▌ | 25947/34278 [28:30:23<11:55:31, 5.15s/it] {'loss': 0.1, 'grad_norm': 0.8327671343984938, 'learning_rate': 1.4707352903886395e-06, 'epoch': 0.76} 76%|███████▌ | 25947/34278 [28:30:23<11:55:31, 5.15s/it] 76%|███████▌ | 25948/34278 [28:30:27<10:54:04, 4.71s/it] {'loss': 0.1186, 'grad_norm': 0.8913646526156245, 'learning_rate': 1.4704006534396714e-06, 'epoch': 0.76} 76%|███████▌ | 25948/34278 [28:30:27<10:54:04, 4.71s/it] 76%|███████▌ | 25949/34278 [28:30:30<9:34:39, 4.14s/it] {'loss': 0.1023, 'grad_norm': 1.0481898711688422, 'learning_rate': 1.4700660480021263e-06, 'epoch': 0.76} 76%|███████▌ | 25949/34278 [28:30:30<9:34:39, 4.14s/it] 76%|███████▌ | 25950/34278 [28:30:36<10:49:43, 4.68s/it] {'loss': 0.1009, 'grad_norm': 0.8521890379544377, 'learning_rate': 1.4697314740789864e-06, 'epoch': 0.76} 76%|███████▌ | 25950/34278 [28:30:36<10:49:43, 4.68s/it] 76%|███████▌ | 25951/34278 [28:30:41<11:05:30, 4.80s/it] {'loss': 0.138, 'grad_norm': 1.126152932194546, 'learning_rate': 1.4693969316732426e-06, 'epoch': 0.76} 76%|███████▌ | 25951/34278 [28:30:41<11:05:30, 4.80s/it] 76%|███████▌ | 25952/34278 [28:30:44<9:41:41, 4.19s/it] {'loss': 0.1306, 'grad_norm': 0.9845809426788137, 'learning_rate': 1.4690624207878807e-06, 'epoch': 0.76} 76%|███████▌ | 25952/34278 [28:30:44<9:41:41, 4.19s/it] 76%|███████▌ | 25953/34278 [28:30:50<10:53:30, 4.71s/it] {'loss': 0.1197, 'grad_norm': 0.673575236417537, 'learning_rate': 1.4687279414258848e-06, 'epoch': 0.76} 76%|███████▌ | 25953/34278 [28:30:50<10:53:30, 4.71s/it] 76%|███████▌ | 25954/34278 [28:30:53<10:03:29, 4.35s/it] {'loss': 0.104, 'grad_norm': 0.9049576524294846, 'learning_rate': 1.4683934935902428e-06, 'epoch': 0.76} 76%|███████▌ | 25954/34278 [28:30:53<10:03:29, 4.35s/it] 76%|███████▌ | 25955/34278 [28:30:56<9:17:03, 4.02s/it] {'loss': 0.1081, 'grad_norm': 0.7645593047819226, 'learning_rate': 1.4680590772839427e-06, 'epoch': 0.76} 76%|███████▌ | 
25955/34278 [28:30:56<9:17:03, 4.02s/it] 76%|███████▌ | 25956/34278 [28:30:59<8:36:30, 3.72s/it] {'loss': 0.1153, 'grad_norm': 0.8204293226318649, 'learning_rate': 1.4677246925099659e-06, 'epoch': 0.76} 76%|███████▌ | 25956/34278 [28:30:59<8:36:30, 3.72s/it] 76%|███████▌ | 25957/34278 [28:31:02<7:57:43, 3.44s/it] {'loss': 0.1447, 'grad_norm': 0.8188709015858062, 'learning_rate': 1.4673903392713018e-06, 'epoch': 0.76} 76%|███████▌ | 25957/34278 [28:31:02<7:57:43, 3.44s/it] 76%|███████▌ | 25958/34278 [28:31:06<7:55:37, 3.43s/it] {'loss': 0.1197, 'grad_norm': 1.2373567565441503, 'learning_rate': 1.4670560175709331e-06, 'epoch': 0.76} 76%|███████▌ | 25958/34278 [28:31:06<7:55:37, 3.43s/it] 76%|███████▌ | 25959/34278 [28:31:11<9:07:20, 3.95s/it] {'loss': 0.1269, 'grad_norm': 0.8767735977746076, 'learning_rate': 1.4667217274118433e-06, 'epoch': 0.76} 76%|███████▌ | 25959/34278 [28:31:11<9:07:20, 3.95s/it] 76%|███████▌ | 25960/34278 [28:31:17<10:44:10, 4.65s/it] {'loss': 0.1212, 'grad_norm': 0.9925801970461513, 'learning_rate': 1.4663874687970187e-06, 'epoch': 0.76} 76%|███████▌ | 25960/34278 [28:31:17<10:44:10, 4.65s/it] 76%|███████▌ | 25961/34278 [28:31:20<9:24:15, 4.07s/it] {'loss': 0.108, 'grad_norm': 0.6686022034013344, 'learning_rate': 1.4660532417294448e-06, 'epoch': 0.76} 76%|███████▌ | 25961/34278 [28:31:20<9:24:15, 4.07s/it] 76%|███████▌ | 25962/34278 [28:31:23<9:01:08, 3.90s/it] {'loss': 0.1171, 'grad_norm': 2.0859127547442027, 'learning_rate': 1.4657190462121035e-06, 'epoch': 0.76} 76%|███████▌ | 25962/34278 [28:31:23<9:01:08, 3.90s/it] 76%|███████▌ | 25963/34278 [28:31:26<8:17:08, 3.59s/it] {'loss': 0.1372, 'grad_norm': 0.8538955615247369, 'learning_rate': 1.4653848822479778e-06, 'epoch': 0.76} 76%|███████▌ | 25963/34278 [28:31:26<8:17:08, 3.59s/it] 76%|███████▌ | 25964/34278 [28:31:29<7:54:15, 3.42s/it] {'loss': 0.0862, 'grad_norm': 1.1753289851422752, 'learning_rate': 1.4650507498400535e-06, 'epoch': 0.76} 76%|███████▌ | 25964/34278 [28:31:29<7:54:15, 
3.42s/it] 76%|███████▌ | 25965/34278 [28:31:34<8:40:00, 3.75s/it] {'loss': 0.1168, 'grad_norm': 1.1818555683389145, 'learning_rate': 1.4647166489913123e-06, 'epoch': 0.76} 76%|███████▌ | 25965/34278 [28:31:34<8:40:00, 3.75s/it] 76%|███████▌ | 25966/34278 [28:31:37<8:14:51, 3.57s/it] {'loss': 0.1097, 'grad_norm': 0.9780284514575832, 'learning_rate': 1.4643825797047351e-06, 'epoch': 0.76} 76%|███████▌ | 25966/34278 [28:31:37<8:14:51, 3.57s/it] 76%|███████▌ | 25967/34278 [28:31:40<7:42:07, 3.34s/it] {'loss': 0.119, 'grad_norm': 1.0640706978670855, 'learning_rate': 1.4640485419833062e-06, 'epoch': 0.76} 76%|███████▌ | 25967/34278 [28:31:40<7:42:07, 3.34s/it] 76%|███████▌ | 25968/34278 [28:31:43<7:40:42, 3.33s/it] {'loss': 0.1337, 'grad_norm': 1.1239686902375026, 'learning_rate': 1.4637145358300099e-06, 'epoch': 0.76} 76%|███████▌ | 25968/34278 [28:31:43<7:40:42, 3.33s/it] 76%|███████▌ | 25969/34278 [28:31:46<7:39:05, 3.32s/it] {'loss': 0.1145, 'grad_norm': 1.3655347668875975, 'learning_rate': 1.463380561247826e-06, 'epoch': 0.76} 76%|███████▌ | 25969/34278 [28:31:46<7:39:05, 3.32s/it] 76%|███████▌ | 25970/34278 [28:31:49<7:30:35, 3.25s/it] {'loss': 0.1179, 'grad_norm': 1.1733687009818943, 'learning_rate': 1.463046618239734e-06, 'epoch': 0.76} 76%|███████▌ | 25970/34278 [28:31:49<7:30:35, 3.25s/it] 76%|███████▌ | 25971/34278 [28:31:53<7:37:27, 3.30s/it] {'loss': 0.0879, 'grad_norm': 0.7321282465144936, 'learning_rate': 1.4627127068087194e-06, 'epoch': 0.76} 76%|███████▌ | 25971/34278 [28:31:53<7:37:27, 3.30s/it] 76%|███████▌ | 25972/34278 [28:31:58<8:52:02, 3.84s/it] {'loss': 0.1125, 'grad_norm': 0.9933471409592995, 'learning_rate': 1.4623788269577594e-06, 'epoch': 0.76} 76%|███████▌ | 25972/34278 [28:31:58<8:52:02, 3.84s/it] 76%|███████▌ | 25973/34278 [28:32:04<10:19:26, 4.48s/it] {'loss': 0.1381, 'grad_norm': 1.2773186342760419, 'learning_rate': 1.4620449786898372e-06, 'epoch': 0.76} 76%|███████▌ | 25973/34278 [28:32:04<10:19:26, 4.48s/it] 76%|███████▌ | 25974/34278 
[28:32:08<10:09:37, 4.40s/it] {'loss': 0.1387, 'grad_norm': 1.1663542132844367, 'learning_rate': 1.4617111620079343e-06, 'epoch': 0.76} 76%|███████▌ | 25974/34278 [28:32:08<10:09:37, 4.40s/it] 76%|███████▌ | 25975/34278 [28:32:11<9:19:15, 4.04s/it] {'loss': 0.0976, 'grad_norm': 0.9684096729095881, 'learning_rate': 1.4613773769150298e-06, 'epoch': 0.76} 76%|███████▌ | 25975/34278 [28:32:11<9:19:15, 4.04s/it] 76%|███████▌ | 25976/34278 [28:32:15<9:03:30, 3.93s/it] {'loss': 0.1261, 'grad_norm': 0.7706560538170121, 'learning_rate': 1.4610436234141013e-06, 'epoch': 0.76} 76%|███████▌ | 25976/34278 [28:32:15<9:03:30, 3.93s/it] 76%|███████▌ | 25977/34278 [28:32:18<8:24:57, 3.65s/it] {'loss': 0.1148, 'grad_norm': 1.1786850203145411, 'learning_rate': 1.4607099015081322e-06, 'epoch': 0.76} 76%|███████▌ | 25977/34278 [28:32:18<8:24:57, 3.65s/it] 76%|███████▌ | 25978/34278 [28:32:21<7:48:19, 3.39s/it] {'loss': 0.1199, 'grad_norm': 1.230491814231743, 'learning_rate': 1.4603762112000986e-06, 'epoch': 0.76} 76%|███████▌ | 25978/34278 [28:32:21<7:48:19, 3.39s/it] 76%|███████▌ | 25979/34278 [28:32:24<8:05:05, 3.51s/it] {'loss': 0.127, 'grad_norm': 1.0535222003526965, 'learning_rate': 1.460042552492983e-06, 'epoch': 0.76} 76%|███████▌ | 25979/34278 [28:32:24<8:05:05, 3.51s/it] 76%|███████▌ | 25980/34278 [28:32:28<8:17:51, 3.60s/it] {'loss': 0.1464, 'grad_norm': 1.018164078482079, 'learning_rate': 1.4597089253897606e-06, 'epoch': 0.76} 76%|███████▌ | 25980/34278 [28:32:28<8:17:51, 3.60s/it] 76%|███████▌ | 25981/34278 [28:32:31<7:39:52, 3.33s/it] {'loss': 0.1143, 'grad_norm': 0.8576248454750078, 'learning_rate': 1.4593753298934132e-06, 'epoch': 0.76} 76%|███████▌ | 25981/34278 [28:32:31<7:39:52, 3.33s/it] 76%|███████▌ | 25982/34278 [28:32:37<9:44:02, 4.22s/it] {'loss': 0.1107, 'grad_norm': 1.2795461807170105, 'learning_rate': 1.4590417660069177e-06, 'epoch': 0.76} 76%|███████▌ | 25982/34278 [28:32:37<9:44:02, 4.22s/it] 76%|███████▌ | 25983/34278 [28:32:43<10:58:39, 4.76s/it] {'loss': 
0.1176, 'grad_norm': 0.9777373110980944, 'learning_rate': 1.4587082337332508e-06, 'epoch': 0.76} 76%|███████▌ | 25983/34278 [28:32:43<10:58:39, 4.76s/it] 76%|███████▌ | 25984/34278 [28:32:47<10:14:23, 4.44s/it] {'loss': 0.1158, 'grad_norm': 0.9614068004197822, 'learning_rate': 1.458374733075391e-06, 'epoch': 0.76} 76%|███████▌ | 25984/34278 [28:32:47<10:14:23, 4.44s/it] 76%|███████▌ | 25985/34278 [28:32:50<9:09:55, 3.98s/it] {'loss': 0.1083, 'grad_norm': 0.8765415290918716, 'learning_rate': 1.4580412640363185e-06, 'epoch': 0.76} 76%|███████▌ | 25985/34278 [28:32:50<9:09:55, 3.98s/it] 76%|███████▌ | 25986/34278 [28:32:53<8:51:04, 3.84s/it] {'loss': 0.1151, 'grad_norm': 0.8348287583351599, 'learning_rate': 1.4577078266190058e-06, 'epoch': 0.76} 76%|███████▌ | 25986/34278 [28:32:53<8:51:04, 3.84s/it] 76%|███████▌ | 25987/34278 [28:32:57<8:24:45, 3.65s/it] {'loss': 0.117, 'grad_norm': 0.950245424025854, 'learning_rate': 1.4573744208264335e-06, 'epoch': 0.76} 76%|███████▌ | 25987/34278 [28:32:57<8:24:45, 3.65s/it] 76%|███████▌ | 25988/34278 [28:32:59<7:49:02, 3.39s/it] {'loss': 0.1294, 'grad_norm': 1.1097815798448516, 'learning_rate': 1.457041046661577e-06, 'epoch': 0.76} 76%|███████▌ | 25988/34278 [28:32:59<7:49:02, 3.39s/it] 76%|███████▌ | 25989/34278 [28:33:02<7:29:29, 3.25s/it] {'loss': 0.1116, 'grad_norm': 1.009146907616732, 'learning_rate': 1.4567077041274109e-06, 'epoch': 0.76} 76%|███████▌ | 25989/34278 [28:33:02<7:29:29, 3.25s/it] 76%|███████▌ | 25990/34278 [28:33:06<7:36:14, 3.30s/it] {'loss': 0.1038, 'grad_norm': 0.7586790868249903, 'learning_rate': 1.456374393226912e-06, 'epoch': 0.76} 76%|███████▌ | 25990/34278 [28:33:06<7:36:14, 3.30s/it] 76%|███████▌ | 25991/34278 [28:33:12<9:34:46, 4.16s/it] {'loss': 0.1274, 'grad_norm': 0.7418242568597122, 'learning_rate': 1.4560411139630581e-06, 'epoch': 0.76} 76%|███████▌ | 25991/34278 [28:33:12<9:34:46, 4.16s/it] 76%|███████▌ | 25992/34278 [28:33:15<8:39:54, 3.76s/it] {'loss': 0.0986, 'grad_norm': 1.084858512902555, 
'learning_rate': 1.4557078663388236e-06, 'epoch': 0.76} 76%|███████▌ | 25992/34278 [28:33:15<8:39:54, 3.76s/it] 76%|███████▌ | 25993/34278 [28:33:20<9:40:07, 4.20s/it] {'loss': 0.1144, 'grad_norm': 0.858504223295019, 'learning_rate': 1.4553746503571813e-06, 'epoch': 0.76} 76%|███████▌ | 25993/34278 [28:33:20<9:40:07, 4.20s/it] 76%|███████▌ | 25994/34278 [28:33:24<9:20:00, 4.06s/it] {'loss': 0.1033, 'grad_norm': 0.7787597298270245, 'learning_rate': 1.4550414660211099e-06, 'epoch': 0.76} 76%|███████▌ | 25994/34278 [28:33:24<9:20:00, 4.06s/it] 76%|███████▌ | 25995/34278 [28:33:28<9:27:33, 4.11s/it] {'loss': 0.1131, 'grad_norm': 0.8932275680825315, 'learning_rate': 1.4547083133335821e-06, 'epoch': 0.76} 76%|███████▌ | 25995/34278 [28:33:28<9:27:33, 4.11s/it] 76%|███████▌ | 25996/34278 [28:33:31<8:51:35, 3.85s/it] {'loss': 0.0972, 'grad_norm': 1.0437860675415536, 'learning_rate': 1.45437519229757e-06, 'epoch': 0.76} 76%|███████▌ | 25996/34278 [28:33:31<8:51:35, 3.85s/it] 76%|███████▌ | 25997/34278 [28:33:34<8:26:48, 3.67s/it] {'loss': 0.1121, 'grad_norm': 0.7378942739734735, 'learning_rate': 1.45404210291605e-06, 'epoch': 0.76} 76%|███████▌ | 25997/34278 [28:33:34<8:26:48, 3.67s/it] 76%|███████▌ | 25998/34278 [28:33:38<8:29:24, 3.69s/it] {'loss': 0.1082, 'grad_norm': 0.6903590941643238, 'learning_rate': 1.4537090451919972e-06, 'epoch': 0.76} 76%|███████▌ | 25998/34278 [28:33:38<8:29:24, 3.69s/it] 76%|███████▌ | 25999/34278 [28:33:42<8:23:42, 3.65s/it] {'loss': 0.1243, 'grad_norm': 0.940410991770605, 'learning_rate': 1.4533760191283836e-06, 'epoch': 0.76} 76%|███████▌ | 25999/34278 [28:33:42<8:23:42, 3.65s/it] 76%|███████▌ | 26000/34278 [28:33:45<8:11:46, 3.56s/it] {'loss': 0.1172, 'grad_norm': 0.8713184786511823, 'learning_rate': 1.4530430247281808e-06, 'epoch': 0.76} 76%|███████▌ | 26000/34278 [28:33:45<8:11:46, 3.56s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
151645 Set eos token back to <|im_end|> Set generation config eos token id back to [151645, 151643] Set eos token id back to 151645 Set eos token back to <|im_end|> Set generation config eos token id back to [151645, 151643] Set eos token id back to [151658] Set eos token back to <|diff_marker|> Set generation config eos token id back to [151658] Set eos token id back to 151645 Set eos token back to <|im_end|> Set generation config eos token id back to [151645, 151643] /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 76%|███████▌ | 26001/34278 [28:34:18<28:27:00, 12.37s/it] {'loss': 0.1337, 'grad_norm': 0.7118485771160761, 'learning_rate': 1.4527100619943646e-06, 'epoch': 0.76} 76%|███████▌ | 26001/34278 [28:34:18<28:27:00, 12.37s/it] 76%|███████▌ | 26002/34278 [28:34:21<21:57:48, 9.55s/it] {'loss': 0.1185, 'grad_norm': 0.8045119603049272, 'learning_rate': 1.4523771309299044e-06, 'epoch': 0.76} 76%|███████▌ | 26002/34278 [28:34:21<21:57:48, 9.55s/it] 76%|███████▌ | 26003/34278 [28:34:27<19:12:42, 8.36s/it] {'loss': 0.1107, 'grad_norm': 0.8168851311481075, 'learning_rate': 1.452044231537774e-06, 'epoch': 0.76} 76%|███████▌ | 26003/34278 [28:34:27<19:12:42, 8.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 76%|███████▌ | 26004/34278 [28:34:30<15:40:15, 6.82s/it] {'loss': 0.1195, 'grad_norm': 0.6164216694250991, 'learning_rate': 1.451711363820948e-06, 'epoch': 0.76} 76%|███████▌ | 26004/34278 [28:34:30<15:40:15, 6.82s/it] 76%|███████▌ | 26005/34278 [28:34:36<15:08:19, 6.59s/it] {'loss': 0.1109, 'grad_norm': 0.9144872541790582, 'learning_rate': 1.4513785277823956e-06, 'epoch': 0.76} 76%|███████▌ | 26005/34278 [28:34:36<15:08:19, 6.59s/it] 76%|███████▌ | 26006/34278 [28:34:42<14:37:13, 6.36s/it] {'loss': 0.1358, 'grad_norm': 0.9360306194839095, 'learning_rate': 1.4510457234250868e-06, 'epoch': 0.76} 76%|███████▌ | 26006/34278 [28:34:42<14:37:13, 6.36s/it] 76%|███████▌ | 26007/34278 [28:34:45<12:19:32, 5.36s/it] {'loss': 0.1001, 'grad_norm': 0.7041663597927578, 'learning_rate': 1.4507129507519968e-06, 'epoch': 0.76} 76%|███████▌ | 26007/34278 [28:34:45<12:19:32, 5.36s/it] 76%|███████▌ | 26008/34278 [28:34:48<11:00:09, 4.79s/it] {'loss': 0.114, 'grad_norm': 0.7837777907709031, 'learning_rate': 1.4503802097660918e-06, 'epoch': 0.76} 76%|███████▌ | 26008/34278 [28:34:48<11:00:09, 4.79s/it] 76%|███████▌ | 26009/34278 [28:34:51<9:47:57, 4.27s/it] {'loss': 0.1038, 'grad_norm': 0.7362928846800872, 'learning_rate': 1.4500475004703475e-06, 'epoch': 0.76} 76%|███████▌ | 26009/34278 [28:34:51<9:47:57, 4.27s/it] 76%|███████▌ | 26010/34278 [28:34:56<10:20:10, 4.50s/it] {'loss': 0.1175, 'grad_norm': 0.8596231315156331, 'learning_rate': 1.4497148228677294e-06, 'epoch': 0.76} 76%|███████▌ | 26010/34278 [28:34:56<10:20:10, 4.50s/it] 76%|███████▌ | 26011/34278 [28:34:59<9:13:07, 4.01s/it] {'loss': 0.1158, 'grad_norm': 1.0259867411899775, 'learning_rate': 1.4493821769612115e-06, 'epoch': 0.76} 76%|███████▌ | 26011/34278 [28:34:59<9:13:07, 4.01s/it] 76%|███████▌ | 26012/34278 [28:35:02<8:24:40, 3.66s/it] {'loss': 0.1118, 'grad_norm': 0.9072027808247067, 'learning_rate': 1.4490495627537621e-06, 'epoch': 0.76} 76%|███████▌ | 26012/34278 [28:35:02<8:24:40, 
3.66s/it] 76%|███████▌ | 26013/34278 [28:35:05<7:46:04, 3.38s/it] {'loss': 0.1051, 'grad_norm': 0.8457106723844688, 'learning_rate': 1.4487169802483485e-06, 'epoch': 0.76} 76%|███████▌ | 26013/34278 [28:35:05<7:46:04, 3.38s/it] 76%|███████▌ | 26014/34278 [28:35:08<7:38:38, 3.33s/it] {'loss': 0.1337, 'grad_norm': 1.1286016343179337, 'learning_rate': 1.4483844294479427e-06, 'epoch': 0.76} 76%|███████▌ | 26014/34278 [28:35:08<7:38:38, 3.33s/it] 76%|███████▌ | 26015/34278 [28:35:11<7:25:26, 3.23s/it] {'loss': 0.1367, 'grad_norm': 0.8435348531995076, 'learning_rate': 1.4480519103555141e-06, 'epoch': 0.76} 76%|███████▌ | 26015/34278 [28:35:11<7:25:26, 3.23s/it] 76%|███████▌ | 26016/34278 [28:35:15<7:50:01, 3.41s/it] {'loss': 0.1012, 'grad_norm': 0.8806963037281516, 'learning_rate': 1.4477194229740282e-06, 'epoch': 0.76} 76%|███████▌ | 26016/34278 [28:35:15<7:50:01, 3.41s/it] 76%|███████▌ | 26017/34278 [28:35:18<7:25:40, 3.24s/it] {'loss': 0.1215, 'grad_norm': 1.1095409650591022, 'learning_rate': 1.4473869673064573e-06, 'epoch': 0.76} 76%|███████▌ | 26017/34278 [28:35:18<7:25:40, 3.24s/it] 76%|███████▌ | 26018/34278 [28:35:21<7:17:08, 3.18s/it] {'loss': 0.1168, 'grad_norm': 0.8571719788990452, 'learning_rate': 1.4470545433557676e-06, 'epoch': 0.76} 76%|███████▌ | 26018/34278 [28:35:21<7:17:08, 3.18s/it] 76%|███████▌ | 26019/34278 [28:35:24<7:25:31, 3.24s/it] {'loss': 0.1042, 'grad_norm': 0.8549046547853687, 'learning_rate': 1.4467221511249247e-06, 'epoch': 0.76} 76%|███████▌ | 26019/34278 [28:35:24<7:25:31, 3.24s/it] 76%|███████▌ | 26020/34278 [28:35:27<7:08:05, 3.11s/it] {'loss': 0.1172, 'grad_norm': 0.8946800367869224, 'learning_rate': 1.4463897906168984e-06, 'epoch': 0.76} 76%|███████▌ | 26020/34278 [28:35:27<7:08:05, 3.11s/it] 76%|███████▌ | 26021/34278 [28:35:30<7:13:35, 3.15s/it] {'loss': 0.1137, 'grad_norm': 0.9367240916529788, 'learning_rate': 1.4460574618346573e-06, 'epoch': 0.76} 76%|███████▌ | 26021/34278 [28:35:30<7:13:35, 3.15s/it] 76%|███████▌ | 26022/34278 
[28:35:34<7:28:06, 3.26s/it] {'loss': 0.1113, 'grad_norm': 1.0603338690555317, 'learning_rate': 1.445725164781167e-06, 'epoch': 0.76} 76%|███████▌ | 26022/34278 [28:35:34<7:28:06, 3.26s/it] 76%|███████▌ | 26023/34278 [28:35:37<7:17:18, 3.18s/it] {'loss': 0.1035, 'grad_norm': 0.7919956411995522, 'learning_rate': 1.4453928994593925e-06, 'epoch': 0.76} 76%|███████▌ | 26023/34278 [28:35:37<7:17:18, 3.18s/it] 76%|███████▌ | 26024/34278 [28:35:41<8:12:07, 3.58s/it] {'loss': 0.1153, 'grad_norm': 1.0548655536326177, 'learning_rate': 1.4450606658723026e-06, 'epoch': 0.76} 76%|███████▌ | 26024/34278 [28:35:41<8:12:07, 3.58s/it] 76%|███████▌ | 26025/34278 [28:35:44<7:50:31, 3.42s/it] {'loss': 0.0978, 'grad_norm': 0.8492669261576317, 'learning_rate': 1.4447284640228631e-06, 'epoch': 0.76} 76%|███████▌ | 26025/34278 [28:35:44<7:50:31, 3.42s/it] 76%|███████▌ | 26026/34278 [28:35:50<9:27:52, 4.13s/it] {'loss': 0.1065, 'grad_norm': 0.8709168049423043, 'learning_rate': 1.4443962939140372e-06, 'epoch': 0.76} 76%|███████▌ | 26026/34278 [28:35:50<9:27:52, 4.13s/it] 76%|███████▌ | 26027/34278 [28:35:53<8:43:05, 3.80s/it] {'loss': 0.1065, 'grad_norm': 1.3811960156659688, 'learning_rate': 1.4440641555487922e-06, 'epoch': 0.76} 76%|███████▌ | 26027/34278 [28:35:53<8:43:05, 3.80s/it] 76%|███████▌ | 26028/34278 [28:35:56<8:14:39, 3.60s/it] {'loss': 0.1045, 'grad_norm': 0.7200139465267126, 'learning_rate': 1.4437320489300954e-06, 'epoch': 0.76} 76%|███████▌ | 26028/34278 [28:35:56<8:14:39, 3.60s/it] 76%|███████▌ | 26029/34278 [28:36:02<9:47:24, 4.27s/it] {'loss': 0.1129, 'grad_norm': 0.7628794682821637, 'learning_rate': 1.44339997406091e-06, 'epoch': 0.76} 76%|███████▌ | 26029/34278 [28:36:02<9:47:24, 4.27s/it] 76%|███████▌ | 26030/34278 [28:36:05<9:01:57, 3.94s/it] {'loss': 0.1548, 'grad_norm': 0.9482328257842885, 'learning_rate': 1.4430679309441992e-06, 'epoch': 0.76} 76%|███████▌ | 26030/34278 [28:36:05<9:01:57, 3.94s/it] 76%|███████▌ | 26031/34278 [28:36:08<8:22:12, 3.65s/it] {'loss': 
0.1201, 'grad_norm': 0.9370533620990203, 'learning_rate': 1.44273591958293e-06, 'epoch': 0.76} 76%|███████▌ | 26031/34278 [28:36:08<8:22:12, 3.65s/it] 76%|███████▌ | 26032/34278 [28:36:11<8:12:53, 3.59s/it] {'loss': 0.1229, 'grad_norm': 0.9843232443690725, 'learning_rate': 1.4424039399800639e-06, 'epoch': 0.76} 76%|███████▌ | 26032/34278 [28:36:11<8:12:53, 3.59s/it] 76%|███████▌ | 26033/34278 [28:36:17<9:43:33, 4.25s/it] {'loss': 0.1253, 'grad_norm': 0.891275018099884, 'learning_rate': 1.442071992138566e-06, 'epoch': 0.76} 76%|███████▌ | 26033/34278 [28:36:17<9:43:33, 4.25s/it] 76%|███████▌ | 26034/34278 [28:36:21<9:31:23, 4.16s/it] {'loss': 0.1184, 'grad_norm': 1.0782847967864844, 'learning_rate': 1.441740076061402e-06, 'epoch': 0.76} 76%|███████▌ | 26034/34278 [28:36:21<9:31:23, 4.16s/it] 76%|███████▌ | 26035/34278 [28:36:25<9:26:40, 4.12s/it] {'loss': 0.1218, 'grad_norm': 1.1422747124755679, 'learning_rate': 1.4414081917515328e-06, 'epoch': 0.76} 76%|███████▌ | 26035/34278 [28:36:25<9:26:40, 4.12s/it] 76%|███████▌ | 26036/34278 [28:36:29<8:57:24, 3.91s/it] {'loss': 0.1212, 'grad_norm': 1.0143969155664316, 'learning_rate': 1.4410763392119203e-06, 'epoch': 0.76} 76%|███████▌ | 26036/34278 [28:36:29<8:57:24, 3.91s/it] 76%|███████▌ | 26037/34278 [28:36:32<8:25:45, 3.68s/it] {'loss': 0.113, 'grad_norm': 1.0085085582756732, 'learning_rate': 1.4407445184455304e-06, 'epoch': 0.76} 76%|███████▌ | 26037/34278 [28:36:32<8:25:45, 3.68s/it] 76%|███████▌ | 26038/34278 [28:36:35<7:57:17, 3.48s/it] {'loss': 0.1162, 'grad_norm': 0.8049804432728066, 'learning_rate': 1.4404127294553216e-06, 'epoch': 0.76} 76%|███████▌ | 26038/34278 [28:36:35<7:57:17, 3.48s/it] 76%|███████▌ | 26039/34278 [28:36:41<9:43:15, 4.25s/it] {'loss': 0.1194, 'grad_norm': 0.7588235740314043, 'learning_rate': 1.4400809722442604e-06, 'epoch': 0.76} 76%|███████▌ | 26039/34278 [28:36:41<9:43:15, 4.25s/it] 76%|███████▌ | 26040/34278 [28:36:45<9:28:06, 4.14s/it] {'loss': 0.1227, 'grad_norm': 0.7968265456666715, 
'learning_rate': 1.4397492468153047e-06, 'epoch': 0.76} 76%|███████▌ | 26040/34278 [28:36:45<9:28:06, 4.14s/it] 76%|███████▌ | 26041/34278 [28:36:48<8:38:04, 3.77s/it] {'loss': 0.1268, 'grad_norm': 0.9157507802234769, 'learning_rate': 1.4394175531714193e-06, 'epoch': 0.76} 76%|███████▌ | 26041/34278 [28:36:48<8:38:04, 3.77s/it] 76%|███████▌ | 26042/34278 [28:36:51<8:30:18, 3.72s/it] {'loss': 0.0965, 'grad_norm': 0.872768238942069, 'learning_rate': 1.4390858913155641e-06, 'epoch': 0.76} 76%|███████▌ | 26042/34278 [28:36:51<8:30:18, 3.72s/it] 76%|███████▌ | 26043/34278 [28:36:55<8:18:52, 3.63s/it] {'loss': 0.0954, 'grad_norm': 0.8289127647144127, 'learning_rate': 1.4387542612506983e-06, 'epoch': 0.76} 76%|███████▌ | 26043/34278 [28:36:55<8:18:52, 3.63s/it] 76%|███████▌ | 26044/34278 [28:36:59<8:35:43, 3.76s/it] {'loss': 0.1166, 'grad_norm': 0.7870445360928183, 'learning_rate': 1.438422662979785e-06, 'epoch': 0.76} 76%|███████▌ | 26044/34278 [28:36:59<8:35:43, 3.76s/it] 76%|███████▌ | 26045/34278 [28:37:02<7:58:14, 3.49s/it] {'loss': 0.1299, 'grad_norm': 0.910924733280388, 'learning_rate': 1.4380910965057843e-06, 'epoch': 0.76} 76%|███████▌ | 26045/34278 [28:37:02<7:58:14, 3.49s/it] 76%|███████▌ | 26046/34278 [28:37:05<7:43:46, 3.38s/it] {'loss': 0.1259, 'grad_norm': 0.8015310873760055, 'learning_rate': 1.4377595618316552e-06, 'epoch': 0.76} 76%|███████▌ | 26046/34278 [28:37:05<7:43:46, 3.38s/it] 76%|███████▌ | 26047/34278 [28:37:09<8:18:35, 3.63s/it] {'loss': 0.1202, 'grad_norm': 0.7435053511577032, 'learning_rate': 1.4374280589603602e-06, 'epoch': 0.76} 76%|███████▌ | 26047/34278 [28:37:09<8:18:35, 3.63s/it] 76%|███████▌ | 26048/34278 [28:37:13<8:15:06, 3.61s/it] {'loss': 0.1072, 'grad_norm': 0.9469440455160251, 'learning_rate': 1.4370965878948562e-06, 'epoch': 0.76} 76%|███████▌ | 26048/34278 [28:37:13<8:15:06, 3.61s/it] 76%|███████▌ | 26049/34278 [28:37:16<8:02:21, 3.52s/it] {'loss': 0.1254, 'grad_norm': 0.8816852366894563, 'learning_rate': 1.4367651486381023e-06, 
'epoch': 0.76} 76%|███████▌ | 26049/34278 [28:37:16<8:02:21, 3.52s/it] 76%|███████▌ | 26050/34278 [28:37:22<9:52:17, 4.32s/it] {'loss': 0.1094, 'grad_norm': 1.0915406489172184, 'learning_rate': 1.4364337411930585e-06, 'epoch': 0.76} 76%|███████▌ | 26050/34278 [28:37:22<9:52:17, 4.32s/it] 76%|███████▌ | 26051/34278 [28:37:25<8:58:20, 3.93s/it] {'loss': 0.1438, 'grad_norm': 0.7775839955869801, 'learning_rate': 1.436102365562685e-06, 'epoch': 0.76} 76%|███████▌ | 26051/34278 [28:37:25<8:58:20, 3.93s/it] 76%|███████▌ | 26052/34278 [28:37:28<8:28:19, 3.71s/it] {'loss': 0.1302, 'grad_norm': 0.7853566644503953, 'learning_rate': 1.4357710217499387e-06, 'epoch': 0.76} 76%|███████▌ | 26052/34278 [28:37:28<8:28:19, 3.71s/it] 76%|███████▌ | 26053/34278 [28:37:31<7:55:39, 3.47s/it] {'loss': 0.1342, 'grad_norm': 0.9019594927138638, 'learning_rate': 1.4354397097577766e-06, 'epoch': 0.76} 76%|███████▌ | 26053/34278 [28:37:31<7:55:39, 3.47s/it] 76%|███████▌ | 26054/34278 [28:37:36<9:07:57, 4.00s/it] {'loss': 0.1028, 'grad_norm': 0.6598718319418905, 'learning_rate': 1.4351084295891593e-06, 'epoch': 0.76} 76%|███████▌ | 26054/34278 [28:37:36<9:07:57, 4.00s/it] 76%|███████▌ | 26055/34278 [28:37:40<8:34:45, 3.76s/it] {'loss': 0.1326, 'grad_norm': 0.7998989603001021, 'learning_rate': 1.4347771812470428e-06, 'epoch': 0.76} 76%|███████▌ | 26055/34278 [28:37:40<8:34:45, 3.76s/it] 76%|███████▌ | 26056/34278 [28:37:42<8:00:04, 3.50s/it] {'loss': 0.1265, 'grad_norm': 0.8621933089397833, 'learning_rate': 1.4344459647343833e-06, 'epoch': 0.76} 76%|███████▌ | 26056/34278 [28:37:42<8:00:04, 3.50s/it] 76%|███████▌ | 26057/34278 [28:37:45<7:38:12, 3.34s/it] {'loss': 0.1094, 'grad_norm': 0.8460505298845999, 'learning_rate': 1.4341147800541387e-06, 'epoch': 0.76} 76%|███████▌ | 26057/34278 [28:37:45<7:38:12, 3.34s/it] 76%|███████▌ | 26058/34278 [28:37:50<8:35:10, 3.76s/it] {'loss': 0.1244, 'grad_norm': 0.8597389125328023, 'learning_rate': 1.4337836272092681e-06, 'epoch': 0.76} 76%|███████▌ | 
26058/34278 [28:37:50<8:35:10, 3.76s/it] 76%|███████▌ | 26059/34278 [28:37:53<8:06:41, 3.55s/it] {'loss': 0.1313, 'grad_norm': 0.821636293537273, 'learning_rate': 1.4334525062027255e-06, 'epoch': 0.76} 76%|███████▌ | 26059/34278 [28:37:53<8:06:41, 3.55s/it] 76%|███████▌ | 26060/34278 [28:37:56<7:40:28, 3.36s/it] {'loss': 0.1067, 'grad_norm': 1.0433881299768766, 'learning_rate': 1.4331214170374663e-06, 'epoch': 0.76} 76%|███████▌ | 26060/34278 [28:37:56<7:40:28, 3.36s/it] 76%|███████▌ | 26061/34278 [28:37:59<7:35:39, 3.33s/it] {'loss': 0.1075, 'grad_norm': 2.0221918569399744, 'learning_rate': 1.4327903597164488e-06, 'epoch': 0.76} 76%|███████▌ | 26061/34278 [28:37:59<7:35:39, 3.33s/it] 76%|███████▌ | 26062/34278 [28:38:04<8:46:35, 3.85s/it] {'loss': 0.1275, 'grad_norm': 0.7565484371987905, 'learning_rate': 1.4324593342426264e-06, 'epoch': 0.76} 76%|███████▌ | 26062/34278 [28:38:04<8:46:35, 3.85s/it] 76%|███████▌ | 26063/34278 [28:38:08<8:19:22, 3.65s/it] {'loss': 0.1323, 'grad_norm': 0.9417510453568073, 'learning_rate': 1.432128340618955e-06, 'epoch': 0.76} 76%|███████▌ | 26063/34278 [28:38:08<8:19:22, 3.65s/it] 76%|███████▌ | 26064/34278 [28:38:11<7:54:30, 3.47s/it] {'loss': 0.101, 'grad_norm': 0.8785801625610209, 'learning_rate': 1.4317973788483914e-06, 'epoch': 0.76} 76%|███████▌ | 26064/34278 [28:38:11<7:54:30, 3.47s/it] 76%|███████▌ | 26065/34278 [28:38:14<7:46:14, 3.41s/it] {'loss': 0.1231, 'grad_norm': 0.7417185558525325, 'learning_rate': 1.4314664489338892e-06, 'epoch': 0.76} 76%|███████▌ | 26065/34278 [28:38:14<7:46:14, 3.41s/it] 76%|███████▌ | 26066/34278 [28:38:17<7:40:14, 3.36s/it] {'loss': 0.1048, 'grad_norm': 0.8351879040678396, 'learning_rate': 1.4311355508784015e-06, 'epoch': 0.76} 76%|███████▌ | 26066/34278 [28:38:17<7:40:14, 3.36s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 76%|███████▌ | 26067/34278 [28:38:21<7:57:15, 3.49s/it] {'loss': 0.1249, 'grad_norm': 0.8431731966399241, 'learning_rate': 1.430804684684885e-06, 'epoch': 0.76} 76%|███████▌ | 26067/34278 [28:38:21<7:57:15, 3.49s/it] 76%|███████▌ | 26068/34278 [28:38:26<8:55:20, 3.91s/it] {'loss': 0.1123, 'grad_norm': 1.4833114662917593, 'learning_rate': 1.4304738503562903e-06, 'epoch': 0.76} 76%|███████▌ | 26068/34278 [28:38:26<8:55:20, 3.91s/it] 76%|███████▌ | 26069/34278 [28:38:29<8:11:29, 3.59s/it] {'loss': 0.129, 'grad_norm': 0.8581745216150858, 'learning_rate': 1.4301430478955748e-06, 'epoch': 0.76} 76%|███████▌ | 26069/34278 [28:38:29<8:11:29, 3.59s/it] 76%|███████▌ | 26070/34278 [28:38:32<8:01:08, 3.52s/it] {'loss': 0.1123, 'grad_norm': 1.0792194494872442, 'learning_rate': 1.4298122773056883e-06, 'epoch': 0.76} 76%|███████▌ | 26070/34278 [28:38:32<8:01:08, 3.52s/it] 76%|███████▌ | 26071/34278 [28:38:35<7:41:05, 3.37s/it] {'loss': 0.1073, 'grad_norm': 0.8004529914549448, 'learning_rate': 1.4294815385895872e-06, 'epoch': 0.76} 76%|███████▌ | 26071/34278 [28:38:35<7:41:05, 3.37s/it] 76%|███████▌ | 26072/34278 [28:38:39<8:09:45, 3.58s/it] {'loss': 0.1075, 'grad_norm': 0.6705396840924204, 'learning_rate': 1.4291508317502229e-06, 'epoch': 0.76} 76%|███████▌ | 26072/34278 [28:38:39<8:09:45, 3.58s/it] 76%|███████▌ | 26073/34278 [28:38:43<8:28:53, 3.72s/it] {'loss': 0.125, 'grad_norm': 0.9491198281198915, 'learning_rate': 1.4288201567905452e-06, 'epoch': 0.76} 76%|███████▌ | 26073/34278 [28:38:43<8:28:53, 3.72s/it] 76%|███████▌ | 26074/34278 [28:38:46<8:03:38, 3.54s/it] {'loss': 0.1154, 'grad_norm': 0.6802741828152761, 'learning_rate': 1.4284895137135091e-06, 'epoch': 0.76} 76%|███████▌ | 26074/34278 [28:38:46<8:03:38, 3.54s/it] 76%|███████▌ | 26075/34278 [28:38:49<7:34:36, 3.33s/it] {'loss': 0.0956, 'grad_norm': 0.6897548118106798, 'learning_rate': 1.4281589025220676e-06, 'epoch': 0.76} 76%|███████▌ | 26075/34278 [28:38:49<7:34:36, 3.33s/it] 
76%|███████▌ | 26076/34278 [28:38:54<8:29:24, 3.73s/it] {'loss': 0.1185, 'grad_norm': 0.5562101397590327, 'learning_rate': 1.4278283232191692e-06, 'epoch': 0.76} 76%|███████▌ | 26076/34278 [28:38:54<8:29:24, 3.73s/it] 76%|███████▌ | 26077/34278 [28:39:00<9:55:15, 4.36s/it] {'loss': 0.0986, 'grad_norm': 0.697138522827796, 'learning_rate': 1.4274977758077685e-06, 'epoch': 0.76} 76%|███████▌ | 26077/34278 [28:39:00<9:55:15, 4.36s/it] 76%|███████▌ | 26078/34278 [28:39:02<8:53:02, 3.90s/it] {'loss': 0.1113, 'grad_norm': 0.7319436734846092, 'learning_rate': 1.4271672602908143e-06, 'epoch': 0.76} 76%|███████▌ | 26078/34278 [28:39:03<8:53:02, 3.90s/it] 76%|███████▌ | 26079/34278 [28:39:06<8:33:42, 3.76s/it] {'loss': 0.094, 'grad_norm': 0.7363816138287833, 'learning_rate': 1.4268367766712571e-06, 'epoch': 0.76} 76%|███████▌ | 26079/34278 [28:39:06<8:33:42, 3.76s/it] 76%|███████▌ | 26080/34278 [28:39:09<8:22:56, 3.68s/it] {'loss': 0.1191, 'grad_norm': 0.9144220524949574, 'learning_rate': 1.4265063249520478e-06, 'epoch': 0.76} 76%|███████▌ | 26080/34278 [28:39:09<8:22:56, 3.68s/it] 76%|███████▌ | 26081/34278 [28:39:15<9:48:51, 4.31s/it] {'loss': 0.1045, 'grad_norm': 0.8609360798205258, 'learning_rate': 1.4261759051361378e-06, 'epoch': 0.76} 76%|███████▌ | 26081/34278 [28:39:15<9:48:51, 4.31s/it] 76%|███████▌ | 26082/34278 [28:39:21<10:52:20, 4.78s/it] {'loss': 0.1203, 'grad_norm': 0.7758677831773033, 'learning_rate': 1.4258455172264774e-06, 'epoch': 0.76} 76%|███████▌ | 26082/34278 [28:39:21<10:52:20, 4.78s/it] 76%|███████▌ | 26083/34278 [28:39:25<10:06:33, 4.44s/it] {'loss': 0.1273, 'grad_norm': 0.6705278719730049, 'learning_rate': 1.4255151612260127e-06, 'epoch': 0.76} 76%|███████▌ | 26083/34278 [28:39:25<10:06:33, 4.44s/it] 76%|███████▌ | 26084/34278 [28:39:29<9:52:21, 4.34s/it] {'loss': 0.1223, 'grad_norm': 0.8855828532746887, 'learning_rate': 1.425184837137697e-06, 'epoch': 0.76} 76%|███████▌ | 26084/34278 [28:39:29<9:52:21, 4.34s/it] 76%|███████▌ | 26085/34278 
[28:39:32<9:05:58, 4.00s/it] {'loss': 0.1048, 'grad_norm': 0.8729962093532602, 'learning_rate': 1.4248545449644778e-06, 'epoch': 0.76} 76%|███████▌ | 26085/34278 [28:39:32<9:05:58, 4.00s/it] 76%|███████▌ | 26086/34278 [28:39:39<10:52:22, 4.78s/it] {'loss': 0.1099, 'grad_norm': 0.9834331543670914, 'learning_rate': 1.424524284709302e-06, 'epoch': 0.76} 76%|███████▌ | 26086/34278 [28:39:39<10:52:22, 4.78s/it] 76%|███████▌ | 26087/34278 [28:39:43<10:17:51, 4.53s/it] {'loss': 0.1089, 'grad_norm': 1.1227771963032696, 'learning_rate': 1.4241940563751205e-06, 'epoch': 0.76} 76%|███████▌ | 26087/34278 [28:39:43<10:17:51, 4.53s/it] 76%|███████▌ | 26088/34278 [28:39:45<9:04:08, 3.99s/it] {'loss': 0.1044, 'grad_norm': 0.9042592500315241, 'learning_rate': 1.4238638599648818e-06, 'epoch': 0.76} 76%|███████▌ | 26088/34278 [28:39:45<9:04:08, 3.99s/it] 76%|███████▌ | 26089/34278 [28:39:49<8:35:51, 3.78s/it] {'loss': 0.0945, 'grad_norm': 0.9398773073336264, 'learning_rate': 1.423533695481533e-06, 'epoch': 0.76} 76%|███████▌ | 26089/34278 [28:39:49<8:35:51, 3.78s/it] 76%|███████▌ | 26090/34278 [28:39:52<8:29:22, 3.73s/it] {'loss': 0.1295, 'grad_norm': 0.9616544174256167, 'learning_rate': 1.4232035629280199e-06, 'epoch': 0.76} 76%|███████▌ | 26090/34278 [28:39:52<8:29:22, 3.73s/it] 76%|███████▌ | 26091/34278 [28:39:55<7:56:12, 3.49s/it] {'loss': 0.1224, 'grad_norm': 1.0352716786918337, 'learning_rate': 1.4228734623072932e-06, 'epoch': 0.76} 76%|███████▌ | 26091/34278 [28:39:55<7:56:12, 3.49s/it] 76%|███████▌ | 26092/34278 [28:39:59<7:58:28, 3.51s/it] {'loss': 0.1108, 'grad_norm': 0.9028573316249922, 'learning_rate': 1.422543393622297e-06, 'epoch': 0.76} 76%|███████▌ | 26092/34278 [28:39:59<7:58:28, 3.51s/it] 76%|███████▌ | 26093/34278 [28:40:03<8:13:09, 3.62s/it] {'loss': 0.1204, 'grad_norm': 1.06193886836678, 'learning_rate': 1.4222133568759793e-06, 'epoch': 0.76} 76%|███████▌ | 26093/34278 [28:40:03<8:13:09, 3.62s/it] 76%|███████▌ | 26094/34278 [28:40:07<8:33:06, 3.76s/it] {'loss': 
0.1339, 'grad_norm': 0.894203663634131, 'learning_rate': 1.4218833520712876e-06, 'epoch': 0.76} 76%|███████▌ | 26094/34278 [28:40:07<8:33:06, 3.76s/it] 76%|███████▌ | 26095/34278 [28:40:13<10:19:44, 4.54s/it] {'loss': 0.1122, 'grad_norm': 0.9934211483158438, 'learning_rate': 1.421553379211168e-06, 'epoch': 0.76} 76%|███████▌ | 26095/34278 [28:40:13<10:19:44, 4.54s/it] 76%|███████▌ | 26096/34278 [28:40:19<11:15:32, 4.95s/it] {'loss': 0.1033, 'grad_norm': 0.8794428270048518, 'learning_rate': 1.4212234382985634e-06, 'epoch': 0.76} 76%|███████▌ | 26096/34278 [28:40:19<11:15:32, 4.95s/it] 76%|███████▌ | 26097/34278 [28:40:24<11:01:58, 4.85s/it] {'loss': 0.103, 'grad_norm': 0.8888271444596492, 'learning_rate': 1.420893529336424e-06, 'epoch': 0.76} 76%|███████▌ | 26097/34278 [28:40:24<11:01:58, 4.85s/it] 76%|███████▌ | 26098/34278 [28:40:27<10:17:21, 4.53s/it] {'loss': 0.1202, 'grad_norm': 0.8720925799092015, 'learning_rate': 1.4205636523276907e-06, 'epoch': 0.76} 76%|███████▌ | 26098/34278 [28:40:27<10:17:21, 4.53s/it] 76%|███████▌ | 26099/34278 [28:40:30<9:17:21, 4.09s/it] {'loss': 0.1303, 'grad_norm': 0.9711674969153876, 'learning_rate': 1.4202338072753119e-06, 'epoch': 0.76} 76%|███████▌ | 26099/34278 [28:40:30<9:17:21, 4.09s/it] 76%|███████▌ | 26100/34278 [28:40:33<8:20:04, 3.67s/it] {'loss': 0.1049, 'grad_norm': 1.0092013053378361, 'learning_rate': 1.4199039941822296e-06, 'epoch': 0.76} 76%|███████▌ | 26100/34278 [28:40:33<8:20:04, 3.67s/it] 76%|███████▌ | 26101/34278 [28:40:36<7:54:28, 3.48s/it] {'loss': 0.1304, 'grad_norm': 1.0326288842659024, 'learning_rate': 1.4195742130513917e-06, 'epoch': 0.76} 76%|███████▌ | 26101/34278 [28:40:36<7:54:28, 3.48s/it] 76%|███████▌ | 26102/34278 [28:40:40<8:06:57, 3.57s/it] {'loss': 0.1092, 'grad_norm': 1.290525021354441, 'learning_rate': 1.4192444638857406e-06, 'epoch': 0.76} 76%|███████▌ | 26102/34278 [28:40:40<8:06:57, 3.57s/it] 76%|███████▌ | 26103/34278 [28:40:43<7:46:05, 3.42s/it] {'loss': 0.1038, 'grad_norm': 
0.7499833600199455, 'learning_rate': 1.418914746688218e-06, 'epoch': 0.76} 76%|███████▌ | 26103/34278 [28:40:43<7:46:05, 3.42s/it] 76%|███████▌ | 26104/34278 [28:40:47<8:13:13, 3.62s/it] {'loss': 0.1243, 'grad_norm': 0.7940921307230743, 'learning_rate': 1.4185850614617702e-06, 'epoch': 0.76} 76%|███████▌ | 26104/34278 [28:40:47<8:13:13, 3.62s/it] 76%|███████▌ | 26105/34278 [28:40:50<7:41:33, 3.39s/it] {'loss': 0.1245, 'grad_norm': 0.890027375851292, 'learning_rate': 1.4182554082093413e-06, 'epoch': 0.76} 76%|███████▌ | 26105/34278 [28:40:50<7:41:33, 3.39s/it] 76%|███████▌ | 26106/34278 [28:40:53<7:40:15, 3.38s/it] {'loss': 0.1012, 'grad_norm': 1.0072383997810619, 'learning_rate': 1.417925786933872e-06, 'epoch': 0.76} 76%|███████▌ | 26106/34278 [28:40:53<7:40:15, 3.38s/it] 76%|███████▌ | 26107/34278 [28:40:56<7:23:40, 3.26s/it] {'loss': 0.1075, 'grad_norm': 0.8988512233934777, 'learning_rate': 1.4175961976383074e-06, 'epoch': 0.76} 76%|███████▌ | 26107/34278 [28:40:56<7:23:40, 3.26s/it] 76%|███████▌ | 26108/34278 [28:41:00<7:45:56, 3.42s/it] {'loss': 0.122, 'grad_norm': 0.7775804860429172, 'learning_rate': 1.4172666403255885e-06, 'epoch': 0.76} 76%|███████▌ | 26108/34278 [28:41:00<7:45:56, 3.42s/it] 76%|███████▌ | 26109/34278 [28:41:05<8:33:36, 3.77s/it] {'loss': 0.1181, 'grad_norm': 0.8146670210731684, 'learning_rate': 1.4169371149986566e-06, 'epoch': 0.76} 76%|███████▌ | 26109/34278 [28:41:05<8:33:36, 3.77s/it] 76%|███████▌ | 26110/34278 [28:41:08<8:12:22, 3.62s/it] {'loss': 0.1273, 'grad_norm': 0.8593160795739001, 'learning_rate': 1.4166076216604546e-06, 'epoch': 0.76} 76%|███████▌ | 26110/34278 [28:41:08<8:12:22, 3.62s/it] 76%|███████▌ | 26111/34278 [28:41:11<7:51:31, 3.46s/it] {'loss': 0.1052, 'grad_norm': 0.7258672718667234, 'learning_rate': 1.416278160313926e-06, 'epoch': 0.76} 76%|███████▌ | 26111/34278 [28:41:11<7:51:31, 3.46s/it] 76%|███████▌ | 26112/34278 [28:41:14<7:43:17, 3.40s/it] {'loss': 0.129, 'grad_norm': 0.8639625016813878, 'learning_rate': 
1.41594873096201e-06, 'epoch': 0.76} 76%|███████▌ | 26112/34278 [28:41:14<7:43:17, 3.40s/it] 76%|███████▌ | 26113/34278 [28:41:18<7:44:40, 3.41s/it] {'loss': 0.1188, 'grad_norm': 1.0549937706213535, 'learning_rate': 1.4156193336076468e-06, 'epoch': 0.76} 76%|███████▌ | 26113/34278 [28:41:18<7:44:40, 3.41s/it] 76%|███████▌ | 26114/34278 [28:41:23<9:11:15, 4.05s/it] {'loss': 0.1311, 'grad_norm': 1.040744735760747, 'learning_rate': 1.4152899682537807e-06, 'epoch': 0.76} 76%|███████▌ | 26114/34278 [28:41:23<9:11:15, 4.05s/it] 76%|███████▌ | 26115/34278 [28:41:28<9:39:09, 4.26s/it] {'loss': 0.1102, 'grad_norm': 0.800371634445398, 'learning_rate': 1.4149606349033479e-06, 'epoch': 0.76} 76%|███████▌ | 26115/34278 [28:41:28<9:39:09, 4.26s/it] 76%|███████▌ | 26116/34278 [28:41:34<10:55:28, 4.82s/it] {'loss': 0.1107, 'grad_norm': 0.7326289348566162, 'learning_rate': 1.414631333559292e-06, 'epoch': 0.76} 76%|███████▌ | 26116/34278 [28:41:34<10:55:28, 4.82s/it] 76%|███████▌ | 26117/34278 [28:41:37<9:34:46, 4.23s/it] {'loss': 0.1244, 'grad_norm': 0.8358216723861759, 'learning_rate': 1.4143020642245508e-06, 'epoch': 0.76} 76%|███████▌ | 26117/34278 [28:41:37<9:34:46, 4.23s/it] 76%|███████▌ | 26118/34278 [28:41:43<10:39:23, 4.70s/it] {'loss': 0.1041, 'grad_norm': 0.9032544770366628, 'learning_rate': 1.4139728269020658e-06, 'epoch': 0.76} 76%|███████▌ | 26118/34278 [28:41:43<10:39:23, 4.70s/it] 76%|███████▌ | 26119/34278 [28:41:49<11:38:58, 5.14s/it] {'loss': 0.1187, 'grad_norm': 0.6647824916904694, 'learning_rate': 1.4136436215947758e-06, 'epoch': 0.76} 76%|███████▌ | 26119/34278 [28:41:49<11:38:58, 5.14s/it] 76%|███████▌ | 26120/34278 [28:41:53<11:02:54, 4.88s/it] {'loss': 0.1216, 'grad_norm': 0.7611100393768583, 'learning_rate': 1.4133144483056177e-06, 'epoch': 0.76} 76%|███████▌ | 26120/34278 [28:41:53<11:02:54, 4.88s/it] 76%|███████▌ | 26121/34278 [28:41:57<10:08:02, 4.47s/it] {'loss': 0.1087, 'grad_norm': 1.0394359126634016, 'learning_rate': 1.412985307037532e-06, 'epoch': 
0.76} 76%|███████▌ | 26121/34278 [28:41:57<10:08:02, 4.47s/it] 76%|███████▌ | 26122/34278 [28:42:00<9:12:26, 4.06s/it] {'loss': 0.1305, 'grad_norm': 0.8332488684432571, 'learning_rate': 1.4126561977934588e-06, 'epoch': 0.76} 76%|███████▌ | 26122/34278 [28:42:00<9:12:26, 4.06s/it] 76%|███████▌ | 26123/34278 [28:42:03<8:28:57, 3.74s/it] {'loss': 0.1103, 'grad_norm': 0.8175459595351123, 'learning_rate': 1.4123271205763328e-06, 'epoch': 0.76} 76%|███████▌ | 26123/34278 [28:42:03<8:28:57, 3.74s/it] 76%|███████▌ | 26124/34278 [28:42:06<7:59:57, 3.53s/it] {'loss': 0.0998, 'grad_norm': 1.0622581782031035, 'learning_rate': 1.4119980753890961e-06, 'epoch': 0.76} 76%|███████▌ | 26124/34278 [28:42:06<7:59:57, 3.53s/it] 76%|███████▌ | 26125/34278 [28:42:09<7:49:58, 3.46s/it] {'loss': 0.115, 'grad_norm': 1.350676129465357, 'learning_rate': 1.4116690622346834e-06, 'epoch': 0.76} 76%|███████▌ | 26125/34278 [28:42:09<7:49:58, 3.46s/it] 76%|███████▌ | 26126/34278 [28:42:12<7:34:49, 3.35s/it] {'loss': 0.1075, 'grad_norm': 1.1058608444791878, 'learning_rate': 1.411340081116031e-06, 'epoch': 0.76} 76%|███████▌ | 26126/34278 [28:42:12<7:34:49, 3.35s/it] 76%|███████▌ | 26127/34278 [28:42:17<8:14:13, 3.64s/it] {'loss': 0.1241, 'grad_norm': 0.8742520958084282, 'learning_rate': 1.4110111320360782e-06, 'epoch': 0.76} 76%|███████▌ | 26127/34278 [28:42:17<8:14:13, 3.64s/it] 76%|███████▌ | 26128/34278 [28:42:19<7:46:15, 3.43s/it] {'loss': 0.1218, 'grad_norm': 0.8807076242303162, 'learning_rate': 1.4106822149977628e-06, 'epoch': 0.76} 76%|███████▌ | 26128/34278 [28:42:19<7:46:15, 3.43s/it] 76%|███████▌ | 26129/34278 [28:42:25<9:07:05, 4.03s/it] {'loss': 0.1246, 'grad_norm': 0.7028763989526833, 'learning_rate': 1.4103533300040196e-06, 'epoch': 0.76} 76%|███████▌ | 26129/34278 [28:42:25<9:07:05, 4.03s/it] 76%|███████▌ | 26130/34278 [28:42:28<8:29:09, 3.75s/it] {'loss': 0.1214, 'grad_norm': 0.7250147832036411, 'learning_rate': 1.4100244770577831e-06, 'epoch': 0.76} 76%|███████▌ | 26130/34278 
[28:42:28<8:29:09, 3.75s/it] 76%|███████▌ | 26131/34278 [28:42:31<7:43:05, 3.41s/it] {'loss': 0.1208, 'grad_norm': 0.8747872299612623, 'learning_rate': 1.4096956561619929e-06, 'epoch': 0.76} 76%|███████▌ | 26131/34278 [28:42:31<7:43:05, 3.41s/it] 76%|███████▌ | 26132/34278 [28:42:36<9:22:10, 4.14s/it] {'loss': 0.0886, 'grad_norm': 0.6920845028611138, 'learning_rate': 1.4093668673195832e-06, 'epoch': 0.76} 76%|███████▌ | 26132/34278 [28:42:36<9:22:10, 4.14s/it] 76%|███████▌ | 26133/34278 [28:42:41<9:43:24, 4.30s/it] {'loss': 0.1176, 'grad_norm': 0.8411257337544789, 'learning_rate': 1.409038110533485e-06, 'epoch': 0.76} 76%|███████▌ | 26133/34278 [28:42:41<9:43:24, 4.30s/it] 76%|███████▌ | 26134/34278 [28:42:44<8:43:48, 3.86s/it] {'loss': 0.1193, 'grad_norm': 0.7044509770592748, 'learning_rate': 1.408709385806641e-06, 'epoch': 0.76} 76%|███████▌ | 26134/34278 [28:42:44<8:43:48, 3.86s/it] 76%|███████▌ | 26135/34278 [28:42:47<8:01:20, 3.55s/it] {'loss': 0.1249, 'grad_norm': 0.9000777091653247, 'learning_rate': 1.4083806931419825e-06, 'epoch': 0.76} 76%|███████▌ | 26135/34278 [28:42:47<8:01:20, 3.55s/it] 76%|███████▌ | 26136/34278 [28:42:50<8:00:14, 3.54s/it] {'loss': 0.1076, 'grad_norm': 0.773039298613744, 'learning_rate': 1.4080520325424418e-06, 'epoch': 0.76} 76%|███████▌ | 26136/34278 [28:42:50<8:00:14, 3.54s/it] 76%|███████▋ | 26137/34278 [28:42:57<9:53:08, 4.37s/it] {'loss': 0.1214, 'grad_norm': 0.718514893172037, 'learning_rate': 1.4077234040109567e-06, 'epoch': 0.76} 76%|███████▋ | 26137/34278 [28:42:57<9:53:08, 4.37s/it] 76%|███████▋ | 26138/34278 [28:43:00<9:00:08, 3.98s/it] {'loss': 0.1298, 'grad_norm': 0.7763429287032169, 'learning_rate': 1.4073948075504596e-06, 'epoch': 0.76} 76%|███████▋ | 26138/34278 [28:43:00<9:00:08, 3.98s/it] 76%|███████▋ | 26139/34278 [28:43:03<8:21:15, 3.70s/it] {'loss': 0.122, 'grad_norm': 1.1534295868138604, 'learning_rate': 1.4070662431638821e-06, 'epoch': 0.76} 76%|███████▋ | 26139/34278 [28:43:03<8:21:15, 3.70s/it] 76%|███████▋ 
| 26140/34278 [28:43:08<9:30:56, 4.21s/it] {'loss': 0.1488, 'grad_norm': 0.8624455969386104, 'learning_rate': 1.4067377108541597e-06, 'epoch': 0.76} 76%|███████▋ | 26140/34278 [28:43:08<9:30:56, 4.21s/it] 76%|███████▋ | 26141/34278 [28:43:11<8:48:30, 3.90s/it] {'loss': 0.1213, 'grad_norm': 0.8243022362839859, 'learning_rate': 1.4064092106242272e-06, 'epoch': 0.76} 76%|███████▋ | 26141/34278 [28:43:11<8:48:30, 3.90s/it] 76%|███████▋ | 26142/34278 [28:43:17<9:49:32, 4.35s/it] {'loss': 0.1135, 'grad_norm': 0.769343615825338, 'learning_rate': 1.406080742477015e-06, 'epoch': 0.76} 76%|███████▋ | 26142/34278 [28:43:17<9:49:32, 4.35s/it] 76%|███████▋ | 26143/34278 [28:43:22<10:27:52, 4.63s/it] {'loss': 0.1172, 'grad_norm': 0.9565018495201091, 'learning_rate': 1.4057523064154544e-06, 'epoch': 0.76} 76%|███████▋ | 26143/34278 [28:43:22<10:27:52, 4.63s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 76%|███████▋ | 26144/34278 [28:43:27<10:39:02, 4.71s/it] {'loss': 0.1457, 'grad_norm': 1.4047610873709226, 'learning_rate': 1.405423902442481e-06, 'epoch': 0.76} 76%|███████▋ | 26144/34278 [28:43:27<10:39:02, 4.71s/it] 76%|███████▋ | 26145/34278 [28:43:33<11:28:02, 5.08s/it] {'loss': 0.1271, 'grad_norm': 0.9720275556686869, 'learning_rate': 1.4050955305610232e-06, 'epoch': 0.76} 76%|███████▋ | 26145/34278 [28:43:33<11:28:02, 5.08s/it] 76%|███████▋ | 26146/34278 [28:43:39<12:22:21, 5.48s/it] {'loss': 0.0847, 'grad_norm': 0.7835991442333221, 'learning_rate': 1.4047671907740156e-06, 'epoch': 0.76} 76%|███████▋ | 26146/34278 [28:43:39<12:22:21, 5.48s/it] 76%|███████▋ | 26147/34278 [28:43:42<10:46:33, 4.77s/it] {'loss': 0.1313, 'grad_norm': 0.9013144996617115, 'learning_rate': 1.4044388830843875e-06, 'epoch': 0.76} 76%|███████▋ | 26147/34278 [28:43:42<10:46:33, 4.77s/it] 76%|███████▋ | 26148/34278 [28:43:46<10:20:45, 4.58s/it] {'loss': 0.0963, 'grad_norm': 0.8317579576376288, 'learning_rate': 1.4041106074950716e-06, 'epoch': 0.76} 76%|███████▋ | 26148/34278 [28:43:47<10:20:45, 4.58s/it] 76%|███████▋ | 26149/34278 [28:43:49<9:03:26, 4.01s/it] {'loss': 0.1088, 'grad_norm': 0.9615326176543755, 'learning_rate': 1.4037823640089982e-06, 'epoch': 0.76} 76%|███████▋ | 26149/34278 [28:43:49<9:03:26, 4.01s/it] 76%|███████▋ | 26150/34278 [28:43:54<9:28:09, 4.19s/it] {'loss': 0.1356, 'grad_norm': 0.9889612993697408, 'learning_rate': 1.4034541526290957e-06, 'epoch': 0.76} 76%|███████▋ | 26150/34278 [28:43:54<9:28:09, 4.19s/it] 76%|███████▋ | 26151/34278 [28:43:57<8:49:24, 3.91s/it] {'loss': 0.1057, 'grad_norm': 0.8351460427530856, 'learning_rate': 1.4031259733582958e-06, 'epoch': 0.76} 76%|███████▋ | 26151/34278 [28:43:57<8:49:24, 3.91s/it] 76%|███████▋ | 26152/34278 [28:44:00<8:10:41, 3.62s/it] {'loss': 0.1043, 'grad_norm': 0.897135120743724, 'learning_rate': 1.4027978261995301e-06, 'epoch': 0.76} 76%|███████▋ | 26152/34278 [28:44:00<8:10:41, 
3.62s/it] 76%|███████▋ | 26153/34278 [28:44:04<8:09:47, 3.62s/it] {'loss': 0.1102, 'grad_norm': 0.9586903416993737, 'learning_rate': 1.4024697111557251e-06, 'epoch': 0.76} 76%|███████▋ | 26153/34278 [28:44:04<8:09:47, 3.62s/it] 76%|███████▋ | 26154/34278 [28:44:08<8:57:56, 3.97s/it] {'loss': 0.1033, 'grad_norm': 0.7091006457430841, 'learning_rate': 1.4021416282298133e-06, 'epoch': 0.76} 76%|███████▋ | 26154/34278 [28:44:08<8:57:56, 3.97s/it] 76%|███████▋ | 26155/34278 [28:44:12<8:26:00, 3.74s/it] {'loss': 0.1457, 'grad_norm': 0.8564627396767824, 'learning_rate': 1.401813577424722e-06, 'epoch': 0.76} 76%|███████▋ | 26155/34278 [28:44:12<8:26:00, 3.74s/it] 76%|███████▋ | 26156/34278 [28:44:15<8:17:04, 3.67s/it] {'loss': 0.1177, 'grad_norm': 0.8665696067842101, 'learning_rate': 1.401485558743379e-06, 'epoch': 0.76} 76%|███████▋ | 26156/34278 [28:44:15<8:17:04, 3.67s/it] 76%|███████▋ | 26157/34278 [28:44:18<8:00:31, 3.55s/it] {'loss': 0.134, 'grad_norm': 1.0068283095317692, 'learning_rate': 1.401157572188714e-06, 'epoch': 0.76} 76%|███████▋ | 26157/34278 [28:44:18<8:00:31, 3.55s/it] 76%|███████▋ | 26158/34278 [28:44:22<7:49:56, 3.47s/it] {'loss': 0.1269, 'grad_norm': 0.8907497856235018, 'learning_rate': 1.4008296177636565e-06, 'epoch': 0.76} 76%|███████▋ | 26158/34278 [28:44:22<7:49:56, 3.47s/it] 76%|███████▋ | 26159/34278 [28:44:26<8:05:17, 3.59s/it] {'loss': 0.0952, 'grad_norm': 0.9406625381336449, 'learning_rate': 1.4005016954711325e-06, 'epoch': 0.76} 76%|███████▋ | 26159/34278 [28:44:26<8:05:17, 3.59s/it] 76%|███████▋ | 26160/34278 [28:44:29<7:59:34, 3.54s/it] {'loss': 0.1187, 'grad_norm': 1.0340193349784765, 'learning_rate': 1.400173805314069e-06, 'epoch': 0.76} 76%|███████▋ | 26160/34278 [28:44:29<7:59:34, 3.54s/it] 76%|███████▋ | 26161/34278 [28:44:33<8:07:54, 3.61s/it] {'loss': 0.1209, 'grad_norm': 0.8797648399153376, 'learning_rate': 1.3998459472953956e-06, 'epoch': 0.76} 76%|███████▋ | 26161/34278 [28:44:33<8:07:54, 3.61s/it] 76%|███████▋ | 26162/34278 
[28:44:36<7:58:53, 3.54s/it] {'loss': 0.1182, 'grad_norm': 0.8892336228707008, 'learning_rate': 1.3995181214180386e-06, 'epoch': 0.76} 76%|███████▋ | 26162/34278 [28:44:36<7:58:53, 3.54s/it] 76%|███████▋ | 26163/34278 [28:44:41<8:53:58, 3.95s/it] {'loss': 0.1184, 'grad_norm': 0.7893949456538327, 'learning_rate': 1.399190327684921e-06, 'epoch': 0.76} 76%|███████▋ | 26163/34278 [28:44:41<8:53:58, 3.95s/it] 76%|███████▋ | 26164/34278 [28:44:44<8:16:14, 3.67s/it] {'loss': 0.0968, 'grad_norm': 0.9609933664140798, 'learning_rate': 1.3988625660989758e-06, 'epoch': 0.76} 76%|███████▋ | 26164/34278 [28:44:44<8:16:14, 3.67s/it] 76%|███████▋ | 26165/34278 [28:44:47<7:54:45, 3.51s/it] {'loss': 0.0961, 'grad_norm': 0.7529623815427332, 'learning_rate': 1.3985348366631258e-06, 'epoch': 0.76} 76%|███████▋ | 26165/34278 [28:44:47<7:54:45, 3.51s/it] 76%|███████▋ | 26166/34278 [28:44:50<7:38:59, 3.39s/it] {'loss': 0.1115, 'grad_norm': 0.8554896954035796, 'learning_rate': 1.3982071393802953e-06, 'epoch': 0.76} 76%|███████▋ | 26166/34278 [28:44:50<7:38:59, 3.39s/it] 76%|███████▋ | 26167/34278 [28:44:56<9:31:31, 4.23s/it] {'loss': 0.1248, 'grad_norm': 0.9777081127658431, 'learning_rate': 1.3978794742534135e-06, 'epoch': 0.76} 76%|███████▋ | 26167/34278 [28:44:56<9:31:31, 4.23s/it] 76%|███████▋ | 26168/34278 [28:44:59<8:42:07, 3.86s/it] {'loss': 0.1054, 'grad_norm': 0.7910750670891128, 'learning_rate': 1.3975518412854038e-06, 'epoch': 0.76} 76%|███████▋ | 26168/34278 [28:44:59<8:42:07, 3.86s/it] 76%|███████▋ | 26169/34278 [28:45:03<8:23:19, 3.72s/it] {'loss': 0.1132, 'grad_norm': 0.8372695287966306, 'learning_rate': 1.3972242404791896e-06, 'epoch': 0.76} 76%|███████▋ | 26169/34278 [28:45:03<8:23:19, 3.72s/it] 76%|███████▋ | 26170/34278 [28:45:07<8:57:54, 3.98s/it] {'loss': 0.1249, 'grad_norm': 0.7779962076886326, 'learning_rate': 1.3968966718376976e-06, 'epoch': 0.76} 76%|███████▋ | 26170/34278 [28:45:07<8:57:54, 3.98s/it] 76%|███████▋ | 26171/34278 [28:45:10<8:11:53, 3.64s/it] {'loss': 
0.1071, 'grad_norm': 0.9212527853625375, 'learning_rate': 1.3965691353638532e-06, 'epoch': 0.76} 76%|███████▋ | 26171/34278 [28:45:10<8:11:53, 3.64s/it] 76%|███████▋ | 26172/34278 [28:45:14<8:14:44, 3.66s/it] {'loss': 0.0999, 'grad_norm': 0.9292382048809429, 'learning_rate': 1.3962416310605798e-06, 'epoch': 0.76} 76%|███████▋ | 26172/34278 [28:45:14<8:14:44, 3.66s/it] 76%|███████▋ | 26173/34278 [28:45:17<7:37:57, 3.39s/it] {'loss': 0.1005, 'grad_norm': 1.134281645203754, 'learning_rate': 1.395914158930799e-06, 'epoch': 0.76} 76%|███████▋ | 26173/34278 [28:45:17<7:37:57, 3.39s/it] 76%|███████▋ | 26174/34278 [28:45:20<7:29:43, 3.33s/it] {'loss': 0.0955, 'grad_norm': 0.9668929216293283, 'learning_rate': 1.395586718977438e-06, 'epoch': 0.76} 76%|███████▋ | 26174/34278 [28:45:20<7:29:43, 3.33s/it] 76%|███████▋ | 26175/34278 [28:45:23<7:38:06, 3.39s/it] {'loss': 0.1068, 'grad_norm': 0.806384713606756, 'learning_rate': 1.3952593112034163e-06, 'epoch': 0.76} 76%|███████▋ | 26175/34278 [28:45:24<7:38:06, 3.39s/it] 76%|███████▋ | 26176/34278 [28:45:27<7:35:55, 3.38s/it] {'loss': 0.1182, 'grad_norm': 0.729396128674825, 'learning_rate': 1.3949319356116608e-06, 'epoch': 0.76} 76%|███████▋ | 26176/34278 [28:45:27<7:35:55, 3.38s/it] 76%|███████▋ | 26177/34278 [28:45:33<9:15:42, 4.12s/it] {'loss': 0.114, 'grad_norm': 1.0656409502481048, 'learning_rate': 1.3946045922050911e-06, 'epoch': 0.76} 76%|███████▋ | 26177/34278 [28:45:33<9:15:42, 4.12s/it] 76%|███████▋ | 26178/34278 [28:45:36<8:29:05, 3.77s/it] {'loss': 0.1134, 'grad_norm': 0.9233551092026401, 'learning_rate': 1.3942772809866317e-06, 'epoch': 0.76} 76%|███████▋ | 26178/34278 [28:45:36<8:29:05, 3.77s/it] 76%|███████▋ | 26179/34278 [28:45:42<9:55:08, 4.41s/it] {'loss': 0.1182, 'grad_norm': 0.9580018016534266, 'learning_rate': 1.3939500019592046e-06, 'epoch': 0.76} 76%|███████▋ | 26179/34278 [28:45:42<9:55:08, 4.41s/it] 76%|███████▋ | 26180/34278 [28:45:45<9:03:54, 4.03s/it] {'loss': 0.1503, 'grad_norm': 0.7582018960150472, 
'learning_rate': 1.3936227551257293e-06, 'epoch': 0.76} 76%|███████▋ | 26180/34278 [28:45:45<9:03:54, 4.03s/it] 76%|███████▋ | 26181/34278 [28:45:48<8:19:26, 3.70s/it] {'loss': 0.0959, 'grad_norm': 0.7594002273528503, 'learning_rate': 1.3932955404891295e-06, 'epoch': 0.76} 76%|███████▋ | 26181/34278 [28:45:48<8:19:26, 3.70s/it] 76%|███████▋ | 26182/34278 [28:45:51<8:05:04, 3.59s/it] {'loss': 0.1023, 'grad_norm': 0.8320128400983632, 'learning_rate': 1.3929683580523274e-06, 'epoch': 0.76} 76%|███████▋ | 26182/34278 [28:45:51<8:05:04, 3.59s/it] 76%|███████▋ | 26183/34278 [28:45:54<7:45:02, 3.45s/it] {'loss': 0.1258, 'grad_norm': 0.8733870138636963, 'learning_rate': 1.3926412078182411e-06, 'epoch': 0.76} 76%|███████▋ | 26183/34278 [28:45:54<7:45:02, 3.45s/it] 76%|███████▋ | 26184/34278 [28:46:00<9:24:47, 4.19s/it] {'loss': 0.1367, 'grad_norm': 0.9261463317054388, 'learning_rate': 1.392314089789794e-06, 'epoch': 0.76} 76%|███████▋ | 26184/34278 [28:46:00<9:24:47, 4.19s/it] 76%|███████▋ | 26185/34278 [28:46:03<8:43:11, 3.88s/it] {'loss': 0.1163, 'grad_norm': 0.9961368062548425, 'learning_rate': 1.3919870039699062e-06, 'epoch': 0.76} 76%|███████▋ | 26185/34278 [28:46:03<8:43:11, 3.88s/it] 76%|███████▋ | 26186/34278 [28:46:10<10:29:55, 4.67s/it] {'loss': 0.1006, 'grad_norm': 0.6961372193675052, 'learning_rate': 1.3916599503614958e-06, 'epoch': 0.76} 76%|███████▋ | 26186/34278 [28:46:10<10:29:55, 4.67s/it] 76%|███████▋ | 26187/34278 [28:46:13<9:17:05, 4.13s/it] {'loss': 0.114, 'grad_norm': 0.7459581187138465, 'learning_rate': 1.391332928967483e-06, 'epoch': 0.76} 76%|███████▋ | 26187/34278 [28:46:13<9:17:05, 4.13s/it] 76%|███████▋ | 26188/34278 [28:46:16<8:37:16, 3.84s/it] {'loss': 0.1203, 'grad_norm': 0.9477156298218982, 'learning_rate': 1.391005939790791e-06, 'epoch': 0.76} 76%|███████▋ | 26188/34278 [28:46:16<8:37:16, 3.84s/it] 76%|███████▋ | 26189/34278 [28:46:19<8:33:00, 3.81s/it] {'loss': 0.1258, 'grad_norm': 1.0762321539839521, 'learning_rate': 
1.3906789828343358e-06, 'epoch': 0.76} 76%|███████▋ | 26189/34278 [28:46:19<8:33:00, 3.81s/it] 76%|███████▋ | 26190/34278 [28:46:23<8:13:53, 3.66s/it] {'loss': 0.128, 'grad_norm': 0.9400921453796561, 'learning_rate': 1.3903520581010354e-06, 'epoch': 0.76} 76%|███████▋ | 26190/34278 [28:46:23<8:13:53, 3.66s/it] 76%|███████▋ | 26191/34278 [28:46:27<8:48:18, 3.92s/it] {'loss': 0.1159, 'grad_norm': 0.9850343275209514, 'learning_rate': 1.3900251655938118e-06, 'epoch': 0.76} 76%|███████▋ | 26191/34278 [28:46:27<8:48:18, 3.92s/it] 76%|███████▋ | 26192/34278 [28:46:30<8:05:33, 3.60s/it] {'loss': 0.108, 'grad_norm': 0.776003117127375, 'learning_rate': 1.3896983053155821e-06, 'epoch': 0.76} 76%|███████▋ | 26192/34278 [28:46:30<8:05:33, 3.60s/it] 76%|███████▋ | 26193/34278 [28:46:34<8:01:45, 3.58s/it] {'loss': 0.0962, 'grad_norm': 0.8159114840533124, 'learning_rate': 1.3893714772692607e-06, 'epoch': 0.76} 76%|███████▋ | 26193/34278 [28:46:34<8:01:45, 3.58s/it] 76%|███████▋ | 26194/34278 [28:46:37<7:50:30, 3.49s/it] {'loss': 0.1176, 'grad_norm': 0.9462367818043006, 'learning_rate': 1.389044681457772e-06, 'epoch': 0.76} 76%|███████▋ | 26194/34278 [28:46:37<7:50:30, 3.49s/it] 76%|███████▋ | 26195/34278 [28:46:41<7:58:28, 3.55s/it] {'loss': 0.1124, 'grad_norm': 1.0006720493583758, 'learning_rate': 1.3887179178840305e-06, 'epoch': 0.76} 76%|███████▋ | 26195/34278 [28:46:41<7:58:28, 3.55s/it] 76%|███████▋ | 26196/34278 [28:46:45<8:25:44, 3.75s/it] {'loss': 0.1061, 'grad_norm': 0.8008535688160848, 'learning_rate': 1.3883911865509514e-06, 'epoch': 0.76} 76%|███████▋ | 26196/34278 [28:46:45<8:25:44, 3.75s/it] 76%|███████▋ | 26197/34278 [28:46:51<9:53:13, 4.40s/it] {'loss': 0.1058, 'grad_norm': 0.7653365849573978, 'learning_rate': 1.3880644874614552e-06, 'epoch': 0.76} 76%|███████▋ | 26197/34278 [28:46:51<9:53:13, 4.40s/it] 76%|███████▋ | 26198/34278 [28:46:54<9:05:32, 4.05s/it] {'loss': 0.1298, 'grad_norm': 1.0773741125237393, 'learning_rate': 1.3877378206184571e-06, 'epoch': 0.76} 
76%|███████▋ | 26198/34278 [28:46:54<9:05:32, 4.05s/it] 76%|███████▋ | 26199/34278 [28:46:57<8:30:29, 3.79s/it] {'loss': 0.1141, 'grad_norm': 0.9364940124081743, 'learning_rate': 1.3874111860248722e-06, 'epoch': 0.76} 76%|███████▋ | 26199/34278 [28:46:57<8:30:29, 3.79s/it] 76%|███████▋ | 26200/34278 [28:47:01<8:35:40, 3.83s/it] {'loss': 0.1126, 'grad_norm': 0.9301490879668555, 'learning_rate': 1.3870845836836177e-06, 'epoch': 0.76} 76%|███████▋ | 26200/34278 [28:47:01<8:35:40, 3.83s/it] 76%|███████▋ | 26201/34278 [28:47:07<10:00:51, 4.46s/it] {'loss': 0.1168, 'grad_norm': 0.8093211906790393, 'learning_rate': 1.386758013597611e-06, 'epoch': 0.76} 76%|███████▋ | 26201/34278 [28:47:07<10:00:51, 4.46s/it] 76%|███████▋ | 26202/34278 [28:47:11<9:25:53, 4.20s/it] {'loss': 0.1174, 'grad_norm': 0.8698017267897885, 'learning_rate': 1.386431475769766e-06, 'epoch': 0.76} 76%|███████▋ | 26202/34278 [28:47:11<9:25:53, 4.20s/it] 76%|███████▋ | 26203/34278 [28:47:15<9:12:33, 4.11s/it] {'loss': 0.1231, 'grad_norm': 0.953222352438878, 'learning_rate': 1.3861049702029971e-06, 'epoch': 0.76} 76%|███████▋ | 26203/34278 [28:47:15<9:12:33, 4.11s/it] 76%|███████▋ | 26204/34278 [28:47:18<8:33:55, 3.82s/it] {'loss': 0.1207, 'grad_norm': 0.8810582395735321, 'learning_rate': 1.3857784969002214e-06, 'epoch': 0.76} 76%|███████▋ | 26204/34278 [28:47:18<8:33:55, 3.82s/it] 76%|███████▋ | 26205/34278 [28:47:21<7:59:29, 3.56s/it] {'loss': 0.1412, 'grad_norm': 0.7950840377710335, 'learning_rate': 1.3854520558643513e-06, 'epoch': 0.76} 76%|███████▋ | 26205/34278 [28:47:21<7:59:29, 3.56s/it] 76%|███████▋ | 26206/34278 [28:47:24<7:40:31, 3.42s/it] {'loss': 0.1035, 'grad_norm': 0.8763916156368131, 'learning_rate': 1.3851256470983037e-06, 'epoch': 0.76} 76%|███████▋ | 26206/34278 [28:47:24<7:40:31, 3.42s/it] 76%|███████▋ | 26207/34278 [28:47:27<7:39:19, 3.41s/it] {'loss': 0.127, 'grad_norm': 0.8761734058534855, 'learning_rate': 1.38479927060499e-06, 'epoch': 0.76} 76%|███████▋ | 26207/34278 
[28:47:27<7:39:19, 3.41s/it] 76%|███████▋ | 26208/34278 [28:47:30<7:19:39, 3.27s/it] {'loss': 0.1309, 'grad_norm': 1.0127740547800734, 'learning_rate': 1.3844729263873269e-06, 'epoch': 0.76} 76%|███████▋ | 26208/34278 [28:47:30<7:19:39, 3.27s/it] 76%|███████▋ | 26209/34278 [28:47:36<9:08:27, 4.08s/it] {'loss': 0.1312, 'grad_norm': 0.8732384759525622, 'learning_rate': 1.3841466144482262e-06, 'epoch': 0.76} 76%|███████▋ | 26209/34278 [28:47:36<9:08:27, 4.08s/it] 76%|███████▋ | 26210/34278 [28:47:39<8:17:56, 3.70s/it] {'loss': 0.1104, 'grad_norm': 0.9122354532164116, 'learning_rate': 1.3838203347905999e-06, 'epoch': 0.76} 76%|███████▋ | 26210/34278 [28:47:39<8:17:56, 3.70s/it] 76%|███████▋ | 26211/34278 [28:47:43<8:53:57, 3.97s/it] {'loss': 0.1152, 'grad_norm': 0.8134407679285944, 'learning_rate': 1.3834940874173624e-06, 'epoch': 0.76} 76%|███████▋ | 26211/34278 [28:47:43<8:53:57, 3.97s/it] 76%|███████▋ | 26212/34278 [28:47:47<8:42:21, 3.89s/it] {'loss': 0.1396, 'grad_norm': 0.8169273397185864, 'learning_rate': 1.383167872331428e-06, 'epoch': 0.76} 76%|███████▋ | 26212/34278 [28:47:47<8:42:21, 3.89s/it] 76%|███████▋ | 26213/34278 [28:47:50<8:04:16, 3.60s/it] {'loss': 0.1084, 'grad_norm': 0.9313472989800363, 'learning_rate': 1.382841689535706e-06, 'epoch': 0.76} 76%|███████▋ | 26213/34278 [28:47:50<8:04:16, 3.60s/it] 76%|███████▋ | 26214/34278 [28:47:53<7:37:28, 3.40s/it] {'loss': 0.1313, 'grad_norm': 0.8093829327740456, 'learning_rate': 1.3825155390331114e-06, 'epoch': 0.76} 76%|███████▋ | 26214/34278 [28:47:53<7:37:28, 3.40s/it] 76%|███████▋ | 26215/34278 [28:47:59<9:19:25, 4.16s/it] {'loss': 0.1357, 'grad_norm': 0.9013230684690757, 'learning_rate': 1.382189420826554e-06, 'epoch': 0.76} 76%|███████▋ | 26215/34278 [28:47:59<9:19:25, 4.16s/it] 76%|███████▋ | 26216/34278 [28:48:02<8:40:52, 3.88s/it] {'loss': 0.1242, 'grad_norm': 0.7668506858228894, 'learning_rate': 1.3818633349189448e-06, 'epoch': 0.76} 76%|███████▋ | 26216/34278 [28:48:02<8:40:52, 3.88s/it] 
76%|███████▋ | 26217/34278 [28:48:05<8:00:47, 3.58s/it] {'loss': 0.1132, 'grad_norm': 0.926569989866445, 'learning_rate': 1.381537281313196e-06, 'epoch': 0.76} 76%|███████▋ | 26217/34278 [28:48:05<8:00:47, 3.58s/it] 76%|███████▋ | 26218/34278 [28:48:09<8:11:50, 3.66s/it] {'loss': 0.0901, 'grad_norm': 0.6817679841498191, 'learning_rate': 1.3812112600122201e-06, 'epoch': 0.76} 76%|███████▋ | 26218/34278 [28:48:09<8:11:50, 3.66s/it] 76%|███████▋ | 26219/34278 [28:48:12<7:56:27, 3.55s/it] {'loss': 0.1268, 'grad_norm': 0.8774705720123143, 'learning_rate': 1.3808852710189263e-06, 'epoch': 0.76} 76%|███████▋ | 26219/34278 [28:48:12<7:56:27, 3.55s/it] 76%|███████▋ | 26220/34278 [28:48:16<7:51:41, 3.51s/it] {'loss': 0.1262, 'grad_norm': 0.853094144607071, 'learning_rate': 1.3805593143362227e-06, 'epoch': 0.76} 76%|███████▋ | 26220/34278 [28:48:16<7:51:41, 3.51s/it] 76%|███████▋ | 26221/34278 [28:48:19<7:33:55, 3.38s/it] {'loss': 0.1089, 'grad_norm': 0.8010893325699411, 'learning_rate': 1.3802333899670239e-06, 'epoch': 0.76} 76%|███████▋ | 26221/34278 [28:48:19<7:33:55, 3.38s/it] 76%|███████▋ | 26222/34278 [28:48:22<7:38:38, 3.42s/it] {'loss': 0.1336, 'grad_norm': 0.8227006921687338, 'learning_rate': 1.3799074979142369e-06, 'epoch': 0.76} 76%|███████▋ | 26222/34278 [28:48:22<7:38:38, 3.42s/it] 77%|███████▋ | 26223/34278 [28:48:26<8:08:21, 3.64s/it] {'loss': 0.1174, 'grad_norm': 0.7544040557332929, 'learning_rate': 1.379581638180768e-06, 'epoch': 0.77} 77%|███████▋ | 26223/34278 [28:48:26<8:08:21, 3.64s/it] 77%|███████▋ | 26224/34278 [28:48:32<9:39:17, 4.32s/it] {'loss': 0.1453, 'grad_norm': 0.8637354983940975, 'learning_rate': 1.3792558107695335e-06, 'epoch': 0.77} 77%|███████▋ | 26224/34278 [28:48:32<9:39:17, 4.32s/it] 77%|███████▋ | 26225/34278 [28:48:35<8:55:30, 3.99s/it] {'loss': 0.1152, 'grad_norm': 1.0777165785817178, 'learning_rate': 1.3789300156834389e-06, 'epoch': 0.77} 77%|███████▋ | 26225/34278 [28:48:36<8:55:30, 3.99s/it] 77%|███████▋ | 26226/34278 
[28:48:41<9:51:11, 4.41s/it] {'loss': 0.1084, 'grad_norm': 0.7879738396809659, 'learning_rate': 1.3786042529253913e-06, 'epoch': 0.77} 77%|███████▋ | 26226/34278 [28:48:41<9:51:11, 4.41s/it] 77%|███████▋ | 26227/34278 [28:48:44<8:46:36, 3.92s/it] {'loss': 0.0875, 'grad_norm': 0.7016310321239007, 'learning_rate': 1.3782785224983015e-06, 'epoch': 0.77} 77%|███████▋ | 26227/34278 [28:48:44<8:46:36, 3.92s/it] 77%|███████▋ | 26228/34278 [28:48:47<8:09:00, 3.64s/it] {'loss': 0.1087, 'grad_norm': 1.2203319913017596, 'learning_rate': 1.3779528244050765e-06, 'epoch': 0.77} 77%|███████▋ | 26228/34278 [28:48:47<8:09:00, 3.64s/it] 77%|███████▋ | 26229/34278 [28:48:51<8:54:34, 3.98s/it] {'loss': 0.1388, 'grad_norm': 0.7662474942040508, 'learning_rate': 1.3776271586486229e-06, 'epoch': 0.77} 77%|███████▋ | 26229/34278 [28:48:51<8:54:34, 3.98s/it] 77%|███████▋ | 26230/34278 [28:48:55<8:48:05, 3.94s/it] {'loss': 0.1176, 'grad_norm': 0.8063930269842505, 'learning_rate': 1.3773015252318489e-06, 'epoch': 0.77} 77%|███████▋ | 26230/34278 [28:48:55<8:48:05, 3.94s/it] 77%|███████▋ | 26231/34278 [28:48:58<8:06:29, 3.63s/it] {'loss': 0.1126, 'grad_norm': 0.8660898453060176, 'learning_rate': 1.3769759241576642e-06, 'epoch': 0.77} 77%|███████▋ | 26231/34278 [28:48:58<8:06:29, 3.63s/it] 77%|███████▋ | 26232/34278 [28:49:01<7:35:33, 3.40s/it] {'loss': 0.0982, 'grad_norm': 1.010878535140388, 'learning_rate': 1.376650355428973e-06, 'epoch': 0.77} 77%|███████▋ | 26232/34278 [28:49:01<7:35:33, 3.40s/it] 77%|███████▋ | 26233/34278 [28:49:06<8:49:46, 3.95s/it] {'loss': 0.1208, 'grad_norm': 0.7806800752071383, 'learning_rate': 1.376324819048681e-06, 'epoch': 0.77} 77%|███████▋ | 26233/34278 [28:49:06<8:49:46, 3.95s/it] 77%|███████▋ | 26234/34278 [28:49:12<9:56:48, 4.45s/it] {'loss': 0.1226, 'grad_norm': 0.938268846177341, 'learning_rate': 1.3759993150196975e-06, 'epoch': 0.77} 77%|███████▋ | 26234/34278 [28:49:12<9:56:48, 4.45s/it] 77%|███████▋ | 26235/34278 [28:49:15<8:59:23, 4.02s/it] {'loss': 
0.125, 'grad_norm': 0.9861122910139926, 'learning_rate': 1.3756738433449257e-06, 'epoch': 0.77} 77%|███████▋ | 26235/34278 [28:49:15<8:59:23, 4.02s/it] 77%|███████▋ | 26236/34278 [28:49:18<8:06:46, 3.63s/it] {'loss': 0.1236, 'grad_norm': 0.9386610683992159, 'learning_rate': 1.375348404027274e-06, 'epoch': 0.77} 77%|███████▋ | 26236/34278 [28:49:18<8:06:46, 3.63s/it] 77%|███████▋ | 26237/34278 [28:49:21<7:42:09, 3.45s/it] {'loss': 0.1255, 'grad_norm': 0.8633672654574206, 'learning_rate': 1.375022997069645e-06, 'epoch': 0.77} 77%|███████▋ | 26237/34278 [28:49:21<7:42:09, 3.45s/it] 77%|███████▋ | 26238/34278 [28:49:24<7:40:41, 3.44s/it] {'loss': 0.1252, 'grad_norm': 0.8935751608128624, 'learning_rate': 1.374697622474947e-06, 'epoch': 0.77} 77%|███████▋ | 26238/34278 [28:49:24<7:40:41, 3.44s/it] 77%|███████▋ | 26239/34278 [28:49:28<8:09:04, 3.65s/it] {'loss': 0.1198, 'grad_norm': 0.8291465751734923, 'learning_rate': 1.374372280246083e-06, 'epoch': 0.77} 77%|███████▋ | 26239/34278 [28:49:28<8:09:04, 3.65s/it] 77%|███████▋ | 26240/34278 [28:49:33<8:44:05, 3.91s/it] {'loss': 0.0963, 'grad_norm': 0.6881511252850518, 'learning_rate': 1.374046970385956e-06, 'epoch': 0.77} 77%|███████▋ | 26240/34278 [28:49:33<8:44:05, 3.91s/it] 77%|███████▋ | 26241/34278 [28:49:36<8:05:53, 3.63s/it] {'loss': 0.1095, 'grad_norm': 0.7701647522818884, 'learning_rate': 1.3737216928974723e-06, 'epoch': 0.77} 77%|███████▋ | 26241/34278 [28:49:36<8:05:53, 3.63s/it] 77%|███████▋ | 26242/34278 [28:49:39<7:40:32, 3.44s/it] {'loss': 0.1439, 'grad_norm': 1.071376330529272, 'learning_rate': 1.373396447783537e-06, 'epoch': 0.77} 77%|███████▋ | 26242/34278 [28:49:39<7:40:32, 3.44s/it] 77%|███████▋ | 26243/34278 [28:49:43<8:35:39, 3.85s/it] {'loss': 0.1083, 'grad_norm': 1.0701209096809126, 'learning_rate': 1.3730712350470516e-06, 'epoch': 0.77} 77%|███████▋ | 26243/34278 [28:49:43<8:35:39, 3.85s/it] 77%|███████▋ | 26244/34278 [28:49:47<8:17:43, 3.72s/it] {'loss': 0.1212, 'grad_norm': 0.8551621899125321, 
'learning_rate': 1.372746054690921e-06, 'epoch': 0.77} 77%|███████▋ | 26244/34278 [28:49:47<8:17:43, 3.72s/it] 77%|███████▋ | 26245/34278 [28:49:51<8:14:15, 3.69s/it] {'loss': 0.1297, 'grad_norm': 1.0081329271816202, 'learning_rate': 1.3724209067180483e-06, 'epoch': 0.77} 77%|███████▋ | 26245/34278 [28:49:51<8:14:15, 3.69s/it] 77%|███████▋ | 26246/34278 [28:49:54<7:53:06, 3.53s/it] {'loss': 0.1246, 'grad_norm': 0.9135838539518528, 'learning_rate': 1.3720957911313342e-06, 'epoch': 0.77} 77%|███████▋ | 26246/34278 [28:49:54<7:53:06, 3.53s/it] 77%|███████▋ | 26247/34278 [28:49:57<7:37:12, 3.42s/it] {'loss': 0.1087, 'grad_norm': 0.7551891413814291, 'learning_rate': 1.3717707079336828e-06, 'epoch': 0.77} 77%|███████▋ | 26247/34278 [28:49:57<7:37:12, 3.42s/it] 77%|███████▋ | 26248/34278 [28:50:00<7:35:04, 3.40s/it] {'loss': 0.1187, 'grad_norm': 0.8999379916351353, 'learning_rate': 1.3714456571279984e-06, 'epoch': 0.77} 77%|███████▋ | 26248/34278 [28:50:00<7:35:04, 3.40s/it] 77%|███████▋ | 26249/34278 [28:50:04<7:43:16, 3.46s/it] {'loss': 0.1294, 'grad_norm': 1.0576561005476992, 'learning_rate': 1.3711206387171798e-06, 'epoch': 0.77} 77%|███████▋ | 26249/34278 [28:50:04<7:43:16, 3.46s/it] 77%|███████▋ | 26250/34278 [28:50:07<7:25:16, 3.33s/it] {'loss': 0.1214, 'grad_norm': 0.9553902479960511, 'learning_rate': 1.3707956527041294e-06, 'epoch': 0.77} 77%|███████▋ | 26250/34278 [28:50:07<7:25:16, 3.33s/it] 77%|███████▋ | 26251/34278 [28:50:10<7:17:46, 3.27s/it] {'loss': 0.1445, 'grad_norm': 0.9196799012445692, 'learning_rate': 1.37047069909175e-06, 'epoch': 0.77} 77%|███████▋ | 26251/34278 [28:50:10<7:17:46, 3.27s/it] 77%|███████▋ | 26252/34278 [28:50:13<7:01:18, 3.15s/it] {'loss': 0.1058, 'grad_norm': 0.7874001995227332, 'learning_rate': 1.3701457778829418e-06, 'epoch': 0.77} 77%|███████▋ | 26252/34278 [28:50:13<7:01:18, 3.15s/it] 77%|███████▋ | 26253/34278 [28:50:16<6:49:02, 3.06s/it] {'loss': 0.1093, 'grad_norm': 1.0459834083947193, 'learning_rate': 1.369820889080603e-06, 
'epoch': 0.77} 77%|███████▋ | 26253/34278 [28:50:16<6:49:02, 3.06s/it] 77%|███████▋ | 26254/34278 [28:50:19<6:46:25, 3.04s/it] {'loss': 0.1384, 'grad_norm': 0.938649616690978, 'learning_rate': 1.3694960326876393e-06, 'epoch': 0.77} 77%|███████▋ | 26254/34278 [28:50:19<6:46:25, 3.04s/it] 77%|███████▋ | 26255/34278 [28:50:22<6:57:15, 3.12s/it] {'loss': 0.1058, 'grad_norm': 0.6865196226329512, 'learning_rate': 1.3691712087069486e-06, 'epoch': 0.77} 77%|███████▋ | 26255/34278 [28:50:22<6:57:15, 3.12s/it] 77%|███████▋ | 26256/34278 [28:50:28<8:55:26, 4.00s/it] {'loss': 0.1487, 'grad_norm': 0.9685957610013003, 'learning_rate': 1.368846417141429e-06, 'epoch': 0.77} 77%|███████▋ | 26256/34278 [28:50:28<8:55:26, 4.00s/it] 77%|███████▋ | 26257/34278 [28:50:31<8:12:02, 3.68s/it] {'loss': 0.1156, 'grad_norm': 0.7855783301905297, 'learning_rate': 1.368521657993983e-06, 'epoch': 0.77} 77%|███████▋ | 26257/34278 [28:50:31<8:12:02, 3.68s/it] 77%|███████▋ | 26258/34278 [28:50:35<8:16:45, 3.72s/it] {'loss': 0.1107, 'grad_norm': 0.9477634181498675, 'learning_rate': 1.3681969312675092e-06, 'epoch': 0.77} 77%|███████▋ | 26258/34278 [28:50:35<8:16:45, 3.72s/it] 77%|███████▋ | 26259/34278 [28:50:38<7:40:25, 3.44s/it] {'loss': 0.1233, 'grad_norm': 1.1397961070466422, 'learning_rate': 1.3678722369649045e-06, 'epoch': 0.77} 77%|███████▋ | 26259/34278 [28:50:38<7:40:25, 3.44s/it] 77%|███████▋ | 26260/34278 [28:50:41<7:20:29, 3.30s/it] {'loss': 0.1173, 'grad_norm': 0.8802910992411307, 'learning_rate': 1.3675475750890693e-06, 'epoch': 0.77} 77%|███████▋ | 26260/34278 [28:50:41<7:20:29, 3.30s/it] 77%|███████▋ | 26261/34278 [28:50:44<7:32:38, 3.39s/it] {'loss': 0.1033, 'grad_norm': 0.6852959092629128, 'learning_rate': 1.3672229456429036e-06, 'epoch': 0.77} 77%|███████▋ | 26261/34278 [28:50:44<7:32:38, 3.39s/it] 77%|███████▋ | 26262/34278 [28:50:47<7:31:10, 3.38s/it] {'loss': 0.0959, 'grad_norm': 0.8839510340431758, 'learning_rate': 1.3668983486293047e-06, 'epoch': 0.77} 77%|███████▋ | 
26262/34278 [28:50:47<7:31:10, 3.38s/it] 77%|███████▋ | 26263/34278 [28:50:51<7:44:42, 3.48s/it] {'loss': 0.1088, 'grad_norm': 0.8572553122440686, 'learning_rate': 1.3665737840511684e-06, 'epoch': 0.77} 77%|███████▋ | 26263/34278 [28:50:51<7:44:42, 3.48s/it] 77%|███████▋ | 26264/34278 [28:50:57<9:20:32, 4.20s/it] {'loss': 0.1143, 'grad_norm': 0.7137228297578125, 'learning_rate': 1.3662492519113951e-06, 'epoch': 0.77} 77%|███████▋ | 26264/34278 [28:50:57<9:20:32, 4.20s/it] 77%|███████▋ | 26265/34278 [28:51:00<8:31:18, 3.83s/it] {'loss': 0.1464, 'grad_norm': 0.8696405135233204, 'learning_rate': 1.3659247522128798e-06, 'epoch': 0.77} 77%|███████▋ | 26265/34278 [28:51:00<8:31:18, 3.83s/it] 77%|███████▋ | 26266/34278 [28:51:03<8:06:07, 3.64s/it] {'loss': 0.1097, 'grad_norm': 0.7868935310991707, 'learning_rate': 1.365600284958522e-06, 'epoch': 0.77} 77%|███████▋ | 26266/34278 [28:51:03<8:06:07, 3.64s/it] 77%|███████▋ | 26267/34278 [28:51:07<8:03:31, 3.62s/it] {'loss': 0.1058, 'grad_norm': 0.8974860316812074, 'learning_rate': 1.3652758501512165e-06, 'epoch': 0.77} 77%|███████▋ | 26267/34278 [28:51:07<8:03:31, 3.62s/it] 77%|███████▋ | 26268/34278 [28:51:10<8:05:04, 3.63s/it] {'loss': 0.1056, 'grad_norm': 0.799519184422163, 'learning_rate': 1.3649514477938613e-06, 'epoch': 0.77} 77%|███████▋ | 26268/34278 [28:51:10<8:05:04, 3.63s/it] 77%|███████▋ | 26269/34278 [28:51:14<7:51:09, 3.53s/it] {'loss': 0.1151, 'grad_norm': 0.8261566520424768, 'learning_rate': 1.3646270778893523e-06, 'epoch': 0.77} 77%|███████▋ | 26269/34278 [28:51:14<7:51:09, 3.53s/it] 77%|███████▋ | 26270/34278 [28:51:16<7:12:37, 3.24s/it] {'loss': 0.1019, 'grad_norm': 0.8451196113916615, 'learning_rate': 1.364302740440583e-06, 'epoch': 0.77} 77%|███████▋ | 26270/34278 [28:51:16<7:12:37, 3.24s/it] 77%|███████▋ | 26271/34278 [28:51:19<7:07:37, 3.20s/it] {'loss': 0.0953, 'grad_norm': 0.727806267640842, 'learning_rate': 1.3639784354504509e-06, 'epoch': 0.77} 77%|███████▋ | 26271/34278 [28:51:19<7:07:37, 3.20s/it] 
77%|███████▋ | 26272/34278 [28:51:24<7:51:10, 3.53s/it] {'loss': 0.136, 'grad_norm': 0.8105129847942211, 'learning_rate': 1.3636541629218536e-06, 'epoch': 0.77} 77%|███████▋ | 26272/34278 [28:51:24<7:51:10, 3.53s/it] 77%|███████▋ | 26273/34278 [28:51:28<8:05:58, 3.64s/it] {'loss': 0.0933, 'grad_norm': 0.7534847321433793, 'learning_rate': 1.363329922857682e-06, 'epoch': 0.77} 77%|███████▋ | 26273/34278 [28:51:28<8:05:58, 3.64s/it] 77%|███████▋ | 26274/34278 [28:51:31<7:58:05, 3.58s/it] {'loss': 0.1095, 'grad_norm': 0.78367161884003, 'learning_rate': 1.3630057152608334e-06, 'epoch': 0.77} 77%|███████▋ | 26274/34278 [28:51:31<7:58:05, 3.58s/it] 77%|███████▋ | 26275/34278 [28:51:34<7:30:26, 3.38s/it] {'loss': 0.0926, 'grad_norm': 0.6653689714496968, 'learning_rate': 1.3626815401342025e-06, 'epoch': 0.77} 77%|███████▋ | 26275/34278 [28:51:34<7:30:26, 3.38s/it] 77%|███████▋ | 26276/34278 [28:51:38<7:47:51, 3.51s/it] {'loss': 0.1182, 'grad_norm': 0.7206363186672688, 'learning_rate': 1.3623573974806808e-06, 'epoch': 0.77} 77%|███████▋ | 26276/34278 [28:51:38<7:47:51, 3.51s/it] 77%|███████▋ | 26277/34278 [28:51:41<7:26:30, 3.35s/it] {'loss': 0.144, 'grad_norm': 0.8402632051718905, 'learning_rate': 1.3620332873031639e-06, 'epoch': 0.77} 77%|███████▋ | 26277/34278 [28:51:41<7:26:30, 3.35s/it] 77%|███████▋ | 26278/34278 [28:51:44<7:13:26, 3.25s/it] {'loss': 0.1109, 'grad_norm': 0.9351239698532204, 'learning_rate': 1.3617092096045466e-06, 'epoch': 0.77} 77%|███████▋ | 26278/34278 [28:51:44<7:13:26, 3.25s/it] 77%|███████▋ | 26279/34278 [28:51:48<7:38:55, 3.44s/it] {'loss': 0.12, 'grad_norm': 0.7641055166835903, 'learning_rate': 1.3613851643877206e-06, 'epoch': 0.77} 77%|███████▋ | 26279/34278 [28:51:48<7:38:55, 3.44s/it] 77%|███████▋ | 26280/34278 [28:51:51<7:17:09, 3.28s/it] {'loss': 0.1249, 'grad_norm': 0.8797418194600454, 'learning_rate': 1.361061151655579e-06, 'epoch': 0.77} 77%|███████▋ | 26280/34278 [28:51:51<7:17:09, 3.28s/it] 77%|███████▋ | 26281/34278 [28:51:56<8:54:55, 
4.01s/it] {'loss': 0.1423, 'grad_norm': 0.7831566990130667, 'learning_rate': 1.3607371714110151e-06, 'epoch': 0.77} 77%|███████▋ | 26281/34278 [28:51:56<8:54:55, 4.01s/it] 77%|███████▋ | 26282/34278 [28:51:59<8:07:02, 3.65s/it] {'loss': 0.1243, 'grad_norm': 0.9812283302496945, 'learning_rate': 1.3604132236569212e-06, 'epoch': 0.77} 77%|███████▋ | 26282/34278 [28:51:59<8:07:02, 3.65s/it] 77%|███████▋ | 26283/34278 [28:52:03<8:04:01, 3.63s/it] {'loss': 0.1083, 'grad_norm': 0.7172249047175969, 'learning_rate': 1.3600893083961864e-06, 'epoch': 0.77} 77%|███████▋ | 26283/34278 [28:52:03<8:04:01, 3.63s/it] 77%|███████▋ | 26284/34278 [28:52:05<7:30:05, 3.38s/it] {'loss': 0.0939, 'grad_norm': 0.8203349237324321, 'learning_rate': 1.3597654256317084e-06, 'epoch': 0.77} 77%|███████▋ | 26284/34278 [28:52:06<7:30:05, 3.38s/it] 77%|███████▋ | 26285/34278 [28:52:09<7:20:52, 3.31s/it] {'loss': 0.1094, 'grad_norm': 0.9417641017967823, 'learning_rate': 1.3594415753663754e-06, 'epoch': 0.77} 77%|███████▋ | 26285/34278 [28:52:09<7:20:52, 3.31s/it] 77%|███████▋ | 26286/34278 [28:52:12<7:12:40, 3.25s/it] {'loss': 0.1242, 'grad_norm': 0.8256592426687523, 'learning_rate': 1.359117757603078e-06, 'epoch': 0.77} 77%|███████▋ | 26286/34278 [28:52:12<7:12:40, 3.25s/it] 77%|███████▋ | 26287/34278 [28:52:15<7:01:22, 3.16s/it] {'loss': 0.1245, 'grad_norm': 0.9015058823509582, 'learning_rate': 1.3587939723447091e-06, 'epoch': 0.77} 77%|███████▋ | 26287/34278 [28:52:15<7:01:22, 3.16s/it] 77%|███████▋ | 26288/34278 [28:52:19<7:36:42, 3.43s/it] {'loss': 0.12, 'grad_norm': 1.2808031531899104, 'learning_rate': 1.3584702195941585e-06, 'epoch': 0.77} 77%|███████▋ | 26288/34278 [28:52:19<7:36:42, 3.43s/it] 77%|███████▋ | 26289/34278 [28:52:22<7:14:32, 3.26s/it] {'loss': 0.102, 'grad_norm': 0.8742758191790518, 'learning_rate': 1.3581464993543147e-06, 'epoch': 0.77} 77%|███████▋ | 26289/34278 [28:52:22<7:14:32, 3.26s/it] 77%|███████▋ | 26290/34278 [28:52:28<9:01:59, 4.07s/it] {'loss': 0.1385, 'grad_norm': 
0.7941010130215583, 'learning_rate': 1.35782281162807e-06, 'epoch': 0.77} 77%|███████▋ | 26290/34278 [28:52:28<9:01:59, 4.07s/it] 77%|███████▋ | 26291/34278 [28:52:34<10:18:43, 4.65s/it] {'loss': 0.1038, 'grad_norm': 0.750154611086974, 'learning_rate': 1.3574991564183155e-06, 'epoch': 0.77} 77%|███████▋ | 26291/34278 [28:52:34<10:18:43, 4.65s/it] 77%|███████▋ | 26292/34278 [28:52:37<9:35:33, 4.32s/it] {'loss': 0.1118, 'grad_norm': 0.998807815362782, 'learning_rate': 1.3571755337279386e-06, 'epoch': 0.77} 77%|███████▋ | 26292/34278 [28:52:37<9:35:33, 4.32s/it] 77%|███████▋ | 26293/34278 [28:52:40<8:51:14, 3.99s/it] {'loss': 0.1038, 'grad_norm': 1.2365964096570772, 'learning_rate': 1.356851943559827e-06, 'epoch': 0.77} 77%|███████▋ | 26293/34278 [28:52:40<8:51:14, 3.99s/it] 77%|███████▋ | 26294/34278 [28:52:43<8:13:29, 3.71s/it] {'loss': 0.1268, 'grad_norm': 1.3332685752500593, 'learning_rate': 1.3565283859168738e-06, 'epoch': 0.77} 77%|███████▋ | 26294/34278 [28:52:43<8:13:29, 3.71s/it] 77%|███████▋ | 26295/34278 [28:52:46<7:30:10, 3.38s/it] {'loss': 0.1321, 'grad_norm': 0.9024981148315767, 'learning_rate': 1.3562048608019635e-06, 'epoch': 0.77} 77%|███████▋ | 26295/34278 [28:52:46<7:30:10, 3.38s/it] 77%|███████▋ | 26296/34278 [28:52:49<7:22:44, 3.33s/it] {'loss': 0.1136, 'grad_norm': 0.9515302823841885, 'learning_rate': 1.3558813682179884e-06, 'epoch': 0.77} 77%|███████▋ | 26296/34278 [28:52:49<7:22:44, 3.33s/it] 77%|███████▋ | 26297/34278 [28:52:56<9:21:53, 4.22s/it] {'loss': 0.1029, 'grad_norm': 0.6359741660583533, 'learning_rate': 1.3555579081678321e-06, 'epoch': 0.77} 77%|███████▋ | 26297/34278 [28:52:56<9:21:53, 4.22s/it] 77%|███████▋ | 26298/34278 [28:52:59<8:33:50, 3.86s/it] {'loss': 0.0962, 'grad_norm': 0.6716431148909698, 'learning_rate': 1.355234480654387e-06, 'epoch': 0.77} 77%|███████▋ | 26298/34278 [28:52:59<8:33:50, 3.86s/it] 77%|███████▋ | 26299/34278 [28:53:01<7:48:43, 3.52s/it] {'loss': 0.1119, 'grad_norm': 1.0057342710694277, 'learning_rate': 
1.354911085680538e-06, 'epoch': 0.77} 77%|███████▋ | 26299/34278 [28:53:01<7:48:43, 3.52s/it] 77%|███████▋ | 26300/34278 [28:53:04<7:13:34, 3.26s/it] {'loss': 0.1082, 'grad_norm': 1.0221882901163448, 'learning_rate': 1.3545877232491716e-06, 'epoch': 0.77} 77%|███████▋ | 26300/34278 [28:53:04<7:13:34, 3.26s/it] 77%|███████▋ | 26301/34278 [28:53:08<7:43:41, 3.49s/it] {'loss': 0.1294, 'grad_norm': 0.9537886818232255, 'learning_rate': 1.3542643933631755e-06, 'epoch': 0.77} 77%|███████▋ | 26301/34278 [28:53:08<7:43:41, 3.49s/it] 77%|███████▋ | 26302/34278 [28:53:12<8:00:22, 3.61s/it] {'loss': 0.1269, 'grad_norm': 1.3430581809932332, 'learning_rate': 1.3539410960254384e-06, 'epoch': 0.77} 77%|███████▋ | 26302/34278 [28:53:12<8:00:22, 3.61s/it] 77%|███████▋ | 26303/34278 [28:53:15<7:33:30, 3.41s/it] {'loss': 0.1085, 'grad_norm': 0.8514420956156064, 'learning_rate': 1.3536178312388432e-06, 'epoch': 0.77} 77%|███████▋ | 26303/34278 [28:53:15<7:33:30, 3.41s/it] 77%|███████▋ | 26304/34278 [28:53:19<8:07:21, 3.67s/it] {'loss': 0.1242, 'grad_norm': 0.7326071125051908, 'learning_rate': 1.3532945990062784e-06, 'epoch': 0.77} 77%|███████▋ | 26304/34278 [28:53:19<8:07:21, 3.67s/it] 77%|███████▋ | 26305/34278 [28:53:24<8:51:57, 4.00s/it] {'loss': 0.1427, 'grad_norm': 0.8487698632811732, 'learning_rate': 1.35297139933063e-06, 'epoch': 0.77} 77%|███████▋ | 26305/34278 [28:53:24<8:51:57, 4.00s/it] 77%|███████▋ | 26306/34278 [28:53:27<8:33:20, 3.86s/it] {'loss': 0.1057, 'grad_norm': 1.1671383181527097, 'learning_rate': 1.3526482322147798e-06, 'epoch': 0.77} 77%|███████▋ | 26306/34278 [28:53:27<8:33:20, 3.86s/it] 77%|███████▋ | 26307/34278 [28:53:30<7:56:08, 3.58s/it] {'loss': 0.1143, 'grad_norm': 0.7276062023581822, 'learning_rate': 1.352325097661616e-06, 'epoch': 0.77} 77%|███████▋ | 26307/34278 [28:53:30<7:56:08, 3.58s/it] 77%|███████▋ | 26308/34278 [28:53:36<9:02:56, 4.09s/it] {'loss': 0.1058, 'grad_norm': 0.8089488802652228, 'learning_rate': 1.3520019956740244e-06, 'epoch': 0.77} 
77%|███████▋ | 26308/34278 [28:53:36<9:02:56, 4.09s/it] 77%|███████▋ | 26309/34278 [28:53:40<8:57:24, 4.05s/it] {'loss': 0.1184, 'grad_norm': 0.9404382782265481, 'learning_rate': 1.351678926254888e-06, 'epoch': 0.77} 77%|███████▋ | 26309/34278 [28:53:40<8:57:24, 4.05s/it] 77%|███████▋ | 26310/34278 [28:53:42<8:10:22, 3.69s/it] {'loss': 0.0923, 'grad_norm': 0.7773568824293435, 'learning_rate': 1.35135588940709e-06, 'epoch': 0.77} 77%|███████▋ | 26310/34278 [28:53:42<8:10:22, 3.69s/it] 77%|███████▋ | 26311/34278 [28:53:46<7:55:01, 3.58s/it] {'loss': 0.1063, 'grad_norm': 1.1283489551534065, 'learning_rate': 1.3510328851335164e-06, 'epoch': 0.77} 77%|███████▋ | 26311/34278 [28:53:46<7:55:01, 3.58s/it] 77%|███████▋ | 26312/34278 [28:53:50<8:17:08, 3.74s/it] {'loss': 0.1107, 'grad_norm': 0.9811542149546232, 'learning_rate': 1.3507099134370494e-06, 'epoch': 0.77} 77%|███████▋ | 26312/34278 [28:53:50<8:17:08, 3.74s/it] 77%|███████▋ | 26313/34278 [28:53:53<8:02:25, 3.63s/it] {'loss': 0.1099, 'grad_norm': 1.0223309748746012, 'learning_rate': 1.3503869743205727e-06, 'epoch': 0.77} 77%|███████▋ | 26313/34278 [28:53:53<8:02:25, 3.63s/it] 77%|███████▋ | 26314/34278 [28:53:58<8:41:43, 3.93s/it] {'loss': 0.0999, 'grad_norm': 0.7347629024668387, 'learning_rate': 1.3500640677869713e-06, 'epoch': 0.77} 77%|███████▋ | 26314/34278 [28:53:58<8:41:43, 3.93s/it] 77%|███████▋ | 26315/34278 [28:54:02<8:33:51, 3.87s/it] {'loss': 0.1092, 'grad_norm': 0.7341897061573558, 'learning_rate': 1.3497411938391276e-06, 'epoch': 0.77} 77%|███████▋ | 26315/34278 [28:54:02<8:33:51, 3.87s/it] 77%|███████▋ | 26316/34278 [28:54:05<8:22:27, 3.79s/it] {'loss': 0.1246, 'grad_norm': 0.851833901783838, 'learning_rate': 1.3494183524799204e-06, 'epoch': 0.77} 77%|███████▋ | 26316/34278 [28:54:05<8:22:27, 3.79s/it] 77%|███████▋ | 26317/34278 [28:54:10<9:15:01, 4.18s/it] {'loss': 0.1059, 'grad_norm': 0.8753818742823339, 'learning_rate': 1.3490955437122367e-06, 'epoch': 0.77} 77%|███████▋ | 26317/34278 
[28:54:10<9:15:01, 4.18s/it] 77%|███████▋ | 26318/34278 [28:54:14<8:38:43, 3.91s/it] {'loss': 0.1044, 'grad_norm': 0.8341477962960666, 'learning_rate': 1.348772767538955e-06, 'epoch': 0.77} 77%|███████▋ | 26318/34278 [28:54:14<8:38:43, 3.91s/it] 77%|███████▋ | 26319/34278 [28:54:17<8:05:38, 3.66s/it] {'loss': 0.111, 'grad_norm': 0.7149525553203409, 'learning_rate': 1.34845002396296e-06, 'epoch': 0.77} 77%|███████▋ | 26319/34278 [28:54:17<8:05:38, 3.66s/it] 77%|███████▋ | 26320/34278 [28:54:19<7:30:40, 3.40s/it] {'loss': 0.1126, 'grad_norm': 0.9113556991700084, 'learning_rate': 1.3481273129871297e-06, 'epoch': 0.77} 77%|███████▋ | 26320/34278 [28:54:19<7:30:40, 3.40s/it] 77%|███████▋ | 26321/34278 [28:54:22<7:13:29, 3.27s/it] {'loss': 0.1024, 'grad_norm': 0.7604835147354264, 'learning_rate': 1.3478046346143487e-06, 'epoch': 0.77} 77%|███████▋ | 26321/34278 [28:54:22<7:13:29, 3.27s/it] 77%|███████▋ | 26322/34278 [28:54:27<7:52:31, 3.56s/it] {'loss': 0.1329, 'grad_norm': 0.7818953082305324, 'learning_rate': 1.3474819888474955e-06, 'epoch': 0.77} 77%|███████▋ | 26322/34278 [28:54:27<7:52:31, 3.56s/it] 77%|███████▋ | 26323/34278 [28:54:32<9:21:27, 4.23s/it] {'loss': 0.1349, 'grad_norm': 0.7055300546011628, 'learning_rate': 1.3471593756894502e-06, 'epoch': 0.77} 77%|███████▋ | 26323/34278 [28:54:32<9:21:27, 4.23s/it] 77%|███████▋ | 26324/34278 [28:54:36<8:36:19, 3.89s/it] {'loss': 0.1202, 'grad_norm': 0.801859806573647, 'learning_rate': 1.3468367951430939e-06, 'epoch': 0.77} 77%|███████▋ | 26324/34278 [28:54:36<8:36:19, 3.89s/it] 77%|███████▋ | 26325/34278 [28:54:38<7:52:05, 3.56s/it] {'loss': 0.1149, 'grad_norm': 0.8591917215223992, 'learning_rate': 1.3465142472113085e-06, 'epoch': 0.77} 77%|███████▋ | 26325/34278 [28:54:38<7:52:05, 3.56s/it] 77%|███████▋ | 26326/34278 [28:54:42<7:49:00, 3.54s/it] {'loss': 0.1284, 'grad_norm': 0.7905620443673668, 'learning_rate': 1.3461917318969714e-06, 'epoch': 0.77} 77%|███████▋ | 26326/34278 [28:54:42<7:49:00, 3.54s/it] 77%|███████▋ 
| 26327/34278 [28:54:45<7:48:12, 3.53s/it] {'loss': 0.1161, 'grad_norm': 0.9012770736109739, 'learning_rate': 1.3458692492029608e-06, 'epoch': 0.77} 77%|███████▋ | 26327/34278 [28:54:45<7:48:12, 3.53s/it] 77%|███████▋ | 26328/34278 [28:54:48<7:11:35, 3.26s/it] {'loss': 0.093, 'grad_norm': 0.7719845505449965, 'learning_rate': 1.3455467991321586e-06, 'epoch': 0.77} 77%|███████▋ | 26328/34278 [28:54:48<7:11:35, 3.26s/it] 77%|███████▋ | 26329/34278 [28:54:51<7:18:05, 3.31s/it] {'loss': 0.1207, 'grad_norm': 0.8562658122996444, 'learning_rate': 1.3452243816874423e-06, 'epoch': 0.77} 77%|███████▋ | 26329/34278 [28:54:51<7:18:05, 3.31s/it] 77%|███████▋ | 26330/34278 [28:54:56<7:53:08, 3.57s/it] {'loss': 0.1085, 'grad_norm': 0.6791845106246308, 'learning_rate': 1.344901996871687e-06, 'epoch': 0.77} 77%|███████▋ | 26330/34278 [28:54:56<7:53:08, 3.57s/it] 77%|███████▋ | 26331/34278 [28:54:58<7:22:35, 3.34s/it] {'loss': 0.1092, 'grad_norm': 0.8296558353589102, 'learning_rate': 1.3445796446877773e-06, 'epoch': 0.77} 77%|███████▋ | 26331/34278 [28:54:58<7:22:35, 3.34s/it] 77%|███████▋ | 26332/34278 [28:55:02<7:18:56, 3.31s/it] {'loss': 0.1275, 'grad_norm': 1.1079128073038012, 'learning_rate': 1.3442573251385882e-06, 'epoch': 0.77} 77%|███████▋ | 26332/34278 [28:55:02<7:18:56, 3.31s/it] 77%|███████▋ | 26333/34278 [28:55:05<7:27:15, 3.38s/it] {'loss': 0.1136, 'grad_norm': 0.7430678761466323, 'learning_rate': 1.343935038226995e-06, 'epoch': 0.77} 77%|███████▋ | 26333/34278 [28:55:05<7:27:15, 3.38s/it] 77%|███████▋ | 26334/34278 [28:55:08<7:08:38, 3.24s/it] {'loss': 0.1432, 'grad_norm': 0.8437397346011964, 'learning_rate': 1.3436127839558788e-06, 'epoch': 0.77} 77%|███████▋ | 26334/34278 [28:55:08<7:08:38, 3.24s/it] 77%|███████▋ | 26335/34278 [28:55:11<7:04:36, 3.21s/it] {'loss': 0.121, 'grad_norm': 0.8250038742054424, 'learning_rate': 1.3432905623281151e-06, 'epoch': 0.77} 77%|███████▋ | 26335/34278 [28:55:11<7:04:36, 3.21s/it] 77%|███████▋ | 26336/34278 [28:55:16<8:21:59, 
3.79s/it] {'loss': 0.1114, 'grad_norm': 0.8556139117128854, 'learning_rate': 1.3429683733465782e-06, 'epoch': 0.77} 77%|███████▋ | 26336/34278 [28:55:16<8:21:59, 3.79s/it] 77%|███████▋ | 26337/34278 [28:55:19<7:48:19, 3.54s/it] {'loss': 0.1042, 'grad_norm': 0.8087185715853028, 'learning_rate': 1.3426462170141475e-06, 'epoch': 0.77} 77%|███████▋ | 26337/34278 [28:55:19<7:48:19, 3.54s/it] 77%|███████▋ | 26338/34278 [28:55:22<7:21:04, 3.33s/it] {'loss': 0.1096, 'grad_norm': 0.7425615248636965, 'learning_rate': 1.3423240933336989e-06, 'epoch': 0.77} 77%|███████▋ | 26338/34278 [28:55:22<7:21:04, 3.33s/it] 77%|███████▋ | 26339/34278 [28:55:25<7:09:17, 3.24s/it] {'loss': 0.1001, 'grad_norm': 0.9583128033071, 'learning_rate': 1.3420020023081081e-06, 'epoch': 0.77} 77%|███████▋ | 26339/34278 [28:55:25<7:09:17, 3.24s/it] 77%|███████▋ | 26340/34278 [28:55:31<8:56:41, 4.06s/it] {'loss': 0.1213, 'grad_norm': 0.8593452010260961, 'learning_rate': 1.3416799439402483e-06, 'epoch': 0.77} 77%|███████▋ | 26340/34278 [28:55:31<8:56:41, 4.06s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f30ff6f7010>
Failed to fetch sample 3074343. Exception: cannot identify image file <_io.BytesIO object at 0x7f30ff6f7010>
77%|███████▋ | 26341/34278 [28:55:37<10:20:38, 4.69s/it] {'loss': 0.1257, 'grad_norm': 0.8879522014577076, 'learning_rate': 1.3413579182329989e-06, 'epoch': 0.77} 77%|███████▋ | 26341/34278 [28:55:37<10:20:38, 4.69s/it] 77%|███████▋ | 26342/34278 [28:55:44<11:28:17, 5.20s/it] {'loss': 0.1172, 'grad_norm': 0.7333457531120627, 'learning_rate': 1.3410359251892307e-06, 'epoch': 0.77} 77%|███████▋ | 26342/34278 [28:55:44<11:28:17, 5.20s/it] 77%|███████▋ | 26343/34278 [28:55:47<9:52:05, 4.48s/it] {'loss': 0.1241, 'grad_norm': 1.0015957078579298, 'learning_rate': 1.34071396481182e-06, 'epoch': 0.77} 77%|███████▋ | 26343/34278 [28:55:47<9:52:05, 4.48s/it] 77%|███████▋ | 26344/34278 [28:55:51<10:01:28, 4.55s/it] {'loss': 0.1337, 'grad_norm': 0.8076648583421359, 'learning_rate': 1.3403920371036433e-06, 'epoch': 0.77} 77%|███████▋ | 26344/34278 [28:55:51<10:01:28, 4.55s/it] 77%|███████▋ | 26345/34278 [28:55:56<10:13:50, 4.64s/it] {'loss': 0.1345, 'grad_norm': 0.9656849619903707, 'learning_rate': 1.3400701420675727e-06, 'epoch': 0.77} 77%|███████▋ | 26345/34278 [28:55:56<10:13:50, 4.64s/it] 77%|███████▋ | 26346/34278 [28:55:59<9:24:58, 4.27s/it] {'loss': 0.1108, 'grad_norm': 0.8535953469313441, 'learning_rate': 1.339748279706481e-06, 'epoch': 0.77} 77%|███████▋ | 26346/34278 [28:56:00<9:24:58, 4.27s/it] 77%|███████▋ | 26347/34278 [28:56:03<8:39:58, 3.93s/it] {'loss': 0.1257, 'grad_norm': 0.7917522328083967, 'learning_rate': 1.339426450023244e-06, 'epoch': 0.77} 77%|███████▋ | 
26347/34278 [28:56:03<8:39:58, 3.93s/it] 77%|███████▋ | 26348/34278 [28:56:07<8:38:52, 3.93s/it] {'loss': 0.1197, 'grad_norm': 0.6787829623433255, 'learning_rate': 1.3391046530207325e-06, 'epoch': 0.77} 77%|███████▋ | 26348/34278 [28:56:07<8:38:52, 3.93s/it] 77%|███████▋ | 26349/34278 [28:56:10<8:31:12, 3.87s/it] {'loss': 0.0988, 'grad_norm': 0.8878994011466506, 'learning_rate': 1.3387828887018222e-06, 'epoch': 0.77} 77%|███████▋ | 26349/34278 [28:56:10<8:31:12, 3.87s/it] 77%|███████▋ | 26350/34278 [28:56:13<7:44:13, 3.51s/it] {'loss': 0.1173, 'grad_norm': 0.9161149240701941, 'learning_rate': 1.3384611570693828e-06, 'epoch': 0.77} 77%|███████▋ | 26350/34278 [28:56:13<7:44:13, 3.51s/it] 77%|███████▋ | 26351/34278 [28:56:16<7:23:57, 3.36s/it] {'loss': 0.1169, 'grad_norm': 0.7338299930776192, 'learning_rate': 1.3381394581262896e-06, 'epoch': 0.77} 77%|███████▋ | 26351/34278 [28:56:16<7:23:57, 3.36s/it] 77%|███████▋ | 26352/34278 [28:56:19<7:11:08, 3.26s/it] {'loss': 0.1167, 'grad_norm': 0.8177897018433256, 'learning_rate': 1.3378177918754132e-06, 'epoch': 0.77} 77%|███████▋ | 26352/34278 [28:56:19<7:11:08, 3.26s/it] 77%|███████▋ | 26353/34278 [28:56:22<7:02:28, 3.20s/it] {'loss': 0.0966, 'grad_norm': 0.7839406401691176, 'learning_rate': 1.3374961583196238e-06, 'epoch': 0.77} 77%|███████▋ | 26353/34278 [28:56:22<7:02:28, 3.20s/it] 77%|███████▋ | 26354/34278 [28:56:25<7:06:31, 3.23s/it] {'loss': 0.1188, 'grad_norm': 0.790638859628819, 'learning_rate': 1.3371745574617945e-06, 'epoch': 0.77} 77%|███████▋ | 26354/34278 [28:56:25<7:06:31, 3.23s/it] 77%|███████▋ | 26355/34278 [28:56:31<8:54:07, 4.04s/it] {'loss': 0.1167, 'grad_norm': 0.9705534971863546, 'learning_rate': 1.3368529893047977e-06, 'epoch': 0.77} 77%|███████▋ | 26355/34278 [28:56:31<8:54:07, 4.04s/it] 77%|███████▋ | 26356/34278 [28:56:35<8:28:53, 3.85s/it] {'loss': 0.0999, 'grad_norm': 1.0390544757812006, 'learning_rate': 1.3365314538515028e-06, 'epoch': 0.77} 77%|███████▋ | 26356/34278 [28:56:35<8:28:53, 
3.85s/it] 77%|███████▋ | 26357/34278 [28:56:37<7:41:51, 3.50s/it] {'loss': 0.1177, 'grad_norm': 0.945480891191583, 'learning_rate': 1.3362099511047793e-06, 'epoch': 0.77} 77%|███████▋ | 26357/34278 [28:56:37<7:41:51, 3.50s/it] 77%|███████▋ | 26358/34278 [28:56:41<7:35:02, 3.45s/it] {'loss': 0.1454, 'grad_norm': 0.8838098435908222, 'learning_rate': 1.3358884810675005e-06, 'epoch': 0.77} 77%|███████▋ | 26358/34278 [28:56:41<7:35:02, 3.45s/it] 77%|███████▋ | 26359/34278 [28:56:44<7:44:11, 3.52s/it] {'loss': 0.1254, 'grad_norm': 0.851736956955131, 'learning_rate': 1.3355670437425344e-06, 'epoch': 0.77} 77%|███████▋ | 26359/34278 [28:56:44<7:44:11, 3.52s/it] 77%|███████▋ | 26360/34278 [28:56:48<7:41:47, 3.50s/it] {'loss': 0.0926, 'grad_norm': 0.7620084892561015, 'learning_rate': 1.3352456391327479e-06, 'epoch': 0.77} 77%|███████▋ | 26360/34278 [28:56:48<7:41:47, 3.50s/it] 77%|███████▋ | 26361/34278 [28:56:53<8:59:51, 4.09s/it] {'loss': 0.1037, 'grad_norm': 0.7596378487858548, 'learning_rate': 1.3349242672410162e-06, 'epoch': 0.77} 77%|███████▋ | 26361/34278 [28:56:53<8:59:51, 4.09s/it] 77%|███████▋ | 26362/34278 [28:56:56<8:11:54, 3.73s/it] {'loss': 0.1191, 'grad_norm': 0.7407636176932145, 'learning_rate': 1.334602928070206e-06, 'epoch': 0.77} 77%|███████▋ | 26362/34278 [28:56:56<8:11:54, 3.73s/it] 77%|███████▋ | 26363/34278 [28:57:00<8:04:04, 3.67s/it] {'loss': 0.1341, 'grad_norm': 0.8387156139437207, 'learning_rate': 1.3342816216231846e-06, 'epoch': 0.77} 77%|███████▋ | 26363/34278 [28:57:00<8:04:04, 3.67s/it] 77%|███████▋ | 26364/34278 [28:57:06<9:33:19, 4.35s/it] {'loss': 0.0995, 'grad_norm': 0.9424023168724408, 'learning_rate': 1.3339603479028229e-06, 'epoch': 0.77} 77%|███████▋ | 26364/34278 [28:57:06<9:33:19, 4.35s/it] 77%|███████▋ | 26365/34278 [28:57:10<9:21:40, 4.26s/it] {'loss': 0.1123, 'grad_norm': 0.8413175728940638, 'learning_rate': 1.333639106911988e-06, 'epoch': 0.77} 77%|███████▋ | 26365/34278 [28:57:10<9:21:40, 4.26s/it] 77%|███████▋ | 26366/34278 
[28:57:13<8:33:09, 3.89s/it] {'loss': 0.114, 'grad_norm': 0.7428676524295936, 'learning_rate': 1.3333178986535466e-06, 'epoch': 0.77} 77%|███████▋ | 26366/34278 [28:57:13<8:33:09, 3.89s/it] 77%|███████▋ | 26367/34278 [28:57:16<8:05:35, 3.68s/it] {'loss': 0.101, 'grad_norm': 0.8633340480866784, 'learning_rate': 1.3329967231303682e-06, 'epoch': 0.77} 77%|███████▋ | 26367/34278 [28:57:16<8:05:35, 3.68s/it] 77%|███████▋ | 26368/34278 [28:57:19<7:37:47, 3.47s/it] {'loss': 0.1074, 'grad_norm': 1.2447757965667121, 'learning_rate': 1.3326755803453206e-06, 'epoch': 0.77} 77%|███████▋ | 26368/34278 [28:57:19<7:37:47, 3.47s/it] 77%|███████▋ | 26369/34278 [28:57:22<7:19:55, 3.34s/it] {'loss': 0.1117, 'grad_norm': 1.0479824407084282, 'learning_rate': 1.3323544703012697e-06, 'epoch': 0.77} 77%|███████▋ | 26369/34278 [28:57:22<7:19:55, 3.34s/it] 77%|███████▋ | 26370/34278 [28:57:26<7:39:57, 3.49s/it] {'loss': 0.1121, 'grad_norm': 0.7147793449500534, 'learning_rate': 1.3320333930010815e-06, 'epoch': 0.77} 77%|███████▋ | 26370/34278 [28:57:26<7:39:57, 3.49s/it] 77%|███████▋ | 26371/34278 [28:57:30<8:00:57, 3.65s/it] {'loss': 0.1103, 'grad_norm': 0.7578252238900758, 'learning_rate': 1.3317123484476251e-06, 'epoch': 0.77} 77%|███████▋ | 26371/34278 [28:57:30<8:00:57, 3.65s/it] 77%|███████▋ | 26372/34278 [28:57:36<9:37:35, 4.38s/it] {'loss': 0.1074, 'grad_norm': 0.83767739554361, 'learning_rate': 1.3313913366437637e-06, 'epoch': 0.77} 77%|███████▋ | 26372/34278 [28:57:36<9:37:35, 4.38s/it] 77%|███████▋ | 26373/34278 [28:57:39<8:55:17, 4.06s/it] {'loss': 0.1188, 'grad_norm': 0.9595939537888586, 'learning_rate': 1.331070357592364e-06, 'epoch': 0.77} 77%|███████▋ | 26373/34278 [28:57:39<8:55:17, 4.06s/it] 77%|███████▋ | 26374/34278 [28:57:43<8:35:10, 3.91s/it] {'loss': 0.1145, 'grad_norm': 0.9503184644901421, 'learning_rate': 1.3307494112962943e-06, 'epoch': 0.77} 77%|███████▋ | 26374/34278 [28:57:43<8:35:10, 3.91s/it] 77%|███████▋ | 26375/34278 [28:57:46<8:10:09, 3.72s/it] {'loss': 
0.0997, 'grad_norm': 0.6936057562434776, 'learning_rate': 1.3304284977584182e-06, 'epoch': 0.77} 77%|███████▋ | 26375/34278 [28:57:46<8:10:09, 3.72s/it] 77%|███████▋ | 26376/34278 [28:57:49<7:42:25, 3.51s/it] {'loss': 0.1052, 'grad_norm': 0.7660194227454243, 'learning_rate': 1.3301076169815986e-06, 'epoch': 0.77} 77%|███████▋ | 26376/34278 [28:57:49<7:42:25, 3.51s/it] 77%|███████▋ | 26377/34278 [28:57:52<7:38:11, 3.48s/it] {'loss': 0.12, 'grad_norm': 0.860131712849057, 'learning_rate': 1.3297867689687038e-06, 'epoch': 0.77} 77%|███████▋ | 26377/34278 [28:57:52<7:38:11, 3.48s/it] 77%|███████▋ | 26378/34278 [28:57:55<7:14:24, 3.30s/it] {'loss': 0.1241, 'grad_norm': 0.8692900744563608, 'learning_rate': 1.3294659537225951e-06, 'epoch': 0.77} 77%|███████▋ | 26378/34278 [28:57:55<7:14:24, 3.30s/it] 77%|███████▋ | 26379/34278 [28:58:00<7:54:06, 3.60s/it] {'loss': 0.0993, 'grad_norm': 0.8160703074895804, 'learning_rate': 1.3291451712461395e-06, 'epoch': 0.77} 77%|███████▋ | 26379/34278 [28:58:00<7:54:06, 3.60s/it] 77%|███████▋ | 26380/34278 [28:58:04<8:03:22, 3.67s/it] {'loss': 0.1149, 'grad_norm': 0.9095841995953802, 'learning_rate': 1.3288244215421981e-06, 'epoch': 0.77} 77%|███████▋ | 26380/34278 [28:58:04<8:03:22, 3.67s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8ade8968e0>
Failed to fetch sample 3641699. Exception: cannot identify image file <_io.BytesIO object at 0x7f8ade8968e0>
77%|███████▋ | 26381/34278 [28:58:06<7:28:58, 3.41s/it] {'loss': 0.1204, 'grad_norm': 0.8708794576204514, 'learning_rate': 1.3285037046136372e-06, 'epoch': 0.77} 77%|███████▋ | 26381/34278 [28:58:06<7:28:58, 3.41s/it] 77%|███████▋ | 26382/34278 [28:58:09<7:03:23, 3.22s/it] {'loss': 0.1166, 'grad_norm': 0.9675974965172188, 'learning_rate': 1.3281830204633188e-06, 'epoch': 0.77} 77%|███████▋ | 26382/34278 [28:58:09<7:03:23, 3.22s/it] 77%|███████▋ | 26383/34278 [28:58:13<7:25:41, 3.39s/it] {'loss': 0.1284, 'grad_norm': 0.8515714097596896, 'learning_rate': 1.3278623690941045e-06, 'epoch': 0.77} 77%|███████▋ | 26383/34278 [28:58:13<7:25:41, 3.39s/it] 77%|███████▋ | 26384/34278 [28:58:16<7:01:24, 3.20s/it] {'loss': 0.1157, 'grad_norm': 0.8043822820085537, 'learning_rate': 1.3275417505088585e-06, 'epoch': 0.77} 77%|███████▋ | 26384/34278 [28:58:16<7:01:24, 3.20s/it] 77%|███████▋ | 26385/34278 [28:58:22<8:56:19, 4.08s/it] {'loss': 0.1225, 'grad_norm': 0.955162189308029, 'learning_rate': 1.3272211647104443e-06, 'epoch': 0.77} 77%|███████▋ | 26385/34278 [28:58:22<8:56:19, 4.08s/it] 77%|███████▋ | 26386/34278 [28:58:27<9:59:06, 4.55s/it] {'loss': 0.1048, 'grad_norm': 1.0431847591400947, 'learning_rate': 1.3269006117017231e-06, 'epoch': 0.77} 77%|███████▋ | 26386/34278 [28:58:27<9:59:06, 4.55s/it] 
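Note on the traceback above: it ends with "Failed to fetch sample 3641699" yet the progress bar keeps advancing, which indicates the dataset's `__getitem__` catches the decode error and serves a different sample instead of aborting the 28-hour run. A minimal stand-alone sketch of that recovery pattern, with hypothetical names (`RetryingDataset`, the `None` stand-in for an undecodable image) that are an assumption, not the actual aguvis `dataset.py` code:

```python
class RetryingDataset:
    """Sketch of a map-style dataset that survives corrupt samples.

    Assumed behavior, inferred from the log: on a decode failure it logs
    'Failed to fetch sample N' and falls back to another index rather
    than raising and killing the training run.
    """

    def __init__(self, samples, max_retries=10):
        self.samples = samples
        self.max_retries = max_retries

    def _get_item(self, i):
        value = self.samples[i]
        if value is None:  # stand-in for bytes PIL cannot identify
            raise ValueError(f"cannot identify image file at index {i}")
        return value

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                # Mirror the log line, then fall back to a neighboring index.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)
        raise RuntimeError("too many undecodable samples in a row")


ds = RetryingDataset(["img_a", None, "img_c"])
print(ds[0])  # -> img_a
print(ds[1])  # index 1 is corrupt; falls back and returns img_c
```

In the real code the fallback index is often drawn at random over the dataset; a deterministic neighbor is used here only so the sketch's output is reproducible.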
77%|███████▋ | 26387/34278 [28:58:32<9:57:02, 4.54s/it] {'loss': 0.1118, 'grad_norm': 0.7443241082161194, 'learning_rate': 1.3265800914855542e-06, 'epoch': 0.77} 77%|███████▋ | 26387/34278 [28:58:32<9:57:02, 4.54s/it] 77%|███████▋ | 26388/34278 [28:58:35<8:58:50, 4.10s/it] {'loss': 0.1057, 'grad_norm': 0.8425779974112467, 'learning_rate': 1.3262596040648034e-06, 'epoch': 0.77} 77%|███████▋ | 26388/34278 [28:58:35<8:58:50, 4.10s/it] 77%|███████▋ | 26389/34278 [28:58:39<9:07:00, 4.16s/it] {'loss': 0.1054, 'grad_norm': 0.768087129628987, 'learning_rate': 1.3259391494423296e-06, 'epoch': 0.77} 77%|███████▋ | 26389/34278 [28:58:39<9:07:00, 4.16s/it] 77%|███████▋ | 26390/34278 [28:58:44<9:23:00, 4.28s/it] {'loss': 0.1188, 'grad_norm': 1.061329548066553, 'learning_rate': 1.3256187276209913e-06, 'epoch': 0.77} 77%|███████▋ | 26390/34278 [28:58:44<9:23:00, 4.28s/it] 77%|███████▋ | 26391/34278 [28:58:47<8:32:29, 3.90s/it] {'loss': 0.1119, 'grad_norm': 0.900892773663434, 'learning_rate': 1.325298338603655e-06, 'epoch': 0.77} 77%|███████▋ | 26391/34278 [28:58:47<8:32:29, 3.90s/it] 77%|███████▋ | 26392/34278 [28:58:50<8:16:38, 3.78s/it] {'loss': 0.1044, 'grad_norm': 1.0821439260157737, 'learning_rate': 1.3249779823931774e-06, 'epoch': 0.77} 77%|███████▋ | 26392/34278 [28:58:50<8:16:38, 3.78s/it] 77%|███████▋ | 26393/34278 [28:58:53<7:45:22, 3.54s/it] {'loss': 0.0815, 'grad_norm': 0.8665013694211476, 'learning_rate': 1.3246576589924176e-06, 'epoch': 0.77} 77%|███████▋ | 26393/34278 [28:58:53<7:45:22, 3.54s/it] 77%|███████▋ | 26394/34278 [28:58:57<7:30:17, 3.43s/it] {'loss': 0.1176, 'grad_norm': 1.037289665752422, 'learning_rate': 1.3243373684042388e-06, 'epoch': 0.77} 77%|███████▋ | 26394/34278 [28:58:57<7:30:17, 3.43s/it] 77%|███████▋ | 26395/34278 [28:59:03<9:16:17, 4.23s/it] {'loss': 0.137, 'grad_norm': 1.037716672470052, 'learning_rate': 1.324017110631498e-06, 'epoch': 0.77} 77%|███████▋ | 26395/34278 [28:59:03<9:16:17, 4.23s/it] 77%|███████▋ | 26396/34278 [28:59:06<8:31:58, 
3.90s/it] {'loss': 0.1345, 'grad_norm': 1.211706994811601, 'learning_rate': 1.3236968856770537e-06, 'epoch': 0.77} 77%|███████▋ | 26396/34278 [28:59:06<8:31:58, 3.90s/it] 77%|███████▋ | 26397/34278 [28:59:09<7:56:44, 3.63s/it] {'loss': 0.1334, 'grad_norm': 1.0750149620536222, 'learning_rate': 1.3233766935437665e-06, 'epoch': 0.77} 77%|███████▋ | 26397/34278 [28:59:09<7:56:44, 3.63s/it] 77%|███████▋ | 26398/34278 [28:59:11<7:20:18, 3.35s/it] {'loss': 0.1093, 'grad_norm': 0.8909687763686794, 'learning_rate': 1.3230565342344953e-06, 'epoch': 0.77} 77%|███████▋ | 26398/34278 [28:59:11<7:20:18, 3.35s/it] 77%|███████▋ | 26399/34278 [28:59:15<7:19:15, 3.35s/it] {'loss': 0.1208, 'grad_norm': 0.6824404088662096, 'learning_rate': 1.3227364077520976e-06, 'epoch': 0.77} 77%|███████▋ | 26399/34278 [28:59:15<7:19:15, 3.35s/it] 77%|███████▋ | 26400/34278 [28:59:18<7:02:56, 3.22s/it] {'loss': 0.1224, 'grad_norm': 0.8293847946531514, 'learning_rate': 1.3224163140994302e-06, 'epoch': 0.77} 77%|███████▋ | 26400/34278 [28:59:18<7:02:56, 3.22s/it] 77%|███████▋ | 26401/34278 [28:59:23<8:27:21, 3.86s/it] {'loss': 0.1119, 'grad_norm': 0.871079034104856, 'learning_rate': 1.3220962532793535e-06, 'epoch': 0.77} 77%|███████▋ | 26401/34278 [28:59:23<8:27:21, 3.86s/it] 77%|███████▋ | 26402/34278 [28:59:26<8:04:10, 3.69s/it] {'loss': 0.1438, 'grad_norm': 0.9644347823087496, 'learning_rate': 1.321776225294722e-06, 'epoch': 0.77} 77%|███████▋ | 26402/34278 [28:59:26<8:04:10, 3.69s/it] 77%|███████▋ | 26403/34278 [28:59:30<7:48:32, 3.57s/it] {'loss': 0.1059, 'grad_norm': 1.1372883534878597, 'learning_rate': 1.321456230148394e-06, 'epoch': 0.77} 77%|███████▋ | 26403/34278 [28:59:30<7:48:32, 3.57s/it] 77%|███████▋ | 26404/34278 [28:59:33<7:27:02, 3.41s/it] {'loss': 0.123, 'grad_norm': 0.7689123155392992, 'learning_rate': 1.3211362678432282e-06, 'epoch': 0.77} 77%|███████▋ | 26404/34278 [28:59:33<7:27:02, 3.41s/it] 77%|███████▋ | 26405/34278 [28:59:37<8:16:26, 3.78s/it] {'loss': 0.1149, 'grad_norm': 
0.7275816622209508, 'learning_rate': 1.32081633838208e-06, 'epoch': 0.77} 77%|███████▋ | 26405/34278 [28:59:37<8:16:26, 3.78s/it] 77%|███████▋ | 26406/34278 [28:59:41<8:00:21, 3.66s/it] {'loss': 0.1349, 'grad_norm': 0.7988063990793386, 'learning_rate': 1.3204964417678034e-06, 'epoch': 0.77} 77%|███████▋ | 26406/34278 [28:59:41<8:00:21, 3.66s/it] 77%|███████▋ | 26407/34278 [28:59:44<7:32:39, 3.45s/it] {'loss': 0.1164, 'grad_norm': 1.000820402644795, 'learning_rate': 1.3201765780032577e-06, 'epoch': 0.77} 77%|███████▋ | 26407/34278 [28:59:44<7:32:39, 3.45s/it] 77%|███████▋ | 26408/34278 [28:59:50<9:08:23, 4.18s/it] {'loss': 0.1155, 'grad_norm': 0.8542802866515816, 'learning_rate': 1.3198567470912955e-06, 'epoch': 0.77} 77%|███████▋ | 26408/34278 [28:59:50<9:08:23, 4.18s/it] 77%|███████▋ | 26409/34278 [28:59:53<8:33:44, 3.92s/it] {'loss': 0.1033, 'grad_norm': 0.8719979593231254, 'learning_rate': 1.3195369490347753e-06, 'epoch': 0.77} 77%|███████▋ | 26409/34278 [28:59:53<8:33:44, 3.92s/it] 77%|███████▋ | 26410/34278 [28:59:59<9:58:42, 4.57s/it] {'loss': 0.1246, 'grad_norm': 1.1192189668562742, 'learning_rate': 1.3192171838365492e-06, 'epoch': 0.77} 77%|███████▋ | 26410/34278 [28:59:59<9:58:42, 4.57s/it] 77%|███████▋ | 26411/34278 [29:00:03<9:26:00, 4.32s/it] {'loss': 0.0945, 'grad_norm': 0.683453341444437, 'learning_rate': 1.3188974514994752e-06, 'epoch': 0.77} 77%|███████▋ | 26411/34278 [29:00:03<9:26:00, 4.32s/it] 77%|███████▋ | 26412/34278 [29:00:07<9:08:17, 4.18s/it] {'loss': 0.1087, 'grad_norm': 1.1538560322257447, 'learning_rate': 1.3185777520264053e-06, 'epoch': 0.77} 77%|███████▋ | 26412/34278 [29:00:07<9:08:17, 4.18s/it] 77%|███████▋ | 26413/34278 [29:00:10<8:21:04, 3.82s/it] {'loss': 0.124, 'grad_norm': 1.123113533059127, 'learning_rate': 1.3182580854201938e-06, 'epoch': 0.77} 77%|███████▋ | 26413/34278 [29:00:10<8:21:04, 3.82s/it] 77%|███████▋ | 26414/34278 [29:00:13<7:54:22, 3.62s/it] {'loss': 0.1192, 'grad_norm': 0.9604330296704243, 'learning_rate': 
1.3179384516836947e-06, 'epoch': 0.77} 77%|███████▋ | 26414/34278 [29:00:13<7:54:22, 3.62s/it] 77%|███████▋ | 26415/34278 [29:00:18<9:09:16, 4.19s/it] {'loss': 0.133, 'grad_norm': 0.9035893269277159, 'learning_rate': 1.3176188508197634e-06, 'epoch': 0.77} 77%|███████▋ | 26415/34278 [29:00:18<9:09:16, 4.19s/it] 77%|███████▋ | 26416/34278 [29:00:21<8:25:15, 3.86s/it] {'loss': 0.1317, 'grad_norm': 0.85900840253525, 'learning_rate': 1.3172992828312519e-06, 'epoch': 0.77} 77%|███████▋ | 26416/34278 [29:00:21<8:25:15, 3.86s/it] 77%|███████▋ | 26417/34278 [29:00:24<7:37:22, 3.49s/it] {'loss': 0.1144, 'grad_norm': 1.1163044663463344, 'learning_rate': 1.3169797477210122e-06, 'epoch': 0.77} 77%|███████▋ | 26417/34278 [29:00:24<7:37:22, 3.49s/it] 77%|███████▋ | 26418/34278 [29:00:27<7:08:22, 3.27s/it] {'loss': 0.0885, 'grad_norm': 1.1446577803877949, 'learning_rate': 1.3166602454918997e-06, 'epoch': 0.77} 77%|███████▋ | 26418/34278 [29:00:27<7:08:22, 3.27s/it] 77%|███████▋ | 26419/34278 [29:00:30<6:55:17, 3.17s/it] {'loss': 0.1162, 'grad_norm': 0.7461391709802458, 'learning_rate': 1.316340776146765e-06, 'epoch': 0.77} 77%|███████▋ | 26419/34278 [29:00:30<6:55:17, 3.17s/it] 77%|███████▋ | 26420/34278 [29:00:33<7:07:44, 3.27s/it] {'loss': 0.0949, 'grad_norm': 0.8121190406704041, 'learning_rate': 1.3160213396884576e-06, 'epoch': 0.77} 77%|███████▋ | 26420/34278 [29:00:33<7:07:44, 3.27s/it] 77%|███████▋ | 26421/34278 [29:00:37<7:18:03, 3.35s/it] {'loss': 0.1323, 'grad_norm': 1.1227051978681637, 'learning_rate': 1.3157019361198348e-06, 'epoch': 0.77} 77%|███████▋ | 26421/34278 [29:00:37<7:18:03, 3.35s/it] 77%|███████▋ | 26422/34278 [29:00:40<7:08:05, 3.27s/it] {'loss': 0.0977, 'grad_norm': 1.3014020476072332, 'learning_rate': 1.3153825654437458e-06, 'epoch': 0.77} 77%|███████▋ | 26422/34278 [29:00:40<7:08:05, 3.27s/it] 77%|███████▋ | 26423/34278 [29:00:43<7:12:51, 3.31s/it] {'loss': 0.1235, 'grad_norm': 1.3137095697635548, 'learning_rate': 1.3150632276630405e-06, 'epoch': 0.77} 
77%|███████▋ | 26423/34278 [29:00:43<7:12:51, 3.31s/it] 77%|███████▋ | 26424/34278 [29:00:49<9:09:12, 4.20s/it] {'loss': 0.1578, 'grad_norm': 0.9783099395143354, 'learning_rate': 1.3147439227805726e-06, 'epoch': 0.77} 77%|███████▋ | 26424/34278 [29:00:49<9:09:12, 4.20s/it] 77%|███████▋ | 26425/34278 [29:00:53<8:51:21, 4.06s/it] {'loss': 0.1119, 'grad_norm': 1.6056538931094753, 'learning_rate': 1.314424650799191e-06, 'epoch': 0.77} 77%|███████▋ | 26425/34278 [29:00:53<8:51:21, 4.06s/it] 77%|███████▋ | 26426/34278 [29:00:57<8:38:53, 3.97s/it] {'loss': 0.1271, 'grad_norm': 0.8491353982017323, 'learning_rate': 1.3141054117217444e-06, 'epoch': 0.77} 77%|███████▋ | 26426/34278 [29:00:57<8:38:53, 3.97s/it] 77%|███████▋ | 26427/34278 [29:01:03<10:04:31, 4.62s/it] {'loss': 0.1122, 'grad_norm': 1.2452005449943448, 'learning_rate': 1.3137862055510852e-06, 'epoch': 0.77} 77%|███████▋ | 26427/34278 [29:01:03<10:04:31, 4.62s/it] 77%|███████▋ | 26428/34278 [29:01:06<8:55:11, 4.09s/it] {'loss': 0.0943, 'grad_norm': 0.9635875427143986, 'learning_rate': 1.3134670322900644e-06, 'epoch': 0.77} 77%|███████▋ | 26428/34278 [29:01:06<8:55:11, 4.09s/it] 77%|███████▋ | 26429/34278 [29:01:10<8:50:19, 4.05s/it] {'loss': 0.11, 'grad_norm': 0.752123689176318, 'learning_rate': 1.3131478919415298e-06, 'epoch': 0.77} 77%|███████▋ | 26429/34278 [29:01:10<8:50:19, 4.05s/it] 77%|███████▋ | 26430/34278 [29:01:13<8:15:33, 3.79s/it] {'loss': 0.1247, 'grad_norm': 0.7198618254965795, 'learning_rate': 1.3128287845083288e-06, 'epoch': 0.77} 77%|███████▋ | 26430/34278 [29:01:13<8:15:33, 3.79s/it] 77%|███████▋ | 26431/34278 [29:01:16<7:56:02, 3.64s/it] {'loss': 0.1205, 'grad_norm': 1.3233392143387757, 'learning_rate': 1.3125097099933144e-06, 'epoch': 0.77} 77%|███████▋ | 26431/34278 [29:01:16<7:56:02, 3.64s/it] 77%|███████▋ | 26432/34278 [29:01:20<7:51:12, 3.60s/it] {'loss': 0.1065, 'grad_norm': 1.0919256862175135, 'learning_rate': 1.3121906683993307e-06, 'epoch': 0.77} 77%|███████▋ | 26432/34278 
[29:01:20<7:51:12, 3.60s/it] 77%|███████▋ | 26433/34278 [29:01:25<8:42:40, 4.00s/it] {'loss': 0.1164, 'grad_norm': 0.9580261211845957, 'learning_rate': 1.3118716597292292e-06, 'epoch': 0.77} 77%|███████▋ | 26433/34278 [29:01:25<8:42:40, 4.00s/it] 77%|███████▋ | 26434/34278 [29:01:28<8:19:02, 3.82s/it] {'loss': 0.1006, 'grad_norm': 0.6467131070332959, 'learning_rate': 1.3115526839858583e-06, 'epoch': 0.77} 77%|███████▋ | 26434/34278 [29:01:28<8:19:02, 3.82s/it] 77%|███████▋ | 26435/34278 [29:01:31<7:53:42, 3.62s/it] {'loss': 0.1164, 'grad_norm': 0.8629428027626489, 'learning_rate': 1.3112337411720643e-06, 'epoch': 0.77} 77%|███████▋ | 26435/34278 [29:01:31<7:53:42, 3.62s/it] 77%|███████▋ | 26436/34278 [29:01:36<8:23:15, 3.85s/it] {'loss': 0.1205, 'grad_norm': 0.86144947721426, 'learning_rate': 1.3109148312906934e-06, 'epoch': 0.77} 77%|███████▋ | 26436/34278 [29:01:36<8:23:15, 3.85s/it] 77%|███████▋ | 26437/34278 [29:01:39<8:15:36, 3.79s/it] {'loss': 0.0983, 'grad_norm': 0.7128145435843275, 'learning_rate': 1.3105959543445962e-06, 'epoch': 0.77} 77%|███████▋ | 26437/34278 [29:01:39<8:15:36, 3.79s/it] 77%|███████▋ | 26438/34278 [29:01:44<8:46:05, 4.03s/it] {'loss': 0.1091, 'grad_norm': 0.9961558734954546, 'learning_rate': 1.3102771103366157e-06, 'epoch': 0.77} 77%|███████▋ | 26438/34278 [29:01:44<8:46:05, 4.03s/it] 77%|███████▋ | 26439/34278 [29:01:50<10:14:23, 4.70s/it] {'loss': 0.1195, 'grad_norm': 0.9780844718050035, 'learning_rate': 1.3099582992696019e-06, 'epoch': 0.77} 77%|███████▋ | 26439/34278 [29:01:50<10:14:23, 4.70s/it] 77%|███████▋ | 26440/34278 [29:01:56<11:12:27, 5.15s/it] {'loss': 0.1313, 'grad_norm': 1.0287588646014765, 'learning_rate': 1.309639521146398e-06, 'epoch': 0.77} 77%|███████▋ | 26440/34278 [29:01:56<11:12:27, 5.15s/it] 77%|███████▋ | 26441/34278 [29:02:00<10:05:32, 4.64s/it] {'loss': 0.1255, 'grad_norm': 1.251658470659023, 'learning_rate': 1.309320775969853e-06, 'epoch': 0.77} 77%|███████▋ | 26441/34278 [29:02:00<10:05:32, 4.64s/it] 
77%|███████▋ | 26442/34278 [29:02:03<8:51:29, 4.07s/it] {'loss': 0.1265, 'grad_norm': 1.196933869166425, 'learning_rate': 1.3090020637428109e-06, 'epoch': 0.77} 77%|███████▋ | 26442/34278 [29:02:03<8:51:29, 4.07s/it] 77%|███████▋ | 26443/34278 [29:02:05<8:03:31, 3.70s/it] {'loss': 0.1154, 'grad_norm': 0.9939862539712442, 'learning_rate': 1.3086833844681163e-06, 'epoch': 0.77} 77%|███████▋ | 26443/34278 [29:02:05<8:03:31, 3.70s/it] 77%|███████▋ | 26444/34278 [29:02:09<7:56:18, 3.65s/it] {'loss': 0.1168, 'grad_norm': 0.8493062979362679, 'learning_rate': 1.3083647381486147e-06, 'epoch': 0.77} 77%|███████▋ | 26444/34278 [29:02:09<7:56:18, 3.65s/it] 77%|███████▋ | 26445/34278 [29:02:12<7:33:45, 3.48s/it] {'loss': 0.1091, 'grad_norm': 0.9132476162737213, 'learning_rate': 1.3080461247871528e-06, 'epoch': 0.77} 77%|███████▋ | 26445/34278 [29:02:12<7:33:45, 3.48s/it] 77%|███████▋ | 26446/34278 [29:02:15<7:00:38, 3.22s/it] {'loss': 0.0903, 'grad_norm': 0.7964659543172595, 'learning_rate': 1.3077275443865744e-06, 'epoch': 0.77} 77%|███████▋ | 26446/34278 [29:02:15<7:00:38, 3.22s/it] 77%|███████▋ | 26447/34278 [29:02:18<7:06:34, 3.27s/it] {'loss': 0.1269, 'grad_norm': 0.8313801419019907, 'learning_rate': 1.307408996949721e-06, 'epoch': 0.77} 77%|███████▋ | 26447/34278 [29:02:18<7:06:34, 3.27s/it] 77%|███████▋ | 26448/34278 [29:02:21<6:57:32, 3.20s/it] {'loss': 0.1092, 'grad_norm': 0.8557884059584263, 'learning_rate': 1.3070904824794405e-06, 'epoch': 0.77} 77%|███████▋ | 26448/34278 [29:02:21<6:57:32, 3.20s/it] 77%|███████▋ | 26449/34278 [29:02:25<7:28:08, 3.43s/it] {'loss': 0.12, 'grad_norm': 0.9307981977101334, 'learning_rate': 1.3067720009785744e-06, 'epoch': 0.77} 77%|███████▋ | 26449/34278 [29:02:25<7:28:08, 3.43s/it] 77%|███████▋ | 26450/34278 [29:02:31<9:10:26, 4.22s/it] {'loss': 0.1123, 'grad_norm': 0.9161618851245314, 'learning_rate': 1.3064535524499638e-06, 'epoch': 0.77} 77%|███████▋ | 26450/34278 [29:02:31<9:10:26, 4.22s/it] 77%|███████▋ | 26451/34278 
[29:02:34<8:36:03, 3.96s/it] {'loss': 0.1178, 'grad_norm': 0.9973343431939814, 'learning_rate': 1.3061351368964565e-06, 'epoch': 0.77} 77%|███████▋ | 26451/34278 [29:02:34<8:36:03, 3.96s/it] 77%|███████▋ | 26452/34278 [29:02:39<9:04:04, 4.17s/it] {'loss': 0.1071, 'grad_norm': 0.8219635170300139, 'learning_rate': 1.3058167543208932e-06, 'epoch': 0.77} 77%|███████▋ | 26452/34278 [29:02:39<9:04:04, 4.17s/it] 77%|███████▋ | 26453/34278 [29:02:44<9:24:57, 4.33s/it] {'loss': 0.1105, 'grad_norm': 0.8308945313301898, 'learning_rate': 1.3054984047261143e-06, 'epoch': 0.77} 77%|███████▋ | 26453/34278 [29:02:44<9:24:57, 4.33s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff041990680>
Failed to fetch sample 2667589.
Exception: cannot identify image file <_io.BytesIO object at 0x7ff041990680> 77%|███████▋ | 26454/34278 [29:02:47<8:53:40, 4.09s/it] {'loss': 0.1267, 'grad_norm': 0.9276828756661168, 'learning_rate': 1.305180088114965e-06, 'epoch': 0.77} 77%|███████▋ | 26454/34278 [29:02:47<8:53:40, 4.09s/it] 77%|███████▋ | 26455/34278 [29:02:53<10:12:45, 4.70s/it] {'loss': 0.1004, 'grad_norm': 0.6314204423551208, 'learning_rate': 1.3048618044902867e-06, 'epoch': 0.77} 77%|███████▋ | 26455/34278 [29:02:53<10:12:45, 4.70s/it] 77%|███████▋ | 26456/34278 [29:02:56<9:07:24, 4.20s/it] {'loss': 0.1083, 'grad_norm': 0.7850854170641326, 'learning_rate': 1.3045435538549178e-06, 'epoch': 0.77} 77%|███████▋ | 26456/34278 [29:02:56<9:07:24, 4.20s/it] 77%|███████▋ | 26457/34278 [29:03:00<8:31:40, 3.93s/it] {'loss': 0.103, 'grad_norm': 0.8000894342283231, 'learning_rate': 1.3042253362117025e-06, 'epoch': 0.77} 77%|███████▋ | 26457/34278 [29:03:00<8:31:40, 3.93s/it] 77%|███████▋ | 26458/34278 [29:03:04<9:02:51, 4.17s/it] {'loss': 0.1183, 'grad_norm': 0.9008739004837375, 'learning_rate': 1.3039071515634822e-06, 'epoch': 0.77} 77%|███████▋ | 26458/34278 [29:03:05<9:02:51, 4.17s/it] 77%|███████▋ | 26459/34278 [29:03:07<8:10:37, 3.76s/it] {'loss': 0.136, 'grad_norm': 0.8673481606862793, 'learning_rate': 1.3035889999130963e-06, 'epoch': 0.77} 77%|███████▋ | 26459/34278 [29:03:07<8:10:37, 3.76s/it] 77%|███████▋ | 26460/34278 [29:03:10<7:45:43, 3.57s/it] {'loss': 0.1126, 'grad_norm': 0.8850524606452219, 'learning_rate': 1.3032708812633843e-06, 'epoch': 0.77} 77%|███████▋ | 26460/34278 [29:03:10<7:45:43, 3.57s/it] 77%|███████▋ | 26461/34278 [29:03:16<9:20:34, 4.30s/it] {'loss': 0.1173, 'grad_norm': 0.8824738752227391, 'learning_rate': 1.302952795617189e-06, 'epoch': 0.77} 77%|███████▋ | 26461/34278 [29:03:16<9:20:34, 4.30s/it] 77%|███████▋ | 26462/34278 [29:03:19<8:28:50, 3.91s/it] {'loss': 0.1301, 'grad_norm': 0.9801705753487708, 'learning_rate': 1.3026347429773467e-06, 'epoch': 0.77} 77%|███████▋ | 
26462/34278 [29:03:19<8:28:50, 3.91s/it] 77%|███████▋ | 26463/34278 [29:03:26<10:09:03, 4.68s/it] {'loss': 0.1166, 'grad_norm': 0.869696321205162, 'learning_rate': 1.3023167233466988e-06, 'epoch': 0.77} 77%|███████▋ | 26463/34278 [29:03:26<10:09:03, 4.68s/it] 77%|███████▋ | 26464/34278 [29:03:29<9:15:21, 4.26s/it] {'loss': 0.1208, 'grad_norm': 0.699863753769055, 'learning_rate': 1.3019987367280863e-06, 'epoch': 0.77} 77%|███████▋ | 26464/34278 [29:03:29<9:15:21, 4.26s/it] 77%|███████▋ | 26465/34278 [29:03:32<8:25:58, 3.89s/it] {'loss': 0.1334, 'grad_norm': 0.7509956601812794, 'learning_rate': 1.3016807831243462e-06, 'epoch': 0.77} 77%|███████▋ | 26465/34278 [29:03:32<8:25:58, 3.89s/it] 77%|███████▋ | 26466/34278 [29:03:39<10:04:36, 4.64s/it] {'loss': 0.1117, 'grad_norm': 0.7857488046474499, 'learning_rate': 1.3013628625383156e-06, 'epoch': 0.77} 77%|███████▋ | 26466/34278 [29:03:39<10:04:36, 4.64s/it] 77%|███████▋ | 26467/34278 [29:03:41<8:54:23, 4.10s/it] {'loss': 0.1092, 'grad_norm': 0.7597992997166758, 'learning_rate': 1.301044974972836e-06, 'epoch': 0.77} 77%|███████▋ | 26467/34278 [29:03:42<8:54:23, 4.10s/it] 77%|███████▋ | 26468/34278 [29:03:45<8:23:05, 3.86s/it] {'loss': 0.1109, 'grad_norm': 0.902079263486036, 'learning_rate': 1.3007271204307425e-06, 'epoch': 0.77} 77%|███████▋ | 26468/34278 [29:03:45<8:23:05, 3.86s/it] 77%|███████▋ | 26469/34278 [29:03:48<7:47:45, 3.59s/it] {'loss': 0.1356, 'grad_norm': 0.9154434600938871, 'learning_rate': 1.3004092989148753e-06, 'epoch': 0.77} 77%|███████▋ | 26469/34278 [29:03:48<7:47:45, 3.59s/it] 77%|███████▋ | 26470/34278 [29:03:51<7:31:55, 3.47s/it] {'loss': 0.1209, 'grad_norm': 0.7602294858779245, 'learning_rate': 1.3000915104280699e-06, 'epoch': 0.77} 77%|███████▋ | 26470/34278 [29:03:51<7:31:55, 3.47s/it] 77%|███████▋ | 26471/34278 [29:03:54<7:27:10, 3.44s/it] {'loss': 0.1037, 'grad_norm': 0.8650595391250381, 'learning_rate': 1.2997737549731647e-06, 'epoch': 0.77} 77%|███████▋ | 26471/34278 [29:03:54<7:27:10, 
3.44s/it] 77%|███████▋ | 26472/34278 [29:03:59<7:58:38, 3.68s/it] {'loss': 0.116, 'grad_norm': 0.728679407589401, 'learning_rate': 1.299456032552997e-06, 'epoch': 0.77} 77%|███████▋ | 26472/34278 [29:03:59<7:58:38, 3.68s/it] 77%|███████▋ | 26473/34278 [29:04:02<7:34:09, 3.49s/it] {'loss': 0.1636, 'grad_norm': 0.911503442395789, 'learning_rate': 1.2991383431704008e-06, 'epoch': 0.77} 77%|███████▋ | 26473/34278 [29:04:02<7:34:09, 3.49s/it] 77%|███████▋ | 26474/34278 [29:04:05<7:33:09, 3.48s/it] {'loss': 0.1356, 'grad_norm': 1.119843180670213, 'learning_rate': 1.2988206868282138e-06, 'epoch': 0.77} 77%|███████▋ | 26474/34278 [29:04:05<7:33:09, 3.48s/it] 77%|███████▋ | 26475/34278 [29:04:10<8:34:15, 3.95s/it] {'loss': 0.0951, 'grad_norm': 0.8635609734114374, 'learning_rate': 1.2985030635292733e-06, 'epoch': 0.77} 77%|███████▋ | 26475/34278 [29:04:10<8:34:15, 3.95s/it] 77%|███████▋ | 26476/34278 [29:04:13<8:04:49, 3.73s/it] {'loss': 0.1326, 'grad_norm': 0.7894405202081105, 'learning_rate': 1.2981854732764142e-06, 'epoch': 0.77} 77%|███████▋ | 26476/34278 [29:04:13<8:04:49, 3.73s/it] 77%|███████▋ | 26477/34278 [29:04:19<9:27:15, 4.36s/it] {'loss': 0.1196, 'grad_norm': 0.7744503147262438, 'learning_rate': 1.2978679160724706e-06, 'epoch': 0.77} 77%|███████▋ | 26477/34278 [29:04:19<9:27:15, 4.36s/it] 77%|███████▋ | 26478/34278 [29:04:22<8:47:07, 4.05s/it] {'loss': 0.1343, 'grad_norm': 0.9085344278051577, 'learning_rate': 1.2975503919202793e-06, 'epoch': 0.77} 77%|███████▋ | 26478/34278 [29:04:22<8:47:07, 4.05s/it] 77%|███████▋ | 26479/34278 [29:04:28<10:03:17, 4.64s/it] {'loss': 0.118, 'grad_norm': 0.8709450678143941, 'learning_rate': 1.2972329008226741e-06, 'epoch': 0.77} 77%|███████▋ | 26479/34278 [29:04:29<10:03:17, 4.64s/it] 77%|███████▋ | 26480/34278 [29:04:32<9:13:12, 4.26s/it] {'loss': 0.0993, 'grad_norm': 0.6416055095465956, 'learning_rate': 1.296915442782487e-06, 'epoch': 0.77} 77%|███████▋ | 26480/34278 [29:04:32<9:13:12, 4.26s/it] 77%|███████▋ | 26481/34278 
[29:04:35<8:28:29, 3.91s/it] {'loss': 0.1105, 'grad_norm': 0.9495578419459769, 'learning_rate': 1.2965980178025577e-06, 'epoch': 0.77} 77%|███████▋ | 26481/34278 [29:04:35<8:28:29, 3.91s/it] 77%|███████▋ | 26482/34278 [29:04:38<8:11:56, 3.79s/it] {'loss': 0.1049, 'grad_norm': 0.8520634504504494, 'learning_rate': 1.2962806258857175e-06, 'epoch': 0.77} 77%|███████▋ | 26482/34278 [29:04:38<8:11:56, 3.79s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6d1efdcbd0>
Failed to fetch sample 3655706.
Exception: cannot identify image file <_io.BytesIO object at 0x7f6d1efdcbd0> 77%|███████▋ | 26483/34278 [29:04:44<9:35:53, 4.43s/it] {'loss': 0.117, 'grad_norm': 0.9039195815906078, 'learning_rate': 1.2959632670347976e-06, 'epoch': 0.77} 77%|███████▋ | 26483/34278 [29:04:44<9:35:53, 4.43s/it] 77%|███████▋ | 26484/34278 [29:04:49<9:45:49, 4.51s/it] {'loss': 0.1187, 'grad_norm': 0.7995310471836102, 'learning_rate': 1.2956459412526357e-06, 'epoch': 0.77} 77%|███████▋ | 26484/34278 [29:04:49<9:45:49, 4.51s/it] 77%|███████▋ | 26485/34278 [29:04:52<8:43:22, 4.03s/it] {'loss': 0.1098, 'grad_norm': 1.0539171841398818, 'learning_rate': 1.2953286485420618e-06, 'epoch': 0.77} 77%|███████▋ | 26485/34278 [29:04:52<8:43:22, 4.03s/it] 77%|███████▋ | 26486/34278 [29:04:56<8:25:44, 3.89s/it] {'loss': 0.1255, 'grad_norm': 0.9727985169305272, 'learning_rate': 1.2950113889059084e-06, 'epoch': 0.77} 77%|███████▋ | 26486/34278 [29:04:56<8:25:44, 3.89s/it] 77%|███████▋ | 26487/34278 [29:04:59<7:55:17, 3.66s/it] {'loss': 0.1136, 'grad_norm': 1.1550559656561254, 'learning_rate': 1.294694162347009e-06, 'epoch': 0.77} 77%|███████▋ | 26487/34278 [29:04:59<7:55:17, 3.66s/it] 77%|███████▋ | 26488/34278 [29:05:02<7:27:27, 3.45s/it] {'loss': 0.1141, 'grad_norm': 0.726800504404567, 'learning_rate': 1.2943769688681968e-06, 'epoch': 0.77} 77%|███████▋ | 26488/34278 [29:05:02<7:27:27, 3.45s/it] 77%|███████▋ | 26489/34278 [29:05:04<6:59:45, 3.23s/it] {'loss': 0.0945, 'grad_norm': 1.8236590615622714, 'learning_rate': 1.294059808472302e-06, 'epoch': 0.77} 77%|███████▋ | 26489/34278 [29:05:04<6:59:45, 3.23s/it] 77%|███████▋ | 26490/34278 [29:05:08<7:01:26, 3.25s/it] {'loss': 0.1052, 'grad_norm': 0.8323302104959504, 'learning_rate': 1.2937426811621557e-06, 'epoch': 0.77} 77%|███████▋ | 26490/34278 [29:05:08<7:01:26, 3.25s/it] 77%|███████▋ | 26491/34278 [29:05:11<7:00:12, 3.24s/it] {'loss': 0.1081, 'grad_norm': 0.7459109385827977, 'learning_rate': 1.293425586940591e-06, 'epoch': 0.77} 77%|███████▋ | 
26491/34278 [29:05:11<7:00:12, 3.24s/it] 77%|███████▋ | 26492/34278 [29:05:15<7:52:58, 3.64s/it] {'loss': 0.1089, 'grad_norm': 1.2150684866793608, 'learning_rate': 1.2931085258104365e-06, 'epoch': 0.77} 77%|███████▋ | 26492/34278 [29:05:15<7:52:58, 3.64s/it] 77%|███████▋ | 26493/34278 [29:05:19<7:32:23, 3.49s/it] {'loss': 0.1237, 'grad_norm': 0.9975084486954806, 'learning_rate': 1.292791497774526e-06, 'epoch': 0.77} 77%|███████▋ | 26493/34278 [29:05:19<7:32:23, 3.49s/it] 77%|███████▋ | 26494/34278 [29:05:22<7:13:14, 3.34s/it] {'loss': 0.1296, 'grad_norm': 0.6490156257469298, 'learning_rate': 1.292474502835686e-06, 'epoch': 0.77} 77%|███████▋ | 26494/34278 [29:05:22<7:13:14, 3.34s/it] 77%|███████▋ | 26495/34278 [29:05:25<7:05:24, 3.28s/it] {'loss': 0.1077, 'grad_norm': 0.7228196405436439, 'learning_rate': 1.2921575409967507e-06, 'epoch': 0.77} 77%|███████▋ | 26495/34278 [29:05:25<7:05:24, 3.28s/it] 77%|███████▋ | 26496/34278 [29:05:28<7:15:30, 3.36s/it] {'loss': 0.1186, 'grad_norm': 1.0764161726479002, 'learning_rate': 1.2918406122605459e-06, 'epoch': 0.77} 77%|███████▋ | 26496/34278 [29:05:28<7:15:30, 3.36s/it] 77%|███████▋ | 26497/34278 [29:05:32<7:12:16, 3.33s/it] {'loss': 0.1026, 'grad_norm': 0.7546420173359845, 'learning_rate': 1.2915237166299038e-06, 'epoch': 0.77} 77%|███████▋ | 26497/34278 [29:05:32<7:12:16, 3.33s/it] 77%|███████▋ | 26498/34278 [29:05:36<8:02:25, 3.72s/it] {'loss': 0.1229, 'grad_norm': 1.01179067039939, 'learning_rate': 1.2912068541076523e-06, 'epoch': 0.77} 77%|███████▋ | 26498/34278 [29:05:36<8:02:25, 3.72s/it] 77%|███████▋ | 26499/34278 [29:05:39<7:25:16, 3.43s/it] {'loss': 0.1226, 'grad_norm': 0.9371775488421225, 'learning_rate': 1.2908900246966215e-06, 'epoch': 0.77} 77%|███████▋ | 26499/34278 [29:05:39<7:25:16, 3.43s/it] 77%|███████▋ | 26500/34278 [29:05:42<7:29:06, 3.46s/it] {'loss': 0.1218, 'grad_norm': 1.0982933175170113, 'learning_rate': 1.2905732283996374e-06, 'epoch': 0.77} 77%|███████▋ | 26500/34278 [29:05:42<7:29:06, 3.46s/it] 
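The recurring tracebacks in this log all end in `PIL.UnidentifiedImageError` inside `pil_loader`: the bytes fetched from Ceph for those samples are not decodable images, and the dataset logs "Failed to fetch sample N" and moves on. A minimal sketch of that defensive-loading pattern (hypothetical names, not the project's actual `dataset.py` code): decode through a BytesIO buffer, return `None` on undecodable bytes, and let the caller retry a different index.

```python
# Hypothetical sketch of a defensive image loader, mirroring the failure
# path in the traceback above (pil_loader -> Image.open raising
# UnidentifiedImageError on corrupt bytes). Not the project's actual code.
import io
import random
from typing import Optional

from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes) -> Optional[Image.Image]:
    """Decode raw bytes into an RGB image, or None if PIL cannot identify them."""
    try:
        img = Image.open(io.BytesIO(img_bytes))
        return img.convert("RGB")
    except UnidentifiedImageError:
        return None


def get_item_with_fallback(dataset, index: int, max_retries: int = 10):
    """Retry random indices when a sample fails to load (returns None)."""
    for _ in range(max_retries):
        sample = dataset[index]
        if sample is not None:
            return sample
        index = random.randrange(len(dataset))
    raise RuntimeError("Too many consecutive corrupt samples")
```

The key design choice is returning `None` instead of raising, so a single corrupt object in the bucket costs one retry rather than crashing a 29-hour training run.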
77%|███████▋ | 26501/34278 [29:05:49<9:13:27, 4.27s/it] {'loss': 0.1318, 'grad_norm': 1.797577771811171, 'learning_rate': 1.290256465219532e-06, 'epoch': 0.77} 77%|███████▋ | 26501/34278 [29:05:49<9:13:27, 4.27s/it] 77%|███████▋ | 26502/34278 [29:05:51<8:17:32, 3.84s/it] {'loss': 0.1202, 'grad_norm': 0.9185502509125388, 'learning_rate': 1.2899397351591308e-06, 'epoch': 0.77} 77%|███████▋ | 26502/34278 [29:05:51<8:17:32, 3.84s/it] 77%|███████▋ | 26503/34278 [29:05:57<9:20:22, 4.32s/it] {'loss': 0.1, 'grad_norm': 0.8406384464510138, 'learning_rate': 1.289623038221261e-06, 'epoch': 0.77} 77%|███████▋ | 26503/34278 [29:05:57<9:20:22, 4.32s/it] 77%|███████▋ | 26504/34278 [29:06:01<9:08:39, 4.23s/it] {'loss': 0.1173, 'grad_norm': 0.8154581556170826, 'learning_rate': 1.289306374408751e-06, 'epoch': 0.77} 77%|███████▋ | 26504/34278 [29:06:01<9:08:39, 4.23s/it] 77%|███████▋ | 26505/34278 [29:06:04<8:26:35, 3.91s/it] {'loss': 0.1083, 'grad_norm': 0.9123238426515281, 'learning_rate': 1.2889897437244292e-06, 'epoch': 0.77} 77%|███████▋ | 26505/34278 [29:06:04<8:26:35, 3.91s/it] 77%|███████▋ | 26506/34278 [29:06:09<8:54:00, 4.12s/it] {'loss': 0.1092, 'grad_norm': 1.1856662017489783, 'learning_rate': 1.288673146171121e-06, 'epoch': 0.77} 77%|███████▋ | 26506/34278 [29:06:09<8:54:00, 4.12s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
77%|███████▋ | 26507/34278 [29:06:13<8:59:42, 4.17s/it] {'loss': 0.1159, 'grad_norm': 0.8013534639063158, 'learning_rate': 1.2883565817516513e-06, 'epoch': 0.77} 77%|███████▋ | 26507/34278 [29:06:13<8:59:42, 4.17s/it] 77%|███████▋ | 26508/34278 [29:06:16<8:08:07, 3.77s/it] {'loss': 0.1189, 'grad_norm': 0.8404164148131812, 'learning_rate': 1.2880400504688501e-06, 'epoch': 0.77} 77%|███████▋ | 26508/34278 [29:06:16<8:08:07, 3.77s/it] 77%|███████▋ | 26509/34278 [29:06:22<9:30:53, 4.41s/it] {'loss': 0.1373, 'grad_norm': 0.9352412053671898, 'learning_rate': 1.2877235523255388e-06, 'epoch': 0.77} 77%|███████▋ | 26509/34278 [29:06:22<9:30:53, 4.41s/it] 77%|███████▋ | 26510/34278 [29:06:25<8:44:02, 4.05s/it] {'loss': 0.1287, 'grad_norm': 0.8562182761344047, 'learning_rate': 1.2874070873245465e-06, 'epoch': 0.77} 77%|███████▋ | 26510/34278 [29:06:25<8:44:02, 4.05s/it] 77%|███████▋ | 26511/34278 [29:06:30<9:40:31, 4.48s/it] {'loss': 0.1168, 'grad_norm': 0.7641345416105122, 'learning_rate': 1.2870906554686979e-06, 'epoch': 0.77} 77%|███████▋ | 26511/34278 [29:06:30<9:40:31, 4.48s/it] 77%|███████▋ | 26512/34278 [29:06:34<9:16:17, 4.30s/it] {'loss': 0.1192, 'grad_norm': 0.9916976973259733, 'learning_rate': 1.2867742567608182e-06, 'epoch': 0.77} 77%|███████▋ | 26512/34278 [29:06:34<9:16:17, 4.30s/it] 77%|███████▋ | 26513/34278 [29:06:38<8:36:24, 3.99s/it] {'loss': 0.1066, 'grad_norm': 0.8311264741871122, 'learning_rate': 1.2864578912037302e-06, 'epoch': 0.77} 77%|███████▋ | 26513/34278 [29:06:38<8:36:24, 3.99s/it] 77%|███████▋ | 26514/34278 [29:06:41<7:57:32, 3.69s/it] {'loss': 0.0888, 'grad_norm': 0.7764714206791056, 'learning_rate': 1.2861415588002607e-06, 'epoch': 0.77} 77%|███████▋ | 26514/34278 [29:06:41<7:57:32, 3.69s/it] 77%|███████▋ | 26515/34278 [29:06:44<8:06:44, 3.76s/it] {'loss': 0.0964, 'grad_norm': 0.8708474570257974, 'learning_rate': 1.2858252595532316e-06, 'epoch': 0.77} 77%|███████▋ | 26515/34278 [29:06:44<8:06:44, 3.76s/it]
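The `torch/utils/checkpoint.py:87` UserWarning above ("None of the inputs have requires_grad=True. Gradients will be None") appears when no tensor entering a checkpointed segment requires grad; with the reentrant implementation, autograd only tracks the segment's explicit tensor inputs, so the segment's output is cut from the graph. A small standalone sketch of the failure mode and the usual fix (a generic toy layer, not this trainer's code):

```python
# Hypothetical toy reproduction of the situation behind the checkpoint
# warning in this log; not the trainer's actual code.
import torch
from torch.utils.checkpoint import checkpoint

layer = torch.nn.Linear(8, 8)

# Input does not require grad -> reentrant checkpoint output is detached,
# so backward would never reach `layer`'s parameters.
x = torch.randn(2, 8)
y = checkpoint(layer, x, use_reentrant=True)
assert not y.requires_grad

# Common fix (e.g. when embeddings are frozen): force the segment's input
# to require grad so the recomputed forward stays connected to autograd.
x = x.detach().requires_grad_(True)
y = checkpoint(layer, x, use_reentrant=True)
y.sum().backward()
assert layer.weight.grad is not None
```

In practice this is why trainers that freeze the input embedding call something like HF's `enable_input_require_grads` before turning on gradient checkpointing; with `use_reentrant=False` the caveat is also relaxed.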
77%|███████▋ | 26516/34278 [29:06:47<7:34:21, 3.51s/it] {'loss': 0.1209, 'grad_norm': 0.9782078589934318, 'learning_rate': 1.285508993465469e-06, 'epoch': 0.77} 77%|███████▋ | 26516/34278 [29:06:47<7:34:21, 3.51s/it] 77%|███████▋ | 26517/34278 [29:06:51<7:33:22, 3.50s/it] {'loss': 0.108, 'grad_norm': 0.9509027822156123, 'learning_rate': 1.2851927605397946e-06, 'epoch': 0.77} 77%|███████▋ | 26517/34278 [29:06:51<7:33:22, 3.50s/it] 77%|███████▋ | 26518/34278 [29:06:54<7:15:19, 3.37s/it] {'loss': 0.103, 'grad_norm': 0.7746880580086296, 'learning_rate': 1.2848765607790332e-06, 'epoch': 0.77} 77%|███████▋ | 26518/34278 [29:06:54<7:15:19, 3.37s/it] 77%|███████▋ | 26519/34278 [29:06:57<7:06:13, 3.30s/it] {'loss': 0.1023, 'grad_norm': 0.7650423822361858, 'learning_rate': 1.2845603941860074e-06, 'epoch': 0.77} 77%|███████▋ | 26519/34278 [29:06:57<7:06:13, 3.30s/it] 77%|███████▋ | 26520/34278 [29:07:00<6:46:56, 3.15s/it] {'loss': 0.1082, 'grad_norm': 1.4347883213488632, 'learning_rate': 1.2842442607635381e-06, 'epoch': 0.77} 77%|███████▋ | 26520/34278 [29:07:00<6:46:56, 3.15s/it] 77%|███████▋ | 26521/34278 [29:07:04<7:12:44, 3.35s/it] {'loss': 0.1036, 'grad_norm': 1.1284873371552322, 'learning_rate': 1.2839281605144488e-06, 'epoch': 0.77} 77%|███████▋ | 26521/34278 [29:07:04<7:12:44, 3.35s/it] 77%|███████▋ | 26522/34278 [29:07:07<7:25:28, 3.45s/it] {'loss': 0.1099, 'grad_norm': 0.9096892401280708, 'learning_rate': 1.283612093441563e-06, 'epoch': 0.77} 77%|███████▋ | 26522/34278 [29:07:07<7:25:28, 3.45s/it] 77%|███████▋ | 26523/34278 [29:07:11<7:48:29, 3.62s/it] {'loss': 0.1174, 'grad_norm': 0.7006154927094754, 'learning_rate': 1.2832960595477017e-06, 'epoch': 0.77} 77%|███████▋ | 26523/34278 [29:07:11<7:48:29, 3.62s/it] 77%|███████▋ | 26524/34278 [29:07:14<7:25:30, 3.45s/it] {'loss': 0.1021, 'grad_norm': 1.0828107341052888, 'learning_rate': 1.2829800588356839e-06, 'epoch': 0.77} 77%|███████▋ | 26524/34278 [29:07:14<7:25:30, 3.45s/it] 77%|███████▋ | 26525/34278 
[29:07:20<9:03:29, 4.21s/it] {'loss': 0.1172, 'grad_norm': 1.0696356690412794, 'learning_rate': 1.282664091308335e-06, 'epoch': 0.77} 77%|███████▋ | 26525/34278 [29:07:20<9:03:29, 4.21s/it] 77%|███████▋ | 26526/34278 [29:07:26<10:08:13, 4.71s/it] {'loss': 0.1297, 'grad_norm': 0.8497245081942782, 'learning_rate': 1.282348156968472e-06, 'epoch': 0.77} 77%|███████▋ | 26526/34278 [29:07:26<10:08:13, 4.71s/it] 77%|███████▋ | 26527/34278 [29:07:32<10:58:37, 5.10s/it] {'loss': 0.1285, 'grad_norm': 0.9001105388819419, 'learning_rate': 1.282032255818917e-06, 'epoch': 0.77} 77%|███████▋ | 26527/34278 [29:07:32<10:58:37, 5.10s/it] 77%|███████▋ | 26528/34278 [29:07:38<11:14:48, 5.22s/it] {'loss': 0.1217, 'grad_norm': 0.8068502221676728, 'learning_rate': 1.2817163878624917e-06, 'epoch': 0.77} 77%|███████▋ | 26528/34278 [29:07:38<11:14:48, 5.22s/it] 77%|███████▋ | 26529/34278 [29:07:41<9:52:23, 4.59s/it] {'loss': 0.1291, 'grad_norm': 0.8138790681916827, 'learning_rate': 1.281400553102015e-06, 'epoch': 0.77} 77%|███████▋ | 26529/34278 [29:07:41<9:52:23, 4.59s/it] 77%|███████▋ | 26530/34278 [29:07:45<9:16:07, 4.31s/it] {'loss': 0.141, 'grad_norm': 0.8778194586781097, 'learning_rate': 1.2810847515403058e-06, 'epoch': 0.77} 77%|███████▋ | 26530/34278 [29:07:45<9:16:07, 4.31s/it] 77%|███████▋ | 26531/34278 [29:07:48<8:27:09, 3.93s/it] {'loss': 0.1069, 'grad_norm': 0.9503925376410518, 'learning_rate': 1.2807689831801846e-06, 'epoch': 0.77} 77%|███████▋ | 26531/34278 [29:07:48<8:27:09, 3.93s/it] 77%|███████▋ | 26532/34278 [29:07:50<7:40:30, 3.57s/it] {'loss': 0.1157, 'grad_norm': 1.0624130337991038, 'learning_rate': 1.2804532480244709e-06, 'epoch': 0.77} 77%|███████▋ | 26532/34278 [29:07:50<7:40:30, 3.57s/it] 77%|███████▋ | 26533/34278 [29:07:53<7:23:32, 3.44s/it] {'loss': 0.1371, 'grad_norm': 0.9790620493478033, 'learning_rate': 1.2801375460759802e-06, 'epoch': 0.77} 77%|███████▋ | 26533/34278 [29:07:53<7:23:32, 3.44s/it] 77%|███████▋ | 26534/34278 [29:07:56<7:06:26, 3.30s/it] 
{'loss': 0.1086, 'grad_norm': 1.1323242419711559, 'learning_rate': 1.2798218773375342e-06, 'epoch': 0.77} 77%|███████▋ | 26534/34278 [29:07:56<7:06:26, 3.30s/it] 77%|███████▋ | 26535/34278 [29:08:00<6:57:07, 3.23s/it] {'loss': 0.1239, 'grad_norm': 1.0682323746768634, 'learning_rate': 1.2795062418119519e-06, 'epoch': 0.77} 77%|███████▋ | 26535/34278 [29:08:00<6:57:07, 3.23s/it] 77%|███████▋ | 26536/34278 [29:08:03<7:13:46, 3.36s/it] {'loss': 0.1241, 'grad_norm': 0.8797137663193164, 'learning_rate': 1.2791906395020493e-06, 'epoch': 0.77} 77%|███████▋ | 26536/34278 [29:08:03<7:13:46, 3.36s/it] 77%|███████▋ | 26537/34278 [29:08:09<8:50:09, 4.11s/it] {'loss': 0.1112, 'grad_norm': 0.9167332230238365, 'learning_rate': 1.2788750704106434e-06, 'epoch': 0.77} 77%|███████▋ | 26537/34278 [29:08:09<8:50:09, 4.11s/it] 77%|███████▋ | 26538/34278 [29:08:12<8:12:30, 3.82s/it] {'loss': 0.1075, 'grad_norm': 0.7695633151047667, 'learning_rate': 1.2785595345405539e-06, 'epoch': 0.77} 77%|███████▋ | 26538/34278 [29:08:12<8:12:30, 3.82s/it] 77%|███████▋ | 26539/34278 [29:08:15<7:39:55, 3.57s/it] {'loss': 0.1069, 'grad_norm': 0.8312754128050671, 'learning_rate': 1.278244031894595e-06, 'epoch': 0.77} 77%|███████▋ | 26539/34278 [29:08:15<7:39:55, 3.57s/it] 77%|███████▋ | 26540/34278 [29:08:21<9:14:16, 4.30s/it] {'loss': 0.1333, 'grad_norm': 0.8945044287668984, 'learning_rate': 1.277928562475585e-06, 'epoch': 0.77} 77%|███████▋ | 26540/34278 [29:08:21<9:14:16, 4.30s/it] 77%|███████▋ | 26541/34278 [29:08:25<9:15:08, 4.31s/it] {'loss': 0.1226, 'grad_norm': 0.8845537811854449, 'learning_rate': 1.2776131262863412e-06, 'epoch': 0.77} 77%|███████▋ | 26541/34278 [29:08:25<9:15:08, 4.31s/it] 77%|███████▋ | 26542/34278 [29:08:29<8:32:53, 3.98s/it] {'loss': 0.1193, 'grad_norm': 1.046034095084399, 'learning_rate': 1.2772977233296796e-06, 'epoch': 0.77} 77%|███████▋ | 26542/34278 [29:08:29<8:32:53, 3.98s/it] 77%|███████▋ | 26543/34278 [29:08:34<9:05:26, 4.23s/it] {'loss': 0.119, 'grad_norm': 
0.9048239629282795, 'learning_rate': 1.276982353608413e-06, 'epoch': 0.77} 77%|███████▋ | 26543/34278 [29:08:34<9:05:26, 4.23s/it] 77%|███████▋ | 26544/34278 [29:08:40<10:16:30, 4.78s/it] {'loss': 0.1241, 'grad_norm': 0.8592410590381747, 'learning_rate': 1.2766670171253614e-06, 'epoch': 0.77} 77%|███████▋ | 26544/34278 [29:08:40<10:16:30, 4.78s/it] 77%|███████▋ | 26545/34278 [29:08:43<9:13:03, 4.29s/it] {'loss': 0.1062, 'grad_norm': 0.9521390371402167, 'learning_rate': 1.276351713883336e-06, 'epoch': 0.77} 77%|███████▋ | 26545/34278 [29:08:43<9:13:03, 4.29s/it] 77%|███████▋ | 26546/34278 [29:08:46<8:20:49, 3.89s/it] {'loss': 0.1071, 'grad_norm': 0.6922278166740373, 'learning_rate': 1.2760364438851553e-06, 'epoch': 0.77} 77%|███████▋ | 26546/34278 [29:08:46<8:20:49, 3.89s/it] 77%|███████▋ | 26547/34278 [29:08:49<7:59:52, 3.72s/it] {'loss': 0.1184, 'grad_norm': 0.7868868881989651, 'learning_rate': 1.2757212071336301e-06, 'epoch': 0.77} 77%|███████▋ | 26547/34278 [29:08:49<7:59:52, 3.72s/it] 77%|███████▋ | 26548/34278 [29:08:52<7:37:57, 3.55s/it] {'loss': 0.1082, 'grad_norm': 0.8375359753984807, 'learning_rate': 1.275406003631579e-06, 'epoch': 0.77} 77%|███████▋ | 26548/34278 [29:08:52<7:37:57, 3.55s/it] 77%|███████▋ | 26549/34278 [29:08:56<7:49:58, 3.65s/it] {'loss': 0.1235, 'grad_norm': 0.8448427683544858, 'learning_rate': 1.275090833381814e-06, 'epoch': 0.77} 77%|███████▋ | 26549/34278 [29:08:56<7:49:58, 3.65s/it] 77%|███████▋ | 26550/34278 [29:09:00<7:49:51, 3.65s/it] {'loss': 0.1287, 'grad_norm': 0.8780531252526221, 'learning_rate': 1.2747756963871472e-06, 'epoch': 0.77} 77%|███████▋ | 26550/34278 [29:09:00<7:49:51, 3.65s/it] 77%|███████▋ | 26551/34278 [29:09:03<7:36:03, 3.54s/it] {'loss': 0.1189, 'grad_norm': 0.7180162243325253, 'learning_rate': 1.2744605926503934e-06, 'epoch': 0.77} 77%|███████▋ | 26551/34278 [29:09:03<7:36:03, 3.54s/it] 77%|███████▋ | 26552/34278 [29:09:07<7:43:22, 3.60s/it] {'loss': 0.1208, 'grad_norm': 0.7395139920355911, 'learning_rate': 
1.274145522174368e-06, 'epoch': 0.77} 77%|███████▋ | 26552/34278 [29:09:07<7:43:22, 3.60s/it] 77%|███████▋ | 26553/34278 [29:09:10<7:20:00, 3.42s/it] {'loss': 0.1495, 'grad_norm': 0.8643409270290434, 'learning_rate': 1.2738304849618815e-06, 'epoch': 0.77} 77%|███████▋ | 26553/34278 [29:09:10<7:20:00, 3.42s/it] 77%|███████▋ | 26554/34278 [29:09:14<8:07:02, 3.78s/it] {'loss': 0.1055, 'grad_norm': 0.9146671208526945, 'learning_rate': 1.2735154810157458e-06, 'epoch': 0.77} 77%|███████▋ | 26554/34278 [29:09:14<8:07:02, 3.78s/it] 77%|███████▋ | 26555/34278 [29:09:20<9:31:42, 4.44s/it] {'loss': 0.13, 'grad_norm': 0.7962582769405433, 'learning_rate': 1.2732005103387756e-06, 'epoch': 0.77} 77%|███████▋ | 26555/34278 [29:09:20<9:31:42, 4.44s/it] 77%|███████▋ | 26556/34278 [29:09:24<9:10:14, 4.28s/it] {'loss': 0.1424, 'grad_norm': 1.033888721443246, 'learning_rate': 1.2728855729337802e-06, 'epoch': 0.77} 77%|███████▋ | 26556/34278 [29:09:24<9:10:14, 4.28s/it] 77%|███████▋ | 26557/34278 [29:09:27<8:08:51, 3.80s/it] {'loss': 0.0905, 'grad_norm': 0.7861583172360689, 'learning_rate': 1.2725706688035728e-06, 'epoch': 0.77} 77%|███████▋ | 26557/34278 [29:09:27<8:08:51, 3.80s/it] 77%|███████▋ | 26558/34278 [29:09:30<7:26:48, 3.47s/it] {'loss': 0.1219, 'grad_norm': 0.8478843797782293, 'learning_rate': 1.2722557979509664e-06, 'epoch': 0.77} 77%|███████▋ | 26558/34278 [29:09:30<7:26:48, 3.47s/it] 77%|███████▋ | 26559/34278 [29:09:34<7:54:51, 3.69s/it] {'loss': 0.1055, 'grad_norm': 0.7467407285651229, 'learning_rate': 1.2719409603787696e-06, 'epoch': 0.77} 77%|███████▋ | 26559/34278 [29:09:34<7:54:51, 3.69s/it] 77%|███████▋ | 26560/34278 [29:09:40<9:25:30, 4.40s/it] {'loss': 0.1237, 'grad_norm': 0.9988871413293041, 'learning_rate': 1.271626156089793e-06, 'epoch': 0.77} 77%|███████▋ | 26560/34278 [29:09:40<9:25:30, 4.40s/it] 77%|███████▋ | 26561/34278 [29:09:43<8:30:21, 3.97s/it] {'loss': 0.1093, 'grad_norm': 0.7728425459962185, 'learning_rate': 1.2713113850868492e-06, 'epoch': 0.77} 
77%|███████▋ | 26561/34278 [29:09:43<8:30:21, 3.97s/it] 77%|███████▋ | 26562/34278 [29:09:46<7:53:35, 3.68s/it] {'loss': 0.1348, 'grad_norm': 0.7353651798996653, 'learning_rate': 1.2709966473727474e-06, 'epoch': 0.77} 77%|███████▋ | 26562/34278 [29:09:46<7:53:35, 3.68s/it] 77%|███████▋ | 26563/34278 [29:09:49<7:20:49, 3.43s/it] {'loss': 0.112, 'grad_norm': 0.6751947559540624, 'learning_rate': 1.270681942950296e-06, 'epoch': 0.77} 77%|███████▋ | 26563/34278 [29:09:49<7:20:49, 3.43s/it] 77%|███████▋ | 26564/34278 [29:09:51<6:56:10, 3.24s/it] {'loss': 0.1178, 'grad_norm': 0.9657854387208041, 'learning_rate': 1.2703672718223058e-06, 'epoch': 0.77} 77%|███████▋ | 26564/34278 [29:09:51<6:56:10, 3.24s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8a43850950>
Failed to fetch sample 3087946.
Exception: cannot identify image file <_io.BytesIO object at 0x7f8a43850950>
77%|███████▋ | 26565/34278 [29:09:55<7:11:56, 3.36s/it] {'loss': 0.1035, 'grad_norm': 0.7443473108788434, 'learning_rate': 1.2700526339915875e-06, 'epoch': 0.77} 77%|███████▋ | 26565/34278 [29:09:55<7:11:56, 3.36s/it] 78%|███████▊ | 26566/34278 [29:09:58<7:01:26, 3.28s/it] {'loss': 0.1287, 'grad_norm': 0.8117538167786532, 'learning_rate': 1.2697380294609495e-06, 'epoch': 0.78} 78%|███████▊ | 26566/34278 [29:09:58<7:01:26, 3.28s/it] 78%|███████▊ | 26567/34278 [29:10:02<7:14:59, 3.38s/it] {'loss': 0.1173, 'grad_norm': 0.9204762012155704, 'learning_rate': 1.2694234582331982e-06, 'epoch': 0.78} 78%|███████▊ | 26567/34278 [29:10:02<7:14:59, 3.38s/it] 78%|███████▊ | 26568/34278 [29:10:06<7:36:56, 3.56s/it] {'loss': 0.108, 'grad_norm': 0.810251947875881, 'learning_rate': 1.2691089203111444e-06, 'epoch': 0.78} 78%|███████▊ | 26568/34278 [29:10:06<7:36:56, 3.56s/it] 78%|███████▊ | 26569/34278 [29:10:11<8:35:32, 4.01s/it] {'loss': 0.1341, 'grad_norm': 0.7534612732531387, 'learning_rate': 1.2687944156975952e-06, 'epoch': 0.78} 78%|███████▊ | 26569/34278 [29:10:11<8:35:32, 4.01s/it] 78%|███████▊ | 26570/34278 [29:10:15<8:21:31, 3.90s/it] {'loss': 0.1113, 'grad_norm': 0.9639774981600991, 'learning_rate': 1.2684799443953582e-06, 'epoch': 0.78} 78%|███████▊ | 26570/34278 [29:10:15<8:21:31, 3.90s/it] 78%|███████▊ | 26571/34278 [29:10:18<7:47:04, 3.64s/it] {'loss': 0.1056, 'grad_norm': 0.7398740880920021, 'learning_rate': 1.2681655064072429e-06, 'epoch': 0.78} 78%|███████▊ | 26571/34278 [29:10:18<7:47:04, 3.64s/it] 78%|███████▊ | 26572/34278 [29:10:22<8:21:06, 3.90s/it] {'loss': 0.1137, 'grad_norm': 0.637081908395961, 'learning_rate': 1.267851101736055e-06, 'epoch': 0.78} 78%|███████▊ | 26572/34278 [29:10:22<8:21:06, 3.90s/it] 78%|███████▊ | 26573/34278 [29:10:25<7:57:41, 3.72s/it] {'loss': 0.1368, 'grad_norm': 0.9941602533204257,
'learning_rate': 1.2675367303846004e-06, 'epoch': 0.78} 78%|███████▊ | 26573/34278 [29:10:25<7:57:41, 3.72s/it] 78%|███████▊ | 26574/34278 [29:10:28<7:29:22, 3.50s/it] {'loss': 0.0949, 'grad_norm': 1.0444638934507875, 'learning_rate': 1.267222392355688e-06, 'epoch': 0.78} 78%|███████▊ | 26574/34278 [29:10:28<7:29:22, 3.50s/it] 78%|███████▊ | 26575/34278 [29:10:33<7:57:56, 3.72s/it] {'loss': 0.133, 'grad_norm': 0.7904267526058851, 'learning_rate': 1.2669080876521217e-06, 'epoch': 0.78} 78%|███████▊ | 26575/34278 [29:10:33<7:57:56, 3.72s/it] 78%|███████▊ | 26576/34278 [29:10:38<9:13:01, 4.31s/it] {'loss': 0.1337, 'grad_norm': 0.8829949795051159, 'learning_rate': 1.2665938162767105e-06, 'epoch': 0.78} 78%|███████▊ | 26576/34278 [29:10:38<9:13:01, 4.31s/it] 78%|███████▊ | 26577/34278 [29:10:43<9:19:45, 4.36s/it] {'loss': 0.1183, 'grad_norm': 0.9142570146776361, 'learning_rate': 1.2662795782322567e-06, 'epoch': 0.78} 78%|███████▊ | 26577/34278 [29:10:43<9:19:45, 4.36s/it] 78%|███████▊ | 26578/34278 [29:10:47<9:16:22, 4.34s/it] {'loss': 0.0951, 'grad_norm': 0.7460793058960168, 'learning_rate': 1.2659653735215687e-06, 'epoch': 0.78} 78%|███████▊ | 26578/34278 [29:10:47<9:16:22, 4.34s/it] 78%|███████▊ | 26579/34278 [29:10:51<8:47:34, 4.11s/it] {'loss': 0.1152, 'grad_norm': 0.8705738509957747, 'learning_rate': 1.2656512021474509e-06, 'epoch': 0.78} 78%|███████▊ | 26579/34278 [29:10:51<8:47:34, 4.11s/it] 78%|███████▊ | 26580/34278 [29:10:54<8:05:43, 3.79s/it] {'loss': 0.0934, 'grad_norm': 0.8404481239550062, 'learning_rate': 1.2653370641127066e-06, 'epoch': 0.78} 78%|███████▊ | 26580/34278 [29:10:54<8:05:43, 3.79s/it] 78%|███████▊ | 26581/34278 [29:10:57<7:38:25, 3.57s/it] {'loss': 0.1379, 'grad_norm': 0.8810257364826629, 'learning_rate': 1.2650229594201408e-06, 'epoch': 0.78} 78%|███████▊ | 26581/34278 [29:10:57<7:38:25, 3.57s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ 
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4d856a3740>
Failed to fetch sample 2898119.
Exception: cannot identify image file <_io.BytesIO object at 0x7f4d856a3740> 78%|███████▊ | 26582/34278 [29:11:00<7:25:28, 3.47s/it] {'loss': 0.1033, 'grad_norm': 0.6820729342789238, 'learning_rate': 1.26470888807256e-06, 'epoch': 0.78} 78%|███████▊ | 26582/34278 [29:11:00<7:25:28, 3.47s/it] 78%|███████▊ | 26583/34278 [29:11:04<7:31:07, 3.52s/it] {'loss': 0.1401, 'grad_norm': 0.7775126625130048, 'learning_rate': 1.2643948500727666e-06, 'epoch': 0.78} 78%|███████▊ | 26583/34278 [29:11:04<7:31:07, 3.52s/it] 78%|███████▊ | 26584/34278 [29:11:07<7:14:28, 3.39s/it] {'loss': 0.1061, 'grad_norm': 0.8362570852191665, 'learning_rate': 1.264080845423563e-06, 'epoch': 0.78} 78%|███████▊ | 26584/34278 [29:11:07<7:14:28, 3.39s/it] 78%|███████▊ | 26585/34278 [29:11:10<7:02:29, 3.30s/it] {'loss': 0.1267, 'grad_norm': 0.8341827908323795, 'learning_rate': 1.2637668741277548e-06, 'epoch': 0.78} 78%|███████▊ | 26585/34278 [29:11:10<7:02:29, 3.30s/it] 78%|███████▊ | 26586/34278 [29:11:14<7:56:33, 3.72s/it] {'loss': 0.1141, 'grad_norm': 0.8924927693673023, 'learning_rate': 1.2634529361881442e-06, 'epoch': 0.78} 78%|███████▊ | 26586/34278 [29:11:14<7:56:33, 3.72s/it] 78%|███████▊ | 26587/34278 [29:11:19<8:41:25, 4.07s/it] {'loss': 0.1138, 'grad_norm': 0.8432344529903103, 'learning_rate': 1.2631390316075315e-06, 'epoch': 0.78} 78%|███████▊ | 26587/34278 [29:11:19<8:41:25, 4.07s/it] 78%|███████▊ | 26588/34278 [29:11:25<10:00:17, 4.68s/it] {'loss': 0.1188, 'grad_norm': 0.8962107175638608, 'learning_rate': 1.2628251603887238e-06, 'epoch': 0.78} 78%|███████▊ | 26588/34278 [29:11:25<10:00:17, 4.68s/it] 78%|███████▊ | 26589/34278 [29:11:28<8:46:44, 4.11s/it] {'loss': 0.0966, 'grad_norm': 0.8872274234974421, 'learning_rate': 1.262511322534521e-06, 'epoch': 0.78} 78%|███████▊ | 26589/34278 [29:11:28<8:46:44, 4.11s/it] 78%|███████▊ | 26590/34278 [29:11:31<8:10:34, 3.83s/it] {'loss': 0.1208, 'grad_norm': 0.7627003562541219, 'learning_rate': 1.262197518047723e-06, 'epoch': 0.78} 78%|███████▊ | 
26590/34278 [29:11:31<8:10:34, 3.83s/it] 78%|███████▊ | 26591/34278 [29:11:37<9:23:49, 4.40s/it] {'loss': 0.1341, 'grad_norm': 0.7761094661369075, 'learning_rate': 1.2618837469311351e-06, 'epoch': 0.78} 78%|███████▊ | 26591/34278 [29:11:37<9:23:49, 4.40s/it] 78%|███████▊ | 26592/34278 [29:11:41<8:54:59, 4.18s/it] {'loss': 0.1218, 'grad_norm': 0.8623604643579398, 'learning_rate': 1.261570009187557e-06, 'epoch': 0.78} 78%|███████▊ | 26592/34278 [29:11:41<8:54:59, 4.18s/it] 78%|███████▊ | 26593/34278 [29:11:45<8:38:32, 4.05s/it] {'loss': 0.098, 'grad_norm': 0.8593192080900649, 'learning_rate': 1.261256304819788e-06, 'epoch': 0.78} 78%|███████▊ | 26593/34278 [29:11:45<8:38:32, 4.05s/it] 78%|███████▊ | 26594/34278 [29:11:48<8:04:16, 3.78s/it] {'loss': 0.1133, 'grad_norm': 0.6704118061247689, 'learning_rate': 1.2609426338306296e-06, 'epoch': 0.78} 78%|███████▊ | 26594/34278 [29:11:48<8:04:16, 3.78s/it] 78%|███████▊ | 26595/34278 [29:11:51<7:30:00, 3.51s/it] {'loss': 0.1456, 'grad_norm': 0.8922664771139991, 'learning_rate': 1.2606289962228846e-06, 'epoch': 0.78} 78%|███████▊ | 26595/34278 [29:11:51<7:30:00, 3.51s/it] 78%|███████▊ | 26596/34278 [29:11:54<7:19:59, 3.44s/it] {'loss': 0.1013, 'grad_norm': 0.6804087787177441, 'learning_rate': 1.2603153919993516e-06, 'epoch': 0.78} 78%|███████▊ | 26596/34278 [29:11:54<7:19:59, 3.44s/it] 78%|███████▊ | 26597/34278 [29:11:57<7:02:31, 3.30s/it] {'loss': 0.1104, 'grad_norm': 0.7390251390423266, 'learning_rate': 1.2600018211628278e-06, 'epoch': 0.78} 78%|███████▊ | 26597/34278 [29:11:57<7:02:31, 3.30s/it] 78%|███████▊ | 26598/34278 [29:12:03<8:39:34, 4.06s/it] {'loss': 0.1145, 'grad_norm': 1.2288790597249433, 'learning_rate': 1.2596882837161174e-06, 'epoch': 0.78} 78%|███████▊ | 26598/34278 [29:12:03<8:39:34, 4.06s/it] 78%|███████▊ | 26599/34278 [29:12:06<8:28:29, 3.97s/it] {'loss': 0.122, 'grad_norm': 1.1180371656432018, 'learning_rate': 1.2593747796620148e-06, 'epoch': 0.78} 78%|███████▊ | 26599/34278 [29:12:06<8:28:29, 3.97s/it] 
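Each `UnidentifiedImageError` traceback above is followed by a "Failed to fetch sample N" message while the step counter keeps advancing, which implies the dataset's `__getitem__` catches the exception and substitutes a different sample instead of crashing the run. A minimal sketch of that recovery pattern (the `RetryingDataset` name, the `None`-as-corrupt stand-in, and the fall-back-to-next-index policy are illustrative assumptions, not the actual `dataset.py` code):

```python
class RetryingDataset:
    """Map-style dataset wrapper that skips samples whose decode step raises."""

    def __init__(self, items, max_retries=3):
        self.items = items
        self.max_retries = max_retries

    def _get_item(self, i):
        # Stand-in for the real fetch/decode path; None models an
        # undecodable image blob that would make PIL raise.
        item = self.items[i]
        if item is None:
            raise ValueError(f"cannot identify image file at index {i}")
        return item

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                # Mirrors the log's "Failed to fetch sample N. Exception: ..." line.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.items)  # deterministic fallback index
        raise RuntimeError("too many consecutive undecodable samples")

    def __len__(self):
        return len(self.items)
```

With this policy a handful of corrupt blobs only costs the occasional substituted sample, which matches the log: losses keep being reported at every step despite the errors.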
78%|███████▊ | 26600/34278 [29:12:09<7:48:34, 3.66s/it] {'loss': 0.1047, 'grad_norm': 0.7891802008532125, 'learning_rate': 1.2590613090033215e-06, 'epoch': 0.78} 78%|███████▊ | 26600/34278 [29:12:09<7:48:34, 3.66s/it] 78%|███████▊ | 26601/34278 [29:12:13<8:05:25, 3.79s/it] {'loss': 0.1241, 'grad_norm': 0.8002358332725346, 'learning_rate': 1.2587478717428375e-06, 'epoch': 0.78} 78%|███████▊ | 26601/34278 [29:12:13<8:05:25, 3.79s/it] 78%|███████▊ | 26602/34278 [29:12:16<7:32:27, 3.54s/it] {'loss': 0.1268, 'grad_norm': 0.9576015250321538, 'learning_rate': 1.2584344678833587e-06, 'epoch': 0.78} 78%|███████▊ | 26602/34278 [29:12:16<7:32:27, 3.54s/it] 78%|███████▊ | 26603/34278 [29:12:20<7:19:46, 3.44s/it] {'loss': 0.105, 'grad_norm': 0.8548690248052143, 'learning_rate': 1.258121097427683e-06, 'epoch': 0.78} 78%|███████▊ | 26603/34278 [29:12:20<7:19:46, 3.44s/it] 78%|███████▊ | 26604/34278 [29:12:23<7:29:51, 3.52s/it] {'loss': 0.1066, 'grad_norm': 0.8483129380130555, 'learning_rate': 1.2578077603786104e-06, 'epoch': 0.78} 78%|███████▊ | 26604/34278 [29:12:23<7:29:51, 3.52s/it] 78%|███████▊ | 26605/34278 [29:12:27<7:22:52, 3.46s/it] {'loss': 0.1042, 'grad_norm': 0.8726933690240404, 'learning_rate': 1.2574944567389346e-06, 'epoch': 0.78} 78%|███████▊ | 26605/34278 [29:12:27<7:22:52, 3.46s/it] 78%|███████▊ | 26606/34278 [29:12:30<7:25:07, 3.48s/it] {'loss': 0.1302, 'grad_norm': 0.8512247353198941, 'learning_rate': 1.2571811865114569e-06, 'epoch': 0.78} 78%|███████▊ | 26606/34278 [29:12:30<7:25:07, 3.48s/it] 78%|███████▊ | 26607/34278 [29:12:34<7:22:42, 3.46s/it] {'loss': 0.1083, 'grad_norm': 0.8286348752057278, 'learning_rate': 1.2568679496989706e-06, 'epoch': 0.78} 78%|███████▊ | 26607/34278 [29:12:34<7:22:42, 3.46s/it] 78%|███████▊ | 26608/34278 [29:12:37<7:12:13, 3.38s/it] {'loss': 0.1131, 'grad_norm': 0.901537042493502, 'learning_rate': 1.2565547463042753e-06, 'epoch': 0.78} 78%|███████▊ | 26608/34278 [29:12:37<7:12:13, 3.38s/it] 78%|███████▊ | 26609/34278 
[29:12:40<7:23:01, 3.47s/it] {'loss': 0.1285, 'grad_norm': 0.8679979136696329, 'learning_rate': 1.2562415763301656e-06, 'epoch': 0.78} 78%|███████▊ | 26609/34278 [29:12:40<7:23:01, 3.47s/it] 78%|███████▊ | 26610/34278 [29:12:43<6:57:30, 3.27s/it] {'loss': 0.091, 'grad_norm': 0.7955828353613565, 'learning_rate': 1.2559284397794353e-06, 'epoch': 0.78} 78%|███████▊ | 26610/34278 [29:12:43<6:57:30, 3.27s/it] 78%|███████▊ | 26611/34278 [29:12:47<6:59:17, 3.28s/it] {'loss': 0.1016, 'grad_norm': 0.6771791765096367, 'learning_rate': 1.2556153366548823e-06, 'epoch': 0.78} 78%|███████▊ | 26611/34278 [29:12:47<6:59:17, 3.28s/it] 78%|███████▊ | 26612/34278 [29:12:50<7:07:38, 3.35s/it] {'loss': 0.1142, 'grad_norm': 0.8774287468611829, 'learning_rate': 1.2553022669593034e-06, 'epoch': 0.78} 78%|███████▊ | 26612/34278 [29:12:50<7:07:38, 3.35s/it] 78%|███████▊ | 26613/34278 [29:12:53<6:43:12, 3.16s/it] {'loss': 0.1263, 'grad_norm': 0.969730479636771, 'learning_rate': 1.254989230695492e-06, 'epoch': 0.78} 78%|███████▊ | 26613/34278 [29:12:53<6:43:12, 3.16s/it] 78%|███████▊ | 26614/34278 [29:12:56<6:50:23, 3.21s/it] {'loss': 0.1198, 'grad_norm': 0.7443136930296058, 'learning_rate': 1.2546762278662412e-06, 'epoch': 0.78} 78%|███████▊ | 26614/34278 [29:12:56<6:50:23, 3.21s/it] 78%|███████▊ | 26615/34278 [29:13:02<8:26:08, 3.96s/it] {'loss': 0.1228, 'grad_norm': 0.7589716149465291, 'learning_rate': 1.2543632584743488e-06, 'epoch': 0.78} 78%|███████▊ | 26615/34278 [29:13:02<8:26:08, 3.96s/it] 78%|███████▊ | 26616/34278 [29:13:05<8:04:47, 3.80s/it] {'loss': 0.1348, 'grad_norm': 0.7364776568685283, 'learning_rate': 1.2540503225226064e-06, 'epoch': 0.78} 78%|███████▊ | 26616/34278 [29:13:05<8:04:47, 3.80s/it] 78%|███████▊ | 26617/34278 [29:13:08<7:39:42, 3.60s/it] {'loss': 0.1159, 'grad_norm': 0.8164915778874572, 'learning_rate': 1.2537374200138058e-06, 'epoch': 0.78} 78%|███████▊ | 26617/34278 [29:13:08<7:39:42, 3.60s/it] 78%|███████▊ | 26618/34278 [29:13:14<9:10:32, 4.31s/it] {'loss': 
0.1148, 'grad_norm': 0.7452086785688905, 'learning_rate': 1.2534245509507465e-06, 'epoch': 0.78} 78%|███████▊ | 26618/34278 [29:13:14<9:10:32, 4.31s/it] 78%|███████▊ | 26619/34278 [29:13:18<8:49:07, 4.15s/it] {'loss': 0.1223, 'grad_norm': 0.8122767630242331, 'learning_rate': 1.2531117153362176e-06, 'epoch': 0.78} 78%|███████▊ | 26619/34278 [29:13:18<8:49:07, 4.15s/it] 78%|███████▊ | 26620/34278 [29:13:22<8:21:24, 3.93s/it] {'loss': 0.1036, 'grad_norm': 0.9413088095715116, 'learning_rate': 1.2527989131730123e-06, 'epoch': 0.78} 78%|███████▊ | 26620/34278 [29:13:22<8:21:24, 3.93s/it] 78%|███████▊ | 26621/34278 [29:13:26<8:52:39, 4.17s/it] {'loss': 0.1077, 'grad_norm': 0.8623245561426485, 'learning_rate': 1.2524861444639246e-06, 'epoch': 0.78} 78%|███████▊ | 26621/34278 [29:13:26<8:52:39, 4.17s/it] 78%|███████▊ | 26622/34278 [29:13:31<9:22:26, 4.41s/it] {'loss': 0.1218, 'grad_norm': 0.825163178735011, 'learning_rate': 1.2521734092117466e-06, 'epoch': 0.78} 78%|███████▊ | 26622/34278 [29:13:31<9:22:26, 4.41s/it] 78%|███████▊ | 26623/34278 [29:13:34<8:32:08, 4.01s/it] {'loss': 0.1176, 'grad_norm': 0.8185792339260545, 'learning_rate': 1.2518607074192679e-06, 'epoch': 0.78} 78%|███████▊ | 26623/34278 [29:13:34<8:32:08, 4.01s/it] 78%|███████▊ | 26624/34278 [29:13:38<8:08:51, 3.83s/it] {'loss': 0.1182, 'grad_norm': 0.7985641157458386, 'learning_rate': 1.251548039089282e-06, 'epoch': 0.78} 78%|███████▊ | 26624/34278 [29:13:38<8:08:51, 3.83s/it] 78%|███████▊ | 26625/34278 [29:13:41<7:52:58, 3.71s/it] {'loss': 0.1076, 'grad_norm': 1.024651304113618, 'learning_rate': 1.2512354042245818e-06, 'epoch': 0.78} 78%|███████▊ | 26625/34278 [29:13:41<7:52:58, 3.71s/it] 78%|███████▊ | 26626/34278 [29:13:45<8:16:27, 3.89s/it] {'loss': 0.1297, 'grad_norm': 0.8758307280649539, 'learning_rate': 1.2509228028279568e-06, 'epoch': 0.78} 78%|███████▊ | 26626/34278 [29:13:45<8:16:27, 3.89s/it] 78%|███████▊ | 26627/34278 [29:13:50<8:44:34, 4.11s/it] {'loss': 0.1407, 'grad_norm': 0.9491621859658749, 
'learning_rate': 1.250610234902197e-06, 'epoch': 0.78} 78%|███████▊ | 26627/34278 [29:13:50<8:44:34, 4.11s/it] 78%|███████▊ | 26628/34278 [29:13:53<8:10:27, 3.85s/it] {'loss': 0.105, 'grad_norm': 1.2129012666247923, 'learning_rate': 1.2502977004500956e-06, 'epoch': 0.78} 78%|███████▊ | 26628/34278 [29:13:53<8:10:27, 3.85s/it] 78%|███████▊ | 26629/34278 [29:13:56<7:35:55, 3.58s/it] {'loss': 0.1151, 'grad_norm': 0.8308476317112877, 'learning_rate': 1.2499851994744393e-06, 'epoch': 0.78} 78%|███████▊ | 26629/34278 [29:13:56<7:35:55, 3.58s/it] 78%|███████▊ | 26630/34278 [29:14:00<7:27:46, 3.51s/it] {'loss': 0.1186, 'grad_norm': 0.99605626968484, 'learning_rate': 1.24967273197802e-06, 'epoch': 0.78} 78%|███████▊ | 26630/34278 [29:14:00<7:27:46, 3.51s/it] 78%|███████▊ | 26631/34278 [29:14:06<9:09:14, 4.31s/it] {'loss': 0.113, 'grad_norm': 0.9990949850093903, 'learning_rate': 1.2493602979636289e-06, 'epoch': 0.78} 78%|███████▊ | 26631/34278 [29:14:06<9:09:14, 4.31s/it] 78%|███████▊ | 26632/34278 [29:14:09<8:11:34, 3.86s/it] {'loss': 0.1174, 'grad_norm': 0.943995982225375, 'learning_rate': 1.2490478974340536e-06, 'epoch': 0.78} 78%|███████▊ | 26632/34278 [29:14:09<8:11:34, 3.86s/it] 78%|███████▊ | 26633/34278 [29:14:13<8:50:59, 4.17s/it] {'loss': 0.1186, 'grad_norm': 0.8212802373601578, 'learning_rate': 1.2487355303920817e-06, 'epoch': 0.78} 78%|███████▊ | 26633/34278 [29:14:14<8:50:59, 4.17s/it] 78%|███████▊ | 26634/34278 [29:14:17<8:19:41, 3.92s/it] {'loss': 0.1618, 'grad_norm': 0.9452085012954069, 'learning_rate': 1.2484231968405053e-06, 'epoch': 0.78} 78%|███████▊ | 26634/34278 [29:14:17<8:19:41, 3.92s/it] 78%|███████▊ | 26635/34278 [29:14:21<8:16:19, 3.90s/it] {'loss': 0.1225, 'grad_norm': 1.0732195217586151, 'learning_rate': 1.2481108967821092e-06, 'epoch': 0.78} 78%|███████▊ | 26635/34278 [29:14:21<8:16:19, 3.90s/it] 78%|███████▊ | 26636/34278 [29:14:24<7:50:34, 3.69s/it] {'loss': 0.0999, 'grad_norm': 1.1001290613704633, 'learning_rate': 1.2477986302196848e-06, 
'epoch': 0.78} 78%|███████▊ | 26636/34278 [29:14:24<7:50:34, 3.69s/it] 78%|███████▊ | 26637/34278 [29:14:30<9:17:49, 4.38s/it] {'loss': 0.1289, 'grad_norm': 0.8535470565199292, 'learning_rate': 1.2474863971560176e-06, 'epoch': 0.78} 78%|███████▊ | 26637/34278 [29:14:30<9:17:49, 4.38s/it] 78%|███████▊ | 26638/34278 [29:14:34<9:06:33, 4.29s/it] {'loss': 0.1245, 'grad_norm': 1.0052434803438746, 'learning_rate': 1.2471741975938971e-06, 'epoch': 0.78} 78%|███████▊ | 26638/34278 [29:14:34<9:06:33, 4.29s/it] 78%|███████▊ | 26639/34278 [29:14:37<8:05:27, 3.81s/it] {'loss': 0.1126, 'grad_norm': 0.9707925426337226, 'learning_rate': 1.2468620315361097e-06, 'epoch': 0.78} 78%|███████▊ | 26639/34278 [29:14:37<8:05:27, 3.81s/it] 78%|███████▊ | 26640/34278 [29:14:40<7:58:03, 3.76s/it] {'loss': 0.0935, 'grad_norm': 1.0776174456302299, 'learning_rate': 1.2465498989854403e-06, 'epoch': 0.78} 78%|███████▊ | 26640/34278 [29:14:40<7:58:03, 3.76s/it] 78%|███████▊ | 26641/34278 [29:14:44<8:05:27, 3.81s/it] {'loss': 0.1149, 'grad_norm': 0.849047428327794, 'learning_rate': 1.2462377999446772e-06, 'epoch': 0.78} 78%|███████▊ | 26641/34278 [29:14:44<8:05:27, 3.81s/it] 78%|███████▊ | 26642/34278 [29:14:48<8:19:57, 3.93s/it] {'loss': 0.1347, 'grad_norm': 0.880854927069741, 'learning_rate': 1.2459257344166093e-06, 'epoch': 0.78} 78%|███████▊ | 26642/34278 [29:14:48<8:19:57, 3.93s/it] 78%|███████▊ | 26643/34278 [29:14:52<8:23:30, 3.96s/it] {'loss': 0.0963, 'grad_norm': 0.7519132175018172, 'learning_rate': 1.2456137024040194e-06, 'epoch': 0.78} 78%|███████▊ | 26643/34278 [29:14:52<8:23:30, 3.96s/it] 78%|███████▊ | 26644/34278 [29:14:56<8:03:15, 3.80s/it] {'loss': 0.1205, 'grad_norm': 1.1795045369126658, 'learning_rate': 1.2453017039096932e-06, 'epoch': 0.78} 78%|███████▊ | 26644/34278 [29:14:56<8:03:15, 3.80s/it] 78%|███████▊ | 26645/34278 [29:14:59<7:35:47, 3.58s/it] {'loss': 0.1335, 'grad_norm': 1.0666707961461077, 'learning_rate': 1.244989738936418e-06, 'epoch': 0.78} 78%|███████▊ | 
26645/34278 [29:14:59<7:35:47, 3.58s/it] 78%|███████▊ | 26646/34278 [29:15:02<7:33:34, 3.57s/it] {'loss': 0.1022, 'grad_norm': 0.7465684775842973, 'learning_rate': 1.2446778074869787e-06, 'epoch': 0.78} 78%|███████▊ | 26646/34278 [29:15:03<7:33:34, 3.57s/it] 78%|███████▊ | 26647/34278 [29:15:06<7:25:40, 3.50s/it] {'loss': 0.1219, 'grad_norm': 0.9245849369293081, 'learning_rate': 1.244365909564156e-06, 'epoch': 0.78} 78%|███████▊ | 26647/34278 [29:15:06<7:25:40, 3.50s/it] 78%|███████▊ | 26648/34278 [29:15:09<7:02:38, 3.32s/it] {'loss': 0.1095, 'grad_norm': 0.8194364920016455, 'learning_rate': 1.2440540451707412e-06, 'epoch': 0.78} 78%|███████▊ | 26648/34278 [29:15:09<7:02:38, 3.32s/it] 78%|███████▊ | 26649/34278 [29:15:13<7:18:57, 3.45s/it] {'loss': 0.1159, 'grad_norm': 1.4730538792742582, 'learning_rate': 1.2437422143095146e-06, 'epoch': 0.78} 78%|███████▊ | 26649/34278 [29:15:13<7:18:57, 3.45s/it] 78%|███████▊ | 26650/34278 [29:15:16<7:29:11, 3.53s/it] {'loss': 0.1004, 'grad_norm': 0.9654301351932375, 'learning_rate': 1.24343041698326e-06, 'epoch': 0.78} 78%|███████▊ | 26650/34278 [29:15:16<7:29:11, 3.53s/it] 78%|███████▊ | 26651/34278 [29:15:22<9:06:50, 4.30s/it] {'loss': 0.1005, 'grad_norm': 1.0609500260812186, 'learning_rate': 1.2431186531947632e-06, 'epoch': 0.78} 78%|███████▊ | 26651/34278 [29:15:22<9:06:50, 4.30s/it] 78%|███████▊ | 26652/34278 [29:15:25<8:20:05, 3.93s/it] {'loss': 0.1039, 'grad_norm': 0.9494782063941555, 'learning_rate': 1.2428069229468065e-06, 'epoch': 0.78} 78%|███████▊ | 26652/34278 [29:15:25<8:20:05, 3.93s/it] 78%|███████▊ | 26653/34278 [29:15:30<8:30:18, 4.02s/it] {'loss': 0.0927, 'grad_norm': 0.6299200428661222, 'learning_rate': 1.2424952262421708e-06, 'epoch': 0.78} 78%|███████▊ | 26653/34278 [29:15:30<8:30:18, 4.02s/it] 78%|███████▊ | 26654/34278 [29:15:33<7:56:02, 3.75s/it] {'loss': 0.1024, 'grad_norm': 0.9646335130164041, 'learning_rate': 1.242183563083641e-06, 'epoch': 0.78} 78%|███████▊ | 26654/34278 [29:15:33<7:56:02, 3.75s/it] 
78%|███████▊ | 26655/34278 [29:15:36<7:29:05, 3.53s/it] {'loss': 0.1122, 'grad_norm': 0.8931292920192913, 'learning_rate': 1.2418719334740003e-06, 'epoch': 0.78} 78%|███████▊ | 26655/34278 [29:15:36<7:29:05, 3.53s/it] 78%|███████▊ | 26656/34278 [29:15:39<7:08:11, 3.37s/it] {'loss': 0.1009, 'grad_norm': 1.0655031520186276, 'learning_rate': 1.24156033741603e-06, 'epoch': 0.78} 78%|███████▊ | 26656/34278 [29:15:39<7:08:11, 3.37s/it] 78%|███████▊ | 26657/34278 [29:15:42<6:49:20, 3.22s/it] {'loss': 0.1024, 'grad_norm': 1.021857634597553, 'learning_rate': 1.2412487749125107e-06, 'epoch': 0.78} 78%|███████▊ | 26657/34278 [29:15:42<6:49:20, 3.22s/it] 78%|███████▊ | 26658/34278 [29:15:45<6:57:49, 3.29s/it] {'loss': 0.124, 'grad_norm': 0.8575078998222521, 'learning_rate': 1.240937245966226e-06, 'epoch': 0.78} 78%|███████▊ | 26658/34278 [29:15:45<6:57:49, 3.29s/it] 78%|███████▊ | 26659/34278 [29:15:48<6:45:57, 3.20s/it] {'loss': 0.1243, 'grad_norm': 0.9227808571683745, 'learning_rate': 1.2406257505799553e-06, 'epoch': 0.78} 78%|███████▊ | 26659/34278 [29:15:48<6:45:57, 3.20s/it] 78%|███████▊ | 26660/34278 [29:15:51<6:52:28, 3.25s/it] {'loss': 0.1085, 'grad_norm': 1.116672167410054, 'learning_rate': 1.24031428875648e-06, 'epoch': 0.78} 78%|███████▊ | 26660/34278 [29:15:51<6:52:28, 3.25s/it] 78%|███████▊ | 26661/34278 [29:15:55<7:14:37, 3.42s/it] {'loss': 0.1331, 'grad_norm': 0.9178566782027545, 'learning_rate': 1.240002860498583e-06, 'epoch': 0.78} 78%|███████▊ | 26661/34278 [29:15:55<7:14:37, 3.42s/it] 78%|███████▊ | 26662/34278 [29:15:58<6:57:31, 3.29s/it] {'loss': 0.103, 'grad_norm': 0.8968965306034115, 'learning_rate': 1.2396914658090425e-06, 'epoch': 0.78} 78%|███████▊ | 26662/34278 [29:15:58<6:57:31, 3.29s/it] 78%|███████▊ | 26663/34278 [29:16:01<6:41:58, 3.17s/it] {'loss': 0.106, 'grad_norm': 0.9028466221955137, 'learning_rate': 1.2393801046906378e-06, 'epoch': 0.78} 78%|███████▊ | 26663/34278 [29:16:01<6:41:58, 3.17s/it] 78%|███████▊ | 26664/34278 [29:16:05<6:56:23, 
3.28s/it] {'loss': 0.102, 'grad_norm': 0.8946521624877845, 'learning_rate': 1.2390687771461514e-06, 'epoch': 0.78} 78%|███████▊ | 26664/34278 [29:16:05<6:56:23, 3.28s/it] 78%|███████▊ | 26665/34278 [29:16:10<8:02:26, 3.80s/it] {'loss': 0.1271, 'grad_norm': 0.8069674068638788, 'learning_rate': 1.2387574831783594e-06, 'epoch': 0.78} 78%|███████▊ | 26665/34278 [29:16:10<8:02:26, 3.80s/it] 78%|███████▊ | 26666/34278 [29:16:13<7:34:02, 3.58s/it] {'loss': 0.1321, 'grad_norm': 0.8904724566033538, 'learning_rate': 1.2384462227900446e-06, 'epoch': 0.78} 78%|███████▊ | 26666/34278 [29:16:13<7:34:02, 3.58s/it] 78%|███████▊ | 26667/34278 [29:16:17<7:41:33, 3.64s/it] {'loss': 0.1135, 'grad_norm': 0.8145129592069313, 'learning_rate': 1.2381349959839817e-06, 'epoch': 0.78} 78%|███████▊ | 26667/34278 [29:16:17<7:41:33, 3.64s/it] 78%|███████▊ | 26668/34278 [29:16:20<7:42:42, 3.65s/it] {'loss': 0.1009, 'grad_norm': 0.9192763556875785, 'learning_rate': 1.2378238027629535e-06, 'epoch': 0.78} 78%|███████▊ | 26668/34278 [29:16:20<7:42:42, 3.65s/it] 78%|███████▊ | 26669/34278 [29:16:23<7:27:42, 3.53s/it] {'loss': 0.1238, 'grad_norm': 1.0417557445471464, 'learning_rate': 1.2375126431297363e-06, 'epoch': 0.78} 78%|███████▊ | 26669/34278 [29:16:23<7:27:42, 3.53s/it] 78%|███████▊ | 26670/34278 [29:16:27<7:11:27, 3.40s/it] {'loss': 0.112, 'grad_norm': 0.7742885384288763, 'learning_rate': 1.2372015170871066e-06, 'epoch': 0.78} 78%|███████▊ | 26670/34278 [29:16:27<7:11:27, 3.40s/it] 78%|███████▊ | 26671/34278 [29:16:29<6:40:12, 3.16s/it] {'loss': 0.1024, 'grad_norm': 0.9630538149212657, 'learning_rate': 1.2368904246378433e-06, 'epoch': 0.78} 78%|███████▊ | 26671/34278 [29:16:29<6:40:12, 3.16s/it] 78%|███████▊ | 26672/34278 [29:16:35<8:23:02, 3.97s/it] {'loss': 0.1315, 'grad_norm': 1.1272673744241541, 'learning_rate': 1.2365793657847258e-06, 'epoch': 0.78} 78%|███████▊ | 26672/34278 [29:16:35<8:23:02, 3.97s/it] 78%|███████▊ | 26673/34278 [29:16:38<7:53:53, 3.74s/it] {'loss': 0.1067, 'grad_norm': 
0.9593502375094775, 'learning_rate': 1.2362683405305288e-06, 'epoch': 0.78} 78%|███████▊ | 26673/34278 [29:16:38<7:53:53, 3.74s/it] 78%|███████▊ | 26674/34278 [29:16:41<7:25:45, 3.52s/it] {'loss': 0.1238, 'grad_norm': 1.250199771377887, 'learning_rate': 1.2359573488780286e-06, 'epoch': 0.78} 78%|███████▊ | 26674/34278 [29:16:41<7:25:45, 3.52s/it] 78%|███████▊ | 26675/34278 [29:16:45<7:23:01, 3.50s/it] {'loss': 0.119, 'grad_norm': 1.1218438201352507, 'learning_rate': 1.2356463908300038e-06, 'epoch': 0.78} 78%|███████▊ | 26675/34278 [29:16:45<7:23:01, 3.50s/it] 78%|███████▊ | 26676/34278 [29:16:48<7:10:05, 3.39s/it] {'loss': 0.1054, 'grad_norm': 1.0016220520139367, 'learning_rate': 1.2353354663892292e-06, 'epoch': 0.78} 78%|███████▊ | 26676/34278 [29:16:48<7:10:05, 3.39s/it] 78%|███████▊ | 26677/34278 [29:16:51<7:11:03, 3.40s/it] {'loss': 0.1153, 'grad_norm': 1.1907167595340427, 'learning_rate': 1.2350245755584784e-06, 'epoch': 0.78} 78%|███████▊ | 26677/34278 [29:16:51<7:11:03, 3.40s/it] 78%|███████▊ | 26678/34278 [29:16:55<7:32:43, 3.57s/it] {'loss': 0.0937, 'grad_norm': 0.7287298020032715, 'learning_rate': 1.2347137183405322e-06, 'epoch': 0.78} 78%|███████▊ | 26678/34278 [29:16:55<7:32:43, 3.57s/it] 78%|███████▊ | 26679/34278 [29:16:59<7:25:29, 3.52s/it] {'loss': 0.1019, 'grad_norm': 0.745480931902787, 'learning_rate': 1.234402894738163e-06, 'epoch': 0.78} 78%|███████▊ | 26679/34278 [29:16:59<7:25:29, 3.52s/it] 78%|███████▊ | 26680/34278 [29:17:02<7:16:45, 3.45s/it] {'loss': 0.1319, 'grad_norm': 1.0994987771951368, 'learning_rate': 1.2340921047541443e-06, 'epoch': 0.78} 78%|███████▊ | 26680/34278 [29:17:02<7:16:45, 3.45s/it] 78%|███████▊ | 26681/34278 [29:17:08<8:51:57, 4.20s/it] {'loss': 0.1156, 'grad_norm': 1.0285183921062595, 'learning_rate': 1.2337813483912537e-06, 'epoch': 0.78} 78%|███████▊ | 26681/34278 [29:17:08<8:51:57, 4.20s/it] 78%|███████▊ | 26682/34278 [29:17:14<9:52:34, 4.68s/it] {'loss': 0.1196, 'grad_norm': 0.8084195105996703, 'learning_rate': 
1.2334706256522645e-06, 'epoch': 0.78} 78%|███████▊ | 26682/34278 [29:17:14<9:52:34, 4.68s/it] 78%|███████▊ | 26683/34278 [29:17:17<9:13:59, 4.38s/it] {'loss': 0.1032, 'grad_norm': 0.9120359785443655, 'learning_rate': 1.2331599365399488e-06, 'epoch': 0.78} 78%|███████▊ | 26683/34278 [29:17:17<9:13:59, 4.38s/it] 78%|███████▊ | 26684/34278 [29:17:21<8:34:44, 4.07s/it] {'loss': 0.1221, 'grad_norm': 0.6817428343949745, 'learning_rate': 1.232849281057082e-06, 'epoch': 0.78} 78%|███████▊ | 26684/34278 [29:17:21<8:34:44, 4.07s/it] 78%|███████▊ | 26685/34278 [29:17:25<8:32:46, 4.05s/it] {'loss': 0.1441, 'grad_norm': 1.0225743706698278, 'learning_rate': 1.2325386592064387e-06, 'epoch': 0.78} 78%|███████▊ | 26685/34278 [29:17:25<8:32:46, 4.05s/it] 78%|███████▊ | 26686/34278 [29:17:31<9:43:26, 4.61s/it] {'loss': 0.1362, 'grad_norm': 1.0009657983376468, 'learning_rate': 1.2322280709907914e-06, 'epoch': 0.78} 78%|███████▊ | 26686/34278 [29:17:31<9:43:26, 4.61s/it] 78%|███████▊ | 26687/34278 [29:17:36<10:02:07, 4.76s/it] {'loss': 0.112, 'grad_norm': 0.7195015225001173, 'learning_rate': 1.2319175164129104e-06, 'epoch': 0.78} 78%|███████▊ | 26687/34278 [29:17:36<10:02:07, 4.76s/it] 78%|███████▊ | 26688/34278 [29:17:39<9:24:14, 4.46s/it] {'loss': 0.1098, 'grad_norm': 1.0028875804361255, 'learning_rate': 1.2316069954755722e-06, 'epoch': 0.78} 78%|███████▊ | 26688/34278 [29:17:39<9:24:14, 4.46s/it] 78%|███████▊ | 26689/34278 [29:17:43<8:32:23, 4.05s/it] {'loss': 0.1161, 'grad_norm': 1.000659052366997, 'learning_rate': 1.2312965081815454e-06, 'epoch': 0.78} 78%|███████▊ | 26689/34278 [29:17:43<8:32:23, 4.05s/it] 78%|███████▊ | 26690/34278 [29:17:47<8:36:40, 4.09s/it] {'loss': 0.1108, 'grad_norm': 0.7514182524851203, 'learning_rate': 1.2309860545336038e-06, 'epoch': 0.78} 78%|███████▊ | 26690/34278 [29:17:47<8:36:40, 4.09s/it] 78%|███████▊ | 26691/34278 [29:17:50<7:58:42, 3.79s/it] {'loss': 0.1346, 'grad_norm': 0.9129436053352685, 'learning_rate': 1.2306756345345206e-06, 'epoch': 0.78} 
78%|███████▊ | 26691/34278 [29:17:50<7:58:42, 3.79s/it] 78%|███████▊ | 26692/34278 [29:17:53<7:22:31, 3.50s/it] {'loss': 0.0991, 'grad_norm': 0.9784802357904911, 'learning_rate': 1.2303652481870654e-06, 'epoch': 0.78} 78%|███████▊ | 26692/34278 [29:17:53<7:22:31, 3.50s/it] 78%|███████▊ | 26693/34278 [29:17:56<7:09:21, 3.40s/it] {'loss': 0.1103, 'grad_norm': 0.8306120165879555, 'learning_rate': 1.2300548954940079e-06, 'epoch': 0.78} 78%|███████▊ | 26693/34278 [29:17:56<7:09:21, 3.40s/it] 78%|███████▊ | 26694/34278 [29:17:59<7:06:38, 3.38s/it] {'loss': 0.1405, 'grad_norm': 0.8554702064026037, 'learning_rate': 1.2297445764581218e-06, 'epoch': 0.78} 78%|███████▊ | 26694/34278 [29:17:59<7:06:38, 3.38s/it] 78%|███████▊ | 26695/34278 [29:18:02<6:55:15, 3.29s/it] {'loss': 0.1205, 'grad_norm': 0.8924164994352132, 'learning_rate': 1.2294342910821743e-06, 'epoch': 0.78} 78%|███████▊ | 26695/34278 [29:18:02<6:55:15, 3.29s/it] 78%|███████▊ | 26696/34278 [29:18:05<6:40:35, 3.17s/it] {'loss': 0.0954, 'grad_norm': 1.0941959692835694, 'learning_rate': 1.2291240393689397e-06, 'epoch': 0.78} 78%|███████▊ | 26696/34278 [29:18:05<6:40:35, 3.17s/it] 78%|███████▊ | 26697/34278 [29:18:09<6:52:10, 3.26s/it] {'loss': 0.1283, 'grad_norm': 0.8278228119703622, 'learning_rate': 1.228813821321183e-06, 'epoch': 0.78} 78%|███████▊ | 26697/34278 [29:18:09<6:52:10, 3.26s/it] 78%|███████▊ | 26698/34278 [29:18:14<8:07:25, 3.86s/it] {'loss': 0.1158, 'grad_norm': 0.7989796401063589, 'learning_rate': 1.2285036369416785e-06, 'epoch': 0.78} 78%|███████▊ | 26698/34278 [29:18:14<8:07:25, 3.86s/it] 78%|███████▊ | 26699/34278 [29:18:17<7:51:17, 3.73s/it] {'loss': 0.1016, 'grad_norm': 0.8799667853157728, 'learning_rate': 1.2281934862331929e-06, 'epoch': 0.78} 78%|███████▊ | 26699/34278 [29:18:17<7:51:17, 3.73s/it] 78%|███████▊ | 26700/34278 [29:18:23<9:15:08, 4.40s/it] {'loss': 0.1212, 'grad_norm': 0.8323693952223855, 'learning_rate': 1.2278833691984938e-06, 'epoch': 0.78} 78%|███████▊ | 26700/34278 
[29:18:23<9:15:08, 4.40s/it] 78%|███████▊ | 26701/34278 [29:18:29<10:21:01, 4.92s/it] {'loss': 0.1268, 'grad_norm': 0.8224931820100664, 'learning_rate': 1.2275732858403516e-06, 'epoch': 0.78} 78%|███████▊ | 26701/34278 [29:18:29<10:21:01, 4.92s/it] 78%|███████▊ | 26702/34278 [29:18:34<9:54:53, 4.71s/it] {'loss': 0.118, 'grad_norm': 1.0095870012942598, 'learning_rate': 1.227263236161536e-06, 'epoch': 0.78} 78%|███████▊ | 26702/34278 [29:18:34<9:54:53, 4.71s/it] 78%|███████▊ | 26703/34278 [29:18:38<9:52:35, 4.69s/it] {'loss': 0.1267, 'grad_norm': 0.9547925257292422, 'learning_rate': 1.2269532201648138e-06, 'epoch': 0.78} 78%|███████▊ | 26703/34278 [29:18:38<9:52:35, 4.69s/it] 78%|███████▊ | 26704/34278 [29:18:41<8:55:25, 4.24s/it] {'loss': 0.1079, 'grad_norm': 0.8814984158576729, 'learning_rate': 1.2266432378529515e-06, 'epoch': 0.78} 78%|███████▊ | 26704/34278 [29:18:41<8:55:25, 4.24s/it] 78%|███████▊ | 26705/34278 [29:18:45<8:36:30, 4.09s/it] {'loss': 0.1031, 'grad_norm': 0.6721525574171244, 'learning_rate': 1.2263332892287183e-06, 'epoch': 0.78} 78%|███████▊ | 26705/34278 [29:18:45<8:36:30, 4.09s/it] 78%|███████▊ | 26706/34278 [29:18:48<7:57:33, 3.78s/it] {'loss': 0.1099, 'grad_norm': 0.8500720477022115, 'learning_rate': 1.2260233742948796e-06, 'epoch': 0.78} 78%|███████▊ | 26706/34278 [29:18:48<7:57:33, 3.78s/it] 78%|███████▊ | 26707/34278 [29:18:51<7:18:06, 3.47s/it] {'loss': 0.0952, 'grad_norm': 0.8083825183320806, 'learning_rate': 1.225713493054203e-06, 'epoch': 0.78} 78%|███████▊ | 26707/34278 [29:18:51<7:18:06, 3.47s/it] 78%|███████▊ | 26708/34278 [29:18:57<9:04:04, 4.31s/it] {'loss': 0.1405, 'grad_norm': 0.8482078588101409, 'learning_rate': 1.225403645509457e-06, 'epoch': 0.78} 78%|███████▊ | 26708/34278 [29:18:57<9:04:04, 4.31s/it] 78%|███████▊ | 26709/34278 [29:19:03<9:44:15, 4.63s/it] {'loss': 0.1131, 'grad_norm': 0.8291330958677605, 'learning_rate': 1.2250938316634058e-06, 'epoch': 0.78} 78%|███████▊ | 26709/34278 [29:19:03<9:44:15, 4.63s/it] 
78%|███████▊ | 26710/34278 [29:19:06<9:01:00, 4.29s/it] {'loss': 0.1108, 'grad_norm': 0.749058899827566, 'learning_rate': 1.2247840515188148e-06, 'epoch': 0.78}
78%|███████▊ | 26711/34278 [29:19:11<9:24:33, 4.48s/it] {'loss': 0.1075, 'grad_norm': 0.6851845760589484, 'learning_rate': 1.224474305078452e-06, 'epoch': 0.78}
78%|███████▊ | 26712/34278 [29:19:17<10:18:38, 4.91s/it] {'loss': 0.117, 'grad_norm': 0.8155896657226269, 'learning_rate': 1.2241645923450795e-06, 'epoch': 0.78}
78%|███████▊ | 26713/34278 [29:19:20<9:11:36, 4.37s/it] {'loss': 0.1254, 'grad_norm': 0.740677544101561, 'learning_rate': 1.2238549133214656e-06, 'epoch': 0.78}
78%|███████▊ | 26714/34278 [29:19:25<9:18:20, 4.43s/it] {'loss': 0.1072, 'grad_norm': 0.8438046602980243, 'learning_rate': 1.2235452680103727e-06, 'epoch': 0.78}
78%|███████▊ | 26715/34278 [29:19:28<8:36:29, 4.10s/it] {'loss': 0.1098, 'grad_norm': 0.8325713259705939, 'learning_rate': 1.2232356564145669e-06, 'epoch': 0.78}
78%|███████▊ | 26716/34278 [29:19:33<8:56:42, 4.26s/it] {'loss': 0.0973, 'grad_norm': 0.9077832510116061, 'learning_rate': 1.222926078536812e-06, 'epoch': 0.78}
78%|███████▊ | 26717/34278 [29:19:35<8:06:31, 3.86s/it] {'loss': 0.1291, 'grad_norm': 0.9130288259711684, 'learning_rate': 1.2226165343798695e-06, 'epoch': 0.78}
78%|███████▊ | 26718/34278 [29:19:38<7:29:09, 3.56s/it] {'loss': 0.116, 'grad_norm': 0.9629574235455058, 'learning_rate': 1.2223070239465056e-06, 'epoch': 0.78}
78%|███████▊ | 26719/34278 [29:19:41<7:08:03, 3.40s/it] {'loss': 0.1058, 'grad_norm': 0.9530366632288055, 'learning_rate': 1.2219975472394835e-06, 'epoch': 0.78}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff4d82dbce0>
Failed to fetch sample 3255714. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4d82dbce0>
78%|███████▊ | 26720/34278 [29:19:44<6:43:22, 3.20s/it] {'loss': 0.1365, 'grad_norm': 0.8913347652453288, 'learning_rate': 1.2216881042615648e-06, 'epoch': 0.78}
78%|███████▊ | 26721/34278 [29:19:47<6:27:04, 3.07s/it] {'loss': 0.0949, 'grad_norm': 0.8646455633136173, 'learning_rate': 1.2213786950155132e-06, 'epoch': 0.78}
78%|███████▊ | 26722/34278 [29:19:50<6:41:06, 3.19s/it] {'loss': 0.1227, 'grad_norm': 0.8838438486528315, 'learning_rate': 1.2210693195040912e-06, 'epoch': 0.78}
78%|███████▊ | 26723/34278 [29:19:54<6:42:33, 3.20s/it] {'loss': 0.127, 'grad_norm': 0.794708762125285, 'learning_rate': 1.2207599777300588e-06, 'epoch': 0.78}
78%|███████▊ | 26724/34278 [29:19:58<7:45:37, 3.70s/it] {'loss': 0.1344, 'grad_norm': 0.8809471127235253, 'learning_rate': 1.2204506696961788e-06, 'epoch': 0.78}
78%|███████▊ | 26725/34278 [29:20:02<7:24:50, 3.53s/it] {'loss': 0.1117, 'grad_norm': 0.7929277943310479, 'learning_rate': 1.220141395405215e-06, 'epoch': 0.78}
78%|███████▊ | 26726/34278 [29:20:07<8:21:16, 3.98s/it] {'loss': 0.0999, 'grad_norm': 0.7012512896546673, 'learning_rate': 1.2198321548599258e-06, 'epoch': 0.78}
78%|███████▊ | 26727/34278 [29:20:12<9:13:49, 4.40s/it] {'loss': 0.1143, 'grad_norm': 0.7876964499431774, 'learning_rate': 1.2195229480630715e-06, 'epoch': 0.78}
78%|███████▊ | 26728/34278 [29:20:15<8:31:29, 4.06s/it] {'loss': 0.1247, 'grad_norm': 0.9069026507711498, 'learning_rate': 1.2192137750174154e-06, 'epoch': 0.78}
78%|███████▊ | 26729/34278 [29:20:19<8:21:34, 3.99s/it] {'loss': 0.1205, 'grad_norm': 0.7822523918268156, 'learning_rate': 1.218904635725716e-06, 'epoch': 0.78}
78%|███████▊ | 26730/34278 [29:20:22<7:47:25, 3.72s/it] {'loss': 0.1125, 'grad_norm': 0.9584496067290577, 'learning_rate': 1.218595530190732e-06, 'epoch': 0.78}
78%|███████▊ | 26731/34278 [29:20:25<7:14:07, 3.45s/it] {'loss': 0.1253, 'grad_norm': 1.5316248666339345, 'learning_rate': 1.218286458415225e-06, 'epoch': 0.78}
78%|███████▊ | 26732/34278 [29:20:28<6:47:16, 3.24s/it] {'loss': 0.1111, 'grad_norm': 0.8843190043115959, 'learning_rate': 1.2179774204019545e-06, 'epoch': 0.78}
78%|███████▊ | 26733/34278 [29:20:32<7:10:24, 3.42s/it] {'loss': 0.1074, 'grad_norm': 0.7014560902717982, 'learning_rate': 1.2176684161536789e-06, 'epoch': 0.78}
78%|███████▊ | 26734/34278 [29:20:35<7:10:47, 3.43s/it] {'loss': 0.1365, 'grad_norm': 0.8405700625956761, 'learning_rate': 1.2173594456731552e-06, 'epoch': 0.78}
78%|███████▊ | 26735/34278 [29:20:38<6:50:01, 3.26s/it] {'loss': 0.1196, 'grad_norm': 0.7313345740138453, 'learning_rate': 1.217050508963145e-06, 'epoch': 0.78}
78%|███████▊ | 26736/34278 [29:20:43<8:13:25, 3.93s/it] {'loss': 0.1107, 'grad_norm': 0.9984755855923472, 'learning_rate': 1.2167416060264032e-06, 'epoch': 0.78}
78%|███████▊ | 26737/34278 [29:20:47<7:45:40, 3.71s/it] {'loss': 0.1034, 'grad_norm': 0.8165395911930621, 'learning_rate': 1.2164327368656891e-06, 'epoch': 0.78}
78%|███████▊ | 26738/34278 [29:20:49<7:11:41, 3.44s/it] {'loss': 0.1332, 'grad_norm': 0.8972382791560111, 'learning_rate': 1.2161239014837622e-06, 'epoch': 0.78}
78%|███████▊ | 26739/34278 [29:20:52<6:57:06, 3.32s/it] {'loss': 0.1184, 'grad_norm': 0.8449132336743154, 'learning_rate': 1.215815099883378e-06, 'epoch': 0.78}
78%|███████▊ | 26740/34278 [29:20:56<7:08:21, 3.41s/it] {'loss': 0.0959, 'grad_norm': 0.901871165124545, 'learning_rate': 1.215506332067291e-06, 'epoch': 0.78}
78%|███████▊ | 26741/34278 [29:20:59<7:05:54, 3.39s/it] {'loss': 0.1495, 'grad_norm': 0.8205331596332643, 'learning_rate': 1.215197598038262e-06, 'epoch': 0.78}
78%|███████▊ | 26742/34278 [29:21:03<6:58:03, 3.33s/it] {'loss': 0.1186, 'grad_norm': 0.8982133362430896, 'learning_rate': 1.2148888977990435e-06, 'epoch': 0.78}
78%|███████▊ | 26743/34278 [29:21:09<8:39:05, 4.13s/it] {'loss': 0.1084, 'grad_norm': 0.8739905942822849, 'learning_rate': 1.2145802313523953e-06, 'epoch': 0.78}
78%|███████▊ | 26744/34278 [29:21:13<8:38:04, 4.13s/it] {'loss': 0.1272, 'grad_norm': 0.9793410854378775, 'learning_rate': 1.2142715987010695e-06, 'epoch': 0.78}
78%|███████▊ | 26745/34278 [29:21:17<8:26:38, 4.04s/it] {'loss': 0.1137, 'grad_norm': 0.7237864976502513, 'learning_rate': 1.2139629998478242e-06, 'epoch': 0.78}
78%|███████▊ | 26746/34278 [29:21:20<7:54:15, 3.78s/it] {'loss': 0.1064, 'grad_norm': 0.8582898059022251, 'learning_rate': 1.2136544347954137e-06, 'epoch': 0.78}
78%|███████▊ | 26747/34278 [29:21:23<7:41:49, 3.68s/it] {'loss': 0.1088, 'grad_norm': 0.8673267023193789, 'learning_rate': 1.213345903546591e-06, 'epoch': 0.78}
78%|███████▊ | 26748/34278 [29:21:27<7:36:03, 3.63s/it] {'loss': 0.1124, 'grad_norm': 0.8770160523231496, 'learning_rate': 1.2130374061041129e-06, 'epoch': 0.78}
78%|███████▊ | 26749/34278 [29:21:30<7:11:57, 3.44s/it] {'loss': 0.1272, 'grad_norm': 0.9360533571696744, 'learning_rate': 1.2127289424707333e-06, 'epoch': 0.78}
78%|███████▊ | 26750/34278 [29:21:36<8:44:51, 4.18s/it] {'loss': 0.1284, 'grad_norm': 0.9511495848014366, 'learning_rate': 1.2124205126492045e-06, 'epoch': 0.78}
78%|███████▊ | 26751/34278 [29:21:41<9:47:08, 4.68s/it] {'loss': 0.1248, 'grad_norm': 0.6983904983574253, 'learning_rate': 1.2121121166422828e-06, 'epoch': 0.78}
78%|███████▊ | 26752/34278 [29:21:44<8:46:49, 4.20s/it] {'loss': 0.1332, 'grad_norm': 0.80836694000979, 'learning_rate': 1.2118037544527195e-06, 'epoch': 0.78}
78%|███████▊ | 26753/34278 [29:21:47<7:58:32, 3.82s/it] {'loss': 0.1042, 'grad_norm': 0.7763670996189822, 'learning_rate': 1.2114954260832668e-06, 'epoch': 0.78}
78%|███████▊ | 26754/34278 [29:21:51<7:34:06, 3.62s/it] {'loss': 0.1133, 'grad_norm': 0.7963575504478867, 'learning_rate': 1.2111871315366785e-06, 'epoch': 0.78}
78%|███████▊ | 26755/34278 [29:21:53<7:06:53, 3.40s/it] {'loss': 0.1133, 'grad_norm': 0.9381122350973579, 'learning_rate': 1.2108788708157087e-06, 'epoch': 0.78}
78%|███████▊ | 26756/34278 [29:21:58<7:32:52, 3.61s/it] {'loss': 0.1191, 'grad_norm': 1.0777249917953655, 'learning_rate': 1.2105706439231073e-06, 'epoch': 0.78}
78%|███████▊ | 26757/34278 [29:22:00<7:02:10, 3.37s/it] {'loss': 0.1087, 'grad_norm': 0.9266597424002107, 'learning_rate': 1.2102624508616257e-06, 'epoch': 0.78}
78%|███████▊ | 26758/34278 [29:22:03<6:53:06, 3.30s/it] {'loss': 0.136, 'grad_norm': 0.9842706032528625, 'learning_rate': 1.2099542916340172e-06, 'epoch': 0.78}
78%|███████▊ | 26759/34278 [29:22:07<7:03:08, 3.38s/it] {'loss': 0.133, 'grad_norm': 0.9169158453911002, 'learning_rate': 1.209646166243032e-06, 'epoch': 0.78}
78%|███████▊ | 26760/34278 [29:22:10<6:54:00, 3.30s/it] {'loss': 0.0961, 'grad_norm': 0.7454080072920237, 'learning_rate': 1.2093380746914201e-06, 'epoch': 0.78}
78%|███████▊ | 26761/34278 [29:22:13<6:46:26, 3.24s/it] {'loss': 0.1224, 'grad_norm': 0.9954352719939636, 'learning_rate': 1.2090300169819325e-06, 'epoch': 0.78}
78%|███████▊ | 26762/34278 [29:22:19<8:25:22, 4.03s/it] {'loss': 0.1043, 'grad_norm': 0.6148071655149786, 'learning_rate': 1.2087219931173217e-06, 'epoch': 0.78}
78%|███████▊ | 26763/34278 [29:22:23<8:05:02, 3.87s/it] {'loss': 0.1218, 'grad_norm': 1.0154074792198873, 'learning_rate': 1.2084140031003355e-06, 'epoch': 0.78}
78%|███████▊ | 26764/34278 [29:22:26<7:36:16, 3.64s/it] {'loss': 0.1107, 'grad_norm': 0.9427767581583829, 'learning_rate': 1.208106046933723e-06, 'epoch': 0.78}
78%|███████▊ | 26765/34278 [29:22:32<9:20:27, 4.48s/it] {'loss': 0.1452, 'grad_norm': 0.9415031648731707, 'learning_rate': 1.2077981246202353e-06, 'epoch': 0.78}
78%|███████▊ | 26766/34278 [29:22:36<8:39:11, 4.15s/it] {'loss': 0.1227, 'grad_norm': 0.8756557239714715, 'learning_rate': 1.2074902361626196e-06, 'epoch': 0.78}
78%|███████▊ | 26767/34278 [29:22:38<7:46:49, 3.73s/it] {'loss': 0.0937, 'grad_norm': 0.6957932399657003, 'learning_rate': 1.2071823815636257e-06, 'epoch': 0.78}
78%|███████▊ | 26768/34278 [29:22:44<9:12:55, 4.42s/it] {'loss': 0.1041, 'grad_norm': 0.883933179139988, 'learning_rate': 1.2068745608260035e-06, 'epoch': 0.78}
78%|███████▊ | 26769/34278 [29:22:48<9:00:27, 4.32s/it] {'loss': 0.1299, 'grad_norm': 0.7340686565749424, 'learning_rate': 1.2065667739525e-06, 'epoch': 0.78}
78%|███████▊ | 26770/34278 [29:22:53<9:07:30, 4.38s/it] {'loss': 0.109, 'grad_norm': 1.0386307950050804, 'learning_rate': 1.2062590209458614e-06, 'epoch': 0.78}
78%|███████▊ | 26771/34278 [29:22:56<8:12:49, 3.94s/it] {'loss': 0.1195, 'grad_norm': 0.8605718743394106, 'learning_rate': 1.2059513018088375e-06, 'epoch': 0.78}
78%|███████▊ | 26772/34278 [29:22:59<7:40:05, 3.68s/it] {'loss': 0.1132, 'grad_norm': 0.7351565609045481, 'learning_rate': 1.2056436165441738e-06, 'epoch': 0.78}
78%|███████▊ | 26773/34278 [29:23:02<7:14:30, 3.47s/it] {'loss': 0.144, 'grad_norm': 0.804844919474205, 'learning_rate': 1.2053359651546193e-06, 'epoch': 0.78}
78%|███████▊ | 26774/34278 [29:23:05<7:03:15, 3.38s/it] {'loss': 0.1119, 'grad_norm': 0.975276817856636, 'learning_rate': 1.2050283476429176e-06, 'epoch': 0.78}
78%|███████▊ | 26775/34278 [29:23:11<8:44:22, 4.19s/it] {'loss': 0.0903, 'grad_norm': 0.6950365481214305, 'learning_rate': 1.2047207640118187e-06, 'epoch': 0.78}
78%|███████▊ | 26776/34278 [29:23:16<8:55:45, 4.28s/it] {'loss': 0.1122, 'grad_norm': 0.7311888139797804, 'learning_rate': 1.204413214264067e-06, 'epoch': 0.78}
78%|███████▊ | 26777/34278 [29:23:19<8:15:21, 3.96s/it] {'loss': 0.1098, 'grad_norm': 0.7739204836649747, 'learning_rate': 1.2041056984024063e-06, 'epoch': 0.78}
78%|███████▊ | 26778/34278 [29:23:23<8:33:47, 4.11s/it] {'loss': 0.1089, 'grad_norm': 0.7795367277900777, 'learning_rate': 1.2037982164295837e-06, 'epoch': 0.78}
78%|███████▊ | 26779/34278 [29:23:28<8:57:38, 4.30s/it] {'loss': 0.1092, 'grad_norm': 0.7290185208829019, 'learning_rate': 1.203490768348346e-06, 'epoch': 0.78}
78%|███████▊ | 26780/34278 [29:23:31<7:59:02, 3.83s/it] {'loss': 0.1412, 'grad_norm': 0.8874276508149599, 'learning_rate': 1.203183354161435e-06, 'epoch': 0.78}
78%|███████▊ | 26781/34278 [29:23:36<8:33:47, 4.11s/it] {'loss': 0.1143, 'grad_norm': 0.7654875723898305, 'learning_rate': 1.2028759738715983e-06, 'epoch': 0.78}
78%|███████▊ | 26782/34278 [29:23:39<8:06:02, 3.89s/it] {'loss': 0.1149, 'grad_norm': 0.967778599241166, 'learning_rate': 1.2025686274815784e-06, 'epoch': 0.78}
78%|███████▊ | 26783/34278 [29:23:44<8:31:57, 4.10s/it] {'loss': 0.1046, 'grad_norm': 0.6716975594959714, 'learning_rate': 1.2022613149941176e-06, 'epoch': 0.78}
78%|███████▊ | 26784/34278 [29:23:50<9:41:09, 4.65s/it] {'loss': 0.1139, 'grad_norm': 0.7881468291688234, 'learning_rate': 1.2019540364119608e-06, 'epoch': 0.78}
78%|███████▊ | 26785/34278 [29:23:52<8:28:12, 4.07s/it] {'loss': 0.1383, 'grad_norm': 0.7793891505249078, 'learning_rate': 1.2016467917378539e-06, 'epoch': 0.78}
78%|███████▊ | 26786/34278 [29:23:55<7:50:00, 3.76s/it] {'loss': 0.1384, 'grad_norm': 0.9393049255640398, 'learning_rate': 1.201339580974537e-06, 'epoch': 0.78}
78%|███████▊ | 26787/34278 [29:23:59<7:31:59, 3.62s/it] {'loss': 0.1214, 'grad_norm': 0.773114171474124, 'learning_rate': 1.201032404124753e-06, 'epoch': 0.78}
78%|███████▊ | 26788/34278 [29:24:02<7:17:50, 3.51s/it] {'loss': 0.1116, 'grad_norm': 0.9296450342613287, 'learning_rate': 1.2007252611912457e-06, 'epoch': 0.78}
78%|███████▊ | 26789/34278 [29:24:05<6:58:52, 3.36s/it] {'loss': 0.1066, 'grad_norm': 0.8073457606369954, 'learning_rate': 1.200418152176756e-06, 'epoch': 0.78}
78%|███████▊ | 26790/34278 [29:24:08<7:06:48, 3.42s/it] {'loss': 0.1189, 'grad_norm': 1.1966057101530976, 'learning_rate': 1.2001110770840253e-06, 'epoch': 0.78}
78%|███████▊ | 26791/34278 [29:24:12<6:57:54, 3.35s/it] {'loss': 0.1273, 'grad_norm': 0.9165946645517647, 'learning_rate': 1.1998040359157954e-06, 'epoch': 0.78}
78%|███████▊ | 26792/34278 [29:24:15<6:55:26, 3.33s/it] {'loss': 0.1258, 'grad_norm': 1.0271750745608197, 'learning_rate': 1.1994970286748093e-06, 'epoch': 0.78}
78%|███████▊ | 26793/34278 [29:24:21<8:36:10, 4.14s/it] {'loss': 0.1071, 'grad_norm': 0.6935406299393101, 'learning_rate': 1.1991900553638065e-06, 'epoch': 0.78}
78%|███████▊ | 26794/34278 [29:24:24<8:15:58, 3.98s/it] {'loss': 0.1212, 'grad_norm': 0.8348271471994329, 'learning_rate': 1.1988831159855257e-06, 'epoch': 0.78}
78%|███████▊ | 26795/34278 [29:24:28<8:12:12, 3.95s/it] {'loss': 0.1183, 'grad_norm': 0.7935682707299909, 'learning_rate': 1.1985762105427107e-06, 'epoch': 0.78}
78%|███████▊ | 26796/34278 [29:24:31<7:42:29, 3.71s/it] {'loss': 0.1038, 'grad_norm': 0.8346079487274879, 'learning_rate': 1.198269339038099e-06, 'epoch': 0.78}
78%|███████▊ | 26797/34278 [29:24:35<7:19:02, 3.52s/it] {'loss': 0.115, 'grad_norm': 0.8994901482859692, 'learning_rate': 1.1979625014744306e-06, 'epoch': 0.78}
78%|███████▊ | 26798/34278 [29:24:40<8:42:11, 4.19s/it] {'loss': 0.0962, 'grad_norm': 0.7649070281646475, 'learning_rate': 1.1976556978544467e-06, 'epoch': 0.78}
78%|███████▊ | 26799/34278 [29:24:44<8:07:09, 3.91s/it] {'loss': 0.1365, 'grad_norm': 0.9454515925374907, 'learning_rate': 1.1973489281808854e-06, 'epoch': 0.78}
78%|███████▊ | 26800/34278 [29:24:47<7:39:29, 3.69s/it] {'loss': 0.1119, 'grad_norm': 0.8893040178251703, 'learning_rate': 1.1970421924564835e-06, 'epoch': 0.78}
78%|███████▊ | 26801/34278 [29:24:51<8:02:03, 3.87s/it] {'loss': 0.1032, 'grad_norm': 0.6299941910648025, 'learning_rate': 1.1967354906839824e-06, 'epoch': 0.78}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
78%|███████▊ | 26802/34278 [29:24:54<7:36:32, 3.66s/it] {'loss': 0.1005, 'grad_norm': 0.7194752175170329, 'learning_rate': 1.1964288228661175e-06, 'epoch': 0.78}
78%|███████▊ | 26803/34278 [29:24:57<7:03:01, 3.40s/it] {'loss': 0.115, 'grad_norm': 0.8269650255837503, 'learning_rate': 1.1961221890056296e-06, 'epoch': 0.78}
78%|███████▊ | 26804/34278 [29:25:00<6:56:39, 3.34s/it] {'loss': 0.0987, 'grad_norm': 0.6900089824003621, 'learning_rate': 1.1958155891052531e-06, 'epoch': 0.78}
78%|███████▊ | 26805/34278 [29:25:03<6:45:15, 3.25s/it] {'loss': 0.1228, 'grad_norm': 0.7664680136418915, 'learning_rate': 1.1955090231677285e-06, 'epoch': 0.78}
78%|███████▊ | 26806/34278 [29:25:06<6:41:09, 3.22s/it] {'loss': 0.1058, 'grad_norm': 0.8842127872208614, 'learning_rate': 1.195202491195791e-06, 'epoch': 0.78}
78%|███████▊ | 26807/34278 [29:25:09<6:29:26, 3.13s/it] {'loss': 0.1179, 'grad_norm': 0.9171726720943688, 'learning_rate': 1.194895993192176e-06, 'epoch': 0.78}
78%|███████▊ | 26808/34278 [29:25:14<7:09:57, 3.45s/it] {'loss': 0.1299, 'grad_norm': 0.7028869612526993, 'learning_rate': 1.1945895291596217e-06, 'epoch': 0.78}
78%|███████▊ | 26809/34278 [29:25:18<7:33:05, 3.64s/it] {'loss': 0.1273, 'grad_norm': 0.7584500685507514, 'learning_rate': 1.1942830991008646e-06, 'epoch': 0.78}
78%|███████▊ | 26810/34278 [29:25:24<8:57:52, 4.32s/it] {'loss': 0.1283, 'grad_norm': 0.870532507451866, 'learning_rate': 1.1939767030186377e-06, 'epoch': 0.78}
78%|███████▊ | 26811/34278 [29:25:27<8:13:12, 3.96s/it] {'loss': 0.1084, 'grad_norm': 0.8621506605643977, 'learning_rate': 1.1936703409156802e-06, 'epoch': 0.78}
78%|███████▊ | 26812/34278 [29:25:30<7:55:55, 3.82s/it] {'loss': 0.1067, 'grad_norm': 0.6880970855532711, 'learning_rate': 1.1933640127947255e-06, 'epoch': 0.78}
78%|███████▊ | 26813/34278 [29:25:34<7:39:26, 3.69s/it] {'loss': 0.1329, 'grad_norm': 0.9526159172576324, 'learning_rate': 1.1930577186585063e-06, 'epoch': 0.78}
78%|███████▊ | 26814/34278 [29:25:36<7:05:27, 3.42s/it] {'loss': 0.0982, 'grad_norm': 0.7638848398201711, 'learning_rate': 1.1927514585097594e-06, 'epoch': 0.78}
78%|███████▊ | 26815/34278 [29:25:40<7:12:46, 3.48s/it] {'loss': 0.1011, 'grad_norm': 0.8123166818427471, 'learning_rate': 1.1924452323512192e-06, 'epoch': 0.78}
78%|███████▊ | 26816/34278 [29:25:43<6:57:31, 3.36s/it] {'loss': 0.1246, 'grad_norm': 0.8692443813015546, 'learning_rate': 1.1921390401856198e-06, 'epoch': 0.78}
78%|███████▊ | 26817/34278 [29:25:48<8:03:50, 3.89s/it] {'loss': 0.104, 'grad_norm': 0.8254263945250957, 'learning_rate': 1.1918328820156928e-06, 'epoch': 0.78}
78%|███████▊ | 26818/34278 [29:25:52<8:08:44, 3.93s/it] {'loss': 0.0986, 'grad_norm': 1.043279921974565, 'learning_rate': 1.1915267578441737e-06, 'epoch': 0.78}
78%|███████▊ | 26819/34278 [29:25:55<7:32:29, 3.64s/it] {'loss': 0.1259, 'grad_norm': 0.8854793295232857, 'learning_rate': 1.1912206676737942e-06, 'epoch': 0.78}
78%|███████▊ | 26820/34278 [29:25:58<7:14:36, 3.50s/it] {'loss': 0.1021, 'grad_norm': 0.7688145757132527, 'learning_rate': 1.1909146115072866e-06, 'epoch': 0.78}
78%|███████▊ | 26821/34278 [29:26:02<7:05:01, 3.42s/it] {'loss': 0.0956, 'grad_norm': 0.6315318923356129, 'learning_rate': 1.1906085893473835e-06, 'epoch': 0.78}
78%|███████▊ | 26822/34278 [29:26:06<7:32:32, 3.64s/it] {'loss': 0.1018, 'grad_norm': 0.855182269868678, 'learning_rate': 1.1903026011968194e-06, 'epoch': 0.78}
78%|███████▊ | 26823/34278 [29:26:12<8:56:35, 4.32s/it] {'loss': 0.1071, 'grad_norm': 1.2437032393368883, 'learning_rate': 1.189996647058324e-06, 'epoch': 0.78}
78%|███████▊ | 26824/34278 [29:26:15<8:32:51, 4.13s/it] {'loss': 0.1156, 'grad_norm': 0.7593396867233735, 'learning_rate': 1.1896907269346274e-06, 'epoch': 0.78}
78%|███████▊ | 26825/34278 [29:26:19<8:20:34, 4.03s/it] {'loss': 0.0971, 'grad_norm': 0.7818691260607719, 'learning_rate': 1.1893848408284641e-06, 'epoch': 0.78}
78%|███████▊ | 26826/34278 [29:26:24<8:49:21, 4.26s/it] {'loss': 0.1057, 'grad_norm': 0.8160192005648955, 'learning_rate': 1.1890789887425618e-06, 'epoch': 0.78}
78%|███████▊ | 26827/34278 [29:26:27<8:01:11, 3.87s/it] {'loss': 0.0952, 'grad_norm': 0.7062585494868514, 'learning_rate': 1.188773170679653e-06, 'epoch': 0.78}
78%|███████▊ | 26828/34278 [29:26:30<7:26:14, 3.59s/it] {'loss': 0.1078, 'grad_norm': 1.0048072435142412, 'learning_rate': 1.1884673866424683e-06, 'epoch': 0.78}
78%|███████▊ | 26829/34278 [29:26:33<7:15:35, 3.51s/it] {'loss': 0.0987, 'grad_norm': 0.6853913456310932, 'learning_rate': 1.1881616366337372e-06, 'epoch': 0.78}
78%|███████▊ | 26830/34278 [29:26:36<6:50:47, 3.31s/it] {'loss': 0.1291, 'grad_norm': 0.7420160718974594, 'learning_rate': 1.1878559206561874e-06, 'epoch': 0.78}
78%|███████▊ | 26831/34278 [29:26:39<6:46:54, 3.28s/it] {'loss': 0.1307, 'grad_norm': 1.1803984725626182, 'learning_rate': 1.1875502387125514e-06, 'epoch': 0.78}
78%|███████▊ | 26832/34278 [29:26:43<6:54:51, 3.34s/it] {'loss': 0.141, 'grad_norm': 0.8828823275993483, 'learning_rate': 1.1872445908055557e-06, 'epoch': 0.78}
78%|███████▊ | 26833/34278 [29:26:46<6:37:27, 3.20s/it] {'loss': 0.1124, 'grad_norm': 1.0668592512178308, 'learning_rate': 1.1869389769379314e-06, 'epoch': 0.78}
78%|███████▊ | 26834/34278 [29:26:49<6:35:17, 3.19s/it] {'loss': 0.1389, 'grad_norm': 0.8475742344287523, 'learning_rate': 1.186633397112404e-06, 'epoch': 0.78}
78%|███████▊ | 26835/34278 [29:26:55<8:26:52, 4.09s/it] {'loss': 0.1361, 'grad_norm': 1.0318833825760718, 'learning_rate': 1.1863278513317046e-06, 'epoch': 0.78}
78%|███████▊ | 26836/34278 [29:27:01<9:41:16, 4.69s/it] {'loss': 0.1176, 'grad_norm': 0.8669605871058667, 'learning_rate': 1.1860223395985598e-06, 'epoch': 0.78}
78%|███████▊ | 26837/34278 [29:27:04<8:36:55, 4.17s/it] {'loss': 0.1121, 'grad_norm': 0.700379017369269, 'learning_rate': 1.1857168619156962e-06, 'epoch': 0.78}
78%|███████▊ | 26838/34278 [29:27:10<9:49:43, 4.76s/it] {'loss': 0.1385, 'grad_norm': 0.8243197277304135, 'learning_rate': 1.1854114182858413e-06, 'epoch': 0.78}
78%|███████▊ | 26839/34278 [29:27:13<8:53:54, 4.31s/it] {'loss': 0.1273, 'grad_norm': 0.8155538368360573, 'learning_rate': 1.1851060087117244e-06, 'epoch': 0.78}
78%|███████▊ | 26840/34278 [29:27:19<9:41:36, 4.69s/it] {'loss': 0.1044, 'grad_norm': 0.7557102775662806, 'learning_rate': 1.1848006331960688e-06, 'epoch': 0.78}
78%|███████▊ | 26841/34278 [29:27:22<8:41:28, 4.21s/it] {'loss': 0.1227, 'grad_norm': 0.8165470558445905, 'learning_rate': 1.1844952917416043e-06, 'epoch': 0.78}
78%|███████▊ | 26842/34278 [29:27:26<8:26:16, 4.09s/it] {'loss': 0.1128, 'grad_norm': 0.9378788158559712, 'learning_rate': 1.184189984351054e-06, 'epoch': 0.78}
78%|███████▊ | 26843/34278 [29:27:29<7:44:27, 3.75s/it] {'loss': 0.1074, 'grad_norm': 0.7393726333565241, 'learning_rate': 1.183884711027144e-06, 'epoch': 0.78}
78%|███████▊ | 26844/34278 [29:27:32<7:17:06, 3.53s/it] {'loss': 0.1138, 'grad_norm': 0.7803308518649982, 'learning_rate': 1.1835794717726e-06, 'epoch': 0.78}
78%|███████▊ | 26845/34278 [29:27:38<8:45:29, 4.24s/it] {'loss': 0.1222, 'grad_norm': 1.114408197837132, 'learning_rate': 1.1832742665901486e-06, 'epoch': 0.78}
78%|███████▊ | 26846/34278 [29:27:41<8:01:57, 3.89s/it] {'loss': 0.0992, 'grad_norm': 0.9755866168268343, 'learning_rate': 1.182969095482514e-06, 'epoch': 0.78}
78%|███████▊ | 26847/34278 [29:27:44<7:29:19, 3.63s/it] {'loss': 0.1193, 'grad_norm': 0.7674839105195207, 'learning_rate': 1.1826639584524185e-06, 'epoch': 0.78}
78%|███████▊ | 26848/34278 [29:27:47<7:16:11, 3.52s/it] {'loss': 0.1328, 'grad_norm': 0.7976890375796063, 'learning_rate': 1.1823588555025894e-06, 'epoch': 0.78}
78%|███████▊ | 26849/34278 [29:27:50<6:57:08, 3.37s/it] {'loss': 0.1152, 'grad_norm': 1.0477441113233703, 'learning_rate': 1.182053786635749e-06, 'epoch': 0.78}
78%|███████▊ | 26850/34278 [29:27:54<7:03:26, 3.42s/it] {'loss': 0.1294, 'grad_norm': 0.9130039249914723, 'learning_rate': 1.1817487518546194e-06, 'epoch': 0.78}
78%|███████▊ | 26851/34278 [29:27:57<6:46:18, 3.28s/it] {'loss': 0.1347, 'grad_norm': 0.8111684208701135, 'learning_rate': 1.1814437511619254e-06, 'epoch': 0.78}
78%|███████▊ | 26852/34278 [29:28:00<6:39:35, 3.23s/it] {'loss': 0.122, 'grad_norm': 1.128438059941651, 'learning_rate': 1.1811387845603916e-06, 'epoch': 0.78}
78%|███████▊ | 26853/34278 [29:28:02<6:22:32, 3.09s/it] {'loss': 0.1088, 'grad_norm': 0.722683754825944, 'learning_rate': 1.180833852052739e-06, 'epoch': 0.78}
78%|███████▊ | 26854/34278 [29:28:08<8:10:31, 3.96s/it] {'loss': 0.1169, 'grad_norm': 1.05479675217445, 'learning_rate': 1.1805289536416887e-06, 'epoch': 0.78}
78%|███████▊ | 26855/34278 [29:28:14<9:01:10, 4.37s/it] {'loss': 0.1138, 'grad_norm': 0.7803310474664418, 'learning_rate': 1.180224089329966e-06, 'epoch': 0.78}
78%|███████▊ | 26856/34278 [29:28:17<8:37:35, 4.18s/it] {'loss': 0.0925, 'grad_norm': 0.8860024298146697, 'learning_rate': 1.1799192591202884e-06, 'epoch': 0.78}
78%|███████▊ | 26857/34278 [29:28:21<8:07:35, 3.94s/it] {'loss': 0.1147, 'grad_norm': 0.8998043125351297, 'learning_rate': 1.1796144630153806e-06, 'epoch': 0.78}
78%|███████▊ | 26858/34278 [29:28:26<8:37:48, 4.19s/it] {'loss': 0.1173, 'grad_norm': 0.8579436010477661, 'learning_rate': 1.1793097010179639e-06, 'epoch': 0.78}
78%|███████▊ | 26859/34278 [29:28:29<8:05:50, 3.93s/it] {'loss': 0.1233, 'grad_norm': 1.1066410641333093, 'learning_rate': 1.179004973130758e-06, 'epoch': 0.78}
78%|███████▊ | 26860/34278 [29:28:32<7:26:32, 3.61s/it] {'loss': 0.0991, 'grad_norm': 0.8990092908336493, 'learning_rate': 1.1787002793564822e-06, 'epoch': 0.78}
78%|███████▊ | 26861/34278 [29:28:37<8:18:57, 4.04s/it] {'loss': 0.105, 'grad_norm': 0.9630480007906665, 'learning_rate': 1.1783956196978595e-06, 'epoch': 0.78}
78%|███████▊ | 26862/34278 [29:28:40<7:52:01, 3.82s/it] {'loss': 0.1256, 'grad_norm': 0.8582363903078812, 'learning_rate': 1.1780909941576074e-06, 'epoch': 0.78}
78%|███████▊ | 26863/34278 [29:28:44<7:45:38, 3.77s/it] {'loss': 0.1187, 'grad_norm': 0.9205639610287715, 'learning_rate': 1.1777864027384478e-06, 'epoch': 0.78}
78%|███████▊ | 26864/34278 [29:28:47<7:28:01, 3.63s/it] {'loss': 0.1108, 'grad_norm': 1.0210478042715165, 'learning_rate': 1.177481845443097e-06, 'epoch': 0.78}
78%|███████▊ | 26865/34278 [29:28:51<7:20:58, 3.57s/it]
{'loss': 0.1126, 'grad_norm': 0.9103747963753535, 'learning_rate': 1.1771773222742778e-06, 'epoch': 0.78} 78%|███████▊ | 26865/34278 [29:28:51<7:20:58, 3.57s/it] 78%|███████▊ | 26866/34278 [29:28:54<7:27:09, 3.62s/it] {'loss': 0.0976, 'grad_norm': 1.0552119030485607, 'learning_rate': 1.1768728332347062e-06, 'epoch': 0.78} 78%|███████▊ | 26866/34278 [29:28:54<7:27:09, 3.62s/it] 78%|███████▊ | 26867/34278 [29:29:00<8:53:56, 4.32s/it] {'loss': 0.1007, 'grad_norm': 0.9337237343380899, 'learning_rate': 1.1765683783271004e-06, 'epoch': 0.78} 78%|███████▊ | 26867/34278 [29:29:00<8:53:56, 4.32s/it] 78%|███████▊ | 26868/34278 [29:29:04<8:38:21, 4.20s/it] {'loss': 0.0996, 'grad_norm': 0.9383534581297182, 'learning_rate': 1.176263957554179e-06, 'epoch': 0.78} 78%|███████▊ | 26868/34278 [29:29:04<8:38:21, 4.20s/it] 78%|███████▊ | 26869/34278 [29:29:07<8:07:04, 3.94s/it] {'loss': 0.1263, 'grad_norm': 0.7949921727905045, 'learning_rate': 1.1759595709186616e-06, 'epoch': 0.78} 78%|███████▊ | 26869/34278 [29:29:07<8:07:04, 3.94s/it] 78%|███████▊ | 26870/34278 [29:29:10<7:27:11, 3.62s/it] {'loss': 0.1313, 'grad_norm': 1.0558144681634218, 'learning_rate': 1.1756552184232634e-06, 'epoch': 0.78} 78%|███████▊ | 26870/34278 [29:29:10<7:27:11, 3.62s/it] 78%|███████▊ | 26871/34278 [29:29:14<7:18:16, 3.55s/it] {'loss': 0.1046, 'grad_norm': 0.8564336025917788, 'learning_rate': 1.175350900070703e-06, 'epoch': 0.78} 78%|███████▊ | 26871/34278 [29:29:14<7:18:16, 3.55s/it] 78%|███████▊ | 26872/34278 [29:29:17<7:09:05, 3.48s/it] {'loss': 0.1235, 'grad_norm': 1.1128294106431664, 'learning_rate': 1.1750466158636975e-06, 'epoch': 0.78} 78%|███████▊ | 26872/34278 [29:29:17<7:09:05, 3.48s/it] 78%|███████▊ | 26873/34278 [29:29:20<6:54:07, 3.36s/it] {'loss': 0.1033, 'grad_norm': 0.7481349097680933, 'learning_rate': 1.1747423658049612e-06, 'epoch': 0.78} 78%|███████▊ | 26873/34278 [29:29:20<6:54:07, 3.36s/it] 78%|███████▊ | 26874/34278 [29:29:23<6:42:31, 3.26s/it] {'loss': 0.1238, 'grad_norm': 
1.0080161958019997, 'learning_rate': 1.1744381498972117e-06, 'epoch': 0.78} 78%|███████▊ | 26874/34278 [29:29:23<6:42:31, 3.26s/it] 78%|███████▊ | 26875/34278 [29:29:27<7:01:07, 3.41s/it] {'loss': 0.1103, 'grad_norm': 0.9772100033840949, 'learning_rate': 1.1741339681431669e-06, 'epoch': 0.78} 78%|███████▊ | 26875/34278 [29:29:27<7:01:07, 3.41s/it] 78%|███████▊ | 26876/34278 [29:29:31<7:12:11, 3.50s/it] {'loss': 0.1268, 'grad_norm': 0.8285845957630742, 'learning_rate': 1.17382982054554e-06, 'epoch': 0.78} 78%|███████▊ | 26876/34278 [29:29:31<7:12:11, 3.50s/it] 78%|███████▊ | 26877/34278 [29:29:34<6:49:07, 3.32s/it] {'loss': 0.1228, 'grad_norm': 1.0864471912142566, 'learning_rate': 1.1735257071070466e-06, 'epoch': 0.78} 78%|███████▊ | 26877/34278 [29:29:34<6:49:07, 3.32s/it] 78%|███████▊ | 26878/34278 [29:29:36<6:31:17, 3.17s/it] {'loss': 0.0952, 'grad_norm': 0.8564488073734594, 'learning_rate': 1.1732216278304032e-06, 'epoch': 0.78} 78%|███████▊ | 26878/34278 [29:29:36<6:31:17, 3.17s/it] 78%|███████▊ | 26879/34278 [29:29:40<7:06:29, 3.46s/it] {'loss': 0.105, 'grad_norm': 0.8030651747634504, 'learning_rate': 1.1729175827183232e-06, 'epoch': 0.78} 78%|███████▊ | 26879/34278 [29:29:40<7:06:29, 3.46s/it] 78%|███████▊ | 26880/34278 [29:29:44<7:00:23, 3.41s/it] {'loss': 0.1163, 'grad_norm': 0.7667886353252452, 'learning_rate': 1.1726135717735204e-06, 'epoch': 0.78} 78%|███████▊ | 26880/34278 [29:29:44<7:00:23, 3.41s/it] 78%|███████▊ | 26881/34278 [29:29:47<6:46:37, 3.30s/it] {'loss': 0.1322, 'grad_norm': 1.0480954538064446, 'learning_rate': 1.1723095949987101e-06, 'epoch': 0.78} 78%|███████▊ | 26881/34278 [29:29:47<6:46:37, 3.30s/it] 78%|███████▊ | 26882/34278 [29:29:51<7:17:29, 3.55s/it] {'loss': 0.1023, 'grad_norm': 0.7230827016380355, 'learning_rate': 1.1720056523966072e-06, 'epoch': 0.78} 78%|███████▊ | 26882/34278 [29:29:51<7:17:29, 3.55s/it] 78%|███████▊ | 26883/34278 [29:29:56<8:07:19, 3.95s/it] {'loss': 0.1179, 'grad_norm': 0.7217534438164802, 'learning_rate': 
1.171701743969924e-06, 'epoch': 0.78} 78%|███████▊ | 26883/34278 [29:29:56<8:07:19, 3.95s/it] 78%|███████▊ | 26884/34278 [29:29:59<7:43:21, 3.76s/it] {'loss': 0.1053, 'grad_norm': 0.8114078742560802, 'learning_rate': 1.1713978697213723e-06, 'epoch': 0.78} 78%|███████▊ | 26884/34278 [29:29:59<7:43:21, 3.76s/it] 78%|███████▊ | 26885/34278 [29:30:02<7:25:32, 3.62s/it] {'loss': 0.1154, 'grad_norm': 0.7021766512605949, 'learning_rate': 1.1710940296536682e-06, 'epoch': 0.78} 78%|███████▊ | 26885/34278 [29:30:02<7:25:32, 3.62s/it] 78%|███████▊ | 26886/34278 [29:30:06<7:29:02, 3.64s/it] {'loss': 0.1204, 'grad_norm': 0.8600017977371976, 'learning_rate': 1.1707902237695206e-06, 'epoch': 0.78} 78%|███████▊ | 26886/34278 [29:30:06<7:29:02, 3.64s/it] 78%|███████▊ | 26887/34278 [29:30:09<7:02:55, 3.43s/it] {'loss': 0.1081, 'grad_norm': 0.8926396098940488, 'learning_rate': 1.1704864520716442e-06, 'epoch': 0.78} 78%|███████▊ | 26887/34278 [29:30:09<7:02:55, 3.43s/it] 78%|███████▊ | 26888/34278 [29:30:12<6:49:48, 3.33s/it] {'loss': 0.1187, 'grad_norm': 0.8685521401022998, 'learning_rate': 1.170182714562752e-06, 'epoch': 0.78} 78%|███████▊ | 26888/34278 [29:30:12<6:49:48, 3.33s/it] 78%|███████▊ | 26889/34278 [29:30:15<6:41:25, 3.26s/it] {'loss': 0.1027, 'grad_norm': 0.8758005932407248, 'learning_rate': 1.1698790112455538e-06, 'epoch': 0.78} 78%|███████▊ | 26889/34278 [29:30:15<6:41:25, 3.26s/it] 78%|███████▊ | 26890/34278 [29:30:19<6:55:28, 3.37s/it] {'loss': 0.1413, 'grad_norm': 0.8952556006070274, 'learning_rate': 1.1695753421227606e-06, 'epoch': 0.78} 78%|███████▊ | 26890/34278 [29:30:19<6:55:28, 3.37s/it] 78%|███████▊ | 26891/34278 [29:30:24<8:06:24, 3.95s/it] {'loss': 0.1254, 'grad_norm': 0.8134807017722688, 'learning_rate': 1.1692717071970844e-06, 'epoch': 0.78} 78%|███████▊ | 26891/34278 [29:30:24<8:06:24, 3.95s/it] 78%|███████▊ | 26892/34278 [29:30:27<7:33:24, 3.68s/it] {'loss': 0.115, 'grad_norm': 1.0174722168432515, 'learning_rate': 1.1689681064712367e-06, 'epoch': 0.78} 
78%|███████▊ | 26892/34278 [29:30:27<7:33:24, 3.68s/it] 78%|███████▊ | 26893/34278 [29:30:33<8:34:51, 4.18s/it] {'loss': 0.1077, 'grad_norm': 0.9097203778879775, 'learning_rate': 1.1686645399479278e-06, 'epoch': 0.78} 78%|███████▊ | 26893/34278 [29:30:33<8:34:51, 4.18s/it] 78%|███████▊ | 26894/34278 [29:30:37<8:31:57, 4.16s/it] {'loss': 0.0995, 'grad_norm': 0.7252747379861907, 'learning_rate': 1.1683610076298658e-06, 'epoch': 0.78} 78%|███████▊ | 26894/34278 [29:30:37<8:31:57, 4.16s/it] 78%|███████▊ | 26895/34278 [29:30:40<7:49:42, 3.82s/it] {'loss': 0.1343, 'grad_norm': 2.0357071447954014, 'learning_rate': 1.1680575095197634e-06, 'epoch': 0.78} 78%|███████▊ | 26895/34278 [29:30:40<7:49:42, 3.82s/it] 78%|███████▊ | 26896/34278 [29:30:43<7:22:10, 3.59s/it] {'loss': 0.1078, 'grad_norm': 1.3182450614391934, 'learning_rate': 1.1677540456203285e-06, 'epoch': 0.78} 78%|███████▊ | 26896/34278 [29:30:43<7:22:10, 3.59s/it] 78%|███████▊ | 26897/34278 [29:30:47<7:34:12, 3.69s/it] {'loss': 0.1047, 'grad_norm': 0.864687794459096, 'learning_rate': 1.167450615934268e-06, 'epoch': 0.78} 78%|███████▊ | 26897/34278 [29:30:47<7:34:12, 3.69s/it] 78%|███████▊ | 26898/34278 [29:30:51<7:43:14, 3.77s/it] {'loss': 0.1189, 'grad_norm': 0.847906256638758, 'learning_rate': 1.1671472204642964e-06, 'epoch': 0.78} 78%|███████▊ | 26898/34278 [29:30:51<7:43:14, 3.77s/it] 78%|███████▊ | 26899/34278 [29:30:54<7:26:23, 3.63s/it] {'loss': 0.1006, 'grad_norm': 0.8590265179007457, 'learning_rate': 1.1668438592131194e-06, 'epoch': 0.78} 78%|███████▊ | 26899/34278 [29:30:54<7:26:23, 3.63s/it] 78%|███████▊ | 26900/34278 [29:30:57<6:58:20, 3.40s/it] {'loss': 0.1186, 'grad_norm': 1.0116670667599434, 'learning_rate': 1.1665405321834439e-06, 'epoch': 0.78} 78%|███████▊ | 26900/34278 [29:30:57<6:58:20, 3.40s/it] 78%|███████▊ | 26901/34278 [29:31:00<6:46:33, 3.31s/it] {'loss': 0.1174, 'grad_norm': 0.7010234245668824, 'learning_rate': 1.1662372393779809e-06, 'epoch': 0.78} 78%|███████▊ | 26901/34278 
[29:31:00<6:46:33, 3.31s/it] 78%|███████▊ | 26902/34278 [29:31:06<8:27:04, 4.12s/it] {'loss': 0.1467, 'grad_norm': 0.9368354596407508, 'learning_rate': 1.1659339807994364e-06, 'epoch': 0.78} 78%|███████▊ | 26902/34278 [29:31:06<8:27:04, 4.12s/it] 78%|███████▊ | 26903/34278 [29:31:10<8:10:34, 3.99s/it] {'loss': 0.1168, 'grad_norm': 0.771874980587234, 'learning_rate': 1.165630756450517e-06, 'epoch': 0.78} 78%|███████▊ | 26903/34278 [29:31:10<8:10:34, 3.99s/it] 78%|███████▊ | 26904/34278 [29:31:14<8:34:26, 4.19s/it] {'loss': 0.1099, 'grad_norm': 0.7951679227400112, 'learning_rate': 1.1653275663339308e-06, 'epoch': 0.78} 78%|███████▊ | 26904/34278 [29:31:14<8:34:26, 4.19s/it] 78%|███████▊ | 26905/34278 [29:31:18<8:22:42, 4.09s/it] {'loss': 0.1168, 'grad_norm': 0.8609451871037228, 'learning_rate': 1.1650244104523862e-06, 'epoch': 0.78} 78%|███████▊ | 26905/34278 [29:31:18<8:22:42, 4.09s/it] 78%|███████▊ | 26906/34278 [29:31:21<7:54:01, 3.86s/it] {'loss': 0.0979, 'grad_norm': 0.9215836537814548, 'learning_rate': 1.164721288808588e-06, 'epoch': 0.78} 78%|███████▊ | 26906/34278 [29:31:21<7:54:01, 3.86s/it] 78%|███████▊ | 26907/34278 [29:31:24<7:19:57, 3.58s/it] {'loss': 0.1019, 'grad_norm': 0.8879045100520424, 'learning_rate': 1.1644182014052408e-06, 'epoch': 0.78} 78%|███████▊ | 26907/34278 [29:31:24<7:19:57, 3.58s/it] 78%|███████▊ | 26908/34278 [29:31:29<8:03:34, 3.94s/it] {'loss': 0.1483, 'grad_norm': 0.9620173950761984, 'learning_rate': 1.1641151482450541e-06, 'epoch': 0.78} 78%|███████▊ | 26908/34278 [29:31:29<8:03:34, 3.94s/it] 79%|███████▊ | 26909/34278 [29:31:32<7:27:01, 3.64s/it] {'loss': 0.0995, 'grad_norm': 0.8099233952877505, 'learning_rate': 1.1638121293307302e-06, 'epoch': 0.79} 79%|███████▊ | 26909/34278 [29:31:32<7:27:01, 3.64s/it] 79%|███████▊ | 26910/34278 [29:31:35<7:10:04, 3.50s/it] {'loss': 0.0979, 'grad_norm': 0.9758814672664569, 'learning_rate': 1.163509144664977e-06, 'epoch': 0.79} 79%|███████▊ | 26910/34278 [29:31:35<7:10:04, 3.50s/it] 79%|███████▊ 
| 26911/34278 [29:31:39<7:00:05, 3.42s/it] {'loss': 0.1155, 'grad_norm': 0.9139112724301753, 'learning_rate': 1.1632061942504975e-06, 'epoch': 0.79} 79%|███████▊ | 26911/34278 [29:31:39<7:00:05, 3.42s/it] 79%|███████▊ | 26912/34278 [29:31:42<6:44:47, 3.30s/it] {'loss': 0.1024, 'grad_norm': 1.267470682177874, 'learning_rate': 1.1629032780899978e-06, 'epoch': 0.79} 79%|███████▊ | 26912/34278 [29:31:42<6:44:47, 3.30s/it] 79%|███████▊ | 26913/34278 [29:31:45<6:33:49, 3.21s/it] {'loss': 0.0993, 'grad_norm': 0.9679952516979631, 'learning_rate': 1.1626003961861821e-06, 'epoch': 0.79} 79%|███████▊ | 26913/34278 [29:31:45<6:33:49, 3.21s/it] 79%|███████▊ | 26914/34278 [29:31:48<6:27:12, 3.15s/it] {'loss': 0.1023, 'grad_norm': 0.7575683204419322, 'learning_rate': 1.1622975485417526e-06, 'epoch': 0.79} 79%|███████▊ | 26914/34278 [29:31:48<6:27:12, 3.15s/it] 79%|███████▊ | 26915/34278 [29:31:51<6:24:08, 3.13s/it] {'loss': 0.1123, 'grad_norm': 0.7378218391439108, 'learning_rate': 1.1619947351594147e-06, 'epoch': 0.79} 79%|███████▊ | 26915/34278 [29:31:51<6:24:08, 3.13s/it] 79%|███████▊ | 26916/34278 [29:31:55<7:02:15, 3.44s/it] {'loss': 0.105, 'grad_norm': 0.7338850072246711, 'learning_rate': 1.1616919560418727e-06, 'epoch': 0.79} 79%|███████▊ | 26916/34278 [29:31:55<7:02:15, 3.44s/it] 79%|███████▊ | 26917/34278 [29:31:58<6:53:51, 3.37s/it] {'loss': 0.111, 'grad_norm': 0.8676492666613993, 'learning_rate': 1.1613892111918273e-06, 'epoch': 0.79} 79%|███████▊ | 26917/34278 [29:31:58<6:53:51, 3.37s/it] 79%|███████▊ | 26918/34278 [29:32:01<6:49:36, 3.34s/it] {'loss': 0.1029, 'grad_norm': 1.1011054265315563, 'learning_rate': 1.1610865006119838e-06, 'epoch': 0.79} 79%|███████▊ | 26918/34278 [29:32:01<6:49:36, 3.34s/it] 79%|███████▊ | 26919/34278 [29:32:04<6:35:38, 3.23s/it] {'loss': 0.1152, 'grad_norm': 0.7005316929548462, 'learning_rate': 1.160783824305044e-06, 'epoch': 0.79} 79%|███████▊ | 26919/34278 [29:32:04<6:35:38, 3.23s/it] 79%|███████▊ | 26920/34278 [29:32:07<6:22:48, 
3.12s/it] {'loss': 0.1308, 'grad_norm': 1.0391599316915296, 'learning_rate': 1.1604811822737084e-06, 'epoch': 0.79} 79%|███████▊ | 26920/34278 [29:32:07<6:22:48, 3.12s/it] 79%|███████▊ | 26921/34278 [29:32:10<6:27:45, 3.16s/it] {'loss': 0.1019, 'grad_norm': 0.8165121760645124, 'learning_rate': 1.1601785745206795e-06, 'epoch': 0.79} 79%|███████▊ | 26921/34278 [29:32:10<6:27:45, 3.16s/it] 79%|███████▊ | 26922/34278 [29:32:16<8:12:04, 4.01s/it] {'loss': 0.0917, 'grad_norm': 0.7787105521225467, 'learning_rate': 1.1598760010486614e-06, 'epoch': 0.79} 79%|███████▊ | 26922/34278 [29:32:16<8:12:04, 4.01s/it] 79%|███████▊ | 26923/34278 [29:32:20<7:58:44, 3.91s/it] {'loss': 0.1258, 'grad_norm': 0.691787327300869, 'learning_rate': 1.1595734618603543e-06, 'epoch': 0.79} 79%|███████▊ | 26923/34278 [29:32:20<7:58:44, 3.91s/it] 79%|███████▊ | 26924/34278 [29:32:23<7:40:27, 3.76s/it] {'loss': 0.1177, 'grad_norm': 0.982851786473919, 'learning_rate': 1.1592709569584565e-06, 'epoch': 0.79} 79%|███████▊ | 26924/34278 [29:32:23<7:40:27, 3.76s/it] 79%|███████▊ | 26925/34278 [29:32:26<6:59:55, 3.43s/it] {'loss': 0.1159, 'grad_norm': 0.8147593573127733, 'learning_rate': 1.1589684863456723e-06, 'epoch': 0.79} 79%|███████▊ | 26925/34278 [29:32:26<6:59:55, 3.43s/it] 79%|███████▊ | 26926/34278 [29:32:29<6:52:37, 3.37s/it] {'loss': 0.1244, 'grad_norm': 0.8159315026056807, 'learning_rate': 1.1586660500247004e-06, 'epoch': 0.79} 79%|███████▊ | 26926/34278 [29:32:29<6:52:37, 3.37s/it] 79%|███████▊ | 26927/34278 [29:32:32<6:40:55, 3.27s/it] {'loss': 0.1345, 'grad_norm': 0.8041398174115109, 'learning_rate': 1.1583636479982384e-06, 'epoch': 0.79} 79%|███████▊ | 26927/34278 [29:32:32<6:40:55, 3.27s/it] 79%|███████▊ | 26928/34278 [29:32:37<7:40:38, 3.76s/it] {'loss': 0.0908, 'grad_norm': 0.7807428745584111, 'learning_rate': 1.1580612802689911e-06, 'epoch': 0.79} 79%|███████▊ | 26928/34278 [29:32:37<7:40:38, 3.76s/it] 79%|███████▊ | 26929/34278 [29:32:41<7:39:42, 3.75s/it] {'loss': 0.1256, 'grad_norm': 
0.7520185543281153, 'learning_rate': 1.157758946839656e-06, 'epoch': 0.79} 79%|███████▊ | 26929/34278 [29:32:41<7:39:42, 3.75s/it] 79%|███████▊ | 26930/34278 [29:32:45<7:40:11, 3.76s/it] {'loss': 0.1114, 'grad_norm': 0.7153975776300564, 'learning_rate': 1.1574566477129302e-06, 'epoch': 0.79} 79%|███████▊ | 26930/34278 [29:32:45<7:40:11, 3.76s/it] 79%|███████▊ | 26931/34278 [29:32:51<8:59:11, 4.40s/it] {'loss': 0.1145, 'grad_norm': 0.8062988737351608, 'learning_rate': 1.1571543828915155e-06, 'epoch': 0.79} 79%|███████▊ | 26931/34278 [29:32:51<8:59:11, 4.40s/it] 79%|███████▊ | 26932/34278 [29:32:54<8:37:31, 4.23s/it] {'loss': 0.0981, 'grad_norm': 1.0590870312759066, 'learning_rate': 1.1568521523781095e-06, 'epoch': 0.79} 79%|███████▊ | 26932/34278 [29:32:55<8:37:31, 4.23s/it] 79%|███████▊ | 26933/34278 [29:32:58<8:03:14, 3.95s/it] {'loss': 0.127, 'grad_norm': 0.8514193187990563, 'learning_rate': 1.1565499561754085e-06, 'epoch': 0.79} 79%|███████▊ | 26933/34278 [29:32:58<8:03:14, 3.95s/it] 79%|███████▊ | 26934/34278 [29:33:01<7:32:18, 3.70s/it] {'loss': 0.1207, 'grad_norm': 0.895190943725986, 'learning_rate': 1.1562477942861116e-06, 'epoch': 0.79} 79%|███████▊ | 26934/34278 [29:33:01<7:32:18, 3.70s/it] 79%|███████▊ | 26935/34278 [29:33:04<7:04:41, 3.47s/it] {'loss': 0.1045, 'grad_norm': 0.9450988864887407, 'learning_rate': 1.1559456667129183e-06, 'epoch': 0.79} 79%|███████▊ | 26935/34278 [29:33:04<7:04:41, 3.47s/it] 79%|███████▊ | 26936/34278 [29:33:08<7:12:30, 3.53s/it] {'loss': 0.1157, 'grad_norm': 1.0362955785684302, 'learning_rate': 1.1556435734585248e-06, 'epoch': 0.79} 79%|███████▊ | 26936/34278 [29:33:08<7:12:30, 3.53s/it] 79%|███████▊ | 26937/34278 [29:33:11<7:16:51, 3.57s/it] {'loss': 0.1101, 'grad_norm': 0.7935279419670792, 'learning_rate': 1.1553415145256259e-06, 'epoch': 0.79} 79%|███████▊ | 26937/34278 [29:33:11<7:16:51, 3.57s/it] 79%|███████▊ | 26938/34278 [29:33:14<7:02:33, 3.45s/it] {'loss': 0.1319, 'grad_norm': 0.9216892335011704, 'learning_rate': 
1.155039489916922e-06, 'epoch': 0.79} 79%|███████▊ | 26938/34278 [29:33:14<7:02:33, 3.45s/it] 79%|███████▊ | 26939/34278 [29:33:18<6:51:36, 3.37s/it] {'loss': 0.0903, 'grad_norm': 0.8454616031196389, 'learning_rate': 1.1547374996351063e-06, 'epoch': 0.79} 79%|███████▊ | 26939/34278 [29:33:18<6:51:36, 3.37s/it] 79%|███████▊ | 26940/34278 [29:33:21<6:51:25, 3.36s/it] {'loss': 0.1245, 'grad_norm': 0.6737115247528297, 'learning_rate': 1.1544355436828769e-06, 'epoch': 0.79} 79%|███████▊ | 26940/34278 [29:33:21<6:51:25, 3.36s/it] 79%|███████▊ | 26941/34278 [29:33:26<7:53:18, 3.87s/it] {'loss': 0.1044, 'grad_norm': 0.7819384496496615, 'learning_rate': 1.1541336220629285e-06, 'epoch': 0.79} 79%|███████▊ | 26941/34278 [29:33:26<7:53:18, 3.87s/it] 79%|███████▊ | 26942/34278 [29:33:31<8:55:05, 4.38s/it] {'loss': 0.103, 'grad_norm': 1.322076755898772, 'learning_rate': 1.1538317347779583e-06, 'epoch': 0.79} 79%|███████▊ | 26942/34278 [29:33:32<8:55:05, 4.38s/it] 79%|███████▊ | 26943/34278 [29:33:35<8:23:35, 4.12s/it] {'loss': 0.1466, 'grad_norm': 1.382774753900953, 'learning_rate': 1.1535298818306595e-06, 'epoch': 0.79} 79%|███████▊ | 26943/34278 [29:33:35<8:23:35, 4.12s/it] 79%|███████▊ | 26944/34278 [29:33:39<8:12:28, 4.03s/it] {'loss': 0.1186, 'grad_norm': 0.7743865283279714, 'learning_rate': 1.1532280632237269e-06, 'epoch': 0.79} 79%|███████▊ | 26944/34278 [29:33:39<8:12:28, 4.03s/it] 79%|███████▊ | 26945/34278 [29:33:43<8:04:53, 3.97s/it] {'loss': 0.1533, 'grad_norm': 0.9518823635543687, 'learning_rate': 1.1529262789598554e-06, 'epoch': 0.79} 79%|███████▊ | 26945/34278 [29:33:43<8:04:53, 3.97s/it] 79%|███████▊ | 26946/34278 [29:33:46<7:33:55, 3.71s/it] {'loss': 0.0983, 'grad_norm': 0.7107481831528828, 'learning_rate': 1.1526245290417415e-06, 'epoch': 0.79} 79%|███████▊ | 26946/34278 [29:33:46<7:33:55, 3.71s/it] 79%|███████▊ | 26947/34278 [29:33:49<7:12:35, 3.54s/it] {'loss': 0.1048, 'grad_norm': 1.0304808572182447, 'learning_rate': 1.152322813472076e-06, 'epoch': 0.79} 
79%|███████▊ | 26947/34278 [29:33:49<7:12:35, 3.54s/it] 79%|███████▊ | 26948/34278 [29:33:52<6:54:21, 3.39s/it] {'loss': 0.1142, 'grad_norm': 0.9304661820686831, 'learning_rate': 1.1520211322535552e-06, 'epoch': 0.79} 79%|███████▊ | 26948/34278 [29:33:52<6:54:21, 3.39s/it] 79%|███████▊ | 26949/34278 [29:33:55<6:46:50, 3.33s/it] {'loss': 0.1145, 'grad_norm': 0.9070036935799715, 'learning_rate': 1.1517194853888713e-06, 'epoch': 0.79} 79%|███████▊ | 26949/34278 [29:33:55<6:46:50, 3.33s/it] 79%|███████▊ | 26950/34278 [29:33:58<6:32:27, 3.21s/it] {'loss': 0.1046, 'grad_norm': 0.8650367570286813, 'learning_rate': 1.1514178728807151e-06, 'epoch': 0.79} 79%|███████▊ | 26950/34278 [29:33:58<6:32:27, 3.21s/it] 79%|███████▊ | 26951/34278 [29:34:02<7:06:22, 3.49s/it] {'loss': 0.1131, 'grad_norm': 0.7018217070502166, 'learning_rate': 1.1511162947317822e-06, 'epoch': 0.79} 79%|███████▊ | 26951/34278 [29:34:02<7:06:22, 3.49s/it] 79%|███████▊ | 26952/34278 [29:34:06<7:07:33, 3.50s/it] {'loss': 0.1079, 'grad_norm': 0.7367283180653517, 'learning_rate': 1.1508147509447653e-06, 'epoch': 0.79} 79%|███████▊ | 26952/34278 [29:34:06<7:07:33, 3.50s/it] 79%|███████▊ | 26953/34278 [29:34:09<7:13:04, 3.55s/it] {'loss': 0.1071, 'grad_norm': 0.8656438834038928, 'learning_rate': 1.1505132415223552e-06, 'epoch': 0.79} 79%|███████▊ | 26953/34278 [29:34:09<7:13:04, 3.55s/it] 79%|███████▊ | 26954/34278 [29:34:14<8:00:34, 3.94s/it] {'loss': 0.1095, 'grad_norm': 0.8423230424025819, 'learning_rate': 1.150211766467243e-06, 'epoch': 0.79} 79%|███████▊ | 26954/34278 [29:34:14<8:00:34, 3.94s/it] 79%|███████▊ | 26955/34278 [29:34:17<7:24:59, 3.65s/it] {'loss': 0.1265, 'grad_norm': 0.9235866939573659, 'learning_rate': 1.1499103257821226e-06, 'epoch': 0.79} 79%|███████▊ | 26955/34278 [29:34:17<7:24:59, 3.65s/it] 79%|███████▊ | 26956/34278 [29:34:20<6:54:10, 3.39s/it] {'loss': 0.1215, 'grad_norm': 0.7883921567853773, 'learning_rate': 1.149608919469683e-06, 'epoch': 0.79} 79%|███████▊ | 26956/34278 
[29:34:20<6:54:10, 3.39s/it] 79%|███████▊ | 26957/34278 [29:34:23<6:30:39, 3.20s/it] {'loss': 0.102, 'grad_norm': 0.7621126622420391, 'learning_rate': 1.1493075475326138e-06, 'epoch': 0.79} 79%|███████▊ | 26957/34278 [29:34:23<6:30:39, 3.20s/it] 79%|███████▊ | 26958/34278 [29:34:28<7:28:38, 3.68s/it] {'loss': 0.1274, 'grad_norm': 0.7508340717348726, 'learning_rate': 1.1490062099736098e-06, 'epoch': 0.79} 79%|███████▊ | 26958/34278 [29:34:28<7:28:38, 3.68s/it] 79%|███████▊ | 26959/34278 [29:34:31<7:09:35, 3.52s/it] {'loss': 0.1102, 'grad_norm': 1.2107192085602738, 'learning_rate': 1.1487049067953592e-06, 'epoch': 0.79} 79%|███████▊ | 26959/34278 [29:34:31<7:09:35, 3.52s/it] 79%|███████▊ | 26960/34278 [29:34:34<6:58:24, 3.43s/it] {'loss': 0.1143, 'grad_norm': 0.882740292413829, 'learning_rate': 1.1484036380005503e-06, 'epoch': 0.79} 79%|███████▊ | 26960/34278 [29:34:34<6:58:24, 3.43s/it] 79%|███████▊ | 26961/34278 [29:34:40<8:33:05, 4.21s/it] {'loss': 0.1205, 'grad_norm': 0.8018073132782089, 'learning_rate': 1.1481024035918763e-06, 'epoch': 0.79} 79%|███████▊ | 26961/34278 [29:34:40<8:33:05, 4.21s/it] 79%|███████▊ | 26962/34278 [29:34:44<8:14:17, 4.05s/it] {'loss': 0.1422, 'grad_norm': 0.7865662270094895, 'learning_rate': 1.1478012035720237e-06, 'epoch': 0.79} 79%|███████▊ | 26962/34278 [29:34:44<8:14:17, 4.05s/it] 79%|███████▊ | 26963/34278 [29:34:47<7:45:55, 3.82s/it] {'loss': 0.1238, 'grad_norm': 0.8155312475792762, 'learning_rate': 1.1475000379436818e-06, 'epoch': 0.79} 79%|███████▊ | 26963/34278 [29:34:47<7:45:55, 3.82s/it] 79%|███████▊ | 26964/34278 [29:34:50<7:19:19, 3.60s/it] {'loss': 0.0987, 'grad_norm': 0.9796092648509211, 'learning_rate': 1.147198906709539e-06, 'epoch': 0.79} 79%|███████▊ | 26964/34278 [29:34:50<7:19:19, 3.60s/it] 79%|███████▊ | 26965/34278 [29:34:53<7:11:02, 3.54s/it] {'loss': 0.1305, 'grad_norm': 1.0586389811067674, 'learning_rate': 1.1468978098722866e-06, 'epoch': 0.79} 79%|███████▊ | 26965/34278 [29:34:53<7:11:02, 3.54s/it] 
79%|███████▊ | 26966/34278 [29:34:57<7:00:58, 3.45s/it] {'loss': 0.0995, 'grad_norm': 0.7409045100530098, 'learning_rate': 1.1465967474346106e-06, 'epoch': 0.79} 79%|███████▊ | 26966/34278 [29:34:57<7:00:58, 3.45s/it] 79%|███████▊ | 26967/34278 [29:34:59<6:33:15, 3.23s/it] {'loss': 0.0857, 'grad_norm': 0.7780086121140247, 'learning_rate': 1.1462957193991975e-06, 'epoch': 0.79} 79%|███████▊ | 26967/34278 [29:34:59<6:33:15, 3.23s/it] 79%|███████▊ | 26968/34278 [29:35:03<6:46:04, 3.33s/it] {'loss': 0.1051, 'grad_norm': 0.837628663682243, 'learning_rate': 1.1459947257687376e-06, 'epoch': 0.79} 79%|███████▊ | 26968/34278 [29:35:03<6:46:04, 3.33s/it] 79%|███████▊ | 26969/34278 [29:35:06<6:36:45, 3.26s/it] {'loss': 0.121, 'grad_norm': 0.7236360158338214, 'learning_rate': 1.1456937665459156e-06, 'epoch': 0.79} 79%|███████▊ | 26969/34278 [29:35:06<6:36:45, 3.26s/it] 79%|███████▊ | 26970/34278 [29:35:09<6:24:50, 3.16s/it] {'loss': 0.117, 'grad_norm': 1.070127536202674, 'learning_rate': 1.1453928417334209e-06, 'epoch': 0.79} 79%|███████▊ | 26970/34278 [29:35:09<6:24:50, 3.16s/it] 79%|███████▊ | 26971/34278 [29:35:12<6:28:36, 3.19s/it] {'loss': 0.1167, 'grad_norm': 0.7734379500031032, 'learning_rate': 1.145091951333937e-06, 'epoch': 0.79} 79%|███████▊ | 26971/34278 [29:35:12<6:28:36, 3.19s/it] 79%|███████▊ | 26972/34278 [29:35:16<6:41:18, 3.30s/it] {'loss': 0.1132, 'grad_norm': 0.7271054501317741, 'learning_rate': 1.144791095350154e-06, 'epoch': 0.79} 79%|███████▊ | 26972/34278 [29:35:16<6:41:18, 3.30s/it] 79%|███████▊ | 26973/34278 [29:35:20<6:59:16, 3.44s/it] {'loss': 0.1153, 'grad_norm': 0.7722909978875346, 'learning_rate': 1.1444902737847553e-06, 'epoch': 0.79} 79%|███████▊ | 26973/34278 [29:35:20<6:59:16, 3.44s/it] 79%|███████▊ | 26974/34278 [29:35:23<7:05:58, 3.50s/it] {'loss': 0.1247, 'grad_norm': 1.0529311870098763, 'learning_rate': 1.1441894866404257e-06, 'epoch': 0.79} 79%|███████▊ | 26974/34278 [29:35:23<7:05:58, 3.50s/it]Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb129834310>
Failed to fetch sample 3588837.
Exception: cannot identify image file <_io.BytesIO object at 0x7fb129834310> 79%|███████▊ | 26975/34278 [29:35:26<6:54:20, 3.40s/it] {'loss': 0.1246, 'grad_norm': 1.1310731684381354, 'learning_rate': 1.1438887339198518e-06, 'epoch': 0.79} 79%|███████▊ | 26975/34278 [29:35:26<6:54:20, 3.40s/it] 79%|███████▊ | 26976/34278 [29:35:30<7:01:02, 3.46s/it] {'loss': 0.1233, 'grad_norm': 0.7147847864563611, 'learning_rate': 1.1435880156257206e-06, 'epoch': 0.79} 79%|███████▊ | 26976/34278 [29:35:30<7:01:02, 3.46s/it] 79%|███████▊ | 26977/34278 [29:35:33<6:46:42, 3.34s/it] {'loss': 0.1041, 'grad_norm': 0.8413088562452513, 'learning_rate': 1.143287331760713e-06, 'epoch': 0.79} 79%|███████▊ | 26977/34278 [29:35:33<6:46:42, 3.34s/it] 79%|███████▊ | 26978/34278 [29:35:37<6:53:06, 3.40s/it] {'loss': 0.1276, 'grad_norm': 0.9282174368199931, 'learning_rate': 1.142986682327517e-06, 'epoch': 0.79} 79%|███████▊ | 26978/34278 [29:35:37<6:53:06, 3.40s/it] 79%|███████▊ | 26979/34278 [29:35:39<6:31:48, 3.22s/it] {'loss': 0.1028, 'grad_norm': 0.8712969838419217, 'learning_rate': 1.1426860673288153e-06, 'epoch': 0.79} 79%|███████▊ | 26979/34278 [29:35:39<6:31:48, 3.22s/it] 79%|███████▊ | 26980/34278 [29:35:43<6:52:17, 3.39s/it] {'loss': 0.1414, 'grad_norm': 1.1435555554080257, 'learning_rate': 1.14238548676729e-06, 'epoch': 0.79} 79%|███████▊ | 26980/34278 [29:35:43<6:52:17, 3.39s/it] 79%|███████▊ | 26981/34278 [29:35:47<6:57:17, 3.43s/it] {'loss': 0.1013, 'grad_norm': 0.7130729393273794, 'learning_rate': 1.1420849406456263e-06, 'epoch': 0.79} 79%|███████▊ | 26981/34278 [29:35:47<6:57:17, 3.43s/it] 79%|███████▊ | 26982/34278 [29:35:50<6:39:24, 3.28s/it] {'loss': 0.1216, 'grad_norm': 1.1305725242569322, 'learning_rate': 1.1417844289665091e-06, 'epoch': 0.79} 79%|███████▊ | 26982/34278 [29:35:50<6:39:24, 3.28s/it] 79%|███████▊ | 26983/34278 [29:35:56<8:14:18, 4.07s/it] {'loss': 0.1122, 'grad_norm': 0.7932486868097797, 'learning_rate': 1.1414839517326192e-06, 'epoch': 0.79} 79%|███████▊ | 
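The traceback above shows a corrupt or unreadable image in the dataset: `Image.open` raises `PIL.UnidentifiedImageError`, and the loader catches it, logs "Failed to fetch sample 3588837.", and the run continues. A minimal sketch of that guard pattern, with hypothetical names (the real `dataset.py` is not reproduced here):

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader(img_bytes: bytes) -> Image.Image:
    # Mirror of the failing call chain in the traceback: decode an
    # in-memory byte buffer as an image.
    buff = io.BytesIO(img_bytes)
    img = Image.open(buff)
    return img.convert("RGB")


def load_sample(samples, index):
    # The log shows the dataset catching the decode error instead of
    # crashing the run; a caller can then retry with a different index.
    try:
        return pil_loader(samples[index])
    except UnidentifiedImageError as exc:
        print(f"Failed to fetch sample {index}.\nException: {exc}")
        return None
```

Returning `None` (rather than re-raising) lets a `__getitem__` wrapper resample another index, which matches the behavior visible in the log: one bad sample produces a message but training proceeds to step 26975.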
26983/34278 [29:35:56<8:14:18, 4.07s/it] 79%|███████▊ | 26984/34278 [29:35:58<7:30:26, 3.71s/it] {'loss': 0.1119, 'grad_norm': 0.8054079701445868, 'learning_rate': 1.1411835089466377e-06, 'epoch': 0.79} 79%|███████▊ | 26984/34278 [29:35:58<7:30:26, 3.71s/it] 79%|███████▊ | 26985/34278 [29:36:02<7:20:09, 3.62s/it] {'loss': 0.1151, 'grad_norm': 0.9864994701651784, 'learning_rate': 1.1408831006112504e-06, 'epoch': 0.79} 79%|███████▊ | 26985/34278 [29:36:02<7:20:09, 3.62s/it] 79%|███████▊ | 26986/34278 [29:36:05<7:17:28, 3.60s/it] {'loss': 0.1227, 'grad_norm': 0.8039319204841343, 'learning_rate': 1.1405827267291376e-06, 'epoch': 0.79} 79%|███████▊ | 26986/34278 [29:36:05<7:17:28, 3.60s/it] 79%|███████▊ | 26987/34278 [29:36:09<7:23:15, 3.65s/it] {'loss': 0.1473, 'grad_norm': 1.0690657532685257, 'learning_rate': 1.1402823873029778e-06, 'epoch': 0.79} 79%|███████▊ | 26987/34278 [29:36:09<7:23:15, 3.65s/it] 79%|███████▊ | 26988/34278 [29:36:12<7:07:37, 3.52s/it] {'loss': 0.118, 'grad_norm': 2.349516417868194, 'learning_rate': 1.1399820823354584e-06, 'epoch': 0.79} 79%|███████▊ | 26988/34278 [29:36:12<7:07:37, 3.52s/it] 79%|███████▊ | 26989/34278 [29:36:16<7:10:29, 3.54s/it] {'loss': 0.1093, 'grad_norm': 0.9688472133294002, 'learning_rate': 1.139681811829258e-06, 'epoch': 0.79} 79%|███████▊ | 26989/34278 [29:36:16<7:10:29, 3.54s/it] 79%|███████▊ | 26990/34278 [29:36:22<8:38:32, 4.27s/it] {'loss': 0.1307, 'grad_norm': 0.8275472845456364, 'learning_rate': 1.1393815757870546e-06, 'epoch': 0.79} 79%|███████▊ | 26990/34278 [29:36:22<8:38:32, 4.27s/it] 79%|███████▊ | 26991/34278 [29:36:25<8:05:28, 4.00s/it] {'loss': 0.113, 'grad_norm': 0.9310817924852001, 'learning_rate': 1.1390813742115332e-06, 'epoch': 0.79} 79%|███████▊ | 26991/34278 [29:36:25<8:05:28, 4.00s/it] 79%|███████▊ | 26992/34278 [29:36:28<7:27:54, 3.69s/it] {'loss': 0.1116, 'grad_norm': 0.9557097175977453, 'learning_rate': 1.1387812071053706e-06, 'epoch': 0.79} 79%|███████▊ | 26992/34278 [29:36:28<7:27:54, 3.69s/it] 
79%|███████▊ | 26993/34278 [29:36:32<7:21:13, 3.63s/it] {'loss': 0.105, 'grad_norm': 0.8246996114442896, 'learning_rate': 1.1384810744712471e-06, 'epoch': 0.79} 79%|███████▊ | 26993/34278 [29:36:32<7:21:13, 3.63s/it] 79%|███████▉ | 26994/34278 [29:36:36<7:46:34, 3.84s/it] {'loss': 0.1361, 'grad_norm': 0.8743406944167271, 'learning_rate': 1.1381809763118424e-06, 'epoch': 0.79} 79%|███████▉ | 26994/34278 [29:36:36<7:46:34, 3.84s/it] 79%|███████▉ | 26995/34278 [29:36:41<8:15:59, 4.09s/it] {'loss': 0.1231, 'grad_norm': 1.1043558299341243, 'learning_rate': 1.1378809126298373e-06, 'epoch': 0.79} 79%|███████▉ | 26995/34278 [29:36:41<8:15:59, 4.09s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (9414 > 8192). Running this sequence through the model will result in indexing errors 79%|███████▉ | 26996/34278 [29:36:45<8:39:36, 4.28s/it] {'loss': 0.1121, 'grad_norm': 1.0758464004068735, 'learning_rate': 1.1375808834279095e-06, 'epoch': 0.79} 79%|███████▉ | 26996/34278 [29:36:45<8:39:36, 4.28s/it] 79%|███████▉ | 26997/34278 [29:36:48<7:54:51, 3.91s/it] {'loss': 0.1036, 'grad_norm': 0.931716048355612, 'learning_rate': 1.137280888708736e-06, 'epoch': 0.79} 79%|███████▉ | 26997/34278 [29:36:49<7:54:51, 3.91s/it] 79%|███████▉ | 26998/34278 [29:36:52<7:23:29, 3.66s/it] {'loss': 0.0965, 'grad_norm': 0.7779312814316556, 'learning_rate': 1.1369809284749982e-06, 'epoch': 0.79} 79%|███████▉ | 26998/34278 [29:36:52<7:23:29, 3.66s/it] 79%|███████▉ | 26999/34278 [29:36:55<7:12:32, 3.57s/it] {'loss': 0.1458, 'grad_norm': 1.0376133259115237, 'learning_rate': 1.1366810027293711e-06, 'epoch': 0.79} 79%|███████▉ | 26999/34278 [29:36:55<7:12:32, 3.57s/it] 79%|███████▉ | 27000/34278 [29:36:58<7:01:14, 3.47s/it] {'loss': 0.1313, 'grad_norm': 1.1522640271177464, 'learning_rate': 1.1363811114745354e-06, 'epoch': 0.79} 79%|███████▉ | 27000/34278 [29:36:58<7:01:14, 3.47s/it] 79%|███████▉ | 27001/34278 [29:37:01<6:49:05, 3.37s/it] {'loss': 0.108, 
'grad_norm': 0.8182592835601398, 'learning_rate': 1.1360812547131655e-06, 'epoch': 0.79} 79%|███████▉ | 27001/34278 [29:37:01<6:49:05, 3.37s/it] 79%|███████▉ | 27002/34278 [29:37:04<6:40:58, 3.31s/it] {'loss': 0.1443, 'grad_norm': 1.152048894714544, 'learning_rate': 1.135781432447941e-06, 'epoch': 0.79} 79%|███████▉ | 27002/34278 [29:37:04<6:40:58, 3.31s/it] 79%|███████▉ | 27003/34278 [29:37:08<6:33:30, 3.25s/it] {'loss': 0.1191, 'grad_norm': 0.884579195459697, 'learning_rate': 1.135481644681537e-06, 'epoch': 0.79} 79%|███████▉ | 27003/34278 [29:37:08<6:33:30, 3.25s/it] 79%|███████▉ | 27004/34278 [29:37:12<7:03:27, 3.49s/it] {'loss': 0.1037, 'grad_norm': 0.7527828077355831, 'learning_rate': 1.135181891416629e-06, 'epoch': 0.79} 79%|███████▉ | 27004/34278 [29:37:12<7:03:27, 3.49s/it] 79%|███████▉ | 27005/34278 [29:37:16<7:48:42, 3.87s/it] {'loss': 0.1116, 'grad_norm': 0.8200120988297046, 'learning_rate': 1.1348821726558951e-06, 'epoch': 0.79} 79%|███████▉ | 27005/34278 [29:37:16<7:48:42, 3.87s/it] 79%|███████▉ | 27006/34278 [29:37:22<9:06:39, 4.51s/it] {'loss': 0.1083, 'grad_norm': 0.8972028511074547, 'learning_rate': 1.1345824884020113e-06, 'epoch': 0.79} 79%|███████▉ | 27006/34278 [29:37:22<9:06:39, 4.51s/it] 79%|███████▉ | 27007/34278 [29:37:28<9:42:44, 4.81s/it] {'loss': 0.1122, 'grad_norm': 0.855428936458742, 'learning_rate': 1.134282838657651e-06, 'epoch': 0.79} 79%|███████▉ | 27007/34278 [29:37:28<9:42:44, 4.81s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
79%|███████▉ | 27008/34278 [29:37:31<8:43:08, 4.32s/it] {'loss': 0.1092, 'grad_norm': 0.7133366751370145, 'learning_rate': 1.133983223425492e-06, 'epoch': 0.79} 79%|███████▉ | 27008/34278 [29:37:31<8:43:08, 4.32s/it] 79%|███████▉ | 27009/34278 [29:37:34<7:58:09, 3.95s/it] {'loss': 0.1307, 'grad_norm': 0.8798477211755299, 'learning_rate': 1.1336836427082083e-06, 'epoch': 0.79} 79%|███████▉ | 27009/34278 [29:37:34<7:58:09, 3.95s/it] 79%|███████▉ | 27010/34278 [29:37:37<7:19:10, 3.63s/it] {'loss': 0.1097, 'grad_norm': 0.8665858198878355, 'learning_rate': 1.1333840965084725e-06, 'epoch': 0.79} 79%|███████▉ | 27010/34278 [29:37:37<7:19:10, 3.63s/it] 79%|███████▉ | 27011/34278 [29:37:40<7:07:00, 3.53s/it] {'loss': 0.0864, 'grad_norm': 0.7480220082472291, 'learning_rate': 1.1330845848289606e-06, 'epoch': 0.79} 79%|███████▉ | 27011/34278 [29:37:40<7:07:00, 3.53s/it] 79%|███████▉ | 27012/34278 [29:37:43<6:52:57, 3.41s/it] {'loss': 0.1112, 'grad_norm': 0.7079429673161648, 'learning_rate': 1.1327851076723473e-06, 'epoch': 0.79} 79%|███████▉ | 27012/34278 [29:37:43<6:52:57, 3.41s/it] 79%|███████▉ | 27013/34278 [29:37:46<6:36:58, 3.28s/it] {'loss': 0.1049, 'grad_norm': 1.383172043736597, 'learning_rate': 1.1324856650413057e-06, 'epoch': 0.79} 79%|███████▉ | 27013/34278 [29:37:46<6:36:58, 3.28s/it] 79%|███████▉ | 27014/34278 [29:37:52<7:53:24, 3.91s/it] {'loss': 0.1247, 'grad_norm': 0.8373019806538111, 'learning_rate': 1.132186256938508e-06, 'epoch': 0.79} 79%|███████▉ | 27014/34278 [29:37:52<7:53:24, 3.91s/it] 79%|███████▉ | 27015/34278 [29:37:58<9:19:16, 4.62s/it] {'loss': 0.1167, 'grad_norm': 0.771620531410981, 'learning_rate': 1.1318868833666286e-06, 'epoch': 0.79} 79%|███████▉ | 27015/34278 [29:37:58<9:19:16, 4.62s/it] 79%|███████▉ | 27016/34278 [29:38:01<8:18:05, 4.12s/it] {'loss': 0.1064, 'grad_norm': 0.9316198499289414, 'learning_rate': 1.1315875443283396e-06, 'epoch': 0.79} 79%|███████▉ | 27016/34278 [29:38:01<8:18:05, 4.12s/it]
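The tokenizer warning interleaved in this log ("Token indices sequence length is longer than the specified maximum sequence length for this model (9414 > 8192)") means at least one packed sample exceeds the model's context window and would index past the position embeddings. A minimal defensive sketch, assuming a plain `input_ids`/`labels` pair; `MAX_LEN` and `IGNORE_INDEX` are illustrative names, not identifiers from this training script:

```python
# Truncate (rather than forward) samples longer than the context window.
# MAX_LEN matches the limit in the warning; IGNORE_INDEX is the usual
# label value ignored by the loss. Both names are assumptions here.
MAX_LEN = 8192
IGNORE_INDEX = -100

def clamp_sample(input_ids, labels, max_len=MAX_LEN):
    """Truncate an overlong (input_ids, labels) pair to max_len tokens."""
    if len(input_ids) <= max_len:
        return input_ids, labels
    return input_ids[:max_len], labels[:max_len]

# a 9414-token sample, as in the warning, clamped to the window
ids, labs = clamp_sample(list(range(9414)), [IGNORE_INDEX] * 9414)
print(len(ids))  # 8192
```

Dropping the sample outright (or splitting it) is an equally valid policy; the important part is deciding before the batch reaches the model.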
79%|███████▉ | 27017/34278 [29:38:04<7:50:42, 3.89s/it] {'loss': 0.1145, 'grad_norm': 0.8558932039206868, 'learning_rate': 1.1312882398263114e-06, 'epoch': 0.79} 79%|███████▉ | 27017/34278 [29:38:04<7:50:42, 3.89s/it] 79%|███████▉ | 27018/34278 [29:38:08<7:57:18, 3.94s/it] {'loss': 0.0945, 'grad_norm': 0.8262050949069694, 'learning_rate': 1.130988969863221e-06, 'epoch': 0.79} 79%|███████▉ | 27018/34278 [29:38:08<7:57:18, 3.94s/it] 79%|███████▉ | 27019/34278 [29:38:12<7:26:52, 3.69s/it] {'loss': 0.1219, 'grad_norm': 1.0459089058686855, 'learning_rate': 1.1306897344417373e-06, 'epoch': 0.79} 79%|███████▉ | 27019/34278 [29:38:12<7:26:52, 3.69s/it] 79%|███████▉ | 27020/34278 [29:38:15<7:14:14, 3.59s/it] {'loss': 0.098, 'grad_norm': 0.9251468084913778, 'learning_rate': 1.1303905335645304e-06, 'epoch': 0.79} 79%|███████▉ | 27020/34278 [29:38:15<7:14:14, 3.59s/it] 79%|███████▉ | 27021/34278 [29:38:18<6:59:43, 3.47s/it] {'loss': 0.1029, 'grad_norm': 0.7679005529198607, 'learning_rate': 1.1300913672342744e-06, 'epoch': 0.79} 79%|███████▉ | 27021/34278 [29:38:18<6:59:43, 3.47s/it] 79%|███████▉ | 27022/34278 [29:38:21<6:51:01, 3.40s/it] {'loss': 0.1487, 'grad_norm': 0.86234896637721, 'learning_rate': 1.1297922354536396e-06, 'epoch': 0.79} 79%|███████▉ | 27022/34278 [29:38:21<6:51:01, 3.40s/it] 79%|███████▉ | 27023/34278 [29:38:25<7:06:35, 3.53s/it] {'loss': 0.1192, 'grad_norm': 0.6475395648456037, 'learning_rate': 1.1294931382252932e-06, 'epoch': 0.79} 79%|███████▉ | 27023/34278 [29:38:25<7:06:35, 3.53s/it] 79%|███████▉ | 27024/34278 [29:38:28<6:49:55, 3.39s/it] {'loss': 0.1038, 'grad_norm': 0.962154914870367, 'learning_rate': 1.1291940755519092e-06, 'epoch': 0.79} 79%|███████▉ | 27024/34278 [29:38:28<6:49:55, 3.39s/it] 79%|███████▉ | 27025/34278 [29:38:31<6:36:01, 3.28s/it] {'loss': 0.1077, 'grad_norm': 0.8348365296359087, 'learning_rate': 1.128895047436157e-06, 'epoch': 0.79} 79%|███████▉ | 27025/34278 [29:38:31<6:36:01, 3.28s/it] 79%|███████▉ | 27026/34278 
[29:38:36<7:17:03, 3.62s/it] {'loss': 0.1187, 'grad_norm': 0.7840105408076355, 'learning_rate': 1.1285960538807066e-06, 'epoch': 0.79} 79%|███████▉ | 27026/34278 [29:38:36<7:17:03, 3.62s/it] 79%|███████▉ | 27027/34278 [29:38:39<7:16:48, 3.61s/it] {'loss': 0.1142, 'grad_norm': 0.8347802697343071, 'learning_rate': 1.1282970948882243e-06, 'epoch': 0.79} 79%|███████▉ | 27027/34278 [29:38:39<7:16:48, 3.61s/it] 79%|███████▉ | 27028/34278 [29:38:44<8:09:12, 4.05s/it] {'loss': 0.1143, 'grad_norm': 0.8759492568057259, 'learning_rate': 1.1279981704613828e-06, 'epoch': 0.79} 79%|███████▉ | 27028/34278 [29:38:44<8:09:12, 4.05s/it] 79%|███████▉ | 27029/34278 [29:38:48<7:58:17, 3.96s/it] {'loss': 0.1071, 'grad_norm': 0.8110954759711658, 'learning_rate': 1.1276992806028485e-06, 'epoch': 0.79} 79%|███████▉ | 27029/34278 [29:38:48<7:58:17, 3.96s/it] 79%|███████▉ | 27030/34278 [29:38:54<9:12:49, 4.58s/it] {'loss': 0.1124, 'grad_norm': 0.7398886339637835, 'learning_rate': 1.1274004253152914e-06, 'epoch': 0.79} 79%|███████▉ | 27030/34278 [29:38:54<9:12:49, 4.58s/it] 79%|███████▉ | 27031/34278 [29:38:57<8:26:19, 4.19s/it] {'loss': 0.116, 'grad_norm': 0.7910840109521887, 'learning_rate': 1.1271016046013778e-06, 'epoch': 0.79} 79%|███████▉ | 27031/34278 [29:38:57<8:26:19, 4.19s/it] 79%|███████▉ | 27032/34278 [29:39:01<8:02:35, 4.00s/it] {'loss': 0.1322, 'grad_norm': 1.0890123327402674, 'learning_rate': 1.126802818463778e-06, 'epoch': 0.79} 79%|███████▉ | 27032/34278 [29:39:01<8:02:35, 4.00s/it] 79%|███████▉ | 27033/34278 [29:39:06<8:32:11, 4.24s/it] {'loss': 0.1241, 'grad_norm': 0.6688060214778, 'learning_rate': 1.1265040669051581e-06, 'epoch': 0.79} 79%|███████▉ | 27033/34278 [29:39:06<8:32:11, 4.24s/it] 79%|███████▉ | 27034/34278 [29:39:09<7:41:49, 3.83s/it] {'loss': 0.1193, 'grad_norm': 0.7722276746876664, 'learning_rate': 1.1262053499281833e-06, 'epoch': 0.79} 79%|███████▉ | 27034/34278 [29:39:09<7:41:49, 3.83s/it] 79%|███████▉ | 27035/34278 [29:39:12<7:23:31, 3.67s/it] {'loss': 
0.1218, 'grad_norm': 0.7518688019620501, 'learning_rate': 1.1259066675355224e-06, 'epoch': 0.79} 79%|███████▉ | 27035/34278 [29:39:12<7:23:31, 3.67s/it] 79%|███████▉ | 27036/34278 [29:39:15<7:00:04, 3.48s/it] {'loss': 0.1408, 'grad_norm': 0.7614123499987674, 'learning_rate': 1.1256080197298437e-06, 'epoch': 0.79} 79%|███████▉ | 27036/34278 [29:39:15<7:00:04, 3.48s/it] 79%|███████▉ | 27037/34278 [29:39:18<7:00:38, 3.49s/it] {'loss': 0.1338, 'grad_norm': 0.7663154896064904, 'learning_rate': 1.1253094065138105e-06, 'epoch': 0.79} 79%|███████▉ | 27037/34278 [29:39:18<7:00:38, 3.49s/it] 79%|███████▉ | 27038/34278 [29:39:22<6:55:29, 3.44s/it] {'loss': 0.132, 'grad_norm': 0.8704325046926882, 'learning_rate': 1.1250108278900906e-06, 'epoch': 0.79} 79%|███████▉ | 27038/34278 [29:39:22<6:55:29, 3.44s/it] 79%|███████▉ | 27039/34278 [29:39:25<6:33:55, 3.26s/it] {'loss': 0.1017, 'grad_norm': 0.7366138270196971, 'learning_rate': 1.12471228386135e-06, 'epoch': 0.79} 79%|███████▉ | 27039/34278 [29:39:25<6:33:55, 3.26s/it] 79%|███████▉ | 27040/34278 [29:39:28<6:42:57, 3.34s/it] {'loss': 0.1171, 'grad_norm': 0.7670380610852506, 'learning_rate': 1.1244137744302508e-06, 'epoch': 0.79} 79%|███████▉ | 27040/34278 [29:39:28<6:42:57, 3.34s/it] 79%|███████▉ | 27041/34278 [29:39:31<6:19:52, 3.15s/it] {'loss': 0.0941, 'grad_norm': 0.845280063715469, 'learning_rate': 1.1241152995994603e-06, 'epoch': 0.79} 79%|███████▉ | 27041/34278 [29:39:31<6:19:52, 3.15s/it] 79%|███████▉ | 27042/34278 [29:39:34<6:09:07, 3.06s/it] {'loss': 0.1026, 'grad_norm': 0.7234602302032337, 'learning_rate': 1.1238168593716448e-06, 'epoch': 0.79} 79%|███████▉ | 27042/34278 [29:39:34<6:09:07, 3.06s/it] 79%|███████▉ | 27043/34278 [29:39:37<6:19:53, 3.15s/it] {'loss': 0.1067, 'grad_norm': 0.755873136293606, 'learning_rate': 1.123518453749467e-06, 'epoch': 0.79} 79%|███████▉ | 27043/34278 [29:39:37<6:19:53, 3.15s/it] 79%|███████▉ | 27044/34278 [29:39:41<6:38:47, 3.31s/it] {'loss': 0.1141, 'grad_norm': 0.7218157111755862, 
'learning_rate': 1.12322008273559e-06, 'epoch': 0.79} 79%|███████▉ | 27044/34278 [29:39:41<6:38:47, 3.31s/it] 79%|███████▉ | 27045/34278 [29:39:44<6:25:46, 3.20s/it] {'loss': 0.1219, 'grad_norm': 0.9889466128783103, 'learning_rate': 1.1229217463326798e-06, 'epoch': 0.79} 79%|███████▉ | 27045/34278 [29:39:44<6:25:46, 3.20s/it] 79%|███████▉ | 27046/34278 [29:39:48<6:49:37, 3.40s/it] {'loss': 0.1123, 'grad_norm': 0.7903896519892492, 'learning_rate': 1.1226234445433987e-06, 'epoch': 0.79} 79%|███████▉ | 27046/34278 [29:39:48<6:49:37, 3.40s/it] 79%|███████▉ | 27047/34278 [29:39:51<7:02:49, 3.51s/it] {'loss': 0.1025, 'grad_norm': 0.9271489373101308, 'learning_rate': 1.1223251773704069e-06, 'epoch': 0.79} 79%|███████▉ | 27047/34278 [29:39:51<7:02:49, 3.51s/it] 79%|███████▉ | 27048/34278 [29:39:57<8:29:33, 4.23s/it] {'loss': 0.1173, 'grad_norm': 0.7319543777838504, 'learning_rate': 1.1220269448163735e-06, 'epoch': 0.79} 79%|███████▉ | 27048/34278 [29:39:57<8:29:33, 4.23s/it] 79%|███████▉ | 27049/34278 [29:40:00<7:53:01, 3.93s/it] {'loss': 0.1124, 'grad_norm': 0.7077700265266096, 'learning_rate': 1.121728746883957e-06, 'epoch': 0.79} 79%|███████▉ | 27049/34278 [29:40:00<7:53:01, 3.93s/it] 79%|███████▉ | 27050/34278 [29:40:04<7:34:06, 3.77s/it] {'loss': 0.1071, 'grad_norm': 1.1161121398535105, 'learning_rate': 1.1214305835758194e-06, 'epoch': 0.79} 79%|███████▉ | 27050/34278 [29:40:04<7:34:06, 3.77s/it] 79%|███████▉ | 27051/34278 [29:40:08<7:50:54, 3.91s/it] {'loss': 0.1144, 'grad_norm': 0.7895281541789846, 'learning_rate': 1.1211324548946255e-06, 'epoch': 0.79} 79%|███████▉ | 27051/34278 [29:40:08<7:50:54, 3.91s/it] 79%|███████▉ | 27052/34278 [29:40:11<7:03:02, 3.51s/it] {'loss': 0.1022, 'grad_norm': 0.7774085888223303, 'learning_rate': 1.1208343608430344e-06, 'epoch': 0.79} 79%|███████▉ | 27052/34278 [29:40:11<7:03:02, 3.51s/it] 79%|███████▉ | 27053/34278 [29:40:13<6:34:40, 3.28s/it] {'loss': 0.1278, 'grad_norm': 0.8718471687330965, 'learning_rate': 1.1205363014237075e-06, 
'epoch': 0.79} 79%|███████▉ | 27053/34278 [29:40:13<6:34:40, 3.28s/it] 79%|███████▉ | 27054/34278 [29:40:17<6:33:48, 3.27s/it] {'loss': 0.1089, 'grad_norm': 0.8811875020140885, 'learning_rate': 1.1202382766393056e-06, 'epoch': 0.79} 79%|███████▉ | 27054/34278 [29:40:17<6:33:48, 3.27s/it] 79%|███████▉ | 27055/34278 [29:40:20<6:33:12, 3.27s/it] {'loss': 0.1158, 'grad_norm': 0.8360053810229248, 'learning_rate': 1.119940286492492e-06, 'epoch': 0.79} 79%|███████▉ | 27055/34278 [29:40:20<6:33:12, 3.27s/it] 79%|███████▉ | 27056/34278 [29:40:23<6:24:45, 3.20s/it] {'loss': 0.1254, 'grad_norm': 0.9734365039401242, 'learning_rate': 1.119642330985925e-06, 'epoch': 0.79} 79%|███████▉ | 27056/34278 [29:40:23<6:24:45, 3.20s/it] 79%|███████▉ | 27057/34278 [29:40:26<6:17:22, 3.14s/it] {'loss': 0.1349, 'grad_norm': 0.868937374812772, 'learning_rate': 1.1193444101222639e-06, 'epoch': 0.79} 79%|███████▉ | 27057/34278 [29:40:26<6:17:22, 3.14s/it] 79%|███████▉ | 27058/34278 [29:40:30<6:42:14, 3.34s/it] {'loss': 0.1136, 'grad_norm': 0.9714343484952161, 'learning_rate': 1.119046523904171e-06, 'epoch': 0.79} 79%|███████▉ | 27058/34278 [29:40:30<6:42:14, 3.34s/it] 79%|███████▉ | 27059/34278 [29:40:33<6:50:04, 3.41s/it] {'loss': 0.1113, 'grad_norm': 1.0384370064192667, 'learning_rate': 1.1187486723343027e-06, 'epoch': 0.79} 79%|███████▉ | 27059/34278 [29:40:33<6:50:04, 3.41s/it] 79%|███████▉ | 27060/34278 [29:40:37<6:50:41, 3.41s/it] {'loss': 0.1324, 'grad_norm': 0.8904081157750814, 'learning_rate': 1.1184508554153207e-06, 'epoch': 0.79} 79%|███████▉ | 27060/34278 [29:40:37<6:50:41, 3.41s/it] 79%|███████▉ | 27061/34278 [29:40:40<6:33:38, 3.27s/it] {'loss': 0.1195, 'grad_norm': 0.6980593416909135, 'learning_rate': 1.118153073149882e-06, 'epoch': 0.79} 79%|███████▉ | 27061/34278 [29:40:40<6:33:38, 3.27s/it] 79%|███████▉ | 27062/34278 [29:40:43<6:25:04, 3.20s/it] {'loss': 0.1266, 'grad_norm': 0.8183181824070184, 'learning_rate': 1.1178553255406471e-06, 'epoch': 0.79} 79%|███████▉ | 27062/34278 
[29:40:43<6:25:04, 3.20s/it] 79%|███████▉ | 27063/34278 [29:40:46<6:28:19, 3.23s/it] {'loss': 0.1235, 'grad_norm': 0.7674492847616166, 'learning_rate': 1.1175576125902732e-06, 'epoch': 0.79} 79%|███████▉ | 27063/34278 [29:40:46<6:28:19, 3.23s/it] 79%|███████▉ | 27064/34278 [29:40:49<6:28:49, 3.23s/it] {'loss': 0.1234, 'grad_norm': 1.0686941219648478, 'learning_rate': 1.1172599343014167e-06, 'epoch': 0.79} 79%|███████▉ | 27064/34278 [29:40:49<6:28:49, 3.23s/it] 79%|███████▉ | 27065/34278 [29:40:52<6:19:20, 3.16s/it] {'loss': 0.0975, 'grad_norm': 0.9257269730998541, 'learning_rate': 1.1169622906767368e-06, 'epoch': 0.79} 79%|███████▉ | 27065/34278 [29:40:52<6:19:20, 3.16s/it] 79%|███████▉ | 27066/34278 [29:40:55<6:02:14, 3.01s/it] {'loss': 0.1322, 'grad_norm': 0.7989338259893596, 'learning_rate': 1.116664681718892e-06, 'epoch': 0.79} 79%|███████▉ | 27066/34278 [29:40:55<6:02:14, 3.01s/it] 79%|███████▉ | 27067/34278 [29:40:58<6:10:40, 3.08s/it] {'loss': 0.0992, 'grad_norm': 0.9501551133211303, 'learning_rate': 1.1163671074305365e-06, 'epoch': 0.79} 79%|███████▉ | 27067/34278 [29:40:58<6:10:40, 3.08s/it] 79%|███████▉ | 27068/34278 [29:41:04<7:54:12, 3.95s/it] {'loss': 0.0976, 'grad_norm': 0.8220297913394656, 'learning_rate': 1.1160695678143297e-06, 'epoch': 0.79} 79%|███████▉ | 27068/34278 [29:41:04<7:54:12, 3.95s/it] 79%|███████▉ | 27069/34278 [29:41:07<7:18:50, 3.65s/it] {'loss': 0.0971, 'grad_norm': 0.7420781942638193, 'learning_rate': 1.1157720628729264e-06, 'epoch': 0.79} 79%|███████▉ | 27069/34278 [29:41:07<7:18:50, 3.65s/it] 79%|███████▉ | 27070/34278 [29:41:10<6:48:47, 3.40s/it] {'loss': 0.1144, 'grad_norm': 0.7748818413562756, 'learning_rate': 1.1154745926089816e-06, 'epoch': 0.79} 79%|███████▉ | 27070/34278 [29:41:10<6:48:47, 3.40s/it] 79%|███████▉ | 27071/34278 [29:41:16<8:35:35, 4.29s/it] {'loss': 0.0945, 'grad_norm': 0.7086597366847097, 'learning_rate': 1.1151771570251524e-06, 'epoch': 0.79} 79%|███████▉ | 27071/34278 [29:41:16<8:35:35, 4.29s/it] 
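The `torch.utils.checkpoint` UserWarning that recurs in this log ("None of the inputs have requires_grad=True. Gradients will be None") typically appears when gradient checkpointing wraps a segment whose inputs are detached from the graph. A small reproduction-and-remedy sketch in plain PyTorch; the `Linear` layer is a stand-in for a transformer block, and note that Transformers exposes `model.enable_input_require_grads()` for the same fix:

```python
import torch
from torch.utils import checkpoint

# torch.utils.checkpoint warns (and yields None grads) when no checkpointed
# input requires grad, e.g. when checkpointed blocks are fed by a detached
# embedding output. Marking the segment input as requiring grad restores
# backprop through the checkpointed segment.
layer = torch.nn.Linear(4, 4)   # stand-in for a checkpointed transformer block
x = torch.randn(2, 4)           # requires_grad=False here would trigger the warning
x.requires_grad_(True)          # the remedy
y = checkpoint.checkpoint(layer, x, use_reentrant=True)
y.sum().backward()
print(x.grad is not None)  # True
```

The warning is usually harmless for frozen inputs, but under full fine-tuning it signals that the checkpointed segment contributes no gradients.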
79%|███████▉ | 27072/34278 [29:41:20<8:27:05, 4.22s/it] {'loss': 0.1112, 'grad_norm': 0.9554744825537503, 'learning_rate': 1.1148797561240954e-06, 'epoch': 0.79} 79%|███████▉ | 27072/34278 [29:41:20<8:27:05, 4.22s/it] 79%|███████▉ | 27073/34278 [29:41:24<8:00:22, 4.00s/it] {'loss': 0.1034, 'grad_norm': 0.8266113022934952, 'learning_rate': 1.1145823899084645e-06, 'epoch': 0.79} 79%|███████▉ | 27073/34278 [29:41:24<8:00:22, 4.00s/it] 79%|███████▉ | 27074/34278 [29:41:28<8:09:27, 4.08s/it] {'loss': 0.1184, 'grad_norm': 0.8992658916797747, 'learning_rate': 1.1142850583809133e-06, 'epoch': 0.79} 79%|███████▉ | 27074/34278 [29:41:28<8:09:27, 4.08s/it] 79%|███████▉ | 27075/34278 [29:41:31<7:19:04, 3.66s/it] {'loss': 0.1081, 'grad_norm': 1.061094711579638, 'learning_rate': 1.1139877615440993e-06, 'epoch': 0.79} 79%|███████▉ | 27075/34278 [29:41:31<7:19:04, 3.66s/it] 79%|███████▉ | 27076/34278 [29:41:34<6:48:20, 3.40s/it] {'loss': 0.1061, 'grad_norm': 0.7539182591125799, 'learning_rate': 1.1136904994006743e-06, 'epoch': 0.79} 79%|███████▉ | 27076/34278 [29:41:34<6:48:20, 3.40s/it] 79%|███████▉ | 27077/34278 [29:41:37<6:33:13, 3.28s/it] {'loss': 0.1036, 'grad_norm': 0.7331230192797517, 'learning_rate': 1.1133932719532903e-06, 'epoch': 0.79} 79%|███████▉ | 27077/34278 [29:41:37<6:33:13, 3.28s/it] 79%|███████▉ | 27078/34278 [29:41:41<7:12:24, 3.60s/it] {'loss': 0.1173, 'grad_norm': 0.8883405371673604, 'learning_rate': 1.1130960792046057e-06, 'epoch': 0.79} 79%|███████▉ | 27078/34278 [29:41:41<7:12:24, 3.60s/it] 79%|███████▉ | 27079/34278 [29:41:45<7:22:32, 3.69s/it] {'loss': 0.1142, 'grad_norm': 0.7454993019681683, 'learning_rate': 1.1127989211572715e-06, 'epoch': 0.79} 79%|███████▉ | 27079/34278 [29:41:45<7:22:32, 3.69s/it] 79%|███████▉ | 27080/34278 [29:41:48<7:00:14, 3.50s/it] {'loss': 0.1088, 'grad_norm': 1.1582832738734559, 'learning_rate': 1.1125017978139396e-06, 'epoch': 0.79} 79%|███████▉ | 27080/34278 [29:41:48<7:00:14, 3.50s/it] 79%|███████▉ | 27081/34278 
[29:41:52<7:07:48, 3.57s/it] {'loss': 0.1037, 'grad_norm': 0.8510141194896007, 'learning_rate': 1.1122047091772647e-06, 'epoch': 0.79} 79%|███████▉ | 27081/34278 [29:41:52<7:07:48, 3.57s/it] 79%|███████▉ | 27082/34278 [29:41:55<6:46:37, 3.39s/it] {'loss': 0.0945, 'grad_norm': 0.787896618425818, 'learning_rate': 1.111907655249898e-06, 'epoch': 0.79} 79%|███████▉ | 27082/34278 [29:41:55<6:46:37, 3.39s/it] 79%|███████▉ | 27083/34278 [29:41:58<6:56:37, 3.47s/it] {'loss': 0.1298, 'grad_norm': 0.7847387202622397, 'learning_rate': 1.1116106360344909e-06, 'epoch': 0.79} 79%|███████▉ | 27083/34278 [29:41:58<6:56:37, 3.47s/it] 79%|███████▉ | 27084/34278 [29:42:02<7:18:40, 3.66s/it] {'loss': 0.1479, 'grad_norm': 0.8143617847944276, 'learning_rate': 1.1113136515336953e-06, 'epoch': 0.79} 79%|███████▉ | 27084/34278 [29:42:02<7:18:40, 3.66s/it] 79%|███████▉ | 27085/34278 [29:42:06<7:02:36, 3.53s/it] {'loss': 0.0974, 'grad_norm': 0.7919774903233028, 'learning_rate': 1.1110167017501643e-06, 'epoch': 0.79} 79%|███████▉ | 27085/34278 [29:42:06<7:02:36, 3.53s/it] 79%|███████▉ | 27086/34278 [29:42:09<6:52:35, 3.44s/it] {'loss': 0.1096, 'grad_norm': 0.8546097168471126, 'learning_rate': 1.1107197866865482e-06, 'epoch': 0.79} 79%|███████▉ | 27086/34278 [29:42:09<6:52:35, 3.44s/it] 79%|███████▉ | 27087/34278 [29:42:12<6:39:21, 3.33s/it] {'loss': 0.1419, 'grad_norm': 0.8094623375714232, 'learning_rate': 1.1104229063454957e-06, 'epoch': 0.79} 79%|███████▉ | 27087/34278 [29:42:12<6:39:21, 3.33s/it] 79%|███████▉ | 27088/34278 [29:42:15<6:17:49, 3.15s/it] {'loss': 0.1396, 'grad_norm': 1.1065230427955302, 'learning_rate': 1.1101260607296588e-06, 'epoch': 0.79} 79%|███████▉ | 27088/34278 [29:42:15<6:17:49, 3.15s/it] 79%|███████▉ | 27089/34278 [29:42:18<6:20:43, 3.18s/it] {'loss': 0.0995, 'grad_norm': 0.7493205945004073, 'learning_rate': 1.1098292498416895e-06, 'epoch': 0.79} 79%|███████▉ | 27089/34278 [29:42:18<6:20:43, 3.18s/it] 79%|███████▉ | 27090/34278 [29:42:22<7:03:12, 3.53s/it] {'loss': 
0.1119, 'grad_norm': 0.832246248518706, 'learning_rate': 1.109532473684236e-06, 'epoch': 0.79} 79%|███████▉ | 27090/34278 [29:42:22<7:03:12, 3.53s/it] 79%|███████▉ | 27091/34278 [29:42:25<6:51:52, 3.44s/it] {'loss': 0.1259, 'grad_norm': 0.8641004671671787, 'learning_rate': 1.1092357322599467e-06, 'epoch': 0.79} 79%|███████▉ | 27091/34278 [29:42:25<6:51:52, 3.44s/it] 79%|███████▉ | 27092/34278 [29:42:28<6:34:11, 3.29s/it] {'loss': 0.1203, 'grad_norm': 0.8880996219591738, 'learning_rate': 1.1089390255714733e-06, 'epoch': 0.79} 79%|███████▉ | 27092/34278 [29:42:28<6:34:11, 3.29s/it] 79%|███████▉ | 27093/34278 [29:42:31<6:26:48, 3.23s/it] {'loss': 0.1159, 'grad_norm': 0.9002697296503198, 'learning_rate': 1.108642353621463e-06, 'epoch': 0.79} 79%|███████▉ | 27093/34278 [29:42:31<6:26:48, 3.23s/it] 79%|███████▉ | 27094/34278 [29:42:35<6:41:49, 3.36s/it] {'loss': 0.1095, 'grad_norm': 0.7692330428831918, 'learning_rate': 1.108345716412562e-06, 'epoch': 0.79} 79%|███████▉ | 27094/34278 [29:42:35<6:41:49, 3.36s/it] 79%|███████▉ | 27095/34278 [29:42:39<6:45:57, 3.39s/it] {'loss': 0.1178, 'grad_norm': 0.8019301129088359, 'learning_rate': 1.1080491139474248e-06, 'epoch': 0.79} 79%|███████▉ | 27095/34278 [29:42:39<6:45:57, 3.39s/it] 79%|███████▉ | 27096/34278 [29:42:42<6:45:55, 3.39s/it] {'loss': 0.0918, 'grad_norm': 0.72149071079873, 'learning_rate': 1.107752546228696e-06, 'epoch': 0.79} 79%|███████▉ | 27096/34278 [29:42:42<6:45:55, 3.39s/it] 79%|███████▉ | 27097/34278 [29:42:48<8:27:20, 4.24s/it] {'loss': 0.119, 'grad_norm': 0.9390267499518805, 'learning_rate': 1.1074560132590218e-06, 'epoch': 0.79} 79%|███████▉ | 27097/34278 [29:42:48<8:27:20, 4.24s/it] 79%|███████▉ | 27098/34278 [29:42:51<7:47:37, 3.91s/it] {'loss': 0.1068, 'grad_norm': 0.9875472469030105, 'learning_rate': 1.1071595150410518e-06, 'epoch': 0.79} 79%|███████▉ | 27098/34278 [29:42:51<7:47:37, 3.91s/it] 79%|███████▉ | 27099/34278 [29:42:54<7:16:41, 3.65s/it] {'loss': 0.132, 'grad_norm': 0.9522414094922597, 
'learning_rate': 1.1068630515774332e-06, 'epoch': 0.79} 79%|███████▉ | 27099/34278 [29:42:54<7:16:41, 3.65s/it] 79%|███████▉ | 27100/34278 [29:42:58<7:31:44, 3.78s/it] {'loss': 0.1058, 'grad_norm': 0.822154335648392, 'learning_rate': 1.10656662287081e-06, 'epoch': 0.79} 79%|███████▉ | 27100/34278 [29:42:58<7:31:44, 3.78s/it] 79%|███████▉ | 27101/34278 [29:43:02<7:07:01, 3.57s/it] {'loss': 0.1111, 'grad_norm': 0.7328819702652183, 'learning_rate': 1.1062702289238308e-06, 'epoch': 0.79} 79%|███████▉ | 27101/34278 [29:43:02<7:07:01, 3.57s/it] 79%|███████▉ | 27102/34278 [29:43:05<7:03:31, 3.54s/it] {'loss': 0.1417, 'grad_norm': 0.92680488768724, 'learning_rate': 1.105973869739143e-06, 'epoch': 0.79} 79%|███████▉ | 27102/34278 [29:43:05<7:03:31, 3.54s/it] 79%|███████▉ | 27103/34278 [29:43:08<6:51:40, 3.44s/it] {'loss': 0.0881, 'grad_norm': 0.7479262159317424, 'learning_rate': 1.1056775453193907e-06, 'epoch': 0.79} 79%|███████▉ | 27103/34278 [29:43:08<6:51:40, 3.44s/it] 79%|███████▉ | 27104/34278 [29:43:12<6:51:19, 3.44s/it] {'loss': 0.1094, 'grad_norm': 1.290706270914707, 'learning_rate': 1.1053812556672183e-06, 'epoch': 0.79} 79%|███████▉ | 27104/34278 [29:43:12<6:51:19, 3.44s/it] 79%|███████▉ | 27105/34278 [29:43:15<6:30:29, 3.27s/it] {'loss': 0.1224, 'grad_norm': 0.8930020032148578, 'learning_rate': 1.1050850007852737e-06, 'epoch': 0.79} 79%|███████▉ | 27105/34278 [29:43:15<6:30:29, 3.27s/it] 79%|███████▉ | 27106/34278 [29:43:18<6:38:56, 3.34s/it] {'loss': 0.1273, 'grad_norm': 1.2154004616883283, 'learning_rate': 1.1047887806761993e-06, 'epoch': 0.79} 79%|███████▉ | 27106/34278 [29:43:18<6:38:56, 3.34s/it] 79%|███████▉ | 27107/34278 [29:43:21<6:27:04, 3.24s/it] {'loss': 0.112, 'grad_norm': 0.9299056012719995, 'learning_rate': 1.1044925953426406e-06, 'epoch': 0.79} 79%|███████▉ | 27107/34278 [29:43:21<6:27:04, 3.24s/it] 79%|███████▉ | 27108/34278 [29:43:26<7:17:00, 3.66s/it] {'loss': 0.1214, 'grad_norm': 0.8292919192985915, 'learning_rate': 1.1041964447872434e-06, 
'epoch': 0.79} 79%|███████▉ | 27108/34278 [29:43:26<7:17:00, 3.66s/it] 79%|███████▉ | 27109/34278 [29:43:28<6:46:57, 3.41s/it] {'loss': 0.1178, 'grad_norm': 0.937772791906975, 'learning_rate': 1.10390032901265e-06, 'epoch': 0.79} 79%|███████▉ | 27109/34278 [29:43:28<6:46:57, 3.41s/it] 79%|███████▉ | 27110/34278 [29:43:32<6:48:16, 3.42s/it] {'loss': 0.1165, 'grad_norm': 0.9054766946410139, 'learning_rate': 1.1036042480215037e-06, 'epoch': 0.79} 79%|███████▉ | 27110/34278 [29:43:32<6:48:16, 3.42s/it] 79%|███████▉ | 27111/34278 [29:43:36<7:13:21, 3.63s/it] {'loss': 0.152, 'grad_norm': 0.8453587046069618, 'learning_rate': 1.1033082018164492e-06, 'epoch': 0.79} 79%|███████▉ | 27111/34278 [29:43:36<7:13:21, 3.63s/it] 79%|███████▉ | 27112/34278 [29:43:40<7:28:23, 3.75s/it] {'loss': 0.0991, 'grad_norm': 0.7023114841040197, 'learning_rate': 1.1030121904001278e-06, 'epoch': 0.79} 79%|███████▉ | 27112/34278 [29:43:40<7:28:23, 3.75s/it] 79%|███████▉ | 27113/34278 [29:43:43<7:11:37, 3.61s/it] {'loss': 0.1098, 'grad_norm': 0.9327751718824633, 'learning_rate': 1.1027162137751852e-06, 'epoch': 0.79} 79%|███████▉ | 27113/34278 [29:43:43<7:11:37, 3.61s/it] 79%|███████▉ | 27114/34278 [29:43:46<6:39:25, 3.35s/it] {'loss': 0.0862, 'grad_norm': 0.7422887851770696, 'learning_rate': 1.1024202719442596e-06, 'epoch': 0.79} 79%|███████▉ | 27114/34278 [29:43:46<6:39:25, 3.35s/it] 79%|███████▉ | 27115/34278 [29:43:52<8:15:48, 4.15s/it] {'loss': 0.1268, 'grad_norm': 0.7049014329909208, 'learning_rate': 1.1021243649099972e-06, 'epoch': 0.79} 79%|███████▉ | 27115/34278 [29:43:52<8:15:48, 4.15s/it] 79%|███████▉ | 27116/34278 [29:43:55<7:35:23, 3.82s/it] {'loss': 0.1174, 'grad_norm': 0.8604393699555738, 'learning_rate': 1.1018284926750378e-06, 'epoch': 0.79} 79%|███████▉ | 27116/34278 [29:43:55<7:35:23, 3.82s/it] 79%|███████▉ | 27117/34278 [29:43:58<6:59:56, 3.52s/it] {'loss': 0.1186, 'grad_norm': 0.9837231514686511, 'learning_rate': 1.1015326552420218e-06, 'epoch': 0.79} 79%|███████▉ | 27117/34278 
[29:43:58<6:59:56, 3.52s/it] 79%|███████▉ | 27118/34278 [29:44:02<7:05:00, 3.56s/it] {'loss': 0.1047, 'grad_norm': 0.7758830117983704, 'learning_rate': 1.101236852613592e-06, 'epoch': 0.79} 79%|███████▉ | 27118/34278 [29:44:02<7:05:00, 3.56s/it] 79%|███████▉ | 27119/34278 [29:44:05<6:40:24, 3.36s/it] {'loss': 0.1124, 'grad_norm': 0.8812448555955511, 'learning_rate': 1.1009410847923897e-06, 'epoch': 0.79} 79%|███████▉ | 27119/34278 [29:44:05<6:40:24, 3.36s/it] 79%|███████▉ | 27120/34278 [29:44:07<6:26:59, 3.24s/it] {'loss': 0.112, 'grad_norm': 0.8442310245177006, 'learning_rate': 1.100645351781055e-06, 'epoch': 0.79} 79%|███████▉ | 27120/34278 [29:44:08<6:26:59, 3.24s/it] 79%|███████▉ | 27121/34278 [29:44:11<6:46:25, 3.41s/it] {'loss': 0.0901, 'grad_norm': 0.7845106073583007, 'learning_rate': 1.1003496535822262e-06, 'epoch': 0.79} 79%|███████▉ | 27121/34278 [29:44:11<6:46:25, 3.41s/it] 79%|███████▉ | 27122/34278 [29:44:15<6:45:08, 3.40s/it] {'loss': 0.1264, 'grad_norm': 0.7196121650906268, 'learning_rate': 1.1000539901985458e-06, 'epoch': 0.79} 79%|███████▉ | 27122/34278 [29:44:15<6:45:08, 3.40s/it] 79%|███████▉ | 27123/34278 [29:44:18<6:42:48, 3.38s/it] {'loss': 0.1078, 'grad_norm': 0.7425485908378817, 'learning_rate': 1.099758361632653e-06, 'epoch': 0.79} 79%|███████▉ | 27123/34278 [29:44:18<6:42:48, 3.38s/it] 79%|███████▉ | 27124/34278 [29:44:21<6:38:29, 3.34s/it] {'loss': 0.0996, 'grad_norm': 0.7759085616538031, 'learning_rate': 1.0994627678871833e-06, 'epoch': 0.79} 79%|███████▉ | 27124/34278 [29:44:21<6:38:29, 3.34s/it] 79%|███████▉ | 27125/34278 [29:44:27<8:13:13, 4.14s/it] {'loss': 0.1002, 'grad_norm': 0.7586204681365918, 'learning_rate': 1.0991672089647814e-06, 'epoch': 0.79} 79%|███████▉ | 27125/34278 [29:44:27<8:13:13, 4.14s/it] 79%|███████▉ | 27126/34278 [29:44:31<7:55:59, 3.99s/it] {'loss': 0.1019, 'grad_norm': 1.8345522764639859, 'learning_rate': 1.0988716848680842e-06, 'epoch': 0.79} 79%|███████▉ | 27126/34278 [29:44:31<7:55:59, 3.99s/it] 79%|███████▉ 
| 27127/34278 [29:44:34<7:16:40, 3.66s/it] {'loss': 0.0987, 'grad_norm': 0.8190523113213574, 'learning_rate': 1.0985761955997276e-06, 'epoch': 0.79} 79%|███████▉ | 27127/34278 [29:44:34<7:16:40, 3.66s/it] 79%|███████▉ | 27128/34278 [29:44:38<7:23:15, 3.72s/it] {'loss': 0.1144, 'grad_norm': 0.8617946805400212, 'learning_rate': 1.0982807411623526e-06, 'epoch': 0.79} 79%|███████▉ | 27128/34278 [29:44:38<7:23:15, 3.72s/it] 79%|███████▉ | 27129/34278 [29:44:43<8:12:29, 4.13s/it] {'loss': 0.1042, 'grad_norm': 1.2286630752846879, 'learning_rate': 1.0979853215585957e-06, 'epoch': 0.79} 79%|███████▉ | 27129/34278 [29:44:43<8:12:29, 4.13s/it] 79%|███████▉ | 27130/34278 [29:44:47<8:23:14, 4.22s/it] {'loss': 0.1325, 'grad_norm': 0.9680502916427776, 'learning_rate': 1.0976899367910932e-06, 'epoch': 0.79} 79%|███████▉ | 27130/34278 [29:44:47<8:23:14, 4.22s/it] 79%|███████▉ | 27131/34278 [29:44:50<7:32:23, 3.80s/it] {'loss': 0.1125, 'grad_norm': 0.955072518187602, 'learning_rate': 1.097394586862483e-06, 'epoch': 0.79} 79%|███████▉ | 27131/34278 [29:44:50<7:32:23, 3.80s/it] 79%|███████▉ | 27132/34278 [29:44:53<7:03:25, 3.56s/it] {'loss': 0.1068, 'grad_norm': 0.7425040135781064, 'learning_rate': 1.0970992717754043e-06, 'epoch': 0.79} 79%|███████▉ | 27132/34278 [29:44:53<7:03:25, 3.56s/it] 79%|███████▉ | 27133/34278 [29:44:57<7:16:55, 3.67s/it] {'loss': 0.115, 'grad_norm': 0.9104003460204163, 'learning_rate': 1.0968039915324913e-06, 'epoch': 0.79} 79%|███████▉ | 27133/34278 [29:44:57<7:16:55, 3.67s/it] 79%|███████▉ | 27134/34278 [29:45:01<7:20:41, 3.70s/it] {'loss': 0.1315, 'grad_norm': 0.8801343256956771, 'learning_rate': 1.0965087461363788e-06, 'epoch': 0.79} 79%|███████▉ | 27134/34278 [29:45:01<7:20:41, 3.70s/it] 79%|███████▉ | 27135/34278 [29:45:04<7:14:40, 3.65s/it] {'loss': 0.1108, 'grad_norm': 0.9456310284348848, 'learning_rate': 1.0962135355897063e-06, 'epoch': 0.79} 79%|███████▉ | 27135/34278 [29:45:04<7:14:40, 3.65s/it] 79%|███████▉ | 27136/34278 [29:45:08<7:10:14, 
3.61s/it] {'loss': 0.1005, 'grad_norm': 0.6906914642583293, 'learning_rate': 1.0959183598951056e-06, 'epoch': 0.79} 79%|███████▉ | 27136/34278 [29:45:08<7:10:14, 3.61s/it] 79%|███████▉ | 27137/34278 [29:45:11<7:01:22, 3.54s/it] {'loss': 0.1009, 'grad_norm': 0.9960876587345895, 'learning_rate': 1.095623219055214e-06, 'epoch': 0.79} 79%|███████▉ | 27137/34278 [29:45:11<7:01:22, 3.54s/it] 79%|███████▉ | 27138/34278 [29:45:15<7:12:24, 3.63s/it] {'loss': 0.1129, 'grad_norm': 0.908515226563882, 'learning_rate': 1.095328113072668e-06, 'epoch': 0.79} 79%|███████▉ | 27138/34278 [29:45:15<7:12:24, 3.63s/it] 79%|███████▉ | 27139/34278 [29:45:19<7:23:21, 3.73s/it] {'loss': 0.1038, 'grad_norm': 0.771092521850201, 'learning_rate': 1.0950330419501003e-06, 'epoch': 0.79} 79%|███████▉ | 27139/34278 [29:45:19<7:23:21, 3.73s/it] 79%|███████▉ | 27140/34278 [29:45:22<6:54:11, 3.48s/it] {'loss': 0.105, 'grad_norm': 0.7236073233684125, 'learning_rate': 1.0947380056901436e-06, 'epoch': 0.79} 79%|███████▉ | 27140/34278 [29:45:22<6:54:11, 3.48s/it] 79%|███████▉ | 27141/34278 [29:45:28<8:24:42, 4.24s/it] {'loss': 0.1182, 'grad_norm': 0.8497883082548906, 'learning_rate': 1.0944430042954358e-06, 'epoch': 0.79} 79%|███████▉ | 27141/34278 [29:45:28<8:24:42, 4.24s/it] 79%|███████▉ | 27142/34278 [29:45:31<7:36:40, 3.84s/it] {'loss': 0.0972, 'grad_norm': 0.7610705951590233, 'learning_rate': 1.0941480377686065e-06, 'epoch': 0.79} 79%|███████▉ | 27142/34278 [29:45:31<7:36:40, 3.84s/it] 79%|███████▉ | 27143/34278 [29:45:36<8:21:32, 4.22s/it] {'loss': 0.1314, 'grad_norm': 0.72110363440369, 'learning_rate': 1.0938531061122926e-06, 'epoch': 0.79} 79%|███████▉ | 27143/34278 [29:45:36<8:21:32, 4.22s/it] 79%|███████▉ | 27144/34278 [29:45:39<7:50:00, 3.95s/it] {'loss': 0.1338, 'grad_norm': 0.807041227237705, 'learning_rate': 1.0935582093291247e-06, 'epoch': 0.79} 79%|███████▉ | 27144/34278 [29:45:39<7:50:00, 3.95s/it] 79%|███████▉ | 27145/34278 [29:45:45<8:55:25, 4.50s/it] {'loss': 0.1163, 'grad_norm': 
1.0550105281370603, 'learning_rate': 1.0932633474217374e-06, 'epoch': 0.79} 79%|███████▉ | 27145/34278 [29:45:45<8:55:25, 4.50s/it] 79%|███████▉ | 27146/34278 [29:45:48<8:08:28, 4.11s/it] {'loss': 0.1513, 'grad_norm': 0.8850146050823224, 'learning_rate': 1.0929685203927625e-06, 'epoch': 0.79} 79%|███████▉ | 27146/34278 [29:45:48<8:08:28, 4.11s/it] 79%|███████▉ | 27147/34278 [29:45:54<9:14:10, 4.66s/it] {'loss': 0.0995, 'grad_norm': 0.6960212464952853, 'learning_rate': 1.0926737282448308e-06, 'epoch': 0.79} 79%|███████▉ | 27147/34278 [29:45:54<9:14:10, 4.66s/it] 79%|███████▉ | 27148/34278 [29:45:57<8:23:51, 4.24s/it] {'loss': 0.1196, 'grad_norm': 0.7826302176495409, 'learning_rate': 1.0923789709805754e-06, 'epoch': 0.79} 79%|███████▉ | 27148/34278 [29:45:57<8:23:51, 4.24s/it] 79%|███████▉ | 27149/34278 [29:46:00<7:35:31, 3.83s/it] {'loss': 0.0996, 'grad_norm': 0.8804146177427541, 'learning_rate': 1.092084248602629e-06, 'epoch': 0.79} 79%|███████▉ | 27149/34278 [29:46:00<7:35:31, 3.83s/it] 79%|███████▉ | 27150/34278 [29:46:03<7:08:48, 3.61s/it] {'loss': 0.1037, 'grad_norm': 0.7310782519951465, 'learning_rate': 1.0917895611136214e-06, 'epoch': 0.79} 79%|███████▉ | 27150/34278 [29:46:03<7:08:48, 3.61s/it] 79%|███████▉ | 27151/34278 [29:46:08<7:29:08, 3.78s/it] {'loss': 0.1093, 'grad_norm': 0.9999875152982066, 'learning_rate': 1.0914949085161819e-06, 'epoch': 0.79} 79%|███████▉ | 27151/34278 [29:46:08<7:29:08, 3.78s/it] 79%|███████▉ | 27152/34278 [29:46:10<6:50:45, 3.46s/it] {'loss': 0.1355, 'grad_norm': 0.9923710383217718, 'learning_rate': 1.091200290812945e-06, 'epoch': 0.79} 79%|███████▉ | 27152/34278 [29:46:10<6:50:45, 3.46s/it] 79%|███████▉ | 27153/34278 [29:46:14<6:45:57, 3.42s/it] {'loss': 0.1365, 'grad_norm': 0.9491778541297881, 'learning_rate': 1.0909057080065382e-06, 'epoch': 0.79} 79%|███████▉ | 27153/34278 [29:46:14<6:45:57, 3.42s/it] 79%|███████▉ | 27154/34278 [29:46:19<8:13:43, 4.16s/it] {'loss': 0.0983, 'grad_norm': 0.6807099827568043, 'learning_rate': 
1.0906111600995895e-06, 'epoch': 0.79} 79%|███████▉ | 27154/34278 [29:46:19<8:13:43, 4.16s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
79%|███████▉ | 27155/34278 [29:46:22<7:33:33, 3.82s/it] {'loss': 0.1187, 'grad_norm': 0.8567725912593552, 'learning_rate': 1.090316647094734e-06, 'epoch': 0.79} 79%|███████▉ | 27155/34278 [29:46:22<7:33:33, 3.82s/it] 79%|███████▉ | 27156/34278 [29:46:26<7:09:34, 3.62s/it] {'loss': 0.1172, 'grad_norm': 1.108805960470962, 'learning_rate': 1.0900221689945978e-06, 'epoch': 0.79} 79%|███████▉ | 27156/34278 [29:46:26<7:09:34, 3.62s/it] 79%|███████▉ | 27157/34278 [29:46:30<7:35:13, 3.84s/it] {'loss': 0.1112, 'grad_norm': 0.8158151823512337, 'learning_rate': 1.089727725801809e-06, 'epoch': 0.79} 79%|███████▉ | 27157/34278 [29:46:30<7:35:13, 3.84s/it] 79%|███████▉ | 27158/34278 [29:46:34<7:52:23, 3.98s/it] {'loss': 0.1294, 'grad_norm': 1.0486515651120698, 'learning_rate': 1.0894333175189993e-06, 'epoch': 0.79} 79%|███████▉ | 27158/34278 [29:46:34<7:52:23, 3.98s/it] 79%|███████▉ | 27159/34278 [29:46:37<7:15:24, 3.67s/it] {'loss': 0.1058, 'grad_norm': 0.824966210310151, 'learning_rate': 1.0891389441487954e-06, 'epoch': 0.79} 79%|███████▉ | 27159/34278 [29:46:37<7:15:24, 3.67s/it] 79%|███████▉ | 27160/34278 [29:46:41<7:07:32, 3.60s/it] {'loss': 0.1264, 'grad_norm': 0.9263257222914655, 'learning_rate': 1.088844605693824e-06, 'epoch': 0.79} 79%|███████▉ | 27160/34278 [29:46:41<7:07:32, 3.60s/it] 79%|███████▉ | 27161/34278 [29:46:46<8:18:24, 4.20s/it] {'loss': 0.0885, 'grad_norm': 1.0945050618072207, 'learning_rate': 1.088550302156714e-06, 'epoch': 0.79} 79%|███████▉ | 27161/34278 [29:46:46<8:18:24, 4.20s/it] 79%|███████▉ | 27162/34278 [29:46:49<7:27:09, 3.77s/it] {'loss': 0.1211, 'grad_norm': 1.3830758417720495, 'learning_rate': 1.0882560335400943e-06, 'epoch': 0.79}
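The UserWarning above is emitted by `torch.utils.checkpoint` when an activation-checkpointed segment is called with no tensor input that requires grad, so the recomputation in backward has nothing to produce gradients for. A minimal sketch reproducing the condition (an assumption for illustration, not the training script itself; it uses the reentrant checkpoint path that DeepSpeed/HF typically take):

```python
import warnings

import torch
from torch.utils.checkpoint import checkpoint

# A checkpointed call whose tensor inputs all have requires_grad=False
# triggers the "None of the inputs have requires_grad=True" warning.
layer = torch.nn.Linear(4, 4)
x = torch.randn(2, 4)  # requires_grad defaults to False

with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    out = checkpoint(layer, x, use_reentrant=True)

assert any("requires_grad" in str(w.message) for w in caught)
```

In a training run this usually means the checkpointed inputs are frozen (e.g. detached embeddings); it is harmless if those gradients are genuinely not needed.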
79%|███████▉ | 27162/34278 [29:46:49<7:27:09, 3.77s/it] 79%|███████▉ | 27163/34278 [29:46:53<7:19:47, 3.71s/it] {'loss': 0.1079, 'grad_norm': 1.211255424690409, 'learning_rate': 1.0879617998465912e-06, 'epoch': 0.79} 79%|███████▉ | 27163/34278 [29:46:53<7:19:47, 3.71s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8c201d6b10>
Failed to fetch sample 3268271.
Exception: cannot identify image file <_io.BytesIO object at 0x7f8c201d6b10> 79%|███████▉ | 27164/34278 [29:46:56<6:57:15, 3.52s/it] {'loss': 0.0924, 'grad_norm': 0.8134082408658129, 'learning_rate': 1.0876676010788307e-06, 'epoch': 0.79} 79%|███████▉ | 27164/34278 [29:46:56<6:57:15, 3.52s/it] 79%|███████▉ | 27165/34278 [29:46:59<6:34:39, 3.33s/it] {'loss': 0.1309, 'grad_norm': 0.889315860492122, 'learning_rate': 1.0873734372394402e-06, 'epoch': 0.79} 79%|███████▉ | 27165/34278 [29:46:59<6:34:39, 3.33s/it] 79%|███████▉ | 27166/34278 [29:47:01<6:13:58, 3.16s/it] {'loss': 0.1157, 'grad_norm': 1.2094765568061059, 'learning_rate': 1.0870793083310449e-06, 'epoch': 0.79} 79%|███████▉ | 27166/34278 [29:47:01<6:13:58, 3.16s/it] 79%|███████▉ | 27167/34278 [29:47:07<7:30:51, 3.80s/it] {'loss': 0.1479, 'grad_norm': 1.3901104998579217, 'learning_rate': 1.0867852143562712e-06, 'epoch': 0.79} 79%|███████▉ | 27167/34278 [29:47:07<7:30:51, 3.80s/it] 79%|███████▉ | 27168/34278 [29:47:13<8:46:41, 4.44s/it] {'loss': 0.1106, 'grad_norm': 0.9522795647766157, 'learning_rate': 1.0864911553177459e-06, 'epoch': 0.79} 79%|███████▉ | 27168/34278 [29:47:13<8:46:41, 4.44s/it] 79%|███████▉ | 27169/34278 [29:47:16<8:20:54, 4.23s/it] {'loss': 0.1218, 'grad_norm': 0.9193753418996613, 'learning_rate': 1.0861971312180942e-06, 'epoch': 0.79} 79%|███████▉ | 27169/34278 [29:47:16<8:20:54, 4.23s/it] 79%|███████▉ | 27170/34278 [29:47:19<7:38:46, 3.87s/it] {'loss': 0.1312, 'grad_norm': 1.0362991357964215, 'learning_rate': 1.085903142059938e-06, 'epoch': 0.79} 79%|███████▉ | 27170/34278 [29:47:19<7:38:46, 3.87s/it] 79%|███████▉ | 27171/34278 [29:47:23<7:16:28, 3.68s/it] {'loss': 0.1189, 'grad_norm': 1.1401714533665626, 'learning_rate': 1.0856091878459064e-06, 'epoch': 0.79} 79%|███████▉ | 27171/34278 [29:47:23<7:16:28, 3.68s/it] 79%|███████▉ | 27172/34278 [29:47:26<6:54:36, 3.50s/it] {'loss': 0.0886, 'grad_norm': 0.917302415006295, 'learning_rate': 1.0853152685786196e-06, 'epoch': 0.79} 79%|███████▉ | 
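The traceback above shows `__getitem__` failing on image bytes that PIL cannot decode; the run logs "Failed to fetch sample 3268271." and continues, which implies the dataset swallows the error and substitutes another sample. A minimal sketch of that pattern (class and variable names here are hypothetical; the real `dataset.py` fetches the bytes from Ceph via `tcs_loader` rather than from a list):

```python
import io

from PIL import Image, UnidentifiedImageError


class SkipCorruptImages:
    """Dataset sketch that falls back to a neighbouring sample when
    an image blob cannot be decoded."""

    def __init__(self, blobs, max_retries=10):
        self.blobs = blobs  # raw image bytes per sample
        self.max_retries = max_retries

    def __len__(self):
        return len(self.blobs)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                # Image.open raises UnidentifiedImageError on bytes
                # that match no known image format.
                return Image.open(io.BytesIO(self.blobs[i])).convert("RGB")
            except UnidentifiedImageError as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.blobs)  # deterministic fallback
        raise RuntimeError("too many undecodable samples in a row")
```

Skipping like this keeps a long run alive through occasional corrupt blobs, at the cost of silently changing the sampled distribution, so the logged sample IDs are worth auditing afterwards.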
27172/34278 [29:47:26<6:54:36, 3.50s/it] 79%|███████▉ | 27173/34278 [29:47:29<7:01:20, 3.56s/it] {'loss': 0.1123, 'grad_norm': 1.2631958799446088, 'learning_rate': 1.085021384260705e-06, 'epoch': 0.79} 79%|███████▉ | 27173/34278 [29:47:29<7:01:20, 3.56s/it] 79%|███████▉ | 27174/34278 [29:47:35<8:24:49, 4.26s/it] {'loss': 0.1209, 'grad_norm': 0.7993442809384943, 'learning_rate': 1.0847275348947833e-06, 'epoch': 0.79} 79%|███████▉ | 27174/34278 [29:47:35<8:24:49, 4.26s/it] 79%|███████▉ | 27175/34278 [29:47:39<8:21:09, 4.23s/it] {'loss': 0.1272, 'grad_norm': 1.187448320820712, 'learning_rate': 1.0844337204834814e-06, 'epoch': 0.79} 79%|███████▉ | 27175/34278 [29:47:39<8:21:09, 4.23s/it] 79%|███████▉ | 27176/34278 [29:47:43<8:13:29, 4.17s/it] {'loss': 0.1402, 'grad_norm': 0.8773412349222718, 'learning_rate': 1.08413994102942e-06, 'epoch': 0.79} 79%|███████▉ | 27176/34278 [29:47:43<8:13:29, 4.17s/it] 79%|███████▉ | 27177/34278 [29:47:47<8:01:29, 4.07s/it] {'loss': 0.1006, 'grad_norm': 0.683838764971712, 'learning_rate': 1.0838461965352215e-06, 'epoch': 0.79} 79%|███████▉ | 27177/34278 [29:47:47<8:01:29, 4.07s/it] 79%|███████▉ | 27178/34278 [29:47:50<7:22:16, 3.74s/it] {'loss': 0.0994, 'grad_norm': 0.8919481410247305, 'learning_rate': 1.083552487003509e-06, 'epoch': 0.79} 79%|███████▉ | 27178/34278 [29:47:50<7:22:16, 3.74s/it] 79%|███████▉ | 27179/34278 [29:47:55<8:08:59, 4.13s/it] {'loss': 0.0918, 'grad_norm': 0.7318332943450829, 'learning_rate': 1.083258812436907e-06, 'epoch': 0.79} 79%|███████▉ | 27179/34278 [29:47:55<8:08:59, 4.13s/it] 79%|███████▉ | 27180/34278 [29:47:58<7:28:03, 3.79s/it] {'loss': 0.1257, 'grad_norm': 0.8147479444203601, 'learning_rate': 1.0829651728380346e-06, 'epoch': 0.79} 79%|███████▉ | 27180/34278 [29:47:58<7:28:03, 3.79s/it] 79%|███████▉ | 27181/34278 [29:48:01<7:02:04, 3.57s/it] {'loss': 0.1198, 'grad_norm': 1.0957109828739, 'learning_rate': 1.082671568209513e-06, 'epoch': 0.79} 79%|███████▉ | 27181/34278 [29:48:01<7:02:04, 3.57s/it] 
79%|███████▉ | 27182/34278 [29:48:04<6:37:06, 3.36s/it] {'loss': 0.0927, 'grad_norm': 0.9268456476495067, 'learning_rate': 1.0823779985539657e-06, 'epoch': 0.79} 79%|███████▉ | 27182/34278 [29:48:04<6:37:06, 3.36s/it] 79%|███████▉ | 27183/34278 [29:48:08<6:59:36, 3.55s/it] {'loss': 0.1095, 'grad_norm': 0.937669862032185, 'learning_rate': 1.0820844638740125e-06, 'epoch': 0.79} 79%|███████▉ | 27183/34278 [29:48:08<6:59:36, 3.55s/it] 79%|███████▉ | 27184/34278 [29:48:14<8:23:04, 4.25s/it] {'loss': 0.1105, 'grad_norm': 0.8994880085158811, 'learning_rate': 1.0817909641722713e-06, 'epoch': 0.79} 79%|███████▉ | 27184/34278 [29:48:14<8:23:04, 4.25s/it] 79%|███████▉ | 27185/34278 [29:48:17<7:40:28, 3.90s/it] {'loss': 0.1043, 'grad_norm': 0.8368374731245064, 'learning_rate': 1.0814974994513672e-06, 'epoch': 0.79} 79%|███████▉ | 27185/34278 [29:48:17<7:40:28, 3.90s/it] 79%|███████▉ | 27186/34278 [29:48:20<7:13:08, 3.66s/it] {'loss': 0.1055, 'grad_norm': 0.7875018884126366, 'learning_rate': 1.0812040697139187e-06, 'epoch': 0.79} 79%|███████▉ | 27186/34278 [29:48:20<7:13:08, 3.66s/it] 79%|███████▉ | 27187/34278 [29:48:25<7:34:59, 3.85s/it] {'loss': 0.1227, 'grad_norm': 0.9334903899768735, 'learning_rate': 1.0809106749625431e-06, 'epoch': 0.79} 79%|███████▉ | 27187/34278 [29:48:25<7:34:59, 3.85s/it] 79%|███████▉ | 27188/34278 [29:48:28<7:07:11, 3.62s/it] {'loss': 0.1147, 'grad_norm': 1.1329781338344356, 'learning_rate': 1.0806173151998628e-06, 'epoch': 0.79} 79%|███████▉ | 27188/34278 [29:48:28<7:07:11, 3.62s/it] 79%|███████▉ | 27189/34278 [29:48:34<8:38:26, 4.39s/it] {'loss': 0.1172, 'grad_norm': 1.4054346205322619, 'learning_rate': 1.0803239904284952e-06, 'epoch': 0.79} 79%|███████▉ | 27189/34278 [29:48:34<8:38:26, 4.39s/it] 79%|███████▉ | 27190/34278 [29:48:37<8:09:52, 4.15s/it] {'loss': 0.103, 'grad_norm': 0.8135928422315268, 'learning_rate': 1.0800307006510585e-06, 'epoch': 0.79} 79%|███████▉ | 27190/34278 [29:48:38<8:09:52, 4.15s/it] 79%|███████▉ | 27191/34278 
[29:48:43<9:04:02, 4.61s/it] {'loss': 0.119, 'grad_norm': 1.0014669905524591, 'learning_rate': 1.0797374458701716e-06, 'epoch': 0.79} 79%|███████▉ | 27191/34278 [29:48:43<9:04:02, 4.61s/it] 79%|███████▉ | 27192/34278 [29:48:46<8:10:54, 4.16s/it] {'loss': 0.1083, 'grad_norm': 0.9133195992648436, 'learning_rate': 1.079444226088454e-06, 'epoch': 0.79} 79%|███████▉ | 27192/34278 [29:48:46<8:10:54, 4.16s/it] 79%|███████▉ | 27193/34278 [29:48:53<9:30:32, 4.83s/it] {'loss': 0.1009, 'grad_norm': 0.6612529568331375, 'learning_rate': 1.0791510413085232e-06, 'epoch': 0.79} 79%|███████▉ | 27193/34278 [29:48:53<9:30:32, 4.83s/it] 79%|███████▉ | 27194/34278 [29:48:59<10:11:22, 5.18s/it] {'loss': 0.1153, 'grad_norm': 1.050006629251399, 'learning_rate': 1.078857891532994e-06, 'epoch': 0.79} 79%|███████▉ | 27194/34278 [29:48:59<10:11:22, 5.18s/it] 79%|███████▉ | 27195/34278 [29:49:03<9:31:37, 4.84s/it] {'loss': 0.1411, 'grad_norm': 0.9127293559854903, 'learning_rate': 1.0785647767644869e-06, 'epoch': 0.79} 79%|███████▉ | 27195/34278 [29:49:03<9:31:37, 4.84s/it] 79%|███████▉ | 27196/34278 [29:49:06<8:24:47, 4.28s/it] {'loss': 0.119, 'grad_norm': 0.8450020135109233, 'learning_rate': 1.078271697005616e-06, 'epoch': 0.79} 79%|███████▉ | 27196/34278 [29:49:06<8:24:47, 4.28s/it] 79%|███████▉ | 27197/34278 [29:49:09<7:38:18, 3.88s/it] {'loss': 0.124, 'grad_norm': 0.8240192595216537, 'learning_rate': 1.0779786522589998e-06, 'epoch': 0.79} 79%|███████▉ | 27197/34278 [29:49:09<7:38:18, 3.88s/it] 79%|███████▉ | 27198/34278 [29:49:12<7:29:42, 3.81s/it] {'loss': 0.1238, 'grad_norm': 0.8152466647131754, 'learning_rate': 1.0776856425272548e-06, 'epoch': 0.79} 79%|███████▉ | 27198/34278 [29:49:12<7:29:42, 3.81s/it] 79%|███████▉ | 27199/34278 [29:49:15<7:08:58, 3.64s/it] {'loss': 0.1203, 'grad_norm': 0.8905718914173306, 'learning_rate': 1.0773926678129958e-06, 'epoch': 0.79} 79%|███████▉ | 27199/34278 [29:49:15<7:08:58, 3.64s/it] 79%|███████▉ | 27200/34278 [29:49:19<6:51:07, 3.49s/it] {'loss': 
0.1145, 'grad_norm': 0.9610597286522311, 'learning_rate': 1.0770997281188378e-06, 'epoch': 0.79} 79%|███████▉ | 27200/34278 [29:49:19<6:51:07, 3.49s/it] 79%|███████▉ | 27201/34278 [29:49:22<6:59:32, 3.56s/it] {'loss': 0.1246, 'grad_norm': 0.75896572777544, 'learning_rate': 1.0768068234473978e-06, 'epoch': 0.79} 79%|███████▉ | 27201/34278 [29:49:22<6:59:32, 3.56s/it] 79%|███████▉ | 27202/34278 [29:49:26<6:49:13, 3.47s/it] {'loss': 0.1304, 'grad_norm': 0.8331296571922095, 'learning_rate': 1.0765139538012892e-06, 'epoch': 0.79} 79%|███████▉ | 27202/34278 [29:49:26<6:49:13, 3.47s/it] 79%|███████▉ | 27203/34278 [29:49:29<6:41:56, 3.41s/it] {'loss': 0.0939, 'grad_norm': 0.8385479477782422, 'learning_rate': 1.0762211191831283e-06, 'epoch': 0.79} 79%|███████▉ | 27203/34278 [29:49:29<6:41:56, 3.41s/it] 79%|███████▉ | 27204/34278 [29:49:32<6:26:27, 3.28s/it] {'loss': 0.1069, 'grad_norm': 0.8796501707340214, 'learning_rate': 1.0759283195955273e-06, 'epoch': 0.79} 79%|███████▉ | 27204/34278 [29:49:32<6:26:27, 3.28s/it] 79%|███████▉ | 27205/34278 [29:49:35<6:13:47, 3.17s/it] {'loss': 0.1161, 'grad_norm': 0.9855807190635272, 'learning_rate': 1.075635555041103e-06, 'epoch': 0.79} 79%|███████▉ | 27205/34278 [29:49:35<6:13:47, 3.17s/it] 79%|███████▉ | 27206/34278 [29:49:38<6:13:57, 3.17s/it] {'loss': 0.1176, 'grad_norm': 1.0909022563980866, 'learning_rate': 1.0753428255224674e-06, 'epoch': 0.79} 79%|███████▉ | 27206/34278 [29:49:38<6:13:57, 3.17s/it] 79%|███████▉ | 27207/34278 [29:49:42<7:00:26, 3.57s/it] {'loss': 0.12, 'grad_norm': 0.9271966413936198, 'learning_rate': 1.0750501310422328e-06, 'epoch': 0.79} 79%|███████▉ | 27207/34278 [29:49:42<7:00:26, 3.57s/it] 79%|███████▉ | 27208/34278 [29:49:48<8:29:28, 4.32s/it] {'loss': 0.0915, 'grad_norm': 0.7606476951413613, 'learning_rate': 1.074757471603014e-06, 'epoch': 0.79} 79%|███████▉ | 27208/34278 [29:49:48<8:29:28, 4.32s/it] 79%|███████▉ | 27209/34278 [29:49:51<7:37:26, 3.88s/it] {'loss': 0.1127, 'grad_norm': 1.0786965180563166, 
'learning_rate': 1.074464847207425e-06, 'epoch': 0.79} 79%|███████▉ | 27209/34278 [29:49:51<7:37:26, 3.88s/it] 79%|███████▉ | 27210/34278 [29:49:54<7:07:24, 3.63s/it] {'loss': 0.0867, 'grad_norm': 0.9642516925484652, 'learning_rate': 1.074172257858076e-06, 'epoch': 0.79} 79%|███████▉ | 27210/34278 [29:49:54<7:07:24, 3.63s/it] 79%|███████▉ | 27211/34278 [29:50:00<8:27:48, 4.31s/it] {'loss': 0.116, 'grad_norm': 0.8178564365082546, 'learning_rate': 1.0738797035575787e-06, 'epoch': 0.79} 79%|███████▉ | 27211/34278 [29:50:00<8:27:48, 4.31s/it] 79%|███████▉ | 27212/34278 [29:50:04<8:00:09, 4.08s/it] {'loss': 0.1112, 'grad_norm': 0.7119798409234761, 'learning_rate': 1.0735871843085483e-06, 'epoch': 0.79} 79%|███████▉ | 27212/34278 [29:50:04<8:00:09, 4.08s/it] 79%|███████▉ | 27213/34278 [29:50:07<7:22:50, 3.76s/it] {'loss': 0.1263, 'grad_norm': 1.1182402379916003, 'learning_rate': 1.0732947001135935e-06, 'epoch': 0.79} 79%|███████▉ | 27213/34278 [29:50:07<7:22:50, 3.76s/it] 79%|███████▉ | 27214/34278 [29:50:11<7:24:01, 3.77s/it] {'loss': 0.105, 'grad_norm': 0.7720191956329254, 'learning_rate': 1.0730022509753235e-06, 'epoch': 0.79} 79%|███████▉ | 27214/34278 [29:50:11<7:24:01, 3.77s/it] 79%|███████▉ | 27215/34278 [29:50:14<7:04:23, 3.61s/it] {'loss': 0.1111, 'grad_norm': 0.606352370124071, 'learning_rate': 1.072709836896355e-06, 'epoch': 0.79} 79%|███████▉ | 27215/34278 [29:50:14<7:04:23, 3.61s/it] 79%|███████▉ | 27216/34278 [29:50:17<6:56:01, 3.53s/it] {'loss': 0.107, 'grad_norm': 0.957001421108336, 'learning_rate': 1.0724174578792952e-06, 'epoch': 0.79} 79%|███████▉ | 27216/34278 [29:50:17<6:56:01, 3.53s/it] 79%|███████▉ | 27217/34278 [29:50:23<8:24:07, 4.28s/it] {'loss': 0.1168, 'grad_norm': 0.9266659059031501, 'learning_rate': 1.0721251139267536e-06, 'epoch': 0.79} 79%|███████▉ | 27217/34278 [29:50:23<8:24:07, 4.28s/it] 79%|███████▉ | 27218/34278 [29:50:26<7:43:10, 3.94s/it] {'loss': 0.1329, 'grad_norm': 0.9046915043052184, 'learning_rate': 1.071832805041343e-06, 
'epoch': 0.79} 79%|███████▉ | 27218/34278 [29:50:26<7:43:10, 3.94s/it] 79%|███████▉ | 27219/34278 [29:50:32<8:45:31, 4.47s/it] {'loss': 0.1393, 'grad_norm': 0.8037414910215839, 'learning_rate': 1.071540531225671e-06, 'epoch': 0.79} 79%|███████▉ | 27219/34278 [29:50:32<8:45:31, 4.47s/it] 79%|███████▉ | 27220/34278 [29:50:36<8:21:03, 4.26s/it] {'loss': 0.1211, 'grad_norm': 1.1688948899651286, 'learning_rate': 1.071248292482346e-06, 'epoch': 0.79} 79%|███████▉ | 27220/34278 [29:50:36<8:21:03, 4.26s/it] 79%|███████▉ | 27221/34278 [29:50:39<7:49:08, 3.99s/it] {'loss': 0.1219, 'grad_norm': 0.8053109879370193, 'learning_rate': 1.0709560888139787e-06, 'epoch': 0.79} 79%|███████▉ | 27221/34278 [29:50:39<7:49:08, 3.99s/it] 79%|███████▉ | 27222/34278 [29:50:45<8:47:35, 4.49s/it] {'loss': 0.1088, 'grad_norm': 0.7711154565723309, 'learning_rate': 1.0706639202231783e-06, 'epoch': 0.79} 79%|███████▉ | 27222/34278 [29:50:45<8:47:35, 4.49s/it] 79%|███████▉ | 27223/34278 [29:50:48<7:58:14, 4.07s/it] {'loss': 0.1249, 'grad_norm': 1.011853271244742, 'learning_rate': 1.0703717867125524e-06, 'epoch': 0.79} 79%|███████▉ | 27223/34278 [29:50:48<7:58:14, 4.07s/it] 79%|███████▉ | 27224/34278 [29:50:51<7:26:14, 3.80s/it] {'loss': 0.1049, 'grad_norm': 0.8364694502713238, 'learning_rate': 1.070079688284708e-06, 'epoch': 0.79} 79%|███████▉ | 27224/34278 [29:50:51<7:26:14, 3.80s/it] 79%|███████▉ | 27225/34278 [29:50:54<7:01:10, 3.58s/it] {'loss': 0.1262, 'grad_norm': 0.8879913859202871, 'learning_rate': 1.0697876249422557e-06, 'epoch': 0.79} 79%|███████▉ | 27225/34278 [29:50:54<7:01:10, 3.58s/it] 79%|███████▉ | 27226/34278 [29:50:59<7:52:39, 4.02s/it] {'loss': 0.1118, 'grad_norm': 0.79892463453512, 'learning_rate': 1.0694955966877996e-06, 'epoch': 0.79} 79%|███████▉ | 27226/34278 [29:50:59<7:52:39, 4.02s/it] 79%|███████▉ | 27227/34278 [29:51:04<8:10:00, 4.17s/it] {'loss': 0.1119, 'grad_norm': 0.7897948536845185, 'learning_rate': 1.06920360352395e-06, 'epoch': 0.79} 79%|███████▉ | 27227/34278 
[29:51:04<8:10:00, 4.17s/it] 79%|███████▉ | 27228/34278 [29:51:07<7:34:43, 3.87s/it] {'loss': 0.1295, 'grad_norm': 0.7850501406887443, 'learning_rate': 1.0689116454533105e-06, 'epoch': 0.79} 79%|███████▉ | 27228/34278 [29:51:07<7:34:43, 3.87s/it] 79%|███████▉ | 27229/34278 [29:51:13<8:47:15, 4.49s/it] {'loss': 0.1069, 'grad_norm': 1.0072128526934645, 'learning_rate': 1.068619722478491e-06, 'epoch': 0.79} 79%|███████▉ | 27229/34278 [29:51:13<8:47:15, 4.49s/it] 79%|███████▉ | 27230/34278 [29:51:16<8:09:26, 4.17s/it] {'loss': 0.1318, 'grad_norm': 0.8950120076679392, 'learning_rate': 1.0683278346020953e-06, 'epoch': 0.79} 79%|███████▉ | 27230/34278 [29:51:16<8:09:26, 4.17s/it] 79%|███████▉ | 27231/34278 [29:51:19<7:34:56, 3.87s/it] {'loss': 0.1288, 'grad_norm': 1.0075836046887874, 'learning_rate': 1.068035981826731e-06, 'epoch': 0.79} 79%|███████▉ | 27231/34278 [29:51:19<7:34:56, 3.87s/it] 79%|███████▉ | 27232/34278 [29:51:23<7:29:46, 3.83s/it] {'loss': 0.1053, 'grad_norm': 0.8989678170551099, 'learning_rate': 1.0677441641550012e-06, 'epoch': 0.79} 79%|███████▉ | 27232/34278 [29:51:23<7:29:46, 3.83s/it] 79%|███████▉ | 27233/34278 [29:51:28<7:55:56, 4.05s/it] {'loss': 0.1479, 'grad_norm': 1.0010786692342146, 'learning_rate': 1.0674523815895143e-06, 'epoch': 0.79} 79%|███████▉ | 27233/34278 [29:51:28<7:55:56, 4.05s/it] 79%|███████▉ | 27234/34278 [29:51:34<9:05:22, 4.65s/it] {'loss': 0.1303, 'grad_norm': 0.7515913770216751, 'learning_rate': 1.0671606341328728e-06, 'epoch': 0.79} 79%|███████▉ | 27234/34278 [29:51:34<9:05:22, 4.65s/it] 79%|███████▉ | 27235/34278 [29:51:37<8:17:40, 4.24s/it] {'loss': 0.1161, 'grad_norm': 0.8256631985755358, 'learning_rate': 1.0668689217876832e-06, 'epoch': 0.79} 79%|███████▉ | 27235/34278 [29:51:37<8:17:40, 4.24s/it] 79%|███████▉ | 27236/34278 [29:51:41<8:22:51, 4.28s/it] {'loss': 0.1253, 'grad_norm': 0.9734480532915907, 'learning_rate': 1.0665772445565493e-06, 'epoch': 0.79} 79%|███████▉ | 27236/34278 [29:51:41<8:22:51, 4.28s/it] 
79%|███████▉ | 27237/34278 [29:51:45<7:44:12, 3.96s/it] {'loss': 0.0938, 'grad_norm': 0.6859718576154452, 'learning_rate': 1.0662856024420732e-06, 'epoch': 0.79} 79%|███████▉ | 27237/34278 [29:51:45<7:44:12, 3.96s/it] 79%|███████▉ | 27238/34278 [29:51:48<7:37:58, 3.90s/it] {'loss': 0.1151, 'grad_norm': 1.0410261535803178, 'learning_rate': 1.06599399544686e-06, 'epoch': 0.79} 79%|███████▉ | 27238/34278 [29:51:48<7:37:58, 3.90s/it] 79%|███████▉ | 27239/34278 [29:51:51<7:08:36, 3.65s/it] {'loss': 0.11, 'grad_norm': 0.9047278946682342, 'learning_rate': 1.0657024235735152e-06, 'epoch': 0.79} 79%|███████▉ | 27239/34278 [29:51:52<7:08:36, 3.65s/it] 79%|███████▉ | 27240/34278 [29:51:58<8:43:53, 4.47s/it] {'loss': 0.1375, 'grad_norm': 0.9623374158719716, 'learning_rate': 1.0654108868246398e-06, 'epoch': 0.79} 79%|███████▉ | 27240/34278 [29:51:58<8:43:53, 4.47s/it] 79%|███████▉ | 27241/34278 [29:52:01<7:54:20, 4.04s/it] {'loss': 0.1204, 'grad_norm': 0.8649855101651447, 'learning_rate': 1.0651193852028353e-06, 'epoch': 0.79} 79%|███████▉ | 27241/34278 [29:52:01<7:54:20, 4.04s/it] 79%|███████▉ | 27242/34278 [29:52:04<7:13:40, 3.70s/it] {'loss': 0.1297, 'grad_norm': 0.8117566411642689, 'learning_rate': 1.0648279187107068e-06, 'epoch': 0.79} 79%|███████▉ | 27242/34278 [29:52:04<7:13:40, 3.70s/it] 79%|███████▉ | 27243/34278 [29:52:09<8:15:11, 4.22s/it] {'loss': 0.1118, 'grad_norm': 0.8079122548166594, 'learning_rate': 1.064536487350855e-06, 'epoch': 0.79} 79%|███████▉ | 27243/34278 [29:52:09<8:15:11, 4.22s/it] 79%|███████▉ | 27244/34278 [29:52:12<7:38:56, 3.91s/it] {'loss': 0.1035, 'grad_norm': 0.8523914860289792, 'learning_rate': 1.06424509112588e-06, 'epoch': 0.79} 79%|███████▉ | 27244/34278 [29:52:12<7:38:56, 3.91s/it] 79%|███████▉ | 27245/34278 [29:52:16<7:18:25, 3.74s/it] {'loss': 0.1113, 'grad_norm': 0.9556340736393387, 'learning_rate': 1.063953730038388e-06, 'epoch': 0.79} 79%|███████▉ | 27245/34278 [29:52:16<7:18:25, 3.74s/it] 79%|███████▉ | 27246/34278 [29:52:19<6:53:04, 
3.52s/it] {'loss': 0.1024, 'grad_norm': 0.9754331264843032, 'learning_rate': 1.0636624040909765e-06, 'epoch': 0.79} 79%|███████▉ | 27246/34278 [29:52:19<6:53:04, 3.52s/it] 79%|███████▉ | 27247/34278 [29:52:22<6:30:24, 3.33s/it] {'loss': 0.1128, 'grad_norm': 0.8163054623079017, 'learning_rate': 1.0633711132862467e-06, 'epoch': 0.79} 79%|███████▉ | 27247/34278 [29:52:22<6:30:24, 3.33s/it] 79%|███████▉ | 27248/34278 [29:52:26<6:50:27, 3.50s/it] {'loss': 0.1115, 'grad_norm': 0.842210170911893, 'learning_rate': 1.0630798576268013e-06, 'epoch': 0.79} 79%|███████▉ | 27248/34278 [29:52:26<6:50:27, 3.50s/it] 79%|███████▉ | 27249/34278 [29:52:28<6:22:07, 3.26s/it] {'loss': 0.12, 'grad_norm': 0.7836559465763133, 'learning_rate': 1.062788637115239e-06, 'epoch': 0.79} 79%|███████▉ | 27249/34278 [29:52:28<6:22:07, 3.26s/it] 79%|███████▉ | 27250/34278 [29:52:31<6:12:46, 3.18s/it] {'loss': 0.0986, 'grad_norm': 0.7956972852306032, 'learning_rate': 1.0624974517541587e-06, 'epoch': 0.79} 79%|███████▉ | 27250/34278 [29:52:31<6:12:46, 3.18s/it] 79%|███████▉ | 27251/34278 [29:52:34<6:11:12, 3.17s/it] {'loss': 0.1198, 'grad_norm': 0.8141726113794314, 'learning_rate': 1.0622063015461603e-06, 'epoch': 0.79} 79%|███████▉ | 27251/34278 [29:52:34<6:11:12, 3.17s/it] 80%|███████▉ | 27252/34278 [29:52:37<5:59:22, 3.07s/it] {'loss': 0.1072, 'grad_norm': 0.7980079504447605, 'learning_rate': 1.0619151864938464e-06, 'epoch': 0.8} 80%|███████▉ | 27252/34278 [29:52:37<5:59:22, 3.07s/it] 80%|███████▉ | 27253/34278 [29:52:40<5:55:46, 3.04s/it] {'loss': 0.1124, 'grad_norm': 0.8830622236492937, 'learning_rate': 1.0616241065998134e-06, 'epoch': 0.8} 80%|███████▉ | 27253/34278 [29:52:40<5:55:46, 3.04s/it] 80%|███████▉ | 27254/34278 [29:52:44<6:04:08, 3.11s/it] {'loss': 0.1292, 'grad_norm': 0.7724068062364485, 'learning_rate': 1.0613330618666584e-06, 'epoch': 0.8} 80%|███████▉ | 27254/34278 [29:52:44<6:04:08, 3.11s/it] 80%|███████▉ | 27255/34278 [29:52:46<5:57:37, 3.06s/it] {'loss': 0.1105, 'grad_norm': 
0.9088773064446449, 'learning_rate': 1.0610420522969833e-06, 'epoch': 0.8} 80%|███████▉ | 27255/34278 [29:52:46<5:57:37, 3.06s/it] 80%|███████▉ | 27256/34278 [29:52:50<6:23:12, 3.27s/it] {'loss': 0.1221, 'grad_norm': 0.7981120777989499, 'learning_rate': 1.0607510778933828e-06, 'epoch': 0.8} 80%|███████▉ | 27256/34278 [29:52:50<6:23:12, 3.27s/it] 80%|███████▉ | 27257/34278 [29:52:54<6:33:47, 3.37s/it] {'loss': 0.1015, 'grad_norm': 0.9219924405039507, 'learning_rate': 1.0604601386584579e-06, 'epoch': 0.8} 80%|███████▉ | 27257/34278 [29:52:54<6:33:47, 3.37s/it] 80%|███████▉ | 27258/34278 [29:52:57<6:16:04, 3.21s/it] {'loss': 0.1263, 'grad_norm': 0.9552342074351698, 'learning_rate': 1.0601692345948033e-06, 'epoch': 0.8} 80%|███████▉ | 27258/34278 [29:52:57<6:16:04, 3.21s/it] 80%|███████▉ | 27259/34278 [29:53:00<6:24:02, 3.28s/it] {'loss': 0.1196, 'grad_norm': 0.9333306679607168, 'learning_rate': 1.0598783657050183e-06, 'epoch': 0.8} 80%|███████▉ | 27259/34278 [29:53:00<6:24:02, 3.28s/it] 80%|███████▉ | 27260/34278 [29:53:03<6:05:31, 3.13s/it] {'loss': 0.1193, 'grad_norm': 0.7489530231122494, 'learning_rate': 1.0595875319916977e-06, 'epoch': 0.8} 80%|███████▉ | 27260/34278 [29:53:03<6:05:31, 3.13s/it] 80%|███████▉ | 27261/34278 [29:53:06<6:04:37, 3.12s/it] {'loss': 0.1002, 'grad_norm': 0.867807944720601, 'learning_rate': 1.0592967334574394e-06, 'epoch': 0.8} 80%|███████▉ | 27261/34278 [29:53:06<6:04:37, 3.12s/it] 80%|███████▉ | 27262/34278 [29:53:09<5:59:28, 3.07s/it] {'loss': 0.1106, 'grad_norm': 1.0469200478889713, 'learning_rate': 1.059005970104839e-06, 'epoch': 0.8} 80%|███████▉ | 27262/34278 [29:53:09<5:59:28, 3.07s/it] 80%|███████▉ | 27263/34278 [29:53:14<6:53:07, 3.53s/it] {'loss': 0.122, 'grad_norm': 0.9471704814576529, 'learning_rate': 1.0587152419364926e-06, 'epoch': 0.8} 80%|███████▉ | 27263/34278 [29:53:14<6:53:07, 3.53s/it] 80%|███████▉ | 27264/34278 [29:53:17<6:45:14, 3.47s/it] {'loss': 0.1129, 'grad_norm': 0.9068258596768978, 'learning_rate': 
1.0584245489549956e-06, 'epoch': 0.8} 80%|███████▉ | 27264/34278 [29:53:17<6:45:14, 3.47s/it] 80%|███████▉ | 27265/34278 [29:53:21<6:59:31, 3.59s/it] {'loss': 0.1039, 'grad_norm': 0.8361631280770446, 'learning_rate': 1.0581338911629436e-06, 'epoch': 0.8} 80%|███████▉ | 27265/34278 [29:53:21<6:59:31, 3.59s/it] 80%|███████▉ | 27266/34278 [29:53:25<7:35:40, 3.90s/it] {'loss': 0.1398, 'grad_norm': 1.1017321696575815, 'learning_rate': 1.057843268562932e-06, 'epoch': 0.8} 80%|███████▉ | 27266/34278 [29:53:25<7:35:40, 3.90s/it] 80%|███████▉ | 27267/34278 [29:53:30<8:14:42, 4.23s/it] {'loss': 0.1375, 'grad_norm': 1.2778504875373855, 'learning_rate': 1.0575526811575526e-06, 'epoch': 0.8} 80%|███████▉ | 27267/34278 [29:53:30<8:14:42, 4.23s/it] 80%|███████▉ | 27268/34278 [29:53:34<7:39:28, 3.93s/it] {'loss': 0.1123, 'grad_norm': 0.7949750708281008, 'learning_rate': 1.0572621289494022e-06, 'epoch': 0.8} 80%|███████▉ | 27268/34278 [29:53:34<7:39:28, 3.93s/it] 80%|███████▉ | 27269/34278 [29:53:40<8:50:55, 4.54s/it] {'loss': 0.1003, 'grad_norm': 0.7750850559447942, 'learning_rate': 1.0569716119410755e-06, 'epoch': 0.8} 80%|███████▉ | 27269/34278 [29:53:40<8:50:55, 4.54s/it] 80%|███████▉ | 27270/34278 [29:53:43<8:00:20, 4.11s/it] {'loss': 0.1483, 'grad_norm': 0.9011085546946347, 'learning_rate': 1.0566811301351648e-06, 'epoch': 0.8} 80%|███████▉ | 27270/34278 [29:53:43<8:00:20, 4.11s/it] 80%|███████▉ | 27271/34278 [29:53:46<7:30:24, 3.86s/it] {'loss': 0.0983, 'grad_norm': 0.7779645887598234, 'learning_rate': 1.0563906835342624e-06, 'epoch': 0.8} 80%|███████▉ | 27271/34278 [29:53:46<7:30:24, 3.86s/it] 80%|███████▉ | 27272/34278 [29:53:52<8:49:54, 4.54s/it] {'loss': 0.1049, 'grad_norm': 0.7652944868889955, 'learning_rate': 1.0561002721409641e-06, 'epoch': 0.8} 80%|███████▉ | 27272/34278 [29:53:52<8:49:54, 4.54s/it] 80%|███████▉ | 27273/34278 [29:53:56<8:17:18, 4.26s/it] {'loss': 0.1393, 'grad_norm': 0.8896689235920401, 'learning_rate': 1.0558098959578612e-06, 'epoch': 0.8} 
80%|███████▉ | 27273/34278 [29:53:56<8:17:18, 4.26s/it] 80%|███████▉ | 27274/34278 [29:53:59<7:48:30, 4.01s/it] {'loss': 0.1288, 'grad_norm': 2.154808598446225, 'learning_rate': 1.0555195549875425e-06, 'epoch': 0.8} 80%|███████▉ | 27274/34278 [29:53:59<7:48:30, 4.01s/it] 80%|███████▉ | 27275/34278 [29:54:02<7:17:17, 3.75s/it] {'loss': 0.1057, 'grad_norm': 0.8105039252933518, 'learning_rate': 1.055229249232607e-06, 'epoch': 0.8} 80%|███████▉ | 27275/34278 [29:54:02<7:17:17, 3.75s/it] 80%|███████▉ | 27276/34278 [29:54:06<7:07:53, 3.67s/it] {'loss': 0.1068, 'grad_norm': 0.7601961340534943, 'learning_rate': 1.0549389786956427e-06, 'epoch': 0.8} 80%|███████▉ | 27276/34278 [29:54:06<7:07:53, 3.67s/it] 80%|███████▉ | 27277/34278 [29:54:09<6:53:30, 3.54s/it] {'loss': 0.1056, 'grad_norm': 0.9367799725960024, 'learning_rate': 1.05464874337924e-06, 'epoch': 0.8} 80%|███████▉ | 27277/34278 [29:54:09<6:53:30, 3.54s/it] 80%|███████▉ | 27278/34278 [29:54:13<6:58:11, 3.58s/it] {'loss': 0.1142, 'grad_norm': 1.3257994814465035, 'learning_rate': 1.0543585432859938e-06, 'epoch': 0.8} 80%|███████▉ | 27278/34278 [29:54:13<6:58:11, 3.58s/it] 80%|███████▉ | 27279/34278 [29:54:16<6:43:01, 3.45s/it] {'loss': 0.1139, 'grad_norm': 0.8538977594302306, 'learning_rate': 1.0540683784184902e-06, 'epoch': 0.8} 80%|███████▉ | 27279/34278 [29:54:16<6:43:01, 3.45s/it] 80%|███████▉ | 27280/34278 [29:54:20<7:21:21, 3.78s/it] {'loss': 0.1112, 'grad_norm': 0.8283071662944053, 'learning_rate': 1.0537782487793242e-06, 'epoch': 0.8} 80%|███████▉ | 27280/34278 [29:54:20<7:21:21, 3.78s/it] 80%|███████▉ | 27281/34278 [29:54:24<7:06:00, 3.65s/it] {'loss': 0.1116, 'grad_norm': 0.8035194992437611, 'learning_rate': 1.0534881543710823e-06, 'epoch': 0.8} 80%|███████▉ | 27281/34278 [29:54:24<7:06:00, 3.65s/it] 80%|███████▉ | 27282/34278 [29:54:27<7:01:46, 3.62s/it] {'loss': 0.0872, 'grad_norm': 0.6971214743367149, 'learning_rate': 1.0531980951963572e-06, 'epoch': 0.8} 80%|███████▉ | 27282/34278 [29:54:27<7:01:46, 
[training log excerpt, condensed — duplicated tqdm progress lines removed]

Steps 27283–27475 of 34278 (80%, epoch 0.8), wall-clock ~29:54:30 to ~30:06:42, throughput roughly 3.0–5.0 s/it (ETA fluctuating between ~5:40 and ~9:30 hours).

Per-step metrics over this window:
  loss:          0.079 – 0.154 (typical ~0.11)
  grad_norm:     0.65 – 1.66 (typical ~0.85)
  learning_rate: decaying from 1.0529e-06 at step 27283 to 9.982e-07 at step 27474

Around step 27420 a warning was emitted:
  /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
    warnings.warn(
{'loss': 0.124, 'grad_norm': 0.7297808765548561, 'learning_rate': 9.978794138631153e-07, 'epoch': 0.8} 80%|████████ | 27475/34278 [30:06:42<6:54:19, 3.65s/it] 80%|████████ | 27476/34278 [30:06:45<6:40:29, 3.53s/it] {'loss': 0.1283, 'grad_norm': 0.8622994255567406, 'learning_rate': 9.975962385881688e-07, 'epoch': 0.8} 80%|████████ | 27476/34278 [30:06:45<6:40:29, 3.53s/it] 80%|████████ | 27477/34278 [30:06:49<6:40:54, 3.54s/it] {'loss': 0.123, 'grad_norm': 0.8824422420247346, 'learning_rate': 9.973130990457285e-07, 'epoch': 0.8} 80%|████████ | 27477/34278 [30:06:49<6:40:54, 3.54s/it] 80%|████████ | 27478/34278 [30:06:52<6:21:41, 3.37s/it] {'loss': 0.1197, 'grad_norm': 0.8055058362559767, 'learning_rate': 9.9702999523832e-07, 'epoch': 0.8} 80%|████████ | 27478/34278 [30:06:52<6:21:41, 3.37s/it] 80%|████████ | 27479/34278 [30:06:55<6:26:23, 3.41s/it] {'loss': 0.1043, 'grad_norm': 0.7574393173391978, 'learning_rate': 9.967469271684732e-07, 'epoch': 0.8} 80%|████████ | 27479/34278 [30:06:55<6:26:23, 3.41s/it] 80%|████████ | 27480/34278 [30:06:59<6:45:41, 3.58s/it] {'loss': 0.1188, 'grad_norm': 0.9211481717686488, 'learning_rate': 9.964638948387145e-07, 'epoch': 0.8} 80%|████████ | 27480/34278 [30:06:59<6:45:41, 3.58s/it] 80%|████████ | 27481/34278 [30:07:02<6:28:31, 3.43s/it] {'loss': 0.1512, 'grad_norm': 1.547594870813011, 'learning_rate': 9.961808982515693e-07, 'epoch': 0.8} 80%|████████ | 27481/34278 [30:07:02<6:28:31, 3.43s/it] 80%|████████ | 27482/34278 [30:07:05<6:13:11, 3.29s/it] {'loss': 0.1156, 'grad_norm': 0.9228661760817329, 'learning_rate': 9.95897937409565e-07, 'epoch': 0.8} 80%|████████ | 27482/34278 [30:07:05<6:13:11, 3.29s/it] 80%|████████ | 27483/34278 [30:07:09<6:38:14, 3.52s/it] {'loss': 0.1137, 'grad_norm': 0.7546212666172488, 'learning_rate': 9.956150123152291e-07, 'epoch': 0.8} 80%|████████ | 27483/34278 [30:07:09<6:38:14, 3.52s/it] 80%|████████ | 27484/34278 [30:07:12<6:17:06, 3.33s/it] {'loss': 0.1259, 'grad_norm': 1.1521681964485182, 
'learning_rate': 9.953321229710854e-07, 'epoch': 0.8} 80%|████████ | 27484/34278 [30:07:12<6:17:06, 3.33s/it] 80%|████████ | 27485/34278 [30:07:18<7:47:19, 4.13s/it] {'loss': 0.1, 'grad_norm': 0.9702727280511132, 'learning_rate': 9.95049269379662e-07, 'epoch': 0.8} 80%|████████ | 27485/34278 [30:07:18<7:47:19, 4.13s/it] 80%|████████ | 27486/34278 [30:07:21<7:09:38, 3.80s/it] {'loss': 0.1016, 'grad_norm': 0.8129101381875943, 'learning_rate': 9.947664515434823e-07, 'epoch': 0.8} 80%|████████ | 27486/34278 [30:07:21<7:09:38, 3.80s/it] 80%|████████ | 27487/34278 [30:07:27<8:02:11, 4.26s/it] {'loss': 0.1346, 'grad_norm': 0.7833678546779559, 'learning_rate': 9.944836694650706e-07, 'epoch': 0.8} 80%|████████ | 27487/34278 [30:07:27<8:02:11, 4.26s/it] 80%|████████ | 27488/34278 [30:07:29<7:12:32, 3.82s/it] {'loss': 0.1318, 'grad_norm': 0.9835583680742942, 'learning_rate': 9.942009231469524e-07, 'epoch': 0.8} 80%|████████ | 27488/34278 [30:07:29<7:12:32, 3.82s/it] 80%|████████ | 27489/34278 [30:07:33<6:50:48, 3.63s/it] {'loss': 0.1029, 'grad_norm': 0.8395465308407088, 'learning_rate': 9.939182125916535e-07, 'epoch': 0.8} 80%|████████ | 27489/34278 [30:07:33<6:50:48, 3.63s/it] 80%|████████ | 27490/34278 [30:07:36<6:32:37, 3.47s/it] {'loss': 0.13, 'grad_norm': 1.2117427699435273, 'learning_rate': 9.936355378016965e-07, 'epoch': 0.8} 80%|████████ | 27490/34278 [30:07:36<6:32:37, 3.47s/it] 80%|████████ | 27491/34278 [30:07:40<6:47:45, 3.60s/it] {'loss': 0.1078, 'grad_norm': 0.9255749330060353, 'learning_rate': 9.933528987796037e-07, 'epoch': 0.8} 80%|████████ | 27491/34278 [30:07:40<6:47:45, 3.60s/it] 80%|████████ | 27492/34278 [30:07:43<6:43:03, 3.56s/it] {'loss': 0.1263, 'grad_norm': 0.8952781065033935, 'learning_rate': 9.93070295527901e-07, 'epoch': 0.8} 80%|████████ | 27492/34278 [30:07:43<6:43:03, 3.56s/it] 80%|████████ | 27493/34278 [30:07:48<7:21:53, 3.91s/it] {'loss': 0.126, 'grad_norm': 0.8174950489610706, 'learning_rate': 9.9278772804911e-07, 'epoch': 0.8} 
80%|████████ | 27493/34278 [30:07:48<7:21:53, 3.91s/it] 80%|████████ | 27494/34278 [30:07:51<7:03:03, 3.74s/it] {'loss': 0.1304, 'grad_norm': 0.8633872550982359, 'learning_rate': 9.92505196345752e-07, 'epoch': 0.8} 80%|████████ | 27494/34278 [30:07:51<7:03:03, 3.74s/it] 80%|████████ | 27495/34278 [30:07:54<6:25:39, 3.41s/it] {'loss': 0.1055, 'grad_norm': 0.895939107570586, 'learning_rate': 9.922227004203517e-07, 'epoch': 0.8} 80%|████████ | 27495/34278 [30:07:54<6:25:39, 3.41s/it] 80%|████████ | 27496/34278 [30:07:58<6:50:18, 3.63s/it] {'loss': 0.1054, 'grad_norm': 0.8810439537896332, 'learning_rate': 9.919402402754314e-07, 'epoch': 0.8} 80%|████████ | 27496/34278 [30:07:58<6:50:18, 3.63s/it] 80%|████████ | 27497/34278 [30:08:02<6:48:31, 3.61s/it] {'loss': 0.0878, 'grad_norm': 0.7126229364810918, 'learning_rate': 9.916578159135114e-07, 'epoch': 0.8} 80%|████████ | 27497/34278 [30:08:02<6:48:31, 3.61s/it] 80%|████████ | 27498/34278 [30:08:05<6:35:32, 3.50s/it] {'loss': 0.1233, 'grad_norm': 0.7414594488993557, 'learning_rate': 9.913754273371128e-07, 'epoch': 0.8} 80%|████████ | 27498/34278 [30:08:05<6:35:32, 3.50s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff4d81d9bc0> Failed to fetch sample 3110971. Exception: cannot identify image file <_io.BytesIO object at 0x7ff4d81d9bc0> 80%|████████ | 27499/34278 [30:08:08<6:11:44, 3.29s/it] {'loss': 0.1364, 'grad_norm': 0.9218964391377544, 'learning_rate': 9.910930745487586e-07, 'epoch': 0.8} 80%|████████ | 27499/34278 [30:08:08<6:11:44, 3.29s/it] 80%|████████ | 27500/34278 [30:08:11<6:03:37, 3.22s/it] {'loss': 0.1068, 'grad_norm': 0.8567387959737994, 'learning_rate': 9.908107575509673e-07, 'epoch': 0.8} 80%|████████ | 27500/34278 [30:08:11<6:03:37, 3.22s/it] 80%|████████ | 27501/34278 [30:08:14<6:05:02, 3.23s/it] {'loss': 0.1169, 'grad_norm': 0.7935055703829693, 'learning_rate': 9.905284763462603e-07, 'epoch': 0.8} 80%|████████ | 27501/34278 [30:08:14<6:05:02, 3.23s/it] 80%|████████ | 27502/34278 [30:08:17<6:10:55, 3.28s/it] {'loss': 0.1273, 'grad_norm': 0.8315308574392503, 'learning_rate': 9.90246230937159e-07, 'epoch': 0.8} 80%|████████ | 27502/34278 [30:08:17<6:10:55, 3.28s/it] 80%|████████ | 27503/34278 [30:08:20<6:01:36, 3.20s/it] {'loss': 0.1268, 'grad_norm': 0.7955133952656952, 'learning_rate': 9.899640213261823e-07, 'epoch': 0.8} 80%|████████ | 27503/34278 [30:08:20<6:01:36, 3.20s/it] 80%|████████ | 27504/34278 [30:08:25<6:39:07, 3.54s/it] {'loss': 0.1217, 'grad_norm': 0.9488374605414202, 'learning_rate': 9.89681847515848e-07, 'epoch': 0.8} 80%|████████ | 27504/34278 [30:08:25<6:39:07, 3.54s/it] 80%|████████ | 27505/34278 [30:08:27<6:14:00, 3.31s/it] {'loss': 
0.1246, 'grad_norm': 0.7815366441533715, 'learning_rate': 9.893997095086788e-07, 'epoch': 0.8} 80%|████████ | 27505/34278 [30:08:27<6:14:00, 3.31s/it] 80%|████████ | 27506/34278 [30:08:32<6:47:53, 3.61s/it] {'loss': 0.1101, 'grad_norm': 0.9385181131703689, 'learning_rate': 9.891176073071896e-07, 'epoch': 0.8} 80%|████████ | 27506/34278 [30:08:32<6:47:53, 3.61s/it] 80%|████████ | 27507/34278 [30:08:35<6:20:48, 3.37s/it] {'loss': 0.1133, 'grad_norm': 0.8404212265666252, 'learning_rate': 9.888355409139027e-07, 'epoch': 0.8} 80%|████████ | 27507/34278 [30:08:35<6:20:48, 3.37s/it] 80%|████████ | 27508/34278 [30:08:39<6:44:04, 3.58s/it] {'loss': 0.1087, 'grad_norm': 0.7111770796997006, 'learning_rate': 9.88553510331333e-07, 'epoch': 0.8} 80%|████████ | 27508/34278 [30:08:39<6:44:04, 3.58s/it] 80%|████████ | 27509/34278 [30:08:42<6:25:39, 3.42s/it] {'loss': 0.1061, 'grad_norm': 0.85651853202019, 'learning_rate': 9.882715155620015e-07, 'epoch': 0.8} 80%|████████ | 27509/34278 [30:08:42<6:25:39, 3.42s/it] 80%|████████ | 27510/34278 [30:08:45<6:32:34, 3.48s/it] {'loss': 0.142, 'grad_norm': 0.9364010872011337, 'learning_rate': 9.879895566084241e-07, 'epoch': 0.8} 80%|████████ | 27510/34278 [30:08:45<6:32:34, 3.48s/it] 80%|████████ | 27511/34278 [30:08:51<7:45:50, 4.13s/it] {'loss': 0.0942, 'grad_norm': 0.8477630588934852, 'learning_rate': 9.877076334731167e-07, 'epoch': 0.8} 80%|████████ | 27511/34278 [30:08:51<7:45:50, 4.13s/it] 80%|████████ | 27512/34278 [30:08:57<8:38:08, 4.59s/it] {'loss': 0.135, 'grad_norm': 0.9740602613692005, 'learning_rate': 9.874257461585979e-07, 'epoch': 0.8} 80%|████████ | 27512/34278 [30:08:57<8:38:08, 4.59s/it] 80%|████████ | 27513/34278 [30:09:00<8:11:15, 4.36s/it] {'loss': 0.1097, 'grad_norm': 1.0255415594007768, 'learning_rate': 9.871438946673855e-07, 'epoch': 0.8} 80%|████████ | 27513/34278 [30:09:00<8:11:15, 4.36s/it] 80%|████████ | 27514/34278 [30:09:06<9:03:03, 4.82s/it] {'loss': 0.101, 'grad_norm': 0.867708343555253, 'learning_rate': 
9.868620790019929e-07, 'epoch': 0.8} 80%|████████ | 27514/34278 [30:09:06<9:03:03, 4.82s/it] 80%|████████ | 27515/34278 [30:09:09<8:00:28, 4.26s/it] {'loss': 0.1145, 'grad_norm': 1.0916727839946327, 'learning_rate': 9.865802991649393e-07, 'epoch': 0.8} 80%|████████ | 27515/34278 [30:09:09<8:00:28, 4.26s/it] 80%|████████ | 27516/34278 [30:09:13<7:26:39, 3.96s/it] {'loss': 0.1057, 'grad_norm': 0.7076756699990595, 'learning_rate': 9.862985551587384e-07, 'epoch': 0.8} 80%|████████ | 27516/34278 [30:09:13<7:26:39, 3.96s/it] 80%|████████ | 27517/34278 [30:09:16<6:56:47, 3.70s/it] {'loss': 0.0968, 'grad_norm': 0.8077215592786581, 'learning_rate': 9.86016846985905e-07, 'epoch': 0.8} 80%|████████ | 27517/34278 [30:09:16<6:56:47, 3.70s/it] 80%|████████ | 27518/34278 [30:09:19<6:49:40, 3.64s/it] {'loss': 0.1089, 'grad_norm': 0.7284569848091642, 'learning_rate': 9.857351746489546e-07, 'epoch': 0.8} 80%|████████ | 27518/34278 [30:09:19<6:49:40, 3.64s/it] 80%|████████ | 27519/34278 [30:09:22<6:33:23, 3.49s/it] {'loss': 0.1124, 'grad_norm': 0.9925048104491448, 'learning_rate': 9.854535381504038e-07, 'epoch': 0.8} 80%|████████ | 27519/34278 [30:09:22<6:33:23, 3.49s/it] 80%|████████ | 27520/34278 [30:09:27<6:59:45, 3.73s/it] {'loss': 0.1149, 'grad_norm': 1.082531906295039, 'learning_rate': 9.851719374927655e-07, 'epoch': 0.8} 80%|████████ | 27520/34278 [30:09:27<6:59:45, 3.73s/it] 80%|████████ | 27521/34278 [30:09:31<7:15:15, 3.86s/it] {'loss': 0.1243, 'grad_norm': 0.9989830487617188, 'learning_rate': 9.848903726785518e-07, 'epoch': 0.8} 80%|████████ | 27521/34278 [30:09:31<7:15:15, 3.86s/it] 80%|████████ | 27522/34278 [30:09:36<7:47:19, 4.15s/it] {'loss': 0.1294, 'grad_norm': 0.6992970445341425, 'learning_rate': 9.846088437102802e-07, 'epoch': 0.8} 80%|████████ | 27522/34278 [30:09:36<7:47:19, 4.15s/it] 80%|████████ | 27523/34278 [30:09:39<7:28:52, 3.99s/it] {'loss': 0.1084, 'grad_norm': 0.7919031635809013, 'learning_rate': 9.843273505904622e-07, 'epoch': 0.8} 80%|████████ | 
27523/34278 [30:09:39<7:28:52, 3.99s/it] 80%|████████ | 27524/34278 [30:09:42<6:53:11, 3.67s/it] {'loss': 0.1195, 'grad_norm': 0.9061890772451463, 'learning_rate': 9.840458933216097e-07, 'epoch': 0.8} 80%|████████ | 27524/34278 [30:09:42<6:53:11, 3.67s/it] 80%|████████ | 27525/34278 [30:09:48<8:17:10, 4.42s/it] {'loss': 0.083, 'grad_norm': 1.0522030653377388, 'learning_rate': 9.837644719062367e-07, 'epoch': 0.8} 80%|████████ | 27525/34278 [30:09:48<8:17:10, 4.42s/it] 80%|████████ | 27526/34278 [30:09:53<8:21:07, 4.45s/it] {'loss': 0.1188, 'grad_norm': 0.6484002636183097, 'learning_rate': 9.834830863468575e-07, 'epoch': 0.8} 80%|████████ | 27526/34278 [30:09:53<8:21:07, 4.45s/it] 80%|████████ | 27527/34278 [30:09:56<7:35:19, 4.05s/it] {'loss': 0.1042, 'grad_norm': 0.7819960455186077, 'learning_rate': 9.832017366459817e-07, 'epoch': 0.8} 80%|████████ | 27527/34278 [30:09:56<7:35:19, 4.05s/it] 80%|████████ | 27528/34278 [30:09:59<6:49:37, 3.64s/it] {'loss': 0.1427, 'grad_norm': 0.9196624254240777, 'learning_rate': 9.829204228061212e-07, 'epoch': 0.8} 80%|████████ | 27528/34278 [30:09:59<6:49:37, 3.64s/it] 80%|████████ | 27529/34278 [30:10:02<6:35:46, 3.52s/it] {'loss': 0.1036, 'grad_norm': 0.7512637588587939, 'learning_rate': 9.826391448297895e-07, 'epoch': 0.8} 80%|████████ | 27529/34278 [30:10:02<6:35:46, 3.52s/it] 80%|████████ | 27530/34278 [30:10:05<6:10:11, 3.29s/it] {'loss': 0.1025, 'grad_norm': 0.761399698788126, 'learning_rate': 9.82357902719495e-07, 'epoch': 0.8} 80%|████████ | 27530/34278 [30:10:05<6:10:11, 3.29s/it] 80%|████████ | 27531/34278 [30:10:09<6:42:33, 3.58s/it] {'loss': 0.1362, 'grad_norm': 0.7884522314321935, 'learning_rate': 9.820766964777501e-07, 'epoch': 0.8} 80%|████████ | 27531/34278 [30:10:09<6:42:33, 3.58s/it] 80%|████████ | 27532/34278 [30:10:11<6:08:49, 3.28s/it] {'loss': 0.114, 'grad_norm': 0.8762764995908606, 'learning_rate': 9.817955261070666e-07, 'epoch': 0.8} 80%|████████ | 27532/34278 [30:10:11<6:08:49, 3.28s/it] 80%|████████ | 
27533/34278 [30:10:15<6:15:00, 3.34s/it] {'loss': 0.0966, 'grad_norm': 0.7402311609140194, 'learning_rate': 9.815143916099533e-07, 'epoch': 0.8} 80%|████████ | 27533/34278 [30:10:15<6:15:00, 3.34s/it] 80%|████████ | 27534/34278 [30:10:18<6:01:57, 3.22s/it] {'loss': 0.1055, 'grad_norm': 0.9790257807777503, 'learning_rate': 9.81233292988919e-07, 'epoch': 0.8} 80%|████████ | 27534/34278 [30:10:18<6:01:57, 3.22s/it] 80%|████████ | 27535/34278 [30:10:21<6:01:23, 3.22s/it] {'loss': 0.1247, 'grad_norm': 0.9065201993239207, 'learning_rate': 9.809522302464757e-07, 'epoch': 0.8} 80%|████████ | 27535/34278 [30:10:21<6:01:23, 3.22s/it] 80%|████████ | 27536/34278 [30:10:24<6:04:21, 3.24s/it] {'loss': 0.1106, 'grad_norm': 0.9448381142552766, 'learning_rate': 9.806712033851307e-07, 'epoch': 0.8} 80%|████████ | 27536/34278 [30:10:24<6:04:21, 3.24s/it] 80%|████████ | 27537/34278 [30:10:28<6:18:41, 3.37s/it] {'loss': 0.102, 'grad_norm': 0.9193845004981372, 'learning_rate': 9.803902124073945e-07, 'epoch': 0.8} 80%|████████ | 27537/34278 [30:10:28<6:18:41, 3.37s/it] 80%|████████ | 27538/34278 [30:10:33<7:21:28, 3.93s/it] {'loss': 0.1209, 'grad_norm': 0.8344915088105022, 'learning_rate': 9.801092573157734e-07, 'epoch': 0.8} 80%|████████ | 27538/34278 [30:10:33<7:21:28, 3.93s/it] 80%|████████ | 27539/34278 [30:10:37<7:09:09, 3.82s/it] {'loss': 0.1067, 'grad_norm': 0.7923926602430644, 'learning_rate': 9.798283381127793e-07, 'epoch': 0.8} 80%|████████ | 27539/34278 [30:10:37<7:09:09, 3.82s/it] 80%|████████ | 27540/34278 [30:10:40<6:53:40, 3.68s/it] {'loss': 0.1148, 'grad_norm': 0.831507834185248, 'learning_rate': 9.795474548009176e-07, 'epoch': 0.8} 80%|████████ | 27540/34278 [30:10:40<6:53:40, 3.68s/it] 80%|████████ | 27541/34278 [30:10:44<6:44:18, 3.60s/it] {'loss': 0.1399, 'grad_norm': 0.7731722617722381, 'learning_rate': 9.792666073826952e-07, 'epoch': 0.8} 80%|████████ | 27541/34278 [30:10:44<6:44:18, 3.60s/it] 80%|████████ | 27542/34278 [30:10:50<8:04:13, 4.31s/it] {'loss': 0.086, 
'grad_norm': 1.0448949677239212, 'learning_rate': 9.789857958606207e-07, 'epoch': 0.8} 80%|████████ | 27542/34278 [30:10:50<8:04:13, 4.31s/it] 80%|████████ | 27543/34278 [30:10:53<7:20:09, 3.92s/it] {'loss': 0.0958, 'grad_norm': 0.7869590606394334, 'learning_rate': 9.787050202372023e-07, 'epoch': 0.8} 80%|████████ | 27543/34278 [30:10:53<7:20:09, 3.92s/it] 80%|████████ | 27544/34278 [30:10:56<6:48:07, 3.64s/it] {'loss': 0.1138, 'grad_norm': 1.0754165322432865, 'learning_rate': 9.784242805149442e-07, 'epoch': 0.8} 80%|████████ | 27544/34278 [30:10:56<6:48:07, 3.64s/it] 80%|████████ | 27545/34278 [30:10:59<6:47:24, 3.63s/it] {'loss': 0.1325, 'grad_norm': 0.8133889982205859, 'learning_rate': 9.78143576696356e-07, 'epoch': 0.8} 80%|████████ | 27545/34278 [30:10:59<6:47:24, 3.63s/it] 80%|████████ | 27546/34278 [30:11:03<6:47:56, 3.64s/it] {'loss': 0.0991, 'grad_norm': 0.8810615591701374, 'learning_rate': 9.778629087839414e-07, 'epoch': 0.8} 80%|████████ | 27546/34278 [30:11:03<6:47:56, 3.64s/it] 80%|████████ | 27547/34278 [30:11:06<6:33:10, 3.50s/it] {'loss': 0.1087, 'grad_norm': 1.1282760815960862, 'learning_rate': 9.77582276780205e-07, 'epoch': 0.8} 80%|████████ | 27547/34278 [30:11:06<6:33:10, 3.50s/it] 80%|████████ | 27548/34278 [30:11:09<6:27:15, 3.45s/it] {'loss': 0.1304, 'grad_norm': 1.0124644803125806, 'learning_rate': 9.77301680687654e-07, 'epoch': 0.8} 80%|████████ | 27548/34278 [30:11:09<6:27:15, 3.45s/it] 80%|████████ | 27549/34278 [30:11:13<6:35:53, 3.53s/it] {'loss': 0.1149, 'grad_norm': 0.8586315951501855, 'learning_rate': 9.770211205087948e-07, 'epoch': 0.8} 80%|████████ | 27549/34278 [30:11:13<6:35:53, 3.53s/it] 80%|████████ | 27550/34278 [30:11:16<6:12:48, 3.32s/it] {'loss': 0.1087, 'grad_norm': 1.2215218079973935, 'learning_rate': 9.767405962461306e-07, 'epoch': 0.8} 80%|████████ | 27550/34278 [30:11:16<6:12:48, 3.32s/it] 80%|████████ | 27551/34278 [30:11:19<6:10:52, 3.31s/it] {'loss': 0.1246, 'grad_norm': 0.757236596094083, 'learning_rate': 
9.764601079021645e-07, 'epoch': 0.8} 80%|████████ | 27551/34278 [30:11:19<6:10:52, 3.31s/it] 80%|████████ | 27552/34278 [30:11:23<6:20:06, 3.39s/it] {'loss': 0.1124, 'grad_norm': 0.7812827316456944, 'learning_rate': 9.761796554794034e-07, 'epoch': 0.8} 80%|████████ | 27552/34278 [30:11:23<6:20:06, 3.39s/it] 80%|████████ | 27553/34278 [30:11:26<6:05:54, 3.26s/it] {'loss': 0.1265, 'grad_norm': 0.9113787930889454, 'learning_rate': 9.7589923898035e-07, 'epoch': 0.8} 80%|████████ | 27553/34278 [30:11:26<6:05:54, 3.26s/it] 80%|████████ | 27554/34278 [30:11:30<6:46:18, 3.63s/it] {'loss': 0.1233, 'grad_norm': 0.9763738642644213, 'learning_rate': 9.75618858407506e-07, 'epoch': 0.8} 80%|████████ | 27554/34278 [30:11:30<6:46:18, 3.63s/it] 80%|████████ | 27555/34278 [30:11:33<6:35:55, 3.53s/it] {'loss': 0.1041, 'grad_norm': 0.7342411020902224, 'learning_rate': 9.753385137633764e-07, 'epoch': 0.8} 80%|████████ | 27555/34278 [30:11:33<6:35:55, 3.53s/it] 80%|████████ | 27556/34278 [30:11:37<6:35:29, 3.53s/it] {'loss': 0.1276, 'grad_norm': 0.8643422526208991, 'learning_rate': 9.750582050504648e-07, 'epoch': 0.8} 80%|████████ | 27556/34278 [30:11:37<6:35:29, 3.53s/it] 80%|████████ | 27557/34278 [30:11:40<6:32:14, 3.50s/it] {'loss': 0.1224, 'grad_norm': 0.8031755648302986, 'learning_rate': 9.74777932271273e-07, 'epoch': 0.8} 80%|████████ | 27557/34278 [30:11:40<6:32:14, 3.50s/it] 80%|████████ | 27558/34278 [30:11:44<6:30:37, 3.49s/it] {'loss': 0.132, 'grad_norm': 0.941386872050956, 'learning_rate': 9.744976954283013e-07, 'epoch': 0.8} 80%|████████ | 27558/34278 [30:11:44<6:30:37, 3.49s/it] 80%|████████ | 27559/34278 [30:11:47<6:27:43, 3.46s/it] {'loss': 0.1092, 'grad_norm': 0.6966302439469197, 'learning_rate': 9.742174945240545e-07, 'epoch': 0.8} 80%|████████ | 27559/34278 [30:11:47<6:27:43, 3.46s/it] 80%|████████ | 27560/34278 [30:11:52<7:09:41, 3.84s/it] {'loss': 0.0967, 'grad_norm': 0.7677355592201016, 'learning_rate': 9.739373295610322e-07, 'epoch': 0.8} 80%|████████ | 
27560/34278 [30:11:52<7:09:41, 3.84s/it] 80%|████████ | 27561/34278 [30:11:58<8:17:52, 4.45s/it] {'loss': 0.1289, 'grad_norm': 0.8254770337618673, 'learning_rate': 9.736572005417354e-07, 'epoch': 0.8} 80%|████████ | 27561/34278 [30:11:58<8:17:52, 4.45s/it] 80%|████████ | 27562/34278 [30:12:03<8:41:18, 4.66s/it] {'loss': 0.1366, 'grad_norm': 0.7646539644982906, 'learning_rate': 9.733771074686681e-07, 'epoch': 0.8} 80%|████████ | 27562/34278 [30:12:03<8:41:18, 4.66s/it] 80%|████████ | 27563/34278 [30:12:09<9:19:54, 5.00s/it] {'loss': 0.1034, 'grad_norm': 0.7624287634823101, 'learning_rate': 9.730970503443281e-07, 'epoch': 0.8} 80%|████████ | 27563/34278 [30:12:09<9:19:54, 5.00s/it] 80%|████████ | 27564/34278 [30:12:15<9:48:25, 5.26s/it] {'loss': 0.1081, 'grad_norm': 0.9306076603852309, 'learning_rate': 9.728170291712153e-07, 'epoch': 0.8} 80%|████████ | 27564/34278 [30:12:15<9:48:25, 5.26s/it] 80%|████████ | 27565/34278 [30:12:19<9:00:16, 4.83s/it] {'loss': 0.1123, 'grad_norm': 0.7185652973623851, 'learning_rate': 9.725370439518323e-07, 'epoch': 0.8} 80%|████████ | 27565/34278 [30:12:19<9:00:16, 4.83s/it] 80%|████████ | 27566/34278 [30:12:24<9:27:20, 5.07s/it] {'loss': 0.1281, 'grad_norm': 1.0339619225291377, 'learning_rate': 9.722570946886755e-07, 'epoch': 0.8} 80%|████████ | 27566/34278 [30:12:24<9:27:20, 5.07s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 80%|████████ | 27567/34278 [30:12:27<8:24:27, 4.51s/it] {'loss': 0.126, 'grad_norm': 0.8357950887205879, 'learning_rate': 9.71977181384247e-07, 'epoch': 0.8} 80%|████████ | 27567/34278 [30:12:27<8:24:27, 4.51s/it] 80%|████████ | 27568/34278 [30:12:33<9:03:34, 4.86s/it] {'loss': 0.1074, 'grad_norm': 0.8086783951902399, 'learning_rate': 9.716973040410437e-07, 'epoch': 0.8} 80%|████████ | 27568/34278 [30:12:33<9:03:34, 4.86s/it] 80%|████████ | 27569/34278 [30:12:36<8:11:19, 4.39s/it] {'loss': 0.1335, 'grad_norm': 0.7949498671032257, 'learning_rate': 9.714174626615664e-07, 'epoch': 0.8} 80%|████████ | 27569/34278 [30:12:36<8:11:19, 4.39s/it] 80%|████████ | 27570/34278 [30:12:42<9:02:02, 4.85s/it] {'loss': 0.1377, 'grad_norm': 1.0117006930422152, 'learning_rate': 9.711376572483122e-07, 'epoch': 0.8} 80%|████████ | 27570/34278 [30:12:42<9:02:02, 4.85s/it] 80%|████████ | 27571/34278 [30:12:46<8:23:59, 4.51s/it] {'loss': 0.1108, 'grad_norm': 0.9456639991047978, 'learning_rate': 9.708578878037778e-07, 'epoch': 0.8} 80%|████████ | 27571/34278 [30:12:46<8:23:59, 4.51s/it] 80%|████████ | 27572/34278 [30:12:50<8:19:43, 4.47s/it] {'loss': 0.0948, 'grad_norm': 0.915343136462663, 'learning_rate': 9.705781543304627e-07, 'epoch': 0.8} 80%|████████ | 27572/34278 [30:12:50<8:19:43, 4.47s/it] 80%|████████ | 27573/34278 [30:12:55<8:26:22, 4.53s/it] {'loss': 0.1075, 'grad_norm': 0.8714827951311132, 'learning_rate': 9.702984568308654e-07, 'epoch': 0.8} 80%|████████ | 27573/34278 [30:12:55<8:26:22, 4.53s/it] 80%|████████ | 27574/34278 [30:12:59<7:53:22, 4.24s/it] {'loss': 0.1061, 'grad_norm': 0.884711263380116, 'learning_rate': 9.700187953074797e-07, 'epoch': 0.8} 80%|████████ | 27574/34278 [30:12:59<7:53:22, 4.24s/it] 80%|████████ | 27575/34278 [30:13:02<7:16:39, 3.91s/it] {'loss': 0.1145, 'grad_norm': 0.890139260451249, 'learning_rate': 9.697391697628056e-07, 'epoch': 0.8} 80%|████████ | 27575/34278 [30:13:02<7:16:39, 3.91s/it] 80%|████████ | 
27576/34278 [30:13:06<7:30:12, 4.03s/it] {'loss': 0.0963, 'grad_norm': 0.7615195481589713, 'learning_rate': 9.694595801993383e-07, 'epoch': 0.8} 80%|████████ | 27576/34278 [30:13:06<7:30:12, 4.03s/it] 80%|████████ | 27577/34278 [30:13:09<7:00:26, 3.76s/it] {'loss': 0.1417, 'grad_norm': 1.0854808140848085, 'learning_rate': 9.691800266195721e-07, 'epoch': 0.8} 80%|████████ | 27577/34278 [30:13:09<7:00:26, 3.76s/it] 80%|████████ | 27578/34278 [30:13:12<6:25:54, 3.46s/it] {'loss': 0.0975, 'grad_norm': 0.874197590124047, 'learning_rate': 9.689005090260045e-07, 'epoch': 0.8} 80%|████████ | 27578/34278 [30:13:12<6:25:54, 3.46s/it] 80%|████████ | 27579/34278 [30:13:15<6:16:16, 3.37s/it] {'loss': 0.1153, 'grad_norm': 0.7717465030444081, 'learning_rate': 9.686210274211321e-07, 'epoch': 0.8} 80%|████████ | 27579/34278 [30:13:15<6:16:16, 3.37s/it] 80%|████████ | 27580/34278 [30:13:18<6:02:10, 3.24s/it] {'loss': 0.0937, 'grad_norm': 0.9969176410958358, 'learning_rate': 9.683415818074493e-07, 'epoch': 0.8} 80%|████████ | 27580/34278 [30:13:18<6:02:10, 3.24s/it] 80%|████████ | 27581/34278 [30:13:21<6:06:59, 3.29s/it] {'loss': 0.1073, 'grad_norm': 0.8402278413277697, 'learning_rate': 9.680621721874483e-07, 'epoch': 0.8} 80%|████████ | 27581/34278 [30:13:21<6:06:59, 3.29s/it] 80%|████████ | 27582/34278 [30:13:27<7:39:54, 4.12s/it] {'loss': 0.1173, 'grad_norm': 0.8249377411077903, 'learning_rate': 9.67782798563628e-07, 'epoch': 0.8} 80%|████████ | 27582/34278 [30:13:27<7:39:54, 4.12s/it] 80%|████████ | 27583/34278 [30:13:34<8:53:07, 4.78s/it] {'loss': 0.1256, 'grad_norm': 0.803192629308676, 'learning_rate': 9.675034609384792e-07, 'epoch': 0.8} 80%|████████ | 27583/34278 [30:13:34<8:53:07, 4.78s/it] 80%|████████ | 27584/34278 [30:13:37<8:11:29, 4.41s/it] {'loss': 0.1011, 'grad_norm': 1.1678074185282614, 'learning_rate': 9.672241593144965e-07, 'epoch': 0.8} 80%|████████ | 27584/34278 [30:13:37<8:11:29, 4.41s/it] 80%|████████ | 27585/34278 [30:13:41<7:33:10, 4.06s/it] {'loss': 0.1217, 
[training log, condensed: duplicate tqdm progress-bar echoes removed]
Steps 27585–27777 of 34278 (80%→81%, epoch 0.80→0.81), throughput ~3.1–4.7 s/it (elapsed ~30:13:41→30:25:40, ETA ~7:33:10→~6:37:18).
Per-step metrics over this span: 'loss' fluctuating between ~0.074 and ~0.148, 'grad_norm' between ~0.65 and ~1.62, 'learning_rate' decaying smoothly from 9.6694e-07 (step 27585) to 9.1427e-07 (step 27776); final record at step 27777 truncated mid-entry ('learning_rate':).
At step 27698, one rank emitted:
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
9.139959391985453e-07, 'epoch': 0.81} 81%|████████ | 27777/34278 [30:25:40<6:37:18, 3.67s/it] 81%|████████ | 27778/34278 [30:25:44<6:25:44, 3.56s/it] {'loss': 0.1179, 'grad_norm': 0.9358642936458822, 'learning_rate': 9.137236685736988e-07, 'epoch': 0.81} 81%|████████ | 27778/34278 [30:25:44<6:25:44, 3.56s/it] 81%|████████ | 27779/34278 [30:25:47<6:29:19, 3.59s/it] {'loss': 0.119, 'grad_norm': 0.8906320175570829, 'learning_rate': 9.134514344301537e-07, 'epoch': 0.81} 81%|████████ | 27779/34278 [30:25:47<6:29:19, 3.59s/it] 81%|████████ | 27780/34278 [30:25:53<7:47:48, 4.32s/it] {'loss': 0.1043, 'grad_norm': 0.9854779651616952, 'learning_rate': 9.131792367703385e-07, 'epoch': 0.81} 81%|████████ | 27780/34278 [30:25:53<7:47:48, 4.32s/it] 81%|████████ | 27781/34278 [30:25:57<7:13:37, 4.00s/it] {'loss': 0.1219, 'grad_norm': 0.9770178275001017, 'learning_rate': 9.129070755966807e-07, 'epoch': 0.81} 81%|████████ | 27781/34278 [30:25:57<7:13:37, 4.00s/it] 81%|████████ | 27782/34278 [30:26:01<7:20:46, 4.07s/it] {'loss': 0.1022, 'grad_norm': 0.7729644612952208, 'learning_rate': 9.126349509116156e-07, 'epoch': 0.81} 81%|████████ | 27782/34278 [30:26:01<7:20:46, 4.07s/it] 81%|████████ | 27783/34278 [30:26:05<7:22:03, 4.08s/it] {'loss': 0.1196, 'grad_norm': 1.0305427076463296, 'learning_rate': 9.123628627175696e-07, 'epoch': 0.81} 81%|████████ | 27783/34278 [30:26:05<7:22:03, 4.08s/it] 81%|████████ | 27784/34278 [30:26:08<6:46:15, 3.75s/it] {'loss': 0.1118, 'grad_norm': 1.077065156779928, 'learning_rate': 9.120908110169713e-07, 'epoch': 0.81} 81%|████████ | 27784/34278 [30:26:08<6:46:15, 3.75s/it] 81%|████████ | 27785/34278 [30:26:11<6:29:54, 3.60s/it] {'loss': 0.1193, 'grad_norm': 1.1021114832715748, 'learning_rate': 9.118187958122515e-07, 'epoch': 0.81} 81%|████████ | 27785/34278 [30:26:11<6:29:54, 3.60s/it] 81%|████████ | 27786/34278 [30:26:15<6:19:28, 3.51s/it] {'loss': 0.1157, 'grad_norm': 0.9724480263050775, 'learning_rate': 9.115468171058373e-07, 'epoch': 0.81} 
81%|████████ | 27786/34278 [30:26:15<6:19:28, 3.51s/it] 81%|████████ | 27787/34278 [30:26:18<6:02:47, 3.35s/it] {'loss': 0.0942, 'grad_norm': 0.944353015182522, 'learning_rate': 9.11274874900156e-07, 'epoch': 0.81} 81%|████████ | 27787/34278 [30:26:18<6:02:47, 3.35s/it] 81%|████████ | 27788/34278 [30:26:22<6:40:24, 3.70s/it] {'loss': 0.092, 'grad_norm': 1.1658182485253474, 'learning_rate': 9.110029691976368e-07, 'epoch': 0.81} 81%|████████ | 27788/34278 [30:26:22<6:40:24, 3.70s/it] 81%|████████ | 27789/34278 [30:26:26<6:46:22, 3.76s/it] {'loss': 0.1068, 'grad_norm': 0.8876520200068828, 'learning_rate': 9.10731100000708e-07, 'epoch': 0.81} 81%|████████ | 27789/34278 [30:26:26<6:46:22, 3.76s/it] 81%|████████ | 27790/34278 [30:26:32<7:48:00, 4.33s/it] {'loss': 0.1288, 'grad_norm': 0.819044641947194, 'learning_rate': 9.104592673117956e-07, 'epoch': 0.81} 81%|████████ | 27790/34278 [30:26:32<7:48:00, 4.33s/it] 81%|████████ | 27791/34278 [30:26:35<7:27:34, 4.14s/it] {'loss': 0.1207, 'grad_norm': 0.8926254486360656, 'learning_rate': 9.101874711333258e-07, 'epoch': 0.81} 81%|████████ | 27791/34278 [30:26:35<7:27:34, 4.14s/it] 81%|████████ | 27792/34278 [30:26:39<6:57:20, 3.86s/it] {'loss': 0.1231, 'grad_norm': 0.8922334135057118, 'learning_rate': 9.09915711467727e-07, 'epoch': 0.81} 81%|████████ | 27792/34278 [30:26:39<6:57:20, 3.86s/it] 81%|████████ | 27793/34278 [30:26:45<8:06:19, 4.50s/it] {'loss': 0.1105, 'grad_norm': 0.815953010826716, 'learning_rate': 9.09643988317423e-07, 'epoch': 0.81} 81%|████████ | 27793/34278 [30:26:45<8:06:19, 4.50s/it] 81%|████████ | 27794/34278 [30:26:50<8:25:33, 4.68s/it] {'loss': 0.1228, 'grad_norm': 0.8055476566901355, 'learning_rate': 9.093723016848427e-07, 'epoch': 0.81} 81%|████████ | 27794/34278 [30:26:50<8:25:33, 4.68s/it] 81%|████████ | 27795/34278 [30:26:53<7:37:46, 4.24s/it] {'loss': 0.1214, 'grad_norm': 0.9657413088559, 'learning_rate': 9.091006515724083e-07, 'epoch': 0.81} 81%|████████ | 27795/34278 [30:26:53<7:37:46, 4.24s/it] 
81%|████████ | 27796/34278 [30:26:56<7:14:06, 4.02s/it] {'loss': 0.1097, 'grad_norm': 0.9176982476270428, 'learning_rate': 9.088290379825481e-07, 'epoch': 0.81} 81%|████████ | 27796/34278 [30:26:56<7:14:06, 4.02s/it] 81%|████████ | 27797/34278 [30:27:02<8:20:56, 4.64s/it] {'loss': 0.1253, 'grad_norm': 1.1033443173486115, 'learning_rate': 9.085574609176856e-07, 'epoch': 0.81} 81%|████████ | 27797/34278 [30:27:02<8:20:56, 4.64s/it] 81%|████████ | 27798/34278 [30:27:05<7:27:45, 4.15s/it] {'loss': 0.1237, 'grad_norm': 0.8570696955032819, 'learning_rate': 9.082859203802436e-07, 'epoch': 0.81} 81%|████████ | 27798/34278 [30:27:05<7:27:45, 4.15s/it] 81%|████████ | 27799/34278 [30:27:09<6:54:59, 3.84s/it] {'loss': 0.1124, 'grad_norm': 1.1833928866140564, 'learning_rate': 9.08014416372649e-07, 'epoch': 0.81} 81%|████████ | 27799/34278 [30:27:09<6:54:59, 3.84s/it] 81%|████████ | 27800/34278 [30:27:15<8:04:17, 4.49s/it] {'loss': 0.0946, 'grad_norm': 0.8803872613469325, 'learning_rate': 9.077429488973255e-07, 'epoch': 0.81} 81%|████████ | 27800/34278 [30:27:15<8:04:17, 4.49s/it] 81%|████████ | 27801/34278 [30:27:18<7:27:37, 4.15s/it] {'loss': 0.1138, 'grad_norm': 0.8849863394553857, 'learning_rate': 9.07471517956695e-07, 'epoch': 0.81} 81%|████████ | 27801/34278 [30:27:18<7:27:37, 4.15s/it] 81%|████████ | 27802/34278 [30:27:21<6:52:02, 3.82s/it] {'loss': 0.1063, 'grad_norm': 0.9009437140714899, 'learning_rate': 9.07200123553183e-07, 'epoch': 0.81} 81%|████████ | 27802/34278 [30:27:21<6:52:02, 3.82s/it] 81%|████████ | 27803/34278 [30:27:25<7:04:37, 3.93s/it] {'loss': 0.1176, 'grad_norm': 0.8611692773242475, 'learning_rate': 9.069287656892118e-07, 'epoch': 0.81} 81%|████████ | 27803/34278 [30:27:25<7:04:37, 3.93s/it] 81%|████████ | 27804/34278 [30:27:28<6:24:35, 3.56s/it] {'loss': 0.1183, 'grad_norm': 0.714812702387066, 'learning_rate': 9.066574443672016e-07, 'epoch': 0.81} 81%|████████ | 27804/34278 [30:27:28<6:24:35, 3.56s/it] 81%|████████ | 27805/34278 [30:27:31<6:16:11, 
3.49s/it] {'loss': 0.1088, 'grad_norm': 0.772792801634389, 'learning_rate': 9.063861595895767e-07, 'epoch': 0.81} 81%|████████ | 27805/34278 [30:27:31<6:16:11, 3.49s/it] 81%|████████ | 27806/34278 [30:27:35<6:25:12, 3.57s/it] {'loss': 0.1112, 'grad_norm': 1.0948717600218414, 'learning_rate': 9.061149113587603e-07, 'epoch': 0.81} 81%|████████ | 27806/34278 [30:27:35<6:25:12, 3.57s/it] 81%|████████ | 27807/34278 [30:27:38<6:14:03, 3.47s/it] {'loss': 0.106, 'grad_norm': 0.8362913322933143, 'learning_rate': 9.058436996771724e-07, 'epoch': 0.81} 81%|████████ | 27807/34278 [30:27:38<6:14:03, 3.47s/it] 81%|████████ | 27808/34278 [30:27:42<6:11:37, 3.45s/it] {'loss': 0.1213, 'grad_norm': 1.2474769822718166, 'learning_rate': 9.055725245472335e-07, 'epoch': 0.81} 81%|████████ | 27808/34278 [30:27:42<6:11:37, 3.45s/it] 81%|████████ | 27809/34278 [30:27:45<5:55:26, 3.30s/it] {'loss': 0.1128, 'grad_norm': 0.8080131307207777, 'learning_rate': 9.053013859713672e-07, 'epoch': 0.81} 81%|████████ | 27809/34278 [30:27:45<5:55:26, 3.30s/it] 81%|████████ | 27810/34278 [30:27:50<6:58:08, 3.88s/it] {'loss': 0.1084, 'grad_norm': 0.744940244018974, 'learning_rate': 9.050302839519926e-07, 'epoch': 0.81} 81%|████████ | 27810/34278 [30:27:50<6:58:08, 3.88s/it] 81%|████████ | 27811/34278 [30:27:53<6:53:26, 3.84s/it] {'loss': 0.0974, 'grad_norm': 0.7687660014294762, 'learning_rate': 9.047592184915272e-07, 'epoch': 0.81} 81%|████████ | 27811/34278 [30:27:54<6:53:26, 3.84s/it] 81%|████████ | 27812/34278 [30:27:58<7:03:57, 3.93s/it] {'loss': 0.1269, 'grad_norm': 0.8384119192449437, 'learning_rate': 9.044881895923969e-07, 'epoch': 0.81} 81%|████████ | 27812/34278 [30:27:58<7:03:57, 3.93s/it] 81%|████████ | 27813/34278 [30:28:03<7:54:35, 4.40s/it] {'loss': 0.1215, 'grad_norm': 0.9045942397770038, 'learning_rate': 9.042171972570179e-07, 'epoch': 0.81} 81%|████████ | 27813/34278 [30:28:03<7:54:35, 4.40s/it] 81%|████████ | 27814/34278 [30:28:06<7:11:47, 4.01s/it] {'loss': 0.1143, 'grad_norm': 
0.8526699918228667, 'learning_rate': 9.039462414878092e-07, 'epoch': 0.81} 81%|████████ | 27814/34278 [30:28:06<7:11:47, 4.01s/it] 81%|████████ | 27815/34278 [30:28:10<6:57:39, 3.88s/it] {'loss': 0.1192, 'grad_norm': 0.8874721617085721, 'learning_rate': 9.036753222871914e-07, 'epoch': 0.81} 81%|████████ | 27815/34278 [30:28:10<6:57:39, 3.88s/it] 81%|████████ | 27816/34278 [30:28:13<6:32:47, 3.65s/it] {'loss': 0.1189, 'grad_norm': 0.8227247161142672, 'learning_rate': 9.034044396575825e-07, 'epoch': 0.81} 81%|████████ | 27816/34278 [30:28:13<6:32:47, 3.65s/it] 81%|████████ | 27817/34278 [30:28:16<6:01:33, 3.36s/it] {'loss': 0.1076, 'grad_norm': 0.9705319413020947, 'learning_rate': 9.031335936014001e-07, 'epoch': 0.81} 81%|████████ | 27817/34278 [30:28:16<6:01:33, 3.36s/it] 81%|████████ | 27818/34278 [30:28:19<5:57:24, 3.32s/it] {'loss': 0.119, 'grad_norm': 1.1221833081400452, 'learning_rate': 9.028627841210625e-07, 'epoch': 0.81} 81%|████████ | 27818/34278 [30:28:19<5:57:24, 3.32s/it] 81%|████████ | 27819/34278 [30:28:23<6:37:20, 3.69s/it] {'loss': 0.1022, 'grad_norm': 0.9065303454259425, 'learning_rate': 9.025920112189895e-07, 'epoch': 0.81} 81%|████████ | 27819/34278 [30:28:23<6:37:20, 3.69s/it] 81%|████████ | 27820/34278 [30:28:27<6:34:13, 3.66s/it] {'loss': 0.1304, 'grad_norm': 0.8308899240624873, 'learning_rate': 9.023212748975968e-07, 'epoch': 0.81} 81%|████████ | 27820/34278 [30:28:27<6:34:13, 3.66s/it] 81%|████████ | 27821/34278 [30:28:32<7:10:40, 4.00s/it] {'loss': 0.1038, 'grad_norm': 0.8682517098380476, 'learning_rate': 9.020505751593001e-07, 'epoch': 0.81} 81%|████████ | 27821/34278 [30:28:32<7:10:40, 4.00s/it] 81%|████████ | 27822/34278 [30:28:38<8:14:07, 4.59s/it] {'loss': 0.1128, 'grad_norm': 0.8321078365736718, 'learning_rate': 9.01779912006519e-07, 'epoch': 0.81} 81%|████████ | 27822/34278 [30:28:38<8:14:07, 4.59s/it] 81%|████████ | 27823/34278 [30:28:42<8:06:07, 4.52s/it] {'loss': 0.1025, 'grad_norm': 0.9589914766280058, 'learning_rate': 
9.015092854416668e-07, 'epoch': 0.81} 81%|████████ | 27823/34278 [30:28:42<8:06:07, 4.52s/it] 81%|████████ | 27824/34278 [30:28:48<8:42:30, 4.86s/it] {'loss': 0.1162, 'grad_norm': 0.9256429735490215, 'learning_rate': 9.012386954671631e-07, 'epoch': 0.81} 81%|████████ | 27824/34278 [30:28:48<8:42:30, 4.86s/it] 81%|████████ | 27825/34278 [30:28:51<7:56:18, 4.43s/it] {'loss': 0.1126, 'grad_norm': 0.9232161626985291, 'learning_rate': 9.009681420854205e-07, 'epoch': 0.81} 81%|████████ | 27825/34278 [30:28:51<7:56:18, 4.43s/it] 81%|████████ | 27826/34278 [30:28:54<7:12:08, 4.02s/it] {'loss': 0.1282, 'grad_norm': 0.8141652835612888, 'learning_rate': 9.006976252988569e-07, 'epoch': 0.81} 81%|████████ | 27826/34278 [30:28:54<7:12:08, 4.02s/it] 81%|████████ | 27827/34278 [30:29:00<8:09:49, 4.56s/it] {'loss': 0.1413, 'grad_norm': 0.8589456492842638, 'learning_rate': 9.004271451098867e-07, 'epoch': 0.81} 81%|████████ | 27827/34278 [30:29:00<8:09:49, 4.56s/it] 81%|████████ | 27828/34278 [30:29:03<7:20:50, 4.10s/it] {'loss': 0.1083, 'grad_norm': 0.8635579234007896, 'learning_rate': 9.001567015209229e-07, 'epoch': 0.81} 81%|████████ | 27828/34278 [30:29:03<7:20:50, 4.10s/it] 81%|████████ | 27829/34278 [30:29:06<6:39:48, 3.72s/it] {'loss': 0.114, 'grad_norm': 0.8168511748696249, 'learning_rate': 8.998862945343811e-07, 'epoch': 0.81} 81%|████████ | 27829/34278 [30:29:06<6:39:48, 3.72s/it] 81%|████████ | 27830/34278 [30:29:09<6:12:22, 3.47s/it] {'loss': 0.1285, 'grad_norm': 1.0173724138335858, 'learning_rate': 8.996159241526775e-07, 'epoch': 0.81} 81%|████████ | 27830/34278 [30:29:09<6:12:22, 3.47s/it] 81%|████████ | 27831/34278 [30:29:12<5:56:06, 3.31s/it] {'loss': 0.1166, 'grad_norm': 1.1457423515664586, 'learning_rate': 8.993455903782222e-07, 'epoch': 0.81} 81%|████████ | 27831/34278 [30:29:12<5:56:06, 3.31s/it] 81%|████████ | 27832/34278 [30:29:15<5:40:12, 3.17s/it] {'loss': 0.0882, 'grad_norm': 1.0103305109512088, 'learning_rate': 8.990752932134322e-07, 'epoch': 0.81} 
81%|████████ | 27832/34278 [30:29:15<5:40:12, 3.17s/it] 81%|████████ | 27833/34278 [30:29:19<6:28:54, 3.62s/it] {'loss': 0.1103, 'grad_norm': 0.8465947302811158, 'learning_rate': 8.98805032660719e-07, 'epoch': 0.81} 81%|████████ | 27833/34278 [30:29:19<6:28:54, 3.62s/it] 81%|████████ | 27834/34278 [30:29:22<6:06:43, 3.41s/it] {'loss': 0.1199, 'grad_norm': 0.8989232846716325, 'learning_rate': 8.985348087224943e-07, 'epoch': 0.81} 81%|████████ | 27834/34278 [30:29:22<6:06:43, 3.41s/it] 81%|████████ | 27835/34278 [30:29:26<6:35:07, 3.68s/it] {'loss': 0.1047, 'grad_norm': 1.0875244874638665, 'learning_rate': 8.982646214011715e-07, 'epoch': 0.81} 81%|████████ | 27835/34278 [30:29:26<6:35:07, 3.68s/it] 81%|████████ | 27836/34278 [30:29:29<6:07:06, 3.42s/it] {'loss': 0.0913, 'grad_norm': 0.9733932925803976, 'learning_rate': 8.979944706991639e-07, 'epoch': 0.81} 81%|████████ | 27836/34278 [30:29:29<6:07:06, 3.42s/it] 81%|████████ | 27837/34278 [30:29:32<5:50:51, 3.27s/it] {'loss': 0.1282, 'grad_norm': 1.05482053175379, 'learning_rate': 8.977243566188831e-07, 'epoch': 0.81} 81%|████████ | 27837/34278 [30:29:32<5:50:51, 3.27s/it] 81%|████████ | 27838/34278 [30:29:35<5:44:14, 3.21s/it] {'loss': 0.1615, 'grad_norm': 0.9503921749161258, 'learning_rate': 8.974542791627383e-07, 'epoch': 0.81} 81%|████████ | 27838/34278 [30:29:35<5:44:14, 3.21s/it] 81%|████████ | 27839/34278 [30:29:38<5:44:26, 3.21s/it] {'loss': 0.123, 'grad_norm': 0.9053194371817254, 'learning_rate': 8.971842383331436e-07, 'epoch': 0.81} 81%|████████ | 27839/34278 [30:29:39<5:44:26, 3.21s/it] 81%|████████ | 27840/34278 [30:29:42<6:00:01, 3.36s/it] {'loss': 0.1165, 'grad_norm': 0.9312336263357013, 'learning_rate': 8.969142341325088e-07, 'epoch': 0.81} 81%|████████ | 27840/34278 [30:29:42<6:00:01, 3.36s/it] 81%|████████ | 27841/34278 [30:29:46<6:26:17, 3.60s/it] {'loss': 0.1173, 'grad_norm': 0.8743210919694165, 'learning_rate': 8.966442665632418e-07, 'epoch': 0.81} 81%|████████ | 27841/34278 [30:29:46<6:26:17, 
3.60s/it] 81%|████████ | 27842/34278 [30:29:49<5:55:50, 3.32s/it] {'loss': 0.1225, 'grad_norm': 1.1143132195729024, 'learning_rate': 8.963743356277577e-07, 'epoch': 0.81} 81%|████████ | 27842/34278 [30:29:49<5:55:50, 3.32s/it] 81%|████████ | 27843/34278 [30:29:55<7:07:50, 3.99s/it] {'loss': 0.1087, 'grad_norm': 0.8115040841378355, 'learning_rate': 8.961044413284636e-07, 'epoch': 0.81} 81%|████████ | 27843/34278 [30:29:55<7:07:50, 3.99s/it] 81%|████████ | 27844/34278 [30:30:01<8:12:44, 4.59s/it] {'loss': 0.1204, 'grad_norm': 0.9327315347288102, 'learning_rate': 8.958345836677684e-07, 'epoch': 0.81} 81%|████████ | 27844/34278 [30:30:01<8:12:44, 4.59s/it] 81%|████████ | 27845/34278 [30:30:07<8:57:25, 5.01s/it] {'loss': 0.0988, 'grad_norm': 1.105081627639045, 'learning_rate': 8.955647626480835e-07, 'epoch': 0.81} 81%|████████ | 27845/34278 [30:30:07<8:57:25, 5.01s/it] 81%|████████ | 27846/34278 [30:30:10<8:06:10, 4.54s/it] {'loss': 0.1166, 'grad_norm': 0.9425030223238433, 'learning_rate': 8.952949782718162e-07, 'epoch': 0.81} 81%|████████ | 27846/34278 [30:30:10<8:06:10, 4.54s/it] 81%|████████ | 27847/34278 [30:30:16<8:39:10, 4.84s/it] {'loss': 0.1379, 'grad_norm': 0.882574285397809, 'learning_rate': 8.950252305413748e-07, 'epoch': 0.81} 81%|████████ | 27847/34278 [30:30:16<8:39:10, 4.84s/it] 81%|████████ | 27848/34278 [30:30:19<8:01:44, 4.50s/it] {'loss': 0.1188, 'grad_norm': 0.7754649697258346, 'learning_rate': 8.947555194591679e-07, 'epoch': 0.81} 81%|████████ | 27848/34278 [30:30:19<8:01:44, 4.50s/it] 81%|████████ | 27849/34278 [30:30:23<7:22:31, 4.13s/it] {'loss': 0.1215, 'grad_norm': 1.03417462139561, 'learning_rate': 8.944858450276051e-07, 'epoch': 0.81} 81%|████████ | 27849/34278 [30:30:23<7:22:31, 4.13s/it] 81%|████████ | 27850/34278 [30:30:25<6:45:32, 3.79s/it] {'loss': 0.1114, 'grad_norm': 0.9348661378267702, 'learning_rate': 8.942162072490924e-07, 'epoch': 0.81} 81%|████████ | 27850/34278 [30:30:26<6:45:32, 3.79s/it] 81%|████████▏ | 27851/34278 
[30:30:31<7:34:53, 4.25s/it] {'loss': 0.1118, 'grad_norm': 0.6796537220442412, 'learning_rate': 8.93946606126036e-07, 'epoch': 0.81} 81%|████████▏ | 27851/34278 [30:30:31<7:34:53, 4.25s/it] 81%|████████▏ | 27852/34278 [30:30:34<7:13:40, 4.05s/it] {'loss': 0.1392, 'grad_norm': 1.1326677677469623, 'learning_rate': 8.93677041660846e-07, 'epoch': 0.81} 81%|████████▏ | 27852/34278 [30:30:34<7:13:40, 4.05s/it] 81%|████████▏ | 27853/34278 [30:30:37<6:32:03, 3.66s/it] {'loss': 0.1091, 'grad_norm': 0.8264266582231248, 'learning_rate': 8.93407513855925e-07, 'epoch': 0.81} 81%|████████▏ | 27853/34278 [30:30:37<6:32:03, 3.66s/it] 81%|████████▏ | 27854/34278 [30:30:41<6:25:55, 3.60s/it] {'loss': 0.0898, 'grad_norm': 0.7842864713261196, 'learning_rate': 8.931380227136832e-07, 'epoch': 0.81} 81%|████████▏ | 27854/34278 [30:30:41<6:25:55, 3.60s/it] 81%|████████▏ | 27855/34278 [30:30:44<6:31:57, 3.66s/it] {'loss': 0.1081, 'grad_norm': 1.0625415526642308, 'learning_rate': 8.928685682365229e-07, 'epoch': 0.81} 81%|████████▏ | 27855/34278 [30:30:44<6:31:57, 3.66s/it] 81%|████████▏ | 27856/34278 [30:30:48<6:19:54, 3.55s/it] {'loss': 0.1101, 'grad_norm': 0.8304302302014871, 'learning_rate': 8.925991504268533e-07, 'epoch': 0.81} 81%|████████▏ | 27856/34278 [30:30:48<6:19:54, 3.55s/it] 81%|████████▏ | 27857/34278 [30:30:51<6:05:04, 3.41s/it] {'loss': 0.1093, 'grad_norm': 0.8309523917579732, 'learning_rate': 8.92329769287078e-07, 'epoch': 0.81} 81%|████████▏ | 27857/34278 [30:30:51<6:05:04, 3.41s/it] 81%|████████▏ | 27858/34278 [30:30:55<6:23:47, 3.59s/it] {'loss': 0.1201, 'grad_norm': 1.4803614141624153, 'learning_rate': 8.920604248196007e-07, 'epoch': 0.81} 81%|████████▏ | 27858/34278 [30:30:55<6:23:47, 3.59s/it] 81%|████████▏ | 27859/34278 [30:30:58<5:55:53, 3.33s/it] {'loss': 0.1222, 'grad_norm': 1.0780219900720958, 'learning_rate': 8.917911170268273e-07, 'epoch': 0.81} 81%|████████▏ | 27859/34278 [30:30:58<5:55:53, 3.33s/it] 81%|████████▏ | 27860/34278 [30:31:00<5:37:50, 3.16s/it] 
{'loss': 0.0937, 'grad_norm': 0.7461002323140512, 'learning_rate': 8.91521845911163e-07, 'epoch': 0.81} 81%|████████▏ | 27860/34278 [30:31:00<5:37:50, 3.16s/it] 81%|████████▏ | 27861/34278 [30:31:04<5:46:47, 3.24s/it] {'loss': 0.1078, 'grad_norm': 0.8102491491294351, 'learning_rate': 8.912526114750097e-07, 'epoch': 0.81} 81%|████████▏ | 27861/34278 [30:31:04<5:46:47, 3.24s/it] 81%|████████▏ | 27862/34278 [30:31:07<5:32:41, 3.11s/it] {'loss': 0.1217, 'grad_norm': 1.1440873401470806, 'learning_rate': 8.90983413720774e-07, 'epoch': 0.81} 81%|████████▏ | 27862/34278 [30:31:07<5:32:41, 3.11s/it] 81%|████████▏ | 27863/34278 [30:31:10<5:48:46, 3.26s/it] {'loss': 0.12, 'grad_norm': 0.7364765684460183, 'learning_rate': 8.907142526508572e-07, 'epoch': 0.81} 81%|████████▏ | 27863/34278 [30:31:10<5:48:46, 3.26s/it] 81%|████████▏ | 27864/34278 [30:31:14<6:09:09, 3.45s/it] {'loss': 0.1081, 'grad_norm': 1.0929421996467108, 'learning_rate': 8.904451282676612e-07, 'epoch': 0.81} 81%|████████▏ | 27864/34278 [30:31:14<6:09:09, 3.45s/it] 81%|████████▏ | 27865/34278 [30:31:17<5:54:06, 3.31s/it] {'loss': 0.1081, 'grad_norm': 0.8992288094888158, 'learning_rate': 8.901760405735898e-07, 'epoch': 0.81} 81%|████████▏ | 27865/34278 [30:31:17<5:54:06, 3.31s/it] 81%|████████▏ | 27866/34278 [30:31:21<6:06:19, 3.43s/it] {'loss': 0.1267, 'grad_norm': 1.0922194465089228, 'learning_rate': 8.899069895710477e-07, 'epoch': 0.81} 81%|████████▏ | 27866/34278 [30:31:21<6:06:19, 3.43s/it] 81%|████████▏ | 27867/34278 [30:31:24<6:07:01, 3.43s/it] {'loss': 0.1228, 'grad_norm': 0.9395317411324351, 'learning_rate': 8.89637975262434e-07, 'epoch': 0.81} 81%|████████▏ | 27867/34278 [30:31:24<6:07:01, 3.43s/it] 81%|████████▏ | 27868/34278 [30:31:31<7:42:39, 4.33s/it] {'loss': 0.0967, 'grad_norm': 0.7290633268101204, 'learning_rate': 8.893689976501507e-07, 'epoch': 0.81} 81%|████████▏ | 27868/34278 [30:31:31<7:42:39, 4.33s/it] 81%|████████▏ | 27869/34278 [30:31:37<8:41:05, 4.88s/it] {'loss': 0.1173, 'grad_norm': 
0.8547884924335176, 'learning_rate': 8.891000567366004e-07, 'epoch': 0.81} 81%|████████▏ | 27869/34278 [30:31:37<8:41:05, 4.88s/it] 81%|████████▏ | 27870/34278 [30:31:40<7:41:33, 4.32s/it] {'loss': 0.1247, 'grad_norm': 1.0524216083013698, 'learning_rate': 8.888311525241822e-07, 'epoch': 0.81} 81%|████████▏ | 27870/34278 [30:31:40<7:41:33, 4.32s/it] 81%|████████▏ | 27871/34278 [30:31:46<8:33:29, 4.81s/it] {'loss': 0.0942, 'grad_norm': 0.9536313632947281, 'learning_rate': 8.885622850152986e-07, 'epoch': 0.81} 81%|████████▏ | 27871/34278 [30:31:46<8:33:29, 4.81s/it] 81%|████████▏ | 27872/34278 [30:31:49<7:46:54, 4.37s/it] {'loss': 0.1135, 'grad_norm': 0.9656085483237548, 'learning_rate': 8.8829345421235e-07, 'epoch': 0.81} 81%|████████▏ | 27872/34278 [30:31:49<7:46:54, 4.37s/it] 81%|████████▏ | 27873/34278 [30:31:55<8:21:43, 4.70s/it] {'loss': 0.1236, 'grad_norm': 0.788840840057328, 'learning_rate': 8.880246601177361e-07, 'epoch': 0.81} 81%|████████▏ | 27873/34278 [30:31:55<8:21:43, 4.70s/it] 81%|████████▏ | 27874/34278 [30:31:58<7:37:04, 4.28s/it] {'loss': 0.108, 'grad_norm': 0.8092387032392127, 'learning_rate': 8.877559027338556e-07, 'epoch': 0.81} 81%|████████▏ | 27874/34278 [30:31:58<7:37:04, 4.28s/it] 81%|████████▏ | 27875/34278 [30:32:01<6:56:59, 3.91s/it] {'loss': 0.1187, 'grad_norm': 1.5655596722106258, 'learning_rate': 8.874871820631098e-07, 'epoch': 0.81} 81%|████████▏ | 27875/34278 [30:32:01<6:56:59, 3.91s/it] 81%|████████▏ | 27876/34278 [30:32:04<6:29:37, 3.65s/it] {'loss': 0.1205, 'grad_norm': 0.7460362896971863, 'learning_rate': 8.872184981078952e-07, 'epoch': 0.81} 81%|████████▏ | 27876/34278 [30:32:04<6:29:37, 3.65s/it] 81%|████████▏ | 27877/34278 [30:32:07<6:09:19, 3.46s/it] {'loss': 0.1047, 'grad_norm': 0.924290849914, 'learning_rate': 8.869498508706137e-07, 'epoch': 0.81} 81%|████████▏ | 27877/34278 [30:32:07<6:09:19, 3.46s/it] 81%|████████▏ | 27878/34278 [30:32:11<6:27:54, 3.64s/it] {'loss': 0.1213, 'grad_norm': 0.7165676662127671, 'learning_rate': 
8.866812403536601e-07, 'epoch': 0.81} 81%|████████▏ | 27878/34278 [30:32:11<6:27:54, 3.64s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f9cdef8e700> Failed to fetch sample 2696827. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f9cdef8e700> 81%|████████▏ | 27879/34278 [30:32:16<7:26:04, 4.18s/it] {'loss': 0.1043, 'grad_norm': 0.7409949292685086, 'learning_rate': 8.864126665594363e-07, 'epoch': 0.81} 81%|████████▏ | 27879/34278 [30:32:16<7:26:04, 4.18s/it] 81%|████████▏ | 27880/34278 [30:32:19<6:47:02, 3.82s/it] {'loss': 0.1412, 'grad_norm': 0.825083620554212, 'learning_rate': 8.861441294903383e-07, 'epoch': 0.81} 81%|████████▏ | 27880/34278 [30:32:19<6:47:02, 3.82s/it] 81%|████████▏ | 27881/34278 [30:32:22<6:14:49, 3.52s/it] {'loss': 0.1253, 'grad_norm': 0.8182300223152272, 'learning_rate': 8.858756291487619e-07, 'epoch': 0.81} 81%|████████▏ | 27881/34278 [30:32:22<6:14:49, 3.52s/it] 81%|████████▏ | 27882/34278 [30:32:25<5:48:38, 3.27s/it] {'loss': 0.1178, 'grad_norm': 0.7415768586442154, 'learning_rate': 8.856071655371057e-07, 'epoch': 0.81} 81%|████████▏ | 27882/34278 [30:32:25<5:48:38, 3.27s/it] 81%|████████▏ | 27883/34278 [30:32:28<5:38:46, 3.18s/it] {'loss': 0.1163, 'grad_norm': 0.8896231090110454, 'learning_rate': 8.853387386577677e-07, 'epoch': 0.81} 81%|████████▏ | 27883/34278 [30:32:28<5:38:46, 3.18s/it] 81%|████████▏ | 27884/34278 [30:32:31<5:44:25, 3.23s/it] {'loss': 0.1107, 'grad_norm': 0.8696533261783911, 'learning_rate': 8.85070348513144e-07, 'epoch': 0.81} 81%|████████▏ | 27884/34278 [30:32:31<5:44:25, 3.23s/it] 81%|████████▏ | 27885/34278 [30:32:34<5:31:57, 3.12s/it] {'loss': 0.1231, 'grad_norm': 0.775099061652046, 'learning_rate': 8.84801995105628e-07, 'epoch': 0.81} 81%|████████▏ | 27885/34278 [30:32:34<5:31:57, 3.12s/it] 81%|████████▏ | 27886/34278 [30:32:37<5:36:42, 3.16s/it] {'loss': 0.1067, 'grad_norm': 0.7282668608994736, 'learning_rate': 8.845336784376185e-07, 'epoch': 0.81} 81%|████████▏ | 27886/34278 [30:32:37<5:36:42, 3.16s/it] 81%|████████▏ | 27887/34278 [30:32:41<5:40:04, 3.19s/it] {'loss': 0.1039, 'grad_norm': 0.5516454272813409, 'learning_rate': 8.842653985115102e-07, 'epoch': 0.81} 
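The failure above comes from `Image.open` raising `PIL.UnidentifiedImageError` when the byte buffer fetched from storage is not a decodable image. A minimal sketch of defensive decoding (the helper name `safe_pil_load` is hypothetical, not the repo's actual `pil_loader`) so one corrupt record cannot crash a dataloader worker:

```python
import io

from PIL import Image, UnidentifiedImageError


def safe_pil_load(raw_bytes: bytes):
    """Decode raw bytes as an RGB image, or return None if undecodable.

    Mirrors the failure mode in the traceback: Image.open() raises
    UnidentifiedImageError when the buffer is not a valid image.
    """
    buff = io.BytesIO(raw_bytes)
    try:
        img = Image.open(buff)
        img.load()  # force full decode now, so truncated files also fail here
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None


# A corrupt buffer yields None instead of an unhandled exception.
print(safe_pil_load(b"not an image") is None)  # → True
```

Forcing `img.load()` inside the `try` matters because `Image.open` is lazy: a file that identifies but is truncated would otherwise blow up later, outside the handler.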
[progress log condensed, duplicate tqdm echoes removed: steps 27888-27924/34278 (81%, epoch 0.81), elapsed 30:32:47 -> 30:35:36; loss 0.092-0.135, grad_norm 0.70-0.99, learning_rate 8.840e-07 -> 8.746e-07; step time mostly 3.1-4.9 s/it, spiking to 7.6-9.7 s/it around steps 27920-27923; log truncated mid-entry at step 27924]
[30:35:39<10:53:09, 6.17s/it] {'loss': 0.1348, 'grad_norm': 0.7965634618209972, 'learning_rate': 8.743648926639775e-07, 'epoch': 0.81} 81%|████████▏ | 27924/34278 [30:35:39<10:53:09, 6.17s/it] 81%|████████▏ | 27925/34278 [30:35:50<13:24:04, 7.59s/it] {'loss': 0.1138, 'grad_norm': 0.9075342283984179, 'learning_rate': 8.740980107018998e-07, 'epoch': 0.81} 81%|████████▏ | 27925/34278 [30:35:50<13:24:04, 7.59s/it] 81%|████████▏ | 27926/34278 [30:35:54<11:18:31, 6.41s/it] {'loss': 0.1293, 'grad_norm': 0.958721682505962, 'learning_rate': 8.73831165574891e-07, 'epoch': 0.81} 81%|████████▏ | 27926/34278 [30:35:54<11:18:31, 6.41s/it] 81%|████████▏ | 27927/34278 [30:36:07<15:13:38, 8.63s/it] {'loss': 0.0987, 'grad_norm': 0.8471685867698534, 'learning_rate': 8.735643572853325e-07, 'epoch': 0.81} 81%|████████▏ | 27927/34278 [30:36:07<15:13:38, 8.63s/it] 81%|████████▏ | 27928/34278 [30:36:10<12:12:41, 6.92s/it] {'loss': 0.1086, 'grad_norm': 0.9322191934295897, 'learning_rate': 8.732975858356057e-07, 'epoch': 0.81} 81%|████████▏ | 27928/34278 [30:36:10<12:12:41, 6.92s/it] 81%|████████▏ | 27929/34278 [30:36:25<16:22:11, 9.28s/it] {'loss': 0.1044, 'grad_norm': 0.8270460426510711, 'learning_rate': 8.730308512280938e-07, 'epoch': 0.81} 81%|████████▏ | 27929/34278 [30:36:25<16:22:11, 9.28s/it] 81%|████████▏ | 27930/34278 [30:36:28<13:13:54, 7.50s/it] {'loss': 0.1168, 'grad_norm': 0.7654599046546438, 'learning_rate': 8.72764153465176e-07, 'epoch': 0.81} 81%|████████▏ | 27930/34278 [30:36:29<13:13:54, 7.50s/it] 81%|████████▏ | 27931/34278 [30:36:52<21:35:07, 12.24s/it] {'loss': 0.1308, 'grad_norm': 0.953571558305935, 'learning_rate': 8.724974925492347e-07, 'epoch': 0.81} 81%|████████▏ | 27931/34278 [30:36:52<21:35:07, 12.24s/it] 81%|████████▏ | 27932/34278 [30:37:02<20:42:59, 11.75s/it] {'loss': 0.1106, 'grad_norm': 0.8514589093817848, 'learning_rate': 8.722308684826514e-07, 'epoch': 0.81} 81%|████████▏ | 27932/34278 [30:37:02<20:42:59, 11.75s/it] 81%|████████▏ | 27933/34278 
[30:37:25<26:22:54, 14.97s/it] {'loss': 0.1331, 'grad_norm': 0.8506731842034587, 'learning_rate': 8.719642812678059e-07, 'epoch': 0.81} 81%|████████▏ | 27933/34278 [30:37:25<26:22:54, 14.97s/it] 81%|████████▏ | 27934/34278 [30:37:28<19:57:36, 11.33s/it] {'loss': 0.1125, 'grad_norm': 1.485484176016856, 'learning_rate': 8.716977309070762e-07, 'epoch': 0.81} 81%|████████▏ | 27934/34278 [30:37:28<19:57:36, 11.33s/it] 81%|████████▏ | 27935/34278 [30:37:51<26:22:53, 14.97s/it] {'loss': 0.1156, 'grad_norm': 1.0000498783709355, 'learning_rate': 8.714312174028456e-07, 'epoch': 0.81} 81%|████████▏ | 27935/34278 [30:37:51<26:22:53, 14.97s/it] 81%|████████▏ | 27936/34278 [30:37:54<19:54:26, 11.30s/it] {'loss': 0.0874, 'grad_norm': 1.086574623093909, 'learning_rate': 8.711647407574897e-07, 'epoch': 0.81} 81%|████████▏ | 27936/34278 [30:37:54<19:54:26, 11.30s/it] 82%|████████▏ | 27937/34278 [30:38:05<19:35:13, 11.12s/it] {'loss': 0.1451, 'grad_norm': 1.00993218665719, 'learning_rate': 8.708983009733906e-07, 'epoch': 0.82} 82%|████████▏ | 27937/34278 [30:38:05<19:35:13, 11.12s/it] 82%|████████▏ | 27938/34278 [30:38:08<15:24:36, 8.75s/it] {'loss': 0.1237, 'grad_norm': 1.927800729165628, 'learning_rate': 8.706318980529249e-07, 'epoch': 0.82} 82%|████████▏ | 27938/34278 [30:38:08<15:24:36, 8.75s/it] 82%|████████▏ | 27939/34278 [30:38:30<22:28:16, 12.76s/it] {'loss': 0.1027, 'grad_norm': 0.7889614899462662, 'learning_rate': 8.703655319984728e-07, 'epoch': 0.82} 82%|████████▏ | 27939/34278 [30:38:30<22:28:16, 12.76s/it] 82%|████████▏ | 27940/34278 [30:38:35<18:17:28, 10.39s/it] {'loss': 0.1134, 'grad_norm': 1.124823704480609, 'learning_rate': 8.700992028124116e-07, 'epoch': 0.82} 82%|████████▏ | 27940/34278 [30:38:35<18:17:28, 10.39s/it] 82%|████████▏ | 27941/34278 [30:38:57<24:17:34, 13.80s/it] {'loss': 0.116, 'grad_norm': 1.026337449686936, 'learning_rate': 8.698329104971176e-07, 'epoch': 0.82} 82%|████████▏ | 27941/34278 [30:38:57<24:17:34, 13.80s/it] 82%|████████▏ | 27942/34278 
[30:39:08<22:59:04, 13.06s/it] {'loss': 0.0998, 'grad_norm': 0.8714488107477869, 'learning_rate': 8.695666550549692e-07, 'epoch': 0.82} 82%|████████▏ | 27942/34278 [30:39:08<22:59:04, 13.06s/it] 82%|████████▏ | 27943/34278 [30:39:24<24:36:36, 13.99s/it] {'loss': 0.117, 'grad_norm': 0.7970873611052459, 'learning_rate': 8.693004364883451e-07, 'epoch': 0.82} 82%|████████▏ | 27943/34278 [30:39:24<24:36:36, 13.99s/it] 82%|████████▏ | 27944/34278 [30:39:43<27:26:31, 15.60s/it] {'loss': 0.1015, 'grad_norm': 0.9040347466080777, 'learning_rate': 8.690342547996205e-07, 'epoch': 0.82} 82%|████████▏ | 27944/34278 [30:39:43<27:26:31, 15.60s/it] 82%|████████▏ | 27945/34278 [30:39:48<21:47:47, 12.39s/it] {'loss': 0.1231, 'grad_norm': 1.1349387215847295, 'learning_rate': 8.687681099911704e-07, 'epoch': 0.82} 82%|████████▏ | 27945/34278 [30:39:48<21:47:47, 12.39s/it] 82%|████████▏ | 27946/34278 [30:40:04<23:20:46, 13.27s/it] {'loss': 0.1299, 'grad_norm': 1.0015897725252845, 'learning_rate': 8.685020020653745e-07, 'epoch': 0.82} 82%|████████▏ | 27946/34278 [30:40:04<23:20:46, 13.27s/it] 82%|████████▏ | 27947/34278 [30:40:18<23:43:16, 13.49s/it] {'loss': 0.1263, 'grad_norm': 1.0853462300215388, 'learning_rate': 8.682359310246058e-07, 'epoch': 0.82} 82%|████████▏ | 27947/34278 [30:40:18<23:43:16, 13.49s/it] 82%|████████▏ | 27948/34278 [30:40:28<22:06:34, 12.57s/it] {'loss': 0.1222, 'grad_norm': 1.021627090189694, 'learning_rate': 8.67969896871238e-07, 'epoch': 0.82} 82%|████████▏ | 27948/34278 [30:40:28<22:06:34, 12.57s/it] 82%|████████▏ | 27949/34278 [30:40:39<21:20:21, 12.14s/it] {'loss': 0.1376, 'grad_norm': 0.9893004089744691, 'learning_rate': 8.677038996076509e-07, 'epoch': 0.82} 82%|████████▏ | 27949/34278 [30:40:39<21:20:21, 12.14s/it] 82%|████████▏ | 27950/34278 [30:40:45<17:49:11, 10.14s/it] {'loss': 0.1018, 'grad_norm': 0.677533835465219, 'learning_rate': 8.674379392362175e-07, 'epoch': 0.82} 82%|████████▏ | 27950/34278 [30:40:45<17:49:11, 10.14s/it] 82%|████████▏ | 
27951/34278 [30:41:00<20:41:06, 11.77s/it] {'loss': 0.1037, 'grad_norm': 0.6357550110472617, 'learning_rate': 8.671720157593099e-07, 'epoch': 0.82} 82%|████████▏ | 27951/34278 [30:41:00<20:41:06, 11.77s/it] 82%|████████▏ | 27952/34278 [30:41:19<24:26:02, 13.90s/it] {'loss': 0.129, 'grad_norm': 0.8820956278642804, 'learning_rate': 8.669061291793051e-07, 'epoch': 0.82} 82%|████████▏ | 27952/34278 [30:41:19<24:26:02, 13.90s/it] 82%|████████▏ | 27953/34278 [30:41:32<24:07:33, 13.73s/it] {'loss': 0.1071, 'grad_norm': 0.862788822741437, 'learning_rate': 8.666402794985762e-07, 'epoch': 0.82} 82%|████████▏ | 27953/34278 [30:41:32<24:07:33, 13.73s/it] 82%|████████▏ | 27954/34278 [30:41:47<24:24:41, 13.90s/it] {'loss': 0.1031, 'grad_norm': 0.9375392918641782, 'learning_rate': 8.663744667194946e-07, 'epoch': 0.82} 82%|████████▏ | 27954/34278 [30:41:47<24:24:41, 13.90s/it] 82%|████████▏ | 27955/34278 [30:42:01<24:34:28, 13.99s/it] {'loss': 0.1047, 'grad_norm': 0.759271459198332, 'learning_rate': 8.661086908444349e-07, 'epoch': 0.82} 82%|████████▏ | 27955/34278 [30:42:01<24:34:28, 13.99s/it] 82%|████████▏ | 27956/34278 [30:42:23<28:44:24, 16.37s/it] {'loss': 0.1075, 'grad_norm': 1.0672211803055243, 'learning_rate': 8.658429518757716e-07, 'epoch': 0.82} 82%|████████▏ | 27956/34278 [30:42:23<28:44:24, 16.37s/it] 82%|████████▏ | 27957/34278 [30:42:39<28:49:43, 16.42s/it] {'loss': 0.1138, 'grad_norm': 0.835350576889197, 'learning_rate': 8.655772498158754e-07, 'epoch': 0.82} 82%|████████▏ | 27957/34278 [30:42:39<28:49:43, 16.42s/it] 82%|████████▏ | 27958/34278 [30:42:53<27:31:56, 15.68s/it] {'loss': 0.1132, 'grad_norm': 0.6703112766894321, 'learning_rate': 8.653115846671173e-07, 'epoch': 0.82} 82%|████████▏ | 27958/34278 [30:42:53<27:31:56, 15.68s/it] 82%|████████▏ | 27959/34278 [30:42:57<21:05:51, 12.02s/it] {'loss': 0.1214, 'grad_norm': 0.8498449228358369, 'learning_rate': 8.650459564318714e-07, 'epoch': 0.82} 82%|████████▏ | 27959/34278 [30:42:57<21:05:51, 12.02s/it] 
82%|████████▏ | 27960/34278 [30:43:00<16:26:02, 9.36s/it] {'loss': 0.1206, 'grad_norm': 1.0296137250965731, 'learning_rate': 8.647803651125069e-07, 'epoch': 0.82} 82%|████████▏ | 27960/34278 [30:43:00<16:26:02, 9.36s/it] 82%|████████▏ | 27961/34278 [30:43:03<13:06:10, 7.47s/it] {'loss': 0.1066, 'grad_norm': 0.9620023355296395, 'learning_rate': 8.645148107113976e-07, 'epoch': 0.82} 82%|████████▏ | 27961/34278 [30:43:03<13:06:10, 7.47s/it] 82%|████████▏ | 27962/34278 [30:43:14<15:11:20, 8.66s/it] {'loss': 0.1154, 'grad_norm': 0.8697029912694123, 'learning_rate': 8.642492932309116e-07, 'epoch': 0.82} 82%|████████▏ | 27962/34278 [30:43:14<15:11:20, 8.66s/it] 82%|████████▏ | 27963/34278 [30:43:26<16:56:09, 9.65s/it] {'loss': 0.1058, 'grad_norm': 0.872224754036844, 'learning_rate': 8.639838126734218e-07, 'epoch': 0.82} 82%|████████▏ | 27963/34278 [30:43:26<16:56:09, 9.65s/it] 82%|████████▏ | 27964/34278 [30:43:38<17:55:08, 10.22s/it] {'loss': 0.1086, 'grad_norm': 0.7181466266117666, 'learning_rate': 8.63718369041296e-07, 'epoch': 0.82} 82%|████████▏ | 27964/34278 [30:43:38<17:55:08, 10.22s/it] 82%|████████▏ | 27965/34278 [30:44:00<24:19:32, 13.87s/it] {'loss': 0.1303, 'grad_norm': 0.7093394701015387, 'learning_rate': 8.634529623369059e-07, 'epoch': 0.82} 82%|████████▏ | 27965/34278 [30:44:00<24:19:32, 13.87s/it] 82%|████████▏ | 27966/34278 [30:44:24<29:11:15, 16.65s/it] {'loss': 0.1167, 'grad_norm': 1.0146880919739127, 'learning_rate': 8.631875925626193e-07, 'epoch': 0.82} 82%|████████▏ | 27966/34278 [30:44:24<29:11:15, 16.65s/it] 82%|████████▏ | 27967/34278 [30:44:26<21:55:25, 12.51s/it] {'loss': 0.1152, 'grad_norm': 0.6976651318338014, 'learning_rate': 8.629222597208081e-07, 'epoch': 0.82} 82%|████████▏ | 27967/34278 [30:44:26<21:55:25, 12.51s/it] 82%|████████▏ | 27968/34278 [30:44:39<22:03:51, 12.59s/it] {'loss': 0.0986, 'grad_norm': 0.7889779039807315, 'learning_rate': 8.626569638138377e-07, 'epoch': 0.82} 82%|████████▏ | 27968/34278 [30:44:39<22:03:51, 12.59s/it] 
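The run occasionally hits corrupt images in the Ceph-backed dataset (see the `PIL.UnidentifiedImageError` traceback and the "Failed to fetch sample" line further down). A hedged sketch of the skip-and-retry pattern that behavior suggests; the class and names here are hypothetical, not the repo's actual `dataset.py`:

```python
import random

class RetryingDataset:
    """Wraps a fetch function; on a decode error, falls back to another index."""

    def __init__(self, fetch, size, max_retries=10):
        self.fetch = fetch          # fetch(i) -> sample; may raise on bad bytes
        self.size = size
        self.max_retries = max_retries

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.fetch(i)
            except Exception as exc:  # e.g. PIL.UnidentifiedImageError
                print(f"Failed to fetch sample {i}. Exception: {exc}")
                i = random.randint(0, self.size - 1)  # resample another index
        raise RuntimeError("too many corrupt samples in a row")
```

Logging and resampling keeps a single bad image from killing a multi-day run, at the cost of silently substituting a different sample for that step.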
82%|████████▏ | 27969/34278 [30:44:53<22:50:22, 13.03s/it] {'loss': 0.1368, 'grad_norm': 0.7200946525928537, 'learning_rate': 8.623917048440794e-07, 'epoch': 0.82} 82%|████████▏ | 27969/34278 [30:44:53<22:50:22, 13.03s/it] 82%|████████▏ | 27970/34278 [30:44:56<17:28:50, 9.98s/it] {'loss': 0.1251, 'grad_norm': 0.8065351815209023, 'learning_rate': 8.621264828139003e-07, 'epoch': 0.82} 82%|████████▏ | 27970/34278 [30:44:56<17:28:50, 9.98s/it] 82%|████████▏ | 27971/34278 [30:44:59<13:40:39, 7.81s/it] {'loss': 0.1024, 'grad_norm': 0.8627913754233999, 'learning_rate': 8.618612977256674e-07, 'epoch': 0.82} 82%|████████▏ | 27971/34278 [30:44:59<13:40:39, 7.81s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fd62f6e6a20>
Failed to fetch sample 3655703. Exception: cannot identify image file <_io.BytesIO object at 0x7fd62f6e6a20>
82%|████████▏ | 27972/34278 [30:45:10<15:35:05, 8.90s/it] {'loss': 0.1271, 'grad_norm': 0.7703068451678616, 'learning_rate': 8.615961495817482e-07, 'epoch': 0.82} 82%|████████▏ | 27972/34278 [30:45:10<15:35:05, 8.90s/it] 82%|████████▏ | 27973/34278 [30:45:13<12:25:35, 7.10s/it] {'loss': 0.1096, 'grad_norm': 0.9032384205047582, 'learning_rate': 8.613310383845125e-07, 'epoch': 0.82} 82%|████████▏ | 27973/34278 [30:45:13<12:25:35, 7.10s/it] 82%|████████▏ | 27974/34278 [30:45:36<20:39:43, 11.80s/it] {'loss': 0.1051, 'grad_norm': 0.9759871484921885, 'learning_rate': 8.610659641363251e-07, 'epoch': 0.82} 82%|████████▏ | 27974/34278 [30:45:36<20:39:43, 11.80s/it] 82%|████████▏ | 27975/34278 [30:45:39<15:56:15, 9.10s/it] {'loss': 0.1085, 'grad_norm': 0.8059160076351759, 'learning_rate': 8.608009268395512e-07, 'epoch': 0.82} 82%|████████▏ | 27975/34278 [30:45:39<15:56:15, 9.10s/it] 82%|████████▏ | 27976/34278 [30:45:55<19:46:59, 11.30s/it] {'loss': 0.1036, 'grad_norm': 0.8301027795536169, 'learning_rate': 8.605359264965602e-07, 'epoch': 0.82} 82%|████████▏ | 27976/34278 [30:45:55<19:46:59, 11.30s/it] 82%|████████▏ | 27977/34278 [30:46:12<22:35:01, 12.90s/it] {'loss': 0.1043, 'grad_norm': 0.777042585692951, 'learning_rate': 8.602709631097161e-07, 'epoch': 0.82} 82%|████████▏ | 27977/34278 [30:46:12<22:35:01, 12.90s/it] 82%|████████▏ | 27978/34278 [30:46:23<21:52:17, 12.50s/it] {'loss': 0.1165, 'grad_norm': 0.8054281415233515, 'learning_rate': 8.600060366813823e-07, 'epoch': 0.82} 82%|████████▏ | 27978/34278 [30:46:23<21:52:17, 12.50s/it] 82%|████████▏ | 27979/34278 [30:46:27<16:59:26, 9.71s/it] {'loss': 0.1392, 'grad_norm': 0.9361815134274353, 'learning_rate': 8.597411472139288e-07, 'epoch': 0.82} 82%|████████▏ | 27979/34278 [30:46:27<16:59:26, 9.71s/it] 82%|████████▏ | 27980/34278 [30:46:30<13:36:15, 7.78s/it] {'loss': 0.1209, 'grad_norm': 0.7066121911380071,
'learning_rate': 8.594762947097173e-07, 'epoch': 0.82} 82%|████████▏ | 27980/34278 [30:46:30<13:36:15, 7.78s/it] 82%|████████▏ | 27981/34278 [30:46:33<11:08:10, 6.37s/it] {'loss': 0.1014, 'grad_norm': 1.027407194909586, 'learning_rate': 8.592114791711126e-07, 'epoch': 0.82} 82%|████████▏ | 27981/34278 [30:46:33<11:08:10, 6.37s/it] 82%|████████▏ | 27982/34278 [30:46:49<16:02:08, 9.17s/it] {'loss': 0.1121, 'grad_norm': 0.7264163145303579, 'learning_rate': 8.589467006004803e-07, 'epoch': 0.82} 82%|████████▏ | 27982/34278 [30:46:49<16:02:08, 9.17s/it] 82%|████████▏ | 27983/34278 [30:46:52<12:49:40, 7.34s/it] {'loss': 0.1018, 'grad_norm': 0.7785567537892137, 'learning_rate': 8.586819590001833e-07, 'epoch': 0.82} 82%|████████▏ | 27983/34278 [30:46:52<12:49:40, 7.34s/it] 82%|████████▏ | 27984/34278 [30:46:58<12:08:39, 6.95s/it] {'loss': 0.1166, 'grad_norm': 0.7948416690473252, 'learning_rate': 8.584172543725839e-07, 'epoch': 0.82} 82%|████████▏ | 27984/34278 [30:46:58<12:08:39, 6.95s/it] 82%|████████▏ | 27985/34278 [30:47:03<11:01:19, 6.31s/it] {'loss': 0.1146, 'grad_norm': 0.9166306968610213, 'learning_rate': 8.581525867200464e-07, 'epoch': 0.82} 82%|████████▏ | 27985/34278 [30:47:03<11:01:19, 6.31s/it] 82%|████████▏ | 27986/34278 [30:47:07<9:59:21, 5.72s/it] {'loss': 0.1108, 'grad_norm': 0.826955882779524, 'learning_rate': 8.578879560449354e-07, 'epoch': 0.82} 82%|████████▏ | 27986/34278 [30:47:07<9:59:21, 5.72s/it] 82%|████████▏ | 27987/34278 [30:47:11<8:58:05, 5.13s/it] {'loss': 0.1048, 'grad_norm': 0.6589801802365504, 'learning_rate': 8.576233623496117e-07, 'epoch': 0.82} 82%|████████▏ | 27987/34278 [30:47:11<8:58:05, 5.13s/it] 82%|████████▏ | 27988/34278 [30:47:13<7:45:12, 4.44s/it] {'loss': 0.1344, 'grad_norm': 1.039948581199204, 'learning_rate': 8.573588056364368e-07, 'epoch': 0.82} 82%|████████▏ | 27988/34278 [30:47:13<7:45:12, 4.44s/it] 82%|████████▏ | 27989/34278 [30:47:16<6:58:40, 3.99s/it] {'loss': 0.0856, 'grad_norm': 0.6468888511256666, 'learning_rate': 
8.570942859077747e-07, 'epoch': 0.82} 82%|████████▏ | 27989/34278 [30:47:16<6:58:40, 3.99s/it] 82%|████████▏ | 27990/34278 [30:47:21<7:05:54, 4.06s/it] {'loss': 0.121, 'grad_norm': 0.7987498188958376, 'learning_rate': 8.568298031659844e-07, 'epoch': 0.82} 82%|████████▏ | 27990/34278 [30:47:21<7:05:54, 4.06s/it] 82%|████████▏ | 27991/34278 [30:47:31<10:36:57, 6.08s/it] {'loss': 0.1071, 'grad_norm': 0.6352220061579698, 'learning_rate': 8.565653574134297e-07, 'epoch': 0.82} 82%|████████▏ | 27991/34278 [30:47:31<10:36:57, 6.08s/it] 82%|████████▏ | 27992/34278 [30:47:38<10:47:00, 6.18s/it] {'loss': 0.1183, 'grad_norm': 0.9729734228360658, 'learning_rate': 8.563009486524698e-07, 'epoch': 0.82} 82%|████████▏ | 27992/34278 [30:47:38<10:47:00, 6.18s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 82%|████████▏ | 27993/34278 [30:47:41<9:25:30, 5.40s/it] {'loss': 0.1241, 'grad_norm': 0.7873868919153967, 'learning_rate': 8.560365768854662e-07, 'epoch': 0.82} 82%|████████▏ | 27993/34278 [30:47:41<9:25:30, 5.40s/it] 82%|████████▏ | 27994/34278 [30:47:45<8:43:58, 5.00s/it] {'loss': 0.0951, 'grad_norm': 0.6202144193652498, 'learning_rate': 8.55772242114778e-07, 'epoch': 0.82} 82%|████████▏ | 27994/34278 [30:47:45<8:43:58, 5.00s/it] 82%|████████▏ | 27995/34278 [30:47:49<7:57:32, 4.56s/it] {'loss': 0.1106, 'grad_norm': 0.7052777954048747, 'learning_rate': 8.555079443427672e-07, 'epoch': 0.82} 82%|████████▏ | 27995/34278 [30:47:49<7:57:32, 4.56s/it] 82%|████████▏ | 27996/34278 [30:47:53<7:26:39, 4.27s/it] {'loss': 0.0756, 'grad_norm': 0.878384548400134, 'learning_rate': 8.552436835717909e-07, 'epoch': 0.82} 82%|████████▏ | 27996/34278 [30:47:53<7:26:39, 4.27s/it] 82%|████████▏ | 27997/34278 [30:47:59<8:21:29, 4.79s/it] {'loss': 0.1041, 'grad_norm': 0.9384075757366674, 'learning_rate': 8.549794598042104e-07, 
'epoch': 0.82} 82%|████████▏ | 27997/34278 [30:47:59<8:21:29, 4.79s/it] 82%|████████▏ | 27998/34278 [30:48:02<7:35:17, 4.35s/it] {'loss': 0.1188, 'grad_norm': 0.7983972119889551, 'learning_rate': 8.54715273042383e-07, 'epoch': 0.82} 82%|████████▏ | 27998/34278 [30:48:02<7:35:17, 4.35s/it] 82%|████████▏ | 27999/34278 [30:48:07<8:00:13, 4.59s/it] {'loss': 0.1267, 'grad_norm': 0.8255508273022011, 'learning_rate': 8.544511232886693e-07, 'epoch': 0.82} 82%|████████▏ | 27999/34278 [30:48:07<8:00:13, 4.59s/it] 82%|████████▏ | 28000/34278 [30:48:10<7:08:05, 4.09s/it] {'loss': 0.0905, 'grad_norm': 0.7603824207116087, 'learning_rate': 8.541870105454264e-07, 'epoch': 0.82} 82%|████████▏ | 28000/34278 [30:48:10<7:08:05, 4.09s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 82%|████████▏ | 28001/34278 [30:48:48<24:49:19, 14.24s/it] {'loss': 0.1176, 'grad_norm': 0.8667780567402311, 'learning_rate': 8.539229348150107e-07, 'epoch': 0.82} 82%|████████▏ | 28001/34278 [30:48:48<24:49:19, 14.24s/it] 82%|████████▏ | 28002/34278 [30:48:51<18:44:58, 10.76s/it] {'loss': 0.112, 'grad_norm': 1.0316246110869005, 'learning_rate': 8.536588960997811e-07, 'epoch': 0.82} 82%|████████▏ | 28002/34278 [30:48:51<18:44:58, 10.76s/it] 82%|████████▏ | 28003/34278 [30:48:55<15:28:11, 8.88s/it] {'loss': 0.1251, 'grad_norm': 0.7797496813247969, 'learning_rate': 8.53394894402097e-07, 'epoch': 0.82} 82%|████████▏ | 28003/34278 [30:48:55<15:28:11, 8.88s/it] 82%|████████▏ | 28004/34278 [30:49:01<13:54:54, 7.98s/it] {'loss': 0.1226, 'grad_norm': 0.9794218367100127, 'learning_rate': 8.531309297243129e-07, 'epoch': 0.82} 82%|████████▏ | 28004/34278 [30:49:01<13:54:54, 7.98s/it] 82%|████████▏ | 28005/34278 [30:49:04<11:25:17, 6.55s/it] {'loss': 0.1169, 'grad_norm': 0.7280829179655549, 'learning_rate': 8.528670020687845e-07, 'epoch': 0.82} 82%|████████▏ | 28005/34278 [30:49:04<11:25:17, 6.55s/it] 82%|████████▏ | 28006/34278 [30:49:07<9:29:07, 5.44s/it] {'loss': 0.1391, 'grad_norm': 1.0092566018665785, 'learning_rate': 8.526031114378713e-07, 'epoch': 0.82} 82%|████████▏ | 28006/34278 [30:49:07<9:29:07, 5.44s/it] 82%|████████▏ | 28007/34278 [30:49:11<8:30:55, 4.89s/it] {'loss': 0.0923, 'grad_norm': 0.983347000050237, 'learning_rate': 8.523392578339268e-07, 'epoch': 0.82} 82%|████████▏ | 28007/34278 [30:49:11<8:30:55, 4.89s/it] 82%|████████▏ | 28008/34278 [30:49:14<7:54:19, 4.54s/it] {'loss': 0.1325, 'grad_norm': 0.7769789828232809, 'learning_rate': 8.520754412593052e-07, 'epoch': 0.82} 82%|████████▏ | 28008/34278 [30:49:14<7:54:19, 4.54s/it] 82%|████████▏ | 28009/34278 [30:49:17<7:00:40, 4.03s/it] {'loss': 0.104, 'grad_norm': 0.8762249085199176, 'learning_rate': 8.518116617163664e-07, 'epoch': 0.82} 82%|████████▏ | 28009/34278 
[30:49:17<7:00:40, 4.03s/it] 82%|████████▏ | 28010/34278 [30:49:23<8:01:17, 4.61s/it] {'loss': 0.1061, 'grad_norm': 0.7477001410736634, 'learning_rate': 8.515479192074627e-07, 'epoch': 0.82} 82%|████████▏ | 28010/34278 [30:49:23<8:01:17, 4.61s/it] 82%|████████▏ | 28011/34278 [30:49:29<8:47:34, 5.05s/it] {'loss': 0.1028, 'grad_norm': 0.7165997494065797, 'learning_rate': 8.512842137349475e-07, 'epoch': 0.82} 82%|████████▏ | 28011/34278 [30:49:29<8:47:34, 5.05s/it] 82%|████████▏ | 28012/34278 [30:49:33<8:12:01, 4.71s/it] {'loss': 0.124, 'grad_norm': 0.7707761993942428, 'learning_rate': 8.510205453011783e-07, 'epoch': 0.82} 82%|████████▏ | 28012/34278 [30:49:33<8:12:01, 4.71s/it] 82%|████████▏ | 28013/34278 [30:49:36<7:13:33, 4.15s/it] {'loss': 0.1219, 'grad_norm': 0.7468551414347658, 'learning_rate': 8.507569139085064e-07, 'epoch': 0.82} 82%|████████▏ | 28013/34278 [30:49:36<7:13:33, 4.15s/it] 82%|████████▏ | 28014/34278 [30:49:42<8:09:01, 4.68s/it] {'loss': 0.1092, 'grad_norm': 1.0180306974051387, 'learning_rate': 8.504933195592858e-07, 'epoch': 0.82} 82%|████████▏ | 28014/34278 [30:49:42<8:09:01, 4.68s/it] 82%|████████▏ | 28015/34278 [30:49:48<8:50:59, 5.09s/it] {'loss': 0.1106, 'grad_norm': 0.7306808186430649, 'learning_rate': 8.502297622558697e-07, 'epoch': 0.82} 82%|████████▏ | 28015/34278 [30:49:48<8:50:59, 5.09s/it] 82%|████████▏ | 28016/34278 [30:49:54<9:18:47, 5.35s/it] {'loss': 0.1224, 'grad_norm': 0.813070514036749, 'learning_rate': 8.499662420006127e-07, 'epoch': 0.82} 82%|████████▏ | 28016/34278 [30:49:54<9:18:47, 5.35s/it] 82%|████████▏ | 28017/34278 [30:49:57<8:10:53, 4.70s/it] {'loss': 0.1114, 'grad_norm': 0.9905332900506604, 'learning_rate': 8.497027587958672e-07, 'epoch': 0.82} 82%|████████▏ | 28017/34278 [30:49:57<8:10:53, 4.70s/it] 82%|████████▏ | 28018/34278 [30:50:02<8:31:45, 4.91s/it] {'loss': 0.1199, 'grad_norm': 0.8700338934573142, 'learning_rate': 8.494393126439831e-07, 'epoch': 0.82} 82%|████████▏ | 28018/34278 [30:50:02<8:31:45, 4.91s/it] 
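The "Set eos token id to 151658 ... Set eos token id back to 151645" messages above show each rank temporarily pointing the tokenizer's eos at `<|diff_marker|>` and then restoring `<|im_end|>`. A minimal sketch of that swap-and-restore as a context manager, assuming a tokenizer-like object with `eos_token` / `eos_token_id` attributes (`tok` here is a stand-in, not the real Qwen2-VL tokenizer):

```python
from contextlib import contextmanager

@contextmanager
def swap_eos(tok, new_id, new_token):
    """Temporarily replace the eos token, restoring the original on exit."""
    old_id, old_token = tok.eos_token_id, tok.eos_token
    tok.eos_token_id, tok.eos_token = new_id, new_token
    print(f"Set eos token id to {new_id}")
    print(f"Set eos token to {new_token}")
    try:
        yield tok
    finally:
        tok.eos_token_id, tok.eos_token = old_id, old_token
        print(f"Set eos token id back to {old_id}")
        print(f"Set eos token back to {old_token}")
```

Doing the restore in `finally` guarantees the original eos comes back even if the code inside the `with` block raises, which matters when every rank logs the swap as seen here.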
82%|████████▏ | 28019/34278 [30:50:07<8:16:05, 4.76s/it] {'loss': 0.1441, 'grad_norm': 0.9352580233423357, 'learning_rate': 8.491759035473152e-07, 'epoch': 0.82} 82%|████████▏ | 28019/34278 [30:50:07<8:16:05, 4.76s/it] 82%|████████▏ | 28020/34278 [30:50:09<7:09:29, 4.12s/it] {'loss': 0.1272, 'grad_norm': 0.9657119282357742, 'learning_rate': 8.489125315082125e-07, 'epoch': 0.82} 82%|████████▏ | 28020/34278 [30:50:10<7:09:29, 4.12s/it] 82%|████████▏ | 28021/34278 [30:50:12<6:23:57, 3.68s/it] {'loss': 0.1174, 'grad_norm': 0.8445003314221814, 'learning_rate': 8.486491965290294e-07, 'epoch': 0.82} 82%|████████▏ | 28021/34278 [30:50:12<6:23:57, 3.68s/it] 82%|████████▏ | 28022/34278 [30:50:16<6:28:45, 3.73s/it] {'loss': 0.1183, 'grad_norm': 0.9901304891181352, 'learning_rate': 8.483858986121135e-07, 'epoch': 0.82} 82%|████████▏ | 28022/34278 [30:50:16<6:28:45, 3.73s/it] 82%|████████▏ | 28023/34278 [30:50:19<5:59:31, 3.45s/it] {'loss': 0.1154, 'grad_norm': 1.0211973522370925, 'learning_rate': 8.48122637759819e-07, 'epoch': 0.82} 82%|████████▏ | 28023/34278 [30:50:19<5:59:31, 3.45s/it] 82%|████████▏ | 28024/34278 [30:50:23<6:17:36, 3.62s/it] {'loss': 0.1232, 'grad_norm': 1.0334704825490604, 'learning_rate': 8.478594139744928e-07, 'epoch': 0.82} 82%|████████▏ | 28024/34278 [30:50:23<6:17:36, 3.62s/it] 82%|████████▏ | 28025/34278 [30:50:26<6:04:56, 3.50s/it] {'loss': 0.1213, 'grad_norm': 1.0091919402124505, 'learning_rate': 8.475962272584881e-07, 'epoch': 0.82} 82%|████████▏ | 28025/34278 [30:50:26<6:04:56, 3.50s/it] 82%|████████▏ | 28026/34278 [30:50:29<6:03:40, 3.49s/it] {'loss': 0.1386, 'grad_norm': 1.010854824605726, 'learning_rate': 8.47333077614152e-07, 'epoch': 0.82} 82%|████████▏ | 28026/34278 [30:50:30<6:03:40, 3.49s/it] 82%|████████▏ | 28027/34278 [30:50:32<5:45:03, 3.31s/it] {'loss': 0.1223, 'grad_norm': 0.8002253611829233, 'learning_rate': 8.470699650438358e-07, 'epoch': 0.82} 82%|████████▏ | 28027/34278 [30:50:32<5:45:03, 3.31s/it] 82%|████████▏ | 28028/34278 
[30:50:35<5:28:26, 3.15s/it] {'loss': 0.1112, 'grad_norm': 0.7758489461325259, 'learning_rate': 8.468068895498859e-07, 'epoch': 0.82} 82%|████████▏ | 28028/34278 [30:50:35<5:28:26, 3.15s/it] 82%|████████▏ | 28029/34278 [30:50:39<5:36:47, 3.23s/it] {'loss': 0.1709, 'grad_norm': 1.0687019772086732, 'learning_rate': 8.465438511346546e-07, 'epoch': 0.82} 82%|████████▏ | 28029/34278 [30:50:39<5:36:47, 3.23s/it] 82%|████████▏ | 28030/34278 [30:50:41<5:23:10, 3.10s/it] {'loss': 0.1212, 'grad_norm': 0.8801451995809791, 'learning_rate': 8.462808498004882e-07, 'epoch': 0.82} 82%|████████▏ | 28030/34278 [30:50:41<5:23:10, 3.10s/it] 82%|████████▏ | 28031/34278 [30:50:45<5:49:35, 3.36s/it] {'loss': 0.1316, 'grad_norm': 0.8419010267623476, 'learning_rate': 8.460178855497331e-07, 'epoch': 0.82} 82%|████████▏ | 28031/34278 [30:50:45<5:49:35, 3.36s/it] 82%|████████▏ | 28032/34278 [30:50:48<5:32:19, 3.19s/it] {'loss': 0.0872, 'grad_norm': 0.8620228436395571, 'learning_rate': 8.457549583847391e-07, 'epoch': 0.82} 82%|████████▏ | 28032/34278 [30:50:48<5:32:19, 3.19s/it] 82%|████████▏ | 28033/34278 [30:50:52<5:55:43, 3.42s/it] {'loss': 0.1323, 'grad_norm': 0.9043170259284783, 'learning_rate': 8.454920683078544e-07, 'epoch': 0.82} 82%|████████▏ | 28033/34278 [30:50:52<5:55:43, 3.42s/it] 82%|████████▏ | 28034/34278 [30:50:56<6:18:06, 3.63s/it] {'loss': 0.1269, 'grad_norm': 0.7789213372353636, 'learning_rate': 8.452292153214242e-07, 'epoch': 0.82} 82%|████████▏ | 28034/34278 [30:50:56<6:18:06, 3.63s/it] 82%|████████▏ | 28035/34278 [30:51:03<7:45:20, 4.47s/it] {'loss': 0.1293, 'grad_norm': 0.905291898223673, 'learning_rate': 8.449663994277951e-07, 'epoch': 0.82} 82%|████████▏ | 28035/34278 [30:51:03<7:45:20, 4.47s/it] 82%|████████▏ | 28036/34278 [30:51:06<7:05:02, 4.09s/it] {'loss': 0.125, 'grad_norm': 0.9083828888139835, 'learning_rate': 8.447036206293152e-07, 'epoch': 0.82} 82%|████████▏ | 28036/34278 [30:51:06<7:05:02, 4.09s/it] 82%|████████▏ | 28037/34278 [30:51:09<6:37:09, 3.82s/it] 
{'loss': 0.1247, 'grad_norm': 0.7955954016926723, 'learning_rate': 8.444408789283292e-07, 'epoch': 0.82} 82%|████████▏ | 28037/34278 [30:51:09<6:37:09, 3.82s/it] 82%|████████▏ | 28038/34278 [30:51:12<6:22:52, 3.68s/it] {'loss': 0.1119, 'grad_norm': 0.9815952664476065, 'learning_rate': 8.44178174327181e-07, 'epoch': 0.82} 82%|████████▏ | 28038/34278 [30:51:12<6:22:52, 3.68s/it] 82%|████████▏ | 28039/34278 [30:51:16<6:09:19, 3.55s/it] {'loss': 0.1226, 'grad_norm': 0.8610262579032661, 'learning_rate': 8.439155068282201e-07, 'epoch': 0.82} 82%|████████▏ | 28039/34278 [30:51:16<6:09:19, 3.55s/it] 82%|████████▏ | 28040/34278 [30:51:19<5:47:43, 3.34s/it] {'loss': 0.1318, 'grad_norm': 0.8876390810937166, 'learning_rate': 8.436528764337892e-07, 'epoch': 0.82} 82%|████████▏ | 28040/34278 [30:51:19<5:47:43, 3.34s/it] 82%|████████▏ | 28041/34278 [30:51:22<5:44:51, 3.32s/it] {'loss': 0.1333, 'grad_norm': 1.015407307769133, 'learning_rate': 8.43390283146232e-07, 'epoch': 0.82} 82%|████████▏ | 28041/34278 [30:51:22<5:44:51, 3.32s/it] 82%|████████▏ | 28042/34278 [30:51:25<5:37:27, 3.25s/it] {'loss': 0.1041, 'grad_norm': 0.9222701906240001, 'learning_rate': 8.431277269678961e-07, 'epoch': 0.82} 82%|████████▏ | 28042/34278 [30:51:25<5:37:27, 3.25s/it] 82%|████████▏ | 28043/34278 [30:51:28<5:32:51, 3.20s/it] {'loss': 0.1135, 'grad_norm': 0.8136467820095734, 'learning_rate': 8.428652079011229e-07, 'epoch': 0.82} 82%|████████▏ | 28043/34278 [30:51:28<5:32:51, 3.20s/it] 82%|████████▏ | 28044/34278 [30:51:33<6:18:30, 3.64s/it] {'loss': 0.1144, 'grad_norm': 0.8692521287609517, 'learning_rate': 8.426027259482555e-07, 'epoch': 0.82} 82%|████████▏ | 28044/34278 [30:51:33<6:18:30, 3.64s/it] 82%|████████▏ | 28045/34278 [30:51:36<6:08:57, 3.55s/it] {'loss': 0.1025, 'grad_norm': 1.0202616954193195, 'learning_rate': 8.423402811116388e-07, 'epoch': 0.82} 82%|████████▏ | 28045/34278 [30:51:36<6:08:57, 3.55s/it] 82%|████████▏ | 28046/34278 [30:51:40<6:32:38, 3.78s/it] {'loss': 0.082, 'grad_norm': 
0.7730376981748913, 'learning_rate': 8.420778733936164e-07, 'epoch': 0.82} 82%|████████▏ | 28046/34278 [30:51:40<6:32:38, 3.78s/it] 82%|████████▏ | 28047/34278 [30:51:46<7:41:59, 4.45s/it] {'loss': 0.1, 'grad_norm': 0.9667693727371081, 'learning_rate': 8.418155027965302e-07, 'epoch': 0.82} 82%|████████▏ | 28047/34278 [30:51:46<7:41:59, 4.45s/it] 82%|████████▏ | 28048/34278 [30:51:50<7:04:08, 4.08s/it] {'loss': 0.1074, 'grad_norm': 0.8015330055711267, 'learning_rate': 8.41553169322722e-07, 'epoch': 0.82} 82%|████████▏ | 28048/34278 [30:51:50<7:04:08, 4.08s/it] 82%|████████▏ | 28049/34278 [30:51:54<7:13:04, 4.17s/it] {'loss': 0.1423, 'grad_norm': 0.8360952802038947, 'learning_rate': 8.41290872974535e-07, 'epoch': 0.82} 82%|████████▏ | 28049/34278 [30:51:54<7:13:04, 4.17s/it] 82%|████████▏ | 28050/34278 [30:51:57<6:49:12, 3.94s/it] {'loss': 0.1079, 'grad_norm': 1.0510750425775541, 'learning_rate': 8.410286137543089e-07, 'epoch': 0.82} 82%|████████▏ | 28050/34278 [30:51:57<6:49:12, 3.94s/it] 82%|████████▏ | 28051/34278 [30:52:01<6:26:14, 3.72s/it] {'loss': 0.1062, 'grad_norm': 0.7413942925453786, 'learning_rate': 8.407663916643882e-07, 'epoch': 0.82} 82%|████████▏ | 28051/34278 [30:52:01<6:26:14, 3.72s/it] 82%|████████▏ | 28052/34278 [30:52:06<7:35:50, 4.39s/it] {'loss': 0.1146, 'grad_norm': 0.9235686785184701, 'learning_rate': 8.405042067071112e-07, 'epoch': 0.82} 82%|████████▏ | 28052/34278 [30:52:06<7:35:50, 4.39s/it] 82%|████████▏ | 28053/34278 [30:52:11<7:37:53, 4.41s/it] {'loss': 0.1028, 'grad_norm': 0.8694952551010822, 'learning_rate': 8.402420588848204e-07, 'epoch': 0.82} 82%|████████▏ | 28053/34278 [30:52:11<7:37:53, 4.41s/it] 82%|████████▏ | 28054/34278 [30:52:14<7:01:44, 4.07s/it] {'loss': 0.0997, 'grad_norm': 0.8309566220828405, 'learning_rate': 8.399799481998555e-07, 'epoch': 0.82} 82%|████████▏ | 28054/34278 [30:52:14<7:01:44, 4.07s/it] 82%|████████▏ | 28055/34278 [30:52:20<7:57:54, 4.61s/it] {'loss': 0.1293, 'grad_norm': 0.7166765632882739, 
'learning_rate': 8.397178746545558e-07, 'epoch': 0.82} 82%|████████▏ | 28055/34278 [30:52:20<7:57:54, 4.61s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
 82%|████████▏ | 28056/34278 [30:52:23<7:10:36, 4.15s/it] {'loss': 0.1124, 'grad_norm': 0.7696646028108041, 'learning_rate': 8.394558382512613e-07, 'epoch': 0.82} 82%|████████▏ | 28056/34278 [30:52:23<7:10:36, 4.15s/it] 82%|████████▏ | 28057/34278 [30:52:26<6:35:08, 3.81s/it] {'loss': 0.1048, 'grad_norm': 0.7930973318720292, 'learning_rate': 8.391938389923132e-07, 'epoch': 0.82} 82%|████████▏ | 28057/34278 [30:52:26<6:35:08, 3.81s/it] 82%|████████▏ | 28058/34278 [30:52:32<7:40:46, 4.44s/it] {'loss': 0.0871, 'grad_norm': 0.7494214465106429, 'learning_rate': 8.389318768800481e-07, 'epoch': 0.82} 82%|████████▏ | 28058/34278 [30:52:32<7:40:46, 4.44s/it] 82%|████████▏ | 28059/34278 [30:52:36<7:11:39, 4.16s/it] {'loss': 0.1177, 'grad_norm': 0.7757831673639287, 'learning_rate': 8.386699519168074e-07, 'epoch': 0.82} 82%|████████▏ | 28059/34278 [30:52:36<7:11:39, 4.16s/it] 82%|████████▏ | 28060/34278 [30:52:39<6:40:49, 3.87s/it] {'loss': 0.1219, 'grad_norm': 0.8360261929111451, 'learning_rate': 8.384080641049275e-07, 'epoch': 0.82} 82%|████████▏ | 28060/34278 [30:52:39<6:40:49, 3.87s/it] 82%|████████▏ | 28061/34278 [30:52:42<6:31:57, 3.78s/it] {'loss': 0.118, 'grad_norm': 1.2544170598866653, 'learning_rate': 8.38146213446746e-07, 'epoch': 0.82} 82%|████████▏ | 28061/34278 [30:52:42<6:31:57, 3.78s/it] 82%|████████▏ | 28062/34278 [30:52:48<7:20:26, 4.25s/it] {'loss': 0.1273, 'grad_norm': 0.8124155581438096, 'learning_rate': 8.378843999446018e-07, 'epoch': 0.82} 82%|████████▏ | 28062/34278 [30:52:48<7:20:26, 4.25s/it] 82%|████████▏ | 28063/34278 [30:52:51<6:35:47, 3.82s/it] {'loss': 0.1079, 'grad_norm': 0.893277986606559, 'learning_rate':
8.376226236008328e-07, 'epoch': 0.82} 82%|████████▏ | 28063/34278 [30:52:51<6:35:47, 3.82s/it] 82%|████████▏ | 28064/34278 [30:52:54<6:10:39, 3.58s/it] {'loss': 0.1154, 'grad_norm': 0.7547091523477031, 'learning_rate': 8.373608844177755e-07, 'epoch': 0.82} 82%|████████▏ | 28064/34278 [30:52:54<6:10:39, 3.58s/it] 82%|████████▏ | 28065/34278 [30:52:58<6:42:18, 3.89s/it] {'loss': 0.1162, 'grad_norm': 0.7725148549140893, 'learning_rate': 8.370991823977653e-07, 'epoch': 0.82} 82%|████████▏ | 28065/34278 [30:52:58<6:42:18, 3.89s/it] 82%|████████▏ | 28066/34278 [30:53:04<7:46:20, 4.50s/it] {'loss': 0.1197, 'grad_norm': 0.7866570268503806, 'learning_rate': 8.368375175431415e-07, 'epoch': 0.82} 82%|████████▏ | 28066/34278 [30:53:04<7:46:20, 4.50s/it] 82%|████████▏ | 28067/34278 [30:53:08<7:16:52, 4.22s/it] {'loss': 0.1261, 'grad_norm': 0.9351223328356347, 'learning_rate': 8.365758898562371e-07, 'epoch': 0.82} 82%|████████▏ | 28067/34278 [30:53:08<7:16:52, 4.22s/it] 82%|████████▏ | 28068/34278 [30:53:13<8:05:17, 4.69s/it] {'loss': 0.1184, 'grad_norm': 0.9229732339443021, 'learning_rate': 8.363142993393891e-07, 'epoch': 0.82} 82%|████████▏ | 28068/34278 [30:53:13<8:05:17, 4.69s/it] 82%|████████▏ | 28069/34278 [30:53:17<7:20:18, 4.25s/it] {'loss': 0.1325, 'grad_norm': 0.8975736632063561, 'learning_rate': 8.360527459949341e-07, 'epoch': 0.82} 82%|████████▏ | 28069/34278 [30:53:17<7:20:18, 4.25s/it] 82%|████████▏ | 28070/34278 [30:53:21<7:24:09, 4.29s/it] {'loss': 0.1086, 'grad_norm': 0.7978540709436979, 'learning_rate': 8.357912298252063e-07, 'epoch': 0.82} 82%|████████▏ | 28070/34278 [30:53:21<7:24:09, 4.29s/it] 82%|████████▏ | 28071/34278 [30:53:25<7:26:09, 4.31s/it] {'loss': 0.0938, 'grad_norm': 0.8436162053129328, 'learning_rate': 8.355297508325394e-07, 'epoch': 0.82} 82%|████████▏ | 28071/34278 [30:53:25<7:26:09, 4.31s/it] 82%|████████▏ | 28072/34278 [30:53:31<8:16:56, 4.80s/it] {'loss': 0.159, 'grad_norm': 0.9512952476076122, 'learning_rate': 8.352683090192698e-07, 
'epoch': 0.82} 82%|████████▏ | 28072/34278 [30:53:31<8:16:56, 4.80s/it] 82%|████████▏ | 28073/34278 [30:53:35<7:33:57, 4.39s/it] {'loss': 0.1217, 'grad_norm': 0.8949113859449455, 'learning_rate': 8.35006904387729e-07, 'epoch': 0.82} 82%|████████▏ | 28073/34278 [30:53:35<7:33:57, 4.39s/it] 82%|████████▏ | 28074/34278 [30:53:38<7:04:34, 4.11s/it] {'loss': 0.1079, 'grad_norm': 0.9513471759480473, 'learning_rate': 8.34745536940254e-07, 'epoch': 0.82} 82%|████████▏ | 28074/34278 [30:53:38<7:04:34, 4.11s/it] 82%|████████▏ | 28075/34278 [30:53:42<6:50:21, 3.97s/it] {'loss': 0.1031, 'grad_norm': 0.8236986247300286, 'learning_rate': 8.344842066791753e-07, 'epoch': 0.82} 82%|████████▏ | 28075/34278 [30:53:42<6:50:21, 3.97s/it] 82%|████████▏ | 28076/34278 [30:53:45<6:17:50, 3.66s/it] {'loss': 0.1315, 'grad_norm': 0.8718510701854801, 'learning_rate': 8.342229136068281e-07, 'epoch': 0.82} 82%|████████▏ | 28076/34278 [30:53:45<6:17:50, 3.66s/it] 82%|████████▏ | 28077/34278 [30:53:48<6:01:27, 3.50s/it] {'loss': 0.0978, 'grad_norm': 0.7291032550965775, 'learning_rate': 8.339616577255444e-07, 'epoch': 0.82} 82%|████████▏ | 28077/34278 [30:53:48<6:01:27, 3.50s/it] 82%|████████▏ | 28078/34278 [30:53:51<5:36:40, 3.26s/it] {'loss': 0.1079, 'grad_norm': 0.8902461305151809, 'learning_rate': 8.337004390376552e-07, 'epoch': 0.82} 82%|████████▏ | 28078/34278 [30:53:51<5:36:40, 3.26s/it] 82%|████████▏ | 28079/34278 [30:53:54<5:36:29, 3.26s/it] {'loss': 0.106, 'grad_norm': 0.9204624887828335, 'learning_rate': 8.334392575454941e-07, 'epoch': 0.82} 82%|████████▏ | 28079/34278 [30:53:54<5:36:29, 3.26s/it] 82%|████████▏ | 28080/34278 [30:53:57<5:32:39, 3.22s/it] {'loss': 0.1142, 'grad_norm': 1.0351279618199487, 'learning_rate': 8.331781132513939e-07, 'epoch': 0.82} 82%|████████▏ | 28080/34278 [30:53:57<5:32:39, 3.22s/it] 82%|████████▏ | 28081/34278 [30:54:00<5:31:29, 3.21s/it] {'loss': 0.1323, 'grad_norm': 0.8509110671830511, 'learning_rate': 8.329170061576847e-07, 'epoch': 0.82} 82%|████████▏ | 
28081/34278 [30:54:00<5:31:29, 3.21s/it] 82%|████████▏ | 28082/34278 [30:54:03<5:29:39, 3.19s/it] {'loss': 0.1025, 'grad_norm': 0.7621287085683462, 'learning_rate': 8.326559362666964e-07, 'epoch': 0.82} 82%|████████▏ | 28082/34278 [30:54:03<5:29:39, 3.19s/it] 82%|████████▏ | 28083/34278 [30:54:07<5:46:18, 3.35s/it] {'loss': 0.117, 'grad_norm': 0.7648280607430413, 'learning_rate': 8.323949035807621e-07, 'epoch': 0.82} 82%|████████▏ | 28083/34278 [30:54:07<5:46:18, 3.35s/it] 82%|████████▏ | 28084/34278 [30:54:11<6:08:35, 3.57s/it] {'loss': 0.1058, 'grad_norm': 0.7945796102385012, 'learning_rate': 8.321339081022117e-07, 'epoch': 0.82} 82%|████████▏ | 28084/34278 [30:54:11<6:08:35, 3.57s/it] 82%|████████▏ | 28085/34278 [30:54:14<5:49:08, 3.38s/it] {'loss': 0.0838, 'grad_norm': 0.8999082911118238, 'learning_rate': 8.318729498333722e-07, 'epoch': 0.82} 82%|████████▏ | 28085/34278 [30:54:14<5:49:08, 3.38s/it] 82%|████████▏ | 28086/34278 [30:54:18<5:51:03, 3.40s/it] {'loss': 0.1321, 'grad_norm': 0.950698878684458, 'learning_rate': 8.316120287765784e-07, 'epoch': 0.82} 82%|████████▏ | 28086/34278 [30:54:18<5:51:03, 3.40s/it] 82%|████████▏ | 28087/34278 [30:54:24<7:16:13, 4.23s/it] {'loss': 0.1081, 'grad_norm': 0.8058811638166943, 'learning_rate': 8.313511449341572e-07, 'epoch': 0.82} 82%|████████▏ | 28087/34278 [30:54:24<7:16:13, 4.23s/it] 82%|████████▏ | 28088/34278 [30:54:30<8:16:50, 4.82s/it] {'loss': 0.1225, 'grad_norm': 0.8291288706638122, 'learning_rate': 8.310902983084368e-07, 'epoch': 0.82} 82%|████████▏ | 28088/34278 [30:54:30<8:16:50, 4.82s/it] 82%|████████▏ | 28089/34278 [30:54:33<7:17:12, 4.24s/it] {'loss': 0.0867, 'grad_norm': 0.718478792515388, 'learning_rate': 8.308294889017482e-07, 'epoch': 0.82} 82%|████████▏ | 28089/34278 [30:54:33<7:17:12, 4.24s/it] 82%|████████▏ | 28090/34278 [30:54:36<6:57:40, 4.05s/it] {'loss': 0.1117, 'grad_norm': 0.9614708259651105, 'learning_rate': 8.305687167164189e-07, 'epoch': 0.82} 82%|████████▏ | 28090/34278 [30:54:36<6:57:40, 
4.05s/it] 82%|████████▏ | 28091/34278 [30:54:40<6:56:14, 4.04s/it] {'loss': 0.1058, 'grad_norm': 0.7657402132189794, 'learning_rate': 8.303079817547749e-07, 'epoch': 0.82} 82%|████████▏ | 28091/34278 [30:54:40<6:56:14, 4.04s/it] 82%|████████▏ | 28092/34278 [30:54:43<6:16:23, 3.65s/it] {'loss': 0.1092, 'grad_norm': 0.9436221345772089, 'learning_rate': 8.300472840191464e-07, 'epoch': 0.82} 82%|████████▏ | 28092/34278 [30:54:43<6:16:23, 3.65s/it] 82%|████████▏ | 28093/34278 [30:54:47<6:09:58, 3.59s/it] {'loss': 0.1071, 'grad_norm': 0.9302532658571502, 'learning_rate': 8.297866235118612e-07, 'epoch': 0.82} 82%|████████▏ | 28093/34278 [30:54:47<6:09:58, 3.59s/it] 82%|████████▏ | 28094/34278 [30:54:52<7:13:27, 4.21s/it] {'loss': 0.1177, 'grad_norm': 0.744906274912989, 'learning_rate': 8.295260002352462e-07, 'epoch': 0.82} 82%|████████▏ | 28094/34278 [30:54:52<7:13:27, 4.21s/it] 82%|████████▏ | 28095/34278 [30:54:55<6:40:27, 3.89s/it] {'loss': 0.111, 'grad_norm': 0.7902433385563541, 'learning_rate': 8.292654141916257e-07, 'epoch': 0.82} 82%|████████▏ | 28095/34278 [30:54:55<6:40:27, 3.89s/it] 82%|████████▏ | 28096/34278 [30:54:59<6:18:40, 3.68s/it] {'loss': 0.1032, 'grad_norm': 0.8382154140304817, 'learning_rate': 8.290048653833288e-07, 'epoch': 0.82} 82%|████████▏ | 28096/34278 [30:54:59<6:18:40, 3.68s/it] 82%|████████▏ | 28097/34278 [30:55:02<6:26:46, 3.75s/it] {'loss': 0.1179, 'grad_norm': 1.0535307215415357, 'learning_rate': 8.287443538126805e-07, 'epoch': 0.82} 82%|████████▏ | 28097/34278 [30:55:02<6:26:46, 3.75s/it] 82%|████████▏ | 28098/34278 [30:55:06<6:05:33, 3.55s/it] {'loss': 0.13, 'grad_norm': 0.9495198768864338, 'learning_rate': 8.284838794820061e-07, 'epoch': 0.82} 82%|████████▏ | 28098/34278 [30:55:06<6:05:33, 3.55s/it] 82%|████████▏ | 28099/34278 [30:55:08<5:44:26, 3.34s/it] {'loss': 0.1196, 'grad_norm': 0.7835200583928362, 'learning_rate': 8.28223442393633e-07, 'epoch': 0.82} 82%|████████▏ | 28099/34278 [30:55:08<5:44:26, 3.34s/it] 82%|████████▏ | 
28100/34278 [30:55:13<6:18:14, 3.67s/it] {'loss': 0.1248, 'grad_norm': 1.00873338218089, 'learning_rate': 8.279630425498858e-07, 'epoch': 0.82} 82%|████████▏ | 28100/34278 [30:55:13<6:18:14, 3.67s/it] 82%|████████▏ | 28101/34278 [30:55:17<6:28:15, 3.77s/it] {'loss': 0.1043, 'grad_norm': 1.0290157010544827, 'learning_rate': 8.277026799530869e-07, 'epoch': 0.82} 82%|████████▏ | 28101/34278 [30:55:17<6:28:15, 3.77s/it] 82%|████████▏ | 28102/34278 [30:55:20<5:54:19, 3.44s/it] {'loss': 0.0975, 'grad_norm': 0.8396608742587166, 'learning_rate': 8.274423546055638e-07, 'epoch': 0.82} 82%|████████▏ | 28102/34278 [30:55:20<5:54:19, 3.44s/it] 82%|████████▏ | 28103/34278 [30:55:23<5:41:26, 3.32s/it] {'loss': 0.1415, 'grad_norm': 1.0409349443091456, 'learning_rate': 8.271820665096381e-07, 'epoch': 0.82} 82%|████████▏ | 28103/34278 [30:55:23<5:41:26, 3.32s/it] 82%|████████▏ | 28104/34278 [30:55:27<6:06:02, 3.56s/it] {'loss': 0.1113, 'grad_norm': 0.7578961483672078, 'learning_rate': 8.269218156676356e-07, 'epoch': 0.82} 82%|████████▏ | 28104/34278 [30:55:27<6:06:02, 3.56s/it] 82%|████████▏ | 28105/34278 [30:55:30<5:45:43, 3.36s/it] {'loss': 0.1206, 'grad_norm': 0.8745063625424268, 'learning_rate': 8.266616020818779e-07, 'epoch': 0.82} 82%|████████▏ | 28105/34278 [30:55:30<5:45:43, 3.36s/it] 82%|████████▏ | 28106/34278 [30:55:32<5:27:00, 3.18s/it] {'loss': 0.1223, 'grad_norm': 1.0504859406376101, 'learning_rate': 8.264014257546909e-07, 'epoch': 0.82} 82%|████████▏ | 28106/34278 [30:55:32<5:27:00, 3.18s/it] 82%|████████▏ | 28107/34278 [30:55:36<5:32:36, 3.23s/it] {'loss': 0.1032, 'grad_norm': 0.7646400680302314, 'learning_rate': 8.26141286688395e-07, 'epoch': 0.82} 82%|████████▏ | 28107/34278 [30:55:36<5:32:36, 3.23s/it] 82%|████████▏ | 28108/34278 [30:55:40<5:55:36, 3.46s/it] {'loss': 0.1245, 'grad_norm': 0.7847866097437022, 'learning_rate': 8.258811848853126e-07, 'epoch': 0.82} 82%|████████▏ | 28108/34278 [30:55:40<5:55:36, 3.46s/it] 82%|████████▏ | 28109/34278 [30:55:43<5:38:14, 
3.29s/it] {'loss': 0.1199, 'grad_norm': 0.8507550174118889, 'learning_rate': 8.256211203477659e-07, 'epoch': 0.82} 82%|████████▏ | 28109/34278 [30:55:43<5:38:14, 3.29s/it] 82%|████████▏ | 28110/34278 [30:55:45<5:23:19, 3.15s/it] {'loss': 0.1216, 'grad_norm': 1.0639099007405886, 'learning_rate': 8.253610930780793e-07, 'epoch': 0.82} 82%|████████▏ | 28110/34278 [30:55:45<5:23:19, 3.15s/it] 82%|████████▏ | 28111/34278 [30:55:49<5:28:50, 3.20s/it] {'loss': 0.1073, 'grad_norm': 0.7559782093999984, 'learning_rate': 8.251011030785722e-07, 'epoch': 0.82} 82%|████████▏ | 28111/34278 [30:55:49<5:28:50, 3.20s/it] 82%|████████▏ | 28112/34278 [30:55:52<5:30:29, 3.22s/it] {'loss': 0.1069, 'grad_norm': 0.6794944671157042, 'learning_rate': 8.248411503515641e-07, 'epoch': 0.82} 82%|████████▏ | 28112/34278 [30:55:52<5:30:29, 3.22s/it] 82%|████████▏ | 28113/34278 [30:55:57<6:16:58, 3.67s/it] {'loss': 0.1256, 'grad_norm': 1.510744603051289, 'learning_rate': 8.245812348993793e-07, 'epoch': 0.82} 82%|████████▏ | 28113/34278 [30:55:57<6:16:58, 3.67s/it] 82%|████████▏ | 28114/34278 [30:56:02<7:06:17, 4.15s/it] {'loss': 0.0988, 'grad_norm': 0.9243085518108931, 'learning_rate': 8.243213567243357e-07, 'epoch': 0.82} 82%|████████▏ | 28114/34278 [30:56:02<7:06:17, 4.15s/it] 82%|████████▏ | 28115/34278 [30:56:05<6:38:24, 3.88s/it] {'loss': 0.1006, 'grad_norm': 0.7633260998978164, 'learning_rate': 8.240615158287524e-07, 'epoch': 0.82} 82%|████████▏ | 28115/34278 [30:56:05<6:38:24, 3.88s/it] 82%|████████▏ | 28116/34278 [30:56:09<6:22:01, 3.72s/it] {'loss': 0.1101, 'grad_norm': 0.9381278677959661, 'learning_rate': 8.238017122149533e-07, 'epoch': 0.82} 82%|████████▏ | 28116/34278 [30:56:09<6:22:01, 3.72s/it] 82%|████████▏ | 28117/34278 [30:56:13<6:41:53, 3.91s/it] {'loss': 0.0933, 'grad_norm': 0.9691918239872492, 'learning_rate': 8.235419458852556e-07, 'epoch': 0.82} 82%|████████▏ | 28117/34278 [30:56:13<6:41:53, 3.91s/it] 82%|████████▏ | 28118/34278 [30:56:19<7:37:22, 4.46s/it] {'loss': 0.1247, 
'grad_norm': 0.9012611915988304, 'learning_rate': 8.232822168419774e-07, 'epoch': 0.82} 82%|████████▏ | 28118/34278 [30:56:19<7:37:22, 4.46s/it] 82%|████████▏ | 28119/34278 [30:56:22<6:53:55, 4.03s/it] {'loss': 0.1091, 'grad_norm': 0.7544283074306304, 'learning_rate': 8.230225250874391e-07, 'epoch': 0.82} 82%|████████▏ | 28119/34278 [30:56:22<6:53:55, 4.03s/it] 82%|████████▏ | 28120/34278 [30:56:25<6:18:39, 3.69s/it] {'loss': 0.111, 'grad_norm': 0.847683240702552, 'learning_rate': 8.227628706239593e-07, 'epoch': 0.82} 82%|████████▏ | 28120/34278 [30:56:25<6:18:39, 3.69s/it] 82%|████████▏ | 28121/34278 [30:56:27<5:51:48, 3.43s/it] {'loss': 0.107, 'grad_norm': 1.009555390596786, 'learning_rate': 8.225032534538535e-07, 'epoch': 0.82} 82%|████████▏ | 28121/34278 [30:56:27<5:51:48, 3.43s/it] 82%|████████▏ | 28122/34278 [30:56:31<5:45:53, 3.37s/it] {'loss': 0.1019, 'grad_norm': 0.9268798893189278, 'learning_rate': 8.22243673579442e-07, 'epoch': 0.82} 82%|████████▏ | 28122/34278 [30:56:31<5:45:53, 3.37s/it] 82%|████████▏ | 28123/34278 [30:56:34<5:55:55, 3.47s/it] {'loss': 0.1083, 'grad_norm': 0.8621761592208615, 'learning_rate': 8.219841310030424e-07, 'epoch': 0.82} 82%|████████▏ | 28123/34278 [30:56:34<5:55:55, 3.47s/it] 82%|████████▏ | 28124/34278 [30:56:39<6:20:56, 3.71s/it] {'loss': 0.1147, 'grad_norm': 1.209421675021601, 'learning_rate': 8.217246257269712e-07, 'epoch': 0.82} 82%|████████▏ | 28124/34278 [30:56:39<6:20:56, 3.71s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa49a76fb00>
Failed to fetch sample 2751685. Exception: cannot identify image file <_io.BytesIO object at 0x7fa49a76fb00>
 82%|████████▏ | 28125/34278 [30:56:42<6:04:45, 3.56s/it] {'loss': 0.0829, 'grad_norm': 0.8386429567325634, 'learning_rate': 8.214651577535442e-07, 'epoch': 0.82} 82%|████████▏ | 28125/34278 [30:56:42<6:04:45, 3.56s/it] 82%|████████▏ | 28126/34278 [30:56:45<5:44:38, 3.36s/it] {'loss': 0.0961, 'grad_norm': 0.915009359660299, 'learning_rate': 8.212057270850798e-07, 'epoch': 0.82} 82%|████████▏ | 28126/34278 [30:56:45<5:44:38, 3.36s/it] 82%|████████▏ | 28127/34278 [30:56:49<6:06:15, 3.57s/it] {'loss': 0.1111, 'grad_norm': 0.8658750313762614, 'learning_rate': 8.209463337238921e-07, 'epoch': 0.82} 82%|████████▏ | 28127/34278 [30:56:49<6:06:15, 3.57s/it] 82%|████████▏ | 28128/34278 [30:56:52<5:44:49, 3.36s/it] {'loss': 0.1069, 'grad_norm': 1.661013780257039, 'learning_rate': 8.206869776722976e-07, 'epoch': 0.82} 82%|████████▏ | 28128/34278 [30:56:52<5:44:49, 3.36s/it] 82%|████████▏ | 28129/34278 [30:56:58<7:06:57, 4.17s/it] {'loss': 0.104, 'grad_norm': 0.9154463517099505, 'learning_rate': 8.204276589326132e-07, 'epoch': 0.82}
82%|████████▏ | 28129/34278 [30:56:58<7:06:57, 4.17s/it] 82%|████████▏ | 28130/34278 [30:57:02<7:02:50, 4.13s/it] {'loss': 0.116, 'grad_norm': 0.805309686255245, 'learning_rate': 8.201683775071534e-07, 'epoch': 0.82} 82%|████████▏ | 28130/34278 [30:57:02<7:02:50, 4.13s/it] 82%|████████▏ | 28131/34278 [30:57:04<6:20:34, 3.71s/it] {'loss': 0.1046, 'grad_norm': 0.9345352578915128, 'learning_rate': 8.199091333982312e-07, 'epoch': 0.82} 82%|████████▏ | 28131/34278 [30:57:04<6:20:34, 3.71s/it] 82%|████████▏ | 28132/34278 [30:57:07<5:57:44, 3.49s/it] {'loss': 0.089, 'grad_norm': 0.8183592785988367, 'learning_rate': 8.196499266081631e-07, 'epoch': 0.82} 82%|████████▏ | 28132/34278 [30:57:07<5:57:44, 3.49s/it] 82%|████████▏ | 28133/34278 [30:57:11<5:59:27, 3.51s/it] {'loss': 0.0994, 'grad_norm': 0.7325455817803519, 'learning_rate': 8.193907571392617e-07, 'epoch': 0.82} 82%|████████▏ | 28133/34278 [30:57:11<5:59:27, 3.51s/it] 82%|████████▏ | 28134/34278 [30:57:14<5:45:44, 3.38s/it] {'loss': 0.1379, 'grad_norm': 0.8562035668368565, 'learning_rate': 8.19131624993843e-07, 'epoch': 0.82} 82%|████████▏ | 28134/34278 [30:57:14<5:45:44, 3.38s/it] 82%|████████▏ | 28135/34278 [30:57:17<5:41:04, 3.33s/it] {'loss': 0.1013, 'grad_norm': 1.034877966516031, 'learning_rate': 8.188725301742178e-07, 'epoch': 0.82} 82%|████████▏ | 28135/34278 [30:57:17<5:41:04, 3.33s/it] 82%|████████▏ | 28136/34278 [30:57:20<5:34:41, 3.27s/it] {'loss': 0.1121, 'grad_norm': 0.7534523804468917, 'learning_rate': 8.186134726827016e-07, 'epoch': 0.82} 82%|████████▏ | 28136/34278 [30:57:20<5:34:41, 3.27s/it] 82%|████████▏ | 28137/34278 [30:57:24<5:38:54, 3.31s/it] {'loss': 0.1151, 'grad_norm': 0.976663392916067, 'learning_rate': 8.183544525216059e-07, 'epoch': 0.82} 82%|████████▏ | 28137/34278 [30:57:24<5:38:54, 3.31s/it] 82%|████████▏ | 28138/34278 [30:57:28<5:56:03, 3.48s/it] {'loss': 0.1111, 'grad_norm': 0.8498206421645776, 'learning_rate': 8.180954696932425e-07, 'epoch': 0.82} 82%|████████▏ | 28138/34278 
[30:57:28<5:56:03, 3.48s/it] 82%|████████▏ | 28139/34278 [30:57:31<5:37:30, 3.30s/it] {'loss': 0.0992, 'grad_norm': 0.8489290048667434, 'learning_rate': 8.178365241999247e-07, 'epoch': 0.82} 82%|████████▏ | 28139/34278 [30:57:31<5:37:30, 3.30s/it] 82%|████████▏ | 28140/34278 [30:57:35<6:00:40, 3.53s/it] {'loss': 0.1064, 'grad_norm': 0.7877822691146962, 'learning_rate': 8.175776160439646e-07, 'epoch': 0.82} 82%|████████▏ | 28140/34278 [30:57:35<6:00:40, 3.53s/it] 82%|████████▏ | 28141/34278 [30:57:38<5:42:38, 3.35s/it] {'loss': 0.1234, 'grad_norm': 0.8335305962650439, 'learning_rate': 8.173187452276738e-07, 'epoch': 0.82} 82%|████████▏ | 28141/34278 [30:57:38<5:42:38, 3.35s/it] 82%|████████▏ | 28142/34278 [30:57:40<5:26:04, 3.19s/it] {'loss': 0.1236, 'grad_norm': 1.4482998243119232, 'learning_rate': 8.170599117533612e-07, 'epoch': 0.82} 82%|████████▏ | 28142/34278 [30:57:40<5:26:04, 3.19s/it] 82%|████████▏ | 28143/34278 [30:57:44<5:25:31, 3.18s/it] {'loss': 0.0965, 'grad_norm': 0.8594430263994742, 'learning_rate': 8.168011156233402e-07, 'epoch': 0.82} 82%|████████▏ | 28143/34278 [30:57:44<5:25:31, 3.18s/it] 82%|████████▏ | 28144/34278 [30:57:47<5:31:28, 3.24s/it] {'loss': 0.1167, 'grad_norm': 0.8527821907224292, 'learning_rate': 8.165423568399206e-07, 'epoch': 0.82} 82%|████████▏ | 28144/34278 [30:57:47<5:31:28, 3.24s/it] 82%|████████▏ | 28145/34278 [30:57:50<5:35:53, 3.29s/it] {'loss': 0.0954, 'grad_norm': 0.9662625297186198, 'learning_rate': 8.162836354054093e-07, 'epoch': 0.82} 82%|████████▏ | 28145/34278 [30:57:50<5:35:53, 3.29s/it] 82%|████████▏ | 28146/34278 [30:57:54<5:35:10, 3.28s/it] {'loss': 0.1277, 'grad_norm': 0.870731752773125, 'learning_rate': 8.160249513221218e-07, 'epoch': 0.82} 82%|████████▏ | 28146/34278 [30:57:54<5:35:10, 3.28s/it] 82%|████████▏ | 28147/34278 [30:57:59<6:44:38, 3.96s/it] {'loss': 0.1056, 'grad_norm': 0.8296927225677133, 'learning_rate': 8.157663045923647e-07, 'epoch': 0.82} 82%|████████▏ | 28147/34278 [30:57:59<6:44:38, 3.96s/it] 
82%|████████▏ | 28148/34278 [30:58:02<6:19:38, 3.72s/it] {'loss': 0.1275, 'grad_norm': 1.1020732115297716, 'learning_rate': 8.15507695218446e-07, 'epoch': 0.82} 82%|████████▏ | 28148/34278 [30:58:02<6:19:38, 3.72s/it] 82%|████████▏ | 28149/34278 [30:58:05<6:03:59, 3.56s/it] {'loss': 0.1423, 'grad_norm': 0.9708489056716605, 'learning_rate': 8.152491232026766e-07, 'epoch': 0.82} 82%|████████▏ | 28149/34278 [30:58:06<6:03:59, 3.56s/it] 82%|████████▏ | 28150/34278 [30:58:09<6:01:53, 3.54s/it] {'loss': 0.1178, 'grad_norm': 0.8520652340420806, 'learning_rate': 8.149905885473641e-07, 'epoch': 0.82} 82%|████████▏ | 28150/34278 [30:58:09<6:01:53, 3.54s/it] 82%|████████▏ | 28151/34278 [30:58:13<6:16:48, 3.69s/it] {'loss': 0.1154, 'grad_norm': 0.9514485146187803, 'learning_rate': 8.147320912548156e-07, 'epoch': 0.82} 82%|████████▏ | 28151/34278 [30:58:13<6:16:48, 3.69s/it] 82%|████████▏ | 28152/34278 [30:58:16<6:03:16, 3.56s/it] {'loss': 0.0939, 'grad_norm': 0.9543060882194034, 'learning_rate': 8.1447363132734e-07, 'epoch': 0.82} 82%|████████▏ | 28152/34278 [30:58:16<6:03:16, 3.56s/it] 82%|████████▏ | 28153/34278 [30:58:20<5:55:53, 3.49s/it] {'loss': 0.1069, 'grad_norm': 0.7780015591522186, 'learning_rate': 8.142152087672456e-07, 'epoch': 0.82} 82%|████████▏ | 28153/34278 [30:58:20<5:55:53, 3.49s/it] 82%|████████▏ | 28154/34278 [30:58:24<6:26:46, 3.79s/it] {'loss': 0.1097, 'grad_norm': 0.8282556396543697, 'learning_rate': 8.139568235768386e-07, 'epoch': 0.82} 82%|████████▏ | 28154/34278 [30:58:24<6:26:46, 3.79s/it] 82%|████████▏ | 28155/34278 [30:58:28<6:22:18, 3.75s/it] {'loss': 0.1249, 'grad_norm': 0.9320926407715043, 'learning_rate': 8.136984757584243e-07, 'epoch': 0.82} 82%|████████▏ | 28155/34278 [30:58:28<6:22:18, 3.75s/it] 82%|████████▏ | 28156/34278 [30:58:31<6:01:16, 3.54s/it] {'loss': 0.1363, 'grad_norm': 0.8407397356659568, 'learning_rate': 8.134401653143126e-07, 'epoch': 0.82} 82%|████████▏ | 28156/34278 [30:58:31<6:01:16, 3.54s/it] 82%|████████▏ | 28157/34278 
[condensed training log: steps 28157-28335 of 34278 (82-83%, epoch 0.82-0.83); elapsed 30:58:34 -> 31:10:22; ETA 5:11:39-8:47:21 at 3.11-5.23 s/it. Duplicate tqdm progress-bar redraws stripped; one metrics dict is emitted per optimizer step.]
Per-step metrics over this span: loss 0.0803-0.1552 (typically ~0.11), grad_norm 0.64-1.34 with a single spike of 3.2182 at step 28306, learning_rate decaying from 8.1318e-07 to 7.6781e-07.
Representative step line:
82%|████████▏ | 28157/34278 [30:58:34<5:45:23, 3.39s/it] {'loss': 0.0888, 'grad_norm': 0.9340377620301075, 'learning_rate': 8.13181892246806e-07, 'epoch': 0.82}
Warning raised at steps 28228, 28254, and 28272:
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[31:10:22<6:09:03, 3.73s/it] 83%|████████▎ | 28336/34278 [31:10:28<7:19:13, 4.44s/it] {'loss': 0.1296, 'grad_norm': 1.0676803527842267, 'learning_rate': 7.675553770021438e-07, 'epoch': 0.83} 83%|████████▎ | 28336/34278 [31:10:28<7:19:13, 4.44s/it] 83%|████████▎ | 28337/34278 [31:10:34<8:05:51, 4.91s/it] {'loss': 0.1345, 'grad_norm': 0.812890709765754, 'learning_rate': 7.673038689601059e-07, 'epoch': 0.83} 83%|████████▎ | 28337/34278 [31:10:34<8:05:51, 4.91s/it] 83%|████████▎ | 28338/34278 [31:10:36<7:04:00, 4.28s/it] {'loss': 0.1196, 'grad_norm': 0.85359666285905, 'learning_rate': 7.670523987065675e-07, 'epoch': 0.83} 83%|████████▎ | 28338/34278 [31:10:37<7:04:00, 4.28s/it] 83%|████████▎ | 28339/34278 [31:10:39<6:23:08, 3.87s/it] {'loss': 0.0988, 'grad_norm': 1.0561929947567026, 'learning_rate': 7.668009662437759e-07, 'epoch': 0.83} 83%|████████▎ | 28339/34278 [31:10:39<6:23:08, 3.87s/it] 83%|████████▎ | 28340/34278 [31:10:42<5:56:08, 3.60s/it] {'loss': 0.1302, 'grad_norm': 0.8477233441117852, 'learning_rate': 7.665495715739745e-07, 'epoch': 0.83} 83%|████████▎ | 28340/34278 [31:10:42<5:56:08, 3.60s/it] 83%|████████▎ | 28341/34278 [31:10:45<5:37:29, 3.41s/it] {'loss': 0.1227, 'grad_norm': 1.1954087286146777, 'learning_rate': 7.662982146994074e-07, 'epoch': 0.83} 83%|████████▎ | 28341/34278 [31:10:45<5:37:29, 3.41s/it] 83%|████████▎ | 28342/34278 [31:10:48<5:16:33, 3.20s/it] {'loss': 0.1163, 'grad_norm': 0.7946048547454742, 'learning_rate': 7.660468956223188e-07, 'epoch': 0.83} 83%|████████▎ | 28342/34278 [31:10:48<5:16:33, 3.20s/it] 83%|████████▎ | 28343/34278 [31:10:51<5:14:53, 3.18s/it] {'loss': 0.1123, 'grad_norm': 0.8484589925491341, 'learning_rate': 7.657956143449535e-07, 'epoch': 0.83} 83%|████████▎ | 28343/34278 [31:10:51<5:14:53, 3.18s/it] 83%|████████▎ | 28344/34278 [31:10:55<5:44:52, 3.49s/it] {'loss': 0.1266, 'grad_norm': 0.9025290487939114, 'learning_rate': 7.655443708695548e-07, 'epoch': 0.83} 83%|████████▎ | 28344/34278 [31:10:55<5:44:52, 3.49s/it] 
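[Editor's note] The `learning_rate` field above decays smoothly by roughly 2.5e-9 per step. The log does not name the scheduler, but a cosine-annealing schedule (a common default for this kind of fine-tuning run, assumed here) produces exactly this shape; `cosine_lr`, `base_lr`, and `min_lr` below are illustrative names, not identifiers from the training code:

```python
import math

def cosine_lr(step: int, total_steps: int, base_lr: float, min_lr: float = 0.0) -> float:
    """Cosine annealing from base_lr down to min_lr over total_steps (hypothetical sketch)."""
    progress = min(step / total_steps, 1.0)
    return min_lr + 0.5 * (base_lr - min_lr) * (1.0 + math.cos(math.pi * progress))

# At the start the schedule returns base_lr; at the end it returns min_lr,
# with the per-step decrement shrinking as progress approaches 1.
```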
83%|████████▎ | 28345/34278 [31:10:58<5:30:05, 3.34s/it] {'loss': 0.1127, 'grad_norm': 1.1840698900904802, 'learning_rate': 7.652931651983636e-07, 'epoch': 0.83} 83%|████████▎ | 28345/34278 [31:10:58<5:30:05, 3.34s/it] 83%|████████▎ | 28346/34278 [31:11:02<5:41:46, 3.46s/it] {'loss': 0.1074, 'grad_norm': 0.7510775826390401, 'learning_rate': 7.650419973336254e-07, 'epoch': 0.83} 83%|████████▎ | 28346/34278 [31:11:02<5:41:46, 3.46s/it] 83%|████████▎ | 28347/34278 [31:11:06<5:55:15, 3.59s/it] {'loss': 0.1289, 'grad_norm': 1.043439873566343, 'learning_rate': 7.647908672775817e-07, 'epoch': 0.83} 83%|████████▎ | 28347/34278 [31:11:06<5:55:15, 3.59s/it] 83%|████████▎ | 28348/34278 [31:11:09<5:40:44, 3.45s/it] {'loss': 0.115, 'grad_norm': 0.8373814056569358, 'learning_rate': 7.645397750324723e-07, 'epoch': 0.83} 83%|████████▎ | 28348/34278 [31:11:09<5:40:44, 3.45s/it] 83%|████████▎ | 28349/34278 [31:11:12<5:29:45, 3.34s/it] {'loss': 0.1226, 'grad_norm': 1.0559775862677445, 'learning_rate': 7.642887206005412e-07, 'epoch': 0.83} 83%|████████▎ | 28349/34278 [31:11:12<5:29:45, 3.34s/it] 83%|████████▎ | 28350/34278 [31:11:15<5:23:49, 3.28s/it] {'loss': 0.0901, 'grad_norm': 0.7475334772010874, 'learning_rate': 7.640377039840302e-07, 'epoch': 0.83} 83%|████████▎ | 28350/34278 [31:11:15<5:23:49, 3.28s/it] 83%|████████▎ | 28351/34278 [31:11:20<5:54:32, 3.59s/it] {'loss': 0.1258, 'grad_norm': 1.031190810352983, 'learning_rate': 7.637867251851794e-07, 'epoch': 0.83} 83%|████████▎ | 28351/34278 [31:11:20<5:54:32, 3.59s/it] 83%|████████▎ | 28352/34278 [31:11:23<5:41:07, 3.45s/it] {'loss': 0.0978, 'grad_norm': 0.7248062253201605, 'learning_rate': 7.635357842062279e-07, 'epoch': 0.83} 83%|████████▎ | 28352/34278 [31:11:23<5:41:07, 3.45s/it] 83%|████████▎ | 28353/34278 [31:11:26<5:19:45, 3.24s/it] {'loss': 0.0878, 'grad_norm': 0.8071471013482424, 'learning_rate': 7.632848810494193e-07, 'epoch': 0.83} 83%|████████▎ | 28353/34278 [31:11:26<5:19:45, 3.24s/it] 83%|████████▎ | 28354/34278 
[31:11:31<6:37:53, 4.03s/it] {'loss': 0.105, 'grad_norm': 0.6077534103469545, 'learning_rate': 7.630340157169902e-07, 'epoch': 0.83} 83%|████████▎ | 28354/34278 [31:11:31<6:37:53, 4.03s/it] 83%|████████▎ | 28355/34278 [31:11:35<6:14:25, 3.79s/it] {'loss': 0.1185, 'grad_norm': 0.7467996251164862, 'learning_rate': 7.627831882111825e-07, 'epoch': 0.83} 83%|████████▎ | 28355/34278 [31:11:35<6:14:25, 3.79s/it] 83%|████████▎ | 28356/34278 [31:11:38<5:52:31, 3.57s/it] {'loss': 0.1043, 'grad_norm': 1.3510960449703084, 'learning_rate': 7.625323985342359e-07, 'epoch': 0.83} 83%|████████▎ | 28356/34278 [31:11:38<5:52:31, 3.57s/it] 83%|████████▎ | 28357/34278 [31:11:41<5:40:04, 3.45s/it] {'loss': 0.0866, 'grad_norm': 0.9039522451332882, 'learning_rate': 7.622816466883887e-07, 'epoch': 0.83} 83%|████████▎ | 28357/34278 [31:11:41<5:40:04, 3.45s/it] 83%|████████▎ | 28358/34278 [31:11:44<5:17:33, 3.22s/it] {'loss': 0.1101, 'grad_norm': 0.9638217489673516, 'learning_rate': 7.620309326758779e-07, 'epoch': 0.83} 83%|████████▎ | 28358/34278 [31:11:44<5:17:33, 3.22s/it] 83%|████████▎ | 28359/34278 [31:11:50<6:42:24, 4.08s/it] {'loss': 0.1231, 'grad_norm': 0.8155457194286022, 'learning_rate': 7.617802564989446e-07, 'epoch': 0.83} 83%|████████▎ | 28359/34278 [31:11:50<6:42:24, 4.08s/it] 83%|████████▎ | 28360/34278 [31:11:53<6:25:54, 3.91s/it] {'loss': 0.1051, 'grad_norm': 0.7700458546478592, 'learning_rate': 7.615296181598242e-07, 'epoch': 0.83} 83%|████████▎ | 28360/34278 [31:11:53<6:25:54, 3.91s/it] 83%|████████▎ | 28361/34278 [31:11:56<6:04:37, 3.70s/it] {'loss': 0.1067, 'grad_norm': 0.8312943947121507, 'learning_rate': 7.612790176607566e-07, 'epoch': 0.83} 83%|████████▎ | 28361/34278 [31:11:56<6:04:37, 3.70s/it] 83%|████████▎ | 28362/34278 [31:12:00<6:02:48, 3.68s/it] {'loss': 0.1432, 'grad_norm': 1.020073930862499, 'learning_rate': 7.61028455003977e-07, 'epoch': 0.83} 83%|████████▎ | 28362/34278 [31:12:00<6:02:48, 3.68s/it] 83%|████████▎ | 28363/34278 [31:12:03<5:38:34, 3.43s/it] 
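[Editor's note] Each training record in this log is printed as a Python dict literal (`{'loss': ..., 'grad_norm': ..., 'learning_rate': ..., 'epoch': ...}`), so the metrics can be recovered from the raw text for plotting. A minimal sketch using only the standard library (`parse_records` is an illustrative name, not part of the training code):

```python
import ast
import re

def parse_records(log_text: str) -> list[dict]:
    """Extract the {'loss': ...} dict literals that tqdm interleaves with progress bars."""
    records = []
    for m in re.finditer(r"\{'loss':.*?\}", log_text):
        records.append(ast.literal_eval(m.group(0)))
    return records
```

Because tqdm rewrites the same line twice per step, downstream analysis may want to deduplicate on step index as well.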
{'loss': 0.0982, 'grad_norm': 0.814031831138474, 'learning_rate': 7.607779301917245e-07, 'epoch': 0.83} 83%|████████▎ | 28363/34278 [31:12:03<5:38:34, 3.43s/it] 83%|████████▎ | 28364/34278 [31:12:06<5:18:59, 3.24s/it] {'loss': 0.0922, 'grad_norm': 1.0610711076104806, 'learning_rate': 7.60527443226235e-07, 'epoch': 0.83} 83%|████████▎ | 28364/34278 [31:12:06<5:18:59, 3.24s/it] 83%|████████▎ | 28365/34278 [31:12:09<5:12:52, 3.17s/it] {'loss': 0.1193, 'grad_norm': 0.9477252653121643, 'learning_rate': 7.602769941097427e-07, 'epoch': 0.83} 83%|████████▎ | 28365/34278 [31:12:09<5:12:52, 3.17s/it] 83%|████████▎ | 28366/34278 [31:12:12<5:07:00, 3.12s/it] {'loss': 0.1206, 'grad_norm': 0.7560137550405617, 'learning_rate': 7.600265828444858e-07, 'epoch': 0.83} 83%|████████▎ | 28366/34278 [31:12:12<5:07:00, 3.12s/it] 83%|████████▎ | 28367/34278 [31:12:15<5:12:44, 3.17s/it] {'loss': 0.1111, 'grad_norm': 0.706287256192517, 'learning_rate': 7.597762094327004e-07, 'epoch': 0.83} 83%|████████▎ | 28367/34278 [31:12:15<5:12:44, 3.17s/it] 83%|████████▎ | 28368/34278 [31:12:18<5:11:26, 3.16s/it] {'loss': 0.1025, 'grad_norm': 1.1547158265989943, 'learning_rate': 7.595258738766192e-07, 'epoch': 0.83} 83%|████████▎ | 28368/34278 [31:12:18<5:11:26, 3.16s/it] 83%|████████▎ | 28369/34278 [31:12:21<5:01:01, 3.06s/it] {'loss': 0.1125, 'grad_norm': 0.6871472421710532, 'learning_rate': 7.592755761784803e-07, 'epoch': 0.83} 83%|████████▎ | 28369/34278 [31:12:21<5:01:01, 3.06s/it] 83%|████████▎ | 28370/34278 [31:12:24<5:06:31, 3.11s/it] {'loss': 0.1152, 'grad_norm': 0.8584336200980527, 'learning_rate': 7.59025316340517e-07, 'epoch': 0.83} 83%|████████▎ | 28370/34278 [31:12:24<5:06:31, 3.11s/it] 83%|████████▎ | 28371/34278 [31:12:27<5:06:49, 3.12s/it] {'loss': 0.118, 'grad_norm': 1.0571173083816197, 'learning_rate': 7.587750943649618e-07, 'epoch': 0.83} 83%|████████▎ | 28371/34278 [31:12:27<5:06:49, 3.12s/it] 83%|████████▎ | 28372/34278 [31:12:31<5:28:12, 3.33s/it] {'loss': 0.1206, 'grad_norm': 
0.8932959152501599, 'learning_rate': 7.585249102540498e-07, 'epoch': 0.83} 83%|████████▎ | 28372/34278 [31:12:31<5:28:12, 3.33s/it] 83%|████████▎ | 28373/34278 [31:12:37<6:45:33, 4.12s/it] {'loss': 0.1036, 'grad_norm': 0.6940551219872615, 'learning_rate': 7.582747640100168e-07, 'epoch': 0.83} 83%|████████▎ | 28373/34278 [31:12:37<6:45:33, 4.12s/it] 83%|████████▎ | 28374/34278 [31:12:40<6:08:32, 3.75s/it] {'loss': 0.1088, 'grad_norm': 0.7990764587298337, 'learning_rate': 7.580246556350934e-07, 'epoch': 0.83} 83%|████████▎ | 28374/34278 [31:12:40<6:08:32, 3.75s/it] 83%|████████▎ | 28375/34278 [31:12:43<5:54:58, 3.61s/it] {'loss': 0.0993, 'grad_norm': 0.7090968228632207, 'learning_rate': 7.577745851315127e-07, 'epoch': 0.83} 83%|████████▎ | 28375/34278 [31:12:43<5:54:58, 3.61s/it] 83%|████████▎ | 28376/34278 [31:12:49<7:10:02, 4.37s/it] {'loss': 0.0948, 'grad_norm': 0.689448746294295, 'learning_rate': 7.575245525015085e-07, 'epoch': 0.83} 83%|████████▎ | 28376/34278 [31:12:49<7:10:02, 4.37s/it] 83%|████████▎ | 28377/34278 [31:12:55<7:57:12, 4.85s/it] {'loss': 0.128, 'grad_norm': 0.8383856371408687, 'learning_rate': 7.572745577473123e-07, 'epoch': 0.83} 83%|████████▎ | 28377/34278 [31:12:55<7:57:12, 4.85s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = 
tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8dbf7d7bf0> Failed to fetch sample 3255719. Exception: cannot identify image file <_io.BytesIO object at 0x7f8dbf7d7bf0> 83%|████████▎ | 28378/34278 [31:12:59<7:18:35, 4.46s/it] {'loss': 0.1143, 'grad_norm': 1.0460968061717753, 'learning_rate': 7.570246008711552e-07, 'epoch': 0.83} 83%|████████▎ | 28378/34278 [31:12:59<7:18:35, 4.46s/it] 83%|████████▎ | 28379/34278 [31:13:02<6:48:32, 4.16s/it] {'loss': 0.1155, 'grad_norm': 0.8701828394876752, 'learning_rate': 7.567746818752692e-07, 'epoch': 0.83} 83%|████████▎ | 28379/34278 [31:13:02<6:48:32, 4.16s/it] 83%|████████▎ | 28380/34278 [31:13:06<6:26:19, 3.93s/it] {'loss': 0.1285, 'grad_norm': 0.7056535521867008, 'learning_rate': 7.565248007618875e-07, 'epoch': 0.83} 83%|████████▎ | 28380/34278 [31:13:06<6:26:19, 3.93s/it] 83%|████████▎ | 28381/34278 [31:13:08<5:49:32, 3.56s/it] {'loss': 0.0988, 'grad_norm': 1.0685582939700338, 'learning_rate': 7.56274957533239e-07, 'epoch': 0.83} 83%|████████▎ | 28381/34278 [31:13:08<5:49:32, 3.56s/it] 83%|████████▎ | 28382/34278 [31:13:11<5:25:12, 3.31s/it] {'loss': 0.1244, 'grad_norm': 0.8933234757039856, 'learning_rate': 7.560251521915534e-07, 'epoch': 0.83} 83%|████████▎ | 28382/34278 [31:13:11<5:25:12, 3.31s/it] 83%|████████▎ | 28383/34278 [31:13:14<5:21:32, 3.27s/it] {'loss': 0.1076, 'grad_norm': 0.7656982208089511, 'learning_rate': 7.557753847390637e-07, 'epoch': 0.83} 83%|████████▎ | 28383/34278 [31:13:14<5:21:32, 3.27s/it] 83%|████████▎ | 
28384/34278 [31:13:17<5:07:12, 3.13s/it] {'loss': 0.1127, 'grad_norm': 0.7442566120946866, 'learning_rate': 7.555256551779966e-07, 'epoch': 0.83} 83%|████████▎ | 28384/34278 [31:13:17<5:07:12, 3.13s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa5ba0f8ea0> Failed to fetch sample 3268272. 
Exception: cannot identify image file <_io.BytesIO object at 0x7fa5ba0f8ea0> 83%|████████▎ | 28385/34278 [31:13:21<5:28:21, 3.34s/it] {'loss': 0.1139, 'grad_norm': 0.9755368554092401, 'learning_rate': 7.552759635105832e-07, 'epoch': 0.83} 83%|████████▎ | 28385/34278 [31:13:21<5:28:21, 3.34s/it] 83%|████████▎ | 28386/34278 [31:13:26<6:21:11, 3.88s/it] {'loss': 0.1158, 'grad_norm': 0.8705401918820534, 'learning_rate': 7.550263097390543e-07, 'epoch': 0.83} 83%|████████▎ | 28386/34278 [31:13:26<6:21:11, 3.88s/it] 83%|████████▎ | 28387/34278 [31:13:30<6:18:41, 3.86s/it] {'loss': 0.1143, 'grad_norm': 0.7506131579224751, 'learning_rate': 7.54776693865637e-07, 'epoch': 0.83} 83%|████████▎ | 28387/34278 [31:13:30<6:18:41, 3.86s/it] 83%|████████▎ | 28388/34278 [31:13:33<5:55:58, 3.63s/it] {'loss': 0.1052, 'grad_norm': 0.8008476601379567, 'learning_rate': 7.545271158925588e-07, 'epoch': 0.83} 83%|████████▎ | 28388/34278 [31:13:33<5:55:58, 3.63s/it] 83%|████████▎ | 28389/34278 [31:13:37<5:54:28, 3.61s/it] {'loss': 0.1243, 'grad_norm': 0.8488696894521165, 'learning_rate': 7.542775758220499e-07, 'epoch': 0.83} 83%|████████▎ | 28389/34278 [31:13:37<5:54:28, 3.61s/it] 83%|████████▎ | 28390/34278 [31:13:40<5:49:01, 3.56s/it] {'loss': 0.1159, 'grad_norm': 0.7289801411330107, 'learning_rate': 7.540280736563366e-07, 'epoch': 0.83} 83%|████████▎ | 28390/34278 [31:13:40<5:49:01, 3.56s/it] 83%|████████▎ | 28391/34278 [31:13:44<5:47:43, 3.54s/it] {'loss': 0.124, 'grad_norm': 0.7703778206123986, 'learning_rate': 7.537786093976479e-07, 'epoch': 0.83} 83%|████████▎ | 28391/34278 [31:13:44<5:47:43, 3.54s/it] 83%|████████▎ | 28392/34278 [31:13:47<5:31:10, 3.38s/it] {'loss': 0.1437, 'grad_norm': 0.9645049119459579, 'learning_rate': 7.535291830482088e-07, 'epoch': 0.83} 83%|████████▎ | 28392/34278 [31:13:47<5:31:10, 3.38s/it] 83%|████████▎ | 28393/34278 [31:13:52<6:46:26, 4.14s/it] {'loss': 0.1335, 'grad_norm': 0.8649477174827838, 'learning_rate': 7.532797946102488e-07, 'epoch': 0.83} 
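[Editor's note] The tracebacks above show `PIL.UnidentifiedImageError` raised inside `dataset.py:__getitem__`, after which training continues with only a "Failed to fetch sample N" message, implying the dataset catches the error and substitutes another sample. A minimal, dependency-free sketch of that recovery pattern, assuming a resample-and-retry policy; `ImageDecodeError` and `RetryingDataset` are hypothetical stand-ins for `PIL.UnidentifiedImageError` and the real dataset class:

```python
import random

class ImageDecodeError(Exception):
    """Stand-in for PIL.UnidentifiedImageError in this dependency-free sketch."""

class RetryingDataset:
    """On a corrupt sample, log the failure and fall back to a random other index."""

    def __init__(self, samples, max_retries: int = 50):
        self.samples = samples
        self.max_retries = max_retries

    def _get_item(self, i):
        blob = self.samples[i]
        if blob is None:  # models an unreadable/corrupt image blob
            raise ImageDecodeError(f"cannot identify image file for sample {i}")
        return blob

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except ImageDecodeError as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randint(0, len(self.samples) - 1)  # resample another index
        raise RuntimeError("too many consecutive corrupt samples")

    def __len__(self):
        return len(self.samples)
```

This keeps long runs alive when a small fraction of the corpus is corrupt, at the cost of silently skewing the sample distribution, so the failure count is worth monitoring.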
83%|████████▎ | 28393/34278 [31:13:52<6:46:26, 4.14s/it] 83%|████████▎ | 28394/34278 [31:13:56<6:14:55, 3.82s/it] {'loss': 0.1023, 'grad_norm': 0.7159959822166435, 'learning_rate': 7.530304440859932e-07, 'epoch': 0.83} 83%|████████▎ | 28394/34278 [31:13:56<6:14:55, 3.82s/it] 83%|████████▎ | 28395/34278 [31:14:00<6:47:07, 4.15s/it] {'loss': 0.1196, 'grad_norm': 0.8603737686192554, 'learning_rate': 7.527811314776667e-07, 'epoch': 0.83} 83%|████████▎ | 28395/34278 [31:14:00<6:47:07, 4.15s/it] 83%|████████▎ | 28396/34278 [31:14:05<7:09:18, 4.38s/it] {'loss': 0.1158, 'grad_norm': 0.9851910441865979, 'learning_rate': 7.525318567874962e-07, 'epoch': 0.83} 83%|████████▎ | 28396/34278 [31:14:05<7:09:18, 4.38s/it] 83%|████████▎ | 28397/34278 [31:14:08<6:26:12, 3.94s/it] {'loss': 0.122, 'grad_norm': 0.8980323382883142, 'learning_rate': 7.522826200177085e-07, 'epoch': 0.83} 83%|████████▎ | 28397/34278 [31:14:08<6:26:12, 3.94s/it] 83%|████████▎ | 28398/34278 [31:14:14<7:26:37, 4.56s/it] {'loss': 0.1099, 'grad_norm': 0.7215558665966592, 'learning_rate': 7.520334211705265e-07, 'epoch': 0.83} 83%|████████▎ | 28398/34278 [31:14:14<7:26:37, 4.56s/it] 83%|████████▎ | 28399/34278 [31:14:17<6:43:07, 4.11s/it] {'loss': 0.0843, 'grad_norm': 0.6656369779731551, 'learning_rate': 7.517842602481773e-07, 'epoch': 0.83} 83%|████████▎ | 28399/34278 [31:14:17<6:43:07, 4.11s/it] 83%|████████▎ | 28400/34278 [31:14:21<6:15:53, 3.84s/it] {'loss': 0.1196, 'grad_norm': 1.0008539603737099, 'learning_rate': 7.515351372528839e-07, 'epoch': 0.83} 83%|████████▎ | 28400/34278 [31:14:21<6:15:53, 3.84s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 
703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f579e669260> Failed to fetch sample 2667588. Exception: cannot identify image file <_io.BytesIO object at 0x7f579e669260> 83%|████████▎ | 28401/34278 [31:14:27<7:20:21, 4.50s/it] {'loss': 0.1255, 'grad_norm': 0.8997140369278976, 'learning_rate': 7.512860521868693e-07, 'epoch': 0.83} 83%|████████▎ | 28401/34278 [31:14:27<7:20:21, 4.50s/it] 83%|████████▎ | 28402/34278 [31:14:29<6:33:44, 4.02s/it] {'loss': 0.1262, 'grad_norm': 0.8186369579234746, 'learning_rate': 7.510370050523591e-07, 'epoch': 0.83} 83%|████████▎ | 28402/34278 [31:14:30<6:33:44, 4.02s/it] 83%|████████▎ | 28403/34278 [31:14:32<6:03:27, 3.71s/it] {'loss': 0.1302, 'grad_norm': 0.9062082953019484, 'learning_rate': 7.507879958515768e-07, 'epoch': 0.83} 83%|████████▎ | 28403/34278 [31:14:33<6:03:27, 3.71s/it] 83%|████████▎ | 28404/34278 [31:14:36<6:04:49, 3.73s/it] {'loss': 0.1106, 'grad_norm': 0.7423631168798381, 'learning_rate': 7.505390245867455e-07, 'epoch': 0.83} 83%|████████▎ | 28404/34278 [31:14:36<6:04:49, 3.73s/it] 83%|████████▎ | 28405/34278 [31:14:39<5:50:33, 3.58s/it] {'loss': 0.1131, 'grad_norm': 
0.8827426244144843, 'learning_rate': 7.502900912600858e-07, 'epoch': 0.83} 83%|████████▎ | 28405/34278 [31:14:39<5:50:33, 3.58s/it] 83%|████████▎ | 28406/34278 [31:14:45<6:52:37, 4.22s/it] {'loss': 0.1241, 'grad_norm': 0.8395354430181127, 'learning_rate': 7.50041195873823e-07, 'epoch': 0.83} 83%|████████▎ | 28406/34278 [31:14:45<6:52:37, 4.22s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 83%|████████▎ | 28407/34278 [31:14:48<6:07:38, 3.76s/it] {'loss': 0.0881, 'grad_norm': 0.946227806949324, 'learning_rate': 7.497923384301775e-07, 'epoch': 0.83} 83%|████████▎ | 28407/34278 [31:14:48<6:07:38, 3.76s/it] 83%|████████▎ | 28408/34278 [31:14:54<7:26:57, 4.57s/it] {'loss': 0.1367, 'grad_norm': 0.8016717106451814, 'learning_rate': 7.495435189313704e-07, 'epoch': 0.83} 83%|████████▎ | 28408/34278 [31:14:54<7:26:57, 4.57s/it] 83%|████████▎ | 28409/34278 [31:14:59<7:21:05, 4.51s/it] {'loss': 0.1323, 'grad_norm': 0.8166243097367427, 'learning_rate': 7.492947373796244e-07, 'epoch': 0.83} 83%|████████▎ | 28409/34278 [31:14:59<7:21:05, 4.51s/it] 83%|████████▎ | 28410/34278 [31:15:02<6:33:20, 4.02s/it] {'loss': 0.1214, 'grad_norm': 1.3640015216564052, 'learning_rate': 7.49045993777161e-07, 'epoch': 0.83} 83%|████████▎ | 28410/34278 [31:15:02<6:33:20, 4.02s/it] 83%|████████▎ | 28411/34278 [31:15:05<6:20:41, 3.89s/it] {'loss': 0.1399, 'grad_norm': 0.8039236960400502, 'learning_rate': 7.487972881262006e-07, 'epoch': 0.83} 83%|████████▎ | 28411/34278 [31:15:05<6:20:41, 3.89s/it] 83%|████████▎ | 28412/34278 [31:15:09<6:24:37, 3.93s/it] {'loss': 0.1247, 'grad_norm': 0.7848311756178701, 'learning_rate': 7.485486204289616e-07, 'epoch': 0.83} 83%|████████▎ | 28412/34278 [31:15:09<6:24:37, 3.93s/it] 83%|████████▎ | 28413/34278 [31:15:13<6:16:29, 3.85s/it] {'loss': 0.1125, 'grad_norm': 0.7767831778621073, 
'learning_rate': 7.48299990687667e-07, 'epoch': 0.83} 83%|████████▎ | 28413/34278 [31:15:13<6:16:29, 3.85s/it] 83%|████████▎ | 28414/34278 [31:15:17<6:29:50, 3.99s/it] {'loss': 0.1005, 'grad_norm': 0.758910254961333, 'learning_rate': 7.480513989045341e-07, 'epoch': 0.83} 83%|████████▎ | 28414/34278 [31:15:17<6:29:50, 3.99s/it] 83%|████████▎ | 28415/34278 [31:15:23<7:31:12, 4.62s/it] {'loss': 0.1061, 'grad_norm': 0.7087555708668387, 'learning_rate': 7.478028450817832e-07, 'epoch': 0.83} 83%|████████▎ | 28415/34278 [31:15:23<7:31:12, 4.62s/it] 83%|████████▎ | 28416/34278 [31:15:26<6:47:34, 4.17s/it] {'loss': 0.1199, 'grad_norm': 1.0047912122330391, 'learning_rate': 7.475543292216347e-07, 'epoch': 0.83} 83%|████████▎ | 28416/34278 [31:15:26<6:47:34, 4.17s/it] 83%|████████▎ | 28417/34278 [31:15:30<6:24:40, 3.94s/it] {'loss': 0.1127, 'grad_norm': 1.527544903022475, 'learning_rate': 7.473058513263054e-07, 'epoch': 0.83} 83%|████████▎ | 28417/34278 [31:15:30<6:24:40, 3.94s/it] 83%|████████▎ | 28418/34278 [31:15:33<6:15:39, 3.85s/it] {'loss': 0.1093, 'grad_norm': 0.9787763908199145, 'learning_rate': 7.470574113980139e-07, 'epoch': 0.83} 83%|████████▎ | 28418/34278 [31:15:33<6:15:39, 3.85s/it] 83%|████████▎ | 28419/34278 [31:15:38<6:34:26, 4.04s/it] {'loss': 0.1215, 'grad_norm': 0.7679047763089842, 'learning_rate': 7.46809009438979e-07, 'epoch': 0.83} 83%|████████▎ | 28419/34278 [31:15:38<6:34:26, 4.04s/it] 83%|████████▎ | 28420/34278 [31:15:44<7:22:29, 4.53s/it] {'loss': 0.1043, 'grad_norm': 0.9601933188428016, 'learning_rate': 7.465606454514174e-07, 'epoch': 0.83} 83%|████████▎ | 28420/34278 [31:15:44<7:22:29, 4.53s/it] 83%|████████▎ | 28421/34278 [31:15:47<6:36:40, 4.06s/it] {'loss': 0.1112, 'grad_norm': 0.7876307216321129, 'learning_rate': 7.463123194375476e-07, 'epoch': 0.83} 83%|████████▎ | 28421/34278 [31:15:47<6:36:40, 4.06s/it] 83%|████████▎ | 28422/34278 [31:15:53<7:32:12, 4.63s/it] {'loss': 0.1209, 'grad_norm': 0.7313640996354795, 'learning_rate': 
7.460640313995854e-07, 'epoch': 0.83} 83%|████████▎ | 28422/34278 [31:15:53<7:32:12, 4.63s/it] 83%|████████▎ | 28423/34278 [31:15:58<8:04:44, 4.97s/it] {'loss': 0.105, 'grad_norm': 0.7396376101369221, 'learning_rate': 7.458157813397487e-07, 'epoch': 0.83} 83%|████████▎ | 28423/34278 [31:15:58<8:04:44, 4.97s/it] 83%|████████▎ | 28424/34278 [31:16:05<8:48:39, 5.42s/it] {'loss': 0.1176, 'grad_norm': 0.8886354246380967, 'learning_rate': 7.455675692602532e-07, 'epoch': 0.83} 83%|████████▎ | 28424/34278 [31:16:05<8:48:39, 5.42s/it] 83%|████████▎ | 28425/34278 [31:16:08<7:40:02, 4.72s/it] {'loss': 0.1045, 'grad_norm': 1.0904938595718519, 'learning_rate': 7.453193951633142e-07, 'epoch': 0.83} 83%|████████▎ | 28425/34278 [31:16:08<7:40:02, 4.72s/it] 83%|████████▎ | 28426/34278 [31:16:11<6:59:36, 4.30s/it] {'loss': 0.1128, 'grad_norm': 0.7608969060337386, 'learning_rate': 7.450712590511472e-07, 'epoch': 0.83} 83%|████████▎ | 28426/34278 [31:16:11<6:59:36, 4.30s/it] 83%|████████▎ | 28427/34278 [31:16:14<6:30:11, 4.00s/it] {'loss': 0.0987, 'grad_norm': 0.7658006455646321, 'learning_rate': 7.448231609259699e-07, 'epoch': 0.83} 83%|████████▎ | 28427/34278 [31:16:14<6:30:11, 4.00s/it] 83%|████████▎ | 28428/34278 [31:16:17<5:59:20, 3.69s/it] {'loss': 0.1226, 'grad_norm': 0.8597849271304524, 'learning_rate': 7.445751007899943e-07, 'epoch': 0.83} 83%|████████▎ | 28428/34278 [31:16:17<5:59:20, 3.69s/it] 83%|████████▎ | 28429/34278 [31:16:23<6:45:13, 4.16s/it] {'loss': 0.1004, 'grad_norm': 0.6598982366190425, 'learning_rate': 7.443270786454376e-07, 'epoch': 0.83} 83%|████████▎ | 28429/34278 [31:16:23<6:45:13, 4.16s/it] 83%|████████▎ | 28430/34278 [31:16:26<6:29:03, 3.99s/it] {'loss': 0.1058, 'grad_norm': 0.9080284371942519, 'learning_rate': 7.440790944945131e-07, 'epoch': 0.83} 83%|████████▎ | 28430/34278 [31:16:26<6:29:03, 3.99s/it] 83%|████████▎ | 28431/34278 [31:16:29<6:01:52, 3.71s/it] {'loss': 0.1247, 'grad_norm': 1.4034088887775558, 'learning_rate': 7.438311483394328e-07, 
'epoch': 0.83} 83%|████████▎ | 28431/34278 [31:16:29<6:01:52, 3.71s/it] 83%|████████▎ | 28432/34278 [31:16:32<5:38:38, 3.48s/it] {'loss': 0.096, 'grad_norm': 0.7383696981856201, 'learning_rate': 7.435832401824122e-07, 'epoch': 0.83} 83%|████████▎ | 28432/34278 [31:16:32<5:38:38, 3.48s/it] 83%|████████▎ | 28433/34278 [31:16:35<5:30:21, 3.39s/it] {'loss': 0.1043, 'grad_norm': 0.7879472713030384, 'learning_rate': 7.433353700256651e-07, 'epoch': 0.83} 83%|████████▎ | 28433/34278 [31:16:35<5:30:21, 3.39s/it] 83%|████████▎ | 28434/34278 [31:16:39<5:24:06, 3.33s/it] {'loss': 0.115, 'grad_norm': 0.784470214110543, 'learning_rate': 7.430875378714042e-07, 'epoch': 0.83} 83%|████████▎ | 28434/34278 [31:16:39<5:24:06, 3.33s/it] 83%|████████▎ | 28435/34278 [31:16:42<5:11:11, 3.20s/it] {'loss': 0.1079, 'grad_norm': 0.7232544424523933, 'learning_rate': 7.428397437218404e-07, 'epoch': 0.83} 83%|████████▎ | 28435/34278 [31:16:42<5:11:11, 3.20s/it] 83%|████████▎ | 28436/34278 [31:16:44<5:04:15, 3.12s/it] {'loss': 0.104, 'grad_norm': 0.7159335182674081, 'learning_rate': 7.425919875791881e-07, 'epoch': 0.83} 83%|████████▎ | 28436/34278 [31:16:44<5:04:15, 3.12s/it] 83%|████████▎ | 28437/34278 [31:16:47<4:54:22, 3.02s/it] {'loss': 0.114, 'grad_norm': 0.7607774667796043, 'learning_rate': 7.423442694456584e-07, 'epoch': 0.83} 83%|████████▎ | 28437/34278 [31:16:47<4:54:22, 3.02s/it] 83%|████████▎ | 28438/34278 [31:16:50<4:49:35, 2.98s/it] {'loss': 0.1102, 'grad_norm': 0.8795922124661933, 'learning_rate': 7.420965893234611e-07, 'epoch': 0.83} 83%|████████▎ | 28438/34278 [31:16:50<4:49:35, 2.98s/it] 83%|████████▎ | 28439/34278 [31:16:55<5:59:28, 3.69s/it] {'loss': 0.121, 'grad_norm': 0.8882395773551321, 'learning_rate': 7.418489472148094e-07, 'epoch': 0.83} 83%|████████▎ | 28439/34278 [31:16:55<5:59:28, 3.69s/it] 83%|████████▎ | 28440/34278 [31:16:59<5:52:35, 3.62s/it] {'loss': 0.1074, 'grad_norm': 0.907455358333316, 'learning_rate': 7.416013431219149e-07, 'epoch': 0.83} 83%|████████▎ | 
28440/34278 [31:16:59<5:52:35, 3.62s/it] 83%|████████▎ | 28441/34278 [31:17:02<5:30:56, 3.40s/it] {'loss': 0.112, 'grad_norm': 0.7848234078597283, 'learning_rate': 7.41353777046987e-07, 'epoch': 0.83} 83%|████████▎ | 28441/34278 [31:17:02<5:30:56, 3.40s/it] 83%|████████▎ | 28442/34278 [31:17:05<5:34:40, 3.44s/it] {'loss': 0.1009, 'grad_norm': 0.9077419057850704, 'learning_rate': 7.411062489922344e-07, 'epoch': 0.83} 83%|████████▎ | 28442/34278 [31:17:05<5:34:40, 3.44s/it] 83%|████████▎ | 28443/34278 [31:17:09<5:32:17, 3.42s/it] {'loss': 0.125, 'grad_norm': 0.863099550723619, 'learning_rate': 7.408587589598704e-07, 'epoch': 0.83} 83%|████████▎ | 28443/34278 [31:17:09<5:32:17, 3.42s/it] 83%|████████▎ | 28444/34278 [31:17:12<5:18:27, 3.28s/it] {'loss': 0.1115, 'grad_norm': 0.8234409787337859, 'learning_rate': 7.406113069521009e-07, 'epoch': 0.83} 83%|████████▎ | 28444/34278 [31:17:12<5:18:27, 3.28s/it] 83%|████████▎ | 28445/34278 [31:17:16<5:42:50, 3.53s/it] {'loss': 0.1137, 'grad_norm': 0.9249257401348212, 'learning_rate': 7.403638929711371e-07, 'epoch': 0.83} 83%|████████▎ | 28445/34278 [31:17:16<5:42:50, 3.53s/it] 83%|████████▎ | 28446/34278 [31:17:19<5:27:59, 3.37s/it] {'loss': 0.1254, 'grad_norm': 0.9947568607836801, 'learning_rate': 7.401165170191887e-07, 'epoch': 0.83} 83%|████████▎ | 28446/34278 [31:17:19<5:27:59, 3.37s/it] 83%|████████▎ | 28447/34278 [31:17:25<6:44:05, 4.16s/it] {'loss': 0.0924, 'grad_norm': 0.7455723298912932, 'learning_rate': 7.39869179098463e-07, 'epoch': 0.83} 83%|████████▎ | 28447/34278 [31:17:25<6:44:05, 4.16s/it] 83%|████████▎ | 28448/34278 [31:17:28<6:15:28, 3.86s/it] {'loss': 0.1023, 'grad_norm': 0.7798383077405091, 'learning_rate': 7.396218792111676e-07, 'epoch': 0.83} 83%|████████▎ | 28448/34278 [31:17:28<6:15:28, 3.86s/it] 83%|████████▎ | 28449/34278 [31:17:31<5:50:11, 3.60s/it] {'loss': 0.1106, 'grad_norm': 0.8745388904463481, 'learning_rate': 7.393746173595106e-07, 'epoch': 0.83} 83%|████████▎ | 28449/34278 [31:17:31<5:50:11, 
3.60s/it] 83%|████████▎ | 28450/34278 [31:17:34<5:34:09, 3.44s/it] {'loss': 0.1324, 'grad_norm': 1.0526718887968298, 'learning_rate': 7.391273935457016e-07, 'epoch': 0.83} 83%|████████▎ | 28450/34278 [31:17:34<5:34:09, 3.44s/it] 83%|████████▎ | 28451/34278 [31:17:37<5:28:17, 3.38s/it] {'loss': 0.1236, 'grad_norm': 0.716904351557592, 'learning_rate': 7.388802077719454e-07, 'epoch': 0.83} 83%|████████▎ | 28451/34278 [31:17:37<5:28:17, 3.38s/it] 83%|████████▎ | 28452/34278 [31:17:40<5:09:49, 3.19s/it] {'loss': 0.1218, 'grad_norm': 0.9323271799953796, 'learning_rate': 7.386330600404484e-07, 'epoch': 0.83} 83%|████████▎ | 28452/34278 [31:17:40<5:09:49, 3.19s/it] 83%|████████▎ | 28453/34278 [31:17:43<5:03:36, 3.13s/it] {'loss': 0.113, 'grad_norm': 1.0813662179731656, 'learning_rate': 7.383859503534197e-07, 'epoch': 0.83} 83%|████████▎ | 28453/34278 [31:17:43<5:03:36, 3.13s/it] 83%|████████▎ | 28454/34278 [31:17:47<5:28:17, 3.38s/it] {'loss': 0.1015, 'grad_norm': 0.787413754439722, 'learning_rate': 7.381388787130639e-07, 'epoch': 0.83} 83%|████████▎ | 28454/34278 [31:17:47<5:28:17, 3.38s/it] 83%|████████▎ | 28455/34278 [31:17:50<5:28:46, 3.39s/it] {'loss': 0.1067, 'grad_norm': 0.7645144026430417, 'learning_rate': 7.378918451215844e-07, 'epoch': 0.83} 83%|████████▎ | 28455/34278 [31:17:50<5:28:46, 3.39s/it] 83%|████████▎ | 28456/34278 [31:17:53<5:17:01, 3.27s/it] {'loss': 0.108, 'grad_norm': 0.9540995963289592, 'learning_rate': 7.376448495811911e-07, 'epoch': 0.83} 83%|████████▎ | 28456/34278 [31:17:53<5:17:01, 3.27s/it] 83%|████████▎ | 28457/34278 [31:17:56<5:10:34, 3.20s/it] {'loss': 0.1079, 'grad_norm': 1.043364353745594, 'learning_rate': 7.373978920940878e-07, 'epoch': 0.83} 83%|████████▎ | 28457/34278 [31:17:56<5:10:34, 3.20s/it] 83%|████████▎ | 28458/34278 [31:18:00<5:11:01, 3.21s/it] {'loss': 0.1326, 'grad_norm': 0.9012451293447941, 'learning_rate': 7.371509726624765e-07, 'epoch': 0.83} 83%|████████▎ | 28458/34278 [31:18:00<5:11:01, 3.21s/it] 83%|████████▎ | 
28459/34278 [31:18:03<5:28:02, 3.38s/it] {'loss': 0.1267, 'grad_norm': 0.8397749882451837, 'learning_rate': 7.369040912885656e-07, 'epoch': 0.83} 83%|████████▎ | 28459/34278 [31:18:03<5:28:02, 3.38s/it] 83%|████████▎ | 28460/34278 [31:18:06<5:16:45, 3.27s/it] {'loss': 0.1121, 'grad_norm': 0.7710792796993645, 'learning_rate': 7.366572479745565e-07, 'epoch': 0.83} 83%|████████▎ | 28460/34278 [31:18:06<5:16:45, 3.27s/it] 83%|████████▎ | 28461/34278 [31:18:11<6:04:13, 3.76s/it] {'loss': 0.1063, 'grad_norm': 1.0625839752446422, 'learning_rate': 7.364104427226532e-07, 'epoch': 0.83} 83%|████████▎ | 28461/34278 [31:18:11<6:04:13, 3.76s/it] 83%|████████▎ | 28462/34278 [31:18:14<5:41:44, 3.53s/it] {'loss': 0.1366, 'grad_norm': 0.8124520740403229, 'learning_rate': 7.361636755350593e-07, 'epoch': 0.83} 83%|████████▎ | 28462/34278 [31:18:14<5:41:44, 3.53s/it] 83%|████████▎ | 28463/34278 [31:18:18<5:34:07, 3.45s/it] {'loss': 0.1123, 'grad_norm': 1.019349418711587, 'learning_rate': 7.359169464139798e-07, 'epoch': 0.83} 83%|████████▎ | 28463/34278 [31:18:18<5:34:07, 3.45s/it] 83%|████████▎ | 28464/34278 [31:18:20<5:12:50, 3.23s/it] {'loss': 0.1103, 'grad_norm': 0.6286083303873796, 'learning_rate': 7.356702553616157e-07, 'epoch': 0.83} 83%|████████▎ | 28464/34278 [31:18:20<5:12:50, 3.23s/it] 83%|████████▎ | 28465/34278 [31:18:24<5:31:03, 3.42s/it] {'loss': 0.1179, 'grad_norm': 0.7864544122825847, 'learning_rate': 7.354236023801687e-07, 'epoch': 0.83} 83%|████████▎ | 28465/34278 [31:18:24<5:31:03, 3.42s/it] 83%|████████▎ | 28466/34278 [31:18:27<5:16:54, 3.27s/it] {'loss': 0.1124, 'grad_norm': 0.8116342155865951, 'learning_rate': 7.351769874718423e-07, 'epoch': 0.83} 83%|████████▎ | 28466/34278 [31:18:27<5:16:54, 3.27s/it] 83%|████████▎ | 28467/34278 [31:18:30<5:21:40, 3.32s/it] {'loss': 0.1167, 'grad_norm': 1.110231707576032, 'learning_rate': 7.349304106388366e-07, 'epoch': 0.83} 83%|████████▎ | 28467/34278 [31:18:31<5:21:40, 3.32s/it] 83%|████████▎ | 28468/34278 [31:18:34<5:15:38, 
3.26s/it] {'loss': 0.1048, 'grad_norm': 0.7610493832294706, 'learning_rate': 7.34683871883356e-07, 'epoch': 0.83} 83%|████████▎ | 28468/34278 [31:18:34<5:15:38, 3.26s/it] 83%|████████▎ | 28469/34278 [31:18:40<6:36:28, 4.10s/it] {'loss': 0.1166, 'grad_norm': 0.769739157904961, 'learning_rate': 7.344373712075976e-07, 'epoch': 0.83} 83%|████████▎ | 28469/34278 [31:18:40<6:36:28, 4.10s/it] 83%|████████▎ | 28470/34278 [31:18:43<6:06:44, 3.79s/it] {'loss': 0.1197, 'grad_norm': 0.9835055826127647, 'learning_rate': 7.341909086137655e-07, 'epoch': 0.83} 83%|████████▎ | 28470/34278 [31:18:43<6:06:44, 3.79s/it] 83%|████████▎ | 28471/34278 [31:18:46<5:45:15, 3.57s/it] {'loss': 0.1031, 'grad_norm': 0.6848325801545342, 'learning_rate': 7.339444841040583e-07, 'epoch': 0.83} 83%|████████▎ | 28471/34278 [31:18:46<5:45:15, 3.57s/it] 83%|████████▎ | 28472/34278 [31:18:52<6:53:52, 4.28s/it] {'loss': 0.1098, 'grad_norm': 0.8539488257851424, 'learning_rate': 7.336980976806757e-07, 'epoch': 0.83} 83%|████████▎ | 28472/34278 [31:18:52<6:53:52, 4.28s/it] 83%|████████▎ | 28473/34278 [31:18:55<6:13:27, 3.86s/it] {'loss': 0.1091, 'grad_norm': 0.9767131282538098, 'learning_rate': 7.334517493458176e-07, 'epoch': 0.83} 83%|████████▎ | 28473/34278 [31:18:55<6:13:27, 3.86s/it] 83%|████████▎ | 28474/34278 [31:18:58<5:50:25, 3.62s/it] {'loss': 0.1029, 'grad_norm': 0.8913909267721453, 'learning_rate': 7.332054391016852e-07, 'epoch': 0.83} 83%|████████▎ | 28474/34278 [31:18:58<5:50:25, 3.62s/it] 83%|████████▎ | 28475/34278 [31:19:04<7:00:34, 4.35s/it] {'loss': 0.1085, 'grad_norm': 0.9429812227510052, 'learning_rate': 7.329591669504748e-07, 'epoch': 0.83} 83%|████████▎ | 28475/34278 [31:19:04<7:00:34, 4.35s/it] 83%|████████▎ | 28476/34278 [31:19:07<6:38:09, 4.12s/it] {'loss': 0.1227, 'grad_norm': 0.7317002307282251, 'learning_rate': 7.327129328943877e-07, 'epoch': 0.83} 83%|████████▎ | 28476/34278 [31:19:07<6:38:09, 4.12s/it] 83%|████████▎ | 28477/34278 [31:19:11<6:15:01, 3.88s/it] {'loss': 0.1042, 
'grad_norm': 0.8530850221167933, 'learning_rate': 7.324667369356209e-07, 'epoch': 0.83} 83%|████████▎ | 28477/34278 [31:19:11<6:15:01, 3.88s/it] 83%|████████▎ | 28478/34278 [31:19:14<5:56:07, 3.68s/it] {'loss': 0.1223, 'grad_norm': 1.2876696314117573, 'learning_rate': 7.322205790763709e-07, 'epoch': 0.83} 83%|████████▎ | 28478/34278 [31:19:14<5:56:07, 3.68s/it] 83%|████████▎ | 28479/34278 [31:19:17<5:44:46, 3.57s/it] {'loss': 0.0964, 'grad_norm': 0.9535450140788281, 'learning_rate': 7.319744593188371e-07, 'epoch': 0.83} 83%|████████▎ | 28479/34278 [31:19:17<5:44:46, 3.57s/it] 83%|████████▎ | 28480/34278 [31:19:20<5:22:28, 3.34s/it] {'loss': 0.105, 'grad_norm': 0.9078480178805935, 'learning_rate': 7.317283776652173e-07, 'epoch': 0.83} 83%|████████▎ | 28480/34278 [31:19:20<5:22:28, 3.34s/it] 83%|████████▎ | 28481/34278 [31:19:26<6:43:15, 4.17s/it] {'loss': 0.1058, 'grad_norm': 1.0122084769082274, 'learning_rate': 7.31482334117708e-07, 'epoch': 0.83} 83%|████████▎ | 28481/34278 [31:19:26<6:43:15, 4.17s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
83%|████████▎ | 28482/34278 [31:19:29<6:04:58, 3.78s/it] {'loss': 0.0953, 'grad_norm': 1.1380085633167951, 'learning_rate': 7.31236328678504e-07, 'epoch': 0.83} 83%|████████▎ | 28482/34278 [31:19:29<6:04:58, 3.78s/it] 83%|████████▎ | 28483/34278 [31:19:32<5:41:57, 3.54s/it] {'loss': 0.1333, 'grad_norm': 1.2484178967668609, 'learning_rate': 7.309903613498037e-07, 'epoch': 0.83} 83%|████████▎ | 28483/34278 [31:19:32<5:41:57, 3.54s/it] 83%|████████▎ | 28484/34278 [31:19:35<5:40:30, 3.53s/it] {'loss': 0.1051, 'grad_norm': 0.9646207965847657, 'learning_rate': 7.307444321338031e-07, 'epoch': 0.83} 83%|████████▎ | 28484/34278 [31:19:35<5:40:30, 3.53s/it] 83%|████████▎ | 28485/34278 [31:19:38<5:23:08, 3.35s/it] {'loss': 0.1078, 'grad_norm': 1.0211502583035859, 'learning_rate': 7.304985410326942e-07, 'epoch': 0.83} 83%|████████▎ | 28485/34278 [31:19:38<5:23:08, 3.35s/it] 83%|████████▎ | 28486/34278 [31:19:41<5:15:23, 3.27s/it] {'loss': 0.0964, 'grad_norm': 0.6957929901096648, 'learning_rate': 7.302526880486782e-07, 'epoch': 0.83} 83%|████████▎ | 28486/34278 [31:19:41<5:15:23, 3.27s/it] 83%|████████▎ | 28487/34278 [31:19:44<5:03:23, 3.14s/it] {'loss': 0.1153, 'grad_norm': 0.7550524268357655, 'learning_rate': 7.300068731839461e-07, 'epoch': 0.83} 83%|████████▎ | 28487/34278 [31:19:44<5:03:23, 3.14s/it] 83%|████████▎ | 28488/34278 [31:19:47<4:57:25, 3.08s/it] {'loss': 0.1045, 'grad_norm': 0.754587778508101, 'learning_rate': 7.297610964406926e-07, 'epoch': 0.83} 83%|████████▎ | 28488/34278 [31:19:47<4:57:25, 3.08s/it] 83%|████████▎ | 28489/34278 [31:19:50<4:58:42, 3.10s/it] {'loss': 0.1108, 'grad_norm': 0.7728547608988162, 'learning_rate': 7.295153578211139e-07, 'epoch': 0.83} 83%|████████▎ | 28489/34278 [31:19:50<4:58:42, 3.10s/it] 83%|████████▎ | 28490/34278 [31:19:54<5:04:37, 3.16s/it] {'loss': 0.1014, 'grad_norm': 0.8260731903976712, 'learning_rate': 7.292696573274022e-07, 'epoch': 0.83} 83%|████████▎ | 28490/34278 [31:19:54<5:04:37,
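The UserWarning above comes from `torch.utils.checkpoint` when every input to a checkpointed block is detached, so backward through that block would return `None` gradients. A minimal sketch of the situation and one common remedy (the `block` function and tensor shapes here are illustrative, not taken from the actual training code):

```python
import torch
from torch.utils.checkpoint import checkpoint

def block(x):
    # Stand-in for a checkpointed transformer block.
    return x * 2

x = torch.randn(4)
# checkpoint(block, x) with x.requires_grad == False triggers:
#   UserWarning: None of the inputs have requires_grad=True. Gradients will be None

# Remedy: make the input require grad (and/or pass use_reentrant=False,
# which handles non-grad inputs more gracefully in recent PyTorch).
x.requires_grad_(True)
y = checkpoint(block, x, use_reentrant=False)
y.sum().backward()  # x.grad is now populated instead of None
```

The warning is usually benign when the block's parameters (rather than its inputs) are what need gradients, but it is worth checking which case applies.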
3.16s/it] 83%|████████▎ | 28491/34278 [31:19:57<5:06:05, 3.17s/it] {'loss': 0.1014, 'grad_norm': 0.776247445470421, 'learning_rate': 7.290239949617506e-07, 'epoch': 0.83} 83%|████████▎ | 28491/34278 [31:19:57<5:06:05, 3.17s/it] 83%|████████▎ | 28492/34278 [31:20:01<5:40:47, 3.53s/it] {'loss': 0.1173, 'grad_norm': 0.960356033116326, 'learning_rate': 7.287783707263535e-07, 'epoch': 0.83} 83%|████████▎ | 28492/34278 [31:20:01<5:40:47, 3.53s/it] 83%|████████▎ | 28493/34278 [31:20:05<5:45:52, 3.59s/it] {'loss': 0.1096, 'grad_norm': 0.6693668087549213, 'learning_rate': 7.285327846234042e-07, 'epoch': 0.83} 83%|████████▎ | 28493/34278 [31:20:05<5:45:52, 3.59s/it] 83%|████████▎ | 28494/34278 [31:20:08<5:28:36, 3.41s/it] {'loss': 0.1076, 'grad_norm': 0.7847598719804603, 'learning_rate': 7.282872366550947e-07, 'epoch': 0.83} 83%|████████▎ | 28494/34278 [31:20:08<5:28:36, 3.41s/it] 83%|████████▎ | 28495/34278 [31:20:12<5:55:53, 3.69s/it] {'loss': 0.1241, 'grad_norm': 0.8124027714812048, 'learning_rate': 7.280417268236157e-07, 'epoch': 0.83} 83%|████████▎ | 28495/34278 [31:20:12<5:55:53, 3.69s/it] 83%|████████▎ | 28496/34278 [31:20:19<7:14:59, 4.51s/it] {'loss': 0.0886, 'grad_norm': 0.8112382677692418, 'learning_rate': 7.277962551311613e-07, 'epoch': 0.83} 83%|████████▎ | 28496/34278 [31:20:19<7:14:59, 4.51s/it] 83%|████████▎ | 28497/34278 [31:20:23<7:15:10, 4.52s/it] {'loss': 0.1072, 'grad_norm': 0.8637708261310176, 'learning_rate': 7.275508215799216e-07, 'epoch': 0.83} 83%|████████▎ | 28497/34278 [31:20:23<7:15:10, 4.52s/it] 83%|████████▎ | 28498/34278 [31:20:26<6:28:24, 4.03s/it] {'loss': 0.0964, 'grad_norm': 0.8126661809103095, 'learning_rate': 7.273054261720891e-07, 'epoch': 0.83} 83%|████████▎ | 28498/34278 [31:20:26<6:28:24, 4.03s/it] 83%|████████▎ | 28499/34278 [31:20:30<6:09:21, 3.83s/it] {'loss': 0.1114, 'grad_norm': 1.1081717731744656, 'learning_rate': 7.270600689098523e-07, 'epoch': 0.83} 83%|████████▎ | 28499/34278 [31:20:30<6:09:21, 3.83s/it] 83%|████████▎ | 
28500/34278 [31:20:33<5:53:03, 3.67s/it] {'loss': 0.1013, 'grad_norm': 1.0169366316178285, 'learning_rate': 7.268147497954048e-07, 'epoch': 0.83} 83%|████████▎ | 28500/34278 [31:20:33<5:53:03, 3.67s/it] 83%|████████▎ | 28501/34278 [31:20:39<7:06:14, 4.43s/it] {'loss': 0.1141, 'grad_norm': 0.8697696658821369, 'learning_rate': 7.265694688309349e-07, 'epoch': 0.83} 83%|████████▎ | 28501/34278 [31:20:39<7:06:14, 4.43s/it] 83%|████████▎ | 28502/34278 [31:20:42<6:35:32, 4.11s/it] {'loss': 0.1095, 'grad_norm': 0.8830334202234175, 'learning_rate': 7.263242260186315e-07, 'epoch': 0.83} 83%|████████▎ | 28502/34278 [31:20:42<6:35:32, 4.11s/it] 83%|████████▎ | 28503/34278 [31:20:45<6:02:16, 3.76s/it] {'loss': 0.1219, 'grad_norm': 0.9855130207278656, 'learning_rate': 7.26079021360685e-07, 'epoch': 0.83} 83%|████████▎ | 28503/34278 [31:20:45<6:02:16, 3.76s/it] 83%|████████▎ | 28504/34278 [31:20:49<5:52:43, 3.67s/it] {'loss': 0.0972, 'grad_norm': 0.6709105986819079, 'learning_rate': 7.258338548592858e-07, 'epoch': 0.83} 83%|████████▎ | 28504/34278 [31:20:49<5:52:43, 3.67s/it] 83%|████████▎ | 28505/34278 [31:20:52<5:33:16, 3.46s/it] {'loss': 0.113, 'grad_norm': 0.7081321681791498, 'learning_rate': 7.255887265166211e-07, 'epoch': 0.83} 83%|████████▎ | 28505/34278 [31:20:52<5:33:16, 3.46s/it] 83%|████████▎ | 28506/34278 [31:20:56<6:05:31, 3.80s/it] {'loss': 0.1133, 'grad_norm': 1.1133234397478888, 'learning_rate': 7.253436363348804e-07, 'epoch': 0.83} 83%|████████▎ | 28506/34278 [31:20:56<6:05:31, 3.80s/it] 83%|████████▎ | 28507/34278 [31:20:59<5:38:59, 3.52s/it] {'loss': 0.1146, 'grad_norm': 1.0341565117238025, 'learning_rate': 7.250985843162517e-07, 'epoch': 0.83} 83%|████████▎ | 28507/34278 [31:20:59<5:38:59, 3.52s/it] 83%|████████▎ | 28508/34278 [31:21:02<5:30:50, 3.44s/it] {'loss': 0.1001, 'grad_norm': 1.0097485894293217, 'learning_rate': 7.248535704629211e-07, 'epoch': 0.83} 83%|████████▎ | 28508/34278 [31:21:02<5:30:50, 3.44s/it] 83%|████████▎ | 28509/34278 [31:21:06<5:32:20, 
3.46s/it] {'loss': 0.1183, 'grad_norm': 0.8673098107296484, 'learning_rate': 7.24608594777077e-07, 'epoch': 0.83} 83%|████████▎ | 28509/34278 [31:21:06<5:32:20, 3.46s/it] 83%|████████▎ | 28510/34278 [31:21:10<5:50:43, 3.65s/it] {'loss': 0.1216, 'grad_norm': 0.8743674749189494, 'learning_rate': 7.24363657260908e-07, 'epoch': 0.83} 83%|████████▎ | 28510/34278 [31:21:10<5:50:43, 3.65s/it] 83%|████████▎ | 28511/34278 [31:21:13<5:30:26, 3.44s/it] {'loss': 0.1215, 'grad_norm': 0.821606492559704, 'learning_rate': 7.241187579165998e-07, 'epoch': 0.83} 83%|████████▎ | 28511/34278 [31:21:13<5:30:26, 3.44s/it] 83%|████████▎ | 28512/34278 [31:21:16<5:20:55, 3.34s/it] {'loss': 0.1017, 'grad_norm': 1.0138135264361696, 'learning_rate': 7.238738967463372e-07, 'epoch': 0.83} 83%|████████▎ | 28512/34278 [31:21:16<5:20:55, 3.34s/it] 83%|████████▎ | 28513/34278 [31:21:22<6:38:18, 4.15s/it] {'loss': 0.1171, 'grad_norm': 0.9482798737416205, 'learning_rate': 7.236290737523089e-07, 'epoch': 0.83} 83%|████████▎ | 28513/34278 [31:21:22<6:38:18, 4.15s/it] 83%|████████▎ | 28514/34278 [31:21:25<6:05:44, 3.81s/it] {'loss': 0.0775, 'grad_norm': 0.8324205475097127, 'learning_rate': 7.233842889366993e-07, 'epoch': 0.83} 83%|████████▎ | 28514/34278 [31:21:25<6:05:44, 3.81s/it] 83%|████████▎ | 28515/34278 [31:21:29<5:57:39, 3.72s/it] {'loss': 0.1208, 'grad_norm': 0.8960529727392906, 'learning_rate': 7.231395423016918e-07, 'epoch': 0.83} 83%|████████▎ | 28515/34278 [31:21:29<5:57:39, 3.72s/it] 83%|████████▎ | 28516/34278 [31:21:32<5:38:07, 3.52s/it] {'loss': 0.1251, 'grad_norm': 0.9782130382983932, 'learning_rate': 7.228948338494757e-07, 'epoch': 0.83} 83%|████████▎ | 28516/34278 [31:21:32<5:38:07, 3.52s/it] 83%|████████▎ | 28517/34278 [31:21:36<5:49:29, 3.64s/it] {'loss': 0.1176, 'grad_norm': 0.9436371391172321, 'learning_rate': 7.226501635822337e-07, 'epoch': 0.83} 83%|████████▎ | 28517/34278 [31:21:36<5:49:29, 3.64s/it] 83%|████████▎ | 28518/34278 [31:21:39<5:37:27, 3.52s/it] {'loss': 0.1165, 
'grad_norm': 1.0025269470149112, 'learning_rate': 7.224055315021484e-07, 'epoch': 0.83} 83%|████████▎ | 28518/34278 [31:21:39<5:37:27, 3.52s/it] 83%|████████▎ | 28519/34278 [31:21:42<5:35:06, 3.49s/it] {'loss': 0.1367, 'grad_norm': 1.2910502736752922, 'learning_rate': 7.221609376114069e-07, 'epoch': 0.83} 83%|████████▎ | 28519/34278 [31:21:42<5:35:06, 3.49s/it] 83%|████████▎ | 28520/34278 [31:21:45<5:20:15, 3.34s/it] {'loss': 0.1186, 'grad_norm': 0.7938050409417399, 'learning_rate': 7.21916381912191e-07, 'epoch': 0.83} 83%|████████▎ | 28520/34278 [31:21:45<5:20:15, 3.34s/it] 83%|████████▎ | 28521/34278 [31:21:51<6:32:12, 4.09s/it] {'loss': 0.1342, 'grad_norm': 0.8765039578690622, 'learning_rate': 7.216718644066834e-07, 'epoch': 0.83} 83%|████████▎ | 28521/34278 [31:21:51<6:32:12, 4.09s/it] 83%|████████▎ | 28522/34278 [31:21:54<6:08:44, 3.84s/it] {'loss': 0.1083, 'grad_norm': 0.7935005075097074, 'learning_rate': 7.214273850970677e-07, 'epoch': 0.83} 83%|████████▎ | 28522/34278 [31:21:54<6:08:44, 3.84s/it] 83%|████████▎ | 28523/34278 [31:21:59<6:32:22, 4.09s/it] {'loss': 0.1123, 'grad_norm': 0.8121681822286977, 'learning_rate': 7.211829439855284e-07, 'epoch': 0.83} 83%|████████▎ | 28523/34278 [31:21:59<6:32:22, 4.09s/it] 83%|████████▎ | 28524/34278 [31:22:02<5:51:28, 3.67s/it] {'loss': 0.1016, 'grad_norm': 0.7256000871820524, 'learning_rate': 7.209385410742465e-07, 'epoch': 0.83} 83%|████████▎ | 28524/34278 [31:22:02<5:51:28, 3.67s/it] 83%|████████▎ | 28525/34278 [31:22:05<5:47:59, 3.63s/it] {'loss': 0.0928, 'grad_norm': 0.918670339112401, 'learning_rate': 7.206941763654024e-07, 'epoch': 0.83} 83%|████████▎ | 28525/34278 [31:22:05<5:47:59, 3.63s/it] 83%|████████▎ | 28526/34278 [31:22:08<5:32:31, 3.47s/it] {'loss': 0.1038, 'grad_norm': 0.7244585269610889, 'learning_rate': 7.204498498611806e-07, 'epoch': 0.83} 83%|████████▎ | 28526/34278 [31:22:08<5:32:31, 3.47s/it] 83%|████████▎ | 28527/34278 [31:22:11<5:16:10, 3.30s/it] {'loss': 0.1294, 'grad_norm': 
1.2410483979623603, 'learning_rate': 7.202055615637594e-07, 'epoch': 0.83} 83%|████████▎ | 28527/34278 [31:22:11<5:16:10, 3.30s/it] 83%|████████▎ | 28528/34278 [31:22:15<5:20:07, 3.34s/it] {'loss': 0.1168, 'grad_norm': 1.0056864448028837, 'learning_rate': 7.199613114753228e-07, 'epoch': 0.83} 83%|████████▎ | 28528/34278 [31:22:15<5:20:07, 3.34s/it] 83%|████████▎ | 28529/34278 [31:22:18<5:12:10, 3.26s/it] {'loss': 0.1154, 'grad_norm': 0.914242684557372, 'learning_rate': 7.197170995980485e-07, 'epoch': 0.83} 83%|████████▎ | 28529/34278 [31:22:18<5:12:10, 3.26s/it] 83%|████████▎ | 28530/34278 [31:22:21<5:13:43, 3.27s/it] {'loss': 0.1113, 'grad_norm': 0.7913292705219948, 'learning_rate': 7.194729259341194e-07, 'epoch': 0.83} 83%|████████▎ | 28530/34278 [31:22:21<5:13:43, 3.27s/it] 83%|████████▎ | 28531/34278 [31:22:24<5:09:56, 3.24s/it] {'loss': 0.0951, 'grad_norm': 1.0144165012259367, 'learning_rate': 7.192287904857138e-07, 'epoch': 0.83} 83%|████████▎ | 28531/34278 [31:22:24<5:09:56, 3.24s/it] 83%|████████▎ | 28532/34278 [31:22:28<5:11:25, 3.25s/it] {'loss': 0.0979, 'grad_norm': 0.6879136500285986, 'learning_rate': 7.18984693255011e-07, 'epoch': 0.83} 83%|████████▎ | 28532/34278 [31:22:28<5:11:25, 3.25s/it] 83%|████████▎ | 28533/34278 [31:22:31<5:21:29, 3.36s/it] {'loss': 0.0871, 'grad_norm': 0.7006082843911513, 'learning_rate': 7.187406342441905e-07, 'epoch': 0.83} 83%|████████▎ | 28533/34278 [31:22:31<5:21:29, 3.36s/it] 83%|████████▎ | 28534/34278 [31:22:35<5:35:07, 3.50s/it] {'loss': 0.1152, 'grad_norm': 0.815068687722015, 'learning_rate': 7.184966134554333e-07, 'epoch': 0.83} 83%|████████▎ | 28534/34278 [31:22:35<5:35:07, 3.50s/it] 83%|████████▎ | 28535/34278 [31:22:39<5:50:42, 3.66s/it] {'loss': 0.1244, 'grad_norm': 0.8554049222639807, 'learning_rate': 7.182526308909149e-07, 'epoch': 0.83} 83%|████████▎ | 28535/34278 [31:22:39<5:50:42, 3.66s/it] 83%|████████▎ | 28536/34278 [31:22:44<6:42:10, 4.20s/it] {'loss': 0.1048, 'grad_norm': 0.7513868656783036, 
'learning_rate': 7.180086865528157e-07, 'epoch': 0.83} 83%|████████▎ | 28536/34278 [31:22:44<6:42:10, 4.20s/it] 83%|████████▎ | 28537/34278 [31:22:48<6:15:00, 3.92s/it] {'loss': 0.1205, 'grad_norm': 0.6375296876128865, 'learning_rate': 7.17764780443313e-07, 'epoch': 0.83} 83%|████████▎ | 28537/34278 [31:22:48<6:15:00, 3.92s/it] 83%|████████▎ | 28538/34278 [31:22:51<5:46:08, 3.62s/it] {'loss': 0.0938, 'grad_norm': 0.8367052356876804, 'learning_rate': 7.175209125645827e-07, 'epoch': 0.83} 83%|████████▎ | 28538/34278 [31:22:51<5:46:08, 3.62s/it] 83%|████████▎ | 28539/34278 [31:22:56<6:30:30, 4.08s/it] {'loss': 0.0966, 'grad_norm': 0.7349705041397777, 'learning_rate': 7.172770829188036e-07, 'epoch': 0.83} 83%|████████▎ | 28539/34278 [31:22:56<6:30:30, 4.08s/it] 83%|████████▎ | 28540/34278 [31:22:59<6:02:25, 3.79s/it] {'loss': 0.1151, 'grad_norm': 1.0130061093738332, 'learning_rate': 7.170332915081535e-07, 'epoch': 0.83} 83%|████████▎ | 28540/34278 [31:22:59<6:02:25, 3.79s/it] 83%|████████▎ | 28541/34278 [31:23:02<5:39:49, 3.55s/it] {'loss': 0.1209, 'grad_norm': 1.0458129887242262, 'learning_rate': 7.167895383348078e-07, 'epoch': 0.83} 83%|████████▎ | 28541/34278 [31:23:02<5:39:49, 3.55s/it] 83%|████████▎ | 28542/34278 [31:23:05<5:17:49, 3.32s/it] {'loss': 0.1053, 'grad_norm': 0.9933350286807865, 'learning_rate': 7.165458234009415e-07, 'epoch': 0.83} 83%|████████▎ | 28542/34278 [31:23:05<5:17:49, 3.32s/it] 83%|████████▎ | 28543/34278 [31:23:08<5:06:03, 3.20s/it] {'loss': 0.0778, 'grad_norm': 0.7640996831201017, 'learning_rate': 7.163021467087322e-07, 'epoch': 0.83} 83%|████████▎ | 28543/34278 [31:23:08<5:06:03, 3.20s/it] 83%|████████▎ | 28544/34278 [31:23:10<4:51:30, 3.05s/it] {'loss': 0.1144, 'grad_norm': 0.9615357386913754, 'learning_rate': 7.160585082603549e-07, 'epoch': 0.83} 83%|████████▎ | 28544/34278 [31:23:10<4:51:30, 3.05s/it] 83%|████████▎ | 28545/34278 [31:23:13<4:47:27, 3.01s/it] {'loss': 0.1051, 'grad_norm': 0.7987610593492653, 'learning_rate': 
7.15814908057983e-07, 'epoch': 0.83} 83%|████████▎ | 28545/34278 [31:23:13<4:47:27, 3.01s/it] 83%|████████▎ | 28546/34278 [31:23:16<4:52:27, 3.06s/it] {'loss': 0.0999, 'grad_norm': 1.015639203436034, 'learning_rate': 7.155713461037944e-07, 'epoch': 0.83} 83%|████████▎ | 28546/34278 [31:23:16<4:52:27, 3.06s/it] 83%|████████▎ | 28547/34278 [31:23:19<4:45:07, 2.99s/it] {'loss': 0.1014, 'grad_norm': 0.7743864515395077, 'learning_rate': 7.153278223999622e-07, 'epoch': 0.83} 83%|████████▎ | 28547/34278 [31:23:19<4:45:07, 2.99s/it] 83%|████████▎ | 28548/34278 [31:23:23<5:00:35, 3.15s/it] {'loss': 0.1058, 'grad_norm': 0.8477173807154128, 'learning_rate': 7.150843369486593e-07, 'epoch': 0.83} 83%|████████▎ | 28548/34278 [31:23:23<5:00:35, 3.15s/it] 83%|████████▎ | 28549/34278 [31:23:27<5:25:20, 3.41s/it] {'loss': 0.1286, 'grad_norm': 0.8275212261359371, 'learning_rate': 7.14840889752062e-07, 'epoch': 0.83} 83%|████████▎ | 28549/34278 [31:23:27<5:25:20, 3.41s/it] 83%|████████▎ | 28550/34278 [31:23:30<5:33:46, 3.50s/it] {'loss': 0.0743, 'grad_norm': 0.6775216173110621, 'learning_rate': 7.145974808123418e-07, 'epoch': 0.83} 83%|████████▎ | 28550/34278 [31:23:30<5:33:46, 3.50s/it] 83%|████████▎ | 28551/34278 [31:23:34<5:47:06, 3.64s/it] {'loss': 0.1197, 'grad_norm': 0.8947868673066094, 'learning_rate': 7.143541101316715e-07, 'epoch': 0.83} 83%|████████▎ | 28551/34278 [31:23:34<5:47:06, 3.64s/it] 83%|████████▎ | 28552/34278 [31:23:39<6:14:40, 3.93s/it] {'loss': 0.1286, 'grad_norm': 1.1815004802146287, 'learning_rate': 7.141107777122242e-07, 'epoch': 0.83} 83%|████████▎ | 28552/34278 [31:23:39<6:14:40, 3.93s/it] 83%|████████▎ | 28553/34278 [31:23:42<5:48:43, 3.65s/it] {'loss': 0.1211, 'grad_norm': 0.8162810130464536, 'learning_rate': 7.138674835561743e-07, 'epoch': 0.83} 83%|████████▎ | 28553/34278 [31:23:42<5:48:43, 3.65s/it] 83%|████████▎ | 28554/34278 [31:23:48<6:50:47, 4.31s/it] {'loss': 0.1077, 'grad_norm': 0.731954929166307, 'learning_rate': 7.136242276656924e-07, 'epoch': 
0.83} 83%|████████▎ | 28554/34278 [31:23:48<6:50:47, 4.31s/it] 83%|████████▎ | 28555/34278 [31:23:51<6:08:46, 3.87s/it] {'loss': 0.1164, 'grad_norm': 0.6193620227743445, 'learning_rate': 7.133810100429489e-07, 'epoch': 0.83} 83%|████████▎ | 28555/34278 [31:23:51<6:08:46, 3.87s/it] 83%|████████▎ | 28556/34278 [31:23:54<5:43:31, 3.60s/it] {'loss': 0.1186, 'grad_norm': 1.0063182105702355, 'learning_rate': 7.131378306901171e-07, 'epoch': 0.83} 83%|████████▎ | 28556/34278 [31:23:54<5:43:31, 3.60s/it] 83%|████████▎ | 28557/34278 [31:23:57<5:21:02, 3.37s/it] {'loss': 0.1063, 'grad_norm': 1.1640924578124354, 'learning_rate': 7.128946896093669e-07, 'epoch': 0.83} 83%|████████▎ | 28557/34278 [31:23:57<5:21:02, 3.37s/it] 83%|████████▎ | 28558/34278 [31:24:02<6:17:39, 3.96s/it] {'loss': 0.1291, 'grad_norm': 0.7617651428155251, 'learning_rate': 7.126515868028705e-07, 'epoch': 0.83} 83%|████████▎ | 28558/34278 [31:24:02<6:17:39, 3.96s/it] 83%|████████▎ | 28559/34278 [31:24:05<5:57:22, 3.75s/it] {'loss': 0.1187, 'grad_norm': 0.7278757571787543, 'learning_rate': 7.124085222727956e-07, 'epoch': 0.83} 83%|████████▎ | 28559/34278 [31:24:05<5:57:22, 3.75s/it] 83%|████████▎ | 28560/34278 [31:24:10<6:15:56, 3.94s/it] {'loss': 0.1071, 'grad_norm': 0.7965604064954516, 'learning_rate': 7.121654960213159e-07, 'epoch': 0.83} 83%|████████▎ | 28560/34278 [31:24:10<6:15:56, 3.94s/it] 83%|████████▎ | 28561/34278 [31:24:13<6:07:33, 3.86s/it] {'loss': 0.115, 'grad_norm': 0.908660796583881, 'learning_rate': 7.119225080505982e-07, 'epoch': 0.83} 83%|████████▎ | 28561/34278 [31:24:13<6:07:33, 3.86s/it] 83%|████████▎ | 28562/34278 [31:24:16<5:39:39, 3.57s/it] {'loss': 0.0974, 'grad_norm': 0.9092177506653578, 'learning_rate': 7.116795583628122e-07, 'epoch': 0.83} 83%|████████▎ | 28562/34278 [31:24:16<5:39:39, 3.57s/it] 83%|████████▎ | 28563/34278 [31:24:19<5:23:04, 3.39s/it] {'loss': 0.1212, 'grad_norm': 0.8069928837933115, 'learning_rate': 7.114366469601269e-07, 'epoch': 0.83} 83%|████████▎ | 
28563/34278 [31:24:19<5:23:04, 3.39s/it] 83%|████████▎ | 28564/34278 [31:24:22<5:06:14, 3.22s/it] {'loss': 0.0945, 'grad_norm': 0.8121585361316815, 'learning_rate': 7.111937738447127e-07, 'epoch': 0.83} 83%|████████▎ | 28564/34278 [31:24:22<5:06:14, 3.22s/it] 83%|████████▎ | 28565/34278 [31:24:25<5:17:39, 3.34s/it] {'loss': 0.0928, 'grad_norm': 0.7402716149894177, 'learning_rate': 7.109509390187358e-07, 'epoch': 0.83} 83%|████████▎ | 28565/34278 [31:24:25<5:17:39, 3.34s/it] 83%|████████▎ | 28566/34278 [31:24:29<5:09:06, 3.25s/it] {'loss': 0.1142, 'grad_norm': 0.8233873005469028, 'learning_rate': 7.107081424843665e-07, 'epoch': 0.83} 83%|████████▎ | 28566/34278 [31:24:29<5:09:06, 3.25s/it] 83%|████████▎ | 28567/34278 [31:24:32<5:11:28, 3.27s/it] {'loss': 0.1124, 'grad_norm': 0.8461371407414695, 'learning_rate': 7.104653842437703e-07, 'epoch': 0.83} 83%|████████▎ | 28567/34278 [31:24:32<5:11:28, 3.27s/it] 83%|████████▎ | 28568/34278 [31:24:36<5:36:21, 3.53s/it] {'loss': 0.1131, 'grad_norm': 0.8776924496588524, 'learning_rate': 7.10222664299114e-07, 'epoch': 0.83} 83%|████████▎ | 28568/34278 [31:24:36<5:36:21, 3.53s/it] 83%|████████▎ | 28569/34278 [31:24:39<5:28:12, 3.45s/it] {'loss': 0.1015, 'grad_norm': 0.7461887739325114, 'learning_rate': 7.09979982652566e-07, 'epoch': 0.83} 83%|████████▎ | 28569/34278 [31:24:39<5:28:12, 3.45s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff5a76004f0>
Failed to fetch sample 3646666. Exception: cannot identify image file <_io.BytesIO object at 0x7ff5a76004f0>
83%|████████▎ | 28570/34278 [31:24:42<5:11:11, 3.27s/it] {'loss': 0.1304, 'grad_norm': 1.1116031844994403, 'learning_rate': 7.09737339306294e-07, 'epoch': 0.83} 83%|████████▎ | 28570/34278 [31:24:42<5:11:11, 3.27s/it] 83%|████████▎ | 28571/34278 [31:24:45<5:03:04, 3.19s/it] {'loss': 0.089, 'grad_norm': 0.8107043254877853, 'learning_rate': 7.094947342624625e-07, 'epoch': 0.83} 83%|████████▎ | 28571/34278 [31:24:45<5:03:04, 3.19s/it] 83%|████████▎ | 28572/34278 [31:24:49<5:12:12, 3.28s/it] {'loss': 0.1032, 'grad_norm': 0.7352173617171506, 'learning_rate': 7.092521675232367e-07, 'epoch': 0.83} 83%|████████▎ | 28572/34278 [31:24:49<5:12:12, 3.28s/it] 83%|████████▎ | 28573/34278 [31:24:52<5:29:53, 3.47s/it] {'loss': 0.1445, 'grad_norm': 1.0507661539210595, 'learning_rate': 7.090096390907842e-07, 'epoch': 0.83} 83%|████████▎ | 28573/34278 [31:24:52<5:29:53, 3.47s/it] 83%|████████▎ | 28574/34278 [31:24:55<5:11:54, 3.28s/it] {'loss': 0.1124, 'grad_norm': 1.0960354019228602, 'learning_rate': 7.087671489672693e-07, 'epoch': 0.83} 83%|████████▎ | 28574/34278 [31:24:55<5:11:54, 3.28s/it] 83%|████████▎ | 28575/34278 [31:25:00<6:03:18, 3.82s/it] {'loss': 0.1031, 'grad_norm': 0.9175672214080708,
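The traceback above is non-fatal: the loader logs "Failed to fetch sample 3646666" and training continues, which implies `__getitem__` catches decode failures and falls back to another sample. A minimal sketch of that skip-and-resample pattern (class and helper names here are illustrative; the real `dataset.py` is not shown in this log, and the PNG-magic check stands in for `PIL.UnidentifiedImageError`):

```python
class RobustDataset:
    """Dataset that skips samples whose raw bytes cannot be decoded."""

    def __init__(self, samples):
        self.samples = samples  # index -> raw image bytes (possibly corrupt)

    def _get_item(self, i):
        raw = self.samples[i]
        # Stand-in for Image.open() raising UnidentifiedImageError on bad bytes.
        if not raw.startswith(b"\x89PNG"):
            raise ValueError(f"cannot identify image file at index {i}")
        return raw

    def __getitem__(self, i):
        for _ in range(10):  # bounded retries so a bad shard cannot loop forever
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)  # deterministic fallback index
        raise RuntimeError("too many corrupt samples in a row")
```

A recurring `UnidentifiedImageError` on the same sample id is still worth fixing at the data level (truncated upload or wrong key in the object store), since every hit costs a wasted fetch.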
'learning_rate': 7.085246971548549e-07, 'epoch': 0.83} 83%|████████▎ | 28575/34278 [31:25:00<6:03:18, 3.82s/it] 83%|████████▎ | 28576/34278 [31:25:04<5:58:52, 3.78s/it] {'loss': 0.1119, 'grad_norm': 0.8436657160071062, 'learning_rate': 7.082822836557097e-07, 'epoch': 0.83} 83%|████████▎ | 28576/34278 [31:25:04<5:58:52, 3.78s/it] 83%|████████▎ | 28577/34278 [31:25:07<5:48:29, 3.67s/it] {'loss': 0.1209, 'grad_norm': 0.9998008421766619, 'learning_rate': 7.080399084719957e-07, 'epoch': 0.83} 83%|████████▎ | 28577/34278 [31:25:08<5:48:29, 3.67s/it] 83%|████████▎ | 28578/34278 [31:25:13<6:50:18, 4.32s/it] {'loss': 0.1448, 'grad_norm': 0.8883943602055342, 'learning_rate': 7.07797571605876e-07, 'epoch': 0.83} 83%|████████▎ | 28578/34278 [31:25:13<6:50:18, 4.32s/it] 83%|████████▎ | 28579/34278 [31:25:17<6:22:03, 4.02s/it] {'loss': 0.124, 'grad_norm': 0.6888493874528057, 'learning_rate': 7.075552730595159e-07, 'epoch': 0.83} 83%|████████▎ | 28579/34278 [31:25:17<6:22:03, 4.02s/it] 83%|████████▎ | 28580/34278 [31:25:20<6:04:09, 3.83s/it] {'loss': 0.1377, 'grad_norm': 0.7837341038410314, 'learning_rate': 7.073130128350775e-07, 'epoch': 0.83} 83%|████████▎ | 28580/34278 [31:25:20<6:04:09, 3.83s/it] 83%|████████▎ | 28581/34278 [31:25:23<5:33:44, 3.51s/it] {'loss': 0.1233, 'grad_norm': 0.928583318848546, 'learning_rate': 7.07070790934723e-07, 'epoch': 0.83} 83%|████████▎ | 28581/34278 [31:25:23<5:33:44, 3.51s/it] 83%|████████▎ | 28582/34278 [31:25:27<5:49:44, 3.68s/it] {'loss': 0.0833, 'grad_norm': 0.9859291086138892, 'learning_rate': 7.068286073606151e-07, 'epoch': 0.83} 83%|████████▎ | 28582/34278 [31:25:27<5:49:44, 3.68s/it] 83%|████████▎ | 28583/34278 [31:25:31<5:57:06, 3.76s/it] {'loss': 0.1083, 'grad_norm': 0.7622521249094621, 'learning_rate': 7.065864621149182e-07, 'epoch': 0.83} 83%|████████▎ | 28583/34278 [31:25:31<5:57:06, 3.76s/it] 83%|████████▎ | 28584/34278 [31:25:34<5:30:18, 3.48s/it] {'loss': 0.1309, 'grad_norm': 0.8917953768939532, 'learning_rate': 
7.063443551997923e-07, 'epoch': 0.83} 83%|████████▎ | 28584/34278 [31:25:34<5:30:18, 3.48s/it] 83%|████████▎ | 28585/34278 [31:25:37<5:26:52, 3.45s/it] {'loss': 0.1166, 'grad_norm': 0.7811004491707488, 'learning_rate': 7.061022866173978e-07, 'epoch': 0.83} 83%|████████▎ | 28585/34278 [31:25:37<5:26:52, 3.45s/it] 83%|████████▎ | 28586/34278 [31:25:40<5:14:20, 3.31s/it] {'loss': 0.1164, 'grad_norm': 1.0898986487645377, 'learning_rate': 7.058602563698979e-07, 'epoch': 0.83} 83%|████████▎ | 28586/34278 [31:25:40<5:14:20, 3.31s/it] 83%|████████▎ | 28587/34278 [31:25:43<5:04:38, 3.21s/it] {'loss': 0.109, 'grad_norm': 0.9799969989463303, 'learning_rate': 7.056182644594517e-07, 'epoch': 0.83} 83%|████████▎ | 28587/34278 [31:25:43<5:04:38, 3.21s/it] 83%|████████▎ | 28588/34278 [31:25:46<5:01:56, 3.18s/it] {'loss': 0.117, 'grad_norm': 0.8211650187716997, 'learning_rate': 7.053763108882217e-07, 'epoch': 0.83} 83%|████████▎ | 28588/34278 [31:25:46<5:01:56, 3.18s/it] 83%|████████▎ | 28589/34278 [31:25:49<4:59:16, 3.16s/it] {'loss': 0.1026, 'grad_norm': 1.0052000997715311, 'learning_rate': 7.051343956583656e-07, 'epoch': 0.83} 83%|████████▎ | 28589/34278 [31:25:49<4:59:16, 3.16s/it] 83%|████████▎ | 28590/34278 [31:25:54<5:32:01, 3.50s/it] {'loss': 0.1214, 'grad_norm': 1.0879712078598416, 'learning_rate': 7.048925187720451e-07, 'epoch': 0.83} 83%|████████▎ | 28590/34278 [31:25:54<5:32:01, 3.50s/it] 83%|████████▎ | 28591/34278 [31:25:57<5:17:25, 3.35s/it] {'loss': 0.1092, 'grad_norm': 0.975316319835064, 'learning_rate': 7.046506802314196e-07, 'epoch': 0.83} 83%|████████▎ | 28591/34278 [31:25:57<5:17:25, 3.35s/it] 83%|████████▎ | 28592/34278 [31:26:03<6:36:32, 4.18s/it] {'loss': 0.1008, 'grad_norm': 1.0447357396549664, 'learning_rate': 7.044088800386456e-07, 'epoch': 0.83} 83%|████████▎ | 28592/34278 [31:26:03<6:36:32, 4.18s/it] 83%|████████▎ | 28593/34278 [31:26:06<6:16:14, 3.97s/it] {'loss': 0.1031, 'grad_norm': 0.8410676282452567, 'learning_rate': 7.041671181958842e-07, 'epoch': 
0.83} 83%|████████▎ | 28593/34278 [31:26:06<6:16:14, 3.97s/it] 83%|████████▎ | 28594/34278 [31:26:10<6:02:41, 3.83s/it] {'loss': 0.137, 'grad_norm': 0.7683735529820956, 'learning_rate': 7.039253947052943e-07, 'epoch': 0.83} 83%|████████▎ | 28594/34278 [31:26:10<6:02:41, 3.83s/it] 83%|████████▎ | 28595/34278 [31:26:13<5:35:50, 3.55s/it] {'loss': 0.1138, 'grad_norm': 0.792190140443723, 'learning_rate': 7.036837095690314e-07, 'epoch': 0.83} 83%|████████▎ | 28595/34278 [31:26:13<5:35:50, 3.55s/it] 83%|████████▎ | 28596/34278 [31:26:16<5:22:23, 3.40s/it] {'loss': 0.1176, 'grad_norm': 0.8358270075550457, 'learning_rate': 7.034420627892563e-07, 'epoch': 0.83} 83%|████████▎ | 28596/34278 [31:26:16<5:22:23, 3.40s/it] 83%|████████▎ | 28597/34278 [31:26:19<5:08:39, 3.26s/it] {'loss': 0.108, 'grad_norm': 0.8197322268176289, 'learning_rate': 7.032004543681248e-07, 'epoch': 0.83} 83%|████████▎ | 28597/34278 [31:26:19<5:08:39, 3.26s/it] 83%|████████▎ | 28598/34278 [31:26:22<5:15:17, 3.33s/it] {'loss': 0.1358, 'grad_norm': 1.1150473509834575, 'learning_rate': 7.029588843077922e-07, 'epoch': 0.83} 83%|████████▎ | 28598/34278 [31:26:22<5:15:17, 3.33s/it] 83%|████████▎ | 28599/34278 [31:26:25<5:07:06, 3.24s/it] {'loss': 0.1005, 'grad_norm': 0.7481916598519731, 'learning_rate': 7.027173526104175e-07, 'epoch': 0.83} 83%|████████▎ | 28599/34278 [31:26:25<5:07:06, 3.24s/it] 83%|████████▎ | 28600/34278 [31:26:29<5:15:36, 3.34s/it] {'loss': 0.0926, 'grad_norm': 0.6620424226777286, 'learning_rate': 7.024758592781577e-07, 'epoch': 0.83} 83%|████████▎ | 28600/34278 [31:26:29<5:15:36, 3.34s/it] 83%|████████▎ | 28601/34278 [31:26:32<5:03:39, 3.21s/it] {'loss': 0.1049, 'grad_norm': 0.7548615752952624, 'learning_rate': 7.022344043131668e-07, 'epoch': 0.83} 83%|████████▎ | 28601/34278 [31:26:32<5:03:39, 3.21s/it] 83%|████████▎ | 28602/34278 [31:26:35<5:00:58, 3.18s/it] {'loss': 0.1168, 'grad_norm': 0.672330819651534, 'learning_rate': 7.019929877176007e-07, 'epoch': 0.83} 83%|████████▎ | 
28602/34278 [31:26:35<5:00:58, 3.18s/it] 83%|████████▎ | 28603/34278 [31:26:38<4:55:55, 3.13s/it] {'loss': 0.0999, 'grad_norm': 0.8648402334944325, 'learning_rate': 7.017516094936161e-07, 'epoch': 0.83} 83%|████████▎ | 28603/34278 [31:26:38<4:55:55, 3.13s/it] 83%|████████▎ | 28604/34278 [31:26:41<5:10:23, 3.28s/it] {'loss': 0.116, 'grad_norm': 0.8599387318696812, 'learning_rate': 7.015102696433668e-07, 'epoch': 0.83} 83%|████████▎ | 28604/34278 [31:26:41<5:10:23, 3.28s/it] 83%|████████▎ | 28605/34278 [31:26:44<4:59:20, 3.17s/it] {'loss': 0.1092, 'grad_norm': 0.866340663535028, 'learning_rate': 7.01268968169006e-07, 'epoch': 0.83} 83%|████████▎ | 28605/34278 [31:26:44<4:59:20, 3.17s/it] 83%|████████▎ | 28606/34278 [31:26:48<5:05:17, 3.23s/it] {'loss': 0.131, 'grad_norm': 0.9427175962869279, 'learning_rate': 7.010277050726916e-07, 'epoch': 0.83} 83%|████████▎ | 28606/34278 [31:26:48<5:05:17, 3.23s/it] 83%|████████▎ | 28607/34278 [31:26:53<6:08:50, 3.90s/it] {'loss': 0.1418, 'grad_norm': 0.9318603541137698, 'learning_rate': 7.007864803565756e-07, 'epoch': 0.83} 83%|████████▎ | 28607/34278 [31:26:53<6:08:50, 3.90s/it] 83%|████████▎ | 28608/34278 [31:26:56<5:42:16, 3.62s/it] {'loss': 0.1241, 'grad_norm': 0.8435687070765879, 'learning_rate': 7.005452940228103e-07, 'epoch': 0.83} 83%|████████▎ | 28608/34278 [31:26:56<5:42:16, 3.62s/it] 83%|████████▎ | 28609/34278 [31:26:59<5:21:40, 3.40s/it] {'loss': 0.1257, 'grad_norm': 0.9562018194829276, 'learning_rate': 7.003041460735516e-07, 'epoch': 0.83} 83%|████████▎ | 28609/34278 [31:26:59<5:21:40, 3.40s/it] 83%|████████▎ | 28610/34278 [31:27:05<6:37:52, 4.21s/it] {'loss': 0.0974, 'grad_norm': 0.9332906271173597, 'learning_rate': 7.000630365109506e-07, 'epoch': 0.83} 83%|████████▎ | 28610/34278 [31:27:05<6:37:52, 4.21s/it] 83%|████████▎ | 28611/34278 [31:27:08<6:04:39, 3.86s/it] {'loss': 0.1262, 'grad_norm': 0.9706392694115391, 'learning_rate': 6.998219653371597e-07, 'epoch': 0.83} 83%|████████▎ | 28611/34278 [31:27:08<6:04:39, 
3.86s/it] 83%|████████▎ | 28612/34278 [31:27:11<5:34:36, 3.54s/it] {'loss': 0.1283, 'grad_norm': 0.7587883368411257, 'learning_rate': 6.995809325543318e-07, 'epoch': 0.83} 83%|████████▎ | 28612/34278 [31:27:11<5:34:36, 3.54s/it] 83%|████████▎ | 28613/34278 [31:27:15<5:50:22, 3.71s/it] {'loss': 0.1336, 'grad_norm': 1.0124911720343897, 'learning_rate': 6.993399381646198e-07, 'epoch': 0.83} 83%|████████▎ | 28613/34278 [31:27:15<5:50:22, 3.71s/it] 83%|████████▎ | 28614/34278 [31:27:19<5:56:07, 3.77s/it] {'loss': 0.0891, 'grad_norm': 0.990090424597686, 'learning_rate': 6.990989821701738e-07, 'epoch': 0.83} 83%|████████▎ | 28614/34278 [31:27:19<5:56:07, 3.77s/it] 83%|████████▎ | 28615/34278 [31:27:22<5:37:02, 3.57s/it] {'loss': 0.1154, 'grad_norm': 0.8324856108831794, 'learning_rate': 6.988580645731446e-07, 'epoch': 0.83} 83%|████████▎ | 28615/34278 [31:27:22<5:37:02, 3.57s/it] 83%|████████▎ | 28616/34278 [31:27:25<5:17:55, 3.37s/it] {'loss': 0.1084, 'grad_norm': 0.8441041372041478, 'learning_rate': 6.986171853756851e-07, 'epoch': 0.83} 83%|████████▎ | 28616/34278 [31:27:25<5:17:55, 3.37s/it] 83%|████████▎ | 28617/34278 [31:27:28<5:08:41, 3.27s/it] {'loss': 0.1214, 'grad_norm': 0.9744669453409137, 'learning_rate': 6.983763445799429e-07, 'epoch': 0.83} 83%|████████▎ | 28617/34278 [31:27:28<5:08:41, 3.27s/it] 83%|████████▎ | 28618/34278 [31:27:31<5:08:46, 3.27s/it] {'loss': 0.0847, 'grad_norm': 0.8003730853774177, 'learning_rate': 6.981355421880715e-07, 'epoch': 0.83} 83%|████████▎ | 28618/34278 [31:27:31<5:08:46, 3.27s/it] 83%|████████▎ | 28619/34278 [31:27:35<5:13:13, 3.32s/it] {'loss': 0.1117, 'grad_norm': 0.919042214983713, 'learning_rate': 6.978947782022177e-07, 'epoch': 0.83} 83%|████████▎ | 28619/34278 [31:27:35<5:13:13, 3.32s/it] 83%|████████▎ | 28620/34278 [31:27:37<5:00:20, 3.19s/it] {'loss': 0.1355, 'grad_norm': 1.0268314494193647, 'learning_rate': 6.976540526245335e-07, 'epoch': 0.83} 83%|████████▎ | 28620/34278 [31:27:38<5:00:20, 3.19s/it] 83%|████████▎ | 
28621/34278 [31:27:41<5:18:33, 3.38s/it] {'loss': 0.1227, 'grad_norm': 1.0784022321210935, 'learning_rate': 6.974133654571668e-07, 'epoch': 0.83} 83%|████████▎ | 28621/34278 [31:27:41<5:18:33, 3.38s/it] 83%|████████▎ | 28622/34278 [31:27:47<6:36:36, 4.21s/it] {'loss': 0.0935, 'grad_norm': 0.8930931507470183, 'learning_rate': 6.971727167022652e-07, 'epoch': 0.83} 83%|████████▎ | 28622/34278 [31:27:47<6:36:36, 4.21s/it] 84%|████████▎ | 28623/34278 [31:27:51<6:14:27, 3.97s/it] {'loss': 0.1242, 'grad_norm': 0.8160656435612694, 'learning_rate': 6.969321063619788e-07, 'epoch': 0.84} 84%|████████▎ | 28623/34278 [31:27:51<6:14:27, 3.97s/it] 84%|████████▎ | 28624/34278 [31:27:56<6:58:49, 4.44s/it] {'loss': 0.0843, 'grad_norm': 0.6302078749308491, 'learning_rate': 6.966915344384562e-07, 'epoch': 0.84} 84%|████████▎ | 28624/34278 [31:27:56<6:58:49, 4.44s/it] 84%|████████▎ | 28625/34278 [31:27:59<6:14:09, 3.97s/it] {'loss': 0.1219, 'grad_norm': 1.0298737572591399, 'learning_rate': 6.964510009338432e-07, 'epoch': 0.84} 84%|████████▎ | 28625/34278 [31:27:59<6:14:09, 3.97s/it] 84%|████████▎ | 28626/34278 [31:28:03<5:54:28, 3.76s/it] {'loss': 0.1127, 'grad_norm': 0.8240990607958188, 'learning_rate': 6.962105058502894e-07, 'epoch': 0.84} 84%|████████▎ | 28626/34278 [31:28:03<5:54:28, 3.76s/it] 84%|████████▎ | 28627/34278 [31:28:06<5:48:38, 3.70s/it] {'loss': 0.1204, 'grad_norm': 0.7479982485384717, 'learning_rate': 6.959700491899408e-07, 'epoch': 0.84} 84%|████████▎ | 28627/34278 [31:28:06<5:48:38, 3.70s/it] 84%|████████▎ | 28628/34278 [31:28:09<5:23:27, 3.44s/it] {'loss': 0.0914, 'grad_norm': 0.8899750430844123, 'learning_rate': 6.957296309549432e-07, 'epoch': 0.84} 84%|████████▎ | 28628/34278 [31:28:09<5:23:27, 3.44s/it] 84%|████████▎ | 28629/34278 [31:28:13<5:31:01, 3.52s/it] {'loss': 0.114, 'grad_norm': 0.659858287058075, 'learning_rate': 6.954892511474437e-07, 'epoch': 0.84} 84%|████████▎ | 28629/34278 [31:28:13<5:31:01, 3.52s/it] 84%|████████▎ | 28630/34278 [31:28:16<5:21:08, 
3.41s/it] {'loss': 0.1175, 'grad_norm': 0.9106006986190525, 'learning_rate': 6.952489097695897e-07, 'epoch': 0.84} 84%|████████▎ | 28630/34278 [31:28:16<5:21:08, 3.41s/it] 84%|████████▎ | 28631/34278 [31:28:19<5:17:27, 3.37s/it] {'loss': 0.1229, 'grad_norm': 0.9145590676697298, 'learning_rate': 6.950086068235262e-07, 'epoch': 0.84} 84%|████████▎ | 28631/34278 [31:28:19<5:17:27, 3.37s/it] 84%|████████▎ | 28632/34278 [31:28:22<5:05:14, 3.24s/it] {'loss': 0.0883, 'grad_norm': 0.6906119242767612, 'learning_rate': 6.947683423113966e-07, 'epoch': 0.84} 84%|████████▎ | 28632/34278 [31:28:22<5:05:14, 3.24s/it] 84%|████████▎ | 28633/34278 [31:28:26<5:24:00, 3.44s/it] {'loss': 0.1021, 'grad_norm': 0.8851635895069407, 'learning_rate': 6.94528116235349e-07, 'epoch': 0.84} 84%|████████▎ | 28633/34278 [31:28:26<5:24:00, 3.44s/it] 84%|████████▎ | 28634/34278 [31:28:29<5:19:27, 3.40s/it] {'loss': 0.0986, 'grad_norm': 0.8354837908061795, 'learning_rate': 6.942879285975263e-07, 'epoch': 0.84} 84%|████████▎ | 28634/34278 [31:28:29<5:19:27, 3.40s/it] 84%|████████▎ | 28635/34278 [31:28:32<5:01:44, 3.21s/it] {'loss': 0.1011, 'grad_norm': 0.8622551654046755, 'learning_rate': 6.940477794000711e-07, 'epoch': 0.84} 84%|████████▎ | 28635/34278 [31:28:32<5:01:44, 3.21s/it] 84%|████████▎ | 28636/34278 [31:28:36<5:36:13, 3.58s/it] {'loss': 0.1133, 'grad_norm': 1.2199028664512792, 'learning_rate': 6.938076686451312e-07, 'epoch': 0.84} 84%|████████▎ | 28636/34278 [31:28:36<5:36:13, 3.58s/it] 84%|████████▎ | 28637/34278 [31:28:40<5:36:49, 3.58s/it] {'loss': 0.1042, 'grad_norm': 0.7976686518802866, 'learning_rate': 6.935675963348487e-07, 'epoch': 0.84} 84%|████████▎ | 28637/34278 [31:28:40<5:36:49, 3.58s/it] 84%|████████▎ | 28638/34278 [31:28:43<5:21:10, 3.42s/it] {'loss': 0.116, 'grad_norm': 0.8755527093589413, 'learning_rate': 6.933275624713659e-07, 'epoch': 0.84} 84%|████████▎ | 28638/34278 [31:28:43<5:21:10, 3.42s/it] 84%|████████▎ | 28639/34278 [31:28:47<5:28:21, 3.49s/it] {'loss': 0.1204, 
'grad_norm': 0.8567428163946965, 'learning_rate': 6.930875670568271e-07, 'epoch': 0.84} 84%|████████▎ | 28639/34278 [31:28:47<5:28:21, 3.49s/it] 84%|████████▎ | 28640/34278 [31:28:53<6:52:28, 4.39s/it] {'loss': 0.1327, 'grad_norm': 0.7874076456759808, 'learning_rate': 6.92847610093374e-07, 'epoch': 0.84} 84%|████████▎ | 28640/34278 [31:28:53<6:52:28, 4.39s/it] 84%|████████▎ | 28641/34278 [31:28:56<6:16:06, 4.00s/it] {'loss': 0.1299, 'grad_norm': 0.8660864449366423, 'learning_rate': 6.926076915831498e-07, 'epoch': 0.84} 84%|████████▎ | 28641/34278 [31:28:56<6:16:06, 4.00s/it] 84%|████████▎ | 28642/34278 [31:28:59<5:49:43, 3.72s/it] {'loss': 0.1121, 'grad_norm': 0.8542416222610314, 'learning_rate': 6.923678115282945e-07, 'epoch': 0.84} 84%|████████▎ | 28642/34278 [31:28:59<5:49:43, 3.72s/it] 84%|████████▎ | 28643/34278 [31:29:03<5:52:52, 3.76s/it] {'loss': 0.1304, 'grad_norm': 0.9936849820137602, 'learning_rate': 6.921279699309525e-07, 'epoch': 0.84} 84%|████████▎ | 28643/34278 [31:29:03<5:52:52, 3.76s/it] 84%|████████▎ | 28644/34278 [31:29:06<5:36:05, 3.58s/it] {'loss': 0.0936, 'grad_norm': 0.8727545532687586, 'learning_rate': 6.918881667932637e-07, 'epoch': 0.84} 84%|████████▎ | 28644/34278 [31:29:06<5:36:05, 3.58s/it] 84%|████████▎ | 28645/34278 [31:29:11<5:54:55, 3.78s/it] {'loss': 0.1354, 'grad_norm': 0.7728314756224676, 'learning_rate': 6.916484021173681e-07, 'epoch': 0.84} 84%|████████▎ | 28645/34278 [31:29:11<5:54:55, 3.78s/it] 84%|████████▎ | 28646/34278 [31:29:13<5:27:22, 3.49s/it] {'loss': 0.1112, 'grad_norm': 0.8778921020468223, 'learning_rate': 6.914086759054062e-07, 'epoch': 0.84} 84%|████████▎ | 28646/34278 [31:29:13<5:27:22, 3.49s/it] 84%|████████▎ | 28647/34278 [31:29:16<5:12:42, 3.33s/it] {'loss': 0.0941, 'grad_norm': 0.6652813785176614, 'learning_rate': 6.911689881595208e-07, 'epoch': 0.84} 84%|████████▎ | 28647/34278 [31:29:16<5:12:42, 3.33s/it] 84%|████████▎ | 28648/34278 [31:29:21<5:53:53, 3.77s/it] {'loss': 0.1247, 'grad_norm': 
0.7999509435973636, 'learning_rate': 6.9092933888185e-07, 'epoch': 0.84} 84%|████████▎ | 28648/34278 [31:29:21<5:53:53, 3.77s/it] 84%|████████▎ | 28649/34278 [31:29:25<5:46:05, 3.69s/it] {'loss': 0.1163, 'grad_norm': 0.8654250706204483, 'learning_rate': 6.906897280745322e-07, 'epoch': 0.84} 84%|████████▎ | 28649/34278 [31:29:25<5:46:05, 3.69s/it] 84%|████████▎ | 28650/34278 [31:29:27<5:19:32, 3.41s/it] {'loss': 0.0887, 'grad_norm': 0.9017734917226445, 'learning_rate': 6.904501557397092e-07, 'epoch': 0.84} 84%|████████▎ | 28650/34278 [31:29:27<5:19:32, 3.41s/it] 84%|████████▎ | 28651/34278 [31:29:31<5:26:25, 3.48s/it] {'loss': 0.1143, 'grad_norm': 0.7807807142597256, 'learning_rate': 6.902106218795185e-07, 'epoch': 0.84} 84%|████████▎ | 28651/34278 [31:29:31<5:26:25, 3.48s/it] 84%|████████▎ | 28652/34278 [31:29:37<6:38:48, 4.25s/it] {'loss': 0.1182, 'grad_norm': 0.9637840937523848, 'learning_rate': 6.899711264960957e-07, 'epoch': 0.84} 84%|████████▎ | 28652/34278 [31:29:37<6:38:48, 4.25s/it] 84%|████████▎ | 28653/34278 [31:29:40<5:59:27, 3.83s/it] {'loss': 0.0901, 'grad_norm': 0.8176700269805882, 'learning_rate': 6.897316695915846e-07, 'epoch': 0.84} 84%|████████▎ | 28653/34278 [31:29:40<5:59:27, 3.83s/it] 84%|████████▎ | 28654/34278 [31:29:43<5:36:56, 3.59s/it] {'loss': 0.1326, 'grad_norm': 0.9653718107085448, 'learning_rate': 6.894922511681196e-07, 'epoch': 0.84} 84%|████████▎ | 28654/34278 [31:29:43<5:36:56, 3.59s/it] 84%|████████▎ | 28655/34278 [31:29:46<5:32:01, 3.54s/it] {'loss': 0.1109, 'grad_norm': 0.7124793179035528, 'learning_rate': 6.892528712278385e-07, 'epoch': 0.84} 84%|████████▎ | 28655/34278 [31:29:46<5:32:01, 3.54s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0cab9fcb30>
Failed to fetch sample 3720967.
Exception: cannot identify image file <_io.BytesIO object at 0x7f0cab9fcb30>
84%|████████▎ | 28656/34278 [31:29:50<5:41:35, 3.65s/it] {'loss': 0.1338, 'grad_norm': 1.3319571241127004, 'learning_rate': 6.89013529772879e-07, 'epoch': 0.84} 84%|████████▎ | 28656/34278 [31:29:50<5:41:35, 3.65s/it] 84%|████████▎ | 28657/34278 [31:29:54<5:32:40, 3.55s/it] {'loss': 0.1128, 'grad_norm': 0.9256128810957058, 'learning_rate': 6.887742268053782e-07, 'epoch': 0.84} 84%|████████▎ | 28657/34278 [31:29:54<5:32:40, 3.55s/it] 84%|████████▎ | 28658/34278 [31:29:57<5:35:18, 3.58s/it] {'loss': 0.1223, 'grad_norm': 0.7285361643222902, 'learning_rate': 6.885349623274706e-07, 'epoch': 0.84} 84%|████████▎ | 28658/34278 [31:29:57<5:35:18, 3.58s/it] 84%|████████▎ | 28659/34278 [31:30:01<5:50:51, 3.75s/it] {'loss': 0.1158, 'grad_norm': 1.2547232560492405, 'learning_rate': 6.882957363412934e-07, 'epoch': 0.84} 84%|████████▎ | 28659/34278 [31:30:01<5:50:51, 3.75s/it] 84%|████████▎ | 28660/34278 [31:30:06<6:01:34, 3.86s/it] {'loss': 0.1141, 'grad_norm': 0.950298002722007, 'learning_rate': 6.880565488489837e-07, 'epoch': 0.84} 84%|████████▎ | 28660/34278 [31:30:06<6:01:34, 3.86s/it] 84%|████████▎ | 28661/34278 [31:30:09<5:54:02, 3.78s/it] {'loss': 0.1033, 'grad_norm': 0.7211726500922024, 'learning_rate': 6.87817399852676e-07, 'epoch': 0.84} 84%|████████▎ | 28661/34278 [31:30:09<5:54:02, 3.78s/it] 84%|████████▎ | 28662/34278 [31:30:12<5:19:15, 3.41s/it] {'loss': 0.1038, 'grad_norm': 0.9019731638988802, 'learning_rate': 6.875782893545042e-07, 'epoch': 0.84} 84%|████████▎ | 28662/34278 [31:30:12<5:19:15, 3.41s/it] 84%|████████▎ | 28663/34278 [31:30:15<5:11:51, 3.33s/it] {'loss': 0.126, 'grad_norm': 0.9445192804347247, 'learning_rate': 6.873392173566051e-07, 'epoch': 0.84} 84%|████████▎ | 28663/34278 [31:30:15<5:11:51, 3.33s/it] 84%|████████▎ | 28664/34278 [31:30:18<5:03:52, 3.25s/it] {'loss': 0.1162, 'grad_norm': 1.0122184251294524, 'learning_rate': 6.871001838611102e-07, 'epoch': 0.84}
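The traceback above shows why training survives a corrupt sample: `__getitem__` catches the decode failure, logs `Failed to fetch sample N.` / `Exception: …`, and the step counter keeps advancing. A minimal sketch of that skip-and-retry pattern follows; it is hypothetical code, not the actual `aguvis/dataset.py` — the `SafeDataset` name, the PNG magic-number check standing in for `PIL.Image.open`, and the deterministic next-index fallback are all assumptions.

```python
class SafeDataset:
    """Sketch of a __getitem__ that survives undecodable samples by
    falling back to another index, mirroring the "Failed to fetch
    sample N" messages in the log (assumed behaviour, not the real code)."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples          # raw byte strings standing in for stored images
        self.max_retries = max_retries

    def _get_item(self, i):
        raw = self.samples[i]
        # Stand-in for PIL.Image.open(io.BytesIO(raw)): reject bytes
        # that lack a PNG header, as PIL rejects unidentifiable streams.
        if not raw.startswith(b"\x89PNG"):
            raise ValueError(f"cannot identify image file for sample {i}")
        return {"index": i, "image_bytes": raw}

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self._get_item(i)
            except Exception as e:
                print(f"Failed to fetch sample {i}.\nException: {e}")
                i = (i + 1) % len(self.samples)  # deterministic fallback index
        raise RuntimeError("too many consecutive unreadable samples")
```

With this pattern a single bad image costs one extra fetch rather than a crashed run, which matches the log: each traceback is followed immediately by the next training step.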
84%|████████▎ | 28664/34278 [31:30:18<5:03:52, 3.25s/it] 84%|████████▎ | 28665/34278 [31:30:22<5:17:05, 3.39s/it] {'loss': 0.1099, 'grad_norm': 0.777162636737857, 'learning_rate': 6.86861188870156e-07, 'epoch': 0.84} 84%|████████▎ | 28665/34278 [31:30:22<5:17:05, 3.39s/it] 84%|████████▎ | 28666/34278 [31:30:26<5:52:05, 3.76s/it] {'loss': 0.1315, 'grad_norm': 0.8608607181598449, 'learning_rate': 6.866222323858762e-07, 'epoch': 0.84} 84%|████████▎ | 28666/34278 [31:30:26<5:52:05, 3.76s/it] 84%|████████▎ | 28667/34278 [31:30:29<5:24:01, 3.46s/it] {'loss': 0.1152, 'grad_norm': 0.8781240607716954, 'learning_rate': 6.863833144104037e-07, 'epoch': 0.84} 84%|████████▎ | 28667/34278 [31:30:29<5:24:01, 3.46s/it] 84%|████████▎ | 28668/34278 [31:30:35<6:40:47, 4.29s/it] {'loss': 0.0938, 'grad_norm': 0.6626272101503468, 'learning_rate': 6.861444349458702e-07, 'epoch': 0.84} 84%|████████▎ | 28668/34278 [31:30:35<6:40:47, 4.29s/it] 84%|████████▎ | 28669/34278 [31:30:39<6:38:09, 4.26s/it] {'loss': 0.1088, 'grad_norm': 0.7128834338638651, 'learning_rate': 6.859055939944098e-07, 'epoch': 0.84} 84%|████████▎ | 28669/34278 [31:30:39<6:38:09, 4.26s/it] 84%|████████▎ | 28670/34278 [31:30:42<5:59:18, 3.84s/it] {'loss': 0.1087, 'grad_norm': 0.6701056151114528, 'learning_rate': 6.856667915581538e-07, 'epoch': 0.84} 84%|████████▎ | 28670/34278 [31:30:42<5:59:18, 3.84s/it] 84%|████████▎ | 28671/34278 [31:30:45<5:35:59, 3.60s/it] {'loss': 0.1027, 'grad_norm': 0.7778893549588168, 'learning_rate': 6.854280276392361e-07, 'epoch': 0.84} 84%|████████▎ | 28671/34278 [31:30:45<5:35:59, 3.60s/it] 84%|████████▎ | 28672/34278 [31:30:48<5:20:05, 3.43s/it] {'loss': 0.1345, 'grad_norm': 0.9109236468173831, 'learning_rate': 6.851893022397855e-07, 'epoch': 0.84} 84%|████████▎ | 28672/34278 [31:30:48<5:20:05, 3.43s/it] 84%|████████▎ | 28673/34278 [31:30:53<5:50:13, 3.75s/it] {'loss': 0.1168, 'grad_norm': 0.9414579629667669, 'learning_rate': 6.849506153619356e-07, 'epoch': 0.84} 84%|████████▎ | 28673/34278 
[31:30:53<5:50:13, 3.75s/it] 84%|████████▎ | 28674/34278 [31:30:56<5:22:17, 3.45s/it] {'loss': 0.1127, 'grad_norm': 0.9713870553892409, 'learning_rate': 6.847119670078173e-07, 'epoch': 0.84} 84%|████████▎ | 28674/34278 [31:30:56<5:22:17, 3.45s/it] 84%|████████▎ | 28675/34278 [31:30:59<5:11:40, 3.34s/it] {'loss': 0.1246, 'grad_norm': 0.8368938544227067, 'learning_rate': 6.844733571795587e-07, 'epoch': 0.84} 84%|████████▎ | 28675/34278 [31:30:59<5:11:40, 3.34s/it] 84%|████████▎ | 28676/34278 [31:31:05<6:24:52, 4.12s/it] {'loss': 0.1108, 'grad_norm': 0.7175759144395739, 'learning_rate': 6.842347858792919e-07, 'epoch': 0.84} 84%|████████▎ | 28676/34278 [31:31:05<6:24:52, 4.12s/it] 84%|████████▎ | 28677/34278 [31:31:11<7:17:14, 4.68s/it] {'loss': 0.1034, 'grad_norm': 0.9315659849847924, 'learning_rate': 6.839962531091482e-07, 'epoch': 0.84} 84%|████████▎ | 28677/34278 [31:31:11<7:17:14, 4.68s/it] 84%|████████▎ | 28678/34278 [31:31:14<6:43:56, 4.33s/it] {'loss': 0.1021, 'grad_norm': 0.9265283512957927, 'learning_rate': 6.837577588712551e-07, 'epoch': 0.84} 84%|████████▎ | 28678/34278 [31:31:14<6:43:56, 4.33s/it] 84%|████████▎ | 28679/34278 [31:31:17<6:06:43, 3.93s/it] {'loss': 0.1124, 'grad_norm': 0.7840001819232951, 'learning_rate': 6.835193031677418e-07, 'epoch': 0.84} 84%|████████▎ | 28679/34278 [31:31:17<6:06:43, 3.93s/it] 84%|████████▎ | 28680/34278 [31:31:21<5:54:08, 3.80s/it] {'loss': 0.0994, 'grad_norm': 0.7422903185418308, 'learning_rate': 6.832808860007384e-07, 'epoch': 0.84} 84%|████████▎ | 28680/34278 [31:31:21<5:54:08, 3.80s/it] 84%|████████▎ | 28681/34278 [31:31:24<5:29:26, 3.53s/it] {'loss': 0.1197, 'grad_norm': 0.8006926436386917, 'learning_rate': 6.830425073723728e-07, 'epoch': 0.84} 84%|████████▎ | 28681/34278 [31:31:24<5:29:26, 3.53s/it] 84%|████████▎ | 28682/34278 [31:31:28<5:52:27, 3.78s/it] {'loss': 0.0911, 'grad_norm': 0.7505425422294162, 'learning_rate': 6.828041672847707e-07, 'epoch': 0.84} 84%|████████▎ | 28682/34278 [31:31:28<5:52:27, 3.78s/it] 
84%|████████▎ | 28683/34278 [31:31:31<5:36:34, 3.61s/it] {'loss': 0.1224, 'grad_norm': 0.7827003981639024, 'learning_rate': 6.825658657400653e-07, 'epoch': 0.84} 84%|████████▎ | 28683/34278 [31:31:31<5:36:34, 3.61s/it] 84%|████████▎ | 28684/34278 [31:31:34<5:21:33, 3.45s/it] {'loss': 0.1044, 'grad_norm': 0.806682899445697, 'learning_rate': 6.823276027403808e-07, 'epoch': 0.84} 84%|████████▎ | 28684/34278 [31:31:34<5:21:33, 3.45s/it] 84%|████████▎ | 28685/34278 [31:31:38<5:18:07, 3.41s/it] {'loss': 0.119, 'grad_norm': 1.0913688915087802, 'learning_rate': 6.820893782878435e-07, 'epoch': 0.84} 84%|████████▎ | 28685/34278 [31:31:38<5:18:07, 3.41s/it] 84%|████████▎ | 28686/34278 [31:31:41<5:15:06, 3.38s/it] {'loss': 0.0914, 'grad_norm': 0.7646187400611799, 'learning_rate': 6.818511923845828e-07, 'epoch': 0.84} 84%|████████▎ | 28686/34278 [31:31:41<5:15:06, 3.38s/it] 84%|████████▎ | 28687/34278 [31:31:44<5:10:56, 3.34s/it] {'loss': 0.1129, 'grad_norm': 0.9484808691312467, 'learning_rate': 6.816130450327235e-07, 'epoch': 0.84} 84%|████████▎ | 28687/34278 [31:31:44<5:10:56, 3.34s/it] 84%|████████▎ | 28688/34278 [31:31:48<5:17:06, 3.40s/it] {'loss': 0.1108, 'grad_norm': 1.0347871767578014, 'learning_rate': 6.813749362343914e-07, 'epoch': 0.84} 84%|████████▎ | 28688/34278 [31:31:48<5:17:06, 3.40s/it] 84%|████████▎ | 28689/34278 [31:31:51<5:02:17, 3.25s/it] {'loss': 0.1118, 'grad_norm': 0.9270575124098639, 'learning_rate': 6.811368659917128e-07, 'epoch': 0.84} 84%|████████▎ | 28689/34278 [31:31:51<5:02:17, 3.25s/it] 84%|████████▎ | 28690/34278 [31:31:54<5:07:53, 3.31s/it] {'loss': 0.132, 'grad_norm': 0.9329141279665811, 'learning_rate': 6.808988343068146e-07, 'epoch': 0.84} 84%|████████▎ | 28690/34278 [31:31:54<5:07:53, 3.31s/it] 84%|████████▎ | 28691/34278 [31:31:57<4:58:33, 3.21s/it] {'loss': 0.1086, 'grad_norm': 1.13431048449298, 'learning_rate': 6.8066084118182e-07, 'epoch': 0.84} 84%|████████▎ | 28691/34278 [31:31:57<4:58:33, 3.21s/it] 84%|████████▎ | 28692/34278 
[31:32:00<4:51:28, 3.13s/it] {'loss': 0.0996, 'grad_norm': 0.9504017432662675, 'learning_rate': 6.804228866188534e-07, 'epoch': 0.84} 84%|████████▎ | 28692/34278 [31:32:00<4:51:28, 3.13s/it] 84%|████████▎ | 28693/34278 [31:32:04<5:10:57, 3.34s/it] {'loss': 0.103, 'grad_norm': 0.7452738299748206, 'learning_rate': 6.801849706200414e-07, 'epoch': 0.84} 84%|████████▎ | 28693/34278 [31:32:04<5:10:57, 3.34s/it] 84%|████████▎ | 28694/34278 [31:32:08<5:26:40, 3.51s/it] {'loss': 0.1085, 'grad_norm': 0.7461686281640708, 'learning_rate': 6.799470931875051e-07, 'epoch': 0.84} 84%|████████▎ | 28694/34278 [31:32:08<5:26:40, 3.51s/it] 84%|████████▎ | 28695/34278 [31:32:11<5:11:34, 3.35s/it] {'loss': 0.1244, 'grad_norm': 0.8425623946505637, 'learning_rate': 6.797092543233719e-07, 'epoch': 0.84} 84%|████████▎ | 28695/34278 [31:32:11<5:11:34, 3.35s/it] 84%|████████▎ | 28696/34278 [31:32:14<5:08:52, 3.32s/it] {'loss': 0.1075, 'grad_norm': 0.9280261705744823, 'learning_rate': 6.794714540297615e-07, 'epoch': 0.84} 84%|████████▎ | 28696/34278 [31:32:14<5:08:52, 3.32s/it] 84%|████████▎ | 28697/34278 [31:32:18<5:29:09, 3.54s/it] {'loss': 0.1131, 'grad_norm': 0.853209451411559, 'learning_rate': 6.792336923087994e-07, 'epoch': 0.84} 84%|████████▎ | 28697/34278 [31:32:18<5:29:09, 3.54s/it] 84%|████████▎ | 28698/34278 [31:32:22<5:39:40, 3.65s/it] {'loss': 0.1373, 'grad_norm': 0.8637432778009454, 'learning_rate': 6.789959691626069e-07, 'epoch': 0.84} 84%|████████▎ | 28698/34278 [31:32:22<5:39:40, 3.65s/it] 84%|████████▎ | 28699/34278 [31:32:25<5:27:15, 3.52s/it] {'loss': 0.101, 'grad_norm': 0.8197077682181532, 'learning_rate': 6.787582845933078e-07, 'epoch': 0.84} 84%|████████▎ | 28699/34278 [31:32:25<5:27:15, 3.52s/it] 84%|████████▎ | 28700/34278 [31:32:28<5:12:11, 3.36s/it] {'loss': 0.0987, 'grad_norm': 1.0369316022870985, 'learning_rate': 6.785206386030219e-07, 'epoch': 0.84} 84%|████████▎ | 28700/34278 [31:32:28<5:12:11, 3.36s/it] 84%|████████▎ | 28701/34278 [31:32:31<4:57:52, 3.20s/it] 
{'loss': 0.1012, 'grad_norm': 0.7420193447575757, 'learning_rate': 6.782830311938731e-07, 'epoch': 0.84} 84%|████████▎ | 28701/34278 [31:32:31<4:57:52, 3.20s/it] 84%|████████▎ | 28702/34278 [31:32:34<4:57:29, 3.20s/it] {'loss': 0.1111, 'grad_norm': 1.0916979403699227, 'learning_rate': 6.78045462367981e-07, 'epoch': 0.84} 84%|████████▎ | 28702/34278 [31:32:34<4:57:29, 3.20s/it] 84%|████████▎ | 28703/34278 [31:32:38<5:08:39, 3.32s/it] {'loss': 0.1054, 'grad_norm': 0.8667817266571434, 'learning_rate': 6.778079321274683e-07, 'epoch': 0.84} 84%|████████▎ | 28703/34278 [31:32:38<5:08:39, 3.32s/it] 84%|████████▎ | 28704/34278 [31:32:41<5:06:08, 3.30s/it] {'loss': 0.0948, 'grad_norm': 0.8127985451597882, 'learning_rate': 6.775704404744543e-07, 'epoch': 0.84} 84%|████████▎ | 28704/34278 [31:32:41<5:06:08, 3.30s/it] 84%|████████▎ | 28705/34278 [31:32:44<4:57:49, 3.21s/it] {'loss': 0.1382, 'grad_norm': 0.945295115772799, 'learning_rate': 6.77332987411059e-07, 'epoch': 0.84} 84%|████████▎ | 28705/34278 [31:32:44<4:57:49, 3.21s/it] 84%|████████▎ | 28706/34278 [31:32:47<5:04:55, 3.28s/it] {'loss': 0.1178, 'grad_norm': 0.9591986280141048, 'learning_rate': 6.770955729394024e-07, 'epoch': 0.84} 84%|████████▎ | 28706/34278 [31:32:47<5:04:55, 3.28s/it] 84%|████████▎ | 28707/34278 [31:32:51<5:06:42, 3.30s/it] {'loss': 0.0897, 'grad_norm': 0.9707874551440682, 'learning_rate': 6.768581970616056e-07, 'epoch': 0.84} 84%|████████▎ | 28707/34278 [31:32:51<5:06:42, 3.30s/it] 84%|████████▍ | 28708/34278 [31:32:54<5:12:41, 3.37s/it] {'loss': 0.1081, 'grad_norm': 0.5864343240786279, 'learning_rate': 6.766208597797874e-07, 'epoch': 0.84} 84%|████████▍ | 28708/34278 [31:32:54<5:12:41, 3.37s/it] 84%|████████▍ | 28709/34278 [31:32:58<5:16:40, 3.41s/it] {'loss': 0.1371, 'grad_norm': 0.929317101844393, 'learning_rate': 6.763835610960645e-07, 'epoch': 0.84} 84%|████████▍ | 28709/34278 [31:32:58<5:16:40, 3.41s/it] 84%|████████▍ | 28710/34278 [31:33:02<5:36:17, 3.62s/it] {'loss': 0.1138, 'grad_norm': 
0.7194146484129015, 'learning_rate': 6.76146301012558e-07, 'epoch': 0.84} 84%|████████▍ | 28710/34278 [31:33:02<5:36:17, 3.62s/it] 84%|████████▍ | 28711/34278 [31:33:08<6:36:22, 4.27s/it] {'loss': 0.1377, 'grad_norm': 0.8593105085498757, 'learning_rate': 6.759090795313856e-07, 'epoch': 0.84} 84%|████████▍ | 28711/34278 [31:33:08<6:36:22, 4.27s/it] 84%|████████▍ | 28712/34278 [31:33:11<6:06:29, 3.95s/it] {'loss': 0.1122, 'grad_norm': 0.8303128010172225, 'learning_rate': 6.756718966546622e-07, 'epoch': 0.84} 84%|████████▍ | 28712/34278 [31:33:11<6:06:29, 3.95s/it] 84%|████████▍ | 28713/34278 [31:33:14<5:40:54, 3.68s/it] {'loss': 0.1058, 'grad_norm': 0.9188633021514161, 'learning_rate': 6.754347523845101e-07, 'epoch': 0.84} 84%|████████▍ | 28713/34278 [31:33:14<5:40:54, 3.68s/it] 84%|████████▍ | 28714/34278 [31:33:17<5:20:19, 3.45s/it] {'loss': 0.1181, 'grad_norm': 0.8497070208353793, 'learning_rate': 6.751976467230442e-07, 'epoch': 0.84} 84%|████████▍ | 28714/34278 [31:33:17<5:20:19, 3.45s/it] 84%|████████▍ | 28715/34278 [31:33:22<6:05:31, 3.94s/it] {'loss': 0.1136, 'grad_norm': 0.9130469832679831, 'learning_rate': 6.749605796723802e-07, 'epoch': 0.84} 84%|████████▍ | 28715/34278 [31:33:22<6:05:31, 3.94s/it] 84%|████████▍ | 28716/34278 [31:33:27<6:26:29, 4.17s/it] {'loss': 0.1219, 'grad_norm': 0.8934997775976855, 'learning_rate': 6.747235512346368e-07, 'epoch': 0.84} 84%|████████▍ | 28716/34278 [31:33:27<6:26:29, 4.17s/it] 84%|████████▍ | 28717/34278 [31:33:30<6:05:44, 3.95s/it] {'loss': 0.1038, 'grad_norm': 0.8038434375759682, 'learning_rate': 6.744865614119289e-07, 'epoch': 0.84} 84%|████████▍ | 28717/34278 [31:33:30<6:05:44, 3.95s/it] 84%|████████▍ | 28718/34278 [31:33:34<6:01:04, 3.90s/it] {'loss': 0.1127, 'grad_norm': 0.8213997320272479, 'learning_rate': 6.742496102063711e-07, 'epoch': 0.84} 84%|████████▍ | 28718/34278 [31:33:34<6:01:04, 3.90s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f09807254e0>
Failed to fetch sample 3187197.
Exception: cannot identify image file <_io.BytesIO object at 0x7f09807254e0>
84%|████████▍ | 28719/34278 [31:33:38<6:11:17, 4.01s/it] {'loss': 0.1193, 'grad_norm': 0.7663259946596382, 'learning_rate': 6.740126976200806e-07, 'epoch': 0.84} 84%|████████▍ | 28719/34278 [31:33:38<6:11:17, 4.01s/it] 84%|████████▍ | 28720/34278 [31:33:43<6:25:02, 4.16s/it] {'loss': 0.0954, 'grad_norm': 0.7842153966592031, 'learning_rate': 6.737758236551728e-07, 'epoch': 0.84} 84%|████████▍ | 28720/34278 [31:33:43<6:25:02, 4.16s/it] 84%|████████▍ | 28721/34278 [31:33:45<5:44:18, 3.72s/it] {'loss': 0.0939, 'grad_norm': 0.9729076057399066, 'learning_rate': 6.735389883137616e-07, 'epoch': 0.84} 84%|████████▍ | 28721/34278 [31:33:45<5:44:18, 3.72s/it] 84%|████████▍ | 28722/34278 [31:33:51<6:46:32, 4.39s/it] {'loss': 0.1236, 'grad_norm': 0.909854706876548, 'learning_rate': 6.73302191597961e-07, 'epoch': 0.84} 84%|████████▍ | 28722/34278 [31:33:51<6:46:32, 4.39s/it] 84%|████████▍ | 28723/34278 [31:33:55<6:22:11, 4.13s/it] {'loss': 0.1431, 'grad_norm': 0.8856700792705716, 'learning_rate': 6.730654335098857e-07, 'epoch': 0.84} 84%|████████▍ | 28723/34278 [31:33:55<6:22:11, 4.13s/it] 84%|████████▍ | 28724/34278 [31:33:57<5:41:07, 3.69s/it] {'loss': 0.1113, 'grad_norm': 1.1578674171675276, 'learning_rate': 6.728287140516487e-07, 'epoch': 0.84} 84%|████████▍ | 28724/34278 [31:33:57<5:41:07, 3.69s/it] 84%|████████▍ | 28725/34278 [31:34:00<5:17:34, 3.43s/it] {'loss': 0.1201, 'grad_norm': 0.9210415753413871, 'learning_rate': 6.725920332253654e-07, 'epoch': 0.84} 84%|████████▍ | 28725/34278 [31:34:00<5:17:34, 3.43s/it] 84%|████████▍ | 28726/34278 [31:34:05<5:48:15, 3.76s/it] {'loss': 0.1157, 'grad_norm': 0.8147646490799825, 'learning_rate': 6.72355391033146e-07, 'epoch': 0.84} 84%|████████▍ | 28726/34278 [31:34:05<5:48:15, 3.76s/it] 84%|████████▍ | 28727/34278 [31:34:08<5:37:27, 3.65s/it] {'loss': 0.1085, 'grad_norm': 0.8097996550992647, 'learning_rate': 6.721187874771057e-07, 'epoch': 0.84}
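Across these steps the `learning_rate` column shrinks smoothly (about 6.74e-07 down toward 6.72e-07) as the run nears its final step 34278. A cosine-decay schedule is one common shape consistent with this — an assumption, since the log never names the scheduler, and the `peak_lr` value below is purely illustrative:

```python
import math

def cosine_lr(step, total_steps, peak_lr):
    # Cosine decay: peak_lr at step 0, 0 at total_steps.
    # (Assumed schedule shape; the log does not state which scheduler is used.)
    return 0.5 * peak_lr * (1.0 + math.cos(math.pi * step / total_steps))

# Deep into the decay phase (step ~28719 of 34278), successive values
# shrink smoothly and are far below the peak, as in the log's column.
lrs = [cosine_lr(s, 34278, 1e-5) for s in range(28719, 28728)]
assert all(a > b for a, b in zip(lrs, lrs[1:]))  # strictly decreasing
assert lrs[0] < 1e-6                             # well below the (assumed) peak
```

The sketch only demonstrates the shape of the curve; it makes no claim that these exact hyperparameters reproduce the logged values.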
84%|████████▍ | 28727/34278 [31:34:08<5:37:27, 3.65s/it] 84%|████████▍ | 28728/34278 [31:34:12<5:31:54, 3.59s/it] {'loss': 0.1182, 'grad_norm': 0.7883422579529276, 'learning_rate': 6.718822225593547e-07, 'epoch': 0.84} 84%|████████▍ | 28728/34278 [31:34:12<5:31:54, 3.59s/it] 84%|████████▍ | 28729/34278 [31:34:16<5:47:27, 3.76s/it] {'loss': 0.1399, 'grad_norm': 0.8299692523240949, 'learning_rate': 6.716456962820067e-07, 'epoch': 0.84} 84%|████████▍ | 28729/34278 [31:34:16<5:47:27, 3.76s/it] 84%|████████▍ | 28730/34278 [31:34:18<5:17:42, 3.44s/it] {'loss': 0.1046, 'grad_norm': 0.6813307692782673, 'learning_rate': 6.714092086471718e-07, 'epoch': 0.84} 84%|████████▍ | 28730/34278 [31:34:18<5:17:42, 3.44s/it] 84%|████████▍ | 28731/34278 [31:34:22<5:14:37, 3.40s/it] {'loss': 0.1224, 'grad_norm': 0.9338365931502542, 'learning_rate': 6.711727596569639e-07, 'epoch': 0.84} 84%|████████▍ | 28731/34278 [31:34:22<5:14:37, 3.40s/it] 84%|████████▍ | 28732/34278 [31:34:25<5:12:22, 3.38s/it] {'loss': 0.1017, 'grad_norm': 0.8818751665529719, 'learning_rate': 6.709363493134902e-07, 'epoch': 0.84} 84%|████████▍ | 28732/34278 [31:34:25<5:12:22, 3.38s/it] 84%|████████▍ | 28733/34278 [31:34:30<5:44:54, 3.73s/it] {'loss': 0.1068, 'grad_norm': 0.7867298737998444, 'learning_rate': 6.706999776188649e-07, 'epoch': 0.84} 84%|████████▍ | 28733/34278 [31:34:30<5:44:54, 3.73s/it] 84%|████████▍ | 28734/34278 [31:34:35<6:38:35, 4.31s/it] {'loss': 0.1138, 'grad_norm': 0.8124559165017454, 'learning_rate': 6.704636445751966e-07, 'epoch': 0.84} 84%|████████▍ | 28734/34278 [31:34:35<6:38:35, 4.31s/it] 84%|████████▍ | 28735/34278 [31:34:38<5:56:03, 3.85s/it] {'loss': 0.103, 'grad_norm': 0.8695400580267945, 'learning_rate': 6.702273501845946e-07, 'epoch': 0.84} 84%|████████▍ | 28735/34278 [31:34:38<5:56:03, 3.85s/it] 84%|████████▍ | 28736/34278 [31:34:42<5:47:26, 3.76s/it] {'loss': 0.1379, 'grad_norm': 0.9273491181602084, 'learning_rate': 6.699910944491689e-07, 'epoch': 0.84} 84%|████████▍ | 28736/34278 
[31:34:42<5:47:26, 3.76s/it] 84%|████████▍ | 28737/34278 [31:34:46<6:03:04, 3.93s/it] {'loss': 0.1187, 'grad_norm': 0.7142323570772079, 'learning_rate': 6.6975487737103e-07, 'epoch': 0.84} 84%|████████▍ | 28737/34278 [31:34:46<6:03:04, 3.93s/it] 84%|████████▍ | 28738/34278 [31:34:52<6:59:18, 4.54s/it] {'loss': 0.1004, 'grad_norm': 0.768054204273176, 'learning_rate': 6.695186989522856e-07, 'epoch': 0.84} 84%|████████▍ | 28738/34278 [31:34:52<6:59:18, 4.54s/it] 84%|████████▍ | 28739/34278 [31:34:56<6:44:42, 4.38s/it] {'loss': 0.1177, 'grad_norm': 0.9379413389553742, 'learning_rate': 6.692825591950441e-07, 'epoch': 0.84} 84%|████████▍ | 28739/34278 [31:34:56<6:44:42, 4.38s/it] 84%|████████▍ | 28740/34278 [31:34:59<6:13:24, 4.05s/it] {'loss': 0.118, 'grad_norm': 0.8190572260259622, 'learning_rate': 6.69046458101415e-07, 'epoch': 0.84} 84%|████████▍ | 28740/34278 [31:34:59<6:13:24, 4.05s/it] 84%|████████▍ | 28741/34278 [31:35:02<5:45:32, 3.74s/it] {'loss': 0.1163, 'grad_norm': 0.8160379824822651, 'learning_rate': 6.688103956735048e-07, 'epoch': 0.84} 84%|████████▍ | 28741/34278 [31:35:02<5:45:32, 3.74s/it] 84%|████████▍ | 28742/34278 [31:35:06<5:42:43, 3.71s/it] {'loss': 0.1283, 'grad_norm': 1.0185656110838428, 'learning_rate': 6.685743719134197e-07, 'epoch': 0.84} 84%|████████▍ | 28742/34278 [31:35:06<5:42:43, 3.71s/it] 84%|████████▍ | 28743/34278 [31:35:09<5:28:54, 3.57s/it] {'loss': 0.0913, 'grad_norm': 1.1103838702868496, 'learning_rate': 6.683383868232706e-07, 'epoch': 0.84} 84%|████████▍ | 28743/34278 [31:35:09<5:28:54, 3.57s/it] 84%|████████▍ | 28744/34278 [31:35:12<5:09:12, 3.35s/it] {'loss': 0.0981, 'grad_norm': 0.7070828225170954, 'learning_rate': 6.681024404051623e-07, 'epoch': 0.84} 84%|████████▍ | 28744/34278 [31:35:12<5:09:12, 3.35s/it] 84%|████████▍ | 28745/34278 [31:35:15<4:59:13, 3.24s/it] {'loss': 0.1358, 'grad_norm': 0.9366319569932889, 'learning_rate': 6.678665326612005e-07, 'epoch': 0.84} 84%|████████▍ | 28745/34278 [31:35:15<4:59:13, 3.24s/it] 
84%|████████▍ | 28746/34278 [31:35:20<5:43:03, 3.72s/it] {'loss': 0.1194, 'grad_norm': 0.9124516318112963, 'learning_rate': 6.676306635934926e-07, 'epoch': 0.84}
84%|████████▍ | 28747/34278 [31:35:23<5:30:57, 3.59s/it] {'loss': 0.107, 'grad_norm': 0.8135006791237972, 'learning_rate': 6.673948332041446e-07, 'epoch': 0.84}
84%|████████▍ | 28748/34278 [31:35:26<5:12:38, 3.39s/it] {'loss': 0.1078, 'grad_norm': 0.6885575632321572, 'learning_rate': 6.6715904149526e-07, 'epoch': 0.84}
84%|████████▍ | 28749/34278 [31:35:29<5:13:23, 3.40s/it] {'loss': 0.0977, 'grad_norm': 0.7077370536667261, 'learning_rate': 6.669232884689448e-07, 'epoch': 0.84}
84%|████████▍ | 28750/34278 [31:35:32<5:02:09, 3.28s/it] {'loss': 0.0833, 'grad_norm': 0.8249751267352112, 'learning_rate': 6.666875741273055e-07, 'epoch': 0.84}
84%|████████▍ | 28751/34278 [31:35:38<5:59:27, 3.90s/it] {'loss': 0.1151, 'grad_norm': 0.8238758026084099, 'learning_rate': 6.66451898472445e-07, 'epoch': 0.84}
84%|████████▍ | 28752/34278 [31:35:41<5:54:09, 3.85s/it] {'loss': 0.093, 'grad_norm': 0.7147478588553577, 'learning_rate': 6.662162615064666e-07, 'epoch': 0.84}
84%|████████▍ | 28753/34278 [31:35:44<5:23:49, 3.52s/it] {'loss': 0.0861, 'grad_norm': 0.7429986693398549, 'learning_rate': 6.659806632314753e-07, 'epoch': 0.84}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0fd20814e0>
Failed to fetch sample 3646660.
Exception: cannot identify image file <_io.BytesIO object at 0x7f0fd20814e0> 84%|████████▍ | 28754/34278 [31:35:49<5:57:17, 3.88s/it] {'loss': 0.1197, 'grad_norm': 0.7751296716191228, 'learning_rate': 6.657451036495738e-07, 'epoch': 0.84} 84%|████████▍ | 28754/34278 [31:35:49<5:57:17, 3.88s/it] 84%|████████▍ | 28755/34278 [31:35:52<5:34:52, 3.64s/it] {'loss': 0.1165, 'grad_norm': 0.7673811335053209, 'learning_rate': 6.65509582762866e-07, 'epoch': 0.84} 84%|████████▍ | 28755/34278 [31:35:52<5:34:52, 3.64s/it] 84%|████████▍ | 28756/34278 [31:35:55<5:19:17, 3.47s/it] {'loss': 0.1292, 'grad_norm': 1.0133916581262266, 'learning_rate': 6.652741005734525e-07, 'epoch': 0.84} 84%|████████▍ | 28756/34278 [31:35:55<5:19:17, 3.47s/it] 84%|████████▍ | 28757/34278 [31:35:58<5:04:33, 3.31s/it] {'loss': 0.0914, 'grad_norm': 0.8180041015398647, 'learning_rate': 6.650386570834383e-07, 'epoch': 0.84} 84%|████████▍ | 28757/34278 [31:35:58<5:04:33, 3.31s/it] 84%|████████▍ | 28758/34278 [31:36:01<4:51:44, 3.17s/it] {'loss': 0.1249, 'grad_norm': 0.7476366684534999, 'learning_rate': 6.648032522949232e-07, 'epoch': 0.84} 84%|████████▍ | 28758/34278 [31:36:01<4:51:44, 3.17s/it] 84%|████████▍ | 28759/34278 [31:36:05<5:17:47, 3.45s/it] {'loss': 0.1323, 'grad_norm': 0.8546995788144813, 'learning_rate': 6.645678862100114e-07, 'epoch': 0.84} 84%|████████▍ | 28759/34278 [31:36:05<5:17:47, 3.45s/it] 84%|████████▍ | 28760/34278 [31:36:08<4:57:26, 3.23s/it] {'loss': 0.1014, 'grad_norm': 0.7531425851554054, 'learning_rate': 6.643325588308008e-07, 'epoch': 0.84} 84%|████████▍ | 28760/34278 [31:36:08<4:57:26, 3.23s/it] 84%|████████▍ | 28761/34278 [31:36:11<5:04:28, 3.31s/it] {'loss': 0.097, 'grad_norm': 0.9611365207426606, 'learning_rate': 6.64097270159395e-07, 'epoch': 0.84} 84%|████████▍ | 28761/34278 [31:36:11<5:04:28, 3.31s/it] 84%|████████▍ | 28762/34278 [31:36:14<5:00:40, 3.27s/it] {'loss': 0.1244, 'grad_norm': 0.6552977587311081, 'learning_rate': 6.638620201978929e-07, 'epoch': 0.84} 
84%|████████▍ | 28762/34278 [31:36:14<5:00:40, 3.27s/it] 84%|████████▍ | 28763/34278 [31:36:17<4:54:35, 3.20s/it] {'loss': 0.1107, 'grad_norm': 0.7306251649323, 'learning_rate': 6.636268089483971e-07, 'epoch': 0.84} 84%|████████▍ | 28763/34278 [31:36:17<4:54:35, 3.20s/it] 84%|████████▍ | 28764/34278 [31:36:21<4:56:11, 3.22s/it] {'loss': 0.1058, 'grad_norm': 0.8586736099696834, 'learning_rate': 6.633916364130056e-07, 'epoch': 0.84} 84%|████████▍ | 28764/34278 [31:36:21<4:56:11, 3.22s/it] 84%|████████▍ | 28765/34278 [31:36:24<5:05:02, 3.32s/it] {'loss': 0.1135, 'grad_norm': 1.0360708242033867, 'learning_rate': 6.631565025938169e-07, 'epoch': 0.84} 84%|████████▍ | 28765/34278 [31:36:24<5:05:02, 3.32s/it] 84%|████████▍ | 28766/34278 [31:36:27<4:54:53, 3.21s/it] {'loss': 0.1187, 'grad_norm': 0.948705596794399, 'learning_rate': 6.629214074929319e-07, 'epoch': 0.84} 84%|████████▍ | 28766/34278 [31:36:27<4:54:53, 3.21s/it] 84%|████████▍ | 28767/34278 [31:36:30<4:42:44, 3.08s/it] {'loss': 0.1144, 'grad_norm': 0.8846009342703027, 'learning_rate': 6.626863511124504e-07, 'epoch': 0.84} 84%|████████▍ | 28767/34278 [31:36:30<4:42:44, 3.08s/it] 84%|████████▍ | 28768/34278 [31:36:33<4:43:55, 3.09s/it] {'loss': 0.1015, 'grad_norm': 0.8417190637982709, 'learning_rate': 6.624513334544697e-07, 'epoch': 0.84} 84%|████████▍ | 28768/34278 [31:36:33<4:43:55, 3.09s/it] 84%|████████▍ | 28769/34278 [31:36:36<4:31:03, 2.95s/it] {'loss': 0.1209, 'grad_norm': 0.7031646678553474, 'learning_rate': 6.622163545210875e-07, 'epoch': 0.84} 84%|████████▍ | 28769/34278 [31:36:36<4:31:03, 2.95s/it] 84%|████████▍ | 28770/34278 [31:36:38<4:19:46, 2.83s/it] {'loss': 0.1413, 'grad_norm': 0.8554940765743669, 'learning_rate': 6.619814143144026e-07, 'epoch': 0.84} 84%|████████▍ | 28770/34278 [31:36:38<4:19:46, 2.83s/it] 84%|████████▍ | 28771/34278 [31:36:42<4:35:41, 3.00s/it] {'loss': 0.0912, 'grad_norm': 0.8921323870672512, 'learning_rate': 6.61746512836512e-07, 'epoch': 0.84} 84%|████████▍ | 28771/34278 
[31:36:42<4:35:41, 3.00s/it] 84%|████████▍ | 28772/34278 [31:36:45<4:53:50, 3.20s/it] {'loss': 0.1007, 'grad_norm': 0.7582561652777967, 'learning_rate': 6.615116500895113e-07, 'epoch': 0.84} 84%|████████▍ | 28772/34278 [31:36:45<4:53:50, 3.20s/it] 84%|████████▍ | 28773/34278 [31:36:51<5:49:34, 3.81s/it] {'loss': 0.1223, 'grad_norm': 0.9667372846011172, 'learning_rate': 6.612768260755004e-07, 'epoch': 0.84} 84%|████████▍ | 28773/34278 [31:36:51<5:49:34, 3.81s/it] 84%|████████▍ | 28774/34278 [31:36:54<5:28:56, 3.59s/it] {'loss': 0.0737, 'grad_norm': 0.8493696642065195, 'learning_rate': 6.610420407965745e-07, 'epoch': 0.84} 84%|████████▍ | 28774/34278 [31:36:54<5:28:56, 3.59s/it] 84%|████████▍ | 28775/34278 [31:36:57<5:32:21, 3.62s/it] {'loss': 0.1174, 'grad_norm': 0.9896193161742983, 'learning_rate': 6.608072942548288e-07, 'epoch': 0.84} 84%|████████▍ | 28775/34278 [31:36:57<5:32:21, 3.62s/it] 84%|████████▍ | 28776/34278 [31:37:01<5:24:37, 3.54s/it] {'loss': 0.1345, 'grad_norm': 0.8209049064261055, 'learning_rate': 6.605725864523604e-07, 'epoch': 0.84} 84%|████████▍ | 28776/34278 [31:37:01<5:24:37, 3.54s/it] 84%|████████▍ | 28777/34278 [31:37:04<5:27:23, 3.57s/it] {'loss': 0.1197, 'grad_norm': 0.8505099019023185, 'learning_rate': 6.603379173912644e-07, 'epoch': 0.84} 84%|████████▍ | 28777/34278 [31:37:04<5:27:23, 3.57s/it] 84%|████████▍ | 28778/34278 [31:37:08<5:33:35, 3.64s/it] {'loss': 0.095, 'grad_norm': 0.8435687476471607, 'learning_rate': 6.601032870736341e-07, 'epoch': 0.84} 84%|████████▍ | 28778/34278 [31:37:08<5:33:35, 3.64s/it] 84%|████████▍ | 28779/34278 [31:37:14<6:44:30, 4.41s/it] {'loss': 0.1135, 'grad_norm': 0.8049885646169743, 'learning_rate': 6.598686955015654e-07, 'epoch': 0.84} 84%|████████▍ | 28779/34278 [31:37:14<6:44:30, 4.41s/it] 84%|████████▍ | 28780/34278 [31:37:18<6:14:24, 4.09s/it] {'loss': 0.1203, 'grad_norm': 0.8305185722291034, 'learning_rate': 6.596341426771546e-07, 'epoch': 0.84} 84%|████████▍ | 28780/34278 [31:37:18<6:14:24, 4.09s/it] 
84%|████████▍ | 28781/34278 [31:37:21<5:42:47, 3.74s/it] {'loss': 0.1166, 'grad_norm': 1.6308803784404653, 'learning_rate': 6.593996286024934e-07, 'epoch': 0.84} 84%|████████▍ | 28781/34278 [31:37:21<5:42:47, 3.74s/it] 84%|████████▍ | 28782/34278 [31:37:23<5:16:39, 3.46s/it] {'loss': 0.1029, 'grad_norm': 0.8362606748811074, 'learning_rate': 6.591651532796755e-07, 'epoch': 0.84} 84%|████████▍ | 28782/34278 [31:37:23<5:16:39, 3.46s/it] 84%|████████▍ | 28783/34278 [31:37:28<5:52:31, 3.85s/it] {'loss': 0.1072, 'grad_norm': 0.8043625644606235, 'learning_rate': 6.589307167107962e-07, 'epoch': 0.84} 84%|████████▍ | 28783/34278 [31:37:28<5:52:31, 3.85s/it] 84%|████████▍ | 28784/34278 [31:37:32<5:38:55, 3.70s/it] {'loss': 0.1369, 'grad_norm': 0.7346860174215263, 'learning_rate': 6.586963188979456e-07, 'epoch': 0.84} 84%|████████▍ | 28784/34278 [31:37:32<5:38:55, 3.70s/it] 84%|████████▍ | 28785/34278 [31:37:34<5:14:02, 3.43s/it] {'loss': 0.1306, 'grad_norm': 0.8034760711491574, 'learning_rate': 6.584619598432191e-07, 'epoch': 0.84} 84%|████████▍ | 28785/34278 [31:37:34<5:14:02, 3.43s/it] 84%|████████▍ | 28786/34278 [31:37:38<5:11:13, 3.40s/it] {'loss': 0.1322, 'grad_norm': 0.9681865502098288, 'learning_rate': 6.58227639548707e-07, 'epoch': 0.84} 84%|████████▍ | 28786/34278 [31:37:38<5:11:13, 3.40s/it] 84%|████████▍ | 28787/34278 [31:37:41<5:10:39, 3.39s/it] {'loss': 0.1088, 'grad_norm': 0.8605951304570972, 'learning_rate': 6.579933580165027e-07, 'epoch': 0.84} 84%|████████▍ | 28787/34278 [31:37:41<5:10:39, 3.39s/it] 84%|████████▍ | 28788/34278 [31:37:47<6:31:56, 4.28s/it] {'loss': 0.1163, 'grad_norm': 0.9754889755454179, 'learning_rate': 6.577591152486972e-07, 'epoch': 0.84} 84%|████████▍ | 28788/34278 [31:37:47<6:31:56, 4.28s/it] 84%|████████▍ | 28789/34278 [31:37:51<6:10:21, 4.05s/it] {'loss': 0.1017, 'grad_norm': 0.9073355830069386, 'learning_rate': 6.575249112473808e-07, 'epoch': 0.84} 84%|████████▍ | 28789/34278 [31:37:51<6:10:21, 4.05s/it] 84%|████████▍ | 28790/34278 
[31:37:54<5:52:19, 3.85s/it] {'loss': 0.1243, 'grad_norm': 0.7126419293482791, 'learning_rate': 6.572907460146454e-07, 'epoch': 0.84} 84%|████████▍ | 28790/34278 [31:37:54<5:52:19, 3.85s/it] 84%|████████▍ | 28791/34278 [31:37:58<5:58:02, 3.92s/it] {'loss': 0.1029, 'grad_norm': 0.6861012492374159, 'learning_rate': 6.570566195525829e-07, 'epoch': 0.84} 84%|████████▍ | 28791/34278 [31:37:58<5:58:02, 3.92s/it] 84%|████████▍ | 28792/34278 [31:38:02<5:43:49, 3.76s/it] {'loss': 0.1054, 'grad_norm': 0.9680994117995753, 'learning_rate': 6.568225318632804e-07, 'epoch': 0.84} 84%|████████▍ | 28792/34278 [31:38:02<5:43:49, 3.76s/it] 84%|████████▍ | 28793/34278 [31:38:08<6:54:27, 4.53s/it] {'loss': 0.1102, 'grad_norm': 0.7902755960454412, 'learning_rate': 6.565884829488312e-07, 'epoch': 0.84} 84%|████████▍ | 28793/34278 [31:38:08<6:54:27, 4.53s/it] 84%|████████▍ | 28794/34278 [31:38:12<6:24:09, 4.20s/it] {'loss': 0.119, 'grad_norm': 0.7962644914448704, 'learning_rate': 6.56354472811323e-07, 'epoch': 0.84} 84%|████████▍ | 28794/34278 [31:38:12<6:24:09, 4.20s/it] 84%|████████▍ | 28795/34278 [31:38:17<7:04:36, 4.65s/it] {'loss': 0.0997, 'grad_norm': 0.9050618458773342, 'learning_rate': 6.561205014528443e-07, 'epoch': 0.84} 84%|████████▍ | 28795/34278 [31:38:17<7:04:36, 4.65s/it] 84%|████████▍ | 28796/34278 [31:38:21<6:49:24, 4.48s/it] {'loss': 0.0844, 'grad_norm': 0.7665868849262667, 'learning_rate': 6.558865688754845e-07, 'epoch': 0.84} 84%|████████▍ | 28796/34278 [31:38:21<6:49:24, 4.48s/it] 84%|████████▍ | 28797/34278 [31:38:25<6:37:18, 4.35s/it] {'loss': 0.1064, 'grad_norm': 0.822595988242091, 'learning_rate': 6.556526750813336e-07, 'epoch': 0.84} 84%|████████▍ | 28797/34278 [31:38:25<6:37:18, 4.35s/it] 84%|████████▍ | 28798/34278 [31:38:29<6:07:26, 4.02s/it] {'loss': 0.1158, 'grad_norm': 0.8578890853877902, 'learning_rate': 6.554188200724782e-07, 'epoch': 0.84} 84%|████████▍ | 28798/34278 [31:38:29<6:07:26, 4.02s/it] 84%|████████▍ | 28799/34278 [31:38:32<5:36:53, 3.69s/it] 
{'loss': 0.114, 'grad_norm': 0.9404242245978279, 'learning_rate': 6.551850038510054e-07, 'epoch': 0.84} 84%|████████▍ | 28799/34278 [31:38:32<5:36:53, 3.69s/it] 84%|████████▍ | 28800/34278 [31:38:34<5:15:38, 3.46s/it] {'loss': 0.1284, 'grad_norm': 0.8845685891665658, 'learning_rate': 6.54951226419005e-07, 'epoch': 0.84} 84%|████████▍ | 28800/34278 [31:38:34<5:15:38, 3.46s/it] 84%|████████▍ | 28801/34278 [31:38:38<5:12:10, 3.42s/it] {'loss': 0.1224, 'grad_norm': 0.8490759394963238, 'learning_rate': 6.547174877785628e-07, 'epoch': 0.84} 84%|████████▍ | 28801/34278 [31:38:38<5:12:10, 3.42s/it] 84%|████████▍ | 28802/34278 [31:38:44<6:33:37, 4.31s/it] {'loss': 0.1058, 'grad_norm': 0.8678591029861535, 'learning_rate': 6.54483787931764e-07, 'epoch': 0.84} 84%|████████▍ | 28802/34278 [31:38:44<6:33:37, 4.31s/it] 84%|████████▍ | 28803/34278 [31:38:49<6:44:18, 4.43s/it] {'loss': 0.1079, 'grad_norm': 0.9107735500306756, 'learning_rate': 6.542501268806978e-07, 'epoch': 0.84} 84%|████████▍ | 28803/34278 [31:38:49<6:44:18, 4.43s/it] 84%|████████▍ | 28804/34278 [31:38:54<6:56:35, 4.57s/it] {'loss': 0.1166, 'grad_norm': 0.7782338705327391, 'learning_rate': 6.540165046274493e-07, 'epoch': 0.84} 84%|████████▍ | 28804/34278 [31:38:54<6:56:35, 4.57s/it] 84%|████████▍ | 28805/34278 [31:38:58<6:39:11, 4.38s/it] {'loss': 0.1214, 'grad_norm': 0.7252490773874591, 'learning_rate': 6.537829211741032e-07, 'epoch': 0.84} 84%|████████▍ | 28805/34278 [31:38:58<6:39:11, 4.38s/it] 84%|████████▍ | 28806/34278 [31:39:01<6:08:16, 4.04s/it] {'loss': 0.1138, 'grad_norm': 0.8941322386607158, 'learning_rate': 6.535493765227463e-07, 'epoch': 0.84} 84%|████████▍ | 28806/34278 [31:39:01<6:08:16, 4.04s/it] 84%|████████▍ | 28807/34278 [31:39:04<5:35:34, 3.68s/it] {'loss': 0.1018, 'grad_norm': 0.8353832920115603, 'learning_rate': 6.533158706754633e-07, 'epoch': 0.84} 84%|████████▍ | 28807/34278 [31:39:04<5:35:34, 3.68s/it] 84%|████████▍ | 28808/34278 [31:39:08<5:53:54, 3.88s/it] {'loss': 0.1051, 'grad_norm': 
0.7030860241272633, 'learning_rate': 6.530824036343375e-07, 'epoch': 0.84} 84%|████████▍ | 28808/34278 [31:39:08<5:53:54, 3.88s/it] 84%|████████▍ | 28809/34278 [31:39:11<5:23:21, 3.55s/it] {'loss': 0.0976, 'grad_norm': 0.8681428224464733, 'learning_rate': 6.528489754014545e-07, 'epoch': 0.84} 84%|████████▍ | 28809/34278 [31:39:11<5:23:21, 3.55s/it] 84%|████████▍ | 28810/34278 [31:39:16<5:54:30, 3.89s/it] {'loss': 0.1011, 'grad_norm': 0.9715518332274773, 'learning_rate': 6.526155859788985e-07, 'epoch': 0.84} 84%|████████▍ | 28810/34278 [31:39:16<5:54:30, 3.89s/it] 84%|████████▍ | 28811/34278 [31:39:22<7:05:25, 4.67s/it] {'loss': 0.1148, 'grad_norm': 0.7589110212252026, 'learning_rate': 6.523822353687531e-07, 'epoch': 0.84} 84%|████████▍ | 28811/34278 [31:39:22<7:05:25, 4.67s/it] 84%|████████▍ | 28812/34278 [31:39:26<6:32:19, 4.31s/it] {'loss': 0.1172, 'grad_norm': 0.8907199418658908, 'learning_rate': 6.521489235731005e-07, 'epoch': 0.84} 84%|████████▍ | 28812/34278 [31:39:26<6:32:19, 4.31s/it] 84%|████████▍ | 28813/34278 [31:39:32<7:26:25, 4.90s/it] {'loss': 0.115, 'grad_norm': 0.8646787870284771, 'learning_rate': 6.519156505940249e-07, 'epoch': 0.84} 84%|████████▍ | 28813/34278 [31:39:32<7:26:25, 4.90s/it] 84%|████████▍ | 28814/34278 [31:39:35<6:37:27, 4.36s/it] {'loss': 0.1362, 'grad_norm': 0.8748720508629039, 'learning_rate': 6.516824164336077e-07, 'epoch': 0.84} 84%|████████▍ | 28814/34278 [31:39:35<6:37:27, 4.36s/it] 84%|████████▍ | 28815/34278 [31:39:39<6:36:51, 4.36s/it] {'loss': 0.0978, 'grad_norm': 0.7613177884739198, 'learning_rate': 6.514492210939327e-07, 'epoch': 0.84} 84%|████████▍ | 28815/34278 [31:39:39<6:36:51, 4.36s/it] 84%|████████▍ | 28816/34278 [31:39:42<6:03:57, 4.00s/it] {'loss': 0.1513, 'grad_norm': 0.96189654180933, 'learning_rate': 6.512160645770799e-07, 'epoch': 0.84} 84%|████████▍ | 28816/34278 [31:39:42<6:03:57, 4.00s/it] 84%|████████▍ | 28817/34278 [31:39:45<5:36:29, 3.70s/it] {'loss': 0.1031, 'grad_norm': 0.8913120494635219, 
'learning_rate': 6.509829468851336e-07, 'epoch': 0.84} 84%|████████▍ | 28817/34278 [31:39:45<5:36:29, 3.70s/it] 84%|████████▍ | 28818/34278 [31:39:49<5:42:05, 3.76s/it] {'loss': 0.0985, 'grad_norm': 0.6909259859319695, 'learning_rate': 6.50749868020173e-07, 'epoch': 0.84} 84%|████████▍ | 28818/34278 [31:39:49<5:42:05, 3.76s/it] 84%|████████▍ | 28819/34278 [31:39:53<5:40:54, 3.75s/it] {'loss': 0.1226, 'grad_norm': 0.6436536944911209, 'learning_rate': 6.505168279842777e-07, 'epoch': 0.84} 84%|████████▍ | 28819/34278 [31:39:53<5:40:54, 3.75s/it] 84%|████████▍ | 28820/34278 [31:39:56<5:19:43, 3.51s/it] {'loss': 0.1128, 'grad_norm': 0.7144179993522757, 'learning_rate': 6.502838267795303e-07, 'epoch': 0.84} 84%|████████▍ | 28820/34278 [31:39:56<5:19:43, 3.51s/it] 84%|████████▍ | 28821/34278 [31:39:59<5:09:00, 3.40s/it] {'loss': 0.1044, 'grad_norm': 0.8841992410911953, 'learning_rate': 6.500508644080117e-07, 'epoch': 0.84} 84%|████████▍ | 28821/34278 [31:39:59<5:09:00, 3.40s/it] 84%|████████▍ | 28822/34278 [31:40:02<5:05:18, 3.36s/it] {'loss': 0.1162, 'grad_norm': 0.7712873026638355, 'learning_rate': 6.498179408717992e-07, 'epoch': 0.84} 84%|████████▍ | 28822/34278 [31:40:02<5:05:18, 3.36s/it] 84%|████████▍ | 28823/34278 [31:40:07<5:28:26, 3.61s/it] {'loss': 0.1158, 'grad_norm': 0.7415406444945843, 'learning_rate': 6.495850561729749e-07, 'epoch': 0.84} 84%|████████▍ | 28823/34278 [31:40:07<5:28:26, 3.61s/it] 84%|████████▍ | 28824/34278 [31:40:11<5:37:16, 3.71s/it] {'loss': 0.102, 'grad_norm': 0.8610246489611074, 'learning_rate': 6.493522103136169e-07, 'epoch': 0.84} 84%|████████▍ | 28824/34278 [31:40:11<5:37:16, 3.71s/it] 84%|████████▍ | 28825/34278 [31:40:17<6:43:19, 4.44s/it] {'loss': 0.0973, 'grad_norm': 0.6609714171145477, 'learning_rate': 6.491194032958026e-07, 'epoch': 0.84} 84%|████████▍ | 28825/34278 [31:40:17<6:43:19, 4.44s/it] 84%|████████▍ | 28826/34278 [31:40:22<7:20:33, 4.85s/it] {'loss': 0.1128, 'grad_norm': 0.7704983458435415, 'learning_rate': 
6.488866351216116e-07, 'epoch': 0.84} 84%|████████▍ | 28826/34278 [31:40:23<7:20:33, 4.85s/it] 84%|████████▍ | 28827/34278 [31:40:26<6:46:20, 4.47s/it] {'loss': 0.1119, 'grad_norm': 0.7011720107836845, 'learning_rate': 6.486539057931229e-07, 'epoch': 0.84} 84%|████████▍ | 28827/34278 [31:40:26<6:46:20, 4.47s/it] 84%|████████▍ | 28828/34278 [31:40:29<6:02:46, 3.99s/it] {'loss': 0.1091, 'grad_norm': 0.8783664330634164, 'learning_rate': 6.484212153124137e-07, 'epoch': 0.84} 84%|████████▍ | 28828/34278 [31:40:29<6:02:46, 3.99s/it] 84%|████████▍ | 28829/34278 [31:40:32<5:43:46, 3.79s/it] {'loss': 0.1047, 'grad_norm': 0.9538451986776406, 'learning_rate': 6.481885636815599e-07, 'epoch': 0.84} 84%|████████▍ | 28829/34278 [31:40:32<5:43:46, 3.79s/it] 84%|████████▍ | 28830/34278 [31:40:35<5:25:19, 3.58s/it] {'loss': 0.1258, 'grad_norm': 0.811201426363696, 'learning_rate': 6.479559509026406e-07, 'epoch': 0.84} 84%|████████▍ | 28830/34278 [31:40:35<5:25:19, 3.58s/it] 84%|████████▍ | 28831/34278 [31:40:39<5:33:56, 3.68s/it] {'loss': 0.1382, 'grad_norm': 0.9946951198396704, 'learning_rate': 6.477233769777319e-07, 'epoch': 0.84} 84%|████████▍ | 28831/34278 [31:40:39<5:33:56, 3.68s/it] 84%|████████▍ | 28832/34278 [31:40:44<5:50:28, 3.86s/it] {'loss': 0.1221, 'grad_norm': 0.8552353285760814, 'learning_rate': 6.474908419089076e-07, 'epoch': 0.84} 84%|████████▍ | 28832/34278 [31:40:44<5:50:28, 3.86s/it] 84%|████████▍ | 28833/34278 [31:40:47<5:40:41, 3.75s/it] {'loss': 0.0966, 'grad_norm': 0.7219144875927064, 'learning_rate': 6.472583456982485e-07, 'epoch': 0.84} 84%|████████▍ | 28833/34278 [31:40:47<5:40:41, 3.75s/it] 84%|████████▍ | 28834/34278 [31:40:51<5:55:48, 3.92s/it] {'loss': 0.12, 'grad_norm': 1.2242165187967142, 'learning_rate': 6.470258883478275e-07, 'epoch': 0.84} 84%|████████▍ | 28834/34278 [31:40:51<5:55:48, 3.92s/it] 84%|████████▍ | 28835/34278 [31:40:54<5:25:42, 3.59s/it] {'loss': 0.1456, 'grad_norm': 1.1793211548741807, 'learning_rate': 6.467934698597189e-07, 'epoch': 
0.84}
84%|████████▍ | 28836/34278 [31:41:00<6:30:04, 4.30s/it] {'loss': 0.1191, 'grad_norm': 0.9663827119742197, 'learning_rate': 6.465610902360009e-07, 'epoch': 0.84}
84%|████████▍ | 28837/34278 [31:41:06<7:17:10, 4.82s/it] {'loss': 0.1278, 'grad_norm': 0.8042727150124604, 'learning_rate': 6.463287494787446e-07, 'epoch': 0.84}
84%|████████▍ | 28838/34278 [31:41:11<7:04:11, 4.68s/it] {'loss': 0.1202, 'grad_norm': 1.083381842465957, 'learning_rate': 6.460964475900266e-07, 'epoch': 0.84}
84%|████████▍ | 28839/34278 [31:41:14<6:23:42, 4.23s/it] {'loss': 0.098, 'grad_norm': 0.9834436853846289, 'learning_rate': 6.4586418457192e-07, 'epoch': 0.84}
84%|████████▍ | 28840/34278 [31:41:18<6:15:24, 4.14s/it] {'loss': 0.1179, 'grad_norm': 0.6787730454878973, 'learning_rate': 6.456319604264988e-07, 'epoch': 0.84}
84%|████████▍ | 28841/34278 [31:41:22<6:19:05, 4.18s/it] {'loss': 0.1191, 'grad_norm': 0.8495841239183446, 'learning_rate': 6.453997751558366e-07, 'epoch': 0.84}
84%|████████▍ | 28842/34278 [31:41:25<5:44:44, 3.81s/it] {'loss': 0.1132, 'grad_norm': 1.1290615715573953, 'learning_rate': 6.451676287620046e-07, 'epoch': 0.84}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f81001a2430>
Failed to fetch sample 2696826.
Exception: cannot identify image file <_io.BytesIO object at 0x7f81001a2430>
84%|████████▍ | 28843/34278 [31:41:29<5:41:00, 3.76s/it] {'loss': 0.114, 'grad_norm': 0.8453385437174072, 'learning_rate': 6.44935521247076e-07, 'epoch': 0.84}
84%|████████▍ | 28844/34278 [31:41:32<5:27:52, 3.62s/it] {'loss': 0.1179, 'grad_norm': 0.9923323922290838, 'learning_rate': 6.447034526131247e-07, 'epoch': 0.84}
84%|████████▍ | 28845/34278 [31:41:36<5:30:39, 3.65s/it] {'loss': 0.1197, 'grad_norm': 0.8784270460169328, 'learning_rate': 6.444714228622212e-07, 'epoch': 0.84}
84%|████████▍ | 28846/34278 [31:41:39<5:20:39, 3.54s/it] {'loss': 0.1099, 'grad_norm': 0.8813807687323639, 'learning_rate': 6.442394319964362e-07, 'epoch': 0.84}
84%|████████▍ | 28847/34278 [31:41:44<6:12:31, 4.12s/it] {'loss': 0.1135, 'grad_norm':
0.7550142799237555, 'learning_rate': 6.440074800178426e-07, 'epoch': 0.84} 84%|████████▍ | 28847/34278 [31:41:44<6:12:31, 4.12s/it] 84%|████████▍ | 28848/34278 [31:41:48<6:11:42, 4.11s/it] {'loss': 0.1148, 'grad_norm': 0.7698178913705608, 'learning_rate': 6.437755669285106e-07, 'epoch': 0.84} 84%|████████▍ | 28848/34278 [31:41:48<6:11:42, 4.11s/it] 84%|████████▍ | 28849/34278 [31:41:51<5:40:17, 3.76s/it] {'loss': 0.1112, 'grad_norm': 0.7689855145974284, 'learning_rate': 6.435436927305077e-07, 'epoch': 0.84} 84%|████████▍ | 28849/34278 [31:41:51<5:40:17, 3.76s/it] 84%|████████▍ | 28850/34278 [31:41:54<5:10:41, 3.43s/it] {'loss': 0.1002, 'grad_norm': 0.8660549598806329, 'learning_rate': 6.433118574259095e-07, 'epoch': 0.84} 84%|████████▍ | 28850/34278 [31:41:54<5:10:41, 3.43s/it] 84%|████████▍ | 28851/34278 [31:41:57<4:57:00, 3.28s/it] {'loss': 0.1298, 'grad_norm': 0.8977184197766986, 'learning_rate': 6.430800610167831e-07, 'epoch': 0.84} 84%|████████▍ | 28851/34278 [31:41:57<4:57:00, 3.28s/it] 84%|████████▍ | 28852/34278 [31:42:01<5:13:43, 3.47s/it] {'loss': 0.1067, 'grad_norm': 0.9849880656802578, 'learning_rate': 6.428483035051963e-07, 'epoch': 0.84} 84%|████████▍ | 28852/34278 [31:42:01<5:13:43, 3.47s/it] 84%|████████▍ | 28853/34278 [31:42:07<6:21:01, 4.21s/it] {'loss': 0.1035, 'grad_norm': 0.8153342556747524, 'learning_rate': 6.426165848932208e-07, 'epoch': 0.84} 84%|████████▍ | 28853/34278 [31:42:07<6:21:01, 4.21s/it] 84%|████████▍ | 28854/34278 [31:42:10<5:55:22, 3.93s/it] {'loss': 0.0998, 'grad_norm': 0.9065290418267716, 'learning_rate': 6.423849051829246e-07, 'epoch': 0.84} 84%|████████▍ | 28854/34278 [31:42:10<5:55:22, 3.93s/it] 84%|████████▍ | 28855/34278 [31:42:16<6:56:17, 4.61s/it] {'loss': 0.0989, 'grad_norm': 0.7417628350177068, 'learning_rate': 6.421532643763745e-07, 'epoch': 0.84} 84%|████████▍ | 28855/34278 [31:42:16<6:56:17, 4.61s/it] 84%|████████▍ | 28856/34278 [31:42:19<6:13:56, 4.14s/it] {'loss': 0.1439, 'grad_norm': 0.8666948957920616, 
[Training-log excerpt (tqdm/Trainer console output), steps 28856–29047 of 34278 (84–85%, epoch 0.84 → 0.85), wall clock 31:42:19 → 31:54:01 at roughly 3.1–5.2 s/it. Duplicated progress-bar repaints removed; each step logged a dict of the form {'loss': ..., 'grad_norm': ..., 'learning_rate': ..., 'epoch': ...}. Over this span the loss fluctuated between about 0.08 and 0.16 with no clear trend; grad_norm mostly stayed within 0.65–1.3, with isolated spikes (≈1.56 at step 28999, ≈3.13 at step 29046); the learning rate decayed smoothly from ≈6.42e-07 to ≈5.98e-07.]
3.46s/it] 85%|████████▍ | 29048/34278 [31:54:04<4:56:52, 3.41s/it] {'loss': 0.1113, 'grad_norm': 0.9006617659695598, 'learning_rate': 5.98177401954827e-07, 'epoch': 0.85} 85%|████████▍ | 29048/34278 [31:54:05<4:56:52, 3.41s/it] 85%|████████▍ | 29049/34278 [31:54:07<4:39:41, 3.21s/it] {'loss': 0.1255, 'grad_norm': 0.8753993150754519, 'learning_rate': 5.979533471683773e-07, 'epoch': 0.85} 85%|████████▍ | 29049/34278 [31:54:07<4:39:41, 3.21s/it] 85%|████████▍ | 29050/34278 [31:54:10<4:22:34, 3.01s/it] {'loss': 0.1078, 'grad_norm': 0.8116099502956838, 'learning_rate': 5.977293316823502e-07, 'epoch': 0.85} 85%|████████▍ | 29050/34278 [31:54:10<4:22:34, 3.01s/it] 85%|████████▍ | 29051/34278 [31:54:14<4:57:41, 3.42s/it] {'loss': 0.1115, 'grad_norm': 0.8751003965449491, 'learning_rate': 5.975053554987448e-07, 'epoch': 0.85} 85%|████████▍ | 29051/34278 [31:54:14<4:57:41, 3.42s/it] 85%|████████▍ | 29052/34278 [31:54:17<4:40:40, 3.22s/it] {'loss': 0.116, 'grad_norm': 0.9368127147652712, 'learning_rate': 5.972814186195597e-07, 'epoch': 0.85} 85%|████████▍ | 29052/34278 [31:54:17<4:40:40, 3.22s/it] 85%|████████▍ | 29053/34278 [31:54:20<4:30:24, 3.11s/it] {'loss': 0.0994, 'grad_norm': 0.847932109256919, 'learning_rate': 5.970575210467949e-07, 'epoch': 0.85} 85%|████████▍ | 29053/34278 [31:54:20<4:30:24, 3.11s/it] 85%|████████▍ | 29054/34278 [31:54:23<4:28:12, 3.08s/it] {'loss': 0.1091, 'grad_norm': 0.7030066060440572, 'learning_rate': 5.968336627824506e-07, 'epoch': 0.85} 85%|████████▍ | 29054/34278 [31:54:23<4:28:12, 3.08s/it] 85%|████████▍ | 29055/34278 [31:54:26<4:22:05, 3.01s/it] {'loss': 0.1247, 'grad_norm': 0.660917535734426, 'learning_rate': 5.966098438285245e-07, 'epoch': 0.85} 85%|████████▍ | 29055/34278 [31:54:26<4:22:05, 3.01s/it] 85%|████████▍ | 29056/34278 [31:54:29<4:35:14, 3.16s/it] {'loss': 0.1049, 'grad_norm': 0.7533538106123413, 'learning_rate': 5.963860641870134e-07, 'epoch': 0.85} 85%|████████▍ | 29056/34278 [31:54:29<4:35:14, 3.16s/it] 85%|████████▍ | 
29057/34278 [31:54:32<4:29:54, 3.10s/it] {'loss': 0.0943, 'grad_norm': 0.9245893197000575, 'learning_rate': 5.961623238599168e-07, 'epoch': 0.85} 85%|████████▍ | 29057/34278 [31:54:32<4:29:54, 3.10s/it] 85%|████████▍ | 29058/34278 [31:54:38<5:45:18, 3.97s/it] {'loss': 0.1128, 'grad_norm': 0.7611611264482083, 'learning_rate': 5.959386228492314e-07, 'epoch': 0.85} 85%|████████▍ | 29058/34278 [31:54:38<5:45:18, 3.97s/it] 85%|████████▍ | 29059/34278 [31:54:41<5:26:37, 3.75s/it] {'loss': 0.1033, 'grad_norm': 0.9054708090581463, 'learning_rate': 5.957149611569541e-07, 'epoch': 0.85} 85%|████████▍ | 29059/34278 [31:54:41<5:26:37, 3.75s/it] 85%|████████▍ | 29060/34278 [31:54:47<6:23:17, 4.41s/it] {'loss': 0.1185, 'grad_norm': 0.7867854167182577, 'learning_rate': 5.954913387850836e-07, 'epoch': 0.85} 85%|████████▍ | 29060/34278 [31:54:47<6:23:17, 4.41s/it] 85%|████████▍ | 29061/34278 [31:54:51<5:55:51, 4.09s/it] {'loss': 0.1009, 'grad_norm': 0.9561588506171079, 'learning_rate': 5.952677557356146e-07, 'epoch': 0.85} 85%|████████▍ | 29061/34278 [31:54:51<5:55:51, 4.09s/it] 85%|████████▍ | 29062/34278 [31:54:54<5:29:43, 3.79s/it] {'loss': 0.1215, 'grad_norm': 0.8367745657832105, 'learning_rate': 5.950442120105432e-07, 'epoch': 0.85} 85%|████████▍ | 29062/34278 [31:54:54<5:29:43, 3.79s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 85%|████████▍ | 29063/34278 [31:54:57<5:08:52, 3.55s/it] {'loss': 0.1464, 'grad_norm': 0.7966824773289977, 'learning_rate': 5.948207076118662e-07, 'epoch': 0.85} 85%|████████▍ | 29063/34278 [31:54:57<5:08:52, 3.55s/it] 85%|████████▍ | 29064/34278 [31:55:00<5:09:49, 3.57s/it] {'loss': 0.1128, 'grad_norm': 0.9122979652791569, 'learning_rate': 5.945972425415769e-07, 'epoch': 0.85} 85%|████████▍ | 29064/34278 [31:55:00<5:09:49, 3.57s/it] 85%|████████▍ | 29065/34278 [31:55:05<5:47:13, 4.00s/it] {'loss': 0.1098, 'grad_norm': 0.7476498141301914, 'learning_rate': 5.943738168016732e-07, 'epoch': 0.85} 85%|████████▍ | 29065/34278 [31:55:05<5:47:13, 4.00s/it] 85%|████████▍ | 29066/34278 [31:55:08<5:22:14, 3.71s/it] {'loss': 0.1206, 'grad_norm': 1.0757619620389016, 'learning_rate': 5.941504303941475e-07, 'epoch': 0.85} 85%|████████▍ | 29066/34278 [31:55:08<5:22:14, 3.71s/it] 85%|████████▍ | 29067/34278 [31:55:14<6:19:06, 4.37s/it] {'loss': 0.1108, 'grad_norm': 1.0973587077063143, 'learning_rate': 5.939270833209959e-07, 'epoch': 0.85} 85%|████████▍ | 29067/34278 [31:55:14<6:19:06, 4.37s/it] 85%|████████▍ | 29068/34278 [31:55:18<5:53:52, 4.08s/it] {'loss': 0.0973, 'grad_norm': 0.8395425331530202, 'learning_rate': 5.937037755842112e-07, 'epoch': 0.85} 85%|████████▍ | 29068/34278 [31:55:18<5:53:52, 4.08s/it] 85%|████████▍ | 29069/34278 [31:55:21<5:45:54, 3.98s/it] {'loss': 0.1319, 'grad_norm': 0.8268829295462345, 'learning_rate': 5.934805071857863e-07, 'epoch': 0.85} 85%|████████▍ | 29069/34278 [31:55:21<5:45:54, 3.98s/it] 85%|████████▍ | 29070/34278 [31:55:25<5:37:38, 3.89s/it] {'loss': 0.1068, 'grad_norm': 0.7921914210874075, 'learning_rate': 5.932572781277158e-07, 'epoch': 0.85} 85%|████████▍ | 29070/34278 [31:55:25<5:37:38, 3.89s/it] 85%|████████▍ | 29071/34278 [31:55:28<5:17:34, 3.66s/it] {'loss': 0.1045, 'grad_norm': 0.8860279687171584, 'learning_rate': 5.930340884119934e-07, 'epoch': 0.85} 85%|████████▍ | 29071/34278 [31:55:28<5:17:34, 
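The UserWarning above comes from `torch.utils.checkpoint`: a checkpointed segment was entered with no tensor input requiring grad, which typically happens when gradient checkpointing runs behind frozen inputs (e.g. a frozen embedding or vision tower). A minimal sketch reproducing and then silencing the warning; the `nn.Linear` layer and shapes are illustrative, not the repo's model:

```python
import warnings

import torch
from torch import nn
from torch.utils.checkpoint import checkpoint

layer = nn.Linear(4, 4)

def run(x):
    # Reentrant checkpointing emits "None of the inputs have
    # requires_grad=True. Gradients will be None" at forward time
    # when no checkpointed input tensor requires grad.
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        checkpoint(layer, x, use_reentrant=True)
    return any("requires_grad=True" in str(w.message) for w in caught)

warned_frozen = run(torch.randn(2, 4))                      # input does not require grad
warned_fixed = run(torch.randn(2, 4).requires_grad_(True))  # after the usual fix
print(warned_frozen, warned_fixed)
```

When this shows up under the HF Trainer with `gradient_checkpointing=True` and frozen lower layers, the common remedy is `model.enable_input_require_grads()` (or an equivalent forward hook that makes the embedding output require grad), so the checkpointed activations stay differentiable.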
[steps 29072–29094/34278 (85%, epoch 0.85): loss 0.0853–0.1394, grad_norm 0.71–1.12, learning_rate 5.9281e-07 → 5.8791e-07, ≈3.1–4.9 s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ef9ce118c70>
Failed to fetch sample 3646658. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9ce118c70>
[steps 29095–29101/34278 (85%, epoch 0.85): loss 0.0836–0.1271, grad_norm 0.78–1.01, learning_rate 5.8769e-07 → 5.8636e-07, ≈3.1–3.7 s/it]
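The traceback above ends in the dataset's own fallback message ("Failed to fetch sample 3646658. Exception: …"), i.e. `__getitem__` catches the decode failure and the run continues rather than crashing. A hypothetical sketch of that skip-and-resample pattern, assuming each record holds raw image bytes; `pil_loader` and `RobustDataset` here are illustrative stand-ins, not the actual `aguvis/dataset.py` code:

```python
import io
import random

from PIL import Image, UnidentifiedImageError

def pil_loader(img_bytes: bytes) -> Image.Image:
    # Decode raw bytes; raises UnidentifiedImageError on corrupt data,
    # exactly as seen in the traceback above.
    buff = io.BytesIO(img_bytes)
    return Image.open(buff).convert("RGB")

class RobustDataset:
    def __init__(self, records, max_retries=10):
        self.records = records          # list of raw image byte strings
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i):
        # On a bad sample, log the failure and fall back to a random
        # other index so one corrupt image cannot kill a 34k-step run.
        for _ in range(self.max_retries):
            try:
                return pil_loader(self.records[i])
            except (UnidentifiedImageError, OSError) as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = random.randint(0, len(self.records) - 1)
        raise RuntimeError("too many consecutive bad samples")
```

Bounding the retries matters: without a cap, a dataset shard full of bad objects would loop forever inside the dataloader worker instead of surfacing an error.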
[steps 29102–29183/34278 (85%, epoch 0.85): loss 0.0879–0.1337, grad_norm 0.59–2.31, learning_rate 5.8613e-07 → 5.6829e-07, ≈3.2–4.8 s/it]
29183/34278 [32:02:27<6:44:23, 4.76s/it] 85%|████████▌ | 29184/34278 [32:02:29<5:56:29, 4.20s/it] {'loss': 0.1168, 'grad_norm': 0.8732438449305735, 'learning_rate': 5.680675441406164e-07, 'epoch': 0.85} 85%|████████▌ | 29184/34278 [32:02:29<5:56:29, 4.20s/it] 85%|████████▌ | 29185/34278 [32:02:35<6:37:55, 4.69s/it] {'loss': 0.1185, 'grad_norm': 0.8059618409293302, 'learning_rate': 5.678488524269993e-07, 'epoch': 0.85} 85%|████████▌ | 29185/34278 [32:02:35<6:37:55, 4.69s/it] 85%|████████▌ | 29186/34278 [32:02:39<6:05:51, 4.31s/it] {'loss': 0.1026, 'grad_norm': 0.9910972610017613, 'learning_rate': 5.676302002825679e-07, 'epoch': 0.85} 85%|████████▌ | 29186/34278 [32:02:39<6:05:51, 4.31s/it] 85%|████████▌ | 29187/34278 [32:02:42<5:43:24, 4.05s/it] {'loss': 0.1188, 'grad_norm': 0.950750253391563, 'learning_rate': 5.674115877092773e-07, 'epoch': 0.85} 85%|████████▌ | 29187/34278 [32:02:42<5:43:24, 4.05s/it] 85%|████████▌ | 29188/34278 [32:02:46<5:32:54, 3.92s/it] {'loss': 0.1192, 'grad_norm': 0.8802836415266408, 'learning_rate': 5.671930147090782e-07, 'epoch': 0.85} 85%|████████▌ | 29188/34278 [32:02:46<5:32:54, 3.92s/it] 85%|████████▌ | 29189/34278 [32:02:51<5:57:26, 4.21s/it] {'loss': 0.1, 'grad_norm': 0.984339877038203, 'learning_rate': 5.669744812839207e-07, 'epoch': 0.85} 85%|████████▌ | 29189/34278 [32:02:51<5:57:26, 4.21s/it] 85%|████████▌ | 29190/34278 [32:02:57<6:45:12, 4.78s/it] {'loss': 0.1205, 'grad_norm': 0.8359904446673222, 'learning_rate': 5.667559874357564e-07, 'epoch': 0.85} 85%|████████▌ | 29190/34278 [32:02:57<6:45:12, 4.78s/it] 85%|████████▌ | 29191/34278 [32:03:00<6:02:55, 4.28s/it] {'loss': 0.1327, 'grad_norm': 0.8618322585781738, 'learning_rate': 5.665375331665374e-07, 'epoch': 0.85} 85%|████████▌ | 29191/34278 [32:03:00<6:02:55, 4.28s/it] 85%|████████▌ | 29192/34278 [32:03:04<5:53:14, 4.17s/it] {'loss': 0.1171, 'grad_norm': 0.9559095642731167, 'learning_rate': 5.663191184782118e-07, 'epoch': 0.85} 85%|████████▌ | 29192/34278 [32:03:04<5:53:14, 
4.17s/it] 85%|████████▌ | 29193/34278 [32:03:08<5:44:28, 4.06s/it] {'loss': 0.1026, 'grad_norm': 0.853620565237487, 'learning_rate': 5.661007433727322e-07, 'epoch': 0.85} 85%|████████▌ | 29193/34278 [32:03:08<5:44:28, 4.06s/it] 85%|████████▌ | 29194/34278 [32:03:11<5:24:13, 3.83s/it] {'loss': 0.1141, 'grad_norm': 1.006325333801254, 'learning_rate': 5.658824078520464e-07, 'epoch': 0.85} 85%|████████▌ | 29194/34278 [32:03:11<5:24:13, 3.83s/it] 85%|████████▌ | 29195/34278 [32:03:14<5:01:24, 3.56s/it] {'loss': 0.1073, 'grad_norm': 1.0933564406113303, 'learning_rate': 5.656641119181033e-07, 'epoch': 0.85} 85%|████████▌ | 29195/34278 [32:03:14<5:01:24, 3.56s/it] 85%|████████▌ | 29196/34278 [32:03:17<4:46:45, 3.39s/it] {'loss': 0.1064, 'grad_norm': 0.8362935999240677, 'learning_rate': 5.65445855572852e-07, 'epoch': 0.85} 85%|████████▌ | 29196/34278 [32:03:17<4:46:45, 3.39s/it] 85%|████████▌ | 29197/34278 [32:03:20<4:43:20, 3.35s/it] {'loss': 0.1009, 'grad_norm': 0.7974708708563673, 'learning_rate': 5.652276388182426e-07, 'epoch': 0.85} 85%|████████▌ | 29197/34278 [32:03:20<4:43:20, 3.35s/it] 85%|████████▌ | 29198/34278 [32:03:23<4:37:14, 3.27s/it] {'loss': 0.1204, 'grad_norm': 0.8415166654775664, 'learning_rate': 5.650094616562224e-07, 'epoch': 0.85} 85%|████████▌ | 29198/34278 [32:03:23<4:37:14, 3.27s/it] 85%|████████▌ | 29199/34278 [32:03:27<4:47:05, 3.39s/it] {'loss': 0.1104, 'grad_norm': 0.815356017037793, 'learning_rate': 5.647913240887376e-07, 'epoch': 0.85} 85%|████████▌ | 29199/34278 [32:03:27<4:47:05, 3.39s/it] 85%|████████▌ | 29200/34278 [32:03:33<5:54:38, 4.19s/it] {'loss': 0.0861, 'grad_norm': 0.7089991203220957, 'learning_rate': 5.645732261177384e-07, 'epoch': 0.85} 85%|████████▌ | 29200/34278 [32:03:33<5:54:38, 4.19s/it] 85%|████████▌ | 29201/34278 [32:03:38<6:27:28, 4.58s/it] {'loss': 0.1023, 'grad_norm': 0.762593618453199, 'learning_rate': 5.643551677451703e-07, 'epoch': 0.85} 85%|████████▌ | 29201/34278 [32:03:38<6:27:28, 4.58s/it] 85%|████████▌ | 
29202/34278 [32:03:45<7:09:06, 5.07s/it] {'loss': 0.1186, 'grad_norm': 0.7838990712235988, 'learning_rate': 5.641371489729797e-07, 'epoch': 0.85} 85%|████████▌ | 29202/34278 [32:03:45<7:09:06, 5.07s/it] 85%|████████▌ | 29203/34278 [32:03:48<6:21:55, 4.52s/it] {'loss': 0.1036, 'grad_norm': 0.9898869616111717, 'learning_rate': 5.639191698031137e-07, 'epoch': 0.85} 85%|████████▌ | 29203/34278 [32:03:48<6:21:55, 4.52s/it] 85%|████████▌ | 29204/34278 [32:03:51<5:46:04, 4.09s/it] {'loss': 0.0961, 'grad_norm': 0.8933480439330093, 'learning_rate': 5.637012302375195e-07, 'epoch': 0.85} 85%|████████▌ | 29204/34278 [32:03:51<5:46:04, 4.09s/it] 85%|████████▌ | 29205/34278 [32:03:54<5:30:23, 3.91s/it] {'loss': 0.1377, 'grad_norm': 0.9400445190726446, 'learning_rate': 5.634833302781411e-07, 'epoch': 0.85} 85%|████████▌ | 29205/34278 [32:03:54<5:30:23, 3.91s/it] 85%|████████▌ | 29206/34278 [32:03:58<5:14:03, 3.72s/it] {'loss': 0.1228, 'grad_norm': 0.8242725405017413, 'learning_rate': 5.632654699269241e-07, 'epoch': 0.85} 85%|████████▌ | 29206/34278 [32:03:58<5:14:03, 3.72s/it] 85%|████████▌ | 29207/34278 [32:04:01<5:07:30, 3.64s/it] {'loss': 0.1244, 'grad_norm': 0.8976782201804555, 'learning_rate': 5.630476491858145e-07, 'epoch': 0.85} 85%|████████▌ | 29207/34278 [32:04:01<5:07:30, 3.64s/it] 85%|████████▌ | 29208/34278 [32:04:05<5:01:18, 3.57s/it] {'loss': 0.1274, 'grad_norm': 0.82601834719666, 'learning_rate': 5.628298680567556e-07, 'epoch': 0.85} 85%|████████▌ | 29208/34278 [32:04:05<5:01:18, 3.57s/it] 85%|████████▌ | 29209/34278 [32:04:08<4:56:44, 3.51s/it] {'loss': 0.0974, 'grad_norm': 0.9804919939828782, 'learning_rate': 5.626121265416917e-07, 'epoch': 0.85} 85%|████████▌ | 29209/34278 [32:04:08<4:56:44, 3.51s/it] 85%|████████▌ | 29210/34278 [32:04:11<4:43:57, 3.36s/it] {'loss': 0.1192, 'grad_norm': 1.145379006702492, 'learning_rate': 5.623944246425695e-07, 'epoch': 0.85} 85%|████████▌ | 29210/34278 [32:04:11<4:43:57, 3.36s/it] 85%|████████▌ | 29211/34278 [32:04:14<4:35:46, 
3.27s/it] {'loss': 0.0943, 'grad_norm': 0.7038569782151766, 'learning_rate': 5.621767623613294e-07, 'epoch': 0.85} 85%|████████▌ | 29211/34278 [32:04:14<4:35:46, 3.27s/it] 85%|████████▌ | 29212/34278 [32:04:17<4:30:55, 3.21s/it] {'loss': 0.1392, 'grad_norm': 0.9213798575557912, 'learning_rate': 5.619591396999158e-07, 'epoch': 0.85} 85%|████████▌ | 29212/34278 [32:04:17<4:30:55, 3.21s/it] 85%|████████▌ | 29213/34278 [32:04:22<5:11:29, 3.69s/it] {'loss': 0.1199, 'grad_norm': 0.9035633900811643, 'learning_rate': 5.617415566602718e-07, 'epoch': 0.85} 85%|████████▌ | 29213/34278 [32:04:22<5:11:29, 3.69s/it] 85%|████████▌ | 29214/34278 [32:04:25<5:04:20, 3.61s/it] {'loss': 0.1093, 'grad_norm': 0.8680700646262012, 'learning_rate': 5.61524013244339e-07, 'epoch': 0.85} 85%|████████▌ | 29214/34278 [32:04:25<5:04:20, 3.61s/it] 85%|████████▌ | 29215/34278 [32:04:32<6:15:14, 4.45s/it] {'loss': 0.1124, 'grad_norm': 0.8398015135678619, 'learning_rate': 5.613065094540615e-07, 'epoch': 0.85} 85%|████████▌ | 29215/34278 [32:04:32<6:15:14, 4.45s/it] 85%|████████▌ | 29216/34278 [32:04:35<5:42:20, 4.06s/it] {'loss': 0.1163, 'grad_norm': 0.9560727325836809, 'learning_rate': 5.610890452913787e-07, 'epoch': 0.85} 85%|████████▌ | 29216/34278 [32:04:35<5:42:20, 4.06s/it] 85%|████████▌ | 29217/34278 [32:04:38<5:19:28, 3.79s/it] {'loss': 0.0945, 'grad_norm': 0.8763948693257041, 'learning_rate': 5.608716207582338e-07, 'epoch': 0.85} 85%|████████▌ | 29217/34278 [32:04:38<5:19:28, 3.79s/it] 85%|████████▌ | 29218/34278 [32:04:44<6:24:50, 4.56s/it] {'loss': 0.1143, 'grad_norm': 1.0084153223099461, 'learning_rate': 5.606542358565681e-07, 'epoch': 0.85} 85%|████████▌ | 29218/34278 [32:04:44<6:24:50, 4.56s/it] 85%|████████▌ | 29219/34278 [32:04:48<6:02:29, 4.30s/it] {'loss': 0.1286, 'grad_norm': 0.7695030500440252, 'learning_rate': 5.6043689058832e-07, 'epoch': 0.85} 85%|████████▌ | 29219/34278 [32:04:48<6:02:29, 4.30s/it] 85%|████████▌ | 29220/34278 [32:04:52<5:49:44, 4.15s/it] {'loss': 0.1279, 
'grad_norm': 0.8144531799096898, 'learning_rate': 5.60219584955432e-07, 'epoch': 0.85} 85%|████████▌ | 29220/34278 [32:04:52<5:49:44, 4.15s/it] 85%|████████▌ | 29221/34278 [32:04:55<5:18:26, 3.78s/it] {'loss': 0.1185, 'grad_norm': 1.0118076535744631, 'learning_rate': 5.600023189598442e-07, 'epoch': 0.85} 85%|████████▌ | 29221/34278 [32:04:55<5:18:26, 3.78s/it] 85%|████████▌ | 29222/34278 [32:04:59<5:39:59, 4.03s/it] {'loss': 0.1265, 'grad_norm': 0.895815978752632, 'learning_rate': 5.597850926034954e-07, 'epoch': 0.85} 85%|████████▌ | 29222/34278 [32:04:59<5:39:59, 4.03s/it] 85%|████████▌ | 29223/34278 [32:05:02<5:15:08, 3.74s/it] {'loss': 0.1147, 'grad_norm': 0.9172450629512476, 'learning_rate': 5.595679058883257e-07, 'epoch': 0.85} 85%|████████▌ | 29223/34278 [32:05:02<5:15:08, 3.74s/it] 85%|████████▌ | 29224/34278 [32:05:07<5:42:58, 4.07s/it] {'loss': 0.1096, 'grad_norm': 0.7653174122887514, 'learning_rate': 5.593507588162739e-07, 'epoch': 0.85} 85%|████████▌ | 29224/34278 [32:05:07<5:42:58, 4.07s/it] 85%|████████▌ | 29225/34278 [32:05:13<6:15:57, 4.46s/it] {'loss': 0.102, 'grad_norm': 0.7425071369559749, 'learning_rate': 5.591336513892776e-07, 'epoch': 0.85} 85%|████████▌ | 29225/34278 [32:05:13<6:15:57, 4.46s/it] 85%|████████▌ | 29226/34278 [32:05:17<6:05:05, 4.34s/it] {'loss': 0.1053, 'grad_norm': 0.6504027414778139, 'learning_rate': 5.589165836092759e-07, 'epoch': 0.85} 85%|████████▌ | 29226/34278 [32:05:17<6:05:05, 4.34s/it] 85%|████████▌ | 29227/34278 [32:05:21<6:00:36, 4.28s/it] {'loss': 0.132, 'grad_norm': 1.1333888888884196, 'learning_rate': 5.586995554782076e-07, 'epoch': 0.85} 85%|████████▌ | 29227/34278 [32:05:21<6:00:36, 4.28s/it] 85%|████████▌ | 29228/34278 [32:05:24<5:35:19, 3.98s/it] {'loss': 0.12, 'grad_norm': 0.9105737631709079, 'learning_rate': 5.584825669980098e-07, 'epoch': 0.85} 85%|████████▌ | 29228/34278 [32:05:24<5:35:19, 3.98s/it] 85%|████████▌ | 29229/34278 [32:05:27<5:11:03, 3.70s/it] {'loss': 0.089, 'grad_norm': 0.9115820786569032, 
'learning_rate': 5.582656181706181e-07, 'epoch': 0.85} 85%|████████▌ | 29229/34278 [32:05:27<5:11:03, 3.70s/it] 85%|████████▌ | 29230/34278 [32:05:30<4:52:31, 3.48s/it] {'loss': 0.1097, 'grad_norm': 0.968959144874335, 'learning_rate': 5.58048708997972e-07, 'epoch': 0.85} 85%|████████▌ | 29230/34278 [32:05:30<4:52:31, 3.48s/it] 85%|████████▌ | 29231/34278 [32:05:33<4:44:02, 3.38s/it] {'loss': 0.1077, 'grad_norm': 1.0005124495851303, 'learning_rate': 5.578318394820053e-07, 'epoch': 0.85} 85%|████████▌ | 29231/34278 [32:05:33<4:44:02, 3.38s/it] 85%|████████▌ | 29232/34278 [32:05:37<4:41:14, 3.34s/it] {'loss': 0.1147, 'grad_norm': 0.9823464381116843, 'learning_rate': 5.576150096246563e-07, 'epoch': 0.85} 85%|████████▌ | 29232/34278 [32:05:37<4:41:14, 3.34s/it] 85%|████████▌ | 29233/34278 [32:05:42<5:45:05, 4.10s/it] {'loss': 0.1203, 'grad_norm': 0.7421311343553169, 'learning_rate': 5.573982194278594e-07, 'epoch': 0.85} 85%|████████▌ | 29233/34278 [32:05:42<5:45:05, 4.10s/it] 85%|████████▌ | 29234/34278 [32:05:46<5:19:44, 3.80s/it] {'loss': 0.1131, 'grad_norm': 0.9075214868246244, 'learning_rate': 5.571814688935517e-07, 'epoch': 0.85} 85%|████████▌ | 29234/34278 [32:05:46<5:19:44, 3.80s/it] 85%|████████▌ | 29235/34278 [32:05:48<4:53:09, 3.49s/it] {'loss': 0.1174, 'grad_norm': 0.9861239596102009, 'learning_rate': 5.56964758023667e-07, 'epoch': 0.85} 85%|████████▌ | 29235/34278 [32:05:48<4:53:09, 3.49s/it] 85%|████████▌ | 29236/34278 [32:05:54<5:50:48, 4.17s/it] {'loss': 0.1481, 'grad_norm': 1.2044479152443401, 'learning_rate': 5.567480868201397e-07, 'epoch': 0.85} 85%|████████▌ | 29236/34278 [32:05:54<5:50:48, 4.17s/it] 85%|████████▌ | 29237/34278 [32:05:57<5:30:56, 3.94s/it] {'loss': 0.1024, 'grad_norm': 0.72138023383901, 'learning_rate': 5.565314552849044e-07, 'epoch': 0.85} 85%|████████▌ | 29237/34278 [32:05:57<5:30:56, 3.94s/it] 85%|████████▌ | 29238/34278 [32:06:00<5:03:39, 3.62s/it] {'loss': 0.1459, 'grad_norm': 0.9495747437271533, 'learning_rate': 
5.563148634198967e-07, 'epoch': 0.85} 85%|████████▌ | 29238/34278 [32:06:00<5:03:39, 3.62s/it] 85%|████████▌ | 29239/34278 [32:06:04<5:06:59, 3.66s/it] {'loss': 0.1078, 'grad_norm': 1.001649971644289, 'learning_rate': 5.560983112270479e-07, 'epoch': 0.85} 85%|████████▌ | 29239/34278 [32:06:04<5:06:59, 3.66s/it] 85%|████████▌ | 29240/34278 [32:06:07<4:52:44, 3.49s/it] {'loss': 0.1111, 'grad_norm': 0.9833936050822271, 'learning_rate': 5.558817987082937e-07, 'epoch': 0.85} 85%|████████▌ | 29240/34278 [32:06:07<4:52:44, 3.49s/it] 85%|████████▌ | 29241/34278 [32:06:10<4:38:19, 3.32s/it] {'loss': 0.11, 'grad_norm': 0.7548229031177789, 'learning_rate': 5.556653258655659e-07, 'epoch': 0.85} 85%|████████▌ | 29241/34278 [32:06:10<4:38:19, 3.32s/it] 85%|████████▌ | 29242/34278 [32:06:15<5:13:06, 3.73s/it] {'loss': 0.0984, 'grad_norm': 0.9911272248553084, 'learning_rate': 5.554488927007961e-07, 'epoch': 0.85} 85%|████████▌ | 29242/34278 [32:06:15<5:13:06, 3.73s/it] 85%|████████▌ | 29243/34278 [32:06:18<5:06:23, 3.65s/it] {'loss': 0.0982, 'grad_norm': 0.7981739377756809, 'learning_rate': 5.552324992159175e-07, 'epoch': 0.85} 85%|████████▌ | 29243/34278 [32:06:18<5:06:23, 3.65s/it] 85%|████████▌ | 29244/34278 [32:06:21<4:56:21, 3.53s/it] {'loss': 0.1243, 'grad_norm': 0.7470635185418462, 'learning_rate': 5.550161454128633e-07, 'epoch': 0.85} 85%|████████▌ | 29244/34278 [32:06:21<4:56:21, 3.53s/it] 85%|████████▌ | 29245/34278 [32:06:25<4:50:59, 3.47s/it] {'loss': 0.1326, 'grad_norm': 0.884584534357942, 'learning_rate': 5.547998312935637e-07, 'epoch': 0.85} 85%|████████▌ | 29245/34278 [32:06:25<4:50:59, 3.47s/it] 85%|████████▌ | 29246/34278 [32:06:27<4:28:11, 3.20s/it] {'loss': 0.1056, 'grad_norm': 1.1726197074415614, 'learning_rate': 5.545835568599489e-07, 'epoch': 0.85} 85%|████████▌ | 29246/34278 [32:06:27<4:28:11, 3.20s/it] 85%|████████▌ | 29247/34278 [32:06:30<4:23:25, 3.14s/it] {'loss': 0.1449, 'grad_norm': 0.7488514684875713, 'learning_rate': 5.543673221139517e-07, 'epoch': 
0.85} 85%|████████▌ | 29247/34278 [32:06:30<4:23:25, 3.14s/it] 85%|████████▌ | 29248/34278 [32:06:33<4:19:44, 3.10s/it] {'loss': 0.1172, 'grad_norm': 0.8045346970530131, 'learning_rate': 5.541511270575023e-07, 'epoch': 0.85} 85%|████████▌ | 29248/34278 [32:06:33<4:19:44, 3.10s/it] 85%|████████▌ | 29249/34278 [32:06:36<4:09:09, 2.97s/it] {'loss': 0.0989, 'grad_norm': 1.257946166363082, 'learning_rate': 5.539349716925285e-07, 'epoch': 0.85} 85%|████████▌ | 29249/34278 [32:06:36<4:09:09, 2.97s/it] 85%|████████▌ | 29250/34278 [32:06:40<4:35:07, 3.28s/it] {'loss': 0.1055, 'grad_norm': 0.8544562436703373, 'learning_rate': 5.537188560209633e-07, 'epoch': 0.85} 85%|████████▌ | 29250/34278 [32:06:40<4:35:07, 3.28s/it] 85%|████████▌ | 29251/34278 [32:06:43<4:31:52, 3.24s/it] {'loss': 0.1372, 'grad_norm': 0.7590419976067295, 'learning_rate': 5.535027800447351e-07, 'epoch': 0.85} 85%|████████▌ | 29251/34278 [32:06:43<4:31:52, 3.24s/it] 85%|████████▌ | 29252/34278 [32:06:46<4:22:50, 3.14s/it] {'loss': 0.092, 'grad_norm': 0.9760112363389173, 'learning_rate': 5.532867437657718e-07, 'epoch': 0.85} 85%|████████▌ | 29252/34278 [32:06:46<4:22:50, 3.14s/it] 85%|████████▌ | 29253/34278 [32:06:49<4:14:27, 3.04s/it] {'loss': 0.1023, 'grad_norm': 1.0380472022167764, 'learning_rate': 5.530707471860036e-07, 'epoch': 0.85} 85%|████████▌ | 29253/34278 [32:06:49<4:14:27, 3.04s/it] 85%|████████▌ | 29254/34278 [32:06:53<4:50:11, 3.47s/it] {'loss': 0.1117, 'grad_norm': 1.066616929378313, 'learning_rate': 5.528547903073583e-07, 'epoch': 0.85} 85%|████████▌ | 29254/34278 [32:06:53<4:50:11, 3.47s/it] 85%|████████▌ | 29255/34278 [32:06:58<5:30:13, 3.94s/it] {'loss': 0.0984, 'grad_norm': 0.6603745490932088, 'learning_rate': 5.526388731317627e-07, 'epoch': 0.85} 85%|████████▌ | 29255/34278 [32:06:58<5:30:13, 3.94s/it] 85%|████████▌ | 29256/34278 [32:07:01<5:04:26, 3.64s/it] {'loss': 0.1125, 'grad_norm': 0.7552520615868892, 'learning_rate': 5.524229956611454e-07, 'epoch': 0.85} 85%|████████▌ | 
29256/34278 [32:07:01<5:04:26, 3.64s/it] 85%|████████▌ | 29257/34278 [32:07:05<5:07:43, 3.68s/it] {'loss': 0.1077, 'grad_norm': 0.8156985411177039, 'learning_rate': 5.522071578974353e-07, 'epoch': 0.85} 85%|████████▌ | 29257/34278 [32:07:05<5:07:43, 3.68s/it] 85%|████████▌ | 29258/34278 [32:07:09<5:07:44, 3.68s/it] {'loss': 0.11, 'grad_norm': 0.6509190477659933, 'learning_rate': 5.51991359842558e-07, 'epoch': 0.85} 85%|████████▌ | 29258/34278 [32:07:09<5:07:44, 3.68s/it] 85%|████████▌ | 29259/34278 [32:07:13<5:09:06, 3.70s/it] {'loss': 0.1196, 'grad_norm': 0.81201383586702, 'learning_rate': 5.517756014984388e-07, 'epoch': 0.85} 85%|████████▌ | 29259/34278 [32:07:13<5:09:06, 3.70s/it] 85%|████████▌ | 29260/34278 [32:07:15<4:47:04, 3.43s/it] {'loss': 0.135, 'grad_norm': 0.7430861923950558, 'learning_rate': 5.51559882867006e-07, 'epoch': 0.85} 85%|████████▌ | 29260/34278 [32:07:15<4:47:04, 3.43s/it] 85%|████████▌ | 29261/34278 [32:07:20<5:19:03, 3.82s/it] {'loss': 0.1079, 'grad_norm': 1.1002617405827813, 'learning_rate': 5.513442039501837e-07, 'epoch': 0.85} 85%|████████▌ | 29261/34278 [32:07:20<5:19:03, 3.82s/it] 85%|████████▌ | 29262/34278 [32:07:24<5:22:36, 3.86s/it] {'loss': 0.1157, 'grad_norm': 1.135539187412405, 'learning_rate': 5.511285647498993e-07, 'epoch': 0.85} 85%|████████▌ | 29262/34278 [32:07:24<5:22:36, 3.86s/it] 85%|████████▌ | 29263/34278 [32:07:27<5:04:49, 3.65s/it] {'loss': 0.1466, 'grad_norm': 1.1742054621140416, 'learning_rate': 5.509129652680761e-07, 'epoch': 0.85} 85%|████████▌ | 29263/34278 [32:07:27<5:04:49, 3.65s/it] 85%|████████▌ | 29264/34278 [32:07:30<4:45:18, 3.41s/it] {'loss': 0.0934, 'grad_norm': 0.8570858075514588, 'learning_rate': 5.506974055066411e-07, 'epoch': 0.85} 85%|████████▌ | 29264/34278 [32:07:30<4:45:18, 3.41s/it] 85%|████████▌ | 29265/34278 [32:07:34<5:01:54, 3.61s/it] {'loss': 0.118, 'grad_norm': 0.6999114656213908, 'learning_rate': 5.504818854675176e-07, 'epoch': 0.85} 85%|████████▌ | 29265/34278 [32:07:34<5:01:54, 
3.61s/it] 85%|████████▌ | 29266/34278 [32:07:37<4:50:44, 3.48s/it] {'loss': 0.0964, 'grad_norm': 0.7353575055464785, 'learning_rate': 5.502664051526285e-07, 'epoch': 0.85} 85%|████████▌ | 29266/34278 [32:07:37<4:50:44, 3.48s/it] 85%|████████▌ | 29267/34278 [32:07:40<4:36:45, 3.31s/it] {'loss': 0.0923, 'grad_norm': 0.8843302451918854, 'learning_rate': 5.500509645638985e-07, 'epoch': 0.85} 85%|████████▌ | 29267/34278 [32:07:40<4:36:45, 3.31s/it] 85%|████████▌ | 29268/34278 [32:07:46<5:43:22, 4.11s/it] {'loss': 0.0977, 'grad_norm': 0.8845497856311907, 'learning_rate': 5.498355637032521e-07, 'epoch': 0.85} 85%|████████▌ | 29268/34278 [32:07:46<5:43:22, 4.11s/it] 85%|████████▌ | 29269/34278 [32:07:51<5:56:10, 4.27s/it] {'loss': 0.1068, 'grad_norm': 0.8000443880087532, 'learning_rate': 5.49620202572611e-07, 'epoch': 0.85} 85%|████████▌ | 29269/34278 [32:07:51<5:56:10, 4.27s/it] 85%|████████▌ | 29270/34278 [32:07:55<5:48:43, 4.18s/it] {'loss': 0.1145, 'grad_norm': 0.7903037974342727, 'learning_rate': 5.494048811738989e-07, 'epoch': 0.85} 85%|████████▌ | 29270/34278 [32:07:55<5:48:43, 4.18s/it] 85%|████████▌ | 29271/34278 [32:08:01<6:29:51, 4.67s/it] {'loss': 0.0961, 'grad_norm': 0.9523686122631562, 'learning_rate': 5.491895995090374e-07, 'epoch': 0.85} 85%|████████▌ | 29271/34278 [32:08:01<6:29:51, 4.67s/it] 85%|████████▌ | 29272/34278 [32:08:05<6:31:36, 4.69s/it] {'loss': 0.1321, 'grad_norm': 1.2468134396418666, 'learning_rate': 5.489743575799483e-07, 'epoch': 0.85} 85%|████████▌ | 29272/34278 [32:08:05<6:31:36, 4.69s/it] 85%|████████▌ | 29273/34278 [32:08:09<6:04:22, 4.37s/it] {'loss': 0.1158, 'grad_norm': 0.9434175407452533, 'learning_rate': 5.48759155388553e-07, 'epoch': 0.85} 85%|████████▌ | 29273/34278 [32:08:09<6:04:22, 4.37s/it] 85%|████████▌ | 29274/34278 [32:08:12<5:31:53, 3.98s/it] {'loss': 0.104, 'grad_norm': 1.1145933180924381, 'learning_rate': 5.485439929367748e-07, 'epoch': 0.85} 85%|████████▌ | 29274/34278 [32:08:12<5:31:53, 3.98s/it] 85%|████████▌ | 
29275/34278 [32:08:15<5:07:59, 3.69s/it] {'loss': 0.1093, 'grad_norm': 0.7761601853374593, 'learning_rate': 5.483288702265327e-07, 'epoch': 0.85} 85%|████████▌ | 29275/34278 [32:08:15<5:07:59, 3.69s/it] 85%|████████▌ | 29276/34278 [32:08:18<5:00:35, 3.61s/it] {'loss': 0.1158, 'grad_norm': 0.6857080858149806, 'learning_rate': 5.481137872597469e-07, 'epoch': 0.85} 85%|████████▌ | 29276/34278 [32:08:18<5:00:35, 3.61s/it] 85%|████████▌ | 29277/34278 [32:08:22<4:58:19, 3.58s/it] {'loss': 0.1361, 'grad_norm': 0.9622116621490467, 'learning_rate': 5.478987440383399e-07, 'epoch': 0.85} 85%|████████▌ | 29277/34278 [32:08:22<4:58:19, 3.58s/it] 85%|████████▌ | 29278/34278 [32:08:26<5:00:42, 3.61s/it] {'loss': 0.103, 'grad_norm': 0.8826381949857007, 'learning_rate': 5.476837405642299e-07, 'epoch': 0.85} 85%|████████▌ | 29278/34278 [32:08:26<5:00:42, 3.61s/it] 85%|████████▌ | 29279/34278 [32:08:29<4:52:02, 3.51s/it] {'loss': 0.1357, 'grad_norm': 0.8971376210726807, 'learning_rate': 5.474687768393344e-07, 'epoch': 0.85} 85%|████████▌ | 29279/34278 [32:08:29<4:52:02, 3.51s/it] 85%|████████▌ | 29280/34278 [32:08:32<4:36:25, 3.32s/it] {'loss': 0.1181, 'grad_norm': 1.140063921540892, 'learning_rate': 5.472538528655769e-07, 'epoch': 0.85} 85%|████████▌ | 29280/34278 [32:08:32<4:36:25, 3.32s/it] 85%|████████▌ | 29281/34278 [32:08:35<4:43:30, 3.40s/it] {'loss': 0.0968, 'grad_norm': 0.7401249554485791, 'learning_rate': 5.47038968644874e-07, 'epoch': 0.85} 85%|████████▌ | 29281/34278 [32:08:35<4:43:30, 3.40s/it] 85%|████████▌ | 29282/34278 [32:08:39<4:57:09, 3.57s/it] {'loss': 0.0871, 'grad_norm': 0.69552968201201, 'learning_rate': 5.468241241791428e-07, 'epoch': 0.85} 85%|████████▌ | 29282/34278 [32:08:39<4:57:09, 3.57s/it] 85%|████████▌ | 29283/34278 [32:08:45<5:43:15, 4.12s/it] {'loss': 0.1062, 'grad_norm': 0.9130804690861541, 'learning_rate': 5.466093194703043e-07, 'epoch': 0.85} 85%|████████▌ | 29283/34278 [32:08:45<5:43:15, 4.12s/it] 85%|████████▌ | 29284/34278 [32:08:48<5:09:37, 
3.72s/it] {'loss': 0.1248, 'grad_norm': 1.1165299293809114, 'learning_rate': 5.463945545202748e-07, 'epoch': 0.85} 85%|████████▌ | 29284/34278 [32:08:48<5:09:37, 3.72s/it] 85%|████████▌ | 29285/34278 [32:08:51<4:55:49, 3.55s/it] {'loss': 0.1211, 'grad_norm': 1.119356456817356, 'learning_rate': 5.4617982933097e-07, 'epoch': 0.85} 85%|████████▌ | 29285/34278 [32:08:51<4:55:49, 3.55s/it] 85%|████████▌ | 29286/34278 [32:08:56<5:33:41, 4.01s/it] {'loss': 0.1175, 'grad_norm': 0.7619026129903873, 'learning_rate': 5.45965143904309e-07, 'epoch': 0.85} 85%|████████▌ | 29286/34278 [32:08:56<5:33:41, 4.01s/it] 85%|████████▌ | 29287/34278 [32:09:02<6:22:21, 4.60s/it] {'loss': 0.1192, 'grad_norm': 1.115115578102341, 'learning_rate': 5.457504982422085e-07, 'epoch': 0.85} 85%|████████▌ | 29287/34278 [32:09:02<6:22:21, 4.60s/it] 85%|████████▌ | 29288/34278 [32:09:04<5:34:09, 4.02s/it] {'loss': 0.1106, 'grad_norm': 0.967813089467639, 'learning_rate': 5.455358923465843e-07, 'epoch': 0.85} 85%|████████▌ | 29288/34278 [32:09:04<5:34:09, 4.02s/it] 85%|████████▌ | 29289/34278 [32:09:08<5:31:06, 3.98s/it] {'loss': 0.1132, 'grad_norm': 2.7302747049786613, 'learning_rate': 5.453213262193513e-07, 'epoch': 0.85} 85%|████████▌ | 29289/34278 [32:09:08<5:31:06, 3.98s/it] 85%|████████▌ | 29290/34278 [32:09:14<6:18:31, 4.55s/it] {'loss': 0.1049, 'grad_norm': 0.9810793593185674, 'learning_rate': 5.451067998624276e-07, 'epoch': 0.85} 85%|████████▌ | 29290/34278 [32:09:14<6:18:31, 4.55s/it] 85%|████████▌ | 29291/34278 [32:09:17<5:42:27, 4.12s/it] {'loss': 0.1206, 'grad_norm': 0.8378819714164876, 'learning_rate': 5.448923132777256e-07, 'epoch': 0.85} 85%|████████▌ | 29291/34278 [32:09:17<5:42:27, 4.12s/it] 85%|████████▌ | 29292/34278 [32:09:23<6:24:54, 4.63s/it] {'loss': 0.1264, 'grad_norm': 1.0451784392384882, 'learning_rate': 5.44677866467162e-07, 'epoch': 0.85} 85%|████████▌ | 29292/34278 [32:09:23<6:24:54, 4.63s/it] 85%|████████▌ | 29293/34278 [32:09:26<5:43:49, 4.14s/it] {'loss': 0.0946, 
'grad_norm': 0.8355508850984268, 'learning_rate': 5.444634594326503e-07, 'epoch': 0.85} 85%|████████▌ | 29293/34278 [32:09:26<5:43:49, 4.14s/it] 85%|████████▌ | 29294/34278 [32:09:30<5:42:05, 4.12s/it] {'loss': 0.1121, 'grad_norm': 0.8607237234297561, 'learning_rate': 5.442490921761062e-07, 'epoch': 0.85} 85%|████████▌ | 29294/34278 [32:09:30<5:42:05, 4.12s/it] 85%|████████▌ | 29295/34278 [32:09:33<5:12:55, 3.77s/it] {'loss': 0.1262, 'grad_norm': 0.7746667125089692, 'learning_rate': 5.440347646994426e-07, 'epoch': 0.85} 85%|████████▌ | 29295/34278 [32:09:33<5:12:55, 3.77s/it] 85%|████████▌ | 29296/34278 [32:09:36<4:57:59, 3.59s/it] {'loss': 0.1231, 'grad_norm': 0.9918799982651284, 'learning_rate': 5.438204770045719e-07, 'epoch': 0.85} 85%|████████▌ | 29296/34278 [32:09:36<4:57:59, 3.59s/it] 85%|████████▌ | 29297/34278 [32:09:42<5:46:46, 4.18s/it] {'loss': 0.1094, 'grad_norm': 0.8050739299835314, 'learning_rate': 5.43606229093408e-07, 'epoch': 0.85} 85%|████████▌ | 29297/34278 [32:09:42<5:46:46, 4.18s/it] 85%|████████▌ | 29298/34278 [32:09:47<6:15:55, 4.53s/it] {'loss': 0.1074, 'grad_norm': 0.6753780555823767, 'learning_rate': 5.433920209678651e-07, 'epoch': 0.85} 85%|████████▌ | 29298/34278 [32:09:47<6:15:55, 4.53s/it] 85%|████████▌ | 29299/34278 [32:09:53<6:49:16, 4.93s/it] {'loss': 0.1059, 'grad_norm': 0.7884651155454309, 'learning_rate': 5.431778526298531e-07, 'epoch': 0.85} 85%|████████▌ | 29299/34278 [32:09:53<6:49:16, 4.93s/it] 85%|████████▌ | 29300/34278 [32:09:57<6:26:50, 4.66s/it] {'loss': 0.1112, 'grad_norm': 0.8300237753886387, 'learning_rate': 5.429637240812863e-07, 'epoch': 0.85} 85%|████████▌ | 29300/34278 [32:09:57<6:26:50, 4.66s/it] 85%|████████▌ | 29301/34278 [32:10:00<5:52:10, 4.25s/it] {'loss': 0.1101, 'grad_norm': 0.9567400843568022, 'learning_rate': 5.427496353240757e-07, 'epoch': 0.85} 85%|████████▌ | 29301/34278 [32:10:00<5:52:10, 4.25s/it] 85%|████████▌ | 29302/34278 [32:10:06<6:21:00, 4.59s/it] {'loss': 0.1172, 'grad_norm': 
0.9788802429972455, 'learning_rate': 5.425355863601311e-07, 'epoch': 0.85} 85%|████████▌ | 29302/34278 [32:10:06<6:21:00, 4.59s/it] 85%|████████▌ | 29303/34278 [32:10:10<6:06:47, 4.42s/it] {'loss': 0.0997, 'grad_norm': 0.7857087629895083, 'learning_rate': 5.423215771913648e-07, 'epoch': 0.85} 85%|████████▌ | 29303/34278 [32:10:10<6:06:47, 4.42s/it] 85%|████████▌ | 29304/34278 [32:10:15<6:23:34, 4.63s/it] {'loss': 0.1066, 'grad_norm': 1.191233660332849, 'learning_rate': 5.42107607819688e-07, 'epoch': 0.85} 85%|████████▌ | 29304/34278 [32:10:15<6:23:34, 4.63s/it] 85%|████████▌ | 29305/34278 [32:10:19<6:06:48, 4.43s/it] {'loss': 0.1357, 'grad_norm': 0.9282773544238075, 'learning_rate': 5.418936782470108e-07, 'epoch': 0.85} 85%|████████▌ | 29305/34278 [32:10:19<6:06:48, 4.43s/it] 85%|████████▌ | 29306/34278 [32:10:23<5:52:44, 4.26s/it] {'loss': 0.0964, 'grad_norm': 0.8048667103505373, 'learning_rate': 5.416797884752412e-07, 'epoch': 0.85} 85%|████████▌ | 29306/34278 [32:10:23<5:52:44, 4.26s/it] 85%|████████▌ | 29307/34278 [32:10:26<5:21:26, 3.88s/it] {'loss': 0.0757, 'grad_norm': 0.6759841812231248, 'learning_rate': 5.414659385062915e-07, 'epoch': 0.85} 85%|████████▌ | 29307/34278 [32:10:26<5:21:26, 3.88s/it] 86%|████████▌ | 29308/34278 [32:10:31<6:06:50, 4.43s/it] {'loss': 0.1249, 'grad_norm': 0.8143535130999393, 'learning_rate': 5.412521283420691e-07, 'epoch': 0.86} 86%|████████▌ | 29308/34278 [32:10:31<6:06:50, 4.43s/it] 86%|████████▌ | 29309/34278 [32:10:35<5:42:12, 4.13s/it] {'loss': 0.1434, 'grad_norm': 1.0261914213636394, 'learning_rate': 5.410383579844819e-07, 'epoch': 0.86} 86%|████████▌ | 29309/34278 [32:10:35<5:42:12, 4.13s/it] 86%|████████▌ | 29310/34278 [32:10:38<5:04:20, 3.68s/it] {'loss': 0.1141, 'grad_norm': 0.9932610365277973, 'learning_rate': 5.408246274354412e-07, 'epoch': 0.86} 86%|████████▌ | 29310/34278 [32:10:38<5:04:20, 3.68s/it] 86%|████████▌ | 29311/34278 [32:10:41<4:56:26, 3.58s/it] {'loss': 0.0952, 'grad_norm': 0.8289037734445469, 
[training log, condensed: duplicated tqdm progress-bar redraws deduplicated]
Steps 29311-29498 / 34278 (86%, epoch 0.86); elapsed 32:10:41 -> 32:22:32; ~3.0-4.9 s/it; ETA ~4-7 h.
loss: fluctuating ~0.084-0.149; grad_norm: ~0.65-1.70; learning_rate: decaying 5.406109366968542e-07 -> 5.015588641318941e-07.
UserWarning raised twice (after steps 29313 and 29461) from /mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: "None of the inputs have requires_grad=True. Gradients will be None"
{'loss': 0.1152, 'grad_norm': 0.8457754040744747, 'learning_rate': 5.013526510947986e-07, 'epoch': 0.86} 86%|████████▌ | 29498/34278 [32:22:32<4:38:46, 3.50s/it] 86%|████████▌ | 29499/34278 [32:22:35<4:30:53, 3.40s/it] {'loss': 0.102, 'grad_norm': 0.9662646942978871, 'learning_rate': 5.011464782205511e-07, 'epoch': 0.86} 86%|████████▌ | 29499/34278 [32:22:35<4:30:53, 3.40s/it] 86%|████████▌ | 29500/34278 [32:22:38<4:26:36, 3.35s/it] {'loss': 0.106, 'grad_norm': 1.363511336845176, 'learning_rate': 5.009403455109946e-07, 'epoch': 0.86} 86%|████████▌ | 29500/34278 [32:22:38<4:26:36, 3.35s/it] 86%|████████▌ | 29501/34278 [32:22:41<4:23:54, 3.31s/it] {'loss': 0.1108, 'grad_norm': 0.7929904353469331, 'learning_rate': 5.007342529679693e-07, 'epoch': 0.86} 86%|████████▌ | 29501/34278 [32:22:41<4:23:54, 3.31s/it] 86%|████████▌ | 29502/34278 [32:22:44<4:12:47, 3.18s/it] {'loss': 0.1041, 'grad_norm': 0.7810159069157689, 'learning_rate': 5.005282005933148e-07, 'epoch': 0.86} 86%|████████▌ | 29502/34278 [32:22:44<4:12:47, 3.18s/it] 86%|████████▌ | 29503/34278 [32:22:50<5:15:52, 3.97s/it] {'loss': 0.1139, 'grad_norm': 0.9411311208768363, 'learning_rate': 5.003221883888692e-07, 'epoch': 0.86} 86%|████████▌ | 29503/34278 [32:22:50<5:15:52, 3.97s/it] 86%|████████▌ | 29504/34278 [32:22:56<6:03:10, 4.56s/it] {'loss': 0.1218, 'grad_norm': 0.901259545584759, 'learning_rate': 5.001162163564738e-07, 'epoch': 0.86} 86%|████████▌ | 29504/34278 [32:22:56<6:03:10, 4.56s/it] 86%|████████▌ | 29505/34278 [32:23:00<5:43:44, 4.32s/it] {'loss': 0.1082, 'grad_norm': 0.7228034390535022, 'learning_rate': 4.99910284497967e-07, 'epoch': 0.86} 86%|████████▌ | 29505/34278 [32:23:00<5:43:44, 4.32s/it] 86%|████████▌ | 29506/34278 [32:23:03<5:15:13, 3.96s/it] {'loss': 0.1067, 'grad_norm': 0.9359231392219072, 'learning_rate': 4.997043928151851e-07, 'epoch': 0.86} 86%|████████▌ | 29506/34278 [32:23:03<5:15:13, 3.96s/it] 86%|████████▌ | 29507/34278 [32:23:09<5:59:04, 4.52s/it] {'loss': 0.0999, 'grad_norm': 
0.7750835167395633, 'learning_rate': 4.994985413099695e-07, 'epoch': 0.86} 86%|████████▌ | 29507/34278 [32:23:09<5:59:04, 4.52s/it] 86%|████████▌ | 29508/34278 [32:23:15<6:32:57, 4.94s/it] {'loss': 0.1321, 'grad_norm': 0.9282258491681282, 'learning_rate': 4.992927299841566e-07, 'epoch': 0.86} 86%|████████▌ | 29508/34278 [32:23:15<6:32:57, 4.94s/it] 86%|████████▌ | 29509/34278 [32:23:18<5:48:36, 4.39s/it] {'loss': 0.1138, 'grad_norm': 0.8180139497002444, 'learning_rate': 4.99086958839583e-07, 'epoch': 0.86} 86%|████████▌ | 29509/34278 [32:23:18<5:48:36, 4.39s/it] 86%|████████▌ | 29510/34278 [32:23:21<5:13:14, 3.94s/it] {'loss': 0.1194, 'grad_norm': 1.0599843840977938, 'learning_rate': 4.988812278780875e-07, 'epoch': 0.86} 86%|████████▌ | 29510/34278 [32:23:21<5:13:14, 3.94s/it] 86%|████████▌ | 29511/34278 [32:23:27<6:10:06, 4.66s/it] {'loss': 0.1146, 'grad_norm': 0.7357831173012828, 'learning_rate': 4.986755371015062e-07, 'epoch': 0.86} 86%|████████▌ | 29511/34278 [32:23:27<6:10:06, 4.66s/it] 86%|████████▌ | 29512/34278 [32:23:30<5:29:19, 4.15s/it] {'loss': 0.1137, 'grad_norm': 0.7365994357681691, 'learning_rate': 4.984698865116739e-07, 'epoch': 0.86} 86%|████████▌ | 29512/34278 [32:23:30<5:29:19, 4.15s/it] 86%|████████▌ | 29513/34278 [32:23:33<5:03:51, 3.83s/it] {'loss': 0.1135, 'grad_norm': 0.8982152257623877, 'learning_rate': 4.982642761104279e-07, 'epoch': 0.86} 86%|████████▌ | 29513/34278 [32:23:33<5:03:51, 3.83s/it] 86%|████████▌ | 29514/34278 [32:23:36<4:50:44, 3.66s/it] {'loss': 0.1015, 'grad_norm': 0.7699343204703459, 'learning_rate': 4.980587058996044e-07, 'epoch': 0.86} 86%|████████▌ | 29514/34278 [32:23:36<4:50:44, 3.66s/it] 86%|████████▌ | 29515/34278 [32:23:40<4:50:12, 3.66s/it] {'loss': 0.1134, 'grad_norm': 0.7885245394304854, 'learning_rate': 4.978531758810385e-07, 'epoch': 0.86} 86%|████████▌ | 29515/34278 [32:23:40<4:50:12, 3.66s/it] 86%|████████▌ | 29516/34278 [32:23:43<4:30:20, 3.41s/it] {'loss': 0.0949, 'grad_norm': 0.7656299112574463, 
'learning_rate': 4.976476860565638e-07, 'epoch': 0.86} 86%|████████▌ | 29516/34278 [32:23:43<4:30:20, 3.41s/it] 86%|████████▌ | 29517/34278 [32:23:46<4:33:50, 3.45s/it] {'loss': 0.1144, 'grad_norm': 0.9263050205599165, 'learning_rate': 4.974422364280169e-07, 'epoch': 0.86} 86%|████████▌ | 29517/34278 [32:23:46<4:33:50, 3.45s/it] 86%|████████▌ | 29518/34278 [32:23:50<4:31:43, 3.43s/it] {'loss': 0.1032, 'grad_norm': 0.8468985385132534, 'learning_rate': 4.972368269972294e-07, 'epoch': 0.86} 86%|████████▌ | 29518/34278 [32:23:50<4:31:43, 3.43s/it] 86%|████████▌ | 29519/34278 [32:23:53<4:22:27, 3.31s/it] {'loss': 0.1014, 'grad_norm': 0.823224175773147, 'learning_rate': 4.970314577660379e-07, 'epoch': 0.86} 86%|████████▌ | 29519/34278 [32:23:53<4:22:27, 3.31s/it] 86%|████████▌ | 29520/34278 [32:23:56<4:16:09, 3.23s/it] {'loss': 0.1455, 'grad_norm': 0.8047470667383144, 'learning_rate': 4.96826128736273e-07, 'epoch': 0.86} 86%|████████▌ | 29520/34278 [32:23:56<4:16:09, 3.23s/it] 86%|████████▌ | 29521/34278 [32:23:59<4:09:39, 3.15s/it] {'loss': 0.0958, 'grad_norm': 0.9622720918984243, 'learning_rate': 4.96620839909771e-07, 'epoch': 0.86} 86%|████████▌ | 29521/34278 [32:23:59<4:09:39, 3.15s/it] 86%|████████▌ | 29522/34278 [32:24:03<4:28:14, 3.38s/it] {'loss': 0.1074, 'grad_norm': 0.6545674953297864, 'learning_rate': 4.964155912883628e-07, 'epoch': 0.86} 86%|████████▌ | 29522/34278 [32:24:03<4:28:14, 3.38s/it] 86%|████████▌ | 29523/34278 [32:24:06<4:20:08, 3.28s/it] {'loss': 0.0923, 'grad_norm': 0.781662532258794, 'learning_rate': 4.962103828738807e-07, 'epoch': 0.86} 86%|████████▌ | 29523/34278 [32:24:06<4:20:08, 3.28s/it] 86%|████████▌ | 29524/34278 [32:24:10<4:49:44, 3.66s/it] {'loss': 0.1015, 'grad_norm': 0.8537351550173269, 'learning_rate': 4.960052146681566e-07, 'epoch': 0.86} 86%|████████▌ | 29524/34278 [32:24:10<4:49:44, 3.66s/it] 86%|████████▌ | 29525/34278 [32:24:13<4:26:22, 3.36s/it] {'loss': 0.0995, 'grad_norm': 0.9227109810282786, 'learning_rate': 
4.95800086673024e-07, 'epoch': 0.86} 86%|████████▌ | 29525/34278 [32:24:13<4:26:22, 3.36s/it] 86%|████████▌ | 29526/34278 [32:24:19<5:29:30, 4.16s/it] {'loss': 0.0862, 'grad_norm': 0.6084352556263708, 'learning_rate': 4.955949988903119e-07, 'epoch': 0.86} 86%|████████▌ | 29526/34278 [32:24:19<5:29:30, 4.16s/it] 86%|████████▌ | 29527/34278 [32:24:22<4:58:32, 3.77s/it] {'loss': 0.1151, 'grad_norm': 0.9855506075943639, 'learning_rate': 4.953899513218535e-07, 'epoch': 0.86} 86%|████████▌ | 29527/34278 [32:24:22<4:58:32, 3.77s/it] 86%|████████▌ | 29528/34278 [32:24:25<4:45:20, 3.60s/it] {'loss': 0.1124, 'grad_norm': 0.8146833029360177, 'learning_rate': 4.951849439694778e-07, 'epoch': 0.86} 86%|████████▌ | 29528/34278 [32:24:25<4:45:20, 3.60s/it] 86%|████████▌ | 29529/34278 [32:24:28<4:40:50, 3.55s/it] {'loss': 0.1176, 'grad_norm': 1.0698802220578514, 'learning_rate': 4.94979976835015e-07, 'epoch': 0.86} 86%|████████▌ | 29529/34278 [32:24:28<4:40:50, 3.55s/it] 86%|████████▌ | 29530/34278 [32:24:32<4:42:36, 3.57s/it] {'loss': 0.1153, 'grad_norm': 0.8454586330671141, 'learning_rate': 4.947750499202952e-07, 'epoch': 0.86} 86%|████████▌ | 29530/34278 [32:24:32<4:42:36, 3.57s/it] 86%|████████▌ | 29531/34278 [32:24:36<4:54:58, 3.73s/it] {'loss': 0.0999, 'grad_norm': 0.8488563654978626, 'learning_rate': 4.945701632271499e-07, 'epoch': 0.86} 86%|████████▌ | 29531/34278 [32:24:36<4:54:58, 3.73s/it] 86%|████████▌ | 29532/34278 [32:24:39<4:34:58, 3.48s/it] {'loss': 0.0987, 'grad_norm': 0.8152915120222931, 'learning_rate': 4.943653167574058e-07, 'epoch': 0.86} 86%|████████▌ | 29532/34278 [32:24:39<4:34:58, 3.48s/it] 86%|████████▌ | 29533/34278 [32:24:42<4:27:46, 3.39s/it] {'loss': 0.1129, 'grad_norm': 0.8512444655778405, 'learning_rate': 4.941605105128922e-07, 'epoch': 0.86} 86%|████████▌ | 29533/34278 [32:24:42<4:27:46, 3.39s/it] 86%|████████▌ | 29534/34278 [32:24:45<4:18:33, 3.27s/it] {'loss': 0.1134, 'grad_norm': 0.7923319732939569, 'learning_rate': 4.93955744495439e-07, 'epoch': 
0.86} 86%|████████▌ | 29534/34278 [32:24:45<4:18:33, 3.27s/it] 86%|████████▌ | 29535/34278 [32:24:48<4:11:57, 3.19s/it] {'loss': 0.0937, 'grad_norm': 0.990599679297401, 'learning_rate': 4.937510187068728e-07, 'epoch': 0.86} 86%|████████▌ | 29535/34278 [32:24:48<4:11:57, 3.19s/it] 86%|████████▌ | 29536/34278 [32:24:55<5:24:48, 4.11s/it] {'loss': 0.1152, 'grad_norm': 0.9879391746123323, 'learning_rate': 4.935463331490198e-07, 'epoch': 0.86} 86%|████████▌ | 29536/34278 [32:24:55<5:24:48, 4.11s/it] 86%|████████▌ | 29537/34278 [32:24:58<5:07:21, 3.89s/it] {'loss': 0.1311, 'grad_norm': 0.8191392081185264, 'learning_rate': 4.933416878237118e-07, 'epoch': 0.86} 86%|████████▌ | 29537/34278 [32:24:58<5:07:21, 3.89s/it] 86%|████████▌ | 29538/34278 [32:25:01<4:58:28, 3.78s/it] {'loss': 0.1108, 'grad_norm': 0.9034065550039516, 'learning_rate': 4.93137082732773e-07, 'epoch': 0.86} 86%|████████▌ | 29538/34278 [32:25:01<4:58:28, 3.78s/it] 86%|████████▌ | 29539/34278 [32:25:04<4:38:13, 3.52s/it] {'loss': 0.1106, 'grad_norm': 0.879997117093169, 'learning_rate': 4.929325178780293e-07, 'epoch': 0.86} 86%|████████▌ | 29539/34278 [32:25:04<4:38:13, 3.52s/it] 86%|████████▌ | 29540/34278 [32:25:09<5:06:28, 3.88s/it] {'loss': 0.102, 'grad_norm': 0.8021533510115464, 'learning_rate': 4.927279932613094e-07, 'epoch': 0.86} 86%|████████▌ | 29540/34278 [32:25:09<5:06:28, 3.88s/it] 86%|████████▌ | 29541/34278 [32:25:12<4:40:49, 3.56s/it] {'loss': 0.1085, 'grad_norm': 1.1201269405766916, 'learning_rate': 4.925235088844382e-07, 'epoch': 0.86} 86%|████████▌ | 29541/34278 [32:25:12<4:40:49, 3.56s/it] 86%|████████▌ | 29542/34278 [32:25:18<5:39:32, 4.30s/it] {'loss': 0.1399, 'grad_norm': 1.093923757410142, 'learning_rate': 4.923190647492399e-07, 'epoch': 0.86} 86%|████████▌ | 29542/34278 [32:25:18<5:39:32, 4.30s/it] 86%|████████▌ | 29543/34278 [32:25:24<6:15:44, 4.76s/it] {'loss': 0.1077, 'grad_norm': 0.8121647255800227, 'learning_rate': 4.921146608575405e-07, 'epoch': 0.86} 86%|████████▌ | 29543/34278 
[32:25:24<6:15:44, 4.76s/it] 86%|████████▌ | 29544/34278 [32:25:28<5:59:51, 4.56s/it] {'loss': 0.1349, 'grad_norm': 0.959957420606875, 'learning_rate': 4.919102972111667e-07, 'epoch': 0.86} 86%|████████▌ | 29544/34278 [32:25:28<5:59:51, 4.56s/it] 86%|████████▌ | 29545/34278 [32:25:30<5:12:04, 3.96s/it] {'loss': 0.1277, 'grad_norm': 0.9910972412210832, 'learning_rate': 4.917059738119417e-07, 'epoch': 0.86} 86%|████████▌ | 29545/34278 [32:25:30<5:12:04, 3.96s/it] 86%|████████▌ | 29546/34278 [32:25:35<5:19:05, 4.05s/it] {'loss': 0.0984, 'grad_norm': 0.9460236678096623, 'learning_rate': 4.915016906616888e-07, 'epoch': 0.86} 86%|████████▌ | 29546/34278 [32:25:35<5:19:05, 4.05s/it] 86%|████████▌ | 29547/34278 [32:25:37<4:50:45, 3.69s/it] {'loss': 0.101, 'grad_norm': 0.8204929894094617, 'learning_rate': 4.912974477622329e-07, 'epoch': 0.86} 86%|████████▌ | 29547/34278 [32:25:38<4:50:45, 3.69s/it] 86%|████████▌ | 29548/34278 [32:25:41<4:36:09, 3.50s/it] {'loss': 0.1287, 'grad_norm': 1.071794228341519, 'learning_rate': 4.910932451153966e-07, 'epoch': 0.86} 86%|████████▌ | 29548/34278 [32:25:41<4:36:09, 3.50s/it] 86%|████████▌ | 29549/34278 [32:25:43<4:20:07, 3.30s/it] {'loss': 0.1319, 'grad_norm': 1.0222057992768019, 'learning_rate': 4.90889082723004e-07, 'epoch': 0.86} 86%|████████▌ | 29549/34278 [32:25:43<4:20:07, 3.30s/it] 86%|████████▌ | 29550/34278 [32:25:46<4:10:11, 3.17s/it] {'loss': 0.1249, 'grad_norm': 0.7890094777152389, 'learning_rate': 4.906849605868763e-07, 'epoch': 0.86} 86%|████████▌ | 29550/34278 [32:25:46<4:10:11, 3.17s/it] 86%|████████▌ | 29551/34278 [32:25:50<4:14:45, 3.23s/it] {'loss': 0.1096, 'grad_norm': 0.8086902933476409, 'learning_rate': 4.904808787088383e-07, 'epoch': 0.86} 86%|████████▌ | 29551/34278 [32:25:50<4:14:45, 3.23s/it] 86%|████████▌ | 29552/34278 [32:25:52<4:04:56, 3.11s/it] {'loss': 0.1083, 'grad_norm': 0.7286563718766472, 'learning_rate': 4.902768370907102e-07, 'epoch': 0.86} 86%|████████▌ | 29552/34278 [32:25:52<4:04:56, 3.11s/it] 
86%|████████▌ | 29553/34278 [32:25:55<4:02:53, 3.08s/it] {'loss': 0.1085, 'grad_norm': 1.0063736550386313, 'learning_rate': 4.900728357343127e-07, 'epoch': 0.86} 86%|████████▌ | 29553/34278 [32:25:56<4:02:53, 3.08s/it] 86%|████████▌ | 29554/34278 [32:25:59<4:17:09, 3.27s/it] {'loss': 0.1047, 'grad_norm': 0.898909238721518, 'learning_rate': 4.898688746414687e-07, 'epoch': 0.86} 86%|████████▌ | 29554/34278 [32:25:59<4:17:09, 3.27s/it] 86%|████████▌ | 29555/34278 [32:26:02<4:07:30, 3.14s/it] {'loss': 0.1001, 'grad_norm': 0.9043671477518233, 'learning_rate': 4.896649538139992e-07, 'epoch': 0.86} 86%|████████▌ | 29555/34278 [32:26:02<4:07:30, 3.14s/it] 86%|████████▌ | 29556/34278 [32:26:05<4:09:31, 3.17s/it] {'loss': 0.1051, 'grad_norm': 0.9191083316418673, 'learning_rate': 4.894610732537241e-07, 'epoch': 0.86} 86%|████████▌ | 29556/34278 [32:26:05<4:09:31, 3.17s/it] 86%|████████▌ | 29557/34278 [32:26:08<4:04:34, 3.11s/it] {'loss': 0.0833, 'grad_norm': 0.8197405853127391, 'learning_rate': 4.892572329624639e-07, 'epoch': 0.86} 86%|████████▌ | 29557/34278 [32:26:08<4:04:34, 3.11s/it] 86%|████████▌ | 29558/34278 [32:26:12<4:21:31, 3.32s/it] {'loss': 0.1055, 'grad_norm': 0.711686348154552, 'learning_rate': 4.890534329420388e-07, 'epoch': 0.86} 86%|████████▌ | 29558/34278 [32:26:12<4:21:31, 3.32s/it] 86%|████████▌ | 29559/34278 [32:26:18<5:30:49, 4.21s/it] {'loss': 0.1095, 'grad_norm': 0.7694761334203983, 'learning_rate': 4.888496731942671e-07, 'epoch': 0.86} 86%|████████▌ | 29559/34278 [32:26:18<5:30:49, 4.21s/it] 86%|████████▌ | 29560/34278 [32:26:23<5:30:37, 4.20s/it] {'loss': 0.146, 'grad_norm': 1.086513979497295, 'learning_rate': 4.886459537209681e-07, 'epoch': 0.86} 86%|████████▌ | 29560/34278 [32:26:23<5:30:37, 4.20s/it] 86%|████████▌ | 29561/34278 [32:26:27<5:37:20, 4.29s/it] {'loss': 0.1104, 'grad_norm': 0.8444544898854911, 'learning_rate': 4.884422745239625e-07, 'epoch': 0.86} 86%|████████▌ | 29561/34278 [32:26:27<5:37:20, 4.29s/it] 86%|████████▌ | 29562/34278 
[32:26:33<6:14:48, 4.77s/it] {'loss': 0.1252, 'grad_norm': 0.9671191569305575, 'learning_rate': 4.882386356050667e-07, 'epoch': 0.86} 86%|████████▌ | 29562/34278 [32:26:33<6:14:48, 4.77s/it] 86%|████████▌ | 29563/34278 [32:26:39<6:45:13, 5.16s/it] {'loss': 0.1355, 'grad_norm': 0.8935100670661668, 'learning_rate': 4.880350369660985e-07, 'epoch': 0.86} 86%|████████▌ | 29563/34278 [32:26:39<6:45:13, 5.16s/it] 86%|████████▌ | 29564/34278 [32:26:42<5:57:30, 4.55s/it] {'loss': 0.1052, 'grad_norm': 0.772155745151359, 'learning_rate': 4.878314786088778e-07, 'epoch': 0.86} 86%|████████▌ | 29564/34278 [32:26:42<5:57:30, 4.55s/it] 86%|████████▋ | 29565/34278 [32:26:48<6:30:23, 4.97s/it] {'loss': 0.1172, 'grad_norm': 0.8498414873168953, 'learning_rate': 4.876279605352202e-07, 'epoch': 0.86} 86%|████████▋ | 29565/34278 [32:26:48<6:30:23, 4.97s/it] 86%|████████▋ | 29566/34278 [32:26:51<5:47:51, 4.43s/it] {'loss': 0.1088, 'grad_norm': 0.7325778536929377, 'learning_rate': 4.87424482746941e-07, 'epoch': 0.86} 86%|████████▋ | 29566/34278 [32:26:51<5:47:51, 4.43s/it] 86%|████████▋ | 29567/34278 [32:26:54<5:10:00, 3.95s/it] {'loss': 0.0994, 'grad_norm': 0.982890890838626, 'learning_rate': 4.872210452458609e-07, 'epoch': 0.86} 86%|████████▋ | 29567/34278 [32:26:54<5:10:00, 3.95s/it] 86%|████████▋ | 29568/34278 [32:26:59<5:24:03, 4.13s/it] {'loss': 0.0958, 'grad_norm': 0.7980004055946873, 'learning_rate': 4.87017648033794e-07, 'epoch': 0.86} 86%|████████▋ | 29568/34278 [32:26:59<5:24:03, 4.13s/it] 86%|████████▋ | 29569/34278 [32:27:02<5:01:49, 3.85s/it] {'loss': 0.1077, 'grad_norm': 0.9001500264776446, 'learning_rate': 4.868142911125551e-07, 'epoch': 0.86} 86%|████████▋ | 29569/34278 [32:27:02<5:01:49, 3.85s/it] 86%|████████▋ | 29570/34278 [32:27:08<5:52:53, 4.50s/it] {'loss': 0.1135, 'grad_norm': 0.8083634707055541, 'learning_rate': 4.86610974483962e-07, 'epoch': 0.86} 86%|████████▋ | 29570/34278 [32:27:08<5:52:53, 4.50s/it] 86%|████████▋ | 29571/34278 [32:27:11<5:13:52, 4.00s/it] 
{'loss': 0.1129, 'grad_norm': 1.0745313282423126, 'learning_rate': 4.864076981498284e-07, 'epoch': 0.86} 86%|████████▋ | 29571/34278 [32:27:11<5:13:52, 4.00s/it] 86%|████████▋ | 29572/34278 [32:27:14<4:52:06, 3.72s/it] {'loss': 0.1273, 'grad_norm': 0.9907363144617128, 'learning_rate': 4.862044621119688e-07, 'epoch': 0.86} 86%|████████▋ | 29572/34278 [32:27:14<4:52:06, 3.72s/it] 86%|████████▋ | 29573/34278 [32:27:18<5:15:10, 4.02s/it] {'loss': 0.131, 'grad_norm': 1.0432915768941344, 'learning_rate': 4.860012663721981e-07, 'epoch': 0.86} 86%|████████▋ | 29573/34278 [32:27:18<5:15:10, 4.02s/it] 86%|████████▋ | 29574/34278 [32:27:22<5:13:31, 4.00s/it] {'loss': 0.1071, 'grad_norm': 0.8546023685608694, 'learning_rate': 4.857981109323312e-07, 'epoch': 0.86} 86%|████████▋ | 29574/34278 [32:27:22<5:13:31, 4.00s/it] 86%|████████▋ | 29575/34278 [32:27:28<5:54:43, 4.53s/it] {'loss': 0.0988, 'grad_norm': 0.9118591567924467, 'learning_rate': 4.855949957941814e-07, 'epoch': 0.86} 86%|████████▋ | 29575/34278 [32:27:28<5:54:43, 4.53s/it] 86%|████████▋ | 29576/34278 [32:27:33<5:59:12, 4.58s/it] {'loss': 0.1196, 'grad_norm': 0.7762545627015762, 'learning_rate': 4.853919209595604e-07, 'epoch': 0.86} 86%|████████▋ | 29576/34278 [32:27:33<5:59:12, 4.58s/it] 86%|████████▋ | 29577/34278 [32:27:36<5:28:47, 4.20s/it] {'loss': 0.122, 'grad_norm': 1.0126079495030087, 'learning_rate': 4.851888864302839e-07, 'epoch': 0.86} 86%|████████▋ | 29577/34278 [32:27:36<5:28:47, 4.20s/it] 86%|████████▋ | 29578/34278 [32:27:39<5:06:52, 3.92s/it] {'loss': 0.1097, 'grad_norm': 1.0261197469856702, 'learning_rate': 4.849858922081623e-07, 'epoch': 0.86} 86%|████████▋ | 29578/34278 [32:27:39<5:06:52, 3.92s/it] 86%|████████▋ | 29579/34278 [32:27:45<5:54:17, 4.52s/it] {'loss': 0.1282, 'grad_norm': 0.8383641536878149, 'learning_rate': 4.847829382950098e-07, 'epoch': 0.86} 86%|████████▋ | 29579/34278 [32:27:45<5:54:17, 4.52s/it] 86%|████████▋ | 29580/34278 [32:27:48<5:20:08, 4.09s/it] {'loss': 0.1197, 'grad_norm': 
0.867578677704776, 'learning_rate': 4.845800246926369e-07, 'epoch': 0.86} 86%|████████▋ | 29580/34278 [32:27:48<5:20:08, 4.09s/it] 86%|████████▋ | 29581/34278 [32:27:53<5:21:38, 4.11s/it] {'loss': 0.1044, 'grad_norm': 0.9841259215689497, 'learning_rate': 4.843771514028555e-07, 'epoch': 0.86} 86%|████████▋ | 29581/34278 [32:27:53<5:21:38, 4.11s/it] 86%|████████▋ | 29582/34278 [32:27:56<4:56:56, 3.79s/it] {'loss': 0.1045, 'grad_norm': 0.868658120493819, 'learning_rate': 4.841743184274778e-07, 'epoch': 0.86} 86%|████████▋ | 29582/34278 [32:27:56<4:56:56, 3.79s/it] 86%|████████▋ | 29583/34278 [32:27:59<4:49:17, 3.70s/it] {'loss': 0.1215, 'grad_norm': 0.7679781412591147, 'learning_rate': 4.839715257683125e-07, 'epoch': 0.86} 86%|████████▋ | 29583/34278 [32:27:59<4:49:17, 3.70s/it] 86%|████████▋ | 29584/34278 [32:28:03<4:55:34, 3.78s/it] {'loss': 0.1178, 'grad_norm': 0.7238086362854077, 'learning_rate': 4.837687734271713e-07, 'epoch': 0.86} 86%|████████▋ | 29584/34278 [32:28:03<4:55:34, 3.78s/it] 86%|████████▋ | 29585/34278 [32:28:06<4:37:09, 3.54s/it] {'loss': 0.1146, 'grad_norm': 0.8831568359416163, 'learning_rate': 4.835660614058657e-07, 'epoch': 0.86} 86%|████████▋ | 29585/34278 [32:28:06<4:37:09, 3.54s/it] 86%|████████▋ | 29586/34278 [32:28:10<4:43:09, 3.62s/it] {'loss': 0.1314, 'grad_norm': 1.0227267616933795, 'learning_rate': 4.833633897062029e-07, 'epoch': 0.86} 86%|████████▋ | 29586/34278 [32:28:10<4:43:09, 3.62s/it] 86%|████████▋ | 29587/34278 [32:28:13<4:26:51, 3.41s/it] {'loss': 0.1041, 'grad_norm': 1.3691232071072594, 'learning_rate': 4.831607583299941e-07, 'epoch': 0.86} 86%|████████▋ | 29587/34278 [32:28:13<4:26:51, 3.41s/it] 86%|████████▋ | 29588/34278 [32:28:17<4:41:01, 3.60s/it] {'loss': 0.1326, 'grad_norm': 0.9266690092284912, 'learning_rate': 4.829581672790484e-07, 'epoch': 0.86} 86%|████████▋ | 29588/34278 [32:28:17<4:41:01, 3.60s/it] 86%|████████▋ | 29589/34278 [32:28:20<4:27:26, 3.42s/it] {'loss': 0.0837, 'grad_norm': 0.6945943866312693, 
'learning_rate': 4.827556165551728e-07, 'epoch': 0.86} 86%|████████▋ | 29589/34278 [32:28:20<4:27:26, 3.42s/it] 86%|████████▋ | 29590/34278 [32:28:26<5:32:30, 4.26s/it] {'loss': 0.1126, 'grad_norm': 0.7280800558592375, 'learning_rate': 4.825531061601768e-07, 'epoch': 0.86} 86%|████████▋ | 29590/34278 [32:28:26<5:32:30, 4.26s/it] 86%|████████▋ | 29591/34278 [32:28:29<5:04:25, 3.90s/it] {'loss': 0.108, 'grad_norm': 1.0866269712261198, 'learning_rate': 4.823506360958691e-07, 'epoch': 0.86} 86%|████████▋ | 29591/34278 [32:28:29<5:04:25, 3.90s/it] 86%|████████▋ | 29592/34278 [32:28:33<4:59:42, 3.84s/it] {'loss': 0.1049, 'grad_norm': 0.9334877491028227, 'learning_rate': 4.821482063640559e-07, 'epoch': 0.86} 86%|████████▋ | 29592/34278 [32:28:33<4:59:42, 3.84s/it] 86%|████████▋ | 29593/34278 [32:28:36<4:50:21, 3.72s/it] {'loss': 0.115, 'grad_norm': 0.8746056086522032, 'learning_rate': 4.819458169665447e-07, 'epoch': 0.86} 86%|████████▋ | 29593/34278 [32:28:36<4:50:21, 3.72s/it] 86%|████████▋ | 29594/34278 [32:28:40<4:40:12, 3.59s/it] {'loss': 0.1243, 'grad_norm': 0.8324723328818899, 'learning_rate': 4.817434679051436e-07, 'epoch': 0.86} 86%|████████▋ | 29594/34278 [32:28:40<4:40:12, 3.59s/it] 86%|████████▋ | 29595/34278 [32:28:43<4:36:32, 3.54s/it] {'loss': 0.0997, 'grad_norm': 0.7242512479396693, 'learning_rate': 4.815411591816583e-07, 'epoch': 0.86} 86%|████████▋ | 29595/34278 [32:28:43<4:36:32, 3.54s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in 
preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ef9bc1d72e0> Failed to fetch sample 3747005. Exception: cannot identify image file <_io.BytesIO object at 0x7ef9bc1d72e0> 86%|████████▋ | 29596/34278 [32:28:47<4:47:33, 3.69s/it] {'loss': 0.117, 'grad_norm': 0.7987187858506292, 'learning_rate': 4.813388907978927e-07, 'epoch': 0.86} 86%|████████▋ | 29596/34278 [32:28:47<4:47:33, 3.69s/it] 86%|████████▋ | 29597/34278 [32:28:50<4:40:47, 3.60s/it] {'loss': 0.12, 'grad_norm': 0.8758749388026807, 'learning_rate': 4.811366627556569e-07, 'epoch': 0.86} 86%|████████▋ | 29597/34278 [32:28:50<4:40:47, 3.60s/it] 86%|████████▋ | 29598/34278 [32:28:54<4:32:14, 3.49s/it] {'loss': 0.1227, 'grad_norm': 0.8615853750316675, 'learning_rate': 4.809344750567541e-07, 'epoch': 0.86} 86%|████████▋ | 29598/34278 [32:28:54<4:32:14, 3.49s/it] 86%|████████▋ | 29599/34278 [32:28:59<5:26:24, 4.19s/it] {'loss': 0.1303, 'grad_norm': 0.9269933404487796, 'learning_rate': 4.807323277029885e-07, 'epoch': 0.86} 86%|████████▋ | 29599/34278 [32:28:59<5:26:24, 4.19s/it] 86%|████████▋ | 29600/34278 [32:29:02<4:50:06, 3.72s/it] {'loss': 0.1002, 'grad_norm': 0.8515560207390576, 'learning_rate': 4.805302206961671e-07, 'epoch': 0.86} 86%|████████▋ | 29600/34278 [32:29:02<4:50:06, 3.72s/it] 86%|████████▋ | 29601/34278 
[32:29:05<4:29:36, 3.46s/it] {'loss': 0.1053, 'grad_norm': 0.8271251257469049, 'learning_rate': 4.803281540380927e-07, 'epoch': 0.86} 86%|████████▋ | 29601/34278 [32:29:05<4:29:36, 3.46s/it] 86%|████████▋ | 29602/34278 [32:29:09<4:41:28, 3.61s/it] {'loss': 0.1464, 'grad_norm': 0.8991172528992981, 'learning_rate': 4.801261277305691e-07, 'epoch': 0.86} 86%|████████▋ | 29602/34278 [32:29:09<4:41:28, 3.61s/it] 86%|████████▋ | 29603/34278 [32:29:12<4:28:40, 3.45s/it] {'loss': 0.1001, 'grad_norm': 0.8219771993122964, 'learning_rate': 4.799241417754003e-07, 'epoch': 0.86} 86%|████████▋ | 29603/34278 [32:29:12<4:28:40, 3.45s/it] 86%|████████▋ | 29604/34278 [32:29:15<4:14:23, 3.27s/it] {'loss': 0.1231, 'grad_norm': 0.7661951768747031, 'learning_rate': 4.797221961743903e-07, 'epoch': 0.86} 86%|████████▋ | 29604/34278 [32:29:15<4:14:23, 3.27s/it] 86%|████████▋ | 29605/34278 [32:29:18<4:08:11, 3.19s/it] {'loss': 0.0881, 'grad_norm': 0.800394710623368, 'learning_rate': 4.795202909293417e-07, 'epoch': 0.86} 86%|████████▋ | 29605/34278 [32:29:18<4:08:11, 3.19s/it] 86%|████████▋ | 29606/34278 [32:29:21<4:05:23, 3.15s/it] {'loss': 0.122, 'grad_norm': 0.957192883285935, 'learning_rate': 4.793184260420558e-07, 'epoch': 0.86} 86%|████████▋ | 29606/34278 [32:29:21<4:05:23, 3.15s/it] 86%|████████▋ | 29607/34278 [32:29:24<4:02:08, 3.11s/it] {'loss': 0.1059, 'grad_norm': 0.8945823312298447, 'learning_rate': 4.791166015143367e-07, 'epoch': 0.86} 86%|████████▋ | 29607/34278 [32:29:24<4:02:08, 3.11s/it] 86%|████████▋ | 29608/34278 [32:29:27<3:57:44, 3.05s/it] {'loss': 0.1174, 'grad_norm': 0.8919861071128578, 'learning_rate': 4.789148173479846e-07, 'epoch': 0.86} 86%|████████▋ | 29608/34278 [32:29:27<3:57:44, 3.05s/it] 86%|████████▋ | 29609/34278 [32:29:30<4:12:32, 3.25s/it] {'loss': 0.1166, 'grad_norm': 0.8249856781036057, 'learning_rate': 4.787130735448025e-07, 'epoch': 0.86} 86%|████████▋ | 29609/34278 [32:29:30<4:12:32, 3.25s/it] 86%|████████▋ | 29610/34278 [32:29:33<4:07:14, 3.18s/it] 
86%|████████▋ | 29610/34278 [32:29:34<4:07:14, 3.18s/it] {'loss': 0.1027, 'grad_norm': 0.8328722605491482, 'learning_rate': 4.785113701065902e-07, 'epoch': 0.86}
[steps 29611-29685 condensed, duplicate tqdm redraws removed: loss ≈ 0.089-0.140, grad_norm ≈ 0.66-1.17, learning_rate decaying ≈ 4.783e-07 → 4.635e-07, ~3.0-5.0 s/it; epoch ticks 0.86 → 0.87 at step 29651]
87%|████████▋ | 29686/34278 [32:34:05<5:12:51, 4.09s/it] {'loss': 0.1042, 'grad_norm': 0.8542466741397426, 'learning_rate': 4.633001531859777e-07, 'epoch': 0.87}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f6c99fd4090>
Failed to fetch sample 3221374. Exception: cannot identify image file <_io.BytesIO object at 0x7f6c99fd4090>
[steps 29687-29702 condensed: loss ≈ 0.091-0.129, learning_rate ≈ 4.631e-07 → 4.601e-07, ~3 s/it]
87%|████████▋ | 29703/34278 [32:35:01<4:24:40, 3.47s/it] {'loss': 0.1119, 'grad_norm': 0.9344305312678193, 'learning_rate': 4.5992963374230204e-07, 'epoch': 0.87}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[steps 29704-29765 condensed: loss ≈ 0.086-0.145, grad_norm ≈ 0.64-1.57, learning_rate ≈ 4.597e-07 → 4.477e-07, ~3-5 s/it]
87%|████████▋ | 29766/34278 [32:39:08<5:08:59, 4.11s/it] {'loss': 0.1034, 'grad_norm': 0.8021311905958423, 'learning_rate': 4.475410955645465e-07, 'epoch': 0.87}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[steps 29767-29790 condensed: loss ≈ 0.092-0.139, learning_rate ≈ 4.473e-07 → 4.429e-07]
87%|████████▋ | 29791/34278 [32:40:42<5:48:37, 4.66s/it] {'loss': 0.1011, 'grad_norm': 0.8040965846348229, 'learning_rate': 4.42669692176661e-07, 'epoch': 0.87}
87%|████████▋ | 29792/34278 [32:40:46<5:30:30, 4.42s/it] {'loss': 0.1116, 'grad_norm':
1.0730527558974972, 'learning_rate': 4.424753647883023e-07, 'epoch': 0.87} 87%|████████▋ | 29792/34278 [32:40:46<5:30:30, 4.42s/it] 87%|████████▋ | 29793/34278 [32:40:50<5:16:19, 4.23s/it] {'loss': 0.1115, 'grad_norm': 1.048869244710362, 'learning_rate': 4.42281078088434e-07, 'epoch': 0.87} 87%|████████▋ | 29793/34278 [32:40:50<5:16:19, 4.23s/it] 87%|████████▋ | 29794/34278 [32:40:54<5:17:15, 4.25s/it] {'loss': 0.1051, 'grad_norm': 0.7481494214929864, 'learning_rate': 4.4208683207879355e-07, 'epoch': 0.87} 87%|████████▋ | 29794/34278 [32:40:54<5:17:15, 4.25s/it] 87%|████████▋ | 29795/34278 [32:40:59<5:30:44, 4.43s/it] {'loss': 0.1264, 'grad_norm': 1.0399294612372059, 'learning_rate': 4.418926267611146e-07, 'epoch': 0.87} 87%|████████▋ | 29795/34278 [32:40:59<5:30:44, 4.43s/it] 87%|████████▋ | 29796/34278 [32:41:04<6:01:05, 4.83s/it] {'loss': 0.1065, 'grad_norm': 0.7680603792499419, 'learning_rate': 4.416984621371284e-07, 'epoch': 0.87} 87%|████████▋ | 29796/34278 [32:41:04<6:01:05, 4.83s/it] 87%|████████▋ | 29797/34278 [32:41:08<5:37:54, 4.52s/it] {'loss': 0.0989, 'grad_norm': 0.6179645275901688, 'learning_rate': 4.4150433820857153e-07, 'epoch': 0.87} 87%|████████▋ | 29797/34278 [32:41:08<5:37:54, 4.52s/it] 87%|████████▋ | 29798/34278 [32:41:11<4:59:25, 4.01s/it] {'loss': 0.1101, 'grad_norm': 1.2297313226543285, 'learning_rate': 4.4131025497717585e-07, 'epoch': 0.87} 87%|████████▋ | 29798/34278 [32:41:11<4:59:25, 4.01s/it] 87%|████████▋ | 29799/34278 [32:41:14<4:36:44, 3.71s/it] {'loss': 0.124, 'grad_norm': 1.0925233920483266, 'learning_rate': 4.4111621244467275e-07, 'epoch': 0.87} 87%|████████▋ | 29799/34278 [32:41:14<4:36:44, 3.71s/it] 87%|████████▋ | 29800/34278 [32:41:17<4:16:56, 3.44s/it] {'loss': 0.1139, 'grad_norm': 0.906662609980025, 'learning_rate': 4.409222106127958e-07, 'epoch': 0.87} 87%|████████▋ | 29800/34278 [32:41:17<4:16:56, 3.44s/it] 87%|████████▋ | 29801/34278 [32:41:20<4:16:27, 3.44s/it] {'loss': 0.127, 'grad_norm': 0.887670852975099, 
'learning_rate': 4.407282494832782e-07, 'epoch': 0.87} 87%|████████▋ | 29801/34278 [32:41:20<4:16:27, 3.44s/it] 87%|████████▋ | 29802/34278 [32:41:23<4:03:54, 3.27s/it] {'loss': 0.1057, 'grad_norm': 0.8546909175050807, 'learning_rate': 4.405343290578507e-07, 'epoch': 0.87} 87%|████████▋ | 29802/34278 [32:41:23<4:03:54, 3.27s/it] 87%|████████▋ | 29803/34278 [32:41:26<4:04:09, 3.27s/it] {'loss': 0.1253, 'grad_norm': 0.8254342158400528, 'learning_rate': 4.4034044933824294e-07, 'epoch': 0.87} 87%|████████▋ | 29803/34278 [32:41:26<4:04:09, 3.27s/it] 87%|████████▋ | 29804/34278 [32:41:30<4:05:35, 3.29s/it] {'loss': 0.14, 'grad_norm': 0.8674148604627151, 'learning_rate': 4.401466103261881e-07, 'epoch': 0.87} 87%|████████▋ | 29804/34278 [32:41:30<4:05:35, 3.29s/it] 87%|████████▋ | 29805/34278 [32:41:33<3:57:30, 3.19s/it] {'loss': 0.1118, 'grad_norm': 0.8069543818570193, 'learning_rate': 4.399528120234148e-07, 'epoch': 0.87} 87%|████████▋ | 29805/34278 [32:41:33<3:57:30, 3.19s/it] 87%|████████▋ | 29806/34278 [32:41:36<3:49:06, 3.07s/it] {'loss': 0.0957, 'grad_norm': 0.8461359594532442, 'learning_rate': 4.3975905443165437e-07, 'epoch': 0.87} 87%|████████▋ | 29806/34278 [32:41:36<3:49:06, 3.07s/it]Token indices sequence length is longer than the specified maximum sequence length for this model (13677 > 8192). 
Running this sequence through the model will result in indexing errors 87%|████████▋ | 29807/34278 [32:41:38<3:44:24, 3.01s/it] {'loss': 0.1032, 'grad_norm': 0.9271947325694814, 'learning_rate': 4.395653375526371e-07, 'epoch': 0.87} 87%|████████▋ | 29807/34278 [32:41:38<3:44:24, 3.01s/it] 87%|████████▋ | 29808/34278 [32:41:41<3:42:55, 2.99s/it] {'loss': 0.1136, 'grad_norm': 1.0509761241612552, 'learning_rate': 4.3937166138809217e-07, 'epoch': 0.87} 87%|████████▋ | 29808/34278 [32:41:41<3:42:55, 2.99s/it] 87%|████████▋ | 29809/34278 [32:41:45<3:49:10, 3.08s/it] {'loss': 0.1093, 'grad_norm': 0.9624390305621473, 'learning_rate': 4.3917802593974714e-07, 'epoch': 0.87} 87%|████████▋ | 29809/34278 [32:41:45<3:49:10, 3.08s/it] 87%|████████▋ | 29810/34278 [32:41:48<3:51:03, 3.10s/it] {'loss': 0.1071, 'grad_norm': 0.9669244218025086, 'learning_rate': 4.389844312093322e-07, 'epoch': 0.87} 87%|████████▋ | 29810/34278 [32:41:48<3:51:03, 3.10s/it] 87%|████████▋ | 29811/34278 [32:41:52<4:06:15, 3.31s/it] {'loss': 0.1253, 'grad_norm': 0.8908038698081963, 'learning_rate': 4.387908771985766e-07, 'epoch': 0.87} 87%|████████▋ | 29811/34278 [32:41:52<4:06:15, 3.31s/it] 87%|████████▋ | 29812/34278 [32:41:57<4:42:46, 3.80s/it] {'loss': 0.1183, 'grad_norm': 0.9371307776667135, 'learning_rate': 4.385973639092067e-07, 'epoch': 0.87} 87%|████████▋ | 29812/34278 [32:41:57<4:42:46, 3.80s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 87%|████████▋ | 29813/34278 [32:41:59<4:23:42, 3.54s/it] {'loss': 0.1128, 'grad_norm': 0.7877828857168898, 'learning_rate': 4.3840389134295e-07, 'epoch': 0.87} 87%|████████▋ | 29813/34278 [32:42:00<4:23:42, 3.54s/it] 87%|████████▋ | 29814/34278 [32:42:03<4:12:31, 3.39s/it] {'loss': 0.1077, 'grad_norm': 0.8044260548349564, 'learning_rate': 4.3821045950153517e-07, 'epoch': 0.87} 87%|████████▋ | 29814/34278 [32:42:03<4:12:31, 3.39s/it] 87%|████████▋ | 29815/34278 [32:42:06<4:08:18, 3.34s/it] {'loss': 0.1203, 'grad_norm': 0.8445328060565375, 'learning_rate': 4.3801706838668855e-07, 'epoch': 0.87} 87%|████████▋ | 29815/34278 [32:42:06<4:08:18, 3.34s/it] 87%|████████▋ | 29816/34278 [32:42:10<4:17:51, 3.47s/it] {'loss': 0.1149, 'grad_norm': 0.7289914820312914, 'learning_rate': 4.3782371800013545e-07, 'epoch': 0.87} 87%|████████▋ | 29816/34278 [32:42:10<4:17:51, 3.47s/it] 87%|████████▋ | 29817/34278 [32:42:12<4:06:20, 3.31s/it] {'loss': 0.1018, 'grad_norm': 0.6795465123690801, 'learning_rate': 4.376304083436028e-07, 'epoch': 0.87} 87%|████████▋ | 29817/34278 [32:42:12<4:06:20, 3.31s/it] 87%|████████▋ | 29818/34278 [32:42:16<4:00:47, 3.24s/it] {'loss': 0.11, 'grad_norm': 0.7402984432363503, 'learning_rate': 4.3743713941881817e-07, 'epoch': 0.87} 87%|████████▋ | 29818/34278 [32:42:16<4:00:47, 3.24s/it] 87%|████████▋ | 29819/34278 [32:42:19<4:02:46, 3.27s/it] {'loss': 0.119, 'grad_norm': 0.8466375251813191, 'learning_rate': 4.3724391122750565e-07, 'epoch': 0.87} 87%|████████▋ | 29819/34278 [32:42:19<4:02:46, 3.27s/it] 87%|████████▋ | 29820/34278 [32:42:22<3:56:17, 3.18s/it] {'loss': 0.1048, 'grad_norm': 0.7558040130905236, 'learning_rate': 4.370507237713889e-07, 'epoch': 0.87} 87%|████████▋ | 29820/34278 [32:42:22<3:56:17, 3.18s/it] 87%|████████▋ | 29821/34278 [32:42:25<3:47:55, 3.07s/it] {'loss': 0.1111, 'grad_norm': 0.9283048767907476, 'learning_rate': 4.3685757705219545e-07, 'epoch': 0.87} 87%|████████▋ | 29821/34278 [32:42:25<3:47:55, 
3.07s/it] 87%|████████▋ | 29822/34278 [32:42:28<4:01:52, 3.26s/it] {'loss': 0.1019, 'grad_norm': 0.8306414883025738, 'learning_rate': 4.3666447107164667e-07, 'epoch': 0.87} 87%|████████▋ | 29822/34278 [32:42:28<4:01:52, 3.26s/it] 87%|████████▋ | 29823/34278 [32:42:31<3:50:53, 3.11s/it] {'loss': 0.1124, 'grad_norm': 1.0089986757984788, 'learning_rate': 4.36471405831469e-07, 'epoch': 0.87} 87%|████████▋ | 29823/34278 [32:42:31<3:50:53, 3.11s/it] 87%|████████▋ | 29824/34278 [32:42:34<3:47:43, 3.07s/it] {'loss': 0.1316, 'grad_norm': 0.960068224297036, 'learning_rate': 4.362783813333854e-07, 'epoch': 0.87} 87%|████████▋ | 29824/34278 [32:42:34<3:47:43, 3.07s/it] 87%|████████▋ | 29825/34278 [32:42:37<3:47:43, 3.07s/it] {'loss': 0.0981, 'grad_norm': 0.94203315437371, 'learning_rate': 4.3608539757911903e-07, 'epoch': 0.87} 87%|████████▋ | 29825/34278 [32:42:37<3:47:43, 3.07s/it] 87%|████████▋ | 29826/34278 [32:42:42<4:29:48, 3.64s/it] {'loss': 0.114, 'grad_norm': 0.9243236379327263, 'learning_rate': 4.3589245457039244e-07, 'epoch': 0.87} 87%|████████▋ | 29826/34278 [32:42:42<4:29:48, 3.64s/it] 87%|████████▋ | 29827/34278 [32:42:45<4:20:06, 3.51s/it] {'loss': 0.1236, 'grad_norm': 1.14120755316505, 'learning_rate': 4.3569955230892857e-07, 'epoch': 0.87} 87%|████████▋ | 29827/34278 [32:42:45<4:20:06, 3.51s/it] 87%|████████▋ | 29828/34278 [32:42:48<4:05:34, 3.31s/it] {'loss': 0.1228, 'grad_norm': 0.8580259854946702, 'learning_rate': 4.355066907964489e-07, 'epoch': 0.87} 87%|████████▋ | 29828/34278 [32:42:48<4:05:34, 3.31s/it] 87%|████████▋ | 29829/34278 [32:42:51<3:56:45, 3.19s/it] {'loss': 0.1305, 'grad_norm': 0.7866802087229846, 'learning_rate': 4.3531387003467706e-07, 'epoch': 0.87} 87%|████████▋ | 29829/34278 [32:42:51<3:56:45, 3.19s/it] 87%|████████▋ | 29830/34278 [32:42:54<3:50:15, 3.11s/it] {'loss': 0.1166, 'grad_norm': 0.800562842323476, 'learning_rate': 4.351210900253322e-07, 'epoch': 0.87} 87%|████████▋ | 29830/34278 [32:42:54<3:50:15, 3.11s/it] 87%|████████▋ | 
29831/34278 [32:42:58<4:18:06, 3.48s/it] {'loss': 0.133, 'grad_norm': 0.7850213567763958, 'learning_rate': 4.349283507701374e-07, 'epoch': 0.87} 87%|████████▋ | 29831/34278 [32:42:58<4:18:06, 3.48s/it] 87%|████████▋ | 29832/34278 [32:43:02<4:20:29, 3.52s/it] {'loss': 0.126, 'grad_norm': 0.8698308370262087, 'learning_rate': 4.3473565227081236e-07, 'epoch': 0.87} 87%|████████▋ | 29832/34278 [32:43:02<4:20:29, 3.52s/it] 87%|████████▋ | 29833/34278 [32:43:07<5:04:07, 4.11s/it] {'loss': 0.1, 'grad_norm': 0.6526075225422932, 'learning_rate': 4.345429945290769e-07, 'epoch': 0.87} 87%|████████▋ | 29833/34278 [32:43:07<5:04:07, 4.11s/it] 87%|████████▋ | 29834/34278 [32:43:11<4:41:26, 3.80s/it] {'loss': 0.1111, 'grad_norm': 0.841088174324627, 'learning_rate': 4.343503775466518e-07, 'epoch': 0.87} 87%|████████▋ | 29834/34278 [32:43:11<4:41:26, 3.80s/it] 87%|████████▋ | 29835/34278 [32:43:14<4:31:19, 3.66s/it] {'loss': 0.1072, 'grad_norm': 0.8447937630109417, 'learning_rate': 4.341578013252573e-07, 'epoch': 0.87} 87%|████████▋ | 29835/34278 [32:43:14<4:31:19, 3.66s/it] 87%|████████▋ | 29836/34278 [32:43:17<4:18:57, 3.50s/it] {'loss': 0.1005, 'grad_norm': 0.924956378450774, 'learning_rate': 4.339652658666116e-07, 'epoch': 0.87} 87%|████████▋ | 29836/34278 [32:43:17<4:18:57, 3.50s/it] 87%|████████▋ | 29837/34278 [32:43:21<4:40:05, 3.78s/it] {'loss': 0.1193, 'grad_norm': 0.8227379584896524, 'learning_rate': 4.337727711724343e-07, 'epoch': 0.87} 87%|████████▋ | 29837/34278 [32:43:21<4:40:05, 3.78s/it] 87%|████████▋ | 29838/34278 [32:43:25<4:43:59, 3.84s/it] {'loss': 0.0987, 'grad_norm': 0.7631561645780006, 'learning_rate': 4.3358031724444416e-07, 'epoch': 0.87} 87%|████████▋ | 29838/34278 [32:43:25<4:43:59, 3.84s/it] 87%|████████▋ | 29839/34278 [32:43:29<4:45:28, 3.86s/it] {'loss': 0.1154, 'grad_norm': 0.8128963542843648, 'learning_rate': 4.333879040843575e-07, 'epoch': 0.87} 87%|████████▋ | 29839/34278 [32:43:29<4:45:28, 3.86s/it] 87%|████████▋ | 29840/34278 [32:43:33<4:41:38, 
3.81s/it] {'loss': 0.1227, 'grad_norm': 0.7893618652077217, 'learning_rate': 4.331955316938935e-07, 'epoch': 0.87} 87%|████████▋ | 29840/34278 [32:43:33<4:41:38, 3.81s/it] 87%|████████▋ | 29841/34278 [32:43:36<4:30:47, 3.66s/it] {'loss': 0.1216, 'grad_norm': 0.9053852248961141, 'learning_rate': 4.330032000747708e-07, 'epoch': 0.87} 87%|████████▋ | 29841/34278 [32:43:36<4:30:47, 3.66s/it] 87%|████████▋ | 29842/34278 [32:43:40<4:21:11, 3.53s/it] {'loss': 0.1205, 'grad_norm': 0.8752026556647005, 'learning_rate': 4.328109092287053e-07, 'epoch': 0.87} 87%|████████▋ | 29842/34278 [32:43:40<4:21:11, 3.53s/it] 87%|████████▋ | 29843/34278 [32:43:43<4:21:51, 3.54s/it] {'loss': 0.105, 'grad_norm': 1.1452348967503392, 'learning_rate': 4.3261865915741273e-07, 'epoch': 0.87} 87%|████████▋ | 29843/34278 [32:43:43<4:21:51, 3.54s/it] 87%|████████▋ | 29844/34278 [32:43:46<4:09:37, 3.38s/it] {'loss': 0.1105, 'grad_norm': 0.7792971177379204, 'learning_rate': 4.324264498626113e-07, 'epoch': 0.87} 87%|████████▋ | 29844/34278 [32:43:46<4:09:37, 3.38s/it] 87%|████████▋ | 29845/34278 [32:43:49<4:07:32, 3.35s/it] {'loss': 0.1183, 'grad_norm': 0.7918411056047596, 'learning_rate': 4.322342813460162e-07, 'epoch': 0.87} 87%|████████▋ | 29845/34278 [32:43:49<4:07:32, 3.35s/it] 87%|████████▋ | 29846/34278 [32:43:52<3:54:15, 3.17s/it] {'loss': 0.0924, 'grad_norm': 0.7757046761783517, 'learning_rate': 4.320421536093422e-07, 'epoch': 0.87} 87%|████████▋ | 29846/34278 [32:43:52<3:54:15, 3.17s/it] 87%|████████▋ | 29847/34278 [32:43:56<4:05:02, 3.32s/it] {'loss': 0.1033, 'grad_norm': 0.6833492941206484, 'learning_rate': 4.318500666543052e-07, 'epoch': 0.87} 87%|████████▋ | 29847/34278 [32:43:56<4:05:02, 3.32s/it] 87%|████████▋ | 29848/34278 [32:43:59<4:03:32, 3.30s/it] {'loss': 0.1247, 'grad_norm': 0.9957608773109329, 'learning_rate': 4.3165802048262096e-07, 'epoch': 0.87} 87%|████████▋ | 29848/34278 [32:43:59<4:03:32, 3.30s/it] 87%|████████▋ | 29849/34278 [32:44:05<5:06:05, 4.15s/it] {'loss': 0.0876, 
'grad_norm': 0.8078436126699239, 'learning_rate': 4.314660150960037e-07, 'epoch': 0.87} 87%|████████▋ | 29849/34278 [32:44:05<5:06:05, 4.15s/it] 87%|████████▋ | 29850/34278 [32:44:09<5:03:09, 4.11s/it] {'loss': 0.0973, 'grad_norm': 0.8166737009025434, 'learning_rate': 4.3127405049616654e-07, 'epoch': 0.87} 87%|████████▋ | 29850/34278 [32:44:09<5:03:09, 4.11s/it] 87%|████████▋ | 29851/34278 [32:44:13<4:46:36, 3.88s/it] {'loss': 0.1144, 'grad_norm': 0.9200094109189592, 'learning_rate': 4.3108212668482476e-07, 'epoch': 0.87} 87%|████████▋ | 29851/34278 [32:44:13<4:46:36, 3.88s/it] 87%|████████▋ | 29852/34278 [32:44:16<4:40:39, 3.80s/it] {'loss': 0.1081, 'grad_norm': 1.1565132894069288, 'learning_rate': 4.308902436636903e-07, 'epoch': 0.87} 87%|████████▋ | 29852/34278 [32:44:16<4:40:39, 3.80s/it] 87%|████████▋ | 29853/34278 [32:44:19<4:19:19, 3.52s/it] {'loss': 0.1061, 'grad_norm': 1.0310305442230399, 'learning_rate': 4.3069840143447674e-07, 'epoch': 0.87} 87%|████████▋ | 29853/34278 [32:44:19<4:19:19, 3.52s/it] 87%|████████▋ | 29854/34278 [32:44:22<4:18:19, 3.50s/it] {'loss': 0.1078, 'grad_norm': 0.8234617934363656, 'learning_rate': 4.3050659999889776e-07, 'epoch': 0.87} 87%|████████▋ | 29854/34278 [32:44:23<4:18:19, 3.50s/it] 87%|████████▋ | 29855/34278 [32:44:27<4:30:44, 3.67s/it] {'loss': 0.1094, 'grad_norm': 0.9566713458942278, 'learning_rate': 4.303148393586654e-07, 'epoch': 0.87} 87%|████████▋ | 29855/34278 [32:44:27<4:30:44, 3.67s/it] 87%|████████▋ | 29856/34278 [32:44:32<4:59:34, 4.06s/it] {'loss': 0.1275, 'grad_norm': 1.1751645532482526, 'learning_rate': 4.3012311951549036e-07, 'epoch': 0.87} 87%|████████▋ | 29856/34278 [32:44:32<4:59:34, 4.06s/it] 87%|████████▋ | 29857/34278 [32:44:34<4:34:59, 3.73s/it] {'loss': 0.0854, 'grad_norm': 0.9431671484471107, 'learning_rate': 4.299314404710864e-07, 'epoch': 0.87} 87%|████████▋ | 29857/34278 [32:44:35<4:34:59, 3.73s/it] 87%|████████▋ | 29858/34278 [32:44:40<5:23:10, 4.39s/it] {'loss': 0.1144, 'grad_norm': 
1.068907517348355, 'learning_rate': 4.2973980222716207e-07, 'epoch': 0.87} 87%|████████▋ | 29858/34278 [32:44:40<5:23:10, 4.39s/it] 87%|████████▋ | 29859/34278 [32:44:44<4:59:07, 4.06s/it] {'loss': 0.1149, 'grad_norm': 0.7769048005080315, 'learning_rate': 4.29548204785431e-07, 'epoch': 0.87} 87%|████████▋ | 29859/34278 [32:44:44<4:59:07, 4.06s/it] 87%|████████▋ | 29860/34278 [32:44:47<4:41:11, 3.82s/it] {'loss': 0.1113, 'grad_norm': 0.9716605913536874, 'learning_rate': 4.2935664814760136e-07, 'epoch': 0.87} 87%|████████▋ | 29860/34278 [32:44:47<4:41:11, 3.82s/it] 87%|████████▋ | 29861/34278 [32:44:50<4:29:23, 3.66s/it] {'loss': 0.1241, 'grad_norm': 0.8453873145446957, 'learning_rate': 4.2916513231538557e-07, 'epoch': 0.87} 87%|████████▋ | 29861/34278 [32:44:50<4:29:23, 3.66s/it] 87%|████████▋ | 29862/34278 [32:44:54<4:30:42, 3.68s/it] {'loss': 0.1215, 'grad_norm': 0.8464955427424956, 'learning_rate': 4.289736572904923e-07, 'epoch': 0.87} 87%|████████▋ | 29862/34278 [32:44:54<4:30:42, 3.68s/it] 87%|████████▋ | 29863/34278 [32:44:59<5:07:28, 4.18s/it] {'loss': 0.1416, 'grad_norm': 0.9637879572247396, 'learning_rate': 4.2878222307463024e-07, 'epoch': 0.87} 87%|████████▋ | 29863/34278 [32:44:59<5:07:28, 4.18s/it] 87%|████████▋ | 29864/34278 [32:45:02<4:41:49, 3.83s/it] {'loss': 0.1155, 'grad_norm': 1.1063074579540713, 'learning_rate': 4.28590829669509e-07, 'epoch': 0.87} 87%|████████▋ | 29864/34278 [32:45:02<4:41:49, 3.83s/it] 87%|████████▋ | 29865/34278 [32:45:07<5:02:36, 4.11s/it] {'loss': 0.1174, 'grad_norm': 0.9476633249530068, 'learning_rate': 4.28399477076839e-07, 'epoch': 0.87} 87%|████████▋ | 29865/34278 [32:45:07<5:02:36, 4.11s/it] 87%|████████▋ | 29866/34278 [32:45:10<4:36:30, 3.76s/it] {'loss': 0.1017, 'grad_norm': 1.0494547089309854, 'learning_rate': 4.2820816529832554e-07, 'epoch': 0.87} 87%|████████▋ | 29866/34278 [32:45:10<4:36:30, 3.76s/it] 87%|████████▋ | 29867/34278 [32:45:13<4:27:17, 3.64s/it] {'loss': 0.0867, 'grad_norm': 0.740568707230243, 
'learning_rate': 4.2801689433567937e-07, 'epoch': 0.87} 87%|████████▋ | 29867/34278 [32:45:13<4:27:17, 3.64s/it] 87%|████████▋ | 29868/34278 [32:45:17<4:30:57, 3.69s/it] {'loss': 0.095, 'grad_norm': 0.7175939290898041, 'learning_rate': 4.27825664190607e-07, 'epoch': 0.87} 87%|████████▋ | 29868/34278 [32:45:17<4:30:57, 3.69s/it] 87%|████████▋ | 29869/34278 [32:45:22<4:59:23, 4.07s/it] {'loss': 0.1285, 'grad_norm': 0.9289451850827819, 'learning_rate': 4.276344748648148e-07, 'epoch': 0.87} 87%|████████▋ | 29869/34278 [32:45:22<4:59:23, 4.07s/it] 87%|████████▋ | 29870/34278 [32:45:28<5:40:41, 4.64s/it] {'loss': 0.1117, 'grad_norm': 0.6876481776515677, 'learning_rate': 4.274433263600103e-07, 'epoch': 0.87} 87%|████████▋ | 29870/34278 [32:45:28<5:40:41, 4.64s/it] 87%|████████▋ | 29871/34278 [32:45:32<5:26:14, 4.44s/it] {'loss': 0.1072, 'grad_norm': 0.90676459476418, 'learning_rate': 4.2725221867790155e-07, 'epoch': 0.87} 87%|████████▋ | 29871/34278 [32:45:32<5:26:14, 4.44s/it] 87%|████████▋ | 29872/34278 [32:45:38<5:57:46, 4.87s/it] {'loss': 0.1008, 'grad_norm': 0.7552836163927147, 'learning_rate': 4.270611518201928e-07, 'epoch': 0.87} 87%|████████▋ | 29872/34278 [32:45:38<5:57:46, 4.87s/it] 87%|████████▋ | 29873/34278 [32:45:41<5:10:38, 4.23s/it] {'loss': 0.1108, 'grad_norm': 0.7758009585920969, 'learning_rate': 4.268701257885899e-07, 'epoch': 0.87} 87%|████████▋ | 29873/34278 [32:45:41<5:10:38, 4.23s/it] 87%|████████▋ | 29874/34278 [32:45:44<4:47:24, 3.92s/it] {'loss': 0.1072, 'grad_norm': 0.8339668780951237, 'learning_rate': 4.2667914058479976e-07, 'epoch': 0.87} 87%|████████▋ | 29874/34278 [32:45:44<4:47:24, 3.92s/it] 87%|████████▋ | 29875/34278 [32:45:48<4:43:00, 3.86s/it] {'loss': 0.128, 'grad_norm': 0.7467170059467244, 'learning_rate': 4.264881962105266e-07, 'epoch': 0.87} 87%|████████▋ | 29875/34278 [32:45:48<4:43:00, 3.86s/it] 87%|████████▋ | 29876/34278 [32:45:54<5:28:17, 4.47s/it] {'loss': 0.1076, 'grad_norm': 1.0153642190803998, 'learning_rate': 
4.262972926674735e-07, 'epoch': 0.87} 87%|████████▋ | 29876/34278 [32:45:54<5:28:17, 4.47s/it] 87%|████████▋ | 29877/34278 [32:45:57<4:58:26, 4.07s/it] {'loss': 0.1203, 'grad_norm': 0.7803989734333402, 'learning_rate': 4.26106429957347e-07, 'epoch': 0.87} 87%|████████▋ | 29877/34278 [32:45:57<4:58:26, 4.07s/it] 87%|████████▋ | 29878/34278 [32:46:01<5:13:26, 4.27s/it] {'loss': 0.108, 'grad_norm': 0.9265790676639721, 'learning_rate': 4.2591560808185106e-07, 'epoch': 0.87} 87%|████████▋ | 29878/34278 [32:46:01<5:13:26, 4.27s/it] 87%|████████▋ | 29879/34278 [32:46:05<4:58:29, 4.07s/it] {'loss': 0.1056, 'grad_norm': 0.8310663452206771, 'learning_rate': 4.257248270426889e-07, 'epoch': 0.87} 87%|████████▋ | 29879/34278 [32:46:05<4:58:29, 4.07s/it] 87%|████████▋ | 29880/34278 [32:46:11<5:38:10, 4.61s/it] {'loss': 0.1439, 'grad_norm': 0.8347754515050189, 'learning_rate': 4.255340868415625e-07, 'epoch': 0.87} 87%|████████▋ | 29880/34278 [32:46:11<5:38:10, 4.61s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 87%|████████▋ | 29881/34278 [32:46:14<5:04:32, 4.16s/it] {'loss': 0.1305, 'grad_norm': 0.7486925850183045, 'learning_rate': 4.2534338748017655e-07, 'epoch': 0.87} 87%|████████▋ | 29881/34278 [32:46:14<5:04:32, 4.16s/it] 87%|████████▋ | 29882/34278 [32:46:17<4:31:32, 3.71s/it] {'loss': 0.1217, 'grad_norm': 0.9735315168196614, 'learning_rate': 4.251527289602314e-07, 'epoch': 0.87} 87%|████████▋ | 29882/34278 [32:46:17<4:31:32, 3.71s/it] 87%|████████▋ | 29883/34278 [32:46:21<4:47:27, 3.92s/it] {'loss': 0.1111, 'grad_norm': 0.7607551056091284, 'learning_rate': 4.2496211128343125e-07, 'epoch': 0.87} 87%|████████▋ | 29883/34278 [32:46:21<4:47:27, 3.92s/it] 87%|████████▋ | 29884/34278 [32:46:24<4:33:48, 3.74s/it] {'loss': 0.1121, 'grad_norm': 0.9457045281394639, 'learning_rate': 4.24771534451478e-07, 'epoch': 0.87} 87%|████████▋ | 29884/34278 [32:46:24<4:33:48, 3.74s/it] 87%|████████▋ | 29885/34278 [32:46:27<4:18:57, 3.54s/it] {'loss': 0.1103, 'grad_norm': 0.8157892787241623, 'learning_rate': 4.24580998466072e-07, 'epoch': 0.87} 87%|████████▋ | 29885/34278 [32:46:27<4:18:57, 3.54s/it] 87%|████████▋ | 29886/34278 [32:46:31<4:14:33, 3.48s/it] {'loss': 0.1019, 'grad_norm': 0.8344157962970016, 'learning_rate': 4.243905033289142e-07, 'epoch': 0.87} 87%|████████▋ | 29886/34278 [32:46:31<4:14:33, 3.48s/it] 87%|████████▋ | 29887/34278 [32:46:34<4:03:02, 3.32s/it] {'loss': 0.1195, 'grad_norm': 0.6242416127974619, 'learning_rate': 4.2420004904170644e-07, 'epoch': 0.87} 87%|████████▋ | 29887/34278 [32:46:34<4:03:02, 3.32s/it] 87%|████████▋ | 29888/34278 [32:46:38<4:28:14, 3.67s/it] {'loss': 0.11, 'grad_norm': 0.72553159326258, 'learning_rate': 4.2400963560614736e-07, 'epoch': 0.87} 87%|████████▋ | 29888/34278 [32:46:38<4:28:14, 3.67s/it] 87%|████████▋ | 29889/34278 [32:46:41<4:13:32, 3.47s/it] {'loss': 0.1041, 'grad_norm': 0.9495918137817224, 'learning_rate': 4.238192630239385e-07, 'epoch': 0.87} 87%|████████▋ | 29889/34278 [32:46:41<4:13:32, 
3.47s/it] 87%|████████▋ | 29890/34278 [32:46:46<4:32:11, 3.72s/it] {'loss': 0.1021, 'grad_norm': 0.8443970710203967, 'learning_rate': 4.236289312967784e-07, 'epoch': 0.87} 87%|████████▋ | 29890/34278 [32:46:46<4:32:11, 3.72s/it] 87%|████████▋ | 29891/34278 [32:46:49<4:19:49, 3.55s/it] {'loss': 0.1141, 'grad_norm': 0.8687350574882492, 'learning_rate': 4.234386404263674e-07, 'epoch': 0.87} 87%|████████▋ | 29891/34278 [32:46:49<4:19:49, 3.55s/it] 87%|████████▋ | 29892/34278 [32:46:55<5:12:00, 4.27s/it] {'loss': 0.1398, 'grad_norm': 1.1711484215960941, 'learning_rate': 4.232483904144036e-07, 'epoch': 0.87} 87%|████████▋ | 29892/34278 [32:46:55<5:12:00, 4.27s/it] 87%|████████▋ | 29893/34278 [32:46:58<4:45:37, 3.91s/it] {'loss': 0.1305, 'grad_norm': 0.8544625514073956, 'learning_rate': 4.2305818126258445e-07, 'epoch': 0.87} 87%|████████▋ | 29893/34278 [32:46:58<4:45:37, 3.91s/it] 87%|████████▋ | 29894/34278 [32:47:01<4:22:11, 3.59s/it] {'loss': 0.1024, 'grad_norm': 0.8036563494629677, 'learning_rate': 4.2286801297260983e-07, 'epoch': 0.87} 87%|████████▋ | 29894/34278 [32:47:01<4:22:11, 3.59s/it] 87%|████████▋ | 29895/34278 [32:47:04<4:14:32, 3.48s/it] {'loss': 0.105, 'grad_norm': 0.7717621317173138, 'learning_rate': 4.226778855461777e-07, 'epoch': 0.87} 87%|████████▋ | 29895/34278 [32:47:04<4:14:32, 3.48s/it] 87%|████████▋ | 29896/34278 [32:47:07<4:13:31, 3.47s/it] {'loss': 0.1056, 'grad_norm': 0.8559787744048357, 'learning_rate': 4.224877989849835e-07, 'epoch': 0.87} 87%|████████▋ | 29896/34278 [32:47:07<4:13:31, 3.47s/it] 87%|████████▋ | 29897/34278 [32:47:11<4:14:33, 3.49s/it] {'loss': 0.1164, 'grad_norm': 0.7974922375493215, 'learning_rate': 4.2229775329072687e-07, 'epoch': 0.87} 87%|████████▋ | 29897/34278 [32:47:11<4:14:33, 3.49s/it] 87%|████████▋ | 29898/34278 [32:47:16<5:01:48, 4.13s/it] {'loss': 0.1071, 'grad_norm': 0.6807687891692692, 'learning_rate': 4.221077484651026e-07, 'epoch': 0.87} 87%|████████▋ | 29898/34278 [32:47:16<5:01:48, 4.13s/it] 87%|████████▋ | 
29899/34278 [32:47:19<4:33:32, 3.75s/it] {'loss': 0.1102, 'grad_norm': 0.9698070852506537, 'learning_rate': 4.219177845098071e-07, 'epoch': 0.87} 87%|████████▋ | 29899/34278 [32:47:19<4:33:32, 3.75s/it] 87%|████████▋ | 29900/34278 [32:47:22<4:20:43, 3.57s/it] {'loss': 0.0935, 'grad_norm': 1.0138886604346813, 'learning_rate': 4.2172786142653633e-07, 'epoch': 0.87} 87%|████████▋ | 29900/34278 [32:47:22<4:20:43, 3.57s/it] 87%|████████▋ | 29901/34278 [32:47:26<4:27:33, 3.67s/it] {'loss': 0.1128, 'grad_norm': 0.9475220467153204, 'learning_rate': 4.215379792169877e-07, 'epoch': 0.87} 87%|████████▋ | 29901/34278 [32:47:26<4:27:33, 3.67s/it] 87%|████████▋ | 29902/34278 [32:47:31<4:44:00, 3.89s/it] {'loss': 0.129, 'grad_norm': 0.8521653135230126, 'learning_rate': 4.2134813788285436e-07, 'epoch': 0.87} 87%|████████▋ | 29902/34278 [32:47:31<4:44:00, 3.89s/it] 87%|████████▋ | 29903/34278 [32:47:34<4:20:25, 3.57s/it] {'loss': 0.1151, 'grad_norm': 0.8184166906469769, 'learning_rate': 4.2115833742583157e-07, 'epoch': 0.87} 87%|████████▋ | 29903/34278 [32:47:34<4:20:25, 3.57s/it] 87%|████████▋ | 29904/34278 [32:47:37<4:13:06, 3.47s/it] {'loss': 0.1221, 'grad_norm': 0.8239441281175344, 'learning_rate': 4.2096857784761466e-07, 'epoch': 0.87} 87%|████████▋ | 29904/34278 [32:47:37<4:13:06, 3.47s/it] 87%|████████▋ | 29905/34278 [32:47:40<4:10:18, 3.43s/it] {'loss': 0.1057, 'grad_norm': 0.778795356122018, 'learning_rate': 4.2077885914989733e-07, 'epoch': 0.87} 87%|████████▋ | 29905/34278 [32:47:40<4:10:18, 3.43s/it] 87%|████████▋ | 29906/34278 [32:47:44<4:09:20, 3.42s/it] {'loss': 0.1132, 'grad_norm': 0.8442644199060447, 'learning_rate': 4.205891813343721e-07, 'epoch': 0.87} 87%|████████▋ | 29906/34278 [32:47:44<4:09:20, 3.42s/it] 87%|████████▋ | 29907/34278 [32:47:47<4:05:48, 3.37s/it] {'loss': 0.1139, 'grad_norm': 0.7711401672933712, 'learning_rate': 4.203995444027337e-07, 'epoch': 0.87} 87%|████████▋ | 29907/34278 [32:47:47<4:05:48, 3.37s/it] 87%|████████▋ | 29908/34278 
[32:47:50<4:04:33, 3.36s/it] {'loss': 0.1347, 'grad_norm': 0.7549938529351892, 'learning_rate': 4.202099483566757e-07, 'epoch': 0.87} 87%|████████▋ | 29908/34278 [32:47:50<4:04:33, 3.36s/it] 87%|████████▋ | 29909/34278 [32:47:53<3:54:55, 3.23s/it] {'loss': 0.1139, 'grad_norm': 0.8560959875783646, 'learning_rate': 4.200203931978897e-07, 'epoch': 0.87} 87%|████████▋ | 29909/34278 [32:47:53<3:54:55, 3.23s/it] 87%|████████▋ | 29910/34278 [32:47:56<3:52:28, 3.19s/it] {'loss': 0.1178, 'grad_norm': 0.8226736831077333, 'learning_rate': 4.198308789280681e-07, 'epoch': 0.87} 87%|████████▋ | 29910/34278 [32:47:56<3:52:28, 3.19s/it] 87%|████████▋ | 29911/34278 [32:47:59<3:50:50, 3.17s/it] {'loss': 0.1126, 'grad_norm': 0.8858024223853243, 'learning_rate': 4.1964140554890343e-07, 'epoch': 0.87} 87%|████████▋ | 29911/34278 [32:47:59<3:50:50, 3.17s/it] 87%|████████▋ | 29912/34278 [32:48:02<3:45:31, 3.10s/it] {'loss': 0.1069, 'grad_norm': 0.7147194341951294, 'learning_rate': 4.1945197306208606e-07, 'epoch': 0.87} 87%|████████▋ | 29912/34278 [32:48:02<3:45:31, 3.10s/it] 87%|████████▋ | 29913/34278 [32:48:05<3:37:03, 2.98s/it] {'loss': 0.1068, 'grad_norm': 4.827301150089571, 'learning_rate': 4.19262581469308e-07, 'epoch': 0.87} 87%|████████▋ | 29913/34278 [32:48:05<3:37:03, 2.98s/it] 87%|████████▋ | 29914/34278 [32:48:08<3:36:00, 2.97s/it] {'loss': 0.1077, 'grad_norm': 0.9969858528344643, 'learning_rate': 4.1907323077226114e-07, 'epoch': 0.87} 87%|████████▋ | 29914/34278 [32:48:08<3:36:00, 2.97s/it] 87%|████████▋ | 29915/34278 [32:48:11<3:50:20, 3.17s/it] {'loss': 0.1035, 'grad_norm': 0.9808522789212067, 'learning_rate': 4.1888392097263473e-07, 'epoch': 0.87} 87%|████████▋ | 29915/34278 [32:48:11<3:50:20, 3.17s/it] 87%|████████▋ | 29916/34278 [32:48:15<4:05:51, 3.38s/it] {'loss': 0.1273, 'grad_norm': 0.8994254612147985, 'learning_rate': 4.18694652072118e-07, 'epoch': 0.87} 87%|████████▋ | 29916/34278 [32:48:15<4:05:51, 3.38s/it] 87%|████████▋ | 29917/34278 [32:48:19<4:10:14, 
3.44s/it] {'loss': 0.1051, 'grad_norm': 0.8959273599287071, 'learning_rate': 4.185054240724029e-07, 'epoch': 0.87} 87%|████████▋ | 29917/34278 [32:48:19<4:10:14, 3.44s/it] 87%|████████▋ | 29918/34278 [32:48:22<4:00:48, 3.31s/it] {'loss': 0.1083, 'grad_norm': 0.832198318156632, 'learning_rate': 4.1831623697517697e-07, 'epoch': 0.87} 87%|████████▋ | 29918/34278 [32:48:22<4:00:48, 3.31s/it] 87%|████████▋ | 29919/34278 [32:48:26<4:06:26, 3.39s/it] {'loss': 0.1021, 'grad_norm': 0.7491026800389943, 'learning_rate': 4.1812709078213056e-07, 'epoch': 0.87} 87%|████████▋ | 29919/34278 [32:48:26<4:06:26, 3.39s/it] 87%|████████▋ | 29920/34278 [32:48:29<4:02:40, 3.34s/it] {'loss': 0.1048, 'grad_norm': 0.7795319850787747, 'learning_rate': 4.1793798549495115e-07, 'epoch': 0.87} 87%|████████▋ | 29920/34278 [32:48:29<4:02:40, 3.34s/it] 87%|████████▋ | 29921/34278 [32:48:31<3:48:49, 3.15s/it] {'loss': 0.0944, 'grad_norm': 0.8050425427365901, 'learning_rate': 4.177489211153279e-07, 'epoch': 0.87} 87%|████████▋ | 29921/34278 [32:48:31<3:48:49, 3.15s/it] 87%|████████▋ | 29922/34278 [32:48:38<4:53:21, 4.04s/it] {'loss': 0.1113, 'grad_norm': 1.023494835948346, 'learning_rate': 4.175598976449491e-07, 'epoch': 0.87} 87%|████████▋ | 29922/34278 [32:48:38<4:53:21, 4.04s/it] 87%|████████▋ | 29923/34278 [32:48:44<5:37:15, 4.65s/it] {'loss': 0.1197, 'grad_norm': 0.8657322492817636, 'learning_rate': 4.1737091508550043e-07, 'epoch': 0.87} 87%|████████▋ | 29923/34278 [32:48:44<5:37:15, 4.65s/it] 87%|████████▋ | 29924/34278 [32:48:49<6:02:56, 5.00s/it] {'loss': 0.119, 'grad_norm': 0.9398422308099172, 'learning_rate': 4.1718197343867004e-07, 'epoch': 0.87} 87%|████████▋ | 29924/34278 [32:48:49<6:02:56, 5.00s/it] 87%|████████▋ | 29925/34278 [32:48:53<5:26:33, 4.50s/it] {'loss': 0.099, 'grad_norm': 0.9654851883590972, 'learning_rate': 4.1699307270614607e-07, 'epoch': 0.87} 87%|████████▋ | 29925/34278 [32:48:53<5:26:33, 4.50s/it] 87%|████████▋ | 29926/34278 [32:48:56<5:08:02, 4.25s/it] {'loss': 0.1241, 
'grad_norm': 0.9190948702960542, 'learning_rate': 4.168042128896127e-07, 'epoch': 0.87} 87%|████████▋ | 29926/34278 [32:48:56<5:08:02, 4.25s/it] 87%|████████▋ | 29927/34278 [32:49:01<5:18:37, 4.39s/it] {'loss': 0.1198, 'grad_norm': 0.9845219121354531, 'learning_rate': 4.1661539399075855e-07, 'epoch': 0.87} 87%|████████▋ | 29927/34278 [32:49:01<5:18:37, 4.39s/it] 87%|████████▋ | 29928/34278 [32:49:05<5:06:01, 4.22s/it] {'loss': 0.1357, 'grad_norm': 0.8927423186777527, 'learning_rate': 4.164266160112679e-07, 'epoch': 0.87} 87%|████████▋ | 29928/34278 [32:49:05<5:06:01, 4.22s/it] 87%|████████▋ | 29929/34278 [32:49:11<5:46:31, 4.78s/it] {'loss': 0.1073, 'grad_norm': 0.6886350312245313, 'learning_rate': 4.162378789528254e-07, 'epoch': 0.87} 87%|████████▋ | 29929/34278 [32:49:11<5:46:31, 4.78s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 87%|████████▋ | 29930/34278 [32:49:15<5:23:23, 4.46s/it] {'loss': 0.1191, 'grad_norm': 1.350166590262735, 'learning_rate': 4.160491828171165e-07, 'epoch': 0.87} 87%|████████▋ | 29930/34278 [32:49:15<5:23:23, 4.46s/it] 87%|████████▋ | 29931/34278 [32:49:18<4:55:33, 4.08s/it] {'loss': 0.1056, 'grad_norm': 0.8723272090238917, 'learning_rate': 4.1586052760582753e-07, 'epoch': 0.87} 87%|████████▋ | 29931/34278 [32:49:18<4:55:33, 4.08s/it] 87%|████████▋ | 29932/34278 [32:49:21<4:37:35, 3.83s/it] {'loss': 0.1123, 'grad_norm': 0.8579215096389476, 'learning_rate': 4.156719133206416e-07, 'epoch': 0.87} 87%|████████▋ | 29932/34278 [32:49:21<4:37:35, 3.83s/it] 87%|████████▋ | 29933/34278 [32:49:25<4:27:07, 3.69s/it] {'loss': 0.1153, 'grad_norm': 0.9218092007763244, 'learning_rate': 4.154833399632413e-07, 'epoch': 0.87} 87%|████████▋ | 29933/34278 [32:49:25<4:27:07, 3.69s/it] 87%|████████▋ | 29934/34278 [32:49:27<4:08:29, 3.43s/it] {'loss': 0.1077, 'grad_norm': 
0.7899232977371622, 'learning_rate': 4.1529480753531193e-07, 'epoch': 0.87} 87%|████████▋ | 29934/34278 [32:49:27<4:08:29, 3.43s/it] 87%|████████▋ | 29935/34278 [32:49:31<4:02:40, 3.35s/it] {'loss': 0.1211, 'grad_norm': 0.8275254846762283, 'learning_rate': 4.1510631603853655e-07, 'epoch': 0.87} 87%|████████▋ | 29935/34278 [32:49:31<4:02:40, 3.35s/it] 87%|████████▋ | 29936/34278 [32:49:37<4:59:31, 4.14s/it] {'loss': 0.1223, 'grad_norm': 1.005011908344243, 'learning_rate': 4.149178654745961e-07, 'epoch': 0.87} 87%|████████▋ | 29936/34278 [32:49:37<4:59:31, 4.14s/it] 87%|████████▋ | 29937/34278 [32:49:40<4:38:22, 3.85s/it] {'loss': 0.1276, 'grad_norm': 0.7056828889262484, 'learning_rate': 4.1472945584517476e-07, 'epoch': 0.87} 87%|████████▋ | 29937/34278 [32:49:40<4:38:22, 3.85s/it] 87%|████████▋ | 29938/34278 [32:49:43<4:20:09, 3.60s/it] {'loss': 0.1158, 'grad_norm': 1.3001967449194474, 'learning_rate': 4.145410871519551e-07, 'epoch': 0.87} 87%|████████▋ | 29938/34278 [32:49:43<4:20:09, 3.60s/it] 87%|████████▋ | 29939/34278 [32:49:46<4:08:38, 3.44s/it] {'loss': 0.0849, 'grad_norm': 1.0241636410606516, 'learning_rate': 4.143527593966179e-07, 'epoch': 0.87} 87%|████████▋ | 29939/34278 [32:49:46<4:08:38, 3.44s/it] 87%|████████▋ | 29940/34278 [32:49:49<4:10:18, 3.46s/it] {'loss': 0.0996, 'grad_norm': 0.9323677915350644, 'learning_rate': 4.141644725808436e-07, 'epoch': 0.87} 87%|████████▋ | 29940/34278 [32:49:49<4:10:18, 3.46s/it] 87%|████████▋ | 29941/34278 [32:49:55<4:49:49, 4.01s/it] {'loss': 0.101, 'grad_norm': 0.7319661576150167, 'learning_rate': 4.1397622670631523e-07, 'epoch': 0.87} 87%|████████▋ | 29941/34278 [32:49:55<4:49:49, 4.01s/it] 87%|████████▋ | 29942/34278 [32:49:58<4:32:53, 3.78s/it] {'loss': 0.107, 'grad_norm': 0.7916314828259348, 'learning_rate': 4.1378802177471144e-07, 'epoch': 0.87} 87%|████████▋ | 29942/34278 [32:49:58<4:32:53, 3.78s/it] 87%|████████▋ | 29943/34278 [32:50:01<4:20:25, 3.60s/it] {'loss': 0.1079, 'grad_norm': 0.9512679043925271, 
'learning_rate': 4.135998577877132e-07, 'epoch': 0.87} 87%|████████▋ | 29943/34278 [32:50:01<4:20:25, 3.60s/it] 87%|████████▋ | 29944/34278 [32:50:04<4:08:36, 3.44s/it] {'loss': 0.1201, 'grad_norm': 1.0788826656880606, 'learning_rate': 4.134117347470018e-07, 'epoch': 0.87} 87%|████████▋ | 29944/34278 [32:50:04<4:08:36, 3.44s/it] 87%|████████▋ | 29945/34278 [32:50:07<3:58:43, 3.31s/it] {'loss': 0.1201, 'grad_norm': 1.027673197157146, 'learning_rate': 4.1322365265425545e-07, 'epoch': 0.87} 87%|████████▋ | 29945/34278 [32:50:07<3:58:43, 3.31s/it] 87%|████████▋ | 29946/34278 [32:50:10<3:47:44, 3.15s/it] {'loss': 0.0916, 'grad_norm': 1.0343293405049407, 'learning_rate': 4.130356115111522e-07, 'epoch': 0.87} 87%|████████▋ | 29946/34278 [32:50:10<3:47:44, 3.15s/it] 87%|████████▋ | 29947/34278 [32:50:15<4:24:37, 3.67s/it] {'loss': 0.1096, 'grad_norm': 0.7508662886368539, 'learning_rate': 4.12847611319373e-07, 'epoch': 0.87} 87%|████████▋ | 29947/34278 [32:50:15<4:24:37, 3.67s/it] 87%|████████▋ | 29948/34278 [32:50:18<4:21:20, 3.62s/it] {'loss': 0.1011, 'grad_norm': 0.8008703546242361, 'learning_rate': 4.1265965208059423e-07, 'epoch': 0.87} 87%|████████▋ | 29948/34278 [32:50:18<4:21:20, 3.62s/it] 87%|████████▋ | 29949/34278 [32:50:22<4:24:57, 3.67s/it] {'loss': 0.1193, 'grad_norm': 0.749532749373626, 'learning_rate': 4.124717337964962e-07, 'epoch': 0.87} 87%|████████▋ | 29949/34278 [32:50:22<4:24:57, 3.67s/it] 87%|████████▋ | 29950/34278 [32:50:25<4:03:13, 3.37s/it] {'loss': 0.11, 'grad_norm': 0.7525171418922129, 'learning_rate': 4.122838564687542e-07, 'epoch': 0.87} 87%|████████▋ | 29950/34278 [32:50:25<4:03:13, 3.37s/it] 87%|████████▋ | 29951/34278 [32:50:28<4:05:44, 3.41s/it] {'loss': 0.1171, 'grad_norm': 0.8707908207771639, 'learning_rate': 4.120960200990481e-07, 'epoch': 0.87} 87%|████████▋ | 29951/34278 [32:50:28<4:05:44, 3.41s/it] 87%|████████▋ | 29952/34278 [32:50:31<3:59:46, 3.33s/it] {'loss': 0.1204, 'grad_norm': 0.7708651216906697, 'learning_rate': 
4.119082246890532e-07, 'epoch': 0.87} 87%|████████▋ | 29952/34278 [32:50:31<3:59:46, 3.33s/it] 87%|████████▋ | 29953/34278 [32:50:34<3:46:47, 3.15s/it] {'loss': 0.0917, 'grad_norm': 0.9071471813604898, 'learning_rate': 4.117204702404459e-07, 'epoch': 0.87} 87%|████████▋ | 29953/34278 [32:50:34<3:46:47, 3.15s/it] 87%|████████▋ | 29954/34278 [32:50:38<4:02:40, 3.37s/it] {'loss': 0.1175, 'grad_norm': 1.0675546952025274, 'learning_rate': 4.115327567549021e-07, 'epoch': 0.87} 87%|████████▋ | 29954/34278 [32:50:38<4:02:40, 3.37s/it] 87%|████████▋ | 29955/34278 [32:50:41<3:50:02, 3.19s/it] {'loss': 0.1088, 'grad_norm': 0.887214975252043, 'learning_rate': 4.113450842340999e-07, 'epoch': 0.87} 87%|████████▋ | 29955/34278 [32:50:41<3:50:02, 3.19s/it] 87%|████████▋ | 29956/34278 [32:50:45<4:13:55, 3.53s/it] {'loss': 0.096, 'grad_norm': 0.9111866400366525, 'learning_rate': 4.11157452679713e-07, 'epoch': 0.87} 87%|████████▋ | 29956/34278 [32:50:45<4:13:55, 3.53s/it] 87%|████████▋ | 29957/34278 [32:50:49<4:27:08, 3.71s/it] {'loss': 0.1141, 'grad_norm': 0.697758272359337, 'learning_rate': 4.1096986209341716e-07, 'epoch': 0.87} 87%|████████▋ | 29957/34278 [32:50:49<4:27:08, 3.71s/it] 87%|████████▋ | 29958/34278 [32:50:53<4:27:24, 3.71s/it] {'loss': 0.105, 'grad_norm': 0.8424568829645216, 'learning_rate': 4.107823124768867e-07, 'epoch': 0.87} 87%|████████▋ | 29958/34278 [32:50:53<4:27:24, 3.71s/it] 87%|████████▋ | 29959/34278 [32:50:58<4:48:48, 4.01s/it] {'loss': 0.1084, 'grad_norm': 0.7569289409986389, 'learning_rate': 4.1059480383179586e-07, 'epoch': 0.87} 87%|████████▋ | 29959/34278 [32:50:58<4:48:48, 4.01s/it] 87%|████████▋ | 29960/34278 [32:51:01<4:25:50, 3.69s/it] {'loss': 0.1246, 'grad_norm': 1.0302477770546048, 'learning_rate': 4.1040733615981876e-07, 'epoch': 0.87} 87%|████████▋ | 29960/34278 [32:51:01<4:25:50, 3.69s/it] 87%|████████▋ | 29961/34278 [32:51:04<4:29:15, 3.74s/it] {'loss': 0.1065, 'grad_norm': 1.1180763991332148, 'learning_rate': 4.1021990946263025e-07, 
'epoch': 0.87} 87%|████████▋ | 29961/34278 [32:51:04<4:29:15, 3.74s/it] 87%|████████▋ | 29962/34278 [32:51:08<4:25:27, 3.69s/it] {'loss': 0.0891, 'grad_norm': 0.7034423738035324, 'learning_rate': 4.1003252374190284e-07, 'epoch': 0.87} 87%|████████▋ | 29962/34278 [32:51:08<4:25:27, 3.69s/it] 87%|████████▋ | 29963/34278 [32:51:12<4:30:42, 3.76s/it] {'loss': 0.1065, 'grad_norm': 0.8282797111144902, 'learning_rate': 4.098451789993085e-07, 'epoch': 0.87} 87%|████████▋ | 29963/34278 [32:51:12<4:30:42, 3.76s/it] 87%|████████▋ | 29964/34278 [32:51:16<4:47:02, 3.99s/it] {'loss': 0.1067, 'grad_norm': 0.8533735373287727, 'learning_rate': 4.0965787523652156e-07, 'epoch': 0.87} 87%|████████▋ | 29964/34278 [32:51:16<4:47:02, 3.99s/it] 87%|████████▋ | 29965/34278 [32:51:22<5:14:06, 4.37s/it] {'loss': 0.1331, 'grad_norm': 0.9289842007959089, 'learning_rate': 4.094706124552128e-07, 'epoch': 0.87} 87%|████████▋ | 29965/34278 [32:51:22<5:14:06, 4.37s/it] 87%|████████▋ | 29966/34278 [32:51:25<4:47:28, 4.00s/it] {'loss': 0.1074, 'grad_norm': 0.7402285195808873, 'learning_rate': 4.0928339065705424e-07, 'epoch': 0.87} 87%|████████▋ | 29966/34278 [32:51:25<4:47:28, 4.00s/it] 87%|████████▋ | 29967/34278 [32:51:28<4:27:42, 3.73s/it] {'loss': 0.0995, 'grad_norm': 0.898508439569312, 'learning_rate': 4.0909620984371733e-07, 'epoch': 0.87} 87%|████████▋ | 29967/34278 [32:51:28<4:27:42, 3.73s/it] 87%|████████▋ | 29968/34278 [32:51:32<4:27:04, 3.72s/it] {'loss': 0.0994, 'grad_norm': 0.7110177183257381, 'learning_rate': 4.0890907001687463e-07, 'epoch': 0.87} 87%|████████▋ | 29968/34278 [32:51:32<4:27:04, 3.72s/it] 87%|████████▋ | 29969/34278 [32:51:35<4:20:55, 3.63s/it] {'loss': 0.1059, 'grad_norm': 0.9737372746845955, 'learning_rate': 4.087219711781959e-07, 'epoch': 0.87} 87%|████████▋ | 29969/34278 [32:51:35<4:20:55, 3.63s/it] 87%|████████▋ | 29970/34278 [32:51:38<4:09:42, 3.48s/it] {'loss': 0.1002, 'grad_norm': 0.793370043176925, 'learning_rate': 4.0853491332935034e-07, 'epoch': 0.87} 
87%|████████▋ | 29970/34278 [32:51:38<4:09:42, 3.48s/it] 87%|████████▋ | 29971/34278 [32:51:42<4:21:43, 3.65s/it] {'loss': 0.1241, 'grad_norm': 0.9129891616227633, 'learning_rate': 4.0834789647201003e-07, 'epoch': 0.87} 87%|████████▋ | 29971/34278 [32:51:42<4:21:43, 3.65s/it] 87%|████████▋ | 29972/34278 [32:51:46<4:28:38, 3.74s/it] {'loss': 0.1325, 'grad_norm': 0.9607523923505505, 'learning_rate': 4.081609206078424e-07, 'epoch': 0.87} 87%|████████▋ | 29972/34278 [32:51:46<4:28:38, 3.74s/it] 87%|████████▋ | 29973/34278 [32:51:52<5:15:47, 4.40s/it] {'loss': 0.1118, 'grad_norm': 0.8913248019903911, 'learning_rate': 4.079739857385179e-07, 'epoch': 0.87} 87%|████████▋ | 29973/34278 [32:51:52<5:15:47, 4.40s/it] 87%|████████▋ | 29974/34278 [32:51:56<5:01:36, 4.20s/it] {'loss': 0.1165, 'grad_norm': 0.8102677818546921, 'learning_rate': 4.077870918657062e-07, 'epoch': 0.87} 87%|████████▋ | 29974/34278 [32:51:56<5:01:36, 4.20s/it] 87%|████████▋ | 29975/34278 [32:51:59<4:47:02, 4.00s/it] {'loss': 0.1191, 'grad_norm': 0.9559230906975134, 'learning_rate': 4.076002389910755e-07, 'epoch': 0.87} 87%|████████▋ | 29975/34278 [32:51:59<4:47:02, 4.00s/it] 87%|████████▋ | 29976/34278 [32:52:05<5:26:39, 4.56s/it] {'loss': 0.111, 'grad_norm': 1.04818129475544, 'learning_rate': 4.074134271162927e-07, 'epoch': 0.87} 87%|████████▋ | 29976/34278 [32:52:05<5:26:39, 4.56s/it] 87%|████████▋ | 29977/34278 [32:52:11<5:58:19, 5.00s/it] {'loss': 0.1135, 'grad_norm': 0.8321931156481468, 'learning_rate': 4.0722665624302717e-07, 'epoch': 0.87} 87%|████████▋ | 29977/34278 [32:52:11<5:58:19, 5.00s/it] 87%|████████▋ | 29978/34278 [32:52:15<5:33:20, 4.65s/it] {'loss': 0.1107, 'grad_norm': 0.9789598585791341, 'learning_rate': 4.0703992637294466e-07, 'epoch': 0.87} 87%|████████▋ | 29978/34278 [32:52:15<5:33:20, 4.65s/it] 87%|████████▋ | 29979/34278 [32:52:21<6:03:43, 5.08s/it] {'loss': 0.1119, 'grad_norm': 0.8875249128288172, 'learning_rate': 4.068532375077144e-07, 'epoch': 0.87} 87%|████████▋ | 29979/34278 
[32:52:21<6:03:43, 5.08s/it] 87%|████████▋ | 29980/34278 [32:52:24<5:14:37, 4.39s/it] {'loss': 0.1162, 'grad_norm': 3.800407419199114, 'learning_rate': 4.066665896490013e-07, 'epoch': 0.87} 87%|████████▋ | 29980/34278 [32:52:24<5:14:37, 4.39s/it] 87%|████████▋ | 29981/34278 [32:52:28<5:03:59, 4.24s/it] {'loss': 0.111, 'grad_norm': 0.7398637198877579, 'learning_rate': 4.0647998279847277e-07, 'epoch': 0.87} 87%|████████▋ | 29981/34278 [32:52:28<5:03:59, 4.24s/it] 87%|████████▋ | 29982/34278 [32:52:31<4:34:01, 3.83s/it] {'loss': 0.092, 'grad_norm': 0.9066124867651206, 'learning_rate': 4.0629341695779423e-07, 'epoch': 0.87} 87%|████████▋ | 29982/34278 [32:52:31<4:34:01, 3.83s/it] 87%|████████▋ | 29983/34278 [32:52:34<4:24:02, 3.69s/it] {'loss': 0.1071, 'grad_norm': 0.7808211060457666, 'learning_rate': 4.06106892128631e-07, 'epoch': 0.87} 87%|████████▋ | 29983/34278 [32:52:34<4:24:02, 3.69s/it] 87%|████████▋ | 29984/34278 [32:52:38<4:19:21, 3.62s/it] {'loss': 0.1066, 'grad_norm': 0.8295891986742827, 'learning_rate': 4.059204083126489e-07, 'epoch': 0.87} 87%|████████▋ | 29984/34278 [32:52:38<4:19:21, 3.62s/it] 87%|████████▋ | 29985/34278 [32:52:41<4:05:57, 3.44s/it] {'loss': 0.094, 'grad_norm': 0.980293509020823, 'learning_rate': 4.0573396551151335e-07, 'epoch': 0.87} 87%|████████▋ | 29985/34278 [32:52:41<4:05:57, 3.44s/it] 87%|████████▋ | 29986/34278 [32:52:44<4:09:51, 3.49s/it] {'loss': 0.1293, 'grad_norm': 0.7797549997944827, 'learning_rate': 4.0554756372688744e-07, 'epoch': 0.87} 87%|████████▋ | 29986/34278 [32:52:44<4:09:51, 3.49s/it] 87%|████████▋ | 29987/34278 [32:52:48<4:09:53, 3.49s/it] {'loss': 0.119, 'grad_norm': 0.6828497120825575, 'learning_rate': 4.05361202960437e-07, 'epoch': 0.87} 87%|████████▋ | 29987/34278 [32:52:48<4:09:53, 3.49s/it] 87%|████████▋ | 29988/34278 [32:52:51<4:01:14, 3.37s/it] {'loss': 0.1278, 'grad_norm': 0.7780385789923829, 'learning_rate': 4.051748832138247e-07, 'epoch': 0.87} 87%|████████▋ | 29988/34278 [32:52:51<4:01:14, 3.37s/it] 
87%|████████▋ | 29989/34278 [32:52:55<4:11:28, 3.52s/it] {'loss': 0.1093, 'grad_norm': 0.8115155380145198, 'learning_rate': 4.049886044887136e-07, 'epoch': 0.87} 87%|████████▋ | 29989/34278 [32:52:55<4:11:28, 3.52s/it] 87%|████████▋ | 29990/34278 [32:52:58<4:11:14, 3.52s/it] {'loss': 0.1009, 'grad_norm': 0.7910249846198542, 'learning_rate': 4.0480236678676674e-07, 'epoch': 0.87} 87%|████████▋ | 29990/34278 [32:52:58<4:11:14, 3.52s/it] 87%|████████▋ | 29991/34278 [32:53:01<3:57:57, 3.33s/it] {'loss': 0.1178, 'grad_norm': 0.9517769185784213, 'learning_rate': 4.0461617010964906e-07, 'epoch': 0.87} 87%|████████▋ | 29991/34278 [32:53:01<3:57:57, 3.33s/it] 87%|████████▋ | 29992/34278 [32:53:07<4:55:45, 4.14s/it] {'loss': 0.1106, 'grad_norm': 1.1443227130278366, 'learning_rate': 4.0443001445902073e-07, 'epoch': 0.87} 87%|████████▋ | 29992/34278 [32:53:07<4:55:45, 4.14s/it] 87%|████████▋ | 29993/34278 [32:53:10<4:39:12, 3.91s/it] {'loss': 0.1464, 'grad_norm': 0.9945797268553265, 'learning_rate': 4.042438998365433e-07, 'epoch': 0.87} 87%|████████▋ | 29993/34278 [32:53:10<4:39:12, 3.91s/it] 88%|████████▊ | 29994/34278 [32:53:14<4:22:20, 3.67s/it] {'loss': 0.1146, 'grad_norm': 0.7847706764904184, 'learning_rate': 4.040578262438799e-07, 'epoch': 0.88} 88%|████████▊ | 29994/34278 [32:53:14<4:22:20, 3.67s/it] 88%|████████▊ | 29995/34278 [32:53:18<4:40:55, 3.94s/it] {'loss': 0.1081, 'grad_norm': 0.8295375998411549, 'learning_rate': 4.0387179368269137e-07, 'epoch': 0.88} 88%|████████▊ | 29995/34278 [32:53:18<4:40:55, 3.94s/it] 88%|████████▊ | 29996/34278 [32:53:21<4:23:20, 3.69s/it] {'loss': 0.1125, 'grad_norm': 0.8514728598688256, 'learning_rate': 4.0368580215463746e-07, 'epoch': 0.88} 88%|████████▊ | 29996/34278 [32:53:21<4:23:20, 3.69s/it] 88%|████████▊ | 29997/34278 [32:53:25<4:28:04, 3.76s/it] {'loss': 0.1265, 'grad_norm': 0.857317926644112, 'learning_rate': 4.034998516613797e-07, 'epoch': 0.88} 88%|████████▊ | 29997/34278 [32:53:25<4:28:04, 3.76s/it] 88%|████████▊ | 
29998/34278 [32:53:31<5:07:04, 4.30s/it] {'loss': 0.1122, 'grad_norm': 1.005063040112087, 'learning_rate': 4.033139422045784e-07, 'epoch': 0.88} 88%|████████▊ | 29998/34278 [32:53:31<5:07:04, 4.30s/it] 88%|████████▊ | 29999/34278 [32:53:34<4:49:29, 4.06s/it] {'loss': 0.0949, 'grad_norm': 0.783048773544439, 'learning_rate': 4.0312807378589335e-07, 'epoch': 0.88} 88%|████████▊ | 29999/34278 [32:53:34<4:49:29, 4.06s/it] 88%|████████▊ | 30000/34278 [32:53:38<4:35:01, 3.86s/it] {'loss': 0.0973, 'grad_norm': 0.7896354250843888, 'learning_rate': 4.029422464069821e-07, 'epoch': 0.88} 88%|████████▊ | 30000/34278 [32:53:38<4:35:01, 3.86s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 88%|████████▊ | 30001/34278 [32:54:14<16:04:05, 13.52s/it] {'loss': 0.115, 'grad_norm': 0.6961728329544881, 'learning_rate': 4.027564600695055e-07, 'epoch': 0.88} 88%|████████▊ | 30001/34278 [32:54:14<16:04:05, 13.52s/it] 88%|████████▊ | 30002/34278 [32:54:20<13:27:36, 11.33s/it] {'loss': 0.1123, 'grad_norm': 0.8803479317906855, 'learning_rate': 4.025707147751223e-07, 'epoch': 0.88} 88%|████████▊ | 30002/34278 [32:54:20<13:27:36, 11.33s/it] 88%|████████▊ | 30003/34278 [32:54:24<10:49:19, 9.11s/it] {'loss': 0.1079, 'grad_norm': 0.8960196147789281, 'learning_rate': 4.023850105254895e-07, 'epoch': 0.88} 88%|████████▊ | 30003/34278 [32:54:24<10:49:19, 9.11s/it] 88%|████████▊ | 30004/34278 [32:54:27<8:39:09, 7.29s/it] {'loss': 0.0946, 'grad_norm': 0.7061949648126875, 'learning_rate': 4.021993473222668e-07, 'epoch': 0.88} 88%|████████▊ | 30004/34278 [32:54:27<8:39:09, 7.29s/it] 88%|████████▊ | 30005/34278 [32:54:30<7:11:24, 6.06s/it] {'loss': 0.0999, 'grad_norm': 0.7050193902636842, 'learning_rate': 4.020137251671108e-07, 'epoch': 0.88} 88%|████████▊ | 30005/34278 [32:54:30<7:11:24, 6.06s/it] 88%|████████▊ | 30006/34278 [32:54:33<6:05:41, 5.14s/it] {'loss': 0.1046, 'grad_norm': 1.057616937879093, 'learning_rate': 4.018281440616778e-07, 'epoch': 0.88} 88%|████████▊ | 30006/34278 [32:54:33<6:05:41, 5.14s/it] 88%|████████▊ | 30007/34278 [32:54:38<6:05:07, 5.13s/it] {'loss': 0.0961, 'grad_norm': 1.6158759986205025, 'learning_rate': 4.016426040076249e-07, 'epoch': 0.88} 88%|████████▊ | 30007/34278 [32:54:38<6:05:07, 5.13s/it] 88%|████████▊ | 30008/34278 [32:54:44<6:21:54, 5.37s/it] {'loss': 0.1005, 'grad_norm': 0.6381332574210345, 'learning_rate': 4.0145710500661075e-07, 'epoch': 0.88} 88%|████████▊ | 30008/34278 [32:54:44<6:21:54, 5.37s/it] 88%|████████▊ | 30009/34278 [32:54:47<5:32:08, 4.67s/it] {'loss': 0.1222, 'grad_norm': 0.7287050047502998, 'learning_rate': 4.012716470602895e-07, 'epoch': 0.88} 88%|████████▊ | 30009/34278 
[32:54:47<5:32:08, 4.67s/it] 88%|████████▊ | 30010/34278 [32:54:50<4:55:42, 4.16s/it] {'loss': 0.1154, 'grad_norm': 0.7602515936667066, 'learning_rate': 4.0108623017031613e-07, 'epoch': 0.88} 88%|████████▊ | 30010/34278 [32:54:50<4:55:42, 4.16s/it] 88%|████████▊ | 30011/34278 [32:54:54<4:53:38, 4.13s/it] {'loss': 0.0995, 'grad_norm': 0.7219171099542323, 'learning_rate': 4.00900854338348e-07, 'epoch': 0.88} 88%|████████▊ | 30011/34278 [32:54:54<4:53:38, 4.13s/it] 88%|████████▊ | 30012/34278 [32:54:57<4:31:29, 3.82s/it] {'loss': 0.103, 'grad_norm': 0.7480555337142266, 'learning_rate': 4.0071551956603893e-07, 'epoch': 0.88} 88%|████████▊ | 30012/34278 [32:54:57<4:31:29, 3.82s/it] 88%|████████▊ | 30013/34278 [32:55:00<4:13:21, 3.56s/it] {'loss': 0.1128, 'grad_norm': 0.7282980125470874, 'learning_rate': 4.005302258550425e-07, 'epoch': 0.88} 88%|████████▊ | 30013/34278 [32:55:00<4:13:21, 3.56s/it] 88%|████████▊ | 30014/34278 [32:55:04<4:19:47, 3.66s/it] {'loss': 0.1136, 'grad_norm': 0.8296426808167071, 'learning_rate': 4.0034497320701584e-07, 'epoch': 0.88} 88%|████████▊ | 30014/34278 [32:55:04<4:19:47, 3.66s/it] 88%|████████▊ | 30015/34278 [32:55:07<4:02:51, 3.42s/it] {'loss': 0.1005, 'grad_norm': 0.7558778919722866, 'learning_rate': 4.001597616236108e-07, 'epoch': 0.88} 88%|████████▊ | 30015/34278 [32:55:07<4:02:51, 3.42s/it] 88%|████████▊ | 30016/34278 [32:55:10<3:55:07, 3.31s/it] {'loss': 0.1275, 'grad_norm': 1.0726349994338809, 'learning_rate': 3.999745911064812e-07, 'epoch': 0.88} 88%|████████▊ | 30016/34278 [32:55:10<3:55:07, 3.31s/it] 88%|████████▊ | 30017/34278 [32:55:14<4:17:49, 3.63s/it] {'loss': 0.1007, 'grad_norm': 0.8753592511955679, 'learning_rate': 3.997894616572806e-07, 'epoch': 0.88} 88%|████████▊ | 30017/34278 [32:55:14<4:17:49, 3.63s/it] 88%|████████▊ | 30018/34278 [32:55:19<4:30:57, 3.82s/it] {'loss': 0.0973, 'grad_norm': 0.8331388970453892, 'learning_rate': 3.996043732776617e-07, 'epoch': 0.88} 88%|████████▊ | 30018/34278 [32:55:19<4:30:57, 
3.82s/it] 88%|████████▊ | 30019/34278 [32:55:22<4:28:25, 3.78s/it] {'loss': 0.1228, 'grad_norm': 0.7109433945251317, 'learning_rate': 3.994193259692758e-07, 'epoch': 0.88} 88%|████████▊ | 30019/34278 [32:55:22<4:28:25, 3.78s/it] 88%|████████▊ | 30020/34278 [32:55:28<5:00:12, 4.23s/it] {'loss': 0.1025, 'grad_norm': 0.8393881516507595, 'learning_rate': 3.992343197337761e-07, 'epoch': 0.88} 88%|████████▊ | 30020/34278 [32:55:28<5:00:12, 4.23s/it] 88%|████████▊ | 30021/34278 [32:55:31<4:41:45, 3.97s/it] {'loss': 0.1054, 'grad_norm': 0.830369322465344, 'learning_rate': 3.9904935457281524e-07, 'epoch': 0.88} 88%|████████▊ | 30021/34278 [32:55:31<4:41:45, 3.97s/it] 88%|████████▊ | 30022/34278 [32:55:34<4:24:41, 3.73s/it] {'loss': 0.1091, 'grad_norm': 0.7929558092667852, 'learning_rate': 3.988644304880429e-07, 'epoch': 0.88} 88%|████████▊ | 30022/34278 [32:55:34<4:24:41, 3.73s/it] 88%|████████▊ | 30023/34278 [32:55:37<4:15:45, 3.61s/it] {'loss': 0.0978, 'grad_norm': 0.8207451692628104, 'learning_rate': 3.9867954748111e-07, 'epoch': 0.88} 88%|████████▊ | 30023/34278 [32:55:37<4:15:45, 3.61s/it] 88%|████████▊ | 30024/34278 [32:55:42<4:33:05, 3.85s/it] {'loss': 0.1114, 'grad_norm': 0.8455388434418435, 'learning_rate': 3.984947055536681e-07, 'epoch': 0.88} 88%|████████▊ | 30024/34278 [32:55:42<4:33:05, 3.85s/it] 88%|████████▊ | 30025/34278 [32:55:45<4:22:08, 3.70s/it] {'loss': 0.0862, 'grad_norm': 1.1177513257661293, 'learning_rate': 3.9830990470736684e-07, 'epoch': 0.88} 88%|████████▊ | 30025/34278 [32:55:45<4:22:08, 3.70s/it] 88%|████████▊ | 30026/34278 [32:55:49<4:16:44, 3.62s/it] {'loss': 0.1016, 'grad_norm': 0.9104098906899314, 'learning_rate': 3.981251449438567e-07, 'epoch': 0.88} 88%|████████▊ | 30026/34278 [32:55:49<4:16:44, 3.62s/it] 88%|████████▊ | 30027/34278 [32:55:52<4:17:38, 3.64s/it] {'loss': 0.0957, 'grad_norm': 0.766933255601191, 'learning_rate': 3.9794042626478566e-07, 'epoch': 0.88} 88%|████████▊ | 30027/34278 [32:55:52<4:17:38, 3.64s/it] 88%|████████▊ | 
30028/34278 [32:55:55<4:05:40, 3.47s/it] {'loss': 0.1149, 'grad_norm': 0.8170312088935947, 'learning_rate': 3.9775574867180477e-07, 'epoch': 0.88} 88%|████████▊ | 30028/34278 [32:55:55<4:05:40, 3.47s/it] 88%|████████▊ | 30029/34278 [32:55:58<3:56:45, 3.34s/it] {'loss': 0.1059, 'grad_norm': 0.73466149004003, 'learning_rate': 3.975711121665621e-07, 'epoch': 0.88} 88%|████████▊ | 30029/34278 [32:55:58<3:56:45, 3.34s/it] 88%|████████▊ | 30030/34278 [32:56:01<3:44:17, 3.17s/it] {'loss': 0.1139, 'grad_norm': 0.8350009194852385, 'learning_rate': 3.973865167507052e-07, 'epoch': 0.88} 88%|████████▊ | 30030/34278 [32:56:01<3:44:17, 3.17s/it] 88%|████████▊ | 30031/34278 [32:56:07<4:34:50, 3.88s/it] {'loss': 0.104, 'grad_norm': 0.8192370000669458, 'learning_rate': 3.9720196242588214e-07, 'epoch': 0.88} 88%|████████▊ | 30031/34278 [32:56:07<4:34:50, 3.88s/it] 88%|████████▊ | 30032/34278 [32:56:10<4:14:28, 3.60s/it] {'loss': 0.108, 'grad_norm': 0.9749452790959781, 'learning_rate': 3.9701744919374285e-07, 'epoch': 0.88} 88%|████████▊ | 30032/34278 [32:56:10<4:14:28, 3.60s/it] 88%|████████▊ | 30033/34278 [32:56:16<5:03:50, 4.29s/it] {'loss': 0.1195, 'grad_norm': 0.8738741975347576, 'learning_rate': 3.968329770559315e-07, 'epoch': 0.88} 88%|████████▊ | 30033/34278 [32:56:16<5:03:50, 4.29s/it] 88%|████████▊ | 30034/34278 [32:56:18<4:30:49, 3.83s/it] {'loss': 0.102, 'grad_norm': 0.8300108519181769, 'learning_rate': 3.966485460140973e-07, 'epoch': 0.88} 88%|████████▊ | 30034/34278 [32:56:18<4:30:49, 3.83s/it] 88%|████████▊ | 30035/34278 [32:56:23<4:44:07, 4.02s/it] {'loss': 0.1216, 'grad_norm': 1.1384501295776965, 'learning_rate': 3.964641560698862e-07, 'epoch': 0.88} 88%|████████▊ | 30035/34278 [32:56:23<4:44:07, 4.02s/it] 88%|████████▊ | 30036/34278 [32:56:29<5:23:11, 4.57s/it] {'loss': 0.134, 'grad_norm': 0.8709744781499231, 'learning_rate': 3.962798072249435e-07, 'epoch': 0.88} 88%|████████▊ | 30036/34278 [32:56:29<5:23:11, 4.57s/it] 88%|████████▊ | 30037/34278 [32:56:32<4:52:31, 
[Training log, steps 30037-30218 of 34278 (88%, epoch 0.88); duplicate progress-bar echoes removed. Over this span the loss fluctuates between roughly 0.08 and 0.15, grad_norm between roughly 0.6 and 1.6, the learning rate decays from 3.961e-07 to 3.634e-07, and step time ranges from about 3.0 to 5.1 s/it. First and last recorded steps:]

88%|████████▊ | 30037/34278 [32:56:32<4:52:31, 4.14s/it] {'loss': 0.0907, 'grad_norm': 0.9695599752029358, 'learning_rate': 3.9609549948091517e-07, 'epoch': 0.88}
88%|████████▊ | 30218/34278 [33:07:45<4:09:02, 3.68s/it] {'loss': 0.1276, 'grad_norm': 0.7958339955453644, 'learning_rate': 3.634144076065133e-07, 'epoch': 0.88}
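Log lines in the format above can be parsed into structured records for plotting or analysis. A minimal Python sketch, not part of the original training code; the regex and function names are illustrative assumptions about the line format shown here:

```python
import re

# Matches trainer log lines of the form:
#   88%|... | 30037/34278 [32:56:32<4:52:31, 4.14s/it] {'loss': 0.0907, 'grad_norm': ..., 'learning_rate': ..., 'epoch': 0.88}
LINE_RE = re.compile(
    r"(?P<step>\d+)/(?P<total>\d+)\s+\[[^\]]+\]\s+"
    r"\{'loss': (?P<loss>[\d.eE+-]+), "
    r"'grad_norm': (?P<grad_norm>[\d.eE+-]+), "
    r"'learning_rate': (?P<lr>[\d.eE+-]+), "
    r"'epoch': (?P<epoch>[\d.eE+-]+)\}"
)

def parse_log_line(line: str):
    """Extract step metrics from one log line; return None if no metrics dict is present."""
    m = LINE_RE.search(line)
    if m is None:
        return None
    return {
        "step": int(m.group("step")),
        "total": int(m.group("total")),
        "loss": float(m.group("loss")),
        "grad_norm": float(m.group("grad_norm")),
        "learning_rate": float(m.group("lr")),
        "epoch": float(m.group("epoch")),
    }
```

Because each tqdm step is echoed twice in the raw log, deduplicating on the `step` key (e.g. keeping the first record per step) recovers one row per optimizer step.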
3.06s/it] 88%|████████▊ | 30210/34278 [33:07:15<3:31:08, 3.11s/it] {'loss': 0.0958, 'grad_norm': 0.8566573650686539, 'learning_rate': 3.648303004304721e-07, 'epoch': 0.88} 88%|████████▊ | 30210/34278 [33:07:15<3:31:08, 3.11s/it] 88%|████████▊ | 30211/34278 [33:07:19<3:57:58, 3.51s/it] {'loss': 0.0915, 'grad_norm': 0.7686853905378299, 'learning_rate': 3.6465316897496883e-07, 'epoch': 0.88} 88%|████████▊ | 30211/34278 [33:07:19<3:57:58, 3.51s/it] 88%|████████▊ | 30212/34278 [33:07:22<3:47:06, 3.35s/it] {'loss': 0.118, 'grad_norm': 0.7660954448389473, 'learning_rate': 3.6447607890273516e-07, 'epoch': 0.88} 88%|████████▊ | 30212/34278 [33:07:22<3:47:06, 3.35s/it] 88%|████████▊ | 30213/34278 [33:07:25<3:46:33, 3.34s/it] {'loss': 0.1309, 'grad_norm': 1.2803163696680397, 'learning_rate': 3.6429903021535207e-07, 'epoch': 0.88} 88%|████████▊ | 30213/34278 [33:07:25<3:46:33, 3.34s/it] 88%|████████▊ | 30214/34278 [33:07:29<3:53:40, 3.45s/it] {'loss': 0.0962, 'grad_norm': 0.8122822849721811, 'learning_rate': 3.641220229144016e-07, 'epoch': 0.88} 88%|████████▊ | 30214/34278 [33:07:29<3:53:40, 3.45s/it] 88%|████████▊ | 30215/34278 [33:07:35<4:42:01, 4.16s/it] {'loss': 0.1174, 'grad_norm': 0.9291086541392866, 'learning_rate': 3.63945057001463e-07, 'epoch': 0.88} 88%|████████▊ | 30215/34278 [33:07:35<4:42:01, 4.16s/it] 88%|████████▊ | 30216/34278 [33:07:38<4:23:05, 3.89s/it] {'loss': 0.1095, 'grad_norm': 0.7421057537233868, 'learning_rate': 3.6376813247811503e-07, 'epoch': 0.88} 88%|████████▊ | 30216/34278 [33:07:38<4:23:05, 3.89s/it] 88%|████████▊ | 30217/34278 [33:07:41<4:05:04, 3.62s/it] {'loss': 0.1213, 'grad_norm': 0.8081893349893342, 'learning_rate': 3.635912493459387e-07, 'epoch': 0.88} 88%|████████▊ | 30217/34278 [33:07:41<4:05:04, 3.62s/it] 88%|████████▊ | 30218/34278 [33:07:45<4:09:02, 3.68s/it] {'loss': 0.1276, 'grad_norm': 0.7958339955453644, 'learning_rate': 3.634144076065133e-07, 'epoch': 0.88} 88%|████████▊ | 30218/34278 [33:07:45<4:09:02, 3.68s/it] 88%|████████▊ | 
30219/34278 [33:07:48<3:52:46, 3.44s/it] {'loss': 0.1135, 'grad_norm': 0.773084285292648, 'learning_rate': 3.63237607261418e-07, 'epoch': 0.88} 88%|████████▊ | 30219/34278 [33:07:48<3:52:46, 3.44s/it] 88%|████████▊ | 30220/34278 [33:07:51<3:46:47, 3.35s/it] {'loss': 0.1146, 'grad_norm': 0.8580062597910707, 'learning_rate': 3.6306084831222887e-07, 'epoch': 0.88} 88%|████████▊ | 30220/34278 [33:07:51<3:46:47, 3.35s/it] 88%|████████▊ | 30221/34278 [33:07:55<3:54:00, 3.46s/it] {'loss': 0.1097, 'grad_norm': 1.190768283142839, 'learning_rate': 3.628841307605269e-07, 'epoch': 0.88} 88%|████████▊ | 30221/34278 [33:07:55<3:54:00, 3.46s/it] 88%|████████▊ | 30222/34278 [33:07:58<3:41:28, 3.28s/it] {'loss': 0.1004, 'grad_norm': 0.7718992014176562, 'learning_rate': 3.6270745460788736e-07, 'epoch': 0.88} 88%|████████▊ | 30222/34278 [33:07:58<3:41:28, 3.28s/it] 88%|████████▊ | 30223/34278 [33:08:00<3:29:41, 3.10s/it] {'loss': 0.1068, 'grad_norm': 0.757772427516553, 'learning_rate': 3.625308198558897e-07, 'epoch': 0.88} 88%|████████▊ | 30223/34278 [33:08:00<3:29:41, 3.10s/it] 88%|████████▊ | 30224/34278 [33:08:05<3:58:12, 3.53s/it] {'loss': 0.109, 'grad_norm': 0.8011985273552965, 'learning_rate': 3.6235422650610863e-07, 'epoch': 0.88} 88%|████████▊ | 30224/34278 [33:08:05<3:58:12, 3.53s/it] 88%|████████▊ | 30225/34278 [33:08:11<4:48:50, 4.28s/it] {'loss': 0.1159, 'grad_norm': 0.8914904698648717, 'learning_rate': 3.62177674560123e-07, 'epoch': 0.88} 88%|████████▊ | 30225/34278 [33:08:11<4:48:50, 4.28s/it] 88%|████████▊ | 30226/34278 [33:08:14<4:27:38, 3.96s/it] {'loss': 0.1239, 'grad_norm': 0.6703580860724604, 'learning_rate': 3.6200116401950704e-07, 'epoch': 0.88} 88%|████████▊ | 30226/34278 [33:08:14<4:27:38, 3.96s/it] 88%|████████▊ | 30227/34278 [33:08:17<4:07:41, 3.67s/it] {'loss': 0.0931, 'grad_norm': 0.9313783388406465, 'learning_rate': 3.6182469488583836e-07, 'epoch': 0.88} 88%|████████▊ | 30227/34278 [33:08:17<4:07:41, 3.67s/it] 88%|████████▊ | 30228/34278 
[33:08:20<3:53:29, 3.46s/it] {'loss': 0.1236, 'grad_norm': 1.0304906920137527, 'learning_rate': 3.616482671606908e-07, 'epoch': 0.88} 88%|████████▊ | 30228/34278 [33:08:20<3:53:29, 3.46s/it] 88%|████████▊ | 30229/34278 [33:08:23<3:48:54, 3.39s/it] {'loss': 0.1131, 'grad_norm': 0.7672447611229704, 'learning_rate': 3.6147188084564075e-07, 'epoch': 0.88} 88%|████████▊ | 30229/34278 [33:08:23<3:48:54, 3.39s/it] 88%|████████▊ | 30230/34278 [33:08:27<3:59:11, 3.55s/it] {'loss': 0.1032, 'grad_norm': 0.8632346594672483, 'learning_rate': 3.612955359422621e-07, 'epoch': 0.88} 88%|████████▊ | 30230/34278 [33:08:27<3:59:11, 3.55s/it] 88%|████████▊ | 30231/34278 [33:08:30<3:54:14, 3.47s/it] {'loss': 0.1315, 'grad_norm': 0.8417856631660557, 'learning_rate': 3.611192324521301e-07, 'epoch': 0.88} 88%|████████▊ | 30231/34278 [33:08:30<3:54:14, 3.47s/it] 88%|████████▊ | 30232/34278 [33:08:35<4:21:32, 3.88s/it] {'loss': 0.0889, 'grad_norm': 0.7716576250477684, 'learning_rate': 3.6094297037681857e-07, 'epoch': 0.88} 88%|████████▊ | 30232/34278 [33:08:35<4:21:32, 3.88s/it] 88%|████████▊ | 30233/34278 [33:08:39<4:23:05, 3.90s/it] {'loss': 0.1137, 'grad_norm': 0.5673189806323955, 'learning_rate': 3.607667497178996e-07, 'epoch': 0.88} 88%|████████▊ | 30233/34278 [33:08:39<4:23:05, 3.90s/it] 88%|████████▊ | 30234/34278 [33:08:42<4:09:50, 3.71s/it] {'loss': 0.1183, 'grad_norm': 1.2397629243526644, 'learning_rate': 3.6059057047694745e-07, 'epoch': 0.88} 88%|████████▊ | 30234/34278 [33:08:42<4:09:50, 3.71s/it] 88%|████████▊ | 30235/34278 [33:08:45<3:52:44, 3.45s/it] {'loss': 0.0991, 'grad_norm': 0.7958689435050259, 'learning_rate': 3.6041443265553645e-07, 'epoch': 0.88} 88%|████████▊ | 30235/34278 [33:08:45<3:52:44, 3.45s/it] 88%|████████▊ | 30236/34278 [33:08:51<4:28:17, 3.98s/it] {'loss': 0.0997, 'grad_norm': 0.7340789986518017, 'learning_rate': 3.602383362552375e-07, 'epoch': 0.88} 88%|████████▊ | 30236/34278 [33:08:51<4:28:17, 3.98s/it] 88%|████████▊ | 30237/34278 [33:08:54<4:08:58, 
3.70s/it] {'loss': 0.1301, 'grad_norm': 1.2508656996866652, 'learning_rate': 3.6006228127762275e-07, 'epoch': 0.88} 88%|████████▊ | 30237/34278 [33:08:54<4:08:58, 3.70s/it] 88%|████████▊ | 30238/34278 [33:08:56<3:52:47, 3.46s/it] {'loss': 0.1124, 'grad_norm': 0.9875033771728545, 'learning_rate': 3.598862677242643e-07, 'epoch': 0.88} 88%|████████▊ | 30238/34278 [33:08:56<3:52:47, 3.46s/it] 88%|████████▊ | 30239/34278 [33:09:00<3:47:22, 3.38s/it] {'loss': 0.0895, 'grad_norm': 0.8114853958236029, 'learning_rate': 3.5971029559673407e-07, 'epoch': 0.88} 88%|████████▊ | 30239/34278 [33:09:00<3:47:22, 3.38s/it] 88%|████████▊ | 30240/34278 [33:09:06<4:42:08, 4.19s/it] {'loss': 0.1121, 'grad_norm': 1.0385342276752447, 'learning_rate': 3.59534364896601e-07, 'epoch': 0.88} 88%|████████▊ | 30240/34278 [33:09:06<4:42:08, 4.19s/it] 88%|████████▊ | 30241/34278 [33:09:09<4:24:59, 3.94s/it] {'loss': 0.1008, 'grad_norm': 0.7225282116525515, 'learning_rate': 3.5935847562543927e-07, 'epoch': 0.88} 88%|████████▊ | 30241/34278 [33:09:09<4:24:59, 3.94s/it] 88%|████████▊ | 30242/34278 [33:09:12<4:07:13, 3.68s/it] {'loss': 0.1013, 'grad_norm': 0.8569040528906543, 'learning_rate': 3.591826277848165e-07, 'epoch': 0.88} 88%|████████▊ | 30242/34278 [33:09:12<4:07:13, 3.68s/it] 88%|████████▊ | 30243/34278 [33:09:15<3:50:54, 3.43s/it] {'loss': 0.097, 'grad_norm': 0.7433117161667993, 'learning_rate': 3.5900682137630317e-07, 'epoch': 0.88} 88%|████████▊ | 30243/34278 [33:09:15<3:50:54, 3.43s/it] 88%|████████▊ | 30244/34278 [33:09:21<4:37:28, 4.13s/it] {'loss': 0.1158, 'grad_norm': 0.8608631443274006, 'learning_rate': 3.5883105640146965e-07, 'epoch': 0.88} 88%|████████▊ | 30244/34278 [33:09:21<4:37:28, 4.13s/it] 88%|████████▊ | 30245/34278 [33:09:24<4:16:46, 3.82s/it] {'loss': 0.1232, 'grad_norm': 1.0101357825938861, 'learning_rate': 3.5865533286188415e-07, 'epoch': 0.88} 88%|████████▊ | 30245/34278 [33:09:24<4:16:46, 3.82s/it] 88%|████████▊ | 30246/34278 [33:09:27<4:03:37, 3.63s/it] {'loss': 
0.1139, 'grad_norm': 0.8340525484015279, 'learning_rate': 3.58479650759116e-07, 'epoch': 0.88} 88%|████████▊ | 30246/34278 [33:09:27<4:03:37, 3.63s/it] 88%|████████▊ | 30247/34278 [33:09:30<3:52:04, 3.45s/it] {'loss': 0.1226, 'grad_norm': 0.7593287722072896, 'learning_rate': 3.583040100947327e-07, 'epoch': 0.88} 88%|████████▊ | 30247/34278 [33:09:30<3:52:04, 3.45s/it] 88%|████████▊ | 30248/34278 [33:09:33<3:35:58, 3.22s/it] {'loss': 0.1289, 'grad_norm': 0.9341213761325549, 'learning_rate': 3.5812841087030427e-07, 'epoch': 0.88} 88%|████████▊ | 30248/34278 [33:09:33<3:35:58, 3.22s/it] 88%|████████▊ | 30249/34278 [33:09:36<3:28:53, 3.11s/it] {'loss': 0.0969, 'grad_norm': 0.7890900124503014, 'learning_rate': 3.5795285308739715e-07, 'epoch': 0.88} 88%|████████▊ | 30249/34278 [33:09:36<3:28:53, 3.11s/it] 88%|████████▊ | 30250/34278 [33:09:38<3:23:49, 3.04s/it] {'loss': 0.1042, 'grad_norm': 0.7419355140165514, 'learning_rate': 3.577773367475779e-07, 'epoch': 0.88} 88%|████████▊ | 30250/34278 [33:09:38<3:23:49, 3.04s/it] 88%|████████▊ | 30251/34278 [33:09:45<4:29:19, 4.01s/it] {'loss': 0.1147, 'grad_norm': 0.7754554477757386, 'learning_rate': 3.576018618524152e-07, 'epoch': 0.88} 88%|████████▊ | 30251/34278 [33:09:45<4:29:19, 4.01s/it] 88%|████████▊ | 30252/34278 [33:09:48<4:10:02, 3.73s/it] {'loss': 0.1177, 'grad_norm': 0.8652486385826137, 'learning_rate': 3.574264284034745e-07, 'epoch': 0.88} 88%|████████▊ | 30252/34278 [33:09:48<4:10:02, 3.73s/it] 88%|████████▊ | 30253/34278 [33:09:51<4:01:38, 3.60s/it] {'loss': 0.1053, 'grad_norm': 0.8863230988863542, 'learning_rate': 3.572510364023224e-07, 'epoch': 0.88} 88%|████████▊ | 30253/34278 [33:09:51<4:01:38, 3.60s/it] 88%|████████▊ | 30254/34278 [33:09:54<3:47:40, 3.39s/it] {'loss': 0.098, 'grad_norm': 0.7918883451187265, 'learning_rate': 3.5707568585052477e-07, 'epoch': 0.88} 88%|████████▊ | 30254/34278 [33:09:54<3:47:40, 3.39s/it] 88%|████████▊ | 30255/34278 [33:10:00<4:37:23, 4.14s/it] {'loss': 0.1235, 'grad_norm': 
0.7949978564938222, 'learning_rate': 3.569003767496476e-07, 'epoch': 0.88} 88%|████████▊ | 30255/34278 [33:10:00<4:37:23, 4.14s/it] 88%|████████▊ | 30256/34278 [33:10:03<4:21:21, 3.90s/it] {'loss': 0.1428, 'grad_norm': 1.299755252294959, 'learning_rate': 3.5672510910125526e-07, 'epoch': 0.88} 88%|████████▊ | 30256/34278 [33:10:03<4:21:21, 3.90s/it] 88%|████████▊ | 30257/34278 [33:10:08<4:28:09, 4.00s/it] {'loss': 0.103, 'grad_norm': 0.7907743204613458, 'learning_rate': 3.565498829069119e-07, 'epoch': 0.88} 88%|████████▊ | 30257/34278 [33:10:08<4:28:09, 4.00s/it] 88%|████████▊ | 30258/34278 [33:10:11<4:17:41, 3.85s/it] {'loss': 0.1137, 'grad_norm': 0.8460585996047849, 'learning_rate': 3.563746981681826e-07, 'epoch': 0.88} 88%|████████▊ | 30258/34278 [33:10:11<4:17:41, 3.85s/it] 88%|████████▊ | 30259/34278 [33:10:14<4:00:01, 3.58s/it] {'loss': 0.1239, 'grad_norm': 1.0933770669696037, 'learning_rate': 3.561995548866326e-07, 'epoch': 0.88} 88%|████████▊ | 30259/34278 [33:10:14<4:00:01, 3.58s/it] 88%|████████▊ | 30260/34278 [33:10:19<4:30:29, 4.04s/it] {'loss': 0.0979, 'grad_norm': 0.7781846495940566, 'learning_rate': 3.560244530638235e-07, 'epoch': 0.88} 88%|████████▊ | 30260/34278 [33:10:19<4:30:29, 4.04s/it] 88%|████████▊ | 30261/34278 [33:10:22<4:09:48, 3.73s/it] {'loss': 0.1074, 'grad_norm': 0.8418009683284758, 'learning_rate': 3.558493927013201e-07, 'epoch': 0.88} 88%|████████▊ | 30261/34278 [33:10:22<4:09:48, 3.73s/it] 88%|████████▊ | 30262/34278 [33:10:26<4:07:27, 3.70s/it] {'loss': 0.1113, 'grad_norm': 0.7863138275600805, 'learning_rate': 3.5567437380068515e-07, 'epoch': 0.88} 88%|████████▊ | 30262/34278 [33:10:26<4:07:27, 3.70s/it] 88%|████████▊ | 30263/34278 [33:10:29<3:53:54, 3.50s/it] {'loss': 0.1033, 'grad_norm': 1.299581592141793, 'learning_rate': 3.554993963634795e-07, 'epoch': 0.88} 88%|████████▊ | 30263/34278 [33:10:29<3:53:54, 3.50s/it] 88%|████████▊ | 30264/34278 [33:10:35<4:50:02, 4.34s/it] {'loss': 0.1229, 'grad_norm': 0.9671923105206143, 
'learning_rate': 3.5532446039126645e-07, 'epoch': 0.88} 88%|████████▊ | 30264/34278 [33:10:35<4:50:02, 4.34s/it] 88%|████████▊ | 30265/34278 [33:10:41<5:18:38, 4.76s/it] {'loss': 0.112, 'grad_norm': 0.7420217463153288, 'learning_rate': 3.551495658856091e-07, 'epoch': 0.88} 88%|████████▊ | 30265/34278 [33:10:41<5:18:38, 4.76s/it] 88%|████████▊ | 30266/34278 [33:10:44<4:49:17, 4.33s/it] {'loss': 0.1269, 'grad_norm': 0.8974301122852568, 'learning_rate': 3.5497471284806686e-07, 'epoch': 0.88} 88%|████████▊ | 30266/34278 [33:10:44<4:49:17, 4.33s/it] 88%|████████▊ | 30267/34278 [33:10:47<4:20:40, 3.90s/it] {'loss': 0.0921, 'grad_norm': 1.0018350939994582, 'learning_rate': 3.5479990128020113e-07, 'epoch': 0.88} 88%|████████▊ | 30267/34278 [33:10:47<4:20:40, 3.90s/it] 88%|████████▊ | 30268/34278 [33:10:50<3:58:33, 3.57s/it] {'loss': 0.1246, 'grad_norm': 0.9819348515423417, 'learning_rate': 3.5462513118357413e-07, 'epoch': 0.88} 88%|████████▊ | 30268/34278 [33:10:50<3:58:33, 3.57s/it] 88%|████████▊ | 30269/34278 [33:10:53<3:45:41, 3.38s/it] {'loss': 0.1112, 'grad_norm': 0.8769700484705546, 'learning_rate': 3.544504025597445e-07, 'epoch': 0.88} 88%|████████▊ | 30269/34278 [33:10:53<3:45:41, 3.38s/it] 88%|████████▊ | 30270/34278 [33:10:56<3:39:36, 3.29s/it] {'loss': 0.1104, 'grad_norm': 0.8802326679739839, 'learning_rate': 3.542757154102716e-07, 'epoch': 0.88} 88%|████████▊ | 30270/34278 [33:10:56<3:39:36, 3.29s/it] 88%|████████▊ | 30271/34278 [33:10:59<3:33:24, 3.20s/it] {'loss': 0.1093, 'grad_norm': 0.8391767456555339, 'learning_rate': 3.54101069736717e-07, 'epoch': 0.88} 88%|████████▊ | 30271/34278 [33:10:59<3:33:24, 3.20s/it] 88%|████████▊ | 30272/34278 [33:11:02<3:40:30, 3.30s/it] {'loss': 0.1079, 'grad_norm': 0.6753819973574647, 'learning_rate': 3.5392646554063935e-07, 'epoch': 0.88} 88%|████████▊ | 30272/34278 [33:11:02<3:40:30, 3.30s/it] 88%|████████▊ | 30273/34278 [33:11:06<3:46:39, 3.40s/it] {'loss': 0.1181, 'grad_norm': 0.7957394720387498, 'learning_rate': 
3.537519028235964e-07, 'epoch': 0.88} 88%|████████▊ | 30273/34278 [33:11:06<3:46:39, 3.40s/it] 88%|████████▊ | 30274/34278 [33:11:09<3:38:51, 3.28s/it] {'loss': 0.1081, 'grad_norm': 0.9294920446919498, 'learning_rate': 3.535773815871485e-07, 'epoch': 0.88} 88%|████████▊ | 30274/34278 [33:11:09<3:38:51, 3.28s/it] 88%|████████▊ | 30275/34278 [33:11:12<3:30:00, 3.15s/it] {'loss': 0.1038, 'grad_norm': 1.0407690985588065, 'learning_rate': 3.534029018328516e-07, 'epoch': 0.88} 88%|████████▊ | 30275/34278 [33:11:12<3:30:00, 3.15s/it] 88%|████████▊ | 30276/34278 [33:11:16<3:50:18, 3.45s/it] {'loss': 0.1218, 'grad_norm': 0.9803304389381334, 'learning_rate': 3.5322846356226405e-07, 'epoch': 0.88} 88%|████████▊ | 30276/34278 [33:11:16<3:50:18, 3.45s/it] 88%|████████▊ | 30277/34278 [33:11:19<3:39:29, 3.29s/it] {'loss': 0.1137, 'grad_norm': 1.0250165615672384, 'learning_rate': 3.5305406677694386e-07, 'epoch': 0.88} 88%|████████▊ | 30277/34278 [33:11:19<3:39:29, 3.29s/it] 88%|████████▊ | 30278/34278 [33:11:22<3:38:35, 3.28s/it] {'loss': 0.1113, 'grad_norm': 1.0597643994693118, 'learning_rate': 3.5287971147844823e-07, 'epoch': 0.88} 88%|████████▊ | 30278/34278 [33:11:22<3:38:35, 3.28s/it] 88%|████████▊ | 30279/34278 [33:11:25<3:33:56, 3.21s/it] {'loss': 0.107, 'grad_norm': 0.6969120680101855, 'learning_rate': 3.5270539766833257e-07, 'epoch': 0.88} 88%|████████▊ | 30279/34278 [33:11:25<3:33:56, 3.21s/it] 88%|████████▊ | 30280/34278 [33:11:28<3:24:10, 3.06s/it] {'loss': 0.1206, 'grad_norm': 1.0829992251926237, 'learning_rate': 3.5253112534815336e-07, 'epoch': 0.88} 88%|████████▊ | 30280/34278 [33:11:28<3:24:10, 3.06s/it] 88%|████████▊ | 30281/34278 [33:11:32<3:35:57, 3.24s/it] {'loss': 0.1067, 'grad_norm': 1.1661278745786465, 'learning_rate': 3.5235689451946775e-07, 'epoch': 0.88} 88%|████████▊ | 30281/34278 [33:11:32<3:35:57, 3.24s/it] 88%|████████▊ | 30282/34278 [33:11:35<3:36:34, 3.25s/it] {'loss': 0.0962, 'grad_norm': 1.0443561061873734, 'learning_rate': 3.521827051838295e-07, 
'epoch': 0.88} 88%|████████▊ | 30282/34278 [33:11:35<3:36:34, 3.25s/it] 88%|████████▊ | 30283/34278 [33:11:40<4:10:39, 3.76s/it] {'loss': 0.1177, 'grad_norm': 0.9537307633965, 'learning_rate': 3.52008557342795e-07, 'epoch': 0.88} 88%|████████▊ | 30283/34278 [33:11:40<4:10:39, 3.76s/it] 88%|████████▊ | 30284/34278 [33:11:43<3:52:55, 3.50s/it] {'loss': 0.0946, 'grad_norm': 0.7337475191683551, 'learning_rate': 3.5183445099791825e-07, 'epoch': 0.88} 88%|████████▊ | 30284/34278 [33:11:43<3:52:55, 3.50s/it] 88%|████████▊ | 30285/34278 [33:11:46<3:44:31, 3.37s/it] {'loss': 0.1096, 'grad_norm': 1.1754368881679809, 'learning_rate': 3.516603861507545e-07, 'epoch': 0.88} 88%|████████▊ | 30285/34278 [33:11:46<3:44:31, 3.37s/it] 88%|████████▊ | 30286/34278 [33:11:49<3:37:08, 3.26s/it] {'loss': 0.1121, 'grad_norm': 0.7435421860743248, 'learning_rate': 3.5148636280285697e-07, 'epoch': 0.88} 88%|████████▊ | 30286/34278 [33:11:49<3:37:08, 3.26s/it] 88%|████████▊ | 30287/34278 [33:11:55<4:31:44, 4.09s/it] {'loss': 0.1186, 'grad_norm': 1.0549735837093794, 'learning_rate': 3.513123809557789e-07, 'epoch': 0.88} 88%|████████▊ | 30287/34278 [33:11:55<4:31:44, 4.09s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 88%|████████▊ | 30288/34278 [33:11:58<4:13:31, 3.81s/it] {'loss': 0.1243, 'grad_norm': 0.8718229764373705, 'learning_rate': 3.5113844061107404e-07, 'epoch': 0.88} 88%|████████▊ | 30288/34278 [33:11:58<4:13:31, 3.81s/it] 88%|████████▊ | 30289/34278 [33:12:01<3:54:28, 3.53s/it] {'loss': 0.1217, 'grad_norm': 2.081912003183867, 'learning_rate': 3.509645417702967e-07, 'epoch': 0.88} 88%|████████▊ | 30289/34278 [33:12:01<3:54:28, 3.53s/it] 88%|████████▊ | 30290/34278 [33:12:04<3:39:45, 3.31s/it] {'loss': 0.1147, 'grad_norm': 1.1329910540016377, 'learning_rate': 3.5079068443499676e-07, 'epoch': 0.88} 88%|████████▊ | 30290/34278 [33:12:04<3:39:45, 3.31s/it] 88%|████████▊ | 30291/34278 [33:12:07<3:38:32, 3.29s/it] {'loss': 0.0998, 'grad_norm': 0.8690301360376276, 'learning_rate': 3.506168686067285e-07, 'epoch': 0.88} 88%|████████▊ | 30291/34278 [33:12:07<3:38:32, 3.29s/it] 88%|████████▊ | 30292/34278 [33:12:10<3:39:53, 3.31s/it] {'loss': 0.09, 'grad_norm': 1.1275935446741046, 'learning_rate': 3.504430942870429e-07, 'epoch': 0.88} 88%|████████▊ | 30292/34278 [33:12:10<3:39:53, 3.31s/it] 88%|████████▊ | 30293/34278 [33:12:15<4:02:29, 3.65s/it] {'loss': 0.1115, 'grad_norm': 0.7895830958470886, 'learning_rate': 3.5026936147749104e-07, 'epoch': 0.88} 88%|████████▊ | 30293/34278 [33:12:15<4:02:29, 3.65s/it] 88%|████████▊ | 30294/34278 [33:12:18<3:48:30, 3.44s/it] {'loss': 0.1199, 'grad_norm': 0.8889682049730542, 'learning_rate': 3.500956701796243e-07, 'epoch': 0.88} 88%|████████▊ | 30294/34278 [33:12:18<3:48:30, 3.44s/it] 88%|████████▊ | 30295/34278 [33:12:22<3:58:04, 3.59s/it] {'loss': 0.1126, 'grad_norm': 0.7569732883636934, 'learning_rate': 3.4992202039499377e-07, 'epoch': 0.88} 88%|████████▊ | 30295/34278 [33:12:22<3:58:04, 3.59s/it] 88%|████████▊ | 30296/34278 [33:12:27<4:43:57, 4.28s/it] {'loss': 0.0896, 'grad_norm': 0.7844529605231352, 'learning_rate': 3.497484121251499e-07, 'epoch': 0.88} 88%|████████▊ | 30296/34278 [33:12:27<4:43:57, 
[progress 88%, steps 30297–30332/34278: loss ≈0.09–0.13, grad_norm ≈0.67–1.49, learning_rate 3.496e-07 → 3.435e-07, epoch 0.88]
[progress 88%, step 30333/34278: loss 0.135, learning_rate 3.435e-07; torch.utils.checkpoint.py:87 UserWarning repeated: "None of the inputs have requires_grad=True. Gradients will be None"]
[progress 89%, steps 30334–30349/34278: loss ≈0.09–0.13, grad_norm ≈0.69–1.02, learning_rate → 3.406e-07; epoch crosses 0.88 → 0.89 at step 30337]
3.404352578389214e-07, 'epoch': 0.89} 89%|████████▊ | 30350/34278 [33:15:54<3:42:49, 3.40s/it] 89%|████████▊ | 30351/34278 [33:15:57<3:42:18, 3.40s/it] {'loss': 0.1173, 'grad_norm': 0.715757383092985, 'learning_rate': 3.4026393525893266e-07, 'epoch': 0.89} 89%|████████▊ | 30351/34278 [33:15:57<3:42:18, 3.40s/it] 89%|████████▊ | 30352/34278 [33:16:01<3:36:33, 3.31s/it] {'loss': 0.1016, 'grad_norm': 0.7882231160982873, 'learning_rate': 3.4009265427995483e-07, 'epoch': 0.89} 89%|████████▊ | 30352/34278 [33:16:01<3:36:33, 3.31s/it] 89%|████████▊ | 30353/34278 [33:16:06<4:27:27, 4.09s/it] {'loss': 0.0949, 'grad_norm': 0.7943256913202561, 'learning_rate': 3.3992141490351685e-07, 'epoch': 0.89} 89%|████████▊ | 30353/34278 [33:16:06<4:27:27, 4.09s/it] 89%|████████▊ | 30354/34278 [33:16:11<4:27:41, 4.09s/it] {'loss': 0.1012, 'grad_norm': 0.792414663988053, 'learning_rate': 3.3975021713114844e-07, 'epoch': 0.89} 89%|████████▊ | 30354/34278 [33:16:11<4:27:41, 4.09s/it] 89%|████████▊ | 30355/34278 [33:16:14<4:23:15, 4.03s/it] {'loss': 0.1585, 'grad_norm': 0.9474072103656929, 'learning_rate': 3.395790609643779e-07, 'epoch': 0.89} 89%|████████▊ | 30355/34278 [33:16:14<4:23:15, 4.03s/it] 89%|████████▊ | 30356/34278 [33:16:17<3:57:20, 3.63s/it] {'loss': 0.1174, 'grad_norm': 1.0532927138993091, 'learning_rate': 3.3940794640473284e-07, 'epoch': 0.89} 89%|████████▊ | 30356/34278 [33:16:17<3:57:20, 3.63s/it] 89%|████████▊ | 30357/34278 [33:16:20<3:50:17, 3.52s/it] {'loss': 0.1171, 'grad_norm': 0.6979188327652479, 'learning_rate': 3.3923687345374046e-07, 'epoch': 0.89} 89%|████████▊ | 30357/34278 [33:16:20<3:50:17, 3.52s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fef53719ee0>
Failed to fetch sample 3687537.
Exception: cannot identify image file <_io.BytesIO object at 0x7fef53719ee0> 89%|████████▊ | 30358/34278 [33:16:26<4:22:04, 4.01s/it] {'loss': 0.1041, 'grad_norm': 1.392831635861828, 'learning_rate': 3.390658421129295e-07, 'epoch': 0.89} 89%|████████▊ | 30358/34278 [33:16:26<4:22:04, 4.01s/it] 89%|████████▊ | 30359/34278 [33:16:29<4:09:06, 3.81s/it] {'loss': 0.1172, 'grad_norm': 0.756555154112626, 'learning_rate': 3.388948523838259e-07, 'epoch': 0.89} 89%|████████▊ | 30359/34278 [33:16:29<4:09:06, 3.81s/it] 89%|████████▊ | 30360/34278 [33:16:32<3:53:21, 3.57s/it] {'loss': 0.1008, 'grad_norm': 1.1926946760902775, 'learning_rate': 3.3872390426795467e-07, 'epoch': 0.89} 89%|████████▊ | 30360/34278 [33:16:32<3:53:21, 3.57s/it] 89%|████████▊ | 30361/34278 [33:16:37<4:22:25, 4.02s/it] {'loss': 0.1133, 'grad_norm': 0.7766733283219366, 'learning_rate': 3.385529977668456e-07, 'epoch': 0.89} 89%|████████▊ | 30361/34278 [33:16:37<4:22:25, 4.02s/it] 89%|████████▊ | 30362/34278 [33:16:40<4:09:53, 3.83s/it] {'loss': 0.108, 'grad_norm': 0.7369969660292518, 'learning_rate': 3.383821328820225e-07, 'epoch': 0.89} 89%|████████▊ | 30362/34278 [33:16:40<4:09:53, 3.83s/it] 89%|████████▊ | 30363/34278 [33:16:46<4:38:24, 4.27s/it] {'loss': 0.1336, 'grad_norm': 1.0297691496982326, 'learning_rate': 3.382113096150097e-07, 'epoch': 0.89} 89%|████████▊ | 30363/34278 [33:16:46<4:38:24, 4.27s/it] 89%|████████▊ | 30364/34278 [33:16:49<4:19:56, 3.98s/it] {'loss': 0.1031, 'grad_norm': 0.813095504044418, 'learning_rate': 3.380405279673349e-07, 'epoch': 0.89} 89%|████████▊ | 30364/34278 [33:16:49<4:19:56, 3.98s/it] 89%|████████▊ | 30365/34278 [33:16:52<3:58:32, 3.66s/it] {'loss': 0.1379, 'grad_norm': 0.9703168646153482, 'learning_rate': 3.3786978794052126e-07, 'epoch': 0.89} 89%|████████▊ | 30365/34278 [33:16:52<3:58:32, 3.66s/it] 89%|████████▊ | 30366/34278 [33:16:56<4:00:10, 3.68s/it] {'loss': 0.1289, 'grad_norm': 0.9349777106145506, 'learning_rate': 3.376990895360921e-07, 'epoch': 0.89} 
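The traceback above shows corrupt image bytes reaching `PIL.Image.open`; training continues because `__getitem__` logs `Failed to fetch sample N` and fetches a different sample instead of crashing. A self-contained sketch of that skip-and-retry pattern (the magic-number check, the `RetryingDataset` class, and the deterministic next-index fallback are illustrative assumptions, not the actual `dataset.py` code, which pulls bytes from Ceph via `tcs_loader`):

```python
import io

class UnidentifiedImageError(Exception):
    """Stand-in for PIL.UnidentifiedImageError, so the sketch needs no Pillow."""

def pil_loader(img_bytes: bytes) -> bytes:
    # Stand-in for the PIL-based loader: reject buffers that do not start
    # with a PNG or JPEG magic number instead of calling Image.open.
    buff = io.BytesIO(img_bytes)
    header = buff.read(4)
    if not (header.startswith(b"\x89PNG") or header.startswith(b"\xff\xd8")):
        raise UnidentifiedImageError("cannot identify image file")
    return header

class RetryingDataset:
    """Skip corrupt samples instead of aborting the whole training run."""

    def __init__(self, samples, max_retries: int = 10):
        self.samples = samples
        self.max_retries = max_retries

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return pil_loader(self.samples[i])
            except UnidentifiedImageError as e:
                # Mirrors the "Failed to fetch sample N. Exception: ..." lines.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)  # hypothetical fallback; the real code may resample randomly
        raise RuntimeError("too many consecutive corrupt samples")
```

Only the skip-and-log behavior is taken from the trace itself; how the replacement index is chosen is not visible in this log.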
89%|████████▊ | 30366/34278 [33:16:56<4:00:10, 3.68s/it] 89%|████████▊ | 30367/34278 [33:16:58<3:43:34, 3.43s/it] {'loss': 0.0895, 'grad_norm': 0.8325957039835654, 'learning_rate': 3.3752843275557224e-07, 'epoch': 0.89} 89%|████████▊ | 30367/34278 [33:16:58<3:43:34, 3.43s/it] 89%|████████▊ | 30368/34278 [33:17:01<3:34:42, 3.29s/it] {'loss': 0.1073, 'grad_norm': 0.8702869652325833, 'learning_rate': 3.3735781760048714e-07, 'epoch': 0.89} 89%|████████▊ | 30368/34278 [33:17:01<3:34:42, 3.29s/it] 89%|████████▊ | 30369/34278 [33:17:05<3:30:57, 3.24s/it] {'loss': 0.1108, 'grad_norm': 0.7044139624804796, 'learning_rate': 3.371872440723578e-07, 'epoch': 0.89} 89%|████████▊ | 30369/34278 [33:17:05<3:30:57, 3.24s/it] 89%|████████▊ | 30370/34278 [33:17:07<3:21:24, 3.09s/it] {'loss': 0.1027, 'grad_norm': 0.8736508089684976, 'learning_rate': 3.370167121727069e-07, 'epoch': 0.89} 89%|████████▊ | 30370/34278 [33:17:07<3:21:24, 3.09s/it] 89%|████████▊ | 30371/34278 [33:17:10<3:22:16, 3.11s/it] {'loss': 0.0869, 'grad_norm': 0.7745660315239808, 'learning_rate': 3.3684622190305825e-07, 'epoch': 0.89} 89%|████████▊ | 30371/34278 [33:17:10<3:22:16, 3.11s/it] 89%|████████▊ | 30372/34278 [33:17:14<3:31:11, 3.24s/it] {'loss': 0.1149, 'grad_norm': 0.6383409050373748, 'learning_rate': 3.3667577326493283e-07, 'epoch': 0.89} 89%|████████▊ | 30372/34278 [33:17:14<3:31:11, 3.24s/it] 89%|████████▊ | 30373/34278 [33:17:17<3:34:00, 3.29s/it] {'loss': 0.1091, 'grad_norm': 0.7618712373043488, 'learning_rate': 3.3650536625985384e-07, 'epoch': 0.89} 89%|████████▊ | 30373/34278 [33:17:17<3:34:00, 3.29s/it] 89%|████████▊ | 30374/34278 [33:17:21<3:30:32, 3.24s/it] {'loss': 0.1144, 'grad_norm': 1.0625097758772686, 'learning_rate': 3.363350008893407e-07, 'epoch': 0.89} 89%|████████▊ | 30374/34278 [33:17:21<3:30:32, 3.24s/it] 89%|████████▊ | 30375/34278 [33:17:24<3:38:14, 3.35s/it] {'loss': 0.1043, 'grad_norm': 0.7828996684426, 'learning_rate': 3.3616467715491654e-07, 'epoch': 0.89} 89%|████████▊ | 
30375/34278 [33:17:24<3:38:14, 3.35s/it] 89%|████████▊ | 30376/34278 [33:17:28<3:40:18, 3.39s/it] {'loss': 0.1211, 'grad_norm': 0.70035622400141, 'learning_rate': 3.3599439505810015e-07, 'epoch': 0.89} 89%|████████▊ | 30376/34278 [33:17:28<3:40:18, 3.39s/it] 89%|████████▊ | 30377/34278 [33:17:30<3:29:35, 3.22s/it] {'loss': 0.1268, 'grad_norm': 0.8927629012197409, 'learning_rate': 3.358241546004121e-07, 'epoch': 0.89} 89%|████████▊ | 30377/34278 [33:17:30<3:29:35, 3.22s/it] 89%|████████▊ | 30378/34278 [33:17:33<3:22:38, 3.12s/it] {'loss': 0.1323, 'grad_norm': 0.9462367269295212, 'learning_rate': 3.3565395578337214e-07, 'epoch': 0.89} 89%|████████▊ | 30378/34278 [33:17:33<3:22:38, 3.12s/it] 89%|████████▊ | 30379/34278 [33:17:36<3:20:06, 3.08s/it] {'loss': 0.1314, 'grad_norm': 0.974531682763787, 'learning_rate': 3.354837986085013e-07, 'epoch': 0.89} 89%|████████▊ | 30379/34278 [33:17:36<3:20:06, 3.08s/it] 89%|████████▊ | 30380/34278 [33:17:43<4:21:49, 4.03s/it] {'loss': 0.1071, 'grad_norm': 0.8371538846374478, 'learning_rate': 3.353136830773168e-07, 'epoch': 0.89} 89%|████████▊ | 30380/34278 [33:17:43<4:21:49, 4.03s/it] 89%|████████▊ | 30381/34278 [33:17:46<4:07:45, 3.81s/it] {'loss': 0.122, 'grad_norm': 0.8964652179080821, 'learning_rate': 3.351436091913385e-07, 'epoch': 0.89} 89%|████████▊ | 30381/34278 [33:17:46<4:07:45, 3.81s/it] 89%|████████▊ | 30382/34278 [33:17:49<3:48:56, 3.53s/it] {'loss': 0.0863, 'grad_norm': 0.772242333638482, 'learning_rate': 3.349735769520851e-07, 'epoch': 0.89} 89%|████████▊ | 30382/34278 [33:17:49<3:48:56, 3.53s/it] 89%|████████▊ | 30383/34278 [33:17:55<4:33:48, 4.22s/it] {'loss': 0.1095, 'grad_norm': 0.7731169190457556, 'learning_rate': 3.3480358636107267e-07, 'epoch': 0.89} 89%|████████▊ | 30383/34278 [33:17:55<4:33:48, 4.22s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 89%|████████▊ | 30384/34278 [33:17:58<4:22:07, 4.04s/it] {'loss': 0.112, 'grad_norm': 0.7525963182497357, 'learning_rate': 3.346336374198206e-07, 'epoch': 0.89} 89%|████████▊ | 30384/34278 [33:17:58<4:22:07, 4.04s/it] 89%|████████▊ | 30385/34278 [33:18:01<3:58:00, 3.67s/it] {'loss': 0.1016, 'grad_norm': 0.7043585510672596, 'learning_rate': 3.3446373012984647e-07, 'epoch': 0.89} 89%|████████▊ | 30385/34278 [33:18:01<3:58:00, 3.67s/it] 89%|████████▊ | 30386/34278 [33:18:07<4:42:32, 4.36s/it] {'loss': 0.1052, 'grad_norm': 0.6955703191132063, 'learning_rate': 3.3429386449266634e-07, 'epoch': 0.89} 89%|████████▊ | 30386/34278 [33:18:07<4:42:32, 4.36s/it] 89%|████████▊ | 30387/34278 [33:18:10<4:18:57, 3.99s/it] {'loss': 0.1084, 'grad_norm': 0.9402183031429808, 'learning_rate': 3.3412404050979564e-07, 'epoch': 0.89} 89%|████████▊ | 30387/34278 [33:18:10<4:18:57, 3.99s/it] 89%|████████▊ | 30388/34278 [33:18:14<4:08:38, 3.83s/it] {'loss': 0.0924, 'grad_norm': 0.9400248364293934, 'learning_rate': 3.3395425818275264e-07, 'epoch': 0.89} 89%|████████▊ | 30388/34278 [33:18:14<4:08:38, 3.83s/it] 89%|████████▊ | 30389/34278 [33:18:16<3:50:37, 3.56s/it] {'loss': 0.1098, 'grad_norm': 0.893022260749343, 'learning_rate': 3.337845175130522e-07, 'epoch': 0.89} 89%|████████▊ | 30389/34278 [33:18:16<3:50:37, 3.56s/it] 89%|████████▊ | 30390/34278 [33:18:22<4:30:38, 4.18s/it] {'loss': 0.1274, 'grad_norm': 0.9950712262562712, 'learning_rate': 3.336148185022081e-07, 'epoch': 0.89} 89%|████████▊ | 30390/34278 [33:18:22<4:30:38, 4.18s/it] 89%|████████▊ | 30391/34278 [33:18:26<4:26:16, 4.11s/it] {'loss': 0.0984, 'grad_norm': 0.8041470734874085, 'learning_rate': 3.3344516115173863e-07, 'epoch': 0.89} 89%|████████▊ | 30391/34278 [33:18:26<4:26:16, 4.11s/it] 89%|████████▊ | 30392/34278 [33:18:29<4:08:27, 3.84s/it] {'loss': 0.1172, 'grad_norm': 0.832538593492917, 'learning_rate': 3.33275545463157e-07, 'epoch': 0.89} 89%|████████▊ | 30392/34278 [33:18:29<4:08:27, 
3.84s/it] 89%|████████▊ | 30393/34278 [33:18:34<4:32:57, 4.22s/it] {'loss': 0.1215, 'grad_norm': 0.9968711353932206, 'learning_rate': 3.3310597143797585e-07, 'epoch': 0.89} 89%|████████▊ | 30393/34278 [33:18:34<4:32:57, 4.22s/it] 89%|████████▊ | 30394/34278 [33:18:38<4:30:00, 4.17s/it] {'loss': 0.1069, 'grad_norm': 0.7186352544941272, 'learning_rate': 3.329364390777118e-07, 'epoch': 0.89} 89%|████████▊ | 30394/34278 [33:18:38<4:30:00, 4.17s/it] 89%|████████▊ | 30395/34278 [33:18:42<4:09:40, 3.86s/it] {'loss': 0.1224, 'grad_norm': 1.319439937713482, 'learning_rate': 3.327669483838758e-07, 'epoch': 0.89} 89%|████████▊ | 30395/34278 [33:18:42<4:09:40, 3.86s/it] 89%|████████▊ | 30396/34278 [33:18:45<3:52:23, 3.59s/it] {'loss': 0.0908, 'grad_norm': 0.8814657572629593, 'learning_rate': 3.325974993579839e-07, 'epoch': 0.89} 89%|████████▊ | 30396/34278 [33:18:45<3:52:23, 3.59s/it] 89%|████████▊ | 30397/34278 [33:18:50<4:28:16, 4.15s/it] {'loss': 0.107, 'grad_norm': 0.8183405807279585, 'learning_rate': 3.3242809200154603e-07, 'epoch': 0.89} 89%|████████▊ | 30397/34278 [33:18:50<4:28:16, 4.15s/it] 89%|████████▊ | 30398/34278 [33:18:53<4:11:03, 3.88s/it] {'loss': 0.1071, 'grad_norm': 0.9222259577856428, 'learning_rate': 3.3225872631607646e-07, 'epoch': 0.89} 89%|████████▊ | 30398/34278 [33:18:53<4:11:03, 3.88s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f96a1ad7650>
Failed to fetch sample 3353704.
Exception: cannot identify image file <_io.BytesIO object at 0x7f96a1ad7650>
89%|████████▊ | 30399/34278 [33:18:57<4:04:25, 3.78s/it] {'loss': 0.1116, 'grad_norm': 0.8522762644758137, 'learning_rate': 3.320894023030868e-07, 'epoch': 0.89} 89%|████████▊ | 30399/34278 [33:18:57<4:04:25, 3.78s/it] 89%|████████▊ | 30400/34278 [33:19:00<3:50:11, 3.56s/it] {'loss': 0.0998, 'grad_norm': 0.7650674869070023, 'learning_rate': 3.319201199640881e-07, 'epoch': 0.89} 89%|████████▊ | 30400/34278 [33:19:00<3:50:11, 3.56s/it] 89%|████████▊ | 30401/34278 [33:19:05<4:15:04, 3.95s/it] {'loss': 0.103, 'grad_norm': 0.7338584267253627, 'learning_rate': 3.3175087930059246e-07, 'epoch': 0.89} 89%|████████▊ | 30401/34278 [33:19:05<4:15:04, 3.95s/it] 89%|████████▊ | 30402/34278 [33:19:08<3:55:20, 3.64s/it] {'loss': 0.1184, 'grad_norm': 1.0480343563621168, 'learning_rate': 3.3158168031411085e-07, 'epoch': 0.89} 89%|████████▊ | 30402/34278 [33:19:08<3:55:20, 3.64s/it] 89%|████████▊ | 30403/34278 [33:19:11<3:43:41, 3.46s/it] {'loss': 0.1189, 'grad_norm': 0.9729019843719371, 'learning_rate': 3.3141252300615377e-07, 'epoch': 0.89} 89%|████████▊ | 30403/34278 [33:19:11<3:43:41, 3.46s/it] 89%|████████▊ | 30404/34278 [33:19:17<4:36:19, 4.28s/it] {'loss': 0.1196, 'grad_norm': 0.7023401695125001,
'learning_rate': 3.3124340737823056e-07, 'epoch': 0.89} 89%|████████▊ | 30404/34278 [33:19:17<4:36:19, 4.28s/it] 89%|████████▊ | 30405/34278 [33:19:23<5:15:32, 4.89s/it] {'loss': 0.1112, 'grad_norm': 0.7756368928637107, 'learning_rate': 3.3107433343185224e-07, 'epoch': 0.89} 89%|████████▊ | 30405/34278 [33:19:23<5:15:32, 4.89s/it] 89%|████████▊ | 30406/34278 [33:19:26<4:38:44, 4.32s/it] {'loss': 0.1031, 'grad_norm': 0.8369702667736236, 'learning_rate': 3.3090530116852757e-07, 'epoch': 0.89} 89%|████████▊ | 30406/34278 [33:19:26<4:38:44, 4.32s/it] 89%|████████▊ | 30407/34278 [33:19:29<4:14:40, 3.95s/it] {'loss': 0.0888, 'grad_norm': 1.2126969738676157, 'learning_rate': 3.3073631058976486e-07, 'epoch': 0.89} 89%|████████▊ | 30407/34278 [33:19:29<4:14:40, 3.95s/it] 89%|████████▊ | 30408/34278 [33:19:33<4:03:09, 3.77s/it] {'loss': 0.1267, 'grad_norm': 0.8124887938795178, 'learning_rate': 3.305673616970745e-07, 'epoch': 0.89} 89%|████████▊ | 30408/34278 [33:19:33<4:03:09, 3.77s/it] 89%|████████▊ | 30409/34278 [33:19:36<3:51:27, 3.59s/it] {'loss': 0.1546, 'grad_norm': 0.8935019345644799, 'learning_rate': 3.3039845449196473e-07, 'epoch': 0.89} 89%|████████▊ | 30409/34278 [33:19:36<3:51:27, 3.59s/it] 89%|████████▊ | 30410/34278 [33:19:39<3:39:41, 3.41s/it] {'loss': 0.1252, 'grad_norm': 0.820766387428527, 'learning_rate': 3.3022958897594157e-07, 'epoch': 0.89} 89%|████████▊ | 30410/34278 [33:19:39<3:39:41, 3.41s/it] 89%|████████▊ | 30411/34278 [33:19:44<4:06:43, 3.83s/it] {'loss': 0.1052, 'grad_norm': 1.0688200141873976, 'learning_rate': 3.30060765150515e-07, 'epoch': 0.89} 89%|████████▊ | 30411/34278 [33:19:44<4:06:43, 3.83s/it] 89%|████████▊ | 30412/34278 [33:19:50<4:49:59, 4.50s/it] {'loss': 0.1005, 'grad_norm': 0.9664418098268349, 'learning_rate': 3.2989198301719095e-07, 'epoch': 0.89} 89%|████████▊ | 30412/34278 [33:19:50<4:49:59, 4.50s/it] 89%|████████▊ | 30413/34278 [33:19:53<4:29:56, 4.19s/it] {'loss': 0.1285, 'grad_norm': 1.0969730287842412, 'learning_rate': 
3.2972324257747543e-07, 'epoch': 0.89} 89%|████████▊ | 30413/34278 [33:19:53<4:29:56, 4.19s/it] 89%|████████▊ | 30414/34278 [33:19:56<4:03:16, 3.78s/it] {'loss': 0.1015, 'grad_norm': 0.9190987287618725, 'learning_rate': 3.295545438328762e-07, 'epoch': 0.89} 89%|████████▊ | 30414/34278 [33:19:56<4:03:16, 3.78s/it] 89%|████████▊ | 30415/34278 [33:19:59<3:52:43, 3.61s/it] {'loss': 0.1077, 'grad_norm': 1.037815457992433, 'learning_rate': 3.293858867848998e-07, 'epoch': 0.89} 89%|████████▊ | 30415/34278 [33:19:59<3:52:43, 3.61s/it] 89%|████████▊ | 30416/34278 [33:20:02<3:42:17, 3.45s/it] {'loss': 0.1158, 'grad_norm': 0.749096335415323, 'learning_rate': 3.2921727143505114e-07, 'epoch': 0.89} 89%|████████▊ | 30416/34278 [33:20:02<3:42:17, 3.45s/it] 89%|████████▊ | 30417/34278 [33:20:05<3:33:25, 3.32s/it] {'loss': 0.1025, 'grad_norm': 0.805998932613953, 'learning_rate': 3.290486977848345e-07, 'epoch': 0.89} 89%|████████▊ | 30417/34278 [33:20:05<3:33:25, 3.32s/it] 89%|████████▊ | 30418/34278 [33:20:11<4:27:25, 4.16s/it] {'loss': 0.1073, 'grad_norm': 0.6761166927583815, 'learning_rate': 3.2888016583575765e-07, 'epoch': 0.89} 89%|████████▊ | 30418/34278 [33:20:11<4:27:25, 4.16s/it] 89%|████████▊ | 30419/34278 [33:20:14<4:03:30, 3.79s/it] {'loss': 0.0964, 'grad_norm': 0.8600428273131147, 'learning_rate': 3.2871167558932214e-07, 'epoch': 0.89} 89%|████████▊ | 30419/34278 [33:20:14<4:03:30, 3.79s/it] 89%|████████▊ | 30420/34278 [33:20:18<3:55:28, 3.66s/it] {'loss': 0.1233, 'grad_norm': 0.8516478008239923, 'learning_rate': 3.28543227047034e-07, 'epoch': 0.89} 89%|████████▊ | 30420/34278 [33:20:18<3:55:28, 3.66s/it] 89%|████████▊ | 30421/34278 [33:20:21<3:45:09, 3.50s/it] {'loss': 0.108, 'grad_norm': 0.9747511822210017, 'learning_rate': 3.2837482021039757e-07, 'epoch': 0.89} 89%|████████▊ | 30421/34278 [33:20:21<3:45:09, 3.50s/it] 89%|████████▉ | 30422/34278 [33:20:24<3:41:39, 3.45s/it] {'loss': 0.1074, 'grad_norm': 0.673979853826961, 'learning_rate': 3.282064550809155e-07, 
'epoch': 0.89} 89%|████████▉ | 30422/34278 [33:20:24<3:41:39, 3.45s/it] 89%|████████▉ | 30423/34278 [33:20:29<4:09:37, 3.89s/it] {'loss': 0.137, 'grad_norm': 1.093949927830266, 'learning_rate': 3.2803813166009004e-07, 'epoch': 0.89} 89%|████████▉ | 30423/34278 [33:20:29<4:09:37, 3.89s/it] 89%|████████▉ | 30424/34278 [33:20:32<3:51:43, 3.61s/it] {'loss': 0.1227, 'grad_norm': 0.9164141893291394, 'learning_rate': 3.2786984994942596e-07, 'epoch': 0.89} 89%|████████▉ | 30424/34278 [33:20:32<3:51:43, 3.61s/it] 89%|████████▉ | 30425/34278 [33:20:35<3:42:36, 3.47s/it] {'loss': 0.0866, 'grad_norm': 0.760827304849373, 'learning_rate': 3.2770160995042323e-07, 'epoch': 0.89} 89%|████████▉ | 30425/34278 [33:20:35<3:42:36, 3.47s/it] 89%|████████▉ | 30426/34278 [33:20:41<4:31:58, 4.24s/it] {'loss': 0.117, 'grad_norm': 1.0346784719106643, 'learning_rate': 3.275334116645867e-07, 'epoch': 0.89} 89%|████████▉ | 30426/34278 [33:20:41<4:31:58, 4.24s/it] 89%|████████▉ | 30427/34278 [33:20:47<4:55:23, 4.60s/it] {'loss': 0.1127, 'grad_norm': 0.7707234390139983, 'learning_rate': 3.2736525509341476e-07, 'epoch': 0.89} 89%|████████▉ | 30427/34278 [33:20:47<4:55:23, 4.60s/it] 89%|████████▉ | 30428/34278 [33:20:49<4:20:33, 4.06s/it] {'loss': 0.1262, 'grad_norm': 0.8335679426506698, 'learning_rate': 3.2719714023841163e-07, 'epoch': 0.89} 89%|████████▉ | 30428/34278 [33:20:49<4:20:33, 4.06s/it] 89%|████████▉ | 30429/34278 [33:20:53<4:10:28, 3.90s/it] {'loss': 0.0833, 'grad_norm': 0.7506067864274482, 'learning_rate': 3.270290671010773e-07, 'epoch': 0.89} 89%|████████▉ | 30429/34278 [33:20:53<4:10:28, 3.90s/it] 89%|████████▉ | 30430/34278 [33:20:56<4:01:50, 3.77s/it] {'loss': 0.1138, 'grad_norm': 0.8514175594375909, 'learning_rate': 3.268610356829105e-07, 'epoch': 0.89} 89%|████████▉ | 30430/34278 [33:20:56<4:01:50, 3.77s/it] 89%|████████▉ | 30431/34278 [33:20:59<3:45:37, 3.52s/it] {'loss': 0.1412, 'grad_norm': 0.8487384069260846, 'learning_rate': 3.266930459854134e-07, 'epoch': 0.89} 
89%|████████▉ | 30431/34278 [33:20:59<3:45:37, 3.52s/it] 89%|████████▉ | 30432/34278 [33:21:03<3:40:40, 3.44s/it] {'loss': 0.1074, 'grad_norm': 0.7939705584414907, 'learning_rate': 3.2652509801008536e-07, 'epoch': 0.89} 89%|████████▉ | 30432/34278 [33:21:03<3:40:40, 3.44s/it] 89%|████████▉ | 30433/34278 [33:21:06<3:35:49, 3.37s/it] {'loss': 0.1118, 'grad_norm': 0.8212437316467182, 'learning_rate': 3.263571917584257e-07, 'epoch': 0.89} 89%|████████▉ | 30433/34278 [33:21:06<3:35:49, 3.37s/it] 89%|████████▉ | 30434/34278 [33:21:12<4:24:33, 4.13s/it] {'loss': 0.103, 'grad_norm': 0.844297758483977, 'learning_rate': 3.2618932723193274e-07, 'epoch': 0.89} 89%|████████▉ | 30434/34278 [33:21:12<4:24:33, 4.13s/it] 89%|████████▉ | 30435/34278 [33:21:15<4:04:05, 3.81s/it] {'loss': 0.1095, 'grad_norm': 0.8544561466150221, 'learning_rate': 3.260215044321069e-07, 'epoch': 0.89} 89%|████████▉ | 30435/34278 [33:21:15<4:04:05, 3.81s/it] 89%|████████▉ | 30436/34278 [33:21:18<3:54:29, 3.66s/it] {'loss': 0.1035, 'grad_norm': 0.7951195004484574, 'learning_rate': 3.2585372336044473e-07, 'epoch': 0.89} 89%|████████▉ | 30436/34278 [33:21:18<3:54:29, 3.66s/it] 89%|████████▉ | 30437/34278 [33:21:21<3:38:39, 3.42s/it] {'loss': 0.1073, 'grad_norm': 0.865101965359929, 'learning_rate': 3.2568598401844344e-07, 'epoch': 0.89} 89%|████████▉ | 30437/34278 [33:21:21<3:38:39, 3.42s/it] 89%|████████▉ | 30438/34278 [33:21:24<3:31:46, 3.31s/it] {'loss': 0.119, 'grad_norm': 0.9363742201354716, 'learning_rate': 3.255182864076034e-07, 'epoch': 0.89} 89%|████████▉ | 30438/34278 [33:21:24<3:31:46, 3.31s/it] 89%|████████▉ | 30439/34278 [33:21:27<3:34:26, 3.35s/it] {'loss': 0.1263, 'grad_norm': 0.8501357082705552, 'learning_rate': 3.2535063052942015e-07, 'epoch': 0.89} 89%|████████▉ | 30439/34278 [33:21:27<3:34:26, 3.35s/it] 89%|████████▉ | 30440/34278 [33:21:31<3:35:44, 3.37s/it] {'loss': 0.1014, 'grad_norm': 0.8730648155104872, 'learning_rate': 3.2518301638538976e-07, 'epoch': 0.89} 89%|████████▉ | 
30440/34278 [33:21:31<3:35:44, 3.37s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff39633cd10>
Failed to fetch sample 3646661.
Exception: cannot identify image file <_io.BytesIO object at 0x7ff39633cd10> 89%|████████▉ | 30441/34278 [33:21:36<4:04:55, 3.83s/it] {'loss': 0.1016, 'grad_norm': 0.908414300042689, 'learning_rate': 3.250154439770098e-07, 'epoch': 0.89} 89%|████████▉ | 30441/34278 [33:21:36<4:04:55, 3.83s/it] 89%|████████▉ | 30442/34278 [33:21:39<3:56:05, 3.69s/it] {'loss': 0.1504, 'grad_norm': 0.8683485720759831, 'learning_rate': 3.2484791330577635e-07, 'epoch': 0.89} 89%|████████▉ | 30442/34278 [33:21:39<3:56:05, 3.69s/it] 89%|████████▉ | 30443/34278 [33:21:42<3:46:02, 3.54s/it] {'loss': 0.1075, 'grad_norm': 1.0706309103476457, 'learning_rate': 3.246804243731838e-07, 'epoch': 0.89} 89%|████████▉ | 30443/34278 [33:21:42<3:46:02, 3.54s/it] 89%|████████▉ | 30444/34278 [33:21:46<3:53:26, 3.65s/it] {'loss': 0.106, 'grad_norm': 0.6936667445533639, 'learning_rate': 3.245129771807287e-07, 'epoch': 0.89} 89%|████████▉ | 30444/34278 [33:21:46<3:53:26, 3.65s/it] 89%|████████▉ | 30445/34278 [33:21:52<4:38:59, 4.37s/it] {'loss': 0.1237, 'grad_norm': 0.8525681655626866, 'learning_rate': 3.24345571729906e-07, 'epoch': 0.89} 89%|████████▉ | 30445/34278 [33:21:52<4:38:59, 4.37s/it] 89%|████████▉ | 30446/34278 [33:21:55<4:11:53, 3.94s/it] {'loss': 0.1253, 'grad_norm': 0.9698863640439978, 'learning_rate': 3.2417820802221e-07, 'epoch': 0.89} 89%|████████▉ | 30446/34278 [33:21:55<4:11:53, 3.94s/it] 89%|████████▉ | 30447/34278 [33:22:00<4:36:01, 4.32s/it] {'loss': 0.0998, 'grad_norm': 1.1217498817935327, 'learning_rate': 3.24010886059134e-07, 'epoch': 0.89} 89%|████████▉ | 30447/34278 [33:22:00<4:36:01, 4.32s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 89%|████████▉ | 30448/34278 [33:22:03<4:11:42, 3.94s/it] {'loss': 0.1288, 'grad_norm': 0.8988503477875328, 'learning_rate': 3.238436058421729e-07, 'epoch': 0.89} 89%|████████▉ | 30448/34278 [33:22:03<4:11:42, 3.94s/it] 89%|████████▉ | 30449/34278 [33:22:06<3:50:32, 3.61s/it] {'loss': 0.093, 'grad_norm': 0.7698741762173608, 'learning_rate': 3.236763673728194e-07, 'epoch': 0.89} 89%|████████▉ | 30449/34278 [33:22:06<3:50:32, 3.61s/it] 89%|████████▉ | 30450/34278 [33:22:12<4:35:01, 4.31s/it] {'loss': 0.0992, 'grad_norm': 1.034610237159529, 'learning_rate': 3.235091706525673e-07, 'epoch': 0.89} 89%|████████▉ | 30450/34278 [33:22:12<4:35:01, 4.31s/it] 89%|████████▉ | 30451/34278 [33:22:16<4:17:24, 4.04s/it] {'loss': 0.1391, 'grad_norm': 0.7852459626938117, 'learning_rate': 3.2334201568290924e-07, 'epoch': 0.89} 89%|████████▉ | 30451/34278 [33:22:16<4:17:24, 4.04s/it] 89%|████████▉ | 30452/34278 [33:22:20<4:32:41, 4.28s/it] {'loss': 0.1137, 'grad_norm': 0.8902457560368345, 'learning_rate': 3.2317490246533745e-07, 'epoch': 0.89} 89%|████████▉ | 30452/34278 [33:22:20<4:32:41, 4.28s/it] 89%|████████▉ | 30453/34278 [33:22:23<4:03:59, 3.83s/it] {'loss': 0.1048, 'grad_norm': 1.07622350155105, 'learning_rate': 3.230078310013429e-07, 'epoch': 0.89} 89%|████████▉ | 30453/34278 [33:22:23<4:03:59, 3.83s/it] 89%|████████▉ | 30454/34278 [33:22:29<4:45:15, 4.48s/it] {'loss': 0.1096, 'grad_norm': 0.8138661621072937, 'learning_rate': 3.2284080129241837e-07, 'epoch': 0.89} 89%|████████▉ | 30454/34278 [33:22:29<4:45:15, 4.48s/it] 89%|████████▉ | 30455/34278 [33:22:32<4:14:49, 4.00s/it] {'loss': 0.1056, 'grad_norm': 0.7863486309532878, 'learning_rate': 3.226738133400542e-07, 'epoch': 0.89} 89%|████████▉ | 30455/34278 [33:22:32<4:14:49, 4.00s/it] 89%|████████▉ | 30456/34278 [33:22:37<4:35:44, 4.33s/it] {'loss': 0.1237, 'grad_norm': 0.8666539740372097, 'learning_rate': 3.225068671457426e-07, 'epoch': 0.89} 89%|████████▉ | 30456/34278 [33:22:37<4:35:44, 
4.33s/it] 89%|████████▉ | 30457/34278 [33:22:43<5:03:26, 4.76s/it] {'loss': 0.1275, 'grad_norm': 0.8918594009942206, 'learning_rate': 3.223399627109719e-07, 'epoch': 0.89} 89%|████████▉ | 30457/34278 [33:22:43<5:03:26, 4.76s/it] 89%|████████▉ | 30458/34278 [33:22:46<4:33:02, 4.29s/it] {'loss': 0.1047, 'grad_norm': 0.6043041679174953, 'learning_rate': 3.2217310003723467e-07, 'epoch': 0.89} 89%|████████▉ | 30458/34278 [33:22:46<4:33:02, 4.29s/it] 89%|████████▉ | 30459/34278 [33:22:51<4:38:38, 4.38s/it] {'loss': 0.1278, 'grad_norm': 0.8028948186370006, 'learning_rate': 3.2200627912601866e-07, 'epoch': 0.89} 89%|████████▉ | 30459/34278 [33:22:51<4:38:38, 4.38s/it] 89%|████████▉ | 30460/34278 [33:22:54<4:14:52, 4.01s/it] {'loss': 0.124, 'grad_norm': 1.0191922252066992, 'learning_rate': 3.218394999788138e-07, 'epoch': 0.89} 89%|████████▉ | 30460/34278 [33:22:54<4:14:52, 4.01s/it] 89%|████████▉ | 30461/34278 [33:22:59<4:35:58, 4.34s/it] {'loss': 0.1314, 'grad_norm': 1.0013908355360317, 'learning_rate': 3.216727625971083e-07, 'epoch': 0.89} 89%|████████▉ | 30461/34278 [33:22:59<4:35:58, 4.34s/it] 89%|████████▉ | 30462/34278 [33:23:02<4:09:49, 3.93s/it] {'loss': 0.1286, 'grad_norm': 0.6720942638188737, 'learning_rate': 3.215060669823933e-07, 'epoch': 0.89} 89%|████████▉ | 30462/34278 [33:23:02<4:09:49, 3.93s/it] 89%|████████▉ | 30463/34278 [33:23:05<3:52:59, 3.66s/it] {'loss': 0.1251, 'grad_norm': 0.785874262867982, 'learning_rate': 3.213394131361547e-07, 'epoch': 0.89} 89%|████████▉ | 30463/34278 [33:23:05<3:52:59, 3.66s/it] 89%|████████▉ | 30464/34278 [33:23:08<3:40:24, 3.47s/it] {'loss': 0.0906, 'grad_norm': 0.7468232802908306, 'learning_rate': 3.2117280105988026e-07, 'epoch': 0.89} 89%|████████▉ | 30464/34278 [33:23:08<3:40:24, 3.47s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff0419bdcb0>
Failed to fetch sample 3268270.
Exception: cannot identify image file <_io.BytesIO object at 0x7ff0419bdcb0> 89%|████████▉ | 30465/34278 [33:23:11<3:32:57, 3.35s/it] {'loss': 0.0913, 'grad_norm': 0.7520984517669977, 'learning_rate': 3.2100623075505874e-07, 'epoch': 0.89} 89%|████████▉ | 30465/34278 [33:23:11<3:32:57, 3.35s/it] 89%|████████▉ | 30466/34278 [33:23:14<3:26:41, 3.25s/it] {'loss': 0.1219, 'grad_norm': 1.0473963135644402, 'learning_rate': 3.2083970222317686e-07, 'epoch': 0.89} 89%|████████▉ | 30466/34278 [33:23:14<3:26:41, 3.25s/it] 89%|████████▉ | 30467/34278 [33:23:19<4:06:13, 3.88s/it] {'loss': 0.0994, 'grad_norm': 0.8427324687372726, 'learning_rate': 3.206732154657194e-07, 'epoch': 0.89} 89%|████████▉ | 30467/34278 [33:23:19<4:06:13, 3.88s/it] 89%|████████▉ | 30468/34278 [33:23:22<3:48:48, 3.60s/it] {'loss': 0.112, 'grad_norm': 0.8416467101814705, 'learning_rate': 3.2050677048417577e-07, 'epoch': 0.89} 89%|████████▉ | 30468/34278 [33:23:23<3:48:48, 3.60s/it] 89%|████████▉ | 30469/34278 [33:23:27<4:00:55, 3.80s/it] {'loss': 0.1002, 'grad_norm': 0.7900828921500531, 'learning_rate': 3.203403672800309e-07, 'epoch': 0.89} 89%|████████▉ | 30469/34278 [33:23:27<4:00:55, 3.80s/it] 89%|████████▉ | 30470/34278 [33:23:29<3:41:02, 3.48s/it] {'loss': 0.1045, 'grad_norm': 0.8763892121160695, 'learning_rate': 3.2017400585476923e-07, 'epoch': 0.89} 89%|████████▉ | 30470/34278 [33:23:29<3:41:02, 3.48s/it] 89%|████████▉ | 30471/34278 [33:23:36<4:37:02, 4.37s/it] {'loss': 0.0971, 'grad_norm': 0.8571175985732294, 'learning_rate': 3.2000768620987776e-07, 'epoch': 0.89} 89%|████████▉ | 30471/34278 [33:23:36<4:37:02, 4.37s/it] 89%|████████▉ | 30472/34278 [33:23:42<5:15:04, 4.97s/it] {'loss': 0.114, 'grad_norm': 0.8307611178054767, 'learning_rate': 3.198414083468404e-07, 'epoch': 0.89} 89%|████████▉ | 30472/34278 [33:23:42<5:15:04, 4.97s/it] 89%|████████▉ | 30473/34278 [33:23:45<4:32:21, 4.29s/it] {'loss': 0.1126, 'grad_norm': 1.315998495462903, 'learning_rate': 3.19675172267141e-07, 'epoch': 0.89} 
89%|████████▉ | 30473/34278 [33:23:45<4:32:21, 4.29s/it] 89%|████████▉ | 30474/34278 [33:23:48<4:10:19, 3.95s/it] {'loss': 0.0932, 'grad_norm': 1.105312974477422, 'learning_rate': 3.195089779722643e-07, 'epoch': 0.89} 89%|████████▉ | 30474/34278 [33:23:48<4:10:19, 3.95s/it] 89%|████████▉ | 30475/34278 [33:23:51<3:55:58, 3.72s/it] {'loss': 0.1035, 'grad_norm': 0.9735997191167188, 'learning_rate': 3.193428254636949e-07, 'epoch': 0.89} 89%|████████▉ | 30475/34278 [33:23:51<3:55:58, 3.72s/it] 89%|████████▉ | 30476/34278 [33:23:55<3:46:48, 3.58s/it] {'loss': 0.1118, 'grad_norm': 0.7936149384272677, 'learning_rate': 3.191767147429159e-07, 'epoch': 0.89} 89%|████████▉ | 30476/34278 [33:23:55<3:46:48, 3.58s/it] 89%|████████▉ | 30477/34278 [33:23:59<3:56:46, 3.74s/it] {'loss': 0.1215, 'grad_norm': 1.138799021036221, 'learning_rate': 3.190106458114084e-07, 'epoch': 0.89} 89%|████████▉ | 30477/34278 [33:23:59<3:56:46, 3.74s/it] 89%|████████▉ | 30478/34278 [33:24:02<3:45:42, 3.56s/it] {'loss': 0.1187, 'grad_norm': 1.2291060948656234, 'learning_rate': 3.188446186706573e-07, 'epoch': 0.89} 89%|████████▉ | 30478/34278 [33:24:02<3:45:42, 3.56s/it] 89%|████████▉ | 30479/34278 [33:24:05<3:39:10, 3.46s/it] {'loss': 0.1232, 'grad_norm': 1.1005788815254303, 'learning_rate': 3.186786333221431e-07, 'epoch': 0.89} 89%|████████▉ | 30479/34278 [33:24:05<3:39:10, 3.46s/it] 89%|████████▉ | 30480/34278 [33:24:10<4:17:51, 4.07s/it] {'loss': 0.114, 'grad_norm': 0.6889063645022628, 'learning_rate': 3.185126897673485e-07, 'epoch': 0.89} 89%|████████▉ | 30480/34278 [33:24:11<4:17:51, 4.07s/it] 89%|████████▉ | 30481/34278 [33:24:16<4:43:18, 4.48s/it] {'loss': 0.1312, 'grad_norm': 0.983870192076746, 'learning_rate': 3.183467880077562e-07, 'epoch': 0.89} 89%|████████▉ | 30481/34278 [33:24:16<4:43:18, 4.48s/it] 89%|████████▉ | 30482/34278 [33:24:19<4:16:34, 4.06s/it] {'loss': 0.0919, 'grad_norm': 0.6741017738651957, 'learning_rate': 3.1818092804484556e-07, 'epoch': 0.89} 89%|████████▉ | 30482/34278 
[33:24:19<4:16:34, 4.06s/it] 89%|████████▉ | 30483/34278 [33:24:23<4:10:32, 3.96s/it] {'loss': 0.1177, 'grad_norm': 0.7882326232819421, 'learning_rate': 3.1801510988009765e-07, 'epoch': 0.89} 89%|████████▉ | 30483/34278 [33:24:23<4:10:32, 3.96s/it] 89%|████████▉ | 30484/34278 [33:24:26<4:01:10, 3.81s/it] {'loss': 0.1158, 'grad_norm': 0.8376701441824327, 'learning_rate': 3.1784933351499404e-07, 'epoch': 0.89} 89%|████████▉ | 30484/34278 [33:24:26<4:01:10, 3.81s/it] 89%|████████▉ | 30485/34278 [33:24:29<3:46:12, 3.58s/it] {'loss': 0.1146, 'grad_norm': 1.136097784752444, 'learning_rate': 3.1768359895101296e-07, 'epoch': 0.89} 89%|████████▉ | 30485/34278 [33:24:29<3:46:12, 3.58s/it] 89%|████████▉ | 30486/34278 [33:24:32<3:37:49, 3.45s/it] {'loss': 0.1224, 'grad_norm': 0.932393242138555, 'learning_rate': 3.175179061896355e-07, 'epoch': 0.89} 89%|████████▉ | 30486/34278 [33:24:32<3:37:49, 3.45s/it] 89%|████████▉ | 30487/34278 [33:24:39<4:29:59, 4.27s/it] {'loss': 0.1164, 'grad_norm': 0.748704135637435, 'learning_rate': 3.173522552323399e-07, 'epoch': 0.89} 89%|████████▉ | 30487/34278 [33:24:39<4:29:59, 4.27s/it] 89%|████████▉ | 30488/34278 [33:24:42<4:15:43, 4.05s/it] {'loss': 0.0986, 'grad_norm': 0.8891019213388898, 'learning_rate': 3.171866460806061e-07, 'epoch': 0.89} 89%|████████▉ | 30488/34278 [33:24:42<4:15:43, 4.05s/it] 89%|████████▉ | 30489/34278 [33:24:46<4:06:05, 3.90s/it] {'loss': 0.1016, 'grad_norm': 0.7344867986012511, 'learning_rate': 3.170210787359118e-07, 'epoch': 0.89} 89%|████████▉ | 30489/34278 [33:24:46<4:06:05, 3.90s/it] 89%|████████▉ | 30490/34278 [33:24:51<4:29:05, 4.26s/it] {'loss': 0.1264, 'grad_norm': 1.1247175630906892, 'learning_rate': 3.1685555319973525e-07, 'epoch': 0.89} 89%|████████▉ | 30490/34278 [33:24:51<4:29:05, 4.26s/it] 89%|████████▉ | 30491/34278 [33:24:54<4:01:05, 3.82s/it] {'loss': 0.1247, 'grad_norm': 0.94693035646125, 'learning_rate': 3.166900694735542e-07, 'epoch': 0.89} 89%|████████▉ | 30491/34278 [33:24:54<4:01:05, 3.82s/it] 
89%|████████▉ | 30492/34278 [33:25:00<4:42:31, 4.48s/it] {'loss': 0.118, 'grad_norm': 0.882752799471745, 'learning_rate': 3.1652462755884686e-07, 'epoch': 0.89} 89%|████████▉ | 30492/34278 [33:25:00<4:42:31, 4.48s/it] 89%|████████▉ | 30493/34278 [33:25:05<5:08:29, 4.89s/it] {'loss': 0.1211, 'grad_norm': 0.8356541380491931, 'learning_rate': 3.1635922745708927e-07, 'epoch': 0.89} 89%|████████▉ | 30493/34278 [33:25:05<5:08:29, 4.89s/it] 89%|████████▉ | 30494/34278 [33:25:08<4:33:40, 4.34s/it] {'loss': 0.0996, 'grad_norm': 0.9078331283278978, 'learning_rate': 3.1619386916975804e-07, 'epoch': 0.89} 89%|████████▉ | 30494/34278 [33:25:08<4:33:40, 4.34s/it] 89%|████████▉ | 30495/34278 [33:25:11<4:02:29, 3.85s/it] {'loss': 0.1261, 'grad_norm': 0.988811410334127, 'learning_rate': 3.160285526983303e-07, 'epoch': 0.89} 89%|████████▉ | 30495/34278 [33:25:11<4:02:29, 3.85s/it] 89%|████████▉ | 30496/34278 [33:25:15<3:59:26, 3.80s/it] {'loss': 0.1246, 'grad_norm': 1.0003746410143617, 'learning_rate': 3.158632780442816e-07, 'epoch': 0.89} 89%|████████▉ | 30496/34278 [33:25:15<3:59:26, 3.80s/it] 89%|████████▉ | 30497/34278 [33:25:19<4:01:54, 3.84s/it] {'loss': 0.0992, 'grad_norm': 0.8954152287289935, 'learning_rate': 3.1569804520908633e-07, 'epoch': 0.89} 89%|████████▉ | 30497/34278 [33:25:19<4:01:54, 3.84s/it] 89%|████████▉ | 30498/34278 [33:25:23<4:03:42, 3.87s/it] {'loss': 0.1197, 'grad_norm': 0.8224363469838618, 'learning_rate': 3.1553285419422153e-07, 'epoch': 0.89} 89%|████████▉ | 30498/34278 [33:25:23<4:03:42, 3.87s/it] 89%|████████▉ | 30499/34278 [33:25:26<3:44:53, 3.57s/it] {'loss': 0.1346, 'grad_norm': 1.0507848059549514, 'learning_rate': 3.1536770500116164e-07, 'epoch': 0.89} 89%|████████▉ | 30499/34278 [33:25:26<3:44:53, 3.57s/it] 89%|████████▉ | 30500/34278 [33:25:29<3:41:13, 3.51s/it] {'loss': 0.1221, 'grad_norm': 0.8836399621266284, 'learning_rate': 3.152025976313794e-07, 'epoch': 0.89} 89%|████████▉ | 30500/34278 [33:25:29<3:41:13, 3.51s/it] 89%|████████▉ | 
30501/34278 [33:25:32<3:39:24, 3.49s/it] {'loss': 0.0937, 'grad_norm': 0.6925661010777691, 'learning_rate': 3.150375320863508e-07, 'epoch': 0.89} 89%|████████▉ | 30501/34278 [33:25:32<3:39:24, 3.49s/it] 89%|████████▉ | 30502/34278 [33:25:36<3:43:13, 3.55s/it] {'loss': 0.1088, 'grad_norm': 0.8388241395615271, 'learning_rate': 3.1487250836754915e-07, 'epoch': 0.89} 89%|████████▉ | 30502/34278 [33:25:36<3:43:13, 3.55s/it] 89%|████████▉ | 30503/34278 [33:25:39<3:36:21, 3.44s/it] {'loss': 0.1265, 'grad_norm': 0.9641978352144771, 'learning_rate': 3.147075264764465e-07, 'epoch': 0.89} 89%|████████▉ | 30503/34278 [33:25:39<3:36:21, 3.44s/it] 89%|████████▉ | 30504/34278 [33:25:45<4:22:28, 4.17s/it] {'loss': 0.1036, 'grad_norm': 0.8059912009626359, 'learning_rate': 3.145425864145163e-07, 'epoch': 0.89} 89%|████████▉ | 30504/34278 [33:25:45<4:22:28, 4.17s/it] 89%|████████▉ | 30505/34278 [33:25:49<4:14:01, 4.04s/it] {'loss': 0.1259, 'grad_norm': 0.8709323443371206, 'learning_rate': 3.143776881832322e-07, 'epoch': 0.89} 89%|████████▉ | 30505/34278 [33:25:49<4:14:01, 4.04s/it] 89%|████████▉ | 30506/34278 [33:25:55<4:45:51, 4.55s/it] {'loss': 0.1075, 'grad_norm': 0.7596751859457278, 'learning_rate': 3.1421283178406537e-07, 'epoch': 0.89} 89%|████████▉ | 30506/34278 [33:25:55<4:45:51, 4.55s/it] 89%|████████▉ | 30507/34278 [33:25:59<4:33:30, 4.35s/it] {'loss': 0.1099, 'grad_norm': 0.8746650117572208, 'learning_rate': 3.140480172184873e-07, 'epoch': 0.89} 89%|████████▉ | 30507/34278 [33:25:59<4:33:30, 4.35s/it] 89%|████████▉ | 30508/34278 [33:26:02<4:18:26, 4.11s/it] {'loss': 0.1325, 'grad_norm': 0.8399294210591024, 'learning_rate': 3.1388324448797083e-07, 'epoch': 0.89} 89%|████████▉ | 30508/34278 [33:26:02<4:18:26, 4.11s/it] 89%|████████▉ | 30509/34278 [33:26:05<3:55:07, 3.74s/it] {'loss': 0.1154, 'grad_norm': 1.0775169203826622, 'learning_rate': 3.1371851359398465e-07, 'epoch': 0.89} 89%|████████▉ | 30509/34278 [33:26:05<3:55:07, 3.74s/it] 89%|████████▉ | 30510/34278 
[33:26:08<3:35:00, 3.42s/it] {'loss': 0.1014, 'grad_norm': 1.135219210523027, 'learning_rate': 3.1355382453800155e-07, 'epoch': 0.89} 89%|████████▉ | 30510/34278 [33:26:08<3:35:00, 3.42s/it] 89%|████████▉ | 30511/34278 [33:26:13<4:05:04, 3.90s/it] {'loss': 0.1263, 'grad_norm': 0.9120801805597152, 'learning_rate': 3.133891773214914e-07, 'epoch': 0.89} 89%|████████▉ | 30511/34278 [33:26:13<4:05:04, 3.90s/it] 89%|████████▉ | 30512/34278 [33:26:15<3:45:07, 3.59s/it] {'loss': 0.1296, 'grad_norm': 0.8937106915785827, 'learning_rate': 3.1322457194592426e-07, 'epoch': 0.89} 89%|████████▉ | 30512/34278 [33:26:16<3:45:07, 3.59s/it] 89%|████████▉ | 30513/34278 [33:26:19<3:34:33, 3.42s/it] {'loss': 0.1092, 'grad_norm': 0.8927612096903995, 'learning_rate': 3.130600084127683e-07, 'epoch': 0.89} 89%|████████▉ | 30513/34278 [33:26:19<3:34:33, 3.42s/it] 89%|████████▉ | 30514/34278 [33:26:23<3:56:22, 3.77s/it] {'loss': 0.0989, 'grad_norm': 0.9258014059962174, 'learning_rate': 3.1289548672349514e-07, 'epoch': 0.89} 89%|████████▉ | 30514/34278 [33:26:23<3:56:22, 3.77s/it] 89%|████████▉ | 30515/34278 [33:26:29<4:37:30, 4.42s/it] {'loss': 0.1403, 'grad_norm': 0.8992620417502106, 'learning_rate': 3.127310068795708e-07, 'epoch': 0.89} 89%|████████▉ | 30515/34278 [33:26:29<4:37:30, 4.42s/it] 89%|████████▉ | 30516/34278 [33:26:32<4:14:12, 4.05s/it] {'loss': 0.1094, 'grad_norm': 0.8881986317519851, 'learning_rate': 3.1256656888246586e-07, 'epoch': 0.89} 89%|████████▉ | 30516/34278 [33:26:32<4:14:12, 4.05s/it] 89%|████████▉ | 30517/34278 [33:26:38<4:43:05, 4.52s/it] {'loss': 0.1184, 'grad_norm': 0.8222217825382221, 'learning_rate': 3.124021727336468e-07, 'epoch': 0.89} 89%|████████▉ | 30517/34278 [33:26:38<4:43:05, 4.52s/it] 89%|████████▉ | 30518/34278 [33:26:41<4:16:38, 4.10s/it] {'loss': 0.0984, 'grad_norm': 0.7635861878730664, 'learning_rate': 3.1223781843458314e-07, 'epoch': 0.89} 89%|████████▉ | 30518/34278 [33:26:41<4:16:38, 4.10s/it] 89%|████████▉ | 30519/34278 [33:26:44<3:57:42, 
3.79s/it] {'loss': 0.13, 'grad_norm': 0.9682910058529923, 'learning_rate': 3.1207350598674137e-07, 'epoch': 0.89} 89%|████████▉ | 30519/34278 [33:26:44<3:57:42, 3.79s/it] 89%|████████▉ | 30520/34278 [33:26:47<3:49:37, 3.67s/it] {'loss': 0.1362, 'grad_norm': 0.8115903491035434, 'learning_rate': 3.11909235391587e-07, 'epoch': 0.89} 89%|████████▉ | 30520/34278 [33:26:47<3:49:37, 3.67s/it] 89%|████████▉ | 30521/34278 [33:26:51<3:48:03, 3.64s/it] {'loss': 0.099, 'grad_norm': 0.7768085823349589, 'learning_rate': 3.117450066505878e-07, 'epoch': 0.89} 89%|████████▉ | 30521/34278 [33:26:51<3:48:03, 3.64s/it] 89%|████████▉ | 30522/34278 [33:26:55<3:48:31, 3.65s/it] {'loss': 0.1294, 'grad_norm': 0.8773722598141154, 'learning_rate': 3.1158081976521094e-07, 'epoch': 0.89} 89%|████████▉ | 30522/34278 [33:26:55<3:48:31, 3.65s/it] 89%|████████▉ | 30523/34278 [33:26:57<3:28:45, 3.34s/it] {'loss': 0.1125, 'grad_norm': 0.8846059299708734, 'learning_rate': 3.114166747369218e-07, 'epoch': 0.89} 89%|████████▉ | 30523/34278 [33:26:57<3:28:45, 3.34s/it] 89%|████████▉ | 30524/34278 [33:27:02<3:55:59, 3.77s/it] {'loss': 0.1339, 'grad_norm': 0.8924039041252876, 'learning_rate': 3.112525715671838e-07, 'epoch': 0.89} 89%|████████▉ | 30524/34278 [33:27:02<3:55:59, 3.77s/it] 89%|████████▉ | 30525/34278 [33:27:06<3:51:24, 3.70s/it] {'loss': 0.1294, 'grad_norm': 0.9959765749310289, 'learning_rate': 3.1108851025746457e-07, 'epoch': 0.89} 89%|████████▉ | 30525/34278 [33:27:06<3:51:24, 3.70s/it] 89%|████████▉ | 30526/34278 [33:27:09<3:41:59, 3.55s/it] {'loss': 0.1142, 'grad_norm': 0.8864124216035949, 'learning_rate': 3.109244908092279e-07, 'epoch': 0.89} 89%|████████▉ | 30526/34278 [33:27:09<3:41:59, 3.55s/it] 89%|████████▉ | 30527/34278 [33:27:12<3:34:57, 3.44s/it] {'loss': 0.1179, 'grad_norm': 1.0028369673101463, 'learning_rate': 3.1076051322393663e-07, 'epoch': 0.89} 89%|████████▉ | 30527/34278 [33:27:12<3:34:57, 3.44s/it] 89%|████████▉ | 30528/34278 [33:27:15<3:25:21, 3.29s/it] {'loss': 0.1197, 
'grad_norm': 0.9746235679010936, 'learning_rate': 3.105965775030573e-07, 'epoch': 0.89} 89%|████████▉ | 30528/34278 [33:27:15<3:25:21, 3.29s/it] 89%|████████▉ | 30529/34278 [33:27:20<4:08:41, 3.98s/it] {'loss': 0.1113, 'grad_norm': 0.7617722022236654, 'learning_rate': 3.1043268364805257e-07, 'epoch': 0.89} 89%|████████▉ | 30529/34278 [33:27:21<4:08:41, 3.98s/it] 89%|████████▉ | 30530/34278 [33:27:25<4:13:56, 4.07s/it] {'loss': 0.1209, 'grad_norm': 0.649378711927433, 'learning_rate': 3.1026883166038413e-07, 'epoch': 0.89} 89%|████████▉ | 30530/34278 [33:27:25<4:13:56, 4.07s/it] 89%|████████▉ | 30531/34278 [33:27:28<3:54:11, 3.75s/it] {'loss': 0.1055, 'grad_norm': 0.8936101132561894, 'learning_rate': 3.1010502154151743e-07, 'epoch': 0.89} 89%|████████▉ | 30531/34278 [33:27:28<3:54:11, 3.75s/it] 89%|████████▉ | 30532/34278 [33:27:31<3:39:56, 3.52s/it] {'loss': 0.112, 'grad_norm': 0.997456688702685, 'learning_rate': 3.09941253292913e-07, 'epoch': 0.89} 89%|████████▉ | 30532/34278 [33:27:31<3:39:56, 3.52s/it] 89%|████████▉ | 30533/34278 [33:27:34<3:37:09, 3.48s/it] {'loss': 0.1236, 'grad_norm': 0.8087439596778104, 'learning_rate': 3.0977752691603303e-07, 'epoch': 0.89} 89%|████████▉ | 30533/34278 [33:27:34<3:37:09, 3.48s/it] 89%|████████▉ | 30534/34278 [33:27:37<3:24:16, 3.27s/it] {'loss': 0.1104, 'grad_norm': 0.7940552257410859, 'learning_rate': 3.0961384241233907e-07, 'epoch': 0.89} 89%|████████▉ | 30534/34278 [33:27:37<3:24:16, 3.27s/it] 89%|████████▉ | 30535/34278 [33:27:40<3:12:09, 3.08s/it] {'loss': 0.1031, 'grad_norm': 1.3486098578327579, 'learning_rate': 3.094501997832944e-07, 'epoch': 0.89} 89%|████████▉ | 30535/34278 [33:27:40<3:12:09, 3.08s/it] 89%|████████▉ | 30536/34278 [33:27:44<3:28:38, 3.35s/it] {'loss': 0.1328, 'grad_norm': 1.070137905667782, 'learning_rate': 3.092865990303584e-07, 'epoch': 0.89} 89%|████████▉ | 30536/34278 [33:27:44<3:28:38, 3.35s/it] 89%|████████▉ | 30537/34278 [33:27:47<3:24:55, 3.29s/it] {'loss': 0.1088, 'grad_norm': 
0.7014482598401466, 'learning_rate': 3.0912304015499106e-07, 'epoch': 0.89} 89%|████████▉ | 30537/34278 [33:27:47<3:24:55, 3.29s/it] 89%|████████▉ | 30538/34278 [33:27:50<3:16:22, 3.15s/it] {'loss': 0.1118, 'grad_norm': 0.8931822380794611, 'learning_rate': 3.089595231586545e-07, 'epoch': 0.89} 89%|████████▉ | 30538/34278 [33:27:50<3:16:22, 3.15s/it] 89%|████████▉ | 30539/34278 [33:27:54<3:32:18, 3.41s/it] {'loss': 0.0883, 'grad_norm': 0.7487271169363175, 'learning_rate': 3.087960480428065e-07, 'epoch': 0.89} 89%|████████▉ | 30539/34278 [33:27:54<3:32:18, 3.41s/it] 89%|████████▉ | 30540/34278 [33:28:00<4:22:32, 4.21s/it] {'loss': 0.1131, 'grad_norm': 0.7756367771297727, 'learning_rate': 3.086326148089075e-07, 'epoch': 0.89} 89%|████████▉ | 30540/34278 [33:28:00<4:22:32, 4.21s/it] 89%|████████▉ | 30541/34278 [33:28:04<4:23:49, 4.24s/it] {'loss': 0.1071, 'grad_norm': 0.8286140357712191, 'learning_rate': 3.0846922345841746e-07, 'epoch': 0.89} 89%|████████▉ | 30541/34278 [33:28:04<4:23:49, 4.24s/it] 89%|████████▉ | 30542/34278 [33:28:07<4:02:36, 3.90s/it] {'loss': 0.1344, 'grad_norm': 0.8281252707009609, 'learning_rate': 3.083058739927941e-07, 'epoch': 0.89} 89%|████████▉ | 30542/34278 [33:28:07<4:02:36, 3.90s/it] 89%|████████▉ | 30543/34278 [33:28:11<4:02:06, 3.89s/it] {'loss': 0.0929, 'grad_norm': 0.8690984771551428, 'learning_rate': 3.0814256641349517e-07, 'epoch': 0.89} 89%|████████▉ | 30543/34278 [33:28:11<4:02:06, 3.89s/it] 89%|████████▉ | 30544/34278 [33:28:14<3:46:11, 3.63s/it] {'loss': 0.1216, 'grad_norm': 1.0453484297338385, 'learning_rate': 3.0797930072198e-07, 'epoch': 0.89} 89%|████████▉ | 30544/34278 [33:28:14<3:46:11, 3.63s/it] 89%|████████▉ | 30545/34278 [33:28:18<3:56:54, 3.81s/it] {'loss': 0.1067, 'grad_norm': 1.211958748447629, 'learning_rate': 3.0781607691970474e-07, 'epoch': 0.89} 89%|████████▉ | 30545/34278 [33:28:18<3:56:54, 3.81s/it] 89%|████████▉ | 30546/34278 [33:28:21<3:42:32, 3.58s/it] {'loss': 0.1116, 'grad_norm': 0.960417382087224, 
'learning_rate': 3.0765289500812866e-07, 'epoch': 0.89} 89%|████████▉ | 30546/34278 [33:28:21<3:42:32, 3.58s/it] 89%|████████▉ | 30547/34278 [33:28:27<4:16:01, 4.12s/it] {'loss': 0.1019, 'grad_norm': 0.8513883921626092, 'learning_rate': 3.0748975498870627e-07, 'epoch': 0.89} 89%|████████▉ | 30547/34278 [33:28:27<4:16:01, 4.12s/it] 89%|████████▉ | 30548/34278 [33:28:29<3:49:02, 3.68s/it] {'loss': 0.1111, 'grad_norm': 0.8598663139293854, 'learning_rate': 3.0732665686289574e-07, 'epoch': 0.89} 89%|████████▉ | 30548/34278 [33:28:29<3:49:02, 3.68s/it] 89%|████████▉ | 30549/34278 [33:28:32<3:31:40, 3.41s/it] {'loss': 0.0974, 'grad_norm': 1.0060394657794325, 'learning_rate': 3.071636006321527e-07, 'epoch': 0.89} 89%|████████▉ | 30549/34278 [33:28:32<3:31:40, 3.41s/it] 89%|████████▉ | 30550/34278 [33:28:35<3:30:20, 3.39s/it] {'loss': 0.1098, 'grad_norm': 0.9333911478303079, 'learning_rate': 3.070005862979325e-07, 'epoch': 0.89} 89%|████████▉ | 30550/34278 [33:28:35<3:30:20, 3.39s/it] 89%|████████▉ | 30551/34278 [33:28:40<3:58:53, 3.85s/it] {'loss': 0.1133, 'grad_norm': 0.8556123932859099, 'learning_rate': 3.068376138616902e-07, 'epoch': 0.89} 89%|████████▉ | 30551/34278 [33:28:40<3:58:53, 3.85s/it] 89%|████████▉ | 30552/34278 [33:28:43<3:41:02, 3.56s/it] {'loss': 0.1081, 'grad_norm': 0.8275808135699773, 'learning_rate': 3.0667468332488237e-07, 'epoch': 0.89} 89%|████████▉ | 30552/34278 [33:28:43<3:41:02, 3.56s/it] 89%|████████▉ | 30553/34278 [33:28:46<3:27:46, 3.35s/it] {'loss': 0.0892, 'grad_norm': 1.247262579244288, 'learning_rate': 3.065117946889623e-07, 'epoch': 0.89} 89%|████████▉ | 30553/34278 [33:28:46<3:27:46, 3.35s/it] 89%|████████▉ | 30554/34278 [33:28:52<4:19:56, 4.19s/it] {'loss': 0.1198, 'grad_norm': 0.8238164952677675, 'learning_rate': 3.0634894795538385e-07, 'epoch': 0.89} 89%|████████▉ | 30554/34278 [33:28:52<4:19:56, 4.19s/it] 89%|████████▉ | 30555/34278 [33:28:55<3:58:23, 3.84s/it] {'loss': 0.1196, 'grad_norm': 0.7707484983912157, 'learning_rate': 
3.0618614312560244e-07, 'epoch': 0.89} 89%|████████▉ | 30555/34278 [33:28:55<3:58:23, 3.84s/it] 89%|████████▉ | 30556/34278 [33:28:59<4:01:41, 3.90s/it] {'loss': 0.1, 'grad_norm': 0.8191705907142699, 'learning_rate': 3.060233802010709e-07, 'epoch': 0.89} 89%|████████▉ | 30556/34278 [33:28:59<4:01:41, 3.90s/it] 89%|████████▉ | 30557/34278 [33:29:02<3:41:06, 3.57s/it] {'loss': 0.1119, 'grad_norm': 0.9204114270653742, 'learning_rate': 3.0586065918324025e-07, 'epoch': 0.89} 89%|████████▉ | 30557/34278 [33:29:02<3:41:06, 3.57s/it] 89%|████████▉ | 30558/34278 [33:29:05<3:34:48, 3.46s/it] {'loss': 0.0896, 'grad_norm': 0.6168227730592508, 'learning_rate': 3.0569798007356653e-07, 'epoch': 0.89} 89%|████████▉ | 30558/34278 [33:29:05<3:34:48, 3.46s/it] 89%|████████▉ | 30559/34278 [33:29:08<3:23:33, 3.28s/it] {'loss': 0.1006, 'grad_norm': 0.7855472813687441, 'learning_rate': 3.055353428735003e-07, 'epoch': 0.89} 89%|████████▉ | 30559/34278 [33:29:08<3:23:33, 3.28s/it] 89%|████████▉ | 30560/34278 [33:29:11<3:22:35, 3.27s/it] {'loss': 0.0903, 'grad_norm': 0.7277658231052057, 'learning_rate': 3.0537274758449366e-07, 'epoch': 0.89} 89%|████████▉ | 30560/34278 [33:29:11<3:22:35, 3.27s/it] 89%|████████▉ | 30561/34278 [33:29:15<3:30:52, 3.40s/it] {'loss': 0.1249, 'grad_norm': 0.7969145948110496, 'learning_rate': 3.052101942079988e-07, 'epoch': 0.89} 89%|████████▉ | 30561/34278 [33:29:15<3:30:52, 3.40s/it] 89%|████████▉ | 30562/34278 [33:29:18<3:23:28, 3.29s/it] {'loss': 0.1064, 'grad_norm': 0.877067568969228, 'learning_rate': 3.050476827454668e-07, 'epoch': 0.89} 89%|████████▉ | 30562/34278 [33:29:18<3:23:28, 3.29s/it] 89%|████████▉ | 30563/34278 [33:29:24<4:21:00, 4.22s/it] {'loss': 0.1162, 'grad_norm': 0.8154884683646071, 'learning_rate': 3.048852131983476e-07, 'epoch': 0.89} 89%|████████▉ | 30563/34278 [33:29:24<4:21:00, 4.22s/it] 89%|████████▉ | 30564/34278 [33:29:28<4:00:33, 3.89s/it] {'loss': 0.1112, 'grad_norm': 0.7883368860871747, 'learning_rate': 3.0472278556809233e-07, 
'epoch': 0.89} 89%|████████▉ | 30564/34278 [33:29:28<4:00:33, 3.89s/it] 89%|████████▉ | 30565/34278 [33:29:32<4:04:39, 3.95s/it] {'loss': 0.1182, 'grad_norm': 0.6825588540517084, 'learning_rate': 3.0456039985615193e-07, 'epoch': 0.89} 89%|████████▉ | 30565/34278 [33:29:32<4:04:39, 3.95s/it] 89%|████████▉ | 30566/34278 [33:29:35<3:47:21, 3.68s/it] {'loss': 0.1112, 'grad_norm': 0.9066670376722872, 'learning_rate': 3.0439805606397533e-07, 'epoch': 0.89} 89%|████████▉ | 30566/34278 [33:29:35<3:47:21, 3.68s/it] 89%|████████▉ | 30567/34278 [33:29:37<3:28:39, 3.37s/it] {'loss': 0.0867, 'grad_norm': 0.9240938686339571, 'learning_rate': 3.042357541930113e-07, 'epoch': 0.89} 89%|████████▉ | 30567/34278 [33:29:37<3:28:39, 3.37s/it] 89%|████████▉ | 30568/34278 [33:29:41<3:40:15, 3.56s/it] {'loss': 0.0945, 'grad_norm': 0.9674166330384046, 'learning_rate': 3.0407349424471043e-07, 'epoch': 0.89} 89%|████████▉ | 30568/34278 [33:29:41<3:40:15, 3.56s/it] 89%|████████▉ | 30569/34278 [33:29:44<3:27:19, 3.35s/it] {'loss': 0.0938, 'grad_norm': 0.8731843201559136, 'learning_rate': 3.039112762205193e-07, 'epoch': 0.89} 89%|████████▉ | 30569/34278 [33:29:44<3:27:19, 3.35s/it] 89%|████████▉ | 30570/34278 [33:29:50<4:15:48, 4.14s/it] {'loss': 0.119, 'grad_norm': 0.7800262294296054, 'learning_rate': 3.037491001218873e-07, 'epoch': 0.89} 89%|████████▉ | 30570/34278 [33:29:50<4:15:48, 4.14s/it] 89%|████████▉ | 30571/34278 [33:29:53<3:59:24, 3.87s/it] {'loss': 0.1109, 'grad_norm': 0.9072854256014472, 'learning_rate': 3.0358696595026327e-07, 'epoch': 0.89} 89%|████████▉ | 30571/34278 [33:29:53<3:59:24, 3.87s/it] 89%|████████▉ | 30572/34278 [33:29:57<4:01:25, 3.91s/it] {'loss': 0.1044, 'grad_norm': 0.7739425465977036, 'learning_rate': 3.034248737070933e-07, 'epoch': 0.89} 89%|████████▉ | 30572/34278 [33:29:57<4:01:25, 3.91s/it] 89%|████████▉ | 30573/34278 [33:30:01<3:58:45, 3.87s/it] {'loss': 0.1087, 'grad_norm': 0.8250757646433379, 'learning_rate': 3.0326282339382453e-07, 'epoch': 0.89} 
89%|████████▉ | 30573/34278 [33:30:01<3:58:45, 3.87s/it] 89%|████████▉ | 30574/34278 [33:30:05<3:50:16, 3.73s/it] {'loss': 0.0985, 'grad_norm': 0.7924070911825599, 'learning_rate': 3.0310081501190415e-07, 'epoch': 0.89} 89%|████████▉ | 30574/34278 [33:30:05<3:50:16, 3.73s/it] 89%|████████▉ | 30575/34278 [33:30:11<4:32:40, 4.42s/it] {'loss': 0.1127, 'grad_norm': 1.0470126728379414, 'learning_rate': 3.029388485627782e-07, 'epoch': 0.89} 89%|████████▉ | 30575/34278 [33:30:11<4:32:40, 4.42s/it] 89%|████████▉ | 30576/34278 [33:30:17<5:01:11, 4.88s/it] {'loss': 0.1336, 'grad_norm': 0.7333108222090367, 'learning_rate': 3.027769240478939e-07, 'epoch': 0.89} 89%|████████▉ | 30576/34278 [33:30:17<5:01:11, 4.88s/it] 89%|████████▉ | 30577/34278 [33:30:20<4:27:59, 4.34s/it] {'loss': 0.134, 'grad_norm': 0.8361327535722722, 'learning_rate': 3.0261504146869457e-07, 'epoch': 0.89} 89%|████████▉ | 30577/34278 [33:30:20<4:27:59, 4.34s/it] 89%|████████▉ | 30578/34278 [33:30:23<4:00:32, 3.90s/it] {'loss': 0.1014, 'grad_norm': 0.811324091203435, 'learning_rate': 3.024532008266279e-07, 'epoch': 0.89} 89%|████████▉ | 30578/34278 [33:30:23<4:00:32, 3.90s/it] 89%|████████▉ | 30579/34278 [33:30:28<4:36:05, 4.48s/it] {'loss': 0.1054, 'grad_norm': 0.8761814206887861, 'learning_rate': 3.0229140212313767e-07, 'epoch': 0.89} 89%|████████▉ | 30579/34278 [33:30:28<4:36:05, 4.48s/it] 89%|████████▉ | 30580/34278 [33:30:32<4:20:49, 4.23s/it] {'loss': 0.1161, 'grad_norm': 0.8650337702531615, 'learning_rate': 3.021296453596678e-07, 'epoch': 0.89} 89%|████████▉ | 30580/34278 [33:30:32<4:20:49, 4.23s/it] 89%|████████▉ | 30581/34278 [33:30:38<4:50:13, 4.71s/it] {'loss': 0.1031, 'grad_norm': 0.8423918525707391, 'learning_rate': 3.019679305376627e-07, 'epoch': 0.89} 89%|████████▉ | 30581/34278 [33:30:38<4:50:13, 4.71s/it] 89%|████████▉ | 30582/34278 [33:30:41<4:13:05, 4.11s/it] {'loss': 0.114, 'grad_norm': 1.008493869777864, 'learning_rate': 3.018062576585673e-07, 'epoch': 0.89} 89%|████████▉ | 30582/34278 
[33:30:41<4:13:05, 4.11s/it] 89%|████████▉ | 30583/34278 [33:30:44<3:54:24, 3.81s/it] {'loss': 0.1039, 'grad_norm': 0.8460407679440864, 'learning_rate': 3.016446267238238e-07, 'epoch': 0.89} 89%|████████▉ | 30583/34278 [33:30:44<3:54:24, 3.81s/it] 89%|████████▉ | 30584/34278 [33:30:48<3:55:06, 3.82s/it] {'loss': 0.0898, 'grad_norm': 0.8650983055723295, 'learning_rate': 3.0148303773487486e-07, 'epoch': 0.89} 89%|████████▉ | 30584/34278 [33:30:48<3:55:06, 3.82s/it] 89%|████████▉ | 30585/34278 [33:30:51<3:39:54, 3.57s/it] {'loss': 0.1034, 'grad_norm': 0.7098856616095556, 'learning_rate': 3.0132149069316497e-07, 'epoch': 0.89} 89%|████████▉ | 30585/34278 [33:30:51<3:39:54, 3.57s/it] 89%|████████▉ | 30586/34278 [33:30:56<4:12:41, 4.11s/it] {'loss': 0.1122, 'grad_norm': 0.7638130427984393, 'learning_rate': 3.0115998560013404e-07, 'epoch': 0.89} 89%|████████▉ | 30586/34278 [33:30:56<4:12:41, 4.11s/it] 89%|████████▉ | 30587/34278 [33:30:59<3:49:00, 3.72s/it] {'loss': 0.1272, 'grad_norm': 0.9767240295345735, 'learning_rate': 3.0099852245722483e-07, 'epoch': 0.89} 89%|████████▉ | 30587/34278 [33:30:59<3:49:00, 3.72s/it] 89%|████████▉ | 30588/34278 [33:31:01<3:31:15, 3.44s/it] {'loss': 0.1167, 'grad_norm': 1.1747269220045504, 'learning_rate': 3.0083710126588005e-07, 'epoch': 0.89} 89%|████████▉ | 30588/34278 [33:31:01<3:31:15, 3.44s/it] 89%|████████▉ | 30589/34278 [33:31:05<3:28:13, 3.39s/it] {'loss': 0.0973, 'grad_norm': 0.9038430151497058, 'learning_rate': 3.006757220275397e-07, 'epoch': 0.89} 89%|████████▉ | 30589/34278 [33:31:05<3:28:13, 3.39s/it] 89%|████████▉ | 30590/34278 [33:31:09<3:53:03, 3.79s/it] {'loss': 0.1106, 'grad_norm': 0.7636481866103193, 'learning_rate': 3.005143847436437e-07, 'epoch': 0.89} 89%|████████▉ | 30590/34278 [33:31:09<3:53:03, 3.79s/it] 89%|████████▉ | 30591/34278 [33:31:15<4:26:34, 4.34s/it] {'loss': 0.1155, 'grad_norm': 1.0234036792038594, 'learning_rate': 3.003530894156348e-07, 'epoch': 0.89} 89%|████████▉ | 30591/34278 [33:31:15<4:26:34, 
4.34s/it] 89%|████████▉ | 30592/34278 [33:31:18<4:00:12, 3.91s/it] {'loss': 0.1213, 'grad_norm': 0.8167228866756323, 'learning_rate': 3.0019183604495075e-07, 'epoch': 0.89} 89%|████████▉ | 30592/34278 [33:31:18<4:00:12, 3.91s/it] 89%|████████▉ | 30593/34278 [33:31:23<4:28:19, 4.37s/it] {'loss': 0.0835, 'grad_norm': 0.6343908880006888, 'learning_rate': 3.0003062463303257e-07, 'epoch': 0.89} 89%|████████▉ | 30593/34278 [33:31:23<4:28:19, 4.37s/it] 89%|████████▉ | 30594/34278 [33:31:26<4:03:59, 3.97s/it] {'loss': 0.0921, 'grad_norm': 0.67414439241472, 'learning_rate': 2.99869455181318e-07, 'epoch': 0.89} 89%|████████▉ | 30594/34278 [33:31:26<4:03:59, 3.97s/it] 89%|████████▉ | 30595/34278 [33:31:29<3:44:10, 3.65s/it] {'loss': 0.1224, 'grad_norm': 0.7684257627189851, 'learning_rate': 2.9970832769124823e-07, 'epoch': 0.89} 89%|████████▉ | 30595/34278 [33:31:29<3:44:10, 3.65s/it] 89%|████████▉ | 30596/34278 [33:31:33<3:37:58, 3.55s/it] {'loss': 0.0952, 'grad_norm': 0.7164831969761788, 'learning_rate': 2.995472421642598e-07, 'epoch': 0.89} 89%|████████▉ | 30596/34278 [33:31:33<3:37:58, 3.55s/it] 89%|████████▉ | 30597/34278 [33:31:36<3:36:33, 3.53s/it] {'loss': 0.0964, 'grad_norm': 0.774801751984761, 'learning_rate': 2.993861986017915e-07, 'epoch': 0.89} 89%|████████▉ | 30597/34278 [33:31:36<3:36:33, 3.53s/it] 89%|████████▉ | 30598/34278 [33:31:40<3:38:10, 3.56s/it] {'loss': 0.1069, 'grad_norm': 0.8492196878393536, 'learning_rate': 2.992251970052806e-07, 'epoch': 0.89} 89%|████████▉ | 30598/34278 [33:31:40<3:38:10, 3.56s/it] 89%|████████▉ | 30599/34278 [33:31:43<3:25:09, 3.35s/it] {'loss': 0.1139, 'grad_norm': 0.816706175559454, 'learning_rate': 2.9906423737616595e-07, 'epoch': 0.89} 89%|████████▉ | 30599/34278 [33:31:43<3:25:09, 3.35s/it] 89%|████████▉ | 30600/34278 [33:31:46<3:25:05, 3.35s/it] {'loss': 0.11, 'grad_norm': 1.1452740993489874, 'learning_rate': 2.989033197158825e-07, 'epoch': 0.89} 89%|████████▉ | 30600/34278 [33:31:46<3:25:05, 3.35s/it] 89%|████████▉ | 
30601/34278 [33:31:49<3:22:30, 3.30s/it] {'loss': 0.1386, 'grad_norm': 0.7472179425536273, 'learning_rate': 2.9874244402586903e-07, 'epoch': 0.89} 89%|████████▉ | 30601/34278 [33:31:49<3:22:30, 3.30s/it] 89%|████████▉ | 30602/34278 [33:31:54<3:49:06, 3.74s/it] {'loss': 0.1148, 'grad_norm': 0.8886819051906715, 'learning_rate': 2.985816103075606e-07, 'epoch': 0.89} 89%|████████▉ | 30602/34278 [33:31:54<3:49:06, 3.74s/it] 89%|████████▉ | 30603/34278 [33:31:57<3:42:29, 3.63s/it] {'loss': 0.1115, 'grad_norm': 1.1445926068268717, 'learning_rate': 2.984208185623927e-07, 'epoch': 0.89} 89%|████████▉ | 30603/34278 [33:31:57<3:42:29, 3.63s/it] 89%|████████▉ | 30604/34278 [33:32:00<3:25:36, 3.36s/it] {'loss': 0.1243, 'grad_norm': 0.7815328900810175, 'learning_rate': 2.982600687918014e-07, 'epoch': 0.89} 89%|████████▉ | 30604/34278 [33:32:00<3:25:36, 3.36s/it] 89%|████████▉ | 30605/34278 [33:32:03<3:21:00, 3.28s/it] {'loss': 0.0996, 'grad_norm': 0.7975707815032408, 'learning_rate': 2.980993609972221e-07, 'epoch': 0.89} 89%|████████▉ | 30605/34278 [33:32:03<3:21:00, 3.28s/it] 89%|████████▉ | 30606/34278 [33:32:07<3:29:05, 3.42s/it] {'loss': 0.0782, 'grad_norm': 0.8032606289605577, 'learning_rate': 2.9793869518009e-07, 'epoch': 0.89} 89%|████████▉ | 30606/34278 [33:32:07<3:29:05, 3.42s/it] 89%|████████▉ | 30607/34278 [33:32:10<3:26:38, 3.38s/it] {'loss': 0.1096, 'grad_norm': 0.8663623225654811, 'learning_rate': 2.9777807134183714e-07, 'epoch': 0.89} 89%|████████▉ | 30607/34278 [33:32:10<3:26:38, 3.38s/it] 89%|████████▉ | 30608/34278 [33:32:14<3:34:40, 3.51s/it] {'loss': 0.1195, 'grad_norm': 0.7444972697751595, 'learning_rate': 2.976174894839007e-07, 'epoch': 0.89} 89%|████████▉ | 30608/34278 [33:32:14<3:34:40, 3.51s/it] 89%|████████▉ | 30609/34278 [33:32:17<3:25:48, 3.37s/it] {'loss': 0.1303, 'grad_norm': 0.8804128856649871, 'learning_rate': 2.974569496077123e-07, 'epoch': 0.89} 89%|████████▉ | 30609/34278 [33:32:17<3:25:48, 3.37s/it] 89%|████████▉ | 30610/34278 
[33:32:20<3:20:52, 3.29s/it] {'loss': 0.1054, 'grad_norm': 0.8628877034987134, 'learning_rate': 2.972964517147048e-07, 'epoch': 0.89} 89%|████████▉ | 30610/34278 [33:32:20<3:20:52, 3.29s/it] 89%|████████▉ | 30611/34278 [33:32:24<3:28:51, 3.42s/it] {'loss': 0.131, 'grad_norm': 0.8268818707012316, 'learning_rate': 2.971359958063125e-07, 'epoch': 0.89} 89%|████████▉ | 30611/34278 [33:32:24<3:28:51, 3.42s/it] 89%|████████▉ | 30612/34278 [33:32:27<3:26:37, 3.38s/it] {'loss': 0.12, 'grad_norm': 0.9250978819681452, 'learning_rate': 2.9697558188396757e-07, 'epoch': 0.89} 89%|████████▉ | 30612/34278 [33:32:27<3:26:37, 3.38s/it] 89%|████████▉ | 30613/34278 [33:32:31<3:29:05, 3.42s/it] {'loss': 0.113, 'grad_norm': 0.8044794877353744, 'learning_rate': 2.968152099491023e-07, 'epoch': 0.89} 89%|████████▉ | 30613/34278 [33:32:31<3:29:05, 3.42s/it] 89%|████████▉ | 30614/34278 [33:32:34<3:20:58, 3.29s/it] {'loss': 0.0922, 'grad_norm': 0.8176311630704783, 'learning_rate': 2.966548800031471e-07, 'epoch': 0.89} 89%|████████▉ | 30614/34278 [33:32:34<3:20:58, 3.29s/it] 89%|████████▉ | 30615/34278 [33:32:37<3:20:07, 3.28s/it] {'loss': 0.116, 'grad_norm': 0.7626515244155622, 'learning_rate': 2.964945920475354e-07, 'epoch': 0.89} 89%|████████▉ | 30615/34278 [33:32:37<3:20:07, 3.28s/it] 89%|████████▉ | 30616/34278 [33:32:40<3:20:05, 3.28s/it] {'loss': 0.1064, 'grad_norm': 0.7250689703709754, 'learning_rate': 2.9633434608369596e-07, 'epoch': 0.89} 89%|████████▉ | 30616/34278 [33:32:40<3:20:05, 3.28s/it] 89%|████████▉ | 30617/34278 [33:32:43<3:17:26, 3.24s/it] {'loss': 0.1268, 'grad_norm': 2.6020126383282434, 'learning_rate': 2.9617414211306093e-07, 'epoch': 0.89} 89%|████████▉ | 30617/34278 [33:32:43<3:17:26, 3.24s/it] 89%|████████▉ | 30618/34278 [33:32:47<3:29:57, 3.44s/it] {'loss': 0.1186, 'grad_norm': 0.9140808306146535, 'learning_rate': 2.9601398013706094e-07, 'epoch': 0.89} 89%|████████▉ | 30618/34278 [33:32:47<3:29:57, 3.44s/it] 89%|████████▉ | 30619/34278 [33:32:51<3:32:30, 3.48s/it] 
[Training log, condensed from repeated tqdm progress-bar output]
Steps 30619–30810 of 34278 (89–90%, epoch 0.89 → 0.90; epoch tick at step 30679), elapsed 33:32:51 → 33:44:39, ~3.0–5.0 s/it, ETA ~3–4.5 h.
loss: 0.080–0.142 per step; grad_norm: typically 0.64–1.32, with spikes of 2.09 (step 30634) and 3.96 (step 30654); learning_rate: decaying from 2.9585e-07 to 2.6604e-07.
[33:44:39<3:49:39, 3.97s/it] 90%|████████▉ | 30811/34278 [33:44:44<3:52:23, 4.02s/it] {'loss': 0.1303, 'grad_norm': 1.2090116911069029, 'learning_rate': 2.658906183717108e-07, 'epoch': 0.9} 90%|████████▉ | 30811/34278 [33:44:44<3:52:23, 4.02s/it] 90%|████████▉ | 30812/34278 [33:44:50<4:26:31, 4.61s/it] {'loss': 0.1109, 'grad_norm': 0.7385412598763644, 'learning_rate': 2.657386299441289e-07, 'epoch': 0.9} 90%|████████▉ | 30812/34278 [33:44:50<4:26:31, 4.61s/it] 90%|████████▉ | 30813/34278 [33:44:53<4:12:05, 4.37s/it] {'loss': 0.1125, 'grad_norm': 0.813675632854998, 'learning_rate': 2.655866837829019e-07, 'epoch': 0.9} 90%|████████▉ | 30813/34278 [33:44:53<4:12:05, 4.37s/it] 90%|████████▉ | 30814/34278 [33:44:56<3:51:20, 4.01s/it] {'loss': 0.1004, 'grad_norm': 0.9246775517224919, 'learning_rate': 2.654347798893864e-07, 'epoch': 0.9} 90%|████████▉ | 30814/34278 [33:44:56<3:51:20, 4.01s/it] 90%|████████▉ | 30815/34278 [33:45:00<3:43:12, 3.87s/it] {'loss': 0.1098, 'grad_norm': 0.9039317095109196, 'learning_rate': 2.652829182649397e-07, 'epoch': 0.9} 90%|████████▉ | 30815/34278 [33:45:00<3:43:12, 3.87s/it] 90%|████████▉ | 30816/34278 [33:45:03<3:30:05, 3.64s/it] {'loss': 0.1233, 'grad_norm': 0.8844314416017702, 'learning_rate': 2.651310989109174e-07, 'epoch': 0.9} 90%|████████▉ | 30816/34278 [33:45:03<3:30:05, 3.64s/it] 90%|████████▉ | 30817/34278 [33:45:07<3:31:24, 3.67s/it] {'loss': 0.1022, 'grad_norm': 0.8500331010090957, 'learning_rate': 2.649793218286728e-07, 'epoch': 0.9} 90%|████████▉ | 30817/34278 [33:45:07<3:31:24, 3.67s/it] 90%|████████▉ | 30818/34278 [33:45:10<3:21:51, 3.50s/it] {'loss': 0.0893, 'grad_norm': 0.8048821621514879, 'learning_rate': 2.6482758701956377e-07, 'epoch': 0.9} 90%|████████▉ | 30818/34278 [33:45:10<3:21:51, 3.50s/it] 90%|████████▉ | 30819/34278 [33:45:15<3:47:45, 3.95s/it] {'loss': 0.1259, 'grad_norm': 0.7685700013816277, 'learning_rate': 2.6467589448494255e-07, 'epoch': 0.9} 90%|████████▉ | 30819/34278 [33:45:15<3:47:45, 3.95s/it] 
90%|████████▉ | 30820/34278 [33:45:18<3:36:59, 3.77s/it] {'loss': 0.1175, 'grad_norm': 0.9148224900349895, 'learning_rate': 2.645242442261659e-07, 'epoch': 0.9} 90%|████████▉ | 30820/34278 [33:45:18<3:36:59, 3.77s/it] 90%|████████▉ | 30821/34278 [33:45:21<3:20:34, 3.48s/it] {'loss': 0.1107, 'grad_norm': 0.7447199833000645, 'learning_rate': 2.6437263624458474e-07, 'epoch': 0.9} 90%|████████▉ | 30821/34278 [33:45:21<3:20:34, 3.48s/it] 90%|████████▉ | 30822/34278 [33:45:24<3:10:43, 3.31s/it] {'loss': 0.1035, 'grad_norm': 0.6002362659498117, 'learning_rate': 2.642210705415554e-07, 'epoch': 0.9} 90%|████████▉ | 30822/34278 [33:45:24<3:10:43, 3.31s/it] 90%|████████▉ | 30823/34278 [33:45:27<3:04:05, 3.20s/it] {'loss': 0.1231, 'grad_norm': 0.9959436406921786, 'learning_rate': 2.6406954711843014e-07, 'epoch': 0.9} 90%|████████▉ | 30823/34278 [33:45:27<3:04:05, 3.20s/it] 90%|████████▉ | 30824/34278 [33:45:33<3:52:19, 4.04s/it] {'loss': 0.1154, 'grad_norm': 0.9342524108593679, 'learning_rate': 2.6391806597656003e-07, 'epoch': 0.9} 90%|████████▉ | 30824/34278 [33:45:33<3:52:19, 4.04s/it] 90%|████████▉ | 30825/34278 [33:45:36<3:41:16, 3.84s/it] {'loss': 0.109, 'grad_norm': 0.9400089022570661, 'learning_rate': 2.637666271172995e-07, 'epoch': 0.9} 90%|████████▉ | 30825/34278 [33:45:36<3:41:16, 3.84s/it] 90%|████████▉ | 30826/34278 [33:45:40<3:40:00, 3.82s/it] {'loss': 0.1314, 'grad_norm': 0.8065494860891471, 'learning_rate': 2.636152305419998e-07, 'epoch': 0.9} 90%|████████▉ | 30826/34278 [33:45:40<3:40:00, 3.82s/it] 90%|████████▉ | 30827/34278 [33:45:44<3:47:15, 3.95s/it] {'loss': 0.117, 'grad_norm': 0.7908419503663388, 'learning_rate': 2.634638762520125e-07, 'epoch': 0.9} 90%|████████▉ | 30827/34278 [33:45:44<3:47:15, 3.95s/it] 90%|████████▉ | 30828/34278 [33:45:47<3:24:48, 3.56s/it] {'loss': 0.0926, 'grad_norm': 1.0302652218095083, 'learning_rate': 2.6331256424869e-07, 'epoch': 0.9} 90%|████████▉ | 30828/34278 [33:45:47<3:24:48, 3.56s/it] 90%|████████▉ | 30829/34278 
[33:45:53<4:06:52, 4.29s/it] {'loss': 0.1076, 'grad_norm': 0.7005029814400058, 'learning_rate': 2.631612945333817e-07, 'epoch': 0.9} 90%|████████▉ | 30829/34278 [33:45:53<4:06:52, 4.29s/it] 90%|████████▉ | 30830/34278 [33:45:56<3:45:26, 3.92s/it] {'loss': 0.1141, 'grad_norm': 0.8424244710245236, 'learning_rate': 2.630100671074376e-07, 'epoch': 0.9} 90%|████████▉ | 30830/34278 [33:45:56<3:45:26, 3.92s/it] 90%|████████▉ | 30831/34278 [33:45:59<3:28:50, 3.64s/it] {'loss': 0.1099, 'grad_norm': 1.0077946107225784, 'learning_rate': 2.628588819722094e-07, 'epoch': 0.9} 90%|████████▉ | 30831/34278 [33:45:59<3:28:50, 3.64s/it] 90%|████████▉ | 30832/34278 [33:46:02<3:20:09, 3.48s/it] {'loss': 0.0934, 'grad_norm': 0.8653977940915419, 'learning_rate': 2.627077391290467e-07, 'epoch': 0.9} 90%|████████▉ | 30832/34278 [33:46:02<3:20:09, 3.48s/it] 90%|████████▉ | 30833/34278 [33:46:06<3:22:40, 3.53s/it] {'loss': 0.1042, 'grad_norm': 0.6753000315672723, 'learning_rate': 2.625566385792988e-07, 'epoch': 0.9} 90%|████████▉ | 30833/34278 [33:46:06<3:22:40, 3.53s/it] 90%|████████▉ | 30834/34278 [33:46:09<3:21:34, 3.51s/it] {'loss': 0.1162, 'grad_norm': 0.8426563397021365, 'learning_rate': 2.6240558032431307e-07, 'epoch': 0.9} 90%|████████▉ | 30834/34278 [33:46:09<3:21:34, 3.51s/it] 90%|████████▉ | 30835/34278 [33:46:16<4:08:23, 4.33s/it] {'loss': 0.0923, 'grad_norm': 0.8772241797615372, 'learning_rate': 2.622545643654401e-07, 'epoch': 0.9} 90%|████████▉ | 30835/34278 [33:46:16<4:08:23, 4.33s/it] 90%|████████▉ | 30836/34278 [33:46:18<3:43:52, 3.90s/it] {'loss': 0.1218, 'grad_norm': 0.9294651252284601, 'learning_rate': 2.621035907040276e-07, 'epoch': 0.9} 90%|████████▉ | 30836/34278 [33:46:18<3:43:52, 3.90s/it] 90%|████████▉ | 30837/34278 [33:46:22<3:44:30, 3.91s/it] {'loss': 0.1339, 'grad_norm': 0.7503295861919965, 'learning_rate': 2.6195265934142177e-07, 'epoch': 0.9} 90%|████████▉ | 30837/34278 [33:46:22<3:44:30, 3.91s/it] 90%|████████▉ | 30838/34278 [33:46:28<4:20:50, 4.55s/it] 
{'loss': 0.1176, 'grad_norm': 0.9234185871140549, 'learning_rate': 2.6180177027897326e-07, 'epoch': 0.9} 90%|████████▉ | 30838/34278 [33:46:28<4:20:50, 4.55s/it] 90%|████████▉ | 30839/34278 [33:46:33<4:20:13, 4.54s/it] {'loss': 0.0955, 'grad_norm': 0.8301220374617003, 'learning_rate': 2.61650923518027e-07, 'epoch': 0.9} 90%|████████▉ | 30839/34278 [33:46:33<4:20:13, 4.54s/it] 90%|████████▉ | 30840/34278 [33:46:36<3:58:26, 4.16s/it] {'loss': 0.1012, 'grad_norm': 1.0647598043586948, 'learning_rate': 2.6150011905992977e-07, 'epoch': 0.9} 90%|████████▉ | 30840/34278 [33:46:36<3:58:26, 4.16s/it] 90%|████████▉ | 30841/34278 [33:46:42<4:30:25, 4.72s/it] {'loss': 0.1471, 'grad_norm': 0.8196386934894001, 'learning_rate': 2.613493569060288e-07, 'epoch': 0.9} 90%|████████▉ | 30841/34278 [33:46:42<4:30:25, 4.72s/it] 90%|████████▉ | 30842/34278 [33:46:45<4:00:42, 4.20s/it] {'loss': 0.0898, 'grad_norm': 0.7867605259723952, 'learning_rate': 2.6119863705766967e-07, 'epoch': 0.9} 90%|████████▉ | 30842/34278 [33:46:45<4:00:42, 4.20s/it] 90%|████████▉ | 30843/34278 [33:46:49<3:50:36, 4.03s/it] {'loss': 0.1258, 'grad_norm': 0.8799963901929425, 'learning_rate': 2.610479595161969e-07, 'epoch': 0.9} 90%|████████▉ | 30843/34278 [33:46:49<3:50:36, 4.03s/it] 90%|████████▉ | 30844/34278 [33:46:52<3:34:27, 3.75s/it] {'loss': 0.1008, 'grad_norm': 0.6158166063517552, 'learning_rate': 2.6089732428295654e-07, 'epoch': 0.9} 90%|████████▉ | 30844/34278 [33:46:52<3:34:27, 3.75s/it] 90%|████████▉ | 30845/34278 [33:46:55<3:23:40, 3.56s/it] {'loss': 0.1368, 'grad_norm': 1.0886071944431155, 'learning_rate': 2.6074673135929486e-07, 'epoch': 0.9} 90%|████████▉ | 30845/34278 [33:46:55<3:23:40, 3.56s/it] 90%|████████▉ | 30846/34278 [33:46:58<3:12:08, 3.36s/it] {'loss': 0.1114, 'grad_norm': 0.8235006316894611, 'learning_rate': 2.6059618074655457e-07, 'epoch': 0.9} 90%|████████▉ | 30846/34278 [33:46:58<3:12:08, 3.36s/it] 90%|████████▉ | 30847/34278 [33:47:01<3:09:04, 3.31s/it] {'loss': 0.106, 'grad_norm': 
0.7278380548351747, 'learning_rate': 2.6044567244607963e-07, 'epoch': 0.9} 90%|████████▉ | 30847/34278 [33:47:01<3:09:04, 3.31s/it] 90%|████████▉ | 30848/34278 [33:47:04<3:03:42, 3.21s/it] {'loss': 0.1188, 'grad_norm': 0.8271609162705068, 'learning_rate': 2.6029520645921515e-07, 'epoch': 0.9} 90%|████████▉ | 30848/34278 [33:47:04<3:03:42, 3.21s/it] 90%|████████▉ | 30849/34278 [33:47:08<3:12:20, 3.37s/it] {'loss': 0.1279, 'grad_norm': 0.8698388363891452, 'learning_rate': 2.601447827873027e-07, 'epoch': 0.9} 90%|████████▉ | 30849/34278 [33:47:08<3:12:20, 3.37s/it] 90%|████████▉ | 30850/34278 [33:47:11<3:16:30, 3.44s/it] {'loss': 0.0866, 'grad_norm': 0.7373099240975741, 'learning_rate': 2.5999440143168686e-07, 'epoch': 0.9} 90%|████████▉ | 30850/34278 [33:47:11<3:16:30, 3.44s/it] 90%|█████████ | 30851/34278 [33:47:18<4:04:45, 4.29s/it] {'loss': 0.1118, 'grad_norm': 0.7588799540457818, 'learning_rate': 2.5984406239370874e-07, 'epoch': 0.9} 90%|█████████ | 30851/34278 [33:47:18<4:04:45, 4.29s/it] 90%|█████████ | 30852/34278 [33:47:24<4:40:52, 4.92s/it] {'loss': 0.1166, 'grad_norm': 0.920814935558577, 'learning_rate': 2.5969376567471226e-07, 'epoch': 0.9} 90%|█████████ | 30852/34278 [33:47:24<4:40:52, 4.92s/it] 90%|█████████ | 30853/34278 [33:47:30<4:56:41, 5.20s/it] {'loss': 0.1226, 'grad_norm': 0.8258080195124249, 'learning_rate': 2.5954351127603807e-07, 'epoch': 0.9} 90%|█████████ | 30853/34278 [33:47:30<4:56:41, 5.20s/it] 90%|█████████ | 30854/34278 [33:47:36<5:10:54, 5.45s/it] {'loss': 0.1029, 'grad_norm': 1.0723653998016915, 'learning_rate': 2.593932991990272e-07, 'epoch': 0.9} 90%|█████████ | 30854/34278 [33:47:36<5:10:54, 5.45s/it] 90%|█████████ | 30855/34278 [33:47:39<4:37:28, 4.86s/it] {'loss': 0.0984, 'grad_norm': 0.8500662747471033, 'learning_rate': 2.5924312944502095e-07, 'epoch': 0.9} 90%|█████████ | 30855/34278 [33:47:39<4:37:28, 4.86s/it] 90%|█████████ | 30856/34278 [33:47:43<4:18:18, 4.53s/it] {'loss': 0.1193, 'grad_norm': 0.8728339231385889, 
'learning_rate': 2.590930020153609e-07, 'epoch': 0.9} 90%|█████████ | 30856/34278 [33:47:43<4:18:18, 4.53s/it] 90%|█████████ | 30857/34278 [33:47:46<3:55:52, 4.14s/it] {'loss': 0.0893, 'grad_norm': 0.8822052485185008, 'learning_rate': 2.589429169113866e-07, 'epoch': 0.9} 90%|█████████ | 30857/34278 [33:47:46<3:55:52, 4.14s/it] 90%|█████████ | 30858/34278 [33:47:52<4:26:59, 4.68s/it] {'loss': 0.1235, 'grad_norm': 0.7788704914360438, 'learning_rate': 2.5879287413443863e-07, 'epoch': 0.9} 90%|█████████ | 30858/34278 [33:47:52<4:26:59, 4.68s/it] 90%|█████████ | 30859/34278 [33:47:55<3:55:53, 4.14s/it] {'loss': 0.108, 'grad_norm': 0.8237787340704235, 'learning_rate': 2.5864287368585596e-07, 'epoch': 0.9} 90%|█████████ | 30859/34278 [33:47:55<3:55:53, 4.14s/it] 90%|█████████ | 30860/34278 [33:47:58<3:39:46, 3.86s/it] {'loss': 0.1245, 'grad_norm': 0.955336411671347, 'learning_rate': 2.584929155669774e-07, 'epoch': 0.9} 90%|█████████ | 30860/34278 [33:47:58<3:39:46, 3.86s/it] 90%|█████████ | 30861/34278 [33:48:01<3:19:22, 3.50s/it] {'loss': 0.0947, 'grad_norm': 0.8322752354288175, 'learning_rate': 2.5834299977914203e-07, 'epoch': 0.9} 90%|█████████ | 30861/34278 [33:48:01<3:19:22, 3.50s/it] 90%|█████████ | 30862/34278 [33:48:04<3:11:03, 3.36s/it] {'loss': 0.139, 'grad_norm': 0.8808347603018455, 'learning_rate': 2.581931263236892e-07, 'epoch': 0.9} 90%|█████████ | 30862/34278 [33:48:04<3:11:03, 3.36s/it] 90%|█████████ | 30863/34278 [33:48:07<3:02:43, 3.21s/it] {'loss': 0.1091, 'grad_norm': 0.7896792013506789, 'learning_rate': 2.5804329520195625e-07, 'epoch': 0.9} 90%|█████████ | 30863/34278 [33:48:07<3:02:43, 3.21s/it] 90%|█████████ | 30864/34278 [33:48:10<3:02:16, 3.20s/it] {'loss': 0.1038, 'grad_norm': 0.7428391325210728, 'learning_rate': 2.5789350641527987e-07, 'epoch': 0.9} 90%|█████████ | 30864/34278 [33:48:10<3:02:16, 3.20s/it] 90%|█████████ | 30865/34278 [33:48:14<3:13:00, 3.39s/it] {'loss': 0.1187, 'grad_norm': 0.8425847751115024, 'learning_rate': 
2.57743759964999e-07, 'epoch': 0.9} 90%|█████████ | 30865/34278 [33:48:14<3:13:00, 3.39s/it] 90%|█████████ | 30866/34278 [33:48:20<3:51:47, 4.08s/it] {'loss': 0.1127, 'grad_norm': 0.7751035477123707, 'learning_rate': 2.575940558524498e-07, 'epoch': 0.9} 90%|█████████ | 30866/34278 [33:48:20<3:51:47, 4.08s/it] 90%|█████████ | 30867/34278 [33:48:23<3:40:45, 3.88s/it] {'loss': 0.119, 'grad_norm': 1.1087439040171116, 'learning_rate': 2.5744439407896725e-07, 'epoch': 0.9} 90%|█████████ | 30867/34278 [33:48:23<3:40:45, 3.88s/it] 90%|█████████ | 30868/34278 [33:48:27<3:45:45, 3.97s/it] {'loss': 0.099, 'grad_norm': 0.844292477344265, 'learning_rate': 2.5729477464589037e-07, 'epoch': 0.9} 90%|█████████ | 30868/34278 [33:48:27<3:45:45, 3.97s/it] 90%|█████████ | 30869/34278 [33:48:31<3:48:08, 4.02s/it] {'loss': 0.1004, 'grad_norm': 0.7187965661116755, 'learning_rate': 2.5714519755455416e-07, 'epoch': 0.9} 90%|█████████ | 30869/34278 [33:48:32<3:48:08, 4.02s/it] 90%|█████████ | 30870/34278 [33:48:36<3:59:22, 4.21s/it] {'loss': 0.1334, 'grad_norm': 0.8663774904698612, 'learning_rate': 2.5699566280629196e-07, 'epoch': 0.9} 90%|█████████ | 30870/34278 [33:48:36<3:59:22, 4.21s/it] 90%|█████████ | 30871/34278 [33:48:41<4:09:49, 4.40s/it] {'loss': 0.1072, 'grad_norm': 0.8035676158163162, 'learning_rate': 2.568461704024411e-07, 'epoch': 0.9} 90%|█████████ | 30871/34278 [33:48:41<4:09:49, 4.40s/it] 90%|█████████ | 30872/34278 [33:48:46<4:23:50, 4.65s/it] {'loss': 0.1221, 'grad_norm': 1.0698248448789052, 'learning_rate': 2.5669672034433544e-07, 'epoch': 0.9} 90%|█████████ | 30872/34278 [33:48:46<4:23:50, 4.65s/it] 90%|█████████ | 30873/34278 [33:48:50<4:04:06, 4.30s/it] {'loss': 0.1058, 'grad_norm': 0.8177173393092028, 'learning_rate': 2.56547312633309e-07, 'epoch': 0.9} 90%|█████████ | 30873/34278 [33:48:50<4:04:06, 4.30s/it] 90%|█████████ | 30874/34278 [33:48:53<3:41:54, 3.91s/it] {'loss': 0.1116, 'grad_norm': 0.8131376079609332, 'learning_rate': 2.563979472706951e-07, 'epoch': 0.9} 
90%|█████████ | 30874/34278 [33:48:53<3:41:54, 3.91s/it] 90%|█████████ | 30875/34278 [33:48:56<3:23:58, 3.60s/it] {'loss': 0.111, 'grad_norm': 0.9209480791815651, 'learning_rate': 2.562486242578288e-07, 'epoch': 0.9} 90%|█████████ | 30875/34278 [33:48:56<3:23:58, 3.60s/it] 90%|█████████ | 30876/34278 [33:48:59<3:22:08, 3.57s/it] {'loss': 0.1054, 'grad_norm': 0.6814739903364504, 'learning_rate': 2.560993435960424e-07, 'epoch': 0.9} 90%|█████████ | 30876/34278 [33:48:59<3:22:08, 3.57s/it] 90%|█████████ | 30877/34278 [33:49:03<3:23:29, 3.59s/it] {'loss': 0.1447, 'grad_norm': 1.072332176891597, 'learning_rate': 2.559501052866681e-07, 'epoch': 0.9} 90%|█████████ | 30877/34278 [33:49:03<3:23:29, 3.59s/it] 90%|█████████ | 30878/34278 [33:49:06<3:10:48, 3.37s/it] {'loss': 0.1091, 'grad_norm': 0.7242888196953966, 'learning_rate': 2.558009093310393e-07, 'epoch': 0.9} 90%|█████████ | 30878/34278 [33:49:06<3:10:48, 3.37s/it] 90%|█████████ | 30879/34278 [33:49:09<3:07:01, 3.30s/it] {'loss': 0.0992, 'grad_norm': 0.826829439552671, 'learning_rate': 2.556517557304866e-07, 'epoch': 0.9} 90%|█████████ | 30879/34278 [33:49:09<3:07:01, 3.30s/it] 90%|█████████ | 30880/34278 [33:49:12<3:01:52, 3.21s/it] {'loss': 0.1045, 'grad_norm': 0.6926146048625293, 'learning_rate': 2.5550264448634285e-07, 'epoch': 0.9} 90%|█████████ | 30880/34278 [33:49:12<3:01:52, 3.21s/it] 90%|█████████ | 30881/34278 [33:49:14<2:53:03, 3.06s/it] {'loss': 0.1059, 'grad_norm': 0.8374251096933836, 'learning_rate': 2.553535755999387e-07, 'epoch': 0.9} 90%|█████████ | 30881/34278 [33:49:14<2:53:03, 3.06s/it] 90%|█████████ | 30882/34278 [33:49:18<2:56:53, 3.13s/it] {'loss': 0.0972, 'grad_norm': 0.8812956683022092, 'learning_rate': 2.552045490726057e-07, 'epoch': 0.9} 90%|█████████ | 30882/34278 [33:49:18<2:56:53, 3.13s/it] 90%|█████████ | 30883/34278 [33:49:21<3:00:36, 3.19s/it] {'loss': 0.1059, 'grad_norm': 0.8180317243090963, 'learning_rate': 2.5505556490567405e-07, 'epoch': 0.9} 90%|█████████ | 30883/34278 
[33:49:21<3:00:36, 3.19s/it] 90%|█████████ | 30884/34278 [33:49:25<3:05:56, 3.29s/it] {'loss': 0.1173, 'grad_norm': 0.9969662856760693, 'learning_rate': 2.5490662310047264e-07, 'epoch': 0.9} 90%|█████████ | 30884/34278 [33:49:25<3:05:56, 3.29s/it] 90%|█████████ | 30885/34278 [33:49:28<3:08:12, 3.33s/it] {'loss': 0.0814, 'grad_norm': 1.2642738594841663, 'learning_rate': 2.547577236583326e-07, 'epoch': 0.9} 90%|█████████ | 30885/34278 [33:49:28<3:08:12, 3.33s/it] 90%|█████████ | 30886/34278 [33:49:31<3:06:31, 3.30s/it] {'loss': 0.1076, 'grad_norm': 0.8798486010831635, 'learning_rate': 2.5460886658058295e-07, 'epoch': 0.9} 90%|█████████ | 30886/34278 [33:49:31<3:06:31, 3.30s/it] 90%|█████████ | 30887/34278 [33:49:34<3:00:54, 3.20s/it] {'loss': 0.0998, 'grad_norm': 0.6230444000273161, 'learning_rate': 2.544600518685519e-07, 'epoch': 0.9} 90%|█████████ | 30887/34278 [33:49:34<3:00:54, 3.20s/it] 90%|█████████ | 30888/34278 [33:49:38<3:04:54, 3.27s/it] {'loss': 0.108, 'grad_norm': 0.7321902723437084, 'learning_rate': 2.543112795235697e-07, 'epoch': 0.9} 90%|█████████ | 30888/34278 [33:49:38<3:04:54, 3.27s/it] 90%|█████████ | 30889/34278 [33:49:40<2:57:52, 3.15s/it] {'loss': 0.1009, 'grad_norm': 0.7923578488218115, 'learning_rate': 2.541625495469635e-07, 'epoch': 0.9} 90%|█████████ | 30889/34278 [33:49:40<2:57:52, 3.15s/it] 90%|█████████ | 30890/34278 [33:49:46<3:46:20, 4.01s/it] {'loss': 0.0948, 'grad_norm': 0.7111146089392812, 'learning_rate': 2.5401386194006005e-07, 'epoch': 0.9} 90%|█████████ | 30890/34278 [33:49:46<3:46:20, 4.01s/it] 90%|█████████ | 30891/34278 [33:49:51<3:56:18, 4.19s/it] {'loss': 0.1153, 'grad_norm': 0.7253050259418145, 'learning_rate': 2.538652167041883e-07, 'epoch': 0.9} 90%|█████████ | 30891/34278 [33:49:51<3:56:18, 4.19s/it] 90%|█████████ | 30892/34278 [33:49:54<3:39:34, 3.89s/it] {'loss': 0.1037, 'grad_norm': 0.7411366341909867, 'learning_rate': 2.5371661384067546e-07, 'epoch': 0.9} 90%|█████████ | 30892/34278 [33:49:54<3:39:34, 3.89s/it] 
90%|█████████ | 30893/34278 [33:49:59<4:01:12, 4.28s/it] {'loss': 0.0949, 'grad_norm': 0.8317204427851583, 'learning_rate': 2.5356805335084776e-07, 'epoch': 0.9} 90%|█████████ | 30893/34278 [33:49:59<4:01:12, 4.28s/it] 90%|█████████ | 30894/34278 [33:50:03<3:43:17, 3.96s/it] {'loss': 0.1088, 'grad_norm': 0.6538123182560942, 'learning_rate': 2.5341953523603024e-07, 'epoch': 0.9} 90%|█████████ | 30894/34278 [33:50:03<3:43:17, 3.96s/it] 90%|█████████ | 30895/34278 [33:50:06<3:36:21, 3.84s/it] {'loss': 0.103, 'grad_norm': 0.8232839203533794, 'learning_rate': 2.5327105949755125e-07, 'epoch': 0.9} 90%|█████████ | 30895/34278 [33:50:06<3:36:21, 3.84s/it] 90%|█████████ | 30896/34278 [33:50:09<3:16:11, 3.48s/it] {'loss': 0.119, 'grad_norm': 0.9527635326361865, 'learning_rate': 2.5312262613673476e-07, 'epoch': 0.9} 90%|█████████ | 30896/34278 [33:50:09<3:16:11, 3.48s/it] 90%|█████████ | 30897/34278 [33:50:15<3:58:07, 4.23s/it] {'loss': 0.1225, 'grad_norm': 0.7507188860357058, 'learning_rate': 2.5297423515490584e-07, 'epoch': 0.9} 90%|█████████ | 30897/34278 [33:50:15<3:58:07, 4.23s/it] 90%|█████████ | 30898/34278 [33:50:19<3:55:57, 4.19s/it] {'loss': 0.1153, 'grad_norm': 0.6477173770002925, 'learning_rate': 2.5282588655338947e-07, 'epoch': 0.9} 90%|█████████ | 30898/34278 [33:50:19<3:55:57, 4.19s/it] 90%|█████████ | 30899/34278 [33:50:24<4:17:10, 4.57s/it] {'loss': 0.1176, 'grad_norm': 0.7759081563079759, 'learning_rate': 2.526775803335113e-07, 'epoch': 0.9} 90%|█████████ | 30899/34278 [33:50:24<4:17:10, 4.57s/it] 90%|█████████ | 30900/34278 [33:50:27<3:46:32, 4.02s/it] {'loss': 0.1, 'grad_norm': 0.7528650119604344, 'learning_rate': 2.525293164965936e-07, 'epoch': 0.9} 90%|█████████ | 30900/34278 [33:50:27<3:46:32, 4.02s/it] 90%|█████████ | 30901/34278 [33:50:30<3:29:41, 3.73s/it] {'loss': 0.1159, 'grad_norm': 0.8382682887800795, 'learning_rate': 2.523810950439615e-07, 'epoch': 0.9} 90%|█████████ | 30901/34278 [33:50:30<3:29:41, 3.73s/it] 90%|█████████ | 30902/34278 
[33:50:33<3:13:15, 3.43s/it] {'loss': 0.1206, 'grad_norm': 0.9790493988981085, 'learning_rate': 2.5223291597693764e-07, 'epoch': 0.9} 90%|█████████ | 30902/34278 [33:50:33<3:13:15, 3.43s/it] 90%|█████████ | 30903/34278 [33:50:36<3:12:23, 3.42s/it] {'loss': 0.0947, 'grad_norm': 0.9686947625084419, 'learning_rate': 2.520847792968445e-07, 'epoch': 0.9} 90%|█████████ | 30903/34278 [33:50:36<3:12:23, 3.42s/it] 90%|█████████ | 30904/34278 [33:50:42<3:54:57, 4.18s/it] {'loss': 0.1195, 'grad_norm': 1.0043684611504828, 'learning_rate': 2.519366850050048e-07, 'epoch': 0.9} 90%|█████████ | 30904/34278 [33:50:42<3:54:57, 4.18s/it] 90%|█████████ | 30905/34278 [33:50:46<3:48:21, 4.06s/it] {'loss': 0.108, 'grad_norm': 1.1410283074329841, 'learning_rate': 2.5178863310274136e-07, 'epoch': 0.9} 90%|█████████ | 30905/34278 [33:50:46<3:48:21, 4.06s/it] 90%|█████████ | 30906/34278 [33:50:49<3:21:36, 3.59s/it] {'loss': 0.0961, 'grad_norm': 0.822753373595886, 'learning_rate': 2.51640623591376e-07, 'epoch': 0.9} 90%|█████████ | 30906/34278 [33:50:49<3:21:36, 3.59s/it] 90%|█████████ | 30907/34278 [33:50:52<3:11:07, 3.40s/it] {'loss': 0.0935, 'grad_norm': 0.7734796717613033, 'learning_rate': 2.5149265647222863e-07, 'epoch': 0.9} 90%|█████████ | 30907/34278 [33:50:52<3:11:07, 3.40s/it] 90%|█████████ | 30908/34278 [33:50:58<4:02:20, 4.31s/it] {'loss': 0.1136, 'grad_norm': 0.8922035087707321, 'learning_rate': 2.513447317466222e-07, 'epoch': 0.9} 90%|█████████ | 30908/34278 [33:50:58<4:02:20, 4.31s/it] 90%|█████████ | 30909/34278 [33:51:01<3:36:51, 3.86s/it] {'loss': 0.1387, 'grad_norm': 0.9762875819094442, 'learning_rate': 2.511968494158751e-07, 'epoch': 0.9} 90%|█████████ | 30909/34278 [33:51:01<3:36:51, 3.86s/it] 90%|█████████ | 30910/34278 [33:51:07<4:11:02, 4.47s/it] {'loss': 0.0967, 'grad_norm': 0.7317975276067623, 'learning_rate': 2.510490094813101e-07, 'epoch': 0.9} 90%|█████████ | 30910/34278 [33:51:07<4:11:02, 4.47s/it] 90%|█████████ | 30911/34278 [33:51:10<3:54:02, 4.17s/it] {'loss': 
0.1044, 'grad_norm': 0.7514206912013227, 'learning_rate': 2.5090121194424554e-07, 'epoch': 0.9} 90%|█████████ | 30911/34278 [33:51:10<3:54:02, 4.17s/it] 90%|█████████ | 30912/34278 [33:51:14<3:41:05, 3.94s/it] {'loss': 0.102, 'grad_norm': 1.0323943227713905, 'learning_rate': 2.5075345680600107e-07, 'epoch': 0.9} 90%|█████████ | 30912/34278 [33:51:14<3:41:05, 3.94s/it] 90%|█████████ | 30913/34278 [33:51:17<3:28:56, 3.73s/it] {'loss': 0.1103, 'grad_norm': 0.8356574035684364, 'learning_rate': 2.5060574406789664e-07, 'epoch': 0.9} 90%|█████████ | 30913/34278 [33:51:17<3:28:56, 3.73s/it] 90%|█████████ | 30914/34278 [33:51:20<3:17:50, 3.53s/it] {'loss': 0.0862, 'grad_norm': 0.661884839685752, 'learning_rate': 2.504580737312495e-07, 'epoch': 0.9} 90%|█████████ | 30914/34278 [33:51:20<3:17:50, 3.53s/it] 90%|█████████ | 30915/34278 [33:51:25<3:53:22, 4.16s/it] {'loss': 0.1014, 'grad_norm': 0.7698925091478908, 'learning_rate': 2.503104457973787e-07, 'epoch': 0.9} 90%|█████████ | 30915/34278 [33:51:25<3:53:22, 4.16s/it] 90%|█████████ | 30916/34278 [33:51:29<3:38:01, 3.89s/it] {'loss': 0.1072, 'grad_norm': 1.1653480649798673, 'learning_rate': 2.501628602676037e-07, 'epoch': 0.9} 90%|█████████ | 30916/34278 [33:51:29<3:38:01, 3.89s/it] 90%|█████████ | 30917/34278 [33:51:32<3:30:39, 3.76s/it] {'loss': 0.1254, 'grad_norm': 0.8357527387352833, 'learning_rate': 2.500153171432396e-07, 'epoch': 0.9} 90%|█████████ | 30917/34278 [33:51:32<3:30:39, 3.76s/it] 90%|█████████ | 30918/34278 [33:51:36<3:28:50, 3.73s/it] {'loss': 0.1405, 'grad_norm': 0.7601997967500363, 'learning_rate': 2.498678164256052e-07, 'epoch': 0.9} 90%|█████████ | 30918/34278 [33:51:36<3:28:50, 3.73s/it] 90%|█████████ | 30919/34278 [33:51:39<3:20:18, 3.58s/it] {'loss': 0.1058, 'grad_norm': 0.9844474328297425, 'learning_rate': 2.497203581160174e-07, 'epoch': 0.9} 90%|█████████ | 30919/34278 [33:51:39<3:20:18, 3.58s/it] 90%|█████████ | 30920/34278 [33:51:42<3:11:49, 3.43s/it] {'loss': 0.1323, 'grad_norm': 
0.9306600426511832, 'learning_rate': 2.4957294221579166e-07, 'epoch': 0.9} 90%|█████████ | 30920/34278 [33:51:42<3:11:49, 3.43s/it] 90%|█████████ | 30921/34278 [33:51:46<3:17:16, 3.53s/it] {'loss': 0.1107, 'grad_norm': 0.811693818410758, 'learning_rate': 2.4942556872624477e-07, 'epoch': 0.9} 90%|█████████ | 30921/34278 [33:51:46<3:17:16, 3.53s/it] 90%|█████████ | 30922/34278 [33:51:52<4:00:14, 4.30s/it] {'loss': 0.1039, 'grad_norm': 0.813405748983144, 'learning_rate': 2.4927823764869296e-07, 'epoch': 0.9} 90%|█████████ | 30922/34278 [33:51:52<4:00:14, 4.30s/it] 90%|█████████ | 30923/34278 [33:51:55<3:40:01, 3.93s/it] {'loss': 0.121, 'grad_norm': 0.7775052672368832, 'learning_rate': 2.4913094898445066e-07, 'epoch': 0.9} 90%|█████████ | 30923/34278 [33:51:55<3:40:01, 3.93s/it] 90%|█████████ | 30924/34278 [33:52:00<3:51:05, 4.13s/it] {'loss': 0.1124, 'grad_norm': 1.168129538557138, 'learning_rate': 2.489837027348324e-07, 'epoch': 0.9} 90%|█████████ | 30924/34278 [33:52:00<3:51:05, 4.13s/it] 90%|█████████ | 30925/34278 [33:52:03<3:33:26, 3.82s/it] {'loss': 0.1146, 'grad_norm': 0.8427444648072628, 'learning_rate': 2.488364989011544e-07, 'epoch': 0.9} 90%|█████████ | 30925/34278 [33:52:03<3:33:26, 3.82s/it] 90%|█████████ | 30926/34278 [33:52:06<3:27:43, 3.72s/it] {'loss': 0.1171, 'grad_norm': 0.7314329053327707, 'learning_rate': 2.486893374847299e-07, 'epoch': 0.9} 90%|█████████ | 30926/34278 [33:52:06<3:27:43, 3.72s/it] 90%|█████████ | 30927/34278 [33:52:10<3:36:37, 3.88s/it] {'loss': 0.1134, 'grad_norm': 0.9614666321424571, 'learning_rate': 2.4854221848687245e-07, 'epoch': 0.9} 90%|█████████ | 30927/34278 [33:52:10<3:36:37, 3.88s/it] 90%|█████████ | 30928/34278 [33:52:13<3:16:44, 3.52s/it] {'loss': 0.0922, 'grad_norm': 0.8172701848878543, 'learning_rate': 2.4839514190889534e-07, 'epoch': 0.9} 90%|█████████ | 30928/34278 [33:52:13<3:16:44, 3.52s/it] 90%|█████████ | 30929/34278 [33:52:19<3:49:50, 4.12s/it] {'loss': 0.1075, 'grad_norm': 0.8324606832777022, 
{'loss': 0.1299, 'grad_norm': 0.7730009621892097, 'learning_rate': 2.481011160178365e-07, 'epoch': 0.9} 90%|█████████ | 30930/34278 [33:52:24<4:11:33, 4.51s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
[per-step progress records for steps 30931–31106 collapsed: loss fluctuates in the 0.09–0.14 range, grad_norm in the 0.61–1.47 range, the learning rate decays smoothly from 2.48e-07 to 2.23e-07, the epoch counter advances from 0.90 to 0.91 at step 31022, and the same UserWarning recurs intermittently]
{'loss': 0.0936, 'grad_norm': 0.7612287769685566, 'learning_rate': 2.2275307859157546e-07, 'epoch': 0.91} 91%|█████████ | 31107/34278 [34:03:44<3:26:11, 3.90s/it]
| 31108/34278 [34:03:48<3:20:40, 3.80s/it] {'loss': 0.12, 'grad_norm': 0.9977039789906293, 'learning_rate': 2.2261365861335372e-07, 'epoch': 0.91} 91%|█████████ | 31108/34278 [34:03:48<3:20:40, 3.80s/it] 91%|█████████ | 31109/34278 [34:03:51<3:08:24, 3.57s/it] {'loss': 0.1214, 'grad_norm': 0.954415893967994, 'learning_rate': 2.2247428128649717e-07, 'epoch': 0.91} 91%|█████████ | 31109/34278 [34:03:51<3:08:24, 3.57s/it] 91%|█████████ | 31110/34278 [34:03:55<3:23:10, 3.85s/it] {'loss': 0.0997, 'grad_norm': 0.7368586790932046, 'learning_rate': 2.2233494661225042e-07, 'epoch': 0.91} 91%|█████████ | 31110/34278 [34:03:55<3:23:10, 3.85s/it] 91%|█████████ | 31111/34278 [34:03:58<3:11:42, 3.63s/it] {'loss': 0.0907, 'grad_norm': 0.8528026718875293, 'learning_rate': 2.2219565459185578e-07, 'epoch': 0.91} 91%|█████████ | 31111/34278 [34:03:58<3:11:42, 3.63s/it] 91%|█████████ | 31112/34278 [34:04:01<2:56:57, 3.35s/it] {'loss': 0.1451, 'grad_norm': 0.7457147550208287, 'learning_rate': 2.2205640522655725e-07, 'epoch': 0.91} 91%|█████████ | 31112/34278 [34:04:01<2:56:57, 3.35s/it] 91%|█████████ | 31113/34278 [34:04:05<2:58:23, 3.38s/it] {'loss': 0.1118, 'grad_norm': 0.9140024106994731, 'learning_rate': 2.2191719851759996e-07, 'epoch': 0.91} 91%|█████████ | 31113/34278 [34:04:05<2:58:23, 3.38s/it] 91%|█████████ | 31114/34278 [34:04:08<2:51:54, 3.26s/it] {'loss': 0.0968, 'grad_norm': 0.9136693296214689, 'learning_rate': 2.2177803446622404e-07, 'epoch': 0.91} 91%|█████████ | 31114/34278 [34:04:08<2:51:54, 3.26s/it] 91%|█████████ | 31115/34278 [34:04:11<2:58:46, 3.39s/it] {'loss': 0.101, 'grad_norm': 0.7244222394257598, 'learning_rate': 2.2163891307367457e-07, 'epoch': 0.91} 91%|█████████ | 31115/34278 [34:04:11<2:58:46, 3.39s/it] 91%|█████████ | 31116/34278 [34:04:14<2:51:30, 3.25s/it] {'loss': 0.129, 'grad_norm': 0.8501133370857192, 'learning_rate': 2.214998343411917e-07, 'epoch': 0.91} 91%|█████████ | 31116/34278 [34:04:14<2:51:30, 3.25s/it] 91%|█████████ | 31117/34278 
[34:04:17<2:46:36, 3.16s/it] {'loss': 0.1176, 'grad_norm': 0.8108296197082614, 'learning_rate': 2.2136079827001666e-07, 'epoch': 0.91} 91%|█████████ | 31117/34278 [34:04:17<2:46:36, 3.16s/it] 91%|█████████ | 31118/34278 [34:04:23<3:31:12, 4.01s/it] {'loss': 0.1272, 'grad_norm': 0.7676470133077243, 'learning_rate': 2.2122180486139232e-07, 'epoch': 0.91} 91%|█████████ | 31118/34278 [34:04:23<3:31:12, 4.01s/it] 91%|█████████ | 31119/34278 [34:04:26<3:13:56, 3.68s/it] {'loss': 0.1052, 'grad_norm': 0.9231091928417677, 'learning_rate': 2.2108285411655938e-07, 'epoch': 0.91} 91%|█████████ | 31119/34278 [34:04:26<3:13:56, 3.68s/it] 91%|█████████ | 31120/34278 [34:04:29<3:02:05, 3.46s/it] {'loss': 0.1269, 'grad_norm': 0.9125915237622088, 'learning_rate': 2.209439460367574e-07, 'epoch': 0.91} 91%|█████████ | 31120/34278 [34:04:29<3:02:05, 3.46s/it] 91%|█████████ | 31121/34278 [34:04:32<2:58:42, 3.40s/it] {'loss': 0.1045, 'grad_norm': 1.3371222585899003, 'learning_rate': 2.2080508062322704e-07, 'epoch': 0.91} 91%|█████████ | 31121/34278 [34:04:32<2:58:42, 3.40s/it] 91%|█████████ | 31122/34278 [34:04:37<3:15:41, 3.72s/it] {'loss': 0.1254, 'grad_norm': 0.7902946209997209, 'learning_rate': 2.2066625787720842e-07, 'epoch': 0.91} 91%|█████████ | 31122/34278 [34:04:37<3:15:41, 3.72s/it] 91%|█████████ | 31123/34278 [34:04:40<3:02:08, 3.46s/it] {'loss': 0.1125, 'grad_norm': 0.6485061366155017, 'learning_rate': 2.2052747779994055e-07, 'epoch': 0.91} 91%|█████████ | 31123/34278 [34:04:40<3:02:08, 3.46s/it] 91%|█████████ | 31124/34278 [34:04:42<2:51:57, 3.27s/it] {'loss': 0.1224, 'grad_norm': 1.104568038545473, 'learning_rate': 2.2038874039266077e-07, 'epoch': 0.91} 91%|█████████ | 31124/34278 [34:04:42<2:51:57, 3.27s/it] 91%|█████████ | 31125/34278 [34:04:45<2:48:16, 3.20s/it] {'loss': 0.1066, 'grad_norm': 0.7622552739176846, 'learning_rate': 2.202500456566109e-07, 'epoch': 0.91} 91%|█████████ | 31125/34278 [34:04:45<2:48:16, 3.20s/it] 91%|█████████ | 31126/34278 [34:04:49<2:56:50, 
3.37s/it] {'loss': 0.1074, 'grad_norm': 0.7422091194608011, 'learning_rate': 2.201113935930277e-07, 'epoch': 0.91} 91%|█████████ | 31126/34278 [34:04:49<2:56:50, 3.37s/it] 91%|█████████ | 31127/34278 [34:04:52<2:47:18, 3.19s/it] {'loss': 0.1117, 'grad_norm': 0.8907907529926969, 'learning_rate': 2.1997278420314848e-07, 'epoch': 0.91} 91%|█████████ | 31127/34278 [34:04:52<2:47:18, 3.19s/it] 91%|█████████ | 31128/34278 [34:04:57<3:14:09, 3.70s/it] {'loss': 0.1036, 'grad_norm': 0.9067372482328319, 'learning_rate': 2.198342174882112e-07, 'epoch': 0.91} 91%|█████████ | 31128/34278 [34:04:57<3:14:09, 3.70s/it] 91%|█████████ | 31129/34278 [34:05:03<3:51:32, 4.41s/it] {'loss': 0.1138, 'grad_norm': 0.7713778168392936, 'learning_rate': 2.196956934494532e-07, 'epoch': 0.91} 91%|█████████ | 31129/34278 [34:05:03<3:51:32, 4.41s/it] 91%|█████████ | 31130/34278 [34:05:09<4:09:50, 4.76s/it] {'loss': 0.1027, 'grad_norm': 0.6670136652004125, 'learning_rate': 2.1955721208811066e-07, 'epoch': 0.91} 91%|█████████ | 31130/34278 [34:05:09<4:09:50, 4.76s/it] 91%|█████████ | 31131/34278 [34:05:12<3:45:15, 4.29s/it] {'loss': 0.1326, 'grad_norm': 0.8619621585420121, 'learning_rate': 2.1941877340541984e-07, 'epoch': 0.91} 91%|█████████ | 31131/34278 [34:05:12<3:45:15, 4.29s/it] 91%|█████████ | 31132/34278 [34:05:15<3:27:20, 3.95s/it] {'loss': 0.1058, 'grad_norm': 0.8390283872276735, 'learning_rate': 2.1928037740261753e-07, 'epoch': 0.91} 91%|█████████ | 31132/34278 [34:05:15<3:27:20, 3.95s/it] 91%|█████████ | 31133/34278 [34:05:18<3:20:15, 3.82s/it] {'loss': 0.105, 'grad_norm': 0.7881674467248587, 'learning_rate': 2.1914202408093887e-07, 'epoch': 0.91} 91%|█████████ | 31133/34278 [34:05:18<3:20:15, 3.82s/it] 91%|█████████ | 31134/34278 [34:05:22<3:17:35, 3.77s/it] {'loss': 0.0975, 'grad_norm': 0.7892069065864602, 'learning_rate': 2.1900371344161787e-07, 'epoch': 0.91} 91%|█████████ | 31134/34278 [34:05:22<3:17:35, 3.77s/it] 91%|█████████ | 31135/34278 [34:05:25<3:04:31, 3.52s/it] {'loss': 
0.1178, 'grad_norm': 0.7709085175835558, 'learning_rate': 2.1886544548589184e-07, 'epoch': 0.91} 91%|█████████ | 31135/34278 [34:05:25<3:04:31, 3.52s/it] 91%|█████████ | 31136/34278 [34:05:31<3:46:50, 4.33s/it] {'loss': 0.1023, 'grad_norm': 0.6235509734500673, 'learning_rate': 2.187272202149926e-07, 'epoch': 0.91} 91%|█████████ | 31136/34278 [34:05:31<3:46:50, 4.33s/it] 91%|█████████ | 31137/34278 [34:05:34<3:25:24, 3.92s/it] {'loss': 0.1144, 'grad_norm': 0.6889896345070166, 'learning_rate': 2.1858903763015583e-07, 'epoch': 0.91} 91%|█████████ | 31137/34278 [34:05:34<3:25:24, 3.92s/it] 91%|█████████ | 31138/34278 [34:05:37<3:09:45, 3.63s/it] {'loss': 0.1128, 'grad_norm': 0.8440367631430685, 'learning_rate': 2.184508977326144e-07, 'epoch': 0.91} 91%|█████████ | 31138/34278 [34:05:37<3:09:45, 3.63s/it] 91%|█████████ | 31139/34278 [34:05:40<2:58:33, 3.41s/it] {'loss': 0.1042, 'grad_norm': 0.8555631303594844, 'learning_rate': 2.1831280052360238e-07, 'epoch': 0.91} 91%|█████████ | 31139/34278 [34:05:40<2:58:33, 3.41s/it] 91%|█████████ | 31140/34278 [34:05:43<2:58:30, 3.41s/it] {'loss': 0.1311, 'grad_norm': 0.9960188899937072, 'learning_rate': 2.1817474600435262e-07, 'epoch': 0.91} 91%|█████████ | 31140/34278 [34:05:43<2:58:30, 3.41s/it] 91%|█████████ | 31141/34278 [34:05:46<2:51:15, 3.28s/it] {'loss': 0.1087, 'grad_norm': 0.7604910924176862, 'learning_rate': 2.1803673417609584e-07, 'epoch': 0.91} 91%|█████████ | 31141/34278 [34:05:46<2:51:15, 3.28s/it] 91%|█████████ | 31142/34278 [34:05:53<3:35:59, 4.13s/it] {'loss': 0.1196, 'grad_norm': 0.8788464053917843, 'learning_rate': 2.1789876504006601e-07, 'epoch': 0.91} 91%|█████████ | 31142/34278 [34:05:53<3:35:59, 4.13s/it] 91%|█████████ | 31143/34278 [34:05:58<3:51:53, 4.44s/it] {'loss': 0.0924, 'grad_norm': 0.8429605800895439, 'learning_rate': 2.1776083859749498e-07, 'epoch': 0.91} 91%|█████████ | 31143/34278 [34:05:58<3:51:53, 4.44s/it] 91%|█████████ | 31144/34278 [34:06:01<3:30:10, 4.02s/it] {'loss': 0.1232, 'grad_norm': 
0.9128799379517005, 'learning_rate': 2.176229548496134e-07, 'epoch': 0.91} 91%|█████████ | 31144/34278 [34:06:01<3:30:10, 4.02s/it] 91%|█████████ | 31145/34278 [34:06:05<3:38:46, 4.19s/it] {'loss': 0.1039, 'grad_norm': 0.9937007117197812, 'learning_rate': 2.1748511379765247e-07, 'epoch': 0.91} 91%|█████████ | 31145/34278 [34:06:05<3:38:46, 4.19s/it] 91%|█████████ | 31146/34278 [34:06:08<3:19:32, 3.82s/it] {'loss': 0.1173, 'grad_norm': 1.179439316220124, 'learning_rate': 2.1734731544284293e-07, 'epoch': 0.91} 91%|█████████ | 31146/34278 [34:06:08<3:19:32, 3.82s/it] 91%|█████████ | 31147/34278 [34:06:12<3:16:04, 3.76s/it] {'loss': 0.1144, 'grad_norm': 0.8566170913287037, 'learning_rate': 2.1720955978641433e-07, 'epoch': 0.91} 91%|█████████ | 31147/34278 [34:06:12<3:16:04, 3.76s/it] 91%|█████████ | 31148/34278 [34:06:15<3:01:10, 3.47s/it] {'loss': 0.1093, 'grad_norm': 0.8013007543558315, 'learning_rate': 2.170718468295968e-07, 'epoch': 0.91} 91%|█████████ | 31148/34278 [34:06:15<3:01:10, 3.47s/it] 91%|█████████ | 31149/34278 [34:06:18<2:56:12, 3.38s/it] {'loss': 0.1153, 'grad_norm': 1.1102425067230988, 'learning_rate': 2.1693417657362048e-07, 'epoch': 0.91} 91%|█████████ | 31149/34278 [34:06:18<2:56:12, 3.38s/it] 91%|█████████ | 31150/34278 [34:06:24<3:36:27, 4.15s/it] {'loss': 0.1024, 'grad_norm': 0.7486575885918128, 'learning_rate': 2.1679654901971436e-07, 'epoch': 0.91} 91%|█████████ | 31150/34278 [34:06:24<3:36:27, 4.15s/it] 91%|█████████ | 31151/34278 [34:06:27<3:18:55, 3.82s/it] {'loss': 0.1171, 'grad_norm': 0.8329626115948966, 'learning_rate': 2.1665896416910638e-07, 'epoch': 0.91} 91%|█████████ | 31151/34278 [34:06:27<3:18:55, 3.82s/it] 91%|█████████ | 31152/34278 [34:06:30<3:15:15, 3.75s/it] {'loss': 0.1066, 'grad_norm': 0.7603322985587933, 'learning_rate': 2.165214220230255e-07, 'epoch': 0.91} 91%|█████████ | 31152/34278 [34:06:30<3:15:15, 3.75s/it] 91%|█████████ | 31153/34278 [34:06:33<3:01:02, 3.48s/it] {'loss': 0.0884, 'grad_norm': 0.7644941059406194, 
'learning_rate': 2.163839225826997e-07, 'epoch': 0.91} 91%|█████████ | 31153/34278 [34:06:33<3:01:02, 3.48s/it] 91%|█████████ | 31154/34278 [34:06:39<3:29:16, 4.02s/it] {'loss': 0.1062, 'grad_norm': 0.8187194717508908, 'learning_rate': 2.1624646584935515e-07, 'epoch': 0.91} 91%|█████████ | 31154/34278 [34:06:39<3:29:16, 4.02s/it] 91%|█████████ | 31155/34278 [34:06:42<3:19:56, 3.84s/it] {'loss': 0.1072, 'grad_norm': 0.8938377716105418, 'learning_rate': 2.161090518242215e-07, 'epoch': 0.91} 91%|█████████ | 31155/34278 [34:06:42<3:19:56, 3.84s/it] 91%|█████████ | 31156/34278 [34:06:45<3:13:17, 3.71s/it] {'loss': 0.1165, 'grad_norm': 0.7978155172171842, 'learning_rate': 2.159716805085238e-07, 'epoch': 0.91} 91%|█████████ | 31156/34278 [34:06:45<3:13:17, 3.71s/it] 91%|█████████ | 31157/34278 [34:06:48<2:58:48, 3.44s/it] {'loss': 0.096, 'grad_norm': 0.7328847864974878, 'learning_rate': 2.1583435190348833e-07, 'epoch': 0.91} 91%|█████████ | 31157/34278 [34:06:48<2:58:48, 3.44s/it] 91%|█████████ | 31158/34278 [34:06:51<2:54:22, 3.35s/it] {'loss': 0.1249, 'grad_norm': 0.6593593177615167, 'learning_rate': 2.1569706601034246e-07, 'epoch': 0.91} 91%|█████████ | 31158/34278 [34:06:51<2:54:22, 3.35s/it] 91%|█████████ | 31159/34278 [34:06:54<2:50:58, 3.29s/it] {'loss': 0.1173, 'grad_norm': 0.855227361992073, 'learning_rate': 2.155598228303113e-07, 'epoch': 0.91} 91%|█████████ | 31159/34278 [34:06:55<2:50:58, 3.29s/it] 91%|█████████ | 31160/34278 [34:06:57<2:43:05, 3.14s/it] {'loss': 0.0982, 'grad_norm': 0.813050944243014, 'learning_rate': 2.1542262236461887e-07, 'epoch': 0.91} 91%|█████████ | 31160/34278 [34:06:57<2:43:05, 3.14s/it] 91%|█████████ | 31161/34278 [34:07:01<2:48:32, 3.24s/it] {'loss': 0.098, 'grad_norm': 0.9921805848528199, 'learning_rate': 2.152854646144914e-07, 'epoch': 0.91} 91%|█████████ | 31161/34278 [34:07:01<2:48:32, 3.24s/it] 91%|█████████ | 31162/34278 [34:07:04<2:42:51, 3.14s/it] {'loss': 0.1068, 'grad_norm': 0.7507611875558359, 'learning_rate': 
2.151483495811535e-07, 'epoch': 0.91} 91%|█████████ | 31162/34278 [34:07:04<2:42:51, 3.14s/it] 91%|█████████ | 31163/34278 [34:07:07<2:45:30, 3.19s/it] {'loss': 0.1313, 'grad_norm': 0.822681609407171, 'learning_rate': 2.1501127726582916e-07, 'epoch': 0.91} 91%|█████████ | 31163/34278 [34:07:07<2:45:30, 3.19s/it] 91%|█████████ | 31164/34278 [34:07:11<2:52:24, 3.32s/it] {'loss': 0.1231, 'grad_norm': 0.976159035189303, 'learning_rate': 2.148742476697413e-07, 'epoch': 0.91} 91%|█████████ | 31164/34278 [34:07:11<2:52:24, 3.32s/it] 91%|█████████ | 31165/34278 [34:07:17<3:36:24, 4.17s/it] {'loss': 0.1147, 'grad_norm': 0.7753825272428042, 'learning_rate': 2.1473726079411394e-07, 'epoch': 0.91} 91%|█████████ | 31165/34278 [34:07:17<3:36:24, 4.17s/it] 91%|█████████ | 31166/34278 [34:07:20<3:20:01, 3.86s/it] {'loss': 0.1184, 'grad_norm': 1.1635609513305534, 'learning_rate': 2.1460031664017002e-07, 'epoch': 0.91} 91%|█████████ | 31166/34278 [34:07:20<3:20:01, 3.86s/it] 91%|█████████ | 31167/34278 [34:07:23<3:02:30, 3.52s/it] {'loss': 0.1118, 'grad_norm': 0.9433089715287637, 'learning_rate': 2.1446341520913238e-07, 'epoch': 0.91} 91%|█████████ | 31167/34278 [34:07:23<3:02:30, 3.52s/it] 91%|█████████ | 31168/34278 [34:07:26<3:07:08, 3.61s/it] {'loss': 0.1106, 'grad_norm': 0.6221398742753601, 'learning_rate': 2.1432655650222234e-07, 'epoch': 0.91} 91%|█████████ | 31168/34278 [34:07:26<3:07:08, 3.61s/it] 91%|█████████ | 31169/34278 [34:07:30<3:12:35, 3.72s/it] {'loss': 0.1407, 'grad_norm': 0.811485942160595, 'learning_rate': 2.1418974052066276e-07, 'epoch': 0.91} 91%|█████████ | 31169/34278 [34:07:30<3:12:35, 3.72s/it] 91%|█████████ | 31170/34278 [34:07:36<3:39:21, 4.23s/it] {'loss': 0.125, 'grad_norm': 0.8165279238446315, 'learning_rate': 2.1405296726567493e-07, 'epoch': 0.91} 91%|█████████ | 31170/34278 [34:07:36<3:39:21, 4.23s/it] 91%|█████████ | 31171/34278 [34:07:39<3:25:51, 3.98s/it] {'loss': 0.1085, 'grad_norm': 0.9310940873584964, 'learning_rate': 2.139162367384784e-07, 
'epoch': 0.91} 91%|█████████ | 31171/34278 [34:07:39<3:25:51, 3.98s/it] 91%|█████████ | 31172/34278 [34:07:43<3:21:51, 3.90s/it] {'loss': 0.0968, 'grad_norm': 0.6928230019094935, 'learning_rate': 2.1377954894029662e-07, 'epoch': 0.91} 91%|█████████ | 31172/34278 [34:07:43<3:21:51, 3.90s/it] 91%|█████████ | 31173/34278 [34:07:46<3:04:04, 3.56s/it] {'loss': 0.1196, 'grad_norm': 0.7918110263515088, 'learning_rate': 2.1364290387234864e-07, 'epoch': 0.91} 91%|█████████ | 31173/34278 [34:07:46<3:04:04, 3.56s/it] 91%|█████████ | 31174/34278 [34:07:49<2:55:05, 3.38s/it] {'loss': 0.1077, 'grad_norm': 0.8102359101197163, 'learning_rate': 2.135063015358535e-07, 'epoch': 0.91} 91%|█████████ | 31174/34278 [34:07:49<2:55:05, 3.38s/it] 91%|█████████ | 31175/34278 [34:07:51<2:41:35, 3.12s/it] {'loss': 0.0799, 'grad_norm': 0.9933870456354755, 'learning_rate': 2.1336974193203185e-07, 'epoch': 0.91} 91%|█████████ | 31175/34278 [34:07:51<2:41:35, 3.12s/it] 91%|█████████ | 31176/34278 [34:07:54<2:42:23, 3.14s/it] {'loss': 0.1067, 'grad_norm': 0.768122671744198, 'learning_rate': 2.132332250621022e-07, 'epoch': 0.91} 91%|█████████ | 31176/34278 [34:07:54<2:42:23, 3.14s/it] 91%|█████████ | 31177/34278 [34:07:57<2:39:17, 3.08s/it] {'loss': 0.0991, 'grad_norm': 0.8469302239599373, 'learning_rate': 2.1309675092728353e-07, 'epoch': 0.91} 91%|█████████ | 31177/34278 [34:07:57<2:39:17, 3.08s/it] 91%|█████████ | 31178/34278 [34:08:03<3:26:29, 4.00s/it] {'loss': 0.1015, 'grad_norm': 0.8711834960374405, 'learning_rate': 2.1296031952879437e-07, 'epoch': 0.91} 91%|█████████ | 31178/34278 [34:08:03<3:26:29, 4.00s/it] 91%|█████████ | 31179/34278 [34:08:09<3:54:40, 4.54s/it] {'loss': 0.108, 'grad_norm': 0.8359144380449869, 'learning_rate': 2.1282393086785313e-07, 'epoch': 0.91} 91%|█████████ | 31179/34278 [34:08:09<3:54:40, 4.54s/it] 91%|█████████ | 31180/34278 [34:08:15<4:16:26, 4.97s/it] {'loss': 0.13, 'grad_norm': 0.771065559095617, 'learning_rate': 2.1268758494567666e-07, 'epoch': 0.91} 
91%|█████████ | 31180/34278 [34:08:15<4:16:26, 4.97s/it] 91%|█████████ | 31181/34278 [34:08:18<3:48:20, 4.42s/it] {'loss': 0.1193, 'grad_norm': 0.8667820910889501, 'learning_rate': 2.1255128176348283e-07, 'epoch': 0.91} 91%|█████████ | 31181/34278 [34:08:18<3:48:20, 4.42s/it] 91%|█████████ | 31182/34278 [34:08:22<3:29:40, 4.06s/it] {'loss': 0.1084, 'grad_norm': 0.7451349820479439, 'learning_rate': 2.1241502132248848e-07, 'epoch': 0.91} 91%|█████████ | 31182/34278 [34:08:22<3:29:40, 4.06s/it] 91%|█████████ | 31183/34278 [34:08:25<3:27:08, 4.02s/it] {'loss': 0.1314, 'grad_norm': 0.7770633929395803, 'learning_rate': 2.122788036239093e-07, 'epoch': 0.91} 91%|█████████ | 31183/34278 [34:08:25<3:27:08, 4.02s/it] 91%|█████████ | 31184/34278 [34:08:29<3:15:26, 3.79s/it] {'loss': 0.0927, 'grad_norm': 0.8238839116750369, 'learning_rate': 2.1214262866896208e-07, 'epoch': 0.91} 91%|█████████ | 31184/34278 [34:08:29<3:15:26, 3.79s/it] 91%|█████████ | 31185/34278 [34:08:31<2:55:36, 3.41s/it] {'loss': 0.1215, 'grad_norm': 0.8207904619219264, 'learning_rate': 2.1200649645886308e-07, 'epoch': 0.91} 91%|█████████ | 31185/34278 [34:08:31<2:55:36, 3.41s/it] 91%|█████████ | 31186/34278 [34:08:36<3:11:43, 3.72s/it] {'loss': 0.1098, 'grad_norm': 0.905414138949186, 'learning_rate': 2.1187040699482685e-07, 'epoch': 0.91} 91%|█████████ | 31186/34278 [34:08:36<3:11:43, 3.72s/it] 91%|█████████ | 31187/34278 [34:08:42<3:54:05, 4.54s/it] {'loss': 0.1078, 'grad_norm': 0.8058840126251363, 'learning_rate': 2.117343602780686e-07, 'epoch': 0.91} 91%|█████████ | 31187/34278 [34:08:42<3:54:05, 4.54s/it] 91%|█████████ | 31188/34278 [34:08:47<4:05:04, 4.76s/it] {'loss': 0.1306, 'grad_norm': 0.8401359675866493, 'learning_rate': 2.1159835630980286e-07, 'epoch': 0.91} 91%|█████████ | 31188/34278 [34:08:47<4:05:04, 4.76s/it] 91%|█████████ | 31189/34278 [34:08:50<3:34:41, 4.17s/it] {'loss': 0.097, 'grad_norm': 0.8153818141633182, 'learning_rate': 2.1146239509124365e-07, 'epoch': 0.91} 91%|█████████ | 
31189/34278 [34:08:50<3:34:41, 4.17s/it] 91%|█████████ | 31190/34278 [34:08:53<3:14:56, 3.79s/it] {'loss': 0.0835, 'grad_norm': 0.9347981040478424, 'learning_rate': 2.1132647662360562e-07, 'epoch': 0.91} 91%|█████████ | 31190/34278 [34:08:53<3:14:56, 3.79s/it] 91%|█████████ | 31191/34278 [34:08:56<3:06:26, 3.62s/it] {'loss': 0.102, 'grad_norm': 0.88370779193522, 'learning_rate': 2.1119060090810106e-07, 'epoch': 0.91} 91%|█████████ | 31191/34278 [34:08:56<3:06:26, 3.62s/it] 91%|█████████ | 31192/34278 [34:09:00<3:00:31, 3.51s/it] {'loss': 0.1082, 'grad_norm': 0.9322960137650994, 'learning_rate': 2.110547679459446e-07, 'epoch': 0.91} 91%|█████████ | 31192/34278 [34:09:00<3:00:31, 3.51s/it] 91%|█████████ | 31193/34278 [34:09:05<3:34:14, 4.17s/it] {'loss': 0.0998, 'grad_norm': 0.7839507296136591, 'learning_rate': 2.1091897773834746e-07, 'epoch': 0.91} 91%|█████████ | 31193/34278 [34:09:05<3:34:14, 4.17s/it] 91%|█████████ | 31194/34278 [34:09:08<3:14:43, 3.79s/it] {'loss': 0.1055, 'grad_norm': 0.7603850833206659, 'learning_rate': 2.1078323028652203e-07, 'epoch': 0.91} 91%|█████████ | 31194/34278 [34:09:08<3:14:43, 3.79s/it] 91%|█████████ | 31195/34278 [34:09:12<3:07:34, 3.65s/it] {'loss': 0.1009, 'grad_norm': 0.6832395959495823, 'learning_rate': 2.1064752559168067e-07, 'epoch': 0.91} 91%|█████████ | 31195/34278 [34:09:12<3:07:34, 3.65s/it] 91%|█████████ | 31196/34278 [34:09:17<3:37:53, 4.24s/it] {'loss': 0.128, 'grad_norm': 0.8325881847933028, 'learning_rate': 2.1051186365503517e-07, 'epoch': 0.91} 91%|█████████ | 31196/34278 [34:09:17<3:37:53, 4.24s/it] 91%|█████████ | 31197/34278 [34:09:23<4:05:13, 4.78s/it] {'loss': 0.1121, 'grad_norm': 0.6845476421445118, 'learning_rate': 2.1037624447779682e-07, 'epoch': 0.91} 91%|█████████ | 31197/34278 [34:09:23<4:05:13, 4.78s/it] 91%|█████████ | 31198/34278 [34:09:26<3:35:36, 4.20s/it] {'loss': 0.1075, 'grad_norm': 0.7073666026472823, 'learning_rate': 2.1024066806117515e-07, 'epoch': 0.91} 91%|█████████ | 31198/34278 
[34:09:26<3:35:36, 4.20s/it] 91%|█████████ | 31199/34278 [34:09:29<3:14:58, 3.80s/it] {'loss': 0.1141, 'grad_norm': 1.0179873335506506, 'learning_rate': 2.1010513440638203e-07, 'epoch': 0.91} 91%|█████████ | 31199/34278 [34:09:29<3:14:58, 3.80s/it] 91%|█████████ | 31200/34278 [34:09:32<3:00:08, 3.51s/it] {'loss': 0.0954, 'grad_norm': 0.8199139157129577, 'learning_rate': 2.09969643514627e-07, 'epoch': 0.91} 91%|█████████ | 31200/34278 [34:09:32<3:00:08, 3.51s/it] 91%|█████████ | 31201/34278 [34:09:35<2:57:49, 3.47s/it] {'loss': 0.1098, 'grad_norm': 1.125528827216129, 'learning_rate': 2.0983419538711803e-07, 'epoch': 0.91} 91%|█████████ | 31201/34278 [34:09:35<2:57:49, 3.47s/it] 91%|█████████ | 31202/34278 [34:09:40<3:20:26, 3.91s/it] {'loss': 0.1239, 'grad_norm': 0.7962419334239833, 'learning_rate': 2.0969879002506742e-07, 'epoch': 0.91} 91%|█████████ | 31202/34278 [34:09:40<3:20:26, 3.91s/it] 91%|█████████ | 31203/34278 [34:09:46<3:55:47, 4.60s/it] {'loss': 0.0953, 'grad_norm': 0.7826052581298216, 'learning_rate': 2.095634274296826e-07, 'epoch': 0.91} 91%|█████████ | 31203/34278 [34:09:46<3:55:47, 4.60s/it] 91%|█████████ | 31204/34278 [34:09:50<3:34:53, 4.19s/it] {'loss': 0.1033, 'grad_norm': 0.9102591702918901, 'learning_rate': 2.0942810760217092e-07, 'epoch': 0.91} 91%|█████████ | 31204/34278 [34:09:50<3:34:53, 4.19s/it] 91%|█████████ | 31205/34278 [34:09:55<4:00:58, 4.71s/it] {'loss': 0.1047, 'grad_norm': 0.8062521425848791, 'learning_rate': 2.0929283054374193e-07, 'epoch': 0.91} 91%|█████████ | 31205/34278 [34:09:55<4:00:58, 4.71s/it] 91%|█████████ | 31206/34278 [34:09:58<3:30:17, 4.11s/it] {'loss': 0.1228, 'grad_norm': 0.8918295512444327, 'learning_rate': 2.0915759625560306e-07, 'epoch': 0.91} 91%|█████████ | 31206/34278 [34:09:58<3:30:17, 4.11s/it] 91%|█████████ | 31207/34278 [34:10:04<3:51:03, 4.51s/it] {'loss': 0.1021, 'grad_norm': 1.0958800628665835, 'learning_rate': 2.0902240473896106e-07, 'epoch': 0.91} 91%|█████████ | 31207/34278 [34:10:04<3:51:03, 
4.51s/it] 91%|█████████ | 31208/34278 [34:10:09<4:08:16, 4.85s/it] {'loss': 0.1171, 'grad_norm': 0.8979327014990276, 'learning_rate': 2.0888725599502335e-07, 'epoch': 0.91} 91%|█████████ | 31208/34278 [34:10:09<4:08:16, 4.85s/it] 91%|█████████ | 31209/34278 [34:10:12<3:35:37, 4.22s/it] {'loss': 0.1012, 'grad_norm': 0.8538666372363959, 'learning_rate': 2.0875215002499727e-07, 'epoch': 0.91} 91%|█████████ | 31209/34278 [34:10:12<3:35:37, 4.22s/it] 91%|█████████ | 31210/34278 [34:10:16<3:34:36, 4.20s/it] {'loss': 0.1114, 'grad_norm': 1.3116497012888577, 'learning_rate': 2.0861708683008796e-07, 'epoch': 0.91} 91%|█████████ | 31210/34278 [34:10:16<3:34:36, 4.20s/it] 91%|█████████ | 31211/34278 [34:10:19<3:18:13, 3.88s/it] {'loss': 0.1142, 'grad_norm': 0.8829732928385619, 'learning_rate': 2.084820664115006e-07, 'epoch': 0.91} 91%|█████████ | 31211/34278 [34:10:19<3:18:13, 3.88s/it] 91%|█████████ | 31212/34278 [34:10:22<3:03:27, 3.59s/it] {'loss': 0.1215, 'grad_norm': 0.8813183058437354, 'learning_rate': 2.0834708877044252e-07, 'epoch': 0.91} 91%|█████████ | 31212/34278 [34:10:22<3:03:27, 3.59s/it] 91%|█████████ | 31213/34278 [34:10:25<2:55:15, 3.43s/it] {'loss': 0.1125, 'grad_norm': 0.8921581346672733, 'learning_rate': 2.0821215390811722e-07, 'epoch': 0.91} 91%|█████████ | 31213/34278 [34:10:25<2:55:15, 3.43s/it] 91%|█████████ | 31214/34278 [34:10:28<2:49:42, 3.32s/it] {'loss': 0.1025, 'grad_norm': 0.8727611817341433, 'learning_rate': 2.0807726182572984e-07, 'epoch': 0.91} 91%|█████████ | 31214/34278 [34:10:28<2:49:42, 3.32s/it] 91%|█████████ | 31215/34278 [34:10:31<2:47:19, 3.28s/it] {'loss': 0.107, 'grad_norm': 0.7044576566301238, 'learning_rate': 2.0794241252448554e-07, 'epoch': 0.91} 91%|█████████ | 31215/34278 [34:10:31<2:47:19, 3.28s/it] 91%|█████████ | 31216/34278 [34:10:36<3:00:32, 3.54s/it] {'loss': 0.1085, 'grad_norm': 0.7849471879663136, 'learning_rate': 2.0780760600558724e-07, 'epoch': 0.91} 91%|█████████ | 31216/34278 [34:10:36<3:00:32, 3.54s/it] 
91%|█████████ | 31217/34278 [34:10:39<2:58:21, 3.50s/it] {'loss': 0.1132, 'grad_norm': 0.8331097134214188, 'learning_rate': 2.0767284227023786e-07, 'epoch': 0.91} 91%|█████████ | 31217/34278 [34:10:39<2:58:21, 3.50s/it] 91%|█████████ | 31218/34278 [34:10:42<2:53:31, 3.40s/it] {'loss': 0.1337, 'grad_norm': 0.6959226899889344, 'learning_rate': 2.0753812131964202e-07, 'epoch': 0.91} 91%|█████████ | 31218/34278 [34:10:42<2:53:31, 3.40s/it] 91%|█████████ | 31219/34278 [34:10:45<2:50:10, 3.34s/it] {'loss': 0.1025, 'grad_norm': 0.967610349999905, 'learning_rate': 2.0740344315500093e-07, 'epoch': 0.91} 91%|█████████ | 31219/34278 [34:10:45<2:50:10, 3.34s/it] 91%|█████████ | 31220/34278 [34:10:48<2:44:55, 3.24s/it] {'loss': 0.1306, 'grad_norm': 0.9678259605853304, 'learning_rate': 2.0726880777751922e-07, 'epoch': 0.91} 91%|█████████ | 31220/34278 [34:10:48<2:44:55, 3.24s/it] 91%|█████████ | 31221/34278 [34:10:52<2:48:57, 3.32s/it] {'loss': 0.1215, 'grad_norm': 0.8755955747118267, 'learning_rate': 2.0713421518839595e-07, 'epoch': 0.91} 91%|█████████ | 31221/34278 [34:10:52<2:48:57, 3.32s/it] 91%|█████████ | 31222/34278 [34:10:55<2:45:11, 3.24s/it] {'loss': 0.125, 'grad_norm': 0.7458522518925815, 'learning_rate': 2.0699966538883565e-07, 'epoch': 0.91} 91%|█████████ | 31222/34278 [34:10:55<2:45:11, 3.24s/it] 91%|█████████ | 31223/34278 [34:10:58<2:39:55, 3.14s/it] {'loss': 0.1196, 'grad_norm': 0.9182134613131028, 'learning_rate': 2.068651583800374e-07, 'epoch': 0.91} 91%|█████████ | 31223/34278 [34:10:58<2:39:55, 3.14s/it] 91%|█████████ | 31224/34278 [34:11:01<2:42:46, 3.20s/it] {'loss': 0.1035, 'grad_norm': 0.8866340951449045, 'learning_rate': 2.0673069416320303e-07, 'epoch': 0.91} 91%|█████████ | 31224/34278 [34:11:01<2:42:46, 3.20s/it] 91%|█████████ | 31225/34278 [34:11:06<3:04:47, 3.63s/it] {'loss': 0.1116, 'grad_norm': 0.8933500504473191, 'learning_rate': 2.065962727395321e-07, 'epoch': 0.91} 91%|█████████ | 31225/34278 [34:11:06<3:04:47, 3.63s/it] 91%|█████████ | 
31226/34278 [34:11:09<2:56:01, 3.46s/it] {'loss': 0.1031, 'grad_norm': 0.7485970038879802, 'learning_rate': 2.0646189411022588e-07, 'epoch': 0.91} 91%|█████████ | 31226/34278 [34:11:09<2:56:01, 3.46s/it] 91%|█████████ | 31227/34278 [34:11:12<2:57:37, 3.49s/it] {'loss': 0.1201, 'grad_norm': 0.8148759799355173, 'learning_rate': 2.0632755827648397e-07, 'epoch': 0.91} 91%|█████████ | 31227/34278 [34:11:12<2:57:37, 3.49s/it] 91%|█████████ | 31228/34278 [34:11:18<3:34:31, 4.22s/it] {'loss': 0.0992, 'grad_norm': 0.7876042273869697, 'learning_rate': 2.061932652395049e-07, 'epoch': 0.91} 91%|█████████ | 31228/34278 [34:11:18<3:34:31, 4.22s/it] 91%|█████████ | 31229/34278 [34:11:22<3:18:21, 3.90s/it] {'loss': 0.1073, 'grad_norm': 0.9348614192922458, 'learning_rate': 2.060590150004882e-07, 'epoch': 0.91} 91%|█████████ | 31229/34278 [34:11:22<3:18:21, 3.90s/it] 91%|█████████ | 31230/34278 [34:11:25<3:03:54, 3.62s/it] {'loss': 0.1041, 'grad_norm': 0.8014039247808129, 'learning_rate': 2.0592480756063237e-07, 'epoch': 0.91} 91%|█████████ | 31230/34278 [34:11:25<3:03:54, 3.62s/it] 91%|█████████ | 31231/34278 [34:11:30<3:31:36, 4.17s/it] {'loss': 0.1228, 'grad_norm': 0.8874166261027333, 'learning_rate': 2.057906429211337e-07, 'epoch': 0.91} 91%|█████████ | 31231/34278 [34:11:30<3:31:36, 4.17s/it] 91%|█████████ | 31232/34278 [34:11:35<3:48:10, 4.49s/it] {'loss': 0.1181, 'grad_norm': 0.7143287227728006, 'learning_rate': 2.0565652108319344e-07, 'epoch': 0.91} 91%|█████████ | 31232/34278 [34:11:35<3:48:10, 4.49s/it] 91%|█████████ | 31233/34278 [34:11:38<3:28:02, 4.10s/it] {'loss': 0.1188, 'grad_norm': 0.7281203535430113, 'learning_rate': 2.055224420480073e-07, 'epoch': 0.91} 91%|█████████ | 31233/34278 [34:11:38<3:28:02, 4.10s/it] 91%|█████████ | 31234/34278 [34:11:43<3:41:53, 4.37s/it] {'loss': 0.0935, 'grad_norm': 0.763141740829035, 'learning_rate': 2.0538840581677156e-07, 'epoch': 0.91} 91%|█████████ | 31234/34278 [34:11:43<3:41:53, 4.37s/it] 91%|█████████ | 31235/34278 
[34:11:46<3:18:52, 3.92s/it] {'loss': 0.112, 'grad_norm': 0.9045996524010984, 'learning_rate': 2.052544123906841e-07, 'epoch': 0.91} 91%|█████████ | 31235/34278 [34:11:46<3:18:52, 3.92s/it] 91%|█████████ | 31236/34278 [34:11:52<3:47:36, 4.49s/it] {'loss': 0.106, 'grad_norm': 0.7547713704952629, 'learning_rate': 2.051204617709407e-07, 'epoch': 0.91} 91%|█████████ | 31236/34278 [34:11:52<3:47:36, 4.49s/it] 91%|█████████ | 31237/34278 [34:11:58<4:10:12, 4.94s/it] {'loss': 0.1105, 'grad_norm': 1.0281046979497552, 'learning_rate': 2.0498655395873645e-07, 'epoch': 0.91} 91%|█████████ | 31237/34278 [34:11:58<4:10:12, 4.94s/it] 91%|█████████ | 31238/34278 [34:12:02<3:50:53, 4.56s/it] {'loss': 0.1236, 'grad_norm': 0.9824969572957485, 'learning_rate': 2.0485268895526766e-07, 'epoch': 0.91} 91%|█████████ | 31238/34278 [34:12:02<3:50:53, 4.56s/it] 91%|█████████ | 31239/34278 [34:12:05<3:35:19, 4.25s/it] {'loss': 0.0903, 'grad_norm': 0.8476378577194715, 'learning_rate': 2.0471886676173002e-07, 'epoch': 0.91} 91%|█████████ | 31239/34278 [34:12:05<3:35:19, 4.25s/it] 91%|█████████ | 31240/34278 [34:12:09<3:20:58, 3.97s/it] {'loss': 0.1229, 'grad_norm': 0.7982521157682128, 'learning_rate': 2.045850873793176e-07, 'epoch': 0.91} 91%|█████████ | 31240/34278 [34:12:09<3:20:58, 3.97s/it] 91%|█████████ | 31241/34278 [34:12:15<3:55:07, 4.65s/it] {'loss': 0.1272, 'grad_norm': 0.9620654671412657, 'learning_rate': 2.044513508092244e-07, 'epoch': 0.91} 91%|█████████ | 31241/34278 [34:12:15<3:55:07, 4.65s/it] 91%|█████████ | 31242/34278 [34:12:18<3:28:10, 4.11s/it] {'loss': 0.0799, 'grad_norm': 0.7299604073402126, 'learning_rate': 2.0431765705264505e-07, 'epoch': 0.91} 91%|█████████ | 31242/34278 [34:12:18<3:28:10, 4.11s/it] 91%|█████████ | 31243/34278 [34:12:21<3:14:12, 3.84s/it] {'loss': 0.1011, 'grad_norm': 0.7012347569197621, 'learning_rate': 2.0418400611077194e-07, 'epoch': 0.91} 91%|█████████ | 31243/34278 [34:12:21<3:14:12, 3.84s/it] 91%|█████████ | 31244/34278 [34:12:24<3:02:06, 
3.60s/it] {'loss': 0.0953, 'grad_norm': 0.7623281519241979, 'learning_rate': 2.0405039798479964e-07, 'epoch': 0.91} 91%|█████████ | 31244/34278 [34:12:24<3:02:06, 3.60s/it] 91%|█████████ | 31245/34278 [34:12:27<2:57:59, 3.52s/it] {'loss': 0.1127, 'grad_norm': 0.8991371714453419, 'learning_rate': 2.039168326759211e-07, 'epoch': 0.91} 91%|█████████ | 31245/34278 [34:12:27<2:57:59, 3.52s/it] 91%|█████████ | 31246/34278 [34:12:30<2:49:04, 3.35s/it] {'loss': 0.1039, 'grad_norm': 1.036221718478377, 'learning_rate': 2.0378331018532814e-07, 'epoch': 0.91} 91%|█████████ | 31246/34278 [34:12:30<2:49:04, 3.35s/it] 91%|█████████ | 31247/34278 [34:12:33<2:45:29, 3.28s/it] {'loss': 0.0994, 'grad_norm': 0.794707890425934, 'learning_rate': 2.0364983051421204e-07, 'epoch': 0.91} 91%|█████████ | 31247/34278 [34:12:33<2:45:29, 3.28s/it] 91%|█████████ | 31248/34278 [34:12:36<2:41:26, 3.20s/it] {'loss': 0.1216, 'grad_norm': 0.9886205039691118, 'learning_rate': 2.0351639366376575e-07, 'epoch': 0.91} 91%|█████████ | 31248/34278 [34:12:36<2:41:26, 3.20s/it] 91%|█████████ | 31249/34278 [34:12:39<2:36:11, 3.09s/it] {'loss': 0.1286, 'grad_norm': 0.9432682229351107, 'learning_rate': 2.0338299963517993e-07, 'epoch': 0.91} 91%|█████████ | 31249/34278 [34:12:39<2:36:11, 3.09s/it] 91%|█████████ | 31250/34278 [34:12:45<3:11:41, 3.80s/it] {'loss': 0.0921, 'grad_norm': 0.8112227455028246, 'learning_rate': 2.0324964842964589e-07, 'epoch': 0.91} 91%|█████████ | 31250/34278 [34:12:45<3:11:41, 3.80s/it] 91%|█████████ | 31251/34278 [34:12:47<2:53:30, 3.44s/it] {'loss': 0.1389, 'grad_norm': 0.7402480177845739, 'learning_rate': 2.0311634004835324e-07, 'epoch': 0.91} 91%|█████████ | 31251/34278 [34:12:47<2:53:30, 3.44s/it] 91%|█████████ | 31252/34278 [34:12:51<2:55:42, 3.48s/it] {'loss': 0.1064, 'grad_norm': 0.8635522058155916, 'learning_rate': 2.0298307449249377e-07, 'epoch': 0.91} 91%|█████████ | 31252/34278 [34:12:51<2:55:42, 3.48s/it] 91%|█████████ | 31253/34278 [34:12:54<2:46:10, 3.30s/it] {'loss': 
0.1093, 'grad_norm': 0.9832749334854058, 'learning_rate': 2.02849851763256e-07, 'epoch': 0.91} 91%|█████████ | 31253/34278 [34:12:54<2:46:10, 3.30s/it] 91%|█████████ | 31254/34278 [34:12:58<2:55:19, 3.48s/it] {'loss': 0.1018, 'grad_norm': 1.0136663990298818, 'learning_rate': 2.0271667186182897e-07, 'epoch': 0.91} 91%|█████████ | 31254/34278 [34:12:58<2:55:19, 3.48s/it] 91%|█████████ | 31255/34278 [34:13:02<3:06:32, 3.70s/it] {'loss': 0.1097, 'grad_norm': 0.7260314617532732, 'learning_rate': 2.025835347894023e-07, 'epoch': 0.91} 91%|█████████ | 31255/34278 [34:13:02<3:06:32, 3.70s/it] 91%|█████████ | 31256/34278 [34:13:05<3:04:36, 3.67s/it] {'loss': 0.1295, 'grad_norm': 0.9468722254681623, 'learning_rate': 2.0245044054716557e-07, 'epoch': 0.91} 91%|█████████ | 31256/34278 [34:13:05<3:04:36, 3.67s/it] 91%|█████████ | 31257/34278 [34:13:08<2:51:31, 3.41s/it] {'loss': 0.1195, 'grad_norm': 1.0606643474312076, 'learning_rate': 2.023173891363056e-07, 'epoch': 0.91} 91%|█████████ | 31257/34278 [34:13:08<2:51:31, 3.41s/it] 91%|█████████ | 31258/34278 [34:13:13<3:07:58, 3.73s/it] {'loss': 0.1126, 'grad_norm': 0.9831047885398735, 'learning_rate': 2.0218438055801038e-07, 'epoch': 0.91} 91%|█████████ | 31258/34278 [34:13:13<3:07:58, 3.73s/it] 91%|█████████ | 31259/34278 [34:13:18<3:35:35, 4.28s/it] {'loss': 0.1127, 'grad_norm': 0.859583230323244, 'learning_rate': 2.0205141481346835e-07, 'epoch': 0.91} 91%|█████████ | 31259/34278 [34:13:18<3:35:35, 4.28s/it] 91%|█████████ | 31260/34278 [34:13:21<3:12:38, 3.83s/it] {'loss': 0.1071, 'grad_norm': 0.8819680571912979, 'learning_rate': 2.0191849190386526e-07, 'epoch': 0.91} 91%|█████████ | 31260/34278 [34:13:21<3:12:38, 3.83s/it] 91%|█████████ | 31261/34278 [34:13:25<3:19:43, 3.97s/it] {'loss': 0.1175, 'grad_norm': 0.7757292719689446, 'learning_rate': 2.0178561183038793e-07, 'epoch': 0.91} 91%|█████████ | 31261/34278 [34:13:25<3:19:43, 3.97s/it] 91%|█████████ | 31262/34278 [34:13:28<3:02:35, 3.63s/it] {'loss': 0.1094, 'grad_norm': 
0.8075015649567129, 'learning_rate': 2.0165277459422428e-07, 'epoch': 0.91} 91%|█████████ | 31262/34278 [34:13:28<3:02:35, 3.63s/it] 91%|█████████ | 31263/34278 [34:13:31<2:50:32, 3.39s/it] {'loss': 0.1265, 'grad_norm': 0.8592848982920227, 'learning_rate': 2.0151998019655895e-07, 'epoch': 0.91} 91%|█████████ | 31263/34278 [34:13:31<2:50:32, 3.39s/it] 91%|█████████ | 31264/34278 [34:13:36<3:11:45, 3.82s/it] {'loss': 0.1202, 'grad_norm': 1.12269639300866, 'learning_rate': 2.0138722863857762e-07, 'epoch': 0.91} 91%|█████████ | 31264/34278 [34:13:36<3:11:45, 3.82s/it] 91%|█████████ | 31265/34278 [34:13:40<3:10:55, 3.80s/it] {'loss': 0.1012, 'grad_norm': 0.8409779073218775, 'learning_rate': 2.0125451992146606e-07, 'epoch': 0.91} 91%|█████████ | 31265/34278 [34:13:40<3:10:55, 3.80s/it] 91%|█████████ | 31266/34278 [34:13:43<3:07:26, 3.73s/it] {'loss': 0.102, 'grad_norm': 0.7299165332371558, 'learning_rate': 2.0112185404640827e-07, 'epoch': 0.91} 91%|█████████ | 31266/34278 [34:13:43<3:07:26, 3.73s/it] 91%|█████████ | 31267/34278 [34:13:47<3:04:06, 3.67s/it] {'loss': 0.1041, 'grad_norm': 0.8173740844966054, 'learning_rate': 2.0098923101458833e-07, 'epoch': 0.91} 91%|█████████ | 31267/34278 [34:13:47<3:04:06, 3.67s/it] 91%|█████████ | 31268/34278 [34:13:50<3:02:20, 3.63s/it] {'loss': 0.0918, 'grad_norm': 0.6394795081855168, 'learning_rate': 2.0085665082719142e-07, 'epoch': 0.91} 91%|█████████ | 31268/34278 [34:13:50<3:02:20, 3.63s/it] 91%|█████████ | 31269/34278 [34:13:53<2:55:04, 3.49s/it] {'loss': 0.1102, 'grad_norm': 0.7054351889259936, 'learning_rate': 2.00724113485401e-07, 'epoch': 0.91} 91%|█████████ | 31269/34278 [34:13:53<2:55:04, 3.49s/it] 91%|█████████ | 31270/34278 [34:13:57<2:57:17, 3.54s/it] {'loss': 0.1159, 'grad_norm': 0.8141005998471933, 'learning_rate': 2.0059161899040001e-07, 'epoch': 0.91} 91%|█████████ | 31270/34278 [34:13:57<2:57:17, 3.54s/it]Traceback (most recent call last): File 
"/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f16ba1bca90>
Failed to fetch sample 3255717.
Exception: cannot identify image file <_io.BytesIO object at 0x7f16ba1bca90> 91%|█████████ | 31271/34278 [34:14:00<2:54:08, 3.47s/it] {'loss': 0.1069, 'grad_norm': 0.9182301652023297, 'learning_rate': 2.004591673433709e-07, 'epoch': 0.91} 91%|█████████ | 31271/34278 [34:14:00<2:54:08, 3.47s/it] 91%|█████████ | 31272/34278 [34:14:03<2:46:25, 3.32s/it] {'loss': 0.11, 'grad_norm': 0.7432654349729967, 'learning_rate': 2.003267585454971e-07, 'epoch': 0.91} 91%|█████████ | 31272/34278 [34:14:03<2:46:25, 3.32s/it] 91%|█████████ | 31273/34278 [34:14:09<3:27:02, 4.13s/it] {'loss': 0.1161, 'grad_norm': 0.8057741532574665, 'learning_rate': 2.0019439259795935e-07, 'epoch': 0.91} 91%|█████████ | 31273/34278 [34:14:09<3:27:02, 4.13s/it] 91%|█████████ | 31274/34278 [34:14:12<3:09:27, 3.78s/it] {'loss': 0.1175, 'grad_norm': 1.125519038579545, 'learning_rate': 2.0006206950194063e-07, 'epoch': 0.91} 91%|█████████ | 31274/34278 [34:14:12<3:09:27, 3.78s/it] 91%|█████████ | 31275/34278 [34:14:16<3:01:34, 3.63s/it] {'loss': 0.113, 'grad_norm': 0.8786397529807448, 'learning_rate': 1.9992978925862215e-07, 'epoch': 0.91} 91%|█████████ | 31275/34278 [34:14:16<3:01:34, 3.63s/it] 91%|█████████ | 31276/34278 [34:14:19<3:04:43, 3.69s/it] {'loss': 0.1181, 'grad_norm': 0.8508726654735528, 'learning_rate': 1.9979755186918525e-07, 'epoch': 0.91} 91%|█████████ | 31276/34278 [34:14:19<3:04:43, 3.69s/it] 91%|█████████ | 31277/34278 [34:14:24<3:18:42, 3.97s/it] {'loss': 0.1113, 'grad_norm': 0.916797857464992, 'learning_rate': 1.9966535733480897e-07, 'epoch': 0.91} 91%|█████████ | 31277/34278 [34:14:24<3:18:42, 3.97s/it] 91%|█████████ | 31278/34278 [34:14:30<3:47:14, 4.54s/it] {'loss': 0.1079, 'grad_norm': 0.8103247524664272, 'learning_rate': 1.9953320565667457e-07, 'epoch': 0.91} 91%|█████████ | 31278/34278 [34:14:30<3:47:14, 4.54s/it] 91%|█████████▏| 31279/34278 [34:14:33<3:32:01, 4.24s/it] {'loss': 0.1151, 'grad_norm': 0.8911431660576439, 'learning_rate': 1.9940109683596165e-07, 'epoch': 0.91} 
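The `UnidentifiedImageError` above is raised when the bytes fetched for a sample are not a decodable image; the run logs "Failed to fetch sample … Exception: …" and the progress bar continues, which suggests the dataset substitutes another sample rather than crashing. A minimal sketch of that skip-and-retry pattern, assuming a hypothetical `decode` callable (the names here are illustrative, not the repo's actual `_get_item` API):

```python
class RobustDataset:
    """Wraps a decode function; on failure, falls back to a neighboring index.

    Mirrors the behavior visible in the log: the bad sample is reported and
    training proceeds instead of aborting the whole run.
    """

    def __init__(self, records, decode, max_retries=10):
        self.records = records
        self.decode = decode          # may raise on corrupt data
        self.max_retries = max_retries

    def __len__(self):
        return len(self.records)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.decode(self.records[i])
            except Exception as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.records)  # deterministically try the next sample
        raise RuntimeError("too many consecutive corrupt samples")


def decode(rec):
    # Stand-in decoder: rejects records marked corrupt, "decodes" the rest.
    if rec == "corrupt":
        raise ValueError("cannot identify image file")
    return rec.upper()
```

For example, `RobustDataset(["ok", "corrupt", "fine"], decode)[1]` logs the failure for index 1 and returns the decoded sample at index 2 instead.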
91%|█████████▏| 31279/34278 [34:14:33<3:32:01, 4.24s/it] 91%|█████████▏| 31280/34278 [34:14:36<3:11:12, 3.83s/it] {'loss': 0.111, 'grad_norm': 0.768540094033884, 'learning_rate': 1.9926903087385042e-07, 'epoch': 0.91} 91%|█████████▏| 31280/34278 [34:14:36<3:11:12, 3.83s/it] 91%|█████████▏| 31281/34278 [34:14:39<2:58:11, 3.57s/it] {'loss': 0.1025, 'grad_norm': 0.8326238160989231, 'learning_rate': 1.9913700777151823e-07, 'epoch': 0.91} 91%|█████████▏| 31281/34278 [34:14:39<2:58:11, 3.57s/it] 91%|█████████▏| 31282/34278 [34:14:43<3:07:08, 3.75s/it] {'loss': 0.0899, 'grad_norm': 1.2324609342680006, 'learning_rate': 1.9900502753014584e-07, 'epoch': 0.91} 91%|█████████▏| 31282/34278 [34:14:43<3:07:08, 3.75s/it] 91%|█████████▏| 31283/34278 [34:14:47<3:06:39, 3.74s/it] {'loss': 0.1146, 'grad_norm': 0.7990047517751675, 'learning_rate': 1.988730901509106e-07, 'epoch': 0.91} 91%|█████████▏| 31283/34278 [34:14:47<3:06:39, 3.74s/it] 91%|█████████▏| 31284/34278 [34:14:50<2:56:09, 3.53s/it] {'loss': 0.1181, 'grad_norm': 0.7296273030813718, 'learning_rate': 1.987411956349894e-07, 'epoch': 0.91} 91%|█████████▏| 31284/34278 [34:14:50<2:56:09, 3.53s/it] 91%|█████████▏| 31285/34278 [34:14:53<2:51:40, 3.44s/it] {'loss': 0.117, 'grad_norm': 0.8432077346000755, 'learning_rate': 1.9860934398356013e-07, 'epoch': 0.91} 91%|█████████▏| 31285/34278 [34:14:53<2:51:40, 3.44s/it] 91%|█████████▏| 31286/34278 [34:14:57<2:51:35, 3.44s/it] {'loss': 0.1075, 'grad_norm': 0.8749049709019081, 'learning_rate': 1.9847753519780188e-07, 'epoch': 0.91} 91%|█████████▏| 31286/34278 [34:14:57<2:51:35, 3.44s/it] 91%|█████████▏| 31287/34278 [34:15:01<2:59:39, 3.60s/it] {'loss': 0.1059, 'grad_norm': 0.7978980913542358, 'learning_rate': 1.983457692788898e-07, 'epoch': 0.91} 91%|█████████▏| 31287/34278 [34:15:01<2:59:39, 3.60s/it] 91%|█████████▏| 31288/34278 [34:15:05<3:05:32, 3.72s/it] {'loss': 0.1023, 'grad_norm': 0.7783015059887668, 'learning_rate': 1.9821404622799966e-07, 'epoch': 0.91} 91%|█████████▏| 
31288/34278 [34:15:05<3:05:32, 3.72s/it] 91%|█████████▏| 31289/34278 [34:15:08<2:53:29, 3.48s/it] {'loss': 0.1187, 'grad_norm': 0.7633953813240251, 'learning_rate': 1.9808236604630882e-07, 'epoch': 0.91} 91%|█████████▏| 31289/34278 [34:15:08<2:53:29, 3.48s/it] 91%|█████████▏| 31290/34278 [34:15:11<2:46:15, 3.34s/it] {'loss': 0.0849, 'grad_norm': 0.6597935910223036, 'learning_rate': 1.9795072873499245e-07, 'epoch': 0.91} 91%|█████████▏| 31290/34278 [34:15:11<2:46:15, 3.34s/it] 91%|█████████▏| 31291/34278 [34:15:14<2:47:40, 3.37s/it] {'loss': 0.1089, 'grad_norm': 0.8487592715030142, 'learning_rate': 1.9781913429522403e-07, 'epoch': 0.91} 91%|█████████▏| 31291/34278 [34:15:14<2:47:40, 3.37s/it] 91%|█████████▏| 31292/34278 [34:15:17<2:38:17, 3.18s/it] {'loss': 0.1104, 'grad_norm': 0.9464287978142161, 'learning_rate': 1.9768758272818155e-07, 'epoch': 0.91} 91%|█████████▏| 31292/34278 [34:15:17<2:38:17, 3.18s/it] 91%|█████████▏| 31293/34278 [34:15:21<2:50:42, 3.43s/it] {'loss': 0.1168, 'grad_norm': 1.1129414968464362, 'learning_rate': 1.9755607403503797e-07, 'epoch': 0.91} 91%|█████████▏| 31293/34278 [34:15:21<2:50:42, 3.43s/it] 91%|█████████▏| 31294/34278 [34:15:24<2:46:21, 3.34s/it] {'loss': 0.1284, 'grad_norm': 0.7808060210399638, 'learning_rate': 1.9742460821696674e-07, 'epoch': 0.91} 91%|█████████▏| 31294/34278 [34:15:24<2:46:21, 3.34s/it] 91%|█████████▏| 31295/34278 [34:15:28<2:57:42, 3.57s/it] {'loss': 0.1215, 'grad_norm': 0.9637865048798887, 'learning_rate': 1.972931852751425e-07, 'epoch': 0.91} 91%|█████████▏| 31295/34278 [34:15:28<2:57:42, 3.57s/it] 91%|█████████▏| 31296/34278 [34:15:31<2:46:53, 3.36s/it] {'loss': 0.0873, 'grad_norm': 0.6754609672884183, 'learning_rate': 1.9716180521073823e-07, 'epoch': 0.91} 91%|█████████▏| 31296/34278 [34:15:31<2:46:53, 3.36s/it] 91%|█████████▏| 31297/34278 [34:15:34<2:43:18, 3.29s/it] {'loss': 0.1092, 'grad_norm': 0.7668595822596354, 'learning_rate': 1.9703046802492687e-07, 'epoch': 0.91} 91%|█████████▏| 31297/34278 
[34:15:34<2:43:18, 3.29s/it] 91%|█████████▏| 31298/34278 [34:15:38<2:49:54, 3.42s/it] {'loss': 0.1223, 'grad_norm': 0.8351857025860321, 'learning_rate': 1.9689917371888024e-07, 'epoch': 0.91} 91%|█████████▏| 31298/34278 [34:15:38<2:49:54, 3.42s/it] 91%|█████████▏| 31299/34278 [34:15:44<3:28:09, 4.19s/it] {'loss': 0.1364, 'grad_norm': 0.9598557356092702, 'learning_rate': 1.9676792229377184e-07, 'epoch': 0.91} 91%|█████████▏| 31299/34278 [34:15:44<3:28:09, 4.19s/it] 91%|█████████▏| 31300/34278 [34:15:50<3:54:44, 4.73s/it] {'loss': 0.1027, 'grad_norm': 0.799230935536841, 'learning_rate': 1.9663671375077298e-07, 'epoch': 0.91} 91%|█████████▏| 31300/34278 [34:15:50<3:54:44, 4.73s/it] 91%|█████████▏| 31301/34278 [34:15:54<3:40:19, 4.44s/it] {'loss': 0.1189, 'grad_norm': 0.6777469754416956, 'learning_rate': 1.9650554809105438e-07, 'epoch': 0.91} 91%|█████████▏| 31301/34278 [34:15:54<3:40:19, 4.44s/it] 91%|█████████▏| 31302/34278 [34:15:58<3:38:24, 4.40s/it] {'loss': 0.1191, 'grad_norm': 0.9142433031010697, 'learning_rate': 1.9637442531578787e-07, 'epoch': 0.91} 91%|█████████▏| 31302/34278 [34:15:58<3:38:24, 4.40s/it] 91%|█████████▏| 31303/34278 [34:16:01<3:15:18, 3.94s/it] {'loss': 0.1158, 'grad_norm': 0.906970151087242, 'learning_rate': 1.962433454261431e-07, 'epoch': 0.91} 91%|█████████▏| 31303/34278 [34:16:01<3:15:18, 3.94s/it] 91%|█████████▏| 31304/34278 [34:16:07<3:46:34, 4.57s/it] {'loss': 0.0898, 'grad_norm': 0.8526403777218867, 'learning_rate': 1.9611230842329133e-07, 'epoch': 0.91} 91%|█████████▏| 31304/34278 [34:16:07<3:46:34, 4.57s/it] 91%|█████████▏| 31305/34278 [34:16:11<3:37:13, 4.38s/it] {'loss': 0.1151, 'grad_norm': 0.5530258071756962, 'learning_rate': 1.9598131430840272e-07, 'epoch': 0.91} 91%|█████████▏| 31305/34278 [34:16:11<3:37:13, 4.38s/it] 91%|█████████▏| 31306/34278 [34:16:14<3:14:13, 3.92s/it] {'loss': 0.094, 'grad_norm': 0.7692251461411396, 'learning_rate': 1.9585036308264582e-07, 'epoch': 0.91} 91%|█████████▏| 31306/34278 [34:16:14<3:14:13, 
3.92s/it] 91%|█████████▏| 31307/34278 [34:16:16<2:56:19, 3.56s/it] {'loss': 0.0996, 'grad_norm': 0.9758808536373275, 'learning_rate': 1.9571945474718967e-07, 'epoch': 0.91} 91%|█████████▏| 31307/34278 [34:16:16<2:56:19, 3.56s/it] 91%|█████████▏| 31308/34278 [34:16:20<2:49:36, 3.43s/it] {'loss': 0.1169, 'grad_norm': 0.8248060555101733, 'learning_rate': 1.955885893032039e-07, 'epoch': 0.91} 91%|█████████▏| 31308/34278 [34:16:20<2:49:36, 3.43s/it] 91%|█████████▏| 31309/34278 [34:16:23<2:43:35, 3.31s/it] {'loss': 0.1001, 'grad_norm': 0.7753417412953666, 'learning_rate': 1.954577667518559e-07, 'epoch': 0.91} 91%|█████████▏| 31309/34278 [34:16:23<2:43:35, 3.31s/it] 91%|█████████▏| 31310/34278 [34:16:26<2:43:52, 3.31s/it] {'loss': 0.1028, 'grad_norm': 0.8813853837993885, 'learning_rate': 1.953269870943142e-07, 'epoch': 0.91} 91%|█████████▏| 31310/34278 [34:16:26<2:43:52, 3.31s/it] 91%|█████████▏| 31311/34278 [34:16:29<2:39:39, 3.23s/it] {'loss': 0.093, 'grad_norm': 0.7867172767948929, 'learning_rate': 1.9519625033174562e-07, 'epoch': 0.91} 91%|█████████▏| 31311/34278 [34:16:29<2:39:39, 3.23s/it] 91%|█████████▏| 31312/34278 [34:16:32<2:39:23, 3.22s/it] {'loss': 0.1046, 'grad_norm': 0.6112482339889289, 'learning_rate': 1.9506555646531867e-07, 'epoch': 0.91} 91%|█████████▏| 31312/34278 [34:16:32<2:39:23, 3.22s/it] 91%|█████████▏| 31313/34278 [34:16:38<3:18:22, 4.01s/it] {'loss': 0.1101, 'grad_norm': 0.9858858148612809, 'learning_rate': 1.9493490549619965e-07, 'epoch': 0.91} 91%|█████████▏| 31313/34278 [34:16:38<3:18:22, 4.01s/it] 91%|█████████▏| 31314/34278 [34:16:41<3:03:27, 3.71s/it] {'loss': 0.1102, 'grad_norm': 0.8754013133805054, 'learning_rate': 1.9480429742555374e-07, 'epoch': 0.91} 91%|█████████▏| 31314/34278 [34:16:41<3:03:27, 3.71s/it] 91%|█████████▏| 31315/34278 [34:16:45<3:09:50, 3.84s/it] {'loss': 0.1031, 'grad_norm': 0.9901464111457875, 'learning_rate': 1.9467373225454832e-07, 'epoch': 0.91} 91%|█████████▏| 31315/34278 [34:16:45<3:09:50, 3.84s/it] 
91%|█████████▏| 31316/34278 [34:16:48<2:55:38, 3.56s/it] {'loss': 0.1245, 'grad_norm': 0.7140144003983445, 'learning_rate': 1.9454320998434918e-07, 'epoch': 0.91} 91%|█████████▏| 31316/34278 [34:16:48<2:55:38, 3.56s/it] 91%|█████████▏| 31317/34278 [34:16:51<2:52:37, 3.50s/it] {'loss': 0.1102, 'grad_norm': 0.9792188827829292, 'learning_rate': 1.9441273061612087e-07, 'epoch': 0.91} 91%|█████████▏| 31317/34278 [34:16:51<2:52:37, 3.50s/it] 91%|█████████▏| 31318/34278 [34:16:55<2:47:23, 3.39s/it] {'loss': 0.1128, 'grad_norm': 0.7464639286634547, 'learning_rate': 1.9428229415102807e-07, 'epoch': 0.91} 91%|█████████▏| 31318/34278 [34:16:55<2:47:23, 3.39s/it] 91%|█████████▏| 31319/34278 [34:17:00<3:23:49, 4.13s/it] {'loss': 0.1166, 'grad_norm': 0.6927326790382387, 'learning_rate': 1.9415190059023647e-07, 'epoch': 0.91} 91%|█████████▏| 31319/34278 [34:17:00<3:23:49, 4.13s/it] 91%|█████████▏| 31320/34278 [34:17:06<3:51:28, 4.70s/it] {'loss': 0.1198, 'grad_norm': 0.8408633978323875, 'learning_rate': 1.9402154993490962e-07, 'epoch': 0.91} 91%|█████████▏| 31320/34278 [34:17:06<3:51:28, 4.70s/it] 91%|█████████▏| 31321/34278 [34:17:09<3:24:29, 4.15s/it] {'loss': 0.1052, 'grad_norm': 1.1110077527922588, 'learning_rate': 1.9389124218620937e-07, 'epoch': 0.91} 91%|█████████▏| 31321/34278 [34:17:09<3:24:29, 4.15s/it] 91%|█████████▏| 31322/34278 [34:17:13<3:17:02, 4.00s/it] {'loss': 0.1177, 'grad_norm': 1.0907989034999677, 'learning_rate': 1.9376097734530196e-07, 'epoch': 0.91} 91%|█████████▏| 31322/34278 [34:17:13<3:17:02, 4.00s/it] 91%|█████████▏| 31323/34278 [34:17:16<3:08:04, 3.82s/it] {'loss': 0.1135, 'grad_norm': 0.842923815502877, 'learning_rate': 1.9363075541334986e-07, 'epoch': 0.91} 91%|█████████▏| 31323/34278 [34:17:16<3:08:04, 3.82s/it] 91%|█████████▏| 31324/34278 [34:17:20<3:11:34, 3.89s/it] {'loss': 0.1004, 'grad_norm': 0.8011246284710944, 'learning_rate': 1.9350057639151377e-07, 'epoch': 0.91} 91%|█████████▏| 31324/34278 [34:17:20<3:11:34, 
3.89s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None warnings.warn( 91%|█████████▏| 31325/34278 [34:17:24<3:08:08, 3.82s/it] {'loss': 0.113, 'grad_norm': 0.8584964658344592, 'learning_rate': 1.933704402809583e-07, 'epoch': 0.91} 91%|█████████▏| 31325/34278 [34:17:24<3:08:08, 3.82s/it] 91%|█████████▏| 31326/34278 [34:17:27<2:56:51, 3.59s/it] {'loss': 0.111, 'grad_norm': 0.9878084824197102, 'learning_rate': 1.9324034708284368e-07, 'epoch': 0.91} 91%|█████████▏| 31326/34278 [34:17:27<2:56:51, 3.59s/it] 91%|█████████▏| 31327/34278 [34:17:31<2:56:07, 3.58s/it] {'loss': 0.1242, 'grad_norm': 1.0150341445004687, 'learning_rate': 1.9311029679833115e-07, 'epoch': 0.91} 91%|█████████▏| 31327/34278 [34:17:31<2:56:07, 3.58s/it] 91%|█████████▏| 31328/34278 [34:17:34<2:53:32, 3.53s/it] {'loss': 0.0944, 'grad_norm': 0.749833970785854, 'learning_rate': 1.929802894285826e-07, 'epoch': 0.91} 91%|█████████▏| 31328/34278 [34:17:34<2:53:32, 3.53s/it] 91%|█████████▏| 31329/34278 [34:17:38<2:59:50, 3.66s/it] {'loss': 0.1373, 'grad_norm': 0.8103429378662382, 'learning_rate': 1.9285032497475876e-07, 'epoch': 0.91} 91%|█████████▏| 31329/34278 [34:17:38<2:59:50, 3.66s/it] 91%|█████████▏| 31330/34278 [34:17:42<2:57:34, 3.61s/it] {'loss': 0.0981, 'grad_norm': 0.6616439365790598, 'learning_rate': 1.927204034380198e-07, 'epoch': 0.91} 91%|█████████▏| 31330/34278 [34:17:42<2:57:34, 3.61s/it] 91%|█████████▏| 31331/34278 [34:17:45<2:52:24, 3.51s/it] {'loss': 0.1087, 'grad_norm': 0.7877981710024274, 'learning_rate': 1.9259052481952534e-07, 'epoch': 0.91} 91%|█████████▏| 31331/34278 [34:17:45<2:52:24, 3.51s/it] 91%|█████████▏| 31332/34278 [34:17:48<2:41:08, 3.28s/it] {'loss': 0.1068, 'grad_norm': 1.1921717562156433, 'learning_rate': 1.9246068912043504e-07, 'epoch': 0.91} 91%|█████████▏| 31332/34278 [34:17:48<2:41:08, 3.28s/it] 
91%|█████████▏| 31333/34278 [34:17:51<2:37:35, 3.21s/it] {'loss': 0.1142, 'grad_norm': 0.674428734976857, 'learning_rate': 1.9233089634190794e-07, 'epoch': 0.91} 91%|█████████▏| 31333/34278 [34:17:51<2:37:35, 3.21s/it] 91%|█████████▏| 31334/34278 [34:17:56<3:12:25, 3.92s/it] {'loss': 0.136, 'grad_norm': 0.9418225486542654, 'learning_rate': 1.9220114648510259e-07, 'epoch': 0.91} 91%|█████████▏| 31334/34278 [34:17:56<3:12:25, 3.92s/it] 91%|█████████▏| 31335/34278 [34:17:59<2:54:17, 3.55s/it] {'loss': 0.0952, 'grad_norm': 0.819986673302682, 'learning_rate': 1.9207143955117858e-07, 'epoch': 0.91} 91%|█████████▏| 31335/34278 [34:17:59<2:54:17, 3.55s/it] 91%|█████████▏| 31336/34278 [34:18:03<3:08:14, 3.84s/it] {'loss': 0.114, 'grad_norm': 0.9348377373039002, 'learning_rate': 1.919417755412928e-07, 'epoch': 0.91} 91%|█████████▏| 31336/34278 [34:18:03<3:08:14, 3.84s/it] 91%|█████████▏| 31337/34278 [34:18:07<2:58:51, 3.65s/it] {'loss': 0.1363, 'grad_norm': 0.833627907377239, 'learning_rate': 1.918121544566026e-07, 'epoch': 0.91} 91%|█████████▏| 31337/34278 [34:18:07<2:58:51, 3.65s/it] 91%|█████████▏| 31338/34278 [34:18:10<2:53:01, 3.53s/it] {'loss': 0.123, 'grad_norm': 0.9850077049898679, 'learning_rate': 1.9168257629826604e-07, 'epoch': 0.91} 91%|█████████▏| 31338/34278 [34:18:10<2:53:01, 3.53s/it] 91%|█████████▏| 31339/34278 [34:18:13<2:44:50, 3.37s/it] {'loss': 0.0977, 'grad_norm': 0.6545767772470965, 'learning_rate': 1.9155304106743932e-07, 'epoch': 0.91} 91%|█████████▏| 31339/34278 [34:18:13<2:44:50, 3.37s/it] 91%|█████████▏| 31340/34278 [34:18:16<2:48:53, 3.45s/it] {'loss': 0.1189, 'grad_norm': 1.2227961436415566, 'learning_rate': 1.9142354876527935e-07, 'epoch': 0.91} 91%|█████████▏| 31340/34278 [34:18:16<2:48:53, 3.45s/it] 91%|█████████▏| 31341/34278 [34:18:20<2:49:19, 3.46s/it] {'loss': 0.1173, 'grad_norm': 0.6897087811078801, 'learning_rate': 1.9129409939294185e-07, 'epoch': 0.91} 91%|█████████▏| 31341/34278 [34:18:20<2:49:19, 3.46s/it] 91%|█████████▏| 31342/34278 
[34:18:23<2:44:50, 3.37s/it] {'loss': 0.1249, 'grad_norm': 1.1786573132138842, 'learning_rate': 1.9116469295158312e-07, 'epoch': 0.91} 91%|█████████▏| 31342/34278 [34:18:23<2:44:50, 3.37s/it] 91%|█████████▏| 31343/34278 [34:18:27<2:48:53, 3.45s/it] {'loss': 0.0996, 'grad_norm': 0.9441161938170052, 'learning_rate': 1.9103532944235781e-07, 'epoch': 0.91} 91%|█████████▏| 31343/34278 [34:18:27<2:48:53, 3.45s/it] 91%|█████████▏| 31344/34278 [34:18:30<2:43:25, 3.34s/it] {'loss': 0.1031, 'grad_norm': 0.7107152263994747, 'learning_rate': 1.9090600886642109e-07, 'epoch': 0.91} 91%|█████████▏| 31344/34278 [34:18:30<2:43:25, 3.34s/it] 91%|█████████▏| 31345/34278 [34:18:35<3:11:11, 3.91s/it] {'loss': 0.107, 'grad_norm': 0.9142669952677954, 'learning_rate': 1.9077673122492702e-07, 'epoch': 0.91} 91%|█████████▏| 31345/34278 [34:18:35<3:11:11, 3.91s/it] 91%|█████████▏| 31346/34278 [34:18:39<3:11:24, 3.92s/it] {'loss': 0.1267, 'grad_norm': 0.8485615026657264, 'learning_rate': 1.906474965190308e-07, 'epoch': 0.91} 91%|█████████▏| 31346/34278 [34:18:39<3:11:24, 3.92s/it] 91%|█████████▏| 31347/34278 [34:18:45<3:34:58, 4.40s/it] {'loss': 0.1069, 'grad_norm': 0.6411471885901804, 'learning_rate': 1.9051830474988597e-07, 'epoch': 0.91} 91%|█████████▏| 31347/34278 [34:18:45<3:34:58, 4.40s/it] 91%|█████████▏| 31348/34278 [34:18:47<3:09:13, 3.87s/it] {'loss': 0.1032, 'grad_norm': 0.8686416439086416, 'learning_rate': 1.9038915591864493e-07, 'epoch': 0.91} 91%|█████████▏| 31348/34278 [34:18:47<3:09:13, 3.87s/it] 91%|█████████▏| 31349/34278 [34:18:50<2:57:00, 3.63s/it] {'loss': 0.1239, 'grad_norm': 0.8858180694698129, 'learning_rate': 1.9026005002646174e-07, 'epoch': 0.91} 91%|█████████▏| 31349/34278 [34:18:50<2:57:00, 3.63s/it] 91%|█████████▏| 31350/34278 [34:18:54<2:55:12, 3.59s/it] {'loss': 0.1051, 'grad_norm': 0.7773263938989426, 'learning_rate': 1.9013098707448885e-07, 'epoch': 0.91} 91%|█████████▏| 31350/34278 [34:18:54<2:55:12, 3.59s/it] 91%|█████████▏| 31351/34278 [34:18:57<2:51:00, 
3.51s/it] {'loss': 0.1083, 'grad_norm': 0.9766792669376659, 'learning_rate': 1.9000196706387697e-07, 'epoch': 0.91} 91%|█████████▏| 31351/34278 [34:18:57<2:51:00, 3.51s/it] 91%|█████████▏| 31352/34278 [34:19:00<2:43:53, 3.36s/it] {'loss': 0.1061, 'grad_norm': 0.8726764601428715, 'learning_rate': 1.8987298999578076e-07, 'epoch': 0.91} 91%|█████████▏| 31352/34278 [34:19:00<2:43:53, 3.36s/it] 91%|█████████▏| 31353/34278 [34:19:03<2:41:46, 3.32s/it] {'loss': 0.0984, 'grad_norm': 0.8840248440184265, 'learning_rate': 1.897440558713498e-07, 'epoch': 0.91} 91%|█████████▏| 31353/34278 [34:19:03<2:41:46, 3.32s/it] 91%|█████████▏| 31354/34278 [34:19:07<2:46:53, 3.42s/it] {'loss': 0.1066, 'grad_norm': 0.8373228389039238, 'learning_rate': 1.8961516469173547e-07, 'epoch': 0.91} 91%|█████████▏| 31354/34278 [34:19:07<2:46:53, 3.42s/it] 91%|█████████▏| 31355/34278 [34:19:13<3:22:13, 4.15s/it] {'loss': 0.1168, 'grad_norm': 0.771490868722007, 'learning_rate': 1.89486316458089e-07, 'epoch': 0.91} 91%|█████████▏| 31355/34278 [34:19:13<3:22:13, 4.15s/it] 91%|█████████▏| 31356/34278 [34:19:16<3:08:14, 3.87s/it] {'loss': 0.1254, 'grad_norm': 0.894960769756662, 'learning_rate': 1.8935751117156008e-07, 'epoch': 0.91} 91%|█████████▏| 31356/34278 [34:19:16<3:08:14, 3.87s/it] 91%|█████████▏| 31357/34278 [34:19:19<2:59:02, 3.68s/it] {'loss': 0.0965, 'grad_norm': 0.7806313323142715, 'learning_rate': 1.8922874883329888e-07, 'epoch': 0.91} 91%|█████████▏| 31357/34278 [34:19:19<2:59:02, 3.68s/it] 91%|█████████▏| 31358/34278 [34:19:23<3:00:03, 3.70s/it] {'loss': 0.1062, 'grad_norm': 0.6557312970640437, 'learning_rate': 1.8910002944445448e-07, 'epoch': 0.91} 91%|█████████▏| 31358/34278 [34:19:23<3:00:03, 3.70s/it] 91%|█████████▏| 31359/34278 [34:19:26<2:51:02, 3.52s/it] {'loss': 0.1101, 'grad_norm': 0.6793510244450381, 'learning_rate': 1.8897135300617708e-07, 'epoch': 0.91} 91%|█████████▏| 31359/34278 [34:19:26<2:51:02, 3.52s/it] 91%|█████████▏| 31360/34278 [34:19:29<2:45:08, 3.40s/it] {'loss': 
0.1169, 'grad_norm': 0.9202773589720421, 'learning_rate': 1.888427195196152e-07, 'epoch': 0.91} 91%|█████████▏| 31360/34278 [34:19:29<2:45:08, 3.40s/it] 91%|█████████▏| 31361/34278 [34:19:35<3:24:27, 4.21s/it] {'loss': 0.1034, 'grad_norm': 0.6585451880768798, 'learning_rate': 1.8871412898591678e-07, 'epoch': 0.91} 91%|█████████▏| 31361/34278 [34:19:35<3:24:27, 4.21s/it] 91%|█████████▏| 31362/34278 [34:19:38<3:08:22, 3.88s/it] {'loss': 0.1189, 'grad_norm': 0.947702899641979, 'learning_rate': 1.8858558140622928e-07, 'epoch': 0.91} 91%|█████████▏| 31362/34278 [34:19:38<3:08:22, 3.88s/it] 91%|█████████▏| 31363/34278 [34:19:45<3:42:52, 4.59s/it] {'loss': 0.0929, 'grad_norm': 0.7113266911015266, 'learning_rate': 1.8845707678170232e-07, 'epoch': 0.91} 91%|█████████▏| 31363/34278 [34:19:45<3:42:52, 4.59s/it] 91%|█████████▏| 31364/34278 [34:19:48<3:19:10, 4.10s/it] {'loss': 0.1026, 'grad_norm': 0.7883772252736441, 'learning_rate': 1.883286151134811e-07, 'epoch': 0.91} 91%|█████████▏| 31364/34278 [34:19:48<3:19:10, 4.10s/it] 92%|█████████▏| 31365/34278 [34:19:51<3:05:54, 3.83s/it] {'loss': 0.1091, 'grad_norm': 0.6954327366328499, 'learning_rate': 1.8820019640271414e-07, 'epoch': 0.92} 92%|█████████▏| 31365/34278 [34:19:51<3:05:54, 3.83s/it] 92%|█████████▏| 31366/34278 [34:19:55<3:06:00, 3.83s/it] {'loss': 0.1197, 'grad_norm': 0.8511500789027039, 'learning_rate': 1.880718206505472e-07, 'epoch': 0.92} 92%|█████████▏| 31366/34278 [34:19:55<3:06:00, 3.83s/it] 92%|█████████▏| 31367/34278 [34:19:58<2:55:52, 3.62s/it] {'loss': 0.1044, 'grad_norm': 1.0019832829025392, 'learning_rate': 1.8794348785812545e-07, 'epoch': 0.92} 92%|█████████▏| 31367/34278 [34:19:58<2:55:52, 3.62s/it] 92%|█████████▏| 31368/34278 [34:20:02<3:11:00, 3.94s/it] {'loss': 0.1213, 'grad_norm': 0.790297852887975, 'learning_rate': 1.8781519802659577e-07, 'epoch': 0.92} 92%|█████████▏| 31368/34278 [34:20:02<3:11:00, 3.94s/it] 92%|█████████▏| 31369/34278 [34:20:06<3:02:48, 3.77s/it] {'loss': 0.1045, 'grad_norm': 
0.6716434249793395, 'learning_rate': 1.876869511571039e-07, 'epoch': 0.92} 92%|█████████▏| 31369/34278 [34:20:06<3:02:48, 3.77s/it] 92%|█████████▏| 31370/34278 [34:20:08<2:46:06, 3.43s/it] {'loss': 0.1276, 'grad_norm': 0.922704496735113, 'learning_rate': 1.8755874725079394e-07, 'epoch': 0.92} 92%|█████████▏| 31370/34278 [34:20:08<2:46:06, 3.43s/it] 92%|█████████▏| 31371/34278 [34:20:12<2:46:04, 3.43s/it] {'loss': 0.0853, 'grad_norm': 0.7622788651070335, 'learning_rate': 1.8743058630880993e-07, 'epoch': 0.92} 92%|█████████▏| 31371/34278 [34:20:12<2:46:04, 3.43s/it] 92%|█████████▏| 31372/34278 [34:20:15<2:37:59, 3.26s/it] {'loss': 0.1099, 'grad_norm': 0.8387169493634814, 'learning_rate': 1.8730246833229772e-07, 'epoch': 0.92} 92%|█████████▏| 31372/34278 [34:20:15<2:37:59, 3.26s/it] 92%|█████████▏| 31373/34278 [34:20:19<2:44:51, 3.41s/it] {'loss': 0.111, 'grad_norm': 0.6763488529931732, 'learning_rate': 1.8717439332240017e-07, 'epoch': 0.92} 92%|█████████▏| 31373/34278 [34:20:19<2:44:51, 3.41s/it] 92%|█████████▏| 31374/34278 [34:20:21<2:38:23, 3.27s/it] {'loss': 0.1099, 'grad_norm': 0.8951673166253913, 'learning_rate': 1.8704636128025978e-07, 'epoch': 0.92} 92%|█████████▏| 31374/34278 [34:20:22<2:38:23, 3.27s/it] 92%|█████████▏| 31375/34278 [34:20:26<2:55:40, 3.63s/it] {'loss': 0.1041, 'grad_norm': 0.8297555505125109, 'learning_rate': 1.8691837220702113e-07, 'epoch': 0.92} 92%|█████████▏| 31375/34278 [34:20:26<2:55:40, 3.63s/it] 92%|█████████▏| 31376/34278 [34:20:32<3:29:58, 4.34s/it] {'loss': 0.1201, 'grad_norm': 0.9665615025615534, 'learning_rate': 1.8679042610382613e-07, 'epoch': 0.92} 92%|█████████▏| 31376/34278 [34:20:32<3:29:58, 4.34s/it] 92%|█████████▏| 31377/34278 [34:20:36<3:23:23, 4.21s/it] {'loss': 0.1184, 'grad_norm': 0.9289330274306193, 'learning_rate': 1.8666252297181776e-07, 'epoch': 0.92} 92%|█████████▏| 31377/34278 [34:20:36<3:23:23, 4.21s/it] 92%|█████████▏| 31378/34278 [34:20:41<3:37:50, 4.51s/it] {'loss': 0.1215, 'grad_norm': 1.014049638060475, 
'learning_rate': 1.865346628121367e-07, 'epoch': 0.92} 92%|█████████▏| 31378/34278 [34:20:41<3:37:50, 4.51s/it] 92%|█████████▏| 31379/34278 [34:20:47<3:59:53, 4.96s/it] {'loss': 0.1107, 'grad_norm': 0.9387538042209965, 'learning_rate': 1.8640684562592548e-07, 'epoch': 0.92} 92%|█████████▏| 31379/34278 [34:20:47<3:59:53, 4.96s/it] 92%|█████████▏| 31380/34278 [34:20:50<3:35:33, 4.46s/it] {'loss': 0.143, 'grad_norm': 0.7807314757172528, 'learning_rate': 1.8627907141432422e-07, 'epoch': 0.92} 92%|█████████▏| 31380/34278 [34:20:50<3:35:33, 4.46s/it] 92%|█████████▏| 31381/34278 [34:20:53<3:13:20, 4.00s/it] {'loss': 0.1143, 'grad_norm': 0.8032832425950431, 'learning_rate': 1.8615134017847426e-07, 'epoch': 0.92} 92%|█████████▏| 31381/34278 [34:20:53<3:13:20, 4.00s/it] 92%|█████████▏| 31382/34278 [34:20:56<3:00:48, 3.75s/it] {'loss': 0.1098, 'grad_norm': 0.9384815469369953, 'learning_rate': 1.8602365191951687e-07, 'epoch': 0.92} 92%|█████████▏| 31382/34278 [34:20:56<3:00:48, 3.75s/it] 92%|█████████▏| 31383/34278 [34:21:02<3:32:17, 4.40s/it] {'loss': 0.0858, 'grad_norm': 1.1222528703641497, 'learning_rate': 1.858960066385912e-07, 'epoch': 0.92} 92%|█████████▏| 31383/34278 [34:21:02<3:32:17, 4.40s/it] 92%|█████████▏| 31384/34278 [34:21:05<3:10:03, 3.94s/it] {'loss': 0.1175, 'grad_norm': 0.7615160881334084, 'learning_rate': 1.8576840433683574e-07, 'epoch': 0.92} 92%|█████████▏| 31384/34278 [34:21:05<3:10:03, 3.94s/it] 92%|█████████▏| 31385/34278 [34:21:08<2:52:46, 3.58s/it] {'loss': 0.0987, 'grad_norm': 1.0065607394870228, 'learning_rate': 1.8564084501539181e-07, 'epoch': 0.92} 92%|█████████▏| 31385/34278 [34:21:08<2:52:46, 3.58s/it] 92%|█████████▏| 31386/34278 [34:21:11<2:50:22, 3.53s/it] {'loss': 0.0969, 'grad_norm': 0.9255308678433942, 'learning_rate': 1.8551332867539572e-07, 'epoch': 0.92} 92%|█████████▏| 31386/34278 [34:21:11<2:50:22, 3.53s/it] 92%|█████████▏| 31387/34278 [34:21:15<2:45:58, 3.44s/it] {'loss': 0.1391, 'grad_norm': 0.930750996718049, 'learning_rate': 
1.8538585531798881e-07, 'epoch': 0.92} 92%|█████████▏| 31387/34278 [34:21:15<2:45:58, 3.44s/it] 92%|█████████▏| 31388/34278 [34:21:17<2:36:23, 3.25s/it] {'loss': 0.1052, 'grad_norm': 0.8228319762684867, 'learning_rate': 1.852584249443068e-07, 'epoch': 0.92} 92%|█████████▏| 31388/34278 [34:21:17<2:36:23, 3.25s/it] 92%|█████████▏| 31389/34278 [34:21:20<2:32:41, 3.17s/it] {'loss': 0.0928, 'grad_norm': 0.8235112268013984, 'learning_rate': 1.8513103755548822e-07, 'epoch': 0.92} 92%|█████████▏| 31389/34278 [34:21:20<2:32:41, 3.17s/it] 92%|█████████▏| 31390/34278 [34:21:27<3:16:03, 4.07s/it] {'loss': 0.0915, 'grad_norm': 0.7145745651504304, 'learning_rate': 1.8500369315267108e-07, 'epoch': 0.92} 92%|█████████▏| 31390/34278 [34:21:27<3:16:03, 4.07s/it] 92%|█████████▏| 31391/34278 [34:21:30<3:05:36, 3.86s/it] {'loss': 0.1101, 'grad_norm': 0.7421971743172168, 'learning_rate': 1.8487639173699057e-07, 'epoch': 0.92} 92%|█████████▏| 31391/34278 [34:21:30<3:05:36, 3.86s/it] 92%|█████████▏| 31392/34278 [34:21:33<2:53:57, 3.62s/it] {'loss': 0.1162, 'grad_norm': 0.9867879653682757, 'learning_rate': 1.847491333095841e-07, 'epoch': 0.92} 92%|█████████▏| 31392/34278 [34:21:33<2:53:57, 3.62s/it] 92%|█████████▏| 31393/34278 [34:21:37<2:55:05, 3.64s/it] {'loss': 0.0963, 'grad_norm': 0.843077088935799, 'learning_rate': 1.8462191787158855e-07, 'epoch': 0.92} 92%|█████████▏| 31393/34278 [34:21:37<2:55:05, 3.64s/it] 92%|█████████▏| 31394/34278 [34:21:40<2:45:51, 3.45s/it] {'loss': 0.1055, 'grad_norm': 0.7220111672319596, 'learning_rate': 1.8449474542413858e-07, 'epoch': 0.92} 92%|█████████▏| 31394/34278 [34:21:40<2:45:51, 3.45s/it] 92%|█████████▏| 31395/34278 [34:21:44<2:52:50, 3.60s/it] {'loss': 0.1162, 'grad_norm': 0.8945048968438312, 'learning_rate': 1.843676159683705e-07, 'epoch': 0.92} 92%|█████████▏| 31395/34278 [34:21:44<2:52:50, 3.60s/it] 92%|█████████▏| 31396/34278 [34:21:49<3:11:40, 3.99s/it] {'loss': 0.1001, 'grad_norm': 0.8184157935974352, 'learning_rate': 1.8424052950541892e-07, 
'epoch': 0.92} 92%|█████████▏| 31396/34278 [34:21:49<3:11:40, 3.99s/it] 92%|█████████▏| 31397/34278 [34:21:51<2:55:35, 3.66s/it] {'loss': 0.1094, 'grad_norm': 0.9280777362350853, 'learning_rate': 1.8411348603641743e-07, 'epoch': 0.92} 92%|█████████▏| 31397/34278 [34:21:51<2:55:35, 3.66s/it] 92%|█████████▏| 31398/34278 [34:21:54<2:45:58, 3.46s/it] {'loss': 0.1035, 'grad_norm': 0.7392705056496901, 'learning_rate': 1.8398648556250122e-07, 'epoch': 0.92} 92%|█████████▏| 31398/34278 [34:21:54<2:45:58, 3.46s/it] 92%|█████████▏| 31399/34278 [34:21:57<2:35:14, 3.24s/it] {'loss': 0.1192, 'grad_norm': 0.8085351305562487, 'learning_rate': 1.8385952808480434e-07, 'epoch': 0.92} 92%|█████████▏| 31399/34278 [34:21:57<2:35:14, 3.24s/it] 92%|█████████▏| 31400/34278 [34:22:00<2:34:52, 3.23s/it] {'loss': 0.1095, 'grad_norm': 0.927287870661808, 'learning_rate': 1.8373261360445983e-07, 'epoch': 0.92} 92%|█████████▏| 31400/34278 [34:22:00<2:34:52, 3.23s/it] 92%|█████████▏| 31401/34278 [34:22:04<2:40:20, 3.34s/it] {'loss': 0.121, 'grad_norm': 0.8250135573400705, 'learning_rate': 1.8360574212260063e-07, 'epoch': 0.92} 92%|█████████▏| 31401/34278 [34:22:04<2:40:20, 3.34s/it] 92%|█████████▏| 31402/34278 [34:22:08<2:49:29, 3.54s/it] {'loss': 0.1178, 'grad_norm': 0.8695358461892482, 'learning_rate': 1.8347891364035974e-07, 'epoch': 0.92} 92%|█████████▏| 31402/34278 [34:22:08<2:49:29, 3.54s/it] 92%|█████████▏| 31403/34278 [34:22:12<2:56:18, 3.68s/it] {'loss': 0.1341, 'grad_norm': 0.9794220199794262, 'learning_rate': 1.833521281588696e-07, 'epoch': 0.92} 92%|█████████▏| 31403/34278 [34:22:12<2:56:18, 3.68s/it] 92%|█████████▏| 31404/34278 [34:22:15<2:49:24, 3.54s/it] {'loss': 0.1042, 'grad_norm': 0.9536576826698145, 'learning_rate': 1.8322538567926152e-07, 'epoch': 0.92} 92%|█████████▏| 31404/34278 [34:22:15<2:49:24, 3.54s/it] 92%|█████████▏| 31405/34278 [34:22:19<2:54:09, 3.64s/it] {'loss': 0.1068, 'grad_norm': 0.773398645910214, 'learning_rate': 1.830986862026668e-07, 'epoch': 0.92} 
92%|█████████▏| 31405/34278 [34:22:19<2:54:09, 3.64s/it] 92%|█████████▏| 31406/34278 [34:22:23<2:51:44, 3.59s/it] {'loss': 0.1204, 'grad_norm': 0.834665420479556, 'learning_rate': 1.8297202973021787e-07, 'epoch': 0.92} 92%|█████████▏| 31406/34278 [34:22:23<2:51:44, 3.59s/it] 92%|█████████▏| 31407/34278 [34:22:26<2:46:51, 3.49s/it] {'loss': 0.0881, 'grad_norm': 1.12506311343205, 'learning_rate': 1.8284541626304496e-07, 'epoch': 0.92} 92%|█████████▏| 31407/34278 [34:22:26<2:46:51, 3.49s/it] 92%|█████████▏| 31408/34278 [34:22:32<3:24:05, 4.27s/it] {'loss': 0.1263, 'grad_norm': 0.7596616311493886, 'learning_rate': 1.8271884580227716e-07, 'epoch': 0.92} 92%|█████████▏| 31408/34278 [34:22:32<3:24:05, 4.27s/it] 92%|█████████▏| 31409/34278 [34:22:35<3:13:27, 4.05s/it] {'loss': 0.1047, 'grad_norm': 0.7707059374752233, 'learning_rate': 1.8259231834904689e-07, 'epoch': 0.92} 92%|█████████▏| 31409/34278 [34:22:35<3:13:27, 4.05s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f0cab9fbfb0>
Failed to fetch sample 3187195. Exception: cannot identify image file <_io.BytesIO object at 0x7f0cab9fbfb0>
92%|█████████▏| 31410/34278 [34:22:38<2:59:18, 3.75s/it] {'loss': 0.1105, 'grad_norm': 0.7854974420155609, 'learning_rate': 1.8246583390448102e-07, 'epoch': 0.92} 92%|█████████▏| 31410/34278 [34:22:38<2:59:18, 3.75s/it] 92%|█████████▏| 31411/34278 [34:22:42<2:49:42, 3.55s/it] {'loss': 0.0884, 'grad_norm': 0.8750514815335055, 'learning_rate': 1.823393924697109e-07, 'epoch': 0.92} 92%|█████████▏| 31411/34278 [34:22:42<2:49:42, 3.55s/it] 92%|█████████▏| 31412/34278 [34:22:46<3:05:29, 3.88s/it] {'loss': 0.1129, 'grad_norm': 0.947681277184962, 'learning_rate': 1.8221299404586445e-07, 'epoch': 0.92} 92%|█████████▏| 31412/34278 [34:22:46<3:05:29, 3.88s/it] 92%|█████████▏| 31413/34278 [34:22:49<2:51:16, 3.59s/it] {'loss': 0.121, 'grad_norm': 0.9993209605677023, 'learning_rate': 1.8208663863407083e-07, 'epoch': 0.92} 92%|█████████▏| 31413/34278 [34:22:49<2:51:16, 3.59s/it] 92%|█████████▏| 31414/34278 [34:22:55<3:24:46, 4.29s/it] {'loss': 0.0999, 'grad_norm': 0.8050607414401262, 'learning_rate': 1.819603262354569e-07, 'epoch': 0.92} 92%|█████████▏| 31414/34278 [34:22:55<3:24:46, 4.29s/it] 92%|█████████▏| 31415/34278 [34:22:59<3:21:12, 4.22s/it] {'loss': 0.1305, 'grad_norm': 0.8008160192839413, 'learning_rate': 1.818340568511512e-07, 'epoch': 0.92} 92%|█████████▏| 31415/34278 [34:22:59<3:21:12, 4.22s/it] 92%|█████████▏| 31416/34278 [34:23:02<3:03:43, 3.85s/it] {'loss': 0.1149, 'grad_norm': 0.875097615447291, 'learning_rate': 1.8170783048228057e-07, 'epoch': 0.92} 92%|█████████▏| 31416/34278 [34:23:02<3:03:43, 3.85s/it] 92%|█████████▏| 31417/34278 [34:23:08<3:34:52, 4.51s/it] {'loss': 0.1067, 'grad_norm':
0.8675509678375348, 'learning_rate': 1.8158164712997306e-07, 'epoch': 0.92} 92%|█████████▏| 31417/34278 [34:23:08<3:34:52, 4.51s/it] 92%|█████████▏| 31418/34278 [34:23:12<3:26:22, 4.33s/it] {'loss': 0.1177, 'grad_norm': 0.7910112209571196, 'learning_rate': 1.8145550679535329e-07, 'epoch': 0.92} 92%|█████████▏| 31418/34278 [34:23:12<3:26:22, 4.33s/it] 92%|█████████▏| 31419/34278 [34:23:18<3:49:44, 4.82s/it] {'loss': 0.1315, 'grad_norm': 0.8016354528718228, 'learning_rate': 1.8132940947954924e-07, 'epoch': 0.92} 92%|█████████▏| 31419/34278 [34:23:18<3:49:44, 4.82s/it] 92%|█████████▏| 31420/34278 [34:23:21<3:25:37, 4.32s/it] {'loss': 0.1057, 'grad_norm': 0.981834198358148, 'learning_rate': 1.8120335518368614e-07, 'epoch': 0.92} 92%|█████████▏| 31420/34278 [34:23:21<3:25:37, 4.32s/it] 92%|█████████▏| 31421/34278 [34:23:27<3:43:17, 4.69s/it] {'loss': 0.1175, 'grad_norm': 0.823750186569676, 'learning_rate': 1.8107734390888809e-07, 'epoch': 0.92} 92%|█████████▏| 31421/34278 [34:23:27<3:43:17, 4.69s/it] 92%|█████████▏| 31422/34278 [34:23:31<3:41:23, 4.65s/it] {'loss': 0.1117, 'grad_norm': 0.9265404485278054, 'learning_rate': 1.809513756562814e-07, 'epoch': 0.92} 92%|█████████▏| 31422/34278 [34:23:31<3:41:23, 4.65s/it] 92%|█████████▏| 31423/34278 [34:23:35<3:34:23, 4.51s/it] {'loss': 0.0889, 'grad_norm': 0.8217818917326766, 'learning_rate': 1.808254504269913e-07, 'epoch': 0.92} 92%|█████████▏| 31423/34278 [34:23:36<3:34:23, 4.51s/it] 92%|█████████▏| 31424/34278 [34:23:39<3:27:01, 4.35s/it] {'loss': 0.108, 'grad_norm': 0.7781014107621114, 'learning_rate': 1.8069956822214018e-07, 'epoch': 0.92} 92%|█████████▏| 31424/34278 [34:23:39<3:27:01, 4.35s/it] 92%|█████████▏| 31425/34278 [34:23:43<3:14:01, 4.08s/it] {'loss': 0.1091, 'grad_norm': 0.8778356214948702, 'learning_rate': 1.805737290428533e-07, 'epoch': 0.92} 92%|█████████▏| 31425/34278 [34:23:43<3:14:01, 4.08s/it] 92%|█████████▏| 31426/34278 [34:23:47<3:14:29, 4.09s/it] {'loss': 0.1044, 'grad_norm': 0.6689752395268365, 
'learning_rate': 1.804479328902542e-07, 'epoch': 0.92} 92%|█████████▏| 31426/34278 [34:23:47<3:14:29, 4.09s/it] 92%|█████████▏| 31427/34278 [34:23:50<3:00:54, 3.81s/it] {'loss': 0.1035, 'grad_norm': 0.9883087329036726, 'learning_rate': 1.8032217976546418e-07, 'epoch': 0.92} 92%|█████████▏| 31427/34278 [34:23:50<3:00:54, 3.81s/it] 92%|█████████▏| 31428/34278 [34:23:54<2:55:31, 3.70s/it] {'loss': 0.1163, 'grad_norm': 0.928078768675476, 'learning_rate': 1.801964696696079e-07, 'epoch': 0.92} 92%|█████████▏| 31428/34278 [34:23:54<2:55:31, 3.70s/it] 92%|█████████▏| 31429/34278 [34:23:59<3:26:53, 4.36s/it] {'loss': 0.1364, 'grad_norm': 0.8242058167547271, 'learning_rate': 1.8007080260380727e-07, 'epoch': 0.92} 92%|█████████▏| 31429/34278 [34:23:59<3:26:53, 4.36s/it] 92%|█████████▏| 31430/34278 [34:24:02<3:04:46, 3.89s/it] {'loss': 0.1061, 'grad_norm': 0.9462811646216613, 'learning_rate': 1.7994517856918359e-07, 'epoch': 0.92} 92%|█████████▏| 31430/34278 [34:24:02<3:04:46, 3.89s/it] 92%|█████████▏| 31431/34278 [34:24:05<2:54:45, 3.68s/it] {'loss': 0.1091, 'grad_norm': 0.721219548430664, 'learning_rate': 1.7981959756685875e-07, 'epoch': 0.92} 92%|█████████▏| 31431/34278 [34:24:05<2:54:45, 3.68s/it] 92%|█████████▏| 31432/34278 [34:24:10<3:10:02, 4.01s/it] {'loss': 0.1011, 'grad_norm': 0.848426286099118, 'learning_rate': 1.7969405959795404e-07, 'epoch': 0.92} 92%|█████████▏| 31432/34278 [34:24:10<3:10:02, 4.01s/it] 92%|█████████▏| 31433/34278 [34:24:14<3:06:36, 3.94s/it] {'loss': 0.1313, 'grad_norm': 0.7847501625730108, 'learning_rate': 1.7956856466358974e-07, 'epoch': 0.92} 92%|█████████▏| 31433/34278 [34:24:14<3:06:36, 3.94s/it] 92%|█████████▏| 31434/34278 [34:24:18<3:14:37, 4.11s/it] {'loss': 0.1344, 'grad_norm': 0.8439447752271549, 'learning_rate': 1.7944311276488656e-07, 'epoch': 0.92} 92%|█████████▏| 31434/34278 [34:24:19<3:14:37, 4.11s/it] 92%|█████████▏| 31435/34278 [34:24:22<3:00:20, 3.81s/it] {'loss': 0.1318, 'grad_norm': 0.886942847069556, 'learning_rate': 
1.7931770390296423e-07, 'epoch': 0.92} 92%|█████████▏| 31435/34278 [34:24:22<3:00:20, 3.81s/it] 92%|█████████▏| 31436/34278 [34:24:25<2:51:24, 3.62s/it] {'loss': 0.0984, 'grad_norm': 0.9204156745615503, 'learning_rate': 1.7919233807894343e-07, 'epoch': 0.92} 92%|█████████▏| 31436/34278 [34:24:25<2:51:24, 3.62s/it] 92%|█████████▏| 31437/34278 [34:24:28<2:51:24, 3.62s/it] {'loss': 0.1084, 'grad_norm': 0.9114936709997368, 'learning_rate': 1.7906701529394277e-07, 'epoch': 0.92} 92%|█████████▏| 31437/34278 [34:24:28<2:51:24, 3.62s/it] 92%|█████████▏| 31438/34278 [34:24:34<3:23:28, 4.30s/it] {'loss': 0.1289, 'grad_norm': 1.0084742204612493, 'learning_rate': 1.7894173554907967e-07, 'epoch': 0.92} 92%|█████████▏| 31438/34278 [34:24:34<3:23:28, 4.30s/it] 92%|█████████▏| 31439/34278 [34:24:40<3:44:47, 4.75s/it] {'loss': 0.0979, 'grad_norm': 1.079762771034041, 'learning_rate': 1.7881649884547492e-07, 'epoch': 0.92} 92%|█████████▏| 31439/34278 [34:24:40<3:44:47, 4.75s/it] 92%|█████████▏| 31440/34278 [34:24:43<3:24:07, 4.32s/it] {'loss': 0.1246, 'grad_norm': 0.8832897504243681, 'learning_rate': 1.7869130518424538e-07, 'epoch': 0.92} 92%|█████████▏| 31440/34278 [34:24:43<3:24:07, 4.32s/it] 92%|█████████▏| 31441/34278 [34:24:46<3:06:09, 3.94s/it] {'loss': 0.1275, 'grad_norm': 0.7904649565451088, 'learning_rate': 1.785661545665085e-07, 'epoch': 0.92} 92%|█████████▏| 31441/34278 [34:24:46<3:06:09, 3.94s/it] 92%|█████████▏| 31442/34278 [34:24:49<2:48:36, 3.57s/it] {'loss': 0.1032, 'grad_norm': 0.9835395240531951, 'learning_rate': 1.7844104699338228e-07, 'epoch': 0.92} 92%|█████████▏| 31442/34278 [34:24:49<2:48:36, 3.57s/it] 92%|█████████▏| 31443/34278 [34:24:52<2:43:00, 3.45s/it] {'loss': 0.0801, 'grad_norm': 0.7235054219206365, 'learning_rate': 1.783159824659836e-07, 'epoch': 0.92} 92%|█████████▏| 31443/34278 [34:24:52<2:43:00, 3.45s/it] 92%|█████████▏| 31444/34278 [34:24:56<2:39:20, 3.37s/it] {'loss': 0.1281, 'grad_norm': 0.829841051846416, 'learning_rate': 1.7819096098542876e-07, 
'epoch': 0.92} 92%|█████████▏| 31444/34278 [34:24:56<2:39:20, 3.37s/it] 92%|█████████▏| 31445/34278 [34:24:59<2:35:38, 3.30s/it] {'loss': 0.1278, 'grad_norm': 0.9330809494992617, 'learning_rate': 1.7806598255283415e-07, 'epoch': 0.92} 92%|█████████▏| 31445/34278 [34:24:59<2:35:38, 3.30s/it] 92%|█████████▏| 31446/34278 [34:25:02<2:32:33, 3.23s/it] {'loss': 0.0967, 'grad_norm': 0.7713121767513006, 'learning_rate': 1.7794104716931437e-07, 'epoch': 0.92} 92%|█████████▏| 31446/34278 [34:25:02<2:32:33, 3.23s/it] 92%|█████████▏| 31447/34278 [34:25:08<3:14:54, 4.13s/it] {'loss': 0.0993, 'grad_norm': 0.7427712675609687, 'learning_rate': 1.778161548359869e-07, 'epoch': 0.92} 92%|█████████▏| 31447/34278 [34:25:08<3:14:54, 4.13s/it] 92%|█████████▏| 31448/34278 [34:25:14<3:39:54, 4.66s/it] {'loss': 0.1081, 'grad_norm': 0.7985539646604289, 'learning_rate': 1.7769130555396476e-07, 'epoch': 0.92} 92%|█████████▏| 31448/34278 [34:25:14<3:39:54, 4.66s/it] 92%|█████████▏| 31449/34278 [34:25:20<3:59:00, 5.07s/it] {'loss': 0.1267, 'grad_norm': 0.745822331716892, 'learning_rate': 1.775664993243642e-07, 'epoch': 0.92} 92%|█████████▏| 31449/34278 [34:25:20<3:59:00, 5.07s/it] 92%|█████████▏| 31450/34278 [34:25:25<3:58:44, 5.07s/it] {'loss': 0.1091, 'grad_norm': 0.6665221359510427, 'learning_rate': 1.7744173614829885e-07, 'epoch': 0.92} 92%|█████████▏| 31450/34278 [34:25:25<3:58:44, 5.07s/it] 92%|█████████▏| 31451/34278 [34:25:28<3:31:07, 4.48s/it] {'loss': 0.1176, 'grad_norm': 0.7913045617358165, 'learning_rate': 1.7731701602688168e-07, 'epoch': 0.92} 92%|█████████▏| 31451/34278 [34:25:28<3:31:07, 4.48s/it] 92%|█████████▏| 31452/34278 [34:25:31<3:12:00, 4.08s/it] {'loss': 0.1168, 'grad_norm': 0.8724016247432931, 'learning_rate': 1.7719233896122733e-07, 'epoch': 0.92} 92%|█████████▏| 31452/34278 [34:25:31<3:12:00, 4.08s/it] 92%|█████████▏| 31453/34278 [34:25:34<2:55:36, 3.73s/it] {'loss': 0.1, 'grad_norm': 0.8798424447611202, 'learning_rate': 1.7706770495244884e-07, 'epoch': 0.92} 
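The UnidentifiedImageError traceback above shows the dataloader hitting a corrupt image payload and recovering: it logs "Failed to fetch sample ..." and training continues with another sample. A minimal sketch of that skip-and-retry pattern, assuming a simple in-memory sample list; the `RobustDataset` wrapper name and the deterministic next-index fallback are illustrative, not the actual `aguvis/dataset.py` implementation:

```python
import io

from PIL import Image, UnidentifiedImageError


def pil_loader(img_bytes: bytes) -> Image.Image:
    """Decode raw bytes into an RGB PIL image; corrupt payloads raise
    PIL.UnidentifiedImageError, exactly as in the traceback above."""
    return Image.open(io.BytesIO(img_bytes)).convert("RGB")


class RobustDataset:
    """Hypothetical wrapper: on a decode failure, log the bad sample and
    fall back to another index instead of crashing the training run."""

    def __init__(self, samples: list, max_retries: int = 10):
        self.samples = samples
        self.max_retries = max_retries

    def __len__(self) -> int:
        return len(self.samples)

    def __getitem__(self, i: int) -> Image.Image:
        for _ in range(self.max_retries):
            try:
                return pil_loader(self.samples[i])
            except UnidentifiedImageError as e:
                # Mirrors the "Failed to fetch sample ..." line in the log.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)  # deterministic fallback index
        raise RuntimeError("too many consecutive undecodable samples")
```

Wrapping samples this way keeps a 34k-step run alive when object storage occasionally returns truncated or corrupt bytes, at the cost of silently substituting a different sample for the broken one.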
92%|█████████▏| 31453/34278 [34:25:34<2:55:36, 3.73s/it] 92%|█████████▏| 31454/34278 [34:25:37<2:41:20, 3.43s/it] {'loss': 0.1011, 'grad_norm': 0.9129538205177682, 'learning_rate': 1.7694311400165753e-07, 'epoch': 0.92} 92%|█████████▏| 31454/34278 [34:25:37<2:41:20, 3.43s/it] 92%|█████████▏| 31455/34278 [34:25:40<2:44:10, 3.49s/it] {'loss': 0.1063, 'grad_norm': 0.7626105906124192, 'learning_rate': 1.768185661099675e-07, 'epoch': 0.92} 92%|█████████▏| 31455/34278 [34:25:40<2:44:10, 3.49s/it] 92%|█████████▏| 31456/34278 [34:25:43<2:37:34, 3.35s/it] {'loss': 0.1225, 'grad_norm': 0.9591430922037872, 'learning_rate': 1.766940612784901e-07, 'epoch': 0.92} 92%|█████████▏| 31456/34278 [34:25:43<2:37:34, 3.35s/it] 92%|█████████▏| 31457/34278 [34:25:46<2:28:31, 3.16s/it] {'loss': 0.0987, 'grad_norm': 0.8823448331257628, 'learning_rate': 1.7656959950833608e-07, 'epoch': 0.92} 92%|█████████▏| 31457/34278 [34:25:46<2:28:31, 3.16s/it] 92%|█████████▏| 31458/34278 [34:25:52<3:09:28, 4.03s/it] {'loss': 0.1016, 'grad_norm': 0.6726474082165065, 'learning_rate': 1.7644518080061735e-07, 'epoch': 0.92} 92%|█████████▏| 31458/34278 [34:25:52<3:09:28, 4.03s/it] 92%|█████████▏| 31459/34278 [34:25:55<2:53:13, 3.69s/it] {'loss': 0.1399, 'grad_norm': 0.9953081672778715, 'learning_rate': 1.7632080515644523e-07, 'epoch': 0.92} 92%|█████████▏| 31459/34278 [34:25:55<2:53:13, 3.69s/it] 92%|█████████▏| 31460/34278 [34:25:58<2:39:26, 3.39s/it] {'loss': 0.0915, 'grad_norm': 1.0520249333383132, 'learning_rate': 1.761964725769294e-07, 'epoch': 0.92} 92%|█████████▏| 31460/34278 [34:25:58<2:39:26, 3.39s/it] 92%|█████████▏| 31461/34278 [34:26:02<2:47:24, 3.57s/it] {'loss': 0.09, 'grad_norm': 0.7312001105015509, 'learning_rate': 1.7607218306317896e-07, 'epoch': 0.92} 92%|█████████▏| 31461/34278 [34:26:02<2:47:24, 3.57s/it] 92%|█████████▏| 31462/34278 [34:26:05<2:43:58, 3.49s/it] {'loss': 0.1025, 'grad_norm': 1.160657903192297, 'learning_rate': 1.7594793661630526e-07, 'epoch': 0.92} 92%|█████████▏| 
31462/34278 [34:26:05<2:43:58, 3.49s/it] 92%|█████████▏| 31463/34278 [34:26:10<3:00:41, 3.85s/it] {'loss': 0.1258, 'grad_norm': 0.8381060018723425, 'learning_rate': 1.7582373323741686e-07, 'epoch': 0.92} 92%|█████████▏| 31463/34278 [34:26:10<3:00:41, 3.85s/it] 92%|█████████▏| 31464/34278 [34:26:14<3:03:37, 3.92s/it] {'loss': 0.118, 'grad_norm': 0.8742176123686887, 'learning_rate': 1.756995729276223e-07, 'epoch': 0.92} 92%|█████████▏| 31464/34278 [34:26:14<3:03:37, 3.92s/it] 92%|█████████▏| 31465/34278 [34:26:17<2:50:36, 3.64s/it] {'loss': 0.1119, 'grad_norm': 0.6928916489087649, 'learning_rate': 1.7557545568803014e-07, 'epoch': 0.92} 92%|█████████▏| 31465/34278 [34:26:17<2:50:36, 3.64s/it] 92%|█████████▏| 31466/34278 [34:26:23<3:26:23, 4.40s/it] {'loss': 0.1241, 'grad_norm': 0.7375050529998568, 'learning_rate': 1.7545138151974895e-07, 'epoch': 0.92} 92%|█████████▏| 31466/34278 [34:26:23<3:26:23, 4.40s/it] 92%|█████████▏| 31467/34278 [34:26:26<3:10:48, 4.07s/it] {'loss': 0.1073, 'grad_norm': 1.665030995723457, 'learning_rate': 1.7532735042388617e-07, 'epoch': 0.92} 92%|█████████▏| 31467/34278 [34:26:26<3:10:48, 4.07s/it] 92%|█████████▏| 31468/34278 [34:26:30<3:00:04, 3.84s/it] {'loss': 0.095, 'grad_norm': 0.8717161143773435, 'learning_rate': 1.7520336240154867e-07, 'epoch': 0.92} 92%|█████████▏| 31468/34278 [34:26:30<3:00:04, 3.84s/it] 92%|█████████▏| 31469/34278 [34:26:33<2:49:00, 3.61s/it] {'loss': 0.1024, 'grad_norm': 0.870272092108847, 'learning_rate': 1.7507941745384394e-07, 'epoch': 0.92} 92%|█████████▏| 31469/34278 [34:26:33<2:49:00, 3.61s/it] 92%|█████████▏| 31470/34278 [34:26:37<2:55:41, 3.75s/it] {'loss': 0.1045, 'grad_norm': 0.940668690315811, 'learning_rate': 1.749555155818783e-07, 'epoch': 0.92} 92%|█████████▏| 31470/34278 [34:26:37<2:55:41, 3.75s/it] 92%|█████████▏| 31471/34278 [34:26:40<2:50:26, 3.64s/it] {'loss': 0.0996, 'grad_norm': 0.6265336298709263, 'learning_rate': 1.748316567867575e-07, 'epoch': 0.92} 92%|█████████▏| 31471/34278 
[34:26:40<2:50:26, 3.64s/it] 92%|█████████▏| 31472/34278 [34:26:43<2:44:15, 3.51s/it] {'loss': 0.1025, 'grad_norm': 0.8199566455457374, 'learning_rate': 1.7470784106958903e-07, 'epoch': 0.92} 92%|█████████▏| 31472/34278 [34:26:43<2:44:15, 3.51s/it] 92%|█████████▏| 31473/34278 [34:26:47<2:48:13, 3.60s/it] {'loss': 0.134, 'grad_norm': 0.8002080741057659, 'learning_rate': 1.7458406843147647e-07, 'epoch': 0.92} 92%|█████████▏| 31473/34278 [34:26:47<2:48:13, 3.60s/it] 92%|█████████▏| 31474/34278 [34:26:50<2:34:23, 3.30s/it] {'loss': 0.121, 'grad_norm': 0.8333135633377379, 'learning_rate': 1.7446033887352498e-07, 'epoch': 0.92} 92%|█████████▏| 31474/34278 [34:26:50<2:34:23, 3.30s/it] 92%|█████████▏| 31475/34278 [34:26:53<2:36:29, 3.35s/it] {'loss': 0.1135, 'grad_norm': 0.8724163486160376, 'learning_rate': 1.7433665239684038e-07, 'epoch': 0.92} 92%|█████████▏| 31475/34278 [34:26:53<2:36:29, 3.35s/it] 92%|█████████▏| 31476/34278 [34:26:57<2:36:11, 3.34s/it] {'loss': 0.1351, 'grad_norm': 0.7607951231388181, 'learning_rate': 1.742130090025257e-07, 'epoch': 0.92} 92%|█████████▏| 31476/34278 [34:26:57<2:36:11, 3.34s/it] 92%|█████████▏| 31477/34278 [34:27:00<2:29:40, 3.21s/it] {'loss': 0.1216, 'grad_norm': 0.866945882665733, 'learning_rate': 1.7408940869168556e-07, 'epoch': 0.92} 92%|█████████▏| 31477/34278 [34:27:00<2:29:40, 3.21s/it] 92%|█████████▏| 31478/34278 [34:27:03<2:27:44, 3.17s/it] {'loss': 0.1126, 'grad_norm': 0.901991959683693, 'learning_rate': 1.7396585146542245e-07, 'epoch': 0.92} 92%|█████████▏| 31478/34278 [34:27:03<2:27:44, 3.17s/it] 92%|█████████▏| 31479/34278 [34:27:08<2:54:11, 3.73s/it] {'loss': 0.1195, 'grad_norm': 0.7517972495851978, 'learning_rate': 1.73842337324841e-07, 'epoch': 0.92} 92%|█████████▏| 31479/34278 [34:27:08<2:54:11, 3.73s/it] 92%|█████████▏| 31480/34278 [34:27:13<3:18:39, 4.26s/it] {'loss': 0.1282, 'grad_norm': 0.9077314335051077, 'learning_rate': 1.7371886627104317e-07, 'epoch': 0.92} 92%|█████████▏| 31480/34278 [34:27:13<3:18:39, 
4.26s/it] 92%|█████████▏| 31481/34278 [34:27:17<3:08:15, 4.04s/it] {'loss': 0.1065, 'grad_norm': 0.719193242973939, 'learning_rate': 1.7359543830513027e-07, 'epoch': 0.92} 92%|█████████▏| 31481/34278 [34:27:17<3:08:15, 4.04s/it] 92%|█████████▏| 31482/34278 [34:27:21<3:12:27, 4.13s/it] {'loss': 0.1167, 'grad_norm': 0.9507072030075335, 'learning_rate': 1.734720534282053e-07, 'epoch': 0.92} 92%|█████████▏| 31482/34278 [34:27:21<3:12:27, 4.13s/it] 92%|█████████▏| 31483/34278 [34:27:24<2:56:18, 3.78s/it] {'loss': 0.1118, 'grad_norm': 0.8482801160596916, 'learning_rate': 1.7334871164137013e-07, 'epoch': 0.92} 92%|█████████▏| 31483/34278 [34:27:24<2:56:18, 3.78s/it] 92%|█████████▏| 31484/34278 [34:27:29<3:19:53, 4.29s/it] {'loss': 0.1274, 'grad_norm': 0.7959575808115095, 'learning_rate': 1.7322541294572505e-07, 'epoch': 0.92} 92%|█████████▏| 31484/34278 [34:27:29<3:19:53, 4.29s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
92%|█████████▏| 31485/34278 [34:27:33<3:03:46, 3.95s/it] {'loss': 0.093, 'grad_norm': 0.8047235547623197, 'learning_rate': 1.731021573423719e-07, 'epoch': 0.92} 92%|█████████▏| 31485/34278 [34:27:33<3:03:46, 3.95s/it] 92%|█████████▏| 31486/34278 [34:27:35<2:48:08, 3.61s/it] {'loss': 0.1374, 'grad_norm': 0.7961735318616792, 'learning_rate': 1.7297894483240984e-07, 'epoch': 0.92} 92%|█████████▏| 31486/34278 [34:27:35<2:48:08, 3.61s/it] 92%|█████████▏| 31487/34278 [34:27:39<2:46:51, 3.59s/it] {'loss': 0.118, 'grad_norm': 0.6548827610332445, 'learning_rate': 1.7285577541693966e-07, 'epoch': 0.92} 92%|█████████▏| 31487/34278 [34:27:39<2:46:51, 3.59s/it] 92%|█████████▏| 31488/34278 [34:27:42<2:38:35, 3.41s/it] {'loss': 0.117, 'grad_norm': 0.8131044492909674, 'learning_rate': 1.7273264909706043e-07, 'epoch': 0.92} 92%|█████████▏| 31488/34278 [34:27:42<2:38:35, 3.41s/it] 92%|█████████▏| 31489/34278 [34:27:45<2:31:52, 3.27s/it] {'loss': 0.1158, 'grad_norm': 0.7032682394703215, 'learning_rate': 1.7260956587387245e-07, 'epoch': 0.92} 92%|█████████▏| 31489/34278 [34:27:45<2:31:52, 3.27s/it] 92%|█████████▏| 31490/34278 [34:27:48<2:32:03, 3.27s/it] {'loss': 0.1176, 'grad_norm': 0.8437393397753546, 'learning_rate': 1.7248652574847367e-07, 'epoch': 0.92} 92%|█████████▏| 31490/34278 [34:27:48<2:32:03, 3.27s/it] 92%|█████████▏| 31491/34278 [34:27:53<2:54:40, 3.76s/it] {'loss': 0.1155, 'grad_norm': 0.8217591225454703, 'learning_rate': 1.7236352872196216e-07, 'epoch': 0.92} 92%|█████████▏| 31491/34278 [34:27:53<2:54:40, 3.76s/it] 92%|█████████▏| 31492/34278 [34:27:56<2:47:58, 3.62s/it] {'loss': 0.1017, 'grad_norm': 0.7316706954362365, 'learning_rate': 1.72240574795437e-07, 'epoch': 0.92} 92%|█████████▏| 31492/34278 [34:27:56<2:47:58, 3.62s/it] 92%|█████████▏| 31493/34278 [34:27:59<2:39:44, 3.44s/it] {'loss': 0.1119, 'grad_norm': 0.7630958289001806, 'learning_rate': 1.721176639699962e-07, 'epoch': 0.92} 92%|█████████▏| 31493/34278
[34:27:59<2:39:44, 3.44s/it] 92%|█████████▏| 31494/34278 [34:28:03<2:35:02, 3.34s/it] {'loss': 0.1219, 'grad_norm': 0.9153354818383868, 'learning_rate': 1.7199479624673498e-07, 'epoch': 0.92} 92%|█████████▏| 31494/34278 [34:28:03<2:35:02, 3.34s/it] 92%|█████████▏| 31495/34278 [34:28:05<2:28:02, 3.19s/it] {'loss': 0.1242, 'grad_norm': 0.898372333136174, 'learning_rate': 1.7187197162675252e-07, 'epoch': 0.92} 92%|█████████▏| 31495/34278 [34:28:05<2:28:02, 3.19s/it] 92%|█████████▏| 31496/34278 [34:28:09<2:27:29, 3.18s/it] {'loss': 0.131, 'grad_norm': 0.9563962679915206, 'learning_rate': 1.7174919011114455e-07, 'epoch': 0.92} 92%|█████████▏| 31496/34278 [34:28:09<2:27:29, 3.18s/it] 92%|█████████▏| 31497/34278 [34:28:15<3:06:45, 4.03s/it] {'loss': 0.1096, 'grad_norm': 0.9190040010802156, 'learning_rate': 1.7162645170100746e-07, 'epoch': 0.92} 92%|█████████▏| 31497/34278 [34:28:15<3:06:45, 4.03s/it] 92%|█████████▏| 31498/34278 [34:28:18<2:52:24, 3.72s/it] {'loss': 0.1085, 'grad_norm': 0.9572626954655384, 'learning_rate': 1.715037563974359e-07, 'epoch': 0.92} 92%|█████████▏| 31498/34278 [34:28:18<2:52:24, 3.72s/it] 92%|█████████▏| 31499/34278 [34:28:20<2:39:00, 3.43s/it] {'loss': 0.106, 'grad_norm': 0.839287077203837, 'learning_rate': 1.7138110420152676e-07, 'epoch': 0.92} 92%|█████████▏| 31499/34278 [34:28:20<2:39:00, 3.43s/it] 92%|█████████▏| 31500/34278 [34:28:24<2:46:11, 3.59s/it] {'loss': 0.1079, 'grad_norm': 0.827734241275159, 'learning_rate': 1.712584951143742e-07, 'epoch': 0.92} 92%|█████████▏| 31500/34278 [34:28:24<2:46:11, 3.59s/it] 92%|█████████▏| 31501/34278 [34:28:27<2:39:43, 3.45s/it] {'loss': 0.1103, 'grad_norm': 0.8295905862872595, 'learning_rate': 1.711359291370729e-07, 'epoch': 0.92} 92%|█████████▏| 31501/34278 [34:28:27<2:39:43, 3.45s/it] 92%|█████████▏| 31502/34278 [34:28:30<2:32:52, 3.30s/it] {'loss': 0.0934, 'grad_norm': 0.9027734528844755, 'learning_rate': 1.7101340627071804e-07, 'epoch': 0.92} 92%|█████████▏| 31502/34278 [34:28:30<2:32:52, 
3.30s/it] 92%|█████████▏| 31503/34278 [34:28:33<2:25:11, 3.14s/it] {'loss': 0.0978, 'grad_norm': 0.8794472254828379, 'learning_rate': 1.708909265164027e-07, 'epoch': 0.92} 92%|█████████▏| 31503/34278 [34:28:33<2:25:11, 3.14s/it] 92%|█████████▏| 31504/34278 [34:28:38<2:51:59, 3.72s/it] {'loss': 0.1121, 'grad_norm': 0.7755228381034948, 'learning_rate': 1.7076848987521933e-07, 'epoch': 0.92} 92%|█████████▏| 31504/34278 [34:28:38<2:51:59, 3.72s/it] 92%|█████████▏| 31505/34278 [34:28:43<3:02:02, 3.94s/it] {'loss': 0.1178, 'grad_norm': 0.8267621166910578, 'learning_rate': 1.706460963482631e-07, 'epoch': 0.92} 92%|█████████▏| 31505/34278 [34:28:43<3:02:02, 3.94s/it] 92%|█████████▏| 31506/34278 [34:28:46<2:50:53, 3.70s/it] {'loss': 0.1037, 'grad_norm': 0.9177075774316056, 'learning_rate': 1.7052374593662492e-07, 'epoch': 0.92} 92%|█████████▏| 31506/34278 [34:28:46<2:50:53, 3.70s/it] 92%|█████████▏| 31507/34278 [34:28:49<2:51:22, 3.71s/it] {'loss': 0.1022, 'grad_norm': 1.0129853969583604, 'learning_rate': 1.7040143864139825e-07, 'epoch': 0.92} 92%|█████████▏| 31507/34278 [34:28:49<2:51:22, 3.71s/it] 92%|█████████▏| 31508/34278 [34:28:52<2:40:58, 3.49s/it] {'loss': 0.1245, 'grad_norm': 0.9014312892017495, 'learning_rate': 1.7027917446367447e-07, 'epoch': 0.92} 92%|█████████▏| 31508/34278 [34:28:52<2:40:58, 3.49s/it] 92%|█████████▏| 31509/34278 [34:28:57<2:58:19, 3.86s/it] {'loss': 0.0983, 'grad_norm': 0.9386499379140206, 'learning_rate': 1.7015695340454552e-07, 'epoch': 0.92} 92%|█████████▏| 31509/34278 [34:28:57<2:58:19, 3.86s/it] 92%|█████████▏| 31510/34278 [34:29:00<2:41:45, 3.51s/it] {'loss': 0.1137, 'grad_norm': 1.12692336892604, 'learning_rate': 1.7003477546510217e-07, 'epoch': 0.92} 92%|█████████▏| 31510/34278 [34:29:00<2:41:45, 3.51s/it] 92%|█████████▏| 31511/34278 [34:29:06<3:14:43, 4.22s/it] {'loss': 0.1049, 'grad_norm': 0.8824618055641689, 'learning_rate': 1.699126406464352e-07, 'epoch': 0.92} 92%|█████████▏| 31511/34278 [34:29:06<3:14:43, 4.22s/it] 
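The torch/utils/checkpoint.py:87 UserWarning above ("None of the inputs have requires_grad=True. Gradients will be None") fires when a gradient-checkpointed segment receives only inputs that do not require gradients, e.g. activations leaving a frozen vision tower or embedding layer. A minimal sketch reproducing the condition with PyTorch's reentrant checkpoint; the helper function is illustrative, and for Hugging Face models `model.enable_input_require_grads()` is a commonly used remedy, assuming that API fits the setup:

```python
import warnings

import torch
import torch.utils.checkpoint as cp


def checkpoint_warns(inputs_require_grad: bool) -> bool:
    """Run one checkpointed forward pass and report whether PyTorch emitted
    the 'None of the inputs have requires_grad=True' warning from the log."""
    layer = torch.nn.Linear(4, 4)
    # Stand-in for an activation coming out of a frozen module.
    x = torch.randn(2, 4, requires_grad=inputs_require_grad)
    with warnings.catch_warnings(record=True) as caught:
        warnings.simplefilter("always")
        cp.checkpoint(layer, x, use_reentrant=True)
    return any("requires_grad=True" in str(w.message) for w in caught)
```

Making at least one checkpointed input require grad silences the warning and restores gradient flow through the recomputed segment; when the warning is benign (the frozen part genuinely needs no gradients), it can simply be ignored, as this run does.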
92%|█████████▏| 31512/34278 [34:29:09<3:03:24, 3.98s/it] {'loss': 0.1189, 'grad_norm': 0.7877852882881141, 'learning_rate': 1.6979054894963486e-07, 'epoch': 0.92} 92%|█████████▏| 31512/34278 [34:29:09<3:03:24, 3.98s/it] 92%|█████████▏| 31513/34278 [34:29:12<2:49:33, 3.68s/it] {'loss': 0.1175, 'grad_norm': 1.2274157217371404, 'learning_rate': 1.6966850037579196e-07, 'epoch': 0.92} 92%|█████████▏| 31513/34278 [34:29:12<2:49:33, 3.68s/it] 92%|█████████▏| 31514/34278 [34:29:15<2:38:57, 3.45s/it] {'loss': 0.1046, 'grad_norm': 0.8035846976351573, 'learning_rate': 1.6954649492599507e-07, 'epoch': 0.92} 92%|█████████▏| 31514/34278 [34:29:15<2:38:57, 3.45s/it] 92%|█████████▏| 31515/34278 [34:29:18<2:35:27, 3.38s/it] {'loss': 0.1246, 'grad_norm': 0.7122975510823543, 'learning_rate': 1.6942453260133497e-07, 'epoch': 0.92} 92%|█████████▏| 31515/34278 [34:29:18<2:35:27, 3.38s/it] 92%|█████████▏| 31516/34278 [34:29:21<2:29:43, 3.25s/it] {'loss': 0.1143, 'grad_norm': 0.8803363288778558, 'learning_rate': 1.693026134028991e-07, 'epoch': 0.92} 92%|█████████▏| 31516/34278 [34:29:21<2:29:43, 3.25s/it] 92%|█████████▏| 31517/34278 [34:29:27<3:01:31, 3.94s/it] {'loss': 0.0965, 'grad_norm': 0.8039546970931579, 'learning_rate': 1.6918073733177554e-07, 'epoch': 0.92} 92%|█████████▏| 31517/34278 [34:29:27<3:01:31, 3.94s/it] 92%|█████████▏| 31518/34278 [34:29:30<2:51:20, 3.72s/it] {'loss': 0.1258, 'grad_norm': 0.8576721603822425, 'learning_rate': 1.6905890438905338e-07, 'epoch': 0.92} 92%|█████████▏| 31518/34278 [34:29:30<2:51:20, 3.72s/it] 92%|█████████▏| 31519/34278 [34:29:34<2:50:30, 3.71s/it] {'loss': 0.1138, 'grad_norm': 0.9595800848719139, 'learning_rate': 1.6893711457582064e-07, 'epoch': 0.92} 92%|█████████▏| 31519/34278 [34:29:34<2:50:30, 3.71s/it] 92%|█████████▏| 31520/34278 [34:29:36<2:35:52, 3.39s/it] {'loss': 0.1252, 'grad_norm': 0.7927836583174125, 'learning_rate': 1.6881536789316422e-07, 'epoch': 0.92} 92%|█████████▏| 31520/34278 [34:29:36<2:35:52, 3.39s/it] 92%|█████████▏| 
31521/34278 [34:29:42<3:12:38, 4.19s/it] {'loss': 0.0888, 'grad_norm': 0.7674785520740165, 'learning_rate': 1.6869366434216993e-07, 'epoch': 0.92} 92%|█████████▏| 31521/34278 [34:29:42<3:12:38, 4.19s/it] 92%|█████████▏| 31522/34278 [34:29:48<3:38:32, 4.76s/it] {'loss': 0.1336, 'grad_norm': 0.8523844486809289, 'learning_rate': 1.6857200392392635e-07, 'epoch': 0.92} 92%|█████████▏| 31522/34278 [34:29:48<3:38:32, 4.76s/it] 92%|█████████▏| 31523/34278 [34:29:55<3:59:24, 5.21s/it] {'loss': 0.104, 'grad_norm': 0.8203410314200877, 'learning_rate': 1.684503866395182e-07, 'epoch': 0.92} 92%|█████████▏| 31523/34278 [34:29:55<3:59:24, 5.21s/it] 92%|█████████▏| 31524/34278 [34:29:58<3:37:28, 4.74s/it] {'loss': 0.1096, 'grad_norm': 1.0542733564281603, 'learning_rate': 1.683288124900312e-07, 'epoch': 0.92} 92%|█████████▏| 31524/34278 [34:29:58<3:37:28, 4.74s/it] 92%|█████████▏| 31525/34278 [34:30:04<3:51:03, 5.04s/it] {'loss': 0.1203, 'grad_norm': 0.8929220501399922, 'learning_rate': 1.682072814765512e-07, 'epoch': 0.92} 92%|█████████▏| 31525/34278 [34:30:04<3:51:03, 5.04s/it] 92%|█████████▏| 31526/34278 [34:30:08<3:38:05, 4.75s/it] {'loss': 0.1158, 'grad_norm': 0.9651186180879701, 'learning_rate': 1.6808579360016343e-07, 'epoch': 0.92} 92%|█████████▏| 31526/34278 [34:30:08<3:38:05, 4.75s/it] 92%|█████████▏| 31527/34278 [34:30:11<3:15:35, 4.27s/it] {'loss': 0.1308, 'grad_norm': 0.9016379181013806, 'learning_rate': 1.6796434886195256e-07, 'epoch': 0.92} 92%|█████████▏| 31527/34278 [34:30:11<3:15:35, 4.27s/it] 92%|█████████▏| 31528/34278 [34:30:14<2:59:30, 3.92s/it] {'loss': 0.1066, 'grad_norm': 0.8093931544203599, 'learning_rate': 1.6784294726300166e-07, 'epoch': 0.92} 92%|█████████▏| 31528/34278 [34:30:14<2:59:30, 3.92s/it] 92%|█████████▏| 31529/34278 [34:30:18<2:54:16, 3.80s/it] {'loss': 0.0962, 'grad_norm': 0.8010758614571243, 'learning_rate': 1.6772158880439594e-07, 'epoch': 0.92} 92%|█████████▏| 31529/34278 [34:30:18<2:54:16, 3.80s/it] 92%|█████████▏| 31530/34278 
[34:30:21<2:43:08, 3.56s/it] {'loss': 0.1079, 'grad_norm': 0.8112238434650944, 'learning_rate': 1.6760027348721785e-07, 'epoch': 0.92} 92%|█████████▏| 31530/34278 [34:30:21<2:43:08, 3.56s/it] 92%|█████████▏| 31531/34278 [34:30:24<2:38:34, 3.46s/it] {'loss': 0.1323, 'grad_norm': 0.8691568220414676, 'learning_rate': 1.6747900131255102e-07, 'epoch': 0.92} 92%|█████████▏| 31531/34278 [34:30:24<2:38:34, 3.46s/it] 92%|█████████▏| 31532/34278 [34:30:29<2:58:27, 3.90s/it] {'loss': 0.1101, 'grad_norm': 0.7292816387389669, 'learning_rate': 1.6735777228147842e-07, 'epoch': 0.92} 92%|█████████▏| 31532/34278 [34:30:29<2:58:27, 3.90s/it] 92%|█████████▏| 31533/34278 [34:30:34<3:15:39, 4.28s/it] {'loss': 0.1122, 'grad_norm': 0.7633244929188836, 'learning_rate': 1.6723658639508257e-07, 'epoch': 0.92} 92%|█████████▏| 31533/34278 [34:30:34<3:15:39, 4.28s/it] 92%|█████████▏| 31534/34278 [34:30:37<2:59:40, 3.93s/it] {'loss': 0.1312, 'grad_norm': 1.0608089952896051, 'learning_rate': 1.6711544365444367e-07, 'epoch': 0.92} 92%|█████████▏| 31534/34278 [34:30:37<2:59:40, 3.93s/it] 92%|█████████▏| 31535/34278 [34:30:41<2:53:04, 3.79s/it] {'loss': 0.1193, 'grad_norm': 0.8281099543989686, 'learning_rate': 1.669943440606453e-07, 'epoch': 0.92} 92%|█████████▏| 31535/34278 [34:30:41<2:53:04, 3.79s/it] 92%|█████████▏| 31536/34278 [34:30:45<2:53:40, 3.80s/it] {'loss': 0.1164, 'grad_norm': 0.7216056971282659, 'learning_rate': 1.668732876147666e-07, 'epoch': 0.92} 92%|█████████▏| 31536/34278 [34:30:45<2:53:40, 3.80s/it] 92%|█████████▏| 31537/34278 [34:30:48<2:41:13, 3.53s/it] {'loss': 0.1079, 'grad_norm': 0.7507557229293694, 'learning_rate': 1.6675227431789009e-07, 'epoch': 0.92} 92%|█████████▏| 31537/34278 [34:30:48<2:41:13, 3.53s/it] 92%|█████████▏| 31538/34278 [34:30:54<3:16:47, 4.31s/it] {'loss': 0.1182, 'grad_norm': 0.8338918816597148, 'learning_rate': 1.666313041710954e-07, 'epoch': 0.92} 92%|█████████▏| 31538/34278 [34:30:54<3:16:47, 4.31s/it] 92%|█████████▏| 31539/34278 [34:30:57<2:59:32, 
3.93s/it] {'loss': 0.1114, 'grad_norm': 0.7484368235347966, 'learning_rate': 1.6651037717546281e-07, 'epoch': 0.92} 92%|█████████▏| 31539/34278 [34:30:57<2:59:32, 3.93s/it] 92%|█████████▏| 31540/34278 [34:31:00<2:54:34, 3.83s/it] {'loss': 0.0989, 'grad_norm': 0.95879932611911, 'learning_rate': 1.66389493332072e-07, 'epoch': 0.92} 92%|█████████▏| 31540/34278 [34:31:00<2:54:34, 3.83s/it] 92%|█████████▏| 31541/34278 [34:31:06<3:25:37, 4.51s/it] {'loss': 0.0904, 'grad_norm': 0.8048620998555469, 'learning_rate': 1.6626865264200097e-07, 'epoch': 0.92} 92%|█████████▏| 31541/34278 [34:31:06<3:25:37, 4.51s/it] 92%|█████████▏| 31542/34278 [34:31:10<3:08:43, 4.14s/it] {'loss': 0.1112, 'grad_norm': 0.8090812381727864, 'learning_rate': 1.6614785510633002e-07, 'epoch': 0.92} 92%|█████████▏| 31542/34278 [34:31:10<3:08:43, 4.14s/it] 92%|█████████▏| 31543/34278 [34:31:13<3:01:21, 3.98s/it] {'loss': 0.1107, 'grad_norm': 0.6789774324548513, 'learning_rate': 1.6602710072613715e-07, 'epoch': 0.92} 92%|█████████▏| 31543/34278 [34:31:13<3:01:21, 3.98s/it] 92%|█████████▏| 31544/34278 [34:31:16<2:48:48, 3.70s/it] {'loss': 0.1067, 'grad_norm': 1.1174933592605976, 'learning_rate': 1.6590638950249982e-07, 'epoch': 0.92} 92%|█████████▏| 31544/34278 [34:31:16<2:48:48, 3.70s/it] 92%|█████████▏| 31545/34278 [34:31:19<2:38:45, 3.49s/it] {'loss': 0.1029, 'grad_norm': 0.7905427998092589, 'learning_rate': 1.657857214364972e-07, 'epoch': 0.92} 92%|█████████▏| 31545/34278 [34:31:19<2:38:45, 3.49s/it] 92%|█████████▏| 31546/34278 [34:31:25<3:13:12, 4.24s/it] {'loss': 0.1135, 'grad_norm': 0.9453729966948536, 'learning_rate': 1.656650965292056e-07, 'epoch': 0.92} 92%|█████████▏| 31546/34278 [34:31:25<3:13:12, 4.24s/it] 92%|█████████▏| 31547/34278 [34:31:29<3:02:28, 4.01s/it] {'loss': 0.1242, 'grad_norm': 0.8392599511973942, 'learning_rate': 1.6554451478170085e-07, 'epoch': 0.92} 92%|█████████▏| 31547/34278 [34:31:29<3:02:28, 4.01s/it] 92%|█████████▏| 31548/34278 [34:31:32<2:55:32, 3.86s/it] {'loss': 
0.0989, 'grad_norm': 0.5591915862294311, 'learning_rate': 1.65423976195061e-07, 'epoch': 0.92} 92%|█████████▏| 31548/34278 [34:31:32<2:55:32, 3.86s/it] 92%|█████████▏| 31549/34278 [34:31:35<2:43:51, 3.60s/it] {'loss': 0.1131, 'grad_norm': 0.9739971293006879, 'learning_rate': 1.653034807703624e-07, 'epoch': 0.92} 92%|█████████▏| 31549/34278 [34:31:35<2:43:51, 3.60s/it] 92%|█████████▏| 31550/34278 [34:31:41<3:14:15, 4.27s/it] {'loss': 0.0911, 'grad_norm': 0.8296837190875574, 'learning_rate': 1.651830285086803e-07, 'epoch': 0.92} 92%|█████████▏| 31550/34278 [34:31:41<3:14:15, 4.27s/it] 92%|█████████▏| 31551/34278 [34:31:45<3:06:38, 4.11s/it] {'loss': 0.1122, 'grad_norm': 0.7338766359170078, 'learning_rate': 1.650626194110888e-07, 'epoch': 0.92} 92%|█████████▏| 31551/34278 [34:31:45<3:06:38, 4.11s/it] 92%|█████████▏| 31552/34278 [34:31:48<2:55:24, 3.86s/it] {'loss': 0.1236, 'grad_norm': 0.9081774787175226, 'learning_rate': 1.6494225347866543e-07, 'epoch': 0.92} 92%|█████████▏| 31552/34278 [34:31:48<2:55:24, 3.86s/it] 92%|█████████▏| 31553/34278 [34:31:53<3:13:27, 4.26s/it] {'loss': 0.1046, 'grad_norm': 1.1705428651744463, 'learning_rate': 1.6482193071248264e-07, 'epoch': 0.92} 92%|█████████▏| 31553/34278 [34:31:53<3:13:27, 4.26s/it] 92%|█████████▏| 31554/34278 [34:31:56<2:57:00, 3.90s/it] {'loss': 0.1125, 'grad_norm': 0.7857833395439243, 'learning_rate': 1.6470165111361514e-07, 'epoch': 0.92} 92%|█████████▏| 31554/34278 [34:31:56<2:57:00, 3.90s/it] 92%|█████████▏| 31555/34278 [34:31:59<2:45:20, 3.64s/it] {'loss': 0.1016, 'grad_norm': 0.7386346121884209, 'learning_rate': 1.6458141468313705e-07, 'epoch': 0.92} 92%|█████████▏| 31555/34278 [34:31:59<2:45:20, 3.64s/it] 92%|█████████▏| 31556/34278 [34:32:03<2:41:01, 3.55s/it] {'loss': 0.097, 'grad_norm': 0.9834408545286342, 'learning_rate': 1.644612214221225e-07, 'epoch': 0.92} 92%|█████████▏| 31556/34278 [34:32:03<2:41:01, 3.55s/it] 92%|█████████▏| 31557/34278 [34:32:06<2:33:16, 3.38s/it] {'loss': 0.0978, 'grad_norm': 
0.8244546780176799, 'learning_rate': 1.6434107133164402e-07, 'epoch': 0.92} 92%|█████████▏| 31557/34278 [34:32:06<2:33:16, 3.38s/it] 92%|█████████▏| 31558/34278 [34:32:12<3:07:37, 4.14s/it] {'loss': 0.1168, 'grad_norm': 0.7044948386735506, 'learning_rate': 1.6422096441277292e-07, 'epoch': 0.92} 92%|█████████▏| 31558/34278 [34:32:12<3:07:37, 4.14s/it] 92%|█████████▏| 31559/34278 [34:32:15<2:57:41, 3.92s/it] {'loss': 0.1092, 'grad_norm': 0.820548093941712, 'learning_rate': 1.641009006665828e-07, 'epoch': 0.92} 92%|█████████▏| 31559/34278 [34:32:15<2:57:41, 3.92s/it] 92%|█████████▏| 31560/34278 [34:32:21<3:25:34, 4.54s/it] {'loss': 0.1206, 'grad_norm': 0.963253671691206, 'learning_rate': 1.6398088009414616e-07, 'epoch': 0.92} 92%|█████████▏| 31560/34278 [34:32:21<3:25:34, 4.54s/it] 92%|█████████▏| 31561/34278 [34:32:25<3:16:18, 4.34s/it] {'loss': 0.1006, 'grad_norm': 0.7454790088032144, 'learning_rate': 1.6386090269653322e-07, 'epoch': 0.92} 92%|█████████▏| 31561/34278 [34:32:25<3:16:18, 4.34s/it] 92%|█████████▏| 31562/34278 [34:32:28<2:58:35, 3.95s/it] {'loss': 0.1263, 'grad_norm': 0.8767584099696616, 'learning_rate': 1.637409684748159e-07, 'epoch': 0.92} 92%|█████████▏| 31562/34278 [34:32:28<2:58:35, 3.95s/it] 92%|█████████▏| 31563/34278 [34:32:31<2:52:38, 3.82s/it] {'loss': 0.1131, 'grad_norm': 0.808533787829623, 'learning_rate': 1.6362107743006507e-07, 'epoch': 0.92} 92%|█████████▏| 31563/34278 [34:32:32<2:52:38, 3.82s/it] 92%|█████████▏| 31564/34278 [34:32:35<2:43:32, 3.62s/it] {'loss': 0.1065, 'grad_norm': 0.8194379113041063, 'learning_rate': 1.6350122956335035e-07, 'epoch': 0.92} 92%|█████████▏| 31564/34278 [34:32:35<2:43:32, 3.62s/it] 92%|█████████▏| 31565/34278 [34:32:39<2:49:57, 3.76s/it] {'loss': 0.1133, 'grad_norm': 0.6831599856540861, 'learning_rate': 1.633814248757415e-07, 'epoch': 0.92} 92%|█████████▏| 31565/34278 [34:32:39<2:49:57, 3.76s/it] 92%|█████████▏| 31566/34278 [34:32:43<2:51:51, 3.80s/it] {'loss': 0.1008, 'grad_norm': 0.784619164142305, 
'learning_rate': 1.6326166336830985e-07, 'epoch': 0.92} 92%|█████████▏| 31566/34278 [34:32:43<2:51:51, 3.80s/it] 92%|█████████▏| 31567/34278 [34:32:48<3:14:43, 4.31s/it] {'loss': 0.1122, 'grad_norm': 0.8300106688756127, 'learning_rate': 1.6314194504212287e-07, 'epoch': 0.92} 92%|█████████▏| 31567/34278 [34:32:48<3:14:43, 4.31s/it] 92%|█████████▏| 31568/34278 [34:32:54<3:38:18, 4.83s/it] {'loss': 0.1037, 'grad_norm': 0.8176939053385948, 'learning_rate': 1.6302226989824976e-07, 'epoch': 0.92} 92%|█████████▏| 31568/34278 [34:32:54<3:38:18, 4.83s/it] 92%|█████████▏| 31569/34278 [34:32:59<3:40:31, 4.88s/it] {'loss': 0.1127, 'grad_norm': 0.6868479781352995, 'learning_rate': 1.6290263793775962e-07, 'epoch': 0.92} 92%|█████████▏| 31569/34278 [34:32:59<3:40:31, 4.88s/it] 92%|█████████▏| 31570/34278 [34:33:02<3:14:45, 4.32s/it] {'loss': 0.1474, 'grad_norm': 0.7253165730691489, 'learning_rate': 1.6278304916171995e-07, 'epoch': 0.92} 92%|█████████▏| 31570/34278 [34:33:02<3:14:45, 4.32s/it] 92%|█████████▏| 31571/34278 [34:33:07<3:18:08, 4.39s/it] {'loss': 0.1039, 'grad_norm': 0.7775839000205057, 'learning_rate': 1.6266350357119765e-07, 'epoch': 0.92} 92%|█████████▏| 31571/34278 [34:33:07<3:18:08, 4.39s/it] 92%|█████████▏| 31572/34278 [34:33:10<3:00:32, 4.00s/it] {'loss': 0.0894, 'grad_norm': 0.6301357172828128, 'learning_rate': 1.6254400116726133e-07, 'epoch': 0.92} 92%|█████████▏| 31572/34278 [34:33:10<3:00:32, 4.00s/it] 92%|█████████▏| 31573/34278 [34:33:16<3:27:58, 4.61s/it] {'loss': 0.109, 'grad_norm': 0.8042375168854584, 'learning_rate': 1.624245419509779e-07, 'epoch': 0.92} 92%|█████████▏| 31573/34278 [34:33:16<3:27:58, 4.61s/it] 92%|█████████▏| 31574/34278 [34:33:19<3:07:25, 4.16s/it] {'loss': 0.1196, 'grad_norm': 0.8331649296079149, 'learning_rate': 1.6230512592341263e-07, 'epoch': 0.92} 92%|█████████▏| 31574/34278 [34:33:19<3:07:25, 4.16s/it] 92%|█████████▏| 31575/34278 [34:33:25<3:34:46, 4.77s/it] {'loss': 0.1123, 'grad_norm': 0.7302033742048548, 'learning_rate': 
1.62185753085633e-07, 'epoch': 0.92} 92%|█████████▏| 31575/34278 [34:33:25<3:34:46, 4.77s/it] 92%|█████████▏| 31576/34278 [34:33:28<3:09:32, 4.21s/it] {'loss': 0.1191, 'grad_norm': 0.9631957716768756, 'learning_rate': 1.6206642343870427e-07, 'epoch': 0.92} 92%|█████████▏| 31576/34278 [34:33:28<3:09:32, 4.21s/it] 92%|█████████▏| 31577/34278 [34:33:32<3:03:27, 4.08s/it] {'loss': 0.1328, 'grad_norm': 0.9610146717478596, 'learning_rate': 1.6194713698369057e-07, 'epoch': 0.92} 92%|█████████▏| 31577/34278 [34:33:32<3:03:27, 4.08s/it] 92%|█████████▏| 31578/34278 [34:33:35<2:49:41, 3.77s/it] {'loss': 0.1077, 'grad_norm': 0.6558015522080165, 'learning_rate': 1.618278937216583e-07, 'epoch': 0.92} 92%|█████████▏| 31578/34278 [34:33:35<2:49:41, 3.77s/it] 92%|█████████▏| 31579/34278 [34:33:39<2:55:14, 3.90s/it] {'loss': 0.1055, 'grad_norm': 0.7104860258779933, 'learning_rate': 1.6170869365367158e-07, 'epoch': 0.92} 92%|█████████▏| 31579/34278 [34:33:39<2:55:14, 3.90s/it] 92%|█████████▏| 31580/34278 [34:33:44<3:06:07, 4.14s/it] {'loss': 0.12, 'grad_norm': 0.8201677319994595, 'learning_rate': 1.6158953678079515e-07, 'epoch': 0.92} 92%|█████████▏| 31580/34278 [34:33:44<3:06:07, 4.14s/it] 92%|█████████▏| 31581/34278 [34:33:47<2:53:43, 3.86s/it] {'loss': 0.1189, 'grad_norm': 1.1372388083613736, 'learning_rate': 1.61470423104092e-07, 'epoch': 0.92} 92%|█████████▏| 31581/34278 [34:33:47<2:53:43, 3.86s/it] 92%|█████████▏| 31582/34278 [34:33:50<2:44:08, 3.65s/it] {'loss': 0.0907, 'grad_norm': 2.693507197644251, 'learning_rate': 1.6135135262462577e-07, 'epoch': 0.92} 92%|█████████▏| 31582/34278 [34:33:50<2:44:08, 3.65s/it] 92%|█████████▏| 31583/34278 [34:33:54<2:46:40, 3.71s/it] {'loss': 0.1374, 'grad_norm': 0.8207127203449859, 'learning_rate': 1.6123232534345946e-07, 'epoch': 0.92} 92%|█████████▏| 31583/34278 [34:33:54<2:46:40, 3.71s/it] 92%|█████████▏| 31584/34278 [34:34:00<3:18:38, 4.42s/it] {'loss': 0.1311, 'grad_norm': 0.7659081986364706, 'learning_rate': 1.6111334126165611e-07, 
'epoch': 0.92} 92%|█████████▏| 31584/34278 [34:34:00<3:18:38, 4.42s/it] 92%|█████████▏| 31585/34278 [34:34:03<2:59:30, 4.00s/it] {'loss': 0.1046, 'grad_norm': 0.8268252639756813, 'learning_rate': 1.609944003802777e-07, 'epoch': 0.92} 92%|█████████▏| 31585/34278 [34:34:03<2:59:30, 4.00s/it] 92%|█████████▏| 31586/34278 [34:34:07<2:52:22, 3.84s/it] {'loss': 0.098, 'grad_norm': 0.7053070577774613, 'learning_rate': 1.608755027003861e-07, 'epoch': 0.92} 92%|█████████▏| 31586/34278 [34:34:07<2:52:22, 3.84s/it] 92%|█████████▏| 31587/34278 [34:34:10<2:44:17, 3.66s/it] {'loss': 0.104, 'grad_norm': 0.7269891485623712, 'learning_rate': 1.607566482230427e-07, 'epoch': 0.92} 92%|█████████▏| 31587/34278 [34:34:10<2:44:17, 3.66s/it] 92%|█████████▏| 31588/34278 [34:34:16<3:16:16, 4.38s/it] {'loss': 0.1146, 'grad_norm': 0.7715076818314276, 'learning_rate': 1.606378369493089e-07, 'epoch': 0.92} 92%|█████████▏| 31588/34278 [34:34:16<3:16:16, 4.38s/it] 92%|█████████▏| 31589/34278 [34:34:22<3:35:53, 4.82s/it] {'loss': 0.13, 'grad_norm': 0.9468994735051999, 'learning_rate': 1.6051906888024494e-07, 'epoch': 0.92} 92%|█████████▏| 31589/34278 [34:34:22<3:35:53, 4.82s/it] 92%|█████████▏| 31590/34278 [34:34:25<3:16:25, 4.38s/it] {'loss': 0.1309, 'grad_norm': 0.9145211725139617, 'learning_rate': 1.6040034401691163e-07, 'epoch': 0.92} 92%|█████████▏| 31590/34278 [34:34:25<3:16:25, 4.38s/it] 92%|█████████▏| 31591/34278 [34:34:28<3:01:41, 4.06s/it] {'loss': 0.1193, 'grad_norm': 0.8534177807613806, 'learning_rate': 1.6028166236036868e-07, 'epoch': 0.92} 92%|█████████▏| 31591/34278 [34:34:28<3:01:41, 4.06s/it] 92%|█████████▏| 31592/34278 [34:34:32<2:54:49, 3.91s/it] {'loss': 0.0996, 'grad_norm': 0.640869112211262, 'learning_rate': 1.601630239116758e-07, 'epoch': 0.92} 92%|█████████▏| 31592/34278 [34:34:32<2:54:49, 3.91s/it] 92%|█████████▏| 31593/34278 [34:34:38<3:22:42, 4.53s/it] {'loss': 0.1234, 'grad_norm': 0.8611550427653011, 'learning_rate': 1.6004442867189217e-07, 'epoch': 0.92} 
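Each step in this log interleaves a tqdm progress bar with a metrics dict of the form `{'loss': …, 'grad_norm': …, 'learning_rate': …, 'epoch': …}`. A small helper for pulling those metric series back out of a captured log (an added illustration with names of my own choosing, not part of the training code) might look like:

```python
import re

# Matches the HF-Trainer-style metrics dict printed at every logging step,
# e.g. {'loss': 0.0888, 'grad_norm': 0.76, 'learning_rate': 1.68e-07, 'epoch': 0.92}
RECORD = re.compile(
    r"\{'loss': ([\d.e+-]+), 'grad_norm': ([\d.e+-]+), "
    r"'learning_rate': ([\d.e+-]+), 'epoch': ([\d.e+-]+)\}"
)

def parse_metrics(log_text):
    """Return (loss, grad_norm, learning_rate, epoch) tuples, one per record."""
    return [tuple(map(float, m.groups())) for m in RECORD.finditer(log_text)]
```

Because tqdm reprints the bar around each dict, duplicate bar fragments are harmless here: the regex keys only on the dict itself, so each logged step yields exactly one tuple.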
92%|█████████▏| 31593/34278 [34:34:38<3:22:42, 4.53s/it] 92%|█████████▏| 31594/34278 [34:34:42<3:15:05, 4.36s/it] {'loss': 0.0917, 'grad_norm': 0.8012778753854138, 'learning_rate': 1.5992587664207638e-07, 'epoch': 0.92} 92%|█████████▏| 31594/34278 [34:34:42<3:15:05, 4.36s/it] 92%|█████████▏| 31595/34278 [34:34:46<3:08:24, 4.21s/it] {'loss': 0.1189, 'grad_norm': 0.7906154408473884, 'learning_rate': 1.5980736782328644e-07, 'epoch': 0.92} 92%|█████████▏| 31595/34278 [34:34:46<3:08:24, 4.21s/it] 92%|█████████▏| 31596/34278 [34:34:49<2:56:33, 3.95s/it] {'loss': 0.1253, 'grad_norm': 0.9313369846310038, 'learning_rate': 1.5968890221658207e-07, 'epoch': 0.92} 92%|█████████▏| 31596/34278 [34:34:49<2:56:33, 3.95s/it] 92%|█████████▏| 31597/34278 [34:34:52<2:47:39, 3.75s/it] {'loss': 0.1123, 'grad_norm': 0.73495900928635, 'learning_rate': 1.595704798230191e-07, 'epoch': 0.92} 92%|█████████▏| 31597/34278 [34:34:52<2:47:39, 3.75s/it] 92%|█████████▏| 31598/34278 [34:34:56<2:47:25, 3.75s/it] {'loss': 0.1566, 'grad_norm': 0.8927840892463856, 'learning_rate': 1.5945210064365503e-07, 'epoch': 0.92} 92%|█████████▏| 31598/34278 [34:34:56<2:47:25, 3.75s/it] 92%|█████████▏| 31599/34278 [34:34:59<2:35:25, 3.48s/it] {'loss': 0.1018, 'grad_norm': 0.8984768986685986, 'learning_rate': 1.593337646795473e-07, 'epoch': 0.92} 92%|█████████▏| 31599/34278 [34:34:59<2:35:25, 3.48s/it] 92%|█████████▏| 31600/34278 [34:35:02<2:31:35, 3.40s/it] {'loss': 0.0899, 'grad_norm': 0.8765152866815265, 'learning_rate': 1.5921547193175292e-07, 'epoch': 0.92} 92%|█████████▏| 31600/34278 [34:35:02<2:31:35, 3.40s/it] 92%|█████████▏| 31601/34278 [34:35:06<2:32:38, 3.42s/it] {'loss': 0.1142, 'grad_norm': 0.7967989330865396, 'learning_rate': 1.5909722240132542e-07, 'epoch': 0.92} 92%|█████████▏| 31601/34278 [34:35:06<2:32:38, 3.42s/it] 92%|█████████▏| 31602/34278 [34:35:09<2:32:36, 3.42s/it] {'loss': 0.1072, 'grad_norm': 0.7442665459955101, 'learning_rate': 1.5897901608932342e-07, 'epoch': 0.92} 92%|█████████▏| 
31602/34278 [34:35:09<2:32:36, 3.42s/it] 92%|█████████▏| 31603/34278 [34:35:13<2:33:00, 3.43s/it] {'loss': 0.1185, 'grad_norm': 0.9868247398129755, 'learning_rate': 1.5886085299680166e-07, 'epoch': 0.92} 92%|█████████▏| 31603/34278 [34:35:13<2:33:00, 3.43s/it] 92%|█████████▏| 31604/34278 [34:35:16<2:27:17, 3.31s/it] {'loss': 0.0949, 'grad_norm': 0.78649939067307, 'learning_rate': 1.5874273312481368e-07, 'epoch': 0.92} 92%|█████████▏| 31604/34278 [34:35:16<2:27:17, 3.31s/it] 92%|█████████▏| 31605/34278 [34:35:19<2:25:41, 3.27s/it] {'loss': 0.127, 'grad_norm': 1.1103592657883514, 'learning_rate': 1.5862465647441537e-07, 'epoch': 0.92} 92%|█████████▏| 31605/34278 [34:35:19<2:25:41, 3.27s/it] 92%|█████████▏| 31606/34278 [34:35:22<2:23:54, 3.23s/it] {'loss': 0.1126, 'grad_norm': 0.8698085582139836, 'learning_rate': 1.585066230466603e-07, 'epoch': 0.92} 92%|█████████▏| 31606/34278 [34:35:22<2:23:54, 3.23s/it] 92%|█████████▏| 31607/34278 [34:35:25<2:22:32, 3.20s/it] {'loss': 0.1079, 'grad_norm': 0.8990790270924715, 'learning_rate': 1.5838863284260208e-07, 'epoch': 0.92} 92%|█████████▏| 31607/34278 [34:35:25<2:22:32, 3.20s/it] 92%|█████████▏| 31608/34278 [34:35:28<2:21:00, 3.17s/it] {'loss': 0.1154, 'grad_norm': 0.8297752819122285, 'learning_rate': 1.582706858632943e-07, 'epoch': 0.92} 92%|█████████▏| 31608/34278 [34:35:28<2:21:00, 3.17s/it] 92%|█████████▏| 31609/34278 [34:35:32<2:27:11, 3.31s/it] {'loss': 0.1038, 'grad_norm': 0.846981054375405, 'learning_rate': 1.5815278210979056e-07, 'epoch': 0.92} 92%|█████████▏| 31609/34278 [34:35:32<2:27:11, 3.31s/it] 92%|█████████▏| 31610/34278 [34:35:35<2:26:00, 3.28s/it] {'loss': 0.1, 'grad_norm': 0.8276836928239499, 'learning_rate': 1.5803492158314283e-07, 'epoch': 0.92} 92%|█████████▏| 31610/34278 [34:35:35<2:26:00, 3.28s/it] 92%|█████████▏| 31611/34278 [34:35:41<2:59:51, 4.05s/it] {'loss': 0.1101, 'grad_norm': 0.798405376271041, 'learning_rate': 1.57917104284403e-07, 'epoch': 0.92} 92%|█████████▏| 31611/34278 [34:35:41<2:59:51, 
4.05s/it] 92%|█████████▏| 31612/34278 [34:35:44<2:43:30, 3.68s/it] {'loss': 0.0903, 'grad_norm': 0.7716831291840625, 'learning_rate': 1.5779933021462357e-07, 'epoch': 0.92} 92%|█████████▏| 31612/34278 [34:35:44<2:43:30, 3.68s/it] 92%|█████████▏| 31613/34278 [34:35:47<2:42:58, 3.67s/it] {'loss': 0.0964, 'grad_norm': 0.7221244000074717, 'learning_rate': 1.5768159937485538e-07, 'epoch': 0.92} 92%|█████████▏| 31613/34278 [34:35:47<2:42:58, 3.67s/it] 92%|█████████▏| 31614/34278 [34:35:50<2:32:29, 3.43s/it] {'loss': 0.1065, 'grad_norm': 0.6758694466963024, 'learning_rate': 1.5756391176615092e-07, 'epoch': 0.92} 92%|█████████▏| 31614/34278 [34:35:50<2:32:29, 3.43s/it] 92%|█████████▏| 31615/34278 [34:35:53<2:29:28, 3.37s/it] {'loss': 0.0868, 'grad_norm': 0.7512265184986077, 'learning_rate': 1.5744626738955883e-07, 'epoch': 0.92} 92%|█████████▏| 31615/34278 [34:35:53<2:29:28, 3.37s/it] 92%|█████████▏| 31616/34278 [34:35:57<2:28:38, 3.35s/it] {'loss': 0.1137, 'grad_norm': 0.8501602567042736, 'learning_rate': 1.5732866624613152e-07, 'epoch': 0.92} 92%|█████████▏| 31616/34278 [34:35:57<2:28:38, 3.35s/it] 92%|█████████▏| 31617/34278 [34:36:00<2:31:11, 3.41s/it] {'loss': 0.0969, 'grad_norm': 0.7045020634778086, 'learning_rate': 1.572111083369171e-07, 'epoch': 0.92} 92%|█████████▏| 31617/34278 [34:36:00<2:31:11, 3.41s/it] 92%|█████████▏| 31618/34278 [34:36:05<2:54:46, 3.94s/it] {'loss': 0.1157, 'grad_norm': 0.8468483635821392, 'learning_rate': 1.5709359366296583e-07, 'epoch': 0.92} 92%|█████████▏| 31618/34278 [34:36:05<2:54:46, 3.94s/it] 92%|█████████▏| 31619/34278 [34:36:09<2:50:48, 3.85s/it] {'loss': 0.0975, 'grad_norm': 0.8278444364184621, 'learning_rate': 1.5697612222532687e-07, 'epoch': 0.92} 92%|█████████▏| 31619/34278 [34:36:09<2:50:48, 3.85s/it] 92%|█████████▏| 31620/34278 [34:36:15<3:19:04, 4.49s/it] {'loss': 0.1062, 'grad_norm': 0.7522274086515124, 'learning_rate': 1.5685869402504938e-07, 'epoch': 0.92} 92%|█████████▏| 31620/34278 [34:36:15<3:19:04, 4.49s/it] 
92%|█████████▏| 31621/34278 [34:36:18<3:02:09, 4.11s/it] {'loss': 0.0988, 'grad_norm': 1.0404808666507726, 'learning_rate': 1.5674130906318085e-07, 'epoch': 0.92} 92%|█████████▏| 31621/34278 [34:36:18<3:02:09, 4.11s/it] 92%|█████████▏| 31622/34278 [34:36:22<3:00:05, 4.07s/it] {'loss': 0.106, 'grad_norm': 0.9457242979288193, 'learning_rate': 1.566239673407699e-07, 'epoch': 0.92} 92%|█████████▏| 31622/34278 [34:36:22<3:00:05, 4.07s/it] 92%|█████████▏| 31623/34278 [34:36:26<2:51:43, 3.88s/it] {'loss': 0.1583, 'grad_norm': 0.7968748994921456, 'learning_rate': 1.5650666885886457e-07, 'epoch': 0.92} 92%|█████████▏| 31623/34278 [34:36:26<2:51:43, 3.88s/it] 92%|█████████▏| 31624/34278 [34:36:29<2:45:45, 3.75s/it] {'loss': 0.1074, 'grad_norm': 0.9325826393941028, 'learning_rate': 1.5638941361851069e-07, 'epoch': 0.92} 92%|█████████▏| 31624/34278 [34:36:29<2:45:45, 3.75s/it] 92%|█████████▏| 31625/34278 [34:36:32<2:35:20, 3.51s/it] {'loss': 0.1351, 'grad_norm': 1.063030955184054, 'learning_rate': 1.5627220162075574e-07, 'epoch': 0.92} 92%|█████████▏| 31625/34278 [34:36:32<2:35:20, 3.51s/it] 92%|█████████▏| 31626/34278 [34:36:38<3:02:54, 4.14s/it] {'loss': 0.1059, 'grad_norm': 1.026518357218933, 'learning_rate': 1.5615503286664668e-07, 'epoch': 0.92} 92%|█████████▏| 31626/34278 [34:36:38<3:02:54, 4.14s/it] 92%|█████████▏| 31627/34278 [34:36:43<3:12:40, 4.36s/it] {'loss': 0.1176, 'grad_norm': 0.7980818277717988, 'learning_rate': 1.5603790735722933e-07, 'epoch': 0.92} 92%|█████████▏| 31627/34278 [34:36:43<3:12:40, 4.36s/it] 92%|█████████▏| 31628/34278 [34:36:48<3:33:33, 4.84s/it] {'loss': 0.1167, 'grad_norm': 0.8576736422214318, 'learning_rate': 1.5592082509354845e-07, 'epoch': 0.92} 92%|█████████▏| 31628/34278 [34:36:49<3:33:33, 4.84s/it] 92%|█████████▏| 31629/34278 [34:36:53<3:24:47, 4.64s/it] {'loss': 0.1081, 'grad_norm': 0.7265952864485458, 'learning_rate': 1.5580378607665092e-07, 'epoch': 0.92} 92%|█████████▏| 31629/34278 [34:36:53<3:24:47, 4.64s/it] 92%|█████████▏| 
31630/34278 [34:36:59<3:42:57, 5.05s/it] {'loss': 0.1181, 'grad_norm': 0.7084197499329727, 'learning_rate': 1.5568679030758095e-07, 'epoch': 0.92} 92%|█████████▏| 31630/34278 [34:36:59<3:42:57, 5.05s/it] 92%|█████████▏| 31631/34278 [34:37:02<3:18:27, 4.50s/it] {'loss': 0.1036, 'grad_norm': 0.836058832165904, 'learning_rate': 1.555698377873821e-07, 'epoch': 0.92} 92%|█████████▏| 31631/34278 [34:37:02<3:18:27, 4.50s/it] 92%|█████████▏| 31632/34278 [34:37:08<3:37:42, 4.94s/it] {'loss': 0.0979, 'grad_norm': 0.7518532359470982, 'learning_rate': 1.5545292851709915e-07, 'epoch': 0.92} 92%|█████████▏| 31632/34278 [34:37:08<3:37:42, 4.94s/it] 92%|█████████▏| 31633/34278 [34:37:11<3:12:21, 4.36s/it] {'loss': 0.1162, 'grad_norm': 0.7655427662612161, 'learning_rate': 1.5533606249777677e-07, 'epoch': 0.92} 92%|█████████▏| 31633/34278 [34:37:11<3:12:21, 4.36s/it] 92%|█████████▏| 31634/34278 [34:37:14<2:57:20, 4.02s/it] {'loss': 0.1197, 'grad_norm': 0.7174580099324258, 'learning_rate': 1.5521923973045694e-07, 'epoch': 0.92} 92%|█████████▏| 31634/34278 [34:37:14<2:57:20, 4.02s/it] 92%|█████████▏| 31635/34278 [34:37:18<2:49:21, 3.84s/it] {'loss': 0.1072, 'grad_norm': 0.9234571164611187, 'learning_rate': 1.5510246021618325e-07, 'epoch': 0.92} 92%|█████████▏| 31635/34278 [34:37:18<2:49:21, 3.84s/it] 92%|█████████▏| 31636/34278 [34:37:22<2:55:24, 3.98s/it] {'loss': 0.133, 'grad_norm': 1.3094640955070806, 'learning_rate': 1.5498572395599877e-07, 'epoch': 0.92} 92%|█████████▏| 31636/34278 [34:37:22<2:55:24, 3.98s/it] 92%|█████████▏| 31637/34278 [34:37:28<3:20:57, 4.57s/it] {'loss': 0.1269, 'grad_norm': 0.7505909050299393, 'learning_rate': 1.548690309509443e-07, 'epoch': 0.92} 92%|█████████▏| 31637/34278 [34:37:28<3:20:57, 4.57s/it] 92%|█████████▏| 31638/34278 [34:37:31<3:01:06, 4.12s/it] {'loss': 0.0924, 'grad_norm': 0.6915615289355302, 'learning_rate': 1.5475238120206293e-07, 'epoch': 0.92} 92%|█████████▏| 31638/34278 [34:37:31<3:01:06, 4.12s/it] 92%|█████████▏| 31639/34278 
[34:37:34<2:45:14, 3.76s/it] {'loss': 0.1054, 'grad_norm': 0.9557879590278532, 'learning_rate': 1.5463577471039548e-07, 'epoch': 0.92} 92%|█████████▏| 31639/34278 [34:37:34<2:45:14, 3.76s/it] 92%|█████████▏| 31640/34278 [34:37:37<2:44:16, 3.74s/it] {'loss': 0.1501, 'grad_norm': 0.9334558976212776, 'learning_rate': 1.545192114769839e-07, 'epoch': 0.92} 92%|█████████▏| 31640/34278 [34:37:37<2:44:16, 3.74s/it] 92%|█████████▏| 31641/34278 [34:37:41<2:44:34, 3.74s/it] {'loss': 0.1224, 'grad_norm': 0.8468658970243811, 'learning_rate': 1.5440269150286734e-07, 'epoch': 0.92} 92%|█████████▏| 31641/34278 [34:37:41<2:44:34, 3.74s/it] 92%|█████████▏| 31642/34278 [34:37:47<3:17:03, 4.49s/it] {'loss': 0.1194, 'grad_norm': 0.9010276123425714, 'learning_rate': 1.5428621478908723e-07, 'epoch': 0.92} 92%|█████████▏| 31642/34278 [34:37:47<3:17:03, 4.49s/it] 92%|█████████▏| 31643/34278 [34:37:50<2:56:46, 4.03s/it] {'loss': 0.0969, 'grad_norm': 0.7955231366219253, 'learning_rate': 1.5416978133668213e-07, 'epoch': 0.92} 92%|█████████▏| 31643/34278 [34:37:50<2:56:46, 4.03s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff04196f0b0>
Failed to fetch sample 3074341. Exception: cannot identify image file <_io.BytesIO object at 0x7ff04196f0b0>
92%|█████████▏| 31644/34278 [34:37:57<3:27:12, 4.72s/it] {'loss': 0.1338, 'grad_norm': 0.7320602127955743, 'learning_rate': 1.5405339114669348e-07, 'epoch': 0.92} 92%|█████████▏| 31644/34278 [34:37:57<3:27:12, 4.72s/it] 92%|█████████▏| 31645/34278 [34:38:00<3:03:08, 4.17s/it] {'loss': 0.1185, 'grad_norm': 0.860636207288666, 'learning_rate': 1.5393704422015875e-07, 'epoch': 0.92} 92%|█████████▏| 31645/34278 [34:38:00<3:03:08, 4.17s/it] 92%|█████████▏| 31646/34278 [34:38:06<3:27:21, 4.73s/it] {'loss': 0.1141, 'grad_norm': 0.7553259095559253, 'learning_rate': 1.5382074055811768e-07, 'epoch': 0.92} 92%|█████████▏| 31646/34278 [34:38:06<3:27:21, 4.73s/it] 92%|█████████▏| 31647/34278 [34:38:09<3:15:20, 4.45s/it] {'loss': 0.0884, 'grad_norm': 1.1257911236360727, 'learning_rate': 1.5370448016160778e-07, 'epoch': 0.92} 92%|█████████▏| 31647/34278 [34:38:09<3:15:20, 4.45s/it] 92%|█████████▏| 31648/34278 [34:38:13<3:09:48, 4.33s/it] {'loss': 0.1183, 'grad_norm': 0.764563354961511, 'learning_rate': 1.5358826303166764e-07, 'epoch': 0.92} 92%|█████████▏| 31648/34278 [34:38:14<3:09:48, 4.33s/it] 92%|█████████▏| 31649/34278 [34:38:20<3:31:51, 4.83s/it] {'loss': 0.097, 'grad_norm': 0.8925355857901404, 'learning_rate': 1.5347208916933366e-07, 'epoch': 0.92} 92%|█████████▏| 31649/34278 [34:38:20<3:31:51, 4.83s/it] 92%|█████████▏| 31650/34278 [34:38:25<3:34:43, 4.90s/it] {'loss': 0.0823, 'grad_norm': 0.7161358890200636, 'learning_rate': 1.5335595857564501e-07,
'epoch': 0.92} 92%|█████████▏| 31650/34278 [34:38:25<3:34:43, 4.90s/it] 92%|█████████▏| 31651/34278 [34:38:28<3:21:13, 4.60s/it] {'loss': 0.122, 'grad_norm': 1.0281336572021094, 'learning_rate': 1.5323987125163697e-07, 'epoch': 0.92} 92%|█████████▏| 31651/34278 [34:38:28<3:21:13, 4.60s/it] 92%|█████████▏| 31652/34278 [34:38:35<3:40:41, 5.04s/it] {'loss': 0.113, 'grad_norm': 0.9173171067427917, 'learning_rate': 1.531238271983465e-07, 'epoch': 0.92} 92%|█████████▏| 31652/34278 [34:38:35<3:40:41, 5.04s/it] 92%|█████████▏| 31653/34278 [34:38:38<3:24:35, 4.68s/it] {'loss': 0.119, 'grad_norm': 0.8164228517657877, 'learning_rate': 1.5300782641680945e-07, 'epoch': 0.92} 92%|█████████▏| 31653/34278 [34:38:38<3:24:35, 4.68s/it] 92%|█████████▏| 31654/34278 [34:38:41<2:59:52, 4.11s/it] {'loss': 0.1141, 'grad_norm': 0.7005962818840026, 'learning_rate': 1.5289186890806108e-07, 'epoch': 0.92} 92%|█████████▏| 31654/34278 [34:38:41<2:59:52, 4.11s/it] 92%|█████████▏| 31655/34278 [34:38:44<2:46:16, 3.80s/it] {'loss': 0.1278, 'grad_norm': 0.8436412188491318, 'learning_rate': 1.5277595467313723e-07, 'epoch': 0.92} 92%|█████████▏| 31655/34278 [34:38:44<2:46:16, 3.80s/it] 92%|█████████▏| 31656/34278 [34:38:48<2:39:51, 3.66s/it] {'loss': 0.0983, 'grad_norm': 1.0043470679257775, 'learning_rate': 1.5266008371307262e-07, 'epoch': 0.92} 92%|█████████▏| 31656/34278 [34:38:48<2:39:51, 3.66s/it] 92%|█████████▏| 31657/34278 [34:38:51<2:30:46, 3.45s/it] {'loss': 0.1022, 'grad_norm': 0.9269625408708297, 'learning_rate': 1.52544256028902e-07, 'epoch': 0.92} 92%|█████████▏| 31657/34278 [34:38:51<2:30:46, 3.45s/it] 92%|█████████▏| 31658/34278 [34:38:57<3:04:35, 4.23s/it] {'loss': 0.1178, 'grad_norm': 0.8423703202891744, 'learning_rate': 1.5242847162165843e-07, 'epoch': 0.92} 92%|█████████▏| 31658/34278 [34:38:57<3:04:35, 4.23s/it] 92%|█████████▏| 31659/34278 [34:39:00<2:49:27, 3.88s/it] {'loss': 0.1071, 'grad_norm': 0.7931191425893267, 'learning_rate': 1.523127304923766e-07, 'epoch': 0.92} 
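The `UnidentifiedImageError` above is absorbed at the dataset level: the log prints "Failed to fetch sample 3074341. Exception: …" and training proceeds to step 31644 without crashing, which suggests `__getitem__` catches decode failures and falls back to another sample. A minimal sketch of that skip-and-retry pattern (hypothetical class and `decode` stand-in for `PIL.Image.open`; not the repo's actual code) could be:

```python
class RobustImageDataset:
    """Serves raw image bytes, skipping samples that fail to decode."""

    def __init__(self, samples, max_retries=10):
        self.samples = samples          # list of raw byte strings
        self.max_retries = max_retries  # give up after this many bad samples

    @staticmethod
    def decode(raw: bytes) -> bytes:
        # Stand-in for Image.open(io.BytesIO(raw)): reject bytes without a
        # PNG magic number, mirroring PIL.UnidentifiedImageError.
        if not raw.startswith(b"\x89PNG"):
            raise ValueError("cannot identify image file")
        return raw

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.decode(self.samples[i])
            except ValueError as e:
                # Log and move on, as the training log above does; the real
                # loader may instead resample a random index.
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)
        raise RuntimeError("too many consecutive corrupt samples")
```

The key design point is that one corrupt object in remote storage (here, Ceph via `tcs_loader`) costs a warning line rather than 34 hours of training state.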
92%|█████████▏| 31659/34278 [34:39:00<2:49:27, 3.88s/it] 92%|█████████▏| 31660/34278 [34:39:03<2:36:48, 3.59s/it] {'loss': 0.1022, 'grad_norm': 0.8677367734275767, 'learning_rate': 1.5219703264208963e-07, 'epoch': 0.92} 92%|█████████▏| 31660/34278 [34:39:03<2:36:48, 3.59s/it] 92%|█████████▏| 31661/34278 [34:39:06<2:32:35, 3.50s/it] {'loss': 0.1114, 'grad_norm': 0.9929099802499494, 'learning_rate': 1.5208137807183e-07, 'epoch': 0.92} 92%|█████████▏| 31661/34278 [34:39:06<2:32:35, 3.50s/it] 92%|█████████▏| 31662/34278 [34:39:10<2:37:21, 3.61s/it] {'loss': 0.1134, 'grad_norm': 0.6626511392472205, 'learning_rate': 1.519657667826302e-07, 'epoch': 0.92} 92%|█████████▏| 31662/34278 [34:39:10<2:37:21, 3.61s/it] 92%|█████████▏| 31663/34278 [34:39:13<2:27:05, 3.37s/it] {'loss': 0.1024, 'grad_norm': 0.8092447279789307, 'learning_rate': 1.518501987755233e-07, 'epoch': 0.92} 92%|█████████▏| 31663/34278 [34:39:13<2:27:05, 3.37s/it] 92%|█████████▏| 31664/34278 [34:39:15<2:17:49, 3.16s/it] {'loss': 0.1233, 'grad_norm': 1.0462278416088158, 'learning_rate': 1.517346740515402e-07, 'epoch': 0.92} 92%|█████████▏| 31664/34278 [34:39:15<2:17:49, 3.16s/it] 92%|█████████▏| 31665/34278 [34:39:19<2:22:16, 3.27s/it] {'loss': 0.0986, 'grad_norm': 0.7076361377472974, 'learning_rate': 1.5161919261171275e-07, 'epoch': 0.92} 92%|█████████▏| 31665/34278 [34:39:19<2:22:16, 3.27s/it] 92%|█████████▏| 31666/34278 [34:39:22<2:19:32, 3.21s/it] {'loss': 0.0979, 'grad_norm': 0.5744799080853549, 'learning_rate': 1.5150375445707188e-07, 'epoch': 0.92} 92%|█████████▏| 31666/34278 [34:39:22<2:19:32, 3.21s/it] 92%|█████████▏| 31667/34278 [34:39:25<2:14:56, 3.10s/it] {'loss': 0.0956, 'grad_norm': 1.062507853611053, 'learning_rate': 1.5138835958864728e-07, 'epoch': 0.92} 92%|█████████▏| 31667/34278 [34:39:25<2:14:56, 3.10s/it] 92%|█████████▏| 31668/34278 [34:39:28<2:14:54, 3.10s/it] {'loss': 0.1027, 'grad_norm': 0.9327357428416654, 'learning_rate': 1.5127300800747036e-07, 'epoch': 0.92} 92%|█████████▏| 
31668/34278 [34:39:28<2:14:54, 3.10s/it] 92%|█████████▏| 31669/34278 [34:39:32<2:24:10, 3.32s/it] {'loss': 0.1177, 'grad_norm': 0.8754677669518118, 'learning_rate': 1.5115769971457084e-07, 'epoch': 0.92} 92%|█████████▏| 31669/34278 [34:39:32<2:24:10, 3.32s/it] 92%|█████████▏| 31670/34278 [34:39:35<2:22:16, 3.27s/it] {'loss': 0.1191, 'grad_norm': 0.8856063982246912, 'learning_rate': 1.510424347109779e-07, 'epoch': 0.92} 92%|█████████▏| 31670/34278 [34:39:35<2:22:16, 3.27s/it] 92%|█████████▏| 31671/34278 [34:39:38<2:19:40, 3.21s/it] {'loss': 0.0836, 'grad_norm': 0.7518185981710238, 'learning_rate': 1.5092721299772017e-07, 'epoch': 0.92} 92%|█████████▏| 31671/34278 [34:39:38<2:19:40, 3.21s/it] 92%|█████████▏| 31672/34278 [34:39:43<2:41:48, 3.73s/it] {'loss': 0.1002, 'grad_norm': 0.799150001983761, 'learning_rate': 1.5081203457582738e-07, 'epoch': 0.92} 92%|█████████▏| 31672/34278 [34:39:43<2:41:48, 3.73s/it] 92%|█████████▏| 31673/34278 [34:39:46<2:32:30, 3.51s/it] {'loss': 0.104, 'grad_norm': 0.9102159400402953, 'learning_rate': 1.5069689944632648e-07, 'epoch': 0.92} 92%|█████████▏| 31673/34278 [34:39:46<2:32:30, 3.51s/it] 92%|█████████▏| 31674/34278 [34:39:49<2:24:02, 3.32s/it] {'loss': 0.1379, 'grad_norm': 0.9043552789685778, 'learning_rate': 1.505818076102461e-07, 'epoch': 0.92} 92%|█████████▏| 31674/34278 [34:39:49<2:24:02, 3.32s/it] 92%|█████████▏| 31675/34278 [34:39:52<2:29:22, 3.44s/it] {'loss': 0.1027, 'grad_norm': 0.815463861856309, 'learning_rate': 1.5046675906861374e-07, 'epoch': 0.92} 92%|█████████▏| 31675/34278 [34:39:52<2:29:22, 3.44s/it] 92%|█████████▏| 31676/34278 [34:39:56<2:27:09, 3.39s/it] {'loss': 0.112, 'grad_norm': 0.7249940916778375, 'learning_rate': 1.5035175382245692e-07, 'epoch': 0.92} 92%|█████████▏| 31676/34278 [34:39:56<2:27:09, 3.39s/it] 92%|█████████▏| 31677/34278 [34:39:59<2:23:34, 3.31s/it] {'loss': 0.1342, 'grad_norm': 1.0881781103280483, 'learning_rate': 1.502367918728015e-07, 'epoch': 0.92} 92%|█████████▏| 31677/34278 
[34:39:59<2:23:34, 3.31s/it] 92%|█████████▏| 31678/34278 [34:40:02<2:19:25, 3.22s/it] {'loss': 0.133, 'grad_norm': 0.9374697187390513, 'learning_rate': 1.5012187322067439e-07, 'epoch': 0.92} 92%|█████████▏| 31678/34278 [34:40:02<2:19:25, 3.22s/it] 92%|█████████▏| 31679/34278 [34:40:05<2:24:45, 3.34s/it] {'loss': 0.1134, 'grad_norm': 0.9271405452120017, 'learning_rate': 1.5000699786710092e-07, 'epoch': 0.92} 92%|█████████▏| 31679/34278 [34:40:05<2:24:45, 3.34s/it] 92%|█████████▏| 31680/34278 [34:40:08<2:21:24, 3.27s/it] {'loss': 0.1128, 'grad_norm': 0.8913088739471651, 'learning_rate': 1.4989216581310805e-07, 'epoch': 0.92} 92%|█████████▏| 31680/34278 [34:40:08<2:21:24, 3.27s/it] 92%|█████████▏| 31681/34278 [34:40:11<2:14:36, 3.11s/it] {'loss': 0.0892, 'grad_norm': 0.8566853906111136, 'learning_rate': 1.497773770597194e-07, 'epoch': 0.92} 92%|█████████▏| 31681/34278 [34:40:11<2:14:36, 3.11s/it] 92%|█████████▏| 31682/34278 [34:40:14<2:14:39, 3.11s/it] {'loss': 0.1206, 'grad_norm': 0.8363044898995028, 'learning_rate': 1.496626316079608e-07, 'epoch': 0.92} 92%|█████████▏| 31682/34278 [34:40:14<2:14:39, 3.11s/it] 92%|█████████▏| 31683/34278 [34:40:18<2:17:59, 3.19s/it] {'loss': 0.1088, 'grad_norm': 0.8974881530619577, 'learning_rate': 1.4954792945885643e-07, 'epoch': 0.92} 92%|█████████▏| 31683/34278 [34:40:18<2:17:59, 3.19s/it] 92%|█████████▏| 31684/34278 [34:40:22<2:33:49, 3.56s/it] {'loss': 0.1229, 'grad_norm': 0.8425431127384165, 'learning_rate': 1.4943327061342993e-07, 'epoch': 0.92} 92%|█████████▏| 31684/34278 [34:40:22<2:33:49, 3.56s/it] 92%|█████████▏| 31685/34278 [34:40:25<2:26:49, 3.40s/it] {'loss': 0.1183, 'grad_norm': 0.9562055036842013, 'learning_rate': 1.4931865507270548e-07, 'epoch': 0.92} 92%|█████████▏| 31685/34278 [34:40:25<2:26:49, 3.40s/it] 92%|█████████▏| 31686/34278 [34:40:29<2:29:21, 3.46s/it] {'loss': 0.0912, 'grad_norm': 0.7798587627156494, 'learning_rate': 1.4920408283770616e-07, 'epoch': 0.92} 92%|█████████▏| 31686/34278 [34:40:29<2:29:21, 
3.46s/it] 92%|█████████▏| 31687/34278 [34:40:32<2:30:05, 3.48s/it] {'loss': 0.1396, 'grad_norm': 1.0245868531379574, 'learning_rate': 1.49089553909455e-07, 'epoch': 0.92} 92%|█████████▏| 31687/34278 [34:40:32<2:30:05, 3.48s/it] 92%|█████████▏| 31688/34278 [34:40:35<2:26:15, 3.39s/it] {'loss': 0.1302, 'grad_norm': 0.8113574996612866, 'learning_rate': 1.48975068288974e-07, 'epoch': 0.92} 92%|█████████▏| 31688/34278 [34:40:35<2:26:15, 3.39s/it] 92%|█████████▏| 31689/34278 [34:40:39<2:29:30, 3.46s/it] {'loss': 0.1198, 'grad_norm': 0.8802965898818487, 'learning_rate': 1.4886062597728567e-07, 'epoch': 0.92} 92%|█████████▏| 31689/34278 [34:40:39<2:29:30, 3.46s/it] 92%|█████████▏| 31690/34278 [34:40:42<2:24:33, 3.35s/it] {'loss': 0.1119, 'grad_norm': 0.8279329466765452, 'learning_rate': 1.4874622697541196e-07, 'epoch': 0.92} 92%|█████████▏| 31690/34278 [34:40:42<2:24:33, 3.35s/it] 92%|█████████▏| 31691/34278 [34:40:45<2:20:19, 3.25s/it] {'loss': 0.1263, 'grad_norm': 0.957264706814189, 'learning_rate': 1.4863187128437317e-07, 'epoch': 0.92} 92%|█████████▏| 31691/34278 [34:40:45<2:20:19, 3.25s/it] 92%|█████████▏| 31692/34278 [34:40:51<2:56:10, 4.09s/it] {'loss': 0.0919, 'grad_norm': 0.8328102267929881, 'learning_rate': 1.4851755890519125e-07, 'epoch': 0.92} 92%|█████████▏| 31692/34278 [34:40:51<2:56:10, 4.09s/it] 92%|█████████▏| 31693/34278 [34:40:55<2:50:19, 3.95s/it] {'loss': 0.0953, 'grad_norm': 0.7208471124902135, 'learning_rate': 1.4840328983888653e-07, 'epoch': 0.92} 92%|█████████▏| 31693/34278 [34:40:55<2:50:19, 3.95s/it] 92%|█████████▏| 31694/34278 [34:40:58<2:37:24, 3.66s/it] {'loss': 0.135, 'grad_norm': 0.801460191767315, 'learning_rate': 1.482890640864787e-07, 'epoch': 0.92} 92%|█████████▏| 31694/34278 [34:40:58<2:37:24, 3.66s/it] 92%|█████████▏| 31695/34278 [34:41:01<2:35:31, 3.61s/it] {'loss': 0.1061, 'grad_norm': 0.7721325867798572, 'learning_rate': 1.4817488164898863e-07, 'epoch': 0.92} 92%|█████████▏| 31695/34278 [34:41:01<2:35:31, 3.61s/it] 92%|█████████▏| 
31696/34278 [34:41:04<2:26:47, 3.41s/it] {'loss': 0.0998, 'grad_norm': 0.9239235113400591, 'learning_rate': 1.480607425274344e-07, 'epoch': 0.92} 92%|█████████▏| 31696/34278 [34:41:04<2:26:47, 3.41s/it] 92%|█████████▏| 31697/34278 [34:41:07<2:22:40, 3.32s/it] {'loss': 0.1242, 'grad_norm': 0.8160326047347986, 'learning_rate': 1.4794664672283577e-07, 'epoch': 0.92} 92%|█████████▏| 31697/34278 [34:41:07<2:22:40, 3.32s/it] 92%|█████████▏| 31698/34278 [34:41:11<2:23:13, 3.33s/it] {'loss': 0.1087, 'grad_norm': 0.748262937428363, 'learning_rate': 1.4783259423621076e-07, 'epoch': 0.92} 92%|█████████▏| 31698/34278 [34:41:11<2:23:13, 3.33s/it] 92%|█████████▏| 31699/34278 [34:41:14<2:27:07, 3.42s/it] {'loss': 0.1115, 'grad_norm': 0.8645288476102676, 'learning_rate': 1.4771858506857862e-07, 'epoch': 0.92} 92%|█████████▏| 31699/34278 [34:41:14<2:27:07, 3.42s/it] 92%|█████████▏| 31700/34278 [34:41:18<2:31:44, 3.53s/it] {'loss': 0.0943, 'grad_norm': 0.7152663028836039, 'learning_rate': 1.476046192209568e-07, 'epoch': 0.92} 92%|█████████▏| 31700/34278 [34:41:18<2:31:44, 3.53s/it] 92%|█████████▏| 31701/34278 [34:41:24<3:02:37, 4.25s/it] {'loss': 0.117, 'grad_norm': 0.6501551081778637, 'learning_rate': 1.4749069669436179e-07, 'epoch': 0.92} 92%|█████████▏| 31701/34278 [34:41:24<3:02:37, 4.25s/it] 92%|█████████▏| 31702/34278 [34:41:27<2:47:56, 3.91s/it] {'loss': 0.0845, 'grad_norm': 0.6584951160164141, 'learning_rate': 1.4737681748981214e-07, 'epoch': 0.92} 92%|█████████▏| 31702/34278 [34:41:27<2:47:56, 3.91s/it] 92%|█████████▏| 31703/34278 [34:41:31<2:46:32, 3.88s/it] {'loss': 0.103, 'grad_norm': 0.7890857304082028, 'learning_rate': 1.472629816083232e-07, 'epoch': 0.92} 92%|█████████▏| 31703/34278 [34:41:31<2:46:32, 3.88s/it] 92%|█████████▏| 31704/34278 [34:41:34<2:34:00, 3.59s/it] {'loss': 0.1131, 'grad_norm': 0.8117915023448221, 'learning_rate': 1.4714918905091246e-07, 'epoch': 0.92} 92%|█████████▏| 31704/34278 [34:41:34<2:34:00, 3.59s/it] 92%|█████████▏| 31705/34278 
[34:41:37<2:31:38, 3.54s/it] {'loss': 0.1245, 'grad_norm': 0.7919583720680227, 'learning_rate': 1.4703543981859524e-07, 'epoch': 0.92} 92%|█████████▏| 31705/34278 [34:41:37<2:31:38, 3.54s/it] 92%|█████████▏| 31706/34278 [34:41:41<2:33:16, 3.58s/it] {'loss': 0.1107, 'grad_norm': 0.7812810231682721, 'learning_rate': 1.4692173391238684e-07, 'epoch': 0.92} 92%|█████████▏| 31706/34278 [34:41:41<2:33:16, 3.58s/it] 92%|█████████▏| 31707/34278 [34:41:45<2:38:29, 3.70s/it] {'loss': 0.1178, 'grad_norm': 0.8852469426470593, 'learning_rate': 1.4680807133330312e-07, 'epoch': 0.92} 92%|█████████▏| 31707/34278 [34:41:45<2:38:29, 3.70s/it] 93%|█████████▎| 31708/34278 [34:41:48<2:35:42, 3.64s/it] {'loss': 0.0879, 'grad_norm': 0.7730858926753635, 'learning_rate': 1.466944520823582e-07, 'epoch': 0.93} 93%|█████████▎| 31708/34278 [34:41:48<2:35:42, 3.64s/it] 93%|█████████▎| 31709/34278 [34:41:52<2:37:15, 3.67s/it] {'loss': 0.1083, 'grad_norm': 0.8014526540941422, 'learning_rate': 1.4658087616056582e-07, 'epoch': 0.93} 93%|█████████▎| 31709/34278 [34:41:52<2:37:15, 3.67s/it] 93%|█████████▎| 31710/34278 [34:41:55<2:25:45, 3.41s/it] {'loss': 0.1022, 'grad_norm': 0.865674395824994, 'learning_rate': 1.4646734356894177e-07, 'epoch': 0.93} 93%|█████████▎| 31710/34278 [34:41:55<2:25:45, 3.41s/it] 93%|█████████▎| 31711/34278 [34:42:01<2:59:16, 4.19s/it] {'loss': 0.1113, 'grad_norm': 0.7474366567978811, 'learning_rate': 1.4635385430849857e-07, 'epoch': 0.93} 93%|█████████▎| 31711/34278 [34:42:01<2:59:16, 4.19s/it] 93%|█████████▎| 31712/34278 [34:42:05<3:02:07, 4.26s/it] {'loss': 0.1215, 'grad_norm': 1.0057173331831166, 'learning_rate': 1.4624040838024933e-07, 'epoch': 0.93} 93%|█████████▎| 31712/34278 [34:42:05<3:02:07, 4.26s/it] 93%|█████████▎| 31713/34278 [34:42:09<2:48:54, 3.95s/it] {'loss': 0.1333, 'grad_norm': 0.7728605512356185, 'learning_rate': 1.461270057852071e-07, 'epoch': 0.93} 93%|█████████▎| 31713/34278 [34:42:09<2:48:54, 3.95s/it] 93%|█████████▎| 31714/34278 [34:42:15<3:14:04, 
4.54s/it] {'loss': 0.1045, 'grad_norm': 0.7762412372580629, 'learning_rate': 1.4601364652438387e-07, 'epoch': 0.93} 93%|█████████▎| 31714/34278 [34:42:15<3:14:04, 4.54s/it] 93%|█████████▎| 31715/34278 [34:42:18<2:55:07, 4.10s/it] {'loss': 0.1314, 'grad_norm': 1.0697619655151576, 'learning_rate': 1.4590033059879216e-07, 'epoch': 0.93} 93%|█████████▎| 31715/34278 [34:42:18<2:55:07, 4.10s/it] 93%|█████████▎| 31716/34278 [34:42:24<3:19:57, 4.68s/it] {'loss': 0.102, 'grad_norm': 0.7180033330029212, 'learning_rate': 1.4578705800944392e-07, 'epoch': 0.93} 93%|█████████▎| 31716/34278 [34:42:24<3:19:57, 4.68s/it] 93%|█████████▎| 31717/34278 [34:42:27<2:56:17, 4.13s/it] {'loss': 0.1018, 'grad_norm': 0.9335884476519792, 'learning_rate': 1.4567382875735002e-07, 'epoch': 0.93} 93%|█████████▎| 31717/34278 [34:42:27<2:56:17, 4.13s/it] 93%|█████████▎| 31718/34278 [34:42:30<2:48:50, 3.96s/it] {'loss': 0.0834, 'grad_norm': 0.6776272690764072, 'learning_rate': 1.4556064284352135e-07, 'epoch': 0.93} 93%|█████████▎| 31718/34278 [34:42:30<2:48:50, 3.96s/it] 93%|█████████▎| 31719/34278 [34:42:33<2:40:59, 3.77s/it] {'loss': 0.1015, 'grad_norm': 0.6304707389915937, 'learning_rate': 1.4544750026896814e-07, 'epoch': 0.93} 93%|█████████▎| 31719/34278 [34:42:33<2:40:59, 3.77s/it] 93%|█████████▎| 31720/34278 [34:42:38<2:46:10, 3.90s/it] {'loss': 0.1043, 'grad_norm': 0.8153619968939354, 'learning_rate': 1.4533440103470132e-07, 'epoch': 0.93} 93%|█████████▎| 31720/34278 [34:42:38<2:46:10, 3.90s/it] 93%|█████████▎| 31721/34278 [34:42:44<3:12:46, 4.52s/it] {'loss': 0.1093, 'grad_norm': 0.9327875910475767, 'learning_rate': 1.4522134514172948e-07, 'epoch': 0.93} 93%|█████████▎| 31721/34278 [34:42:44<3:12:46, 4.52s/it] 93%|█████████▎| 31722/34278 [34:42:47<2:54:58, 4.11s/it] {'loss': 0.1055, 'grad_norm': 0.9585100548820297, 'learning_rate': 1.451083325910624e-07, 'epoch': 0.93} 93%|█████████▎| 31722/34278 [34:42:47<2:54:58, 4.11s/it] 93%|█████████▎| 31723/34278 [34:42:50<2:38:09, 3.71s/it] {'loss': 
0.1133, 'grad_norm': 0.8694365073157463, 'learning_rate': 1.449953633837098e-07, 'epoch': 0.93} 93%|█████████▎| 31723/34278 [34:42:50<2:38:09, 3.71s/it] 93%|█████████▎| 31724/34278 [34:42:54<2:42:18, 3.81s/it] {'loss': 0.1024, 'grad_norm': 0.8312840869919882, 'learning_rate': 1.448824375206792e-07, 'epoch': 0.93} 93%|█████████▎| 31724/34278 [34:42:54<2:42:18, 3.81s/it] 93%|█████████▎| 31725/34278 [34:42:57<2:35:02, 3.64s/it] {'loss': 0.0885, 'grad_norm': 0.7670830538040962, 'learning_rate': 1.447695550029793e-07, 'epoch': 0.93} 93%|█████████▎| 31725/34278 [34:42:57<2:35:02, 3.64s/it] 93%|█████████▎| 31726/34278 [34:43:00<2:30:43, 3.54s/it] {'loss': 0.1015, 'grad_norm': 0.7136171008459236, 'learning_rate': 1.4465671583161755e-07, 'epoch': 0.93} 93%|█████████▎| 31726/34278 [34:43:00<2:30:43, 3.54s/it] 93%|█████████▎| 31727/34278 [34:43:04<2:32:50, 3.59s/it] {'loss': 0.1108, 'grad_norm': 0.7312121747966311, 'learning_rate': 1.4454392000760154e-07, 'epoch': 0.93} 93%|█████████▎| 31727/34278 [34:43:04<2:32:50, 3.59s/it] 93%|█████████▎| 31728/34278 [34:43:08<2:34:54, 3.64s/it] {'loss': 0.0977, 'grad_norm': 1.0217869543902838, 'learning_rate': 1.444311675319382e-07, 'epoch': 0.93} 93%|█████████▎| 31728/34278 [34:43:08<2:34:54, 3.64s/it] 93%|█████████▎| 31729/34278 [34:43:10<2:24:14, 3.40s/it] {'loss': 0.1235, 'grad_norm': 0.734487254337658, 'learning_rate': 1.4431845840563508e-07, 'epoch': 0.93} 93%|█████████▎| 31729/34278 [34:43:10<2:24:14, 3.40s/it] 93%|█████████▎| 31730/34278 [34:43:17<3:02:22, 4.29s/it] {'loss': 0.1168, 'grad_norm': 0.8334042837425122, 'learning_rate': 1.4420579262969748e-07, 'epoch': 0.93} 93%|█████████▎| 31730/34278 [34:43:17<3:02:22, 4.29s/it] 93%|█████████▎| 31731/34278 [34:43:20<2:42:23, 3.83s/it] {'loss': 0.1004, 'grad_norm': 0.7957090527856719, 'learning_rate': 1.440931702051307e-07, 'epoch': 0.93} 93%|█████████▎| 31731/34278 [34:43:20<2:42:23, 3.83s/it] 93%|█████████▎| 31732/34278 [34:43:23<2:35:07, 3.66s/it] {'loss': 0.108, 'grad_norm': 
0.8122803162626553, 'learning_rate': 1.439805911329417e-07, 'epoch': 0.93} 93%|█████████▎| 31732/34278 [34:43:23<2:35:07, 3.66s/it] 93%|█████████▎| 31733/34278 [34:43:27<2:41:57, 3.82s/it] {'loss': 0.1185, 'grad_norm': 0.722345553644814, 'learning_rate': 1.4386805541413361e-07, 'epoch': 0.93} 93%|█████████▎| 31733/34278 [34:43:27<2:41:57, 3.82s/it] 93%|█████████▎| 31734/34278 [34:43:30<2:29:30, 3.53s/it] {'loss': 0.1207, 'grad_norm': 0.8192274853642497, 'learning_rate': 1.4375556304971338e-07, 'epoch': 0.93} 93%|█████████▎| 31734/34278 [34:43:30<2:29:30, 3.53s/it] 93%|█████████▎| 31735/34278 [34:43:33<2:19:07, 3.28s/it] {'loss': 0.0914, 'grad_norm': 0.8564909861136769, 'learning_rate': 1.4364311404068355e-07, 'epoch': 0.93} 93%|█████████▎| 31735/34278 [34:43:33<2:19:07, 3.28s/it] 93%|█████████▎| 31736/34278 [34:43:36<2:17:38, 3.25s/it] {'loss': 0.0932, 'grad_norm': 1.0361703726960494, 'learning_rate': 1.435307083880494e-07, 'epoch': 0.93} 93%|█████████▎| 31736/34278 [34:43:36<2:17:38, 3.25s/it] 93%|█████████▎| 31737/34278 [34:43:42<2:57:55, 4.20s/it] {'loss': 0.1138, 'grad_norm': 0.9041171044957181, 'learning_rate': 1.4341834609281346e-07, 'epoch': 0.93} 93%|█████████▎| 31737/34278 [34:43:42<2:57:55, 4.20s/it] 93%|█████████▎| 31738/34278 [34:43:45<2:42:51, 3.85s/it] {'loss': 0.1159, 'grad_norm': 0.8281386330147424, 'learning_rate': 1.4330602715597886e-07, 'epoch': 0.93} 93%|█████████▎| 31738/34278 [34:43:45<2:42:51, 3.85s/it] 93%|█████████▎| 31739/34278 [34:43:48<2:30:36, 3.56s/it] {'loss': 0.096, 'grad_norm': 0.855771458799659, 'learning_rate': 1.431937515785481e-07, 'epoch': 0.93} 93%|█████████▎| 31739/34278 [34:43:48<2:30:36, 3.56s/it] 93%|█████████▎| 31740/34278 [34:43:51<2:24:35, 3.42s/it] {'loss': 0.1097, 'grad_norm': 0.7884953687926183, 'learning_rate': 1.4308151936152537e-07, 'epoch': 0.93} 93%|█████████▎| 31740/34278 [34:43:51<2:24:35, 3.42s/it] 93%|█████████▎| 31741/34278 [34:43:55<2:25:31, 3.44s/it] {'loss': 0.1147, 'grad_norm': 0.8772604667798529, 
'learning_rate': 1.4296933050591043e-07, 'epoch': 0.93} 93%|█████████▎| 31741/34278 [34:43:55<2:25:31, 3.44s/it] 93%|█████████▎| 31742/34278 [34:43:58<2:19:38, 3.30s/it] {'loss': 0.121, 'grad_norm': 1.0410341867643633, 'learning_rate': 1.428571850127064e-07, 'epoch': 0.93} 93%|█████████▎| 31742/34278 [34:43:58<2:19:38, 3.30s/it] 93%|█████████▎| 31743/34278 [34:44:02<2:30:29, 3.56s/it] {'loss': 0.109, 'grad_norm': 1.055490587638585, 'learning_rate': 1.4274508288291411e-07, 'epoch': 0.93} 93%|█████████▎| 31743/34278 [34:44:02<2:30:29, 3.56s/it] 93%|█████████▎| 31744/34278 [34:44:05<2:22:41, 3.38s/it] {'loss': 0.1284, 'grad_norm': 0.6718414874251398, 'learning_rate': 1.4263302411753388e-07, 'epoch': 0.93} 93%|█████████▎| 31744/34278 [34:44:05<2:22:41, 3.38s/it] 93%|█████████▎| 31745/34278 [34:44:08<2:18:01, 3.27s/it] {'loss': 0.1138, 'grad_norm': 0.8776701654134024, 'learning_rate': 1.42521008717566e-07, 'epoch': 0.93} 93%|█████████▎| 31745/34278 [34:44:08<2:18:01, 3.27s/it] 93%|█████████▎| 31746/34278 [34:44:11<2:20:30, 3.33s/it] {'loss': 0.1165, 'grad_norm': 0.8780409718093165, 'learning_rate': 1.424090366840114e-07, 'epoch': 0.93} 93%|█████████▎| 31746/34278 [34:44:11<2:20:30, 3.33s/it] 93%|█████████▎| 31747/34278 [34:44:15<2:25:33, 3.45s/it] {'loss': 0.1235, 'grad_norm': 0.8087318675701313, 'learning_rate': 1.422971080178698e-07, 'epoch': 0.93} 93%|█████████▎| 31747/34278 [34:44:15<2:25:33, 3.45s/it] 93%|█████████▎| 31748/34278 [34:44:18<2:20:24, 3.33s/it] {'loss': 0.1323, 'grad_norm': 0.8035138993306693, 'learning_rate': 1.4218522272013924e-07, 'epoch': 0.93} 93%|█████████▎| 31748/34278 [34:44:18<2:20:24, 3.33s/it] 93%|█████████▎| 31749/34278 [34:44:21<2:18:51, 3.29s/it] {'loss': 0.1018, 'grad_norm': 1.3027535291669212, 'learning_rate': 1.420733807918201e-07, 'epoch': 0.93} 93%|█████████▎| 31749/34278 [34:44:21<2:18:51, 3.29s/it] 93%|█████████▎| 31750/34278 [34:44:24<2:16:28, 3.24s/it] {'loss': 0.1081, 'grad_norm': 0.7610153643861877, 'learning_rate': 
1.4196158223390987e-07, 'epoch': 0.93} 93%|█████████▎| 31750/34278 [34:44:24<2:16:28, 3.24s/it] 93%|█████████▎| 31751/34278 [34:44:28<2:18:47, 3.30s/it] {'loss': 0.1146, 'grad_norm': 0.8453250451507696, 'learning_rate': 1.4184982704740668e-07, 'epoch': 0.93} 93%|█████████▎| 31751/34278 [34:44:28<2:18:47, 3.30s/it] 93%|█████████▎| 31752/34278 [34:44:34<2:53:24, 4.12s/it] {'loss': 0.1124, 'grad_norm': 0.9790357627712215, 'learning_rate': 1.4173811523330804e-07, 'epoch': 0.93} 93%|█████████▎| 31752/34278 [34:44:34<2:53:24, 4.12s/it] 93%|█████████▎| 31753/34278 [34:44:37<2:43:09, 3.88s/it] {'loss': 0.0958, 'grad_norm': 0.9046145975244893, 'learning_rate': 1.4162644679261262e-07, 'epoch': 0.93} 93%|█████████▎| 31753/34278 [34:44:37<2:43:09, 3.88s/it] 93%|█████████▎| 31754/34278 [34:44:40<2:33:46, 3.66s/it] {'loss': 0.1199, 'grad_norm': 1.1355467678113773, 'learning_rate': 1.4151482172631627e-07, 'epoch': 0.93} 93%|█████████▎| 31754/34278 [34:44:40<2:33:46, 3.66s/it] 93%|█████████▎| 31755/34278 [34:44:44<2:28:25, 3.53s/it] {'loss': 0.1414, 'grad_norm': 1.057588745451631, 'learning_rate': 1.4140324003541538e-07, 'epoch': 0.93} 93%|█████████▎| 31755/34278 [34:44:44<2:28:25, 3.53s/it] 93%|█████████▎| 31756/34278 [34:44:46<2:20:07, 3.33s/it] {'loss': 0.1246, 'grad_norm': 0.9310083548881187, 'learning_rate': 1.4129170172090645e-07, 'epoch': 0.93} 93%|█████████▎| 31756/34278 [34:44:46<2:20:07, 3.33s/it] 93%|█████████▎| 31757/34278 [34:44:52<2:53:32, 4.13s/it] {'loss': 0.0913, 'grad_norm': 0.6704608895088169, 'learning_rate': 1.411802067837864e-07, 'epoch': 0.93} 93%|█████████▎| 31757/34278 [34:44:52<2:53:32, 4.13s/it] 93%|█████████▎| 31758/34278 [34:44:55<2:40:30, 3.82s/it] {'loss': 0.1242, 'grad_norm': 0.7102930648508062, 'learning_rate': 1.4106875522504836e-07, 'epoch': 0.93} 93%|█████████▎| 31758/34278 [34:44:55<2:40:30, 3.82s/it] 93%|█████████▎| 31759/34278 [34:44:59<2:33:05, 3.65s/it] {'loss': 0.1254, 'grad_norm': 0.7892150966461705, 'learning_rate': 
1.409573470456893e-07, 'epoch': 0.93} 93%|█████████▎| 31759/34278 [34:44:59<2:33:05, 3.65s/it] 93%|█████████▎| 31760/34278 [34:45:03<2:36:22, 3.73s/it] {'loss': 0.1324, 'grad_norm': 0.890289439329342, 'learning_rate': 1.4084598224670343e-07, 'epoch': 0.93} 93%|█████████▎| 31760/34278 [34:45:03<2:36:22, 3.73s/it] 93%|█████████▎| 31761/34278 [34:45:06<2:35:00, 3.70s/it] {'loss': 0.1005, 'grad_norm': 0.9709724081723901, 'learning_rate': 1.4073466082908382e-07, 'epoch': 0.93} 93%|█████████▎| 31761/34278 [34:45:06<2:35:00, 3.70s/it] 93%|█████████▎| 31762/34278 [34:45:10<2:41:15, 3.85s/it] {'loss': 0.1012, 'grad_norm': 0.8535624983542714, 'learning_rate': 1.406233827938247e-07, 'epoch': 0.93} 93%|█████████▎| 31762/34278 [34:45:10<2:41:15, 3.85s/it] 93%|█████████▎| 31763/34278 [34:45:13<2:27:06, 3.51s/it] {'loss': 0.109, 'grad_norm': 0.830784851460545, 'learning_rate': 1.405121481419214e-07, 'epoch': 0.93} 93%|█████████▎| 31763/34278 [34:45:13<2:27:06, 3.51s/it] 93%|█████████▎| 31764/34278 [34:45:17<2:32:05, 3.63s/it] {'loss': 0.1017, 'grad_norm': 0.7923320312471503, 'learning_rate': 1.4040095687436473e-07, 'epoch': 0.93} 93%|█████████▎| 31764/34278 [34:45:17<2:32:05, 3.63s/it] 93%|█████████▎| 31765/34278 [34:45:23<3:00:53, 4.32s/it] {'loss': 0.1191, 'grad_norm': 0.8768415708836962, 'learning_rate': 1.402898089921484e-07, 'epoch': 0.93} 93%|█████████▎| 31765/34278 [34:45:23<3:00:53, 4.32s/it] 93%|█████████▎| 31766/34278 [34:45:27<2:52:23, 4.12s/it] {'loss': 0.1401, 'grad_norm': 0.8864873670478928, 'learning_rate': 1.4017870449626492e-07, 'epoch': 0.93} 93%|█████████▎| 31766/34278 [34:45:27<2:52:23, 4.12s/it] 93%|█████████▎| 31767/34278 [34:45:29<2:34:33, 3.69s/it] {'loss': 0.1342, 'grad_norm': 1.097276375567561, 'learning_rate': 1.4006764338770573e-07, 'epoch': 0.93} 93%|█████████▎| 31767/34278 [34:45:29<2:34:33, 3.69s/it] 93%|█████████▎| 31768/34278 [34:45:32<2:25:46, 3.48s/it] {'loss': 0.0982, 'grad_norm': 0.7614307100585425, 'learning_rate': 1.3995662566746115e-07, 
'epoch': 0.93} 93%|█████████▎| 31768/34278 [34:45:32<2:25:46, 3.48s/it] 93%|█████████▎| 31769/34278 [34:45:36<2:31:05, 3.61s/it] {'loss': 0.0997, 'grad_norm': 0.6780695007409673, 'learning_rate': 1.3984565133652484e-07, 'epoch': 0.93} 93%|█████████▎| 31769/34278 [34:45:36<2:31:05, 3.61s/it] 93%|█████████▎| 31770/34278 [34:45:39<2:22:23, 3.41s/it] {'loss': 0.1148, 'grad_norm': 0.88304632227594, 'learning_rate': 1.3973472039588654e-07, 'epoch': 0.93} 93%|█████████▎| 31770/34278 [34:45:39<2:22:23, 3.41s/it] 93%|█████████▎| 31771/34278 [34:45:44<2:38:34, 3.80s/it] {'loss': 0.0999, 'grad_norm': 0.8076494746165436, 'learning_rate': 1.39623832846536e-07, 'epoch': 0.93} 93%|█████████▎| 31771/34278 [34:45:44<2:38:34, 3.80s/it] 93%|█████████▎| 31772/34278 [34:45:47<2:27:21, 3.53s/it] {'loss': 0.106, 'grad_norm': 0.9200410378993642, 'learning_rate': 1.395129886894636e-07, 'epoch': 0.93} 93%|█████████▎| 31772/34278 [34:45:47<2:27:21, 3.53s/it] 93%|█████████▎| 31773/34278 [34:45:50<2:21:57, 3.40s/it] {'loss': 0.1227, 'grad_norm': 0.8777988146574237, 'learning_rate': 1.3940218792565964e-07, 'epoch': 0.93} 93%|█████████▎| 31773/34278 [34:45:50<2:21:57, 3.40s/it] 93%|█████████▎| 31774/34278 [34:45:53<2:20:26, 3.37s/it] {'loss': 0.1173, 'grad_norm': 0.8797267955743507, 'learning_rate': 1.3929143055611162e-07, 'epoch': 0.93} 93%|█████████▎| 31774/34278 [34:45:53<2:20:26, 3.37s/it] 93%|█████████▎| 31775/34278 [34:45:57<2:25:59, 3.50s/it] {'loss': 0.1108, 'grad_norm': 0.8029341141164189, 'learning_rate': 1.391807165818093e-07, 'epoch': 0.93} 93%|█████████▎| 31775/34278 [34:45:57<2:25:59, 3.50s/it] 93%|█████████▎| 31776/34278 [34:46:00<2:22:44, 3.42s/it] {'loss': 0.106, 'grad_norm': 0.8464957889930641, 'learning_rate': 1.3907004600374198e-07, 'epoch': 0.93} 93%|█████████▎| 31776/34278 [34:46:00<2:22:44, 3.42s/it] 93%|█████████▎| 31777/34278 [34:46:03<2:19:28, 3.35s/it] {'loss': 0.1138, 'grad_norm': 0.7510489981223771, 'learning_rate': 1.389594188228971e-07, 'epoch': 0.93} 
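Each progress record above ends with a Python-dict literal (`{'loss': …, 'grad_norm': …, 'learning_rate': …, 'epoch': …}`), so the metrics can be recovered from a captured log programmatically. A minimal sketch, assuming only the record format visible in this log (the regex and helper name are illustrative, not part of the training code):

```python
import ast
import re

# Each training step prints a dict literal such as:
# {'loss': 0.1177, 'grad_norm': 0.875..., 'learning_rate': 1.51e-07, 'epoch': 0.92}
RECORD_RE = re.compile(r"\{'loss':[^}]*\}")

def parse_records(log_text: str) -> list[dict]:
    """Extract the per-step metric dicts from a captured training log."""
    return [ast.literal_eval(m.group(0)) for m in RECORD_RE.finditer(log_text)]

sample = ("92%|#| 31669/34278 [34:39:32<2:24:10, 3.32s/it] "
          "{'loss': 0.1177, 'grad_norm': 0.8754677669518118, "
          "'learning_rate': 1.5115769971457084e-07, 'epoch': 0.92}")
records = parse_records(sample)
```

`ast.literal_eval` is used instead of `eval` so only literal dicts are accepted; duplicate tqdm echoes are harmless because only the dict portion is matched.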
93%|█████████▎| 31778/34278 [34:46:07<2:19:19, 3.34s/it] {'loss': 0.0964, 'grad_norm': 0.7993096754734151, 'learning_rate': 1.3884883504026116e-07, 'epoch': 0.93}
93%|█████████▎| 31779/34278 [34:46:10<2:22:41, 3.43s/it] {'loss': 0.098, 'grad_norm': 0.7978983676680128, 'learning_rate': 1.3873829465682277e-07, 'epoch': 0.93}
93%|█████████▎| 31780/34278 [34:46:14<2:21:28, 3.40s/it] {'loss': 0.1233, 'grad_norm': 0.8653884600183934, 'learning_rate': 1.3862779767356838e-07, 'epoch': 0.93}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f798b8cb510>
Failed to fetch sample 2762610. Exception: cannot identify image file <_io.BytesIO object at 0x7f798b8cb510>
93%|█████████▎| 31781/34278 [34:46:20<2:53:05, 4.16s/it] {'loss': 0.1054, 'grad_norm': 0.7238549802724723, 'learning_rate': 1.3851734409148443e-07, 'epoch': 0.93}
93%|█████████▎| 31782/34278 [34:46:23<2:38:04, 3.80s/it] {'loss': 0.0951, 'grad_norm': 0.9286978961897876, 'learning_rate': 1.3840693391155735e-07, 'epoch': 0.93}
93%|█████████▎| 31783/34278 [34:46:26<2:32:04, 3.66s/it] {'loss': 0.1067, 'grad_norm': 0.6889075601075261, 'learning_rate': 1.3829656713477247e-07, 'epoch': 0.93}
93%|█████████▎| 31784/34278 [34:46:29<2:23:25, 3.45s/it] {'loss': 0.1265, 'grad_norm': 0.8273047894290693, 'learning_rate': 1.3818624376211564e-07, 'epoch': 0.93}
93%|█████████▎| 31785/34278 [34:46:32<2:18:05, 3.32s/it] {'loss': 0.099, 'grad_norm': 0.8189016133973054, 'learning_rate': 1.3807596379457056e-07, 'epoch': 0.93}
93%|█████████▎| 31786/34278 [34:46:36<2:30:32, 3.62s/it] {'loss': 0.1255, 'grad_norm': 0.925306070894806, 'learning_rate': 1.3796572723312308e-07, 'epoch': 0.93}
93%|█████████▎| 31787/34278 [34:46:42<2:54:12, 4.20s/it] {'loss': 0.1076, 'grad_norm': 0.7317591662671944, 'learning_rate': 1.3785553407875685e-07, 'epoch': 0.93}
93%|█████████▎| 31788/34278 [34:46:45<2:43:11, 3.93s/it] {'loss': 0.0947, 'grad_norm': 0.8200400377904487, 'learning_rate': 1.3774538433245555e-07, 'epoch': 0.93}
93%|█████████▎| 31789/34278 [34:46:48<2:30:11, 3.62s/it] {'loss': 0.1294, 'grad_norm': 0.8756006166518264, 'learning_rate': 1.376352779952034e-07, 'epoch': 0.93}
93%|█████████▎| 31790/34278 [34:46:54<2:59:07, 4.32s/it] {'loss': 0.0814, 'grad_norm': 0.6485610255946412, 'learning_rate': 1.3752521506798233e-07, 'epoch': 0.93}
93%|█████████▎| 31791/34278 [34:46:57<2:42:57, 3.93s/it] {'loss': 0.1108, 'grad_norm': 0.9764972503010176, 'learning_rate': 1.374151955517755e-07, 'epoch': 0.93}
93%|█████████▎| 31792/34278 [34:47:00<2:34:27, 3.73s/it] {'loss': 0.1228, 'grad_norm': 0.8023872679716414, 'learning_rate': 1.3730521944756437e-07, 'epoch': 0.93}
93%|█████████▎| 31793/34278 [34:47:03<2:24:09, 3.48s/it] {'loss': 0.1042, 'grad_norm': 0.7619632234424019, 'learning_rate': 1.3719528675633254e-07, 'epoch': 0.93}
93%|█████████▎| 31794/34278 [34:47:08<2:43:15, 3.94s/it] {'loss': 0.1115, 'grad_norm': 1.0058169030589184, 'learning_rate': 1.370853974790598e-07, 'epoch': 0.93}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fefcafa0680>
Failed to fetch sample 3074345. Exception: cannot identify image file <_io.BytesIO object at 0x7fefcafa0680>
93%|█████████▎| 31795/34278 [34:47:11<2:28:57, 3.60s/it] {'loss': 0.1266, 'grad_norm': 0.8253121798893776, 'learning_rate': 1.369755516167276e-07, 'epoch': 0.93}
93%|█████████▎| 31796/34278 [34:47:14<2:19:47, 3.38s/it] {'loss': 0.0953, 'grad_norm': 1.0321014996611595, 'learning_rate': 1.368657491703168e-07, 'epoch': 0.93}
93%|█████████▎| 31797/34278 [34:47:19<2:46:08, 4.02s/it] {'loss': 0.0953, 'grad_norm': 0.9970188652347979, 'learning_rate': 1.3675599014080832e-07, 'epoch': 0.93}
93%|█████████▎| 31798/34278 [34:47:24<2:49:00, 4.09s/it] {'loss': 0.0975, 'grad_norm': 0.909214366606785, 'learning_rate': 1.3664627452918021e-07, 'epoch': 0.93}
93%|█████████▎| 31799/34278 [34:47:30<3:11:45, 4.64s/it] {'loss': 0.0843, 'grad_norm': 0.8532918532153793, 'learning_rate': 1.3653660233641397e-07, 'epoch': 0.93}
93%|█████████▎| 31800/34278 [34:47:33<2:51:46, 4.16s/it] {'loss': 0.1055, 'grad_norm': 0.9876479818329683, 'learning_rate': 1.364269735634882e-07, 'epoch': 0.93}
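The two `PIL.UnidentifiedImageError` tracebacks above come from samples whose fetched bytes cannot be decoded as an image; the log's "Failed to fetch sample N" lines show the exception being caught so training can continue. A minimal sketch of such a defensive decode step, assuming only the `Image.open(buff)` call visible in the traceback (the function name and RGB conversion here are illustrative, not the repository's actual `pil_loader`):

```python
import io

from PIL import Image, UnidentifiedImageError

def safe_pil_loader(img_bytes: bytes):
    """Decode raw bytes into a PIL image, returning None on corrupt data.

    Mirrors the Image.open(buff) call in the traceback above, but traps
    UnidentifiedImageError so a bad sample can be skipped instead of
    crashing the dataloader worker.
    """
    buff = io.BytesIO(img_bytes)
    try:
        img = Image.open(buff)
        img.load()  # Image.open is lazy; force decoding so errors surface here
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None
```

A dataset's `__getitem__` can then log the bad index and resample a different one when `None` comes back, which matches the retry behavior implied by the log.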
93%|█████████▎| 31801/34278 [34:47:36<2:42:17, 3.93s/it] {'loss': 0.0961, 'grad_norm': 0.6862723268211314, 'learning_rate': 1.363173882113805e-07, 'epoch': 0.93} 93%|█████████▎| 31801/34278 [34:47:36<2:42:17, 3.93s/it] 93%|█████████▎| 31802/34278 [34:47:39<2:33:33, 3.72s/it] {'loss': 0.1088, 'grad_norm': 0.8376938152866861, 'learning_rate': 1.3620784628107065e-07, 'epoch': 0.93} 93%|█████████▎| 31802/34278 [34:47:39<2:33:33, 3.72s/it] 93%|█████████▎| 31803/34278 [34:47:42<2:20:15, 3.40s/it] {'loss': 0.1069, 'grad_norm': 0.8994906102792165, 'learning_rate': 1.3609834777353669e-07, 'epoch': 0.93} 93%|█████████▎| 31803/34278 [34:47:42<2:20:15, 3.40s/it] 93%|█████████▎| 31804/34278 [34:47:48<2:54:19, 4.23s/it] {'loss': 0.1073, 'grad_norm': 1.007395043983961, 'learning_rate': 1.3598889268975457e-07, 'epoch': 0.93} 93%|█████████▎| 31804/34278 [34:47:48<2:54:19, 4.23s/it] 93%|█████████▎| 31805/34278 [34:47:51<2:38:28, 3.84s/it] {'loss': 0.1094, 'grad_norm': 0.7227518113123437, 'learning_rate': 1.3587948103070237e-07, 'epoch': 0.93} 93%|█████████▎| 31805/34278 [34:47:51<2:38:28, 3.84s/it] 93%|█████████▎| 31806/34278 [34:47:54<2:33:21, 3.72s/it] {'loss': 0.0976, 'grad_norm': 0.849188319709201, 'learning_rate': 1.3577011279735763e-07, 'epoch': 0.93} 93%|█████████▎| 31806/34278 [34:47:54<2:33:21, 3.72s/it] 93%|█████████▎| 31807/34278 [34:47:57<2:24:08, 3.50s/it] {'loss': 0.1007, 'grad_norm': 0.7660060966761473, 'learning_rate': 1.3566078799069625e-07, 'epoch': 0.93} 93%|█████████▎| 31807/34278 [34:47:57<2:24:08, 3.50s/it] 93%|█████████▎| 31808/34278 [34:48:00<2:14:46, 3.27s/it] {'loss': 0.1146, 'grad_norm': 0.703551761846541, 'learning_rate': 1.3555150661169358e-07, 'epoch': 0.93} 93%|█████████▎| 31808/34278 [34:48:00<2:14:46, 3.27s/it] 93%|█████████▎| 31809/34278 [34:48:03<2:07:56, 3.11s/it] {'loss': 0.1049, 'grad_norm': 0.9981733909666994, 'learning_rate': 1.3544226866132658e-07, 'epoch': 0.93} 93%|█████████▎| 31809/34278 [34:48:03<2:07:56, 3.11s/it] 93%|█████████▎| 
31810/34278 [34:48:06<2:11:27, 3.20s/it] {'loss': 0.1067, 'grad_norm': 0.8990268865529819, 'learning_rate': 1.3533307414056894e-07, 'epoch': 0.93} 93%|█████████▎| 31810/34278 [34:48:06<2:11:27, 3.20s/it] 93%|█████████▎| 31811/34278 [34:48:12<2:38:47, 3.86s/it] {'loss': 0.1215, 'grad_norm': 0.8876133612278219, 'learning_rate': 1.3522392305039656e-07, 'epoch': 0.93} 93%|█████████▎| 31811/34278 [34:48:12<2:38:47, 3.86s/it] 93%|█████████▎| 31812/34278 [34:48:16<2:41:58, 3.94s/it] {'loss': 0.1132, 'grad_norm': 0.8881889152314952, 'learning_rate': 1.3511481539178362e-07, 'epoch': 0.93} 93%|█████████▎| 31812/34278 [34:48:16<2:41:58, 3.94s/it] 93%|█████████▎| 31813/34278 [34:48:19<2:32:28, 3.71s/it] {'loss': 0.1052, 'grad_norm': 0.9820246173766248, 'learning_rate': 1.350057511657049e-07, 'epoch': 0.93} 93%|█████████▎| 31813/34278 [34:48:19<2:32:28, 3.71s/it] 93%|█████████▎| 31814/34278 [34:48:22<2:22:47, 3.48s/it] {'loss': 0.1432, 'grad_norm': 0.868547077141067, 'learning_rate': 1.34896730373133e-07, 'epoch': 0.93} 93%|█████████▎| 31814/34278 [34:48:22<2:22:47, 3.48s/it] 93%|█████████▎| 31815/34278 [34:48:28<2:49:52, 4.14s/it] {'loss': 0.1048, 'grad_norm': 0.7482346363401683, 'learning_rate': 1.3478775301504154e-07, 'epoch': 0.93} 93%|█████████▎| 31815/34278 [34:48:28<2:49:52, 4.14s/it] 93%|█████████▎| 31816/34278 [34:48:31<2:44:55, 4.02s/it] {'loss': 0.1037, 'grad_norm': 0.7306615085695191, 'learning_rate': 1.346788190924031e-07, 'epoch': 0.93} 93%|█████████▎| 31816/34278 [34:48:31<2:44:55, 4.02s/it] 93%|█████████▎| 31817/34278 [34:48:35<2:38:29, 3.86s/it] {'loss': 0.112, 'grad_norm': 0.8326001612957701, 'learning_rate': 1.3456992860619188e-07, 'epoch': 0.93} 93%|█████████▎| 31817/34278 [34:48:35<2:38:29, 3.86s/it] 93%|█████████▎| 31818/34278 [34:48:39<2:39:58, 3.90s/it] {'loss': 0.1194, 'grad_norm': 0.9366844088472931, 'learning_rate': 1.3446108155737826e-07, 'epoch': 0.93} 93%|█████████▎| 31818/34278 [34:48:39<2:39:58, 3.90s/it] 93%|█████████▎| 31819/34278 
[34:48:44<2:58:07, 4.35s/it] {'loss': 0.107, 'grad_norm': 0.872850129913893, 'learning_rate': 1.3435227794693472e-07, 'epoch': 0.93} 93%|█████████▎| 31819/34278 [34:48:44<2:58:07, 4.35s/it] 93%|█████████▎| 31820/34278 [34:48:50<3:12:11, 4.69s/it] {'loss': 0.1209, 'grad_norm': 0.8333367549240172, 'learning_rate': 1.3424351777583278e-07, 'epoch': 0.93} 93%|█████████▎| 31820/34278 [34:48:50<3:12:11, 4.69s/it] 93%|█████████▎| 31821/34278 [34:48:53<2:52:43, 4.22s/it] {'loss': 0.1027, 'grad_norm': 0.7771479190120988, 'learning_rate': 1.3413480104504272e-07, 'epoch': 0.93} 93%|█████████▎| 31821/34278 [34:48:53<2:52:43, 4.22s/it] 93%|█████████▎| 31822/34278 [34:48:56<2:43:49, 4.00s/it] {'loss': 0.107, 'grad_norm': 0.790493073916453, 'learning_rate': 1.3402612775553546e-07, 'epoch': 0.93} 93%|█████████▎| 31822/34278 [34:48:56<2:43:49, 4.00s/it] 93%|█████████▎| 31823/34278 [34:49:01<2:54:39, 4.27s/it] {'loss': 0.1121, 'grad_norm': 0.900113995987599, 'learning_rate': 1.339174979082819e-07, 'epoch': 0.93} 93%|█████████▎| 31823/34278 [34:49:01<2:54:39, 4.27s/it] 93%|█████████▎| 31824/34278 [34:49:07<3:14:31, 4.76s/it] {'loss': 0.0845, 'grad_norm': 0.7251102759118273, 'learning_rate': 1.3380891150425068e-07, 'epoch': 0.93} 93%|█████████▎| 31824/34278 [34:49:07<3:14:31, 4.76s/it] 93%|█████████▎| 31825/34278 [34:49:10<2:58:04, 4.36s/it] {'loss': 0.1111, 'grad_norm': 1.0832926473263882, 'learning_rate': 1.3370036854441216e-07, 'epoch': 0.93} 93%|█████████▎| 31825/34278 [34:49:10<2:58:04, 4.36s/it] 93%|█████████▎| 31826/34278 [34:49:17<3:23:57, 4.99s/it] {'loss': 0.1318, 'grad_norm': 0.7568245320348015, 'learning_rate': 1.3359186902973554e-07, 'epoch': 0.93} 93%|█████████▎| 31826/34278 [34:49:17<3:23:57, 4.99s/it] 93%|█████████▎| 31827/34278 [34:49:21<3:09:06, 4.63s/it] {'loss': 0.1185, 'grad_norm': 0.8860534980102193, 'learning_rate': 1.334834129611884e-07, 'epoch': 0.93} 93%|█████████▎| 31827/34278 [34:49:21<3:09:06, 4.63s/it] 93%|█████████▎| 31828/34278 [34:49:24<2:48:19, 
4.12s/it] {'loss': 0.1006, 'grad_norm': 0.8883106338757215, 'learning_rate': 1.3337500033973882e-07, 'epoch': 0.93} 93%|█████████▎| 31828/34278 [34:49:24<2:48:19, 4.12s/it] 93%|█████████▎| 31829/34278 [34:49:29<3:02:56, 4.48s/it] {'loss': 0.1163, 'grad_norm': 0.9404115406259522, 'learning_rate': 1.3326663116635717e-07, 'epoch': 0.93} 93%|█████████▎| 31829/34278 [34:49:29<3:02:56, 4.48s/it] 93%|█████████▎| 31830/34278 [34:49:32<2:47:50, 4.11s/it] {'loss': 0.1209, 'grad_norm': 0.7562662765938616, 'learning_rate': 1.3315830544200826e-07, 'epoch': 0.93} 93%|█████████▎| 31830/34278 [34:49:32<2:47:50, 4.11s/it] 93%|█████████▎| 31831/34278 [34:49:37<2:57:37, 4.36s/it] {'loss': 0.1243, 'grad_norm': 0.7330848003186264, 'learning_rate': 1.3305002316766013e-07, 'epoch': 0.93} 93%|█████████▎| 31831/34278 [34:49:37<2:57:37, 4.36s/it] 93%|█████████▎| 31832/34278 [34:49:41<2:53:22, 4.25s/it] {'loss': 0.1208, 'grad_norm': 0.9121372954925305, 'learning_rate': 1.329417843442804e-07, 'epoch': 0.93} 93%|█████████▎| 31832/34278 [34:49:41<2:53:22, 4.25s/it] 93%|█████████▎| 31833/34278 [34:49:44<2:39:20, 3.91s/it] {'loss': 0.1244, 'grad_norm': 0.8451668065558846, 'learning_rate': 1.3283358897283438e-07, 'epoch': 0.93} 93%|█████████▎| 31833/34278 [34:49:44<2:39:20, 3.91s/it] 93%|█████████▎| 31834/34278 [34:49:47<2:24:54, 3.56s/it] {'loss': 0.1137, 'grad_norm': 0.8449992401771372, 'learning_rate': 1.3272543705428742e-07, 'epoch': 0.93} 93%|█████████▎| 31834/34278 [34:49:47<2:24:54, 3.56s/it] 93%|█████████▎| 31835/34278 [34:49:51<2:27:54, 3.63s/it] {'loss': 0.0831, 'grad_norm': 0.9208350676847564, 'learning_rate': 1.3261732858960598e-07, 'epoch': 0.93} 93%|█████████▎| 31835/34278 [34:49:51<2:27:54, 3.63s/it] 93%|█████████▎| 31836/34278 [34:49:54<2:22:20, 3.50s/it] {'loss': 0.1007, 'grad_norm': 0.6965852645334017, 'learning_rate': 1.3250926357975537e-07, 'epoch': 0.93} 93%|█████████▎| 31836/34278 [34:49:54<2:22:20, 3.50s/it] 93%|█████████▎| 31837/34278 [34:50:00<2:53:12, 4.26s/it] {'loss': 
0.1049, 'grad_norm': 0.72587681724953, 'learning_rate': 1.3240124202570038e-07, 'epoch': 0.93} 93%|█████████▎| 31837/34278 [34:50:00<2:53:12, 4.26s/it] 93%|█████████▎| 31838/34278 [34:50:03<2:34:23, 3.80s/it] {'loss': 0.1028, 'grad_norm': 0.6481746470484258, 'learning_rate': 1.3229326392840468e-07, 'epoch': 0.93} 93%|█████████▎| 31838/34278 [34:50:03<2:34:23, 3.80s/it] 93%|█████████▎| 31839/34278 [34:50:06<2:22:11, 3.50s/it] {'loss': 0.099, 'grad_norm': 0.8006667847317643, 'learning_rate': 1.321853292888331e-07, 'epoch': 0.93} 93%|█████████▎| 31839/34278 [34:50:06<2:22:11, 3.50s/it] 93%|█████████▎| 31840/34278 [34:50:09<2:26:43, 3.61s/it] {'loss': 0.1228, 'grad_norm': 0.7546421097337805, 'learning_rate': 1.3207743810794815e-07, 'epoch': 0.93} 93%|█████████▎| 31840/34278 [34:50:09<2:26:43, 3.61s/it] 93%|█████████▎| 31841/34278 [34:50:13<2:30:18, 3.70s/it] {'loss': 0.1136, 'grad_norm': 0.6913794747741638, 'learning_rate': 1.3196959038671464e-07, 'epoch': 0.93} 93%|█████████▎| 31841/34278 [34:50:13<2:30:18, 3.70s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
93%|█████████▎| 31842/34278 [34:50:17<2:30:17, 3.70s/it] {'loss': 0.1047, 'grad_norm': 0.9952210474587667, 'learning_rate': 1.3186178612609346e-07, 'epoch': 0.93} 93%|█████████▎| 31842/34278 [34:50:17<2:30:17, 3.70s/it] 93%|█████████▎| 31843/34278 [34:50:20<2:25:57, 3.60s/it] {'loss': 0.1214, 'grad_norm': 0.9197226240457339, 'learning_rate': 1.317540253270494e-07, 'epoch': 0.93} 93%|█████████▎| 31843/34278 [34:50:20<2:25:57, 3.60s/it] 93%|█████████▎| 31844/34278 [34:50:24<2:28:31, 3.66s/it] {'loss': 0.1196, 'grad_norm': 0.7698174611805969, 'learning_rate': 1.316463079905428e-07, 'epoch': 0.93} 93%|█████████▎| 31844/34278 [34:50:24<2:28:31, 3.66s/it] 93%|█████████▎| 31845/34278 [34:50:30<2:56:54, 4.36s/it] {'loss': 0.1147, 'grad_norm': 0.9122035865888464, 'learning_rate': 1.3153863411753508e-07, 'epoch': 0.93} 93%|█████████▎| 31845/34278 [34:50:30<2:56:54, 4.36s/it] 93%|█████████▎| 31846/34278 [34:50:33<2:41:41, 3.99s/it] {'loss': 0.0893, 'grad_norm': 0.7490563563043272, 'learning_rate': 1.3143100370898886e-07, 'epoch': 0.93} 93%|█████████▎| 31846/34278 [34:50:33<2:41:41, 3.99s/it] 93%|█████████▎| 31847/34278 [34:50:37<2:33:22, 3.79s/it] {'loss': 0.1234, 'grad_norm': 0.8431775605464544, 'learning_rate': 1.3132341676586447e-07, 'epoch': 0.93} 93%|█████████▎| 31847/34278 [34:50:37<2:33:22, 3.79s/it] 93%|█████████▎| 31848/34278 [34:50:40<2:24:41, 3.57s/it] {'loss': 0.1118, 'grad_norm': 0.7153092145860344, 'learning_rate': 1.3121587328912222e-07, 'epoch': 0.93} 93%|█████████▎| 31848/34278 [34:50:40<2:24:41, 3.57s/it] 93%|█████████▎| 31849/34278 [34:50:43<2:20:05, 3.46s/it] {'loss': 0.1406, 'grad_norm': 1.19776123598946, 'learning_rate': 1.3110837327972248e-07, 'epoch': 0.93} 93%|█████████▎| 31849/34278 [34:50:43<2:20:05, 3.46s/it] 93%|█████████▎| 31850/34278 [34:50:47<2:22:16, 3.52s/it] {'loss': 0.1015, 'grad_norm': 0.8648257833374993, 'learning_rate': 1.3100091673862502e-07, 'epoch': 0.93} 93%|█████████▎| 31850/34278
[34:50:47<2:22:16, 3.52s/it] 93%|█████████▎| 31851/34278 [34:50:52<2:43:44, 4.05s/it] {'loss': 0.1013, 'grad_norm': 0.6255037728739352, 'learning_rate': 1.3089350366678855e-07, 'epoch': 0.93} 93%|█████████▎| 31851/34278 [34:50:52<2:43:44, 4.05s/it] 93%|█████████▎| 31852/34278 [34:50:56<2:45:18, 4.09s/it] {'loss': 0.095, 'grad_norm': 0.8270467120913118, 'learning_rate': 1.3078613406517228e-07, 'epoch': 0.93} 93%|█████████▎| 31852/34278 [34:50:56<2:45:18, 4.09s/it] 93%|█████████▎| 31853/34278 [34:50:59<2:33:18, 3.79s/it] {'loss': 0.1041, 'grad_norm': 0.9490322085209174, 'learning_rate': 1.3067880793473597e-07, 'epoch': 0.93} 93%|█████████▎| 31853/34278 [34:50:59<2:33:18, 3.79s/it] 93%|█████████▎| 31854/34278 [34:51:02<2:24:53, 3.59s/it] {'loss': 0.1159, 'grad_norm': 0.7924175106467733, 'learning_rate': 1.3057152527643668e-07, 'epoch': 0.93} 93%|█████████▎| 31854/34278 [34:51:02<2:24:53, 3.59s/it] 93%|█████████▎| 31855/34278 [34:51:05<2:16:33, 3.38s/it] {'loss': 0.1125, 'grad_norm': 0.8274527911865962, 'learning_rate': 1.3046428609123196e-07, 'epoch': 0.93} 93%|█████████▎| 31855/34278 [34:51:05<2:16:33, 3.38s/it] 93%|█████████▎| 31856/34278 [34:51:11<2:40:33, 3.98s/it] {'loss': 0.1152, 'grad_norm': 0.9026162693608782, 'learning_rate': 1.3035709038007993e-07, 'epoch': 0.93} 93%|█████████▎| 31856/34278 [34:51:11<2:40:33, 3.98s/it] 93%|█████████▎| 31857/34278 [34:51:17<3:05:36, 4.60s/it] {'loss': 0.1127, 'grad_norm': 0.780596824956173, 'learning_rate': 1.302499381439376e-07, 'epoch': 0.93} 93%|█████████▎| 31857/34278 [34:51:17<3:05:36, 4.60s/it] 93%|█████████▎| 31858/34278 [34:51:20<2:48:56, 4.19s/it] {'loss': 0.1046, 'grad_norm': 0.5554970680070257, 'learning_rate': 1.3014282938376034e-07, 'epoch': 0.93} 93%|█████████▎| 31858/34278 [34:51:20<2:48:56, 4.19s/it] 93%|█████████▎| 31859/34278 [34:51:23<2:40:17, 3.98s/it] {'loss': 0.1093, 'grad_norm': 0.7311824802092963, 'learning_rate': 1.3003576410050623e-07, 'epoch': 0.93} 93%|█████████▎| 31859/34278 [34:51:23<2:40:17, 
3.98s/it] 93%|█████████▎| 31860/34278 [34:51:26<2:30:18, 3.73s/it] {'loss': 0.096, 'grad_norm': 0.8817108602131031, 'learning_rate': 1.299287422951301e-07, 'epoch': 0.93} 93%|█████████▎| 31860/34278 [34:51:26<2:30:18, 3.73s/it] 93%|█████████▎| 31861/34278 [34:51:29<2:21:25, 3.51s/it] {'loss': 0.0966, 'grad_norm': 0.7418745675867275, 'learning_rate': 1.2982176396858725e-07, 'epoch': 0.93} 93%|█████████▎| 31861/34278 [34:51:29<2:21:25, 3.51s/it] 93%|█████████▎| 31862/34278 [34:51:33<2:17:51, 3.42s/it] {'loss': 0.1137, 'grad_norm': 0.8576244592350082, 'learning_rate': 1.2971482912183363e-07, 'epoch': 0.93} 93%|█████████▎| 31862/34278 [34:51:33<2:17:51, 3.42s/it] 93%|█████████▎| 31863/34278 [34:51:35<2:10:28, 3.24s/it] {'loss': 0.1107, 'grad_norm': 0.9123038104314924, 'learning_rate': 1.2960793775582347e-07, 'epoch': 0.93} 93%|█████████▎| 31863/34278 [34:51:35<2:10:28, 3.24s/it] 93%|█████████▎| 31864/34278 [34:51:38<2:06:14, 3.14s/it] {'loss': 0.0852, 'grad_norm': 0.6741393165961056, 'learning_rate': 1.2950108987151045e-07, 'epoch': 0.93} 93%|█████████▎| 31864/34278 [34:51:38<2:06:14, 3.14s/it] 93%|█████████▎| 31865/34278 [34:51:42<2:06:48, 3.15s/it] {'loss': 0.099, 'grad_norm': 0.7158006914295666, 'learning_rate': 1.2939428546984878e-07, 'epoch': 0.93} 93%|█████████▎| 31865/34278 [34:51:42<2:06:48, 3.15s/it] 93%|█████████▎| 31866/34278 [34:51:44<2:03:23, 3.07s/it] {'loss': 0.104, 'grad_norm': 0.8480768058484065, 'learning_rate': 1.292875245517927e-07, 'epoch': 0.93} 93%|█████████▎| 31866/34278 [34:51:44<2:03:23, 3.07s/it] 93%|█████████▎| 31867/34278 [34:51:47<2:00:15, 2.99s/it] {'loss': 0.1127, 'grad_norm': 1.2313390211905544, 'learning_rate': 1.2918080711829483e-07, 'epoch': 0.93} 93%|█████████▎| 31867/34278 [34:51:47<2:00:15, 2.99s/it] 93%|█████████▎| 31868/34278 [34:51:51<2:14:48, 3.36s/it] {'loss': 0.1576, 'grad_norm': 0.8906095587323312, 'learning_rate': 1.2907413317030771e-07, 'epoch': 0.93} 93%|█████████▎| 31868/34278 [34:51:51<2:14:48, 3.36s/it] 
93%|█████████▎| 31869/34278 [34:51:57<2:38:49, 3.96s/it] {'loss': 0.101, 'grad_norm': 0.6460838578639841, 'learning_rate': 1.2896750270878445e-07, 'epoch': 0.93} 93%|█████████▎| 31869/34278 [34:51:57<2:38:49, 3.96s/it] 93%|█████████▎| 31870/34278 [34:52:01<2:46:39, 4.15s/it] {'loss': 0.1133, 'grad_norm': 0.9412761702099182, 'learning_rate': 1.2886091573467597e-07, 'epoch': 0.93} 93%|█████████▎| 31870/34278 [34:52:01<2:46:39, 4.15s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
93%|█████████▎| 31871/34278 [34:52:04<2:31:24, 3.77s/it] {'loss': 0.1114, 'grad_norm': 0.8029726891533026, 'learning_rate': 1.2875437224893485e-07, 'epoch': 0.93} 93%|█████████▎| 31871/34278 [34:52:04<2:31:24, 3.77s/it] 93%|█████████▎| 31872/34278 [34:52:08<2:36:12, 3.90s/it] {'loss': 0.1216, 'grad_norm': 0.7234335704268551, 'learning_rate': 1.2864787225251141e-07, 'epoch': 0.93} 93%|█████████▎| 31872/34278 [34:52:08<2:36:12, 3.90s/it] 93%|█████████▎| 31873/34278 [34:52:12<2:27:05, 3.67s/it] {'loss': 0.1079, 'grad_norm': 0.8342462671527551, 'learning_rate': 1.2854141574635714e-07, 'epoch': 0.93} 93%|█████████▎| 31873/34278 [34:52:12<2:27:05, 3.67s/it] 93%|█████████▎| 31874/34278 [34:52:16<2:38:25, 3.95s/it] {'loss': 0.1159, 'grad_norm': 0.8613265968178566, 'learning_rate': 1.284350027314224e-07, 'epoch': 0.93} 93%|█████████▎| 31874/34278 [34:52:16<2:38:25, 3.95s/it] 93%|█████████▎| 31875/34278 [34:52:20<2:30:07, 3.75s/it] {'loss': 0.1247, 'grad_norm': 0.8961476307437986, 'learning_rate': 1.2832863320865696e-07, 'epoch': 0.93} 93%|█████████▎| 31875/34278 [34:52:20<2:30:07, 3.75s/it] 93%|█████████▎| 31876/34278 [34:52:25<2:56:33, 4.41s/it] {'loss': 0.1247, 'grad_norm': 0.7027444420307065, 'learning_rate': 1.282223071790101e-07, 'epoch': 0.93} 93%|█████████▎| 31876/34278 [34:52:25<2:56:33, 4.41s/it] 93%|█████████▎|
31877/34278 [34:52:30<2:56:06, 4.40s/it] {'loss': 0.1157, 'grad_norm': 0.9564352845338436, 'learning_rate': 1.2811602464343155e-07, 'epoch': 0.93} 93%|█████████▎| 31877/34278 [34:52:30<2:56:06, 4.40s/it] 93%|█████████▎| 31878/34278 [34:52:33<2:45:28, 4.14s/it] {'loss': 0.1266, 'grad_norm': 1.1018576709758896, 'learning_rate': 1.2800978560287002e-07, 'epoch': 0.93} 93%|█████████▎| 31878/34278 [34:52:33<2:45:28, 4.14s/it] 93%|█████████▎| 31879/34278 [34:52:36<2:32:58, 3.83s/it] {'loss': 0.1055, 'grad_norm': 0.8748241148358271, 'learning_rate': 1.279035900582748e-07, 'epoch': 0.93} 93%|█████████▎| 31879/34278 [34:52:36<2:32:58, 3.83s/it] 93%|█████████▎| 31880/34278 [34:52:39<2:22:24, 3.56s/it] {'loss': 0.1082, 'grad_norm': 0.9006780298953522, 'learning_rate': 1.2779743801059285e-07, 'epoch': 0.93} 93%|█████████▎| 31880/34278 [34:52:39<2:22:24, 3.56s/it] 93%|█████████▎| 31881/34278 [34:52:43<2:19:01, 3.48s/it] {'loss': 0.1543, 'grad_norm': 0.8733716203451908, 'learning_rate': 1.2769132946077235e-07, 'epoch': 0.93} 93%|█████████▎| 31881/34278 [34:52:43<2:19:01, 3.48s/it] 93%|█████████▎| 31882/34278 [34:52:46<2:10:51, 3.28s/it] {'loss': 0.1141, 'grad_norm': 0.7921478917179678, 'learning_rate': 1.2758526440976028e-07, 'epoch': 0.93} 93%|█████████▎| 31882/34278 [34:52:46<2:10:51, 3.28s/it] 93%|█████████▎| 31883/34278 [34:52:49<2:10:03, 3.26s/it] {'loss': 0.093, 'grad_norm': 0.716895583888058, 'learning_rate': 1.274792428585042e-07, 'epoch': 0.93} 93%|█████████▎| 31883/34278 [34:52:49<2:10:03, 3.26s/it] 93%|█████████▎| 31884/34278 [34:52:51<2:01:31, 3.05s/it] {'loss': 0.118, 'grad_norm': 0.9013773962196722, 'learning_rate': 1.273732648079501e-07, 'epoch': 0.93} 93%|█████████▎| 31884/34278 [34:52:51<2:01:31, 3.05s/it] 93%|█████████▎| 31885/34278 [34:52:54<2:01:33, 3.05s/it] {'loss': 0.1038, 'grad_norm': 0.8308549976620867, 'learning_rate': 1.2726733025904436e-07, 'epoch': 0.93} 93%|█████████▎| 31885/34278 [34:52:54<2:01:33, 3.05s/it] 93%|█████████▎| 31886/34278 
[34:53:00<2:36:44, 3.93s/it] {'loss': 0.1128, 'grad_norm': 0.796908700369008, 'learning_rate': 1.271614392127324e-07, 'epoch': 0.93} 93%|█████████▎| 31886/34278 [34:53:00<2:36:44, 3.93s/it] 93%|█████████▎| 31887/34278 [34:53:04<2:34:58, 3.89s/it] {'loss': 0.1173, 'grad_norm': 0.7871456170316263, 'learning_rate': 1.2705559166996063e-07, 'epoch': 0.93} 93%|█████████▎| 31887/34278 [34:53:04<2:34:58, 3.89s/it] 93%|█████████▎| 31888/34278 [34:53:07<2:27:33, 3.70s/it] {'loss': 0.1364, 'grad_norm': 0.9454585672221922, 'learning_rate': 1.2694978763167165e-07, 'epoch': 0.93} 93%|█████████▎| 31888/34278 [34:53:07<2:27:33, 3.70s/it] 93%|█████████▎| 31889/34278 [34:53:10<2:17:01, 3.44s/it] {'loss': 0.1072, 'grad_norm': 1.0150272764416546, 'learning_rate': 1.2684402709881305e-07, 'epoch': 0.93} 93%|█████████▎| 31889/34278 [34:53:10<2:17:01, 3.44s/it] 93%|█████████▎| 31890/34278 [34:53:13<2:12:28, 3.33s/it] {'loss': 0.1134, 'grad_norm': 0.7229796377824613, 'learning_rate': 1.2673831007232795e-07, 'epoch': 0.93} 93%|█████████▎| 31890/34278 [34:53:13<2:12:28, 3.33s/it] 93%|█████████▎| 31891/34278 [34:53:20<2:48:53, 4.25s/it] {'loss': 0.1112, 'grad_norm': 0.8158135053267521, 'learning_rate': 1.2663263655315894e-07, 'epoch': 0.93} 93%|█████████▎| 31891/34278 [34:53:20<2:48:53, 4.25s/it] 93%|█████████▎| 31892/34278 [34:53:23<2:39:57, 4.02s/it] {'loss': 0.1061, 'grad_norm': 0.8555090184417553, 'learning_rate': 1.265270065422508e-07, 'epoch': 0.93} 93%|█████████▎| 31892/34278 [34:53:23<2:39:57, 4.02s/it] 93%|█████████▎| 31893/34278 [34:53:28<2:49:24, 4.26s/it] {'loss': 0.1042, 'grad_norm': 0.771408734266717, 'learning_rate': 1.2642142004054615e-07, 'epoch': 0.93} 93%|█████████▎| 31893/34278 [34:53:28<2:49:24, 4.26s/it] 93%|█████████▎| 31894/34278 [34:53:31<2:35:56, 3.92s/it] {'loss': 0.1167, 'grad_norm': 0.8500817337899947, 'learning_rate': 1.2631587704898752e-07, 'epoch': 0.93} 93%|█████████▎| 31894/34278 [34:53:31<2:35:56, 3.92s/it] 93%|█████████▎| 31895/34278 [34:53:36<2:46:51, 
4.20s/it] {'loss': 0.1069, 'grad_norm': 0.8788507385964561, 'learning_rate': 1.2621037756851695e-07, 'epoch': 0.93} 93%|█████████▎| 31895/34278 [34:53:36<2:46:51, 4.20s/it] 93%|█████████▎| 31896/34278 [34:53:39<2:38:23, 3.99s/it] {'loss': 0.1039, 'grad_norm': 0.8389933634147309, 'learning_rate': 1.261049216000776e-07, 'epoch': 0.93} 93%|█████████▎| 31896/34278 [34:53:39<2:38:23, 3.99s/it] 93%|█████████▎| 31897/34278 [34:53:43<2:32:05, 3.83s/it] {'loss': 0.1062, 'grad_norm': 0.6556630091108445, 'learning_rate': 1.259995091446098e-07, 'epoch': 0.93} 93%|█████████▎| 31897/34278 [34:53:43<2:32:05, 3.83s/it] 93%|█████████▎| 31898/34278 [34:53:46<2:27:01, 3.71s/it] {'loss': 0.1197, 'grad_norm': 0.7419564977416073, 'learning_rate': 1.258941402030539e-07, 'epoch': 0.93} 93%|█████████▎| 31898/34278 [34:53:46<2:27:01, 3.71s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8dbf7b9030>
Failed to fetch sample 3353709. Exception: cannot identify image file <_io.BytesIO object at 0x7f8dbf7b9030>
93%|█████████▎| 31899/34278 [34:53:52<2:52:58, 4.36s/it] {'loss': 0.1234, 'grad_norm': 0.9155257068684064, 'learning_rate': 1.2578881477635252e-07, 'epoch': 0.93} 93%|█████████▎| 31899/34278 [34:53:52<2:52:58, 4.36s/it] 93%|█████████▎| 31900/34278 [34:53:56<2:41:26, 4.07s/it] {'loss': 0.1127, 'grad_norm': 0.9316605553815461, 'learning_rate': 1.2568353286544432e-07, 'epoch': 0.93} 93%|█████████▎| 31900/34278 [34:53:56<2:41:26, 4.07s/it] 93%|█████████▎| 31901/34278 [34:53:59<2:30:14, 3.79s/it] {'loss': 0.1421, 'grad_norm': 0.7153606449528723, 'learning_rate': 1.2557829447127078e-07, 'epoch': 0.93} 93%|█████████▎| 31901/34278 [34:53:59<2:30:14, 3.79s/it] 93%|█████████▎| 31902/34278 [34:54:02<2:27:19, 3.72s/it] {'loss': 0.121, 'grad_norm': 0.8417139113878996, 'learning_rate': 1.2547309959477006e-07, 'epoch': 0.93} 93%|█████████▎| 31902/34278 [34:54:02<2:27:19, 3.72s/it] 93%|█████████▎| 31903/34278 [34:54:05<2:14:16, 3.39s/it] {'loss': 0.1217, 'grad_norm': 0.8498931817356064, 'learning_rate': 1.253679482368819e-07, 'epoch': 0.93} 93%|█████████▎| 31903/34278 [34:54:05<2:14:16, 3.39s/it] 93%|█████████▎| 31904/34278 [34:54:11<2:43:09, 4.12s/it] {'loss': 0.1281, 'grad_norm': 0.8064080262598519, 'learning_rate': 1.2526284039854563e-07, 'epoch': 0.93} 93%|█████████▎| 31904/34278 [34:54:11<2:43:09, 4.12s/it] 93%|█████████▎| 31905/34278 [34:54:14<2:32:16, 3.85s/it] {'loss': 0.1054, 'grad_norm': 0.6760580647127779, 'learning_rate': 1.2515777608069823e-07, 'epoch': 0.93} 93%|█████████▎| 31905/34278 [34:54:14<2:32:16, 3.85s/it] 93%|█████████▎| 31906/34278 [34:54:17<2:20:07, 3.54s/it] {'loss': 0.1087, 'grad_norm':
0.8039425443445063, 'learning_rate': 1.250527552842784e-07, 'epoch': 0.93} 93%|█████████▎| 31906/34278 [34:54:17<2:20:07, 3.54s/it] 93%|█████████▎| 31907/34278 [34:54:20<2:10:46, 3.31s/it] {'loss': 0.0989, 'grad_norm': 0.954453858923193, 'learning_rate': 1.2494777801022427e-07, 'epoch': 0.93} 93%|█████████▎| 31907/34278 [34:54:20<2:10:46, 3.31s/it] 93%|█████████▎| 31908/34278 [34:54:22<2:05:01, 3.17s/it] {'loss': 0.1338, 'grad_norm': 0.9314655885064037, 'learning_rate': 1.2484284425947236e-07, 'epoch': 0.93} 93%|█████████▎| 31908/34278 [34:54:22<2:05:01, 3.17s/it] 93%|█████████▎| 31909/34278 [34:54:26<2:07:15, 3.22s/it] {'loss': 0.1432, 'grad_norm': 1.0844910436201352, 'learning_rate': 1.2473795403296018e-07, 'epoch': 0.93} 93%|█████████▎| 31909/34278 [34:54:26<2:07:15, 3.22s/it] 93%|█████████▎| 31910/34278 [34:54:29<2:01:26, 3.08s/it] {'loss': 0.1106, 'grad_norm': 0.8381180465942429, 'learning_rate': 1.2463310733162371e-07, 'epoch': 0.93} 93%|█████████▎| 31910/34278 [34:54:29<2:01:26, 3.08s/it] 93%|█████████▎| 31911/34278 [34:54:32<2:10:24, 3.31s/it] {'loss': 0.1147, 'grad_norm': 0.6859338086230792, 'learning_rate': 1.2452830415639882e-07, 'epoch': 0.93} 93%|█████████▎| 31911/34278 [34:54:32<2:10:24, 3.31s/it] 93%|█████████▎| 31912/34278 [34:54:36<2:16:14, 3.46s/it] {'loss': 0.1113, 'grad_norm': 0.7367652410515695, 'learning_rate': 1.2442354450822092e-07, 'epoch': 0.93} 93%|█████████▎| 31912/34278 [34:54:36<2:16:14, 3.46s/it] 93%|█████████▎| 31913/34278 [34:54:39<2:11:37, 3.34s/it] {'loss': 0.0942, 'grad_norm': 0.9388791727098107, 'learning_rate': 1.2431882838802646e-07, 'epoch': 0.93} 93%|█████████▎| 31913/34278 [34:54:39<2:11:37, 3.34s/it] 93%|█████████▎| 31914/34278 [34:54:42<2:06:07, 3.20s/it] {'loss': 0.1083, 'grad_norm': 1.6551175408790229, 'learning_rate': 1.242141557967491e-07, 'epoch': 0.93} 93%|█████████▎| 31914/34278 [34:54:42<2:06:07, 3.20s/it] 93%|█████████▎| 31915/34278 [34:54:46<2:09:37, 3.29s/it] {'loss': 0.1214, 'grad_norm': 0.9551090036043243, 
'learning_rate': 1.2410952673532372e-07, 'epoch': 0.93} 93%|█████████▎| 31915/34278 [34:54:46<2:09:37, 3.29s/it] 93%|█████████▎| 31916/34278 [34:54:49<2:11:10, 3.33s/it] {'loss': 0.1086, 'grad_norm': 1.22625024778241, 'learning_rate': 1.240049412046851e-07, 'epoch': 0.93} 93%|█████████▎| 31916/34278 [34:54:49<2:11:10, 3.33s/it] 93%|█████████▎| 31917/34278 [34:54:52<2:05:02, 3.18s/it] {'loss': 0.112, 'grad_norm': 0.8530652157131269, 'learning_rate': 1.2390039920576636e-07, 'epoch': 0.93} 93%|█████████▎| 31917/34278 [34:54:52<2:05:02, 3.18s/it] 93%|█████████▎| 31918/34278 [34:54:55<2:02:27, 3.11s/it] {'loss': 0.1048, 'grad_norm': 0.8429285448486027, 'learning_rate': 1.2379590073949953e-07, 'epoch': 0.93} 93%|█████████▎| 31918/34278 [34:54:55<2:02:27, 3.11s/it] 93%|█████████▎| 31919/34278 [34:54:58<2:00:41, 3.07s/it] {'loss': 0.11, 'grad_norm': 0.8755016187817724, 'learning_rate': 1.2369144580682002e-07, 'epoch': 0.93} 93%|█████████▎| 31919/34278 [34:54:58<2:00:41, 3.07s/it] 93%|█████████▎| 31920/34278 [34:55:00<1:56:23, 2.96s/it] {'loss': 0.1269, 'grad_norm': 0.826503120525939, 'learning_rate': 1.2358703440865928e-07, 'epoch': 0.93} 93%|█████████▎| 31920/34278 [34:55:00<1:56:23, 2.96s/it] 93%|█████████▎| 31921/34278 [34:55:04<2:00:19, 3.06s/it] {'loss': 0.1065, 'grad_norm': 0.8628695192756125, 'learning_rate': 1.2348266654594932e-07, 'epoch': 0.93} 93%|█████████▎| 31921/34278 [34:55:04<2:00:19, 3.06s/it] 93%|█████████▎| 31922/34278 [34:55:10<2:32:53, 3.89s/it] {'loss': 0.1195, 'grad_norm': 0.8469502563510405, 'learning_rate': 1.2337834221962165e-07, 'epoch': 0.93} 93%|█████████▎| 31922/34278 [34:55:10<2:32:53, 3.89s/it] 93%|█████████▎| 31923/34278 [34:55:13<2:21:46, 3.61s/it] {'loss': 0.1244, 'grad_norm': 1.1229549960110636, 'learning_rate': 1.2327406143060826e-07, 'epoch': 0.93} 93%|█████████▎| 31923/34278 [34:55:13<2:21:46, 3.61s/it] 93%|█████████▎| 31924/34278 [34:55:16<2:20:16, 3.58s/it] {'loss': 0.1116, 'grad_norm': 0.7639869690161986, 'learning_rate': 
1.2316982417983958e-07, 'epoch': 0.93} 93%|█████████▎| 31924/34278 [34:55:16<2:20:16, 3.58s/it] 93%|█████████▎| 31925/34278 [34:55:20<2:19:14, 3.55s/it] {'loss': 0.1109, 'grad_norm': 0.7455086304433893, 'learning_rate': 1.230656304682465e-07, 'epoch': 0.93} 93%|█████████▎| 31925/34278 [34:55:20<2:19:14, 3.55s/it] 93%|█████████▎| 31926/34278 [34:55:23<2:20:44, 3.59s/it] {'loss': 0.1054, 'grad_norm': 1.0352930780386584, 'learning_rate': 1.229614802967599e-07, 'epoch': 0.93} 93%|█████████▎| 31926/34278 [34:55:23<2:20:44, 3.59s/it] 93%|█████████▎| 31927/34278 [34:55:27<2:17:40, 3.51s/it] {'loss': 0.1099, 'grad_norm': 0.7285119701188629, 'learning_rate': 1.2285737366630857e-07, 'epoch': 0.93} 93%|█████████▎| 31927/34278 [34:55:27<2:17:40, 3.51s/it] 93%|█████████▎| 31928/34278 [34:55:30<2:14:01, 3.42s/it] {'loss': 0.0959, 'grad_norm': 0.8975598205261752, 'learning_rate': 1.2275331057782224e-07, 'epoch': 0.93} 93%|█████████▎| 31928/34278 [34:55:30<2:14:01, 3.42s/it] 93%|█████████▎| 31929/34278 [34:55:36<2:43:03, 4.16s/it] {'loss': 0.1271, 'grad_norm': 0.870180636556917, 'learning_rate': 1.226492910322302e-07, 'epoch': 0.93} 93%|█████████▎| 31929/34278 [34:55:36<2:43:03, 4.16s/it] 93%|█████████▎| 31930/34278 [34:55:39<2:32:15, 3.89s/it] {'loss': 0.1095, 'grad_norm': 0.8956178959689346, 'learning_rate': 1.2254531503046062e-07, 'epoch': 0.93} 93%|█████████▎| 31930/34278 [34:55:39<2:32:15, 3.89s/it] 93%|█████████▎| 31931/34278 [34:55:42<2:24:42, 3.70s/it] {'loss': 0.0945, 'grad_norm': 0.6840812003072003, 'learning_rate': 1.2244138257344275e-07, 'epoch': 0.93} 93%|█████████▎| 31931/34278 [34:55:42<2:24:42, 3.70s/it] 93%|█████████▎| 31932/34278 [34:55:45<2:17:22, 3.51s/it] {'loss': 0.1007, 'grad_norm': 0.7504785332715114, 'learning_rate': 1.22337493662103e-07, 'epoch': 0.93} 93%|█████████▎| 31932/34278 [34:55:45<2:17:22, 3.51s/it] 93%|█████████▎| 31933/34278 [34:55:49<2:20:18, 3.59s/it] {'loss': 0.1053, 'grad_norm': 0.7472896064001563, 'learning_rate': 1.2223364829737072e-07, 
'epoch': 0.93} 93%|█████████▎| 31933/34278 [34:55:49<2:20:18, 3.59s/it] 93%|█████████▎| 31934/34278 [34:55:53<2:22:40, 3.65s/it] {'loss': 0.104, 'grad_norm': 0.8041613913350402, 'learning_rate': 1.221298464801718e-07, 'epoch': 0.93} 93%|█████████▎| 31934/34278 [34:55:53<2:22:40, 3.65s/it] 93%|█████████▎| 31935/34278 [34:55:56<2:17:29, 3.52s/it] {'loss': 0.0918, 'grad_norm': 0.915643845045573, 'learning_rate': 1.220260882114327e-07, 'epoch': 0.93} 93%|█████████▎| 31935/34278 [34:55:56<2:17:29, 3.52s/it] 93%|█████████▎| 31936/34278 [34:56:01<2:35:10, 3.98s/it] {'loss': 0.1321, 'grad_norm': 0.6798992345577578, 'learning_rate': 1.2192237349207993e-07, 'epoch': 0.93} 93%|█████████▎| 31936/34278 [34:56:01<2:35:10, 3.98s/it] 93%|█████████▎| 31937/34278 [34:56:04<2:28:19, 3.80s/it] {'loss': 0.1049, 'grad_norm': 0.8095701910911953, 'learning_rate': 1.218187023230405e-07, 'epoch': 0.93} 93%|█████████▎| 31937/34278 [34:56:04<2:28:19, 3.80s/it] 93%|█████████▎| 31938/34278 [34:56:10<2:54:06, 4.46s/it] {'loss': 0.1138, 'grad_norm': 0.7939759489021841, 'learning_rate': 1.2171507470523868e-07, 'epoch': 0.93} 93%|█████████▎| 31938/34278 [34:56:11<2:54:06, 4.46s/it] 93%|█████████▎| 31939/34278 [34:56:13<2:35:41, 3.99s/it] {'loss': 0.1215, 'grad_norm': 0.9562131842452575, 'learning_rate': 1.2161149063960042e-07, 'epoch': 0.93} 93%|█████████▎| 31939/34278 [34:56:13<2:35:41, 3.99s/it] 93%|█████████▎| 31940/34278 [34:56:19<2:58:08, 4.57s/it] {'loss': 0.0997, 'grad_norm': 0.6817020416031795, 'learning_rate': 1.2150795012705053e-07, 'epoch': 0.93} 93%|█████████▎| 31940/34278 [34:56:19<2:58:08, 4.57s/it] 93%|█████████▎| 31941/34278 [34:56:23<2:45:32, 4.25s/it] {'loss': 0.1235, 'grad_norm': 0.8691699055196034, 'learning_rate': 1.2140445316851212e-07, 'epoch': 0.93} 93%|█████████▎| 31941/34278 [34:56:23<2:45:32, 4.25s/it] 93%|█████████▎| 31942/34278 [34:56:26<2:30:50, 3.87s/it] {'loss': 0.1195, 'grad_norm': 0.956361779604108, 'learning_rate': 1.2130099976491062e-07, 'epoch': 0.93} 
93%|█████████▎| 31942/34278 [34:56:26<2:30:50, 3.87s/it] 93%|█████████▎| 31943/34278 [34:56:30<2:28:53, 3.83s/it] {'loss': 0.1021, 'grad_norm': 0.7365849996540791, 'learning_rate': 1.2119758991716912e-07, 'epoch': 0.93} 93%|█████████▎| 31943/34278 [34:56:30<2:28:53, 3.83s/it] 93%|█████████▎| 31944/34278 [34:56:33<2:20:26, 3.61s/it] {'loss': 0.1101, 'grad_norm': 0.6974657489834551, 'learning_rate': 1.2109422362621138e-07, 'epoch': 0.93} 93%|█████████▎| 31944/34278 [34:56:33<2:20:26, 3.61s/it] 93%|█████████▎| 31945/34278 [34:56:36<2:14:57, 3.47s/it] {'loss': 0.13, 'grad_norm': 0.9246905293246134, 'learning_rate': 1.2099090089295884e-07, 'epoch': 0.93} 93%|█████████▎| 31945/34278 [34:56:36<2:14:57, 3.47s/it] 93%|█████████▎| 31946/34278 [34:56:39<2:11:22, 3.38s/it] {'loss': 0.1279, 'grad_norm': 0.8507907797179879, 'learning_rate': 1.2088762171833579e-07, 'epoch': 0.93} 93%|█████████▎| 31946/34278 [34:56:39<2:11:22, 3.38s/it] 93%|█████████▎| 31947/34278 [34:56:53<4:14:27, 6.55s/it] {'loss': 0.1046, 'grad_norm': 0.7888337363170564, 'learning_rate': 1.207843861032626e-07, 'epoch': 0.93} 93%|█████████▎| 31947/34278 [34:56:53<4:14:27, 6.55s/it] 93%|█████████▎| 31948/34278 [34:56:57<3:43:22, 5.75s/it] {'loss': 0.1101, 'grad_norm': 0.7420549042870341, 'learning_rate': 1.206811940486613e-07, 'epoch': 0.93} 93%|█████████▎| 31948/34278 [34:56:57<3:43:22, 5.75s/it] 93%|█████████▎| 31949/34278 [34:57:00<3:16:35, 5.06s/it] {'loss': 0.1016, 'grad_norm': 0.7910105412932991, 'learning_rate': 1.205780455554545e-07, 'epoch': 0.93} 93%|█████████▎| 31949/34278 [34:57:00<3:16:35, 5.06s/it] 93%|█████████▎| 31950/34278 [34:57:03<2:55:07, 4.51s/it] {'loss': 0.1001, 'grad_norm': 0.8889477829657815, 'learning_rate': 1.2047494062456199e-07, 'epoch': 0.93} 93%|█████████▎| 31950/34278 [34:57:03<2:55:07, 4.51s/it] 93%|█████████▎| 31951/34278 [34:57:07<2:40:51, 4.15s/it] {'loss': 0.1186, 'grad_norm': 0.7223779251255619, 'learning_rate': 1.2037187925690364e-07, 'epoch': 0.93} 93%|█████████▎| 
31951/34278 [34:57:07<2:40:51, 4.15s/it] 93%|█████████▎| 31952/34278 [34:57:23<5:01:10, 7.77s/it] {'loss': 0.1219, 'grad_norm': 0.7825001685835803, 'learning_rate': 1.2026886145340088e-07, 'epoch': 0.93} 93%|█████████▎| 31952/34278 [34:57:23<5:01:10, 7.77s/it] 93%|█████████▎| 31953/34278 [34:57:26<4:05:10, 6.33s/it] {'loss': 0.1181, 'grad_norm': 1.058982674765109, 'learning_rate': 1.2016588721497247e-07, 'epoch': 0.93} 93%|█████████▎| 31953/34278 [34:57:26<4:05:10, 6.33s/it] 93%|█████████▎| 31954/34278 [34:57:38<5:16:56, 8.18s/it] {'loss': 0.11, 'grad_norm': 0.9374541064444769, 'learning_rate': 1.200629565425382e-07, 'epoch': 0.93} 93%|█████████▎| 31954/34278 [34:57:38<5:16:56, 8.18s/it] 93%|█████████▎| 31955/34278 [34:57:53<6:25:20, 9.95s/it] {'loss': 0.1447, 'grad_norm': 0.9975085468401156, 'learning_rate': 1.1996006943701676e-07, 'epoch': 0.93} 93%|█████████▎| 31955/34278 [34:57:53<6:25:20, 9.95s/it] 93%|█████████▎| 31956/34278 [34:57:57<5:21:44, 8.31s/it] {'loss': 0.1047, 'grad_norm': 0.8064543591715934, 'learning_rate': 1.1985722589932747e-07, 'epoch': 0.93} 93%|█████████▎| 31956/34278 [34:57:57<5:21:44, 8.31s/it] 93%|█████████▎| 31957/34278 [34:58:03<4:57:29, 7.69s/it] {'loss': 0.1044, 'grad_norm': 0.7057703760831878, 'learning_rate': 1.1975442593038788e-07, 'epoch': 0.93} 93%|█████████▎| 31957/34278 [34:58:03<4:57:29, 7.69s/it] 93%|█████████▎| 31958/34278 [34:58:06<4:02:19, 6.27s/it] {'loss': 0.0972, 'grad_norm': 0.6704747895430312, 'learning_rate': 1.1965166953111508e-07, 'epoch': 0.93} 93%|█████████▎| 31958/34278 [34:58:06<4:02:19, 6.27s/it] 93%|█████████▎| 31959/34278 [34:58:22<5:48:45, 9.02s/it] {'loss': 0.1125, 'grad_norm': 0.7856997979761003, 'learning_rate': 1.1954895670242717e-07, 'epoch': 0.93} 93%|█████████▎| 31959/34278 [34:58:22<5:48:45, 9.02s/it] 93%|█████████▎| 31960/34278 [34:58:35<6:34:05, 10.20s/it] {'loss': 0.1028, 'grad_norm': 0.8636025047379615, 'learning_rate': 1.194462874452418e-07, 'epoch': 0.93} 93%|█████████▎| 31960/34278 
[34:58:35<6:34:05, 10.20s/it] 93%|█████████▎| 31961/34278 [34:58:58<9:03:03, 14.06s/it] {'loss': 0.1224, 'grad_norm': 0.9302947967398764, 'learning_rate': 1.193436617604743e-07, 'epoch': 0.93} 93%|█████████▎| 31961/34278 [34:58:58<9:03:03, 14.06s/it] 93%|█████████▎| 31962/34278 [34:59:02<7:10:28, 11.15s/it] {'loss': 0.1388, 'grad_norm': 0.8765440605059343, 'learning_rate': 1.1924107964904175e-07, 'epoch': 0.93} 93%|█████████▎| 31962/34278 [34:59:02<7:10:28, 11.15s/it] 93%|█████████▎| 31963/34278 [34:59:05<5:38:45, 8.78s/it] {'loss': 0.133, 'grad_norm': 0.852805482050188, 'learning_rate': 1.1913854111186008e-07, 'epoch': 0.93} 93%|█████████▎| 31963/34278 [34:59:05<5:38:45, 8.78s/it] 93%|█████████▎| 31964/34278 [34:59:18<6:21:08, 9.88s/it] {'loss': 0.1119, 'grad_norm': 0.8567529364954466, 'learning_rate': 1.190360461498441e-07, 'epoch': 0.93} 93%|█████████▎| 31964/34278 [34:59:18<6:21:08, 9.88s/it] 93%|█████████▎| 31965/34278 [34:59:31<7:05:00, 11.02s/it] {'loss': 0.1172, 'grad_norm': 0.8774635317691548, 'learning_rate': 1.1893359476390809e-07, 'epoch': 0.93} 93%|█████████▎| 31965/34278 [34:59:31<7:05:00, 11.02s/it] 93%|█████████▎| 31966/34278 [34:59:35<5:33:40, 8.66s/it] {'loss': 0.1099, 'grad_norm': 0.7730345767223594, 'learning_rate': 1.1883118695496853e-07, 'epoch': 0.93} 93%|█████████▎| 31966/34278 [34:59:35<5:33:40, 8.66s/it] 93%|█████████▎| 31967/34278 [34:59:46<6:03:33, 9.44s/it] {'loss': 0.1122, 'grad_norm': 0.954948814954049, 'learning_rate': 1.1872882272393915e-07, 'epoch': 0.93} 93%|█████████▎| 31967/34278 [34:59:46<6:03:33, 9.44s/it] 93%|█████████▎| 31968/34278 [35:00:08<8:29:21, 13.23s/it] {'loss': 0.1199, 'grad_norm': 0.8080294698130455, 'learning_rate': 1.1862650207173365e-07, 'epoch': 0.93} 93%|█████████▎| 31968/34278 [35:00:08<8:29:21, 13.23s/it] 93%|█████████▎| 31969/34278 [35:00:19<8:10:01, 12.73s/it] {'loss': 0.0941, 'grad_norm': 1.2116619766730552, 'learning_rate': 1.1852422499926519e-07, 'epoch': 0.93} 93%|█████████▎| 31969/34278 
[35:00:19<8:10:01, 12.73s/it] 93%|█████████▎| 31970/34278 [35:00:31<7:51:37, 12.26s/it] {'loss': 0.1017, 'grad_norm': 0.8350327503632, 'learning_rate': 1.1842199150744749e-07, 'epoch': 0.93} 93%|█████████▎| 31970/34278 [35:00:31<7:51:37, 12.26s/it] 93%|█████████▎| 31971/34278 [35:00:45<8:10:16, 12.75s/it] {'loss': 0.0813, 'grad_norm': 0.8437080265607998, 'learning_rate': 1.1831980159719203e-07, 'epoch': 0.93} 93%|█████████▎| 31971/34278 [35:00:45<8:10:16, 12.75s/it] 93%|█████████▎| 31972/34278 [35:00:48<6:21:24, 9.92s/it] {'loss': 0.1091, 'grad_norm': 0.8397235501594001, 'learning_rate': 1.1821765526941254e-07, 'epoch': 0.93} 93%|█████████▎| 31972/34278 [35:00:48<6:21:24, 9.92s/it] 93%|█████████▎| 31973/34278 [35:00:51<4:59:22, 7.79s/it] {'loss': 0.114, 'grad_norm': 0.7604132266193332, 'learning_rate': 1.1811555252502105e-07, 'epoch': 0.93} 93%|█████████▎| 31973/34278 [35:00:51<4:59:22, 7.79s/it] 93%|█████████▎| 31974/34278 [35:01:17<8:38:01, 13.49s/it] {'loss': 0.1102, 'grad_norm': 0.842547410919885, 'learning_rate': 1.1801349336492796e-07, 'epoch': 0.93} 93%|█████████▎| 31974/34278 [35:01:17<8:38:01, 13.49s/it] 93%|█████████▎| 31975/34278 [35:01:22<6:51:44, 10.73s/it] {'loss': 0.0986, 'grad_norm': 1.0731862014744726, 'learning_rate': 1.1791147779004474e-07, 'epoch': 0.93} 93%|█████████▎| 31975/34278 [35:01:22<6:51:44, 10.73s/it] 93%|█████████▎| 31976/34278 [35:01:37<7:38:44, 11.96s/it] {'loss': 0.1141, 'grad_norm': 0.696695142887009, 'learning_rate': 1.1780950580128292e-07, 'epoch': 0.93} 93%|█████████▎| 31976/34278 [35:01:37<7:38:44, 11.96s/it] 93%|█████████▎| 31977/34278 [35:01:40<5:59:56, 9.39s/it] {'loss': 0.1044, 'grad_norm': 0.8698705517419311, 'learning_rate': 1.1770757739955174e-07, 'epoch': 0.93} 93%|█████████▎| 31977/34278 [35:01:40<5:59:56, 9.39s/it] 93%|█████████▎| 31978/34278 [35:01:43<4:45:43, 7.45s/it] {'loss': 0.1014, 'grad_norm': 0.9605724397268341, 'learning_rate': 1.1760569258576215e-07, 'epoch': 0.93} 93%|█████████▎| 31978/34278 
[35:01:43<4:45:43, 7.45s/it] 93%|█████████▎| 31979/34278 [35:02:03<7:06:15, 11.12s/it] {'loss': 0.1069, 'grad_norm': 0.9189215535029513, 'learning_rate': 1.1750385136082343e-07, 'epoch': 0.93} 93%|█████████▎| 31979/34278 [35:02:03<7:06:15, 11.12s/it] 93%|█████████▎| 31980/34278 [35:02:26<9:23:10, 14.70s/it] {'loss': 0.1059, 'grad_norm': 0.8198878868640496, 'learning_rate': 1.1740205372564484e-07, 'epoch': 0.93} 93%|█████████▎| 31980/34278 [35:02:26<9:23:10, 14.70s/it] 93%|█████████▎| 31981/34278 [35:02:42<9:36:40, 15.06s/it] {'loss': 0.1285, 'grad_norm': 0.8415747852264213, 'learning_rate': 1.173002996811351e-07, 'epoch': 0.93} 93%|█████████▎| 31981/34278 [35:02:42<9:36:40, 15.06s/it] 93%|█████████▎| 31982/34278 [35:02:45<7:19:08, 11.48s/it] {'loss': 0.1131, 'grad_norm': 0.7685357106287352, 'learning_rate': 1.1719858922820293e-07, 'epoch': 0.93} 93%|█████████▎| 31982/34278 [35:02:45<7:19:08, 11.48s/it] 93%|█████████▎| 31983/34278 [35:03:07<9:26:36, 14.81s/it] {'loss': 0.1162, 'grad_norm': 0.830730054729226, 'learning_rate': 1.1709692236775538e-07, 'epoch': 0.93} 93%|█████████▎| 31983/34278 [35:03:07<9:26:36, 14.81s/it] 93%|█████████▎| 31984/34278 [35:03:26<10:06:57, 15.87s/it] {'loss': 0.1008, 'grad_norm': 0.7540497253570982, 'learning_rate': 1.1699529910070173e-07, 'epoch': 0.93} 93%|█████████▎| 31984/34278 [35:03:26<10:06:57, 15.87s/it] 93%|█████████▎| 31985/34278 [35:03:48<11:19:44, 17.79s/it] {'loss': 0.1048, 'grad_norm': 0.924277777353587, 'learning_rate': 1.1689371942794791e-07, 'epoch': 0.93} 93%|█████████▎| 31985/34278 [35:03:48<11:19:44, 17.79s/it] 93%|█████████▎| 31986/34278 [35:03:59<10:05:51, 15.86s/it] {'loss': 0.0971, 'grad_norm': 0.7920417138843862, 'learning_rate': 1.1679218335040155e-07, 'epoch': 0.93} 93%|█████████▎| 31986/34278 [35:03:59<10:05:51, 15.86s/it] 93%|█████████▎| 31987/34278 [35:04:17<10:23:57, 16.34s/it] {'loss': 0.1026, 'grad_norm': 0.7963482116421964, 'learning_rate': 1.1669069086896911e-07, 'epoch': 0.93} 93%|█████████▎| 
31987/34278 [35:04:17<10:23:57, 16.34s/it] 93%|█████████▎| 31988/34278 [35:04:20<7:51:09, 12.34s/it] {'loss': 0.0989, 'grad_norm': 0.807518314704998, 'learning_rate': 1.1658924198455546e-07, 'epoch': 0.93} 93%|█████████▎| 31988/34278 [35:04:20<7:51:09, 12.34s/it] 93%|█████████▎| 31989/34278 [35:04:23<6:03:25, 9.53s/it] {'loss': 0.0817, 'grad_norm': 0.9167264203577618, 'learning_rate': 1.1648783669806762e-07, 'epoch': 0.93} 93%|█████████▎| 31989/34278 [35:04:23<6:03:25, 9.53s/it] 93%|█████████▎| 31990/34278 [35:04:37<6:54:56, 10.88s/it] {'loss': 0.1103, 'grad_norm': 0.7444747618564228, 'learning_rate': 1.16386475010411e-07, 'epoch': 0.93} 93%|█████████▎| 31990/34278 [35:04:37<6:54:56, 10.88s/it] 93%|█████████▎| 31991/34278 [35:04:59<9:10:30, 14.44s/it] {'loss': 0.1158, 'grad_norm': 0.8915099216220743, 'learning_rate': 1.1628515692249042e-07, 'epoch': 0.93} 93%|█████████▎| 31991/34278 [35:04:59<9:10:30, 14.44s/it] 93%|█████████▎| 31992/34278 [35:05:02<6:59:57, 11.02s/it] {'loss': 0.1202, 'grad_norm': 0.8595692223499609, 'learning_rate': 1.1618388243520906e-07, 'epoch': 0.93} 93%|█████████▎| 31992/34278 [35:05:03<6:59:57, 11.02s/it] 93%|█████████▎| 31993/34278 [35:05:18<7:49:31, 12.33s/it] {'loss': 0.1016, 'grad_norm': 0.7402023856329026, 'learning_rate': 1.1608265154947285e-07, 'epoch': 0.93} 93%|█████████▎| 31993/34278 [35:05:18<7:49:31, 12.33s/it] 93%|█████████▎| 31994/34278 [35:05:29<7:41:08, 12.11s/it] {'loss': 0.1211, 'grad_norm': 0.8873259396295621, 'learning_rate': 1.1598146426618495e-07, 'epoch': 0.93} 93%|█████████▎| 31994/34278 [35:05:30<7:41:08, 12.11s/it] 93%|█████████▎| 31995/34278 [35:05:41<7:31:19, 11.86s/it] {'loss': 0.1532, 'grad_norm': 0.9479931842533931, 'learning_rate': 1.1588032058624798e-07, 'epoch': 0.93} 93%|█████████▎| 31995/34278 [35:05:41<7:31:19, 11.86s/it] 93%|█████████▎| 31996/34278 [35:06:01<9:10:30, 14.47s/it] {'loss': 0.1103, 'grad_norm': 0.898471466202595, 'learning_rate': 1.1577922051056622e-07, 'epoch': 0.93} 93%|█████████▎| 
31996/34278 [35:06:01<9:10:30, 14.47s/it] 93%|█████████▎| 31997/34278 [35:06:15<9:02:15, 14.26s/it] {'loss': 0.1149, 'grad_norm': 0.7137174191426734, 'learning_rate': 1.1567816404004173e-07, 'epoch': 0.93} 93%|█████████▎| 31997/34278 [35:06:15<9:02:15, 14.26s/it]
93%|█████████▎| 31998/34278 [35:06:27<8:31:30, 13.46s/it] {'loss': 0.1083, 'grad_norm': 1.0439192802552064, 'learning_rate': 1.15577151175576e-07, 'epoch': 0.93} 93%|█████████▎| 31998/34278 [35:06:27<8:31:30, 13.46s/it]
93%|█████████▎| 31999/34278 [35:06:40<8:34:52, 13.56s/it] {'loss': 0.0964, 'grad_norm': 0.9115704089747924, 'learning_rate': 1.1547618191807164e-07, 'epoch': 0.93} 93%|█████████▎| 31999/34278 [35:06:40<8:34:52, 13.56s/it]
93%|█████████▎| 32000/34278 [35:06:54<8:33:33, 13.53s/it] {'loss': 0.1159, 'grad_norm': 0.9143799232175939, 'learning_rate': 1.1537525626843016e-07, 'epoch': 0.93} 93%|█████████▎| 32000/34278 [35:06:54<8:33:33, 13.53s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
(the three lines above were printed, interleaved, once per rank)
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
(the three lines above were printed, interleaved, once per rank; one rank instead reported "Set eos token id back to [151658]", "Set eos token back to <|diff_marker|>", "Set generation config eos token id back to [151658]")
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None warnings.warn( 93%|█████████▎| 32001/34278 [35:08:07<19:48:00, 31.30s/it] {'loss': 0.1197, 'grad_norm': 0.7971847368233306, 'learning_rate': 1.1527437422755194e-07, 'epoch': 0.93} 93%|█████████▎| 32001/34278 [35:08:07<19:48:00, 31.30s/it] 93%|█████████▎| 32002/34278 [35:08:24<17:11:42, 27.20s/it] {'loss': 0.1226, 'grad_norm': 0.8082456541975708, 'learning_rate': 1.1517353579633795e-07, 'epoch': 0.93} 93%|█████████▎| 32002/34278 [35:08:24<17:11:42, 27.20s/it] 93%|█████████▎| 32003/34278 [35:08:27<12:34:45, 19.91s/it] {'loss': 0.1239, 'grad_norm': 1.0664364553710668, 'learning_rate': 1.150727409756891e-07, 'epoch': 0.93} 93%|█████████▎| 32003/34278 [35:08:27<12:34:45, 19.91s/it] 93%|█████████▎| 32004/34278 [35:08:33<9:56:26, 15.74s/it] {'loss': 0.0996, 'grad_norm': 0.7235267375604488, 'learning_rate': 1.149719897665047e-07, 'epoch': 0.93} 93%|█████████▎| 32004/34278 [35:08:33<9:56:26, 15.74s/it] 93%|█████████▎| 32005/34278 [35:08:44<9:01:21, 14.29s/it] {'loss': 0.1029, 'grad_norm': 0.7508616798731628, 'learning_rate': 1.1487128216968346e-07, 'epoch': 0.93} 93%|█████████▎| 32005/34278 [35:08:44<9:01:21, 14.29s/it] 93%|█████████▎| 32006/34278 [35:08:48<7:02:08, 11.15s/it] {'loss': 0.1005, 'grad_norm': 0.8663010406096253, 'learning_rate': 1.1477061818612634e-07, 'epoch': 0.93} 93%|█████████▎| 32006/34278 [35:08:48<7:02:08, 11.15s/it] 93%|█████████▎| 32007/34278 [35:08:51<5:32:54, 8.80s/it] {'loss': 0.1071, 'grad_norm': 0.790146975826344, 'learning_rate': 1.1466999781672982e-07, 'epoch': 0.93} 93%|█████████▎| 32007/34278 [35:08:51<5:32:54, 8.80s/it] 93%|█████████▎| 32008/34278 [35:08:54<4:28:55, 7.11s/it] {'loss': 0.1154, 'grad_norm': 0.7646811019953647, 'learning_rate': 1.1456942106239377e-07, 'epoch': 0.93} 93%|█████████▎| 32008/34278 [35:08:54<4:28:55, 7.11s/it] 93%|█████████▎| 32009/34278 [35:08:57<3:41:07, 5.85s/it] {'loss': 0.1067, 'grad_norm': 0.8885198983739938, 'learning_rate': 1.1446888792401578e-07, 'epoch': 0.93} 93%|█████████▎| 
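The "Set eos token id to 151658 … Set eos token id back to 151645" messages a few lines up suggest a swap-and-restore pattern: the eos token is temporarily overridden (e.g. so generation stops at <|diff_marker|>, id 151658) and then the original <|im_end|> (id 151645) is restored. A minimal sketch of that pattern, using SimpleNamespace stand-ins rather than a real Qwen2-VL tokenizer/generation config; `temporary_eos` and both stand-in objects are hypothetical names, not taken from the training code:

```python
from contextlib import contextmanager
from types import SimpleNamespace

@contextmanager
def temporary_eos(tokenizer, generation_config, eos_id, eos_token):
    """Temporarily override the eos token, restoring the originals on exit,
    mirroring the 'Set eos token id to ... / back to ...' log messages."""
    saved_id, saved_token = tokenizer.eos_token_id, tokenizer.eos_token
    saved_gen_ids = list(generation_config.eos_token_id)
    tokenizer.eos_token_id = eos_id
    tokenizer.eos_token = eos_token
    generation_config.eos_token_id = [eos_id]
    print(f"Set eos token id to {eos_id}")
    print(f"Set eos token to {eos_token}")
    print(f"Set generation config eos token id to {[eos_id]}")
    try:
        yield tokenizer
    finally:
        # restore even if generation raised
        tokenizer.eos_token_id, tokenizer.eos_token = saved_id, saved_token
        generation_config.eos_token_id = saved_gen_ids
        print(f"Set eos token id back to {saved_id}")
        print(f"Set eos token back to {saved_token}")
        print(f"Set generation config eos token id back to {saved_gen_ids}")

# stand-ins for the tokenizer and generation config (hypothetical)
tok = SimpleNamespace(eos_token_id=151645, eos_token="<|im_end|>")
gen = SimpleNamespace(eos_token_id=[151645, 151643])
with temporary_eos(tok, gen, 151658, "<|diff_marker|>"):
    assert tok.eos_token_id == 151658  # swapped inside the block
assert tok.eos_token_id == 151645 and gen.eos_token_id == [151645, 151643]
```

Wrapping the swap in a context manager guarantees the restore runs even when the guarded generation step fails.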
32009/34278 [35:08:57<3:41:07, 5.85s/it] 93%|█████████▎| 32010/34278 [35:09:03<3:41:55, 5.87s/it] {'loss': 0.1062, 'grad_norm': 0.8334308513836821, 'learning_rate': 1.1436839840249347e-07, 'epoch': 0.93} 93%|█████████▎| 32010/34278 [35:09:03<3:41:55, 5.87s/it] 93%|█████████▎| 32011/34278 [35:09:08<3:29:09, 5.54s/it] {'loss': 0.1165, 'grad_norm': 0.7974042968390833, 'learning_rate': 1.1426795249872335e-07, 'epoch': 0.93} 93%|█████████▎| 32011/34278 [35:09:08<3:29:09, 5.54s/it] 93%|█████████▎| 32012/34278 [35:09:11<3:01:52, 4.82s/it] {'loss': 0.1104, 'grad_norm': 0.844353420312567, 'learning_rate': 1.1416755021360304e-07, 'epoch': 0.93} 93%|█████████▎| 32012/34278 [35:09:11<3:01:52, 4.82s/it] 93%|█████████▎| 32013/34278 [35:09:14<2:43:03, 4.32s/it] {'loss': 0.1339, 'grad_norm': 1.0480805887133557, 'learning_rate': 1.1406719154802848e-07, 'epoch': 0.93} 93%|█████████▎| 32013/34278 [35:09:14<2:43:03, 4.32s/it] 93%|█████████▎| 32014/34278 [35:09:31<4:57:22, 7.88s/it] {'loss': 0.1067, 'grad_norm': 0.9778261300554228, 'learning_rate': 1.1396687650289561e-07, 'epoch': 0.93} 93%|█████████▎| 32014/34278 [35:09:31<4:57:22, 7.88s/it] 93%|█████████▎| 32015/34278 [35:09:34<4:07:11, 6.55s/it] {'loss': 0.1121, 'grad_norm': 0.9799128343500716, 'learning_rate': 1.1386660507909986e-07, 'epoch': 0.93} 93%|█████████▎| 32015/34278 [35:09:34<4:07:11, 6.55s/it] 93%|█████████▎| 32016/34278 [35:09:37<3:27:07, 5.49s/it] {'loss': 0.1157, 'grad_norm': 0.7926375659453858, 'learning_rate': 1.1376637727753658e-07, 'epoch': 0.93} 93%|█████████▎| 32016/34278 [35:09:37<3:27:07, 5.49s/it] 93%|█████████▎| 32017/34278 [35:09:43<3:35:15, 5.71s/it] {'loss': 0.1117, 'grad_norm': 0.9591995660342114, 'learning_rate': 1.136661930991012e-07, 'epoch': 0.93} 93%|█████████▎| 32017/34278 [35:09:43<3:35:15, 5.71s/it] 93%|█████████▎| 32018/34278 [35:09:54<4:36:27, 7.34s/it] {'loss': 0.1177, 'grad_norm': 0.8650595079341008, 'learning_rate': 1.135660525446869e-07, 'epoch': 0.93} 93%|█████████▎| 32018/34278 
[35:09:54<4:36:27, 7.34s/it] 93%|█████████▎| 32019/34278 [35:09:57<3:44:50, 5.97s/it] {'loss': 0.1335, 'grad_norm': 0.8103428031610451, 'learning_rate': 1.1346595561518848e-07, 'epoch': 0.93} 93%|█████████▎| 32019/34278 [35:09:57<3:44:50, 5.97s/it] 93%|█████████▎| 32020/34278 [35:10:00<3:12:36, 5.12s/it] {'loss': 0.1404, 'grad_norm': 1.015601796023541, 'learning_rate': 1.1336590231150024e-07, 'epoch': 0.93} 93%|█████████▎| 32020/34278 [35:10:00<3:12:36, 5.12s/it] 93%|█████████▎| 32021/34278 [35:10:04<3:01:31, 4.83s/it] {'loss': 0.1121, 'grad_norm': 1.02095304218318, 'learning_rate': 1.1326589263451427e-07, 'epoch': 0.93} 93%|█████████▎| 32021/34278 [35:10:04<3:01:31, 4.83s/it] 93%|█████████▎| 32022/34278 [35:10:09<2:58:39, 4.75s/it] {'loss': 0.0964, 'grad_norm': 0.7554192565240885, 'learning_rate': 1.1316592658512371e-07, 'epoch': 0.93} 93%|█████████▎| 32022/34278 [35:10:09<2:58:39, 4.75s/it] 93%|█████████▎| 32023/34278 [35:10:15<3:10:57, 5.08s/it] {'loss': 0.1118, 'grad_norm': 0.6385913709875719, 'learning_rate': 1.130660041642212e-07, 'epoch': 0.93} 93%|█████████▎| 32023/34278 [35:10:15<3:10:57, 5.08s/it] 93%|█████████▎| 32024/34278 [35:10:19<2:56:56, 4.71s/it] {'loss': 0.1145, 'grad_norm': 0.9340508884192655, 'learning_rate': 1.1296612537269935e-07, 'epoch': 0.93} 93%|█████████▎| 32024/34278 [35:10:19<2:56:56, 4.71s/it] 93%|█████████▎| 32025/34278 [35:10:25<3:12:00, 5.11s/it] {'loss': 0.112, 'grad_norm': 0.8276418139511901, 'learning_rate': 1.1286629021144802e-07, 'epoch': 0.93} 93%|█████████▎| 32025/34278 [35:10:25<3:12:00, 5.11s/it] 93%|█████████▎| 32026/34278 [35:10:30<3:14:02, 5.17s/it] {'loss': 0.1191, 'grad_norm': 0.8096684738787501, 'learning_rate': 1.1276649868136091e-07, 'epoch': 0.93} 93%|█████████▎| 32026/34278 [35:10:30<3:14:02, 5.17s/it] 93%|█████████▎| 32027/34278 [35:10:34<2:55:18, 4.67s/it] {'loss': 0.1004, 'grad_norm': 0.9568940920296325, 'learning_rate': 1.1266675078332734e-07, 'epoch': 0.93} 93%|█████████▎| 32027/34278 [35:10:34<2:55:18, 
4.67s/it] 93%|█████████▎| 32028/34278 [35:10:37<2:36:26, 4.17s/it] {'loss': 0.1171, 'grad_norm': 0.7776215396400586, 'learning_rate': 1.1256704651823825e-07, 'epoch': 0.93} 93%|█████████▎| 32028/34278 [35:10:37<2:36:26, 4.17s/it] 93%|█████████▎| 32029/34278 [35:10:43<2:57:55, 4.75s/it] {'loss': 0.0976, 'grad_norm': 0.7800938299421314, 'learning_rate': 1.1246738588698458e-07, 'epoch': 0.93} 93%|█████████▎| 32029/34278 [35:10:43<2:57:55, 4.75s/it] 93%|█████████▎| 32030/34278 [35:10:46<2:43:55, 4.38s/it] {'loss': 0.1123, 'grad_norm': 0.6713518889285239, 'learning_rate': 1.1236776889045508e-07, 'epoch': 0.93} 93%|█████████▎| 32030/34278 [35:10:46<2:43:55, 4.38s/it] 93%|█████████▎| 32031/34278 [35:10:50<2:42:28, 4.34s/it] {'loss': 0.1193, 'grad_norm': 1.3320616497280364, 'learning_rate': 1.1226819552953849e-07, 'epoch': 0.93} 93%|█████████▎| 32031/34278 [35:10:50<2:42:28, 4.34s/it] 93%|█████████▎| 32032/34278 [35:10:54<2:29:09, 3.98s/it] {'loss': 0.1149, 'grad_norm': 0.8440007410344869, 'learning_rate': 1.121686658051252e-07, 'epoch': 0.93} 93%|█████████▎| 32032/34278 [35:10:54<2:29:09, 3.98s/it] 93%|█████████▎| 32033/34278 [35:10:57<2:22:05, 3.80s/it] {'loss': 0.098, 'grad_norm': 0.7885423931377564, 'learning_rate': 1.1206917971810339e-07, 'epoch': 0.93} 93%|█████████▎| 32033/34278 [35:10:57<2:22:05, 3.80s/it] 93%|█████████▎| 32034/34278 [35:11:00<2:18:14, 3.70s/it] {'loss': 0.1248, 'grad_norm': 1.829464074456043, 'learning_rate': 1.1196973726936122e-07, 'epoch': 0.93} 93%|█████████▎| 32034/34278 [35:11:00<2:18:14, 3.70s/it] 93%|█████████▎| 32035/34278 [35:11:04<2:13:26, 3.57s/it] {'loss': 0.0902, 'grad_norm': 0.7603648793100546, 'learning_rate': 1.1187033845978635e-07, 'epoch': 0.93} 93%|█████████▎| 32035/34278 [35:11:04<2:13:26, 3.57s/it] 93%|█████████▎| 32036/34278 [35:11:09<2:30:17, 4.02s/it] {'loss': 0.1017, 'grad_norm': 0.8812911824695407, 'learning_rate': 1.1177098329026581e-07, 'epoch': 0.93} 93%|█████████▎| 32036/34278 [35:11:09<2:30:17, 4.02s/it] 
93%|█████████▎| 32037/34278 [35:11:12<2:23:28, 3.84s/it] {'loss': 0.1251, 'grad_norm': 0.8232236047285875, 'learning_rate': 1.1167167176168725e-07, 'epoch': 0.93} 93%|█████████▎| 32037/34278 [35:11:12<2:23:28, 3.84s/it] 93%|█████████▎| 32038/34278 [35:11:15<2:14:00, 3.59s/it] {'loss': 0.1219, 'grad_norm': 0.8740217291491303, 'learning_rate': 1.1157240387493662e-07, 'epoch': 0.93} 93%|█████████▎| 32038/34278 [35:11:15<2:14:00, 3.59s/it] 93%|█████████▎| 32039/34278 [35:11:20<2:22:41, 3.82s/it] {'loss': 0.1116, 'grad_norm': 0.8123527785586346, 'learning_rate': 1.1147317963090154e-07, 'epoch': 0.93} 93%|█████████▎| 32039/34278 [35:11:20<2:22:41, 3.82s/it] 93%|█████████▎| 32040/34278 [35:11:23<2:18:34, 3.72s/it] {'loss': 0.1074, 'grad_norm': 0.889147925885064, 'learning_rate': 1.113739990304663e-07, 'epoch': 0.93} 93%|█████████▎| 32040/34278 [35:11:23<2:18:34, 3.72s/it] 93%|█████████▎| 32041/34278 [35:11:26<2:13:33, 3.58s/it] {'loss': 0.1272, 'grad_norm': 0.796997302515763, 'learning_rate': 1.1127486207451687e-07, 'epoch': 0.93} 93%|█████████▎| 32041/34278 [35:11:26<2:13:33, 3.58s/it] 93%|█████████▎| 32042/34278 [35:11:29<2:04:48, 3.35s/it] {'loss': 0.1015, 'grad_norm': 0.866471962490944, 'learning_rate': 1.1117576876393921e-07, 'epoch': 0.93} 93%|█████████▎| 32042/34278 [35:11:29<2:04:48, 3.35s/it] 93%|█████████▎| 32043/34278 [35:11:33<2:06:45, 3.40s/it] {'loss': 0.0878, 'grad_norm': 0.7993024059883606, 'learning_rate': 1.1107671909961648e-07, 'epoch': 0.93} 93%|█████████▎| 32043/34278 [35:11:33<2:06:45, 3.40s/it] 93%|█████████▎| 32044/34278 [35:11:36<2:05:07, 3.36s/it] {'loss': 0.1101, 'grad_norm': 0.8144702879758206, 'learning_rate': 1.109777130824341e-07, 'epoch': 0.93} 93%|█████████▎| 32044/34278 [35:11:36<2:05:07, 3.36s/it] 93%|█████████▎| 32045/34278 [35:11:42<2:34:36, 4.15s/it] {'loss': 0.0909, 'grad_norm': 0.8887016433567662, 'learning_rate': 1.1087875071327525e-07, 'epoch': 0.93} 93%|█████████▎| 32045/34278 [35:11:42<2:34:36, 4.15s/it] 93%|█████████▎| 
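Metric lines like the ones above can be mined after the fact, e.g. to plot loss against step or sanity-check the learning-rate decay. A stdlib-only sketch; `parse_step` and its regex are illustrative, not part of the training code:

```python
import re

# Matches 'STEP/TOTAL [elapsed<eta, s/it] {metrics dict}' as emitted above.
LINE_RE = re.compile(
    r"(\d+)/\d+ \[[^\]]*\] \{'loss': ([\d.]+), 'grad_norm': ([\d.]+), "
    r"'learning_rate': ([\d.e-]+), 'epoch': ([\d.]+)\}"
)

def parse_step(line):
    """Return the step's metrics as a dict, or None if the line has none."""
    m = LINE_RE.search(line)
    if m is None:
        return None
    step, loss, grad_norm, lr, epoch = m.groups()
    return {"step": int(step), "loss": float(loss),
            "grad_norm": float(grad_norm), "learning_rate": float(lr),
            "epoch": float(epoch)}

rec = parse_step(
    "93%|X| 32036/34278 [35:11:09<2:30:17, 4.02s/it] "
    "{'loss': 0.1017, 'grad_norm': 0.8812911824695407, "
    "'learning_rate': 1.1177098329026581e-07, 'epoch': 0.93}"
)
assert rec["step"] == 32036 and rec["loss"] == 0.1017
```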
32046/34278 [35:11:45<2:27:50, 3.97s/it] {'loss': 0.1086, 'grad_norm': 0.974323575414125, 'learning_rate': 1.1077983199302422e-07, 'epoch': 0.93} 93%|█████████▎| 32046/34278 [35:11:45<2:27:50, 3.97s/it] 93%|█████████▎| 32047/34278 [35:11:49<2:27:53, 3.98s/it] {'loss': 0.1202, 'grad_norm': 0.700641465039684, 'learning_rate': 1.1068095692256364e-07, 'epoch': 0.93} 93%|█████████▎| 32047/34278 [35:11:49<2:27:53, 3.98s/it] 93%|█████████▎| 32048/34278 [35:11:52<2:16:46, 3.68s/it] {'loss': 0.1092, 'grad_norm': 0.8344842765064419, 'learning_rate': 1.1058212550277558e-07, 'epoch': 0.93} 93%|█████████▎| 32048/34278 [35:11:52<2:16:46, 3.68s/it] 93%|█████████▎| 32049/34278 [35:11:56<2:20:46, 3.79s/it] {'loss': 0.1098, 'grad_norm': 0.8276408686580056, 'learning_rate': 1.1048333773454378e-07, 'epoch': 0.93} 93%|█████████▎| 32049/34278 [35:11:56<2:20:46, 3.79s/it] 94%|█████████▎| 32050/34278 [35:12:02<2:45:50, 4.47s/it] {'loss': 0.0848, 'grad_norm': 0.6993079332475102, 'learning_rate': 1.103845936187492e-07, 'epoch': 0.94} 94%|█████████▎| 32050/34278 [35:12:02<2:45:50, 4.47s/it] 94%|█████████▎| 32051/34278 [35:12:09<3:04:05, 4.96s/it] {'loss': 0.1039, 'grad_norm': 0.779769502283324, 'learning_rate': 1.1028589315627448e-07, 'epoch': 0.94} 94%|█████████▎| 32051/34278 [35:12:09<3:04:05, 4.96s/it] 94%|█████████▎| 32052/34278 [35:12:12<2:42:37, 4.38s/it] {'loss': 0.1305, 'grad_norm': 1.1898230035142983, 'learning_rate': 1.1018723634799888e-07, 'epoch': 0.94} 94%|█████████▎| 32052/34278 [35:12:12<2:42:37, 4.38s/it] 94%|█████████▎| 32053/34278 [35:12:14<2:25:06, 3.91s/it] {'loss': 0.1017, 'grad_norm': 0.77744475191222, 'learning_rate': 1.1008862319480562e-07, 'epoch': 0.94} 94%|█████████▎| 32053/34278 [35:12:14<2:25:06, 3.91s/it] 94%|█████████▎| 32054/34278 [35:12:20<2:46:44, 4.50s/it] {'loss': 0.1094, 'grad_norm': 0.8275057891511748, 'learning_rate': 1.0999005369757287e-07, 'epoch': 0.94} 94%|█████████▎| 32054/34278 [35:12:20<2:46:44, 4.50s/it] 94%|█████████▎| 32055/34278 
[35:12:24<2:43:22, 4.41s/it] {'loss': 0.1077, 'grad_norm': 0.7830985358322079, 'learning_rate': 1.098915278571816e-07, 'epoch': 0.94} 94%|█████████▎| 32055/34278 [35:12:24<2:43:22, 4.41s/it] 94%|█████████▎| 32056/34278 [35:12:28<2:33:21, 4.14s/it] {'loss': 0.1104, 'grad_norm': 1.038007306517236, 'learning_rate': 1.0979304567451166e-07, 'epoch': 0.94} 94%|█████████▎| 32056/34278 [35:12:28<2:33:21, 4.14s/it] 94%|█████████▎| 32057/34278 [35:12:31<2:19:58, 3.78s/it] {'loss': 0.1038, 'grad_norm': 0.7998792968146583, 'learning_rate': 1.0969460715044234e-07, 'epoch': 0.94} 94%|█████████▎| 32057/34278 [35:12:31<2:19:58, 3.78s/it] 94%|█████████▎| 32058/34278 [35:12:34<2:12:21, 3.58s/it] {'loss': 0.0875, 'grad_norm': 0.7130960643140338, 'learning_rate': 1.0959621228585126e-07, 'epoch': 0.94} 94%|█████████▎| 32058/34278 [35:12:34<2:12:21, 3.58s/it] 94%|█████████▎| 32059/34278 [35:12:37<2:09:59, 3.51s/it] {'loss': 0.1205, 'grad_norm': 0.7991544545494476, 'learning_rate': 1.0949786108161885e-07, 'epoch': 0.94} 94%|█████████▎| 32059/34278 [35:12:37<2:09:59, 3.51s/it] 94%|█████████▎| 32060/34278 [35:12:40<2:03:34, 3.34s/it] {'loss': 0.1215, 'grad_norm': 0.9583770351838974, 'learning_rate': 1.0939955353862164e-07, 'epoch': 0.94} 94%|█████████▎| 32060/34278 [35:12:40<2:03:34, 3.34s/it] 94%|█████████▎| 32061/34278 [35:12:43<2:00:51, 3.27s/it] {'loss': 0.1091, 'grad_norm': 0.708747277978358, 'learning_rate': 1.0930128965773723e-07, 'epoch': 0.94} 94%|█████████▎| 32061/34278 [35:12:43<2:00:51, 3.27s/it] 94%|█████████▎| 32062/34278 [35:12:46<1:55:54, 3.14s/it] {'loss': 0.0796, 'grad_norm': 0.6758522069728627, 'learning_rate': 1.0920306943984383e-07, 'epoch': 0.94} 94%|█████████▎| 32062/34278 [35:12:46<1:55:54, 3.14s/it] 94%|█████████▎| 32063/34278 [35:12:49<1:54:11, 3.09s/it] {'loss': 0.0836, 'grad_norm': 0.7543564942859913, 'learning_rate': 1.0910489288581794e-07, 'epoch': 0.94} 94%|█████████▎| 32063/34278 [35:12:49<1:54:11, 3.09s/it] 94%|█████████▎| 32064/34278 [35:12:53<2:01:12, 
3.28s/it] {'loss': 0.096, 'grad_norm': 1.2929445917883506, 'learning_rate': 1.0900675999653609e-07, 'epoch': 0.94} 94%|█████████▎| 32064/34278 [35:12:53<2:01:12, 3.28s/it] 94%|█████████▎| 32065/34278 [35:12:59<2:28:49, 4.03s/it] {'loss': 0.0917, 'grad_norm': 0.8529796359826995, 'learning_rate': 1.0890867077287425e-07, 'epoch': 0.94} 94%|█████████▎| 32065/34278 [35:12:59<2:28:49, 4.03s/it] 94%|█████████▎| 32066/34278 [35:13:02<2:18:37, 3.76s/it] {'loss': 0.1283, 'grad_norm': 0.7370111459922979, 'learning_rate': 1.088106252157084e-07, 'epoch': 0.94} 94%|█████████▎| 32066/34278 [35:13:02<2:18:37, 3.76s/it] 94%|█████████▎| 32067/34278 [35:13:06<2:18:40, 3.76s/it] {'loss': 0.1104, 'grad_norm': 0.9501203927021816, 'learning_rate': 1.0871262332591281e-07, 'epoch': 0.94} 94%|█████████▎| 32067/34278 [35:13:06<2:18:40, 3.76s/it] 94%|█████████▎| 32068/34278 [35:13:12<2:45:36, 4.50s/it] {'loss': 0.1006, 'grad_norm': 0.8333221115080217, 'learning_rate': 1.0861466510436347e-07, 'epoch': 0.94} 94%|█████████▎| 32068/34278 [35:13:12<2:45:36, 4.50s/it] 94%|█████████▎| 32069/34278 [35:13:15<2:28:37, 4.04s/it] {'loss': 0.1054, 'grad_norm': 0.7896071035671067, 'learning_rate': 1.0851675055193579e-07, 'epoch': 0.94} 94%|█████████▎| 32069/34278 [35:13:15<2:28:37, 4.04s/it] 94%|█████████▎| 32070/34278 [35:13:18<2:17:43, 3.74s/it] {'loss': 0.1, 'grad_norm': 0.8726117427575014, 'learning_rate': 1.0841887966950237e-07, 'epoch': 0.94} 94%|█████████▎| 32070/34278 [35:13:18<2:17:43, 3.74s/it] 94%|█████████▎| 32071/34278 [35:13:22<2:18:44, 3.77s/it] {'loss': 0.1025, 'grad_norm': 0.8472957512982819, 'learning_rate': 1.08321052457937e-07, 'epoch': 0.94} 94%|█████████▎| 32071/34278 [35:13:22<2:18:44, 3.77s/it] 94%|█████████▎| 32072/34278 [35:13:28<2:46:20, 4.52s/it] {'loss': 0.1128, 'grad_norm': 0.9050049817836707, 'learning_rate': 1.0822326891811396e-07, 'epoch': 0.94} 94%|█████████▎| 32072/34278 [35:13:28<2:46:20, 4.52s/it] 94%|█████████▎| 32073/34278 [35:13:31<2:31:05, 4.11s/it] {'loss': 0.142, 
'grad_norm': 0.7629335989022237, 'learning_rate': 1.0812552905090534e-07, 'epoch': 0.94} 94%|█████████▎| 32073/34278 [35:13:31<2:31:05, 4.11s/it] 94%|█████████▎| 32074/34278 [35:13:34<2:21:39, 3.86s/it] {'loss': 0.0998, 'grad_norm': 0.7569119322303541, 'learning_rate': 1.0802783285718488e-07, 'epoch': 0.94} 94%|█████████▎| 32074/34278 [35:13:34<2:21:39, 3.86s/it] 94%|█████████▎| 32075/34278 [35:13:39<2:28:52, 4.05s/it] {'loss': 0.107, 'grad_norm': 0.8736446102508948, 'learning_rate': 1.0793018033782355e-07, 'epoch': 0.94} 94%|█████████▎| 32075/34278 [35:13:39<2:28:52, 4.05s/it] 94%|█████████▎| 32076/34278 [35:13:42<2:18:40, 3.78s/it] {'loss': 0.1203, 'grad_norm': 0.9946571567558669, 'learning_rate': 1.0783257149369453e-07, 'epoch': 0.94} 94%|█████████▎| 32076/34278 [35:13:42<2:18:40, 3.78s/it] 94%|█████████▎| 32077/34278 [35:13:45<2:12:49, 3.62s/it] {'loss': 0.1192, 'grad_norm': 1.1956383419050371, 'learning_rate': 1.0773500632566769e-07, 'epoch': 0.94} 94%|█████████▎| 32077/34278 [35:13:45<2:12:49, 3.62s/it] 94%|█████████▎| 32078/34278 [35:13:48<2:03:29, 3.37s/it] {'loss': 0.1095, 'grad_norm': 0.8033344768674102, 'learning_rate': 1.0763748483461511e-07, 'epoch': 0.94} 94%|█████████▎| 32078/34278 [35:13:48<2:03:29, 3.37s/it] 94%|█████████▎| 32079/34278 [35:13:52<2:05:16, 3.42s/it] {'loss': 0.1083, 'grad_norm': 0.8577610732697442, 'learning_rate': 1.0754000702140666e-07, 'epoch': 0.94} 94%|█████████▎| 32079/34278 [35:13:52<2:05:16, 3.42s/it] 94%|█████████▎| 32080/34278 [35:13:55<2:00:42, 3.29s/it] {'loss': 0.1301, 'grad_norm': 1.2640094165415217, 'learning_rate': 1.0744257288691384e-07, 'epoch': 0.94} 94%|█████████▎| 32080/34278 [35:13:55<2:00:42, 3.29s/it] 94%|█████████▎| 32081/34278 [35:13:59<2:07:24, 3.48s/it] {'loss': 0.1192, 'grad_norm': 0.88340311691982, 'learning_rate': 1.0734518243200598e-07, 'epoch': 0.94} 94%|█████████▎| 32081/34278 [35:13:59<2:07:24, 3.48s/it] 94%|█████████▎| 32082/34278 [35:14:02<2:09:20, 3.53s/it] {'loss': 0.112, 'grad_norm': 
0.8064643768696518, 'learning_rate': 1.0724783565755126e-07, 'epoch': 0.94} 94%|█████████▎| 32082/34278 [35:14:02<2:09:20, 3.53s/it] 94%|█████████▎| 32083/34278 [35:14:06<2:10:39, 3.57s/it] {'loss': 0.1187, 'grad_norm': 0.8475180988623313, 'learning_rate': 1.071505325644212e-07, 'epoch': 0.94} 94%|█████████▎| 32083/34278 [35:14:06<2:10:39, 3.57s/it] 94%|█████████▎| 32084/34278 [35:14:12<2:38:11, 4.33s/it] {'loss': 0.1235, 'grad_norm': 0.9446665934044493, 'learning_rate': 1.0705327315348235e-07, 'epoch': 0.94} 94%|█████████▎| 32084/34278 [35:14:12<2:38:11, 4.33s/it] 94%|█████████▎| 32085/34278 [35:14:15<2:24:33, 3.95s/it] {'loss': 0.1149, 'grad_norm': 0.768170012115103, 'learning_rate': 1.0695605742560345e-07, 'epoch': 0.94} 94%|█████████▎| 32085/34278 [35:14:15<2:24:33, 3.95s/it] 94%|█████████▎| 32086/34278 [35:14:18<2:11:30, 3.60s/it] {'loss': 0.1085, 'grad_norm': 0.7996103309772392, 'learning_rate': 1.0685888538165323e-07, 'epoch': 0.94} 94%|█████████▎| 32086/34278 [35:14:18<2:11:30, 3.60s/it] 94%|█████████▎| 32087/34278 [35:14:21<2:09:06, 3.54s/it] {'loss': 0.1163, 'grad_norm': 0.8840239057508736, 'learning_rate': 1.0676175702249936e-07, 'epoch': 0.94} 94%|█████████▎| 32087/34278 [35:14:21<2:09:06, 3.54s/it] 94%|█████████▎| 32088/34278 [35:14:24<2:03:21, 3.38s/it] {'loss': 0.1052, 'grad_norm': 1.0532790900297067, 'learning_rate': 1.0666467234900779e-07, 'epoch': 0.94} 94%|█████████▎| 32088/34278 [35:14:24<2:03:21, 3.38s/it] 94%|█████████▎| 32089/34278 [35:14:28<2:03:28, 3.38s/it] {'loss': 0.1038, 'grad_norm': 0.8433485382052007, 'learning_rate': 1.0656763136204617e-07, 'epoch': 0.94} 94%|█████████▎| 32089/34278 [35:14:28<2:03:28, 3.38s/it] 94%|█████████▎| 32090/34278 [35:14:31<2:07:25, 3.49s/it] {'loss': 0.1131, 'grad_norm': 0.7560474299323872, 'learning_rate': 1.0647063406248048e-07, 'epoch': 0.94} 94%|█████████▎| 32090/34278 [35:14:31<2:07:25, 3.49s/it] 94%|█████████▎| 32091/34278 [35:14:36<2:14:57, 3.70s/it] {'loss': 0.1216, 'grad_norm': 0.7042252536882668, 
'learning_rate': 1.0637368045117669e-07, 'epoch': 0.94} 94%|█████████▎| 32091/34278 [35:14:36<2:14:57, 3.70s/it] 94%|█████████▎| 32092/34278 [35:14:39<2:09:19, 3.55s/it] {'loss': 0.1196, 'grad_norm': 0.9983765184405167, 'learning_rate': 1.062767705290002e-07, 'epoch': 0.94} 94%|█████████▎| 32092/34278 [35:14:39<2:09:19, 3.55s/it] 94%|█████████▎| 32093/34278 [35:14:42<2:07:21, 3.50s/it] {'loss': 0.1095, 'grad_norm': 0.8783148249243051, 'learning_rate': 1.0617990429681702e-07, 'epoch': 0.94} 94%|█████████▎| 32093/34278 [35:14:42<2:07:21, 3.50s/it] 94%|█████████▎| 32094/34278 [35:14:45<2:02:26, 3.36s/it] {'loss': 0.1227, 'grad_norm': 0.7706835504619733, 'learning_rate': 1.0608308175549142e-07, 'epoch': 0.94} 94%|█████████▎| 32094/34278 [35:14:45<2:02:26, 3.36s/it] 94%|█████████▎| 32095/34278 [35:14:49<2:03:24, 3.39s/it] {'loss': 0.1011, 'grad_norm': 0.8939887644792357, 'learning_rate': 1.0598630290588718e-07, 'epoch': 0.94} 94%|█████████▎| 32095/34278 [35:14:49<2:03:24, 3.39s/it] 94%|█████████▎| 32096/34278 [35:14:54<2:25:26, 4.00s/it] {'loss': 0.1088, 'grad_norm': 0.7309322511998092, 'learning_rate': 1.0588956774886971e-07, 'epoch': 0.94} 94%|█████████▎| 32096/34278 [35:14:54<2:25:26, 4.00s/it] 94%|█████████▎| 32097/34278 [35:15:00<2:50:09, 4.68s/it] {'loss': 0.1095, 'grad_norm': 0.6661812416048095, 'learning_rate': 1.057928762853011e-07, 'epoch': 0.94} 94%|█████████▎| 32097/34278 [35:15:00<2:50:09, 4.68s/it] 94%|█████████▎| 32098/34278 [35:15:04<2:41:27, 4.44s/it] {'loss': 0.109, 'grad_norm': 0.8456105650420225, 'learning_rate': 1.0569622851604567e-07, 'epoch': 0.94} 94%|█████████▎| 32098/34278 [35:15:04<2:41:27, 4.44s/it] 94%|█████████▎| 32099/34278 [35:15:07<2:25:31, 4.01s/it] {'loss': 0.1139, 'grad_norm': 0.86400129011861, 'learning_rate': 1.0559962444196603e-07, 'epoch': 0.94} 94%|█████████▎| 32099/34278 [35:15:07<2:25:31, 4.01s/it] 94%|█████████▎| 32100/34278 [35:15:12<2:38:12, 4.36s/it] {'loss': 0.0981, 'grad_norm': 1.114747802139998, 'learning_rate': 
1.0550306406392486e-07, 'epoch': 0.94} 94%|█████████▎| 32100/34278 [35:15:12<2:38:12, 4.36s/it] 94%|█████████▎| 32101/34278 [35:15:16<2:32:26, 4.20s/it] {'loss': 0.1225, 'grad_norm': 0.8562614775002866, 'learning_rate': 1.0540654738278366e-07, 'epoch': 0.94} 94%|█████████▎| 32101/34278 [35:15:16<2:32:26, 4.20s/it] 94%|█████████▎| 32102/34278 [35:15:20<2:22:43, 3.94s/it] {'loss': 0.0977, 'grad_norm': 0.8197540439884026, 'learning_rate': 1.0531007439940455e-07, 'epoch': 0.94} 94%|█████████▎| 32102/34278 [35:15:20<2:22:43, 3.94s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f433b75f1a0>
Failed to fetch sample 3646663.
Exception: cannot identify image file <_io.BytesIO object at 0x7f433b75f1a0>
94%|█████████▎| 32103/34278 [35:15:23<2:17:17, 3.79s/it] {'loss': 0.1151, 'grad_norm': 1.1298989345727763, 'learning_rate': 1.0521364511464794e-07, 'epoch': 0.94} 94%|█████████▎| 32103/34278 [35:15:23<2:17:17, 3.79s/it] 94%|█████████▎| 32104/34278 [35:15:29<2:41:30, 4.46s/it] {'loss': 0.1046, 'grad_norm': 0.833695601396288, 'learning_rate': 1.051172595293759e-07, 'epoch': 0.94} 94%|█████████▎| 32104/34278 [35:15:29<2:41:30, 4.46s/it] 94%|█████████▎| 32105/34278 [35:15:34<2:41:50, 4.47s/it] {'loss': 0.1126, 'grad_norm': 0.9868865547606661, 'learning_rate': 1.0502091764444833e-07, 'epoch': 0.94} 94%|█████████▎| 32105/34278 [35:15:34<2:41:50, 4.47s/it] 94%|█████████▎| 32106/34278 [35:15:38<2:41:15, 4.45s/it] {'loss': 0.1155, 'grad_norm': 0.8271959984514432, 'learning_rate': 1.0492461946072563e-07, 'epoch': 0.94} 94%|█████████▎| 32106/34278 [35:15:38<2:41:15, 4.45s/it] 94%|█████████▎| 32107/34278 [35:15:41<2:25:29, 4.02s/it] {'loss': 0.1013, 'grad_norm': 0.8410155090686664, 'learning_rate': 1.0482836497906768e-07, 'epoch': 0.94} 94%|█████████▎| 32107/34278 [35:15:41<2:25:29, 4.02s/it] 94%|█████████▎| 32108/34278 [35:15:44<2:15:09, 3.74s/it] {'loss': 0.123, 'grad_norm': 0.7599988649280618, 'learning_rate': 1.0473215420033322e-07, 'epoch': 0.94} 94%|█████████▎| 32108/34278 [35:15:44<2:15:09, 3.74s/it] 94%|█████████▎| 32109/34278 [35:15:50<2:44:11, 4.54s/it] {'loss': 0.1349, 'grad_norm': 0.7464152785830234, 'learning_rate': 1.0463598712538104e-07, 'epoch': 0.94} 94%|█████████▎| 32109/34278 [35:15:50<2:44:11, 4.54s/it] 94%|█████████▎| 32110/34278 [35:15:56<2:58:13, 4.93s/it] {'loss': 0.0916, 'grad_norm': 0.9067006560858255, 'learning_rate': 1.0453986375507097e-07, 'epoch': 0.94} 94%|█████████▎| 32110/34278 [35:15:56<2:58:13, 4.93s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
94%|█████████▎| 32111/34278 [35:15:59<2:35:31, 4.31s/it] {'loss': 0.1003, 'grad_norm': 0.7312901611799437, 'learning_rate': 1.0444378409026012e-07, 'epoch': 0.94} 94%|█████████▎| 32111/34278 [35:15:59<2:35:31, 4.31s/it] 94%|█████████▎| 32112/34278 [35:16:02<2:19:02, 3.85s/it] {'loss': 0.1263, 'grad_norm': 0.9593404712318456, 'learning_rate': 1.0434774813180615e-07, 'epoch': 0.94} 94%|█████████▎| 32112/34278 [35:16:02<2:19:02, 3.85s/it] 94%|█████████▎| 32113/34278 [35:16:05<2:10:56, 3.63s/it] {'loss': 0.1046, 'grad_norm': 0.9865768206964413, 'learning_rate': 1.0425175588056724e-07, 'epoch': 0.94} 94%|█████████▎| 32113/34278 [35:16:05<2:10:56, 3.63s/it] 94%|█████████▎| 32114/34278 [35:16:08<2:04:16, 3.45s/it] {'loss': 0.0898, 'grad_norm': 0.9644272670639106, 'learning_rate': 1.0415580733739994e-07, 'epoch': 0.94} 94%|█████████▎| 32114/34278 [35:16:08<2:04:16, 3.45s/it] 94%|█████████▎| 32115/34278 [35:16:11<1:58:37, 3.29s/it] {'loss': 0.0851, 'grad_norm': 0.7781047636323862, 'learning_rate': 1.0405990250315967e-07, 'epoch': 0.94} 94%|█████████▎| 32115/34278 [35:16:11<1:58:37, 3.29s/it] 94%|█████████▎| 32116/34278 [35:16:14<1:55:35, 3.21s/it] {'loss': 0.1071, 'grad_norm': 0.9933907973999181, 'learning_rate': 1.039640413787052e-07, 'epoch': 0.94} 94%|█████████▎| 32116/34278 [35:16:14<1:55:35, 3.21s/it] 94%|█████████▎| 32117/34278 [35:16:19<2:10:29, 3.62s/it] {'loss': 0.1204, 'grad_norm': 0.8128984664761882, 'learning_rate': 1.0386822396489027e-07, 'epoch': 0.94} 94%|█████████▎| 32117/34278 [35:16:19<2:10:29, 3.62s/it] 94%|█████████▎| 32118/34278 [35:16:24<2:30:47, 4.19s/it] {'loss': 0.1088, 'grad_norm': 0.8148280473503925, 'learning_rate': 1.0377245026257143e-07, 'epoch': 0.94} 94%|█████████▎| 32118/34278 [35:16:24<2:30:47, 4.19s/it] 94%|█████████▎| 32119/34278 [35:16:30<2:51:43, 4.77s/it] {'loss': 0.1235, 'grad_norm': 0.6600777179891715, 'learning_rate': 1.0367672027260356e-07, 'epoch': 0.94} 94%|█████████▎| 
32119/34278 [35:16:30<2:51:43, 4.77s/it] 94%|█████████▎| 32120/34278 [35:16:33<2:30:50, 4.19s/it] {'loss': 0.0957, 'grad_norm': 0.7702166296209475, 'learning_rate': 1.0358103399584096e-07, 'epoch': 0.94} 94%|█████████▎| 32120/34278 [35:16:33<2:30:50, 4.19s/it] 94%|█████████▎| 32121/34278 [35:16:36<2:17:07, 3.81s/it] {'loss': 0.103, 'grad_norm': 0.8574890822227507, 'learning_rate': 1.0348539143313741e-07, 'epoch': 0.94} 94%|█████████▎| 32121/34278 [35:16:36<2:17:07, 3.81s/it] 94%|█████████▎| 32122/34278 [35:16:40<2:21:17, 3.93s/it] {'loss': 0.102, 'grad_norm': 0.7744931859645473, 'learning_rate': 1.0338979258534776e-07, 'epoch': 0.94} 94%|█████████▎| 32122/34278 [35:16:40<2:21:17, 3.93s/it] 94%|█████████▎| 32123/34278 [35:16:43<2:11:05, 3.65s/it] {'loss': 0.107, 'grad_norm': 0.7518278067773781, 'learning_rate': 1.0329423745332523e-07, 'epoch': 0.94} 94%|█████████▎| 32123/34278 [35:16:43<2:11:05, 3.65s/it] 94%|█████████▎| 32124/34278 [35:16:46<2:01:48, 3.39s/it] {'loss': 0.1049, 'grad_norm': 1.0942535864638883, 'learning_rate': 1.0319872603792302e-07, 'epoch': 0.94} 94%|█████████▎| 32124/34278 [35:16:46<2:01:48, 3.39s/it] 94%|█████████▎| 32125/34278 [35:16:49<1:56:45, 3.25s/it] {'loss': 0.0933, 'grad_norm': 1.0858144600217252, 'learning_rate': 1.0310325833999269e-07, 'epoch': 0.94} 94%|█████████▎| 32125/34278 [35:16:49<1:56:45, 3.25s/it] 94%|█████████▎| 32126/34278 [35:16:53<2:04:36, 3.47s/it] {'loss': 0.125, 'grad_norm': 1.026925791389885, 'learning_rate': 1.0300783436038852e-07, 'epoch': 0.94} 94%|█████████▎| 32126/34278 [35:16:53<2:04:36, 3.47s/it] 94%|█████████▎| 32127/34278 [35:16:56<2:00:23, 3.36s/it] {'loss': 0.1069, 'grad_norm': 0.8653665951855168, 'learning_rate': 1.0291245409996097e-07, 'epoch': 0.94} 94%|█████████▎| 32127/34278 [35:16:56<2:00:23, 3.36s/it] 94%|█████████▎| 32128/34278 [35:16:59<1:58:38, 3.31s/it] {'loss': 0.0888, 'grad_norm': 1.0504917778941396, 'learning_rate': 1.0281711755956159e-07, 'epoch': 0.94} 94%|█████████▎| 32128/34278 
[35:16:59<1:58:38, 3.31s/it] 94%|█████████▎| 32129/34278 [35:17:04<2:16:08, 3.80s/it] {'loss': 0.1189, 'grad_norm': 0.9266732479399925, 'learning_rate': 1.0272182474004299e-07, 'epoch': 0.94} 94%|█████████▎| 32129/34278 [35:17:04<2:16:08, 3.80s/it] 94%|█████████▎| 32130/34278 [35:17:07<2:09:46, 3.62s/it] {'loss': 0.1184, 'grad_norm': 0.9230162594241422, 'learning_rate': 1.0262657564225397e-07, 'epoch': 0.94} 94%|█████████▎| 32130/34278 [35:17:07<2:09:46, 3.62s/it] 94%|█████████▎| 32131/34278 [35:17:11<2:05:52, 3.52s/it] {'loss': 0.1142, 'grad_norm': 0.8241398713510739, 'learning_rate': 1.0253137026704607e-07, 'epoch': 0.94} 94%|█████████▎| 32131/34278 [35:17:11<2:05:52, 3.52s/it] 94%|█████████▎| 32132/34278 [35:17:13<1:56:08, 3.25s/it] {'loss': 0.0991, 'grad_norm': 0.8606522985963572, 'learning_rate': 1.0243620861526915e-07, 'epoch': 0.94} 94%|█████████▎| 32132/34278 [35:17:13<1:56:08, 3.25s/it] 94%|█████████▎| 32133/34278 [35:17:19<2:18:04, 3.86s/it] {'loss': 0.1125, 'grad_norm': 0.8115170765044549, 'learning_rate': 1.0234109068777254e-07, 'epoch': 0.94} 94%|█████████▎| 32133/34278 [35:17:19<2:18:04, 3.86s/it] 94%|█████████▎| 32134/34278 [35:17:22<2:12:47, 3.72s/it] {'loss': 0.1186, 'grad_norm': 0.7330669259049344, 'learning_rate': 1.0224601648540555e-07, 'epoch': 0.94} 94%|█████████▎| 32134/34278 [35:17:22<2:12:47, 3.72s/it] 94%|█████████▎| 32135/34278 [35:17:25<2:04:53, 3.50s/it] {'loss': 0.1318, 'grad_norm': 0.9528809566799136, 'learning_rate': 1.021509860090164e-07, 'epoch': 0.94} 94%|█████████▎| 32135/34278 [35:17:25<2:04:53, 3.50s/it] 94%|█████████▍| 32136/34278 [35:17:28<2:02:12, 3.42s/it] {'loss': 0.1086, 'grad_norm': 1.2634693443948881, 'learning_rate': 1.0205599925945442e-07, 'epoch': 0.94} 94%|█████████▍| 32136/34278 [35:17:28<2:02:12, 3.42s/it] 94%|█████████▍| 32137/34278 [35:17:31<1:58:18, 3.32s/it] {'loss': 0.1107, 'grad_norm': 0.861775568004273, 'learning_rate': 1.0196105623756781e-07, 'epoch': 0.94} 94%|█████████▍| 32137/34278 [35:17:31<1:58:18, 
3.32s/it] 94%|█████████▍| 32138/34278 [35:17:35<1:58:03, 3.31s/it] {'loss': 0.1303, 'grad_norm': 0.8751065366212872, 'learning_rate': 1.0186615694420255e-07, 'epoch': 0.94} 94%|█████████▍| 32138/34278 [35:17:35<1:58:03, 3.31s/it] 94%|█████████▍| 32139/34278 [35:17:41<2:28:42, 4.17s/it] {'loss': 0.0952, 'grad_norm': 0.7100950551611143, 'learning_rate': 1.0177130138020741e-07, 'epoch': 0.94} 94%|█████████▍| 32139/34278 [35:17:41<2:28:42, 4.17s/it] 94%|█████████▍| 32140/34278 [35:17:44<2:22:37, 4.00s/it] {'loss': 0.127, 'grad_norm': 0.8449839806862404, 'learning_rate': 1.0167648954642895e-07, 'epoch': 0.94} 94%|█████████▍| 32140/34278 [35:17:44<2:22:37, 4.00s/it] 94%|█████████▍| 32141/34278 [35:17:49<2:26:36, 4.12s/it] {'loss': 0.1051, 'grad_norm': 0.6529840587223028, 'learning_rate': 1.0158172144371369e-07, 'epoch': 0.94} 94%|█████████▍| 32141/34278 [35:17:49<2:26:36, 4.12s/it] 94%|█████████▍| 32142/34278 [35:17:55<2:50:12, 4.78s/it] {'loss': 0.1235, 'grad_norm': 0.8914275159657254, 'learning_rate': 1.0148699707290711e-07, 'epoch': 0.94} 94%|█████████▍| 32142/34278 [35:17:55<2:50:12, 4.78s/it] 94%|█████████▍| 32143/34278 [35:18:01<3:01:25, 5.10s/it] {'loss': 0.1017, 'grad_norm': 0.7751532525943846, 'learning_rate': 1.013923164348557e-07, 'epoch': 0.94} 94%|█████████▍| 32143/34278 [35:18:01<3:01:25, 5.10s/it] 94%|█████████▍| 32144/34278 [35:18:04<2:41:21, 4.54s/it] {'loss': 0.1229, 'grad_norm': 0.9557705989379754, 'learning_rate': 1.0129767953040326e-07, 'epoch': 0.94} 94%|█████████▍| 32144/34278 [35:18:04<2:41:21, 4.54s/it] 94%|█████████▍| 32145/34278 [35:18:07<2:23:07, 4.03s/it] {'loss': 0.1123, 'grad_norm': 0.8410134147210956, 'learning_rate': 1.0120308636039632e-07, 'epoch': 0.94} 94%|█████████▍| 32145/34278 [35:18:07<2:23:07, 4.03s/it] 94%|█████████▍| 32146/34278 [35:18:10<2:09:14, 3.64s/it] {'loss': 0.1247, 'grad_norm': 1.1326334761987327, 'learning_rate': 1.0110853692567924e-07, 'epoch': 0.94} 94%|█████████▍| 32146/34278 [35:18:10<2:09:14, 3.64s/it] 
94%|█████████▍| 32147/34278 [35:18:13<2:06:26, 3.56s/it] {'loss': 0.1071, 'grad_norm': 0.9496660829235316, 'learning_rate': 1.0101403122709518e-07, 'epoch': 0.94} 94%|█████████▍| 32147/34278 [35:18:13<2:06:26, 3.56s/it] 94%|█████████▍| 32148/34278 [35:18:17<2:09:31, 3.65s/it] {'loss': 0.1181, 'grad_norm': 0.8437905836275701, 'learning_rate': 1.0091956926548852e-07, 'epoch': 0.94} 94%|█████████▍| 32148/34278 [35:18:17<2:09:31, 3.65s/it] 94%|█████████▍| 32149/34278 [35:18:20<2:04:28, 3.51s/it] {'loss': 0.12, 'grad_norm': 1.0453297402604065, 'learning_rate': 1.0082515104170243e-07, 'epoch': 0.94} 94%|█████████▍| 32149/34278 [35:18:20<2:04:28, 3.51s/it] 94%|█████████▍| 32150/34278 [35:18:26<2:26:27, 4.13s/it] {'loss': 0.1256, 'grad_norm': 0.8428288523939395, 'learning_rate': 1.0073077655657959e-07, 'epoch': 0.94} 94%|█████████▍| 32150/34278 [35:18:26<2:26:27, 4.13s/it] 94%|█████████▍| 32151/34278 [35:18:30<2:26:02, 4.12s/it] {'loss': 0.1148, 'grad_norm': 0.7548680701419213, 'learning_rate': 1.0063644581096322e-07, 'epoch': 0.94} 94%|█████████▍| 32151/34278 [35:18:30<2:26:02, 4.12s/it] 94%|█████████▍| 32152/34278 [35:18:36<2:48:09, 4.75s/it] {'loss': 0.1304, 'grad_norm': 0.9450970919153946, 'learning_rate': 1.0054215880569485e-07, 'epoch': 0.94} 94%|█████████▍| 32152/34278 [35:18:36<2:48:09, 4.75s/it] 94%|█████████▍| 32153/34278 [35:18:40<2:37:54, 4.46s/it] {'loss': 0.1171, 'grad_norm': 0.8915063403696859, 'learning_rate': 1.0044791554161659e-07, 'epoch': 0.94} 94%|█████████▍| 32153/34278 [35:18:40<2:37:54, 4.46s/it] 94%|█████████▍| 32154/34278 [35:18:46<2:52:52, 4.88s/it] {'loss': 0.0982, 'grad_norm': 0.8769882247763162, 'learning_rate': 1.0035371601957e-07, 'epoch': 0.94} 94%|█████████▍| 32154/34278 [35:18:46<2:52:52, 4.88s/it] 94%|█████████▍| 32155/34278 [35:18:49<2:38:09, 4.47s/it] {'loss': 0.1008, 'grad_norm': 0.8917244343084209, 'learning_rate': 1.0025956024039551e-07, 'epoch': 0.94} 94%|█████████▍| 32155/34278 [35:18:49<2:38:09, 4.47s/it] 94%|█████████▍| 
32156/34278 [35:18:52<2:20:06, 3.96s/it] {'loss': 0.1061, 'grad_norm': 0.8660089227440425, 'learning_rate': 1.0016544820493357e-07, 'epoch': 0.94} 94%|█████████▍| 32156/34278 [35:18:52<2:20:06, 3.96s/it] 94%|█████████▍| 32157/34278 [35:18:55<2:07:03, 3.59s/it] {'loss': 0.0905, 'grad_norm': 0.8980029735838663, 'learning_rate': 1.0007137991402572e-07, 'epoch': 0.94} 94%|█████████▍| 32157/34278 [35:18:55<2:07:03, 3.59s/it] 94%|█████████▍| 32158/34278 [35:18:58<2:02:34, 3.47s/it] {'loss': 0.1182, 'grad_norm': 0.7552683733797441, 'learning_rate': 9.997735536851017e-08, 'epoch': 0.94} 94%|█████████▍| 32158/34278 [35:18:58<2:02:34, 3.47s/it] 94%|█████████▍| 32159/34278 [35:19:01<2:02:02, 3.46s/it] {'loss': 0.1212, 'grad_norm': 0.9670731861815297, 'learning_rate': 9.988337456922737e-08, 'epoch': 0.94} 94%|█████████▍| 32159/34278 [35:19:01<2:02:02, 3.46s/it] 94%|█████████▍| 32160/34278 [35:19:05<2:04:05, 3.52s/it] {'loss': 0.1103, 'grad_norm': 0.7434491962365531, 'learning_rate': 9.978943751701609e-08, 'epoch': 0.94} 94%|█████████▍| 32160/34278 [35:19:05<2:04:05, 3.52s/it] 94%|█████████▍| 32161/34278 [35:19:08<2:00:43, 3.42s/it] {'loss': 0.0968, 'grad_norm': 0.7559641120074799, 'learning_rate': 9.969554421271455e-08, 'epoch': 0.94} 94%|█████████▍| 32161/34278 [35:19:08<2:00:43, 3.42s/it] 94%|█████████▍| 32162/34278 [35:19:14<2:24:10, 4.09s/it] {'loss': 0.1308, 'grad_norm': 0.8034284376231915, 'learning_rate': 9.960169465716152e-08, 'epoch': 0.94} 94%|█████████▍| 32162/34278 [35:19:14<2:24:10, 4.09s/it] 94%|█████████▍| 32163/34278 [35:19:17<2:12:54, 3.77s/it] {'loss': 0.1431, 'grad_norm': 0.757018958058187, 'learning_rate': 9.950788885119522e-08, 'epoch': 0.94} 94%|█████████▍| 32163/34278 [35:19:17<2:12:54, 3.77s/it] 94%|█████████▍| 32164/34278 [35:19:23<2:34:51, 4.40s/it] {'loss': 0.1374, 'grad_norm': 1.1414076423319532, 'learning_rate': 9.941412679565276e-08, 'epoch': 0.94} 94%|█████████▍| 32164/34278 [35:19:23<2:34:51, 4.40s/it] 94%|█████████▍| 32165/34278 
[35:19:26<2:24:09, 4.09s/it] {'loss': 0.0941, 'grad_norm': 1.070905765134784, 'learning_rate': 9.932040849137014e-08, 'epoch': 0.94} 94%|█████████▍| 32165/34278 [35:19:26<2:24:09, 4.09s/it] 94%|█████████▍| 32166/34278 [35:19:29<2:10:57, 3.72s/it] {'loss': 0.1241, 'grad_norm': 0.8973320288044775, 'learning_rate': 9.922673393918614e-08, 'epoch': 0.94} 94%|█████████▍| 32166/34278 [35:19:29<2:10:57, 3.72s/it] 94%|█████████▍| 32167/34278 [35:19:32<2:03:04, 3.50s/it] {'loss': 0.0778, 'grad_norm': 1.2789932825467665, 'learning_rate': 9.913310313993562e-08, 'epoch': 0.94} 94%|█████████▍| 32167/34278 [35:19:32<2:03:04, 3.50s/it] 94%|█████████▍| 32168/34278 [35:19:38<2:30:54, 4.29s/it] {'loss': 0.0932, 'grad_norm': 0.8551584840952743, 'learning_rate': 9.903951609445406e-08, 'epoch': 0.94} 94%|█████████▍| 32168/34278 [35:19:38<2:30:54, 4.29s/it] 94%|█████████▍| 32169/34278 [35:19:41<2:20:32, 4.00s/it] {'loss': 0.1129, 'grad_norm': 0.7892981867974093, 'learning_rate': 9.894597280357798e-08, 'epoch': 0.94} 94%|█████████▍| 32169/34278 [35:19:41<2:20:32, 4.00s/it] 94%|█████████▍| 32170/34278 [35:19:45<2:15:07, 3.85s/it] {'loss': 0.1077, 'grad_norm': 1.038712545214295, 'learning_rate': 9.885247326814285e-08, 'epoch': 0.94} 94%|█████████▍| 32170/34278 [35:19:45<2:15:07, 3.85s/it] 94%|█████████▍| 32171/34278 [35:19:50<2:29:42, 4.26s/it] {'loss': 0.0953, 'grad_norm': 1.0693996277997444, 'learning_rate': 9.875901748898298e-08, 'epoch': 0.94} 94%|█████████▍| 32171/34278 [35:19:50<2:29:42, 4.26s/it] 94%|█████████▍| 32172/34278 [35:19:54<2:22:11, 4.05s/it] {'loss': 0.1142, 'grad_norm': 0.8098148128622791, 'learning_rate': 9.86656054669316e-08, 'epoch': 0.94} 94%|█████████▍| 32172/34278 [35:19:54<2:22:11, 4.05s/it] 94%|█████████▍| 32173/34278 [35:19:57<2:12:37, 3.78s/it] {'loss': 0.1078, 'grad_norm': 0.7900266748948263, 'learning_rate': 9.857223720282472e-08, 'epoch': 0.94} 94%|█████████▍| 32173/34278 [35:19:57<2:12:37, 3.78s/it] 94%|█████████▍| 32174/34278 [35:20:00<2:03:31, 3.52s/it] 
{'loss': 0.1015, 'grad_norm': 0.9237403114889481, 'learning_rate': 9.847891269749388e-08, 'epoch': 0.94} 94%|█████████▍| 32174/34278 [35:20:00<2:03:31, 3.52s/it] 94%|█████████▍| 32175/34278 [35:20:03<1:59:02, 3.40s/it] {'loss': 0.1209, 'grad_norm': 0.8923289365142516, 'learning_rate': 9.838563195177342e-08, 'epoch': 0.94} 94%|█████████▍| 32175/34278 [35:20:03<1:59:02, 3.40s/it] 94%|█████████▍| 32176/34278 [35:20:06<1:56:58, 3.34s/it] {'loss': 0.1195, 'grad_norm': 0.7569550043956034, 'learning_rate': 9.829239496649656e-08, 'epoch': 0.94} 94%|█████████▍| 32176/34278 [35:20:06<1:56:58, 3.34s/it] 94%|█████████▍| 32177/34278 [35:20:09<1:58:30, 3.38s/it] {'loss': 0.1012, 'grad_norm': 0.6888915327303003, 'learning_rate': 9.819920174249486e-08, 'epoch': 0.94} 94%|█████████▍| 32177/34278 [35:20:09<1:58:30, 3.38s/it] 94%|█████████▍| 32178/34278 [35:20:14<2:05:51, 3.60s/it] {'loss': 0.0985, 'grad_norm': 0.7796935344380027, 'learning_rate': 9.810605228059988e-08, 'epoch': 0.94} 94%|█████████▍| 32178/34278 [35:20:14<2:05:51, 3.60s/it] 94%|█████████▍| 32179/34278 [35:20:18<2:11:58, 3.77s/it] {'loss': 0.1067, 'grad_norm': 0.7837266477355513, 'learning_rate': 9.801294658164484e-08, 'epoch': 0.94} 94%|█████████▍| 32179/34278 [35:20:18<2:11:58, 3.77s/it] 94%|█████████▍| 32180/34278 [35:20:24<2:34:43, 4.43s/it] {'loss': 0.096, 'grad_norm': 0.8458686648916788, 'learning_rate': 9.791988464645907e-08, 'epoch': 0.94} 94%|█████████▍| 32180/34278 [35:20:24<2:34:43, 4.43s/it] 94%|█████████▍| 32181/34278 [35:20:30<2:51:54, 4.92s/it] {'loss': 0.1069, 'grad_norm': 0.7529063868239642, 'learning_rate': 9.782686647587524e-08, 'epoch': 0.94} 94%|█████████▍| 32181/34278 [35:20:30<2:51:54, 4.92s/it] 94%|█████████▍| 32182/34278 [35:20:32<2:29:03, 4.27s/it] {'loss': 0.104, 'grad_norm': 0.8567471399250969, 'learning_rate': 9.773389207072214e-08, 'epoch': 0.94} 94%|█████████▍| 32182/34278 [35:20:32<2:29:03, 4.27s/it] 94%|█████████▍| 32183/34278 [35:20:36<2:19:50, 4.01s/it] {'loss': 0.1197, 'grad_norm': 
0.7800869455553334, 'learning_rate': 9.764096143183133e-08, 'epoch': 0.94} 94%|█████████▍| 32183/34278 [35:20:36<2:19:50, 4.01s/it] 94%|█████████▍| 32184/34278 [35:20:39<2:13:36, 3.83s/it] {'loss': 0.1264, 'grad_norm': 0.9397806084771332, 'learning_rate': 9.754807456003157e-08, 'epoch': 0.94} 94%|█████████▍| 32184/34278 [35:20:39<2:13:36, 3.83s/it] 94%|█████████▍| 32185/34278 [35:20:42<2:04:51, 3.58s/it] {'loss': 0.0863, 'grad_norm': 0.809745770145711, 'learning_rate': 9.745523145615166e-08, 'epoch': 0.94} 94%|█████████▍| 32185/34278 [35:20:42<2:04:51, 3.58s/it] 94%|█████████▍| 32186/34278 [35:20:48<2:29:57, 4.30s/it] {'loss': 0.1165, 'grad_norm': 0.9002286358801612, 'learning_rate': 9.736243212102147e-08, 'epoch': 0.94} 94%|█████████▍| 32186/34278 [35:20:48<2:29:57, 4.30s/it] 94%|█████████▍| 32187/34278 [35:20:51<2:15:17, 3.88s/it] {'loss': 0.122, 'grad_norm': 0.7805276518654509, 'learning_rate': 9.726967655546926e-08, 'epoch': 0.94} 94%|█████████▍| 32187/34278 [35:20:51<2:15:17, 3.88s/it] 94%|█████████▍| 32188/34278 [35:20:54<2:08:00, 3.67s/it] {'loss': 0.1003, 'grad_norm': 0.9600304009593331, 'learning_rate': 9.717696476032267e-08, 'epoch': 0.94} 94%|█████████▍| 32188/34278 [35:20:54<2:08:00, 3.67s/it] 94%|█████████▍| 32189/34278 [35:20:57<2:01:36, 3.49s/it] {'loss': 0.1065, 'grad_norm': 0.754286919698732, 'learning_rate': 9.708429673640995e-08, 'epoch': 0.94} 94%|█████████▍| 32189/34278 [35:20:57<2:01:36, 3.49s/it] 94%|█████████▍| 32190/34278 [35:21:02<2:16:34, 3.92s/it] {'loss': 0.1218, 'grad_norm': 0.8368083610034457, 'learning_rate': 9.699167248455876e-08, 'epoch': 0.94} 94%|█████████▍| 32190/34278 [35:21:02<2:16:34, 3.92s/it] 94%|█████████▍| 32191/34278 [35:21:06<2:09:32, 3.72s/it] {'loss': 0.1006, 'grad_norm': 0.7588346705004958, 'learning_rate': 9.689909200559455e-08, 'epoch': 0.94} 94%|█████████▍| 32191/34278 [35:21:06<2:09:32, 3.72s/it] 94%|█████████▍| 32192/34278 [35:21:10<2:14:00, 3.85s/it] {'loss': 0.1198, 'grad_norm': 0.8304193073111201, 
'learning_rate': 9.6806555300345e-08, 'epoch': 0.94} 94%|█████████▍| 32192/34278 [35:21:10<2:14:00, 3.85s/it] 94%|█████████▍| 32193/34278 [35:21:14<2:18:22, 3.98s/it] {'loss': 0.1089, 'grad_norm': 0.7431972213954714, 'learning_rate': 9.671406236963666e-08, 'epoch': 0.94} 94%|█████████▍| 32193/34278 [35:21:14<2:18:22, 3.98s/it] 94%|█████████▍| 32194/34278 [35:21:18<2:15:30, 3.90s/it] {'loss': 0.0997, 'grad_norm': 0.624978760386684, 'learning_rate': 9.662161321429441e-08, 'epoch': 0.94} 94%|█████████▍| 32194/34278 [35:21:18<2:15:30, 3.90s/it] 94%|█████████▍| 32195/34278 [35:21:21<2:03:58, 3.57s/it] {'loss': 0.0965, 'grad_norm': 0.6894520225294304, 'learning_rate': 9.652920783514319e-08, 'epoch': 0.94} 94%|█████████▍| 32195/34278 [35:21:21<2:03:58, 3.57s/it] 94%|█████████▍| 32196/34278 [35:21:24<2:03:48, 3.57s/it] {'loss': 0.084, 'grad_norm': 0.8174341948911126, 'learning_rate': 9.643684623300953e-08, 'epoch': 0.94} 94%|█████████▍| 32196/34278 [35:21:24<2:03:48, 3.57s/it] 94%|█████████▍| 32197/34278 [35:21:30<2:30:43, 4.35s/it] {'loss': 0.1178, 'grad_norm': 1.0582396097798812, 'learning_rate': 9.634452840871667e-08, 'epoch': 0.94} 94%|█████████▍| 32197/34278 [35:21:30<2:30:43, 4.35s/it] 94%|█████████▍| 32198/34278 [35:21:34<2:20:07, 4.04s/it] {'loss': 0.1038, 'grad_norm': 0.7454065307174922, 'learning_rate': 9.625225436308949e-08, 'epoch': 0.94} 94%|█████████▍| 32198/34278 [35:21:34<2:20:07, 4.04s/it] 94%|█████████▍| 32199/34278 [35:21:37<2:09:15, 3.73s/it] {'loss': 0.1187, 'grad_norm': 0.8494845543516857, 'learning_rate': 9.616002409695069e-08, 'epoch': 0.94} 94%|█████████▍| 32199/34278 [35:21:37<2:09:15, 3.73s/it] 94%|█████████▍| 32200/34278 [35:21:40<2:01:27, 3.51s/it] {'loss': 0.1044, 'grad_norm': 0.7343299135462298, 'learning_rate': 9.60678376111257e-08, 'epoch': 0.94} 94%|█████████▍| 32200/34278 [35:21:40<2:01:27, 3.51s/it] 94%|█████████▍| 32201/34278 [35:21:43<2:01:29, 3.51s/it] {'loss': 0.114, 'grad_norm': 0.8797948726980744, 'learning_rate': 
9.59756949064361e-08, 'epoch': 0.94} 94%|█████████▍| 32201/34278 [35:21:43<2:01:29, 3.51s/it] 94%|█████████▍| 32202/34278 [35:21:46<1:53:46, 3.29s/it] {'loss': 0.1128, 'grad_norm': 0.8113452424525245, 'learning_rate': 9.588359598370456e-08, 'epoch': 0.94} 94%|█████████▍| 32202/34278 [35:21:46<1:53:46, 3.29s/it] 94%|█████████▍| 32203/34278 [35:21:49<1:50:13, 3.19s/it] {'loss': 0.1058, 'grad_norm': 0.7252834619600642, 'learning_rate': 9.579154084375375e-08, 'epoch': 0.94} 94%|█████████▍| 32203/34278 [35:21:49<1:50:13, 3.19s/it] 94%|█████████▍| 32204/34278 [35:21:54<2:12:13, 3.83s/it] {'loss': 0.1201, 'grad_norm': 0.8894169244713941, 'learning_rate': 9.569952948740525e-08, 'epoch': 0.94} 94%|█████████▍| 32204/34278 [35:21:54<2:12:13, 3.83s/it] 94%|█████████▍| 32205/34278 [35:21:59<2:22:31, 4.13s/it] {'loss': 0.1009, 'grad_norm': 0.7713508233508065, 'learning_rate': 9.560756191548004e-08, 'epoch': 0.94} 94%|█████████▍| 32205/34278 [35:21:59<2:22:31, 4.13s/it] 94%|█████████▍| 32206/34278 [35:22:03<2:17:22, 3.98s/it] {'loss': 0.1411, 'grad_norm': 0.860856659261507, 'learning_rate': 9.551563812880083e-08, 'epoch': 0.94} 94%|█████████▍| 32206/34278 [35:22:03<2:17:22, 3.98s/it] 94%|█████████▍| 32207/34278 [35:22:06<2:07:27, 3.69s/it] {'loss': 0.0957, 'grad_norm': 0.8472523668044764, 'learning_rate': 9.542375812818694e-08, 'epoch': 0.94} 94%|█████████▍| 32207/34278 [35:22:06<2:07:27, 3.69s/it] 94%|█████████▍| 32208/34278 [35:22:09<2:05:35, 3.64s/it] {'loss': 0.1207, 'grad_norm': 0.8393016543633706, 'learning_rate': 9.533192191445828e-08, 'epoch': 0.94} 94%|█████████▍| 32208/34278 [35:22:09<2:05:35, 3.64s/it] 94%|█████████▍| 32209/34278 [35:22:12<1:56:45, 3.39s/it] {'loss': 0.1046, 'grad_norm': 0.8120017488468546, 'learning_rate': 9.524012948843586e-08, 'epoch': 0.94} 94%|█████████▍| 32209/34278 [35:22:12<1:56:45, 3.39s/it] 94%|█████████▍| 32210/34278 [35:22:15<1:54:41, 3.33s/it] {'loss': 0.1239, 'grad_norm': 0.8621657024627788, 'learning_rate': 9.514838085093847e-08, 
'epoch': 0.94} 94%|█████████▍| 32210/34278 [35:22:15<1:54:41, 3.33s/it] 94%|█████████▍| 32211/34278 [35:22:20<2:08:38, 3.73s/it] {'loss': 0.1414, 'grad_norm': 0.84849491023201, 'learning_rate': 9.5056676002786e-08, 'epoch': 0.94} 94%|█████████▍| 32211/34278 [35:22:20<2:08:38, 3.73s/it] 94%|█████████▍| 32212/34278 [35:22:23<2:06:18, 3.67s/it] {'loss': 0.1237, 'grad_norm': 0.6648422318792367, 'learning_rate': 9.496501494479615e-08, 'epoch': 0.94} 94%|█████████▍| 32212/34278 [35:22:23<2:06:18, 3.67s/it] 94%|█████████▍| 32213/34278 [35:22:26<1:58:58, 3.46s/it] {'loss': 0.1037, 'grad_norm': 0.6795376554667742, 'learning_rate': 9.48733976777877e-08, 'epoch': 0.94} 94%|█████████▍| 32213/34278 [35:22:26<1:58:58, 3.46s/it] 94%|█████████▍| 32214/34278 [35:22:29<1:53:09, 3.29s/it] {'loss': 0.1184, 'grad_norm': 0.7282466940752219, 'learning_rate': 9.478182420257887e-08, 'epoch': 0.94} 94%|█████████▍| 32214/34278 [35:22:29<1:53:09, 3.29s/it] 94%|█████████▍| 32215/34278 [35:22:32<1:51:51, 3.25s/it] {'loss': 0.0954, 'grad_norm': 0.7548401752090214, 'learning_rate': 9.46902945199868e-08, 'epoch': 0.94} 94%|█████████▍| 32215/34278 [35:22:32<1:51:51, 3.25s/it] 94%|█████████▍| 32216/34278 [35:22:36<1:50:49, 3.22s/it] {'loss': 0.117, 'grad_norm': 0.8676952289040676, 'learning_rate': 9.45988086308286e-08, 'epoch': 0.94} 94%|█████████▍| 32216/34278 [35:22:36<1:50:49, 3.22s/it] 94%|█████████▍| 32217/34278 [35:22:42<2:21:03, 4.11s/it] {'loss': 0.1184, 'grad_norm': 0.8790333954437566, 'learning_rate': 9.45073665359214e-08, 'epoch': 0.94} 94%|█████████▍| 32217/34278 [35:22:42<2:21:03, 4.11s/it] 94%|█████████▍| 32218/34278 [35:22:45<2:11:41, 3.84s/it] {'loss': 0.0971, 'grad_norm': 0.8883700766774195, 'learning_rate': 9.441596823608123e-08, 'epoch': 0.94} 94%|█████████▍| 32218/34278 [35:22:45<2:11:41, 3.84s/it] 94%|█████████▍| 32219/34278 [35:22:49<2:11:20, 3.83s/it] {'loss': 0.1061, 'grad_norm': 0.7900934281400136, 'learning_rate': 9.432461373212465e-08, 'epoch': 0.94} 94%|█████████▍| 
32219/34278 [35:22:49<2:11:20, 3.83s/it] 94%|█████████▍| 32220/34278 [35:22:52<2:04:52, 3.64s/it] {'loss': 0.1203, 'grad_norm': 0.9932949804585048, 'learning_rate': 9.423330302486655e-08, 'epoch': 0.94} 94%|█████████▍| 32220/34278 [35:22:52<2:04:52, 3.64s/it] 94%|█████████▍| 32221/34278 [35:22:55<1:58:41, 3.46s/it] {'loss': 0.0977, 'grad_norm': 0.8228353632921792, 'learning_rate': 9.41420361151224e-08, 'epoch': 0.94} 94%|█████████▍| 32221/34278 [35:22:55<1:58:41, 3.46s/it] 94%|█████████▍| 32222/34278 [35:22:58<1:55:24, 3.37s/it] {'loss': 0.1055, 'grad_norm': 0.7413417391788512, 'learning_rate': 9.405081300370712e-08, 'epoch': 0.94} 94%|█████████▍| 32222/34278 [35:22:58<1:55:24, 3.37s/it] 94%|█████████▍| 32223/34278 [35:23:03<2:10:33, 3.81s/it] {'loss': 0.1083, 'grad_norm': 0.9451930479248437, 'learning_rate': 9.395963369143501e-08, 'epoch': 0.94} 94%|█████████▍| 32223/34278 [35:23:03<2:10:33, 3.81s/it] 94%|█████████▍| 32224/34278 [35:23:06<2:04:36, 3.64s/it] {'loss': 0.0938, 'grad_norm': 0.8405664626413575, 'learning_rate': 9.386849817912047e-08, 'epoch': 0.94} 94%|█████████▍| 32224/34278 [35:23:06<2:04:36, 3.64s/it] 94%|█████████▍| 32225/34278 [35:23:09<1:56:56, 3.42s/it] {'loss': 0.1025, 'grad_norm': 1.0092823366652526, 'learning_rate': 9.377740646757616e-08, 'epoch': 0.94} 94%|█████████▍| 32225/34278 [35:23:09<1:56:56, 3.42s/it] 94%|█████████▍| 32226/34278 [35:23:15<2:18:32, 4.05s/it] {'loss': 0.1294, 'grad_norm': 0.7185649144012645, 'learning_rate': 9.368635855761642e-08, 'epoch': 0.94} 94%|█████████▍| 32226/34278 [35:23:15<2:18:32, 4.05s/it] 94%|█████████▍| 32227/34278 [35:23:21<2:41:32, 4.73s/it] {'loss': 0.1332, 'grad_norm': 0.6742166336971404, 'learning_rate': 9.35953544500534e-08, 'epoch': 0.94} 94%|█████████▍| 32227/34278 [35:23:21<2:41:32, 4.73s/it] 94%|█████████▍| 32228/34278 [35:23:25<2:39:30, 4.67s/it] {'loss': 0.1196, 'grad_norm': 0.9077843544231706, 'learning_rate': 9.350439414569978e-08, 'epoch': 0.94} 94%|█████████▍| 32228/34278 [35:23:25<2:39:30, 
4.67s/it] 94%|█████████▍| 32229/34278 [35:23:31<2:43:36, 4.79s/it] {'loss': 0.1123, 'grad_norm': 1.8356673550242073, 'learning_rate': 9.341347764536768e-08, 'epoch': 0.94} 94%|█████████▍| 32229/34278 [35:23:31<2:43:36, 4.79s/it]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
94%|█████████▍| 32230/34278 [35:23:34<2:27:49, 4.33s/it] {'loss': 0.1417, 'grad_norm': 0.8468451722893325, 'learning_rate': 9.332260494986866e-08, 'epoch': 0.94} 94%|█████████▍| 32230/34278 [35:23:34<2:27:49, 4.33s/it] 94%|█████████▍| 32231/34278 [35:23:39<2:33:34, 4.50s/it] {'loss': 0.1128, 'grad_norm': 1.4832384056344383, 'learning_rate': 9.323177606001433e-08, 'epoch': 0.94} 94%|█████████▍| 32231/34278 [35:23:39<2:33:34, 4.50s/it] 94%|█████████▍| 32232/34278 [35:23:43<2:30:15, 4.41s/it] {'loss': 0.1274, 'grad_norm': 0.8977563019204942, 'learning_rate': 9.314099097661511e-08, 'epoch': 0.94} 94%|█████████▍| 32232/34278 [35:23:43<2:30:15, 4.41s/it] 94%|█████████▍| 32233/34278 [35:23:46<2:21:50, 4.16s/it] {'loss': 0.1133, 'grad_norm': 0.7821348721161093, 'learning_rate': 9.30502497004826e-08, 'epoch': 0.94} 94%|█████████▍| 32233/34278 [35:23:46<2:21:50, 4.16s/it] 94%|█████████▍| 32234/34278 [35:23:50<2:16:11, 4.00s/it] {'loss': 0.1313, 'grad_norm': 1.015399189609931, 'learning_rate': 9.295955223242503e-08, 'epoch': 0.94} 94%|█████████▍| 32234/34278 [35:23:50<2:16:11, 4.00s/it] 94%|█████████▍| 32235/34278 [35:23:53<2:05:03, 3.67s/it] {'loss': 0.1284, 'grad_norm': 0.7625042097865166, 'learning_rate': 9.286889857325343e-08, 'epoch': 0.94} 94%|█████████▍| 32235/34278 [35:23:53<2:05:03, 3.67s/it] 94%|█████████▍| 32236/34278 [35:23:57<2:05:54, 3.70s/it] {'loss': 0.1001, 'grad_norm': 0.7857895787778961, 'learning_rate': 9.277828872377714e-08, 'epoch': 0.94} 94%|█████████▍| 32236/34278 [35:23:57<2:05:54, 3.70s/it] 94%|█████████▍| 
32237/34278 [35:24:00<2:00:43, 3.55s/it] {'loss': 0.1039, 'grad_norm': 0.8786706465429232, 'learning_rate': 9.268772268480498e-08, 'epoch': 0.94} 94%|█████████▍| 32237/34278 [35:24:00<2:00:43, 3.55s/it] 94%|█████████▍| 32238/34278 [35:24:03<1:55:12, 3.39s/it] {'loss': 0.0987, 'grad_norm': 0.8768587824741662, 'learning_rate': 9.25972004571446e-08, 'epoch': 0.94} 94%|█████████▍| 32238/34278 [35:24:03<1:55:12, 3.39s/it] 94%|█████████▍| 32239/34278 [35:24:08<2:12:56, 3.91s/it] {'loss': 0.1132, 'grad_norm': 0.8281244894290761, 'learning_rate': 9.250672204160538e-08, 'epoch': 0.94} 94%|█████████▍| 32239/34278 [35:24:08<2:12:56, 3.91s/it] 94%|█████████▍| 32240/34278 [35:24:13<2:22:49, 4.21s/it] {'loss': 0.1054, 'grad_norm': 0.8666201805007784, 'learning_rate': 9.241628743899445e-08, 'epoch': 0.94} 94%|█████████▍| 32240/34278 [35:24:13<2:22:49, 4.21s/it] 94%|█████████▍| 32241/34278 [35:24:16<2:10:44, 3.85s/it] {'loss': 0.1145, 'grad_norm': 0.9872921626328762, 'learning_rate': 9.232589665012004e-08, 'epoch': 0.94} 94%|█████████▍| 32241/34278 [35:24:16<2:10:44, 3.85s/it] 94%|█████████▍| 32242/34278 [35:24:19<2:00:01, 3.54s/it] {'loss': 0.1263, 'grad_norm': 0.9099156945262562, 'learning_rate': 9.223554967578763e-08, 'epoch': 0.94} 94%|█████████▍| 32242/34278 [35:24:19<2:00:01, 3.54s/it] 94%|█████████▍| 32243/34278 [35:24:25<2:24:47, 4.27s/it] {'loss': 0.1092, 'grad_norm': 0.7816016344764367, 'learning_rate': 9.214524651680545e-08, 'epoch': 0.94} 94%|█████████▍| 32243/34278 [35:24:25<2:24:47, 4.27s/it] 94%|█████████▍| 32244/34278 [35:24:28<2:11:30, 3.88s/it] {'loss': 0.1075, 'grad_norm': 0.750991763943585, 'learning_rate': 9.205498717397843e-08, 'epoch': 0.94} 94%|█████████▍| 32244/34278 [35:24:28<2:11:30, 3.88s/it] 94%|█████████▍| 32245/34278 [35:24:31<2:02:08, 3.60s/it] {'loss': 0.1177, 'grad_norm': 0.8534571402613954, 'learning_rate': 9.196477164811313e-08, 'epoch': 0.94} 94%|█████████▍| 32245/34278 [35:24:31<2:02:08, 3.60s/it] 94%|█████████▍| 32246/34278 [35:24:34<1:59:34, 
3.53s/it] {'loss': 0.1059, 'grad_norm': 0.9121166727582298, 'learning_rate': 9.18745999400139e-08, 'epoch': 0.94}
94%|█████████▍| 32247/34278 [35:24:40<2:20:24, 4.15s/it] {'loss': 0.1119, 'grad_norm': 0.8343823646972929, 'learning_rate': 9.178447205048735e-08, 'epoch': 0.94}
94%|█████████▍| 32248/34278 [35:24:43<2:14:36, 3.98s/it] {'loss': 0.1116, 'grad_norm': 0.8591376981119724, 'learning_rate': 9.169438798033725e-08, 'epoch': 0.94}
94%|█████████▍| 32249/34278 [35:24:46<2:05:38, 3.72s/it] {'loss': 0.1219, 'grad_norm': 0.8586963944195071, 'learning_rate': 9.160434773036797e-08, 'epoch': 0.94}
94%|█████████▍| 32250/34278 [35:24:51<2:14:37, 3.98s/it] {'loss': 0.0961, 'grad_norm': 0.720980567545608, 'learning_rate': 9.151435130138331e-08, 'epoch': 0.94}
94%|█████████▍| 32251/34278 [35:24:54<2:07:43, 3.78s/it] {'loss': 0.111, 'grad_norm': 0.6589360325697987, 'learning_rate': 9.142439869418651e-08, 'epoch': 0.94}
94%|█████████▍| 32252/34278 [35:24:58<2:02:38, 3.63s/it] {'loss': 0.1047, 'grad_norm': 0.953257786619246, 'learning_rate': 9.133448990958083e-08, 'epoch': 0.94}
94%|█████████▍| 32253/34278 [35:25:01<1:58:49, 3.52s/it] {'loss': 0.1357, 'grad_norm': 1.0535537006691844, 'learning_rate': 9.12446249483695e-08, 'epoch': 0.94}
94%|█████████▍| 32254/34278 [35:25:07<2:25:15, 4.31s/it] {'loss': 0.1141, 'grad_norm': 0.9181781532984107, 'learning_rate': 9.115480381135466e-08, 'epoch': 0.94}
94%|█████████▍| 32255/34278 [35:25:12<2:27:56, 4.39s/it] {'loss': 0.1197,
'grad_norm': 0.7004570130653078, 'learning_rate': 9.106502649933679e-08, 'epoch': 0.94}
94%|█████████▍| 32256/34278 [35:25:15<2:19:26, 4.14s/it] {'loss': 0.1161, 'grad_norm': 0.7811468517955279, 'learning_rate': 9.097529301311969e-08, 'epoch': 0.94}
94%|█████████▍| 32257/34278 [35:25:18<2:09:27, 3.84s/it] {'loss': 0.102, 'grad_norm': 0.9753875994681088, 'learning_rate': 9.088560335350272e-08, 'epoch': 0.94}
94%|█████████▍| 32258/34278 [35:25:23<2:22:40, 4.24s/it] {'loss': 0.12, 'grad_norm': 0.7541581719261251, 'learning_rate': 9.07959575212869e-08, 'epoch': 0.94}
94%|█████████▍| 32259/34278 [35:25:28<2:23:46, 4.27s/it] {'loss': 0.1118, 'grad_norm': 0.8599202291508046, 'learning_rate': 9.07063555172727e-08, 'epoch': 0.94}
94%|█████████▍| 32260/34278 [35:25:31<2:13:57, 3.98s/it] {'loss': 0.0981, 'grad_norm': 0.8262972867905722, 'learning_rate': 9.061679734226115e-08, 'epoch': 0.94}
94%|█████████▍| 32261/34278 [35:25:35<2:12:00, 3.93s/it] {'loss': 0.096, 'grad_norm': 0.8200818163966893, 'learning_rate': 9.05272829970505e-08, 'epoch': 0.94}
94%|█████████▍| 32262/34278 [35:25:41<2:31:20, 4.50s/it] {'loss': 0.108, 'grad_norm': 0.6948563973969044, 'learning_rate': 9.043781248244011e-08, 'epoch': 0.94}
94%|█████████▍| 32263/34278 [35:25:44<2:17:15, 4.09s/it] {'loss': 0.125, 'grad_norm': 0.9039716304296024, 'learning_rate': 9.034838579922878e-08, 'epoch': 0.94}
94%|█████████▍| 32264/34278 [35:25:47<2:09:13, 3.85s/it] {'loss': 0.1054, 'grad_norm': 0.948507507146162,
'learning_rate': 9.025900294821533e-08, 'epoch': 0.94}
94%|█████████▍| 32265/34278 [35:25:51<2:08:08, 3.82s/it] {'loss': 0.1169, 'grad_norm': 0.8100205279020445, 'learning_rate': 9.016966393019688e-08, 'epoch': 0.94}
94%|█████████▍| 32266/34278 [35:25:54<1:57:22, 3.50s/it] {'loss': 0.0986, 'grad_norm': 0.7338423233355329, 'learning_rate': 9.008036874597226e-08, 'epoch': 0.94}
94%|█████████▍| 32267/34278 [35:25:57<1:51:33, 3.33s/it] {'loss': 0.1079, 'grad_norm': 1.026312423149906, 'learning_rate': 8.999111739633803e-08, 'epoch': 0.94}
94%|█████████▍| 32268/34278 [35:26:01<1:57:41, 3.51s/it] {'loss': 0.1179, 'grad_norm': 0.9176621989537525, 'learning_rate': 8.990190988209025e-08, 'epoch': 0.94}
94%|█████████▍| 32269/34278 [35:26:06<2:22:12, 4.25s/it] {'loss': 0.1076, 'grad_norm': 0.7518978043031617, 'learning_rate': 8.981274620402713e-08, 'epoch': 0.94}
94%|█████████▍| 32270/34278 [35:26:12<2:37:04, 4.69s/it] {'loss': 0.1037, 'grad_norm': 0.7782733248428133, 'learning_rate': 8.972362636294307e-08, 'epoch': 0.94}
94%|█████████▍| 32271/34278 [35:26:16<2:24:56, 4.33s/it] {'loss': 0.1086, 'grad_norm': 0.88650877935467, 'learning_rate': 8.963455035963409e-08, 'epoch': 0.94}
94%|█████████▍| 32272/34278 [35:26:20<2:19:58, 4.19s/it] {'loss': 0.1127, 'grad_norm': 0.8043260987467993, 'learning_rate': 8.954551819489565e-08, 'epoch': 0.94}
94%|█████████▍| 32273/34278 [35:26:22<2:04:39, 3.73s/it] {'loss': 0.0967, 'grad_norm': 0.7528719477366806, 'learning_rate':
8.945652986952325e-08, 'epoch': 0.94}
94%|█████████▍| 32274/34278 [35:26:28<2:26:00, 4.37s/it] {'loss': 0.1141, 'grad_norm': 0.9861613255844213, 'learning_rate': 8.936758538431067e-08, 'epoch': 0.94}
94%|█████████▍| 32275/34278 [35:26:31<2:14:32, 4.03s/it] {'loss': 0.1181, 'grad_norm': 0.8871444967746114, 'learning_rate': 8.92786847400512e-08, 'epoch': 0.94}
94%|█████████▍| 32276/34278 [35:26:34<2:03:55, 3.71s/it] {'loss': 0.1081, 'grad_norm': 0.7471736057557377, 'learning_rate': 8.918982793753972e-08, 'epoch': 0.94}
94%|█████████▍| 32277/34278 [35:26:37<1:56:20, 3.49s/it] {'loss': 0.0975, 'grad_norm': 0.7884487385782475, 'learning_rate': 8.910101497756951e-08, 'epoch': 0.94}
94%|█████████▍| 32278/34278 [35:26:40<1:49:54, 3.30s/it] {'loss': 0.1253, 'grad_norm': 1.1634976749767227, 'learning_rate': 8.901224586093271e-08, 'epoch': 0.94}
94%|█████████▍| 32279/34278 [35:26:44<1:56:07, 3.49s/it] {'loss': 0.113, 'grad_norm': 0.8013935716951578, 'learning_rate': 8.892352058842258e-08, 'epoch': 0.94}
94%|█████████▍| 32280/34278 [35:26:50<2:22:47, 4.29s/it] {'loss': 0.1102, 'grad_norm': 0.908155959134173, 'learning_rate': 8.883483916083068e-08, 'epoch': 0.94}
94%|█████████▍| 32281/34278 [35:26:53<2:07:48, 3.84s/it] {'loss': 0.1175, 'grad_norm': 0.926664533373589, 'learning_rate': 8.874620157894864e-08, 'epoch': 0.94}
94%|█████████▍| 32282/34278 [35:26:56<1:59:06, 3.58s/it] {'loss': 0.105, 'grad_norm': 0.9466495200848368, 'learning_rate': 8.865760784356859e-08, 'epoch':
0.94}
94%|█████████▍| 32283/34278 [35:26:59<1:55:01, 3.46s/it] {'loss': 0.1314, 'grad_norm': 0.9163070272330989, 'learning_rate': 8.856905795548098e-08, 'epoch': 0.94}
94%|█████████▍| 32284/34278 [35:27:05<2:20:07, 4.22s/it] {'loss': 0.1055, 'grad_norm': 0.7883809830136383, 'learning_rate': 8.848055191547633e-08, 'epoch': 0.94}
94%|█████████▍| 32285/34278 [35:27:08<2:05:37, 3.78s/it] {'loss': 0.1023, 'grad_norm': 0.8342912192103017, 'learning_rate': 8.839208972434455e-08, 'epoch': 0.94}
94%|█████████▍| 32286/34278 [35:27:11<2:01:01, 3.65s/it] {'loss': 0.1087, 'grad_norm': 0.8348360942403166, 'learning_rate': 8.830367138287555e-08, 'epoch': 0.94}
94%|█████████▍| 32287/34278 [35:27:15<2:02:27, 3.69s/it] {'loss': 0.1174, 'grad_norm': 1.0003868136548781, 'learning_rate': 8.821529689185981e-08, 'epoch': 0.94}
94%|█████████▍| 32288/34278 [35:27:21<2:21:32, 4.27s/it] {'loss': 0.1091, 'grad_norm': 0.8001076929016344, 'learning_rate': 8.81269662520845e-08, 'epoch': 0.94}
94%|█████████▍| 32289/34278 [35:27:23<2:05:51, 3.80s/it] {'loss': 0.115, 'grad_norm': 1.2193951438076844, 'learning_rate': 8.803867946433897e-08, 'epoch': 0.94}
94%|█████████▍| 32290/34278 [35:27:29<2:28:14, 4.47s/it] {'loss': 0.1199, 'grad_norm': 0.8640853572218443, 'learning_rate': 8.795043652941204e-08, 'epoch': 0.94}
94%|█████████▍| 32291/34278 [35:27:35<2:39:21, 4.81s/it] {'loss': 0.1039, 'grad_norm': 0.7240457643107918, 'learning_rate': 8.786223744809085e-08, 'epoch': 0.94}
94%|█████████▍|
32291/34278 [35:27:35<2:39:21, 4.81s/it]
94%|█████████▍| 32292/34278 [35:27:38<2:23:52, 4.35s/it] {'loss': 0.0993, 'grad_norm': 0.9594373398207939, 'learning_rate': 8.777408222116257e-08, 'epoch': 0.94}
94%|█████████▍| 32293/34278 [35:27:44<2:38:02, 4.78s/it] {'loss': 0.1092, 'grad_norm': 0.7836878391748856, 'learning_rate': 8.768597084941543e-08, 'epoch': 0.94}
94%|█████████▍| 32294/34278 [35:27:47<2:23:46, 4.35s/it] {'loss': 0.1087, 'grad_norm': 0.7213913717550353, 'learning_rate': 8.759790333363439e-08, 'epoch': 0.94}
94%|█████████▍| 32295/34278 [35:27:51<2:13:45, 4.05s/it] {'loss': 0.11, 'grad_norm': 0.7591685255798649, 'learning_rate': 8.750987967460711e-08, 'epoch': 0.94}
94%|█████████▍| 32296/34278 [35:27:54<2:05:46, 3.81s/it] {'loss': 0.112, 'grad_norm': 0.8033655210901949, 'learning_rate': 8.74218998731191e-08, 'epoch': 0.94}
94%|█████████▍| 32297/34278 [35:27:57<2:01:38, 3.68s/it] {'loss': 0.1237, 'grad_norm': 1.0044959807320812, 'learning_rate': 8.733396392995531e-08, 'epoch': 0.94}
94%|█████████▍| 32298/34278 [35:28:00<1:52:58, 3.42s/it] {'loss': 0.1222, 'grad_norm': 0.7833893556060869, 'learning_rate': 8.724607184590117e-08, 'epoch': 0.94}
94%|█████████▍| 32299/34278 [35:28:03<1:48:10, 3.28s/it] {'loss': 0.1107, 'grad_norm': 0.8650228097079342, 'learning_rate': 8.715822362174165e-08, 'epoch': 0.94}
94%|█████████▍| 32300/34278 [35:28:06<1:43:14, 3.13s/it] {'loss': 0.1245, 'grad_norm': 0.8301736997589824, 'learning_rate': 8.707041925826054e-08, 'epoch': 0.94}
94%|█████████▍| 32300/34278 [35:28:06<1:43:14,
3.13s/it]
94%|█████████▍| 32301/34278 [35:28:11<2:06:57, 3.85s/it] {'loss': 0.1235, 'grad_norm': 0.7275716008408162, 'learning_rate': 8.698265875624168e-08, 'epoch': 0.94}
94%|█████████▍| 32302/34278 [35:28:15<1:59:30, 3.63s/it] {'loss': 0.1119, 'grad_norm': 0.8993784203191924, 'learning_rate': 8.689494211646887e-08, 'epoch': 0.94}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fa5ba8a1440>
Failed to fetch sample 3187200.
Exception: cannot identify image file <_io.BytesIO object at 0x7fa5ba8a1440>
94%|█████████▍| 32303/34278 [35:28:17<1:51:46, 3.40s/it] {'loss': 0.1101, 'grad_norm': 1.1175801904530618, 'learning_rate': 8.680726933972538e-08, 'epoch': 0.94}
94%|█████████▍| 32304/34278 [35:28:20<1:47:01, 3.25s/it] {'loss': 0.1215, 'grad_norm': 0.8399352229791115, 'learning_rate': 8.671964042679392e-08, 'epoch': 0.94}
94%|█████████▍| 32305/34278 [35:28:24<1:48:20, 3.29s/it] {'loss': 0.1098, 'grad_norm': 0.7335315001881796, 'learning_rate': 8.663205537845609e-08, 'epoch': 0.94}
94%|█████████▍| 32306/34278 [35:28:28<1:57:10, 3.57s/it] {'loss': 0.1251, 'grad_norm': 0.861878034998344, 'learning_rate': 8.654451419549459e-08, 'epoch': 0.94}
94%|█████████▍| 32307/34278 [35:28:32<1:59:31, 3.64s/it] {'loss': 0.1145, 'grad_norm': 0.7872442627576178, 'learning_rate': 8.645701687869046e-08, 'epoch': 0.94}
94%|█████████▍| 32308/34278 [35:28:35<1:59:14, 3.63s/it] {'loss': 0.1241, 'grad_norm': 0.7276279017712955, 'learning_rate': 8.63695634288253e-08, 'epoch': 0.94}
94%|█████████▍| 32309/34278 [35:28:39<2:01:10, 3.69s/it] {'loss': 0.0962, 'grad_norm': 0.8494474455200676, 'learning_rate': 8.628215384668015e-08, 'epoch': 0.94}
94%|█████████▍| 32310/34278 [35:28:42<1:56:31, 3.55s/it] {'loss': 0.1053, 'grad_norm': 0.9091544186677172, 'learning_rate': 8.61947881330344e-08, 'epoch': 0.94}
94%|█████████▍| 32311/34278 [35:28:48<2:20:34, 4.29s/it] {'loss': 0.1024, 'grad_norm': 0.800220103593855, 'learning_rate': 8.610746628866851e-08, 'epoch': 0.94}
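The traceback above shows `__getitem__` raising `PIL.UnidentifiedImageError` on undecodable image bytes, after which the run logs "Failed to fetch sample 3187200." and training continues. A minimal sketch of that catch-and-fall-back pattern, assuming a bytes-backed dataset (`SafeDataset` and the deterministic next-index fallback are illustrative, not the actual `aguvis/dataset.py` code):

```python
import io

from PIL import Image, UnidentifiedImageError


class SafeDataset:
    """Illustrative dataset that skips samples whose bytes cannot be decoded."""

    def __init__(self, records):
        self.records = records  # raw image bytes per sample

    def _get_item(self, i):
        buff = io.BytesIO(self.records[i])
        img = Image.open(buff)  # raises UnidentifiedImageError on bad bytes
        img.load()              # force a full decode before returning
        return img

    def __getitem__(self, i):
        # Try at most len(records) samples so a fully corrupt dataset still fails loudly.
        for _ in range(len(self.records)):
            try:
                return self._get_item(i)
            except (UnidentifiedImageError, OSError) as e:
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.records)  # fall back to another sample
        raise RuntimeError("no decodable samples found")
```

Because the failure is caught inside `__getitem__`, one corrupt record costs a log line and a substitute sample instead of crashing the whole distributed run, which matches the log resuming at step 32303 right after the error.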
94%|█████████▍| 32312/34278 [35:28:51<2:07:38, 3.90s/it] {'loss': 0.0982, 'grad_norm': 0.6635900744825833, 'learning_rate': 8.602018831436243e-08, 'epoch': 0.94}
94%|█████████▍| 32313/34278 [35:28:56<2:16:46, 4.18s/it] {'loss': 0.1251, 'grad_norm': 0.8273136038765367, 'learning_rate': 8.593295421089498e-08, 'epoch': 0.94}
94%|█████████▍| 32314/34278 [35:29:02<2:33:37, 4.69s/it] {'loss': 0.1015, 'grad_norm': 0.9357943378239129, 'learning_rate': 8.584576397904498e-08, 'epoch': 0.94}
94%|█████████▍| 32315/34278 [35:29:05<2:17:58, 4.22s/it] {'loss': 0.1262, 'grad_norm': 0.7543237702459774, 'learning_rate': 8.57586176195907e-08, 'epoch': 0.94}
94%|█████████▍| 32316/34278 [35:29:11<2:35:55, 4.77s/it] {'loss': 0.1259, 'grad_norm': 0.8876534827741925, 'learning_rate': 8.567151513331096e-08, 'epoch': 0.94}
94%|█████████▍| 32317/34278 [35:29:14<2:16:27, 4.18s/it] {'loss': 0.1071, 'grad_norm': 0.8748683143361243, 'learning_rate': 8.558445652098291e-08, 'epoch': 0.94}
94%|█████████▍| 32318/34278 [35:29:18<2:12:09, 4.05s/it] {'loss': 0.1214, 'grad_norm': 1.019682043259005, 'learning_rate': 8.549744178338259e-08, 'epoch': 0.94}
94%|█████████▍| 32319/34278 [35:29:21<2:03:24, 3.78s/it] {'loss': 0.1062, 'grad_norm': 0.7724503288375518, 'learning_rate': 8.54104709212883e-08, 'epoch': 0.94}
94%|█████████▍| 32320/34278 [35:29:24<1:53:55, 3.49s/it] {'loss': 0.0994, 'grad_norm': 0.8884411022332633, 'learning_rate': 8.53235439354766e-08, 'epoch': 0.94}
94%|█████████▍| 32320/34278
[35:29:24<1:53:55, 3.49s/it]
94%|█████████▍| 32321/34278 [35:29:27<1:50:37, 3.39s/it] {'loss': 0.1107, 'grad_norm': 0.9568008653287678, 'learning_rate': 8.5236660826723e-08, 'epoch': 0.94}
94%|█████████▍| 32322/34278 [35:29:31<1:58:17, 3.63s/it] {'loss': 0.1078, 'grad_norm': 0.7016210509757499, 'learning_rate': 8.514982159580298e-08, 'epoch': 0.94}
94%|█████████▍| 32323/34278 [35:29:34<1:53:46, 3.49s/it] {'loss': 0.1139, 'grad_norm': 0.9077299274976988, 'learning_rate': 8.506302624349205e-08, 'epoch': 0.94}
94%|█████████▍| 32324/34278 [35:29:37<1:50:38, 3.40s/it] {'loss': 0.1156, 'grad_norm': 0.7983976891298535, 'learning_rate': 8.497627477056514e-08, 'epoch': 0.94}
94%|█████████▍| 32325/34278 [35:29:40<1:46:58, 3.29s/it] {'loss': 0.104, 'grad_norm': 0.887561043756048, 'learning_rate': 8.488956717779661e-08, 'epoch': 0.94}
94%|█████████▍| 32326/34278 [35:29:44<1:48:25, 3.33s/it] {'loss': 0.1238, 'grad_norm': 0.8654210067457109, 'learning_rate': 8.480290346596087e-08, 'epoch': 0.94}
94%|█████████▍| 32327/34278 [35:29:47<1:43:01, 3.17s/it] {'loss': 0.1094, 'grad_norm': 0.7727612534057382, 'learning_rate': 8.471628363583174e-08, 'epoch': 0.94}
94%|█████████▍| 32328/34278 [35:29:50<1:41:02, 3.11s/it] {'loss': 0.1149, 'grad_norm': 0.8980402081435425, 'learning_rate': 8.46297076881819e-08, 'epoch': 0.94}
94%|█████████▍| 32329/34278 [35:29:53<1:42:47, 3.16s/it] {'loss': 0.1058, 'grad_norm': 0.6740917503354988, 'learning_rate': 8.454317562378467e-08, 'epoch': 0.94}
94%|█████████▍| 32330/34278 [35:29:56<1:41:56, 3.14s/it] {'loss': 0.1105, 'grad_norm': 0.7988787708196102, 'learning_rate': 8.445668744341274e-08, 'epoch': 0.94}
94%|█████████▍| 32331/34278 [35:30:00<1:47:42, 3.32s/it] {'loss': 0.1008, 'grad_norm': 0.7638095409849585, 'learning_rate': 8.43702431478377e-08, 'epoch': 0.94}
94%|█████████▍| 32332/34278 [35:30:03<1:44:23, 3.22s/it] {'loss': 0.1109, 'grad_norm': 0.8684555261019707, 'learning_rate': 8.428384273783175e-08, 'epoch': 0.94}
94%|█████████▍| 32333/34278 [35:30:06<1:42:12, 3.15s/it] {'loss': 0.0915, 'grad_norm': 0.6328075299142163, 'learning_rate': 8.419748621416646e-08, 'epoch': 0.94}
94%|█████████▍| 32334/34278 [35:30:09<1:46:25, 3.28s/it] {'loss': 0.1011, 'grad_norm': 0.9430103200458374, 'learning_rate': 8.411117357761289e-08, 'epoch': 0.94}
94%|█████████▍| 32335/34278 [35:30:13<1:49:38, 3.39s/it] {'loss': 0.1203, 'grad_norm': 0.8249591915840404, 'learning_rate': 8.402490482893988e-08, 'epoch': 0.94}
94%|█████████▍| 32336/34278 [35:30:17<1:51:20, 3.44s/it] {'loss': 0.0926, 'grad_norm': 0.7291717954263409, 'learning_rate': 8.393867996892014e-08, 'epoch': 0.94}
94%|█████████▍| 32337/34278 [35:30:20<1:52:27, 3.48s/it] {'loss': 0.1244, 'grad_norm': 0.8703143796152049, 'learning_rate': 8.385249899832249e-08, 'epoch': 0.94}
94%|█████████▍| 32338/34278 [35:30:23<1:47:23, 3.32s/it] {'loss': 0.1155, 'grad_norm': 0.779070003302947, 'learning_rate': 8.37663619179152e-08, 'epoch': 0.94}
94%|█████████▍| 32339/34278
[35:30:26<1:48:08, 3.35s/it] {'loss': 0.1194, 'grad_norm': 1.0629336366988282, 'learning_rate': 8.36802687284688e-08, 'epoch': 0.94}
94%|█████████▍| 32340/34278 [35:30:30<1:47:20, 3.32s/it] {'loss': 0.0926, 'grad_norm': 0.7835093401142775, 'learning_rate': 8.359421943075153e-08, 'epoch': 0.94}
94%|█████████▍| 32341/34278 [35:30:32<1:40:57, 3.13s/it] {'loss': 0.1051, 'grad_norm': 0.9044756377979435, 'learning_rate': 8.350821402553111e-08, 'epoch': 0.94}
94%|█████████▍| 32342/34278 [35:30:35<1:38:28, 3.05s/it] {'loss': 0.1142, 'grad_norm': 0.8804094413338565, 'learning_rate': 8.342225251357527e-08, 'epoch': 0.94}
94%|█████████▍| 32343/34278 [35:30:39<1:40:45, 3.12s/it] {'loss': 0.137, 'grad_norm': 1.3066257204772769, 'learning_rate': 8.333633489565284e-08, 'epoch': 0.94}
94%|█████████▍| 32344/34278 [35:30:42<1:42:06, 3.17s/it] {'loss': 0.1035, 'grad_norm': 0.7875741132318617, 'learning_rate': 8.325046117252933e-08, 'epoch': 0.94}
94%|█████████▍| 32345/34278 [35:30:45<1:43:22, 3.21s/it] {'loss': 0.1231, 'grad_norm': 0.8731378998190231, 'learning_rate': 8.316463134497188e-08, 'epoch': 0.94}
94%|█████████▍| 32346/34278 [35:30:48<1:38:27, 3.06s/it] {'loss': 0.0967, 'grad_norm': 0.8524985718756968, 'learning_rate': 8.30788454137471e-08, 'epoch': 0.94}
94%|█████████▍| 32347/34278 [35:30:51<1:43:52, 3.23s/it] {'loss': 0.1058, 'grad_norm': 0.7584908542335126, 'learning_rate': 8.299310337962052e-08, 'epoch': 0.94}
94%|█████████▍| 32348/34278 [35:30:57<2:10:14, 4.05s/it]
{'loss': 0.0966, 'grad_norm': 0.6681495845564815, 'learning_rate': 8.290740524335817e-08, 'epoch': 0.94}
94%|█████████▍| 32349/34278 [35:31:02<2:14:21, 4.18s/it] {'loss': 0.1175, 'grad_norm': 0.8429285285254013, 'learning_rate': 8.282175100572387e-08, 'epoch': 0.94}
94%|█████████▍| 32350/34278 [35:31:05<2:01:57, 3.80s/it] {'loss': 0.0911, 'grad_norm': 0.8027082932232955, 'learning_rate': 8.27361406674837e-08, 'epoch': 0.94}
94%|█████████▍| 32351/34278 [35:31:09<2:02:00, 3.80s/it] {'loss': 0.1255, 'grad_norm': 0.8567381679743021, 'learning_rate': 8.265057422940148e-08, 'epoch': 0.94}
94%|█████████▍| 32352/34278 [35:31:12<1:54:10, 3.56s/it] {'loss': 0.0995, 'grad_norm': 0.8541403054887776, 'learning_rate': 8.256505169224105e-08, 'epoch': 0.94}
94%|█████████▍| 32353/34278 [35:31:15<1:52:01, 3.49s/it] {'loss': 0.0865, 'grad_norm': 0.7698180543207574, 'learning_rate': 8.247957305676568e-08, 'epoch': 0.94}
94%|█████████▍| 32354/34278 [35:31:18<1:50:10, 3.44s/it] {'loss': 0.1105, 'grad_norm': 1.0897995547477701, 'learning_rate': 8.239413832373865e-08, 'epoch': 0.94}
94%|█████████▍| 32355/34278 [35:31:23<2:00:40, 3.77s/it] {'loss': 0.1217, 'grad_norm': 0.894631470770573, 'learning_rate': 8.230874749392326e-08, 'epoch': 0.94}
94%|█████████▍| 32356/34278 [35:31:28<2:16:18, 4.26s/it] {'loss': 0.1373, 'grad_norm': 0.8897706786415958, 'learning_rate': 8.222340056808109e-08, 'epoch': 0.94}
94%|█████████▍| 32357/34278 [35:31:32<2:07:47, 3.99s/it] {'loss': 0.1133, 'grad_norm':
0.8475899700246209, 'learning_rate': 8.213809754697489e-08, 'epoch': 0.94}
94%|█████████▍| 32358/34278 [35:31:37<2:25:34, 4.55s/it] {'loss': 0.1076, 'grad_norm': 0.7119139509831721, 'learning_rate': 8.205283843136513e-08, 'epoch': 0.94}
94%|█████████▍| 32359/34278 [35:31:41<2:18:27, 4.33s/it] {'loss': 0.1006, 'grad_norm': 0.8062916356226331, 'learning_rate': 8.196762322201401e-08, 'epoch': 0.94}
94%|█████████▍| 32360/34278 [35:31:44<2:06:30, 3.96s/it] {'loss': 0.101, 'grad_norm': 1.0753529891432876, 'learning_rate': 8.188245191968202e-08, 'epoch': 0.94}
94%|█████████▍| 32361/34278 [35:31:50<2:24:29, 4.52s/it] {'loss': 0.0974, 'grad_norm': 0.8753470313202542, 'learning_rate': 8.179732452512911e-08, 'epoch': 0.94}
94%|█████████▍| 32362/34278 [35:31:53<2:11:15, 4.11s/it] {'loss': 0.1147, 'grad_norm': 0.7564879364630781, 'learning_rate': 8.17122410391158e-08, 'epoch': 0.94}
94%|█████████▍| 32363/34278 [35:31:57<2:03:57, 3.88s/it] {'loss': 0.1151, 'grad_norm': 0.7842977363139345, 'learning_rate': 8.162720146240144e-08, 'epoch': 0.94}
94%|█████████▍| 32364/34278 [35:32:00<1:57:00, 3.67s/it] {'loss': 0.1064, 'grad_norm': 0.7961969065277302, 'learning_rate': 8.154220579574601e-08, 'epoch': 0.94}
94%|█████████▍| 32365/34278 [35:32:04<2:02:29, 3.84s/it] {'loss': 0.1206, 'grad_norm': 0.9942200818097617, 'learning_rate': 8.145725403990668e-08, 'epoch': 0.94}
94%|█████████▍| 32366/34278 [35:32:07<1:51:26, 3.50s/it] {'loss': 0.1039, 'grad_norm': 0.8811670485871332,
'learning_rate': 8.137234619564282e-08, 'epoch': 0.94}
94%|█████████▍| 32367/34278 [35:32:10<1:52:37, 3.54s/it] {'loss': 0.1075, 'grad_norm': 0.8439337869759281, 'learning_rate': 8.12874822637133e-08, 'epoch': 0.94}
94%|█████████▍| 32368/34278 [35:32:14<1:51:29, 3.50s/it] {'loss': 0.0889, 'grad_norm': 0.9234172909187949, 'learning_rate': 8.120266224487416e-08, 'epoch': 0.94}
94%|█████████▍| 32369/34278 [35:32:20<2:15:16, 4.25s/it] {'loss': 0.1555, 'grad_norm': 0.7791893482755741, 'learning_rate': 8.111788613988369e-08, 'epoch': 0.94}
94%|█████████▍| 32370/34278 [35:32:23<2:03:12, 3.87s/it] {'loss': 0.1119, 'grad_norm': 0.8860145963518655, 'learning_rate': 8.103315394949906e-08, 'epoch': 0.94}
94%|█████████▍| 32371/34278 [35:32:26<1:56:11, 3.66s/it] {'loss': 0.0869, 'grad_norm': 0.8118565613192408, 'learning_rate': 8.094846567447523e-08, 'epoch': 0.94}
94%|█████████▍| 32372/34278 [35:32:32<2:18:51, 4.37s/it] {'loss': 0.1027, 'grad_norm': 1.0808754968211483, 'learning_rate': 8.086382131556935e-08, 'epoch': 0.94}
94%|█████████▍| 32373/34278 [35:32:35<2:09:28, 4.08s/it] {'loss': 0.114, 'grad_norm': 0.7209412532372016, 'learning_rate': 8.077922087353751e-08, 'epoch': 0.94}
94%|█████████▍| 32374/34278 [35:32:41<2:27:22, 4.64s/it] {'loss': 0.1124, 'grad_norm': 1.3381069972839603, 'learning_rate': 8.069466434913464e-08, 'epoch': 0.94}
94%|█████████▍| 32375/34278 [35:32:45<2:12:55, 4.19s/it] {'loss': 0.0897, 'grad_norm': 0.750155217556688, 'learning_rate':
8.06101517431146e-08, 'epoch': 0.94}
94%|█████████▍| 32376/34278 [35:32:48<2:07:29, 4.02s/it] {'loss': 0.1086, 'grad_norm': 0.7369771329983555, 'learning_rate': 8.052568305623342e-08, 'epoch': 0.94}
94%|█████████▍| 32377/34278 [35:32:51<1:56:37, 3.68s/it] {'loss': 0.0983, 'grad_norm': 0.801568927073826, 'learning_rate': 8.044125828924442e-08, 'epoch': 0.94}
94%|█████████▍| 32378/34278 [35:32:55<1:56:09, 3.67s/it] {'loss': 0.1041, 'grad_norm': 0.781518255797064, 'learning_rate': 8.035687744290143e-08, 'epoch': 0.94}
94%|█████████▍| 32379/34278 [35:32:58<1:49:49, 3.47s/it] {'loss': 0.0979, 'grad_norm': 0.7125332917030004, 'learning_rate': 8.027254051795774e-08, 'epoch': 0.94}
94%|█████████▍| 32380/34278 [35:33:00<1:42:39, 3.25s/it] {'loss': 0.1184, 'grad_norm': 0.6843526691826908, 'learning_rate': 8.018824751516663e-08, 'epoch': 0.94}
94%|█████████▍| 32381/34278 [35:33:07<2:10:44, 4.14s/it] {'loss': 0.0985, 'grad_norm': 1.002080672606367, 'learning_rate': 8.010399843528083e-08, 'epoch': 0.94}
94%|█████████▍| 32382/34278 [35:33:09<1:55:03, 3.64s/it] {'loss': 0.113, 'grad_norm': 0.7824026578557902, 'learning_rate': 8.00197932790514e-08, 'epoch': 0.94}
94%|█████████▍| 32383/34278 [35:33:13<1:57:18, 3.71s/it] {'loss': 0.113, 'grad_norm': 0.7873377395419179, 'learning_rate': 7.993563204723054e-08, 'epoch': 0.94}
94%|█████████▍| 32384/34278 [35:33:17<1:56:29, 3.69s/it] {'loss': 0.1215, 'grad_norm': 0.9239821817302963, 'learning_rate': 7.98515147405704e-08, 'epoch':
0.94}
94%|█████████▍| 32385/34278 [35:33:19<1:48:16, 3.43s/it] {'loss': 0.1125, 'grad_norm': 0.8012849692276517, 'learning_rate': 7.976744135982095e-08, 'epoch': 0.94}
94%|█████████▍| 32386/34278 [35:33:26<2:15:54, 4.31s/it] {'loss': 0.1211, 'grad_norm': 0.87859202581409, 'learning_rate': 7.968341190573325e-08, 'epoch': 0.94}
94%|█████████▍| 32387/34278 [35:33:29<2:04:44, 3.96s/it] {'loss': 0.1036, 'grad_norm': 0.8184684077296053, 'learning_rate': 7.959942637905783e-08, 'epoch': 0.94}
94%|█████████▍| 32388/34278 [35:33:33<2:01:49, 3.87s/it] {'loss': 0.1164, 'grad_norm': 0.8394996941678644, 'learning_rate': 7.951548478054405e-08, 'epoch': 0.94}
94%|█████████▍| 32389/34278 [35:33:35<1:51:49, 3.55s/it] {'loss': 0.0907, 'grad_norm': 0.7561225631203814, 'learning_rate': 7.943158711094079e-08, 'epoch': 0.94}
94%|█████████▍| 32390/34278 [35:33:40<2:01:14, 3.85s/it] {'loss': 0.0987, 'grad_norm': 0.6961283322428043, 'learning_rate': 7.934773337099855e-08, 'epoch': 0.94}
94%|█████████▍| 32391/34278 [35:33:43<1:53:54, 3.62s/it] {'loss': 0.105, 'grad_norm': 0.7230837570967884, 'learning_rate': 7.926392356146507e-08, 'epoch': 0.94}
94%|█████████▍| 32392/34278 [35:33:46<1:47:48, 3.43s/it] {'loss': 0.1104, 'grad_norm': 0.892162969730835, 'learning_rate': 7.918015768308806e-08, 'epoch': 0.94}
95%|█████████▍| 32393/34278 [35:33:49<1:44:12, 3.32s/it] {'loss': 0.1058, 'grad_norm': 0.7914370249735397, 'learning_rate': 7.90964357366164e-08, 'epoch': 0.95}
95%|█████████▍| 32393/34278
[35:33:49<1:44:12, 3.32s/it] 95%|█████████▍| 32394/34278 [35:33:55<2:09:37, 4.13s/it] {'loss': 0.1102, 'grad_norm': 0.7030963808324242, 'learning_rate': 7.90127577227967e-08, 'epoch': 0.95} 95%|█████████▍| 32394/34278 [35:33:55<2:09:37, 4.13s/it] 95%|█████████▍| 32395/34278 [35:33:59<2:04:01, 3.95s/it] {'loss': 0.1207, 'grad_norm': 0.9540850690225302, 'learning_rate': 7.89291236423767e-08, 'epoch': 0.95} 95%|█████████▍| 32395/34278 [35:33:59<2:04:01, 3.95s/it] 95%|█████████▍| 32396/34278 [35:34:02<1:55:44, 3.69s/it] {'loss': 0.1145, 'grad_norm': 1.2345368165618769, 'learning_rate': 7.884553349610191e-08, 'epoch': 0.95} 95%|█████████▍| 32396/34278 [35:34:02<1:55:44, 3.69s/it] 95%|█████████▍| 32397/34278 [35:34:05<1:50:42, 3.53s/it] {'loss': 0.1173, 'grad_norm': 0.7108277104669307, 'learning_rate': 7.876198728472062e-08, 'epoch': 0.95} 95%|█████████▍| 32397/34278 [35:34:05<1:50:42, 3.53s/it] 95%|█████████▍| 32398/34278 [35:34:11<2:15:18, 4.32s/it] {'loss': 0.113, 'grad_norm': 0.7955082892917396, 'learning_rate': 7.867848500897668e-08, 'epoch': 0.95} 95%|█████████▍| 32398/34278 [35:34:11<2:15:18, 4.32s/it] 95%|█████████▍| 32399/34278 [35:34:15<2:09:48, 4.14s/it] {'loss': 0.1283, 'grad_norm': 0.8259253011058484, 'learning_rate': 7.859502666961672e-08, 'epoch': 0.95} 95%|█████████▍| 32399/34278 [35:34:15<2:09:48, 4.14s/it] 95%|█████████▍| 32400/34278 [35:34:18<2:03:05, 3.93s/it] {'loss': 0.0807, 'grad_norm': 0.7176344232509159, 'learning_rate': 7.851161226738569e-08, 'epoch': 0.95} 95%|█████████▍| 32400/34278 [35:34:18<2:03:05, 3.93s/it] 95%|█████████▍| 32401/34278 [35:34:24<2:24:15, 4.61s/it] {'loss': 0.1229, 'grad_norm': 0.7244473039419826, 'learning_rate': 7.842824180302743e-08, 'epoch': 0.95} 95%|█████████▍| 32401/34278 [35:34:24<2:24:15, 4.61s/it] 95%|█████████▍| 32402/34278 [35:34:28<2:10:56, 4.19s/it] {'loss': 0.1113, 'grad_norm': 0.763476028012814, 'learning_rate': 7.834491527728694e-08, 'epoch': 0.95} 95%|█████████▍| 32402/34278 [35:34:28<2:10:56, 4.19s/it] 
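The flattened tqdm/metric lines above are regular enough to post-process into structured records. A stdlib-only sketch (the helper name `parse_progress` and the record layout are mine, not part of the training code; the regex assumes the exact "STEP/TOTAL [ELAPSED<ETA, RATEs/it] {metrics}" shape shown in this log):

```python
import ast
import re

# Matches "STEP/TOTAL [ELAPSED<ETA, RATEs/it] {metrics-dict}".  The bare
# tqdm echoes that are not followed by a metrics dict fail to match and
# are skipped, which deduplicates the doubled progress prints for free.
STEP_RE = re.compile(
    r"(\d+)/(\d+) \[([\d:]+)<([\d:]+), ([\d.]+)s/it\] (\{[^}]*\})"
)


def parse_progress(text: str) -> list[dict]:
    """Turn flattened tqdm/loss output like the lines above into dicts."""
    records = []
    for m in STEP_RE.finditer(text):
        # The metrics dict is printed with Python repr, so literal_eval
        # parses it directly: {'loss': ..., 'grad_norm': ..., ...}.
        metrics = ast.literal_eval(m.group(6))
        records.append(
            {
                "step": int(m.group(1)),
                "total": int(m.group(2)),
                "elapsed": m.group(3),
                "eta": m.group(4),
                "sec_per_it": float(m.group(5)),
                **metrics,
            }
        )
    return records
```

For example, `parse_progress(open("train.log").read())` would yield one record per optimizer step, with `loss`, `grad_norm`, `learning_rate`, and `epoch` keys merged in, ready for plotting.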
95%|█████████▍| 32403/34278 [35:34:31<2:05:48, 4.03s/it] {'loss': 0.1065, 'grad_norm': 0.9109140432203537, 'learning_rate': 7.826163269090914e-08, 'epoch': 0.95} 95%|█████████▍| 32403/34278 [35:34:31<2:05:48, 4.03s/it] 95%|█████████▍| 32404/34278 [35:34:34<1:56:00, 3.71s/it] {'loss': 0.1216, 'grad_norm': 0.747528977273796, 'learning_rate': 7.817839404463623e-08, 'epoch': 0.95} 95%|█████████▍| 32404/34278 [35:34:34<1:56:00, 3.71s/it] 95%|█████████▍| 32405/34278 [35:34:38<1:54:01, 3.65s/it] {'loss': 0.1179, 'grad_norm': 0.7952431788295623, 'learning_rate': 7.809519933921095e-08, 'epoch': 0.95} 95%|█████████▍| 32405/34278 [35:34:38<1:54:01, 3.65s/it] 95%|█████████▍| 32406/34278 [35:34:41<1:49:38, 3.51s/it] {'loss': 0.11, 'grad_norm': 0.8155826659945792, 'learning_rate': 7.80120485753777e-08, 'epoch': 0.95} 95%|█████████▍| 32406/34278 [35:34:41<1:49:38, 3.51s/it] 95%|█████████▍| 32407/34278 [35:34:44<1:48:23, 3.48s/it] {'loss': 0.096, 'grad_norm': 0.8149750360445821, 'learning_rate': 7.792894175387755e-08, 'epoch': 0.95} 95%|█████████▍| 32407/34278 [35:34:44<1:48:23, 3.48s/it] 95%|█████████▍| 32408/34278 [35:34:49<1:55:13, 3.70s/it] {'loss': 0.098, 'grad_norm': 0.5960639511586834, 'learning_rate': 7.784587887545269e-08, 'epoch': 0.95} 95%|█████████▍| 32408/34278 [35:34:49<1:55:13, 3.70s/it] 95%|█████████▍| 32409/34278 [35:34:52<1:49:04, 3.50s/it] {'loss': 0.1141, 'grad_norm': 0.9003230157312117, 'learning_rate': 7.776285994084476e-08, 'epoch': 0.95} 95%|█████████▍| 32409/34278 [35:34:52<1:49:04, 3.50s/it] 95%|█████████▍| 32410/34278 [35:34:56<1:58:55, 3.82s/it] {'loss': 0.1029, 'grad_norm': 0.6997698470257007, 'learning_rate': 7.767988495079536e-08, 'epoch': 0.95} 95%|█████████▍| 32410/34278 [35:34:56<1:58:55, 3.82s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 95%|█████████▍| 32411/34278 [35:34:59<1:52:33, 3.62s/it] {'loss': 0.0877, 'grad_norm': 0.7786886267937401, 'learning_rate': 7.759695390604505e-08, 'epoch': 0.95} 95%|█████████▍| 32411/34278 [35:34:59<1:52:33, 3.62s/it] 95%|█████████▍| 32412/34278 [35:35:03<1:54:07, 3.67s/it] {'loss': 0.1096, 'grad_norm': 0.8070390944776786, 'learning_rate': 7.75140668073332e-08, 'epoch': 0.95} 95%|█████████▍| 32412/34278 [35:35:03<1:54:07, 3.67s/it] 95%|█████████▍| 32413/34278 [35:35:07<1:59:46, 3.85s/it] {'loss': 0.1166, 'grad_norm': 0.6548227331117534, 'learning_rate': 7.74312236554009e-08, 'epoch': 0.95} 95%|█████████▍| 32413/34278 [35:35:07<1:59:46, 3.85s/it] 95%|█████████▍| 32414/34278 [35:35:11<1:56:03, 3.74s/it] {'loss': 0.1104, 'grad_norm': 0.7864888096709484, 'learning_rate': 7.734842445098811e-08, 'epoch': 0.95} 95%|█████████▍| 32414/34278 [35:35:11<1:56:03, 3.74s/it] 95%|█████████▍| 32415/34278 [35:35:15<1:57:23, 3.78s/it] {'loss': 0.1091, 'grad_norm': 0.7404766707515956, 'learning_rate': 7.726566919483313e-08, 'epoch': 0.95} 95%|█████████▍| 32415/34278 [35:35:15<1:57:23, 3.78s/it] 95%|█████████▍| 32416/34278 [35:35:21<2:19:23, 4.49s/it] {'loss': 0.1208, 'grad_norm': 1.1420709390800285, 'learning_rate': 7.718295788767537e-08, 'epoch': 0.95} 95%|█████████▍| 32416/34278 [35:35:21<2:19:23, 4.49s/it] 95%|█████████▍| 32417/34278 [35:35:24<2:05:27, 4.04s/it] {'loss': 0.1335, 'grad_norm': 1.083427221237561, 'learning_rate': 7.710029053025258e-08, 'epoch': 0.95} 95%|█████████▍| 32417/34278 [35:35:24<2:05:27, 4.04s/it] 95%|█████████▍| 32418/34278 [35:35:28<2:03:16, 3.98s/it] {'loss': 0.0932, 'grad_norm': 0.8615196675754264, 'learning_rate': 7.701766712330305e-08, 'epoch': 0.95} 95%|█████████▍| 32418/34278 [35:35:28<2:03:16, 3.98s/it] 95%|█████████▍| 32419/34278 [35:35:32<2:09:33, 4.18s/it] {'loss': 0.128, 'grad_norm': 0.8327881297641448, 'learning_rate': 7.693508766756508e-08, 'epoch': 0.95} 95%|█████████▍| 32419/34278 [35:35:32<2:09:33, 
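The UserWarning above ("None of the inputs have requires_grad=True. Gradients will be None") comes from reentrant activation checkpointing: if no tensor entering the checkpointed segment requires grad, autograd never re-enters the segment and its parameters receive no gradients. A minimal repro plus the usual mitigation, sketched outside this training script (marking the segment input as requiring grad is roughly what helpers like HF's `enable_input_require_grads` do; whether that is appropriate here depends on which modules are intentionally frozen):

```python
import warnings

import torch
from torch.utils.checkpoint import checkpoint

layer = torch.nn.Linear(4, 4)

# Trigger the warning: the segment input does not require grad, so with
# use_reentrant=True the checkpointed layer is invisible to autograd.
x = torch.randn(2, 4)  # requires_grad=False
with warnings.catch_warnings(record=True) as caught:
    warnings.simplefilter("always")
    out = checkpoint(layer, x, use_reentrant=True)

# Mitigation sketch: force the segment's input to require grad so the
# backward pass recomputes the segment and populates parameter grads.
x2 = torch.randn(2, 4).requires_grad_()
checkpoint(layer, x2, use_reentrant=True).sum().backward()
```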
4.18s/it] 95%|█████████▍| 32420/34278 [35:35:35<1:58:34, 3.83s/it] {'loss': 0.113, 'grad_norm': 0.7653280774890531, 'learning_rate': 7.68525521637753e-08, 'epoch': 0.95} 95%|█████████▍| 32420/34278 [35:35:35<1:58:34, 3.83s/it] 95%|█████████▍| 32421/34278 [35:35:41<2:15:18, 4.37s/it] {'loss': 0.1037, 'grad_norm': 0.8654162343913903, 'learning_rate': 7.677006061267089e-08, 'epoch': 0.95} 95%|█████████▍| 32421/34278 [35:35:41<2:15:18, 4.37s/it] 95%|█████████▍| 32422/34278 [35:35:45<2:14:54, 4.36s/it] {'loss': 0.0953, 'grad_norm': 0.8712620590015554, 'learning_rate': 7.668761301498739e-08, 'epoch': 0.95} 95%|█████████▍| 32422/34278 [35:35:45<2:14:54, 4.36s/it] 95%|█████████▍| 32423/34278 [35:35:49<2:08:06, 4.14s/it] {'loss': 0.1309, 'grad_norm': 0.7792027222281702, 'learning_rate': 7.660520937146199e-08, 'epoch': 0.95} 95%|█████████▍| 32423/34278 [35:35:49<2:08:06, 4.14s/it] 95%|█████████▍| 32424/34278 [35:35:53<2:03:23, 3.99s/it] {'loss': 0.1182, 'grad_norm': 0.939320024063705, 'learning_rate': 7.65228496828302e-08, 'epoch': 0.95} 95%|█████████▍| 32424/34278 [35:35:53<2:03:23, 3.99s/it] 95%|█████████▍| 32425/34278 [35:35:56<2:01:36, 3.94s/it] {'loss': 0.1234, 'grad_norm': 0.7470944219962071, 'learning_rate': 7.644053394982698e-08, 'epoch': 0.95} 95%|█████████▍| 32425/34278 [35:35:56<2:01:36, 3.94s/it] 95%|█████████▍| 32426/34278 [35:36:00<1:53:52, 3.69s/it] {'loss': 0.1142, 'grad_norm': 0.7567899105879465, 'learning_rate': 7.635826217318676e-08, 'epoch': 0.95} 95%|█████████▍| 32426/34278 [35:36:00<1:53:52, 3.69s/it] 95%|█████████▍| 32427/34278 [35:36:02<1:45:11, 3.41s/it] {'loss': 0.1059, 'grad_norm': 0.7022270603143397, 'learning_rate': 7.627603435364562e-08, 'epoch': 0.95} 95%|█████████▍| 32427/34278 [35:36:02<1:45:11, 3.41s/it] 95%|█████████▍| 32428/34278 [35:36:05<1:38:34, 3.20s/it] {'loss': 0.115, 'grad_norm': 0.8755207850523682, 'learning_rate': 7.619385049193573e-08, 'epoch': 0.95} 95%|█████████▍| 32428/34278 [35:36:05<1:38:34, 3.20s/it] 95%|█████████▍| 
32429/34278 [35:36:11<2:03:36, 4.01s/it] {'loss': 0.1208, 'grad_norm': 0.8289761925866534, 'learning_rate': 7.611171058879208e-08, 'epoch': 0.95} 95%|█████████▍| 32429/34278 [35:36:11<2:03:36, 4.01s/it] 95%|█████████▍| 32430/34278 [35:36:17<2:23:14, 4.65s/it] {'loss': 0.1055, 'grad_norm': 0.7441816679164663, 'learning_rate': 7.602961464494796e-08, 'epoch': 0.95} 95%|█████████▍| 32430/34278 [35:36:17<2:23:14, 4.65s/it] 95%|█████████▍| 32431/34278 [35:36:21<2:20:02, 4.55s/it] {'loss': 0.1038, 'grad_norm': 0.8571158621900724, 'learning_rate': 7.594756266113556e-08, 'epoch': 0.95} 95%|█████████▍| 32431/34278 [35:36:21<2:20:02, 4.55s/it] 95%|█████████▍| 32432/34278 [35:36:24<2:04:41, 4.05s/it] {'loss': 0.1166, 'grad_norm': 0.9005891761133941, 'learning_rate': 7.586555463808765e-08, 'epoch': 0.95} 95%|█████████▍| 32432/34278 [35:36:24<2:04:41, 4.05s/it] 95%|█████████▍| 32433/34278 [35:36:31<2:25:27, 4.73s/it] {'loss': 0.119, 'grad_norm': 0.7337949228878811, 'learning_rate': 7.578359057653751e-08, 'epoch': 0.95} 95%|█████████▍| 32433/34278 [35:36:31<2:25:27, 4.73s/it] 95%|█████████▍| 32434/34278 [35:36:34<2:10:50, 4.26s/it] {'loss': 0.1355, 'grad_norm': 0.8261839710203674, 'learning_rate': 7.57016704772151e-08, 'epoch': 0.95} 95%|█████████▍| 32434/34278 [35:36:34<2:10:50, 4.26s/it] 95%|█████████▍| 32435/34278 [35:36:38<2:06:58, 4.13s/it] {'loss': 0.1031, 'grad_norm': 0.7690019357159564, 'learning_rate': 7.56197943408532e-08, 'epoch': 0.95} 95%|█████████▍| 32435/34278 [35:36:38<2:06:58, 4.13s/it] 95%|█████████▍| 32436/34278 [35:36:43<2:21:09, 4.60s/it] {'loss': 0.1049, 'grad_norm': 0.8164878148238821, 'learning_rate': 7.553796216818177e-08, 'epoch': 0.95} 95%|█████████▍| 32436/34278 [35:36:43<2:21:09, 4.60s/it] 95%|█████████▍| 32437/34278 [35:36:46<2:06:42, 4.13s/it] {'loss': 0.1319, 'grad_norm': 0.790209351407258, 'learning_rate': 7.545617395993188e-08, 'epoch': 0.95} 95%|█████████▍| 32437/34278 [35:36:46<2:06:42, 4.13s/it] 95%|█████████▍| 32438/34278 [35:36:49<1:55:17, 
3.76s/it] {'loss': 0.113, 'grad_norm': 0.7399463395526849, 'learning_rate': 7.537442971683406e-08, 'epoch': 0.95} 95%|█████████▍| 32438/34278 [35:36:49<1:55:17, 3.76s/it] 95%|█████████▍| 32439/34278 [35:36:54<2:04:34, 4.06s/it] {'loss': 0.1114, 'grad_norm': 0.8608838647635368, 'learning_rate': 7.529272943961774e-08, 'epoch': 0.95} 95%|█████████▍| 32439/34278 [35:36:54<2:04:34, 4.06s/it] 95%|█████████▍| 32440/34278 [35:36:57<1:53:53, 3.72s/it] {'loss': 0.1439, 'grad_norm': 0.7811986672017879, 'learning_rate': 7.521107312901177e-08, 'epoch': 0.95} 95%|█████████▍| 32440/34278 [35:36:57<1:53:53, 3.72s/it] 95%|█████████▍| 32441/34278 [35:37:00<1:46:40, 3.48s/it] {'loss': 0.1024, 'grad_norm': 0.8592101554393792, 'learning_rate': 7.512946078574667e-08, 'epoch': 0.95} 95%|█████████▍| 32441/34278 [35:37:00<1:46:40, 3.48s/it] 95%|█████████▍| 32442/34278 [35:37:03<1:44:44, 3.42s/it] {'loss': 0.1281, 'grad_norm': 0.868918578391751, 'learning_rate': 7.50478924105491e-08, 'epoch': 0.95} 95%|█████████▍| 32442/34278 [35:37:03<1:44:44, 3.42s/it] 95%|█████████▍| 32443/34278 [35:37:09<2:07:09, 4.16s/it] {'loss': 0.122, 'grad_norm': 0.9483448154147353, 'learning_rate': 7.496636800414847e-08, 'epoch': 0.95} 95%|█████████▍| 32443/34278 [35:37:09<2:07:09, 4.16s/it] 95%|█████████▍| 32444/34278 [35:37:13<2:01:37, 3.98s/it] {'loss': 0.1023, 'grad_norm': 0.7123911422266416, 'learning_rate': 7.488488756727252e-08, 'epoch': 0.95} 95%|█████████▍| 32444/34278 [35:37:13<2:01:37, 3.98s/it] 95%|█████████▍| 32445/34278 [35:37:16<1:53:53, 3.73s/it] {'loss': 0.0922, 'grad_norm': 0.9491089059453062, 'learning_rate': 7.480345110064846e-08, 'epoch': 0.95} 95%|█████████▍| 32445/34278 [35:37:16<1:53:53, 3.73s/it] 95%|█████████▍| 32446/34278 [35:37:18<1:44:28, 3.42s/it] {'loss': 0.1105, 'grad_norm': 0.8656337906944751, 'learning_rate': 7.472205860500403e-08, 'epoch': 0.95} 95%|█████████▍| 32446/34278 [35:37:18<1:44:28, 3.42s/it] 95%|█████████▍| 32447/34278 [35:37:21<1:39:02, 3.25s/it] {'loss': 0.1273, 
'grad_norm': 0.9254558721488838, 'learning_rate': 7.464071008106477e-08, 'epoch': 0.95} 95%|█████████▍| 32447/34278 [35:37:21<1:39:02, 3.25s/it] 95%|█████████▍| 32448/34278 [35:37:24<1:34:56, 3.11s/it] {'loss': 0.0938, 'grad_norm': 1.1120337824344588, 'learning_rate': 7.455940552955732e-08, 'epoch': 0.95} 95%|█████████▍| 32448/34278 [35:37:24<1:34:56, 3.11s/it] 95%|█████████▍| 32449/34278 [35:37:28<1:38:59, 3.25s/it] {'loss': 0.1188, 'grad_norm': 0.8734464676563862, 'learning_rate': 7.447814495120775e-08, 'epoch': 0.95} 95%|█████████▍| 32449/34278 [35:37:28<1:38:59, 3.25s/it] 95%|█████████▍| 32450/34278 [35:37:31<1:37:09, 3.19s/it] {'loss': 0.0833, 'grad_norm': 0.932221433076052, 'learning_rate': 7.439692834674217e-08, 'epoch': 0.95} 95%|█████████▍| 32450/34278 [35:37:31<1:37:09, 3.19s/it] 95%|█████████▍| 32451/34278 [35:37:36<1:54:30, 3.76s/it] {'loss': 0.1447, 'grad_norm': 0.771003383833281, 'learning_rate': 7.431575571688443e-08, 'epoch': 0.95} 95%|█████████▍| 32451/34278 [35:37:36<1:54:30, 3.76s/it] 95%|█████████▍| 32452/34278 [35:37:39<1:47:32, 3.53s/it] {'loss': 0.138, 'grad_norm': 0.7799731886129633, 'learning_rate': 7.42346270623595e-08, 'epoch': 0.95} 95%|█████████▍| 32452/34278 [35:37:39<1:47:32, 3.53s/it] 95%|█████████▍| 32453/34278 [35:37:42<1:42:04, 3.36s/it] {'loss': 0.1272, 'grad_norm': 0.8575818353583003, 'learning_rate': 7.415354238389239e-08, 'epoch': 0.95} 95%|█████████▍| 32453/34278 [35:37:42<1:42:04, 3.36s/it] 95%|█████████▍| 32454/34278 [35:37:45<1:45:58, 3.49s/it] {'loss': 0.1115, 'grad_norm': 0.8350146255677302, 'learning_rate': 7.407250168220692e-08, 'epoch': 0.95} 95%|█████████▍| 32454/34278 [35:37:45<1:45:58, 3.49s/it] 95%|█████████▍| 32455/34278 [35:37:49<1:48:22, 3.57s/it] {'loss': 0.1331, 'grad_norm': 1.2155046907044247, 'learning_rate': 7.399150495802532e-08, 'epoch': 0.95} 95%|█████████▍| 32455/34278 [35:37:49<1:48:22, 3.57s/it] 95%|█████████▍| 32456/34278 [35:37:53<1:50:09, 3.63s/it] {'loss': 0.1078, 'grad_norm': 0.7861259438910083, 
'learning_rate': 7.391055221207199e-08, 'epoch': 0.95} 95%|█████████▍| 32456/34278 [35:37:53<1:50:09, 3.63s/it] 95%|█████████▍| 32457/34278 [35:37:56<1:45:23, 3.47s/it] {'loss': 0.0894, 'grad_norm': 0.7929531775979544, 'learning_rate': 7.382964344506971e-08, 'epoch': 0.95} 95%|█████████▍| 32457/34278 [35:37:56<1:45:23, 3.47s/it] 95%|█████████▍| 32458/34278 [35:38:00<1:51:20, 3.67s/it] {'loss': 0.1174, 'grad_norm': 0.7207911773731474, 'learning_rate': 7.374877865774011e-08, 'epoch': 0.95} 95%|█████████▍| 32458/34278 [35:38:00<1:51:20, 3.67s/it] 95%|█████████▍| 32459/34278 [35:38:03<1:46:32, 3.51s/it] {'loss': 0.1043, 'grad_norm': 0.8473804329915969, 'learning_rate': 7.366795785080538e-08, 'epoch': 0.95} 95%|█████████▍| 32459/34278 [35:38:03<1:46:32, 3.51s/it] 95%|█████████▍| 32460/34278 [35:38:07<1:43:49, 3.43s/it] {'loss': 0.0955, 'grad_norm': 0.9293679818654809, 'learning_rate': 7.358718102498718e-08, 'epoch': 0.95} 95%|█████████▍| 32460/34278 [35:38:07<1:43:49, 3.43s/it] 95%|█████████▍| 32461/34278 [35:38:12<1:59:22, 3.94s/it] {'loss': 0.1084, 'grad_norm': 0.9925557625259872, 'learning_rate': 7.350644818100605e-08, 'epoch': 0.95} 95%|█████████▍| 32461/34278 [35:38:12<1:59:22, 3.94s/it] 95%|█████████▍| 32462/34278 [35:38:16<1:58:31, 3.92s/it] {'loss': 0.1226, 'grad_norm': 0.8357931404349741, 'learning_rate': 7.342575931958362e-08, 'epoch': 0.95} 95%|█████████▍| 32462/34278 [35:38:16<1:58:31, 3.92s/it] 95%|█████████▍| 32463/34278 [35:38:19<1:55:55, 3.83s/it] {'loss': 0.1135, 'grad_norm': 0.8516494057558605, 'learning_rate': 7.334511444144043e-08, 'epoch': 0.95} 95%|█████████▍| 32463/34278 [35:38:19<1:55:55, 3.83s/it] 95%|█████████▍| 32464/34278 [35:38:22<1:46:39, 3.53s/it] {'loss': 0.1019, 'grad_norm': 0.9690824340711388, 'learning_rate': 7.326451354729591e-08, 'epoch': 0.95} 95%|█████████▍| 32464/34278 [35:38:22<1:46:39, 3.53s/it] 95%|█████████▍| 32465/34278 [35:38:26<1:46:35, 3.53s/it] {'loss': 0.1279, 'grad_norm': 0.8434389326351772, 'learning_rate': 
7.318395663786892e-08, 'epoch': 0.95} 95%|█████████▍| 32465/34278 [35:38:26<1:46:35, 3.53s/it] 95%|█████████▍| 32466/34278 [35:38:29<1:44:00, 3.44s/it] {'loss': 0.1118, 'grad_norm': 0.7854950540974656, 'learning_rate': 7.310344371388057e-08, 'epoch': 0.95} 95%|█████████▍| 32466/34278 [35:38:29<1:44:00, 3.44s/it] 95%|█████████▍| 32467/34278 [35:38:32<1:42:22, 3.39s/it] {'loss': 0.0937, 'grad_norm': 0.6897796201625724, 'learning_rate': 7.302297477604747e-08, 'epoch': 0.95} 95%|█████████▍| 32467/34278 [35:38:32<1:42:22, 3.39s/it] 95%|█████████▍| 32468/34278 [35:38:36<1:44:51, 3.48s/it] {'loss': 0.1019, 'grad_norm': 0.8238705776744283, 'learning_rate': 7.294254982508963e-08, 'epoch': 0.95} 95%|█████████▍| 32468/34278 [35:38:36<1:44:51, 3.48s/it] 95%|█████████▍| 32469/34278 [35:38:39<1:44:23, 3.46s/it] {'loss': 0.1055, 'grad_norm': 0.680516572844136, 'learning_rate': 7.286216886172425e-08, 'epoch': 0.95} 95%|█████████▍| 32469/34278 [35:38:39<1:44:23, 3.46s/it] 95%|█████████▍| 32470/34278 [35:38:42<1:41:48, 3.38s/it] {'loss': 0.1434, 'grad_norm': 0.8786545586463291, 'learning_rate': 7.278183188666965e-08, 'epoch': 0.95} 95%|█████████▍| 32470/34278 [35:38:42<1:41:48, 3.38s/it] 95%|█████████▍| 32471/34278 [35:38:46<1:45:04, 3.49s/it] {'loss': 0.1187, 'grad_norm': 1.0483298869582072, 'learning_rate': 7.270153890064246e-08, 'epoch': 0.95} 95%|█████████▍| 32471/34278 [35:38:46<1:45:04, 3.49s/it] 95%|█████████▍| 32472/34278 [35:38:52<2:06:15, 4.19s/it] {'loss': 0.12, 'grad_norm': 0.7640401178064328, 'learning_rate': 7.262128990435934e-08, 'epoch': 0.95} 95%|█████████▍| 32472/34278 [35:38:52<2:06:15, 4.19s/it] 95%|█████████▍| 32473/34278 [35:38:58<2:20:41, 4.68s/it] {'loss': 0.1075, 'grad_norm': 0.7435605455000042, 'learning_rate': 7.25410848985375e-08, 'epoch': 0.95} 95%|█████████▍| 32473/34278 [35:38:58<2:20:41, 4.68s/it] 95%|█████████▍| 32474/34278 [35:39:01<2:08:04, 4.26s/it] {'loss': 0.117, 'grad_norm': 0.839756218587077, 'learning_rate': 7.246092388389247e-08, 'epoch': 
0.95} 95%|█████████▍| 32474/34278 [35:39:01<2:08:04, 4.26s/it] 95%|█████████▍| 32475/34278 [35:39:04<1:54:34, 3.81s/it] {'loss': 0.1018, 'grad_norm': 0.9486867625872087, 'learning_rate': 7.23808068611398e-08, 'epoch': 0.95} 95%|█████████▍| 32475/34278 [35:39:04<1:54:34, 3.81s/it] 95%|█████████▍| 32476/34278 [35:39:07<1:49:05, 3.63s/it] {'loss': 0.1132, 'grad_norm': 0.686994518444723, 'learning_rate': 7.230073383099556e-08, 'epoch': 0.95} 95%|█████████▍| 32476/34278 [35:39:07<1:49:05, 3.63s/it] 95%|█████████▍| 32477/34278 [35:39:10<1:44:13, 3.47s/it] {'loss': 0.1322, 'grad_norm': 1.194465992391673, 'learning_rate': 7.222070479417365e-08, 'epoch': 0.95} 95%|█████████▍| 32477/34278 [35:39:10<1:44:13, 3.47s/it] 95%|█████████▍| 32478/34278 [35:39:13<1:40:34, 3.35s/it] {'loss': 0.0982, 'grad_norm': 0.9061250797428757, 'learning_rate': 7.214071975138847e-08, 'epoch': 0.95} 95%|█████████▍| 32478/34278 [35:39:13<1:40:34, 3.35s/it] 95%|█████████▍| 32479/34278 [35:39:16<1:36:18, 3.21s/it] {'loss': 0.1363, 'grad_norm': 0.8336272683284411, 'learning_rate': 7.206077870335504e-08, 'epoch': 0.95} 95%|█████████▍| 32479/34278 [35:39:16<1:36:18, 3.21s/it] 95%|█████████▍| 32480/34278 [35:39:19<1:33:06, 3.11s/it] {'loss': 0.1225, 'grad_norm': 0.7735124687475878, 'learning_rate': 7.198088165078664e-08, 'epoch': 0.95} 95%|█████████▍| 32480/34278 [35:39:19<1:33:06, 3.11s/it] 95%|█████████▍| 32481/34278 [35:39:23<1:38:08, 3.28s/it] {'loss': 0.1145, 'grad_norm': 1.0237110014291062, 'learning_rate': 7.190102859439662e-08, 'epoch': 0.95} 95%|█████████▍| 32481/34278 [35:39:23<1:38:08, 3.28s/it] 95%|█████████▍| 32482/34278 [35:39:25<1:32:58, 3.11s/it] {'loss': 0.0958, 'grad_norm': 0.7648542443407397, 'learning_rate': 7.182121953489718e-08, 'epoch': 0.95} 95%|█████████▍| 32482/34278 [35:39:25<1:32:58, 3.11s/it] 95%|█████████▍| 32483/34278 [35:39:29<1:34:19, 3.15s/it] {'loss': 0.1251, 'grad_norm': 1.0863584681681098, 'learning_rate': 7.174145447300218e-08, 'epoch': 0.95} 95%|█████████▍| 
32483/34278 [35:39:29<1:34:19, 3.15s/it] 95%|█████████▍| 32484/34278 [35:39:32<1:34:32, 3.16s/it] {'loss': 0.1177, 'grad_norm': 0.8585402501507573, 'learning_rate': 7.166173340942273e-08, 'epoch': 0.95} 95%|█████████▍| 32484/34278 [35:39:32<1:34:32, 3.16s/it] 95%|█████████▍| 32485/34278 [35:39:35<1:36:46, 3.24s/it] {'loss': 0.0855, 'grad_norm': 0.791338627216461, 'learning_rate': 7.158205634487103e-08, 'epoch': 0.95} 95%|█████████▍| 32485/34278 [35:39:35<1:36:46, 3.24s/it] 95%|█████████▍| 32486/34278 [35:39:42<2:04:20, 4.16s/it] {'loss': 0.0975, 'grad_norm': 0.8626801997444347, 'learning_rate': 7.150242328005763e-08, 'epoch': 0.95} 95%|█████████▍| 32486/34278 [35:39:42<2:04:20, 4.16s/it] 95%|█████████▍| 32487/34278 [35:39:45<1:55:38, 3.87s/it] {'loss': 0.1164, 'grad_norm': 0.7039490546372619, 'learning_rate': 7.142283421569474e-08, 'epoch': 0.95} 95%|█████████▍| 32487/34278 [35:39:45<1:55:38, 3.87s/it] 95%|█████████▍| 32488/34278 [35:39:48<1:50:18, 3.70s/it] {'loss': 0.1068, 'grad_norm': 0.7765559387927986, 'learning_rate': 7.134328915249177e-08, 'epoch': 0.95} 95%|█████████▍| 32488/34278 [35:39:48<1:50:18, 3.70s/it] 95%|█████████▍| 32489/34278 [35:39:51<1:42:51, 3.45s/it] {'loss': 0.129, 'grad_norm': 0.8814504468730606, 'learning_rate': 7.126378809115931e-08, 'epoch': 0.95} 95%|█████████▍| 32489/34278 [35:39:51<1:42:51, 3.45s/it] 95%|█████████▍| 32490/34278 [35:39:54<1:38:48, 3.32s/it] {'loss': 0.1025, 'grad_norm': 0.608087582565708, 'learning_rate': 7.118433103240729e-08, 'epoch': 0.95} 95%|█████████▍| 32490/34278 [35:39:54<1:38:48, 3.32s/it] 95%|█████████▍| 32491/34278 [35:39:57<1:33:07, 3.13s/it] {'loss': 0.1095, 'grad_norm': 0.7717572647366848, 'learning_rate': 7.110491797694519e-08, 'epoch': 0.95} 95%|█████████▍| 32491/34278 [35:39:57<1:33:07, 3.13s/it] 95%|█████████▍| 32492/34278 [35:40:01<1:40:56, 3.39s/it] {'loss': 0.1052, 'grad_norm': 0.9897115468888162, 'learning_rate': 7.10255489254813e-08, 'epoch': 0.95} 95%|█████████▍| 32492/34278 [35:40:01<1:40:56, 
3.39s/it]Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f339fae62a0>
Failed to fetch sample 2898120.
Exception: cannot identify image file <_io.BytesIO object at 0x7f339fae62a0> 95%|█████████▍| 32493/34278 [35:40:04<1:38:37, 3.31s/it] {'loss': 0.1139, 'grad_norm': 0.8760283595412774, 'learning_rate': 7.094622387872508e-08, 'epoch': 0.95} 95%|█████████▍| 32493/34278 [35:40:04<1:38:37, 3.31s/it] 95%|█████████▍| 32494/34278 [35:40:08<1:44:46, 3.52s/it] {'loss': 0.1182, 'grad_norm': 0.6954293766892239, 'learning_rate': 7.086694283738427e-08, 'epoch': 0.95} 95%|█████████▍| 32494/34278 [35:40:08<1:44:46, 3.52s/it] 95%|█████████▍| 32495/34278 [35:40:10<1:36:44, 3.26s/it] {'loss': 0.1093, 'grad_norm': 0.9198009490309836, 'learning_rate': 7.078770580216664e-08, 'epoch': 0.95} 95%|█████████▍| 32495/34278 [35:40:10<1:36:44, 3.26s/it] 95%|█████████▍| 32496/34278 [35:40:14<1:40:14, 3.38s/it] {'loss': 0.1204, 'grad_norm': 1.0179819067131206, 'learning_rate': 7.070851277377944e-08, 'epoch': 0.95} 95%|█████████▍| 32496/34278 [35:40:14<1:40:14, 3.38s/it] 95%|█████████▍| 32497/34278 [35:40:18<1:47:46, 3.63s/it] {'loss': 0.1359, 'grad_norm': 0.8157738612589095, 'learning_rate': 7.062936375293039e-08, 'epoch': 0.95} 95%|█████████▍| 32497/34278 [35:40:18<1:47:46, 3.63s/it] 95%|█████████▍| 32498/34278 [35:40:22<1:47:56, 3.64s/it] {'loss': 0.1249, 'grad_norm': 0.8817424538251957, 'learning_rate': 7.055025874032562e-08, 'epoch': 0.95} 95%|█████████▍| 32498/34278 [35:40:22<1:47:56, 3.64s/it] 95%|█████████▍| 32499/34278 [35:40:25<1:41:49, 3.43s/it] {'loss': 0.1045, 'grad_norm': 0.8186664338244846, 'learning_rate': 7.047119773667066e-08, 'epoch': 0.95} 95%|█████████▍| 32499/34278 [35:40:25<1:41:49, 3.43s/it] 95%|█████████▍| 32500/34278 [35:40:28<1:36:44, 3.26s/it] {'loss': 0.1211, 'grad_norm': 0.8767729803975891, 'learning_rate': 7.039218074267273e-08, 'epoch': 0.95} 95%|█████████▍| 32500/34278 [35:40:28<1:36:44, 3.26s/it] 95%|█████████▍| 32501/34278 [35:40:31<1:38:09, 3.31s/it] {'loss': 0.1096, 'grad_norm': 0.768487745352673, 'learning_rate': 7.031320775903682e-08, 'epoch': 0.95} 
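The traceback and the "Failed to fetch sample 2898120." line show the dataset hitting a corrupt or truncated image in storage and recovering rather than killing the run. A minimal sketch of that skip-and-fallback pattern (the names `pil_loader` and `FallbackDataset` are illustrative, not the repo's actual `dataset.py`; the real loader reads from Ceph via `tcs_loader`):

```python
from io import BytesIO

from PIL import Image, UnidentifiedImageError


def pil_loader(img_bytes: bytes) -> Image.Image:
    # Image.open raises UnidentifiedImageError on bytes it cannot decode,
    # which is exactly the failure mode in the traceback above.
    return Image.open(BytesIO(img_bytes)).convert("RGB")


class FallbackDataset:
    """Hypothetical wrapper: on a decode failure, log the error and fall
    back to a neighboring index instead of crashing the training loop."""

    def __init__(self, samples, max_retries: int = 10):
        self.samples = samples
        self.max_retries = max_retries

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return pil_loader(self.samples[i])
            except (UnidentifiedImageError, OSError) as e:
                # Mirrors the log's "Failed to fetch sample ... Exception: ..."
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self.samples)
        raise RuntimeError("too many consecutive corrupt samples")
```

One design note: falling back to another sample keeps every data-parallel rank producing a batch, which matters here because a single rank raising mid-epoch would stall the whole 35-hour distributed run.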
95%|█████████▍| 32501/34278 [35:40:31<1:38:09, 3.31s/it] 95%|█████████▍| 32502/34278 [35:40:35<1:42:12, 3.45s/it] {'loss': 0.104, 'grad_norm': 0.8039052449936003, 'learning_rate': 7.023427878646739e-08, 'epoch': 0.95} 95%|█████████▍| 32502/34278 [35:40:35<1:42:12, 3.45s/it] 95%|█████████▍| 32503/34278 [35:40:38<1:40:21, 3.39s/it] {'loss': 0.0907, 'grad_norm': 0.8063128935223344, 'learning_rate': 7.015539382566882e-08, 'epoch': 0.95} 95%|█████████▍| 32503/34278 [35:40:38<1:40:21, 3.39s/it] 95%|█████████▍| 32504/34278 [35:40:41<1:34:49, 3.21s/it] {'loss': 0.1003, 'grad_norm': 0.9271385773683889, 'learning_rate': 7.007655287734727e-08, 'epoch': 0.95} 95%|█████████▍| 32504/34278 [35:40:41<1:34:49, 3.21s/it] 95%|█████████▍| 32505/34278 [35:40:44<1:32:59, 3.15s/it] {'loss': 0.119, 'grad_norm': 0.9647932622383132, 'learning_rate': 6.999775594220437e-08, 'epoch': 0.95} 95%|█████████▍| 32505/34278 [35:40:44<1:32:59, 3.15s/it] 95%|█████████▍| 32506/34278 [35:40:48<1:42:39, 3.48s/it] {'loss': 0.109, 'grad_norm': 0.8090997577971532, 'learning_rate': 6.991900302094567e-08, 'epoch': 0.95} 95%|█████████▍| 32506/34278 [35:40:48<1:42:39, 3.48s/it] 95%|█████████▍| 32507/34278 [35:40:52<1:43:34, 3.51s/it] {'loss': 0.1288, 'grad_norm': 0.8575302787877651, 'learning_rate': 6.984029411427285e-08, 'epoch': 0.95} 95%|█████████▍| 32507/34278 [35:40:52<1:43:34, 3.51s/it] 95%|█████████▍| 32508/34278 [35:40:58<2:05:19, 4.25s/it] {'loss': 0.118, 'grad_norm': 1.143824229557474, 'learning_rate': 6.976162922288865e-08, 'epoch': 0.95} 95%|█████████▍| 32508/34278 [35:40:58<2:05:19, 4.25s/it] 95%|█████████▍| 32509/34278 [35:41:03<2:12:30, 4.49s/it] {'loss': 0.1038, 'grad_norm': 0.8649502905621248, 'learning_rate': 6.968300834749531e-08, 'epoch': 0.95} 95%|█████████▍| 32509/34278 [35:41:03<2:12:30, 4.49s/it] 95%|█████████▍| 32510/34278 [35:41:06<2:01:14, 4.11s/it] {'loss': 0.1177, 'grad_norm': 0.9273763814160385, 'learning_rate': 6.960443148879559e-08, 'epoch': 0.95} 95%|█████████▍| 32510/34278 
[35:41:06<2:01:14, 4.11s/it] 95%|█████████▍| 32511/34278 [35:41:09<1:52:46, 3.83s/it] {'loss': 0.1143, 'grad_norm': 0.7928934445398926, 'learning_rate': 6.952589864749115e-08, 'epoch': 0.95} 95%|█████████▍| 32511/34278 [35:41:09<1:52:46, 3.83s/it] 95%|█████████▍| 32512/34278 [35:41:13<1:48:34, 3.69s/it] {'loss': 0.1088, 'grad_norm': 0.9374873688596449, 'learning_rate': 6.944740982428144e-08, 'epoch': 0.95} 95%|█████████▍| 32512/34278 [35:41:13<1:48:34, 3.69s/it] 95%|█████████▍| 32513/34278 [35:41:16<1:44:34, 3.56s/it] {'loss': 0.1247, 'grad_norm': 0.8220838038955269, 'learning_rate': 6.936896501986868e-08, 'epoch': 0.95} 95%|█████████▍| 32513/34278 [35:41:16<1:44:34, 3.56s/it] 95%|█████████▍| 32514/34278 [35:41:20<1:48:09, 3.68s/it] {'loss': 0.1118, 'grad_norm': 0.7494913221019011, 'learning_rate': 6.929056423495285e-08, 'epoch': 0.95} 95%|█████████▍| 32514/34278 [35:41:20<1:48:09, 3.68s/it] 95%|█████████▍| 32515/34278 [35:41:23<1:42:29, 3.49s/it] {'loss': 0.0967, 'grad_norm': 0.574077662448829, 'learning_rate': 6.921220747023394e-08, 'epoch': 0.95} 95%|█████████▍| 32515/34278 [35:41:23<1:42:29, 3.49s/it] 95%|█████████▍| 32516/34278 [35:41:29<2:04:30, 4.24s/it] {'loss': 0.1119, 'grad_norm': 0.9191293324336272, 'learning_rate': 6.913389472641085e-08, 'epoch': 0.95} 95%|█████████▍| 32516/34278 [35:41:29<2:04:30, 4.24s/it] 95%|█████████▍| 32517/34278 [35:41:32<1:54:22, 3.90s/it] {'loss': 0.1148, 'grad_norm': 0.8234192992220227, 'learning_rate': 6.905562600418359e-08, 'epoch': 0.95} 95%|█████████▍| 32517/34278 [35:41:32<1:54:22, 3.90s/it] 95%|█████████▍| 32518/34278 [35:41:35<1:47:57, 3.68s/it] {'loss': 0.1079, 'grad_norm': 0.8801974156793452, 'learning_rate': 6.897740130425046e-08, 'epoch': 0.95} 95%|█████████▍| 32518/34278 [35:41:35<1:47:57, 3.68s/it] 95%|█████████▍| 32519/34278 [35:41:38<1:44:10, 3.55s/it] {'loss': 0.1209, 'grad_norm': 0.7527567196852354, 'learning_rate': 6.88992206273098e-08, 'epoch': 0.95} 95%|█████████▍| 32519/34278 [35:41:38<1:44:10, 3.55s/it] 
95%|█████████▍| 32520/34278 [35:41:41<1:39:33, 3.40s/it] {'loss': 0.1172, 'grad_norm': 0.7828147462334349, 'learning_rate': 6.882108397406051e-08, 'epoch': 0.95}
95%|█████████▍| 32521/34278 [35:41:45<1:42:00, 3.48s/it] {'loss': 0.0962, 'grad_norm': 0.7652612727291339, 'learning_rate': 6.874299134519868e-08, 'epoch': 0.95}
95%|█████████▍| 32522/34278 [35:41:50<1:56:41, 3.99s/it] {'loss': 0.1053, 'grad_norm': 0.9264724715795395, 'learning_rate': 6.86649427414221e-08, 'epoch': 0.95}
95%|█████████▍| 32523/34278 [35:41:53<1:50:08, 3.77s/it] {'loss': 0.0987, 'grad_norm': 0.7727918440017685, 'learning_rate': 6.858693816342854e-08, 'epoch': 0.95}
95%|█████████▍| 32524/34278 [35:41:57<1:46:13, 3.63s/it] {'loss': 0.1073, 'grad_norm': 0.8662458394639372, 'learning_rate': 6.8508977611913e-08, 'epoch': 0.95}
95%|█████████▍| 32525/34278 [35:42:00<1:37:54, 3.35s/it] {'loss': 0.1183, 'grad_norm': 0.9801777045089599, 'learning_rate': 6.843106108757214e-08, 'epoch': 0.95}
95%|█████████▍| 32526/34278 [35:42:03<1:36:10, 3.29s/it] {'loss': 0.1045, 'grad_norm': 0.8325529827915136, 'learning_rate': 6.835318859110152e-08, 'epoch': 0.95}
95%|█████████▍| 32527/34278 [35:42:06<1:34:08, 3.23s/it] {'loss': 0.1235, 'grad_norm': 0.8014491100772806, 'learning_rate': 6.827536012319613e-08, 'epoch': 0.95}
95%|█████████▍| 32528/34278 [35:42:09<1:37:07, 3.33s/it] {'loss': 0.0946, 'grad_norm': 0.9160593276523509, 'learning_rate': 6.819757568455155e-08, 'epoch': 0.95}
95%|█████████▍| 32529/34278 [35:42:15<2:00:55, 4.15s/it] {'loss': 0.1158, 'grad_norm': 0.8850257788412628, 'learning_rate': 6.811983527586108e-08, 'epoch': 0.95}
95%|█████████▍| 32530/34278 [35:42:19<1:52:21, 3.86s/it] {'loss': 0.121, 'grad_norm': 0.7254581291890501, 'learning_rate': 6.804213889781974e-08, 'epoch': 0.95}
95%|█████████▍| 32531/34278 [35:42:21<1:42:34, 3.52s/it] {'loss': 0.1055, 'grad_norm': 0.8242057641046956, 'learning_rate': 6.796448655112142e-08, 'epoch': 0.95}
95%|█████████▍| 32532/34278 [35:42:24<1:37:49, 3.36s/it] {'loss': 0.095, 'grad_norm': 0.7271872053019198, 'learning_rate': 6.788687823645723e-08, 'epoch': 0.95}
95%|█████████▍| 32533/34278 [35:42:28<1:38:09, 3.38s/it] {'loss': 0.1194, 'grad_norm': 0.7964237689345448, 'learning_rate': 6.780931395452273e-08, 'epoch': 0.95}
95%|█████████▍| 32534/34278 [35:42:31<1:41:53, 3.51s/it] {'loss': 0.1105, 'grad_norm': 0.7695145596993271, 'learning_rate': 6.773179370600958e-08, 'epoch': 0.95}
95%|█████████▍| 32535/34278 [35:42:35<1:41:36, 3.50s/it] {'loss': 0.1241, 'grad_norm': 0.8957846772447188, 'learning_rate': 6.765431749160889e-08, 'epoch': 0.95}
95%|█████████▍| 32536/34278 [35:42:39<1:46:24, 3.67s/it] {'loss': 0.1188, 'grad_norm': 0.8944661895065943, 'learning_rate': 6.75768853120129e-08, 'epoch': 0.95}
95%|█████████▍| 32537/34278 [35:42:43<1:49:30, 3.77s/it] {'loss': 0.1327, 'grad_norm': 0.7907215058520678, 'learning_rate': 6.749949716791382e-08, 'epoch': 0.95}
95%|█████████▍| 32538/34278 [35:42:47<1:50:07, 3.80s/it] {'loss': 0.0997, 'grad_norm': 0.8652602143387097, 'learning_rate': 6.742215306000055e-08, 'epoch': 0.95}
95%|█████████▍| 32539/34278 [35:42:52<2:03:45, 4.27s/it] {'loss': 0.1197, 'grad_norm': 0.9278490960321242, 'learning_rate': 6.734485298896531e-08, 'epoch': 0.95}
95%|█████████▍| 32540/34278 [35:42:56<1:55:15, 3.98s/it] {'loss': 0.1298, 'grad_norm': 0.8093230324059786, 'learning_rate': 6.726759695549812e-08, 'epoch': 0.95}
95%|█████████▍| 32541/34278 [35:42:58<1:45:31, 3.64s/it] {'loss': 0.1049, 'grad_norm': 0.8335422554079511, 'learning_rate': 6.71903849602884e-08, 'epoch': 0.95}
95%|█████████▍| 32542/34278 [35:43:03<1:52:56, 3.90s/it] {'loss': 0.1042, 'grad_norm': 0.8902666678634433, 'learning_rate': 6.711321700402451e-08, 'epoch': 0.95}
95%|█████████▍| 32543/34278 [35:43:06<1:48:42, 3.76s/it] {'loss': 0.0962, 'grad_norm': 0.7495172320445483, 'learning_rate': 6.7036093087397e-08, 'epoch': 0.95}
95%|█████████▍| 32544/34278 [35:43:09<1:43:07, 3.57s/it] {'loss': 0.1163, 'grad_norm': 0.6637903822240693, 'learning_rate': 6.695901321109311e-08, 'epoch': 0.95}
95%|█████████▍| 32545/34278 [35:43:14<1:52:12, 3.88s/it] {'loss': 0.1148, 'grad_norm': 0.8944716687778668, 'learning_rate': 6.688197737580226e-08, 'epoch': 0.95}
95%|█████████▍| 32546/34278 [35:43:18<1:52:37, 3.90s/it] {'loss': 0.0942, 'grad_norm': 0.7032948601866934, 'learning_rate': 6.68049855822106e-08, 'epoch': 0.95}
95%|█████████▍| 32547/34278 [35:43:21<1:46:30, 3.69s/it] {'loss': 0.1009, 'grad_norm': 0.6992534994997684, 'learning_rate': 6.672803783100701e-08, 'epoch': 0.95}
95%|█████████▍| 32548/34278 [35:43:25<1:47:49, 3.74s/it] {'loss': 0.1014, 'grad_norm': 0.7429958521862423, 'learning_rate': 6.66511341228776e-08, 'epoch': 0.95}
95%|█████████▍| 32549/34278 [35:43:29<1:45:54, 3.68s/it] {'loss': 0.1069, 'grad_norm': 0.7998772670532653, 'learning_rate': 6.657427445850906e-08, 'epoch': 0.95}
95%|█████████▍| 32550/34278 [35:43:32<1:41:20, 3.52s/it] {'loss': 0.111, 'grad_norm': 0.8214812360205656, 'learning_rate': 6.64974588385875e-08, 'epoch': 0.95}
95%|█████████▍| 32551/34278 [35:43:36<1:44:43, 3.64s/it] {'loss': 0.1119, 'grad_norm': 0.6227033556614411, 'learning_rate': 6.642068726379958e-08, 'epoch': 0.95}
95%|█████████▍| 32552/34278 [35:43:39<1:41:56, 3.54s/it] {'loss': 0.1482, 'grad_norm': 0.8738627469696575, 'learning_rate': 6.634395973482976e-08, 'epoch': 0.95}
95%|█████████▍| 32553/34278 [35:43:43<1:48:41, 3.78s/it] {'loss': 0.1065, 'grad_norm': 0.8448002130865022, 'learning_rate': 6.626727625236307e-08, 'epoch': 0.95}
95%|█████████▍| 32554/34278 [35:43:49<2:07:27, 4.44s/it] {'loss': 0.1054, 'grad_norm': 0.6625229258472054, 'learning_rate': 6.619063681708504e-08, 'epoch': 0.95}
95%|█████████▍| 32555/34278 [35:43:53<1:59:14, 4.15s/it] {'loss': 0.1074, 'grad_norm': 0.7563557777917117, 'learning_rate': 6.611404142967847e-08, 'epoch': 0.95}
95%|█████████▍| 32556/34278 [35:43:56<1:53:37, 3.96s/it] {'loss': 0.1136, 'grad_norm': 0.9927561808730692, 'learning_rate': 6.603749009082782e-08, 'epoch': 0.95}
95%|█████████▍| 32557/34278 [35:44:00<1:49:55, 3.83s/it] {'loss': 0.1051, 'grad_norm': 0.7893414017567375, 'learning_rate': 6.596098280121699e-08, 'epoch': 0.95}
95%|█████████▍| 32558/34278 [35:44:06<2:07:06, 4.43s/it] {'loss': 0.1216, 'grad_norm': 0.8773759445176594, 'learning_rate': 6.588451956152875e-08, 'epoch': 0.95}
95%|█████████▍| 32559/34278 [35:44:09<1:55:18, 4.02s/it] {'loss': 0.1146, 'grad_norm': 0.8107183958581707, 'learning_rate': 6.580810037244533e-08, 'epoch': 0.95}
95%|█████████▍| 32560/34278 [35:44:13<1:55:08, 4.02s/it] {'loss': 0.0983, 'grad_norm': 0.7921634542797843, 'learning_rate': 6.573172523464954e-08, 'epoch': 0.95}
95%|█████████▍| 32561/34278 [35:44:16<1:44:21, 3.65s/it] {'loss': 0.0908, 'grad_norm': 0.7370843643468035, 'learning_rate': 6.56553941488236e-08, 'epoch': 0.95}
95%|█████████▍| 32562/34278 [35:44:21<2:03:49, 4.33s/it] {'loss': 0.0944, 'grad_norm': 0.6442301494912659, 'learning_rate': 6.557910711564697e-08, 'epoch': 0.95}
95%|█████████▍| 32563/34278 [35:44:25<2:00:00, 4.20s/it] {'loss': 0.117, 'grad_norm': 0.7498354563584554, 'learning_rate': 6.550286413580298e-08, 'epoch': 0.95}
95%|█████████▍| 32564/34278 [35:44:30<2:04:46, 4.37s/it] {'loss': 0.0957, 'grad_norm': 0.7483228818550821, 'learning_rate': 6.542666520997166e-08, 'epoch': 0.95}
95%|█████████▌| 32565/34278 [35:44:34<1:56:15, 4.07s/it] {'loss': 0.1179, 'grad_norm': 0.8375602683930373, 'learning_rate': 6.535051033883245e-08, 'epoch': 0.95}
95%|█████████▌| 32566/34278 [35:44:37<1:51:23, 3.90s/it] {'loss': 0.1083, 'grad_norm': 0.7275852938497513, 'learning_rate': 6.527439952306647e-08, 'epoch': 0.95}
95%|█████████▌| 32567/34278 [35:44:40<1:46:28, 3.73s/it] {'loss': 0.1156, 'grad_norm': 0.9878603165436323, 'learning_rate': 6.519833276335263e-08, 'epoch': 0.95}
95%|█████████▌| 32568/34278 [35:44:43<1:40:26, 3.52s/it] {'loss': 0.1064, 'grad_norm': 0.7871433797833636, 'learning_rate': 6.512231006036984e-08, 'epoch': 0.95}
95%|█████████▌| 32569/34278 [35:44:48<1:45:33, 3.71s/it] {'loss': 0.1166, 'grad_norm': 0.6917231232985578, 'learning_rate': 6.504633141479644e-08, 'epoch': 0.95}
95%|█████████▌| 32570/34278 [35:44:52<1:48:35, 3.81s/it] {'loss': 0.107, 'grad_norm': 0.7962815841372336, 'learning_rate': 6.497039682731243e-08, 'epoch': 0.95}
95%|█████████▌| 32571/34278 [35:44:55<1:44:11, 3.66s/it] {'loss': 0.1028, 'grad_norm': 0.8131996014082601, 'learning_rate': 6.489450629859394e-08, 'epoch': 0.95}
95%|█████████▌| 32572/34278 [35:44:58<1:40:44, 3.54s/it] {'loss': 0.0984, 'grad_norm': 0.7680218242347662, 'learning_rate': 6.481865982931934e-08, 'epoch': 0.95}
95%|█████████▌| 32573/34278 [35:45:04<2:02:47, 4.32s/it] {'loss': 0.1083, 'grad_norm': 0.7015601473634733, 'learning_rate': 6.474285742016583e-08, 'epoch': 0.95}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. Gradients will be None
  warnings.warn(
95%|█████████▌| 32574/34278 [35:45:08<2:00:16, 4.24s/it] {'loss': 0.1146, 'grad_norm': 0.8884200993336463, 'learning_rate': 6.466709907180957e-08, 'epoch': 0.95}
95%|█████████▌| 32575/34278 [35:45:14<2:13:48, 4.71s/it] {'loss': 0.1122, 'grad_norm': 0.9475049563156215, 'learning_rate': 6.459138478492721e-08, 'epoch': 0.95}
95%|█████████▌| 32576/34278 [35:45:19<2:11:21, 4.63s/it] {'loss': 0.1029, 'grad_norm': 0.7885576938871349, 'learning_rate': 6.45157145601949e-08, 'epoch': 0.95}
95%|█████████▌| 32577/34278 [35:45:21<1:54:05, 4.02s/it] {'loss': 0.1176, 'grad_norm': 0.7762302852183923, 'learning_rate': 6.444008839828875e-08, 'epoch': 0.95}
95%|█████████▌| 32578/34278 [35:45:24<1:45:45, 3.73s/it] {'loss': 0.1091, 'grad_norm': 0.8350150433801837, 'learning_rate': 6.436450629988267e-08, 'epoch': 0.95}
95%|█████████▌| 32579/34278 [35:45:27<1:40:47, 3.56s/it] {'loss': 0.1181, 'grad_norm': 0.7493533951830512, 'learning_rate': 6.428896826565223e-08, 'epoch': 0.95}
95%|█████████▌| 32580/34278 [35:45:31<1:39:24, 3.51s/it] {'loss': 0.109, 'grad_norm': 0.991159279510208, 'learning_rate': 6.421347429627134e-08, 'epoch': 0.95}
95%|█████████▌| 32581/34278 [35:45:34<1:36:04, 3.40s/it] {'loss': 0.1119, 'grad_norm': 0.8734027167661529, 'learning_rate': 6.413802439241445e-08, 'epoch': 0.95}
95%|█████████▌| 32582/34278 [35:45:37<1:36:27, 3.41s/it] {'loss': 0.1043, 'grad_norm': 0.9273288398776804, 'learning_rate': 6.406261855475492e-08, 'epoch': 0.95}
95%|█████████▌| 32583/34278 [35:45:41<1:35:30, 3.38s/it] {'loss': 0.1099, 'grad_norm': 0.8732606294444948, 'learning_rate': 6.398725678396611e-08, 'epoch': 0.95}
95%|█████████▌| 32584/34278 [35:45:44<1:32:54, 3.29s/it] {'loss': 0.1276, 'grad_norm': 0.6687579571720464, 'learning_rate': 6.39119390807208e-08, 'epoch': 0.95}
95%|█████████▌| 32585/34278 [35:45:50<1:55:09, 4.08s/it] {'loss': 0.112, 'grad_norm': 0.8621781554936837, 'learning_rate': 6.383666544569122e-08, 'epoch': 0.95}
95%|█████████▌| 32586/34278 [35:45:53<1:48:55, 3.86s/it] {'loss': 0.1071, 'grad_norm': 0.7539262707182163, 'learning_rate': 6.376143587954964e-08, 'epoch': 0.95}
95%|█████████▌| 32587/34278 [35:45:56<1:43:19, 3.67s/it] {'loss': 0.1211, 'grad_norm': 0.7066981580124291, 'learning_rate': 6.368625038296772e-08, 'epoch': 0.95}
95%|█████████▌| 32588/34278 [35:46:00<1:40:13, 3.56s/it] {'loss': 0.1042, 'grad_norm': 0.6900478942678211, 'learning_rate': 6.36111089566166e-08, 'epoch': 0.95}
95%|█████████▌| 32589/34278 [35:46:02<1:32:17, 3.28s/it] {'loss': 0.1204, 'grad_norm': 0.7289498956348031, 'learning_rate': 6.353601160116685e-08, 'epoch': 0.95}
95%|█████████▌| 32590/34278 [35:46:07<1:41:45, 3.62s/it] {'loss': 0.1201, 'grad_norm': 0.8394279801179323, 'learning_rate': 6.34609583172896e-08, 'epoch': 0.95}
95%|█████████▌| 32591/34278 [35:46:10<1:39:26, 3.54s/it] {'loss': 0.0991, 'grad_norm': 0.8028538442739547, 'learning_rate': 6.338594910565376e-08, 'epoch': 0.95}
95%|█████████▌| 32592/34278 [35:46:14<1:39:32, 3.54s/it] {'loss': 0.1242, 'grad_norm': 1.0028204581743745, 'learning_rate': 6.331098396692991e-08, 'epoch': 0.95}
95%|█████████▌| 32593/34278 [35:46:17<1:38:40, 3.51s/it] {'loss': 0.1065, 'grad_norm': 0.7777868754168333, 'learning_rate': 6.323606290178697e-08, 'epoch': 0.95}
95%|█████████▌| 32594/34278 [35:46:21<1:39:40, 3.55s/it] {'loss': 0.1047, 'grad_norm': 0.6508464250057778, 'learning_rate': 6.316118591089493e-08, 'epoch': 0.95}
95%|█████████▌| 32595/34278 [35:46:24<1:34:51, 3.38s/it] {'loss': 0.1295, 'grad_norm': 0.9154484747314856, 'learning_rate': 6.308635299491994e-08, 'epoch': 0.95}
95%|█████████▌| 32596/34278 [35:46:27<1:32:34, 3.30s/it] {'loss': 0.1293, 'grad_norm': 0.8207156517485412, 'learning_rate': 6.301156415453257e-08, 'epoch': 0.95}
95%|█████████▌| 32597/34278 [35:46:31<1:40:19, 3.58s/it] {'loss': 0.11, 'grad_norm': 0.715485299643818, 'learning_rate': 6.293681939039898e-08, 'epoch': 0.95}
95%|█████████▌| 32598/34278 [35:46:34<1:35:29, 3.41s/it] {'loss': 0.1142, 'grad_norm': 0.8233722592986634, 'learning_rate': 6.286211870318693e-08, 'epoch': 0.95}
95%|█████████▌| 32599/34278 [35:46:38<1:42:09, 3.65s/it] {'loss': 0.1298, 'grad_norm': 0.8366408034701655, 'learning_rate': 6.278746209356313e-08, 'epoch': 0.95}
95%|█████████▌| 32600/34278 [35:46:41<1:34:53, 3.39s/it] {'loss': 0.1018, 'grad_norm': 0.8882187416155616, 'learning_rate': 6.271284956219425e-08, 'epoch': 0.95}
95%|█████████▌| 32601/34278 [35:46:44<1:32:48, 3.32s/it] {'loss': 0.1109, 'grad_norm': 0.7522134535995358, 'learning_rate': 6.263828110974645e-08, 'epoch': 0.95}
95%|█████████▌| 32602/34278 [35:46:47<1:30:30, 3.24s/it] {'loss': 0.1142, 'grad_norm': 0.86579167965794, 'learning_rate': 6.256375673688586e-08, 'epoch': 0.95}
95%|█████████▌| 32603/34278 [35:46:53<1:52:39, 4.04s/it] {'loss': 0.1024, 'grad_norm': 1.0030382824630497, 'learning_rate': 6.248927644427694e-08, 'epoch': 0.95}
95%|█████████▌| 32604/34278 [35:46:57<1:52:03, 4.02s/it] {'loss': 0.1105, 'grad_norm': 0.6475231820740863, 'learning_rate': 6.241484023258526e-08, 'epoch': 0.95}
95%|█████████▌| 32605/34278 [35:47:01<1:51:05, 3.98s/it] {'loss': 0.1011, 'grad_norm': 0.976756384760913, 'learning_rate': 6.23404481024753e-08, 'epoch': 0.95}
95%|█████████▌| 32606/34278 [35:47:04<1:43:58, 3.73s/it] {'loss': 0.1185, 'grad_norm': 0.9550830639701129, 'learning_rate': 6.226610005461043e-08, 'epoch': 0.95}
95%|█████████▌| 32607/34278 [35:47:07<1:38:38, 3.54s/it] {'loss': 0.0972, 'grad_norm': 0.9693759171969529, 'learning_rate': 6.219179608965564e-08, 'epoch': 0.95}
95%|█████████▌| 32608/34278 [35:47:10<1:34:29, 3.39s/it] {'loss': 0.1223, 'grad_norm': 0.8450852425709949, 'learning_rate': 6.211753620827377e-08, 'epoch': 0.95}
95%|█████████▌| 32609/34278 [35:47:15<1:45:02, 3.78s/it] {'loss': 0.1071, 'grad_norm': 0.9628134473348807, 'learning_rate': 6.204332041112759e-08, 'epoch': 0.95}
95%|█████████▌| 32610/34278 [35:47:18<1:39:41, 3.59s/it] {'loss': 0.0981, 'grad_norm': 0.7988154187396046, 'learning_rate': 6.196914869887993e-08, 'epoch': 0.95}
95%|█████████▌| 32611/34278 [35:47:21<1:35:35, 3.44s/it] {'loss': 0.1174, 'grad_norm': 0.8331880469684507, 'learning_rate': 6.189502107219302e-08, 'epoch': 0.95}
95%|█████████▌| 32612/34278 [35:47:24<1:31:57, 3.31s/it] {'loss': 0.1269, 'grad_norm': 0.7796078562984378, 'learning_rate': 6.182093753172858e-08, 'epoch': 0.95}
95%|█████████▌| 32613/34278 [35:47:28<1:32:44, 3.34s/it] {'loss': 0.1081, 'grad_norm': 0.8244154041719207, 'learning_rate': 6.174689807814771e-08, 'epoch': 0.95}
95%|█████████▌| 32614/34278 [35:47:31<1:30:31, 3.26s/it] {'loss': 0.1021, 'grad_norm': 0.8878633346593348, 'learning_rate': 6.167290271211213e-08, 'epoch': 0.95}
95%|█████████▌| 32615/34278 [35:47:37<1:56:19, 4.20s/it] {'loss': 0.1258, 'grad_norm': 0.7331652320317426, 'learning_rate': 6.15989514342813e-08, 'epoch': 0.95}
95%|█████████▌| 32616/34278 [35:47:43<2:11:07, 4.73s/it] {'loss': 0.1125, 'grad_norm': 0.8022422341246551, 'learning_rate': 6.152504424531636e-08, 'epoch': 0.95}
95%|█████████▌| 32617/34278 [35:47:46<1:55:13, 4.16s/it] {'loss': 0.1084, 'grad_norm': 0.9996228397094764, 'learning_rate': 6.145118114587733e-08, 'epoch': 0.95}
95%|█████████▌| 32618/34278 [35:47:49<1:46:01, 3.83s/it] {'loss': 0.1023, 'grad_norm': 1.0537827190381488, 'learning_rate': 6.137736213662316e-08, 'epoch': 0.95}
95%|█████████▌| 32619/34278 [35:47:52<1:38:00, 3.54s/it] {'loss': 0.101, 'grad_norm': 0.7684435751201382, 'learning_rate': 6.130358721821272e-08, 'epoch': 0.95}
95%|█████████▌| 32620/34278 [35:47:55<1:37:38, 3.53s/it] {'loss': 0.1156, 'grad_norm': 0.8037522029007881, 'learning_rate': 6.122985639130497e-08, 'epoch': 0.95}
95%|█████████▌| 32621/34278 [35:47:58<1:34:26, 3.42s/it] {'loss': 0.1245, 'grad_norm': 0.8974018603279343, 'learning_rate': 6.115616965655824e-08, 'epoch': 0.95}
95%|█████████▌| 32622/34278 [35:48:02<1:36:01, 3.48s/it] {'loss': 0.1014, 'grad_norm': 0.7059823710551251, 'learning_rate': 6.108252701462925e-08, 'epoch': 0.95}
95%|█████████▌| 32623/34278 [35:48:05<1:32:29, 3.35s/it] {'loss': 0.1448, 'grad_norm': 1.0761115786263042, 'learning_rate': 6.100892846617745e-08, 'epoch': 0.95}
95%|█████████▌| 32624/34278 [35:48:11<1:53:01, 4.10s/it] {'loss': 0.1133, 'grad_norm': 1.0273670520567701, 'learning_rate': 6.093537401185901e-08, 'epoch': 0.95}
95%|█████████▌| 32625/34278 [35:48:14<1:41:45, 3.69s/it] {'loss': 0.0893, 'grad_norm': 0.9418891085534057, 'learning_rate': 6.086186365233005e-08, 'epoch': 0.95}
95%|█████████▌| 32626/34278 [35:48:17<1:40:26, 3.65s/it] {'loss': 0.1099, 'grad_norm': 0.6973764481163859, 'learning_rate': 6.078839738824782e-08, 'epoch': 0.95}
95%|█████████▌| 32627/34278 [35:48:20<1:34:20, 3.43s/it] {'loss': 0.1131, 'grad_norm': 0.6891166173220694, 'learning_rate': 6.071497522026737e-08, 'epoch': 0.95}
95%|█████████▌| 32628/34278 [35:48:23<1:32:52, 3.38s/it] {'loss': 0.1425, 'grad_norm': 0.9490350493604558, 'learning_rate': 6.064159714904428e-08, 'epoch': 0.95}
95%|█████████▌| 32629/34278 [35:48:29<1:50:54, 4.04s/it] {'loss': 0.1325, 'grad_norm': 0.8951531329085967, 'learning_rate': 6.056826317523357e-08, 'epoch': 0.95}
95%|█████████▌| 32630/34278 [35:48:32<1:43:28, 3.77s/it] {'loss': 0.1008, 'grad_norm': 0.7920240975475742, 'learning_rate': 6.049497329949139e-08, 'epoch': 0.95}
95%|█████████▌| 32631/34278 [35:48:35<1:34:49, 3.45s/it] {'loss': 0.1079, 'grad_norm': 0.8504101902469182, 'learning_rate': 6.042172752247e-08, 'epoch': 0.95}
95%|█████████▌| 32632/34278 [35:48:38<1:36:05, 3.50s/it] {'loss': 0.0959, 'grad_norm': 0.830303144502529, 'learning_rate': 6.034852584482442e-08, 'epoch': 0.95}
95%|█████████▌| 32633/34278 [35:48:42<1:34:25, 3.44s/it] {'loss': 0.1203, 'grad_norm': 0.6676870435088427, 'learning_rate': 6.027536826720859e-08, 'epoch': 0.95}
95%|█████████▌| 32634/34278 [35:48:47<1:51:32, 4.07s/it] {'loss': 0.1242, 'grad_norm': 0.9017591609409177, 'learning_rate': 6.020225479027419e-08, 'epoch': 0.95}
95%|█████████▌| 32635/34278 [35:48:52<1:54:02, 4.16s/it] {'loss': 0.1333, 'grad_norm': 1.08394380267901, 'learning_rate': 6.012918541467572e-08, 'epoch': 0.95}
95%|█████████▌| 32636/34278 [35:48:55<1:50:33, 4.04s/it] {'loss': 0.1073, 'grad_norm': 0.7961625153907718, 'learning_rate': 6.005616014106375e-08, 'epoch': 0.95}
95%|█████████▌| 32637/34278 [35:49:00<1:58:07, 4.32s/it] {'loss': 0.1214, 'grad_norm': 0.6973454160567234, 'learning_rate': 5.998317897009165e-08, 'epoch': 0.95}
95%|█████████▌| 32638/34278 [35:49:04<1:48:09, 3.96s/it] {'loss': 0.1193, 'grad_norm': 0.8946820605255777, 'learning_rate': 5.991024190241057e-08, 'epoch': 0.95}
95%|█████████▌| 32639/34278 [35:49:07<1:44:32, 3.83s/it] {'loss': 0.0908, 'grad_norm': 0.8561616523767539, 'learning_rate': 5.983734893867166e-08, 'epoch': 0.95}
95%|█████████▌| 32640/34278 [35:49:11<1:46:57, 3.92s/it] {'loss': 0.1369, 'grad_norm': 0.8619288451934363, 'learning_rate': 5.976450007952495e-08, 'epoch': 0.95}
95%|█████████▌| 32641/34278 [35:49:14<1:40:20, 3.68s/it] {'loss': 0.1303, 'grad_norm': 1.1506017454184636, 'learning_rate': 5.969169532562158e-08, 'epoch': 0.95}
95%|█████████▌| 32642/34278 [35:49:20<1:53:34, 4.17s/it] {'loss': 0.1144, 'grad_norm': 0.9371048522861081, 'learning_rate': 5.96189346776116e-08, 'epoch': 0.95}
95%|█████████▌| 32643/34278 [35:49:23<1:48:01, 3.96s/it] {'loss': 0.1135, 'grad_norm': 0.7567991425407807, 'learning_rate': 5.954621813614447e-08, 'epoch': 0.95}
95%|█████████▌| 32644/34278 [35:49:26<1:41:40, 3.73s/it] {'loss': 0.1092, 'grad_norm': 0.6268691765113534, 'learning_rate': 5.9473545701869696e-08, 'epoch': 0.95}
95%|█████████▌| 32645/34278 [35:49:29<1:35:39, 3.51s/it] {'loss': 0.1186, 'grad_norm': 0.7497910810730019, 'learning_rate': 5.940091737543507e-08, 'epoch': 0.95}
95%|█████████▌| 32646/34278 [35:49:32<1:33:02, 3.42s/it] {'loss': 0.1029, 'grad_norm': 0.8616186187157856, 'learning_rate': 5.9328333157489535e-08, 'epoch': 0.95}
95%|█████████▌| 32647/34278 [35:49:36<1:30:21, 3.32s/it] {'loss': 0.1162, 'grad_norm': 0.9867630838410505, 'learning_rate': 5.925579304868201e-08, 'epoch': 0.95}
95%|█████████▌| 32648/34278 [35:49:40<1:39:22, 3.66s/it] {'loss': 0.1112, 'grad_norm': 0.7842768877345314, 'learning_rate': 5.9183297049658637e-08, 'epoch': 0.95}
95%|█████████▌| 32649/34278 [35:49:43<1:33:40, 3.45s/it] {'loss': 0.0876, 'grad_norm': 0.6791383638777265, 'learning_rate': 5.9110845161067245e-08, 'epoch': 0.95}
95%|█████████▌| 32650/34278 [35:49:46<1:31:37, 3.38s/it] {'loss': 0.091, 'grad_norm': 0.7673675376467145, 'learning_rate': 5.9038437383555636e-08, 'epoch': 0.95}
95%|█████████▌| 32651/34278 [35:49:49<1:30:18, 3.33s/it] {'loss': 0.1073, 'grad_norm': 0.8811284227523455, 'learning_rate': 5.896607371776886e-08, 'epoch': 0.95}
95%|█████████▌| 32652/34278 [35:49:52<1:27:58, 3.25s/it] {'loss': 0.1318, 'grad_norm': 0.8780540531852291, 'learning_rate': 5.88937541643525e-08, 'epoch': 0.95}
95%|█████████▌| 32653/34278 [35:49:55<1:24:55, 3.14s/it] {'loss': 0.1066, 'grad_norm': 0.8550725392908367, 'learning_rate': 5.882147872395438e-08, 'epoch': 0.95}
95%|█████████▌| 32654/34278 [35:50:00<1:36:35, 3.57s/it] {'loss': 0.1124, 'grad_norm': 0.9066859606474291, 'learning_rate': 5.874924739721843e-08, 'epoch': 0.95}
95%|█████████▌| 32655/34278 [35:50:03<1:32:25, 3.42s/it] {'loss': 0.1127, 'grad_norm': 0.8223402910741842, 'learning_rate': 5.8677060184789134e-08, 'epoch': 0.95}
95%|█████████▌| 32656/34278 [35:50:06<1:28:43, 3.28s/it] {'loss': 0.104, 'grad_norm': 0.7837047894990621, 'learning_rate': 5.860491708731153e-08, 'epoch': 0.95}
95%|█████████▌| 32657/34278 [35:50:09<1:26:03, 3.19s/it] {'loss': 0.0879, 'grad_norm': 0.8590150222609634, 'learning_rate': 5.85328181054301e-08, 'epoch': 0.95}
95%|█████████▌| 32658/34278 [35:50:15<1:49:20, 4.05s/it] {'loss': 0.1379, 'grad_norm': 0.9602282152252887, 'learning_rate': 5.8460763239787666e-08, 'epoch': 0.95}
95%|█████████▌| 32659/34278 [35:50:18<1:39:29, 3.69s/it] {'loss': 0.1038, 'grad_norm': 1.0993961155785237, 'learning_rate': 5.83887524910276e-08, 'epoch': 0.95}
95%|█████████▌| 32660/34278 [35:50:23<1:48:41, 4.03s/it] {'loss': 0.1116, 'grad_norm': 0.811768160733312, 'learning_rate': 5.8316785859793836e-08, 'epoch': 0.95}
95%|█████████▌| 32661/34278 [35:50:29<2:04:20, 4.61s/it] {'loss': 0.0862, 'grad_norm': 0.8474323462158622, 'learning_rate': 5.824486334672752e-08, 'epoch': 0.95}
95%|█████████▌| 32662/34278 [35:50:32<1:53:29, 4.21s/it] {'loss': 0.1136, 'grad_norm': 1.2586626991754393, 'learning_rate': 5.817298495247148e-08, 'epoch': 0.95}
95%|█████████▌| 32663/34278 [35:50:35<1:43:19, 3.84s/it] {'loss': 0.1196, 'grad_norm': 0.9847370764031189, 'learning_rate': 5.8101150677667975e-08, 'epoch': 0.95}
95%|█████████▌| 32664/34278 [35:50:38<1:39:02, 3.68s/it] {'loss': 0.1057, 'grad_norm': 0.7908098364741442, 'learning_rate': 5.802936052295649e-08, 'epoch': 0.95}
95%|█████████▌| 32665/34278 [35:50:41<1:34:41, 3.52s/it] {'loss': 0.1082, 'grad_norm': 0.663226266575957, 'learning_rate': 5.795761448897985e-08, 'epoch': 0.95}
95%|█████████▌| 32666/34278 [35:50:44<1:29:47, 3.34s/it] {'loss': 0.0796, 'grad_norm': 0.8961722386481162, 'learning_rate': 5.78859125763781e-08, 'epoch': 0.95}
95%|█████████▌| 32667/34278 [35:50:47<1:26:35, 3.23s/it] {'loss': 0.1054, 'grad_norm': 0.7789305018609528, 'learning_rate': 5.7814254785790724e-08, 'epoch': 0.95}
95%|█████████▌| 32668/34278 [35:50:50<1:26:18, 3.22s/it] {'loss': 0.1085, 'grad_norm': 0.8662469204807426, 'learning_rate': 5.774264111785832e-08, 'epoch': 0.95}
95%|█████████▌| 32669/34278 [35:50:54<1:26:11, 3.21s/it] {'loss': 0.1043, 'grad_norm': 0.7783665091937831, 'learning_rate': 5.767107157321927e-08, 'epoch': 0.95}
95%|█████████▌| 32670/34278 [35:50:57<1:26:10, 3.22s/it] {'loss': 0.1212, 'grad_norm': 0.8554149891511343, 'learning_rate': 5.759954615251307e-08, 'epoch': 0.95}
95%|█████████▌| 32671/34278 [35:51:00<1:26:22, 3.23s/it] {'loss': 0.1013, 'grad_norm': 1.116922813535786, 'learning_rate': 5.752806485637863e-08, 'epoch': 0.95}
95%|█████████▌| 32672/34278 [35:51:03<1:26:22, 3.23s/it] {'loss': 0.0908, 'grad_norm': 0.6838342260271995, 'learning_rate': 5.745662768545324e-08, 'epoch': 0.95}
95%|█████████▌| 32673/34278 [35:51:06<1:25:41, 3.20s/it] {'loss': 0.0936, 'grad_norm': 0.6748127117389364, 'learning_rate': 5.7385234640375817e-08, 'epoch': 0.95}
95%|█████████▌| 32674/34278 [35:51:10<1:26:36, 3.24s/it] {'loss': 0.1078, 'grad_norm': 0.8109081455760337, 'learning_rate': 5.731388572178309e-08, 'epoch': 0.95}
95%|█████████▌| 32675/34278 [35:51:13<1:22:47, 3.10s/it] {'loss': 0.1237, 'grad_norm': 0.8996950441897913, 'learning_rate': 5.724258093031176e-08, 'epoch': 0.95}
95%|█████████▌| 32676/34278 [35:51:16<1:28:32, 3.32s/it] {'loss': 0.1279, 'grad_norm': 0.9041151303209126, 'learning_rate': 5.717132026659855e-08, 'epoch': 0.95}
95%|█████████▌| 32677/34278 [35:51:19<1:25:39, 3.21s/it] {'loss': 0.1226, 'grad_norm': 0.9389216356559982, 'learning_rate': 5.710010373128016e-08, 'epoch': 0.95}
95%|█████████▌| 32678/34278 [35:51:23<1:25:35, 3.21s/it] {'loss': 0.1248, 'grad_norm': 2.2230861250735847, 'learning_rate': 5.702893132499221e-08, 'epoch': 0.95}
95%|█████████▌| 32679/34278 [35:51:26<1:25:22, 3.20s/it] {'loss': 0.1027, 'grad_norm': 0.8085138104073984, 'learning_rate': 5.695780304836973e-08, 'epoch': 0.95}
95%|█████████▌| 32680/34278 [35:51:32<1:51:02, 4.17s/it] {'loss': 0.0885, 'grad_norm': 0.8832825819353127, 'learning_rate': 5.6886718902048334e-08, 'epoch': 0.95}
95%|█████████▌| 32681/34278 [35:51:35<1:40:48, 3.79s/it] {'loss': 0.1106, 'grad_norm': 1.217138091986393, 'learning_rate': 5.6815678886661953e-08, 'epoch': 0.95}
95%|█████████▌| 32682/34278 [35:51:39<1:39:15, 3.73s/it] {'loss': 0.1116, 'grad_norm': 1.086105299322099, 'learning_rate': 5.674468300284508e-08, 'epoch': 0.95}
95%|█████████▌| 32683/34278 [35:51:45<1:57:07, 4.41s/it] {'loss': 0.1122, 'grad_norm': 0.8230304666633287, 'learning_rate': 5.667373125123166e-08, 'epoch': 0.95}
95%|█████████▌| 32684/34278 [35:51:50<2:04:45, 4.70s/it] {'loss': 0.1066, 'grad_norm': 0.7375332041179844, 'learning_rate': 5.660282363245562e-08, 'epoch': 0.95}
95%|█████████▌| 32685/34278 [35:51:53<1:50:08, 4.15s/it] {'loss': 0.1065, 'grad_norm': 0.811745805945028, 'learning_rate': 5.653196014714868e-08, 'epoch': 0.95}
95%|█████████▌| 32686/34278 [35:51:56<1:38:57, 3.73s/it] {'loss': 0.113, 'grad_norm': 0.9421793573950739, 'learning_rate': 5.646114079594478e-08, 'epoch': 0.95}
95%|█████████▌| 32687/34278 [35:51:59<1:36:52, 3.65s/it] {'loss': 0.1076, 'grad_norm': 0.9071335278536922, 'learning_rate': 5.6390365579476195e-08, 'epoch': 0.95}
95%|█████████▌| 32688/34278 [35:52:02<1:33:13, 3.52s/it] {'loss': 0.11, 'grad_norm': 0.958546515415864, 'learning_rate': 5.631963449837352e-08, 'epoch': 0.95}
95%|█████████▌| 32689/34278 [35:52:05<1:28:32, 3.34s/it] {'loss': 0.0829, 'grad_norm': 0.6393352449910052, 'learning_rate': 5.624894755326904e-08, 'epoch': 0.95}
95%|█████████▌| 32690/34278 [35:52:10<1:36:33, 3.65s/it] {'loss': 0.1175, 'grad_norm': 0.8669815361704764, 'learning_rate': 5.617830474479391e-08, 'epoch': 0.95}
95%|█████████▌| 32691/34278 [35:52:13<1:34:33, 3.58s/it] {'loss': 0.1254, 'grad_norm': 0.7695261088984546, 'learning_rate': 5.6107706073578735e-08, 'epoch': 0.95}
95%|█████████▌| 32692/34278 [35:52:16<1:31:35, 3.47s/it] {'loss': 0.103, 'grad_norm': 0.846695217818394, 'learning_rate': 5.6037151540253574e-08, 'epoch': 0.95}
95%|█████████▌| 32693/34278 [35:52:19<1:27:06, 3.30s/it] {'loss': 0.079, 'grad_norm': 0.72007459617846, 'learning_rate': 5.596664114544903e-08, 'epoch': 0.95}
95%|█████████▌| 32694/34278 [35:52:23<1:29:37, 3.39s/it] {'loss': 0.1083, 'grad_norm': 0.7513026332398145, 'learning_rate': 5.589617488979349e-08, 'epoch': 0.95}
95%|█████████▌| 32695/34278 [35:52:27<1:33:00, 3.53s/it] {'loss': 0.0907, 'grad_norm': 0.6366230101102938, 'learning_rate': 5.5825752773917e-08, 'epoch': 0.95}
95%|█████████▌| 32696/34278 [35:52:29<1:27:17, 3.31s/it] {'loss': 0.101, 'grad_norm': 0.8375176551428908, 'learning_rate': 5.5755374798447394e-08, 'epoch': 0.95}
95%|█████████▌| 32697/34278 [35:52:34<1:40:40, 3.82s/it] {'loss': 0.1109, 'grad_norm': 0.8338850696366432, 'learning_rate': 5.568504096401417e-08, 'epoch': 0.95}
95%|█████████▌| 32698/34278 [35:52:40<1:58:21, 4.49s/it] {'loss': 0.1138, 'grad_norm': 0.9244959917031603, 'learning_rate': 5.56147512712446e-08, 'epoch': 0.95}
95%|█████████▌| 32699/34278 [35:52:44<1:47:39, 4.09s/it] {'loss': 0.1074, 'grad_norm': 0.7904081543814052, 'learning_rate': 5.5544505720765974e-08, 'epoch': 0.95}
95%|█████████▌| 32700/34278 [35:52:47<1:39:18, 3.78s/it] {'loss': 0.1079, 'grad_norm': 0.71380367927924, 'learning_rate': 5.547430431320555e-08, 'epoch': 0.95}
95%|█████████▌| 32701/34278 [35:52:51<1:40:07, 3.81s/it] {'loss': 0.0951, 'grad_norm': 0.722018876045306, 'learning_rate': 5.540414704919006e-08, 'epoch': 0.95}
95%|█████████▌| 32702/34278 [35:52:53<1:32:56, 3.54s/it] {'loss': 0.1095, 'grad_norm': 0.983592964344662, 'learning_rate': 5.533403392934622e-08, 'epoch': 0.95}
95%|█████████▌| 32703/34278 [35:52:56<1:26:51, 3.31s/it] {'loss': 0.0971, 'grad_norm': 0.7452949862713076, 'learning_rate': 5.5263964954299644e-08, 'epoch': 0.95}
95%|█████████▌| 32704/34278 [35:53:01<1:34:45, 3.61s/it] {'loss': 0.1353, 'grad_norm': 0.9514650723231737, 'learning_rate': 5.519394012467649e-08, 'epoch': 0.95}
95%|█████████▌| 32705/34278 [35:53:03<1:29:11, 3.40s/it] {'loss': 0.109, 'grad_norm': 0.7965203008504623, 'learning_rate': 5.5123959441100713e-08, 'epoch': 0.95}
95%|█████████▌| 32706/34278 [35:53:07<1:28:02, 3.36s/it] {'loss': 0.102, 'grad_norm': 0.7555129535758409, 'learning_rate': 5.505402290419792e-08, 'epoch': 0.95}
95%|█████████▌| 32707/34278 [35:53:10<1:23:46, 3.20s/it] {'loss': 0.1277, 'grad_norm': 0.8251394665377272, 'learning_rate': 5.498413051459261e-08, 'epoch': 0.95}
95%|█████████▌| 32708/34278 [35:53:14<1:35:56, 3.67s/it] {'loss': 0.1214, 'grad_norm': 0.7186693398842615, 'learning_rate': 5.4914282272908737e-08, 'epoch': 0.95}
95%|█████████▌| 32709/34278 [35:53:18<1:33:13, 3.57s/it] {'loss': 0.1309, 'grad_norm': 0.8534837535293472, 'learning_rate': 5.484447817976912e-08, 'epoch': 0.95}
95%|█████████▌| 32710/34278 [35:53:21<1:33:36, 3.58s/it] {'loss': 0.1216, 'grad_norm':
0.9406564450983651, 'learning_rate': 5.477471823579772e-08, 'epoch': 0.95} 95%|█████████▌| 32710/34278 [35:53:21<1:33:36, 3.58s/it] 95%|█████████▌| 32711/34278 [35:53:27<1:52:42, 4.32s/it] {'loss': 0.1008, 'grad_norm': 0.6421733732819564, 'learning_rate': 5.470500244161736e-08, 'epoch': 0.95} 95%|█████████▌| 32711/34278 [35:53:27<1:52:42, 4.32s/it] 95%|█████████▌| 32712/34278 [35:53:30<1:40:48, 3.86s/it] {'loss': 0.1093, 'grad_norm': 0.8149105103919811, 'learning_rate': 5.4635330797849217e-08, 'epoch': 0.95} 95%|█████████▌| 32712/34278 [35:53:30<1:40:48, 3.86s/it] 95%|█████████▌| 32713/34278 [35:53:33<1:36:10, 3.69s/it] {'loss': 0.1234, 'grad_norm': 0.7854076563447636, 'learning_rate': 5.456570330511724e-08, 'epoch': 0.95} 95%|█████████▌| 32713/34278 [35:53:33<1:36:10, 3.69s/it] 95%|█████████▌| 32714/34278 [35:53:37<1:34:36, 3.63s/it] {'loss': 0.0911, 'grad_norm': 0.8292307039778413, 'learning_rate': 5.449611996404203e-08, 'epoch': 0.95} 95%|█████████▌| 32714/34278 [35:53:37<1:34:36, 3.63s/it] 95%|█████████▌| 32715/34278 [35:53:40<1:34:41, 3.63s/it] {'loss': 0.104, 'grad_norm': 0.847145646906693, 'learning_rate': 5.442658077524476e-08, 'epoch': 0.95} 95%|█████████▌| 32715/34278 [35:53:41<1:34:41, 3.63s/it] 95%|█████████▌| 32716/34278 [35:53:43<1:28:06, 3.38s/it] {'loss': 0.085, 'grad_norm': 0.5969312758250869, 'learning_rate': 5.435708573934662e-08, 'epoch': 0.95} 95%|█████████▌| 32716/34278 [35:53:43<1:28:06, 3.38s/it] 95%|█████████▌| 32717/34278 [35:53:48<1:35:12, 3.66s/it] {'loss': 0.128, 'grad_norm': 0.8888413792758529, 'learning_rate': 5.428763485696764e-08, 'epoch': 0.95} 95%|█████████▌| 32717/34278 [35:53:48<1:35:12, 3.66s/it] 95%|█████████▌| 32718/34278 [35:53:51<1:34:53, 3.65s/it] {'loss': 0.1126, 'grad_norm': 0.8815023539978275, 'learning_rate': 5.4218228128727345e-08, 'epoch': 0.95} 95%|█████████▌| 32718/34278 [35:53:51<1:34:53, 3.65s/it] 95%|█████████▌| 32719/34278 [35:53:55<1:35:15, 3.67s/it] {'loss': 0.1305, 'grad_norm': 0.8988664623351925, 
'learning_rate': 5.4148865555246896e-08, 'epoch': 0.95} 95%|█████████▌| 32719/34278 [35:53:55<1:35:15, 3.67s/it] 95%|█████████▌| 32720/34278 [35:53:58<1:33:58, 3.62s/it] {'loss': 0.1212, 'grad_norm': 0.8547614744774444, 'learning_rate': 5.407954713714414e-08, 'epoch': 0.95} 95%|█████████▌| 32720/34278 [35:53:58<1:33:58, 3.62s/it] 95%|█████████▌| 32721/34278 [35:54:02<1:30:41, 3.49s/it] {'loss': 0.1045, 'grad_norm': 0.9884351548375194, 'learning_rate': 5.4010272875039125e-08, 'epoch': 0.95} 95%|█████████▌| 32721/34278 [35:54:02<1:30:41, 3.49s/it] 95%|█████████▌| 32722/34278 [35:54:04<1:25:23, 3.29s/it] {'loss': 0.1197, 'grad_norm': 0.7517468882826768, 'learning_rate': 5.3941042769549146e-08, 'epoch': 0.95} 95%|█████████▌| 32722/34278 [35:54:04<1:25:23, 3.29s/it] 95%|█████████▌| 32723/34278 [35:54:10<1:45:21, 4.07s/it] {'loss': 0.1188, 'grad_norm': 0.7619462510575757, 'learning_rate': 5.387185682129259e-08, 'epoch': 0.95} 95%|█████████▌| 32723/34278 [35:54:10<1:45:21, 4.07s/it] 95%|█████████▌| 32724/34278 [35:54:14<1:41:37, 3.92s/it] {'loss': 0.1104, 'grad_norm': 0.9094217397506837, 'learning_rate': 5.380271503088841e-08, 'epoch': 0.95} 95%|█████████▌| 32724/34278 [35:54:14<1:41:37, 3.92s/it] 95%|█████████▌| 32725/34278 [35:54:20<1:57:47, 4.55s/it] {'loss': 0.1074, 'grad_norm': 0.8684033543593407, 'learning_rate': 5.373361739895222e-08, 'epoch': 0.95} 95%|█████████▌| 32725/34278 [35:54:20<1:57:47, 4.55s/it] 95%|█████████▌| 32726/34278 [35:54:24<1:57:10, 4.53s/it] {'loss': 0.1272, 'grad_norm': 0.7737522514610439, 'learning_rate': 5.366456392610131e-08, 'epoch': 0.95} 95%|█████████▌| 32726/34278 [35:54:24<1:57:10, 4.53s/it] 95%|█████████▌| 32727/34278 [35:54:27<1:43:37, 4.01s/it] {'loss': 0.1081, 'grad_norm': 0.8463556442373639, 'learning_rate': 5.3595554612952404e-08, 'epoch': 0.95} 95%|█████████▌| 32727/34278 [35:54:27<1:43:37, 4.01s/it] 95%|█████████▌| 32728/34278 [35:54:32<1:53:00, 4.37s/it] {'loss': 0.106, 'grad_norm': 0.7869651876905523, 'learning_rate': 
5.352658946012224e-08, 'epoch': 0.95} 95%|█████████▌| 32728/34278 [35:54:32<1:53:00, 4.37s/it] 95%|█████████▌| 32729/34278 [35:54:38<2:04:52, 4.84s/it] {'loss': 0.1097, 'grad_norm': 1.1556828545657285, 'learning_rate': 5.345766846822475e-08, 'epoch': 0.95} 95%|█████████▌| 32729/34278 [35:54:38<2:04:52, 4.84s/it] 95%|█████████▌| 32730/34278 [35:54:42<1:54:12, 4.43s/it] {'loss': 0.1278, 'grad_norm': 0.8149670517569663, 'learning_rate': 5.3388791637877244e-08, 'epoch': 0.95} 95%|█████████▌| 32730/34278 [35:54:42<1:54:12, 4.43s/it] 95%|█████████▌| 32731/34278 [35:54:46<1:50:39, 4.29s/it] {'loss': 0.1176, 'grad_norm': 0.8282747326715098, 'learning_rate': 5.3319958969693665e-08, 'epoch': 0.95} 95%|█████████▌| 32731/34278 [35:54:46<1:50:39, 4.29s/it] 95%|█████████▌| 32732/34278 [35:54:49<1:40:36, 3.90s/it] {'loss': 0.099, 'grad_norm': 0.853088271351987, 'learning_rate': 5.3251170464288516e-08, 'epoch': 0.95} 95%|█████████▌| 32732/34278 [35:54:49<1:40:36, 3.90s/it] 95%|█████████▌| 32733/34278 [35:54:52<1:31:25, 3.55s/it] {'loss': 0.1136, 'grad_norm': 0.6244398348582837, 'learning_rate': 5.3182426122275753e-08, 'epoch': 0.95} 95%|█████████▌| 32733/34278 [35:54:52<1:31:25, 3.55s/it] 95%|█████████▌| 32734/34278 [35:54:55<1:31:42, 3.56s/it] {'loss': 0.1149, 'grad_norm': 0.7176252117683201, 'learning_rate': 5.311372594426989e-08, 'epoch': 0.95} 95%|█████████▌| 32734/34278 [35:54:55<1:31:42, 3.56s/it] 95%|█████████▌| 32735/34278 [35:55:00<1:38:44, 3.84s/it] {'loss': 0.1046, 'grad_norm': 0.7292212616731406, 'learning_rate': 5.304506993088321e-08, 'epoch': 0.95} 95%|█████████▌| 32735/34278 [35:55:00<1:38:44, 3.84s/it] 96%|█████████▌| 32736/34278 [35:55:03<1:36:25, 3.75s/it] {'loss': 0.098, 'grad_norm': 0.6841145006758904, 'learning_rate': 5.2976458082729666e-08, 'epoch': 0.96} 96%|█████████▌| 32736/34278 [35:55:03<1:36:25, 3.75s/it] 96%|█████████▌| 32737/34278 [35:55:06<1:30:34, 3.53s/it] {'loss': 0.1047, 'grad_norm': 0.768517442130169, 'learning_rate': 5.290789040042099e-08, 
'epoch': 0.96} 96%|█████████▌| 32737/34278 [35:55:06<1:30:34, 3.53s/it] 96%|█████████▌| 32738/34278 [35:55:09<1:26:58, 3.39s/it] {'loss': 0.1126, 'grad_norm': 0.8851022480303948, 'learning_rate': 5.283936688457003e-08, 'epoch': 0.96} 96%|█████████▌| 32738/34278 [35:55:09<1:26:58, 3.39s/it] 96%|█████████▌| 32739/34278 [35:55:12<1:23:41, 3.26s/it] {'loss': 0.1247, 'grad_norm': 0.9288174884552797, 'learning_rate': 5.277088753578796e-08, 'epoch': 0.96} 96%|█████████▌| 32739/34278 [35:55:12<1:23:41, 3.26s/it] 96%|█████████▌| 32740/34278 [35:55:16<1:29:08, 3.48s/it] {'loss': 0.1158, 'grad_norm': 0.9339753629522923, 'learning_rate': 5.2702452354687075e-08, 'epoch': 0.96} 96%|█████████▌| 32740/34278 [35:55:16<1:29:08, 3.48s/it] 96%|█████████▌| 32741/34278 [35:55:19<1:24:19, 3.29s/it] {'loss': 0.1088, 'grad_norm': 0.7427770459169746, 'learning_rate': 5.2634061341876874e-08, 'epoch': 0.96} 96%|█████████▌| 32741/34278 [35:55:19<1:24:19, 3.29s/it] 96%|█████████▌| 32742/34278 [35:55:22<1:20:34, 3.15s/it] {'loss': 0.1001, 'grad_norm': 0.7796077651791834, 'learning_rate': 5.256571449796854e-08, 'epoch': 0.96} 96%|█████████▌| 32742/34278 [35:55:22<1:20:34, 3.15s/it] 96%|█████████▌| 32743/34278 [35:55:25<1:21:40, 3.19s/it] {'loss': 0.1188, 'grad_norm': 0.841522036360295, 'learning_rate': 5.2497411823573264e-08, 'epoch': 0.96} 96%|█████████▌| 32743/34278 [35:55:25<1:21:40, 3.19s/it] 96%|█████████▌| 32744/34278 [35:55:31<1:43:21, 4.04s/it] {'loss': 0.1126, 'grad_norm': 0.6581909732765954, 'learning_rate': 5.2429153319299987e-08, 'epoch': 0.96} 96%|█████████▌| 32744/34278 [35:55:31<1:43:21, 4.04s/it] 96%|█████████▌| 32745/34278 [35:55:35<1:39:02, 3.88s/it] {'loss': 0.1175, 'grad_norm': 0.8948577551819488, 'learning_rate': 5.236093898575767e-08, 'epoch': 0.96} 96%|█████████▌| 32745/34278 [35:55:35<1:39:02, 3.88s/it] 96%|█████████▌| 32746/34278 [35:55:38<1:37:06, 3.80s/it] {'loss': 0.0963, 'grad_norm': 0.6663435400942943, 'learning_rate': 5.229276882355583e-08, 'epoch': 0.96} 
96%|█████████▌| 32746/34278 [35:55:38<1:37:06, 3.80s/it] 96%|█████████▌| 32747/34278 [35:55:43<1:40:33, 3.94s/it] {'loss': 0.1044, 'grad_norm': 0.7926324562650706, 'learning_rate': 5.222464283330342e-08, 'epoch': 0.96} 96%|█████████▌| 32747/34278 [35:55:43<1:40:33, 3.94s/it] 96%|█████████▌| 32748/34278 [35:55:45<1:31:09, 3.57s/it] {'loss': 0.1203, 'grad_norm': 0.9649432308644017, 'learning_rate': 5.215656101560829e-08, 'epoch': 0.96} 96%|█████████▌| 32748/34278 [35:55:45<1:31:09, 3.57s/it] 96%|█████████▌| 32749/34278 [35:55:49<1:28:32, 3.47s/it] {'loss': 0.1174, 'grad_norm': 1.0764631275838574, 'learning_rate': 5.2088523371077724e-08, 'epoch': 0.96} 96%|█████████▌| 32749/34278 [35:55:49<1:28:32, 3.47s/it] 96%|█████████▌| 32750/34278 [35:55:51<1:23:36, 3.28s/it] {'loss': 0.099, 'grad_norm': 0.730160856196989, 'learning_rate': 5.202052990032014e-08, 'epoch': 0.96} 96%|█████████▌| 32750/34278 [35:55:51<1:23:36, 3.28s/it] 96%|█████████▌| 32751/34278 [35:55:55<1:26:10, 3.39s/it] {'loss': 0.118, 'grad_norm': 0.8186530304108599, 'learning_rate': 5.1952580603941705e-08, 'epoch': 0.96} 96%|█████████▌| 32751/34278 [35:55:55<1:26:10, 3.39s/it] 96%|█████████▌| 32752/34278 [35:55:59<1:30:05, 3.54s/it] {'loss': 0.0974, 'grad_norm': 0.7699885973715855, 'learning_rate': 5.188467548254972e-08, 'epoch': 0.96} 96%|█████████▌| 32752/34278 [35:55:59<1:30:05, 3.54s/it] 96%|█████████▌| 32753/34278 [35:56:03<1:32:00, 3.62s/it] {'loss': 0.095, 'grad_norm': 0.9613407817142756, 'learning_rate': 5.1816814536749804e-08, 'epoch': 0.96} 96%|█████████▌| 32753/34278 [35:56:03<1:32:00, 3.62s/it] 96%|█████████▌| 32754/34278 [35:56:07<1:38:20, 3.87s/it] {'loss': 0.1038, 'grad_norm': 0.7684494701360869, 'learning_rate': 5.174899776714814e-08, 'epoch': 0.96} 96%|█████████▌| 32754/34278 [35:56:07<1:38:20, 3.87s/it] 96%|█████████▌| 32755/34278 [35:56:10<1:34:23, 3.72s/it] {'loss': 0.1099, 'grad_norm': 0.715427278530436, 'learning_rate': 5.1681225174350926e-08, 'epoch': 0.96} 96%|█████████▌| 32755/34278 
[35:56:11<1:34:23, 3.72s/it] 96%|█████████▌| 32756/34278 [35:56:13<1:28:41, 3.50s/it] {'loss': 0.1198, 'grad_norm': 0.8722937914171108, 'learning_rate': 5.1613496758961545e-08, 'epoch': 0.96} 96%|█████████▌| 32756/34278 [35:56:13<1:28:41, 3.50s/it] 96%|█████████▌| 32757/34278 [35:56:17<1:29:51, 3.54s/it] {'loss': 0.1371, 'grad_norm': 0.8399977858132577, 'learning_rate': 5.154581252158619e-08, 'epoch': 0.96} 96%|█████████▌| 32757/34278 [35:56:17<1:29:51, 3.54s/it] 96%|█████████▌| 32758/34278 [35:56:21<1:30:58, 3.59s/it] {'loss': 0.1103, 'grad_norm': 0.9756169504636132, 'learning_rate': 5.147817246282882e-08, 'epoch': 0.96} 96%|█████████▌| 32758/34278 [35:56:21<1:30:58, 3.59s/it] 96%|█████████▌| 32759/34278 [35:56:24<1:25:03, 3.36s/it] {'loss': 0.126, 'grad_norm': 0.9175257322200491, 'learning_rate': 5.1410576583291736e-08, 'epoch': 0.96} 96%|█████████▌| 32759/34278 [35:56:24<1:25:03, 3.36s/it] 96%|█████████▌| 32760/34278 [35:56:27<1:22:40, 3.27s/it] {'loss': 0.1195, 'grad_norm': 1.1429093656478735, 'learning_rate': 5.134302488358056e-08, 'epoch': 0.96} 96%|█████████▌| 32760/34278 [35:56:27<1:22:40, 3.27s/it] 96%|█████████▌| 32761/34278 [35:56:32<1:38:17, 3.89s/it] {'loss': 0.1156, 'grad_norm': 0.6904741345709744, 'learning_rate': 5.127551736429759e-08, 'epoch': 0.96} 96%|█████████▌| 32761/34278 [35:56:32<1:38:17, 3.89s/it] 96%|█████████▌| 32762/34278 [35:56:35<1:32:21, 3.66s/it] {'loss': 0.1188, 'grad_norm': 0.6425373426976441, 'learning_rate': 5.120805402604512e-08, 'epoch': 0.96} 96%|█████████▌| 32762/34278 [35:56:35<1:32:21, 3.66s/it] 96%|█████████▌| 32763/34278 [35:56:40<1:41:38, 4.03s/it] {'loss': 0.1034, 'grad_norm': 0.8153282788925978, 'learning_rate': 5.114063486942655e-08, 'epoch': 0.96} 96%|█████████▌| 32763/34278 [35:56:40<1:41:38, 4.03s/it] 96%|█████████▌| 32764/34278 [35:56:43<1:31:06, 3.61s/it] {'loss': 0.0952, 'grad_norm': 0.9550712149912414, 'learning_rate': 5.1073259895042527e-08, 'epoch': 0.96} 96%|█████████▌| 32764/34278 [35:56:43<1:31:06, 
3.61s/it] 96%|█████████▌| 32765/34278 [35:56:46<1:27:11, 3.46s/it] {'loss': 0.0979, 'grad_norm': 0.7575655806906811, 'learning_rate': 5.100592910349478e-08, 'epoch': 0.96} 96%|█████████▌| 32765/34278 [35:56:46<1:27:11, 3.46s/it] 96%|█████████▌| 32766/34278 [35:56:49<1:26:58, 3.45s/it] {'loss': 0.102, 'grad_norm': 0.7274889051662751, 'learning_rate': 5.0938642495384495e-08, 'epoch': 0.96} 96%|█████████▌| 32766/34278 [35:56:49<1:26:58, 3.45s/it] 96%|█████████▌| 32767/34278 [35:56:52<1:23:38, 3.32s/it] {'loss': 0.0989, 'grad_norm': 0.7797081707174183, 'learning_rate': 5.087140007131286e-08, 'epoch': 0.96} 96%|█████████▌| 32767/34278 [35:56:52<1:23:38, 3.32s/it] 96%|█████████▌| 32768/34278 [35:56:56<1:24:19, 3.35s/it] {'loss': 0.0928, 'grad_norm': 0.7614799380669769, 'learning_rate': 5.0804201831880505e-08, 'epoch': 0.96} 96%|█████████▌| 32768/34278 [35:56:56<1:24:19, 3.35s/it] 96%|█████████▌| 32769/34278 [35:56:59<1:22:48, 3.29s/it] {'loss': 0.1075, 'grad_norm': 0.8549511031994224, 'learning_rate': 5.073704777768584e-08, 'epoch': 0.96} 96%|█████████▌| 32769/34278 [35:56:59<1:22:48, 3.29s/it] 96%|█████████▌| 32770/34278 [35:57:02<1:23:47, 3.33s/it] {'loss': 0.1186, 'grad_norm': 0.8778242344808694, 'learning_rate': 5.0669937909330056e-08, 'epoch': 0.96} 96%|█████████▌| 32770/34278 [35:57:02<1:23:47, 3.33s/it] 96%|█████████▌| 32771/34278 [35:57:06<1:26:08, 3.43s/it] {'loss': 0.1228, 'grad_norm': 0.8661055700702226, 'learning_rate': 5.0602872227411e-08, 'epoch': 0.96} 96%|█████████▌| 32771/34278 [35:57:06<1:26:08, 3.43s/it] 96%|█████████▌| 32772/34278 [35:57:10<1:32:44, 3.70s/it] {'loss': 0.1002, 'grad_norm': 0.7538873029924936, 'learning_rate': 5.053585073252765e-08, 'epoch': 0.96} 96%|█████████▌| 32772/34278 [35:57:10<1:32:44, 3.70s/it] 96%|█████████▌| 32773/34278 [35:57:13<1:27:02, 3.47s/it] {'loss': 0.1145, 'grad_norm': 0.7249928753886032, 'learning_rate': 5.046887342527951e-08, 'epoch': 0.96} 96%|█████████▌| 32773/34278 [35:57:13<1:27:02, 3.47s/it] 96%|█████████▌| 
32774/34278 [35:57:16<1:24:10, 3.36s/it] {'loss': 0.0956, 'grad_norm': 0.7851065270411969, 'learning_rate': 5.0401940306263884e-08, 'epoch': 0.96} 96%|█████████▌| 32774/34278 [35:57:16<1:24:10, 3.36s/it] 96%|█████████▌| 32775/34278 [35:57:19<1:21:23, 3.25s/it] {'loss': 0.1069, 'grad_norm': 0.7216456921036118, 'learning_rate': 5.0335051376077527e-08, 'epoch': 0.96} 96%|█████████▌| 32775/34278 [35:57:19<1:21:23, 3.25s/it] 96%|█████████▌| 32776/34278 [35:57:25<1:38:10, 3.92s/it] {'loss': 0.1144, 'grad_norm': 0.7984319027499761, 'learning_rate': 5.026820663531828e-08, 'epoch': 0.96} 96%|█████████▌| 32776/34278 [35:57:25<1:38:10, 3.92s/it] 96%|█████████▌| 32777/34278 [35:57:28<1:30:45, 3.63s/it] {'loss': 0.117, 'grad_norm': 0.7090533146522467, 'learning_rate': 5.02014060845829e-08, 'epoch': 0.96} 96%|█████████▌| 32777/34278 [35:57:28<1:30:45, 3.63s/it] 96%|█████████▌| 32778/34278 [35:57:31<1:25:45, 3.43s/it] {'loss': 0.1167, 'grad_norm': 0.7948516064912404, 'learning_rate': 5.013464972446813e-08, 'epoch': 0.96} 96%|█████████▌| 32778/34278 [35:57:31<1:25:45, 3.43s/it] 96%|█████████▌| 32779/34278 [35:57:34<1:23:33, 3.34s/it] {'loss': 0.1289, 'grad_norm': 0.9578149005999798, 'learning_rate': 5.0067937555569603e-08, 'epoch': 0.96} 96%|█████████▌| 32779/34278 [35:57:34<1:23:33, 3.34s/it] 96%|█████████▌| 32780/34278 [35:57:38<1:29:39, 3.59s/it] {'loss': 0.1186, 'grad_norm': 0.7431435145855708, 'learning_rate': 5.000126957848239e-08, 'epoch': 0.96} 96%|█████████▌| 32780/34278 [35:57:38<1:29:39, 3.59s/it] 96%|█████████▌| 32781/34278 [35:57:41<1:28:10, 3.53s/it] {'loss': 0.1311, 'grad_norm': 0.9385269079835733, 'learning_rate': 4.9934645793802696e-08, 'epoch': 0.96} 96%|█████████▌| 32781/34278 [35:57:41<1:28:10, 3.53s/it] 96%|█████████▌| 32782/34278 [35:57:47<1:46:26, 4.27s/it] {'loss': 0.1035, 'grad_norm': 0.7426492576133975, 'learning_rate': 4.9868066202124476e-08, 'epoch': 0.96} 96%|█████████▌| 32782/34278 [35:57:47<1:46:26, 4.27s/it] 96%|█████████▌| 32783/34278 
[35:57:51<1:38:15, 3.94s/it] {'loss': 0.1105, 'grad_norm': 0.722847872566871, 'learning_rate': 4.980153080404227e-08, 'epoch': 0.96} 96%|█████████▌| 32783/34278 [35:57:51<1:38:15, 3.94s/it] 96%|█████████▌| 32784/34278 [35:57:54<1:30:59, 3.65s/it] {'loss': 0.1222, 'grad_norm': 0.8650234530628166, 'learning_rate': 4.973503960015058e-08, 'epoch': 0.96} 96%|█████████▌| 32784/34278 [35:57:54<1:30:59, 3.65s/it] 96%|█████████▌| 32785/34278 [35:57:57<1:26:01, 3.46s/it] {'loss': 0.1185, 'grad_norm': 0.8360374043334173, 'learning_rate': 4.9668592591042844e-08, 'epoch': 0.96} 96%|█████████▌| 32785/34278 [35:57:57<1:26:01, 3.46s/it] 96%|█████████▌| 32786/34278 [35:58:02<1:44:13, 4.19s/it] {'loss': 0.1155, 'grad_norm': 0.839297362196858, 'learning_rate': 4.9602189777311906e-08, 'epoch': 0.96} 96%|█████████▌| 32786/34278 [35:58:02<1:44:13, 4.19s/it] 96%|█████████▌| 32787/34278 [35:58:08<1:52:28, 4.53s/it] {'loss': 0.1112, 'grad_norm': 0.6934641601053315, 'learning_rate': 4.9535831159551186e-08, 'epoch': 0.96} 96%|█████████▌| 32787/34278 [35:58:08<1:52:28, 4.53s/it] 96%|█████████▌| 32788/34278 [35:58:10<1:39:15, 4.00s/it] {'loss': 0.1085, 'grad_norm': 1.53006393883715, 'learning_rate': 4.9469516738352986e-08, 'epoch': 0.96} 96%|█████████▌| 32788/34278 [35:58:10<1:39:15, 4.00s/it] 96%|█████████▌| 32789/34278 [35:58:14<1:32:59, 3.75s/it] {'loss': 0.1067, 'grad_norm': 0.8742586377683601, 'learning_rate': 4.94032465143085e-08, 'epoch': 0.96} 96%|█████████▌| 32789/34278 [35:58:14<1:32:59, 3.75s/it] 96%|█████████▌| 32790/34278 [35:58:17<1:33:13, 3.76s/it] {'loss': 0.1044, 'grad_norm': 0.8120832384895197, 'learning_rate': 4.933702048801003e-08, 'epoch': 0.96} 96%|█████████▌| 32790/34278 [35:58:17<1:33:13, 3.76s/it] 96%|█████████▌| 32791/34278 [35:58:20<1:27:20, 3.52s/it] {'loss': 0.1008, 'grad_norm': 0.7112104474890033, 'learning_rate': 4.927083866004934e-08, 'epoch': 0.96} 96%|█████████▌| 32791/34278 [35:58:20<1:27:20, 3.52s/it] 96%|█████████▌| 32792/34278 [35:58:24<1:25:06, 3.44s/it] 
{'loss': 0.0775, 'grad_norm': 0.8156216727451591, 'learning_rate': 4.920470103101649e-08, 'epoch': 0.96} 96%|█████████▌| 32792/34278 [35:58:24<1:25:06, 3.44s/it] 96%|█████████▌| 32793/34278 [35:58:27<1:26:35, 3.50s/it] {'loss': 0.1237, 'grad_norm': 0.9304001213469875, 'learning_rate': 4.9138607601502684e-08, 'epoch': 0.96} 96%|█████████▌| 32793/34278 [35:58:27<1:26:35, 3.50s/it] 96%|█████████▌| 32794/34278 [35:58:30<1:23:30, 3.38s/it] {'loss': 0.1073, 'grad_norm': 0.7397240385780188, 'learning_rate': 4.90725583720969e-08, 'epoch': 0.96} 96%|█████████▌| 32794/34278 [35:58:30<1:23:30, 3.38s/it] 96%|█████████▌| 32795/34278 [35:58:34<1:25:08, 3.44s/it] {'loss': 0.1118, 'grad_norm': 0.8207286387182503, 'learning_rate': 4.9006553343389774e-08, 'epoch': 0.96} 96%|█████████▌| 32795/34278 [35:58:34<1:25:08, 3.44s/it] 96%|█████████▌| 32796/34278 [35:58:37<1:23:41, 3.39s/it] {'loss': 0.1349, 'grad_norm': 0.9117029824346379, 'learning_rate': 4.894059251596972e-08, 'epoch': 0.96} 96%|█████████▌| 32796/34278 [35:58:37<1:23:41, 3.39s/it] 96%|█████████▌| 32797/34278 [35:58:43<1:40:20, 4.07s/it] {'loss': 0.1069, 'grad_norm': 0.9294226671764082, 'learning_rate': 4.887467589042683e-08, 'epoch': 0.96} 96%|█████████▌| 32797/34278 [35:58:43<1:40:20, 4.07s/it] 96%|█████████▌| 32798/34278 [35:58:46<1:30:58, 3.69s/it] {'loss': 0.1036, 'grad_norm': 0.9921486183494397, 'learning_rate': 4.88088034673484e-08, 'epoch': 0.96} 96%|█████████▌| 32798/34278 [35:58:46<1:30:58, 3.69s/it] 96%|█████████▌| 32799/34278 [35:58:49<1:27:59, 3.57s/it] {'loss': 0.1221, 'grad_norm': 0.995969663092959, 'learning_rate': 4.874297524732341e-08, 'epoch': 0.96} 96%|█████████▌| 32799/34278 [35:58:49<1:27:59, 3.57s/it] 96%|█████████▌| 32800/34278 [35:58:55<1:45:12, 4.27s/it] {'loss': 0.1041, 'grad_norm': 0.7589754390526152, 'learning_rate': 4.867719123093917e-08, 'epoch': 0.96} 96%|█████████▌| 32800/34278 [35:58:55<1:45:12, 4.27s/it] 96%|█████████▌| 32801/34278 [35:59:01<1:57:13, 4.76s/it] {'loss': 0.1026, 'grad_norm': 
0.920959659120988, 'learning_rate': 4.861145141878243e-08, 'epoch': 0.96} 96%|█████████▌| 32801/34278 [35:59:01<1:57:13, 4.76s/it] 96%|█████████▌| 32802/34278 [35:59:04<1:43:51, 4.22s/it] {'loss': 0.12, 'grad_norm': 1.0198832879935908, 'learning_rate': 4.8545755811441054e-08, 'epoch': 0.96} 96%|█████████▌| 32802/34278 [35:59:04<1:43:51, 4.22s/it] 96%|█████████▌| 32803/34278 [35:59:07<1:39:24, 4.04s/it] {'loss': 0.1258, 'grad_norm': 0.7420504055522724, 'learning_rate': 4.8480104409501236e-08, 'epoch': 0.96} 96%|█████████▌| 32803/34278 [35:59:07<1:39:24, 4.04s/it] 96%|█████████▌| 32804/34278 [35:59:10<1:32:26, 3.76s/it] {'loss': 0.1081, 'grad_norm': 0.738726918682796, 'learning_rate': 4.8414497213549184e-08, 'epoch': 0.96} 96%|█████████▌| 32804/34278 [35:59:10<1:32:26, 3.76s/it] 96%|█████████▌| 32805/34278 [35:59:14<1:30:43, 3.70s/it] {'loss': 0.1024, 'grad_norm': 0.6951912015343367, 'learning_rate': 4.834893422416997e-08, 'epoch': 0.96} 96%|█████████▌| 32805/34278 [35:59:14<1:30:43, 3.70s/it] 96%|█████████▌| 32806/34278 [35:59:17<1:28:57, 3.63s/it] {'loss': 0.1158, 'grad_norm': 0.8459641707126396, 'learning_rate': 4.828341544194981e-08, 'epoch': 0.96} 96%|█████████▌| 32806/34278 [35:59:18<1:28:57, 3.63s/it] 96%|█████████▌| 32807/34278 [35:59:20<1:24:01, 3.43s/it] {'loss': 0.102, 'grad_norm': 0.7477834215712245, 'learning_rate': 4.8217940867473225e-08, 'epoch': 0.96} 96%|█████████▌| 32807/34278 [35:59:20<1:24:01, 3.43s/it] 96%|█████████▌| 32808/34278 [35:59:26<1:42:38, 4.19s/it] {'loss': 0.1436, 'grad_norm': 0.7300391656927202, 'learning_rate': 4.8152510501324745e-08, 'epoch': 0.96} 96%|█████████▌| 32808/34278 [35:59:26<1:42:38, 4.19s/it] 96%|█████████▌| 32809/34278 [35:59:30<1:37:01, 3.96s/it] {'loss': 0.1075, 'grad_norm': 1.000883386167395, 'learning_rate': 4.8087124344088353e-08, 'epoch': 0.96} 96%|█████████▌| 32809/34278 [35:59:30<1:37:01, 3.96s/it] 96%|█████████▌| 32810/34278 [35:59:33<1:28:27, 3.62s/it] {'loss': 0.1262, 'grad_norm': 0.9674040131635351, 
'learning_rate': 4.8021782396348026e-08, 'epoch': 0.96} 96%|█████████▌| 32810/34278 [35:59:33<1:28:27, 3.62s/it] 96%|█████████▌| 32811/34278 [35:59:36<1:25:32, 3.50s/it] {'loss': 0.0904, 'grad_norm': 0.6453790645349463, 'learning_rate': 4.795648465868719e-08, 'epoch': 0.96} 96%|█████████▌| 32811/34278 [35:59:36<1:25:32, 3.50s/it] 96%|█████████▌| 32812/34278 [35:59:39<1:20:29, 3.29s/it] {'loss': 0.1081, 'grad_norm': 0.7435397773090228, 'learning_rate': 4.7891231131688695e-08, 'epoch': 0.96} 96%|█████████▌| 32812/34278 [35:59:39<1:20:29, 3.29s/it] 96%|█████████▌| 32813/34278 [35:59:43<1:25:46, 3.51s/it] {'loss': 0.1281, 'grad_norm': 0.7425090442832256, 'learning_rate': 4.782602181593488e-08, 'epoch': 0.96} 96%|█████████▌| 32813/34278 [35:59:43<1:25:46, 3.51s/it] 96%|█████████▌| 32814/34278 [35:59:48<1:35:08, 3.90s/it] {'loss': 0.1165, 'grad_norm': 0.7247203167434828, 'learning_rate': 4.7760856712008584e-08, 'epoch': 0.96} 96%|█████████▌| 32814/34278 [35:59:48<1:35:08, 3.90s/it] 96%|█████████▌| 32815/34278 [35:59:53<1:48:27, 4.45s/it] {'loss': 0.1239, 'grad_norm': 1.0385829791443835, 'learning_rate': 4.769573582049103e-08, 'epoch': 0.96} 96%|█████████▌| 32815/34278 [35:59:53<1:48:27, 4.45s/it] 96%|█████████▌| 32816/34278 [35:59:56<1:37:57, 4.02s/it] {'loss': 0.132, 'grad_norm': 0.8691451414890681, 'learning_rate': 4.763065914196341e-08, 'epoch': 0.96} 96%|█████████▌| 32816/34278 [35:59:56<1:37:57, 4.02s/it] 96%|█████████▌| 32817/34278 [35:59:59<1:31:22, 3.75s/it] {'loss': 0.1042, 'grad_norm': 0.7427551091637435, 'learning_rate': 4.756562667700748e-08, 'epoch': 0.96} 96%|█████████▌| 32817/34278 [35:59:59<1:31:22, 3.75s/it] 96%|█████████▌| 32818/34278 [36:00:04<1:34:30, 3.88s/it] {'loss': 0.128, 'grad_norm': 0.9395494879599302, 'learning_rate': 4.750063842620389e-08, 'epoch': 0.96} 96%|█████████▌| 32818/34278 [36:00:04<1:34:30, 3.88s/it] 96%|█████████▌| 32819/34278 [36:00:07<1:28:48, 3.65s/it] {'loss': 0.1175, 'grad_norm': 0.8307875710800233, 'learning_rate': 
4.743569439013107e-08, 'epoch': 0.96} 96%|█████████▌| 32819/34278 [36:00:07<1:28:48, 3.65s/it] 96%|█████████▌| 32820/34278 [36:00:10<1:24:58, 3.50s/it] {'loss': 0.1032, 'grad_norm': 0.788224328384697, 'learning_rate': 4.737079456937077e-08, 'epoch': 0.96} 96%|█████████▌| 32820/34278 [36:00:10<1:24:58, 3.50s/it] 96%|█████████▌| 32821/34278 [36:00:13<1:22:39, 3.40s/it] {'loss': 0.1181, 'grad_norm': 0.9026167256181064, 'learning_rate': 4.730593896450197e-08, 'epoch': 0.96} 96%|█████████▌| 32821/34278 [36:00:13<1:22:39, 3.40s/it] 96%|█████████▌| 32822/34278 [36:00:16<1:20:41, 3.33s/it] {'loss': 0.1037, 'grad_norm': 0.7578415303003803, 'learning_rate': 4.724112757610311e-08, 'epoch': 0.96} 96%|█████████▌| 32822/34278 [36:00:16<1:20:41, 3.33s/it] 96%|█████████▌| 32823/34278 [36:00:19<1:20:17, 3.31s/it] {'loss': 0.1184, 'grad_norm': 0.712269092387141, 'learning_rate': 4.717636040475315e-08, 'epoch': 0.96} 96%|█████████▌| 32823/34278 [36:00:19<1:20:17, 3.31s/it] 96%|█████████▌| 32824/34278 [36:00:23<1:19:59, 3.30s/it] {'loss': 0.1004, 'grad_norm': 0.7880497015147347, 'learning_rate': 4.711163745103053e-08, 'epoch': 0.96} 96%|█████████▌| 32824/34278 [36:00:23<1:19:59, 3.30s/it] 96%|█████████▌| 32825/34278 [36:00:26<1:17:09, 3.19s/it] {'loss': 0.0993, 'grad_norm': 0.8939136102826749, 'learning_rate': 4.704695871551257e-08, 'epoch': 0.96} 96%|█████████▌| 32825/34278 [36:00:26<1:17:09, 3.19s/it] 96%|█████████▌| 32826/34278 [36:00:29<1:19:24, 3.28s/it] {'loss': 0.094, 'grad_norm': 0.5951060923281825, 'learning_rate': 4.698232419877658e-08, 'epoch': 0.96} 96%|█████████▌| 32826/34278 [36:00:29<1:19:24, 3.28s/it] 96%|█████████▌| 32827/34278 [36:00:32<1:16:08, 3.15s/it] {'loss': 0.1207, 'grad_norm': 0.7474564157472834, 'learning_rate': 4.6917733901400976e-08, 'epoch': 0.96} 96%|█████████▌| 32827/34278 [36:00:32<1:16:08, 3.15s/it] 96%|█████████▌| 32828/34278 [36:00:35<1:14:49, 3.10s/it] {'loss': 0.1042, 'grad_norm': 0.8172784909372025, 'learning_rate': 4.685318782396087e-08, 
'epoch': 0.96} 96%|█████████▌| 32828/34278 [36:00:35<1:14:49, 3.10s/it]
96%|█████████▌| 32829/34278 [36:00:38<1:11:43, 2.97s/it] {'loss': 0.0943, 'grad_norm': 0.8084277300051048, 'learning_rate': 4.678868596703301e-08, 'epoch': 0.96}
96%|█████████▌| 32830/34278 [36:00:41<1:15:58, 3.15s/it] {'loss': 0.1208, 'grad_norm': 0.8381053554024123, 'learning_rate': 4.6724228331194166e-08, 'epoch': 0.96}
96%|█████████▌| 32831/34278 [36:00:44<1:13:18, 3.04s/it] {'loss': 0.1105, 'grad_norm': 0.7795954897951408, 'learning_rate': 4.665981491701776e-08, 'epoch': 0.96}
96%|█████████▌| 32832/34278 [36:00:47<1:15:14, 3.12s/it] {'loss': 0.1193, 'grad_norm': 1.1263082644142337, 'learning_rate': 4.6595445725080566e-08, 'epoch': 0.96}
96%|█████████▌| 32833/34278 [36:00:51<1:19:07, 3.29s/it] {'loss': 0.1138, 'grad_norm': 0.7691307823628832, 'learning_rate': 4.653112075595711e-08, 'epoch': 0.96}
96%|█████████▌| 32834/34278 [36:00:54<1:15:59, 3.16s/it] {'loss': 0.1025, 'grad_norm': 0.9599208035477919, 'learning_rate': 4.6466840010221395e-08, 'epoch': 0.96}
96%|█████████▌| 32835/34278 [36:00:57<1:15:50, 3.15s/it] {'loss': 0.0876, 'grad_norm': 0.9485995240003718, 'learning_rate': 4.640260348844683e-08, 'epoch': 0.96}
96%|█████████▌| 32836/34278 [36:01:03<1:36:49, 4.03s/it] {'loss': 0.1085, 'grad_norm': 0.7077820665793446, 'learning_rate': 4.6338411191207414e-08, 'epoch': 0.96}
96%|█████████▌| 32837/34278 [36:01:06<1:28:01, 3.66s/it] {'loss': 0.1123, 'grad_norm': 0.827331988072541, 'learning_rate': 4.627426311907601e-08, 'epoch': 0.96}
96%|█████████▌| 32838/34278 [36:01:10<1:29:19, 3.72s/it] {'loss': 0.1085, 'grad_norm': 0.8156878645710716, 'learning_rate': 4.621015927262551e-08, 'epoch': 0.96}
96%|█████████▌| 32839/34278 [36:01:13<1:22:49, 3.45s/it] {'loss': 0.129, 'grad_norm': 0.7015158744487017, 'learning_rate': 4.614609965242822e-08, 'epoch': 0.96}
96%|█████████▌| 32840/34278 [36:01:16<1:20:29, 3.36s/it] {'loss': 0.1252, 'grad_norm': 0.7925884216061743, 'learning_rate': 4.608208425905592e-08, 'epoch': 0.96}
96%|█████████▌| 32841/34278 [36:01:19<1:23:38, 3.49s/it] {'loss': 0.1214, 'grad_norm': 0.9877298809925094, 'learning_rate': 4.601811309308035e-08, 'epoch': 0.96}
96%|█████████▌| 32842/34278 [36:01:22<1:20:02, 3.34s/it] {'loss': 0.1239, 'grad_norm': 0.81919534179844, 'learning_rate': 4.595418615507219e-08, 'epoch': 0.96}
96%|█████████▌| 32843/34278 [36:01:27<1:27:35, 3.66s/it] {'loss': 0.0949, 'grad_norm': 0.7967418626689056, 'learning_rate': 4.589030344560208e-08, 'epoch': 0.96}
96%|█████████▌| 32844/34278 [36:01:31<1:30:59, 3.81s/it] {'loss': 0.1193, 'grad_norm': 0.8079560103900926, 'learning_rate': 4.582646496524124e-08, 'epoch': 0.96}
96%|█████████▌| 32845/34278 [36:01:34<1:28:24, 3.70s/it] {'loss': 0.104, 'grad_norm': 0.9559706878467896, 'learning_rate': 4.5762670714559196e-08, 'epoch': 0.96}
96%|█████████▌| 32846/34278 [36:01:37<1:21:46, 3.43s/it] {'loss': 0.0884, 'grad_norm': 0.684763539678176, 'learning_rate': 4.569892069412496e-08, 'epoch': 0.96}
96%|█████████▌| 32847/34278 [36:01:42<1:29:17, 3.74s/it] {'loss': 0.0919, 'grad_norm': 0.8668853365655103, 'learning_rate': 4.563521490450862e-08, 'epoch': 0.96}
96%|█████████▌| 32848/34278 [36:01:45<1:22:46, 3.47s/it] {'loss': 0.0941, 'grad_norm': 0.9643167870716441, 'learning_rate': 4.557155334627805e-08, 'epoch': 0.96}
96%|█████████▌| 32849/34278 [36:01:48<1:18:50, 3.31s/it] {'loss': 0.0918, 'grad_norm': 0.873714390428912, 'learning_rate': 4.550793602000114e-08, 'epoch': 0.96}
96%|█████████▌| 32850/34278 [36:01:51<1:21:57, 3.44s/it] {'loss': 0.1033, 'grad_norm': 0.893234930076437, 'learning_rate': 4.544436292624743e-08, 'epoch': 0.96}
96%|█████████▌| 32851/34278 [36:01:56<1:33:19, 3.92s/it] {'loss': 0.1277, 'grad_norm': 0.7652129907452041, 'learning_rate': 4.538083406558425e-08, 'epoch': 0.96}
96%|█████████▌| 32852/34278 [36:01:59<1:26:54, 3.66s/it] {'loss': 0.1071, 'grad_norm': 0.7577640621083667, 'learning_rate': 4.531734943857724e-08, 'epoch': 0.96}
96%|█████████▌| 32853/34278 [36:02:02<1:19:43, 3.36s/it] {'loss': 0.0948, 'grad_norm': 1.1307995167316394, 'learning_rate': 4.525390904579485e-08, 'epoch': 0.96}
96%|█████████▌| 32854/34278 [36:02:06<1:20:33, 3.39s/it] {'loss': 0.1044, 'grad_norm': 0.7003890147655835, 'learning_rate': 4.5190512887802186e-08, 'epoch': 0.96}
96%|█████████▌| 32855/34278 [36:02:09<1:20:27, 3.39s/it] {'loss': 0.1015, 'grad_norm': 0.8369313551339875, 'learning_rate': 4.512716096516601e-08, 'epoch': 0.96}
96%|█████████▌| 32856/34278 [36:02:13<1:26:10, 3.64s/it] {'loss': 0.1002, 'grad_norm': 0.7204840338085293, 'learning_rate': 4.506385327845197e-08, 'epoch': 0.96}
96%|█████████▌| 32857/34278 [36:02:16<1:19:48, 3.37s/it] {'loss': 0.1044, 'grad_norm': 1.4576364723203665, 'learning_rate': 4.500058982822464e-08, 'epoch': 0.96}
96%|█████████▌| 32858/34278 [36:02:19<1:19:10, 3.35s/it] {'loss': 0.099, 'grad_norm': 0.9758919669163466, 'learning_rate': 4.493737061504966e-08, 'epoch': 0.96}
96%|█████████▌| 32859/34278 [36:02:22<1:15:44, 3.20s/it] {'loss': 0.1227, 'grad_norm': 0.8193843089787872, 'learning_rate': 4.487419563949047e-08, 'epoch': 0.96}
96%|█████████▌| 32860/34278 [36:02:26<1:19:29, 3.36s/it] {'loss': 0.1016, 'grad_norm': 0.8757366667764627, 'learning_rate': 4.4811064902112175e-08, 'epoch': 0.96}
96%|█████████▌| 32861/34278 [36:02:30<1:26:27, 3.66s/it] {'loss': 0.0864, 'grad_norm': 0.6686091926794362, 'learning_rate': 4.474797840347711e-08, 'epoch': 0.96}
96%|█████████▌| 32862/34278 [36:02:33<1:22:27, 3.49s/it] {'loss': 0.1282, 'grad_norm': 0.8385389414468926, 'learning_rate': 4.468493614414926e-08, 'epoch': 0.96}
96%|█████████▌| 32863/34278 [36:02:37<1:22:42, 3.51s/it] {'loss': 0.1227, 'grad_norm': 1.046296384758437, 'learning_rate': 4.462193812469151e-08, 'epoch': 0.96}
96%|█████████▌| 32864/34278 [36:02:40<1:19:41, 3.38s/it] {'loss': 0.0847, 'grad_norm': 0.9308343716101727, 'learning_rate': 4.4558984345666745e-08, 'epoch': 0.96}
96%|█████████▌| 32865/34278
[36:02:43<1:19:29, 3.38s/it] {'loss': 0.1114, 'grad_norm': 0.6667854690095036, 'learning_rate': 4.4496074807635626e-08, 'epoch': 0.96}
96%|█████████▌| 32866/34278 [36:02:47<1:23:02, 3.53s/it] {'loss': 0.1207, 'grad_norm': 0.7987377356130329, 'learning_rate': 4.443320951116103e-08, 'epoch': 0.96}
96%|█████████▌| 32867/34278 [36:02:50<1:19:23, 3.38s/it] {'loss': 0.0965, 'grad_norm': 1.01976480586201, 'learning_rate': 4.437038845680308e-08, 'epoch': 0.96}
96%|█████████▌| 32868/34278 [36:02:54<1:19:46, 3.39s/it] {'loss': 0.1133, 'grad_norm': 0.8932694195104837, 'learning_rate': 4.4307611645124096e-08, 'epoch': 0.96}
96%|█████████▌| 32869/34278 [36:02:57<1:19:51, 3.40s/it] {'loss': 0.1083, 'grad_norm': 0.7346521941143019, 'learning_rate': 4.4244879076683065e-08, 'epoch': 0.96}
96%|█████████▌| 32870/34278 [36:03:00<1:18:57, 3.36s/it] {'loss': 0.1083, 'grad_norm': 0.7835186886056975, 'learning_rate': 4.418219075204122e-08, 'epoch': 0.96}
96%|█████████▌| 32871/34278 [36:03:03<1:18:04, 3.33s/it] {'loss': 0.1427, 'grad_norm': 0.8545462903301045, 'learning_rate': 4.411954667175811e-08, 'epoch': 0.96}
96%|█████████▌| 32872/34278 [36:03:07<1:18:13, 3.34s/it] {'loss': 0.1214, 'grad_norm': 0.7472910583556739, 'learning_rate': 4.405694683639161e-08, 'epoch': 0.96}
96%|█████████▌| 32873/34278 [36:03:11<1:22:51, 3.54s/it] {'loss': 0.1058, 'grad_norm': 0.967913088083774, 'learning_rate': 4.3994391246501846e-08, 'epoch': 0.96}
96%|█████████▌| 32874/34278 [36:03:14<1:21:23, 3.48s/it] {'loss': 0.1173, 'grad_norm': 0.7862975461298798, 'learning_rate': 4.39318799026478e-08, 'epoch': 0.96}
96%|█████████▌| 32875/34278 [36:03:17<1:17:02, 3.29s/it] {'loss': 0.1114, 'grad_norm': 0.9360038111422968, 'learning_rate': 4.3869412805386256e-08, 'epoch': 0.96}
96%|█████████▌| 32876/34278 [36:03:23<1:34:37, 4.05s/it] {'loss': 0.1173, 'grad_norm': 0.7903278081066042, 'learning_rate': 4.380698995527566e-08, 'epoch': 0.96}
96%|█████████▌| 32877/34278 [36:03:27<1:33:35, 4.01s/it] {'loss': 0.1239, 'grad_norm': 0.905513339696466, 'learning_rate': 4.374461135287278e-08, 'epoch': 0.96}
96%|█████████▌| 32878/34278 [36:03:33<1:47:11, 4.59s/it] {'loss': 0.1103, 'grad_norm': 0.8878514363182609, 'learning_rate': 4.3682276998735505e-08, 'epoch': 0.96}
96%|█████████▌| 32879/34278 [36:03:37<1:45:21, 4.52s/it] {'loss': 0.1312, 'grad_norm': 0.9682928562932697, 'learning_rate': 4.361998689341895e-08, 'epoch': 0.96}
96%|█████████▌| 32880/34278 [36:03:43<1:56:29, 5.00s/it] {'loss': 0.1155, 'grad_norm': 0.6971320156321881, 'learning_rate': 4.355774103748045e-08, 'epoch': 0.96}
96%|█████████▌| 32881/34278 [36:03:47<1:45:52, 4.55s/it] {'loss': 0.1156, 'grad_norm': 0.93114582243866, 'learning_rate': 4.3495539431475106e-08, 'epoch': 0.96}
96%|█████████▌| 32882/34278 [36:03:53<1:58:22, 5.09s/it] {'loss': 0.1183, 'grad_norm': 1.006106115889474, 'learning_rate': 4.343338207595804e-08, 'epoch': 0.96}
96%|█████████▌| 32883/34278 [36:03:56<1:46:49, 4.59s/it] {'loss': 0.1177, 'grad_norm': 0.7048743503820754, 'learning_rate': 4.3371268971484915e-08, 'epoch': 0.96}
96%|█████████▌| 32884/34278 [36:03:59<1:34:19, 4.06s/it] {'loss': 0.11, 'grad_norm': 0.9426349108180363, 'learning_rate': 4.330920011860973e-08, 'epoch': 0.96}
96%|█████████▌| 32885/34278 [36:04:03<1:29:06, 3.84s/it] {'loss': 0.1293, 'grad_norm': 0.9077580728160185, 'learning_rate': 4.3247175517887044e-08, 'epoch': 0.96}
96%|█████████▌| 32886/34278 [36:04:08<1:38:26, 4.24s/it] {'loss': 0.1283, 'grad_norm': 1.0798663308631007, 'learning_rate': 4.318519516986974e-08, 'epoch': 0.96}
96%|█████████▌| 32887/34278 [36:04:11<1:29:44, 3.87s/it] {'loss': 0.1151, 'grad_norm': 0.7957556625772834, 'learning_rate': 4.312325907511183e-08, 'epoch': 0.96}
96%|█████████▌| 32888/34278 [36:04:14<1:22:40, 3.57s/it] {'loss': 0.1007, 'grad_norm': 1.0294127171563132, 'learning_rate': 4.3061367234166764e-08, 'epoch': 0.96}
96%|█████████▌| 32889/34278 [36:04:17<1:19:27, 3.43s/it] {'loss': 0.0825, 'grad_norm': 0.8257449616386338, 'learning_rate': 4.2999519647585755e-08, 'epoch': 0.96}
96%|█████████▌| 32890/34278 [36:04:20<1:20:48, 3.49s/it] {'loss': 0.1035, 'grad_norm': 0.7546505037339957, 'learning_rate': 4.293771631592225e-08, 'epoch': 0.96}
96%|█████████▌| 32891/34278 [36:04:23<1:15:00, 3.25s/it] {'loss': 0.1109, 'grad_norm': 0.7245457104184709, 'learning_rate': 4.287595723972693e-08, 'epoch': 0.96}
96%|█████████▌| 32892/34278 [36:04:26<1:12:28, 3.14s/it] {'loss': 0.0971, 'grad_norm': 0.8459373428090952, 'learning_rate': 4.281424241955212e-08, 'epoch': 0.96}
96%|█████████▌| 32893/34278 [36:04:29<1:11:40, 3.10s/it] {'loss': 0.1057, 'grad_norm': 0.8149609414731143, 'learning_rate': 4.2752571855948496e-08, 'epoch': 0.96}
96%|█████████▌| 32894/34278 [36:04:32<1:10:10, 3.04s/it] {'loss': 0.1307, 'grad_norm': 0.7574602051965764, 'learning_rate': 4.269094554946618e-08, 'epoch': 0.96}
96%|█████████▌| 32895/34278 [36:04:38<1:30:06, 3.91s/it] {'loss': 0.0923, 'grad_norm': 0.84237972087483, 'learning_rate': 4.262936350065583e-08, 'epoch': 0.96}
96%|█████████▌| 32896/34278 [36:04:43<1:41:20, 4.40s/it] {'loss': 0.0889, 'grad_norm': 0.8154954755563415, 'learning_rate': 4.256782571006701e-08, 'epoch': 0.96}
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None
  warnings.warn(
96%|█████████▌| 32897/34278 [36:04:47<1:34:35, 4.11s/it] {'loss': 0.1263, 'grad_norm': 0.7796157232550976, 'learning_rate': 4.2506332178249286e-08, 'epoch': 0.96}
96%|█████████▌| 32898/34278 [36:04:50<1:28:09, 3.83s/it] {'loss': 0.0886, 'grad_norm': 0.6344972446368206, 'learning_rate': 4.244488290575166e-08, 'epoch': 0.96}
96%|█████████▌| 32899/34278 [36:04:53<1:22:24, 3.59s/it] {'loss': 0.093, 'grad_norm': 1.2740887915383508, 'learning_rate': 4.2383477893122584e-08, 'epoch': 0.96}
96%|█████████▌| 32900/34278 [36:04:56<1:20:00, 3.48s/it] {'loss': 0.11, 'grad_norm': 0.8093674068745108, 'learning_rate': 4.23221171409105e-08, 'epoch': 0.96}
96%|█████████▌| 32901/34278 [36:04:59<1:15:33, 3.29s/it] {'loss': 0.1429, 'grad_norm': 0.9556726021725885, 'learning_rate': 4.2260800649662756e-08, 'epoch': 0.96}
96%|█████████▌| 32902/34278 [36:05:02<1:15:02, 3.27s/it] {'loss': 0.1224, 'grad_norm': 0.7988374231433764, 'learning_rate': 4.219952841992725e-08, 'epoch': 0.96}
96%|█████████▌| 32903/34278 [36:05:06<1:15:29, 3.29s/it] {'loss': 0.1175, 'grad_norm': 0.9391374602952318, 'learning_rate': 4.2138300452250756e-08, 'epoch': 0.96}
96%|█████████▌| 32904/34278 [36:05:09<1:17:37, 3.39s/it] {'loss': 0.115, 'grad_norm': 0.7569434105553122, 'learning_rate': 4.207711674718007e-08, 'epoch': 0.96}
96%|█████████▌| 32905/34278 [36:05:12<1:13:42, 3.22s/it] {'loss': 0.1069, 'grad_norm': 0.7241375268849807, 'learning_rate': 4.201597730526141e-08, 'epoch': 0.96}
96%|█████████▌| 32906/34278 [36:05:15<1:10:46, 3.10s/it] {'loss': 0.0892, 'grad_norm': 0.7945752981885397, 'learning_rate': 4.1954882127040466e-08, 'epoch': 0.96}
96%|█████████▌| 32907/34278 [36:05:21<1:31:16, 3.99s/it] {'loss': 0.1165, 'grad_norm': 0.8453466298444707, 'learning_rate': 4.18938312130629e-08, 'epoch': 0.96}
96%|█████████▌| 32908/34278 [36:05:24<1:25:33, 3.75s/it] {'loss': 0.1226, 'grad_norm': 0.7038362290638049, 'learning_rate': 4.183282456387327e-08, 'epoch': 0.96}
96%|█████████▌| 32909/34278 [36:05:28<1:28:59, 3.90s/it] {'loss': 0.1085, 'grad_norm': 0.8187939030680478, 'learning_rate': 4.177186218001617e-08, 'epoch': 0.96}
96%|█████████▌| 32910/34278 [36:05:32<1:23:54, 3.68s/it] {'loss': 0.1064, 'grad_norm': 0.9455942275266978, 'learning_rate': 4.171094406203724e-08, 'epoch': 0.96}
96%|█████████▌| 32911/34278 [36:05:37<1:39:02, 4.35s/it] {'loss': 0.0955, 'grad_norm': 0.8317222060420671, 'learning_rate': 4.165007021047884e-08, 'epoch': 0.96}
96%|█████████▌| 32912/34278 [36:05:41<1:34:12, 4.14s/it] {'loss': 0.1097, 'grad_norm': 0.7933141621435059, 'learning_rate': 4.1589240625884986e-08, 'epoch': 0.96}
96%|█████████▌| 32913/34278 [36:05:45<1:31:46, 4.03s/it] {'loss': 0.1138, 'grad_norm': 0.8913368630929362, 'learning_rate': 4.152845530879912e-08, 'epoch': 0.96}
96%|█████████▌| 32914/34278 [36:05:48<1:24:37, 3.72s/it] {'loss': 0.077, 'grad_norm': 0.7858133652973975, 'learning_rate': 4.1467714259763034e-08, 'epoch': 0.96}
96%|█████████▌| 32915/34278 [36:05:51<1:19:51, 3.52s/it] {'loss': 0.1108, 'grad_norm': 0.8070321620835452, 'learning_rate': 4.1407017479319636e-08, 'epoch': 0.96}
96%|█████████▌| 32916/34278 [36:05:54<1:16:10, 3.36s/it] {'loss': 0.0944, 'grad_norm': 0.7260421751741842, 'learning_rate': 4.13463649680107e-08, 'epoch': 0.96}
96%|█████████▌| 32917/34278 [36:05:58<1:23:13, 3.67s/it] {'loss': 0.1107, 'grad_norm': 0.8073658116030934, 'learning_rate': 4.128575672637747e-08, 'epoch': 0.96}
96%|█████████▌| 32918/34278 [36:06:01<1:18:49, 3.48s/it] {'loss': 0.1207, 'grad_norm': 0.7818934109657987, 'learning_rate': 4.122519275496173e-08, 'epoch': 0.96}
96%|█████████▌| 32919/34278 [36:06:05<1:19:48, 3.52s/it] {'loss': 0.1011, 'grad_norm': 0.6977574966582254, 'learning_rate': 4.11646730543036e-08, 'epoch': 0.96}
96%|█████████▌| 32920/34278 [36:06:09<1:22:11, 3.63s/it] {'loss': 0.1056, 'grad_norm': 0.7836466621465713, 'learning_rate': 4.110419762494322e-08, 'epoch': 0.96}
96%|█████████▌| 32921/34278 [36:06:15<1:38:41, 4.36s/it] {'loss': 0.11, 'grad_norm': 0.7855120714498354, 'learning_rate': 4.10437664674207e-08, 'epoch': 0.96}
96%|█████████▌| 32922/34278 [36:06:18<1:28:35, 3.92s/it] {'loss': 0.1251, 'grad_norm': 0.7335995347301505, 'learning_rate': 4.098337958227561e-08, 'epoch': 0.96}
96%|█████████▌| 32923/34278 [36:06:23<1:38:36, 4.37s/it] {'loss': 0.1322, 'grad_norm': 0.8376774119876726, 'learning_rate': 4.0923036970047516e-08, 'epoch': 0.96}
96%|█████████▌| 32924/34278 [36:06:26<1:29:22, 3.96s/it] {'loss': 0.1078, 'grad_norm': 0.8464667864558919, 'learning_rate': 4.086273863127488e-08, 'epoch': 0.96}
96%|█████████▌| 32925/34278 [36:06:30<1:24:26, 3.74s/it] {'loss': 0.1028, 'grad_norm': 0.7911772461653852, 'learning_rate': 4.08024845664956e-08, 'epoch': 0.96}
96%|█████████▌| 32926/34278 [36:06:33<1:22:43, 3.67s/it] {'loss': 0.1104, 'grad_norm': 0.9011527629033576, 'learning_rate': 4.074227477624759e-08, 'epoch': 0.96}
96%|█████████▌| 32927/34278 [36:06:36<1:17:35, 3.45s/it] {'loss': 0.1424, 'grad_norm': 0.9795115911375706, 'learning_rate': 4.068210926106875e-08, 'epoch': 0.96}
96%|█████████▌| 32928/34278 [36:06:42<1:34:25, 4.20s/it] {'loss': 0.1111, 'grad_norm': 0.6949332747096366, 'learning_rate': 4.062198802149642e-08, 'epoch': 0.96}
96%|█████████▌| 32929/34278 [36:06:45<1:27:42, 3.90s/it] {'loss': 0.1154, 'grad_norm': 0.8964322759490889, 'learning_rate': 4.056191105806684e-08, 'epoch': 0.96}
96%|█████████▌| 32930/34278 [36:06:48<1:23:43, 3.73s/it] {'loss': 0.1171, 'grad_norm': 0.8493600943026116, 'learning_rate': 4.0501878371316806e-08, 'epoch': 0.96}
96%|█████████▌| 32931/34278 [36:06:52<1:20:40, 3.59s/it] {'loss': 0.1151, 'grad_norm': 0.7911199460032626, 'learning_rate': 4.044188996178255e-08, 'epoch': 0.96}
96%|█████████▌| 32932/34278 [36:06:55<1:16:19, 3.40s/it] {'loss': 0.0967, 'grad_norm': 0.9322881469754698, 'learning_rate': 4.0381945829998105e-08, 'epoch': 0.96}
96%|█████████▌| 32933/34278 [36:07:01<1:33:48, 4.18s/it] {'loss': 0.1201,
'grad_norm': 0.7760666350108234, 'learning_rate': 4.0322045976500246e-08, 'epoch': 0.96}
96%|█████████▌| 32934/34278 [36:07:05<1:33:12, 4.16s/it] {'loss': 0.1226, 'grad_norm': 0.9476825495695969, 'learning_rate': 4.0262190401822995e-08, 'epoch': 0.96}
96%|█████████▌| 32935/34278 [36:07:08<1:25:30, 3.82s/it] {'loss': 0.1042, 'grad_norm': 0.8372849613729361, 'learning_rate': 4.0202379106501486e-08, 'epoch': 0.96}
96%|█████████▌| 32936/34278 [36:07:11<1:19:44, 3.57s/it] {'loss': 0.1012, 'grad_norm': 0.8716121342632084, 'learning_rate': 4.014261209106862e-08, 'epoch': 0.96}
96%|█████████▌| 32937/34278 [36:07:17<1:36:46, 4.33s/it] {'loss': 0.1087, 'grad_norm': 0.8071127617666869, 'learning_rate': 4.0082889356058416e-08, 'epoch': 0.96}
96%|█████████▌| 32938/34278 [36:07:20<1:27:20, 3.91s/it] {'loss': 0.1343, 'grad_norm': 1.0736021576704553, 'learning_rate': 4.002321090200434e-08, 'epoch': 0.96}
96%|█████████▌| 32939/34278 [36:07:23<1:21:17, 3.64s/it] {'loss': 0.1102, 'grad_norm': 1.272409731944864, 'learning_rate': 3.996357672943874e-08, 'epoch': 0.96}
96%|█████████▌| 32940/34278 [36:07:26<1:18:59, 3.54s/it] {'loss': 0.121, 'grad_norm': 0.8475898315710867, 'learning_rate': 3.990398683889507e-08, 'epoch': 0.96}
96%|█████████▌| 32941/34278 [36:07:30<1:18:00, 3.50s/it] {'loss': 0.114, 'grad_norm': 0.7365683823754445, 'learning_rate': 3.984444123090403e-08, 'epoch': 0.96}
96%|█████████▌| 32942/34278 [36:07:35<1:34:10, 4.23s/it] {'loss': 0.1141, 'grad_norm': 0.8605090063585837, 'learning_rate': 3.978493990599741e-08, 'epoch': 0.96}
96%|█████████▌| 32943/34278 [36:07:41<1:39:58, 4.49s/it] {'loss': 0.1042, 'grad_norm': 0.717129922732362, 'learning_rate': 3.972548286470701e-08, 'epoch': 0.96}
96%|█████████▌| 32944/34278 [36:07:44<1:32:19, 4.15s/it] {'loss': 0.1269, 'grad_norm': 0.7521501690881692, 'learning_rate': 3.966607010756351e-08, 'epoch': 0.96}
96%|█████████▌| 32945/34278 [36:07:47<1:24:29, 3.80s/it] {'loss': 0.0992, 'grad_norm': 0.6593434180891637, 'learning_rate': 3.960670163509706e-08, 'epoch': 0.96}
96%|█████████▌| 32946/34278 [36:07:51<1:27:48, 3.96s/it] {'loss': 0.1182, 'grad_norm': 0.8931219923621592, 'learning_rate': 3.954737744783776e-08, 'epoch': 0.96}
96%|█████████▌| 32947/34278 [36:07:55<1:23:25, 3.76s/it] {'loss': 0.1016, 'grad_norm': 0.79771928209597, 'learning_rate': 3.9488097546315774e-08, 'epoch': 0.96}
96%|█████████▌| 32948/34278 [36:07:58<1:20:46, 3.64s/it] {'loss': 0.1198, 'grad_norm': 0.8599903754100809, 'learning_rate': 3.942886193105955e-08, 'epoch': 0.96}
96%|█████████▌| 32949/34278 [36:08:03<1:30:32, 4.09s/it] {'loss': 0.1202, 'grad_norm': 0.7881054768386069, 'learning_rate': 3.936967060259811e-08, 'epoch': 0.96}
96%|█████████▌| 32950/34278 [36:08:07<1:26:59, 3.93s/it] {'loss': 0.0911, 'grad_norm': 0.8539346892614871, 'learning_rate': 3.931052356145992e-08, 'epoch': 0.96}
96%|█████████▌| 32951/34278 [36:08:10<1:20:30, 3.64s/it] {'loss': 0.0975, 'grad_norm': 0.7723839977611351, 'learning_rate': 3.925142080817346e-08, 'epoch': 0.96}
96%|█████████▌| 32952/34278 [36:08:12<1:15:37, 3.42s/it] {'loss': 0.1021, 'grad_norm': 1.1751985653196537, 'learning_rate': 3.9192362343266065e-08, 'epoch': 0.96}
96%|█████████▌| 32953/34278 [36:08:16<1:14:37, 3.38s/it] {'loss': 0.1307, 'grad_norm': 1.1151891842326118, 'learning_rate': 3.913334816726511e-08, 'epoch': 0.96}
96%|█████████▌| 32954/34278 [36:08:21<1:28:58, 4.03s/it] {'loss': 0.1285, 'grad_norm': 1.0173224641863299, 'learning_rate': 3.907437828069738e-08, 'epoch': 0.96}
96%|█████████▌| 32955/34278 [36:08:25<1:28:27, 4.01s/it] {'loss': 0.1072, 'grad_norm': 1.1297874265647203, 'learning_rate': 3.901545268408913e-08, 'epoch': 0.96}
96%|█████████▌| 32956/34278 [36:08:31<1:40:31, 4.56s/it] {'loss': 0.1209, 'grad_norm': 0.7802241055611593, 'learning_rate': 3.8956571377966603e-08, 'epoch': 0.96}
96%|█████████▌| 32957/34278 [36:08:34<1:30:00, 4.09s/it] {'loss': 0.1092, 'grad_norm': 0.8205962512299297, 'learning_rate': 3.889773436285604e-08, 'epoch': 0.96}
96%|█████████▌| 32958/34278 [36:08:38<1:26:00, 3.91s/it] {'loss': 0.143, 'grad_norm': 0.8153359881409391, 'learning_rate': 3.8838941639282036e-08, 'epoch': 0.96}
96%|█████████▌| 32959/34278 [36:08:42<1:26:06, 3.92s/it] {'loss': 0.1295, 'grad_norm': 0.8393741738916831, 'learning_rate': 3.8780193207769154e-08, 'epoch': 0.96}
96%|█████████▌| 32960/34278 [36:08:45<1:20:03, 3.64s/it] {'loss': 0.1003, 'grad_norm': 0.8312326704942081, 'learning_rate': 3.8721489068842543e-08, 'epoch': 0.96}
96%|█████████▌| 32961/34278 [36:08:48<1:16:19, 3.48s/it] {'loss': 0.1245, 'grad_norm': 0.7485737538452321, 'learning_rate': 3.866282922302622e-08, 'epoch': 0.96}
96%|█████████▌| 32962/34278 [36:08:53<1:27:39, 4.00s/it] {'loss': 0.1222, 'grad_norm': 0.6920980023177091, 'learning_rate': 3.860421367084366e-08, 'epoch': 0.96}
96%|█████████▌| 32963/34278 [36:08:56<1:22:04, 3.75s/it] {'loss': 0.1136, 'grad_norm': 0.8393656485189316, 'learning_rate': 3.8545642412818327e-08, 'epoch': 0.96}
96%|█████████▌| 32964/34278 [36:09:02<1:36:31, 4.41s/it] {'loss': 0.0787, 'grad_norm': 0.6426028953976453, 'learning_rate': 3.84871154494737e-08, 'epoch': 0.96}
96%|█████████▌| 32965/34278 [36:09:05<1:30:21, 4.13s/it] {'loss': 0.1054, 'grad_norm': 0.9316557809494685, 'learning_rate': 3.842863278133102e-08, 'epoch': 0.96}
96%|█████████▌| 32966/34278 [36:09:09<1:27:10, 3.99s/it] {'loss': 0.1011, 'grad_norm': 0.8161632030829696, 'learning_rate': 3.837019440891321e-08, 'epoch': 0.96}
96%|█████████▌| 32967/34278 [36:09:13<1:25:12, 3.90s/it] {'loss': 0.1156, 'grad_norm': 0.86914766144398, 'learning_rate': 3.8311800332742065e-08, 'epoch': 0.96}
96%|█████████▌| 32968/34278 [36:09:16<1:19:53, 3.66s/it] {'loss': 0.1143, 'grad_norm': 0.9001803427227271, 'learning_rate': 3.825345055333829e-08, 'epoch': 0.96}
96%|█████████▌| 32969/34278 [36:09:21<1:32:03, 4.22s/it] {'loss': 0.1106, 'grad_norm': 0.6701753188413643, 'learning_rate': 3.819514507122368e-08,
'epoch': 0.96}
96%|█████████▌| 32970/34278 [36:09:24<1:24:09, 3.86s/it] {'loss': 0.124, 'grad_norm': 0.8365762014485906, 'learning_rate': 3.813688388691783e-08, 'epoch': 0.96}
96%|█████████▌| 32971/34278 [36:09:27<1:18:38, 3.61s/it] {'loss': 0.119, 'grad_norm': 1.013591248087509, 'learning_rate': 3.807866700094198e-08, 'epoch': 0.96}
96%|█████████▌| 32972/34278 [36:09:31<1:18:31, 3.61s/it] {'loss': 0.0823, 'grad_norm': 0.7875181488612599, 'learning_rate': 3.8020494413815165e-08, 'epoch': 0.96}
96%|█████████▌| 32973/34278 [36:09:34<1:13:56, 3.40s/it] {'loss': 0.1174, 'grad_norm': 0.7823356857316066, 'learning_rate': 3.796236612605641e-08, 'epoch': 0.96}
96%|█████████▌| 32974/34278 [36:09:37<1:11:46, 3.30s/it] {'loss': 0.122, 'grad_norm': 0.7649462781377928, 'learning_rate': 3.790428213818531e-08, 'epoch': 0.96}
96%|█████████▌| 32975/34278 [36:09:41<1:13:47, 3.40s/it] {'loss': 0.0914, 'grad_norm': 0.7209496697471285, 'learning_rate': 3.784624245072088e-08, 'epoch': 0.96}
96%|█████████▌| 32976/34278 [36:09:44<1:14:20, 3.43s/it] {'loss': 0.1241, 'grad_norm': 0.8375459597140754, 'learning_rate': 3.778824706417994e-08, 'epoch': 0.96}
96%|█████████▌| 32977/34278 [36:09:49<1:26:20, 3.98s/it] {'loss': 0.0936, 'grad_norm': 0.6895641257485305, 'learning_rate': 3.7730295979080956e-08, 'epoch': 0.96}
96%|█████████▌| 32978/34278 [36:09:53<1:21:14, 3.75s/it] {'loss': 0.0892, 'grad_norm': 0.9472593070816314, 'learning_rate': 3.767238919594185e-08, 'epoch': 0.96}
96%|█████████▌| 32979/34278 [36:09:57<1:27:24, 4.04s/it] {'loss': 0.1096, 'grad_norm': 0.7789037981698359, 'learning_rate': 3.761452671527832e-08, 'epoch': 0.96}
96%|█████████▌| 32980/34278 [36:10:00<1:20:39, 3.73s/it] {'loss': 0.1179, 'grad_norm': 0.7157690630040628, 'learning_rate': 3.755670853760773e-08, 'epoch': 0.96}
96%|█████████▌| 32981/34278 [36:10:03<1:16:24, 3.53s/it] {'loss': 0.1266, 'grad_norm': 0.9952490367997867, 'learning_rate': 3.7498934663446897e-08, 'epoch': 0.96}
96%|█████████▌| 32982/34278 [36:10:09<1:32:35, 4.29s/it] {'loss': 0.0923, 'grad_norm': 0.78846643130223, 'learning_rate': 3.7441205093310394e-08, 'epoch': 0.96}
96%|█████████▌| 32983/34278 [36:10:14<1:31:06, 4.22s/it] {'loss': 0.1077, 'grad_norm': 0.7882133989258244, 'learning_rate': 3.738351982771449e-08, 'epoch': 0.96}
96%|█████████▌| 32984/34278 [36:10:17<1:25:32, 3.97s/it] {'loss': 0.113, 'grad_norm': 0.8372383457356164, 'learning_rate': 3.7325878867173757e-08, 'epoch': 0.96}
96%|█████████▌| 32985/34278 [36:10:20<1:22:22, 3.82s/it] {'loss': 0.1028, 'grad_norm': 0.8303243025515532, 'learning_rate': 3.72682822122028e-08, 'epoch': 0.96}
96%|█████████▌| 32986/34278 [36:10:26<1:31:04, 4.23s/it] {'loss': 0.0972, 'grad_norm': 0.7177457152597371, 'learning_rate': 3.7210729863315645e-08, 'epoch': 0.96}
96%|█████████▌| 32987/34278 [36:10:29<1:27:41, 4.08s/it] {'loss': 0.1213, 'grad_norm': 0.6882980232675601, 'learning_rate': 3.7153221821026875e-08, 'epoch': 0.96}
96%|█████████▌| 32988/34278 [36:10:32<1:19:21, 3.69s/it] {'loss': 0.1317, 'grad_norm': 0.8717886244412169, 'learning_rate': 3.709575808584942e-08, 'epoch': 0.96}
96%|█████████▌| 32989/34278 [36:10:35<1:16:06, 3.54s/it] {'loss': 0.105, 'grad_norm': 0.7009483093016554, 'learning_rate': 3.703833865829565e-08, 'epoch': 0.96}
96%|█████████▌| 32990/34278 [36:10:39<1:17:11, 3.60s/it] {'loss': 0.1406, 'grad_norm': 0.7956738157981716, 'learning_rate': 3.6980963538879585e-08, 'epoch': 0.96}
96%|█████████▌| 32991/34278 [36:10:42<1:12:58, 3.40s/it] {'loss': 0.1005, 'grad_norm': 1.178436146550977, 'learning_rate': 3.69236327281125e-08, 'epoch': 0.96}
96%|█████████▌| 32992/34278 [36:10:46<1:19:55, 3.73s/it] {'loss': 0.0968, 'grad_norm': 1.0159017263716346, 'learning_rate': 3.68663462265062e-08, 'epoch': 0.96}
96%|█████████▋| 32993/34278 [36:10:51<1:22:45, 3.86s/it] {'loss': 0.1155, 'grad_norm': 0.7600469535711079, 'learning_rate': 3.680910403457194e-08, 'epoch': 0.96}
96%|█████████▋| 32994/34278 [36:10:54<1:16:29, 3.57s/it] {'loss': 0.1024, 'grad_norm': 0.6000076446619093, 'learning_rate': 3.6751906152822095e-08, 'epoch': 0.96}
96%|█████████▋| 32995/34278 [36:10:56<1:11:56, 3.36s/it] {'loss': 0.1128, 'grad_norm': 0.8937829675335617, 'learning_rate': 3.669475258176625e-08, 'epoch': 0.96}
96%|█████████▋| 32996/34278 [36:11:00<1:10:17, 3.29s/it] {'loss': 0.116, 'grad_norm': 0.8659494459743594, 'learning_rate': 3.663764332191455e-08, 'epoch': 0.96}
96%|█████████▋| 32997/34278 [36:11:03<1:11:53, 3.37s/it] {'loss': 0.0958, 'grad_norm': 1.0176679697840387, 'learning_rate': 3.658057837377716e-08, 'epoch': 0.96}
96%|█████████▋| 32998/34278 [36:11:06<1:10:13, 3.29s/it] {'loss': 0.1245, 'grad_norm': 0.77596651910336, 'learning_rate': 3.6523557737863646e-08, 'epoch': 0.96}
96%|█████████▋| 32999/34278 [36:11:11<1:22:35, 3.87s/it] {'loss': 0.1087, 'grad_norm': 0.7578318550528443, 'learning_rate': 3.646658141468251e-08, 'epoch': 0.96}
96%|█████████▋| 33000/34278 [36:11:17<1:36:04, 4.51s/it] {'loss': 0.1082, 'grad_norm': 0.9924055428571892, 'learning_rate': 3.640964940474334e-08, 'epoch': 0.96}
96%|█████████▋| 33001/34278 [36:11:21<1:27:54, 4.13s/it] {'loss': 0.1185, 'grad_norm': 0.7900169952493743, 'learning_rate': 3.635276170855351e-08, 'epoch': 0.96}
96%|█████████▋| 33002/34278 [36:11:24<1:22:43, 3.89s/it] {'loss': 0.1256, 'grad_norm': 0.9883778430563414, 'learning_rate': 3.629591832662149e-08, 'epoch': 0.96}
96%|█████████▋| 33003/34278 [36:11:28<1:23:32, 3.93s/it] {'loss': 0.1038, 'grad_norm': 0.8830936674417562, 'learning_rate': 3.623911925945467e-08, 'epoch': 0.96}
96%|█████████▋| 33004/34278 [36:11:31<1:19:58, 3.77s/it] {'loss': 0.1226, 'grad_norm': 0.9489308384871062, 'learning_rate': 3.618236450755985e-08, 'epoch': 0.96}
96%|█████████▋| 33005/34278 [36:11:37<1:34:28, 4.45s/it] {'loss': 0.1031, 'grad_norm': 0.7970359065891596, 'learning_rate': 3.6125654071444414e-08, 'epoch': 0.96}
96%|█████████▋| 33006/34278
[36:11:41<1:26:24, 4.08s/it] {'loss': 0.1215, 'grad_norm': 0.8478636430615173, 'learning_rate': 3.606898795161351e-08, 'epoch': 0.96} 96%|█████████▋| 33006/34278 [36:11:41<1:26:24, 4.08s/it] 96%|█████████▋| 33007/34278 [36:11:44<1:22:13, 3.88s/it] {'loss': 0.1089, 'grad_norm': 1.1068734345463336, 'learning_rate': 3.6012366148574505e-08, 'epoch': 0.96} 96%|█████████▋| 33007/34278 [36:11:44<1:22:13, 3.88s/it] 96%|█████████▋| 33008/34278 [36:11:47<1:18:29, 3.71s/it] {'loss': 0.1157, 'grad_norm': 0.8433289126861183, 'learning_rate': 3.5955788662831445e-08, 'epoch': 0.96} 96%|█████████▋| 33008/34278 [36:11:47<1:18:29, 3.71s/it] 96%|█████████▋| 33009/34278 [36:11:50<1:14:36, 3.53s/it] {'loss': 0.0957, 'grad_norm': 0.743744804801746, 'learning_rate': 3.589925549489004e-08, 'epoch': 0.96} 96%|█████████▋| 33009/34278 [36:11:50<1:14:36, 3.53s/it] 96%|█████████▋| 33010/34278 [36:11:54<1:11:35, 3.39s/it] {'loss': 0.1065, 'grad_norm': 0.9682467808318107, 'learning_rate': 3.5842766645255436e-08, 'epoch': 0.96} 96%|█████████▋| 33010/34278 [36:11:54<1:11:35, 3.39s/it] 96%|█████████▋| 33011/34278 [36:11:57<1:13:33, 3.48s/it] {'loss': 0.0963, 'grad_norm': 0.8089201034724288, 'learning_rate': 3.578632211443112e-08, 'epoch': 0.96} 96%|█████████▋| 33011/34278 [36:11:57<1:13:33, 3.48s/it] 96%|█████████▋| 33012/34278 [36:12:00<1:10:58, 3.36s/it] {'loss': 0.1121, 'grad_norm': 0.7571672096064874, 'learning_rate': 3.5729921902921684e-08, 'epoch': 0.96} 96%|█████████▋| 33012/34278 [36:12:00<1:10:58, 3.36s/it] 96%|█████████▋| 33013/34278 [36:12:04<1:11:09, 3.38s/it] {'loss': 0.1084, 'grad_norm': 0.8249523443104517, 'learning_rate': 3.567356601123062e-08, 'epoch': 0.96} 96%|█████████▋| 33013/34278 [36:12:04<1:11:09, 3.38s/it] 96%|█████████▋| 33014/34278 [36:12:07<1:12:11, 3.43s/it] {'loss': 0.1028, 'grad_norm': 0.9194602558745402, 'learning_rate': 3.561725443986086e-08, 'epoch': 0.96} 96%|█████████▋| 33014/34278 [36:12:07<1:12:11, 3.43s/it] 96%|█████████▋| 33015/34278 [36:12:10<1:09:49, 
3.32s/it] {'loss': 0.1123, 'grad_norm': 0.7845212325979251, 'learning_rate': 3.556098718931478e-08, 'epoch': 0.96} 96%|█████████▋| 33015/34278 [36:12:10<1:09:49, 3.32s/it] 96%|█████████▋| 33016/34278 [36:12:13<1:08:28, 3.26s/it] {'loss': 0.1098, 'grad_norm': 0.8637188543436544, 'learning_rate': 3.55047642600953e-08, 'epoch': 0.96} 96%|█████████▋| 33016/34278 [36:12:13<1:08:28, 3.26s/it] 96%|█████████▋| 33017/34278 [36:12:16<1:06:18, 3.15s/it] {'loss': 0.0923, 'grad_norm': 0.7449570557338919, 'learning_rate': 3.544858565270426e-08, 'epoch': 0.96} 96%|█████████▋| 33017/34278 [36:12:16<1:06:18, 3.15s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f8ae07d68e0>
Failed to fetch sample 3353706.
Exception: cannot identify image file <_io.BytesIO object at 0x7f8ae07d68e0> 96%|█████████▋| 33018/34278 [36:12:21<1:16:43, 3.65s/it] {'loss': 0.0861, 'grad_norm': 0.8004000862337594, 'learning_rate': 3.5392451367643466e-08, 'epoch': 0.96} 96%|█████████▋| 33018/34278 [36:12:21<1:16:43, 3.65s/it] 96%|█████████▋| 33019/34278 [36:12:26<1:22:21, 3.92s/it] {'loss': 0.1095, 'grad_norm': 0.6309363146460676, 'learning_rate': 3.5336361405413076e-08, 'epoch': 0.96} 96%|█████████▋| 33019/34278 [36:12:26<1:22:21, 3.92s/it] 96%|█████████▋| 33020/34278 [36:12:29<1:16:32, 3.65s/it] {'loss': 0.102, 'grad_norm': 0.7282028087219455, 'learning_rate': 3.5280315766514915e-08, 'epoch': 0.96} 96%|█████████▋| 33020/34278 [36:12:29<1:16:32, 3.65s/it] 96%|█████████▋| 33021/34278 [36:12:32<1:15:41, 3.61s/it] {'loss': 0.0866, 'grad_norm': 0.7912197510853144, 'learning_rate': 3.522431445144858e-08, 'epoch': 0.96} 96%|█████████▋| 33021/34278 [36:12:32<1:15:41, 3.61s/it] 96%|█████████▋| 33022/34278 [36:12:35<1:11:09, 3.40s/it] {'loss': 0.1145, 'grad_norm': 0.9651112176832288, 'learning_rate': 3.5168357460714785e-08, 'epoch': 0.96} 96%|█████████▋| 33022/34278 [36:12:35<1:11:09, 3.40s/it] 96%|█████████▋| 33023/34278 [36:12:38<1:09:42, 3.33s/it] {'loss': 0.119, 'grad_norm': 0.8764524239134768, 'learning_rate': 3.5112444794812016e-08, 'epoch': 0.96} 96%|█████████▋| 33023/34278 [36:12:38<1:09:42, 3.33s/it] 96%|█████████▋| 33024/34278 [36:12:44<1:23:09, 3.98s/it] {'loss': 0.1232, 'grad_norm': 0.7404967584345311, 'learning_rate': 3.5056576454240984e-08, 'epoch': 0.96} 96%|█████████▋| 33024/34278 [36:12:44<1:23:09, 3.98s/it] 96%|█████████▋| 33025/34278 [36:12:50<1:34:10, 4.51s/it] {'loss': 0.1166, 'grad_norm': 0.9301669890657671, 'learning_rate': 3.5000752439499076e-08, 'epoch': 0.96} 96%|█████████▋| 33025/34278 [36:12:50<1:34:10, 4.51s/it] 96%|█████████▋| 33026/34278 [36:12:55<1:40:12, 4.80s/it] {'loss': 0.1369, 'grad_norm': 0.9760219893991673, 'learning_rate': 3.494497275108533e-08, 'epoch': 0.96} 
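The `UnidentifiedImageError` traceback above originates in `pil_loader` in `src/aguvis/dataset.py`: the bytes fetched for sample 3353706 are not a decodable image. Training continues afterwards, so `__getitem__` evidently catches the failure, logs "Failed to fetch sample ... Exception: ...", and falls back to another sample. A minimal stand-alone sketch of that skip-and-retry pattern — `fetch_with_fallback` is a hypothetical helper for illustration, not the actual `dataset.py` code (which would pass `PIL.UnidentifiedImageError` via `skip_exceptions`):

```python
def fetch_with_fallback(get_item, length, index, max_retries=10,
                        skip_exceptions=(Exception,)):
    """Fetch sample `index`; on a corrupt sample (e.g. PIL raising
    UnidentifiedImageError on undecodable bytes), log the failure and
    fall back to the next index instead of crashing the training run."""
    for attempt in range(max_retries):
        i = (index + attempt) % length
        try:
            return get_item(i)
        except skip_exceptions as e:
            # Mirrors the log's "Failed to fetch sample N. Exception: ..." lines.
            print(f"Failed to fetch sample {i}. Exception: {e}")
    raise RuntimeError(f"Gave up after {max_retries} corrupt samples near index {index}")
```

Under this scheme a single corrupt sample costs one extra fetch and two log lines; only a long run of adjacent corrupt samples aborts the job.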
96%|█████████▋| 33026/34278 [36:12:55<1:40:12, 4.80s/it] 96%|█████████▋| 33027/34278 [36:12:58<1:28:42, 4.25s/it] {'loss': 0.1372, 'grad_norm': 0.8939962196058359, 'learning_rate': 3.4889237389497673e-08, 'epoch': 0.96} 96%|█████████▋| 33027/34278 [36:12:58<1:28:42, 4.25s/it] 96%|█████████▋| 33028/34278 [36:13:02<1:25:00, 4.08s/it] {'loss': 0.0916, 'grad_norm': 0.8861108558928846, 'learning_rate': 3.48335463552335e-08, 'epoch': 0.96} 96%|█████████▋| 33028/34278 [36:13:02<1:25:00, 4.08s/it] 96%|█████████▋| 33029/34278 [36:13:06<1:23:08, 3.99s/it] {'loss': 0.0971, 'grad_norm': 0.6805670338444785, 'learning_rate': 3.477789964879019e-08, 'epoch': 0.96} 96%|█████████▋| 33029/34278 [36:13:06<1:23:08, 3.99s/it] 96%|█████████▋| 33030/34278 [36:13:08<1:16:14, 3.67s/it] {'loss': 0.1201, 'grad_norm': 0.7093956552409721, 'learning_rate': 3.4722297270664564e-08, 'epoch': 0.96} 96%|█████████▋| 33030/34278 [36:13:08<1:16:14, 3.67s/it] 96%|█████████▋| 33031/34278 [36:13:13<1:22:11, 3.95s/it] {'loss': 0.1015, 'grad_norm': 0.8640660654636393, 'learning_rate': 3.4666739221352885e-08, 'epoch': 0.96} 96%|█████████▋| 33031/34278 [36:13:13<1:22:11, 3.95s/it] 96%|█████████▋| 33032/34278 [36:13:16<1:18:26, 3.78s/it] {'loss': 0.1179, 'grad_norm': 1.0140748355593368, 'learning_rate': 3.461122550135143e-08, 'epoch': 0.96} 96%|█████████▋| 33032/34278 [36:13:16<1:18:26, 3.78s/it] 96%|█████████▋| 33033/34278 [36:13:20<1:17:48, 3.75s/it] {'loss': 0.1227, 'grad_norm': 0.7772640268005586, 'learning_rate': 3.4555756111155356e-08, 'epoch': 0.96} 96%|█████████▋| 33033/34278 [36:13:20<1:17:48, 3.75s/it] 96%|█████████▋| 33034/34278 [36:13:25<1:27:06, 4.20s/it] {'loss': 0.1257, 'grad_norm': 1.1213463004520035, 'learning_rate': 3.4500331051260383e-08, 'epoch': 0.96} 96%|█████████▋| 33034/34278 [36:13:25<1:27:06, 4.20s/it] 96%|█████████▋| 33035/34278 [36:13:29<1:24:03, 4.06s/it] {'loss': 0.1079, 'grad_norm': 0.8425831603907519, 'learning_rate': 3.4444950322161106e-08, 'epoch': 0.96} 96%|█████████▋| 
33035/34278 [36:13:29<1:24:03, 4.06s/it] 96%|█████████▋| 33036/34278 [36:13:33<1:20:13, 3.88s/it] {'loss': 0.114, 'grad_norm': 0.6934389390212334, 'learning_rate': 3.438961392435158e-08, 'epoch': 0.96} 96%|█████████▋| 33036/34278 [36:13:33<1:20:13, 3.88s/it] 96%|█████████▋| 33037/34278 [36:13:36<1:16:33, 3.70s/it] {'loss': 0.1563, 'grad_norm': 0.8457633294947993, 'learning_rate': 3.433432185832641e-08, 'epoch': 0.96} 96%|█████████▋| 33037/34278 [36:13:36<1:16:33, 3.70s/it] 96%|█████████▋| 33038/34278 [36:13:40<1:18:38, 3.81s/it] {'loss': 0.1168, 'grad_norm': 0.8949233028699001, 'learning_rate': 3.4279074124579094e-08, 'epoch': 0.96} 96%|█████████▋| 33038/34278 [36:13:40<1:18:38, 3.81s/it] 96%|█████████▋| 33039/34278 [36:13:43<1:16:27, 3.70s/it] {'loss': 0.095, 'grad_norm': 0.6767415914019637, 'learning_rate': 3.422387072360256e-08, 'epoch': 0.96} 96%|█████████▋| 33039/34278 [36:13:43<1:16:27, 3.70s/it] 96%|█████████▋| 33040/34278 [36:13:47<1:18:08, 3.79s/it] {'loss': 0.1106, 'grad_norm': 0.8134499340441764, 'learning_rate': 3.4168711655889756e-08, 'epoch': 0.96} 96%|█████████▋| 33040/34278 [36:13:47<1:18:08, 3.79s/it] 96%|█████████▋| 33041/34278 [36:13:51<1:19:03, 3.83s/it] {'loss': 0.0971, 'grad_norm': 1.791976430899528, 'learning_rate': 3.411359692193361e-08, 'epoch': 0.96} 96%|█████████▋| 33041/34278 [36:13:51<1:19:03, 3.83s/it] 96%|█████████▋| 33042/34278 [36:13:55<1:15:36, 3.67s/it] {'loss': 0.0912, 'grad_norm': 0.8628128175687187, 'learning_rate': 3.405852652222596e-08, 'epoch': 0.96} 96%|█████████▋| 33042/34278 [36:13:55<1:15:36, 3.67s/it] 96%|█████████▋| 33043/34278 [36:13:57<1:10:47, 3.44s/it] {'loss': 0.0998, 'grad_norm': 0.7840498889838119, 'learning_rate': 3.400350045725809e-08, 'epoch': 0.96} 96%|█████████▋| 33043/34278 [36:13:57<1:10:47, 3.44s/it] 96%|█████████▋| 33044/34278 [36:14:02<1:19:23, 3.86s/it] {'loss': 0.1097, 'grad_norm': 0.6967376898445659, 'learning_rate': 3.3948518727521807e-08, 'epoch': 0.96} 96%|█████████▋| 33044/34278 
[36:14:02<1:19:23, 3.86s/it] 96%|█████████▋| 33045/34278 [36:14:06<1:15:26, 3.67s/it] {'loss': 0.1213, 'grad_norm': 0.9112833338701539, 'learning_rate': 3.3893581333507286e-08, 'epoch': 0.96} 96%|█████████▋| 33045/34278 [36:14:06<1:15:26, 3.67s/it] 96%|█████████▋| 33046/34278 [36:14:09<1:14:43, 3.64s/it] {'loss': 0.1123, 'grad_norm': 0.8504616077584305, 'learning_rate': 3.383868827570524e-08, 'epoch': 0.96} 96%|█████████▋| 33046/34278 [36:14:09<1:14:43, 3.64s/it] 96%|█████████▋| 33047/34278 [36:14:13<1:16:39, 3.74s/it] {'loss': 0.0916, 'grad_norm': 0.7451934307086762, 'learning_rate': 3.3783839554605845e-08, 'epoch': 0.96} 96%|█████████▋| 33047/34278 [36:14:13<1:16:39, 3.74s/it] 96%|█████████▋| 33048/34278 [36:14:17<1:14:51, 3.65s/it] {'loss': 0.1235, 'grad_norm': 0.8904984547388519, 'learning_rate': 3.372903517069925e-08, 'epoch': 0.96} 96%|█████████▋| 33048/34278 [36:14:17<1:14:51, 3.65s/it] 96%|█████████▋| 33049/34278 [36:14:19<1:09:34, 3.40s/it] {'loss': 0.0947, 'grad_norm': 0.8104490859134511, 'learning_rate': 3.3674275124473966e-08, 'epoch': 0.96} 96%|█████████▋| 33049/34278 [36:14:19<1:09:34, 3.40s/it] 96%|█████████▋| 33050/34278 [36:14:23<1:12:25, 3.54s/it] {'loss': 0.1266, 'grad_norm': 0.8997780029101679, 'learning_rate': 3.361955941641959e-08, 'epoch': 0.96} 96%|█████████▋| 33050/34278 [36:14:23<1:12:25, 3.54s/it] 96%|█████████▋| 33051/34278 [36:14:26<1:09:13, 3.39s/it] {'loss': 0.1317, 'grad_norm': 0.7583106702966642, 'learning_rate': 3.356488804702407e-08, 'epoch': 0.96} 96%|█████████▋| 33051/34278 [36:14:26<1:09:13, 3.39s/it] 96%|█████████▋| 33052/34278 [36:14:29<1:07:41, 3.31s/it] {'loss': 0.1255, 'grad_norm': 0.9595040396376152, 'learning_rate': 3.351026101677535e-08, 'epoch': 0.96} 96%|█████████▋| 33052/34278 [36:14:29<1:07:41, 3.31s/it] 96%|█████████▋| 33053/34278 [36:14:33<1:10:37, 3.46s/it] {'loss': 0.092, 'grad_norm': 0.8097031344318243, 'learning_rate': 3.345567832616137e-08, 'epoch': 0.96} 96%|█████████▋| 33053/34278 [36:14:33<1:10:37, 
3.46s/it] 96%|█████████▋| 33054/34278 [36:14:36<1:07:52, 3.33s/it] {'loss': 0.1251, 'grad_norm': 0.8790094470262768, 'learning_rate': 3.3401139975669515e-08, 'epoch': 0.96} 96%|█████████▋| 33054/34278 [36:14:36<1:07:52, 3.33s/it] 96%|█████████▋| 33055/34278 [36:14:42<1:22:54, 4.07s/it] {'loss': 0.1039, 'grad_norm': 0.8628659152043027, 'learning_rate': 3.334664596578718e-08, 'epoch': 0.96} 96%|█████████▋| 33055/34278 [36:14:42<1:22:54, 4.07s/it] 96%|█████████▋| 33056/34278 [36:14:45<1:16:22, 3.75s/it] {'loss': 0.1141, 'grad_norm': 0.8293429931343395, 'learning_rate': 3.329219629699954e-08, 'epoch': 0.96} 96%|█████████▋| 33056/34278 [36:14:45<1:16:22, 3.75s/it] 96%|█████████▋| 33057/34278 [36:14:48<1:10:00, 3.44s/it] {'loss': 0.0961, 'grad_norm': 0.8531929437695732, 'learning_rate': 3.323779096979396e-08, 'epoch': 0.96} 96%|█████████▋| 33057/34278 [36:14:48<1:10:00, 3.44s/it] 96%|█████████▋| 33058/34278 [36:14:54<1:25:19, 4.20s/it] {'loss': 0.098, 'grad_norm': 0.721512424405809, 'learning_rate': 3.3183429984655626e-08, 'epoch': 0.96} 96%|█████████▋| 33058/34278 [36:14:54<1:25:19, 4.20s/it] 96%|█████████▋| 33059/34278 [36:14:57<1:18:17, 3.85s/it] {'loss': 0.1024, 'grad_norm': 0.7962545737939721, 'learning_rate': 3.312911334207025e-08, 'epoch': 0.96} 96%|█████████▋| 33059/34278 [36:14:57<1:18:17, 3.85s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7ff4f3fa25c0>
Failed to fetch sample 3353708.
Exception: cannot identify image file <_io.BytesIO object at 0x7ff4f3fa25c0>
96%|█████████▋| 33060/34278 [36:15:00<1:12:17, 3.56s/it] {'loss': 0.1019, 'grad_norm': 0.8252475799718151, 'learning_rate': 3.307484104252245e-08, 'epoch': 0.96} 96%|█████████▋| 33060/34278 [36:15:00<1:12:17, 3.56s/it] 96%|█████████▋| 33061/34278 [36:15:03<1:11:03, 3.50s/it] {'loss': 0.0977, 'grad_norm': 0.8107301483904512, 'learning_rate': 3.302061308649629e-08, 'epoch': 0.96} 96%|█████████▋| 33061/34278 [36:15:03<1:11:03, 3.50s/it] 96%|█████████▋| 33062/34278 [36:15:06<1:07:43, 3.34s/it] {'loss': 0.0951, 'grad_norm': 0.7197481795572213, 'learning_rate': 3.296642947447693e-08, 'epoch': 0.96} 96%|█████████▋| 33062/34278 [36:15:06<1:07:43, 3.34s/it] 96%|█████████▋| 33063/34278 [36:15:09<1:06:31, 3.29s/it] {'loss': 0.1271, 'grad_norm': 0.8894285379799222, 'learning_rate': 3.2912290206947305e-08, 'epoch': 0.96} 96%|█████████▋| 33063/34278 [36:15:09<1:06:31, 3.29s/it] 96%|█████████▋| 33064/34278 [36:15:12<1:03:26, 3.14s/it] {'loss': 0.092, 'grad_norm': 0.6619987909443094, 'learning_rate': 3.2858195284390936e-08, 'epoch': 0.96} 96%|█████████▋| 33064/34278 [36:15:12<1:03:26, 3.14s/it] 96%|█████████▋| 33065/34278 [36:15:15<1:04:54, 3.21s/it] {'loss': 0.1097, 'grad_norm': 0.7370591185472479,
'learning_rate': 3.280414470729076e-08, 'epoch': 0.96} 96%|█████████▋| 33065/34278 [36:15:15<1:04:54, 3.21s/it] 96%|█████████▋| 33066/34278 [36:15:18<1:04:07, 3.17s/it] {'loss': 0.0959, 'grad_norm': 0.7120999999351729, 'learning_rate': 3.2750138476129736e-08, 'epoch': 0.96} 96%|█████████▋| 33066/34278 [36:15:18<1:04:07, 3.17s/it] 96%|█████████▋| 33067/34278 [36:15:24<1:17:27, 3.84s/it] {'loss': 0.1007, 'grad_norm': 1.7231995176486146, 'learning_rate': 3.269617659138968e-08, 'epoch': 0.96} 96%|█████████▋| 33067/34278 [36:15:24<1:17:27, 3.84s/it] 96%|█████████▋| 33068/34278 [36:15:27<1:11:55, 3.57s/it] {'loss': 0.1084, 'grad_norm': 0.743186413695021, 'learning_rate': 3.264225905355245e-08, 'epoch': 0.96} 96%|█████████▋| 33068/34278 [36:15:27<1:11:55, 3.57s/it] 96%|█████████▋| 33069/34278 [36:15:30<1:09:36, 3.45s/it] {'loss': 0.1067, 'grad_norm': 0.6858552204728713, 'learning_rate': 3.258838586309876e-08, 'epoch': 0.96} 96%|█████████▋| 33069/34278 [36:15:30<1:09:36, 3.45s/it] 96%|█████████▋| 33070/34278 [36:15:34<1:12:43, 3.61s/it] {'loss': 0.132, 'grad_norm': 0.7390737277577913, 'learning_rate': 3.253455702051045e-08, 'epoch': 0.96} 96%|█████████▋| 33070/34278 [36:15:34<1:12:43, 3.61s/it] 96%|█████████▋| 33071/34278 [36:15:37<1:12:03, 3.58s/it] {'loss': 0.1048, 'grad_norm': 0.7102986768287417, 'learning_rate': 3.248077252626769e-08, 'epoch': 0.96} 96%|█████████▋| 33071/34278 [36:15:37<1:12:03, 3.58s/it] 96%|█████████▋| 33072/34278 [36:15:41<1:10:25, 3.50s/it] {'loss': 0.1153, 'grad_norm': 0.8894319490504129, 'learning_rate': 3.2427032380851206e-08, 'epoch': 0.96} 96%|█████████▋| 33072/34278 [36:15:41<1:10:25, 3.50s/it] 96%|█████████▋| 33073/34278 [36:15:44<1:08:40, 3.42s/it] {'loss': 0.1329, 'grad_norm': 1.2816692484979821, 'learning_rate': 3.2373336584740066e-08, 'epoch': 0.96} 96%|█████████▋| 33073/34278 [36:15:44<1:08:40, 3.42s/it] 96%|█████████▋| 33074/34278 [36:15:47<1:05:58, 3.29s/it] {'loss': 0.1132, 'grad_norm': 0.8388061203776557, 'learning_rate': 
3.231968513841388e-08, 'epoch': 0.96} 96%|█████████▋| 33074/34278 [36:15:47<1:05:58, 3.29s/it] 96%|█████████▋| 33075/34278 [36:15:50<1:05:56, 3.29s/it] {'loss': 0.1003, 'grad_norm': 0.8260590103042612, 'learning_rate': 3.2266078042351155e-08, 'epoch': 0.96} 96%|█████████▋| 33075/34278 [36:15:50<1:05:56, 3.29s/it] 96%|█████████▋| 33076/34278 [36:15:53<1:04:45, 3.23s/it] {'loss': 0.1083, 'grad_norm': 0.8251853949977179, 'learning_rate': 3.221251529703151e-08, 'epoch': 0.96} 96%|█████████▋| 33076/34278 [36:15:53<1:04:45, 3.23s/it] 96%|█████████▋| 33077/34278 [36:16:00<1:23:28, 4.17s/it] {'loss': 0.1028, 'grad_norm': 0.7431120586565373, 'learning_rate': 3.2158996902932896e-08, 'epoch': 0.96} 96%|█████████▋| 33077/34278 [36:16:00<1:23:28, 4.17s/it] 96%|█████████▋| 33078/34278 [36:16:03<1:18:34, 3.93s/it] {'loss': 0.1054, 'grad_norm': 1.0436005730416644, 'learning_rate': 3.21055228605327e-08, 'epoch': 0.96} 96%|█████████▋| 33078/34278 [36:16:03<1:18:34, 3.93s/it] 97%|█████████▋| 33079/34278 [36:16:07<1:16:59, 3.85s/it] {'loss': 0.0967, 'grad_norm': 0.8754435615184252, 'learning_rate': 3.2052093170307774e-08, 'epoch': 0.97} 97%|█████████▋| 33079/34278 [36:16:07<1:16:59, 3.85s/it] 97%|█████████▋| 33080/34278 [36:16:10<1:11:00, 3.56s/it] {'loss': 0.1219, 'grad_norm': 0.946657646123212, 'learning_rate': 3.199870783273662e-08, 'epoch': 0.97} 97%|█████████▋| 33080/34278 [36:16:10<1:11:00, 3.56s/it] 97%|█████████▋| 33081/34278 [36:16:13<1:09:43, 3.50s/it] {'loss': 0.0944, 'grad_norm': 0.5836997331509567, 'learning_rate': 3.194536684829497e-08, 'epoch': 0.97} 97%|█████████▋| 33081/34278 [36:16:13<1:09:43, 3.50s/it] 97%|█████████▋| 33082/34278 [36:16:16<1:07:02, 3.36s/it] {'loss': 0.1203, 'grad_norm': 0.8370698797524847, 'learning_rate': 3.189207021745855e-08, 'epoch': 0.97} 97%|█████████▋| 33082/34278 [36:16:16<1:07:02, 3.36s/it] 97%|█████████▋| 33083/34278 [36:16:20<1:08:46, 3.45s/it] {'loss': 0.1109, 'grad_norm': 0.6781248954224448, 'learning_rate': 3.1838817940704206e-08, 
'epoch': 0.97} 97%|█████████▋| 33083/34278 [36:16:20<1:08:46, 3.45s/it] 97%|█████████▋| 33084/34278 [36:16:23<1:06:40, 3.35s/it] {'loss': 0.1234, 'grad_norm': 0.9151357452442114, 'learning_rate': 3.178561001850655e-08, 'epoch': 0.97} 97%|█████████▋| 33084/34278 [36:16:23<1:06:40, 3.35s/it] 97%|█████████▋| 33085/34278 [36:16:25<1:02:48, 3.16s/it] {'loss': 0.1005, 'grad_norm': 0.9572796163881453, 'learning_rate': 3.1732446451341326e-08, 'epoch': 0.97} 97%|█████████▋| 33085/34278 [36:16:25<1:02:48, 3.16s/it] 97%|█████████▋| 33086/34278 [36:16:29<1:06:53, 3.37s/it] {'loss': 0.1152, 'grad_norm': 0.8989409858058658, 'learning_rate': 3.167932723968259e-08, 'epoch': 0.97} 97%|█████████▋| 33086/34278 [36:16:29<1:06:53, 3.37s/it] 97%|█████████▋| 33087/34278 [36:16:32<1:05:24, 3.29s/it] {'loss': 0.1157, 'grad_norm': 0.9120765378663985, 'learning_rate': 3.162625238400496e-08, 'epoch': 0.97} 97%|█████████▋| 33087/34278 [36:16:32<1:05:24, 3.29s/it] 97%|█████████▋| 33088/34278 [36:16:35<1:03:49, 3.22s/it] {'loss': 0.1051, 'grad_norm': 0.8992382488484729, 'learning_rate': 3.157322188478196e-08, 'epoch': 0.97} 97%|█████████▋| 33088/34278 [36:16:35<1:03:49, 3.22s/it] 97%|█████████▋| 33089/34278 [36:16:39<1:03:47, 3.22s/it] {'loss': 0.089, 'grad_norm': 0.6885380623396845, 'learning_rate': 3.1520235742487084e-08, 'epoch': 0.97} 97%|█████████▋| 33089/34278 [36:16:39<1:03:47, 3.22s/it] 97%|█████████▋| 33090/34278 [36:16:42<1:03:56, 3.23s/it] {'loss': 0.1072, 'grad_norm': 0.7241347519213581, 'learning_rate': 3.1467293957593846e-08, 'epoch': 0.97} 97%|█████████▋| 33090/34278 [36:16:42<1:03:56, 3.23s/it] 97%|█████████▋| 33091/34278 [36:16:45<1:01:16, 3.10s/it] {'loss': 0.1185, 'grad_norm': 0.7173859987060868, 'learning_rate': 3.1414396530574655e-08, 'epoch': 0.97} 97%|█████████▋| 33091/34278 [36:16:45<1:01:16, 3.10s/it] 97%|█████████▋| 33092/34278 [36:16:48<1:01:00, 3.09s/it] {'loss': 0.1123, 'grad_norm': 0.9488102912576474, 'learning_rate': 3.136154346190079e-08, 'epoch': 0.97} 
97%|█████████▋| 33092/34278 [36:16:48<1:01:00, 3.09s/it] 97%|█████████▋| 33093/34278 [36:16:51<1:02:07, 3.15s/it] {'loss': 0.1124, 'grad_norm': 0.7968865901405182, 'learning_rate': 3.1308734752045767e-08, 'epoch': 0.97} 97%|█████████▋| 33093/34278 [36:16:51<1:02:07, 3.15s/it] 97%|█████████▋| 33094/34278 [36:16:54<59:02, 2.99s/it] {'loss': 0.1047, 'grad_norm': 0.8296046017990707, 'learning_rate': 3.125597040147976e-08, 'epoch': 0.97} 97%|█████████▋| 33094/34278 [36:16:54<59:02, 2.99s/it] 97%|█████████▋| 33095/34278 [36:16:58<1:07:01, 3.40s/it] {'loss': 0.1087, 'grad_norm': 1.1175573532422671, 'learning_rate': 3.1203250410674625e-08, 'epoch': 0.97} 97%|█████████▋| 33095/34278 [36:16:58<1:07:01, 3.40s/it] 97%|█████████▋| 33096/34278 [36:17:01<1:03:07, 3.20s/it] {'loss': 0.0985, 'grad_norm': 0.8937334867429435, 'learning_rate': 3.115057478010053e-08, 'epoch': 0.97} 97%|█████████▋| 33096/34278 [36:17:01<1:03:07, 3.20s/it] 97%|█████████▋| 33097/34278 [36:17:05<1:06:22, 3.37s/it] {'loss': 0.1029, 'grad_norm': 1.1154369897194318, 'learning_rate': 3.1097943510227657e-08, 'epoch': 0.97} 97%|█████████▋| 33097/34278 [36:17:05<1:06:22, 3.37s/it] 97%|█████████▋| 33098/34278 [36:17:09<1:10:17, 3.57s/it] {'loss': 0.1053, 'grad_norm': 0.9569703271745534, 'learning_rate': 3.1045356601526746e-08, 'epoch': 0.97} 97%|█████████▋| 33098/34278 [36:17:09<1:10:17, 3.57s/it] 97%|█████████▋| 33099/34278 [36:17:12<1:08:03, 3.46s/it] {'loss': 0.1022, 'grad_norm': 1.0134480377950137, 'learning_rate': 3.09928140544663e-08, 'epoch': 0.97} 97%|█████████▋| 33099/34278 [36:17:12<1:08:03, 3.46s/it] 97%|█████████▋| 33100/34278 [36:17:16<1:10:11, 3.58s/it] {'loss': 0.1002, 'grad_norm': 0.969656792964338, 'learning_rate': 3.094031586951596e-08, 'epoch': 0.97} 97%|█████████▋| 33100/34278 [36:17:16<1:10:11, 3.58s/it] 97%|█████████▋| 33101/34278 [36:17:19<1:11:46, 3.66s/it] {'loss': 0.0964, 'grad_norm': 0.7280858817022674, 'learning_rate': 3.0887862047144227e-08, 'epoch': 0.97} 97%|█████████▋| 33101/34278 
[36:17:19<1:11:46, 3.66s/it] 97%|█████████▋| 33102/34278 [36:17:23<1:13:40, 3.76s/it] {'loss': 0.1356, 'grad_norm': 0.80575138456284, 'learning_rate': 3.083545258781961e-08, 'epoch': 0.97} 97%|█████████▋| 33102/34278 [36:17:23<1:13:40, 3.76s/it] 97%|█████████▋| 33103/34278 [36:17:27<1:09:30, 3.55s/it] {'loss': 0.1038, 'grad_norm': 0.9630461123158345, 'learning_rate': 3.078308749200953e-08, 'epoch': 0.97} 97%|█████████▋| 33103/34278 [36:17:27<1:09:30, 3.55s/it] 97%|█████████▋| 33104/34278 [36:17:30<1:09:17, 3.54s/it] {'loss': 0.1202, 'grad_norm': 0.7824225056970916, 'learning_rate': 3.0730766760182494e-08, 'epoch': 0.97} 97%|█████████▋| 33104/34278 [36:17:30<1:09:17, 3.54s/it] 97%|█████████▋| 33105/34278 [36:17:33<1:06:11, 3.39s/it] {'loss': 0.1011, 'grad_norm': 0.7243585512474819, 'learning_rate': 3.067849039280424e-08, 'epoch': 0.97} 97%|█████████▋| 33105/34278 [36:17:33<1:06:11, 3.39s/it] 97%|█████████▋| 33106/34278 [36:17:39<1:20:56, 4.14s/it] {'loss': 0.1141, 'grad_norm': 0.8562293576233819, 'learning_rate': 3.0626258390342165e-08, 'epoch': 0.97} 97%|█████████▋| 33106/34278 [36:17:39<1:20:56, 4.14s/it] 97%|█████████▋| 33107/34278 [36:17:42<1:13:40, 3.77s/it] {'loss': 0.107, 'grad_norm': 0.7856195383148817, 'learning_rate': 3.057407075326258e-08, 'epoch': 0.97} 97%|█████████▋| 33107/34278 [36:17:42<1:13:40, 3.77s/it] 97%|█████████▋| 33108/34278 [36:17:47<1:19:39, 4.09s/it] {'loss': 0.1194, 'grad_norm': 1.3227329182781364, 'learning_rate': 3.052192748203175e-08, 'epoch': 0.97} 97%|█████████▋| 33108/34278 [36:17:47<1:19:39, 4.09s/it] 97%|█████████▋| 33109/34278 [36:17:50<1:14:55, 3.85s/it] {'loss': 0.1131, 'grad_norm': 0.8942257656405606, 'learning_rate': 3.046982857711434e-08, 'epoch': 0.97} 97%|█████████▋| 33109/34278 [36:17:50<1:14:55, 3.85s/it] 97%|█████████▋| 33110/34278 [36:17:54<1:14:38, 3.83s/it] {'loss': 0.1032, 'grad_norm': 0.8446760466420564, 'learning_rate': 3.0417774038976614e-08, 'epoch': 0.97} 97%|█████████▋| 33110/34278 [36:17:54<1:14:38, 3.83s/it] 
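The `{'loss': ..., 'grad_norm': ..., 'learning_rate': ..., 'epoch': ...}` records interleaved with the progress bars above can be recovered from the raw log for plotting or averaging. A minimal stdlib sketch (the regex and `parse_metrics` helper are assumptions for post-hoc log analysis, not part of the training code, and only handle the four-key records shown here):

```python
import re

# Matches the four-key metric dicts emitted between tqdm progress updates.
LOG_RE = re.compile(
    r"\{'loss': (?P<loss>[\d.]+), 'grad_norm': (?P<grad>[\d.]+), "
    r"'learning_rate': (?P<lr>[\d.eE+-]+), 'epoch': (?P<epoch>[\d.]+)\}"
)

def parse_metrics(log_text):
    """Extract every metric record from a raw training-log string."""
    return [
        {
            "loss": float(m["loss"]),
            "grad_norm": float(m["grad"]),
            "learning_rate": float(m["lr"]),
            "epoch": float(m["epoch"]),
        }
        for m in LOG_RE.finditer(log_text)
    ]
```

Because tqdm re-prints each step line after the metric dict, parsing the dicts (rather than the progress bars) yields exactly one record per optimizer step.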
97%|█████████▋| 33111/34278 [36:18:00<1:28:12, 4.54s/it] {'loss': 0.0995, 'grad_norm': 0.656919495620798, 'learning_rate': 3.0365763868082096e-08, 'epoch': 0.97} 97%|█████████▋| 33111/34278 [36:18:00<1:28:12, 4.54s/it] 97%|█████████▋| 33112/34278 [36:18:03<1:19:27, 4.09s/it] {'loss': 0.1018, 'grad_norm': 0.6938332147972859, 'learning_rate': 3.031379806489598e-08, 'epoch': 0.97} 97%|█████████▋| 33112/34278 [36:18:03<1:19:27, 4.09s/it] 97%|█████████▋| 33113/34278 [36:18:06<1:12:08, 3.72s/it] {'loss': 0.1079, 'grad_norm': 0.8633441793085521, 'learning_rate': 3.026187662988178e-08, 'epoch': 0.97} 97%|█████████▋| 33113/34278 [36:18:06<1:12:08, 3.72s/it] 97%|█████████▋| 33114/34278 [36:18:09<1:08:05, 3.51s/it] {'loss': 0.1093, 'grad_norm': 0.9591116687070425, 'learning_rate': 3.0209999563503015e-08, 'epoch': 0.97} 97%|█████████▋| 33114/34278 [36:18:09<1:08:05, 3.51s/it] 97%|█████████▋| 33115/34278 [36:18:13<1:10:40, 3.65s/it] {'loss': 0.1051, 'grad_norm': 0.7233070936573648, 'learning_rate': 3.015816686622319e-08, 'epoch': 0.97} 97%|█████████▋| 33115/34278 [36:18:13<1:10:40, 3.65s/it] 97%|█████████▋| 33116/34278 [36:18:16<1:10:31, 3.64s/it] {'loss': 0.1113, 'grad_norm': 0.8609837590828512, 'learning_rate': 3.010637853850473e-08, 'epoch': 0.97} 97%|█████████▋| 33116/34278 [36:18:16<1:10:31, 3.64s/it] 97%|█████████▋| 33117/34278 [36:18:20<1:08:19, 3.53s/it] {'loss': 0.0928, 'grad_norm': 0.8313443937140413, 'learning_rate': 3.0054634580810594e-08, 'epoch': 0.97} 97%|█████████▋| 33117/34278 [36:18:20<1:08:19, 3.53s/it] 97%|█████████▋| 33118/34278 [36:18:23<1:06:19, 3.43s/it] {'loss': 0.0932, 'grad_norm': 0.8196854774424199, 'learning_rate': 3.000293499360207e-08, 'epoch': 0.97} 97%|█████████▋| 33118/34278 [36:18:23<1:06:19, 3.43s/it] 97%|█████████▋| 33119/34278 [36:18:27<1:12:04, 3.73s/it] {'loss': 0.1083, 'grad_norm': 0.7198436893292882, 'learning_rate': 2.995127977734047e-08, 'epoch': 0.97} 97%|█████████▋| 33119/34278 [36:18:27<1:12:04, 3.73s/it] 97%|█████████▋| 
33120/34278 [36:18:31<1:08:50, 3.57s/it] {'loss': 0.0956, 'grad_norm': 0.9496831099723301, 'learning_rate': 2.9899668932487636e-08, 'epoch': 0.97} 97%|█████████▋| 33120/34278 [36:18:31<1:08:50, 3.57s/it] 97%|█████████▋| 33121/34278 [36:18:34<1:08:55, 3.57s/it] {'loss': 0.0781, 'grad_norm': 0.7647315277194521, 'learning_rate': 2.984810245950431e-08, 'epoch': 0.97} 97%|█████████▋| 33121/34278 [36:18:34<1:08:55, 3.57s/it] 97%|█████████▋| 33122/34278 [36:18:39<1:15:15, 3.91s/it] {'loss': 0.1247, 'grad_norm': 0.8740475248357543, 'learning_rate': 2.9796580358850134e-08, 'epoch': 0.97} 97%|█████████▋| 33122/34278 [36:18:39<1:15:15, 3.91s/it] 97%|█████████▋| 33123/34278 [36:18:42<1:12:57, 3.79s/it] {'loss': 0.1056, 'grad_norm': 0.8668125591352983, 'learning_rate': 2.9745102630985844e-08, 'epoch': 0.97} 97%|█████████▋| 33123/34278 [36:18:42<1:12:57, 3.79s/it] 97%|█████████▋| 33124/34278 [36:18:46<1:10:17, 3.65s/it] {'loss': 0.1337, 'grad_norm': 0.8427167068623925, 'learning_rate': 2.9693669276371073e-08, 'epoch': 0.97} 97%|█████████▋| 33124/34278 [36:18:46<1:10:17, 3.65s/it] 97%|█████████▋| 33125/34278 [36:18:49<1:08:16, 3.55s/it] {'loss': 0.1255, 'grad_norm': 0.9657409626030692, 'learning_rate': 2.96422802954649e-08, 'epoch': 0.97} 97%|█████████▋| 33125/34278 [36:18:49<1:08:16, 3.55s/it] 97%|█████████▋| 33126/34278 [36:18:53<1:11:11, 3.71s/it] {'loss': 0.1361, 'grad_norm': 0.8345508028686581, 'learning_rate': 2.9590935688725288e-08, 'epoch': 0.97} 97%|█████████▋| 33126/34278 [36:18:53<1:11:11, 3.71s/it] 97%|█████████▋| 33127/34278 [36:18:56<1:08:31, 3.57s/it] {'loss': 0.0841, 'grad_norm': 1.0172467489185282, 'learning_rate': 2.9539635456611872e-08, 'epoch': 0.97} 97%|█████████▋| 33127/34278 [36:18:56<1:08:31, 3.57s/it] 97%|█████████▋| 33128/34278 [36:19:00<1:07:49, 3.54s/it] {'loss': 0.1381, 'grad_norm': 0.8745450174531825, 'learning_rate': 2.9488379599581507e-08, 'epoch': 0.97} 97%|█████████▋| 33128/34278 [36:19:00<1:07:49, 3.54s/it] 97%|█████████▋| 33129/34278 
[36:19:04<1:12:37, 3.79s/it] {'loss': 0.1123, 'grad_norm': 1.157553950817821, 'learning_rate': 2.943716811809272e-08, 'epoch': 0.97} 97%|█████████▋| 33129/34278 [36:19:04<1:12:37, 3.79s/it] 97%|█████████▋| 33130/34278 [36:19:07<1:09:23, 3.63s/it] {'loss': 0.1208, 'grad_norm': 0.8153465544904627, 'learning_rate': 2.9386001012601805e-08, 'epoch': 0.97} 97%|█████████▋| 33130/34278 [36:19:07<1:09:23, 3.63s/it] 97%|█████████▋| 33131/34278 [36:19:10<1:06:02, 3.45s/it] {'loss': 0.1063, 'grad_norm': 0.7524619022845139, 'learning_rate': 2.9334878283566737e-08, 'epoch': 0.97} 97%|█████████▋| 33131/34278 [36:19:11<1:06:02, 3.45s/it] 97%|█████████▋| 33132/34278 [36:19:13<1:01:07, 3.20s/it] {'loss': 0.1001, 'grad_norm': 0.8994755021243348, 'learning_rate': 2.9283799931442704e-08, 'epoch': 0.97} 97%|█████████▋| 33132/34278 [36:19:13<1:01:07, 3.20s/it] 97%|█████████▋| 33133/34278 [36:19:19<1:16:43, 4.02s/it] {'loss': 0.1077, 'grad_norm': 0.7204162617885567, 'learning_rate': 2.923276595668656e-08, 'epoch': 0.97} 97%|█████████▋| 33133/34278 [36:19:19<1:16:43, 4.02s/it] 97%|█████████▋| 33134/34278 [36:19:22<1:13:28, 3.85s/it] {'loss': 0.1395, 'grad_norm': 0.8019148939981262, 'learning_rate': 2.9181776359754054e-08, 'epoch': 0.97} 97%|█████████▋| 33134/34278 [36:19:22<1:13:28, 3.85s/it] 97%|█████████▋| 33135/34278 [36:19:28<1:22:25, 4.33s/it] {'loss': 0.0897, 'grad_norm': 0.9545709462728342, 'learning_rate': 2.9130831141099268e-08, 'epoch': 0.97} 97%|█████████▋| 33135/34278 [36:19:28<1:22:25, 4.33s/it] 97%|█████████▋| 33136/34278 [36:19:31<1:17:55, 4.09s/it] {'loss': 0.101, 'grad_norm': 0.7268565055547692, 'learning_rate': 2.907993030117795e-08, 'epoch': 0.97} 97%|█████████▋| 33136/34278 [36:19:31<1:17:55, 4.09s/it] 97%|█████████▋| 33137/34278 [36:19:35<1:16:10, 4.01s/it] {'loss': 0.1089, 'grad_norm': 0.7361034516151381, 'learning_rate': 2.9029073840444733e-08, 'epoch': 0.97} 97%|█████████▋| 33137/34278 [36:19:35<1:16:10, 4.01s/it] 97%|█████████▋| 33138/34278 [36:19:39<1:13:22, 
3.86s/it] {'loss': 0.1053, 'grad_norm': 1.0953714849920393, 'learning_rate': 2.89782617593537e-08, 'epoch': 0.97} 97%|█████████▋| 33138/34278 [36:19:39<1:13:22, 3.86s/it] 97%|█████████▋| 33139/34278 [36:19:43<1:12:24, 3.81s/it] {'loss': 0.0999, 'grad_norm': 0.7727801956727839, 'learning_rate': 2.8927494058357265e-08, 'epoch': 0.97} 97%|█████████▋| 33139/34278 [36:19:43<1:12:24, 3.81s/it] 97%|█████████▋| 33140/34278 [36:19:48<1:24:31, 4.46s/it] {'loss': 0.0884, 'grad_norm': 0.9204105396856075, 'learning_rate': 2.887677073790951e-08, 'epoch': 0.97} 97%|█████████▋| 33140/34278 [36:19:48<1:24:31, 4.46s/it] 97%|█████████▋| 33141/34278 [36:19:51<1:16:16, 4.02s/it] {'loss': 0.0931, 'grad_norm': 0.7203501760453466, 'learning_rate': 2.882609179846341e-08, 'epoch': 0.97} 97%|█████████▋| 33141/34278 [36:19:51<1:16:16, 4.02s/it] 97%|█████████▋| 33142/34278 [36:19:54<1:09:16, 3.66s/it] {'loss': 0.1241, 'grad_norm': 0.7463886266732962, 'learning_rate': 2.877545724047137e-08, 'epoch': 0.97} 97%|█████████▋| 33142/34278 [36:19:54<1:09:16, 3.66s/it] 97%|█████████▋| 33143/34278 [36:19:57<1:06:21, 3.51s/it] {'loss': 0.0919, 'grad_norm': 0.693979411774402, 'learning_rate': 2.8724867064385265e-08, 'epoch': 0.97} 97%|█████████▋| 33143/34278 [36:19:57<1:06:21, 3.51s/it] 97%|█████████▋| 33144/34278 [36:20:01<1:05:53, 3.49s/it] {'loss': 0.1145, 'grad_norm': 0.7106745996989098, 'learning_rate': 2.8674321270656946e-08, 'epoch': 0.97} 97%|█████████▋| 33144/34278 [36:20:01<1:05:53, 3.49s/it] 97%|█████████▋| 33145/34278 [36:20:06<1:13:24, 3.89s/it] {'loss': 0.1155, 'grad_norm': 0.8100047697571606, 'learning_rate': 2.8623819859737168e-08, 'epoch': 0.97} 97%|█████████▋| 33145/34278 [36:20:06<1:13:24, 3.89s/it] 97%|█████████▋| 33146/34278 [36:20:09<1:10:39, 3.75s/it] {'loss': 0.098, 'grad_norm': 0.7790447099378133, 'learning_rate': 2.8573362832077234e-08, 'epoch': 0.97} 97%|█████████▋| 33146/34278 [36:20:09<1:10:39, 3.75s/it] 97%|█████████▋| 33147/34278 [36:20:13<1:08:37, 3.64s/it] {'loss': 0.1144, 
'grad_norm': 0.7495023717099109, 'learning_rate': 2.8522950188127342e-08, 'epoch': 0.97} 97%|█████████▋| 33147/34278 [36:20:13<1:08:37, 3.64s/it] 97%|█████████▋| 33148/34278 [36:20:18<1:20:28, 4.27s/it] {'loss': 0.1241, 'grad_norm': 0.8861864894397857, 'learning_rate': 2.847258192833824e-08, 'epoch': 0.97} 97%|█████████▋| 33148/34278 [36:20:18<1:20:28, 4.27s/it] 97%|█████████▋| 33149/34278 [36:20:21<1:13:09, 3.89s/it] {'loss': 0.1134, 'grad_norm': 1.1470730068790005, 'learning_rate': 2.8422258053159014e-08, 'epoch': 0.97} 97%|█████████▋| 33149/34278 [36:20:21<1:13:09, 3.89s/it] 97%|█████████▋| 33150/34278 [36:20:27<1:24:19, 4.49s/it] {'loss': 0.1124, 'grad_norm': 0.7778296006944421, 'learning_rate': 2.8371978563039304e-08, 'epoch': 0.97} 97%|█████████▋| 33150/34278 [36:20:27<1:24:19, 4.49s/it] 97%|█████████▋| 33151/34278 [36:20:31<1:22:16, 4.38s/it] {'loss': 0.1219, 'grad_norm': 0.648556325252706, 'learning_rate': 2.8321743458427087e-08, 'epoch': 0.97} 97%|█████████▋| 33151/34278 [36:20:31<1:22:16, 4.38s/it] 97%|█████████▋| 33152/34278 [36:20:35<1:18:36, 4.19s/it] {'loss': 0.1121, 'grad_norm': 0.8489118138471465, 'learning_rate': 2.8271552739772e-08, 'epoch': 0.97} 97%|█████████▋| 33152/34278 [36:20:35<1:18:36, 4.19s/it] 97%|█████████▋| 33153/34278 [36:20:38<1:13:20, 3.91s/it] {'loss': 0.1003, 'grad_norm': 1.0260411798809466, 'learning_rate': 2.8221406407521466e-08, 'epoch': 0.97} 97%|█████████▋| 33153/34278 [36:20:38<1:13:20, 3.91s/it] 97%|█████████▋| 33154/34278 [36:20:41<1:05:45, 3.51s/it] {'loss': 0.1063, 'grad_norm': 0.7700507317507708, 'learning_rate': 2.817130446212346e-08, 'epoch': 0.97} 97%|█████████▋| 33154/34278 [36:20:41<1:05:45, 3.51s/it] 97%|█████████▋| 33155/34278 [36:20:45<1:06:33, 3.56s/it] {'loss': 0.1049, 'grad_norm': 0.8193315575784067, 'learning_rate': 2.81212469040254e-08, 'epoch': 0.97} 97%|█████████▋| 33155/34278 [36:20:45<1:06:33, 3.56s/it] 97%|█████████▋| 33156/34278 [36:20:48<1:05:42, 3.51s/it] {'loss': 0.1234, 'grad_norm': 
0.900771176844274, 'learning_rate': 2.8071233733673597e-08, 'epoch': 0.97} 97%|█████████▋| 33156/34278 [36:20:48<1:05:42, 3.51s/it] 97%|█████████▋| 33157/34278 [36:20:51<1:02:31, 3.35s/it] {'loss': 0.0932, 'grad_norm': 0.9560339972914657, 'learning_rate': 2.8021264951514916e-08, 'epoch': 0.97} 97%|█████████▋| 33157/34278 [36:20:51<1:02:31, 3.35s/it] 97%|█████████▋| 33158/34278 [36:20:57<1:17:46, 4.17s/it] {'loss': 0.1132, 'grad_norm': 0.6546265651253206, 'learning_rate': 2.7971340557995665e-08, 'epoch': 0.97} 97%|█████████▋| 33158/34278 [36:20:57<1:17:46, 4.17s/it] 97%|█████████▋| 33159/34278 [36:21:00<1:12:26, 3.88s/it] {'loss': 0.1238, 'grad_norm': 1.0290057482398285, 'learning_rate': 2.7921460553561042e-08, 'epoch': 0.97} 97%|█████████▋| 33159/34278 [36:21:00<1:12:26, 3.88s/it] 97%|█████████▋| 33160/34278 [36:21:03<1:06:29, 3.57s/it] {'loss': 0.1157, 'grad_norm': 0.9428544993149429, 'learning_rate': 2.7871624938656805e-08, 'epoch': 0.97} 97%|█████████▋| 33160/34278 [36:21:03<1:06:29, 3.57s/it] 97%|█████████▋| 33161/34278 [36:21:09<1:20:00, 4.30s/it] {'loss': 0.1169, 'grad_norm': 1.0415995121814428, 'learning_rate': 2.7821833713728152e-08, 'epoch': 0.97} 97%|█████████▋| 33161/34278 [36:21:09<1:20:00, 4.30s/it] 97%|█████████▋| 33162/34278 [36:21:12<1:12:14, 3.88s/it] {'loss': 0.1206, 'grad_norm': 0.6790913256221224, 'learning_rate': 2.7772086879218617e-08, 'epoch': 0.97} 97%|█████████▋| 33162/34278 [36:21:12<1:12:14, 3.88s/it] 97%|█████████▋| 33163/34278 [36:21:15<1:06:54, 3.60s/it] {'loss': 0.0954, 'grad_norm': 0.7560335334944387, 'learning_rate': 2.77223844355734e-08, 'epoch': 0.97} 97%|█████████▋| 33163/34278 [36:21:15<1:06:54, 3.60s/it] 97%|█████████▋| 33164/34278 [36:21:18<1:02:55, 3.39s/it] {'loss': 0.105, 'grad_norm': 0.9423620708381659, 'learning_rate': 2.7672726383235482e-08, 'epoch': 0.97} 97%|█████████▋| 33164/34278 [36:21:18<1:02:55, 3.39s/it] 97%|█████████▋| 33165/34278 [36:21:22<1:06:38, 3.59s/it] {'loss': 0.1118, 'grad_norm': 0.7353544534745028, 
'learning_rate': 2.7623112722648394e-08, 'epoch': 0.97} 97%|█████████▋| 33165/34278 [36:21:22<1:06:38, 3.59s/it] 97%|█████████▋| 33166/34278 [36:21:25<1:03:27, 3.42s/it] {'loss': 0.1131, 'grad_norm': 0.8067071208282636, 'learning_rate': 2.757354345425567e-08, 'epoch': 0.97} 97%|█████████▋| 33166/34278 [36:21:25<1:03:27, 3.42s/it] 97%|█████████▋| 33167/34278 [36:21:28<1:02:16, 3.36s/it] {'loss': 0.1231, 'grad_norm': 0.9932651772818989, 'learning_rate': 2.7524018578498623e-08, 'epoch': 0.97} 97%|█████████▋| 33167/34278 [36:21:28<1:02:16, 3.36s/it] 97%|█████████▋| 33168/34278 [36:21:33<1:11:13, 3.85s/it] {'loss': 0.1231, 'grad_norm': 0.8114355437385394, 'learning_rate': 2.7474538095820792e-08, 'epoch': 0.97} 97%|█████████▋| 33168/34278 [36:21:33<1:11:13, 3.85s/it] 97%|█████████▋| 33169/34278 [36:21:39<1:20:32, 4.36s/it] {'loss': 0.1324, 'grad_norm': 0.8220403344879893, 'learning_rate': 2.742510200666293e-08, 'epoch': 0.97} 97%|█████████▋| 33169/34278 [36:21:39<1:20:32, 4.36s/it] 97%|█████████▋| 33170/34278 [36:21:42<1:12:53, 3.95s/it] {'loss': 0.1029, 'grad_norm': 0.9237094451487354, 'learning_rate': 2.737571031146691e-08, 'epoch': 0.97} 97%|█████████▋| 33170/34278 [36:21:42<1:12:53, 3.95s/it] 97%|█████████▋| 33171/34278 [36:21:45<1:07:23, 3.65s/it] {'loss': 0.1103, 'grad_norm': 0.8566931516908737, 'learning_rate': 2.732636301067293e-08, 'epoch': 0.97} 97%|█████████▋| 33171/34278 [36:21:45<1:07:23, 3.65s/it] 97%|█████████▋| 33172/34278 [36:21:48<1:06:20, 3.60s/it] {'loss': 0.1302, 'grad_norm': 0.8089451056696318, 'learning_rate': 2.7277060104722865e-08, 'epoch': 0.97} 97%|█████████▋| 33172/34278 [36:21:48<1:06:20, 3.60s/it] 97%|█████████▋| 33173/34278 [36:21:51<1:05:00, 3.53s/it] {'loss': 0.1215, 'grad_norm': 0.8801491851632947, 'learning_rate': 2.722780159405525e-08, 'epoch': 0.97} 97%|█████████▋| 33173/34278 [36:21:51<1:05:00, 3.53s/it] 97%|█████████▋| 33174/34278 [36:21:54<1:01:11, 3.33s/it] {'loss': 0.0923, 'grad_norm': 0.894810803757725, 'learning_rate': 
2.7178587479111397e-08, 'epoch': 0.97} 97%|█████████▋| 33174/34278 [36:21:54<1:01:11, 3.33s/it] 97%|█████████▋| 33175/34278 [36:22:00<1:16:45, 4.18s/it] {'loss': 0.1213, 'grad_norm': 0.920052125297922, 'learning_rate': 2.7129417760329846e-08, 'epoch': 0.97} 97%|█████████▋| 33175/34278 [36:22:00<1:16:45, 4.18s/it] 97%|█████████▋| 33176/34278 [36:22:03<1:10:14, 3.82s/it] {'loss': 0.1189, 'grad_norm': 0.9199774232586273, 'learning_rate': 2.708029243814969e-08, 'epoch': 0.97} 97%|█████████▋| 33176/34278 [36:22:03<1:10:14, 3.82s/it] 97%|█████████▋| 33177/34278 [36:22:09<1:21:58, 4.47s/it] {'loss': 0.1001, 'grad_norm': 0.776353475684395, 'learning_rate': 2.703121151300947e-08, 'epoch': 0.97} 97%|█████████▋| 33177/34278 [36:22:09<1:21:58, 4.47s/it] 97%|█████████▋| 33178/34278 [36:22:12<1:12:07, 3.93s/it] {'loss': 0.1248, 'grad_norm': 0.9131735148224904, 'learning_rate': 2.698217498534772e-08, 'epoch': 0.97} 97%|█████████▋| 33178/34278 [36:22:12<1:12:07, 3.93s/it] 97%|█████████▋| 33179/34278 [36:22:15<1:06:25, 3.63s/it] {'loss': 0.1014, 'grad_norm': 0.7870175820259996, 'learning_rate': 2.693318285560187e-08, 'epoch': 0.97} 97%|█████████▋| 33179/34278 [36:22:15<1:06:25, 3.63s/it] 97%|█████████▋| 33180/34278 [36:22:18<1:01:29, 3.36s/it] {'loss': 0.1041, 'grad_norm': 0.8996599779335045, 'learning_rate': 2.6884235124209345e-08, 'epoch': 0.97} 97%|█████████▋| 33180/34278 [36:22:18<1:01:29, 3.36s/it] 97%|█████████▋| 33181/34278 [36:22:21<59:08, 3.23s/it] {'loss': 0.134, 'grad_norm': 0.8984752240465178, 'learning_rate': 2.6835331791607023e-08, 'epoch': 0.97} 97%|█████████▋| 33181/34278 [36:22:21<59:08, 3.23s/it] 97%|█████████▋| 33182/34278 [36:22:25<1:03:14, 3.46s/it] {'loss': 0.1004, 'grad_norm': 0.7918699501152622, 'learning_rate': 2.6786472858231772e-08, 'epoch': 0.97} 97%|█████████▋| 33182/34278 [36:22:25<1:03:14, 3.46s/it] 97%|█████████▋| 33183/34278 [36:22:28<1:03:33, 3.48s/it] {'loss': 0.1131, 'grad_norm': 0.7766932154413836, 'learning_rate': 2.673765832451991e-08, 
'epoch': 0.97} 97%|█████████▋| 33183/34278 [36:22:28<1:03:33, 3.48s/it] 97%|█████████▋| 33184/34278 [36:22:32<1:07:09, 3.68s/it] {'loss': 0.0971, 'grad_norm': 1.3130603270873022, 'learning_rate': 2.6688888190906647e-08, 'epoch': 0.97} 97%|█████████▋| 33184/34278 [36:22:32<1:07:09, 3.68s/it] 97%|█████████▋| 33185/34278 [36:22:36<1:05:48, 3.61s/it] {'loss': 0.1051, 'grad_norm': 0.9150769599954006, 'learning_rate': 2.66401624578283e-08, 'epoch': 0.97} 97%|█████████▋| 33185/34278 [36:22:36<1:05:48, 3.61s/it] 97%|█████████▋| 33186/34278 [36:22:39<1:02:50, 3.45s/it] {'loss': 0.1002, 'grad_norm': 0.9161269213580724, 'learning_rate': 2.6591481125718967e-08, 'epoch': 0.97} 97%|█████████▋| 33186/34278 [36:22:39<1:02:50, 3.45s/it] 97%|█████████▋| 33187/34278 [36:22:42<1:01:53, 3.40s/it] {'loss': 0.1066, 'grad_norm': 0.6980169049739354, 'learning_rate': 2.6542844195013297e-08, 'epoch': 0.97} 97%|█████████▋| 33187/34278 [36:22:42<1:01:53, 3.40s/it] 97%|█████████▋| 33188/34278 [36:22:45<1:00:33, 3.33s/it] {'loss': 0.092, 'grad_norm': 0.7662191903971364, 'learning_rate': 2.6494251666146497e-08, 'epoch': 0.97} 97%|█████████▋| 33188/34278 [36:22:45<1:00:33, 3.33s/it] 97%|█████████▋| 33189/34278 [36:22:48<59:14, 3.26s/it] {'loss': 0.1104, 'grad_norm': 0.9375710003997315, 'learning_rate': 2.644570353955156e-08, 'epoch': 0.97} 97%|█████████▋| 33189/34278 [36:22:48<59:14, 3.26s/it] 97%|█████████▋| 33190/34278 [36:22:52<1:00:05, 3.31s/it] {'loss': 0.1005, 'grad_norm': 0.6504484474349596, 'learning_rate': 2.6397199815662022e-08, 'epoch': 0.97} 97%|█████████▋| 33190/34278 [36:22:52<1:00:05, 3.31s/it] 97%|█████████▋| 33191/34278 [36:22:55<58:32, 3.23s/it] {'loss': 0.1198, 'grad_norm': 0.8032264552360474, 'learning_rate': 2.6348740494910875e-08, 'epoch': 0.97} 97%|█████████▋| 33191/34278 [36:22:55<58:32, 3.23s/it] 97%|█████████▋| 33192/34278 [36:22:58<55:57, 3.09s/it] {'loss': 0.11, 'grad_norm': 0.9963163192859755, 'learning_rate': 2.6300325577731102e-08, 'epoch': 0.97} 97%|█████████▋| 
33192/34278 [36:22:58<55:57, 3.09s/it] 97%|█████████▋| 33193/34278 [36:23:04<1:11:53, 3.98s/it] {'loss': 0.1056, 'grad_norm': 0.9106261616915791, 'learning_rate': 2.625195506455458e-08, 'epoch': 0.97} 97%|█████████▋| 33193/34278 [36:23:04<1:11:53, 3.98s/it] 97%|█████████▋| 33194/34278 [36:23:07<1:05:39, 3.63s/it] {'loss': 0.1141, 'grad_norm': 2.154062381770336, 'learning_rate': 2.6203628955813188e-08, 'epoch': 0.97} 97%|█████████▋| 33194/34278 [36:23:07<1:05:39, 3.63s/it] 97%|█████████▋| 33195/34278 [36:23:10<1:03:43, 3.53s/it] {'loss': 0.1288, 'grad_norm': 1.0892300851194652, 'learning_rate': 2.6155347251938247e-08, 'epoch': 0.97} 97%|█████████▋| 33195/34278 [36:23:10<1:03:43, 3.53s/it] 97%|█████████▋| 33196/34278 [36:23:13<1:00:00, 3.33s/it] {'loss': 0.1041, 'grad_norm': 0.8808649017784892, 'learning_rate': 2.610710995336163e-08, 'epoch': 0.97} 97%|█████████▋| 33196/34278 [36:23:13<1:00:00, 3.33s/it] 97%|█████████▋| 33197/34278 [36:23:16<57:41, 3.20s/it] {'loss': 0.0885, 'grad_norm': 0.875432106744955, 'learning_rate': 2.6058917060513002e-08, 'epoch': 0.97} 97%|█████████▋| 33197/34278 [36:23:16<57:41, 3.20s/it] 97%|█████████▋| 33198/34278 [36:23:19<56:45, 3.15s/it] {'loss': 0.1057, 'grad_norm': 0.7705466745780121, 'learning_rate': 2.601076857382312e-08, 'epoch': 0.97} 97%|█████████▋| 33198/34278 [36:23:19<56:45, 3.15s/it] 97%|█████████▋| 33199/34278 [36:23:22<55:46, 3.10s/it] {'loss': 0.1077, 'grad_norm': 0.7848633989472745, 'learning_rate': 2.5962664493721646e-08, 'epoch': 0.97} 97%|█████████▋| 33199/34278 [36:23:22<55:46, 3.10s/it] 97%|█████████▋| 33200/34278 [36:23:26<1:02:08, 3.46s/it] {'loss': 0.105, 'grad_norm': 0.9357134369685914, 'learning_rate': 2.5914604820638233e-08, 'epoch': 0.97} 97%|█████████▋| 33200/34278 [36:23:26<1:02:08, 3.46s/it] 97%|█████████▋| 33201/34278 [36:23:32<1:15:41, 4.22s/it] {'loss': 0.1006, 'grad_norm': 1.0718354077973316, 'learning_rate': 2.5866589555001432e-08, 'epoch': 0.97} 97%|█████████▋| 33201/34278 [36:23:32<1:15:41, 
4.22s/it] 97%|█████████▋| 33202/34278 [36:23:35<1:11:37, 3.99s/it] {'loss': 0.1132, 'grad_norm': 0.9688733295168274, 'learning_rate': 2.5818618697240337e-08, 'epoch': 0.97} 97%|█████████▋| 33202/34278 [36:23:35<1:11:37, 3.99s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fb7e7f858f0>
Failed to fetch sample 2701716.
Exception: cannot identify image file <_io.BytesIO object at 0x7fb7e7f858f0> 97%|█████████▋| 33203/34278 [36:23:39<1:10:50, 3.95s/it] {'loss': 0.0975, 'grad_norm': 0.8703435809535759, 'learning_rate': 2.5770692247783502e-08, 'epoch': 0.97} 97%|█████████▋| 33203/34278 [36:23:39<1:10:50, 3.95s/it] 97%|█████████▋| 33204/34278 [36:23:42<1:05:53, 3.68s/it] {'loss': 0.0993, 'grad_norm': 0.7218303744110426, 'learning_rate': 2.5722810207058356e-08, 'epoch': 0.97} 97%|█████████▋| 33204/34278 [36:23:42<1:05:53, 3.68s/it] 97%|█████████▋| 33205/34278 [36:23:45<1:00:57, 3.41s/it] {'loss': 0.1138, 'grad_norm': 0.7759747512927662, 'learning_rate': 2.5674972575492896e-08, 'epoch': 0.97} 97%|█████████▋| 33205/34278 [36:23:45<1:00:57, 3.41s/it] 97%|█████████▋| 33206/34278 [36:23:51<1:14:32, 4.17s/it] {'loss': 0.1186, 'grad_norm': 0.8570026436756201, 'learning_rate': 2.562717935351289e-08, 'epoch': 0.97} 97%|█████████▋| 33206/34278 [36:23:51<1:14:32, 4.17s/it] 97%|█████████▋| 33207/34278 [36:23:57<1:22:07, 4.60s/it] {'loss': 0.0901, 'grad_norm': 0.7398037588879186, 'learning_rate': 2.5579430541546324e-08, 'epoch': 0.97} 97%|█████████▋| 33207/34278 [36:23:57<1:22:07, 4.60s/it] 97%|█████████▋| 33208/34278 [36:24:00<1:15:26, 4.23s/it] {'loss': 0.1161, 'grad_norm': 0.7236714997816808, 'learning_rate': 2.553172614001953e-08, 'epoch': 0.97} 97%|█████████▋| 33208/34278 [36:24:00<1:15:26, 4.23s/it] 97%|█████████▋| 33209/34278 [36:24:03<1:08:53, 3.87s/it] {'loss': 0.1117, 'grad_norm': 1.2607678151739428, 'learning_rate': 2.5484066149357723e-08, 'epoch': 0.97} 97%|█████████▋| 33209/34278 [36:24:03<1:08:53, 3.87s/it] 97%|█████████▋| 33210/34278 [36:24:06<1:06:15, 3.72s/it] {'loss': 0.0882, 'grad_norm': 0.9101920600804906, 'learning_rate': 2.543645056998667e-08, 'epoch': 0.97} 97%|█████████▋| 33210/34278 [36:24:06<1:06:15, 3.72s/it] 97%|█████████▋| 33211/34278 [36:24:10<1:03:45, 3.58s/it] {'loss': 0.1143, 'grad_norm': 0.8108032443512756, 'learning_rate': 2.538887940233159e-08, 'epoch': 0.97} 
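The traceback above shows corrupt or truncated image bytes in a training shard: `Image.open` raises `PIL.UnidentifiedImageError`, the dataset logs `Failed to fetch sample …`, and training continues at the next step. A minimal sketch of that fault-tolerant loading pattern is below; `safe_pil_loader` and `fetch_with_fallback` are illustrative names, not the actual `aguvis/dataset.py` code, and the random-resample retry is an assumption about how a replacement sample might be chosen.

```python
import io
import random
from PIL import Image, UnidentifiedImageError


def safe_pil_loader(img_bytes: bytes):
    """Decode raw bytes into an RGB PIL image, or return None if undecodable."""
    try:
        img = Image.open(io.BytesIO(img_bytes))
        img.load()  # force a full decode so corrupt data fails here, not mid-batch
        return img.convert("RGB")
    except (UnidentifiedImageError, OSError):
        return None


def fetch_with_fallback(get_item, index, num_samples, max_retries=10):
    """Try to load a sample; on failure, log it and retry with a random index,
    mirroring the 'Failed to fetch sample N' messages seen in this log."""
    for _ in range(max_retries):
        try:
            return get_item(index)
        except Exception as e:
            print(f"Failed to fetch sample {index}. Exception: {e}")
            index = random.randrange(num_samples)
    raise RuntimeError("exceeded retry budget; too many corrupt samples")
```

Keeping the decode (`img.load()`) inside the guarded loader is what lets the run proceed with only a log line per bad sample instead of crashing a 36-hour job at 97% completion.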
97%|█████████▋| 33211/34278 [36:24:10<1:03:45, 3.58s/it] 97%|█████████▋| 33212/34278 [36:24:13<1:01:44, 3.48s/it] {'loss': 0.1145, 'grad_norm': 0.7041602985171656, 'learning_rate': 2.5341352646816585e-08, 'epoch': 0.97} 97%|█████████▋| 33212/34278 [36:24:13<1:01:44, 3.48s/it] 97%|█████████▋| 33213/34278 [36:24:17<1:05:23, 3.68s/it] {'loss': 0.1186, 'grad_norm': 0.8401975422025159, 'learning_rate': 2.5293870303866876e-08, 'epoch': 0.97} 97%|█████████▋| 33213/34278 [36:24:17<1:05:23, 3.68s/it] 97%|█████████▋| 33214/34278 [36:24:21<1:05:34, 3.70s/it] {'loss': 0.1015, 'grad_norm': 0.8464830955458004, 'learning_rate': 2.5246432373906004e-08, 'epoch': 0.97} 97%|█████████▋| 33214/34278 [36:24:21<1:05:34, 3.70s/it] 97%|█████████▋| 33215/34278 [36:24:24<1:03:42, 3.60s/it] {'loss': 0.096, 'grad_norm': 0.8387548712600454, 'learning_rate': 2.5199038857357526e-08, 'epoch': 0.97} 97%|█████████▋| 33215/34278 [36:24:24<1:03:42, 3.60s/it] 97%|█████████▋| 33216/34278 [36:24:27<1:00:00, 3.39s/it] {'loss': 0.1151, 'grad_norm': 0.9199625599619493, 'learning_rate': 2.5151689754643883e-08, 'epoch': 0.97} 97%|█████████▋| 33216/34278 [36:24:27<1:00:00, 3.39s/it] 97%|█████████▋| 33217/34278 [36:24:30<1:00:04, 3.40s/it] {'loss': 0.1145, 'grad_norm': 1.3089082232519005, 'learning_rate': 2.5104385066188618e-08, 'epoch': 0.97} 97%|█████████▋| 33217/34278 [36:24:30<1:00:04, 3.40s/it] 97%|█████████▋| 33218/34278 [36:24:34<1:00:12, 3.41s/it] {'loss': 0.0957, 'grad_norm': 0.8277829418479505, 'learning_rate': 2.505712479241418e-08, 'epoch': 0.97} 97%|█████████▋| 33218/34278 [36:24:34<1:00:12, 3.41s/it] 97%|█████████▋| 33219/34278 [36:24:37<58:27, 3.31s/it] {'loss': 0.106, 'grad_norm': 0.6815338576041906, 'learning_rate': 2.5009908933741335e-08, 'epoch': 0.97} 97%|█████████▋| 33219/34278 [36:24:37<58:27, 3.31s/it] 97%|█████████▋| 33220/34278 [36:24:40<56:37, 3.21s/it] {'loss': 0.1094, 'grad_norm': 0.8492507818011145, 'learning_rate': 2.496273749059308e-08, 'epoch': 0.97} 97%|█████████▋| 33220/34278 
[36:24:40<56:37, 3.21s/it] 97%|█████████▋| 33221/34278 [36:24:43<57:32, 3.27s/it] {'loss': 0.0947, 'grad_norm': 0.775050926146816, 'learning_rate': 2.4915610463389637e-08, 'epoch': 0.97} 97%|█████████▋| 33221/34278 [36:24:43<57:32, 3.27s/it] 97%|█████████▋| 33222/34278 [36:24:49<1:11:21, 4.05s/it] {'loss': 0.1078, 'grad_norm': 0.7439142650091273, 'learning_rate': 2.486852785255178e-08, 'epoch': 0.97} 97%|█████████▋| 33222/34278 [36:24:49<1:11:21, 4.05s/it] 97%|█████████▋| 33223/34278 [36:24:52<1:05:32, 3.73s/it] {'loss': 0.126, 'grad_norm': 0.9900561762145669, 'learning_rate': 2.4821489658500286e-08, 'epoch': 0.97} 97%|█████████▋| 33223/34278 [36:24:52<1:05:32, 3.73s/it] 97%|█████████▋| 33224/34278 [36:24:55<1:01:49, 3.52s/it] {'loss': 0.113, 'grad_norm': 0.7834785723011627, 'learning_rate': 2.4774495881654813e-08, 'epoch': 0.97} 97%|█████████▋| 33224/34278 [36:24:55<1:01:49, 3.52s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f4af0807c40>
Failed to fetch sample 2854380.
Exception: cannot identify image file <_io.BytesIO object at 0x7f4af0807c40>
97%|█████████▋| 33225/34278 [36:24:58<59:35, 3.40s/it] {'loss': 0.0971, 'grad_norm': 0.7419422618761509, 'learning_rate': 2.472754652243503e-08, 'epoch': 0.97} 97%|█████████▋| 33225/34278 [36:24:58<59:35, 3.40s/it] 97%|█████████▋| 33226/34278 [36:25:02<1:01:37, 3.51s/it] {'loss': 0.1132, 'grad_norm': 1.0690866161387549, 'learning_rate': 2.468064158125949e-08, 'epoch': 0.97} 97%|█████████▋| 33226/34278 [36:25:02<1:01:37, 3.51s/it] 97%|█████████▋| 33227/34278 [36:25:07<1:10:26, 4.02s/it] {'loss': 0.139, 'grad_norm': 0.9161898550399602, 'learning_rate': 2.463378105854841e-08, 'epoch': 0.97} 97%|█████████▋| 33227/34278 [36:25:07<1:10:26, 4.02s/it] 97%|█████████▋| 33228/34278 [36:25:10<1:03:47, 3.65s/it] {'loss': 0.1007, 'grad_norm': 0.8379852469233645, 'learning_rate': 2.4586964954718683e-08, 'epoch': 0.97} 97%|█████████▋| 33228/34278 [36:25:10<1:03:47, 3.65s/it] 97%|█████████▋| 33229/34278 [36:25:13<1:02:06, 3.55s/it] {'loss': 0.1222, 'grad_norm': 0.7514237099956552, 'learning_rate': 2.454019327018886e-08, 'epoch': 0.97} 97%|█████████▋| 33229/34278 [36:25:13<1:02:06, 3.55s/it] 97%|█████████▋| 33230/34278 [36:25:18<1:05:12, 3.73s/it] {'loss': 0.1079, 'grad_norm': 0.8241317805620066, 'learning_rate': 2.449346600537639e-08, 'epoch': 0.97} 97%|█████████▋| 33230/34278 [36:25:18<1:05:12, 3.73s/it] 97%|█████████▋| 33231/34278 [36:25:22<1:09:01, 3.96s/it] {'loss': 0.1072, 'grad_norm': 0.9168777228764998, 'learning_rate': 2.4446783160698152e-08, 'epoch': 0.97} 97%|█████████▋| 33231/34278 [36:25:22<1:09:01, 3.96s/it] 97%|█████████▋| 33232/34278 [36:25:26<1:08:15, 3.92s/it] {'loss': 0.103, 'grad_norm':
0.8124714608903882, 'learning_rate': 2.440014473657215e-08, 'epoch': 0.97} 97%|█████████▋| 33232/34278 [36:25:26<1:08:15, 3.92s/it] 97%|█████████▋| 33233/34278 [36:25:29<1:04:57, 3.73s/it] {'loss': 0.1044, 'grad_norm': 0.7571477527597106, 'learning_rate': 2.4353550733413056e-08, 'epoch': 0.97} 97%|█████████▋| 33233/34278 [36:25:29<1:04:57, 3.73s/it] 97%|█████████▋| 33234/34278 [36:25:34<1:08:22, 3.93s/it] {'loss': 0.1065, 'grad_norm': 0.75046520235371, 'learning_rate': 2.430700115163831e-08, 'epoch': 0.97} 97%|█████████▋| 33234/34278 [36:25:34<1:08:22, 3.93s/it] 97%|█████████▋| 33235/34278 [36:25:37<1:05:19, 3.76s/it] {'loss': 0.0979, 'grad_norm': 0.8101537463858147, 'learning_rate': 2.426049599166258e-08, 'epoch': 0.97} 97%|█████████▋| 33235/34278 [36:25:37<1:05:19, 3.76s/it] 97%|█████████▋| 33236/34278 [36:25:40<1:02:12, 3.58s/it] {'loss': 0.1058, 'grad_norm': 0.9014119870855315, 'learning_rate': 2.4214035253901093e-08, 'epoch': 0.97} 97%|█████████▋| 33236/34278 [36:25:40<1:02:12, 3.58s/it] 97%|█████████▋| 33237/34278 [36:25:46<1:14:42, 4.31s/it] {'loss': 0.1026, 'grad_norm': 1.1299225630003138, 'learning_rate': 2.4167618938769066e-08, 'epoch': 0.97} 97%|█████████▋| 33237/34278 [36:25:46<1:14:42, 4.31s/it] 97%|█████████▋| 33238/34278 [36:25:50<1:12:09, 4.16s/it] {'loss': 0.1032, 'grad_norm': 0.9588030555201384, 'learning_rate': 2.4121247046681174e-08, 'epoch': 0.97} 97%|█████████▋| 33238/34278 [36:25:50<1:12:09, 4.16s/it] 97%|█████████▋| 33239/34278 [36:25:56<1:20:49, 4.67s/it] {'loss': 0.0923, 'grad_norm': 1.2022350808963467, 'learning_rate': 2.4074919578050415e-08, 'epoch': 0.97} 97%|█████████▋| 33239/34278 [36:25:56<1:20:49, 4.67s/it] 97%|█████████▋| 33240/34278 [36:25:59<1:13:47, 4.27s/it] {'loss': 0.1014, 'grad_norm': 0.8357029085217434, 'learning_rate': 2.4028636533290904e-08, 'epoch': 0.97} 97%|█████████▋| 33240/34278 [36:25:59<1:13:47, 4.27s/it] 97%|█████████▋| 33241/34278 [36:26:02<1:08:58, 3.99s/it] {'loss': 0.1172, 'grad_norm': 0.7351472449121997, 
'learning_rate': 2.3982397912816203e-08, 'epoch': 0.97} 97%|█████████▋| 33241/34278 [36:26:02<1:08:58, 3.99s/it] 97%|█████████▋| 33242/34278 [36:26:06<1:05:28, 3.79s/it] {'loss': 0.1098, 'grad_norm': 0.8882296353447906, 'learning_rate': 2.3936203717038753e-08, 'epoch': 0.97} 97%|█████████▋| 33242/34278 [36:26:06<1:05:28, 3.79s/it] 97%|█████████▋| 33243/34278 [36:26:09<1:02:46, 3.64s/it] {'loss': 0.117, 'grad_norm': 0.9327990009193649, 'learning_rate': 2.389005394637045e-08, 'epoch': 0.97} 97%|█████████▋| 33243/34278 [36:26:09<1:02:46, 3.64s/it] 97%|█████████▋| 33244/34278 [36:26:13<1:04:29, 3.74s/it] {'loss': 0.1141, 'grad_norm': 0.8334900163090636, 'learning_rate': 2.38439486012243e-08, 'epoch': 0.97} 97%|█████████▋| 33244/34278 [36:26:13<1:04:29, 3.74s/it] 97%|█████████▋| 33245/34278 [36:26:17<1:04:07, 3.72s/it] {'loss': 0.1138, 'grad_norm': 0.8102328846098208, 'learning_rate': 2.3797887682011632e-08, 'epoch': 0.97} 97%|█████████▋| 33245/34278 [36:26:17<1:04:07, 3.72s/it] 97%|█████████▋| 33246/34278 [36:26:21<1:04:51, 3.77s/it] {'loss': 0.1265, 'grad_norm': 0.8732129555678707, 'learning_rate': 2.375187118914324e-08, 'epoch': 0.97} 97%|█████████▋| 33246/34278 [36:26:21<1:04:51, 3.77s/it] 97%|█████████▋| 33247/34278 [36:26:24<1:00:50, 3.54s/it] {'loss': 0.1265, 'grad_norm': 0.9265668905180008, 'learning_rate': 2.3705899123030452e-08, 'epoch': 0.97} 97%|█████████▋| 33247/34278 [36:26:24<1:00:50, 3.54s/it] 97%|█████████▋| 33248/34278 [36:26:27<1:00:19, 3.51s/it] {'loss': 0.1314, 'grad_norm': 0.7005632848767195, 'learning_rate': 2.36599714840835e-08, 'epoch': 0.97} 97%|█████████▋| 33248/34278 [36:26:27<1:00:19, 3.51s/it] 97%|█████████▋| 33249/34278 [36:26:33<1:13:08, 4.26s/it] {'loss': 0.1068, 'grad_norm': 0.7864713792294779, 'learning_rate': 2.3614088272712055e-08, 'epoch': 0.97} 97%|█████████▋| 33249/34278 [36:26:33<1:13:08, 4.26s/it] 97%|█████████▋| 33250/34278 [36:26:36<1:06:22, 3.87s/it] {'loss': 0.1133, 'grad_norm': 0.7727446005676165, 'learning_rate': 
2.3568249489325788e-08, 'epoch': 0.97} 97%|█████████▋| 33250/34278 [36:26:36<1:06:22, 3.87s/it] 97%|█████████▋| 33251/34278 [36:26:39<1:01:54, 3.62s/it] {'loss': 0.1133, 'grad_norm': 0.8746891260098911, 'learning_rate': 2.3522455134334932e-08, 'epoch': 0.97} 97%|█████████▋| 33251/34278 [36:26:39<1:01:54, 3.62s/it] 97%|█████████▋| 33252/34278 [36:26:42<1:00:10, 3.52s/it] {'loss': 0.1235, 'grad_norm': 0.8062800368374295, 'learning_rate': 2.347670520814749e-08, 'epoch': 0.97} 97%|█████████▋| 33252/34278 [36:26:42<1:00:10, 3.52s/it] 97%|█████████▋| 33253/34278 [36:26:45<57:39, 3.37s/it] {'loss': 0.1366, 'grad_norm': 0.9503210236297273, 'learning_rate': 2.3430999711171466e-08, 'epoch': 0.97} 97%|█████████▋| 33253/34278 [36:26:45<57:39, 3.37s/it] 97%|█████████▋| 33254/34278 [36:26:48<56:06, 3.29s/it] {'loss': 0.1053, 'grad_norm': 0.7298905343904102, 'learning_rate': 2.338533864381598e-08, 'epoch': 0.97} 97%|█████████▋| 33254/34278 [36:26:48<56:06, 3.29s/it] 97%|█████████▋| 33255/34278 [36:26:55<1:12:14, 4.24s/it] {'loss': 0.1275, 'grad_norm': 0.8383404371548997, 'learning_rate': 2.333972200648793e-08, 'epoch': 0.97} 97%|█████████▋| 33255/34278 [36:26:55<1:12:14, 4.24s/it] 97%|█████████▋| 33256/34278 [36:26:58<1:03:50, 3.75s/it] {'loss': 0.0874, 'grad_norm': 0.8118394073432936, 'learning_rate': 2.329414979959477e-08, 'epoch': 0.97} 97%|█████████▋| 33256/34278 [36:26:58<1:03:50, 3.75s/it] 97%|█████████▋| 33257/34278 [36:27:01<1:01:36, 3.62s/it] {'loss': 0.1127, 'grad_norm': 0.7989731379292758, 'learning_rate': 2.3248622023543387e-08, 'epoch': 0.97} 97%|█████████▋| 33257/34278 [36:27:01<1:01:36, 3.62s/it] 97%|█████████▋| 33258/34278 [36:27:05<1:05:59, 3.88s/it] {'loss': 0.1175, 'grad_norm': 0.6262229387101635, 'learning_rate': 2.320313867874069e-08, 'epoch': 0.97} 97%|█████████▋| 33258/34278 [36:27:05<1:05:59, 3.88s/it] 97%|█████████▋| 33259/34278 [36:27:10<1:11:18, 4.20s/it] {'loss': 0.1135, 'grad_norm': 0.8163687193739113, 'learning_rate': 2.3157699765591902e-08, 'epoch': 
0.97}
[training log, condensed: duplicate tqdm progress-bar echoes removed; one representative entry kept at each end of the span]
 97%|█████████▋| 33259/34278 [36:27:10<1:11:18, 4.20s/it]
 97%|█████████▋| 33260/34278 [36:27:15<1:11:53, 4.24s/it] {'loss': 0.1125, 'grad_norm': 0.8569892640654317, 'learning_rate': 2.3112305284503365e-08, 'epoch': 0.97}
[... steps 33261-33386 elided: loss fluctuating 0.077-0.152, grad_norm 0.62-1.57, learning_rate decaying 2.31e-08 -> 1.77e-08, ~3.2-5.2 s/it, epoch 0.97 ...]
 97%|█████████▋| 33387/34278 [36:35:28<50:01, 3.63s/it] {'loss': 0.1294, 'grad_norm': 0.7803182073018958, 'learning_rate': 1.7708487245497454e-08, 'epoch': 0.97} 97%|█████████▋| 33387/34278 [36:35:28<50:01,
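The condensed ranges above were pulled from the raw progress lines. A minimal sketch for extracting the per-step metrics (step, loss, grad_norm, learning_rate, epoch) from this log format, assuming the HF/tqdm layout shown above; the regex anchors the metrics dict directly after the timing bracket so the duplicate progress-bar echoes are not double-counted:

```python
import re

# Matches "33260/34278 [36:27:15<1:11:53, 4.24s/it] {'loss': ..., 'epoch': 0.97}"
# Requiring the dict immediately after "]" skips the bare tqdm echo lines.
LINE_RE = re.compile(
    r"(\d+)/\d+ \[[^\]]*\] \{'loss': ([\d.]+), 'grad_norm': ([\d.]+), "
    r"'learning_rate': ([\d.e+-]+), 'epoch': ([\d.]+)\}"
)

def parse_progress(text):
    """Return (step, loss, grad_norm, lr, epoch) tuples found in a log chunk."""
    return [
        (int(s), float(l), float(g), float(lr), float(e))
        for s, l, g, lr, e in (m.groups() for m in LINE_RE.finditer(text))
    ]

sample = ("97%|=| 33260/34278 [36:27:15<1:11:53, 4.24s/it] "
          "{'loss': 0.1125, 'grad_norm': 0.8569892640654317, "
          "'learning_rate': 2.3112305284503365e-08, 'epoch': 0.97}")
print(parse_progress(sample))
```

Feeding the whole log through `parse_progress` gives one tuple per optimizer step, convenient for plotting the loss and learning-rate curves.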
3.37s/it]
[... steps 33388-33396 elided: loss 0.096-0.132, learning_rate 1.77e-08 -> 1.74e-08, epoch 0.97 ...]
 97%|█████████▋| 33396/34278 [36:36:02<52:04, 3.54s/it]
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f47ea571f30>
Failed to fetch sample 3255718.
Exception: cannot identify image file <_io.BytesIO object at 0x7f47ea571f30>
[... steps 33397-33402 elided: loss 0.103-0.128, learning_rate 1.73e-08 -> 1.71e-08, epoch 0.97 ...]
 97%|█████████▋| 33402/34278 [36:36:20<45:43, 3.13s/it]
Failed to fetch sample 3423931. Exception: cannot identify image file <_io.BytesIO object at 0x7f0e40143920> [identical traceback as above]
[... steps 33403-33405 elided: loss 0.088-0.125, learning_rate 1.708e-08 -> 1.700e-08, epoch 0.97 ...]
Failed to fetch sample 3646664. Exception: cannot identify image file <_io.BytesIO object at 0x7f22941be840> [identical traceback as above]
[... steps 33406-33409 elided: loss 0.111-0.142, learning_rate 1.696e-08 -> 1.685e-08, epoch 0.97 ...]
 97%|█████████▋| 33410/34278 [36:36:50<48:41, 3.37s/it] {'loss': 0.094, 'grad_norm': 0.7724165656177046, 'learning_rate': 1.680655026418454e-08, 'epoch': 0.97} 97%|█████████▋|
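Each "Failed to fetch sample N" above is immediately followed by normal training steps, so the dataset evidently swaps in a replacement sample when a stored image blob cannot be decoded, rather than aborting the run. A hedged sketch of that skip-and-retry pattern; the names (`decode`, `load_bytes`) are hypothetical stand-ins for the loaders in aguvis/dataset.py, and the real code may pick the replacement index randomly rather than sequentially:

```python
def safe_fetch(decode, load_bytes, dataset_len, index, max_retries=10):
    """Return (index, sample); on a decode failure, log it and move on
    to another index instead of crashing the whole training job."""
    for _ in range(max_retries):
        try:
            return index, decode(load_bytes(index))
        except Exception as e:  # real code would catch PIL.UnidentifiedImageError
            print(f"Failed to fetch sample {index}. Exception: {e}")
            index = (index + 1) % dataset_len  # deterministic fallback choice
    raise RuntimeError("too many undecodable samples in a row")

# Toy demo: index 0 holds a corrupt blob, index 1 decodes fine.
blobs = {0: b"corrupt", 1: b"ok"}

def toy_decode(b):
    if b == b"corrupt":
        raise ValueError("cannot identify image file")
    return b.decode()

print(safe_fetch(toy_decode, blobs.__getitem__, len(blobs), 0))  # -> (1, 'ok')
```

Bounding the retries matters: if a whole shard of the dataset is corrupt, an unbounded loop would hang the worker silently instead of surfacing an error.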
33410/34278 [36:36:50<48:41, 3.37s/it] 97%|█████████▋| 33411/34278 [36:36:54<51:48, 3.59s/it] {'loss': 0.09, 'grad_norm': 0.8670529420332095, 'learning_rate': 1.676786944735387e-08, 'epoch': 0.97} 97%|█████████▋| 33411/34278 [36:36:54<51:48, 3.59s/it] 97%|█████████▋| 33412/34278 [36:36:57<51:33, 3.57s/it] {'loss': 0.0993, 'grad_norm': 0.6433729522077717, 'learning_rate': 1.672923311963004e-08, 'epoch': 0.97} 97%|█████████▋| 33412/34278 [36:36:58<51:33, 3.57s/it] 97%|█████████▋| 33413/34278 [36:37:01<49:09, 3.41s/it] {'loss': 0.0999, 'grad_norm': 1.0027729146225761, 'learning_rate': 1.6690641281357778e-08, 'epoch': 0.97} 97%|█████████▋| 33413/34278 [36:37:01<49:09, 3.41s/it] 97%|█████████▋| 33414/34278 [36:37:04<49:39, 3.45s/it] {'loss': 0.092, 'grad_norm': 0.6992562887457178, 'learning_rate': 1.6652093932881807e-08, 'epoch': 0.97} 97%|█████████▋| 33414/34278 [36:37:04<49:39, 3.45s/it] 97%|█████████▋| 33415/34278 [36:37:07<47:28, 3.30s/it] {'loss': 0.1141, 'grad_norm': 0.7068301941366186, 'learning_rate': 1.6613591074546855e-08, 'epoch': 0.97} 97%|█████████▋| 33415/34278 [36:37:07<47:28, 3.30s/it] 97%|█████████▋| 33416/34278 [36:37:10<44:58, 3.13s/it] {'loss': 0.1043, 'grad_norm': 0.814202526508201, 'learning_rate': 1.6575132706695417e-08, 'epoch': 0.97} 97%|█████████▋| 33416/34278 [36:37:10<44:58, 3.13s/it] 97%|█████████▋| 33417/34278 [36:37:13<44:11, 3.08s/it] {'loss': 0.1209, 'grad_norm': 0.8187714646032954, 'learning_rate': 1.6536718829672227e-08, 'epoch': 0.97} 97%|█████████▋| 33417/34278 [36:37:13<44:11, 3.08s/it] 97%|█████████▋| 33418/34278 [36:37:19<56:07, 3.92s/it] {'loss': 0.1084, 'grad_norm': 0.8708815506151023, 'learning_rate': 1.6498349443819227e-08, 'epoch': 0.97} 97%|█████████▋| 33418/34278 [36:37:19<56:07, 3.92s/it] 97%|█████████▋| 33419/34278 [36:37:22<52:52, 3.69s/it] {'loss': 0.1007, 'grad_norm': 0.673224660071052, 'learning_rate': 1.6460024549479482e-08, 'epoch': 0.97} 97%|█████████▋| 33419/34278 [36:37:22<52:52, 3.69s/it] 97%|█████████▋| 
33420/34278 [36:37:27<1:00:20, 4.22s/it] {'loss': 0.1018, 'grad_norm': 1.0546626415822506, 'learning_rate': 1.6421744146994932e-08, 'epoch': 0.97} 97%|█████████▋| 33420/34278 [36:37:27<1:00:20, 4.22s/it] 97%|█████████▋| 33421/34278 [36:37:33<1:06:20, 4.64s/it] {'loss': 0.1079, 'grad_norm': 1.025942351102925, 'learning_rate': 1.638350823670698e-08, 'epoch': 0.97} 97%|█████████▋| 33421/34278 [36:37:33<1:06:20, 4.64s/it] 98%|█████████▊| 33422/34278 [36:37:38<1:08:40, 4.81s/it] {'loss': 0.1257, 'grad_norm': 0.8179767362394554, 'learning_rate': 1.6345316818958123e-08, 'epoch': 0.98} 98%|█████████▊| 33422/34278 [36:37:38<1:08:40, 4.81s/it] 98%|█████████▊| 33423/34278 [36:37:42<1:04:24, 4.52s/it] {'loss': 0.1147, 'grad_norm': 0.808942523053795, 'learning_rate': 1.630716989408754e-08, 'epoch': 0.98} 98%|█████████▊| 33423/34278 [36:37:42<1:04:24, 4.52s/it] 98%|█████████▊| 33424/34278 [36:37:45<57:51, 4.06s/it] {'loss': 0.1198, 'grad_norm': 1.6266496243270054, 'learning_rate': 1.6269067462437727e-08, 'epoch': 0.98} 98%|█████████▊| 33424/34278 [36:37:45<57:51, 4.06s/it] 98%|█████████▊| 33425/34278 [36:37:49<59:37, 4.19s/it] {'loss': 0.101, 'grad_norm': 0.8222191101815929, 'learning_rate': 1.6231009524347862e-08, 'epoch': 0.98} 98%|█████████▊| 33425/34278 [36:37:49<59:37, 4.19s/it] 98%|█████████▊| 33426/34278 [36:37:54<1:01:06, 4.30s/it] {'loss': 0.0949, 'grad_norm': 0.6514665271591409, 'learning_rate': 1.6192996080157676e-08, 'epoch': 0.98} 98%|█████████▊| 33426/34278 [36:37:54<1:01:06, 4.30s/it] 98%|█████████▊| 33427/34278 [36:37:58<59:13, 4.18s/it] {'loss': 0.1409, 'grad_norm': 0.7969046839209064, 'learning_rate': 1.615502713020689e-08, 'epoch': 0.98} 98%|█████████▊| 33427/34278 [36:37:58<59:13, 4.18s/it] 98%|█████████▊| 33428/34278 [36:38:03<1:01:58, 4.37s/it] {'loss': 0.1023, 'grad_norm': 0.8554952062082433, 'learning_rate': 1.6117102674833575e-08, 'epoch': 0.98} 98%|█████████▊| 33428/34278 [36:38:03<1:01:58, 4.37s/it] 98%|█████████▊| 33429/34278 [36:38:06<56:41, 
4.01s/it] {'loss': 0.0852, 'grad_norm': 0.7427680985706927, 'learning_rate': 1.6079222714378008e-08, 'epoch': 0.98} 98%|█████████▊| 33429/34278 [36:38:06<56:41, 4.01s/it] 98%|█████████▊| 33430/34278 [36:38:09<54:04, 3.83s/it] {'loss': 0.1011, 'grad_norm': 0.7194565309418961, 'learning_rate': 1.6041387249176588e-08, 'epoch': 0.98} 98%|█████████▊| 33430/34278 [36:38:09<54:04, 3.83s/it] 98%|█████████▊| 33431/34278 [36:38:13<54:13, 3.84s/it] {'loss': 0.1043, 'grad_norm': 0.9604794209898954, 'learning_rate': 1.600359627956849e-08, 'epoch': 0.98} 98%|█████████▊| 33431/34278 [36:38:13<54:13, 3.84s/it] 98%|█████████▊| 33432/34278 [36:38:16<51:27, 3.65s/it] {'loss': 0.0975, 'grad_norm': 0.821924743818213, 'learning_rate': 1.596584980589011e-08, 'epoch': 0.98} 98%|█████████▊| 33432/34278 [36:38:16<51:27, 3.65s/it] 98%|█████████▊| 33433/34278 [36:38:20<51:43, 3.67s/it] {'loss': 0.1077, 'grad_norm': 0.7024338531347216, 'learning_rate': 1.5928147828478958e-08, 'epoch': 0.98} 98%|█████████▊| 33433/34278 [36:38:20<51:43, 3.67s/it] 98%|█████████▊| 33434/34278 [36:38:23<50:39, 3.60s/it] {'loss': 0.1035, 'grad_norm': 0.9182789520795457, 'learning_rate': 1.589049034767143e-08, 'epoch': 0.98} 98%|█████████▊| 33434/34278 [36:38:23<50:39, 3.60s/it] 98%|█████████▊| 33435/34278 [36:38:27<51:01, 3.63s/it] {'loss': 0.0841, 'grad_norm': 0.6687812974110965, 'learning_rate': 1.585287736380392e-08, 'epoch': 0.98} 98%|█████████▊| 33435/34278 [36:38:27<51:01, 3.63s/it] 98%|█████████▊| 33436/34278 [36:38:31<51:33, 3.67s/it] {'loss': 0.1128, 'grad_norm': 1.0874316139265678, 'learning_rate': 1.581530887721172e-08, 'epoch': 0.98} 98%|█████████▊| 33436/34278 [36:38:31<51:33, 3.67s/it] 98%|█████████▊| 33437/34278 [36:38:34<49:48, 3.55s/it] {'loss': 0.1025, 'grad_norm': 0.7284841937281055, 'learning_rate': 1.5777784888231228e-08, 'epoch': 0.98} 98%|█████████▊| 33437/34278 [36:38:34<49:48, 3.55s/it] 98%|█████████▊| 33438/34278 [36:38:39<56:54, 4.06s/it] {'loss': 0.1226, 'grad_norm': 0.879325279853439, 
'learning_rate': 1.574030539719662e-08, 'epoch': 0.98} 98%|█████████▊| 33438/34278 [36:38:39<56:54, 4.06s/it] 98%|█████████▊| 33439/34278 [36:38:42<51:50, 3.71s/it] {'loss': 0.11, 'grad_norm': 1.013744102609131, 'learning_rate': 1.570287040444263e-08, 'epoch': 0.98} 98%|█████████▊| 33439/34278 [36:38:42<51:50, 3.71s/it] 98%|█████████▊| 33440/34278 [36:38:46<50:45, 3.63s/it] {'loss': 0.1121, 'grad_norm': 0.7306770282912424, 'learning_rate': 1.566547991030343e-08, 'epoch': 0.98} 98%|█████████▊| 33440/34278 [36:38:46<50:45, 3.63s/it] 98%|█████████▊| 33441/34278 [36:38:49<48:57, 3.51s/it] {'loss': 0.1157, 'grad_norm': 0.7091404223497363, 'learning_rate': 1.5628133915113196e-08, 'epoch': 0.98} 98%|█████████▊| 33441/34278 [36:38:49<48:57, 3.51s/it] 98%|█████████▊| 33442/34278 [36:38:52<46:22, 3.33s/it] {'loss': 0.1161, 'grad_norm': 0.9053919337519989, 'learning_rate': 1.5590832419205003e-08, 'epoch': 0.98} 98%|█████████▊| 33442/34278 [36:38:52<46:22, 3.33s/it] 98%|█████████▊| 33443/34278 [36:38:58<58:03, 4.17s/it] {'loss': 0.1046, 'grad_norm': 0.7123748698206179, 'learning_rate': 1.5553575422911915e-08, 'epoch': 0.98} 98%|█████████▊| 33443/34278 [36:38:58<58:03, 4.17s/it] 98%|█████████▊| 33444/34278 [36:39:01<53:17, 3.83s/it] {'loss': 0.1205, 'grad_norm': 0.9529840701004273, 'learning_rate': 1.5516362926566996e-08, 'epoch': 0.98} 98%|█████████▊| 33444/34278 [36:39:01<53:17, 3.83s/it] 98%|█████████▊| 33445/34278 [36:39:04<50:13, 3.62s/it] {'loss': 0.1173, 'grad_norm': 0.8206795705164966, 'learning_rate': 1.5479194930502206e-08, 'epoch': 0.98} 98%|█████████▊| 33445/34278 [36:39:04<50:13, 3.62s/it] 98%|█████████▊| 33446/34278 [36:39:07<47:06, 3.40s/it] {'loss': 0.1128, 'grad_norm': 1.0333538600532497, 'learning_rate': 1.544207143504839e-08, 'epoch': 0.98} 98%|█████████▊| 33446/34278 [36:39:07<47:06, 3.40s/it] 98%|█████████▊| 33447/34278 [36:39:13<57:45, 4.17s/it] {'loss': 0.1337, 'grad_norm': 0.7544708164282096, 'learning_rate': 1.5404992440538612e-08, 'epoch': 0.98} 
98%|█████████▊| 33447/34278 [36:39:13<57:45, 4.17s/it] 98%|█████████▊| 33448/34278 [36:39:17<55:33, 4.02s/it] {'loss': 0.1074, 'grad_norm': 0.6724146045830542, 'learning_rate': 1.5367957947302615e-08, 'epoch': 0.98} 98%|█████████▊| 33448/34278 [36:39:17<55:33, 4.02s/it] 98%|█████████▊| 33449/34278 [36:39:20<50:40, 3.67s/it] {'loss': 0.1009, 'grad_norm': 0.7946364233903862, 'learning_rate': 1.5330967955671794e-08, 'epoch': 0.98} 98%|█████████▊| 33449/34278 [36:39:20<50:40, 3.67s/it]Traceback (most recent call last): File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__ sample = self._get_item(i) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item data_dict = self.preprocess_qwen2vl_v3( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3 return self.preprocess_qwen2vl( File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl image_inputs = load_image_from_ceph(self.tcs_loader, conv) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph image = tcs_loader(image) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__ img = pil_loader(img_value_str) File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader img = Image.open(buff) File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open raise UnidentifiedImageError(msg) PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7f87a66a6250> Failed to fetch sample 3291478. 
Exception: cannot identify image file <_io.BytesIO object at 0x7f87a66a6250> 98%|█████████▊| 33450/34278 [36:39:23<47:53, 3.47s/it] {'loss': 0.0961, 'grad_norm': 0.7718172549434619, 'learning_rate': 1.5294022465976444e-08, 'epoch': 0.98} 98%|█████████▊| 33450/34278 [36:39:23<47:53, 3.47s/it] 98%|█████████▊| 33451/34278 [36:39:26<46:39, 3.38s/it] {'loss': 0.1179, 'grad_norm': 0.9088213407755129, 'learning_rate': 1.5257121478545744e-08, 'epoch': 0.98} 98%|█████████▊| 33451/34278 [36:39:26<46:39, 3.38s/it] 98%|█████████▊| 33452/34278 [36:39:32<58:14, 4.23s/it] {'loss': 0.1245, 'grad_norm': 0.9327672303193023, 'learning_rate': 1.5220264993709988e-08, 'epoch': 0.98} 98%|█████████▊| 33452/34278 [36:39:32<58:14, 4.23s/it] 98%|█████████▊| 33453/34278 [36:39:35<53:54, 3.92s/it] {'loss': 0.1115, 'grad_norm': 0.7753202811566637, 'learning_rate': 1.5183453011797243e-08, 'epoch': 0.98} 98%|█████████▊| 33453/34278 [36:39:35<53:54, 3.92s/it] 98%|█████████▊| 33454/34278 [36:39:39<52:38, 3.83s/it] {'loss': 0.0896, 'grad_norm': 0.8000881894874269, 'learning_rate': 1.5146685533136697e-08, 'epoch': 0.98} 98%|█████████▊| 33454/34278 [36:39:39<52:38, 3.83s/it] 98%|█████████▊| 33455/34278 [36:39:42<49:22, 3.60s/it] {'loss': 0.1077, 'grad_norm': 0.787279925433227, 'learning_rate': 1.510996255805697e-08, 'epoch': 0.98} 98%|█████████▊| 33455/34278 [36:39:42<49:22, 3.60s/it] 98%|█████████▊| 33456/34278 [36:39:45<46:22, 3.39s/it] {'loss': 0.132, 'grad_norm': 0.9641529645428879, 'learning_rate': 1.507328408688502e-08, 'epoch': 0.98} 98%|█████████▊| 33456/34278 [36:39:45<46:22, 3.39s/it] 98%|█████████▊| 33457/34278 [36:39:48<44:06, 3.22s/it] {'loss': 0.0928, 'grad_norm': 0.9703981020772765, 'learning_rate': 1.5036650119948926e-08, 'epoch': 0.98} 98%|█████████▊| 33457/34278 [36:39:48<44:06, 3.22s/it] 98%|█████████▊| 33458/34278 [36:39:51<43:27, 3.18s/it] {'loss': 0.1095, 'grad_norm': 1.1180763624483268, 'learning_rate': 1.5000060657575643e-08, 'epoch': 0.98} 98%|█████████▊| 33458/34278 
[36:39:51<43:27, 3.18s/it] 98%|█████████▊| 33459/34278 [36:39:54<42:48, 3.14s/it] {'loss': 0.1101, 'grad_norm': 1.32960764059299, 'learning_rate': 1.4963515700092135e-08, 'epoch': 0.98} 98%|█████████▊| 33459/34278 [36:39:54<42:48, 3.14s/it] 98%|█████████▊| 33460/34278 [36:39:56<41:07, 3.02s/it] {'loss': 0.1139, 'grad_norm': 0.8027003055346538, 'learning_rate': 1.4927015247823695e-08, 'epoch': 0.98} 98%|█████████▊| 33460/34278 [36:39:56<41:07, 3.02s/it] 98%|█████████▊| 33461/34278 [36:40:00<41:56, 3.08s/it] {'loss': 0.1049, 'grad_norm': 0.7467701103652643, 'learning_rate': 1.4890559301097284e-08, 'epoch': 0.98} 98%|█████████▊| 33461/34278 [36:40:00<41:56, 3.08s/it] 98%|█████████▊| 33462/34278 [36:40:03<43:36, 3.21s/it] {'loss': 0.1041, 'grad_norm': 0.8576695245988681, 'learning_rate': 1.48541478602382e-08, 'epoch': 0.98} 98%|█████████▊| 33462/34278 [36:40:03<43:36, 3.21s/it] 98%|█████████▊| 33463/34278 [36:40:09<54:59, 4.05s/it] {'loss': 0.1221, 'grad_norm': 0.9751636245609486, 'learning_rate': 1.4817780925570625e-08, 'epoch': 0.98} 98%|█████████▊| 33463/34278 [36:40:09<54:59, 4.05s/it] 98%|█████████▊| 33464/34278 [36:40:14<59:04, 4.35s/it] {'loss': 0.0979, 'grad_norm': 0.8419763689914282, 'learning_rate': 1.4781458497419854e-08, 'epoch': 0.98} 98%|█████████▊| 33464/34278 [36:40:14<59:04, 4.35s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 98%|█████████▊| 33465/34278 [36:40:17<53:55, 3.98s/it] {'loss': 0.1111, 'grad_norm': 0.7866540633974234, 'learning_rate': 1.4745180576110629e-08, 'epoch': 0.98} 98%|█████████▊| 33465/34278 [36:40:17<53:55, 3.98s/it] 98%|█████████▊| 33466/34278 [36:40:21<50:53, 3.76s/it] {'loss': 0.1052, 'grad_norm': 0.8147538835025226, 'learning_rate': 1.4708947161966025e-08, 'epoch': 0.98} 98%|█████████▊| 33466/34278 [36:40:21<50:53, 3.76s/it] 98%|█████████▊| 33467/34278 [36:40:23<46:40, 3.45s/it] {'loss': 0.1063, 'grad_norm': 0.7236913976966386, 'learning_rate': 1.467275825530967e-08, 'epoch': 0.98} 98%|█████████▊| 33467/34278 [36:40:23<46:40, 3.45s/it] 98%|█████████▊| 33468/34278 [36:40:27<47:44, 3.54s/it] {'loss': 0.125, 'grad_norm': 0.871967743547231, 'learning_rate': 1.4636613856465198e-08, 'epoch': 0.98} 98%|█████████▊| 33468/34278 [36:40:27<47:44, 3.54s/it] 98%|█████████▊| 33469/34278 [36:40:30<46:20, 3.44s/it] {'loss': 0.1162, 'grad_norm': 0.8103982140571668, 'learning_rate': 1.4600513965755125e-08, 'epoch': 0.98} 98%|█████████▊| 33469/34278 [36:40:30<46:20, 3.44s/it] 98%|█████████▊| 33470/34278 [36:40:36<56:15, 4.18s/it] {'loss': 0.1191, 'grad_norm': 1.1658768722585042, 'learning_rate': 1.4564458583500861e-08, 'epoch': 0.98} 98%|█████████▊| 33470/34278 [36:40:36<56:15, 4.18s/it] 98%|█████████▊| 33471/34278 [36:40:39<50:25, 3.75s/it] {'loss': 0.0881, 'grad_norm': 0.7372703729649, 'learning_rate': 1.4528447710025484e-08, 'epoch': 0.98} 98%|█████████▊| 33471/34278 [36:40:39<50:25, 3.75s/it] 98%|█████████▊| 33472/34278 [36:40:43<50:49, 3.78s/it] {'loss': 0.0973, 'grad_norm': 0.6513245861024871, 'learning_rate': 1.4492481345649844e-08, 'epoch': 0.98} 98%|█████████▊| 33472/34278 [36:40:43<50:49, 3.78s/it] 98%|█████████▊| 33473/34278 [36:40:46<48:34, 3.62s/it] {'loss': 0.1037, 'grad_norm': 0.7532321738316657, 'learning_rate': 1.4456559490695355e-08, 'epoch': 0.98} 98%|█████████▊| 33473/34278 [36:40:46<48:34, 3.62s/it] 98%|█████████▊| 
33474/34278 [36:40:49<45:29, 3.39s/it] {'loss': 0.1079, 'grad_norm': 0.8958885167130327, 'learning_rate': 1.4420682145482313e-08, 'epoch': 0.98} 98%|█████████▊| 33474/34278 [36:40:49<45:29, 3.39s/it] 98%|█████████▊| 33475/34278 [36:40:53<47:11, 3.53s/it] {'loss': 0.093, 'grad_norm': 0.9528026867579396, 'learning_rate': 1.4384849310331573e-08, 'epoch': 0.98} 98%|█████████▊| 33475/34278 [36:40:53<47:11, 3.53s/it] 98%|█████████▊| 33476/34278 [36:40:57<48:31, 3.63s/it] {'loss': 0.1488, 'grad_norm': 0.9398564865817778, 'learning_rate': 1.4349060985562325e-08, 'epoch': 0.98} 98%|█████████▊| 33476/34278 [36:40:57<48:31, 3.63s/it] 98%|█████████▊| 33477/34278 [36:41:00<48:03, 3.60s/it] {'loss': 0.1098, 'grad_norm': 0.7662224821385598, 'learning_rate': 1.4313317171494867e-08, 'epoch': 0.98} 98%|█████████▊| 33477/34278 [36:41:00<48:03, 3.60s/it] 98%|█████████▊| 33478/34278 [36:41:03<45:40, 3.43s/it] {'loss': 0.1247, 'grad_norm': 1.0432365376197361, 'learning_rate': 1.4277617868447835e-08, 'epoch': 0.98} 98%|█████████▊| 33478/34278 [36:41:03<45:40, 3.43s/it] 98%|█████████▊| 33479/34278 [36:41:06<44:22, 3.33s/it] {'loss': 0.1062, 'grad_norm': 0.9898892779238785, 'learning_rate': 1.4241963076739862e-08, 'epoch': 0.98} 98%|█████████▊| 33479/34278 [36:41:06<44:22, 3.33s/it] 98%|█████████▊| 33480/34278 [36:41:11<49:20, 3.71s/it] {'loss': 0.1068, 'grad_norm': 0.8288831787117793, 'learning_rate': 1.4206352796689582e-08, 'epoch': 0.98} 98%|█████████▊| 33480/34278 [36:41:11<49:20, 3.71s/it] 98%|█████████▊| 33481/34278 [36:41:15<50:41, 3.82s/it] {'loss': 0.1016, 'grad_norm': 1.1654789166534627, 'learning_rate': 1.4170787028615074e-08, 'epoch': 0.98} 98%|█████████▊| 33481/34278 [36:41:15<50:41, 3.82s/it] 98%|█████████▊| 33482/34278 [36:41:19<51:25, 3.88s/it] {'loss': 0.0871, 'grad_norm': 0.6830434175297825, 'learning_rate': 1.4135265772833307e-08, 'epoch': 0.98} 98%|█████████▊| 33482/34278 [36:41:19<51:25, 3.88s/it] 98%|█████████▊| 33483/34278 [36:41:23<50:25, 3.81s/it] {'loss': 0.1071, 
'grad_norm': 0.8740675074834651, 'learning_rate': 1.4099789029661249e-08, 'epoch': 0.98} 98%|█████████▊| 33483/34278 [36:41:23<50:25, 3.81s/it] 98%|█████████▊| 33484/34278 [36:41:26<49:28, 3.74s/it] {'loss': 0.1158, 'grad_norm': 1.1663422147800555, 'learning_rate': 1.4064356799416423e-08, 'epoch': 0.98} 98%|█████████▊| 33484/34278 [36:41:26<49:28, 3.74s/it] 98%|█████████▊| 33485/34278 [36:41:29<45:42, 3.46s/it] {'loss': 0.1167, 'grad_norm': 1.02950746241133, 'learning_rate': 1.4028969082415245e-08, 'epoch': 0.98} 98%|█████████▊| 33485/34278 [36:41:29<45:42, 3.46s/it] 98%|█████████▊| 33486/34278 [36:41:34<50:05, 3.80s/it] {'loss': 0.1089, 'grad_norm': 0.8121815245135743, 'learning_rate': 1.3993625878972461e-08, 'epoch': 0.98} 98%|█████████▊| 33486/34278 [36:41:34<50:05, 3.80s/it] 98%|█████████▊| 33487/34278 [36:41:37<47:08, 3.58s/it] {'loss': 0.1149, 'grad_norm': 0.7529322717191119, 'learning_rate': 1.3958327189404486e-08, 'epoch': 0.98} 98%|█████████▊| 33487/34278 [36:41:37<47:08, 3.58s/it] 98%|█████████▊| 33488/34278 [36:41:42<52:16, 3.97s/it] {'loss': 0.0936, 'grad_norm': 0.8016289552099051, 'learning_rate': 1.3923073014026623e-08, 'epoch': 0.98} 98%|█████████▊| 33488/34278 [36:41:42<52:16, 3.97s/it] 98%|█████████▊| 33489/34278 [36:41:45<49:14, 3.74s/it] {'loss': 0.1036, 'grad_norm': 0.7566834874084734, 'learning_rate': 1.3887863353153064e-08, 'epoch': 0.98} 98%|█████████▊| 33489/34278 [36:41:45<49:14, 3.74s/it] 98%|█████████▊| 33490/34278 [36:41:48<47:49, 3.64s/it] {'loss': 0.1286, 'grad_norm': 0.8066875277382531, 'learning_rate': 1.3852698207098004e-08, 'epoch': 0.98} 98%|█████████▊| 33490/34278 [36:41:48<47:49, 3.64s/it] 98%|█████████▊| 33491/34278 [36:41:51<45:16, 3.45s/it] {'loss': 0.127, 'grad_norm': 1.3215638404967414, 'learning_rate': 1.3817577576176744e-08, 'epoch': 0.98} 98%|█████████▊| 33491/34278 [36:41:51<45:16, 3.45s/it] 98%|█████████▊| 33492/34278 [36:41:54<43:20, 3.31s/it] {'loss': 0.1046, 'grad_norm': 0.8640221730882329, 'learning_rate': 
1.3782501460701258e-08, 'epoch': 0.98} 98%|█████████▊| 33492/34278 [36:41:54<43:20, 3.31s/it] 98%|█████████▊| 33493/34278 [36:41:57<42:42, 3.26s/it] {'loss': 0.1058, 'grad_norm': 0.8799891919501298, 'learning_rate': 1.3747469860985186e-08, 'epoch': 0.98} 98%|█████████▊| 33493/34278 [36:41:57<42:42, 3.26s/it] 98%|█████████▊| 33494/34278 [36:42:01<45:01, 3.45s/it] {'loss': 0.0938, 'grad_norm': 0.9599150347920247, 'learning_rate': 1.3712482777341052e-08, 'epoch': 0.98} 98%|█████████▊| 33494/34278 [36:42:01<45:01, 3.45s/it] 98%|█████████▊| 33495/34278 [36:42:04<42:53, 3.29s/it] {'loss': 0.0943, 'grad_norm': 0.7217733960288283, 'learning_rate': 1.3677540210082495e-08, 'epoch': 0.98} 98%|█████████▊| 33495/34278 [36:42:04<42:53, 3.29s/it] 98%|█████████▊| 33496/34278 [36:42:07<42:36, 3.27s/it] {'loss': 0.1087, 'grad_norm': 0.7299157592610854, 'learning_rate': 1.3642642159519826e-08, 'epoch': 0.98} 98%|█████████▊| 33496/34278 [36:42:07<42:36, 3.27s/it] 98%|█████████▊| 33497/34278 [36:42:11<45:34, 3.50s/it] {'loss': 0.1467, 'grad_norm': 0.9035806109289037, 'learning_rate': 1.3607788625965567e-08, 'epoch': 0.98} 98%|█████████▊| 33497/34278 [36:42:11<45:34, 3.50s/it] 98%|█████████▊| 33498/34278 [36:42:14<43:12, 3.32s/it] {'loss': 0.1114, 'grad_norm': 0.8251648123877693, 'learning_rate': 1.3572979609730586e-08, 'epoch': 0.98} 98%|█████████▊| 33498/34278 [36:42:14<43:12, 3.32s/it] 98%|█████████▊| 33499/34278 [36:42:18<46:34, 3.59s/it] {'loss': 0.1133, 'grad_norm': 0.8127618790375482, 'learning_rate': 1.353821511112574e-08, 'epoch': 0.98} 98%|█████████▊| 33499/34278 [36:42:18<46:34, 3.59s/it] 98%|█████████▊| 33500/34278 [36:42:22<45:13, 3.49s/it] {'loss': 0.1259, 'grad_norm': 1.048735789101215, 'learning_rate': 1.3503495130460786e-08, 'epoch': 0.98} 98%|█████████▊| 33500/34278 [36:42:22<45:13, 3.49s/it] 98%|█████████▊| 33501/34278 [36:42:28<55:38, 4.30s/it] {'loss': 0.1322, 'grad_norm': 0.7650104707120872, 'learning_rate': 1.346881966804714e-08, 'epoch': 0.98} 98%|█████████▊| 
33501/34278 [36:42:28<55:38, 4.30s/it] 98%|█████████▊| 33502/34278 [36:42:31<49:38, 3.84s/it] {'loss': 0.1215, 'grad_norm': 0.9096094626126157, 'learning_rate': 1.3434188724192888e-08, 'epoch': 0.98} 98%|█████████▊| 33502/34278 [36:42:31<49:38, 3.84s/it] 98%|█████████▊| 33503/34278 [36:42:34<46:36, 3.61s/it] {'loss': 0.1101, 'grad_norm': 0.7911768998071496, 'learning_rate': 1.3399602299208337e-08, 'epoch': 0.98} 98%|█████████▊| 33503/34278 [36:42:34<46:36, 3.61s/it] 98%|█████████▊| 33504/34278 [36:42:37<44:26, 3.45s/it] {'loss': 0.1393, 'grad_norm': 0.8342474729210573, 'learning_rate': 1.3365060393401574e-08, 'epoch': 0.98} 98%|█████████▊| 33504/34278 [36:42:37<44:26, 3.45s/it] 98%|█████████▊| 33505/34278 [36:42:40<42:03, 3.26s/it] {'loss': 0.1439, 'grad_norm': 0.829972991771532, 'learning_rate': 1.3330563007080688e-08, 'epoch': 0.98} 98%|█████████▊| 33505/34278 [36:42:40<42:03, 3.26s/it] 98%|█████████▊| 33506/34278 [36:42:46<53:08, 4.13s/it] {'loss': 0.1196, 'grad_norm': 0.8317439227451584, 'learning_rate': 1.3296110140554319e-08, 'epoch': 0.98} 98%|█████████▊| 33506/34278 [36:42:46<53:08, 4.13s/it] 98%|█████████▊| 33507/34278 [36:42:52<1:01:22, 4.78s/it] {'loss': 0.1124, 'grad_norm': 0.8171054407467253, 'learning_rate': 1.326170179413e-08, 'epoch': 0.98} 98%|█████████▊| 33507/34278 [36:42:52<1:01:22, 4.78s/it] 98%|█████████▊| 33508/34278 [36:42:55<54:59, 4.28s/it] {'loss': 0.1217, 'grad_norm': 0.704629330813467, 'learning_rate': 1.3227337968114705e-08, 'epoch': 0.98} 98%|█████████▊| 33508/34278 [36:42:55<54:59, 4.28s/it] 98%|█████████▊| 33509/34278 [36:43:01<59:42, 4.66s/it] {'loss': 0.1042, 'grad_norm': 0.6362952901423882, 'learning_rate': 1.3193018662815416e-08, 'epoch': 0.98} 98%|█████████▊| 33509/34278 [36:43:01<59:42, 4.66s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 98%|█████████▊| 33510/34278 [36:43:04<53:31, 4.18s/it] {'loss': 0.1108, 'grad_norm': 0.7817879002668029, 'learning_rate': 1.315874387853855e-08, 'epoch': 0.98} 98%|█████████▊| 33510/34278 [36:43:04<53:31, 4.18s/it] 98%|█████████▊| 33511/34278 [36:43:08<52:34, 4.11s/it] {'loss': 0.1262, 'grad_norm': 0.7951073224680534, 'learning_rate': 1.3124513615589419e-08, 'epoch': 0.98} 98%|█████████▊| 33511/34278 [36:43:08<52:34, 4.11s/it] 98%|█████████▊| 33512/34278 [36:43:13<58:37, 4.59s/it] {'loss': 0.1213, 'grad_norm': 0.678908976733393, 'learning_rate': 1.3090327874274445e-08, 'epoch': 0.98} 98%|█████████▊| 33512/34278 [36:43:13<58:37, 4.59s/it] 98%|█████████▊| 33513/34278 [36:43:16<52:29, 4.12s/it] {'loss': 0.1038, 'grad_norm': 0.8913169990854443, 'learning_rate': 1.305618665489894e-08, 'epoch': 0.98} 98%|█████████▊| 33513/34278 [36:43:16<52:29, 4.12s/it] 98%|█████████▊| 33514/34278 [36:43:19<48:13, 3.79s/it] {'loss': 0.0742, 'grad_norm': 0.6844177475755414, 'learning_rate': 1.3022089957766548e-08, 'epoch': 0.98} 98%|█████████▊| 33514/34278 [36:43:20<48:13, 3.79s/it] 98%|█████████▊| 33515/34278 [36:43:26<57:24, 4.51s/it] {'loss': 0.0928, 'grad_norm': 0.6221148187316252, 'learning_rate': 1.2988037783183138e-08, 'epoch': 0.98} 98%|█████████▊| 33515/34278 [36:43:26<57:24, 4.51s/it] 98%|█████████▊| 33516/34278 [36:43:29<51:09, 4.03s/it] {'loss': 0.1121, 'grad_norm': 1.4901746606833184, 'learning_rate': 1.29540301314518e-08, 'epoch': 0.98} 98%|█████████▊| 33516/34278 [36:43:29<51:09, 4.03s/it] 98%|█████████▊| 33517/34278 [36:43:34<55:38, 4.39s/it] {'loss': 0.1065, 'grad_norm': 0.7422810677962711, 'learning_rate': 1.292006700287618e-08, 'epoch': 0.98} 98%|█████████▊| 33517/34278 [36:43:34<55:38, 4.39s/it] 98%|█████████▊| 33518/34278 [36:43:37<52:46, 4.17s/it] {'loss': 0.148, 'grad_norm': 0.9316007606678465, 'learning_rate': 1.2886148397759923e-08, 'epoch': 0.98} 98%|█████████▊| 33518/34278 [36:43:37<52:46, 4.17s/it] 98%|█████████▊| 
33519/34278 [36:43:41<49:22, 3.90s/it] {'loss': 0.1225, 'grad_norm': 0.9082058317249178, 'learning_rate': 1.2852274316405567e-08, 'epoch': 0.98} 98%|█████████▊| 33519/34278 [36:43:41<49:22, 3.90s/it] 98%|█████████▊| 33520/34278 [36:43:47<57:05, 4.52s/it] {'loss': 0.1195, 'grad_norm': 1.0126244087613379, 'learning_rate': 1.2818444759115644e-08, 'epoch': 0.98} 98%|█████████▊| 33520/34278 [36:43:47<57:05, 4.52s/it] 98%|█████████▊| 33521/34278 [36:43:50<50:32, 4.01s/it] {'loss': 0.137, 'grad_norm': 1.0742154430928772, 'learning_rate': 1.2784659726192139e-08, 'epoch': 0.98} 98%|█████████▊| 33521/34278 [36:43:50<50:32, 4.01s/it] 98%|█████████▊| 33522/34278 [36:43:53<48:30, 3.85s/it] {'loss': 0.1077, 'grad_norm': 0.779363725290733, 'learning_rate': 1.2750919217936475e-08, 'epoch': 0.98} 98%|█████████▊| 33522/34278 [36:43:53<48:30, 3.85s/it] 98%|█████████▊| 33523/34278 [36:43:56<46:49, 3.72s/it] {'loss': 0.1029, 'grad_norm': 0.7613057254401159, 'learning_rate': 1.2717223234650079e-08, 'epoch': 0.98} 98%|█████████▊| 33523/34278 [36:43:56<46:49, 3.72s/it] 98%|█████████▊| 33524/34278 [36:43:59<44:15, 3.52s/it] {'loss': 0.1275, 'grad_norm': 1.022905821591553, 'learning_rate': 1.2683571776633819e-08, 'epoch': 0.98} 98%|█████████▊| 33524/34278 [36:43:59<44:15, 3.52s/it] 98%|█████████▊| 33525/34278 [36:44:05<52:57, 4.22s/it] {'loss': 0.1042, 'grad_norm': 0.8794318217864383, 'learning_rate': 1.2649964844188013e-08, 'epoch': 0.98} 98%|█████████▊| 33525/34278 [36:44:05<52:57, 4.22s/it] 98%|█████████▊| 33526/34278 [36:44:09<50:56, 4.06s/it] {'loss': 0.1179, 'grad_norm': 0.9135103344287407, 'learning_rate': 1.2616402437612418e-08, 'epoch': 0.98} 98%|█████████▊| 33526/34278 [36:44:09<50:56, 4.06s/it] 98%|█████████▊| 33527/34278 [36:44:12<48:31, 3.88s/it] {'loss': 0.1077, 'grad_norm': 0.8849909720850804, 'learning_rate': 1.2582884557207908e-08, 'epoch': 0.98} 98%|█████████▊| 33527/34278 [36:44:12<48:31, 3.88s/it] 98%|█████████▊| 33528/34278 [36:44:15<44:47, 3.58s/it] {'loss': 0.1012, 
'grad_norm': 0.9713646329248453, 'learning_rate': 1.2549411203272021e-08, 'epoch': 0.98} 98%|█████████▊| 33528/34278 [36:44:15<44:47, 3.58s/it] 98%|█████████▊| 33529/34278 [36:44:19<43:29, 3.48s/it] {'loss': 0.104, 'grad_norm': 0.7887688045716686, 'learning_rate': 1.2515982376104518e-08, 'epoch': 0.98} 98%|█████████▊| 33529/34278 [36:44:19<43:29, 3.48s/it] 98%|█████████▊| 33530/34278 [36:44:22<41:17, 3.31s/it] {'loss': 0.1076, 'grad_norm': 0.8497511414826573, 'learning_rate': 1.2482598076003493e-08, 'epoch': 0.98} 98%|█████████▊| 33530/34278 [36:44:22<41:17, 3.31s/it] 98%|█████████▊| 33531/34278 [36:44:25<41:25, 3.33s/it] {'loss': 0.1212, 'grad_norm': 0.8172917817280038, 'learning_rate': 1.2449258303267597e-08, 'epoch': 0.98} 98%|█████████▊| 33531/34278 [36:44:25<41:25, 3.33s/it] 98%|█████████▊| 33532/34278 [36:44:28<40:31, 3.26s/it] {'loss': 0.1096, 'grad_norm': 0.8647268169479864, 'learning_rate': 1.241596305819437e-08, 'epoch': 0.98} 98%|█████████▊| 33532/34278 [36:44:28<40:31, 3.26s/it] 98%|█████████▊| 33533/34278 [36:44:31<41:20, 3.33s/it] {'loss': 0.1131, 'grad_norm': 0.7647816226522861, 'learning_rate': 1.238271234108024e-08, 'epoch': 0.98} 98%|█████████▊| 33533/34278 [36:44:31<41:20, 3.33s/it] 98%|█████████▊| 33534/34278 [36:44:37<50:22, 4.06s/it] {'loss': 0.1095, 'grad_norm': 0.6886995127166381, 'learning_rate': 1.2349506152223301e-08, 'epoch': 0.98} 98%|█████████▊| 33534/34278 [36:44:37<50:22, 4.06s/it] 98%|█████████▊| 33535/34278 [36:44:40<44:58, 3.63s/it] {'loss': 0.1005, 'grad_norm': 0.7503952331544292, 'learning_rate': 1.231634449191832e-08, 'epoch': 0.98} 98%|█████████▊| 33535/34278 [36:44:40<44:58, 3.63s/it] 98%|█████████▊| 33536/34278 [36:44:44<47:11, 3.82s/it] {'loss': 0.1056, 'grad_norm': 0.6880498841861453, 'learning_rate': 1.2283227360462834e-08, 'epoch': 0.98} 98%|█████████▊| 33536/34278 [36:44:44<47:11, 3.82s/it] 98%|█████████▊| 33537/34278 [36:44:48<47:23, 3.84s/it] {'loss': 0.1143, 'grad_norm': 0.708908472219334, 'learning_rate': 
[training log excerpt: optimizer steps 33538–33731 of 34278 (98%, epoch 0.98). Duplicated tqdm progress-bar redraws collapsed; the records for steps 33537 and 33732 are truncated in the source and omitted. Over this span, loss fluctuates between 0.0734 and 0.1528, grad_norm mostly stays in 0.6–1.2 (one outlier of 2.78 at step 33665), learning_rate decays from 1.2217e-08 to 6.6767e-09, wall clock advances from 36:44:51 to 36:56:50, and step time varies between 3.0 and 5.0 s/it. First and last intact records:]

 98%|█████████▊| 33538/34278 [36:44:51<43:49, 3.55s/it] {'loss': 0.0734, 'grad_norm': 0.6950050722908366, 'learning_rate': 1.2217126685281633e-08, 'epoch': 0.98}
 ...
 98%|█████████▊| 33731/34278 [36:56:50<32:34, 3.57s/it] {'loss': 0.1254, 'grad_norm': 0.8722186341634288, 'learning_rate': 6.6766798531126e-09, 'epoch': 0.98}
0.1193, 'grad_norm': 0.7624681886954229, 'learning_rate': 6.652295585085066e-09, 'epoch': 0.98} 98%|█████████▊| 33732/34278 [36:56:54<32:48, 3.61s/it] 98%|█████████▊| 33733/34278 [36:56:57<32:17, 3.56s/it] {'loss': 0.1081, 'grad_norm': 0.8032247254944517, 'learning_rate': 6.627955896473248e-09, 'epoch': 0.98} 98%|█████████▊| 33733/34278 [36:56:57<32:17, 3.56s/it] 98%|█████████▊| 33734/34278 [36:57:01<31:47, 3.51s/it] {'loss': 0.1346, 'grad_norm': 0.9768125206862693, 'learning_rate': 6.603660787495303e-09, 'epoch': 0.98} 98%|█████████▊| 33734/34278 [36:57:01<31:47, 3.51s/it] 98%|█████████▊| 33735/34278 [36:57:04<30:18, 3.35s/it] {'loss': 0.0867, 'grad_norm': 0.8058024424483172, 'learning_rate': 6.579410258367724e-09, 'epoch': 0.98} 98%|█████████▊| 33735/34278 [36:57:04<30:18, 3.35s/it] 98%|█████████▊| 33736/34278 [36:57:09<35:22, 3.92s/it] {'loss': 0.0987, 'grad_norm': 0.7482135268712199, 'learning_rate': 6.5552043093070065e-09, 'epoch': 0.98} 98%|█████████▊| 33736/34278 [36:57:09<35:22, 3.92s/it] 98%|█████████▊| 33737/34278 [36:57:12<33:11, 3.68s/it] {'loss': 0.1318, 'grad_norm': 0.9304102671489258, 'learning_rate': 6.531042940529642e-09, 'epoch': 0.98} 98%|█████████▊| 33737/34278 [36:57:12<33:11, 3.68s/it] 98%|█████████▊| 33738/34278 [36:57:15<31:32, 3.50s/it] {'loss': 0.1001, 'grad_norm': 0.7830598836781537, 'learning_rate': 6.5069261522510145e-09, 'epoch': 0.98} 98%|█████████▊| 33738/34278 [36:57:15<31:32, 3.50s/it] 98%|█████████▊| 33739/34278 [36:57:18<30:16, 3.37s/it] {'loss': 0.0917, 'grad_norm': 0.7684164285739562, 'learning_rate': 6.482853944686507e-09, 'epoch': 0.98} 98%|█████████▊| 33739/34278 [36:57:18<30:16, 3.37s/it] 98%|█████████▊| 33740/34278 [36:57:22<30:26, 3.39s/it] {'loss': 0.1106, 'grad_norm': 0.7536747962239854, 'learning_rate': 6.458826318050948e-09, 'epoch': 0.98} 98%|█████████▊| 33740/34278 [36:57:22<30:26, 3.39s/it] 98%|█████████▊| 33741/34278 [36:57:25<31:20, 3.50s/it] {'loss': 0.1087, 'grad_norm': 0.7836223399991963, 'learning_rate': 
6.434843272558611e-09, 'epoch': 0.98} 98%|█████████▊| 33741/34278 [36:57:25<31:20, 3.50s/it] 98%|█████████▊| 33742/34278 [36:57:29<30:49, 3.45s/it] {'loss': 0.123, 'grad_norm': 0.8938947438845976, 'learning_rate': 6.410904808424878e-09, 'epoch': 0.98} 98%|█████████▊| 33742/34278 [36:57:29<30:49, 3.45s/it] 98%|█████████▊| 33743/34278 [36:57:32<29:00, 3.25s/it] {'loss': 0.0952, 'grad_norm': 0.7465379618144085, 'learning_rate': 6.387010925861803e-09, 'epoch': 0.98} 98%|█████████▊| 33743/34278 [36:57:32<29:00, 3.25s/it] 98%|█████████▊| 33744/34278 [36:57:35<28:43, 3.23s/it] {'loss': 0.1137, 'grad_norm': 0.7405949789405286, 'learning_rate': 6.363161625083103e-09, 'epoch': 0.98} 98%|█████████▊| 33744/34278 [36:57:35<28:43, 3.23s/it] 98%|█████████▊| 33745/34278 [36:57:38<27:51, 3.14s/it] {'loss': 0.1294, 'grad_norm': 0.8374986638280276, 'learning_rate': 6.339356906303051e-09, 'epoch': 0.98} 98%|█████████▊| 33745/34278 [36:57:38<27:51, 3.14s/it] 98%|█████████▊| 33746/34278 [36:57:44<35:16, 3.98s/it] {'loss': 0.1078, 'grad_norm': 0.9245977154510832, 'learning_rate': 6.315596769732035e-09, 'epoch': 0.98} 98%|█████████▊| 33746/34278 [36:57:44<35:16, 3.98s/it] 98%|█████████▊| 33747/34278 [36:57:48<35:11, 3.98s/it] {'loss': 0.1185, 'grad_norm': 0.78849944130388, 'learning_rate': 6.291881215584328e-09, 'epoch': 0.98} 98%|█████████▊| 33747/34278 [36:57:48<35:11, 3.98s/it] 98%|█████████▊| 33748/34278 [36:57:51<33:01, 3.74s/it] {'loss': 0.1102, 'grad_norm': 0.8158094863158458, 'learning_rate': 6.268210244069761e-09, 'epoch': 0.98} 98%|█████████▊| 33748/34278 [36:57:51<33:01, 3.74s/it] 98%|█████████▊| 33749/34278 [36:57:54<31:06, 3.53s/it] {'loss': 0.1218, 'grad_norm': 0.8965222529472734, 'learning_rate': 6.244583855400943e-09, 'epoch': 0.98} 98%|█████████▊| 33749/34278 [36:57:54<31:06, 3.53s/it] 98%|█████████▊| 33750/34278 [36:57:57<29:43, 3.38s/it] {'loss': 0.0927, 'grad_norm': 0.7050811801846438, 'learning_rate': 6.2210020497882605e-09, 'epoch': 0.98} 98%|█████████▊| 33750/34278 
[36:57:57<29:43, 3.38s/it] 98%|█████████▊| 33751/34278 [36:58:00<28:21, 3.23s/it] {'loss': 0.1248, 'grad_norm': 0.8155386673770202, 'learning_rate': 6.197464827442657e-09, 'epoch': 0.98} 98%|█████████▊| 33751/34278 [36:58:00<28:21, 3.23s/it] 98%|█████████▊| 33752/34278 [36:58:03<27:54, 3.18s/it] {'loss': 0.1034, 'grad_norm': 0.8699865058644309, 'learning_rate': 6.173972188573407e-09, 'epoch': 0.98} 98%|█████████▊| 33752/34278 [36:58:03<27:54, 3.18s/it] 98%|█████████▊| 33753/34278 [36:58:06<28:39, 3.27s/it] {'loss': 0.0969, 'grad_norm': 0.7230642282293623, 'learning_rate': 6.1505241333909e-09, 'epoch': 0.98} 98%|█████████▊| 33753/34278 [36:58:06<28:39, 3.27s/it] 98%|█████████▊| 33754/34278 [36:58:09<28:00, 3.21s/it] {'loss': 0.1072, 'grad_norm': 0.7383807593153091, 'learning_rate': 6.127120662104968e-09, 'epoch': 0.98} 98%|█████████▊| 33754/34278 [36:58:09<28:00, 3.21s/it] 98%|█████████▊| 33755/34278 [36:58:15<35:25, 4.06s/it] {'loss': 0.1239, 'grad_norm': 0.7064889880173199, 'learning_rate': 6.103761774923778e-09, 'epoch': 0.98} 98%|█████████▊| 33755/34278 [36:58:15<35:25, 4.06s/it] 98%|█████████▊| 33756/34278 [36:58:18<32:24, 3.72s/it] {'loss': 0.1309, 'grad_norm': 0.7074108063353488, 'learning_rate': 6.080447472055495e-09, 'epoch': 0.98} 98%|█████████▊| 33756/34278 [36:58:18<32:24, 3.72s/it] 98%|█████████▊| 33757/34278 [36:58:21<30:48, 3.55s/it] {'loss': 0.1446, 'grad_norm': 0.9665701456235283, 'learning_rate': 6.057177753709398e-09, 'epoch': 0.98} 98%|█████████▊| 33757/34278 [36:58:21<30:48, 3.55s/it] 98%|█████████▊| 33758/34278 [36:58:28<37:55, 4.38s/it] {'loss': 0.1068, 'grad_norm': 0.7167502735280957, 'learning_rate': 6.033952620092542e-09, 'epoch': 0.98} 98%|█████████▊| 33758/34278 [36:58:28<37:55, 4.38s/it] 98%|█████████▊| 33759/34278 [36:58:31<34:26, 3.98s/it] {'loss': 0.1252, 'grad_norm': 1.018252063516159, 'learning_rate': 6.010772071412541e-09, 'epoch': 0.98} 98%|█████████▊| 33759/34278 [36:58:31<34:26, 3.98s/it] 98%|█████████▊| 33760/34278 
[36:58:34<31:20, 3.63s/it] {'loss': 0.0858, 'grad_norm': 0.8126773964569709, 'learning_rate': 5.987636107875894e-09, 'epoch': 0.98} 98%|█████████▊| 33760/34278 [36:58:34<31:20, 3.63s/it] 98%|█████████▊| 33761/34278 [36:58:37<30:21, 3.52s/it] {'loss': 0.1034, 'grad_norm': 0.7460523752464587, 'learning_rate': 5.964544729689658e-09, 'epoch': 0.98} 98%|█████████▊| 33761/34278 [36:58:37<30:21, 3.52s/it] 98%|█████████▊| 33762/34278 [36:58:43<37:03, 4.31s/it] {'loss': 0.1156, 'grad_norm': 0.7929453814306174, 'learning_rate': 5.941497937059227e-09, 'epoch': 0.98} 98%|█████████▊| 33762/34278 [36:58:43<37:03, 4.31s/it] 98%|█████████▊| 33763/34278 [36:58:48<39:09, 4.56s/it] {'loss': 0.1034, 'grad_norm': 0.8474348784984916, 'learning_rate': 5.918495730191654e-09, 'epoch': 0.98} 98%|█████████▊| 33763/34278 [36:58:48<39:09, 4.56s/it] 99%|█████████▊| 33764/34278 [36:58:51<34:38, 4.04s/it] {'loss': 0.1046, 'grad_norm': 0.845453547210441, 'learning_rate': 5.895538109291221e-09, 'epoch': 0.99} 99%|█████████▊| 33764/34278 [36:58:51<34:38, 4.04s/it] 99%|█████████▊| 33765/34278 [36:58:54<32:01, 3.75s/it] {'loss': 0.1169, 'grad_norm': 0.8228342380688106, 'learning_rate': 5.8726250745633205e-09, 'epoch': 0.99} 99%|█████████▊| 33765/34278 [36:58:54<32:01, 3.75s/it] 99%|█████████▊| 33766/34278 [36:58:57<30:44, 3.60s/it] {'loss': 0.1035, 'grad_norm': 0.8642893571375703, 'learning_rate': 5.849756626212788e-09, 'epoch': 0.99} 99%|█████████▊| 33766/34278 [36:58:57<30:44, 3.60s/it] 99%|█████████▊| 33767/34278 [36:59:00<29:13, 3.43s/it] {'loss': 0.1319, 'grad_norm': 0.873641406793772, 'learning_rate': 5.826932764442794e-09, 'epoch': 0.99} 99%|█████████▊| 33767/34278 [36:59:00<29:13, 3.43s/it] 99%|█████████▊| 33768/34278 [36:59:03<28:16, 3.33s/it] {'loss': 0.0969, 'grad_norm': 0.7918134576248088, 'learning_rate': 5.804153489458175e-09, 'epoch': 0.99} 99%|█████████▊| 33768/34278 [36:59:03<28:16, 3.33s/it] 99%|█████████▊| 33769/34278 [36:59:08<30:58, 3.65s/it] {'loss': 0.1002, 'grad_norm': 
0.7837560319809252, 'learning_rate': 5.781418801461547e-09, 'epoch': 0.99} 99%|█████████▊| 33769/34278 [36:59:08<30:58, 3.65s/it] 99%|█████████▊| 33770/34278 [36:59:11<29:12, 3.45s/it] {'loss': 0.1048, 'grad_norm': 0.8508208071336113, 'learning_rate': 5.758728700656635e-09, 'epoch': 0.99} 99%|█████████▊| 33770/34278 [36:59:11<29:12, 3.45s/it] 99%|█████████▊| 33771/34278 [36:59:14<28:08, 3.33s/it] {'loss': 0.111, 'grad_norm': 0.8732390898780681, 'learning_rate': 5.736083187244945e-09, 'epoch': 0.99} 99%|█████████▊| 33771/34278 [36:59:14<28:08, 3.33s/it] 99%|█████████▊| 33772/34278 [36:59:18<30:20, 3.60s/it] {'loss': 0.1099, 'grad_norm': 0.726612294000801, 'learning_rate': 5.713482261429648e-09, 'epoch': 0.99} 99%|█████████▊| 33772/34278 [36:59:18<30:20, 3.60s/it] 99%|█████████▊| 33773/34278 [36:59:23<34:02, 4.05s/it] {'loss': 0.0882, 'grad_norm': 0.6977549401593138, 'learning_rate': 5.690925923412249e-09, 'epoch': 0.99} 99%|█████████▊| 33773/34278 [36:59:23<34:02, 4.05s/it] 99%|█████████▊| 33774/34278 [36:59:27<33:17, 3.96s/it] {'loss': 0.1109, 'grad_norm': 1.0404520066601548, 'learning_rate': 5.6684141733936996e-09, 'epoch': 0.99} 99%|█████████▊| 33774/34278 [36:59:27<33:17, 3.96s/it] 99%|█████████▊| 33775/34278 [36:59:30<31:09, 3.72s/it] {'loss': 0.1122, 'grad_norm': 0.8411315454281804, 'learning_rate': 5.645947011576059e-09, 'epoch': 0.99} 99%|█████████▊| 33775/34278 [36:59:30<31:09, 3.72s/it] 99%|█████████▊| 33776/34278 [36:59:34<30:39, 3.66s/it] {'loss': 0.1029, 'grad_norm': 0.7566714896192657, 'learning_rate': 5.623524438158612e-09, 'epoch': 0.99} 99%|█████████▊| 33776/34278 [36:59:34<30:39, 3.66s/it] 99%|█████████▊| 33777/34278 [36:59:40<36:07, 4.33s/it] {'loss': 0.1045, 'grad_norm': 0.7386793368375577, 'learning_rate': 5.601146453341755e-09, 'epoch': 0.99} 99%|█████████▊| 33777/34278 [36:59:40<36:07, 4.33s/it] 99%|█████████▊| 33778/34278 [36:59:43<33:08, 3.98s/it] {'loss': 0.1153, 'grad_norm': 0.8420695324047094, 'learning_rate': 5.578813057325883e-09, 
'epoch': 0.99} 99%|█████████▊| 33778/34278 [36:59:43<33:08, 3.98s/it] 99%|█████████▊| 33779/34278 [36:59:46<30:34, 3.68s/it] {'loss': 0.1079, 'grad_norm': 0.8975325100795434, 'learning_rate': 5.55652425031028e-09, 'epoch': 0.99} 99%|█████████▊| 33779/34278 [36:59:46<30:34, 3.68s/it] 99%|█████████▊| 33780/34278 [36:59:49<28:42, 3.46s/it] {'loss': 0.0951, 'grad_norm': 0.7724228338656928, 'learning_rate': 5.534280032493678e-09, 'epoch': 0.99} 99%|█████████▊| 33780/34278 [36:59:49<28:42, 3.46s/it] 99%|█████████▊| 33781/34278 [36:59:52<28:49, 3.48s/it] {'loss': 0.1233, 'grad_norm': 0.9173868650710375, 'learning_rate': 5.512080404074804e-09, 'epoch': 0.99} 99%|█████████▊| 33781/34278 [36:59:52<28:49, 3.48s/it] 99%|█████████▊| 33782/34278 [36:59:55<28:14, 3.42s/it] {'loss': 0.1235, 'grad_norm': 0.9017049037987892, 'learning_rate': 5.489925365251836e-09, 'epoch': 0.99} 99%|█████████▊| 33782/34278 [36:59:55<28:14, 3.42s/it] 99%|█████████▊| 33783/34278 [37:00:00<30:09, 3.66s/it] {'loss': 0.1097, 'grad_norm': 0.6661587110680808, 'learning_rate': 5.467814916222392e-09, 'epoch': 0.99} 99%|█████████▊| 33783/34278 [37:00:00<30:09, 3.66s/it] 99%|█████████▊| 33784/34278 [37:00:03<28:15, 3.43s/it] {'loss': 0.099, 'grad_norm': 0.6911047352843254, 'learning_rate': 5.445749057184091e-09, 'epoch': 0.99} 99%|█████████▊| 33784/34278 [37:00:03<28:15, 3.43s/it] 99%|█████████▊| 33785/34278 [37:00:06<27:27, 3.34s/it] {'loss': 0.1084, 'grad_norm': 0.8754468376270351, 'learning_rate': 5.423727788333444e-09, 'epoch': 0.99} 99%|█████████▊| 33785/34278 [37:00:06<27:27, 3.34s/it] 99%|█████████▊| 33786/34278 [37:00:12<33:51, 4.13s/it] {'loss': 0.1234, 'grad_norm': 0.8638236782689099, 'learning_rate': 5.40175110986807e-09, 'epoch': 0.99} 99%|█████████▊| 33786/34278 [37:00:12<33:51, 4.13s/it] 99%|█████████▊| 33787/34278 [37:00:15<30:59, 3.79s/it] {'loss': 0.1126, 'grad_norm': 0.8495487146954932, 'learning_rate': 5.379819021982813e-09, 'epoch': 0.99} 99%|█████████▊| 33787/34278 [37:00:15<30:59, 
3.79s/it] 99%|█████████▊| 33788/34278 [37:00:18<29:55, 3.66s/it] {'loss': 0.1149, 'grad_norm': 0.774478508037016, 'learning_rate': 5.3579315248747376e-09, 'epoch': 0.99} 99%|█████████▊| 33788/34278 [37:00:18<29:55, 3.66s/it] 99%|█████████▊| 33789/34278 [37:00:21<28:01, 3.44s/it] {'loss': 0.1014, 'grad_norm': 0.8827195113424046, 'learning_rate': 5.336088618738688e-09, 'epoch': 0.99} 99%|█████████▊| 33789/34278 [37:00:21<28:01, 3.44s/it] 99%|█████████▊| 33790/34278 [37:00:24<28:15, 3.47s/it] {'loss': 0.1233, 'grad_norm': 0.6734589417483602, 'learning_rate': 5.314290303770065e-09, 'epoch': 0.99} 99%|█████████▊| 33790/34278 [37:00:25<28:15, 3.47s/it] 99%|█████████▊| 33791/34278 [37:00:28<27:22, 3.37s/it] {'loss': 0.0978, 'grad_norm': 0.7914457591354623, 'learning_rate': 5.292536580162599e-09, 'epoch': 0.99} 99%|█████████▊| 33791/34278 [37:00:28<27:22, 3.37s/it] 99%|█████████▊| 33792/34278 [37:00:31<27:00, 3.34s/it] {'loss': 0.0885, 'grad_norm': 0.7155567007460567, 'learning_rate': 5.270827448111137e-09, 'epoch': 0.99} 99%|█████████▊| 33792/34278 [37:00:31<27:00, 3.34s/it] 99%|█████████▊| 33793/34278 [37:00:34<26:41, 3.30s/it] {'loss': 0.1077, 'grad_norm': 0.809195209325187, 'learning_rate': 5.249162907809413e-09, 'epoch': 0.99} 99%|█████████▊| 33793/34278 [37:00:34<26:41, 3.30s/it] 99%|█████████▊| 33794/34278 [37:00:40<33:18, 4.13s/it] {'loss': 0.1372, 'grad_norm': 0.7076078580788532, 'learning_rate': 5.227542959450604e-09, 'epoch': 0.99} 99%|█████████▊| 33794/34278 [37:00:40<33:18, 4.13s/it] 99%|█████████▊| 33795/34278 [37:00:43<30:22, 3.77s/it] {'loss': 0.1441, 'grad_norm': 0.9310975834524328, 'learning_rate': 5.2059676032284454e-09, 'epoch': 0.99} 99%|█████████▊| 33795/34278 [37:00:43<30:22, 3.77s/it] 99%|█████████▊| 33796/34278 [37:00:46<29:02, 3.61s/it] {'loss': 0.1368, 'grad_norm': 0.9107781975811486, 'learning_rate': 5.1844368393350054e-09, 'epoch': 0.99} 99%|█████████▊| 33796/34278 [37:00:46<29:02, 3.61s/it] 99%|█████████▊| 33797/34278 [37:00:49<27:06, 
3.38s/it] {'loss': 0.1132, 'grad_norm': 0.8790947077131462, 'learning_rate': 5.162950667962352e-09, 'epoch': 0.99} 99%|█████████▊| 33797/34278 [37:00:49<27:06, 3.38s/it] 99%|█████████▊| 33798/34278 [37:00:52<25:57, 3.25s/it] {'loss': 0.1085, 'grad_norm': 0.8364424603862561, 'learning_rate': 5.141509089301999e-09, 'epoch': 0.99} 99%|█████████▊| 33798/34278 [37:00:52<25:57, 3.25s/it] 99%|█████████▊| 33799/34278 [37:00:56<26:18, 3.29s/it] {'loss': 0.1121, 'grad_norm': 1.0138809314595514, 'learning_rate': 5.120112103546571e-09, 'epoch': 0.99} 99%|█████████▊| 33799/34278 [37:00:56<26:18, 3.29s/it] 99%|█████████▊| 33800/34278 [37:00:59<26:29, 3.33s/it] {'loss': 0.1234, 'grad_norm': 0.9221749366125264, 'learning_rate': 5.09875971088647e-09, 'epoch': 0.99} 99%|█████████▊| 33800/34278 [37:00:59<26:29, 3.33s/it] 99%|█████████▊| 33801/34278 [37:01:03<28:32, 3.59s/it] {'loss': 0.1254, 'grad_norm': 0.6832511862452569, 'learning_rate': 5.077451911512099e-09, 'epoch': 0.99} 99%|█████████▊| 33801/34278 [37:01:03<28:32, 3.59s/it] 99%|█████████▊| 33802/34278 [37:01:08<30:36, 3.86s/it] {'loss': 0.1065, 'grad_norm': 0.8844021458838913, 'learning_rate': 5.056188705613863e-09, 'epoch': 0.99} 99%|█████████▊| 33802/34278 [37:01:08<30:36, 3.86s/it] 99%|█████████▊| 33803/34278 [37:01:11<29:38, 3.74s/it] {'loss': 0.0935, 'grad_norm': 0.8742626585704845, 'learning_rate': 5.0349700933810534e-09, 'epoch': 0.99} 99%|█████████▊| 33803/34278 [37:01:11<29:38, 3.74s/it] 99%|█████████▊| 33804/34278 [37:01:17<34:36, 4.38s/it] {'loss': 0.1122, 'grad_norm': 0.6718860110054, 'learning_rate': 5.013796075004074e-09, 'epoch': 0.99} 99%|█████████▊| 33804/34278 [37:01:17<34:36, 4.38s/it] 99%|█████████▊| 33805/34278 [37:01:20<30:48, 3.91s/it] {'loss': 0.1059, 'grad_norm': 0.7714586732886123, 'learning_rate': 4.9926666506716624e-09, 'epoch': 0.99} 99%|█████████▊| 33805/34278 [37:01:20<30:48, 3.91s/it] 99%|█████████▊| 33806/34278 [37:01:23<28:27, 3.62s/it] {'loss': 0.105, 'grad_norm': 0.83949546271214, 
'learning_rate': 4.971581820572002e-09, 'epoch': 0.99} 99%|█████████▊| 33806/34278 [37:01:23<28:27, 3.62s/it] 99%|█████████▊| 33807/34278 [37:01:26<27:02, 3.44s/it] {'loss': 0.0902, 'grad_norm': 0.8939256990744707, 'learning_rate': 4.950541584893831e-09, 'epoch': 0.99} 99%|█████████▊| 33807/34278 [37:01:26<27:02, 3.44s/it] 99%|█████████▊| 33808/34278 [37:01:32<33:14, 4.24s/it] {'loss': 0.1202, 'grad_norm': 0.7161276479660431, 'learning_rate': 4.929545943825331e-09, 'epoch': 0.99} 99%|█████████▊| 33808/34278 [37:01:32<33:14, 4.24s/it] 99%|█████████▊| 33809/34278 [37:01:35<30:08, 3.86s/it] {'loss': 0.1125, 'grad_norm': 0.9380570904591177, 'learning_rate': 4.9085948975524654e-09, 'epoch': 0.99} 99%|█████████▊| 33809/34278 [37:01:35<30:08, 3.86s/it] 99%|█████████▊| 33810/34278 [37:01:38<28:47, 3.69s/it] {'loss': 0.1038, 'grad_norm': 0.8168481495001629, 'learning_rate': 4.887688446263971e-09, 'epoch': 0.99} 99%|█████████▊| 33810/34278 [37:01:38<28:47, 3.69s/it] 99%|█████████▊| 33811/34278 [37:01:42<29:19, 3.77s/it] {'loss': 0.0977, 'grad_norm': 0.860385922459585, 'learning_rate': 4.866826590145257e-09, 'epoch': 0.99} 99%|█████████▊| 33811/34278 [37:01:42<29:19, 3.77s/it] 99%|█████████▊| 33812/34278 [37:01:45<27:37, 3.56s/it] {'loss': 0.1178, 'grad_norm': 1.0703660474635481, 'learning_rate': 4.846009329383394e-09, 'epoch': 0.99} 99%|█████████▊| 33812/34278 [37:01:45<27:37, 3.56s/it] 99%|█████████▊| 33813/34278 [37:01:49<27:48, 3.59s/it] {'loss': 0.1323, 'grad_norm': 0.907132889092109, 'learning_rate': 4.825236664163791e-09, 'epoch': 0.99} 99%|█████████▊| 33813/34278 [37:01:49<27:48, 3.59s/it] 99%|█████████▊| 33814/34278 [37:01:52<26:48, 3.47s/it] {'loss': 0.0998, 'grad_norm': 1.175990114333469, 'learning_rate': 4.804508594671853e-09, 'epoch': 0.99} 99%|█████████▊| 33814/34278 [37:01:52<26:48, 3.47s/it] 99%|█████████▊| 33815/34278 [37:01:56<28:16, 3.67s/it] {'loss': 0.1324, 'grad_norm': 0.9312263874009149, 'learning_rate': 4.783825121093544e-09, 'epoch': 0.99} 
99%|█████████▊| 33815/34278 [37:01:56<28:16, 3.67s/it] 99%|█████████▊| 33816/34278 [37:01:59<27:03, 3.51s/it] {'loss': 0.1164, 'grad_norm': 0.8647671220568324, 'learning_rate': 4.7631862436120506e-09, 'epoch': 0.99} 99%|█████████▊| 33816/34278 [37:01:59<27:03, 3.51s/it] 99%|█████████▊| 33817/34278 [37:02:03<26:27, 3.44s/it] {'loss': 0.1037, 'grad_norm': 0.933902115857926, 'learning_rate': 4.7425919624122244e-09, 'epoch': 0.99} 99%|█████████▊| 33817/34278 [37:02:03<26:27, 3.44s/it] 99%|█████████▊| 33818/34278 [37:02:08<32:09, 4.20s/it] {'loss': 0.1086, 'grad_norm': 0.8014744857424474, 'learning_rate': 4.722042277678918e-09, 'epoch': 0.99} 99%|█████████▊| 33818/34278 [37:02:08<32:09, 4.20s/it] 99%|█████████▊| 33819/34278 [37:02:11<29:05, 3.80s/it] {'loss': 0.1056, 'grad_norm': 0.7676682128542434, 'learning_rate': 4.701537189594207e-09, 'epoch': 0.99} 99%|█████████▊| 33819/34278 [37:02:11<29:05, 3.80s/it] 99%|█████████▊| 33820/34278 [37:02:14<26:31, 3.48s/it] {'loss': 0.1154, 'grad_norm': 0.9070406343868757, 'learning_rate': 4.681076698341836e-09, 'epoch': 0.99} 99%|█████████▊| 33820/34278 [37:02:14<26:31, 3.48s/it] 99%|█████████▊| 33821/34278 [37:02:20<30:59, 4.07s/it] {'loss': 0.1106, 'grad_norm': 0.8643679289367764, 'learning_rate': 4.6606608041038785e-09, 'epoch': 0.99} 99%|█████████▊| 33821/34278 [37:02:20<30:59, 4.07s/it] 99%|█████████▊| 33822/34278 [37:02:24<32:43, 4.31s/it] {'loss': 0.1027, 'grad_norm': 0.8080790414757442, 'learning_rate': 4.640289507063522e-09, 'epoch': 0.99} 99%|█████████▊| 33822/34278 [37:02:24<32:43, 4.31s/it] 99%|█████████▊| 33823/34278 [37:02:28<30:22, 4.00s/it] {'loss': 0.1249, 'grad_norm': 0.8425880321963215, 'learning_rate': 4.6199628074022895e-09, 'epoch': 0.99} 99%|█████████▊| 33823/34278 [37:02:28<30:22, 4.00s/it] 99%|█████████▊| 33824/34278 [37:02:31<29:14, 3.86s/it] {'loss': 0.1184, 'grad_norm': 1.1170594980059778, 'learning_rate': 4.599680705301146e-09, 'epoch': 0.99} 99%|█████████▊| 33824/34278 [37:02:31<29:14, 3.86s/it] 
99%|█████████▊| 33825/34278 [37:02:34<27:43, 3.67s/it] {'loss': 0.0808, 'grad_norm': 0.780067863775467, 'learning_rate': 4.5794432009416134e-09, 'epoch': 0.99} 99%|█████████▊| 33825/34278 [37:02:34<27:43, 3.67s/it] 99%|█████████▊| 33826/34278 [37:02:37<25:57, 3.45s/it] {'loss': 0.1092, 'grad_norm': 0.6142058142011302, 'learning_rate': 4.559250294504658e-09, 'epoch': 0.99} 99%|█████████▊| 33826/34278 [37:02:37<25:57, 3.45s/it] 99%|█████████▊| 33827/34278 [37:02:41<26:10, 3.48s/it] {'loss': 0.1227, 'grad_norm': 0.841668590756161, 'learning_rate': 4.539101986170136e-09, 'epoch': 0.99} 99%|█████████▊| 33827/34278 [37:02:41<26:10, 3.48s/it] 99%|█████████▊| 33828/34278 [37:02:45<27:43, 3.70s/it] {'loss': 0.1022, 'grad_norm': 1.0237218215407604, 'learning_rate': 4.518998276117903e-09, 'epoch': 0.99} 99%|█████████▊| 33828/34278 [37:02:45<27:43, 3.70s/it]/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True. 
Gradients will be None warnings.warn( 99%|█████████▊| 33829/34278 [37:02:48<26:14, 3.51s/it] {'loss': 0.11, 'grad_norm': 0.7902743064688862, 'learning_rate': 4.498939164527261e-09, 'epoch': 0.99} 99%|█████████▊| 33829/34278 [37:02:48<26:14, 3.51s/it] 99%|█████████▊| 33830/34278 [37:02:51<25:17, 3.39s/it] {'loss': 0.097, 'grad_norm': 0.7704266705221741, 'learning_rate': 4.4789246515780645e-09, 'epoch': 0.99} 99%|█████████▊| 33830/34278 [37:02:51<25:17, 3.39s/it] 99%|█████████▊| 33831/34278 [37:02:54<24:24, 3.28s/it] {'loss': 0.1272, 'grad_norm': 1.0580313645697441, 'learning_rate': 4.458954737447951e-09, 'epoch': 0.99} 99%|█████████▊| 33831/34278 [37:02:54<24:24, 3.28s/it] 99%|█████████▊| 33832/34278 [37:02:58<24:40, 3.32s/it] {'loss': 0.1096, 'grad_norm': 0.7736702002583923, 'learning_rate': 4.4390294223162215e-09, 'epoch': 0.99} 99%|█████████▊| 33832/34278 [37:02:58<24:40, 3.32s/it] 99%|█████████▊| 33833/34278 [37:03:01<23:57, 3.23s/it] {'loss': 0.1002, 'grad_norm': 0.8535642495482743, 'learning_rate': 4.419148706359955e-09, 'epoch': 0.99} 99%|█████████▊| 33833/34278 [37:03:01<23:57, 3.23s/it] 99%|█████████▊| 33834/34278 [37:03:04<22:52, 3.09s/it] {'loss': 0.1245, 'grad_norm': 0.8200675228157727, 'learning_rate': 4.399312589757343e-09, 'epoch': 0.99} 99%|█████████▊| 33834/34278 [37:03:04<22:52, 3.09s/it] 99%|█████████▊| 33835/34278 [37:03:08<25:22, 3.44s/it] {'loss': 0.0944, 'grad_norm': 0.7445801999459258, 'learning_rate': 4.379521072684911e-09, 'epoch': 0.99} 99%|█████████▊| 33835/34278 [37:03:08<25:22, 3.44s/it] 99%|█████████▊| 33836/34278 [37:03:11<25:15, 3.43s/it] {'loss': 0.1251, 'grad_norm': 0.953599559050052, 'learning_rate': 4.3597741553191856e-09, 'epoch': 0.99} 99%|█████████▊| 33836/34278 [37:03:11<25:15, 3.43s/it] 99%|█████████▊| 33837/34278 [37:03:15<26:44, 3.64s/it] {'loss': 0.1225, 'grad_norm': 0.7392225373935208, 'learning_rate': 4.3400718378372455e-09, 'epoch': 0.99} 99%|█████████▊| 33837/34278 [37:03:15<26:44, 3.64s/it] 99%|█████████▊| 
33838/34278 [37:03:19<27:39, 3.77s/it] {'loss': 0.1198, 'grad_norm': 1.169378435526558, 'learning_rate': 4.320414120415062e-09, 'epoch': 0.99} 99%|█████████▊| 33838/34278 [37:03:19<27:39, 3.77s/it] 99%|█████████▊| 33839/34278 [37:03:23<28:14, 3.86s/it] {'loss': 0.0919, 'grad_norm': 0.9150936407751873, 'learning_rate': 4.30080100322694e-09, 'epoch': 0.99} 99%|█████████▊| 33839/34278 [37:03:24<28:14, 3.86s/it] 99%|█████████▊| 33840/34278 [37:03:30<34:10, 4.68s/it] {'loss': 0.1148, 'grad_norm': 0.6768329207289796, 'learning_rate': 4.281232486448849e-09, 'epoch': 0.99} 99%|█████████▊| 33840/34278 [37:03:30<34:10, 4.68s/it] 99%|█████████▊| 33841/34278 [37:03:33<29:47, 4.09s/it] {'loss': 0.1131, 'grad_norm': 0.8517050775068293, 'learning_rate': 4.2617085702556515e-09, 'epoch': 0.99} 99%|█████████▊| 33841/34278 [37:03:33<29:47, 4.09s/it] 99%|█████████▊| 33842/34278 [37:03:37<31:01, 4.27s/it] {'loss': 0.1756, 'grad_norm': 0.842801122120863, 'learning_rate': 4.242229254821095e-09, 'epoch': 0.99} 99%|█████████▊| 33842/34278 [37:03:37<31:01, 4.27s/it] 99%|█████████▊| 33843/34278 [37:03:44<35:26, 4.89s/it] {'loss': 0.1001, 'grad_norm': 0.7570353160226188, 'learning_rate': 4.222794540318931e-09, 'epoch': 0.99} 99%|█████████▊| 33843/34278 [37:03:44<35:26, 4.89s/it] 99%|█████████▊| 33844/34278 [37:03:47<31:39, 4.38s/it] {'loss': 0.1161, 'grad_norm': 0.8437804409050115, 'learning_rate': 4.203404426924018e-09, 'epoch': 0.99} 99%|█████████▊| 33844/34278 [37:03:47<31:39, 4.38s/it] 99%|█████████▊| 33845/34278 [37:03:53<35:19, 4.90s/it] {'loss': 0.108, 'grad_norm': 0.9523134795250229, 'learning_rate': 4.184058914807887e-09, 'epoch': 0.99} 99%|█████████▊| 33845/34278 [37:03:53<35:19, 4.90s/it] 99%|█████████▊| 33846/34278 [37:03:56<30:37, 4.25s/it] {'loss': 0.1057, 'grad_norm': 0.9045730184233152, 'learning_rate': 4.164758004143732e-09, 'epoch': 0.99} 99%|█████████▊| 33846/34278 [37:03:56<30:37, 4.25s/it] 99%|█████████▊| 33847/34278 [37:04:02<34:25, 4.79s/it] {'loss': 0.1353, 
'grad_norm': 0.7367454289036565, 'learning_rate': 4.145501695104193e-09, 'epoch': 0.99} 99%|█████████▊| 33847/34278 [37:04:02<34:25, 4.79s/it] 99%|█████████▊| 33848/34278 [37:04:05<29:49, 4.16s/it] {'loss': 0.1337, 'grad_norm': 2.4213463654490845, 'learning_rate': 4.1262899878613535e-09, 'epoch': 0.99} 99%|█████████▊| 33848/34278 [37:04:05<29:49, 4.16s/it] 99%|█████████▊| 33849/34278 [37:04:08<27:10, 3.80s/it] {'loss': 0.1229, 'grad_norm': 0.7595899548234707, 'learning_rate': 4.10712288258619e-09, 'epoch': 0.99} 99%|█████████▊| 33849/34278 [37:04:08<27:10, 3.80s/it] 99%|█████████▉| 33850/34278 [37:04:11<26:03, 3.65s/it] {'loss': 0.1158, 'grad_norm': 0.7828957073264476, 'learning_rate': 4.088000379449675e-09, 'epoch': 0.99} 99%|█████████▉| 33850/34278 [37:04:11<26:03, 3.65s/it] 99%|█████████▉| 33851/34278 [37:04:16<28:27, 4.00s/it] {'loss': 0.1097, 'grad_norm': 0.9058873105421239, 'learning_rate': 4.0689224786233385e-09, 'epoch': 0.99} 99%|█████████▉| 33851/34278 [37:04:16<28:27, 4.00s/it] 99%|█████████▉| 33852/34278 [37:04:19<27:28, 3.87s/it] {'loss': 0.129, 'grad_norm': 0.9725555610164482, 'learning_rate': 4.04988918027649e-09, 'epoch': 0.99} 99%|█████████▉| 33852/34278 [37:04:19<27:28, 3.87s/it] 99%|█████████▉| 33853/34278 [37:04:22<25:37, 3.62s/it] {'loss': 0.1048, 'grad_norm': 0.8101288708307562, 'learning_rate': 4.030900484580102e-09, 'epoch': 0.99} 99%|█████████▉| 33853/34278 [37:04:22<25:37, 3.62s/it] 99%|█████████▉| 33854/34278 [37:04:26<24:49, 3.51s/it] {'loss': 0.1083, 'grad_norm': 0.9430845170879232, 'learning_rate': 4.011956391702932e-09, 'epoch': 0.99} 99%|█████████▉| 33854/34278 [37:04:26<24:49, 3.51s/it] 99%|█████████▉| 33855/34278 [37:04:29<24:19, 3.45s/it] {'loss': 0.117, 'grad_norm': 0.7060511064029318, 'learning_rate': 3.9930569018148406e-09, 'epoch': 0.99} 99%|█████████▉| 33855/34278 [37:04:29<24:19, 3.45s/it] 99%|█████████▉| 33856/34278 [37:04:33<26:10, 3.72s/it] {'loss': 0.1143, 'grad_norm': 0.9708052015115435, 'learning_rate': 
3.974202015083473e-09, 'epoch': 0.99} 99%|█████████▉| 33856/34278 [37:04:33<26:10, 3.72s/it] 99%|█████████▉| 33857/34278 [37:04:39<30:54, 4.41s/it] {'loss': 0.1169, 'grad_norm': 0.8593164344162948, 'learning_rate': 3.955391731678138e-09, 'epoch': 0.99} 99%|█████████▉| 33857/34278 [37:04:39<30:54, 4.41s/it] 99%|█████████▉| 33858/34278 [37:04:42<27:51, 3.98s/it] {'loss': 0.0969, 'grad_norm': 0.6501196245436692, 'learning_rate': 3.936626051766479e-09, 'epoch': 0.99} 99%|█████████▉| 33858/34278 [37:04:42<27:51, 3.98s/it] 99%|█████████▉| 33859/34278 [37:04:48<31:59, 4.58s/it] {'loss': 0.1012, 'grad_norm': 0.875014284433436, 'learning_rate': 3.917904975515585e-09, 'epoch': 0.99} 99%|█████████▉| 33859/34278 [37:04:48<31:59, 4.58s/it] 99%|█████████▉| 33860/34278 [37:04:51<28:39, 4.11s/it] {'loss': 0.1375, 'grad_norm': 0.7746246360939394, 'learning_rate': 3.8992285030930995e-09, 'epoch': 0.99} 99%|█████████▉| 33860/34278 [37:04:51<28:39, 4.11s/it] 99%|█████████▉| 33861/34278 [37:04:57<32:29, 4.68s/it] {'loss': 0.1088, 'grad_norm': 0.7526757611316315, 'learning_rate': 3.880596634666112e-09, 'epoch': 0.99} 99%|█████████▉| 33861/34278 [37:04:57<32:29, 4.68s/it] 99%|█████████▉| 33862/34278 [37:05:00<29:22, 4.24s/it] {'loss': 0.1131, 'grad_norm': 0.6386235396886358, 'learning_rate': 3.862009370400044e-09, 'epoch': 0.99} 99%|█████████▉| 33862/34278 [37:05:00<29:22, 4.24s/it] 99%|█████████▉| 33863/34278 [37:05:04<27:19, 3.95s/it] {'loss': 0.101, 'grad_norm': 0.7178575161832981, 'learning_rate': 3.84346671046143e-09, 'epoch': 0.99} 99%|█████████▉| 33863/34278 [37:05:04<27:19, 3.95s/it] 99%|█████████▉| 33864/34278 [37:05:07<25:36, 3.71s/it] {'loss': 0.0895, 'grad_norm': 0.8271433401077359, 'learning_rate': 3.824968655015138e-09, 'epoch': 0.99} 99%|█████████▉| 33864/34278 [37:05:07<25:36, 3.71s/it] 99%|█████████▉| 33865/34278 [37:05:10<24:09, 3.51s/it] {'loss': 0.1209, 'grad_norm': 1.0395034701488943, 'learning_rate': 3.806515204227701e-09, 'epoch': 0.99} 99%|█████████▉| 33865/34278 
[37:05:10<24:09, 3.51s/it] 99%|█████████▉| 33866/34278 [37:05:13<23:58, 3.49s/it] {'loss': 0.1164, 'grad_norm': 0.8263482701477021, 'learning_rate': 3.788106358262322e-09, 'epoch': 0.99} 99%|█████████▉| 33866/34278 [37:05:13<23:58, 3.49s/it] 99%|█████████▉| 33867/34278 [37:05:16<22:31, 3.29s/it] {'loss': 0.0985, 'grad_norm': 0.8350036899593295, 'learning_rate': 3.769742117284425e-09, 'epoch': 0.99} 99%|█████████▉| 33867/34278 [37:05:16<22:31, 3.29s/it] 99%|█████████▉| 33868/34278 [37:05:19<21:23, 3.13s/it] {'loss': 0.0894, 'grad_norm': 0.8196861699741413, 'learning_rate': 3.751422481457212e-09, 'epoch': 0.99} 99%|█████████▉| 33868/34278 [37:05:19<21:23, 3.13s/it] 99%|█████████▉| 33869/34278 [37:05:22<20:24, 2.99s/it] {'loss': 0.1221, 'grad_norm': 0.7286077158393757, 'learning_rate': 3.733147450944996e-09, 'epoch': 0.99} 99%|█████████▉| 33869/34278 [37:05:22<20:24, 2.99s/it] 99%|█████████▉| 33870/34278 [37:05:25<20:21, 2.99s/it] {'loss': 0.1434, 'grad_norm': 1.1734134072376297, 'learning_rate': 3.714917025910425e-09, 'epoch': 0.99} 99%|█████████▉| 33870/34278 [37:05:25<20:21, 2.99s/it] 99%|█████████▉| 33871/34278 [37:05:28<20:47, 3.06s/it] {'loss': 0.0946, 'grad_norm': 0.7874856809417258, 'learning_rate': 3.6967312065161466e-09, 'epoch': 0.99} 99%|█████████▉| 33871/34278 [37:05:28<20:47, 3.06s/it] 99%|█████████▉| 33872/34278 [37:05:34<26:18, 3.89s/it] {'loss': 0.1083, 'grad_norm': 0.8778410972133301, 'learning_rate': 3.678589992925363e-09, 'epoch': 0.99} 99%|█████████▉| 33872/34278 [37:05:34<26:18, 3.89s/it] 99%|█████████▉| 33873/34278 [37:05:38<26:24, 3.91s/it] {'loss': 0.1026, 'grad_norm': 1.598198545475215, 'learning_rate': 3.6604933852985023e-09, 'epoch': 0.99} 99%|█████████▉| 33873/34278 [37:05:38<26:24, 3.91s/it] 99%|█████████▉| 33874/34278 [37:05:41<24:49, 3.69s/it] {'loss': 0.0936, 'grad_norm': 0.8277714856641315, 'learning_rate': 3.642441383798767e-09, 'epoch': 0.99} 99%|█████████▉| 33874/34278 [37:05:41<24:49, 3.69s/it] 99%|█████████▉| 33875/34278 
[37:05:44<23:05, 3.44s/it] {'loss': 0.0986, 'grad_norm': 0.833821378908683, 'learning_rate': 3.6244339885865843e-09, 'epoch': 0.99} 99%|█████████▉| 33875/34278 [37:05:44<23:05, 3.44s/it] 99%|█████████▉| 33876/34278 [37:05:47<22:20, 3.33s/it] {'loss': 0.1097, 'grad_norm': 0.6860716202291299, 'learning_rate': 3.606471199822381e-09, 'epoch': 0.99} 99%|█████████▉| 33876/34278 [37:05:47<22:20, 3.33s/it] 99%|█████████▉| 33877/34278 [37:05:49<21:08, 3.16s/it] {'loss': 0.1197, 'grad_norm': 0.9047231834060151, 'learning_rate': 3.588553017666585e-09, 'epoch': 0.99} 99%|█████████▉| 33877/34278 [37:05:49<21:08, 3.16s/it] 99%|█████████▉| 33878/34278 [37:05:53<21:51, 3.28s/it] {'loss': 0.1137, 'grad_norm': 0.8772978706130129, 'learning_rate': 3.5706794422801783e-09, 'epoch': 0.99} 99%|█████████▉| 33878/34278 [37:05:53<21:51, 3.28s/it] 99%|█████████▉| 33879/34278 [37:05:59<26:59, 4.06s/it] {'loss': 0.1081, 'grad_norm': 1.0249440848752416, 'learning_rate': 3.5528504738213676e-09, 'epoch': 0.99} 99%|█████████▉| 33879/34278 [37:05:59<26:59, 4.06s/it] 99%|█████████▉| 33880/34278 [37:06:02<24:31, 3.70s/it] {'loss': 0.0979, 'grad_norm': 0.7789866297536088, 'learning_rate': 3.535066112450025e-09, 'epoch': 0.99} 99%|█████████▉| 33880/34278 [37:06:02<24:31, 3.70s/it] 99%|█████████▉| 33881/34278 [37:06:05<23:12, 3.51s/it] {'loss': 0.0884, 'grad_norm': 0.7906564491597248, 'learning_rate': 3.5173263583254678e-09, 'epoch': 0.99} 99%|█████████▉| 33881/34278 [37:06:05<23:12, 3.51s/it] 99%|█████████▉| 33882/34278 [37:06:08<23:13, 3.52s/it] {'loss': 0.1049, 'grad_norm': 1.0393986718624706, 'learning_rate': 3.4996312116047925e-09, 'epoch': 0.99} 99%|█████████▉| 33882/34278 [37:06:08<23:13, 3.52s/it] 99%|█████████▉| 33883/34278 [37:06:12<22:36, 3.43s/it] {'loss': 0.0933, 'grad_norm': 0.7313767163172219, 'learning_rate': 3.481980672446761e-09, 'epoch': 0.99} 99%|█████████▉| 33883/34278 [37:06:12<22:36, 3.43s/it] 99%|█████████▉| 33884/34278 [37:06:14<21:30, 3.28s/it] {'loss': 0.1167, 'grad_norm': 
0.85539088527848, 'learning_rate': 3.4643747410090244e-09, 'epoch': 0.99} 99%|█████████▉| 33884/34278 [37:06:14<21:30, 3.28s/it] 99%|█████████▉| 33885/34278 [37:06:18<21:57, 3.35s/it] {'loss': 0.1062, 'grad_norm': 0.8256152225779988, 'learning_rate': 3.44681341744868e-09, 'epoch': 0.99} 99%|█████████▉| 33885/34278 [37:06:18<21:57, 3.35s/it] 99%|█████████▉| 33886/34278 [37:06:21<21:50, 3.34s/it] {'loss': 0.1139, 'grad_norm': 0.7836846465773687, 'learning_rate': 3.429296701922269e-09, 'epoch': 0.99} 99%|█████████▉| 33886/34278 [37:06:21<21:50, 3.34s/it] 99%|█████████▉| 33887/34278 [37:06:24<21:10, 3.25s/it] {'loss': 0.1251, 'grad_norm': 0.8079916059808888, 'learning_rate': 3.4118245945863326e-09, 'epoch': 0.99} 99%|█████████▉| 33887/34278 [37:06:24<21:10, 3.25s/it] 99%|█████████▉| 33888/34278 [37:06:30<26:41, 4.11s/it] {'loss': 0.1053, 'grad_norm': 1.049243222210774, 'learning_rate': 3.3943970955968573e-09, 'epoch': 0.99} 99%|█████████▉| 33888/34278 [37:06:30<26:41, 4.11s/it] 99%|█████████▉| 33889/34278 [37:06:36<30:04, 4.64s/it] {'loss': 0.1203, 'grad_norm': 0.8836078354958317, 'learning_rate': 3.377014205109275e-09, 'epoch': 0.99} 99%|█████████▉| 33889/34278 [37:06:36<30:04, 4.64s/it] 99%|█████████▉| 33890/34278 [37:06:41<30:10, 4.67s/it] {'loss': 0.1092, 'grad_norm': 0.6035320234324897, 'learning_rate': 3.3596759232790156e-09, 'epoch': 0.99} 99%|█████████▉| 33890/34278 [37:06:41<30:10, 4.67s/it] 99%|█████████▉| 33891/34278 [37:06:46<31:12, 4.84s/it] {'loss': 0.1408, 'grad_norm': 0.9012561199472223, 'learning_rate': 3.342382250260956e-09, 'epoch': 0.99} 99%|█████████▉| 33891/34278 [37:06:46<31:12, 4.84s/it] 99%|█████████▉| 33892/34278 [37:06:50<28:12, 4.38s/it] {'loss': 0.1165, 'grad_norm': 0.8166263530626607, 'learning_rate': 3.325133186209417e-09, 'epoch': 0.99} 99%|█████████▉| 33892/34278 [37:06:50<28:12, 4.38s/it] 99%|█████████▉| 33893/34278 [37:06:53<26:21, 4.11s/it] {'loss': 0.1001, 'grad_norm': 0.8651909650123857, 'learning_rate': 3.30792873127761e-09, 
'epoch': 0.99} 99%|█████████▉| 33893/34278 [37:06:53<26:21, 4.11s/it] 99%|█████████▉| 33894/34278 [37:06:57<25:37, 4.00s/it] {'loss': 0.1255, 'grad_norm': 0.7425522189351392, 'learning_rate': 3.29076888562041e-09, 'epoch': 0.99} 99%|█████████▉| 33894/34278 [37:06:57<25:37, 4.00s/it] 99%|█████████▉| 33895/34278 [37:07:00<23:42, 3.72s/it] {'loss': 0.1039, 'grad_norm': 1.1253179179711035, 'learning_rate': 3.2736536493904734e-09, 'epoch': 0.99} 99%|█████████▉| 33895/34278 [37:07:00<23:42, 3.72s/it] 99%|█████████▉| 33896/34278 [37:07:03<22:08, 3.48s/it] {'loss': 0.1015, 'grad_norm': 0.7628373725503663, 'learning_rate': 3.256583022739901e-09, 'epoch': 0.99} 99%|█████████▉| 33896/34278 [37:07:03<22:08, 3.48s/it] 99%|█████████▉| 33897/34278 [37:07:09<26:47, 4.22s/it] {'loss': 0.1296, 'grad_norm': 0.8887192641238454, 'learning_rate': 3.239557005822458e-09, 'epoch': 0.99} 99%|█████████▉| 33897/34278 [37:07:09<26:47, 4.22s/it] 99%|█████████▉| 33898/34278 [37:07:12<24:33, 3.88s/it] {'loss': 0.1055, 'grad_norm': 0.801320497822789, 'learning_rate': 3.222575598789135e-09, 'epoch': 0.99} 99%|█████████▉| 33898/34278 [37:07:12<24:33, 3.88s/it] 99%|█████████▉| 33899/34278 [37:07:15<22:33, 3.57s/it] {'loss': 0.1054, 'grad_norm': 0.7180598202244861, 'learning_rate': 3.2056388017914773e-09, 'epoch': 0.99} 99%|█████████▉| 33899/34278 [37:07:15<22:33, 3.57s/it] 99%|█████████▉| 33900/34278 [37:07:18<22:19, 3.54s/it] {'loss': 0.1161, 'grad_norm': 0.7580971729199544, 'learning_rate': 3.188746614981586e-09, 'epoch': 0.99} 99%|█████████▉| 33900/34278 [37:07:18<22:19, 3.54s/it] 99%|█████████▉| 33901/34278 [37:07:22<22:16, 3.55s/it] {'loss': 0.115, 'grad_norm': 0.8562341518330403, 'learning_rate': 3.1718990385093408e-09, 'epoch': 0.99} 99%|█████████▉| 33901/34278 [37:07:22<22:16, 3.55s/it] 99%|█████████▉| 33902/34278 [37:07:25<21:08, 3.37s/it] {'loss': 0.1185, 'grad_norm': 0.7894581821337003, 'learning_rate': 3.155096072525732e-09, 'epoch': 0.99} 99%|█████████▉| 33902/34278 [37:07:25<21:08, 
3.37s/it] 99%|█████████▉| 33903/34278 [37:07:28<20:06, 3.22s/it] {'loss': 0.1109, 'grad_norm': 0.8305931487248671, 'learning_rate': 3.1383377171806396e-09, 'epoch': 0.99} 99%|█████████▉| 33903/34278 [37:07:28<20:06, 3.22s/it] 99%|█████████▉| 33904/34278 [37:07:31<20:44, 3.33s/it] {'loss': 0.0931, 'grad_norm': 0.710003354133609, 'learning_rate': 3.1216239726233888e-09, 'epoch': 0.99} 99%|█████████▉| 33904/34278 [37:07:31<20:44, 3.33s/it] 99%|█████████▉| 33905/34278 [37:07:35<20:48, 3.35s/it] {'loss': 0.1062, 'grad_norm': 0.945893025959921, 'learning_rate': 3.1049548390038596e-09, 'epoch': 0.99} 99%|█████████▉| 33905/34278 [37:07:35<20:48, 3.35s/it] 99%|█████████▉| 33906/34278 [37:07:38<21:26, 3.46s/it] {'loss': 0.1167, 'grad_norm': 0.8633522359130263, 'learning_rate': 3.0883303164702673e-09, 'epoch': 0.99} 99%|█████████▉| 33906/34278 [37:07:38<21:26, 3.46s/it] 99%|█████████▉| 33907/34278 [37:07:41<20:51, 3.37s/it] {'loss': 0.0941, 'grad_norm': 0.8842800265805906, 'learning_rate': 3.071750405170826e-09, 'epoch': 0.99} 99%|█████████▉| 33907/34278 [37:07:41<20:51, 3.37s/it] 99%|█████████▉| 33908/34278 [37:07:45<20:56, 3.40s/it] {'loss': 0.0986, 'grad_norm': 0.8391003658343408, 'learning_rate': 3.0552151052543057e-09, 'epoch': 0.99} 99%|█████████▉| 33908/34278 [37:07:45<20:56, 3.40s/it] 99%|█████████▉| 33909/34278 [37:07:49<22:54, 3.72s/it] {'loss': 0.1361, 'grad_norm': 0.840982645146856, 'learning_rate': 3.038724416867811e-09, 'epoch': 0.99} 99%|█████████▉| 33909/34278 [37:07:49<22:54, 3.72s/it] 99%|█████████▉| 33910/34278 [37:07:55<27:06, 4.42s/it] {'loss': 0.092, 'grad_norm': 0.7738698120393673, 'learning_rate': 3.0222783401590016e-09, 'epoch': 0.99} 99%|█████████▉| 33910/34278 [37:07:55<27:06, 4.42s/it] 99%|█████████▉| 33911/34278 [37:07:58<24:06, 3.94s/it] {'loss': 0.1124, 'grad_norm': 0.734905388385287, 'learning_rate': 3.0058768752738723e-09, 'epoch': 0.99} 99%|█████████▉| 33911/34278 [37:07:58<24:06, 3.94s/it] 99%|█████████▉| 33912/34278 [37:08:02<23:27, 
3.85s/it] {'loss': 0.1281, 'grad_norm': 0.8107093289877106, 'learning_rate': 2.989520022360082e-09, 'epoch': 0.99} 99%|█████████▉| 33912/34278 [37:08:02<23:27, 3.85s/it] 99%|█████████▉| 33913/34278 [37:08:05<21:58, 3.61s/it] {'loss': 0.1082, 'grad_norm': 0.8404966074340152, 'learning_rate': 2.9732077815625148e-09, 'epoch': 0.99} 99%|█████████▉| 33913/34278 [37:08:05<21:58, 3.61s/it] 99%|█████████▉| 33914/34278 [37:08:08<20:58, 3.46s/it] {'loss': 0.1088, 'grad_norm': 0.8900145930179061, 'learning_rate': 2.956940153027166e-09, 'epoch': 0.99} 99%|█████████▉| 33914/34278 [37:08:08<20:58, 3.46s/it] 99%|█████████▉| 33915/34278 [37:08:11<20:27, 3.38s/it] {'loss': 0.1132, 'grad_norm': 0.839804287940214, 'learning_rate': 2.9407171368994738e-09, 'epoch': 0.99} 99%|█████████▉| 33915/34278 [37:08:11<20:27, 3.38s/it] 99%|█████████▉| 33916/34278 [37:08:17<25:16, 4.19s/it] {'loss': 0.1297, 'grad_norm': 0.7373948432771645, 'learning_rate': 2.9245387333243225e-09, 'epoch': 0.99} 99%|█████████▉| 33916/34278 [37:08:17<25:16, 4.19s/it] 99%|█████████▉| 33917/34278 [37:08:21<23:49, 3.96s/it] {'loss': 0.127, 'grad_norm': 0.6906506282981626, 'learning_rate': 2.9084049424460414e-09, 'epoch': 0.99} 99%|█████████▉| 33917/34278 [37:08:21<23:49, 3.96s/it] 99%|█████████▉| 33918/34278 [37:08:25<24:19, 4.05s/it] {'loss': 0.1146, 'grad_norm': 0.8405123828936328, 'learning_rate': 2.8923157644084044e-09, 'epoch': 0.99} 99%|█████████▉| 33918/34278 [37:08:25<24:19, 4.05s/it] 99%|█████████▉| 33919/34278 [37:08:30<26:51, 4.49s/it] {'loss': 0.1149, 'grad_norm': 0.8178881314785621, 'learning_rate': 2.876271199355185e-09, 'epoch': 0.99} 99%|█████████▉| 33919/34278 [37:08:31<26:51, 4.49s/it] 99%|█████████▉| 33920/34278 [37:08:34<24:47, 4.16s/it] {'loss': 0.1123, 'grad_norm': 0.7996511022800938, 'learning_rate': 2.8602712474301575e-09, 'epoch': 0.99} 99%|█████████▉| 33920/34278 [37:08:34<24:47, 4.16s/it] 99%|█████████▉| 33921/34278 [37:08:38<24:36, 4.14s/it] {'loss': 0.1056, 'grad_norm': 0.7828003158065426, 
'learning_rate': 2.8443159087754304e-09, 'epoch': 0.99} 99%|█████████▉| 33921/34278 [37:08:38<24:36, 4.14s/it] 99%|█████████▉| 33922/34278 [37:08:42<25:05, 4.23s/it] {'loss': 0.1072, 'grad_norm': 0.6241510006751408, 'learning_rate': 2.828405183533667e-09, 'epoch': 0.99} 99%|█████████▉| 33922/34278 [37:08:42<25:05, 4.23s/it] 99%|█████████▉| 33923/34278 [37:08:47<25:12, 4.26s/it] {'loss': 0.1123, 'grad_norm': 0.9955227614630167, 'learning_rate': 2.8125390718469757e-09, 'epoch': 0.99} 99%|█████████▉| 33923/34278 [37:08:47<25:12, 4.26s/it] 99%|█████████▉| 33924/34278 [37:08:50<22:36, 3.83s/it] {'loss': 0.0969, 'grad_norm': 0.7808665715637741, 'learning_rate': 2.7967175738569107e-09, 'epoch': 0.99} 99%|█████████▉| 33924/34278 [37:08:50<22:36, 3.83s/it] 99%|█████████▉| 33925/34278 [37:08:53<21:39, 3.68s/it] {'loss': 0.0956, 'grad_norm': 0.7306766518997789, 'learning_rate': 2.780940689705025e-09, 'epoch': 0.99} 99%|█████████▉| 33925/34278 [37:08:53<21:39, 3.68s/it] 99%|█████████▉| 33926/34278 [37:08:56<20:26, 3.49s/it] {'loss': 0.1179, 'grad_norm': 0.6744496999579083, 'learning_rate': 2.765208419531762e-09, 'epoch': 0.99} 99%|█████████▉| 33926/34278 [37:08:56<20:26, 3.49s/it] 99%|█████████▉| 33927/34278 [37:09:00<21:00, 3.59s/it] {'loss': 0.1129, 'grad_norm': 0.7961666872297498, 'learning_rate': 2.7495207634781194e-09, 'epoch': 0.99} 99%|█████████▉| 33927/34278 [37:09:00<21:00, 3.59s/it] 99%|█████████▉| 33928/34278 [37:09:06<24:43, 4.24s/it] {'loss': 0.1114, 'grad_norm': 0.8423488570580006, 'learning_rate': 2.733877721683986e-09, 'epoch': 0.99} 99%|█████████▉| 33928/34278 [37:09:06<24:43, 4.24s/it] 99%|█████████▉| 33929/34278 [37:09:09<23:33, 4.05s/it] {'loss': 0.0894, 'grad_norm': 0.7405981694754284, 'learning_rate': 2.7182792942881396e-09, 'epoch': 0.99} 99%|█████████▉| 33929/34278 [37:09:09<23:33, 4.05s/it] 99%|█████████▉| 33930/34278 [37:09:13<22:40, 3.91s/it] {'loss': 0.1029, 'grad_norm': 0.8079617588290383, 'learning_rate': 2.7027254814310232e-09, 'epoch': 0.99} 
99%|█████████▉| 33930/34278 [37:09:13<22:40, 3.91s/it] 99%|█████████▉| 33931/34278 [37:09:17<22:55, 3.96s/it] {'loss': 0.1023, 'grad_norm': 0.6998068980562837, 'learning_rate': 2.6872162832508596e-09, 'epoch': 0.99} 99%|█████████▉| 33931/34278 [37:09:17<22:55, 3.96s/it] 99%|█████████▉| 33932/34278 [37:09:20<21:34, 3.74s/it] {'loss': 0.1042, 'grad_norm': 1.3290014117581306, 'learning_rate': 2.671751699886427e-09, 'epoch': 0.99} 99%|█████████▉| 33932/34278 [37:09:20<21:34, 3.74s/it] 99%|█████████▉| 33933/34278 [37:09:24<21:57, 3.82s/it] {'loss': 0.1083, 'grad_norm': 0.8834074317425398, 'learning_rate': 2.656331731475392e-09, 'epoch': 0.99} 99%|█████████▉| 33933/34278 [37:09:24<21:57, 3.82s/it] 99%|█████████▉| 33934/34278 [37:09:27<20:21, 3.55s/it] {'loss': 0.1065, 'grad_norm': 0.8090366186211602, 'learning_rate': 2.640956378155979e-09, 'epoch': 0.99} 99%|█████████▉| 33934/34278 [37:09:27<20:21, 3.55s/it] 99%|█████████▉| 33935/34278 [37:09:30<19:32, 3.42s/it] {'loss': 0.1172, 'grad_norm': 0.8713800432799568, 'learning_rate': 2.625625640064744e-09, 'epoch': 0.99} 99%|█████████▉| 33935/34278 [37:09:30<19:32, 3.42s/it] 99%|█████████▉| 33936/34278 [37:09:34<19:51, 3.48s/it] {'loss': 0.1052, 'grad_norm': 0.9535086899752458, 'learning_rate': 2.610339517339355e-09, 'epoch': 0.99} 99%|█████████▉| 33936/34278 [37:09:34<19:51, 3.48s/it] 99%|█████████▉| 33937/34278 [37:09:37<18:45, 3.30s/it] {'loss': 0.1121, 'grad_norm': 0.8555863263579646, 'learning_rate': 2.595098010115815e-09, 'epoch': 0.99} 99%|█████████▉| 33937/34278 [37:09:37<18:45, 3.30s/it] 99%|█████████▉| 33938/34278 [37:09:41<21:07, 3.73s/it] {'loss': 0.1012, 'grad_norm': 0.9805459250371799, 'learning_rate': 2.579901118530126e-09, 'epoch': 0.99} 99%|█████████▉| 33938/34278 [37:09:41<21:07, 3.73s/it] 99%|█████████▉| 33939/34278 [37:09:47<24:55, 4.41s/it] {'loss': 0.0979, 'grad_norm': 0.7345744517421733, 'learning_rate': 2.5647488427182897e-09, 'epoch': 0.99} 99%|█████████▉| 33939/34278 [37:09:47<24:55, 4.41s/it] 
99%|█████████▉| 33940/34278 [37:09:53<27:32, 4.89s/it] {'loss': 0.1323, 'grad_norm': 0.8835896243778248, 'learning_rate': 2.549641182815199e-09, 'epoch': 0.99} 99%|█████████▉| 33940/34278 [37:09:53<27:32, 4.89s/it] 99%|█████████▉| 33941/34278 [37:09:56<24:06, 4.29s/it] {'loss': 0.1114, 'grad_norm': 0.7344805468122377, 'learning_rate': 2.5345781389557454e-09, 'epoch': 0.99} 99%|█████████▉| 33941/34278 [37:09:56<24:06, 4.29s/it] 99%|█████████▉| 33942/34278 [37:10:00<22:36, 4.04s/it] {'loss': 0.1137, 'grad_norm': 0.809929318413112, 'learning_rate': 2.5195597112748215e-09, 'epoch': 0.99} 99%|█████████▉| 33942/34278 [37:10:00<22:36, 4.04s/it] 99%|█████████▉| 33943/34278 [37:10:03<21:15, 3.81s/it] {'loss': 0.0856, 'grad_norm': 0.8039203377031532, 'learning_rate': 2.5045858999062087e-09, 'epoch': 0.99} 99%|█████████▉| 33943/34278 [37:10:03<21:15, 3.81s/it] 99%|█████████▉| 33944/34278 [37:10:06<20:11, 3.63s/it] {'loss': 0.1144, 'grad_norm': 0.779359803495864, 'learning_rate': 2.4896567049836896e-09, 'epoch': 0.99} 99%|█████████▉| 33944/34278 [37:10:06<20:11, 3.63s/it] 99%|█████████▉| 33945/34278 [37:10:10<19:55, 3.59s/it] {'loss': 0.1029, 'grad_norm': 0.790561357947606, 'learning_rate': 2.4747721266404902e-09, 'epoch': 0.99} 99%|█████████▉| 33945/34278 [37:10:10<19:55, 3.59s/it] 99%|█████████▉| 33946/34278 [37:10:14<20:27, 3.70s/it] {'loss': 0.106, 'grad_norm': 0.908733578638023, 'learning_rate': 2.4599321650098375e-09, 'epoch': 0.99} 99%|█████████▉| 33946/34278 [37:10:14<20:27, 3.70s/it] 99%|█████████▉| 33947/34278 [37:10:17<20:42, 3.75s/it] {'loss': 0.0964, 'grad_norm': 0.765747348911198, 'learning_rate': 2.445136820223293e-09, 'epoch': 0.99} 99%|█████████▉| 33947/34278 [37:10:17<20:42, 3.75s/it] 99%|█████████▉| 33948/34278 [37:10:21<19:32, 3.55s/it] {'loss': 0.1098, 'grad_norm': 0.7649930331908215, 'learning_rate': 2.4303860924140833e-09, 'epoch': 0.99} 99%|█████████▉| 33948/34278 [37:10:21<19:32, 3.55s/it] 99%|█████████▉| 33949/34278 [37:10:23<18:14, 3.33s/it] {'loss': 
0.1468, 'grad_norm': 0.9362867907583359, 'learning_rate': 2.4156799817132147e-09, 'epoch': 0.99} 99%|█████████▉| 33949/34278 [37:10:23<18:14, 3.33s/it] 99%|█████████▉| 33950/34278 [37:10:26<17:26, 3.19s/it] {'loss': 0.1186, 'grad_norm': 0.8898718990061929, 'learning_rate': 2.401018488251694e-09, 'epoch': 0.99} 99%|█████████▉| 33950/34278 [37:10:26<17:26, 3.19s/it] 99%|█████████▉| 33951/34278 [37:10:30<17:37, 3.24s/it] {'loss': 0.1167, 'grad_norm': 0.8409265066223599, 'learning_rate': 2.3864016121616375e-09, 'epoch': 0.99} 99%|█████████▉| 33951/34278 [37:10:30<17:37, 3.24s/it] 99%|█████████▉| 33952/34278 [37:10:33<17:32, 3.23s/it] {'loss': 0.1166, 'grad_norm': 0.7395345854650033, 'learning_rate': 2.3718293535723857e-09, 'epoch': 0.99} 99%|█████████▉| 33952/34278 [37:10:33<17:32, 3.23s/it] 99%|█████████▉| 33953/34278 [37:10:36<17:01, 3.14s/it] {'loss': 0.1145, 'grad_norm': 1.015248079779123, 'learning_rate': 2.3573017126143904e-09, 'epoch': 0.99} 99%|█████████▉| 33953/34278 [37:10:36<17:01, 3.14s/it] 99%|█████████▉| 33954/34278 [37:10:39<16:56, 3.14s/it] {'loss': 0.1175, 'grad_norm': 0.8702693898561817, 'learning_rate': 2.3428186894169925e-09, 'epoch': 0.99} 99%|█████████▉| 33954/34278 [37:10:39<16:56, 3.14s/it] 99%|█████████▉| 33955/34278 [37:10:45<21:48, 4.05s/it] {'loss': 0.1121, 'grad_norm': 0.6597727036415594, 'learning_rate': 2.328380284110643e-09, 'epoch': 0.99} 99%|█████████▉| 33955/34278 [37:10:45<21:48, 4.05s/it] 99%|█████████▉| 33956/34278 [37:10:51<24:22, 4.54s/it] {'loss': 0.1052, 'grad_norm': 0.7357749183320261, 'learning_rate': 2.3139864968230175e-09, 'epoch': 0.99} 99%|█████████▉| 33956/34278 [37:10:51<24:22, 4.54s/it] 99%|█████████▉| 33957/34278 [37:10:54<21:55, 4.10s/it] {'loss': 0.1021, 'grad_norm': 0.7268518988741933, 'learning_rate': 2.299637327682902e-09, 'epoch': 0.99} 99%|█████████▉| 33957/34278 [37:10:54<21:55, 4.10s/it] 99%|█████████▉| 33958/34278 [37:10:57<19:58, 3.74s/it] {'loss': 0.11, 'grad_norm': 0.6513487882001722, 'learning_rate': 
2.2853327768190823e-09, 'epoch': 0.99} 99%|█████████▉| 33958/34278 [37:10:57<19:58, 3.74s/it] 99%|█████████▉| 33959/34278 [37:11:02<22:17, 4.19s/it] {'loss': 0.1185, 'grad_norm': 0.7685793890117875, 'learning_rate': 2.2710728443586793e-09, 'epoch': 0.99} 99%|█████████▉| 33959/34278 [37:11:02<22:17, 4.19s/it] 99%|█████████▉| 33960/34278 [37:11:05<20:46, 3.92s/it] {'loss': 0.1157, 'grad_norm': 0.8430703910225354, 'learning_rate': 2.2568575304288133e-09, 'epoch': 0.99} 99%|█████████▉| 33960/34278 [37:11:05<20:46, 3.92s/it] 99%|█████████▉| 33961/34278 [37:11:09<19:41, 3.73s/it] {'loss': 0.1079, 'grad_norm': 0.7342130147692949, 'learning_rate': 2.2426868351566046e-09, 'epoch': 0.99} 99%|█████████▉| 33961/34278 [37:11:09<19:41, 3.73s/it] 99%|█████████▉| 33962/34278 [37:11:12<19:50, 3.77s/it] {'loss': 0.1297, 'grad_norm': 0.9779710870850913, 'learning_rate': 2.2285607586686186e-09, 'epoch': 0.99} 99%|█████████▉| 33962/34278 [37:11:12<19:50, 3.77s/it] 99%|█████████▉| 33963/34278 [37:11:16<18:59, 3.62s/it] {'loss': 0.0999, 'grad_norm': 1.01625764127674, 'learning_rate': 2.214479301091421e-09, 'epoch': 0.99} 99%|█████████▉| 33963/34278 [37:11:16<18:59, 3.62s/it] 99%|█████████▉| 33964/34278 [37:11:21<21:04, 4.03s/it] {'loss': 0.0967, 'grad_norm': 0.7672220115317753, 'learning_rate': 2.200442462549912e-09, 'epoch': 0.99} 99%|█████████▉| 33964/34278 [37:11:21<21:04, 4.03s/it] 99%|█████████▉| 33965/34278 [37:11:24<19:29, 3.74s/it] {'loss': 0.1148, 'grad_norm': 0.7909276021297114, 'learning_rate': 2.1864502431701017e-09, 'epoch': 0.99} 99%|█████████▉| 33965/34278 [37:11:24<19:29, 3.74s/it] 99%|█████████▉| 33966/34278 [37:11:29<22:33, 4.34s/it] {'loss': 0.1228, 'grad_norm': 0.7674713643682542, 'learning_rate': 2.172502643076335e-09, 'epoch': 0.99} 99%|█████████▉| 33966/34278 [37:11:29<22:33, 4.34s/it] 99%|█████████▉| 33967/34278 [37:11:33<21:18, 4.11s/it] {'loss': 0.1008, 'grad_norm': 1.086493672651407, 'learning_rate': 2.158599662392957e-09, 'epoch': 0.99} 99%|█████████▉| 
33967/34278 [37:11:33<21:18, 4.11s/it] 99%|█████████▉| 33968/34278 [37:11:39<24:40, 4.78s/it] {'loss': 0.1152, 'grad_norm': 0.9305529523729485, 'learning_rate': 2.144741301245423e-09, 'epoch': 0.99} 99%|█████████▉| 33968/34278 [37:11:39<24:40, 4.78s/it] 99%|█████████▉| 33969/34278 [37:11:42<21:59, 4.27s/it] {'loss': 0.1137, 'grad_norm': 0.691799122538135, 'learning_rate': 2.1309275597558577e-09, 'epoch': 0.99} 99%|█████████▉| 33969/34278 [37:11:42<21:59, 4.27s/it] 99%|█████████▉| 33970/34278 [37:11:46<20:10, 3.93s/it] {'loss': 0.1317, 'grad_norm': 0.8863295115076664, 'learning_rate': 2.1171584380486055e-09, 'epoch': 0.99} 99%|█████████▉| 33970/34278 [37:11:46<20:10, 3.93s/it] 99%|█████████▉| 33971/34278 [37:11:49<19:02, 3.72s/it] {'loss': 0.0951, 'grad_norm': 0.9360840751832611, 'learning_rate': 2.1034339362463464e-09, 'epoch': 0.99} 99%|█████████▉| 33971/34278 [37:11:49<19:02, 3.72s/it] 99%|█████████▉| 33972/34278 [37:11:52<17:41, 3.47s/it] {'loss': 0.1065, 'grad_norm': 1.0275039878745016, 'learning_rate': 2.0897540544712046e-09, 'epoch': 0.99} 99%|█████████▉| 33972/34278 [37:11:52<17:41, 3.47s/it] 99%|█████████▉| 33973/34278 [37:11:55<17:14, 3.39s/it] {'loss': 0.1172, 'grad_norm': 0.7275166231646761, 'learning_rate': 2.0761187928458606e-09, 'epoch': 0.99} 99%|█████████▉| 33973/34278 [37:11:55<17:14, 3.39s/it] 99%|█████████▉| 33974/34278 [37:11:58<16:30, 3.26s/it] {'loss': 0.1153, 'grad_norm': 0.8463512795866125, 'learning_rate': 2.062528151491883e-09, 'epoch': 0.99} 99%|█████████▉| 33974/34278 [37:11:58<16:30, 3.26s/it] 99%|█████████▉| 33975/34278 [37:12:01<16:17, 3.23s/it] {'loss': 0.1101, 'grad_norm': 0.8650386158175781, 'learning_rate': 2.048982130530286e-09, 'epoch': 0.99} 99%|█████████▉| 33975/34278 [37:12:01<16:17, 3.23s/it] 99%|█████████▉| 33976/34278 [37:12:07<20:45, 4.12s/it] {'loss': 0.1337, 'grad_norm': 0.6942209457183107, 'learning_rate': 2.0354807300826397e-09, 'epoch': 0.99} 99%|█████████▉| 33976/34278 [37:12:07<20:45, 4.12s/it] 99%|█████████▉| 
33977/34278 [37:12:10<18:48, 3.75s/it] {'loss': 0.1056, 'grad_norm': 0.8645080700651425, 'learning_rate': 2.0220239502688478e-09, 'epoch': 0.99} 99%|█████████▉| 33977/34278 [37:12:10<18:48, 3.75s/it] 99%|█████████▉| 33978/34278 [37:12:13<17:38, 3.53s/it] {'loss': 0.1084, 'grad_norm': 0.8021416694734592, 'learning_rate': 2.0086117912093696e-09, 'epoch': 0.99} 99%|█████████▉| 33978/34278 [37:12:13<17:38, 3.53s/it] 99%|█████████▉| 33979/34278 [37:12:19<21:23, 4.29s/it] {'loss': 0.1047, 'grad_norm': 0.8155716191224088, 'learning_rate': 1.995244253024109e-09, 'epoch': 0.99} 99%|█████████▉| 33979/34278 [37:12:19<21:23, 4.29s/it] 99%|█████████▉| 33980/34278 [37:12:23<20:06, 4.05s/it] {'loss': 0.1124, 'grad_norm': 1.0162152574593812, 'learning_rate': 1.98192133583186e-09, 'epoch': 0.99} 99%|█████████▉| 33980/34278 [37:12:23<20:06, 4.05s/it] 99%|█████████▉| 33981/34278 [37:12:26<18:46, 3.79s/it] {'loss': 0.099, 'grad_norm': 0.899792569027654, 'learning_rate': 1.9686430397519718e-09, 'epoch': 0.99} 99%|█████████▉| 33981/34278 [37:12:26<18:46, 3.79s/it] 99%|█████████▉| 33982/34278 [37:12:30<18:33, 3.76s/it] {'loss': 0.1034, 'grad_norm': 1.0551539411187696, 'learning_rate': 1.955409364902683e-09, 'epoch': 0.99} 99%|█████████▉| 33982/34278 [37:12:30<18:33, 3.76s/it] 99%|█████████▉| 33983/34278 [37:12:35<21:42, 4.42s/it] {'loss': 0.1029, 'grad_norm': 0.6042602662003457, 'learning_rate': 1.942220311402787e-09, 'epoch': 0.99} 99%|█████████▉| 33983/34278 [37:12:35<21:42, 4.42s/it] 99%|█████████▉| 33984/34278 [37:12:41<23:54, 4.88s/it] {'loss': 0.1101, 'grad_norm': 0.9069850981424438, 'learning_rate': 1.929075879369413e-09, 'epoch': 0.99} 99%|█████████▉| 33984/34278 [37:12:41<23:54, 4.88s/it] 99%|█████████▉| 33985/34278 [37:12:44<20:45, 4.25s/it] {'loss': 0.0983, 'grad_norm': 0.9710260042606573, 'learning_rate': 1.9159760689202447e-09, 'epoch': 0.99} 99%|█████████▉| 33985/34278 [37:12:44<20:45, 4.25s/it] 99%|█████████▉| 33986/34278 [37:12:48<19:45, 4.06s/it] {'loss': 0.1077, 
'grad_norm': 1.0138261162568778, 'learning_rate': 1.9029208801718547e-09, 'epoch': 0.99} 99%|█████████▉| 33986/34278 [37:12:48<19:45, 4.06s/it] 99%|█████████▉| 33987/34278 [37:12:52<19:25, 4.01s/it] {'loss': 0.0993, 'grad_norm': 0.8687092665657794, 'learning_rate': 1.8899103132413722e-09, 'epoch': 0.99} 99%|█████████▉| 33987/34278 [37:12:52<19:25, 4.01s/it] 99%|█████████▉| 33988/34278 [37:12:55<18:36, 3.85s/it] {'loss': 0.1134, 'grad_norm': 0.8323700945122122, 'learning_rate': 1.87694436824426e-09, 'epoch': 0.99} 99%|█████████▉| 33988/34278 [37:12:55<18:36, 3.85s/it] 99%|█████████▉| 33989/34278 [37:12:58<17:33, 3.64s/it] {'loss': 0.1147, 'grad_norm': 1.1966702473090336, 'learning_rate': 1.864023045297092e-09, 'epoch': 0.99} 99%|█████████▉| 33989/34278 [37:12:58<17:33, 3.64s/it] 99%|█████████▉| 33990/34278 [37:13:02<17:12, 3.58s/it] {'loss': 0.1128, 'grad_norm': 0.6946174160030281, 'learning_rate': 1.851146344514776e-09, 'epoch': 0.99} 99%|█████████▉| 33990/34278 [37:13:02<17:12, 3.58s/it] 99%|█████████▉| 33991/34278 [37:13:08<20:40, 4.32s/it] {'loss': 0.1011, 'grad_norm': 0.6748738127061272, 'learning_rate': 1.8383142660116647e-09, 'epoch': 0.99} 99%|█████████▉| 33991/34278 [37:13:08<20:40, 4.32s/it] 99%|█████████▉| 33992/34278 [37:13:11<18:59, 3.99s/it] {'loss': 0.1007, 'grad_norm': 0.7796551964750823, 'learning_rate': 1.825526809903222e-09, 'epoch': 0.99} 99%|█████████▉| 33992/34278 [37:13:11<18:59, 3.99s/it] 99%|█████████▉| 33993/34278 [37:13:15<18:30, 3.89s/it] {'loss': 0.0923, 'grad_norm': 0.7387923943880071, 'learning_rate': 1.8127839763038003e-09, 'epoch': 0.99} 99%|█████████▉| 33993/34278 [37:13:15<18:30, 3.89s/it] 99%|█████████▉| 33994/34278 [37:13:19<18:41, 3.95s/it] {'loss': 0.1067, 'grad_norm': 0.7548773703453022, 'learning_rate': 1.8000857653260872e-09, 'epoch': 0.99} 99%|█████████▉| 33994/34278 [37:13:19<18:41, 3.95s/it] 99%|█████████▉| 33995/34278 [37:13:22<17:53, 3.79s/it] {'loss': 0.1052, 'grad_norm': 0.8038998741498162, 'learning_rate': 
1.787432177083881e-09, 'epoch': 0.99} 99%|█████████▉| 33995/34278 [37:13:22<17:53, 3.79s/it] 99%|█████████▉| 33996/34278 [37:13:27<19:06, 4.06s/it] {'loss': 0.105, 'grad_norm': 0.7773612754787679, 'learning_rate': 1.7748232116909792e-09, 'epoch': 0.99} 99%|█████████▉| 33996/34278 [37:13:27<19:06, 4.06s/it] 99%|█████████▉| 33997/34278 [37:13:30<17:39, 3.77s/it] {'loss': 0.1251, 'grad_norm': 1.0876727764687901, 'learning_rate': 1.7622588692589593e-09, 'epoch': 0.99} 99%|█████████▉| 33997/34278 [37:13:30<17:39, 3.77s/it] 99%|█████████▉| 33998/34278 [37:13:33<17:11, 3.68s/it] {'loss': 0.109, 'grad_norm': 0.9704825951939312, 'learning_rate': 1.749739149900509e-09, 'epoch': 0.99} 99%|█████████▉| 33998/34278 [37:13:34<17:11, 3.68s/it] 99%|█████████▉| 33999/34278 [37:13:37<17:17, 3.72s/it] {'loss': 0.1007, 'grad_norm': 0.7351005748774584, 'learning_rate': 1.7372640537266506e-09, 'epoch': 0.99} 99%|█████████▉| 33999/34278 [37:13:37<17:17, 3.72s/it] 99%|█████████▉| 34000/34278 [37:13:40<16:12, 3.50s/it] {'loss': 0.0972, 'grad_norm': 0.7330137275697113, 'learning_rate': 1.7248335808500715e-09, 'epoch': 0.99} 99%|█████████▉| 34000/34278 [37:13:40<16:12, 3.50s/it]
Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to 151658
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]
/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/torch/utils/checkpoint.py:87: UserWarning: None of the inputs have requires_grad=True.
Gradients will be None
  warnings.warn(
99%|█████████▉| 34001/34278 [37:14:19<1:04:24, 13.95s/it] {'loss': 0.1002, 'grad_norm': 1.1005420867252866, 'learning_rate': 1.7124477313801292e-09, 'epoch': 0.99}
99%|█████████▉| 34002/34278 [37:14:22<49:52, 10.84s/it] {'loss': 0.1141, 'grad_norm': 0.7729468141397632, 'learning_rate': 1.7001065054289557e-09, 'epoch': 0.99}
99%|█████████▉| 34003/34278 [37:14:26<40:39, 8.87s/it] {'loss': 0.1244, 'grad_norm': 0.7921442542462045, 'learning_rate': 1.687809903105908e-09, 'epoch': 0.99}
99%|█████████▉| 34004/34278 [37:14:30<32:58, 7.22s/it] {'loss': 0.1251, 'grad_norm': 0.8050298833471278, 'learning_rate': 1.6755579245208986e-09, 'epoch': 0.99}
99%|█████████▉| 34005/34278 [37:14:33<27:42, 6.09s/it] {'loss': 0.1231, 'grad_norm': 0.7913452179414457, 'learning_rate': 1.6633505697832842e-09, 'epoch': 0.99}
99%|█████████▉| 34006/34278 [37:14:38<25:50, 5.70s/it] {'loss': 0.1131, 'grad_norm': 0.8524388524735569, 'learning_rate': 1.6511878390018664e-09, 'epoch': 0.99}
99%|█████████▉| 34007/34278 [37:14:41<22:29, 4.98s/it] {'loss': 0.1034, 'grad_norm': 0.7207385234256999, 'learning_rate': 1.6390697322854476e-09, 'epoch': 0.99}
99%|█████████▉| 34008/34278 [37:14:45<19:55, 4.43s/it] {'loss': 0.1023, 'grad_norm': 0.892106125992658, 'learning_rate': 1.6269962497422742e-09, 'epoch': 0.99}
99%|█████████▉| 34009/34278 [37:14:48<17:58, 4.01s/it] {'loss': 0.1108, 'grad_norm': 0.8920521212411752, 'learning_rate': 1.6149673914800379e-09, 'epoch': 0.99}
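The "Set eos token ... / Set eos token back ..." messages at the top are printed once per rank: the trainer temporarily overrides the tokenizer's EOS with `<|diff_marker|>` (id 151658) and later restores `<|im_end|>` (id 151645) plus the default generation-config ids [151645, 151643]. A minimal sketch of that swap-and-restore pattern, using hypothetical stand-in classes (`Tokenizer`, `GenerationConfig`) rather than the run's real Hugging Face objects:

```python
from contextlib import contextmanager
from dataclasses import dataclass, field
from typing import List

# Hypothetical stand-ins; the real run mutates a transformers tokenizer
# and model.generation_config in place.
@dataclass
class Tokenizer:
    eos_token: str = "<|im_end|>"
    eos_token_id: int = 151645

@dataclass
class GenerationConfig:
    eos_token_id: List[int] = field(default_factory=lambda: [151645, 151643])

@contextmanager
def override_eos(tokenizer, gen_config, token, token_id):
    """Temporarily point EOS at a custom stop token, then restore the originals."""
    saved = (tokenizer.eos_token, tokenizer.eos_token_id, list(gen_config.eos_token_id))
    tokenizer.eos_token = token
    tokenizer.eos_token_id = token_id
    gen_config.eos_token_id = [token_id]
    print(f"Set eos token to {token}")
    print(f"Set eos token id to [{token_id}]")
    try:
        yield
    finally:
        tokenizer.eos_token, tokenizer.eos_token_id = saved[0], saved[1]
        gen_config.eos_token_id = saved[2]
        print(f"Set eos token back to {saved[0]}")
        print(f"Set generation config eos token id back to {saved[2]}")

tok, cfg = Tokenizer(), GenerationConfig()
with override_eos(tok, cfg, "<|diff_marker|>", 151658):
    assert tok.eos_token_id == 151658 and cfg.eos_token_id == [151658]
assert tok.eos_token == "<|im_end|>" and cfg.eos_token_id == [151645, 151643]
```

Because every rank executes this independently, the messages appear once per process and interleave in the captured stdout.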
99%|█████████▉| 34010/34278 [37:14:51<16:39, 3.73s/it] {'loss': 0.1082, 'grad_norm': 0.8646521052733187, 'learning_rate': 1.6029831576064303e-09, 'epoch': 0.99}
99%|█████████▉| 34011/34278 [37:14:55<17:08, 3.85s/it] {'loss': 0.1195, 'grad_norm': 0.9169541937022215, 'learning_rate': 1.591043548228033e-09, 'epoch': 0.99}
99%|█████████▉| 34012/34278 [37:14:58<16:00, 3.61s/it] {'loss': 0.0878, 'grad_norm': 0.7139755905377889, 'learning_rate': 1.5791485634514269e-09, 'epoch': 0.99}
99%|█████████▉| 34013/34278 [37:15:01<15:38, 3.54s/it] {'loss': 0.1181, 'grad_norm': 0.9725231084367686, 'learning_rate': 1.5672982033831941e-09, 'epoch': 0.99}
99%|█████████▉| 34014/34278 [37:15:05<16:28, 3.75s/it] {'loss': 0.1043, 'grad_norm': 0.7994798428439593, 'learning_rate': 1.5554924681288052e-09, 'epoch': 0.99}
99%|█████████▉| 34015/34278 [37:15:10<17:20, 3.95s/it] {'loss': 0.1302, 'grad_norm': 0.7636802346871505, 'learning_rate': 1.543731357793732e-09, 'epoch': 0.99}
99%|█████████▉| 34016/34278 [37:15:13<15:58, 3.66s/it] {'loss': 0.1101, 'grad_norm': 0.7368600712341854, 'learning_rate': 1.532014872483445e-09, 'epoch': 0.99}
99%|█████████▉| 34017/34278 [37:15:16<14:58, 3.44s/it] {'loss': 0.0941, 'grad_norm': 0.7183991162784704, 'learning_rate': 1.5203430123011953e-09, 'epoch': 0.99}
99%|█████████▉| 34018/34278 [37:15:20<15:32, 3.59s/it] {'loss': 0.1088, 'grad_norm': 0.8763857078528436, 'learning_rate': 1.5087157773530092e-09, 'epoch': 0.99}
99%|█████████▉| 34019/34278 [37:15:22<14:17, 3.31s/it]
{'loss': 0.1064, 'grad_norm': 0.7429516001128845, 'learning_rate': 1.4971331677410273e-09, 'epoch': 0.99}
99%|█████████▉| 34020/34278 [37:15:28<17:31, 4.08s/it] {'loss': 0.1174, 'grad_norm': 0.7613951574229304, 'learning_rate': 1.4855951835696102e-09, 'epoch': 0.99}
99%|█████████▉| 34021/34278 [37:15:31<16:05, 3.76s/it] {'loss': 0.098, 'grad_norm': 0.7740688048253014, 'learning_rate': 1.4741018249420091e-09, 'epoch': 0.99}
99%|█████████▉| 34022/34278 [37:15:34<14:56, 3.50s/it] {'loss': 0.1332, 'grad_norm': 0.8195838171484119, 'learning_rate': 1.4626530919598093e-09, 'epoch': 0.99}
99%|█████████▉| 34023/34278 [37:15:38<15:03, 3.54s/it] {'loss': 0.085, 'grad_norm': 1.2659801867341571, 'learning_rate': 1.4512489847262612e-09, 'epoch': 0.99}
Traceback (most recent call last):
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 359, in __getitem__
    sample = self._get_item(i)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 407, in _get_item
    data_dict = self.preprocess_qwen2vl_v3(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 703, in preprocess_qwen2vl_v3
    return self.preprocess_qwen2vl(
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 512, in preprocess_qwen2vl
    image_inputs = load_image_from_ceph(self.tcs_loader, conv)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 157, in load_image_from_ceph
    image = tcs_loader(image)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 64, in __call__
    img = pil_loader(img_value_str)
  File "/mnt/hwfile/liuzhaoyang/workspace/GUIAgent/qwen2vl_gui/src/aguvis/dataset.py", line 40, in pil_loader
    img = Image.open(buff)
  File "/mnt/petrelfs/liuzhaoyang/workspace/programs/miniconda3/envs/aguvis/lib/python3.10/site-packages/PIL/Image.py", line 3536, in open
    raise UnidentifiedImageError(msg)
PIL.UnidentifiedImageError: cannot identify image file <_io.BytesIO object at 0x7fafd81d5170>
Failed to fetch sample 2735737. Exception: cannot identify image file <_io.BytesIO object at 0x7fafd81d5170>
99%|█████████▉| 34024/34278 [37:15:41<14:51, 3.51s/it] {'loss': 0.1159, 'grad_norm': 0.885384170892487, 'learning_rate': 1.4398895033423954e-09, 'epoch': 0.99}
99%|█████████▉| 34025/34278 [37:15:44<14:17, 3.39s/it] {'loss': 0.1043, 'grad_norm': 0.9042474170601904, 'learning_rate': 1.4285746479097973e-09, 'epoch': 0.99}
99%|█████████▉| 34026/34278 [37:15:48<14:32, 3.46s/it] {'loss': 0.1019, 'grad_norm': 0.7205575069692667, 'learning_rate': 1.4173044185300522e-09, 'epoch': 0.99}
99%|█████████▉| 34027/34278 [37:15:51<14:03, 3.36s/it] {'loss': 0.0908, 'grad_norm': 0.7391088876462139, 'learning_rate': 1.4060788153030802e-09, 'epoch': 0.99}
99%|█████████▉| 34028/34278 [37:15:54<13:18, 3.19s/it] {'loss': 0.121, 'grad_norm': 0.8612764153386636, 'learning_rate': 1.3948978383293565e-09, 'epoch': 0.99}
99%|█████████▉| 34029/34278 [37:16:00<16:41, 4.02s/it] {'loss': 0.1133, 'grad_norm': 0.6940616724774372, 'learning_rate': 1.3837614877088013e-09, 'epoch': 0.99}
99%|█████████▉| 34030/34278 [37:16:03<15:10, 3.67s/it] {'loss': 0.1256, 'grad_norm': 0.8593610006395457, 'learning_rate': 1.3726697635407792e-09, 'epoch': 0.99}
99%|█████████▉| 34030/34278 [37:16:03<15:10,
3.67s/it] 99%|█████████▉| 34031/34278 [37:16:08<17:21, 4.22s/it] {'loss': 0.1249, 'grad_norm': 0.7378808583037828, 'learning_rate': 1.3616226659246557e-09, 'epoch': 0.99} 99%|█████████▉| 34031/34278 [37:16:08<17:21, 4.22s/it] 99%|█████████▉| 34032/34278 [37:16:12<17:07, 4.18s/it] {'loss': 0.1145, 'grad_norm': 1.4935313770643186, 'learning_rate': 1.35062019495813e-09, 'epoch': 0.99} 99%|█████████▉| 34032/34278 [37:16:12<17:07, 4.18s/it] 99%|█████████▉| 34033/34278 [37:16:16<16:22, 4.01s/it] {'loss': 0.0982, 'grad_norm': 0.7204585551341403, 'learning_rate': 1.339662350740012e-09, 'epoch': 0.99} 99%|█████████▉| 34033/34278 [37:16:16<16:22, 4.01s/it] 99%|█████████▉| 34034/34278 [37:16:22<18:47, 4.62s/it] {'loss': 0.1038, 'grad_norm': 0.6170457621250698, 'learning_rate': 1.3287491333685564e-09, 'epoch': 0.99} 99%|█████████▉| 34034/34278 [37:16:22<18:47, 4.62s/it] 99%|█████████▉| 34035/34278 [37:16:25<16:43, 4.13s/it] {'loss': 0.1004, 'grad_norm': 0.7836008503839021, 'learning_rate': 1.317880542940353e-09, 'epoch': 0.99} 99%|█████████▉| 34035/34278 [37:16:25<16:43, 4.13s/it] 99%|█████████▉| 34036/34278 [37:16:29<16:17, 4.04s/it] {'loss': 0.128, 'grad_norm': 0.7978269990252792, 'learning_rate': 1.3070565795531009e-09, 'epoch': 0.99} 99%|█████████▉| 34036/34278 [37:16:29<16:17, 4.04s/it] 99%|█████████▉| 34037/34278 [37:16:35<19:04, 4.75s/it] {'loss': 0.0966, 'grad_norm': 1.0070737496571687, 'learning_rate': 1.2962772433028347e-09, 'epoch': 0.99} 99%|█████████▉| 34037/34278 [37:16:35<19:04, 4.75s/it] 99%|█████████▉| 34038/34278 [37:16:41<20:36, 5.15s/it] {'loss': 0.1446, 'grad_norm': 1.0961206417910445, 'learning_rate': 1.2855425342861439e-09, 'epoch': 0.99} 99%|█████████▉| 34038/34278 [37:16:41<20:36, 5.15s/it] 99%|█████████▉| 34039/34278 [37:16:45<19:02, 4.78s/it] {'loss': 0.1109, 'grad_norm': 0.8363924212235179, 'learning_rate': 1.2748524525990624e-09, 'epoch': 0.99} 99%|█████████▉| 34039/34278 [37:16:45<19:02, 4.78s/it] 99%|█████████▉| 34040/34278 [37:16:48<16:55, 
4.27s/it] {'loss': 0.1214, 'grad_norm': 0.6959908934163784, 'learning_rate': 1.2642069983370698e-09, 'epoch': 0.99} 99%|█████████▉| 34040/34278 [37:16:48<16:55, 4.27s/it] 99%|█████████▉| 34041/34278 [37:16:51<15:34, 3.94s/it] {'loss': 0.0952, 'grad_norm': 0.66674851157858, 'learning_rate': 1.2536061715945346e-09, 'epoch': 0.99} 99%|█████████▉| 34041/34278 [37:16:51<15:34, 3.94s/it] 99%|█████████▉| 34042/34278 [37:16:54<14:16, 3.63s/it] {'loss': 0.1113, 'grad_norm': 0.9546778122469106, 'learning_rate': 1.2430499724663813e-09, 'epoch': 0.99} 99%|█████████▉| 34042/34278 [37:16:54<14:16, 3.63s/it] 99%|█████████▉| 34043/34278 [37:17:00<16:26, 4.20s/it] {'loss': 0.1114, 'grad_norm': 0.6442948551810961, 'learning_rate': 1.232538401047534e-09, 'epoch': 0.99} 99%|█████████▉| 34043/34278 [37:17:00<16:26, 4.20s/it] 99%|█████████▉| 34044/34278 [37:17:03<14:59, 3.84s/it] {'loss': 0.0965, 'grad_norm': 0.6836414226523234, 'learning_rate': 1.2220714574306957e-09, 'epoch': 0.99} 99%|█████████▉| 34044/34278 [37:17:03<14:59, 3.84s/it] 99%|█████████▉| 34045/34278 [37:17:06<14:12, 3.66s/it] {'loss': 0.1001, 'grad_norm': 0.779165312058414, 'learning_rate': 1.211649141710236e-09, 'epoch': 0.99} 99%|█████████▉| 34045/34278 [37:17:06<14:12, 3.66s/it] 99%|█████████▉| 34046/34278 [37:17:12<16:42, 4.32s/it] {'loss': 0.1185, 'grad_norm': 0.8468336953767481, 'learning_rate': 1.2012714539788585e-09, 'epoch': 0.99} 99%|█████████▉| 34046/34278 [37:17:12<16:42, 4.32s/it] 99%|█████████▉| 34047/34278 [37:17:15<15:19, 3.98s/it] {'loss': 0.1008, 'grad_norm': 0.6729363274783582, 'learning_rate': 1.190938394328711e-09, 'epoch': 0.99} 99%|█████████▉| 34047/34278 [37:17:15<15:19, 3.98s/it] 99%|█████████▉| 34048/34278 [37:17:19<14:53, 3.88s/it] {'loss': 0.1276, 'grad_norm': 0.7754525527136992, 'learning_rate': 1.1806499628530531e-09, 'epoch': 0.99} 99%|█████████▉| 34048/34278 [37:17:19<14:53, 3.88s/it] 99%|█████████▉| 34049/34278 [37:17:22<14:03, 3.68s/it] {'loss': 0.1208, 'grad_norm': 0.8506307230635818, 
'learning_rate': 1.1704061596434779e-09, 'epoch': 0.99} 99%|█████████▉| 34049/34278 [37:17:22<14:03, 3.68s/it] 99%|█████████▉| 34050/34278 [37:17:25<13:15, 3.49s/it] {'loss': 0.1013, 'grad_norm': 1.023780179462189, 'learning_rate': 1.1602069847904685e-09, 'epoch': 0.99} 99%|█████████▉| 34050/34278 [37:17:25<13:15, 3.49s/it] 99%|█████████▉| 34051/34278 [37:17:29<13:14, 3.50s/it] {'loss': 0.1192, 'grad_norm': 0.6617346585707174, 'learning_rate': 1.1500524383861734e-09, 'epoch': 0.99} 99%|█████████▉| 34051/34278 [37:17:29<13:14, 3.50s/it] 99%|█████████▉| 34052/34278 [37:17:33<13:42, 3.64s/it] {'loss': 0.1371, 'grad_norm': 0.9134584862081634, 'learning_rate': 1.1399425205210758e-09, 'epoch': 0.99} 99%|█████████▉| 34052/34278 [37:17:33<13:42, 3.64s/it] 99%|█████████▉| 34053/34278 [37:17:35<12:24, 3.31s/it] {'loss': 0.1011, 'grad_norm': 0.7653441795668655, 'learning_rate': 1.1298772312851036e-09, 'epoch': 0.99} 99%|█████████▉| 34053/34278 [37:17:35<12:24, 3.31s/it] 99%|█████████▉| 34054/34278 [37:17:39<12:47, 3.43s/it] {'loss': 0.1062, 'grad_norm': 0.717620650852394, 'learning_rate': 1.1198565707681852e-09, 'epoch': 0.99} 99%|█████████▉| 34054/34278 [37:17:39<12:47, 3.43s/it] 99%|█████████▉| 34055/34278 [37:17:43<13:27, 3.62s/it] {'loss': 0.1016, 'grad_norm': 1.0198982643740935, 'learning_rate': 1.1098805390602486e-09, 'epoch': 0.99} 99%|█████████▉| 34055/34278 [37:17:43<13:27, 3.62s/it] 99%|█████████▉| 34056/34278 [37:17:47<13:52, 3.75s/it] {'loss': 0.1226, 'grad_norm': 0.8518773308694678, 'learning_rate': 1.0999491362495563e-09, 'epoch': 0.99} 99%|█████████▉| 34056/34278 [37:17:47<13:52, 3.75s/it] 99%|█████████▉| 34057/34278 [37:17:50<12:41, 3.45s/it] {'loss': 0.0971, 'grad_norm': 0.7623908621128579, 'learning_rate': 1.0900623624254814e-09, 'epoch': 0.99} 99%|█████████▉| 34057/34278 [37:17:50<12:41, 3.45s/it] 99%|█████████▉| 34058/34278 [37:17:53<12:17, 3.35s/it] {'loss': 0.1168, 'grad_norm': 0.7211693048706922, 'learning_rate': 1.0802202176757314e-09, 'epoch': 0.99} 
99%|█████████▉| 34058/34278 [37:17:53<12:17, 3.35s/it] 99%|█████████▉| 34059/34278 [37:17:57<13:29, 3.69s/it] {'loss': 0.1154, 'grad_norm': 0.8603851488590624, 'learning_rate': 1.0704227020885694e-09, 'epoch': 0.99} 99%|█████████▉| 34059/34278 [37:17:57<13:29, 3.69s/it] 99%|█████████▉| 34060/34278 [37:18:01<12:58, 3.57s/it] {'loss': 0.1058, 'grad_norm': 1.0251546069850566, 'learning_rate': 1.0606698157511475e-09, 'epoch': 0.99} 99%|█████████▉| 34060/34278 [37:18:01<12:58, 3.57s/it] 99%|█████████▉| 34061/34278 [37:18:04<12:35, 3.48s/it] {'loss': 0.1007, 'grad_norm': 0.7998863832192737, 'learning_rate': 1.0509615587506183e-09, 'epoch': 0.99} 99%|█████████▉| 34061/34278 [37:18:04<12:35, 3.48s/it] 99%|█████████▉| 34062/34278 [37:18:07<12:06, 3.36s/it] {'loss': 0.127, 'grad_norm': 0.8498521076196066, 'learning_rate': 1.0412979311741345e-09, 'epoch': 0.99} 99%|█████████▉| 34062/34278 [37:18:07<12:06, 3.36s/it] 99%|█████████▉| 34063/34278 [37:18:12<13:25, 3.74s/it] {'loss': 0.1018, 'grad_norm': 0.8093687737428069, 'learning_rate': 1.031678933107183e-09, 'epoch': 0.99} 99%|█████████▉| 34063/34278 [37:18:12<13:25, 3.74s/it] 99%|█████████▉| 34064/34278 [37:18:15<12:40, 3.55s/it] {'loss': 0.1114, 'grad_norm': 0.7804226128275563, 'learning_rate': 1.0221045646363615e-09, 'epoch': 0.99} 99%|█████████▉| 34064/34278 [37:18:15<12:40, 3.55s/it] 99%|█████████▉| 34065/34278 [37:18:20<14:31, 4.09s/it] {'loss': 0.1374, 'grad_norm': 0.9194000153692374, 'learning_rate': 1.0125748258471569e-09, 'epoch': 0.99} 99%|█████████▉| 34065/34278 [37:18:20<14:31, 4.09s/it] 99%|█████████▉| 34066/34278 [37:18:24<14:36, 4.14s/it] {'loss': 0.1355, 'grad_norm': 0.7797226722297699, 'learning_rate': 1.0030897168239462e-09, 'epoch': 0.99} 99%|█████████▉| 34066/34278 [37:18:24<14:36, 4.14s/it] 99%|█████████▉| 34067/34278 [37:18:28<14:14, 4.05s/it] {'loss': 0.1329, 'grad_norm': 1.0513752125003564, 'learning_rate': 9.936492376516616e-10, 'epoch': 0.99} 99%|█████████▉| 34067/34278 [37:18:28<14:14, 4.05s/it] 
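The earlier "Failed to fetch sample 2735737. Exception: cannot identify image file ..." line shows that training survived the corrupt image: the dataset catches the loader exception, logs it, and fetches a different sample instead of crashing the run. A minimal sketch of that retry pattern, with a hypothetical `decode` function standing in for the real `pil_loader`/`tcs_loader` pipeline (the actual dataset code is not shown in this log):

```python
class RobustImageDataset:
    """Dataset whose __getitem__ survives undecodable images by resampling."""

    def __init__(self, samples, decode, max_retries=10):
        self.samples = samples        # index -> raw image bytes
        self.decode = decode          # raises on bytes it cannot identify
        self.max_retries = max_retries

    def __len__(self):
        return len(self.samples)

    def __getitem__(self, i):
        for _ in range(self.max_retries):
            try:
                return self.decode(self.samples[i])
            except Exception as e:    # e.g. PIL.UnidentifiedImageError
                print(f"Failed to fetch sample {i}. Exception: {e}")
                i = (i + 1) % len(self)   # fall back to a neighbouring sample
        raise RuntimeError(f"gave up after {self.max_retries} corrupt samples")

def decode(data: bytes) -> str:
    # Stand-in for Image.open(io.BytesIO(data)): only PNG-magic bytes decode.
    if not data.startswith(b"\x89PNG"):
        raise ValueError("cannot identify image file")
    return "image"

ds = RobustImageDataset([b"\x89PNG..a", b"corrupt", b"\x89PNG..b"], decode)
```

The bounded retry matters: if every fallback is also corrupt, it is better to fail loudly than to loop forever inside a DataLoader worker.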
99%|█████████▉| 34068/34278 [37:18:34<16:21, 4.67s/it] {'loss': 0.0866, 'grad_norm': 0.6836143583423224, 'learning_rate': 9.842533884146798e-10, 'epoch': 0.99} 99%|█████████▉| 34068/34278 [37:18:34<16:21, 4.67s/it] 99%|█████████▉| 34069/34278 [37:18:38<15:46, 4.53s/it] {'loss': 0.1356, 'grad_norm': 0.8211784816659222, 'learning_rate': 9.749021691973781e-10, 'epoch': 0.99} 99%|█████████▉| 34069/34278 [37:18:38<15:46, 4.53s/it] 99%|█████████▉| 34070/34278 [37:18:41<14:06, 4.07s/it] {'loss': 0.1149, 'grad_norm': 1.1690471816891186, 'learning_rate': 9.655955800824679e-10, 'epoch': 0.99} 99%|█████████▉| 34070/34278 [37:18:41<14:06, 4.07s/it] 99%|█████████▉| 34071/34278 [37:18:45<13:22, 3.87s/it] {'loss': 0.1108, 'grad_norm': 0.8040608420154217, 'learning_rate': 9.563336211532159e-10, 'epoch': 0.99} 99%|█████████▉| 34071/34278 [37:18:45<13:22, 3.87s/it] 99%|█████████▉| 34072/34278 [37:18:48<12:33, 3.66s/it] {'loss': 0.1148, 'grad_norm': 0.9887116452727273, 'learning_rate': 9.471162924928888e-10, 'epoch': 0.99} 99%|█████████▉| 34072/34278 [37:18:48<12:33, 3.66s/it] 99%|█████████▉| 34073/34278 [37:18:52<12:28, 3.65s/it] {'loss': 0.1247, 'grad_norm': 0.9079191114011983, 'learning_rate': 9.379435941830884e-10, 'epoch': 0.99} 99%|█████████▉| 34073/34278 [37:18:52<12:28, 3.65s/it] 99%|█████████▉| 34074/34278 [37:18:54<11:24, 3.35s/it] {'loss': 0.1216, 'grad_norm': 0.8907520479518357, 'learning_rate': 9.288155263059706e-10, 'epoch': 0.99} 99%|█████████▉| 34074/34278 [37:18:54<11:24, 3.35s/it] 99%|█████████▉| 34075/34278 [37:18:58<11:43, 3.47s/it] {'loss': 0.1117, 'grad_norm': 0.9711221010827501, 'learning_rate': 9.19732088942582e-10, 'epoch': 0.99} 99%|█████████▉| 34075/34278 [37:18:58<11:43, 3.47s/it] 99%|█████████▉| 34076/34278 [37:19:02<11:48, 3.51s/it] {'loss': 0.1195, 'grad_norm': 0.8999582269382925, 'learning_rate': 9.106932821750791e-10, 'epoch': 0.99} 99%|█████████▉| 34076/34278 [37:19:02<11:48, 3.51s/it] 99%|█████████▉| 34077/34278 [37:19:05<11:26, 3.42s/it] {'loss': 
0.101, 'grad_norm': 0.6759357107902308, 'learning_rate': 9.01699106083398e-10, 'epoch': 0.99} 99%|█████████▉| 34077/34278 [37:19:05<11:26, 3.42s/it] 99%|█████████▉| 34078/34278 [37:19:08<11:08, 3.34s/it] {'loss': 0.0999, 'grad_norm': 0.743747522764742, 'learning_rate': 8.927495607480296e-10, 'epoch': 0.99} 99%|█████████▉| 34078/34278 [37:19:08<11:08, 3.34s/it] 99%|█████████▉| 34079/34278 [37:19:14<13:25, 4.05s/it] {'loss': 0.1026, 'grad_norm': 0.8807387907516453, 'learning_rate': 8.838446462483552e-10, 'epoch': 0.99} 99%|█████████▉| 34079/34278 [37:19:14<13:25, 4.05s/it] 99%|█████████▉| 34080/34278 [37:19:16<12:00, 3.64s/it] {'loss': 0.1084, 'grad_norm': 0.7764928226275828, 'learning_rate': 8.749843626648657e-10, 'epoch': 0.99} 99%|█████████▉| 34080/34278 [37:19:16<12:00, 3.64s/it] 99%|█████████▉| 34081/34278 [37:19:20<11:33, 3.52s/it] {'loss': 0.1233, 'grad_norm': 0.9501748476200179, 'learning_rate': 8.661687100758321e-10, 'epoch': 0.99} 99%|█████████▉| 34081/34278 [37:19:20<11:33, 3.52s/it] 99%|█████████▉| 34082/34278 [37:19:22<10:47, 3.30s/it] {'loss': 0.1073, 'grad_norm': 0.8364287403975764, 'learning_rate': 8.573976885600799e-10, 'epoch': 0.99} 99%|█████████▉| 34082/34278 [37:19:22<10:47, 3.30s/it] 99%|█████████▉| 34083/34278 [37:19:26<10:48, 3.33s/it] {'loss': 0.0953, 'grad_norm': 0.9066436897953483, 'learning_rate': 8.486712981964352e-10, 'epoch': 0.99} 99%|█████████▉| 34083/34278 [37:19:26<10:48, 3.33s/it] 99%|█████████▉| 34084/34278 [37:19:29<10:12, 3.16s/it] {'loss': 0.119, 'grad_norm': 1.050949421943745, 'learning_rate': 8.399895390626134e-10, 'epoch': 0.99} 99%|█████████▉| 34084/34278 [37:19:29<10:12, 3.16s/it] 99%|█████████▉| 34085/34278 [37:19:31<09:57, 3.10s/it] {'loss': 0.1053, 'grad_norm': 0.9541080393983301, 'learning_rate': 8.3135241123522e-10, 'epoch': 0.99} 99%|█████████▉| 34085/34278 [37:19:31<09:57, 3.10s/it] 99%|█████████▉| 34086/34278 [37:19:35<09:51, 3.08s/it] {'loss': 0.1043, 'grad_norm': 0.9015100872206822, 'learning_rate': 
8.22759914793081e-10, 'epoch': 0.99} 99%|█████████▉| 34086/34278 [37:19:35<09:51, 3.08s/it] 99%|█████████▉| 34087/34278 [37:19:38<10:11, 3.20s/it] {'loss': 0.1118, 'grad_norm': 0.8138820065716954, 'learning_rate': 8.142120498111361e-10, 'epoch': 0.99} 99%|█████████▉| 34087/34278 [37:19:38<10:11, 3.20s/it] 99%|█████████▉| 34088/34278 [37:19:43<11:32, 3.64s/it] {'loss': 0.1044, 'grad_norm': 0.9281246681070685, 'learning_rate': 8.057088163671011e-10, 'epoch': 0.99} 99%|█████████▉| 34088/34278 [37:19:43<11:32, 3.64s/it] 99%|█████████▉| 34089/34278 [37:19:45<10:35, 3.36s/it] {'loss': 0.0911, 'grad_norm': 0.7806980332141217, 'learning_rate': 7.972502145359163e-10, 'epoch': 0.99} 99%|█████████▉| 34089/34278 [37:19:45<10:35, 3.36s/it] 99%|█████████▉| 34090/34278 [37:19:49<10:51, 3.47s/it] {'loss': 0.1124, 'grad_norm': 0.7832774255404693, 'learning_rate': 7.888362443936315e-10, 'epoch': 0.99} 99%|█████████▉| 34090/34278 [37:19:49<10:51, 3.47s/it] 99%|█████████▉| 34091/34278 [37:19:52<10:22, 3.33s/it] {'loss': 0.116, 'grad_norm': 0.7879711075617483, 'learning_rate': 7.80466906015187e-10, 'epoch': 0.99} 99%|█████████▉| 34091/34278 [37:19:52<10:22, 3.33s/it] 99%|█████████▉| 34092/34278 [37:19:58<13:01, 4.20s/it] {'loss': 0.1008, 'grad_norm': 0.7983604595183628, 'learning_rate': 7.721421994749678e-10, 'epoch': 0.99} 99%|█████████▉| 34092/34278 [37:19:58<13:01, 4.20s/it] 99%|█████████▉| 34093/34278 [37:20:04<13:52, 4.50s/it] {'loss': 0.1329, 'grad_norm': 0.8004284092173197, 'learning_rate': 7.638621248479139e-10, 'epoch': 0.99} 99%|█████████▉| 34093/34278 [37:20:04<13:52, 4.50s/it] 99%|█████████▉| 34094/34278 [37:20:07<12:23, 4.04s/it] {'loss': 0.1055, 'grad_norm': 0.6984064590860364, 'learning_rate': 7.556266822078551e-10, 'epoch': 0.99} 99%|█████████▉| 34094/34278 [37:20:07<12:23, 4.04s/it] 99%|█████████▉| 34095/34278 [37:20:10<11:43, 3.84s/it] {'loss': 0.1161, 'grad_norm': 0.8736392918962326, 'learning_rate': 7.47435871628066e-10, 'epoch': 0.99} 99%|█████████▉| 34095/34278 
[37:20:10<11:43, 3.84s/it] 99%|█████████▉| 34096/34278 [37:20:15<12:29, 4.12s/it] {'loss': 0.0972, 'grad_norm': 1.0975838347093683, 'learning_rate': 7.392896931818217e-10, 'epoch': 0.99} 99%|█████████▉| 34096/34278 [37:20:15<12:29, 4.12s/it] 99%|█████████▉| 34097/34278 [37:20:18<11:24, 3.78s/it] {'loss': 0.1029, 'grad_norm': 0.7537191566906429, 'learning_rate': 7.311881469418414e-10, 'epoch': 0.99} 99%|█████████▉| 34097/34278 [37:20:18<11:24, 3.78s/it] 99%|█████████▉| 34098/34278 [37:20:21<10:36, 3.54s/it] {'loss': 0.1108, 'grad_norm': 0.7944820515650836, 'learning_rate': 7.231312329802897e-10, 'epoch': 0.99} 99%|█████████▉| 34098/34278 [37:20:21<10:36, 3.54s/it] 99%|█████████▉| 34099/34278 [37:20:24<10:31, 3.53s/it] {'loss': 0.1271, 'grad_norm': 0.7906896917838068, 'learning_rate': 7.151189513687762e-10, 'epoch': 0.99} 99%|█████████▉| 34099/34278 [37:20:24<10:31, 3.53s/it] 99%|█████████▉| 34100/34278 [37:20:30<12:33, 4.23s/it] {'loss': 0.105, 'grad_norm': 0.9187474403789473, 'learning_rate': 7.071513021800202e-10, 'epoch': 0.99} 99%|█████████▉| 34100/34278 [37:20:30<12:33, 4.23s/it] 99%|█████████▉| 34101/34278 [37:20:34<12:42, 4.31s/it] {'loss': 0.1082, 'grad_norm': 0.6981101245865255, 'learning_rate': 6.99228285483966e-10, 'epoch': 0.99} 99%|█████████▉| 34101/34278 [37:20:34<12:42, 4.31s/it] 99%|█████████▉| 34102/34278 [37:20:38<11:58, 4.08s/it] {'loss': 0.1208, 'grad_norm': 0.8331510702176763, 'learning_rate': 6.913499013516678e-10, 'epoch': 0.99} 99%|█████████▉| 34102/34278 [37:20:38<11:58, 4.08s/it] 99%|█████████▉| 34103/34278 [37:20:41<10:44, 3.68s/it] {'loss': 0.1047, 'grad_norm': 0.7078335438354388, 'learning_rate': 6.835161498536246e-10, 'epoch': 0.99} 99%|█████████▉| 34103/34278 [37:20:41<10:44, 3.68s/it] 99%|█████████▉| 34104/34278 [37:20:44<10:22, 3.58s/it] {'loss': 0.1158, 'grad_norm': 0.6408704157548614, 'learning_rate': 6.757270310597808e-10, 'epoch': 0.99} 99%|█████████▉| 34104/34278 [37:20:44<10:22, 3.58s/it] 99%|█████████▉| 34105/34278 
[37:20:47<10:09, 3.52s/it] {'loss': 0.1349, 'grad_norm': 0.7426096241429958, 'learning_rate': 6.679825450395249e-10, 'epoch': 0.99} 99%|█████████▉| 34105/34278 [37:20:47<10:09, 3.52s/it] 99%|█████████▉| 34106/34278 [37:20:51<09:41, 3.38s/it] {'loss': 0.103, 'grad_norm': 1.0062563407269012, 'learning_rate': 6.602826918622463e-10, 'epoch': 0.99} 99%|█████████▉| 34106/34278 [37:20:51<09:41, 3.38s/it] 100%|█████████▉| 34107/34278 [37:20:55<10:14, 3.59s/it] {'loss': 0.0945, 'grad_norm': 0.7157635929457596, 'learning_rate': 6.526274715967784e-10, 'epoch': 1.0} 100%|█████████▉| 34107/34278 [37:20:55<10:14, 3.59s/it] 100%|█████████▉| 34108/34278 [37:21:01<12:13, 4.32s/it] {'loss': 0.1082, 'grad_norm': 0.8810473775563002, 'learning_rate': 6.450168843108451e-10, 'epoch': 1.0} 100%|█████████▉| 34108/34278 [37:21:01<12:13, 4.32s/it] 100%|█████████▉| 34109/34278 [37:21:04<11:25, 4.06s/it] {'loss': 0.1081, 'grad_norm': 0.9297848423141039, 'learning_rate': 6.37450930072725e-10, 'epoch': 1.0} 100%|█████████▉| 34109/34278 [37:21:04<11:25, 4.06s/it] 100%|█████████▉| 34110/34278 [37:21:07<10:28, 3.74s/it] {'loss': 0.1393, 'grad_norm': 1.0768216423786734, 'learning_rate': 6.299296089501417e-10, 'epoch': 1.0} 100%|█████████▉| 34110/34278 [37:21:07<10:28, 3.74s/it] 100%|█████████▉| 34111/34278 [37:21:10<09:49, 3.53s/it] {'loss': 0.1133, 'grad_norm': 0.9025208740283891, 'learning_rate': 6.224529210097086e-10, 'epoch': 1.0} 100%|█████████▉| 34111/34278 [37:21:10<09:49, 3.53s/it] 100%|█████████▉| 34112/34278 [37:21:14<10:19, 3.73s/it] {'loss': 0.1065, 'grad_norm': 0.7784503035521988, 'learning_rate': 6.150208663191492e-10, 'epoch': 1.0} 100%|█████████▉| 34112/34278 [37:21:14<10:19, 3.73s/it] 100%|█████████▉| 34113/34278 [37:21:17<09:30, 3.46s/it] {'loss': 0.1164, 'grad_norm': 0.885480820943917, 'learning_rate': 6.076334449439669e-10, 'epoch': 1.0} 100%|█████████▉| 34113/34278 [37:21:17<09:30, 3.46s/it] 100%|█████████▉| 34114/34278 [37:21:21<09:24, 3.44s/it] {'loss': 0.0943, 'grad_norm': 
0.7198937868159827, 'learning_rate': 6.002906569502199e-10, 'epoch': 1.0} 100%|█████████▉| 34114/34278 [37:21:21<09:24, 3.44s/it] 100%|█████████▉| 34115/34278 [37:21:27<11:24, 4.20s/it] {'loss': 0.1015, 'grad_norm': 0.7557393367661378, 'learning_rate': 5.929925024039663e-10, 'epoch': 1.0} 100%|█████████▉| 34115/34278 [37:21:27<11:24, 4.20s/it] 100%|█████████▉| 34116/34278 [37:21:30<10:25, 3.86s/it] {'loss': 0.1027, 'grad_norm': 0.8592429764339267, 'learning_rate': 5.85738981369599e-10, 'epoch': 1.0} 100%|█████████▉| 34116/34278 [37:21:30<10:25, 3.86s/it] 100%|█████████▉| 34117/34278 [37:21:33<10:17, 3.83s/it] {'loss': 0.1145, 'grad_norm': 1.0326153929875528, 'learning_rate': 5.785300939126215e-10, 'epoch': 1.0} 100%|█████████▉| 34117/34278 [37:21:33<10:17, 3.83s/it] 100%|█████████▉| 34118/34278 [37:21:40<12:09, 4.56s/it] {'loss': 0.1282, 'grad_norm': 1.131867533188281, 'learning_rate': 5.713658400968714e-10, 'epoch': 1.0} 100%|█████████▉| 34118/34278 [37:21:40<12:09, 4.56s/it] 100%|█████████▉| 34119/34278 [37:21:43<11:16, 4.26s/it] {'loss': 0.1201, 'grad_norm': 0.8826275390906089, 'learning_rate': 5.642462199867415e-10, 'epoch': 1.0} 100%|█████████▉| 34119/34278 [37:21:43<11:16, 4.26s/it] 100%|█████████▉| 34120/34278 [37:21:46<10:24, 3.95s/it] {'loss': 0.0991, 'grad_norm': 0.7075084976064901, 'learning_rate': 5.571712336455149e-10, 'epoch': 1.0} 100%|█████████▉| 34120/34278 [37:21:46<10:24, 3.95s/it] 100%|█████████▉| 34121/34278 [37:21:49<09:37, 3.68s/it] {'loss': 0.09, 'grad_norm': 1.005407846859457, 'learning_rate': 5.501408811364739e-10, 'epoch': 1.0} 100%|█████████▉| 34121/34278 [37:21:49<09:37, 3.68s/it] 100%|█████████▉| 34122/34278 [37:21:53<09:19, 3.59s/it] {'loss': 0.1022, 'grad_norm': 1.0504259355479635, 'learning_rate': 5.431551625223463e-10, 'epoch': 1.0} 100%|█████████▉| 34122/34278 [37:21:53<09:19, 3.59s/it] 100%|█████████▉| 34123/34278 [37:21:56<09:12, 3.56s/it] {'loss': 0.0963, 'grad_norm': 0.8581664648419247, 'learning_rate': 5.362140778647495e-10, 
'epoch': 1.0} 100%|█████████▉| 34123/34278 [37:21:56<09:12, 3.56s/it] 100%|█████████▉| 34124/34278 [37:21:59<08:40, 3.38s/it] {'loss': 0.1077, 'grad_norm': 0.7794675314709526, 'learning_rate': 5.293176272269662e-10, 'epoch': 1.0} 100%|█████████▉| 34124/34278 [37:21:59<08:40, 3.38s/it] 100%|█████████▉| 34125/34278 [37:22:02<08:09, 3.20s/it] {'loss': 0.1015, 'grad_norm': 0.8426456159811151, 'learning_rate': 5.224658106700586e-10, 'epoch': 1.0} 100%|█████████▉| 34125/34278 [37:22:02<08:09, 3.20s/it] 100%|█████████▉| 34126/34278 [37:22:08<10:14, 4.04s/it] {'loss': 0.1106, 'grad_norm': 0.6585376206474072, 'learning_rate': 5.15658628255089e-10, 'epoch': 1.0} 100%|█████████▉| 34126/34278 [37:22:08<10:14, 4.04s/it] 100%|█████████▉| 34127/34278 [37:22:11<09:37, 3.83s/it] {'loss': 0.1218, 'grad_norm': 0.7160620826893653, 'learning_rate': 5.088960800425646e-10, 'epoch': 1.0} 100%|█████████▉| 34127/34278 [37:22:11<09:37, 3.83s/it] 100%|█████████▉| 34128/34278 [37:22:14<08:56, 3.58s/it] {'loss': 0.1331, 'grad_norm': 0.9661865435285912, 'learning_rate': 5.021781660935477e-10, 'epoch': 1.0} 100%|█████████▉| 34128/34278 [37:22:14<08:56, 3.58s/it] 100%|█████████▉| 34129/34278 [37:22:18<08:40, 3.49s/it] {'loss': 0.1049, 'grad_norm': 0.7367700762748168, 'learning_rate': 4.95504886467435e-10, 'epoch': 1.0} 100%|█████████▉| 34129/34278 [37:22:18<08:40, 3.49s/it] 100%|█████████▉| 34130/34278 [37:22:21<08:09, 3.30s/it] {'loss': 0.1138, 'grad_norm': 0.8153560725549933, 'learning_rate': 4.888762412236236e-10, 'epoch': 1.0} 100%|█████████▉| 34130/34278 [37:22:21<08:09, 3.30s/it] 100%|█████████▉| 34131/34278 [37:22:24<08:24, 3.43s/it] {'loss': 0.1305, 'grad_norm': 0.810661963401519, 'learning_rate': 4.822922304220656e-10, 'epoch': 1.0} 100%|█████████▉| 34131/34278 [37:22:24<08:24, 3.43s/it] 100%|█████████▉| 34132/34278 [37:22:28<08:27, 3.47s/it] {'loss': 0.1035, 'grad_norm': 0.8503076546543817, 'learning_rate': 4.757528541210476e-10, 'epoch': 1.0} 100%|█████████▉| 34132/34278 
34133/34278 [37:22:32<08:33, 3.54s/it] {'loss': 0.1124, 'grad_norm': 0.898054712318861, 'learning_rate': 4.692581123788564e-10, 'epoch': 1.0}
34134/34278 [37:22:35<08:08, 3.39s/it] {'loss': 0.1094, 'grad_norm': 0.8063388173715647, 'learning_rate': 4.628080052537787e-10, 'epoch': 1.0}
34135/34278 [37:22:37<07:43, 3.24s/it] {'loss': 0.1102, 'grad_norm': 0.928304486485139, 'learning_rate': 4.5640253280299084e-10, 'epoch': 1.0}
34136/34278 [37:22:40<07:28, 3.16s/it] {'loss': 0.1039, 'grad_norm': 0.7833583767426001, 'learning_rate': 4.500416950842246e-10, 'epoch': 1.0}
34137/34278 [37:22:46<09:00, 3.83s/it] {'loss': 0.1062, 'grad_norm': 0.7515909937925331, 'learning_rate': 4.437254921541012e-10, 'epoch': 1.0}
34138/34278 [37:22:49<08:34, 3.68s/it] {'loss': 0.0882, 'grad_norm': 0.7038696971438514, 'learning_rate': 4.3745392406868705e-10, 'epoch': 1.0}
34139/34278 [37:22:52<08:00, 3.46s/it] {'loss': 0.0885, 'grad_norm': 0.8055358729293154, 'learning_rate': 4.312269908840483e-10, 'epoch': 1.0}
34140/34278 [37:22:55<07:39, 3.33s/it] {'loss': 0.0983, 'grad_norm': 0.6796352115610131, 'learning_rate': 4.2504469265625124e-10, 'epoch': 1.0}
34141/34278 [37:22:58<07:27, 3.27s/it] {'loss': 0.1019, 'grad_norm': 0.7239870930231954, 'learning_rate': 4.1890702944025195e-10, 'epoch': 1.0}
34142/34278 [37:23:01<07:21, 3.25s/it] {'loss': 0.1002, 'grad_norm': 0.9938002737382183, 'learning_rate': 4.1281400129045136e-10, 'epoch': 1.0}
34143/34278 [37:23:05<07:09, 3.18s/it] {'loss': 0.0918, 'grad_norm': 0.7459366688459611, 'learning_rate': 4.0676560826180544e-10, 'epoch': 1.0}
34144/34278 [37:23:07<06:58, 3.12s/it] {'loss': 0.1104, 'grad_norm': 0.7463550455355336, 'learning_rate': 4.0076185040760497e-10, 'epoch': 1.0}
34145/34278 [37:23:11<07:25, 3.35s/it] {'loss': 0.1001, 'grad_norm': 0.7699426460408898, 'learning_rate': 3.948027277822508e-10, 'epoch': 1.0}
34146/34278 [37:23:14<07:02, 3.20s/it] {'loss': 0.1078, 'grad_norm': 0.7900420373373584, 'learning_rate': 3.888882404384786e-10, 'epoch': 1.0}
34147/34278 [37:23:17<06:52, 3.15s/it] {'loss': 0.1013, 'grad_norm': 1.1382369025979637, 'learning_rate': 3.8301838842957905e-10, 'epoch': 1.0}
34148/34278 [37:23:21<07:05, 3.27s/it] {'loss': 0.1096, 'grad_norm': 1.3269699975868081, 'learning_rate': 3.771931718071775e-10, 'epoch': 1.0}
34149/34278 [37:23:25<07:32, 3.51s/it] {'loss': 0.1235, 'grad_norm': 0.7587070204404274, 'learning_rate': 3.714125906234545e-10, 'epoch': 1.0}
34150/34278 [37:23:29<07:39, 3.59s/it] {'loss': 0.1095, 'grad_norm': 0.8446558768476956, 'learning_rate': 3.656766449305904e-10, 'epoch': 1.0}
34151/34278 [37:23:33<07:49, 3.69s/it] {'loss': 0.1127, 'grad_norm': 0.8949934761445224, 'learning_rate': 3.599853347796556e-10, 'epoch': 1.0}
34152/34278 [37:23:36<07:27, 3.55s/it] {'loss': 0.1294, 'grad_norm': 1.0200940014131596, 'learning_rate': 3.5433866022116516e-10, 'epoch': 1.0}
34153/34278 [37:23:41<08:22, 4.02s/it] {'loss': 0.1054, 'grad_norm': 0.90758597346904, 'learning_rate': 3.4873662130563425e-10, 'epoch': 1.0}
34154/34278 [37:23:46<09:12, 4.45s/it] {'loss': 0.1097, 'grad_norm': 0.7143673488871289, 'learning_rate': 3.4317921808302293e-10, 'epoch': 1.0}
34155/34278 [37:23:50<08:23, 4.09s/it] {'loss': 0.1004, 'grad_norm': 0.6712702705515735, 'learning_rate': 3.3766645060273605e-10, 'epoch': 1.0}
34156/34278 [37:23:53<07:38, 3.76s/it] {'loss': 0.1132, 'grad_norm': 0.9627409957114645, 'learning_rate': 3.3219831891417863e-10, 'epoch': 1.0}
34157/34278 [37:23:56<07:11, 3.57s/it] {'loss': 0.1004, 'grad_norm': 0.8457140099250648, 'learning_rate': 3.2677482306675554e-10, 'epoch': 1.0}
34158/34278 [37:24:01<08:00, 4.00s/it] {'loss': 0.1104, 'grad_norm': 0.6630682088470647, 'learning_rate': 3.213959631082064e-10, 'epoch': 1.0}
34159/34278 [37:24:04<07:42, 3.89s/it] {'loss': 0.1126, 'grad_norm': 1.7996736791987658, 'learning_rate': 3.160617390862708e-10, 'epoch': 1.0}
34160/34278 [37:24:10<08:55, 4.54s/it] {'loss': 0.092, 'grad_norm': 0.9877985269285867, 'learning_rate': 3.107721510497985e-10, 'epoch': 1.0}
34161/34278 [37:24:14<08:19, 4.27s/it] {'loss': 0.0887, 'grad_norm': 0.6839949995421452, 'learning_rate': 3.055271990448638e-10, 'epoch': 1.0}
34162/34278 [37:24:17<07:35, 3.93s/it] {'loss': 0.1097, 'grad_norm': 0.7442010323329299, 'learning_rate': 3.003268831180961e-10, 'epoch': 1.0}
34163/34278 [37:24:20<07:02, 3.68s/it] {'loss': 0.1165, 'grad_norm': 0.8848179072753259, 'learning_rate': 2.951712033172349e-10, 'epoch': 1.0}
34164/34278 [37:24:23<06:29, 3.42s/it] {'loss': 0.0826, 'grad_norm': 0.7790254380643167, 'learning_rate': 2.900601596872443e-10, 'epoch': 1.0}
34165/34278 [37:24:27<06:53, 3.66s/it] {'loss': 0.1104, 'grad_norm': 0.7864299898033266, 'learning_rate': 2.8499375227419854e-10, 'epoch': 1.0}
34166/34278 [37:24:31<06:40, 3.58s/it] {'loss': 0.1231, 'grad_norm': 0.8027462154254484, 'learning_rate': 2.7997198112306167e-10, 'epoch': 1.0}
34167/34278 [37:24:34<06:15, 3.38s/it] {'loss': 0.1233, 'grad_norm': 1.0899454076932567, 'learning_rate': 2.749948462787977e-10, 'epoch': 1.0}
34168/34278 [37:24:38<06:30, 3.55s/it] {'loss': 0.1195, 'grad_norm': 0.8119848674925277, 'learning_rate': 2.700623477858155e-10, 'epoch': 1.0}
34169/34278 [37:24:42<06:55, 3.81s/it] {'loss': 0.1332, 'grad_norm': 0.8075655322922792, 'learning_rate': 2.651744856885241e-10, 'epoch': 1.0}
34170/34278 [37:24:45<06:28, 3.60s/it] {'loss': 0.1109, 'grad_norm': 0.6774971443211152, 'learning_rate': 2.6033126003022213e-10, 'epoch': 1.0}
34171/34278 [37:24:49<06:27, 3.62s/it] {'loss': 0.1066, 'grad_norm': 0.8483895412969615, 'learning_rate': 2.555326708536532e-10, 'epoch': 1.0}
34172/34278 [37:24:54<07:21, 4.16s/it] {'loss': 0.1264, 'grad_norm': 0.8093775376395698, 'learning_rate': 2.5077871820267107e-10, 'epoch': 1.0}
34173/34278 [37:24:59<07:24, 4.23s/it] {'loss': 0.1116, 'grad_norm': 0.8841579981638696, 'learning_rate': 2.460694021189092e-10, 'epoch': 1.0}
34174/34278 [37:25:02<06:47, 3.92s/it] {'loss': 0.1044, 'grad_norm': 0.8846754073052753, 'learning_rate': 2.414047226445559e-10, 'epoch': 1.0}
34175/34278 [37:25:05<06:19, 3.68s/it] {'loss': 0.1066, 'grad_norm': 0.9412410205833899, 'learning_rate': 2.3678467982179986e-10, 'epoch': 1.0}
34176/34278 [37:25:08<05:54, 3.48s/it] {'loss': 0.1208, 'grad_norm': 0.8271434007851702, 'learning_rate': 2.3220927369116408e-10, 'epoch': 1.0}
34177/34278 [37:25:11<05:31, 3.28s/it] {'loss': 0.1079, 'grad_norm': 0.8512201912170679, 'learning_rate': 2.2767850429372684e-10, 'epoch': 1.0}
34178/34278 [37:25:17<06:47, 4.07s/it] {'loss': 0.1049, 'grad_norm': 0.7671519287134442, 'learning_rate': 2.231923716705664e-10, 'epoch': 1.0}
34179/34278 [37:25:20<06:18, 3.83s/it] {'loss': 0.1123, 'grad_norm': 0.8886894100532023, 'learning_rate': 2.1875087586054056e-10, 'epoch': 1.0}
34180/34278 [37:25:23<05:54, 3.61s/it] {'loss': 0.1188, 'grad_norm': 0.8438997153914551, 'learning_rate': 2.1435401690472756e-10, 'epoch': 1.0}
34181/34278 [37:25:26<05:25, 3.36s/it] {'loss': 0.1054, 'grad_norm': 0.8500831871218869, 'learning_rate': 2.1000179484087501e-10, 'epoch': 1.0}
34182/34278 [37:25:29<05:09, 3.22s/it] {'loss': 0.0879, 'grad_norm': 0.7365222705634256, 'learning_rate': 2.0569420970895092e-10, 'epoch': 1.0}
34183/34278 [37:25:35<06:24, 4.04s/it] {'loss': 0.1208, 'grad_norm': 0.9923526276086613, 'learning_rate': 2.01431261547258e-10, 'epoch': 1.0}
34184/34278 [37:25:41<07:14, 4.62s/it] {'loss': 0.1179, 'grad_norm': 0.8316420117386609, 'learning_rate': 1.9721295039298872e-10, 'epoch': 1.0}
34185/34278 [37:25:44<06:23, 4.13s/it] {'loss': 0.1157, 'grad_norm': 1.4848637927713826, 'learning_rate': 1.9303927628500085e-10, 'epoch': 1.0}
34186/34278 [37:25:47<05:52, 3.83s/it] {'loss': 0.109, 'grad_norm': 0.7938500757021467, 'learning_rate': 1.889102392599318e-10, 'epoch': 1.0}
34187/34278 [37:25:50<05:29, 3.62s/it] {'loss': 0.1175, 'grad_norm': 0.8120227142644452, 'learning_rate': 1.848258393544189e-10, 'epoch': 1.0}
34188/34278 [37:25:53<05:14, 3.50s/it] {'loss': 0.1075, 'grad_norm': 0.8087073132513186, 'learning_rate': 1.8078607660565463e-10, 'epoch': 1.0}
34189/34278 [37:25:57<05:21, 3.61s/it] {'loss': 0.1328, 'grad_norm': 0.7444642727653622, 'learning_rate': 1.767909510491661e-10, 'epoch': 1.0}
34190/34278 [37:26:01<05:16, 3.60s/it] {'loss': 0.1091, 'grad_norm': 0.8060351901233608, 'learning_rate': 1.728404627204805e-10, 'epoch': 1.0}
34191/34278 [37:26:04<05:03, 3.48s/it] {'loss': 0.1106, 'grad_norm': 0.7364525489511471, 'learning_rate': 1.6893461165512494e-10, 'epoch': 1.0}
34192/34278 [37:26:10<06:03, 4.23s/it] {'loss': 0.1192, 'grad_norm': 0.9649021370790292, 'learning_rate': 1.6507339788807141e-10, 'epoch': 1.0}
34193/34278 [37:26:14<06:04, 4.29s/it] {'loss': 0.1243, 'grad_norm': 0.8226085288924465, 'learning_rate': 1.6125682145373688e-10, 'epoch': 1.0}
34194/34278 [37:26:18<05:39, 4.04s/it] {'loss': 0.0993, 'grad_norm': 0.7496164783485849, 'learning_rate': 1.5748488238653824e-10, 'epoch': 1.0}
34195/34278 [37:26:21<05:16, 3.81s/it] {'loss': 0.1268, 'grad_norm': 0.9370912822704288, 'learning_rate': 1.5375758071922707e-10, 'epoch': 1.0}
34196/34278 [37:26:24<04:49, 3.54s/it] {'loss': 0.1092, 'grad_norm': 0.7862567085951591, 'learning_rate': 1.500749164856652e-10, 'epoch': 1.0}
34197/34278 [37:26:27<04:42, 3.48s/it] {'loss': 0.0988, 'grad_norm': 0.7339864850693282, 'learning_rate': 1.4643688971860416e-10, 'epoch': 1.0}
34198/34278 [37:26:33<05:38, 4.23s/it] {'loss': 0.104, 'grad_norm': 0.750307678578148, 'learning_rate': 1.4284350045079555e-10, 'epoch': 1.0}
34199/34278 [37:26:39<06:15, 4.75s/it] {'loss': 0.0992, 'grad_norm': 0.7161423311215511, 'learning_rate': 1.3929474871388072e-10, 'epoch': 1.0}
34200/34278 [37:26:42<05:24, 4.16s/it] {'loss': 0.1004, 'grad_norm': 0.9210874526488155, 'learning_rate': 1.3579063454005614e-10, 'epoch': 1.0}
34201/34278 [37:26:45<04:58, 3.88s/it] {'loss': 0.116, 'grad_norm': 0.7399212394152471, 'learning_rate': 1.3233115796040807e-10, 'epoch': 1.0}
34202/34278 [37:26:48<04:32, 3.59s/it] {'loss': 0.1083, 'grad_norm': 0.8879265071603267, 'learning_rate': 1.2891631900546764e-10, 'epoch': 1.0}
34203/34278 [37:26:51<04:15, 3.40s/it] {'loss': 0.0913, 'grad_norm': 0.8688691488447081, 'learning_rate': 1.2554611770632107e-10, 'epoch': 1.0}
34204/34278 [37:26:54<04:13, 3.43s/it] {'loss': 0.1039, 'grad_norm': 0.8755796021913493, 'learning_rate': 1.2222055409238932e-10, 'epoch': 1.0}
34205/34278 [37:26:57<04:01, 3.31s/it] {'loss': 0.1133, 'grad_norm': 0.7932573367705784, 'learning_rate': 1.1893962819364836e-10, 'epoch': 1.0}
34206/34278 [37:27:01<03:55, 3.27s/it] {'loss': 0.132, 'grad_norm': 0.8643948103149971, 'learning_rate': 1.1570334003951911e-10, 'epoch': 1.0}
34207/34278 [37:27:04<03:44, 3.16s/it] {'loss': 0.1012, 'grad_norm': 0.8684557441525104, 'learning_rate': 1.1251168965886738e-10, 'epoch': 1.0}
34208/34278 [37:27:07<03:56, 3.38s/it] {'loss': 0.1112, 'grad_norm': 0.7384427021177166, 'learning_rate': 1.0936467708055898e-10, 'epoch': 1.0}
34209/34278 [37:27:11<04:00, 3.48s/it] {'loss': 0.1113, 'grad_norm': 0.7491017461603411, 'learning_rate': 1.0626230233179436e-10, 'epoch': 1.0}
34210/34278 [37:27:15<04:08, 3.65s/it] {'loss': 0.112, 'grad_norm': 0.8501798348215374, 'learning_rate': 1.032045654408842e-10, 'epoch': 1.0}
34211/34278 [37:27:20<04:21, 3.91s/it] {'loss': 0.1288, 'grad_norm': 0.708422280305766, 'learning_rate': 1.0019146643502898e-10, 'epoch': 1.0}
34212/34278 [37:27:23<04:04, 3.70s/it] {'loss': 0.11, 'grad_norm': 0.892736371624153, 'learning_rate': 9.722300534087403e-11, 'epoch': 1.0}
34213/34278 [37:27:28<04:27, 4.12s/it] {'loss': 0.1268, 'grad_norm': 0.7949983304967112, 'learning_rate': 9.429918218561984e-11, 'epoch': 1.0}
34214/34278 [37:27:33<04:40, 4.39s/it] {'loss': 0.1047, 'grad_norm': 0.9191063415013071, 'learning_rate': 9.141999699424641e-11, 'epoch': 1.0}
34215/34278 [37:27:38<04:48, 4.58s/it] {'loss': 0.116, 'grad_norm': 0.7401892187233303, 'learning_rate': 8.858544979339912e-11, 'epoch': 1.0}
34216/34278 [37:27:41<04:17, 4.16s/it] {'loss': 0.1206, 'grad_norm': 0.7011227202244126, 'learning_rate': 8.579554060805795e-11, 'epoch': 1.0}
34217/34278 [37:27:46<04:21, 4.29s/it] {'loss': 0.1183, 'grad_norm': 0.8806883928171629, 'learning_rate': 8.305026946320294e-11, 'epoch': 1.0}
34218/34278 [37:27:50<04:11, 4.20s/it] {'loss': 0.1123, 'grad_norm': 0.9655641910202492, 'learning_rate': 8.034963638325898e-11, 'epoch': 1.0}
34219/34278 [37:27:54<04:01, 4.09s/it] {'loss': 0.096, 'grad_norm': 0.5808085748149094, 'learning_rate': 7.769364139265101e-11, 'epoch': 1.0}
34220/34278 [37:27:57<03:37, 3.76s/it] {'loss': 0.1306, 'grad_norm': 0.8818674122350973, 'learning_rate': 7.508228451469369e-11, 'epoch': 1.0}
34221/34278 [37:28:00<03:20, 3.52s/it] {'loss': 0.1023, 'grad_norm': 1.0098508528559604, 'learning_rate': 7.251556577270169e-11, 'epoch': 1.0}
34222/34278 [37:28:03<03:16, 3.52s/it] {'loss': 0.117, 'grad_norm': 0.8681199510423564, 'learning_rate': 6.999348518943461e-11, 'epoch': 1.0}
34223/34278 [37:28:06<03:02, 3.31s/it] {'loss': 0.1011, 'grad_norm': 0.7221477738461264, 'learning_rate': 6.751604278820711e-11, 'epoch': 1.0}
34224/34278 [37:28:12<03:38, 4.05s/it] {'loss': 0.1131, 'grad_norm': 0.7846265418677336, 'learning_rate': 6.508323859011345e-11, 'epoch': 1.0}
34225/34278 [37:28:15<03:27, 3.92s/it] {'loss': 0.0924, 'grad_norm': 0.8736913819740687, 'learning_rate': 6.269507261791318e-11, 'epoch': 1.0}
34226/34278 [37:28:18<03:08, 3.63s/it] {'loss': 0.1098, 'grad_norm': 0.7938951168846574, 'learning_rate': 6.035154489214546e-11, 'epoch': 1.0}
34227/34278 [37:28:21<02:56, 3.47s/it] {'loss': 0.1054, 'grad_norm': 0.7860922547822048, 'learning_rate': 5.805265543390448e-11, 'epoch': 1.0}
34228/34278 [37:28:25<02:59, 3.59s/it] {'loss': 0.1172, 'grad_norm': 0.9894812049323295, 'learning_rate': 5.57984042637294e-11, 'epoch': 1.0}
34229/34278 [37:28:30<03:07, 3.82s/it] {'loss': 0.1061, 'grad_norm': 0.9294474570565269, 'learning_rate': 5.3588791402159335e-11, 'epoch': 1.0}
34230/34278 [37:28:36<03:34, 4.47s/it] {'loss': 0.1169, 'grad_norm': 0.9494862180189568, 'learning_rate': 5.142381686806808e-11, 'epoch': 1.0}
34231/34278 [37:28:39<03:13, 4.13s/it] {'loss': 0.1089, 'grad_norm': 0.7637484304705885, 'learning_rate': 4.930348068143964e-11, 'epoch': 1.0}
34232/34278 [37:28:42<02:53, 3.77s/it] {'loss': 0.1041, 'grad_norm': 0.8862950308683506, 'learning_rate': 4.722778286114782e-11, 'epoch': 1.0}
34233/34278 [37:28:45<02:38, 3.53s/it] {'loss': 0.099, 'grad_norm': 0.7745062735066747, 'learning_rate': 4.519672342551129e-11, 'epoch': 1.0}
34234/34278 [37:28:48<02:28, 3.38s/it] {'loss': 0.111, 'grad_norm': 0.7984060139393339, 'learning_rate': 4.321030239340385e-11, 'epoch': 1.0}
34235/34278 [37:28:51<02:21, 3.28s/it] {'loss': 0.1055, 'grad_norm': 0.85411201766429, 'learning_rate': 4.1268519780923724e-11, 'epoch': 1.0}
34236/34278 [37:28:57<02:55, 4.19s/it] {'loss': 0.0955, 'grad_norm': 0.7978049144042352, 'learning_rate': 3.9371375606944706e-11, 'epoch': 1.0}
34237/34278 [37:29:01<02:44, 4.01s/it] {'loss': 0.1158, 'grad_norm': 0.854740245108905, 'learning_rate': 3.751886988812015e-11, 'epoch': 1.0}
34238/34278 [37:29:04<02:35, 3.89s/it] {'loss': 0.0801, 'grad_norm': 0.6876753949975288, 'learning_rate': 3.571100264054827e-11, 'epoch': 1.0}
34239/34278 [37:29:07<02:19, 3.59s/it] {'loss': 0.109, 'grad_norm': 0.8745558354823828, 'learning_rate': 3.3947773880327326e-11, 'epoch': 1.0}
34240/34278 [37:29:12<02:24, 3.80s/it] {'loss': 0.1088, 'grad_norm': 0.8988813743641949, 'learning_rate': 3.222918362355554e-11, 'epoch': 1.0}
34241/34278 [37:29:18<02:45, 4.47s/it] {'loss': 0.1209, 'grad_norm': 0.7468545358649729, 'learning_rate': 3.055523188522091e-11, 'epoch': 1.0}
34242/34278 [37:29:21<02:32, 4.25s/it] {'loss': 0.1367, 'grad_norm': 0.9124898635304158, 'learning_rate': 2.8925918680866582e-11, 'epoch': 1.0}
34243/34278 [37:29:27<02:47, 4.79s/it] {'loss': 0.1143, 'grad_norm': 0.8657235373221649, 'learning_rate': 2.7341244024370328e-11, 'epoch': 1.0}
34244/34278 [37:29:31<02:25, 4.29s/it] {'loss': 0.0945, 'grad_norm': 0.6457685587430032, 'learning_rate': 2.5801207930720163e-11, 'epoch': 1.0}
34245/34278 [37:29:34<02:12, 4.01s/it] {'loss': 0.0939, 'grad_norm': 0.9287976316252068, 'learning_rate': 2.430581041268365e-11, 'epoch': 1.0}
34246/34278 [37:29:37<01:59, 3.74s/it] {'loss': 0.0892, 'grad_norm': 0.7213050276263969, 'learning_rate': 2.2855051484138578e-11, 'epoch': 1.0}
34247/34278 [37:29:41<01:59, 3.86s/it] {'loss': 0.1103, 'grad_norm': 0.8124233881580213, 'learning_rate': 2.1448931157852515e-11, 'epoch': 1.0}
34248/34278 [37:29:45<01:51, 3.72s/it] {'loss': 0.1106, 'grad_norm': 0.7611849642952453, 'learning_rate': 2.0087449446593022e-11, 'epoch': 1.0}
34249/34278 [37:29:51<02:07, 4.41s/it] {'loss': 0.0997, 'grad_norm': 0.8055971271842521, 'learning_rate': 1.877060636201744e-11, 'epoch': 1.0}
34250/34278 [37:29:54<01:53, 4.04s/it] {'loss': 0.1072, 'grad_norm': 0.8440487769792574, 'learning_rate': 1.7498401916893338e-11, 'epoch': 1.0}
34251/34278 [37:29:57<01:45, 3.91s/it] {'loss': 0.1079, 'grad_norm': 0.8052304979021084, 'learning_rate': 1.627083612176783e-11, 'epoch': 1.0}
34252/34278 [37:30:03<01:58, 4.56s/it] {'loss': 0.1297, 'grad_norm': 0.8352534142633674, 'learning_rate': 1.508790898774315e-11, 'epoch': 1.0}
34253/34278 [37:30:08<01:52, 4.50s/it] {'loss': 0.1061, 'grad_norm': 0.9629333553281815, 'learning_rate': 1.3949620525366414e-11, 'epoch': 1.0}
34254/34278 [37:30:11<01:37, 4.08s/it] {'loss': 0.1075, 'grad_norm': 0.794338304710376, 'learning_rate': 1.2855970744629632e-11, 'epoch': 1.0}
34255/34278 [37:30:14<01:27, 3.81s/it] {'loss': 0.1117, 'grad_norm': 0.6942865165536916, 'learning_rate': 1.1806959655524807e-11, 'epoch': 1.0}
34256/34278 [37:30:18<01:23, 3.80s/it] {'loss': 0.1234, 'grad_norm': 0.7866379169387789, 'learning_rate': 1.0802587268043951e-11, 'epoch': 1.0}
34257/34278 [37:30:21<01:14, 3.55s/it] {'loss': 0.1085, 'grad_norm': 1.0778192537085203, 'learning_rate': 9.842853589958623e-12, 'epoch': 1.0}
34258/34278 [37:30:26<01:21, 4.07s/it] {'loss': 0.113, 'grad_norm': 0.8202622444032077, 'learning_rate': 8.927758630705719e-12, 'epoch': 1.0}
34259/34278 [37:30:29<01:10, 3.72s/it] {'loss': 0.138, 'grad_norm': 0.8612818136871452, 'learning_rate': 8.057302398056799e-12, 'epoch': 1.0}
34260/34278 [37:30:34<01:14, 4.14s/it] {'loss': 0.1096, 'grad_norm': 0.7891601589283768, 'learning_rate': 7.2314848997834255e-12, 'epoch': 1.0}
34261/34278 [37:30:38<01:08, 4.02s/it] {'loss': 0.1197, 'grad_norm': 0.9345282474650779, 'learning_rate': 6.450306143102047e-12, 'epoch': 1.0}
34262/34278 [37:30:41<01:01, 3.86s/it] {'loss': 0.117, 'grad_norm': 0.7278489069436884, 'learning_rate': 5.713766135784227e-12, 'epoch': 1.0}
34263/34278 [37:30:45<00:56, 3.74s/it] {'loss': 0.1264, 'grad_norm': 0.8756420881385565, 'learning_rate': 5.021864883381078e-12, 'epoch': 1.0}
34264/34278 [37:30:48<00:51, 3.69s/it] {'loss': 0.0869, 'grad_norm': 0.8340731296432814, 'learning_rate': 4.374602393109051e-12, 'epoch': 1.0}
34265/34278 [37:30:51<00:44, 3.46s/it] {'loss': 0.1258, 'grad_norm': 0.8608276081109758, 'learning_rate': 3.771978669409038e-12, 'epoch': 1.0}
34266/34278 [37:30:54<00:39, 3.29s/it] {'loss': 0.1148, 'grad_norm': 0.8681531460238366, 'learning_rate': 3.2139937189423765e-12, 'epoch': 1.0}
34267/34278 [37:30:58<00:36, 3.30s/it] {'loss': 0.1019, 'grad_norm': 0.8631633579570969, 'learning_rate': 2.7006475461499593e-12, 'epoch': 1.0}
34268/34278 [37:31:01<00:33, 3.33s/it] {'loss': 0.0848, 'grad_norm': 0.9156021269009654, 'learning_rate': 2.23194015602779e-12, 'epoch': 1.0}
34269/34278 [37:31:04<00:28, 3.19s/it] {'loss': 0.1001, 'grad_norm': 0.7409696551956314, 'learning_rate': 1.8078715519065371e-12, 'epoch': 1.0}
34270/34278 [37:31:10<00:32, 4.03s/it] {'loss': 0.1064, 'grad_norm': 0.9556775396839313, 'learning_rate': 1.4284417382270933e-12, 'epoch': 1.0}
34271/34278 [37:31:13<00:25, 3.64s/it] {'loss': 0.1223, 'grad_norm': 0.9079732754228492, 'learning_rate': 1.0936507177650158e-12, 'epoch': 1.0}
34272/34278 [37:31:17<00:23, 3.93s/it] {'loss': 0.1044, 'grad_norm': 0.8537987482525036, 'learning_rate': 8.034984944060853e-13, 'epoch': 1.0}
34273/34278 [37:31:20<00:17, 3.59s/it] {'loss': 0.1137, 'grad_norm': 0.8281308361387377, 'learning_rate': 5.579850698156363e-13, 'epoch': 1.0}
34274/34278 [37:31:23<00:13, 3.46s/it] {'loss': 0.1015, 'grad_norm': 0.7654669263751744, 'learning_rate': 3.571104473243381e-13, 'epoch': 1.0}
34275/34278 [37:31:26<00:10, 3.35s/it] {'loss': 0.0917, 'grad_norm': 0.8681306608926576, 'learning_rate': 2.008746274873019e-13, 'epoch': 1.0}
34276/34278 [37:31:31<00:07, 3.76s/it] {'loss': 0.1167, 'grad_norm': 0.7804431762812664, 'learning_rate': 8.927761252497391e-14, 'epoch': 1.0}
34277/34278 [37:31:34<00:03, 3.54s/it] {'loss': 0.1194, 'grad_norm': 1.0700611561542164, 'learning_rate': 2.23194029924656e-14, 'epoch': 1.0}
34278/34278 [37:31:45<00:00, 5.69s/it] {'loss': 0.1066, 'grad_norm': 0.7014689983147184, 'learning_rate': 0.0, 'epoch': 1.0}

Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
Set eos token id back to [151658]
Set eos token back to <|diff_marker|>
Set generation config eos token id back to [151658]

{'train_runtime': 135140.0077, 'train_samples_per_second': 32.466, 'train_steps_per_second': 0.254, 'train_loss': 0.15108584606978395, 'epoch': 1.0}
34278/34278 [37:32:20<00:00, 5.69s/it]

Set eos token id to 151658
Set eos token to <|diff_marker|>
Set generation config eos token id to [151658]
Set eos token id back to 151645
Set eos token back to <|im_end|>
Set generation config eos token id back to [151645, 151643]
to [151645, 151643] Set eos token id back to [151658] Set eos token back to <|diff_marker|> Set generation config eos token id back to [151658] Set eos token id back to 151645 Set eos token back to <|im_end|> Set generation config eos token id back to [151645, 151643] Rank 0: Model saved to work_dirs/aguvis_torchrun_64gpus_9//checkpoints/
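The paired "Set … to" / "Set … back to" messages above reflect a swap-and-restore pattern: the trainer temporarily overrides the generation-config EOS ids (here <|diff_marker|>, id 151658) and restores the defaults (<|im_end|> / endoftext, ids [151645, 151643]) afterward. A minimal self-contained sketch of that pattern, assuming Qwen-style token ids from the log; the `GenerationConfig` stand-in and `swap_eos` helper are illustrative, not the actual trainer code:

```python
from contextlib import contextmanager
from dataclasses import dataclass, field

# Token ids taken from the log above (Qwen-style chat tokens).
IM_END_ID = 151645       # <|im_end|>
ENDOFTEXT_ID = 151643    # default secondary eos
DIFF_MARKER_ID = 151658  # <|diff_marker|>

@dataclass
class GenerationConfig:
    """Illustrative stand-in for a model's generation config."""
    eos_token_id: list = field(default_factory=lambda: [IM_END_ID, ENDOFTEXT_ID])

@contextmanager
def swap_eos(config, new_eos_ids):
    """Temporarily override eos ids; restore the originals on exit."""
    old = config.eos_token_id
    config.eos_token_id = list(new_eos_ids)
    print(f"Set generation config eos token id to {config.eos_token_id}")
    try:
        yield config
    finally:
        config.eos_token_id = old
        print(f"Set generation config eos token id back to {config.eos_token_id}")

cfg = GenerationConfig()
with swap_eos(cfg, [DIFF_MARKER_ID]):
    # Inside the block, generation stops on <|diff_marker|>.
    assert cfg.eos_token_id == [DIFF_MARKER_ID]
# On exit the defaults are restored, matching the "back to" log lines.
assert cfg.eos_token_id == [IM_END_ID, ENDOFTEXT_ID]
```

Using a context manager guarantees the restore runs even if an exception interrupts training, which is why each "Set … to" line in the log is eventually matched by a "Set … back to" line.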